Examples of 'kappa coefficient' in a sentence

Meaning of "kappa coefficient"

kappa coefficient: A statistical measure used to assess the level of agreement between two raters or judges when evaluating categorical items, with values ranging from -1 to 1.
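The definition above can be made concrete with a short sketch of Cohen's kappa, κ = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. The function and the sample ratings below are illustrative only, not drawn from any of the studies quoted on this page:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same categorical items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from the raters' marginals.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of marginal proportions.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no judgments from two raters on eight items.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(cohens_kappa(a, b))  # 0.5 — moderate agreement on common scales
```

Note that the raters agree on 6 of 8 items (75%), yet κ is only 0.5, because with these marginals half of that agreement would be expected by chance.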

How to use "kappa coefficient" in a sentence

The kappa coefficient was used to compare the methods.
To study the correlation between the tests, we used the kappa coefficient k.
The Kappa coefficient was used to verify agreement between the two questionnaire applications.
The level of agreement was computed by the Kappa coefficient.
The kappa coefficient provided the proportion of matched agreement among observers.
The agreement between the methods was evaluated by the Kappa coefficient.
The kappa coefficient was used to assess the agreement between the methods.
Reproducibility between the methods was evaluated by the Kappa coefficient.
Generalized kappa coefficient was estimated for quantifying the level of interobserver agreement.
The correlation between the evaluators was measured by the Kappa coefficient.
The statistical kappa coefficient was used to evaluate the concordance between the databases.
The agreement between the evaluators was assessed using the Kappa coefficient.
The kappa coefficient value excludes the agreement that would be randomly expected.
The measurement of the quality of the inferences was determined by the Kappa coefficient.
The kappa coefficient was used for measurement of agreement between clinical diagnosis and neurochemical testing.
Classification accuracy was assessed using overall accuracy and Kappa coefficient.
The weighted Kappa coefficient was used as a measure of agreement between observers.
The statistical analysis of the reliability among evaluators was performed through the Kappa coefficient.
The weighted kappa coefficient Kw was used for analysis of the categorical variables.
To evaluate the degree of concordance between the expert nurses the Kappa coefficient was applied.
We used the Kappa coefficient k to assess the level of agreement between the consultants.
The reproducibility of the questionnaire in the second interview was analyzed using the Kappa coefficient.
The Kappa coefficient was applied to evaluate the concordance level of the total obtained scores.
The interobserver agreement for these findings varied from moderate to almost perfect (Kappa coefficient).
The kappa coefficient showed reasonable agreement between the BBS and the plantar sensitivity test.
The interpretation of the Kappa coefficient adopted the Fontelles protocol.
Inter-observer agreement was evaluated with kappa coefficient.
Low Kappa coefficient and PPV values stand out.
Inter-rater reliability was determined using the kappa coefficient.
Kappa coefficient was used to assess the intra-examiner agreement during data collection.
The categorization of the exercises was established by agreement using Cohen's kappa coefficient.
Because of the small sample size, the kappa coefficient values are somewhat distorted.
The theoretical dimension of the instrument was established by calculating Cohen's kappa coefficient.
The kappa coefficient was used for evaluating inter-rater reliability in the selection of acceptable curves.
Intra- and inter-observer reliability were investigated by calculating Cohen's kappa coefficient.
The statistical analyses included the kappa coefficient, the intraclass correlation, Fisher's test, and McNemar's test.
The inter-observer reliability was obtained through the weighted kappa coefficient.
Cohen's kappa coefficient revealed insufficient agreement.
The inter-evaluator agreement on the judgment of hypernasality was obtained by the kappa coefficient.
Agreement was assessed using the Kappa coefficient and interpreted according to Fleiss et al.
In this view, reliability of ordinal variables is commonly assessed using the weighted kappa coefficient.
In addition, the best Kappa coefficient was observed between such definitions.
Two other studies used correlation analysis, probability analysis, and/or the kappa coefficient.
Thus, the Kappa coefficient was used to measure agreement among them.
The tool's reproducibility was tested with the Kappa coefficient.
Agreement was assessed using the Kappa coefficient, which corrects for the possibility of chance agreement.
When classified only into normal weight and overweight, the Kappa coefficient underwent changes.
The kappa coefficient showed poor agreement among the tests (Table 3).
As for fundus and angiography findings, we measured the Kappa coefficient of agreement among examiners.
The kappa coefficient of 0.5 indicates moderate agreement between the two tests.
