The Kappa Coefficient

Introduction

Definition of agreement

Agreement between two raters

Agreement between several raters

Limitations of the test

Uses of the test

Practical applications

Conclusion

Appendices

References

Download

Internet links

Author's note

Contact

References


1. Cohen J.: A coefficient of agreement for nominal scales, Educ. Psychol. Meas., 1960, 20, 27-46.

2. Kendall M.G.: Rank correlation methods, Hafner Publishing Co., New York.

3. Siegel S., Castellan N.J. Jr.: Nonparametric Statistics for the Behavioral Sciences, McGraw-Hill International Editions, 1988, 2nd ed.

4. Grenier: Décision médicale, Masson, 1993.

5. Landis J.R., Koch G.G.: The Measurement of Observer Agreement for Categorical Data, Biometrics, 1977a, 33, 159-174.

6. Fleiss J.L., Cohen J., Everitt B.S.: Large sample standard errors of kappa and weighted kappa, Psychol. Bull., 1969, 72, 323-327.

7. Hubert J.L.: A general formula for the variance of Cohen's weighted kappa, Psychol. Bull., 1978, 85, 183-184.

8. Cicchetti D.V., Fleiss J.L.: Comparison of the null distributions of weighted Kappa and the C ordinal statistic, Appl. Psychol. Meas., 1977, 1, 195-201.

9. Fermanian J.: Mesure de l'accord entre deux juges. Cas qualitatif, Rev. Epidém. et Santé Publ., 1984, 32, 140-147.

10. Fleiss J.L.: Inference about weighted Kappa in the non-null case, Appl. Psychol. Meas., 1978, 1, 113-117.

11. Fleiss J.L.: Statistical Methods for Rates and Proportions, John Wiley and Sons, New York, 1981.

12. Landis J.R., Koch G.G.: A one-way components of variance model for categorical data, Biometrics, 1977b, 33, 671-679.

13. Fleiss J.L., Cuzick J.: The reliability of dichotomous judgments: Unequal numbers of judges per subject, Appl. Psychol. Meas., 1979, 3, 537-542.

14. Feinstein A.R., Cicchetti D.V.: High agreement but low kappa: I. The problems of two paradoxes, J. Clin. Epidemiol., 1990, 43, 543-548.

15. Scott W.A.: Reliability of content analysis: The case of nominal scale coding, Public Opinion Q., 1955, 19, 321-325.

16. Bennett E.M., Alpert R., Goldstein A.C.: Communications through limited response questioning, Public Opinion Q., 1954, 18, 303-308.

17. Cicchetti D.V., Feinstein A.R.: High agreement but low kappa: II. Resolving the paradoxes, J. Clin. Epidemiol., 1990, 43, 551-558.

18. Byrt T., Bishop J., Carlin J.B.: Bias, Prevalence and Kappa, J. Clin. Epidemiol., 1993, 46, 423-429.

19. Holley J.W., Guilford J.P.: A note on the G index of agreement, Educ. Psychol. Bull., 1964, 32, 281-288.

20. Hui S.L., Walter S.D.: Estimating the error rates of diagnostic tests, Biometrics, 1980, 36, 167-171.

21. Walter S.D.: Measuring the reliability of clinical data: the case for using three observers, Rev. Epidém. et Santé Publ., 1984, 32, 206-211.

22. Walter S.D., Irwig L.M.: Estimation of test error rates, disease prevalence and relative risk from misclassified data: a review, J. Clin. Epidemiol., 1988, 41, 923-937.

23. Bertrand P., Benichou J., Chastang C.: Estimation par la méthode de Hui et Walter de la sensibilité et la spécificité d'un test diagnostique en l'absence d'un test de référence : résultats d'une étude de simulation, Rev. Epidém. et Santé Publ., 1994, 42, 502-511.

24. Reed J.F. III, Reed J.J.: Cohen's weighted kappa with Turbo Pascal (FORTRAN), Computer Methods and Programs in Biomedicine, 1992, 38, 153-165.

25. Boushka W.M., Marinez Y.N., Prihoda T.J., Dunford R., Barnwell G.M.: A computer program for calculating kappa: application to interexaminer agreement in periodontal research, Computer Methods and Programs in Biomedicine, 1990, 33, 35-41.

26. Landis J.R., Koch G.G.: A one-way components of variance model for categorical data, Biometrics, 1977b, 33, 671-679.

27. Haley S.M., Osberg J.S.: Kappa Coefficient Calculation Using Multiple Ratings Per Subject: A Special Communication, Phys. Ther., 1989, 69, 970-974.


Everything, or almost everything, you need to know about the Kappa statistical test...