Evaluation of interrater reliability for posture observations in a field study.


Occupational Medicine & Health Affairs

Author(s): Burt S, Punnett L

Abstract

This paper examines the interrater reliability of a quantitative observational method for assessing non-neutral postures required by work tasks. Two observers independently evaluated 70 jobs in an automotive manufacturing facility, using a procedure that included observations of 18 postures of the upper extremities and back. Interrater reliability was evaluated using percent agreement, kappa, intraclass correlation coefficients, and generalized linear mixed modeling. Interrater agreement ranged from 26% for right shoulder elevation to 99% for left wrist flexion, but agreement was at best moderate when assessed with kappa. Percent agreement is an inadequate measure because it does not account for chance agreement and can therefore inflate apparent reliability. More appropriate statistical methods may offer greater insight into sources of variability in reliability and validity studies and may help in developing more effective ergonomic exposure assessment methods. Interrater reliability was acceptable for some of the postural observations in this study.
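The contrast between percent agreement and kappa can be illustrated with a small calculation. The sketch below uses hypothetical binary posture ratings (not the study's data) to show how two raters can agree on 90% of items yet have only moderate chance-corrected agreement, which is the inflation the abstract describes.

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of items on which the two raters give the same rating."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected if the raters rated
    independently according to their marginal rating frequencies.
    """
    n = len(a)
    p_o = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings for 10 observations:
# 1 = non-neutral posture observed, 0 = neutral.
# Most observations are rated 0 by both raters, so raw agreement is
# high largely by chance; kappa discounts that shared base rate.
rater_a = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
rater_b = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]

print(round(percent_agreement(rater_a, rater_b), 2))  # 0.9
print(round(cohens_kappa(rater_a, rater_b), 2))       # 0.62
```

Here expected chance agreement is high (0.74) because both raters mark most postures neutral, so 90% raw agreement reduces to a kappa of about 0.62, conventionally read as only moderate-to-good agreement.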
This article was published in Appl Ergon and referenced in Occupational Medicine & Health Affairs
