ABSTRACT
This study assessed the reliability and validity of a new classification system for fractures of the femur after hip arthroplasty. Forty radiographs were evaluated by six observers: three experts and three nonexperts. Each observer read the radiographs on two separate occasions and classified each case by type (A, B, C) and subtype (B1, B2, B3). Reliability was assessed by measuring intraobserver and interobserver agreement with the kappa statistic. Validity was assessed within the B group by comparing the radiographic classification with the intraoperative findings. Intraobserver agreement was consistent across observers, with kappa values ranging from 0.73 to 0.83, and the difference between experts and nonexperts was negligible. Interobserver kappa values were 0.61 for the first reading and 0.64 for the second, indicating substantial agreement between observers. Validity analysis yielded a kappa value of 0.78 for agreement between the radiographic classification and the intraoperative findings, also indicating substantial agreement. These findings indicate that this classification system is reliable and valid.
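For reference, the kappa statistic used throughout is the standard chance-corrected measure of agreement; the sketch below uses conventional notation ($p_o$, $p_e$) that is not taken from the study itself.

% Kappa: agreement corrected for chance
% p_o = observed proportion of agreement between two observers (or two readings)
% p_e = proportion of agreement expected by chance from the marginal totals
\[
  \kappa = \frac{p_o - p_e}{1 - p_e}
\]
% kappa = 1 denotes perfect agreement; kappa = 0 denotes agreement no better than chance.
% Values of 0.61 to 0.80 are conventionally described as "substantial" agreement,
% which is the interpretation applied to the interobserver and validity results above.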