A dedicated BI-RADS training programme: Effect on the inter-observer variation among screening radiologists.
Source: European Journal of Radiology, 81(9), pp. 2184-2188
01 September 2012
Article / Letter to editor
Epidemiology, Biostatistics & HTA
Subjects: NCEBP 1: Molecular epidemiology; NCEBP 2: Evaluation of complex medical interventions; ONCOL 5: Aetiology, screening and detection
INTRODUCTION: The Breast Imaging Reporting and Data System (BI-RADS) was introduced in the Dutch breast cancer screening programme to improve communication between medical specialists. Following its introduction, substantial variation in the use of the BI-RADS lexicon for final assessment categories was noted among screening radiologists. We set up a dedicated training programme to reduce this variation; this study evaluates whether the programme was effective.

MATERIALS AND METHODS: Two comparable test sets were read before and after completion of the training programme. Each set contained 30 screening mammograms of referred women, selected from screening practice. The sets were read by 25 experienced and 30 new screening radiologists. Cohen's kappa was used to assess inter-observer agreement. The 2003 version of BI-RADS was implemented in the screening programme, as the 2008 version requires diagnostic work-up, which is not available in the screening setting.

RESULTS: The inter-observer agreement of all participating radiologists (n=55) with the expert panel increased from a pre-training kappa of 0.44 to a post-training kappa of 0.48 (p=0.14). Agreement of the new screening radiologists (n=30) with the expert panel increased from kappa=0.41 to kappa=0.50 (p=0.01), whereas agreement among the 25 experienced radiologists did not change (kappa=0.48 to kappa=0.46, p=0.60).

CONCLUSION: Our training programme in the BI-RADS lexicon resulted in a significant improvement of agreement among new screening radiologists. Overall, agreement among radiologists was moderate according to the Landis and Koch guidelines, in line with results reported in the literature.
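As background on the agreement statistic used in this study: Cohen's kappa compares the observed agreement between two readers with the agreement expected by chance, derived from each reader's marginal category frequencies. A minimal Python sketch follows; the reader labels are invented for illustration and are not data from the study.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), for two readers rating the same cases."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of cases where both readers assign the same category.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: product of each reader's marginal frequency per category.
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical BI-RADS final assessment categories for 10 mammograms.
reader_1 = [1, 2, 4, 3, 5, 2, 1, 4, 3, 2]
reader_2 = [1, 2, 4, 2, 5, 2, 1, 3, 3, 2]
print(round(cohens_kappa(reader_1, reader_2), 2))  # prints 0.74
```

A kappa of 0.41-0.60 is conventionally labelled "moderate" agreement in the Landis and Koch classification, which is the scale the abstract refers to.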