Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer.


Author(s): Gianinazzi, Micòl E; Rueegg, Corina S; Zimmerman, Karin; Kuehni, Claudia E; Michel, Gisela
Date(s)

22/05/2015

Abstract

BACKGROUND: The abstraction of data from medical records is a widespread practice in epidemiological research. However, studies using this means of data collection rarely report reliability. Within the Transition after Childhood Cancer Study (TaCC), which is based on medical record abstraction, we conducted a second independent abstraction of data with the aim of assessing a) the intra-rater reliability of one rater at two time points; b) possible learning effects between these two time points compared to a gold standard; and c) inter-rater reliability.

METHOD: Within the TaCC study we conducted a systematic medical record abstraction in the 9 Swiss clinics with pediatric oncology wards. In a second phase we selected a subsample of medical records in 3 clinics for a second independent abstraction. We then assessed intra-rater reliability at two time points, the learning effect over time (comparing each rater at two time points with a gold standard), and the inter-rater reliability of a selected number of variables. We calculated percentage agreement and Cohen's kappa.

FINDINGS: For the assessment of intra-rater reliability we included 154 records (80 for rater 1; 74 for rater 2). For inter-rater reliability we could include 70 records. Intra-rater reliability was substantial to excellent (Cohen's kappa 0.6–0.8) with an observed percentage agreement of 75%–95%. Learning effects were observed for all variables. Inter-rater reliability was substantial to excellent (Cohen's kappa 0.70–0.83) with high agreement ranging from 86% to 100%.

CONCLUSIONS: Our study showed that data abstracted from medical records are reliable. Investigating intra-rater and inter-rater reliability gives confidence in the conclusions drawn from the abstracted data and increases data quality by minimizing systematic errors.
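The two reliability measures named in the abstract, percentage agreement and Cohen's kappa, can be illustrated with a minimal Python sketch. This is not the authors' analysis code; the data and function below are hypothetical, using the standard definition kappa = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each rater's marginal distribution.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codes on the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement (same as "percentage agreement")
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_e = sum(count_a[c] * count_b.get(c, 0) for c in count_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two abstractions of the same 10 records
first = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
second = ["yes", "yes", "no", "yes", "yes", "no", "yes", "no", "no", "yes"]
agreement = sum(a == b for a, b in zip(first, second)) / len(first)  # 0.8
kappa = cohens_kappa(first, second)  # lower than raw agreement,
                                     # since chance agreement is discounted
```

Note that kappa is always below raw agreement whenever chance agreement is nonzero, which is why the article reports both figures side by side.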

Format

application/pdf

Identifier

http://boris.unibe.ch/69175/1/Gianinazzi%20PLoSOne%202015.pdf

Gianinazzi, Micòl E; Rueegg, Corina S; Zimmerman, Karin; Kuehni, Claudia E; Michel, Gisela (2015). Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer. PLoS ONE, 10(5), e0124290. Public Library of Science 10.1371/journal.pone.0124290 <http://dx.doi.org/10.1371/journal.pone.0124290>

doi:10.7892/boris.69175

info:doi:10.1371/journal.pone.0124290

info:pmid:26001046

urn:issn:1932-6203

Language(s)

eng

Publisher

Public Library of Science

Relation

http://boris.unibe.ch/69175/

Rights

info:eu-repo/semantics/openAccess


Keywords: #610 Medicine & health #360 Social problems & social services
Type

info:eu-repo/semantics/article

info:eu-repo/semantics/publishedVersion

PeerReviewed