19 results for process data
Abstract:
This study proposed a novel statistical method that jointly models multiple outcomes and the missing-data process using item response theory. The method follows the "intent-to-treat" principle in clinical trials and accounts for the correlation between the outcomes and the missing-data process, and it may provide a good solution for studies of chronic mental disorders. The simulation study demonstrated that when the true model is the proposed model with moderate or strong correlation, ignoring the within-subject correlation may lead to overestimation of the treatment effect and a type I error rate above the specified level. Even when the within-subject correlation is small, the proposed model performs as well as the naïve response model. Thus, the proposed model is robust across different correlation settings when the data are generated by the proposed model.
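The inflation described in this abstract can be illustrated with a minimal simulation. This is not the proposed IRT model, only a hypothetical shared-latent-trait setup of my own: when dropout in the treatment arm depends on a latent trait correlated with the outcome, a complete-case ("naïve") analysis overestimates a treatment effect that is truly zero.

```python
import math
import random

random.seed(42)

def simulate(n=20000):
    """Null trial (true treatment effect = 0) where dropout in the treatment
    arm is informative: it depends on a latent trait b shared with the outcome."""
    full_trt, full_ctl, obs_trt, obs_ctl = [], [], [], []
    for _ in range(n):
        # control subject: always observed
        b = random.gauss(0, 1)
        y = b + random.gauss(0, 1)
        full_ctl.append(y)
        obs_ctl.append(y)
        # treatment subject: poor responders (low latent b) tend to drop out
        b = random.gauss(0, 1)
        y = b + random.gauss(0, 1)
        full_trt.append(y)
        p_drop = 1.0 / (1.0 + math.exp(2.0 * b))  # informative missingness
        if random.random() >= p_drop:
            obs_trt.append(y)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(full_trt) - mean(full_ctl), mean(obs_trt) - mean(obs_ctl)

full_diff, naive_diff = simulate()
print(f"full-data effect estimate : {full_diff:+.3f}")   # near zero
print(f"complete-case estimate    : {naive_diff:+.3f}")  # biased upward
```

Because only the better responders remain observed in the treatment arm, the complete-case contrast is positive even though no true effect exists, mirroring the overestimation and type I error inflation the abstract reports.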
Abstract:
Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods used. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers, and error rates for single data entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language, and this lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality that builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses, yet the factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 of the factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity for, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working-memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping, or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
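The errors-per-10,000-fields scale used in the pooled analysis above is simply a normalized error rate, which makes audits of different sizes comparable. A small sketch, using made-up audit counts for illustration only:

```python
def errors_per_10k(n_errors, n_fields):
    """Normalize an error count to the errors-per-10,000-fields scale
    used when pooling accuracy results across studies."""
    return 10_000 * n_errors / n_fields

# hypothetical audit counts, for illustration only
audits = {
    "medical record abstraction":      (151, 12_000),
    "single entry + on-screen checks": (9,   25_000),
    "double data entry":               (7,   25_000),
}
for method, (errs, fields) in audits.items():
    rate = errors_per_10k(errs, fields)
    print(f"{method:32s} {rate:7.1f} errors per 10,000 fields")
```

On this scale the range reported above (2 to 5019 per 10,000 fields) spans more than three orders of magnitude, which is why the choice of data processing method matters so much.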
Abstract:
Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis, such as de novo assembly, mapping, and variant detection, is far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM's performance does not depend on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake; for uniform data, MTM performed better than Quake and comparably to Hammer. By correcting errors with MTM, the quality of downstream analyses, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies. However, the existence of sequencing errors complicates this process, especially for the low coverage (
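The abstract does not describe MTM's algorithm, but read-error correctors in this family (including the k-mer-spectrum approaches the comparison methods represent) typically build a set of "trusted" k-mers from coverage and repair reads whose k-mers fall outside it. A minimal sketch of that general idea, not of MTM itself:

```python
from collections import Counter

def kmers(read, k):
    """All overlapping k-mers of a read."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def build_trusted(reads, k, min_count=2):
    """K-mers seen at least min_count times are assumed correct:
    true k-mers recur across overlapping reads; error k-mers are rare."""
    counts = Counter(km for r in reads for km in kmers(r, k))
    return {km for km, c in counts.items() if c >= min_count}

def correct(read, trusted, k):
    """Try single-base substitutions until every k-mer of the read is trusted."""
    if all(km in trusted for km in kmers(read, k)):
        return read
    for i, base in enumerate(read):
        for alt in "ACGT":
            if alt == base:
                continue
            cand = read[:i] + alt + read[i + 1:]
            if all(km in trusted for km in kmers(cand, k)):
                return cand
    return read  # uncorrectable under this simple model

# toy data: three clean copies of a read, one copy with a single error
reads = ["ACGTACGT", "ACGTACGT", "ACGTACGT", "ACGAACGT"]
trusted = build_trusted(reads, k=4)
print(correct("ACGAACGT", trusted, k=4))
```

The abstract's point about non-uniform single-cell coverage is visible even here: a fixed `min_count` threshold assumes roughly even coverage, which is exactly the assumption that breaks for single-cell data.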
Abstract:
Over the last decade, adverse events and medical errors have become a main focus of quality and safety standards in the U.S. healthcare system (Weinstein & Henderson, 2009). When a medical error occurs, the disclosure of the error and disclosure practices become a focal point of the healthcare process. Patients and family members who have experienced a medical error may be able to provide knowledge and insight on how to improve the disclosure process. However, patients and family members are not typically involved in the disclosure process, so their experiences go unnoticed.

The purpose of this research was to explore how best to include patients and family members in the disclosure process following a medical error. The research consisted of 28 qualitative interviews across three stakeholder groups: hospital administrators, clinical service providers, and patients and family members. Participants were asked for their ideas and suggestions on how best to include patients and family members in the disclosure process. Framework Analysis was used to analyze the data and identify prevalent themes related to the primary research question; a secondary aim was to index categories created from the collected interviews. Data were drawn from the Texas Disclosure and Compensation Study, with Dr. Eric Thomas as the Principal Investigator; full acknowledgement of access to these data is given to Dr. Thomas.

The themes revealed that each stakeholder group was interested in and open to including patients and family members in the disclosure process, and that disclosure should not be a "one-way" avenue. The themes yielded many suggestions on how best to include patients and family members in the disclosure of a medical error, and the secondary aims revealed several ways to assess the ideas and suggestions given by the stakeholders. Overall, acceptability of obtaining the perspective of patients and family members was the most common theme. Comparison of the stakeholder groups revealed that including patients and family members would be beneficial to improving hospital disclosure practices.

Conclusions included a list of recommendations and measurable, appropriate strategies that could provide hospitals with key stakeholder insights on how to improve their disclosure processes. Sharing patients' and family members' experiences with healthcare providers can encourage a culture shift in which patients are valued and active participants in hospital practices. To my knowledge, this research is the first of its kind and moves the disclosure conversation forward in a direction of patient and family member inclusion that will assist in improving disclosure practices. Future research should implement and evaluate the success of the various inclusion strategies.