3 results for sampling error


Relevance:

70.00%

Publisher:

Abstract:

BACKGROUND: Missed, delayed or incorrect diagnoses are considered diagnostic errors. The aim of this paper is to describe the methodology of a study analysing the cognitive aspects of the process by which primary care (PC) physicians diagnose dyspnoea. It examines the possible links between the use of heuristics, suboptimal cognitive acts and diagnostic errors, using Reason's taxonomy of human error (slips, lapses, mistakes and violations). The influence of situational factors (professional experience, perceived overwork and fatigue) is also analysed.

METHODS: Cohort study of new episodes of dyspnoea in patients receiving care from family physicians and residents at PC centres in Granada (Spain). Assuming an expected diagnostic error rate of 20% and a sampling error of 3%, 384 episodes of dyspnoea are calculated to be required. In addition to completing the electronic medical record of the patients attended, each physician fills out two specially designed questionnaires about the diagnostic process followed in each case of dyspnoea. The first questionnaire includes questions on the physician's initial diagnostic impression, the three most likely diagnoses (in order of likelihood), and the diagnosis reached after the initial medical history and physical examination; it also includes items on the physician's perceived overwork and fatigue during patient care. The second questionnaire records the confirmed diagnosis once it is reached. The complete diagnostic process is peer-reviewed to identify and classify the diagnostic errors. The possible use of the representativeness, availability, and anchoring-and-adjustment heuristics in each diagnostic process is also analysed. Each audit is reviewed with the physician responsible for the diagnostic process. Finally, logistic regression models are used to determine whether the diagnostic error variables differ according to the heuristics identified.

DISCUSSION: This work sets out a new approach to studying the diagnostic decision-making process in PC, taking advantage of new technologies that allow immediate recording of the decision-making process.
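The sample-size figure quoted in the METHODS section can be related to the standard formula for estimating a proportion, n = z^2 * p * (1 - p) / d^2. The abstract does not state the confidence level or any corrections (finite population, anticipated losses), so the short Python sketch below assumes a two-sided 95% level purely for illustration; it is not the authors' actual calculation, and with p = 0.20 and d = 0.03 the standard formula gives a larger n than the 384 quoted (384 corresponds to a 4% margin under these assumptions).

```python
# Sketch of the standard sample-size formula for a proportion,
#   n = z^2 * p * (1 - p) / d^2,
# applied to the figures quoted in the abstract. The confidence level
# (assumed 95% here) and any corrections are not stated in the source.
from statistics import NormalDist

def n_for_proportion(p: float, d: float, confidence: float = 0.95) -> int:
    """Sample size needed to estimate a proportion p within +/- d."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return round(z ** 2 * p * (1 - p) / d ** 2)

print(n_for_proportion(p=0.20, d=0.03))  # ~683 at an assumed 95% level
print(n_for_proportion(p=0.20, d=0.04))  # ~384, the figure quoted in the abstract
```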

Relevance:

60.00%

Publisher:

Abstract:

Non-alcoholic fatty liver disease (NAFLD) is an emerging health concern in both the developed and the developing world, encompassing a spectrum that ranges from simple steatosis to non-alcoholic steatohepatitis (NASH), cirrhosis and liver cancer. The incidence and prevalence of this disease are increasing as a result of socioeconomic transition and the shift to a harmful diet. Currently, the gold-standard method for NAFLD diagnosis is liver biopsy, despite its complications and its limited accuracy due to sampling error. Furthermore, the pathogenesis of NAFLD is not fully understood, although it is well known that obesity, diabetes and metabolic derangements play a major role in disease development and progression. In addition, the gut microbiome and the host's genetic and epigenetic background could explain considerable interindividual variability. The recognition that epigenetic mechanisms, heritable events not caused by changes in the DNA sequence, contribute to the development of disease has been a revolution in recent years. Evidence is accumulating on the important role of epigenetics in NAFLD pathogenesis and in the genesis of NASH. Histone modifications, changes in DNA methylation and aberrant microRNA profiles could drive the development of NAFLD and its transition into a clinically relevant condition. The PNPLA3 GG genotype has been associated with more progressive disease, and epigenetics could modulate this effect. The impact of epigenetics on NAFLD progression may warrant further work on therapeutic targets, together with future non-invasive methods for the diagnosis and staging of NAFLD.

Relevance:

20.00%

Publisher:

Abstract:

In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges for the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean plot, how to explore nonlinearity with the generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates for the calibration model. The performance of the two-part calibration model was compared with that of its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in an approximately threefold increase in the strength of the association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting of the two-part calibration model. Moreover, the extent of error adjustment is influenced by the number and forms of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
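As a rough illustration of the two-part calibration idea described above, the Python sketch below fits part 1 (a logistic model for the probability of any consumption being reported on the 24-hour recall) and part 2 (a linear model for the consumed amount among consumers), then combines their predictions into a calibrated intake. All column names and the simulated data are hypothetical; this is a minimal sketch, not the EPIC analysis, which additionally addresses transformations, heteroscedasticity, nonlinearity (GAM, empirical logit) and covariate selection.

```python
# Minimal sketch of a two-part regression calibration step for an
# episodically consumed food. Column names and simulated data are
# hypothetical; this is not the EPIC analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "ffq_veg": rng.gamma(2.0, 50.0, n),   # error-prone questionnaire intake (g/day)
    "age": rng.uniform(35, 70, n),
    "sex": rng.integers(0, 2, n),
})
# Simulated single 24-hour recall with excess zeroes (non-consumption days).
p_consume = 1.0 / (1.0 + np.exp(-(0.01 * df["ffq_veg"] - 1.0)))
consumed = rng.random(n) < p_consume
df["recall_veg"] = np.where(
    consumed, np.maximum(0.8 * df["ffq_veg"] + rng.normal(0, 30, n), 1.0), 0.0
)

# Part 1: probability of consumption on the recall day (logistic regression).
part1 = smf.logit(
    "consumed ~ ffq_veg + age + sex",
    data=df.assign(consumed=(df["recall_veg"] > 0).astype(int)),
).fit(disp=0)

# Part 2: consumed amount among consumers (a simple linear model here).
part2 = smf.ols("recall_veg ~ ffq_veg + age + sex", data=df[df["recall_veg"] > 0]).fit()

# Calibrated intake = P(consumption) * E(amount | consumption); this value
# would replace the error-prone intake in the disease (e.g. Cox) model.
df["calibrated_veg"] = part1.predict(df) * part2.predict(df)
print(df[["ffq_veg", "calibrated_veg"]].head())
```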