373 results for 730305 Diagnostic methods
Abstract:
Introduction: two symposia on "cardiovascular diseases and vulnerable plaques". Cardiovascular disease (CVD) is the leading cause of death worldwide. Enormous effort has been made across many disciplines, including medical imaging, computational modeling, biomechanics, bioengineering, medical devices, animal and clinical studies, population studies, and genomic, molecular, cellular and organ-level studies, seeking improved methods for early detection, diagnosis, prevention and treatment of these diseases [1-14]. However, the mechanisms governing the initiation and progression of disease, and the occurrence of final acute clinical CVD events, are still poorly understood. A large number of victims of these diseases who are apparently healthy die suddenly without prior symptoms, and available screening and diagnostic methods are insufficient to identify them before the event occurs [8,9]. Most cardiovascular events are associated with vulnerable plaques. A grand challenge is to develop new imaging techniques, predictive methods and patient screening tools that identify vulnerable plaques and the patients most at risk of plaque rupture and associated clinical events such as stroke and heart attack, and to recommend treatment plans that prevent those events. The articles in this special issue came from two recent symposia on "Cardiovascular Diseases and Vulnerable Plaques: Data, Modeling, Predictions and Clinical Applications." One was held at Worcester Polytechnic Institute (WPI), Worcester, MA, USA, July 13-14, 2014, immediately after the 7th World Congress of Biomechanics; it was endorsed by the World Council of Biomechanics and partially supported by a grant from the NIH National Institute of Biomedical Imaging and Bioengineering. The other was held at Southeast University (SEU), Nanjing, China, April 18-20, 2014.
Abstract:
Endometriosis is a common gynecological disease that affects up to 10% of women in their reproductive years. It causes pelvic pain, severe dysmenorrhea, and subfertility. The disease is defined as the presence of tissue resembling endometrium in sites outside the uterus. Its cause remains uncertain despite >50 years of hypothesis-driven research, and thus the therapeutic options are limited. Disease predisposition is inherited as a complex genetic trait, which provides an alternative route to understanding the disease. We seek to identify susceptibility loci, using a positional-cloning approach that starts with linkage analysis to identify genomic regions likely to harbor these genes. We conducted a linkage study of 1,176 families (931 from an Australian group and 245 from a U.K. group), each with at least two members--mainly affected sister pairs--with surgically diagnosed disease. We have identified a region of significant linkage on chromosome 10q26 (maximum LOD score [MLS] of 3.09; genomewide P = .047) and another region of suggestive linkage on chromosome 20p13 (MLS = 2.09). Minor peaks (with MLS > 1.0) were found on chromosomes 2, 6, 7, 8, 12, 14, 15, and 17. This is the first report of linkage to a major locus for endometriosis. The findings will facilitate discovery of novel positional genetic variants that influence the risk of developing this debilitating disease. Greater understanding of the aberrant cellular and molecular mechanisms involved in the etiology and pathophysiology of endometriosis should lead to better diagnostic methods and targeted treatments.
Abstract:
Background: Patterns of diagnosis and management for men diagnosed with prostate cancer in Queensland, Australia, have not yet been systematically documented, so assumptions of equity are untested. This longitudinal study investigates the association between prostate cancer diagnostic and treatment outcomes and key area-level characteristics and individual-level demographic, clinical and psychosocial factors.

Methods/Design: A total of 1064 men diagnosed with prostate cancer between February 2005 and July 2007 were recruited through hospital-based urology outpatient clinics and private practices in the centres of Brisbane, Townsville and Mackay (82% of those referred). Additional clinical and diagnostic information for all 6609 men diagnosed with prostate cancer in Queensland during the study period was obtained via the population-based Queensland Cancer Registry. Respondent data are collected using telephone and self-administered questionnaires at pre-treatment and at 2, 6, 12, 24, 36, 48 and 60 months post-treatment. Assessments include demographics, medical history, patterns of care, disease and treatment characteristics together with outcomes associated with prostate cancer, as well as information about quality of life and psychological adjustment. Complementary detailed treatment information is abstracted from participants' medical records held in hospitals and private treatment facilities and collated with health service utilisation data obtained from Medicare Australia. Information about the characteristics of geographical areas is being obtained from data custodians such as the Australian Bureau of Statistics. Geo-coding and spatial technology will be used to calculate road travel distances from patients' residences to treatment centres.
Analyses will be conducted using standard statistical methods along with multilevel regression models including individual and area-level components.

Conclusions: Information about the diagnostic and treatment patterns of men diagnosed with prostate cancer is crucial for rational planning and development of health delivery and supportive care services, to ensure equitable access to health services regardless of geographical location and individual characteristics. This study is a secondary outcome of the randomised controlled trial registered with the Australian New Zealand Clinical Trials Registry (ACTRN12607000233426).
Comparison of standard image segmentation methods for segmentation of brain tumors from 2D MR images
Abstract:
In the analysis of medical images for computer-aided diagnosis and therapy, segmentation is often required as a preliminary step. Medical image segmentation is a complex and challenging task due to the complex nature of the images. The brain has a particularly complicated structure, and its precise segmentation is very important for detecting tumors, edema, and necrotic tissue in order to prescribe appropriate therapy. Magnetic Resonance Imaging (MRI) is an important diagnostic imaging technique for early detection of abnormal changes in tissues and organs. It possesses good contrast resolution for different tissues and is therefore preferred over Computerized Tomography for brain studies, so the majority of research in medical image segmentation concerns MR images. As the core of this research, a set of MR images was segmented using standard image segmentation techniques to isolate a brain tumor from the other regions of the brain. The resulting segmentations were then compared with each other and analyzed by professional radiologists to determine which technique is the most accurate. Experimental results show that Otsu's thresholding method is the most suitable for segmenting a brain tumor from a magnetic resonance image.
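Otsu's method, the best-performing technique in this comparison, chooses the threshold that maximizes the between-class variance of the grayscale histogram. A minimal pure-Python sketch; production code would typically use a library routine such as scikit-image's `threshold_otsu`, and the toy "image" below is illustrative only:

```python
# Otsu's thresholding: exhaustively score every candidate threshold by the
# between-class variance it induces, and keep the best one.

def otsu_threshold(pixels, levels=256):
    """Return the Otsu threshold for an iterable of integer intensities."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))

    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(levels):
        w_bg += hist[t]              # background = pixels with intensity <= t
        if w_bg == 0:
            continue
        w_fg = total - w_bg          # foreground = remaining pixels
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# Bimodal toy "image": dark background around 30, bright "tumor" around 200.
img = [30, 32, 28, 31, 29, 30, 200, 202, 198, 201]
t = otsu_threshold(img)
mask = [1 if p > t else 0 for p in img]   # binary tumor mask
```

The threshold lands between the two intensity clusters, so thresholding the image yields a clean foreground mask.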
Abstract:
Circuit breaker restrikes are unwanted occurrences which can ultimately lead to breaker failure. Before 2008, there was little evidence in the literature of monitoring techniques based on measuring and interpreting the restrikes produced during switching of capacitor banks and shunt reactor banks. In 2008 a non-intrusive radiometric restrike measurement method, together with a restrike hardware detection algorithm, was developed. The limitations of the radiometric measurement method are a band-limited frequency response and limited amplitude determination. Existing detection methods and algorithms required wide-bandwidth current transformers and voltage dividers. A novel non-intrusive restrike diagnostic algorithm using ATP (Alternative Transient Program) and wavelet transforms is proposed. Wavelet transforms are widely used in signal processing; here the algorithm comprises two tests, restrike detection and energy-level classification, based on deteriorated waveforms in different types of restrike. A 'db5' wavelet was selected as it gave a 97% correct diagnostic rate when evaluated against a database of diagnostic signatures, and a 92% correct diagnostic rate on restrike waveforms simulated under different network parameters. The diagnostic technique and methodology developed in this research can be applied, with slight modification, to restrike detection in any power monitoring system.
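The wavelet-based detection idea can be sketched with a single-level Haar transform, the simplest wavelet (the study itself used 'db5', available in packages such as PyWavelets). A restrike injects a sharp transient, which shows up as energy in the high-frequency detail coefficients; the energy threshold below is illustrative, not a value from the paper:

```python
import math

def haar_dwt(signal):
    """One-level Haar transform: (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def detail_energy(signal):
    """Energy carried by the high-frequency detail coefficients."""
    _, d = haar_dwt(signal)
    return sum(c * c for c in d)

def is_restrike(signal, threshold=1.0):
    """Flag a waveform whose high-frequency energy exceeds the threshold."""
    return detail_energy(signal) > threshold

# Smooth power-frequency-like waveform vs. one with a sharp transient.
smooth = [math.sin(2 * math.pi * x / 64) for x in range(64)]
transient = smooth[:]
transient[30] += 5.0   # sudden voltage spike, as produced by a restrike
```

The smooth waveform concentrates its energy in the approximation coefficients, so only the spiked waveform trips the detector.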
Abstract:
Introduction: The suitability of video conferencing (VC) technology for clinical purposes relevant to geriatric medicine is still being established. This project aimed to determine the validity of diagnosing dementia via VC. Methods: This was a multisite, noninferiority, prospective cohort study. Patients aged 50 years and older, referred by their primary care physician for cognitive assessment, were assessed at 4 memory disorder clinics. All patients were assessed independently by 2 specialist physicians. They were allocated one face-to-face (FTF) assessment (reference standard, usual clinical practice) and an additional assessment (either a second FTF assessment or a VC assessment) on the same day. Each specialist physician had access to the patient chart and the results of a battery of standardized cognitive assessments administered FTF by the clinic nurse. Percentage agreement (P0) and the weighted kappa statistic with linear weights (Kw) were used to assess inter-rater reliability across the 2 study groups on the diagnosis of dementia (cognition normal, impaired, or demented). Results: The 205 patients were allocated to the Videoconference (n = 100) or Standard Practice (n = 105) group; 106 were men. The average age was 76 (SD 9; range 51–95) and the average Standardized Mini-Mental State Examination score was 23.9 (SD 4.7; range 9–30). Agreement for the Videoconference group (P0 = 0.71; Kw = 0.52; P < .0001) and for the Standard Practice group (P0 = 0.70; Kw = 0.50; P < .0001) was statistically significant in both cases (P < .05). The summary kappa statistic of 0.51 (P = .84) indicated that VC was not inferior to FTF assessment. Conclusions: Previous studies have shown that preliminary standardized assessment tools can be reliably administered and scored via VC.
This study focused on the geriatric assessment component of the interview (interpretation of standardized assessments, taking a history, and formulating a diagnosis by a medical specialist) and identified high levels of agreement for diagnosing dementia. A model of service incorporating locally or remotely administered standardized assessments together with remote specialist assessment is a reliable process for enabling the diagnosis of dementia in isolated older adults.
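The linear-weighted kappa (Kw) reported above generalizes Cohen's kappa by penalizing disagreements in proportion to their ordinal distance on the three-level outcome (normal / impaired / demented). A small sketch with made-up ratings, not the study data (library routines such as scikit-learn's `cohen_kappa_score` with `weights="linear"` compute the same quantity):

```python
def weighted_kappa(r1, r2, k=3):
    """Linear-weighted kappa for two raters' ordinal ratings in 0..k-1."""
    n = len(r1)
    # Observed joint rating proportions.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1.0 / n
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]  # rater-1 margins
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater-2 margins
    # Linear disagreement weight: |i - j| / (k - 1).
    d_obs = sum(abs(i - j) / (k - 1) * obs[i][j]
                for i in range(k) for j in range(k))
    d_exp = sum(abs(i - j) / (k - 1) * p1[i] * p2[j]
                for i in range(k) for j in range(k))
    return 1.0 - d_obs / d_exp

# Illustrative ratings: 0 = normal, 1 = impaired, 2 = demented.
ftf = [0, 0, 1, 1, 2, 2, 2, 1, 0, 2]  # face-to-face specialist
vc  = [0, 0, 1, 2, 2, 2, 1, 1, 0, 2]  # videoconference specialist
kw = weighted_kappa(ftf, vc)
```

Because the two disagreements are each only one category apart, the linear weighting penalizes them half as much as a normal-vs-demented disagreement would be.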
Abstract:
Background: Patients with chest pain contribute substantially to emergency department attendances, lengthy hospital stays, and inpatient admissions. A reliable, reproducible, and fast process to identify patients presenting with chest pain who have a low short-term risk of a major adverse cardiac event is needed to facilitate early discharge. We aimed to prospectively validate the safety of a predefined 2-h accelerated diagnostic protocol (ADP) to assess patients presenting to the emergency department with chest pain symptoms suggestive of acute coronary syndrome. Methods: This observational study was undertaken in 14 emergency departments in nine countries in the Asia-Pacific region, in patients aged 18 years and older with at least 5 min of chest pain. The ADP included use of a structured pre-test probability scoring method (Thrombolysis in Myocardial Infarction [TIMI] score), electrocardiography, and a point-of-care biomarker panel of troponin, creatine kinase MB, and myoglobin. The primary endpoint was major adverse cardiac events within 30 days after initial presentation (including initial hospital attendance). This trial is registered with the Australia-New Zealand Clinical Trials Registry, number ACTRN12609000283279. Findings: 3582 consecutive patients were recruited and completed 30-day follow-up. 421 (11·8%) patients had a major adverse cardiac event. The ADP classified 352 (9·8%) patients as low risk and potentially suitable for early discharge. A major adverse cardiac event occurred in three (0·9%) of these patients, giving the ADP a sensitivity of 99·3% (95% CI 97·9–99·8), a negative predictive value of 99·1% (97·3–99·8), and a specificity of 11·0% (10·0–12·2). Interpretation: This novel ADP identifies patients at very low risk of a short-term major adverse cardiac event who might be suitable for early discharge. Such an approach could be used to decrease the overall observation periods and admissions for chest pain.
The components needed for the implementation of this strategy are widely available. The ADP has the potential to affect health-service delivery worldwide.
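The reported metrics follow directly from the published counts (3582 patients, 421 events, 352 classified low risk, 3 events among the low-risk group). A short sketch reconstructing the 2×2 table and recomputing sensitivity, specificity and negative predictive value:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic test metrics."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "npv": tn / (tn + fn),
    }

n = 3582                                 # consecutive patients recruited
events, low_risk, events_in_low_risk = 421, 352, 3

tp = events - events_in_low_risk         # events flagged not-low-risk: 418
fn = events_in_low_risk                  # events missed (classified low risk): 3
tn = low_risk - events_in_low_risk       # event-free and classified low risk: 349
fp = n - events - tn                     # event-free but flagged: 2812

m = diagnostic_metrics(tp, fp, fn, tn)
```

Rounded to one decimal place, these recover the published 99.3% sensitivity, 99.1% NPV, and 11.0% specificity.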
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization, and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now widely applied to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing paradigms and the characteristics of quality control charts and techniques across the different sectors presents opportunities for knowledge transfer and future development in each sector. Meanwhile, the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed through probability, facilitates decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; a data-capturing algorithm using Bayesian decision-making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. The approach yields highly informative estimates of the change point parameters, since the results are based on full probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are demonstrated in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, validated against known causes, detected by control charts monitoring the rate of excess usage of blood products and of major adverse events during and after cardiac surgery in a local hospital. The development of Bayesian change point estimators is then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Second, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly support the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The advantages of the proposed Bayesian framework and estimators are reinforced when probability quantification, flexibility and the generalizability of the developed models are also considered, and the benefits seen in the healthcare context may extend back to the industrial and business domains where quality monitoring was initially developed.
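The change point idea can be sketched for the simplest case considered above: a step change in a Poisson rate. For illustration only, the pre- and post-change rates are assumed known and a uniform prior is placed on the change point; the thesis itself uses richer hierarchical models fitted by MCMC. The counts below are simulated, not hospital data:

```python
import math

def poisson_logpmf(k, lam):
    """Log P(K = k) for a Poisson(lam) variable."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def change_point_posterior(counts, lam0, lam1):
    """P(tau = t | counts): the rate switches from lam0 to lam1 at index t."""
    n = len(counts)
    loglik = []
    for t in range(1, n):  # change point strictly inside the series
        ll = sum(poisson_logpmf(c, lam0) for c in counts[:t])
        ll += sum(poisson_logpmf(c, lam1) for c in counts[t:])
        loglik.append(ll)
    # Uniform prior on tau: posterior is the normalized likelihood.
    m = max(loglik)
    w = [math.exp(l - m) for l in loglik]
    z = sum(w)
    return [x / z for x in w]  # posterior over tau = 1 .. n-1

# Simulated adverse-event counts: rate jumps from ~2 to ~8 at index 6.
counts = [2, 1, 3, 2, 2, 1, 8, 9, 7, 8, 10, 9]
post = change_point_posterior(counts, lam0=2.0, lam1=8.0)
tau_map = 1 + post.index(max(post))   # maximum a posteriori change point
```

Because the posterior is a full distribution over the change time rather than a single estimate, the uncertainty about when the shift occurred is quantified directly.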
Abstract:
Background: Although rapid diagnostic tests (RDTs) for Plasmodium falciparum infection that target histidine-rich protein 2 (PfHRP2) are generally sensitive, their performance has been reported to be variable. One possible explanation for variable test performance is differences in the expression level of PfHRP in different parasite isolates. Methods: Total RNA and protein were extracted from synchronised cultures of 7 P. falciparum lines over 5 time points of the life cycle, and from synchronised ring stages of 10 P. falciparum lines. Using quantitative real-time polymerase chain reaction, Western blot analysis and ELISA, we investigated variation in the transcription of pfhrp2 and pfhrp3 and in the protein levels of PfHRP in the different parasite lines over the parasite intraerythrocytic life cycle. Results: Transcription of pfhrp2 and pfhrp3 in different parasite lines over the parasite life cycle varied relative to the control parasite K1, and in some parasite lines very low transcription of these genes was observed. Peak transcription occurred in ring-stage parasites, and pfhrp2 transcription was consistently higher than pfhrp3 transcription within parasite lines. The intraerythrocytic life cycle stage at which the peak level of protein was present varied across strains. Total protein levels were more constant than total mRNA transcription; however, a maximum 24-fold difference in ring-stage expression relative to the K1 strain was observed. Conclusions: The levels of transcription of pfhrp2 and pfhrp3, and the protein expression of PfHRP, varied between different P. falciparum strains. This variation may affect the detection sensitivity of PfHRP2-detecting RDTs.
Abstract:
BACKGROUND: Effective diagnosis of malaria is a major component of case management. Rapid diagnostic tests (RDTs) based on Plasmodium falciparum histidine-rich protein 2 (PfHRP2) are popular for diagnosis of this most virulent malaria infection. However, concerns have been raised about the longevity of PfHRP2 antigenaemia following curative treatment in endemic regions. METHODS: A model of PfHRP2 production and decay was developed to mimic the kinetics of PfHRP2 antigenaemia during infections. Data from two human infection studies were used to fit the model and to investigate PfHRP2 kinetics. Four malaria RDTs were assessed in the laboratory to determine the minimum detectable concentration of PfHRP2. RESULTS: Fitting of the PfHRP2 dynamics model indicated that in malaria-naive hosts, P. falciparum parasites of the 3D7 strain produce 1.4 × 10⁻¹³ g of PfHRP2 per parasite per replication cycle. The four RDTs had minimum detection thresholds between 6.9 and 27.8 ng/mL. Combining these detection thresholds with the kinetics of PfHRP2, it is predicted that as few as 8 parasites/µL may be required to maintain a positive RDT in a chronic infection. CONCLUSIONS: The results of the model indicate that good quality PfHRP2-based RDTs should be able to detect parasites on the first day of symptoms, and that the persistence of the antigen will cause the tests to remain positive for at least seven days after treatment. The duration of a positive test result following curative treatment depends on the duration and density of parasitaemia prior to treatment and on the presence and affinity of anti-PfHRP2 antibodies.
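A production-and-decay model of this kind can be sketched in a few lines: each 48-h replication cycle releases antigen proportional to parasite burden, and circulating antigen decays exponentially. The per-parasite production figure is the study's 3D7 estimate; the blood volume, half-life and parasite densities are illustrative assumptions, not fitted values:

```python
import math

HRP2_PER_PARASITE_G = 1.4e-13   # g PfHRP2 per parasite per cycle (study's 3D7 estimate)
BLOOD_VOLUME_ML = 5000.0        # adult blood volume, assumed
HALF_LIFE_DAYS = 3.7            # antigen elimination half-life, illustrative

def antigen_ng_per_ml(parasites_per_ul, cycles):
    """PfHRP2 concentration (ng/mL) after `cycles` 48-h replication cycles."""
    decay = 0.5 ** (2.0 / HALF_LIFE_DAYS)            # decay factor per 48-h cycle
    total_parasites = parasites_per_ul * BLOOD_VOLUME_ML * 1000.0  # uL of blood
    release_ng = total_parasites * HRP2_PER_PARASITE_G * 1e9       # g -> ng
    conc = 0.0
    for _ in range(cycles):
        conc = conc * decay + release_ng / BLOOD_VOLUME_ML
    return conc

def days_to_negative(conc_ng_ml, threshold_ng_ml):
    """Days after cure until the antigen decays below the RDT threshold."""
    return HALF_LIFE_DAYS * math.log2(conc_ng_ml / threshold_ng_ml)

# A symptomatic-density infection (~1000 parasites/uL) near steady state.
conc = antigen_ng_per_ml(1000, 20)
```

Even with these rough parameters, the model reproduces the qualitative conclusion: a moderate-density infection accumulates antigen well above the RDT thresholds, and after curative treatment the residual antigen takes well over a week to decay below them.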
Abstract:
Background: Malaria rapid diagnostic tests (RDTs) are increasingly used by remote health personnel with minimal training in laboratory techniques. RDTs must, therefore, be as simple, safe and reliable as possible. Transfer of blood from the patient to the RDT is critical to safety and accuracy, and poses a significant challenge to many users. Blood transfer devices were evaluated for accuracy and precision of volume transferred, safety and ease of use, to identify the most appropriate devices for use with RDTs in routine clinical care. Methods: Five devices, a loop, straw-pipette, calibrated pipette, glass capillary tube, and a new inverted cup device, were evaluated in Nigeria, the Philippines and Uganda. The 227 participating health workers used each device to transfer blood from a simulated finger-prick site to filter paper. For each transfer, the number of attempts required to collect and deposit blood and any spilling of blood during transfer were recorded. Perceptions of ease of use and safety of each device were recorded for each participant. Blood volume transferred was calculated from the area of blood spots deposited on filter paper. Results: The overall mean volumes transferred by devices differed significantly from the target volume of 5 microliters (p < 0.001). The inverted cup (4.6 microliters) most closely approximated the target volume. The glass capillary was excluded from volume analysis as the estimation method used is not compatible with this device. The calibrated pipette accounted for the largest proportion of blood exposures (23/225, 10%); exposures ranged from 2% to 6% for the other four devices. The inverted cup was considered easiest to use in blood collection (206/226, 91%); the straw-pipette and calibrated pipette were rated lowest (143/225 [64%] and 135/225 [60%], respectively). Overall, the inverted cup was the most preferred device (72%, 163/227), followed by the loop (61%, 138/227).
Conclusions: The performance of blood transfer devices varied in this evaluation of accuracy, blood safety, ease of use, and user preference. The inverted cup design achieved the highest overall performance, while the loop also performed well. These findings have relevance for any point-of-care diagnostics that require blood sampling.
Abstract:
Background Accurate diagnosis is essential for prompt and appropriate treatment of malaria. While rapid diagnostic tests (RDTs) offer great potential to improve malaria diagnosis, the sensitivity of RDTs has been reported to be highly variable. One possible factor contributing to variable test performance is the diversity of parasite antigens. This is of particular concern for Plasmodium falciparum histidine-rich protein 2 (PfHRP2)-detecting RDTs since PfHRP2 has been reported to be highly variable in isolates of the Asia-Pacific region. Methods The pfhrp2 exon 2 fragment from 458 isolates of P. falciparum collected from 38 countries was amplified and sequenced. For a subset of 80 isolates, the exon 2 fragment of histidine-rich protein 3 (pfhrp3) was also amplified and sequenced. DNA sequence and statistical analysis of the variation observed in these genes was conducted. The potential impact of the pfhrp2 variation on RDT detection rates was examined by analysing the relationship between sequence characteristics of this gene and the results of the WHO product testing of malaria RDTs: Round 1 (2008), for 34 PfHRP2-detecting RDTs. Results Sequence analysis revealed extensive variations in the number and arrangement of various repeats encoded by the genes in parasite populations world-wide. However, no statistically robust correlation between gene structure and RDT detection rate for P. falciparum parasites at 200 parasites per microlitre was identified. Conclusions The results suggest that despite extreme sequence variation, diversity of PfHRP2 does not appear to be a major cause of RDT sensitivity variation.
Abstract:
Background Rapid diagnostic tests (RDTs) for detection of Plasmodium falciparum infection that target P. falciparum histidine-rich protein 2 (PfHRP2), a protein that circulates in the blood of patients infected with this species of malaria, are widely used to guide case management. Understanding determinants of PfHRP2 availability in circulation is therefore essential to understanding the performance of PfHRP2-detecting RDTs. Methods The possibility that pre-formed host anti-PfHRP2 antibodies may block target antigen detection, thereby causing false negative test results was investigated in this study. Results Anti-PfHRP2 antibodies were detected in 19/75 (25%) of plasma samples collected from patients with acute malaria from Cambodia, Nigeria and the Philippines, as well as in 3/28 (10.7%) asymptomatic Solomon Islands residents. Pre-incubation of plasma samples from subjects with high-titre anti-PfHRP2 antibodies with soluble PfHRP2 blocked the detection of the target antigen on two of the three brands of RDTs tested, leading to false negative results. Pre-incubation of the plasma with intact parasitized erythrocytes resulted in a reduction of band intensity at the highest parasite density, and a reduction of lower detection threshold by ten-fold on all three brands of RDTs tested. Conclusions These observations indicate possible reduced sensitivity for diagnosis of P. falciparum malaria using PfHRP2-detecting RDTs among people with high levels of specific antibodies and low density infection, as well as possible interference with tests configured to detect soluble PfHRP2 in saliva or urine samples. Further investigations are required to assess the impact of pre-formed anti-PfHRP2 antibodies on RDT performance in different transmission settings.
Abstract:
Background: Malaria rapid diagnostic tests (RDTs) are appropriate for case management, but persistent antigenaemia is a concern for HRP2-detecting RDTs in endemic areas. It has been suggested that pan-pLDH test bands on combination RDTs could be used to distinguish persistent antigenaemia from active Plasmodium falciparum infection, however this assumes all active infections produce positive results on both bands of RDTs, an assertion that has not been demonstrated. Methods: In this study, data generated during the WHO-FIND product testing programme for malaria RDTs was reviewed to investigate the reactivity of individual test bands against P. falciparum in 18 combination RDTs. Each product was tested against multiple wild-type P. falciparum only samples. Antigen levels were measured by quantitative ELISA for HRP2, pLDH and aldolase. Results: When tested against P. falciparum samples at 200 parasites/μL, 92% of RDTs were positive; 57% of these on both the P. falciparum and pan bands, while 43% were positive on the P. falciparum band only. There was a relationship between antigen concentration and band positivity; ≥4 ng/mL of HRP2 produced positive results in more than 95% of P. falciparum bands, while ≥45 ng/mL of pLDH was required for at least 90% of pan bands to be positive. Conclusions: In active P. falciparum infections it is common for combination RDTs to return a positive HRP2 band combined with a negative pan-pLDH band, and when both bands are positive, often the pan band is faint. Thus active infections could be missed if the presence of a HRP2 band in the absence of a pan band is interpreted as being caused solely by persistent antigenaemia.
Abstract:
Background: Changing perspectives on the natural history of celiac disease (CD), new serology and genetic tests, and amended histological criteria for diagnosis cast doubt on past prevalence estimates for CD. We set out to establish a more accurate prevalence estimate for CD using a novel serogenetic approach. Methods: The human leukocyte antigen (HLA)-DQ genotype was determined in 356 patients with 'biopsy-confirmed' CD, and in two age-stratified, randomly selected community cohorts of 1,390 women and 1,158 men. Sera were screened for CD-specific serology. Results: Only five 'biopsy-confirmed' patients with CD did not possess the susceptibility alleles HLA-DQ2.5, DQ8, or DQ2.2, and four of these were misdiagnoses. HLA-DQ2.5, DQ8, or DQ2.2 was present in 56% of all women and men in the community cohorts. Transglutaminase (TG)-2 IgA and composite TG2/deamidated gliadin peptide (DGP) IgA/IgG were abnormal in 4.6% and 5.6%, respectively, of the community women and 6.9% and 6.9%, respectively, of the community men, but in the screen-positive group, only 71% and 75%, respectively, of women and 65% and 63%, respectively, of men possessed HLA-DQ2.5, DQ8, or DQ2.2. Medical review was possible for 41% of seropositive women and 50% of seropositive men, and led to biopsy-confirmed CD in 10 women (0.7%) and 6 men (0.5%), but based on relative risk for HLA-DQ2.5, DQ8, or DQ2.2 in all TG2 IgA or TG2/DGP IgA/IgG screen-positive subjects, CD affected 1.3% or 1.9%, respectively, of females and 1.3% or 1.2%, respectively, of men. Serogenetic data from these community cohorts indicated that testing screen positives for HLA-DQ, or carrying out HLA-DQ and further serology, could have reduced unnecessary gastroscopies due to false-positive serology by at least 40% and by over 70%, respectively. Conclusions: Screening with TG2 IgA serology and requiring biopsy confirmation caused the community prevalence of CD to be substantially underestimated.
Testing for HLA-DQ genes and confirmatory serology could reduce the numbers of unnecessary gastroscopies. © 2013 Anderson et al.; licensee BioMed Central Ltd.