66 results for Celiac Disease, Diagnostic Accuracy, tTG-IgA, Small Bowel Biopsy

at Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Publisher:

Abstract:

Introduction: The suitability of videoconferencing (VC) technology for clinical purposes relevant to geriatric medicine is still being established. This project aimed to determine the validity of the diagnosis of dementia via VC.

Methods: This was a multisite, noninferiority, prospective cohort study. Patients aged 50 years and older, referred by their primary care physician for cognitive assessment, were assessed at 4 memory disorder clinics. All patients were assessed independently by 2 specialist physicians. They were allocated one face-to-face (FTF) assessment (the reference standard, i.e. usual clinical practice) and an additional assessment (either a second FTF assessment or a VC assessment) on the same day. Each specialist physician had access to the patient chart and the results of a battery of standardized cognitive assessments administered FTF by the clinic nurse. Percentage agreement (P0) and the weighted kappa statistic with linear weights (Kw) were used to assess inter-rater reliability across the 2 study groups on the diagnosis of dementia (cognition normal, impaired, or demented).

Results: The 205 patients were allocated to the Videoconference (n = 100) or Standard Practice (n = 105) group; 106 were men. The average age was 76 years (SD 9, range 51–95) and the average Standardized Mini-Mental State Examination score was 23.9 (SD 4.7, range 9–30). Agreement for the Videoconference group (P0 = 0.71; Kw = 0.52; P < .0001) and for the Standard Practice group (P0 = 0.70; Kw = 0.50; P < .0001) were both statistically significant (P < .05). The summary kappa statistic of 0.51 (P = .84) indicated that VC was not inferior to FTF assessment.

Conclusions: Previous studies have shown that preliminary standardized assessment tools can be reliably administered and scored via VC. This study focused on the geriatric assessment component of the interview (interpretation of standardized assessments, taking a history, and formulating a diagnosis by a medical specialist) and identified high levels of agreement for diagnosing dementia. A model of service incorporating either locally or remotely administered standardized assessments, and remote specialist assessment, is a reliable process for enabling the diagnosis of dementia for isolated older adults.
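The agreement statistics reported above (P0 and the linearly weighted kappa Kw) can both be computed from a k × k confusion matrix of the two raters' diagnoses. A minimal sketch in Python; the 3 × 3 counts over (normal, impaired, demented) are illustrative, not the study's data:

```python
def percent_agreement(matrix):
    """Raw agreement P0: proportion of cases on which both raters agree."""
    n = sum(sum(row) for row in matrix)
    return sum(matrix[i][i] for i in range(len(matrix))) / n

def weighted_kappa(matrix):
    """Cohen's kappa with linear weights (Kw) for a k x k confusion matrix."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    rows = [sum(row) for row in matrix]
    cols = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    # Linear weights: full credit on the diagonal, partial credit for
    # near-misses (e.g. "normal" vs "impaired"), none for opposite ends.
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    po = sum(w[i][j] * matrix[i][j] for i in range(k) for j in range(k)) / n
    pe = sum(w[i][j] * rows[i] * cols[j] for i in range(k) for j in range(k)) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical rater-1 x rater-2 counts over (normal, impaired, demented):
m = [[40, 8, 2], [10, 60, 12], [3, 15, 55]]
print(percent_agreement(m), weighted_kappa(m))
```

Kw corrects the weighted agreement for the agreement expected by chance from the marginal totals, which is why it is lower than the raw P0 values quoted in the abstract.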

Relevance: 100.00%

Publisher:

Abstract:

Background: Changing perspectives on the natural history of celiac disease (CD), new serology and genetic tests, and amended histological criteria for diagnosis cast doubt on past prevalence estimates for CD. We set out to establish a more accurate prevalence estimate for CD using a novel serogenetic approach.

Methods: The human leukocyte antigen (HLA)-DQ genotype was determined in 356 patients with 'biopsy-confirmed' CD, and in two age-stratified, randomly selected community cohorts of 1,390 women and 1,158 men. Sera were screened for CD-specific serology.

Results: Only five 'biopsy-confirmed' patients with CD did not possess the susceptibility alleles HLA-DQ2.5, DQ8, or DQ2.2, and four of these were misdiagnoses. HLA-DQ2.5, DQ8, or DQ2.2 was present in 56% of all women and men in the community cohorts. Transglutaminase (TG)-2 IgA and composite TG2/deamidated gliadin peptide (DGP) IgA/IgG were abnormal in 4.6% and 5.6%, respectively, of the community women and 6.9% and 6.9% of the community men, but in the screen-positive group only 71% and 75%, respectively, of women and 65% and 63%, respectively, of men possessed HLA-DQ2.5, DQ8, or DQ2.2. Medical review was possible for 41% of seropositive women and 50% of seropositive men, and led to biopsy-confirmed CD in 10 women (0.7%) and 6 men (0.5%), but based on the relative risk for HLA-DQ2.5, DQ8, or DQ2.2 in all TG2 IgA or TG2/DGP IgA/IgG screen-positive subjects, CD affected 1.3% or 1.9%, respectively, of women and 1.3% or 1.2%, respectively, of men. Serogenetic data from these community cohorts indicated that testing screen positives for HLA-DQ, or carrying out HLA-DQ and further serology, could have reduced unnecessary gastroscopies due to false-positive serology by at least 40% and by over 70%, respectively.

Conclusions: Screening with TG2 IgA serology and requiring biopsy confirmation caused the community prevalence of CD to be substantially underestimated. Testing for HLA-DQ genes and confirmatory serology could reduce the number of unnecessary gastroscopies. © 2013 Anderson et al.; licensee BioMed Central Ltd.
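The "at least 40%" figure can be roughly reproduced from the numbers quoted above for women, under the assumption (consistent with the abstract) that CD essentially requires HLA-DQ2.5, DQ8, or DQ2.2, so screen-positives lacking these alleles are false positives. A back-of-the-envelope check in Python:

```python
# Per 1,000 community women, using the abstract's figures:
screened = 1000
screen_pos = 0.046 * screened        # TG2 IgA abnormal in 4.6%
true_cd = 0.013 * screened           # estimated CD prevalence 1.3%
false_pos = screen_pos - true_cd     # unnecessary gastroscopies without triage

# 71% of screen-positive women carried a susceptibility allele, so 29% did
# not; those women could be excluded from biopsy by HLA-DQ typing.
hla_negative = (1 - 0.71) * screen_pos
reduction = hla_negative / false_pos
print(f"avoidable false-positive gastroscopies: {reduction:.0%}")
```

This lands at roughly 40%, matching the abstract's lower bound; the "over 70%" figure corresponds to adding confirmatory serology on top of HLA-DQ typing.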

Relevance: 100.00%

Publisher:

Abstract:

A 17-year-old white adolescent had a history of chronic diarrhea, delayed puberty, and growth failure. Investigations excluded cystic fibrosis, Shwachman syndrome, and endocrine causes of growth failure. Severe steatorrhea was diagnosed from fecal fat studies, and a jejunal suction biopsy showed total villus atrophy, consistent with a diagnosis of celiac disease. Following introduction of a gluten-free diet, his appetite and growth improved, but he continued to have abdominal discomfort and loose, offensive bowel motions. One year later, severe steatorrhea was present. A repeat jejunal biopsy showed partial recovery of villus architecture. The serum immunoreactive trypsinogen level was low, which was highly suggestive of exocrine pancreatic failure. Results of a quantitative pancreatic stimulation test confirmed the presence of primary pancreatic insufficiency. After introduction of oral pancreatic enzyme supplements with meals, his gastrointestinal symptoms resolved and his growth velocity accelerated. Previously, primary pancreatic insufficiency had been described only in elderly patients with long-standing untreated celiac disease. This case, however, emphasizes that pancreatic failure can occur with celiac disease at any age. Determination of the serum immunoreactive trypsinogen level should be considered a useful screening tool for pancreatic insufficiency in patients with celiac disease who have not responded to a gluten-free diet.

Relevance: 100.00%

Publisher:

Abstract:

New classification criteria for axial spondyloarthritis have been developed with the goal of increasing sensitivity for early inflammatory spondyloarthritis. However, these criteria substantially increase the heterogeneity of the resulting disease group, reducing their value in both research and clinical settings. Further research is required to establish criteria based on better knowledge of the natural history of non-radiographic axial spondyloarthritis, its aetiopathogenesis and its response to treatment. In the meantime, the modified New York criteria for ankylosing spondylitis remain a very useful set of classification criteria, defining a relatively homogeneous group of cases for clinical use and research studies.

Relevance: 100.00%

Publisher:

Abstract:

Malnutrition is a common problem in children with end-stage liver disease (ESLD), and accurate assessment of nutritional status is essential in managing these children. In a retrospective study, we compared nutritional assessment by anthropometry with that by body composition. We analyzed all consecutive measurements of total body potassium (TBK, n = 186) of children less than 3 years old with ESLD awaiting transplantation found in our database. The TBK values obtained by whole-body counting of ⁴⁰K were compared with reference TBK values of healthy children. The prevalence of malnutrition as assessed by weight (weight Z score < -2) was 28%, which was significantly lower (chi-square test, p < 0.0001) than the prevalence of malnutrition (76%) assessed by TBK (< 90% of expected TBK for age). These results demonstrated that body weight underestimated the nutritional deficit and stressed the importance of measuring body composition as part of assessing the nutritional status of children with ESLD.
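The comparison of the two prevalence estimates can be sketched with a hand-rolled 2 × 2 Pearson chi-square statistic. The counts below are reconstructed from the reported percentages and n = 186, so they are approximate; note also that, because the same children were assessed by both methods, a paired (McNemar-style) test would be an alternative to the plain chi-square the abstract reports:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Malnourished / not malnourished, by each method (n = 186 measurements):
by_weight = round(0.28 * 186)   # weight Z score < -2   -> 52 children
by_tbk = round(0.76 * 186)      # TBK < 90% of expected -> 141 children
stat = chi_square_2x2(by_weight, 186 - by_weight, by_tbk, 186 - by_tbk)
print(f"chi-square = {stat:.1f}")  # far above 10.83, the 1-df cutoff for p < 0.001
```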

Relevance: 100.00%

Publisher:

Abstract:

Background: Studies that compare Indigenous Australian and non-Indigenous patients who experience a cardiac event or chest pain are inconclusive about the reasons for the differences in in-hospital and survival rates. Advances in diagnostic accuracy, medication and a specialised workforce have contributed to lower case fatality and longer survival, but this is not evident in the Indigenous Australian population. A possible driver contributing to this disparity may be the patient-clinician interface during key interactions in the health care process.

Methods/Design: This study will apply an Indigenous framework to describe the interaction between Indigenous patients and clinicians during the continuum of cardiac health care, i.e. from acute admission through secondary and rehabilitative care. Adopting an Indigenous framework is more aligned with Indigenous realities, knowledge, intellects, histories and experiences. A triple-layered focus group design will be employed to discuss patient-clinician engagement. Focus groups will be arranged by geographic cluster, i.e. a metropolitan and a regional centre. Patient informants will be identified by Indigenous status (i.e. Indigenous and non-Indigenous) and the focus groups will be convened separately. The health care provider focus groups will be convened on an organisational basis, i.e. state health providers and Aboriginal Community Controlled Health Services. Yarning will be used as a research method to facilitate discussion. Yarning is congruent with the oral traditions that are still a reality in day-to-day Indigenous lives.

Discussion: This study is nested in a larger research program that explores the drivers of the disparity in care and health outcomes for Indigenous and non-Indigenous Australians who experience an acute cardiac admission. A focus on health status, risk factors and clinical interventions may camouflage critical issues within a patient-clinician exchange. This approach may provide a way forward to reduce the appalling health disadvantage experienced within Indigenous Australian communities.

Keywords: Patient-clinician engagement, Qualitative, Cardiovascular disease, Focus groups, Indigenous

Relevance: 100.00%

Publisher:

Abstract:

Background: The first major Crohn's disease (CD) susceptibility gene, NOD2, implicates the innate intestinal immune system and other pattern recognition receptors in the pathogenesis of this chronic, debilitating disorder. These include the Toll-like receptors, specifically TLR4 and TLR5. A variant in the TLR4 gene (A299G) has demonstrated variable association with CD. We aimed to investigate the relationship between TLR4 A299G and TLR5 N392ST and an Australian inflammatory bowel disease cohort, and to explore the strength of association between TLR4 A299G and CD using global meta-analysis.

Methods: Cases (CD = 619, ulcerative colitis = 300) and controls (n = 360) were genotyped for TLR4 A299G, TLR5 N392ST, and the 4 major NOD2 mutations. Data were interrogated for case-control analysis before and after stratification by NOD2 genotype. Genotype-phenotype relationships were also sought. Meta-analysis was conducted via RevMan.

Results: The TLR4 A299G variant allele showed a significant association with CD compared to controls (P = 0.04), and a novel NOD2 haplotype was identified which strengthened this association (P = 0.003). Furthermore, we identified that TLR4 A299G was associated with CD limited to the colon (P = 0.02). In the presence of the novel NOD2 haplotype, TLR4 A299G was more strongly associated with colonic disease (P < 0.001) and nonstricturing disease (P = 0.009). A meta-analysis of 11 CD cohorts identified a 1.5-fold increase in risk for the variant TLR4 A299G allele (P < 0.00001).

Conclusions: TLR4 A299G appears to be a significant risk factor for CD, in particular colonic, nonstricturing disease. Furthermore, we identified a novel NOD2 haplotype that strengthens the relationship between TLR4 A299G and these phenotypes.
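The abstract does not give per-cohort counts, but a fixed-effect meta-analysis of the kind tools such as RevMan perform can be sketched as inverse-variance pooling of log odds ratios. The 2 × 2 counts below are purely illustrative, not the study's data:

```python
import math

def pooled_odds_ratio(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio.

    Each study is (a, b, c, d): variant-allele carriers vs non-carriers
    among cases (a, b) and among controls (c, d)."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d  # Woolf variance of the log OR
        num += log_or / var                  # weight each study by 1/variance
        den += 1 / var
    return math.exp(num / den)

# Illustrative counts for three hypothetical cohorts:
studies = [(30, 570, 20, 580), (25, 475, 15, 485), (40, 560, 28, 572)]
print(f"pooled OR = {pooled_odds_ratio(studies):.2f}")
```

Weighting by inverse variance means larger, more precise cohorts dominate the pooled estimate, which is how a meta-analysis of 11 cohorts can firm up a risk estimate that is borderline in any single cohort.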

Relevance: 100.00%

Publisher:

Abstract:

Background: Oropharyngeal aspiration (OPA) can lead to recurrent respiratory illnesses and chronic lung disease in children. Current clinical feeding evaluations performed by speech pathologists have poor reliability in detecting OPA when compared to radiological procedures such as the modified barium swallow (MBS). Improved ability to diagnose OPA accurately via clinical evaluation potentially reduces reliance on expensive, less readily available radiological procedures. Our study investigates the utility of adding cervical auscultation (CA), a technique of listening to swallowing sounds, in improving the diagnostic accuracy of a clinical evaluation for the detection of OPA.

Methods: We plan an open, unblinded, randomised controlled trial at a paediatric tertiary teaching hospital. Two hundred and sixteen children fulfilling the inclusion criteria will be randomised to one of the two clinical assessment techniques for the clinical detection of OPA: (1) clinical feeding evaluation only (CFE) group or (2) clinical feeding evaluation with cervical auscultation (CFE + CA) group. All children will then undergo an MBS to determine radiologically assessed OPA. The primary outcome is the presence or absence of OPA, as determined on MBS using the Penetration-Aspiration Scale. Our main objective is to determine the sensitivity, specificity, negative and positive predictive values of 'CFE + CA' versus 'CFE' only compared to MBS-identified OPA.

Discussion: Early detection and appropriate management of OPA is important to prevent chronic pulmonary disease and poor growth in children. As the reliability of CFE to detect OPA is low, a technique that can improve the diagnostic accuracy of the CFE will help minimise consequences to the paediatric respiratory system. Cervical auscultation is a technique that has previously been documented as a clinical adjunct to the CFE; however, no published RCTs addressing the reliability of this technique in children exist. Our study will be the first to establish the utility of CA in assessing and diagnosing OPA risk in young children.
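The trial's accuracy outcomes (sensitivity, specificity, PPV, NPV against MBS) reduce to counting the four cells of a 2 × 2 table of clinical result versus reference result. A sketch with hypothetical counts for one study arm; none of these numbers come from the study:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Test accuracy versus a reference standard (here, MBS-identified OPA)."""
    return {
        "sensitivity": tp / (tp + fn),   # aspirating children correctly flagged
        "specificity": tn / (tn + fp),   # non-aspirating children correctly cleared
        "ppv": tp / (tp + fp),           # flagged children who truly aspirate
        "npv": tn / (tn + fn),           # cleared children who truly do not
    }

# Hypothetical CFE + CA arm (108 of the planned 216 children):
m = diagnostic_metrics(tp=30, fp=10, fn=8, tn=60)
print({k: round(v, 2) for k, v in m.items()})
# → {'sensitivity': 0.79, 'specificity': 0.86, 'ppv': 0.75, 'npv': 0.88}
```

Comparing these four quantities between the CFE-only and CFE + CA arms is exactly the stated main objective.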

Relevance: 100.00%

Publisher:

Abstract:

Objective: This study explored gene expression differences predictive of response to chemoradiotherapy in esophageal cancer.

Background: A major pathological response to neoadjuvant chemoradiation is observed in about 40% of esophageal cancer patients and is associated with favorable outcomes. However, patients with tumors of similar histology, differentiation, and stage can have vastly different responses to the same neoadjuvant therapy. This dichotomy may be due to differences in the molecular genetic environment of the tumor cells.

Methods: Diagnostic biopsies were obtained from a training cohort of esophageal cancer patients (n = 13), and extracted RNA was hybridized to genome expression microarrays. The resulting gene expression data were verified by qRT-PCR. In a larger, independent validation cohort (n = 27), we examined differential gene expression by qRT-PCR. The ability of differentially regulated genes to predict response to therapy was assessed in a multivariate leave-one-out cross-validation model.

Results: Although 411 genes were differentially expressed between normal and tumor tissue, only 103 genes were altered between responder and non-responder tumors, of which 67 were differentially expressed >2-fold. These included genes previously reported in esophageal cancer and a number of novel genes. In the validation cohort, 8 of 12 selected genes were significantly different between the response groups. In the predictive model, 5 of 8 genes could predict response to therapy with 95% accuracy in a subset (74%) of patients.

Conclusions: This study has identified a gene microarray pattern and a set of genes associated with response to neoadjuvant chemoradiation in esophageal cancer. The potential of these genes as biomarkers of response to treatment warrants further investigation. Copyright © 2009 by Lippincott Williams & Wilkins.
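Leave-one-out cross-validation, as used for the predictive model above, holds each patient out in turn, refits the classifier on the remaining patients, and scores the fraction of held-out patients predicted correctly. A generic sketch; the nearest-centroid classifier and toy two-gene profiles below are stand-ins, not the study's actual multivariate model:

```python
def loocv_accuracy(samples, labels, fit_predict):
    """Leave-one-out CV: hold out each sample, train on the rest, score it."""
    correct = 0
    for i in range(len(samples)):
        train_x = samples[:i] + samples[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        correct += fit_predict(train_x, train_y, samples[i]) == labels[i]
    return correct / len(samples)

def nearest_centroid(train_x, train_y, test_x):
    """Predict the class whose mean expression profile is closest."""
    def centroid(cls):
        rows = [x for x, y in zip(train_x, train_y) if y == cls]
        return [sum(col) / len(rows) for col in zip(*rows)]
    def sq_dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(set(train_y), key=lambda c: sq_dist(centroid(c), test_x))

# Toy expression profiles (two genes) for responders (R) and non-responders (NR):
x = [[0.0, 0.1], [0.1, 0.0], [1.0, 1.1], [1.1, 1.0]]
y = ["NR", "NR", "R", "R"]
print(loocv_accuracy(x, y, nearest_centroid))  # → 1.0
```

With only 13 training patients, LOOCV is a natural choice: every patient is used for testing once while the other 12 still contribute to training.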

Relevance: 100.00%

Publisher:

Abstract:

Purpose: In non-small-cell lung cancer (NSCLC), the epidermal growth factor receptor (EGFR) and cyclooxygenase-2 (COX-2) play major roles in tumorigenesis. This phase I/II study evaluated combined therapy with the EGFR tyrosine kinase inhibitor (TKI) gefitinib and the COX-2 inhibitor rofecoxib in platinum-pretreated, relapsed, metastatic NSCLC (n = 45).

Patients and Methods: Gefitinib 250 mg/d was combined with rofecoxib (dose escalated from 12.5 to 25 to 50 mg/d through three cohorts, each n = 6). Because the rofecoxib maximum-tolerated dose was not reached, the 50 mg/d cohort was expanded for efficacy evaluation (n = 33).

Results: Among the 42 assessable patients, there was one complete response (CR) and two partial responses (PRs), and 12 patients had stable disease (SD); the disease control rate was 35.7% (95% CI, 21.6% to 52.0%). Median time to tumor progression was 55 days (95% CI, 47 to 70 days), and median survival was 144 days (95% CI, 103 to 190 days). In a pilot study, matrix-assisted laser desorption/ionization (MALDI) proteomics analysis of baseline serum samples could distinguish patients with an objective response from those with SD or progressive disease (PD), and those with disease control (CR, PR, and SD) from those with PD. The regimen was generally well tolerated, with predictable toxicities including skin rash and diarrhea.

Conclusion: Gefitinib combined with rofecoxib provided disease control equivalent to that expected with single-agent gefitinib and was generally well tolerated. Baseline serum proteomics may help identify those patients most likely to benefit from EGFR TKIs. © 2007 by American Society of Clinical Oncology.
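The reported disease control rate follows directly from the response counts in the abstract: (CR + PR + SD) over the assessable patients. As a quick arithmetic check:

```python
# Disease control rate = (CR + PR + SD) / assessable patients
cr, pr, sd, assessable = 1, 2, 12, 42
dcr = (cr + pr + sd) / assessable
print(f"disease control rate: {dcr:.1%}")  # → disease control rate: 35.7%
```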

Relevance: 100.00%

Publisher:

Abstract:

Background: Current blood-based diagnostic assays to detect heart failure (HF) have large intra-individual and inter-individual variations, which have made it difficult to determine whether changes in analyte levels reflect an actual change in disease activity. Human saliva mirrors the body's health and well-being, and approximately 20% of the proteins present in blood are also found in saliva. Saliva has numerous advantages over blood as a diagnostic fluid, allowing non-invasive, simple, and safe sample collection. The aim of our study was to develop an immunoassay to detect NT-proBNP in saliva and to determine whether it correlates with blood levels.

Methods: Saliva samples were collected from healthy volunteers (n = 40) who had no underlying heart conditions and from HF patients (n = 45) at rest. Samples were stored at -80 °C until analysis. A customised homogeneous sandwich AlphaLISA® immunoassay was used to quantify NT-proBNP levels in saliva.

Results: Our NT-proBNP immunoassay was validated against a commercial Roche assay on plasma samples collected from HF patients (n = 37); the correlation was r² = 0.78 (p < 0.01, y = 1.705x + 1910.8). The median salivary NT-proBNP levels in the healthy and HF participants were <16 pg/mL and 76.8 pg/mL, respectively. The salivary NT-proBNP immunoassay showed a clinical sensitivity of 82.2%, specificity of 100%, positive predictive value of 100%, and negative predictive value of 83.3%, with an overall diagnostic accuracy of 90.6%.

Conclusion: We have demonstrated for the first time that NT-proBNP can be detected in saliva and that levels were higher in heart failure patients than in healthy control subjects. Further studies will be needed to demonstrate the clinical relevance of salivary NT-proBNP in unselected, previously undiagnosed populations.
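The validation against the Roche assay (r² = 0.78, y = 1.705x + 1910.8) is an ordinary least-squares fit of one assay's readings against the other's. A self-contained sketch of how such a slope, intercept, and r² are obtained; the paired readings below are made up for illustration:

```python
def least_squares(xs, ys):
    """Ordinary least-squares fit y = slope * x + intercept, plus the
    coefficient of determination r^2 of the fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy * sxy / (sxx * syy)

# Made-up paired NT-proBNP readings (pg/mL): saliva immunoassay vs plasma assay.
saliva = [120.0, 450.0, 900.0, 1500.0, 2600.0]
plasma = [300.0, 950.0, 1700.0, 2900.0, 4400.0]
slope, intercept, r2 = least_squares(saliva, plasma)
print(f"y = {slope:.3f}x + {intercept:.1f}, r^2 = {r2:.2f}")
```

An r² of 0.78 means the fitted line explains 78% of the variance in the reference-assay values, which is the sense in which the two assays "correlate".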

Relevance: 100.00%

Publisher:

Abstract:

Over the past 10 years, the use of saliva as a diagnostic fluid has gained attention and has become a translational research success story. Some current nanotechnologies have been demonstrated to have the analytical sensitivity required for the use of saliva as a diagnostic medium to detect and predict disease progression. However, these technologies have not yet been integrated into current clinical practice and workflow. As a diagnostic fluid, saliva offers advantages over serum because it can be collected noninvasively by individuals with modest training, and it offers a cost-effective approach for the screening of large populations. Gland-specific saliva can also be used for diagnosis of pathology specific to one of the major salivary glands. There is minimal risk of contracting infections during saliva collection, and saliva can be used in clinically challenging situations, such as obtaining samples from children or from handicapped or anxious patients, in whom blood sampling can be difficult to perform. In this review we highlight the production and secretion of saliva, the salivary proteome, the transport of biomolecules from blood capillaries to the salivary glands, and the diagnostic potential of saliva for the detection of cardiovascular disease and oral and breast cancers. We also highlight the barriers to the application of saliva testing and to its advancement in clinical settings. Saliva has the potential to become a first-line diagnostic sample of choice owing to advancements in detection technologies coupled with combinations of biomolecules of clinical relevance.