940 results for Diagnostic radiology
Abstract:
Over the past 10 years, the use of saliva as a diagnostic fluid has gained attention and has become a translational research success story. Some current nanotechnologies have been demonstrated to have the analytical sensitivity required for the use of saliva as a diagnostic medium to detect and predict disease progression. However, these technologies have not yet been integrated into current clinical practice and workflow. As a diagnostic fluid, saliva offers advantages over serum because it can be collected noninvasively by individuals with modest training, and it offers a cost-effective approach for the screening of large populations. Gland-specific saliva can also be used for diagnosis of pathology specific to one of the major salivary glands. There is minimal risk of contracting infections during saliva collection, and saliva can be used in clinically challenging situations, such as obtaining samples from children or handicapped or anxious patients, in whom blood sampling can be difficult to perform. In this review we highlight the production and secretion of saliva, the salivary proteome, the transport of biomolecules from blood capillaries to salivary glands, and the diagnostic potential of saliva for the detection of cardiovascular disease and oral and breast cancers. We also highlight the barriers to the application of saliva testing and to its advancement in clinical settings. Saliva has the potential to become a first-line diagnostic sample of choice owing to advances in detection technologies coupled with combinations of biomolecules with clinical relevance.
Abstract:
This article describes the detection of DNA mutations using a novel Au-Ag-coated GaN substrate as a SERS (surface-enhanced Raman spectroscopy) diagnostic platform. Oligonucleotide sequences corresponding to the BCR-ABL (breakpoint cluster region-Abelson) gene responsible for the development of chronic myelogenous leukemia were used as a model system to demonstrate discrimination between the wild type and the Met244Val mutation. The thiolated ssDNA (single-stranded DNA) was immobilized on the SERS-active surface and then hybridized to a labeled target sequence from solution. An intense SERS signal of the reporter molecule MGITC was detected from the complementary target owing to formation of the double helix. The SERS signal was either not observed or decreased dramatically for a negative control sample consisting of labeled DNA that was not complementary to the DNA probe. The results indicate that our SERS substrate offers an opportunity for the development of novel diagnostic assays.
Abstract:
Developing follicles and follicular cysts in the ovary are lined by granulosa cells. Approximately the size of histiocytes, non-neoplastic granulosa cells have scant granular to foamy cytoplasm and mildly atypical hyperchromatic nuclei, which may be mitotically active.1 Displaced granulosa cells, derived from normal follicles and introduced into ovarian vascular channels, ovarian stroma and the fallopian tube, have been reported to cause diagnostic difficulty in histology, as they may mimic small cell carcinoma or other metastatic carcinomas.2–4 The cells are thought to be displaced artefactually due to surgical trauma or during sectioning in the laboratory or during ovulation...
Abstract:
Despite the importance of paediatric pneumonia as a cause of short- and long-term morbidity and mortality worldwide, a reliable gold standard for its diagnosis remains elusive. The utility of clinical, microbiological and radiological diagnostic approaches varies widely within and between populations and is heavily dependent on the expertise and resources available in various settings. Here we review the role of radiology in the diagnosis of paediatric pneumonia. Chest radiographs (CXRs) are the most widely employed test; however, they are not indicated in ambulatory settings, cannot distinguish between viral and bacterial infections, and have a limited role in the ongoing management of disease. A standardised definition of alveolar pneumonia on a CXR exists for epidemiological studies targeting bacterial pneumonias, but it should not be extrapolated to clinical settings. Radiography, computed tomography and, to a lesser extent, ultrasonography and magnetic resonance imaging play an important role in complicated pneumonias, but there are limitations that preclude their use as routine diagnostic tools. Large population-based studies are needed in different populations to address many of the knowledge gaps in the radiological diagnosis of pneumonia in children; however, the feasibility of such studies is an important barrier.
Abstract:
Background: Rapid diagnostic tests (RDTs) for detection of Plasmodium falciparum infection that target P. falciparum histidine-rich protein 2 (PfHRP2), a protein that circulates in the blood of patients infected with this species of malaria, are widely used to guide case management. Understanding the determinants of PfHRP2 availability in circulation is therefore essential to understanding the performance of PfHRP2-detecting RDTs. Methods: The possibility that pre-formed host anti-PfHRP2 antibodies may block target antigen detection, thereby causing false-negative test results, was investigated in this study. Results: Anti-PfHRP2 antibodies were detected in 19/75 (25%) of plasma samples collected from patients with acute malaria from Cambodia, Nigeria and the Philippines, as well as in 3/28 (10.7%) of asymptomatic Solomon Islands residents. Pre-incubation of plasma samples from subjects with high-titre anti-PfHRP2 antibodies with soluble PfHRP2 blocked detection of the target antigen on two of the three brands of RDTs tested, leading to false-negative results. Pre-incubation of the plasma with intact parasitized erythrocytes resulted in a reduction in band intensity at the highest parasite density and a ten-fold reduction in the lower detection threshold on all three brands of RDTs tested. Conclusions: These observations indicate possible reduced sensitivity for diagnosis of P. falciparum malaria using PfHRP2-detecting RDTs among people with high levels of specific antibodies and low-density infection, as well as possible interference with tests configured to detect soluble PfHRP2 in saliva or urine samples. Further investigations are required to assess the impact of pre-formed anti-PfHRP2 antibodies on RDT performance in different transmission settings.
Abstract:
Background: Malaria rapid diagnostic tests (RDTs) are appropriate for case management, but persistent antigenaemia is a concern for HRP2-detecting RDTs in endemic areas. It has been suggested that pan-pLDH test bands on combination RDTs could be used to distinguish persistent antigenaemia from active Plasmodium falciparum infection; however, this assumes that all active infections produce positive results on both bands of RDTs, an assertion that has not been demonstrated. Methods: In this study, data generated during the WHO-FIND product testing programme for malaria RDTs were reviewed to investigate the reactivity of individual test bands against P. falciparum in 18 combination RDTs. Each product was tested against multiple wild-type P. falciparum-only samples. Antigen levels were measured by quantitative ELISA for HRP2, pLDH and aldolase. Results: When tested against P. falciparum samples at 200 parasites/μL, 92% of RDTs were positive; 57% of these were positive on both the P. falciparum and pan bands, while 43% were positive on the P. falciparum band only. There was a relationship between antigen concentration and band positivity: ≥4 ng/mL of HRP2 produced positive results in more than 95% of P. falciparum bands, while ≥45 ng/mL of pLDH was required for at least 90% of pan bands to be positive. Conclusions: In active P. falciparum infections it is common for combination RDTs to return a positive HRP2 band combined with a negative pan-pLDH band, and when both bands are positive, the pan band is often faint. Thus, active infections could be missed if the presence of an HRP2 band in the absence of a pan band is interpreted as being caused solely by persistent antigenaemia.
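The interpretive caveat in the conclusions can be made explicit as a small decision table. A minimal sketch, with band readings reduced to booleans (the naming and output strings are illustrative, not a validated clinical algorithm):

```python
def interpret_combo_rdt(hrp2_band: bool, pan_pldh_band: bool) -> str:
    """Illustrative reading of an HRP2/pan-pLDH combination RDT.

    The key caveat from the study: HRP2-positive/pan-negative results
    are common in active P. falciparum infection, so they must not be
    attributed to persistent antigenaemia alone.
    """
    if hrp2_band and pan_pldh_band:
        return "active P. falciparum infection"
    if hrp2_band and not pan_pldh_band:
        return "active infection OR persistent antigenaemia (ambiguous)"
    if pan_pldh_band and not hrp2_band:
        return "possible non-falciparum infection"
    return "negative"

print(interpret_combo_rdt(hrp2_band=True, pan_pldh_band=False))
# -> "active infection OR persistent antigenaemia (ambiguous)"
```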
Abstract:
The present invention relates generally to methods for diagnosing and treating infectious diseases and other conditions related thereto. More particularly, the present invention relates to methods for determining the presence of organisms of the Chlamydiaceae family in a subject, including species of Chlamydia, and to methods for determining the stage of an infection caused by such organisms. The present invention also relates to kits for use with the diagnostic methods. The methods and kits of the present invention are particularly useful in relation to human and non-human (i.e. veterinary) subjects. The present invention further relates to methods for identifying proteins or nucleic acid sequences associated with chlamydial infection in a subject. Such proteins or nucleic acid sequences are not only useful in relation to the diagnostic methods of the invention but are also useful in the development of methods and agents for preventing and/or treating chlamydial infection in a subject, such as, but not limited to, immunotherapeutic methods and agents.
Abstract:
Background: Changing perspectives on the natural history of celiac disease (CD), new serology and genetic tests, and amended histological criteria for diagnosis cast doubt on past prevalence estimates for CD. We set out to establish a more accurate prevalence estimate for CD using a novel serogenetic approach. Methods: The human leukocyte antigen (HLA)-DQ genotype was determined in 356 patients with 'biopsy-confirmed' CD, and in two age-stratified, randomly selected community cohorts of 1,390 women and 1,158 men. Sera were screened for CD-specific serology. Results: Only five 'biopsy-confirmed' patients with CD did not possess the susceptibility alleles HLA-DQ2.5, DQ8, or DQ2.2, and four of these were misdiagnoses. HLA-DQ2.5, DQ8, or DQ2.2 was present in 56% of all women and men in the community cohorts. Transglutaminase (TG)-2 IgA and composite TG2/deamidated gliadin peptide (DGP) IgA/IgG were abnormal in 4.6% and 5.6%, respectively, of the community women and 6.9% and 6.9%, respectively, of the community men, but in the screen-positive group, only 71% and 75%, respectively, of women and 65% and 63%, respectively, of men possessed HLA-DQ2.5, DQ8, or DQ2.2. Medical review was possible for 41% of seropositive women and 50% of seropositive men, and led to biopsy-confirmed CD in 10 women (0.7%) and 6 men (0.5%), but based on relative risk for HLA-DQ2.5, DQ8, or DQ2.2 in all TG2 IgA or TG2/DGP IgA/IgG screen-positive subjects, CD affected 1.3% or 1.9%, respectively, of women and 1.3% or 1.2%, respectively, of men. Serogenetic data from these community cohorts indicated that testing screen positives for HLA-DQ, or carrying out HLA-DQ typing and further serology, could have reduced unnecessary gastroscopies due to false-positive serology by at least 40% and by over 70%, respectively. Conclusions: Screening with TG2 IgA serology and requiring biopsy confirmation caused the community prevalence of CD to be substantially underestimated. Testing for HLA-DQ genes and confirmatory serology could reduce the number of unnecessary gastroscopies.
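The gastroscopy savings described above follow from inserting an HLA-DQ gate between serology and biopsy referral. A minimal sketch of that triage logic, assuming simple genotype labels (identifiers and the boolean simplification are illustrative, not the study's exact workflow):

```python
# Susceptibility genotypes named in the abstract.
SUSCEPTIBILITY_GENOTYPES = {"DQ2.5", "DQ8", "DQ2.2"}

def refer_for_biopsy(seropositive: bool, hla_dq: str) -> bool:
    """Refer only screen-positive subjects who carry a susceptibility
    genotype; seropositive subjects lacking HLA-DQ2.5, DQ8 or DQ2.2
    are treated as false-positive serology and spared gastroscopy."""
    return seropositive and hla_dq in SUSCEPTIBILITY_GENOTYPES

print(refer_for_biopsy(seropositive=True, hla_dq="DQ2.5"))  # True
print(refer_for_biopsy(seropositive=True, hla_dq="DQ7"))    # False
```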
Abstract:
INTRODUCTION Radiological evaluation of the paediatric cervical spine can be a challenge due to the normal anatomical variants and injuries that are unique to children. We aimed to identify the usefulness of plain X-rays in comparison with CT and MRI in diagnosing paediatric cervical spine injuries. METHODS Retrospective review of imaging studies of children diagnosed with paediatric cervical spine injuries who had presented to two tertiary hospitals in Queensland. RESULTS There were 38 patients: 18 male and 20 female. The mean age was 8.6 years. Plain cervical spine X-rays (3 views: AP, lateral and open-mouth) were done in 34 patients. The remaining 8 children had a suspected head injury and hence had CT scans of their neck done at the time of the CT head scan. Of the images taken, X-rays were diagnostic in 28 (82%) patients. CONCLUSION X-rays still have a role to play in the diagnosis of paediatric cervical spine injuries and should be considered as the first line in fully conscious patients; their usefulness should not be overlooked in light of newer imaging modalities.
Abstract:
The risk of developing osteoporosis is determined by the interaction of several mostly unknown genes and environmental factors. Genetic studies in osteoporosis have largely focussed on association studies of a small number of candidate genes, with few linkage studies performed, and large areas of the genome remaining unexplored. Identifying the genes involved in osteoporosis would be a major advance in our understanding of the causation of the disease, and lead to advances in diagnosis, risk prediction, and potentially preventive and therapeutic measures.
Abstract:
IMPORTANCE Patients with chest pain represent a high health care burden, but it may be possible to identify a patient group with a low short-term risk of adverse cardiac events who are suitable for early discharge. OBJECTIVE To compare the effectiveness of a rapid diagnostic pathway with a standard-care diagnostic pathway for the assessment of patients with possible cardiac chest pain in a usual clinical practice setting. DESIGN, SETTING, AND PARTICIPANTS A single-center, randomized parallel-group trial with blinded outcome assessments was conducted in an academic general and tertiary hospital. Participants included adults with acute chest pain consistent with acute coronary syndrome for whom the attending physician planned further observation and troponin testing. Patient recruitment occurred from October 11, 2010, to July 4, 2012, with a 30-day follow-up. INTERVENTIONS An experimental pathway using an accelerated diagnostic protocol (Thrombolysis in Myocardial Infarction score, 0; electrocardiography; and 0- and 2-hour troponin tests) or a standard-care pathway (troponin test on arrival at hospital, prolonged observation, and a second troponin test 6-12 hours after onset of pain) serving as the control. MAIN OUTCOMES AND MEASURES Discharge from the hospital within 6 hours without a major adverse cardiac event occurring within 30 days. RESULTS Fifty-two of 270 patients in the experimental group were successfully discharged within 6 hours compared with 30 of 272 patients in the control group (19.3% vs 11.0%; odds ratio, 1.92; 95% CI, 1.18-3.13; P = .008). It required 20 hours to discharge the same proportion of patients from the control group as achieved in the experimental group within 6 hours. In the experimental group, 35 additional patients (12.9%) were classified as low risk but admitted to an inpatient ward for cardiac investigation. None of the 35 patients received a diagnosis of acute coronary syndrome after inpatient evaluation. CONCLUSIONS AND RELEVANCE Using the accelerated diagnostic protocol in the experimental pathway almost doubled the proportion of patients with chest pain discharged early. Clinicians could discharge approximately 1 of 5 patients with chest pain to outpatient follow-up monitoring in less than 6 hours. This diagnostic strategy could be easily replicated in other centers because no extra resources are required.
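The accelerated diagnostic protocol in the experimental pathway amounts to a conjunction of negative findings. A minimal sketch of that decision rule, assuming boolean summaries of the ECG and troponin results (names and simplifications are illustrative, not the trial's exact operational definitions):

```python
def adp_low_risk(timi_score: int,
                 ecg_ischaemia: bool,
                 troponin_0h_elevated: bool,
                 troponin_2h_elevated: bool) -> bool:
    """Return True when all components of the accelerated diagnostic
    protocol are negative: a TIMI score of 0, no ischaemic ECG changes,
    and negative troponin tests at 0 and 2 hours."""
    return (timi_score == 0
            and not ecg_ischaemia
            and not troponin_0h_elevated
            and not troponin_2h_elevated)

# A patient meeting every criterion would be a candidate for early discharge.
print(adp_low_risk(0, False, False, False))  # True
```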
Abstract:
Objective Risk scores and accelerated diagnostic protocols can identify chest pain patients at low risk of major adverse cardiac events who could be discharged early from the ED, saving time and costs. We aimed to derive and validate a chest pain score and accelerated diagnostic protocol (ADP) that could safely increase the proportion of patients suitable for early discharge. Methods Logistic regression identified statistical predictors of major adverse cardiac events in a derivation cohort. Statistical coefficients were converted to whole numbers to create a score. Clinician feedback was used to improve the clinical plausibility and usability of the final score (Emergency Department Assessment of Chest pain Score [EDACS]). EDACS was combined with electrocardiogram results and troponin results at 0 and 2 h to develop an ADP (EDACS-ADP). The score and EDACS-ADP were validated and tested for reproducibility in separate cohorts of patients. Results In the derivation (n = 1974) and validation (n = 608) cohorts, the EDACS-ADP classified 42.2% (sensitivity 99.0%, specificity 49.9%) and 51.3% (sensitivity 100.0%, specificity 59.0%) of patients, respectively, as at low risk of major adverse cardiac events. The intra-class correlation coefficient for categorisation of patients as low risk was 0.87. Conclusion The EDACS-ADP identified approximately half of the patients presenting to the ED with possible cardiac chest pain as having a low risk of short-term major adverse cardiac events, with high sensitivity. This is a significant improvement on similar, previously reported protocols. The EDACS-ADP is reproducible and has the potential to deliver considerable cost reductions to health systems.
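The conversion of regression coefficients into whole-number score weights is a standard derivation step, sketched below with entirely hypothetical predictors and coefficients (the published EDACS items and weights are deliberately not reproduced here):

```python
# Sketch of the derivation step described above: logistic-regression
# coefficients are scaled and rounded to whole numbers to form an
# additive risk score. All predictors and values are placeholders.

hypothetical_coefficients = {
    "age_per_5_years": 0.34,
    "male_sex": 0.97,
    "diaphoresis": 0.53,
    "pain_radiating_to_arm": 0.71,
}

# Scale by the smallest coefficient so the weakest predictor scores 1 point.
base = min(hypothetical_coefficients.values())
score_weights = {name: round(coef / base)
                 for name, coef in hypothetical_coefficients.items()}

print(score_weights)
# {'age_per_5_years': 1, 'male_sex': 3, 'diaphoresis': 2, 'pain_radiating_to_arm': 2}
```

A patient's total score is then the sum of the points for the predictors present, which is what makes the model usable at the bedside without a calculator.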
Abstract:
Thirty-seven surface (0-0.10 or 0-0.20 m) soils covering a wide range of soil types (16 Vertosols, 6 Ferrosols, 6 Dermosols, 4 Hydrosols, 2 Kandosols, 1 Sodosol, 1 Rudosol, and 1 Chromosol) were exhaustively cropped in 2 glasshouse experiments. The test species were Panicum maximum cv. Green Panic in Experiment A and Avena sativa cv. Barcoo in Experiment B. Successive forage harvests were taken until the plants could no longer grow in most soils because of severe potassium (K) deficiency. Soil samples were taken prior to cropping and after the final harvest in both experiments, and also after the initial harvest in Experiment B. Samples were analysed for solution K, exchangeable K (Exch K), tetraphenyl borate extractable K for extraction periods of 15 min (TBK15) and 60 min (TBK60), and boiling nitric acid extractable K (Nitric K). Inter-correlations between the initial levels of the various soil K parameters indicated that the following pools were in sequential equilibrium: solution K, Exch K, fast release fixed K [estimated as (TBK15 - Exch K)], and slow release fixed K [estimated as (TBK60 - TBK15)]. Structural K [estimated as (Nitric K - TBK60)] was not correlated with any of the other pools. However, following exhaustive drawdown of soil K by cropping, structural K became correlated with solution K, suggesting dissolution of K minerals when solution K was low. The change in the various K pools following cropping was correlated with K uptake at Harvest 1 (Experiment B only) and cumulative K uptake (both experiments). The change in Exch K for 30 soils was linearly related to cumulative K uptake (r = 0.98), although on average, K uptake was 35% higher than the change in Exch K. For the remaining 7 soils, K uptake considerably exceeded the change in Exch K. However, the changes in TBK15 and TBK60 were both highly linearly correlated with K uptake across all soils (r = 0.95 and 0.98, respectively). The slopes of the regression lines were not significantly different from unity, and the y-axis intercepts were very small. These results indicate that the plant is removing K from the TBK pool. Although the change in Exch K did not consistently equate with K uptake across all soils, initial Exch K was highly correlated with K uptake (r = 0.99) if one Vertosol was omitted. Exchangeable K is therefore a satisfactory diagnostic indicator of soil K status for the current crop. However, the change in Exch K following K uptake is soil-dependent, and many soils with large amounts of TBK relative to Exch K were able to buffer changes in Exch K. These soils tended to be Vertosols occurring on floodplains. In contrast, 5 soils (a Dermosol, a Rudosol, a Kandosol, and 2 Hydrosols) with large amounts of TBK did not buffer decreases in Exch K caused by K uptake, indicating that the TBK pool in these soils was unavailable to plants under the conditions of these experiments. It is likely that K fertiliser recommendations will need to take account of whether the soil has TBK reserves, and the availability of these reserves, when deciding rates required to raise exchangeable K status to adequate levels.
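The pool estimates above are simple differences between successive extractions, which a short sketch makes explicit. A minimal example in Python, using the abbreviations from the abstract (the numeric values are placeholders, not data from the study):

```python
def k_pools(exch_k: float, tbk15: float, tbk60: float, nitric_k: float) -> dict:
    """Derive the potassium pools defined in the study from the four
    measured quantities (all in the same units, e.g. mg K/kg soil)."""
    return {
        "fast_release_fixed_K": tbk15 - exch_k,   # TBK15 - Exch K
        "slow_release_fixed_K": tbk60 - tbk15,    # TBK60 - TBK15
        "structural_K": nitric_k - tbk60,         # Nitric K - TBK60
    }

print(k_pools(exch_k=120.0, tbk15=310.0, tbk60=450.0, nitric_k=900.0))
# {'fast_release_fixed_K': 190.0, 'slow_release_fixed_K': 140.0, 'structural_K': 450.0}
```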
Abstract:
A 300-strong Angus-Brahman cattle herd near Springsure, central Queensland, was being fed Acacia shirleyi (lancewood) browse during drought and crossed a 5-hectare, previously burnt area with an almost pure growth of Dysphania glomulifera subspecies glomulifera (red crumbweed) on their way to drinking water. Forty cows died of cyanide poisoning over 2 days before further access to the plant was prevented. A digital image of a plant specimen made on a flat-bed scanner and transmitted by email was used to identify D glomulifera. Specific advice on the plant's poisonous properties and management of the case was then provided by email within 2 hours of an initial telephone call by the field veterinarian to the laboratory some 600 km away. The conventional method using physical transport of a pressed dried plant specimen to confirm the identification took 5 days. D glomulifera was identified in the rumen of one of two cows necropsied. The cyanogenic potential of D glomulifera measured 4 days after collection from the site of cattle deaths was 18,600 mg HCN/kg in dry matter. The lethal dose of D glomulifera for a 420 kg cow was estimated as 150 to 190 g wet weight. The plant also contained 4.8% KNO3 equivalent in dry matter, but nitrate-nitrite poisoning was not involved in the deaths.
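The lethal-dose estimate can be roughly reconstructed from the measured cyanogenic potential. A sketch of the arithmetic, assuming a minimum lethal HCN dose for cattle of about 2 mg/kg body weight and a plant dry-matter content near 25% (both figures are assumptions not stated in the abstract):

```latex
\[
\text{HCN to kill a 420 kg cow} \approx 2~\text{mg/kg} \times 420~\text{kg} = 840~\text{mg}
\]
\[
\text{dry matter required} = \frac{840~\text{mg}}{18{,}600~\text{mg/kg}} \approx 45~\text{g},
\qquad
\text{wet weight} \approx \frac{45~\text{g}}{0.25} = 180~\text{g}
\]
```

Under these assumptions the estimate lands within the 150 to 190 g wet-weight range quoted above.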
Abstract:
In dentistry, basic imaging techniques such as intraoral and panoramic radiography are in most cases the only imaging techniques required for the detection of pathology. Conventional intraoral radiographs provide images with sufficient information for most dental radiographic needs. Panoramic radiography produces a single image of both jaws, giving an excellent overview of oral hard tissues. Regardless of the technique, plain radiography has only a limited capability in the evaluation of three-dimensional (3D) relationships. Technological advances in radiological imaging have moved from two-dimensional (2D) projection radiography towards digital, 3D and interactive imaging applications. This has been achieved first by the use of conventional computed tomography (CT) and more recently by cone beam CT (CBCT). CBCT is a radiographic imaging method that allows accurate 3D imaging of hard tissues. CBCT has been used for dental and maxillofacial imaging for more than ten years and its availability and use are increasing continuously. However, at present, only best practice guidelines are available for its use, and the need for evidence-based guidelines on the use of CBCT in dentistry is widely recognized. We evaluated (i) retrospectively the use of CBCT in a dental practice, (ii) the accuracy and reproducibility of pre-implant linear measurements in CBCT and multislice CT (MSCT) in a cadaver study, (iii) prospectively the clinical reliability of CBCT as a preoperative imaging method for complicated impacted lower third molars, and (iv) the tissue and effective radiation doses and image quality of dental CBCT scanners in comparison with MSCT scanners in a phantom study. Using CBCT, subjective identification of anatomy and pathology relevant in dental practice can be readily achieved, but dental restorations may cause disturbing artefacts. CBCT examination offered additional radiographic information when compared with intraoral and panoramic radiographs. In terms of the accuracy and reliability of linear measurements in the posterior mandible, CBCT is comparable to MSCT. CBCT is a reliable means of determining the location of the inferior alveolar canal and its relationship to the roots of the lower third molar. CBCT scanners provided adequate image quality for dental and maxillofacial imaging while delivering considerably smaller effective doses to the patient than MSCT. The observed variations in patient dose and image quality emphasize the importance of optimizing the imaging parameters in both CBCT and MSCT.