935 results for Diagnostic checking
Abstract:
Over the past 10 years, the use of saliva as a diagnostic fluid has gained attention and has become a translational research success story. Some current nanotechnologies have been demonstrated to have the analytical sensitivity required for the use of saliva as a diagnostic medium to detect and predict disease progression. However, these technologies have not yet been integrated into current clinical practice and workflow. As a diagnostic fluid, saliva offers advantages over serum because it can be collected noninvasively by individuals with modest training, and it offers a cost-effective approach for the screening of large populations. Gland-specific saliva can also be used for diagnosis of pathology specific to one of the major salivary glands. There is minimal risk of contracting infections during saliva collection, and saliva can be used in clinically challenging situations, such as obtaining samples from children or handicapped or anxious patients, in whom blood sampling can be difficult to perform. In this review we highlight the production and secretion of saliva, the salivary proteome, the transport of biomolecules from blood capillaries to salivary glands, and the diagnostic potential of saliva for the detection of cardiovascular disease and oral and breast cancers. We also highlight the barriers to the application of saliva testing and to its advancement in clinical settings. Saliva has the potential to become a first-line diagnostic sample of choice, owing to advances in detection technologies coupled with combinations of clinically relevant biomolecules.
Abstract:
This article describes the detection of DNA mutations using a novel Au-Ag-coated GaN substrate as a SERS (surface-enhanced Raman spectroscopy) diagnostic platform. Oligonucleotide sequences corresponding to the BCR-ABL (breakpoint cluster region-Abelson) gene, responsible for the development of chronic myelogenous leukemia, were used as a model system to demonstrate discrimination between the wild type and the Met244Val mutation. Thiolated ssDNA (single-stranded DNA) was immobilized on the SERS-active surface and then hybridized to a labeled target sequence from solution. An intense SERS signal of the reporter molecule MGITC (malachite green isothiocyanate) was detected from the complementary target owing to formation of the double helix. The SERS signal was either not observed or decreased dramatically for a negative control sample consisting of labeled DNA that was not complementary to the DNA probe. The results indicate that our SERS substrate offers an opportunity for the development of novel diagnostic assays.
Abstract:
Developing follicles and follicular cysts in the ovary are lined by granulosa cells. Approximately the size of histiocytes, non-neoplastic granulosa cells have scant granular to foamy cytoplasm and mildly atypical hyperchromatic nuclei, which may be mitotically active.1 Displaced granulosa cells, derived from normal follicles and introduced into ovarian vascular channels, ovarian stroma and the fallopian tube, have been reported to cause diagnostic difficulty in histology, as they may mimic small cell carcinoma or other metastatic carcinomas.2–4 The cells are thought to be displaced artefactually due to surgical trauma, during sectioning in the laboratory, or during ovulation...
Abstract:
Background: Rapid diagnostic tests (RDTs) for detection of Plasmodium falciparum infection that target P. falciparum histidine-rich protein 2 (PfHRP2), a protein that circulates in the blood of patients infected with this species of malaria, are widely used to guide case management. Understanding the determinants of PfHRP2 availability in circulation is therefore essential to understanding the performance of PfHRP2-detecting RDTs. Methods: This study investigated the possibility that pre-formed host anti-PfHRP2 antibodies may block target antigen detection, thereby causing false-negative test results. Results: Anti-PfHRP2 antibodies were detected in 19/75 (25%) of plasma samples collected from patients with acute malaria from Cambodia, Nigeria and the Philippines, as well as in 3/28 (10.7%) of asymptomatic Solomon Islands residents. Pre-incubation of plasma samples from subjects with high-titre anti-PfHRP2 antibodies with soluble PfHRP2 blocked detection of the target antigen on two of the three brands of RDT tested, leading to false-negative results. Pre-incubation of the plasma with intact parasitized erythrocytes resulted in a reduction of band intensity at the highest parasite density, and a ten-fold reduction of the lower detection threshold, on all three brands of RDT tested. Conclusions: These observations indicate possible reduced sensitivity for diagnosis of P. falciparum malaria using PfHRP2-detecting RDTs among people with high levels of specific antibodies and low-density infection, as well as possible interference with tests configured to detect soluble PfHRP2 in saliva or urine samples. Further investigations are required to assess the impact of pre-formed anti-PfHRP2 antibodies on RDT performance in different transmission settings.
Abstract:
Background: Malaria rapid diagnostic tests (RDTs) are appropriate for case management, but persistent antigenaemia is a concern for HRP2-detecting RDTs in endemic areas. It has been suggested that pan-pLDH test bands on combination RDTs could be used to distinguish persistent antigenaemia from active Plasmodium falciparum infection; however, this assumes that all active infections produce positive results on both bands of RDTs, an assertion that has not been demonstrated. Methods: In this study, data generated during the WHO-FIND product testing programme for malaria RDTs were reviewed to investigate the reactivity of individual test bands against P. falciparum in 18 combination RDTs. Each product was tested against multiple wild-type P. falciparum-only samples. Antigen levels were measured by quantitative ELISA for HRP2, pLDH and aldolase. Results: When tested against P. falciparum samples at 200 parasites/μL, 92% of RDTs were positive; 57% of these were positive on both the P. falciparum and pan bands, while 43% were positive on the P. falciparum band only. There was a relationship between antigen concentration and band positivity: ≥4 ng/mL of HRP2 produced positive results in more than 95% of P. falciparum bands, while ≥45 ng/mL of pLDH was required for at least 90% of pan bands to be positive. Conclusions: In active P. falciparum infections it is common for combination RDTs to return a positive HRP2 band combined with a negative pan-pLDH band, and when both bands are positive, the pan band is often faint. Thus, active infections could be missed if the presence of an HRP2 band in the absence of a pan band is interpreted as being caused solely by persistent antigenaemia.
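The band-level reasoning in this abstract lends itself to a short illustration. The Python sketch below is hypothetical and not drawn from the study; it simply encodes why an HRP2-positive/pan-negative result is ambiguous, given the finding above that 43% of positive tests in active infection showed exactly that pattern. Function and category names are illustrative.

```python
def interpret_combination_rdt(hrp2_band: bool, pan_pldh_band: bool) -> str:
    """Illustrative reading of a two-band P. falciparum combination RDT.

    Reflects the finding that an HRP2 band without a pan-pLDH band is
    common in active infection, so that pattern must not be read as
    persistent antigenaemia alone. Categories are illustrative only.
    """
    if hrp2_band and pan_pldh_band:
        return "active P. falciparum infection"
    if hrp2_band and not pan_pldh_band:
        # 43% of positive tests at 200 parasites/uL showed this pattern
        # during active infection, per the results above.
        return "active infection OR persistent antigenaemia; do not exclude infection"
    if pan_pldh_band and not hrp2_band:
        return "possible non-falciparum Plasmodium infection"
    return "negative"
```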
Abstract:
The present invention relates generally to methods for diagnosing and treating infectious diseases and other conditions related thereto. More particularly, the present invention relates to methods for determining the presence of organisms of the Chlamydiaceae family in a subject, including species of Chlamydia, and to methods for determining the stage of an infection caused by such organisms. The present invention also relates to kits for use with the diagnostic methods. The methods and kits of the present invention are particularly useful in relation to human and non-human (i.e., veterinary) subjects. The present invention further relates to methods for identifying proteins or nucleic acid sequences associated with chlamydial infection in a subject. Such proteins or nucleic acid sequences are not only useful in relation to the diagnostic methods of the invention but are also useful in the development of methods and agents for preventing and/or treating chlamydial infection in a subject, such as, but not limited to, immunotherapeutic methods and agents.
Abstract:
Background: Changing perspectives on the natural history of celiac disease (CD), new serology and genetic tests, and amended histological criteria for diagnosis cast doubt on past prevalence estimates for CD. We set out to establish a more accurate prevalence estimate for CD using a novel serogenetic approach. Methods: The human leukocyte antigen (HLA)-DQ genotype was determined in 356 patients with 'biopsy-confirmed' CD, and in two age-stratified, randomly selected community cohorts of 1,390 women and 1,158 men. Sera were screened for CD-specific serology. Results: Only five 'biopsy-confirmed' patients with CD did not possess the susceptibility alleles HLA-DQ2.5, DQ8, or DQ2.2, and four of these were misdiagnoses. HLA-DQ2.5, DQ8, or DQ2.2 was present in 56% of all women and men in the community cohorts. Transglutaminase (TG)-2 IgA and composite TG2/deamidated gliadin peptide (DGP) IgA/IgG were abnormal in 4.6% and 5.6%, respectively, of the community women and 6.9% and 6.9%, respectively, of the community men, but in the screen-positive group, only 71% and 75%, respectively, of women and 65% and 63%, respectively, of men possessed HLA-DQ2.5, DQ8, or DQ2.2. Medical review was possible for 41% of seropositive women and 50% of seropositive men, and led to biopsy-confirmed CD in 10 women (0.7%) and 6 men (0.5%), but based on relative risk for HLA-DQ2.5, DQ8, or DQ2.2 in all TG2 IgA or TG2/DGP IgA/IgG screen-positive subjects, CD affected 1.3% or 1.9%, respectively, of women and 1.3% or 1.2%, respectively, of men. Serogenetic data from these community cohorts indicated that testing screen positives for HLA-DQ, or carrying out HLA-DQ typing and further serology, could have reduced unnecessary gastroscopies due to false-positive serology by at least 40% and by over 70%, respectively. Conclusions: Screening with TG2 IgA serology and requiring biopsy confirmation caused the community prevalence of CD to be substantially underestimated. Testing for HLA-DQ genes and confirmatory serology could reduce the number of unnecessary gastroscopies.
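The triage logic the authors propose can be sketched in a few lines of Python. This is a minimal sketch of the decision sequence described in the conclusions, assuming hypothetical input flags and illustrative decision labels; it is not the study protocol.

```python
def triage_screen_positive(has_hla_dq_risk_allele: bool,
                           confirmatory_serology_positive: bool) -> str:
    """Sketch of serogenetic triage of TG2-IgA screen positives.

    has_hla_dq_risk_allele: carries HLA-DQ2.5, DQ8, or DQ2.2.
    Per the abstract, HLA-DQ typing alone could cut unnecessary
    gastroscopies by at least 40%, and adding confirmatory serology
    by over 70%. Return labels are illustrative.
    """
    if not has_hla_dq_risk_allele:
        # CD is very unlikely without a permissive genotype.
        return "no gastroscopy: probable false-positive screen"
    if not confirmatory_serology_positive:
        return "no gastroscopy: repeat or extend serology first"
    return "refer for confirmatory biopsy"
```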
Abstract:
The risk of developing osteoporosis is determined by the interaction of several mostly unknown genes and environmental factors. Genetic studies in osteoporosis have largely focussed on association studies of a small number of candidate genes, with few linkage studies performed, and large areas of the genome remaining unexplored. Identifying the genes involved in osteoporosis would be a major advance in our understanding of the causation of the disease, and lead to advances in diagnosis, risk prediction, and potentially preventive and therapeutic measures.
Abstract:
IMPORTANCE Patients with chest pain represent a high health care burden, but it may be possible to identify a patient group with a low short-term risk of adverse cardiac events who are suitable for early discharge. OBJECTIVE To compare the effectiveness of a rapid diagnostic pathway with a standard-care diagnostic pathway for the assessment of patients with possible cardiac chest pain in a usual clinical practice setting. DESIGN, SETTING, AND PARTICIPANTS A single-center, randomized parallel-group trial with blinded outcome assessments was conducted in an academic general and tertiary hospital. Participants included adults with acute chest pain consistent with acute coronary syndrome for whom the attending physician planned further observation and troponin testing. Patient recruitment occurred from October 11, 2010, to July 4, 2012, with a 30-day follow-up. INTERVENTIONS An experimental pathway using an accelerated diagnostic protocol (Thrombolysis in Myocardial Infarction score, 0; electrocardiography; and 0- and 2-hour troponin tests) or a standard-care pathway (troponin test on arrival at hospital, prolonged observation, and a second troponin test 6-12 hours after onset of pain) serving as the control. MAIN OUTCOMES AND MEASURES Discharge from the hospital within 6 hours without a major adverse cardiac event occurring within 30 days. RESULTS Fifty-two of 270 patients in the experimental group were successfully discharged within 6 hours compared with 30 of 272 patients in the control group (19.3% vs 11.0%; odds ratio, 1.92; 95% CI, 1.18-3.13; P = .008). It required 20 hours to discharge the same proportion of patients from the control group as achieved in the experimental group within 6 hours. In the experimental group, 35 additional patients (12.9%) were classified as low risk but admitted to an inpatient ward for cardiac investigation. None of the 35 patients received a diagnosis of acute coronary syndrome after inpatient evaluation. CONCLUSIONS AND RELEVANCE Using the accelerated diagnostic protocol in the experimental pathway almost doubled the proportion of patients with chest pain discharged early. Clinicians could discharge approximately 1 of 5 patients with chest pain to outpatient follow-up monitoring in less than 6 hours. This diagnostic strategy could be easily replicated in other centers because no extra resources are required.
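The experimental pathway is a simple conjunction of criteria, which a short sketch can make explicit. The hypothetical Python fragment below mirrors the accelerated diagnostic protocol described above; the troponin cutoff is a placeholder, since assay-specific thresholds vary and the trial's values are not reproduced here.

```python
def adp_eligible_for_early_discharge(timi_score: int,
                                     ecg_ischaemia: bool,
                                     troponin_0h_ng_ml: float,
                                     troponin_2h_ng_ml: float,
                                     troponin_cutoff_ng_ml: float = 0.03) -> bool:
    """Accelerated diagnostic protocol, sketched: TIMI score of 0,
    no ischaemic ECG changes, and negative troponin at 0 and 2 hours.
    The cutoff is a placeholder, not the study's assay threshold."""
    return (timi_score == 0
            and not ecg_ischaemia
            and troponin_0h_ng_ml < troponin_cutoff_ng_ml
            and troponin_2h_ng_ml < troponin_cutoff_ng_ml)
```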
Abstract:
Objective: Risk scores and accelerated diagnostic protocols can identify chest pain patients with a low risk of major adverse cardiac events who could be discharged early from the ED, saving time and costs. We aimed to derive and validate a chest pain score and accelerated diagnostic protocol (ADP) that could safely increase the proportion of patients suitable for early discharge. Methods: Logistic regression identified statistical predictors of major adverse cardiac events in a derivation cohort. Statistical coefficients were converted to whole numbers to create a score. Clinician feedback was used to improve the clinical plausibility and usability of the final score (Emergency Department Assessment of Chest pain Score [EDACS]). EDACS was combined with electrocardiogram results and troponin results at 0 and 2 h to develop an ADP (EDACS-ADP). The score and EDACS-ADP were validated and tested for reproducibility in separate cohorts of patients. Results: In the derivation (n = 1974) and validation (n = 608) cohorts, the EDACS-ADP classified 42.2% (sensitivity 99.0%, specificity 49.9%) and 51.3% (sensitivity 100.0%, specificity 59.0%) of patients, respectively, as at low risk of major adverse cardiac events. The intra-class correlation coefficient for categorisation of patients as low risk was 0.87. Conclusion: The EDACS-ADP identified approximately half of the patients presenting to the ED with possible cardiac chest pain as having a low risk of short-term major adverse cardiac events, with high sensitivity. This is a significant improvement on similar, previously reported protocols. The EDACS-ADP is reproducible and has the potential to deliver considerable cost reductions to health systems.
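The derivation step, converting logistic-regression coefficients into whole-number score weights, can be sketched briefly. The coefficients and item names below are invented for illustration only; the published EDACS items and weights are not reproduced here.

```python
# Hypothetical regression coefficients; NOT the published EDACS items.
coefficients = {"age_per_5_years": 0.21, "male_sex": 0.58, "diaphoresis": 0.44}

def to_score_weights(coefs: dict, scale: float = 5.0) -> dict:
    """Scale and round regression coefficients to whole numbers so the
    score can be summed at the bedside, as described above."""
    return {item: round(beta * scale) for item, beta in coefs.items()}

print(to_score_weights(coefficients))
# {'age_per_5_years': 1, 'male_sex': 3, 'diaphoresis': 2}
```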
Abstract:
The Fabens method is commonly used to estimate the growth parameters k and L∞ in the von Bertalanffy model from tag-recapture data. However, the Fabens method of estimation has an inherent bias when individual growth is variable. This paper presents an asymptotically unbiased method using a maximum likelihood approach that takes account of individual variability in both maximum length and age-at-tagging. It is assumed that each individual's growth follows a von Bertalanffy curve with its own maximum length and age-at-tagging. The parameter k is assumed to be constant to ensure that the mean growth follows a von Bertalanffy curve and to avoid overparameterization. Our method also makes more efficient use of the measurements at tagging and recapture and includes diagnostic techniques for checking distributional assumptions. The method is reasonably robust and performs better than the Fabens method when individual growth differs from the von Bertalanffy relationship. When measurement error is negligible, the estimation involves maximizing the profile likelihood of one parameter only. The method is applied to tag-recapture data for the grooved tiger prawn (Penaeus semisulcatus) from the Gulf of Carpentaria, Australia.
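For context, the baseline Fabens fit that this paper improves on can be written in a few lines: the expected length increment between tagging and recapture is dL = (L∞ − L1)(1 − e^(−k·dt)). The Python sketch below fits L∞ and k by nonlinear least squares on invented data, not the Gulf of Carpentaria prawn dataset, and implements only the classical Fabens approach, not the paper's maximum-likelihood refinement.

```python
import numpy as np
from scipy.optimize import curve_fit

def fabens_increment(X, Linf, k):
    """Expected von Bertalanffy growth increment between tagging and
    recapture: dL = (Linf - L1) * (1 - exp(-k * dt))."""
    L1, dt = X
    return (Linf - L1) * (1.0 - np.exp(-k * dt))

# Invented tag-recapture data (length in mm, time at liberty in years);
# not the prawn data analysed in the paper.
L1 = np.array([22.0, 25.5, 28.0, 30.2, 33.1])
dt = np.array([0.30, 0.45, 0.25, 0.60, 0.40])
dL = np.array([7.1, 6.8, 4.9, 5.6, 3.2])

(Linf_hat, k_hat), _ = curve_fit(fabens_increment, (L1, dt), dL, p0=(40.0, 1.0))
print(f"Linf = {Linf_hat:.1f} mm, k = {k_hat:.2f} per year")
```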
Abstract:
Behçet's syndrome is very rare in children, especially those under 10 years of age. Clinical and radiological features of the syndrome are described in 4 children, including 2 under the age of 5 years. As in other pediatric cases reported, the incomplete form of Behçet's syndrome was present in each case. All 4 patients had oral and genital mucosal involvement, arthritis, and gastrointestinal and dermatological manifestations. Ophthalmological symptoms occurred in only 1 patient. Radiologically, the 4 cases demonstrated the spectrum of gastrointestinal involvement, from minimal irregularity and thickening of the terminal ileum to gross irregularity and deformity of the terminal ileum and cecum. Because of the difficulty in differentiating Behçet's syndrome from other forms of inflammatory bowel disease, it is suggested that in children with gastrointestinal involvement, 3 major criteria be present before the diagnosis of Behçet's syndrome is made.
Abstract:
Degradation of RNA in diagnostic specimens can cause false-negative test results and potential misdiagnosis when tests rely on the detection of a specific RNA sequence. Current molecular methods of checking RNA integrity tend to be host species or group specific, necessitating libraries of primers and reaction conditions. The objective here was to develop a universal (multi-species) quality assurance tool for determining the integrity of RNA in animal tissues submitted to a laboratory for analysis. Ribosomal RNA (16S rRNA) transcribed from the mitochondrial 16S rDNA was used as template material for reverse transcription to cDNA and was amplified using the polymerase chain reaction (PCR). As mitochondrial DNA is highly conserved, the primers used were shown to reverse transcribe and amplify RNA from every animal species tested. Deliberate degradation of the rRNA template through temperature abuse of samples resulted in no reverse transcription or amplification. Samples spiked with viruses showed that single-stranded viral RNA and rRNA in the same sample degraded at similar rates; hence, reverse transcription and PCR amplification of 16S rRNA can be used as a test of sample integrity and suitability for any analysis that requires the sample's RNA, including viral RNA. This test will be an invaluable quality assurance tool for determining RNA integrity in tissue samples, thus avoiding erroneous test results that might occur if degraded target RNA is unknowingly used as template material for reverse transcription and subsequent PCR amplification.
Abstract:
Thirty-seven surface (0-0.10 or 0-0.20 m) soils covering a wide range of soil types (16 Vertosols, 6 Ferrosols, 6 Dermosols, 4 Hydrosols, 2 Kandosols, 1 Sodosol, 1 Rudosol, and 1 Chromosol) were exhaustively cropped in 2 glasshouse experiments. The test species were Panicum maximum cv. Green Panic in Experiment A and Avena sativa cv. Barcoo in Experiment B. Successive forage harvests were taken until the plants could no longer grow in most soils because of severe potassium (K) deficiency. Soil samples were taken prior to cropping and after the final harvest in both experiments, and also after the initial harvest in Experiment B. Samples were analysed for solution K, exchangeable K (Exch K), tetraphenyl borate extractable K for extraction periods of 15 min (TBK15) and 60 min (TBK60), and boiling nitric acid extractable K (Nitric K). Inter-correlations between the initial levels of the various soil K parameters indicated that the following pools were in sequential equilibrium: solution K, Exch K, fast release fixed K [estimated as (TBK15-Exch K)], and slow release fixed K [estimated as (TBK60-TBK15)]. Structural K [estimated as (Nitric K-TBK60)] was not correlated with any of the other pools. However, following exhaustive drawdown of soil K by cropping, structural K became correlated with solution K, suggesting dissolution of K minerals when solution K was low. The change in the various K pools following cropping was correlated with K uptake at Harvest 1 (Experiment B only) and cumulative K uptake (both experiments). The change in Exch K for 30 soils was linearly related to cumulative K uptake (r = 0.98), although on average, K uptake was 35% higher than the change in Exch K. For the remaining 7 soils, K uptake considerably exceeded the change in Exch K. However, the changes in TBK15 and TBK60 were both highly linearly correlated with K uptake across all soils (r = 0.95 and 0.98, respectively). The slopes of the regression lines were not significantly different from unity, and the y-axis intercepts were very small. These results indicate that the plant is removing K from the TBK pool. Although the change in Exch K did not consistently equate with K uptake across all soils, initial Exch K was highly correlated with K uptake (r = 0.99) if one Vertosol was omitted. Exchangeable K is therefore a satisfactory diagnostic indicator of soil K status for the current crop. However, the change in Exch K following K uptake is soil-dependent, and many soils with large amounts of TBK relative to Exch K were able to buffer changes in Exch K. These soils tended to be Vertosols occurring on floodplains. In contrast, 5 soils (a Dermosol, a Rudosol, a Kandosol, and 2 Hydrosols) with large amounts of TBK did not buffer decreases in Exch K caused by K uptake, indicating that the TBK pool in these soils was unavailable to plants under the conditions of these experiments. It is likely that K fertiliser recommendations will need to take account of whether the soil has TBK reserves, and the availability of these reserves, when deciding rates required to raise exchangeable K status to adequate levels.
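The pool definitions in this abstract reduce to simple differences of measured quantities; the short Python sketch below makes the bookkeeping explicit. The numeric values are invented for illustration and are not taken from the 37 soils in the study.

```python
# Measured K fractions for one hypothetical soil (mg/kg); values are
# invented for illustration only.
exch_k, tbk15, tbk60, nitric_k = 180.0, 420.0, 560.0, 1500.0

# Pool definitions exactly as given in the abstract:
fast_release_fixed_k = tbk15 - exch_k    # TBK15 - Exch K
slow_release_fixed_k = tbk60 - tbk15     # TBK60 - TBK15
structural_k = nitric_k - tbk60          # Nitric K - TBK60

print(fast_release_fixed_k, slow_release_fixed_k, structural_k)
# 240.0 140.0 940.0
```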