99 results for HYPOTHESIS TESTS
Abstract:
The performances of two rapid tests and a standard serological test for the diagnosis of visceral leishmaniasis (VL) were compared using sera from 193 patients with VL and 85 controls. The Kala-Azar Detect®, IT-LEISH® and IFI-LH® assays showed sensitivities of 88.1%, 93.3% and 88.6%, respectively, and specificities of 90.6%, 96.5% and 80%, respectively. The sensitivity values were similar for both rapid tests, but the specificity and positive predictive values of IT-LEISH® were higher than the corresponding values for IFI-LH®. Both rapid tests showed satisfactory performances and can be used in primary health care settings; however, IT-LEISH® permits the use of whole blood, making this assay more suitable for bedside diagnosis.
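Headline figures like these come straight from a 2x2 confusion table. A minimal sketch in Python; the counts below are illustrative only, chosen to roughly reproduce the reported Kala-Azar Detect® values, and are not the study's raw data:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and positive predictive value
    from the four cells of a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)   # true positives / all diseased
    specificity = tn / (tn + fp)   # true negatives / all controls
    ppv = tp / (tp + fp)           # true positives / all test-positives
    return sensitivity, specificity, ppv

# Illustrative counts only (193 patients, 85 controls, as in the abstract):
sens, spec, ppv = diagnostic_metrics(tp=170, fn=23, tn=77, fp=8)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  PPV={ppv:.1%}")
```

The comparison of positive predictive values mentioned in the abstract falls out of the same table, which is why PPV (unlike sensitivity) shifts with the control group's false-positive count.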
Abstract:
A single strain of Mycobacterium abscessus subsp. bolletii, characterised by a particular rpoB sequevar and two highly related pulsed-field gel electrophoresis patterns, has been responsible for a nationwide outbreak of surgical infections in Brazil since 2004. In this study, we developed molecular tests based on polymerase chain reaction restriction-enzyme analysis (PRA) and sequencing for the rapid identification of this strain. Sequences of 15 DNA regions conserved in mycobacteria were retrieved from GenBank or sequenced and analysed in silico. Single nucleotide polymorphisms specific to the epidemic strain and located in enzyme recognition sites were detected in rpoB, the 3' region of the 16S rDNA and gyrB. The three tests that were developed, i.e., PRA-rpoB, PRA-16S and gyrB sequence analysis, showed 100%, 100% and 92.31% sensitivity and 93.06%, 90.28% and 100% specificity, respectively, for discriminating the surgical strain from other M. abscessus subsp. bolletii isolates, including 116 isolates from 95 patients, one environmental isolate and two type strains. The results of the three tests were stable, as shown by the results obtained for different isolates from the same patient. In conclusion, given the clinical and epidemiological importance of this strain, these tests could be implemented in reference laboratories for rapid preliminary diagnosis and epidemiological surveillance of this epidemic strain.
Abstract:
The diagnosis of leprosy continues to be based on clinical symptoms, and early diagnosis and treatment are critical to preventing disability and transmission. Sensitive and specific laboratory tests are not available for diagnosing leprosy. Despite the limited applicability of anti-phenolic glycolipid-I (PGL-I) serology for diagnosis, it has been suggested as an additional tool to classify leprosy patients (LPs) for treatment purposes. Two formats of rapid tests to detect anti-PGL-I antibodies [ML immunochromatography assay (ICA) and ML Flow] were compared in different groups, namely multibacillary (MB) patients, paucibacillary (PB) patients, household contacts and healthy controls in Brazil and Nepal. High ML Flow intra-test concordance was observed, along with low to moderate agreement between the results of the ML ICA and ML Flow tests on the serum of LPs. LPs were "seroclassified" according to the results of these tests, and the seroclassification was compared to other currently used classification systems: the World Health Organization operational classification, the bacilloscopic index and the Ridley-Jopling classification. When analysing the usefulness of these tests in the operational classification of PB and MB leprosy for treatment and follow-up purposes, the ML Flow test was the best point-of-care test for subjects in Nepal; despite the need for sample dilution, the ML ICA test performed better among Brazilian subjects. Our results identified possible ways to improve the performance of both tests.
Abstract:
Mice experimentally infected with a pathogenic strain of Leptospira interrogans serovar Canicola produced false negative results (prozone effect) in a microscopic agglutination test (MAT). This prozone effect occurred in several serum samples collected at different post-infection times, but it was more prominent in samples collected 7-42 days post-infection and at 1:50 and 1:100 sample dilutions. This phenomenon was correlated with increased antibody titres in the early post-infection phase. While prozone effects are often observed in serological agglutination assays for the diagnosis of animal brucellosis and human syphilis, they are not widely reported in leptospirosis MATs.
Abstract:
Despite major improvements in its treatment and diagnosis, sepsis is still a leading cause of death and of admission to the intensive care unit (ICU). Failure to identify patients at high risk of developing septic shock contributes to an increased sepsis burden, and rapid molecular tests are currently the most promising avenue to aid in patient risk determination and therapeutic anticipation. The primary goal of this study was to evaluate the genetic susceptibility that affects sepsis outcome in 72 sepsis patients admitted to the ICU. Seven polymorphisms were genotyped in key inflammatory response genes in sepsis, including tumour necrosis factor-α, interleukin (IL)-1β, IL-10, IL-8, Toll-like receptor 4, CXCR1 and CXCR2. The primary finding was that patients homozygous for the major A allele in IL-10 rs1800896 were almost five times more likely to develop septic shock than heterozygotes. Similarly, selected clinical features and the CXCR2 rs1126579 single nucleotide polymorphism modulated septic shock susceptibility without affecting survival. These data support the hypothesis that molecular testing is clinically useful for improving sepsis prognostic models. Therefore, enriching the ICU portfolio with these biomarkers will aid in the early identification of sepsis patients who may develop septic shock.
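A "five times higher chance" of this kind is an odds ratio. As a hedged sketch of how such a figure and its Wald 95% confidence interval are derived from genotype-by-outcome counts (the counts below are invented for illustration and are not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald (log-scale) 95% CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Invented counts: 20 shock / 20 no-shock among AA homozygotes,
# 5 shock / 25 no-shock among heterozygotes -> OR = 5.0
print(odds_ratio_ci(20, 20, 5, 25))
```

Note that the interval is symmetric on the log scale, not the raw scale, which is why published ORs often carry lopsided CIs.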
Abstract:
There is insufficient evidence of the usefulness of dengue diagnostic tests under routine conditions. We sought to analyse how physicians are using dengue diagnostics in order to inform research and development. Subjects attending 14 health institutions in an endemic area of Colombia with either a clinical diagnosis of dengue or for whom a dengue test was ordered were included in the study. Patterns of test use are described herein. Factors associated with the ordering of dengue diagnostic tests were identified using contingency tables, nonparametric tests and logistic regression. A total of 778 subjects were diagnosed with dengue by the treating physician, of whom 386 (49.5%) were tested for dengue. Another 491 dengue tests were ordered for subjects whose primary diagnosis was not dengue. Severe dengue classification [odds ratio (OR) 2.2; 95% confidence interval (CI) 1.1-4.5], emergency consultation (OR 1.9; 95% CI 1.4-2.5) and month of the year (OR 3.1; 95% CI 1.7-5.5) were independently associated with the ordering of dengue tests. Dengue tests were used both to rule in and to rule out the diagnosis. The latter use is not justified by the sensitivity of current rapid dengue diagnostic tests. Ordering of dengue tests appears to depend on a combination of factors, including physician and institutional preferences, as well as other patient and epidemiological factors.
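Associations in contingency tables like those above are commonly screened with a Pearson chi-square test before fitting a logistic regression. A minimal sketch for the 2x2 case; the counts are invented for illustration and do not come from this study:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction)
    for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d)
    )

# Invented counts: tested vs not tested, by emergency vs routine consultation.
stat = chi2_2x2(20, 30, 10, 40)
print(stat > 3.841)  # 3.841 = chi-square critical value, 1 df, alpha = 0.05
```

For small expected cell counts, Fisher's exact test is the usual fallback; the chi-square approximation is adequate for tables of the size reported here.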
Abstract:
This study aimed to standardise an in-house real-time polymerase chain reaction (rtPCR) to allow quantification of hepatitis B virus (HBV) DNA in serum or plasma samples, and to compare this method with two commercial assays, the Cobas Amplicor HBV monitor and the Cobas AmpliPrep/Cobas TaqMan HBV test. Samples from 397 patients from the state of São Paulo were analysed by all three methods. Fifty-two samples were from patients who were human immunodeficiency virus and hepatitis C virus positive, but HBV negative. Genotypes were characterised, and the viral load was measured in each sample. The in-house rtPCR showed an excellent success rate compared with the commercial tests; inter-assay and intra-assay coefficients correlated with the commercial tests (r = 0.96 and r = 0.913, p < 0.001), and the in-house test showed no genotype-dependent differences in detection and quantification rates. The in-house assay tested in this study could be used for screening and quantifying HBV DNA in order to monitor patients during therapy.
Abstract:
Low malathion concentrations influence metabolism in Chironomus sancticaroli (Diptera, Chironomidae) in acute and chronic toxicity tests. Organophosphate compounds are used in agro-systems and in programs to control pathogen vectors. Because they are continuously applied, organophosphates often reach water sources and may have an impact on aquatic life. The effects of acute and chronic exposure to the organophosphate insecticide malathion on the midge Chironomus sancticaroli were evaluated using three biochemical biomarkers: acetylcholinesterase (AChE) and the alpha (EST-α) and beta (EST-β) esterases. Acute bioassays with five concentrations of malathion and chronic bioassays with two concentrations were carried out. In the acute exposure test, AChE, EST-α and EST-β activities declined by 66, 40 and 37%, respectively, at 0.251 µg L-1, and by more than 80% at 1.37, 1.96 and 2.51 µg L-1. In the chronic exposure tests, AChE and EST-α activities declined by 28 and 15%, respectively, at 0.251 µg L-1. The results of the present study show that low concentrations of malathion can influence larval metabolism, indicating high toxicity for Chironomus sancticaroli and an environmental risk associated with the use of organophosphates.
Abstract:
The application of organic wastes to agricultural soils is not risk-free and can affect soil invertebrates. Ecotoxicological tests based on the behavioral avoidance of earthworms and springtails were performed to evaluate the effects of different fertilization strategies on soil quality and habitat function for soil organisms. These tests were performed in soils treated with: i) slurry and chemical fertilizers, according to the conventional fertilization management of the region; ii) conventional fertilization + sludge; and iii) an unfertilized reference soil. Both fertilization strategies contributed to mitigating soil acidity and caused no increase in soil heavy metal content. Avoidance test results showed no negative effects of these strategies on soil organisms compared with the reference soil. However, the results of the two fertilization managements differed: springtails did not avoid soils fertilized with dairy sludge in any of the tested combinations, whereas earthworms avoided soils treated with sludge as of May 2004 (DS1) when compared with conventional fertilization. Possibly, the behavioral avoidance of earthworms is more sensitive than that of springtails to soil properties other than texture, organic matter and heavy metal content.
Abstract:
Due to the difficulty of estimating water percolation in unsaturated soils, the purpose of this study was to estimate water percolation based on time-domain reflectometry (TDR). TDR probes were installed in two drainage lysimeters with different soil textures, forming a water monitoring system consisting of different numbers of probes. The soils were saturated and covered with plastic to prevent evaporation. Internal drainage tests were carried out using a TDR 100 unit with dielectric readings taken at constant intervals (every 15 min). To test the consistency of the TDR-estimated percolation levels against the leachate levels observed in the drainage lysimeters, the combined null hypothesis was tested at 5% probability. A higher number of probes in the water monitoring system brought the percolation levels estimated from TDR-based moisture data closer to the levels measured by the lysimeters. The number of probes required for water monitoring to estimate water percolation by TDR depends on the soil physical properties. For the sandy clay soil, three batteries of four probes installed at depths of 0.20, 0.40, 0.60 and 0.80 m, at distances of 0.20, 0.40 and 0.60 m from the center of the lysimeters, were sufficient to estimate percolation levels equivalent to those observed. In the sandy loam soil, the observed and predicted percolation levels were not equivalent even when using four batteries of four probes each, at depths of 0.20, 0.40, 0.60 and 0.80 m.
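The "combined null hypothesis" in studies of this kind is usually the joint test that the regression of observed on estimated values has intercept 0 and slope 1. One standard way to carry it out is a general-linear-hypothesis F test comparing the free regression against the fixed 1:1 line; a sketch with made-up data, not the lysimeter measurements:

```python
def joint_identity_f(x, y):
    """F statistic for the combined null H0: intercept = 0 and slope = 1,
    comparing the free regression of y on x (full model) against the
    fixed 1:1 line y = x (reduced model). Reference df are (2, n - 2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # fitted slope
    a = my - b * mx                    # fitted intercept
    sse_full = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    sse_null = sum((yi - xi) ** 2 for xi, yi in zip(x, y))
    return ((sse_null - sse_full) / 2) / (sse_full / (n - 2))

# Made-up example: estimates close to the 1:1 line give a small F...
x = list(range(1, 11))
near = [xi + (0.1 if i % 2 == 0 else -0.1) for i, xi in enumerate(x)]
# ...while a systematic offset inflates it.
off = [xi + 2 + (0.1 if i % 2 == 0 else -0.1) for i, xi in enumerate(x)]
print(joint_identity_f(x, near), joint_identity_f(x, off))
```

With n = 10, the 5% critical value is F(2, 8) ≈ 4.46; rejecting means estimated and observed percolation are not statistically equivalent, which is the logic behind the sandy loam conclusion above.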
Abstract:
The cropping system influences the interception of water by plants, water storage in depressions on the soil surface, water infiltration into the soil and runoff. The aim of this study was to quantify some hydrological processes under no-tillage cropping systems at the edge of a slope, in 2009 and 2010, in a Humic Dystrudept soil, with the following treatments: corn, soybean and common bean alone, and intercropped corn and common bean. Treatments consisted of four simulated rainfall tests at different times, with a planned intensity of 64 mm h-1 and a duration of 90 min. The first test was applied 18 days after sowing, and the others at 39, 75 and 120 days after the first test. The different times of the simulated rainfall and stages of the crop cycle affected the soil water content prior to the rain, the time runoff began and its peak flow and, thus, the surface hydrological processes. The depth of runoff and the depth of water intercepted by the crop + soil infiltration + soil surface storage were affected by the cropping systems and by the rainfall applied at different times. The corn crop was the most effective treatment for controlling runoff, with a water loss ratio of 0.38, equivalent to 75% of the water loss ratio exhibited by common bean (0.51), the least effective treatment. Total water loss by runoff decreased linearly with an increase in the time at which runoff began, regardless of treatment; soil water content on a gravimetric basis increased linearly from the beginning to the end of the rainfall.
Abstract:
Bipolaris maydis was consistently isolated from infected Paspalum atratum cv. Pojuca plants showing leaf spot symptoms in the Cerrado of Brazil, in 2002. Pathogenicity tests under greenhouse conditions and subsequent reisolations of B. maydis from artificially inoculated Pojuca seedlings confirmed the hypothesis that this fungus was the causal agent of the disease. Symptoms of leaf spot appeared four days after inoculation in 100% of the inoculated Pojuca plants. All seven species of grasses evaluated were susceptible to B. maydis. The occurrence of leaf spot of Pojuca caused by B. maydis is reported for the first time in Brazil.
Abstract:
Edge effects are considered a key factor in regulating the structure of plant communities in different ecosystems. However, although studies are few, edge influence does not seem to be decisive in semiarid regions such as the Brazilian tropical dry forest known as Caatinga; the issue nonetheless remains inconclusive. The present study tests the null hypothesis that the structure of the shrub and tree community does not change due to edge effects. Twenty-four plots (20 x 20 m) were set up in a fragment of Caatinga, 12 along the forest edges and 12 in the interior of the fragment. Tree richness, abundance and species composition did not differ between edge and interior plots. The results of this study agree with the pattern previously found for semiarid environments and contrast with results obtained in other environments, such as rainforests, savannas and Araucaria forests, where abrupt differences between the edge and the interior of plant communities have been reported. They suggest that the woody plant community of the Caatinga is not ecologically affected by the presence of edges.
Abstract:
This study aimed to describe the probabilistic structure of the annual series of extreme daily rainfall (Preabs) available from the weather station of Ubatuba, State of São Paulo, Brazil (1935-2009), using the generalised extreme value (GEV) distribution. The autocorrelation function, the Mann-Kendall test and wavelet analysis were used to evaluate the presence of serial correlation, trends and periodic components. Considering the results obtained with these three statistical methods, it was possible to assume that this temporal series is free from persistence, trends and periodic components. Based on quantitative and qualitative goodness-of-fit tests, it was found that the GEV may be used to quantify the probabilities of the Preabs data. The best GEV results were obtained when the parameters of this function were estimated by the method of maximum likelihood. The method of L-moments also showed satisfactory results.
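The Mann-Kendall trend screening mentioned above is simple enough to sketch in pure Python; the series below is illustrative and is not the Ubatuba record:

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test: S statistic and normal-approximation
    Z score (ties ignored for clarity; real data may need a tie
    correction in the variance)."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n) for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s == 0:
        return s, 0.0
    z = (s - 1 if s > 0 else s + 1) / math.sqrt(var_s)
    return s, z

# Illustrative: a strictly increasing series is flagged as a trend
# (|Z| > 1.96 at the 5% level).
s, z = mann_kendall(list(range(10)))
print(s, round(z, 2))
```

The test is rank-based, so it detects monotonic trends without assuming normality, which is why it is the usual companion to GEV fitting on extreme-value series.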