79 results for sequential-tests
Abstract:
This study compared the most recent generation of five screening tests licensed in Argentina in order to determine which has the best sensitivity for the detection of antibodies against hepatitis C virus (HCV). The tests analyzed were: Detect-HCV™ (3.0) Biochem ImmunoSystems, Canada; Hepatitis C EIA Wiener Lab., Argentina; Equipar HCV Ab, Italy; Murex HCV 4.0, UK; and the Serodia-HCV particle agglutination test, Japan. The results showed high discrepancy between the kits, indicating that some of the tests assessed have low sensitivity for anti-HCV detection in both chronic infection and early seroconversion, and that, among the kits commercially available in Argentina, Murex HCV 4.0 (UK) and the Serodia-HCV particle agglutination test (Japan) have the best sensitivity for HCV screening. Although sensitivity is the first parameter to be considered for blood screening, further studies should be carried out to assess the specificity of these assays.
Abstract:
Significant advances were made in the diagnosis of filariasis in the 1990s with the emergence of three new alternative tools: ultrasound and two tests to detect circulating antigen using the monoclonal antibodies Og4C3 and AD12 (ICT card). This study aimed to identify which of these methods is the most sensitive for the diagnosis of infection. A total of 256 individuals, all male and carrying microfilariae (1-15,679 MF/mL) diagnosed from nocturnal venous blood samples, were tested by all three techniques. The two circulating filarial antigen tests agreed 100% and correctly identified 246/256 (96.09%) of the positive individuals, while ultrasound detected only 186/256 (72.66%). Of the circulating antigen tests, the ICT card was the most convenient method for identifying Wuchereria bancrofti carriers: it was easy to perform, practical and quick.
Abstract:
Orally transmitted Chagas disease (ChD), which is a well-known entity in the Brazilian Amazon Region, was first documented in Venezuela in December 2007, when 103 people attending an urban public school in Caracas became infected by ingesting juice that was contaminated with Trypanosoma cruzi. The infection occurred 45-50 days prior to the initiation of the sampling performed in the current study. Parasitological methods were used to diagnose the first nine symptomatic patients; T. cruzi was found in all of them. However, because this outbreak was managed as a sudden emergency during Christmas time, we needed to rapidly evaluate 1,000 people at risk, so we decided to use conventional serology to detect specific IgM and IgG antibodies via ELISA as well as indirect haemagglutination, which produced positive test results for 9.1%, 11.9% and 9.9% of the individuals tested, respectively. In other more restricted patient groups, polymerase chain reaction (PCR) provided more sensitive results (80.4%) than blood cultures (16.2%) and animal inoculations (11.6%). Although the classical diagnosis of acute ChD is mainly based on parasitological findings, highly sensitive and specific serological techniques can provide rapid results during large and severe outbreaks, as described herein. The use of these serological techniques allows prompt treatment of all individuals suspected of being infected, resulting in reduced rates of morbidity and mortality.
Abstract:
The performances of two rapid tests and a standard serological test for the diagnosis of visceral leishmaniasis (VL) were compared using sera from 193 patients with VL and 85 controls. The Kala-Azar Detect®, IT-LEISH® and IFI-LH® assays showed sensitivities of 88.1%, 93.3% and 88.6%, respectively, and specificities of 90.6%, 96.5% and 80%, respectively. The sensitivity values were similar for both rapid tests, but the specificity and positive predictive values of IT-LEISH® were higher than the corresponding values for IFI-LH®. Both rapid tests showed satisfactory performances and can be used in primary health care settings; however, IT-LEISH® permits the use of whole blood, making this assay more suitable for bedside diagnosis.
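The metrics quoted above all follow from a 2x2 diagnostic table. A minimal sketch, using hypothetical counts reverse-engineered from the IT-LEISH® percentages reported here (193 VL sera, 85 controls), not the study's raw data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity and positive predictive value
    from true/false positive and negative counts."""
    sensitivity = tp / (tp + fn)  # proportion of diseased correctly detected
    specificity = tn / (tn + fp)  # proportion of controls correctly negative
    ppv = tp / (tp + fp)          # proportion of positive results that are true
    return sensitivity, specificity, ppv

# Hypothetical counts consistent with the IT-LEISH figures above:
# 180 of 193 VL sera test positive, 82 of 85 control sera test negative.
sens, spec, ppv = diagnostic_metrics(tp=180, fp=3, fn=13, tn=82)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}, PPV={ppv:.1%}")
```

With these assumed counts the sketch reproduces the reported 93.3% sensitivity and 96.5% specificity, and makes explicit how the positive predictive value depends on the same four cells.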
Abstract:
A single strain of Mycobacterium abscessus subsp. bolletii, characterised by a particular rpoB sequevar and two highly related pulsed-field gel electrophoresis patterns, has been responsible for a nationwide outbreak of surgical infections in Brazil since 2004. In this study, we developed molecular tests based on polymerase chain reaction restriction-enzyme analysis (PRA) and sequencing for the rapid identification of this strain. Sequences of 15 DNA regions conserved in mycobacteria were retrieved from GenBank or sequenced and analysed in silico. Single nucleotide polymorphisms specific to the epidemic strain and located in enzyme recognition sites were detected in rpoB, the 3' region of the 16S rDNA and gyrB. The three tests developed, i.e., PRA-rpoB, PRA-16S and gyrB sequence analysis, showed 100%, 100% and 92.31% sensitivity and 93.06%, 90.28% and 100% specificity, respectively, for discriminating the surgical strain from other M. abscessus subsp. bolletii isolates, comprising 116 isolates from 95 patients, one environmental isolate and two type strains. The results of the three tests were stable, as shown by the results obtained for different isolates from the same patient. In conclusion, given the clinical and epidemiological importance of this strain, these tests could be implemented in reference laboratories for rapid preliminary diagnosis and epidemiological surveillance of this epidemic strain.
Abstract:
The diagnosis of leprosy continues to be based on clinical symptoms, and early diagnosis and treatment are critical to preventing disability and transmission. Sensitive and specific laboratory tests are not available for diagnosing leprosy. Despite the limited applicability of anti-phenolic glycolipid-I (PGL-I) serology for diagnosis, it has been suggested as an additional tool to classify leprosy patients (LPs) for treatment purposes. Two rapid test formats for detecting anti-PGL-I antibodies [the ML immunochromatography assay (ICA) and ML Flow] were compared in different groups: multibacillary (MB) patients, paucibacillary (PB) patients, household contacts and healthy controls in Brazil and Nepal. Intra-test concordance for ML Flow was high, whereas agreement between the ML ICA and ML Flow results on LP sera was low to moderate. LPs were "seroclassified" according to the results of these tests and this seroclassification was compared with other currently used classification systems: the World Health Organization operational classification, the bacilloscopic index and the Ridley-Jopling classification. When analysing the usefulness of these tests in the operational classification of PB and MB leprosy for treatment and follow-up purposes, the ML Flow test was the best point-of-care test for subjects in Nepal; despite the need for sample dilution, the ML ICA test performed better among Brazilian subjects. Our results identified possible ways to improve the performance of both tests.
Abstract:
Mice experimentally infected with a pathogenic strain of Leptospira interrogans serovar Canicola produced false negative results (prozone effect) in a microscopic agglutination test (MAT). This prozone effect occurred in several serum samples collected at different post-infection times, but it was most prominent in samples collected 7-42 days post-infection and at the 1:50 and 1:100 sample dilutions. The phenomenon was correlated with increased antibody titres in the early post-infection phase. While prozone effects are often observed in serological agglutination assays for the diagnosis of animal brucellosis and human syphilis, they are not widely reported in leptospirosis MATs.
Abstract:
There is insufficient evidence on the usefulness of dengue diagnostic tests under routine conditions. We sought to analyse how physicians use dengue diagnostics in order to inform research and development. Subjects attending 14 health institutions in an endemic area of Colombia with either a clinical diagnosis of dengue or for whom a dengue test was ordered were included in the study. Patterns of test use are described herein. Factors associated with the ordering of dengue diagnostic tests were identified using contingency tables, nonparametric tests and logistic regression. A total of 778 subjects were diagnosed with dengue by the treating physician, of whom 386 (49.5%) were tested for dengue. Another 491 dengue tests were ordered for subjects whose primary diagnosis was not dengue. Severe dengue classification [odds ratio (OR) 2.2; 95% confidence interval (CI) 1.1-4.5], emergency consultation (OR 1.9; 95% CI 1.4-2.5) and month of the year (OR 3.1; 95% CI 1.7-5.5) were independently associated with the ordering of dengue tests. Dengue tests were used both to rule in and to rule out the diagnosis. The latter use is not justified by the sensitivity of current rapid dengue diagnostic tests. The ordering of dengue tests appears to depend on a combination of factors, including physician and institutional preferences, as well as other patient-related and epidemiological factors.
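The odds ratios and confidence intervals quoted above come from logistic regression; for a single binary factor, the unadjusted OR and its 95% CI can be obtained directly from a 2x2 table via the standard error of the log odds ratio. A minimal sketch with made-up counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a 95% CI from a 2x2 table:
    a, b = outcome present/absent among the exposed;
    c, d = outcome present/absent among the unexposed."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: dengue test ordered vs. not, by emergency consultation.
or_, lower, upper = odds_ratio_ci(120, 80, 150, 190)
print(f"OR = {or_:.1f} (95% CI {lower:.1f}-{upper:.1f})")
```

These counts give an OR of 1.9; the interval width then depends only on the four cell sizes, which is why larger studies report tighter CIs. The adjusted ORs in the abstract additionally control for the other covariates in the regression model.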
Abstract:
This study aimed to standardise an in-house real-time polymerase chain reaction (rtPCR) for the quantification of hepatitis B virus (HBV) DNA in serum or plasma samples and to compare this method with two commercial assays, the Cobas Amplicor HBV Monitor and the Cobas AmpliPrep/Cobas TaqMan HBV test. Samples from 397 patients from the state of São Paulo were analysed by all three methods; 52 of these samples were from patients who were human immunodeficiency virus and hepatitis C virus positive but HBV negative. Genotypes were characterised and the viral load of each sample was measured. The in-house rtPCR showed an excellent success rate compared with the commercial tests; its inter-assay and intra-assay coefficients correlated well with those of the commercial tests (r = 0.96 and r = 0.913, p < 0.001) and it showed no genotype-dependent differences in detection and quantification rates. The in-house assay tested in this study could be used to screen for and quantify HBV DNA in order to monitor patients during therapy.
Abstract:
Low malathion concentrations influence metabolism in Chironomus sancticaroli (Diptera, Chironomidae) in acute and chronic toxicity tests. Organophosphate compounds are used in agro-systems and in programs to control pathogen vectors. Because they are continuously applied, organophosphates often reach water sources and may have an impact on aquatic life. The effects of acute and chronic exposure to the organophosphate insecticide malathion on the midge Chironomus sancticaroli were evaluated using three biochemical biomarkers: acetylcholinesterase (AChE) and the alpha (EST-α) and beta (EST-β) esterases. Acute bioassays with five malathion concentrations and chronic bioassays with two malathion concentrations were carried out. In the acute exposure test, AChE, EST-α and EST-β activities declined by 66%, 40% and 37%, respectively, at 0.251 µg L⁻¹, and by more than 80% at 1.37, 1.96 and 2.51 µg L⁻¹. In the chronic exposure tests, AChE and EST-α activities declined by 28% and 15% at 0.251 µg L⁻¹. These results show that low concentrations of malathion can influence larval metabolism, indicating high toxicity to Chironomus sancticaroli and an environmental risk associated with the use of organophosphates.
Abstract:
The timing of N application to maize is a key factor to be considered in no-till oat/maize sequential cropping. This study aimed to evaluate the influence of pre-planting, planting and sidedress N applications on oat residue decomposition, on soil N immobilisation and remineralisation and on N uptake by maize plants in no-till oat/maize sequential cropping. Undisturbed soil cores of 10 and 20 cm diameter were collected from the 0-15 cm layer of a no-till Red Latossol when the oat cover crop was in the milk-grain stage. Two greenhouse experiments were conducted simultaneously. Experiment A, established in the 10 cm diameter cores and without plant cultivation, was used to assess N dynamics in the soil and oat residues. Experiment B, established in the 20 cm diameter cores and with maize cultivation, was used to assess plant growth and N uptake. An amount of 6.0 Mg ha⁻¹ of oat residue dry matter was spread on the surface of the cores. A rate of 90 kg N ha⁻¹, applied as ammonium sulphate in both experiments, was split between pre-planting, planting and sidedress applications as follows: (a) 00-00-00 (control), (b) 90-00-00 (pre-planting application, 20 days before planting), (c) 00-90-00 (planting application), (d) 00-30-60 (split between a planting application and a sidedress application 31 days after emergence), (e) 00-00-00* (control, without oat residue) and (f) 90-00-00* (pre-planting application, without oat residue). The N concentration and N content of the oat residues during decomposition were not affected by N fertilisation. Most of the fertiliser NH₄⁺-N was converted into NO₃⁻-N within 20 days of application. A significant decrease in NO₃⁻-N content in the 0-4 cm layer was observed in all treatments between 40 and 60 days after the oat residue was placed on the soil surface, suggesting that N immobilisation occurred in this period.
Considering that most of the inorganic N was converted into NO₃⁻ and that no immobilisation of the pre-planting fertiliser N occurred at the time of its application, it can be concluded that pre-planting applied N was prone to losses by leaching. On the other hand, with split N applications, maize plants showed N deficiency symptoms before the sidedress application. Two recommendations for fertiliser-N management in no-till oat/maize sequential cropping can be suggested: (a) if application is split, the sidedress should be applied earlier than 30 days after emergence, and (b) if a single application is preferred to save field operations, it should be made at planting.
Abstract:
The application of organic wastes to agricultural soils is not risk-free and can affect soil invertebrates. Ecotoxicological tests based on the behavioral avoidance of earthworms and springtails were performed to evaluate the effects of different fertilization strategies on soil quality and on its habitat function for soil organisms. These tests were performed on soils treated with: i) slurry and chemical fertilizers, according to the conventional fertilization management of the region; ii) conventional fertilization + sludge; and iii) an unfertilized reference soil. Both fertilization strategies contributed to mitigating soil acidity and caused no increase in soil heavy metal content. Avoidance test results showed no negative effects of these strategies on soil organisms compared with the reference soil. However, the results of the two fertilization managements differed: springtails did not avoid soils fertilized with dairy sludge in any of the tested combinations, whereas earthworms avoided soils treated with sludge as of May 2004 (DS1) when compared with conventional fertilization. Possibly, the behavioral avoidance of earthworms is more sensitive to soil properties (other than texture, organic matter and heavy metal content) than that of springtails.
Abstract:
The cropping system influences the interception of water by plants, water storage in depressions on the soil surface, water infiltration into the soil and runoff. The aim of this study was to quantify some hydrological processes under no-tillage cropping systems at the edge of a slope, in 2009 and 2010, in a Humic Dystrudept soil with the following treatments: corn, soybean and common bean alone, and intercropped corn and common bean. Treatments consisted of four simulated rainfall tests at different times, with a planned intensity of 64 mm h⁻¹ and a duration of 90 min. The first test was applied 18 days after sowing and the others at 39, 75 and 120 days after the first test. The timing of the simulated rainfall and the stage of the crop cycle affected the soil water content prior to the rain, the time runoff began and its peak flow and, thus, the surface hydrological processes. The depth of runoff and the depth of water intercepted by the crop + soil infiltration + soil surface storage were affected by the cropping systems and by the rainfall applied at different times. The corn crop was the most effective treatment for controlling runoff, with a water loss ratio of 0.38, equivalent to 75% of the water loss ratio of common bean (0.51), the least effective treatment. Total water loss by runoff decreased linearly with an increase in the time at which runoff began, regardless of the treatment; however, gravimetric soil water content increased linearly from the beginning to the end of the rainfall.
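The 75% figure above is simply the ratio of the two reported water loss ratios. A one-line check, assuming the values of 0.38 (corn) and 0.51 (common bean) quoted in the abstract:

```python
# Water loss ratios reported above: corn vs. common bean.
corn, bean = 0.38, 0.51
relative = corn / bean  # corn's water loss as a fraction of common bean's
print(f"{relative:.0%}")  # prints 75%
```

So corn lost roughly three quarters as much water to runoff as the least effective treatment.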
Abstract:
Phosphate release kinetics from manures are of global interest because sustainable plant nutrition with phosphate will be a major concern in the future. Although information on the bioavailability and chemical composition of the P present in manure used as fertilizer is important for understanding its dynamics in the soil, such studies are still scarce. Therefore, P extraction was evaluated in this study by sequential chemical fractionation, desorption with anion-cation exchange resin and ³¹P nuclear magnetic resonance (³¹P-NMR) spectroscopy to assess the P forms in three different dry manure types (i.e., poultry, cattle and swine manure). All three methods showed that the P forms in poultry, cattle and swine dry manures are mostly inorganic and highly bioavailable. The estimated P pools showed that organic and recalcitrant P forms were negligible and highly dependent on the Ca:P ratio of the manures. These results show that extracting P with these three different methods allows a better understanding and more complete characterization of the P pools present in the manures.