973 results for infection rates


Relevance: 70.00%

Abstract:

Babesia spp. infections were investigated in Bos taurus x Bos indicus dairy cows and calves and in Boophilus microplus engorged female ticks and eggs. Blood samples and engorged female ticks were collected from 25 cows and 27 calves. Babesia spp. was detected in ticks by microscopic examination of hemolymph from engorged females and of squashes of egg samples. Cattle infection was investigated in thin blood smears and by DNA amplification methods (PCR and nested PCR), using specific primers for Babesia bovis and Babesia bigemina. Merozoites of B. bovis (3 animals) and B. bigemina (12 animals) were detected exclusively in blood smears of calves. DNA amplification methods revealed that the frequencies of B. bigemina infection in calves (92.6%) and in cows (84%) and of B. bovis in calves (85.2%) and in cows (100%) did not differ significantly (P > 0.05). Babesia spp. infection was more frequent in female ticks and eggs collected from calves than from cows (P < 0.01), especially in those with patent parasitemia. Hatching rates of B. microplus larvae were assessed according to the origin of the engorged females, parasitemia of the vertebrate host, frequency and intensity of infection in engorged female ticks, and frequency of egg infection. The hatching rate was lower in samples collected from calves than from cows (P < 0.01), and in those in which Babesia spp. was detected in egg samples (P < 0.01). Published by Elsevier B.V.

Relevance: 70.00%

Abstract:

Considering that little is known about the epidemiology of Neospora caninum infection in humans, particularly in populations with high Toxoplasma gondii infection rates, the present study aimed to investigate the presence of antibodies to N. caninum in T. gondii-seropositive and -seronegative individuals. A total of 256 serum samples divided into four groups (61 samples from human immunodeficiency virus [HIV]-positive patients, 50 samples from patients with neurological disorders, 91 samples from newborns, and 54 samples from healthy subjects) were assessed for N. caninum and T. gondii serologies by indirect fluorescent-antibody test, enzyme-linked immunosorbent assay, and immunoblotting (IB). Immunoglobulin G antibodies to N. caninum were predominantly detected in HIV-infected patients (38%) and patients with neurological disorders (18%), while newborns and healthy subjects showed lower seropositivity rates (5% and 6%, respectively). Seropositivity to N. caninum was significantly associated with seropositivity to T. gondii in both HIV-infected patients and patients with neurological disorders. Seroreactivity to N. caninum was confirmed by IB, with positive sera predominantly recognizing the 29-kDa antigen of N. caninum. The results of this study indicate the presence of N. caninum infection or exposure in humans, particularly in HIV-infected patients or patients with neurological disorders, who could have opportunistic and concurrent infections with T. gondii. These findings raise a new concern regarding the clinical stability of HIV-infected patients and the actual role of N. caninum infection in immunocompromised patients.

Relevance: 70.00%

Abstract:

The present study provides the first epidemiological data on infection with Babesia bovis in cattle raised in the southwestern Brazilian Amazon. PCR and nested PCR were applied to assess the frequency of infection with B. bovis in calves aged 4 to 12 months bred in four microregions in each of the states of Rondônia and Acre. Blood clot samples were filtered through nylon cloth before DNA extraction, and infection in cattle was then investigated by amplification of the B. bovis rap1 gene. The DNA amplification results revealed a frequency of infection with B. bovis of 95.1% (272/286) in the samples from Rondônia and 96.1% (195/203) in those from Acre. The high frequency of B. bovis infection in animals aged 4 to 12 months indicates a situation of enzootic stability in the regions studied. The infection rates are comparable to those detected by immunodiagnostic techniques in other endemic regions of Brazil. © 2012 Elsevier GmbH.

Relevance: 70.00%

Abstract:

In the laboratory, Amblyomma cajennense (Fabricius) (Acari: Ixodidae) larvae, nymphs and adults were exposed to Rickettsia rickettsii by feeding on needle-inoculated animals, and were thereafter reared on uninfected guinea pigs or rabbits. Regardless of the tick stage that acquired the infection, subsequent tick stages were shown to be infected (confirming transstadial and transovarial transmission) and were able to transmit R. rickettsii to uninfected animals, as demonstrated by serological and molecular analyses. However, the larval, nymphal and adult stages of A. cajennense were shown to be partially refractory to R. rickettsii infection, as in all cases only part of the ticks became infected after exposure to rickettsemic animals. In addition, fewer than 50% of the infected engorged females transmitted rickettsiae transovarially, and when they did so, only part of the offspring became infected, indicating that vertical transmission alone is not enough to maintain R. rickettsii in A. cajennense for multiple generations. Finally, the R. rickettsii-infected tick groups had lower reproductive performance than the uninfected control group. Our results indicate that A. cajennense has a low efficiency in maintaining R. rickettsii over successive generations, since R. rickettsii infection rates should decline drastically from one tick generation to the next.

Relevance: 70.00%

Abstract:

Background: Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have previously demonstrated that a patient's antibody reaction pattern in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score) provides information on the duration of infection that is unaffected by clinical, immunological and viral variables. In this report we set out to determine the diagnostic performance of Inno-Lia algorithms for identifying incident infections in patients with known duration of infection, and evaluated the algorithms in annual cohorts of HIV notifications. Methods: Diagnostic sensitivity was determined in 527 treatment-naive patients infected for up to 12 months. Specificity was determined in 740 patients infected for longer than 12 months. Plasma was tested by Inno-Lia and classified as either incident (≤ 12 months) or older infection by 26 different algorithms. Incident infection rates (IIR) were calculated based on the diagnostic sensitivity and specificity of each algorithm and the rule that the total number of incident results is the sum of true-incident and false-incident results, which can be calculated from the predetermined sensitivity and specificity. Results: The 10 best algorithms had a mean raw sensitivity of 59.4% and a mean specificity of 95.1%. Adjustment for overrepresentation of patients in the first quarter year of infection further reduced the sensitivity; in the preferred model, the mean adjusted sensitivity was 37.4%. Application of the 10 best algorithms to four annual cohorts of HIV-1 notifications totalling 2595 patients yielded a mean IIR of 0.35 in 2005/6 (baseline) and of 0.45, 0.42 and 0.35 in 2008, 2009 and 2010, respectively. The increase between baseline and 2008 and the ensuing decreases were highly significant. Other adjustment models yielded different absolute IIRs, although the relative changes between the cohorts were identical for all models. Conclusions: The method can be used for comparing IIRs in annual cohorts of HIV notifications. The use of several different algorithms in combination, each with its own sensitivity and specificity to detect incident infection, is advisable, as this reduces the impact of individual imperfections stemming primarily from relatively low sensitivities and sampling bias.
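
The rule described above, that observed incident results are the sum of true-incident and false-incident results, can be inverted to correct a raw incident count for imperfect sensitivity and specificity. The sketch below illustrates that standard misclassification adjustment in Python; the cohort counts are hypothetical, and only the 59.4% sensitivity and 95.1% specificity figures come from the abstract, so this is not the authors' exact adjustment model.

    # Misclassification adjustment: observed incident results combine
    # true incidents detected (sensitivity * T) and older infections
    # misclassified as incident ((1 - specificity) * (N - T)).
    # Solving for the true incident fraction T/N gives the formula below.
    # Illustrative sketch only, not the study's published model.

    def adjusted_incident_fraction(observed_incident, total, sensitivity, specificity):
        p_obs = observed_incident / total
        return (p_obs - (1.0 - specificity)) / (sensitivity - (1.0 - specificity))

    # Hypothetical cohort: 100 of 600 notifications classified as incident
    # by an algorithm with 59.4% sensitivity and 95.1% specificity.
    print(round(adjusted_incident_fraction(100, 600, 0.594, 0.951), 3))  # -> 0.216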

Relevance: 70.00%

Abstract:

The purpose of this study was to acquire information about the effect of an antibacterial, biodegradable poly-L-lactide (PLLA) coating on titanium plate osteosynthesis on local infection resistance. For our in vitro and in vivo experiments, we used six-hole AO DC minifragment titanium plates. The implants were coated with biodegradable, semiamorphous PLLA (coating approximately 30 µm thick), which acted as a carrier substance to which either antibiotics or antiseptics were added. The antibiotic we applied was a combination of rifampicin and fusidic acid; the antiseptic was a combination of octenidine and Irgasan. This produced the following groups: Group I, six-hole AO DC minifragment titanium plate without PLLA; Group II, plate with PLLA but without antibiotics/antiseptics; Group III, plate with PLLA + 3% rifampicin and 7% fusidic acid; Group IV, plate with PLLA + 2% octenidine and 8% Irgasan. In vitro, we investigated the degradation of the PLLA coating and the release of the incorporated agents over a period of 6 weeks, the bactericidal efficacy of the antibiotics/antiseptics after their release from the coating, and the adhesion of Staphylococcus aureus to the implants. In vivo, we compared the infection rates in white New Zealand rabbits after titanium plate osteosynthesis of the tibia with or without antibacterial coating, after local percutaneous bacterial inoculation at different concentrations (2 x 10^5 to 2 x 10^8): the plate, the contaminated soft tissues and the underlying bone were removed under sterile conditions after 28 days and quantitatively evaluated for bacterial growth. A stepwise experimental design with an "up-and-down" dosage technique was used to adjust the bacterial challenge in the area of the ID50 (50% infection dose). Statistical evaluation of the differences between the infection rates of the groups was performed using the two-sided Fisher exact test (p < 0.05). Over a period of 6 weeks, a continuous degradation of the PLLA coating of 13%, on average, was seen in vitro in 0.9% NaCl solution. The elution tests on titanium implants with antibiotic or antiseptic coatings produced average release values of 60% of the incorporated antibiotic or 62% of the incorporated antiseptic within the first 60 min. This was followed by a much slower, but nevertheless continuous, release of the incorporated antibiotic and antiseptic over days and weeks. At the end of the 42-day test period, 20% of the incorporated antibiotic and 15% of the incorporated antiseptic had not yet been released from the coating. The antibacterial effect of the antibiotic/antiseptic is not lost by integrating it into the PLLA coating. The overall infection rate in the in vivo investigation was 50%. In Groups I and II the infection rate was 83% each (10 of 12 animals); in Groups III and IV, with antibacterial coating, it was 17% each (2 of 12 animals). The ID50 in the antibacterially coated Groups III and IV was 1 x 10^8 CFU, whereas the ID50 in Groups I and II without antibacterial coating was a hundred times lower, at 1 x 10^6 CFU. The difference between the groups with and without antibacterial coating was statistically significant (p = 0.033). Using an antibacterial, biodegradable PLLA coating on titanium plates, a significant reduction of the infection rate could thus be demonstrated in vitro and in vivo. For the first time, to our knowledge, we were able to show, under standardized and reproducible conditions, that an antiseptic coating leads to the same reduction in infection rate as an antibiotic coating. Taking the problem of antibiotic-induced bacterial resistance into consideration, we regard the antiseptic coating, which shows the same level of effectiveness, as advantageous.
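
The group comparison above relies on a two-sided Fisher exact test. A minimal SciPy sketch of that test is shown below, using the per-group counts quoted in the abstract (10 of 12 versus 2 of 12 infected animals); because the abstract does not spell out exactly which counts entered the published test, this is an illustration of the method rather than a reproduction of the reported p = 0.033.

    # Two-sided Fisher exact test on a 2x2 infection table (illustrative).
    from scipy.stats import fisher_exact

    table = [[10, 2],   # uncoated plates: infected, not infected
             [2, 10]]   # antibacterially coated plates: infected, not infected
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(odds_ratio, round(p_value, 4))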

Relevance: 70.00%

Abstract:

OBJECTIVE: Nursing in 'live islands' and routine high-dose intravenous immunoglobulins after allogeneic hematopoietic stem cell transplantation were abandoned by many teams in view of limited evidence and high costs. METHODS: This retrospective single-center study examines the impact of the change from nursing in 'live islands' to care in single rooms (SR), and from high-dose to targeted intravenous immunoglobulins (IVIG), on the mortality and infection rate of adult patients receiving an allogeneic stem cell or bone marrow transplantation, implemented in two steps across three time cohorts (1993-1997, 1997-2000, 2000-2003). RESULTS: Two hundred forty-eight allogeneic hematopoietic stem cell transplantations were performed in 227 patients. Patient characteristics were comparable in the three cohorts for gender, median age, underlying disease and disease stage, prophylaxis for graft-versus-host disease (GvHD), and cytomegalovirus constellation. The incidence of infections (78.4%) and infection rates remained stable (rates per 1000 days of neutropenia: 17.61 for sepsis, 6.76 for pneumonia). Cumulative incidence of GvHD and transplant-related mortality did not change over time. CONCLUSIONS: The change from nursing in 'live islands' to SR and the reduction from high-dose to targeted IVIG did not result in increased infection rates or mortality despite an increase in patient age. These results support the current practice.
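
The infection rates quoted above are event counts normalized to person-time at risk, here expressed per 1000 days of neutropenia. A minimal sketch of that normalization follows; the event and at-risk counts are hypothetical, and only the reported rates (17.61 for sepsis, 6.76 for pneumonia) come from the abstract.

    # Incidence rate per 1000 days of neutropenia:
    #   rate = events / days_at_risk * 1000
    # Counts below are hypothetical, for illustration only.

    def rate_per_1000_days(events, days_at_risk):
        return events / days_at_risk * 1000.0

    print(round(rate_per_1000_days(88, 5000), 2))  # hypothetical counts -> 17.6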

Relevance: 70.00%

Abstract:

BACKGROUND: The burden of enterococcal infections has increased over the last decades, with vancomycin-resistant enterococci (VRE) being a major health problem. Solid organ transplantation is considered a risk factor. However, little is known about the relevance of enterococci in solid organ transplant recipients in areas with a low VRE prevalence. METHODS: We examined the epidemiology of enterococcal events in patients followed in the Swiss Transplant Cohort Study between May 2008 and September 2011 and analyzed risk factors for infection, aminopenicillin resistance, treatment, and outcome. RESULTS: Of the 1234 patients, 255 (20.7%) suffered from 392 enterococcal events (185 [47.2%] infections, 205 [52.3%] colonizations, and 2 events with missing clinical information). Only 2 isolates were VRE. The highest infection rates were found early after liver transplantation (0.24/person-year), with Enterococcus faecium accounting for 58.6% of isolates. The highest colonization rates were documented in lung transplant recipients (0.33/person-year), with E. faecium accounting for 46.5%. Age, prophylaxis with a beta-lactam antibiotic, and liver transplantation were significantly associated with infection. Previous antibiotic treatment, intensive care unit stay, and lung transplantation were associated with aminopenicillin resistance. Only 4/205 (2%) colonization events led to an infection. Adequate treatment did not affect microbiological clearance rates. Overall mortality was 8%; no deaths were attributable to enterococcal events. CONCLUSIONS: Enterococcal colonizations and infections are frequent in transplant recipients. Progression from colonization to infection is rare; therefore, antibiotic treatment should be used restrictively in colonization. No increased mortality due to enterococcal infection was noted.

Relevance: 70.00%

Abstract:

BACKGROUND: Bolt-kit systems are increasingly used as an alternative to conventional external cerebrospinal fluid (CSF) drainage systems. Since 2009 we have regularly utilized bolt-kit external ventricular drainage (EVD) systems with silver-bearing catheters, inserted manually with a hand drill and skull screws, for emergency ventriculostomy. For non-emergency situations, we use conventional ventriculostomy with subcutaneously tunneled silver-bearing catheters, performed in the operating room with a pneumatic drill. This retrospective analysis compared the two techniques in terms of infection rates. METHODS: 152 patients (aged 17-85 years, mean = 55.4 years) were included in the final analysis; 95 received bolt-kit silver-bearing catheters and 57 received conventionally implanted silver-bearing catheters. The primary endpoint combined infection parameters: occurrence of a positive CSF culture, colonization of the catheter tip, or elevated CSF white blood cell counts (>4/μl). Secondary outcome parameters were the presence of microorganisms in CSF or on catheter tips. The incidence of increased CSF cell counts and the number of patients with catheter malposition were also compared. RESULTS: The primary outcome, the analysis of combined infection parameters (occurrence of either a positive CSF culture, colonization of the catheter tip, or a raised CSF white blood cell count >4/μl), was not significantly different between the groups (58.9% in the bolt-kit group vs. 63.2% in the conventionally implanted group, p = 0.61, chi-square test). The bolt-kit group was non-inferior and not superior to the conventional group (relative risk reduction of 6.7%; 90% confidence interval: -19.9% to 25.6%). Secondary outcomes showed no statistically significant difference in the incidence of microorganisms in CSF (2.1% bolt-kit vs. 5.3% conventionally implanted; p = 0.30; chi-square test). CONCLUSIONS: This analysis indicates that silver-bearing EVD catheters implanted with a bolt-kit system outside the operating room do not significantly elevate the risk of CSF infection compared with conventional implantation methods.
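
The primary-endpoint comparison above is a chi-square test on a 2x2 table. The sketch below reconstructs that table from the reported percentages (58.9% of 95 bolt-kit patients ≈ 56, 63.2% of 57 conventional patients ≈ 36); these counts are inferred rather than taken from the paper, and the choice to omit the continuity correction is an assumption made so the output lands near the reported p = 0.61.

    # Chi-square test on the combined infection endpoint (counts inferred
    # from the reported percentages, not taken from the paper's tables).
    from scipy.stats import chi2_contingency

    table = [[56, 95 - 56],   # bolt-kit group: endpoint met, not met
             [36, 57 - 36]]   # conventional group: endpoint met, not met
    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    print(round(p, 2))  # close to the reported p = 0.61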

Relevance: 70.00%

Abstract:

In developing countries, infection and malnutrition, and their interaction effects, account for the majority of childhood deaths and chronic deficits in growth and development. To promote child health, the causal determinants of infection and malnutrition and cost-effective interventions must be identified. To this end, medical examinations of 988 children (age two weeks to 14 years) living at three altitudes (coastal, < 300 m; sierra, approximately 3,000 m; and altiplano, > 4,000 m) in Chile's northernmost Department of Arica revealed that 393 (40%) of the youngsters harbored one or more infections. When sorted by region and ethnicity, indigenous children of the highlands had infection rates 50% higher than children of Spanish descent living near the coast. An ecological model was developed and used to examine the causal path of infection and measure the effect of single and combined environmental variables. Family variables significantly linked to child health included maternal health, age and education. Significant child determinants of infection included the child's nutrient intake and medical history. Compared to children who were well and free of disease, infected youngsters reported a higher incidence of recent illness and a lower intake of basic foodstuffs. Traditional measures of child health, e.g. birth condition, weaning history, maternal fertility, and family wealth, did not differentiate between well and infected children. When height, weight, arm circumference, and subscapular skinfold measurements were compared, infected children, regardless of age, had smaller arm circumferences, the statistical difference being greatest for males aged nine to eleven. Height and weight, the traditional growth indices, did not differentiate between well and infected groups. Infection is not determined by a single environmental factor or even a series of variables. Child health is ecological in nature and cannot be improved independent of changes in the environment that surrounds the child. To focus on selected child health needs, such as feeding programs or immunization campaigns, without simultaneously attending to the environment from which the needs arose is an inappropriate use of time, personnel, and money.

Relevance: 70.00%

Abstract:

Ocean acidification, caused by increased atmospheric carbon dioxide (CO2) concentrations, is currently an important environmental problem. It is therefore necessary to investigate the effects of ocean acidification on all life stages of a wide range of marine organisms. However, few studies have examined the effects of increased CO2 on early life stages of organisms, including corals. Using a range of pH values (pH 7.3, 7.6, and 8.0) in manipulative duplicate aquarium experiments, we have evaluated the effects of increased CO2 on early life stages (larval and polyp stages) of Acropora spp. with the aim of estimating CO2 tolerance thresholds at these stages. Larval survival rates did not differ significantly between the reduced pH and control conditions. In contrast, polyp growth and algal infection rates were significantly decreased at reduced pH levels compared to control conditions. These results suggest that future ocean acidification may lead to reduced primary polyp growth and delayed establishment of symbiosis. Stress exposure experiments using longer experimental time scales and lower levels of CO2 concentrations than those used in this study are needed to establish the threshold of CO2 emissions required to sustain coral reef ecosystems.

Relevance: 70.00%

Abstract:

Adaptive immunity in vertebrates can confer increased resistance against invading pathogens upon re-infection. But how specific parasite genotypes affect the transition from innate to adaptive immunity is poorly understood. Here, we investigated the effects of homologous and heterologous exposures of genetically distinct parasite lineages of the eye fluke Diplostomum pseudospathaceum on gene expression patterns of adaptive immunity in sticklebacks (Gasterosteus aculeatus). We showed that observable differences were largely attributable to final exposures and that there is no transcription pattern characteristic for a general response to repeated infections with D. pseudospathaceum. Final exposure did not unify expression patterns of heterologous pre-exposed fish. Interestingly, heterologous final exposures showed similarities between different treatment groups subjected to homologous pre-exposure. The observed pattern was supported by parasite infection rates and suggests that host immunization was optimized towards an adaptive immune response that favored effectiveness against parasite diversity over specificity.

Relevance: 70.00%

Abstract:

Schistosomiasis japonica is a zoonosis of major public health importance in southern China. We undertook a drug intervention to test the hypothesis that buffalo are major reservoirs for human infection in the marshland/lake areas, where one million people are infected. We compared human and buffalo infection rates and intensity in an intervention village (Jishan), where humans and buffalo were treated with praziquantel, and a control village (Hexi), where only humans were treated, in the Poyang Lake region. Over the four-year study, human incidence decreased in Jishan but increased in Hexi. Adjustment of incidence by age, sex, water exposure, year, and village further confirmed the decreased human infection in Jishan. Chemotherapy for buffaloes resulted in a decrease in buffalo infection rates in Jishan, which coincided with the reduction in human infection rates there in the last two years of the study. Mathematical modeling predicted that buffalo are responsible for 75% of human transmission in Jishan. Copyright © 2006 by The American Society of Tropical Medicine and Hygiene.

Relevance: 70.00%

Abstract:

We present a data-based statistical study of the effects of seasonal variations on the growth rates of gastro-intestinal (GI) parasitic infection in livestock. This growth rate is estimated through the variation in the number of eggs per gram (EPG) of faeces in animals. In accordance with earlier studies, our analysis also shows that rainfall is the dominant variable in determining EPG infection rates compared with other macro-parameters such as temperature and humidity. Our statistical analysis clearly indicates an oscillatory dependence of EPG levels on rainfall fluctuations. The monsoon recorded the highest infection, a comparative increase of at least 2.5 times over the next most infected period (summer). A least-squares fit of the EPG versus rainfall data indicates an approach towards a super-diffusive infection growth pattern (i.e., root-mean-square displacement growing faster than the square root of elapsed time, the scaling obtained for simple diffusion) in low-rainfall regimes (technically defined as zeroth-level dependence), which becomes remarkably augmented in large-rainfall zones. Our analysis further indicates that for low fluctuations in temperature (true of the bulk data), the EPG level saturates beyond a critical value of rainfall, a threshold that is expected to indicate the onset of the nonlinear regime. The probability density functions (PDFs) of the EPG data show oscillatory behavior in the large-rainfall regime (greater than 500 mm), the frequency of oscillation again being determined by ambient wetness (rainfall and humidity). Data recorded over three pilot projects spanning three measures of rainfall and humidity bear testimony to the universality of this statistical argument. © 2013 Chattopadhyay and Bandyopadhyay.
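
The super-diffusive regime mentioned above corresponds to a scaling exponent larger than 0.5 in a least-squares fit on log-transformed data. The sketch below shows that kind of fit on a synthetic series; it illustrates the fitting procedure only and uses neither the authors' dataset nor their code.

    # Estimate a growth exponent by least-squares on log-transformed data.
    # Synthetic series only; an exponent above 0.5 corresponds to the
    # super-diffusive regime (RMS growth faster than sqrt of elapsed time).
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(1, 100, 50)                         # elapsed time (arbitrary units)
    epg = t ** 0.7 * rng.lognormal(0.0, 0.1, t.size)    # synthetic EPG-like series

    slope, intercept = np.polyfit(np.log(t), np.log(epg), 1)
    print(round(slope, 2))  # fitted exponent, close to the 0.7 used to generate the data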

Relevance: 70.00%

Abstract:

RATIONALE: Limitations in methods for the rapid diagnosis of hospital-acquired infections often delay initiation of effective antimicrobial therapy. New diagnostic approaches offer potential clinical and cost-related improvements in the management of these infections. OBJECTIVES: We developed a decision modeling framework to assess the potential cost-effectiveness of a rapid biomarker assay to identify hospital-acquired infection in high-risk patients earlier than standard diagnostic testing. METHODS: The framework includes parameters representing rates of infection, rates of delayed appropriate therapy, and impact of delayed therapy on mortality, along with assumptions about diagnostic test characteristics and their impact on delayed therapy and length of stay. Parameter estimates were based on contemporary, published studies and supplemented with data from a four-site, observational, clinical study. Extensive sensitivity analyses were performed. The base-case analysis assumed 17.6% of ventilated patients and 11.2% of nonventilated patients develop hospital-acquired infection and that 28.7% of patients with hospital-acquired infection experience delays in appropriate antibiotic therapy with standard care. We assumed this percentage decreased by 50% (to 14.4%) among patients with true-positive results and increased by 50% (to 43.1%) among patients with false-negative results using a hypothetical biomarker assay. Cost of testing was set at $110/d. MEASUREMENTS AND MAIN RESULTS: In the base-case analysis, among ventilated patients, daily diagnostic testing starting on admission reduced inpatient mortality from 12.3 to 11.9% and increased mean costs by $1,640 per patient, resulting in an incremental cost-effectiveness ratio of $21,389 per life-year saved. Among nonventilated patients, inpatient mortality decreased from 7.3 to 7.1% and costs increased by $1,381 with diagnostic testing. The resulting incremental cost-effectiveness ratio was $42,325 per life-year saved. Threshold analyses revealed the probabilities of developing hospital-acquired infection in ventilated and nonventilated patients could be as low as 8.4 and 9.8%, respectively, to maintain incremental cost-effectiveness ratios less than $50,000 per life-year saved. CONCLUSIONS: Development and use of serial diagnostic testing that reduces the proportion of patients with delays in appropriate antibiotic therapy for hospital-acquired infections could reduce inpatient mortality. The model presented here offers a cost-effectiveness framework for future test development.
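
An incremental cost-effectiveness ratio is the incremental cost divided by the incremental effectiveness (here, life-years saved). The short sketch below back-calculates the implied life-year gain per ventilated patient from the two figures quoted above; the derived value is illustrative arithmetic, not a number reported by the study.

    # ICER = incremental cost / incremental life-years saved.
    # The $1,640 incremental cost and $21,389 per life-year ICER are quoted
    # in the abstract for ventilated patients; the implied life-year gain
    # per patient is derived here purely for illustration.
    incremental_cost = 1640.0                       # USD per patient
    icer = 21389.0                                  # USD per life-year saved
    implied_life_years = incremental_cost / icer
    print(round(implied_life_years, 3))             # about 0.077 life-years per patient
    print(round(incremental_cost / implied_life_years))  # recovers the ICER: 21389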