Abstract:
Tuberculosis has emerged as a major concern in patients with immune-mediated diseases, including psoriasis, undergoing treatment with biologics. However, it is not known whether the chronically activated immune system of psoriasis patients interferes with their Mycobacterium tuberculosis (Mtb)-specific immunity, especially in tuberculosis-endemic areas like Brazil. We evaluated T-cell responses to an Mtb lysate and to the recombinant Mtb proteins ESAT-6 and Ag85B in tuberculin skin test (TST)-positive and TST-negative patients with severe or mild/moderate, untreated psoriasis using three different assays: lymphocyte proliferation, enzyme immunoassay for interferon (IFN)-gamma and interleukin (IL)-10 production by peripheral blood mononuclear cells, and overnight enzyme-linked immunospot (ELISpot) for enumerating IFN-gamma-secreting cells. In our cohort, a low proportion (29%) of the severe psoriasis patients tested were TST-positive. IFN-gamma and IL-10 secretion and T-cell proliferation in response to Mtb antigens were reduced in TST-negative, but not TST-positive, patients with severe psoriasis when compared with healthy controls of the same TST status. Similarly, severe psoriasis patients had decreased cytokine secretion and proliferative responses to phytohemagglutinin. However, most psoriasis patients and healthy controls showed detectable numbers of IFN-gamma-secreting effector-memory T-cells in response to Mtb antigens by ELISpot. TST-negative, mild/moderate psoriasis patients had responses that were mostly intermediate between those of TST-negative controls and severe psoriasis patients. Thus, patients with severe psoriasis have decreased anti-Mtb central memory T-cell responses, which may lead to false-negative results in the diagnosis of TB infection, but retain effector-memory T-cell activity against Mtb antigens. We hypothesize that the latter may confer some protection against tuberculosis reactivation.
Abstract:
Systems approaches can help to evaluate and improve the agronomic and economic viability of nitrogen application in frequently water-limited environments. This requires a sound understanding of crop physiological processes and well-tested simulation models. This experiment on spring wheat therefore aimed to better quantify water × nitrogen effects on wheat by deriving key crop physiological parameters that have proven useful in simulating crop growth. For spring wheat grown in Northern Australia under four levels of nitrogen (0 to 360 kg N ha⁻¹) and either entirely on stored soil moisture or under full irrigation, kernel yields ranged from 343 to 719 g m⁻². Yield increases were strongly associated with increases in kernel number (9,150-19,950 kernels m⁻²), indicating the sensitivity of this parameter to water and N availability. Total water extraction under a rain shelter was 240 mm, with a maximum extraction depth of 1.5 m. A substantial amount of mineral nitrogen available deep in the profile (below 0.9 m) was taken up by the crop and was the source of the nitrogen uptake observed after anthesis. Under dry conditions this late uptake accounted for approximately 50% of total nitrogen uptake and resulted in high (>2%) kernel nitrogen percentages even when no nitrogen was applied. Anthesis LAI values were reduced by 63% under sub-optimal water supply and by 50% under sub-optimal nitrogen supply. Radiation use efficiency (RUE) based on total incident short-wave radiation was 1.34 g MJ⁻¹ and did not differ among treatments. The conservative nature of RUE resulted from the crop reducing leaf area rather than leaf nitrogen content (which would have affected photosynthetic activity) under these moderate levels of nitrogen limitation. The transpiration efficiency coefficient was also conservative, averaging 4.7 Pa in the dry treatments. Kernel nitrogen percentage varied from 2.08 to 2.42%. The study provides a data set and a basis for improving the simulation of water and nitrogen effects on spring wheat. (C) 1997 Elsevier Science B.V.
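As a minimal sketch of how a parameter like RUE enters the crop simulation models the abstract refers to, the code below computes daily biomass gain from radiation. Note that the paper expresses RUE per unit incident short-wave radiation; the intercepted-radiation variant, the extinction coefficient K, and the daily inputs are illustrative assumptions, not values from the study.

```python
import math

# Sketch of how an RUE of 1.34 g/MJ is used in a crop growth model.
# The abstract reports RUE per unit *incident* short-wave radiation;
# many models instead apply RUE to *intercepted* radiation, estimated
# from LAI via Beer's law. K and the inputs are assumed values.

RUE = 1.34  # g biomass per MJ (value from the abstract)
K = 0.45    # canopy extinction coefficient (assumption)

def biomass_gain_incident(radiation_mj_m2: float) -> float:
    """Daily biomass gain (g m^-2) on the paper's incident-radiation basis."""
    return RUE * radiation_mj_m2

def biomass_gain_intercepted(radiation_mj_m2: float, lai: float) -> float:
    """Variant using radiation intercepted by the canopy (Beer's law)."""
    intercepted = radiation_mj_m2 * (1.0 - math.exp(-K * lai))
    return RUE * intercepted

print(biomass_gain_incident(20.0))          # 26.8 g m^-2 on a 20 MJ day
print(biomass_gain_intercepted(20.0, 3.0))  # ~19.9 g m^-2 at LAI = 3
```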
Abstract:
Candidemia is associated with high morbidity and mortality, resulting in significant increases in the length of patients' hospitalization and in healthcare costs. Critically ill patients are at particular risk for candidemia because of their debilitated condition and frequent need for invasive procedures. The aim of this study was to characterize the incidence and epidemiology of candidemia in intensive care units (ICUs) of a large university-affiliated hospital over a seven-year period, together with the use of fluconazole and caspofungin. All cases of candidemia were identified by surveillance using the Centers for Disease Control and Prevention criteria. Demographic variables, use of antifungals (fluconazole and caspofungin), and patient outcomes were evaluated. The χ² test for linear trend was employed to evaluate the distribution of Candida spp. and the use of fluconazole and caspofungin, expressed as defined daily doses (DDD) per 1,000 patient-days, during the study period. One hundred and eight episodes of candidemia were identified. The overall incidence of candidemia (P=0.20) and the incidence of non-Candida albicans Candida infections (P=0.32) remained stable over the study period, ranging from 0.3-0.9 episodes per 1,000 catheter-days and 0.39-0.83 episodes per 1,000 patient-days. However, the use of fluconazole and caspofungin increased significantly (P<0.001). While there were no reports of fluconazole use for prophylaxis in 1999, its use for this purpose increased from 3% in 2000 to 7.0% in 2006 (P=0.07). C. albicans was the most frequent species isolated, and burns and cancer were the most frequent underlying conditions. The overall mortality was 76%. There was no difference between C. albicans and non-C. albicans Candida infections when the crude and 14-day mortality rates were compared. Our data demonstrate that C. albicans is still the most frequent species causing candidemia in our intensive care units. Our rates of candidemia are lower than those reported from the region and similar to those of American and European hospitals. Although the incidence of bloodstream infections (BSI) and candidemia remained stable, the use of fluconazole and caspofungin increased significantly over the study years but had no impact on the incidence of infections caused by non-C. albicans Candida species.
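For readers unfamiliar with the consumption metric used above, the sketch below shows how antifungal use is converted into DDD per 1,000 patient-days. The WHO DDD values and the example figures are assumptions for illustration, not data from the study.

```python
# Sketch: antifungal consumption as defined daily doses (DDD) per
# 1,000 patient-days. The WHO DDD values (fluconazole 0.2 g,
# caspofungin 50 mg) and the example numbers are assumptions here,
# not figures taken from the paper.

WHO_DDD_G = {"fluconazole": 0.2, "caspofungin": 0.05}

def ddd_per_1000_patient_days(drug: str, grams_dispensed: float,
                              patient_days: float) -> float:
    """Convert total grams dispensed into DDD per 1,000 patient-days."""
    ddd = grams_dispensed / WHO_DDD_G[drug]
    return ddd / patient_days * 1000.0

# Example: 600 g of fluconazole dispensed over 25,000 patient-days.
print(ddd_per_1000_patient_days("fluconazole", 600.0, 25_000.0))  # 120.0
```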
Abstract:
The Leishmune® vaccine has been used in endemic areas to prevent canine visceral leishmaniasis in Brazil, but the cytokine production induced by vaccination has rarely been investigated in dogs. This study aimed to evaluate the immune response of dogs vaccinated with the Leishmune FML vaccine (Fort Dodge) against total antigen of Leishmania (Leishmania) chagasi (TAg) and FML. Twenty healthy dogs from Araçatuba, São Paulo, Brazil, an endemic leishmaniasis area, received three consecutive subcutaneous injections of Leishmune vaccine at 21-day intervals. PBMC were isolated before and 10 days after completing vaccination, and the lymphoproliferative response and antibody production against FML or total promastigote antigen were tested. The cytokines IFN-gamma, IL-4 and TNF-alpha were measured in culture supernatants, and the frequencies of CD4+/CD25+ and CD8+/CD25+ T cells were determined. Analysis of the data indicated that the vaccine conferred humoral responses (100%) against both antigens and cellular immunity to FML (85%) and total antigen (80%); the supernatants of cultured cells stimulated with TAg and FML showed an increase in IFN-gamma (P < 0.05); and the vaccine reduced the frequency of CD4+/CD25+ T cells compared with that observed before vaccination. These responses may constitute part of the immune mechanism induced by Leishmune. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Objectives: To analyze mortality rates of children with severe sepsis and septic shock in relation to time-sensitive fluid resuscitation and treatments received, and to define barriers to the implementation of the American College of Critical Care Medicine/Pediatric Advanced Life Support guidelines in a pediatric intensive care unit in a developing country. Methods: Retrospective chart review and prospective analysis of septic shock treatment in the pediatric intensive care unit of a tertiary care teaching hospital. Ninety patients with severe sepsis or septic shock admitted between July 2002 and June 2003 were included in this study. Results: Of the 90 patients, 83% had septic shock and 17% had severe sepsis; 80 patients had preexisting severe chronic diseases. Patients with septic shock who received less than 20 mL/kg of resuscitation fluid in the first hour of treatment had a mortality rate of 73%, whereas patients who received more than 40 mL/kg in the first hour had a mortality rate of 33% (P < 0.05). Patients treated less than 30 minutes after diagnosis of severe sepsis or septic shock had a significantly lower mortality rate (40%) than patients treated more than 60 minutes after diagnosis (P < 0.05). Controlling for the risk of mortality, early fluid resuscitation was associated with a 3-fold reduction in the odds of death (odds ratio, 0.33; 95% confidence interval, 0.13-0.85). The most important barriers to adequate treatment of severe sepsis and septic shock were lack of adequate vascular access, failure to recognize early shock, shortage of health care providers, and nonuse of goals and treatment protocols. Conclusions: The mortality rate was higher for older children, for those who received less than 40 mL/kg in the first hour, and for those whose treatment was not initiated within the first 30 minutes after the diagnosis of septic shock. Acknowledging the existing barriers to timely fluid administration and establishing objectives to overcome them may lead to more successful implementation of the American College of Critical Care Medicine guidelines and reduced mortality rates for children with septic shock in the developing world.
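As a minimal sketch of how an odds ratio and Wald 95% confidence interval like those reported above (OR 0.33; 95% CI, 0.13-0.85) are computed from a 2x2 table, consider the following; the cell counts are hypothetical, not the study's actual data.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a, b = deaths / survivors with early fluids;
    c, d = deaths / survivors without early fluids."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts chosen only to illustrate the calculation.
print(odds_ratio_ci(12, 28, 30, 23))  # ~(0.33, 0.14, 0.78)
```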
Abstract:
The purpose of this study was to describe the reproductive profile and frequency of genital infections among women living in Serra Pelada, a former mining village in Pará state, Brazil. A descriptive study of women living in the Serra Pelada mining area was performed in 2004 through interviews that gathered demographic and clinical data and assessed the risk behaviors of 209 randomly selected women. Blood samples were collected for rapid HIV testing; specimens were taken for Pap smears and Gram stains. Standard descriptive statistical analyses were performed, and prevalence was calculated to reflect the relative frequency of each disease. Of the 209 participants, the median age was 38 years; almost 70% had less than four years of education, and 77% had no income or earned less than 1.9 times the Brazilian minimum wage. About 30% had not had access to health care services during the preceding year. Risk behaviors included: alcohol abuse, 24.4%; illicit drug abuse, 4.3%; sex work, 15.8%; and domestic violence, 17.7%. An abnormal Pap smear was found in 8.6%. Prevalence rates of infection were: HIV, 1.9%; trichomoniasis, 2.9%; bacterial vaginosis, 18.7%; candidiasis, 5.7%; Chlamydia-related cytological changes, 3.3%; and HPV-related cytological changes, 3.8%. Women living in this Brazilian mining area are economically and socially vulnerable to health problems, which points to the importance of concomitant broader strategies that reduce poverty and empower women to improve their own health.
Abstract:
Measures employed to control visceral leishmaniasis in Brazil have focused on vector control by residual insecticide spraying and on diagnosis of infection with elimination of positive dogs. We describe dog culling and replacement in a Brazilian endemic area (the Alvorada District, Araçatuba, SP) in order to better understand dog population dynamics when elimination of the dog reservoir is adopted as the main control measure. From August 2002 to July 2004, 60.9% of the estimated dog population of the area was culled, at a mean age of 34 months. The presence of anti-Leishmania sp. antibodies was recorded for only 26.7% of the euthanized dogs. Replacement was observed in 38.8% of cases, some by two or more dogs, after a mean interval of 4 months. Dogs were replaced mostly by puppies of both sexes with a mean age of 6.8 months. From August 2002 to April 2005 we were able to follow up 116 of these dogs, for a mean time of 8.7 months. Canine visceral leishmaniasis (CVL) seropositivity by ELISA was observed in 42.2% of the followed dogs, 30.6% of which were already positive at the first evaluation. By the end of the follow-up period, 37% of the dogs had been euthanized, at a mean age of 18.3 months. In the studied CVL-endemic area of Brazil, euthanasia and the subsequent replacement ratio were high, increasing dog population turnover and leading to a younger population that might be more susceptible to a variety of other infectious diseases in addition to CVL. Dog culling as a control strategy for VL should be reassessed. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Background. Chikungunya virus, an alphavirus of the family Togaviridae, causes a febrile disease transmitted to humans by the bite of infected Aedes mosquitoes. This infection is reaching endemic levels in many Southeast Asian countries. Symptoms include sudden onset of fever, chills, headache, nausea, vomiting, joint pain with or without swelling, low back pain, and rash. According to the World Health Organization, 2 billion people live in Aedes-infested areas; in addition, travel to these areas is popular, making the potential risk of infections transmitted by the bite of infected Aedes mosquitoes very high. Methods. We propose a mathematical model to estimate the risk of acquiring chikungunya fever in an Aedes-infested area by taking the prevalence of dengue fever into account. The basic reproduction number for chikungunya fever, R0chik, can be written as a function of the basic reproduction number of dengue, R0dengue, by calculating the ratio R0chik/R0dengue. From R0chik, we estimated the force of infection and the risk of acquiring the disease, both for local residents of a dengue-endemic area and for travelers to this area. Results. We calculated that R0chik is 64.4% of R0dengue. The model was applied to a hypothetical situation: estimating the individual risk of acquiring chikungunya fever in a dengue-endemic area, both for local inhabitants (22% at steady state) and for visiting travelers (from 0.31% to 1.23%, depending on the time spent in the area). Conclusions. The proposed method, based on the output of a dynamical model, is innovative and provides an estimate of the risk of infection for both local inhabitants and visiting travelers.
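As a minimal sketch of the final step described above: once a force of infection λ for chikungunya has been derived from R0chik, a visitor's risk over a stay of t days follows the constant-hazard survival formula. The λ value used here is an illustrative assumption, not the paper's estimate.

```python
import math

# Sketch of the traveler-risk step: under a constant force of
# infection lambda (per day), the probability of infection during a
# stay of t days is P = 1 - exp(-lambda * t). The lambda below is a
# made-up illustrative value, not the value estimated in the paper.

def travel_risk(force_of_infection_per_day: float, days: float) -> float:
    """P(infection) = 1 - exp(-lambda * t) under a constant hazard."""
    return 1.0 - math.exp(-force_of_infection_per_day * days)

lam = 3.4e-4  # assumed per-day force of infection (illustration only)
for stay in (10, 30, 90):
    print(f"{stay}-day stay: {100 * travel_risk(lam, stay):.2f}% risk")
```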
Abstract:
Although dogs are considered the main domestic reservoirs of visceral leishmaniosis (VL), which is caused in the Americas by Leishmania chagasi, infected cats have also recently been found in endemic areas of several countries and have become a public health concern. Accordingly, the purpose of this study was to evaluate cats with dermatologic lesions from a VL-endemic area for natural infection with L. chagasi. A total of 55 cats were selected between April 2008 and November 2009 from two major animal shelters of Araçatuba, Southeastern Brazil. All cats underwent general and dermatologic examinations, followed by direct parasitological examination of lymphoid organs, enzyme-linked immunosorbent assay (ELISA) and indirect immunofluorescence antibody test (IFAT). In addition, detection of amastigotes was performed by immunohistochemistry (IHC) in the skin lesions of all cats. VL was diagnosed in 27/55 (49.1%) cats with dermatological problems. Amastigotes were found in the lymphoid organs of 10/27 (37.0%) cats; serology of 14/27 (51.9%), 6/27 (22.2%) and 5/27 (18.5%) cats was positive by ELISA, IFAT and both, respectively. IHC identified 9/27 (33.3%) cats; 5/27 (18.5%) were positive only by IHC, which therefore increased the overall sensitivity. Specific FIV antibodies were found in 6/55 (10.9%) cats, of which 5/6 (83.3%) had leishmaniosis. Real-time PCR followed by amplicon sequencing successfully confirmed L. chagasi infection. In conclusion, dermatological lesions in cats from endemic areas were highly associated with visceral leishmaniosis; therefore, skin IHC and differential diagnosis of VL should always be conducted for dermatological patients in such areas. (c) 2011 Elsevier B.V. All rights reserved.
Abstract:
Premature birth is a well-known risk factor for sensorineural hearing loss in general and auditory neuropathy in particular. However, relatively little is known about the underlying causes, in part because there are so few relevant histopathological studies. Here, we report on the analysis of hair cell loss patterns in 54 temporal bones from premature infants and a control group of 46 bones from full-term infants, all of whom spent time in the neonatal intensive care unit at the Hospital de Niños in San José, Costa Rica, between 1977 and 1993. The prevalence of significant hair cell loss was higher in the preterm group than in the full-term group (41% vs. 28%, respectively). The most striking finding was the frequency of selective inner hair cell loss, an extremely rare histopathological pattern, in the preterm vs. the full-term babies (27% vs. 3%, respectively). The findings suggest that a common cause of non-genetic auditory neuropathy is selective loss of inner hair cells rather than primary damage to the cochlear nerve.
Abstract:
The aim of this study was to investigate changes in the clinical pattern and therapeutic measures in leptospirosis-associated acute kidney injury, in a retrospective study of 318 patients in Brazil. Patients were divided according to the time of admission: 1985-1996 (group I) and 1997-2010 (group II). Patients were younger in group I (36 ± 13 versus 41 ± 16 years, P = 0.005), and the frequency of oliguria increased (21% in group I versus 41% in group II, P = 0.014). A higher frequency of lung manifestations was observed in group II (P < 0.0001). Despite the increased severity, there was a significant reduction in mortality (20% in group I versus 12% in group II, P = 0.03). Mortality was associated with advanced age, low diastolic blood pressure, oliguria, arrhythmia, and peritoneal dialysis, and there was a trend toward lower mortality with penicillin administration. Leptospirosis is now occurring in an older population, with more frequent oliguria and lung manifestations. However, mortality is decreasing, which may be the result of changes in treatment.
Abstract:
PURPOSE. To evaluate and compare rates of change in neuroretinal rim area (RA) and retinal nerve fiber layer thickness (RNFLT) measurements in glaucoma patients, those with suspected glaucoma, and normal subjects observed over time. METHODS. In this observational cohort study, patients recruited from two longitudinal studies (Diagnostic Innovations in Glaucoma Study, DIGS, and African Descent and Glaucoma Evaluation Study, ADAGES) were observed with standard achromatic perimetry (SAP), optic disc stereophotographs, confocal scanning laser ophthalmoscopy (HRT-3; Heidelberg Engineering, Heidelberg, Germany), and scanning laser polarimetry (GDx-VCC; Carl Zeiss Meditec, Inc., Dublin, CA). Glaucoma progression was determined by the Guided Progression Analysis software for SAP and by masked assessment of serial optic disc stereophotographs by expert graders. Random-coefficients models were used to evaluate rates of change in average RNFLT and global RA measurements and their relationship with glaucoma progression. RESULTS. At baseline, 194 (31%) eyes were glaucomatous, 347 (55%) had suspected glaucoma, and 88 (14%) were normal. Forty-six (9%) eyes showed progression by SAP and/or stereophotographs during an average follow-up of 3.3 (±0.7) years. The average rate of decline in RNFLT measurements was significantly higher in the progressing group than in the non-progressing group (-0.65 vs. -0.11 µm/y, respectively; P < 0.001), whereas the rate of RA decline did not differ significantly between these groups (-0.0058 vs. -0.0073 mm²/y, respectively; P = 0.727). The areas under the receiver operating characteristic (ROC) curves used to discriminate progressing from non-progressing eyes were 0.811 and 0.507 for the rates of change in RNFLT and RA, respectively (P < 0.001). CONCLUSIONS. The ability to discriminate eyes with progressing glaucoma by SAP and/or stereophotographs from stable eyes was significantly greater for RNFLT than for RA measurements. (Invest Ophthalmol Vis Sci. 2010;51:3531-3539) DOI: 10.1167/iovs.09-4350
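As a minimal sketch of how an AUC like those reported above (0.811 for RNFLT slopes vs. 0.507 for RA slopes) is obtained once a per-eye rate of change has been estimated, consider the following; the slope distributions and counts are synthetic stand-ins, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Sketch: discriminating progressing from non-progressing eyes using
# per-eye rates of change (slopes). The synthetic slopes below are
# drawn around the group means reported in the abstract; the spread
# (0.4 um/y) is an assumption, so the resulting AUC is illustrative.
rng = np.random.default_rng(0)

progressing = rng.normal(-0.65, 0.4, 46)   # RNFLT slopes, um/y
stable = rng.normal(-0.11, 0.4, 583)       # 629 eyes - 46 progressing

slopes = np.concatenate([progressing, stable])
labels = np.concatenate([np.ones(46), np.zeros(583)])

# More negative slope = more evidence of progression, so score = -slope.
print(roc_auc_score(labels, -slopes))
```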