113 results for intensive agricultural
Abstract:
OBJECTIVES: Although endogenous nitric oxide (NO) is an excitatory mediator in the central nervous system, inhaled NO is not considered to cause neurologic side effects because of its short half-life. This study was motivated by a recent case report of neurologic symptoms and our own observation of severe electroencephalogram (EEG) abnormalities during NO inhalation. DESIGN: Blind, retrospective analysis of EEGs recorded before, during, and after NO inhalation. EEGs were classified on a 5-point rating scale by an independent electroencephalographer who was blinded to the patients' clinical histories. Comparisons were made with the previous evaluation documented at recording. Other EEG-influencing parameters, such as oxygen saturation, hemodynamics, electrolytes, and pH, were evaluated. SETTING: Pediatric intensive care unit of a tertiary care university children's hospital. PATIENTS: Eleven ventilated, long-term paralyzed, sedated children (1 mo to 14 yrs) who had an EEG or clinical assessment before NO treatment and an EEG during NO inhalation. They were divided into two groups according to the indication for NO (e.g., congenital heart defect, acute respiratory distress syndrome). MEASUREMENTS AND MAIN RESULTS: All 11 patients had an abnormal EEG during NO inhalation. Control EEGs recorded without NO showed remarkable improvement. EEG abnormalities comprised background slowing, low voltage, burst suppression (n = 2), and sharp waves (n = 2), independent of patient age, NO indication, and other EEG-influencing parameters. CONCLUSIONS: These preliminary data suggest that EEG abnormalities occur after administration of inhaled NO in critically ill children. We found no correlation with other potential EEG-influencing parameters, although clinical state, medication, or hypoxemia might contribute. Comprehensive, prospective clinical assessment of a causal relationship between NO inhalation and EEG abnormalities, and of their clinical importance, is needed.
Abstract:
Natural fluctuations in soil microbial communities are poorly documented because of the inherent difficulty of analysing the relative abundances of multiple populations simultaneously over a long time period. Yet it is important to understand the magnitude of community-composition variability as a function of natural influences (e.g., temperature, plant growth, or rainfall), because this forms the reference or baseline against which external disturbances (e.g., anthropogenic emissions) can be judged. In addition, defining baseline fluctuations in complex microbial communities may help to understand at which point such systems become unbalanced and cannot return to their original composition. In this paper, we examined the seasonal fluctuations in the bacterial community of an agricultural soil used for regular crop production by terminal restriction fragment length polymorphism (T-RFLP) profiling of the amplified 16S ribosomal ribonucleic acid (rRNA) gene diversity. Cluster and statistical analyses of the T-RFLP data showed that soil bacterial communities fluctuated very little across the seasons (similarity indices between 0.835 and 0.997), with insignificant variations in 16S rRNA gene richness and diversity indices. Despite these overall insignificant fluctuations, between 8 and 30% of all terminal restriction fragments changed their relative intensity significantly between consecutive time samples. To determine the magnitude of community variations induced by external factors, soil samples were subjected to inoculation with a pure bacterial culture, addition of the herbicide mecoprop, or addition of nutrients. All treatments resulted in statistically measurable changes in the T-RFLP profiles of the communities. Addition of nutrients, or of bacteria plus mecoprop, resulted in a community composition that did not return to the original profile within 14 days. We propose that below 70% similarity in T-RFLP profiles, bacterial communities risk drifting apart into inherently different states.
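The abstract reports pairwise similarity indices between seasonal T-RFLP profiles and proposes a 70% similarity threshold, but does not name the index used. As a minimal sketch, assuming profiles of relative T-RF peak intensities and a Bray-Curtis-type similarity (the index choice, fragment sizes, and values below are all illustrative assumptions), such a comparison might look like this:

```python
# Minimal sketch: pairwise similarity between two T-RFLP profiles.
# Each profile maps terminal restriction fragment (T-RF) size to relative peak
# intensity. The Bray-Curtis-type index is an assumption; the abstract does not
# specify which similarity index was used.

def bray_curtis_similarity(profile_a: dict[int, float], profile_b: dict[int, float]) -> float:
    """Return 1 - Bray-Curtis dissimilarity for two relative-intensity profiles."""
    fragments = set(profile_a) | set(profile_b)
    shared = sum(min(profile_a.get(f, 0.0), profile_b.get(f, 0.0)) for f in fragments)
    total = sum(profile_a.values()) + sum(profile_b.values())
    return 2.0 * shared / total if total else 0.0

# Hypothetical spring and summer profiles (T-RF size -> relative intensity).
spring = {68: 0.22, 145: 0.35, 210: 0.28, 490: 0.15}
summer = {68: 0.20, 145: 0.36, 210: 0.27, 490: 0.17}

s = bray_curtis_similarity(spring, summer)
print(f"similarity = {s:.3f}")
# The study's seasonal samples stayed between 0.835 and 0.997; the proposed
# threshold below which communities risk drifting apart is 0.70.
print("below 70% threshold" if s < 0.70 else "within baseline fluctuation")
```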
Abstract:
Habitat loss and fragmentation due to land-use change are major threats to biodiversity in forest ecosystems, and they are expected to have important impacts on many taxa and at various spatial scales. Species richness-area relationships (SARs) have been used to assess species diversity patterns and their drivers, and thereby to inform conservation and management strategies. Here we propose a hierarchical approach to gain deeper insight into SARs in small forest islets within intensive farmland and to address the impacts of decreasing naturalness on such relationships. In the intensive dairy landscapes of Northwest Portugal, where small forest stands (dominated by pines, eucalypts, or both) represent semi-natural habitat islands, 50 small forest stands were selected and surveyed for vascular plant diversity. A hierarchical analytical framework was devised to determine species richness and inter- and intra-patch SARs for the whole set of forest patches (general patterns) and for each forest type (specific patterns). Differences in SARs for distinct groups were also tested by considering subsets of species (native, alien, woody, and herbaceous). Overall, species richness differed between forest patches exhibiting different levels of naturalness: plant diversity was higher in pine stands, whereas alien species richness was higher in eucalypt stands. Total forest patch area (inter-patch SAR) had no significant impact on species richness for any of the targeted species groups. However, significant intra-patch SARs were obtained for all species groups and forest types. The hierarchical approach was thus successfully applied to scrutinise SARs along a gradient of forest naturalness in intensively managed landscapes. The dominant canopy tree and management intensity affected distinct species groups differently and compensated for increasing stand area, buffering SARs among patches but not within them. The maintenance of small semi-natural patches dominated by pines, under extensive forest management practices, will therefore promote native plant diversity while also helping to limit the expansion of problematic invasive alien species.
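The intra-patch SARs tested here are commonly described by the power-law form S = cA^z, fitted as a linear regression on log-transformed area and richness. The sketch below illustrates that generic fit on hypothetical plot data; the model form, areas, and richness values are assumptions for illustration, not the study's analysis:

```python
# Minimal sketch: fitting a power-law species-area relationship S = c * A**z
# by ordinary least squares on log-transformed values. The data below are
# hypothetical plot-level areas and species counts, not the study's values.
import numpy as np

area_m2 = np.array([10, 25, 50, 100, 250, 500, 1000], dtype=float)   # sampling areas
richness = np.array([8, 12, 15, 19, 26, 31, 38], dtype=float)        # species counts

log_a, log_s = np.log(area_m2), np.log(richness)
z, log_c = np.polyfit(log_a, log_s, 1)          # slope z, intercept log(c)

print(f"S = {np.exp(log_c):.2f} * A^{z:.3f}")
# A significant positive z at the plot level would correspond to the intra-patch
# SARs reported above, while a flat slope across whole-patch areas would match
# the non-significant inter-patch SAR.
```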
Abstract:
PURPOSE: The recent increase in drug-resistant micro-organisms complicates the management of hospital-acquired bloodstream infections (HA-BSIs). We investigated the epidemiology of HA-BSI and evaluated the impact of drug resistance on outcomes of critically ill patients, controlling for patient characteristics and infection management. METHODS: A prospective, multicentre, non-representative cohort study was conducted in 162 intensive care units (ICUs) in 24 countries. RESULTS: We included 1,156 patients [mean ± standard deviation (SD) age, 59.5 ± 17.7 years; 65 % males; mean ± SD Simplified Acute Physiology Score (SAPS) II, 50 ± 17] with HA-BSIs, 76 % of which were ICU-acquired. Median time to diagnosis was 14 [interquartile range (IQR), 7-26] days after hospital admission. Polymicrobial infections accounted for 12 % of cases. Among monomicrobial infections, 58.3 % were gram-negative, 32.8 % gram-positive, 7.8 % fungal and 1.2 % due to strict anaerobes. Overall, 629 (47.8 %) isolates were multidrug-resistant (MDR), including 270 (20.5 %) extensively resistant (XDR) and 5 (0.4 %) pan-drug-resistant (PDR). Micro-organism distribution and MDR occurrence varied significantly (p < 0.001) by country. The 28-day all-cause fatality rate was 36 %. In the multivariable model including micro-organism, patient and centre variables, independent predictors of 28-day mortality included an MDR isolate [odds ratio (OR), 1.49; 95 % confidence interval (95 %CI), 1.07-2.06], uncontrolled infection source (OR, 5.86; 95 %CI, 2.5-13.9) and time to adequate treatment (adequate treatment before day 6 after blood culture collection versus never: OR, 0.38; 95 %CI, 0.23-0.63; adequate treatment from day 6 onwards versus never: OR, 0.20; 95 %CI, 0.08-0.47). CONCLUSIONS: MDR and XDR bacteria (especially gram-negative) are common in HA-BSIs in critically ill patients and are associated with increased 28-day mortality. Intensified efforts to prevent HA-BSIs and to optimize their management through adequate source control and antibiotic therapy are needed to improve outcomes.
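The adjusted odds ratios quoted above come from a multivariable model of 28-day mortality. As a hedged illustration of how such an OR and its 95 % CI are typically obtained, the sketch below fits a logistic regression on synthetic data with placeholder covariates (an MDR indicator and a SAPS II-like score); it is not the study's model or data:

```python
# Minimal sketch: adjusted odds ratio with 95% CI for a binary predictor
# (e.g. an MDR isolate) from a logistic regression on 28-day mortality.
# The data are synthetic and the covariates are placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1156
mdr = rng.integers(0, 2, n)                       # 1 = multidrug-resistant isolate
saps2 = rng.normal(50, 17, n)                     # severity-score covariate
logit_p = -2.0 + 0.4 * mdr + 0.03 * (saps2 - 50)  # assumed true effects
died = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([mdr, saps2]))
fit = sm.Logit(died, X).fit(disp=0)

or_mdr = np.exp(fit.params[1])                    # coefficient -> odds ratio
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"adjusted OR for MDR: {or_mdr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```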
Abstract:
INTRODUCTION. Patients admitted to the intensive care unit (ICU) from general wards are more severely ill and have a higher mortality than those admitted from the emergency department, as previously reported [1]. The majority of them develop signs of instability (e.g. tachypnea, tachycardia, hypotension, decreased oxygen saturation and change in conscious state) several hours before ICU admission. Considering this, and that in-hospital cardiac arrests and unexpected deaths are usually preceded by warning signs, immediate on-site intervention by specialists may be effective. This gave an impulse to the implementation of medical emergency teams (METs), which have been shown to decrease cardiac arrest, morbidity and mortality in several hospitals. OBJECTIVES AND METHODS. To verify whether the same was true in our hospital and to determine whether there was a need for a MET, we prospectively collected all non-elective ICU admissions of already hospitalized patients (general wards) and of patients remaining more than 3 h in the emergency department (considered hospitalized). The instability criteria leading to a MET call corresponded to those described in the literature. The delay between the development of the first criterion and ICU admission was registered. RESULTS. During an observation period of 12 months, 321 patients meeting our MET criteria were admitted to the ICU: 88 came from the emergency department, 115 from the surgical ward and 113 from the medical ward. 65% were male, and the median age was 65 years (range 17-89). The delay from MET-criteria development to ICU admission was longer than 8 h in 155 patients, with a median delay of 32 h and a range of 8.4 h to 10 days. For the remaining 166 patients, an early MET criterion was present up to 8 h (median delay 3 h) before ICU admission. These results are concordant with the data reported in the literature (ref 1-8). 122 patients presented with signs of sepsis or septic shock, 70 with respiratory failure and 58 with a cardiac emergency. Cardiac arrest accounted for 5% of our cohort. CONCLUSIONS. Similar to other observations, the majority of patients admitted to our ICU on an emergency basis from hospital wards had warning signs lasting for several hours, and more than half of them were unstable for more than 8 h. This shows there is ample time for early acute management by a dedicated, specialized team such as a MET. However, further studies are required to determine whether MET implementation can reduce in-hospital cardiac arrests and influence morbidity, length of stay and mortality.
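The key quantity in this audit is the delay between the first MET instability criterion and ICU admission, summarised as a median and as the share of patients delayed more than 8 h. A minimal sketch of that computation on hypothetical timestamps is shown below; the registry fields and values are assumptions:

```python
# Minimal sketch: delay from first MET instability criterion to ICU admission,
# reported as the median in hours and the share of patients waiting > 8 h.
from datetime import datetime
from statistics import median

admissions = [
    # (first MET criterion met, ICU admission) - hypothetical timestamps
    (datetime(2010, 3, 1, 14, 0), datetime(2010, 3, 1, 17, 30)),
    (datetime(2010, 3, 2, 6, 15), datetime(2010, 3, 3, 20, 0)),
    (datetime(2010, 3, 5, 22, 0), datetime(2010, 3, 6, 4, 45)),
]

delays_h = [(icu - crit).total_seconds() / 3600 for crit, icu in admissions]
over_8h = sum(d > 8 for d in delays_h)

print(f"median delay: {median(delays_h):.1f} h")
print(f"delayed > 8 h: {over_8h}/{len(delays_h)} patients")
```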
Abstract:
The prevalence of delirium in the intensive care unit (ICU) is reported to vary from 20 to 80%. Delirium in the ICU is not only a frightening experience for the patient and his or her family but also a challenge for the nurses and physicians caring for the patient. Furthermore, it is associated with worse outcome, prolonged hospitalisation, increased costs, long-term cognitive impairment and higher mortality rates. Strategies to prevent ICU delirium, in addition to its early diagnosis and treatment, are therefore important. The pathophysiology of delirium is still incompletely understood, but numerous risk factors for its development have been identified in ICU patients, among them potentially modifiable factors such as metabolic disturbances, hypotension, anaemia, fever and infection. Key measures are the prevention and management of common risk factors, including avoiding overzealous sedation and analgesia and creating an environment that enhances reintegration. Once delirium is diagnosed, treatment consists of typical and atypical antipsychotics. Haloperidol is still the drug of choice for the treatment of delirium and can be given intravenously in incremental doses of 1, 2, 5 (up to 10) mg every 15-20 minutes.
Abstract:
BACKGROUND: Antipyresis is a common clinical practice in intensive care, although it is unknown whether fever is harmful, beneficial, or a negligible adverse effect of infection and inflammation. METHODS: In a randomized study, rectal temperature and discomfort were assessed in 38 surgical intensive care unit patients without neurotrauma or severe hypoxemia who had fever (temperature ≥38.5°C) and systemic inflammatory response syndrome. Eighteen patients received external cooling while 20 received no antipyretic treatment. RESULTS: Temperature and discomfort decreased similarly in both groups after 24 hours. No significant differences in recurrence of fever, incidence of infection, antibiotic therapy, intensive care unit and hospital length of stay, or mortality were noted between the groups. CONCLUSIONS: These results suggest that the systematic suppression of fever may not be useful in patients without severe cranial trauma or significant hypoxemia. Letting fever take its natural course does not seem to harm patients with systemic inflammatory response syndrome or influence their discomfort level, and it may save costs.
Abstract:
An increasing number of terminally ill patients are admitted to the intensive care unit, and decisions to limit treatment or to provide palliative care are made to avoid medical futility. The principle of autonomy holds that the patient (or, if necessary, his or her relatives) should make end-of-life decisions after being given detailed information. The exercise of autonomy is difficult because of the patient's disease and the nature of invasive treatments, but also because of organisational and communication barriers. The latter can be surmounted by a proactive approach. Early communication with the patient and relatives about the sometimes limited expectations of an invasive treatment plan, and about the possibility of palliative care, allows the patient's preferences to be integrated into the formulation of a therapeutic plan.
Abstract:
Objectives Medical futility at the end of life is a growing challenge to medicine. The goals of the authors were to elucidate how clinicians define futility, when they perceive life-sustaining treatment (LST) to be futile, how they communicate this situation and why LST is sometimes continued despite being recognised as futile. Methods The authors reviewed ethics case consultation protocols and conducted semi-structured interviews with 18 physicians and 11 nurses from adult intensive and palliative care units at a tertiary hospital in Germany. The transcripts were subjected to qualitative content analysis. Results Futility was identified in the majority of case consultations. Interviewees associated futility with the failure to achieve goals of care that offer a benefit to the patient's quality of life and are proportionate to the risks, harms and costs. Prototypic examples mentioned are situations of irreversible dependence on LST, advanced metastatic malignancies and extensive brain injury. Participants agreed that futility should be assessed by physicians after consultation with the care team. Intensivists favoured an indirect and stepwise disclosure of the prognosis. Palliative care clinicians focused on a candid and empathetic information strategy. The reasons for continuing futile LST are primarily emotional, such as guilt, grief, fear of legal consequences and concerns about the family's reaction. Other obstacles are organisational routines, insufficient legal and palliative knowledge and treatment requests by patients or families. Conclusion Managing futility could be improved by communication training, knowledge transfer, organisational improvements and emotional and ethical support systems. The authors propose an algorithm for end-of-life decision making focusing on goals of treatment.
Abstract:
Climate change data and predictions for the Himalayas are sparse and uncertain, characterized by a 'Himalayan data gap' and by difficulties in predicting changes due to topographic complexity. The few reliable studies and climate change models for Nepal predict considerable changes: shorter monsoon seasons, more intense rainfall patterns, higher temperatures, and drought. These predictions are confirmed by farmers, who report that temperatures have been increasing for the past decade and wonder why the rains have 'gone mad'. Hazard events, notably droughts, floods, and landslides, are increasing in number and now account for approximately 100 deaths in Nepal annually. Other effects are drinking water shortages and shifting agricultural patterns, with many communities struggling to meet basic food security even before climatic conditions started changing. The aim of this paper is to examine existing gaps between current climate models and the realities of local development planning through a case study on flood risk and drinking water management for the Municipality of Dharan in Eastern Nepal. This example highlights the challenges currently facing local-level governments, namely flood and landslide mitigation, the provision of basic amenities (especially an urgent lack of drinking water during the dry season), poor local planning capacities, and limited resources. In this context, the challenge for Nepal will be to address the increasing risks caused by hazard events while simultaneously tackling the omnipresent food security and drinking water issues in both urban and rural areas. Local planning is needed that integrates rural development and disaster risk reduction (DRR) with knowledge about climate change. The paper concludes with a critical analysis of climate change modeling and of the gap between scientific data and the low-tech environment and limited capacities of local planners to access or implement adequate adaptation measures. Recommendations include the need to bridge the gaps between scientific models, the local political reality, and local information needs.