172 results for Probation intensive
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
We report the case of a 67-year-old male patient admitted to the intensive care unit after coronary bypass surgery who presented with cardiogenic shock, acute renal failure, and three episodes of sepsis, the last with pulmonary distress on the 30th post-operative day. The patient died within five days despite treatment with vancomycin, imipenem, colistimethate, and amphotericin B. At autopsy, severe adenovirus pneumonia was found. Viral pulmonary infections following cardiovascular surgery are uncommon. We highlight the importance of etiological diagnosis for a correct treatment approach.
Abstract:
In order to assess the prevalence of and risk factors for aminoglycoside-associated nephrotoxicity in intensive care units (ICUs), we evaluated 360 consecutive patients starting aminoglycoside therapy in an ICU. The patients had a baseline calculated glomerular filtration rate (cGFR) of ≥30 ml/min/1.73 m². Among these patients, 209 (58 per cent) developed aminoglycoside-associated nephrotoxicity (the acute kidney injury [AKI] group, which consisted of individuals with a decrease in cGFR of >20 per cent from the baseline cGFR), while 151 did not (non-AKI group). Both groups had similar baseline cGFRs. The AKI group developed a lower cGFR nadir (45 ± 27 versus 79 ± 39 ml/min/1.73 m² for the non-AKI group; P < 0.001); was older (56 ± 18 years versus 52 ± 19 years for the non-AKI group; P = 0.033); had a higher prevalence of diabetes (19.6 per cent versus 9.3 per cent for the non-AKI group; P = 0.007); was more frequently treated with other nephrotoxic drugs (51 per cent versus 38 per cent for the non-AKI group; P = 0.024); used iodinated contrast more frequently (18 per cent versus 8 per cent for the non-AKI group; P = 0.0054); and showed a higher prevalence of hypotension (63 per cent versus 44 per cent for the non-AKI group; P = 0.0003), shock (56 per cent versus 31 per cent for the non-AKI group; P < 0.0001), and jaundice (19 per cent versus 8 per cent for the non-AKI group; P = 0.0036). The mortality rate was 44.5 per cent for the AKI group and 29.1 per cent for the non-AKI group (P = 0.0031). A logistic regression model identified as significant (P < 0.05) the following independent factors that affected aminoglycoside-associated nephrotoxicity: a baseline cGFR of <60 ml/min/1.73 m² (odds ratio [OR], 0.42), diabetes (OR, 2.13), treatment with other nephrotoxins (OR, 1.61) or iodinated contrast (OR, 2.13), and hypotension (OR, 1.83).
In conclusion, AKI was frequent among ICU patients receiving an aminoglycoside and was associated with a high rate of mortality. The presence of diabetes or hypotension and the use of other nephrotoxic drugs or iodinated contrast were independent risk factors for the development of aminoglycoside-associated nephrotoxicity.
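The AKI definition used in this study (a decrease in cGFR of more than 20 per cent from baseline) is simple arithmetic; as an illustration, a minimal sketch in Python (the function name and the example values are hypothetical, not taken from the study):

```python
def aminoglycoside_aki(baseline_cgfr, nadir_cgfr, threshold=0.20):
    """Apply the study's AKI criterion: a decrease in calculated GFR
    (cGFR, ml/min/1.73 m2) of more than 20% from the baseline value."""
    fractional_decrease = (baseline_cgfr - nadir_cgfr) / baseline_cgfr
    return fractional_decrease > threshold

# Hypothetical patients: a fall from 90 to 60 (33%) meets the
# criterion; a fall from 90 to 80 (11%) does not.
print(aminoglycoside_aki(90, 60))  # True
print(aminoglycoside_aki(90, 80))  # False
```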
Abstract:
The aim of this study was to develop the concept of the dignified death of children in Brazilian pediatric intensive care units (PICUs). The Hybrid Model for Concept Development was used to build a conceptual structure of dignified death in PICUs in an attempt to define the concept. The fieldwork was carried out by means of in-depth interviews with nine nurses and seven physicians working in PICUs. Not unexpectedly, the concept of dignified death was found to be a complex phenomenon involving aspects related to decisions made by the multidisciplinary team as well as those related to care of the child and the family. Knowledge of the concept's dimensions can promote reflection on the part of healthcare professionals regarding the values and beliefs underlying their conduct in end-of-life situations. Our hope is that this study may contribute to theoretical and methodological development in the area of end-of-life care.
Abstract:
Purpose: Adverse drug events (ADEs) are harmful and occur with alarming frequency in critically ill patients. Complex pharmacotherapy with multiple medications increases the probability of a drug interaction (DI) and of ADEs in patients in intensive care units (ICUs). The objective of this study was to determine the frequency of ADEs among patients in the ICU of a university hospital and the drugs implicated, and to investigate factors associated with ADEs. Methods: This cross-sectional study investigated 299 medical records of patients hospitalized for 5 or more days in an ICU. ADEs were identified through the intensive monitoring adopted in hospital pharmacovigilance and through ADE triggers. Adverse drug reaction (ADR) causality was classified using the Naranjo algorithm. Data were analyzed through descriptive analysis and through univariate and multiple logistic regression. Results: The most frequent ADEs were type A ADRs, of possible causality and moderate severity. The most frequent ADR was drug-induced acute kidney injury. Patients with ADEs related to DIs corresponded to 7% of the sample. Multiple logistic regression showed that length of hospitalization (OR = 1.06) and administration of cardiovascular drugs (OR = 2.2) were associated with the occurrence of ADEs. Conclusion: Adverse drug reactions of clinical significance were the most frequent ADEs in the ICU studied, which reduces patient safety. The number of ADEs related to drug interactions was small, suggesting that clinical manifestations of drug interactions that harm patients are not frequent in ICUs.
Abstract:
Objective: To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting: The study was developed in Brazil. Method: Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (a database of health sciences literature published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions, 8th edition, was employed as the gold standard in the identification of drug-drug interactions. Main outcome measures: Sensitivity, specificity, and positive and negative predictive values. Results: The programs studied were Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion: The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams prevent and recognize drug-drug interactions, optimizing the safety and quality of care delivered in intensive care units.
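The accuracy measures reported in this study follow the standard definitions based on true/false positive and negative counts; a minimal sketch in Python (the counts in the example are hypothetical, chosen only to mirror the study's 100 + 100 interaction design, not taken from its results):

```python
def accuracy_measures(tp, fp, tn, fn):
    """Standard diagnostic accuracy measures from counts of
    true positives (tp), false positives (fp), true negatives (tn),
    and false negatives (fn)."""
    return {
        "sensitivity": tp / (tp + fn),  # detected among truly important
        "specificity": tn / (tn + fp),  # rejected among truly unimportant
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical: 100 clinically important and 100 unimportant
# interactions, as in the study design; the split below is invented.
m = accuracy_measures(tp=88, fp=11, tn=89, fn=12)
print(round(m["sensitivity"], 2))  # 0.88
```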
Abstract:
The standards in this chapter focus on maximising the patient's ability to adhere to the treatment prescribed. Many people are extremely shocked when they are told they have TB; some refuse to accept it, and others are relieved to find out what is wrong and that treatment is available. The reaction depends on many factors, including cultural beliefs and values, previous experience, and knowledge of the disease. Even though TB is more common among vulnerable groups, it can affect anyone, and it is important for patients to be able to discuss their concerns in relation to their own individual context. The cure for TB relies on the patient receiving a full, uninterrupted course of treatment, which can only be achieved if the patient and the health service work together. A system needs to be in place to trace patients who miss their appointments for treatment (late patients). The best success will be achieved through the use of flexible, innovative, and individualised approaches. The treatment and care the patient has received will inevitably have an impact on his or her willingness to attend in the future. A well-defined system of late patient tracing is mandatory in all situations. However, when late-patient rates are high (above 10%), any tracing system will be useless without also examining the service as a whole.
Abstract:
BACKGROUND: Guidelines for red blood cell (RBC) transfusions exist; however, transfusion practices vary among centers. This study aimed to analyze transfusion practices and the impact of patient and institutional characteristics on the indications for RBC transfusions in preterm infants. STUDY DESIGN AND METHODS: RBC transfusion practices were investigated in a multicenter prospective cohort of preterm infants with a birth weight of less than 1500 g born at eight public university neonatal intensive care units of the Brazilian Network on Neonatal Research. Variables associated with any RBC transfusion were analyzed by logistic regression analysis. RESULTS: Of 952 very-low-birth-weight infants, 532 (55.9%) received at least one RBC transfusion. The percentages of transfused neonates were 48.9, 54.5, 56.0, 61.2, 56.3, 47.8, 75.4, and 44.7%, respectively, for Centers 1 through 8. The number of transfusions during the first 28 days of life was higher in Centers 4 and 7 than in the other centers. After 28 days, the number of transfusions decreased, except in Center 7. Multivariate logistic regression analysis showed a higher likelihood of transfusion in infants with late-onset sepsis (odds ratio [OR], 2.8; 95% confidence interval [CI], 1.8-4.4), intraventricular hemorrhage (OR, 9.4; 95% CI, 3.3-26.8), intubation at birth (OR, 1.7; 95% CI, 1.0-2.8), need for an umbilical catheter (OR, 2.4; 95% CI, 1.3-4.4), days on mechanical ventilation (OR, 1.1; 95% CI, 1.0-1.2), oxygen therapy (OR, 1.1; 95% CI, 1.0-1.1), parenteral nutrition (OR, 1.1; 95% CI, 1.0-1.1), and birth center (p < 0.001). CONCLUSIONS: The need for RBC transfusions in very-low-birth-weight preterm infants was associated with clinical conditions and with the birth center. The distribution of the number of transfusions during the hospital stay may be used as a measure of neonatal care quality.
Abstract:
The sustainability of fast-growing tropical Eucalyptus plantations is of concern in a context of rising fertilizer costs, since large amounts of nutrients are removed with the biomass every 6-7 years from highly weathered soils. A better understanding of the dynamics of tree requirements is needed to match fertilization regimes to the availability of each nutrient in the soil. The nutrition of Eucalyptus plantations has been intensively investigated, and many studies have focused on specific fluxes in the biogeochemical cycles of nutrients. However, studies dealing with complete cycles are scarce for the Tropics. The objective of this paper was to compare these cycles for Eucalyptus plantations in Congo and Brazil, with contrasting climates, soil properties, and management practices. The main features were similar in the two situations. Most nutrient fluxes were driven by crown establishment in the first two years after planting and by total biomass production thereafter. These forests were characterized by huge nutrient requirements: 155, 10, 52, 55, and 23 kg ha⁻¹ of N, P, K, Ca, and Mg, respectively, in the first year after planting at the Brazilian study site. High growth rates in the first months after planting were essential to take advantage of the large amounts of nutrients released into the soil solutions by organic matter mineralization after harvesting. This study highlighted the predominant role of the biological and biochemical cycles over the geochemical cycle of nutrients in tropical Eucalyptus plantations and indicated the prime importance of carefully managing organic matter in these soils. Limited nutrient losses through deep drainage after clear-cutting in the sandy soils of the two study sites showed the remarkable efficiency of Eucalyptus trees in keeping limited nutrient pools within the ecosystem, even after major disturbances.
Nutrient input-output budgets suggested that Eucalyptus plantations take advantage of soil fertility inherited from previous land uses and that long-term sustainability will require an increase in the inputs of certain nutrients. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Only 7% of the once extensive forest along the eastern coast of Brazil remains, and much of that is degraded and threatened by agricultural expansion and urbanization. We wondered if methods similar to those developed to establish fast-growing Eucalyptus plantations might also work to enhance survival and growth of rainforest species on degraded pastures composed of highly competitive C₄ grasses. An 8-factor experiment was laid out to contrast the value of different intensities of cultivation, application of fertilizer, and weed control on the growth and survival of a mixture of 20 rainforest species planted at two densities: 3 m × 1 m and 3 m × 2 m. Intensive management increased seedling survival from 90% to 98%, stemwood production and leaf area index (LAI) by approximately 4-fold, and stemwood production per unit of light absorbed by 30%. Annual growth in stem biomass was closely related to LAI alone (r² = 0.93, p < 0.0001), and the regression improved further in combination with canopy nitrogen content (r² = 0.99, p < 0.0001). Intensive management resulted in a nearly closed forest canopy in less than 4 years and offers a practical means to establish functional forests on abandoned agricultural land. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
We used environmental accounting to evaluate high-intensity clonal eucalyptus production in São Paulo, Brazil, converting inputs (environmental, material, and labor) to emergy units so ecological efficiency could be compared on a common basis. Input data were compiled under three pH management scenarios (lime, ash, and sludge). The dominant emergy input is environmental work (transpired water, approximately 58% of total emergy), followed by diesel (approximately 15%); most purchased emergy is invested during harvest (41.8% of 7-year production totals). Where recycled materials are used for pH amendment (ash or sludge instead of lime), we observe marked improvements in ecological efficiency; lime (raw) yielded the highest unit emergy value (UEV = emergy per unit energy in the product = 9.6E+03 sej J⁻¹), whereas using sludge and ash (recycled) reduced the UEV to 8.9E+03 and 8.8E+03 sej J⁻¹, respectively. The emergy yield ratio was similarly affected, suggesting a better ecological return on the energy invested. The sensitivity of resource use to other operational modifications (e.g., decreased diesel, labor, or agrochemicals) was small (<3% change). Emergy synthesis permits comparison of sustainability among forest production systems globally. This eucalyptus scheme shows the highest ecological efficiency of the analyzed pulp production operations (UEV range = 1.1E+04 to 3.6E+04 sej J⁻¹) despite its high operational intensity.
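The unit emergy value compared in this study is simply the total emergy invested divided by the energy content of the product; a minimal sketch of the arithmetic in Python (the product-energy and total-emergy figures below are hypothetical, chosen only so the ratio reproduces the reported lime-scenario UEV of 9.6E+03 sej J⁻¹):

```python
def unit_emergy_value(total_emergy_sej, product_energy_j):
    """UEV = emergy invested per joule of product (sej/J);
    a lower UEV means higher ecological efficiency."""
    return total_emergy_sej / product_energy_j

# Hypothetical figures: product energy and total emergy are assumed,
# picked so the ratio matches the lime-scenario UEV in the abstract.
product_energy = 1.0e15   # J of harvested biomass (assumed)
total_emergy = 9.6e18     # sej invested over the rotation (assumed)
print(unit_emergy_value(total_emergy, product_energy))  # 9600.0
```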
Abstract:
Objective: To describe an outbreak of imipenem-resistant metallo-beta-lactamase-producing Pseudomonas aeruginosa, enzyme type blaSPM-1, spread by horizontal transmission among patients admitted to a mixed adult ICU. Methods: A case-control study was carried out, including 47 patients (cases) and 122 patients (controls) admitted to the mixed ICU of a university hospital in Minas Gerais, Brazil, from November 2003 to July 2005. The infection site, risk factors, mortality, antibiotic susceptibility, metallo-beta-lactamase (MBL) production, enzyme type, and clonal diversity were analyzed. Results: A temporal/spatial relationship was detected in most patients (94%), overall mortality was 55.3%, and pneumonia was the predominant infection (85%). The majority of isolates (95%) were resistant to imipenem and other antibiotics, except for polymyxin, and showed MBL production (76.7%). Only blaSPM-1 (33%) was identified in the 15 specimens analyzed. In addition, 4 clones were identified, with a predominance of clones A (61.5%) and B (23.1%). On multivariate analysis, advanced age, mechanical ventilation, tracheostomy, and previous imipenem use were significant risk factors for imipenem-resistant P. aeruginosa infection. Conclusions: Clonal dissemination of MBL-producing P. aeruginosa strains with a spatial/temporal relationship disclosed problems in hospital infection control practices, low adherence to hand hygiene, and empirical antibiotic use. (C) 2008 Elsevier España, S.L. All rights reserved.
Abstract:
Purpose: Although gastrointestinal motility disorders are common in critically ill patients, constipation and its implications have received very little attention. We aimed to determine the incidence of constipation, identify risk factors, and assess its implications in critically ill patients. Materials and Methods: During a 6-month period, we enrolled all patients admitted to an intensive care unit of a university hospital who stayed 3 or more days. Patients who underwent bowel surgery were excluded. Results: Constipation occurred in 69.9% of the patients. There was no difference between constipated and non-constipated patients in terms of sex, age, Acute Physiology and Chronic Health Evaluation II score, type of admission (surgical, clinical, or trauma), opiate use, antibiotic therapy, or mechanical ventilation. Early (<24 hours) enteral nutrition was associated with less constipation, a finding that persisted on multivariable analysis (P < .01). Constipation was not associated with greater intensive care unit mortality, length of stay, or days free from mechanical ventilation. Conclusions: Constipation is very common among critically ill patients. Early enteral nutrition is associated with earlier return of bowel function. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Objective: to determine the relationship between age and in-hospital mortality of elderly patients, admitted to ICU, requiring and not requiring invasive ventilatory support. Design: prospective observational cohort study conducted over a period of 11 months. Setting: medical-surgical ICU at a Brazilian university hospital. Subjects: a total of 840 patients aged 55 years and older were admitted to ICU. Methods: in-hospital death rates for patients requiring and not requiring invasive ventilatory support were compared across three successive age intervals (55-64; 65-74 and 75 or more years), adjusting for severity of illness using the Acute Physiologic Score. Results: age was strongly correlated with mortality among the invasively ventilated subgroup of patients and the multivariate adjusted odds ratios increased progressively with every age increment (OR = 1.60, 95% CI = 1.01-2.54 for 65-74 years old and OR = 2.68, 95% CI = 1.58-4.56 for >= 75 years). For the patients not submitted to invasive ventilatory support, age was not independently associated with in-hospital mortality (OR = 2.28, 95% CI = 0.99-5.25 for 65-74 years old and OR = 1.95, 95% CI = 0.82-4.62 for >= 75 years old). Conclusions: the combination of age and invasive mechanical ventilation is strongly associated with in-hospital mortality. Age should not be considered as a factor related to in-hospital mortality of elderly patients not requiring invasive ventilatory support in ICU.
Abstract:
Objective: We assessed how often patients manifesting a myocardial infarction (MI) would not be considered candidates for intensive lipid-lowering therapy based on the current guidelines. Methods: In 355 consecutive patients manifesting ST-elevation MI (STEMI), admission plasma C-reactive protein (CRP) was measured, and the Framingham risk score (FRS), PROCAM risk score, Reynolds risk score, ASSIGN risk score, QRISK, and SCORE algorithms were applied. Cardiac computed tomography and carotid ultrasound were performed to assess the coronary artery calcium score (CAC), carotid intima-media thickness (cIMT), and the presence of carotid plaques. Results: Less than 50% of STEMI patients would have been identified as high risk before the event by any of these algorithms. With the exception of FRS (9%), all other algorithms would assign low risk to about half of the enrolled patients. Plasma CRP was <1.0 mg/L in 70% and >2 mg/L in 14% of the patients. The average cIMT was 0.8 ± 0.2 mm and was ≥1.0 mm in only 24% of patients. Carotid plaques were found in 74% of patients, and CAC > 100 was found in 66%. Adding CAC > 100 plus the presence of carotid plaque, a high-risk condition would be identified in 100% of the patients using any of the above-mentioned algorithms. Conclusion: More than half of patients manifesting STEMI would not be considered candidates for intensive preventive therapy by the current clinical algorithms. The addition of anatomical parameters such as CAC and the presence of carotid plaques can substantially reduce this underestimation of CVD risk. (C) 2010 Elsevier Ireland Ltd. All rights reserved.