51 results for Intensive care units pediatric
Abstract:
Premature birth is a well-known risk factor for sensorineural hearing loss in general and auditory neuropathy in particular. However, relatively little is known about the underlying causes, in part because there are so few relevant histopathological studies. Here, we report on the analysis of hair cell loss patterns in 54 temporal bones from premature infants and a control group of 46 bones from full-term infants, all of whom spent time in the neonatal intensive care unit at the Hospital de Niños in San Jose, Costa Rica, between 1977 and 1993. The prevalence of significant hair cell loss was higher in the preterm group than the full-term group (41% vs. 28%, respectively). The most striking finding was the frequency of selective inner hair cell loss, an extremely rare histopathological pattern, in the preterm vs. the full-term babies (27% vs. 3%, respectively). The findings suggest that a common cause of non-genetic auditory neuropathy is selective loss of inner hair cells rather than primary damage to the cochlear nerve.
Abstract:
Decisions for intensive care unit (ICU) admissions in patients with advanced cancer are complex, and knowledge of survival rates and prognostic factors is essential to these decisions. Our objectives were to describe the short- and long-term survival of patients with metastatic solid cancer admitted to an ICU due to emergencies and to study the prognostic factors present at ICU admission that could be associated with hospital mortality. We retrospectively analysed the charts of all patients with metastatic solid cancer admitted over a 1-year period. This gave a study sample of 83 patients. The ICU, hospital, 1-year and 2-year survival rates were 55.4%, 28.9%, 12.0% and 2.4%, respectively. Thrombocytopenia (odds ratio 26.2; P = 0.006) and Simplified Acute Physiology Score (SAPS II) (odds ratio 1.09; P = 0.026) were independent factors associated with higher hospital mortality. In conclusion, the survival rates of patients with metastatic solid cancer admitted to the ICU due to emergencies were low, but of the same magnitude as those of other groups of cancer patients admitted to the ICU. The SAPS II score and thrombocytopenia on admission were associated with higher hospital mortality. Characteristics of the metastatic disease, such as the number of organs with metastasis and central nervous system metastasis, were not associated with hospital mortality.
Abstract:
Objective: To evaluate the impact of antiretroviral therapy (ART) and the prognostic factors for in-intensive care unit (ICU) and 6-month mortality in human immunodeficiency virus (HIV)-infected patients. Design: A retrospective cohort study was conducted in patients admitted to the ICU from 1996 through 2006. The follow-up period extended for 6 months after ICU admission. Setting: The ICU of a tertiary-care teaching hospital at the Universidade de Sao Paulo, Brazil. Participants: A total of 278 HIV-infected patients admitted to the ICU were selected. We excluded ICU readmissions (37), ICU admissions who stayed less than 24 hours (44), and patients with unavailable medical charts (36). Outcome Measure: In-ICU and 6-month mortality. Main Results: Multivariate logistic regression analysis and Cox proportional hazards models demonstrated that the variables associated with in-ICU and 6-month mortality were sepsis as the cause of admission [odds ratio (OR) = 3.16 (95% confidence interval [CI] 1.65-6.06); hazard ratio (HR) = 1.37 (95% CI 1.01-1.88)], an Acute Physiology and Chronic Health Evaluation II score >19 [OR = 2.81 (95% CI 1.57-5.04); HR = 2.18 (95% CI 1.62-2.94)], mechanical ventilation during the first 24 hours [OR = 3.92 (95% CI 2.20-6.96); HR = 2.25 (95% CI 1.65-3.07)], and year of ICU admission [OR = 0.90 (95% CI 0.81-0.99); HR = 0.92 (95% CI 0.87-0.97)]. CD4 T-cell count <50 cells/mm³ was only associated with in-ICU mortality [OR = 2.10 (95% CI 1.17-3.76)]. The use of ART in the ICU was negatively predictive of 6-month mortality in the Cox model [HR = 0.50 (95% CI 0.35-0.71)], especially if this therapy was introduced during the first 4 days of admission to the ICU [HR = 0.58 (95% CI 0.41-0.83)]. Among HIV-infected patients admitted to the ICU who were not receiving ART, those who started this treatment during the ICU stay had a better prognosis when time and potential confounding factors were adjusted for [HR = 0.55 (95% CI 0.31-0.98)].
Conclusions: The ICU outcome of HIV-infected patients seems to be dependent not only on acute illness severity, but also on the administration of antiretroviral treatment. (Crit Care Med 2009; 37: 1605-1611)
Abstract:
Purpose: The aim of this study was to characterize the first 48-hour evolution of metabolic acidosis in adult patients with diabetic ketoacidosis admitted to the intensive care unit. Materials and Methods: We studied 9 patients retrieved from our prospectively collected database, using the physicochemical approach to acid-base disturbances. Results: Mean (SD) age was 34 (13) years; mean (SD) Acute Physiology and Chronic Health Evaluation II score was 16 (10); mean (SD) blood glucose level on admission was 480 (144) mg/dL; mean (SD) pH was 7.17 (0.18); and mean (SD) standard base excess was -16.8 (7.7) mEq/L. On admission, a large part of the metabolic acidosis was attributed to unmeasured anions (strong ion gap [SIG], 20 +/- 10 mEq/L), with a wide range of strong ion difference (41 +/- 10 mEq/L). During the first 48 hours of treatment, 297 +/- 180 IU of insulin and 9240 +/- 6505 mL of fluids were used. Metabolic improvement was marked by the normalization of pH, partial correction of standard base excess, and a reduction of hyperglycemia. There was a significant improvement of SIG (7.6 +/- 6.2 mEq/L) and a worsening of strong ion difference acidosis (36 +/- 5 mEq/L) in the first 24 hours, with a trend toward recovery between 24 and 48 hours (38 +/- 6 mEq/L). Conclusion: Initial metabolic acidosis was due to SIG, and treatment was associated with a significant decrease of SIG and an elevation of serum chloride above the normal range. (C) 2011 Elsevier Inc. All rights reserved.
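The strong ion gap used in the abstract above comes from the physicochemical (Stewart) approach: SIG is the difference between the apparent and the effective strong ion difference. A minimal sketch of that calculation, using illustrative electrolyte values (not data from the study) and the commonly used Figge approximations for albumin and phosphate charge:

```python
# Strong ion gap (SIG) via the Stewart/physicochemical approach.
# All input values below are illustrative, not taken from the study.

def sid_apparent(na, k, ca_ionized, mg, cl, lactate):
    """Apparent strong ion difference (mEq/L); divalent cations count twice."""
    return na + k + 2 * ca_ionized + 2 * mg - cl - lactate

def sid_effective(hco3, albumin_g_l, phosphate_mmol_l, ph):
    """Effective SID (mEq/L) using the Figge charge approximations."""
    alb_charge = albumin_g_l * (0.123 * ph - 0.631)        # albumin anion charge
    phos_charge = phosphate_mmol_l * (0.309 * ph - 0.469)  # phosphate charge
    return hco3 + alb_charge + phos_charge

# Hypothetical admission values resembling severe ketoacidosis:
sid_a = sid_apparent(na=140, k=4.0, ca_ionized=1.1, mg=0.5, cl=100, lactate=2.0)
sid_e = sid_effective(hco3=10, albumin_g_l=40, phosphate_mmol_l=1.2, ph=7.17)
sig = sid_a - sid_e  # unmeasured anions (here dominated by ketoacids)
print(round(sid_a, 1), round(sig, 1))
```

With these invented numbers the SIG lands near 20 mEq/L, the same order of magnitude the study reports for unmeasured anions on admission.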
Abstract:
OBJECTIVE. To evaluate the effectiveness of the oral application of a 0.12% solution of chlorhexidine for prevention of respiratory tract infections among intensive care unit (ICU) patients. DESIGN. The study design was a double-blind, randomized, placebo-controlled trial. SETTING. The study was performed in an ICU in a tertiary care hospital at a public university. PATIENTS. Study participants comprised 194 patients admitted to the ICU with a prospective length of stay greater than 48 hours, randomized into 2 groups: those who received chlorhexidine (n = 98) and those who received a placebo (n = 96). INTERVENTION. Oral rinses with chlorhexidine or a placebo were performed 3 times a day throughout the duration of the patient's stay in the ICU. Clinical data were collected prospectively. RESULTS. Both groups displayed similar baseline clinical features. The overall incidence of respiratory tract infections (RR, 1.0 [95% confidence interval (CI), 0.63-1.60]) and the rates of ventilator-associated pneumonia per 1,000 ventilator-days were similar in both experimental and control groups (22.6 vs 22.3; P = .95). Respiratory tract infection-free survival time (7.8 vs 6.9 days; P = .61), duration of mechanical ventilation (11.1 vs 11.0 days; P = .61), and length of stay (9.7 vs 10.4 days; P = .67) did not differ between the chlorhexidine and placebo groups. However, patients in the chlorhexidine group exhibited a larger interval between ICU admission and onset of the first respiratory tract infection (11.3 vs 7.6 days; P = .05). The chances of surviving the ICU stay were similar (RR, 1.08 [95% CI, 0.72-1.63]). CONCLUSION. Oral application of a 0.12% solution of chlorhexidine does not prevent respiratory tract infections among ICU patients, although it may retard their onset.
Abstract:
This study aimed to carry out the cross-cultural adaptation of The Environmental Stressor Questionnaire (ESQ) into Brazilian Portuguese and to assess its reliability and validity. The methodological steps recommended in the literature for cultural adaptation were followed. The Brazilian version of the ESQ was administered to 106 intensive care unit (ICU) patients at two hospitals, one public and one private, in the interior of the State of São Paulo. Reliability was assessed in terms of internal consistency and stability (test-retest); convergent validity was assessed through the correlation between the ESQ and a generic question about stress in the ICU. Reliability was satisfactory, with Cronbach's alpha = 0.94 and an intraclass correlation coefficient of 0.861 (95% CI 0.723-0.933). The ESQ total score correlated with the generic stress question (r = 0.70), confirming convergent validity. The Brazilian version of the ESQ proved to be a reliable and valid tool for the assessment of stressors in the ICU.
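The internal-consistency figure reported above (Cronbach's alpha = 0.94) is computed from the item variances and the variance of the total score. A minimal sketch of the formula with made-up questionnaire responses (not the study's data):

```python
# Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
# Toy data: 4 respondents x 3 items on a 1-5 scale (illustrative only).
from statistics import pvariance

responses = [
    [4, 5, 4],
    [3, 3, 3],
    [5, 5, 4],
    [2, 2, 1],
]

k = len(responses[0])                 # number of items
items = list(zip(*responses))         # transpose: columns = items
item_vars = sum(pvariance(col) for col in items)
total_var = pvariance([sum(row) for row in responses])
alpha = k / (k - 1) * (1 - item_vars / total_var)
print(round(alpha, 3))
```

Population variance (`pvariance`) is used consistently for both item and total scores; using sample variance throughout gives the same alpha.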
Abstract:
OBJECTIVE: To investigate the relationship between the adequacy of energy delivery and intensive care unit mortality in patients receiving exclusive enteral nutrition therapy. METHODS: Prospective observational study conducted in an intensive care unit in 2008 and 2009. Patients >18 years of age who received enteral nutrition therapy for >72 h were included. The adequacy of energy delivery was estimated as the administered/prescribed ratio. An unconditional logistic regression model was used to investigate the relationship between the predictor variables (adequacy of energy delivery, APACHE II score, sex, age, and length of stay in the intensive care unit) and the outcome of intensive care unit mortality. RESULTS: Sixty-three patients were included (mean age 58 years, mortality 27%), 47.6% of whom received more than 90% of the prescribed energy (mean adequacy 88.2%). The mean energy balance was -190 kcal/day. A significant association was observed between death and the variables age and length of stay in the intensive care unit, after the variables adequacy of energy delivery, APACHE II score, and sex were removed during the modeling process. CONCLUSION: The adequacy of energy delivery did not influence the intensive care unit mortality rate. Enteral nutrition infusion protocols followed rigorously, with administered/prescribed adequacy above 70%, appear to be sufficient not to affect mortality. The requirement to reach indices close to 100% may therefore be questioned, considering how frequently enteral feeding is interrupted because of gastrointestinal intolerance and fasting for tests and procedures. Future research may identify the ideal target for adequacy of energy delivery that results in a significant reduction in complications, mortality, and costs.
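The adequacy measure above is simply the ratio of administered to prescribed energy, and the energy balance is their difference. A minimal sketch with hypothetical values (not the study's data):

```python
# Energy adequacy = administered / prescribed, expressed as a percentage.
# The kcal figures below are hypothetical, not from the study.

def adequacy_pct(administered_kcal, prescribed_kcal):
    """Percentage of the prescribed energy actually delivered."""
    return 100 * administered_kcal / prescribed_kcal

# A patient prescribed 1800 kcal/day who received 1590 kcal/day:
pct = adequacy_pct(1590, 1800)
balance = 1590 - 1800  # daily energy balance in kcal (negative = deficit)
print(round(pct, 1), balance)
```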
Abstract:
Various molecular systems are available for epidemiological, genetic, evolutionary, taxonomic and systematic studies of innumerable fungal infections, especially those caused by the opportunistic pathogen C. albicans. A total of 75 independent oral isolates were selected in order to compare Multilocus Enzyme Electrophoresis (MLEE), Electrophoretic Karyotyping (EK) and Microsatellite Markers (Simple Sequence Repeats - SSRs) in their abilities to differentiate and group C. albicans isolates (discriminatory power), and also to evaluate the concordance and similarity of the groups of strains determined by cluster analysis for each fingerprinting method. Isoenzyme typing was performed using eleven enzyme systems: Adh, Sdh, M1p, Mdh, Idh, Gdh, G6pdh, Asd, Cat, Po, and Lap (data previously published). The EK method consisted of chromosomal DNA separation by pulsed-field gel electrophoresis using a CHEF system. The microsatellite markers were investigated by PCR using three polymorphic loci: EF3, CDC3, and HIS3. Dendrograms were generated by the SAHN method and UPGMA algorithm based on similarity matrices (S_SM). The discriminatory power of the three methods was over 95%; however, a paired analysis among them showed a parity of only 19.7-22.4% in the identification of strains. Weak correlation was also observed among the genetic similarity matrices (S_SM of MLEE vs. EK vs. SSRs). Clustering analyses showed a mean of 9 +/- 12.4 isolates per cluster (3.8 +/- 8 isolates/taxon) for MLEE, 6.2 +/- 4.9 isolates per cluster (4 +/- 4.5 isolates/taxon) for SSRs, and 4.1 +/- 2.3 isolates per cluster (2.6 +/- 2.3 isolates/taxon) for EK. A total of 45 (13%), 39 (11.2%), 5 (1.4%) and 3 (0.9%) cluster pairs out of 347 showed similarity (Si) of 0.1-10%, 10.1-20%, 20.1-30% and 30.1-40%, respectively. Clinical and molecular epidemiological correlations involving the opportunistic pathogen C. albicans therefore depend on the genotyping method used (i.e., MLEE, EK, or SSRs), supplemented with similarity and grouping analysis. Accordingly, the use of genotyping systems whose results show minimal disparity, or the combination of the results of these systems, can provide greater security and consistency in the determination of strains and their genetic relationships. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
In the present study, clinical and epidemiological aspects of 529 intoxication cases involving organophosphate or carbamate pesticides in the northwest of the state of Parana, Brazil, over a twelve-year period (1994-2005) are presented. One hundred five of 257 patients (40.8%) who attempted suicide were admitted to Intensive Care Units (ICUs), with an average hospital stay of two days (range 1-40 days). Men accounted for 56.4% of the cases of suicide attempts, and sixteen individuals died. One hundred forty patients intoxicated through occupational exposure were all young adults; nine of them were admitted to the ICU, with an average hospital stay of eight days (range 1-16 days). Of these cases, two patients died. One hundred twenty-four patients intoxicated through accidental exposure were mainly children and had an average hospital stay of four days. Twenty patients were admitted to the ICU, and one of them died. Overall complications included respiratory failure, convulsions, and aspiration pneumonia. Deliberate ingestion of organophosphates and carbamates was much more toxic than occupational and accidental exposure. Men aged 15-39 years were the most likely to attempt suicide with these agents and had more prolonged ICU stays, with significant complications and mortality.
Abstract:
Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach to discontinuing continuous renal replacement therapy may affect patient outcomes; however, there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the rest (216 patients) were classified as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) compared with patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per μmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of stopping continuous renal replacement therapy was the most important predictor of successful discontinuation, especially in the absence of diuretic administration. (Crit Care Med 2009; 37:2576-2582)
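The areas under the receiver operating characteristic curve quoted above can be computed nonparametrically as the probability that a randomly chosen successfully weaned patient had a higher predictor value (here, urine output) than a randomly chosen repeat-RRT patient, with ties counting half. A minimal sketch with invented urine-output values (not the study's data):

```python
# Nonparametric ROC AUC (Mann-Whitney interpretation): fraction of
# (positive, negative) pairs correctly ordered by the predictor;
# ties count 0.5. All values below are invented for illustration.

def roc_auc(positives, negatives):
    """AUC of a scalar predictor separating positives from negatives."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in positives
        for n in negatives
    )
    return wins / (len(positives) * len(negatives))

# Hypothetical urine output (mL/day) before stopping CRRT:
success = [900, 1200, 600, 1500, 800]     # weaned successfully
repeat_rrt = [300, 500, 250, 700, 600]    # required repeat RRT
print(roc_auc(success, repeat_rrt))
```

An AUC of 0.5 means the predictor is no better than chance; values toward 1.0 mean higher urine output reliably distinguishes the success group, which is the pattern the study reports.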
Abstract:
Purpose: The aim of this study is to evaluate the relationship between timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission into early (<2 days), delayed (2-5 days), and late (>5 days). Renal replacement therapy timing by serum urea showed no significant difference in crude (63.4% for urea <= 24.2 mmol/L vs 61.4% for urea >24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine >309 μmol/L vs 71.4% for creatinine <= 309 μmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT and stay in hospital and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival. However, this largely depended on its definition. Late RRT (days from admission) was associated with a longer duration of RRT, longer hospital stay, and higher dialysis dependence. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Sepsis syndrome is caused by inappropriate immune activation due to bacteria and bacterial components released during infection. This syndrome is the leading cause of death in intensive care units. Specialized B-lymphocytes located in the peritoneal and pleural cavities are known as B-1 cells. These cells produce IgM and IL-10, both of which are potent regulators of cell-mediated immunity. It has been suggested that B-1 cells modulate the systemic inflammatory response in sepsis. In this study, we conducted in vitro and in vivo experiments in order to investigate a putative role of B-1 cells in a murine model of LPS-induced sepsis. Macrophages and B-1 cells were studied in monocultures and in co-cultures. The B-1 cells produced the anti-inflammatory cytokine IL-10 in response to LPS. In the B-1 cell-macrophage co-cultures, production of proinflammatory mediators (TNF-alpha, IL-6 and nitrite) was lower than in the macrophage monocultures, whereas that of IL-10 was higher in the co-cultures. Co-culture of B-1 IL-10(-/-) cells and macrophages did not reduce the production of the proinflammatory mediators (TNF-alpha, IL-6 and nitrite). After LPS injection, the mortality rate was higher among Balb/Xid mice, which are B-1 cell deficient, than among wild-type mice (65.0% vs. 0.0%). The Balb/Xid mice also presented a proinflammatory profile of TNF-alpha, IL-6 and nitrite, as well as lower levels of IL-10. In the early phase of LPS stimulation, B-1 cells modulate the macrophage inflammatory response, and the main molecular pathway of that modulation is based on IL-10-mediated intracellular signaling. (C) 2010 Elsevier GmbH. All rights reserved.
Abstract:
Background: This study evaluated the impact of 2 models of educational intervention on rates of central venous catheter-associated bloodstream infections (CVC-BSIs). Methods: This was a prospective observational study conducted between January 2005 and June 2007 in 2 medical intensive care units (designated ICU A and ICU B) in a large teaching hospital. The study was divided into 3 periods: baseline (only rates were evaluated), preintervention (questionnaire to evaluate knowledge of health care workers [HCWs] and observation of CVC care in both ICUs), and intervention (in ICU A, tailored, continuous intervention; in ICU B, a single lecture). The preintervention and intervention periods for each ICU were compared. Results: During the preintervention period, 940 CVC-days were evaluated in ICU A and 843 CVC-days were evaluated in ICU B. During the intervention period, 2175 CVC-days were evaluated in ICU A and 1694 CVC-days were evaluated in ICU B. Questions regarding CVC insertion, disinfection during catheter manipulation, and use of an alcohol-based product during dressing application were answered correctly by 70%-100% of HCWs. Nevertheless, HCWs' adherence to these practices in the preintervention period was low for CVC handling and dressing, hand hygiene (6%-35%), and catheter hub disinfection (45%-68%). During the intervention period, HCWs' adherence to hand hygiene was 48%-98%, and adherence to hub disinfection was 82%-97%. CVC-BSI rates declined in both units. In ICU A, this decrease was progressive and sustained, from 12 CVC-BSIs/1000 CVC-days at baseline to 0 after 9 months. In ICU B, the rate initially dropped from 16.2 to 0 CVC-BSIs/1000 CVC-days, but then increased to 13.7 CVC-BSIs/1000 CVC-days. Conclusion: Personalized, continuous intervention seems to develop a "culture of prevention" and is more effective than a single intervention, leading to a sustained reduction of infection rates.
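Device-associated infection rates like the CVC-BSI figures above are conventionally expressed per 1,000 device-days (infections divided by catheter-days, scaled by 1,000). A minimal sketch with illustrative counts (not the study's exact tallies):

```python
# Device-associated infection rate per 1,000 device-days.
# The counts below are illustrative, not the study's data.

def rate_per_1000(infections, device_days):
    """Infections per 1,000 device-days of exposure."""
    return 1000 * infections / device_days

# e.g. 11 CVC-BSIs observed over 940 catheter-days:
print(round(rate_per_1000(11, 940), 1))
```

Normalizing by device-days rather than patient counts lets units with different catheter utilization (e.g. 940 vs 2175 CVC-days here) be compared on the same scale.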
Abstract:
Objective To study the association between maternal preeclampsia and neonatal sepsis in very low birth weight newborns. Study design We studied all infants with birth weights between 500 g and 1500 g who were admitted to 6 neonatal intensive care units of the Brazilian Network on Neonatal Research over 2 years. Exclusion criteria were major malformations, death in the delivery room, and maternal chronic hypertension. Absolute neutrophil counts were performed in the first 72 hours of life. Results A total of 911 very low birth weight infants (preeclampsia, 308; non-preeclampsia, 603) were included. The preeclampsia group had significantly higher gestational age, more cesarean deliveries, antenatal steroid use, central catheters, total parenteral nutrition, and neutropenia, and less rupture of membranes >18 hours and mechanical ventilation. Both groups had similar incidences of early sepsis (4.6% and 4.2% in the preeclampsia and non-preeclampsia groups, respectively) and late sepsis (24% and 22.1% in the preeclampsia and non-preeclampsia groups, respectively). In multiple logistic regression, vaginal delivery and neutropenia were associated with early sepsis, and mechanical ventilation, central catheter use, and total parenteral nutrition were associated with late sepsis. Conclusions Preeclampsia did not increase neonatal sepsis in very low birth weight infants, and death was associated with neutropenia in very preterm infants. (J Pediatr 2010; 157: 434-8).