989 results for Intensive supervision program
Abstract:
The purpose of this review is to critically appraise the pain assessment tools available in the literature for non-communicative persons in intensive care and to determine their relevance for those with brain injury. Nursing and medical electronic databases were searched to identify pain tools, with a description of psychometric properties, in English and French. Seven of the ten tools were considered relevant and were systematically evaluated according to criteria and indicators in the following five areas: conceptualisation, target population, feasibility and clinical utility, reliability, and validity. The results indicate a number of well-designed pain tools, but additional work is necessary to establish their accuracy and adequacy for the brain-injured non-communicative person in intensive care. Recommendations are made for choosing the best tool for clinical practice and for research.
Abstract:
OBJECTIVE: To assess whether formatting the medical order sheet affects the accuracy and safety of antibiotic prescription. DESIGN: Prospective assessment of antibiotic prescription over time, before and after the intervention, in comparison with a control ward. SETTING: The medical and surgical intensive care units (ICUs) of a university hospital. PATIENTS: All patients hospitalized in the medical or surgical ICU between February 1 and April 30, 1997, and July 1 and August 31, 2000, for whom antibiotics were prescribed. INTERVENTION: Formatting of the medical order sheet in the surgical ICU in 1998. MEASUREMENTS AND MAIN RESULTS: Compliance with the American Society of Hospital Pharmacists' criteria for prescription safety was measured. The proportion of safe orders increased in both units, but the increase was 4.6 times greater in the surgical ICU (66% vs. 74% in the medical ICU and 48% vs. 74% in the surgical ICU). Among unsafe orders, the proportion of ambiguous orders decreased by half in the medical ICU (9% vs. 17%) and nearly disappeared in the surgical ICU (1% vs. 30%). The only criterion still missing in the surgical ICU was the drug dose unit, which could not be preformatted. The aim of the antibiotic prescription (prophylactic or therapeutic) was indicated in only 51% of the order sheets. CONCLUSIONS: Formatting of the order sheet markedly increased the safety of antibiotic prescription. These findings must be confirmed in other settings and with different drug classes. Formatting the medical order sheet decreases the potential for prescribing errors until fully computerized prescription becomes available.
Abstract:
This study aimed to analyze the demographic and bacteriologic data of 32 newborns hospitalized in a neonatal intensive care unit of a public maternity hospital in Rio de Janeiro city, Brazil, who developed Pseudomonas aeruginosa sepsis between July 1997 and July 1999, and to determine the antimicrobial resistance percentages, serotypes and pulsed-field gel electrophoresis (PFGE) patterns of the 32 strains isolated during this period. The study group had a mean age of 12.5 days, with a higher prevalence of hospital infection in males (59.4%) and vaginal deliveries (81.2%) than in females (40.6%) and cesarean deliveries (18.8%), respectively. In this group, 20 (62.5%) patients received antimicrobials before presenting positive blood cultures. A total of 87.5% of the patients were premature, 62.5% had very low birth weight and 40.6% had asphyxia. We detected high percentages of antimicrobial resistance to β-lactams, chloramphenicol, trimethoprim/sulfamethoxazole and tetracycline among the isolated strains. All isolated strains were classified as multidrug-resistant. Most strains presented serotype O11, while PFGE analysis revealed seven distinct clones, with predominance of a single clone (75%) isolated from July 1997 to June 1998.
Abstract:
QUESTION UNDER STUDY: Hospitals transferring patients retain responsibility until admission to the new health care facility. We define safe transfer conditions, based on appropriate risk assessment, and evaluate the impact of this strategy as implemented at our institution. METHODS: An algorithm defining transfer categories according to destination, equipment, monitoring, and medication was developed and tested prospectively over 6 months. Conformity with the algorithm criteria was assessed for every transfer and transfer category. After introduction of a transfer coordination centre with transfer nurses, the algorithm was implemented and the same survey was carried out over 1 year. RESULTS: Over the whole study period, the number of transfers increased by 40%, chiefly by ambulance from the emergency department to other hospitals and private clinics. Transfers to rehabilitation centres and nursing homes were reassigned to conventional vehicles. The percentage of patients requiring equipment during transfer, such as an intravenous line, decreased from 34% to 15%, while oxygen or i.v. drug requirements remained stable. The percentage of transfers considered below the theoretical safety level decreased from 6% to 4%, while 20% of transfers were considered safer than necessary. A substantial number of planned transfers could be "downgraded" by mutual agreement to a lower degree of supervision, and the system was stable on a short-term basis. CONCLUSION: A coordinated transfer system based on an algorithm determining transfer categories, developed on the basis of simple but valid medical and nursing criteria, reduced unnecessary ambulance transfers and treatment during transfer, and increased adequate supervision.
Abstract:
Malnutrition is common in critically ill, hospitalized patients and so represents a major problem for intensive care. Nutritional support can be beneficial in such cases and may help preserve vital organ and immune function. Energy requirements, route of delivery and potential complications of nutritional support are discussed in this paper.
Abstract:
Performance prediction and application behavior modeling have been the subject of extensive research aiming to estimate application performance with acceptable precision. A novel approach to predicting the performance of parallel applications is based on the concept of Parallel Application Signatures, which consists of extracting an application's most relevant parts (phases) and the number of times they repeat (weights). By executing these phases on a target machine and multiplying each phase's execution time by its weight, an estimate of the application's total execution time can be made. One problem is that the performance of an application depends on the program workload. Each type of workload affects how an application performs on a given system differently, and thus affects the signature's execution time. Since the workloads used in most scientific parallel applications have well-known dimensions and data ranges, and the behavior of these applications is mostly deterministic, a model of how a program's workload affects its performance can be obtained. We created a new methodology to model how a program's workload affects the parallel application signature. Using regression analysis, we are able to generalize each phase's execution time and weight as functions of the workload, in order to predict an application's performance on a target system for any type of workload within a predefined range. We validate our methodology using a synthetic program, benchmark applications and well-known real scientific applications.
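The weighted-sum prediction described in this abstract can be sketched as follows. All phase measurements, workload sizes, and the linear-in-workload model below are hypothetical illustrations, not the authors' actual data or regression model; the sketch only shows the general idea of fitting each phase's time and weight against workload size and summing time × weight per phase.

```python
# Sketch of signature-based runtime prediction (hypothetical data).
# Each signature phase has a measured execution time on the target machine
# and a weight (number of repetitions); predicted total runtime is the
# weighted sum. A per-phase linear fit in the workload size generalizes
# the measurements to unseen workloads within the modeled range.

def fit_linear(xs, ys):
    """Least-squares fit of y = a*x + b for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def predict_runtime(phase_models, workload):
    """Total time = sum over phases of time(workload) * weight(workload)."""
    total = 0.0
    for (a_t, b_t), (a_w, b_w) in phase_models:
        total += (a_t * workload + b_t) * (a_w * workload + b_w)
    return total

# Two phases measured at workload sizes 100 and 200 (made-up numbers):
workloads = [100, 200]
phase_times = [[0.5, 1.0], [2.0, 4.0]]   # seconds per execution of each phase
phase_weights = [[10, 20], [5, 10]]      # repetitions (weights) of each phase

phase_models = [
    (fit_linear(workloads, times), fit_linear(workloads, weights))
    for times, weights in zip(phase_times, phase_weights)
]

# Interpolate a prediction for an unseen workload size within the range.
print(predict_runtime(phase_models, 150))
```

Because only the extracted phases are executed on the target machine, the signature run is far cheaper than a full application run, which is the practical appeal of the approach.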
Abstract:
Exercise has been used to treat low back pain for over a hundred years. Research in this area began in the mid-twentieth century and has grown exponentially since. The first study of this thesis aimed to review this abundant scientific literature. It showed that exercise is effective in the primary and secondary prevention of low back pain. As a treatment modality, exercise can reduce disability and pain and improve the physical fitness and professional status of patients with subacute and chronic low back pain. Among the different characteristics of exercise, supervision is essential. Further investigations are needed to identify subgroups of patients responding positively to other characteristics of exercise. Exercise is often used as a post-treatment modality to maintain results over time, although only a few studies have addressed this issue directly. The purpose of the second part of this thesis was to evaluate the effectiveness of an exercise program (EP) for patients with chronic low back pain who had completed functional multidisciplinary rehabilitation (FMR), compared with routine follow-up (RF), which simply consisted of encouraging patients to adopt an active daily life thereafter. The results showed that the improvements obtained at the end of FMR were maintained by both groups at one-year follow-up. Although no difference was found between the groups, only the EP group significantly improved disability and isometric endurance of the trunk muscles. An economic analysis was then carried out to assess the cost-effectiveness of EP. Based on the evaluation of patients' quality of life after FMR and at one-year follow-up, the quality-adjusted life years (QALYs) gained by each group were estimated. Direct costs (visits to physicians, specialists, physiotherapists and other therapists) and indirect costs (days off work) were measured before FMR and at one-year follow-up using a cost diary. No significant difference was found between the groups. A slight difference in QALYs in favour of EP nevertheless did not translate into measurable benefits. Future research should focus on identifying subgroups of patients for whom RF is insufficient to maintain long-term improvements after FMR, and for whom the therapeutic effectiveness and cost-effectiveness of EP could be increased.
Abstract:
Reducing comparative optimism regarding risk perceptions in traffic accidents has been proven to be particularly difficult (Delhomme, 2000). This is unfortunate because comparative optimism is assumed to impede preventive action. The present study tested whether a road safety training course could reduce drivers' comparative optimism in high control situations. Results show that the training course efficiently reduced comparative optimism in high control, but not in low control situations. Mechanisms underlying this finding and implications for the design of road safety training courses are discussed.
Abstract:
A field survey on schistosomiasis was carried out in 1998 in the municipality of Pedro de Toledo, a low-endemicity area in the state of São Paulo, Brazil. According to the parasitologic Kato-Katz method, the prevalence rate was 1.6%, with an infection intensity of 40.9 eggs per gram of stool. By the immunofluorescence test (IFT) for detection of IgG and IgM antibodies in serum (IgG-IFT and IgM-IFT, respectively), prevalence indices of 33.2% and 33.5% were observed. To assess the impact of the schistosomiasis control program in the area, the parasitologic and serologic data obtained in 1998, analyzed according to age, sex, and residence zone, were compared with previous data obtained in an epidemiologic study carried out in 1980, when the prevalence indices were 22.8% and 55.5% by Kato-Katz and IgG-IFT, respectively. A significant fall in prevalence was observed, indicating that the control measures were effective. Nonetheless, residual transmission was observed, demonstrating the need for a joint effort to include new approaches for better understanding the real situation and improving control of the disease in low-endemicity areas.
Abstract:
Multi-resistant gram-negative rods are important pathogens in intensive care units (ICUs); they cause high mortality rates and require infection control measures to avoid spread to other patients. This prospective study included all patients hospitalized in the Anesthesiology ICU of Hospital São Paulo, using the ICU component of the National Nosocomial Infection Surveillance System (NNIS) methodology, between March 1, 1997 and June 30, 1998. Hospital infections occurring during the first three months after the establishment of prevention and control measures (3/1/97 to 5/31/97) were compared with those of the last three months (3/1/98 to 5/31/98). In this period, 933 NNIS patients were studied, 139 during the first period and 211 in the second. The overall rates of infection by multi-resistant microorganisms in the first and second periods were, respectively: urinary tract infection, 3.28 vs. 2.5 per 1000 patient-days; pneumonia, 2.10 vs. 5.0 per 1000 patient-days; bloodstream infection, 1.09 vs. 2.5 per 1000 patient-days. A comparison between the overall infection rates of the two periods (Wilcoxon test) showed no statistical significance (p = 0.067). The intervention measures effectively decreased the hospital bloodstream infection rate (p < 0.001), which shows that control measures in the ICU can contribute to preventing hospital infections.
Abstract:
Undernutrition is a widespread problem in the intensive care unit and is associated with worse clinical outcomes. A state of negative energy balance increases stress catabolism and is associated with increased morbidity and mortality in ICU patients. Undernutrition-related morbidity is correlated with increased length of hospital stay and health care costs. Enteral nutrition is the recommended feeding route in critically ill patients, but it is often insufficient to cover nutritional needs. Initiating supplemental parenteral nutrition when enteral nutrition is insufficient could optimize nutritional therapy by preventing the onset of early energy deficiency, and thus could reduce morbidity, length of stay and costs, shorten the recovery period and, ultimately, improve quality of life.
Abstract:
A survey was conducted in two pediatric intensive care units in hospitals in Porto Alegre, Brazil, to monitor the main respiratory viruses present in bronchiolitis and/or pneumonia and their involvement in the severity of viral respiratory infections. The prevalence of viral respiratory infection was 38.7%. In bronchiolitis, respiratory syncytial virus (RSV) was detected in 36% of the cases. In pneumonia, the prevalence rates were similar for adenovirus (10.3%) and RSV (7.7%). The viruses detected differed in the frequency of clinical findings indicating greater severity. The frequency of crackles in patients with RSV (47.3%) showed borderline significance (p = 0.055, Fisher's exact test) compared with those with adenovirus (87.5%). The overall case fatality rate in this study was 2.7%, and adenovirus showed a significantly higher case fatality rate (25%) than RSV (2.8%) (p = 0.005). Injected antibiotics were used in 49% of the children with RSV and 60% of those with adenovirus. Adenovirus was not detected in any of the 33 children who underwent oxygen therapy.
Abstract:
Animal toxins are of interest to a wide range of scientists due to their numerous applications in pharmacology, neurology, hematology, medicine, and drug research. This, and to a lesser extent the development of new high-performance tools in transcriptomics and proteomics, has led to an increase in toxin discovery. In this context, providing publicly available data on animal toxins has become essential. The UniProtKB/Swiss-Prot Tox-Prot program (http://www.uniprot.org/program/Toxins) plays a crucial role by providing such access to venom protein sequences and functions from all venomous species. This program has so far curated more than 5000 venom proteins to the high-quality standards of UniProtKB/Swiss-Prot (release 2012_02). Proteins targeted by these toxins are also available in the knowledgebase. This paper describes in detail the type of information provided by UniProtKB/Swiss-Prot for toxins, as well as the structured format of the knowledgebase.
Abstract:
Patients who have overdosed on drugs commonly present to emergency departments, with only the most severe cases requiring intensive care unit (ICU) admission. Such patients typically survive hospitalisation. We studied their longer-term functional outcomes and recovery patterns, which have not been well described. All patients admitted to the 18-bed ICU of a university-affiliated teaching hospital following drug overdoses between 1 January 2004 and 31 December 2006 were identified. With ethical approval, we evaluated the functional outcomes and recovery patterns of the surviving patients 31 months after presentation, by telephone or personal interview. These were recorded as the Glasgow outcome score, the Karnofsky performance index and present work status. During the three years studied, 43 patients were identified as having been admitted to our ICU because of an overdose. The average age was 34 years, 72% were male and the mean APACHE II score was 16.7. Of these, 32 were discharged from hospital alive. Follow-up data were obtained on all of them. At a median of 31 months' follow-up, a further eight had died. Of the 24 survivors, 13 were unemployed, seven employed and four in custody. The median Glasgow outcome score of survivors was 4.5, and their median Karnofsky score was 80. Admission to the ICU for treatment of overdose is associated with a very high risk of death in both the short and long term. While excellent functional recovery is achievable, 16% of survivors were held in custody and 54% were unemployed.
This resource was contributed by The National Documentation Centre on Drug Use.