Abstract:
STUDY OBJECTIVES: Hemispheric stroke in humans is associated with sleep-wake disturbances and sleep electroencephalogram (EEG) changes. The correlation between these changes and stroke extent remains unclear. In the absence of experimental data, we assessed sleep EEG changes after focal cerebral ischemia of different extents in mice. DESIGN: Following electrode implantation and baseline sleep-wake EEG recordings, mice underwent sham surgery (control group), 30 minutes of intraluminal middle cerebral artery (MCA) occlusion (striatal stroke), or distal MCA electrocoagulation (cortical stroke). One and 12 days after stroke, sleep-wake EEG recordings were repeated. The EEG recorded from the healthy hemisphere was analyzed visually and automatically (fast Fourier analysis) according to established criteria. MEASUREMENTS AND RESULTS: Striatal stroke induced an increase in non-rapid eye movement (NREM) sleep and a reduction of rapid eye movement sleep. These changes were detectable during both the light and the dark phases at day 1 and persisted until day 12 after stroke. Cortical stroke induced a less-marked increase in NREM sleep, which was present only at day 1 and during the dark phase. In cortical stroke, the increase in NREM sleep was associated, in the waking EEG power spectra, with an increase in theta activity and a reduction in beta activity. CONCLUSION: Cortical and striatal stroke lead to different sleep-wake EEG changes in mice, which probably reflect variable effects on sleep-promoting and wakefulness-maintaining neuronal networks.
Abstract:
To determine the frequency and predictors of sleep disorders in children with cerebral palsy (CP) we analyzed the responses of 173 parents who had completed the Sleep Disturbance Scale for Children. The study population included 100 males (57.8%) and 73 females (42.2%; mean age 8y 10mo [SD 1y 11mo]; range 6y-11y 11mo). Eighty-three children (48.0%) had spastic diplegia, 59 (34.1%) congenital hemiplegia, 18 (10.4%) spastic quadriplegia, and 13 (7.5%) dystonic/dyskinetic CP. Seventy-three children (42.2%) were in Gross Motor Function Classification System Level I, 33 (19.1%) in Level II, 30 (17.3%) in Level III, 23 (13.3%) in Level IV, and 14 (8.1%) in Level V. Thirty children (17.3%) had epilepsy. A total sleep problem score and six factors indicative of the most common areas of sleep disorder in childhood were obtained. Of the children in our study, 23% had a pathological total sleep score, in comparison with 5% of children in the general population. Difficulty in initiating and maintaining sleep, sleep-wake transition, and sleep breathing disorders were the most frequently identified problems. Active epilepsy was associated with the presence of a sleep disorder (odds ratio [OR]=17.1, 95% confidence interval [CI] 2.5-115.3), as was being the child of a single-parent family (OR=3.9, 95% CI 1.3-11.6). Disorders of initiation and maintenance of sleep were more frequent in children with spastic quadriplegia (OR=12.9, 95% CI 1.9-88.0), those with dyskinetic CP (OR=20.6, 95% CI 3.1-135.0), and those with severe visual impairment (OR=12.5, 95% CI 2.5-63.1). Both medical and environmental factors seem to contribute to the increased frequency of chronic sleep disorders in children with CP.
Abstract:
To evaluate the effect of head injury on plasma levels of ACTH, GH, PRL, and TSH in severely traumatized patients, 36 patients were prospectively studied over 5 consecutive days following injury. They were divided into three groups: Group I, severe isolated head injury (n = 14); Group II, multiple injury combined with severe head injury (n = 12); Group III, multiple injury without head injury (n = 10). No significant trend was observed during the 5 consecutive days. The following changes in plasma levels were observed compared with normal reference values (median values): ACTH was normal in all three groups; PRL was elevated in Group II and normal in the other groups; GH was elevated in all groups; TSH was elevated in Group III and reduced in Groups I and II. Intergroup comparisons showed significantly lower plasma levels of PRL (p < 0.05) and TSH (p < 0.01) in Groups I and II, i.e., head-injured patients, compared with Group III, i.e., traumatized patients without head injury. A relationship was observed between TSH and PRL levels and the severity of head injury, as expressed by Glasgow Coma Score, intracranial pressure levels, and outcome.
Abstract:
During the past twenty years, various instruments have been developed for the assessment of substance use in adolescents, mainly in the United States. However, few of them have been adapted to, and validated in, French-speaking populations. Consequently, although increasing alcohol and drug use among teenagers has become a major concern, the various health and social programs developed in response to this specific problem have received little attention with regard to follow-up and outcome assessment. A standardized multidimensional assessment instrument adapted for adolescents is needed to assess the individual needs of adolescents and assign them to the most appropriate treatment setting, to provide a single measurement within and across health and social systems, and to conduct treatment outcome evaluations. Moreover, having an available instrument makes it possible to develop longitudinal and transcultural research studies. For this reason, a French version of the Adolescent Drug Abuse Diagnosis (ADAD) was developed and validated at the University Child and Adolescent Psychiatric Clinic in Lausanne, Switzerland. This article aims to discuss the methodological issues that we faced when using the ADAD instrument in a 4-year longitudinal study including adolescent substance users. Methodological aspects relating to the content and format of the instrument, the assessment administration and the statistical analyses are discussed.
Abstract:
Evolutionary theory may contribute to practical solutions for control of disease by identifying interventions that may cause pathogens to evolve to reduced virulence. Theory predicts, for example, that pathogens transmitted by water or arthropod vectors should evolve to relatively high levels of virulence because such pathogens can gain the evolutionary benefits of relatively high levels of host exploitation while paying little price from host illness. The entrance of Vibrio cholerae into South America in 1991 has generated a natural experiment that allows testing of this idea by determining whether geographic and temporal variations in toxigenicity correspond to variation in the potential for waterborne transmission. Preliminary studies show such correspondences: toxigenicity is negatively associated with access to uncontaminated water in Brazil; and in Chile, where the potential for waterborne transmission is particularly low, toxigenicity of strains declined between 1991 and 1998. In theory, vector-proofing of houses should be similarly associated with benignity of vectorborne pathogens, such as the agents of dengue, malaria, and Chagas' disease. These preliminary studies draw attention to the need for definitive prospective experiments to determine whether interventions such as provisioning of uncontaminated water and vector-proofing of houses cause evolutionary reductions in virulence.
Abstract:
BACKGROUND & AIMS: The study was designed to investigate and quantify nutritional support, and particularly enteral nutrition (EN), in critically ill patients with severe hemodynamic failure. METHODS: Prospective, descriptive study in a surgical intensive care unit (ICU) of a university teaching hospital: patients aged 67+/-13 yrs (mean+/-SD) admitted after cardiac surgery with extracorporeal circulation, staying 5 days in the ICU with acute cardiovascular failure. Severity of disease was assessed with SAPS II and SOFA scores. Variables were energy delivery and balance, nutrition route, vasopressor doses, and infectious complications. Artificial feeding was delivered according to the ICU protocol. EN was considered from day 2-3. The energy target was set at 25 kcal/kg/day, to be reached stepwise over 5 days. RESULTS: Seventy out of 1114 consecutive patients were studied, aged 67+/-17 years and staying 10+/-7 days in the ICU. Median SAPS II was 43. Nine patients died (13%). All patients had circulatory failure: 18 patients required intra-aortic balloon-pump support (IABP). Norepinephrine was required in 58 patients (83%). Forty patients required artificial nutrition. Energy delivery was very variable. There were no abdominal complications related to EN. On average, 1360+/-620 kcal/day could be delivered enterally during the first 2 weeks, corresponding to 70+/-35% of the energy target. Enteral nutrient delivery was negatively influenced by increasing dopamine and norepinephrine doses, but not by the use of IABP. CONCLUSION: EN is possible in the majority of patients with severe hemodynamic failure, but usually results in hypocaloric feeding. EN should be considered in these patients, with careful abdominal and energy monitoring.
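The energy-target protocol in the abstract above reduces to simple arithmetic (25 kcal per kg body weight per day, reached stepwise over 5 days). A minimal sketch, assuming a linear ramp, since the abstract says only "stepwise" and does not give the exact daily increments:

```python
def daily_energy_targets(weight_kg, target_kcal_per_kg=25, ramp_days=5):
    """Full energy target and an assumed linear ramp-up over ramp_days.

    Assumption: equal daily increments (1/5, 2/5, ... of the full target);
    the study protocol specifies only a stepwise increase over 5 days.
    """
    full_target = weight_kg * target_kcal_per_kg  # kcal/day at steady state
    return [round(full_target * day / ramp_days) for day in range(1, ramp_days + 1)]

# Example: a 70-kg patient has a full target of 1750 kcal/day
print(daily_energy_targets(70))  # [350, 700, 1050, 1400, 1750]
```

Note how the reported mean delivery of 1360 kcal/day against a ~1750 kcal/day target for a 70-kg patient is consistent with the stated 70+/-35% of target.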
Abstract:
BACKGROUND: Recombinant human insulin-like growth factor I (rhIGF-I) is a possible disease modifying therapy for amyotrophic lateral sclerosis (ALS, which is also known as motor neuron disease (MND)). OBJECTIVES: To examine the efficacy of rhIGF-I in affecting disease progression, impact on measures of functional health status, prolonging survival and delaying the use of surrogates (tracheostomy and mechanical ventilation) to sustain survival in ALS. Occurrence of adverse events was also reviewed. SEARCH METHODS: We searched the Cochrane Neuromuscular Disease Group Specialized Register (21 November 2011), CENTRAL (2011, Issue 4), MEDLINE (January 1966 to November 2011) and EMBASE (January 1980 to November 2011) and sought information from the authors of randomised clinical trials and manufacturers of rhIGF-I. SELECTION CRITERIA: We considered all randomised controlled clinical trials involving rhIGF-I treatment of adults with definite or probable ALS according to the El Escorial Criteria. The primary outcome measure was change in Appel Amyotrophic Lateral Sclerosis Rating Scale (AALSRS) total score after nine months of treatment and secondary outcome measures were change in AALSRS at 1, 2, 3, 4, 5, 6, 7, 8, 9 months, change in quality of life (Sickness Impact Profile scale), survival and adverse events. DATA COLLECTION AND ANALYSIS: Each author independently graded the risk of bias in the included studies. The lead author extracted data and the other authors checked them. We generated some missing data by making ruler measurements of data in published graphs. We collected data about adverse events from the included trials. MAIN RESULTS: We identified three randomised controlled trials (RCTs) of rhIGF-I, involving 779 participants, for inclusion in the analysis. In a European trial (183 participants) the mean difference (MD) in change in AALSRS total score after nine months was -3.30 (95% confidence interval (CI) -8.68 to 2.08). 
In a North American trial (266 participants), the MD after nine months was -6.00 (95% CI -10.99 to -1.01). The combined analysis from both RCTs showed a MD after nine months of -4.75 (95% CI -8.41 to -1.09), a significant difference in favour of the treated group. The secondary outcome measures showed non-significant trends favouring rhIGF-I. There was an increased risk of injection site reactions with rhIGF-I (risk ratio 1.26, 95% CI 1.04 to 1.54). A second North American trial (330 participants) used a novel primary end point involving manual muscle strength testing. No differences were demonstrated between the treated and placebo groups in this study. All three trials were at high risk of bias. AUTHORS' CONCLUSIONS: Meta-analysis revealed a significant difference in favour of rhIGF-I treatment; however, the quality of the evidence from the two included trials was low. A third study showed no difference between treatment and placebo. There is no evidence of an increase in survival with rhIGF-I. All three included trials were at high risk of bias.
Abstract:
OBJECTIVE: An implementation study that evaluated the impact of previously adopted guidelines on the clinical practice of medical residents was conducted to improve the recognition and treatment of major depressive disorders (MDDs) in hospitalized patients with somatic diseases. METHODS: Guidelines were implemented in two wards (ENT and oncology) using intranet dissemination, interactive sessions with medical residents, and support material. Discharge letters of 337 and 325 patients, before and after the intervention, respectively, were checked for statement of diagnosis or treatment of MDDs and, in a post hoc analysis, for any mention of psychiatric management. RESULTS: No difference was found in the number of diagnosed or treated MDDs before and after the intervention. However, significantly more statements about psychological status (29/309 vs. 13/327) and its management (36/309 vs. 19/327) were observed after the intervention (P<.01). CONCLUSION: The intervention was not successful in improving the management of MDDs. However, a possible effect on general psychological aspects of medical diseases was observed.
Abstract:
BACKGROUND: The optimal length of stay (LOS) for patients with pulmonary embolism (PE) is unknown. Although reducing LOS is likely to save costs, the effects on patient safety are unclear. We sought to identify patient and hospital factors associated with LOS and assess whether LOS was associated with postdischarge mortality. METHODS: We evaluated patients discharged with a primary diagnosis of PE from 186 acute care hospitals in Pennsylvania (January 2000 through November 2002). We used discrete survival models to examine the association between (1) patient and hospital factors and the time to discharge and (2) LOS and postdischarge mortality within 30 days of presentation, adjusting for patient and hospital factors. RESULTS: Among 15 531 patient discharges with PE, the median LOS was 6 days, and the postdischarge mortality rate was 3.3%. In multivariate analysis, patients from Philadelphia were less likely to be discharged on a given day (odds ratio [OR], 0.82; 95% confidence interval [CI], 0.73-0.93), as were black patients (OR, 0.88; 95% CI, 0.82-0.94). The odds of discharge decreased notably with greater patient severity of illness and in patients without private health insurance. Adjusted postdischarge mortality was significantly higher for patients with an LOS of 4 days or less (OR, 1.55; 95% CI, 1.21-2.00) relative to those with an LOS of 5 to 6 days. CONCLUSIONS: Several hospital and patient factors were independently associated with LOS. Patients with a very short LOS had greater postdischarge mortality relative to patients with a typical LOS, suggesting that physicians may inappropriately select patients with PE for early discharge who are at increased risk of complications.
Abstract:
BACKGROUND: Classically, clinical trials are based on the placebo-control design. Our aim was to analyze the placebo effect in Huntington's disease. METHODS: Placebo data were obtained from an international, longitudinal, placebo-controlled trial for Huntington's disease (European Huntington's Disease Initiative Study Group). One hundred and eighty patients were evaluated using the Unified Huntington Disease Rating Scale over 36 months. A placebo effect was defined as an improvement of at least 50% over baseline scores in the Unified Huntington Disease Rating Scale, and considered clinically relevant when at least 10% of the population met it. RESULTS: Only behavior showed a significant placebo effect, and the proportion of patients with a placebo effect ranged from 16% (first visit) to 41% (last visit). Nondepressed patients with better functional status were most likely to be placebo-responders over time. CONCLUSIONS: In Huntington's disease, behavior seems to be more vulnerable to placebo than overall motor function, cognition, and function.
Abstract:
The aging process is associated with gradual and progressive loss of muscle mass along with lowered strength and physical endurance. This condition, sarcopenia, has been widely observed with aging in sedentary adults. Regular aerobic and resistance exercise programs have been shown to counteract most aspects of sarcopenia. In addition, good nutrition, especially adequate protein and energy intake, can help limit and treat age-related declines in muscle mass, strength, and functional abilities. Protein nutrition in combination with exercise is considered optimal for maintaining muscle function. With the goal of providing recommendations for health care professionals to help older adults sustain muscle strength and function into older age, the European Society for Clinical Nutrition and Metabolism (ESPEN) hosted a Workshop on Protein Requirements in the Elderly, held in Dubrovnik on November 24 and 25, 2013. Based on the evidence presented and discussed, the following recommendations are made: (a) for healthy older people, the diet should provide at least 1.0-1.2 g protein/kg body weight/day; (b) for older people who are malnourished or at risk of malnutrition because they have acute or chronic illness, the diet should provide 1.2-1.5 g protein/kg body weight/day, with even higher intake for individuals with severe illness or injury; and (c) daily physical activity or exercise (resistance training, aerobic exercise) should be undertaken by all older people, for as long as possible.
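The per-kilogram intake ranges above translate directly into per-person daily targets. A minimal sketch of that arithmetic (the function name and status labels are illustrative, not taken from the workshop report):

```python
# ESPEN-recommended intake ranges, in g protein / kg body weight / day
PROTEIN_RANGES = {
    "healthy": (1.0, 1.2),
    "ill_or_malnourished": (1.2, 1.5),  # higher still for severe illness or injury
}

def daily_protein_target(weight_kg, status="healthy"):
    """Return the (low, high) daily protein target in grams for an older adult."""
    low, high = PROTEIN_RANGES[status]
    return (round(weight_kg * low, 1), round(weight_kg * high, 1))

# Example: a healthy 70-kg older adult needs roughly 70-84 g protein/day
print(daily_protein_target(70))                         # (70.0, 84.0)
print(daily_protein_target(60, "ill_or_malnourished"))  # (72.0, 90.0)
```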
Abstract:
OBJECTIVE: To evaluate the relative importance of increased lactate production as opposed to decreased utilization in hyperlactatemic patients, as well as their relation to glucose metabolism. DESIGN: Prospective observational study. SETTING: Surgical intensive care unit of a university hospital. PATIENTS: Seven patients with severe sepsis or septic shock, seven patients with cardiogenic shock, and seven healthy volunteers. INTERVENTIONS: C-labeled sodium lactate was infused at 10 micromol/kg/min and then at 20 micromol/kg/min over 120 mins each. H-labeled glucose was infused throughout. MEASUREMENTS AND MAIN RESULTS: Baseline arterial lactate was higher in septic (3.2 +/- 2.6) and cardiogenic shock patients (2.8 +/- 0.4) than in healthy volunteers (0.9 +/- 0.20 mmol/L, p < .05). Lactate clearance, computed using pharmacokinetic calculations, was similar in septic, cardiogenic shock, and controls, respectively: 10.8 +/- 5.4, 9.6 +/- 2.1, and 12.0 +/- 2.6 mL/kg/min. Endogenous lactate production was determined as the initial lactate concentration multiplied by lactate clearance. It was markedly enhanced in the patients (septic 26.2 +/- 10.5; cardiogenic shock 26.6 +/- 5.1) compared with controls (11.2 +/- 2.7 micromol/kg/min, p < .01). C-lactate oxidation (septic 54 +/- 25; cardiogenic shock 43 +/- 16; controls 65 +/- 15% of a lactate load of 10 micromol/kg/min) and transformation of C-lactate into C-glucose were not different (respectively, 15 +/- 15, 9 +/- 18, and 10 +/- 7%). Endogenous glucose production was markedly increased in the patients (septic 14.8 +/- 1.8; cardiogenic shock 15.0 +/- 1.5) compared with controls (7.2 +/- 1.1 micromol/kg/min, p < .01) and was not influenced by lactate infusion. CONCLUSIONS: In patients suffering from septic or cardiogenic shock, hyperlactatemia was mainly related to increased production, whereas lactate clearance was similar to healthy subjects. 
Increased lactate production was concomitant to hyperglycemia and increased glucose turnover, suggesting that the latter substantially influences lactate metabolism during critical illness.
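The production estimate described above is simply the product of baseline lactate concentration and clearance. A minimal sketch of the calculation; the only subtlety is units, since mmol/L multiplied by mL/kg/min yields micromol/kg/min with no conversion factor:

```python
def endogenous_lactate_production(lactate_mmol_per_L, clearance_mL_per_kg_min):
    """Endogenous lactate production (micromol/kg/min).

    Production = baseline concentration x clearance.
    Units: mmol/L x mL/kg/min = micromol/kg/min, so no scaling is needed.
    """
    return lactate_mmol_per_L * clearance_mL_per_kg_min

# Applied to the group means reported above (note: the study computed
# production per patient, so the reported group mean of 26.2 for septic
# patients differs from the product of the group means shown here):
print(endogenous_lactate_production(3.2, 10.8))  # ~34.6 from septic group means
print(endogenous_lactate_production(0.9, 12.0))  # ~10.8 for healthy controls
```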
Abstract:
The aim of the study was to determine objective radiological signs of danger to life in survivors of manual strangulation and to establish a radiological scoring system for the differentiation between life-threatening and non-life-threatening strangulation by dividing the cross section of the neck into three zones (superficial, middle and deep zone). Forensic pathologists classified 56 survivors of strangulation into life-threatening and non-life-threatening cases by history and clinical examination alone, and two blinded radiologists evaluated the MRIs of the neck. In 15 cases, strangulation was life-threatening (27%), compared with 41 cases in which strangulation was non-life-threatening (73%). The best radiological signs on MRI to differentiate between the two groups were intramuscular haemorrhage/oedema, swelling of platysma and intracutaneous bleeding (all p = 0.02) followed by subcutaneous bleeding (p = 0.034) and haemorrhagic lymph nodes (p = 0.04), all indicating life-threatening strangulation. The radiological scoring system showed a sensitivity and specificity of approximately 70% for life-threatening strangulation, when at least two neck zones were affected. MRI is not only helpful in assessing the severity of strangulation, but is also an excellent documentation tool that is even admissible in court.