Abstract:
Objective: A single bolus of etomidate inhibits a mitochondrial enzyme involved in cortisol synthesis. In our institution, every patient scheduled for cardiac surgery receives etomidate at induction of anaesthesia. The objective of this study was to determine the incidence of adrenal dysfunction in cardiac surgery patients requiring high doses of norepinephrine during the postoperative period. Study design: Retrospective descriptive study in the intensive care unit of a university hospital. Patients and methods: Sixty-three patients admitted to the intensive care unit after cardiac surgery and requiring more than 0.2 µg/kg per minute of norepinephrine during the first 48 postoperative hours were studied. Absolute adrenal insufficiency was defined as a baseline cortisol below 414 nmol/l (15 µg/dl), and relative adrenal insufficiency as a baseline cortisol between 414 nmol/l (15 µg/dl) and 938 nmol/l (34 µg/dl) with an increase in cortisol (60 minutes after a stimulation test with 250 µg of synthetic corticotropin) of less than 250 nmol/l (9 µg/dl). Results: Fourteen patients (22%) had normal adrenal function, 10 (16%) absolute adrenal insufficiency and 39 (62%) relative adrenal insufficiency. All patients received steroid substitution, with no difference in clinical course between the groups. Conclusion: The incidence of adrenal insufficiency is high in patients who received an etomidate bolus at induction for cardiac surgery with cardiopulmonary bypass and developed postoperative circulatory failure.
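For readers who want the classification rule in executable form, the following is a minimal Python sketch of the cut-offs quoted above; the handling of baseline cortisol above 938 nmol/l (treated as normal) and all function and variable names are illustrative assumptions, not the authors' code.

def classify_adrenal_function(baseline_nmol_l, post_acth_nmol_l):
    """Classify adrenal function using the thresholds quoted in the abstract.

    baseline_nmol_l:  basal cortisol (nmol/l)
    post_acth_nmol_l: cortisol 60 min after 250 ug synthetic corticotropin (nmol/l)
    """
    rise = post_acth_nmol_l - baseline_nmol_l
    if baseline_nmol_l < 414:                      # below 15 ug/dl
        return "absolute adrenal insufficiency"
    if baseline_nmol_l <= 938 and rise < 250:      # 15-34 ug/dl with rise < 9 ug/dl
        return "relative adrenal insufficiency"
    return "normal adrenal function"               # assumption: everything else is normal


print(classify_adrenal_function(350, 600))   # absolute adrenal insufficiency
print(classify_adrenal_function(500, 700))   # relative adrenal insufficiency
print(classify_adrenal_function(500, 900))   # normal adrenal function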
Abstract:
Introduction: Patients requiring prolonged intensive care with a complicated course develop an intense metabolic response, generally characterised by hypermetabolism and protein catabolism. The severity of their illness exposes them to malnutrition, mainly due to insufficient nutritional intake, resulting in a negative energy balance. In many intensive care units, patient nutrition is not regarded as a priority of care. By conducting a prospective observational study analysing the relationship between energy balance and clinical outcome in patients with prolonged intensive care stays, we aimed to change this attitude and to demonstrate the deleterious effect of malnutrition in these patients. Methods: Over a 2-year period, all patients with an intensive care stay of 5 days or more were enrolled. Each patient's energy requirements were determined either by indirect calorimetry or by a weight-based formula (30 kcal/kg/day). Patients who underwent indirect calorimetry also served to verify the accuracy of the formula. Age, sex, preoperative weight, height, and body mass index (BMI) were recorded. Energy was delivered either as nutritional energy (enteral, parenteral or combined nutrition) or as non-nutritional energy (infusions: glucose solutions, non-nutritional lipids). Nutritional data (theoretical target, prescribed target, nutritional energy, non-nutritional energy, total energy, nutritional energy balance, total energy balance) and clinical course data (days of mechanical ventilation, number of infections, antibiotic use, length of stay, neurological, respiratory, gastrointestinal, cardiovascular, renal and hepatic complications, intensive care severity scores, haematological, serum and microbiological values) were analysed for each of the 669 intensive care days accumulated by a total of 48 patients. Results: 48 patients aged 57 ± 16 years with stays ranging from 5 to 49 days (reason for admission: multiple trauma 10; cardiac surgery 13; respiratory failure 7; gastrointestinal disease 3; sepsis 3; transplantation 4; other 8) were included. Although we could not demonstrate a relationship between energy balance, and more specifically energy deficit, and mortality, there was a highly significant relationship between energy deficit and morbidity, namely complications and infections, which naturally prolong the length of stay. Moreover, although the study involved no intervention and we cannot claim a cause-and-effect relationship, multiple regression analysis showed that the most reliable prognostic factor was precisely the energy balance, outperforming the scores usually used in intensive care. Outcome was independent of age, sex and preoperative nutritional status.
The study did not collect economic data; we therefore cannot assert that the increased costs generated by a prolonged intensive care stay are induced by an energy deficit, even though common sense suggests that a shorter stay costs less. This study also draws attention to the origin of the energy deficit: it develops during the first week in intensive care and could therefore be prevented by early nutritional intervention, whereas current recommendations advocate energy delivery, in the form of artificial nutrition, only from 48 hours after intensive care admission. Conclusions: The study shows that, for the most severely ill intensive care patients, energy balance should be considered an important objective of care, requiring the application of an early nutrition protocol. Finally, since a patient's course at admission is often unpredictable and the deficit develops within the first week, it is legitimate to ask whether this protocol should be applied to all intensive care patients from admission. Summary Background and aims: Critically ill patients with complicated evolution are frequently hypermetabolic, catabolic, and at risk of underfeeding. The study aimed at assessing the relationship between energy balance and outcome in critically ill patients. Methods: Prospective observational study conducted in consecutive patients staying 5 days or more in the surgical ICU of a university hospital. Demographic data, time to feeding, route, energy delivery, and outcome were recorded. Energy balance was calculated as energy delivery minus target. Data are means ± SD; linear regressions between energy balance and outcome variables. Results: Forty-eight patients aged 57 ± 16 years were investigated; complete data were available for 669 days. Mechanical ventilation lasted 11 ± 8 days, ICU stay was 15 ± 9 days, and 30-day mortality was 38%. Time to feeding was 3.1 ± 2.2 days. Enteral nutrition was the most frequent route, with 433 days. Mean daily energy delivery was 1090 ± 930 kcal. Combining enteral and parenteral nutrition achieved the highest energy delivery. Cumulated energy balance was −12,600 ± 10,520 kcal and correlated with complications (P<0.001), already after 1 week. Conclusion: Negative energy balances were correlated with an increasing number of complications, particularly infections. Energy debt appears as a promising tool for nutritional follow-up, which should be further tested. Delaying initiation of nutritional support exposes the patients to energy deficits that cannot be compensated later on.
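Since the study's central quantity is the energy balance (energy delivered minus the target, the target coming from indirect calorimetry or from 30 kcal/kg/day), a minimal Python sketch of that calculation follows; it is illustrative only, with hypothetical function names and example numbers, not the authors' analysis code.

def daily_energy_target(weight_kg, calorimetry_kcal=None):
    """Daily target: indirect calorimetry when available, otherwise 30 kcal/kg/day."""
    return calorimetry_kcal if calorimetry_kcal is not None else 30.0 * weight_kg


def cumulative_energy_balance(weight_kg, daily_delivery_kcal, calorimetry_kcal=None):
    """Cumulative balance = sum over ICU days of (energy delivered - daily target)."""
    target = daily_energy_target(weight_kg, calorimetry_kcal)
    return sum(delivered - target for delivered in daily_delivery_kcal)


# Example: 70 kg patient, one week of progressively increased feeding (kcal/day)
week = [0, 400, 800, 1200, 1500, 1800, 2000]
print(cumulative_energy_balance(70, week))   # -7000.0 kcal deficit after 7 days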
Abstract:
BACKGROUND: The emergency department has been identified as an area within the health care sector with the highest reports of violence. The best way to control violence is to prevent it before it becomes an issue. Ideally, to prevent violent episodes we should eliminate all triggers of frustration and violence. Our study aims to assess the impact of a multi-faceted quality improvement program aimed at preventing incivility and violence against healthcare professionals working in the ophthalmological emergency department of a teaching hospital. METHODS/DESIGN: This is a single-center, prospective, controlled time-series study with an alternate-month design. The prevention program is based on the successive implementation of five complementary interventions: a) an organizational approach with a standardized triage algorithm and a patient waiting-number screen, b) an environmental approach with clear signage of the premises, c) an educational approach with informational videos for patients and accompanying persons in waiting rooms, d) a human approach with a mediator in waiting rooms, and e) a security approach with surveillance cameras linked to hospital security. The primary outcome is the rate of incivility or violence by patients, or those accompanying them, against healthcare staff. All patients admitted to the ophthalmological emergency department, and those accompanying them, will be enrolled. In all, 45,260 patients will be included over a 24-month period. The unit of analysis will be the patient admitted to the emergency department. Data analysis will be blinded to allocation, but due to the nature of the intervention, physicians and patients will not be blinded. DISCUSSION: The strengths of this study include the active solicitation of event reporting, its prospective design, and the fact that it enables assessment of each of the interventions that make up the program. The challenge lies in identifying effective interventions, adapting them to the context of care in an emergency department, and thoroughly assessing their efficacy with a high level of proof. The study has been registered as a cRCT at clinicaltrials.gov (identifier: NCT02015884).
Abstract:
RATIONALE: Many sources of conflict exist in intensive care units (ICUs). Few studies recorded the prevalence, characteristics, and risk factors for conflicts in ICUs. OBJECTIVES: To record the prevalence, characteristics, and risk factors for conflicts in ICUs. METHODS: One-day cross-sectional survey of ICU clinicians. Data on perceived conflicts in the week before the survey day were obtained from 7,498 ICU staff members (323 ICUs in 24 countries). MEASUREMENTS AND MAIN RESULTS: Conflicts were perceived by 5,268 (71.6%) respondents. Nurse-physician conflicts were the most common (32.6%), followed by conflicts among nurses (27.3%) and staff-relative conflicts (26.6%). The most common conflict-causing behaviors were personal animosity, mistrust, and communication gaps. During end-of-life care, the main sources of perceived conflict were lack of psychological support, absence of staff meetings, and problems with the decision-making process. Conflicts perceived as severe were reported by 3,974 (53%) respondents. Job strain was significantly associated with perceiving conflicts and with greater severity of perceived conflicts. Multivariate analysis identified 15 factors associated with perceived conflicts, of which 6 were potential targets for future intervention: staff working more than 40 h/wk, more than 15 ICU beds, caring for dying patients or providing pre- and postmortem care within the last week, symptom control not ensured jointly by physicians and nurses, and no routine unit-level meetings. CONCLUSIONS: Over 70% of ICU workers reported perceived conflicts, which were often considered severe and were significantly associated with job strain. Workload, inadequate communication, and end-of-life care emerged as important potential targets for improvement.
Abstract:
The interest in alternative medicine (AM) is growing. In the USA and Canada, studies showed that 34% of adults and 11% of children use AM. In a prospective cohort study, we investigated the interest in AM among parents of critically ill children in the paediatric Intensive Care Unit (ICU) of a university hospital. From January 1996 to April 1997, we distributed questionnaires to the parents of critically ill children. These strictly anonymous questionnaires were completed at home and returned by mail. Exclusion criteria were short (<1 day) or repeated hospitalizations, and insufficient proficiency in the German language. The inclusion criteria were fulfilled by 591 patients; 561 (95%) received the questionnaire and 289 (52%) were returned. Of the respondents, 70% would appreciate AM as a complementary therapy in the ICU, 23% found AM equally or more important than conventional medicine, whereas only 7% regarded AM as unimportant. In the ICU, 18% used AM; surprisingly, 41% of them did not discuss it with physicians or nurses. An additional 21% would have liked to use AM, but did not do so. Typically, AM users also administered AM at home to their children and themselves; their children were, however, older. CONCLUSIONS: A substantial proportion of parents used alternative medicine in the intensive care unit, or would have liked to do so. However, few had the confidence to discuss this wish with the medical personnel. This suggests that alternative medicine is of great interest, even in an intensive care unit. Nevertheless, discussion about alternative medicine seems to be taboo in doctor-patient relations.
Abstract:
BACKGROUND: The objective of this study was to perform a cost-effectiveness analysis comparing intermittent with continuous renal replacement therapy (IRRT versus CRRT) as initial therapy for acute kidney injury (AKI) in the intensive care unit (ICU). METHODS: Assuming some patients would potentially be eligible for either modality, we modeled life years gained, quality-adjusted life years (QALYs) and healthcare costs for a cohort of 1000 IRRT patients and a cohort of 1000 CRRT patients. We used 1-year, 5-year and lifetime horizons. A Markov model with two health states for AKI survivors was designed: dialysis dependence and dialysis independence. We applied Weibull regression to published estimates to fit survival curves for CRRT and IRRT patients and to fit the proportion of dialysis dependence among CRRT and IRRT survivors. We then applied a risk ratio reported in a large retrospective cohort study to the fitted CRRT estimates in order to determine the proportion of dialysis dependence among IRRT survivors. We conducted sensitivity analyses based on a range of differences in daily implementation cost between CRRT and IRRT (base case: a CRRT day $632 more expensive than an IRRT day; range from $200 to $1000) and a range of risk ratios for dialysis dependence with CRRT as compared with IRRT (from 0.65 to 0.95; base case: 0.80). RESULTS: Continuous renal replacement therapy was associated with a marginally greater gain in QALYs as compared with IRRT (1.093 versus 1.078). Despite higher upfront costs for CRRT in the ICU ($4046 for CRRT versus $1423 for IRRT on average), the 5-year total cost, including the cost of dialysis dependence, was lower for CRRT ($37,780 for CRRT versus $39,448 for IRRT on average). The base-case incremental cost-effectiveness analysis showed that CRRT dominated IRRT. This dominance was confirmed by extensive sensitivity analysis. CONCLUSIONS: Initial CRRT is cost-effective compared with initial IRRT by reducing the rate of long-term dialysis dependence among critically ill AKI survivors.
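The comparison rests on cohort-level costs, QALYs, and a dominance/ICER test; the sketch below illustrates that logic in Python. Only the upfront ICU costs ($4046 and $1423) and the QALY estimates (1.093 versus 1.078) come from the abstract; the dialysis-dependence proportions, annual dialysis cost and undiscounted 5-year horizon are placeholder assumptions, and the actual study used a Markov model with Weibull-fitted survival rather than this simplified calculation.

def cohort_outcome(n_patients, upfront_cost, p_dialysis_dependent,
                   annual_dialysis_cost, qaly_per_patient, years=5):
    """Total cost and QALYs for one modality over a fixed horizon (no discounting)."""
    dependent = n_patients * p_dialysis_dependent
    total_cost = n_patients * upfront_cost + dependent * annual_dialysis_cost * years
    return total_cost, n_patients * qaly_per_patient


# Upfront costs and QALYs from the abstract; the remaining inputs are placeholders.
crrt_cost, crrt_qaly = cohort_outcome(1000, 4046, 0.08, 70000, 1.093)
irrt_cost, irrt_qaly = cohort_outcome(1000, 1423, 0.10, 70000, 1.078)

delta_cost, delta_qaly = crrt_cost - irrt_cost, crrt_qaly - irrt_qaly
if delta_cost <= 0 and delta_qaly >= 0:
    print("CRRT dominates IRRT (cheaper and more effective)")
else:
    print("ICER:", delta_cost / delta_qaly, "$ per QALY gained")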
Abstract:
Background Patients with cirrhosis in Child-Pugh class C or those in class B who have persistent bleeding at endoscopy are at high risk for treatment failure and a poor prognosis, even if they have undergone rescue treatment with a transjugular intrahepatic portosystemic shunt (TIPS). This study evaluated the earlier use of TIPS in such patients. Methods We randomly assigned, within 24 hours after admission, a total of 63 patients with cirrhosis and acute variceal bleeding who had been treated with vasoactive drugs plus endoscopic therapy to treatment with a polytetrafluoroethylene-covered stent within 72 hours after randomization (early-TIPS group, 32 patients) or continuation of vasoactive-drug therapy, followed after 3 to 5 days by treatment with propranolol or nadolol and long-term endoscopic band ligation (EBL), with insertion of a TIPS if needed as rescue therapy (pharmacotherapy-EBL group, 31 patients). Results During a median follow-up of 16 months, rebleeding or failure to control bleeding occurred in 14 patients in the pharmacotherapy-EBL group as compared with 1 patient in the early-TIPS group (P=0.001). The 1-year actuarial probability of remaining free of this composite end point was 50% in the pharmacotherapy-EBL group versus 97% in the early-TIPS group (P<0.001). Sixteen patients died (12 in the pharmacotherapy-EBL group and 4 in the early-TIPS group, P=0.01). The 1-year actuarial survival was 61% in the pharmacotherapy-EBL group versus 86% in the early-TIPS group (P<0.001). Seven patients in the pharmacotherapy-EBL group received TIPS as rescue therapy, but four died. The number of days in the intensive care unit and the percentage of time in the hospital during follow-up were significantly higher in the pharmacotherapy-EBL group than in the early-TIPS group. No significant differences were observed between the two treatment groups with respect to serious adverse events. Conclusions In these patients with cirrhosis who were hospitalized for acute variceal bleeding and at high risk for treatment failure, the early use of TIPS was associated with significant reductions in treatment failure and in mortality. (Current Controlled Trials number, ISRCTN58150114.)
Abstract:
OBJECTIVE To determine the prevalence and clinical significance of hepatitis G virus (HGV) infection in a large cohort of patients with primary Sjögren's syndrome (SS). PATIENTS AND METHODS The study included 100 consecutive patients (92 female and eight male), with a mean age of 62 years (range 31-80), who were prospectively seen in our unit. All patients fulfilled the European Community criteria for SS and underwent a complete history, physical examination, and biochemical and immunological evaluation for liver disease. Two hundred volunteer blood donors were also studied. The presence of HGV-RNA was investigated in the serum of all patients and donors. Additionally, HBsAg and antibodies to hepatitis C virus were determined. RESULTS Four patients (4%) and six volunteer blood donors (3%) presented HGV-RNA sequences in serum. HGV infection was associated with biochemical signs of liver involvement in two (50%) patients. When compared with primary SS patients without HGV infection, no significant differences were found in terms of clinical or immunological features. HCV coinfection occurred in one (25%) of the four patients with HGV infection. CONCLUSION The prevalence of HGV infection in patients with primary SS is low in the geographical area of the study, and HCV coinfection is very uncommon. HGV infection alone does not seem to be an important cause of chronic liver injury in patients with primary SS in this area.
Abstract:
PURPOSE OF REVIEW: Despite progress in the understanding of the pathophysiology of invasive candidiasis, and the development of new classes of well-tolerated antifungals, invasive candidiasis remains a disease that is difficult to diagnose and associated with significant morbidity and mortality. Early antifungal treatment may be useful in selected groups of patients, who remain difficult to identify prospectively. The purpose of this review is to summarize the recent development of risk-identification strategies targeting early identification of ICU patients likely to benefit from preemptive or empirical antifungal treatment. RECENT FINDINGS: Combinations of different risk factors are useful in identifying high-risk patients. Among the many risk factors predisposing to invasive candidiasis, colonization has been identified as one of the most important. In contrast to prospective surveillance of the dynamics of colonization (colonization index), integration of clinical colonization status into risk-score models significantly improves their accuracy in identifying patients at risk of invasive candidiasis. SUMMARY: To date, despite limited prospective validation, clinical models targeting early identification of patients at risk of developing invasive candidiasis represent a major advance in the management of these patients. Moreover, large clinical studies using such risk scores or predictive rules are underway.
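As a reminder of how the colonization index mentioned above is commonly computed (the ratio of distinct non-blood body sites colonized by Candida to the number of sites cultured, with 0.5 or more usually taken as the high-risk threshold), here is a minimal sketch; the function name and threshold handling are illustrative and not taken from the review itself.

def colonization_index(sites_cultured, sites_colonized):
    """Candida colonization index: colonized non-blood sites / sites cultured."""
    if sites_cultured <= 0:
        raise ValueError("at least one site must be cultured")
    return sites_colonized / sites_cultured


# Example: Candida grown at 3 of 5 surveillance sites -> index 0.6
index = colonization_index(5, 3)
print(index, "high risk" if index >= 0.5 else "lower risk")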
Abstract:
OBJECTIVES: To determine the distribution of exercise stages of change in a rheumatoid arthritis (RA) cohort, and to examine patients' perceptions of exercise benefits, barriers, and their preferences for exercise. METHODS: One hundred and twenty RA patients who attended the Rheumatology Unit of a University Hospital were asked to participate in the study. Those who agreed were administered a questionnaire to determine their exercise stage of change, their perceived benefits and barriers to exercise, and their preferences for various features of exercise. RESULTS: Eighty-nine (74%) patients were finally included in the analyses. Their mean age was 58.4 years, mean RA duration 10.1 years, and mean disease activity score 2.8. The distribution of exercise stages of change was as follows: precontemplation (n = 30, 34%), contemplation (n = 11, 13%), preparation (n = 5, 6%), action (n = 2, 2%), and maintenance (n = 39, 45%). Compared to patients in the maintenance stage of change, precontemplators exhibited different demographic and functional characteristics and reported fewer exercise benefits and more barriers to exercise. Most participants preferred exercising alone (40%), at home (29%), at a moderate intensity (64%), with advice provided by a rheumatologist (34%) or a specialist in exercise and RA (34%). Walking was by far the preferred type of exercise, in both the summer (86%) and the winter (51%). CONCLUSIONS: Our cohort of patients with RA was essentially distributed across the precontemplation and maintenance exercise stages of change. These subgroups of patients exhibit psychological and functional differences that make their needs different in terms of exercise counselling.
Abstract:
OBJECTIVE: Enteral glutamine supplementation and antioxidants have been shown to be beneficial in some categories of critically ill patients. This study investigated the impact on organ function and clinical outcome of an enteral solution enriched with glutamine and antioxidant micronutrients in patients with trauma and with burns. METHODS: This was a prospective study with a historical control group, including critically ill burned and major trauma patients (n = 86: 40 patients with burns and 46 with trauma; 43 in each of the intervention and control groups) on admission to an intensive care unit in a university hospital (matched for severity, age, and sex). The intervention aimed to deliver a 500-mL enteral solution containing 30 g of glutamine per day, plus selenium, zinc, and vitamin E (Gln-AOX), for a maximum of 10 d, in addition to control treatment consisting of enteral nutrition in all patients and intravenous trace elements in all burn patients. RESULTS: Patients were comparable at baseline, except for more inhalation injuries in the burn-Gln-AOX group (P = 0.10) and greater neurologic impairment in the trauma-Gln-AOX group (P = 0.022). Intestinal tolerance was good. The full 500-mL dose was rarely delivered, resulting in a low mean daily glutamine dose (22 g for burn patients and 16 g for trauma patients). In burn patients, intravenous trace element delivery was superior to the enteral dose. The evolution of the Sequential Organ Failure Assessment score and other outcome variables did not differ significantly between groups. C-reactive protein decreased faster in the Gln-AOX group. CONCLUSION: The Gln-AOX supplement was well tolerated in critically ill injured patients but did not improve outcome significantly. The delivery of glutamine below the recommended dose of 0.5 g/kg, together with high intravenous trace element substitution doses in burn patients, is likely to have blunted its impact by not reaching an efficient treatment dose. Further trials testing higher doses of glutamine are required.
Abstract:
Hospitalization in older patients is frequently associated with functional decline. Hospital factors and ill-adapted care processes contribute to this decline. Acute care units specifically designed for older patients can prevent functional decline. These units usually include comprehensive geriatric evaluation, an interdisciplinary meeting, protocols for the treatment of geriatric syndromes, and specific teaching for the care team. Overall, patient care is organized to preserve and improve functional performance. This article presents a pilot unit inspired by this model.
Abstract:
Meropenem, a carbapenem antibiotic with a broad spectrum of antibacterial activity, is administered in the medical intensive care unit to critically ill patients undergoing continuous veno-venous haemodiafiltration (CVVHDF). However, limited data are available to substantiate rational dosing decisions in this setting. In an attempt to refine our knowledge and propose a rationally designed dosage regimen, we have developed an HPLC method to determine meropenem after solid-phase extraction (SPE) of plasma and dialysate fluids obtained from patients under CVVHDF. The assay comprises the simultaneous measurement of meropenem's open-ring metabolite UK-1a, whose fate has never been studied in CVVHDF patients. The clean-up procedure involved SPE on a C18 cartridge. Matrix components were eliminated with phosphate buffer pH 7.4 followed by 15:85 MeOH-phosphate buffer pH 7.4. Meropenem and UK-1a were subsequently desorbed with MeOH. The eluates were evaporated under nitrogen at room temperature (RT) and reconstituted in phosphate buffer pH 7.4. Separation was performed at RT on a Nucleosil 100-5 microm C18 AB cartridge column (125 x 4 mm I.D.) equipped with a guard column (8 x 4 mm I.D.), with UV-DAD detection set at 208 nm. The mobile phase flow rate was 1 ml min(-1), using a step-wise gradient elution program: %MeOH/0.005 M tetrabutylammonium chloride pH 7.4; 10/90-50/50 in 27 min. Over the range of 5-100 microg ml(-1), the regression coefficients of the calibration curves (plasma and dialysate) were >0.998. The absolute extraction recoveries of meropenem and UK-1a were stable, ranging from 88-93% (plasma) to 72-77% (filtrate-dialysate) for meropenem, and from 95-104% to 75-82% for UK-1a. In plasma and filtrate-dialysate, respectively, the mean intra-assay precision was 4.1 and 2.6% for meropenem and 4.2 and 3.7% for UK-1a. The inter-assay variability was 2.8 and 3.6% for meropenem and 2.3 and 2.8% for UK-1a. The accuracy was satisfactory for both meropenem and UK-1a, with deviations never exceeding 9.0% of the nominal concentrations. The stability of meropenem, studied in biological samples left at RT and at +4 degrees C, was satisfactory, with <5% degradation after 1.5 h in blood, but degradation reached 22% in filtrate-dialysate samples stored at RT for 8 h, precluding accurate measurement of meropenem excreted unchanged in the filtrate-dialysate left at RT during the CVVHDF procedure. The method reported here enables accurate measurement of meropenem in critically ill patients under CVVHDF, making dosage individualisation possible in such patients. The levels of the metabolite UK-1a encountered in this population were higher than those observed in healthy volunteers but similar to those observed in patients with renal impairment undergoing haemodialysis.
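The validation figures reported above (extraction recovery, intra- and inter-assay precision, accuracy) are standard ratios; the short Python sketch below shows how such metrics are computed, using made-up quality-control numbers purely for illustration, not the authors' validation data or script.

from statistics import mean, stdev


def extraction_recovery(measured, spiked):
    """Absolute extraction recovery (%) = measured / spiked concentration."""
    return 100.0 * measured / spiked


def precision_cv(replicates):
    """Intra- or inter-assay precision as coefficient of variation (%)."""
    return 100.0 * stdev(replicates) / mean(replicates)


def accuracy_deviation(measured_mean, nominal):
    """Accuracy as % deviation from the nominal concentration."""
    return 100.0 * (measured_mean - nominal) / nominal


# Made-up QC replicates at a nominal 50 ug/ml level
qc = [48.9, 51.2, 49.5, 50.8, 49.9]
print(round(precision_cv(qc), 1))                   # about 1.9 % CV
print(round(accuracy_deviation(mean(qc), 50), 1))   # about 0.1 % deviation
print(round(extraction_recovery(45.1, 50), 1))      # 90.2 % recovery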
Abstract:
The authors evaluated ten years of surgical intensive care at the University Hospital of Lausanne (CHUV). Irreversible coagulopathy (IC) is the predominant cause of death in polytrauma patients. Acidosis, hypothermia and coagulation disorders are crucial elements of this coagulopathy. The authors looked for a criterion allowing identification of patients dying of IC. In a retrospective study, laboratory results for pH, prothrombin time (TP), PTT and platelet count, together with the number of blood transfusion units required, were checked at each major step of the primary evaluation and treatment of the polytrauma patients. These results were classified as critical according to criteria from the literature (30). The authors conclude that the appearance of a third critical value may be useful to identify the polytrauma patient at risk of dying of IC. This criterion may also guide the trauma team in selecting a damage control surgery (DCS) approach. The criterion was then introduced into an algorithm involving the Emergency Department, the operating room and the Intensive Care Unit, providing a new tool for directing the patient to the appropriate hospital structure at the crucial moment.
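The decision rule proposed by the authors (flagging the patient once a third laboratory value becomes critical) can be expressed as a short counting sketch; the cut-offs that define "critical" come from the cited literature and are not reproduced here, and the function below is only an illustration of the rule, not the authors' algorithm.

def damage_control_indicated(critical_flags):
    """True once at least three monitored values are critical.

    critical_flags: booleans for each monitored parameter (e.g. pH, prothrombin
    time, PTT, platelet count, transfusion requirement) at a given step of the
    primary evaluation; the individual cut-offs are defined in the literature.
    """
    return sum(bool(flag) for flag in critical_flags) >= 3


# pH and PTT critical only -> False; a third critical value (platelets) -> True
print(damage_control_indicated([True, False, True, False, False]))
print(damage_control_indicated([True, False, True, True, False]))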
Abstract:
BACKGROUND: Upper limb paresis remains a relevant challenge in stroke rehabilitation. AIM: To evaluate whether adding mirror therapy (MT) to conventional therapy (CT) can improve motor recovery of the upper limb in subacute stroke patients. DESIGN: Prospective, single-center, single-blind, randomised, controlled trial. SETTING: Subacute stroke patients referred to a Physical and Rehabilitation Medicine Unit between October 2009 and August 2011. POPULATION: Twenty-six subacute stroke patients (time from stroke <4 weeks) with upper limb paresis (Motricity Index ≤77). METHODS: Patients were randomly allocated to the MT (N.=13) or the CT group (N.=13). Both followed a comprehensive rehabilitative treatment. In addition, the MT group had 30 minutes of MT while the CT group had 30 minutes of sham therapy. The Action Research Arm Test (ARAT) was the primary outcome measure. The Motricity Index (MI) and the Functional Independence Measure (FIM) were the secondary outcome measures. RESULTS: After one month of treatment, patients in both groups showed statistically significant improvements in all the variables measured (P<0.05). Moreover, patients in the MT group had greater improvements in the ARAT, MI and FIM values compared with the CT group (P<0.01, Glass's Δ effect size: 1.18). No relevant adverse event was recorded during the study. CONCLUSION: MT is a promising and easy method to improve motor recovery of the upper limb in subacute stroke patients. CLINICAL REHABILITATION IMPACT: While MT use has been advocated for acute patients with no or negligible motor function, it can be usefully extended to patients who show partial motor recovery. The ease of implementation, the low cost and the acceptability make this therapy a useful tool in stroke rehabilitation.
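For reference, the Glass's Δ effect size reported above is the between-group difference in means divided by the standard deviation of the control group; the short sketch below computes it on made-up scores, purely to show the formula, not the trial data.

from statistics import mean, stdev


def glass_delta(treatment_scores, control_scores):
    """Glass's delta: (mean of treatment - mean of control) / SD of the control group."""
    return (mean(treatment_scores) - mean(control_scores)) / stdev(control_scores)


# Made-up change scores for illustration only (not the trial data)
mt_group = [22, 25, 19, 28, 24]
ct_group = [12, 15, 10, 17, 13]
print(round(glass_delta(mt_group, ct_group), 2))   # effect size on these fake data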