828 results for Randomized Controlled Trials As Topic
Abstract:
BACKGROUND: Frequent emergency department users represent a small number of patients but account for a large number of emergency department visits. They should be a focus because they are often vulnerable patients with many risk factors affecting their quality of life (QoL). Case management interventions have resulted in a significant decrease in emergency department visits, but association with QoL has not been assessed. One aim of our study was to examine to what extent an interdisciplinary case management intervention, compared to standard emergency care, improved frequent emergency department users' QoL. METHODS: Data are part of a randomized, controlled trial designed to improve frequent emergency department users' QoL and use of health-care resources at the Lausanne University Hospital, Switzerland. In total, 250 frequent emergency department users (≥5 attendances during the previous 12 months; ≥ 18 years of age) were interviewed between May 2012 and July 2013. Following an assessment focused on social characteristics; social, mental, and somatic determinants of health; risk behaviors; health care use; and QoL, participants were randomly assigned to the control or the intervention group (n=125 in each group). The final sample included 194 participants (20 deaths, 36 dropouts, n=96 in the intervention group, n=99 in the control group). Participants in the intervention group received a case management intervention by an interdisciplinary, mobile team in addition to standard emergency care. The case management intervention involved four nurses and a physician who provided counseling and assistance concerning social determinants of health, substance-use disorders, and access to the health-care system. The participants' QoL was evaluated by a study nurse using the WHOQOL-BREF five times during the study (at baseline, and at 2, 5.5, 9, and 12 months). 
Four of the six WHOQOL dimensions of QoL were retained here: physical health, psychological health, social relationship, and environment, with scores ranging from 0 (low QoL) to 100 (high QoL). A linear, mixed-effects model with participants as a random effect was run to analyze the change in QoL over time. The effects of time, participants' group, and the interaction between time and group were tested. These effects were controlled for sociodemographic characteristics and health-related variables (i.e., age, gender, education, citizenship, marital status, type of financial resources, proficiency in French, somatic and mental health problems, and behaviors at risk).
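The mixed-effects analysis described above (a random intercept per participant; fixed effects for time, group, and their interaction, adjusted for covariates) can be sketched in Python with statsmodels. This is a minimal illustration on simulated data: the column names, the simulated QoL scores, and the single `age` covariate standing in for the full covariate set are hypothetical, not the trial's dataset.

```python
# Sketch of a linear mixed-effects model of the kind described:
# QoL score ~ time * group + covariates, random intercept per participant.
# All data below are simulated; column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, times = 40, [0, 2, 5.5, 9, 12]          # participants, assessment months
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n), len(times)),
    "time": np.tile(times, n),
    "group": np.repeat(rng.integers(0, 2, n), len(times)),  # 1 = intervention
    "age": np.repeat(rng.integers(18, 80, n), len(times)),
})
# Simulated QoL (0-100): small group-by-time effect plus per-person noise.
df["qol"] = (50 + 0.5 * df["time"] + 1.0 * df["group"] * df["time"]
             + np.repeat(rng.normal(0, 5, n), len(times))
             + rng.normal(0, 3, len(df))).clip(0, 100)

# Random intercept for each participant; fixed effects for time, group,
# their interaction, and a covariate (age stands in for the full set).
model = smf.mixedlm("qol ~ time * group + age", df, groups=df["pid"])
result = model.fit()
print(result.summary())
```

The `time:group` coefficient is the quantity of interest here: it estimates how much faster (or slower) QoL changes per month in the intervention group than in the control group.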
Abstract:
Chronically homeless individuals with alcohol dependence experience severe alcohol-related consequences. It is therefore important to identify factors that might be associated with reduced alcohol-related harm, such as the use of safer-drinking strategies. Whereas the effectiveness of safer-drinking strategies has been well-documented among young adults, no studies have explored this topic among more severely affected populations, such as chronically homeless individuals with alcohol dependence. The aims of this study were thus to qualitatively and quantitatively document safer-drinking strategies used in this population. Participants (N=31) were currently or formerly chronically homeless individuals with alcohol dependence participating in a pilot study of extended-release naltrexone and harm-reduction counseling. At weeks 0 and 8, research staff provided a list of safer-drinking strategies for participants to endorse. Implementation of endorsed safer-drinking strategies was recorded at the next appointment. At both time points, strategies to buffer the effects of alcohol on the body (e.g., eating prior to and during drinking) were most highly endorsed, followed by changing the manner in which one drinks (e.g., spacing drinks), and reducing alcohol consumption. Quantitative analyses indicated that all participants endorsed safer-drinking strategies, and nearly all endorsed strategies were implemented (80% and 90% at weeks 0 and 8, respectively). These preliminary findings indicate that chronically homeless people with alcohol dependence use strategies to reduce harm associated with their drinking. Larger randomized controlled trials are needed to test whether interventions that teach safer-drinking strategies may reduce overall alcohol-related harm in this population.
Abstract:
Clinical experience and experimental data suggest that intradialytic hemodynamic profiles could be influenced by the characteristics of the dialysis membranes. Even within the widely used polysulfone family, intolerance to specific membranes has occasionally been evoked. The aim of this study was to compare the hemodynamic behavior of some of the polysulfone dialyzers commonly used in Switzerland. We performed an open-label, randomized, cross-over trial including 25 hemodialysis patients. Four polysulfone dialyzers, A (Revaclear high-flux, Gambro, Stockholm, Sweden), B (Helixone high-flux, Fresenius), C (Xevonta high-flux, BBraun, Melsungen, Germany), and D (Helixone low-flux, Fresenius, Bad Homburg vor der Höhe, Germany), were compared. The hemodynamic profile was assessed and patients were asked to provide tolerance feedback. The mean score (±SD) subjectively assigned to dialysis quality on a 1-10 scale was A 8.4 ± 1.3, B 8.6 ± 1.3, C 8.5 ± 1.6, D 8.5 ± 1.5. Kt/V was A 1.58 ± 0.30, B 1.67 ± 0.33, C 1.62 ± 0.32, D 1.45 ± 0.31. The low-flux membrane, compared with the high-flux ones, was associated with higher systolic (128.1 ± 13.1 vs. 125.6 ± 12.1 mmHg, P < 0.01) and diastolic (76.8 ± 8.7 vs. 75.3 ± 9.0 mmHg; P < 0.05) pressures, higher peripheral resistance (1.44 ± 0.19 vs. 1.40 ± 0.18 s × mmHg/mL; P < 0.05), and lower cardiac output (3.76 ± 0.62 vs. 3.82 ± 0.59 L/min; P < 0.05). Hypotension events (decrease in systolic blood pressure by >20 mmHg) numbered 70 with A, 87 with B, 73 with C, and 75 with D (P < 0.01 B vs. A, 0.05 B vs. C and 0.07 B vs. D). The low-flux membrane was associated with higher blood pressure levels compared with the high-flux ones. The Helixone high-flux membrane ensured the best efficiency; however, the very same dialyzer was associated with a higher incidence of hypotensive episodes.
Abstract:
Background: Emergency department frequent users (EDFUs) account for a disproportionately high number of emergency department (ED) visits, contributing to overcrowding and high health-care costs. At the Lausanne University Hospital, EDFUs account for only 4.4% of ED patients, but 12.1% of all ED visits. Our study tested the hypothesis that an interdisciplinary case management intervention reduced EDFUs' number of ED visits compared with standard emergency care. Methods: In this randomized controlled trial, we allocated adult EDFUs (5 or more visits in the previous 12 months) who visited the ED of the University Hospital of Lausanne, Switzerland between May 2012 and July 2013 either to an intervention (N=125) or a standard emergency care (N=125) group and monitored them for 12 months. Randomization was computer generated and concealed, and patients and research staff were blinded to the allocation. Participants in the intervention group, in addition to standard emergency care, received case management from an interdisciplinary team at baseline, and at 1, 3, and 5 months, in the hospital, in the ambulatory care setting, or at their homes. A generalized, linear, mixed-effects model for count data (Poisson distribution) was applied to compare participants' numbers of visits to the ED during the 12 months (Period 1, P1) preceding recruitment to the numbers of visits during the 12 months monitored (Period 2, P2).
Abstract:
Introduction: Frequent emergency department (ED) users are often vulnerable patients with many risk factors affecting their quality of life (QoL). The aim of this study was to examine to what extent a case management intervention improved frequent ED users' QoL. Methods: Data were part of a randomized, controlled trial designed to improve frequent ED users' QoL at the Lausanne University Hospital. A total of 194 frequent ED users (≥ 5 attendances during the previous 12 months; ≥ 18 years of age) were randomly assigned to the control or the intervention group. Participants in the intervention group received a case management intervention (i.e. counseling and assistance concerning social determinants of health, substance-use disorders, and access to the health-care system). QoL was evaluated using the WHOQOL-BREF at baseline and twelve months later. Four dimensions of QoL were retained: physical health, psychological health, social relationship, and environment, with scores ranging from 0 (low QoL) to 100 (high QoL).
Abstract:
Background: There is growing evidence suggesting that prolonged sitting has negative effects on people's weight, chronic diseases and mortality. Interventions to reduce sedentary time can be an effective strategy to increase daily energy expenditure. The purpose of this study is to evaluate the effectiveness of a six-month primary care intervention to reduce daily sitting time in overweight and mildly obese sedentary patients. Method/Design: The study is a randomized controlled trial (RCT). Professionals from thirteen primary health care centers (PHC) will randomly invite mildly obese or overweight patients of both genders, aged between 25 and 65 years, who sit for at least 6 hours daily, to participate. A total of 232 subjects will be randomly allocated to an intervention group (IG) or a control group (CG) (116 individuals per group). In addition, 50 subjects with fibromyalgia will be included. The primary outcome is (1) sitting time, using the activPAL device and the Marshall questionnaire. The following parameters will also be assessed: (2) sitting time in the workplace (Occupational Sitting and Physical Activity Questionnaire), (3) health-related quality of life (EQ-5D), (4) evolution of stage of change (Prochaska and DiClemente's Stages of Change Model), (5) physical inactivity (Catalan version of the Brief Physical Activity Assessment Tool), (6) number of steps walked (pedometer and activPAL), (7) blood parameters (triglycerides, total cholesterol, HDL, LDL, glycemia and, in diabetic patients, glycated haemoglobin), and (8) blood pressure and anthropometric variables. All parameters will be assessed pre- and post-intervention, with follow-up three, six and twelve months after the intervention. A descriptive analysis of all variables and a multivariate analysis to assess differences among groups will be undertaken. Multivariate analysis will be carried out to assess time changes of dependent variables.
All analyses will be done under the intention-to-treat principle. Discussion: If the SEDESTACTIV intervention proves effective in reducing sitting time, health professionals would have a low-cost intervention tool for the management of sedentary overweight and obese patients.
Abstract:
INTRODUCTION: Mitral isthmus (MI) ablation is an effective option in patients undergoing ablation for persistent atrial fibrillation (AF). Achieving bidirectional conduction block across the MI is challenging, and predictors of MI ablation success remain incompletely understood. We sought to determine the impact of anatomical location of the ablation line on the efficacy of MI ablation. METHODS AND RESULTS: A total of 40 consecutive patients (87% male; 54 ± 10 years) undergoing stepwise AF ablation were included. MI ablation was performed in sinus rhythm. MI ablation was performed from the left inferior PV to either the posterior (group 1) or the anterolateral (group 2) mitral annulus depending on randomization. The length of the MI line (measured with the 3D mapping system) and the amplitude of the EGMs at 3 positions on the MI were measured in each patient. MI block was achieved in 14/19 (74%) patients in group 1 and 15/21 (71%) patients in group 2 (P = NS). Total MI radiofrequency time (18 ± 7 min vs. 17 ± 8 min; P = NS) was similar between groups. Patients with incomplete MI block had a longer MI length (34 ± 6 mm vs. 24 ± 5 mm; P < 0.001), a higher bipolar voltage along the MI (1.75 ± 0.74 mV vs. 1.05 ± 0.69 mV; P < 0.01), and a longer history of continuous AF (19 ± 17 months vs. 10 ± 10 months; P < 0.05). In multivariate analysis, decreased length of the MI was an independent predictor of successful MI block (OR 1.5; 95% CI 1.1-2.1; P < 0.05). CONCLUSIONS: Increased length but not anatomical location of the MI predicts failure to achieve bidirectional MI block during ablation of persistent AF.
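An adjusted odds ratio with a 95% CI, like the one reported in the multivariate analysis above, typically comes from multivariable logistic regression. The sketch below shows how such an OR and its CI are obtained in Python; the data, effect sizes, and variable names are simulated and purely illustrative, not the study's dataset.

```python
# Hedged sketch: multivariable logistic regression yielding an adjusted OR.
# Simulated data: shorter mitral isthmus (MI) lines block more often.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
df = pd.DataFrame({
    "mi_length": rng.normal(28, 6, n),   # line length, mm (simulated)
    "voltage": rng.normal(1.3, 0.7, n),  # bipolar voltage, mV (simulated)
})
# Simulated relationship: probability of block falls as length grows.
logit_p = 4 - 0.15 * df["mi_length"]
df["block"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

res = smf.logit("block ~ mi_length + voltage", df).fit(disp=0)
odds_ratios = np.exp(res.params)     # per-unit odds ratios
conf_int = np.exp(res.conf_int())    # 95% CIs on the OR scale
print(odds_ratios["mi_length"])
```

Exponentiating the regression coefficients converts them from log-odds to odds ratios; an OR below 1 per millimeter here means each additional millimeter of line length lowers the odds of achieving block, the mirror image of reporting the OR per unit of *decreased* length as the abstract does.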
Abstract:
BACKGROUND: Fever is a frequent cause of medical consultation among returning travelers. The objectives of this study were to assess whether physicians were able to identify patients with influenza and whether the use of an influenza rapid diagnostic test (iRDT) modified the clinical management of such patients. METHODS: Randomized controlled trial conducted at 2 different Swiss hospitals between December 2008 and November 2012. Inclusion criteria were 1) age ≥18 years, 2) documented fever of ≥38 °C or anamnestic fever + cough or sore throat within the last 4 days, 3) illness occurring within 14 days after returning from a trip abroad, 4) no definitive alternative diagnosis. Physicians were asked to estimate the likelihood of influenza on clinical grounds, and a single nasopharyngeal swab was taken. Thereafter, patients were randomized into 2 groups: i) patients with iRDT (BD Directigen A + B) performed on the nasopharyngeal swab, ii) patients receiving usual care. A quantitative PCR to detect influenza was done on all nasopharyngeal swabs after the recruitment period. Clinical management was evaluated on the basis of cost of medical care, number of X-rays requested, and prescription of anti-infective drugs. RESULTS: 100 eligible patients were referred to the investigators. 93 patients had a nasopharyngeal swab for a PCR, and 28 (30%) swabs were positive for influenza. The median probability of influenza estimated by the physician was 70% for the PCR-positive cases and 30% for the PCR-negative cases (p < 0.001). The sensitivity of the iRDT was only 20%, and its specificity 100%. Mean medical costs for the patients managed with and without iRDT were USD 581 (95%CI 454-707) and USD 661 (95%CI 522-800), respectively. 14/60 (23%) of the patients managed with iRDT were prescribed antibiotics versus 13/33 (39%) in the control group (p = 0.15). No patient received antiviral treatment.
CONCLUSION: Influenza was a frequent cause of fever among these febrile returning travelers. Based on their clinical assessment, physicians had a higher level of suspicion for influenza in PCR positive cases. The iRDT used in this study showed a disappointingly low sensitivity and can therefore not be recommended for the management of these patients. TRIAL REGISTRATION: ClinicalTrials.gov NCT00821626.
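The sensitivity and specificity figures above fall out of a 2×2 table comparing the rapid test against the PCR reference. A minimal sketch of that computation follows; the cell counts used below are illustrative values consistent with 20% sensitivity and 100% specificity, since the abstract reports the rates rather than the full table.

```python
# How sensitivity and specificity are derived from a 2x2 diagnostic table
# (test result vs. PCR reference). Counts below are illustrative only.
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: share of reference-positive cases the test detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: share of reference-negative cases the test clears."""
    return tn / (tn + fp)

# Hypothetical counts consistent with 20% sensitivity, 100% specificity:
tp, fn, tn, fp = 3, 12, 45, 0
print(sensitivity(tp, fn))   # 0.2
print(specificity(tn, fp))   # 1.0
```

A test with perfect specificity but 20% sensitivity rules influenza in when positive, but a negative result carries little information, which is why the authors conclude the iRDT cannot be recommended here.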
Abstract:
BACKGROUND: Hypoxia-induced pulmonary vasoconstriction increases pulmonary arterial pressure (PAP) and may impede right heart function and exercise performance. This study examined the effects of oral nitrate supplementation on right heart function and performance during exercise in normoxia and hypoxia. We tested the hypothesis that nitrate supplementation would attenuate the increase in PAP at rest and during exercise in hypoxia, thereby improving exercise performance. METHODS: Twelve trained male cyclists [age: 31 ± 7 years (mean ± SD)] performed 15 km time-trial cycling (TT) and steady-state submaximal cycling (50, 100, and 150 W) in normoxia and hypoxia (11% inspired O2) following 3-day oral supplementation with either placebo or sodium nitrate (0.1 mmol/kg/day). We measured TT time-to-completion, muscle tissue oxygenation during TT, and the systolic right ventricle to right atrium pressure gradient (RV-RA gradient: index of PAP) during steady-state cycling. RESULTS: During steady-state exercise, hypoxia elevated the RV-RA gradient (p < 0.05), while oral nitrate supplementation did not alter the RV-RA gradient (p > 0.05). During the 15 km TT, hypoxia lowered muscle tissue oxygenation (p < 0.05). Nitrate supplementation further decreased muscle tissue oxygenation during the 15 km TT in hypoxia (p < 0.05). Hypoxia increased TT time-to-completion (p < 0.05), while no improvements were observed with nitrate supplementation in normoxia or hypoxia (p > 0.05). CONCLUSION: Our findings indicate that oral nitrate supplementation neither attenuates acute hypoxic pulmonary vasoconstriction nor improves performance during time-trial cycling in normoxia and hypoxia.
Abstract:
PURPOSE: To evaluate the effect of spironolactone, a mineralocorticoid receptor antagonist, for nonresolving central serous chorioretinopathy. METHODS: This is a prospective, randomized, double-blinded, placebo-controlled crossover study. Sixteen eyes of 16 patients with central serous chorioretinopathy and persistent subretinal fluid (SRF) for at least 3 months were enrolled. Patients were randomized to receive either spironolactone 50 mg or placebo once a day for 30 days, followed by a washout period of 1 week, and then crossed over to either placebo or spironolactone for another 30 days. The primary outcome measure was the change from baseline in SRF thickness at the apex of the serous retinal detachment. Secondary outcomes included subfoveal choroidal thickness and the ETDRS best-corrected visual acuity. RESULTS: The mean duration of central serous chorioretinopathy before enrollment in study eyes was 10 ± 16.9 months. Crossover data analysis showed a statistically significant reduction in SRF in spironolactone-treated eyes as compared with the same eyes under placebo (P = 0.04). Secondary analysis of the first period (Day 0-Day 30) showed a significant reduction in subfoveal choroidal thickness in treated eyes as compared with placebo (P = 0.02). No significant changes were observed in best-corrected visual acuity, and no treatment-related complications were observed. CONCLUSION: In eyes with persistent SRF due to central serous chorioretinopathy, spironolactone significantly reduced both the SRF and the subfoveal choroidal thickness as compared with placebo.
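The statistical heart of a crossover design like this one is a within-subject comparison: each eye contributes a measurement under spironolactone and one under placebo, so the drug-placebo difference is tested pairwise. A minimal sketch on simulated data follows; the numbers are invented, and a full crossover analysis would additionally model period and carry-over effects, which this sketch omits.

```python
# Hedged sketch of the paired comparison underlying a crossover analysis:
# a Wilcoxon signed-rank test on within-eye drug-vs-placebo differences.
# All measurements below are simulated, not the study's data.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
n = 16  # eyes, as in the study
srf_placebo = rng.normal(200, 40, n)            # SRF thickness (um), simulated
srf_drug = srf_placebo - rng.normal(50, 15, n)  # simulated treatment benefit

stat, p = wilcoxon(srf_drug, srf_placebo)
print(p)
```

Because each eye serves as its own control, between-patient variability in baseline SRF drops out of the comparison, which is what gives crossover designs their statistical efficiency at small sample sizes like N=16.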
Abstract:
OBJECTIVE: To test the hypothesis that substituting artificially sweetened beverages (ASB) for sugar-sweetened beverages (SSB) decreases intrahepatocellular lipid concentrations (IHCL) in overweight subjects with high SSB consumption. METHODS: Thirty-one healthy subjects with a BMI greater than 25 kg/m(2) and a daily consumption of at least 660 ml SSB were randomized to a 12-week intervention in which they replaced SSBs with ASBs. Their IHCL (magnetic resonance spectroscopy), visceral adipose tissue volume (VAT; magnetic resonance imaging), food intake (2-day food records), and fasting blood concentrations of metabolic markers were measured after a 4-week run-in period and after a 12-week period with ASB or control (CTRL). RESULTS: Twenty-seven subjects completed the study. IHCL was reduced to 74% of the initial values with ASB (N = 14; P < 0.05) but did not change with CTRL. The decrease in IHCL attained with ASB was greater in subjects with IHCL greater than 60 mmol/l than in subjects with low IHCL. ALT decreased significantly with ASB, but only in subjects with IHCL greater than 60 mmol/l. There was otherwise no significant effect of ASB on body weight, VAT, or metabolic markers. CONCLUSIONS: In subjects with overweight or obesity and a high SSB intake, replacing SSB with ASB decreased intrahepatic fat over a 12-week period.
Abstract:
BACKGROUND AND OBJECTIVES: Hepcidin is the main hormone that regulates iron balance. Its lowering favours intestinal iron absorption in cases of iron deficiency or enhanced erythropoiesis. Accurate measurement of this small peptide promises new diagnostic and therapeutic strategies. Its measurement is progressively being validated, and its clinical value must now be explored in different physiological situations. Here, we evaluate hepcidin levels among premenopausal female donors with iron deficiency without anaemia. MATERIALS AND METHODS: In a preceding study, a 4-week oral iron treatment (80 mg/day) was administered in a randomized controlled trial (n = 145) in cases of iron deficiency without anaemia after a blood donation. We subsequently measured hepcidin at baseline and after 4 weeks of treatment, using mass spectrometry. RESULTS: Iron supplementation had a significant effect on plasma hepcidin compared to the placebo arm at 4 weeks (+0.29 nM [95% CI: 0.18 to 0.40]). There was a significant correlation between hepcidin and ferritin at baseline (R(2) = 0.121, P < 0.001) and after treatment (R(2) = 0.436, P < 0.001). Hepcidin levels at baseline were not predictive of concentration changes for ferritin or haemoglobin. However, hepcidin levels at 4 weeks were significantly higher (0.79 nM [95% CI: 0.53 to 1.05]) among ferritin responders. CONCLUSIONS: This study shows that a 4-week oral iron treatment increased hepcidin blood concentrations in female blood donors with an initial ferritin concentration of less than 30 ng/ml. Hepcidin apparently cannot serve as a predictor of response to iron treatment but might serve as a marker of the iron repletion needed for erythropoiesis.
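The squared correlations (R²) between hepcidin and ferritin reported above come from simple linear regression. A minimal sketch of how such a value is computed follows; the data are simulated with hypothetical units and slopes, not the study's measurements.

```python
# Hedged sketch: R^2 between two biomarkers via ordinary least squares.
# Simulated data only; units and effect sizes are illustrative.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(3)
ferritin = rng.uniform(5, 30, 145)                   # ng/ml, simulated
hepcidin = 0.1 * ferritin + rng.normal(0, 0.5, 145)  # nM, simulated

fit = linregress(ferritin, hepcidin)
r_squared = fit.rvalue ** 2   # share of hepcidin variance explained by ferritin
print(round(r_squared, 3))
```

An R² of 0.121 at baseline versus 0.436 after treatment, as the abstract reports, means ferritin explained roughly four times more of the variance in hepcidin after iron repletion than before it.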
Abstract:
BACKGROUND: Delirium is an acute cognitive impairment among older hospitalized patients. It can persist until discharge and for months after that. Despite proof that evidence-based nursing interventions are effective in preventing delirium in acute hospitals, evidence on interventions among home-dwelling older patients is lacking. The aim was to assess the feasibility and acceptability of a nursing intervention designed to detect and reduce delirium in older adults after discharge from hospital. METHODS: A randomized clinical pilot trial with a before/after design was used. One hundred and three older adults were recruited in a home healthcare service in French-speaking Switzerland and randomized into an experimental group (EG, n = 51) and a control group (CG, n = 52). The CG received usual homecare. The EG received usual homecare plus five additional nursing interventions at 48 and 72 h and at 7, 14 and 21 days after discharge. These interventions were tailored to detecting and reducing delirium and were conducted by a geriatric clinical nurse (GCN). All patients were monitored for symptoms of delirium at the start of the study (M1) and throughout the following month; this was documented in patients' records after usual homecare using the Confusion Assessment Method (CAM). At one month (M2), symptoms of delirium were measured using the CAM, cognitive status was measured using the Mini-Mental State Examination (MMSE), and functional status was measured using the Katz and Lawton Index of activities of daily living (ADL/IADL). At the end of the study, participants in the EG and homecare nurses were interviewed about the acceptability of the nursing interventions and the study itself. RESULTS: Feasibility and acceptability indicators showed excellent results. Recruitment, retention, randomization, and other procedures were efficient, although some potential issues were identified.
Participants and nurses considered organizational procedures, data collection, intervention content, the dose-effect of the interventions, and methodology all to be feasible. Duration, patient adherence, and fidelity were judged acceptable. Nurses, participants, and informal caregivers were satisfied with the relevance and safety of the interventions. CONCLUSIONS: Nursing interventions to detect and reduce delirium at home are feasible and acceptable. These results confirm that developing a large-scale randomized controlled trial would be appropriate. TRIAL REGISTRATION: ISRCTN registry no: 16103589 - 19 February 2016.