35 results for design factors
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Context Long-term antiretroviral therapy (ART) use in resource-limited countries leads to increasing numbers of patients with HIV taking second-line therapy. Limited access to further therapeutic options makes essential the evaluation of second-line regimen efficacy in these settings. Objectives To investigate failure rates in patients receiving second-line therapy and factors associated with failure and death. Design, Setting, and Participants Multicohort study of 632 patients >14 years old receiving second-line therapy for more than 6 months in 27 ART programs in Africa and Asia between January 2001 and October 2008. Main Outcome Measures Clinical, immunological, virological, and immunovirological failure (first diagnosed episode of immunological or virological failure) rates, and mortality after 6 months of second-line therapy use. Sensitivity analyses were performed using alternative CD4 cell count thresholds for immunological and immunovirological definitions of failure and for cohort attrition instead of death. Results The 632 patients provided 740.7 person-years of follow-up; 119 (18.8%) met World Health Organization failure criteria after a median 11.9 months following the start of second-line therapy (interquartile range [IQR], 8.7-17.0 months), and 34 (5.4%) died after a median 15.1 months (IQR, 11.9-25.7 months). Failure rates were lower in those who changed 2 nucleoside reverse transcriptase inhibitors (NRTIs) instead of 1 (179.2 vs 251.6 per 1000 person-years; incidence rate ratio [IRR], 0.64; 95% confidence interval [CI], 0.42-0.96), and higher in those with lowest adherence index (383.5 vs 176.0 per 1000 person-years; IRR, 3.14; 95% CI, 1.67-5.90 for <80% vs ≥95% [percentage adherent, as represented by percentage of appointments attended with no delay]). 
Failure rates increased with lower CD4 cell counts when second-line therapy was started, from 156.3 vs 96.2 per 1000 person-years; IRR, 1.59 (95% CI, 0.78-3.25) for 100 to 199/μL to 336.8 per 1000 person-years; IRR, 3.32 (95% CI, 1.81-6.08) for less than 50/μL vs 200/μL or higher; and decreased with time using second-line therapy, from 250.0 vs 123.2 per 1000 person-years; IRR, 1.90 (95% CI, 1.19-3.02) for 6 to 11 months to 212.0 per 1000 person-years; 1.71 (95% CI, 1.01-2.88) for 12 to 17 months vs 18 or more months. Mortality for those taking second-line therapy was lower in women (32.4 vs 68.3 per 1000 person-years; hazard ratio [HR], 0.45; 95% CI, 0.23-0.91); and higher in patients with treatment failure of any type (91.9 vs 28.1 per 1000 person-years; HR, 2.83; 95% CI, 1.38-5.80). Sensitivity analyses showed similar results. Conclusions Among patients in Africa and Asia receiving second-line therapy for HIV, treatment failure was associated with low CD4 cell counts at second-line therapy start, use of suboptimal second-line regimens, and poor adherence. Mortality was associated with diagnosed treatment failure.
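The incidence rate ratios above compare failure events per person-year between groups. As a rough illustration only (the study reports adjusted IRRs from regression models, not this crude calculation), a crude IRR with a Wald-type confidence interval on the log scale can be computed as follows; the event counts are invented for the example:

```python
import math

def incidence_rate_ratio(events_a, py_a, events_b, py_b, z=1.96):
    """Crude incidence rate ratio of group A vs group B.

    Returns (IRR, lower, upper) with a Wald-type 95% CI on the log
    scale. Adjusted IRRs, as reported in the study, instead require a
    regression model that controls for covariates.
    """
    irr = (events_a / py_a) / (events_b / py_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lower = math.exp(math.log(irr) - z * se_log)
    upper = math.exp(math.log(irr) + z * se_log)
    return irr, lower, upper

# Hypothetical counts: 18 failures over 100 person-years vs 28 over
# 100 person-years (rates of 180 vs 280 per 1000 person-years).
irr, lo, hi = incidence_rate_ratio(18, 100, 28, 100)
```

The CI is computed on the log scale because the sampling distribution of a ratio of rates is approximately log-normal.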
Abstract:
STUDY DESIGN: Retrospective case review. OBJECTIVES: In the present study, the neurological outcome, retirement and prognostic factors of patients with spinal cord injury without radiographic abnormality (SCIWORA) were evaluated. SETTING: Swiss national work accident insurance database. METHODS: The medical histories of 32 patients who were insured by the Swiss Accident Insurance Fund (SUVA) and had SCIWORA between 1995 and 2004 were evaluated thoroughly. Moreover, all available magnetic resonance imaging (MRI) scans were evaluated. RESULTS: At the last follow-up, none of the patients had complete spinal cord injury; only 4 patients had severe deficits, and 12 patients had normal motor and sensory function on neurological examination. However, only 7 of the 32 patients had returned to full-time work and 10 of the 32 were fully retired. Both the presence of spinal cord change (ρ=0.51) and higher maximum spinal cord compression (ρ=0.57) on MRI correlated with the likelihood of retirement; older age (ρ=0.38) and physical load of work (ρ=0.4) correlated with retirement to a lesser extent. CONCLUSION: Although the neurological outcome of SCIWORA is mostly good, the retirement rate is high. The presence of spinal cord change and the severity of cord compression are the best predictors of the degree of retirement.
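The ρ values in this abstract are Spearman rank correlations, i.e. the Pearson correlation applied to the ranks of the observations. A minimal stdlib sketch follows, assuming no tied values (real data with ties needs average ranks, as implemented in `scipy.stats.spearmanr`); the example data are invented:

```python
import math

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.

    Assumes no ties; tied observations would need average ranks.
    """
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for pos, i in enumerate(order):
            r[i] = pos + 1.0
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical cord-compression percentages vs. degree of retirement
# (in %) for five patients -- illustrative numbers only.
rho = spearman_rho([10, 25, 30, 45, 50], [0, 20, 50, 40, 100])
```

Because only ranks enter the formula, ρ captures any monotone association, not just a linear one, which suits ordinal outcomes such as degree of retirement.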
Abstract:
Background Modern methods in intensive care medicine often enable the survival of older critically ill patients. The short-term outcomes for patients treated in intensive care units (ICUs), such as survival to hospital discharge, are well documented. However, relatively little is known about subsequent long-term outcomes. Pain, anxiety and agitation are important stress factors for many critically ill patients. There are very few studies concerned with pain, anxiety and agitation and their consequences in older critically ill patients. The overall aim of this study is to identify how an ICU stay influences an older person's experiences later in life. More specifically, this study has the following objectives: (1) to explore the relationship between pain, anxiety and agitation during ICU stays and experiences of the same symptoms in later life; and (2) to explore the associations between pain, anxiety and agitation experienced during ICU stays and their effect on subsequent health-related quality of life, use of the health care system (readmissions, doctor visits, rehabilitation, medication use), living situation, and survival after discharge and at 6 and 12 months of follow-up. Methods/Design A prospective, longitudinal design will be used. A total of 150 older critically ill patients in the ICU will participate (ICU group). Pain, anxiety, agitation, morbidity, mortality, use of the health care system, and health-related quality of life will be measured at 3 intervals after a baseline assessment. Baseline measurements will be taken 48 hours after ICU admission and one week thereafter. Follow-up measurements will take place 6 months and 12 months after discharge from the ICU. To be able to interpret trends in scores on outcome variables in the ICU group, a comparison group of 150 participants, matched by age and gender, recruited from the Swiss population, will be interviewed at the same intervals as the ICU group.
Discussion Little research has focused on long-term consequences after ICU admission in older critically ill patients. The present study specifically focuses on the long-term consequences of stress factors experienced during ICU admission.
Abstract:
Exposure to farming environments has been shown to protect substantially against asthma and atopic disease across Europe and in other parts of the world. The GABRIEL Advanced Surveys (GABRIELA) were conducted to determine factors in farming environments which are fundamental to protecting against asthma and atopic disease. The GABRIEL Advanced Surveys have a multi-phase stratified design. In a first screening phase, a comprehensive population-based survey was conducted to assess the prevalence of exposure to farming environments and of asthma and atopic diseases (n = 103,219). The second phase was designed to ascertain detailed exposure to farming environments and to collect biomaterial and environmental samples in a stratified random sample of phase 1 participants (n = 15,255). A third phase was carried out in a further stratified sample only in Bavaria, southern Germany, aiming at in-depth respiratory disease and exposure assessment including extensive environmental sampling (n = 895). Participation rates in phase 1 were around 60%, but only about half of the participating study population consented to further study modules in phase 2. We found that consenting behaviour was related to familial allergies, high parental education, wheeze, doctor-diagnosed asthma and rhinoconjunctivitis, and to a lesser extent to exposure to farming environments. The association of exposure to farm environments with asthma or rhinoconjunctivitis was not biased by participation or consenting behaviour. The GABRIEL Advanced Surveys are one of the largest studies to shed light on the protective 'farm effect' on asthma and atopic disease. Bias with regard to the main study question could be ruled out given the representativeness and high participation rates in phases 2 and 3. The GABRIEL Advanced Surveys have created extensive collections of questionnaire data, biomaterial and environmental samples, promising new insights into this area of research.
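Phase 2 drew a stratified random sample of phase-1 participants. A generic sketch of stratified sampling with per-stratum sampling fractions is shown below; the stratum labels and fractions are purely illustrative and are not GABRIELA's actual design:

```python
import random

def stratified_sample(population, stratum_of, fraction, seed=42):
    """Sample each stratum at its own fraction and pool the result.

    population: list of units; stratum_of: function mapping a unit to
    its stratum label; fraction: dict of per-stratum sampling fractions.
    """
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(stratum_of(unit), []).append(unit)
    sample = []
    for label, units in strata.items():
        k = round(len(units) * fraction[label])
        sample.extend(rng.sample(units, k))
    return sample

# Illustrative population: oversample the rarer farm-exposed stratum
# (10 exposed at 50%, 20 unexposed at 10%).
people = [{"id": i, "farm": i < 10} for i in range(30)]
chosen = stratified_sample(people, lambda p: p["farm"],
                           {True: 0.5, False: 0.1})
```

Oversampling rare strata is the usual motive for this design; analyses must then re-weight by the inverse sampling fractions to recover population-level estimates.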
Abstract:
Background Chronic localized pain syndromes, especially chronic low back pain (CLBP), are common reasons for consultation in general practice. In some cases chronic localized pain syndromes can appear in combination with chronic widespread pain (CWP). Numerous studies have shown a strong association between CWP and several physical and psychological factors. These studies are population-based and cross-sectional and do not allow for assessing chronology. There are very few prospective studies that explore the predictors for the onset of CWP, and their main focus is identifying risk factors for CWP incidence. Until now there have been no studies focusing on preventive factors that keep patients from developing CWP. Our aim is to perform a cross-sectional study on the epidemiology of CLBP and CWP in general practice and to look for distinctive features regarding resources such as resilience, self-efficacy and coping strategies. A subsequent cohort study is designed to identify the risk and protective factors of pain generalization (development of CWP) in primary care for CLBP patients. Methods/Design Fifty-nine general practitioners consecutively recruit, during a 5-month period, all patients who consult their family doctor because of chronic low back pain (pain that has lasted for 3 months). Patients are asked to fill out a questionnaire on pain anamnesis, pain perception, co-morbidities, therapy course, medication, sociodemographic data and psychosomatic symptoms. We assess resilience, coping resources, stress management and self-efficacy as potential protective factors against pain generalization. Furthermore, we assess risk factors for pain generalization such as anxiety, depression, trauma and critical life events. During a twelve-month follow-up period, a cohort of CLBP patients without CWP will be screened on a regular basis (every 3 months) for pain generalization (outcome: incident CWP).
Discussion This cohort study will be the largest to prospectively analyze predictors of the transition from CLBP to CWP in a primary care setting. In contrast to the typically researched risk factors, which increase the probability of pain generalization, this study also focuses intensively on protective factors, which decrease that probability.
Abstract:
Background Randomized controlled trials (RCTs) may be discontinued because of apparent harm, benefit, or futility. Other RCTs are discontinued early because of insufficient recruitment. Trial discontinuation has ethical implications, because participants consent on the premise of contributing to new medical knowledge, Research Ethics Committees (RECs) spend considerable effort reviewing study protocols, and limited resources for conducting research are wasted. Currently, little is known regarding the frequency and characteristics of discontinued RCTs. Methods/Design Our aims are, first, to determine the prevalence of RCT discontinuation for specific reasons; second, to determine whether the risk of RCT discontinuation for specific reasons differs between investigator- and industry-initiated RCTs; third, to identify risk factors for RCT discontinuation due to insufficient recruitment; fourth, to determine at what stage RCTs are discontinued; and fifth, to examine the publication history of discontinued RCTs. We are currently assembling a multicenter cohort of RCTs based on protocols approved between 2000 and 2002/3 by 6 RECs in Switzerland, Germany, and Canada. We are extracting data on RCT characteristics and planned recruitment for all included protocols. Completion and publication status is determined using information from correspondence between investigators and RECs, publications identified through literature searches, or by contacting the investigators. We will use multivariable regression models to identify risk factors for trial discontinuation due to insufficient recruitment. We aim to include over 1000 RCTs of which an anticipated 150 will have been discontinued due to insufficient recruitment. Discussion Our study will provide insights into the prevalence and characteristics of RCTs that were discontinued. 
Effective recruitment strategies and the anticipation of problems are key issues in the planning and evaluation of trials by investigators, Clinical Trial Units, RECs and funding agencies. Identification and modification of barriers to successful study completion at an early stage could help to reduce the risk of trial discontinuation, save limited resources, and enable RCTs to better meet their ethical requirements.
Abstract:
A key issue in the approval process of antidepressants is the inconsistency of results between antidepressant clinical phase III trials. Identifying factors influencing efficacy data is needed to facilitate interpretation of the results.
Abstract:
OBJECTIVE: To examine the duration of methicillin-resistant Staphylococcus aureus (MRSA) carriage and its determinants and the influence of eradication regimens. DESIGN: Retrospective cohort study. SETTING: A 1,033-bed tertiary care university hospital in Bern, Switzerland, in which the prevalence of methicillin resistance among S. aureus isolates is less than 5%. PATIENTS: A total of 116 patients with first-time MRSA detection identified at University Hospital Bern between January 1, 2000, and December 31, 2003, were followed up for a mean duration of 16.2 months. RESULTS: Sixty-eight patients (58.6%) cleared colonization, with a median time to clearance of 7.4 months. Independent determinants of shorter carriage duration were the absence of any modifiable risk factor (receipt of antibiotics, use of an indwelling device, or presence of a skin lesion) (hazard ratio [HR], 0.20 [95% confidence interval {CI}, 0.09-0.42]), absence of immunosuppressive therapy (HR, 0.49 [95% CI, 0.23-1.02]), and hemodialysis (HR, 0.08 [95% CI, 0.01-0.66]) at the time MRSA was first detected, and the administration of a decolonization regimen in the absence of a modifiable risk factor (HR, 2.22 [95% CI, 1.36-3.64]). Failure of decolonization treatment was associated with the presence of risk factors at the time of treatment (P=.01). Intermittent screenings that were negative for MRSA were frequent (26% of patients), occurred early after first detection of MRSA (median, 31.5 days), and were associated with a lower probability of clearing colonization (HR, 0.34 [95% CI, 0.17-0.67]) and an increased risk of MRSA infection during follow-up. CONCLUSIONS: Risk factors for MRSA acquisition should be carefully assessed in all MRSA carriers and should inform infection control policies, such as the timing of decolonization treatment, the definition of MRSA clearance, and the decision of when to suspend isolation measures.
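A median time to clearance with incomplete follow-up (patients still colonized at last contact are censored) is typically read off a Kaplan-Meier curve, at the point where the survival function crosses 0.5. A minimal sketch of the estimator, not the study's actual analysis code:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times: follow-up durations (e.g. months of MRSA carriage);
    events: 1 = clearance observed, 0 = censored (still colonized at
    last follow-up). Returns [(t, S(t))] at each clearance time.
    """
    pairs = sorted(zip(times, events))
    n = len(pairs)
    at_risk, s, curve, i = n, 1.0, [], 0
    while i < n:
        t = pairs[i][0]
        j, cleared = i, 0
        while j < n and pairs[j][0] == t:
            cleared += pairs[j][1]
            j += 1
        if cleared:
            s *= 1 - cleared / at_risk   # conditional survival at t
            curve.append((t, s))
        at_risk -= j - i                 # drop events and censorings
        i = j
    return curve

# Toy data: four patients; the one followed to month 3 is censored.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

The key property, visible in the toy data, is that a censored patient still contributes to the risk set up to the censoring time but never counts as an event.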
Abstract:
OBJECTIVE: To assess the influence of recipient's and donor's factors as well as surgical events on the occurrence of reperfusion injury after lung transplantation. DESIGN AND SETTING: Retrospective study in the surgical intensive care unit (ICU) of a university hospital. METHODS: We collected data on 60 lung transplantation donor/recipient pairs from June 1993 to May 2001, and compared the demographic, peri- and postoperative variables of patients who experienced reperfusion injury (35%) and those who did not. RESULTS: The occurrence of high systolic pulmonary pressure immediately after transplantation and/or its persistence during the first 48 h after surgery was associated with reperfusion injury, independently of preoperative values. Reperfusion injury was associated with difficult hemostasis during transplantation (p = 0.03). Patients with reperfusion injury were more likely to require the administration of catecholamines during the first 48 h after surgery (p = 0.014). Extubation was delayed (p = 0.03) and the relative odds of ICU mortality were significantly greater (OR 4.8, 95% CI: 1.06, 21.8) in patients with reperfusion injury. Our analysis confirmed that preexisting pulmonary hypertension increased the incidence of reperfusion injury (p < 0.01). CONCLUSIONS: Difficulties in perioperative hemostasis were associated with reperfusion injury. Occurrence of reperfusion injury was associated with postoperative systolic pulmonary hypertension, longer mechanical ventilation and higher mortality. Whether early recognition and treatment of pulmonary hypertension during transplantation can prevent the occurrence of reperfusion injury needs to be investigated.
Abstract:
To clarify the pharmacological profile of the two new calcium channel blockers tiapamil and nisoldipine in humans, their acute effects were compared with those of the reference agent nifedipine in 10 normal subjects and 10 patients with essential hypertension. Blood pressure (BP), heart rate (HR), plasma and urinary catecholamines, sodium and potassium, plasma renin and aldosterone levels, and urinary prostaglandin E2 and F2 excretion rates were determined before and up to 4 or 5 h (urine values) after intravenous injection of placebo (20 ml 0.9% NaCl), tiapamil 1 mg/kg body weight, nisoldipine 6 micrograms/kg, or nifedipine 15 micrograms/kg. The four studies were performed at weekly intervals according to a Latin square design. All three calcium channel blockers significantly (p < 0.05 or lower) lowered BP and distinctly increased sodium excretion in hypertensive patients, but had only little influence on these parameters in normal subjects. HR was increased in both groups. Changes in BP and HR were maximal at 5 min and largely dissipated 3 h after drug injection. Effects on BP and HR, as well as concomitant mild increases in plasma norepinephrine and renin levels that occurred in both groups, tended to be more pronounced (about double) following nisoldipine than following tiapamil or nifedipine at the dosages given. Plasma aldosterone and epinephrine levels and prostaglandin excretion rates were not consistently modified. These findings demonstrate that tiapamil and nisoldipine possess distinct antihypertensive properties in humans. Different chronotropic and renin-activating effects of different calcium channel blockers may be determined, at least in part, by a different influence on sympathetic activity.
Abstract:
OBJECTIVE: To analyse risk factors in alpine skiing. DESIGN: A controlled multicentre survey of injured and non-injured alpine skiers. SETTING: One tertiary and two secondary trauma centres in Bern, Switzerland. PATIENTS AND METHODS: All injured skiers admitted from November 2007 to April 2008 were analysed using a completed questionnaire incorporating 15 parameters. The same questionnaire was distributed to non-injured controls. Multiple logistic regression was performed. Patterns of combined risk factors were calculated by inference trees. A total of 782 patients and 496 controls were interviewed. RESULTS: Parameters that were significant for the patients were: high readiness for risk (p = 0.0365, OR 1.84, 95% CI 1.04 to 3.27); low readiness for speed (p = 0.0008, OR 0.29, 95% CI 0.14 to 0.60); no aggressive behaviour on slopes (p<0.0001, OR 0.19, 95% CI 0.09 to 0.37); new skiing equipment (p = 0.0228, OR 0.59, 95% CI 0.37 to 0.93); warm-up performed (p = 0.0015, OR 1.79, 95% CI 1.25 to 2.57); old snow compared with fresh snow (p = 0.0155, OR 0.31, 95% CI 0.12 to 0.80); old snow compared with artificial snow (p = 0.0037, OR 0.21, 95% CI 0.07 to 0.60); powder snow compared with slushy snow (p = 0.0035, OR 0.25, 95% CI 0.10 to 0.63); drug consumption (p = 0.0044, OR 5.92, 95% CI 1.74 to 20.11); and alcohol abstinence (p<0.0001, OR 0.14, 95% CI 0.05 to 0.34). Three groups at risk were detected: (1) warm-up 3-12 min, visual analogue scale (VAS)(speed) >4 and bad weather/visibility; (2) VAS(speed) 4-7, icy slopes and not wearing a helmet; (3) warm-up >12 min and new skiing equipment. CONCLUSIONS: Low speed, high readiness for risk, new skiing equipment, old and powder snow, and drug consumption are significant risk factors when skiing. Future work should aim to identify more precisely specific groups at risk and develop recommendations, for example a snow weather index at valley stations.
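The odds ratios above come from multiple logistic regression, which adjusts each factor for the others. For intuition only, the crude (unadjusted) OR of a single binary factor can be read off a 2×2 table with a Woolf-type CI on the log scale; the counts below are invented for illustration:

```python
import math

def odds_ratio_2x2(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table with a Woolf (log-scale) 95% CI.

    a/b: exposed/unexposed among cases (injured skiers);
    c/d: exposed/unexposed among controls. Adjusted ORs, as in the
    study, require logistic regression instead.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical exposure counts: 20/80 among injured vs 10/90 among
# controls exposed to some risk factor.
or_, lo, hi = odds_ratio_2x2(20, 80, 10, 90)
```

If the CI excludes 1 the crude association is significant at the 5% level; note how the reported CIs in the abstract (e.g. 0.37 to 0.93 for new equipment) pin down whether an OR sits above or below 1.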
Abstract:
OBJECTIVE: To investigate a large outbreak of scabies in an intensive care unit of a university hospital and an affiliated rehabilitation center, and to establish effective control measures to prevent further transmission. DESIGN: Outbreak investigation. SETTING: The intensive care unit of a 750-bed university hospital and an affiliated 92-bed rehabilitation center. METHODS: All exposed individuals were screened by a senior staff dermatologist. Scabies was diagnosed on the basis of (1) identification of mites by skin scraping, (2) identification of mites by dermoscopy, or (3) clinical examination for typical burrows in patients without a history of prior treatment. During a follow-up period of 6 months, the attack rate was calculated as the number of symptomatic individuals divided by the total number of exposed individuals. INTERVENTIONS: All exposed healthcare workers (HCWs) and their household members underwent preemptive treatment. Initially, the most effective registered drug in Switzerland (ie, topical lindane) was prescribed, but this was switched to topical permethrin or systemic ivermectin as the outbreak progressed. Individuals with any signs or symptoms of scabies underwent dermatological examination. RESULTS: Within 7 months, 19 cases of scabies were diagnosed, 6 of them in children (mean age, 3.1 years), after exposure to the index patient, who had HIV infection and crusted scabies. A total of 1,640 exposed individuals underwent preemptive treatment. The highest attack rate, 26%-32%, was observed among HCWs involved in the care of the index patient. A too-restricted definition of individuals at risk, noncompliance with treatment, and the limited effectiveness of lindane likely led to treatment failure, relapse, and reinfestation within families. CONCLUSIONS: Crusted scabies resulted in high attack rates among HCWs and household contacts.
Timely institution of hygienic precautions with close monitoring and widespread, simultaneous scabicide treatment of all exposed individuals are essential for control of an outbreak.
Abstract:
OBJECTIVE: This study developed percentile curves for anthropometric (waist circumference) and cardiovascular (lipid profile) risk factors for US children and adolescents. STUDY DESIGN: A representative sample of US children and adolescents from the National Health and Nutrition Examination Survey from 1988 to 1994 (NHANES III) and the current national series (NHANES 1999-2006) were combined. Percentile curves were constructed, nationally weighted, and smoothed using the Lambda, Mu, and Sigma method. The percentile curves included age- and sex-specific percentile values that correspond with and transition into the adult abnormal cut-off values for each of these anthropometric and cardiovascular components. To increase the sample size, a second series of percentile curves was also created from the combination of the 2 NHANES databases, along with cross-sectional data from the Bogalusa Heart Study, the Muscatine Study, the Fels Longitudinal Study and the Princeton Lipid Research Clinics Study. RESULTS: These analyses resulted in a series of growth curves for waist circumference, total cholesterol, LDL cholesterol, triglycerides, and HDL cholesterol from a combination of pediatric data sets. The cut-off for abnormal waist circumference in adult males (102 cm) was equivalent to the 94th percentile line in 18-year-olds, and the cut-off in adult females (88 cm) was equivalent to the 84th percentile line in 18-year-olds. Triglycerides were found to have a bimodal pattern among females, with an initial peak at age 11 and a second at age 20; the curve for males increased steadily with age. The HDL curve for females was relatively flat, but the male curve declined starting at age 9 years. Similar curves for total and LDL cholesterol were constructed for both males and females. When data from the additional child studies were added to the national data, there was little difference in their patterns or rates of change from year to year.
CONCLUSIONS: These curves represent waist and lipid percentiles for US children and adolescents, with identification of values that transition to adult abnormalities. They could be used conditionally for both epidemiological and possibly clinical applications, although they need to be validated against longitudinal data.
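Percentile curves smoothed with the Lambda, Mu, and Sigma (LMS) method allow a measurement to be converted into an age- and sex-specific z-score via Cole's formula, z = ((x/M)^L − 1)/(L·S), with the limiting case z = ln(x/M)/S when L = 0. The L, M, S values below are made up for illustration; real values would be read off the fitted curves for the child's age and sex:

```python
import math

def lms_zscore(x, L, M, S):
    """Z-score of measurement x given LMS parameters (Cole's formula).

    L: Box-Cox power (skewness), M: median, S: coefficient of
    variation at the given age/sex.
    """
    if abs(L) < 1e-12:                    # limiting case as L -> 0
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical LMS triple for waist circumference at some age/sex:
# L = 1 (no skew correction), median 100 cm, CV 0.1.
z = lms_zscore(110.0, L=1.0, M=100.0, S=0.1)
```

With L = 1 the formula reduces to the familiar (x − M)/(M·S), so a waist 10% above the median at CV 0.1 sits one standard deviation above it; the percentile follows from the normal CDF.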
Abstract:
Arguably, job satisfaction is one of the most important variables with regard to work. When explaining job satisfaction, research usually focuses on predictor variables in terms of levels but neglects growth rates. Therefore it remains unclear how potential predictors evolve over time and how their development affects job satisfaction. Using multivariate latent growth modeling in a study with 1145 young workers over five years, we analyzed how well job satisfaction is predicted a) by levels of situational (i.e., job control) and dispositional (i.e., Core Self-Evaluations (CSE)) factors and b) by growth per year of these predictors. Results showed both intercepts and slopes to be related to each other, suggesting a joint growth of job control and CSE during early careers. Job satisfaction after five years was best predicted by the slopes of job control (β = .31, p < .001) and CSE (β = .34, p < .01). These findings provide further longitudinal evidence for the role of situational as well as dispositional factors for predicting job satisfaction. In addition, growth rates per year were better predictors than initial levels. Furthermore, a lack of change in job control or CSE went along with a drop in job satisfaction, implying that young workers need to perceive things to be improving in order to increase, or at least maintain, their level of job satisfaction. In terms of theory, the relative importance of levels versus changes deserves more attention. In terms of practical implications, our results suggest a double emphasis on job design (i.e., granting sufficient, and increasing, control) and on personal development (e.g., through training) so that people experience a match between both. Finally, negative associations between initial levels and growth rates suggest that people are quite successful in achieving a reasonable fit between their job characteristics and their needs and goals.
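Latent growth modeling estimates each person's level (intercept) and growth rate (slope) jointly within a structural equation framework. As a deliberately crude stdlib approximation of the level-vs-growth distinction only (not the multivariate latent growth model used in the study, which requires SEM software), one can fit an ordinary least-squares line to each worker's yearly scores:

```python
def level_and_growth(scores):
    """OLS intercept (level at wave 0) and slope (growth per wave).

    scores: one worker's repeated measurements at equally spaced
    waves (e.g. yearly job-control ratings).
    """
    n = len(scores)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(scores) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, scores))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept, slope

# Hypothetical job-control ratings over five annual waves.
level, growth = level_and_growth([2.0, 2.4, 2.8, 3.2, 3.6])
```

The study's point that slopes predicted job satisfaction better than intercepts corresponds, in this simplified picture, to the per-person `growth` terms carrying more predictive weight than the `level` terms.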