102 results for Regression-based decomposition.
at Université de Lausanne, Switzerland
Abstract:
BACKGROUND: QT interval prolongation carries an increased risk of torsade de pointes and death. AIM: We sought to determine the prevalence of QT prolongation in medical inpatients and to identify determinants of this condition. METHODS: We enrolled consecutive patients who were admitted to the internal medicine ward and who had an electrocardiogram performed within 24 h of admission. We collected information on baseline patient characteristics and the use of QT-prolonging drugs. Two blinded readers manually measured the QT intervals. QT intervals were corrected for heart rate using the traditional Bazett formula and the linear regression-based Framingham formula. We used logistic regression to identify patient characteristics and drugs that were independently associated with QTc prolongation. RESULTS: Of 537 inpatients, 22.3% had a prolonged QTc based on the Bazett formula. The adjusted odds for QTc prolongation based on the Bazett correction were significantly higher in patients who had liver disease (OR 2.9, 95% CI: 1.5-5.6), hypokalaemia (OR 3.3, 95% CI: 1.9-5.6) and who were taking ≥1 QT-prolonging drug at admission (OR 1.7, 95% CI: 1.1-2.6). Overall, 50.8% of patients with QTc prolongation received additional QT-prolonging drugs during hospitalisation. CONCLUSIONS: The prevalence of QTc prolongation was high among medical inpatients but depended on the method used to correct for heart rate. The use of QT-prolonging drugs, hypokalaemia and liver disease increased the risk of QTc prolongation. Many patients with QTc prolongation received additional QT-prolonging drugs during hospitalisation, further increasing the risk of torsade de pointes and death.
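The two heart-rate corrections used in the study are standard and easy to state: Bazett divides the measured QT by the square root of the preceding RR interval, while the Framingham correction is linear in RR. A minimal sketch (function names and example values are ours, not the study's):

```python
import math

def qtc_bazett(qt_ms: float, rr_s: float) -> float:
    """Bazett correction: QTc = QT / sqrt(RR); QT in ms, RR interval in seconds."""
    return qt_ms / math.sqrt(rr_s)

def qtc_framingham(qt_ms: float, rr_s: float) -> float:
    """Framingham (linear regression-based) correction: QTc = QT + 154 * (1 - RR)."""
    return qt_ms + 154.0 * (1.0 - rr_s)

# At 60 bpm (RR = 1 s) both corrections leave QT unchanged; at faster heart
# rates Bazett corrects more aggressively, which is one reason the prevalence
# of "prolonged QTc" depends on the correction method chosen.
print(qtc_bazett(400.0, 1.0))       # 400.0
print(qtc_bazett(400.0, 0.64))      # 500.0
print(round(qtc_framingham(400.0, 0.64), 2))  # 455.44
```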
Abstract:
Medical expenditure risk can pose a major threat to living standards. We derive decomposable measures of catastrophic medical expenditure risk from reference-dependent utility with loss aversion. We propose a quantile regression based method of estimating risk exposure from cross-section data containing information on the means of financing health payments. We estimate medical expenditure risk in seven Asian countries and find it is highest in Laos and China, and is lowest in Malaysia. Exposure to risk is generally higher for households that have less recourse to self-insurance, lower incomes, wealth and education, and suffer from chronic illness.
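The quantile-regression machinery the authors build on can be sketched in plain numpy: a linear model is fitted by minimising the pinball (check) loss, whose minimiser is the conditional quantile. This is a generic sketch with synthetic data, not the paper's estimator of risk exposure:

```python
import numpy as np

def pinball_loss(y, yhat, tau):
    """Check loss; its minimiser is the tau-th conditional quantile."""
    r = y - yhat
    return np.mean(np.maximum(tau * r, (tau - 1.0) * r))

def fit_quantile_regression(X, y, tau, lr=0.05, n_iter=5000):
    """Linear quantile regression via subgradient descent on the pinball loss."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        resid = y - X @ beta
        # Subgradient of the mean pinball loss with respect to beta.
        grad = -X.T @ (tau - (resid < 0)) / len(y)
        beta -= lr * grad
    return beta

# Synthetic demo: median regression (tau = 0.5) recovers y ≈ 1 + 2x.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 500)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + rng.normal(0, 0.2, 500)
beta = fit_quantile_regression(X, y, tau=0.5)
```

For risk-exposure questions like the paper's, an upper quantile (e.g. tau = 0.9) of the expenditure distribution would be targeted instead of the median.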
Abstract:
The present study constitutes an investigation of tobacco consumption, related attitudes and individual differences in smoking or non-smoking behaviors in a sample of adolescents of different ages in the French-speaking part of Switzerland. We investigated three school-age groups (7th grade, 9th grade, and the second year of high school) for differences in attitude and social and cognitive dimensions. We present both descriptive and inferential statistics. On an inferential level, we present a binary logistic regression-based model predicting risk of smoking. The resulting model most importantly suggests a strong relationship between smoking and alcohol consumption (both regular and sporadic). We interpret this result in terms of both the impact of current campaigns and the cognitive processes associated with adolescence.
Abstract:
Aim: The imperfect detection of species may lead to erroneous conclusions about species-environment relationships. Accuracy in species detection usually requires temporal replication at sampling sites, a time-consuming and costly monitoring scheme. Here, we applied a lower-cost alternative based on a double-sampling approach to incorporate the reliability of species detection into regression-based species distribution modelling. Location: Doñana National Park (south-western Spain). Methods: Using species-specific monthly detection probabilities, we estimated the detection reliability as the probability of having detected the species given the species-specific survey time. Such reliability estimates were used to account explicitly for data uncertainty by weighting each absence. We illustrated how this novel framework can be used to evaluate four competing hypotheses as to what constitutes primary environmental control of amphibian distribution: breeding habitat, aestivating habitat, spatial distribution of surrounding habitats and/or major ecosystem zonation. The study was conducted on six pond-breeding amphibian species during a 4-year period. Results: Non-detections should not be considered equivalent to real absences, as their reliability varied considerably. The occurrence of Hyla meridionalis and Triturus pygmaeus was related to a particular major ecosystem of the study area, where suitable habitat for these species seemed to be widely available. Characteristics of the breeding habitat (area and hydroperiod) were of high importance for the occurrence of Pelobates cultripes and Pleurodeles waltl. Terrestrial characteristics were the most important predictors of the occurrence of Discoglossus galganoi and Lissotriton boscai, along with the spatial distribution of breeding habitats for the last species. Main conclusions: We did not find a single best-supported hypothesis valid for all species, which stresses the importance of multiscale and multifactor approaches.
More importantly, this study shows that estimating the reliability of non-detection records, an exercise that had been previously seen as a naïve goal in species distribution modelling, is feasible and could be promoted in future studies, at least in comparable systems.
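The weighting scheme described above can be sketched concretely: if monthly visits are assumed independent, the reliability of an absence record is the probability of at least one detection given presence, and each absence can then carry that reliability as a weight in an otherwise ordinary logistic species distribution model. A sketch under those assumptions (function names are ours):

```python
import numpy as np

def detection_reliability(p_monthly, n_months):
    """P(at least one detection | species present), assuming independent
    monthly surveys each with detection probability p_monthly."""
    return 1.0 - (1.0 - np.asarray(p_monthly)) ** np.asarray(n_months)

def fit_weighted_logistic(X, y, w, lr=0.1, n_iter=5000):
    """Logistic regression where each record carries a weight w
    (here: 1 for presences, detection reliability for absences)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        beta -= lr * X.T @ (w * (p - y)) / w.sum()
    return beta

# A 4-month survey with 30% monthly detectability yields a fairly
# reliable, but not certain, absence record:
print(round(float(detection_reliability(0.3, 4)), 4))  # 0.7599
```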
Abstract:
Background and objective: Cefepime was one of the most used broad-spectrum antibiotics in Swiss public acute care hospitals. The drug was withdrawn from the market in January 2007, and then replaced by a generic in October 2007. The goal of the study was to evaluate changes in the use of broad-spectrum antibiotics after the withdrawal of the original cefepime product. Design: A generalized regression-based interrupted time series model incorporating autocorrelated errors assessed how much the withdrawal changed the monthly use of other broad-spectrum antibiotics (ceftazidime, imipenem/cilastatin, meropenem, piperacillin/tazobactam) in defined daily doses (DDD)/100 bed-days from January 2004 to December 2008 [1, 2]. Setting: 10 Swiss public acute care hospitals (7 with <200 beds, 3 with 200-500 beds). Nine hospitals (group A) had a shortage of cefepime and 1 hospital had no shortage thanks to importation of cefepime from abroad. Main outcome measures: Underlying trend of use before the withdrawal, and changes in the level and in the trend of use after the withdrawal. Results: Before the withdrawal, the average estimated underlying trend (coefficient b1) for cefepime was -0.047 (95% CI: -0.086, -0.009) DDD/100 bed-days per month, a decline that was significant in three hospitals (group A, P < 0.01). Cefepime withdrawal was associated with a significant increase in level of use (b2) of piperacillin/tazobactam and imipenem/cilastatin in, respectively, one and five hospitals from group A. After the withdrawal, the average estimated trend (b3) was greatest for piperacillin/tazobactam (+0.043 DDD/100 bed-days per month; 95% CI: -0.001, 0.089) and was significant in four hospitals from group A (P < 0.05). The hospital without drug shortage showed no significant change in the trend and the level of use. The hypothesis of seasonality was rejected in all hospitals.
Conclusions: The decreased use of cefepime already observed before its withdrawal from the market could be explained by pre-existing difficulty in drug supply. The withdrawal of cefepime resulted in a change in level of use for piperacillin/tazobactam and imipenem/cilastatin. Moreover, an increase in trend was found for piperacillin/tazobactam thereafter. As these changes generally occur at the price of lower bacterial susceptibility, a manufacturers' commitment to avoid shortages in the supply of their products would be important. As a next step, we will measure the impact of these changes on the costs and susceptibility rates of these antibiotics.
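The interrupted time series model referred to above is typically fitted as a segmented regression with four terms: baseline level (b0), underlying trend (b1), level change at the interruption (b2), and trend change after it (b3). The sketch below uses plain ordinary least squares on noiseless synthetic data (the autocorrelated-error component of the study's model is omitted, and the coefficient values merely echo the magnitudes reported in the abstract):

```python
import numpy as np

def its_design(t, t0):
    """Design matrix for segmented regression: columns are
    intercept (b0), pre-trend (b1), level change (b2), trend change (b3)."""
    post = (t >= t0).astype(float)
    return np.column_stack([np.ones_like(t, dtype=float), t, post, post * (t - t0)])

t = np.arange(60, dtype=float)                 # months of observation
t0 = 36.0                                      # month of the withdrawal
true = np.array([10.0, -0.047, 1.5, 0.043])    # b0, b1, b2, b3 (illustrative)
y = its_design(t, t0) @ true                   # noiseless DDD/100 bed-days series

coef, *_ = np.linalg.lstsq(its_design(t, t0), y, rcond=None)
print(np.round(coef, 3))  # recovers [10.    -0.047  1.5    0.043]
```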
Abstract:
Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
Abstract:
The original cefepime product was withdrawn from the Swiss market in January 2007, and replaced by a generic 10 months later. The goals of the study were to assess the impact of this cefepime shortage on the use and costs of alternative broad-spectrum antibiotics, on antibiotic policy, and on resistance of Pseudomonas aeruginosa towards carbapenems, ceftazidime and piperacillin-tazobactam. A generalized regression-based interrupted time series model assessed how much the shortage changed the monthly use and costs of cefepime and of selected alternative broad-spectrum antibiotics (ceftazidime, imipenem-cilastatin, meropenem, piperacillin-tazobactam) in 15 Swiss acute care hospitals from January 2005 to December 2008. Resistance of P. aeruginosa was compared before and after the cefepime shortage. There was a statistically significant increase in the consumption of piperacillin-tazobactam in hospitals with definitive interruption of cefepime supply, and of meropenem in hospitals with transient interruption of cefepime supply. Consumption of each alternative antibiotic tended to increase during the cefepime shortage and to decrease when the cefepime generic was released. These shifts were associated with significantly higher overall costs. There was no significant change in hospitals with uninterrupted cefepime supply. The alternative antibiotics for which an increase in consumption showed the strongest association with a progression of resistance were the carbapenems. The use of alternative antibiotics after cefepime withdrawal was associated with a significant increase in piperacillin-tazobactam and meropenem use and in overall costs, and with a decrease in susceptibility of P. aeruginosa in hospitals. This warrants caution with regard to shortages and withdrawals of antibiotics.
Abstract:
The predictive potential of six selected factors was assessed in 72 patients with primary myelodysplastic syndrome using univariate and multivariate logistic regression analysis of survival at 18 months. Factors were age (above median of 69 years), dysplastic features in the three myeloid bone marrow cell lineages, presence of chromosome defects, all metaphases abnormal, double or complex chromosome defects (C23), and a Bournemouth score of 2, 3, or 4 (B234). In the multivariate approach, B234 and C23 proved to be significantly associated with a reduction in the survival probability. The similarity of the regression coefficients associated with these two factors means that they have about the same weight. Consequently, the model was simplified by counting the number of factors (0, 1, or 2) present in each patient, thus generating a scoring system called the Lausanne-Bournemouth score (LB score). The LB score combines the well-recognized and easy-to-use Bournemouth score (B score) with the chromosome defect complexity, C23 constituting an additional indicator of patient outcome. The predicted risk of death within 18 months calculated from the model is as follows: 7.1% (confidence interval: 1.7-24.8) for patients with an LB score of 0, 60.1% (44.7-73.8) for an LB score of 1, and 96.8% (84.5-99.4) for an LB score of 2. The scoring system presented here has several interesting features. The LB score may improve the predictive value of the B score, as it is able to recognize two prognostic groups in the intermediate risk category of patients with B scores of 2 or 3. It also has the ability to identify two distinct prognostic subclasses among RAEB and possibly CMML patients. In addition to its above-described usefulness in the prognostic evaluation, the LB score may bring new insights into the understanding of evolution patterns in MDS.
We used the combination of the B score and chromosome complexity to define four classes which may be considered four possible states of myelodysplasia and which describe two distinct evolutionary pathways.
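The scoring rule itself is simple enough to state in a few lines: one point for a Bournemouth score of 2-4 (B234) and one for double or complex chromosome defects (C23), with the model's predicted 18-month death risks attached. A sketch (the function and constant names are ours):

```python
def lb_score(bournemouth_score: int, complex_defects: bool) -> int:
    """Lausanne-Bournemouth score: B234 (Bournemouth score of 2, 3 or 4)
    and C23 (double or complex chromosome defects) each add one point."""
    return int(2 <= bournemouth_score <= 4) + int(complex_defects)

# Predicted risk of death within 18 months, as reported from the fitted model:
PREDICTED_RISK_18M = {0: 0.071, 1: 0.601, 2: 0.968}

score = lb_score(3, False)        # B score 3, no complex defects
print(score, PREDICTED_RISK_18M[score])  # 1 0.601
```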
Abstract:
PURPOSE: According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. METHOD: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pair-wise Kolmogorov distances between IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). RESULTS: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, IRC differences in metamorphic rocks such as gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, variable importance as evaluated by random forests shows that building characteristics are less important predictors of IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. CONCLUSION: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of radon properties of rock types. This study provides an important element for radon risk communication.
Future approaches should take into account further variables, such as soil-gas radon measurements, as well as more detailed geological information.
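The clustering step described above can be sketched with plain numpy: compute a pairwise two-sample Kolmogorov distance between the IRC distributions of the units, then run k-medoids on the resulting distance matrix. This is a simplified sketch on simulated data; the greedy farthest-point initialisation is our choice, and the paper's own pipeline is more elaborate:

```python
import numpy as np

def ks_distance(a, b):
    """Two-sample Kolmogorov distance: sup |ECDF_a - ECDF_b|."""
    grid = np.sort(np.concatenate([a, b]))
    Fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(Fa - Fb))

def k_medoids(D, k, n_iter=100):
    """k-medoids on a precomputed distance matrix, farthest-point init."""
    medoids = [0]
    while len(medoids) < k:                       # spread the initial medoids out
        medoids.append(int(np.argmax(np.min(D[:, medoids], axis=1))))
    medoids = np.array(medoids)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)  # assign to nearest medoid
        new = []
        for j in range(k):                         # re-centre each cluster
            members = np.flatnonzero(labels == j)
            within = D[np.ix_(members, members)].sum(axis=1)
            new.append(members[np.argmin(within)])
        new = np.array(new)
        if np.array_equal(new, medoids):
            break
        medoids = new
    return labels, medoids

# Demo: six simulated "lithological units" drawing IRC values (log-normal,
# as radon concentrations typically are) from two distinct distributions.
rng = np.random.default_rng(1)
units = [rng.lognormal(3.0, 0.5, 300) for _ in range(3)] + \
        [rng.lognormal(5.0, 0.5, 300) for _ in range(3)]
D = np.array([[ks_distance(u, v) for v in units] for u in units])
labels, _ = k_medoids(D, k=2)
```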
Abstract:
BACKGROUND: Healthy lifestyle including sufficient physical activity may mitigate or prevent adverse long-term effects of childhood cancer. We described daily physical activities and sports in childhood cancer survivors and controls, and assessed determinants of both activity patterns. METHODOLOGY/PRINCIPAL FINDINGS: The Swiss Childhood Cancer Survivor Study is a questionnaire survey including all children diagnosed with cancer 1976-2003 at age 0-15 years, registered in the Swiss Childhood Cancer Registry, who survived ≥5 years and reached adulthood (≥20 years). Controls came from the population-based Swiss Health Survey. We compared the two populations and determined risk factors for both outcomes in separate multivariable logistic regression models. The sample included 1058 survivors and 5593 controls (response rates 78% and 66%). Sufficient daily physical activities were reported by 52% (n = 521) of survivors and 37% (n = 2069) of controls (p<0.001). In contrast, 62% (n = 640) of survivors and 65% (n = 3635) of controls reported engaging in sports (p = 0.067). Risk factors for insufficient daily activities in both populations were: older age (OR for ≥35 years: 1.5, 95% CI 1.2-2.0), female gender (OR 1.6, 95% CI 1.3-1.9), French/Italian speaking (OR 1.4, 95% CI 1.1-1.7), and higher education (OR for university education: 2.0, 95% CI 1.5-2.6). Risk factors for no sports were: being a survivor (OR 1.3, 95% CI 1.1-1.6), older age (OR for ≥35 years: 1.4, 95% CI 1.1-1.8), migration background (OR 1.5, 95% CI 1.3-1.8), French/Italian speaking (OR 1.4, 95% CI 1.2-1.7), lower education (OR for compulsory schooling only: 1.6, 95% CI 1.2-2.2), being married (OR 1.7, 95% CI 1.5-2.0), having children (OR 1.3, 95% CI 1.4-1.9), obesity (OR 2.4, 95% CI 1.7-3.3), and smoking (OR 1.7, 95% CI 1.5-2.1). Type of diagnosis was only associated with sports.
CONCLUSIONS/SIGNIFICANCE: Physical activity levels in survivors were lower than recommended, but comparable to controls and mainly determined by socio-demographic and cultural factors. Strategies to improve physical activity levels could be similar as for the general population.
Abstract:
Species distribution models (SDMs) are increasingly used to predict environmentally induced range shifts of habitats of plant and animal species. Consequently, SDMs are valuable tools for scientifically based conservation decisions. The aims of this paper are (1) to identify important drivers of butterfly species persistence or extinction, and (2) to analyse the responses of endangered butterfly species of dry grasslands and wetlands to likely future landscape changes in Switzerland. Future land use was represented by four scenarios describing: (1) ongoing land use changes as observed at the end of the last century; (2) a liberalisation of the agricultural markets; (3) a slightly lowered agricultural production; and (4) a strongly lowered agricultural production. Two model approaches have been applied. The first (logistic regression with principal components) explains what environmental variables have significant impact on species presence (and absence). The second (predictive SDM) is used to project species distribution under current and likely future land uses. The results of the explanatory analyses reveal that four principal components related to urbanisation, abandonment of open land and intensive agricultural practices as well as two climate parameters are primary drivers of species occurrence (decline). The scenario analyses show that lowered agricultural production is likely to favour dry grassland species due to an increase of non-intensively used land, open canopy forests, and overgrown areas. In the liberalisation scenario dry grassland species show a decrease in abundance due to a strong increase of forested patches. Wetland butterfly species would decrease under all four scenarios as their habitats become overgrown.
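The first model type mentioned above, logistic regression on principal components, can be sketched with numpy alone: project the (centred) environmental variables onto their leading principal components, then fit a logistic model for presence/absence on the component scores. The environmental data here are synthetic stand-ins, not the study's variables:

```python
import numpy as np

def pca_scores(X, n_components):
    """Project centred data onto its leading principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def fit_logistic(Z, y, lr=0.1, n_iter=3000):
    """Plain logistic regression by gradient descent (intercept included)."""
    Zb = np.column_stack([np.ones(len(Z)), Z])
    beta = np.zeros(Zb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(Zb @ beta)))
        beta -= lr * Zb.T @ (p - y) / len(y)
    return beta

def predict(Z, beta):
    Zb = np.column_stack([np.ones(len(Z)), Z])
    return (1.0 / (1.0 + np.exp(-(Zb @ beta))) > 0.5).astype(int)

# Synthetic presence/absence data: two of five environmental axes informative.
rng = np.random.default_rng(2)
X = rng.normal(0, 1, (200, 5))
X[100:, :2] += 3.0                  # "presence" sites shifted along two axes
y = np.repeat([0.0, 1.0], 100)
Z = pca_scores(X, 2)
beta = fit_logistic(Z, y)
acc = np.mean(predict(Z, beta) == y)
```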
Abstract:
Background: Blood pressure (BP) is strongly associated with body weight and there is concern that the pediatric overweight epidemic could lead to an increase in children's mean BP. Objectives: We analyzed BP trends from 1998 to 2006 among children of the Seychelles, a rapidly developing middle-income country in Africa. Methods: Serial school-based surveys of weight, height and BP were conducted yearly between 1998 and 2006 among all students of the country in four school grades (kindergarten, 4th, 7th and 10th years of compulsory school). We used the CDC criteria to define "overweight" (BMI ≥95th sex- and age-specific percentile) and the NHBPEP criteria for "elevated BP" (BP ≥95th sex-, age-, and height-specific percentile). Methods for height, weight, and BP measurements were identical over the study period. The trends in mean BMI and mean systolic/diastolic BP were assessed with linear regression. Results: 27,703 children aged 4-18 years (participation rate: 79%) contributed 43,927 observations on weight, height, and BP. The prevalence of overweight increased from 5.1% in 1998-2000 to 8.1% in 2004-2006 among boys, and from 6.1% to 9.1% among girls, respectively. The prevalence of elevated BP was 8.4% in 1998-2000 and 6.9% in 2004-2006 among boys; 9.8% and 7.8% among girls, respectively. Over the 9-year study period, age-adjusted body mass index (BMI) increased by 0.078 kg/m2/year in boys and by 0.083 kg/m2/year in girls (both sexes, P < 0.001). Age- and height-adjusted systolic BP decreased by 0.37 mmHg/year in boys and by 0.34 mmHg/year in girls (both sexes, P < 0.001). Diastolic BP did not change in boys (-0.02 mmHg/year, P = 0.40) and slightly increased in girls (0.07 mmHg/year, P = 0.003). These trend estimates were altered modestly upon further adjustment for BMI or if analyses were based on median rather than mean values.
Conclusion: Although body weight increased markedly between 1998 and 2006 in this population, systolic BP decreased and diastolic BP changed only marginally. This suggests that population increases in body weight are not necessarily associated with corresponding rises in BP in children.
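An age- and height-adjusted trend of the kind reported above is simply the year coefficient in a multiple linear regression of BP on survey year, age, and height. A noiseless synthetic sketch (the -0.37 mmHg/year slope is planted to echo the abstract's estimate; all other values are invented):

```python
import numpy as np

# Synthetic, noiseless data built so the true adjusted year trend is known.
rng = np.random.default_rng(3)
n = 1000
year = rng.integers(1998, 2007, n).astype(float)
age = rng.uniform(4, 18, n)                          # years
height = 80.0 + 6.0 * age + rng.normal(0, 5, n)      # cm, loosely tied to age
sbp = 95.0 - 0.37 * (year - 1998) + 0.8 * age + 0.05 * height  # mmHg

# Regress systolic BP on year with age and height as covariates.
X = np.column_stack([np.ones(n), year - 1998, age, height])
coef, *_ = np.linalg.lstsq(X, sbp, rcond=None)
print(round(coef[1], 3))  # -0.37  (mmHg/year, age- and height-adjusted)
```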
Abstract:
BACKGROUND: Recommended oral voriconazole (VRC) doses are lower than intravenous doses. Because plasma concentrations impact efficacy and safety of therapy, optimizing individual drug exposure may improve these outcomes. METHODS: A population pharmacokinetic analysis (NONMEM) was performed on 505 plasma concentration measurements involving 55 patients with invasive mycoses who received recommended VRC doses. RESULTS: A 1-compartment model with first-order absorption and elimination best fitted the data. VRC clearance was 5.2 L/h, the volume of distribution was 92 L, the absorption rate constant was 1.1 hour(-1), and oral bioavailability was 0.63. Severe cholestasis decreased VRC elimination by 52%. A large interpatient variability was observed on clearance (coefficient of variation [CV], 40%) and bioavailability (CV 84%), and an interoccasion variability was observed on bioavailability (CV, 93%). Lack of response to therapy occurred in 12 of 55 patients (22%), and grade 3 neurotoxicity occurred in 5 of 55 patients (9%). A logistic multivariate regression analysis revealed an independent association between VRC trough concentrations and probability of response or neurotoxicity by identifying a therapeutic range of 1.5 mg/L (>85% probability of response) to 4.5 mg/L (<15% probability of neurotoxicity). Population-based simulations with the recommended 200 mg oral or 300 mg intravenous twice-daily regimens predicted probabilities of 49% and 87%, respectively, for achievement of 1.5 mg/L and of 8% and 37%, respectively, for achievement of 4.5 mg/L. With 300-400 mg twice-daily oral doses and 200-300 mg twice-daily intravenous doses, the predicted probabilities of achieving the lower target concentration were 68%-78% for the oral regimen and 70%-87% for the intravenous regimen, and the predicted probabilities of achieving the upper target concentration were 19%-29% for the oral regimen and 18%-37% for the intravenous regimen. 
CONCLUSIONS: Higher oral than intravenous VRC doses, followed by individualized adjustments based on measured plasma concentrations, improve achievement of the therapeutic target that maximizes the probability of therapeutic response and minimizes the probability of neurotoxicity. These findings challenge dose recommendations for VRC.
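The one-compartment model with first-order absorption and elimination has a closed-form concentration-time curve. Plugging in the population estimates reported above (CL 5.2 L/h, V 92 L, ka 1.1 h^-1, F 0.63) gives an illustrative single-dose profile; the 200 mg dose is the recommended oral unit dose mentioned in the abstract, and this single-dose sketch ignores accumulation under twice-daily dosing:

```python
import math

def conc_oral_1cpt(t_h, dose_mg, CL=5.2, V=92.0, ka=1.1, F=0.63):
    """Plasma concentration (mg/L) at time t_h after a single oral dose,
    one-compartment model with first-order absorption and elimination."""
    ke = CL / V                                    # elimination rate constant, 1/h
    return (F * dose_mg * ka) / (V * (ka - ke)) * (
        math.exp(-ke * t_h) - math.exp(-ka * t_h))

# Time of peak concentration: tmax = ln(ka/ke) / (ka - ke) ≈ 2.8 h here.
ke = 5.2 / 92.0
tmax = math.log(1.1 / ke) / (1.1 - ke)
print(round(tmax, 1), round(conc_oral_1cpt(tmax, 200), 2))
```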
Abstract:
OBJECTIVES: It is still debated whether pre-existing minority drug-resistant HIV-1 variants (MVs) affect the virological outcomes of first-line NNRTI-containing ART. METHODS: This Europe-wide case-control study included ART-naive subjects infected with drug-susceptible HIV-1 as revealed by population sequencing, who achieved virological suppression on first-line ART including one NNRTI. Cases experienced virological failure and controls were subjects from the same cohort whose viraemia remained suppressed at a matched time since initiation of ART. Blinded, centralized 454 pyrosequencing with parallel bioinformatic analysis in two laboratories was used to identify MVs in the 1%-25% frequency range. ORs of virological failure according to MV detection were estimated by logistic regression. RESULTS: Two hundred and sixty samples (76 cases and 184 controls), mostly subtype B (73.5%), were used for the analysis. Identical MVs were detected in the two laboratories. 31.6% of cases and 16.8% of controls harboured pre-existing MVs. Detection of at least one MV versus no MVs was associated with an increased risk of virological failure (OR = 2.75, 95% CI = 1.35-5.60, P = 0.005); similar associations were observed for at least one NRTI MV versus none (OR = 2.27, 95% CI = 0.76-6.77, P = 0.140) and at least one NNRTI MV versus none (OR = 2.41, 95% CI = 1.12-5.18, P = 0.024). A dose-effect relationship between virological failure and mutational load was found. CONCLUSIONS: Pre-existing MVs more than double the risk of virological failure to first-line NNRTI-based ART.
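As a sanity check, a crude (unadjusted) odds ratio can be reconstructed from the reported percentages. The counts below are our rounding (31.6% of 76 cases ≈ 24; 16.8% of 184 controls ≈ 31), and the result differs from the reported OR of 2.75 because the study's estimate comes from matched, adjusted logistic regression:

```python
def odds_ratio(a, b, c, d):
    """Crude OR from a 2x2 table: MV-positive/MV-negative
    cases (a, b) and controls (c, d)."""
    return (a / b) / (c / d)

# Counts reconstructed from the abstract's percentages (hypothetical rounding):
crude_or = odds_ratio(24, 76 - 24, 31, 184 - 31)
print(round(crude_or, 2))  # 2.28
```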