233 results for Multivariable logistic regression


Relevance: 90.00%

Publisher:

Abstract:

Although anaemia is associated with adverse outcomes in other cardiopulmonary diseases, limited evidence exists on its prognostic value in patients with acute pulmonary embolism (PE). We sought to examine the associations between anaemia and mortality and length of hospital stay in patients with PE. We evaluated 14,276 patients with a primary diagnosis of PE from 186 hospitals in Pennsylvania, USA. We used random-intercept logistic regression to assess the association between anaemia at the time of presentation and 30-day mortality, and discrete-time logistic hazard models to assess the association between anaemia and time to hospital discharge, adjusting for patient (age, gender, race, insurance type, clinical and laboratory variables) and hospital (region, size, teaching status) factors. Anaemia was present in 38.7% of patients at admission. Patients with anaemia had a higher 30-day mortality (13.7% vs. 6.3%; p <0.001) and a longer length of stay (geometric mean, 6.9 vs. 6.6 days; p <0.001) compared to patients without anaemia. In multivariable analyses, anaemia remained associated with an increased odds of death (OR 1.82, 95% CI: 1.60-2.06) and a decreased odds of discharge (OR 0.85, 95% CI: 0.82-0.89). Anaemia is very common in patients presenting with PE and is independently associated with increased short-term mortality and a longer length of stay.
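
As an illustration of the type of model described above, here is a minimal Python sketch using statsmodels: a logistic regression of 30-day mortality with hospital-clustered standard errors, which only approximates the paper's random-intercept specification. The DataFrame and column names (died_30d, anaemia, age, female, hospital_id) are assumptions, not the study's variables.

```python
# Hedged sketch: cluster-adjusted logistic regression as a stand-in for the
# random-intercept model described in the abstract above. Column names are
# hypothetical: died_30d (0/1), anaemia (0/1), age, female (0/1), hospital_id.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf


def fit_mortality_model(df: pd.DataFrame):
    """Logistic model of 30-day mortality with hospital-clustered standard errors."""
    model = smf.logit("died_30d ~ anaemia + age + female", data=df)
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["hospital_id"]})
    odds_ratios = np.exp(result.params)   # e.g. the OR for anaemia
    conf_int = np.exp(result.conf_int())  # 95% CIs on the OR scale
    return result, odds_ratios, conf_int
```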

Relevance: 90.00%

Publisher:

Abstract:

BACKGROUND: In order to facilitate and improve the use of antiretroviral therapy (ART), international recommendations are released and updated regularly. We aimed to study if adherence to the recommendations is associated with better treatment outcomes in the Swiss HIV Cohort Study (SHCS). METHODS: Initial ART regimens prescribed to participants between 1998 and 2007 were classified according to IAS-USA recommendations. Baseline characteristics of patients who received regimens in violation with these recommendations (violation ART) were compared to other patients. Multivariable logistic and linear regression analyses were performed to identify associations between violation ART and (i) virological suppression and (ii) CD4 cell count increase, after one year. RESULTS: Between 1998 and 2007, 4189 SHCS participants started 241 different ART regimens. A violation ART was started in 5% of patients. Female patients (adjusted odds ratio aOR 1.83, 95%CI 1.28-2.62), those with a high education level (aOR 1.49, 95%CI 1.07-2.06) or a high CD4 count (aOR 1.53, 95%CI 1.02-2.30) were more likely to receive violation ART. The proportion of patients with an undetectable viral load (<400 copies/mL) after one year was significantly lower with violation ART than with recommended regimens (aOR 0.54, 95% CI 0.37-0.80) whereas CD4 count increase after one year of treatment was similar in both groups. CONCLUSIONS: Although more than 240 different initial regimens were prescribed, violations of the IAS-USA recommendations were uncommon. Patients receiving these regimens were less likely to have an undetectable viral load after one year, which strengthens the validity of these recommendations.

Relevance: 90.00%

Publisher:

Abstract:

INTRODUCTION: Reduced cerebral perfusion pressure (CPP) may worsen secondary damage and outcome after severe traumatic brain injury (TBI); however, the optimal management of CPP is still debated. STUDY HYPOTHESIS: We hypothesized that the impact of CPP on outcome is related to brain tissue oxygen tension (PbtO2) level and that reduced CPP may worsen TBI prognosis when it is associated with brain hypoxia. DESIGN: Retrospective analysis of a prospective database. METHODS: We analyzed 103 patients with severe TBI who underwent continuous PbtO2 and CPP monitoring for an average of 5 days. For each patient, the duration of reduced CPP (<60 mm Hg) and brain hypoxia (PbtO2 <15 mm Hg for >30 min [1]) was calculated with a linear interpolation method, and the relationship between CPP and PbtO2 was analyzed with Pearson's linear correlation coefficient. Outcome at 30 days was assessed with the Glasgow Outcome Score (GOS), dichotomized as good (GOS 4-5) versus poor (GOS 1-3). Multivariable associations with outcome were analyzed with stepwise forward logistic regression. RESULTS: Reduced CPP (n=790 episodes; mean duration 10.2 ± 12.3 h) was observed in 75 (74%) patients and was frequently associated with brain hypoxia (46/75; 61%). Time during which reduced CPP was associated with normal brain oxygen did not differ significantly between patients with poor versus those with good outcome (8.2 ± 8.3 vs. 6.5 ± 9.7 h; P=0.35). In contrast, time during which reduced CPP occurred simultaneously with brain hypoxia was longer in patients with poor than in those with good outcome (3.3±7.4 vs. 0.8±2.3 h; P=0.02). Outcome was significantly worse in patients who had both reduced CPP and brain hypoxia (61% had GOS 1-3 vs. 17% in those with reduced CPP but no brain hypoxia; P<0.01). Patients in whom a positive CPP-PbtO2 correlation (r>0.3) was found also were more likely to have a poor outcome (69 vs. 31% in patients with no CPP-PbtO2 correlation; P<0.01). Brain hypoxia was an independent risk factor for poor prognosis (odds ratio for favorable outcome of 0.89 [95% CI 0.79-1.00] per hour spent with a PbtO2 <15 mm Hg; P=0.05, adjusted for CPP, age, GCS, Marshall CT and APACHE II). CONCLUSIONS: Low CPP may significantly worsen outcome after severe TBI when it is associated with brain tissue hypoxia. PbtO2-targeted management of CPP may optimize TBI therapy and improve outcome of head-injured patients.
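
The core of the analysis can be sketched as follows in Python, under an assumed data layout (the columns cpp, pbto2, poor_outcome, age, gcs, apache2 are hypothetical): the study used linear interpolation and stepwise forward selection, whereas this sketch simply counts fixed monitoring steps and fits one prespecified logistic model.

```python
# Illustrative sketch (assumed data layout, not the study code): compute each
# patient's hours with simultaneous low CPP and brain hypoxia, the CPP-PbtO2
# Pearson correlation, then model poor outcome (GOS 1-3) on that exposure.
import pandas as pd
import statsmodels.formula.api as smf


def summarize_monitoring(monitor: pd.DataFrame, step_hours: float = 0.5) -> pd.Series:
    """`monitor`: one patient's time series with columns 'cpp' and 'pbto2'."""
    both_low = (monitor["cpp"] < 60) & (monitor["pbto2"] < 15)
    return pd.Series({
        "hours_low_cpp_hypoxia": float(both_low.sum()) * step_hours,
        "cpp_pbto2_corr": monitor["cpp"].corr(monitor["pbto2"]),  # Pearson's r
    })


def fit_outcome_model(patients: pd.DataFrame):
    """`patients`: one row per patient with poor_outcome (0/1) and covariates."""
    return smf.logit(
        "poor_outcome ~ hours_low_cpp_hypoxia + age + gcs + apache2", data=patients
    ).fit()
```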

Relevance: 90.00%

Publisher:

Abstract:

OBJECTIVES: We aimed to (i) evaluate psychological distress in adolescent survivors of childhood cancer and compare them to siblings and a norm population; (ii) compare the severity of distress of distressed survivors and siblings with that of psychotherapy patients; and (iii) determine risk factors for psychological distress in survivors. METHODS: We sent a questionnaire to all childhood cancer survivors aged <16 years when diagnosed, who had survived ≥ 5 years and were aged 16-19 years at the time of study. Our control groups were same-aged siblings, a norm population, and psychotherapy patients. Psychological distress was measured with the Brief Symptom Inventory-18 (BSI-18) assessing somatization, depression, anxiety, and a global severity index (GSI). Participants with a T-score ≥ 57 were defined as distressed. We used logistic regression to determine risk factors. RESULTS: We evaluated the BSI-18 in 407 survivors and 102 siblings. Fifty-two survivors (13%) and 11 siblings (11%) had scores above the distress threshold (T ≥ 57). Distressed survivors scored significantly higher in somatization (p=0.027) and GSI (p=0.016) than distressed siblings, and also scored higher in somatization (p ≤ 0.001) and anxiety (p=0.002) than psychotherapy patients. In the multivariable regression, psychological distress was associated with female sex, self-reported late effects, and low perceived parental support. CONCLUSIONS: The majority of survivors did not report psychological distress. However, the severity of distress of distressed survivors exceeded that of distressed siblings and psychotherapy patients. Systematic psychological follow-up can help to identify survivors at risk and support them during the challenging period of adolescence.

Relevance: 90.00%

Publisher:

Abstract:

BACKGROUND: Non-adherence is one of the strongest predictors of therapeutic failure in HIV-positive patients. Virologic failure with subsequent emergence of resistance reduces future treatment options and long-term clinical success. METHODS: Prospective observational cohort study including patients starting a new class of antiretroviral therapy (ART) between 2003 and 2010. Participants were naïve to the ART class and completed ≥1 adherence questionnaire prior to resistance testing. Outcomes were development of any IAS-USA, class-specific, or M184V mutations. Associations between adherence and resistance were estimated using logistic regression models stratified by ART class. RESULTS: Of 314 included individuals, 162 started an NNRTI and 152 a PI/r regimen. Adherence was similar between groups, with 85% reporting adherence ≥95%. The number of new mutations increased with increasing non-adherence. In the NNRTI group, multivariable models indicated a significant linear association in the odds of developing IAS-USA (odds ratio (OR) 1.66, 95% confidence interval (CI): 1.04-2.67) or class-specific (OR 1.65, 95% CI: 1.00-2.70) mutations. Levels of drug resistance were considerably lower in the PI/r group, and adherence was only significantly associated with M184V mutations (OR 8.38, 95% CI: 1.26-55.70). Adherence was significantly associated with HIV RNA in PI/r but not NNRTI regimens. CONCLUSION: Therapies containing PI/r appear more forgiving of incomplete adherence than NNRTI regimens, which allow higher levels of resistance, even with adherence above 95%. However, in failing PI/r regimens good adherence may prevent the accumulation of further resistance mutations and therefore help to preserve future drug options. In contrast, adherence levels have little impact on NNRTI treatments once the first mutations have emerged.

Relevance: 90.00%

Publisher:

Abstract:

BACKGROUND: Smokeless tobacco is of increasing interest to public health researchers and policy makers. This study aims to measure the prevalence of smokeless tobacco use (nasal dry snuff, snus and chewing tobacco) among young Swiss men, and to describe its correlates. METHODS: We invited 13 245 young men to participate in this survey on socio-economic and substance use data. The response rate was 45.2%. We included 5720 participants. Descriptive statistics and multivariable-adjusted logistic regression were performed. RESULTS: The mean age of participants was 19.5 years. Self-reported use once a month or more often was 8% for nasal dry snuff, 3% for snus and negligible for chewing tobacco. In multivariable-adjusted logistic regression, the odds of nasal dry snuff use increased in non-daily smokers [odds ratio (OR) 2.41, 95% confidence interval (CI) 1.90-3.05], compared with non-smokers, in participants reporting a risky weekly drinking volume (OR 3.93, 95% CI 1.86-8.32), compared with abstainers, and in those binge drinking once a month or more often (OR 7.41, 95% CI 4.11-13.38), compared with never binge drinking. Nasal dry snuff use was positively associated with higher BMI, average or above-average family income and German language, compared with French, and negatively associated with academic higher education, compared with no higher education, and occasional cannabis use, compared with no cannabis use. Correlates of snus use were similar to those of nasal dry snuff. CONCLUSION: One in 12 young Swiss men uses nasal dry snuff and 3% use snus. Consumption of smokeless tobacco is associated with a cluster of other risky behaviours, especially binge drinking.

Relevance: 90.00%

Publisher:

Abstract:

Background: Evidence for a better performance of different highly atherogenic versus traditional lipid parameters for coronary heart disease (CHD) risk prediction is conflicting. We investigated the association of the ratios of small dense low-density lipoprotein (LDL)/apolipoprotein A-I, apolipoprotein B/apolipoprotein A-I and total cholesterol/HDL-cholesterol with CHD events in patients on combination antiretroviral therapy (cART). Methods: Case-control study nested within the Swiss HIV Cohort Study: for each cART-treated patient with a first coronary event between April 1, 2000 and July 31, 2008 (case) we selected four control patients who (1) were without coronary events until the date of the event of the index case, (2) had a plasma sample within ±30 days of the sample date of the respective case, (3) received cART and (4) were then matched for age, gender and smoking status. Lipoproteins were measured by ultracentrifugation. Conditional logistic regression models were used to estimate the independent effects of different lipid ratios on the occurrence of coronary events. Results: In total, 98 cases (19 fatal myocardial infarctions [MI] and 79 non-fatal coronary events [53 definite MIs, 15 possible MIs and 11 coronary angioplasties or bypasses]) were matched with 392 controls. Cases were more often injecting drug users, less likely to be virologically suppressed and more often on abacavir-containing regimens. In separate multivariable models of total cholesterol, triglycerides, HDL-cholesterol, systolic blood pressure, abdominal obesity, diabetes and family history of CHD, small dense-LDL and apolipoprotein B were each statistically significantly associated with CHD events (for 1 mg/dl increase: odds ratio [OR] 1.05, 95% CI 1.00-1.11 and 1.15, 95% CI 1.01-1.31, respectively), but the ratios of small dense-LDL/apolipoprotein A-I (OR 1.26, 95% CI 0.95-1.67), apolipoprotein B/apolipoprotein A-I (OR 1.02, 95% CI 0.97-1.07) and HDL-cholesterol/total cholesterol (OR 0.99, 95% CI 0.98-1.00) were not. Following adjustment for HIV-related and cART variables these associations were weakened in each model: apolipoprotein B (OR 1.27, 95% CI 1.00-1.30), small dense-LDL (OR 1.04, 95% CI 0.99-1.20), small dense-LDL/apolipoprotein A-I (OR 1.17, 95% CI 0.87-1.58), apolipoprotein B/apolipoprotein A-I (OR 1.02, 95% CI 0.97-1.07) and total cholesterol/HDL-cholesterol (OR 0.99, 95% CI 0.99-1.00). Conclusions: In patients receiving cART, small dense-LDL and apolipoprotein B showed the strongest associations with CHD events in models controlling for traditional CHD risk factors including total cholesterol and triglycerides. Adding the small dense-LDL/apolipoprotein A-I, apolipoprotein B/apolipoprotein A-I and total cholesterol/HDL-cholesterol ratios did not further improve models of lipid parameters and associations of increased risk for CHD events.
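
A matched case-control design like this one calls for conditional logistic regression; a minimal statsmodels sketch follows, with hypothetical column names (case, sd_ldl, apob, matched_set) standing in for the study's variables.

```python
# Minimal sketch of a conditional logistic regression for the matched (1:4)
# case-control design above; column names are assumptions, not the study's.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit


def fit_matched_model(df: pd.DataFrame):
    """df columns assumed: case (1 = coronary event, 0 = control), sd_ldl, apob,
    and matched_set (identifier shared by a case and its four matched controls)."""
    exog = df[["sd_ldl", "apob"]]
    model = ConditionalLogit(df["case"], exog, groups=df["matched_set"])
    result = model.fit()
    return result, np.exp(result.params)  # odds ratios per unit increase
```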

Relevance: 90.00%

Publisher:

Abstract:

OBJECTIVES: Therapeutic hypothermia and pharmacological sedation may influence outcome prediction after cardiac arrest. The use of a multimodal approach, including clinical examination, electroencephalography, somatosensory-evoked potentials, and serum neuron-specific enolase, is recommended; however, no study examined the comparative performance of these predictors or addressed their optimal combination. DESIGN: Prospective cohort study. SETTING: Adult ICU of an academic hospital. PATIENTS: One hundred thirty-four consecutive adults treated with therapeutic hypothermia after cardiac arrest. MEASUREMENTS AND MAIN RESULTS: Variables related to the cardiac arrest (cardiac rhythm, time to return of spontaneous circulation), clinical examination (brainstem reflexes and myoclonus), electroencephalography reactivity during therapeutic hypothermia, somatosensory-evoked potentials, and serum neuron-specific enolase. Models to predict clinical outcome at 3 months (assessed using the Cerebral Performance Categories: 5 = death; 3-5 = poor recovery) were evaluated using ordinal logistic regressions and receiver operating characteristic curves. Seventy-two patients (54%) had a poor outcome (of whom 62 died), and 62 had a good outcome. Multivariable ordinal logistic regression identified absence of electroencephalography reactivity (p < 0.001), incomplete recovery of brainstem reflexes in normothermia (p = 0.013), and neuron-specific enolase higher than 33 μg/L (p = 0.029), but not somatosensory-evoked potentials, as independent predictors of poor outcome. The combination of clinical examination, electroencephalography reactivity, and neuron-specific enolase yielded the best predictive performance (receiver operating characteristic areas: 0.89 for mortality and 0.88 for poor outcome), with 100% positive predictive value. Addition of somatosensory-evoked potentials to this model did not improve prognostic accuracy. CONCLUSIONS: The combination of clinical examination, electroencephalography reactivity, and serum neuron-specific enolase offers the best outcome predictive performance for prognostication of early postanoxic coma, whereas somatosensory-evoked potentials do not add any complementary information. Although prognostication of poor outcome seems excellent, future studies are needed to further improve prediction of good prognosis, which still remains inaccurate.
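
The modelling strategy (ordinal logistic regression plus ROC analysis) could look roughly like the sketch below; variable names and codings (cpc, eeg_nonreactive, brainstem_incomplete, nse_high, poor_outcome) are assumptions, and the prognostic score used for the AUC is simply the model's linear predictor rather than the authors' exact procedure.

```python
# Sketch only (assumed variable coding): ordinal logistic regression of 3-month
# CPC on the three retained predictors, plus a ROC AUC for poor outcome.
import pandas as pd
from sklearn.metrics import roc_auc_score
from statsmodels.miscmodels.ordinal_model import OrderedModel


def evaluate_predictors(df: pd.DataFrame):
    """df columns assumed: cpc (ordered 1-5), eeg_nonreactive (0/1),
    brainstem_incomplete (0/1), nse_high (0/1), poor_outcome (CPC 3-5 = 1)."""
    exog = df[["eeg_nonreactive", "brainstem_incomplete", "nse_high"]]
    ordinal = OrderedModel(df["cpc"], exog, distr="logit").fit(method="bfgs", disp=False)
    # Linear predictor as a prognostic score for the dichotomized endpoint
    # (the slope coefficients come before the threshold parameters).
    score = exog.values @ ordinal.params.values[: exog.shape[1]]
    auc = roc_auc_score(df["poor_outcome"], score)
    return ordinal, auc
```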

Relevance: 90.00%

Publisher:

Abstract:

The predictive potential of six selected factors was assessed in 72 patients with primary myelodysplastic syndrome using univariate and multivariate logistic regression analysis of survival at 18 months. Factors were age (above the median of 69 years), dysplastic features in the three myeloid bone marrow cell lineages, presence of chromosome defects, all metaphases abnormal, double or complex chromosome defects (C23), and a Bournemouth score of 2, 3, or 4 (B234). In the multivariate approach, B234 and C23 proved to be significantly associated with a reduction in the survival probability. The similarity of the regression coefficients associated with these two factors means that they have about the same weight. Consequently, the model was simplified by counting the number of factors (0, 1, or 2) present in each patient, thus generating a scoring system called the Lausanne-Bournemouth score (LB score). The LB score combines the well-recognized and easy-to-use Bournemouth score (B score) with the chromosome defect complexity, C23 constituting an additional indicator of patient outcome. The predicted risk of death within 18 months calculated from the model is as follows: 7.1% (confidence interval: 1.7-24.8) for patients with an LB score of 0, 60.1% (44.7-73.8) for an LB score of 1, and 96.8% (84.5-99.4) for an LB score of 2. The scoring system presented here has several interesting features. The LB score may improve the predictive value of the B score, as it is able to recognize two prognostic groups in the intermediate-risk category of patients with B scores of 2 or 3. It also has the ability to identify two distinct prognostic subclasses among RAEB and possibly CMML patients. In addition to its above-described usefulness in prognostic evaluation, the LB score may bring new insights into the understanding of evolution patterns in MDS. We used the combination of the B score and chromosome complexity to define four classes which may be considered four possible states of myelodysplasia and which describe two distinct evolutionary pathways.
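
How predicted risks per score level fall out of such a model can be illustrated with a short sketch; the column names (death_18m, b234, c23) are hypothetical and the code is not the original analysis.

```python
# A small illustration (not the original analysis) of how predicted 18-month
# death risks per Lausanne-Bournemouth score can be read off a fitted logit.
import pandas as pd
import statsmodels.formula.api as smf


def lb_score_risks(df: pd.DataFrame):
    """df columns assumed: death_18m (0/1), b234 (0/1), c23 (0/1)."""
    df = df.assign(lb_score=df["b234"] + df["c23"])  # 0, 1 or 2 adverse factors
    fit = smf.logit("death_18m ~ lb_score", data=df).fit()
    grid = pd.DataFrame({"lb_score": [0, 1, 2]})
    grid["predicted_risk"] = fit.predict(grid)  # probability of death by 18 months
    return fit, grid
```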

Relevance: 90.00%

Publisher:

Abstract:

OBJECTIVE: Overanticoagulated medical inpatients may be particularly prone to bleeding complications. Among medical inpatients with excessive oral anticoagulation (AC), we sought to identify patient and treatment factors associated with bleeding. METHODS: We prospectively identified consecutive patients receiving oral AC admitted to the medical ward of a university hospital (February-July 2006) who had at least one international normalized ratio (INR) value >3.0 during the hospital stay. We recorded patient characteristics, AC-related factors, and concomitant treatments (e.g., platelet inhibitors) that increase the bleeding risk. The outcome was overall bleeding, defined as the occurrence of major or minor bleeding during the hospital stay. We used logistic regression to explore patient and treatment factors associated with bleeding. RESULTS: Overall, 145 inpatients with excessive oral AC comprised our study sample. Atrial fibrillation (59%) and venous thromboembolism (28%) were the most common indications for AC. Twelve patients (8.3%) experienced a bleeding event. Of these, 8 had major bleeding. Women had a somewhat higher risk of major bleeding than men (12.5% vs 4.1%, p = 0.08). Multivariable analysis demonstrated that female gender was independently associated with bleeding (odds ratio [OR] 4.3, 95% confidence interval [95% CI] 1.1-17.8). Age, history of major bleeding, value of the index INR, and concomitant treatment with platelet inhibitors were not independent predictors of bleeding. CONCLUSIONS: We found that hospitalized women experiencing an episode of excessive oral AC have a 4-fold increased risk of bleeding compared with men. Whether overanticoagulated women require more aggressive measures of AC reversal must be examined in further studies.

Relevance: 90.00%

Publisher:

Abstract:

STUDY OBJECTIVES: There is limited information regarding sleep duration and determinants in Switzerland. We aimed to assess the trends and determinants of time in bed as a proxy for sleep duration in the Swiss canton of Geneva. METHODS: Data from repeated, independent cross-sectional representative samples of adults (≥ 18 years) of the Geneva population were collected between 2005 and 2011. Self-reported time in bed, education, monthly income, and nationality were assessed by questionnaire. RESULTS: Data from 3,853 participants (50% women, 51.7 ± 10.9 years) were analyzed. No significant trend was observed between 2005 and 2011 regarding time in bed or the prevalence of short (≤ 6 h/day) and long (> 9 h/day) time in bed. Elderly participants reported a longer time in bed (year-adjusted mean ± standard error: 7.67 ± 0.02, 7.82 ± 0.03, and 8.41 ± 0.04 h/day for 35-50, 50-65, and 65+ years, respectively, p < 0.001), while shorter time in bed was reported by non-Swiss participants (7.77 ± 0.03 vs. 7.92 ± 0.03 h/day for Swiss nationals, p < 0.001), participants with higher education (7.92 ± 0.02 for non-university vs. 7.74 ± 0.03 h/day for university, p < 0.001) or higher income (8.10 ± 0.04, 7.84 ± 0.03, and 7.70 ± 0.03 h/day for < 5,000 SFr; 5,000-9,500 SFr, and > 9,500 SFr, respectively, p < 0.001). Multivariable-adjusted polytomous logistic regression showed short and long time in bed to be positively associated with obesity and negatively associated with income. CONCLUSION: In a Swiss adult population, sleep duration as assessed by time in bed did not change significantly between 2005 and 2011. Both clinical and socioeconomic factors influence time in bed.
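
A polytomous (multinomial) logistic regression of this kind might be sketched as follows, assuming a hypothetical coding of the time-in-bed categories and covariates (tib_cat, obese, income_chf, age, female).

```python
# Minimal sketch (hypothetical coding) of a polytomous (multinomial) logistic
# regression of time-in-bed category on obesity and income, as described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf


def fit_time_in_bed_model(df: pd.DataFrame):
    """df columns assumed: tib_cat (0 = normal, 1 = short <=6 h/day, 2 = long >9 h/day),
    obese (0/1), income_chf (monthly household income), age, female (0/1)."""
    fit = smf.mnlogit("tib_cat ~ obese + income_chf + age + female", data=df).fit()
    # One coefficient column per non-reference category (short, long) vs. normal.
    return fit, np.exp(fit.params)  # relative-risk ratios
```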

Relevance: 90.00%

Publisher:

Abstract:

OBJECTIVES: To assess the consequences of physical violence at work and identify their predictors. METHODS: Among the patients seen in a medicolegal consultation from 2007 to 2010, the subsample of workplace violence victims (n = 185) was identified and contacted again, on average 30 months after the assault. Eighty-six victims (47 %) participated. Ordinal logistic regression analyses assessed the effect of 9 potential risk factors on physical, psychological and work consequences summarized in a severity score (0-9). RESULTS: The severity score distribution was as follows: 4+: 14 %; 1-3: 42 %; and 0: 44 %. Initial psychological distress resulting from the violence was a strong predictor (p < 0.001) of the severity score, both for work-related and for long-term psychological consequences. Gender and age did not reach significance in multivariable analyses, even though female victims had overall more severe consequences. Unexpectedly, first-time violence was associated with long-term psychological and physical consequences only among workers whose jobs implied high awareness of the risk of violence (p = 0.004). Among the factors assessed at follow-up, perceived lack of employer support or absence of an employer was associated with higher values on the severity score. The seven other assessed factors (initial physical injuries; previous experience of violence; preexisting health problems; working alone; internal violence; lack of support from colleagues; and lack of support from family or friends) were not significantly associated with the severity score. CONCLUSIONS: Being a victim of workplace violence can result in long-term consequences for health and employment, and their severity increases with the seriousness of the initial psychological distress. Support from the employer can help prevent negative outcomes.

Relevance: 90.00%

Publisher:

Abstract:

BACKGROUND: Studies that systematically assess change in ulcerative colitis (UC) extent over time in adult patients are scarce. AIM: To assess changes in disease extent over time and to evaluate clinical parameters associated with this change. METHODS: Data from the Swiss IBD cohort study were analysed. We used logistic regression modelling to identify factors associated with a change in disease extent. RESULTS: A total of 918 UC patients (45.3% females) were included. At diagnosis, UC patients presented with the following disease extent: proctitis [199 patients (21.7%)], left-sided colitis [338 patients (36.8%)] and extensive colitis/pancolitis [381 (41.5%)]. During a median disease duration of 9 [4-16] years, progression and regression was documented in 145 patients (15.8%) and 149 patients (16.2%) respectively. In addition, 624 patients (68.0%) had a stable disease extent. The following factors were identified to be associated with disease progression: treatment with systemic glucocorticoids [odds ratio (OR) 1.704, P = 0.025] and calcineurin inhibitors (OR: 2.716, P = 0.005). No specific factors were found to be associated with disease regression. CONCLUSIONS: Over a median disease duration of 9 [4-16] years, about two-thirds of UC patients maintained the initial disease extent; the remaining one-third had experienced either progression or regression of the disease extent.

Relevance: 90.00%

Publisher:

Abstract:

BACKGROUND: Low vitamin D status has been associated with an increased risk of developing type 2 diabetes and insulin resistance (IR), although this has been recently questioned. OBJECTIVE: We examined the association between serum vitamin D metabolites and incident IR. METHODS: This was a prospective, population-based study derived from the CoLaus (Cohorte Lausannoise) study including 3856 participants (aged 51.2 ± 10.4 y; 2217 women) free from diabetes or IR at baseline. IR was defined as a homeostasis model assessment (HOMA) index >2.6. Fasting plasma insulin and glucose were measured at baseline and at follow-up to calculate the HOMA index. The association of vitamin D metabolites with incident IR was analyzed by logistic regression, and the results were expressed for each independent variable as ORs and 95% CIs. RESULTS: During the 5.5-y follow-up, 649 (16.9%) incident cases of IR were identified. Participants who developed IR had lower baseline serum concentrations of 25-hydroxyvitamin D3 [25(OH)D3 (25-hydroxycholecalciferol); 45.9 ± 22.8 vs. 49.9 ± 22.6 nmol/L; P < 0.001], total 25(OH)D3 (25(OH)D3 + epi-25-hydroxyvitamin D3 [3-epi-25(OH)D3]; 49.1 ± 24.3 vs. 53.3 ± 24.1 nmol/L; P < 0.001), and 3-epi-25(OH)D3 (4.2 ± 2.9 vs. 4.3 ± 2.5 nmol/L; P = 0.01) but a higher 3-epi- to total 25(OH)D3 ratio (0.09 ± 0.05 vs. 0.08 ± 0.04; P = 0.007). Multivariable analysis adjusting for month of sampling, age, and sex showed an inverse association between 25(OH)D3 and the likelihood of developing IR [ORs (95% CIs): 0.86 (0.68, 1.09), 0.60 (0.46, 0.78), and 0.57 (0.43, 0.75) for the second, third, and fourth quartiles compared with the first 25(OH)D3 quartile; P-trend < 0.001]. Similar associations were found between total 25(OH)D3 and incident IR. There was no significant association between 3-epi-25(OH)D3 and IR, yet a positive association was observed between the 3-epi- to total 25(OH)D3 ratio and incident IR. Further adjustment for body mass index, sedentary status, and smoking attenuated the association between 25(OH)D3, total 25(OH)D3, and the 3-epi- to total 25(OH)D3 ratio and the likelihood of developing IR. CONCLUSION: In the CoLaus study in healthy adults, the risk of incident IR is not associated with serum concentrations of 25(OH)D3 and total 25(OH)D3.
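
The quartile-based analysis can be sketched as below, with assumed column names (incident_ir, vitd, age, female, month); the published model adjusted for additional covariates not shown here.

```python
# Sketch (assumed columns) of the quartile analysis above: odds of incident
# insulin resistance across 25(OH)D3 quartiles, plus a simple P-trend model.
import pandas as pd
import statsmodels.formula.api as smf


def quartile_analysis(df: pd.DataFrame):
    """df columns assumed: incident_ir (0/1), vitd (baseline 25(OH)D3, nmol/L),
    age, female (0/1), month (month of blood sampling)."""
    df = df.assign(vitd_q=pd.qcut(df["vitd"], 4, labels=[1, 2, 3, 4]).astype(int))
    # Quartiles as categories: ORs for Q2-Q4 relative to the first quartile ...
    by_quartile = smf.logit("incident_ir ~ C(vitd_q) + age + female + C(month)",
                            data=df).fit()
    # ... and the quartile number as a single ordinal term for the trend test.
    trend = smf.logit("incident_ir ~ vitd_q + age + female + C(month)", data=df).fit()
    return by_quartile, trend
```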

Relevance: 80.00%

Publisher:

Abstract:

Species distribution models (SDMs) are increasingly used to predict environmentally induced range shifts of habitats of plant and animal species. Consequently, SDMs are valuable tools for scientifically based conservation decisions. The aims of this paper are (1) to identify important drivers of butterfly species persistence or extinction, and (2) to analyse the responses of endangered butterfly species of dry grasslands and wetlands to likely future landscape changes in Switzerland. Future land use was represented by four scenarios describing: (1) ongoing land use changes as observed at the end of the last century; (2) a liberalisation of the agricultural markets; (3) a slightly lowered agricultural production; and (4) a strongly lowered agricultural production. Two modelling approaches were applied. The first (logistic regression with principal components) identifies which environmental variables have a significant impact on species presence (and absence). The second (predictive SDM) is used to project species distributions under current and likely future land uses. The results of the explanatory analyses reveal that four principal components related to urbanisation, abandonment of open land and intensive agricultural practices, as well as two climate parameters, are primary drivers of species occurrence (decline). The scenario analyses show that lowered agricultural production is likely to favour dry grassland species due to an increase of non-intensively used land, open-canopy forests, and overgrown areas. In the liberalisation scenario, dry grassland species show a decrease in abundance due to a strong increase of forested patches. Wetland butterfly species would decrease under all four scenarios as their habitats become overgrown.
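
A rough sketch of the two modelling steps, under an assumed site-by-variable data layout (a 0/1 'presence' column plus environmental columns), combines principal components with a logistic regression; this is an illustration, not the authors' implementation.

```python
# Sketch (hypothetical predictor set) of the two modelling steps described above:
# principal components of the environmental variables feed a logistic regression
# for butterfly species presence/absence.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def fit_sdm(df: pd.DataFrame, env_columns: list):
    """df: one row per site, with a 0/1 'presence' column and environmental
    predictors (e.g. land-use shares, temperature, precipitation; all assumed)."""
    sdm = make_pipeline(
        StandardScaler(),            # put variables on a common scale before PCA
        PCA(n_components=4),         # four components, echoing the analysis above
        LogisticRegression(max_iter=1000),
    )
    sdm.fit(df[env_columns], df["presence"])
    return sdm  # sdm.predict_proba(scenario_env) projects occurrence under a scenario
```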