59 results for "Linear programming models"


Relevance: 80.00%

Publisher:

Abstract:

Meta-analyses have established elevated circulating fibrinogen and D-dimer levels as biological risk factors for the development and progression of coronary artery disease (CAD). Here, we investigated whether vital exhaustion (VE), a known psychosocial risk factor for CAD, is associated with fibrinogen and D-dimer levels in a sample of apparently healthy school teachers. Teaching has been proposed as a potentially highly stressful occupation owing to enhanced psychosocial stress at the workplace. Plasma fibrinogen and D-dimer levels were measured in 150 middle-aged male and female teachers from the first year of the Trier-Teacher-Stress-Study. Log-transformed levels were analyzed using linear regression. Results yielded a significant association between VE and fibrinogen (p = 0.02) but not D-dimer, controlling for relevant covariates. Further investigation of possible interaction effects revealed a significant association between fibrinogen and the interaction term "VE x gender" (p = 0.05). In a secondary analysis, we reran the linear regression models for males and females separately. These gender-specific results showed that the association between fibrinogen and VE remained significant in males but not in females. In sum, the present data support the notion that fibrinogen levels are positively related to VE. Elevated fibrinogen might be one biological pathway by which chronic work stress impacts teachers' cardiovascular health in the long run.
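The analysis described above can be sketched in a few lines: log-transform the outcome and regress it on VE, gender, and their product. This is a minimal illustration on simulated numbers (all data, effect sizes, and variable names below are invented, not taken from the study):

```python
import numpy as np

# Hypothetical sketch of a regression with a "VE x gender" interaction
# term, as described above. All data are simulated, not study data.
rng = np.random.default_rng(0)
n = 150
ve = rng.uniform(0, 10, n)      # vital-exhaustion score (invented scale)
male = rng.integers(0, 2, n)    # 1 = male, 0 = female
# simulate a VE effect on fibrinogen that is present mainly in males
fibrinogen = np.exp(1.0 + 0.03 * ve * male + rng.normal(0, 0.1, n))

y = np.log(fibrinogen)                                   # log-transform
X = np.column_stack([np.ones(n), ve, male, ve * male])   # interaction column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[3] estimates the "VE x gender" interaction coefficient
```

A significant interaction coefficient is what motivates refitting the model separately by gender, as the authors did.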

BACKGROUND: Peak oxygen uptake (peak VO2) is an established integrative measure of maximal exercise capacity in cardiovascular disease. After heart transplantation (HTx), peak VO2 remains reduced despite normal systolic left ventricular function, which highlights the relevance of diastolic function. In this study we aimed to characterize the predictive significance of cardiac allograft diastolic function for peak VO2. METHODS: Peak VO2 was measured using a ramp protocol on a bicycle ergometer. Left ventricular (LV) diastolic function was assessed with tissue Doppler imaging, measuring the velocity of the early (Ea) and late (Aa) apical movement of the mitral annulus, and with conventional Doppler, measuring early (E) and late (A) diastolic transmitral flow propagation. Correlation coefficients were calculated and linear regression models fitted. RESULTS: The post-transplant interval of the 39 HTx recipients ranged from 0.4 to 20.1 years. Mean recipient age was 55 ± 14 years and body mass index (BMI) was 25.4 ± 3.9 kg/m². Mean LV ejection fraction was 62 ± 4%, mean LV mass index 108 ± 22 g/m², and mean peak VO2 20.1 ± 6.3 ml/kg/min. Peak VO2 was reduced in patients with more severe diastolic dysfunction (pseudonormal or restrictive transmitral inflow pattern) or when E/Ea was ≥10. Peak VO2 correlated with recipient age (r = -0.643, p < 0.001), peak heart rate (r = 0.616, p < 0.001) and BMI (r = -0.417, p = 0.008). Of all echocardiographic measurements, Ea (r = 0.561, p < 0.001) and Ea/Aa (r = 0.495, p = 0.002) correlated best. Multivariate analysis identified age, heart rate, BMI and Ea/Aa as independent predictors of peak VO2. CONCLUSIONS: Diastolic dysfunction contributes to the limitation of maximal exercise capacity after HTx.
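The univariate results above are plain Pearson correlations; a minimal sketch on invented numbers (not study data):

```python
import numpy as np

# Pearson correlation between recipient age and peak VO2, mirroring the
# univariate step described above. The numbers are invented.
age = np.array([40.0, 48.0, 55.0, 62.0, 70.0])       # years
peak_vo2 = np.array([27.0, 24.0, 21.0, 17.0, 14.0])  # ml/kg/min

r = np.corrcoef(age, peak_vo2)[0, 1]   # strongly negative, as in the study
```

The multivariate step then fits a single linear model with age, heart rate, BMI, and Ea/Aa as joint predictors.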

OBJECTIVE: To evaluate the association between arterial blood pressure (ABP) during the first 24 h and mortality in sepsis. DESIGN: Retrospective cohort study. SETTING: Multidisciplinary intensive care unit (ICU). PATIENTS AND PARTICIPANTS: A total of 274 septic patients. INTERVENTIONS: None. MEASUREMENTS AND RESULTS: Hemodynamic and laboratory parameters were extracted from a PDMS database. The hourly time integral of ABP drops below clinically relevant systolic arterial pressure (SAP), mean arterial pressure (MAP), and mean perfusion pressure (MPP = MAP - central venous pressure) levels was calculated for the first 24 h after ICU admission and compared with 28-day mortality. Binary and linear regression models (adjusted for SAPS II as a measure of disease severity) and a receiver operating characteristic (ROC) analysis were applied. The areas under the ROC curve were largest for the hourly time integrals of ABP drops below MAP 60 mmHg (0.779 vs. 0.764 for ABP drops below MAP 55 mmHg; P ≤ 0.01) and MPP 45 mmHg. No association between the hourly time integrals of ABP drops below specific SAP levels and mortality was detected. One or more episodes of MAP < 60 mmHg increased the risk of death by a factor of 2.96 (95% CI: 1.06-10.36, P = 0.04). The area under the ROC curve for predicting the need for renal replacement therapy was highest for the hourly time integral of ABP drops below MAP 75 mmHg. CONCLUSIONS: A MAP level ≥ 60 mmHg may be as safe as higher MAP levels during the first 24 h of ICU therapy in septic patients. A higher MAP may be required to maintain kidney function.
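The exposure measure above — the hourly time integral of pressure drops below a threshold — can be sketched as follows. This assumes one MAP reading per hour and a simple rectangle sum; the study does not specify the exact numerical scheme:

```python
def pressure_deficit_integral(map_readings, threshold=60.0, dt_hours=1.0):
    """Time integral (mmHg*h) of MAP drops below a threshold.

    Sketch of the exposure measure described above, assuming hourly
    readings and a rectangle-rule sum (an assumption, not the paper's
    documented scheme).
    """
    return sum(max(threshold - p, 0.0) * dt_hours for p in map_readings)

# four hourly MAP readings, two of them below 60 mmHg
deficit = pressure_deficit_integral([65, 58, 55, 62])   # (60-58) + (60-55) = 7.0
```

The same function with `threshold=45.0` applied to MPP readings gives the MPP variant.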

Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land use. The extent and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than the diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably because assessing them in the field is labor- and cost-intensive, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators from plant traits derived from online accessible databases. Aiming to provide a minimal trait set for monitoring the effects of land-use intensification on plant diversity, we investigated the relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 regions in Germany (part of the German 'Biodiversity Exploratories' research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally or even more sensitive to land-use intensity than traditional diversity indices. The resulting trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering.
These database-derived traits enable the early detection of changes in community structure indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
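The "community mean traits" referred to above are typically abundance-weighted means of database trait values; a minimal sketch with invented numbers:

```python
def community_weighted_mean(abundances, trait_values):
    """Community-(abundance-)weighted mean of a plant trait (CWM).

    Sketch of the community mean trait indicators discussed above:
    trait values would come from an online trait database and
    abundances from vegetation records. Example numbers are invented.
    """
    total = sum(abundances)
    return sum(a * t for a, t in zip(abundances, trait_values)) / total

# SLA (mm^2/mg) of three species weighted by their relative cover
cwm_sla = community_weighted_mean([0.5, 0.3, 0.2], [20.0, 25.0, 15.0])  # 20.5
```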

BACKGROUND: In many resource-limited settings, monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS: We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; the CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS: A total of 44,829 patients (median age: 35 years, 58% female, median CD4 count at cART initiation: 116 cells/mm³) were followed up for a median of 1.9 years, with 3706 deaths. Mean CD4 count slopes per week ranged from 1.4 [95% confidence interval (CI): 1.2 to 1.6] cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter at >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS: The absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, CD4 count slope and HIV RNA provide independent additional prognostic information.
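The building block of the slope analysis above is a per-patient linear trend in CD4 count over time; a minimal sketch with invented measurements:

```python
import numpy as np

# Ordinary least-squares CD4 slope (cells/mm^3 per week) over 6 months
# for one hypothetical patient; the study pooled such trends in linear
# mixed models. All values here are invented.
weeks = np.array([0, 4, 8, 12, 16, 20, 24])
cd4 = np.array([116, 120, 128, 131, 140, 144, 150])

slope, intercept = np.polyfit(weeks, cd4, 1)   # slope ~ 1.4 cells/week
```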

Due to highly erodible volcanic soils and a harsh climate, livestock grazing in Iceland has led to serious soil erosion on about 40% of the country's surface. Over the last 100 years, various revegetation and restoration measures have been taken on large areas distributed all over Iceland in an attempt to counteract this problem. The present research aimed to develop models for estimating percent vegetation cover (VC) and aboveground biomass (AGB) from satellite data, as this would make it possible to assess and monitor the effectiveness of restoration measures over large areas at fairly low cost. Models were developed based on 203 vegetation cover samples and 114 aboveground biomass samples distributed over five SPOT satellite datasets. All satellite datasets were atmospherically corrected, and digital numbers were converted into ground reflectance. A selection of vegetation indices (VIs) was then calculated, followed by simple and multiple linear regression analysis of the relations between the field data and the calculated VIs. The best results were achieved using multiple linear regression models for both percent VC and AGB. The calibration and validation results showed that R² and RMSE values vary little across most VIs. For percent VC, R² values range between 0.789 and 0.822, with RMSEs between 15.89% and 16.72%. For AGB, R² values for low-biomass areas (AGB < 800 g/m²) range between 0.607 and 0.650, with RMSEs between 126.08 g/m² and 136.38 g/m². The AGB model developed for all areas, including those with high biomass coverage (AGB > 800 g/m²), achieved R² values between 0.487 and 0.510, with RMSEs ranging from 234 g/m² to 259.20 g/m². The models predicting percent VC generally overestimate low observed percent VC and slightly underestimate high observed percent VC. The AGB estimation models behave similarly, but the over- and underestimation are much more pronounced.
These results show that it is possible to estimate percent VC with high accuracy based on various VIs derived from SPOT satellite data. AGB of restoration areas with low-biomass values of up to 800 g/m2 can likewise be estimated with high accuracy based on various VIs derived from SPOT satellite data, whereas in the case of high biomass coverage, estimation accuracy decreases with increasing biomass values. Accordingly, percent VC can be estimated with high accuracy anywhere in Iceland, whereas AGB is much more difficult to estimate, particularly for areas with high-AGB variability.
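As an illustration of the ingredients above, one widely used vegetation index (NDVI) and the RMSE validation metric can be written as follows; the abstract does not list the specific VIs used, so NDVI here is an assumption:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from ground reflectances.
    One common VI computable from SPOT bands; the indices actually used
    in the study are not named in the abstract."""
    return (nir - red) / (nir + red)

def rmse(observed, predicted):
    """Root-mean-square error, as used to validate the VC/AGB models."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((observed - predicted) ** 2)))

vi = ndvi(0.45, 0.15)                     # ~0.5 for vegetated ground
err = rmse([60.0, 70.0], [55.0, 75.0])    # 5.0 (% cover units)
```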

Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) annual change in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg/yr and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, an increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple-nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.

Background: Agroforestry is a sustainable land use method with a long tradition in the Bolivian Andes. A better understanding of people's knowledge and valuation of woody species can help to adjust actor-oriented agroforestry systems. In this case study, carried out in a peasant community of the Bolivian Andes, we aimed to calculate the cultural importance of selected agroforestry species and to analyse the intracultural variation in the cultural importance and knowledge of plants according to peasants' sex, age, and migration. Methods: Data collection was based on semi-structured interviews and freelisting exercises. Two ethnobotanical indices (Composite Salience, Cultural Importance) were used to calculate the cultural importance of plants. Intracultural variation in the cultural importance and knowledge of plants was detected using linear and generalised linear (mixed) models. Results and discussion: The culturally most important woody species were mainly trees and exotic species (e.g. Schinus molle, Prosopis laevigata, Eucalyptus globulus). We found that knowledge and valuation of plants increased with age but were lower for migrants; sex, by contrast, played a minor role. The age effects possibly result from the decreasing ecological apparency of valuable native species and their substitution by exotic marketable trees, the loss of traditional plant uses, or the use of other materials (e.g. plastic) instead of wood. Decreasing dedication to traditional farming may have led to the successive abandonment of traditional tool uses, and the overall transformation of woody plant use is possibly related to diminishing medicinal knowledge. Conclusions: Age and migration affect how people value woody species and what they know about their uses.
For this reason, we recommend paying particular attention to the potential of native species, which could open up promising perspectives, especially for the young migrating peasant generation, and draw their interest to agroforestry. These native species should be ecologically sound and selected for their potential to provide subsistence and promising commercial uses. In addition to offering socio-economic and environmental services, agroforestry initiatives using native trees and shrubs can play a crucial role in recovering elements of the lost ancient landscape that still forms part of local people's collective identity.

OBJECTIVE: To describe the CD4 cell count at the start of combination antiretroviral therapy (cART) in low-income (LIC), lower-middle-income (LMIC), upper-middle-income (UMIC), and high-income (HIC) countries. METHODS: Patients aged 16 years or older starting cART in a clinic participating in a multicohort collaboration spanning 6 continents (International epidemiological Databases to Evaluate AIDS and ART Cohort Collaboration) were eligible. Multilevel linear regression models were adjusted for age, gender, and calendar year; missing CD4 counts were imputed. RESULTS: In total, 379,865 patients from 9 LIC, 4 LMIC, 4 UMIC, and 6 HIC were included. In LIC, the median CD4 cell count at cART initiation increased by 83%, from 80 to 145 cells/μL, between 2002 and 2009. The corresponding increases in LMIC, UMIC, and HIC were from 87 to 155 cells/μL (76% increase), 88 to 135 cells/μL (53%), and 209 to 274 cells/μL (31%). In 2009, compared with LIC, median counts were 13 cells/μL [95% confidence interval (CI): -56 to +30] lower in LMIC, 22 cells/μL (-62 to +18) lower in UMIC, and 112 cells/μL (+75 to +149) higher in HIC. They were 23 cells/μL (95% CI: +18 to +28) higher in women than in men. Median counts were 88 cells/μL (95% CI: +35 to +141) higher in countries with an estimated national cART coverage >80% than in countries with <40% coverage. CONCLUSIONS: Median CD4 cell counts at the start of cART increased between 2000 and 2009 but remained below 200 cells/μL in LIC and MIC and below 300 cells/μL in HIC. An earlier start of cART will require substantial efforts and resources globally.

OBJECTIVE: The assessment and treatment of psychological distress in cancer patients is recognized as a major challenge. The role of spouses, caregivers, and significant others has become salient, not only because of their supportive functions but also with respect to their own burden. The purpose of this study was to assess the amount of distress in a mixed sample of cancer patients and their partners and to explore the dyadic interdependence. METHODS: An initial sample of 154 dyads was recruited, and distress questionnaires (Hospital Anxiety and Depression Scale, Symptom Checklist 9-Item Short Version, and 12-Item Short Form Health Survey) were administered at four time points. Linear mixed models and actor-partner interdependence models were applied. RESULTS: A significant proportion of patients and their partners (up to 40%) reported high levels of anxiety, depression, and psychological distress, and low quality of life over the course of the investigation. Mixed-model analyses revealed that higher risks for clinically relevant anxiety and depression in couples exist for female patients and especially for female partners. Although psychological strain decreased over time, the risk of elevated distress in female partners remained. Modeling patient-partner interdependence over time, stratified by patients' gender, revealed specific effects: a moderate correlation between distress in patients and partners, and a transmission of distress from male patients to their female partners. CONCLUSIONS: Our findings provide empirical support for gender-specific transmission of distress in dyads coping with cancer. This should be considered an important starting point for planning systemic psycho-oncological interventions and conceptualizing further research.

Due to the ongoing trend towards increased product variety, fast-moving consumer goods such as food and beverages, pharmaceuticals, and chemicals are typically manufactured through so-called make-and-pack processes. These processes consist of a make stage, a pack stage, and intermediate storage facilities that decouple these two stages. In operations scheduling, complex technological constraints must be considered, e.g., non-identical parallel processing units, sequence-dependent changeovers, batch splitting, no-wait restrictions, material transfer times, minimum storage times, and finite storage capacity. The short-term scheduling problem is to compute a production schedule such that a given demand for products is fulfilled, all technological constraints are met, and the production makespan is minimised. A production schedule typically comprises 500–1500 operations. Due to the problem size and complexity of the technological constraints, the performance of known mixed-integer linear programming (MILP) formulations and heuristic approaches is often insufficient. We present a hybrid method consisting of three phases. First, the set of operations is divided into several subsets. Second, these subsets are iteratively scheduled using a generic and flexible MILP formulation. Third, a novel critical path-based improvement procedure is applied to the resulting schedule. We develop several strategies for the integration of the MILP model into this heuristic framework. Using these strategies, high-quality feasible solutions to large-scale instances can be obtained within reasonable CPU times using standard optimisation software. We have applied the proposed hybrid method to a set of industrial problem instances and found that the method outperforms state-of-the-art methods.
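For intuition about the scheduling objective, here is a toy list-scheduling heuristic (longest processing time first) for identical parallel units. It is deliberately far simpler than the hybrid MILP method described above, which additionally handles changeovers, batch splitting, storage, and no-wait constraints:

```python
import heapq

def lpt_makespan(durations, n_units):
    """Makespan of Longest-Processing-Time list scheduling on identical
    parallel units. A classic baseline heuristic, not the paper's
    method; it ignores changeovers, storage, and no-wait constraints.
    """
    loads = [0.0] * n_units          # current finish time of each unit
    heapq.heapify(loads)
    for d in sorted(durations, reverse=True):
        # assign the next-longest operation to the earliest-free unit
        heapq.heappush(loads, heapq.heappop(loads) + d)
    return max(loads)

makespan = lpt_makespan([3, 2, 2], n_units=2)   # -> 4
```

A MILP formulation would instead prove optimality, which is exactly what becomes intractable at 500-1500 operations and motivates the paper's decomposition strategy.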

OBJECTIVE We aimed to create an index to stratify cryptogenic stroke (CS) patients with patent foramen ovale (PFO) by their likelihood that the stroke was related to their PFO. METHODS Using data from 12 component studies, we used generalized linear mixed models to predict the presence of PFO among patients with CS, and derive a simple index to stratify patients with CS. We estimated the stratum-specific PFO-attributable fraction and stratum-specific stroke/TIA recurrence rates. RESULTS Variables associated with a PFO in CS patients included younger age, the presence of a cortical stroke on neuroimaging, and the absence of these factors: diabetes, hypertension, smoking, and prior stroke or TIA. The 10-point Risk of Paradoxical Embolism score is calculated from these variables so that the youngest patients with superficial strokes and without vascular risk factors have the highest score. PFO prevalence increased from 23% (95% confidence interval [CI]: 19%-26%) in those with 0 to 3 points to 73% (95% CI: 66%-79%) in those with 9 or 10 points, corresponding to attributable fraction estimates of approximately 0% to 90%. Kaplan-Meier estimated stroke/TIA 2-year recurrence rates decreased from 20% (95% CI: 12%-28%) in the lowest Risk of Paradoxical Embolism score stratum to 2% (95% CI: 0%-4%) in the highest. CONCLUSION Clinical characteristics identify CS patients who vary markedly in PFO prevalence, reflecting clinically important variation in the probability that a discovered PFO is likely to be stroke-related vs incidental. Patients in strata more likely to have stroke-related PFOs have lower recurrence risk.
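A hedged sketch of such a 10-point index is below. The one-point items and age bands follow our reading of the published RoPE score, not this abstract, and should be checked against the original paper before any use:

```python
def rope_score(age, cortical_infarct, hypertension, diabetes,
               smoker, prior_stroke_or_tia):
    """Risk of Paradoxical Embolism (RoPE) score, 0-10.

    Sketch based on the description above; the exact point assignments
    are our reading of the published score, not taken from the abstract.
    """
    score = 0
    score += (not hypertension) + (not diabetes)       # 1 point each
    score += (not smoker) + (not prior_stroke_or_tia)  # 1 point each
    score += bool(cortical_infarct)                    # superficial infarct
    # younger age earns more points (assumed decade bands)
    if age < 30:
        score += 5
    elif age < 40:
        score += 4
    elif age < 50:
        score += 3
    elif age < 60:
        score += 2
    elif age < 70:
        score += 1
    return score

# youngest patient, cortical infarct, no vascular risk factors -> 10
best = rope_score(25, cortical_infarct=True, hypertension=False,
                  diabetes=False, smoker=False, prior_stroke_or_tia=False)
```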

Objective: We examined the influence of clinical, radiologic, and echocardiographic characteristics on antithrombotic choice in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO), hypothesizing that features suggestive of paradoxical embolism might lead to greater use of anticoagulation. Methods: The Risk of Paradoxical Embolism Study combined 12 databases to create the largest dataset of patients with CS and known PFO status. We used generalized linear mixed models with a random effect of component study to explore whether anticoagulation was preferentially selected based on the following: (1) younger age and absence of vascular risk factors, (2) “high-risk” echocardiographic features, and (3) neuroradiologic findings. Results: A total of 1,132 patients with CS and PFO treated with anticoagulation or antiplatelets were included. Overall, 438 participants (39%) were treated with anticoagulation with a range (by database) of 22% to 54%. Treatment choice was not influenced by age or vascular risk factors. However, neuroradiologic findings (superficial or multiple infarcts) and high-risk echocardiographic features (large shunts, shunt at rest, and septal hypermobility) were predictors of anticoagulation use. Conclusion: Both antithrombotic regimens are widely used for secondary stroke prevention in patients with CS and PFO. Radiologic and echocardiographic features were strongly associated with treatment choice, whereas conventional vascular risk factors were not. Prior observational studies are likely to be biased by confounding by indication.

SUMMARY BACKGROUND/OBJECTIVES: Orthodontic management of maxillary canine impaction (MCI), including forced eruption, may result in significant root resorption; however, the association between MCI and orthodontically induced root resorption (OIRR) is not yet sufficiently established. The purpose of this retrospective cohort study was to comparatively evaluate the severity of OIRR of the maxillary incisors in orthodontically treated patients with MCI. Additionally, impaction characteristics were tested for associations with OIRR severity. SUBJECTS AND METHODS: The sample comprised 48 patients undergoing fixed-appliance treatment: 24 with unilateral or bilateral MCI and 24 matched controls without impaction. OIRR was calculated using pre- and post-operative panoramic tomograms. The orientation of the eruption path, height, sector location, and follicle/tooth ratio of the impacted canine were also recorded. The Mann-Whitney U-test and univariate and multivariate linear mixed models were used to test for the associations of interest. RESULTS: The maxillary left central incisor underwent more OIRR in the impaction group (mean difference = 0.58 mm, P = 0.04). Overall, the impaction group had 0.38 mm more OIRR than the control group (95% confidence interval, CI: 0.03, 0.74; P = 0.04). However, multivariate analysis demonstrated no overall difference in the amount of OIRR between the impaction and non-impaction groups. A positive association between OIRR and initial root length was observed (95% CI: 0.08, 0.27; P < 0.001). The severity of canine impaction was not found to be a significant predictor of OIRR. LIMITATIONS: This was a retrospective study that used panoramic tomograms for OIRR measurements. CONCLUSIONS: This study indicates that MCI is a weak predictor of OIRR. Interpretation of the results requires caution due to the observational nature of the present study.
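The Mann-Whitney U statistic used above can be sketched directly from its definition (count of pairs in which one group's value exceeds the other's, with ties counting one half); the p-value step is omitted:

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for two independent samples.

    Minimal definition-based sketch of the test named above (here, e.g.,
    impaction vs. control OIRR values); omits the normal approximation
    used to obtain a p-value.
    """
    return sum(1.0 if xi > yi else 0.5 if xi == yi else 0.0
               for xi in x for yi in y)

# every impaction value exceeds every control value -> U = 6.0
u = mann_whitney_u([1.1, 0.9, 1.4], [0.6, 0.8])
```

In practice, `scipy.stats.mannwhitneyu` would be used instead of a hand-rolled version.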

The aim of this study was to associate changes in dairy farmers' self-reported attitude, knowledge, and behavior with the decrease in the incidence rate of clinical mastitis (IRCM). Farmer-diagnosed clinical mastitis cases were obtained from two surveys conducted before (July 2004-June 2005) and at the end (2009) of a mastitis control program in the Netherlands. Information on farmers' attitude, knowledge, and behavior was also obtained by sending the farmers the same questionnaire during both surveys. Multivariable linear regression models showed that the herd-level 2004 IRCM explained 28% of the variation in the decrease of IRCM. Changes in farmers' attitude and knowledge, and changes in farmers' behavior, explained an additional 24% and 5%, respectively. These results suggest that the way management measures are executed may be at least as important as the fact that they are executed at all. No control group was available for this study because the intervention was applied at the national level; we therefore do not claim any causal relationships.