936 results for Linear programming models


Relevance: 80.00%

Abstract:

BACKGROUND: Peak oxygen uptake (peak VO2) is an established integrative measurement of maximal exercise capacity in cardiovascular disease. After heart transplantation (HTx), peak VO2 remains reduced despite normal systolic left ventricular function, which highlights the relevance of diastolic function. In this study we aimed to characterize the predictive significance of cardiac allograft diastolic function for peak VO2. METHODS: Peak VO2 was measured using a ramp protocol on a bicycle ergometer. Left ventricular (LV) diastolic function was assessed with tissue Doppler imaging, measuring the velocity of the early (Ea) and late (Aa) apical movement of the mitral annulus, and with conventional Doppler, measuring early (E) and late (A) diastolic transmitral flow propagation. Correlation coefficients were calculated and linear regression models fitted. RESULTS: The post-transplant time interval of the 39 HTx recipients ranged from 0.4 to 20.1 years. The mean age of the recipients was 55 ± 14 years and mean body mass index (BMI) was 25.4 ± 3.9 kg/m². Mean LV ejection fraction was 62 ± 4%, mean LV mass index 108 ± 22 g/m², and mean peak VO2 20.1 ± 6.3 ml/kg/min. Peak VO2 was reduced in patients with more severe diastolic dysfunction (pseudonormal or restrictive transmitral inflow pattern) or when E/Ea was ≥10. Peak VO2 correlated with recipient age (r = -0.643, p < 0.001), peak heart rate (r = 0.616, p < 0.001), and BMI (r = -0.417, p = 0.008). Of all echocardiographic measurements, Ea (r = 0.561, p < 0.001) and Ea/Aa (r = 0.495, p = 0.002) correlated best. Multivariate analysis identified age, heart rate, BMI, and Ea/Aa as independent predictors of peak VO2. CONCLUSIONS: Diastolic dysfunction contributes to the limitation of maximal exercise capacity after HTx.
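Correlations of the kind reported above can be reproduced from raw data with a plain Pearson coefficient. A minimal sketch in Python; the age and peak VO2 values below are invented for illustration, not taken from the study:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical illustration: recipient age (years) vs. peak VO2 (ml/kg/min).
age      = [30, 40, 45, 55, 60, 65, 70]
peak_vo2 = [28.0, 25.5, 24.0, 20.0, 18.5, 16.0, 14.5]
r = pearson_r(age, peak_vo2)  # strongly negative, as in the abstract
```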

Relevance: 80.00%

Abstract:

OBJECTIVE: To evaluate the association between arterial blood pressure (ABP) during the first 24 h and mortality in sepsis. DESIGN: Retrospective cohort study. SETTING: Multidisciplinary intensive care unit (ICU). PATIENTS AND PARTICIPANTS: A total of 274 septic patients. INTERVENTIONS: None. MEASUREMENTS AND RESULTS: Hemodynamic and laboratory parameters were extracted from a PDMS database. The hourly time integral of ABP drops below clinically relevant systolic arterial pressure (SAP), mean arterial pressure (MAP), and mean perfusion pressure (MPP = MAP − central venous pressure) levels was calculated for the first 24 h after ICU admission and compared with 28-day mortality. Binary and linear regression models (adjusted for SAPS II as a measure of disease severity) and a receiver operating characteristic (ROC) analysis were applied. The areas under the ROC curve were largest for the hourly time integrals of ABP drops below a MAP of 60 mmHg (0.779 vs. 0.764 for ABP drops below a MAP of 55 mmHg; P ≤ 0.01) and an MPP of 45 mmHg. No association between the hourly time integrals of ABP drops below certain SAP levels and mortality was detected. One or more episodes of MAP < 60 mmHg increased the risk of death by a factor of 2.96 (95% CI, 1.06-10.36; P = 0.04). The area under the ROC curve to predict the need for renal replacement therapy was highest for the hourly time integral of ABP drops below a MAP of 75 mmHg. CONCLUSIONS: A MAP level ≥ 60 mmHg may be as safe as higher MAP levels during the first 24 h of ICU therapy in septic patients. A higher MAP may be required to maintain kidney function.
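The study's exposure measure, the hourly time integral of pressure drops below a threshold, is straightforward to compute from sampled pressures. A minimal sketch, using an invented MAP trace:

```python
def pressure_drop_integral(map_series_mmhg, threshold_mmhg=60.0, dt_hours=1.0):
    """Hourly time integral (mmHg*h) of arterial-pressure drops below a threshold.

    Sums, over all samples, the depth of each drop below the threshold
    multiplied by the sampling interval.
    """
    return sum(max(0.0, threshold_mmhg - p) * dt_hours for p in map_series_mmhg)

# Hypothetical 6-hour MAP trace (mmHg), sampled hourly.
map_trace = [72, 65, 58, 54, 61, 70]
burden = pressure_drop_integral(map_trace, threshold_mmhg=60)  # (60-58)+(60-54) = 8 mmHg*h
```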

Relevance: 80.00%

Abstract:

While revenue management (RM) is traditionally considered a tool of service operations, it shows considerable potential for application in manufacturing operations. The typical challenges in make-to-order manufacturing are fixed manufacturing capacities and a great variety of offered products, along with pronounced fluctuations in demand and profitability. Since the work of Harris and Pinder in the mid-1990s, numerous papers have furthered the understanding of RM theory in this environment. Nevertheless, the results to be expected from applying the developed methods in a practical industry setting have yet to be reported. To this end, this paper investigates a possible application of RM at ThyssenKrupp VDM, leading to considerable improvements in several areas.

Relevance: 80.00%

Abstract:

Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land use. The extent and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than the diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably because of their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online accessible databases. Aiming to provide a minimal trait set to monitor effects of land-use intensification on plant diversity, we investigated the relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German 'Biodiversity Exploratories' research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally, or even more, sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering. 
These database-derived traits enable the early detection of changes in community structure that are indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
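Community mean traits of the kind used above are usually computed as community-weighted means (CWM): trait values averaged over the species in a plot, weighted by abundance. A minimal sketch with invented cover and SLA values:

```python
def community_weighted_mean(abundances, trait_values):
    """Community-weighted mean (CWM) of a trait: abundance-weighted average
    over the species present in a plot."""
    total = sum(abundances)
    return sum(a * t for a, t in zip(abundances, trait_values)) / total

# Hypothetical plot with three species: relative covers and SLA values (mm^2/mg).
covers = [0.5, 0.3, 0.2]
sla    = [20.0, 30.0, 10.0]
cwm_sla = community_weighted_mean(covers, sla)  # 0.5*20 + 0.3*30 + 0.2*10 = 21.0
```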

Relevance: 80.00%

Abstract:

BACKGROUND In many resource-limited settings, monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; the CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS A total of 44,829 patients (median age: 35 years, 58% female, median CD4 count at cART initiation: 116 cells/mm³) were followed up for a median of 1.9 years, with 3706 deaths. Mean CD4 count slopes per week ranged from 1.4 cells/mm³ [95% confidence interval (CI): 1.2 to 1.6] when HIV RNA was <400 copies per milliliter to -0.32 cells/mm³ (95% CI: -0.47 to -0.18) with >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells/mm³ but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells/mm³. In contrast, the aHR for the current CD4 count, comparing >350 with <100 cells/mm³, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS The absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, the CD4 count slope and HIV RNA provide additional independent prognostic information.
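The study estimates CD4 slopes with linear mixed models; as a simplified stand-in, a per-patient least-squares slope conveys the same quantity. The patient trajectory below is invented:

```python
def ols_slope(times, values):
    """Least-squares slope of values against times (e.g. CD4 cells/mm3 per week)."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Hypothetical patient with suppressed HIV RNA: CD4 counts over 24 weeks.
weeks = [0, 6, 12, 18, 24]
cd4   = [110, 118, 127, 133, 142]
slope = ols_slope(weeks, cd4)  # about 1.3 cells/mm3 per week
```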

Relevance: 80.00%

Abstract:

Due to highly erodible volcanic soils and a harsh climate, livestock grazing in Iceland has led to serious soil erosion on about 40% of the country's surface. Over the last 100 years, various revegetation and restoration measures were taken on large areas distributed all over Iceland in an attempt to counteract this problem. The present research aimed to develop models for estimating percent vegetation cover (VC) and aboveground biomass (AGB) based on satellite data, as this would make it possible to assess and monitor the effectiveness of restoration measures over large areas at fairly low cost. Models were developed based on 203 vegetation cover samples and 114 aboveground biomass samples distributed over five SPOT satellite datasets. All satellite datasets were atmospherically corrected, and digital numbers were converted into ground reflectance. A selection of vegetation indices (VIs) was then calculated, followed by simple and multiple linear regression analysis of the relations between the field data and the calculated VIs. The best results were achieved using multiple linear regression models for both %VC and AGB. The model calibration and validation results showed that the R² and RMSE values for most VIs vary little. For percent VC, R² values range between 0.789 and 0.822, with RMSEs ranging between 15.89% and 16.72%. For AGB, R² values for low-biomass areas (AGB < 800 g/m²) range between 0.607 and 0.650, with RMSEs ranging between 126.08 g/m² and 136.38 g/m². The AGB model developed for all areas, including those with high biomass coverage (AGB > 800 g/m²), achieved R² values between 0.487 and 0.510, resulting in RMSEs ranging from 234 g/m² to 259.20 g/m². The models predicting percent VC generally overestimate observed low percent VC and slightly underestimate observed high percent VC. The estimation models for AGB behave in a similar way, but over- and underestimation are much more pronounced. 
These results show that it is possible to estimate percent VC with high accuracy based on various VIs derived from SPOT satellite data. The AGB of restoration areas with low biomass values of up to 800 g/m² can likewise be estimated with high accuracy, whereas in the case of high biomass coverage, estimation accuracy decreases with increasing biomass values. Accordingly, percent VC can be estimated with high accuracy anywhere in Iceland, whereas AGB is much more difficult to estimate, particularly for areas with high AGB variability.
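The abstract does not name the individual VIs; NDVI is one widely used example, and RMSE is the reported error measure. A minimal sketch in which the reflectances, cover values, and regression coefficients are all invented for illustration:

```python
from math import sqrt

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from ground reflectances."""
    return (nir - red) / (nir + red)

def rmse(observed, predicted):
    """Root-mean-square error between observed and predicted values."""
    n = len(observed)
    return sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

# Hypothetical field plots: NIR/red reflectances and observed % vegetation cover.
nir_refl = [0.45, 0.30, 0.55, 0.25]
red_refl = [0.10, 0.15, 0.08, 0.20]
observed_vc = [80.0, 35.0, 90.0, 15.0]

vis = [ndvi(n, r) for n, r in zip(nir_refl, red_refl)]
# A hypothetical calibrated linear model: %VC = a + b * NDVI
a, b = -10.0, 130.0
predicted_vc = [a + b * v for v in vis]
error = rmse(observed_vc, predicted_vc)
```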

Relevance: 80.00%

Abstract:

Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found mean (95% confidence interval) decreases in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) and −0.89 (−1.05 to −0.73) mm Hg per year, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, an increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.

Relevance: 80.00%

Abstract:

Background Agroforestry is a sustainable land use method with a long tradition in the Bolivian Andes. A better understanding of people's knowledge and valuation of woody species can help to adjust actor-oriented agroforestry systems. In this case study, carried out in a peasant community of the Bolivian Andes, we aimed at calculating the cultural importance of selected agroforestry species and at analysing the intracultural variation in the cultural importance and knowledge of plants according to peasants' sex, age, and migration. Methods Data collection was based on semi-structured interviews and freelisting exercises. Two ethnobotanical indices (Composite Salience, Cultural Importance) were used for calculating the cultural importance of plants. Intracultural variation in the cultural importance and knowledge of plants was detected using linear and generalised linear (mixed) models. Results and discussion The culturally most important woody species were mainly trees and exotic species (e.g. Schinus molle, Prosopis laevigata, Eucalyptus globulus). We found that knowledge and valuation of plants increased with age but were lower for migrants; sex, by contrast, played a minor role. The age effects possibly result from the decreasing ecological apparency of valuable native species and their substitution by exotic marketable trees, the loss of traditional plant uses, or the use of other materials (e.g. plastic) instead of wood. Decreasing dedication to traditional farming may have led to the successive abandonment of traditional tool uses, and the overall transformation of woody plant use is possibly related to diminishing medicinal knowledge. Conclusions Age and migration affect how people value woody species and what they know about their uses. 
For this reason, we recommend paying particular attention to the potential of native species, which could open promising perspectives especially for the young migrating peasant generation and draw their interest to agroforestry. These native species should be ecologically sound and selected for their potential to provide subsistence and promising commercial uses. In addition to offering socio-economic and environmental services, agroforestry initiatives using native trees and shrubs can play a crucial role in recovering elements of the lost ancient landscape that still forms part of local people's collective identity.
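Of the two indices, Composite (Smith's) Salience is computed directly from freelist ranks. A minimal sketch; the species lists below are invented, with abbreviated names:

```python
def smiths_salience(freelists, item):
    """Smith's salience index (composite salience) of an item over freelists.

    In each list, an item at 1-based rank r out of L items scores (L - r + 1)/L;
    lists not mentioning the item score 0. The index is the mean over all lists.
    """
    total = 0.0
    for lst in freelists:
        if item in lst:
            L = len(lst)
            r = lst.index(item) + 1
            total += (L - r + 1) / L
    return total / len(freelists)

# Hypothetical freelists of useful woody species from three respondents.
lists = [
    ["molle", "eucalyptus", "prosopis"],
    ["eucalyptus", "molle"],
    ["prosopis"],
]
s_molle = smiths_salience(lists, "molle")  # (3/3 + 1/2 + 0) / 3 = 0.5
```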

Relevance: 80.00%

Abstract:

OBJECTIVE To describe the CD4 cell count at the start of combination antiretroviral therapy (cART) in low-income (LIC), lower middle-income (LMIC), upper middle-income (UMIC), and high-income (HIC) countries. METHODS Patients aged 16 years or older starting cART in a clinic participating in a multicohort collaboration spanning 6 continents (International epidemiological Databases to Evaluate AIDS and ART Cohort Collaboration) were eligible. Multilevel linear regression models were adjusted for age, gender, and calendar year; missing CD4 counts were imputed. RESULTS In total, 379,865 patients from 9 LIC, 4 LMIC, 4 UMIC, and 6 HIC were included. In LIC, the median CD4 cell count at cART initiation increased by 83%, from 80 to 145 cells/μL, between 2002 and 2009. Corresponding increases in LMIC, UMIC, and HIC were from 87 to 155 cells/μL (a 76% increase), from 88 to 135 cells/μL (53%), and from 209 to 274 cells/μL (31%). In 2009, compared with LIC, median counts were 13 cells/μL [95% confidence interval (CI): -56 to +30] lower in LMIC, 22 cells/μL (-62 to +18) lower in UMIC, and 112 cells/μL (+75 to +149) higher in HIC. They were 23 cells/μL (95% CI: +18 to +28) higher in women than in men. Median counts were 88 cells/μL (95% CI: +35 to +141) higher in countries with an estimated national cART coverage >80%, compared with countries with <40% coverage. CONCLUSIONS Median CD4 cell counts at the start of cART increased between 2000 and 2009 but remained below 200 cells/μL in LIC and MIC and below 300 cells/μL in HIC. An earlier start of cART will require substantial efforts and resources globally.

Relevance: 80.00%

Abstract:

BACKGROUND: Little is known about the effects of hypothermia therapy and subsequent rewarming on the PQRST intervals and heart rate variability (HRV) in term newborns with hypoxic-ischemic encephalopathy (HIE). OBJECTIVES: This study describes the changes in the PQRST intervals and HRV during rewarming to normal core body temperature of 2 newborns with HIE after hypothermia therapy. METHODS: Within 6 h after birth, 2 newborns with HIE were cooled to a core body temperature of 33.5 °C for 72 h using a cooling blanket, followed by gradual rewarming (0.5 °C per hour) until the body temperature reached 36.5 °C. Custom instrumentation recorded the electrocardiogram from the leads used for clinical monitoring of vital signs. Generalized linear mixed models were calculated to estimate temperature-related changes in PQRST intervals and HRV. RESULTS: For every 1 °C increase in body temperature, the heart rate increased by 9.2 bpm (95% CI 6.8-11.6), the QTc interval decreased by 21.6 ms (95% CI 17.3-25.9), and low- and high-frequency HRV decreased by 0.480 dB (95% CI 0.052-0.907) and 0.938 dB (95% CI 0.460-1.416), respectively. CONCLUSIONS: Hypothermia-induced changes in the electrocardiogram should be monitored carefully in future studies.
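The reported fixed-effect slopes can be turned into a simple linear predictor of the expected changes over rewarming. The helper below is illustrative; only the two slope coefficients come from the abstract:

```python
# Temperature coefficients reported in the abstract (per 1 degree C rise in core temperature).
HR_SLOPE_BPM_PER_C = 9.2     # heart-rate increase
QTC_SLOPE_MS_PER_C = -21.6   # QTc-interval change

def predicted_changes(delta_temp_c):
    """Linear predictions of heart-rate and QTc change for a given temperature
    rise, using the fixed-effect slopes reported in the abstract."""
    return {
        "heart_rate_bpm": HR_SLOPE_BPM_PER_C * delta_temp_c,
        "qtc_ms": QTC_SLOPE_MS_PER_C * delta_temp_c,
    }

# Rewarming from 33.5 to 36.5 degrees C is a 3-degree rise.
change = predicted_changes(36.5 - 33.5)  # roughly +27.6 bpm and -64.8 ms
```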

Relevance: 80.00%

Abstract:

Second-generation antipsychotics (SGAs) are increasingly prescribed to treat psychiatric symptoms in pediatric patients infected with HIV. We examined the relationship between prescribed SGAs and physical growth in a cohort of youth with perinatally acquired HIV-1 infection. Pediatric AIDS Clinical Trials Group (PACTG) Protocol 219C (P219C), a multicenter, longitudinal observational study of children and adolescents perinatally exposed to HIV, was conducted from September 2000 until May 2007. The analysis included P219C participants who were perinatally HIV-infected, 3-18 years old, prescribed a first SGA for at least 1 month, and had baseline data available prior to starting the first SGA. Each participant prescribed an SGA was matched (based on gender, age, Tanner stage, and baseline body mass index [BMI] z score) with 1-3 controls without antipsychotic prescriptions. The main outcomes were short-term (approximately 6 months) and long-term (approximately 2 years) changes in BMI z scores from baseline. There were 236 participants in the short-term and 198 in the long-term analysis. In linear regression models, youth with SGA prescriptions had increased BMI z scores relative to youth without antipsychotic prescriptions, for all SGAs (short-term increase = 0.192, p = 0.003; long-term increase = 0.350, p < 0.001) and for risperidone alone (short-term = 0.239, p = 0.002; long-term = 0.360, p = 0.001). Participants receiving both protease inhibitors (PIs) and SGAs showed especially large increases. These findings suggest that growth should be carefully monitored in youth with perinatally acquired HIV who are prescribed SGAs. Future research should investigate the interaction between PIs and SGAs in children and adolescents with perinatally acquired HIV infection.
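BMI z scores for children are conventionally derived from age- and sex-specific LMS reference parameters. A minimal sketch of the LMS transformation; the parameter values below are invented placeholders, not actual CDC/WHO norms:

```python
from math import log

def lms_z_score(value, L, M, S):
    """Z score from LMS reference parameters, the method used for pediatric
    BMI-for-age norms: L = Box-Cox power, M = median, S = coefficient of variation."""
    if L == 0:
        return log(value / M) / S
    return ((value / M) ** L - 1) / (L * S)

# Hypothetical LMS parameters for one age/sex stratum (invented values).
L, M, S = -1.6, 16.5, 0.11
z_at_median = lms_z_score(16.5, L, M, S)  # exactly 0 at the median
z_above     = lms_z_score(19.0, L, M, S)  # positive: heavier than the median child
```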

Relevance: 80.00%

Abstract:

OBJECTIVE: To examine the relationships between physical growth and medications prescribed for symptoms of attention-deficit hyperactivity disorder in children with HIV. METHODS: Analysis of data from children with perinatally acquired HIV (N = 2251; age 3-19 years), with and without prescriptions for stimulant and nonstimulant medications used to treat attention-deficit hyperactivity disorder, in a long-term observational study. Height and weight measurements were transformed to z scores and compared across medication groups. Changes in z scores during a 2-year interval were compared using multiple linear regression models adjusting for selected covariates. RESULTS: Participants with (n = 215) and without (n = 2036) prescriptions were shorter than expected based on US age and gender norms (p < .001). Children without prescriptions weighed less at baseline than children in the general population (p < .001) but gained height and weight at a faster rate (p < .001). Children prescribed stimulants were similar to population norms in baseline weight; their height and weight growth velocities were comparable with the general population and children without prescriptions (for weight, p = .511 and .100, respectively). Children prescribed nonstimulants had the lowest baseline height but were similar to population norms in baseline weight. Their height and weight growth velocities were comparable with the general population but significantly slower than children without prescriptions (p = .01 and .02, respectively). CONCLUSION: The use of stimulants to treat symptoms of attention-deficit hyperactivity disorder does not significantly exacerbate the potential for growth delay in children with HIV and may afford opportunities for interventions that promote physical growth. Prospective studies are needed to confirm these findings.

Relevance: 80.00%

Abstract:

A patient classification system was developed integrating a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources.

The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic.

The nursing distribution system was a linear programming model using a branch-and-bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for the allocation of staff.

The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. The supply constraints were: (1) the total availability of each type of staff and the value of that staff member, where value was determined relative to that type of staff's ability to perform the job function of an RN (i.e., the value of eight hours of an RN = 8 points, of an LVN = 6 points); and (2) the number of personnel available for floating between units.

The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints. Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization by the addition of a dollar coefficient to the objective function.
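The staffing model described above can be sketched on a toy instance. For readability, this sketch replaces branch and bound with exhaustive enumeration over integer assignments (practical only at toy size); all numbers except the 8-point RN and 6-point LVN values are invented:

```python
from itertools import product

# Toy instance: two units, two staff types (hypothetical demands and penalties).
RN_POINTS, LVN_POINTS = 8, 6        # value of one 8-hour shift, in acuity points
RN_AVAILABLE, LVN_AVAILABLE = 6, 4  # total staff on hand
DEMAND = [30, 22]                   # acuity points required per unit
MIN_RN = [2, 1]                     # minimum RNs per unit
PENALTY = {"RN": 1.0, "LVN": 0.9}   # objective weights steering the allocation

def solve_by_enumeration():
    """Exhaustively search integer assignments: a stand-in for the branch-and-bound
    integer LP described in the abstract."""
    best, best_cost = None, float("inf")
    # rn[u], lvn[u] = staff of each type assigned to unit u
    for rn in product(range(RN_AVAILABLE + 1), repeat=2):
        if sum(rn) > RN_AVAILABLE:
            continue
        for lvn in product(range(LVN_AVAILABLE + 1), repeat=2):
            if sum(lvn) > LVN_AVAILABLE:
                continue
            feasible = all(
                rn[u] * RN_POINTS + lvn[u] * LVN_POINTS >= DEMAND[u]
                and rn[u] >= MIN_RN[u]
                for u in range(2)
            )
            if not feasible:
                continue
            cost = PENALTY["RN"] * sum(rn) + PENALTY["LVN"] * sum(lvn)
            if cost < best_cost:
                best, best_cost = (rn, lvn), cost
    return best, best_cost

assignment, cost = solve_by_enumeration()
```

A real implementation would hand the same objective and constraints to an integer LP solver instead of enumerating.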

Relevance: 80.00%

Abstract:

OBJECTIVE: The assessment and treatment of psychological distress in cancer patients is recognized as a major challenge. The role of spouses, caregivers, and significant others is of salient importance, not only because of their supportive functions but also with respect to their own burden. The purpose of this study was to assess the amount of distress in a mixed sample of cancer patients and their partners and to explore the dyadic interdependence. METHODS: An initial sample of 154 dyads was recruited, and distress questionnaires (Hospital Anxiety and Depression Scale, Symptom Checklist 9-Item Short Version, and 12-Item Short Form Health Survey) were administered at four time points. Linear mixed models and actor-partner interdependence models were applied. RESULTS: A significant proportion of patients and their partners (up to 40%) reported high levels of anxiety, depression, and psychological distress and low quality of life over the course of the investigation. Mixed-model analyses revealed that higher risks for clinically relevant anxiety and depression in couples exist for female patients and especially for female partners. Although psychological strain decreased over time, the risk for elevated distress in female partners remained. Modeling patient-partner interdependence over time, stratified by patients' gender, revealed specific effects: a moderate correlation between distress in patients and partners, and a transmission of distress from male patients to their female partners. CONCLUSIONS: Our findings provide empirical support for gender-specific transmission of distress in dyads coping with cancer. This should be considered an important starting point for planning systemic psycho-oncological interventions and conceptualizing further research.

Relevance: 80.00%

Abstract:

Due to the ongoing trend towards increased product variety, fast-moving consumer goods such as food and beverages, pharmaceuticals, and chemicals are typically manufactured through so-called make-and-pack processes. These processes consist of a make stage, a pack stage, and intermediate storage facilities that decouple these two stages. In operations scheduling, complex technological constraints must be considered, e.g., non-identical parallel processing units, sequence-dependent changeovers, batch splitting, no-wait restrictions, material transfer times, minimum storage times, and finite storage capacity. The short-term scheduling problem is to compute a production schedule such that a given demand for products is fulfilled, all technological constraints are met, and the production makespan is minimised. A production schedule typically comprises 500–1500 operations. Due to the problem size and complexity of the technological constraints, the performance of known mixed-integer linear programming (MILP) formulations and heuristic approaches is often insufficient. We present a hybrid method consisting of three phases. First, the set of operations is divided into several subsets. Second, these subsets are iteratively scheduled using a generic and flexible MILP formulation. Third, a novel critical path-based improvement procedure is applied to the resulting schedule. We develop several strategies for the integration of the MILP model into this heuristic framework. Using these strategies, high-quality feasible solutions to large-scale instances can be obtained within reasonable CPU times using standard optimisation software. We have applied the proposed hybrid method to a set of industrial problem instances and found that the method outperforms state-of-the-art methods.
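The makespan minimised here is bounded below by the longest (critical) path through the precedence-constrained operations, which is also what the paper's critical path-based improvement procedure exploits. A minimal sketch of the critical-path computation; the operation names and durations are invented:

```python
def makespan(durations, predecessors):
    """Earliest-start schedule length (critical-path makespan) of a precedence DAG.

    durations: {op: processing time}; predecessors: {op: [ops that must finish first]}.
    Operations are scheduled in a topological order found by repeated selection.
    """
    finish = {}
    remaining = set(durations)
    while remaining:
        # pick any operation whose predecessors are all scheduled
        op = next(o for o in remaining
                  if all(p in finish for p in predecessors.get(o, [])))
        start = max((finish[p] for p in predecessors.get(op, [])), default=0)
        finish[op] = start + durations[op]
        remaining.remove(op)
    return max(finish.values())

# Hypothetical make-and-pack fragment: two make operations feed one pack operation.
durations = {"make_A": 3, "make_B": 5, "pack": 2}
preds = {"pack": ["make_A", "make_B"]}
total = makespan(durations, preds)  # max(3, 5) + 2 = 7
```

Resource constraints (parallel units, changeovers, storage) would tighten this bound; the sketch covers only the precedence structure.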