938 results for Linear Mixed Integer Multicriteria Optimization
Abstract:
Objective: To evaluate a new triaxial accelerometer device for prediction of energy expenditure, measured as VO2/kg, in obese adults and normal-weight controls during activities of daily life. Subjects and methods: Thirty-seven obese adults (body mass index (BMI) 37 ± 5.4 kg/m²) and seventeen controls (BMI 23 ± 1.8 kg/m²) performed eight activities for 5 to 8 minutes while wearing a triaxial accelerometer on the right thigh. Simultaneously, VO2 and VCO2 were measured using a portable metabolic system. The relationship between accelerometer counts (AC) and VO2/kg was analysed using spline regression and linear mixed-effects models. Results: For all activities, VO2/kg was significantly lower in obese participants than in normal-weight controls. A linear relationship between AC and VO2/kg existed only within accelerometer values from 0 to 300 counts/min, with an increase of 3.7 (95% confidence interval (CI) 3.4-4.1) and 3.9 ml/min (95% CI 3.4-4.3) per increase of 100 counts/min in obese and normal-weight adults, respectively. Linear modelling of the whole range yielded wide prediction intervals for VO2/kg of ±6.3 and ±7.3 ml/min in the two groups, respectively. Conclusion: In obese and normal-weight adults, the use of AC for predicting energy expenditure, defined as VO2/kg, from a broad range of physical activities, characterized by varying intensities and types of muscle work, is limited.
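The analysis described here, a spline and mixed-model fit of VO2/kg on accelerometer counts with repeated measurements per subject and an apparent breakpoint near 300 counts/min, can be sketched as a piecewise-linear mixed-effects model. The column names, file name and knot placement below are illustrative assumptions, not the authors' code.

```python
# Sketch: piecewise-linear mixed-effects model of VO2/kg on accelerometer counts,
# with a random intercept per subject. Column names (vo2_kg, counts, group, subject)
# and the knot at 300 counts/min are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("accelerometer_vo2.csv")  # hypothetical long-format data

# bs(..., degree=1) with a single interior knot gives two linear segments
# joined at 300 counts/min, matching the reported linearity only below 300.
model = smf.mixedlm(
    "vo2_kg ~ bs(counts, degree=1, knots=(300,)) * group",
    data=df,
    groups=df["subject"],
)
result = model.fit()
print(result.summary())
```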
Abstract:
Background: Prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) change in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg/yr and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increase in systolic blood pressure (hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase), total cholesterol, smoking, age, and cumulative exposure to protease inhibitor-based and triple-nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
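A longitudinal blood-pressure model of this kind, estimating an annual change from repeated measurements per patient, can be sketched as a linear mixed model with a random intercept and slope. The variable names and data file below are illustrative assumptions, not the cohort's actual code.

```python
# Sketch: systolic blood pressure over time with a random intercept and random
# time slope per patient. Column names (sbp, years, baseline_sbp, ckd, patient_id)
# are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("shcs_bp_long.csv")  # hypothetical repeated-measures data

model = smf.mixedlm(
    "sbp ~ years + baseline_sbp + ckd",
    data=df,
    groups=df["patient_id"],
    re_formula="~years",          # random intercept and random time slope
)
result = model.fit(reml=True)
print(result.summary())           # fixed effect of 'years' ~ annual change in mm Hg
```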
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min/1.73 m²), moderate (30-59 mL/min/1.73 m²) or severe (<30 mL/min/1.73 m²) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effects models, the odds of developing a moderate or severe eGFR decrease using logistic regression, and mortality using competing-risk regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF group than in the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop a moderate (adjusted OR: 3.11; 95% CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95% CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
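For reference, the eGFR categories above derive from the CKD-EPI creatinine equation. The sketch below implements the 2009 version, including the race coefficient in use at the time; the exact implementation used by the study is not given in the abstract and is assumed here.

```python
# Sketch of the 2009 CKD-EPI creatinine equation (with the race coefficient in
# use at the time); serum creatinine in mg/dL, eGFR in mL/min/1.73 m^2.
def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example: categorize baseline renal function as in the study
egfr = ckd_epi_egfr(scr_mg_dl=1.4, age=45, female=False)
category = ("severe" if egfr < 30 else
            "moderate" if egfr < 60 else
            "mild" if egfr < 90 else "normal")
```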
Abstract:
Background Agroforestry is a sustainable land use method with a long tradition in the Bolivian Andes. A better understanding of people's knowledge and valuation of woody species can help in adjusting agroforestry systems to local actors. In this case study, carried out in a peasant community of the Bolivian Andes, we aimed at calculating the cultural importance of selected agroforestry species and at analysing the intracultural variation in the cultural importance and knowledge of plants according to peasants' sex, age, and migration. Methods Data collection was based on semi-structured interviews and freelisting exercises. Two ethnobotanical indices (Composite Salience, Cultural Importance) were used for calculating the cultural importance of plants. Intracultural variation in the cultural importance and knowledge of plants was detected using linear and generalised linear (mixed) models. Results and discussion The culturally most important woody species were mainly trees and exotic species (e.g. Schinus molle, Prosopis laevigata, Eucalyptus globulus). We found that knowledge and valuation of plants increased with age but were lower for migrants; sex, by contrast, played a minor role. The age effects possibly result from the decreasing ecological apparency of valuable native species and their substitution by exotic marketable trees, the loss of traditional plant uses, or the use of other materials (e.g. plastic) instead of wood. Decreasing dedication to traditional farming may have led to the successive abandonment of traditional tool uses, and the overall transformation of woody plant use is possibly related to diminishing medicinal knowledge. Conclusions Age and migration affect how people value woody species and what they know about their uses. For this reason, we recommend paying particular attention to the potential of native species, which could open promising perspectives especially for the young migrating peasant generation and draw their interest to agroforestry. These native species should be ecologically sound and selected for their potential to provide subsistence and promising commercial uses. In addition to offering socio-economic and environmental services, agroforestry initiatives using native trees and shrubs can play a crucial role in recovering elements of the lost ancient landscape that still forms part of local people's collective identity.
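One widely used definition of the Cultural Importance index (Tardío and Pardo-de-Santayana) is the total number of use reports for a species divided by the number of informants. The abstract does not give the formula, so the definition, the data structure and the example values below are assumptions for illustration only.

```python
# Sketch of a Cultural Importance index in the sense of Tardío & Pardo-de-Santayana:
# total use reports for a species divided by the number of informants.
# The (informant, species, use_category) tuple layout is an illustrative assumption.
from collections import defaultdict

def cultural_importance(use_reports, n_informants):
    """use_reports: iterable of (informant_id, species, use_category) tuples."""
    reports_per_species = defaultdict(set)
    for informant, species, use in use_reports:
        # each informant / use-category pair counts once per species
        reports_per_species[species].add((informant, use))
    return {sp: len(pairs) / n_informants for sp, pairs in reports_per_species.items()}

ci = cultural_importance(
    [("p1", "Schinus molle", "fuel"), ("p1", "Schinus molle", "medicine"),
     ("p2", "Schinus molle", "fuel"), ("p2", "Eucalyptus globulus", "timber")],
    n_informants=2,
)
# ci == {"Schinus molle": 1.5, "Eucalyptus globulus": 0.5}
```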
Abstract:
The objective of this longitudinal study, conducted in a neonatal intensive care unit, was to characterize the response to pain of high-risk very low birth weight infants (<1,500 g) from 23 to 38 weeks postmenstrual age (PMA) by measuring heart rate variability (HRV). Heart period data were recorded before, during, and after a blood draw by heel lance or wrist venipuncture performed for routine clinical evaluation. The pain response to the blood draw procedure and age-related changes of HRV in the low-frequency and high-frequency bands were modeled with linear mixed-effects models. HRV in both bands decreased during pain, followed by a recovery to near-baseline levels. Venipuncture and mechanical ventilation were factors that attenuated the HRV response to pain. Baseline HRV increased with postmenstrual age, but the growth rate of high-frequency power was reduced in mechanically ventilated infants. There was some evidence that the low-frequency HRV response to pain improved with advancing PMA.
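Low- and high-frequency HRV power is typically obtained by integrating a power spectral density of the interbeat-interval series. The sketch below uses Welch's method and adult-convention band limits as assumptions; the study's neonatal band definitions are not given in the abstract.

```python
# Sketch: low- and high-frequency HRV power from an R-R interval series via
# Welch's PSD. Band limits (LF 0.04-0.15 Hz, HF 0.15-0.40 Hz) follow the common
# adult convention and are an assumption; neonatal bands may differ.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_band_power(rr_s, fs=4.0, lf=(0.04, 0.15), hf=(0.15, 0.40)):
    """rr_s: successive R-R intervals in seconds."""
    t = np.cumsum(rr_s)                        # beat times
    grid = np.arange(t[0], t[-1], 1.0 / fs)    # evenly resampled time axis
    rr_even = interp1d(t, rr_s, kind="cubic")(grid)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    lf_mask = (f >= lf[0]) & (f < lf[1])
    hf_mask = (f >= hf[0]) & (f < hf[1])
    lf_power = np.trapz(pxx[lf_mask], f[lf_mask])
    hf_power = np.trapz(pxx[hf_mask], f[hf_mask])
    return lf_power, hf_power
```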
Abstract:
BACKGROUND: Little is known about the effects of hypothermia therapy and subsequent rewarming on the PQRST intervals and heart rate variability (HRV) in term newborns with hypoxic-ischemic encephalopathy (HIE). OBJECTIVES: This study describes the changes in the PQRST intervals and HRV during rewarming to normal core body temperature in 2 newborns with HIE after hypothermia therapy. METHODS: Within 6 h after birth, 2 newborns with HIE were cooled to a core body temperature of 33.5 °C for 72 h using a cooling blanket, followed by gradual rewarming (0.5 °C per hour) until the body temperature reached 36.5 °C. Custom instrumentation recorded the electrocardiogram from the leads used for clinical monitoring of vital signs. Generalized linear mixed models were fitted to estimate temperature-related changes in the PQRST intervals and HRV. RESULTS: For every 1 °C increase in body temperature, the heart rate increased by 9.2 bpm (95% CI 6.8-11.6), the QTc interval decreased by 21.6 ms (95% CI 17.3-25.9), and low- and high-frequency HRV decreased by 0.480 dB (95% CI 0.052-0.907) and 0.938 dB (95% CI 0.460-1.416), respectively. CONCLUSIONS: Hypothermia-induced changes in the electrocardiogram should be monitored carefully in future studies.
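The abstract reports a rate-corrected QT interval (QTc) but not the correction formula; Bazett's correction, sketched below, is the most common choice and is assumed here purely for illustration.

```python
# Sketch of Bazett's rate correction, QTc = QT / sqrt(RR). The abstract does not
# state which correction the authors used, so this formula is an assumption.
import math

def qtc_bazett(qt_ms: float, rr_ms: float) -> float:
    """QT and RR intervals in milliseconds; returns QTc in milliseconds."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

# Example: QT = 300 ms at a heart rate of 150 bpm (RR = 400 ms) -> QTc ~ 474 ms
qtc = qtc_bazett(300, 60000 / 150)
```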
Abstract:
OBJECTIVE: Assessment and treatment of psychological distress in cancer patients is recognized as a major challenge. The role of spouses, caregivers, and significant others has gained salient importance, not only because of their supportive functions but also with respect to their own burden. The purpose of this study was to assess the amount of distress in a mixed sample of cancer patients and their partners and to explore the dyadic interdependence. METHODS: An initial sample of 154 dyads was recruited, and distress questionnaires (Hospital Anxiety and Depression Scale, Symptom Checklist 9-Item Short Version, and 12-Item Short Form Health Survey) were administered at four time points. Linear mixed models and actor-partner interdependence models were applied. RESULTS: A significant proportion of patients and their partners (up to 40%) reported high levels of anxiety, depression, and psychological distress, and low quality of life over the course of the investigation. Mixed-model analyses revealed that higher risks for clinically relevant anxiety and depression in couples exist for female patients and especially for female partners. Although psychological strain decreased over time, the risk for elevated distress in female partners remained. Modeling patient-partner interdependence over time, stratified by patients' gender, revealed specific effects: a moderate correlation between distress in patients and partners, and a transmission of distress from male patients to their female partners. CONCLUSIONS: Our findings provide empirical support for gender-specific transmission of distress in dyads coping with cancer. This should be considered an important starting point for planning systemic psycho-oncological interventions and conceptualizing further research.
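Actor-partner interdependence models are commonly fit as dyad-clustered mixed models in which each member's outcome is regressed on both their own and their partner's predictors. The sketch below assumes hypothetical variable names and a long, pairwise data layout, not the authors' actual model specification.

```python
# Sketch: actor-partner interdependence model as a dyad-clustered mixed model.
# Each row is one dyad member at one time point; 'own_burden', 'partner_burden',
# 'role', 'patient_gender' and 'anxiety' are illustrative column names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dyads_long.csv")  # hypothetical pairwise long-format data

model = smf.mixedlm(
    "anxiety ~ own_burden + partner_burden + role * patient_gender + time",
    data=df,
    groups=df["dyad_id"],            # dyad as the clustering unit
)
result = model.fit()
print(result.summary())              # 'partner_burden' captures the transmission (partner) effect
```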
Abstract:
The southernmost European natural and planted pine forests are among the areas most vulnerable to warming-induced drought decline. Both drought stress and management factors (e.g., stand origin or reduced thinning) may induce decline by reducing the water available to trees, but their relative importance has not been properly assessed. The role of stand origin (densely planted vs. naturally regenerated stands) as a decline driver can be assessed by comparing the growth and vigor responses to drought of similar natural vs. planted stands. Here, we compare these responses in natural and planted black pine (Pinus nigra) stands located in southern Spain. We analyze how environmental factors, namely climatic anomalies (temperature and precipitation) and site conditions, and biotic factors, namely stand structure (age, tree size, density) and defoliation by the pine processionary moth, drive radial growth and crown condition at the stand and tree levels. We also assess the climatic trends in the study area over the last 60 years. We use dendrochronology, linear mixed-effects models of basal area increment, and structural equation models to determine how natural and planted stands respond to drought and current competition intensity. We observed that a temperature rise and a decrease in precipitation during the growing period led to increasing drought stress during the late 20th century. Trees from planted stands experienced stronger growth reductions and displayed more severe crown defoliation after severe droughts than those from natural stands. High stand density negatively affected growth and enhanced crown dieback, particularly in planted stands. Pine processionary moth defoliation also reduced growth more severely in natural than in planted stands but affected tree crown condition similarly in both stand types. The sharp growth reduction and widespread defoliation of planted Mediterranean pine stands in response to drought indicate that they are more vulnerable and less resilient to drought stress than natural stands. To mitigate forest decline of planted stands in xeric areas such as the Mediterranean Basin, less dense and more diverse stands should be created through selective thinning or by selecting species or provenances that are more drought tolerant.
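Basal area increment, the growth measure modelled here, is computed from cumulative ring widths under the usual assumption of circular stem cross-sections; a minimal sketch with made-up ring widths follows.

```python
# Sketch: basal area increment (BAI) from a series of annual ring widths,
# assuming circular stem cross-sections: BAI_t = pi * (r_t^2 - r_{t-1}^2).
import numpy as np

def basal_area_increment(ring_widths_mm):
    """ring_widths_mm: annual ring widths ordered from pith outward."""
    radii = np.cumsum(ring_widths_mm)            # cumulative radius each year (mm)
    areas = np.pi * radii ** 2                   # basal area each year (mm^2)
    return np.diff(areas, prepend=0.0)           # annual increment (mm^2)

bai = basal_area_increment([2.1, 1.8, 1.5, 0.6])  # e.g. declining growth under drought
```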
Abstract:
OBJECTIVE We aimed to create an index to stratify cryptogenic stroke (CS) patients with patent foramen ovale (PFO) by the likelihood that the stroke was related to the PFO. METHODS Using data from 12 component studies, we used generalized linear mixed models to predict the presence of PFO among patients with CS and to derive a simple index to stratify patients with CS. We estimated the stratum-specific PFO-attributable fraction and stratum-specific stroke/TIA recurrence rates. RESULTS Variables associated with the presence of a PFO in CS patients included younger age, a cortical stroke on neuroimaging, and the absence of diabetes, hypertension, smoking, and prior stroke or TIA. The 10-point Risk of Paradoxical Embolism score is calculated from these variables so that the youngest patients with superficial strokes and without vascular risk factors have the highest score. PFO prevalence increased from 23% (95% confidence interval [CI]: 19%-26%) in those with 0 to 3 points to 73% (95% CI: 66%-79%) in those with 9 or 10 points, corresponding to attributable fraction estimates of approximately 0% to 90%. Kaplan-Meier estimated 2-year stroke/TIA recurrence rates decreased from 20% (95% CI: 12%-28%) in the lowest Risk of Paradoxical Embolism score stratum to 2% (95% CI: 0%-4%) in the highest. CONCLUSION Clinical characteristics identify CS patients who vary markedly in PFO prevalence, reflecting clinically important variation in the probability that a discovered PFO is stroke-related rather than incidental. Patients in strata more likely to have stroke-related PFOs have a lower recurrence risk.
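One standard way to obtain a stratum-specific PFO-attributable fraction compares the observed PFO prevalence with the prevalence expected if every PFO were incidental. The formula and the roughly 25% background prevalence in the sketch below are assumptions made for illustration, not values taken from the paper, although they reproduce the reported 0% to 90% range.

```python
# Sketch: stratum-specific PFO-attributable fraction from the observed PFO
# prevalence in cryptogenic stroke patients vs. the prevalence expected if the
# PFO were incidental. Formula and the ~25% background prevalence are assumptions.
def pfo_attributable_fraction(p_observed: float, p_incidental: float = 0.25) -> float:
    return (p_observed - p_incidental) / (p_observed * (1 - p_incidental))

low = pfo_attributable_fraction(0.23)   # ~ -0.12 -> effectively 0% in the lowest stratum
high = pfo_attributable_fraction(0.73)  # ~ 0.88  -> roughly 90% in the highest stratum
```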
Abstract:
Objective: We examined the influence of clinical, radiologic, and echocardiographic characteristics on antithrombotic choice in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO), hypothesizing that features suggestive of paradoxical embolism might lead to greater use of anticoagulation. Methods: The Risk of Paradoxical Embolism Study combined 12 databases to create the largest dataset of patients with CS and known PFO status. We used generalized linear mixed models with a random effect of component study to explore whether anticoagulation was preferentially selected based on the following: (1) younger age and absence of vascular risk factors, (2) “high-risk” echocardiographic features, and (3) neuroradiologic findings. Results: A total of 1,132 patients with CS and PFO treated with anticoagulation or antiplatelets were included. Overall, 438 participants (39%) were treated with anticoagulation with a range (by database) of 22% to 54%. Treatment choice was not influenced by age or vascular risk factors. However, neuroradiologic findings (superficial or multiple infarcts) and high-risk echocardiographic features (large shunts, shunt at rest, and septal hypermobility) were predictors of anticoagulation use. Conclusion: Both antithrombotic regimens are widely used for secondary stroke prevention in patients with CS and PFO. Radiologic and echocardiographic features were strongly associated with treatment choice, whereas conventional vascular risk factors were not. Prior observational studies are likely to be biased by confounding by indication.
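A logistic mixed model with a random intercept for the component study, as described here, could be sketched with statsmodels' variational-Bayes mixed GLM; the predictor and column names below are illustrative assumptions, not the study's variable list.

```python
# Sketch: probability of anticoagulant (vs antiplatelet) choice with a random
# intercept for the component study. Column names are illustrative assumptions.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

df = pd.read_csv("rope_treatment.csv")  # hypothetical patient-level data

model = BinomialBayesMixedGLM.from_formula(
    "anticoagulated ~ age + large_shunt + shunt_at_rest + septal_hypermobility + superficial_infarct",
    vc_formulas={"study": "0 + C(study)"},   # random intercept per component study
    data=df,
)
result = model.fit_vb()                      # variational Bayes fit
print(result.summary())
```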
Abstract:
SUMMARY BACKGROUND/OBJECTIVES Orthodontic management of maxillary canine impaction (MCI), including forced eruption, may result in significant root resorption; however, the association between MCI and orthodontically induced root resorption (OIRR) is not yet sufficiently established. The purpose of this retrospective cohort study was to comparatively evaluate the severity of OIRR of the maxillary incisors in orthodontically treated patients with MCI, and to assess whether impaction characteristics were associated with OIRR severity. SUBJECTS AND METHODS The sample comprised 48 patients undergoing fixed-appliance treatment: 24 with unilateral or bilateral MCI and 24 matched controls without impaction. OIRR was calculated using pre- and post-treatment panoramic tomograms. The orientation of the eruption path, height, sector location, and follicle/tooth ratio of the impacted canine were also recorded. The Mann-Whitney U-test and univariate and multivariate linear mixed models were used to test for the associations of interest. RESULTS The maxillary left central incisor underwent more OIRR in the impaction group (mean difference = 0.58 mm, P = 0.04). Overall, the impaction group had 0.38 mm more OIRR than the control group (95% confidence interval, CI: 0.03, 0.74; P = 0.04). However, multivariate analysis demonstrated no difference in the amount of OIRR between the impaction and non-impaction groups overall. A positive association between OIRR and initial root length was observed (95% CI: 0.08, 0.27; P < 0.001). The severity of canine impaction was not found to be a significant predictor of OIRR. LIMITATIONS This was a retrospective study and used panoramic tomograms for OIRR measurements. CONCLUSIONS This study indicates that MCI is a weak OIRR predictor. Interpretation of the results requires caution due to the observational nature of the present study.
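The unadjusted group comparison described above can be illustrated with a Mann-Whitney U test on per-patient OIRR values; the numbers below are purely illustrative. The study's mixed models additionally account for multiple incisors per patient, which a plain rank test does not.

```python
# Sketch: comparing OIRR between impaction and control groups with a
# Mann-Whitney U test. The per-patient resorption values are made up.
from scipy.stats import mannwhitneyu

oirr_impaction = [1.1, 0.7, 1.4, 0.9, 1.6]   # hypothetical OIRR in mm
oirr_control = [0.6, 0.8, 0.5, 1.0, 0.7]

stat, p_value = mannwhitneyu(oirr_impaction, oirr_control, alternative="two-sided")
```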
Abstract:
The challenge for sustainable organic dairy farming is the identification of cows that are well adapted to forage-based production systems. Therefore, the aim of this study was to compare the grazing behaviour, physical activity and metabolic profile of two different Holstein strains kept in an organic grazing system without concentrate supplementation. Twelve Swiss Holstein (HCH; 566 kg body weight (BW)) and 12 New Zealand Holstein-Friesian (HNZ; 530 kg BW) cows in mid-lactation were kept in a rotational grazing system. After an adaptation period, the milk yield, nutrient intake, physical activity and grazing behaviour were recorded for each cow for 7 days. On three consecutive days, blood was sampled at 07:00, 12:00 and 17:00 h from each cow by jugular vein puncture. Data were analysed using linear mixed models. No differences were found in milk yield, but milk fat (3.69 vs. 4.05%, P = 0.05) and milk protein percentage (2.92 vs. 3.20%, P < 0.01) were lower in HCH than in HNZ cows. Herbage intake did not differ between strains, but organic matter digestibility was greater (P = 0.01) in HCH than in HNZ cows. The HCH cows spent less (P = 0.04) time ruminating (439 vs. 469 min/day) and had a lower (P = 0.02) number of ruminating boli than the HNZ cows. The time spent eating and physical activity did not differ between strains. Concentrations of IGF-1 and T3 were lower (P ≤ 0.05) in HCH than in HNZ cows. In conclusion, HCH cows were not able to increase dry matter intake in order to express their full genetic potential for milk production when kept in an organic grazing system without concentrate supplementation. The HNZ cows, on the other hand, seemed to compensate for the reduced nutrient availability better than HCH cows, but could not convert that advantage into increased production efficiency.
Abstract:
OBJECTIVE Poison centres offer rapid and comprehensive support for emergency physicians managing poisoned patients. This study investigates the institutional, case-specific and poisoning-specific factors that influence the decision of emergency physicians to contact a poison centre. METHODS Retrospective, consecutive review of all poisoning-related admissions to the emergency departments (EDs) of a primary care hospital and a university hospital-based tertiary referral centre during 2007. Corresponding poison centre consultations were extracted from the poison centre database. Data were matched and analysed by logistic regression and generalised linear mixed models. RESULTS 545 poisonings were treated in the participating EDs (350 (64.2%) in the tertiary care centre, 195 (35.8%) in the primary care hospital). The poison centre was consulted in 62 (11.4%) cases (38 (61.3%) by the tertiary care centre and 24 (38.7%) by the primary care hospital). Factors significantly associated with poison centre consultation included gender (female vs male; OR 2.99; 95% CI 1.69 to 5.29; p<0.001), number of ingested substances (>1 vs 1; OR 2.84; 95% CI 1.65 to 4.9; p<0.001) and circumstances of poisoning (accidental vs intentional; OR 2.76; 95% CI 1.05 to 7.25; p=0.039). In contrast, age, medical history and hospital size did not influence poison centre consultation. Poison centre consultation was significantly more frequent during the week and significantly less frequent during night shifts. The poison centre was consulted significantly more often when patients were admitted to intensive care units (OR 5.81; 95% CI 3.25 to 10.37; p<0.001). Asymptomatic and severe cases, compared with mild cases, were associated with more frequent consultation (OR 4.48; 95% CI 1.78 to 11.26; p=0.001 and OR 2.76; 95% CI 1.42 to 5.38; p=0.003, respectively). CONCLUSIONS We found low rates of poison centre consultation by emergency physicians. Intensive care unit admission and other factors reflecting either the complexity or the uncertainty of the clinical situation appear to be the strongest predictors of poison centre consultation. Hospital size did not influence referral behaviour.
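Odds ratios such as those reported here are the exponentiated coefficients of a logistic regression; a minimal sketch with assumed column names, ignoring the study's additional mixed-model structure, follows.

```python
# Sketch: odds ratios for poison centre consultation from a logistic regression.
# Column names (consulted, female, multiple_substances, accidental, age) are
# illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ed_poisonings.csv")  # hypothetical case-level data

result = smf.logit(
    "consulted ~ female + multiple_substances + accidental + age", data=df
).fit()

odds_ratios = np.exp(result.params)      # e.g. an OR near 2.99 for female in the study
or_ci = np.exp(result.conf_int())        # 95% confidence intervals on the OR scale
```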
Abstract:
Objective: Minimizing resection and preserving leaflet tissue has previously been shown to be beneficial for mitral valve function and leaflet kinematics after repair of acute posterior leaflet prolapse in porcine valves. We examined the effects of different additional methods of mitral valve repair (neochordoplasty, ring annuloplasty, edge-to-edge repair and triangular resection) on hemodynamics at different heart rates in an experimental model. Methods: Severe acute P2 prolapse was created in eight porcine mitral valves by resecting the posterior marginal chordae. Valve hemodynamics were quantified under pulsatile conditions in an in vitro heart simulator before and after surgical manipulation. Mitral regurgitation was corrected using four different methods of repair on the same valve: neochordoplasty with expanded polytetrafluoroethylene sutures alone and together with ring annuloplasty, and edge-to-edge repair and triangular resection, both with non-restrictive annuloplasty. Residual mitral valve leak, trans-valvular pressure gradients, flow and cardiac output were measured at 60 and 80 beats/min. A validated statistical linear mixed model was used to analyze the effect of treatment. The p values were calculated using a two-sided Wald test. Results: Only neochordoplasty with expanded polytetrafluoroethylene sutures but without ring annuloplasty achieved hemodynamics similar to those of the native mitral valve (p range 0.071-0.901). Trans-valvular diastolic pressure gradients were within a physiologic range but significantly higher than those of the native valve following neochordoplasty with ring annuloplasty (p < 0.001), triangular resection (p < 0.001) and edge-to-edge repair (p < 0.001). Neochordoplasty alone was also significantly better in terms of hemodynamics than neochordoplasty with a ring annuloplasty (p < 0.001). These results were stable regardless of heart rate or ring size. Conclusions: Neochordoplasty without ring annuloplasty is the only repair technique able to achieve almost native physiological hemodynamics after correction of leaflet prolapse in a porcine experimental model of acute chordal rupture.
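The two-sided Wald test used for the treatment comparisons reduces to a normal-theory p-value computed from an estimate and its standard error; a minimal sketch:

```python
# Sketch of a two-sided Wald test: p-value for H0 "no treatment effect" from an
# estimated effect and its standard error, as returned by a mixed model fit.
from scipy import stats

def wald_p_value(estimate: float, std_err: float) -> float:
    z = estimate / std_err
    return 2 * stats.norm.sf(abs(z))

# A reported "p = 0.000" corresponds to a p-value below 0.001, not exactly zero.
```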
Abstract:
Infrared thermography (IRT) was used to assess the effect of routine claw trimming on claw temperature. In total, 648 IRT observations were collected at each of two time points (before and 3 wk after claw trimming) from 81 cows housed in 6 tiestalls. The feet were classified as either healthy (nonlesion group, n = 182) or affected by infectious foot disorders (group IFD, n = 142). The maximal surface temperatures of the coronary band and the skin, and the difference in maximal temperature (ΔT) between the lateral and medial claws of the respective foot, were assessed. Linear mixed models, correcting for the hierarchical structure of the data, ambient temperature, and the infectious status of the claws, were developed to evaluate the effects of time relative to the trimming event (d 0 versus d 21) and of claw (medial versus lateral). Front feet and hind feet were analyzed separately. Ambient temperature and infectious foot status were identified as external and internal factors, respectively, that significantly affected claw temperature. Before claw trimming, the lateral claws of the hind feet were significantly warmer than the medial claws, whereas such a difference was not evident for the claws of the front feet. At d 21, ΔT of the hind feet was reduced by ≥0.25 °C, whereas it was increased by ≤0.13 °C in the front feet compared with d 0. Trimming was therefore associated with a marked decrease in ΔT of the hind claws. Equalizing the weight bearing of the hind feet by routine claw trimming is associated with a measurable reduction of ΔT between the paired hind claws.
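The hierarchical structure mentioned here (repeated claw measurements within cows, cows within tiestalls) can be sketched as a mixed model with nested random effects; the variable names and data file below are illustrative assumptions, not the authors' specification.

```python
# Sketch: claw temperature modelled with cow nested within tiestall as random
# effects, plus ambient temperature and infectious status as covariates.
# Column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("irt_claws.csv")  # hypothetical observation-level data

model = smf.mixedlm(
    "max_temp ~ day * claw + ambient_temp + infectious",
    data=df,
    groups=df["tiestall"],                     # top level of the hierarchy
    re_formula="1",                            # random intercept per tiestall
    vc_formula={"cow": "0 + C(cow)"},          # cow nested within tiestall
)
result = model.fit()
print(result.summary())
```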