49 results for Linear Mixed Integer Multicriteria Optimization
Abstract:
Cystic fibrosis (CF) is caused by mutations in the CF transmembrane conductance regulator gene (CFTR). Disease severity in CF varies greatly, and sibling studies strongly indicate that genes other than CFTR modify disease outcome. Syntaxin 1A (STX1A) has been reported as a negative regulator of CFTR and other ion channels. We hypothesized that STX1A variants act as a CF modifier by influencing the remaining function of mutated CFTR. We identified STX1A variants by genomic resequencing of patients from the Bernese CF Patient Data Registry and applied linear mixed model analysis to establish genotype-phenotype correlations, revealing that STX1A rs4363087 (c.467-38A>G) significantly influences lung function. The same STX1A risk allele was recognized in the European CF Twin and Sibling Study (P=0.0027), demonstrating that the genotype-phenotype association of STX1A with CF disease severity is robust enough to allow replication in two independent CF populations. rs4363087 is in linkage disequilibrium with the exonic variant rs2228607 (c.204C>T). Because neither rs4363087 nor rs2228607 changes the amino-acid sequence of STX1A, we investigated their effects at the mRNA level. We show that rs2228607 reinforces aberrant splicing of STX1A mRNA, leading to nonsense-mediated mRNA decay. In conclusion, we demonstrate the clinical relevance of STX1A variants in CF and provide evidence for the functional relevance of the STX1A variant rs2228607 at the molecular level. Our findings show that genes interacting with CFTR can modify CF disease progression. European Journal of Human Genetics advance online publication, 10 April 2013; doi:10.1038/ejhg.2013.57.
Abstract:
There is growing evidence that the great phenotypic variability in patients with cystic fibrosis (CF) depends not only on the genotype and a combination of environmental and stochastic factors, but predominantly also on modifier gene effects. It has been proposed that genes interacting with the CF transmembrane conductance regulator (CFTR) and the epithelial sodium channel (ENaC) are potential modifiers. Therefore, we assessed the impact of single-nucleotide polymorphisms (SNPs) in several of these interactors on CF disease outcome. SNPs that potentially alter gene function were genotyped in 95 well-characterized p.Phe508del homozygous CF patients. Linear mixed-effect model analysis was used to assess the relationship between sequence variants and repeated measurements of lung function parameters. In total, we genotyped 72 SNPs in 10 genes. Twenty-five SNPs were used for statistical analysis, where we found strong associations for one SNP in PPP2R4 with the lung clearance index (P ≤ 0.01), the specific effective airway resistance (P ≤ 0.005), and the forced expiratory volume in 1 s (P ≤ 0.005). In addition, we identified one SNP in SNAP23 that was significantly associated with three lung function parameters, as well as one SNP in PPP2R1A and three in KRT19 showing a significant influence on one lung function parameter each. Our findings indicate that direct interactors with CFTR, such as SNAP23, PPP2R4 and PPP2R1A, may modify the residual function of p.Phe508del-CFTR, while variants in KRT19 may modulate the amount of p.Phe508del-CFTR at the apical membrane and consequently modify CF disease.
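As a loose illustration of the genotype-phenotype analyses described in the abstracts above (a sketch, not the authors' code): when genotype is constant within a patient, a random-intercept linear mixed model identifies the genotype effect from between-patient variation. The toy example below approximates this by collapsing each patient's repeated lung-function measurements to a mean before an ordinary least-squares fit; all patient IDs, allele counts, and FEV1 values are invented.

```python
# Sketch (invented data): genotype-phenotype association with repeated
# lung-function measures, approximating a random-intercept decomposition
# by averaging each patient's measurements before regressing on genotype.
from statistics import mean

# toy repeated measurements: (patient_id, risk_allele_count, fev1_percent)
data = [
    ("p1", 0, 95), ("p1", 0, 93), ("p1", 0, 94),
    ("p2", 1, 88), ("p2", 1, 86), ("p2", 1, 87),
    ("p3", 2, 78), ("p3", 2, 80), ("p3", 2, 79),
    ("p4", 0, 92), ("p4", 0, 96), ("p4", 0, 94),
    ("p5", 2, 81), ("p5", 2, 79), ("p5", 2, 83),
]

# collapse to one mean per patient: genotype does not vary within a patient,
# so its fixed effect is identified from between-patient variation
patients = {}
for pid, g, y in data:
    patients.setdefault(pid, (g, []))[1].append(y)
points = [(g, mean(ys)) for g, ys in patients.values()]

# ordinary least squares of patient-mean FEV1 on risk-allele count
gs = [g for g, _ in points]
ys = [y for _, y in points]
gbar, ybar = mean(gs), mean(ys)
slope = sum((g - gbar) * (y - ybar) for g, y in points) / \
        sum((g - gbar) ** 2 for g in gs)
print(f"estimated FEV1 change per risk allele: {slope:.2f}")  # -7.00
```

A full mixed-model fit (e.g. with random intercepts per patient and additional covariates) would normally be done with dedicated software rather than this two-stage shortcut.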
Abstract:
Uterine smooth muscle specimens were collected from euthanatized mares in estrus and diestrus. Longitudinal and circular specimens were mounted in organ baths, and the signals were recorded on a Grass polygraph. After an equilibration period and a 2-g preload, physiologic isometric contractility was recorded continuously for 2.0 h. Area under the curve, frequency, and time occupied by contractions were studied. Differences between cycle phases, between muscle layers, and over the recorded time periods were statistically evaluated using linear mixed-effect models. In the mare, physiologic contractility of the uterus decreased significantly over time for all variables evaluated (with time as a covariate on a continuous scale). For area under the curve, there was a significant effect of muscle layer (longitudinal > circular). For frequency, higher values were recorded in estrus for the circular smooth muscle layer, whereas higher values were seen in the longitudinal smooth muscle layer during diestrus. Contractions occupied more time in the longitudinal layer than in the circular layer, and more time in diestrus than in estrus. This study describes physiologic myometrial motility in the organ bath as a function of cycle phase.
Abstract:
Milk cortisol concentration was determined under routine management conditions on 4 farms with an auto-tandem milking parlor and 8 farms with 1 of 2 automatic milking systems (AMS). One of the AMS was a partially forced (AMSp) system, and the other was a free cow traffic (AMSf) system. Milk samples were collected from all the cows on a given farm (20 to 54 cows) for at least 1 d. Behavioral observations were made during the milking process for a subset of 16 to 20 cows per farm. Milk cortisol concentration was evaluated by milking system, time of day, behavior during milking, daily milk yield, and somatic cell count using linear mixed-effects models. Milk cortisol did not differ between systems (AMSp: 1.15 ± 0.07; AMSf: 1.02 ± 0.12; auto-tandem parlor: 1.01 ± 0.16 nmol/L). Cortisol concentrations were lower in evening than in morning milkings (1.01 ± 0.12 vs. 1.24 ± 0.13 nmol/L). The daily periodicity of cortisol concentration was characterized by an early morning peak and a late afternoon elevation in AMSp. A bimodal pattern was not evident in AMSf. Finally, milk cortisol decreased by a factor of 0.915 in milking parlors and by 0.998 in AMSp, and increased by a factor of 1.161 in AMSf, for each unit of ln(somatic cell count/1,000). We conclude that milking cows in milking parlors or AMS does not result in relevant stress differences as measured by milk cortisol concentrations. The biological relevance of the difference in the daily periodicity of milk cortisol concentrations observed between AMSp and AMSf needs further investigation.
Abstract:
BACKGROUND: Radio-frequency electromagnetic fields (RF EMF) of mobile communication systems are widespread in the living environment, yet their effects on humans are uncertain despite a growing body of literature. OBJECTIVES: We investigated the influence of a Universal Mobile Telecommunications System (UMTS) base station-like signal on well-being and cognitive performance in subjects with and without self-reported sensitivity to RF EMF. METHODS: We performed a controlled exposure experiment (45 min at an electric field strength of 0, 1, or 10 V/m, incident with a polarization of 45 degrees from the left back side of the subject, weekly intervals) in a randomized, double-blind crossover design. A total of 117 healthy subjects (33 self-reported sensitive, 84 nonsensitive subjects) participated in the study. We assessed well-being, perceived field strength, and cognitive performance with questionnaires and cognitive tasks and conducted statistical analyses using linear mixed models. Organ-specific and brain tissue-specific dosimetry including uncertainty and variation analysis was performed. RESULTS: In both groups, well-being and perceived field strength were not associated with actual exposure levels. We observed no consistent condition-induced changes in cognitive performance except for two marginal effects. At 10 V/m we observed a slight effect on speed in one of six tasks in the sensitive subjects and an effect on accuracy in another task in nonsensitive subjects. Both effects disappeared after multiple end point adjustment. CONCLUSIONS: In contrast to a recent Dutch study, we could not confirm a short-term effect of UMTS base station-like exposure on well-being. The reported effects on brain functioning were marginal and may have occurred by chance. Peak spatial absorption in brain tissue was considerably smaller than during use of a mobile phone. 
No conclusions can be drawn regarding short-term effects of cell phone exposure or the effects of long-term base station-like exposure on human health.
Abstract:
BACKGROUND: Few data are available on the long-term immunologic response to antiretroviral therapy (ART) in resource-limited settings, where ART is being rapidly scaled up using a public health approach with a limited repertoire of drugs. OBJECTIVES: To describe the immunologic response to ART among patients in a network of cohorts from sub-Saharan Africa, Latin America, and Asia. STUDY POPULATION/METHODS: Treatment-naive patients aged 15 years and older from 27 treatment programs were eligible. Multilevel linear mixed models were used to assess associations between predictor variables and CD4 cell count trajectories following ART initiation. RESULTS: Of 29,175 patients initiating ART, 8933 (31%) were excluded owing to insufficient follow-up time, early loss to follow-up, or death. The remaining 19,967 patients contributed 39,200 person-years on ART and 71,067 CD4 cell count measurements. The median baseline CD4 cell count was 114 cells/microl, with 35% having less than 100 cells/microl. Substantial intersite variation in baseline CD4 cell count was observed (range 61-181 cells/microl). Women had higher median baseline CD4 cell counts than men (121 vs. 104 cells/microl). The median CD4 cell count increased from 114 cells/microl at ART initiation to 230 [interquartile range (IQR) 144-338] at 6 months, 263 (IQR 175-376) at 1 year, 336 (IQR 224-472) at 2 years, 372 (IQR 242-537) at 3 years, 377 (IQR 221-561) at 4 years, and 395 (IQR 240-592) at 5 years. In multivariable models, the baseline CD4 cell count was the most important determinant of subsequent CD4 cell count trajectories. CONCLUSION: These data demonstrate a robust and sustained CD4 response to ART among patients remaining on therapy. Public health and programmatic interventions leading to earlier HIV diagnosis and initiation of ART could substantially improve patient outcomes in resource-limited settings.
Abstract:
Background and Aim In patients with cystic fibrosis (CF), the architecture of the developing lungs and the ventilation of lung units are progressively affected, influencing intrapulmonary gas mixing and gas exchange. We examined the long-term course of blood gas measurements in relation to characteristics of lung function and the influence of different CFTR genotypes upon this process. Methods Serial annual measurements of PaO2 and PaCO2, assessed in relation to lung function (functional residual capacity (FRCpleth), lung clearance index (LCI), trapped gas (VTG), airway resistance (sReff), and forced expiratory indices (FEV1, FEF50)), were collected in 178 children (88 males, 90 females) with CF over an age range of 5 to 18 years. Linear mixed model analysis and binary logistic regression analysis were used to define the predominant lung function parameters influencing oxygenation and carbon dioxide elimination. Results PaO2 decreased linearly from age 5 to 18 years and was mainly associated with FRCpleth (p < 0.0001), FEV1 (p < 0.001), FEF50 (p < 0.002), and LCI (p < 0.002), indicating that oxygenation was associated with the degree of pulmonary hyperinflation, ventilation inhomogeneities, and impeded airway function. PaCO2 showed a transitory phase of low values, mainly during the age range of 5 to 12 years. Both PaO2 and PaCO2 showed different progression slopes for specific CFTR genotypes. Conclusion In the long-term evaluation of gas exchange characteristics, an association with different lung function patterns was found and was closely related to specific genotypes. Early examination of blood gases may reveal hypocarbia, presumably reflecting compensatory mechanisms to improve oxygenation.
Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land-use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than the diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably owing to their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators from plant traits derived from online accessible databases. Aiming to provide a minimal trait set for monitoring the effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 regions in Germany (part of the German 'Biodiversity Exploratories' research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally, or even more, sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering.
These database-derived traits enable the early detection of changes in community structure that are indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
Abstract:
BACKGROUND In many resource-limited settings, monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; the CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS A total of 44,829 patients (median age: 35 years; 58% female; median CD4 count at cART initiation: 116 cells/mm³) were followed up for a median of 1.9 years, with 3706 deaths. Mean CD4 count slopes per week ranged from 1.4 [95% confidence interval (CI): 1.2 to 1.6] cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter at >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS The absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, the CD4 count slope and HIV RNA provide additional, independent prognostic information.
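The per-patient CD4 slope examined above can be sketched as an ordinary least-squares slope over one patient's measurements; in a mixed model this would be a patient-level random slope, but the arithmetic per patient is the same. The measurement values below are invented for illustration.

```python
# Sketch (invented data): per-patient CD4 slope in cells/mm3 per week,
# the quantity underlying the ">25% change over 6 months" exposure.
def weekly_slope(measurements):
    """OLS slope of CD4 count on time for one patient.

    measurements: list of (week, cd4_count) pairs.
    """
    n = len(measurements)
    tbar = sum(t for t, _ in measurements) / n
    ybar = sum(y for _, y in measurements) / n
    num = sum((t - tbar) * (y - ybar) for t, y in measurements)
    den = sum((t - tbar) ** 2 for t, _ in measurements)
    return num / den

# toy patient with suppressed viral load: CD4 rising roughly 1.4 cells/mm3
# per week, in line with the mean slope reported for <400 copies/ml
print(weekly_slope([(0, 116), (12, 133), (26, 152)]))
```

In the study itself, slopes were estimated jointly across all patients with linear mixed models rather than patient by patient, which stabilizes estimates for patients with few measurements.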
Abstract:
OBJECTIVES Zidovudine (ZDV) is recommended for first-line antiretroviral therapy (ART) in resource-limited settings. ZDV may, however, lead to anemia and an impaired immunological response. We compared CD4+ cell counts over 5 years between patients starting ART with and without ZDV in southern Africa. DESIGN Cohort study. METHODS Patients aged at least 16 years who started first-line ART in South Africa, Botswana, Zambia, or Lesotho were included. We used linear mixed-effect models to compare CD4+ cell count trajectories between patients on ZDV-containing regimens and patients on other regimens, censoring follow-up at the first treatment change. Impaired immunological recovery, defined as a CD4+ cell count below 100 cells/μl at 1 year, was assessed using logistic regression. Analyses were adjusted for baseline CD4+ cell count and hemoglobin level, age, sex, type of regimen, viral load monitoring, and calendar year. RESULTS A total of 72,597 patients starting ART, including 19,758 (27.2%) on ZDV, were analyzed. Patients on ZDV had higher CD4+ cell counts (150 vs. 128 cells/μl) and hemoglobin levels (12.0 vs. 11.0 g/dl) at baseline, and were less likely to be women than those on other regimens. Adjusted differences in CD4+ cell counts between regimens containing and not containing ZDV were -16 cells/μl [95% confidence interval (CI) -18 to -14] at 1 year and -56 cells/μl (95% CI -59 to -52) at 5 years. Impaired immunological recovery was more likely with ZDV compared to other regimens (odds ratio 1.40, 95% CI 1.22-1.61). CONCLUSION In southern Africa, ZDV is associated with inferior immunological recovery compared to other backbones. Replacing ZDV with another nucleoside reverse transcriptase inhibitor could avoid unnecessary switches to second-line ART.
Abstract:
Objective: To evaluate a new triaxial accelerometer device for the prediction of energy expenditure, measured as VO2/kg, in obese adults and normal-weight controls during activities of daily life. Subjects and methods: Thirty-seven obese adults (body mass index (BMI) 37 ± 5.4) and seventeen controls (BMI 23 ± 1.8) performed eight activities for 5 to 8 minutes while wearing a triaxial accelerometer on the right thigh. Simultaneously, VO2 and VCO2 were measured using a portable metabolic system. The relationship between accelerometer counts (AC) and VO2/kg was analysed using spline regression and linear mixed-effects models. Results: For all activities, VO2/kg was significantly lower in obese participants than in normal-weight controls. A linear relationship between AC and VO2/kg existed only for accelerometer values from 0 to 300 counts/min, with an increase of 3.7 (95% confidence interval (CI) 3.4-4.1) and 3.9 ml/min (95% CI 3.4-4.3) per increase of 100 counts/min in obese and normal-weight adults, respectively. Linear modelling of the whole range yielded wide prediction intervals for VO2/kg of ±6.3 and ±7.3 ml/min in the two groups. Conclusion: In obese and normal-weight adults, the use of AC for predicting energy expenditure, defined as VO2/kg, from a broad range of physical activities, characterized by varying intensities and types of muscle work, is limited.
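The restricted linear range reported above can be sketched as a guarded linear predictor. The slope (3.7 ml/min per 100 counts/min, obese group) comes from the abstract; the resting intercept of 3.5 ml/min is an assumed placeholder, since the abstract reports no intercept.

```python
# Illustrative sketch only: linear prediction of VO2/kg from accelerometer
# counts, valid only in the 0-300 counts/min range where the abstract
# reports a linear AC-VO2/kg relationship.
SLOPE_PER_100_COUNTS = 3.7  # ml/min per 100 counts/min (obese group, abstract)
RESTING_VO2 = 3.5           # ml/min -- assumed value, not from the abstract

def predict_vo2(counts_per_min):
    """Predict VO2/kg (ml/min) from accelerometer counts per minute."""
    if not 0 <= counts_per_min <= 300:
        raise ValueError("linear relationship holds only for 0-300 counts/min")
    return RESTING_VO2 + SLOPE_PER_100_COUNTS * counts_per_min / 100

print(predict_vo2(200))  # 3.5 + 2 * 3.7 = 10.9
```

Guarding the input range mirrors the study's conclusion: outside 0-300 counts/min a single linear model gives prediction intervals too wide to be useful.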
Abstract:
Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg, respectively, on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found mean (95% confidence interval) decreases in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, an increase in systolic blood pressure [hazard ratio 1.18 (95% CI: 1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor-based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
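The cohort's blood-pressure criterion for confirmed hypertension (elevated readings on 2 consecutive visits) can be encoded directly; this sketch covers only the blood-pressure part of the definition, not the additional requirement of at least 1 cardiovascular risk factor.

```python
# Encoding of the blood-pressure part of the hypertension definition stated
# in the abstract: systolic >139 or diastolic >89 mm Hg on 2 consecutive
# visits. (The "at least 1 additional cardiovascular risk factor" criterion
# is not modeled here.)
def confirmed_hypertension(visits):
    """visits: chronologically ordered (systolic, diastolic) pairs in mm Hg."""
    high = [s > 139 or d > 89 for s, d in visits]
    return any(a and b for a, b in zip(high, high[1:]))

print(confirmed_hypertension([(142, 88), (145, 90), (130, 80)]))  # True
print(confirmed_hypertension([(142, 88), (130, 80), (150, 95)]))  # False
```

The second example shows why "consecutive" matters: two elevated readings separated by a normal visit do not meet the definition.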
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min), or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effect models, the odds of developing a moderate or severe eGFR decrease with logistic regression, and mortality with competing-risk regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF group than in the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop a moderate (adjusted OR: 3.11; 95% CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95% CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of the ART regimen, and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
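The eGFR categories used above map directly to threshold checks; the CKD-EPI formula that produces the eGFR values is not reproduced here.

```python
# The renal-dysfunction categories stated in the abstract (eGFR in mL/min,
# derived from the CKD-EPI formula, which is not reproduced here).
def renal_category(egfr):
    """Classify an eGFR value (mL/min) into the abstract's categories."""
    if egfr >= 90:
        return "normal"
    if egfr >= 60:
        return "mild"
    if egfr >= 30:
        return "moderate"
    return "severe"

print(renal_category(75))  # mild
```

In the study, a patient's category could change over follow-up; the logistic-regression outcome was progression to the moderate or severe category among those starting with no or mild dysfunction.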
Abstract:
Background Agroforestry is a sustainable land use method with a long tradition in the Bolivian Andes. A better understanding of people's knowledge and valuation of woody species can help to adjust actor-oriented agroforestry systems. In this case study, carried out in a peasant community of the Bolivian Andes, we aimed to calculate the cultural importance of selected agroforestry species and to analyse the intracultural variation in the cultural importance and knowledge of plants according to peasants' sex, age, and migration. Methods Data collection was based on semi-structured interviews and freelisting exercises. Two ethnobotanical indices (Composite Salience, Cultural Importance) were used to calculate the cultural importance of plants. Intracultural variation in the cultural importance and knowledge of plants was detected using linear and generalised linear (mixed) models. Results and discussion The culturally most important woody species were mainly trees and exotic species (e.g. Schinus molle, Prosopis laevigata, Eucalyptus globulus). We found that knowledge and valuation of plants increased with age but were lower for migrants; sex, by contrast, played a minor role. The age effects possibly result from the decreasing ecological apparency of valuable native species and their substitution by exotic marketable trees, the loss of traditional plant uses, or the use of other materials (e.g. plastic) instead of wood. Decreasing dedication to traditional farming may have led to the successive abandonment of traditional tool uses, and the overall transformation of woody plant use is possibly related to diminishing medicinal knowledge. Conclusions Age and migration affect how people value woody species and what they know about their uses.
For this reason, we recommend paying particular attention to the potential of native species, which could open promising perspectives, especially for the young migrating peasant generation, and draw their interest to agroforestry. These native species should be ecologically sound and selected for their potential to provide subsistence and promising commercial uses. In addition to offering socio-economic and environmental services, agroforestry initiatives using native trees and shrubs can play a crucial role in recovering elements of the lost ancient landscape that still forms part of local people's collective identity.
Abstract:
OBJECTIVE: The assessment and treatment of psychological distress in cancer patients is recognized as a major challenge. The role of spouses, caregivers, and significant others has become salient, not only because of their supportive functions but also with respect to their own burden. The purpose of this study was to assess the amount of distress in a mixed sample of cancer patients and their partners and to explore the dyadic interdependence. METHODS: An initial sample of 154 dyads was recruited, and distress questionnaires (Hospital Anxiety and Depression Scale, Symptom Checklist 9-Item Short Version, and 12-Item Short Form Health Survey) were administered at four time points. Linear mixed models and actor-partner interdependence models were applied. RESULTS: A significant proportion of patients and their partners (up to 40%) reported high levels of anxiety, depression, and psychological distress and low quality of life over the course of the investigation. Mixed model analyses revealed that higher risks for clinically relevant anxiety and depression in couples exist for female patients and especially for female partners. Although psychological strain decreased over time, the risk for elevated distress in female partners remained. Modeling patient-partner interdependence over time, stratified by patients' gender, revealed specific effects: a moderate correlation between distress in patients and partners, and a transmission of distress from male patients to their female partners. CONCLUSIONS: Our findings provide empirical support for the gender-specific transmission of distress in dyads coping with cancer. This should be considered an important starting point for planning systemic psycho-oncological interventions and conceptualizing further research.