78 results for Linear mixed effect models
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Uterine smooth muscle specimens were collected from euthanized mares in estrus and diestrus. Longitudinal and circular specimens were mounted in organ baths and the signals recorded on a Grass polygraph. After an equilibration period and a 2 g preload, their physiologic isometric contractility was recorded continuously for 2.0 h. Area under the curve, frequency, and time occupied by contractions were studied. Differences between cycle phases, between muscle layers, and over the recorded time periods were statistically evaluated using linear mixed-effect models. In the mare, physiologic contractility of the uterus decreased significantly over time for all variables evaluated (time as a covariate on a continuous scale). For area under the curve, there was a significant effect of muscle layer (longitudinal > circular). For frequency, higher values were recorded in estrus for the circular smooth muscle layer, whereas higher values were seen in the longitudinal smooth muscle layer during diestrus. More time was occupied by contractions in the longitudinal layer than in the circular layer, and in diestrus than in estrus. This study describes physiologic myometrial motility in the organ bath depending on cycle phase.
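The analysis described above, a linear mixed-effects model with time as a continuous covariate and repeated measurements per mare, can be sketched in Python with statsmodels. The data, variable names, and effect sizes below are simulated for illustration only, not the study's data:

```python
# Illustrative simulation: contraction AUC declining over time,
# with a muscle-layer effect and a random intercept per mare.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_mares = 8
times = np.linspace(0, 2, 6)            # hours of recording
layers = ["circular", "longitudinal"]
rows = [(m, t, l) for m in range(n_mares) for t in times for l in layers]
df = pd.DataFrame(rows, columns=["mare", "time_h", "layer"])

mare_effect = rng.normal(0, 1.0, n_mares)            # between-mare variation
df["auc"] = (10.0
             - 2.0 * df["time_h"]                     # decline over time
             + 1.5 * (df["layer"] == "longitudinal")  # longitudinal > circular
             + mare_effect[df["mare"]]
             + rng.normal(0, 0.5, len(df)))

# Random intercept per mare; time and muscle layer as fixed effects.
model = smf.mixedlm("auc ~ time_h + layer", df, groups=df["mare"])
result = model.fit()
print(result.summary())
```

Each mare contributes a random intercept, so the fixed-effect coefficient on `time_h` estimates the within-animal decline in contractility over the recording period.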
Abstract:
OBJECTIVES Zidovudine (ZDV) is recommended for first-line antiretroviral therapy (ART) in resource-limited settings. ZDV may, however, lead to anemia and impaired immunological response. We compared CD4+ cell counts over 5 years between patients starting ART with and without ZDV in southern Africa. DESIGN Cohort study. METHODS Patients aged at least 16 years who started first-line ART in South Africa, Botswana, Zambia, or Lesotho were included. We used linear mixed-effect models to compare CD4+ cell count trajectories between patients on ZDV-containing regimens and patients on other regimens, censoring follow-up at first treatment change. Impaired immunological recovery, defined as a CD4+ cell count below 100 cells/μl at 1 year, was assessed in logistic regression. Analyses were adjusted for baseline CD4+ cell count and hemoglobin level, age, sex, type of regimen, viral load monitoring, and calendar year. RESULTS A total of 72,597 patients starting ART, including 19,758 (27.2%) on ZDV, were analyzed. Patients on ZDV had higher CD4+ cell counts (150 vs. 128 cells/μl) and hemoglobin levels (12.0 vs. 11.0 g/dl) at baseline, and were less likely to be women than those on other regimens. Adjusted differences in CD4+ cell counts between regimens containing and not containing ZDV were -16 cells/μl [95% confidence interval (CI) -18 to -14] at 1 year and -56 cells/μl (95% CI -59 to -52) at 5 years. Impaired immunological recovery was more likely with ZDV than with other regimens (odds ratio 1.40, 95% CI 1.22-1.61). CONCLUSION In southern Africa, ZDV is associated with inferior immunological recovery compared to other backbones. Replacing ZDV with another nucleoside reverse transcriptase inhibitor could avoid unnecessary switches to second-line ART.
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min), or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effect models, the odds of developing a moderate or severe eGFR decrease with logistic regression, and mortality with competing-risk regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF group than in the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop a moderate (adjusted OR: 3.11; 95% CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95% CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
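The eGFR categories above are derived from serum creatinine. A sketch of the 2009 CKD-EPI creatinine equation follows; the abstract does not state which CKD-EPI variant was used, so the equation version, the omission of the race coefficient, and the helper name are all assumptions for illustration:

```python
def ckd_epi_egfr(scr_mg_dl, age, female):
    """eGFR (mL/min/1.73 m^2) from serum creatinine, 2009 CKD-EPI equation.

    Sketch only; race coefficient omitted, and the variant actually used
    in the study is not specified in the abstract.
    """
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141.0
            * min(ratio, 1.0) ** alpha      # applies below the sex-specific knot
            * max(ratio, 1.0) ** -1.209     # applies above the knot
            * 0.993 ** age)                 # age decline
    return egfr * 1.018 if female else egfr

# Categories as used above: mild 60-89, moderate 30-59, severe <30 mL/min.
```

A 40-year-old man with a normal creatinine of 0.9 mg/dL lands in the normal range, while a creatinine of 4.0 mg/dL at age 60 falls into the severe category.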
Abstract:
Background: A small pond, c. 90 years old, near Bern, Switzerland contains a population of threespine stickleback (Gasterosteus aculeatus) with two distinct male phenotypes. Males of one type are large and red, and nest in the shallow littoral zone. Males of the other type are small and orange, and nest offshore at slightly greater depth. The females in this population are phenotypically highly variable but cannot easily be assigned to either male type. Question: Is the existence of two sympatric male morphs maintained by substrate-associated male nest site choice and facilitated by female mate preferences? Organisms: Male stickleback caught individually at their breeding sites; females caught with minnow traps. Methods: In experimental tanks, we simulated the slope and substrate of the two nesting habitats. We then placed individual males in a tank and observed in which habitat each male built his nest. In a simultaneous two-stimulus choice design, we gave females the choice between a large, red male and a small, orange one. We measured female morphology and used linear mixed effect models to determine whether female preference correlated with female morphology. Results: Both red and orange males preferred nesting in the habitat that simulated the slightly deeper offshore condition, the habitat occupied by the small, orange males in the pond itself. The proportion of females that chose a small orange male was similar to the proportion that chose a large red male. Several aspects of female phenotype correlated with the male type that a female preferred.
Abstract:
PURPOSE. To evaluate the role of fellow eye status in determining progression of geographic atrophy (GA) in patients with age-related macular degeneration (AMD). METHODS. A total of 300 eyes with GA of 193 patients from the prospective, longitudinal, natural history FAM Study were classified into three groups according to the AMD manifestation in the fellow eye at baseline examination: (1) bilateral GA, (2) early/intermediate AMD, and (3) exudative AMD. GA areas were quantified based on fundus autofluorescence images using a semiautomated image-processing method, and progression rates (PR) were estimated using two-level linear mixed-effects models. RESULTS. Crude GA-PR in the bilateral GA group (mean, 1.64 mm(2)/y; 95% CI, 1.478-1.803) was significantly higher than in the fellow eye early/intermediate group (0.74 mm(2)/y, 0.146-1.342). Although there was a significant difference in baseline GA size (P = 0.0013, t-test), and there was a significant increase in GA-PR of 0.11 mm(2)/y (0.05-0.17) per 1 disc area (DA; 2.54 mm(2)), an additional mean change of -0.79 (-1.43 to -0.15) applied to the PR besides the effect of baseline GA size. However, this difference was only significant when GA size was ≥1 DA at baseline, with a GA-PR of 1.70 mm(2)/y (1.54-1.85) in the bilateral and 0.95 mm(2)/y (0.37-1.54) in the early/intermediate group. There was no significant difference in PR compared with that in the fellow eye exudative group. CONCLUSIONS. The results indicate that the AMD manifestation of the fellow eye at baseline serves as an indicator for disease progression in eyes with GA ≥ 1 DA. Predictive characteristics not only contribute to the understanding of pathophysiological mechanisms, but are also useful for the design of future interventional trials in GA patients.
Abstract:
Pspline uses xtmixed to fit a penalized spline regression and plots the smoothed function. Additional covariates can be specified to adjust the smooth and plot partial residuals.
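The mixed-model representation of a penalized spline that pspline exploits can be sketched outside Stata. In this Python sketch, a fixed smoothing parameter lam stands in for the variance ratio that xtmixed would estimate from the data; the function name and the basis choice (truncated lines) are illustrative assumptions:

```python
import numpy as np

def pspline_fit(x, y, n_knots=10, lam=1.0):
    """Penalized spline via ridge-penalized least squares on a
    truncated-line basis.

    In the mixed-model formulation, the truncated-line coefficients are
    the 'random effects' and lam plays the role of the error-to-random-
    effect variance ratio that xtmixed estimates.
    """
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    # Columns: intercept, linear term, then (x - knot)_+ for each knot.
    X = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - k, 0.0) for k in knots])
    # Penalize only the truncated-line coefficients, not the linear part.
    D = np.diag([0.0, 0.0] + [1.0] * n_knots)
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    return X @ beta

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 200)
smooth = pspline_fit(x, y)
```

Because the penalty shrinks only the knot coefficients, the fit reduces to an ordinary linear regression as lam grows, which is the usual behavior of a penalized spline.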
Abstract:
Objective Malnutrition is common in HIV-infected children in Africa and an indication for antiretroviral treatment (ART). We examined anthropometric status and response to ART in children treated at a large public-sector clinic in Malawi. Methods All children aged <15 years who started ART between January 2001 and December 2006 were included and followed until March 2008. Weight and height were measured at regular intervals from 1 year before to 2 years after the start of ART. Sex- and age-standardized z-scores were calculated for weight-for-age (WAZ) and height-for-age (HAZ). Predictors of growth were identified in multivariable mixed-effect models. Results A total of 497 children started ART and were followed for 972 person-years. Median age (interquartile range; IQR) was 8 years (4–11 years). Most children were underweight (52% of children), stunted (69%), in advanced clinical stages (94% in WHO stages 3 or 4) and had severe immunodeficiency (77%). After starting ART, median (IQR) WAZ and HAZ increased from −2.1 (−2.7 to −1.3) and −2.6 (−3.6 to −1.8) to −1.4 (−2.1 to −0.8) and −1.8 (−2.4 to −1.1) at 24 months, respectively (P < 0.001). In multivariable models, baseline WAZ and HAZ scores were the most important determinants of growth trajectories on ART. Conclusions Despite a sustained growth response to ART among children remaining on therapy, normal values were not reached. Interventions leading to earlier HIV diagnosis and initiation of treatment could improve growth response.
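Sex- and age-standardized z-scores such as WAZ and HAZ are usually derived with the LMS method used by the WHO/CDC growth references. A minimal sketch, with hypothetical reference values (real analyses look up L, M, and S from the published tables for each age and sex):

```python
import math

def lms_zscore(x, L, M, S):
    """Standardized z-score via the LMS method used by WHO/CDC growth
    references; L (skewness), M (median), S (coefficient of variation)
    come from the reference table for the child's age and sex."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Hypothetical reference values for illustration only (not WHO table entries):
# a child weighing 20 kg against a reference median of 26 kg.
waz = lms_zscore(x=20.0, L=-0.2, M=26.0, S=0.12)
```

With these illustrative values the child falls below -2, i.e. into the underweight range used in the abstract.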
Abstract:
Despite the impact of red blood cell (RBC) life-spans in some disease areas such as diabetes or anemia of chronic kidney disease, there is no consensus on how best to describe the process quantitatively. Several models have been proposed to explain the elimination process of RBCs: a random destruction process, a homogeneous life-span model, or a 4-transit-compartment model. The aim of this work was to explore the different models that have been proposed in the literature, and modifications to those. The impact of choosing the right model on the prediction of future outcomes in the above-mentioned areas was also investigated. Data from both indirect (clinical data) and direct life-span measurement (biotin-labeled data) methods were analyzed using non-linear mixed-effects models. The analysis showed that: (1) predictions from non-steady-state data will depend on the RBC model chosen; (2) the transit-compartment model, which considers variation in life-span across the RBC population, describes RBC survival data better than the random destruction or homogeneous life-span models; and (3) additionally incorporating random destruction patterns, although improving the description of the RBC survival data, does not appear to provide a marked improvement when describing clinical data.
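The three life-span models compared above imply different RBC survival curves, and a k-transit-compartment model corresponds to an Erlang-distributed life-span. A numerical sketch (not the authors' code; the mean life-span and k below are illustrative):

```python
import numpy as np
from scipy.stats import gamma

mean_lifespan = 120.0                  # days; illustrative value for RBCs
t = np.arange(0, 301.0)                # days since labeling

# Random destruction: constant hazard, so exponential survival.
surv_random = np.exp(-t / mean_lifespan)
# Homogeneous life-span: every cell lives exactly mean_lifespan days.
surv_fixed = (t < mean_lifespan).astype(float)
# k-transit-compartment model: Erlang(k) life-span with the same mean,
# i.e. some spread around the mean, but far less than the exponential.
k = 4
surv_transit = gamma.sf(t, a=k, scale=mean_lifespan / k)
```

At the mean life-span the transit model retains more cells than the constant-hazard model while avoiding the unrealistic cliff of the fixed-life-span model, which is consistent with the abstract's finding that it describes RBC survival data best.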
Abstract:
There is growing evidence that the great phenotypic variability in patients with cystic fibrosis (CF) depends not only on the genotype and on a combination of environmental and stochastic factors, but predominantly also on modifier gene effects. It has been proposed that genes interacting with the CF transmembrane conductance regulator (CFTR) and the epithelial sodium channel (ENaC) are potential modifiers. Therefore, we assessed the impact of single-nucleotide polymorphisms (SNPs) in several of these interactors on CF disease outcome. SNPs that potentially alter gene function were genotyped in 95 well-characterized p.Phe508del homozygous CF patients. Linear mixed-effect model analysis was used to assess the relationship between sequence variants and repeated measurements of lung function parameters. In total, we genotyped 72 SNPs in 10 genes. Twenty-five SNPs were used for statistical analysis, where we found strong associations for one SNP in PPP2R4 with the lung clearance index (P ≤ 0.01), the specific effective airway resistance (P ≤ 0.005), and the forced expiratory volume in 1 s (P ≤ 0.005). In addition, we identified one SNP in SNAP23 that was significantly associated with three lung function parameters, as well as one SNP in PPP2R1A and three in KRT19 that showed a significant influence on one lung function parameter each. Our findings indicate that direct interactors with CFTR, such as SNAP23, PPP2R4 and PPP2R1A, may modify the residual function of p.Phe508del-CFTR, while variants in KRT19 may modulate the amount of p.Phe508del-CFTR at the apical membrane and consequently modify CF disease.
Abstract:
Milk cortisol concentration was determined under routine management conditions on 4 farms with an auto-tandem milking parlor and 8 farms with 1 of 2 automatic milking systems (AMS). One of the AMS was a partially forced (AMSp) system, and the other was a free cow traffic (AMSf) system. Milk samples were collected for all the cows on a given farm (20 to 54 cows) for at least 1 d. Behavioral observations were made during the milking process for a subset of 16 to 20 cows per farm. Milk cortisol concentration was evaluated by milking system, time of day, behavior during milking, daily milk yield, and somatic cell count using linear mixed-effects models. Milk cortisol did not differ between systems (AMSp: 1.15 ± 0.07; AMSf: 1.02 ± 0.12; auto-tandem parlor: 1.01 ± 0.16 nmol/L). Cortisol concentrations were lower in evening than in morning milkings (1.01 ± 0.12 vs. 1.24 ± 0.13 nmol/L). The daily periodicity of cortisol concentration was characterized by an early morning peak and a late afternoon elevation in AMSp. A bimodal pattern was not evident in AMSf. Finally, milk cortisol decreased by a factor of 0.915 in milking parlors, by 0.998 in AMSp, and increased by a factor of 1.161 in AMSf for each unit of ln(somatic cell count/1,000). We conclude that milking cows in milking parlors or AMS does not result in relevant stress differences as measured by milk cortisol concentrations. The biological relevance of the difference regarding the daily periodicity of milk cortisol concentrations observed between the AMSp and AMSf needs further investigation.
Abstract:
Objective: To evaluate a new triaxial accelerometer device for prediction of energy expenditure, measured as VO2/kg, in obese adults and normal-weight controls during activities of daily life. Subjects and methods: Thirty-seven obese adults (Body Mass Index (BMI) 37±5.4) and seventeen controls (BMI 23±1.8) performed eight activities for 5 to 8 minutes while wearing a triaxial accelerometer on the right thigh. Simultaneously, VO2 and VCO2 were measured using a portable metabolic system. The relationship between accelerometer counts (AC) and VO2/kg was analysed using spline regression and linear mixed-effects models. Results: For all activities, VO2/kg was significantly lower in obese participants than in normal-weight controls. A linear relationship between AC and VO2/kg existed only within accelerometer values from 0 to 300 counts/min, with an increase of 3.7 ml/min (95% confidence interval (CI) 3.4-4.1) and 3.9 ml/min (95% CI 3.4-4.3) per increase of 100 counts/min in obese and normal-weight adults, respectively. Linear modelling of the whole range yielded wide prediction intervals for VO2/kg of ±6.3 and ±7.3 ml/min in the two groups. Conclusion: In obese and normal-weight adults, the use of AC for predicting energy expenditure, defined as VO2/kg, from a broad range of physical activities, characterized by varying intensities and types of muscle work, is limited.
Abstract:
The southernmost European natural and planted pine forests are among the areas most vulnerable to warming-induced drought decline. Both drought stress and management factors (e.g., stand origin or reduced thinning) may induce decline by reducing the water available to trees, but their relative importance has not been properly assessed. The role of stand origin, densely planted vs. naturally regenerated stands, as a decline driver can be assessed by comparing the growth and vigor responses to drought of similar natural vs. planted stands. Here, we compare these responses in natural and planted Black pine (Pinus nigra) stands located in southern Spain. We analyze how environmental factors, namely climatic (temperature and precipitation anomalies) and site conditions, and biotic factors, namely stand structure (age, tree size, density) and defoliation by the pine processionary moth, drive radial growth and crown condition at stand and tree levels. We also assess the climatic trends in the study area over the last 60 years. We use dendrochronology, linear mixed-effects models of basal area increment, and structural equation models to determine how natural and planted stands respond to drought and current competition intensity. We observed that a temperature rise and a decrease in precipitation during the growing period led to increasing drought stress during the late 20th century. Trees from planted stands experienced stronger growth reductions and displayed more severe crown defoliation after severe droughts than those from natural stands. High stand density negatively drove growth and enhanced crown dieback, particularly in planted stands. Pine processionary moth defoliation affected growth more severely in natural than in planted stands, but affected tree crown condition similarly in both stand types.
In response to drought, the sharp growth reduction and widespread defoliation of planted Mediterranean pine stands indicate that they are more vulnerable and less resilient to drought stress than natural stands. To mitigate forest decline of planted stands in xeric areas such as the Mediterranean Basin, less dense and more diverse stands should be created through selective thinning or by selecting species or provenances that are more drought tolerant.
Abstract:
Subcortical volumetric brain abnormalities have been observed in mood disorders. However, it is unknown whether these reflect pre-existing vulnerability to mood disorders or emerge at illness onset. Magnetic resonance imaging was conducted at baseline and after two years in 111 initially unaffected young adults at increased risk of mood disorders because of a close family history of bipolar disorder and in 93 healthy controls (HC). During follow-up, 20 high-risk subjects developed major depressive disorder (HR-MDD), with the others remaining well (HR-well). Volumes of the lateral ventricles, caudate, putamen, pallidum, thalamus, hippocampus, and amygdala were extracted for each hemisphere. Using linear mixed-effects models, differences and longitudinal changes in subcortical volumes were investigated between groups (HC, HR-MDD, HR-well). There were no significant differences in any subcortical volume between groups after controlling for multiple testing. Additionally, no significant differences emerged between groups over time. Our results indicate that volumetric subcortical brain abnormalities of these regions, as measured here, do not appear to form familial trait markers of vulnerability to mood disorders in close relatives of bipolar disorder patients over the two-year period studied. Moreover, they do not appear to decrease in response to illness onset, at least for the time period studied.
Abstract:
Infrared thermography (IRT) was used to assess the effect of routine claw trimming on claw temperature. In total, 648 IRT observations each were collected from 81 cows housed in 6 tiestalls before and 3 wk after claw trimming. The feet were classified as either healthy (nonlesion group, n = 182) or affected with infectious foot disorders (group IFD, n = 142). The maximal surface temperatures of the coronary band and skin and the difference of the maximal temperatures (ΔT) between the lateral and medial claws of the respective foot were assessed. Linear mixed models, correcting for the hierarchical structure of the data, ambient temperature, and infectious status of the claws, were developed to evaluate the effect of time in relation to the trimming event (d 0 versus d 21) and claw (medial versus lateral). Front feet and hind feet were analyzed separately. Ambient temperature and infectious foot status were identified as external and internal factors, respectively, that significantly affected claw temperature. Before claw trimming, the lateral claws of the hind feet were significantly warmer compared with the medial claws, whereas such a difference was not evident for the claws of the front feet. At d 21, ΔT of the hind feet was reduced by ≥ 0.25 °C, whereas it was increased by ≤ 0.13 °C in the front feet compared with d 0. Therefore, trimming was associated with a remarkable decrease of ΔT of the hind claws. Equalizing the weight bearing of the hind feet by routine claw trimming is associated with a measurable reduction of ΔT between the paired hind claws.