69 results for linear mixed-effects models
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
PURPOSE. To evaluate the role of fellow eye status in determining progression of geographic atrophy (GA) in patients with age-related macular degeneration (AMD). METHODS. A total of 300 eyes with GA of 193 patients from the prospective, longitudinal, natural history FAM Study were classified into three groups according to the AMD manifestation in the fellow eye at baseline examination: (1) bilateral GA, (2) early/intermediate AMD, and (3) exudative AMD. GA areas were quantified based on fundus autofluorescence images using a semiautomated image-processing method, and progression rates (PR) were estimated using two-level, linear, mixed-effects models. RESULTS. Crude GA-PR in the bilateral GA group (mean, 1.64 mm(2)/y; 95% CI, 1.478-1.803) was significantly higher than in the fellow eye early/intermediate group (0.74 mm(2)/y, 0.146-1.342). Although baseline GA size differed significantly between groups (P = 0.0013, t-test) and GA-PR increased significantly by 0.11 mm(2)/y (0.05-0.17) per 1 disc area (DA; 2.54 mm(2)) of baseline GA, an additional mean change of -0.79 (-1.43 to -0.15) in the PR remained beyond the effect of baseline GA size. However, this difference was only significant when GA size was ≥1 DA at baseline, with a GA-PR of 1.70 mm(2)/y (1.54-1.85) in the bilateral and 0.95 mm(2)/y (0.37-1.54) in the early/intermediate group. There was no significant difference in PR compared with that in the fellow eye exudative group. CONCLUSIONS. The results indicate that the AMD manifestation of the fellow eye at baseline serves as an indicator for disease progression in eyes with GA ≥1 DA. Predictive characteristics not only contribute to the understanding of pathophysiological mechanisms, but are also useful for the design of future interventional trials in GA patients.
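The two-level model mentioned above (repeated GA measurements nested in eyes, eyes nested in patients) can be approximated with standard mixed-model software. Below is a minimal, illustrative Python sketch using statsmodels; the input file, column names (patient, eye, years, ga_area, fellow_group, baseline_ga), and the simplified random-effects structure are assumptions for illustration, not the FAM Study's actual specification.

```python
# Minimal sketch of a two-level linear mixed-effects model for GA growth,
# assuming a long-format table with hypothetical column names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ga_long.csv")  # hypothetical file: one row per eye per visit

# Fixed effects: time, fellow-eye group, their interaction (the group-specific
# progression rate), and a baseline-size-by-time term.
# Random effects: patient-level intercept and slope in time; eye within patient
# is represented as a variance component on the intercept.
model = smf.mixedlm(
    "ga_area ~ years * fellow_group + baseline_ga:years",
    data=df,
    groups="patient",
    re_formula="~years",
    vc_formula={"eye": "0 + C(eye)"},
)
fit = model.fit(reml=True)
print(fit.summary())
```

In such a parameterisation, the group-specific progression rates reported in the abstract correspond to the time slope plus its interaction terms.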
Abstract:
Despite the impact of red blood cell (RBC) life-spans in some disease areas such as diabetes or anemia of chronic kidney disease, there is no consensus on how best to describe the process quantitatively. Several models have been proposed to explain the elimination process of RBCs: a random destruction process, a homogeneous life-span model, or a series of 4 transit compartments. The aim of this work was to explore the different models that have been proposed in the literature, and modifications to those. The impact of choosing the right model on the prediction of future outcomes in the above-mentioned areas was also investigated. Data from both indirect (clinical data) and direct (biotin-labeled data) life-span measurement methods were analyzed using non-linear mixed-effects models. The analysis showed that: (1) predictions from non-steady-state data will depend on the RBC model chosen; (2) the transit compartment model, which allows for variation in life-span within the RBC population, describes RBC survival data better than the random destruction or homogeneous life-span models; and (3) the additional incorporation of random destruction patterns, although improving the description of the RBC survival data, does not appear to provide a marked improvement when describing clinical data.
Abstract:
Milk cortisol concentration was determined under routine management conditions on 4 farms with an auto-tandem milking parlor and 8 farms with 1 of 2 automatic milking systems (AMS). One of the AMS was a partially forced (AMSp) system, and the other was a free cow traffic (AMSf) system. Milk samples were collected for all the cows on a given farm (20 to 54 cows) for at least 1 d. Behavioral observations were made during the milking process for a subset of 16 to 20 cows per farm. Milk cortisol concentration was evaluated by milking system, time of day, behavior during milking, daily milk yield, and somatic cell count using linear mixed-effects models. Milk cortisol did not differ between systems (AMSp: 1.15 ± 0.07; AMSf: 1.02 ± 0.12; auto-tandem parlor: 1.01 ± 0.16 nmol/L). Cortisol concentrations were lower in evening than in morning milkings (1.01 ± 0.12 vs. 1.24 ± 0.13 nmol/L). The daily periodicity of cortisol concentration was characterized by an early morning peak and a late afternoon elevation in AMSp. A bimodal pattern was not evident in AMSf. Finally, milk cortisol decreased by a factor of 0.915 in milking parlors and by 0.998 in AMSp, and increased by a factor of 1.161 in AMSf, for each unit of ln(somatic cell count/1,000). We conclude that milking cows in milking parlors or AMS does not result in relevant stress differences as measured by milk cortisol concentrations. The biological relevance of the difference in the daily periodicity of milk cortisol concentrations observed between AMSp and AMSf needs further investigation.
Abstract:
Objective: To evaluate a new triaxial accelerometer device for prediction of energy expenditure, measured as VO2/kg, in obese adults and normal-weight controls during activities of daily life. Subjects and methods: Thirty-seven obese adults (body mass index (BMI) 37 ± 5.4) and seventeen controls (BMI 23 ± 1.8) performed eight activities for 5 to 8 minutes while wearing a triaxial accelerometer on the right thigh. Simultaneously, VO2 and VCO2 were measured using a portable metabolic system. The relationship between accelerometer counts (AC) and VO2/kg was analysed using spline regression and linear mixed-effects models. Results: For all activities, VO2/kg was significantly lower in obese participants than in normal-weight controls. A linear relationship between AC and VO2/kg existed only within accelerometer values from 0 to 300 counts/min, with an increase of 3.7 (95% confidence interval (CI) 3.4-4.1) and 3.9 ml/min (95% CI 3.4-4.3) per increase of 100 counts/min in obese and normal-weight adults, respectively. Linear modelling of the whole range yielded wide prediction intervals for VO2/kg of ±6.3 and ±7.3 ml/min in the two groups. Conclusion: In obese and normal-weight adults, the use of AC for predicting energy expenditure, defined as VO2/kg, from a broad range of physical activities characterized by varying intensities and types of muscle work is limited.
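The single change of slope at about 300 counts/min described above can be expressed as a one-knot (piecewise-linear) spline term inside a random-intercept model. The sketch below is illustrative only; the file name and column names (subject, ac, vo2_per_kg, obese) are assumptions, not the authors' data layout.

```python
# Illustrative piecewise-linear mixed model: the slope of VO2/kg on
# accelerometer counts (AC) is allowed to change at 300 counts/min.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("accelerometer.csv")  # hypothetical long-format data

knot = 300.0
df["ac_above_knot"] = np.clip(df["ac"] - knot, 0.0, None)  # hinge term

# Random intercept per subject; separate slopes below and above the knot,
# each allowed to differ between obese and normal-weight participants.
model = smf.mixedlm(
    "vo2_per_kg ~ (ac + ac_above_knot) * obese",
    data=df,
    groups="subject",
)
fit = model.fit()
print(fit.params)
```

Here the coefficient on ac is the slope below the knot (per count/min), and the coefficient on ac_above_knot is the change in slope above it.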
Abstract:
The southernmost European natural and planted pine forests are among the areas most vulnerable to warming-induced drought decline. Both drought stress and management factors (e.g., stand origin or reduced thinning) may induce decline by reducing the water available to trees, but their relative importance has not been properly assessed. The role of stand origin - densely planted vs. naturally regenerated stands - as a decline driver can be assessed by comparing the growth and vigor responses to drought of similar natural vs. planted stands. Here, we compare these responses in natural and planted Black pine (Pinus nigra) stands located in southern Spain. We analyze how environmental factors - climatic (temperature and precipitation anomalies) and site conditions - and biotic factors - stand structure (age, tree size, density) and defoliation by the pine processionary moth - drive radial growth and crown condition at stand and tree levels. We also assess the climatic trends in the study area over the last 60 years. We use dendrochronology, linear mixed-effects models of basal area increment, and structural equation models to determine how natural and planted stands respond to drought and current competition intensity. We observed that a temperature rise and a decrease in precipitation during the growing period led to increasing drought stress during the late 20th century. Trees from planted stands experienced stronger growth reductions and displayed more severe crown defoliation after severe droughts than those from natural stands. High stand density negatively affected growth and increased crown dieback, particularly in planted stands. Pine processionary moth defoliation also reduced growth more severely in natural than in planted stands, but affected tree crown condition similarly in both stand types. The sharp growth reduction and widespread defoliation of planted Mediterranean pine stands in response to drought indicate that they are more vulnerable and less resilient to drought stress than natural stands. To mitigate forest decline of planted stands in xeric areas such as the Mediterranean Basin, less dense and more diverse stands should be created through selective thinning or by selecting species or provenances that are more drought tolerant. (C) 2013 Elsevier B.V. All rights reserved.
Abstract:
Subcortical volumetric brain abnormalities have been observed in mood disorders. However, it is unknown whether these reflect adverse effects predisposing to mood disorders or emerge at illness onset. Magnetic resonance imaging was conducted at baseline and after two years in 111 initially unaffected young adults at increased risk of mood disorders because of a close family history of bipolar disorder, and in 93 healthy controls (HC). During follow-up, 20 high-risk subjects developed major depressive disorder (HR-MDD), with the others remaining well (HR-well). Volumes of the lateral ventricles, caudate, putamen, pallidum, thalamus, hippocampus, and amygdala were extracted for each hemisphere. Using linear mixed-effects models, differences and longitudinal changes in subcortical volumes were investigated between groups (HC, HR-MDD, HR-well). There were no significant differences in any subcortical volume between groups after controlling for multiple testing. Additionally, no significant differences emerged between groups over time. Our results indicate that, with the current method, volumetric subcortical abnormalities of these regions do not appear to form familial trait markers for vulnerability to mood disorders in close relatives of bipolar disorder patients over the two-year period studied. Moreover, these volumes do not appear to decrease in response to illness onset, at least for the period studied.
Abstract:
On Swiss rabbit breeding farms, group-housed does are usually kept singly for 12 days around parturition to avoid pseudogravidity, double litters, and deleterious fighting for nests. After this isolation phase, new group members are usually integrated. Here we studied whether keeping the group composition stable would reduce agonistic interactions, stress levels, and injuries when regrouping after the isolation phase. Does were kept in 12 pens containing 8 rabbits each. In two trials, with a total of 24 groups, the group composition before and after the 12-day isolation period remained the same (treatment: stable, S) in 12 groups. In the other 12 groups, two or three does were replaced after the isolation phase by unfamiliar does (treatment: mixed, M). Does of S-groups had been housed together for one reproduction cycle. One day before and on days 2, 4 and 6 after regrouping, data on lesions, stress levels (faecal corticosterone metabolites, FCM) and agonistic interactions were collected and statistically analysed using mixed-effects models. Lesion scores and the frequency of agonistic interactions were highest on day 2 after regrouping and decreased thereafter in both treatments. There was a trend towards more lesions in M-groups compared with S-groups. After regrouping, FCM levels were increased in M-groups but not in S-groups. Furthermore, there was a significant interaction of treatment and experimental day on agonistic interactions: the frequency of biting and boxing increased more in M-groups than in S-groups. These findings indicate that group stability had an effect on agonistic interactions, stress and lesions. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Pspline uses xtmixed to fit a penalized spline regression and plots the smoothed function. Additional covariates can be specified to adjust the smooth and plot partial residuals.
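Pspline relies on the standard equivalence between a penalized spline and a linear mixed model: the coefficients of a truncated spline basis are treated as random effects, so the ratio of variance components plays the role of the smoothing parameter, which is what estimation via xtmixed exploits. The sketch below illustrates that general idea in Python with statsmodels rather than the Stata command itself; the toy data, knot placement, and basis choice are assumptions for illustration.

```python
# Rough analogue of the penalized-spline-as-mixed-model idea: a linear fixed
# part plus truncated-line basis terms whose coefficients are shrunk as iid
# random effects (all rows placed in a single "group").
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(0, 0.3, size=x.size)          # toy data

knots = np.quantile(x, np.linspace(0.05, 0.95, 15))
basis = np.maximum(x[:, None] - knots[None, :], 0.0)      # truncated-line basis
spline_cols = [f"z{k}" for k in range(basis.shape[1])]

df = pd.concat(
    [pd.DataFrame({"y": y, "x": x, "g": 0}),
     pd.DataFrame(basis, columns=spline_cols)],
    axis=1,
)

# One shared variance component for all basis coefficients acts as the penalty;
# re_formula="0" drops the (redundant) random intercept for the single group.
vc = {"spline": "0 + " + " + ".join(spline_cols)}
fit = smf.mixedlm("y ~ x", data=df, groups="g",
                  re_formula="0", vc_formula=vc).fit(reml=True)

smooth = fit.fittedvalues  # fixed part plus predicted random effects = the smooth
print(fit.summary())
```

Adjusting the smooth for additional covariates, as the command description mentions, amounts to adding those covariates to the fixed-effects formula.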
Abstract:
Uterine smooth muscle specimens were collected from euthanized mares in estrus and diestrus. Longitudinal and circular specimens were mounted in organ baths and the signals recorded on a Grass polygraph. After an equilibration period and a 2 g preload, physiologic isometric contractility was recorded continuously for 2.0 h. Area under the curve, frequency, and time occupied by contractions were studied. Differences between cycle phases, between muscle layers, and over the recorded time periods were statistically evaluated using linear mixed-effects models. In the mare, physiologic contractility of the uterus decreased significantly over time for all variables evaluated (time as a continuous covariate). For area under the curve, there was a significant effect of muscle layer (longitudinal > circular). For frequency, higher values were recorded in the circular smooth muscle layer during estrus, whereas higher values were seen in the longitudinal smooth muscle layer during diestrus. More time was occupied by contractions in the longitudinal layer than in the circular layer, and in diestrus than in estrus. This study describes physiologic myometrial motility in the organ bath as a function of cycle phase.
Abstract:
BACKGROUND: Estimates of the decrease in CD4(+) cell counts in untreated patients with human immunodeficiency virus (HIV) infection are important for patient care and public health. We analyzed CD4(+) cell count decreases in the Cape Town AIDS Cohort and the Swiss HIV Cohort Study. METHODS: We used mixed-effects models and joint models that allowed for the correlation between CD4(+) cell count decreases and survival and stratified analyses by the initial cell count (50-199, 200-349, 350-499, and 500-750 cells/microL). Results are presented as the mean decrease in CD4(+) cell count with 95% confidence intervals (CIs) during the first year after the initial CD4(+) cell count. RESULTS: A total of 784 South African (629 nonwhite) and 2030 Swiss (218 nonwhite) patients with HIV infection contributed 13,388 CD4(+) cell counts. Decreases in CD4(+) cell count were steeper in white patients, patients with higher initial CD4(+) cell counts, and older patients. Decreases ranged from a mean of 38 cells/microL (95% CI, 24-54 cells/microL) in nonwhite patients from the Swiss HIV Cohort Study 15-39 years of age with an initial CD4(+) cell count of 200-349 cells/microL to a mean of 210 cells/microL (95% CI, 143-268 cells/microL) in white patients in the Cape Town AIDS Cohort ≥40 years of age with an initial CD4(+) cell count of 500-750 cells/microL. CONCLUSIONS: Among both patients from Switzerland and patients from South Africa, CD4(+) cell count decreases were greater in white patients with HIV infection than they were in nonwhite patients with HIV infection.
Abstract:
OBJECTIVES Zidovudine (ZDV) is recommended for first-line antiretroviral therapy (ART) in resource-limited settings. ZDV may, however, lead to anemia and impaired immunological response. We compared CD4+ cell counts over 5 years between patients starting ART with and without ZDV in southern Africa. DESIGN Cohort study. METHODS Patients aged at least 16 years who started first-line ART in South Africa, Botswana, Zambia, or Lesotho were included. We used linear mixed-effect models to compare CD4+ cell count trajectories between patients on ZDV-containing regimens and patients on other regimens, censoring follow-up at first treatment change. Impaired immunological recovery, defined as a CD4+ cell count below 100 cells/μl at 1 year, was assessed in logistic regression. Analyses were adjusted for baseline CD4+ cell count and hemoglobin level, age, sex, type of regimen, viral load monitoring, and calendar year. RESULTS A total of 72,597 patients starting ART, including 19,758 (27.2%) on ZDV, were analyzed. Patients on ZDV had higher CD4+ cell counts (150 vs. 128 cells/μl) and hemoglobin level (12.0 vs. 11.0 g/dl) at baseline, and were less likely to be women than those on other regimens. Adjusted differences in CD4+ cell counts between regimens containing and not containing ZDV were -16 cells/μl [95% confidence interval (CI) -18 to -14] at 1 year and -56 cells/μl (95% CI -59 to -52) at 5 years. Impaired immunological recovery was more likely with ZDV compared to other regimens (odds ratio 1.40, 95% CI 1.22-1.61). CONCLUSION In southern Africa, ZDV is associated with inferior immunological recovery compared to other backbones. Replacing ZDV with another nucleoside reverse transcriptase inhibitor could avoid unnecessary switches to second-line ART.
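The binary endpoint above (CD4+ count below 100 cells/μl at 1 year) lends itself to an ordinary logistic regression with the adjustment variables the abstract lists. The sketch below is a hedged illustration, not the authors' code; the file name, column names, and codings are hypothetical.

```python
# Minimal sketch of an adjusted logistic regression for impaired
# immunological recovery (CD4+ < 100 cells/ul at 1 year).
# All column names (zdv, cd4_1yr, cd4_baseline, ...) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("art_cohort.csv")  # hypothetical one-row-per-patient table
df["impaired"] = (df["cd4_1yr"] < 100).astype(int)

fit = smf.logit(
    "impaired ~ zdv + cd4_baseline + hemoglobin + age + sex"
    " + regimen_type + vl_monitoring + C(calendar_year)",
    data=df,
).fit()

# Odds ratio for ZDV-containing vs other regimens, with 95% CI
print(np.exp(fit.params["zdv"]), np.exp(fit.conf_int().loc["zdv"]).values)
```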
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min) or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effect models; the odds of developing a moderate or severe eGFR decrease, with logistic regression; and mortality, with competing risk regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF group than in the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop a moderate (adjusted OR: 3.11; 95% CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95% CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen, and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
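The renal-function categories quoted above are derived from estimated GFR. A small helper implementing one common version of the CKD-EPI creatinine equation (the 2009 formula, including the race coefficient in use at the time) and the mild/moderate/severe cut-offs from the abstract is sketched here; which CKD-EPI variant and unit conventions the Lusaka analysis actually used is an assumption (the 2009 equation returns eGFR normalized to 1.73 m² of body surface area).

```python
# Sketch of eGFR via the 2009 CKD-EPI creatinine equation and the categories
# quoted in the abstract (mild 60-89, moderate 30-59, severe <30).
# Assumes serum creatinine in mg/dL; returns mL/min/1.73 m^2.
def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def renal_category(egfr: float) -> str:
    if egfr < 30:
        return "severe"
    if egfr < 60:
        return "moderate"
    if egfr < 90:
        return "mild"
    return "normal"

# Example: a 45-year-old woman with serum creatinine 1.1 mg/dL
e = ckd_epi_2009(1.1, 45, female=True, black=False)
print(round(e, 1), renal_category(e))
```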
Abstract:
OBJECTIVES The aim of the present longitudinal study was to investigate bacterial colonization of the internal implant cavity and to evaluate a possible association with peri-implant bone loss. METHODS A total of 264 paper point samples were harvested from the intra-implant cavity of 66 implants in 26 patients immediately following implant insertion and after 3, 4, and 12 months. Samples were evaluated for Aggregatibacter actinomycetemcomitans, Fusobacterium nucleatum, Porphyromonas gingivalis, Prevotella intermedia, Treponema denticola, and Tannerella forsythia, as well as for total bacterial counts, by real-time PCR. Bone loss was evaluated on standardized radiographs up to 25 months after implant insertion. For the statistical analysis of the data, mixed-effects models were fitted. RESULTS There was an increase in the frequency of detection as well as in the mean counts of the selected bacteria over time. The evaluation of the target bacteria revealed a significant association of Pr. intermedia at 4 and 12 months with peri-implant bone loss at 25 months (4 months: P = 0.009; 12 months: P = 0.021). CONCLUSIONS The present study demonstrated progressive colonization of the internal cavities of two-piece implants by periodontopathogenic bacteria. The results suggest that internal colonization with Pr. intermedia was associated with peri-implant bone loss.
Abstract:
Background: A small pond, c. 90 years old, near Bern, Switzerland, contains a population of threespine stickleback (Gasterosteus aculeatus) with two distinct male phenotypes. Males of one type are large and red and nest in the shallow littoral zone. Males of the other type are small and orange and nest offshore at slightly greater depth. The females in this population are phenotypically highly variable but cannot easily be assigned to either male type. Question: Is the existence of two sympatric male morphs maintained by substrate-associated male nest site choice and facilitated by female mate preferences? Organisms: Male stickleback caught individually at their breeding sites. Females caught with minnow traps. Methods: In experimental tanks, we simulated the slope and substrate of the two nesting habitats. We then placed individual males in a tank and observed in which habitat each male built his nest. In a simultaneous two-stimulus choice design, we gave females the choice between a large, red male and a small, orange one. We measured female morphology and used linear mixed-effects models to determine whether female preference correlated with female morphology. Results: Both red and orange males preferred nesting in the habitat that simulated the slightly deeper offshore condition. This is the habitat occupied by the small, orange males in the pond itself. The proportion of females that chose a small orange male was similar to the proportion that chose a large red male. Several aspects of female phenotype correlated with the male type that a female preferred.