78 results for Linear mixed effect models


Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE To determine the biomechanical effect of an intervertebral spacer on construct stiffness in a PVC model and cadaveric canine cervical vertebral columns stabilized with monocortical screws/polymethylmethacrylate (PMMA). STUDY DESIGN Biomechanical study. SAMPLE POPULATION PVC pipe; cadaveric canine vertebral columns. METHODS PVC model: PVC pipe was used to create a gap model mimicking vertebral endplate orientation and disk space width of large-breed canine cervical vertebrae; 6 models had a 4-mm gap with no spacer (PVC group 1); 6 had a PVC pipe ring spacer filling the gap (PVC group 2). Animals: large-breed cadaveric canine cervical vertebral columns (C2-C7) from skeletally mature dogs without (cadaveric group 1, n = 6, historical data) and with an intervertebral disk spacer (cadaveric group 2, n = 6) were used. All PVC models and cadaver specimens were instrumented with monocortical titanium screws/PMMA. Stiffness of the 2 PVC groups was compared in extension, flexion, and lateral bending using non-destructive 4-point bend testing. Stiffness testing in all 3 directions was performed on the unaltered C4-C5 vertebral motion unit in cadaveric spines and repeated after placement of an intervertebral cortical allograft ring and instrumentation. Data were compared using a linear mixed model approach that also incorporated data from previously tested spines with the same screw/PMMA construct but without a disk spacer (cadaveric group 1). RESULTS Addition of a spacer increased construct stiffness in both the PVC model (P < .001) and cadaveric vertebral columns (P < .001) compared to fixation without a spacer. CONCLUSIONS Addition of an intervertebral spacer significantly increased construct stiffness of monocortical screw/PMMA fixation.
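The "linear mixed model approach" mentioned above, a fixed treatment effect estimated while a random effect absorbs specimen-to-specimen variation, can be sketched on synthetic data (all names, effect sizes, and group sizes below are illustrative assumptions, not the study's data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Illustrative data: stiffness of 12 hypothetical specimens, each tested
# in 3 bending directions; half of the specimens receive a spacer.
n_spec = 12
spec = np.repeat(np.arange(n_spec), 3)
direction = np.tile(["extension", "flexion", "lateral"], n_spec)
spacer = (spec % 2 == 0).astype(float)          # treatment varies by specimen
spec_effect = rng.normal(0, 1.0, n_spec)[spec]  # random specimen effect
stiffness = 10 + 4 * spacer + spec_effect + rng.normal(0, 0.5, n_spec * 3)

df = pd.DataFrame({"stiffness": stiffness, "spacer": spacer,
                   "direction": direction, "specimen": spec})

# Fixed effects for spacer and bending direction; random intercept
# per specimen accounts for repeated measures on the same construct.
model = smf.mixedlm("stiffness ~ spacer + direction", df,
                    groups=df["specimen"])
fit = model.fit()
print(fit.params["spacer"])  # estimate of the spacer effect (true value: +4)
```

The random intercept is what lets historical specimens and new specimens be pooled in one model, as the abstract describes.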


Alpine snowbeds are habitats where the major limiting factors for plant growth are herbivory and a small time window for growth due to late snowmelt. Despite these limitations, snowbed vegetation usually forms a dense carpet of palatable plants due to favourable abiotic conditions for plant growth within the short growing season. These environmental characteristics make snowbeds particularly interesting to study the interplay of facilitation and competition. We hypothesised an interplay between resource competition and facilitation against herbivory. Further, we investigated whether these predicted neighbour effects were species-specific and/or dependent on ontogeny, and whether the balance of positive and negative plant–plant interactions shifted along a snowmelt gradient. We determined the neighbour effects by means of neighbour removal experiments along the snowmelt gradient, and linear mixed model analyses. The results showed that the effects of neighbour removal were weak but generally consistent among species and snowmelt dates, and depended on whether biomass production or survival was considered. Higher total biomass and increased fruiting in removal plots indicated that plants competed for nutrients, water, and light, thereby supporting the hypothesis of prevailing competition for resources in snowbeds. However, the presence of neighbours reduced herbivory and thereby also facilitated survival. For plant growth the facilitative effects against herbivores in snowbeds counterbalanced competition for resources, leading to a weak negative net effect. Overall the neighbour effects were not species-specific and did not change with snowmelt date. 
Our finding of counterbalancing effects of competition and facilitation within a plant community is of special theoretical value for species distribution models and can explain the success of models that give primary importance to abiotic factors and tend to overlook interrelations between biotic and abiotic effects on plants.


Background Changes in CD4 cell counts are poorly documented in individuals with low or moderate-level viremia while on antiretroviral treatment (ART) in resource-limited settings. We assessed the impact of ongoing HIV-RNA replication on CD4 cell count slopes in patients treated with a first-line combination ART. Methods Naïve patients on a first-line ART regimen with at least two measures of HIV-RNA available after ART initiation were included in the study. The relationships between mean CD4 cell count change and HIV-RNA at 6 and 12 months after ART initiation (M6 and M12) were assessed by linear mixed models adjusted for gender, age, clinical stage and year of starting ART. Results 3,338 patients were included (14 cohorts, 64% female), with a median follow-up time of 1.6 years, a median age of 34 years, and a median CD4 cell count at ART initiation of 107 cells/μL. All patients with suppressed HIV-RNA at M12 had a continuous increase in CD4 cell count up to 18 months after treatment initiation. By contrast, any degree of HIV-RNA replication both at M6 and M12 was associated with a flat or a decreasing CD4 cell count slope. Multivariable analysis using HIV-RNA thresholds of 10,000 and 5,000 copies confirmed the significant effect of HIV-RNA on CD4 cell counts both at M6 and M12. Conclusion In routinely monitored patients on an NNRTI-based first-line ART, ongoing low-level HIV-RNA replication was associated with a poor immune outcome in patients who had detectable levels of the virus after one year of ART.
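A minimal sketch of this kind of longitudinal model, repeated CD4 counts regressed on time and viremia status with a random intercept per patient, using made-up data (variable names, effect sizes, and sample sizes are assumptions, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Illustrative data: CD4 counts for 200 hypothetical patients measured
# every 3 months for 18 months; half have ongoing detectable HIV-RNA,
# which (by construction) flattens their CD4 slope.
n, visits = 200, 6
pid = np.repeat(np.arange(n), visits)
months = np.tile(np.arange(0, 18, 3), n)
detectable = (pid % 2 == 0).astype(float)
base = rng.normal(107, 30, n)[pid]          # patient-level intercepts
slope = 10 - 9 * detectable                  # assumed cells/uL per month
cd4 = base + slope * months + rng.normal(0, 20, n * visits)

df = pd.DataFrame({"cd4": cd4, "months": months,
                   "detectable": detectable, "pid": pid})

# Random intercept per patient; the months:detectable interaction
# estimates how much flatter the slope is with ongoing replication.
fit = smf.mixedlm("cd4 ~ months * detectable", df, groups=df["pid"]).fit()
print(fit.params["months:detectable"])  # near the assumed -9 cells/uL/month
```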


BACKGROUND: Radio-frequency electromagnetic fields (RF EMF) of mobile communication systems are widespread in the living environment, yet their effects on humans are uncertain despite a growing body of literature. OBJECTIVES: We investigated the influence of a Universal Mobile Telecommunications System (UMTS) base station-like signal on well-being and cognitive performance in subjects with and without self-reported sensitivity to RF EMF. METHODS: We performed a controlled exposure experiment (45 min at an electric field strength of 0, 1, or 10 V/m, incident with a polarization of 45 degrees from the left back side of the subject, weekly intervals) in a randomized, double-blind crossover design. A total of 117 healthy subjects (33 self-reported sensitive, 84 nonsensitive subjects) participated in the study. We assessed well-being, perceived field strength, and cognitive performance with questionnaires and cognitive tasks and conducted statistical analyses using linear mixed models. Organ-specific and brain tissue-specific dosimetry including uncertainty and variation analysis was performed. RESULTS: In both groups, well-being and perceived field strength were not associated with actual exposure levels. We observed no consistent condition-induced changes in cognitive performance except for two marginal effects. At 10 V/m we observed a slight effect on speed in one of six tasks in the sensitive subjects and an effect on accuracy in another task in nonsensitive subjects. Both effects disappeared after multiple end point adjustment. CONCLUSIONS: In contrast to a recent Dutch study, we could not confirm a short-term effect of UMTS base station-like exposure on well-being. The reported effects on brain functioning were marginal and may have occurred by chance. Peak spatial absorption in brain tissue was considerably smaller than during use of a mobile phone. 
No conclusions can be drawn regarding short-term effects of cell phone exposure or the effects of long-term base station-like exposure on human health.


BACKGROUND In many resource-limited settings monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS A total of 44,829 patients (median age: 35 years, 58% female, median CD4 count at cART initiation: 116 cells/mm³) were followed up for a median of 1.9 years, with 3706 deaths. Mean CD4 count slopes per week ranged from 1.4 [95% confidence interval (CI): 1.2 to 1.6] cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter with >100,000 copies per milliliter. The association of CD4 slope with mortality depended on current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS Absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, CD4 count slope and HIV RNA each added independent prognostic information to the model.


Objective: We examined the influence of clinical, radiologic, and echocardiographic characteristics on antithrombotic choice in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO), hypothesizing that features suggestive of paradoxical embolism might lead to greater use of anticoagulation. Methods: The Risk of Paradoxical Embolism Study combined 12 databases to create the largest dataset of patients with CS and known PFO status. We used generalized linear mixed models with a random effect of component study to explore whether anticoagulation was preferentially selected based on the following: (1) younger age and absence of vascular risk factors, (2) “high-risk” echocardiographic features, and (3) neuroradiologic findings. Results: A total of 1,132 patients with CS and PFO treated with anticoagulation or antiplatelets were included. Overall, 438 participants (39%) were treated with anticoagulation with a range (by database) of 22% to 54%. Treatment choice was not influenced by age or vascular risk factors. However, neuroradiologic findings (superficial or multiple infarcts) and high-risk echocardiographic features (large shunts, shunt at rest, and septal hypermobility) were predictors of anticoagulation use. Conclusion: Both antithrombotic regimens are widely used for secondary stroke prevention in patients with CS and PFO. Radiologic and echocardiographic features were strongly associated with treatment choice, whereas conventional vascular risk factors were not. Prior observational studies are likely to be biased by confounding by indication.


BACKGROUND Non-steroidal anti-inflammatory drugs (NSAIDs) are the backbone of osteoarthritis pain management. We aimed to assess the effectiveness of different preparations and doses of NSAIDs on osteoarthritis pain in a network meta-analysis. METHODS For this network meta-analysis, we considered randomised trials comparing any of the following interventions: NSAIDs, paracetamol, or placebo, for the treatment of osteoarthritis pain. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the reference lists of relevant articles for trials published between Jan 1, 1980, and Feb 24, 2015, with at least 100 patients per group. The prespecified primary and secondary outcomes were pain and physical function, and were extracted in duplicate for up to seven timepoints after the start of treatment. We used an extension of multivariable Bayesian random effects models for mixed multiple treatment comparisons with a random effect at the level of trials. For the primary analysis, a random walk of first order was used to account for multiple follow-up outcome data within a trial. Preparations that used different total daily doses were considered separately in the analysis. To assess a potential dose-response relation, we used preparation-specific covariates assuming linearity on log relative dose. FINDINGS We identified 8973 manuscripts from our search, of which 74 randomised trials with a total of 58 556 patients were included in this analysis. 23 nodes concerning seven different NSAIDs or paracetamol with specific daily dose of administration or placebo were considered. All preparations, irrespective of dose, improved point estimates of pain symptoms when compared with placebo.
For six interventions (diclofenac 150 mg/day, etoricoxib 30 mg/day, 60 mg/day, and 90 mg/day, and rofecoxib 25 mg/day and 50 mg/day), the probability that the difference to placebo is at or below a prespecified minimum clinically important effect for pain reduction (effect size [ES] -0·37) was at least 95%. Among maximally approved daily doses, diclofenac 150 mg/day (ES -0·57, 95% credibility interval [CrI] -0·69 to -0·46) and etoricoxib 60 mg/day (ES -0·58, -0·73 to -0·43) had the highest probability to be the best intervention, both with 100% probability to reach the minimum clinically important difference. Treatment effects increased as drug dose increased, but corresponding tests for a linear dose effect were significant only for celecoxib (p=0·030), diclofenac (p=0·031), and naproxen (p=0·026). We found no evidence that treatment effects varied over the duration of treatment. Model fit was good, and between-trial heterogeneity and inconsistency were low in all analyses. All trials were deemed to have a low risk of bias for blinding of patients. Effect estimates did not change in sensitivity analyses with two additional statistical models and accounting for methodological quality criteria in meta-regression analysis. INTERPRETATION On the basis of the available data, we see no role for single-agent paracetamol for the treatment of patients with osteoarthritis irrespective of dose. We provide sound evidence that diclofenac 150 mg/day is the most effective NSAID available at present, in terms of improving both pain and function. Nevertheless, in view of the safety profile of these drugs, physicians need to consider our results together with all known safety information when selecting the preparation and dose for individual patients. FUNDING Swiss National Science Foundation (grant number 405340-104762) and Arco Foundation, Switzerland.
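The dose-response covariate described above ("assuming linearity on log relative dose") amounts to log-transforming each preparation's daily dose relative to a reference dose; a small illustration with hypothetical numbers (the doses and reference are invented, not taken from the analysis):

```python
import numpy as np

# Hypothetical daily doses (mg) for one drug, scaled to its maximum
# approved daily dose before log transformation.
doses = np.array([50.0, 100.0, 150.0])
max_approved = 150.0

# Covariate on which a linear dose-response effect would be assumed.
log_rel_dose = np.log(doses / max_approved)
print(np.round(log_rel_dose, 3))  # [-1.099 -0.405  0.   ]
```

On this scale, doubling the dose always shifts the covariate by the same amount, which is what makes a single slope per preparation a plausible dose-response model.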


BACKGROUND There are concerns about the effects of in utero exposure to antiretroviral drugs (ARVs) on the development of HIV-exposed but uninfected (HEU) children. The aim of this study was to evaluate whether in utero exposure to ARVs is associated with lower birth weight/height and reduced growth during the first 2 years of life. METHODS This cohort study was conducted among HEU infants born between 1996 and 2010 at a tertiary children's hospital in Rio de Janeiro, Brazil. Weight was measured by mechanical scale, and height was measured by measuring board. Z-scores for weight-for-age (WAZ), length-for-age (LAZ) and weight-for-length were calculated. We modeled trajectories by mixed-effects models and adjusted for mother's age, CD4 cell count, viral load, year of birth and family income. RESULTS A total of 588 HEU infants were included, of whom 155 (26%) were not exposed to ARVs, 114 (19%) were exposed early (first trimester) and 319 (54%) later. WAZ were lower among infants exposed early compared with infants exposed later: adjusted differences were -0.52 (95% confidence interval [CI]: -0.99 to -0.04, P = 0.02) at birth and -0.22 (95% CI: -0.47 to 0.04, P = 0.10) during follow-up. LAZ were lower during follow-up: -0.35 (95% CI: -0.63 to -0.08, P = 0.01). There were no differences in weight-for-length scores. Z-scores of infants exposed late during pregnancy were similar to unexposed infants. CONCLUSIONS In HEU children, early exposure to ARVs was associated with lower WAZ at birth and lower LAZ up to 2 years of life. Growth of HEU children needs to be monitored closely.


Quality of life is an important outcome in the treatment of patients with schizophrenia. It has been suggested that patients' quality of life ratings (referred to as subjective quality of life, SQOL) might be too heavily influenced by symptomatology to be a valid independent outcome criterion. There has been only limited evidence on the association of symptom change and changes in SQOL over time. This study aimed to examine the association between changes in symptoms and in SQOL among patients with schizophrenia. A pooled data set was obtained from eight longitudinal studies that had used the Brief Psychiatric Rating Scale (BPRS) for measuring psychiatric symptoms and either the Lancashire Quality of Life Profile or the Manchester Short Assessment of Quality of Life for assessing SQOL. The sample comprised 886 patients with schizophrenia. After controlling for heterogeneity of findings across studies using linear mixed models, a reduction in psychiatric symptoms was associated with improvements in SQOL scores. In univariate analyses, changes in all BPRS subscales were associated with changes in SQOL scores. In a multivariate model, only associations between changes in the BPRS depression/anxiety and hostility subscales and changes in SQOL remained significant, with 5% and 0.5% of the variance in SQOL changes being attributable to changes in depression/anxiety and hostility respectively. All BPRS subscales together explained 8.5% of variance. The findings indicate that SQOL changes are influenced by symptom change, in particular in depression/anxiety. The level of influence is limited and may not compromise using SQOL as an independent outcome measure.


Subjective quality of life (SQOL) is an important outcome in the treatment of patients with schizophrenia. However, there is only limited evidence on factors influencing SQOL, and little is known about whether the same factors influence SQOL in patients with schizophrenia and other mental disorders. This study aimed to identify the factors associated with SQOL and test whether these factors are equally important in schizophrenia and other disorders. For this we used a pooled data set obtained from 16 studies that had used either the Lancashire Quality of Life Profile or the Manchester Short Assessment of Quality of Life for assessing SQOL. The sample comprised 3936 patients with schizophrenia, mood disorders, and neurotic disorders. After controlling for confounding factors, within-subject clustering, and heterogeneity of findings across studies in linear mixed models, patients with schizophrenia had more favourable SQOL scores than those with mood and neurotic disorders. In all diagnostic groups, older patients, those in employment, and those with lower symptom scores had higher SQOL scores. Whilst the strength of the association between age and SQOL did not differ across diagnostic groups, symptom levels were more strongly associated with SQOL in neurotic than in mood disorders and schizophrenia. The association of employment and SQOL was stronger in mood and neurotic disorders than in schizophrenia. The findings may inform the use and interpretation of SQOL data for patients with schizophrenia.


Despite the numerous health benefits, population physical activity levels are low and declining with age. A continued increase of Internet access allows for website-delivered interventions to be implemented across age-groups, though older people have typically not been considered for this type of intervention. Therefore, the purpose of this study was to evaluate a website-delivered computer-tailored physical activity intervention, with a specific focus on differences in tailored advice acceptability, website usability, and physical activity change between three age-groups. To mimic "real-life" conditions, the intervention, which provided personalized physical activity feedback delivered via the Internet, was implemented and evaluated without any personal contact for the entire duration of the study. Data were collected online at baseline, 1-week, and 1-month follow-up and analyzed for three age-groups (≤44, 45-59, and ≥60 years) using linear mixed models. Overall, 803 adults received the intervention and 288 completed all measures. The oldest age-group increased physical activity more than the other two groups and spent the most time on the website, but had significantly lower perceived Internet self-confidence scores when compared with the youngest age-group. No differences were found in terms of website usability and tailored advice acceptability. These results suggest that website-delivered physical activity interventions can be suitable and effective for older adults.


BACKGROUND: Few data are available on the long-term immunologic response to antiretroviral therapy (ART) in resource-limited settings, where ART is being rapidly scaled up using a public health approach, with a limited repertoire of drugs. OBJECTIVES: To describe immunologic response to ART among ART patients in a network of cohorts from sub-Saharan Africa, Latin America, and Asia. STUDY POPULATION/METHODS: Treatment-naive patients aged 15 years and older from 27 treatment programs were eligible. Multilevel, linear mixed models were used to assess associations between predictor variables and CD4 cell count trajectories following ART initiation. RESULTS: Of 29 175 patients initiating ART, 8933 (31%) were excluded due to insufficient follow-up time and early loss to follow-up or death. The remaining 19 967 patients contributed 39 200 person-years on ART and 71 067 CD4 cell count measurements. The median baseline CD4 cell count was 114 cells/microl, with 35% having less than 100 cells/microl. Substantial intersite variation in baseline CD4 cell count was observed (range 61-181 cells/microl). Women had higher median baseline CD4 cell counts than men (121 vs. 104 cells/microl). The median CD4 cell count increased from 114 cells/microl at ART initiation to 230 [interquartile range (IQR) 144-338] at 6 months, 263 (IQR 175-376) at 1 year, 336 (IQR 224-472) at 2 years, 372 (IQR 242-537) at 3 years, 377 (IQR 221-561) at 4 years, and 395 (IQR 240-592) at 5 years. In multivariable models, baseline CD4 cell count was the most important determinant of subsequent CD4 cell count trajectories. CONCLUSION: These data demonstrate a robust and sustained CD4 response to ART among patients remaining on therapy. Public health and programmatic interventions leading to earlier HIV diagnosis and initiation of ART could substantially improve patient outcomes in resource-limited settings.


Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land-use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity over scales and regions, are more directly related to environmental drivers than diversity components themselves, and enable detecting directional changes in whole community dynamics. However, probably due to their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online-accessible databases. Aiming to provide a minimal trait set to monitor effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German 'Biodiversity Exploratories' research network). Using standardized traits and diversity measures, null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally, or even more, sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits, i.e., specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering.
These database-derived traits enable the early detection of changes in community structure that are indicative of future diversity loss. As an addition to current monitoring measures, they allow environmental drivers to be linked more directly to the processes controlling community dynamics.
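Community mean traits of the kind used above reduce to abundance-weighted averages of species trait values; a minimal sketch with invented numbers (the abundances and SLA values are illustrative, not from any trait database):

```python
import numpy as np

# Illustrative plot-level data: relative abundances of 4 species
# and their (hypothetical) database-derived specific leaf area.
abundance = np.array([0.50, 0.25, 0.15, 0.10])   # sums to 1
sla = np.array([20.0, 15.0, 30.0, 25.0])          # mm^2/mg, made up

# Community-weighted mean (CWM): the abundance-weighted trait average.
cwm_sla = np.sum(abundance * sla) / np.sum(abundance)
print(cwm_sla)  # 20.75
```

Such CWM values, computed per plot, are the community mean traits that would then be regressed on land-use intensity in the mixed models the abstract describes.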


On Swiss rabbit breeding farms, group-housed does are usually kept singly for 12 days around parturition to avoid pseudopregnancy, double litters and deleterious fighting for nests. After this isolation phase, new group members are usually integrated. Here we studied whether keeping the group composition stable would reduce agonistic interactions, stress levels and injuries when regrouping after the isolation phase. Does were kept in 12 pens containing 8 rabbits each. In two trials, with a total of 24 groups, the group composition before and after the 12-day isolation period remained the same (treatment: stable, S) in 12 groups. In the other 12 groups, two or three does were replaced after the isolation phase by unfamiliar does (treatment: mixed, M). Does of S-groups had been housed together for one reproduction cycle. One day before and on days 2, 4 and 6 after regrouping, data on lesions, stress levels (faecal corticosterone metabolites, FCM) and agonistic interactions were collected and statistically analysed using mixed effects models. Lesion scores and the frequency of agonistic interactions were highest on day 2 after regrouping and thereafter decreased in both groups. There was a trend towards more lesions in M-groups compared to S-groups. After regrouping, FCM levels were increased in M-groups, but not in S-groups. Furthermore, there was a significant interaction of treatment and experimental day on agonistic interactions: the frequency of biting and boxing increased more in M-groups than in S-groups. These findings indicate that group stability had an effect on agonistic interactions, stress and lesions.


Background: Prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found mean (95% confidence interval) decreases in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increase in systolic blood pressure [hazard ratio 1.18 (95% CI: 1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.