832 results for Poisson generalized linear mixed models


Relevance: 100.00%

Abstract:

Background Changes in CD4 cell counts are poorly documented in individuals with low or moderate-level viremia while on antiretroviral treatment (ART) in resource-limited settings. We assessed the impact of ongoing HIV-RNA replication on CD4 cell count slopes in patients treated with first-line combination ART. Methods ART-naïve patients on a first-line regimen with at least two measures of HIV-RNA available after ART initiation were included in the study. The relationships between mean CD4 cell count change and HIV-RNA at 6 and 12 months after ART initiation (M6 and M12) were assessed by linear mixed models adjusted for gender, age, clinical stage and year of starting ART. Results 3,338 patients were included (14 cohorts, 64% female), with a median follow-up time of 1.6 years, a median age of 34 years, and a median CD4 cell count at ART initiation of 107 cells/μL. All patients with suppressed HIV-RNA at M12 had a continuous increase in CD4 cell count up to 18 months after treatment initiation. By contrast, any degree of HIV-RNA replication at both M6 and M12 was associated with a flat or decreasing CD4 cell count slope. Multivariable analysis using HIV-RNA thresholds of 10,000 and 5,000 copies/mL confirmed the significant effect of HIV-RNA on CD4 cell counts at both M6 and M12. Conclusion In routinely monitored patients on NNRTI-based first-line ART, ongoing low-level HIV-RNA replication was associated with a poor immune outcome in patients who had detectable levels of the virus after one year of ART.
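
A minimal sketch, with entirely hypothetical variable names, of the kind of linear mixed model described above (CD4 count over time with patient-level random effects and the listed adjustment covariates), written with Python's statsmodels rather than the authors' own software:

```python
# Sketch only: column names (cd4, months, hiv_rna_cat, sex, age, stage,
# art_year, patient_id) and the input file are assumptions, not study data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cd4_visits.csv")  # one row per patient visit (hypothetical)

# Random intercept and random slope for time within each patient;
# fixed effects for the viraemia category and the adjustment covariates.
model = smf.mixedlm(
    "cd4 ~ months * hiv_rna_cat + sex + age + stage + art_year",
    data=df,
    groups=df["patient_id"],
    re_formula="~months",
)
result = model.fit()
print(result.summary())
```

The interaction between time and viraemia category lets the model estimate a separate mean CD4 slope for each HIV-RNA stratum, which is the quantity the abstract compares.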

Relevance: 100.00%

Abstract:

Quality of life is an important outcome in the treatment of patients with schizophrenia. It has been suggested that patients' quality of life ratings (referred to as subjective quality of life, SQOL) might be too heavily influenced by symptomatology to be a valid independent outcome criterion. There has been only limited evidence on the association between symptom change and changes in SQOL over time. This study aimed to examine the association between changes in symptoms and changes in SQOL among patients with schizophrenia. A pooled data set was obtained from eight longitudinal studies that had used the Brief Psychiatric Rating Scale (BPRS) for measuring psychiatric symptoms and either the Lancashire Quality of Life Profile or the Manchester Short Assessment of Quality of Life for assessing SQOL. The sample comprised 886 patients with schizophrenia. After controlling for heterogeneity of findings across studies using linear mixed models, a reduction in psychiatric symptoms was associated with improvements in SQOL scores. In univariate analyses, changes in all BPRS subscales were associated with changes in SQOL scores. In a multivariate model, only the associations between changes in the BPRS depression/anxiety and hostility subscales and changes in SQOL remained significant, with 5% and 0.5% of the variance in SQOL changes attributable to changes in depression/anxiety and hostility, respectively. All BPRS subscales together explained 8.5% of the variance. The findings indicate that SQOL changes are influenced by symptom change, in particular by changes in depression/anxiety. The level of influence is limited and may not compromise the use of SQOL as an independent outcome measure.
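
A hedged sketch of how such a pooled analysis could be set up, with the study (rather than the patient) as the grouping factor so that a random intercept absorbs between-study heterogeneity; all column names are assumptions:

```python
# Sketch only: the delta_* columns (pre-post change scores) and study_id are
# hypothetical, not the authors' variable names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pooled_sqol.csv")  # one row per patient (hypothetical)

model = smf.mixedlm(
    "delta_sqol ~ delta_dep_anx + delta_hostility + delta_pos + delta_neg",
    data=df,
    groups=df["study_id"],  # random intercept per study
)
print(model.fit().summary())
```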

Relevance: 100.00%

Abstract:

Subjective quality of life (SQOL) is an important outcome in the treatment of patients with schizophrenia. However, there is only limited evidence on factors influencing SQOL, and little is known about whether the same factors influence SQOL in patients with schizophrenia and other mental disorders. This study aimed to identify the factors associated with SQOL and test whether these factors are equally important in schizophrenia and other disorders. For this we used a pooled data set obtained from 16 studies that had used either the Lancashire Quality of Life Profile or the Manchester Short Assessment of Quality of Life for assessing SQOL. The sample comprised 3936 patients with schizophrenia, mood disorders, and neurotic disorders. After controlling for confounding factors, within-subject clustering, and heterogeneity of findings across studies in linear mixed models, patients with schizophrenia had more favourable SQOL scores than those with mood and neurotic disorders. In all diagnostic groups, older patients, those in employment, and those with lower symptom scores had higher SQOL scores. Whilst the strength of the association between age and SQOL did not differ across diagnostic groups, symptom levels were more strongly associated with SQOL in neurotic than in mood disorders and schizophrenia. The association of employment and SQOL was stronger in mood and neurotic disorders than in schizophrenia. The findings may inform the use and interpretation of SQOL data for patients with schizophrenia.

Relevance: 100.00%

Abstract:

Despite the numerous health benefits of physical activity, population activity levels are low and decline with age. The continued growth of Internet access allows website-delivered interventions to be implemented across age groups, though older people have typically not been considered for this type of intervention. Therefore, the purpose of this study was to evaluate a website-delivered, computer-tailored physical activity intervention, with a specific focus on differences in tailored advice acceptability, website usability, and physical activity change between three age groups. To mimic "real-life" conditions, the intervention, which provided personalized physical activity feedback delivered via the Internet, was implemented and evaluated without any personal contact for the entire duration of the study. Data were collected online at baseline, 1-week, and 1-month follow-up and analyzed for three age groups (≤44, 45-59, and ≥60 years) using linear mixed models. Overall, 803 adults received the intervention and 288 completed all measures. The oldest age group increased physical activity more than the other two groups and spent the most time on the website, but had significantly lower perceived Internet self-confidence scores than the youngest age group. No differences were found in terms of website usability and tailored advice acceptability. These results suggest that website-delivered physical activity interventions can be suitable and effective for older adults.

Relevance: 100.00%

Abstract:

BACKGROUND: Radio-frequency electromagnetic fields (RF EMF) of mobile communication systems are widespread in the living environment, yet their effects on humans are uncertain despite a growing body of literature. OBJECTIVES: We investigated the influence of a Universal Mobile Telecommunications System (UMTS) base station-like signal on well-being and cognitive performance in subjects with and without self-reported sensitivity to RF EMF. METHODS: We performed a controlled exposure experiment (45 min at an electric field strength of 0, 1, or 10 V/m, incident with a polarization of 45 degrees from the left back side of the subject, weekly intervals) in a randomized, double-blind crossover design. A total of 117 healthy subjects (33 self-reported sensitive, 84 nonsensitive subjects) participated in the study. We assessed well-being, perceived field strength, and cognitive performance with questionnaires and cognitive tasks and conducted statistical analyses using linear mixed models. Organ-specific and brain tissue-specific dosimetry including uncertainty and variation analysis was performed. RESULTS: In both groups, well-being and perceived field strength were not associated with actual exposure levels. We observed no consistent condition-induced changes in cognitive performance except for two marginal effects. At 10 V/m we observed a slight effect on speed in one of six tasks in the sensitive subjects and an effect on accuracy in another task in nonsensitive subjects. Both effects disappeared after multiple end point adjustment. CONCLUSIONS: In contrast to a recent Dutch study, we could not confirm a short-term effect of UMTS base station-like exposure on well-being. The reported effects on brain functioning were marginal and may have occurred by chance. Peak spatial absorption in brain tissue was considerably smaller than during use of a mobile phone. No conclusions can be drawn regarding short-term effects of cell phone exposure or the effects of long-term base station-like exposure on human health.

Relevance: 100.00%

Abstract:

BACKGROUND: Few data are available on the long-term immunologic response to antiretroviral therapy (ART) in resource-limited settings, where ART is being rapidly scaled up using a public health approach with a limited repertoire of drugs. OBJECTIVES: To describe the immunologic response to ART among patients in a network of cohorts from sub-Saharan Africa, Latin America, and Asia. STUDY POPULATION/METHODS: Treatment-naive patients aged 15 years and older from 27 treatment programs were eligible. Multilevel linear mixed models were used to assess associations between predictor variables and CD4 cell count trajectories following ART initiation. RESULTS: Of 29,175 patients initiating ART, 8,933 (31%) were excluded due to insufficient follow-up time, early loss to follow-up, or death. The remaining 19,967 patients contributed 39,200 person-years on ART and 71,067 CD4 cell count measurements. The median baseline CD4 cell count was 114 cells/μL, with 35% having less than 100 cells/μL. Substantial intersite variation in baseline CD4 cell count was observed (range 61-181 cells/μL). Women had higher median baseline CD4 cell counts than men (121 vs. 104 cells/μL). The median CD4 cell count increased from 114 cells/μL at ART initiation to 230 [interquartile range (IQR) 144-338] at 6 months, 263 (IQR 175-376) at 1 year, 336 (IQR 224-472) at 2 years, 372 (IQR 242-537) at 3 years, 377 (IQR 221-561) at 4 years, and 395 (IQR 240-592) at 5 years. In multivariable models, baseline CD4 cell count was the most important determinant of subsequent CD4 cell count trajectories. CONCLUSION: These data demonstrate a robust and sustained CD4 response to ART among patients remaining on therapy. Public health and programmatic interventions leading to earlier HIV diagnosis and initiation of ART could substantially improve patient outcomes in resource-limited settings.

Relevance: 100.00%

Abstract:

Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than the diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably because their assessment in the field is labor- and cost-intensive, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators from plant traits derived from online-accessible databases. Aiming to provide a minimal trait set to monitor effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German 'Biodiversity Exploratories' research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally or even more sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering. These database-derived traits enable the early detection of changes in community structure that are indicative of future diversity loss. As an addition to current monitoring measures, they allow environmental drivers to be linked more directly to the processes controlling community dynamics.
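
As an illustration of the null-model standardization step mentioned above, a toy sketch (made-up numbers, not the study's data) that compares an observed community-weighted mean (CWM) trait value against CWMs obtained by randomly reshuffling trait values across species and expresses the result as a standardized effect size:

```python
# Toy example: 4 species with invented relative abundances and SLA values.
import numpy as np

rng = np.random.default_rng(0)

abundances = np.array([0.5, 0.2, 0.2, 0.1])  # relative cover of each species
sla = np.array([22.0, 15.0, 30.0, 18.0])     # specific leaf area per species

observed_cwm = np.sum(abundances * sla)

# Null distribution: shuffle which trait value belongs to which species.
null_cwm = np.array([
    np.sum(abundances * rng.permutation(sla)) for _ in range(999)
])

ses = (observed_cwm - null_cwm.mean()) / null_cwm.std()
print(f"observed CWM = {observed_cwm:.2f}, standardized effect size = {ses:.2f}")
```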

Relevance: 100.00%

Abstract:

BACKGROUND In many resource-limited settings, monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; the CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS A total of 44,829 patients (median age 35 years, 58% female, median CD4 count at cART initiation 116 cells/mm³) were followed up for a median of 1.9 years, with 3,706 deaths. Mean CD4 count slopes per week ranged from 1.4 [95% confidence interval (CI): 1.2 to 1.6] cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter with >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS The absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, CD4 count slope and HIV RNA independently added prognostic information to the model.
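
A minimal sketch of the survival part of such an analysis, assuming a hypothetical per-patient file with numeric columns (the mixed-model slope estimation and time-updated covariates are not shown); it uses the lifelines package rather than whatever software the study used:

```python
# Sketch only: the file name and columns (years_followup, died, current_cd4,
# cd4_slope, age, female) are assumptions, not study variables.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cart_cohort.csv")  # one row per patient (hypothetical)

cph = CoxPHFitter()
cph.fit(
    df[["years_followup", "died", "current_cd4", "cd4_slope", "age", "female"]],
    duration_col="years_followup",
    event_col="died",
)
cph.print_summary()  # hazard ratios with 95% confidence intervals
```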

Relevance: 100.00%

Abstract:

Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2,595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) annual change in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg and −0.89 (−1.05 to −0.73) mm Hg, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increases in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.

Relevance: 100.00%

Abstract:

Background Agroforestry is a sustainable land use method with a long tradition in the Bolivian Andes. A better understanding of people's knowledge and valuation of woody species can help to adjust actor-oriented agroforestry systems. In this case study, carried out in a peasant community of the Bolivian Andes, we aimed to calculate the cultural importance of selected agroforestry species and to analyse the intracultural variation in the cultural importance and knowledge of plants according to peasants' sex, age, and migration. Methods Data collection was based on semi-structured interviews and freelisting exercises. Two ethnobotanical indices (Composite Salience, Cultural Importance) were used for calculating the cultural importance of plants. Intracultural variation in the cultural importance and knowledge of plants was detected by using linear and generalised linear (mixed) models. Results and discussion The culturally most important woody species were mainly trees and exotic species (e.g. Schinus molle, Prosopis laevigata, Eucalyptus globulus). We found that knowledge and valuation of plants increased with age but were lower for migrants; sex, by contrast, played a minor role. The age effects possibly result from the decreasing ecological apparency of valuable native species and their substitution by marketable exotic trees, the loss of traditional plant uses, or the use of other materials (e.g. plastic) instead of wood. Decreasing dedication to traditional farming may have led to the successive abandonment of traditional tool uses, and the overall transformation of woody plant use is possibly related to diminishing medicinal knowledge. Conclusions Age and migration affect how people value woody species and what they know about their uses. For this reason, we recommend paying particular attention to the potential of native species, which could open promising perspectives, especially for the young migrating peasant generation, and draw their interest to agroforestry. These native species should be ecologically sound and selected for their potential to provide subsistence and promising commercial uses. In addition to offering socio-economic and environmental services, agroforestry initiatives using native trees and shrubs can play a crucial role in recovering elements of the lost ancient landscape that still forms part of local people's collective identity.
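
The "generalised linear (mixed) models" mentioned here are the kind of model a count outcome calls for. A hedged sketch of a Poisson generalized linear mixed model in statsmodels (which fits this model class with a variational Bayes approximation), using entirely hypothetical variable names:

```python
# Sketch only: n_uses (number of uses an informant lists for a species), age,
# sex, migrant and informant_id are assumed names, not the study's variables.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import PoissonBayesMixedGLM

df = pd.read_csv("freelists.csv")  # one row per informant-species pair (hypothetical)

model = PoissonBayesMixedGLM.from_formula(
    "n_uses ~ age + sex + migrant",        # fixed effects
    {"informant": "0 + C(informant_id)"},  # random intercept per informant
    df,
)
result = model.fit_vb()
print(result.summary())
```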

Relevance: 100.00%

Abstract:

OBJECTIVE: The assessment and treatment of psychological distress in cancer patients is recognized as a major challenge. The role of spouses, caregivers, and significant others has become salient not only because of their supportive functions but also with respect to their own burden. The purpose of this study was to assess the amount of distress in a mixed sample of cancer patients and their partners and to explore the dyadic interdependence. METHODS: An initial sample of 154 dyads was recruited, and distress questionnaires (Hospital Anxiety and Depression Scale, Symptom Checklist 9-Item Short Version, and 12-Item Short Form Health Survey) were administered at four time points. Linear mixed models and actor-partner interdependence models were applied. RESULTS: A significant proportion of patients and their partners (up to 40%) reported high levels of anxiety, depression, and psychological distress, and low quality of life over the course of the investigation. Mixed model analyses revealed that higher risks for clinically relevant anxiety and depression in couples exist for female patients and especially for female partners. Although psychological strain decreased over time, the risk for elevated distress in female partners remained. Modeling patient-partner interdependence over time, stratified by patients' gender, revealed specific effects: a moderate correlation between distress in patients and partners, and a transmission of distress from male patients to their female partners. CONCLUSIONS: Our findings provide empirical support for gender-specific transmission of distress in dyads coping with cancer. This should be considered as an important starting point for planning systemic psycho-oncological interventions and conceptualizing further research.
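
One common way to set up an actor-partner interdependence model is as a linear mixed model on long-format dyad data, with the dyad as the grouping factor and both the person's own (actor) and the partner's earlier distress as predictors; a hedged sketch with hypothetical column names, not the authors' specification:

```python
# Sketch only: distress, actor_prev_distress, partner_prev_distress, role,
# time and dyad_id are assumed names, not the study's variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dyads_long.csv")  # one row per person per time point (hypothetical)

model = smf.mixedlm(
    "distress ~ actor_prev_distress + partner_prev_distress + role + time",
    data=df,
    groups=df["dyad_id"],  # random intercept per couple
)
print(model.fit().summary())
```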

Relevance: 100.00%

Abstract:

SUMMARY BACKGROUND/OBJECTIVES Orthodontic management of maxillary canine impaction (MCI), including forced eruption, may result in significant root resorption; however, the association between MCI and orthodontically induced root resorption (OIRR) is not yet sufficiently established. The purpose of this retrospective cohort study was to comparatively evaluate the severity of OIRR of the maxillary incisors in orthodontically treated patients with MCI. Additionally, associations between impaction characteristics and OIRR severity were examined. SUBJECTS AND METHODS The sample comprised 48 patients undergoing fixed-appliance treatment: 24 with unilateral or bilateral MCI and 24 matched controls without impaction. OIRR was calculated using pre- and post-operative panoramic tomograms. The orientation of the eruption path, height, sector location, and follicle/tooth ratio of the impacted canine were also recorded. The Mann-Whitney U-test and univariate and multivariate linear mixed models were used to test for the associations of interest. RESULTS The maxillary left central incisor underwent more OIRR in the impaction group (mean difference = 0.58 mm, P = 0.04). Overall, the impaction group had 0.38 mm more OIRR than the control group (95% confidence interval, CI: 0.03, 0.74; P = 0.04). However, multivariate analysis demonstrated no difference in the amount of OIRR between the impaction and non-impaction groups overall. A positive association between OIRR and initial root length was observed (95% CI: 0.08, 0.27; P < 0.001). The severity of canine impaction was not found to be a significant predictor of OIRR. LIMITATIONS This was a retrospective study, and panoramic tomograms were used for OIRR measurements. CONCLUSIONS This study indicates that MCI is a weak predictor of OIRR. Interpretation of the results requires caution due to the observational nature of the study.
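
A toy illustration (simulated numbers, not the study's data) of the unadjusted group comparison step with the Mann-Whitney U-test, before moving to the adjusted linear mixed models:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
# Simulated OIRR values (mm) for 24 impaction and 24 control patients.
oirr_impaction = rng.gamma(shape=2.0, scale=0.6, size=24)
oirr_control = rng.gamma(shape=2.0, scale=0.4, size=24)

stat, p = mannwhitneyu(oirr_impaction, oirr_control, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```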

Relevance: 100.00%

Abstract:

BACKGROUND The CD4 cell count or percentage (CD4%) at the start of combination antiretroviral therapy (cART) is an important prognostic factor in children starting therapy and an important indicator of program performance. We describe trends and determinants of CD4 measures at cART initiation in children from low-, middle-, and high-income countries. METHODS We included children aged <16 years from clinics participating in a collaborative study spanning sub-Saharan Africa, Asia, Latin America, and the United States. Missing CD4 values at cART start were estimated through multiple imputation. Severe immunodeficiency was defined according to World Health Organization criteria. Analyses used generalized additive mixed models adjusted for age, country, and calendar year. RESULTS A total of 34,706 children from 9 low-income, 6 lower-middle-income, and 4 upper-middle-income countries and 1 high-income country (United States) were included; 20,624 children (59%) had severe immunodeficiency. In low-income countries, the estimated prevalence of children starting cART with severe immunodeficiency declined from 76% in 2004 to 63% in 2010. Corresponding figures for lower-middle-income countries were from 77% to 66%, and for upper-middle-income countries from 75% to 58%. In the United States, the percentage decreased from 42% to 19% during the period 1996 to 2006. In low- and middle-income countries, infants and children aged 12-15 years had the highest prevalence of severe immunodeficiency at cART initiation. CONCLUSIONS Despite progress in most low- and middle-income countries, many children continue to start cART with severe immunodeficiency. Early diagnosis and treatment of HIV-infected children to prevent the morbidity and mortality associated with immunodeficiency must remain a global public health priority.
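
A minimal sketch of the multiple-imputation step, assuming hypothetical numeric column names and using chained equations in statsmodels; a simple linear analysis model stands in here for the generalized additive mixed models used in the study:

```python
# Sketch only: children_cart.csv, cd4_pct, age and year are assumptions.
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation import mice

df = pd.read_csv("children_cart.csv")  # cd4_pct has missing values (hypothetical)

imp = mice.MICEData(df[["cd4_pct", "age", "year"]])        # chained-equations imputer
analysis = mice.MICE("cd4_pct ~ age + year", sm.OLS, imp)  # analysis model refit on each imputed set
results = analysis.fit(n_burnin=10, n_imputations=10)      # estimates pooled across imputations
print(results.summary())
```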

Relevance: 100.00%

Abstract:

The challenge for sustainable organic dairy farming is the identification of cows that are well adapted to forage-based production systems. Therefore, the aim of this study was to compare the grazing behaviour, physical activity, and metabolic profile of two different Holstein strains kept in an organic grazing system without concentrate supplementation. Twelve Swiss Holstein (HCH; 566 kg body weight (BW)) and 12 New Zealand Holstein-Friesian (HNZ; 530 kg BW) cows in mid-lactation were kept in a rotational grazing system. After an adaptation period, the milk yield, nutrient intake, physical activity, and grazing behaviour were recorded for each cow for 7 days. On three consecutive days, blood was sampled at 07:00, 12:00 and 17:00 h from each cow by jugular vein puncture. Data were analysed using linear mixed models. No differences were found in milk yield, but milk fat (3.69 vs. 4.05%, P = 0.05) and milk protein percentage (2.92 vs. 3.20%, P < 0.01) were lower in HCH than in HNZ cows. Herbage intake did not differ between strains, but organic matter digestibility was greater (P = 0.01) in HCH than in HNZ cows. The HCH cows spent less (P = 0.04) time ruminating (439 vs. 469 min/day) and had a lower (P = 0.02) number of ruminating boli than the HNZ cows. The time spent eating and physical activity did not differ between strains. Concentrations of IGF-1 and T3 were lower (P ≤ 0.05) in HCH than in HNZ cows. In conclusion, HCH cows were not able to increase dry matter intake in order to express their full genetic potential for milk production when kept in an organic grazing system without concentrate supplementation. The HNZ cows, on the other hand, seemed to compensate for the reduced nutrient availability better than the HCH cows but could not convert that advantage into increased production efficiency.

Relevance: 100.00%

Abstract:

OBJECTIVE Poison centres offer rapid and comprehensive support for emergency physicians managing poisoned patients. This study investigates the institutional, case-specific, and poisoning-specific factors that influence the decision of emergency physicians to contact a poison centre. METHODS Retrospective, consecutive review of all poisoning-related admissions to the emergency departments (EDs) of a primary care hospital and a university hospital-based tertiary referral centre during 2007. Corresponding poison centre consultations were extracted from the poison centre database. Data were matched and analysed by logistic regression and generalised linear mixed models. RESULTS In total, 545 poisonings were treated in the participating EDs (350 (64.2%) in the tertiary care centre, 195 (35.8%) in the primary care hospital). The poison centre was consulted in 62 (11.4%) cases (38 (61.3%) by the tertiary care centre and 24 (38.7%) by the primary care hospital). Factors significantly associated with poison centre consultation included gender (female vs male) (OR 2.99; 95% CI 1.69 to 5.29; p<0.001), number of ingested substances (>1 vs 1) (OR 2.84; 95% CI 1.65 to 4.9; p<0.001) and situation (accidental vs intentional) (OR 2.76; 95% CI 1.05 to 7.25; p=0.039). In contrast, age, medical history and hospital size did not influence poison centre consultation. Poison centre consultation was significantly more frequent during the week and significantly less frequent during night shifts. The poison centre was consulted significantly more often when patients were admitted to intensive care units (OR 5.81; 95% CI 3.25 to 10.37; p<0.001). Asymptomatic cases and severe cases, compared with mild cases, were associated with more frequent consultation (OR 4.48; 95% CI 1.78 to 11.26; p=0.001 and OR 2.76; 95% CI 1.42 to 5.38; p=0.003, respectively). CONCLUSIONS We found low rates of poison centre consultation by emergency physicians. It appears that intensive care unit admission and other factors reflecting either the complexity or the uncertainty of the clinical situation are the strongest predictors of poison centre consultation. Hospital size did not influence referral behaviour.
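
The reported odds ratios are what a logistic regression of consultation (yes/no) on case characteristics produces; a minimal sketch with hypothetical variable names (the generalised linear mixed models mentioned above would additionally include a random effect, e.g. for hospital):

```python
# Sketch only: ed_poisonings.csv and its columns are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ed_poisonings.csv")  # one row per poisoning case (hypothetical)

fit = smf.logit(
    "consulted ~ female + n_substances + accidental + icu_admission",
    data=df,
).fit()

print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```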