48 results for Poisson generalized linear mixed models
Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land-use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably due to their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online accessible databases. Aiming to provide a minimal trait set to monitor effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German ‘Biodiversity Exploratory’ research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally or even more sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering.
These database-derived traits enable the early detection of changes in community structure indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
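The abstract above standardizes trait-based indicators against null models. A minimal sketch of that step for a single community, with hypothetical abundances and SLA values (none of these numbers come from the study), is:

```python
import numpy as np

def cwm(abundance, trait):
    """Community-weighted mean: abundance-weighted average of a trait."""
    return np.sum(abundance * trait) / np.sum(abundance)

def ses_cwm(abundance, trait, n_null=999, seed=0):
    """Standardized effect size of a CWM against a simple null model
    that shuffles trait values across species (one of several possible
    null models; the study's exact randomization scheme may differ)."""
    rng = np.random.default_rng(seed)
    observed = cwm(abundance, trait)
    null = np.array([cwm(abundance, rng.permutation(trait))
                     for _ in range(n_null)])
    return (observed - null.mean()) / null.std()

# Hypothetical community: abundances of 5 species and their SLA values
abundance = np.array([50.0, 20.0, 15.0, 10.0, 5.0])
sla = np.array([30.0, 25.0, 18.0, 12.0, 10.0])  # mm^2/mg, illustrative
print(round(cwm(abundance, sla), 2))  # 24.4, the abundance-weighted mean SLA
```

A positive SES here would indicate that high-SLA species are more abundant than expected under random trait assignment, which is the kind of directional community signal the indicator set is meant to detect.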
Abstract:
BACKGROUND In many resource-limited settings, monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS A total of 44,829 patients (median age: 35 years, 58% female, median CD4 count at cART initiation: 116 cells/mm³) were followed up for a median of 1.9 years, with 3706 deaths. Mean CD4 count slopes per week ranged from 1.4 [95% confidence interval (CI): 1.2 to 1.6] cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter with >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS The absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, CD4 count slope and HIV RNA provide additional, independent prognostic information.
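The study above estimates CD4 slopes by pooling patients in a linear mixed model. As a simplified, per-patient illustration of the quantity being estimated (not the study's actual model), a least-squares slope over a follow-up window can be computed as:

```python
import numpy as np

def cd4_slope_per_week(weeks, cd4):
    """Least-squares slope (cells/mm^3 per week) of CD4 counts over time.

    A single-patient simplification: the study instead pools all patients
    in a linear mixed model, which shrinks noisy individual slopes."""
    weeks = np.asarray(weeks, dtype=float)
    cd4 = np.asarray(cd4, dtype=float)
    slope, _intercept = np.polyfit(weeks, cd4, deg=1)
    return slope

# Hypothetical patient: CD4 measured at weeks 0, 12, and 26 after cART start
print(round(cd4_slope_per_week([0, 12, 26], [116, 132, 152]), 3))  # 1.386
```

A slope of about 1.4 cells/mm³ per week corresponds to the mean recovery the study reports for patients with suppressed HIV RNA (<400 copies/mL).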
Abstract:
Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurement. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) decrease in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor–based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
Abstract:
Background Agroforestry is a sustainable land use method with a long tradition in the Bolivian Andes. A better understanding of people’s knowledge and valuation of woody species can help to adjust actor-oriented agroforestry systems. In this case study, carried out in a peasant community of the Bolivian Andes, we aimed at calculating the cultural importance of selected agroforestry species, and at analysing the intracultural variation in the cultural importance and knowledge of plants according to peasants’ sex, age, and migration. Methods Data collection was based on semi-structured interviews and freelisting exercises. Two ethnobotanical indices (Composite Salience, Cultural Importance) were used for calculating the cultural importance of plants. Intracultural variation in the cultural importance and knowledge of plants was detected by using linear and generalised linear (mixed) models. Results and discussion The culturally most important woody species were mainly trees and exotic species (e.g. Schinus molle, Prosopis laevigata, Eucalyptus globulus). We found that knowledge and valuation of plants increased with age but that they were lower for migrants; sex, by contrast, played a minor role. The age effects possibly result from decreasing ecological apparency of valuable native species, and their substitution by exotic marketable trees, loss of traditional plant uses or the use of other materials (e.g. plastic) instead of wood. Decreasing dedication to traditional farming may have led to successive abandonment of traditional tool uses, and the overall transformation of woody plant use is possibly related to diminishing medicinal knowledge. Conclusions Age and migration affect how people value woody species and what they know about their uses. 
For this reason, we recommend paying particular attention to the potential of native species, which could open promising perspectives especially for the young migrating peasant generation and attract their interest in agroforestry. These native species should be ecologically sound and selected for their potential to provide subsistence and promising commercial uses. In addition to offering socio-economic and environmental services, agroforestry initiatives using native trees and shrubs can play a crucial role in recovering elements of the lost ancient landscape that still forms part of local people’s collective identity.
Abstract:
OBJECTIVE: Assessment and treatment of psychological distress in cancer patients has been recognized as a major challenge. The role of spouses, caregivers, and significant others has gained salient importance, not only because of their supportive functions but also with respect to their own burden. The purpose of this study was to assess the amount of distress in a mixed sample of cancer patients and their partners and to explore the dyadic interdependence. METHODS: An initial sample of 154 dyads was recruited, and distress questionnaires (Hospital Anxiety and Depression Scale, Symptom Checklist 9-Item Short Version, and 12-Item Short Form Health Survey) were administered over four time points. Linear mixed models and actor-partner interdependence models were applied. RESULTS: A significant proportion of patients and their partners (up to 40%) reported high levels of anxiety, depression, and psychological distress, and low quality of life over the course of the investigation. Mixed model analyses revealed that higher risks for clinically relevant anxiety and depression in couples exist for female patients and especially for female partners. Although psychological strain decreased over time, the risk for elevated distress in female partners remained. Modeling patient-partner interdependence over time, stratified by patients' gender, revealed specific effects: a moderate correlation between distress in patients and partners, and a transmission of distress from male patients to their female partners. CONCLUSIONS: Our findings provide empirical support for gender-specific transmission of distress in dyads coping with cancer. This should be considered an important starting point for planning systemic psycho-oncological interventions and conceptualizing further research.
Abstract:
SUMMARY BACKGROUND/OBJECTIVES Orthodontic management of maxillary canine impaction (MCI), including forced eruption, may result in significant root resorption; however, the association between MCI and orthodontically induced root resorption (OIRR) is not yet sufficiently established. The purpose of this retrospective cohort study was to comparatively evaluate the severity of OIRR of maxillary incisors in orthodontically treated patients with MCI. Additionally, impaction characteristics were examined for association with OIRR severity. SUBJECTS AND METHODS The sample comprised 48 patients undergoing fixed-appliance treatment: 24 with unilateral/bilateral MCI and 24 matched controls without impaction. OIRR was calculated using pre- and post-operative panoramic tomograms. The orientation of eruption path, height, sector location, and follicle/tooth ratio of the impacted canine were also recorded. The Mann-Whitney U-test and univariate and multivariate linear mixed models were used to test for the associations of interest. RESULTS The maxillary left central incisor underwent more OIRR in the impaction group (mean difference = 0.58 mm, P = 0.04). Overall, the impaction group had 0.38 mm more OIRR compared with the control group (95% confidence interval, CI: 0.03, 0.74; P = 0.04). However, multivariate analysis demonstrated no difference in the amount of OIRR between impaction and non-impaction groups overall. A positive association between OIRR and initial root length was observed (95% CI: 0.08, 0.27; P < 0.001). The severity of canine impaction was not found to be a significant predictor of OIRR. LIMITATIONS This study was a retrospective study and used panoramic tomograms for OIRR measurements. CONCLUSIONS This study indicates that MCI is a weak OIRR predictor. Interpretation of the results needs caution due to the observational nature of the present study.
Abstract:
BACKGROUND The CD4 cell count or percent (CD4%) at the start of combination antiretroviral therapy (cART) is an important prognostic factor in children starting therapy and an important indicator of program performance. We describe trends and determinants of CD4 measures at cART initiation in children from low-, middle-, and high-income countries. METHODS We included children aged <16 years from clinics participating in a collaborative study spanning sub-Saharan Africa, Asia, Latin America, and the United States. Missing CD4 values at cART start were estimated through multiple imputation. Severe immunodeficiency was defined according to World Health Organization criteria. Analyses used generalized additive mixed models adjusted for age, country, and calendar year. RESULTS A total of 34,706 children from 9 low-income, 6 lower middle-income, 4 upper middle-income countries, and 1 high-income country (United States) were included; 20,624 children (59%) had severe immunodeficiency. In low-income countries, the estimated prevalence of children starting cART with severe immunodeficiency declined from 76% in 2004 to 63% in 2010. Corresponding figures for lower middle-income countries were from 77% to 66% and for upper middle-income countries from 75% to 58%. In the United States, the percentage decreased from 42% to 19% during the period 1996 to 2006. In low- and middle-income countries, infants and children aged 12-15 years had the highest prevalence of severe immunodeficiency at cART initiation. CONCLUSIONS Despite progress in most low- and middle-income countries, many children continue to start cART with severe immunodeficiency. Early diagnosis and treatment of HIV-infected children to prevent morbidity and mortality associated with immunodeficiency must remain a global public health priority.
Abstract:
The challenge for sustainable organic dairy farming is the identification of cows that are well adapted to forage-based production systems. Therefore, the aim of this study was to compare the grazing behaviour, physical activity and metabolic profile of two different Holstein strains kept in an organic grazing system without concentrate supplementation. Twelve Swiss Holstein (HCH; 566 kg body weight (BW)) and 12 New Zealand Holstein-Friesian (HNZ; 530 kg BW) cows in mid-lactation were kept in a rotational grazing system. After an adaptation period, the milk yield, nutrient intake, physical activity and grazing behaviour were recorded for each cow for 7 days. On three consecutive days, blood was sampled at 07:00, 12:00 and 17:00 h from each cow by jugular vein puncture. Data were analysed using linear mixed models. No differences were found in milk yield, but milk fat (3.69 vs. 4.05%, P = 0.05) and milk protein percentage (2.92 vs. 3.20%, P < 0.01) were lower in HCH than in HNZ cows. Herbage intake did not differ between strains, but organic matter digestibility was greater (P = 0.01) in HCH compared to HNZ cows. The HCH cows spent less (P = 0.04) time ruminating (439 vs. 469 min/day) and had a lower (P = 0.02) number of ruminating boli when compared to the HNZ cows. The time spent eating and physical activity did not differ between strains. Concentrations of IGF-1 and T3 were lower (P ≤ 0.05) in HCH than HNZ cows. In conclusion, HCH cows were not able to increase dry matter intake in order to express their full genetic potential for milk production when kept in an organic grazing system without concentrate supplementation. On the other hand, HNZ cows seem to compensate for the reduced nutrient availability better than HCH cows but could not use that advantage for increased production efficiency.
Abstract:
OBJECTIVE Poison centres offer rapid and comprehensive support for emergency physicians managing poisoned patients. This study investigates institutional, case-specific and poisoning-specific factors which influence the decision of emergency physicians to contact a poison centre. METHODS Retrospective, consecutive review of all poisoning-related admissions to the emergency departments (EDs) of a primary care hospital and a university hospital-based tertiary referral centre during 2007. Corresponding poison centre consultations were extracted from the poison centre database. Data were matched and analysed by logistic regression and generalised linear mixed models. RESULTS 545 poisonings were treated in the participating EDs (350 (64.2%) in the tertiary care centre, 195 (35.8%) in the primary care hospital). The poison centre was consulted in 62 (11.4%) cases (38 (61.3%) by the tertiary care centre and 24 (38.7%) by the primary care hospital). Factors significantly associated with poison centre consultation included gender (female vs male) (OR 2.99; 95% CI 1.69 to 5.29; p<0.001), number of ingested substances (>1 vs 1) (OR 2.84; 95% CI 1.65 to 4.9; p<0.001) and situation (accidental vs intentional) (OR 2.76; 95% CI 1.05 to 7.25; p=0.039). In contrast, age, medical history and hospital size did not influence poison centre consultation. Poison centre consultation was significantly more frequent during the week and significantly less frequent during night shifts. The poison centre was consulted significantly more often when patients were admitted to intensive care units (OR 5.81; 95% CI 3.25 to 10.37; p<0.001). Asymptomatic and severe versus mild cases were associated with more frequent consultation (OR 4.48; 95% CI 1.78 to 11.26; p=0.001 and OR 2.76; 95% CI 1.42 to 5.38; p=0.003, respectively). CONCLUSIONS We found low rates of poison centre consultation by emergency physicians.
It appears that intensive care unit admission and other factors reflecting either complexity or uncertainty of the clinical situation are the strongest predictors for poison centre consultation. Hospital size did not influence referral behaviour.
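The odds ratios above come from logistic regression and generalised linear mixed models. The basic quantity can be sketched for a single 2×2 table; the counts below are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a 95% Wald confidence interval.

    a/b: consulted / not consulted in the exposed group,
    c/d: consulted / not consulted in the reference group.
    This is the crude (unadjusted) OR; the study's regression models
    additionally adjust for covariates and clustering."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: consultations among female vs male cases
print(odds_ratio_ci(30, 120, 32, 363))
```

An interval that excludes 1 (as here) corresponds to the kind of statistically significant association the abstract reports for gender and number of ingested substances.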
Abstract:
Infrared thermography (IRT) was used to assess the effect of routine claw trimming on claw temperature. In total, 648 IRT observations each were collected from 81 cows housed in 6 tiestalls before and 3 wk after claw trimming. The feet were classified as either healthy (nonlesion group, n = 182) or affected with infectious foot disorders (group IFD, n = 142). The maximal surface temperatures of the coronary band and skin and the difference of the maximal temperatures (ΔT) between the lateral and medial claws of the respective foot were assessed. Linear mixed models, correcting for the hierarchical structure of the data, ambient temperature, and infectious status of the claws, were developed to evaluate the effect of time in relation to the trimming event (d 0 versus d 21) and claw (medial versus lateral). Front feet and hind feet were analyzed separately. Ambient temperature and infectious foot status were identified as external and internal factors, respectively, that significantly affected claw temperature. Before claw trimming, the lateral claws of the hind feet were significantly warmer compared with the medial claws, whereas such a difference was not evident for the claws of the front feet. At d 21, ΔT of the hind feet was reduced by ≥ 0.25 °C, whereas it was increased by ≤ 0.13 °C in the front feet compared with d 0. Therefore, trimming was associated with a remarkable decrease of ΔT of the hind claws. Equalizing the weight bearing of the hind feet by routine claw trimming is associated with a measurable reduction of ΔT between the paired hind claws.
Abstract:
The main objective of this preliminary study was to further clarify the association between testosterone (T) levels and depression by investigating symptom-based depression subtypes in a sample of 64 men. The data were taken from the ZInEP epidemiology survey. Gonadal hormones of a melancholic (n = 25) and an atypical (n = 14) depression subtype, derived from latent class analysis, were compared with those of healthy controls (n = 18). Serum T was assayed using an enzyme-linked immunosorbent assay procedure. Analysis of variance, analysis of covariance, non-parametrical tests, and generalized linear regression models were performed to examine group differences. The atypical depressive subtype showed significantly lower T levels compared with the melancholic depressives. While accumulative evidence indicates that, beyond psychosocial characteristics, the melancholic and atypical depressive subtypes are also distinguishable by biological correlates, the current study expanded this knowledge to include gonadal hormones. Further longitudinal research is warranted to disclose causality by linking the multiple processes in pathogenesis of depression.
Abstract:
BACKGROUND In resource-limited settings, clinical parameters, including body weight changes, are used to monitor clinical response. Therefore, we studied body weight changes in patients on antiretroviral treatment (ART) in different regions of the world. METHODS Data were extracted from the "International Epidemiologic Databases to Evaluate AIDS," a network of ART programmes that prospectively collects routine clinical data. Adults on ART from the Southern, East, West, and Central African and the Asia-Pacific regions were selected from the database if baseline data on body weight, gender, ART regimen, and CD4 count were available. Body weight change over the first 2 years and the probability of body weight loss in the second year were modeled using linear mixed models and logistic regression, respectively. RESULTS Data from 205,571 patients were analyzed. Mean adjusted body weight change in the first 12 months was higher in patients started on tenofovir and/or efavirenz, in patients from Central, West, and East Africa, in men, and in patients with a poorer clinical status. In the second year of ART, it was greater in patients initiated on tenofovir and/or nevirapine, in patients not on stavudine, in women, in Southern Africa, and in patients with a better clinical status at initiation. Stavudine in the initial regimen was associated with a lower mean adjusted body weight change and with weight loss in the second treatment year. CONCLUSIONS Different ART regimens have different effects on body weight change. Body weight loss after 1 year of treatment in patients on stavudine might be associated with lipoatrophy.
Abstract:
PURPOSE. To evaluate the role of fellow eye status in determining progression of geographic atrophy (GA) in patients with age-related macular degeneration (AMD). METHODS. A total of 300 eyes with GA of 193 patients from the prospective, longitudinal, natural history FAM Study were classified into three groups according to the AMD manifestation in the fellow eye at baseline examination: (1) bilateral GA, (2) early/intermediate AMD, and (3) exudative AMD. GA areas were quantified based on fundus autofluorescence images using a semiautomated image-processing method, and progression rates (PR) were estimated using two-level, linear, mixed-effects models. RESULTS. The crude GA-PR in the bilateral GA group (mean, 1.64 mm²/y; 95% CI, 1.478-1.803) was significantly higher than in the fellow eye early/intermediate group (0.74 mm²/y, 0.146-1.342). Although there was a significant difference in baseline GA size (P = 0.0013, t-test), and there was a significant increase in GA-PR of 0.11 mm²/y (0.05-0.17) per 1 disc area (DA; 2.54 mm²), an additional mean change of -0.79 (-1.43 to -0.15) applied to the PR beyond the effect of baseline GA size. However, this difference was only significant when GA size was ≥1 DA at baseline, with a GA-PR of 1.70 mm²/y (1.54-1.85) in the bilateral and 0.95 mm²/y (0.37-1.54) in the early/intermediate group. There was no significant difference in PR compared with that in the fellow eye exudative group. CONCLUSIONS. The results indicate that the AMD manifestation of the fellow eye at baseline serves as an indicator for disease progression in eyes with GA ≥ 1 DA. Predictive characteristics not only contribute to the understanding of pathophysiological mechanisms but are also useful for the design of future interventional trials in GA patients.
Abstract:
Alpine snowbeds are habitats where the major limiting factors for plant growth are herbivory and a small time window for growth due to late snowmelt. Despite these limitations, snowbed vegetation usually forms a dense carpet of palatable plants due to favourable abiotic conditions for plant growth within the short growing season. These environmental characteristics make snowbeds particularly interesting to study the interplay of facilitation and competition. We hypothesised an interplay between resource competition and facilitation against herbivory. Further, we investigated whether these predicted neighbour effects were species-specific and/or dependent on ontogeny, and whether the balance of positive and negative plant–plant interactions shifted along a snowmelt gradient. We determined the neighbour effects by means of neighbour removal experiments along the snowmelt gradient, and linear mixed model analyses. The results showed that the effects of neighbour removal were weak but generally consistent among species and snowmelt dates, and depended on whether biomass production or survival was considered. Higher total biomass and increased fruiting in removal plots indicated that plants competed for nutrients, water, and light, thereby supporting the hypothesis of prevailing competition for resources in snowbeds. However, the presence of neighbours reduced herbivory and thereby also facilitated survival. For plant growth the facilitative effects against herbivores in snowbeds counterbalanced competition for resources, leading to a weak negative net effect. Overall the neighbour effects were not species-specific and did not change with snowmelt date. 
Our finding of counterbalancing effects of competition and facilitation within a plant community is of special theoretical value for species distribution models and can explain the success of models that give primary importance to abiotic factors and tend to overlook interrelations between biotic and abiotic effects on plants.
Abstract:
Despite the impact of red blood cell (RBC) life-spans in some disease areas such as diabetes or anemia of chronic kidney disease, there is no consensus on how best to describe the process quantitatively. Several models have been proposed to explain the elimination process of RBCs: a random destruction process, a homogeneous life-span model, or a series of 4 transit compartments. The aim of this work was to explore the different models that have been proposed in the literature, and modifications to those. The impact of choosing the right model on the prediction of future outcomes in the above-mentioned areas was also investigated. Data from both indirect (clinical data) and direct life-span measurement (biotin-labeled data) methods were analyzed using non-linear mixed effects models. The analysis showed that: (1) predictions from non-steady-state data will depend on the RBC model chosen; (2) the transit compartment model, which considers variation in life-span in the RBC population, describes RBC survival data better than the random destruction or homogeneous life-span models; and (3) the additional incorporation of random destruction patterns, although improving the description of the RBC survival data, does not appear to provide a marked improvement when describing clinical data.
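A sketch of the transit-compartment idea mentioned above: with n sequential compartments and rate constant k = n / mean lifespan, the fraction of a labeled RBC cohort surviving at time t follows an Erlang survival function. The lifespan value below is a common textbook figure, not fitted to the paper's data:

```python
import math

def rbc_survival(t, mean_lifespan=120.0, n=4):
    """Fraction of labeled RBCs still alive at time t (days) under an
    n-transit-compartment lifespan model (Erlang-distributed lifespans).

    n=1 reduces to the random-destruction (exponential) model; as n grows,
    lifespans concentrate around the mean, approaching the homogeneous
    life-span model. The 4-compartment case sits between the two."""
    k = n / mean_lifespan
    kt = k * t
    return math.exp(-kt) * sum(kt**i / math.factorial(i) for i in range(n))

print(round(rbc_survival(0), 3))    # 1.0: all labeled cells present at t=0
print(round(rbc_survival(120), 3))  # 0.433: survival at the mean lifespan
```

This illustrates why the choice of model matters for non-steady-state predictions: the three model families imply visibly different survival curves even at the same mean lifespan.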