838 results for linear mixed-effects models
Abstract:
Background: Bronchial challenge tests are used to evaluate bronchial responsiveness in the diagnosis and follow-up of asthmatic patients. Challenge-induced cough has increasingly been recognized as a valuable diagnostic tool. Various stimuli and protocols have been employed. The aim of this study was to compare cough and dyspnea intensity induced by different stimuli. Methods: Twenty asthmatic patients underwent challenge tests with methacholine, bradykinin, and exercise. Coughs were counted during the challenge tests. Dyspnea was assessed by the modified Borg scale and a visual analogue scale. Statistical comparisons were performed with a linear mixed-effects model. Results: For cough, bradykinin was the most potent trigger (p < 0.01). For dyspnea measured by the Borg scale, there were no differences among stimuli (p > 0.05). By visual analogue scale, bradykinin induced more dyspnea than the other stimuli (p ≤ 0.04). Conclusion: Bradykinin appears to be the most suitable stimulus for bronchial challenge tests intended to measure cough in association with bronchoconstriction.
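As a rough illustration of this kind of analysis (not the authors' actual code), a comparison of a repeated outcome across stimuli can be fitted in Python with statsmodels: a random intercept per patient handles the repeated measurements, and stimulus enters as a categorical fixed effect. All data and column names below are invented.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long-format toy data: one row per patient x stimulus; values are invented.
data = pd.DataFrame({
    "patient": sum([[i] * 3 for i in range(1, 7)], []),
    "stimulus": ["methacholine", "bradykinin", "exercise"] * 6,
    "cough_count": [12, 25, 8, 9, 30, 11, 15, 28, 7,
                    10, 22, 9, 14, 27, 6, 11, 24, 10],
})

# Random intercept per patient; stimulus as a categorical fixed effect.
model = smf.mixedlm("cough_count ~ C(stimulus)", data, groups=data["patient"])
result = model.fit()
print(result.summary())
```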
Abstract:
Uterine smooth muscle specimens were collected from euthanatized mares in estrus and diestrus. Longitudinal and circular specimens were mounted in organ baths, and the signals were recorded on a Grass polygraph. After an equilibration period under a 2 g preload, physiologic isometric contractility was recorded continuously for 2.0 h. Area under the curve, frequency, and time occupied by contractions were studied. Differences between cycle phases, between muscle layers, and over the recorded time periods were evaluated statistically using linear mixed-effects models. In the mare, physiologic contractility of the uterus decreased significantly over time for all variables evaluated (time as a continuous covariate). For area under the curve, there was a significant effect of muscle layer (longitudinal > circular). For frequency, higher values were recorded in the circular smooth muscle layer during estrus, whereas higher values were seen in the longitudinal smooth muscle layer during diestrus. Contractions occupied more time in the longitudinal layer than in the circular layer, and more time in diestrus than in estrus. This study describes physiologic myometrial motility in the organ bath as a function of cycle phase.
Abstract:
BACKGROUND: Estimates of the decrease in CD4+ cell counts in untreated patients with human immunodeficiency virus (HIV) infection are important for patient care and public health. We analyzed CD4+ cell count decreases in the Cape Town AIDS Cohort and the Swiss HIV Cohort Study. METHODS: We used mixed-effects models and joint models that allowed for the correlation between CD4+ cell count decreases and survival, and stratified analyses by the initial cell count (50-199, 200-349, 350-499, and 500-750 cells/μL). Results are presented as the mean decrease in CD4+ cell count with 95% confidence intervals (CIs) during the first year after the initial CD4+ cell count. RESULTS: A total of 784 South African (629 nonwhite) and 2030 Swiss (218 nonwhite) patients with HIV infection contributed 13,388 CD4+ cell counts. Decreases in CD4+ cell count were steeper in white patients, patients with higher initial CD4+ cell counts, and older patients. Decreases ranged from a mean of 38 cells/μL (95% CI, 24-54 cells/μL) in nonwhite patients from the Swiss HIV Cohort Study 15-39 years of age with an initial CD4+ cell count of 200-349 cells/μL to a mean of 210 cells/μL (95% CI, 143-268 cells/μL) in white patients in the Cape Town AIDS Cohort ≥40 years of age with an initial CD4+ cell count of 500-750 cells/μL. CONCLUSIONS: Among both patients from Switzerland and patients from South Africa, CD4+ cell count decreases were greater in white patients with HIV infection than in nonwhite patients with HIV infection.
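A minimal sketch of the kind of decline estimate described above, assuming a random-intercept, random-slope mixed model on longitudinal CD4 counts; the joint modeling of decline and survival used in the study is not shown, and all data below are simulated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate repeated CD4 counts over one year for 50 hypothetical patients.
rng = np.random.default_rng(0)
n_patients, n_visits = 50, 4
ids = np.repeat(np.arange(n_patients), n_visits)
years = np.tile(np.linspace(0, 1, n_visits), n_patients)
slopes = rng.normal(-80, 20, n_patients)     # per-patient decline, cells/uL/yr
cd4 = 400 + slopes[ids] * years + rng.normal(0, 30, ids.size)
data = pd.DataFrame({"id": ids, "years": years, "cd4": cd4})

# re_formula="~years" adds a random slope on time for each patient.
model = smf.mixedlm("cd4 ~ years", data, groups=data["id"], re_formula="~years")
result = model.fit()
# The fixed-effect coefficient on `years` is the mean first-year decrease.
print(result.params["years"])
```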
Abstract:
OBJECTIVES: Zidovudine (ZDV) is recommended for first-line antiretroviral therapy (ART) in resource-limited settings. ZDV may, however, lead to anemia and impaired immunological response. We compared CD4+ cell counts over 5 years between patients starting ART with and without ZDV in southern Africa. DESIGN: Cohort study. METHODS: Patients aged at least 16 years who started first-line ART in South Africa, Botswana, Zambia, or Lesotho were included. We used linear mixed-effects models to compare CD4+ cell count trajectories between patients on ZDV-containing regimens and patients on other regimens, censoring follow-up at the first treatment change. Impaired immunological recovery, defined as a CD4+ cell count below 100 cells/μl at 1 year, was assessed with logistic regression. Analyses were adjusted for baseline CD4+ cell count and hemoglobin level, age, sex, type of regimen, viral load monitoring, and calendar year. RESULTS: A total of 72,597 patients starting ART, including 19,758 (27.2%) on ZDV, were analyzed. Patients on ZDV had higher CD4+ cell counts (150 vs. 128 cells/μl) and hemoglobin levels (12.0 vs. 11.0 g/dl) at baseline, and were less likely to be women than those on other regimens. Adjusted differences in CD4+ cell counts between regimens containing and not containing ZDV were -16 cells/μl [95% confidence interval (CI) -18 to -14] at 1 year and -56 cells/μl (95% CI -59 to -52) at 5 years. Impaired immunological recovery was more likely with ZDV than with other regimens (odds ratio 1.40, 95% CI 1.22-1.61). CONCLUSION: In southern Africa, ZDV is associated with inferior immunological recovery compared with other backbones. Replacing ZDV with another nucleoside reverse transcriptase inhibitor could avoid unnecessary switches to second-line ART.
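The impaired-recovery analysis can be illustrated with an adjusted logistic regression; the sketch below uses simulated data and only a subset of the adjustment covariates named in the abstract, so the column names and coefficients are purely hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a cohort so the sketch is self-contained.
rng = np.random.default_rng(1)
n = 2000
data = pd.DataFrame({
    "zdv": rng.integers(0, 2, n),            # 1 = ZDV-containing regimen
    "baseline_cd4": rng.normal(140, 50, n),
    "age": rng.normal(35, 9, n),
})
logit_p = -1.5 + 0.34 * data["zdv"] - 0.01 * data["baseline_cd4"]
data["impaired"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("impaired ~ zdv + baseline_cd4 + age", data).fit()
# Exponentiated coefficients are adjusted odds ratios; conf_int gives 95% CIs.
print(np.exp(model.params["zdv"]), np.exp(model.conf_int().loc["zdv"]))
```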
Abstract:
Background: Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods: We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min), or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effects models, the odds of developing a moderate or severe eGFR decrease with logistic regression, and mortality with competing risk regression. Results: We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF group than in the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop a moderate (adjusted OR: 3.11; 95% CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95% CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen, and mortality was similar in both treatment groups. Conclusions: TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
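The renal categories above follow from a CKD-EPI estimate of glomerular filtration rate; the sketch below implements the published 2009 creatinine equation (assuming that version, since the abstract does not specify) together with the abstract's eGFR cut-offs. Serum creatinine is in mg/dL; the published equation reports eGFR in mL/min/1.73 m².

```python
# 2009 CKD-EPI creatinine equation (published formula, not the study's code).
def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def renal_category(egfr: float) -> str:
    # Cut-offs as given in the abstract: mild 60-89, moderate 30-59, severe <30.
    if egfr >= 90: return "normal"
    if egfr >= 60: return "mild"
    if egfr >= 30: return "moderate"
    return "severe"

print(renal_category(ckd_epi_egfr(1.2, 45, female=True, black=False)))
```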
Abstract:
OBJECTIVES: The aim of the present longitudinal study was to investigate bacterial colonization of the internal implant cavity and to evaluate a possible association with peri-implant bone loss. METHODS: A total of 264 paper point samples were harvested from the intra-implant cavity of 66 implants in 26 patients immediately following implant insertion and after 3, 4, and 12 months. Samples were evaluated for Aggregatibacter actinomycetemcomitans, Fusobacterium nucleatum, Porphyromonas gingivalis, Prevotella intermedia, Treponema denticola, and Tannerella forsythia, as well as for total bacterial counts, by real-time PCR. Bone loss was evaluated on standardized radiographs up to 25 months after implant insertion. For the statistical analysis, mixed-effects models were fitted. RESULTS: There was an increase over time in both the frequency of detection and the mean counts of the selected bacteria. Evaluation of the target bacteria revealed a significant association of Pr. intermedia at 4 and 12 months with peri-implant bone loss at 25 months (4 months: P = 0.009; 12 months: P = 0.021). CONCLUSIONS: The present study demonstrated progressive colonization by periodontopathogenic bacteria in the internal cavities of two-piece implants. The results suggest that internal colonization with Pr. intermedia was associated with peri-implant bone loss.
Abstract:
Background: A small pond, c. 90 years old, near Bern, Switzerland, contains a population of threespine stickleback (Gasterosteus aculeatus) with two distinct male phenotypes. Males of one type are large and red, and nest in the shallow littoral zone. Males of the other type are small and orange, and nest offshore at slightly greater depth. The females in this population are phenotypically highly variable but cannot easily be assigned to either male type. Question: Is the existence of two sympatric male morphs maintained by substrate-associated male nest-site choice and facilitated by female mate preferences? Organisms: Male stickleback caught individually at their breeding sites; females caught with minnow traps. Methods: In experimental tanks, we simulated the slope and substrate of the two nesting habitats. We then placed individual males in a tank and observed in which habitat each male built his nest. In a simultaneous two-stimulus choice design, we gave females the choice between a large, red male and a small, orange one. We measured female morphology and used linear mixed-effects models to determine whether female preference correlated with female morphology. Results: Both red and orange males preferred nesting in the habitat that simulated the slightly deeper offshore condition, the habitat occupied by the small, orange males in the pond itself. The proportion of females that chose a small, orange male was similar to the proportion that chose a large, red male. Several aspects of female phenotype correlated with the male type that a female preferred.
Abstract:
BACKGROUND: There are concerns about the effects of in utero exposure to antiretroviral drugs (ARVs) on the development of HIV-exposed but uninfected (HEU) children. The aim of this study was to evaluate whether in utero exposure to ARVs is associated with lower birth weight/height and reduced growth during the first 2 years of life. METHODS: This cohort study was conducted among HEU infants born between 1996 and 2010 in a tertiary children's hospital in Rio de Janeiro, Brazil. Weight was measured with a mechanical scale, and height was measured with a measuring board. Z-scores for weight-for-age (WAZ), length-for-age (LAZ), and weight-for-length were calculated. We modeled trajectories with mixed-effects models, adjusted for mother's age, CD4 cell count, viral load, year of birth, and family income. RESULTS: A total of 588 HEU infants were included, of whom 155 (26%) were not exposed to ARVs, 114 (19%) were exposed early (first trimester), and 319 (54%) later. WAZ were lower among infants exposed early compared with infants exposed later: adjusted differences were -0.52 (95% confidence interval [CI]: -0.99 to -0.04, P = 0.02) at birth and -0.22 (95% CI: -0.47 to 0.04, P = 0.10) during follow-up. LAZ were lower during follow-up: -0.35 (95% CI: -0.63 to -0.08, P = 0.01). There were no differences in weight-for-length scores. Z-scores of infants exposed late in pregnancy were similar to those of unexposed infants. CONCLUSIONS: In HEU children, early exposure to ARVs was associated with lower WAZ at birth and lower LAZ up to 2 years of life. Growth of HEU children needs to be monitored closely.
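Growth z-scores of the kind analyzed above are typically computed from reference LMS parameters (the WHO growth standards use this method, though the abstract does not say which reference was applied); a minimal sketch, with placeholder L, M, S values that are not real reference data:

```python
import math

# LMS z-score transformation: Z = ((x/M)**L - 1) / (L*S),
# or log(x/M)/S when L == 0.
def lms_zscore(x: float, L: float, M: float, S: float) -> float:
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical LMS triple and weight; real tables vary by age and sex.
print(lms_zscore(x=8.5, L=0.2, M=9.0, S=0.11))
```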
Abstract:
In most epidemiological studies, historical monitoring data are scant and must be pooled to identify occupational groups with homogeneous exposures. Homogeneity of exposure is generally assessed in a group of workers who share a common job title or work in a common area. While published results suggest that the degree of homogeneity varies widely across job groups, less is known about whether such variation differs across industrial sectors, classes of contaminants, or the methods used to group workers. Relying upon a compilation of results presented in the literature, patterns of homogeneity among nearly 500 occupational groups of workers were evaluated on the basis of type of industry and agent. Additionally, effects of the characteristics of the sampling strategy on estimated indicators of homogeneity of exposure were assessed. Exposure profiles for occupational groups of workers have typically been assessed under the assumption of stationarity, i.e., that the mean exposure level and variance of the distribution describing the underlying population of exposures are constant over time. Yet the literature has shown that occupational exposures have declined in recent decades, rendering traditional methods for describing exposure profiles inadequate. Thus, work was needed to develop appropriate methods to assess homogeneity for groups of workers whose exposures have changed over time. A study was carried out applying mixed-effects models with a term for temporal trend to appropriately describe exposure profiles of groups of workers in the nickel-producing industry over a 20-year period. Using a subset of groups of nickel-exposed workers, another study was conducted to develop and apply a framework for evaluating the assumption of stationarity of the variances in the presence of systematic changes in exposure levels over time.
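One way to implement the approach described here, a mixed-effects model on log-transformed exposures with a fixed temporal trend and a random worker intercept, is sketched below. The data are simulated, and the 95% fold-range summary of between-worker variability shown at the end is one common homogeneity index, not necessarily the one used in this work.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate declining log-exposures for 30 hypothetical workers over 5 years.
rng = np.random.default_rng(2)
n_workers, n_samples = 30, 5
ids = np.repeat(np.arange(n_workers), n_samples)
year = np.tile(np.arange(n_samples), n_workers)
worker_effect = rng.normal(0, 0.4, n_workers)
log_exposure = 1.0 - 0.08 * year + worker_effect[ids] + rng.normal(0, 0.6, ids.size)
data = pd.DataFrame({"worker": ids, "year": year, "log_exposure": log_exposure})

# Fixed year term captures the temporal trend; random intercept per worker.
result = smf.mixedlm("log_exposure ~ year", data, groups=data["worker"]).fit()
between_var = float(result.cov_re.iloc[0, 0])  # between-worker variance
within_var = result.scale                       # within-worker (residual) variance
# Fold range spanning the central 95% of worker mean exposures.
print(np.exp(3.92 * np.sqrt(between_var)))
```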
Abstract:
Asthma is the most common chronic disorder in childhood, affecting an estimated 6.2 million children under 18 years (1). The purpose of this study was to examine individual- and community-level characteristics simultaneously to explain the factors that contribute to the use of emergency department (ED) services by children aged 18 years or younger, and to determine whether there was an association between air quality and ED visits in the same population, from 2005-2007 in Houston/Harris County. Data were collected from the Houston Safety Net Hospital Emergency Department Use Study and the 2000 US Census. Bivariate and multivariate logistic regression models and mixed-effects models were used to analyze the data collected during the study period. There were 704,902 ED visits made by children 18 and younger living in Houston from January 1, 2005 to December 31, 2007. Of those, 19,098 had a primary discharge diagnosis of asthma. Asthma ED visits varied by season, with the proportion of ED visits for asthma highest from September-December. African-American children were 2.6 (95% CI, 2.43-2.66) times more likely to have an ED visit for asthma than White children. Poverty, single-parent-headed households, and younger age were all associated with a greater likelihood of having gone to the ED for asthma treatment. Compared to Whites living in lightly monitored pollution areas, African-Americans and Hispanics living in heavily monitored areas were 1.15 (95% CI, 1.04-1.28) times more likely to have an ED visit for asthma. Race and poverty seem to account for a large portion of the disparities in ED use found among children, even after accounting for multiple individual- and community-level variables. These results suggest that racial disparities in asthma continue to pose risks for African-American children, and they point to the need for additional research into potential explanations and remedies. Programs to reduce inappropriate ED use must be sensitive to an array of complex socioeconomic issues within minority and low-income populations.
Abstract:
BACKGROUND: This observational study investigated the association of cardiorespiratory fitness and weight status with repeated measures of 24-hr ambulatory blood pressure (24-hr ABP). Little is known about these associations, and few data exist examining the interaction between cardiorespiratory fitness and weight status and the contributions of each to 24-hr ABP in youth. METHODS: This study used secondary analysis data from the "Adolescent Blood Pressure and Anger: Ethnic Differences" study. The current study sample included 374 African-American, Anglo-American, and Mexican-American adolescents 11-16 years of age. Mixed-effects models were used to test the relationship between weight status and cardiorespiratory fitness and repeated measures of ambulatory blood pressure over 24 hours (24-hr ABP). Weight status was categorized as "normal weight" (BMI<85th percentile), "overweight" (85th≤BMI<95th), and "obese" (BMI≥95th). Cardiorespiratory fitness, determined by heart rate recovery (HRR), was defined as the difference between heart rate at peak exercise and heart rate at two minutes post-exercise, as measured by a height-adjusted step test, and was stratified into two groups, low and high fitness, using a median split. Ambulatory blood pressure (ABP) was monitored for a 24-hr period on a school day using the Spacelabs ambulatory monitor (Model 90207). Blood pressure and heart rate were recorded at 30-minute intervals throughout the day and at 60-minute intervals during sleep. RESULTS: No significant associations were found between weight status and mean 24-hr systolic blood pressure (SBP) or mean arterial pressure (MAP). A significant, inverse association between weight status and mean 24-hr diastolic blood pressure (DBP) was revealed. Cardiorespiratory fitness was significantly and inversely associated with mean 24-hr ABP. High-fitness adolescents had significantly lower mean 24-hr SBP, DBP, and MAP measurements than low-fitness adolescents: 1.90 mmHg, 1.16 mmHg, and 1.68 mmHg lower, respectively. Additionally, high fitness appeared to afford protection from higher mean 24-hr SBP and MAP, irrespective of weight status. Among normal-weight adolescents, low fitness resulted in higher mean 24-hr SBP and MAP compared with their fit counterparts. Among adolescents categorized as high fitness, increasing weight status did not appear to result in higher mean 24-hr SBP or MAP. Cardiorespiratory fitness, rather than weight status, appeared to be the more dominant predictor of mean 24-hr SBP and MAP. CONCLUSIONS: To our knowledge, this is the first study to investigate the independent and combined contributions of cardiorespiratory fitness and weight status to 24-hr ABP, all objectively measured. The results may guide and inform future research. Early cardiovascular disease (CVD) prevention should focus on improving cardiorespiratory fitness levels among all adolescents, particularly the least fit, regardless of weight status, while obesity prevention efforts continue.
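The fitness classification defined above reduces to simple arithmetic; a sketch with invented heart-rate values:

```python
import numpy as np

# HRR as defined in the abstract: heart rate at peak exercise minus
# heart rate two minutes post-exercise. Values below are invented.
peak_hr = np.array([190, 185, 200, 178, 195])
hr_2min_post = np.array([140, 150, 138, 152, 136])
hrr = peak_hr - hr_2min_post

# Median split into low- and high-fitness groups.
high_fitness = hrr > np.median(hrr)
print(hrr, high_fitness)
```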
Abstract:
Access to different environments may lead to inter-population behavioural changes within a species that allow populations to exploit their immediate environments. Elephant seals from Marion Island (MI) and King George Island (KGI) (Isla 25 de Mayo) forage in different oceanic environments and evidently employ different foraging strategies. This study elucidates some of the factors influencing the diving behaviour of male southern elephant seals from these populations, tracked between 1999 and 2002. Mixed-effects models were used to determine the influence of bathymetry, population of origin, body length (as a proxy for size), and individual variation on the diving behaviour of adult males from the two populations. Males from KGI and MI showed differences in all dive parameters. MI males dived deeper and longer (median: 652.0 m and 34.00 min) than KGI males (median: 359.1 m and 25.50 min). KGI males appeared to forage both benthically and pelagically, while MI males in this study rarely reached depths close to the seafloor and appeared to forage pelagically. Model outputs indicate that males from the two populations showed substantial differences in dive depths, even when foraging in areas of similar water depth. Dive depths were not significantly influenced by animal size, whereas size played a significant role in dive durations, though this effect also depended on the population of origin. This study provides some support for inter-population differences in the dive behaviour of male southern elephant seals.
Abstract:
As the Antarctic Circumpolar Current crosses the South-West Indian Ocean Ridge, it creates an extensive eddy field characterised by high sea level anomaly variability. We investigated the diving behaviour of female southern elephant seals from Marion Island during their post-moult migrations in relation to this eddy field in order to determine its role in the animals' at-sea dispersal. Most seals dived within the region significantly more often than predicted by chance, and these dives were generally shallower and shorter than dives outside the eddy field. Mixed effects models estimated reductions of 44.33 ± 3.00 m (maximum depth) and 6.37 ± 0.10 min (dive duration) as a result of diving within the region, along with low between-seal variability (maximum depth: 5.5 % and dive duration: 8.4 %). U-shaped dives increased in frequency inside the eddy field, whereas W-shaped dives with multiple vertical movements decreased. Results suggest that Marion Island's adult female elephant seals' dives are characterised by lowered cost-of-transport when they encounter the eddy field during the start and end of their post-moult migrations. This might result from changes in buoyancy associated with varying body condition upon leaving and returning to the island. Our results do not suggest that the eddy field is a vital foraging ground for Marion Island's southern elephant seals. However, because seals preferentially travel through this area and likely forage opportunistically while minimising transport costs, we hypothesise that climate-mediated changes in the nature or position of this region may alter the seals' at-sea dispersal patterns.
Abstract:
The aim of this study was to determine the most informative sampling time(s) providing a precise prediction of tacrolimus area under the concentration-time curve (AUC). Fifty-four concentration-time profiles of tacrolimus from 31 adult liver transplant recipients were analyzed. Each profile contained 5 tacrolimus whole-blood concentrations (predose and 1, 2, 4, and 6 or 8 hours postdose), measured using liquid chromatography-tandem mass spectrometry. The concentration at 6 hours was interpolated for each profile, and 54 values of AUC(0-6) were calculated using the trapezoidal rule. The best sampling times were then determined using limited sampling strategies and sensitivity analysis. Linear mixed-effects modeling was performed to estimate regression coefficients of equations incorporating each concentration-time point (C0, C1, C2, C4, interpolated C5, and interpolated C6) as a predictor of AUC(0-6). Predictive performance was evaluated by assessment of the mean error (ME) and root mean square error (RMSE). Limited sampling strategy (LSS) equations with C2, C4, and C5 provided similar results for prediction of AUC(0-6) (R² = 0.869, 0.844, and 0.832, respectively). These 3 time points were superior to C0 in the prediction of AUC. The ME was similar for all time points; the RMSE was smallest for C2, C4, and C5. The highest sensitivity index was determined to be 4.9 hours postdose at steady state, suggesting that this time point provides the most information about the AUC(0-12). The results from limited sampling strategies and sensitivity analysis supported the use of a single blood sample at 5 hours postdose as a predictor of both AUC(0-6) and AUC(0-12). A jackknife procedure was used to evaluate the predictive performance of the model, demonstrating that a sample collected at 5 hours after dosing could be considered the optimal sampling time for predicting AUC(0-6).
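A compact sketch of the core calculations named here — trapezoidal AUC(0-6), a one-point limited-sampling regression, and ME/RMSE as predictive-performance measures — on invented concentration data (the study's actual profiles and coefficients are not reproduced):

```python
import numpy as np

times = np.array([0, 1, 2, 4, 6])          # hours post-dose
profiles = np.array([                      # ng/mL, one row per profile; invented
    [5.1, 18.0, 14.2, 9.8, 7.0],
    [4.2, 15.5, 12.9, 8.1, 6.2],
    [6.0, 20.3, 16.0, 11.2, 8.3],
])
auc = np.trapz(profiles, times, axis=1)    # trapezoidal AUC(0-6)

# One-point limited-sampling regression: AUC predicted from C4 alone.
c4 = profiles[:, 3]
slope, intercept = np.polyfit(c4, auc, 1)
predicted = slope * c4 + intercept

me = np.mean(predicted - auc)                    # mean error (bias)
rmse = np.sqrt(np.mean((predicted - auc) ** 2))  # root mean square error
print(auc, me, rmse)
```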