87 results for Poisson generalized linear mixed models


Relevance: 100.00%

Abstract:

OBJECTIVE To assess trends in the frequency of concomitant vascular reconstructions (VRs) from 2000 through 2009 among patients who underwent pancreatectomy, as well as to compare the short-term outcomes between patients who underwent pancreatic resection with and without VR. DESIGN Single-center series have been conducted to evaluate the short-term and long-term outcomes of VR during pancreatic resection. However, its effectiveness from a population-based perspective is still unknown. Unadjusted, multivariable, and propensity score-adjusted generalized linear models were performed. SETTING Nationwide Inpatient Sample from 2000 through 2009. PATIENTS A total of 10 206 patients were involved. MAIN OUTCOME MEASURES Incidence of VR during pancreatic resection, perioperative in-hospital complications, and length of hospital stay. RESULTS Overall, 10 206 patients were included in this analysis. Of these, 412 patients (4.0%) underwent VR, with the rate increasing from 0.7% in 2000 to 6.0% in 2009 (P < .001). Patients who underwent pancreatic resection with VR were at a higher risk for intraoperative (propensity score-adjusted odds ratio, 1.94; P = .001) and postoperative (propensity score-adjusted odds ratio, 1.36; P = .008) complications, while the mortality and median length of hospital stay were similar to those of patients without VR. Among the 25% of hospitals with the highest surgical volume, patients who underwent pancreatic surgery with VR had significantly higher rates of postoperative complications and mortality than patients without VR. CONCLUSIONS The frequency of VR during pancreatic surgery is increasing in the United States. In contrast with most single-center analyses, this population-based study demonstrated that patients who underwent VR during pancreatic surgery had higher rates of adverse postoperative outcomes than their counterparts who underwent pancreatic resection only. 
Prospective studies incorporating long-term outcomes are warranted to further define which patients benefit from VR.
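The propensity score-adjusted odds ratios reported above reduce, at their core, to the cross-product ratio of a 2x2 table of event counts. A minimal sketch with hypothetical counts (not the study's data):

```python
def odds_ratio(exposed_events, exposed_nonevents, control_events, control_nonevents):
    """Cross-product odds ratio from a 2x2 table of counts."""
    return (exposed_events * control_nonevents) / (exposed_nonevents * control_events)

# Hypothetical counts: 20/100 complications with VR vs. 10/100 without.
or_vr = odds_ratio(20, 80, 10, 90)  # -> 2.25
```

The adjusted estimates in the study additionally condition on the propensity score, but the quantity being estimated is this same odds ratio.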

Relevance: 100.00%

Abstract:

Uterine smooth muscle specimens were collected from euthanatized mares in estrus and diestrus. Longitudinal and circular specimens were mounted in organ baths and the signals recorded on a Grass polygraph. After an equilibration period under 2 g preload, physiologic isometric contractility was recorded continuously for 2.0 h. Area under the curve, frequency, and time occupied by contractions were studied. Differences between cycle phases, between muscle layers, and over the recorded time periods were statistically evaluated using linear mixed-effect models. In the mare, physiologic contractility of the uterus decreased significantly over time for all variables evaluated (time as covariate on a continuous scale). For area under the curve, there was a significant effect of muscle layer (longitudinal > circular). For frequency, higher values were recorded in estrus for the circular smooth muscle layer, whereas higher values were seen in the longitudinal smooth muscle layer during diestrus. More time was occupied by contractions in the longitudinal layer than in the circular layer, and in diestrus than in estrus. This study describes physiologic myometrial motility in the organ bath depending on cycle phase.
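The area under the curve of a contractility trace can be approximated by trapezoidal integration of the sampled signal; a generic sketch (the signal values are hypothetical, not taken from the recordings above):

```python
def auc_trapezoid(times, values):
    """Approximate area under a sampled curve by the trapezoidal rule."""
    return sum(
        (t1 - t0) * (v0 + v1) / 2.0
        for (t0, v0), (t1, v1) in zip(zip(times, values), zip(times[1:], values[1:]))
    )

# A single symmetric contraction rising to 1 g of force and back over 2 time units:
area = auc_trapezoid([0, 1, 2], [0.0, 1.0, 0.0])  # -> 1.0
```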

Relevance: 100.00%

Abstract:

Milk cortisol concentration was determined under routine management conditions on 4 farms with an auto-tandem milking parlor and 8 farms with 1 of 2 automatic milking systems (AMS). One of the AMS was a partially forced (AMSp) system, and the other was a free cow traffic (AMSf) system. Milk samples were collected for all the cows on a given farm (20 to 54 cows) for at least 1 d. Behavioral observations were made during the milking process for a subset of 16 to 20 cows per farm. Milk cortisol concentration was evaluated by milking system, time of day, behavior during milking, daily milk yield, and somatic cell count using linear mixed-effects models. Milk cortisol did not differ between systems (AMSp: 1.15 ± 0.07; AMSf: 1.02 ± 0.12; auto-tandem parlor: 1.01 ± 0.16 nmol/L). Cortisol concentrations were lower in evening than in morning milkings (1.01 ± 0.12 vs. 1.24 ± 0.13 nmol/L). The daily periodicity of cortisol concentration was characterized by an early morning peak and a late afternoon elevation in AMSp. A bimodal pattern was not evident in AMSf. Finally, milk cortisol decreased by a factor of 0.915 in milking parlors, by 0.998 in AMSp, and increased by a factor of 1.161 in AMSf for each unit of ln(somatic cell count/1,000). We conclude that milking cows in milking parlors or AMS does not result in relevant stress differences as measured by milk cortisol concentrations. The biological relevance of the difference regarding the daily periodicity of milk cortisol concentrations observed between the AMSp and AMSf needs further investigation.
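The somatic cell count effect above is multiplicative: cortisol changes by a fixed factor per unit of ln(SCC/1,000), so the effect compounds across units. A sketch using the reported parlor factor of 0.915 (the baseline concentration is hypothetical):

```python
def scaled_cortisol(baseline_nmol_l, factor_per_unit, ln_scc_units):
    """Apply a multiplicative per-unit effect of ln(somatic cell count/1,000)."""
    return baseline_nmol_l * factor_per_unit ** ln_scc_units

# Reported factor 0.915 per unit in milking parlors; baseline of 1.0 nmol/L is hypothetical.
parlor = scaled_cortisol(1.0, 0.915, 2)  # 0.915**2 = 0.837225
```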

Relevance: 100.00%

Abstract:

BACKGROUND: First investigations of the interactions between weather and the incidence of acute myocardial infarctions date back to 1938. The early observation of a higher incidence of myocardial infarctions in the cold season has been confirmed in very different geographical regions and cohorts. While the influence of seasonal variations on the incidence of myocardial infarctions has been extensively documented, the impact of individual meteorological parameters on the disease has so far not been investigated systematically. Hence, the present study intended to assess the impact of the essential variables of weather and climate on the incidence of myocardial infarctions. METHODS: The daily incidence of myocardial infarctions was calculated from a national hospitalization survey. The hourly weather and climate data were provided by the database of the national weather forecast. The epidemiological and meteorological data were correlated by multivariate analysis based on a generalized linear model assuming a log link function and a Poisson distribution. RESULTS: High ambient pressure, high pressure gradients, and heavy wind activity were associated with an increase in the incidence of the total of 6560 hospitalizations for myocardial infarction, irrespective of the geographical region. Snowfall and rainfall had inconsistent effects. Temperature, Foehn, and lightning showed no statistically significant impact. CONCLUSIONS: Ambient pressure, pressure gradient, and wind activity had a statistical impact on the incidence of myocardial infarctions in Switzerland from 1990 to 1994. To establish a cause-and-effect relationship, more data are needed on the interaction between the pathophysiological mechanisms of the acute coronary syndrome and weather and climate variables.
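A generalized linear model with log link and Poisson distribution, as used above, models the expected daily count as an exponential function of the predictors, so each coefficient acts as a multiplicative rate ratio. A minimal sketch with made-up coefficients (not the study's fitted values):

```python
import math

def expected_count(intercept, coefs, covariates):
    """Poisson GLM with log link: E[count] = exp(b0 + sum(b_i * x_i))."""
    return math.exp(intercept + sum(b * x for b, x in zip(coefs, covariates)))

# Hypothetical: baseline of 3 MIs/day; coefficient 0.2 per standardized unit
# of pressure gradient, i.e. a rate ratio of exp(0.2) per unit.
baseline = expected_count(math.log(3.0), [0.2], [0.0])  # -> 3.0
rate_ratio = math.exp(0.2)
```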

Relevance: 100.00%

Abstract:

OBJECTIVES Zidovudine (ZDV) is recommended for first-line antiretroviral therapy (ART) in resource-limited settings. ZDV may, however, lead to anemia and impaired immunological response. We compared CD4+ cell counts over 5 years between patients starting ART with and without ZDV in southern Africa. DESIGN Cohort study. METHODS Patients aged at least 16 years who started first-line ART in South Africa, Botswana, Zambia, or Lesotho were included. We used linear mixed-effect models to compare CD4+ cell count trajectories between patients on ZDV-containing regimens and patients on other regimens, censoring follow-up at first treatment change. Impaired immunological recovery, defined as a CD4+ cell count below 100 cells/μl at 1 year, was assessed in logistic regression. Analyses were adjusted for baseline CD4+ cell count and hemoglobin level, age, sex, type of regimen, viral load monitoring, and calendar year. RESULTS A total of 72,597 patients starting ART, including 19,758 (27.2%) on ZDV, were analyzed. Patients on ZDV had higher CD4+ cell counts (150 vs. 128 cells/μl) and hemoglobin level (12.0 vs. 11.0 g/dl) at baseline, and were less likely to be women than those on other regimens. Adjusted differences in CD4+ cell counts between regimens containing and not containing ZDV were -16 cells/μl [95% confidence interval (CI) -18 to -14] at 1 year and -56 cells/μl (95% CI -59 to -52) at 5 years. Impaired immunological recovery was more likely with ZDV compared to other regimens (odds ratio 1.40, 95% CI 1.22-1.61). CONCLUSION In southern Africa, ZDV is associated with inferior immunological recovery compared to other backbones. Replacing ZDV with another nucleoside reverse transcriptase inhibitor could avoid unnecessary switches to second-line ART.
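The odds ratio of 1.40 from the logistic model has a direct interpretation on the logit scale: adding ln(1.40) to the linear predictor multiplies the odds by exactly 1.40. A sketch with a hypothetical baseline risk (the study does not report one):

```python
import math

def odds(p):
    return p / (1.0 - p)

def shift_by_odds_ratio(p0, oratio):
    """Return the probability whose odds equal p0's odds times `oratio`."""
    logit = math.log(odds(p0)) + math.log(oratio)
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical 10% baseline risk of impaired recovery; OR 1.40 as reported.
p1 = shift_by_odds_ratio(0.10, 1.40)
```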

Relevance: 100.00%

Abstract:

Objective: To evaluate a new triaxial accelerometer device for prediction of energy expenditure, measured as VO2/kg, in obese adults and normal-weight controls during activities of daily life. Subjects and methods: Thirty-seven obese adults (Body Mass Index (BMI) 37±5.4) and seventeen controls (BMI 23±1.8) performed eight activities for 5 to 8 minutes while wearing a triaxial accelerometer on the right thigh. Simultaneously, VO2 and VCO2 were measured using a portable metabolic system. The relationship between accelerometer counts (AC) and VO2/kg was analysed using spline regression and linear mixed-effects models. Results: For all activities, VO2/kg was significantly lower in obese participants than in normal-weight controls. A linear relationship between AC and VO2/kg existed only within accelerometer values from 0 to 300 counts/min, with an increase of 3.7 (95%-confidence interval (CI) 3.4 - 4.1) and 3.9 ml/min (95%-CI 3.4 - 4.3) per increase of 100 counts/min in obese and normal-weight adults, respectively. Linear modelling of the whole range yielded wide prediction intervals for VO2/kg of ±6.3 and ±7.3 ml/min in the two groups. Conclusion: In obese and normal-weight adults, the use of AC for predicting energy expenditure, defined as VO2/kg, from a broad range of physical activities, characterized by varying intensities and types of muscle work, is limited.
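The linear relationship above holds only between 0 and 300 counts/min, so any prediction outside that window should be refused. A sketch using the reported slope for the obese group (3.7 ml/min per 100 counts/min); the intercept is hypothetical, since the abstract does not report one:

```python
def predict_vo2_per_kg(counts_per_min, intercept, slope_per_100_counts):
    """Linear prediction of VO2/kg (ml/min) from accelerometer counts.

    Valid only in the 0-300 counts/min range where the relationship is linear.
    """
    if not 0 <= counts_per_min <= 300:
        raise ValueError("linear model only valid for 0-300 counts/min")
    return intercept + slope_per_100_counts * counts_per_min / 100.0

# Slope 3.7 as reported for obese adults; intercept 3.5 is a hypothetical placeholder.
vo2 = predict_vo2_per_kg(200, 3.5, 3.7)  # -> 10.9
```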

Relevance: 100.00%

Abstract:

Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min) or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effect models, the odds of developing moderate or severe eGFR decrease with logistic regression, and mortality with competing risk regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF compared to the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop moderate (adjusted OR: 3.11; 95%CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95%CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
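The CKD-EPI (2009) creatinine equation used above estimates GFR from serum creatinine, age, and sex, and the severity bands follow the study's categories. A sketch of the published formula; treat the coefficients as a best-effort transcription rather than a clinical implementation:

```python
def ckd_epi_egfr(creatinine_mg_dl, age_years, female, black=False):
    """CKD-EPI 2009 estimated GFR in mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = creatinine_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age_years
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def dysfunction_category(egfr):
    """Severity bands as defined in the study above."""
    if egfr < 30:
        return "severe"
    if egfr < 60:
        return "moderate"
    if egfr < 90:
        return "mild"
    return "normal"
```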

Relevance: 100.00%

Abstract:

The southernmost European natural and planted pine forests are among the most vulnerable areas to warming-induced drought decline. Both drought stress and management factors (e.g., stand origin or reduced thinning) may induce decline by reducing the water available to trees, but their relative importance has not been properly assessed. The role of stand origin - densely planted vs. naturally regenerated stands - as a decline driver can be assessed by comparing the growth and vigor responses to drought of similar natural vs. planted stands. Here, we compare these responses in natural and planted Black pine (Pinus nigra) stands located in southern Spain. We analyze how environmental factors - climatic (temperature and precipitation anomalies) and site conditions - and biotic factors - stand structure (age, tree size, density) and defoliation by the pine processionary moth - drive radial growth and crown condition at stand and tree levels. We also assess the climatic trends in the study area over the last 60 years. We use dendrochronology, linear mixed-effects models of basal area increment and structural equation models to determine how natural and planted stands respond to drought and current competition intensity. We observed that a temperature rise and a decrease in precipitation during the growing period led to increasing drought stress during the late 20th century. Trees from planted stands experienced stronger growth reductions and displayed more severe crown defoliation after severe droughts than those from natural stands. High stand density negatively drove growth and enhanced crown dieback, particularly in planted stands. Pine processionary moth defoliation affected growth more severely in natural than in planted stands but affected tree crown condition similarly in both stand types.
In response to drought, sharp growth reduction and widespread defoliation of planted Mediterranean pine stands indicate that they are more vulnerable and less resilient to drought stress than natural stands. To mitigate forest decline of planted stands in xeric areas such as the Mediterranean Basin, less dense and more diverse stands should be created through selective thinning or by selecting species or provenances that are more drought tolerant.
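Basal area increment, the growth variable modeled above, is the year-on-year change in the cross-sectional area of the stem computed from ring-width-derived radii:

```python
import math

def basal_area_increment(radius_prev_cm, radius_now_cm):
    """Annual basal area increment (cm^2) from consecutive stem radii."""
    return math.pi * (radius_now_cm ** 2 - radius_prev_cm ** 2)

# A stem growing from 10 cm to 11 cm radius adds 21*pi (about 66) cm^2 of basal area.
bai = basal_area_increment(10.0, 11.0)
```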

Relevance: 100.00%

Abstract:

BACKGROUND: Risk factors and outcomes of bronchial stricture after lung transplantation are not well defined. An association between acute rejection and development of stricture has been suggested in small case series. We evaluated this relationship using a large national registry. METHODS: All lung transplantations between April 1994 and December 2008 per the United Network for Organ Sharing (UNOS) database were analyzed. Generalized linear models were used to determine the association between early rejection and development of stricture after adjusting for potential confounders. The association of stricture with postoperative lung function and overall survival was also evaluated. RESULTS: Nine thousand three hundred thirty-five patients were included for analysis. The incidence of stricture was 11.5% (1,077/9,335), with no significant change in incidence during the study period (P=0.13). Early rejection was associated with a significantly greater incidence of stricture (adjusted odds ratio [AOR], 1.40; 95% confidence interval [CI], 1.22-1.61; p<0.0001). Male sex, restrictive lung disease, and pretransplantation requirement for hospitalization were also associated with stricture. Those who experienced stricture had a lower postoperative peak percent predicted forced expiratory volume in 1 second (FEV1) (median 74% versus 86% for bilateral transplants only; p<0.0001), shorter unadjusted survival (median 6.09 versus 6.82 years; p<0.001) and increased risk of death after adjusting for potential confounders (adjusted hazard ratio 1.13; 95% CI, 1.03-1.23; p=0.007). CONCLUSIONS: Early rejection is associated with an increased incidence of stricture. Recipients with stricture demonstrate worse postoperative lung function and survival. Prospective studies may be warranted to further assess causality and the potential for coordinated rejection and stricture surveillance strategies to improve postoperative outcomes.

Relevance: 100.00%

Abstract:

We examined outcomes and trends in surgery and radiation use for patients with locally advanced esophageal cancer, for whom optimal treatment is not clear. Trends in surgery and radiation for patients with T1-T3N1M0 squamous cell or adenocarcinoma of the mid or distal esophagus in the Surveillance, Epidemiology, and End Results database from 1998 to 2008 were analyzed using generalized linear models including year as predictor; Surveillance, Epidemiology, and End Results does not record chemotherapy data. Local treatment was unimodal if patients had only surgery or radiation and bimodal if they had both. Five-year cancer-specific survival (CSS) and overall survival (OS) were analyzed using propensity-score adjusted Cox proportional-hazard models. Overall 5-year survival for the 3295 patients identified (mean age 65.1 years, standard deviation 11.0) was 18.9% (95% confidence interval: 17.3-20.7). Local treatment was bimodal for 1274 (38.7%) and unimodal for 2021 (61.3%) patients; 1325 (40.2%) had radiation alone and 696 (21.1%) underwent only surgery. The use of bimodal therapy (32.8-42.5%, P = 0.01) and radiation alone (29.3-44.5%, P < 0.001) increased significantly from 1998 to 2008. Bimodal therapy predicted improved CSS (hazard ratios [HR]: 0.68, P < 0.001) and OS (HR: 0.58, P < 0.001) compared with unimodal therapy. For the first 7 months (before survival curve crossing), CSS after radiation therapy alone was similar to surgery alone (HR: 0.86, P = 0.12) while OS was worse for surgery only (HR: 0.70, P = 0.001). However, worse CSS (HR: 1.43, P < 0.001) and OS (HR: 1.46, P < 0.001) after that initial timeframe were found for radiation therapy only. The use of radiation to treat locally advanced mid and distal esophageal cancers increased from 1998 to 2008. Survival was best when both surgery and radiation were used.
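Under the proportional-hazards assumption of the Cox models above, a hazard ratio rescales the baseline survival curve as S1(t) = S0(t)**HR. A sketch (the baseline survival value is hypothetical):

```python
def survival_under_hazard_ratio(baseline_survival, hazard_ratio):
    """Survival probability implied by a proportional hazard ratio: S0(t)**HR."""
    return baseline_survival ** hazard_ratio

# With a hypothetical baseline 5-year survival of 50%, the reported HR of 0.58
# for bimodal therapy implies 0.5**0.58, roughly 0.67, i.e. a survival benefit.
s_bimodal = survival_under_hazard_ratio(0.5, 0.58)
```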

Relevance: 100.00%

Abstract:

The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and all solid organs. Then, different general non-linear risk models were applied to the whole body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underscoring the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially considering the different approaches to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which then translates into inconsistent results on the potential higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated even though the magnitude of this reduction varied substantially with the different approaches investigated.
Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy) as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
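The organ-specific linear risk models mentioned above (ICRP and BEIR VII style) reduce to summing, over organs, the mean organ dose times an organ-specific risk coefficient. A sketch with entirely hypothetical doses and coefficients, not values from either report:

```python
def linear_secondary_cancer_risk(organ_doses_gy, risk_per_gy):
    """Total excess risk as the sum over organs of mean dose * linear risk coefficient."""
    return sum(dose * risk_per_gy[organ] for organ, dose in organ_doses_gy.items())

# Hypothetical mean organ doses (Gy) and linear risk coefficients (per Gy):
doses = {"lung": 2.0, "contralateral_breast": 1.0}
coeffs = {"lung": 0.01, "contralateral_breast": 0.005}
risk = linear_secondary_cancer_risk(doses, coeffs)  # -> 0.025
```

The non-linear models discussed above replace the per-organ product with a dose-response function, but the per-organ summation structure is the same.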

Relevance: 100.00%

Abstract:

BACKGROUND Estimating the prevalence of comorbidities and their associated costs in patients with diabetes is fundamental to optimizing health care management. This study assesses the prevalence and health care costs of comorbid conditions among patients with diabetes compared with patients without diabetes. Distinguishing potentially diabetes- and nondiabetes-related comorbidities in patients with diabetes, we also determined the most frequent chronic conditions and estimated their effect on costs across different health care settings in Switzerland. METHODS Using health care claims data from 2011, we calculated the prevalence and average health care costs of comorbidities among patients with and without diabetes in inpatient and outpatient settings. Patients with diabetes and comorbid conditions were identified using pharmacy-based cost groups. Generalized linear models with negative binomial distribution were used to analyze the effect of comorbidities on health care costs. RESULTS A total of 932,612 persons, including 50,751 patients with diabetes, were enrolled. The most frequent potentially diabetes- and nondiabetes-related comorbidities in patients older than 64 years were cardiovascular diseases (91%), rheumatologic conditions (55%), and hyperlipidemia (53%). The mean total health care costs for diabetes patients varied substantially by comorbidity status (US$3,203-$14,223). Patients with diabetes and more than two comorbidities incurred US$10,584 higher total costs than patients without comorbidity. Costs were significantly higher in patients with diabetes and comorbid cardiovascular disease (US$4,788), hyperlipidemia (US$2,163), hyperacidity disorders (US$8,753), and pain (US$8,324) compared with those without the given disease. CONCLUSION Comorbidities in patients with diabetes are highly prevalent and have substantial consequences for medical expenditures. Interestingly, hyperacidity disorders and pain were the most costly conditions.
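Cost data are right-skewed and overdispersed, which is why the study fits generalized linear models with a negative binomial distribution: unlike the Poisson, its variance grows quadratically with the mean. A sketch of the mean-variance relationship (the values are hypothetical):

```python
def poisson_variance(mu):
    """Poisson: variance equals the mean."""
    return mu

def negative_binomial_variance(mu, alpha):
    """NB2 parameterization: variance = mu + alpha * mu**2 (alpha > 0 means overdispersion)."""
    return mu + alpha * mu ** 2

# Hypothetical mean cost index of 10 with dispersion alpha = 0.5:
# the NB variance (60) far exceeds the Poisson variance (10).
overdispersed = negative_binomial_variance(10.0, 0.5) > poisson_variance(10.0)
```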
Our findings highlight the importance of developing strategies that meet the needs of patients with diabetes and comorbidities. Integrated diabetes care such as used in the Chronic Care Model may represent a useful strategy.

Relevance: 100.00%

Abstract:

OBJECTIVE To determine the biomechanical effect of an intervertebral spacer on construct stiffness in a PVC model and cadaveric canine cervical vertebral columns stabilized with monocortical screws/polymethylmethacrylate (PMMA). STUDY DESIGN Biomechanical study. SAMPLE POPULATION PVC pipe; cadaveric canine vertebral columns. METHODS PVC model: PVC pipe was used to create a gap model mimicking vertebral endplate orientation and disk space width of large-breed canine cervical vertebrae; 6 models had a 4-mm gap with no spacer (PVC group 1); 6 had a PVC pipe ring spacer filling the gap (PVC group 2). Animals: large-breed cadaveric canine cervical vertebral columns (C2-C7) from skeletally mature dogs without (cadaveric group 1, n = 6, historical data) and with an intervertebral disk spacer (cadaveric group 2, n = 6) were used. All PVC models and cadaver specimens were instrumented with monocortical titanium screws/PMMA. Stiffness of the 2 PVC groups was compared in extension, flexion, and lateral bending using non-destructive 4-point bend testing. Stiffness testing in all 3 directions was performed on the unaltered C4-C5 vertebral motion unit in cadaveric spines and repeated after placement of an intervertebral cortical allograft ring and instrumentation. Data were compared using a linear mixed model approach that also incorporated data from previously tested spines with the same screw/PMMA construct but without disk spacer (cadaveric group 1). RESULTS Addition of a spacer increased construct stiffness in both the PVC model (P < .001) and cadaveric vertebral columns (P < .001) compared to fixation without a spacer. CONCLUSIONS Addition of an intervertebral spacer significantly increased construct stiffness of monocortical screw/PMMA fixation.

Relevance: 100.00%

Abstract:

Background: A small pond, c. 90 years old, near Bern, Switzerland contains a population of threespine stickleback (Gasterosteus aculeatus) with two distinct male phenotypes. Males of one type are large and red and nest in the shallow littoral zone. Males of the other type are small and orange and nest offshore at slightly greater depth. The females in this population are phenotypically highly variable but cannot easily be assigned to either male type. Question: Is the existence of two sympatric male morphs maintained by substrate-associated male nest site choice and facilitated by female mate preferences? Organisms: Male stickleback caught individually at their breeding sites. Females caught with minnow traps. Methods: In experimental tanks, we simulated the slope and substrate of the two nesting habitats. We then placed individual males in a tank and observed in which habitat the male would build his nest. In a simultaneous two-stimulus choice design, we gave females the choice between a large, red male and a small, orange one. We measured female morphology and used linear mixed-effects models to determine whether female preference correlated with female morphology. Results: Both red and orange males preferred nesting in the habitat that simulated the slightly deeper offshore condition. This is the habitat occupied by the small, orange males in the pond itself. The proportion of females that chose a small orange male was similar to that which chose a large red male. Several aspects of female phenotype correlated with the male type that a female preferred.

Relevance: 100.00%

Abstract:

Aim Our aims were to compare the composition of testate amoeba (TA) communities from Santa Cruz Island, Galápagos Archipelago, which are likely in existence only as a result of anthropogenic habitat transformation, with similar naturally occurring communities from northern and southern continental peatlands. Additionally, we aimed to assess the importance of niche-based and dispersal-based processes in determining community composition and taxonomic and functional diversity. Location The humid highlands of the central island of Santa Cruz, Galápagos Archipelago. Methods We survey the alpha, beta and gamma taxonomic and functional diversities of TA, and the changes in functional traits along a gradient of wet to dry habitats. We compare the TA community composition, abundance and frequency recorded in the insular peatlands with that recorded in continental peatlands of Northern and Southern Hemispheres. We use generalized linear models to determine how environmental conditions influence taxonomic and functional diversity as well as the mean values of functional traits within communities. We finally apply variance partitioning to assess the relative importance of niche- and dispersal-based processes in determining community composition. Results TA communities in Santa Cruz Island were different from their Northern Hemisphere and South American counterparts, with most genera considered characteristic of Northern Hemisphere and South American Sphagnum peatlands missing or very rare in the Galápagos. Functional traits were most strongly correlated with elevation and site topography, and alpha functional diversity with the type of material sampled and site topography. Community composition was more strongly correlated with spatial variables than with environmental ones. Main conclusions TA communities of the Sphagnum peatlands of Santa Cruz Island and the mechanisms shaping these communities contrast with Northern Hemisphere and South American peatlands.
Soil moisture was not a strong predictor of community composition most likely because rainfall and clouds provide sufficient moisture. Dispersal limitation was more important than environmental filtering because of the isolation of the insular peatlands from continental ones and the young ecological history of these ecosystems.
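The variance partitioning applied above decomposes the explained community variation into a pure environmental fraction, a pure spatial fraction, and their shared overlap, from three (partial) model fits. A minimal sketch with hypothetical R² values, not those of the study:

```python
def partition_variation(r2_env, r2_space, r2_both):
    """Classic two-set variance partitioning.

    r2_env:   variation explained by environmental variables alone
    r2_space: variation explained by spatial variables alone
    r2_both:  variation explained by both variable sets combined
    """
    pure_env = r2_both - r2_space
    pure_space = r2_both - r2_env
    shared = r2_env + r2_space - r2_both
    unexplained = 1.0 - r2_both
    return {"env": pure_env, "space": pure_space,
            "shared": shared, "unexplained": unexplained}

# Hypothetical fits in which space explains more than environment, as found above:
parts = partition_variation(r2_env=0.40, r2_space=0.50, r2_both=0.70)
```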