922 results for Generalized Linear Model
Abstract:
Liver tissue was collected from eight randomly selected dairy cows at a slaughterhouse to test whether gene expression of pyruvate carboxylase (PC), mitochondrial phosphoenolpyruvate carboxykinase (PEPCKm) and cytosolic phosphoenolpyruvate carboxykinase (PEPCKc) differs among locations in the liver. The liver samples were analysed for mRNA expression levels of PC, PEPCKc and PEPCKm and subjected to the MIXED procedure of SAS to test for an effect of sampling location, with cow liver as the repeated subject. Additionally, the general linear model (GLM) procedure for analysis of variance was applied to test for significant differences in mRNA abundance of PEPCKm, PEPCKc and PC between the livers. In conclusion, this study demonstrated that mRNA abundance of PC, PEPCKc and PEPCKm does not differ between locations in the liver but may differ between individual cows.
Abstract:
OBJECTIVE To assess trends in the frequency of concomitant vascular reconstructions (VRs) from 2000 through 2009 among patients who underwent pancreatectomy, as well as to compare the short-term outcomes between patients who underwent pancreatic resection with and without VR. DESIGN Single-center series have been conducted to evaluate the short-term and long-term outcomes of VR during pancreatic resection. However, its effectiveness from a population-based perspective is still unknown. Unadjusted, multivariable, and propensity score-adjusted generalized linear models were fitted. SETTING Nationwide Inpatient Sample from 2000 through 2009. PATIENTS A total of 10 206 patients were involved. MAIN OUTCOME MEASURES Incidence of VR during pancreatic resection, perioperative in-hospital complications, and length of hospital stay. RESULTS Overall, 10 206 patients were included in this analysis. Of these, 412 patients (4.0%) underwent VR, with the rate increasing from 0.7% in 2000 to 6.0% in 2009 (P < .001). Patients who underwent pancreatic resection with VR were at a higher risk for intraoperative (propensity score-adjusted odds ratio, 1.94; P = .001) and postoperative (propensity score-adjusted odds ratio, 1.36; P = .008) complications, while the mortality and median length of hospital stay were similar to those of patients without VR. Among the 25% of hospitals with the highest surgical volume, patients who underwent pancreatic surgery with VR had significantly higher rates of postoperative complications and mortality than patients without VR. CONCLUSIONS The frequency of VR during pancreatic surgery is increasing in the United States. In contrast with most single-center analyses, this population-based study demonstrated that patients who underwent VR during pancreatic surgery had higher rates of adverse postoperative outcomes than their counterparts who underwent pancreatic resection only.
Prospective studies incorporating long-term outcomes are warranted to further define which patients benefit from VR.
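A minimal sketch of the kind of propensity-score adjustment described above, assuming a logistic propensity model for the treatment (here, VR) whose fitted score then enters a linear outcome model as a covariate. The function names and the toy two-step setup are hypothetical; the study's actual models are far richer.

```python
import numpy as np

def logit_irls(X, y, n_iter=25):
    """Logistic regression by iteratively reweighted least squares
    (Newton-Raphson). X must include an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)
        # Newton step: beta += (X' W X)^{-1} X' (y - p)
        beta += np.linalg.solve((X * w[:, None]).T @ X, X.T @ (y - p))
    return beta

def ps_adjusted_effect(treat, covars, outcome):
    """Treatment effect adjusted for the propensity score: estimate
    P(treat | covars) by logistic regression, then include the fitted
    score as a covariate in a linear outcome model."""
    Xp = np.column_stack([np.ones(len(treat)), covars])
    ps = 1.0 / (1.0 + np.exp(-Xp @ logit_irls(Xp, treat)))
    Xo = np.column_stack([np.ones(len(treat)), treat, ps])
    coef, *_ = np.linalg.lstsq(Xo, outcome, rcond=None)
    return coef[1]  # coefficient on the treatment indicator
```

For binary endpoints such as in-hospital complications, the linear outcome model would be swapped for a logistic one, in line with the generalized linear models the abstract mentions.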
Abstract:
Generalized linear mixed models (GLMM) are generalized linear models with normally distributed random effects in the linear predictor. Penalized quasi-likelihood (PQL), an approximate method of inference in GLMMs, involves repeated fitting of linear mixed models with “working” dependent variables and iterative weights that depend on parameter estimates from the previous cycle of iteration. The generality of PQL, and its implementation in commercially available software, has encouraged the application of GLMMs in many scientific fields. Caution is needed, however, since PQL may sometimes yield badly biased estimates of variance components, especially with binary outcomes. Recent developments in numerical integration, including adaptive Gaussian quadrature, higher order Laplace expansions, stochastic integration and Markov chain Monte Carlo (MCMC) algorithms, provide attractive alternatives to PQL for approximate likelihood inference in GLMMs. Analyses of some well known datasets, and simulations based on these analyses, suggest that PQL still performs remarkably well in comparison with more elaborate procedures in many practical situations. Adaptive Gaussian quadrature is a viable alternative for nested designs where the numerical integration is limited to a small number of dimensions. Higher order Laplace approximations hold the promise of accurate inference more generally. MCMC is likely the method of choice for the most complex problems that involve high dimensional integrals.
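As a minimal illustration of the quadrature alternatives to PQL mentioned above, the following sketch approximates one cluster's marginal likelihood in a random-intercept logistic GLMM by (non-adaptive) Gauss-Hermite quadrature. The function name and the single-random-effect setup are assumptions for this example, not code from the paper.

```python
import numpy as np

def glmm_cluster_loglik(y, eta_fixed, sigma, n_quad=20):
    """Marginal log-likelihood of one cluster in a random-intercept
    logistic GLMM, integrating the intercept out by Gauss-Hermite
    quadrature. With the substitution b = sqrt(2)*sigma*t, the
    N(0, sigma^2) integral becomes sum_i w_i * f(b_i) / sqrt(pi)."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    b = np.sqrt(2.0) * sigma * nodes               # intercept values at the nodes
    eta = eta_fixed[:, None] + b[None, :]          # linear predictor, obs x node
    p = 1.0 / (1.0 + np.exp(-eta))
    # Bernoulli log-likelihood of the whole cluster at each node
    loglik = (y[:, None] * np.log(p) + (1.0 - y[:, None]) * np.log1p(-p)).sum(axis=0)
    return np.log(weights @ np.exp(loglik) / np.sqrt(np.pi))
```

Adaptive quadrature would additionally recenter and rescale the nodes around each cluster's posterior mode, which is what makes it accurate with few nodes in nested designs.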
Abstract:
Searching for the neural correlates of visuospatial processing using functional magnetic resonance imaging (fMRI) is usually done in an event-related framework of cognitive subtraction, applying a paradigm comprising visuospatial cognitive components and a corresponding control task. Besides methodological caveats of the cognitive subtraction approach, the standard general linear model with fixed hemodynamic response predictors bears the risk of being underspecified. It does not take into account the variability of the blood oxygen level-dependent signal response due to variable task demand and performance on the level of each single trial. This underspecification may result in reduced sensitivity regarding the identification of task-related brain regions. In a rapid event-related fMRI study, we used an extended general linear model including single-trial reaction-time-dependent hemodynamic response predictors for the analysis of an angle discrimination task. In addition to the already known regions in superior and inferior parietal lobule, mapping the reaction-time-dependent hemodynamic response predictor revealed a more specific network including task demand-dependent regions not being detectable using the cognitive subtraction method, such as bilateral caudate nucleus and insula, right inferior frontal gyrus and left precentral gyrus.
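A hedged sketch of the kind of single-trial reaction-time-dependent regressor described above (not the authors' actual pipeline): scale each trial's stick function by its mean-centered RT, then convolve with a canonical hemodynamic response. The HRF parameters below are illustrative placeholders.

```python
import numpy as np

def double_gamma_hrf(tr=1.0, duration=30.0):
    """Canonical-style double-gamma HRF sampled at the scan TR
    (illustrative shape parameters, peak near 5 s, undershoot near 15 s)."""
    t = np.arange(0.0, duration, tr)
    peak = (t ** 5) * np.exp(-t)
    under = (t ** 15) * np.exp(-t)
    h = peak / peak.max() - 0.35 * under / under.max()
    return h / h.sum()

def rt_modulated_regressor(n_scans, onsets, rts, tr=1.0):
    """Stick functions scaled by mean-centered RT, convolved with the HRF.
    Mean-centering keeps this parametric regressor orthogonal to a
    constant-amplitude task regressor."""
    sticks = np.zeros(n_scans)
    rts = np.asarray(rts, dtype=float)
    centered = rts - rts.mean()
    for onset, amp in zip(onsets, centered):
        sticks[int(round(onset / tr))] += amp
    return np.convolve(sticks, double_gamma_hrf(tr))[:n_scans]
```

In an extended GLM this column sits alongside the fixed-amplitude predictors, so voxels whose response scales with trial-by-trial task demand load on it specifically.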
Abstract:
This paper considers a wide class of semiparametric problems with a parametric part for some covariate effects and repeated evaluations of a nonparametric function. Special cases in our approach include marginal models for longitudinal/clustered data, conditional logistic regression for matched case-control studies, multivariate measurement error models, generalized linear mixed models with a semiparametric component, and many others. We propose profile-kernel and backfitting estimation methods for these problems, derive their asymptotic distributions, and show that in likelihood problems the methods are semiparametric efficient. While generally not true, with our methods profiling and backfitting are asymptotically equivalent. We also consider pseudolikelihood methods where some nuisance parameters are estimated from a different algorithm. The proposed methods are evaluated using simulation studies and applied to the Kenya hemoglobin data.
Abstract:
An optimal multiple testing procedure is identified for linear hypotheses under the general linear model, maximizing the expected number of false null hypotheses rejected at any significance level. The optimal procedure depends on the unknown data-generating distribution, but can be consistently estimated. Drawing information together across many hypotheses, the estimated optimal procedure provides an empirical alternative hypothesis by adapting to underlying patterns of departure from the null. Proposed multiple testing procedures based on the empirical alternative are evaluated through simulations and an application to gene expression microarray data. Compared to a standard multiple testing procedure, it is not unusual for use of an empirical alternative hypothesis to increase by 50% or more the number of true positives identified at a given significance level.
Abstract:
In environmental epidemiology, exposure X and health outcome Y vary in space and time. We present a method to diagnose the possible influence of unmeasured confounders U on the estimated effect of X on Y and propose several approaches to robust estimation. The idea is to use space and time as proxy measures for the unmeasured factors U. We start with the time series case, where X and Y are continuous variables at equally spaced times, and assume a linear model. We define matching estimators b(u) that correspond to pairs of observations with specific lag u. Controlling for a smooth function of time, St, using a kernel estimator is roughly equivalent to estimating the association with a linear combination of the b(u), with weights that involve two components: the assumptions about the smoothness of St and the normalized variogram of the X process. When an unmeasured confounder U exists, but the model otherwise correctly controls for measured confounders, excess variation in the b(u) is evidence of confounding by U. We use the plot of b(u) versus lag u, the lagged-estimator plot (LEP), to diagnose the influence of U on the effect of X on Y. We use appropriate linear combinations of the b(u), or extrapolate to b(0), to obtain novel estimators that are more robust to the influence of smooth U. The methods are extended to time series log-linear models and to spatial analyses. The LEP gives a direct view of the magnitude of the estimators at each lag u and provides evidence when the model does not adequately describe the data.
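One plausible form of the matching estimators b(u) — assumed here for illustration, not taken verbatim from the paper — regresses the lag-u differences of Y on the lag-u differences of X; plotting b(u) against u then gives the lagged-estimator plot.

```python
import numpy as np

def lagged_estimators(x, y, max_lag):
    """Difference-based slope estimates b(u): for each lag u, regress the
    lag-u differences of y on the lag-u differences of x (no intercept,
    since differencing removes level shifts)."""
    b = {}
    for u in range(1, max_lag + 1):
        dx = x[u:] - x[:-u]
        dy = y[u:] - y[:-u]
        b[u] = np.dot(dx, dy) / np.dot(dx, dx)
    return b
```

If a smooth unmeasured confounder is present, short-lag differences remove more of it than long-lag differences, so b(u) drifts with u; a flat b(u) profile is consistent with no smooth confounding.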
Abstract:
Mild cognitive impairment (MCI) often refers to the preclinical stage of dementia, where the majority develop Alzheimer's disease (AD). Given that neurodegenerative burden and compensatory mechanisms might exist before accepted clinical symptoms of AD are noticeable, the current prospective study aimed to investigate the functioning of brain regions in the visuospatial networks responsible for preclinical symptoms in AD using event-related functional magnetic resonance imaging (fMRI). Eighteen MCI patients were evaluated and clinically followed for approximately 3 years. Five progressed to AD (PMCI) and eight remained stable (SMCI). Thirteen age-, gender- and education-matched controls also participated. An angle discrimination task with varying task demands was used. Brain activation patterns as well as task demand-dependent and -independent signal changes between the groups were investigated by using an extended general linear model including individual performance (reaction time [RT]) of each single trial. Similar behavioral (RT and accuracy) responses were observed between MCI patients and controls. A network of bilateral activations, e.g. dorsal pathway, which increased linearly with increasing task demand, was engaged in all subjects. Compared with SMCI patients and controls, PMCI patients showed a stronger relation between task demand and brain activity in left superior parietal lobules (SPL) as well as a general task demand-independent increased activation in left precuneus. Altered brain function can be detected at a group level in individuals that progress to AD before changes occur at the behavioral level. Increased parietal activation in PMCI could reflect a reduced neuronal efficacy due to accumulating AD pathology and might predict future clinical decline in patients with MCI.
Abstract:
Neural correlates of electroencephalographic (EEG) alpha rhythm are poorly understood. Here, we related EEG alpha rhythm in awake humans to blood-oxygen-level-dependent (BOLD) signal change determined by functional magnetic resonance imaging (fMRI). Topographical EEG was recorded simultaneously with fMRI during an open versus closed eyes and an auditory stimulation versus silence condition. EEG was separated into spatial components of maximal temporal independence using independent component analysis. Alpha component amplitudes and stimulus conditions served as general linear model regressors of the fMRI signal time course. In both paradigms, EEG alpha component amplitudes were associated with BOLD signal decreases in occipital areas, but not in thalamus, when a standard BOLD response curve (maximum effect at approximately 6 s) was assumed. The part of the alpha regressor independent of the protocol condition, however, revealed significant positive thalamic and mesencephalic correlations with a mean time delay of approximately 2.5 s between EEG and BOLD signals. The inverse relationship between EEG alpha amplitude and BOLD signals in primary and secondary visual areas suggests that widespread thalamocortical synchronization is associated with decreased brain metabolism. While the temporal relationship of this association is consistent with metabolic changes occurring simultaneously with changes in the alpha rhythm, sites in the medial thalamus and in the anterior midbrain were found to correlate with short time lag. Assuming a canonical hemodynamic response function, this finding is indicative of activity preceding the actual EEG change by some seconds.
Abstract:
Background: The goal of this study was to determine whether site-specific differences in the subgingival microbiota could be detected by the checkerboard method in subjects with periodontitis. Methods: Subjects with at least six periodontal pockets with a probing depth (PD) between 5 and 7 mm were enrolled in the study. Subgingival plaque samples were collected with sterile curets by a single-stroke procedure at six selected periodontal sites from 161 subjects (966 subgingival sites). Subgingival bacterial samples were assayed with the checkerboard DNA-DNA hybridization method identifying 37 species. Results: Probing depths of 5, 6, and 7 mm were found at 50% (n = 483), 34% (n = 328), and 16% (n = 155) of sites, respectively. Statistical analysis failed to demonstrate differences in the sum of bacterial counts by tooth type (P = 0.18) or specific location of the sample (P = 0.78). With the exceptions of Campylobacter gracilis (P < 0.001) and Actinomyces naeslundii (P < 0.001), analysis by general linear model multivariate regression failed to identify subject or sample location factors as explanatory to microbiologic results. A trend of difference in bacterial load by tooth type was found for Prevotella nigrescens (P < 0.01). At a cutoff level of ≥1.0 × 10^5, Porphyromonas gingivalis and Tannerella forsythia (previously T. forsythensis) were present at 48.0% to 56.3% and 46.0% to 51.2% of sampled sites, respectively. Conclusions: Given the similarities in the clinical evidence of periodontitis, the presence and levels of 37 species commonly studied in periodontitis are similar, with no differences between molar, premolar, and incisor/cuspid subgingival sites. This may facilitate microbiologic sampling strategies in subjects during periodontal therapy.
Abstract:
The flammability zone boundaries are important properties for preventing explosions in the process industries. Within the boundaries, a flame or explosion can occur, so understanding these boundaries is essential to preventing fires and explosions. Very little work has been reported in the literature on modeling the flammability zone boundaries. Two boundaries are defined and studied: the upper flammability zone boundary and the lower flammability zone boundary. Three methods are presented to predict them: (1) the linear model, (2) the extended linear model, and (3) an empirical model. The linear model is a thermodynamic model that uses the upper flammability limit (UFL) and lower flammability limit (LFL) to calculate two adiabatic flame temperatures. When the proper assumptions are applied, the linear model can be reduced to the well-known equation yLOC = z·yLFL for estimating the limiting oxygen concentration. The extended linear model attempts to account for the changes in the reactions along the UFL boundary. Finally, the empirical method fits the boundaries with linear equations between the UFL or LFL and the intercept with the oxygen axis. Comparison of the models to experimental data of the flammability zone shows that the best model for estimating the flammability zone boundaries is the empirical method. It fits the limiting oxygen concentration (LOC), upper oxygen limit (UOL), and lower oxygen limit (LOL) quite well; the regression coefficient values for the fits to the LOC, UOL, and LOL are 0.672, 0.968, and 0.959, respectively. This is better than the fit of the zyLFL method for the LOC, for which the regression coefficient is 0.416.
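The reduced linear-model estimate yLOC = z·yLFL is simple enough to state directly in code. The methane numbers below are standard textbook illustrative values, not data from this work.

```python
def limiting_oxygen_concentration(z, y_lfl):
    """Estimate the LOC from the reduced linear model y_LOC = z * y_LFL,
    where z is the stoichiometric oxygen coefficient of the fuel's
    combustion reaction and y_LFL is the lower flammability limit
    (mole fraction)."""
    return z * y_lfl

# Example: methane, CH4 + 2 O2 -> CO2 + 2 H2O, so z = 2; LFL about 5 vol%
loc = limiting_oxygen_concentration(2.0, 0.05)  # about 0.10, i.e. ~10 vol% O2
```

The abstract's regression coefficients quantify how rough this estimate is in practice (0.416 for zyLFL against the LOC data, versus 0.672 for the empirical fit).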
Abstract:
Four papers, written in collaboration with the author’s graduate school advisor, are presented. In the first paper, uniform and non-uniform Berry-Esseen (BE) bounds on the convergence to normality of a general class of nonlinear statistics are provided; novel applications to specific statistics, including the non-central Student’s, Pearson’s, and the non-central Hotelling’s, are also stated. In the second paper, a BE bound on the rate of convergence of the F-statistic used in testing hypotheses from a general linear model is given. The third paper considers the asymptotic relative efficiency (ARE) between the Pearson, Spearman, and Kendall correlation statistics; conditions sufficient to ensure that the Spearman and Kendall statistics are equally (asymptotically) efficient are provided, and several models are considered which illustrate the use of such conditions. Lastly, the fourth paper proves that, in the bivariate normal model, the ARE between any of these correlation statistics possesses certain monotonicity properties; quadratic lower and upper bounds on the ARE are stated as direct applications of such monotonicity patterns.
Abstract:
The need for a stronger and more durable building material is becoming more important as the structural engineering field expands and challenges the behavioral limits of current materials. One of the demands for stronger material is rooted in the effects that dynamic loading has on a structure. High strain rates on the order of 10^1 s^-1 to 10^3 s^-1, though a small part of the overall range of loading rates (anywhere from 10^-8 s^-1 to 10^4 s^-1 at any point in a structure's life), have very important effects when considering dynamic loading on a structure. High strain rates such as these can cause the material and structure to behave differently than at slower strain rates, which necessitates testing materials under such loading to understand their behavior. Ultra high performance concrete (UHPC), a relatively new material in the U.S. construction industry, exhibits many enhanced strength and durability properties compared to standard normal strength concrete. However, the use of this material for high strain rate applications requires an understanding of UHPC's dynamic properties under corresponding loads. One such dynamic property is the increase in compressive strength under high strain rate load conditions, quantified as the dynamic increase factor (DIF). This factor allows a designer to relate the dynamic compressive strength back to the static compressive strength, which generally is a well-established property. Previous research establishes the relationships for the concept of DIF in design. The generally accepted methodology for obtaining high strain rates to study the enhanced behavior of compressive material strength is the split Hopkinson pressure bar (SHPB). In this research, 83 Cor-Tuf UHPC specimens were tested in dynamic compression using a SHPB at Michigan Technological University.
The specimens were separated into two categories, ambient cured and thermally treated, with aspect ratios of 0.5:1, 1:1, and 2:1 within each category. There was statistically no significant difference in mean DIF for the aspect ratios and cure regimes considered in this study; DIFs ranged from 1.85 to 2.09. Failure modes were observed to be mostly Type 2, Type 4, or combinations thereof for all specimen aspect ratios when classified according to ASTM C39 fracture pattern guidelines. The Comité Euro-International du Béton (CEB) model for DIF versus strain rate does not accurately predict the DIF for the UHPC data gathered in this study. Additionally, a measurement system analysis was conducted to observe variance within the measurement system, and a general linear model analysis was performed to examine the interaction and main effects that aspect ratio, cannon pressure, and cure method have on the maximum dynamic stress.
Abstract:
In reverse logistics networks, products (e.g., bottles or containers) have to be transported from a depot to customer locations and, after use, from customer locations back to the depot. To operate economically, companies prefer a simultaneous delivery and pick-up service. The resulting Vehicle Routing Problem with Simultaneous Delivery and Pick-up (VRPSDP) is an operational problem that many companies must solve daily. We present two mixed-integer linear model formulations for the VRPSDP, namely a vehicle-flow and a commodity-flow model. To strengthen the models, domain-reducing preprocessing techniques and effective cutting planes are outlined. Symmetric benchmark instances known from the literature as well as new asymmetric instances derived from real-world problems are solved to optimality using CPLEX 12.1.
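To make the problem concrete, here is a toy single-vehicle VRPSDP solved by exhaustive search. It only illustrates the load-feasibility constraint under simultaneous delivery and pick-up; it is not the paper's vehicle-flow or commodity-flow MILP formulation, and it scales to a handful of customers at most.

```python
import itertools

def vrpsdp_single_route(dist, deliver, pickup, capacity):
    """Brute-force a single-vehicle VRPSDP: cheapest depot-to-depot tour
    (node 0 is the depot, customers are 1..n) serving each customer once,
    with the vehicle load staying within capacity as deliveries are
    dropped off and pick-ups accumulate."""
    n = len(deliver)
    best_cost, best_route = float("inf"), None
    for perm in itertools.permutations(range(1, n + 1)):
        load = sum(deliver)                         # leave depot carrying all deliveries
        if load > capacity:
            break                                   # no tour can be feasible
        feasible, cost, prev = True, 0.0, 0
        for c in perm:
            cost += dist[prev][c]
            load += pickup[c - 1] - deliver[c - 1]  # simultaneous delivery and pick-up
            if load > capacity:
                feasible = False
                break
            prev = c
        if feasible:
            cost += dist[prev][0]                   # return to depot
            if cost < best_cost:
                best_cost, best_route = cost, (0,) + perm + (0,)
    return best_cost, best_route
```

The MILP formulations in the paper encode exactly this load-propagation logic with flow variables and capacity constraints, which is what the cutting planes then tighten.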
Abstract:
BACKGROUND Microvascular anastomosis is the cornerstone of free tissue transfers. Irrespective of the microsurgical technique that one seeks to integrate or improve, the time commitment in the laboratory is significant. After extensive previous training on several animal models, we sought to identify an animal model that circumvents the following issues: ethical rules, cost, time-consuming and expensive anesthesia, and the surgical preparation of tissues required to access vessels before performing the microsurgical training, not to mention that laboratories are closed on weekends. METHODS Between January 2012 and April 2012, a total of 91 earthworms were used for 150 microsurgical training exercises to simulate vascular end-to-side microanastomosis. The training sessions were divided into ten periods of 7 days. Each training session included 15 simulations of end-to-side vascular microanastomoses: larger than 1.5 mm (n=5), between 1.0 and 1.5 mm (n=5), and smaller than 1.0 mm (n=5). A linear model with the main variables being the number of weeks (as a numerical covariate) and the size of the animal (as a factor) was used to determine the trend in anastomosis time over subsequent weeks as well as the differences between the size groups. RESULTS The linear model shows a significant trend (p<0.001) in anastomosis time over the course of the training, as well as significant differences (p<0.001) between the groups of animals of different sizes. For microanastomoses larger than 1.5 mm, the mean anastomosis time decreased from 19.3±1.0 to 11.1±0.4 min between the first and last week of training (a decrease of 42.5%). For training with smaller diameters, the results showed a decrease in execution time of 43.2% (diameter between 1.0 and 1.5 mm) and 40.9% (diameter<1.0 mm) between the first and last periods. The study demonstrates an improvement in dexterity and in the speed of knot tying.
CONCLUSION The earthworm appears to be a reliable experimental model for microsurgical training of end-to-side microanastomoses. Its numerous advantages are discussed here and we predict training on earthworms will significantly grow and develop in the near future. LEVEL OF EVIDENCE III This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .
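The linear model described in the earthworm study (week as a numeric covariate, animal size as a factor) can be sketched with dummy coding and ordinary least squares. This is an assumed reconstruction for illustration, not the authors' code.

```python
import numpy as np

def fit_time_model(weeks, size_group, times):
    """Ordinary-least-squares fit of anastomosis time on a numeric week
    covariate plus a dummy-coded size-group factor (the first level in
    sorted order serves as the reference category)."""
    weeks = np.asarray(weeks, dtype=float)
    levels = sorted(set(size_group))
    cols = [np.ones_like(weeks), weeks]
    for lev in levels[1:]:
        cols.append(np.array([1.0 if g == lev else 0.0 for g in size_group]))
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, np.asarray(times, dtype=float), rcond=None)
    names = ["intercept", "week_slope"] + [f"size[{lev}]" for lev in levels[1:]]
    return dict(zip(names, coef))
```

A negative week slope corresponds to the training effect the abstract reports, and the factor coefficients capture the offsets between vessel-size groups.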