8 results for POISSON REGRESSION APPROACH
at Duke University
Abstract:
OBJECTIVE: Bacterial colonization of the fetal membranes and its role in the pathogenesis of membrane rupture are poorly understood. Prior retrospective work revealed chorion layer thinning in preterm premature rupture of membranes (PPROM) subjects. Our objective was to prospectively examine fetal membrane chorion thinning and to correlate it with bacterial presence in PPROM, preterm, and term subjects. STUDY DESIGN: Paired membrane samples (membrane rupture site and membrane distant site) were prospectively collected from PPROM (n = 14), preterm labor (PTL, n = 8), preterm no labor (PTNL, n = 8), term labor (TL, n = 10), and term no labor (TNL, n = 8) subjects. Sections were probed for cytokeratin by immunohistochemistry to identify the fetal trophoblast layer of the chorion. Fluorescence in situ hybridization was performed using a broad-range 16S ribosomal RNA probe. Images were evaluated, chorion and choriodecidua were measured, and bacterial fluorescence was scored. Chorion thinning and bacterial presence were compared among and between groups using Student's t-test, a linear mixed effect model, and a Poisson regression model (SAS, Cary, NC). RESULTS: In all groups, the fetal chorion cellular layer was thinner at the rupture site compared to the distant site (147.2 vs. 253.7 µm, p<0.0001). Further, chorion thinning was greatest among PPROM subjects compared to all other groups combined, regardless of site sampled [PPROM (114.9) vs. PTL (246.0) vs. PTNL (200.8) vs. TL (217.9) vs. TNL (246.5)]. Bacterial counts were highest among PPROM subjects compared to all other groups regardless of site sampled or histologic infection [PPROM (31) vs. PTL (9) vs. PTNL (7) vs. TL (7) vs. TNL (6)]. Among all subjects at both sites, bacterial counts were inversely correlated with chorion thinning, even after excluding histologic chorioamnionitis (p<0.0001 and p = 0.05). CONCLUSIONS: The fetal chorion was uniformly thinner at the rupture site compared to distant sites. In PPROM fetal chorion, we demonstrated pronounced global thinning. Although cause or consequence is uncertain, bacterial presence was greatest and inversely correlated with chorion thinning among PPROM subjects.
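A minimal illustration of the kind of count model named above. The study itself used SAS; this sketch uses Python's statsmodels on simulated data, and the variable names chorion_um and bacteria_count are invented:

```python
# Hedged sketch: a Poisson regression relating bacterial fluorescence counts to
# chorion thickness, as a Python analogue of the analysis described above.
# All data below are simulated; only the inverse association is mimicked.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 48
chorion_um = rng.uniform(80, 300, size=n)     # chorion thickness (micrometers), simulated
lam = np.exp(3.0 - 0.01 * chorion_um)         # counts decrease as the chorion thickens
bacteria_count = rng.poisson(lam)

X = sm.add_constant(chorion_um)               # intercept + thickness
fit = sm.GLM(bacteria_count, X, family=sm.families.Poisson()).fit()
print(fit.summary())                          # a negative slope reflects the inverse association
```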
Abstract:
We introduce a dynamic directional model (DDM) for studying brain effective connectivity based on intracranial electrocorticographic (ECoG) time series. The DDM consists of two parts: a set of differential equations describing neuronal activity of brain components (state equations), and observation equations linking the underlying neuronal states to observed data. When applied to functional MRI or EEG data, DDMs usually have complex formulations and thus can accommodate only a few regions, due to limitations in the spatial and/or temporal resolution of these imaging modalities. In contrast, we formulate our model in the context of ECoG data. The combined high temporal and spatial resolution of ECoG data results in a much simpler DDM, allowing investigation of complex connections between many regions. To identify functionally segregated sub-networks, a biologically economical form of brain network, we propose the Potts model for the DDM parameters. The neuronal states of brain components are represented by cubic spline bases and the parameters are estimated by minimizing a log-likelihood criterion that combines the state and observation equations. The Potts model is converted to the Potts penalty in the penalized regression approach to achieve sparsity in parameter estimation, for which a fast iterative algorithm is developed. The methods are applied to an auditory ECoG dataset.
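As a rough sketch of the modeling ingredients described above (cubic spline representations of the states, sparse estimation of directed connections), the following Python code smooths simulated channels with cubic splines and fits each row of a linear connectivity matrix by lasso. The lasso is only a stand-in for the Potts penalty of the paper, and the data, channel count, and tuning values are invented:

```python
# Hedged sketch (not the paper's estimator): represent each channel's underlying
# state with a cubic spline and estimate a linear effective-connectivity matrix A
# in dx/dt ~ A x by row-wise lasso, as a simple substitute for the Potts penalty.
import numpy as np
from scipy.interpolate import splrep, splev
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n_ch, n_t = 6, 400
t = np.linspace(0.0, 4.0, n_t)
raw = np.cumsum(rng.normal(size=(n_ch, n_t)), axis=1)   # fake ECoG-like traces

states, derivs = [], []
for ch in raw:
    tck = splrep(t, ch, k=3, s=n_t)        # cubic spline smoothing of the channel
    states.append(splev(t, tck))           # smoothed state x_i(t)
    derivs.append(splev(t, tck, der=1))    # its time derivative dx_i/dt
X = np.array(states).T                     # (time, channels)
dX = np.array(derivs).T

A_hat = np.zeros((n_ch, n_ch))
for i in range(n_ch):
    A_hat[i] = Lasso(alpha=0.1).fit(X, dX[:, i]).coef_   # sparse row of A
print(np.round(A_hat, 2))
```

Replacing the row-wise lasso with the Potts penalty on the connectivity parameters would require the iterative algorithm developed in the paper.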
Abstract:
BACKGROUND AND OBJECTIVES: Pain symptoms are common among Iraq/Afghanistan-era veterans, many of whom continue to experience persistent pain symptoms despite multiple pharmacological interventions. Preclinical data suggest that neurosteroids such as allopregnanolone demonstrate pronounced analgesic properties, and thus represent logical biomarker candidates and therapeutic targets for pain. Allopregnanolone is also a positive GABA(A) receptor modulator with anxiolytic, anticonvulsant, and neuroprotective actions in rodent models. We previously reported inverse associations between serum allopregnanolone levels and self-reported pain symptom severity in a pilot study of 82 male veterans. METHODS: The current study investigates allopregnanolone levels in a larger cohort of 485 male Iraq/Afghanistan-era veterans to attempt to replicate these initial findings. Pain symptoms were assessed by items from the Symptom Checklist-90-R (SCL-90-R) querying headache, chest pain, muscle soreness, and low back pain over the past 7 days. Allopregnanolone levels were quantified by gas chromatography/mass spectrometry. RESULTS: Associations between pain ratings and allopregnanolone levels were examined with Poisson regression analyses, controlling for age and smoking. Bivariate nonparametric Mann–Whitney analyses examining allopregnanolone levels across high and low levels of pain were also conducted. Allopregnanolone levels were inversely associated with muscle soreness [P = 0.0028], chest pain [P = 0.032], and aggregate total pain (sum of all four pain items) [P = 0.0001]. In the bivariate analyses, allopregnanolone levels were lower in the group reporting high levels of muscle soreness [P = 0.001]. CONCLUSIONS: These findings are generally consistent with our prior pilot study and suggest that allopregnanolone may function as an endogenous analgesic. Thus, exogenous supplementation with allopregnanolone could have therapeutic potential. The characterization of neurosteroid profiles may also have biomarker utility.
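The analysis pattern described (Poisson regression of a pain rating on allopregnanolone controlling for age and smoking, plus a Mann–Whitney comparison across high and low pain groups) might look roughly like the following in Python; the data frame, variable names, and cutoffs are invented for illustration:

```python
# Hedged sketch of the analysis pattern above, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "allo": rng.lognormal(mean=0.0, sigma=0.4, size=n),  # neurosteroid level (arbitrary units)
    "age": rng.integers(22, 55, size=n),
    "smoker": rng.integers(0, 2, size=n),
})
rate = np.exp(0.8 - 0.6 * df["allo"] + 0.01 * df["age"])
df["muscle_soreness"] = rng.poisson(rate)                # symptom rating as a count

# Poisson regression controlling for age and smoking
fit = smf.poisson("muscle_soreness ~ allo + age + smoker", data=df).fit()
print(fit.summary())

# Bivariate Mann-Whitney comparison across (arbitrarily defined) high/low pain groups
high = df.loc[df["muscle_soreness"] >= 2, "allo"]
low = df.loc[df["muscle_soreness"] < 2, "allo"]
print(mannwhitneyu(high, low))
```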
Abstract:
OBJECTIVE To use a unique multicomponent administrative data set assembled at a large academic teaching hospital to examine the risk of percutaneous blood and body fluid (BBF) exposures occurring in operating rooms. DESIGN A 10-year retrospective cohort study. SETTING A single large academic teaching hospital. PARTICIPANTS All surgical procedures (n=333,073) performed in 2001-2010 as well as 2,113 reported BBF exposures were analyzed. METHODS Crude exposure rates were calculated; Poisson regression was used to analyze risk factors and account for procedure duration. BBF exposures involving suture needles were examined separately from those involving other device types to assess possible differences in risk factors. RESULTS The overall rate of reported BBF exposures was 6.3 per 1,000 surgical procedures (2.9 per 1,000 surgical hours). BBF exposure rates increased with estimated patient blood loss (17.7 exposures per 1,000 procedures with 501-1,000 cc blood loss and 26.4 exposures per 1,000 procedures with >1,000 cc blood loss), the number of personnel working in the surgical field during the procedure (34.4 exposures per 1,000 procedures having ≥15 personnel ever in the field), and procedure duration (14.3 exposures per 1,000 procedures lasting 4 to <6 hours, 27.1 exposures per 1,000 procedures lasting ≥6 hours). Regression results showed that associations were generally stronger for suture needle-related exposures. CONCLUSIONS Results largely support other studies found in the literature. However, additional research should investigate differences in risk factors for BBF exposures associated with suture needles and those associated with all other device types.
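A sketch of the rate-model idea described above: with procedure duration entering as a log offset, Poisson regression coefficients describe exposure rates per surgical hour. The predictors and data below are simulated stand-ins, not the study's administrative data set:

```python
# Hedged sketch: Poisson regression of exposure counts with log(procedure hours)
# as an offset, so estimated effects are on the per-hour exposure rate.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
hours = rng.gamma(shape=2.0, scale=1.5, size=n)            # procedure duration (hours)
n_personnel = rng.integers(3, 16, size=n)                  # staff ever in the surgical field
blood_loss_cc = rng.gamma(shape=1.2, scale=300.0, size=n)  # estimated blood loss (cc)

lin = -4.5 + 0.08 * n_personnel + 0.0005 * blood_loss_cc   # log rate per hour (made up)
exposures = rng.poisson(np.exp(lin) * hours)               # expected count scales with duration

X = sm.add_constant(np.column_stack([n_personnel, blood_loss_cc]))
fit = sm.GLM(exposures, X, family=sm.families.Poisson(),
             offset=np.log(hours)).fit()
print(fit.summary())
```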
Abstract:
In regression analysis of counts, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive and thus underdeveloped. We propose a lognormal and gamma mixed negative binomial (NB) regression model for counts, and present efficient closed-form Bayesian inference; unlike conventional Poisson models, the proposed approach has two free parameters to include two different kinds of random effects, and allows the incorporation of prior information, such as sparsity in the regression coefficients. By placing a gamma distribution prior on the NB dispersion parameter r, and connecting a log-normal distribution prior with the logit of the NB probability parameter p, efficient Gibbs sampling and variational Bayes inference are both developed. The closed-form updates are obtained by exploiting conditional conjugacy via both a compound Poisson representation and a Polya-Gamma distribution based data augmentation approach. The proposed Bayesian inference can be implemented routinely, while being easily generalizable to more complex settings involving multivariate dependence structures. The algorithms are illustrated using real examples.
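To make the model structure concrete, here is a small simulation from the generative model as described (a gamma-distributed dispersion r and a normal disturbance on the logit of the NB probability p). This illustrates the likelihood only; the Gibbs and variational inference of the paper are not reproduced, and all hyperparameter values are arbitrary:

```python
# Hedged sketch: simulate from a lognormal and gamma mixed NB regression model.
import numpy as np

rng = np.random.default_rng(4)
n, d = 500, 3
X = rng.normal(size=(n, d))
beta = np.array([1.0, -0.5, 0.25])

r = rng.gamma(shape=2.0, scale=1.0)        # NB dispersion parameter, gamma prior
eps = rng.normal(scale=0.3, size=n)        # normal disturbance on the logit (log-normal odds)
psi = X @ beta + eps
p = 1.0 / (1.0 + np.exp(-psi))             # inverse logit: NB probability parameter

# numpy's negative_binomial(n, p) counts failures before n successes, so passing
# r and (1 - p) gives mean r * p / (1 - p) under the convention used above.
y = rng.negative_binomial(r, 1.0 - p)
print(y[:10], y.mean())
```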
Abstract:
Numerical approximation of the long time behavior of a stochastic differential equation (SDE) is considered. Error estimates for time-averaging estimators are obtained and then used to show that the stationary behavior of the numerical method converges to that of the SDE. The error analysis is based on using an associated Poisson equation for the underlying SDE. The main advantages of this approach are its simplicity and universality. It works equally well for a range of explicit and implicit schemes, including those with simple simulation of random variables, and for hypoelliptic SDEs. To simplify the exposition, we consider only the case where the state space of the SDE is a torus, and we study only smooth test functions. However, we anticipate that the approach can be applied more widely. An analogy between our approach and Stein's method is indicated. Some practical implications of the results are discussed.
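A minimal numerical illustration of the object being analyzed: a time-averaging estimator computed along an Euler-Maruyama trajectory of an SDE on the one-dimensional torus, which approximates the stationary average of a smooth test function. The drift, test function, and step size below are illustrative choices, not those of the paper:

```python
# Hedged sketch: time-average (1/N) * sum f(X_k) along an Euler-Maruyama path
# of an SDE on the torus [0, 2*pi), approximating the stationary mean of f.
import numpy as np

rng = np.random.default_rng(5)
dt, N = 0.01, 200_000
sigma = 1.0

def drift(x):
    return -np.sin(x)            # gradient-type drift on the 1-D torus

def f(x):
    return np.cos(x)             # smooth test function

x = 0.0
running_sum = 0.0
for _ in range(N):
    x += drift(x) * dt + sigma * np.sqrt(dt) * rng.normal()
    x %= 2.0 * np.pi             # wrap back onto the torus
    running_sum += f(x)

print("time-average estimate of E[f]:", running_sum / N)
```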
Abstract:
Estimation of the skeleton of a directed acyclic graph (DAG) is of great importance for understanding the underlying DAG, and causal effects can be assessed from the skeleton when the DAG is not identifiable. We propose a novel two-step method named PenPC to estimate the skeleton of a high-dimensional DAG. We first estimate the nonzero entries of a concentration matrix using penalized regression, and then resolve the differences between the concentration matrix and the skeleton by evaluating a set of conditional independence hypotheses. For high-dimensional problems where the number of vertices p grows polynomially or exponentially with the sample size n, we study the asymptotic properties of PenPC on two types of graphs: traditional random graphs, where all vertices have the same expected number of neighbors, and scale-free graphs, where a few vertices may have a large number of neighbors. As illustrated by extensive simulations and applications to gene expression data of cancer patients, PenPC has higher sensitivity and specificity than the state-of-the-art PC-stable algorithm.
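A rough two-step sketch in the spirit of PenPC, not the authors' implementation: step one estimates the support of the concentration matrix, here with scikit-learn's graphical lasso as a convenient substitute for the penalized regressions of the paper; step two, omitted, would prune edges by conditional independence tests as in the PC-style step described above. The data are simulated:

```python
# Hedged sketch: concentration-matrix support as a candidate DAG skeleton.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(6)
n, p = 300, 10
# Simple chain-structured data: each variable depends on the previous one.
X = np.zeros((n, p))
X[:, 0] = rng.normal(size=n)
for j in range(1, p):
    X[:, j] = 0.7 * X[:, j - 1] + rng.normal(size=n)

model = GraphicalLassoCV().fit(X)
precision = model.precision_
candidate_skeleton = (np.abs(precision) > 1e-3) & ~np.eye(p, dtype=bool)
print(candidate_skeleton.astype(int))
# Step 2 (omitted here): remove spurious connections between non-adjacent vertices
# by testing conditional independence given larger conditioning sets.
```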
Abstract:
Fitting statistical models is computationally challenging when the sample size or the dimension of the dataset is huge. An attractive approach for down-scaling the problem size is to first partition the dataset into subsets and then fit using distributed algorithms. The dataset can be partitioned either horizontally (in the sample space) or vertically (in the feature space), and the challenge arises in defining an algorithm with low communication cost, theoretical guarantees, and excellent practical performance in general settings. For sample space partitioning, I propose a MEdian Selection Subset AGgregation Estimator (message) algorithm for solving these issues. The algorithm applies feature selection in parallel for each subset using a regularized regression or Bayesian variable selection method, calculates the 'median' feature inclusion index, estimates coefficients for the selected features in parallel for each subset, and then averages these estimates. The algorithm is simple, involves minimal communication, scales efficiently in sample size, and has theoretical guarantees. I provide extensive experiments showing excellent performance in feature selection, estimation, prediction, and computation time relative to the usual competitors.
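A compact sketch of the message recipe on fake data, with scikit-learn's LassoCV standing in for whichever regularized or Bayesian selection method is used per subset; the subset count and signal are invented:

```python
# Hedged sketch: split samples into subsets, select features per subset by lasso,
# take the median inclusion indicator, refit per subset on the selected features,
# and average the coefficients.
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(7)
n, p, m = 3000, 50, 5                       # samples, features, subsets
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [2, -1.5, 1, 0.5, -0.5]
y = X @ beta + rng.normal(size=n)

subsets = np.array_split(rng.permutation(n), m)
inclusion = np.zeros((m, p))
for k, idx in enumerate(subsets):
    coef = LassoCV(cv=5).fit(X[idx], y[idx]).coef_
    inclusion[k] = coef != 0                # which features each subset selects

selected = np.median(inclusion, axis=0) >= 0.5          # median inclusion index
coefs = [LinearRegression().fit(X[idx][:, selected], y[idx]).coef_
         for idx in subsets]
beta_hat = np.mean(coefs, axis=0)                       # averaged subset estimates
print(np.flatnonzero(selected), np.round(beta_hat, 2))
```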
While sample space partitioning is useful in handling datasets with a large sample size, feature space partitioning is more effective when the data dimension is high. Existing methods for partitioning features, however, are either vulnerable to high correlations or inefficient in reducing the model dimension. In the thesis, I propose a new embarrassingly parallel framework named DECO for distributed variable selection and parameter estimation. In DECO, variables are first partitioned and allocated to m distributed workers. The decorrelated subset data within each worker are then fitted via any algorithm designed for high-dimensional problems. We show that by incorporating the decorrelation step, DECO can achieve consistent variable selection and parameter estimation on each subset with (almost) no assumptions. In addition, the convergence rate is nearly minimax optimal for both sparse and weakly sparse models and does not depend on the partition number m. Extensive numerical experiments are provided to illustrate the performance of the new framework.
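The decorrelation idea can be sketched as follows, under a simplified reading of DECO rather than the thesis implementation: precondition X and y by an inverse square root of the ridge-stabilized n x n Gram matrix, partition the columns, and run lasso independently on each block. The ridge constant, block count, and data are illustrative:

```python
# Hedged sketch: decorrelate, partition features, fit each block independently.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(8)
n, p, m = 200, 400, 4
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)        # deliberately correlated features
beta = np.zeros(p)
beta[[0, 10, 100]] = [3.0, -2.0, 1.5]
y = X @ beta + rng.normal(size=n)

G = X @ X.T / p + 1.0 * np.eye(n)                   # n x n Gram matrix plus ridge term
w, V = np.linalg.eigh(G)
F = V @ np.diag(w ** -0.5) @ V.T                    # G^{-1/2}: decorrelating operator
Xd, yd = F @ X, F @ y

blocks = np.array_split(np.arange(p), m)            # feature-space partition
beta_hat = np.zeros(p)
for cols in blocks:                                 # each block could run on its own worker
    beta_hat[cols] = LassoCV(cv=5).fit(Xd[:, cols], yd).coef_
print(np.flatnonzero(beta_hat))
```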
For datasets with both large sample sizes and high dimensionality, I propose a new divide-and-conquer framework, DEME (DECO-message), that leverages both the DECO and the message algorithms. The new framework first partitions the dataset in the sample space into row cubes using message and then partitions the feature space of the cubes using DECO. This procedure is equivalent to partitioning the original data matrix into multiple small blocks, each of a feasible size that can be stored and fitted on a computer in parallel. The results are then synthesized via the DECO and message algorithms in reverse order to produce the final output. The whole framework is extremely scalable.