6 results for Modularité massive
at Duke University
Abstract:
This article describes advances in statistical computation for large-scale data analysis in structured Bayesian mixture models via graphics processing unit (GPU) programming. The developments are partly motivated by computational challenges arising in fitting models of increasing heterogeneity to increasingly large datasets. An example context concerns common biological studies using high-throughput technologies that generate many very large datasets and require increasingly high-dimensional mixture models with large numbers of mixture components. We outline important strategies and processes for GPU computation in Bayesian simulation and optimization approaches, give examples of the benefits of GPU implementations in terms of processing speed and scale-up in the ability to analyze large datasets, and provide a detailed, tutorial-style exposition that will benefit readers interested in developing GPU-based approaches in other statistical models. Novel, GPU-oriented approaches to modifying existing algorithms and software design can lead to vast speed-ups and, critically, enable statistical analyses that would otherwise not be performed due to compute-time limitations in traditional computational environments. Supplemental materials are provided with all source code, example data, and details that will enable readers to implement and explore the GPU approach in this mixture modeling context. © 2010 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
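The data-parallel pattern such GPU work exploits is easy to sketch. Below is a minimal illustration (not the authors' implementation; the function name, shapes, and use of JAX are our own assumptions) of the step that typically dominates mixture-model fitting: evaluating every observation's log-density under every component in parallel, then reducing over components. Written in JAX, the same code runs unchanged on a GPU.

    # Hypothetical sketch of the data-parallel mixture-likelihood step;
    # not the article's CUDA code.
    import jax
    import jax.numpy as jnp
    from jax.scipy.special import logsumexp
    from jax.scipy.stats import multivariate_normal

    def mixture_loglik(x, weights, means, covs):
        """Log-likelihood of each row of x under a Gaussian mixture.

        x:       (n, d) observations
        weights: (k,)   mixture weights
        means:   (k, d) component means
        covs:    (k, d, d) component covariances
        """
        # Log-density of every observation under every component: (k, n).
        # vmap parallelizes over components; each logpdf call is itself
        # vectorized over the n observations.
        comp_logpdf = jax.vmap(
            lambda m, c: multivariate_normal.logpdf(x, m, c)
        )(means, covs)
        # Weight each component and reduce over the component axis: (n,).
        return logsumexp(comp_logpdf + jnp.log(weights)[:, None], axis=0)

    # Compile once; subsequent calls dispatch to CPU/GPU/TPU.
    mixture_loglik = jax.jit(mixture_loglik)

Because every (observation, component) pair is independent, the n × k log-density evaluations map naturally onto thousands of GPU threads, which is the source of the speed-ups the article reports.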
Abstract:
OBJECTIVE: To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. STUDY DESIGN: A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. RESULTS: Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with "massive" PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a "stat" complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. CONCLUSION: The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist.
Abstract:
Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interaction and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with a particular emphasis on large-scale simulations by computer models. The challenges come not only from the complexity of the scientific questions but also from the sheer size of the information. The focus of this thesis is to provide statistical models that are scalable to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.
Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification for the Soufrière Hills volcano on the island of Montserrat. Chapter 3 addresses another massive-data problem, in which the number of observations of a function is large; an exact algorithm that is linear in time is developed for the problem of interpolating methylation levels. Chapters 4 and 5 both concern robust inference for these models. Chapter 4 proposes a new robustness criterion for parameter estimation, and several inference approaches are shown to satisfy it. Chapter 5 develops a new prior that satisfies additional criteria and is therefore proposed for use in practice.
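As a concrete illustration of how exact linear-time interpolation can arise (a hypothetical sketch under our own assumptions, not necessarily the thesis's algorithm): if the underlying Gaussian process has an Ornstein-Uhlenbeck (exponential) covariance, the process is Markov, so the exact predictive mean at a query point depends only on the two flanking observations, and interpolating against n sorted observations costs O(n) overall rather than the O(n³) of a generic GP solve.

    # Hypothetical sketch: exact O(1)-per-query GP interpolation under an
    # OU kernel k(s, t) = exp(-|s - t| / l), exploiting the Markov property.
    import jax.numpy as jnp

    def ou_interpolate(t_obs, y_obs, t_new, length_scale=1.0):
        """Exact noise-free GP predictive mean at t_new.

        Assumes t_obs is strictly increasing; each query needs only its
        two neighbors, so n queries cost O(n) total.
        """
        idx = jnp.clip(jnp.searchsorted(t_obs, t_new), 1, len(t_obs) - 1)
        tl, tr = t_obs[idx - 1], t_obs[idx]      # flanking locations
        yl, yr = y_obs[idx - 1], y_obs[idx]      # flanking values
        a = jnp.exp(-(t_new - tl) / length_scale)  # corr(t_new, left)
        b = jnp.exp(-(tr - t_new) / length_scale)  # corr(t_new, right)
        rho = a * b                                # corr(left, right)
        # Standard bivariate-Gaussian conditional mean given the neighbors.
        return ((a - rho * b) * yl + (b - rho * a) * yr) / (1.0 - rho**2)

The formula reproduces the observed value exactly when t_new coincides with an observation, which is the behavior one expects of exact (noise-free) interpolation.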
Abstract:
African green monkeys (AGM) and other natural hosts for simian immunodeficiency virus (SIV) do not develop an AIDS-like disease following SIV infection. To evaluate differences in the role of SIV-specific adaptive immune responses between natural and nonnatural hosts, we used SIV(agmVer90) to infect vervet AGM and pigtailed macaques (PTM). This infection results in robust viral replication in both species but induces AIDS only in PTM. We delayed the development of adaptive immune responses through combined administration of anti-CD8 and anti-CD20 lymphocyte-depleting antibodies during primary infection of PTM (n = 4) and AGM (n = 4), and compared these animals to historical controls infected with the same virus. Lymphocyte depletion resulted in a 1-log increase in primary viremia and a 4-log increase in post-acute viremia in PTM. Three of the four PTM had to be euthanized within 6 weeks of inoculation due to massive cytomegalovirus (CMV) reactivation and disease. In contrast, all four lymphocyte-depleted AGM remained healthy. The lymphocyte-depleted AGM showed only a trend toward prolonged peak viremia, and the groups were indistinguishable during chronic infection. These data show that adaptive immune responses are critical for controlling disease progression in pathogenic SIV infection in PTM. However, the maintenance of a disease-free course of SIV infection in AGM likely depends on a number of mechanisms, including non-adaptive immune mechanisms.
Abstract:
© Comer, Clark, Canelas. This study aimed to evaluate how peer-to-peer interactions through writing impact student learning in introductory-level massive open online courses (MOOCs) across disciplines. This article presents the results of a qualitative coding analysis of peer-to-peer interactions in two introductory-level MOOCs: English Composition I: Achieving Expertise and Introduction to Chemistry. Results indicate that peer-to-peer interactions in writing, through the forums and through peer assessment, enhance learner understanding, link to course learning objectives, and generally contribute positively to the learning environment. Moreover, because forum interactions and peer review occur in written form, our research contributes to open distance learning (ODL) scholarship by highlighting the importance of writing to learn as a significant pedagogical practice that should be encouraged more in MOOCs across disciplines.
Abstract:
Much of science progresses within the tight boundaries of what is often seen as a "black box". Though familiar to funding agencies, researchers and the academic journals they publish in, it is an entity that outsiders rarely get to peek into. Crowdfunding is a novel means that allows the public to participate in, as well as to support and witness advancements in science. Here we describe our recent crowdfunding efforts to sequence the Azolla genome, a little fern with massive green potential. Crowdfunding is a worthy platform not only for obtaining seed money for exploratory research, but also for engaging directly with the general public as a rewarding form of outreach.