883 results for spatiotemporal epidemic prediction model


Relevance:

30.00%

Publisher:

Abstract:

Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks, and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a "Simple Committee" technique that averaged predictions from a set of 10 input spaces pre-selected on the training data, and a "Minimum Variance Committee" technique in which the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. The latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (the Best Combination technique), the Simple Committee technique, and the Minimum Variance Committee technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or the calibration of the underlying GT-Power model.
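A minimal sketch of the two committee schemes, assuming each of the three methods has already produced predictions in every transformed input space; the array layout, the fixed set of pre-selected spaces, and the per-point selection rule are illustrative, not taken from the paper:

import numpy as np

def simple_committee(predictions, selected_spaces):
    """Average predictions over a fixed set of pre-selected input spaces.

    predictions: array of shape (n_spaces, n_methods, n_points) holding
    smoke predictions from each transformed input space and each method.
    """
    return predictions[selected_spaces].mean(axis=(0, 1))

def minimum_variance_committee(predictions, n_keep=10):
    """For each prediction point, keep the input spaces on which the three
    modeling methods disagree least (lowest between-method variance),
    then average the surviving predictions."""
    n_spaces, n_methods, n_points = predictions.shape
    disagreement = predictions.var(axis=1)              # (n_spaces, n_points)
    out = np.empty(n_points)
    for j in range(n_points):
        keep = np.argsort(disagreement[:, j])[:n_keep]  # least disagreement
        out[j] = predictions[keep, :, j].mean()
    return out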

Relevance:

30.00%

Publisher:

Abstract:

Worldwide, 700,000 infants are infected annually by HIV-1, most of them in resource-limited settings. Care for these children requires simple, inexpensive tests. We have evaluated HIV-1 p24 antigen for antiretroviral treatment (ART) monitoring in children. p24 by boosted enzyme-linked immunosorbent assay of heated plasma and HIV-1 RNA were measured prospectively in 24 HIV-1-infected children receiving ART. p24 and HIV-1 RNA concentrations and their changes between consecutive visits were related to the respective CD4+ changes. Age at study entry was 7.6 years; follow-up was 47.2 months, yielding 18 visits at an interval of 2.8 months (medians). There were 399 complete visit data sets and 375 interval data sets. Controlling for variation between individuals, there was a positive relationship between concentrations of HIV-1 RNA and p24 (P < 0.0001). While controlling for initial CD4+ count, age, sex, days since start of ART, and days between visits, the relative change in CD4+ count between 2 successive visits was negatively related to the corresponding relative change in HIV-1 RNA (P = 0.009), but not to the initial HIV-1 RNA concentration (P = 0.94). Similarly, we found a negative relationship with the relative change in p24 over the interval (P < 0.0001), whereas the initial p24 concentration showed a trend (P = 0.08). Statistical support for the p24 model and the HIV-1 RNA model was similar. p24 may be an accurate low-cost alternative to monitor ART in pediatric HIV-1 infection.
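A sketch of the kind of longitudinal analysis described, assuming the interval data sit in a pandas DataFrame with hypothetical column names; a random intercept per child controls for variation between individuals:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: rel_change_cd4, rel_change_rna (or rel_change_p24),
# cd4_start, age, sex, days_on_art, days_between_visits, child_id.
df = pd.read_csv("visit_intervals.csv")

# Relative CD4+ change over an interval regressed on the corresponding
# relative change in the viral marker, controlling for the covariates
# named in the abstract, with a random intercept per child.
model = smf.mixedlm(
    "rel_change_cd4 ~ rel_change_rna + cd4_start + age + sex"
    " + days_on_art + days_between_visits",
    data=df,
    groups=df["child_id"],
)
print(model.fit().summary())

The same formula with rel_change_p24 in place of rel_change_rna gives the p24 version of the model, so statistical support for the two fits can be compared.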

Relevance:

30.00%

Publisher:

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays, to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters γ in probability models f(y; γ) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters γ = (θ, η) into a subset of interest θ and other "nuisance parameters" η necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally-intensive strategies for prediction and inference advocated by Breiman and others (e.g., Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
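As a concrete instance of this partition (a standard textbook illustration, not an equation from the paper): in the proportional hazards model the regression coefficients are the interest parameter θ, while the baseline hazard is an infinite-dimensional nuisance parameter η that the partial likelihood eliminates:

\lambda(t \mid x) = \lambda_0(t)\,\exp(\beta^\top x),
\qquad \theta = \beta, \quad \eta = \lambda_0(\cdot)

L(\beta) = \prod_{i:\,\delta_i = 1}
  \frac{\exp(\beta^\top x_i)}{\sum_{j \in R(t_i)} \exp(\beta^\top x_j)}

where δ_i indicates an observed event and R(t_i) is the risk set at time t_i; L(β) involves β alone, so inference about the interest parameter does not depend on the nuisance baseline hazard.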

Relevance:

30.00%

Publisher:

Abstract:

Suppose that we are interested in establishing simple but reliable rules for predicting future t-year survivors via censored regression models. In this article, we present inference procedures for evaluating such binary classification rules based on various prediction precision measures quantified by the overall misclassification rate, sensitivity and specificity, and positive and negative predictive values. Specifically, under various working models we derive consistent estimators for the above measures via substitution and cross-validation estimation procedures. Furthermore, we provide large-sample approximations to the distributions of these nonsmooth estimators without assuming that the working model is correctly specified. Confidence intervals, for example, for the difference of the precision measures between two competing rules can then be constructed. All the proposals are illustrated with two real examples and their finite-sample properties are evaluated via a simulation study.
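A minimal sketch of the cross-validation idea for one candidate rule; it sidesteps censoring by keeping only subjects whose t-year status is actually known, whereas the paper's estimators adjust for censoring properly, and the logistic working model and all names here are illustrative:

import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression

def precision_measures(X, time, event, t=5.0, threshold=0.5, seed=0):
    # Keep subjects with known t-year status: died before t, or followed past t.
    known = (time >= t) | (event == 1)
    X, y = X[known], (time[known] >= t).astype(int)   # y = 1: t-year survivor

    pred = np.empty_like(y)
    for train, test in KFold(5, shuffle=True, random_state=seed).split(X):
        fit = LogisticRegression(max_iter=1000).fit(X[train], y[train])
        pred[test] = fit.predict_proba(X[test])[:, 1] >= threshold

    tp = np.sum((pred == 1) & (y == 1)); fp = np.sum((pred == 1) & (y == 0))
    fn = np.sum((pred == 0) & (y == 1)); tn = np.sum((pred == 0) & (y == 0))
    return {"misclassification": (fp + fn) / len(y),
            "sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}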

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE OF REVIEW: Predicting asthma episodes is notoriously difficult but has potentially significant consequences for the individual, as well as for healthcare services. The purpose of this review is to describe recent insights into the prediction of acute asthma episodes in relation to classical clinical, functional or inflammatory variables, as well as present a new concept for evaluating asthma as a dynamically regulated homeokinetic system. RECENT FINDINGS: Risk prediction for asthma episodes or relapse has been attempted using clinical scoring systems, considerations of environmental factors and lung function, as well as inflammatory and immunological markers in induced sputum or exhaled air, and these are summarized here. We have recently proposed that newer mathematical methods derived from statistical physics may be used to understand the complexity of asthma as a homeokinetic, dynamic system consisting of a network comprising multiple components, and also to assess the risk for future asthma episodes based on fluctuation analysis of long time series of lung function. SUMMARY: Apart from the classical analysis of risk factor and functional parameters, this new approach may be used to assess asthma control and treatment effects in the individual as well as in future research trials.
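The fluctuation-analysis idea can be illustrated with detrended fluctuation analysis (DFA), a standard statistical-physics tool for long time series; this sketch, with a simulated series standing in for twice-daily lung function recordings, is illustrative rather than the authors' specific method:

import numpy as np

def dfa(series, scales):
    """Detrended fluctuation analysis: fluctuation F(n) for each window
    size n. The scaling exponent alpha is the slope of log F vs. log n;
    alpha > 0.5 indicates long-range correlations."""
    y = np.cumsum(series - np.mean(series))                # integrated profile
    F = []
    for n in scales:
        sq_resid = []
        for w in range(len(y) // n):
            seg = y[w * n:(w + 1) * n]
            x = np.arange(n)
            trend = np.polyval(np.polyfit(x, seg, 1), x)   # local detrend
            sq_resid.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(sq_resid)))
    return np.array(F)

rng = np.random.default_rng(0)
pef = rng.normal(400.0, 30.0, size=2000)                   # stand-in peak flow
scales = np.array([8, 16, 32, 64, 128])
alpha = np.polyfit(np.log(scales), np.log(dfa(pef, scales)), 1)[0]
print(f"DFA exponent alpha = {alpha:.2f}")                 # ~0.5 for white noise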

Relevance:

30.00%

Publisher:

Abstract:

We describe a method for evaluating an ensemble of predictive models given a sample of observations comprising the model predictions and the outcome event measured with error. Our formulation allows us to simultaneously estimate the measurement error parameters, the true outcome (the "gold standard"), and a relative weighting of the predictive scores. We describe conditions necessary to estimate the gold standard and for these estimates to be calibrated, and detail how our approach is related to, but distinct from, standard model combination techniques. We apply our approach to data from a study to evaluate a collection of BRCA1/BRCA2 gene mutation prediction scores. In this example, genotype is measured with error by one or more genetic assays. We estimate the true genotype for each individual in the dataset, the operating characteristics of the commonly used genotyping procedures, and a relative weighting of the scores. Finally, we compare the scores against the gold standard genotype and find that the Mendelian scores are, on average, the more refined and better calibrated of those considered, and that the comparison is sensitive to measurement error in the gold standard.
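A toy latent-class version of the estimation idea, reduced to a binary genotype observed through repeated 0/1 assay calls; the EM updates below recover per-assay operating characteristics and a posterior "gold standard", whereas the paper's formulation (which also weights the prediction scores) is richer:

import numpy as np

def em_gold_standard(assays, n_iter=100):
    """EM for a latent binary genotype observed through noisy assays.
    assays: (n_subjects, n_assays) 0/1 matrix. Returns the posterior
    P(carrier) per subject plus per-assay sensitivity and specificity."""
    n, k = assays.shape
    prev = 0.5                       # starting prevalence of the mutation
    sens = np.full(k, 0.9)
    spec = np.full(k, 0.9)
    for _ in range(n_iter):
        # E-step: posterior probability that each subject is a true carrier.
        like1 = np.prod(np.where(assays == 1, sens, 1 - sens), axis=1)
        like0 = np.prod(np.where(assays == 1, 1 - spec, spec), axis=1)
        post = prev * like1 / (prev * like1 + (1 - prev) * like0)
        # M-step: re-estimate prevalence and assay operating characteristics.
        prev = post.mean()
        sens = (post[:, None] * assays).sum(axis=0) / post.sum()
        spec = ((1 - post)[:, None] * (1 - assays)).sum(axis=0) / (1 - post).sum()
    return post, sens, spec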

Relevance:

30.00%

Publisher:

Abstract:

Heart rate variability (HRV) exhibits fluctuations characterized by a power law behavior of its power spectrum. The interpretation of this nonlinear HRV behavior, resulting from interactions between extracardiac regulatory mechanisms, could be clinically useful. However, the involvement of intrinsic variations of pacemaker rate in HRV has scarcely been investigated. We examined beating variability in spontaneously active incubating cultures of neonatal rat ventricular myocytes using microelectrode arrays. In networks of mathematical model pacemaker cells, we evaluated the variability induced by the stochastic gating of transmembrane currents and of calcium release channels and by the dynamic turnover of ion channels. In the cultures, spontaneous activity originated from a mobile focus. Both the beat-to-beat movement of the focus and beat rate variability exhibited a power law behavior. In the model networks, stochastic fluctuations in transmembrane currents and stochastic gating of calcium release channels did not reproduce the spatiotemporal patterns observed in vitro. In contrast, long-term correlations produced by the turnover of ion channels induced variability patterns with a power law behavior similar to those observed experimentally. Therefore, phenomena leading to long-term correlated variations in pacemaker cellular function may, in conjunction with extracardiac regulatory mechanisms, contribute to the nonlinear characteristics of HRV.
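A minimal sketch of how the power-law exponent of such a series could be estimated, with a simulated random walk standing in for a beat-interval series (a walk should give an exponent near 2; the names and frequency band are illustrative):

import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
series = np.cumsum(rng.standard_normal(4096))   # stand-in beat-interval series

# Power spectrum S(f) ~ 1/f^beta: beta is minus the slope in log-log space.
f, pxx = welch(series - series.mean(), fs=1.0, nperseg=1024)
mask = f > 0
beta = -np.polyfit(np.log(f[mask]), np.log(pxx[mask]), 1)[0]
print(f"estimated spectral exponent beta = {beta:.2f}")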

Relevance:

30.00%

Publisher:

Abstract:

Functional magnetic resonance imaging (fMRI) studies can provide insight into the neural correlates of hallucinations. Commonly, such studies require self-reports about the timing of the hallucination events. While many studies have found activity in higher-order sensory cortical areas, only a few have demonstrated activity of the primary auditory cortex during auditory verbal hallucinations. In this case, using self-reports as a model of brain activity may not be sensitive enough to capture all neurophysiological signals related to hallucinations. We used spatial independent component analysis (sICA) to extract the activity patterns associated with auditory verbal hallucinations in six schizophrenia patients. SICA decomposes the functional data set into a set of spatial maps without the use of any input function. The resulting activity patterns from auditory and sensorimotor components were further analyzed in a single-subject fashion using a visualization tool that allows for easy inspection of the variability of regional brain responses. We found bilateral auditory cortex activity, including Heschl's gyrus, during hallucinations of one patient, and unilateral auditory cortex activity in two more patients. The associated time courses showed a large variability in the shape, amplitude, and time of onset relative to the self-reports. However, the average of the time courses during hallucinations showed a clear association with this clinical phenomenon. We suggest that detection of this activity may be facilitated by examining hallucination epochs of sufficient length, in combination with a data-driven approach.
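A sketch of the sICA step, assuming the fMRI run has been flattened to a (time, voxels) matrix; the file name and component count are placeholders:

import numpy as np
from sklearn.decomposition import FastICA

data = np.load("fmri_run.npy")               # hypothetical array, shape (T, V)

# For *spatial* ICA the independent components are spatial maps, so voxels
# play the role of samples: decompose the transposed matrix.
ica = FastICA(n_components=20, random_state=0)
spatial_maps = ica.fit_transform(data.T).T   # (n_components, V)
time_courses = ica.mixing_                   # (T, n_components)

The per-component time courses can then be inspected against the self-report epochs, much as the authors do with their visualization tool.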

Relevance:

30.00%

Publisher:

Abstract:

The past decade has brought significant advancements in seasonal climate forecasting. However, water resources decision support and management continues to be based almost entirely on historical observations and does not take advantage of climate forecasts. This study builds on previous work that conditioned streamflow ensemble forecasts on observable climate indicators, such as the El Niño-Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO), for use in a decision support model for the Highland Lakes multi-reservoir system in central Texas operated by the Lower Colorado River Authority (LCRA). In the current study, seasonal soil moisture is explored as a climate indicator and predictor of annual streamflow for the LCRA region. The main purpose of this study is to evaluate the correlation of fractional soil moisture with streamflow using the 1950-2000 Variable Infiltration Capacity (VIC) Retrospective Land Surface Data Set over the LCRA region. Correlations were determined by examining different annual and seasonal combinations of VIC-modeled fractional soil moisture and observed streamflow. The applicability of the VIC Retrospective Land Surface Data Set as a data source for this study is tested, and patterns of climatology for the watershed study area are established and analyzed using the selected data source (the VIC model) and historical data. Correlation results showed potential for the use of soil moisture as a predictor of streamflow over the LCRA region. This was evident from the good correlations found between seasonal soil moisture and seasonal streamflow during coincident seasons, as well as between seasonal and annual soil moisture and annual streamflow during coincident years. Given the good correlations found in this study between seasonal soil moisture from the VIC Retrospective Land Surface Data Set and observed annual streamflow, future research will evaluate the application of NOAA Climate Prediction Center (CPC) forecasts of soil moisture in predicting annual streamflow for use in the decision support model for the LCRA.
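A sketch of the seasonal correlation computation in pandas, with hypothetical file and column names for the VIC soil moisture and observed streamflow series:

import pandas as pd

df = pd.read_csv("lcra_monthly.csv", parse_dates=["month"], index_col="month")

# Coincident seasons: quarterly mean soil moisture vs. quarterly streamflow.
seasonal = df.resample("QS").mean()
print(seasonal["soil_moisture"].corr(seasonal["streamflow"]))

# Coincident years: spring (MAM) soil moisture vs. annual streamflow volume.
spring_sm = df["soil_moisture"].resample("YS").apply(
    lambda s: s[s.index.month.isin([3, 4, 5])].mean())
annual_q = df["streamflow"].resample("YS").sum()
print(spring_sm.corr(annual_q))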

Relevance:

30.00%

Publisher:

Abstract:

Skeletal muscle force evaluation is difficult to implement in a clinical setting. Muscle force is typically assessed through either manual muscle testing, isokinetic/isometric dynamometry, or electromyography (EMG). Manual muscle testing is a subjective evaluation of a patient's ability to move voluntarily against gravity and to resist force applied by an examiner. Muscle testing using dynamometers adds accuracy by quantifying functional mechanical output of a limb. However, like manual muscle testing, dynamometry only provides estimates of the joint moment. EMG quantifies neuromuscular activation signals of individual muscles, and is used to infer muscle function. Despite the abundance of work performed to determine the degree to which EMG signals and muscle forces are related, the basic problem remains that EMG cannot provide a quantitative measurement of muscle force. Intramuscular pressure (IMP), the pressure applied by muscle fibers on interstitial fluid, has been considered as a correlate for muscle force. Numerous studies have shown that an approximately linear relationship exists between IMP and muscle force. A microsensor has recently been developed that is accurate, biocompatible, and appropriately sized for clinical use. While muscle force and pressure have been shown to be correlates, IMP has been shown to be non-uniform within the muscle. As it would not be practicable to experimentally evaluate how IMP is distributed, computational modeling may provide the means to fully evaluate IMP generation in muscles of various shapes and operating conditions. The work presented in this dissertation focuses on the development and validation of computational models of passive skeletal muscle and the evaluation of their performance for prediction of IMP. A transversely isotropic, hyperelastic, and nearly incompressible model will be evaluated along with a poroelastic model.
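A common form for such a constitutive model (a textbook-style sketch, not the specific formulation validated in this dissertation) splits the strain energy into matrix, fiber, and volumetric terms:

W = c_1(\bar{I}_1 - 3) + F_f(\bar{\lambda}) + \tfrac{K}{2}(\ln J)^2

where \bar{I}_1 is the first deviatoric strain invariant of the isotropic matrix, \bar{\lambda} is the deviatoric stretch along the fiber direction entering through a fiber energy F_f, J = det F measures volume change, and a large bulk modulus K enforces near-incompressibility.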

Relevance:

30.00%

Publisher:

Abstract:

In many complex and dynamic domains, the ability to generate and then select the appropriate course of action is based on the decision maker's "reading" of the situation: in other words, their ability to assess the situation and predict how it will evolve over the next few seconds. Current theories regarding option generation during the situation assessment and response phases of decision making offer contrasting views on the cognitive mechanisms that support superior performance. The Recognition-Primed Decision-making model (RPD; Klein, 1989) and Take-The-First heuristic (TTF; Johnson & Raab, 2003) suggest that superior decisions are made by generating few options, and then selecting the first option as the final one. Long-Term Working Memory theory (LTWM; Ericsson & Kintsch, 1995), on the other hand, posits that skilled decision makers construct rich, detailed situation models, and that as a result, skilled performers should have the ability to generate more of the available task-relevant options. The main goal of this dissertation was to use these theories about option generation as a way to further the understanding of how police officers anticipate a perpetrator's actions, and make decisions about how to respond, during dynamic law enforcement situations. An additional goal was to gather information that can be used, in the future, to design training based on the anticipation skills, decision strategies, and processes of experienced officers. Two studies were conducted to achieve these goals. Study 1 identified video-based law enforcement scenarios that could be used to discriminate between experienced and less-experienced police officers, in terms of their ability to anticipate the outcome. The discriminating scenarios were used as the stimuli in Study 2; 23 experienced and 26 less-experienced police officers observed temporally-occluded versions of the scenarios, and then completed assessment and response option-generation tasks. The results provided mixed support for the nature of option generation in these situations. Consistent with RPD and TTF, participants typically selected the first-generated option as their final one, and did so during both the assessment and response phases of decision making. Consistent with LTWM theory, participants, regardless of experience level, generated more task-relevant assessment options than task-irrelevant options. However, an expected interaction between experience level and option-relevance was not observed. Collectively, the two studies provide a deeper understanding of how police officers make decisions in dynamic situations. The methods developed and employed in the studies can be used to investigate anticipation and decision making in other critical domains (e.g., nursing, military). The results are discussed in relation to how they can inform future studies of option-generation performance, and how they could be applied to develop training for law enforcement officers.

Relevance:

30.00%

Publisher:

Abstract:

Electrospinning (ES) can readily produce polymer fibers with cross-sectional dimensions ranging from tens of nanometers to tens of microns. Qualitative estimates of surface area coverage are rather intuitive. However, quantitative analytical and numerical methods for predicting surface coverage during ES have not been covered in sufficient depth to be applied in the design of novel materials, surfaces, and devices from ES fibers. This article presents a modeling approach to ES surface coverage where an analytical model is derived for use in quantitative prediction of surface coverage of ES fibers. The analytical model is used to predict the diameter of circular deposition areas of constant field strength and constant electrostatic force. Experimental results of polyvinyl alcohol fibers are reported and compared to numerical models to supplement the analytical model derived. The analytical model provides scientists and engineers a method for estimating surface area coverage. Both applied voltage and capillary-to-collection-plate separation are treated as independent variables for the analysis. The electric field produced by the ES process was modeled using COMSOL Multiphysics software to determine a correlation between the applied field strength and the size of the deposition area of the ES fibers. MATLAB scripts were utilized to combine the numerical COMSOL results with derived analytical equations. Experimental results reinforce the parametric trends produced via modeling and lend credibility to the use of modeling techniques for the qualitative prediction of surface area coverage from ES. (Copyright: 2014 American Vacuum Society.)
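As a deliberately simplified illustration of the constant-field-strength idea (a toy model, not the paper's derivation): treating the charged capillary tip as a point charge at height h above a grounded collector plate, the method of images gives a closed-form field on the plate, and a deposition radius can be taken as the largest radius where that field still exceeds a threshold:

import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def deposition_radius(voltage, separation, e_threshold):
    """Toy estimate: point charge q at height h over a grounded plate
    (method of images); the normal field on the plate is
    E(r) = q h / (2 pi eps0 (r^2 + h^2)^(3/2))."""
    h = separation
    q = 2 * np.pi * EPS0 * voltage * h   # sets E(0) = V / h (rough scaling)
    r = np.linspace(0.0, 10.0 * h, 20000)
    e_plate = q * h / (2 * np.pi * EPS0 * (r**2 + h**2) ** 1.5)
    inside = e_plate >= e_threshold
    return r[inside].max() if inside.any() else 0.0

# Example: 15 kV across a 15 cm gap, 10 kV/m threshold field.
print(deposition_radius(15e3, 0.15, 1e4))

Under this toy model the radius grows with applied voltage and shrinks with separation or a higher threshold, the same parametric trends the paper probes with COMSOL and MATLAB.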

Relevance:

30.00%

Publisher:

Abstract:

Many methodologies dealing with prediction or simulation of soft tissue deformations on medical image data require preprocessing of the data in order to produce a different shape representation that complies with standard methodologies, such as mass–spring networks or finite element methods (FEM). On the other hand, methodologies working directly on the image space normally do not take into account the mechanical behavior of tissues and tend to lack the physics foundations driving soft tissue deformations. This chapter presents a method to simulate soft tissue deformations based on coupled concepts from image analysis and mechanics theory. The proposed methodology is based on a robust stochastic approach that takes into account material properties retrieved directly from the image, concepts from continuum mechanics, and FEM. The optimization framework is solved within a hierarchical Markov random field (HMRF), which is implemented on the graphics processing unit (GPU).
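A minimal sketch of the MRF machinery at the core of such a framework, here a flat (non-hierarchical) Potts model over a 2-D image grid optimized with iterated conditional modes; the chapter's HMRF and GPU implementation are considerably richer, and the cost layout below is illustrative:

import numpy as np

def icm_mrf(data_cost, beta=1.0, n_iter=10):
    """Iterated conditional modes for a Potts-style MRF on a 2-D grid.
    data_cost: (H, W, L) per-pixel cost for each of L labels (e.g. derived
    from image intensities / material properties); beta weights smoothness."""
    H, W, L = data_cost.shape
    labels = data_cost.argmin(axis=2)              # greedy initialization
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                costs = data_cost[i, j].astype(float)
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        # Potts penalty for disagreeing with a neighbor.
                        costs += beta * (labels[ni, nj] != np.arange(L))
                labels[i, j] = costs.argmin()
    return labels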