946 results for Over-dispersion, Crash prediction, Bayesian method, Intersection safety


Relevance: 100.00%

Abstract:

Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities should match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, namely that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles for initialized and uninitialized predictions. At lead times of less than 2 years, the initialized ensembles tend to be under-dispersed and hence produce overconfident, unreliable forecasts. At longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
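
A minimal sketch of the spread-error diagnostic this abstract describes, on synthetic placeholder data (the function and array names are illustrative assumptions, not DePreSys code): for a reliable ensemble, the mean ensemble spread should roughly match the RMS error of the ensemble mean, so their ratio should be close to one.

```python
import numpy as np

def spread_error_ratio(ensemble, observations):
    """ensemble: (n_members, n_forecasts); observations: (n_forecasts,)."""
    mean = ensemble.mean(axis=0)
    spread = np.sqrt(ensemble.var(axis=0, ddof=1).mean())  # mean intra-ensemble variance
    rmse = np.sqrt(((mean - observations) ** 2).mean())    # error of the ensemble mean
    return spread / rmse  # < 1: under-dispersed (overconfident); > 1: over-dispersed

# Synthetic check: when the observations are statistically exchangeable with
# the members, the ratio comes out near one.
rng = np.random.default_rng(0)
signal = rng.normal(size=200)                    # predictable part of the truth
obs = signal + rng.normal(size=200)              # truth = signal + unpredictable noise
members = signal + rng.normal(size=(10, 200))    # a reliable 10-member ensemble
print(spread_error_ratio(members, obs))          # ~1 for this reliable setup
```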

Relevance: 100.00%

Abstract:

The direct Bayesian admissible region approach is an a priori state-free measurement association and initial orbit determination technique for optical tracks. In this paper, we test a hybrid approach that appends a least squares estimator to the direct Bayesian method on measurements taken at the Zimmerwald Observatory of the Astronomical Institute at the University of Bern. Over half of the association pairs agreed with conventional geometric track correlation and least squares techniques. The remaining pairs cast light on the fundamental limits of conducting tracklet association based solely on dynamical and geometrical information.

Relevance: 100.00%

Abstract:

Breast cancer is the most common non-skin cancer and the second leading cause of cancer-related death in women in the United States. Studies on ipsilateral breast tumor relapse (IBTR) status and disease-specific survival will help guide clinical treatment and predict patient prognosis.

After breast conservation therapy, patients with breast cancer may experience breast tumor relapse. This relapse is classified into two distinct types: true local recurrence (TR) and new ipsilateral primary tumor (NP). However, the methods used to classify the relapse types are imperfect and prone to misclassification. In addition, some observed survival data (e.g., time to relapse and time from relapse to death) are strongly correlated with relapse type. The first part of this dissertation presents a Bayesian approach to (1) model the potentially misclassified relapse status and the correlated survival information, (2) estimate the sensitivity and specificity of the diagnostic methods, and (3) quantify the covariate effects on event probabilities. A shared frailty was used to account for the within-subject correlation between survival times. Inference was conducted in a Bayesian framework via Markov chain Monte Carlo simulation implemented in the software WinBUGS. Simulation was used to validate the Bayesian method and assess its frequentist properties. The new model has two important innovations: (1) it uses the additional survival times correlated with the relapse status to improve parameter estimation, and (2) it provides tools to address the correlation between the two diagnostic methods conditional on the true relapse type.

Prediction of the patients at highest risk for IBTR after local excision of ductal carcinoma in situ (DCIS) remains a clinical concern. The goals of the second part of this dissertation were to evaluate a published nomogram from Memorial Sloan-Kettering Cancer Center for predicting the risk of IBTR in patients with DCIS treated with local excision, and to determine whether there is a subset of patients at low risk of IBTR. Patients who underwent local excision from 1990 through 2007 at MD Anderson Cancer Center with a final diagnosis of DCIS (n=794) were included in this part. Clinicopathologic factors and the performance of the Memorial Sloan-Kettering Cancer Center nomogram for prediction of IBTR were assessed for the 734 patients with complete data. The nomogram's predictions of 5- and 10-year IBTR probabilities demonstrated imperfect calibration and discrimination, with an area under the receiver operating characteristic curve of 0.63 and a concordance index of 0.63. In conclusion, predictive models for IBTR in DCIS patients treated with local excision are imperfect, and our current ability to accurately predict recurrence from clinical parameters is limited.

The American Joint Committee on Cancer (AJCC) staging of breast cancer is widely used to determine prognosis, yet survival within each AJCC stage varies widely and remains unpredictable. The third part of this dissertation hypothesized that biologic markers are responsible for some of this variation and examined whether adding biologic markers to current AJCC staging provides improved prognostication. The initial cohort included patients treated with surgery as the first intervention at MDACC from 1997 to 2006. Cox proportional hazards models were used to create prognostic scoring systems. AJCC pathologic staging parameters and biologic tumor markers were investigated to devise the scoring systems, and Surveillance, Epidemiology, and End Results (SEER) data were used as the external cohort to validate them. Binary indicators for pathologic stage (PS), estrogen receptor status (E), and tumor grade (G) were summed to create PS+EG scoring systems devised to predict 5-year patient outcomes. These scoring systems separated the study population into more refined subgroups than the current AJCC staging system, and the ability of the PS+EG score to stratify outcomes was confirmed in both internal and external validation cohorts. The current study proposes and validates a new staging system that incorporates tumor grade and ER status into current AJCC staging. We recommend that biologic markers be incorporated into revised versions of the AJCC staging system for patients receiving surgery as the first intervention.

Chapter 1 focuses on developing the Bayesian method for misclassified relapse status and its application to breast cancer data. Chapter 2 focuses on the evaluation of a breast cancer nomogram for predicting the risk of IBTR in patients with DCIS after local excision. Chapter 3 focuses on the validation of a novel staging system for disease-specific survival in patients with breast cancer treated with surgery as the first intervention.
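
As a small illustration of the kind of nomogram validation reported in the second part (discrimination via the area under the ROC curve, plus a calibration check by risk bins), here is a hedged Python sketch. `nomogram_prob` and `ibtr_event` are hypothetical stand-ins for the nomogram's predicted 5-year IBTR risk and the observed outcomes, not the MD Anderson data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
nomogram_prob = rng.uniform(0.02, 0.4, size=734)   # hypothetical predicted IBTR risk
ibtr_event = rng.binomial(1, nomogram_prob)        # simulated outcomes, not real data

# Discrimination: probability that a random case outranks a random control.
print("AUC:", roc_auc_score(ibtr_event, nomogram_prob))

# Calibration: within quantile bins of the predictions, compare the mean
# predicted risk with the observed event rate.
edges = np.quantile(nomogram_prob, [0, 0.25, 0.5, 0.75, 1])
bin_idx = np.digitize(nomogram_prob, edges[1:-1])
for b in range(4):
    m = bin_idx == b
    print(f"bin {b}: predicted {nomogram_prob[m].mean():.3f}, "
          f"observed {ibtr_event[m].mean():.3f}")
```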

Relevance: 100.00%

Abstract:

This paper reports on the development of an artificial neural network (ANN) method to detect laminar defects, following the pattern-matching approach and using dynamic measurements. Although structural health monitoring (SHM) using ANNs has attracted much attention in the last decade, the problem of how to select the optimal class of ANN models has not been investigated in great depth. The lack of a rigorous ANN design methodology is one of the main reasons for the delay in the successful application of this promising technique in SHM. In this paper, a Bayesian method is applied to the selection of the optimal class of ANN models for a given set of input/target training data. The ANN design method is demonstrated for the detection and characterisation of laminar defects in carbon fibre-reinforced beams, using flexural vibration data for beams with and without non-symmetric delamination damage.
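
The paper's full Bayesian selection of the ANN model class is not reproduced here, but its flavor can be sketched with a cruder stand-in: compare hidden-layer sizes by a BIC-style approximation to the log evidence. Everything below (features, targets, network sizes) is a synthetic illustration, not the paper's beam data or method.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 4))   # placeholder modal features of a beam
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

for hidden in (2, 4, 8, 16, 32):
    net = MLPRegressor(hidden_layer_sizes=(hidden,), solver="lbfgs",
                       max_iter=2000, random_state=0).fit(X, y)
    resid = y - net.predict(X)
    n = len(y)
    k = sum(w.size for w in net.coefs_) + sum(b.size for b in net.intercepts_)
    bic = n * np.log(resid.var()) + k * np.log(n)   # lower is better
    print(f"{hidden:>3} hidden units: BIC = {bic:.1f}")
```

Like the Bayesian evidence it approximates, the BIC penalizes model classes that fit no better despite having more parameters, so the score typically bottoms out at a moderate network size.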

Relevance: 100.00%

Abstract:

The retrieval of wind fields from scatterometer observations has traditionally been separated into two phases: local wind vector retrieval and ambiguity removal. Operationally, a forward model relating wind vector to backscatter is inverted, typically using look-up tables, to retrieve up to four local wind vector solutions. A heuristic procedure, using numerical weather prediction forecast wind vectors and, often, some neighbourhood comparison, is then used to select the correct solution. In this paper we develop a Bayesian method for wind field retrieval, and show how a direct local inverse model, relating backscatter to wind vector, improves the wind vector retrieval accuracy. We compare these results with the operational U.K. Meteorological Office retrievals, our own CMOD4 retrievals, and a neural network based local forward model retrieval. We suggest that the neural network based inverse model, which is extremely fast to use, improves upon current forward models when used in a variational data assimilation scheme.
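
A toy sketch of the Bayesian idea described here: combine a forward-model likelihood p(sigma0 | wind) with an NWP prior p(wind) on a grid of wind vectors, rather than retrieving ambiguous solutions and selecting one heuristically. The forward model below is a crude CMOD-like stand-in (not CMOD4), and all numbers are illustrative assumptions.

```python
import numpy as np

def forward(speed, wdir, azimuth):
    # Toy geophysical model: the cos(2*chi) term gives the classic
    # upwind/downwind directional ambiguity of real scatterometer models.
    return 0.01 * speed**1.5 * (1 + 0.4 * np.cos(2 * np.radians(wdir - azimuth)))

speeds = np.linspace(2, 24, 100)
dirs = np.arange(0.0, 360.0, 2.0)
S, D = np.meshgrid(speeds, dirs)

obs = forward(10.0, 80.0, azimuth=30.0) + 0.001        # one noisy backscatter value
loglik = -0.5 * ((forward(S, D, 30.0) - obs) / 0.002) ** 2
ddist = np.minimum(np.abs(D - 75.0), 360 - np.abs(D - 75.0))  # circular distance
logprior = -0.5 * (((S - 9.0) / 2.0) ** 2 + (ddist / 30.0) ** 2)  # NWP: 9 m/s, 75 deg

post = loglik + logprior                               # log posterior (unnormalized)
i, j = np.unravel_index(np.argmax(post), post.shape)
print(f"MAP wind: {S[i, j]:.1f} m/s from {D[i, j]:.0f} deg")
```

The prior weighs the two backscatter-consistent direction lobes differently, so the posterior mode resolves the ambiguity in one step instead of in a separate post-processing phase.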

Relevance: 100.00%

Abstract:

The principled statistical application of Gaussian random field models used in geostatistics has historically been limited to data sets of a small size. This limitation is imposed by the requirement to store and invert the covariance matrix of all the samples to obtain a predictive distribution at unsampled locations, or to use likelihood-based covariance estimation. Various ad hoc approaches to solve this problem have been adopted, such as selecting a neighborhood region and/or a small number of observations to use in the kriging process, but these have no sound theoretical basis and it is unclear what information is being lost. In this article, we present a Bayesian method for estimating the posterior mean and covariance structures of a Gaussian random field using a sequential estimation algorithm. By imposing sparsity in a well-defined framework, the algorithm retains a subset of “basis vectors” that best represent the “true” posterior Gaussian random field model in the relative entropy sense. This allows a principled treatment of Gaussian random field models on very large data sets. The method is particularly appropriate when the Gaussian random field model is regarded as a latent variable model, which may be nonlinearly related to the observations. We show the application of the sequential, sparse Bayesian estimation in Gaussian random field models and discuss its merits and drawbacks.
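
A minimal sketch of prediction from a retained subset of "basis vectors": an exact GP posterior computed from m << n retained points. The greedy max-variance selection below is a simple stand-in for the paper's relative-entropy criterion, and the kernel, data, and sizes are illustrative assumptions.

```python
import numpy as np

def k(a, b, ell=0.5):
    # Squared-exponential kernel with unit prior variance.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

rng = np.random.default_rng(3)
x = rng.uniform(0, 5, 500)                      # a "large" data set
y = np.sin(2 * x) + 0.1 * rng.normal(size=500)
noise = 0.1 ** 2

basis = [0]                                     # start from an arbitrary point
for _ in range(14):                             # retain 15 basis vectors in total
    Kbb = k(x[basis], x[basis]) + noise * np.eye(len(basis))
    Kxb = k(x, x[basis])
    # Posterior variance at every point given the current basis; pick the
    # most poorly represented point as the next basis vector.
    var = 1.0 - np.einsum("ij,jk,ik->i", Kxb, np.linalg.inv(Kbb), Kxb)
    basis.append(int(np.argmax(var)))

# Predict using only the m retained points: O(m^3) instead of O(n^3).
Kbb = k(x[basis], x[basis]) + noise * np.eye(len(basis))
xs = np.linspace(0, 5, 7)
mean = k(xs, x[basis]) @ np.linalg.solve(Kbb, y[basis])
print(np.round(mean, 2))                        # posterior mean at test points
```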

Relevance: 100.00%

Abstract:

This thesis addresses data assimilation, which typically refers to the estimation of the state of a physical system given a model and observations, and its application to short-term precipitation forecasting. A general introduction to data assimilation is given, from both a deterministic and a stochastic point of view. Data assimilation algorithms are reviewed, first in the static case (when no dynamics are involved), then in the dynamic case. Experiments on two non-linear models, the Lorenz 63 and Lorenz 96 models, are run, and the comparative performance of the methods is discussed in terms of quality of the assimilation, robustness in the non-linear regime, and computational time. Following the general review and analysis, data assimilation is discussed in the particular context of very short-term rainfall forecasting (nowcasting) using radar images. An extended Bayesian precipitation nowcasting model is introduced. The model is stochastic in nature and relies on the spatial decomposition of the rainfall field into rain "cells". Radar observations are assimilated using a variational Bayesian method in which the true posterior distribution of the parameters is approximated by a more tractable distribution. The motion of the cells is captured by a 2D Gaussian process. The model is tested on two precipitation events, the first dominated by convective showers, the second by precipitation fronts. Several deterministic and probabilistic validation methods are applied, and the model is shown to retain reasonable prediction skill at up to 3 hours lead time. Extensions to the model are discussed.
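
A compact twin-experiment sketch in the spirit of the comparison the thesis describes: the Lorenz 63 model assimilated with a basic stochastic ensemble Kalman filter, one of the standard algorithms such a review covers. The integrator, noise levels, and observation schedule are illustrative assumptions.

```python
import numpy as np

def lorenz63(v, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = v
    d = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return v + dt * d                            # forward Euler, for brevity

rng = np.random.default_rng(4)
truth = np.array([1.0, 1.0, 1.0])
ens = truth + rng.normal(size=(20, 3))           # 20-member ensemble
R = 2.0                                          # obs error variance (x observed)

for step in range(1000):
    truth = lorenz63(truth)
    ens = np.apply_along_axis(lorenz63, 1, ens)
    if step % 25 == 0:                           # observe x every 25 steps
        obs = truth[0] + rng.normal(scale=np.sqrt(R))
        A = ens - ens.mean(axis=0)
        P = A.T @ A / (len(ens) - 1)             # sample forecast covariance
        K = P[:, 0] / (P[0, 0] + R)              # Kalman gain for H = [1, 0, 0]
        perturbed = obs + rng.normal(scale=np.sqrt(R), size=len(ens))
        ens += np.outer(perturbed - ens[:, 0], K)  # stochastic EnKF update

print("analysis error:", np.abs(ens.mean(axis=0) - truth).round(2))
```

Even with only one observed component, the ensemble's sampled covariances spread the correction to the unobserved y and z, which is the property such twin experiments are designed to probe.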

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62E16, 65C05, 65C20.

Relevance: 100.00%

Abstract:

The Highway Safety Manual (HSM) is the compilation of national safety research that provides quantitative methods for analyzing highway safety. The HSM presents crash modification functions related to freeway work zone characteristics such as work zone duration and length. These crash modification functions were based on freeway work zones with high traffic volumes in California. When the HSM-referenced model was calibrated for Missouri, the calibration factor was 3.78; a value this far above 1 indicates that the model substantially under-predicts Missouri work zone crashes. Therefore, new models were developed in this study using Missouri data to capture geographical, driver behavior, and other factors in the Midwest. New models were also developed for expressway and rural two-lane work zones, which have barely been studied in the literature. A large sample of 20,837 freeway, 8,993 expressway, and 64,476 rural two-lane work zones in Missouri was analyzed to derive 15 work zone crash prediction models. The most appropriate samples, 1,546 freeway, 1,189 expressway, and 6,095 rural two-lane work zones longer than 0.1 mile and with a duration greater than 10 days, were used to develop eight, four, and three models, respectively. A challenging question for practitioners is always how to use crash prediction models to best estimate work zone crash counts. To address this, a user-friendly software tool was developed in a spreadsheet format to predict work zone crashes based on work zone characteristics. The software selects the best model, estimates work zone crashes by severity, and converts them to monetary values using standard crash cost estimates. This study also included a survey of departments of transportation (DOTs), Federal Highway Administration (FHWA) representatives, and contractors to assess the current state of the practice regarding work zone safety. The survey results indicate that many agencies treat work zone safety informally, using engineering judgment. Respondents indicated that they would like a tool to help them balance work zone safety across projects by looking at crashes and user costs.
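
For reference, the HSM calibration step mentioned above reduces to a simple ratio: total observed crashes divided by total crashes predicted by the uncalibrated model over a local sample. A minimal sketch, with placeholder numbers rather than the Missouri data:

```python
# Crashes at a sample of local work zones and the uncalibrated HSM
# predictions for the same sites (placeholder values).
observed = [3, 0, 5, 2, 7, 1]
predicted = [0.6, 0.4, 1.5, 0.7, 1.9, 0.3]

C = sum(observed) / sum(predicted)       # HSM calibration factor
print(f"calibration factor C = {C:.2f}")  # ~3.33 here

# C >> 1 means the reference model substantially under-predicts local
# crashes, as with the 3.78 found for Missouri, which is what motivates
# fitting new models on local data instead.
```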

Relevance: 100.00%

Abstract:

Higher order (2,4) FDTD schemes used for numerical solutions of Maxwell's equations are focused on diminishing the truncation errors caused by the Taylor series expansion of the spatial derivatives. These schemes use a larger computational stencil, which generally makes use of two constant coefficients, C1 and C2, for the four-point central-difference operators. In this paper we propose a novel way to diminish these truncation errors, in order to obtain more accurate numerical solutions of Maxwell's equations. For this purpose, we present a method to individually optimize the pair of coefficients, C1 and C2, based on any desired grid size resolution and time step. In particular, we are interested in using coarser grid discretizations to be able to simulate electrically large domains. The results of our optimization algorithm show a significant reduction in dispersion error and numerical anisotropy for all modeled grid size resolutions. Numerical simulations of free-space propagation verify the very promising theoretical results. The model is also shown to perform well in more complex, realistic scenarios.
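
A sketch of the 1-D dispersion analysis behind this kind of optimization. The standard Taylor-derived (2,4) coefficients are C1 = 9/8, C2 = -1/24; the idea is to retune (C1, C2) for a chosen grid resolution and time step. The "retuned" pair below is a hypothetical single-frequency tune for illustration, not the paper's values.

```python
import numpy as np

def phase_velocity_error(C1, C2, ppw, courant=0.5):
    """Normalized phase-velocity error of the 1-D (2,4) staggered scheme.

    ppw: points per wavelength (fixes k*dx); courant: c*dt/dx.
    Dispersion relation: sin(w*dt/2) = courant * (C1*sin(k*dx/2) + C2*sin(3*k*dx/2)).
    """
    kdx = 2 * np.pi / ppw
    s = courant * (C1 * np.sin(kdx / 2) + C2 * np.sin(3 * kdx / 2))
    w_dt = 2 * np.arcsin(s)
    v_num = w_dt / (courant * kdx)        # numerical v_p / c (exactly 1 is ideal)
    return abs(v_num - 1.0)

pairs = [(9 / 8, -1 / 24),                # Taylor coefficients
         (9 / 8, -0.0449)]                # hypothetical pair tuned for ppw = 6
for C1, C2 in pairs:
    print(f"C1={C1:.4f}, C2={C2:.4f}: "
          f"error at 6 ppw = {phase_velocity_error(C1, C2, ppw=6):.2e}")
```

At a coarse 6 points per wavelength, the tuned pair trades strict Taylor accuracy at dx -> 0 for a much smaller phase-velocity error at the resolution actually used, which is the trade-off the abstract exploits for electrically large domains.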

Relevance: 100.00%

Abstract:

INTRODUCTION: The purpose of this ecological study was to evaluate the urban spatial and temporal distribution of tuberculosis (TB) in Ribeirão Preto, State of São Paulo, southeast Brazil, between 2006 and 2009, and to evaluate its relationship with factors of social vulnerability such as income and education level. METHODS: We evaluated data from TBWeb, an electronic notification system for TB cases. Measures of social vulnerability were obtained from the SEADE Foundation, and information about the number of inhabitants and the education and income of households was obtained from the Brazilian Institute of Geography and Statistics. Statistical analyses were conducted with a Bayesian regression model assuming a Poisson distribution for the observed new cases of TB in each area. A conditional autoregressive structure was used for the spatial covariance. RESULTS: The Bayesian model confirmed the spatial heterogeneity of TB distribution in Ribeirão Preto, identifying areas with elevated risk and the effects of social vulnerability on the disease. We demonstrated that the TB rate was correlated with the measures of income, education, and social vulnerability. However, we also observed areas with low vulnerability and high education and income but with high estimated TB rates. CONCLUSIONS: The study identified areas with different risks for TB, so that the public health system can address the characteristics of each region individually and prioritize those most prone to TB. Complex relationships may exist between TB incidence and a wide range of environmental and intrinsic factors, which need to be studied in future research.
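
A simplified stand-in for the model described above: a Poisson regression of new TB cases on a vulnerability score, with a log-population offset so that coefficients act on rates. The study's Bayesian model adds a conditional autoregressive spatial term, which is omitted here, and all numbers are placeholders rather than the Ribeirão Preto data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n_areas = 60
vulnerability = rng.uniform(0, 1, n_areas)       # SEADE-style vulnerability score
population = rng.integers(2000, 20000, n_areas)

# Hypothetical true risk: baseline rate scaled up by vulnerability.
rate = 30e-5 * np.exp(1.2 * vulnerability)
cases = rng.poisson(rate * population)

X = sm.add_constant(vulnerability)
fit = sm.GLM(cases, X, family=sm.families.Poisson(),
             offset=np.log(population)).fit()
print(fit.params)   # slope ~1.2: log relative risk per unit of vulnerability
```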

Relevance: 100.00%

Abstract:

This paper describes the Sequential Excavation Method, used for excavation in underground works, as well as the related risks and preventive measures. This method has characteristics that differentiate it from other tunnelling techniques: it uses a larger number of workers and pieces of equipment; it has a high concurrency of tasks, with various workers and equipment quite exposed to hazards; and it uses many potentially aggressive chemicals. First, a broad overview of the issue is given. Then, the paper presents the results of a survey of a sample of experienced technicians, aimed at gauging the relevance of a set of guidelines relating to the design and work phases, applicable to the domestic market and prepared following technical visits to works abroad.

Relevance: 100.00%

Abstract:

Background: Retrospective analyses suggest that personalized PK-based dosage might be useful for imatinib, as treatment response correlates with trough concentrations (Cmin) in cancer patients. Our objectives were to improve the interpretation of randomly measured concentrations and to confirm the method's efficiency before evaluating the clinical usefulness of systematic PK-based dosage in chronic myeloid leukemia patients. Methods and Results: A Bayesian method was validated for the prediction of individual Cmin on the basis of a single random observation, and was applied in a prospective multicenter randomized controlled clinical trial. 28 out of 56 patients were enrolled in the systematic dosage individualization arm and had 44 follow-up visits (their clinical follow-up is ongoing). PK-based dose adjustments were proposed at 39% of visits, where the predicted Cmin deviated significantly from the target (1000 ng/ml). Recommendations were taken up by physicians in 57% of cases; patients were considered non-compliant in 27%. Median Cmin at study inclusion was 754 ng/ml and differed significantly from the target (p=0.02, Wilcoxon test). On follow-up, Cmin was 984 ng/ml (p=0.82) in the compliant group. The coefficient of variation decreased from 46% to 27% (p=0.02, F-test). Conclusion: PK-based (Bayesian) dosage adjustment can bring individual drug exposure closer to a given therapeutic target. Its influence on therapeutic response remains to be evaluated.
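
A minimal sketch of the Bayesian step this abstract describes: given one randomly timed concentration, estimate the patient's clearance as a MAP value under a lognormal population prior, then predict the trough (Cmin). The one-compartment steady-state model and all population values below are illustrative placeholders, not the study's actual model or parameters.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical regimen and population PK (dose in ug so C comes out in ng/ml).
dose, tau, ka, V = 400_000, 24.0, 0.6, 250.0     # ug, h, 1/h, L
CL_pop, omega = 14.0, 0.3                        # prior: ln CL ~ N(ln 14, 0.3^2)
sigma = 0.2                                      # 20% proportional residual error

def c_ss(t, CL):
    """Steady-state concentration, one-compartment oral model."""
    ke = CL / V
    return (dose * ka / (V * (ka - ke))) * (
        np.exp(-ke * t) / (1 - np.exp(-ke * tau))
        - np.exp(-ka * t) / (1 - np.exp(-ka * tau)))

t_obs, c_obs = 6.0, 1400.0                       # one random sample, 6 h post-dose

def neg_log_post(log_cl):
    # Lognormal likelihood for the observation + lognormal prior on CL.
    pred = c_ss(t_obs, np.exp(log_cl))
    return (((np.log(c_obs) - np.log(pred)) / sigma) ** 2
            + ((log_cl - np.log(CL_pop)) / omega) ** 2)

res = minimize_scalar(neg_log_post, bounds=(0.0, 4.0), method="bounded")
cl_map = np.exp(res.x)
print(f"MAP clearance {cl_map:.1f} L/h, "
      f"predicted Cmin {c_ss(tau, cl_map):.0f} ng/ml")
```

The single observation pulls the clearance away from the population value only as far as the prior allows, which is what lets a randomly timed sample stand in for a true trough measurement.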

Relevance: 100.00%

Abstract:

Left-turning traffic is a major source of conflicts at intersections. Though an average of only 10% to 15% of all approach traffic turns left, these vehicles are involved in approximately 45% of all accidents. This report presents the results of research conducted to develop models that estimate approach accident rates at high-speed signalized intersections. The objective of the research was to quantify the relationship between traffic and intersection characteristics and the accident potential of different left turn treatments. Geometric data, turning movement counts, and traffic signal phasing data were collected for 100 intersections in Iowa using a questionnaire sent to municipalities. Not all questionnaires yielded complete data; ultimately, complete data were obtained for 63 intersections, providing a database of 248 approaches. Accident data for the same approaches were obtained from the Iowa Department of Transportation Accident Location and Analysis System (ALAS). Regression models were developed for two different dependent variables: 1) the ratio of the number of left turn accidents per approach to million left-turning vehicles per approach, and 2) the ratio of accidents per approach to million traffic movements per approach. A number of regression models were developed for both dependent variables, and one model using each dependent variable was developed for intersections with low, medium, and high left-turning traffic volumes. As expected, the research indicates that protected left turn phasing has a lower accident potential than protected/permitted or permitted phasing. Left turn lanes and multiple-lane approaches are beneficial for reducing accident rates, while raised medians increase the likelihood of accidents. Signals that are part of a signal system tend to have lower accident rates than isolated signals. The resulting regression models may be used to determine the likely impact of various left turn treatments on intersection accident rates. When designing an intersection approach, a traffic engineer may use the models to estimate the accident rate reduction resulting from improved lane configurations and left turn treatments. The safety benefits may then be compared with any costs associated with operational effects at the intersection (i.e., increased delay) to determine the benefits and costs of making intersection safety improvements.
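
A minimal sketch of the kind of regression the report fits: the left turn accident rate (accidents per million left-turning vehicles) regressed on indicators for phasing and geometry. The data below are synthetic placeholders shaped like the study's variables (248 approaches), not the Iowa data, and the coefficients are seeded to mirror the reported directions of effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 248                                          # approaches, as in the study
df = pd.DataFrame({
    "protected": rng.integers(0, 2, n),          # protected-only left turn phasing
    "turn_lane": rng.integers(0, 2, n),          # exclusive left turn lane present
    "raised_median": rng.integers(0, 2, n),
})
# Synthetic rates with effect directions matching the report's findings.
df["rate"] = (2.0 - 0.8 * df.protected - 0.5 * df.turn_lane
              + 0.4 * df.raised_median + rng.normal(0, 0.5, n)).clip(lower=0)

fit = smf.ols("rate ~ protected + turn_lane + raised_median", data=df).fit()
print(fit.params)   # protection and turn lanes lower the rate; raised
                    # medians raise it, echoing the study's conclusions
```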