967 results for Railroad safety, Bayesian methods, Accident modification factor, Countermeasure selection
Abstract:
The Highway Safety Manual (HSM) estimates roadway safety performance based on predictive models that were calibrated using national data. Calibration factors are then used to adjust these predictive models to local conditions for local applications. The HSM recommends that local calibration factors be estimated using 30 to 50 randomly selected sites that experienced at least a total of 100 crashes per year. It also recommends that the factors be updated every two to three years, preferably on an annual basis. However, these recommendations are based primarily on expert opinion rather than data-driven research findings. Furthermore, most agencies do not have data for many of the input variables recommended in the HSM. This dissertation aims to determine the best way to meet three major data needs affecting the estimation of calibration factors: (1) the required minimum sample sizes for different roadway facilities, (2) the required frequency of calibration factor updates, and (3) the influential variables affecting calibration factors. In this dissertation, statewide segment and intersection data were first collected for most of the HSM-recommended calibration variables using a Google Maps application. In addition, eight years (2005-2012) of traffic and crash data were retrieved from existing Florida Department of Transportation databases. With these data, the effect of the sample size criterion on calibration factor estimates was first studied using a sensitivity analysis. The results showed that the minimum sample sizes not only vary across roadway facilities but are also significantly higher than those recommended in the HSM. In addition, results from paired-sample t-tests showed that calibration factors in Florida need to be updated annually. To identify influential variables affecting the calibration factors for roadway segments, the variables were prioritized by combining the results from three different methods: negative binomial regression, random forests, and boosted regression trees. Only a few variables were found to explain most of the variation in the crash data. Traffic volume was consistently found to be the most influential. In addition, roadside object density, major and minor commercial driveway densities, and minor residential driveway density were also identified as influential variables.
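For reference, the HSM defines the local calibration factor as the ratio of total observed to total predicted crashes over the calibration sites. A minimal sketch of that computation (the column names and numbers below are hypothetical):

```python
import pandas as pd

# Hypothetical calibration sites: observed crash counts and crashes predicted
# by the HSM safety performance function with national default values.
sites = pd.DataFrame({
    "observed":  [12, 4, 7, 9, 3],
    "predicted": [10.2, 5.1, 6.3, 8.8, 2.9],
})

# HSM calibration factor: total observed divided by total predicted crashes.
C = sites["observed"].sum() / sites["predicted"].sum()
print(f"Calibration factor C = {C:.2f}")  # C > 1: the national model under-predicts locally
```

The sensitivity analysis described above essentially recomputes C over repeated random samples of increasing size and checks when the estimate stabilizes.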
Abstract:
Bayesian adaptive methods have been extensively used in psychophysics to estimate the point at which performance on a task attains arbitrary percentage levels, although the statistical properties of these estimators have never been assessed. We used simulation techniques to determine the small-sample properties of Bayesian estimators of arbitrary performance points, specifically addressing the issues of bias and precision as a function of the target percentage level. The study covered three major types of psychophysical task (yes-no detection, 2AFC discrimination, and 2AFC detection) and explored the entire range of target performance levels allowed for by each task. Other factors included in the study were the form and parameters of the actual psychometric function Psi, the form and parameters of the model function M assumed in the Bayesian method, and the location of Psi within the parameter space. Our results indicate that Bayesian adaptive methods render unbiased estimators of any arbitrary point on Psi only when M=Psi, and otherwise they yield bias whose magnitude can be considerable as the target level moves away from the midpoint of the range of Psi. The standard error of the estimator also increases as the target level approaches extreme values whether or not M=Psi. Contrary to widespread belief, neither the performance level at which bias is null nor that at which standard error is minimal can be predicted by the sweat factor. A closed-form expression nevertheless gives a reasonable fit to data describing the dependence of standard error on number of trials and target level, which allows determination of the number of trials that must be administered to obtain estimates with prescribed precision.
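As an illustration of the setup, here is a minimal grid-based Bayesian adaptive procedure for a yes-no detection task, assuming a logistic psychometric function with hypothetical guess and lapse rates and with only the threshold of M unknown (the M=Psi case, in which the estimator is reported to be unbiased):

```python
import numpy as np

rng = np.random.default_rng(0)
gamma, lam = 0.02, 0.02              # hypothetical guess and lapse rates

def pf(x, alpha, beta):
    # Logistic psychometric function (an assumed form for both Psi and M).
    return gamma + (1 - gamma - lam) / (1 + np.exp(-(x - alpha) / beta))

true_alpha, beta = 1.5, 2.0          # parameters of the simulated observer Psi
alphas = np.linspace(-10, 10, 401)   # grid prior over the threshold of M
post = np.full(alphas.size, 1 / alphas.size)

for _ in range(300):
    x = float(np.sum(post * alphas))                 # test at the posterior mean
    resp = rng.random() < pf(x, true_alpha, beta)    # simulated yes/no response
    post *= pf(x, alphas, beta) if resp else 1 - pf(x, alphas, beta)
    post /= post.sum()                               # Bayesian update of the grid

alpha_hat = float(np.sum(post * alphas))
# Invert M to estimate the stimulus level yielding an arbitrary target level, e.g. 80%:
p = (0.80 - gamma) / (1 - gamma - lam)
x80 = alpha_hat + beta * np.log(p / (1 - p))
print(f"threshold estimate {alpha_hat:.2f}, 80% point {x80:.2f}")
```

Repeating such runs many times over a factorial design of Psi, M, and target levels is how the bias and standard-error surfaces described above can be mapped.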
Abstract:
Acknowledgements We wish to express our gratitude to the National Geographic Society and the National Research Foundation of South Africa for funding the discovery, recovery, and analysis of the H. naledi material. The study reported here was also made possible by grants from the Social Sciences and Humanities Research Council of Canada, the Canada Foundation for Innovation, the British Columbia Knowledge Development Fund, the Canada Research Chairs Program, Simon Fraser University, the DST/NRF Centre of Excellence in Palaeosciences (COE-Pal), as well as by a Discovery Grant from the Natural Sciences and Engineering Research Council of Canada, a Young Scientist Development Grant from the Paleontological Scientific Trust (PAST), a Baldwin Fellowship from the L.S.B. Leakey Foundation, and a Seed Grant and a Cornerstone Faculty Fellowship from the Texas A&M University College of Liberal Arts. We would like to thank the South African Heritage Resource Agency for the permits necessary to work on the Rising Star site; the Jacobs family for granting access; Wilma Lawrence, Bonita De Klerk, Merrill Van der Walt, and Justin Mukanku for their assistance during all phases of the project; Lucas Delezene for valuable discussion on the dental characters of H. naledi. We would also like to thank Peter Schmid for the preparation of the Dinaledi fossil material; Yoel Rak for explaining in detail some of the characters used in previous studies; William Kimbel for drawing our attention to the possibility that there might be a problem with Dembo et al.’s (2015) codes for the two characters related to the articular eminence; Will Stein for helpful discussion about the Bayesian analyses; Mike Lee for his comments on this manuscript; John Hawks for his support in organizing the Rising Star workshop; and the associate editor and three anonymous reviewers for their valuable comments. We are grateful to S. Potze and the Ditsong Museum, B. Billings and the School of Anatomical Sciences at the University of the Witwatersrand, and B. Zipfel and the Evolutionary Studies Institute at the University of the Witwatersrand for providing access to the specimens in their care; the University of the Witwatersrand, the Evolutionary Studies Institute, and the South African National Centre of Excellence in PalaeoSciences for hosting a number of the authors while studying the material; and the Western Canada Research Grid for providing access to the high-performance computing facilities for the Bayesian analyses. Last but definitely not least, we thank the head of the Rising Star project, Lee Berger, for his leadership and support, and for encouraging us to pursue the study reported here.
Abstract:
Constant technological advances have caused a data explosion in recent years. Accordingly, modern statistical and machine learning methods must be adapted to deal with complex and heterogeneous data types. This is particularly true for analyzing biological data. For example, DNA sequence data can be viewed as categorical variables, with each nucleotide taking one of four categories. Gene expression data, depending on the quantification technology, may be continuous measurements or counts. With the advancement of high-throughput technology, the abundance of such data has become unprecedentedly rich. Therefore, efficient statistical approaches are crucial in this big data era.
Previous statistical methods for big data often aim to find low-dimensional structures in the observed data. For example, in a factor analysis model a latent Gaussian-distributed multivariate vector is assumed. With this assumption, a factor model produces a low-rank estimate of the covariance of the observed variables. Another example is the latent Dirichlet allocation model for documents, in which the mixture proportions of topics are represented by a Dirichlet-distributed variable. This dissertation proposes several novel extensions to these statistical methods, developed to address challenges in big data. The novel methods are applied in multiple real-world applications, including construction of condition-specific gene co-expression networks, estimating shared topics among newsgroups, analysis of promoter sequences, analysis of political-economic risk data, and estimating population structure from genotype data.
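For instance, the low-rank covariance structure implied by a factor model can be sketched in a few lines (a toy simulation, not the methods developed in the dissertation):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n, p, k = 500, 20, 3   # samples, observed variables, latent factors

# Simulate from the factor model x = Lambda f + eps, f ~ N(0, I), diagonal noise.
Lambda = rng.normal(size=(p, k))
X = rng.normal(size=(n, k)) @ Lambda.T + rng.normal(scale=0.5, size=(n, p))

fa = FactorAnalysis(n_components=k).fit(X)
# The estimated covariance has low-rank-plus-diagonal form: Lambda Lambda^T + diag(Psi).
cov_hat = fa.get_covariance()
low_rank_part = fa.components_.T @ fa.components_
print(cov_hat.shape, np.linalg.matrix_rank(low_rank_part))  # rank equals k
```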
Abstract:
Background: Mechanisms underlying the effect of estrogen exposure on breast cancer risk remain unclear. Insulin-like growth factor-1 (IGF-1) levels have been positively associated with breast cancer and are a potential mechanism. Objectives: The objectives of this thesis are: 1) to explore whether the reproductive risk factors and the lifetime cumulative number of menstrual cycles (LCMC), as measures of long-term estrogen exposure, are associated with IGF-1 levels, and 2) to examine the effect of an aromatase inhibitor (AI) on IGF-1 levels and the potential interaction with BMI. Methods: A cross-sectional study and a randomized controlled trial nested within the MAP.3 chemoprevention trial were used to address objectives 1 and 2, respectively. A total of 567 postmenopausal women were selected. Anthropometric measurements, lifestyle factors, reproductive characteristics, and serum IGF-1 concentrations were collected at baseline and one year. Objective 1. The LCMC was computed as a composite measure of the reproductive characteristics. Multivariable linear regression models were used to assess the association between IGF-1 levels and the LCMC and hormonal risk factors, while adjusting for potential covariates. Objective 2. Changes in IGF-1 were compared between the exemestane and placebo arms, and effect modification by BMI was tested with an interaction term. Results: Objective 1. Women aged 55 years or older at menopause had 16.26 ng/mL (95% CI: 1.76, 30.75) higher IGF-1 than women aged less than 50 years at menopause. Women in the highest category of menstrual cycles (≥500 cycles) had on average a 19.00 ng/mL (95% CI: 5.86, 32.14) higher concentration of IGF-1 than women in the lowest category (<350). Exogenous hormones had no effect on postmenopausal IGF-1 levels. Objective 2. Exemestane significantly increased IGF-1 levels by 18% (95% CI: 14%-22%), while placebo had no effect on IGF-1. The changes in IGF-1 differed significantly between the treatment arms (P<0.0001), and no significant interaction was observed between treatment and BMI on IGF-1 changes (P=0.1327). Conclusion: Objective 1. A larger number of menstrual cycles and a later age at menopause are positively associated with IGF-1. IGF-1 may be one mechanism by which prolonged estrogen exposure increases cancer risk. Objective 2. We conclude that the reduced cancer risk observed with AI therapy likely occurs through an IGF-1-independent mechanism. Further studies exploring the clinical consequences of increased IGF-1 on AI therapy are needed.
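The Objective 2 analysis, testing effect modification by BMI with an interaction term, has the following general shape (simulated stand-in data; variable names are hypothetical, not those of the MAP.3 dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 567  # sample size reported above

df = pd.DataFrame({
    "arm": rng.integers(0, 2, n),   # 1 = exemestane, 0 = placebo
    "bmi": rng.normal(27, 4, n),
})
df["igf1_change"] = 20 * df["arm"] + rng.normal(0, 15, n)  # simulated outcome

# "arm * bmi" expands to arm + bmi + arm:bmi; the arm:bmi coefficient tests
# whether BMI modifies the treatment effect on the IGF-1 change.
fit = smf.ols("igf1_change ~ arm * bmi", data=df).fit()
print(fit.pvalues["arm:bmi"])
```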
Abstract:
Background and Objectives: Mobility limitations are a prevalent issue in older adult populations, and an important determinant of disability and mortality. Neighborhood conditions are key determinants of mobility, and perception of safety may be one such determinant. Women have more mobility limitations than men, a phenomenon known as the gender mobility gap. The objective of this work was to validate a measure of perception of safety, examine the relationship between neighborhood perception of safety and mobility limitations in seniors, and explore whether these effects vary by gender. Methods: This study was cross-sectional, using questionnaire data collected from community-dwelling older adults from four sites in Canada, Colombia, and Brazil. The exposure variable was the neighborhood-aggregated Perception of Safety (PoS) scale, derived from the Physical and Social Disorder (PSD) scale by Sampson and Raudenbush. Its construct validity was verified using factor analyses and correlation with similar measures. The Mobility Assessment Tool – short form (MAT-sf), a video-based measure validated cross-culturally in the studied populations, was used to assess mobility limitations. Based on theoretical models, covariates were included in the analysis, both at the neighborhood level (SES, social capital, and built environment) and the individual level (age, gender, education, income, chronic illnesses, depression, cognitive function, BMI, and social participation). Multilevel modeling was used to account for neighborhood clustering. Gender-specific analyses were carried out. SAS and Mplus were used in this study. Results: PoS was validated across all sites. It loaded on a single factor, after excluding two items, with a Cronbach α of approximately 0.86. Mobility limitations were present in 22.08% of the sample: 16.32% among men and 27.41% among women. Neighborhood perception of safety was significantly associated with mobility limitations when controlling for all covariates, with an OR of 0.84 (95% CI: 0.73-0.96), indicating lower odds of having mobility limitations as neighborhood perception of safety improves. Gender did not affect this relationship, despite women being more likely to have mobility limitations and live in neighborhoods with poor perception of safety. Conclusion: Neighborhood perception of safety affected the prevalence of mobility limitations in older adults in the studied population.
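The reliability figure quoted above follows the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σσ²ᵢ/σ²_total); a minimal check on simulated scale data (the item structure is hypothetical):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of scale responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                         # shared construct
responses = latent + rng.normal(scale=0.6, size=(200, 6))  # six hypothetical items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```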
Abstract:
Background: Lumacaftor/ivacaftor combination therapy demonstrated clinical benefits in patients with cystic fibrosis homozygous for the Phe508del CFTR mutation. Pretreatment lung function is a confounding factor that potentially impacts the efficacy and safety of lumacaftor/ivacaftor therapy. Methods: Two multinational, randomised, double-blind, placebo-controlled, parallel-group Phase 3 studies randomised patients to receive placebo or lumacaftor (600 mg once daily [qd] or 400 mg every 12 hours [q12h]) in combination with ivacaftor (250 mg q12h) for 24 weeks. Prespecified analyses of pooled efficacy and safety data by lung function, as measured by percent predicted forced expiratory volume in 1 second (ppFEV1), were performed for patients with baseline ppFEV1 <40 (n=81) and ≥40 (n=1016) and screening ppFEV1 <70 (n=730) and ≥70 (n=342). These studies were registered with ClinicalTrials.gov (NCT01807923 and NCT01807949). Findings: The studies were conducted from April 2013 through April 2014. Improvements in the primary endpoint, absolute change from baseline at week 24 in ppFEV1, were observed with both lumacaftor/ivacaftor doses in the subgroup with baseline ppFEV1 <40 (least-squares mean difference versus placebo was 3.7 and 3.3 percentage points for lumacaftor 600 mg qd/ivacaftor 250 mg q12h and lumacaftor 400 mg q12h/ivacaftor 250 mg q12h, respectively [p<0.05]) and in the subgroup with baseline ppFEV1 ≥40 (3.3 and 2.8 percentage points, respectively [p<0.001]). Similar absolute improvements versus placebo in ppFEV1 were observed in subgroups with screening ppFEV1 <70 (3.3 and 3.3 percentage points for lumacaftor 600 mg qd/ivacaftor 250 mg q12h and lumacaftor 400 mg q12h/ivacaftor 250 mg q12h, respectively [p<0.001]) and ≥70 (3.3 and 1.9 percentage points, respectively [p=0.002 and p=0.079]). Increases in BMI and reductions in the number of pulmonary exacerbation events were observed in both lumacaftor/ivacaftor dose groups versus placebo across all lung function subgroups. Treatment was generally well tolerated, although the incidence of some respiratory adverse events was higher with active treatment than with placebo. Interpretation: Lumacaftor/ivacaftor combination therapy benefits patients homozygous for Phe508del CFTR who have varying degrees of lung function impairment. Funding: Vertex Pharmaceuticals Incorporated.
Abstract:
Based on optical imaging and spectroscopy of the Type II-Plateau SN 2013eq, we present a comparative study of commonly used distance determination methods based on Type II supernovae. The occurrence of SN 2013eq in the Hubble flow (z = 0.041 ± 0.001) prompted us to investigate the implications of the difference between "angular" and "luminosity" distances within the framework of the expanding photosphere method (EPM) that relies upon a relation between flux and angular size to yield a distance. Following a re-derivation of the basic equations of the EPM for SNe at non-negligible redshifts, we conclude that the EPM results in an angular distance. The observed flux should be converted into the SN rest frame and the angular size, θ, has to be corrected by a factor of (1 + z)^2. Alternatively, the EPM angular distance can be converted to a luminosity distance by implementing a modification of the angular size. For SN 2013eq, we find EPM luminosity distances of DL = 151 ± 18 Mpc and DL = 164 ± 20 Mpc by making use of different sets of dilution factors taken from the literature. Application of the standardized candle method for Type II-P SNe results in an independent luminosity distance estimate (DL = 168 ± 16 Mpc) that is consistent with the EPM estimate. Spectra of SN 2013eq are available in the Weizmann Interactive Supernova data REPository (WISeREP): http://wiserep.weizmann.ac.il
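The conversion invoked here is the standard Etherington relation between luminosity distance and angular diameter distance,

```latex
D_L = (1 + z)^2 \, D_A ,
```

so an EPM angular distance can be turned into a luminosity distance by applying the same (1 + z)^2 factor that corrects the angular size.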
Abstract:
The Highway Safety Manual (HSM) is a compilation of national safety research that provides quantitative methods for analyzing highway safety. The HSM presents crash modification functions related to freeway work zone characteristics such as work zone duration and length. These crash modification functions were based on freeway work zones with high traffic volumes in California. When the HSM-referenced model was calibrated for Missouri, the calibration factor was 3.78, which is not ideal since it is significantly larger than 1. Therefore, new models were developed in this study using Missouri data to capture geographical, driver behavior, and other factors specific to the Midwest. New models were also developed for expressway and rural two-lane work zones, which have barely been studied in the literature. A large sample of 20,837 freeway, 8,993 expressway, and 64,476 rural two-lane work zones in Missouri was analyzed to derive 15 work zone crash prediction models. The most appropriate samples, 1,546 freeway, 1,189 expressway, and 6,095 rural two-lane work zones longer than 0.1 mile and with durations greater than 10 days, were used to build eight, four, and three models, respectively. A challenging question for practitioners is always how to use crash prediction models to make the best estimate of the work zone crash count. To solve this problem, a user-friendly software tool was developed in a spreadsheet format to predict work zone crashes based on work zone characteristics. The software selects the best model, estimates work zone crashes by severity, and converts them to monetary values using standard crash cost estimates. This study also included a survey of departments of transportation (DOTs), Federal Highway Administration (FHWA) representatives, and contractors to assess the current state of the practice regarding work zone safety. The survey results indicate that many agencies assess work zone safety informally using engineering judgment. Respondents indicated that they would like a tool that could help them balance work zone safety across projects by looking at crashes and user costs.
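Crash prediction models of the kind described commonly take a log-linear form such as N = exp(β₀) · AADT^β₁ · L^β₂ · D^β₃ fit with a negative binomial error structure; a hedged sketch on simulated data (the columns and coefficients are hypothetical, not the Missouri models):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

# Hypothetical work zones: traffic volume, length (miles), duration (days).
df = pd.DataFrame({
    "aadt": rng.uniform(5e3, 8e4, n),
    "length": rng.uniform(0.1, 10, n),
    "duration": rng.uniform(10, 400, n),
})
mu = np.exp(-12 + 0.9 * np.log(df["aadt"]) + 0.8 * np.log(df["length"])
            + 0.6 * np.log(df["duration"]))
df["crashes"] = rng.poisson(mu * rng.gamma(2.0, 0.5, n))  # overdispersed counts

# Log-linear form: crashes = exp(b0) * AADT^b1 * L^b2 * D^b3.
X = sm.add_constant(np.log(df[["aadt", "length", "duration"]]))
fit = sm.NegativeBinomial(df["crashes"], X).fit(disp=0)
print(fit.params)
```

A spreadsheet tool like the one described would apply fitted coefficients of this kind to user-entered work zone characteristics and multiply the severity-split predictions by unit crash costs.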
Abstract:
Summaries of the data gathered for this project.
Abstract:
This study evaluated the safety impact of the Safety Edge treatment on construction projects completed in 2010 and 2011 in Iowa, assessing its effectiveness in reducing crashes.
Abstract:
The U.S. railroad companies spend billions of dollars every year on railroad track maintenance in order to ensure safety and operational efficiency of their railroad networks. Besides maintenance costs, other costs such as train accident costs, train and shipment delay costs, and rolling stock maintenance costs are also closely related to track maintenance activities. Optimizing the track maintenance process on extensive railroad networks is a very complex problem with major cost implications. Currently, the decision-making process for track maintenance planning is largely manual and primarily relies on the knowledge and judgment of experts. There is considerable potential to improve the process by using operations research techniques to develop solutions to the optimization problems in track maintenance. In this dissertation, we propose a range of mathematical models and solution algorithms for three network-level scheduling problems in track maintenance: the track inspection scheduling problem (TISP), the production team scheduling problem (PTSP), and the job-to-project clustering problem (JTPCP). TISP involves a set of inspection teams that travel over the railroad network to identify track defects. It is a large-scale routing and scheduling problem where thousands of tasks are to be scheduled subject to many difficult side constraints, such as periodicity constraints and discrete working time constraints. A vehicle routing problem formulation was proposed for TISP, and a customized heuristic algorithm was developed to solve the model. The algorithm iteratively applies a constructive heuristic and a local search algorithm in an incremental scheduling horizon framework (a toy sketch of this loop follows this abstract). The proposed model and algorithm have been adopted by a Class I railroad in its decision-making process. Real-world case studies show that the proposed approach outperforms the manual approach in short-term scheduling and can be used to conduct long-term what-if analyses to yield managerial insights. PTSP schedules capital track maintenance projects, which are the largest track maintenance activities and account for the majority of railroad capital spending. A time-space network model was proposed to formulate PTSP. More than ten types of side constraints were considered in the model, including very complex constraints such as mutual exclusion constraints and consecution constraints. A multiple neighborhood search algorithm, including a decomposition and restriction search and a block-interchange search, was developed to solve the model. Various performance enhancement techniques, such as data reduction, an augmented cost function, and subproblem prioritization, were developed to improve the algorithm. The proposed approach has been adopted by a Class I railroad for two years. Our numerical results show that the model solutions are able to satisfy all hard constraints and most soft constraints. Compared with the existing manual procedure, the proposed approach delivers significant cost savings and operational efficiency improvements. JTPCP is an intermediate problem between TISP and PTSP. It focuses on clustering thousands of capital track maintenance jobs (based on the defects identified in track inspection) into projects so that the projects can be scheduled in PTSP. A model based on the vehicle routing problem and a multiple-step heuristic algorithm were developed to solve this problem. Various side constraints, such as mutual exclusion constraints and rounding constraints, were considered.
The proposed approach has been applied in practice and has shown good performance in both solution quality and efficiency.
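At a high level, the TISP heuristic alternates construction and improvement while the scheduling horizon grows; a toy skeleton of that loop (the data structures, objective, and moves are all illustrative stand-ins, not the railroad's model):

```python
import itertools
from dataclasses import dataclass

@dataclass
class Task:
    tid: int
    day: int      # earliest day the inspection task may start
    cost: float   # travel/assignment cost proxy

def constructive(tasks, horizon_end, schedule):
    # Greedily append unscheduled tasks that fall inside the current horizon.
    for t in sorted(tasks, key=lambda t: (t.day, t.cost)):
        if t.day <= horizon_end and t.tid not in schedule:
            schedule.append(t.tid)
    return schedule

def local_search(schedule, tasks):
    # Pairwise-swap improvement on the task order under a toy objective.
    cost = lambda seq: sum(abs(tasks[t].day - pos) for pos, t in enumerate(seq))
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(schedule)), 2):
            cand = schedule[:]
            cand[i], cand[j] = cand[j], cand[i]
            if cost(cand) < cost(schedule):
                schedule, improved = cand, True
    return schedule

tasks = [Task(0, 3, 2.0), Task(1, 1, 1.0), Task(2, 8, 4.0), Task(3, 5, 0.5)]
schedule = []
for horizon_end in (4, 8, 12):   # incrementally extend the scheduling horizon
    schedule = constructive(tasks, horizon_end, schedule)
    schedule = local_search(schedule, tasks)
print(schedule)
```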
Abstract:
The impact of cooking methods (industrial pre-frying, deep-fat frying, and baking) on the nutritional quality and safety of chicken breaded nugget samples from supermarket and commercial brands was evaluated. The changes in the quality characteristics (nutritional composition, fatty acid profile, cholesterol, and salt) of the fried food and frying oil after ten consecutive frying operations were evaluated. The total fat content of nuggets varied between 10.9 and 22.7 g per 100 g of edible portion, and the salt content ranged from 0.873 to 1.63 g per 100 g. One portion of nuggets can supply up to 49% of the recommended daily salt intake, which can have a significant impact on the health of those who regularly consume this type of food, especially considering the prevalence of hypertension around the world. The analysed chicken breaded nuggets are rich in unsaturated fatty acids, which have been associated with potential health benefits, namely regarding cardiovascular diseases. The cholesterol content of baked samples was two times higher than that of the fried ones. The trans fatty acid and polar compound contents of the frying oil increased significantly, but the values remained below the maximum levels recommended by regulatory authorities for oil rejection. From a nutritional point of view, it is possible to conclude that the applied cooking methods can significantly influence the nutritional quality and safety of the analysed chicken breaded nuggets. This study contributes important knowledge on how cooking methods can change the nutritional quality and safety of foods, namely chicken nuggets, and can be very useful for dietary recommendations and nutritional assessment.
Abstract:
Statistical methodology is proposed for comparing molecular shapes. In order to account for the continuous nature of molecules, classical shape analysis methods are combined with techniques used for predicting random fields in spatial statistics. Applying a modification of Procrustes analysis, Bayesian inference is carried out using Markov chain Monte Carlo methods for the pairwise alignment of the resulting molecular fields. Superimposing entire fields rather than the configuration matrices of nuclear positions thereby solves the problem that there is usually no clear one-to-one correspondence between the atoms of the two molecules under consideration. Using a similar concept, we also propose an adaptation of the generalised Procrustes analysis algorithm for the simultaneous alignment of multiple molecular fields. The methodology is applied to a dataset of 31 steroid molecules.
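For context, ordinary Procrustes alignment of two configuration matrices, the building block the thesis generalises to whole molecular fields, is available directly in SciPy (the toy coordinates below are hypothetical):

```python
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(0)

# Two configuration matrices of nuclear positions (12 atoms x 3 coordinates);
# the second is a rotated, translated, slightly noisy copy of the first.
A = rng.normal(size=(12, 3))
theta = np.pi / 5
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
B = A @ R.T + 0.3 + rng.normal(scale=0.02, size=A.shape)

mtx1, mtx2, disparity = procrustes(A, B)  # classical pairwise Procrustes
print(f"disparity after alignment: {disparity:.4f}")
```

Superimposing fields instead of these atom matrices is what removes the need for an explicit atom-to-atom correspondence.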
Abstract:
Thesis (Master, Community Health & Epidemiology) -- Queen's University, 2016.