969 results for school safety
Abstract:
Now in its second edition, this book describes tools that are commonly used in transportation data analysis. The first part of the text provides statistical fundamentals while the second part presents continuous dependent variable models. With a focus on count and discrete dependent variable models, the third part features new chapters on mixed logit models, logistic regression, and ordered probability models. The last section provides additional coverage of Bayesian statistical modeling, including Bayesian inference and Markov chain Monte Carlo methods. Data sets are available online to use with the modeling techniques discussed.
Abstract:
To examine time allocation patterns within household-level trip-chaining, simultaneous doubly-censored Tobit models are applied to model time-use behavior within the context of household activity participation. Using the entire sample and a sub-sample of worker households from Tucson's Household Travel Survey, two sets of models are developed to better understand trip-chaining behavior among five types of households: single non-worker households, single worker households, couple non-worker households, couple one-worker households, and couple two-worker households. Durations of out-of-home subsistence, maintenance, and discretionary activities within trip chains are examined. Factors found to be associated with trip-chaining behavior include intra-household interactions, household type and structure, and household head attributes.
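The core of the Tobit specification referenced above can be written out as a censored-regression likelihood with probability mass at both censoring bounds and a normal density in between. Below is a minimal single-equation sketch in Python, assuming synthetic data and a 1440-minute daily time budget; the study itself estimates simultaneous equations across activity types, which is not reproduced here.

```python
import numpy as np
from scipy import stats, optimize

def tobit_negloglik(params, X, y, lower=0.0, upper=1440.0):
    """Negative log-likelihood of a doubly-censored Tobit model.

    y is censored at `lower` (no time spent on the activity) and at
    `upper` (an assumed daily time budget of 1440 minutes)."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)  # keep sigma positive
    mu = X @ beta
    ll = np.where(
        y <= lower,
        stats.norm.logcdf((lower - mu) / sigma),                   # left-censored
        np.where(
            y >= upper,
            stats.norm.logsf((upper - mu) / sigma),                # right-censored
            stats.norm.logpdf((y - mu) / sigma) - np.log(sigma),   # observed
        ),
    )
    return -ll.sum()

# Illustrative use with synthetic data (not the Tucson survey)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
y_star = X @ np.array([60.0, 30.0, -20.0]) + rng.normal(0, 40, 500)
y = np.clip(y_star, 0.0, 1440.0)

x0 = np.array([y.mean(), 0.0, 0.0, np.log(y.std())])
res = optimize.minimize(tobit_negloglik, x0=x0, args=(X, y))
print("estimated coefficients and log-sigma:", res.x)
```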
Abstract:
A study was done to develop macrolevel crash prediction models that can be used to understand and identify effective countermeasures for improving signalized highway intersections and multilane stop-controlled highway intersections in rural areas. Poisson and negative binomial regression models were fit to intersection crash data from Georgia, California, and Michigan. To assess the suitability of the models, several goodness-of-fit measures were computed. The statistical models were then used to shed light on the relationships between crash occurrence and traffic and geometric features of the rural signalized intersections. The results revealed that traffic flow variables significantly affected the overall safety performance of the intersections regardless of intersection type and that the geometric features of intersections varied across intersection type and also influenced crash type.
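For readers unfamiliar with the modelling step, fitting Poisson and negative binomial count models of crash frequency and comparing goodness of fit looks roughly like the sketch below. The data and covariate names are synthetic placeholders, not the Georgia, California, and Michigan data used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical intersection-level data: crash counts plus traffic/geometric covariates
df = pd.DataFrame({
    "crashes":        np.random.poisson(5, 200),
    "log_major_aadt": np.random.normal(9.0, 0.5, 200),
    "log_minor_aadt": np.random.normal(7.5, 0.6, 200),
    "n_lanes":        np.random.choice([2, 4], 200),
})
X = sm.add_constant(df[["log_major_aadt", "log_minor_aadt", "n_lanes"]])
y = df["crashes"]

poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.NegativeBinomial(y, X).fit(disp=False)  # also estimates the dispersion alpha

# Goodness-of-fit comparisons of the kind described above
print("Poisson deviance:", poisson_fit.deviance, " AIC:", poisson_fit.aic)
print("NB alpha:", negbin_fit.params.get("alpha"), " AIC:", negbin_fit.aic)
```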
Abstract:
Large trucks are involved in a disproportionately small fraction of total crashes but a disproportionately large fraction of fatal crashes. Large truck crashes often result in significant congestion due to the trucks' large physical dimensions and difficulties in clearing crash scenes. Consequently, preventing large truck crashes is critical to improving highway safety and operations. This study identifies high risk sites (hot spots) for large truck crashes in Arizona and examines potential risk factors related to the design and operation of the high risk sites. High risk sites were identified using both state-of-the-practice methods (accident reduction potential using negative binomial regression with long crash histories) and a newly proposed method using Property Damage Only Equivalents (PDOE). The hot spots identified via the count model generally exhibited few fatalities and major injuries but many minor injuries and PDO crashes, while the opposite trend was observed using the PDOE methodology. The hot spots based on the count model exhibited large AADTs, whereas those based on the PDOE showed relatively small AADTs but large fractions of trucks and high posted speed limits. Documented site investigations of hot spots revealed numerous potential risk factors, including weaving activity near freeway junctions and ramps, absence of acceleration lanes near on-ramps, shoulders too narrow to accommodate large trucks, narrow lane widths, inadequate signage, and poor lighting conditions within a tunnel.
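The PDOE idea amounts to weighting crashes by severity into property-damage-only equivalents before ranking sites, so that severe but infrequent truck crashes are not drowned out by frequency alone. A small illustrative sketch follows; the severity weights and counts are placeholders, not the values used in the study.

```python
import pandas as pd

# Illustrative severity weights relative to a PDO crash (placeholders only)
PDOE_WEIGHTS = {"fatal": 100.0, "major_injury": 25.0, "minor_injury": 5.0, "pdo": 1.0}

sites = pd.DataFrame({
    "site_id":      [1, 2, 3],
    "fatal":        [1, 0, 0],
    "major_injury": [2, 0, 1],
    "minor_injury": [3, 12, 4],
    "pdo":          [5, 30, 8],
})

# Convert each site's crash history into a single PDO-equivalent score and rank
sites["pdoe_score"] = sum(sites[col] * w for col, w in PDOE_WEIGHTS.items())
ranked = sites.sort_values("pdoe_score", ascending=False)
print(ranked[["site_id", "pdoe_score"]])
```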
Abstract:
Speeding is recognized as a major contributing factor in traffic crashes. To reduce speed-related crashes, the city of Scottsdale, Arizona, implemented the first fixed-camera photo speed enforcement program (SEP) on a limited access freeway in the US. The 9-month demonstration program, spanning January 2006 to October 2006, was implemented on a 6.5 mile urban freeway segment of Arizona State Route 101 running through Scottsdale. This paper presents the results of a comprehensive analysis of the impact of the SEP on speeding behavior, crashes, and the economic impact of crashes. The impact on speeding behavior was estimated using generalized least squares estimation, in which the observed speeds and the speeding frequencies during the program period were compared to those during other periods. The impact of the SEP on crashes was estimated using three evaluation methods: a before-and-after (BA) analysis using a comparison group, a BA analysis with traffic flow correction, and an empirical Bayes BA analysis with time-variant safety. The analysis results reveal that speeding detection frequencies (speeds ≥ 76 mph) increased by a factor of 10.5 after the SEP was (temporarily) terminated. Average speeds in the enforcement zone were reduced by about 9 mph when the SEP was implemented, after accounting for the influence of traffic flow. All crash types were reduced except rear-end crashes, although the estimated magnitude of impact varies across estimation methods (and their corresponding assumptions). When considering Arizona-specific crash-related injury costs, the SEP is estimated to yield about $17 million in annual safety benefits.
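The empirical Bayes before-and-after logic combines a safety performance function (SPF) prediction with the site's own crash history to estimate what would have occurred without the program, then compares that estimate with the observed after-period count. The sketch below shows the standard calculation with made-up numbers; it does not reproduce the study's time-variant safety formulation.

```python
# Empirical Bayes before-and-after sketch (illustrative numbers only)
observed_before = 48      # crashes observed in the before period
spf_before = 40.0         # SPF-predicted crashes for the before period
overdispersion = 0.8      # NB overdispersion parameter of the SPF

# EB weight and expected before-period crashes
w = 1.0 / (1.0 + overdispersion * spf_before)
eb_before = w * spf_before + (1.0 - w) * observed_before

# Scale to the after period; the SPF ratio captures changes in exposure and duration
spf_after = 21.0
expected_after_without_treatment = eb_before * (spf_after / spf_before)

observed_after = 14
index_of_effectiveness = observed_after / expected_after_without_treatment
print(f"Estimated crash reduction: {(1 - index_of_effectiveness):.1%}")
```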
Abstract:
Identifying crash “hotspots”, “blackspots”, “sites with promise”, or “high risk” locations is standard practice in departments of transportation throughout the US. The literature is replete with the development and discussion of statistical methods for hotspot identification (HSID). Theoretical derivations and empirical studies have been used to weigh the benefits of various HSID methods; however, few studies have used controlled experiments to systematically assess them. Using experimentally derived simulated data, which are argued to be superior to empirical data for this purpose, three hot spot identification methods observed in practice are evaluated: simple ranking, confidence interval, and empirical Bayes. With simulated data, sites with promise are known a priori, in contrast to empirical data where high risk sites are not known with certainty. To conduct the evaluation, properties of observed crash data are used to generate simulated crash frequency distributions at hypothetical sites. A variety of factors is manipulated to simulate a host of ‘real world’ conditions. Various levels of confidence are explored, and false positives (identifying a safe site as high risk) and false negatives (identifying a high risk site as safe) are compared across methods. Finally, the effects of crash history duration on the three HSID approaches are assessed. The results illustrate that the empirical Bayes technique significantly outperforms the ranking and confidence interval techniques (with certain caveats). As found by others, false positives and false negatives are inversely related. Three years of crash history appears, in general, to provide an appropriate crash history duration.
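The simulation design can be mimicked in a few lines: draw "true" site means, generate negative binomial crash counts, flag hot spots with simple ranking and with empirical Bayes shrinkage, and score false positives and negatives against the known truth. The sketch below uses assumed parameter values and a simplified EB step (shrinkage toward the network-wide mean rather than a fitted safety performance function).

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites, years, alpha = 1000, 3, 0.6  # assumed values

# True long-run crash means and the sites that are genuinely high risk
true_mu = rng.gamma(shape=2.0, scale=2.0, size=n_sites)
truly_high = true_mu >= np.quantile(true_mu, 0.95)

# Simulated crash history: NB counts generated as a gamma-Poisson mixture
site_rate = rng.gamma(shape=1 / alpha, scale=alpha * true_mu[:, None], size=(n_sites, years))
counts = rng.poisson(site_rate).sum(axis=1)

# Method 1: simple ranking of observed counts
rank_flag = counts >= np.quantile(counts, 0.95)

# Method 2: empirical Bayes shrinkage toward the network-wide mean
mu_hat = counts.mean()
w = 1.0 / (1.0 + alpha * mu_hat)
eb = w * mu_hat + (1.0 - w) * counts
eb_flag = eb >= np.quantile(eb, 0.95)

for name, flag in [("ranking", rank_flag), ("empirical Bayes", eb_flag)]:
    fp = np.sum(flag & ~truly_high)   # safe sites flagged as high risk
    fn = np.sum(~flag & truly_high)   # high-risk sites missed
    print(f"{name}: false positives={fp}, false negatives={fn}")
```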
Abstract:
Many studies focused on the development of crash prediction models have resulted in aggregate crash prediction models that quantify the safety effects of geometric, traffic, and environmental factors on the expected number of total, fatal, injury, and/or property damage crashes at specific locations. Crash prediction models focused on predicting different crash types, however, have rarely been developed. Crash type models are useful for at least three reasons. The first is motivated by the need to identify sites that are high risk with respect to specific crash types but that may not be revealed through crash totals. Second, countermeasures are likely to affect only a subset of all crashes—usually called target crashes—and so examination of crash types will lead to improved ability to identify effective countermeasures. Finally, there is a priori reason to believe that different crash types (e.g., rear-end, angle, etc.) are associated with road geometry, the environment, and traffic variables in different ways and as a result justify the estimation of individual predictive models. The objectives of this paper are to (1) demonstrate that different crash types are associated with predictor variables in different ways (as theorized) and (2) show that estimation of crash type models may lead to greater insights regarding crash occurrence and countermeasure effectiveness. This paper first describes the estimation results of crash prediction models for angle, head-on, rear-end, sideswipe (same direction and opposite direction), and pedestrian-involved crash types. Serving as a basis for comparison, a crash prediction model is also estimated for total crashes. Based on 837 motor vehicle crashes collected at two-lane rural intersections in the state of Georgia, six prediction models are estimated, resulting in two Poisson (P) models and four negative binomial (NB) models. The analysis reveals that factors such as the annual average daily traffic, the presence of turning lanes, and the number of driveways have a positive association with each type of crash, whereas median widths and the presence of lighting are negatively associated. For the best-fitting models, covariates are related to crash types in different ways, suggesting that crash types are associated with different precrash conditions and that modeling total crash frequency may not be helpful for identifying specific countermeasures.
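In practice, estimating crash-type models means fitting a separate count regression per crash type and comparing the estimated coefficients across types. A brief sketch with synthetic data follows; the covariates and crash types shown are only a subset of those named above, and the data are not the Georgia intersection data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "log_aadt":  rng.normal(8.5, 0.6, n),
    "driveways": rng.poisson(2, n),
    "lighting":  rng.integers(0, 2, n),
})
X = sm.add_constant(df)  # design matrix built before the counts are added

# Hypothetical overdispersed counts per crash type (in practice these come from the crash database)
crash_types = ["angle", "rear_end", "total"]
for ct in crash_types:
    lam = np.exp(-4.0 + 0.7 * df["log_aadt"] + 0.1 * df["driveways"]) * rng.gamma(2.0, 0.5, n)
    df[ct] = rng.poisson(lam)

# One negative binomial model per crash type; comparing coefficients across types
# shows whether, e.g., driveways matter more for angle than for rear-end crashes.
fits = {ct: sm.NegativeBinomial(df[ct], X).fit(disp=False) for ct in crash_types}
for ct, fit in fits.items():
    print(ct, fit.params.round(3).to_dict())
```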
Abstract:
Statisticians, along with other scientists, have made significant computational advances that enable the estimation of complex statistical models that were formerly intractable. The Bayesian inference framework, combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler, enables the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices, or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of potentially limiting assumptions of the MNL, such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thereby relaxing the usually limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach for many transportation applications. The paper concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
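As a concrete illustration of the Bayesian MNL machinery, the sketch below draws posterior samples for the coefficients of a simple three-alternative logit with normal priors, using a random-walk Metropolis sampler as a stand-in for the Gibbs/MCMC schemes discussed; the data are synthetic, not the route-choice data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic route-choice style data: 3 alternatives, 2 alternative-specific covariates
n, J, K = 500, 3, 2
X = rng.normal(size=(n, J, K))
beta_true = np.array([-1.0, 0.5])
util = X @ beta_true
probs = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
y = np.array([rng.choice(J, p=p) for p in probs])

def log_posterior(beta, prior_sd=5.0):
    """MNL log-likelihood plus an independent normal prior on each coefficient."""
    u = X @ beta
    ll = (u[np.arange(n), y] - np.log(np.exp(u).sum(axis=1))).sum()
    log_prior = -0.5 * np.sum((beta / prior_sd) ** 2)
    return ll + log_prior

# Random-walk Metropolis sampling of the posterior
beta = np.zeros(K)
samples, lp_cur = [], log_posterior(beta)
for _ in range(5000):
    prop = beta + rng.normal(scale=0.1, size=K)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        beta, lp_cur = prop, lp_prop
    samples.append(beta)

post = np.array(samples[1000:])  # drop burn-in draws
print("posterior means:", post.mean(axis=0), "vs true", beta_true)
```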
Abstract:
In Australia and overseas, fleet safety, or work-related road safety, is an issue gaining increased attention from researchers, organisations, road safety practitioners and the general community. This attention is primarily in response to the substantial physical, emotional and economic costs associated with work-related road crashes. The increased risk factors and subsequent costs of work-related driving are also now well documented in the literature. For example, research has demonstrated that work-related drivers on average report a higher level of crash involvement than personal car drivers (Downs et al., 1999; Kweon and Kockelman, 2003), and, within Australia in particular, road crashes are the most common form of work-related fatality (Haworth et al., 2000).
Abstract:
National estimates of the prevalence of child abuse-related injuries are obtained from a variety of sectors, including welfare, justice, and health, resulting in inconsistent estimates across sectors. The International Classification of Diseases (ICD) is used as the international standard for categorising health data and aggregating data for statistical purposes, though there has been limited validation of the quality, completeness or concordance of these data with other sectors. This research study examined the quality of documentation and coding of child abuse recorded in hospital records in Queensland and the concordance of these data with child welfare records. A retrospective medical record review was used to examine the clinical documentation of over 1000 hospitalised injured children from 20 hospitals in Queensland. A data linkage methodology was used to link these records with records in the child welfare database. Cases were sampled from three sub-groups according to the presence of target ICD codes: definite abuse, possible abuse, and unintentional injury. Less than 2% of cases coded as unintentional were recoded after review as possible abuse, and only 5% of cases coded as possible abuse were reclassified as unintentional, though there was greater variation in the classification of cases as definite abuse than as possible abuse. Concordance of health data with child welfare data varied across patient subgroups. This study will inform the development of strategies to improve the quality, consistency and concordance of information between health and welfare agencies, to ensure adequate system responses to children at risk of abuse.
Abstract:
Emergency departments (EDs) are often the first point of contact with an abused child. Despite legal mandate, the reporting of definite or suspected abusive injury to child safety authorities by ED clinicians varies due to a number of factors including training, access to child safety professionals, departmental culture and a fear of ‘getting it wrong’. This study examined the quality of documentation and coding of child abuse captured by ED based injury surveillance data and ED medical records in the state of Queensland and the concordance of these data with child welfare records. A retrospective medical record review was used to examine the clinical documentation of almost 1000 injured children included in the Queensland Injury Surveillance Unit database (QISU) from 10 hospitals in urban and rural centres. Independent experts re-coded the records based on their review of the notes. A data linkage methodology was then used to link these records with records in the state government’s child welfare database. Cases were sampled from three sub-groups according to the surveillance intent codes: Maltreatment by parent, Undetermined and Unintentional injury. Only 0.1% of cases coded as unintentional injury were recoded to maltreatment by parent, while 1.2% of cases coded as maltreatment by parent were reclassified as unintentional and 5% of cases where the intent was undetermined by the triage nurse were recoded as maltreatment by parent. Quality of documentation varied across type of hospital (tertiary referral centre, children’s, urban, regional and remote). Concordance of health data with child welfare data varied across patient subgroups. Outcomes from this research will guide initiatives to improve the quality of intentional child injury surveillance systems.
Abstract:
Signalling layout design is one of the keys to railway operations with a fixed-block signalling system, and it also has a direct effect on overall train efficiency and safety. Based on an analysis of system objectives, this paper presents an optimisation model with two objectives in order to devise an efficient signalling layout scheme. Taking into account present railway line design practices in China, the paper describes the steps of computer-based signalling layout optimisation with real-coded genetic algorithms. A computer-aided system, based on a train movement simulator, has also been employed to assist the optimisation process. A case study on a practical railway line has been conducted to compare the proposed GA-based approach with current practice. The results illustrate the improved performance of the proposed approach in reducing signal block joints and shortening the minimum train service headway.
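The optimisation step can be illustrated with a bare-bones real-coded genetic algorithm in which a chromosome is a vector of signal positions along the line, and selection, blend crossover and Gaussian mutation operate directly on the real-valued genes. The fitness function below is a toy surrogate; in the paper the objectives come from a train movement simulator, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
LINE_KM, N_SIGNALS, POP, GENS = 20.0, 8, 60, 200

def fitness(positions):
    """Toy surrogate: penalise uneven block lengths (a proxy for long headways)
    and very short blocks; the real objectives come from a train movement simulator."""
    blocks = np.diff(np.sort(np.concatenate(([0.0], positions, [LINE_KM]))))
    return -(blocks.std() + 10.0 * np.sum(blocks < 0.5))

pop = rng.uniform(0.0, LINE_KM, size=(POP, N_SIGNALS))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    # Binary tournament selection
    idx = np.array([max(rng.choice(POP, 2, replace=False), key=lambda i: scores[i])
                    for _ in range(POP)])
    parents = pop[idx]
    # Blend (arithmetic) crossover on the real-valued genes
    partners = parents[rng.permutation(POP)]
    w = rng.uniform(size=(POP, N_SIGNALS))
    children = w * parents + (1 - w) * partners
    # Gaussian mutation applied to ~10% of genes, clipped to the line length
    children += rng.normal(0, 0.2, children.shape) * (rng.uniform(size=children.shape) < 0.1)
    pop = np.clip(children, 0.0, LINE_KM)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best signal positions (km):", np.sort(best).round(2))
```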
Abstract:
This paper describes a number of techniques for GNSS navigation message authentication. A detailed analysis of the security facilitated by navigation message authentication is given. The analysis takes into consideration the risks to critical applications that rely on GPS, including transportation, finance and telecommunication networks. We propose a number of cryptographic authentication schemes for navigation data authentication. These authentication schemes provide authenticity and integrity of the navigation data to the receiver. The performance of the schemes is quantified through software simulation, which enables the collection of authentication performance data for different data channels and assessment of the impact of the various schemes on the infrastructure and the receiver. Navigation message authentication schemes have been simulated at the proposed data rates of the Galileo and GPS services, and the resulting performance data are presented. The paper concludes by making recommendations for optimal implementation of navigation message authentication for Galileo and next-generation GPS systems.
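For intuition, the core of one class of navigation message authentication schemes, appending a digital signature that the receiver verifies against a trusted public key, can be sketched with an off-the-shelf ECDSA primitive. This is a generic illustration using the Python cryptography package, not the specific schemes, data channels or data rates evaluated in the paper.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# System-side key pair (in practice the private key stays with the system operator)
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

# A stand-in for a navigation message subframe (ephemeris, clock, etc.)
nav_message = b"\x8b\x00\x1a..."  # placeholder bytes, not a real GPS/Galileo frame

# The operator signs the message; the signature is broadcast alongside it
signature = private_key.sign(nav_message, ec.ECDSA(hashes.SHA256()))

# The receiver verifies authenticity and integrity before using the data
try:
    public_key.verify(signature, nav_message, ec.ECDSA(hashes.SHA256()))
    print("navigation data authenticated")
except InvalidSignature:
    print("reject: message may be spoofed or corrupted")
```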
Abstract:
Tracking/remote monitoring systems using GNSS are a proven method to enhance the safety and security of personnel and vehicles carrying precious or hazardous cargo. While GNSS tracking appears to mitigate some of these threats, if not adequately secured it can be a double-edged sword, allowing adversaries to obtain sensitive shipment and vehicle position data to better coordinate their attacks, and providing a false sense of security to monitoring centers. Tracking systems must be designed with the ability to perform route-compliance monitoring and to thwart attacks ranging from low-level attacks, such as the cutting of antenna cables, to medium- and high-level attacks involving radio jamming and signal/data-level simulation, especially where the goods transported have a potentially high value to terrorists. This paper discusses the use of GNSS in critical tracking applications, addressing the mitigation of GNSS security issues, augmentation systems and communication systems in order to provide highly robust and survivable tracking systems.
Abstract:
Since 1996, the provision of a refuge floor has been a mandatory feature for all new tall buildings in Hong Kong. These floors are designed to provide building occupants with a fire-safe environment that is also free from smoke. However, the cross ventilation on these floors assumed by the Building Codes of Hong Kong to achieve smoke removal is still being questioned, so a further scientific study of the wind-induced ventilation of a refuge floor is needed. This paper presents an investigation into this issue. A developed computational technique was adopted to study the wind-induced natural ventilation on a refuge floor. The aim of the investigation was to establish whether a refuge floor with a central core, with cross ventilation produced by only two open opposite external side walls on the refuge floor, would provide the required protection in all situations, taking into account the behaviour of wind for different floor heights, wall boundary conditions and turbulence intensity profiles. The results revealed that natural ventilation can be increased by increasing the floor height provided the wind angle to the building is less than 90 degrees. The effectiveness of the solution was greatly reduced when the wind was blowing at 90 degrees to the refuge floor opening.