937 results for crash
Abstract:
Sleepiness is a significant contributor to car crashes, and sleepiness-related crashes have higher mortality and morbidity than other crashes. Young adult drivers are at particular risk for sleepiness-related car crashes. It has been suggested that this is because young adults are typically sleepier than older adults owing to chronic sleep loss, and more often drive at times of increased risk of acute sleepiness. This prospective study aimed to determine the relationship between predicted and perceived sleepiness while driving in 47 young-adult drivers over a 4-week period. Sleepiness levels were predicted by a model incorporating known circadian and sleep factors influencing alertness, and compared to subjective ratings of sleepiness during 2518 driving episodes. Results suggested that young drivers frequently drive while at risk of crashing, at times of predicted sleepiness (>7% of episodes) and at times they felt themselves to be sleepy (>23% of episodes). A significant relationship was found between perceived and predicted estimates of sleepiness; the participants nonetheless drove at these times. The results of this study may help preventative programs to specifically target factors leading to increased sleepiness when driving (particularly time of day), and to focus interventions to stop young adults from driving when they feel sleepy. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
In this paper, we obtain detailed data on road traffic crash (RTC) casualties, by severity, for each of the eight state and territory jurisdictions of Australia and use these to estimate and compare the economic impact of RTCs across these regions. We show that the annual cost of RTCs in Australia in 2003 was approximately $17b, which is approximately 2.3% of Gross Domestic Product (GDP). Importantly, though, there is remarkable intra-national variation in the incidence rates of RTCs in Australia, and costs range from approximately 0.62 to 3.63% of Gross State Product (GSP). The paper makes two fundamental contributions: (i) it provides a detailed breakdown of estimated RTC casualties by state and territory regions in Australia, and (ii) it presents the first sub-national breakdown of RTC costs for Australia. We trust that these contributions will help policy-makers better understand sub-national variations in the road toll and will encourage further research on the causes of the marked differences in RTC outcomes across the states and territories of Australia. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Objective: The purpose of this study was to determine whether injury mechanism among injured patients is differentially distributed as a function of acute alcohol consumption (quantity, type, and drinking setting). Method: A cross-sectional study was conducted between October 2000 and October 2001 in the Gold Coast Hospital Emergency Department, Queensland, Australia. Data were collected quarterly over a 12-month period. Every injured patient who presented to the emergency department during the study period for treatment of an injury sustained less than 24 hours prior to presentation was approached for interview. The final sample comprised 593 injured patients (males = 377). Three measures of alcohol consumption in the 6 hours prior to injury were obtained from self-report: quantity, beverage type, and drinking setting. The main outcome measure was mechanism of injury, which was categorized into six groups: road traffic crash (RTC), being hit by or against something, fall, cut/piercing, overdose/poisoning, and miscellaneous. Injury intent was also measured (intentional vs unintentional). Results: After controlling for relevant confounding variables, neither quantity nor type of alcohol was significantly associated with injury mechanism. However, drinking setting (i.e., licensed premises) was significantly associated with increased odds of sustaining an intentional versus unintentional injury (odds ratio [OR] = 2.79, 95% confidence interval [CI] = 1.4-5.6); injury through being hit by/against something versus other injury types (OR = 2.59, 95% CI = 1.4-4.9); and reduced odds of sustaining an injury through RTC versus non-RTC (OR = 0.02, 95% CI = 0.004-0.9), compared with not drinking alcohol prior to injury.
Conclusions: No previous analytical studies have examined the relationship between injury mechanism and acute alcohol consumption (quantity, type, and setting) across all types of injury and all injury severities while controlling for potentially important confounders (demographic and situational confounders, risk-taking behavior, substance use, and usual drinking patterns). These data suggest that among injured patients, mechanism of injury is not differentially distributed as a function of quantity or type of acute alcohol consumption but may be differentially distributed as a function of drinking setting (i.e., RTC, intentional injury, being hit). Therefore, prevention strategies that focus primarily on the quantity and type of alcohol consumed should be directed generically across injury mechanisms and not limited to campaigns targeting particular causes of injury.
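Adjusted odds ratios like those reported above come from multivariable models, but the crude odds ratio and its Woolf (Wald-type) 95% confidence interval can be computed directly from a 2×2 table. A minimal sketch; the function name and example counts are illustrative, not data from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a/b = exposed with/without the outcome,
    c/d = unexposed with/without the outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An interval that excludes 1 (as for the ORs above) marks the association as significant at the 5% level.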
Abstract:
We recorded reflexive OKN in ten younger (32.3±5.98 years) and older (65.6±6.53 years) visually normal subjects under viewing conditions designed to differentiate M-pathway functioning from other pathways. Subjects were required to gaze straight ahead while viewing vertical gratings of either 0.43 or 1.08 cpd, drifting at either 5 or 20°/sec and presented at either 8 or 80% contrast. Gratings were presented as full field stimulation, central stimulation or peripheral (>15°) stimulation. The order of presentation of conditions was pseudo-randomised at two blocked light levels: ‘mesopic’ or twilight conditions (1.8 cd/m²) and ‘photopic’ or full light conditions (71.5 cd/m²). For the partial fields, central stimulation, the mesopic light level and lower temporal frequencies (i.e. number of stripes passing per second) each contributed to greater OKN strength as measured by slow-phase velocity (SPV). For full field stimulation, and especially for higher temporal frequencies and low contrast, there was a significant age group × light level interaction (p = 0.017): SPV diminished much more among the older than the younger group in the twilight condition compared to full light. Such a clear diminution in M-pathway sensitivity revealed by the OKN response has important implications for everyday situations like crash avoidance under twilight driving conditions.
Abstract:
It has been demonstrated, using abstract psychophysical stimuli, that speeds appear slower when contrast is reduced under certain conditions. Does this effect have any real-life consequences? One previous study found, using a low-fidelity driving simulator, that participants perceived vehicle speeds to be slower in foggy conditions. We replicated this finding with a more realistic video-based simulator using the Method of Constant Stimuli. We also found that lowering contrast reduced participants’ ability to discriminate speeds. We argue that these reduced-contrast effects could partly explain the higher crash rate of drivers with cataracts (a substantial societal problem; reduced contrast may account for part of the variance in this crash relationship). Note that even if people with cataracts can calibrate for the shift in their perception of speed using their speedometers (given that cataracts are experienced over long periods), they may still have an increased chance of making errors in speed estimation due to poor speed discrimination. This could result in individuals misjudging vehicle trajectories and thereby inflating their crash risk. We propose interventions that may help address this problem.
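In the Method of Constant Stimuli, a fixed set of comparison speeds is presented repeatedly and the proportion of "faster" judgments at each level is fitted with a psychometric function; the fitted midpoint (point of subjective equality) shifts when low contrast makes speeds look slower, and the slope indexes discrimination ability. A minimal grid-search sketch, assuming a logistic psychometric function and illustrative parameter grids (not the study's actual fitting procedure):

```python
import math

def fit_psychometric(levels, p_resp):
    """Grid-search fit of a logistic psychometric function
    p(x) = 1 / (1 + exp(-(x - pse) / slope)) to observed
    proportions of 'faster' responses; returns (pse, slope)."""
    best = (None, None, float("inf"))
    lo, hi = min(levels), max(levels)
    for i in range(101):                      # candidate PSE values
        pse = lo + (hi - lo) * i / 100
        for slope in [0.5, 1, 2, 4, 8]:       # candidate slopes
            err = sum((1 / (1 + math.exp(-(x - pse) / slope)) - p) ** 2
                      for x, p in zip(levels, p_resp))
            if err < best[2]:
                best = (pse, slope, err)
    return best[0], best[1]
```

A PSE above the true standard speed means the comparison had to be faster to appear equal, i.e. perceived speed was reduced; a shallower slope means poorer speed discrimination.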
Abstract:
The aim of this study was to determine the cues used to signal avoidance of difficult driving situations and to test the hypothesis that drivers with relatively poor high contrast visual acuity (HCVA) have fewer crashes than drivers with relatively poor normalised low contrast visual acuity (NLCVA). This is because those with poorer HCVA are well aware of their difficulties and avoid dangerous driving situations, while those with poorer NLCVA are often unaware of the extent of their problem. Age, self-reported situation avoidance and HCVA were collected during a practice-based study of 690 drivers. Screening was also carried out on 7254 drivers at various venues, mainly motorway sites, throughout the UK. Age, self-reported situation avoidance and prior crash involvement were recorded, and Titmus vision screeners were used to measure HCVA and NLCVA. Situation avoidance increased in reduced visibility conditions and was influenced by age and HCVA. Only half of the drivers used visual cues to signal situation avoidance, and most of these drivers used high rather than low contrast cues. A statistical model designed to remove confounding interrelationships between variables showed, for drivers who did not report situation avoidance, that crash involvement decreased for drivers with below average HCVA and increased for those with below average NLCVA. These relationships accounted for less than 1% of the crash variance, so the hypothesis was not strongly supported. © 2002 The College of Optometrists.
Abstract:
Background: The binocular Esterman visual field test (EVFT) is the current visual field test for driving in the UK. Merging of monocular field tests (Integrated Visual Field, IVF) has been proposed as an alternative for glaucoma patients. Aims: To examine the level of agreement between the EVFT and IVF for patients with binocular paracentral scotomata, caused by either ophthalmological or neurological conditions, and to compare outcomes with useful field of view (UFOV) performance, a test of visual attention thought to be important in driving. Methods: 60 patients with binocular paracentral scotomata but normal visual acuity (VA) were recruited prospectively. Subjects completed and were classified as “pass” or “fail” for the EVFT, IVF and UFOV. Results: Good agreement occurred between the EVFT and IVF in classifying subjects as “pass” or “fail” (kappa = 0.84). Classifications disagreed for four subjects with paracentral scotomata of neurological origin (three “passed” IVF yet “failed” EVFT). Mean UFOV scores did not differ between those who “passed” and those who “failed” both visual field tests (p = 0.11). Agreement between the visual field tests and UFOV was limited (EVFT kappa = 0.22, IVF kappa = 0.32). Conclusions: Although the IVF and EVFT agree well in classifying visual fields with regard to legal fitness to drive in the UK, the IVF “passes” some individuals currently classed as unfit to drive due to paracentral scotomata of non-glaucomatous origin. The suitability of the UFOV for assessing crash risk in those with visual field loss is questionable.
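The kappa values reported above measure agreement between two pass/fail classifications after correcting for the agreement expected by chance. A minimal sketch of Cohen's kappa for two raters' labels (the label values are illustrative):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels.
    po = observed agreement; pe = chance agreement from the
    raters' marginal label frequencies."""
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)
```

Kappa of 0.84 (EVFT vs IVF) indicates near-perfect agreement, while values around 0.2-0.3 (field tests vs UFOV) indicate only slight-to-fair agreement.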
Abstract:
A critical review of previous research revealed that visual attention tests, such as the Useful Field of View (UFOV) test, provided the best means of detecting age-related changes to the visual system that could potentially increase crash risk. However, the question was raised as to whether the UFOV, which was regarded as a static visual attention test, could be improved by the inclusion of kinetic targets that more closely represent the driving task. A computer program was written to provide more information about the derivation of UFOV test scores. Although this investigation succeeded in providing new information, some of the commercially protected UFOV test procedures still remain unknown. Two kinetic visual attention tests (DRTS1 and 2), developed at Aston University to investigate the inclusion of kinetic targets in visual attention tests, were introduced. The UFOV was found to be more repeatable than either of the kinetic visual attention tests, and neither learning effects nor age influenced these findings. Determinants of static and kinetic visual attention were explored. Increasing target eccentricity led to reduced performance on the UFOV and DRTS1 tests. The DRTS2 was not affected by eccentricity, but this may have been due to the style of presentation of its targets. This might also explain why only the DRTS2 showed laterality effects (i.e. better performance for targets presented on the left-hand side of the road). Radial location, explored using the UFOV test, showed that subjects responded best to targets positioned on the horizontal meridian. Distraction had opposite effects on static and kinetic visual attention. While UFOV test performance declined with distraction, DRTS1 performance increased. Previous research had shown that this striking difference was to be expected.
Whereas the detection of static targets is attenuated in the presence of distracting stimuli, distracting stimuli that move in a structured flow field enhance the detection of moving targets. Subjects reacted more slowly to kinetic compared to static targets, to longitudinal compared to angular motion, and to increased self-motion. However, the effects of longitudinal motion, angular motion, self-motion and even target eccentricity were caused by target edge speed variations arising from optic flow field effects. The UFOV test was better able to detect age-related changes to the visual system than either of the kinetic visual attention tests. The driving samples investigated were too limited to draw firm conclusions. Nevertheless, the results presented showed that neither the DRTS2 nor the UFOV tests were powerful tools for the identification of drivers prone to crashes or poor driving performance.
Abstract:
This book challenges the assumption that it is bad news when the economy doesn’t grow. For decades, it has been widely recognized that there are ecological limits to continuing economic growth and that different ways of living, working and organizing our economies are urgently required. This urgency has increased since the financial crash of 2007–2008, but mainstream economists and politicians are unable to think differently. The authors of this book demonstrate why our economic system demands ecologically unsustainable growth and the pursuit of more ‘stuff’. They believe that what matters is quality, not quantity – a better life based on having fewer material possessions, less production and less work. Such a way of life will emphasize well‑being, community, security and ‘conviviality’. That is, more real wealth. The book will therefore appeal to everyone curious as to how a new post-growth economics can be conceived and enacted. It will be of particular interest to policy makers, politicians, businesspeople, trade unionists, academics, students, journalists and a wide range of people working in the not-for-profit sector. All of the contributors are leading thinkers on green issues and members of the new think-tank Green House.
Abstract:
Society depends on complex IT systems created by integrating and orchestrating independently managed systems. Their dramatic increase in scale and complexity over the past decade means new software-engineering techniques are needed to help us cope with their inherent complexity. The key characteristic of these systems is that they are assembled from other systems that are independently controlled and managed. While there is increasing awareness in the software engineering community of related issues, the most relevant background work comes from systems engineering. The interacting algorithms that led to the Flash Crash represent an example of a coalition of systems, serving the purposes of their owners and cooperating only because they have to. The owners of the individual systems were competing finance companies that were often mutually hostile. Each system jealously guarded its own information and could change without consulting any other system.
Abstract:
This paper will argue that the American economy could and will absorb the recent shocks, and that in the longer run (within a matter of years) it will somehow convert the revealed weaknesses to its advantage. America has a long record of learning from its excesses to improve the working of its particular brand of capitalism, dating back to the imposition of antitrust controls on the robber barons in the late 1800s and the enhancement of investor protection after the 1929 crash. The American economy has experienced market imperfections of all kinds, but it has almost always found regulatory answers that, while not perfect, were fairly reliable, and has managed to adapt to change (e.g., the Dodd-Frank Act on financial stability). The U.S. has many times pioneered the elaboration of both theoretical and policy-oriented solutions for conflicts between markets and government to increase economic welfare (Bernanke, 2008, p. 425). There is no single reason why it should not turn the latest financial calamities to its advantage. At the same time, to regain confidence in capitalism as a global system, global efforts are indispensable. To identify some of the global economic conflicts that have much to do with U.S. markets in particular, we seek answers to global systemic questions.
Abstract:
Extreme stock price movements are of great concern to both investors and the entire economy. For investors, a single negative return, or a combination of several smaller returns, can possibly wipe out so much capital that the firm or portfolio becomes illiquid or insolvent. If enough investors experience this loss, it could shock the entire economy. An example of such a case is the stock market crash of 1987. Furthermore, there has been a lot of recent interest regarding the increasing volatility of stock prices.

This study presents an analysis of extreme stock price movements. The data utilized were the daily returns for the Standard & Poor's 500 index from January 3, 1978 to May 31, 2001. Research questions were analyzed using the statistical models provided by extreme value theory. One of the difficulties in examining stock price data is that there is no consensus regarding the correct shape of the distribution function generating the data. An advantage of extreme value theory is that no detailed knowledge of this distribution function is required to apply the asymptotic theory. We focus on the tail of the distribution.

Extreme value theory allows us to estimate a tail index, which we use to derive bounds on the returns for very low probabilities of an excess. Such information is useful in evaluating the volatility of stock prices. There are three possible limit laws for the maximum: Fréchet (thick-tailed), Gumbel (thin-tailed) or Weibull (bounded tail). Results indicated that extreme returns during the time period studied follow a Fréchet distribution. Thus, this study finds that extreme value analysis is a valuable tool for examining stock price movements and can be more efficient than the usual variance in measuring risk.
Abstract:
The rate of fatal crashes in Florida has remained significantly higher than the national average for the last several years. The 2003 statistics from the National Highway Traffic Safety Administration (NHTSA), the latest available, show a fatality rate in Florida of 1.71 per 100 million vehicle-miles traveled compared to the national average of 1.48 per 100 million vehicle-miles traveled. The objective of this research is to better understand the driver, environmental, and roadway factors that affect injury severity in Florida.

In this research, the ordered logit model was used to develop six injury severity models: single-vehicle and two-vehicle crashes on urban freeways and urban principal arterials, and two-vehicle crashes at urban signalized and unsignalized intersections. The data used in this research included all crashes that occurred on the state highway system for the period from 2001 to 2003 in the Southeast Florida region, which includes Miami-Dade, Broward and Palm Beach Counties.

The results of the analysis indicate that the age group and gender of the driver at fault were significant factors of injury severity risk across all models. The greatest risk of severe injury was observed for the age groups 55 to 65 and 66 and older. A positive association between injury severity and the race of the driver at fault was also found. An at-fault driver of Hispanic origin was associated with a higher risk of severe injury in both freeway models and in the two-vehicle crash model on arterial roads. A higher risk of more severe injury was also found when an African-American was the at-fault driver in two-vehicle crashes on freeways. In addition, the arterial class was found to be positively associated with a higher risk of severe crashes. Six-lane divided arterials exhibited the highest injury severity risk of all arterial classes; the lowest severe injury risk was found for one-way roads. Alcohol involvement by the driver at fault was also found to be a significant risk factor for severe injury in the single-vehicle crash model on freeways.
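In the ordered logit model used here, a single linear index x'β together with a set of increasing cutpoints generates the probability of each ordered severity level. A minimal sketch of the category probabilities (the cutpoint values in the example are illustrative, not the fitted model):

```python
import math

def ordered_logit_probs(xb, cutpoints):
    """Category probabilities for an ordered logit model:
    P(y <= j) = 1 / (1 + exp(-(tau_j - xb))) for increasing
    thresholds tau_j; category probabilities are differences
    of adjacent cumulative probabilities."""
    cum = [1 / (1 + math.exp(-(t - xb))) for t in cutpoints] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]
```

Raising the index xb (e.g. via a risk factor such as an at-fault driver in a high-risk age group) shifts probability mass toward the most severe categories, which is how the model expresses the associations reported above.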
Abstract:
Run-off-road (ROR) crashes have increasingly become a serious concern for transportation officials in the State of Florida. These types of crashes have increased proportionally in recent years statewide and have been the focus of the Florida Department of Transportation. The goal of this research was to develop statistical models that can be used to investigate the possible causal relationships between roadway geometric features and ROR crashes on Florida's rural and urban principal arterials.

In this research, Zero-Inflated Poisson (ZIP) and Zero-Inflated Negative Binomial (ZINB) regression models were used to better model the excessive number of roadway segments with no ROR crashes. Since Florida covers a diverse area with sixty-seven counties, it was divided into four geographical regions to minimize possible unobserved heterogeneity. Three years of crash data (2000–2002) for principal arterials on the Florida State Highway System were used. Several statistical models based on the ZIP and ZINB regression methods were fitted to predict the expected number of ROR crashes on urban and rural roads for each region. Each region was further divided into urban and rural areas, resulting in a total of eight crash models. A best-fit predictive model was identified for each of these eight models in terms of AIC values. The ZINB regression was found to be appropriate for seven of the eight models, and the ZIP regression was found to be more appropriate for the remaining one. To achieve model convergence, some explanatory variables that were not statistically significant were included; therefore, strong conclusions cannot be drawn from some of these models.

Given the complex nature of crashes, recommendations for additional research are made. The interaction of weather and human condition would be quite valuable in discerning additional causal relationships for these types of crashes. Additionally, roadside data should be considered and incorporated into future research on ROR crashes.
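The zero-inflated models used here mix a point mass at zero (segments that essentially cannot produce a ROR crash) with an ordinary count distribution. For the ZIP case the probability mass function is simple to write down; a minimal sketch with illustrative parameter values:

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson pmf: with probability pi the segment
    is a structural zero; otherwise the count is Poisson(lam).
    P(0) = pi + (1-pi)e^{-lam}; P(k) = (1-pi) * Poisson(k; lam)."""
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi + (1 - pi) * pois if k == 0 else (1 - pi) * pois
```

ZINB replaces the Poisson component with a negative binomial to accommodate overdispersion in the counts, which is consistent with its selection for seven of the eight models here.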
Abstract:
Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for the major roads, leaving most local roads without any AADT information. However, AADTs are needed for local roads in many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT.

This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses the tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level trip distribution gravity model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing in ArcGIS.

To evaluate the performance of the new method, data from several study areas in Broward County, Florida, were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as the ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000; accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize this impact.
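The trip distribution step above can be sketched as a singly constrained gravity model: each origin's generated trips are split across destinations in proportion to destination attractiveness discounted by travel cost. A minimal sketch, assuming an exponential impedance function and an illustrative beta value (not the Cube implementation):

```python
import math

def gravity_distribute(productions, attractions, cost, beta=0.1):
    """Singly constrained gravity model: trips from origin i to
    destination j are proportional to attractions[j] *
    exp(-beta * cost[i][j]), scaled so each row sums to the
    origin's total production."""
    trips = []
    for i, p in enumerate(productions):
        w = [attractions[j] * math.exp(-beta * cost[i][j])
             for j in range(len(attractions))]
        s = sum(w)
        trips.append([p * wj / s for wj in w])
    return trips
```

The resulting origin-destination matrix then feeds the all-or-nothing assignment, which loads each flow entirely onto its single shortest path through the network.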