970 results for large truck crash causation study


Relevance: 100.00%

Publisher:

Abstract:

Outside of relatively limited crash testing with large trucks, little is known about the performance of traffic barriers subjected to real-world large truck impacts. The purpose of this study was to investigate real-world large truck impacts into traffic barriers to determine barrier crash involvement rates, the impact performance of barriers not specifically designed to redirect large trucks, and the real-world performance of large-truck-specific barriers. Data sources included the Fatality Analysis Reporting System (2000-2009), the General Estimates System (2000-2009), and 155 in-depth large truck-to-barrier crashes from the Large Truck Crash Causation Study. Large truck impacts with a longitudinal barrier were found to comprise 3 percent of all police-reported longitudinal barrier impacts and roughly the same proportion of barrier fatalities. Based on a logistic regression model predicting barrier penetration, large truck penetration risk was found to increase by a factor of 6 for impacts with barriers designed primarily for passenger vehicles. Although large-truck-specific barriers were found to perform better than barriers not designed for heavy vehicles, their penetration rate was still 17 percent. This rate is of particular concern because higher-test-level barriers are designed to protect other road users, not the occupants of the large truck. Surprisingly, barriers not specifically designed for large truck impacts were found to prevent large truck penetration approximately half of the time. This suggests that adding costlier higher-test-level barriers may not always be warranted, especially on roadways with lower truck volumes.
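The factor-of-6 finding above is an odds ratio from a logistic regression model. As a hedged illustration of what such a ratio means in probability terms (the pairing of numbers below is illustrative only, not the study's fitted model), a sixfold increase in the odds of penetration maps back to a probability like this:

```python
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Convert a baseline probability to the probability implied by an odds ratio.

    odds = p / (1 - p); scale the odds by the odds ratio; back-transform to p.
    """
    odds = p_baseline / (1.0 - p_baseline)
    scaled = odds_ratio * odds
    return scaled / (1.0 + scaled)

# Illustrative numbers only (this pairing is not from the study): if some
# barrier type had a 17% penetration probability, a sixfold increase in the
# odds of penetration would imply roughly a 55% probability.
p = apply_odds_ratio(0.17, 6.0)
print(round(p, 3))  # → 0.551
```

This is why odds ratios overstate relative risk at higher baseline probabilities: the back-transformed probability saturates toward 1 rather than scaling linearly.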

Relevance: 100.00%

Publisher:

Abstract:

Large trucks are involved in a disproportionately small fraction of total crashes but a disproportionately large fraction of fatal crashes. Large truck crashes often result in significant congestion because of the trucks' large physical dimensions and the difficulty of clearing crash scenes. Consequently, preventing large truck crashes is critical to improving highway safety and operations. This study identifies high-risk sites (hot spots) for large truck crashes in Arizona and examines potential risk factors related to the design and operation of those sites. High-risk sites were identified using both state-of-the-practice methods (accident reduction potential estimated with negative binomial regression over long crash histories) and a newly proposed method based on Property Damage Only Equivalents (PDOE). The hot spots identified via the count model generally exhibited few fatalities and major injuries but many minor-injury and PDO crashes, while the opposite trend was observed with the PDOE methodology. The hot spots based on the count model exhibited large AADTs, whereas those based on the PDOE showed relatively small AADTs but large fractions of trucks and high posted speed limits. Documented site investigations of hot spots revealed numerous potential risk factors, including weaving activity near freeway junctions and ramps, absence of acceleration lanes near on-ramps, shoulders too narrow to accommodate large trucks, narrow lane widths, inadequate signage, and poor lighting within a tunnel.
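The PDOE method weights crashes by severity before ranking sites, so a severe-but-infrequent location can outrank a busy low-severity one, which is exactly the divergence the abstract reports. A minimal sketch, using hypothetical severity weights and site counts (the study's actual weights and site data are not given in the abstract):

```python
# Hypothetical severity weights in PDO-equivalent units; real analyses derive
# these from crash cost data, and the study's actual values are not stated here.
SEVERITY_WEIGHTS = {"fatal": 542.0, "major_injury": 11.0, "minor_injury": 5.0, "pdo": 1.0}

def pdoe_score(counts: dict) -> float:
    """Property Damage Only Equivalents: severity-weighted sum of crash counts."""
    return sum(SEVERITY_WEIGHTS[sev] * n for sev, n in counts.items())

# Two invented sites: one with many low-severity crashes, one with few but severe.
sites = {
    "site_A": {"fatal": 0, "major_injury": 1, "minor_injury": 12, "pdo": 40},
    "site_B": {"fatal": 2, "major_injury": 3, "minor_injury": 2, "pdo": 5},
}

# A frequency ranking and a PDOE ranking can pick different hot spots.
by_frequency = max(sites, key=lambda s: sum(sites[s].values()))
by_pdoe = max(sites, key=lambda s: pdoe_score(sites[s]))
print(by_frequency, by_pdoe)  # → site_A site_B
```

Site A has 53 crashes to site B's 12, so it tops the frequency ranking, but site B's two fatal crashes dominate the PDOE score, mirroring the small-AADT, high-severity hot spots the PDOE method surfaced.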

Relevance: 100.00%

Publisher:

Abstract:

1. Overview of hotspot identification (HSID) methods
2. Challenges with HSID
3. Bringing crash severity into the ‘mix’
4. Case Study: Truck-Involved Crashes in Arizona
5. Conclusions

• Heavy-duty trucks have different performance envelopes than passenger cars and have more difficulty weaving, accelerating, and braking
• Passenger vehicles have extremely limited sight distance around trucks
• Lane and shoulder widths affect truck crash risk more than passenger car crash risk
• Using PDOEs to model truck crashes yields a different set of locations to examine for possible engineering and behavioral problems
• PDOE models point to higher societal-cost locations, whereas frequency models point to higher crash-frequency locations
• PDOE models are less sensitive to unreported crashes
• PDOE models are a strong complement to existing practice


Relevance: 100.00%

Publisher: National Highway Traffic Safety Administration, Washington, D.C.


Relevance: 100.00%

Publisher:

Abstract:

This paper investigates the power of genetic algorithms at solving the MAX-CLIQUE problem. We measure the performance of a standard genetic algorithm on an elementary set of problem instances consisting of embedded cliques in random graphs. We indicate the need for improvement, and introduce a new genetic algorithm, the multi-phase annealed GA, which exhibits superior performance on the same problem set. As we scale up the problem size and test on "hard" benchmark instances, we notice degraded performance in the algorithm caused by premature convergence to local minima. To alleviate this problem, a sequence of modifications is implemented, ranging from changes in input representation to systematic local search. The most recent version, called the union GA, incorporates the features of union crossover, greedy replacement, and diversity enhancement. It shows a marked speed-up in the number of iterations required to find a given solution, as well as some improvement in the clique size found. We discuss issues related to the SIMD implementation of the genetic algorithms on a Thinking Machines CM-5, which was necessitated by the intrinsically high time complexity (O(n^3)) of the serial algorithm for computing one iteration. Our preliminary conclusions are: (1) a genetic algorithm needs to be heavily customized to work "well" for the clique problem; (2) a GA is computationally very expensive, and its use is only recommended if it is known to find larger cliques than other algorithms; (3) although our customization effort is bringing forth continued improvements, there is no clear evidence, at this time, that a GA will have better success in circumventing local minima.
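The general technique can be sketched on the paper's elementary test set: embed a clique in a random graph and evolve vertex subsets with union crossover and greedy repair. This is a toy illustration of the approach, not the paper's multi-phase annealed or union GA implementation, and all parameter values below are assumptions:

```python
import random

random.seed(1)  # deterministic for reproducibility

def make_graph_with_clique(n: int, k: int, p: float):
    """Erdos-Renyi-style random graph on n vertices with an embedded k-clique,
    returned as an adjacency set per vertex."""
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if random.random() < p:
                adj[u].add(v); adj[v].add(u)
    for u in (planted := random.sample(range(n), k)):
        for v in planted:
            if u != v:
                adj[u].add(v); adj[v].add(u)
    return adj

def repair(subset, adj):
    """Greedily drop vertices until the subset is a clique: remove the vertex
    with the fewest neighbors inside the subset first."""
    s = set(subset)
    while True:
        bad = [v for v in s if not (s - {v}) <= adj[v]]
        if not bad:
            return s
        s.remove(min(bad, key=lambda v: len(adj[v] & s)))

def ga_max_clique(adj, pop_size=60, generations=120):
    n = len(adj)
    pop = [repair({v for v in range(n) if random.random() < 0.1}, adj)
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)
            child = set(a) | set(b)          # union crossover
            if random.random() < 0.3:        # mutation: toggle one vertex
                child ^= {random.randrange(n)}
            nxt.append(repair(child, adj))
        # greedy replacement: keep the largest cliques from parents + offspring
        pop = sorted(pop + nxt, key=len, reverse=True)[:pop_size]
    return pop[0]

adj = make_graph_with_clique(40, 8, 0.2)
clique = ga_max_clique(adj)
print(len(clique))
```

The repair step guarantees every individual is a valid clique, so fitness is simply set size; the heavy per-generation cost of repair and union is the kind of expense the abstract's conclusion (2) warns about.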

Relevance: 100.00%

Publisher:

Abstract:

An experimental study measuring the performance and wake characteristics of a 1:10-scale horizontal axis tidal turbine in steady, uniform flow conditions is presented in this paper.
Large-scale towing tests conducted in a lake were devised to model the performance of the tidal turbine and measure the wake produced. As a simplification of the marine environment, towing the turbine in a lake provides approximately steady, uniform inflow conditions. A 16 m long × 6 m wide catamaran was constructed for the test programme. This doubled as a towing rig and flow measurement platform, providing a fixed frame of reference for measurements in the wake of the turbine. Velocity mapping was conducted using Acoustic Doppler Velocimeters.
The results indicate that varying the inflow speed made little difference to the efficiency of the turbine or the wake velocity deficit characteristics, provided the same tip speed ratio was maintained. Increasing the inflow velocity from 0.9 m/s to 1.2 m/s influenced the turbulent wake characteristics more markedly. The results also demonstrate that the flow field in the wake of a horizontal axis tidal turbine is strongly affected by the turbine support structure.
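The collapse of efficiency onto tip speed ratio reported above follows from the standard non-dimensional definitions. A minimal sketch using those definitions only; the rotor radius, rotational speed, and power values below are assumptions, not figures from the paper:

```python
import math

RHO = 1000.0  # fresh water density in kg/m^3, appropriate for lake towing tests

def tip_speed_ratio(omega_rad_s: float, radius_m: float, u_inflow: float) -> float:
    """TSR = omega * R / U: blade tip speed over inflow speed."""
    return omega_rad_s * radius_m / u_inflow

def power_coefficient(power_w: float, radius_m: float, u_inflow: float) -> float:
    """Cp = P / (0.5 * rho * A * U^3), with A the swept rotor area."""
    area = math.pi * radius_m ** 2
    return power_w / (0.5 * RHO * area * u_inflow ** 3)

# Assumed example: a 0.5 m radius rotor towed at 1.2 m/s, spinning at 12 rad/s.
tsr = tip_speed_ratio(12.0, 0.5, 1.2)
print(tsr)  # → 5.0
```

Because Cp is normalized by U^3 and TSR by U, matching TSR at a different tow speed should reproduce the same Cp in the absence of Reynolds-number effects, which is consistent with the result quoted above.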

Relevance: 100.00%

Publisher:

Abstract:

Before the advent of genome-wide association studies (GWASs), hundreds of candidate genes for obesity-susceptibility had been identified through a variety of approaches. We examined whether those obesity candidate genes are enriched for associations with body mass index (BMI) compared with non-candidate genes by using data from a large-scale GWAS. A thorough literature search identified 547 candidate genes for obesity-susceptibility based on evidence from animal studies, Mendelian syndromes, linkage studies, genetic association studies and expression studies. Genomic regions were defined to include the genes ±10 kb of flanking sequence around candidate and non-candidate genes. We used summary statistics publicly available from the discovery stage of the genome-wide meta-analysis for BMI performed by the genetic investigation of anthropometric traits consortium in 123 564 individuals. Hypergeometric, rank tail-strength and gene-set enrichment analysis tests were used to test for the enrichment of association in candidate compared with non-candidate genes. The hypergeometric test of enrichment was not significant at the 5% P-value quantile (P = 0.35), but was nominally significant at the 25% quantile (P = 0.015). The rank tail-strength and gene-set enrichment tests were nominally significant for the full set of genes and borderline significant for the subset without SNPs at P < 10^-7. Taken together, the observed evidence for enrichment suggests that the candidate gene approach retains some value. However, the degree of enrichment is small despite the extensive number of candidate genes and the large sample size. Studies that focus on candidate genes have only slightly increased chances of detecting associations, and are likely to miss many true effects in non-candidate genes, at least for obesity-related traits.
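The hypergeometric test above asks whether candidate genes are over-represented among the genes in a top association quantile. A self-contained sketch with hypothetical counts; only the 547 candidate-gene figure comes from the abstract, and the total gene count, quantile size, and observed overlap below are invented for illustration:

```python
from math import comb

def hypergeom_sf(k: int, M: int, n: int, N: int) -> float:
    """P(X >= k) when N genes are drawn without replacement from M total,
    of which n are candidates; X counts candidates among the drawn set."""
    return sum(comb(n, i) * comb(M - n, N - i)
               for i in range(k, min(n, N) + 1)) / comb(M, N)

# Hypothetical counts: 20,000 genes in total, 547 candidates, 1,000 genes in
# the top association quantile, 40 of them candidates (expected ~27 by chance).
M, n, N, k = 20000, 547, 1000, 40
p = hypergeom_sf(k, M, n, N)
print(f"enrichment P = {p:.3f}")
```

Repeating the test at different quantile cut-offs, as the study did at the 5% and 25% quantiles, just changes N and the observed k.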

Relevance: 100.00%

Publisher:

Abstract:

Aims and objectives: To examine the impact of, and obstacles created by, individual Institutional Research Ethics Committees (IRECs) on a large-scale national multi-centre clinical audit, the National Benchmarks and Evidence-based National Clinical guidelines for Heart failure management programmes Study.

Background: Multi-centre research is commonplace in the health care system. However, IRECs continue to fail to differentiate between research and quality audit projects.

Methods: The National Benchmarks and Evidence-based National Clinical guidelines for Heart failure management programmes study used an investigator-developed questionnaire concerning a clinical audit for heart failure programmes throughout Australia. Ethical guidelines developed by the National governing body of health and medical research in Australia classified the National Benchmarks and Evidence-based National Clinical guidelines for Heart failure management programmes Study as a low risk clinical audit not requiring ethical approval by IREC.

Results: Fifteen of 27 IRECs stipulated that the research proposal undergo full ethical review. None of the IRECs acknowledged national quality assurance guidelines and recommendations, or ethics approval granted by other IRECs. Twelve of the 15 IRECs used different ethics application forms, and the amendments they requested varied widely. This lack of uniformity in ethical review processes resulted in a six- to eight-month delay in commencing the national study.

Conclusions: Development of a national ethics application form, with full ethical review by the first IREC and compulsory expedited review by subsequent IRECs, would resolve the issues raised in this paper. IRECs must change their approval processes to ones that facilitate multi-centre research, which is now a normative process for health services.

Relevance to clinical practice: The findings of this study highlight inconsistent ethical requirements between different IRECs, as well as the obstacles and delays that IRECs create when multi-centre clinical audits are undertaken. In clinical practice it is vital that clinical audits are undertaken for evaluation purposes. The findings raise awareness of these inconsistent ethical processes and highlight the need for expedited ethical review of clinical audits.

Relevance: 100.00%

Publisher:

Abstract:

The global economic environment has undergone a set of powerful changes over the last decade. The leadership of Western countries has been challenged by the growth of new actors within the global arena, shifting the interests of well-established nations toward realities that were previously considered underdeveloped or incapable of playing a leading role in the global economy. Emerging countries have thus gained the attention of theorists and international managers, who have begun to look at these markets not only for their economic potential but also in search of new solutions for building a more sustainable world. This thesis, structured as a case study, attempts to understand and portray the two themes mentioned above. On the one hand, the research investigates the dimensions of the BoP market, with a focus on the Brazilian market; on the other, it describes one possible approach for large private companies to explore developing markets: the social business model. This new paradigm, which combines financial performance with the achievement of social impact within a selected community, is examined in depth through a concrete case implemented by the Coca-Cola Company in Brazil, the Projeto Coletivo. Based on preliminary observations, the case study aims to understand the challenges, opportunities, organizational obstacles and methods that can arise from the implementation of a social business in a developing country from the perspective outlined above. The results show that although the paradigm can represent a viable solution, many organizational and cultural issues must be taken into account for its successful implementation.