432 results for Crash Test Criteria


Relevance:

20.00%

Publisher:

Abstract:

Illegal street racing has received increased attention in recent years from road safety professionals and the media as jurisdictions in Australia, Canada, and the United States have implemented laws to address the problem, which primarily involves young male drivers. Although some evidence suggests that the prevalence of illegal street racing is increasing, obtaining accurate estimates of the crash risk of this behavior is difficult because of limitations in official data sources. Although crash risk can be explored by examining the proportion of street racing incidents that result in crashes, or the proportion of all crashes that involve street racing, this paper reports the findings of a study that explored the riskiness of the drivers involved. The driving histories of 183 male drivers with an illegal street racing conviction in Queensland, Australia, were compared with a random sample of 183 male Queensland drivers with the same age distribution. The offender group was found to have significantly more traffic infringements, license sanctions, and crashes than the comparison group. Drivers in the offender group were also more likely than the comparison group to have committed infringements related to street racing, such as speeding, "hooning," and offenses related to vehicle defects or illegal modifications. Insufficient statistical power prevented full exploration of group differences in the type and nature of earlier crashes. It was concluded, however, that street racing offenders can generally be considered risky drivers who warrant attention and whose risky behavior cannot be explained by their youth alone.

Relevance:

20.00%

Publisher:

Abstract:

Hot spot identification (HSID) plays a significant role in improving the safety of transportation networks. Numerous HSID methods have been proposed, developed, and evaluated in the literature, and the vast majority assume that crash data are complete, reliable, and accurate. Crash under-reporting, however, has long been recognized as a threat to the accuracy and completeness of historical traffic crash records. As a natural continuation of prior studies, this paper evaluates the influence that under-reported crashes exert on HSID methods. To conduct the evaluation, five groups of data gathered from the Arizona Department of Transportation (ADOT) over the course of three years are adjusted to account for fifteen different assumed levels of under-reporting. Three identification methods are evaluated: simple ranking (SR), empirical Bayes (EB), and full Bayes (FB). Various threshold levels for establishing hotspots are explored, and two evaluation criteria are compared across HSID methods. The results illustrate that identification bias (the ability to correctly identify at-risk sites) depends on the degree of under-reporting. Comparatively speaking, crash under-reporting has the largest influence on the FB method and the least influence on the SR method. Additionally, the impact is positively related to the percentage of under-reported property-damage-only (PDO) crashes and inversely related to the percentage of under-reported injury crashes. This finding is significant because it reveals that, despite PDO crashes being the least severe and costly, they have the most significant influence on the accuracy of HSID.
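
A minimal sketch of how under-reporting interacts with two of the methods named above, simple ranking and a Hauer-style empirical Bayes adjustment; the site counts, SPF predictions, overdispersion parameter and under-reporting factor are all illustrative assumptions, not values from the study.

```python
# Contrast simple ranking (SR) with an empirical Bayes (EB) adjustment when
# some crashes go unreported. All numbers below are invented for illustration.
import numpy as np

observed = np.array([12, 7, 3, 25, 9])      # reported crash counts per site
mu_spf   = np.array([10, 9, 4, 15, 10])     # safety-performance-function predictions
phi      = 2.0                               # assumed NB inverse-dispersion parameter

# Simple ranking: order sites by observed counts alone.
sr_rank = np.argsort(-observed)

# Empirical Bayes (Hauer-style weighting): shrink observed counts toward the SPF.
w = 1.0 / (1.0 + mu_spf / phi)
eb = w * mu_spf + (1.0 - w) * observed
eb_rank = np.argsort(-eb)

# Simulate under-reporting of crashes and see how the EB ranking shifts.
underreport_factor = 0.7                     # assume 30% of crashes go unreported
eb_ur = w * mu_spf + (1.0 - w) * np.round(observed * underreport_factor)
print(sr_rank, eb_rank, np.argsort(-eb_ur))
```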

Relevance:

20.00%

Publisher:

Abstract:

Proposed transmission smart grids will use a digital platform for the automation of substations operating at voltage levels of 110 kV and above. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850-8-1 and IEC 61850-9-2 provide an interoperable solution for multi-vendor digital process bus implementations, allowing the removal of potentially lethal voltages and damaging currents from substation control rooms, reducing the amount of cabling required in substations, and facilitating the adoption of non-conventional instrument transformers (NCITs). IEC 61850, PTPv2, and Ethernet are three complementary protocol families that together define the future of sampled value (SV) digital process connections for smart substation automation. This paper describes a test and evaluation system that uses real-time simulation, protection relays, PTPv2 time clocks, and artificial network impairment to investigate technical impediments to the adoption of SV process bus systems by transmission utilities. Knowing the limits of a digital process bus, especially when sampled values and NCITs are included, will enable utilities to make informed decisions regarding the adoption of this technology.
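
A back-of-envelope sketch of why precision timing matters for sampled values; the 80 samples-per-cycle rate and the 1 microsecond synchronisation target are commonly quoted assumptions for protection-class SV streams, not figures taken from this paper.

```python
# Relate an assumed SV sampling rate to the phase error introduced by a given
# time-synchronisation offset that PTPv2 must keep within budget.
nominal_freq_hz = 50.0
samples_per_cycle = 80                      # assumed protection-class SV profile
sample_interval_us = 1e6 / (nominal_freq_hz * samples_per_cycle)

sync_error_us = 1.0                         # assumed PTPv2 accuracy target
# Phase error caused by the timing offset, in electrical degrees.
phase_error_deg = 360.0 * nominal_freq_hz * sync_error_us * 1e-6

print(f"SV sample interval: {sample_interval_us:.0f} us")        # 250 us
print(f"Phase error from {sync_error_us} us offset: {phase_error_deg:.3f} deg")
```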

Relevance:

20.00%

Publisher:

Abstract:

Developing safe and sustainable road systems is a common goal in all countries, and applications to assist with road asset management and crash minimization are sought universally. This paper presents a data mining methodology using decision trees to model the crash proneness of road segments from available road and crash attributes. The models quantify the concept of crash proneness and demonstrate that road segments with only a few crashes have more in common with non-crash roads than with roads with higher crash counts. The paper also examines ways of dealing with the highly unbalanced data sets encountered in the study.
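
A minimal sketch of the kind of decision-tree classification described above, with a class-weight adjustment for the unbalanced data the paper mentions; the file name and feature columns are hypothetical, not the study's data.

```python
# Decision-tree classifier for "crash-prone" road segments on an unbalanced data set.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

segments = pd.read_csv("road_segments.csv")                    # hypothetical input
features = ["aadt", "speed_limit", "curvature", "seal_width"]  # assumed attributes
X, y = segments[features], segments["crash_prone"]             # 1 = crash-prone

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" reweights the rare crash-prone class so the tree does
# not simply predict the majority (non-crash) class.
tree = DecisionTreeClassifier(max_depth=4, class_weight="balanced", random_state=0)
tree.fit(X_tr, y_tr)
print(classification_report(y_te, tree.predict(X_te)))
```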

Relevance:

20.00%

Publisher:

Abstract:

Road crashes cost societies worldwide, including Australia, a significant proportion of GDP, affecting productivity and causing significant suffering for communities and individuals. This paper presents a case study that generates data mining models contributing to the understanding of road crashes by examining the role of skid resistance (F60) and other road attributes. Predictive data mining algorithms, primarily regression trees, were used to produce road segment crash count models from the road and traffic attributes of crash scenarios. The rules derived from the regression trees provide evidence of the significance of road attributes in contributing to crashes, with a focus on the evaluation of skid resistance.
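
A short illustrative sketch, not the study's code, of fitting a regression tree to per-segment crash counts with skid resistance (F60) among the predictors and printing the induced rules; the data file and column names are assumptions.

```python
# Regression tree for segment crash counts, with rule extraction.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

data = pd.read_csv("segment_crash_counts.csv")         # hypothetical input
features = ["f60", "aadt", "speed_limit", "gradient"]  # F60 = skid resistance
X, y = data[features], data["crash_count"]

reg = DecisionTreeRegressor(max_depth=3, min_samples_leaf=50, random_state=0)
reg.fit(X, y)

# The printed rules make F60 thresholds and their predicted crash counts explicit.
print(export_text(reg, feature_names=features))
```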

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a preliminary crash avoidance framework for heavy equipment control systems. Safe equipment operation is a major concern on construction sites, since fatal on-site injuries are an industry-wide problem. The proposed framework has the potential to provide active safety for equipment operation. It contains algorithms for spatial modeling, object tracking, and path planning; beyond generating spatial models in fractions of a second, these algorithms can track objects in the environment and produce a collision-free 3D motion trajectory for the equipment.
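
A toy illustration of the kind of check such a framework would perform before issuing a planned trajectory, namely the minimum clearance between candidate waypoints and tracked objects; the waypoints, obstacle positions and safety radius are made-up values, not the framework's algorithms.

```python
# Minimum-clearance check of a candidate 3D trajectory against tracked objects.
import numpy as np

trajectory = np.array([[0, 0, 0], [2, 1, 0], [4, 2, 1], [6, 3, 1]], dtype=float)
obstacles  = np.array([[3, 2, 1], [8, 0, 0]], dtype=float)   # tracked object centres
safety_radius = 1.5                                          # metres, assumed

# Pairwise distances between every waypoint and every obstacle.
d = np.linalg.norm(trajectory[:, None, :] - obstacles[None, :, :], axis=2)
collision_free = bool((d > safety_radius).all())
print(f"min clearance {d.min():.2f} m, collision-free: {collision_free}")
```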

Relevance:

20.00%

Publisher:

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, which is caused by abnormalities in the properties of the tear film, is one of the most commonly reported eye health problems. Current clinical tools to assess the tear film have shown certain limitations: the traditional invasive methods used by most clinicians have been criticized for a lack of reliability and/or repeatability, and while a range of non-invasive methods of tear assessment have been investigated, these also present limitations. Hence no "gold standard" test is currently available to assess tear film integrity. Improving techniques for assessing tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis.

In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection is produced at the outermost layer of the cornea, the tear film; hence, when the tear film is smooth the reflected image presents a well-structured pattern, whereas when the tear film surface presents irregularities the pattern also becomes irregular due to scattering and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film; the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics.

A set of novel routines was developed specifically to quantify changes in the reflected pattern and to extract a time series estimate of TFSQ from the video recording. The routine extracts from each frame a maximized area of analysis in which a TFSQ metric is calculated. Initially, two metrics based on Gabor filter and Gaussian gradient techniques were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear, and to clearly show a difference between bare eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ.

Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing tear break-up time (TBUT). The capability of each non-invasive method to discriminate dry eye from normal subjects was also investigated, and receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; the DWS did not perform as well as LSI or HSV.

The main limitation of the HSV technique identified in this clinical study was a lack of sensitivity for quantifying the build-up (formation) phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric ring pattern into an image of quasi-straight lines from which block statistics are extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves.

Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully understand the HSV measurement and the instrument's potential limitations, with particular interest in its sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of tear film dynamics; for instance, the model derived for the build-up phase gave insight into the dynamics of this initial phase.

Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract key clinical parameters (i.e., timing); unfortunately, these techniques do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria, with special attention given to a commonly used fit, the polynomial function, and to selecting an appropriate model order so that the true derivative of the signal is accurately represented.

The work described in this thesis has shown the potential of high-speed videokeratoscopy for assessing tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis, and modeling. The dynamic-area HSV method has shown good performance in a broad range of conditions (i.e., contact lens, normal, and dry eye subjects) and could become a useful clinical tool for assessing tear film surface quality in the future.
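
A rough sketch of the polar-remapping and block-processing idea described above, not the thesis routines: the ring region is remapped to polar coordinates so the concentric rings become quasi-straight lines, and the spread of block statistics serves as a crude TFSQ indicator. The file name, ring centre and block size are placeholders.

```python
# Cartesian-to-polar remap of a Placido image plus block statistics as a TFSQ proxy.
import cv2
import numpy as np

frame = cv2.imread("placido_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
h, w = frame.shape
center = (w / 2.0, h / 2.0)        # assumed ring centre; in practice it is detected
max_radius = min(h, w) / 2.0

# Cartesian -> polar: concentric rings map to approximately horizontal lines.
polar = cv2.warpPolar(frame, (360, 360), center, max_radius, cv2.WARP_POLAR_LINEAR)

# Block processing: local standard deviation per block; a smooth tear film gives
# regular stripes and therefore similar block statistics across the image.
block = 20
stats = [polar[r:r + block, c:c + block].std()
         for r in range(0, polar.shape[0], block)
         for c in range(0, polar.shape[1], block)]
tfsq_metric = float(np.std(stats))   # larger spread -> more irregular pattern
print(tfsq_metric)
```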

Relevance:

20.00%

Publisher:

Abstract:

This report presents the findings of an exploratory study into students' perceptions of the use of criterion-referenced assessment in an undergraduate differential equations class. Students in the class were largely unaware of the concept of criterion referencing and of the various interpretations that this concept has among mathematics educators. Our primary goal was to investigate whether explicitly presenting assessment criteria to students was useful to them and guided them in responding to assessment tasks. Quantitative data and qualitative feedback from students indicate that while students found the criteria easy to understand and useful in informing them how they would be graded, the manner in which they actually approached the assessment activity was not altered by the use of explicitly communicated grading criteria.

Relevance:

20.00%

Publisher:

Abstract:

Distributed pipeline asset systems are crucial to society. The deterioration of these assets and the optimal allocation of a limited maintenance budget are crucial challenges for water utility managers, and decision makers need optimal solutions for selecting the best maintenance plan given the available resources and management strategies. Much research effort has been dedicated to developing optimal strategies for the maintenance of water pipes, but most strategies schedule individual pipes; optimal group scheduling of replacement jobs for groups of pipes or other linear assets has so far received little attention in the literature. In practice, replacement planners commonly group two or three pipes into one replacement job manually, using ambiguous criteria. This is unlikely to be the best grouping and may not be cost effective, especially when the total cost can run to many millions of dollars. In this paper, an optimal group scheduling scheme with three decision criteria for distributed pipeline asset maintenance is proposed, and a Maintenance Grouping Optimization (MGO) model with multiple criteria is developed. An immediate challenge of such modeling is the scalability of the vast combinatorial solution space. To address this issue, a modified genetic algorithm is developed together with a Judgment Matrix corresponding to the possible combinations of pipe replacement schedules. An industrial case study based on a section of a real water distribution network was conducted to test the new model; the results show that the new schedule generated a significant cost reduction compared with a schedule that does not group pipes.
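
A toy illustration of why grouping reduces cost, not the MGO model itself: grouping adjacent pipes into a single job shares the mobilisation (setup) cost. The pipe costs, setup cost and candidate grouping are invented; in the paper, a modified genetic algorithm guided by the Judgment Matrix searches over such groupings against multiple criteria.

```python
# Score one candidate grouping of pipe replacement jobs with a shared setup cost.
pipe_replacement_cost = {"P1": 120_000, "P2": 95_000, "P3": 80_000, "P4": 150_000}
setup_cost_per_job = 40_000          # assumed fixed mobilisation cost per job

def grouping_cost(groups):
    """Total cost of a grouping: one setup per job plus the member pipe costs."""
    return sum(setup_cost_per_job + sum(pipe_replacement_cost[p] for p in group)
               for group in groups)

ungrouped = [["P1"], ["P2"], ["P3"], ["P4"]]
grouped   = [["P1", "P2"], ["P3", "P4"]]     # one possible grouping
print(grouping_cost(ungrouped) - grouping_cost(grouped))   # saving from grouping
```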

Relevance:

20.00%

Publisher:

Abstract:

Paired speaking tests are now commonly used in both high-stakes testing and classroom assessment contexts. The co-construction of discourse by candidates is regarded as a strength of paired speaking tests, as candidates have the opportunity to display a wider range of interactional competencies, including turn taking, initiating topics and engaging in extended discourse with a partner, rather than an examiner. However, the impact of the interlocutor in such jointly negotiated discourse and the implications for assessing interactional competence are areas of concern. This article reports on the features of interactional competence that were salient to four trained raters of 12 paired speaking tests through the analysis of rater notes, stimulated verbal recalls and rater discussions. Findings enabled the identification of features of the performance noted by raters when awarding scores for interactional competence, and the particular features associated with higher and lower scores. A number of these features were seen by the raters as mutual achievements, which raises the issue of the extent to which it is possible to assess individual contributions to the co-constructed performance. The findings have implications for defining the construct of interactional competence in paired speaking tests and operationalising this in rating scales.

Relevance:

20.00%

Publisher:

Abstract:

Traffic oscillations are typical features of congested traffic flow, characterized by recurring decelerations followed by accelerations (stop-and-go driving). The negative environmental impacts of these oscillations are widely accepted, but their impact on traffic safety has been debated. This paper examines the impact of freeway traffic oscillations on traffic safety using a matched case-control design with high-resolution traffic and crash data from a freeway segment. Traffic conditions prior to each crash were taken as cases, while traffic conditions during the same periods on days without crashes were taken as controls; cases and controls were also matched by presence of congestion, geometry, and weather. A total of 82 cases and about 80,000 candidate controls were extracted from more than three years of data (2004 to 2007). Conditional logistic regression models were developed from the case-control samples, and to verify the consistency of the results, 20 different sets of controls were randomly extracted from the candidate pool for varying control-case ratios. The results reveal that the standard deviation of speed (and thus oscillation) is a significant variable, with an average odds ratio of about 1.08: the likelihood of a (rear-end) crash increases by about 8% with each additional unit increase in the standard deviation of speed. The average traffic states prior to crashes were less significant than the speed variations within congestion.
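
A small illustrative calculation, not the study's model: the reported average odds ratio of about 1.08 per unit of speed standard deviation translated into a relative odds multiplier between two traffic-flow windows. The speed samples are invented.

```python
# Relative crash odds implied by the reported odds ratio for speed variability.
import numpy as np

oscillating = np.array([62, 48, 70, 41, 66, 52, 73, 45], dtype=float)  # km/h
smooth      = np.array([58, 56, 60, 57, 59, 61, 58, 60], dtype=float)

std_osc, std_smooth = oscillating.std(ddof=1), smooth.std(ddof=1)
odds_ratio_per_unit = 1.08            # value reported in the paper

# Relative crash odds of the oscillating window versus the smoother one.
relative_odds = odds_ratio_per_unit ** (std_osc - std_smooth)
print(f"std {std_osc:.1f} vs {std_smooth:.1f} km/h -> odds x {relative_odds:.2f}")
```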

Relevance:

20.00%

Publisher:

Abstract:

Background: People with cardiac disease and type 2 diabetes have higher hospital readmission rates (22%) than those without diabetes (6%). Self-management is an effective approach to achieving better health outcomes; however, there is a lack of programs specifically designed for patients with these dual conditions. This project aims to extend the development and pilot testing of a Cardiac-Diabetes Self-Management Program incorporating user-friendly technologies and the preparation of lay personnel to provide follow-up support. Methods/Design: A randomised controlled trial will be used to explore the feasibility and acceptability of the Cardiac-Diabetes Self-Management Program, which incorporates DVD case studies and trained peers providing follow-up support by telephone and text messaging. A total of 30 cardiac patients with type 2 diabetes will be randomised either to the usual care group or to the intervention group. Participants in the intervention group will receive the Cardiac-Diabetes Self-Management Program in addition to their usual care. The intervention consists of three face-to-face sessions as well as telephone and text-messaging follow-up. The face-to-face sessions will be provided by a trained Research Nurse, commencing in the Coronary Care Unit and continuing after discharge with trained peers, who will follow up patients for up to one month after discharge using text messages and telephone support. Data will be collected at baseline (Time 1) and at one month (Time 2). The primary outcomes include self-efficacy, self-care behaviour, and knowledge, measured by well-established, reliable tools. Discussion: This paper presents the study protocol of a randomised controlled trial to pilot-evaluate a Cardiac-Diabetes Self-Management Program and the feasibility of incorporating peers in the follow-up. The results will provide direction for using this mode of delivering a self-management program to patients with both a cardiac condition and diabetes, as well as valuable information for refining the intervention program.

Relevance:

20.00%

Publisher:

Abstract:

This paper focuses on the development of an interactive test engine that uses Rasch analysis of item responses for question selection and the reporting of results. Rasch analysis is used to determine student ability and question difficulty. The model is widely used in the preparation of paper-based tests and has been the subject of particular use and development at the Australian Council for Educational Research (ACER). This paper presents an overview of an interactive implementation of the Rasch model in HyperCard, in which student ability estimates are generated 'on the fly' and question difficulty values are updated from time to time. The student ability estimates are used to determine question selection and are the basis of the scoring and reporting schemes.
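
A brief sketch of the Rasch model underlying such an adaptive engine, not the HyperCard implementation: the probability of a correct response and a simple maximum-likelihood update of the ability estimate from scored responses. The item difficulties and response pattern are invented.

```python
# Rasch model probability and Newton-Raphson ability estimation.
import numpy as np

def p_correct(theta, b):
    """Rasch model: P(correct) for ability theta and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iters=20):
    """MLE of ability from 0/1 responses to items of known difficulty."""
    theta = 0.0
    for _ in range(iters):
        p = p_correct(theta, difficulties)
        grad = np.sum(responses - p)          # first derivative of log-likelihood
        info = np.sum(p * (1.0 - p))          # Fisher information
        theta += grad / info
    return theta

difficulties = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
responses    = np.array([1, 1, 1, 0, 0])      # hypothetical scored answers
print(estimate_ability(responses, difficulties))
```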

Relevance:

20.00%

Publisher:

Abstract:

This paper uses dynamic computer simulation techniques to develop and apply a multi-criteria procedure using non-destructive, vibration-based parameters for damage assessment in truss bridges. In addition to changes in natural frequencies, the procedure incorporates two parameters: modal flexibility and modal strain energy. Using numerically simulated modal data obtained through finite element analysis of healthy and damaged bridge models, algorithms based on changes in modal flexibility and modal strain energy before and after damage are derived and used as indices for assessing structural health. Applications of these two parameters to truss-type structures are limited in the literature, so the proposed multi-criteria damage assessment procedure is developed for and applied to truss bridges. The approach is demonstrated through numerical simulation of a single-span, simply supported truss bridge with eight damage scenarios corresponding to different types of deck and truss damage. Results show that the proposed multi-criteria method is effective for damage assessment in this type of bridge superstructure.
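
A small numeric sketch of one of the indices named above, the modal flexibility matrix built from mass-normalised mode shapes and natural frequencies, and its change between healthy and damaged states; the mode shapes and frequencies below are illustrative, not the paper's finite element results.

```python
# Modal flexibility change as a simple damage index.
import numpy as np

def modal_flexibility(freqs_hz, mode_shapes):
    """F = sum_i (phi_i phi_i^T) / omega_i^2, using mass-normalised mode shapes."""
    omega = 2.0 * np.pi * np.asarray(freqs_hz)
    F = np.zeros((mode_shapes.shape[0], mode_shapes.shape[0]))
    for i, w in enumerate(omega):
        phi = mode_shapes[:, i:i + 1]
        F += (phi @ phi.T) / w**2
    return F

# Columns are mode shapes sampled at three nodes (invented values).
phi_healthy = np.array([[0.5, 0.7], [0.9, 0.1], [0.6, -0.8]])
phi_damaged = np.array([[0.5, 0.7], [1.0, 0.2], [0.6, -0.8]])
F_h = modal_flexibility([4.2, 11.5], phi_healthy)
F_d = modal_flexibility([4.0, 11.1], phi_damaged)

# Damage index: largest change in flexibility per node; larger values flag damage.
delta = np.max(np.abs(F_d - F_h), axis=1)
print(delta)
```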