869 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS


Relevance: 100.00%

Abstract:

There has been a worldwide trend to increase axle loads and train speeds. This means that railway track degradation will be accelerated and track maintenance costs will increase significantly. There is a need to investigate the consequences of increasing traffic load. The aim of the research is to develop a model for the analysis of physical degradation of railway tracks in response to changes in traffic parameters, especially increased axle loads and train speeds. This research has developed an integrated track degradation model (ITDM) by integrating several models into a comprehensive framework. Mechanistic relationships for track degradation have been used wherever possible in each of the models contained in ITDM. This overcomes the deficiency of traditional statistical track models, which rely heavily on historical degradation data that is generally not available in many railway systems. In addition, statistical models lack the flexibility to incorporate future changes in traffic patterns or maintenance practices. The research starts by reviewing railway track related studies both in Australia and overseas to develop a comprehensive understanding of track performance under various traffic conditions. Existing railway related models are then examined for their suitability for track degradation analysis in Australian conditions. The ITDM model is subsequently developed by modifying suitable existing models and developing new models where necessary. The ITDM model contains four interrelated submodels: rails, sleepers, ballast and subgrade, and track modulus. The rail submodel is for rail wear analysis and is developed from a theoretical concept. The sleeper submodel is for timber sleeper damage prediction; it is developed by modifying and extending an existing model developed elsewhere, and also incorporates an analysis of the likelihood of concrete sleeper cracking. The ballast and subgrade submodel evolved from a concept developed in the USA, with substantial modifications and improvements. The track modulus submodel is developed from a conceptual method, with corrections for more global track conditions. The integration of these submodels into one comprehensive package has enabled the interaction between individual track components to be taken into account. This is done by calculating the wheel load distribution over time and updating track conditions periodically in the process of track degradation simulation. A Windows-based computer program associated with ITDM has also been developed. The program enables the user to analyse the degradation of individual track components and to investigate the interrelationships between these track components and their deterioration. The successful implementation of this research has provided essential information for predicting increased maintenance as a consequence of railway track degradation. The model, having been presented at various conferences and seminars, has attracted wide interest. It is anticipated that the model will be put into practical use among Australian railways, enabling track maintenance planning to be optimized and potentially saving Australian railway systems millions of dollars in operating costs.
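As a rough illustration of the periodic-update idea described above (and emphatically not the ITDM equations themselves), the following Python sketch steps a track through yearly traffic, updates a condition index, and feeds the degraded condition back into the dynamic wheel load. Every relationship and coefficient is a hypothetical placeholder.

```python
# Conceptual sketch only: a periodic-update loop in the spirit of an integrated
# degradation model. Track condition is updated each period and the updated
# condition feeds back into the dynamic wheel load. All coefficients below are
# hypothetical placeholders, not the ITDM submodel equations.
AXLE_LOAD = 25.0        # static axle load [t]
SPEED = 80.0            # train speed [km/h]
ANNUAL_MGT = 20.0       # annual traffic [million gross tonnes]
YEARS = 10

rail_wear = 0.0         # cumulative rail head wear [mm]
track_roughness = 1.0   # dimensionless condition index (1 = new)
for year in range(1, YEARS + 1):
    # Dynamic load grows with speed and with deteriorating track condition.
    dynamic_factor = 1.0 + 0.005 * SPEED * track_roughness / 100.0
    wheel_load = AXLE_LOAD * dynamic_factor
    # Update each "submodel" from this period's traffic, then feed back.
    rail_wear += 0.02 * ANNUAL_MGT * wheel_load / 25.0
    track_roughness *= 1.0 + 0.01 * ANNUAL_MGT * wheel_load / 25.0
    print(f"year {year}: wheel load {wheel_load:.1f} t, "
          f"rail wear {rail_wear:.2f} mm, roughness {track_roughness:.2f}")
```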

Relevance: 100.00%

Abstract:

Objectives: This methodological paper reports on the development and validation of a work sampling instrument and data collection processes to conduct a national study of nurse practitioners’ work patterns. ---------- Design: Published work sampling instruments provided the basis for development and validation of a tool for use in a national study of nurse practitioner work activities across diverse contextual and clinical service models. Steps taken in the approach included design of a nurse practitioner-specific data collection tool and development of an innovative web-based program to train and establish inter-rater reliability of a team of data collectors who were geographically dispersed across metropolitan, rural and remote health care settings. ---------- Setting: The study is part of a large funded study into nurse practitioner service. The Australian Nurse Practitioner Study is a national study phased over three years and was designed to provide essential information for Australian health service planners, regulators and consumer groups on the profile, process and outcome of nurse practitioner service. ---------- Results: The outcome of this phase of the study is empirically tested instruments, processes and training materials for use in an international context by investigators interested in conducting a national study of nurse practitioner work practices. ---------- Conclusion: Development and preparation of a new approach to describing nurse practitioner practices using work sampling methods provides the groundwork for international collaboration in the evaluation of nurse practitioner service.

Relevance: 100.00%

Abstract:

There is considerable public, political and professional debate about the need for additional hospital beds in Australia. However, there is no clarity regarding the definition, meaning and significance of hospital bed counts. Relative to population, bed availability in Australia has declined by 14.6% over the past 15 years (22.9% for public hospital beds). This decline is partly offset by reductions in length of stay and changes to models of care; however, the net effect is increased bed occupancy, which has in turn resulted in system-wide congestion. Future bed capability needs to be better planned to meet growing demand while continuing trends towards more efficient use. Future planning should be based in part on weighted bed capability matched to need.

Relevance: 100.00%

Abstract:

In recent years the development and use of crash prediction models for roadway safety analyses have received substantial attention. These models, also known as safety performance functions (SPFs), relate the expected crash frequency of roadway elements (intersections, road segments, on-ramps) to traffic volumes and other geometric and operational characteristics. A commonly practiced approach for applying intersection SPFs is to assume that crash types occur in fixed proportions (e.g., rear-end crashes make up 20% of crashes, angle crashes 35%, and so forth) and then apply these fixed proportions to crash totals to estimate crash frequencies by type. As demonstrated in this paper, such a practice makes questionable assumptions and results in considerable error in estimating crash proportions. Through the use of rudimentary SPFs based solely on the annual average daily traffic (AADT) of major and minor roads, the homogeneity-in-proportions assumption is shown not to hold across AADT, because crash proportions vary as a function of both major and minor road AADT. For example, with minor road AADT of 400 vehicles per day, the proportion of intersecting-direction crashes decreases from about 50% with 2,000 major road AADT to about 15% with 82,000 AADT. Same-direction crashes increase from about 15% to 55% for the same comparison. The homogeneity-in-proportions assumption should be abandoned, and crash type models should be used to predict crash frequency by crash type. SPFs that use additional geometric variables would only exacerbate the problem quantified here. Comparison of models for different crash types using additional geometric variables remains the subject of future research.
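To make the contrast concrete, the sketch below fits separate safety performance functions by crash type from major- and minor-road AADT, so that predicted crash-type proportions vary with traffic volumes rather than being held fixed. It is an assumption-laden illustration (hypothetical data; a Poisson GLM standing in for the negative binomial forms usually used for SPFs), not the paper's models.

```python
# Illustrative sketch, not the paper's code: separate SPFs per crash type from
# major/minor AADT only. Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical intersection-level data: one row per intersection.
df = pd.DataFrame({
    "aadt_major": [2000, 15000, 40000, 82000, 26000, 60000],
    "aadt_minor": [400, 400, 1200, 400, 800, 1000],
    "same_dir":   [0, 2, 5, 9, 3, 7],    # same-direction crash counts
    "inter_dir":  [2, 3, 4, 2, 3, 3],    # intersecting-direction crash counts
})

def fit_spf(counts):
    """Fit E[N] = exp(b0) * AADT_major^b1 * AADT_minor^b2 (Poisson stand-in)."""
    X = sm.add_constant(np.log(df[["aadt_major", "aadt_minor"]]))
    return sm.GLM(counts, X, family=sm.families.Poisson()).fit()

models = {t: fit_spf(df[t]) for t in ("same_dir", "inter_dir")}

# Crash-type proportions now vary with AADT instead of being assumed fixed.
new = pd.DataFrame({"aadt_major": [2000, 82000], "aadt_minor": [400, 400]})
X_new = sm.add_constant(np.log(new))
preds = {t: m.predict(X_new) for t, m in models.items()}
total = sum(preds.values())
print({t: np.round(p / total, 2) for t, p in preds.items()})
```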

Relevance: 100.00%

Abstract:

It is important to examine the nature of the relationships between roadway, environmental, and traffic factors and motor vehicle crashes, with the aim of improving the collective understanding of the causal mechanisms involved in crashes and better predicting their occurrence. Statistical models of motor vehicle crashes are one path of inquiry often used to gain these initial insights. Recent efforts have focused on the estimation of negative binomial and Poisson regression models (and related variants) due to their relatively good fit to crash data. Of course, analysts constantly seek methods that offer greater consistency with the data-generating mechanism (motor vehicle crashes in this case), provide better statistical fit, and provide insight into data structure that was previously unavailable. One such opportunity exists with some types of crash data, in particular crash-level data that are collected across roadway segments, intersections, etc. It is argued in this paper that some crash data possess hierarchical structure that has not routinely been exploited. This paper describes the application of binomial multilevel models of crash types using 548 motor vehicle crashes collected from 91 two-lane rural intersections in the state of Georgia. Crash prediction models are estimated for angle, rear-end, and sideswipe (both same direction and opposite direction) crashes. The contributions of the paper are the realization of hierarchical data structure and the application of a theoretically appealing and suitable analysis approach for multilevel data, yielding insights into intersection-related crashes by crash type.
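A minimal sketch of the kind of multilevel binomial model described (crash-level observations nested within intersections, with an intersection-level random intercept), fitted with statsmodels' Bayesian mixed GLM on synthetic data. The variables, data-generating mechanism and fitting choice are assumptions for illustration, not the authors' specification.

```python
# Illustrative sketch, not the authors' code: hierarchical binomial model with
# an intersection-level random intercept. Variables and data are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "intersection": rng.integers(0, 30, n),        # grouping factor (level 2)
    "log_aadt_major": rng.normal(9.0, 0.5, n),     # crash-level covariate
    "left_turn_lane": rng.integers(0, 2, n),
})
# Outcome: 1 if the crash is an angle crash, 0 otherwise (hypothetical mechanism).
eta = -1.0 + 0.3 * (df.log_aadt_major - 9) + 0.5 * df.left_turn_lane
df["angle"] = rng.binomial(1, 1 / (1 + np.exp(-eta)))

# Random intercept for intersection captures unobserved site-level effects.
vc = {"site": "0 + C(intersection)"}
model = BinomialBayesMixedGLM.from_formula(
    "angle ~ log_aadt_major + left_turn_lane", vc, df)
result = model.fit_vb()   # variational Bayes fit
print(result.summary())
```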

Relevance: 100.00%

Abstract:

Now in its second edition, this book describes tools that are commonly used in transportation data analysis. The first part of the text provides statistical fundamentals while the second part presents continuous dependent variable models. With a focus on count and discrete dependent variable models, the third part features new chapters on mixed logit models, logistic regression, and ordered probability models. The last section provides additional coverage of Bayesian statistical modeling, including Bayesian inference and Markov chain Monte Carlo methods. Data sets are available online to use with the modeling techniques discussed.

Relevance: 100.00%

Abstract:

With the advances in computer hardware and software development techniques in the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies. Simulation is now proven to be the cheapest means to carry out performance predication and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system. It is more appropriate to regard those applications as primarily mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contains many complexities and diversities. These subsystems interact frequently with each other while the trains are moving; and they have their special features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system. In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation; the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Not only can the applicability of the simulators be largely enhanced by advanced software design, maintainability and modularity for easy understanding and further development, and portability for various hardware platforms are also encouraged. The objective of this paper is to review the development of a number of approaches to simulation models. Attention is, in particular, given to models for train movement, power supply systems and traction drives. These models have been successfully used to enable various ‘what-if’ issues to be resolved effectively in a wide range of applications, such as speed profiles, energy consumption, run times etc.
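As a toy illustration of a train movement model of the kind reviewed, the sketch below integrates a point-mass train along a flat section under a tractive effort limit, a power limit, a Davis-type running resistance and a line speed limit, and reports run time and traction energy. It is not any of the simulators discussed; all parameters are hypothetical round numbers.

```python
# Minimal point-mass train movement sketch (illustrative only). The train
# applies full tractive effort up to the speed limit, then holds that speed.
import math

MASS = 400e3            # train mass [kg]
MAX_FORCE = 300e3       # maximum tractive effort [N]
MAX_POWER = 4e6         # maximum traction power [W]
SPEED_LIMIT = 25.0      # line speed limit [m/s]
TRACK_LENGTH = 10e3     # section length [m]
DT = 0.5                # time step [s]

def resistance(v):
    """Davis-type running resistance [N] with hypothetical coefficients."""
    return 4000 + 100 * v + 8 * v * v

t = x = v = 0.0
energy = 0.0
while x < TRACK_LENGTH:
    if v < SPEED_LIMIT:
        force = min(MAX_FORCE, MAX_POWER / max(v, 1.0))   # power-limited effort
    else:
        force = resistance(v)                             # hold the speed limit
    a = (force - resistance(v)) / MASS
    energy += force * v * DT          # traction energy drawn [J]
    v = min(v + a * DT, SPEED_LIMIT)
    x += v * DT
    t += DT

print(f"run time {t:.0f} s, traction energy {energy / 3.6e6:.1f} kWh")
```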

Relevance: 100.00%

Abstract:

Background: Despite declining rates of cardiovascular disease (CVD) mortality in developed countries, lower socioeconomic groups continue to experience a greater burden of the disease. There are now many evidence-based treatments and prevention strategies for the management of CVD and it is essential that their impact on the more disadvantaged group is understood if socioeconomic inequalities in CVD are to be reduced. Aims: To determine whether key interventions for CVD prevention and treatment are effective among lower socioeconomic groups, to describe barriers to their effectiveness and the potential or actual impact of these interventions on the socioeconomic gradient in CVD. Methods: Interventions were selected from four stages of the CVD continuum. These included smoking reduction strategies, absolute risk assessment, cardiac rehabilitation, secondary prevention medications, and heart failure self-management programmes. Electronic searches were conducted using terms for each intervention combined with terms for socioeconomic status (SES). Results: Only limited evidence was found for the effectiveness of the selected interventions among lower SES groups and there was little exploration of socioeconomic-related barriers to their uptake. Some broad themes and key messages were identified. In the majority of findings examined, it was clear that the underlying material, social and environmental factors associated with disadvantage are a significant barrier to the effectiveness of interventions. Conclusion: Opportunities to reduce socioeconomic inequalities occur at all stages of the CVD continuum. Despite this, current treatment and prevention strategies may be contributing to the widening socioeconomic-CVD gradient. Further research into the impact of best-practice interventions for CVD upon lower SES groups is required.

Relevance: 100.00%

Abstract:

Methicillin-resistant Staphylococcus aureus (MRSA) is a pathogen that continues to be of major concern in hospitals. We develop models and computational schemes based on observed weekly incidence data to estimate MRSA transmission parameters. We extend the deterministic model of McBryde, Pettitt, and McElwain (2007, Journal of Theoretical Biology 245, 470–481) involving an underlying population of MRSA-colonized patients and health-care workers that describes, among other processes, transmission between uncolonized patients and colonized health-care workers and vice versa. We develop new bivariate and trivariate Markov models to include incidence so that estimated transmission rates can be based directly on new colonizations rather than indirectly on prevalence. Imperfect sensitivity of pathogen detection is modeled using a hidden Markov process. The advantages of our approach include (i) a discrete-valued assumption for the number of colonized health-care workers, (ii) two transmission parameters can be incorporated into the likelihood, (iii) the likelihood depends on the number of new cases to improve precision of inference, (iv) individual patient records are not required, and (v) the possibility of imperfect detection of colonization is incorporated. We compare our approach with that used by McBryde et al. (2007), which is based on an approximation that eliminates the health-care workers from the model and uses Markov chain Monte Carlo and individual patient data. We apply these models to MRSA colonization data collected in a small intensive care unit at the Princess Alexandra Hospital, Brisbane, Australia.
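The following is a simplified forward simulation in the spirit of the patient/health-care-worker transmission structure described above, including imperfect detection of weekly incidence. It is a sketch under stated assumptions, not the authors' estimation scheme; all rates, ward sizes and the observation model are hypothetical.

```python
# Simplified forward simulation of patient <-> health-care-worker (HCW)
# transmission with imperfectly observed weekly incidence (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

N_PATIENTS, N_HCW = 20, 10        # ward sizes
beta_hp = 0.08                    # colonized HCW -> uncolonized patient rate
beta_ph = 0.05                    # colonized patient -> uncolonized HCW rate
mu_p, mu_h = 0.10, 0.50           # patient clearance/discharge, HCW decolonization
SENSITIVITY = 0.8                 # probability a new colonization is detected
DT, WEEKS = 1.0, 26               # daily steps, half a year

col_p, col_h = 2, 1               # initially colonized patients / HCWs
observed_incidence = []
for week in range(WEEKS):
    new_this_week = 0
    for _ in range(7):
        # Daily new colonizations (discrete-time approximation of the rates).
        p_new = rng.poisson(beta_hp * col_h * (N_PATIENTS - col_p) / N_PATIENTS * DT)
        h_new = rng.poisson(beta_ph * col_p * (N_HCW - col_h) / N_HCW * DT)
        p_clear = rng.binomial(col_p, 1 - np.exp(-mu_p * DT))
        h_clear = rng.binomial(col_h, 1 - np.exp(-mu_h * DT))
        col_p = int(np.clip(col_p + p_new - p_clear, 0, N_PATIENTS))
        col_h = int(np.clip(col_h + h_new - h_clear, 0, N_HCW))
        new_this_week += p_new
    # Imperfect detection: only a fraction of true new colonizations are observed.
    observed_incidence.append(rng.binomial(new_this_week, SENSITIVITY))

print(observed_incidence)
```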

Relevance: 100.00%

Abstract:

There has been an increasing interest in objects within the HCI field particularly with a view to designing tangible interfaces. However, little is known about how people make sense of objects and how objects support thinking. This paper presents a study of groups of engineers using physical objects to prototype designs, and articulates the roles that physical objects play in supporting their design thinking and communications. The study finds that design thinking is heavily dependent upon physical objects, that designers are active and opportunistic in seeking out physical props and that the interpretation and use of an object depends heavily on the activity. The paper discusses the trade-offs that designers make between speed and accuracy of models, and specificity and generality in choice of representations. Implications for design of tangible interfaces are discussed.

Relevance: 100.00%

Abstract:

Understanding the motion characteristics of on-site objects is desirable for the analysis of construction work zones, especially in problems related to safety and productivity studies. This article presents a methodology for rapid object identification and tracking. The proposed methodology contains algorithms for spatial modeling and image matching. A high-frame-rate range sensor was utilized for spatial data acquisition. The experimental results indicated that an occupancy grid spatial modeling algorithm could quickly build a suitable work zone model from the acquired data. The results also showed that an image matching algorithm is able to find the most similar object from a model database and from spatial models obtained from previous scans. It is then possible to use the matched information to successfully identify and track objects.
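A minimal sketch of the two steps described (occupancy-grid construction from range points, then matching stored object templates by cell overlap). The grid size, resolution, matching score and data are simplifications assumed for illustration, not the article's implementation.

```python
# Illustrative sketch only: coarse 2D occupancy grid from range-sensor points,
# then template matching by cell overlap to pick the best object model.
import numpy as np

RESOLUTION = 0.25          # metres per cell
GRID = (40, 40)            # 10 m x 10 m work zone

def occupancy_grid(points):
    """Mark cells containing at least one measured point as occupied."""
    grid = np.zeros(GRID, dtype=bool)
    idx = np.floor(points / RESOLUTION).astype(int)
    idx = idx[(idx >= 0).all(axis=1) & (idx < np.array(GRID)).all(axis=1)]
    grid[idx[:, 0], idx[:, 1]] = True
    return grid

def match_score(scene, template):
    """Fraction of the template's occupied cells also occupied in the scene."""
    return (scene & template).sum() / max(template.sum(), 1)

# Hypothetical scan of the current work zone and two stored object models.
scan_points = np.array([[2.0, 2.0], [2.2, 2.1], [2.4, 2.3], [5.0, 7.5]])
scene = occupancy_grid(scan_points)
templates = {
    "loader": occupancy_grid(np.array([[2.0, 2.0], [2.2, 2.1], [2.4, 2.3]])),
    "barrel": occupancy_grid(np.array([[8.0, 1.0]])),
}
best = max(templates, key=lambda name: match_score(scene, templates[name]))
print(best, {n: round(match_score(scene, t), 2) for n, t in templates.items()})
```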

Relevance: 100.00%

Abstract:

Traffic oscillations are typical features of congested traffic flow, characterized by recurring decelerations followed by accelerations, yet knowledge of this complex phenomenon remains limited. In this research, 1) the impact of traffic oscillations on freeway crash occurrence has been measured using a matched case-control design; the results consistently reveal that oscillations have a more significant impact on freeway safety than the average traffic states. 2) The wavelet transform has been adopted to locate oscillations' origins and measure their characteristics along their propagation paths using vehicle trajectory data. 3) The impact of lane-changing maneuvers on the immediate follower has been measured and modeled. The knowledge and the new models generated from this study could provide a better understanding of the fundamentals of congested traffic, enable improvements to existing traffic control strategies and freeway crash countermeasures, and encourage the development of new operational strategies aimed at reducing the negative effects of oscillatory driving.
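As a hedged illustration of the wavelet-based idea, the sketch below applies a continuous wavelet transform to a synthetic vehicle speed series and flags where the wavelet energy peaks. The speed profile, wavelet choice ('mexh') and the simple peak criterion are assumptions for illustration, not the study's procedure.

```python
# Illustrative sketch, not the study's code: locate the signature of a
# deceleration wave in a single vehicle's speed series using a CWT.
import numpy as np
import pywt

DT = 0.1                                  # trajectory sampling interval [s]
t = np.arange(0, 120, DT)
speed = np.full_like(t, 25.0)             # free flow at 25 m/s (synthetic)
speed[(t > 60) & (t < 75)] = 10.0         # a stop-and-go disturbance
speed += np.random.default_rng(0).normal(0, 0.3, t.size)   # measurement noise

# CWT with a Mexican-hat wavelet; large coefficients mark abrupt speed changes.
scales = np.arange(1, 64)
coefs, _ = pywt.cwt(speed, scales, "mexh", sampling_period=DT)
energy = np.abs(coefs).mean(axis=0)       # average wavelet energy over scales

onset_idx = np.argmax(energy)             # strongest oscillation signature
print(f"oscillation signature near t = {t[onset_idx]:.1f} s")
```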