399 results for default probability


Relevance: 10.00%

Abstract:

This paper explores what determines the survival of people in a life-and-death situation. The sinking of the Titanic allows us to inquire whether pro-social behavior matters in such extreme situations. This event can be considered a quasi-natural experiment. The empirical results suggest that social norms such as 'women and children first' are preserved during such an event. Women of reproductive age and crew members had a higher probability of survival. Passenger class, fitness, group size, and cultural background also mattered.

Relevance: 10.00%

Abstract:

A recent decision by the Australian High Court means that, unless faculty are bound by an assignment or intellectual property (IP) policy, they may own inventions resulting from their research. Thirty years after its introduction, the US Bayh-Dole Act, which vests ownership of employee inventions in the employer university or research organization, has become a model for commercialization around the world. In Australia, despite recommendations that a Bayh-Dole style regime be adopted, the recent decision in University of Western Australia (UWA) v Gray has moved the default legal position in a diametrically opposite direction. A key focus of the debate was whether faculty's duty to carry out research also encompasses a duty to invent. Late last year, the Full Federal Court confirmed a lower court ruling that it does not, and this year the High Court refused leave to appeal (denied certiorari). Thus, Gray stands as Australia's most faculty-friendly authority to date.

Relevance: 10.00%

Abstract:

A statistical modeling method to accurately determine combustion chamber resonance is proposed and demonstrated. This method utilises Markov chain Monte Carlo (MCMC), via the Metropolis-Hastings (MH) algorithm, to yield a probability density function for the combustion chamber frequency and to find the best estimate of the resonant frequency, along with its uncertainty. The accurate determination of combustion chamber resonance is then used to investigate various engine phenomena, with appropriate uncertainty, for a range of engine cycles. It is shown that, when operating on various ethanol/diesel fuel combinations, a 20% ethanol substitution yields the least inter-cycle variability in combustion chamber resonance.
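The MH mechanics the abstract relies on can be sketched as follows; the data values, noise level, proposal step and flat prior are all invented for illustration and are not taken from the paper:

```python
import math
import random

random.seed(42)

# Hypothetical noisy observations of a resonant frequency (kHz)
data = [4.9, 5.1, 5.0, 5.2, 4.8]
sigma = 0.1  # assumed known measurement noise

def log_likelihood(f):
    # Gaussian log-likelihood, up to an additive constant
    return sum(-0.5 * ((x - f) / sigma) ** 2 for x in data)

def metropolis_hastings(n_samples=20000, step=0.05, f0=5.5):
    samples = []
    f, ll = f0, log_likelihood(f0)
    for _ in range(n_samples):
        prop = f + random.gauss(0.0, step)   # symmetric random-walk proposal
        ll_prop = log_likelihood(prop)
        # Accept with probability min(1, L(prop)/L(f)); the flat prior cancels
        if random.random() < math.exp(min(0.0, ll_prop - ll)):
            f, ll = prop, ll_prop
        samples.append(f)
    return samples

chain = metropolis_hastings()
burned = chain[5000:]                         # discard burn-in
est = sum(burned) / len(burned)               # posterior mean estimate
sd = (sum((s - est) ** 2 for s in burned) / len(burned)) ** 0.5  # uncertainty
```

The retained samples approximate the posterior density of the frequency, so the mean and standard deviation directly give the "best estimate with uncertainty" the abstract describes.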

Relevance: 10.00%

Abstract:

The deformation behaviour of microcrystalline (mc) and nanocrystalline (nc) Mg-5%Al alloys produced by hot extrusion of ball-milled powders was investigated using instrumented indentation tests. The hardness values of the mc and nc metals exhibited an indentation size effect (ISE), with the nc alloys showing a weaker ISE. Highly localized dislocation activity resulted in a small activation volume and hence an enhanced strain rate sensitivity. The relatively higher strain rate sensitivity and the negative Hall-Petch relationship suggest an increasingly important role for grain boundary mediated mechanisms as the grain size decreases into the nanometre regime.

Relevance: 10.00%

Abstract:

This chapter recognizes that research is a cultural invention and explains why. It discusses what equity, research and research design mean, and suggests that the concept of equity is enriched considerably when ideas from Indigenous, critical and politically committed research traditions are involved in research design. When research design and the processes of research are guided by principles of equity, several issues warrant investigation. These include power relations, deficit models of research, homogeneity and reflexivity. Research design that is informed by principles of equity is explicit in its political purpose of seeking socially just outcomes for the short and long term.

Relevance: 10.00%

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions that come with each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for crash data, or providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of "excess" zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate it. We also present the theory behind dual-state count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how crash data give rise to the "excess" zeros frequently observed. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (for observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
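The low-exposure argument can be illustrated with a small simulation in the spirit of the one described: each site undergoes many independent Bernoulli passages (Poisson trials) with a small crash probability, and a large share of zero counts appears without any "perfectly safe" state. All parameter values below are arbitrary:

```python
import random

random.seed(1)

# Each site experiences `trials` independent passages, each with a small
# crash probability p -- Bernoulli (Poisson) trials, a single-state process.
def simulate_counts(n_sites, trials, p):
    counts = []
    for _ in range(n_sites):
        counts.append(sum(1 for _ in range(trials) if random.random() < p))
    return counts

# Low exposure: expected count per site is only 100 * 0.005 = 0.5 crashes
counts = simulate_counts(n_sites=2000, trials=100, p=0.005)
zero_share = sum(c == 0 for c in counts) / len(counts)
# A Poisson approximation with mean 0.5 already predicts exp(-0.5) ~ 61% zeros,
# so the "excess" zeros need no dual-state explanation here.
```

Raising the exposure (more trials per site, or a larger p) shrinks the zero share, which is exactly the time/space-scale effect the abstract attributes the zeros to.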

Relevance: 10.00%

Abstract:

Considerable past research has explored relationships between vehicle accidents and geometric design and operation of road sections, but relatively little research has examined factors that contribute to accidents at railway-highway crossings. Between 1998 and 2002 in Korea, about 95% of railway accidents occurred at highway-rail grade crossings, resulting in 402 accidents, of which about 20% resulted in fatalities. These statistics suggest that efforts to reduce crashes at these locations may significantly reduce crash costs. The objective of this paper is to examine factors associated with railroad crossing crashes. Various statistical models are used to examine the relationships between crossing accidents and features of crossings. The paper also compares accident models developed in the United States and the safety effects of crossing elements obtained using Korea data. Crashes were observed to increase with total traffic volume and average daily train volumes. The proximity of crossings to commercial areas and the distance of the train detector from crossings are associated with larger numbers of accidents, as is the time duration between the activation of warning signals and gates. The unique contributions of the paper are the application of the gamma probability model to deal with underdispersion and the insights obtained regarding railroad crossing related vehicle crashes.

Relevance: 10.00%

Abstract:

Now in its second edition, this book describes tools that are commonly used in transportation data analysis. The first part of the text provides statistical fundamentals while the second part presents continuous dependent variable models. With a focus on count and discrete dependent variable models, the third part features new chapters on mixed logit models, logistic regression, and ordered probability models. The last section provides additional coverage of Bayesian statistical modeling, including Bayesian inference and Markov chain Monte Carlo methods. Data sets are available online to use with the modeling techniques discussed.

Relevance: 10.00%

Abstract:

The costs of work-related crashes

In Australia and overseas, fleet safety, or work-related road safety, is an issue gaining increased attention from researchers, organisations, road safety practitioners and the general community. This attention is primarily a response to the substantial physical, emotional and economic costs associated with work-related road crashes. The increased risk factors and subsequent costs of work-related driving are now well documented in the literature. For example, research has demonstrated that work-related drivers on average report a higher level of crash involvement than personal car drivers (Downs et al., 1999; Kweon and Kockelman, 2003), and, within Australia in particular, road crashes are the most common form of work-related fatality (Haworth et al., 2000).

Relevance: 10.00%

Abstract:

Survival probability prediction using a covariate-based hazard approach is a well-known statistical methodology in engineering asset health management. We have previously reported the semi-parametric Explicit Hazard Model (EHM), which incorporates three types of information for hazard prediction: population characteristics, condition indicators, and operating environment indicators. That model assumes the baseline hazard has the form of the Weibull distribution. To avoid this assumption, this paper presents the non-parametric EHM, a distribution-free covariate-based hazard model, and demonstrates its application via a case study. In this case study, survival probabilities of a set of resistance elements predicted by the non-parametric EHM are compared with those of the Weibull proportional hazards model and the traditional Weibull model. The results show that the non-parametric EHM can effectively predict asset life using the condition indicator, operating environment indicator, and failure history.
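For context, the survival computation of a Weibull proportional hazards model, one of the comparison models named above, can be sketched as follows; the parameter values and covariates are invented for illustration, and the EHM itself is not reproduced:

```python
import math

# Weibull PH model: baseline cumulative hazard H0(t) = (t/eta)**beta is
# scaled by exp(gamma . z), where z holds covariates such as a condition
# indicator and an operating-environment indicator. All values below are
# hypothetical and not taken from the paper's case study.
def survival(t, beta, eta, gamma, z):
    # S(t) = exp(-H0(t) * exp(gamma . z))
    link = math.exp(sum(g * zi for g, zi in zip(gamma, z)))
    return math.exp(-((t / eta) ** beta) * link)

# Survival probability of a component at t = 500 hours, with illustrative
# shape beta=1.5, scale eta=1000 h, and two covariate effects
s = survival(t=500.0, beta=1.5, eta=1000.0, gamma=[0.8, 0.3], z=[0.2, 0.1])
```

The non-parametric EHM replaces the Weibull form of the baseline hazard with an estimate learned from the failure history, while keeping the covariate link; that distribution-free baseline is the paper's contribution.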

Relevance: 10.00%

Abstract:

Maintenance activities in a large-scale engineering system are usually scheduled according to the lifetimes of various components in order to ensure the overall reliability of the system. Lifetimes of components can be deduced from the corresponding probability distributions, with parameters estimated from past failure data. However, failure data for the components are not always readily available, and engineers must often rely on primitive information from the manufacturers, such as the mean and standard deviation of lifetime, to plan maintenance activities. In this paper, the moment-based piecewise polynomial model (MPPM) is proposed to estimate the parameters of the reliability probability distribution of a product when only the mean and standard deviation of the product lifetime are known. This method employs a group of polynomial functions to estimate the two parameters of the Weibull distribution, according to the mathematical relationship between the shape parameter of the two-parameter Weibull distribution and the ratio of the mean to the standard deviation. Tests are carried out to evaluate the validity and accuracy of the proposed method, with discussion of its suitability for application. The proposed method is particularly useful for reliability-critical systems, such as railway and power systems, in which maintenance activities are scheduled according to the expected lifetimes of the system components.
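The moment-matching relationship the method exploits can be sketched directly. Here the shape parameter is recovered by a numerical bisection on the coefficient-of-variation identity rather than by the paper's piecewise polynomial fit, and the mean/standard-deviation inputs are invented:

```python
import math

# For a two-parameter Weibull(shape k, scale lam):
#   mean = lam * Gamma(1 + 1/k)
#   CV^2 = (std/mean)^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1
# CV^2 depends on k alone and is strictly decreasing in k, so k can be
# recovered from mean and std by a one-dimensional root search.
def weibull_from_moments(mean, std):
    target = (std / mean) ** 2

    def cv2(k):
        return math.gamma(1 + 2 / k) / math.gamma(1 + 1 / k) ** 2 - 1

    lo, hi = 0.1, 50.0           # bracket: cv2(lo) >> target >> cv2(hi)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if cv2(mid) > target:    # dispersion still too high -> shape too small
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = mean / math.gamma(1 + 1 / k)   # scale from the mean identity
    return k, lam

# Illustrative manufacturer data: mean life 1000 h, std 500 h (CV = 0.5)
k, lam = weibull_from_moments(mean=1000.0, std=500.0)
```

The MPPM approximates the same k-versus-CV curve with fitted polynomials so that no iterative solve is needed at planning time; the bisection above is simply the exact version of that lookup.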

Relevance: 10.00%

Abstract:

Power load flow analysis is essential for system planning, operation, development and maintenance, and its application to railway supply systems is no exception. A railway power supply system distinguishes itself in terms of load pattern and mobility, as well as feeding system structure. An attempt has been made to apply probability load flow (PLF) techniques to electrified railways in order to examine the loading on the feeding substations and the voltage profiles of the trains. This study formulates a simple and reliable model to support the calculations required for probability load flow analysis in railway systems with an autotransformer (AT) feeding system, and describes the development of a software suite to realise the computation.

Relevance: 10.00%

Abstract:

In this paper, we present a ∑GIi/D/1/∞ queue with heterogeneous input/output slot times. This queueing model can be regarded as an extension of the ordinary GI/D/1/∞ model: several input streams arrive at the system according to different slot times, so that different slot times govern the different input/output processes. The queueing model can therefore be used for an ATM multiplexer with heterogeneous input/output link capacities. Several cases of the model are discussed to reflect different relationships among the input/output link capacities of such a multiplexer. In the queueing analysis, two approaches, the Markov model and the probability generating function technique, are adopted to develop the queue length distributions observed at different epochs.
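A discrete-event simulation sketch of the setting can make the heterogeneous slot times concrete: each input stream may deposit one cell per tick only on ticks that are multiples of its own slot time, while a deterministic server removes one cell per output slot. The slot times, arrival probabilities and horizon below are arbitrary, and the paper's analytical Markov/PGF treatment is not reproduced:

```python
import random

random.seed(7)

# streams: (slot_time, arrival_probability) pairs -- each stream i offers a
# cell with probability p_i on every tick divisible by its slot time, which
# models input links of different capacities.
def simulate(ticks=200000, streams=((2, 0.4), (3, 0.5)), out_slot=2):
    q = 0
    lengths = []
    for t in range(ticks):
        for slot, p in streams:
            if t % slot == 0 and random.random() < p:
                q += 1                     # arrival from stream with this slot
        if t % out_slot == 0 and q > 0:
            q -= 1                         # one deterministic departure per output slot
        lengths.append(q)                  # queue length observed at tick boundary
    return lengths

lengths = simulate()
mean_q = sum(lengths) / len(lengths)
# Offered load: 0.4/2 + 0.5/3 = 0.367 cells/tick versus service capacity
# 1/2 cell/tick, so the queue is stable and mean_q stays modest.
```

Swapping in other (slot, probability) pairs reproduces the different input/output capacity relationships the abstract enumerates as separate cases.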

Relevance: 10.00%

Abstract:

The primary clinical role of the non-invasive physical measurement of a bone, generally referred to as 'bone densitometry', is to identify those subjects at risk of an osteoporotic fracture and their subsequent response to pharmaceutical intervention. The true 'gold standard' measurement of the mechanical integrity of a bone, and hence its fracture load, is a destructive test, generally performed by compressing either a regularly shaped sample or a whole bone.