915 results for Probability Metrics


Relevance: 10.00%

Abstract:

This paper proposes a new prognosis model, based on health state estimation, for accurate assessment of the remnant life of machines. To evaluate the health stages of machines, a Support Vector Machine (SVM) classifier was employed to obtain the probability of each health state. Two case studies involving bearing failures were used to validate the proposed model: simulated bearing failure data and experimental data from an accelerated bearing test rig were used to train and test the model. The results are encouraging and show that the proposed prognostic model has the potential to be used as an estimation tool for machine remnant life prediction.
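
The abstract gives no implementation details; the following is a minimal sketch of the core step, assuming scikit-learn's SVC with Platt-scaled probabilities and invented vibration features:

```python
# Minimal sketch (not the authors' code): estimating health-state
# probabilities with an SVM, as the abstract describes. The feature
# layout and the four health states are hypothetical illustrations.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical training data: two vibration features (e.g. RMS, kurtosis)
# labelled with a discrete health state 0 (new) .. 3 (near failure).
X_train = rng.normal(size=(200, 2)) + np.repeat(np.arange(4), 50)[:, None]
y_train = np.repeat(np.arange(4), 50)

# probability=True enables Platt scaling, giving P(state | features).
clf = SVC(kernel="rbf", probability=True, random_state=0)
clf.fit(X_train, y_train)

x_new = rng.normal(size=(1, 2)) + 2.0      # a new condition reading
state_probs = clf.predict_proba(x_new)[0]  # one probability per health state
print(dict(zip(clf.classes_, state_probs.round(3))))
```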

Relevance: 10.00%

Abstract:

Healthcare-associated methicillin-resistant Staphylococcus aureus (MRSA) infection may cause increased hospital stay or, sometimes, death. Quantifying this effect is complicated because infection is a time-dependent exposure: infection may prolong hospital stay, while longer stays increase the risk of infection. We overcome these problems by using a multinomial longitudinal model for estimating the daily probability of death and discharge. We then extend the basic model to estimate how the effect of MRSA infection varies over time, and to quantify the number of excess ICU days due to infection. We find that infection decreases the relative risk of discharge (relative risk ratio = 0.68, 95% credible interval: 0.54, 0.82), but is only indirectly associated with increased mortality. An infection on the first day of admission resulted in a mean extra stay of 0.3 days (95% CI: 0.1, 0.5) for a patient with an APACHE II score of 10, and 1.2 days (95% CI: 0.5, 2.0) for a patient with an APACHE II score of 30. The decrease in the relative risk of discharge remained fairly constant with day of MRSA infection, but was slightly stronger closer to the start of infection. These results confirm the importance of MRSA infection in increasing ICU stay, but suggest that previous work may have systematically overestimated the effect size.
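
The abstract does not specify the implementation; a hedged sketch of a daily multinomial outcome model in this spirit, using simulated data (the covariate names mrsa and apache are stand-ins, not the authors' dataset) and statsmodels' MNLogit:

```python
# Illustrative sketch only: each ICU day is a row, and the daily outcome
# (stay / discharge / death) is modelled as multinomial with MRSA status
# as a time-dependent covariate, echoing the abstract's setup.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000  # one row per patient-day

df = pd.DataFrame({
    "mrsa": rng.integers(0, 2, n),   # time-dependent infection flag (0/1)
    "apache": rng.normal(15, 5, n),  # APACHE II severity score
})

# Daily outcome: 0 = stay in ICU, 1 = discharged, 2 = died.
logits = np.column_stack([
    np.zeros(n),                 # baseline category: stay
    -1.0 - 0.4 * df["mrsa"],     # infection lowers odds of discharge
    -4.0 + 0.05 * df["apache"],  # severity raises odds of death
])
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
df["outcome"] = [rng.choice(3, p=row) for row in p]

X = sm.add_constant(df[["mrsa", "apache"]])
fit = sm.MNLogit(df["outcome"], X).fit(disp=False)
# exp(coefficients) in the discharge equation play the role of the
# relative risk ratio of discharge for infected vs uninfected days.
print(np.exp(fit.params))
```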

Relevance: 10.00%

Abstract:

In recent decades, concepts and ideas from James J. Gibson’s theory of direct perception in ecological psychology have been applied to the study of how perception and action regulate sport performance. This article examines the influence of different streams of thought in ecological psychology for studying cognition and action in the diverse behavioural contexts of sport and exercise. In discussing the origins of ecological psychology it can be concluded that psychologists such as Lewin, and to some extent Heider, provided the initial impetus for the development of key ideas. We argue that the papers in this special issue clarify that the different schools of thinking in ecological psychology have much to contribute to theoretical and practical developments in sport and exercise psychology. For example, Gibson emphasized and formalized how the individual is coupled with the environment; Brunswik raised the issue of the ontology of probability in human behaviour and the problem of representative design for experimental task constraints; Barker looked carefully into extra-individual behavioural contexts and Bronfenbrenner presented insights pertinent to the relations between behaviour contexts, and macro influences on behaviour. In this overview, we highlight essential issues from the main schools of thought of relevance to the contexts of sport and exercise, and we consider some potential theoretical linkages with dynamical systems theory.

Relevance: 10.00%

Abstract:

The aim of this paper is to provide a contemporary summary of statistical and non-statistical meta-analytic procedures that have relevance to the type of experimental designs often used by sport scientists when examining differences/change in dependent measure(s) as a result of one or more independent manipulation(s). Using worked examples from studies on observational learning in the motor behaviour literature, we adopt a random effects model and give a detailed explanation of the statistical procedures for the three types of raw score difference-based analyses applicable to between-participant, within-participant, and mixed-participant designs. Major merits and concerns associated with these quantitative procedures are identified and agreed methods are reported for minimizing biased outcomes, such as those for dealing with multiple dependent measures from single studies, design variation across studies, different metrics (i.e. raw scores and difference scores), and variations in sample size. To complement the worked examples, we summarize the general considerations required when conducting and reporting a meta-analysis, including how to deal with publication bias, what information to present regarding the primary studies, and approaches for dealing with outliers. By bringing together these statistical and non-statistical meta-analytic procedures, we provide the tools required to clarify understanding of key concepts and principles.
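
As one concrete instance of the random effects model the authors adopt, here is a hedged sketch of DerSimonian-Laird pooling of raw-score effect sizes (the effects and variances are invented, and this is not necessarily the exact estimator used in the paper):

```python
# Random-effects pooling via DerSimonian-Laird: estimate between-study
# variance (tau^2), then combine studies with inverse-variance weights.
import numpy as np

d = np.array([0.30, 0.55, 0.10, 0.42])  # per-study effect sizes (made up)
v = np.array([0.04, 0.09, 0.02, 0.05])  # per-study sampling variances

w = 1.0 / v                              # fixed-effect weights
d_fixed = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - d_fixed) ** 2)       # Cochran's heterogeneity statistic
dfree = len(d) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - dfree) / C)         # between-study variance estimate

w_star = 1.0 / (v + tau2)                # random-effects weights
d_re = np.sum(w_star * d) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled d = {d_re:.3f}, "
      f"95% CI = [{d_re - 1.96 * se:.3f}, {d_re + 1.96 * se:.3f}]")
```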

Relevance: 10.00%

Abstract:

The advantages of using a balanced approach to measuring overall organisational performance are well known. We examined the effects of a balanced approach in the more specific domain of measuring innovation effectiveness in 144 small- to medium-sized companies in Australia and Thailand. We found no differences in the metrics used by Australian and Thai companies. In line with our hypotheses, we found that those SMEs that took a balanced approach were more likely to perceive benefits from implemented innovations than those that used only a financial approach to measurement. The perception of benefits then had a subsequent effect on overall attitudes towards innovation. The study shows the importance of measuring both financial and non-financial indicators of innovation effectiveness within SMEs and discusses ways in which such measurement can be conducted with limited resources.

Relevance: 10.00%

Abstract:

Traditionally, the acquisition of skills and sport movements has been characterised by numerous repetitions of a presumed model movement pattern to be acquired by learners. This approach has been questioned by research identifying the presence of individualised movement patterns and the low probability of two identical movements occurring within or between individuals. In contrast, the differential learning approach claims advantages for incorporating variability into the learning process by adding stochastic perturbations during practice. These ideas are exemplified by data from a high jump experiment which compared the effectiveness of a classical and a differential training approach in a pre-post test design. Results showed clear advantages for the group given additional stochastic perturbations during the acquisition phase in comparison to classically trained athletes. Analogies to similar phenomenological effects in the neurobiological literature are discussed.

Relevance: 10.00%

Abstract:

To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to build incrementally a map of its surroundings and to localize itself in the map simultaneously. The aim of this research project is to develop a SLAM system suitable for self-propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is the panoramic view of all the landmarks in the scene. Placing landmarks in a lawn field to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers.

The common approach of existing bearing-only SLAM methods relies on a motion model for predicting the robot's pose and a sensor model for updating the pose. In the motion model, the error on the estimates of object positions is accumulated, due mainly to wheel slippage. Quantifying accurately the uncertainty of object positions is a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of landmark position should be uniform along the observed bearing. Existing methods that approximate the PDF with a Gaussian estimation do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address the above problems. The main novel contributions of this thesis are:

1. A bearing-only SLAM method not requiring odometry. The proposed method relies solely on the sensor model (landmark bearings only) without relying on the motion model (odometry). The uncertainty of the estimated landmark positions depends on the vision error only, instead of the combination of both odometry and vision errors.

2. The transformation of the spatial uncertainty of objects. This thesis introduces a novel method for translating the spatial uncertainty of objects estimated from a moving frame attached to the robot into the global frame attached to the static landmarks in the environment.

3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and the marginal probability on range is constrained to be uniform. Compared to the PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can be easily adopted in a probabilistic framework, such as a particle filtering system.

The main advantages of our proposed bearing-only SLAM system are its lower production cost and flexibility of use. The proposed system can be adopted in other domestic robots as well, such as vacuum cleaners or robotic toys, when the terrain is essentially 2D.
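
A minimal sketch of the key property in contribution 3, assuming a particle representation and invented values for the range bounds and bearing noise: landmark-position hypotheses whose marginal over range is uniform along the observed bearing, rather than a single Gaussian blob.

```python
# Sketch (assumptions noted in comments): particles for a landmark whose
# marginal over range is uniform along the observed bearing, satisfying
# the uniformity requirement the abstract describes.
import numpy as np

rng = np.random.default_rng(2)
n_particles = 1000

bearing_obs = np.deg2rad(30.0)   # observed bearing from the robot
sigma_bearing = np.deg2rad(2.0)  # assumed vision (bearing) noise
r_min, r_max = 0.5, 20.0         # assumed valid range interval

# Uniform marginal on range; Gaussian noise only across the bearing.
r = rng.uniform(r_min, r_max, n_particles)
theta = rng.normal(bearing_obs, sigma_bearing, n_particles)

# Landmark hypotheses in the robot frame, to be re-weighted by later
# observations within a particle filter.
landmarks = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
print(landmarks.mean(axis=0), landmarks.std(axis=0))
```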

Relevance: 10.00%

Abstract:

A shortage of affordable housing is a major problem in Australia today. This is mainly due to the limited supply of affordable housing that is provided by the non-government housing sector. Some private housing developers see the provision of affordable housing for lower income people as a high risk investment which offers a lower return than broader market-based housing. The scarcity of suitable land, a limited government ‘subsidy’, and increasing housing costs have not provided sufficient development incentives to encourage their investment despite the existing high demand for affordable housing. This study analyses the risk management process conducted by some private and not-for-profit housing providers in South East Queensland, and draws conclusions about the relationship between risk assessments/responses and past experiences. In-depth interviews of selected non-government housing providers were conducted to facilitate an understanding of their approach to risk assessment/response in developing and in managing affordable housing projects. These developers use an informal risk management process as part of their normal business process in accordance with industry standards. A simple qualitative matrix has been used to analyse probability and impacts using a qualitative scale (low, medium and high). For housing providers who have considered investing in affordable housing but have not yet implemented any such projects, affordable housing development is seen as an opportunity that needs to be approached with caution. The risks associated with such projects, and the levels of acceptance of these, are not consistently identified by current housing providers. Many interviewees agree that the recognition of financial risk and the fear of community rejection of such housing projects have restrained them from committing to such investment projects. This study suggests that implementing improvements to the risk mitigation and management framework may assist in promoting the supply of affordable housing by non-government providers.
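
As an illustration of the simple qualitative matrix mentioned, here is a toy sketch; the scoring rule is an assumption for illustration, not the providers' actual practice:

```python
# Toy qualitative risk matrix: probability and impact each rated
# low/medium/high and combined into an overall rating. The thresholds
# below are illustrative assumptions.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_rating(probability: str, impact: str) -> str:
    """Combine qualitative probability and impact into a rating."""
    score = LEVELS[probability] * LEVELS[impact]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Example: community rejection judged medium probability, high impact.
print(risk_rating("medium", "high"))  # -> high
```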

Relevance: 10.00%

Abstract:

Query reformulation is a key user behavior during Web search. Our research goal is to develop predictive models of query reformulation during Web searching. This article reports results from a study in which we automatically classified the query-reformulation patterns for 964,780 Web searching sessions, composed of 1,523,072 queries, to predict the next query reformulation. We employed an n-gram modeling approach to describe the probability of users transitioning from one query-reformulation state to another to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction, coverage of the dataset, and complexity of the possible pattern set. The results show that Reformulation and Assistance account for approximately 45% of all query reformulations; furthermore, the results demonstrate that the first- and second-order models provide the best predictability, between 28 and 40% overall and higher than 70% for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance.
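
The study's data are not reproduced here; a minimal sketch of the first-order n-gram idea, with invented states and sessions standing in for the classified logs:

```python
# First-order (bigram) transition model over query-reformulation states:
# count observed state-to-state transitions and predict the most likely
# next state. The state labels and sessions are invented stand-ins.
from collections import Counter, defaultdict

sessions = [
    ["New", "Reformulation", "Assistance", "Reformulation"],
    ["New", "Specialization", "Reformulation", "Assistance"],
    ["New", "Reformulation", "Reformulation", "Assistance"],
]

counts = defaultdict(Counter)
for s in sessions:
    for prev, nxt in zip(s, s[1:]):
        counts[prev][nxt] += 1  # transition frequency prev -> nxt

def predict_next(state: str) -> str:
    """Most probable next reformulation state given the current one."""
    return counts[state].most_common(1)[0][0]

print(predict_next("Reformulation"))  # -> "Assistance" on this toy data
```

Higher-order models follow the same pattern with the key being the last two, three, or four states rather than one.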

Relevance: 10.00%

Abstract:

Modern Engineering Asset Management (EAM) requires the accurate assessment of current, and the prediction of future, asset health condition. Appropriate mathematical models that are capable of estimating times to failure and the probability of failures in the future are essential in EAM. In most real-life situations, the lifetime of an engineering asset is influenced and/or indicated by different factors that are termed covariates. Hazard prediction with covariates is an elemental notion in reliability theory: it estimates the tendency of an engineering asset to fail instantaneously beyond the current time, given that it has survived up to the current time. A number of statistical covariate-based hazard models have been developed. However, none of them explicitly incorporates both external and internal covariates into one model. This paper introduces a novel covariate-based hazard model to address this concern. The model is named the Explicit Hazard Model (EHM). Both the semi-parametric and non-parametric forms of this model are presented in the paper. The major purpose of this paper is to illustrate the theoretical development of EHM. Due to page limitations, a case study with reliability field data is presented in the applications part of this study.
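
EHM itself is not specified in the abstract; for context, here is a sketch of the classic semi-parametric member of the covariate-based hazard model family the paper surveys (Cox proportional hazards via the lifelines package, with simulated data and assumed covariate names):

```python
# Sketch of a standard covariate-based hazard model (Cox PH), not the
# proposed EHM. "temp" stands in for an external covariate and "wear"
# for an internal one; both data and effects are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 300
temp = rng.normal(60, 10, n)    # assumed external covariate
wear = rng.normal(0.5, 0.1, n)  # assumed internal covariate

# Simulated lifetimes: higher temperature and wear shorten life.
lifetime = rng.exponential(1000, n) * np.exp(
    -0.02 * (temp - 60) - 2.0 * (wear - 0.5))
observed = (rng.random(n) < 0.8).astype(int)  # 0 = suspended (censored)

df = pd.DataFrame({"T": lifetime, "E": observed, "temp": temp, "wear": wear})
cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()  # hazard ratio per covariate
```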

Relevance: 10.00%

Abstract:

This paper reports results from a study in which we automatically classified the query reformulation patterns for 964,780 Web searching sessions (composed of 1,523,072 queries) in order to predict what the next query reformulation would be. We employed an n-gram modeling approach to describe the probability of searchers transitioning from one query reformulation state to another and to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction. Findings show that Reformulation and Assistance account for approximately 45 percent of all query reformulations. Searchers seem to seek system searching assistance early in the session or after a content change. The results of our evaluations show that the first- and second-order models provided the best predictability, between 28 and 40 percent overall, and higher than 70 percent for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance in real time.

Relevance: 10.00%

Abstract:

When complex projects go wrong they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics to provide early warning of potential difficulties. The assessment of success of complex projects can be made by a range of stakeholders over different time scales, against different levels of project results: the project's outputs at the end of the project; the project's outcomes in the months following project completion; and the project's impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells to show whether a project is deviating from plan, so that early corrective action can be taken. It may be that different combinations of the leading performance indicators will be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology is described to evaluate the early parts of this model, and its implications and limitations are discussed. This paper describes work in progress.

Relevance: 10.00%

Abstract:

Crash risk is the statistical probability of a crash. Its assessment can be performed through ex post statistical analysis or in real time with on-vehicle systems, which can be cooperative. Cooperative Vehicle-Infrastructure Systems (CVIS) are a developing research avenue in the automotive industry worldwide. This paper provides a survey of existing CVIS systems and of methods to assess crash risk with them, and describes the advantages of cooperative systems over non-cooperative systems. A sample of cooperative crash risk assessment systems is analysed to extract vulnerabilities according to three criteria: market penetration, over-reliance on GPS, and broadcasting issues. The analysis shows that cooperative risk assessment systems are still in their infancy and require further development to provide their full benefits to road users.

Relevance: 10.00%

Abstract:

Purpose: To assess the repeatability and validity of lens densitometry derived from the Pentacam Scheimpflug imaging system.

Setting: Eye Clinic, Queensland University of Technology, Brisbane, Australia.

Methods: This prospective cross-sectional study evaluated 1 eye of subjects with or without cataract. Scheimpflug measurements and slitlamp and retroillumination photographs were taken through a dilated pupil. Lenses were graded with the Lens Opacities Classification System III. Intraobserver and interobserver reliability of 3 observers, each performing 3 repeated Scheimpflug lens densitometry measurements, was assessed. Three lens densitometry metrics were evaluated: linear, for which a line was drawn through the visual axis and a mean lens densitometry value given; peak, the point at which lens densitometry is greatest on the densitogram; and 3-dimensional (3D), in which a fixed, circular 3.0 mm area of the lens is selected and a mean lens densitometry value given. Bland and Altman analysis of repeatability for multiple measures was applied; results were reported as the repeatability coefficient and relative repeatability (RR).

Results: Twenty eyes were evaluated. Repeatability was high. Overall, interobserver repeatability was marginally lower than intraobserver repeatability. The peak was the least reliable metric (RR 37.31%) and 3D the most reliable (RR 5.88%). Intraobserver and interobserver lens densitometry values in the cataract group were slightly less repeatable than in the noncataract group.

Conclusion: The intraobserver and interobserver repeatability of Scheimpflug lens densitometry was high in eyes with and without cataract, which supports the use of automated lens density scoring with the Scheimpflug system evaluated in the study.
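
As context for the Bland and Altman repeatability analysis named above, here is a hedged sketch of the standard within-subject computation; the readings are invented, and expressing RR as the repeatability coefficient over the grand mean is an assumption about the paper's definition:

```python
# Within-subject repeatability in the Bland and Altman style:
# repeatability coefficient RC = 1.96 * sqrt(2) * within-subject SD.
import numpy as np

# Rows = eyes, columns = 3 repeated Scheimpflug densitometry readings.
readings = np.array([
    [8.1, 8.3, 8.2],
    [9.0, 8.8, 9.1],
    [7.5, 7.6, 7.4],
    [10.2, 10.0, 10.3],
])

# Within-subject variance: mean of the per-subject sample variances.
sw = np.sqrt(readings.var(axis=1, ddof=1).mean())
rc = 1.96 * np.sqrt(2) * sw       # repeatability coefficient
rr = 100 * rc / readings.mean()   # relative repeatability (% of mean)
print(f"RC = {rc:.3f}, RR = {rr:.1f}%")
```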

Relevance: 10.00%

Abstract:

The ability to forecast machinery failure is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models for forecasting machinery health based on condition data. Although these models have aided the advancement of the discipline, they have made only a limited contribution to developing an effective machinery health prognostic system. The literature review indicates that there is not yet a prognostic model that directly models and fully utilises suspended condition histories (which are very common in practice, since organisations rarely allow their assets to run to failure); that effectively integrates population characteristics into prognostics for longer-range prediction in a probabilistic sense; that deduces the non-linear relationship between measured condition data and actual asset health; and that involves minimal assumptions and requirements.

This work presents a novel approach to addressing the above-mentioned challenges. The proposed model consists of a feed-forward neural network, the training targets of which are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density estimator. The adapted Kaplan-Meier estimator is able to model the actual survival status of individual failed units and estimate the survival probability of individual suspended units. The degradation-based failure probability density estimator, on the other hand, extracts population characteristics and computes conditional reliability from available condition histories instead of from reliability data. The estimated survival probability and the relevant condition histories are respectively presented as the “training target” and “training input” to the neural network. The trained network is capable of estimating the future survival curve of a unit when a series of condition indices is inputted.

Although the concept proposed may be applied to the prognosis of various machine components, rolling element bearings were chosen as the research object because rolling element bearing failure is one of the foremost causes of machinery breakdowns. Computer-simulated and industry case study data were used to compare the prognostic performance of the proposed model and four control models, namely: two feed-forward neural networks with the same training function and structure as the proposed model, but neglecting suspended histories; a time series prediction recurrent neural network; and a traditional Weibull distribution model. The results support the assertion that the proposed model performs better than the other four models and that it produces adaptive prediction outputs with a useful representation of survival probabilities.

This work presents a compelling concept for non-parametric, data-driven prognosis and for utilising available asset condition information more fully and accurately. It demonstrates that machinery health can indeed be forecasted. The proposed prognostic technique, together with ongoing advances in sensors and data-fusion techniques, and increasingly comprehensive databases of asset condition data, holds promise for increased asset availability, maintenance cost effectiveness, operational safety and, ultimately, organisational competitiveness.
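
As a reference point for the survival-probability training targets described above, here is a sketch of a standard Kaplan-Meier computation with suspensions treated as censored; the authors use an adapted variant not shown here, and the failure/suspension times below are invented:

```python
# Standard Kaplan-Meier survival estimate over a small fleet of bearings.
# Suspended units (event = 0) leave the risk set without counting as
# failures; S(t) values like these serve as neural-network targets in
# the approach the abstract describes.
import numpy as np

times = np.array([120, 150, 150, 200, 240, 240, 300])  # hours
event = np.array([1,   1,   0,   1,   0,   1,   1])    # 1=failure, 0=suspended

order = np.argsort(times)
times, event = times[order], event[order]

surv = 1.0
for t in np.unique(times):
    d = np.sum((times == t) & (event == 1))  # failures at time t
    n = np.sum(times >= t)                   # units still at risk at t
    if d:
        surv *= 1 - d / n
        print(f"t={t:>4}h  S(t)={surv:.3f}")  # a training-target value
```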