202 results for predictive value


Relevance: 30.00%

Abstract:

Mindfulness is a concept which has been widely used in studies on consciousness, but has recently been applied to the understanding of behaviours in other areas, including clinical psychology, meditation, physical activity, education and business. It has been suggested that mindfulness can also be applied to road safety, though this has not yet been researched. A standard definition of mindfulness is “paying attention in a particular way, on purpose in the present moment and non-judgemental to the unfolding of experience moment by moment” [1]. Scales have been developed to measure mindfulness; however, there are different views in the literature on the nature of the mindfulness construct. This paper reviews the issues raised in the literature and arrives at an operational definition of mindfulness considered relevant to road safety. It is further proposed that mindfulness is best construed as operating together with other psychosocial factors to influence road safety behaviours. The specific case of speeding behaviour is outlined, where the psychosocial variables in the Theory of Planned Behaviour (TPB) have been demonstrated to predict both intention to speed and actual speeding behaviour. A role is proposed for mindfulness in enhancing the explanatory and predictive powers of the TPB concerning speeding. The implications of mindfulness for speeding countermeasures are discussed and a program of future research is outlined.
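
As a hedged illustration of how such an enhancement of the TPB could be examined (this is not a method from the paper; the data frame and column names below are hypothetical), a hierarchical regression can add a mindfulness score to the standard TPB predictors of intention to speed and report the additional variance explained:

```python
# Hedged sketch only: hierarchical regression testing whether a mindfulness
# score adds explanatory power over the standard TPB predictors of intention
# to speed. Column names are hypothetical.
import statsmodels.formula.api as smf

def mindfulness_increment(df):
    tpb = smf.ols("intention ~ attitude + subjective_norm + perceived_control",
                  data=df).fit()
    tpb_plus = smf.ols("intention ~ attitude + subjective_norm "
                       "+ perceived_control + mindfulness", data=df).fit()
    return tpb_plus.rsquared - tpb.rsquared  # variance added by mindfulness
```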

Relevance: 30.00%

Abstract:

Purpose: This paper aims to show that identification of expectations and software functional requirements via consultation with potential users is an integral component of the development of an emergency department patient admissions prediction tool.

Design/methodology/approach: Thematic analysis of semi-structured interviews with 14 key health staff delivered rich data regarding existing practice and future needs. Participants included emergency department staff, bed managers, nurse unit managers, directors of nursing, and personnel from health administration.

Findings: Participants contributed contextual insights on the current system of admissions, revealing a culture of crisis imbued with misplaced communication. Their expectations and requirements of a potential predictive tool provided strategic data that guided the development of the Emergency Department Patient Admissions Prediction Tool, based on their insistence that it feature availability, reliability and relevance. To deliver these stipulations, participants stressed that it should be incorporated, validated, defined and timely.

Research limitations/implications: Participants were envisaging a concept and use of a tool that was somewhat hypothetical. However, further research will evaluate the tool in practice.

Practical implications: Participants' unsolicited recommendations regarding implementation will not only inform a subsequent phase of the tool evaluation, but are eminently applicable to any process of implementation in a healthcare setting.

Originality/value: The consultative process engaged clinicians, and the paper delivers an insider view of an overburdened system rather than an outsider's observations.

Relevance: 30.00%

Abstract:

Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. In contrast to traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data with specific scientific merit is worth the planning effort. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV, and to aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real-time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature utilizing ocean model predictions. This builds on previous work in this area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a freshwater plume using our algorithm. Additionally, we present experimental results from field trials that test the skill of the model used as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest, but measurable, improvement in surfacing error when the model predictions are incorporated into the planner.
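
As a hedged illustration of the kind of planning step described (not the authors' algorithm; the `current_at` and `neighbours` helpers and the nominal vehicle speed are assumptions), the sketch below runs an A*-style search in which each edge cost is the travel time through a predicted current field, so forecast currents shape the path chosen between two waypoints:

```python
# Minimal sketch: A*-style waypoint planner whose edge cost is travel time
# through a predicted current field (hypothetical helper functions).
import heapq
import math

def travel_time(p, q, current_at, vehicle_speed=1.5):
    """Time to move from p to q given the predicted current (m/s) at p."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    dist = math.hypot(dx, dy)
    cx, cy = current_at(p)                              # predicted current vector
    along = (cx * dx + cy * dy) / dist if dist else 0.0 # current along heading
    ground_speed = max(vehicle_speed + along, 0.1)      # avoid non-positive speed
    return dist / ground_speed

def plan(start, goal, neighbours, current_at):
    """A* over a waypoint grid; heuristic is straight-line time at 1.5 m/s."""
    open_set = [(0.0, start, [start])]
    best = {start: 0.0}
    while open_set:
        _, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        for nxt in neighbours(node):
            g = best[node] + travel_time(node, nxt, current_at)
            if g < best.get(nxt, float("inf")):
                best[nxt] = g
                h = math.hypot(goal[0] - nxt[0], goal[1] - nxt[1]) / 1.5
                heapq.heappush(open_set, (g + h, nxt, path + [nxt]))
    return None
```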

Relevance: 30.00%

Abstract:

Over the last few decades, construction project performance has come under increasing scrutiny owing to growing delays, cost overruns and quality failures. Growing numbers of disputes, inharmonious working environments, conflict, blame cultures, and mismatches of objectives among project teams have been found to be contributory factors to poor project performance. Performance measurement (PM) approaches have been developed to overcome these issues; however, the comprehensiveness of PM as an overall approach is still criticised in terms of the iron triangle, namely time, cost and quality. PM has primarily focused on objective measures; however, continuous improvement requires the inclusion of subjective measures, particularly contractor satisfaction (Co-S). Dealing with the two different groups of large and small-medium contractors is challenging because, to date, Co-S has not been extensively defined, particularly in developing countries such as Malaysia. Therefore, a Co-S model is developed in this research which aims to fulfil the current needs of the construction industry by integrating performance measures that address large and small-medium contractor perceptions. The positivist paradigm used in the research was adhered to by reviewing relevant literature and evaluating expert discussions on the research topic. This yielded a basis for the development of the contractor satisfaction model (CoSMo), which consists of three elements: contractor satisfaction (Co-S) dimensions, contributory factors, and characteristics (project and participant). Valid questionnaire responses from 136 contractors in Malaysia led to the prediction of several key factors of contractor satisfaction and to an examination of the relationships between elements. The relationships were examined through a series of sequential statistical analyses, namely correlation, one-way analysis of variance (ANOVA), t-tests and multiple regression analysis (MRA). Forward and backward MRAs were used to develop Co-S mathematical models. Sixteen Co-S models were developed for both large and small-medium contractors. These determined that large-contractor Co-S in Malaysia was most affected by the conciseness of the project scope and the quality of the project brief. In contrast, Co-S for small-medium contractors was strongly affected by the efficiency of risk control in a project. The results of the research provide empirical evidence in support of the notion that appropriate communication systems in projects negatively contribute to large-contractor Co-S with respect to cost and profitability. The uniqueness of several Co-S predictors was also identified through a series of analyses of small-medium contractors. These contractors appear to be less satisfied than large contractors when participants lack effectiveness in timely authoritative decision-making and communication between project team members. Interestingly, the empirical results show that effective project health and safety measures are influencing factors in satisfying both large and small-medium contractors. The perspectives of large and small-medium contractors on the performance of the entire project development were derived from the Co-S models. These were statistically validated and refined before a new Co-S model was developed. Developing such a unique model has the potential to increase project value and benefit all project participants. It is important to improve participant collaboration, as it leads to better project performance.
This study may encourage key project participants, such as clients, consultants, subcontractors and suppliers, to increase their attention to contractor needs in the development of a project. Recommendations for future research include investigating other participants' perspectives on CoSMo and the impact of implementing CoSMo in a project, since this study is focused purely on the contractor perspective.
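
To make the forward multiple regression step concrete, here is a hedged sketch (the data frame and column names are hypothetical, not taken from the thesis) that greedily adds the satisfaction predictor giving the largest gain in adjusted R-squared:

```python
# Illustrative forward-selection MRA sketch; `df` is assumed to hold one row
# per respondent with a satisfaction score and candidate predictor columns.
import statsmodels.api as sm

def forward_selection(df, response, candidates):
    """Greedily add the predictor that most improves adjusted R-squared."""
    selected, best_adj_r2 = [], float("-inf")
    improved = True
    while improved and candidates:
        improved = False
        scores = []
        for c in candidates:
            X = sm.add_constant(df[selected + [c]])
            fit = sm.OLS(df[response], X).fit()
            scores.append((fit.rsquared_adj, c))
        adj_r2, best_c = max(scores)
        if adj_r2 > best_adj_r2:
            best_adj_r2, improved = adj_r2, True
            selected.append(best_c)
            candidates.remove(best_c)
    return selected

# Example call with hypothetical column names:
# forward_selection(df, "contractor_satisfaction",
#                   ["scope_conciseness", "brief_quality", "risk_control"])
```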

Relevance: 30.00%

Abstract:

In the Bayesian framework, a standard approach to model criticism is to compare some function of the observed data to a reference predictive distribution. The result of the comparison can be summarized in the form of a p-value, and it is well known that computation of some kinds of Bayesian predictive p-values can be challenging. The use of regression adjustment approximate Bayesian computation (ABC) methods is explored for this task. Two problems are considered. The first is the calibration of posterior predictive p-values so that they are uniformly distributed under some reference distribution for the data. Computation is difficult because the calibration process requires repeated approximation of the posterior for different data sets under the reference distribution. The second problem considered is approximation of distributions of prior predictive p-values for the purpose of choosing weakly informative priors in the case where the model-checking statistic is expensive to compute. Here the computation is difficult because of the need to repeatedly sample from a prior predictive distribution for different values of a prior hyperparameter. In both these problems we argue that high accuracy in the computations is not required, which makes fast approximations such as regression adjustment ABC very useful. We illustrate our methods with several examples.
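
A minimal sketch of the calibration target (not the paper's ABC machinery; both helper functions are hypothetical stand-ins): the observed posterior predictive p-value is compared against p-values computed on replicate data sets drawn from the reference distribution, and it is this inner, repeatedly evaluated posterior quantity that regression adjustment ABC is used to approximate cheaply.

```python
# Sketch of calibrating a posterior predictive p-value so it is uniform under
# a reference distribution for the data. `simulate_reference_data` and
# `posterior_predictive_pvalue` are hypothetical placeholders; the latter is
# the expensive step the paper approximates with regression adjustment ABC.
import numpy as np

def calibrated_ppp(p_obs, simulate_reference_data, posterior_predictive_pvalue,
                   n_reps=200, seed=None):
    rng = np.random.default_rng(seed)
    p_reps = [posterior_predictive_pvalue(simulate_reference_data(rng))
              for _ in range(n_reps)]
    # Calibrated value: probability of a p-value as small as p_obs under the
    # reference distribution.
    return float(np.mean(np.asarray(p_reps) <= p_obs))
```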

Relevance: 30.00%

Abstract:

Overprocessing waste occurs in a business process when effort is spent in a way that adds value neither to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks", meaning activities that classify a case into "accepted" or "rejected", such that if the case is accepted it proceeds forward, while if rejected, it is cancelled and all work performed in the case is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent in other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose to order knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach where knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
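
A compressed sketch of the runtime ordering idea (the function names and the sklearn-style classifier interface are assumptions, not the paper's implementation): each knockout check receives a case-specific rejection probability from a trained model, and checks are executed in ascending order of expected effort per rejection, so cheap checks that are likely to knock the case out run first.

```python
# Hedged sketch: order knockout checks for the current case by
# effort / predicted rejection probability (ascending).
def order_checks(case_features, checks):
    """checks: iterable of (name, mean_effort, fitted_classifier) tuples,
    where each classifier exposes a sklearn-style predict_proba."""
    def expected_effort_per_rejection(check):
        _name, effort, clf = check
        p_reject = clf.predict_proba([case_features])[0][1]  # class 1 = rejected
        return effort / max(p_reject, 1e-6)                  # guard against zero
    return sorted(checks, key=expected_effort_per_rejection)
```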

Relevance: 30.00%

Abstract:

This paper addresses the following predictive business process monitoring problem: Given the execution trace of an ongoing case, and given a set of traces of historical (completed) cases, predict the most likely outcome of the ongoing case. In this context, a trace refers to a sequence of events with corresponding payloads, where a payload consists of a set of attribute-value pairs. Meanwhile, an outcome refers to a label associated with completed cases, for example a label indicating that a given case completed "on time" (with respect to a given desired duration) or "late", or a label indicating that a given case led to a customer complaint or not. The paper tackles this problem via a two-phased approach. In the first phase, prefixes of historical cases are encoded using complex symbolic sequences and clustered. In the second phase, a classifier is built for each of the clusters. To predict the outcome of an ongoing case at runtime given its (uncompleted) trace, we select the closest cluster(s) to the trace in question and apply the respective classifier(s), taking into account the Euclidean distance of the trace from the center of the clusters. We consider two families of clustering algorithms – hierarchical clustering and k-medoids – and use random forests for classification. The approach was evaluated on four real-life datasets.
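
The two phases can be summarised in a short sketch (illustrative only: prefixes are assumed to already be encoded as numeric vectors, and agglomerative clustering stands in here for the hierarchical and k-medoids variants evaluated in the paper):

```python
# Sketch of the two-phase approach: cluster encoded historical prefixes,
# fit one random forest per cluster, then classify a running case with the
# classifier of its nearest cluster centre (Euclidean distance).
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.ensemble import RandomForestClassifier

def train(prefix_vectors, outcomes, n_clusters=3):
    labels = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(prefix_vectors)
    centres, classifiers = [], []
    for k in range(n_clusters):
        members = labels == k
        centres.append(prefix_vectors[members].mean(axis=0))
        classifiers.append(RandomForestClassifier(n_estimators=100)
                           .fit(prefix_vectors[members], outcomes[members]))
    return np.vstack(centres), classifiers

def predict_outcome(trace_vector, centres, classifiers):
    nearest = int(np.argmin(np.linalg.norm(centres - trace_vector, axis=1)))
    return classifiers[nearest].predict([trace_vector])[0]
```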

Relevance: 30.00%

Abstract:

Purpose: The object of this paper is to examine whether the improvements in technology that enhance community understanding of the frequency and severity of natural hazards have also increased the risk of potential liability of planning authorities in negligence. In Australia, the National Strategy imposes a resilience-based approach to disaster management and stresses that responsible land use planning can reduce or prevent the impact of natural hazards upon communities.

Design/methodology/approach: This paper analyses how the principles of negligence allocate responsibility for loss suffered by a landowner in a hazard-prone area between the landowner and local government.

Findings: The analysis in this paper concludes that, despite being able to establish a causal link between the loss suffered by a landowner and the approval of a local authority to build in a hazard-prone area, it would be only in the rarest of circumstances that a negligence action could be proven.

Research limitations/implications: The focus of this paper is on planning policies and land development, not on the negligent provision of advice or information by the local authority.

Practical implications: This paper identifies the issues a landowner may face when seeking compensation from a local authority for loss suffered due to the occurrence of a natural hazard known or predicted to be possible in the area.

Originality/value: The paper establishes that, as risk managers, local authorities must place reliance upon scientific modelling and predictive technology when determining planning processes in order to fulfil their responsibilities under the National Strategy and to limit any possible liability in negligence.

Relevance: 30.00%

Abstract:

One of the objectives of general-purpose financial reporting is to provide information about the financial position, financial performance and cash flows of an entity that is useful to a wide range of users in making economic decisions. The current focus on potentially increased relevance of fair value accounting weighed against issues of reliability has failed to consider the potential impact on the predictive ability of accounting. Based on a sample of international (non-U.S.) banks from 24 countries during 2009-2012, we test the usefulness of fair values in improving the predictive ability of earnings. First, we find that the increasing use of fair values on balance-sheet financial instruments enhances the ability of current earnings to predict future earnings and cash flows. Second, we provide evidence that the fair value hierarchy classification choices affect the ability of earnings to predict future cash flows and future earnings. More precisely, we find that the non-discretionary fair value component (Level 1 assets) improves the predictability of current earnings whereas the discretionary fair value components (Level 2 and Level 3 assets) weaken the predictive power of earnings. Third, we find a consistent and strong association between factors reflecting country-wide institutional structures and predictive power of fair values based on discretionary measurement inputs (Level 2 and Level 3 assets and liabilities). Our study is timely and relevant. The findings have important implications for standard setters and contribute to the debate on the use of fair value accounting.
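
As an illustrative sketch only (the variable names and clustering choice are hypothetical, not the authors' specification), the predictive-ability test can be pictured as a regression of next-period earnings on current earnings interacted with the shares of Level 1, Level 2 and Level 3 fair-value assets, so the interaction coefficients indicate whether each fair-value component strengthens or weakens the ability of current earnings to predict future earnings:

```python
# Hedged predictive-ability regression sketch; `df` is assumed to contain one
# bank-year per row with hypothetical column names.
import statsmodels.formula.api as smf

def predictability_regression(df):
    formula = ("earnings_next ~ earnings"
               " + earnings:level1_share + earnings:level2_share"
               " + earnings:level3_share")
    # Standard errors clustered by bank (illustrative choice).
    return smf.ols(formula, data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["bank_id"]})
```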