810 results for predictive value
Abstract:
Background: Gastric cancer is diagnosed late. Screening policies exist only in countries such as Korea and Japan, yet they would be justified in any country with a high prevalence of gastric cancer, such as Colombia or Chile. Serum pepsinogen analysis has been proposed for the diagnosis of premalignant and malignant gastric lesions, so this study aims to systematically review the literature on the diagnostic value of the pepsinogen I/II ratio as a marker of premalignant and malignant gastric lesions. Methods: The literature up to September 2016 was reviewed in the PubMed, OVID, EMBASE, EBSCO, LILACS, OPENGRAY and Dialnet databases, using the keywords malignant gastric lesions, premalignant gastric lesions and pepsinogen, selecting diagnostic test studies that evaluated the pepsinogen I/II ratio against histological findings. Results: Twenty-one articles with a total of 20,601 patients were included, showing, for the pepsinogen I/II ratio in the diagnosis of premalignant and malignant gastric lesions, a sensitivity of 13.7%-91.2%, a specificity of 38.5%-100%, a positive predictive value of 6.3%-100% and a negative predictive value of 33.3%-98.8%. Conclusions: Decreased pepsinogen I/II ratio values are associated with the presence of premalignant and malignant gastric lesions. Since the ratio has better specificity than sensitivity, as a screening test it would be useful for selecting patients who would benefit from upper gastrointestinal endoscopy. Further diagnostic test studies are required to validate a specific cut-off point that can be used as a standard value.
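The wide spread in the reported predictive values is partly a prevalence effect: PPV and NPV follow from sensitivity, specificity and prevalence by Bayes' rule. A minimal Python sketch of that relationship (the numbers below are illustrative, not taken from the review):

```python
def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Derive PPV and NPV from sensitivity, specificity and prevalence (Bayes' rule)."""
    tp = sensitivity * prevalence              # true positives per person screened
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Illustrative numbers only (not taken from the review): a fairly specific test
# applied in a high-prevalence screening population.
ppv, npv = predictive_values(sensitivity=0.70, specificity=0.95, prevalence=0.05)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")
```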
Abstract:
Mindfulness is a concept which has been widely used in studies on consciousness, but has recently been applied to the understanding of behaviours in other areas, including clinical psychology, meditation, physical activity, education and business. It has been suggested that mindfulness can also be applied to road safety, though this has not yet been researched. A standard definition of mindfulness is “paying attention in a particular way: on purpose, in the present moment, and non-judgementally to the unfolding of experience moment by moment” [1]. Scales have been developed to measure mindfulness; however, there are different views in the literature on the nature of the mindfulness construct. This paper reviews the issues raised in the literature and arrives at an operational definition of mindfulness considered relevant to road safety. It is further proposed that mindfulness is best construed as operating together with other psychosocial factors to influence road safety behaviours. The specific case of speeding behaviour is outlined, where the psychosocial variables in the Theory of Planned Behaviour (TPB) have been demonstrated to predict both intention to speed and actual speeding behaviour. A role is proposed for mindfulness in enhancing the explanatory and predictive powers of the TPB concerning speeding. The implications of mindfulness for speeding countermeasures are discussed and a program of future research is outlined.
Abstract:
Purpose: This paper aims to show that identification of expectations and software functional requirements via consultation with potential users is an integral component of the development of an emergency department patient admissions prediction tool. ---------- Design/methodology/approach: Thematic analysis of semi-structured interviews with 14 key health staff delivered rich data regarding existing practice and future needs. Participants included emergency department staff, bed managers, nurse unit managers, directors of nursing, and personnel from health administration. ---------- Findings: Participants contributed contextual insights on the current system of admissions, revealing a culture of crisis, imbued with misplaced communication. Their expectations and requirements of a potential predictive tool provided strategic data that moderated the development of the Emergency Department Patient Admissions Prediction Tool, based on their insistence that it feature availability, reliability and relevance. In order to deliver these stipulations, participants stressed that it should be incorporated, validated, defined and timely. ---------- Research limitations/implications: Participants were envisaging a concept and use of a tool that was somewhat hypothetical. However, further research will evaluate the tool in practice. ---------- Practical implications: Participants' unsolicited recommendations regarding implementation will not only inform a subsequent phase of the tool evaluation, but are eminently applicable to any process of implementation in a healthcare setting. ---------- Originality/value: The consultative process engaged clinicians and the paper delivers an insider view of an overburdened system, rather than an outsider's observations.
Abstract:
Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. Contrary to traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data with specific scientific merit is worth the planning efforts. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV, and to aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature utilizing ocean model predictions. This builds on previous work in this area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a fresh water plume by use of our algorithm. Additionally, we present experimental results from field trials that test the skill of the model used as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest, but measurable, improvement in surfacing error when the model predictions are incorporated into the planner.
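The abstract leaves the planner's internals unspecified; purely as an illustration of how predicted current velocities can enter a path cost, here is a minimal 2-D grid sketch in Python (the grid abstraction, the `current_field` input and the speed model are all assumptions, and the paper's actual problem is 3-D):

```python
import heapq
import math

def plan_path(grid_w, grid_h, start, goal, current_field, vehicle_speed=1.5):
    """Dijkstra over a 2-D grid where edge cost is travel time, adjusted by the
    predicted current component along the heading.
    current_field[(x, y)] -> (u, v): predicted current in m/s (hypothetical input)."""
    frontier = [(0.0, start)]
    best = {start: 0.0}
    parent = {}
    while frontier:
        t, node = heapq.heappop(frontier)
        if node == goal:
            break
        if t > best.get(node, math.inf):
            continue  # stale queue entry
        x, y = node
        for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1),
                       (1, 1), (1, -1), (-1, 1), (-1, -1)]:
            nx, ny = x + dx, y + dy
            if not (0 <= nx < grid_w and 0 <= ny < grid_h):
                continue
            dist = math.hypot(dx, dy)                 # cell units
            u, v = current_field.get((x, y), (0.0, 0.0))
            along = (u * dx + v * dy) / dist          # current along heading
            speed = vehicle_speed + along             # helps if positive, hinders if negative
            if speed <= 0.1:                          # cannot make headway against this current
                continue
            nt = t + dist / speed
            if nt < best.get((nx, ny), math.inf):
                best[(nx, ny)] = nt
                parent[(nx, ny)] = node
                heapq.heappush(frontier, (nt, (nx, ny)))
    path, node = [], goal
    while node in parent:                             # reconstruct goal -> start
        path.append(node)
        node = parent[node]
    return [start] + path[::-1]
```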
Abstract:
Over the last few decades, construction project performance has come under scrutiny due to the increase in delays, cost overruns and quality failures. Growing numbers of disputes, inharmonious working environments, conflict, blame cultures, and mismatches of objectives among project teams have been found to be contributory factors to poor project performance. Performance measurement (PM) approaches have been developed to overcome these issues; however, the comprehensiveness of PM as an overall approach is still criticised in terms of the iron triangle, namely time, cost, and quality. PM has primarily focused on objective measures; however, continuous improvement requires the inclusion of subjective measures, particularly contractor satisfaction (Co-S). Dealing with the two different groups of large and small-medium contractors is challenging because, to date, Co-S has not been extensively defined, particularly in developing countries such as Malaysia. Therefore, this research develops a Co-S model that aims to fulfil the current needs of the construction industry by integrating performance measures that address large and small-medium contractor perceptions. The research adhered to a positivist paradigm, reviewing the relevant literature and evaluating expert discussions on the research topic. This yielded the basis for developing the contractor satisfaction model (CoSMo), which consists of three elements: contractor satisfaction (Co-S) dimensions; contributory factors; and characteristics (project and participant). Valid questionnaire responses from 136 contractors in Malaysia led to the prediction of several key factors of contractor satisfaction and to an examination of the relationships between elements. The relationships were examined through a series of sequential statistical analyses, namely correlation, one-way analysis of variance (ANOVA), t-tests and multiple regression analysis (MRA). Forward and backward MRAs were used to develop Co-S mathematical models. Sixteen Co-S models were developed for both large and small-medium contractors. These determined that Malaysian large-contractor Co-S was most affected by the conciseness of the project scope and the quality of the project brief. In contrast, Co-S for small-medium contractors was strongly affected by the efficiency of risk control in a project. The results of the research provide empirical evidence in support of the notion that appropriate communication systems in projects negatively contribute to large-contractor Co-S with respect to cost and profitability. The uniqueness of several Co-S predictors was also identified through a series of analyses of small-medium contractors. These contractors appear to be less satisfied than large contractors when participants lack effectiveness in timely authoritative decision-making and in communication between project team members. Interestingly, the empirical results show that effective project health and safety measures are influencing factors in satisfying both large and small-medium contractors. The perspectives of large and small-medium contractors with respect to the performance of the entire project development were derived from the Co-S models. These were statistically validated and refined before a new Co-S model was developed. Developing such a model has the potential to increase project value and benefit all project participants. It is important to improve participant collaboration, as it leads to better project performance.
This study may encourage key project participants, such as clients, consultants, subcontractors and suppliers, to increase their attention to contractor needs in the development of a project. Recommendations for future research include investigating other participants' perspectives on CoSMo and the impact of implementing CoSMo in a project, since this study focused purely on the contractor perspective.
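The abstract names forward and backward multiple regression analysis as the model-building tool. As a generic illustration of forward selection (not the thesis' variables, data or cut-offs), a minimal sketch using statsmodels:

```python
import pandas as pd
import statsmodels.api as sm

def forward_select(df: pd.DataFrame, response: str, alpha_in: float = 0.05):
    """Greedy forward selection for multiple regression analysis: at each step,
    add the candidate predictor with the smallest p-value below alpha_in.
    Generic sketch only; the response and predictor columns are hypothetical."""
    remaining = [c for c in df.columns if c != response]
    selected = []
    while remaining:
        pvals = {}
        for cand in remaining:
            X = sm.add_constant(df[selected + [cand]])
            pvals[cand] = sm.OLS(df[response], X).fit().pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha_in:
            break                     # no remaining predictor is significant
        selected.append(best)
        remaining.remove(best)
    return sm.OLS(df[response], sm.add_constant(df[selected])).fit()

# Hypothetical usage, assuming a survey DataFrame with a satisfaction score:
# model = forward_select(survey_df, response="contractor_satisfaction")
# print(model.summary())
```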
Abstract:
In the Bayesian framework a standard approach to model criticism is to compare some function of the observed data to a reference predictive distribution. The result of the comparison can be summarized in the form of a p-value, and it is well known that computation of some kinds of Bayesian predictive p-values can be challenging. The use of regression adjustment approximate Bayesian computation (ABC) methods is explored for this task. Two problems are considered. The first is the calibration of posterior predictive p-values so that they are uniformly distributed under some reference distribution for the data. Computation is difficult because the calibration process requires repeated approximation of the posterior for different data sets under the reference distribution. The second problem considered is approximation of distributions of prior predictive p-values for the purpose of choosing weakly informative priors in the case where the model checking statistic is expensive to compute. Here the computation is difficult because of the need to repeatedly sample from a prior predictive distribution for different values of a prior hyperparameter. In both these problems we argue that high accuracy in the computations is not required, which makes fast approximations such as regression adjustment ABC very useful. We illustrate our methods with several examples.
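For orientation, the quantity being calibrated can be sketched in a toy setting. The following computes an ordinary posterior predictive p-value by Monte Carlo for a conjugate normal-mean model; it is not the paper's ABC machinery, only the baseline object that machinery accelerates:

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_predictive_pvalue(y, posterior_draws, simulate, statistic):
    """Estimate P(T(y_rep) >= T(y)), with y_rep drawn from the posterior predictive.
    posterior_draws: parameter draws; simulate(theta, n, rng): replicate data;
    statistic: maps a dataset to a scalar."""
    t_obs = statistic(y)
    t_rep = np.array([statistic(simulate(th, len(y), rng)) for th in posterior_draws])
    return np.mean(t_rep >= t_obs)

# Toy conjugate model: y_i ~ N(mu, 1), prior mu ~ N(0, 10^2).
y = rng.normal(0.5, 1.0, size=50)
post_var = 1.0 / (1.0 / 10**2 + len(y))      # posterior precision = prior + n/sigma^2
post_mean = post_var * y.sum()
draws = rng.normal(post_mean, np.sqrt(post_var), size=2000)
p = posterior_predictive_pvalue(y, draws, lambda mu, n, r: r.normal(mu, 1.0, n), np.var)
print(f"posterior predictive p-value: {p:.3f}")
```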
Abstract:
Overprocessing waste occurs in a business process when effort is spent in a way that adds value neither to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks", meaning activities that classify a case into "accepted" or "rejected", such that if the case is accepted it proceeds forward, while if rejected, it is cancelled and all work performed in the case is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent in other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose to order knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach where knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
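As a rough sketch of the runtime idea (not the paper's exact models or features, which are assumptions here), checks can be re-ordered per case by predicted rejection probability per unit of expected effort, so the check most likely to knock the case out cheaply runs first:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Sequence

@dataclass
class KnockoutCheck:
    name: str
    mean_effort: float                          # expected processing time/cost of the check
    reject_proba: Callable[[Dict], float]       # predictive model: case features -> P(reject)

def order_checks(checks: Sequence[KnockoutCheck], case: Dict) -> List[KnockoutCheck]:
    """Order checks by predicted knockout 'yield' per unit of effort, descending,
    so later (more expensive or less decisive) checks are often never executed."""
    return sorted(checks,
                  key=lambda c: c.reject_proba(case) / c.mean_effort,
                  reverse=True)

# Hypothetical usage with stub lambdas standing in for trained classifiers:
checks = [
    KnockoutCheck("credit_history", 5.0, lambda f: 0.30 if f["low_income"] else 0.05),
    KnockoutCheck("identity_check", 1.0, lambda f: 0.02),
]
for check in order_checks(checks, {"low_income": True}):
    print(check.name)
```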
Abstract:
This paper addresses the following predictive business process monitoring problem: Given the execution trace of an ongoing case, and given a set of traces of historical (completed) cases, predict the most likely outcome of the ongoing case. In this context, a trace refers to a sequence of events with corresponding payloads, where a payload consists of a set of attribute-value pairs. Meanwhile, an outcome refers to a label associated with completed cases, like, for example, a label indicating that a given case completed “on time” (with respect to a given desired duration) or “late”, or a label indicating that a given case led to a customer complaint or not. The paper tackles this problem via a two-phased approach. In the first phase, prefixes of historical cases are encoded using complex symbolic sequences and clustered. In the second phase, a classifier is built for each of the clusters. To predict the outcome of an ongoing case at runtime given its (uncompleted) trace, we select the closest cluster(s) to the trace in question and apply the respective classifier(s), taking into account the Euclidean distance of the trace from the center of the clusters. We consider two families of clustering algorithms – hierarchical clustering and k-medoids – and use random forests for classification. The approach was evaluated on four real-life datasets.
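A minimal sketch of the two-phase approach, under explicit assumptions: prefixes are already encoded as fixed-length numeric vectors, k-means stands in for the hierarchical/k-medoids clustering considered in the paper, and the data below are random placeholders:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.random((500, 20))       # encoded historical prefixes (placeholder data)
y_train = rng.integers(0, 2, 500)     # outcome labels, e.g. 0 = "on time", 1 = "late"

# Phase 1: cluster encoded prefixes. Phase 2: one classifier per cluster.
clusterer = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X_train)
classifiers = {
    k: RandomForestClassifier(random_state=0).fit(X_train[clusterer.labels_ == k],
                                                  y_train[clusterer.labels_ == k])
    for k in range(clusterer.n_clusters)
}

def predict_outcome(encoded_prefix: np.ndarray) -> np.ndarray:
    """Route a running case to the nearest cluster centre (Euclidean distance)
    and apply that cluster's random forest to its encoded prefix."""
    dists = np.linalg.norm(clusterer.cluster_centers_ - encoded_prefix, axis=1)
    return classifiers[int(np.argmin(dists))].predict_proba(encoded_prefix.reshape(1, -1))

print(predict_outcome(rng.random(20)))
```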
Abstract:
Understanding the effects of different types and quality of data on bioclimatic modeling predictions is vital to ascertaining the value of existing models, and to improving future models. Bioclimatic models were constructed using the CLIMEX program, using different data types – seasonal dynamics, geographic (overseas) distribution, and a combination of the two – for two biological control agents for the major weed Lantana camara L. in Australia. The models for one agent, Teleonemia scrupulosa Stål (Hemiptera: Tingidae) were based on a higher quality and quantity of data than the models for the other agent, Octotoma scabripennis Guérin-Méneville (Coleoptera: Chrysomelidae). Predictions of the geographic distribution for Australia showed that T. scrupulosa models exhibited greater accuracy with a progressive improvement from seasonal dynamics data, to the model based on overseas distribution, and finally the model combining the two data types. In contrast, O. scabripennis models were of low accuracy, and showed no clear trends across the various model types. These case studies demonstrate the importance of high quality data for developing models, and of supplementing distributional data with species seasonal dynamics data wherever possible. Seasonal dynamics data allows the modeller to focus on the species response to climatic trends, while distributional data enables easier fitting of stress parameters by restricting the species envelope to the described distribution. It is apparent that CLIMEX models based on low quality seasonal dynamics data, together with a small quantity of distributional data, are of minimal value in predicting the spatial extent of species distribution.
Abstract:
Purpose: The object of this paper is to examine whether the improvements in technology that enhance community understanding of the frequency and severity of natural hazards also increase the risk of potential liability of planning authorities in negligence. In Australia, the National Strategy imposes a resilience-based approach to disaster management and stresses that responsible land use planning can reduce or prevent the impact of natural hazards upon communities. ---------- Design/methodology/approach: This paper analyses how the principles of negligence allocate responsibility for loss suffered by a landowner in a hazard-prone area between the landowner and local government. ---------- Findings: The analysis concludes that, even where a causal link can be established between the loss suffered by a landowner and a local authority's approval to build in a hazard-prone area, a negligence action could be proven only in the rarest of circumstances. ---------- Research limitations/implications: The focus of this paper is on planning policies and land development, not on the negligent provision of advice or information by the local authority. ---------- Practical implications: This paper identifies the issues a landowner may face when seeking compensation from a local authority for loss suffered due to the occurrence of a natural hazard known or predicted to be possible in the area. ---------- Originality/value: The paper establishes that, as risk managers, local authorities must place reliance upon scientific modelling and predictive technology when determining planning processes in order to fulfil their responsibilities under the National Strategy and to limit any possible liability in negligence.
Abstract:
One of the objectives of general-purpose financial reporting is to provide information about the financial position, financial performance and cash flows of an entity that is useful to a wide range of users in making economic decisions. The current focus on potentially increased relevance of fair value accounting weighed against issues of reliability has failed to consider the potential impact on the predictive ability of accounting. Based on a sample of international (non-U.S.) banks from 24 countries during 2009-2012, we test the usefulness of fair values in improving the predictive ability of earnings. First, we find that the increasing use of fair values on balance-sheet financial instruments enhances the ability of current earnings to predict future earnings and cash flows. Second, we provide evidence that the fair value hierarchy classification choices affect the ability of earnings to predict future cash flows and future earnings. More precisely, we find that the non-discretionary fair value component (Level 1 assets) improves the predictability of current earnings whereas the discretionary fair value components (Level 2 and Level 3 assets) weaken the predictive power of earnings. Third, we find a consistent and strong association between factors reflecting country-wide institutional structures and predictive power of fair values based on discretionary measurement inputs (Level 2 and Level 3 assets and liabilities). Our study is timely and relevant. The findings have important implications for standard setters and contribute to the debate on the use of fair value accounting.
Abstract:
For a homing interceptor, suitable initial conditions must be achieved by the mid-course guidance scheme for maximum effectiveness. To achieve the desired end goal of any mid-course guidance scheme, a two-point boundary value problem must be solved online under all realistic constraints. A newly developed, computationally efficient technique named MPSP (Model Predictive Static Programming) is used in this paper to obtain a suboptimal solution to the optimal mid-course guidance problem. Time-to-go uncertainty is avoided in this formulation by making use of the desired position at which mid-course guidance terminates and terminal guidance takes over. A suitable approach angle towards the desired point can also be specified in this guidance law formulation, a feature that makes the law particularly attractive because the warhead effectiveness issue can be indirectly addressed in the mid-course phase.
Abstract:
Food preferences are acquired through experience and can exert strong influence on choice behavior. In order to choose which food to consume, it is necessary to maintain a predictive representation of the subjective value of the associated food stimulus. Here, we explore the neural mechanisms by which such predictive representations are learned through classical conditioning. Human subjects were scanned using fMRI while learning associations between arbitrary visual stimuli and subsequent delivery of one of five different food flavors. Using a temporal difference algorithm to model learning, we found predictive responses in the ventral midbrain and a part of ventral striatum (ventral putamen) that were related directly to subjects' actual behavioral preferences. These brain structures demonstrated divergent response profiles, with the ventral midbrain showing a linear response profile with preference, and the ventral striatum a bivalent response. These results provide insight into the neural mechanisms underlying human preference behavior.
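For readers unfamiliar with the model class, a minimal sketch of temporal-difference value learning for stimulus-outcome associations follows; the learning rate, reward magnitudes and trial structure are illustrative assumptions, not those of the study:

```python
import numpy as np

alpha = 0.2                                      # learning rate (assumed)
values = {"stim_A": 0.0, "stim_B": 0.0}          # learned predictive value per visual cue
flavor_value = {"stim_A": 1.0, "stim_B": 0.2}    # subjective value of the delivered flavor

rng = np.random.default_rng(0)
for _ in range(100):
    stim = rng.choice(["stim_A", "stim_B"])      # randomly interleaved trials
    # For a single cue -> outcome pairing, the TD error reduces to the delivered
    # value minus the current prediction; this prediction-error term is what the
    # study relates to ventral midbrain and ventral striatum responses.
    delta = flavor_value[stim] - values[stim]
    values[stim] += alpha * delta

print(values)  # predictions converge toward each cue's flavor value
```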
Abstract:
Toivonen, H., Srinivasan, A., King, R. D., Kramer, S. and Helma, C. (2003) Statistical Evaluation of the Predictive Toxicology Challenge 2000–2001. Bioinformatics 19: 1183–1193.
Abstract:
Background: There has been relatively little research into health inequalities in older populations. This may be partly explained by the difficulty in identifying appropriate indicators of socio-economic status for older people. Ideally, indicators of socio-economic status to be used in studies of health inequalities in older populations should incorporate some measure of life-time socio-economic standing, and house value may fill this role. This study examined whether an indicator of accumulated wealth based on a combination of housing tenure and house value was a strong predictor of ill-health in older populations.
Methods: A total of 191 848 people aged ≥65 years and not living in communal establishments were identified from the 2001 Northern Ireland Census and followed for 5 years. Self-reported health and mortality risk by housing tenure/house value groupings were examined while controlling for a range of other demographic and socio-economic characteristics.
Results: Housing tenure/house value was highly correlated with other indicators of socio-economic status. Public-sector renters had worse self-reported health and higher mortality rates than owner occupiers but significant gradients were also found between those living in the highest- and lowest-valued owner-occupier properties. The relationship between housing tenure and value was unchanged by adjustment for indicators of social support and quality of the physical environment. Adjustment for limiting long-term illness and self-reported health at baseline narrowed but did not eliminate the health gains associated with living in more expensive housing.
Conclusions: House value of residence is an accessible and powerful indicator of accumulated wealth that is highly correlated with current health status and predictive of future mortality risk in older populations.
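A sketch of the kind of survival model typically behind such an analysis, using the lifelines library on synthetic data (the variable names, effect sizes and data below are invented, not the census variables):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
band = rng.integers(0, 5, n)          # 0 = public renter, 1..4 = owner house-value quartiles
age = rng.integers(65, 90, n)

# Hypothetical hazard: rises with age, falls with house-value band.
rate = np.exp(-3.0 + 0.04 * (age - 65) - 0.25 * band)
time = rng.exponential(1.0 / rate)
died = (time <= 5.0).astype(int)      # 5-year follow-up, as in the study design

df = pd.DataFrame({"follow_up_years": np.minimum(time, 5.0), "died": died,
                   "house_value_band": band, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="died")
cph.print_summary()                   # hazard ratio per house-value band, adjusted for age
```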