834 results for predictive value


Relevance: 60.00%

Publisher:

Abstract:

Chronic kidney disease has increased worldwide and nationally, while the number of donors keeps falling and waiting lists keep growing. Deceased donors are an option for these patients and have been used in recent years to expand the pool of available organs; assessing their quality is important to optimise their use. This is a retrospective analytical cohort study in which the KDPI was calculated for deceased donors and renal function was followed through serum creatinine at 1, 3, 6 and 12 months. Graft survival, renal function, KDPI and EPTS were correlated, and survival analysis and logistic regression were performed with donor, recipient and surgical variables.

Relevance: 60.00%

Publisher:

Abstract:

Objective: To determine a predictive model for condom use and alcohol consumption as risk behaviours related to HIV/AIDS infection among female sex workers in Bogotá in 2015. Methods: Cross-sectional observational study of 255 female sex workers in Bogotá; the data analysed came from a study conducted in five Colombian cities in 2015. The hypotheses rested on associations between the sociodemographic conditions, knowledge, practices, habits, social support and occupational characteristics of female sex workers that could explain and predict the adoption of HIV/AIDS risk behaviours, namely condom use and alcohol consumption while working. Results: The mean age of entry into sex work was 22.1±7.1 years; three quarters of the women were single and lived in socioeconomic strata two and three; 96.5% reported using a condom with their last client and 27.8% consumed alcohol during their last service. For the condom-use risk behaviour, associated factors included age [OR=1.10 (1.03-1.17)], living in stratum two [OR=7.7 (1.5-39.5)], income from sex work [OR=1.0 (1.0-1.0)], condom availability for the service [OR=0.03 (0.008-0.16)] and having another contraceptive method (tubal ligation) [OR=4.47 (1.0-18.3)]. For the alcohol-consumption risk behaviour, associated factors included socioeconomic stratum two [OR=5.8 (1.54-22.3)], secondary education [OR=0.12 (0.16-0.96)], living with other relatives [OR=3.45 (1.7-7.02)], income from sex work [OR=1.0 (1.0-1.0)] and the place where the service is offered [OR=0.07 (0.04-0.15)].
After adjustment, the variables that best explained condom use were age [OR=1.1 (1.02-1.17)] and condom availability [OR=0.04 (0.008-0.024)]; the model had low sensitivity (33.3%) but good predictive capacity (84.6%). The variables that best explained alcohol consumption during the service were age [OR=0.95 (0.91-0.98)], number of clients per week [OR=0.9 (0.90-0.98)], place where the service is offered [OR=7.1 (3.45-14.8)] and socioeconomic stratum [OR=1.8 (0.90-3.83)], yielding a model with good sensitivity (71.8%) and good predictive capacity (86.4%). Conclusions: Age, socioeconomic stratum, education, marital status, income from sex work, age of entry into sex work, number of repeat clients in the last week, condom availability for the service, and tubal ligation as an alternative contraceptive method were statistically associated with condom use. After adjustment, however, only age and condom availability remained as explanatory variables. Although the model showed good predictive capacity (84.6%), the precision of its estimates was low owing to the low frequency of non-use of a condom with the last client (3.5%), and the model's sensitivity was only 33.3%. Likewise, age, socioeconomic stratum, education level, income, place where the service is offered, family composition, number of children, number of clients seen in the last week and number of repeat clients showed statistical association with alcohol consumption.
After adjustment, however, only age, socioeconomic stratum, place where the service is offered and number of clients per week retained statistical association. Socioeconomic stratum (one and two) and offering the service in an establishment were risk factors for alcohol consumption while working, whereas younger age and a small number of clients per week acted as protective factors. The predictive model developed for the alcohol-consumption risk behaviour had a sensitivity of 71.8% and a predictive power of 86.4%.
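The adjusted odds ratios reported above come from logistic-regression coefficients. As a minimal sketch with hypothetical numbers (not the study's data), an OR and its 95% confidence interval are recovered from a coefficient β and its standard error as exp(β) and exp(β ± 1.96·SE):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard
    error into an odds ratio with a 95% confidence interval."""
    or_ = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return or_, lo, hi

# Hypothetical coefficient for an age-like covariate (illustration only)
or_, lo, hi = odds_ratio_ci(0.095, 0.035)
print(f"OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1 corresponds to a statistically significant association at the 5% level.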

Relevance: 60.00%

Publisher:

Abstract:

Introduction: Predictive scoring systems have been developed to measure disease severity and patient prognosis in the intensive care unit. These measures support clinical decision-making, the standardisation of research, and comparisons of the quality of critical-care delivery. Materials and methods: Observational analytical cohort study reviewing the medical records of 283 oncology patients admitted to the intensive care unit (ICU) between January 2014 and January 2016, whose probability of mortality was estimated with the APACHE IV and MPM II prognostic scores. Logistic regression was performed with the predictor variables used to derive each model in its original study; calibration and discrimination were assessed, and the Akaike (AIC) and Bayesian (BIC) information criteria were calculated. Results: In the performance evaluation, APACHE IV showed greater predictive capacity (AUC = 0.95) than MPM II (AUC = 0.78); both models showed adequate calibration by the Hosmer-Lemeshow statistic (APACHE IV p = 0.39; MPM II p = 0.99). The ΔBIC of 2.9 provides positive evidence against APACHE IV, whereas the AIC was lower for APACHE IV, indicating the model with the best fit to the data. Conclusions: APACHE IV performs well in predicting the mortality of critically ill patients, including oncology patients. It is therefore a useful tool for clinicians in daily practice, allowing them to identify patients with a high probability of mortality.
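The AIC and BIC comparison described above follows the standard definitions. A minimal sketch with hypothetical log-likelihoods and parameter counts (not the study's values), where lower AIC or BIC indicates the better-fitting model:

```python
import math

def aic(log_lik, k):
    # Akaike information criterion: 2k - 2*ln(L)
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    # Bayesian information criterion: k*ln(n) - 2*ln(L)
    return k * math.log(n) - 2 * log_lik

# Hypothetical fits for two competing models on n = 283 cases
n = 283
aic_a, bic_a = aic(-95.0, 17), bic(-95.0, 17, n)   # richer model
aic_b, bic_b = aic(-110.0, 3), bic(-110.0, 3, n)   # sparser model
print(aic_a < aic_b)  # AIC can favour the richer model...
print(bic_a < bic_b)  # ...while BIC's stronger complexity penalty may not
```

Because BIC penalises parameters by ln(n) rather than 2, the two criteria can disagree, which is exactly the tension reported in the abstract.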

Relevance: 60.00%

Publisher:

Abstract:

INTRODUCTION: Macular oedema (MO) is the main cause of visual acuity loss in patients with retinal vein occlusion (RVO); after treatment, some patients remain with poor visual acuity. OBJECTIVE: To conduct a systematic literature review (SLR) to identify the existing evidence on tomographic factors that predict the visual outcome in patients with MO secondary to RVO. INFORMATION SOURCES: PUBMED, MEDLINE, EMBASE, LILACS, COCHRANE, grey literature. STUDY SELECTION: Controlled clinical trials (CCTs) and analytical observational studies. DATA EXTRACTION AND SYNTHESIS: Two investigators selected the articles independently. A qualitative synthesis of the information was performed following the recommendations of the PRISMA 2009 statement. MAIN MEASURES AND OUTCOME: Central retinal thickness (CRT), ellipsoid band (EB) integrity and external limiting membrane (ELM) integrity, determined by SD-OCT. The main outcome is best-corrected visual acuity (BCVA) at 6, 12, 18 and/or 24 months. RESULTS: 872 abstracts were identified and 8 articles were included in the qualitative analysis. Six studies evaluated CRT without finding an association with the final visual outcome. Only two studies evaluated and found a statistically significant association between ELM integrity and the visual outcome: Kang, H 2012 (r² 0.51, p 0.000) and Rodriguez, F 2014 (p < 0.001). EB integrity was associated with visual prognosis in 4 of the 5 studies that evaluated this variable, with statistically significant results. Baseline BCVA was also associated with the visual outcome in 4 of the 5 studies that evaluated it. According to Kang, H 2012, the best model predicting the functional outcome combined ELM integrity, EB integrity and baseline BCVA (R² 0.671, p 0.000) at 12 months of follow-up.
CONCLUSION: Current evidence suggests that EB and ELM integrity are predictors of the functional outcome in patients with MO secondary to RVO after 6 or more months of follow-up. Controlled studies are needed to reach more conclusive results.

Relevance: 60.00%

Publisher:

Abstract:

Background: Gastric cancer is diagnosed late. Only countries such as Korea and Japan have screening policies, which would be justified in any country with a high prevalence of gastric cancer, such as Colombia or Chile. Serum pepsinogen analysis has been proposed for diagnosing premalignant and malignant gastric lesions; this study therefore systematically reviews the literature on the diagnostic value of the pepsinogen I/II ratio as a marker of premalignant and malignant gastric lesions. Methods: The literature up to September 2016 was reviewed in PubMed, OVID, EMBASE, EBSCO, LILACS, OPENGRAY and Dialnet using the keywords malignant gastric lesions, premalignant gastric lesions and pepsinogen, selecting diagnostic-test articles that evaluated the pepsinogen I/II ratio against histological findings. Results: Twenty-one articles with a total of 20,601 patients were included, showing a sensitivity of 13.7%-91.2%, a specificity of 38.5%-100%, a positive predictive value of 6.3%-100% and a negative predictive value of 33.3%-98.8% for the pepsinogen I/II ratio in the diagnosis of premalignant and malignant gastric lesions. Conclusions: Decreased pepsinogen I/II ratio values are related to the presence of premalignant and malignant gastric lesions. Since the ratio has better specificity than sensitivity, as a screening test it would be useful for selecting the patients who would benefit from upper digestive tract endoscopy (EVDA). Further diagnostic-test studies are needed to validate a specific cut-off point that could be used as a standard value.
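The sensitivity, specificity, PPV and NPV ranges reported above all derive from 2x2 diagnostic tables comparing the test against the histological reference. A minimal sketch with hypothetical counts (not taken from the review):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-test metrics from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for illustration only
m = diagnostic_metrics(tp=80, fp=40, fn=20, tn=160)
print(m["sensitivity"])  # 0.8
```

Note that, unlike sensitivity and specificity, PPV and NPV shift with disease prevalence, which is one reason the review's predictive-value ranges are so wide.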

Relevance: 30.00%

Publisher:

Abstract:

Mindfulness is a concept which has been widely used in studies on consciousness, but has recently been applied to the understanding of behaviours in other areas, including clinical psychology, meditation, physical activity, education and business. It has been suggested that mindfulness can also be applied to road safety, though this has not yet been researched. A standard definition of mindfulness is “paying attention in a particular way, on purpose in the present moment and non-judgemental to the unfolding of experience moment by moment” [1]. Scales have been developed to measure mindfulness; however, there are different views in the literature on the nature of the mindfulness construct. This paper reviews the issues raised in the literature and arrives at an operational definition of mindfulness considered relevant to road safety. It is further proposed that mindfulness is best construed as operating together with other psychosocial factors to influence road safety behaviours. The specific case of speeding behaviour is outlined, where the psychosocial variables in the Theory of Planned Behaviour (TPB) have been demonstrated to predict both intention to speed and actual speeding behaviour. A role is proposed for mindfulness in enhancing the explanatory and predictive powers of the TPB concerning speeding. The implications of mindfulness for speeding countermeasures are discussed and a program of future research is outlined.

Relevance: 30.00%

Publisher:

Abstract:

Purpose: This paper aims to show that identification of expectations and software functional requirements via consultation with potential users is an integral component of the development of an emergency department patient admissions prediction tool. ---------- Design/methodology/approach: Thematic analysis of semi-structured interviews with 14 key health staff delivered rich data regarding existing practice and future needs. Participants included emergency department staff, bed managers, nurse unit managers, directors of nursing, and personnel from health administration. ---------- Findings: Participants contributed contextual insights on the current system of admissions, revealing a culture of crisis, imbued with misplaced communication. Their expectations and requirements of a potential predictive tool provided strategic data that moderated the development of the Emergency Department Patient Admissions Prediction Tool, based on their insistence that it feature availability, reliability and relevance. In order to deliver these stipulations, participants stressed that it should be incorporated, validated, defined and timely. ---------- Research limitations/implications: Participants were envisaging a concept and use of a tool that was somewhat hypothetical. However, further research will evaluate the tool in practice. ---------- Practical implications: Participants' unsolicited recommendations regarding implementation will not only inform a subsequent phase of the tool evaluation, but are eminently applicable to any process of implementation in a healthcare setting. ---------- Originality/value: The consultative process engaged clinicians and the paper delivers an insider view of an overburdened system, rather than an outsider's observations.

Relevance: 30.00%

Publisher:

Abstract:

Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. Contrary to traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data with specific scientific merit is worth the planning efforts. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV, and aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature utilizing ocean model predictions. This builds on previous work in this area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a fresh water plume by use of our algorithm. Additionally, we present experimental results from field trials that test the skill of the model used as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest, but measurable, improvement in surfacing error when the model predictions are incorporated into the planner.

Relevance: 30.00%

Publisher:

Abstract:

Over the last few decades, construction project performance has been evaluated due to the increase of delays, cost overruns and quality failures. Growing numbers of disputes, inharmonious working environments, conflict, blame cultures, and mismatches of objectives among project teams have been found to be contributory factors to poor project performance. Performance measurement (PM) approaches have been developed to overcome these issues, however, the comprehensiveness of PM as an overall approach is still criticised in terms of the iron triangle; namely time, cost, and quality. PM has primarily focused on objective measures, however, continuous improvement requires the inclusion of subjective measures, particularly contractor satisfaction (Co-S). It is challenging to deal with the two different groups of large and small-medium contractor satisfaction as to date, Co-S has not been extensively defined, primarily in developing countries such as Malaysia. Therefore, a Co-S model is developed in this research which aims to fulfil the current needs in the construction industry by integrating performance measures to address large and small-medium contractor perceptions. The positivist paradigm used in the research was adhered to by reviewing relevant literature and evaluating expert discussions on the research topic. It yielded a basis for the contractor satisfaction model (CoSMo) development which consists of three elements: contractor satisfaction (Co-S) dimensions; contributory factors and characteristics (project and participant). Using valid questionnaire results from 136 contractors in Malaysia lead to the prediction of several key factors of contractor satisfaction and to an examination of the relationships between elements. The relationships were examined through a series of sequential statistical analyses, namely correlation, one-way analysis of variance (ANOVA), t-tests and multiple regression analysis (MRA). 
Forward and backward MRAs were used to develop Co-S mathematical models. Sixteen Co-S models were developed for both large and small-medium contractors. These determined that the large contractor Malaysian Co-S was most affected by the conciseness of project scope and quality of the project brief. Contrastingly, Co-S for small-medium contractors was strongly affected by the efficiency of risk control in a project. The results of the research provide empirical evidence in support of the notion that appropriate communication systems in projects negatively contributes to large Co-S with respect to cost and profitability. The uniqueness of several Co-S predictors was also identified through a series of analyses on small-medium contractors. These contractors appear to be less satisfied than large contractors when participants lack effectiveness in timely authoritative decision-making and communication between project team members. Interestingly, the empirical results show that effective project health and safety measures are influencing factors in satisfying both large and small-medium contractors. The perspectives of large and small-medium contractors in respect to the performance of the entire project development were derived from the Co-S models. These were statistically validated and refined before a new Co-S model was developed. Developing such a unique model has the potential to increase project value and benefit all project participants. It is important to improve participant collaboration as it leads to better project performance. This study may encourage key project participants; such as client, consultant, subcontractor and supplier; to increase their attention to contractor needs in the development of a project. Recommendations for future research include investigating other participants' perspectives on CoSMo and the impact of the implementation of CoSMo in a project, since this study is focused purely on the contractor perspective.

Relevance: 30.00%

Publisher:

Abstract:

In the Bayesian framework a standard approach to model criticism is to compare some function of the observed data to a reference predictive distribution. The result of the comparison can be summarized in the form of a p-value, and it is well known that computation of some kinds of Bayesian predictive p-values can be challenging. The use of regression adjustment approximate Bayesian computation (ABC) methods is explored for this task. Two problems are considered. The first is the calibration of posterior predictive p-values so that they are uniformly distributed under some reference distribution for the data. Computation is difficult because the calibration process requires repeated approximation of the posterior for different data sets under the reference distribution. The second problem considered is approximation of distributions of prior predictive p-values for the purpose of choosing weakly informative priors in the case where the model checking statistic is expensive to compute. Here the computation is difficult because of the need to repeatedly sample from a prior predictive distribution for different values of a prior hyperparameter. In both these problems we argue that high accuracy in the computations is not required, which makes fast approximations such as regression adjustment ABC very useful. We illustrate our methods with several examples.
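As a point of reference for the first problem, the plain Monte Carlo posterior predictive p-value (the quantity whose calibration the paper studies) can be computed directly for a toy conjugate-normal model. All data and hyperparameters below are hypothetical, and the paper's regression-adjustment ABC machinery is not shown:

```python
import random
import statistics

random.seed(7)

# Hypothetical observed data; model: y_i ~ N(mu, 1), prior mu ~ N(0, 10)
y = [1.2, 0.4, 2.1, 1.8, 0.9, 3.0]
n, sigma2, tau2 = len(y), 1.0, 10.0

# Conjugate normal posterior for mu given y: N(m_n, s_n2)
ybar = statistics.fmean(y)
s_n2 = 1.0 / (n / sigma2 + 1.0 / tau2)
m_n = s_n2 * (n * ybar / sigma2)

def T(data):
    # Discrepancy statistic used for model checking: the sample maximum
    return max(data)

# Monte Carlo posterior predictive p-value: P(T(y_rep) >= T(y) | y)
reps = 5000
exceed = 0
for _ in range(reps):
    mu = random.gauss(m_n, s_n2 ** 0.5)               # posterior draw
    y_rep = [random.gauss(mu, sigma2 ** 0.5) for _ in y]
    exceed += T(y_rep) >= T(y)
p_value = exceed / reps
print(p_value)
```

The calibration problem arises because p-values computed this way need not be uniformly distributed under the reference distribution, so each candidate data set requires a fresh (and costly) posterior approximation.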

Relevance: 30.00%

Publisher:

Abstract:

Overprocessing waste occurs in a business process when effort is spent in a way that does not add value to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks", meaning activities that classify a case into "accepted" or "rejected", such that if the case is accepted it proceeds forward, while if rejected, it is cancelled and all work performed in the case is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent in other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose to order knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach where knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
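The traditional redesign baseline mentioned above is commonly operationalised as a static ordering of checks by rejection rate per unit of effort. A minimal sketch with hypothetical check names and figures (the paper's contribution replaces these fixed rates with per-case predictions from machine-learning models at runtime):

```python
# Hypothetical knockout checks: (name, mean effort, rejection rate)
checks = [
    ("credit_check", 5.0, 0.10),
    ("identity_check", 1.0, 0.30),
    ("fraud_check", 8.0, 0.40),
]

# Static redesign heuristic: execute checks in decreasing
# rejection-rate-per-unit-effort order, so that cheap, frequently
# rejecting checks run first and later effort is less often wasted.
ordered = sorted(checks, key=lambda c: c[2] / c[1], reverse=True)
print([name for name, _, _ in ordered])
# ['identity_check', 'fraud_check', 'credit_check']
```

The static ordering is fixed for all cases; a runtime predictive ordering can instead re-rank the remaining checks for each individual case.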

Relevance: 30.00%

Publisher:

Abstract:

This paper addresses the following predictive business process monitoring problem: Given the execution trace of an ongoing case, and given a set of traces of historical (completed) cases, predict the most likely outcome of the ongoing case. In this context, a trace refers to a sequence of events with corresponding payloads, where a payload consists of a set of attribute-value pairs. Meanwhile, an outcome refers to a label associated to completed cases, for example a label indicating that a given case completed "on time" (with respect to a given desired duration) or "late", or a label indicating that a given case led to a customer complaint or not. The paper tackles this problem via a two-phased approach. In the first phase, prefixes of historical cases are encoded using complex symbolic sequences and clustered. In the second phase, a classifier is built for each of the clusters. To predict the outcome of an ongoing case at runtime given its (uncompleted) trace, we select the closest cluster(s) to the trace in question and apply the respective classifier(s), taking into account the Euclidean distance of the trace from the center of the clusters. We consider two families of clustering algorithms - hierarchical clustering and k-medoids - and use random forests for classification. The approach was evaluated on four real-life datasets.
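The runtime step described above (select the closest cluster, then apply its classifier) can be sketched as follows, with hypothetical 2-D prefix encodings and constant-label stand-ins for the per-cluster random forests:

```python
import math

# Hypothetical cluster centres in a 2-D encoded-prefix space, each
# paired with a stand-in "classifier" (a constant label here; in the
# paper, a random forest trained on that cluster's prefixes).
clusters = [
    ((0.0, 0.0), lambda vec: "on_time"),
    ((5.0, 5.0), lambda vec: "late"),
]

def predict(trace_vec):
    # Select the cluster whose centre is closest in Euclidean distance
    # to the encoded ongoing trace, then apply that cluster's classifier.
    centre, classifier = min(clusters, key=lambda c: math.dist(c[0], trace_vec))
    return classifier(trace_vec)

print(predict((4.0, 4.5)))  # late
```

In the full approach, nearby clusters can be weighted by distance and several classifiers combined rather than taking a single nearest cluster.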

Relevance: 30.00%

Publisher:

Abstract:

Understanding the effects of different types and quality of data on bioclimatic modeling predictions is vital to ascertaining the value of existing models, and to improving future models. Bioclimatic models were constructed using the CLIMEX program, using different data types – seasonal dynamics, geographic (overseas) distribution, and a combination of the two – for two biological control agents for the major weed Lantana camara L. in Australia. The models for one agent, Teleonemia scrupulosa Stål (Hemiptera:Tingidae) were based on a higher quality and quantity of data than the models for the other agent, Octotoma scabripennis Guérin-Méneville (Coleoptera: Chrysomelidae). Predictions of the geographic distribution for Australia showed that T. scrupulosa models exhibited greater accuracy with a progressive improvement from seasonal dynamics data, to the model based on overseas distribution, and finally the model combining the two data types. In contrast, O. scabripennis models were of low accuracy, and showed no clear trends across the various model types. These case studies demonstrate the importance of high quality data for developing models, and of supplementing distributional data with species seasonal dynamics data wherever possible. Seasonal dynamics data allows the modeller to focus on the species response to climatic trends, while distributional data enables easier fitting of stress parameters by restricting the species envelope to the described distribution. It is apparent that CLIMEX models based on low quality seasonal dynamics data, together with a small quantity of distributional data, are of minimal value in predicting the spatial extent of species distribution.

Relevance: 30.00%

Publisher:

Abstract:

Purpose The objective of this paper is to examine whether the improvements in technology that enhance community understanding of the frequency and severity of natural hazards have also increased the risk of potential liability of planning authorities in negligence. In Australia, the National Strategy imposes a resilience based approach to disaster management and stresses that responsible land use planning can reduce or prevent the impact of natural hazards upon communities. Design/methodology/approach This paper analyses how the principles of negligence allocate responsibility for loss suffered by a landowner in a hazard prone area between the landowner and local government. Findings The analysis in this paper concludes that despite being able to establish a causal link between the loss suffered by a landowner and the approval of a local authority to build in a hazard prone area, only in the rarest of circumstances could a negligence action be proven. Research limitations/implications The focus of this paper is on planning policies and land development, not on the negligent provision of advice or information by the local authority. Practical implications This paper identifies the issues a landowner may face when seeking compensation from a local authority for loss suffered due to the occurrence of a natural hazard known or predicted to be possible in the area. Originality/value The paper establishes that as risk managers, local authorities must place reliance upon scientific modelling and predictive technology when determining planning processes in order to fulfil their responsibilities under the National Strategy and to limit any possible liability in negligence.

Relevance: 30.00%

Publisher:

Abstract:

One of the objectives of general-purpose financial reporting is to provide information about the financial position, financial performance and cash flows of an entity that is useful to a wide range of users in making economic decisions. The current focus on potentially increased relevance of fair value accounting weighed against issues of reliability has failed to consider the potential impact on the predictive ability of accounting. Based on a sample of international (non-U.S.) banks from 24 countries during 2009-2012, we test the usefulness of fair values in improving the predictive ability of earnings. First, we find that the increasing use of fair values on balance-sheet financial instruments enhances the ability of current earnings to predict future earnings and cash flows. Second, we provide evidence that the fair value hierarchy classification choices affect the ability of earnings to predict future cash flows and future earnings. More precisely, we find that the non-discretionary fair value component (Level 1 assets) improves the predictability of current earnings whereas the discretionary fair value components (Level 2 and Level 3 assets) weaken the predictive power of earnings. Third, we find a consistent and strong association between factors reflecting country-wide institutional structures and predictive power of fair values based on discretionary measurement inputs (Level 2 and Level 3 assets and liabilities). Our study is timely and relevant. The findings have important implications for standard setters and contribute to the debate on the use of fair value accounting.