973 results for Data uncertainty
Abstract:
Master's dissertation in Statistics
Abstract:
Article first published online: 13 NOV 2013
Abstract:
Football is nowadays considered one of the most popular sports. It has also acquired an outstanding position in the betting world, which moves millions of euros during the period of a single football match. The lack of profitability of football betting users has been highlighted as a problem, and it gave rise to this research proposal, which analyses whether there is a way to help users increase the profit on their bets. Data mining models were induced with the purpose of supporting gamblers in increasing their profits in the medium/long term. Bearing in mind that the models can fail, the results achieved by four of the seven targets in the models are encouraging and suggest that the system can help to increase profits. All defined targets have two possible classes to predict, for example, whether there are more or fewer than 7.5 corners in a single game. The data mining models for the targets more or fewer than 7.5 corners, 8.5 corners, 1.5 goals and 3.5 goals achieved the pre-defined thresholds. The models were implemented in a prototype, a pervasive decision support system developed to serve as an interface for any user, from experts to users with no knowledge of football games.
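To make the two-class formulation concrete, the sketch below trains an over/under classifier on simulated match statistics; the features, data, and model choice are illustrative assumptions, not the dissertation's actual setup.

    # Minimal sketch of a two-class over/under target (simulated features and
    # data; not the dissertation's real model or variables).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))                 # hypothetical per-match statistics
    corners = 9 + 2 * X[:, 0] + rng.normal(size=500)
    y = (corners > 7.5).astype(int)               # class 1: more than 7.5 corners

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", model.score(X_te, y_te))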
Abstract:
Current data mining engines are difficult to use, requiring optimization by data mining experts in order to provide optimal results. To solve this problem a new concept was devised: maintaining the functionality of current data mining tools while adding pervasive characteristics such as invisibility and ubiquity, which focus on the users and improve ease of use and usefulness by providing autonomous and intelligent data mining processes. This article introduces an architecture for implementing such a data mining engine, composed of four major components: database; middleware (control); middleware (processing); and interface. These components are interlinked but scale independently, allowing the system to adapt to the user's needs. A prototype was developed to test the architecture. The results are very promising, demonstrating its functionality and indicating the need for further improvements.
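As one way the four components might be kept interlinked yet independent, a minimal sketch follows; the class and method names are assumptions for illustration, not the article's actual API.

    # Illustrative decoupling of the four components behind narrow interfaces.
    # All names are assumptions, not the article's actual design.
    class Database:
        """Storage component; fetch is a stub for real queries."""
        def fetch(self, query: str) -> list:
            return []

    class ControlMiddleware:
        """Plans the mining task autonomously, hiding tuning from the user."""
        def plan(self, goal: str) -> dict:
            return {"task": "classification", "goal": goal}

    class ProcessingMiddleware:
        """Executes the planned task; could be scaled out independently."""
        def run(self, plan: dict, data: list) -> dict:
            return {"model": "trained", **plan}

    class Interface:
        """User-facing layer; the user states a goal, not parameters."""
        def __init__(self, db, ctl, proc):
            self.db, self.ctl, self.proc = db, ctl, proc
        def mine(self, goal: str) -> dict:
            plan = self.ctl.plan(goal)
            return self.proc.run(plan, self.db.fetch(goal))

    print(Interface(Database(), ControlMiddleware(), ProcessingMiddleware()).mine("demo"))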
Abstract:
Real-time data acquisition is fundamental to providing appropriate services and improving health professionals' decisions. This paper presents a pervasive, adaptive data acquisition architecture for medical devices (e.g. vital signs monitors, ventilators and sensors). The architecture was deployed in a real context, an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network; the ventilation acquisition process is shown as an example. The clients are installed on a machine near the patient's bed and connected to the ventilators, and the monitored data are sent to a multithreaded server that records them in the database using Health Level Seven (HL7) protocols. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant mechanism that ensures data are stored in the database even if the agents are disconnected. It is pervasive, universal and interoperable, and able to adapt to any service using streaming data.
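A minimal sketch of the buffering idea behind such fault tolerance, assuming a local queue that retries database writes; HL7 handling and the agent logic are omitted and all names are illustrative.

    # Sketch of fault-tolerant buffering: readings are queued locally and
    # flushed to the database when it becomes reachable again.
    import queue, threading, time

    buffer: "queue.Queue[dict]" = queue.Queue()

    def store_in_db(reading: dict) -> bool:
        """Stand-in for the real database insert; returns False on failure."""
        return True

    def flush_loop() -> None:
        while True:
            reading = buffer.get()
            while not store_in_db(reading):   # retry until the database accepts it
                time.sleep(5)

    threading.Thread(target=flush_loop, daemon=True).start()
    buffer.put({"bed": 3, "signal": "SpO2", "value": 97, "ts": time.time()})
    time.sleep(0.1)                           # give the flusher a moment in this demo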
Abstract:
Healthcare organizations often benefit from information technologies and embedded decision support systems, which improve the quality of services and help prevent complications and adverse events. In Centro Materno Infantil do Norte (CMIN), the maternal and perinatal care unit of Centro Hospitalar of Oporto (CHP), an intelligent pre-triage system is implemented to prioritize patients in need of gynaecology and obstetrics care into two classes: urgent and consultation. The system is designed to avoid emergency-room problems such as incorrect triage outcomes and long triage waiting times. The current study intends to improve the triage system, and therefore optimize the patient workflow through the emergency room, by predicting the waiting time between a patient's triage and their medical admission. For this purpose, data mining (DM) models were induced on selected information provided by the information technologies implemented in CMIN. The DM models achieved accuracy values of approximately 94% with a five-range target distribution, which not only yields confident prediction models but also identifies the variables that directly drive triage waiting times.
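As a sketch of the five-range formulation, the snippet below bins simulated waiting times into five classes and cross-validates a classifier; the bin edges, features, and data are assumptions, not CMIN's.

    # Sketch: bin waiting times (minutes) into five ranges, then classify.
    # Bin edges and features are illustrative assumptions only.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(600, 5))                 # e.g. arrival hour, staff on duty
    wait = np.abs(20 + 15 * X[:, 0] + rng.normal(scale=5, size=600))
    y = np.digitize(wait, bins=[10, 20, 40, 60])  # five classes: 0..4

    scores = cross_val_score(DecisionTreeClassifier(max_depth=4), X, y, cv=5)
    print("mean accuracy:", scores.mean())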
Abstract:
An unsuitable patient flow and prolonged waiting lists in the emergency room of a maternity unit, regarding gynecology and obstetrics care, can affect the health of mother and child, leading to adverse events and consequences for their safety and satisfaction. Predicting patients' waiting time in the emergency room is a means to avoid this problem. This study aims to predict the pre-triage waiting time in the gynecology and obstetrics emergency care of Centro Materno Infantil do Norte (CMIN), the maternal and perinatal care unit of Centro Hospitalar of Oporto, situated in the north of Portugal. Data mining models were induced using information collected from the information systems and technologies available in CMIN. The models developed presented good results, reaching accuracy and specificity values of approximately 74% and 94%, respectively. Additionally, the number of patients and of triage professionals working in the emergency room, as well as some temporal variables, were identified as direct drivers of the pre-triage waiting time. Implementing the attained knowledge in the decision support system and business intelligence platform deployed in CMIN optimizes the patient flow through the emergency room and improves the quality of services.
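For reference, the two reported metrics can be computed from a binary confusion matrix as sketched below; the counts are made up.

    # Sketch: accuracy and specificity from a 2x2 confusion matrix (made-up labels).
    from sklearn.metrics import confusion_matrix

    y_true = [0, 0, 0, 0, 1, 1, 1, 0, 1, 0]
    y_pred = [0, 0, 1, 0, 1, 0, 1, 0, 1, 0]
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    print("accuracy:   ", (tp + tn) / (tp + tn + fp + fn))
    print("specificity:", tn / (tn + fp))   # true-negative rate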
Abstract:
Blood pressure is an important vital sign for physicians to make decisions and better understand the patient's condition. In Intensive Care Units it is possible to monitor blood pressure continuously, since the patient is under permanent monitoring through bedside monitors and sensors. However, intensivists only have access to the vital-sign values when they look at the monitor or consult the hourly collected values. Most important is the sequence of the collected values: a run of very high or very low values can signify a critical event, such as hypotension or hypertension, and bring future complications to the patient. Such complications can trigger a set of dangerous diseases and side effects. The main goal of this work is to predict the probability of a patient having a blood pressure critical event in the coming hours by combining a set of patient data collected in real time with data mining classification techniques. As output, the models indicate the probability (%) of a patient having a blood pressure critical event in the next hour. The results achieved are very promising, with sensitivity around 95%.
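The probabilistic output described corresponds to a classifier's class-probability estimate; a minimal sketch with simulated features follows (none of it is the study's actual data or model).

    # Sketch: probability of a critical event in the next hour via predict_proba.
    # Features and data are illustrative assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    X = rng.normal(size=(400, 3))        # e.g. recent systolic mean, trend, variance
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=400) > 1).astype(int)

    clf = LogisticRegression().fit(X, y)
    p = clf.predict_proba(X[:1])[0, 1]   # P(critical event in the next hour)
    print(f"critical-event probability: {p:.1%}")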
Abstract:
This project proposes to extend and generalize the estimation and inference procedures of multivariate generalized additive models for non-Gaussian random variables, which describe the behaviour of biological and social phenomena whose representations give rise to longitudinal series and clustered data. Its immediate applied aim is to develop modelling methodology for understanding biological, environmental and social processes in the Health and Social Sciences, conditioned by the presence of specific phenomena such as disease. The proposed plan thus seeks to tighten the relationship between Applied Mathematics, from an approach under uncertainty, and the Biological and Social Sciences in general, generating new tools to analyse and explain many problems for which ever more experimental and/or observational information is available. Proceeding sequentially, starting from discrete random variables (Yi, with variance function smaller than an even power of the expected value E(Y)), we propose to generate a unified class of generalized additive models (parametric and non-parametric) containing as particular cases generalized linear models, generalized non-linear models, generalized additive models and generalized marginal mean models (the GEE1 approach, Liang and Zeger, 1986, and GEE2, Zhao and Prentice, 1990; Zeger and Qaqish, 1992; Yan and Fine, 2004), initiating a connection with generalized linear mixed models for latent variables (GLLAMM, Skrondal and Rabe-Hesketh, 2004) and starting from correlated data structures. This will make it possible to define conditional distributions of the responses, given the covariates and the latent variables (LVs), and to estimate structural equations for the LVs, including regressions of LVs on covariates, regressions of LVs on other LVs, and specific models for already recognized hierarchies of variation. How to define models that accommodate spatial or temporal structures while allowing hierarchical factors, fixed or random, measured with error, as in the situations arising in the Social Sciences and in Epidemiology, is a statistical challenge. This sequential approach is planned for the construction of both estimation and inference methodology, beginning with Poisson and Bernoulli random variables, covering the existing GLMs up to the current hierarchical generalized models, and connecting with the GLLAMMs, starting from correlated data structures. This family of models will be generated for structures of variables/vectors, covariates and hierarchical random components that describe phenomena in the Social Sciences and Epidemiology.
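As a small illustration of the GEE1 marginal-model starting point cited above (Liang and Zeger, 1986), the sketch below fits a Poisson GEE with exchangeable within-cluster correlation to simulated clustered counts; the data and settings are assumptions, not the project's.

    # Sketch: marginal (GEE1-style) Poisson model for clustered counts,
    # in the spirit of Liang and Zeger (1986). Data are simulated.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.genmod.cov_struct import Exchangeable

    rng = np.random.default_rng(3)
    groups = np.repeat(np.arange(50), 4)          # 50 clusters of size 4
    x = rng.normal(size=200)
    y = rng.poisson(np.exp(0.3 + 0.5 * x))        # Poisson response
    X = sm.add_constant(x)

    model = sm.GEE(y, X, groups=groups,
                   family=sm.families.Poisson(),
                   cov_struct=Exchangeable())
    print(model.fit().summary())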
Abstract:
Analog Single Event Transients (ASETs) are produced by the interaction of a heavy ion or a high-energy proton with a sensitive device in an analog circuit. The interaction of the ion with a bipolar or MOS field-effect transistor induces electron-hole pairs that cause spikes which can propagate to the output of the analog component, producing transients that can induce system-level failures. The most serious problems due to this kind of phenomenon occur in the space environment, which is very rich in heavy ions; typical cases are the on-board computers of satellites and other spacecraft. However, owing to the continuous shrinking of transistor dimensions (which brings with it an increase in sensitivity), this phenomenon has begun to be observed at sea level, caused mainly by the impact of atmospheric neutrons. These effects can cause severe problems for computer systems with analog interfaces from which they obtain data for processing, and they have become one of the most serious problems facing designers of highly integrated systems; typical cases are Systems-on-Chip that combine high-performance processing modules with analog interfaces. The general objective of this project is to study the susceptibility of computer systems to ASETs in their analog sections, proposing strategies for error mitigation. The specific objectives are: to propose new ASET models based on device-level simulations solved by the finite element method; to use these models to identify the sections most prone to produce errors, and hence the best candidates for radiation-hardening techniques; to use these models to study the nature of the errors produced in data processing systems; to propose novel solutions for mitigating these effects within the analog circuits themselves, preventing their propagation to the digital sections; and to propose solutions for mitigating the effects at the system level. To carry out the project, a bottom-up research procedure is planned, starting from physical-level descriptions and subsequently raising the level of abstraction at which the circuit is modelled. Physical modelling of the MOS devices and their solution by the Finite Element Method is proposed. Injecting charge into the sensitive zones of the models will make it possible to determine the profiles of the current pulses that must be injected at circuit level to emulate these effects. These procedures will be carried out for the different building blocks of the analog interfaces, proposing error-mitigation strategies at different levels. The expected results of the project include hardware for error detection and tolerance to this kind of event, allowing the reliability of information processing systems to be increased, as well as new data on radiation effects in semiconductors, new transient fault models enabling circuit-level simulation of these events, and the determination of the sensitive zones of typical analog interfaces that must be radiation-hardened.
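A common circuit-level surrogate for the injected current pulses mentioned above is a double-exponential source; the sketch below evaluates one, with parameter values that are illustrative rather than the project's.

    # Sketch: double-exponential current pulse often used to emulate an ASET
    # at circuit level. Parameter values are illustrative only.
    import numpy as np

    def aset_pulse(t, q=150e-15, tau_r=10e-12, tau_f=200e-12):
        """Current (A) at time t (s) for deposited charge q (C);
        the pulse integrates to q over an infinite horizon."""
        return q / (tau_f - tau_r) * (np.exp(-t / tau_f) - np.exp(-t / tau_r))

    t = np.linspace(0, 1e-9, 5)
    print([f"{i * 1e6:.1f} uA" for i in aset_pulse(t)])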
Abstract:
Driven by concerns about rising energy costs, security of supply and climate change, a new wave of Sustainable Energy Technologies (SETs) has been embraced by the Irish consumer. Systems such as solar collectors, heat pumps and biomass boilers have become common due to government-backed financial incentives and revisions of the building regulations. However, there is a deficit of knowledge and understanding of how these technologies operate and perform under Ireland's maritime climate. This AQ-WBL project was designed to address both needs by developing a Data Acquisition (DAQ) system to monitor the performance of such technologies and a web-based learning environment to disseminate performance characteristics and supplementary information about these systems. A DAQ system consisting of 108 sensors was developed as part of Galway-Mayo Institute of Technology's (GMIT's) Centre for the Integration of Sustainable Energy Technologies (CiSET) to benchmark the performance of solar thermal collectors and Ground Source Heat Pumps (GSHPs) under the Irish maritime climate, to research new methods of integrating these systems within the built environment, and to raise awareness of SETs. It has operated reliably for over 2 years and has acquired over 25 million data points. Awareness of these SETs is raised by disseminating the performance data through an online learning environment, created to give different user groups a basic understanding of a SET with the support of performance data through a novel 5-step learning process; two examples were developed for the solar thermal collectors and the weather station, which can be viewed at http://www.kdp1.aquaculture.ie/index.aspx. This online learning environment has been demonstrated to, and well received by, different groups of GMIT's undergraduate students, and plans have been made to develop it further to support education, awareness, research and regional development.
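A minimal sketch of the periodic polling loop such a DAQ system rests on, with the sensor read and channel list stubbed out; all names are assumptions rather than CiSET's implementation.

    # Sketch: periodic polling of many sensor channels into timestamped records.
    # read_sensor and the channel list are stand-ins, not the real DAQ hardware.
    import random, time

    CHANNELS = [f"sensor_{i:03d}" for i in range(108)]   # 108 channels, as in the text

    def read_sensor(channel: str) -> float:
        return random.uniform(0.0, 100.0)                # stub for real hardware I/O

    def sample_once() -> list[dict]:
        ts = time.time()
        return [{"ts": ts, "channel": c, "value": read_sensor(c)} for c in CHANNELS]

    records = sample_once()
    print(len(records), "readings at", records[0]["ts"])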