774 results for Prediction intervals
Abstract:
The research of this thesis dissertation covers developments and applications of short- and long-term climate predictions. The short-term prediction emphasizes monthly and seasonal climate, i.e. forecasting from the next month over a season to up to a year or so ahead. The long-term predictions pertain to the analysis of inter-annual and decadal climate variations over the whole 21st century. These two climate prediction methods are validated and applied in the study area, the Khlong Yai (KY) water basin on the eastern seaboard of Thailand, a major industrial zone of the country that has suffered from severe drought and water shortage in recent years. Since water resources are essential for further industrial development in this region, a thorough analysis of the potential climate change and its subsequent impact on the water supply in the area is at the heart of this thesis research. The short-term forecast of the next-season climate, such as temperatures and rainfall, offers a general guideline for water management and reservoir operation. To that end, statistical models based on autoregressive techniques, i.e. AR, ARIMA and ARIMAex (the latter including additional external regressors), as well as multiple linear regression (MLR) models, are developed and applied in the study region. Teleconnections between ocean states and the local climate are investigated, used as extra external predictors in the ARIMAex and MLR models, and shown to enhance the accuracy of the short-term predictions significantly. However, as the teleconnective relationships between ocean state and local climate provide only a one- to four-month lead time, the ocean state indices can support only a one-season-ahead forecast. Hence, GCM climate predictors are also suggested as an additional predictor set for a more reliable and somewhat longer short-term forecast.
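The ARIMAex idea above (an autoregressive model extended with external regressors) can be illustrated with a minimal "ARX" fit. Everything below is a hypothetical sketch, not the thesis's actual model: a single autoregressive lag, one external predictor, and an ordinary-least-squares fit via the normal equations; a real application would use a dedicated library such as statsmodels.

```python
# Hypothetical sketch: fit x[t] = a*x[t-1] + b*u[t] + c by ordinary least
# squares, where u is an external predictor (e.g. a teleconnection index).
# Lag structure, predictor and data are illustrative assumptions only.

def fit_arx(series, exog):
    """Estimate [a, b, c] for x[t] = a*x[t-1] + b*u[t] + c via OLS."""
    rows = [[series[t - 1], exog[t], 1.0] for t in range(1, len(series))]
    y = [series[t] for t in range(1, len(series))]
    n = 3
    # Normal equations: (X^T X) beta = X^T y.
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    # Solve the 3x3 system by Gaussian elimination with partial pivoting.
    a = [xtx[i] + [xty[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    beta = [0.0] * n
    for r in range(n - 1, -1, -1):
        beta[r] = (a[r][n] - sum(a[r][c] * beta[c] for c in range(r + 1, n))) / a[r][r]
    return beta

def forecast_arx(beta, last_x, next_u):
    """One-step-ahead forecast given the last observation and next exog value."""
    a, b, c = beta
    return a * last_x + b * next_u + c
```

With an ocean-state index as the external series `u`, the fitted coefficients give a one-step-ahead forecast via `forecast_arx`; the ARIMAex model in the thesis generalizes this to differenced series and multiple lags.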
To prepare "pre-warning" information on possible future climate change with potentially adverse hydrological impacts in the study region, the long-term climate prediction methodology is applied. The latter is based on downscaling climate predictions from several single- and multi-domain GCMs, using the two well-known downscaling methods SDSM and LARS-WG and a newly developed MLR downscaling technique that allows the incorporation of a multitude of monthly or daily climate predictors from one or several (multi-domain) parent GCMs. The numerous downscaling experiments indicate that the MLR method is more accurate than SDSM and LARS-WG in predicting the recent past 20th-century (1971-2000) long-term monthly climate in the region. The MLR model is consequently employed to downscale 21st-century GCM climate predictions under SRES scenarios A1B, A2 and B1. However, since the hydrological watershed model requires daily-scale climate input data, a new stochastic daily climate generator is developed to rescale monthly observed or predicted climate series to daily series, while adhering to the statistical and geospatial distributional attributes of observed (past) daily climate series in the calibration phase. Employing this daily climate generator, 30 realizations of future daily climate series are produced from the downscaled monthly GCM climate predictor sets and used as input to the SWAT distributed watershed model, to simulate future streamflow and other hydrological water budget components in the study region in a multi-realization manner. In addition to a general examination of the future changes of the hydrological regime in the KY basin, potential future changes of the water budgets of the basin's three main reservoirs are analysed, as these are a major source of water supply in the study region.
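The monthly-to-daily rescaling step can be illustrated by a toy disaggregation that preserves only the monthly total. This is a loose sketch with invented numbers and parameter names, not the thesis's generator, which additionally reproduces the statistical and geospatial distributional attributes of the observed daily series.

```python
# Hypothetical sketch of monthly-to-daily rainfall disaggregation.
# Only the monthly total is preserved; wet_fraction and the weighting
# scheme are illustrative assumptions, not the thesis's method.
import random

def disaggregate_monthly_rain(monthly_total_mm, n_days, wet_fraction=0.4, seed=1):
    """Split a monthly rainfall total over n_days, keeping the total exact."""
    rng = random.Random(seed)
    # Assign random positive weights to randomly chosen wet days.
    weights = [rng.random() if rng.random() < wet_fraction else 0.0
               for _ in range(n_days)]
    if sum(weights) == 0:                 # degenerate case: force one wet day
        weights[rng.randrange(n_days)] = 1.0
    scale = monthly_total_mm / sum(weights)
    return [w * scale for w in weights]   # daily series summing to the total

daily = disaggregate_monthly_rain(120.0, 30)
```

A calibrated generator would instead condition the wet/dry sequence and daily amounts on distributions fitted to observed daily records, which is what allows the 30 future realizations described above to be statistically plausible.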
The results of the long-term 21st-century downscaled climate predictions provide evidence that, compared with the 20th-century reference period, the future climate in the study area will be more extreme, particularly for SRES A1B. Temperatures will be higher and will exhibit larger fluctuations. Although the future intensity of the rainfall is nearly constant, its spatial distribution across the region partially changes. There is further evidence that sequential rainfall occurrence will decrease, so that short periods of high intensity will be followed by longer dry spells. This change in the sequential rainfall pattern will also lead to seasonal reductions of streamflow and seasonal decreases of water storage in the reservoirs. In any case, these predicted future climate changes and their hydrological impacts should encourage water planners and policy makers to develop adaptation strategies to properly handle the future water supply in this area, following the guidelines suggested in this study.
Abstract:
There has been recent interest in using temporal difference learning methods to attack problems of prediction and control. While these algorithms have been brought to bear on many problems, they remain poorly understood. The purpose of this thesis is to explore these algorithms further, presenting a framework for viewing them, raising a number of practical issues, and examining those issues in the context of several case studies. These include applying the TD(lambda) algorithm to: 1) learning to play tic-tac-toe from the outcomes of self-play and of play against a perfectly playing opponent, and 2) learning simple one-dimensional segmentation tasks.
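As a concrete reference point for TD(lambda), here is a minimal tabular implementation with accumulating eligibility traces, applied to a small random-walk prediction task. The environment and all parameter values are illustrative assumptions, not the thesis's case studies.

```python
# Hypothetical sketch: tabular TD(lambda) with accumulating eligibility
# traces on a 5-state random walk (reward 1 at the right terminal, 0 at
# the left). Environment and hyperparameters are illustrative only.
import random

def td_lambda_random_walk(episodes=2000, alpha=0.1, lam=0.8, seed=0):
    """Learn state values for the random walk; true values are (i+1)/6."""
    rng = random.Random(seed)
    n = 5
    v = [0.5] * n                   # value estimates, optimistic-neutral start
    for _ in range(episodes):
        e = [0.0] * n               # eligibility traces, reset each episode
        s = n // 2                  # start in the middle state
        while True:
            s_next = s + rng.choice((-1, 1))
            if s_next < 0:          # left terminal: reward 0
                target, done = 0.0, True
            elif s_next >= n:       # right terminal: reward 1
                target, done = 1.0, True
            else:
                target, done = v[s_next], False
            delta = target - v[s]   # TD error (gamma = 1, reward folded in)
            e[s] += 1.0             # accumulate trace for the visited state
            for i in range(n):
                v[i] += alpha * delta * e[i]
                e[i] *= lam         # decay all traces by gamma * lambda
            if done:
                break
            s = s_next
    return v
```

After training, the estimates should increase from left to right, approaching the true values 1/6, 2/6, ..., 5/6.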
Abstract:
We contribute a quantitative and systematic model to capture etch non-uniformity in deep reactive ion etch of microelectromechanical systems (MEMS) devices. Deep reactive ion etch is commonly used in MEMS fabrication where high-aspect-ratio features are to be produced in silicon. It is typical for many supposedly identical devices, perhaps of diameter 10 mm, to be etched simultaneously into one silicon wafer of diameter 150 mm. Etch non-uniformity depends on uneven distributions of ion and neutral species at the wafer level, and on local consumption of those species at the device, or die, level. An ion–neutral synergism model is constructed from data obtained from etching several layouts of differing pattern opening densities. Such a model is used to predict wafer-level variation with an r.m.s. error below 3%. This model is combined with a die-level model, which we have reported previously, on a MEMS layout. The two-level model is shown to enable prediction of both within-die and wafer-scale etch rate variation for arbitrary wafer loadings.
Abstract:
Abstract taken from the publication. Abstract also available in English.
Abstract:
Considering the difficulty of insulin dosage selection and the problem of hyper- and hypoglycaemia episodes in type 1 diabetes, dosage-aid systems are tremendously helpful for these patients. A model-based approach to this problem must unavoidably consider uncertainty sources such as the large intra-patient variability and food intake. This work addresses the prediction of glycaemia for a given insulin therapy in the face of parametric and input uncertainty, by means of modal interval analysis. As a result, a band containing all possible glucose excursions suffered by the patient under the given uncertainty is obtained. From it, a safer prediction of possible hyper- and hypoglycaemia episodes can be calculated.
Abstract:
In this paper, the robustness of parametric systems is analyzed using a new approach to interval mathematics called Modal Interval Analysis. Modal intervals are an interval extension that, unlike classic intervals, recovers some of the properties required by a numerical system. Modal Interval Analysis not only simplifies the computation of interval functions but also allows semantic interpretation of their results. Necessary, sufficient and, in some cases, necessary and sufficient conditions for robust performance are presented.
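For orientation, the classic ("set-theoretic") interval arithmetic that modal intervals extend can be sketched in a few lines. This is not Modal Interval Analysis itself; it merely shows the kind of limitation, such as the dependency problem below, that motivates the richer modal semantics.

```python
# Classic interval arithmetic (NOT modal intervals): each operation
# brackets all possible results of operands drawn from the intervals.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

x = Interval(1, 2)
# Dependency problem: for any single value of x, x - x is exactly 0,
# but the natural interval extension treats the two occurrences of x
# as independent and yields [-1, 1].
print((x - x).lo, (x - x).hi)   # -1 1
```

Modal intervals address exactly this loss of semantic information by distinguishing how each interval quantifies over its values, which is what enables the robust-performance conditions mentioned in the abstract.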
Abstract:
PowerPoint slides for Confidence Intervals. Examples are taken from the medical literature.
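As a minimal worked example of the kind such slides typically contain, here is a large-sample 95% confidence interval for a mean. The data are invented, and the 1.96 critical value is the normal approximation (a t quantile would be used for genuinely small samples).

```python
# Hypothetical sketch: 95% confidence interval for a sample mean using
# the large-sample normal approximation (z = 1.96). Data are invented.
import math

def mean_ci_95(data):
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                             # standard error
    half = 1.96 * se                                    # half-width of the CI
    return mean - half, mean + half

# e.g. ten invented systolic-response measurements
low, high = mean_ci_95([10, 12, 11, 13, 14, 9, 10, 11, 12, 13])
```

The interpretation taught in such slides: across repeated samples, about 95% of intervals constructed this way would cover the true mean.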
Abstract:
Lecture for COMP6235.
Abstract:
An emerging consensus in cognitive science views the biological brain as a hierarchically organized predictive processing system, in which higher-order regions continuously attempt to predict the activity of lower-order regions at a variety of (increasingly abstract) spatial and temporal scales. The brain is thus revealed as a hierarchical prediction machine that is constantly engaged in the effort to predict the flow of information originating from the sensory surfaces. Such a view affords a great deal of explanatory leverage when it comes to a broad swathe of seemingly disparate psychological phenomena (e.g., learning, memory, perception, action, emotion, planning, reasoning, imagination, and conscious experience). In the most positive case, the predictive processing story provides our first glimpse of what a unified (computationally tractable and neurobiologically plausible) account of human psychology might look like. This marks out one reason why such models should be the focus of current empirical and theoretical attention. Another reason, however, is rooted in the potential of such models to advance the current state of the art in machine intelligence and machine learning. Interestingly, the vision of the brain as a hierarchical prediction machine establishes contact with work that goes under the heading of 'deep learning'. Deep learning systems often attempt to make use of predictive processing schemes and (increasingly abstract) generative models as a means of supporting the analysis of large data sets. But are such computational systems sufficient (by themselves) to provide a route to general human-level analytic capabilities? I will argue that they are not, and that closer attention to a broader range of forces and factors (many of which are not confined to the neural realm) may be required to understand what gives human cognition its distinctive (and largely unique) flavour.
The vision that emerges is one of 'homomimetic deep learning systems', systems that situate a hierarchically-organized predictive processing core within a larger nexus of developmental, behavioural, symbolic, technological and social influences. Relative to that vision, I suggest that we should see the Web as a form of 'cognitive ecology', one that is as much involved with the transformation of machine intelligence as it is with the progressive reshaping of our own cognitive capabilities.
Abstract:
This study assessed visual working memory in schoolchildren using the computerized Memonum test. The effects of three exposure times (1, 4 and 8 seconds) and of the presentation of a distractor on mnemonic performance were evaluated in 72 children from a school in the metropolitan area of Bucaramanga, Colombia, aged between 8 and 11 and attending the third, fourth and fifth grades. A significant difference was found for exposure time in the variables number of hits and accumulated hits, showing better mnemonic performance in participants who took the test with 8-second exposures than in children who took it with 1-second exposures; in addition, the presence of a distractor produced a significant difference in hits and accumulated hits. The distractor is considered an interference-generating stimulus that disrupts the storage capacity of working memory in children. Additionally, a significant difference was found with respect to the use of the mental rehearsal strategy, indicating that participants who took the test with 4- and 8-second exposures, respectively, obtained higher scores than children who took it with 1-second exposures. A longer exposure time to stimuli during the Memonum test increases retention capacity. The use of a distractor reduces storage capacity, while the mnemonic strategies that children use to retain the numerical series increase with school progression.
Abstract:
Objective: To establish a prediction model of the degree of disability in adults with Spinal Cord Injury (SCI) based on the use of the WHO-DAS II. Methods: The disability degree was correlated with three groups of variables: clinical, sociodemographic and those related to rehabilitation services. A multiple linear regression model was built to predict disability. 45 people with SCI of diverse etiology, neurological level and completeness participated. Patients were older than 18 and were more than six months post-injury. The WHO-DAS II and the ASIA impairment scale (AIS) were used. Results: Variables that showed a significant relationship with disability were the following: occupational situation, type of affiliation to the public health care system, injury evolution time, neurological level, partial preservation zone, AIS motor and sensory scores, and number of clinical complications during the last year. Complications significantly associated with disability were joint pain, urinary infections, intestinal problems and autonomic dysreflexia. None of the variables related to rehabilitation services showed a significant association with disability. The disability degree exhibited significant differences in favor of the groups that received the following services: assistive device supply and vocational, job or educational counseling. Conclusions: The best disability prediction model in adults with SCI more than six months post-injury was built with the variables injury evolution time, AIS sensory score and injury-related unemployment.
Abstract:
Non-specific Occupational Low Back Pain (NOLBP) is a health condition that generates high absenteeism and disability. Due to its multifactorial causes, it is difficult to determine an accurate diagnosis and prognosis. Clinical prediction of NOLBP is understood as a series of models that integrate multivariate analysis to determine the early diagnosis, course and occupational impact of this health condition. Objective: to identify predictor factors of NOLBP and the type of material referred to in the scientific evidence, and to establish the scope of the prediction. Materials and method: the title search was conducted in the databases PubMed, Science Direct, Ebsco and Springer, covering 1985 to 2012. The selected articles were classified through a bibliometric analysis, allowing the most relevant ones to be defined. Results: 101 titles met the established criteria, but only 43 met the purpose of the review. As for NOLBP prediction, the studies varied in the factors considered, for example: diagnosis, transition of lumbar pain from acute to chronic, absenteeism from work, disability, and return to work. Conclusion: clinical prediction is considered a strategy to determine the course and prognosis of NOLBP, and to determine the characteristics that increase the risk of chronicity in workers with this health condition. Likewise, clinical prediction rules are tools that aim to facilitate decision making about the evaluation, diagnosis, prognosis and intervention for low back pain, and they should incorporate physical, psychological and social risk factors.
Abstract:
Patients with prostate cancer at low and intermediate risk of relapse can be treated with surgery, radiotherapy and, in selected cases, observation. Patients in our country are treated with radical prostatectomy and have a probability of biochemical relapse of 15% to 40% at 5 years (1,2,3). Methodology: descriptive, retrospective, case-series study. We reviewed the records of all patients who received salvage radiotherapy, which is offered to patients with biochemical or local relapse after radical prostatectomy, between January 2003 and December 2007. Results: among the 40 patients eligible for analysis, the mean follow-up was 2.17 years, with a standard deviation of 1.5 years and a range of 0 to 58 months; the mean age was 66.12 years, with a standard deviation of 6.63 and a range between 50 and 78 years. All patients underwent prostatectomy. The mean disease-free survival, with a 95% confidence interval, was 4.58 years (2.24 to 4.92 years). Discussion: analysing the results in this group of prostate cancer patients treated with radical prostatectomy and salvage radiotherapy, with an average follow-up of 2.17 years, we observe that the results obtained in the present study are inferior to those registered in other reports in the literature (16-20).
Abstract:
Objective: Recently, several bioelectrical impedance analysis (BIA) devices have been proposed for the rapid estimation of body fat. However, body-fat reference values for children and adolescents in the Colombian population have not been published. The aim of this study was to establish BIA body-fat percentiles in children and adolescents from Bogotá, Colombia, aged between 9 and 17.9 years, belonging to the FUPRECOL study. Methods: Descriptive, cross-sectional study of 2,526 children and 3,324 adolescents aged between 9 and 17.9 years, from official educational institutions in Bogotá, Colombia. Body fat percentage was measured with a Tanita® Body Composition Analyzer (model BF-689), by age and sex. Weight, height, waist circumference, hip circumference and self-reported sexual maturation status were recorded. Percentiles (P3, P10, P25, P50, P75, P90 and P97) and centile curves were calculated using the LMS method by sex and age, and the observed waist circumference values were compared with international standards. Results: Body fat percentage values and percentile curves are presented. In most age groups, girls' body fat was higher than boys'. Subjects whose body fat percentage was above the 90th percentile of the standard normal distribution were considered to be at elevated cardiovascular risk (boys from 23.4-28.3 and girls from 31.0-34.1). In general, our body fat percentages were lower than the values reported for Turkey, Germany, Greece, Spain and the United Kingdom. Conclusions: BIA body-fat percentage percentiles by age and sex are presented, which can be used as a reference in the evaluation of nutritional status and in the prediction of cardiovascular risk from an early age.
Abstract:
Objective: To determine the percentile distribution of waist circumference in a school population from Bogotá, Colombia, belonging to the FUPRECOL study. Methods: Cross-sectional study of 3,005 children and 2,916 adolescents aged between 9 and 17.9 years, from Bogotá, Colombia. Weight, height, waist circumference, hip circumference and self-reported sexual maturation status were recorded. Percentiles (P3, P10, P25, P50, P75, P90 and P97) and centile curves were calculated by sex and age, and the observed waist circumference values were compared with international standards. Results: Of the overall population (n=5,921), 57.0% were girls (mean age 12.7±2.3 years). In most age groups, girls' waist circumference was lower than boys'. The increase between P50 and P97 of waist circumference, by age, was a minimum of 15.7 cm in boys aged 9-9.9 years and 16.0 cm in girls aged 11-11.9 years. When comparing the results of this study, by age group and sex, with international studies of children and adolescents, the P50 was lower than that reported in Peru and England, with the exception of the studies from India, Venezuela (Mérida), the United States and Spain. Conclusions: Waist circumference percentiles by age and sex are presented, which can be used as a reference in the evaluation of nutritional status and in the prediction of cardiovascular risk from an early age.
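For orientation, empirical percentile cut-offs of the P3-P97 kind reported above can be computed as below. This is a simplified sketch with invented waist-circumference values; the study itself derived smoothed centile curves across age, which plain empirical percentiles of a single age group do not reproduce.

```python
# Hypothetical sketch: empirical percentile cut-offs (linear interpolation)
# for one age/sex group. The measurements are invented for illustration.

def percentile(sorted_vals, p):
    """Linear-interpolation percentile (0 <= p <= 100) of a sorted list."""
    idx = (len(sorted_vals) - 1) * p / 100.0
    lo = int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

# Invented waist-circumference measurements (cm) for one group.
waist_cm = sorted([58.0, 60.5, 61.2, 59.4, 63.0, 65.8, 57.1, 62.4, 64.2, 60.0])
cuts = {p: percentile(waist_cm, p) for p in (3, 10, 25, 50, 75, 90, 97)}
```

Reference studies like this one repeat such estimation per age-sex stratum and then smooth the cut-offs across age to obtain the published centile curves.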