947 results for PROBABILISTIC FORECASTS
Abstract:
In the trend towards tolerating hardware unreliability, accuracy is exchanged for cost savings. Running on less reliable machines, functionally correct code becomes risky, and one needs to know how risk propagates so as to mitigate it. Risk estimation, however, seems to live outside the average programmer's technical competence and core practice. In this paper we propose that program design by source-to-source transformation be risk-aware, in the sense of making probabilistic faults visible and supporting equational reasoning on the probabilistic behaviour of programs caused by faults. This reasoning is carried out in a linear-algebra extension to the standard, à la Bird-de Moor, algebra of programming. This paper studies, in particular, the propagation of faults across standard program transformation techniques known as tupling and fusion, enabling the fault of the whole to be expressed in terms of the faults of its parts.
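Outside the fault-aware setting of the paper, the two transformations it studies can be illustrated with a minimal, hypothetical sketch (function names are ours, not the paper's): tupling merges two traversals of the same structure into one traversal that returns a pair, while fusion collapses two composed passes into a single pass.

```python
def sum_and_length(xs):
    """Tupling: one traversal computing the pair (sum, length)
    instead of the two separate traversals sum(xs) and len(xs)."""
    s, n = 0, 0
    for x in xs:
        s, n = s + x, n + 1
    return s, n

def fused_map(f, g, xs):
    """Fusion: map(f, map(g, xs)) rewritten as a single pass over xs."""
    return [f(g(x)) for x in xs]
```

In the paper's setting each pass may fail with some probability, and the question becomes how the fault probability of the tupled or fused whole relates to the fault probabilities of its parts.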
Abstract:
Software reconfigurability has become increasingly relevant to the architectural process due to the growing dependency of modern societies on reliable and adaptable systems. Such systems are supposed to adapt themselves to surrounding environmental changes with minimal service disruption, if any. This paper introduces an engine that statically applies reconfigurations to (formal) models of software architectures. Reconfigurations are specified using a domain-specific language, ReCooPLa, which targets the manipulation of software coordination structures, typically used in service-oriented architectures (SOA). The engine is responsible for the compilation of ReCooPLa instances and their application to the relevant coordination structures. The resulting configurations are amenable to formal analysis of qualitative and quantitative (probabilistic) properties.
Abstract:
ABSTRACT Objective To investigate the occurrence of dual diagnosis in users of legal and illegal drugs. Methods This is an analytical, cross-sectional study with a quantitative approach and non-probabilistic intentional sampling, carried out in two centers for drug addiction treatment by means of individual interviews. A sociodemographic questionnaire, the Alcohol, Smoking and Substance Involvement Screening Test (ASSIST) and the Mini-International Neuropsychiatric Interview (MINI) were used. Results One hundred and ten volunteers were divided into abstinent users (group 1), alcoholics (group 2) and users of alcohol and illicit drugs (group 3). The substances used were alcohol, tobacco, crack and marijuana. A higher presence of dual diagnosis was observed in group 3 (71.8%), which decreased in group 2 (60%), while 37.1% of drug-abstinent users had a psychiatric disorder. Dual diagnosis was associated with the risk of suicide, suicide attempts and the practice of infractions. Crack consumption was associated with the occurrence of major depressive episodes and antisocial personality disorder. Conclusion Illicit drug users had a higher presence of dual diagnosis, showing the severity of this clinical condition. It is considered essential that this clinical reality be included in intervention strategies in order to decrease the negative effects of the consumption of these substances and provide a better quality of life for these people.
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
First published online: December 16, 2014.
Abstract:
There are two significant sources of uncertainty in water demand: on the one hand, an evolving technological world marked by accelerated change in lifestyles and consumption patterns; on the other, intensifying climate change. Given such an uncertain future, what enables policymakers to assess the state of water resources, which are affected by withdrawals and demands? Through a case study based on thirteen years of observation data in the Zayandeh Rud River basin in Isfahan province, Iran, this paper forecasts a wide range of urban water demand possibilities in order to create a portfolio of plans that could be utilized by different water managers. Two existing methods are compared and contrasted, demonstrating the Random Walk Methodology, referred to here as the 'On Uncertainty Path' because it takes the uncertainties into account, which can be recommended to managers. This On Uncertainty Path combines a dynamic forecasting method with system simulation. The outcomes show the advantage of such methods, particularly for places where climate change will aggravate water scarcity, such as Iran.
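As a rough illustration of the random-walk idea behind such an uncertainty-aware forecast (the function name, drift, and noise values below are ours, not the paper's), a Monte Carlo sketch that generates a spread of demand trajectories might look like:

```python
import random

def demand_scenarios(d0, drift, sigma, horizon, n_paths, seed=42):
    """Simulate random-walk-with-drift demand paths to span
    a range of future demand possibilities (illustrative only)."""
    rng = random.Random(seed)
    scenarios = []
    for _ in range(n_paths):
        d, path = d0, []
        for _ in range(horizon):
            d += drift + rng.gauss(0.0, sigma)  # next-step demand
            path.append(d)
        scenarios.append(path)
    return scenarios
```

A planner would then build a portfolio of plans covering, say, the spread between low and high quantiles of the simulated paths rather than betting on a single point forecast.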
Abstract:
This research work explores a new way of presenting and representing information about patients in critical care: the use of a timeline to display information. This is accomplished with the development of an interactive Pervasive Patient Timeline able to give intensivists real-time access to an environment containing patients' clinical information from the moment the patients are admitted to the Intensive Care Unit (ICU) until their discharge. This solution allows the intensivists to analyse data regarding vital signs, medication, exams, data mining predictions, among others. Due to its pervasive features, intensivists can access the timeline anywhere and anytime, allowing them to make decisions when they need to be made. This platform is patient-centred and is prepared to support the decision process, allowing the intensivists to provide better care to patients due to the inclusion of clinical forecasts.
Abstract:
Telecommunications and network technology is now the driving force that ensures the continued progress of world civilization. The design of new network infrastructures and the expansion of existing ones require improving the quality of service (QoS). Modeling the probabilistic and time characteristics of telecommunication systems is an integral part of modern algorithms for the administration of quality of service. At present, besides simulation models, analytical models in the form of queuing systems and queuing networks are widely used for the assessment of quality parameters. Because of the limited mathematical tools of these model classes, the corresponding estimates of quality-of-service parameters are inadequate by definition, especially for models of telecommunication systems with packet transmission of multimedia real-time traffic.
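As a concrete, deliberately simple instance of the analytical queuing models mentioned above, the classic M/M/1 formulas relate an arrival rate and a service rate to basic QoS metrics. The sketch is illustrative and not taken from the abstract:

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 queue metrics (requires lam < mu)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu           # server utilization
    l_sys = rho / (1 - rho)  # mean number of packets in the system
    w_sys = 1 / (mu - lam)   # mean time in the system (Little's law)
    return rho, l_sys, w_sys
```

It is precisely the Poisson-arrival and exponential-service assumptions baked into such closed forms that make them a poor fit for bursty multimedia real-time traffic, which is the inadequacy the abstract points out.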
Abstract:
We analyze the classical Bertrand model when consumers exhibit some strategic behavior in deciding from which seller they will buy. We use two related but different tools. Both consider a probabilistic learning (or evolutionary) mechanism, and in both of them consumers' behavior influences the competition between the sellers. The results obtained show that, in general, developing some sort of loyalty is a good strategy for the buyers as it works in their best interest. First, we consider a learning procedure described by a deterministic dynamic system and, using strong simplifying assumptions, we can produce a description of the process behavior. Second, we use finite automata to represent the strategies played by the agents and an adaptive process based on genetic algorithms to simulate the stochastic process of learning. By doing so we can relax some of the strong assumptions used in the first approach and still obtain the same basic results. It is suggested that the limitations of the first approach (analytical) provide a good motivation for the second approach (agent-based). Indeed, although both approaches address the same problem, the use of agent-based computational techniques allows us to relax hypotheses and overcome the limitations of the analytical approach.
Abstract:
This paper evaluates the forecasting performance of a continuous stochastic volatility model with two factors of volatility (SV2F) and compares it to those of GARCH and ARFIMA models. The empirical results show that the volatility forecasting ability of the SV2F model is better than that of the GARCH and ARFIMA models, especially when volatility seems to change pattern. We use ex-post volatility as a proxy for the realized volatility obtained from intraday data, and the forecasts from the SV2F model are calculated using the reprojection technique proposed by Gallant and Tauchen (1998).
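For reference, the GARCH(1,1) benchmark used in such comparisons produces one-step-ahead variance forecasts from a simple recursion; the parameter values in the test below are illustrative, not estimates from the paper:

```python
def garch11_forecast(omega, alpha, beta, last_resid_sq, last_var):
    """One-step-ahead conditional variance for GARCH(1,1):
    sigma2[t+1] = omega + alpha * eps2[t] + beta * sigma2[t],
    where eps2[t] is the last squared residual and sigma2[t]
    the last conditional variance."""
    return omega + alpha * last_resid_sq + beta * last_var
```

Iterating this recursion forward (replacing the unknown future squared residual by its conditional expectation) yields the multi-step forecasts against which the SV2F model is compared.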
Abstract:
In this paper, a new class of generalized backward doubly stochastic differential equations is investigated. This class involves an integral with respect to an adapted continuous increasing process. A probabilistic representation for viscosity solutions of semi-linear stochastic partial differential equations with a Neumann boundary condition is given.
Abstract:
Pleistocene glacial and interglacial periods have moulded the evolutionary history of European cold-adapted organisms. The role of the different mountain massifs has, however, not been accurately investigated in the case of high-altitude insect species. Here, we focus on three closely related species of non-flying leaf beetles of the genus Oreina (Coleoptera, Chrysomelidae), which are often found in sympatry within the mountain ranges of Europe. After showing that the species concept as currently applied does not match barcoding results, we show, based on more than 700 sequences from one nuclear and three mitochondrial genes, the role of biogeography in shaping the phylogenetic hypothesis. Dating the phylogeny using an insect molecular clock, we show that the earliest lineages diverged more than 1 Mya and that the main shift in diversification rate occurred between 0.36 and 0.18 Mya. By using a probabilistic approach on the parsimony-based dispersal/vicariance framework (MP-DIVA) as well as a direct likelihood method of state change optimization, we show that the Alps acted as a cross-roads with multiple events of dispersal to and reinvasion from neighbouring mountains. However, the relative importance of vicariance vs. dispersal events on the process of rapid diversification remains difficult to evaluate because of a bias towards overestimation of vicariance in the DIVA algorithm. Parallels are drawn with recent studies of cold-adapted species, although our study reveals novel patterns in diversity and genetic links between European mountains, and highlights the importance of neglected regions, such as the Jura and the Balkanic range.
Abstract:
This paper sheds new light on a long-standing puzzle in the international finance literature, namely, that exchange rate expectations appear inaccurate and even irrational. We find for a comprehensive dataset that individual forecasters’ performance is skill-based. ‘Superior’ forecasters show consistent ability as their forecasting success holds across currencies. They seem to possess knowledge on the role of fundamentals in explaining exchange rate behavior, as indicated by better interest rate forecasts. Superior forecasters are more experienced than the median forecaster and have fewer personnel responsibilities. Accordingly, foreign exchange markets may function in less puzzling and irrational ways than is often thought.
Abstract:
The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when these inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repetitive analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
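The Bayesian interpretative framework referred to above reduces, in its simplest form, to a likelihood ratio that updates prior odds; this minimal sketch (variable names are ours) shows the arithmetic:

```python
def likelihood_ratio(p_evidence_given_hp, p_evidence_given_hd):
    """LR = P(E | Hp) / P(E | Hd): how strongly the ink findings favour
    the prosecution hypothesis Hp over the defence hypothesis Hd."""
    return p_evidence_given_hp / p_evidence_given_hd

def posterior_odds(prior_odds, lr):
    """Bayes' theorem in odds form: posterior odds = prior odds * LR."""
    return prior_odds * lr
```

Evaluating the denominator P(E | Hd) is where the ink reference database comes in: it supplies the population statistics of ink characteristics, which is why reproducible, objectively comparable measurements are a precondition for the whole framework.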
Abstract:
This paper examines the effect that heterogeneous customer order flows have on exchange rates by using a new, and the largest, proprietary dataset of weekly net order flow segmented by customer type across nine of the most liquid currency pairs. We make several contributions. Firstly, we investigate the extent to which customer order flow can help to explain exchange rate movements over and above the influence of macroeconomic variables. Secondly, we address the issue of whether order flows contain (private) information which explains exchange rate changes. Thirdly, we look at the usefulness of order flow in forecasting exchange rate movements at longer horizons than those generally considered in the microstructure literature. Finally, we address the question of whether the out-of-sample exchange rate forecasts generated by order flows can be employed profitably in the foreign exchange markets.