947 results for PROBABILISTIC FORECASTS
Abstract:
2000 Mathematics Subject Classification: 94A29, 94B70
Abstract:
Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safe and economic utilization of wind energy. In this paper, we investigate a combination of numeric and probabilistic models: one-day-ahead wind power forecasts were made with Gaussian Processes (GPs) applied to the outputs of a Numerical Weather Prediction (NWP) model. First, the wind speed data from the NWP model were corrected by a GP. Then, because the power generated by a wind turbine is always limited by the turbine control strategy, a Censored GP was used to model the relationship between the corrected wind speed and the power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and Artificial Neural Networks (ANNs); the proposed model achieves about an 11% improvement in forecasting accuracy (Mean Absolute Error) over the ANN model on one dataset, and nearly a 5% improvement on the other.
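A minimal sketch of the two-stage pipeline described above, assuming scikit-learn and entirely synthetic stand-in data; the paper's Censored GP is only approximated here by clipping an ordinary GP's predictions at rated power.

```python
# Hedged sketch of the two-stage pipeline, not the authors' code.
# All data below are synthetic stand-ins for NWP forecasts, measured speeds and power.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
nwp_speed = rng.uniform(0, 20, 200)                      # stand-in NWP wind-speed forecasts
obs_speed = nwp_speed + rng.normal(0, 1.5, 200)          # stand-in measured speeds
power = np.clip((obs_speed / 12.0) ** 3, 0.0, 1.0)       # toy power curve, capped at rated power

kernel = RBF(length_scale=5.0) + WhiteKernel(noise_level=1.0)

# Stage 1: GP correction of the NWP wind speed.
gp_speed = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp_speed.fit(nwp_speed.reshape(-1, 1), obs_speed)
corrected = gp_speed.predict(nwp_speed.reshape(-1, 1))

# Stage 2: GP power curve. The paper uses a Censored GP to respect the rated-power
# limit; here censoring is only approximated by clipping the GP prediction.
gp_power = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp_power.fit(corrected.reshape(-1, 1), power)
pred_power = np.clip(gp_power.predict(corrected.reshape(-1, 1)), 0.0, 1.0)

print(f"in-sample MAE: {np.mean(np.abs(pred_power - power)):.3f}")
```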
Abstract:
2000 Mathematics Subject Classification: 54H25, 47H10.
Abstract:
2000 Mathematics Subject Classification: Primary 60J45, 60J50, 35Cxx; Secondary 31Cxx.
Abstract:
The focus of this thesis is the extension of topographic visualisation mappings to allow for the incorporation of uncertainty. Few visualisation algorithms in the literature are capable of mapping uncertain data, and fewer still can represent observation uncertainties in the visualisations. Modifications are therefore made to NeuroScale, Locally Linear Embedding, Isomap and Laplacian Eigenmaps to incorporate uncertainty in the observation and visualisation spaces. The proposed mappings are called Normally-distributed NeuroScale (N-NS), T-distributed NeuroScale (T-NS), Probabilistic LLE (PLLE), Probabilistic Isomap (PIso) and Probabilistic Weighted Neighbourhood Mapping (PWNM). These algorithms generate a probabilistic visualisation space in which each latent visualised point is transformed to a multivariate Gaussian or T-distribution using a feed-forward RBF network. Two types of uncertainty are then characterised, depending on the data and the mapping procedure: data-dependent uncertainty is the inherent observation uncertainty, whereas mapping uncertainty is defined by the Fisher information of a visualised distribution and indicates how well the data have been interpolated, offering a level of 'surprise' for each observation. These new probabilistic mappings are tested on three datasets of vectorial observations and three datasets of real-world time-series observations for anomaly detection. In order to visualise the time-series data, a method for analysing observed signals and noise distributions, Residual Modelling, is introduced. The performance of the new algorithms on the tested datasets is compared qualitatively with the latent space generated by the Gaussian Process Latent Variable Model (GPLVM), and a quantitative comparison using existing evaluation measures from the literature allows the performance of each mapping function to be compared. Finally, the mapping uncertainty measure is combined with NeuroScale to build a deep learning classifier, the Cascading RBF. This new structure is tested on the MNIST dataset, achieving world-record performance whilst avoiding the flaws seen in other deep learning machines.
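As a rough illustration of mapping each observation to a Gaussian-distributed visualisation point, the sketch below pushes an assumed per-observation covariance through a hand-rolled RBF mapping via its Jacobian; this is not the N-NS/T-NS formulation of the thesis, and all centres, widths and weights are invented.

```python
# Illustration only: propagating observation uncertainty through an RBF mapping
# into a 2-D visualisation space. Not the thesis's formulation; parameters are made up.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))                       # toy high-dimensional observations
centres = X[rng.choice(len(X), 15, replace=False)]   # RBF centres picked from the data
width = 2.0
W = rng.normal(scale=0.1, size=(15, 2))              # RBF-to-latent readout weights

def rbf_features(x):
    d2 = ((x - centres) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * width ** 2))            # shape (15,)

def map_with_uncertainty(x, sigma_x):
    """Latent mean and covariance for one observation x with covariance sigma_x."""
    phi = rbf_features(x)
    mean = phi @ W                                    # latent mean, shape (2,)
    # Jacobian of the features: dphi_k/dx = -phi_k * (x - c_k) / width**2
    J_phi = -(phi[:, None] * (x - centres)) / width ** 2
    J = W.T @ J_phi                                   # (2, 10) Jacobian of the full mapping
    cov = J @ sigma_x @ J.T                           # propagated latent covariance
    return mean, cov

mean, cov = map_with_uncertainty(X[0], 0.05 * np.eye(10))
print(mean, np.linalg.eigvalsh(cov))
```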
Abstract:
Cloud computing is a new technological paradigm offering computing infrastructure, software and platforms as a pay-as-you-go, subscription-based service. Many potential customers of cloud services require essential cost assessments to be undertaken before transitioning to the cloud. Current assessment techniques are imprecise as they rely on simplified specifications of resource requirements that fail to account for probabilistic variations in usage. In this paper, we address these problems and propose a new probabilistic pattern modelling (PPM) approach to cloud costing and resource usage verification. Our approach is based on a concise expression of probabilistic resource usage patterns translated to Markov decision processes (MDPs). Key costing and usage queries are identified and expressed in a probabilistic variant of temporal logic and calculated to a high degree of precision using quantitative verification techniques. The PPM cost assessment approach has been implemented as a Java library and validated with a case study and scalability experiments. © 2012 Springer-Verlag Berlin Heidelberg.
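As a toy illustration of the kind of costing query described above (not the PPM library itself), the sketch below models resource usage as a small absorbing Markov chain, i.e. an MDP under one fixed usage pattern, and solves exactly for the expected cumulative cost until completion; all states and numbers are invented.

```python
# Toy expected-cost query over a tiny resource-usage model; invented for illustration.
import numpy as np

states = ["idle", "light_load", "heavy_load", "done"]
# Transition probabilities under one fixed usage pattern (rows sum to 1).
P = np.array([
    [0.6, 0.3, 0.0, 0.1],   # idle
    [0.2, 0.5, 0.2, 0.1],   # light_load
    [0.0, 0.3, 0.5, 0.2],   # heavy_load
    [0.0, 0.0, 0.0, 1.0],   # done (absorbing)
])
cost = np.array([0.01, 0.05, 0.20, 0.0])   # cost per step in each state

# Expected total cost to absorption: solve (I - Q) v = c over the transient states.
transient = [0, 1, 2]
Q = P[np.ix_(transient, transient)]
v = np.linalg.solve(np.eye(len(transient)) - Q, cost[transient])
for s, val in zip(states[:3], v):
    print(f"expected total cost starting from {s}: {val:.3f}")
```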
Abstract:
The traditional use of global, centralised control methods fails for the large, complex, noisy and highly connected systems that typify many real-world industrial and commercial systems. This paper provides an efficient bottom-up design of distributed control in which many simple components communicate and cooperate to achieve a joint system goal. Each component acts individually so as to maximise its personal utility whilst obtaining probabilistic information on the global system merely through local message-passing. This yields a scalable, collective control strategy for complex dynamical systems without the problems of global centralised control. Robustness is addressed by employing a fully probabilistic design, which can cope with inherent uncertainties, can be implemented adaptively, and opens a systematic and rich route to information sharing. The paper opens this direction and examines the proposed design on a linearised version of a coupled map lattice with spatiotemporal chaos. A version close to a linear-quadratic design gives initial insight into the possible behaviours of such networks.
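The sketch below gives a rough, purely illustrative version of the test bed: a coupled logistic-map lattice in which each site corrects itself using only its own and its neighbours' states. The fully probabilistic design and the near-linear-quadratic controller of the paper are not reproduced; gains and targets are invented.

```python
# Illustration only: a coupled logistic-map lattice with simple local proportional control.
import numpy as np

rng = np.random.default_rng(2)
N, steps = 50, 200
eps, r = 0.3, 3.9            # coupling strength and logistic parameter (chaotic regime)
target, gain = 0.6, 0.4      # desired site value and local feedback gain

def f(x):
    return r * x * (1.0 - x)

x = rng.uniform(0.2, 0.8, N)
for t in range(steps):
    left, right = np.roll(x, 1), np.roll(x, -1)
    # coupled map lattice update with diffusive coupling to nearest neighbours
    x_new = (1 - eps) * f(x) + 0.5 * eps * (f(left) + f(right))
    # local control: each site nudges itself toward the target using only
    # locally available information (its own and its neighbours' deviation)
    local_error = (x_new + np.roll(x_new, 1) + np.roll(x_new, -1)) / 3 - target
    x = np.clip(x_new - gain * local_error, 0.0, 1.0)

print("mean deviation from target after control:", np.abs(x - target).mean())
```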
Abstract:
Rationing occurs when the demand for a certain good exceeds its supply. In such situations a rationing method has to be specified in order to determine the allocation of the scarce good among the agents. Moulin (1999) introduced the notion of probabilistic rationing methods for the discrete framework. In this paper we establish a link between classical and probabilistic rationing methods. In particular, we assign to any given classical rationing method a probabilistic rationing method with minimal variance among those probabilistic rationing methods that result in the same expected distributions as the given classical rationing method.
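For illustration only, the sketch below shows one simple way to turn a classical (divisible) allocation into a lottery over integer allocations that matches it in expectation, gives every agent either the floor or the ceiling of the classical share, and preserves the total supply exactly; it conveys the idea of low-variance randomisation but is not claimed to be the paper's minimal-variance construction.

```python
# Illustrative sketch only: randomized systematic rounding of a classical allocation.
import numpy as np

def randomized_systematic_rounding(shares, rng):
    """shares: classical allocation (possibly fractional), summing to an integer supply."""
    shares = np.asarray(shares, dtype=float)
    cum = np.concatenate(([0.0], np.cumsum(shares)))
    u = rng.uniform()                        # a single uniform draw shifts the integer grid
    # Agent i receives the number of grid points u, u+1, u+2, ... falling in (cum[i], cum[i+1]].
    alloc = np.floor(cum[1:] - u).astype(int) - np.floor(cum[:-1] - u).astype(int)
    return alloc

rng = np.random.default_rng(3)
classical = [2.4, 1.1, 0.5]                  # classical shares of 4 indivisible units
draws = np.array([randomized_systematic_rounding(classical, rng) for _ in range(20000)])
print("empirical expectations:", draws.mean(axis=0))        # approximately [2.4, 1.1, 0.5]
print("every draw sums to 4:", (draws.sum(axis=1) == 4).all())
```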
Abstract:
In practice we encounter countless situations in which the demand for a good exceeds the available supply. Examples include compensation claims, the claims of a bankrupt firm's creditors, the queue of patients waiting for an organ transplant, and so on. In such situations the scarce quantity has to be distributed among the agents according to some procedure. It is customary to distinguish deterministic and stochastic rationing procedures, although in many cases only deterministic procedures are applied. For reasons of fairness, however, stochastic rationing procedures are also often used, as the United States army did when withdrawing its soldiers stationed abroad after the end of the Second World War, and when selecting the individuals to be drafted during the Vietnam War. / === / We investigated the minimal variance methods introduced in Tasnádi [6] on the basis of seven popular axioms. We proved that if a deterministic rationing method satisfies demand monotonicity, resource monotonicity, equal treatment of equals and self-duality, then the minimal variance methods associated with it also satisfy demand monotonicity, resource monotonicity, equal treatment of equals and self-duality. Furthermore, we found that consistency, lower composition and upper composition of a deterministic rationing method do not imply consistency, lower composition and upper composition of a minimal variance method associated with it.
Abstract:
The beginning of the 21st century was plagued by unprecedented instances of corporate fraud. In an attempt to address apparently non-existent or "broken" corporate governance policies, sweeping financial reporting reforms ensued, with specific requirements relating to the composition of audit committees, the interaction between audit committees and external auditors, and procedures concerning auditors' assessment of client risk. The purpose of my dissertation is to advance knowledge about "good" corporate governance by examining the association between meeting-or-beating analyst forecasts and audit fees, audit committee compensation, and audit committee tenure and "busyness". Using regression analysis, I found the following: (1) the frequency of meeting-or-just-beating (just missing) analyst forecasts is negatively (positively) associated with audit fees; (2) the extent by which a firm exceeds analysts' forecasts is positively (negatively) associated with audit committee compensation that is predominantly equity-based (cash-based); and (3) the likelihood of repeatedly meeting-or-just-beating analyst forecasts is positively associated with audit committee tenure and "busyness". These results suggest that auditors consider clients who frequently meet or just beat forecasts as less "risky", and clients that frequently just miss as more "risky". The results also imply that cash-based director compensation is more successful in preserving the effectiveness of the audit committee's financial reporting oversight role; that equity-based compensation motivates independent audit committee directors to focus on short-term performance, thereby aligning their interests with management; and that audit committee director tenure and the degree of director "busyness" can affect an audit committee member's effectiveness in providing financial reporting oversight. Collectively, my dissertation provides additional insights into corporate governance practices and informs policy-makers for future relevant decisions.
Abstract:
Corporate executives closely monitor the accuracy of their hotels' occupancy forecasts, since important decisions are based upon these predictions. This study lists the criteria for selecting an appropriate error measure. It discusses several evaluation methods, focusing on statistical significance tests, and demonstrates the use of two adequate evaluation methods: the Mincer-Zarnowitz efficiency test and Wilcoxon's non-parametric matched-pairs signed-ranks test.
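A short sketch of the two evaluation methods named above, on invented occupancy data, assuming statsmodels and SciPy are available:

```python
# Sketch of the two evaluation methods on invented occupancy data.
import numpy as np
import statsmodels.api as sm
from scipy.stats import wilcoxon

rng = np.random.default_rng(4)
actual = rng.uniform(0.55, 0.95, 60)                 # hypothetical occupancy rates
forecast_a = actual + rng.normal(0, 0.03, 60)        # method A forecasts
forecast_b = actual + rng.normal(0.02, 0.05, 60)     # method B forecasts (biased, noisier)

# Mincer-Zarnowitz efficiency test: regress actuals on forecasts and jointly test
# intercept = 0 and slope = 1; rejection indicates a biased or inefficient forecast.
X = sm.add_constant(forecast_a)
mz = sm.OLS(actual, X).fit()
print(mz.f_test("const = 0, x1 = 1"))

# Wilcoxon matched-pairs signed-ranks test on the two methods' absolute errors.
err_a, err_b = np.abs(forecast_a - actual), np.abs(forecast_b - actual)
stat, p = wilcoxon(err_a, err_b)
print(f"Wilcoxon statistic = {stat:.1f}, p-value = {p:.4f}")
```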
Abstract:
Prior research suggests that book-tax income differences (BTD) relate to both firms' earnings quality and operating performance. In this dissertation, I explore whether and how financial analysts efficiently signal the implications of BTD. The dissertation comprises three essays, which seek to develop a better understanding of how financial analysts utilize the information reflected in BTD (derived from the ratio of taxable income to book income). The first essay is a review and discussion of prior research regarding BTD. The second essay investigates the role of BTD in indicating the consensus and dispersion of analyst recommendations. I find that sell recommendations are positively related to BTD. I also document that analyst coverage has a positive effect on the standard deviation of consensus recommendations with respect to BTD. The third essay is an empirical analysis of analysts' forecast optimism, analyst coverage, and BTD. I find a negative association between forecast optimism and BTD; my results are consistent with a larger BTD being associated with less forecast bias. Overall, I interpret the sum of the evidence as consistent with BTD reflecting information about earnings quality, and with analysts examining and using this information in making decisions regarding both forecasts and recommendations.
Abstract:
Until recently, the use of biometrics was restricted to high-security environments and criminal identification applications, for economic and technological reasons. In recent years, however, biometric authentication has become part of people's daily lives. The large-scale use of biometrics has shown that users within a system may achieve different degrees of accuracy: some people may have trouble authenticating, while others may be particularly vulnerable to imitation. Recent studies have investigated and identified these types of users, giving them the names of animals: Sheep, Goats, Lambs, Wolves, Doves, Chameleons, Worms and Phantoms. The aim of this study is to evaluate the existence of these user types in a fingerprint database and to propose a new way of investigating them, based on verification performance between subjects' samples. After introducing some basic concepts of biometrics and fingerprints, we present the biometric menagerie and how to evaluate these user types.
Abstract:
Formation of hydrates is one of the major flow assurance problems faced by the oil and gas industry. Hydrates tend to form in natural gas pipelines in the presence of water under favorable temperature and pressure conditions, generally low temperatures and correspondingly high pressures. Agglomeration of hydrates can result in blockage of flowlines and equipment, which can be time-consuming to remove in subsea equipment and can cause safety issues. Natural gas pipelines plugged by hydrates are also more susceptible to bursting and explosion. Therefore, a rigorous risk assessment of hydrate formation is required, which assists in preventing hydrate blockage and ensuring equipment integrity. This thesis presents a novel methodology to assess the probability of hydrate formation and a risk-based approach to determine the parameters of winterization schemes that avoid hydrate formation in natural gas pipelines operating in Arctic conditions. It also presents a lab-scale multiphase flow loop to study the effects of geometric and hydrodynamic parameters on hydrate formation, and discusses their effects on the multiphase development length of a pipeline. This study therefore contributes substantially to assessing the probability of hydrate formation and to the decision-making process for winterization strategies that prevent hydrate formation in Arctic conditions.