977 results for SERIES MODELS


Relevance: 30.00%

Abstract:

One of the most complex and most necessary topics in Operations Management courses is forecasting with time series models (TSM). To make demand forecasting easier for students to grasp, this project presents FOR TSM, a tool developed in MS Excel VBA®. The tool was designed with a Graphical User Interface (GUI) to explain fundamental concepts such as parameter selection, initialization values, the computation and analysis of performance measures, and finally model selection.
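The concepts the tool covers — parameter choice, initialization, performance measures, and model selection — can be sketched outside Excel as well. A minimal Python illustration with simple exponential smoothing (the demand values and candidate smoothing constants are made up for the example):

```python
# Simple exponential smoothing: F_{t+1} = alpha * D_t + (1 - alpha) * F_t
def ses_forecasts(demand, alpha):
    forecasts = [demand[0]]                  # initialization: first observation
    for d in demand[:-1]:
        forecasts.append(alpha * d + (1 - alpha) * forecasts[-1])
    return forecasts

def mad(demand, forecasts):                  # performance measure: Mean Absolute Deviation
    return sum(abs(d - f) for d, f in zip(demand, forecasts)) / len(demand)

demand = [100, 105, 98, 110, 102, 108, 115, 109]   # hypothetical monthly demand
# Model selection: keep the smoothing constant with the lowest MAD
best_alpha = min([0.1, 0.3, 0.5, 0.7, 0.9],
                 key=lambda a: mad(demand, ses_forecasts(demand, a)))
print(best_alpha)
```

The same loop over candidate parameters is what a GUI-driven tool automates, with the performance measures displayed per candidate.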

Relevance: 30.00%

Abstract:

In this paper we use the series of the Representative Market Exchange Rate (Tasa de Cambio Representativa de Mercado) and the General Index of the Colombian Stock Exchange to illustrate four stylized facts well known in the financial literature: i) price series follow a random walk; ii) the distribution of returns is leptokurtic and exhibits heavy tails; iii) as returns are computed over longer horizons, their distribution moves closer to the normal distribution; and iv) returns exhibit volatility clustering.
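Facts ii) and iii) can be reproduced even on simulated data. A sketch using NumPy/SciPy, where heavy-tailed Student-t innovations stand in for the real market returns the abstract works with:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(42)
# Daily "returns" with heavy tails: Student-t with 5 degrees of freedom
daily = rng.standard_t(df=5, size=200_000)

# Fact ii): excess kurtosis well above 0, the normal benchmark
k_daily = kurtosis(daily)                    # Fisher definition: normal -> 0

# Fact iii): aggregating over 20-day periods moves returns toward normality
monthly = daily.reshape(-1, 20).sum(axis=1)
k_monthly = kurtosis(monthly)
print(k_daily, k_monthly)                    # kurtosis shrinks with aggregation
```

For i.i.d. returns the excess kurtosis of an n-period sum falls by a factor of n, which is why longer-horizon returns look closer to Gaussian.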

Relevance: 30.00%

Abstract:

The study of forest fire activity, in its several aspects, is essential to understand the phenomenon and to prevent environmental public catastrophes. In this context, the analysis of the monthly number of fires across several years is one aspect to take into account in order to better comprehend this topic. The goal of this work is to analyze the monthly number of forest fires in the neighboring districts of Aveiro and Coimbra, Portugal, through dynamic factor models for bivariate count series. We use a Bayesian approach, through MCMC methods, to estimate the model parameters as well as the latent factor common to both series.
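The data-generating structure behind such a model can be simulated in a few lines. This sketch assumes one common specification — Poisson counts with a log link and an AR(1) common factor — which may differ from the exact model in the study; all parameter values are illustrative, and the actual study estimates them by MCMC:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 120                                  # months of observation

# Common latent factor with AR(1) dynamics (assumed form)
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.7 * f[t - 1] + rng.normal(scale=0.5)

# Bivariate monthly counts (e.g. fires in two districts) sharing the factor
alpha = np.array([2.0, 1.5])             # district-specific baselines (log scale)
lam = np.array([1.0, 0.8])               # district-specific factor loadings
counts = rng.poisson(np.exp(alpha[None, :] + np.outer(f, lam)))
print(counts.shape)                      # one column of counts per district
```

The shared factor f induces correlation between the two count series, which is exactly what the bivariate dynamic factor model exploits.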

Relevance: 30.00%

Abstract:

The ability to estimate the impact of ongoing climate change on the hydrological behaviour of hydro-systems is essential for anticipating the inevitable and necessary adaptations our societies must consider. In this context, this doctoral project evaluates the sensitivity of future hydrological projections to: (i) the lack of robustness in the identification of hydrological model parameters, (ii) the use of multiple equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods, and the resulting outputs are analysed in calibration and in validation following the four configurations of the differential split-sample test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, related to parameter equifinality, the outputs associated with equifinal parameter sets are then considered for each calibration. Finally, to assess the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT and HYDROTEL) are applied to the Au Saumon River watershed in Québec. The three sources of uncertainty are evaluated both under past observed climate conditions and under future climate conditions. The results show that, given the evaluation method followed in this doctoral work, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climate conditions.
This is followed by the lack of robustness in parameter identification. The hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. Consequently, more effort should be invested in improving model robustness for climate change impact studies, in particular by developing more appropriate model structures and by proposing calibration procedures that increase their robustness. This work provides a detailed answer regarding our ability to diagnose the impacts of climate change on the water resources of the Au Saumon basin, and proposes an original methodological approach that can be directly applied or adapted to other hydro-climatic contexts.

Relevance: 30.00%

Abstract:

In this chapter, four combinations of input features and the feedforward, cascade-forward and recurrent architectures are compared on the task of forecasting a tourism time series. The input features of the ANNs consist of the previous 12 months, a time index modeled by two nodes for the year and the month, and one input with the daily hours of sunshine (insolation duration). The time-index features, combined with the previous twelve values of the time series, proved their relevance for this forecasting task. The insolation variable can improve results with some architectures, namely the cascade-forward one. The ANN models tested produced a mean absolute percentage error between 4 and 6%, demonstrating their ability to forecast this time series. Moreover, the feedforward architecture performed best on the validation and test sets, with a 4.2% percentage error on the test set.
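The reported error measure, the mean absolute percentage error (MAPE), is computed as follows (the series values here are made up purely to show the calculation):

```python
def mape(actual, forecast):
    # Mean Absolute Percentage Error, in percent
    return 100 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Each forecast here is off by 4% of the actual value, so MAPE is 4%
print(mape([100, 200, 300], [104, 192, 312]))   # -> 4.0 (up to float rounding)
```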

Relevance: 30.00%

Abstract:

In this study, the Schwarz Information Criterion (SIC) is applied to detect change-points in time series of surface water quality variables. Change-point analysis detected change-points in both the mean and the variance of the series under study. Time variations in environmental data are complex, and they can hinder the identification of change-points when traditional models are applied to this type of problem. The assumptions of normality and absence of autocorrelation do not hold for some of the time series, so a simulation study is carried out to evaluate the methodology's performance when applied to non-normal and/or temporally correlated data.
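For a change in the mean, the SIC comparison can be sketched in a few lines. This is a minimal version of the criterion (the additive constant n(log 2π + 1) is the same in both models and is omitted); the simulated series and shift location are illustrative, not data from the study:

```python
import numpy as np

def sic_no_change(x):
    n = len(x)
    return n * np.log(np.var(x)) + 2 * np.log(n)          # 2 params: mean, variance

def sic_change(x, k):
    n = len(x)
    # Pooled variance with separate means before/after candidate point k
    resid = np.concatenate([x[:k] - x[:k].mean(), x[k:] - x[k:].mean()])
    return n * np.log(np.mean(resid ** 2)) + 3 * np.log(n)  # 3 params: 2 means, variance

def detect_change(x):
    n = len(x)
    k_hat = min(range(2, n - 1), key=lambda k: sic_change(x, k))
    # Declare a change-point only if it lowers the SIC vs. the no-change model
    return k_hat if sic_change(x, k_hat) < sic_no_change(x) else None

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 60), rng.normal(2, 1, 60)])  # shift at t = 60
print(detect_change(x))
```

Changes in the variance are handled analogously, by allowing separate variance estimates on each segment.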

Relevance: 30.00%

Abstract:

A comprehensive database of temperature, salinity and biochemical parameters in the Mediterranean and Black Sea has been constructed through extensive cooperation between the bordering countries. Statistical climatologies have been computed from all assembled and quality-controlled data. The database, designed to initialize and validate prediction models, also serves as a system to quality-check new incoming data produced by ocean observing systems.

Relevance: 30.00%

Abstract:

Energy Conservation Measure (ECM) project selection is made difficult by real-world constraints: limited resources to implement savings retrofits, various suppliers in the market, and alternative project financing. Many of these energy-efficient retrofit projects should be viewed as a series of investments with annual returns for these traditionally risk-averse agencies. Given a list of available ECMs, federal, state and local agencies must determine how to implement projects at the lowest cost. The most common methods of implementation planning are suboptimal with respect to cost. Federal, state and local agencies can obtain greater returns on their energy conservation investment than under traditional methods, regardless of the implementing organization. This dissertation outlines several approaches to improve the traditional energy conservation models. Public buildings in regions with similar energy conservation goals, in the United States or internationally, can also benefit greatly from this research. Additionally, many private owners of buildings are under mandates to conserve energy; for example, Local Law 85 of the New York City Energy Conservation Code requires any building, public or private, to meet the most current energy code upon any alteration or renovation. Thus, both public and private stakeholders can benefit from this research. The research in this dissertation advances and presents models that decision-makers can use to optimize the selection of ECM projects with respect to the total cost of implementation. A practical application of a two-level mathematical program with equilibrium constraints (MPEC) improves the current best practice for agencies seeking the most cost-effective selection while leveraging energy services companies or utilities. The two-level model maximizes savings to the agency and profit to the energy services companies (Chapter 2).
An additional model leverages a single congressional appropriation to implement ECM projects (Chapter 3). Returns from implemented ECM projects are used to fund further ECM projects. In these cases, fluctuations in energy costs and uncertainty in the estimated savings strongly influence ECM project selection and the size of the appropriation requested. A proposed risk-aversion method imposes a minimum on the number of projects completed in each stage; a comparative method using Conditional Value at Risk is also analyzed, and time consistency is addressed. This work demonstrates how a risk-based, stochastic, multi-stage model with binary decision variables at each stage provides a much more accurate estimate for planning than the agency's traditional approach and deterministic models. Finally, in Chapter 4, a rolling-horizon model allows for subadditivity and superadditivity of the energy savings to simulate interactive effects between ECM projects. The approach uses inequalities (McCormick, 1976) to re-express constraints involving products of binary variables as an exact linearization (related to the convex hull of those constraints). This model additionally shows the benefits of learning between stages while remaining consistent with the single-appropriation framework.
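The linearization mentioned for Chapter 4 replaces a product of binary variables z = x·y with four linear inequalities. A quick standalone check (not the dissertation's model) that the McCormick inequalities are exact when the variables are binary:

```python
from itertools import product

def mccormick_feasible(x, y, z):
    # McCormick envelope for z = x * y with x, y, z in [0, 1]
    return z <= x and z <= y and z >= x + y - 1 and z >= 0

# For binary x, y the only feasible binary z is exactly x * y,
# so the product constraint can be written linearly with no approximation.
for x, y in product((0, 1), repeat=2):
    feasible = [z for z in (0, 1) if mccormick_feasible(x, y, z)]
    assert feasible == [x * y]
print("McCormick inequalities are exact on binaries")
```

This exactness is what lets a mixed-integer solver handle the interactive-savings terms without nonlinear machinery.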

Relevance: 30.00%

Abstract:

For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative assessment of possible, alternative actions. Although the degree of uncertainty associated with CDF estimation could influence decisions, such information is rarely provided. Hence, we propose Cox-type regression models (CRMs) as a statistical framework for making inferences on CDFs in climate science. CRMs were designed for modelling probability distributions rather than just mean or median values. This makes the approach appealing for risk assessments where probabilities of extremes are often more informative than central tendency measures. CRMs are semi-parametric approaches originally designed for modelling risks arising from time-to-event data. Here we extend this original concept beyond time-dependent measures to other variables of interest. We also provide tools for estimating CDFs and surrounding uncertainty envelopes from empirical data. These statistical techniques intrinsically account for non-stationarities in time series that might be the result of climate change. This feature makes CRMs attractive candidates to investigate the feasibility of developing rigorous global circulation model (GCM)-CRM interfaces for provision of user-relevant forecasts. To demonstrate the applicability of CRMs, we present two examples for El Niño/Southern Oscillation (ENSO)-based forecasts: the onset date of the wet season (Cairns, Australia) and total wet season rainfall (Quixeramobim, Brazil). This study emphasises the methodological aspects of CRMs rather than discussing merits or limitations of the ENSO-based predictors.
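While the CRM machinery itself is involved, the basic object it delivers — a CDF estimate with a surrounding uncertainty envelope — can be sketched directly. Here a distribution-free Dvoretzky-Kiefer-Wolfowitz (DKW) band is used as a simpler stand-in for the model-based envelopes in the study, on simulated rainfall-like data:

```python
import numpy as np

def ecdf_with_band(sample, alpha=0.05):
    """Empirical CDF plus a (1 - alpha) DKW confidence band."""
    x = np.sort(sample)
    n = len(x)
    cdf = np.arange(1, n + 1) / n
    eps = np.sqrt(np.log(2 / alpha) / (2 * n))     # DKW half-width
    return x, cdf, np.clip(cdf - eps, 0, 1), np.clip(cdf + eps, 0, 1)

rng = np.random.default_rng(7)
rain = rng.gamma(shape=2.0, scale=50.0, size=500)  # hypothetical wet-season totals, mm
x, cdf, lo, hi = ecdf_with_band(rain)
print(cdf[0], cdf[-1])                             # CDF runs from 1/n up to 1
```

Reading off P(rainfall ≤ threshold) with its band is then a lookup in these arrays, which is the kind of quantitative comparison of actions the abstract describes.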

Relevance: 30.00%

Abstract:

Spatio-temporal modelling is an area of increasing importance in which models and methods have often been developed to deal with specific applications. In this study, a spatio-temporal model was used to estimate daily rainfall data. Rainfall records from several weather stations, obtained from the Agritempo system for two climatically homogeneous zones, were used. Rainfall values estimated for two fixed dates (January 1 and May 1, 2012) using the spatio-temporal model were compared with the geostatistical techniques of ordinary kriging and ordinary cokriging with altitude as an auxiliary variable. The spatio-temporal model was more than 17% better than kriging and cokriging at estimating daily precipitation in the first zone, and more than 18% better in the second zone. The spatio-temporal model proved to be a versatile technique, adapting to different seasons and dates.

Relevance: 30.00%

Abstract:

United States federal agencies assess flood risk using Bulletin 17B procedures, which assume that annual maximum flood series are stationary. This is a significant limitation of current flood frequency models, as the flood distribution is thereby assumed to be unaffected by trends or periodicity in atmospheric/climatic variables and/or by anthropogenic activities. The validity of this assumption is at the core of this thesis, which aims to improve understanding of the forms and potential causes of non-stationarity in flood series for moderately impaired watersheds in the Upper Midwest and Northeastern US. Prior studies investigated non-stationarity in flood series for unimpaired watersheds; however, since the majority of streams are located in areas of increasing human activity, the relative and coupled impacts of natural and anthropogenic factors need to be considered so that non-stationary flood frequency models can be developed for flood risk forecasting over planning horizons relevant to large-scale water resources planning and management.

Relevance: 30.00%

Abstract:

Suppose two or more variables are jointly normally distributed. If there is a common relationship between these variables, it is important to quantify it by the correlation coefficient, which measures its strength; this parameter can be used to develop a predictive equation and, ultimately, to draw testable conclusions about the parent population. This research focused on the correlation coefficient ρ for the bivariate and trivariate normal distributions when equal variances and equal covariances are assumed. In particular, we derived the maximum likelihood estimators (MLEs) of the distribution parameters, assuming all of them are unknown, and we studied the properties and asymptotic distribution of the MLE ρ̂. Having established this asymptotic normality, we constructed confidence intervals for the correlation coefficient ρ and tested hypotheses about ρ. Through a series of simulations, the performance of our new estimators was studied and compared with estimators that already exist in the literature. The results indicated that the MLE performs better than, or comparably to, the others.
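In the bivariate case with a common variance, the constrained MLE of ρ has a simple closed form: the pooled estimator 2·Sxy / (Sxx + Syy). A simulation check of that estimator (shown as an illustration of the constrained model, not as the thesis's derivation; the parameter values are made up):

```python
import numpy as np

def rho_mle_equal_var(x, y):
    # MLE of rho when the two components share a common variance:
    # pooled cross-product over the pooled sums of squares
    dx, dy = x - x.mean(), y - y.mean()
    return 2 * np.sum(dx * dy) / (np.sum(dx ** 2) + np.sum(dy ** 2))

rng = np.random.default_rng(3)
rho = 0.6
cov = [[1.0, rho], [rho, 1.0]]           # equal variances, correlation rho
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=50_000).T
print(rho_mle_equal_var(x, y))           # close to the true rho = 0.6
```

With a large sample the estimate concentrates near the true ρ, consistent with the asymptotic normality the abstract establishes.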

Relevance: 30.00%

Abstract:

This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field has regained popularity over the last few years and is still undergoing, like statistical analysis in general, a transformation toward high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions, which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases.
We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. Particularly, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series, close to the non-stationary case, the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
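The classical scalar CUSUM statistic that this theory generalizes can be sketched in a few lines — a plain illustration of the »change in the mean« test on simulated data, not the thesis's high-dimensional Hilbert space version:

```python
import numpy as np

def cusum_statistic(x):
    """max_k |S_k - (k/n) S_n| / (sigma_hat * sqrt(n)), the classical CUSUM."""
    n = len(x)
    s = np.cumsum(x)
    k = np.arange(1, n + 1)
    return np.max(np.abs(s - k / n * s[-1])) / (x.std(ddof=1) * np.sqrt(n))

rng = np.random.default_rng(2)
no_change = rng.normal(0, 1, 200)                       # null: constant mean
with_shift = np.concatenate([rng.normal(0, 1, 100),
                             rng.normal(1.5, 1, 100)])  # mean shift at t = 100
print(cusum_statistic(no_change), cusum_statistic(with_shift))
```

Under the null the statistic behaves like the supremum of a Brownian bridge (rarely above 2), while a mean shift inflates it sharply; the thesis develops the analogous limit theory when each observation is itself an element of a Hilbert space.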