996 results for Exponential smoothing methods


Relevance:

80.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Electrical Engineering, Energy branch

Relevance:

80.00%

Publisher:

Abstract:

In the last few years we have observed exponential growth in information systems, and parking information is one more example. Reliable, up-to-date information on parking slot availability is very important for the goal of traffic reduction. Parking slot prediction is a newer topic that has already begun to be applied; San Francisco in the United States and Santander in Spain are examples of projects carried out to obtain this kind of information. The aim of this thesis is the study and evaluation of methodologies for parking slot prediction and their integration into a web application, where all kinds of users can see both the current parking status and the future status predicted by the models. The source of the data is ancillary to this work, but it must still be understood in order to understand parking behaviour. Many modelling techniques are used for this purpose, such as time series analysis, decision trees, neural networks and clustering. This work explains the techniques best suited to the task, analyses the results and points out the advantages and disadvantages of each. The model learns the periodic and seasonal patterns of the parking status, and with this knowledge it can predict future status values for a given date. The data come from the Smart Park Ontinyent project and consist of parking occupancy status with timestamps, stored in a database. After data acquisition, data analysis and pre-processing were needed for the model implementations. The first test used a boosting ensemble classifier over a set of decision trees, created with the C5.0 algorithm from a set of training samples, to assign a prediction value to each object. In addition to the predictions, this work reports error measurements that indicate how reliable the predictions are. The second test used the TBATS seasonal exponential smoothing model (a sketch of this step follows below). The final test combined the two previous models to assess the effect of the combination. The results were quite good for all three, with average errors of 6.2, 6.6 and 5.4 vacancies respectively; for a car park of 47 places this means roughly a 10% average error in parking slot predictions, and the results could improve with longer data series. To make this kind of information visible and reachable by anyone with an Internet-connected device, a web application was built. Besides displaying the data, the application offers further functions that ease the task of searching for parking. Apart from parking prediction, the new functions were:
- Park distances from user location: the distances from the user's current location to the different car parks in the city.
- Geocoding: matching a literal description or an address to a concrete location.
- Geolocation: positioning the user.
- Parking list panel: not a service or a function as such, but a clearer visualization and easier handling of the information.
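
The seasonal exponential smoothing step described above can be approximated with off-the-shelf tools. The following is a minimal Python sketch using statsmodels' Holt-Winters exponential smoothing as a simplified stand-in for TBATS (which additionally handles multiple and non-integer seasonalities); the file name and column names are hypothetical.

```python
# Minimal sketch: seasonal exponential smoothing forecast of parking occupancy.
# Holt-Winters is a simplified stand-in for the TBATS model named in the
# abstract; file and column names are hypothetical.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

occupancy = (pd.read_csv("smart_park_ontinyent.csv", parse_dates=["timestamp"])
               .set_index("timestamp")["occupied"]
               .asfreq("h"))

# Daily seasonality for hourly data: 24 observations per cycle.
model = ExponentialSmoothing(occupancy, trend="add", seasonal="add",
                             seasonal_periods=24).fit()

forecast = model.forecast(24)   # occupancy for the next 24 hours
vacancies = 47 - forecast       # the car park has 47 places
```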

Relevance:

80.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.

Relevance:

80.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
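
The central device of the paper, replacing simple averaging with kernel smoothing over a long simulation, can be illustrated compactly. The sketch below uses a Gaussian Nadaraya-Watson weight; the simulator, bandwidth, and simulation length are all illustrative assumptions, not the paper's specification.

```python
# Sketch: a conditional moment E[y | x = x0] estimated by Nadaraya-Watson
# kernel smoothing over a long simulation at a trial parameter theta.
import numpy as np

def nw_conditional_mean(x_sim, y_sim, x0, h):
    """Gaussian-kernel Nadaraya-Watson estimate of E[y | x = x0]."""
    w = np.exp(-0.5 * ((x_sim - x0) / h) ** 2)
    return np.sum(w * y_sim) / np.sum(w)

def simulate_model(theta, S, rng=np.random.default_rng(0)):
    # Toy stand-in simulator (purely illustrative, not a DLV model from the paper).
    x = rng.normal(size=S)
    y = theta * x + rng.normal(scale=0.5, size=S)
    return x, y

def moment_condition(theta, x_obs, y_obs, h=0.1, S=100_000):
    x_sim, y_sim = simulate_model(theta, S)
    m = np.array([nw_conditional_mean(x_sim, y_sim, x, h) for x in x_obs])
    return np.mean(y_obs - m)   # fed to a standard GMM criterion
```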

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: The aim of this study was to determine whether V̇O2 kinetics, and specifically the time constants of transitions from rest to heavy (τpH) and severe (τpS) exercise intensities, are related to middle-distance swimming performance. DESIGN: Fourteen highly trained male swimmers (mean ± SD: 20.5 ± 3.0 yr; 75.4 ± 12.4 kg; 1.80 ± 0.07 m) performed a discontinuous incremental test, as well as square-wave transitions at heavy and severe swimming intensities, to determine V̇O2 kinetics parameters using two exponential functions. METHODS: All tests involved front-crawl swimming with breath-by-breath analysis using the Aquatrainer swimming snorkel. Endurance performance was recorded as the time taken to complete a 400 m freestyle swim in an official competition (T400), one month from the date of the other tests. RESULTS: T400 (mean ± SD: 251.4 ± 12.4 s) was significantly correlated with τpH (15.8 ± 4.8 s; r = 0.62; p = 0.02) and τpS (15.8 ± 4.7 s; r = 0.61; p = 0.02). The best single predictor of 400 m freestyle time, out of the variables assessed, was the velocity at V̇O2max (vV̇O2max), which accounted for 80% of the variation in performance between swimmers. However, τpH and V̇O2max were also found to influence the prediction of T400 when included in a regression model involving respiratory parameters only. CONCLUSIONS: Faster kinetics during the primary phase of the V̇O2 response is associated with better performance in middle-distance swimming. However, vV̇O2max appears to be a better predictor of T400.
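
As a rough illustration of how such exponential kinetics parameters are obtained, the sketch below fits the primary-phase mono-exponential (one component of the bi-exponential model mentioned above) with SciPy. The data are synthetic and the parameter values and starting guesses are assumptions.

```python
# Sketch: fitting VO2(t) = baseline + A * (1 - exp(-(t - td)/tau)) for t >= td,
# the primary phase of the on-transient. Breath-by-breath data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def primary_phase(t, baseline, A, td, tau):
    return baseline + A * (1 - np.exp(-np.maximum(t - td, 0) / tau))

t = np.arange(0, 360, 5.0)          # seconds (hypothetical sampling grid)
rng = np.random.default_rng(0)
vo2 = primary_phase(t, 0.5, 3.0, 15.0, 16.0) + rng.normal(0, 0.05, t.size)

params, _ = curve_fit(primary_phase, t, vo2, p0=[0.5, 3.0, 10.0, 20.0])
tau_p = params[3]                   # time constant analogous to tau_pH / tau_pS
```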

Relevance:

80.00%

Publisher:

Abstract:

In this thesis, the suitability of different trackers for finger tracking in high-speed videos was studied. The finger trajectories tracked from the videos were post-processed and analysed using various filtering and smoothing methods, and the position derivatives of the trajectories, speed and acceleration, were extracted for the purposes of hand motion analysis. Overall, two methods, Kernelized Correlation Filters and Spatio-Temporal Context Learning tracking, performed better than the others in the tests. Both achieved high accuracy on the selected high-speed videos and also allowed real-time processing, being able to process over 500 frames per second. In addition, the results showed that different filtering methods can be applied to produce more appropriate velocity and acceleration curves from the tracking data; Local Regression filtering and the Unscented Kalman Smoother gave the best results in the tests. The tracking and filtering methods studied are therefore suitable for high-speed hand tracking and trajectory-data post-processing.
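
For reference, one of the two best-performing trackers, KCF, ships with OpenCV. A minimal sketch follows, assuming opencv-contrib-python is installed (in some OpenCV 4.x builds the constructor lives under cv2.legacy instead); the video path and initial bounding box are hypothetical.

```python
# Sketch: KCF tracking of a finger region, collecting centre points that can
# later be filtered and differentiated into velocity/acceleration curves.
import cv2

cap = cv2.VideoCapture("high_speed_hand.avi")   # hypothetical video file
ok, frame = cap.read()

tracker = cv2.TrackerKCF_create()
tracker.init(frame, (100, 100, 40, 40))         # (x, y, w, h) around the finger

trajectory = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, (x, y, w, h) = tracker.update(frame)
    if found:
        trajectory.append((x + w / 2, y + h / 2))   # tracked centre point
```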

Relevance:

80.00%

Publisher:

Abstract:

This work, carried out at Hospital Méderi, is a consultancy on forecasting models. It consists of analysing a database of merchandise stored in the general warehouse, supplied by the institution, using four different forecasting methods: Weighted Moving Average, Simple Moving Average, Linear Regression and Exponential Smoothing. Based on the results produced by each forecast, a recommendation is made to the hospital as to which method it should use to predict demand most accurately.
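
Of the four methods compared, simple exponential smoothing is the easiest to state. A minimal sketch with invented demand figures and an illustrative smoothing constant:

```python
# Sketch: simple exponential smoothing, s_t = alpha*x_t + (1 - alpha)*s_{t-1};
# the last smoothed value is the forecast for the next period.
def exponential_smoothing(demand, alpha=0.3):
    s = [demand[0]]                      # initialize with the first observation
    for x in demand[1:]:
        s.append(alpha * x + (1 - alpha) * s[-1])
    return s

demand = [120, 132, 101, 134, 90, 110, 125]   # hypothetical units per month
print(exponential_smoothing(demand)[-1])      # next-period forecast
```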

Relevance:

80.00%

Publisher:

Abstract:

This work aims to verify whether the options market for Petrobras PN (PETR4) is inefficient in the weak form, that is, whether or not public information is reflected in asset prices. To this end, it attempts to earn systematic profits through a Delta-Gamma-Neutral strategy using the company's preferred stock and call options. This stock was chosen because its options were highly liquid throughout the period studied (01/10/2012 to 31/03/2013). For the study, the buy and sell orders sent for both the underlying asset and the options were considered, in order to reconstruct the real order book of every instrument at five-minute intervals. The strategy was triggered whenever distortions were observed between the implied volatility, computed with the Black & Scholes model, and the volatility computed by exponential smoothing (EWMA, Exponentially Weighted Moving Average). The results show that the Petrobras options market is not efficient in its weak form: of the 371 trades executed during the period, 85% were profitable, with an average return of 0.49% and an average trade duration of just under one hour and thirteen minutes.
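
For context, the EWMA volatility that the strategy compares against Black & Scholes implied volatility follows a one-line recursion. A sketch with invented five-minute prices; lambda = 0.94 is the common RiskMetrics choice and is an assumption here, not a value taken from the study.

```python
# Sketch: EWMA variance recursion, var_t = lam*var_{t-1} + (1 - lam)*r_t^2,
# over five-minute log returns. Prices and lambda are illustrative.
import numpy as np

def ewma_volatility(returns, lam=0.94):
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return np.sqrt(var)

prices = np.array([19.80, 19.85, 19.83, 19.90, 19.88])   # hypothetical PETR4
sigma = ewma_volatility(np.diff(np.log(prices)))
# Annualize before comparing with the implied volatility from Black & Scholes.
```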

Relevance:

80.00%

Publisher:

Abstract:

Within the regression framework, we show how different levels of nonlinearity influence the instantaneous firing rate prediction of single neurons. Nonlinearity can be achieved in several ways. In particular, we can enrich the predictor set with basis expansions of the input variables (enlarging the number of inputs) or train a simple but different model for each area of the data domain. Spline-based models are popular within the first category. Kernel smoothing methods fall into the second category. Whereas the first choice is useful for globally characterizing complex functions, the second is very handy for temporal data and is able to include inner-state subject variations. Also, interactions among stimuli are considered. We compare state-of-the-art firing rate prediction methods with some more sophisticated spline-based nonlinear methods: multivariate adaptive regression splines and sparse additive models. We also study the impact of kernel smoothing. Finally, we explore the combination of various local models in an incremental learning procedure. Our goal is to demonstrate that appropriate nonlinearity treatment can greatly improve the results. We test our hypothesis on both synthetic data and real neuronal recordings in cat primary visual cortex, giving a plausible explanation of the results from a biological perspective.
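
The two routes to nonlinearity mentioned above, global basis expansion versus local smoothing, can be contrasted briefly. Below is a sketch of the first route using scikit-learn's spline transformer (available since scikit-learn 1.0) on synthetic stimulus/firing-rate data; it is an illustration, not the paper's MARS or sparse additive model pipeline.

```python
# Sketch: enrich the predictor set with a cubic spline basis and fit a linear
# model on top (a global, spline-based nonlinearity). Data are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer

rng = np.random.default_rng(0)
stimulus = np.sort(rng.uniform(0, 10, 200)).reshape(-1, 1)
rate = np.sin(stimulus).ravel() + rng.normal(0, 0.2, 200)

spline_model = make_pipeline(SplineTransformer(n_knots=8, degree=3),
                             LinearRegression()).fit(stimulus, rate)
predicted_rate = spline_model.predict(stimulus)
```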

Relevance:

80.00%

Publisher:

Abstract:

This thesis describes and applies, in a novel way, the multivariate exponential smoothing technique to day-ahead forecasting of hourly electricity prices, a problem studied intensively in the recent statistics and economics literature. Certain interesting properties of multivariate exponential smoothing are demonstrated that drastically reduce the number of parameters needed to characterize the time series and that, at the same time, allow a dynamic factor analysis of the hourly electricity price series. In particular, this high-dimensional multivariate process of dimension 24 is estimated by decomposing it into a reduced number of independent univariate exponential smoothing processes, each characterized by a single smoothing parameter that varies between zero (white noise process) and one (random walk). To this end, the state space formulation is used for estimation, since it connects this sequence of more efficient univariate models to the multivariate model. In a novel way, the relations between the two models are obtained through simple algebraic manipulation, without applying the Kalman filter. This makes it possible to analyse and expose the ultimate drivers of electricity price dynamics. The practical side of the methodology is demonstrated by applying it to several electricity spot markets, namely Omel, Powernext and Nord Pool, where the evolution of hourly prices is characterized and forecasts are produced and compared with those of other forecasting techniques.
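
Each univariate component described above is a single-source-of-error local level process. A minimal sketch of its filtered recursion follows, with the smoothing parameter interpolating between white noise (alpha = 0) and a random walk (alpha = 1); the simulated series is purely illustrative, not one of the electricity price series from the thesis.

```python
# Sketch: SSOE local level filter, e_t = y_t - l_{t-1}, l_t = l_{t-1} + alpha*e_t.
import numpy as np

def local_level_filter(y, alpha):
    level = y[0]
    levels, errors = [level], [0.0]
    for obs in y[1:]:
        e = obs - level
        level += alpha * e
        levels.append(level)
        errors.append(e)
    return np.array(levels), np.array(errors)

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200))            # illustrative price-like series
levels, errors = local_level_filter(y, alpha=0.8)
```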

Relevance:

80.00%

Publisher:

Abstract:

Traffic flow time series data are usually high dimensional and very complex. They are also sometimes imprecise and distorted due to sensor malfunction during data collection. Additionally, events like congestion caused by traffic accidents add more uncertainty to real-time traffic conditions, making traffic flow forecasting a complicated task. This article presents a new data preprocessing method targeting multidimensional time series with a very high number of dimensions and shows its application to real traffic flow time series from the California Department of Transportation (PEMS web site). The proposed method consists of three main steps. First, based on mTESL, a language for defining events in multidimensional time series, we identify a number of event types in the series that correspond to either incorrect data or data with interference. Second, each event type is restored using an original method that combines real observations, locally forecasted values and historical data. Third, an exponential smoothing procedure is applied globally to eliminate noise interference and other random errors, so as to provide good-quality source data for future work.
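
The third step, the global exponential smoothing pass, is a one-liner in pandas. A sketch assuming a PEMS-style layout with one column of flow counts per detector; the file name and smoothing constant are assumptions.

```python
# Sketch: column-wise exponentially weighted smoothing of the restored flow
# data; smaller alpha means heavier smoothing. Layout and file are hypothetical.
import pandas as pd

flows = pd.read_csv("pems_flows.csv", index_col="timestamp", parse_dates=True)
smoothed = flows.ewm(alpha=0.3).mean()
```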

Relevance:

80.00%

Publisher:

Abstract:

The Remez penalty and smoothing algorithm (RPSALG) is a unified framework for penalty and smoothing methods for solving min-max convex semi-infinite programming problems, whose convergence was analyzed in a previous paper by three of the authors. In this paper we consider a partial implementation of RPSALG for solving ordinary convex semi-infinite programming problems. Each iteration of RPSALG involves two types of auxiliary optimization problems: the first consists of obtaining an approximate solution of some discretized convex problem, while the second requires solving a non-convex optimization problem whose objective function is the parametric constraint, with the parameter as the variable. In this paper we tackle the latter problem with a variant of the cutting angle method called ECAM, a global optimization procedure for solving Lipschitz programming problems. We implement different variants of RPSALG and compare them with the only publicly available SIP solver, NSIPS, on a battery of test problems.
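
The first auxiliary problem of each RPSALG iteration, solving a discretized convex problem, can be illustrated on a toy convex SIP. A sketch with SciPy; the objective, constraint and grid below are illustrative, not one of the paper's test instances.

```python
# Sketch: discretize the index set T = [0, 1] on a grid and solve the resulting
# finitely constrained convex problem: min x1^2 + x2^2 subject to
# t*x1 + (1 - t)*x2 >= 1 for all t in the grid; the solution approaches (1, 1).
import numpy as np
from scipy.optimize import minimize

T_grid = np.linspace(0.0, 1.0, 101)

def f(x):
    return x[0] ** 2 + x[1] ** 2

cons = [{"type": "ineq", "fun": lambda x, t=t: t * x[0] + (1 - t) * x[1] - 1.0}
        for t in T_grid]

res = minimize(f, x0=np.array([2.0, 2.0]), constraints=cons)
print(res.x)   # approximately [1.0, 1.0]
```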

Relevance:

80.00%

Publisher:

Abstract:

Technology changes rapidly over the years, continuously providing more computing options and making economic and other transactions easier. However, the introduction of new technology pushes old Information and Communication Technology (ICT) products out of use. E-waste is defined here as the quantity of ICT products no longer in use; it is a bivariate function of the quantities sold and the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries in these regions in order to compute obsolete computer quantities. To provide robust results for the forecasted quantities, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, was applied, selecting for each country the model that gave the best results in terms of minimum error indices (Mean Absolute Error and Mean Squared Error) for the in-sample estimation. Because new technology does not diffuse through all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution, which provides the probability that a certain quantity of computers will be considered obsolete, is not adequately modeled in the literature. The time horizon for the forecasted quantities is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, due to decreasing computer lifespans and increasing sales.
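
Of the seven models compared, the Bass diffusion model illustrates the fitting procedure well. A sketch with SciPy on synthetic cumulative sales; the figures, starting values and noise level are invented, not data from the paper.

```python
# Sketch: least-squares fit of the Bass model for cumulative adoption,
# F(t) = m*(1 - exp(-(p+q)*t)) / (1 + (q/p)*exp(-(p+q)*t)), where m is market
# potential and p, q the innovation and imitation coefficients; then in-sample MAE.
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

t = np.arange(1, 16, dtype=float)                     # years on the market
rng = np.random.default_rng(1)
sales = bass_cumulative(t, 100.0, 0.03, 0.4) + rng.normal(0, 1.0, t.size)

(m, p, q), _ = curve_fit(bass_cumulative, t, sales, p0=[80.0, 0.01, 0.3])
mae = np.mean(np.abs(sales - bass_cumulative(t, m, p, q)))   # Mean Absolute Error
```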