27 results for Previsão Estatística
at the Universidade Federal do Rio Grande do Norte (UFRN)
Resumo:
In recent decades, the study of integer-valued time series has gained prominence due to its broad applicability (modeling the number of car accidents on a given highway, or the number of people infected by a virus, are two examples). One of the main interests in this area is forecasting, so it is important to propose methods that produce forecasts consisting of nonnegative integer values, in keeping with the discrete nature of the data. In this work, we focus on the study and proposal of forecasts one, two, and h steps ahead for integer-valued second-order autoregressive conditional heteroskedasticity [INARCH(2)] processes, and on determining some theoretical properties of this model, such as the ordinary moments of its marginal distribution and the asymptotic distribution of its conditional least squares estimators. In addition, we study, via Monte Carlo simulation, the behavior of the estimators of the INARCH(2) parameters obtained by three different methods (Yule-Walker, conditional least squares, and conditional maximum likelihood), in terms of mean squared error, mean absolute error, and bias. We present some forecast proposals for INARCH(2) processes, which are compared, again via Monte Carlo simulation. As an application of the proposed theory, we model a dataset on the number of live male births to mothers living in the city of Riachuelo, in the state of Rio Grande do Norte, Brazil.
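A minimal sketch of the forecasting idea described above, under the standard INARCH(2) specification X_t | past ~ Poisson(a0 + a1*X_{t-1} + a2*X_{t-2}); the parameter values, the plug-in recursion for h steps, and the rounding rule used to obtain integer forecasts are illustrative assumptions, not the dissertation's exact proposals.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_inarch2(n, a0, a1, a2, burn=200):
    """Simulate X_t | past ~ Poisson(a0 + a1*X_{t-1} + a2*X_{t-2})."""
    x = np.zeros(n + burn, dtype=int)
    for t in range(2, n + burn):
        lam = a0 + a1 * x[t - 1] + a2 * x[t - 2]
        x[t] = rng.poisson(lam)
    return x[burn:]

def forecast_inarch2(x, a0, a1, a2, h):
    """Recursive conditional means E[X_{T+k} | F_T], k = 1..h."""
    hist = [x[-2], x[-1]]
    out = []
    for _ in range(h):
        lam = a0 + a1 * hist[-1] + a2 * hist[-2]
        out.append(lam)
        hist.append(lam)  # plug-in: for Poisson, E[X | lambda] = lambda
    return np.array(out)

x = simulate_inarch2(500, a0=2.0, a1=0.3, a2=0.2)
means = forecast_inarch2(x, 2.0, 0.3, 0.2, h=5)
print(np.round(means).astype(int))  # rounding yields integer-valued forecasts
```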
Resumo:
Forecasting is the basis for strategic, tactical, and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. There are many methods to assist in time series forecasting; however, conventional techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies on more advanced prediction methods. Among these, Artificial Neural Networks (ANNs) are a relatively new and promising method for business forecasting that has attracted much interest in the financial community and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to ARIMA-GARCH statistical models. In this context, this study examined whether ANNs are a more appropriate method for predicting the behavior of capital market indices than traditional methods of time series analysis. To that end, we developed a quantitative study based on financial economic indices and built two supervised-learning feedforward ANN models, whose structures consisted of 20 inputs, 90 neurons in one hidden layer, and one output (Ibovespa). These models used backpropagation, a hyperbolic tangent sigmoid activation function in the hidden layer, and a linear output function. To analyze how well the ANN method predicts the Ibovespa, we compared its results with those of a GARCH(1,1) time series model. Once both methods (ANN and GARCH) were applied, we analyzed the results by comparing the forecasts with the historical data and by studying the forecast errors through MSE, RMSE, MAE, standard deviation, Theil's U, and forecast encompassing tests. The models developed with ANNs had lower MSE, RMSE, and MAE than the GARCH(1,1) model, and Theil's U indicated that all three models have smaller errors than a naïve forecast. Although the ANN based on returns had lower precision indicators than the ANN based on prices, the forecast encompassing test rejected the hypothesis that one model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide a more appropriate Ibovespa forecast than traditional time series models, represented here by the GARCH model.
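A minimal sketch of the error metrics used in the comparison above: MSE, RMSE, MAE, and one common variant of Theil's U (model RMSE relative to a naïve no-change forecast). The series below is a synthetic stand-in; real Ibovespa data would replace it.

```python
import numpy as np

def forecast_errors(actual, forecast):
    e = np.asarray(actual) - np.asarray(forecast)
    mse = np.mean(e ** 2)
    return {"MSE": mse, "RMSE": np.sqrt(mse), "MAE": np.mean(np.abs(e))}

def theil_u(actual, forecast):
    """U < 1 means the model beats the naive forecast y_hat_t = y_{t-1}."""
    actual = np.asarray(actual, dtype=float)
    model_rmse = np.sqrt(np.mean((actual[1:] - np.asarray(forecast)[1:]) ** 2))
    naive_rmse = np.sqrt(np.mean((actual[1:] - actual[:-1]) ** 2))
    return model_rmse / naive_rmse

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200)) + 50_000  # stand-in for index levels
f = y + rng.normal(scale=0.5, size=200)       # stand-in for model forecasts
print(forecast_errors(y, f), theil_u(y, f))
```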
Volatility analysis, price integration, and predictability for the Brazilian shrimp market
Resumo:
This paper investigates the dynamics of the volatility structure of shrimp prices in the Brazilian fish market. First, the initial aspects of the shrimp price series are described. From this information, statistical tests were performed and univariate models were selected as price predictors. We then verified whether a long-run equilibrium relationship exists between Brazilian and American imported shrimp and, if so, whether there is a causal link between these assets, considering that the two countries have maintained trade relations over the years. The work is an exploratory study of an applied nature with a quantitative approach. The data were collected through direct contact with the Companhia de Entrepostos e Armazéns Gerais de São Paulo (CEAGESP) and from the official American import website of the National Marine Fisheries Service - National Oceanic and Atmospheric Administration (NMFS-NOAA). The results showed that the large variability in the asset's price is directly related to the gains and losses of market agents. The price series shows strong seasonal and biannual effects. The average shrimp price over the last 12 years was R$ 11.58, and external factors beyond production and marketing (U.S. antidumping measures, floods, and pathologies) strongly affected prices. Among the models tested for predicting shrimp prices, four were selected that proved statistically more robust under one-step-ahead forecasting over a 12-period horizon. We found weak evidence of long-run equilibrium between Brazilian and American shrimp prices and, equivalently, no causal link between them. We conclude that the price dynamics of the shrimp commodity are strongly influenced by external production factors and that these phenomena cause seasonal effects in prices. There is no long-run stability relationship between Brazilian and American shrimp prices, although Brazil imports production inputs from the USA, which indicates some productive dependence. For market agents, the risk of interference from external prices cointegrated with Brazilian prices is practically nonexistent. Through statistical modeling it is possible to minimize the risk and uncertainty embedded in the fish market, so that sales and marketing strategies for Brazilian shrimp can be consolidated and disseminated.
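A minimal sketch of the long-run equilibrium and causality checks described above, using the Engle-Granger cointegration test and Granger causality tests from statsmodels; the two price series here are simulated stand-ins for the CEAGESP and NMFS-NOAA data.

```python
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(1)
brazil = np.cumsum(rng.normal(size=144)) + 11.58  # monthly BR shrimp price
usa = np.cumsum(rng.normal(size=144)) + 10.0      # monthly US import price

t_stat, p_value, _ = coint(brazil, usa)           # H0: no cointegration
print(f"Engle-Granger p-value: {p_value:.3f}")

# Granger causality: does the second column help predict the first?
data = np.column_stack([brazil, usa])
grangercausalitytests(data, maxlag=4)
```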
Resumo:
Waterflooding is a technique widely applied in the oil industry. The injected water displaces oil toward the producer wells and prevents reservoir pressure decline. However, suspended particles in the injected water may plug pore throats, causing formation damage (permeability reduction) and injectivity decline during waterflooding. When injectivity declines, the injection pressure must be increased in order to maintain the injection flow rate. A reliable prediction of injectivity decline is therefore essential in waterflooding projects. In this dissertation, a simulator based on the traditional porous-medium filtration model (including deep bed filtration and external filter cake formation) was developed and applied to predict injectivity decline in perforated wells from history data. Experimental modeling and injectivity decline in open-hole wells are also discussed. The injectivity modeling showed good agreement with field data and can be used to support stimulation planning for injection wells.
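A minimal sketch of one way history data can be matched under the traditional model named above, assuming impedance (the inverse normalized injectivity index) grows roughly linearly in pore volumes injected (PVI), with a slope change at the transition from deep bed filtration to external cake formation. The data, the breakpoint search, and the two-slope form are illustrative assumptions, not the simulator developed in the dissertation.

```python
import numpy as np

def fit_two_slopes(pvi, impedance):
    """Search the breakpoint minimizing the total squared residuals of
    two independent linear fits; returns (breakpoint, fit1, fit2)."""
    best = None
    for i in range(3, len(pvi) - 3):
        c1 = np.polyfit(pvi[:i], impedance[:i], 1)
        c2 = np.polyfit(pvi[i:], impedance[i:], 1)
        res = (np.sum((np.polyval(c1, pvi[:i]) - impedance[:i]) ** 2)
               + np.sum((np.polyval(c2, pvi[i:]) - impedance[i:]) ** 2))
        if best is None or res < best[0]:
            best = (res, pvi[i], c1, c2)
    return best[1:]

rng = np.random.default_rng(7)
pvi = np.linspace(0, 200, 80)
true = np.where(pvi < 120, 1 + 0.01 * pvi, 2.2 + 0.05 * (pvi - 120))
imp = true + rng.normal(scale=0.05, size=pvi.size)  # noisy history data
bp, deep_bed, cake = fit_two_slopes(pvi, imp)
print(f"transition near PVI = {bp:.0f}, cake-regime slope = {cake[0]:.3f}")
```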
Resumo:
In this work we analyze the statistical distribution of skin bioimpedance. We focus on two distinct samples: the statistics of impedance at several points on the skin of a single individual, and the statistics over a population (many individuals) at a single skin point. The impedance data were obtained from the literature (Pearson, 2007). Using the Shapiro-Wilk test and an asymmetry test, we conclude that the impedance over a population is better described by an asymmetric, non-normal distribution. On the other hand, the data on individual impedance appear to follow a normal distribution. We performed a goodness-of-fit test, and the distribution that best fits the population data is the log-normal. Interestingly, our result for skin impedance agrees with results for body impedance from the electrical engineering literature. Our results have an impact on the statistical planning and modeling of skin impedance experiments; special attention should be paid to the treatment of outliers in this kind of dataset. The results of this work are important to the general discussion of the low impedance of acupuncture points, and to the problem of skin biopotentials used in equipment such as Electrodermal Screening Tests.
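A minimal sketch of the tests named above applied to an impedance sample: Shapiro-Wilk for normality, a skewness test for asymmetry, and a Kolmogorov-Smirnov goodness-of-fit check of the log-normal candidate. The sample is synthetic; Pearson's (2007) published data would take its place.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
population_z = rng.lognormal(mean=4.0, sigma=0.5, size=100)  # impedance sample

print(stats.shapiro(population_z))   # small p-value -> reject normality
print(stats.skewtest(population_z))  # H0: skewness consistent with a normal

# Fit a log-normal and test the fit with Kolmogorov-Smirnov.
shape, loc, scale = stats.lognorm.fit(population_z, floc=0)
print(stats.kstest(population_z, "lognorm", args=(shape, loc, scale)))
```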
Resumo:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that these problems cannot be solved in polynomial time. Initially, these solutions focused on heuristics; currently, metaheuristics are used more often for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an "Operon" heuristic for building the information chains necessary for implementing transgenetic (evolutionary) algorithms, mainly using statistical methodology (Cluster Analysis and Principal Component Analysis); and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is the application target of a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum threshold on the coefficient of variation of the individuals' fitness function, calculated over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses Logistic Regression, based on the probability that the algorithm under test finds an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is reached. The third uses a non-parametric Analysis of Variance on the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one Euclidean TSP instances with up to 1,655 cities. The first two experiments deal with tuning four parameters of the ProtoG algorithm in an attempt to improve its performance. The last four evaluate the performance of ProtoG in comparison with the three other algorithms. For these sixty-one instances, statistical tests provide evidence that ProtoG outperforms the three other algorithms on fifty instances. In addition, for the thirty-six instances considered in the last three experiments, in which performance was evaluated through PES, the average PES obtained with ProtoG was below 1% in almost half of the instances, reaching its largest value, 3.52%, on an instance with 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since average PES values greater than 10% are not rare in the literature for instances of this size.
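A minimal sketch of two of the evaluation steps described above: the Percent Error of the Solution (PES) and a non-parametric analysis of variance (here, Kruskal-Wallis) comparing algorithms. The tour costs are invented stand-ins for runs of ProtoG, the memetic algorithms, and Simulated Annealing on a single TSP instance.

```python
import numpy as np
from scipy import stats

def pes(found_cost, best_known):
    """Percentage by which a solution exceeds the best known one."""
    return 100.0 * (found_cost - best_known) / best_known

best_known = 56_892  # optimum of a hypothetical TSP instance
rng = np.random.default_rng(5)
runs = {name: best_known * (1 + rng.uniform(0, hi, size=20))
        for name, hi in [("ProtoG", 0.01), ("memetic1", 0.03),
                         ("memetic2", 0.04), ("SA", 0.06)]}
pes_by_alg = {k: pes(v, best_known) for k, v in runs.items()}

# H0: all algorithms draw PES from the same distribution.
print(stats.kruskal(*pes_by_alg.values()))
```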
Resumo:
Vehicles are the main mobile sources of carbon monoxide (CO) and unburned hydrocarbons (HC) released into the atmosphere. In recent years, the growth of the vehicle fleet in the municipality of Natal-RN has contributed to the increase in emissions of these pollutants. The study consisted of a statistical analysis of CO and HC emissions from a sample of 384 vehicles fueled by Gasoline/CNG or Alcohol/Gasoline/CNG in the municipality of Natal-RN. The tests were performed on vehicles undergoing vehicle safety inspection at the facilities of INSPETRANS, a vehicle inspection body. A partial gas analyzer measured, for each vehicle, the levels of CO and HC at two engine speeds (900 and 2500 rpm). The statistical analysis, carried out with the STATISTICA software, revealed a marked reduction in the efficiency of catalytic converters after six years of use, with average emissions of 0.78% CO and 156 ppm HC, approximately four times the CO and twice the HC of the newest vehicles. A Student's t-test strongly suggests that the average HC emission at 900 rpm (152 ppm) is 40% higher than at 2500 rpm, for the unloaded engine. This result indicates that the efficiency of catalytic conversion is kinetically limited at low engine speeds. The study also concludes that, comparing CO and HC emissions by fuel, CO emissions from CNG are 62% lower than those from gasoline, while there are no significant differences between HC emissions from CNG and from gasoline. In summary, the results call the current vehicle inspection criteria for exhaust gases into question, pointing toward the creation of stricter pollutant emission limits, since the efficiency of catalytic converters is markedly reduced after six years of use. The results also raise the possibility of changing the test conditions adopted by the current standards, specifically the engine speed, given that the highest emission indices were recorded at idle, in the unloaded condition. This suggests that the high-engine-speed tests could be dropped, cutting the inspection time in half and saving fuel.
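A minimal sketch of the paired comparison described above: HC emissions measured on the same vehicles at 900 rpm and at 2500 rpm, compared with a paired Student's t-test. The measurements are simulated stand-ins for the 384-vehicle inspection sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
hc_2500 = rng.gamma(shape=4.0, scale=27.0, size=384)       # ppm at 2500 rpm
hc_900 = hc_2500 * 1.4 + rng.normal(scale=10.0, size=384)  # ~40% higher at idle

t_stat, p_value = stats.ttest_rel(hc_900, hc_2500)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")  # small p: idle emits more HC
```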
Resumo:
The aim of this study is to create an artificial neural network (ANN) capable of modeling the transverse elasticity modulus (E2) of unidirectional composites. To that end, we used a dataset divided into two parts, one for training and the other for ANN testing. Three network architectures were developed: one with only two inputs, one with three inputs, and a third, mixed architecture combining an ANN with the Halpin-Tsai model. After training, the results show that the use of ANNs is quite promising: compared with the Halpin-Tsai mathematical model, they yielded higher correlation coefficients and lower root mean square error values.
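A minimal sketch of the Halpin-Tsai baseline against which the ANNs were compared: the semi-empirical estimate of the transverse modulus E2 of a unidirectional composite. The fiber and matrix moduli and the curve-fitting parameter xi = 2 are illustrative choices, not the study's data.

```python
def halpin_tsai_e2(e_fiber, e_matrix, v_fiber, xi=2.0):
    """E2 = Em * (1 + xi*eta*Vf) / (1 - eta*Vf),
    with eta = (Ef/Em - 1) / (Ef/Em + xi)."""
    ratio = e_fiber / e_matrix
    eta = (ratio - 1.0) / (ratio + xi)
    return e_matrix * (1.0 + xi * eta * v_fiber) / (1.0 - eta * v_fiber)

# E-glass fiber in epoxy at 60% fiber volume fraction (moduli in GPa).
print(f"E2 = {halpin_tsai_e2(73.0, 3.5, 0.60):.2f} GPa")
```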
Resumo:
In the 20th century, acupuncture spread through the West as a complementary health care practice. This fact has motivated the international scientific community to invest in research seeking to understand why acupuncture works. In this work we statistically compare the voltage fluctuations of bioelectric signals captured on the skin at an acupuncture point (IG 4) and at a nearby non-acupuncture point. The signals were acquired through an electronic interface with a computer, based on an instrumentation amplifier designed with specifications adequate to this end. For the signals collected from a sample of 30 volunteers, we calculated the main statistics and submitted them to a paired t-test at significance level α = 0.05. For the bioelectric signals we estimated the following parameters: standard deviation, asymmetry, and kurtosis. Moreover, we calculated the autocorrelation function and fitted it with an exponential curve, observing that the signal decays more rapidly at a non-acupoint than at an acupoint. This fact is indicative of the existence of information at the acupoint.
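A minimal sketch of the autocorrelation comparison described above: estimate the autocorrelation function of a voltage signal and fit an exponential decay exp(-lag/tau); a shorter tau would correspond to the faster decay reported for the non-acupoint. The AR(1) signal below is a synthetic stand-in for a skin voltage recording.

```python
import numpy as np
from scipy.optimize import curve_fit

def autocorr(x, max_lag):
    """Sample autocorrelation for lags 0 .. max_lag-1."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:-k or None], x[k:]) / len(x) / c0
                     for k in range(max_lag)])

rng = np.random.default_rng(9)
v = np.zeros(5000)
for t in range(1, v.size):
    v[t] = 0.95 * v[t - 1] + rng.normal()  # AR(1) stand-in signal

lags = np.arange(40)
acf = autocorr(v, 40)
(tau,), _ = curve_fit(lambda k, tau: np.exp(-k / tau), lags, acf, p0=[10.0])
print(f"correlation time tau = {tau:.1f} samples")
```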
Resumo:
The processing of seismic records is a very important task in geophysics and represents a permanent challenge in oil exploration. Although these signals provide an adequate image of the subsurface geological structure, they are contaminated by noise, of which the ground roll is the main component. This fact demands a great effort in the development of filtering methodologies. In this context, this work presents a method for removing ground roll noise using tools from Statistical Physics. In the method, Wavelet Analysis is combined with the Karhunen-Loève Transform to perform the removal in a well-localized region. The filtering process begins with the Multiscale Decomposition. This technique provides a time-scale representation using discrete wavelets implemented with perfect-reconstruction filters. The original seismic pattern is represented by multiple patterns, one per scale. The ground roll can thus be attenuated, like a surgical operation at each scale, only in the region where its presence is strong, preserving as much relevant information as possible. The attenuation is performed by defining an attenuation factor Af, chosen from the behavior of the energy modes of the Karhunen-Loève Transform: the point corresponding to a minimum of the energy of the first mode is identified as the optimal attenuation factor.
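A minimal sketch of the scheme described above: each trace of a gather is decomposed with discrete wavelets, one scale is attenuated by a factor Af inside a window where ground roll is assumed strong, and Af is chosen where the energy of the first Karhunen-Loève mode of the filtered gather reaches a minimum. The wavelet choice ('db4'), decomposition level, window, and synthetic gather are illustrative assumptions.

```python
import numpy as np
import pywt

def attenuate_scale(trace, af, level=3, window=slice(200, 600)):
    """Attenuate one detail band of a trace by Af inside a time window."""
    coeffs = pywt.wavedec(trace, "db4", level=level)
    detail = coeffs[1].copy()  # coarsest detail band
    w = slice(window.start // 2 ** level, window.stop // 2 ** level)
    detail[w] *= af            # surgical attenuation, only in the window
    coeffs[1] = detail
    return pywt.waverec(coeffs, "db4")[: len(trace)]

def first_kl_energy(gather):
    """Energy of the first Karhunen-Loeve mode (largest eigenvalue
    of the trace covariance matrix, traces as rows)."""
    return np.linalg.eigvalsh(np.cov(gather))[-1]

rng = np.random.default_rng(2)
gather = rng.normal(size=(24, 1024))  # stand-in for seismic traces
afs = np.linspace(0.0, 1.0, 11)
energies = [first_kl_energy(np.array([attenuate_scale(tr, af)
                                      for tr in gather])) for af in afs]
print(f"optimal Af ~ {afs[int(np.argmin(energies))]:.1f}")
```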
Resumo:
Systems whose spectra are fractal or multifractal have received a great deal of attention in recent years. A full understanding of the behavior of many physical properties of these systems is still lacking because of their complexity. Thus, new applications and new methods for studying their spectra have been proposed, shedding light on their properties and enabling a better understanding of these systems. We first present the basic theoretical framework needed to calculate the energy spectrum of elementary excitations in some systems, especially quasiperiodic ones. We then show, using the Schrödinger equation in the tight-binding approximation, results for the specific heat of electrons within Boltzmann-Gibbs statistical mechanics for one-dimensional quasiperiodic systems grown according to the Fibonacci and Double Period rules. Structures of this type have already been extensively explored; however, the non-extensive statistical mechanics proposed by Constantino Tsallis is well suited to systems with a fractal profile, and our main objective was therefore to apply it to the calculation of thermodynamic quantities, extending a little further the understanding of the properties of these systems. Accordingly, we calculate, analytically and numerically, the generalized specific heat of electrons in one-dimensional quasiperiodic systems (quasicrystals) generated by the Fibonacci and Double Period sequences. The electronic spectra were obtained by solving the Schrödinger equation in the tight-binding approximation. Numerical results are presented for the two types of systems with different values of the nonextensivity parameter q.
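A minimal sketch of the Boltzmann-Gibbs part of this calculation: the electronic spectrum of a Fibonacci tight-binding chain (two hopping amplitudes arranged by the substitution A → AB, B → A) and the specific heat C(T) = (⟨E²⟩ − ⟨E⟩²)/T² computed from it (k_B = 1). The hopping values are illustrative; the Tsallis generalization would replace the exponential weights.

```python
import numpy as np

def fibonacci_chain_spectrum(generations, ta=1.0, tb=2.0):
    """Eigenvalues of a tridiagonal tight-binding chain whose hoppings
    follow the Fibonacci substitution A -> AB, B -> A."""
    seq = "A"
    for _ in range(generations):
        seq = "".join("AB" if c == "A" else "A" for c in seq)
    hop = np.array([ta if c == "A" else tb for c in seq])
    h = np.diag(hop, 1) + np.diag(hop, -1)  # hoppings sit on the bonds
    return np.linalg.eigvalsh(h)

def specific_heat(energies, temperatures):
    """C(T) = Var(E) / T^2 with Boltzmann weights exp(-E/T), k_B = 1."""
    c = []
    for t in temperatures:
        w = np.exp(-(energies - energies.min()) / t)  # shifted for stability
        z = w.sum()
        e1 = (energies * w).sum() / z
        e2 = (energies ** 2 * w).sum() / z
        c.append((e2 - e1 ** 2) / t ** 2)
    return np.array(c)

E = fibonacci_chain_spectrum(12)
T = np.linspace(0.05, 5.0, 100)
print(specific_heat(E, T)[:5])
```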
Resumo:
In this work we study a connection between a non-Gaussian statistics, the Kaniadakis statistics, and complex networks. We show that the degree distribution P(k) of a scale-free network can be calculated by maximizing the information entropy in the context of non-Gaussian statistics. As an example, a numerical analysis based on the preferential attachment growth model is discussed, and the numerical behavior of the Kaniadakis and Tsallis degree distributions is compared. We also analyze the diffusive epidemic process (DEP) on a one-dimensional regular lattice. The model is composed of A (healthy) and B (sick) species that diffuse independently on the lattice with diffusion rates DA and DB, subject to the probabilistic dynamical rules A + B → 2B and B → A. This model belongs to the category of non-equilibrium systems with an absorbing state and a phase transition between active and inactive states. We investigate the critical behavior of the DEP using a self-adaptive algorithm for finding critical points: the method of automatic searching for critical points (MASCP). We compare our results with the literature and find that the MASCP successfully finds the critical exponents 1/ν and 1/zν in all the cases DA = DB, DA
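A minimal sketch of the numerical analysis mentioned above: grow a scale-free network by preferential attachment and extract its degree distribution P(k), the quantity that the Kaniadakis and Tsallis entropies are used to describe analytically. networkx's Barabási-Albert generator stands in for the growth model.

```python
import numpy as np
import networkx as nx

g = nx.barabasi_albert_graph(n=20_000, m=3, seed=42)
degrees = np.array([d for _, d in g.degree()])

ks, counts = np.unique(degrees, return_counts=True)
pk = counts / counts.sum()
# For pure preferential attachment, P(k) ~ k^-3 at large k.
print(np.column_stack([ks[:10], pk[:10]]))
```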
Resumo:
This dissertation briefly presents random graphs and the main quantities calculated from them. At the same time, basic thermodynamic quantities such as energy and temperature are associated with some of their characteristics. Approaches commonly used in Statistical Mechanics are employed, and rules describing a time evolution for the graphs are proposed in order to study their ergodicity and a possible thermal equilibrium between them.
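A minimal sketch of the kind of time evolution described above: an Erdős-Rényi graph whose edges flip under a Metropolis rule with "energy" E = number of edges and a temperature-like parameter T. The energy choice and the acceptance rule are illustrative assumptions, not the dissertation's specific dynamics.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n, T = 50, 1.0
g = nx.gnp_random_graph(n, p=0.1, seed=1)

for _ in range(10_000):
    u, v = rng.choice(n, size=2, replace=False)
    delta = -1.0 if g.has_edge(u, v) else 1.0  # change in edge count
    if delta < 0 or rng.random() < np.exp(-delta / T):  # Metropolis rule
        if g.has_edge(u, v):
            g.remove_edge(u, v)
        else:
            g.add_edge(u, v)

# Average degree at "equilibrium" for this energy and temperature.
print(2 * g.number_of_edges() / n)
```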