71 results for "Séries chronologiques" (time series)
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Forecasting is the basis for making strategic, tactical and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. There are thus several methods to assist in the task of time series forecasting; however, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies on more advanced forecasting methods. Among these, Artificial Neural Networks (ANNs) are a relatively new and promising method for business forecasting, a technique that has attracted much interest in the financial community and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to statistical ARIMA-GARCH models. In this context, this study aimed to examine whether ANNs are a more appropriate method for predicting the behavior of capital market indices than the traditional methods of time series analysis. For this purpose we developed a quantitative study, based on financial and economic indices, and built two feedforward ANN models with supervised learning, whose structures consisted of 20 data points in the input layer, 90 neurons in one hidden layer and one value in the output layer (the Ibovespa). These models used backpropagation, a tangent-sigmoid activation function and a linear output function. Given the aim of analyzing the suitability of the Artificial Neural Network method for forecasting the Ibovespa, we chose to perform this analysis by comparing its results with those of the GARCH time series predictive model, developing a GARCH(1,1) model. Once both methods (ANN and GARCH) were applied, we analyzed the results by comparing the forecasts with the historical data and by studying the forecast errors through the MSE, RMSE, MAE, standard deviation, Theil's U and forecast encompassing tests. It was found that the models developed by means of ANNs had lower MSE, RMSE and MAE than the GARCH(1,1) model, and the Theil's U test indicated that the three models have smaller errors than a naïve forecast. Although the ANN based on returns has lower values of the precision indicators than the ANN based on prices, the forecast encompassing test rejected the hypothesis that one model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide a more appropriate Ibovespa forecast than the traditional time series models, represented by the GARCH model
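As an illustration of the architecture described in this abstract (a feedforward network with a 20-value input window, 90 hidden neurons with a tangent-sigmoid activation, a linear output and backpropagation training), the following Python sketch shows one way such a model could be set up. It is not the authors' code; the `ibov` series, the chronological train/test split and the scikit-learn `MLPRegressor` settings are illustrative assumptions.

```python
# Illustrative sketch, not the authors' original model: a feedforward ANN with a
# 20-lag input window, one hidden layer of 90 tanh ("tangent sigmoid") neurons,
# a linear output and backpropagation training, as described in the abstract.
# The `ibov` series below is a synthetic stand-in for Ibovespa closing prices.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

def sliding_window(series, lags=20):
    """Build (X, y) pairs in which each row of X holds `lags` past values."""
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    return X, series[lags:]

ibov = 60000 + np.cumsum(np.random.default_rng(0).standard_normal(1000) * 500)
X, y = sliding_window(ibov, lags=20)
split = int(0.8 * len(X))                  # chronological train/test split

ann = MLPRegressor(hidden_layer_sizes=(90,), activation="tanh",
                   solver="adam", max_iter=2000, random_state=0)
ann.fit(X[:split], y[:split])
pred = ann.predict(X[split:])

mse = mean_squared_error(y[split:], pred)
print(f"MSE={mse:.1f}  RMSE={np.sqrt(mse):.1f}  "
      f"MAE={mean_absolute_error(y[split:], pred):.1f}")
```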
Abstract:
This work presents a systematization of the viability of the praxis of the French educator Célestin Freinet (1896-1966). It is a qualitative, action-research study carried out in a cooperative school, the École Freinet in Natal/RN, Brazil, with students of the 5th and 8th grades of Elementary Education. The central problem that led us to this research was the observation that progress in scientific production and in teachers' practice is found mainly in Early Childhood Education and in the first grades of Elementary Education, where the student is conceived as an active subject of his or her own learning and as co-responsible for the organization of school work. However, when it comes to the last four grades of Elementary Education (5th to 8th grades), such progressive proposals are regarded as practically impossible to put into practice. To attenuate the theoretical gap on this subject, we offer a text that is significant for teachers in general, for academic researchers and for the school where the research was carried out, precisely because it does not give priority to any particular technique or principle. This pedagogy is discussed, as far as possible, in its global context and in terms of how it manifests itself in its praxis. We highlight the aspects that are important for the functioning of the classroom, as well as the strategies used for the construction, communication and documentation of knowledge. The intervention allowed us to construct the object through reflection in action, based on the data obtained through observations, conversations, interviews, study sessions and the collection of written texts. The analysis was carried out in the light of the comprehensive methodology (Kaufmann, 1996), interweaving the empirical material and the theory. This form of analysis allowed us to attain the overall coherence that, we believe, should characterize research in the Human and Social Sciences. The final considerations emphasize, on the one hand, the effectiveness of Freinet pedagogy in guiding the teaching-learning process at all levels of education and, on the other hand, the urgent need for the team of the school under study to qualify its practice on the basis of the theoretical-practical references of Freinet pedagogy
Abstract:
This thesis reports a didactic research project linked to the Post-graduation Programme in Education of the Universidade Federal do Rio Grande do Norte, which aimed to approach the construction of the geometrical concepts of Volume of the Rectangular Parallelepiped, Area and Perimeter of the Rectangle, together with a study of the Area of the Circle. The research was developed with students of the 6th level of Elementary School in a public school in Natal/RN. The pedagogical intervention comprised three moments: the application of a diagnostic evaluation, an instrument that enabled the creation of the teaching module by showing the students' level of geometry knowledge; the introduction of a Teaching Module by Activities, aiming to propose a reflexive didactic route directed toward conceptual construction, since we believed that such an approach would favor the consolidation of learning by becoming meaningful to the learner; and the application of a Final Evaluation, through which we compared the results obtained before and after the teaching intervention. The data gathered were analyzed qualitatively by means of a study of categories of understanding of mathematical concepts, in addition to descriptive statistics under the quantitative aspect. Based on Richard Skemp's theory of the categorization of mathematical knowledge, the levels of Relational and Instrumental Understanding were achieved in contextual situations and in varied proportions, thus contributing to the learning of the geometrical concepts studied by the students who took part in the research. We believe that this work may contribute reflections on the learning process, a concern that remained throughout all stages of the research, and also that technical competence together with knowledge of constructivist theory will condition the implementation of a new dynamics in the teaching and learning processes. We hope that the present research may add some contribution to teaching practice in the context of the teaching of Mathematics at the intermediate levels of Elementary School
Abstract:
The opening of the Brazilian electricity market and the competitiveness between companies in the energy sector have increased the concessionaires' search for useful information and tools to assist in decision-making activities. An important source of knowledge for these utilities is the time series of energy demand. The identification of behavior patterns and the description of events become important for planning, seeking improvements in service quality and financial benefits. This dissertation presents a methodology based on time series mining and representation tools, in order to extract knowledge relating series of electricity demand at various interconnected substations of an electric utility. The method exploits relations of duration, coincidence and partial order among events in multi-dimensional time series. To represent the knowledge, the language proposed by Mörchen (2005), called Time Series Knowledge Representation (TSKR), is used. We conducted a case study using time series of energy demand from 8 substations interconnected by a ring system, which feeds the metropolitan area of Goiânia-GO, provided by CELG (Companhia Energética de Goiás), responsible for power distribution in the state of Goiás (Brazil). Using the proposed methodology, three levels of knowledge were extracted that describe the behavior of the studied system, clearly representing the system dynamics and becoming a tool to assist planning activities
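The duration/coincidence relations that TSKR expresses can be illustrated with a small sketch: two demand series are discretized into "high-demand" intervals and the overlap between the intervals of the two substations is measured. This is only a toy approximation of the idea, not Mörchen's TSKR algorithm; the simulated data, the 0.75 quantile threshold and the helper names are assumptions.

```python
# Minimal sketch, not the TSKR implementation itself: discretise each substation's
# demand series into "high"/"normal" states and measure how long "high" events on
# two substations coincide, the kind of duration/coincidence relation TSKR captures.
# Data and thresholds here are hypothetical.
import numpy as np

def high_demand_intervals(series, quantile=0.75):
    """Return (start, end) index pairs where the series stays above its quantile."""
    mask = series > np.quantile(series, quantile)
    intervals, start = [], None
    for i, flag in enumerate(mask):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            intervals.append((start, i))
            start = None
    if start is not None:
        intervals.append((start, len(mask)))
    return intervals

def coincidence(intervals_a, intervals_b):
    """Total number of time steps in which intervals from both series overlap."""
    return sum(max(0, min(ea, eb) - max(sa, sb))
               for sa, ea in intervals_a for sb, eb in intervals_b)

rng = np.random.default_rng(0)
demand_a = rng.normal(100, 10, 24 * 30)          # hypothetical hourly demand, 30 days
demand_b = demand_a * 0.6 + rng.normal(40, 8, demand_a.size)

ia, ib = high_demand_intervals(demand_a), high_demand_intervals(demand_b)
print("high-demand hours coinciding on both substations:", coincidence(ia, ib))
```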
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
With the growing demand for data traffic in third-generation (3G) networks, mobile operators have attempted to focus infrastructure resources on the places where a greater need is identified. These targeted investments aim to maintain quality of service, especially in dense urban areas. The WCDMA-HSPA parameters Rx Power, RSCP (Received Signal Code Power), Ec/Io (Energy per chip/Interference) and transmission rate (throughput) at the physical layer are analyzed. In this work, time series forecasting on an HSPA network is performed. The parameter values were collected on a fully operational network through a drive test in Natal-RN, a capital city in northeastern Brazil. The models used for time series forecasting were Simple Exponential Smoothing, Holt, Holt-Winters Additive and Holt-Winters Multiplicative. The objective of the forecasts is to determine which model generates the best predictions of the WCDMA-HSPA network parameters.
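A minimal sketch of the comparison described above, under the assumption that the drive-test parameters can be treated as regular time series: the four smoothing models are fitted with statsmodels and compared by out-of-sample RMSE. The synthetic throughput series and the 24-hour seasonal period are placeholders, not the thesis data.

```python
# Hedged sketch of the model comparison: fit Simple Exponential Smoothing, Holt,
# and additive/multiplicative Holt-Winters to one network parameter (a synthetic
# throughput series standing in for the drive-test data) and compare one-day-ahead
# forecasts by RMSE, using statsmodels.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import (SimpleExpSmoothing, Holt,
                                         ExponentialSmoothing)

rng = np.random.default_rng(1)
hours = pd.date_range("2013-01-01", periods=24 * 14, freq="H")
daily = 2 + np.sin(2 * np.pi * np.arange(hours.size) / 24)     # daily cycle
throughput = pd.Series(daily * 1000 + rng.normal(0, 80, hours.size), index=hours)

train, test = throughput[:-24], throughput[-24:]
models = {
    "SES": SimpleExpSmoothing(train).fit(),
    "Holt": Holt(train).fit(),
    "HW additive": ExponentialSmoothing(train, trend="add", seasonal="add",
                                        seasonal_periods=24).fit(),
    "HW multiplicative": ExponentialSmoothing(train, trend="add", seasonal="mul",
                                              seasonal_periods=24).fit(),
}
for name, fit in models.items():
    rmse = np.sqrt(((fit.forecast(24) - test) ** 2).mean())
    print(f"{name:18s} RMSE = {rmse:8.1f}")
```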
Abstract:
Time series analysis has played an increasingly important role in weather and climate studies. The success of these studies depends crucially on the knowledge of the quality of climate data such as, for instance, air temperature and rainfall data. For this reason, one of the main challenges for researchers in this field is to obtain homogeneous series. A time series of climate data is considered homogeneous when the values of the observed data change only due to climatic factors, i.e., without any interference from external non-climatic factors. Such non-climatic factors may produce undesirable effects in the time series, such as unrealistic homogeneity breaks, trends and jumps. In the present work, climatic time series for the city of Natal, RN, namely air temperature and rainfall series for the period spanning from 1961 to 2012, were investigated. The main purpose was to carry out an analysis in order to check the occurrence of homogeneity breaks or trends in the series under investigation. To this end, some basic statistical procedures, such as normality and independence tests, were applied. The occurrence of trends was investigated by linear regression analysis, as well as by the Spearman and Mann-Kendall tests. Homogeneity was investigated by the SNHT, as well as by the Easterling-Peterson and Mann-Whitney-Pettitt tests. The normality analyses showed divergent results. The von Neumann ratio test showed that the air temperature data are not independent and identically distributed (iid), whereas the rainfall data are iid. According to the applied tests, both series display trends: the mean air temperature series displays an increasing trend, whereas the rainfall series shows a decreasing trend. Finally, the homogeneity tests revealed that all series under investigation present inhomogeneities, although the detected breaks depend on the applied test. In summary, the results showed that the chosen techniques may be applied in order to verify how well the studied time series are characterized. Therefore, these results may serve as a guide for further investigations of the statistical climatology of Natal or of any other place.
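As one concrete example of the trend tests mentioned above, the sketch below implements the Mann-Kendall test (without tie correction) and applies it to a simulated annual temperature series; the real Natal data (1961-2012) are not reproduced here, and the simulated trend is purely illustrative.

```python
# Illustrative implementation of the Mann-Kendall trend test, applied to a
# hypothetical annual mean air temperature series; the thesis data are not used.
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Return the Mann-Kendall S statistic, Z score and two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # no-ties approximation
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z, 2 * (1 - stats.norm.cdf(abs(z)))

years = np.arange(1961, 2013)
temp = 26.0 + 0.015 * (years - 1961) + np.random.default_rng(2).normal(0, 0.3, years.size)
s, z, p = mann_kendall(temp)
print(f"S={s}, Z={z:.2f}, p={p:.4f}  ->", "trend" if p < 0.05 else "no trend")
```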
Abstract:
The financial crisis that occurred between 2007 and 2008, known as the subprime crisis, brought the governance of companies in Brazil and worldwide into the spotlight. To monitor financial risk, quantitative risk management tools were created in the 1990s, after several financial disasters. The market turmoil has also led companies to invest in the development and use of information, applied as tools to support process control and decision making. Numerous empirical studies on the informational efficiency of the market have been carried out in Brazil and abroad, investigating whether prices instantly reflect the available information. The creation of different levels of corporate governance on BOVESPA, in 2000, made firms more committed to their shareholders through greater transparency in their information. The purpose of this study is to analyze how the subprime financial crisis affected, between January 2007 and December 2009, the volatility of stock returns on the BM&FBOVESPA of the most liquid companies at different levels of corporate governance. Based on time series analysis and event studies, econometric tests were performed in EViews, and the results showed that the adoption of good corporate governance practices affects the volatility of the companies' returns
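A hedged sketch of the volatility modelling implied above: a GARCH(1,1) is fitted to daily returns with the Python `arch` package and the conditional volatility inside a crisis window is compared with the full sample. The thesis used EViews and real return series; the simulated returns and the chosen event window below are assumptions.

```python
# Minimal sketch (not the thesis' EViews procedure): estimate a GARCH(1,1) on
# daily stock returns and inspect conditional volatility inside a crisis event
# window, using the `arch` package; the return series here is simulated.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(3)
dates = pd.bdate_range("2007-01-02", "2009-12-31")
returns = pd.Series(rng.standard_t(df=5, size=dates.size) * 1.5, index=dates)  # in %

res = arch_model(returns, vol="GARCH", p=1, q=1, mean="Constant").fit(disp="off")
vol = res.conditional_volatility

event_window = vol["2008-09-01":"2008-12-31"]     # e.g. around the Lehman collapse
print(res.params)                                  # mu, omega, alpha[1], beta[1]
print(f"mean cond. volatility in event window: {event_window.mean():.2f}% "
      f"vs full sample: {vol.mean():.2f}%")
```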
Volatility analysis, price integration and predictability for the Brazilian shrimp market
Abstract:
The present study investigates the dynamics of the volatility structure of shrimp prices in the Brazilian fish market. First, a description of the initial aspects of the shrimp price series was made. From this information, statistical tests were performed and univariate models were selected as price predictors. Then, the existence of a long-term equilibrium relationship between the Brazilian and the imported American shrimp was verified and, where confirmed, whether or not there is a causal link between these assets, considering that the two countries have maintained trade relations over the years. The work is an exploratory, applied study with a quantitative approach. The database was collected through direct contact with the Companhia de Entrepostos e Armazéns Gerais de São Paulo (CEAGESP) and from the official American import website of the National Marine Fisheries Service - National Oceanic and Atmospheric Administration (NMFS-NOAA). The results showed that the great variability in the asset's price is directly related to the gains and losses of market agents. The price series presents strong seasonal and biannual effects. The average shrimp price over the last 12 years was R$ 11.58, and external factors beyond production and marketing (U.S. antidumping measures, floods and pathologies) strongly affected prices. Among the models tested for predicting shrimp prices, four were selected which, through one-step-ahead forecasts over a 12-period horizon, proved statistically more robust. It was found that there is only weak evidence of long-term equilibrium between the Brazilian and American shrimp prices and, equivalently, no causal link was found between them. We conclude that the price dynamics of the shrimp commodity is strongly influenced by external productive factors and that these phenomena cause seasonal effects in prices. There is no long-term stability relationship between the Brazilian and American shrimp prices, but it is known that Brazil imports production inputs from the USA, which reveals some productive dependence. For market agents, the risk of interference from external prices cointegrated with Brazilian prices is practically nonexistent. Through statistical modeling it is possible to minimize the risk and uncertainty embedded in the fish market, so that sales and marketing strategies for Brazilian shrimp can be consolidated and disseminated
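The long-run equilibrium and causality checks described above can be sketched with statsmodels, under the assumption that monthly price series are available: an Engle-Granger cointegration test followed by Granger causality tests. The simulated `br_price` and `us_price` series are placeholders for the CEAGESP and NMFS-NOAA data.

```python
# Hedged sketch of the long-run equilibrium / causality checks: an Engle-Granger
# cointegration test and Granger causality tests between the Brazilian and the
# imported U.S. shrimp price series. The monthly series below are simulated.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(4)
n = 144                                            # 12 years of monthly prices
common = np.cumsum(rng.normal(0, 0.2, n))          # shared stochastic trend
br = pd.Series(11.58 + common + rng.normal(0, 0.5, n), name="br_price")
us = pd.Series(5.0 + 0.8 * common + rng.normal(0, 0.5, n), name="us_price")

t_stat, p_value, _ = coint(br, us)                 # Engle-Granger test
print(f"cointegration p-value: {p_value:.3f}")

# Does the U.S. price help forecast the Brazilian price? (ssr F-test p-values)
gc = grangercausalitytests(pd.concat([br, us], axis=1), maxlag=4)
for lag, result in gc.items():
    print(f"lag {lag}: p = {result[0]['ssr_ftest'][1]:.3f}")
```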
Abstract:
Nowadays, evaluation methods to measure the thermal performance of buildings have been developed in order to improve thermal comfort and to reduce the energy used by active cooling and heating systems. However, the criteria used in the rating systems of developed countries to assess the thermal and energy performance of buildings have shown limitations when applied to naturally ventilated buildings in tropical climates. The main objective of this research is to propose a method to evaluate the thermal performance of low-rise residential buildings in warm humid climates through computational simulation. The method was developed in order to conceive a suitable rating system for the thermal performance assessment of such buildings, using as criteria the indoor air temperature and an adaptive thermal comfort model. The research used the software VisualDOE 4.1 in two simulation runs of a base case modeled for two basic occupancy types: living room and bedroom. In the first simulation run, sensitivity analyses were made to identify the variables with the highest impact on the cases' thermal performance; the results also allowed the formulation of design recommendations for warm humid climates toward improving the thermal performance of residential buildings in similar situations. The results of the second simulation run were used to identify the so-called Thermal Performance Spectrum (TPS) of both occupancy types, which reflects the variations in thermal performance considering the local climate, building typology, chosen construction materials and studied occupancies. This analysis generates an index named IDTR (Thermal Performance Resultant Index), configured as a thermal performance rating system. It correlates the thermal performance with the number of hours the indoor air temperature falls within each of six pre-defined thermal comfort bands, which received weights measuring the discomfort intensity. The use of this rating system proved appropriate when applied to one of the simulated cases, presenting advantages over other evaluation methods and becoming a tool for understanding building thermal behavior
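The band-and-weight logic behind the IDTR index can be sketched as follows; since the abstract does not give the exact band limits or weights, the values below are hypothetical placeholders, as is the simulated hourly indoor temperature series.

```python
# Illustrative sketch of the band-counting idea behind the IDTR index: count the
# hours the simulated indoor temperature falls in each comfort band and combine
# them with discomfort weights. Band limits and weights are hypothetical, not the
# thesis' calibrated values.
import numpy as np

rng = np.random.default_rng(5)
indoor_temp = rng.normal(28.0, 2.5, 8760)          # hypothetical hourly results (°C)

# (lower, upper, weight): heavier weights represent stronger discomfort
bands = [(-np.inf, 23, 1.0), (23, 26, 0.0), (26, 29, 0.0),
         (29, 31, 1.0), (31, 33, 2.0), (33, np.inf, 3.0)]

hours = [np.sum((indoor_temp >= lo) & (indoor_temp < hi)) for lo, hi, _ in bands]
idtr = sum(h * w for h, (_, _, w) in zip(hours, bands)) / len(indoor_temp)

for (lo, hi, w), h in zip(bands, hours):
    print(f"band [{lo:>5}, {hi:>5})  weight {w}: {h:5d} h")
print(f"weighted discomfort index (IDTR-like): {idtr:.3f}")
```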
Abstract:
The main objective of this study is to apply recently developed methods of statistical physics to time series analysis, particularly to electrical induction log data from oil wells, in order to study the petrophysical similarity of those wells in a spatial distribution. To this end, we used the DFA method in order to find out whether or not this technique can be used to characterize the fields spatially. After obtaining the DFA values for all wells, we applied cluster analysis, using the non-hierarchical method called K-means. Usually based on the Euclidean distance, K-means consists of dividing the N elements of a data matrix into k groups, so that the similarities among elements belonging to different groups are as small as possible. In order to test whether a dataset generated by the K-means method, or randomly generated datasets, form spatial patterns, we created the parameter Ω (neighborhood index). High values of Ω reveal more aggregated data, while low values of Ω indicate scattered data or data without spatial correlation. We thus concluded that the DFA data from the 54 wells are clustered and can be used to characterize the fields spatially. Applying the contour level technique, we confirmed the results obtained by K-means, confirming that DFA is effective for spatial analysis
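For reference, a compact version of the DFA calculation used above (first-order detrending over a set of window sizes, with the scaling exponent taken from a log-log fit) might look like the sketch below; the synthetic input series stands in for an induction log, and the chosen scales are assumptions.

```python
# Compact, illustrative DFA implementation (first-order detrending) of the kind
# used to characterise well-log profiles; the input series here is synthetic.
import numpy as np

def dfa_exponent(series, scales=(16, 32, 64, 128, 256)):
    """Return the DFA scaling exponent from a log-log fit of F(n) versus n."""
    y = np.cumsum(series - np.mean(series))        # integrated (profile) series
    flucts = []
    for n in scales:
        segments = len(y) // n
        f2 = []
        for k in range(segments):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

log_curve = np.random.default_rng(6).normal(size=4096)   # synthetic "induction log"
print(f"DFA exponent: {dfa_exponent(log_curve):.2f}  (~0.5 for uncorrelated noise)")
```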
Abstract:
In recent years, the DFA, introduced by Peng, has been established as an important tool capable of detecting long-range autocorrelation in non-stationary time series. This technique has been successfully applied to various areas such as Econophysics, Biophysics, Medicine, Physics and Climatology. In this study, we used the DFA technique to obtain the Hurst exponent (H) of the density log (RHOB) profiles of 53 wells from the Namorado Field School. We want to know whether or not H can be used to characterize the field spatially. Two cases arise: in the first, the set of H values reflects the local geology, with wells that are geographically closer showing similar H, so that H can be used in geostatistical procedures; in the second, each well has its own H and the information from the wells is uncorrelated, the profiles showing only random fluctuations in H with no spatial structure. Cluster analysis is a method widely used in statistical analysis; in this work we use the non-hierarchical k-means method. In order to verify whether a set of data generated by the k-means method shows spatial patterns, we created the parameter Ω (neighborhood index). High Ω indicates more aggregated data, while low Ω indicates dispersed data or data without spatial correlation. With the help of this index and the Monte Carlo method, we verified that randomly clustered data show a distribution of Ω lower than that of the actual clusters. We thus concluded that the H data obtained from the 53 wells are clustered and can be used to characterize spatial patterns. The analysis of contour levels confirmed the k-means results
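The clustering and neighborhood check can be sketched as follows. The abstract does not define Ω precisely, so the sketch assumes, purely for illustration, that Ω is the average fraction of each well's nearest spatial neighbours sharing its k-means cluster label, with random label permutations providing the Monte Carlo reference; well coordinates and H values are simulated.

```python
# Sketch of the clustering-plus-neighbourhood check. The Ω definition below is an
# assumption (share of nearest neighbours with the same cluster label); random
# label permutations give the Monte Carlo reference. Data are simulated.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
coords = rng.uniform(0, 10, size=(53, 2))                 # hypothetical well positions
hurst = 0.6 + 0.05 * coords[:, 0] / 10 + rng.normal(0, 0.02, 53)  # smooth H field

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(hurst.reshape(-1, 1))

def omega(labels, coords, n_neighbors=4):
    """Average share of nearest neighbours carrying the same cluster label."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nearest = np.argsort(d, axis=1)[:, :n_neighbors]
    return np.mean(labels[nearest] == labels[:, None])

observed = omega(labels, coords)
random_omegas = [omega(rng.permutation(labels), coords) for _ in range(999)]
p = (np.sum(np.array(random_omegas) >= observed) + 1) / 1000
print(f"observed Ω = {observed:.3f}, Monte Carlo p-value = {p:.3f}")
```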
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
The increase in ultraviolet (UV) radiation at the surface, the high incidence of non-melanoma skin cancer (NMSC) on the coast of Northeast Brazil (NEB) and the reduction of total ozone were the motivation for the present study. The overall objective was to identify and understand the variability of UV radiation, expressed as the Ultraviolet Index (UV Index), in the capitals of the east coast of the NEB and to fit stochastic models to the UV index time series in order to make predictions (interpolations) and forecasts/projections (extrapolations), followed by trend analysis. The methodology consisted of applying multivariate analysis (principal component analysis and cluster analysis), the Predictive Mean Matching method for filling gaps in the data, autoregressive distributed lag (ADL) models and the Mann-Kendall test. The modeling via ADL consisted of parameter estimation, diagnostics, residual analysis and evaluation of the quality of the predictions and forecasts via the mean squared error and the Pearson correlation coefficient. The results indicated that the annual variability of UV in the capital of Rio Grande do Norte (Natal) has a feature in the months of September and October consisting of a stabilization/reduction of the UV index because of the larger annual concentration of total ozone; the increased amount of aerosol during this period contributes, with lesser intensity, to this event. The application of cluster analysis to the east coast of the NEB showed that this event also occurs in the capitals of Paraíba (João Pessoa) and Pernambuco (Recife). Extreme UV events in the NEB were analyzed for the city of Natal and were associated with the absence of cloud cover and with total ozone levels below the annual average, not occurring over the entire region because of the uneven spatial distribution of these variables. The ADL(4,1) model, fitted with UV index and total ozone data for the period 2001-2012, produced a projection/extrapolation for the next 30 years (2013-2043), indicating at the end of that period an increase of approximately one unit in the UV index, provided total ozone maintains the downward trend observed in the study period
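A hedged sketch of an ADL(4,1) specification of the kind described above: the UV index regressed on four of its own lags and on current and lagged total ozone, estimated by OLS with statsmodels. The simulated monthly series and the exact regressor set are assumptions standing in for the 2001-2012 data.

```python
# Hedged sketch of an ADL(4,1) specification: the monthly UV index regressed on
# four of its own lags and on current and one-month-lagged total ozone, estimated
# by OLS. The monthly series below are simulated stand-ins for the real data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 144                                               # 12 years of monthly data
ozone = 270 + 10 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 3, n)
uv = 12 - 0.02 * (ozone - 270) + rng.normal(0, 0.5, n)

df = pd.DataFrame({"uv": uv, "ozone": ozone})
for lag in range(1, 5):
    df[f"uv_lag{lag}"] = df["uv"].shift(lag)          # autoregressive terms
df["ozone_lag1"] = df["ozone"].shift(1)               # distributed-lag term
df = df.dropna()

X = sm.add_constant(df[["uv_lag1", "uv_lag2", "uv_lag3", "uv_lag4",
                        "ozone", "ozone_lag1"]])
model = sm.OLS(df["uv"], X).fit()
print(model.summary().tables[1])                      # coefficient estimates
```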