358 results for Previsões

Relevance: 10.00%

Abstract:

This study evaluates the forecasts of three nonlinear methods (the Markov Switching Autoregressive Model, the Logistic Smooth Transition Autoregressive Model, and Autometrics with Dummy Saturation) for Brazilian monthly industrial production, and tests whether they are more accurate than those of naive predictors such as the autoregressive model of order p and the double-differencing device. The results show that step-dummy saturation and the Logistic Smooth Transition Autoregressive Model can outperform double differencing, but the linear autoregressive model is more accurate than all the other methods analyzed.
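
As a rough illustration of the two naive benchmarks named above, the sketch below fits an AR(p) model and builds a double-differencing forecast on a simulated series; the series, the lag order p = 4, and the horizon are stand-ins, not the study's data.

```python
# Two naive benchmarks on a simulated series (stand-in for log industrial
# production): an AR(4) model and the double-differencing device.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.2, 1.0, 240))   # 20 years of monthly data (toy)

h = 12                                     # forecast horizon in months
train, test = y[:-h], y[-h:]

# AR(p) benchmark; lag order p = 4 chosen purely for illustration
ar_fc = AutoReg(train, lags=4).fit().forecast(steps=h)

# Double differencing treats the second difference as unpredictable, so the
# h-step forecast extrapolates the last first difference:
# y_{T+h|T} = y_T + h * (y_T - y_{T-1})
dd_fc = train[-1] + np.arange(1, h + 1) * (train[-1] - train[-2])

for name, fc in (("AR(4)", ar_fc), ("double differencing", dd_fc)):
    print(name, "RMSE:", np.sqrt(np.mean((test - fc) ** 2)))
```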

Relevance: 10.00%

Abstract:

Debt financing decisions affect capital structure by changing leverage, debt ownership, and debt maturity. The most popular theories of debt composition predict a negative stock-price effect when a firm issues debentures. My results do not confirm this effect, at least not directly. However, the determinants of issuance are consistent with the theoretical predictions, with some particularities of the Brazilian economy.

Relevance: 10.00%

Abstract:

This study seeks to identify which variables are most relevant for forecasting Brazil's real exchange rate and to assess the robustness of those forecasts. Johansen cointegration tests were run on 13 macroeconomic variables. The dataset consists of quarterly series covering 1970 to 2014, and the tests were applied to the series combined two by two, three by three, and four by four. This procedure yielded nine groups of mutually cointegrated variables. Using these groups, out-of-sample forecasts were produced over the last 60 observations. Forecast quality was assessed by mean squared error, the Diebold-Mariano test, and Hansen's procedure, with a random-walk model of the real exchange rate as the benchmark. All tests show that, as the forecast horizon lengthens, the random walk loses predictive power and most models become more informative about the future of the real effective exchange rate; this holds at horizons of three to four years ahead. In Hansen's test, the random walk is eliminated entirely from the final set of models, showing that it is possible to produce forecasts superior to the random walk.
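
A minimal sketch of the pairwise screening step, assuming the series live in NumPy arrays; the variable names and data below are hypothetical placeholders for the study's 13 macroeconomic series.

```python
# Run a Johansen trace test on each pair of series and keep the pairs that
# cointegrate at the 5% level.
from itertools import combinations
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(1)
common = np.cumsum(rng.normal(size=180))        # shared stochastic trend
data = {
    "rer": common + rng.normal(size=180),       # real exchange rate (toy)
    "tot": common + rng.normal(size=180),       # terms of trade (toy)
    "prod": np.cumsum(rng.normal(size=180)),    # productivity (toy)
}

cointegrated_pairs = []
for a, b in combinations(data, 2):
    res = coint_johansen(np.column_stack([data[a], data[b]]),
                         det_order=0, k_ar_diff=1)
    # res.lr1[0] is the trace statistic for rank 0; res.cvt[0, 1] its 5% value
    if res.lr1[0] > res.cvt[0, 1]:
        cointegrated_pairs.append((a, b))

print("pairs that cointegrate at 5%:", cointegrated_pairs)
```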

Relevance: 10.00%

Abstract:

This study evaluates the predictive ability of time series econometric models based on macroeconomic indicators for forecasting Brazilian inflation (IPCA). The models are fitted in-sample and their ex-post projections are accumulated from one to twelve months ahead. The forecasts are compared with those of univariate models such as the first-order autoregressive model, AR(1), which is the benchmark chosen in this study. The sample covers January 2000 through August 2015 for model fitting and subsequent evaluation. In all, 1,170 economic variables were evaluated at each period to be projected, searching for the best set of predictors at each point in time. The Autometrics algorithm was used for model selection. Models were compared via the Model Confidence Set developed by Hansen, Lunde and Nason (2010). The results provide evidence of performance gains for the multivariate models at horizons beyond one step ahead.
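
A minimal sketch of the AR(1) benchmark and of accumulating its monthly projections up to twelve months ahead; the simulated series is a placeholder for the IPCA.

```python
# AR(1) benchmark: fit on monthly inflation, forecast 1 to 12 steps ahead,
# then compound the monthly forecasts into accumulated inflation.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(2)
ipca = np.empty(188)                       # Jan/2000 to Aug/2015 is 188 months
ipca[0] = 0.5
for t in range(1, 188):                    # persistent toy inflation, % per month
    ipca[t] = 0.2 + 0.6 * ipca[t - 1] + rng.normal(scale=0.3)

fit = AutoReg(ipca, lags=1).fit()
monthly_fc = fit.forecast(steps=12)        # 1 to 12 months ahead

# accumulate the monthly forecasts into cumulative inflation h months ahead
accumulated = (np.cumprod(1 + monthly_fc / 100) - 1) * 100
print("accumulated forecast, 1..12 months (%):", accumulated.round(2))
```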

Relevance: 10.00%

Abstract:

This study seeks to identify which variables are most relevant for forecasting the real effective exchange rate and to assess the robustness of those forecasts. Johansen cointegration tests were run on 13 macroeconomic variables. The dataset consists of quarterly series, and the tests were applied to the series combined two by two, three by three, and four by four. Using this method, we found models that cointegrate, for each of the countries analyzed. From these models, out-of-sample forecasts were produced over the last 60 observations. Forecast quality was assessed using mean squared error (MSE) and Hansen's Model Confidence Set (MCS), with a random-walk model of the real exchange rate as the benchmark. All tests show that, as the forecast horizon lengthens, the random walk loses predictive power and most models become more informative about the future of the real effective exchange rate.
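
A minimal sketch of the evaluation step: comparing a model's out-of-sample forecasts with the random-walk benchmark by MSE at several horizons. The arrays are synthetic placeholders, and the MCS procedure itself, which needs a dedicated implementation, is omitted.

```python
# Random-walk errors grow with the horizon, so a model that tracks the data
# overtakes the no-change forecast as h increases.
import numpy as np

rng = np.random.default_rng(3)
last_obs = 0.0                                       # last in-sample value (toy)
actual = last_obs + np.cumsum(rng.normal(size=60))   # 60 held-out quarters

rw_fc = np.full(60, last_obs)                        # random walk: no-change
model_fc = actual + rng.normal(scale=0.5, size=60)   # a model that tracks it

for h in (4, 8, 12, 16):                             # horizons in quarters
    mse_rw = np.mean((actual[:h] - rw_fc[:h]) ** 2)
    mse_model = np.mean((actual[:h] - model_fc[:h]) ** 2)
    print(f"h={h:2d}  MSE(random walk)={mse_rw:7.2f}  MSE(model)={mse_model:7.2f}")
```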

Relevance: 10.00%

Abstract:

The present work analyzes the establishment of a startup's operations and the structuring of all the processes required to start the business, launch the platform, and keep it running. The thesis' main focus can therefore be described as designing and structuring a startup's operations in an emerging market before and during its global launch. This business project aims to provide a case study of the creation of a business and its launch into an emerging market, illustrating a practical example of how to structure a business' operations within a limited time frame. Moreover, this work performs a complete economic analysis of Brazil, thorough analyses of the industries the company is related to, and a competitive analysis of the market the venture operates in. Furthermore, an assessment of the venture's business model and of its first six months of performance is included. The thesis' ultimate goal is to evaluate the company's potential for success in the next few years by highlighting its strengths and critical weaknesses. Beyond providing the company's management with findings and forecasts about its own business, the present work represents a reference and a practical roadmap for any entrepreneur seeking to establish operations in Brazil.

Relevance: 10.00%

Abstract:

The ability to accurately forecast the power output of a wind farm is extremely relevant both economically and for the control and stability of the electrical grid. Several different methods have been used for this purpose, including physical and statistical models, fuzzy logic, and artificial neural networks. The available wind-farm data contain noise and unexpected readings relative to the available inputs. Handling these data is not a simple task, but in this work Artificial Neural Networks are used to predict the generated power based on local wind measurements. The results show that Artificial Neural Networks are a tool worth considering under these difficult conditions, since they provide reasonable forecast accuracy.
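
A minimal sketch of this approach, assuming scikit-learn's MLPRegressor as the network; the toy power curve, the noise level, and the network size are illustrative, not the study's.

```python
# A small feed-forward network mapping local wind measurements to power.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
speed = rng.uniform(0.0, 25.0, 2000)            # wind speed, m/s
direction = rng.uniform(0.0, 360.0, 2000)       # wind direction, degrees
power = 0.5 * np.clip(speed, 3.0, 12.0) ** 3    # toy cubic power curve, kW
power += rng.normal(scale=30.0, size=2000)      # noisy, imperfect readings

X = np.column_stack([speed / 25.0, direction / 360.0])   # crude scaling
X_tr, X_te, y_tr, y_te = train_test_split(X, power, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(ann.score(X_te, y_te), 3))
```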

Relevance: 10.00%

Abstract:

Forecasting is the basis for making strategic, tactical, and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. Although many methods exist to assist in time series forecasting, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies of more advanced prediction methods. Among these, Artificial Neural Networks (ANNs) are a relatively new and promising method for business forecasting that has attracted much interest in the financial community and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to ARIMA-GARCH statistical models. In this context, this study examined whether ANNs are a more appropriate method for predicting the behavior of capital market indices than traditional time series methods. To this end, a quantitative study based on financial and economic indices was developed, and two supervised-learning feedforward ANN models were built, each with 20 inputs, one hidden layer of 90 neurons, and a single output (the Ibovespa). These models used backpropagation, a hyperbolic tangent sigmoid activation function, and a linear output function. To analyze the suitability of ANNs for forecasting the Ibovespa, their results were compared with those of a GARCH(1,1) time series model. Once both methods (ANN and GARCH) had been applied, the results were analyzed by comparing the forecasts with the historical data and by studying the forecast errors via MSE, RMSE, MAE, standard deviation, Theil's U, and forecast encompassing tests. The models developed with ANNs had lower MSE, RMSE, and MAE than the GARCH(1,1) model, and Theil's U indicated that all three models have smaller errors than a naïve forecast. Although the ANN based on returns had weaker precision indicators than the ANN based on prices, the forecast encompassing test rejected the hypothesis that either model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide more appropriate Ibovespa forecasts than traditional time series models, represented here by the GARCH model.
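
A minimal sketch of the described network (20 inputs, one hidden layer of 90 tanh neurons, a linear output, trained by backpropagation), using scikit-learn on simulated returns rather than Ibovespa data.

```python
# Feed-forward net: 20 lagged returns in, 90 tanh hidden neurons, linear out.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
returns = rng.normal(scale=0.015, size=1500)    # stand-in for index returns

window = 20                                     # 20 inputs, as in the abstract
X = np.array([returns[i:i + window] for i in range(len(returns) - window)])
y = returns[window:]                            # target: next-period return

ann = MLPRegressor(hidden_layer_sizes=(90,), activation="tanh",
                   max_iter=1000, random_state=0)
ann.fit(X[:-250], y[:-250])                     # hold out the last 250 points
pred = ann.predict(X[-250:])

err = y[-250:] - pred
mse = np.mean(err ** 2)
print(f"MSE={mse:.2e}  RMSE={np.sqrt(mse):.4f}  MAE={np.mean(np.abs(err)):.4f}")
```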

Relevance: 10.00%

Abstract:

This work has as its theme the social function of the terrenos de marinha (marine lands). The research universe is the terrenos de marinha along the Natal coastline, focusing on the fulfillment of their social function. Prescribed by law since the colonial period for the purpose of protecting the coast and the free movement of people and goods, they were swathes of land not available for private use by individuals. With the transition from the allotment system to purchase and sale in land access, crystallized by the creation of the Land Law in the nineteenth century, land came to be held as merchandise; the terrenos de marinha, following this logic, also acquired exchange value and became open to enjoyment by private individuals, conditional on tax payments to the state. This persisted until 1988, when, primarily because of the promulgation of the Federal Constitution, a new cycle began in which the principle of the social function of property could be applied to the terrenos de marinha. From this perspective, this study aims to identify the social function of the terrenos de marinha in Natal, focusing on the public destination and the use value of the city's coastline. To this end, data were collected in the online information system of the Federal Heritage Department of Rio Grande do Norte (SPU/RN) and in the terrenos de marinha areas, in order to find out whether they had public or private use or were empty lots, and whether the population has access to the shore. Interviews with SPU managers were also conducted. The empirical study showed that the social function of the terrenos de marinha in the city of Natal has not yet been realized, given the persistence of vacant lots in these areas, the lack of access along significant portions of the coastline, and the reduced areas designated for common use along the shore, which minimize their potential enjoyment by the population. It concludes by pointing to a new transition phase for the terrenos de marinha, in which legal provisions and public policies are gradually emerging to expand the purely tax-collecting function attributed to these lands for two centuries. In this direction, the social function of the terrenos de marinha is embodied in the concurrent adjustment of the tax-collection function and the recovery of the coastline's use value, as national heritage and as a place for sociability and the development of social relations.

Relevance: 10.00%

Abstract:

The world has many types of oil spanning a range of density and viscosity values, characteristics used to identify whether an oil is light, heavy, or even ultra-heavy. The occurrence of heavy oil has increased significantly, pointing to a need for greater investment in the exploitation of these deposits and hence for new methods to recover such oil. Economic forecasts suggest that by 2025 heavy oil will be the main source of fossil energy in the world. One such method is VAPEX (vapor extraction), a recovery method consisting of two horizontal wells parallel to each other, one injector and one producer, in which a vaporized solvent is injected to reduce the viscosity of the oil or bitumen, facilitating its flow to the producing well. This method was proposed by Dr. Roger Butler in 1991. The purpose of this study is to analyze how operational and reservoir parameters influence the VAPEX process, in terms of cumulative oil production, recovery factor, injection rate, and production rate. Parameters such as injection rate, well spacing, type of injected solvent, vertical permeability, and oil viscosity were addressed. The results showed that oil viscosity is the parameter with a statistically significant influence; the choice of heptane as the injected solvent yielded greater oil recovery than the other solvents considered; and, regarding well spacing, greater distances between the wells produced more oil.

Relevance: 10.00%

Abstract:

Injectivity decline, which can be caused by particle retention, generally occurs during water injection or reinjection in oil fields. Several mechanisms, including straining, are responsible for particle retention and pore blocking, causing formation damage and injectivity decline. Predicting formation damage and injectivity decline is essential in waterflooding projects. The Classic Model (CM), which incorporates filtration coefficients and formation damage functions, has been widely used to predict injectivity decline. However, various authors have reported significant discrepancies between the Classic Model and experimental results, motivating the development of deep bed filtration models that consider multiple particle retention mechanisms (Santos & Barros, 2010; SBM). In this dissertation, the solution of the inverse problem was studied and software for experimental data treatment was developed. Finally, experimental data were fitted using both the CM and the SBM. The results showed that, depending on the formation damage function, the injectivity decline predicted by the CM and SBM models can differ significantly.
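
For orientation, a minimal sketch of the Classic Model's behavior under a constant filtration coefficient, where the suspended concentration decays exponentially with depth after breakthrough; the numbers are illustrative, not fitted to the dissertation's experiments.

```python
# Classic deep bed filtration with a constant filtration coefficient lam:
# after breakthrough, suspended concentration decays as c(x) = c0*exp(-lam*x)
# and retained particles accumulate as sigma(x, t) = lam * u * c(x) * t.
import numpy as np

c0 = 1.0        # injected suspension concentration (normalized)
lam = 2.0       # filtration coefficient, 1/m
u = 1e-5        # Darcy velocity, m/s
L = 0.2         # core length, m

x = np.linspace(0.0, L, 50)
c = c0 * np.exp(-lam * x)                 # steady suspended profile
t = 3600.0                                # one hour of injection, s
sigma = lam * u * c * t                   # retained concentration profile

print("effluent/injected concentration:", round(c[-1] / c0, 3))
print("retained at inlet vs outlet:", sigma[0].round(4), sigma[-1].round(4))
```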

Relevance: 10.00%

Abstract:

Discrepancies between classical model predictions and experimental data for deep bed filtration have been reported by various authors. In order to understand these discrepancies, an analytic continuum model for deep bed filtration is proposed. In this model, a filter coefficient is attributed to each distinct retention mechanism (straining, diffusion, gravity interception, etc.). It was shown that these coefficients generally cannot be merged into an effective filter coefficient, as assumed in the classical model. Furthermore, the derived analytic solutions for the proposed model were applied to fit experimental data, and very good agreement between the experimental data and the proposed model's predictions was obtained. Comparison of the results with empirical correlations allowed the dominant retention mechanisms to be identified. In addition, it was shown that the larger the ratio of particle to pore sizes, the more intensive the straining mechanism and the larger the discrepancies between experimental data and classical model predictions. The classical and proposed models were compared via statistical analysis; the obtained p-values allow concluding that the proposed model should be preferred, especially when straining plays an important role. This work also studies the filtration of particles through porous media with finite retention capacity. In this case, it was observed that the boundary conditions must be allowed to change over time. A solution for such a model was obtained for different filtration coefficient functions, and it was shown how to construct a solution for any filtration coefficient. It was seen that, even with the same filtration coefficient, the classical model and the one proposed here yield different predictions for the concentration of particles retained in the porous medium and for the suspended particles at the exit of the medium.
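
Schematically, and assuming the standard deep-bed-filtration form (the paper's exact kinetics may differ), the proposed model attaches one kinetic equation per mechanism:

```latex
% Mass balance for suspended concentration c and retained concentrations
% sigma_i, one kinetic equation per retention mechanism i (straining,
% diffusion, gravity interception, ...); schematic form only.
\begin{align}
  \phi\,\frac{\partial c}{\partial t} + u\,\frac{\partial c}{\partial x}
    &= -\sum_i \frac{\partial \sigma_i}{\partial t}, &
  \frac{\partial \sigma_i}{\partial t} &= \lambda_i(\sigma_i)\, u\, c .
\end{align}
```

Unless every coefficient \lambda_i is constant, the sum of the kinetic terms cannot in general be collapsed into a single effective coefficient acting on the total retained concentration, which is the non-mergeability the abstract refers to.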

Relevance: 10.00%

Abstract:

The increase in ultraviolet (UV) radiation at the surface, the high incidence of non-melanoma skin cancer (NMSC) on the coast of Northeast Brazil (NEB), and the reduction in total ozone motivated the present study. The overall objective was to identify and understand the variability of UV radiation, expressed as the UV Index, in the capitals of the east coast of the NEB, and to fit stochastic models to the UV Index time series in order to make predictions (interpolations) and forecasts/projections (extrapolations), followed by trend analysis. The methodology consisted of multivariate analysis (principal component analysis and cluster analysis), the Predictive Mean Matching method for filling gaps in the data, autoregressive distributed lag (ADL) models, and the Mann-Kendall test. Modeling via ADL consisted of parameter estimation, diagnostics, residual analysis, and evaluation of the quality of the predictions and forecasts via mean squared error and the Pearson correlation coefficient. The results indicated that the annual variability of UV in the capital of Rio Grande do Norte (Natal) has a feature in September and October consisting of a stabilization/reduction of the UV Index because of the annual peak in total ozone concentration; the increased amount of aerosol during this period contributes to this event with lesser intensity. Cluster analysis of the east coast of the NEB showed that this event also occurs in the capitals of Paraíba (João Pessoa) and Pernambuco (Recife). Extreme UV events in the NEB were analyzed from the city of Natal and were associated with an absence of cloud cover and total ozone levels below the annual average; they do not occur across the entire region because of the uneven spatial distribution of these variables. The ADL(4, 1) model, fitted with UV Index and total ozone data for 2001-2012, produced a projection/extrapolation for the next 30 years (2013-2043), indicating by the end of that period an increase in the UV Index of approximately one unit, should total ozone maintain the downward trend observed in the study period.
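
A minimal sketch of fitting an ADL(4, 1) regression by OLS, four lags of the UV Index plus one lag of total ozone, on simulated monthly series standing in for the Natal data.

```python
# ADL(4, 1) by OLS: UV index on four of its own lags and one ozone lag.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 144                                         # 12 years of monthly data
ozone = 260 + 10 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(size=n)
uv = 12 - 0.02 * ozone + rng.normal(scale=0.3, size=n)   # toy UV index

p, q = 4, 1
rows = [np.r_[uv[t - p:t][::-1], ozone[t - q:t][::-1]] for t in range(p, n)]
X = sm.add_constant(np.array(rows))             # const, uv lags 1..4, o3 lag 1
y = uv[p:]

adl = sm.OLS(y, X).fit()
print(adl.params.round(3))
```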

Relevance: 10.00%

Abstract:

Quantifying the impact of tillage practices on soil carbon losses depends on the ability to describe the temporal variability of soil CO2 emission after tillage. It has been suggested that the large amounts of CO2 emitted after soil tillage may serve as an indicator of long-term changes in soil carbon stocks. This work presents a two-part model, based on soil temperature and soil moisture and including an exponential term decaying in time, that is efficient in fitting intermediate-term emissions after two tillage treatments: disk plow followed by a pass with a leveling harrow (conventional), and chisel plow followed by a pass with a clod-breaking roller (reduced). Post-tillage emissions are described by nonlinear estimation, with a coefficient of determination (R²) as high as 0.98 after reduced tillage. The results indicate that, in predicting CO2 emission after soil tillage, it is important to include an exponential term that decays with time after tillage.
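
A minimal sketch of such a two-part fit via nonlinear least squares; the functional form and the data below are illustrative assumptions, not the paper's exact specification.

```python
# Two-part emission model: a soil temperature/moisture term multiplied by an
# exponential decay in time since tillage, fitted with scipy's curve_fit.
import numpy as np
from scipy.optimize import curve_fit

def co2_flux(X, a, b, c, k):
    temp, moist, days = X
    return a * np.exp(b * temp) * moist**c * np.exp(-k * days)

rng = np.random.default_rng(8)
days = np.arange(0, 30, dtype=float)            # days after tillage
temp = 22 + 3 * rng.random(30)                  # soil temperature, deg C
moist = 0.15 + 0.1 * rng.random(30)             # soil moisture, m^3/m^3
flux = co2_flux((temp, moist, days), 0.5, 0.08, 0.5, 0.1)
flux *= 1 + rng.normal(scale=0.05, size=30)     # measurement noise

popt, _ = curve_fit(co2_flux, (temp, moist, days), flux,
                    p0=[1.0, 0.05, 0.5, 0.05], maxfev=5000)
resid = flux - co2_flux((temp, moist, days), *popt)
r2 = 1 - resid.var() / flux.var()
print("fitted parameters:", popt.round(3), " R^2 =", round(r2, 3))
```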