883 results for FORECASTING
Abstract:
This paper analyzes the in-sample and out-of-sample predictability of stock market returns from the Eurozone's banking sectors, arising from bank-specific ratios and macroeconomic variables, using panel estimation techniques. To do so, I set up an unbalanced panel of 116 banks' returns, from April 1991 to March 2013, to constitute equal-weighted, country-sorted portfolios representative of the Austrian, Belgian, Finnish, French, German, Greek, Irish, Italian, Portuguese and Spanish banking sectors. I find that both earnings per share (EPS) and the ratio of total loans to total assets have in-sample predictive power over the portfolios' monthly returns whereas, regarding the cross-section of annual returns, only EPS retains significant explanatory power. Nevertheless, the sign associated with the impact of EPS is contrary to the results of past literature. When looking at inter-yearly horizon returns, I document in-sample predictive power arising from the ratios of provisions to net interest income and of non-interest income to net income. Regarding the out-of-sample performance of the proposed models, I find that these would only beat the portfolios' historical mean in the month following the disclosure of year-end financial statements. Still, the evidence found is not statistically significant. Finally, in a last attempt to find significant evidence of predictability of monthly and annual returns, I use the Fama-French three-factor and Carhart models to describe the cross-section of returns. Although, in-sample, the factors can significantly track the Eurozone's banking sectors' stock market returns, they do not beat the portfolios' historical mean when forecasting returns.
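The out-of-sample benchmark used in this abstract, beating the portfolio's historical mean, can be sketched as a Campbell-Thompson-style out-of-sample R² test. The sketch below is illustrative only: the return series and the zero-forecast "model" are invented, not the paper's panel data.

```python
# Illustrative out-of-sample test: a candidate forecast "beats" the
# historical mean only if its mean squared forecast error (MSFE) is lower.

def msfe(forecasts, actuals):
    """Mean squared forecast error."""
    return sum((f - a) ** 2 for f, a in zip(forecasts, actuals)) / len(actuals)

def out_of_sample_r2(model_forecasts, returns):
    """Out-of-sample R^2 against the expanding historical mean;
    positive values mean the model beats the historical-mean benchmark."""
    # The expanding mean uses only data available at each forecast date.
    bench = [sum(returns[:t]) / t for t in range(1, len(returns))]
    model = model_forecasts[1:]
    actual = returns[1:]
    return 1 - msfe(model, actual) / msfe(bench, actual)

monthly_returns = [0.01, -0.02, 0.015, 0.005, -0.01, 0.02, 0.0, 0.01]
naive_model = [0.0] * len(monthly_returns)  # hypothetical model output
print(round(out_of_sample_r2(naive_model, monthly_returns), 4))
```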
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Double Degree in Economics from the Nova School of Business and Economics and Maastricht University
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master’s Double Degree in Finance and Financial Economics from NOVA – School of Business and Economics and Maastricht University
Abstract:
ABSTRACT - Introduction: This study describes the impact scenarios that a possible influenza pandemic could have on the Portuguese population and on the health services. It is an updated version of the preliminary scenarios that have been developed and discussed since 2005. Material and methods: The scenarios assume that the pandemic will occur in two waves, of which the first (attack rate: 10%) will be less intense than the second (attack rates: 20%, 25% or 30%). Only the scenarios for the most severe situation (overall attack rate = 10% + 30%) are described here. The scenarios were built using the method proposed by Meltzer, M. I., Cox, N. J. and Fukuda, K. (1999), but with almost all parameters adapted to the Portuguese population. This adaptation covered: 1. duration of the pandemic; 2. case-fatality rate; 3. percentage of the population at high risk of complications; 4. percentage of patients with suspected influenza who will seek a consultation; 5. time between symptom onset and seeking care; 6. percentage of patients who will have effective access to an antiviral; 7. influenza hospitalisation rate and mean length of hospital stay; 8. percentage of hospitalised patients who will need intensive care (IC) and length of stay in IC; 9. effectiveness of oseltamivir in preventing complications and death. Results: The scenarios corresponding to the most severe situation (overall attack rate: 10% + 30%) are presented both without any intervention and with the therapeutic use of oseltamivir. The results without intervention for the "likely" scenario indicate: • total number of cases — 4 142 447; • total number of individuals needing a consultation — 5 799 426; • total number of hospitalisations — 113 712; • total number of intensive care admissions — 17 057; • total number of deaths — 32 051; • total number of deaths in the peak weeks — 1st wave: 2 551, 2nd wave: 7 651.
When the scenarios were simulated taking into account the use of oseltamivir (considering an effectiveness of 10% and 30%), a reduction in the calculated numbers of deaths and hospitalisations was observed. This article also presents the weekly distribution of the various results over the course of the pandemic. Discussion: The results presented should be interpreted as "scenarios" and not as "forecasts". Indeed, the existing uncertainties regarding the disease and its agent do not allow its impact on the population and on the health services to be predicted with sufficient accuracy. The scenarios presented here therefore serve, above all, planning purposes: preparation of the response to a possible pandemic can be supported by figures whose orders of magnitude correspond to the most severe situations. Accordingly, their use for other purposes is inappropriate and is strongly discouraged by the authors.
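The scenario arithmetic of the Meltzer-type method referred to above can be sketched as a chain of products of population, attack rates and outcome-specific rates. All parameter values below are hypothetical placeholders, not the calibrated Portuguese parameters of the study.

```python
# Minimal scenario calculator: each outcome is population x attack rate
# x an outcome-specific rate. Rates here are illustrative only.

def scenario(population, attack_rates, hosp_rate, icu_rate, cfr):
    """attack_rates: one rate per pandemic wave; returns outcome totals."""
    cases = sum(population * ar for ar in attack_rates)  # both waves
    return {
        "cases": cases,
        "hospitalisations": cases * hosp_rate,
        "icu_admissions": cases * hosp_rate * icu_rate,
        "deaths": cases * cfr,
    }

# Two waves (10% then 30%), hypothetical population and rates.
out = scenario(10_000_000, [0.10, 0.30], hosp_rate=0.027, icu_rate=0.15, cfr=0.008)
print({k: round(v) for k, v in out.items()})
```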
Abstract:
Forests are an important source of natural resources and play a central role in environmental sustainability. Their management, both territorial and economic, leads to maximised production without changing the quality of the raw material. More than a third of Portugal's territory is covered by forest, offering scope for applying territorial and economic management systems that maximise production. Geographic Information Systems (GIS) are models of reality in which all the available information on a subject can be integrated on the basis of a field common to all variables: geographic location. GIS can contribute in several ways to the further development of forest planning and management routines and tools, and their integration with quantitative models for forest planning and management is an asset in this field. This dissertation presents geostatistical models, supported by Geographic Information Systems, for pine cone production in stone pine (Pinus pinea L.), seeking to estimate the areas most suited to production from sample data. The data were studied beforehand and four variables were selected: crown width, basal area, tree height and pine cone production. The geostatistics applied includes spatial correlation models: kriging, in which weights are assigned to the samples on the basis of a spatial analysis built on the experimental variogram. The Geostatistical Analyst extension of ESRI's ArcGis was used to run 96 krigings for the four variables under study, with different parameterisations; of these, 8 krigings were selected on the basis of the model adequacy criteria and the analysis of the prediction errors (cross-validation).
The result of this study is presented as prediction maps for pine cone production in stone pine, in which areas with higher and lower production probability were analysed and comparisons between variables were carried out. By intersecting all the variables with production, we can conclude that the municipalities in the study area with the largest areas of probable pine cone production in stone pine are Alcácer do Sal, Montemor-o-Novo, Vendas Novas, Coruche and Chamusca. By crossing the kriging results with the Land Use and Land Cover Map of mainland Portugal for 2007 (COS2007), prediction maps for the expansion of stone pine were produced. In the expansion areas, minimum increases of around 11% and maximum increases of around 61% were achieved. In total, approximately 128 thousand hectares can be reached as stone pine expansion area. This matches the values expected by the Regional Forest Management Plans covered by the study sample area, which anticipate an increase of about 130 thousand hectares of stone pine area by 2030.
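The experimental variogram on which the kriging weights mentioned above are based can be sketched in a few lines: half the mean squared difference of a variable between sample pairs, binned by separation distance. The coordinates and pine-cone yields below are invented for illustration.

```python
# Stdlib-only experimental (semi)variogram: for each distance bin,
# average 0.5 * (z_i - z_j)^2 over all sample pairs in the bin.
import math

def experimental_variogram(points, values, lag, n_lags):
    """Return a list of (lag_midpoint, semivariance) over distance bins."""
    sums = [0.0] * n_lags
    counts = [0] * n_lags
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            b = int(d // lag)
            if b < n_lags:
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [((b + 0.5) * lag, sums[b] / counts[b])
            for b in range(n_lags) if counts[b]]

pts = [(0, 0), (1, 0), (0, 1), (2, 2), (3, 1)]       # hypothetical locations
cone_yield = [10.0, 12.0, 11.0, 18.0, 20.0]          # hypothetical yields
print(experimental_variogram(pts, cone_yield, lag=1.5, n_lags=3))
```

Semivariance rising with distance, as it does here, is the spatial correlation that kriging exploits.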
Abstract:
This paper addresses the growing difficulties automobile manufacturers face in their after-sales business: an increasing number of trade obstacles set up by importing countries discriminate against foreign suppliers and impede the international sale of genuine parts. The purpose of the study is to explore the emergence of trade-restrictive product certification systems, which affect the spare-parts exports of automobile manufacturers. The methodology includes a review of the literature and an empirical study based on qualitative interviews with representatives of major stakeholders in the automotive after-sales business. Relevant key drivers that trigger the introduction of technical regulations in importing countries are identified and analysed to evaluate their effect on the emerging trade policy. The analysis of the key drivers shows that several interacting components, such as the global competitiveness of the country, macroeconomic and microeconomic factors, and certain country-specific variables, induce trade-restrictive product certification systems. The findings allow for early detection of the emergence of product certification systems and provide a means to recognise early the risks and opportunities for the sale of automotive spare parts in the automakers' target markets. This allows the manufacturers to react immediately and adapt in time to upcoming changes.
Transcatheter aortic valve implantation (TAVI): state of the art techniques and future perspectives.
Abstract:
Transcatheter aortic valve therapies are the newest established techniques for the treatment of high-risk patients affected by severe symptomatic aortic valve stenosis. The transapical approach requires a left anterolateral mini-thoracotomy, whereas the transfemoral method requires adequate peripheral vascular access and can be performed fully percutaneously. Alternatively, the trans-subclavian access has recently been proposed as a third promising approach. Depending on the technique, fine stent-valve positioning can be performed with or without contrast injections. The transapical echo-guided stent-valve implantation without angiography (the Lausanne technique) relies entirely on transoesophageal echocardiographic imaging for fine stent-valve positioning, and this technique has been shown to prevent the onset of postoperative contrast-related acute kidney failure. Recently published reports have shown good hospital outcomes and short-term results after transcatheter aortic valve implantation, but there are no proven advantages of the transfemoral over the transapical technique, or vice versa. In particular, the transapical series have a higher mean logistic EuroSCORE of 27-35%, a procedural success rate above 95% and a mean 30-day mortality between 7.5% and 17.5%, whereas the transfemoral results show a lower logistic EuroSCORE of 23-25.5%, a procedural success rate above 90% and a 30-day mortality of 7-10.8%. Nevertheless, further clinical trials and long-term results are mandatory to confirm this positive trend. Future perspectives in transcatheter aortic valve therapies include the development of intravascular devices for the ablation of the diseased valve leaflets and the launch of new stent-valves with improved haemodynamics, different sizes and smaller delivery systems.
Abstract:
The historically reactive approach to identifying and mitigating safety problems involves selecting black spots or hot spots by ranking locations based on crash frequency and severity. The approach focuses mainly on the corridor level without taking into consideration the exposure rate (vehicle miles traveled) and the socio-demographic information of the study area, which are very important in the transportation planning process. A larger analysis unit at the Transportation Analysis Zone (TAZ) level, or the network planning level, should be used to address the future development needs of the community and to incorporate safety into the long-range transportation planning process. In this study, existing planning tools (such as the PLANSAFE models presented in NCHRP Report 546) were evaluated for forecasting safety in small and medium-sized communities, particularly as related to changes in socio-demographic characteristics, traffic demand, road network, and countermeasures. The research also evaluated the applicability of the Empirical Bayes (EB) method to network-level analysis. In addition, application of the United States Road Assessment Program (usRAP) protocols at the local urban road network level was investigated. This research evaluated the applicability of these three methods for the City of Ames, Iowa. The outcome of this research is a systematic process and framework for considering road safety issues explicitly in the small and medium-sized community transportation planning process and for quantifying the safety impacts of new developments and policy programs.
More specifically, quantitative safety may be incorporated into the planning process through effective visualization and increased awareness of safety issues (usRAP), the identification of high-risk locations with potential for improvement (usRAP maps and EB), countermeasures for high-risk locations (EB before-and-after study and PLANSAFE), and socio-economic and demographic induced changes at the planning level (PLANSAFE).
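The Empirical Bayes (EB) method mentioned above can be sketched as a weighted blend of a site's observed crash count and the count predicted by a safety performance function (SPF), with the weight set by the negative-binomial overdispersion parameter. All numbers below are illustrative, not Ames data.

```python
# Standard EB weighting: trust the SPF prediction more when the
# overdispersion-based weight w is large, the observed count otherwise.

def eb_estimate(observed, predicted, overdispersion, years):
    """EB-expected crashes for a site over the study period.
    observed: crashes seen; predicted: SPF crashes per year."""
    w = 1.0 / (1.0 + overdispersion * predicted * years)
    return w * predicted * years + (1.0 - w) * observed

# Hypothetical site: 12 observed crashes in 3 years, SPF predicts 2/year.
print(round(eb_estimate(observed=12, predicted=2.0, overdispersion=0.4, years=3), 3))
```

The EB estimate pulls the raw count toward the SPF prediction, which is what corrects the regression-to-the-mean bias of naive hot-spot ranking.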
Abstract:
The aim of this work was to design and implement production optimisation for a combined heat and power plant. The optimisation criterion is the profitability of production. The goal was to create an optimisation model that takes into account, in particular, changes in the district heat consumption forecast and fluctuations in the electricity spot price. The most essential criterion for production is meeting the district heat load, estimated from the district heat consumption forecast, as efficiently and economically as possible. The most significant criteria for electricity production turned out to be the predictability of electricity production and the maximisation of production within the limits set by the electricity spot price. The optimisation program is not intended to be connected directly to the power plant's control system; instead, it is intended to be a separate tool for the dispatch planner. Dispatch planning itself is often governed by more varied planning criteria than production profitability alone. The weightings of these different criteria are not considered in the program; they are decided by the dispatch planner. The result was an optimisation program that calculates the total revenues of the selected production alternatives on the basis of different district heat consumption forecasts and electricity spot price forecasts.
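The profitability criterion described above can be sketched as a simple enumeration: for each feasible operating point of the plant, revenue from spot-priced electricity minus fuel cost, subject to covering the forecast district heat load. The operating points and prices below are invented; the thesis's actual optimisation model is necessarily richer.

```python
# Toy CHP dispatch: pick the operating point that covers the heat
# demand forecast and maximises hourly profit at the given spot price.

def best_operating_point(points, heat_demand, spot_price, fuel_price):
    """points: list of (heat_MW, power_MW, fuel_MW) tuples; returns the
    feasible point with the highest hourly profit, or None if none covers
    the heat demand."""
    feasible = [p for p in points if p[0] >= heat_demand]
    if not feasible:
        return None
    def profit(p):
        heat, power, fuel = p
        return power * spot_price - fuel * fuel_price  # EUR per hour
    return max(feasible, key=profit)

ops = [(40, 20, 70), (60, 30, 105), (80, 38, 140)]  # hypothetical points
print(best_operating_point(ops, heat_demand=55, spot_price=50, fuel_price=12))
```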
Abstract:
QUESTIONS UNDER STUDY: Since tumour burden consumes substantial healthcare resources, precise estimates of cancer incidence are pivotal to defining the future needs of national healthcare. This study aimed to estimate the incidence and mortality rates of oesophageal, gastric, pancreatic, hepatic and colorectal cancers up to 2030 in Switzerland. METHODS: Swiss Statistics provides national incidence and mortality rates for various cancers, and models of future developments of the Swiss population. Cancer incidence and mortality rates from 1985 to 2009 were analysed to estimate trends and to predict incidence and mortality rates up to 2029. Linear regressions and Joinpoint analyses were performed to estimate the future trends of incidence and mortality rates. RESULTS: Crude incidences of oesophageal, pancreatic, liver and colorectal cancers have steadily increased since 1985 and will continue to increase. Gastric cancer incidence and mortality rates show an ongoing decrease. Crude mortality rates for pancreatic and liver cancer will keep increasing, whereas colorectal cancer mortality, by contrast, will fall. Mortality from oesophageal cancer will plateau or increase minimally. If European population-standardised incidence rates are considered, oesophageal, pancreatic and colorectal cancer incidences are steady, gastric cancers are diminishing, and liver cancers will follow an increasing trend. Standardised mortality rates show a diminution for all but liver cancer. CONCLUSIONS: The oncological burden of gastrointestinal cancer will increase significantly in Switzerland during the next two decades. The crude mortality rates globally show an ongoing increase except for gastric and colorectal cancers. Expanded healthcare resources will be needed to care properly for these complex patient groups.
Abstract:
The main objective of this master's thesis is to examine whether Weibull analysis is a suitable method for warranty forecasting in the Case Company. The Case Company has used Reliasoft's Weibull++ software, which is based on the Weibull method, but has noticed that the analysis has not given correct results. This study was conducted by running Weibull simulations in different profit centres of the Case Company and then comparing actual costs with forecasted costs. Simulations were made using different time frames and two methods for determining future deliveries. The first sub-objective is to examine which simulation parameters give the best result for each profit centre. The second sub-objective is to create a simple control model for following forecasted costs and actual realised costs. The third sub-objective is to document all Qlikview parameters of the profit centres. This study is constructive research, and solutions to the company's problems are worked out in this master's thesis. The theory part introduces quality issues, for example what quality is, quality costing and the cost of poor quality. Quality is one of the major concerns in the Case Company, so understanding the link between quality and warranty forecasting is important. Warranty management and other tools for warranty forecasting were also introduced, as were the Weibull method, its mathematical properties and reliability engineering. The main result of this master's thesis is that the Weibull analysis forecasted too high costs when calculating the provision. Although some forecasted values of profit centres were lower than the actual values, the method works better for planning purposes. One of the reasons is that quality improvement, or alternatively quality deterioration, does not show in the results of the analysis in the short run.
The other reason for the excessively high values is that the products of the Case Company are complex and the analyses were made at the profit-centre level. The Weibull method was developed for standard products, but the products of the Case Company consist of many complex components. According to the theory, the method was developed for homogeneous data, so the most important finding is that the analysis should be made at the product level, not the profit-centre level, where the data are more homogeneous.
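The Weibull reasoning behind the analysis above can be sketched as follows: the fraction of units failing within the warranty period follows the Weibull CDF, so the expected number of claims is deliveries times CDF(t). The shape and scale parameters and the delivery count below are illustrative placeholders, not Case Company figures.

```python
# Expected warranty claims from the two-parameter Weibull distribution:
# F(t) = 1 - exp(-(t/scale)^shape).
import math

def weibull_cdf(t, shape, scale):
    """Probability that a unit fails by time t."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def expected_claims(deliveries, warranty_months, shape, scale):
    """Expected number of warranty failures among delivered units."""
    return deliveries * weibull_cdf(warranty_months, shape, scale)

# 500 units, 24-month warranty, hypothetical shape 1.5 and scale 120 months.
print(round(expected_claims(500, 24, shape=1.5, scale=120.0), 1))
```

Fitting shape and scale on mixed, non-homogeneous claim data is exactly where the thesis finds the method breaks down, which is why it recommends product-level analysis.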
Abstract:
The purpose of this study is to examine the impact of the choice of cut-off points, sampling procedures, and the business cycle on the accuracy of bankruptcy prediction models. Misclassification can result in erroneous predictions leading to prohibitive costs for firms, investors and the economy. To test the impact of the choice of cut-off points and sampling procedures, three bankruptcy prediction models are assessed: Bayesian, Hazard and Mixed Logit. A salient feature of the study is that the analysis includes both parametric and nonparametric bankruptcy prediction models. A sample of firms from the Lynn M. LoPucki Bankruptcy Research Database in the U.S. was used to evaluate the relative performance of the three models. The choice of cut-off point and sampling procedure was found to affect the rankings of the various models. In general, the results indicate that the empirical cut-off point estimated from the training sample resulted in the lowest misclassification costs for all three models. Although the Hazard and Mixed Logit models resulted in lower misclassification costs in the randomly selected samples, the Mixed Logit model did not perform as well across varying business cycles. In general, the Hazard model has the highest predictive power. However, the higher predictive power of the Bayesian model when the ratio of the cost of Type I errors to the cost of Type II errors is high is relatively consistent across all sampling methods. Such an advantage may make the Bayesian model more attractive in the current economic environment. This study extends recent research comparing the performance of bankruptcy prediction models by identifying the conditions under which a model performs better. It also addresses the concerns of a range of user groups, including auditors, shareholders, employees, suppliers, rating agencies and creditors, with respect to assessing failure risk.
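The empirical cut-off selection described above can be sketched as a scan over candidate cut-offs on the training sample, keeping the one with the lowest total misclassification cost when Type I errors (labelling a bankrupt firm healthy) cost more than Type II errors. The scores, labels and cost ratio below are invented for illustration.

```python
# Pick the cut-off minimising total misclassification cost on training data.

def total_cost(scores, bankrupt, cutoff, c1, c2):
    """Classify as bankrupt when score >= cutoff; c1/c2 are the costs of
    Type I (missed bankruptcy) and Type II (false alarm) errors."""
    type1 = sum(1 for s, b in zip(scores, bankrupt) if b and s < cutoff)
    type2 = sum(1 for s, b in zip(scores, bankrupt) if not b and s >= cutoff)
    return c1 * type1 + c2 * type2

def empirical_cutoff(scores, bankrupt, c1=10.0, c2=1.0):
    """Scan every observed score as a candidate cut-off."""
    return min(sorted(set(scores)),
               key=lambda c: total_cost(scores, bankrupt, c, c1, c2))

scores =   [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]  # hypothetical risk scores
bankrupt = [1,   1,   0,   1,   0,   0,   1,   0]    # 1 = went bankrupt
print(empirical_cutoff(scores, bankrupt))
```

With a 10:1 cost ratio the scan prefers a low cut-off that tolerates false alarms to avoid missing bankrupt firms.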
Abstract:
For predicting future volatility, empirical studies find mixed results on two issues: (1) whether model-free implied volatility has more information content than Black-Scholes model-based implied volatility; (2) whether implied volatility outperforms historical volatilities. In this thesis, we address these two issues using Canadian financial data. First, we examine the information content and forecasting power of VIXC, a model-free implied volatility, and MVX, a model-based implied volatility. The GARCH in-sample test indicates that VIXC subsumes all information that is reflected in MVX. The out-of-sample examination indicates that VIXC is superior to MVX for predicting the next 1-, 5-, 10-, and 22-trading-day realized volatility. Second, we investigate the predictive power of VIXC against alternative volatility forecasts derived from historical index prices. We find that for time horizons shorter than 10 trading days, VIXC provides more accurate forecasts. However, for longer time horizons, the historical volatilities, particularly the random walk, provide better forecasts. We conclude that VIXC cannot incorporate all the information contained in historical index prices for predicting future volatility.
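The random-walk benchmark mentioned above can be sketched by forecasting each period's realized volatility with the previous period's value and scoring the forecasts by root-mean-squared error. The return series below is made up, not Canadian market data.

```python
# Realized volatility per non-overlapping window, and the RMSE of the
# "random walk" forecast (previous window's realized vol).
import math

def realized_vol(returns):
    """Square root of the sum of squared returns over the window."""
    return math.sqrt(sum(r * r for r in returns))

def random_walk_rmse(returns, window):
    """RMSE of forecasting each window's realized vol with the prior one."""
    vols = [realized_vol(returns[i:i + window])
            for i in range(0, len(returns) - window + 1, window)]
    errs = [(vols[t] - vols[t - 1]) ** 2 for t in range(1, len(vols))]
    return math.sqrt(sum(errs) / len(errs))

rets = [0.01, -0.02, 0.015, -0.005, 0.02, -0.01, 0.005, 0.01]
print(round(random_walk_rmse(rets, window=4), 4))
```

Computing the same RMSE for an implied-volatility forecast and comparing the two numbers is the horse race the thesis runs at each horizon.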
Abstract:
The Meese-Rogoff forecasting puzzle states that foreign exchange (FX) rates are unpredictable. Since one country's macroeconomic conditions could affect the price of its national currency, we study the dynamic relations between FX rates and some macroeconomic accounts. Our research tests whether the predictability of FX rates could be improved through advanced econometrics. Improving the predictability of FX rates has important implications for various groups, including investors, business entities and governments. The present thesis examines the dynamic relations between FX rates, savings and investments for a sample of 25 countries from the Organisation for Economic Co-operation and Development. We use quarterly data on FX rates, macroeconomic indices and accounts, including savings and investments, over three decades. Through preliminary augmented Dickey-Fuller unit root tests and Johansen cointegration tests, we find that the savings rate and the investment rate are cointegrated with the vector (1, -1). This result is consistent with many previous studies on the savings-investment relation and therefore confirms the validity of the Feldstein-Horioka puzzle. Because of this special cointegrating relation between the savings rate and the investment rate, we introduce the savings-investment rate differential (SID). Investigating each country through a vector autoregression (VAR) model, we observe extremely insignificant coefficient estimates of the historical SIDs on the present FX rates. We report similar findings for the panel VAR approach. We thus conclude that the historical SIDs are of no use in forecasting the FX rate. Nonetheless, the coefficients of the past FX rates on the current SIDs, for both the country-specific and the panel VAR models, are statistically significant. Therefore, we conclude that the historical FX rates can conversely predict the SID to some degree. Specifically, depreciation of the domestic currency would cause an increase in the SID.
Abstract:
This paper develops and estimates a game-theoretical model of inflation targeting where the central banker's preferences are asymmetric around the targeted rate. In particular, positive deviations from the target can be weighted more, or less, severely than negative ones in the central banker's loss function. It is shown that some of the previous results derived under the assumption of symmetry are not robust to the generalization of preferences. Estimates of the central banker's preference parameters for Canada, Sweden, and the United Kingdom are statistically different from the ones implied by the commonly used quadratic loss function. Econometric results are robust to different forecasting models for the rate of unemployment but not to the use of measures of inflation broader than the one targeted.
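The asymmetric preferences described above are often modelled in this literature with a linex loss; the paper's exact specification may differ, so this is one plausible concrete form. For a positive asymmetry parameter, positive deviations of inflation from target are penalised more than negative ones, and the quadratic loss is recovered as the parameter goes to zero.

```python
# Linex loss L(x) = (exp(a*x) - a*x - 1) / a^2, where x is the deviation
# of inflation from target and a is the asymmetry parameter.
import math

def linex_loss(deviation, a):
    """Asymmetric loss; approaches 0.5 * deviation**2 as a -> 0."""
    return (math.exp(a * deviation) - a * deviation - 1.0) / a ** 2

# With a = 1.5, overshooting the target by one point hurts more than
# undershooting by one point; a symmetric quadratic would treat them alike.
print(round(linex_loss(1.0, 1.5), 3), round(linex_loss(-1.0, 1.5), 3))
```

Estimating the asymmetry parameter and testing whether it differs from zero is, in essence, the test of the quadratic-loss assumption described above.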