916 results for two-step process


Relevance:

80.00%

Publisher:

Abstract:

Random effect models have been widely applied in many fields of research. However, models with uncertain design matrices for random effects have received little attention. In applications with this problem, an expectation method has been used for simplicity, but it discards the extra information carried by the uncertainty in the design matrix. A closed-form solution to this problem is generally difficult to attain, so we propose a two-step algorithm for estimating the parameters, especially the variance components of the model. The implementation is based on Monte Carlo approximation and a Newton-Raphson-based EM algorithm. As an example, a simulated genetics dataset was analyzed. The results showed that the proportion of the total variance explained by the random effects was accurately estimated, whereas the expectation method severely underestimated it. By introducing heuristic search and optimization methods, the algorithm could be extended to infer the 'model-based' best design matrix and the corresponding best estimates.
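For orientation, the EM backbone of such an algorithm can be sketched for a one-way random-effects model with a known design matrix, where the E-step has a closed form; the paper's algorithm replaces this E-step with a Monte Carlo approximation and Newton-Raphson updates to handle the uncertain design matrix (all values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
g, m = 200, 10                                       # groups, obs. per group
u = rng.normal(scale=np.sqrt(2.0), size=g)           # true sigma_u^2 = 2
y = u[:, None] + rng.normal(scale=1.0, size=(g, m))  # true sigma_e^2 = 1

su2, se2 = 1.0, 1.0          # initial guesses for the variance components
for _ in range(200):
    # E-step: posterior moments of each random effect given current variances
    post_var = 1.0 / (1.0 / su2 + m / se2)
    post_mean = post_var * y.sum(axis=1) / se2
    # M-step: update variance components from the expected sufficient statistics
    su2 = np.mean(post_mean**2 + post_var)
    se2 = np.mean((y - post_mean[:, None])**2) + post_var

print(su2, se2)   # estimates close to the true values 2 and 1
```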

Relevance:

80.00%

Publisher:

Abstract:

The Mauri Model DMF is unique in its approach to the management of water resources: the framework offers a transparent and inclusive way of considering the environmental, economic, social and cultural aspects of the decisions being contemplated. It can accommodate multiple worldviews and adopts mauri (intrinsic value or well-being) in place of the more common monetised assessments of pseudo-sustainability based on Cost Benefit Analysis. The Mauri Model DMF uses a two-stage process that first identifies participants' worldviews and inherent biases regarding water resource management, and then facilitates a transparent assessment of selected sustainability performance indicators. The assessment can be read as the separate environmental, economic, social and cultural dimensions of the decision, and collectively as an overall result; alternatively, the priorities associated with different worldviews can be applied to determine the sensitivity of the result to different cultural contexts or worldviews.

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we propose a two-step estimator for panel data models in which a binary covariate is endogenous. In the first stage, a random-effects probit model is estimated with the endogenous variable as the dependent variable. Correction terms are then constructed and included in the main regression.

Relevance:

80.00%

Publisher:

Abstract:

We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We consider model selection criteria which have data-dependent penalties as well as the traditional ones. We suggest a new two-step model selection procedure which is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. Our Monte Carlo simulations measure the improvements in forecasting accuracy that can arise from the joint determination of lag length and rank using our proposed procedure, relative to an unrestricted VAR or a cointegrated VAR estimated by the commonly used procedure of selecting the lag length only and then testing for cointegration. Two empirical applications, forecasting Brazilian inflation and the growth rates of U.S. macroeconomic aggregates respectively, show the usefulness of the model-selection strategy proposed here. The gains in different measures of forecasting accuracy are substantial, especially for short horizons.

Relevance:

80.00%

Publisher:

Abstract:

We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We suggest a new two-step model selection procedure which is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. A Monte Carlo study explores the finite sample performance of this procedure and evaluates the forecasting accuracy of models selected by it. Two empirical applications confirm the usefulness of the model selection procedure proposed here for forecasting.

Relevance:

80.00%

Publisher:

Abstract:

Most studies that try to verify the existence of regulatory risk look mainly at developed countries, yet examining regulatory risk in regulated sectors of emerging markets is no less important for improving and increasing investment in those markets. This thesis comprises three papers on regulatory risk issues. In the first paper I check whether CAPM betas capture information on regulatory risk by using a two-step procedure: in the first step I run Kalman Filter estimates of the betas, and then I use these estimated betas as inputs in a random-effects panel data model. I find evidence of regulatory risk in electricity, telecommunications and all regulated sectors in Brazil. I find further evidence that regulatory changes in the country either do not reduce or even increase the betas of the regulated sectors, the opposite of the buffering hypothesis proposed by Peltzman (1976). In the second paper I check whether CAPM alphas say something about regulatory risk. I investigate a methodology similar to those used by some regulatory agencies around the world, such as the Brazilian Electricity Regulatory Agency (ANEEL), that incorporate a specific regulatory-risk component when setting tariffs for regulated sectors. Using SUR estimates, I find negative and significant alphas for all regulated sectors, especially electricity and telecommunications. This flies in the face of theory, which predicts alphas that are not statistically different from zero. I suspect that the significant alphas are related to misspecifications in the traditional CAPM, which fails to capture true regulatory risk factors; one reason is that the CAPM does not consider factors proven to have significant effects on asset pricing, such as the Fama and French size (ME) and price-to-book (ME/BE) factors. In the third paper I use these two additional factors as controls in the estimation of alphas, and the results are similar.
Nevertheless, I find evidence that the negative alphas may result from the regulated sectors' premiums associated with the three Fama and French factors, particularly the market risk premium. When ME and ME/BE are included, they diminish the statistical significance of the market factor premiums for the regulated sectors, especially for the electricity sector. This shows how important the inclusion of these factors is, something unfortunately rare in studies of emerging markets like Brazil.
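The first step of the two-step procedure above — Kalman-filtered time-varying betas — can be sketched with a scalar filter that treats the beta as a latent random walk (a minimal sketch with illustrative parameters, not the thesis's exact specification):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
market = rng.normal(scale=0.02, size=n)      # market excess returns
true_beta = 1.2
asset = true_beta * market + rng.normal(scale=0.01, size=n)

# State: beta_t = beta_{t-1} + w_t;  observation: r_t = beta_t * m_t + v_t
q, r = 1e-6, 0.01**2          # state and observation noise variances
beta, p = 1.0, 1.0            # initial state mean and variance
betas = np.empty(n)
for t in range(n):
    p = p + q                                  # predict step
    h = market[t]                              # time-varying observation "design"
    s = h * p * h + r                          # innovation variance
    k = p * h / s                              # Kalman gain
    beta = beta + k * (asset[t] - h * beta)    # update step
    p = (1 - k * h) * p
    betas[t] = beta
print(betas[-1])   # converges near the true beta of 1.2
```

In the thesis the filtered betas from this first step then become inputs to the random-effects panel regression of the second step.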

Relevance:

80.00%

Publisher:

Abstract:

The aim of this article is to assess the role of real effective exchange rate (RER) volatility in long-run economic growth for a set of 82 advanced and emerging economies, using a panel data set ranging from 1970 to 2009. With an accurate measure of exchange rate volatility, the results for the two-step system GMM panel growth models show that a more (less) volatile RER has a significant negative (positive) impact on economic growth, and the results are robust across different model specifications. In addition, exchange rate stability seems to matter more for fostering long-run economic growth than exchange rate misalignment.
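The system GMM estimator referred to above (in the Blundell-Bond tradition) stacks moment conditions for the differenced and levels equations of a dynamic panel growth regression; in schematic form, with illustrative notation:

```latex
% Dynamic panel growth regression with country effect \eta_i
g_{it} = \alpha\, y_{i,t-1} + \boldsymbol{\beta}' \mathbf{x}_{it} + \eta_i + \varepsilon_{it}

% Differenced equation: lagged levels serve as instruments
\mathbb{E}\!\left[\, y_{i,t-s}\, \Delta\varepsilon_{it} \,\right] = 0, \qquad s \ge 2

% Levels equation: lagged differences serve as instruments
\mathbb{E}\!\left[\, \Delta y_{i,t-1}\, (\eta_i + \varepsilon_{it}) \,\right] = 0
```

The "two-step" qualifier refers to the second-stage weighting matrix built from first-stage residuals.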

Relevance:

80.00%

Publisher:

Abstract:

Develops a method for designing and evaluating HRD programs, allowing an integral and organized view of them. It decomposes the program into phases: a diagnosis, which justifies the program's existence; the selection of the content and methods that constitute the program itself; and, as a final stage, an evaluation that feeds back into the process.

Relevance:

80.00%

Publisher:

Abstract:

We study semiparametric two-step estimators which have the same structure as parametric doubly robust estimators in their second step. The key difference is that we do not impose any parametric restriction on the nuisance functions that are estimated in a first stage, but retain a fully nonparametric model instead. We call these estimators semiparametric doubly robust estimators (SDREs), and show that they possess superior theoretical and practical properties compared to generic semiparametric two-step estimators. In particular, our estimators have substantially smaller first-order bias, allow for a wider range of nonparametric first-stage estimates, rate-optimal choices of smoothing parameters and data-driven estimates thereof, and their stochastic behavior can be well-approximated by classical first-order asymptotics. SDREs exist for a wide range of parameters of interest, particularly in semiparametric missing data and causal inference models. We illustrate our method with a simulation exercise.
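A minimal sketch of a doubly robust (AIPW) moment with nonparametric first-stage fits, in the spirit of the estimators described above (the random-forest smoothers, tuning values and data-generating process are illustrative, not the paper's):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(2)
n = 4000
x = rng.uniform(-2, 2, size=(n, 2))
e = 1 / (1 + np.exp(-x[:, 0]))                  # true propensity score
t = rng.binomial(1, e)                          # treatment indicator
y = x[:, 0] + np.sin(x[:, 1]) + 1.5 * t + rng.normal(size=n)  # true ATE = 1.5

# First stage: fully nonparametric estimates of the propensity score and of
# the outcome regressions under treatment and control.
ps = RandomForestClassifier(n_estimators=200, min_samples_leaf=50,
                            random_state=0).fit(x, t).predict_proba(x)[:, 1]
mu1 = RandomForestRegressor(n_estimators=200, min_samples_leaf=50,
                            random_state=0).fit(x[t == 1], y[t == 1]).predict(x)
mu0 = RandomForestRegressor(n_estimators=200, min_samples_leaf=50,
                            random_state=0).fit(x[t == 0], y[t == 0]).predict(x)

# Second stage: the AIPW moment combines both nuisances and remains
# consistent if either the propensity score or the outcome model is good.
ps = np.clip(ps, 0.05, 0.95)
ate = np.mean(mu1 - mu0 + t * (y - mu1) / ps - (1 - t) * (y - mu0) / (1 - ps))
print(ate)   # close to the true average treatment effect of 1.5
```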

Relevance:

80.00%

Publisher:

Abstract:

The peculiarities of banking activity - usually seen as fundamental to the pursuit of development, as well as strongly shaped by the law - stimulated the emergence of an international regime of banking regulation. This came about in the wake of the work of international organizations such as the Basel Committee on Banking Supervision (BCBS) and the Financial Stability Board (FSB), and from the perception that we live in a world in which markets are highly interconnected but remain nationally regulated. Setting aside the discussion of the merit and effectiveness of the regulatory standards proposed by these organizations, in a context in which many countries seek to implement them, this work examines the elements that define the appropriate degree of implementation discretion granted in their formulation. The analysis of this problem suggests two extremes to be avoided: regulatory arbitrage and one size fits all. Avoiding regulatory arbitrage is a concern of the banking regulation literature that translates into containing excessive variation between the regulatory regimes of different jurisdictions. This gives rise to three vectors favoring a lower degree of discretion, represented by the goals of greater coordination, greater competitiveness, and avoiding a regulatory race to the bottom between countries. Avoiding one size fits all, in turn, is a recurring concern of the law and development literature, which stresses the need to pay attention to local peculiarities when formulating regulatory policies. This gives rise to three further vectors, this time toward a greater degree of discretion.
These are represented by concerns with the efficiency of the measures adopted, with guaranteeing room for maneuver that respects countries' self-determination - at least mitigating possible democratic deficits in the stipulation of international standards - and with the practical viability of experimentalism. To analyze this problem, taking these extremes into account, a two-part strategy is proposed: the construction of a theoretical framework and the verification of a research hypothesis, according to which a specific case of banking regulation can demonstrate how these elements interact in defining the degree of discretion. Thus, first - after the necessary contextualization and methodological description - a theoretical framework of the problem is constructed in light of the banking regulation literature and of the tools employed in discussions about the impact of law on development, discussions that for years have addressed the formulation of international standards and their implementation in diverse national contexts. Also at this first stage, as part of laying the theoretical groundwork, an excursus seeks to verify the hypothesis that confidence in the banking system is a kind of common, as well as its possible consequences. Building on this framework, the segment of banking regulation concerning deposit insurers is chosen for a case study. This analysis - supported by bibliographic and empirical research - seeks to demonstrate with what degree of discretion, and in what way, international standards were formulated and implemented in this segment. Finally, the work analyzes how the vectors determining the degree of discretion interact in the case of deposit insurers, as well as the suggestions that can be inferred from this analysis for the other segments of banking regulation.

Relevance:

80.00%

Publisher:

Abstract:

The traditional three-latent-factor representation of the term structure of interest rates (level, slope and curvature) was originally formulated by Charles R. Nelson and Andrew F. Siegel in 1987. Since then, academics and market practitioners have developed many applications based on this class of models, mostly aiming to anticipate movements in yield curves. At the same time, recent studies such as Diebold, Piazzesi and Rudebusch (2010), Diebold, Rudebusch and Aruoba (2006), Pooter, Ravazallo and van Dijk (2010) and Li, Niu and Zeng (2012) suggest that incorporating macroeconomic information into term-structure models can improve predictive power. In this work, the dynamic version of the Nelson-Siegel model, as proposed by Diebold and Li (2006), was compared to an analogous model that includes exogenous macroeconomic variables. In parallel, two different estimation methods were tested: the traditional two-step approach (Two-Step DNS) and estimation with the Extended Kalman Filter, which allows the parameters to be estimated recursively each time new information is added to the system. Regarding the models tested, the results are inconclusive, pointing to only a marginal improvement in the in-sample and out-of-sample estimates when the exogenous variables are included. The Extended Kalman Filter, however, showed more consistent results than the two-step method for practically all of the time horizons studied.
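The first step of the Two-Step DNS approach mentioned above can be sketched as a cross-sectional OLS of yields on the Nelson-Siegel factor loadings for a fixed decay parameter (simulated yields; the λ value is the common Diebold-Li choice when maturities are in months):

```python
import numpy as np

def ns_loadings(tau, lam):
    """Nelson-Siegel loading matrix for maturities tau and decay lam."""
    x = lam * tau
    slope = (1 - np.exp(-x)) / x
    curv = slope - np.exp(-x)
    return np.column_stack([np.ones_like(tau), slope, curv])

lam = 0.0609                  # Diebold-Li decay choice (maturities in months)
tau = np.array([3, 6, 12, 24, 36, 60, 84, 120], dtype=float)

# Simulated yield curve generated from known factors plus small noise.
true_factors = np.array([5.0, -2.0, 1.0])       # level, slope, curvature
rng = np.random.default_rng(3)
yields = ns_loadings(tau, lam) @ true_factors + 0.01 * rng.normal(size=tau.size)

# Cross-sectional OLS recovers the latent factors for this date; repeating
# this across dates and fitting a time-series model to the factor paths is
# the second step of the two-step approach.
beta, *_ = np.linalg.lstsq(ns_loadings(tau, lam), yields, rcond=None)
print(beta)   # close to the true factors [5.0, -2.0, 1.0]
```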

Relevance:

80.00%

Publisher:

Abstract:

The thesis analyses the European Union's effort to create an integrated pan-European electricity market based on "market coupling" as the proposed allocation mechanism for interconnector transfer capacity. Its main focus is whether market coupling leads to price convergence in interlinked markets and how it affects the behavior of electricity price data. The research methods applied are a qualitative, structured literature review and a quantitative analysis of electricity price data. The quantitative analysis relies on descriptive statistics of absolute price differentials and on a cointegration analysis following Engle & Granger (1987)'s two-step approach. The main findings are that implicit auction mechanisms such as market coupling are more efficient than explicit auctions. The method of price coupling in particular leads to price convergence in the markets involved, to social welfare gains, and to reduced market power of producers, as shown by the example of the TLC market coupling. The market coupling initiative between Germany and Denmark, on the other hand, is evaluated as less successful and illustrates the complexity and difficulty of implementing such initiatives. The cointegration analysis shows that the time series were already cointegrated before the coupling date, but that the statistical significance increased afterwards. The thesis concludes that market coupling leads to price convergence in the markets involved and thus serves as a method to create a single, integrated European electricity market.

Relevance:

80.00%

Publisher:

Abstract:

Given the environmental concern over global warming, caused mainly by CO2 emissions from the combustion of petroleum, coal and natural gas, research focused on alternative and clean energy generation has intensified. Among the candidate technologies, the intermediate-temperature solid oxide fuel cell (IT-SOFC) stands out. For application as the electrolyte of these devices, CeO2-based materials doped with rare-earth ions (RE3+) have been quite promising because they have good ionic conductivity and operate at relatively low temperatures (500-800 °C). In this work, we studied Ce1-xEuxO2-δ (x = 0.1, 0.2 and 0.3) solid solutions synthesized by the polymeric precursor method for use as solid electrolytes. We also studied the processing steps of these powders (milling, compaction and two-step sintering) in order to obtain dense sintered pellets with reduced grain size and homogeneous microstructure. The powders were characterized by thermal analysis, X-ray diffraction, particle size distribution and scanning electron microscopy, while the sintered samples were characterized by dilatometry, scanning electron microscopy, density and grain size measurements. X-ray diffraction confirmed the formation of the solid solution for all compositions. Crystallites on the nanometric scale were found for both sintering routes, but two-step sintering produced a significant reduction in the average grain size.

Relevance:

80.00%

Publisher:

Abstract:

Control of human visceral leishmaniasis in endemic regions is hampered in part by the lack of knowledge with respect to the roles of reservoirs and vectors. In addition, there is not yet an understanding of how asymptomatic subclinical infection might influence the maintenance of infection in a particular locality. Also worrisome is the limited accessibility to medical care in places with emerging drug resistance. There is still no protective vaccine available either for humans or for other reservoirs. Leishmania species are protozoa that express multiple antigens recognized by the vertebrate immune system. Since no single immunodominant epitope is recognized by most hosts, strategies must be developed to optimize the selection of antigens for prevention and immunodiagnosis. For this reason, we generated a cDNA library from the intracellular amastigote form of Leishmania chagasi, the causative agent of South American visceral leishmaniasis. We employed a two-step expression screen of the library to systematically identify T and T-dependent B cell antigens. The first step aimed to identify the largest possible number of clones producing an epitope-containing polypeptide, using a pool of sera from Brazilians with documented visceral leishmaniasis. After removal of clones encoding heat shock proteins, positive clones underwent a second screen for their ability to elicit proliferation and IFN-γ responses of T cells from immune mice. Six unique clones were selected from the second screen for further analysis. The clones encoded part of the coding sequence of glutamine synthetase, transitional endoplasmic reticulum ATPase, elongation factor 1γ, kinesin K-39, repetitive protein A2, and a hypothetical conserved protein. Humans naturally infected with L. chagasi mounted both cellular and antibody responses to these proteins. Preparations containing multiple antigens may be optimal for immunodiagnosis and protective vaccines against Leishmania.