9 results for Statistical Robustness

in the Digital Repository of Fundação Getúlio Vargas (FGV)


Relevance:

20.00%

Publisher:

Abstract:

Data available on continuous-time diffusions are always sampled discretely in time. In most cases, the likelihood function of the observations is not directly computable. This survey covers a sample of the statistical methods that have been developed to solve this problem. We concentrate on some recent contributions to the literature based on three different approaches to the problem: an improvement of the Euler-Maruyama discretization scheme, the use of Martingale Estimating Functions, and the application of the Generalized Method of Moments (GMM).
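The Euler-Maruyama scheme named in this abstract can be sketched in a few lines. The Ornstein-Uhlenbeck drift and diffusion below, and all parameter values, are illustrative choices for the sketch, not taken from the survey.

```python
import math
import random

def euler_maruyama(mu, sigma, x0, dt, n_steps, rng):
    """Simulate one path of dX = mu(X) dt + sigma(X) dW using the
    Euler-Maruyama discretization: X_{t+dt} = X_t + mu(X_t) dt + sigma(X_t) dW."""
    path = [x0]
    x = x0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = x + mu(x) * dt + sigma(x) * dw
        path.append(x)
    return path

# Illustrative example: Ornstein-Uhlenbeck process
# dX = kappa * (theta - X) dt + sig dW  (made-up parameter values)
kappa, theta, sig = 2.0, 1.0, 0.3
rng = random.Random(42)
path = euler_maruyama(lambda x: kappa * (theta - x), lambda x: sig,
                      x0=0.0, dt=0.01, n_steps=500, rng=rng)
```

The discretization error of this scheme at a fixed sampling interval is exactly what the likelihood-based improvements surveyed here try to control.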

Relevance:

20.00%

Publisher:

Abstract:

Data available on continuous-time diffusions are always sampled discretely in time. In most cases, the likelihood function of the observations is not directly computable. This survey covers a sample of the statistical methods that have been developed to solve this problem. We concentrate on some recent contributions to the literature based on three different approaches to the problem: an improvement of the Euler-Maruyama discretization scheme, the employment of Martingale Estimating Functions, and the application of Generalized Method of Moments (GMM).

Relevance:

20.00%

Publisher:

Abstract:

The US term structure of interest rates plays a central role in fixed-income analysis. For example, accurately estimating the US term structure is a crucial step for analyzing Brazilian Brady bonds such as IDUs, DCBs, FLIRBs, and EIs. In this work we present a statistical model to estimate the US term structure of interest rates. We address the major issues that drove the implementation of the model, concentrating on important practical concerns such as computational efficiency, robustness of the final implementation, and the statistical properties of the final model. Numerical examples illustrate the use of the model on a daily basis.
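The abstract does not name the statistical model used. As an illustration only, a common parametric choice for term-structure estimation is the Nelson-Siegel yield curve, sketched below; the parameter values are made up and are not the report's estimates.

```python
import math

def nelson_siegel(t, beta0, beta1, beta2, tau):
    """Nelson-Siegel zero-coupon yield at maturity t (in years):
    beta0 is the long-run level, beta1 the short-end tilt, beta2 the hump."""
    if t == 0:
        return beta0 + beta1
    x = t / tau
    loading = (1.0 - math.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - math.exp(-x))

# Illustrative parameters: ~6% long-run level, ~4% short end, mild hump.
params = dict(beta0=0.06, beta1=-0.02, beta2=0.01, tau=1.5)
curve = {t: nelson_siegel(t, **params) for t in (0.25, 1, 2, 5, 10, 30)}
```

In practice the parameters would be fitted daily to observed bond prices or yields, which is where the computational-efficiency and robustness concerns the report emphasizes come in.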

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the presence of long memory in financial time series using four test statistics: V/S, KPSS, KS and modified R/S. There has been a large amount of study of long-memory behavior in economic and financial time series, but there is still no consensus. We argue that spurious short memory may be found due to the incorrect use of a data-dependent bandwidth in estimating the long-run variance. We propose a partially adaptive lag truncation procedure that is robust against the presence of long memory under the alternative hypothesis, and we revisit several economic and financial time series using the proposed bandwidth choice. Our results indicate the existence of spurious short memory in real exchange rates when Andrews' formula is employed, whereas long memory is detected when the proposed lag truncation procedure is used. Using stock market data, we also find short memory in returns and long memory in volatility.
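The classical R/S statistic underlying the modified R/S test can be sketched as below. Lo's modified version, which this paper studies, replaces the plain standard deviation in the denominator with a long-run variance estimate whose bandwidth choice is exactly the issue the paper addresses; the simulated noise here is illustrative only.

```python
import math
import random

def rescaled_range(series):
    """Classical R/S statistic: range of cumulative deviations from the mean,
    divided by the (plain) standard deviation. Lo's modified R/S would use a
    long-run variance estimate in the denominator instead."""
    n = len(series)
    mean = sum(series) / n
    devs = [x - mean for x in series]
    cum, partial = [], 0.0
    for d in devs:                     # cumulative sum of deviations
        partial += d
        cum.append(partial)
    r = max(cum) - min(cum)            # range of the cumulative deviations
    s = math.sqrt(sum(d * d for d in devs) / n)
    return r / s

# For i.i.d. noise, R/S grows roughly like n**0.5; long memory inflates it.
rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(1000)]
rs = rescaled_range(noise)
```

A too-short bandwidth in the modified denominator fails to absorb long-range dependence, which is the mechanism behind the spurious findings the paper documents.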

Relevance:

20.00%

Publisher:

Abstract:

The impacts of adopting the International Financial Reporting Standards (IFRSs) have been debated in professional and academic circles, yet little research exists on the repercussions of IFRS adoption for criminal forensic accounting. The objective of this study is therefore to capture and analyze the perception of Federal Criminal Forensic Experts (Peritos Criminais Federais) regarding the impacts of IFRS adoption on official criminal forensic examinations of accounting fraud. It relies on a quantitative and qualitative approach to test associations between perceptions, using Pearson's chi-square test and content analysis, respectively. The results show that most respondents partially or fully agree that IFRS adoption will ease federal forensic work, with a statistical association with the perception that frauds committed without financial engineering are easier to prove, and with the perception that greater room for technical judgment has a positive impact on forensic work. Other benefits cited were greater comparability, reduced complexity, and professional recognition. As risks, respondents noted the possibility of more technical challenges to expert reports, the risk of bias, and the need for further qualification, though without a statistical association with the perception of whether IFRSs will ease forensic work. No statistical differences in perception were found as a function of respondents' level of knowledge of IFRS precepts or of their theoretical and practical knowledge. The study's main limitation concerns the generalization of its results, since the intended approach was both qualitative and quantitative and the number of completed questionnaires did not allow more robust statistical tests.
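The Pearson chi-square test of association used in this study can be illustrated with a minimal sketch; the 2x2 contingency table below uses made-up counts, not the study's data.

```python
def pearson_chi2(table):
    """Pearson chi-square statistic for a two-way contingency table:
    sum over cells of (observed - expected)^2 / expected."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # expected under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts: agreement that IFRS eases forensic work (rows) vs.
# perceiving non-engineered frauds as easier to prove (columns).
table = [[30, 10],
         [8, 22]]
chi2 = pearson_chi2(table)
```

For a 2x2 table there is one degree of freedom, so a statistic above the 5% critical value of about 3.84 indicates a statistically significant association.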

Relevance:

20.00%

Publisher:

Abstract:

Atypical points in the data may result in meaningless efficient frontiers, since portfolios constructed from classical estimates may reflect neither the usual nor the unusual days' patterns. Portfolios constructed using robust approaches, on the other hand, capture only the dynamics of the usual days, which constitute the majority of business days. In this paper we propose a statistical model and a robust estimation procedure to obtain an efficient frontier that takes into account the behavior of both the usual days and most of the atypical days. We show, using real data and simulations, that portfolios constructed in this way require less frequent rebalancing and may yield higher expected returns for any risk level.
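The effect of atypical days on classical estimates can be sketched as below. The trimming step is a crude stand-in for the paper's (unspecified here) robust estimation procedure, the two-asset closed form assumes uncorrelated assets for simplicity, and the return series are made up.

```python
import statistics

def trimmed(xs, k):
    """Drop the k most extreme observations on each side — a crude stand-in
    for a proper robust estimator."""
    s = sorted(xs)
    return s[k:len(s) - k]

def min_var_weight(var1, var2):
    """Minimum-variance weight on asset 1 for two uncorrelated assets."""
    return var2 / (var1 + var2)

# Hypothetical daily returns; asset 1 has one atypical crash day (-30%).
r1 = [0.01, 0.00, -0.01, 0.02, 0.01, -0.30]
r2 = [0.00, 0.01, 0.00, 0.01, -0.01, 0.00]

classical_v1 = statistics.pvariance(r1)
robust_v1 = statistics.pvariance(trimmed(r1, 1))  # variance of the usual days
v2 = statistics.pvariance(r2)

w_classical = min_var_weight(classical_v1, v2)  # nearly abandons asset 1
w_robust = min_var_weight(robust_v1, v2)        # keeps a meaningful allocation
```

The single crash day inflates the classical variance of asset 1 by two orders of magnitude, pushing its minimum-variance weight toward zero — the kind of distortion the proposed procedure is designed to avoid.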

Relevance:

20.00%

Publisher:

Abstract:

This paper uses an output-oriented Data Envelopment Analysis (DEA) measure of technical efficiency to assess the technical efficiencies of the Brazilian banking system. Four approaches to estimation are compared in order to assess the significance of factors affecting inefficiency: nonparametric Analysis of Covariance, maximum likelihood using a family of exponential distributions, maximum likelihood using a family of truncated normal distributions, and the normal Tobit model. The sole focus of the paper is on a combined measure of output, and the data analyzed refer to the year 2001. The factors of interest, each likely to affect efficiency, are bank nature (multiple and commercial), bank type (credit, business, bursary and retail), bank size (large, medium, small and micro), bank control (private and public), bank origin (domestic and foreign), and non-performing loans, the last being a measure of bank risk. All quantitative variables, including non-performing loans, are measured on a per-employee basis. The best fits to the data are provided by the exponential family and the nonparametric Analysis of Covariance. The significance of a factor, however, varies with the model fitted, although there is some agreement among the best models. A highly significant association in all models fitted is observed only for non-performing loans. The nonparametric Analysis of Covariance is more consistent with the inefficiency median responses observed for the qualitative factors. The findings reinforce the significant association of the level of bank inefficiency, measured by DEA residuals, with the risk of bank failure.
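A full DEA model like the paper's solves a linear program per bank over multiple inputs and outputs. As a simplified illustration only: with a single input and a single output under constant returns to scale, the frontier degenerates to the ray through the unit with the best output/input ratio, so the efficiency scores need no linear programming. The bank figures below are made up.

```python
def crs_output_efficiency(units):
    """Output-oriented efficiency under constant returns to scale for
    single-input, single-output units: each unit's output/input ratio
    relative to the best ratio in the sample (1.0 = on the frontier)."""
    best = max(out / inp for inp, out in units)
    return [(out / inp) / best for inp, out in units]

# Hypothetical banks as (input, output) pairs, e.g. employees and a
# combined output measure (the paper works per employee).
banks = [(10, 50), (20, 80), (5, 30)]
eff = crs_output_efficiency(banks)
```

In the paper's second stage, scores (or DEA residuals) like these would be regressed on the qualitative factors and non-performing loans via ANCOVA, maximum likelihood, or Tobit.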

Relevance:

20.00%

Publisher:

Abstract:

Based on three versions of a small macroeconomic model for Brazil, this paper presents empirical evidence on the effects of parameter uncertainty on monetary policy rules and on the robustness of optimal and simple rules over different model specifications. By comparing the optimal policy rule under parameter uncertainty with the rule calculated under purely additive uncertainty, we find that parameter uncertainty should make policymakers react less aggressively to the economy's state variables, as suggested by Brainard's "conservatism principle", although this effect seems to be relatively small. We then informally investigate each rule's robustness by analyzing the performance of policy rules derived from each model under each of the alternative models. We find that optimal rules derived from one model perform very poorly under the alternative models, whereas a simple Taylor rule is relatively robust. We also find that even within a specific model, the Taylor rule may perform better than the optimal rule under particularly unfavorable realizations from the policymaker's loss distribution function.
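The simple Taylor rule found relatively robust above can be written down directly. The coefficients 0.5 and 0.5 are Taylor's (1993) original values; the input values below are illustrative, not the paper's Brazilian estimates.

```python
def taylor_rule(r_star, pi, pi_target, output_gap, w_pi=0.5, w_gap=0.5):
    """Taylor (1993) rule for the nominal policy rate:
    i = r* + pi + w_pi * (pi - pi*) + w_gap * output_gap."""
    return r_star + pi + w_pi * (pi - pi_target) + w_gap * output_gap

# Illustrative inputs: 2% equilibrium real rate, 5% inflation against a
# 4% target, and a 1% positive output gap.
rate = taylor_rule(r_star=0.02, pi=0.05, pi_target=0.04, output_gap=0.01)
```

Because the rule depends on only two observable gaps and fixed coefficients, its performance degrades gracefully when the underlying model is misspecified, which is the robustness property the paper documents.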