917 results for Statistical Robustness
Abstract:
The ever-increasing spurt of digital crimes such as image manipulation, image tampering, signature forgery, image forgery, and illegal transactions has intensified the demand to combat these forms of criminal activity. In this direction, biometrics - the computer-based validation of a person's identity - is becoming more and more essential, particularly for high-security systems. The essence of biometrics is the measurement of a person's physiological or behavioral characteristics, which enables authentication of that person's identity. Biometric-based authentication is also becoming increasingly important in computer-based applications because the amount of sensitive data stored in such systems is growing. The new demands on biometric systems are robustness, high recognition rates, the capability to handle imprecision and uncertainties of a non-statistical kind, and great flexibility. It is exactly here that soft computing techniques come into play. The main aim of this write-up is to present a pragmatic view of the applications of soft computing techniques in biometrics and to analyze their impact. It is found that soft computing has already made inroads, whether as individual methods or in combination. Applications of varieties of neural networks top the list, followed by fuzzy logic and evolutionary algorithms. In a nutshell, soft computing paradigms are used for biometric tasks such as feature extraction, dimensionality reduction, pattern identification, pattern mapping, and the like.
Abstract:
Data available on continuous-time diffusions are always sampled discretely in time. In most cases, the likelihood function of the observations is not directly computable. This survey covers a sample of the statistical methods that have been developed to solve this problem. We concentrate on some recent contributions to the literature based on three different approaches to the problem: an improvement of the Euler-Maruyama discretization scheme, the use of Martingale Estimating Functions, and the application of the Generalized Method of Moments (GMM).
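For reference, the Euler-Maruyama scheme mentioned above discretizes a diffusion dX_t = \mu(X_t;\theta)\,dt + \sigma(X_t;\theta)\,dW_t over a sampling interval \Delta in its standard textbook form (not the survey's specific refinement) as
\[
X_{t+\Delta} \approx X_t + \mu(X_t;\theta)\,\Delta + \sigma(X_t;\theta)\,\sqrt{\Delta}\,\varepsilon_t,
\qquad \varepsilon_t \sim \mathcal{N}(0,1),
\]
so that, conditionally on X_t, the next observation is approximately Gaussian, which yields a tractable approximate likelihood for discretely sampled data.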
Abstract:
The US term structure of interest rates plays a central role in fixed-income analysis. For example, accurately estimating the US term structure is a crucial step for those interested in analyzing Brazilian Brady bonds such as IDUs, DCBs, FLIRBs, EIs, etc. In this work we present a statistical model to estimate the US term structure of interest rates. In this report we address all major issues that guided the implementation of the model, concentrating on practical matters such as computational efficiency, robustness of the final implementation, and the statistical properties of the final model. Numerical examples are provided to illustrate the use of the model on a daily basis.
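The abstract does not specify the functional form of the model; purely for illustration, a common parametric choice for fitting a term structure is the Nelson-Siegel yield curve, in which the zero-coupon yield at maturity \tau is
\[
y(\tau) = \beta_0 + \beta_1\,\frac{1-e^{-\lambda\tau}}{\lambda\tau} + \beta_2\left(\frac{1-e^{-\lambda\tau}}{\lambda\tau} - e^{-\lambda\tau}\right),
\]
with \beta_0, \beta_1, \beta_2 and \lambda re-estimated each day from observed bond prices or yields, typically by (nonlinear) least squares. This is a standard benchmark, not necessarily the model developed in the paper.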
Abstract:
This paper investigates the presence of long memory in financial time series using four test statistics: V/S, KPSS, KS and modified R/S. There has been a large amount of research on long memory behavior in economic and financial time series; however, there is still no consensus. We argue in this paper that spurious short-term memory may be found due to the incorrect use of a data-dependent bandwidth to estimate the long-run variance. We propose a partially adaptive lag truncation procedure that is robust against the presence of long memory under the alternative hypothesis and revisit several economic and financial time series using the proposed bandwidth choice. Our results indicate the existence of spurious short memory in real exchange rates when Andrews' formula is employed, but long memory is detected when the proposed lag truncation procedure is used. Using stock market data, we also find short memory in returns and long memory in volatility.
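For reference, the modified R/S statistic mentioned above, in the form of Lo (1991), for a series x_1,\dots,x_T with sample mean \bar x is
\[
Q_T(q) = \frac{1}{\sqrt{T}\,\hat\sigma_T(q)}\left[\max_{1\le k\le T}\sum_{j=1}^{k}(x_j-\bar x) \;-\; \min_{1\le k\le T}\sum_{j=1}^{k}(x_j-\bar x)\right],
\qquad
\hat\sigma_T^2(q) = \hat\gamma_0 + 2\sum_{j=1}^{q}\Bigl(1-\tfrac{j}{q+1}\Bigr)\hat\gamma_j,
\]
where \hat\sigma_T^2(q) is a Newey-West long-run variance with lag truncation (bandwidth) q. The paper's point is that choosing q by a data-dependent rule such as Andrews' formula can bias such tests toward finding short memory.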
Abstract:
The impacts of the adoption of International Financial Reporting Standards (IFRSs) have been debated in professional and academic circles; however, little research has addressed the repercussions of IFRS adoption on criminal forensic accounting work. The objective of this study is therefore to capture and analyze the perception of Federal Criminal Forensic Experts (Peritos Criminais Federais) regarding the impacts of IFRS adoption on official criminal forensic examinations of accounting fraud. It relies on a combined quantitative and qualitative approach to verify associations between perceptions, employing Pearson's chi-square test and content analysis, respectively. The results show that most respondents partially or totally agree that IFRS adoption will facilitate federal criminal forensic work, with a statistical association with the perception that frauds committed without financial engineering are easier to prove and with the perception that greater room for technical judgment has a positive impact on criminal forensic work. Other benefits mentioned were increased comparability, reduced complexity and greater professional recognition. The identified risks, however, were the possibility of more technical challenges to forensic reports, the risk of bias and the need for further qualification, though without a statistical association with the perception of whether or not IFRSs will facilitate forensic work. No statistical differences in perception were identified as a function of the respondents' level of knowledge of IFRS precepts or of their theoretical and practical knowledge. The study has limitations mainly concerning the generalization of its results, since the intended approach was qualitative and quantitative and the number of completed questionnaires did not allow for more robust statistical tests.
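As a minimal sketch of the Pearson chi-square test of association used in the study, with a purely hypothetical contingency table (agreement that IFRSs facilitate forensic work versus agreement that frauds without financial engineering are easier to prove); the counts are illustrative, not the study's data:

```python
# Hypothetical Pearson chi-square test of association between two survey
# perceptions; the counts below are illustrative only, not the study's data.
from scipy.stats import chi2_contingency

# Rows: agree / disagree that IFRSs facilitate forensic work
# Cols: agree / disagree that frauds without financial engineering are easier to prove
table = [[30, 10],
         [ 8, 12]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p-value = {p_value:.4f}")
```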
Abstract:
Atypical points in the data may result in meaningless efficient frontiers. This follows since portfolios constructed using classical estimates may reflect neither the usual nor the unusual days' patterns. On the other hand, portfolios constructed using robust approaches are able to capture only the dynamics of the usual days, which constitute the majority of the business days. In this paper we propose a statistical model and a robust estimation procedure to obtain an efficient frontier which takes into account the behavior of both the usual days and most of the atypical days. We show, using real data and simulations, that portfolios constructed in this way require less frequent rebalancing and may yield higher expected returns for any risk level.
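The abstract does not spell out the paper's estimator; the sketch below only contrasts a classical sample covariance with a standard robust alternative (Minimum Covariance Determinant) when forming minimum-variance portfolio weights, with a simulated returns matrix standing in for real data:

```python
# Classical vs. robust covariance estimates in a minimum-variance portfolio;
# a generic illustration, not the estimation procedure proposed in the paper.
import numpy as np
from sklearn.covariance import MinCovDet

def min_variance_weights(cov):
    """w = Sigma^{-1} 1 / (1' Sigma^{-1} 1): the minimum-variance portfolio."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

rng = np.random.default_rng(0)
returns = rng.standard_normal((500, 4)) * 0.01   # placeholder daily returns, 4 assets
returns[::50] += 0.08                            # a few atypical ("unusual") days

w_classical = min_variance_weights(np.cov(returns, rowvar=False))
w_robust = min_variance_weights(MinCovDet().fit(returns).covariance_)
print("classical weights:", np.round(w_classical, 3))
print("robust weights:   ", np.round(w_robust, 3))
```

The robust weights track the usual days, while the classical weights are pulled by the injected atypical observations, which is the contrast the paper builds on.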
Abstract:
This paper uses an output-oriented Data Envelopment Analysis (DEA) measure of technical efficiency to assess the technical efficiencies of the Brazilian banking system. Four approaches to estimation are compared in order to assess the significance of factors affecting inefficiency. These are nonparametric Analysis of Covariance, maximum likelihood using a family of exponential distributions, maximum likelihood using a family of truncated normal distributions, and the normal Tobit model. The sole focus of the paper is on a combined measure of output, and the data analyzed refer to the year 2001. The factors of interest in the analysis and likely to affect efficiency are bank nature (multiple and commercial), bank type (credit, business, bursary and retail), bank size (large, medium, small and micro), bank control (private and public), bank origin (domestic and foreign), and non-performing loans. The latter is a measure of bank risk. All quantitative variables, including non-performing loans, are measured on a per-employee basis. The best fits to the data are provided by the exponential family and the nonparametric Analysis of Covariance. The significance of a factor, however, varies according to the model fitted, although it can be said that there is some agreement between the best models. A highly significant association in all fitted models is observed only for non-performing loans. The nonparametric Analysis of Covariance is more consistent with the inefficiency median responses observed for the qualitative factors. The findings of the analysis reinforce the significant association of the level of bank inefficiency, measured by DEA residuals, with the risk of bank failure.
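For reference, an output-oriented DEA score for a bank 0 with inputs x_{i0} and outputs y_{r0}, evaluated against n banks, solves the textbook CCR envelopment linear program (the paper may use a variant):
\[
\max_{\phi,\,\lambda}\ \phi
\quad\text{s.t.}\quad
\sum_{j=1}^{n}\lambda_j x_{ij} \le x_{i0}\ \ \forall i,
\qquad
\sum_{j=1}^{n}\lambda_j y_{rj} \ge \phi\, y_{r0}\ \ \forall r,
\qquad
\lambda_j \ge 0,
\]
where a bank is technically efficient when the optimal \phi^{*} equals 1, and technical efficiency is commonly reported as 1/\phi^{*}.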
Abstract:
Based on three versions of a small macroeconomic model for Brazil, this paper presents empirical evidence on the effects of parameter uncertainty on monetary policy rules and on the robustness of optimal and simple rules over different model specifications. By comparing the optimal policy rule under parameter uncertainty with the rule calculated under purely additive uncertainty, we find that parameter uncertainty should make policymakers react less aggressively to the economy's state variables, as suggested by Brainard's "conservatism principle", although this effect seems to be relatively small. We then informally investigate each rule's robustness by analyzing the performance of policy rules derived from each model under each one of the alternative models. We find that optimal rules derived from each model perform very poorly under alternative models, whereas a simple Taylor rule is relatively robust. We also find that, even within a specific model, the Taylor rule may perform better than the optimal rule under particularly unfavorable realizations from the policymaker's loss distribution function.
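For reference, the "simple Taylor rule" referred to above is usually written as
\[
i_t = r^{*} + \pi_t + 0.5\,(\pi_t - \pi^{*}) + 0.5\, y_t,
\]
where i_t is the policy interest rate, r^{*} the equilibrium real rate, \pi_t inflation, \pi^{*} the inflation target and y_t the output gap; the coefficients 1.5 on inflation and 0.5 on the gap are Taylor's original illustrative values, not necessarily those used in the paper.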
Abstract:
Extreme rainfall events have triggered a significant number of flash floods in Madeira Island throughout its past and recent history. Madeira is a volcanic island where the spatial rainfall distribution is strongly affected by its rugged topography. In this thesis, annual maxima of daily rainfall data from 25 rain gauge stations located in Madeira Island were modelled by the generalised extreme value distribution. The hypothesis of a Gumbel distribution was also tested by two methods, and the existence of a linear trend in the parameters of both distributions was analysed. Estimates for the 50– and 100–year return levels were also obtained. Still in a univariate context, the assumption that a distribution function belongs to the domain of attraction of an extreme value distribution was tested for monthly maximum rainfall data from the rainy season. The available data were then analysed in order to find the most suitable domain of attraction for the sampled distribution. In a different approach, a search for thresholds was also performed for daily rainfall values through a graphical analysis. In a multivariate context, the dependence between extreme rainfall values from the considered stations was studied based on Kendall's τ measure. This study suggests the influence of factors such as altitude, slope orientation, distance between stations and their proximity to the sea on the spatial distribution of extreme rainfall. Groups of three pairwise associated stations were also obtained, and an adjustment was made to a family of extreme value copulas involving the Marshall–Olkin family, whose parameters can be written as a function of the Kendall's τ association measures of the obtained pairs.
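For reference, the generalised extreme value distribution used above has distribution function (for shape parameter \xi \neq 0)
\[
G(z) = \exp\!\left\{-\left[1+\xi\left(\frac{z-\mu}{\sigma}\right)\right]^{-1/\xi}\right\},
\qquad 1+\xi\,\frac{z-\mu}{\sigma}>0,
\]
with the Gumbel case obtained as \xi \to 0, and the T-year return level (the level exceeded on average once every T years) given by
\[
z_T = \mu - \frac{\sigma}{\xi}\left[1-\left\{-\log\!\left(1-\tfrac{1}{T}\right)\right\}^{-\xi}\right],
\]
which is how the 50– and 100–year estimates mentioned above are typically obtained from the fitted parameters.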
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)