944 results for parametric and nonparametric test
Abstract:
Electronic applications are currently developed under the reuse-based paradigm. This design methodology offers several advantages for reducing design complexity, but brings new challenges to the testing of the final circuit. Access to embedded cores, the integration of several test methods, and the optimization of several cost factors are just a few of the problems that must be tackled during test planning. Within this context, this thesis proposes two test planning approaches that aim at reducing the test costs of a core-based system by means of hardware reuse and the integration of test planning into the design flow. The first approach considers systems whose cores are connected directly or through a functional bus. The test planning method consists of a comprehensive model that includes the definition of a multi-mode access mechanism inside the chip and a search algorithm for the exploration of the design space. The access mechanism model considers the reuse of functional connections as well as partial test buses, core transparency, and other bypass modes. The test schedule is defined in conjunction with the access mechanism so that good trade-offs among the costs of pins, area, and test time can be sought. Furthermore, system power constraints are also considered. This broader set of concerns makes an efficient, yet fine-grained, search of the huge design space of a reuse-based environment possible. Experimental results clearly show the variety of trade-offs that can be explored using the proposed model and its effectiveness in optimizing the system test plan. Networks-on-chip are likely to become the main communication platform of systems-on-chip. Thus, the second approach presented in this work proposes the reuse of the on-chip network for testing the cores embedded in the systems that use this communication platform. A power-aware test scheduling algorithm that exploits the network characteristics to minimize the system test time is presented. The reuse strategy is evaluated for a number of system configurations, such as different positions of the cores in the network, power consumption constraints, and the number of interfaces with the tester. Experimental results show that the parallelization capability of the network can be exploited to reduce the system test time, while area and pin overhead are kept very low. In this manuscript, the main problems in the testing of core-based systems are first identified and the current solutions are discussed. The problems tackled by this thesis are then listed and the test planning approaches are detailed. Both test planning techniques are validated on the recently released ITC’02 SoC Test Benchmarks and further compared with other test planning methods from the literature. This comparison confirms the efficiency of the proposed methods.
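As an illustration of the kind of power-constrained scheduling problem described above, the sketch below packs hypothetical core tests into parallel sessions under a global power budget. It is a generic greedy heuristic on made-up core data, not the access-mechanism model or the search algorithm proposed in the thesis.

    # Illustrative greedy power-constrained test scheduler (not the thesis's algorithm).
    # Cores are hypothetical (name, test_time, test_power) tuples; the schedule packs
    # tests into parallel sessions whose summed power stays under a global budget.

    def greedy_schedule(cores, power_budget):
        """cores: list of (name, test_time, test_power); returns a list of sessions."""
        pending = sorted(cores, key=lambda c: c[1], reverse=True)  # longest test first
        sessions = []
        while pending:
            session, used_power = [], 0.0
            for core in pending[:]:
                name, t, p = core
                if used_power + p <= power_budget:
                    session.append(core)
                    used_power += p
                    pending.remove(core)
            sessions.append(session)
        return sessions

    if __name__ == "__main__":
        cores = [("cpu", 120, 40.0), ("dsp", 90, 35.0), ("mem", 60, 20.0), ("io", 30, 10.0)]
        for i, s in enumerate(greedy_schedule(cores, power_budget=60.0)):
            session_time = max(t for _, t, _ in s)  # session length = slowest test in it
            print(f"session {i}: {[n for n, _, _ in s]}, length {session_time}")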
Abstract:
The research tested whether, as in international stock markets, volatility in the Brazilian stock market is more closely associated with trading activity than with the mere passage of time. Parametric and nonparametric tests applied to series of daily returns of the IBOVESPA and of twenty-eight individual stocks, covering the period from July 1, 1994 to June 30, 1999, support the statement that, in this respect, the Brazilian market does not differ from international markets.
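For readers unfamiliar with the trading-time versus calendar-time question, the following sketch shows how such a comparison can be set up on synthetic return series: a parametric test (Bartlett) and a nonparametric test (Mann-Whitney on absolute returns) compare the variance of weekend (Friday-to-Monday) returns with that of ordinary one-day returns. The data and the specific tests are illustrative stand-ins, not those of the study.

    # Trading-time vs. calendar-time volatility comparison on synthetic data.
    # Weekend returns span three calendar days but only one trading day; if volatility
    # came from the passage of time, their variance should be markedly larger.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    weekday_returns = rng.normal(0.0, 0.02, size=800)   # placeholder one-day returns
    weekend_returns = rng.normal(0.0, 0.021, size=200)  # placeholder Fri->Mon returns

    stat_b, p_b = stats.bartlett(weekday_returns, weekend_returns)        # parametric
    stat_u, p_u = stats.mannwhitneyu(np.abs(weekday_returns),
                                     np.abs(weekend_returns))             # nonparametric
    print(f"Bartlett p={p_b:.3f}, Mann-Whitney p={p_u:.3f}")
    print("variance ratio (weekend/weekday):",
          weekend_returns.var(ddof=1) / weekday_returns.var(ddof=1))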
Abstract:
The research aimed to test whether prices in the Brazilian futures market follow a random walk, one version of the so-called Efficient Market Hypothesis. Prices of Ibovespa and commercial-dollar futures contracts were studied from June 30, 1994 to December 31, 1998. The application of parametric and nonparametric tests based on the Lo-MacKinlay Variance Ratio leads to the conclusion that the tested hypothesis cannot be rejected, thus pointing to efficiency in these markets.
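The Lo-MacKinlay variance ratio mentioned above can be sketched as follows on a synthetic random-walk series; VR(q) close to 1, with an insignificant z-statistic, is consistent with the random-walk hypothesis. This is a minimal homoskedasticity-based version for illustration only, not the study's implementation.

    # Minimal Lo-MacKinlay variance ratio on synthetic log prices.

    import numpy as np
    from scipy import stats

    def variance_ratio(log_prices, q):
        r = np.diff(log_prices)
        T = len(r)
        mu = r.mean()
        var_1 = np.sum((r - mu) ** 2) / (T - 1)
        rq = log_prices[q:] - log_prices[:-q]             # overlapping q-period returns
        m = q * (T - q + 1) * (1 - q / T)                 # bias-adjusted denominator
        var_q = np.sum((rq - q * mu) ** 2) / m
        vr = var_q / var_1
        z = (vr - 1) / np.sqrt(2 * (2 * q - 1) * (q - 1) / (3 * q * T))
        return vr, z, 2 * (1 - stats.norm.cdf(abs(z)))    # VR, z-stat, two-sided p-value

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        prices = np.cumsum(rng.normal(0, 0.01, 1200))     # placeholder random-walk log prices
        print(variance_ratio(prices, q=4))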
Abstract:
This article analyses the relationship between infrastructure and total factor productivity (TFP) in the four major Latin American economies: Argentina, Brazil, Chile and Mexico. We hypothesise that an increase in infrastructure has an indirect effect on long-term economic growth by raising productivity. To assess this hypothesis, we use the traditional Johansen methodology to test for cointegration between TFP and physical measures of the infrastructure stock, such as energy, roads, and telephones. We then apply the Lütkepohl, Saikkonen and Trenkler test, which allows for a possible level shift in the series and has better small-sample properties, to the same data set and compare the two tests. The results do not support a robust long-term relationship between the series; we do not find strong evidence that cuts in infrastructure investment in some Latin American countries were the main reason for the fall in TFP during the 1970s and 1980s.
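A minimal sketch of a Johansen cointegration test, run here on two synthetic series standing in for TFP and an infrastructure measure (the paper's data are not reproduced), is shown below using statsmodels.

    # Johansen trace test on two synthetic series sharing one stochastic trend.

    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    rng = np.random.default_rng(2)
    n = 200
    common_trend = np.cumsum(rng.normal(size=n))                 # shared stochastic trend
    tfp = common_trend + rng.normal(scale=0.5, size=n)           # placeholder TFP
    infra = 0.8 * common_trend + rng.normal(scale=0.5, size=n)   # placeholder infrastructure

    data = np.column_stack([tfp, infra])
    result = coint_johansen(data, det_order=0, k_ar_diff=1)      # det_order=0: constant term
    print("trace statistics:", result.lr1)
    print("5% critical values:", result.cvt[:, 1])               # columns are 90%, 95%, 99%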
Abstract:
This work investigates the existence of a cointegration relationship among the Real Exchange Rate (CRER), Net External Liabilities (PEL), Terms of Trade (TOT) and a productivity factor (BS), using a nonparametric test proposed by Bierens (1997) applied to a sample of data for the USA and Brazil covering the period from 1980 to 2010. For the USA, evidence of the influence of the listed variables is found. In the Brazilian case, the BS variable shows little relevance, while the remaining variables are present in the cointegration vector.
Abstract:
To assess the quality of school education, much of educational research is concerned with comparisons of test score means or medians. In this paper, we shift this focus and explore test score data by addressing some often neglected questions. In the case of Brazil, the mean test score in Math for students of the fourth grade declined by approximately 0.2 standard deviations in the late 1990s. But what about changes in the distribution of scores? It is unclear whether the decline was caused by a deterioration in student performance in the upper and/or lower tails of the distribution. To answer this question, we propose the use of the relative distribution method developed by Handcock and Morris (1999). The advantage of this methodology is that it compares two distributions of test scores through a single distribution that synthesizes all the differences between them. Moreover, it is possible to decompose the total difference between two distributions into a level effect (changes in the median) and a shape effect (changes in the shape of the distribution). We find that the decline in average test scores is mainly caused by a worsening in the position of students throughout the distribution of scores and is not specific to any particular quantile of the distribution.
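The relative distribution idea can be sketched as follows on synthetic scores: compute the relative ranks of the comparison cohort within the reference cohort, then repeat after a median shift to separate a level effect from a shape effect. This is a rough illustration of the Handcock and Morris (1999) approach, not the paper's implementation.

    # Relative ranks of a comparison cohort within a reference cohort; if the two
    # distributions match, the relative ranks are uniform on [0, 1].

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    ref = rng.normal(250, 50, 5000)                  # placeholder reference-year scores
    cmp_ = rng.normal(240, 55, 5000)                 # placeholder later-year scores

    def relative_ranks(reference, comparison):
        return np.searchsorted(np.sort(reference), comparison) / len(reference)

    shift = np.median(cmp_) - np.median(ref)
    overall = relative_ranks(ref, cmp_)              # total difference
    level_only = relative_ranks(ref, ref + shift)    # level (median) effect
    shape_only = relative_ranks(ref + shift, cmp_)   # shape effect, net of the median shift

    for name, r in [("overall", overall), ("level", level_only), ("shape", shape_only)]:
        ks = stats.kstest(r, "uniform")
        print(f"{name}: mean relative rank {r.mean():.3f}, KS p-value {ks.pvalue:.3f}")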
Abstract:
This study compares volatility forecasts for seven stocks traded on the Bovespa using two different realized volatility models and three conditional volatility models. The aim is to find empirical evidence on the differences in results obtained when realized volatility models and conditional volatility models are used to forecast stock volatility in Brazil. The period analyzed runs from November 1, 2007 to March 30, 2011. The sample includes 5-minute intraday data. The realized volatility estimators considered in this study are the Bi-Power Variation (BPVar), developed by Barndorff-Nielsen and Shephard (2004b), and the Realized Outlyingness Weighted Variation (ROWVar), proposed by Boudt, Croux and Laurent (2008a). Both are nonparametric estimators and are robust to jumps. Realized volatility forecasts were produced with autoregressive models estimated for each stock on the estimated volatility series. The conditional variance models considered here are the GARCH(1,1), the GJR(1,1), which incorporates asymmetries, and the FIGARCH-CHUNG(1,d,1), which has long memory. The sample was split in two: an estimation period from November 1, 2007 to December 30, 2010 (779 trading days) and a validation period from January 3, 2011 to March 31, 2011 (61 trading days). Out-of-sample forecasts were made one day ahead, and the models were re-estimated at each step, adding one more observation to the sample after each forecast. The forecasts are compared using the Diebold-Mariano test and regressions of the ex-post variance on a constant and the forecast. In addition, the study also presents some descriptive statistics on the estimated volatility series and on the forecast errors.
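The jump robustness of the Bi-Power Variation estimator mentioned above can be illustrated on a synthetic day of 5-minute returns containing one artificial jump; ROWVar and the GARCH-family models used in the study are not reproduced here.

    # Realized variance vs. jump-robust Bi-Power Variation on one synthetic trading day.

    import numpy as np

    rng = np.random.default_rng(4)
    r = rng.normal(0, 0.001, 84)             # placeholder 5-minute intraday returns
    r[40] += 0.01                            # add an artificial price jump

    realized_var = np.sum(r ** 2)
    bipower_var = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))

    print(f"realized variance:  {realized_var:.6e}")   # inflated by the jump
    print(f"bi-power variation: {bipower_var:.6e}")    # largely robust to the jump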
Abstract:
This work analyzes the evidence of long-run relationships among the real exchange rate (“RER”), the international investment position (“NFA”) and the Balassa-Samuelson effect (“PREL”) in a group of 28 countries at different stages of development. The methodology used was cointegration testing. The tests applied were those developed by Bierens (1997), a nonparametric test, and by Saikkonen and Lütkepohl (2000a, b, c), a test that first estimates a deterministic term. Evidence of cointegration is found, with both tests, for most of the countries studied. However, there were relevant differences between the results obtained with the two tests. These differences, as well as some special cases of countries that showed no evidence of cointegration, call for a deeper analysis of the long-run behavior of the three variables studied.
Abstract:
This paper performs a thorough statistical examination of the time-series properties of the daily market volatility index (VIX) from the Chicago Board Options Exchange (CBOE). The motivation lies not only in the widespread consensus that the VIX is a barometer of overall market sentiment with respect to investors' risk appetite, but also in the fact that many trading strategies rely on the VIX index for hedging and speculative purposes. Preliminary analysis suggests that the VIX index displays long-range dependence. This is well in line with the strong empirical evidence in the literature supporting long memory in both options-implied and realized variances. We thus resort to both parametric and semiparametric heterogeneous autoregressive (HAR) processes for modeling and forecasting purposes. Our main findings are as follows. First, we confirm the evidence in the literature that there is a negative relationship between the VIX index and the S&P 500 index return, as well as a positive contemporaneous link with the volume of the S&P 500 index. Second, the term spread has a slightly negative long-run impact on the VIX index, once possible multicollinearity and endogeneity are controlled for. Finally, we cannot reject the linearity of the above relationships, neither in sample nor out of sample. As for the latter, we actually show that it is quite hard to beat the pure HAR process because of the very persistent nature of the VIX index.
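A minimal HAR(3) sketch in the spirit of the models described above, fitted to a synthetic persistent series standing in for the VIX, looks as follows; the paper's exogenous regressors (S&P 500 return and volume, term spread) are omitted.

    # HAR(3): regress tomorrow's level on today's value and its 5-day and 22-day averages.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 1000
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = 0.98 * x[t - 1] + rng.normal()          # placeholder persistent "VIX" series

    vix = pd.Series(x)
    df = pd.DataFrame({
        "daily": vix,
        "weekly": vix.rolling(5).mean(),
        "monthly": vix.rolling(22).mean(),
        "target": vix.shift(-1),
    }).dropna()

    model = sm.OLS(df["target"], sm.add_constant(df[["daily", "weekly", "monthly"]])).fit()
    print(model.params)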
Abstract:
We develop and empirically test a continuous-time equilibrium model for the pricing of oil futures. The model provides a link between no-arbitrage models and expectation-oriented models. It highlights the role of inventories for the identification of different pricing regimes. In an empirical study, the hedging performance of our model is compared with that of five other one- and two-factor pricing models. The hedging problem considered is related to Metallgesellschaft's strategy of hedging long-term forward commitments with short-term futures. The results show that the downside risk distribution of our inventory-based model stochastically dominates those of the other models.
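The stack-and-roll hedging problem referred to above can be illustrated with a deliberately stylized simulation: a long-term fixed-price delivery commitment hedged by rolling one-period futures, with futures approximated by the spot price so that basis risk and margin funding are ignored. This is not the paper's inventory-based model.

    # Stylized stack-and-roll illustration: per-barrel P&L of a fixed-price delivery
    # commitment with and without a rolled short-term futures hedge.

    import numpy as np

    rng = np.random.default_rng(6)
    n_paths, n_periods = 10000, 24
    spot0, fixed_price, sigma = 20.0, 21.0, 0.08

    shocks = rng.normal(0, sigma, size=(n_paths, n_periods))
    spot = spot0 * np.exp(np.cumsum(shocks, axis=1))      # placeholder spot-price paths
    spot_T = spot[:, -1]

    unhedged = fixed_price - spot_T                        # per-barrel P&L at delivery
    futures_gain = spot_T - spot0                          # cumulative gain of the rolled stack
    hedged = unhedged + futures_gain                       # equals fixed_price - spot0 here

    print("unhedged P&L std:", unhedged.std())
    print("hedged   P&L std:", hedged.std())               # ~0 only because basis risk is ignored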
Abstract:
This paper analyzes the placement in the private sector of a subset of Brazilian public-sector employees. This group left public employment in the mid-1990s through a voluntary severance program. The paper contrasts their earnings before and after quitting the public sector, and compares both sets of wages to public- and private-sector earnings for similar workers. We find that participants in this voluntary severance program suffered a significant reduction in average earnings and an increase in earnings dispersion. We test whether the reduction in average earnings and the increase in earnings dispersion are the expected outcome once observed characteristics are controlled for, by means of counterfactual simulations. Several methods of controlling for observed characteristics (parametric and nonparametric) are used for robustness. The results indicate that this group of workers was paid at levels below what would be expected given their observable characteristics.
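A toy version of the counterfactual exercise described above, using a parametric (OLS) wage equation on synthetic data (the paper also uses nonparametric controls), is sketched below; all variables and coefficients are placeholders.

    # Fit a comparison-sector wage equation, then predict counterfactual earnings
    # for a hypothetical group of program participants and compare with actual pay.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 2000
    educ = rng.integers(4, 17, n)
    exper = rng.integers(0, 31, n)
    log_wage_private = 0.5 + 0.09 * educ + 0.02 * exper + rng.normal(0, 0.4, n)

    X = sm.add_constant(np.column_stack([educ, exper]))
    fit = sm.OLS(log_wage_private, X).fit()                # private-sector wage equation

    educ_p = rng.integers(4, 17, 300)                      # hypothetical participants
    exper_p = rng.integers(0, 31, 300)
    actual_p = 0.5 + 0.09 * educ_p + 0.02 * exper_p + rng.normal(-0.1, 0.4, 300)

    Xp = sm.add_constant(np.column_stack([educ_p, exper_p]))
    counterfactual = fit.predict(Xp)
    print("mean gap (actual - counterfactual):", (actual_p - counterfactual).mean())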
Abstract:
This study investigates the one-month-ahead out-of-sample predictive power of a Taylor rule-based model for exchange rate forecasting. We review relevant studies that conclude that macroeconomic models can explain the short-run exchange rate. We also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to this debate, this work presents its own evidence by implementing the model with the best predictive performance reported by Molodtsova and Papell (2009), the “symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant”. To this end, we use a sample of 14 currencies against the US dollar, which allows the generation of monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we select currencies of both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so that robust and rigorous tests are needed for a proper evaluation of the model. After finding that it is not possible to claim that the implemented model yields more accurate forecasts than a random walk, we assess whether the model is at least able to generate “rational”, or “consistent”, forecasts. For this purpose, we use the theoretical and instrumental framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts from the Taylor rule model are “inconsistent”. Finally, we perform Granger causality tests in order to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.
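The final Granger-causality step described above can be sketched as follows with statsmodels on placeholder series; the Taylor rule forecasts themselves are not reconstructed here.

    # Does the lagged model-predicted return help explain the realized return?
    # grangercausalitytests checks whether lags of the second column predict the first.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(8)
    n = 170                                    # roughly monthly, Jan 2000 to Mar 2014
    realized = rng.normal(0, 0.03, n)          # placeholder realized FX returns
    predicted = rng.normal(0, 0.03, n)         # placeholder model-predicted returns

    data = pd.DataFrame({"realized": realized, "predicted": predicted})
    grangercausalitytests(data[["realized", "predicted"]], maxlag=3)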
Abstract:
PURPOSE: Infection is one of the main factors affecting the physiological evolution of surgical wounds. The aim of this work was to evaluate the effects of fibroblast growth factor (FGFb) and anti-FGFb, applied topically, on the healing, synthesis and maturation of collagen in infected skin wounds of rats. METHODS: An experimental study was performed in 60 male Wistar rats. The animals were divided into two groups (A and B), and each group into three subgroups (A1, B1; A2, B2; A3, B3). After anesthesia with pentobarbital, two open square wounds (1 cm2), 4 cm apart, were made in the dorsal skin of each rat. In group A (n=30) the wounds were contaminated with a standard multibacterial solution, while in group B (n=30) the wounds were kept sterile. The wounds were named F1 (for inflammation analysis) and F2 (for collagen study). The open wounds of A1 and B1 rats were treated topically with saline solution, those of A2 and B2 with FGFb, and those of A3 and B3 with FGFb and anti-FGFb. The rats were observed until complete epithelialization of the F2 wounds to determine the healing time and the expression of type I and III collagen, using Picrosirius Red staining. The inflammatory reaction in the F1 wounds was studied using hematoxylin-eosin staining. The three variables were measured with the Image Pro-Plus software (Media Cybernetics). Statistical analysis was performed with ANOVA and Tukey's test, considering p<0.05 as significant. RESULTS: Infection significantly delayed (p<0.05) wound healing time, and the topical application of FGFb reversed the inhibition of healing caused by bacteria. The inflammatory reaction was significantly greater (p<0.05) in subgroup B2 than in B1 and A3. Greater expression of type I collagen was observed in all subgroups treated with FGFb compared with the untreated subgroups. Type III collagen was significantly decreased in the wounds of B3 rats compared with the other subgroups. CONCLUSIONS: FGFb accelerated the healing of open infected wounds and contributed to collagen maturation, enhancing type I collagen density. The anti-FGFb antibody attenuated the production of both type I and type III collagen.
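The reported statistical analysis (one-way ANOVA followed by Tukey's test, with p<0.05 as the significance level) can be sketched as follows on synthetic placeholder measurements for three treatment subgroups.

    # One-way ANOVA plus Tukey's HSD on made-up readings for three treatment groups.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(9)
    saline = rng.normal(10.0, 2.0, 10)        # placeholder collagen-density readings
    fgfb = rng.normal(13.0, 2.0, 10)
    fgfb_anti = rng.normal(11.0, 2.0, 10)

    f_stat, p_value = stats.f_oneway(saline, fgfb, fgfb_anti)
    print(f"ANOVA: F={f_stat:.2f}, p={p_value:.4f}")

    values = np.concatenate([saline, fgfb, fgfb_anti])
    groups = ["saline"] * 10 + ["FGFb"] * 10 + ["FGFb+anti"] * 10
    print(pairwise_tukeyhsd(values, groups, alpha=0.05))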
Abstract:
Among the toxic elements, Cd has received considerable attention in view of its association with a number of human health problems. The objectives of this study were to evaluate Cd availability and accumulation in the soil, and the transfer rate and toxicity in lettuce and rice plants grown in a Cd-contaminated Typic Hapludox. Two simultaneous greenhouse experiments with lettuce and rice as test plants were conducted in a randomized complete block design with four replications. The treatments consisted of four Cd rates (applied as CdCl2): 0.0, 1.3, 3.0 and 6.0 mg kg-1, based on the guidelines recommended by the Environmental Agency of the State of São Paulo, Brazil (Cetesb). Higher Cd rates increased extractable Cd (using the Mehlich-3, Mehlich-1 and DTPA chemical extractants) and decreased lettuce and rice dry matter yields. However, no visual toxicity symptoms were observed in the plants. The Mehlich-1, Mehlich-3 and DTPA extractants were effective in predicting soil Cd availability as well as Cd concentration and accumulation in plant parts. The Cd concentration in rice remained below the threshold for human consumption established by Brazilian legislation. On the other hand, the Cd concentration in the edible parts of lettuce exceeded the acceptable limit.