1000 results for CRASH TESTS


Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the estimation and testing of conditional duration models by looking at the density and baseline hazard rate functions. More precisely, we focus on the distance between the parametric density (or hazard rate) function implied by the duration process and its non-parametric estimate. Asymptotic justification is derived using the functional delta method for fixed and gamma kernels, whereas finite sample properties are investigated through Monte Carlo simulations. Finally, we show the practical usefulness of such testing procedures by carrying out an empirical assessment of whether autoregressive conditional duration models are appropriate tools for modelling price durations of stocks traded at the New York Stock Exchange.
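
To make the idea concrete, the sketch below simulates an exponential ACD(1,1) process and compares a kernel density estimate of its standardized durations with the Exp(1) density the model implies, via an integrated squared distance. This is a minimal illustration only: the parameter values are invented, and a plain Gaussian KDE stands in for the gamma kernels the paper uses to handle the boundary at zero.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate an exponential ACD(1,1): x_i = psi_i * eps_i with eps_i ~ Exp(1)
# and psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}.
omega, alpha, beta, n = 0.1, 0.2, 0.7, 5000
x, psi = np.empty(n), np.empty(n)
psi[0] = omega / (1 - alpha - beta)          # unconditional mean duration
x[0] = psi[0] * rng.exponential()
for i in range(1, n):
    psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
    x[i] = psi[i] * rng.exponential()

# Under correct specification the residuals x_i / psi_i are iid Exp(1),
# so the implied parametric density can be compared with a kernel estimate.
resid = x / psi
kde = stats.gaussian_kde(resid)
grid = np.linspace(0.01, np.quantile(resid, 0.99), 400)
gap = kde(grid) - stats.expon.pdf(grid)
distance = np.sum(gap ** 2) * (grid[1] - grid[0])   # integrated squared distance
print(f"integrated squared distance: {distance:.5f}")
```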

Relevance:

20.00%

Publisher:

Abstract:

A new multivariate test for the detection of unit roots is proposed. Use is made of the possible correlations between the disturbances of different series, and constrained and unconstrained SURE estimators are employed. The corresponding asymptotic distributions, for the case of two series, are obtained, and a table with critical values is generated. Some simulations indicate that the procedure performs better than the existing alternatives.
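
A minimal sketch of the SURE idea for two series, assuming a simple Dickey-Fuller regression without deterministic terms: each equation is estimated by OLS, the cross-equation error covariance is estimated from the residuals, and the stacked system is re-estimated by feasible GLS. The resulting statistics must be compared with tabulated unit-root critical values, not normal ones; all numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500

# Two random walks whose innovations are correlated across series.
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
e = rng.multivariate_normal([0.0, 0.0], Sigma, size=T)
y = np.cumsum(e, axis=0)

dy = np.diff(y, axis=0)   # left-hand sides of the Dickey-Fuller regressions
ylag = y[:-1]             # lagged levels

# Equation-by-equation OLS (the unconstrained first step) and residuals.
rho_ols = (ylag * dy).sum(axis=0) / (ylag ** 2).sum(axis=0)
resid = dy - ylag * rho_ols
S = np.cov(resid.T)       # estimated cross-equation error covariance

# SURE/FGLS: stack the two regressions and weight by S^{-1} (x) I.
n = T - 1
X = np.zeros((2 * n, 2))
X[:n, 0] = ylag[:, 0]     # equation 1 uses only its own lagged level
X[n:, 1] = ylag[:, 1]     # equation 2 likewise
ystack = np.concatenate([dy[:, 0], dy[:, 1]])
W = np.kron(np.linalg.inv(S), np.eye(n))
XtWX = X.T @ W @ X
rho_sure = np.linalg.solve(XtWX, X.T @ W @ ystack)
se = np.sqrt(np.diag(np.linalg.inv(XtWX)))
# These t-ratios have a nonstandard null distribution; compare them with
# tabulated critical values, not the normal ones.
print("SURE unit-root statistics:", rho_sure / se)
```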

Relevance:

20.00%

Publisher:

Abstract:

Theories can be produced by individuals seeking a good reputation of knowledge. Hence, a significant question is how to test theories anticipating that they might have been produced by (potentially uninformed) experts who prefer their theories not to be rejected. If a theory that predicts exactly like the data generating process is not rejected with high probability, then the test is said not to reject the truth. On the other hand, if a false expert, with no knowledge of the data generating process, can strategically select theories that will not be rejected, then the test can be ignorantly passed. Such tests have limited use because they cannot feasibly dismiss completely uninformed experts. Many tests proposed in the literature (e.g., calibration tests) can be ignorantly passed. Dekel and Feinberg (2006) introduced a class of tests that seemingly have some power to dismiss uninformed experts. We show that some tests from their class can also be ignorantly passed. One of those tests, however, does not reject the truth and cannot be ignorantly passed; thus, this empirical test can dismiss false experts. We also show that a false reputation of knowledge can be strategically sustained for an arbitrary, but given, number of periods, no matter which test is used (provided that it does not reject the truth). However, false experts can be discredited, even with bounded data sets, if the domain of permissible theories is mildly restricted.
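
As a small illustration of why calibration tests can be ignorantly passed, the sketch below lets a forecaster who knows nothing about the data generating process simply announce the running empirical frequency; in this iid environment the forecasts end up well calibrated. This only illustrates the flavour of the argument (the general result, for arbitrary processes, is much deeper), and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 10_000
outcomes = (rng.random(T) < 0.3).astype(int)   # DGP: iid Bernoulli(0.3)

# An "ignorant" forecaster: announce the running empirical frequency.
forecasts = np.empty(T)
count = 0
for t in range(T):
    forecasts[t] = count / t if t else 0.5
    count += outcomes[t]

# Calibration check: within each forecast bin, compare the announced
# probability with the realised frequency of the outcome.
bins = np.clip((forecasts * 10).astype(int), 0, 9)
for b in range(10):
    mask = bins == b
    if mask.sum() > 50:
        print(f"forecast near {(b + 0.5) / 10:.2f}: "
              f"realised {outcomes[mask].mean():.3f} over {mask.sum()} periods")
```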

Relevance:

20.00%

Publisher:

Abstract:

This paper considers two-sided tests for the parameter of an endogenous variable in an instrumental variable (IV) model with heteroskedastic and autocorrelated errors. We develop the finite-sample theory of weighted-average power (WAP) tests with normal errors and a known long-run variance. We introduce two weights which are invariant to orthogonal transformations of the instruments; e.g., changing the order in which the instruments appear. While tests using the MM1 weight can be severely biased, optimal tests based on the MM2 weight are naturally two-sided when errors are homoskedastic. We propose two boundary conditions that yield two-sided tests whether errors are homoskedastic or not. The locally unbiased (LU) condition is related to the power around the null hypothesis and is a weaker requirement than unbiasedness. The strongly unbiased (SU) condition is more restrictive than LU, but the associated WAP tests are easier to implement. Several tests are SU in finite samples or asymptotically, including tests robust to weak IV (such as the Anderson-Rubin, score, conditional quasi-likelihood ratio, and I. Andrews' (2015) PI-CLC tests) and two-sided tests which are optimal when the sample size is large and instruments are strong. We refer to the WAP-SU tests based on our weights as MM1-SU and MM2-SU tests. Dropping the restrictive assumptions of normality and known variance, the theory is shown to remain valid at the cost of asymptotic approximations. The MM2-SU test is optimal under the strong IV asymptotics, and outperforms other existing tests under the weak IV asymptotics.
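
The WAP constructions themselves are too involved for a short snippet, but the Anderson-Rubin test cited above as an example of an SU test is easy to state. Below is a minimal sketch for the homoskedastic case with a single endogenous regressor, using an invented simulation design:

```python
import numpy as np
from scipy import stats

def anderson_rubin(y, x, Z, beta0):
    """Anderson-Rubin test of H0: beta = beta0 in y = x*beta + u,
    x scalar and endogenous, Z the n-by-k instrument matrix."""
    n, k = Z.shape
    e = y - x * beta0                           # residual imposed by the null
    Pe = Z @ np.linalg.solve(Z.T @ Z, Z.T @ e)  # projection onto the instruments
    ar = (Pe @ Pe / k) / ((e - Pe) @ (e - Pe) / (n - k))
    return ar, stats.f.sf(ar, k, n - k)         # F(k, n-k) under the null

# Invented weak-instrument design, purely for illustration.
rng = np.random.default_rng(3)
n, k = 500, 4
Z = rng.standard_normal((n, k))
u, v = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=n).T
x = Z @ np.full(k, 0.1) + v                     # small first-stage coefficients
y = x * 1.0 + u                                 # true beta = 1
ar, pval = anderson_rubin(y, x, Z, beta0=1.0)
print(f"AR = {ar:.3f}, p-value = {pval:.3f}")
```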

Relevance:

20.00%

Publisher:

Abstract:

In this work we focus on tests for the parameter of an endogenous variable in a weakly identified instrumental variable regression model. We propose a new unbiasedness restriction for the weighted average power (WAP) tests introduced by Moreira and Moreira (2013). This new boundary condition is motivated by score efficiency under strong identification. It reduces the computational cost of WAP tests by replacing the strongly unbiased condition. The latter restriction imposes, under the null hypothesis, that the test be uncorrelated with a given statistic whose dimension equals the number of instruments. The newly proposed boundary condition only imposes that the test be uncorrelated with a linear combination of that statistic. WAP tests under both restrictions are shown to perform similarly in numerical exercises. We apply the different tests discussed to an empirical example. Using data from Yogo (2004), we assess the effect of weak instruments on the estimation of the elasticity of intertemporal substitution in a CCAPM model.
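
As background for why weak-identification-robust tests matter in applications such as Yogo (2004), the illustrative Monte Carlo below shows the conventional 2SLS t-test over-rejecting a true null when instruments are very weak and endogeneity is strong. This is not the paper's WAP construction; the design and all parameter values are made up.

```python
import numpy as np

def tsls_t(y, x, Z, beta0):
    """2SLS estimate of a scalar coefficient and its t-statistic against
    beta0, using conventional homoskedastic standard errors."""
    xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ x)   # first-stage fitted values
    b = (xhat @ y) / (xhat @ x)
    resid = y - x * b
    se = np.sqrt((resid @ resid) / (len(y) - 1) / (xhat @ xhat))
    return (b - beta0) / se

rng = np.random.default_rng(4)
n, k, reps, rejections = 200, 4, 2000, 0
for _ in range(reps):
    Z = rng.standard_normal((n, k))
    u, v = rng.multivariate_normal([0, 0], [[1, 0.9], [0.9, 1]], size=n).T
    x = Z @ np.full(k, 0.05) + v       # very weak instruments
    y = x * 1.0 + u                    # true beta = 1
    rejections += abs(tsls_t(y, x, Z, beta0=1.0)) > 1.96
print(f"2SLS t-test rejection rate at the nominal 5% level: {rejections / reps:.3f}")
```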

Relevance:

20.00%

Publisher:

Abstract:

Latin America has recently experienced three cycles of capital inflows, the first two ending in major financial crises. The first took place between 1973 and the 1982 ‘debt-crisis’. The second took place between the 1989 ‘Brady bonds’ agreement (and the beginning of the economic reforms and financial liberalisation that followed) and the Argentinian 2001/2002 crisis, and ended up with four major crises (as well as the 1997 one in East Asia) — Mexico (1994), Brazil (1999), and two in Argentina (1995 and 2001/2). Finally, the third inflow-cycle began in 2003 as soon as international financial markets felt reassured by the surprisingly neo-liberal orientation of President Lula’s government; this cycle intensified in 2004 with the beginning of a (purely speculative) commodity price-boom, and actually strengthened after a brief interlude following the 2008 global financial crash — and at the time of writing (mid-2011) this cycle is still unfolding, although already showing considerable signs of distress. The main aim of this paper is to analyse the financial crises resulting from this second cycle (both in LA and in East Asia) from the perspective of Keynesian/ Minskyian/ Kindlebergian financial economics. I will attempt to show that no matter how diversely these newly financially liberalised Developing Countries tried to deal with the absorption problem created by the subsequent surges of inflow (and they did follow different routes), they invariably ended up in a major crisis. As a result (and despite the insistence of mainstream analysis), these financial crises took place mostly due to factors that were intrinsic (or inherent) to the workings of over-liquid and under-regulated financial markets — and as such, they were both fully deserved and fairly predictable. Furthermore, these crises point not just to major market failures, but to a systemic market failure: evidence suggests that these crises were the spontaneous outcome of actions by utility-maximising agents, freely operating in friendly (‘light-touch’) regulated, over-liquid financial markets. That is, these crises are clear examples that financial markets can be driven by buyers who take little notice of underlying values — i.e., by investors who have incentives to interpret information in a biased fashion in a systematic way. Thus, ‘fat tails’ also occurred because under these circumstances there is a high likelihood of self-made disastrous events. In other words, markets are not always right — indeed, in the case of financial markets they can be seriously wrong as a whole. Also, as the recent collapse of ‘MF Global’ indicates, the capacity of ‘utility-maximising’ agents operating in (excessively) ‘friendly-regulated’ and over-liquid financial markets to learn from previous mistakes seems rather limited.

Relevance:

20.00%

Publisher:

Abstract:

The effect of competition is an important source of variation in breeding experiments. This study aimed to compare the selection of plants from open-pollinated Eucalyptus families with and without the use of competition covariables. Genetic values were determined for each family and tree for the traits height, diameter at breast height, and timber volume in a randomized block design, yielding variance components, genetic parameters, selection gains, effective population size, and selection coincidence, with and without the use of covariables. Intergenotypic competition is an important factor of environmental variation. The use of competition covariables generally reduces the estimates of variance components and influences the genetic gains in the studied traits. Intergenotypic competition biases the selection of open-pollinated eucalypt progenies and can result in an erroneous choice of superior genotypes; including the covariables in the model reduces this influence.
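
A rough sketch of the modelling idea, assuming a competition covariable defined as the mean phenotype of row neighbours and a random family effect; the data, effect sizes, and field layout are all invented, and statsmodels' MixedLM stands in for the mixed-model software typically used in breeding analyses. Including the covariable absorbs part of the neighbour-driven variation and changes the estimated family variance component:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Invented data: 40 open-pollinated families, 10 trees each, planted in a row.
fam = np.repeat(np.arange(40), 10)
g = rng.normal(0.0, 1.0, 40)[fam]                # family (genetic) effects
size0 = 20 + g + rng.normal(0.0, 2.0, fam.size)  # size before competition
comp = (np.roll(size0, 1) + np.roll(size0, -1)) / 2  # neighbour mean (row treated as circular)
dbh = size0 - 0.3 * (comp - comp.mean())         # larger neighbours suppress growth

df = pd.DataFrame({"dbh": dbh, "comp": comp, "family": fam})

# Family model without and with the competition covariable.
m0 = smf.mixedlm("dbh ~ 1", df, groups="family").fit()
m1 = smf.mixedlm("dbh ~ comp", df, groups="family").fit()
print("family variance, no covariable:  ", float(m0.cov_re.iloc[0, 0]))
print("family variance, with covariable:", float(m1.cov_re.iloc[0, 0]))
```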

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

20.00%

Publisher:

Abstract:

A duplicated nitrothienyl derivative was obtained as a by-product of the synthesis of a proposed molecular hybrid of a nitrothienyl derivative and isoniazid with an expected dual antimycobacterial mechanism. The structure was shown by X-ray crystallography to be 5,5'-dinitro-2-(2,3-diaza-4-(2'-thienyl)buta-1,3-dienyl)thiophene. Determination of the minimal inhibitory concentration (MIC) showed this compound to be promising against pathogenic Mycobacterium strains such as M. avium and M. kansasii, although it had a high level of mutagenicity, as observed in mutagenic activity tests.

Relevance:

20.00%

Publisher:

Abstract:

Neural networks and wavelet transforms have recently been seen as attractive tools for developing efficient solutions to many real-world function approximation problems. Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes, so solid mathematical foundations are essential to support the development of the neural network field. In this article we introduce a series of mathematical proofs that guarantee the wavelet properties of the PPS functions. As an application, we show the use of PPS-wavelets in handwritten digit recognition problems through function approximation techniques.
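
As a generic illustration of wavelet-based function approximation (the PPS wavelets themselves are the paper's construction, so a Mexican-hat mother wavelet stands in for them here), the sketch below fits the output weights of a dictionary of dilated and translated wavelets by least squares:

```python
import numpy as np

# Mexican-hat mother wavelet (a stand-in for the paper's PPS wavelets).
def psi(t):
    return (1 - t ** 2) * np.exp(-t ** 2 / 2)

x = np.linspace(-3, 3, 400)
target = np.sin(2 * x) * np.exp(-x ** 2 / 4)       # function to approximate

# Design matrix of dilated and translated wavelets psi((x - b) / a).
centers = np.linspace(-3, 3, 15)
scales = [0.25, 0.5, 1.0]
Phi = np.column_stack([psi((x - b) / a) for a in scales for b in centers])

# Fit the output-layer weights by least squares.
w, *_ = np.linalg.lstsq(Phi, target, rcond=None)
print("max abs approximation error:", np.abs(Phi @ w - target).max())
```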

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

20.00%

Publisher:

Abstract:

Many factors, such as sunlight, radiation intensity, temperature, and moisture, may influence the degradation of geosynthetics. UV stabilizers are used, especially in polyolefin geomembranes, to prevent the degradation process; in these geomembranes the service lifetime is initially governed by the consumption of antioxidants. Tests such as the melt flow index (MFI) and oxidative induction time (OIT) are an alternative for detecting oxidative degradation in polyolefins. This article evaluates HDPE geomembrane degradation after UV exposure through the results of MFI and OIT tests. Two kinds of geomembrane were evaluated: a black, smooth one (0.8, 1.0, 1.5, and 2.5 mm thick) and a white, textured one (1.0 mm). The MFI tests showed some level of surface degradation (cross-linking) in the HDPE geomembranes.
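
As a sketch of how antioxidant consumption governs the early service lifetime, the snippet below uses the common first-order depletion model OIT(t) = OIT0 * exp(-s*t) to compute the time until the stabiliser package is exhausted. The model form is standard for HDPE geomembranes, but every number below is invented, not taken from this study:

```python
import numpy as np

# First-order antioxidant depletion: OIT(t) = OIT0 * exp(-s * t).
oit0 = 150.0          # initial OIT, minutes (hypothetical)
oit_residual = 5.0    # OIT of the fully depleted, unstabilised resin (hypothetical)
s = 0.05              # depletion rate, 1/year, at the service temperature (hypothetical)

# Time until the stabiliser package is consumed down to the residual level.
t_depletion = np.log(oit0 / oit_residual) / s
print(f"antioxidant depletion time: {t_depletion:.0f} years")
```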