903 results for LM Tests
Abstract:
We evaluate the performance of several specification tests for Markov regime-switching time-series models. We consider the Lagrange multiplier (LM) and dynamic specification tests of Hamilton (1996) and Ljung–Box tests based on both the generalized residual and a standard-normal residual constructed using the Rosenblatt transformation. The size and power of the tests are studied using Monte Carlo experiments. We find that the LM tests have the best size and power properties. The Ljung–Box tests exhibit slight size distortions, though tests based on the Rosenblatt transformation perform better than the generalized residual-based tests. The tests exhibit impressive power to detect both autocorrelation and autoregressive conditional heteroscedasticity (ARCH). The tests are illustrated with a Markov-switching generalized ARCH (GARCH) model fitted to the US dollar–British pound exchange rate, with the finding that both autocorrelation and GARCH effects are needed to adequately fit the data.
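The Ljung–Box statistic used above can be sketched in a few lines: it sums squared sample autocorrelations of a residual series and compares the result to a chi-square distribution. A minimal sketch with numpy/scipy, run on simulated white noise rather than the paper's exchange-rate data:

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(resid, lags):
    """Ljung-Box Q statistic and chi-square p-value for a residual series."""
    resid = np.asarray(resid, dtype=float)
    n = len(resid)
    r = resid - resid.mean()
    denom = np.sum(r ** 2)
    # sample autocorrelations at lags 1..lags
    acf = np.array([np.sum(r[k:] * r[:-k]) / denom for k in range(1, lags + 1)])
    q = n * (n + 2) * np.sum(acf ** 2 / (n - np.arange(1, lags + 1)))
    pval = chi2.sf(q, df=lags)
    return q, pval

rng = np.random.default_rng(0)
q, p = ljung_box(rng.standard_normal(500), lags=10)  # white noise: no autocorrelation
```

Applied to the generalized residuals or Rosenblatt-transformed residuals of a fitted regime-switching model, a small p-value signals remaining serial correlation (or, applied to squared residuals, remaining ARCH).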
Abstract:
Objective. The aim of this study was to verify the possibility of lactate minimum (LM) determination during a walking test and the validity of such an LM protocol for predicting the maximal lactate steady-state (MLSS) intensity. Design. Eleven healthy subjects (24.2 ± 4.5 yr; 74.3 ± 7.7 kg; 176.9 ± 4.1 cm) performed LM tests on a treadmill, consisting of walking at 5.5 km·h⁻¹ with 20-22% inclination until voluntary exhaustion to induce metabolic acidosis. After 7 minutes of recovery the participants performed an incremental test starting at 7% incline, with increments of 2% every 3 minutes until exhaustion. A polynomial modelling approach (LMp) and visual inspection (LMv) were used to identify the LM as the exercise intensity associated with the lowest [bLac] during the test. Participants also underwent 24 constant-intensity tests of 30 minutes to determine the MLSS intensity. Results. There were no differences among LMv (12.6 ± 1.7%), LMp (13.1 ± 1.5%), and MLSS (13.6 ± 2.1%), and Bland–Altman plots evidenced acceptable agreement between them. Conclusion. It was possible to identify the LM during walking tests with intensity imposed by treadmill inclination, and it appeared valid for identifying the exercise intensity associated with the MLSS. Copyright © 2012 Guilherme Morais Puga et al.
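The polynomial modelling approach (LMp) can be sketched by fitting a second-order polynomial to the blood-lactate curve of the incremental test and taking the LM as the vertex of the parabola. The incline and [bLac] values below are invented for illustration, not the study's data:

```python
import numpy as np

# hypothetical incremental-test data: treadmill incline (%) vs blood lactate (mmol/L)
incline = np.array([7.0, 9.0, 11.0, 13.0, 15.0, 17.0])
blac    = np.array([5.8, 4.9, 4.3, 4.2, 4.6, 5.5])

# second-order polynomial fit; the lactate minimum is the vertex -b / (2a)
a, b, c = np.polyfit(incline, blac, deg=2)
lm_intensity = -b / (2.0 * a)  # incline (%) at the lowest modelled [bLac]
```

Visual inspection (LMv) instead simply picks the measured point with the lowest [bLac]; the polynomial fit smooths measurement noise around the minimum.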
Abstract:
Recent work shows that a low correlation between the instruments and the included variables leads to serious inference problems. We extend the local-to-zero analysis of models with weak instruments to models with estimated instruments and regressors and with higher-order dependence between instruments and disturbances. This makes this framework applicable to linear models with expectation variables that are estimated non-parametrically. Two examples of such models are the risk-return trade-off in finance and the impact of inflation uncertainty on real economic activity. Results show that inference based on Lagrange Multiplier (LM) tests is more robust to weak instruments than Wald-based inference. Using LM confidence intervals leads us to conclude that no statistically significant risk premium is present in returns on the S&P 500 index, excess holding yields between 6-month and 3-month Treasury bills, or in yen-dollar spot returns.
Abstract:
The goal of this paper is to introduce a class of tree-structured models that combines aspects of regression trees and smooth transition regression models. The model is called the Smooth Transition Regression Tree (STR-Tree). The main idea relies on specifying a multiple-regime parametric model through a tree-growing procedure with smooth transitions among different regimes. Decisions about splits are entirely based on a sequence of Lagrange Multiplier (LM) tests of hypotheses.
Abstract:
This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account. In Chapter 2 a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it.
Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite-sample size and power properties of the derived tests and also show how the tests and related residual-based graphical tools are applied in practice.
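For a model based on a continuous distribution, the quantile residual of an observation is obtained by mapping it through the fitted conditional CDF and then through the standard-normal quantile function, r_t = Φ⁻¹(F(y_t; θ̂)). A minimal sketch under an assumed i.i.d. Gaussian model (a deliberately simple stand-in, not one of the mixture models the thesis targets):

```python
import numpy as np
from scipy.stats import norm

def quantile_residuals(y, cdf):
    """Map observations through the fitted model CDF, then Phi^{-1}.
    Under a correctly specified model the result is approximately i.i.d. N(0, 1)."""
    u = np.clip(cdf(y), 1e-12, 1 - 1e-12)  # guard against exact 0/1 before ppf
    return norm.ppf(u)

rng = np.random.default_rng(1)
y = rng.normal(loc=2.0, scale=3.0, size=1000)
# fit a simple Gaussian model by maximum likelihood and form its CDF
mu_hat, sigma_hat = y.mean(), y.std()
r = quantile_residuals(y, lambda x: norm.cdf(x, loc=mu_hat, scale=sigma_hat))
```

For a mixture model the same recipe applies with the mixture CDF in place of the Gaussian one; this is exactly where Pearson's residuals break down while quantile residuals remain approximately standard normal.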
Abstract:
Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented, and the performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger. To account for the relatively quick convergence of the gradient descent methods, we analyze the landscape of the COP-based cost function. We prove that the cost function is unimodal in the search space. This feature makes the cost function amenable to optimization by gradient-descent techniques as compared to random search methods such as Genetic Algorithms.
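The COP testability measure referenced above propagates signal-1 probabilities from the primary inputs through each gate, assuming statistically independent gate inputs. A minimal sketch of the standard gate rules on a small hypothetical circuit (a textbook approximation of COP, not the paper's implementation):

```python
# COP signal-probability rules, assuming statistically independent gate inputs
def p_and(pa, pb):
    return pa * pb

def p_or(pa, pb):
    return 1.0 - (1.0 - pa) * (1.0 - pb)

def p_not(pa):
    return 1.0 - pa

# hypothetical circuit: y = (a AND b) OR (NOT c)
def cop_output_probability(pa, pb, pc):
    return p_or(p_and(pa, pb), p_not(pc))

# uniform random inputs (p = 0.5 each)
p_y = cop_output_probability(0.5, 0.5, 0.5)  # 1 - (1 - 0.25) * (1 - 0.5) = 0.625
```

A GA (or gradient descent, given the unimodality result) would search over the input probabilities (pa, pb, pc) to maximize a cost function built from such gate-level probabilities, steering random test vectors toward hard-to-detect faults.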
Abstract:
This paper proposes finite-sample procedures for testing the SURE specification in multi-equation regression models, i.e. whether the disturbances in different equations are contemporaneously uncorrelated or not. We apply the technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] to obtain exact tests based on standard LR and LM zero correlation tests. We also suggest a MC quasi-LR (QLR) test based on feasible generalized least squares (FGLS). We show that the latter statistics are pivotal under the null, which provides the justification for applying MC tests. Furthermore, we extend the exact independence test proposed by Harvey and Phillips (1982) to the multi-equation framework. Specifically, we introduce several induced tests based on a set of simultaneous Harvey/Phillips-type tests and suggest a simulation-based solution to the associated combination problem. The properties of the proposed tests are studied in a Monte Carlo experiment which shows that standard asymptotic tests exhibit important size distortions, while MC tests achieve complete size control and display good power. Moreover, MC-QLR tests performed best in terms of power, a result of interest from the point of view of simulation-based tests. The power of the MC induced tests improves appreciably in comparison to standard Bonferroni tests and, in certain cases, outperforms the likelihood-based MC tests. The tests are applied to data used by Fischer (1993) to analyze the macroeconomic determinants of growth.
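The Monte Carlo test technique of Dwass (1957) and Barnard (1963) relied on above replaces the asymptotic null distribution by simulation: when the statistic is pivotal under the null, N replications yield an exact p-value. A generic sketch (the statistic here is a simple placeholder, not the paper's LR/LM/QLR statistics):

```python
import numpy as np

def mc_pvalue(stat_obs, simulate_stat, n_rep=99, rng=None):
    """Exact Monte Carlo p-value for a pivotal test statistic:
    p = (1 + #{simulated >= observed}) / (n_rep + 1)."""
    if rng is None:
        rng = np.random.default_rng()
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

# placeholder statistic: squared mean of a sample drawn under the null
def sim_stat(rng):
    return np.mean(rng.standard_normal(50)) ** 2

rng = np.random.default_rng(2)
p = mc_pvalue(sim_stat(rng), sim_stat, n_rep=99, rng=rng)
```

With n_rep = 99 the test has exact size at conventional levels (e.g. rejecting when p <= 0.05), which is how the MC versions of the LR, LM, and QLR statistics achieve complete size control.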
Abstract:
The running velocities associated with the lactate minimum (Vlm), heart rate deflection (VHRd), critical velocity (CV), 3000 m performance (V3000), and 10000 m performance (V10km) were compared. Additionally, the ability of Vlm and VHRd to identify sustainable velocities was investigated. Methods. Twenty runners (28.5 ± 5.9 y) performed: 1) a 3000 m running test for V3000; 2) an all-out 500 m sprint followed by 6 × 800 m incremental bouts with blood lactate ([lac]) measurements for Vlm; 3) a continuous velocity-incremented test with heart rate measurements every 200 m for VHRd; 4) 30 min endurance tests at both Vlm (ETVlm) and VHRd (ETVHRd). Additionally, the distance-time and velocity-1/time relationships produced CV from 2 (500 m and 3000 m) or 3 predictive trials (500 m, 3000 m, and the distance reached before exhaustion during ETVHRd), and a 10 km race was recorded for V10km. Results. The CV values identified by the different methods did not differ from each other. The results (m·min⁻¹) revealed that Vlm (281 ± 14.8) < CV (292.1 ± 17.5) = V10km (291.7 ± 19.3) < VHRd (300.8 ± 18.7) = V3000 (304 ± 17.5), with high correlations among parameters (P < 0.001). During ETVlm participants completed 30 min of running, while on ETVHRd they lasted only 12.5 ± 8.2 min with increasing [lac]. Conclusion. We showed that the CV and Vlm track protocols are valid for running evaluation and performance prediction, and that the parameters studied have different meanings. Vlm reflects the moderate-high intensity domain (below CV), can be sustained without [lac] accumulation, and may be used for long-term exercise, while VHRd overestimates a running intensity that can be sustained for a long time. Additionally, V3000 and VHRd reflect the severe intensity domain (above CV).
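The critical velocity in the distance-time model above comes from a linear fit d = CV·t + ARC over the predictive trials, where the slope is CV and the intercept is the anaerobic distance capacity. A sketch with invented trial times, not the study's measurements:

```python
import numpy as np

# hypothetical predictive trials: time to exhaustion (min) and distance covered (m)
t = np.array([1.6, 10.2])    # e.g. the 500 m and 3000 m trials
d = np.array([500.0, 3000.0])

# linear distance-time model: d = CV * t + ARC
cv, arc = np.polyfit(t, d, deg=1)  # slope = critical velocity (m/min), intercept = ARC (m)
```

With a third predictive trial the same `np.polyfit` call becomes a least-squares fit instead of an exact two-point solution, which is how the 2- and 3-trial CV estimates in the study are obtained.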
Abstract:
Preliminary studies completed on commensal rodents with the new anticoagulant rodenticide difethialone showed very good efficacy, such that 25 ppm baits could be used effectively. New test results presented in this publication confirm the activity shown under laboratory conditions in choice tests, which represent more severe conditions, as well as its effectiveness against rodents that are resistant and non-resistant to warfarin. In tests where the palatability was only fair, the chemical activity resulted in excellent mortality. In a field test against a large population of Mus musculus the results proved very satisfactory. Difethialone is toxic to birds and fish. However, it seems to be better tolerated by dogs and pigs, animals that are frequently on the list of accidental poisonings. Difethialone is stored over a prolonged period in the liver, but the risk to non-target species consuming rodents that have ingested the compound does not seem to be high. Owing to its mode of action, difethialone must be handled with the same precautions as other anticoagulants for which vitamin K1 is the antidote. In the event of an accidental poisoning, an antidotal therapy plan is proposed. The lower level of active ingredient in finished baits (25 ppm) should pose a low risk to non-target species.
Abstract:
The interactions between outdoor bronzes and the environment, which lead to bronze corrosion, require a better understanding in order to design effective conservation strategies in the Cultural Heritage field. In the present work, investigations on real patinas of the outdoor monument to Vittorio Bottego (Parma, Italy) and laboratory studies on accelerated corrosion testing of inhibited (by silane-based films, with and without ceria nanoparticles) and non-inhibited quaternary bronzes are reported and discussed. In particular, a wet & dry ageing method was used both for testing the efficiency of the inhibitor and for patinating bronze coupons before applying the inhibitor. A wide range of spectroscopic techniques was used to characterize the core metal (SEM+EDS, XRF, AAS), the corroded surfaces (SEM+EDS, portable XRF, micro-Raman, ATR-IR, Py-GC-MS) and the ageing solutions (AAS). The main conclusions were: 1. The investigations on the Bottego monument confirmed the differentiation of the corrosion products as a function of the exposure geometry, already observed in previous works, further highlighting the need to take into account the different surface features when selecting conservation procedures such as the application of inhibitors (i.e. the relative Sn enrichment in unsheltered areas requires inhibitors which effectively interact not only with Cu but also with Sn). 2. The ageing (pre-patination) cycle on coupons was able to reproduce the relative Sn enrichment that actually occurs on real patinated surfaces, making the bronze specimens representative of the real support for bronze inhibitors. 3. The non-toxic silane-based inhibitors display a good protective efficiency towards pre-patinated surfaces, unlike other widely used inhibitors such as benzotriazole (BTA) and its derivatives. 4. The 3-mercapto-propyl-trimethoxy-silane (PropS-SH) additivated with CeO2 nanoparticles generally offered better corrosion protection than PropS-SH alone.
Abstract:
Total knee arthroplasty (TKA) has revolutionized the lives of millions of patients and is the most effective treatment for osteoarthritis. The increase in life expectancy has lowered the average age of the patient, which requires a more durable and better-performing prosthesis. To improve the design of implants and satisfy patients' needs, a deep understanding of knee biomechanics is needed. To overcome the uncertainties of numerical models, instrumented knee prostheses have recently been spreading. The aim of the thesis was to design and manufacture a new prototype of an instrumented implant, able to measure the kinetics and kinematics (in terms of medial, lateral, and patellofemoral forces) of different interchangeable prosthesis designs during experimental tests within a research laboratory, on a robotic knee simulator. Unlike previous prototypes it was not aimed at industrial applications, but focused purely on research. After a careful study of the literature and a preliminary analytical study, the device was created by modifying the structure of a commercial prosthesis and transforming it into a load cell. For monitoring the kinematics of the femoral component, a three-layer piezoelectric position sensor was manufactured using a Velostat foil. This sensor responded well to pilot tests. Once completed, the device can be used to validate existing numerical models of the knee and of TKA and to create new, more accurate ones. It can lead to the refinement of surgical techniques and the enhancement of prosthetic designs and, once validated and properly modified, it could also be used intraoperatively.
Abstract:
Although DMTA is nowadays one of the most widely used techniques to characterize the thermo-mechanical behaviour of polymers, it is only effective for small-amplitude oscillatory tests and is limited to single-frequency analysis (linear regime). In this thesis work, a Fourier-transform-based experimental system has proven to give insight into structural and chemical changes in specimens during large-amplitude oscillatory tests by exploiting multi-frequency spectral analysis, proving to be a more sensitive tool than the classical linear approach. The test campaign focused on three test typologies: strain sweep tests, damage investigation, and temperature sweep tests.