897 results for estimation and filtering


Relevance: 100.00%

Abstract:

We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We suggest a new two-step model selection procedure that is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. A Monte Carlo study explores the finite-sample performance of this procedure and evaluates the forecasting accuracy of the models it selects. Two empirical applications confirm the usefulness of the proposed model selection procedure for forecasting.
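As a rough illustration of the generic first step that such criteria-based procedures build on (not the paper's hybrid criterion with data-dependent penalties), a sketch in Python using statsmodels might select the lag length by BIC and the cointegrating rank by the Johansen trace test; the simulated data and all settings are assumptions:

```python
# Sketch: selecting a VAR lag length and cointegration rank with
# standard criteria (statsmodels). Only the generic building blocks of
# a two-step selection are shown, not the paper's hybrid procedure.
import numpy as np
from statsmodels.tsa.api import VAR
from statsmodels.tsa.vector_ar.vecm import select_coint_rank

rng = np.random.default_rng(0)
# Two random-walk series sharing a common stochastic trend (cointegrated).
trend = np.cumsum(rng.normal(size=500))
y = np.column_stack([trend + rng.normal(size=500),
                     0.5 * trend + rng.normal(size=500)])

# Step 1: pick the lag length by an information criterion (BIC here).
p = VAR(y).select_order(maxlags=8).bic
print("selected lag length:", p)

# Step 2: pick the cointegrating rank by the Johansen trace test.
rank_test = select_coint_rank(y, det_order=0, k_ar_diff=max(p - 1, 1),
                              method="trace", signif=0.05)
print("selected cointegrating rank:", rank_test.rank)
```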

Relevance: 100.00%

Abstract:

We study semiparametric two-step estimators which have the same structure as parametric doubly robust estimators in their second step. The key difference is that we do not impose any parametric restrictions on the nuisance functions estimated in the first stage, but instead retain a fully nonparametric model. We call these estimators semiparametric doubly robust estimators (SDREs) and show that they possess superior theoretical and practical properties compared to generic semiparametric two-step estimators. In particular, our estimators have substantially smaller first-order bias, allow for a wider range of nonparametric first-stage estimates and for rate-optimal choices of smoothing parameters and data-driven estimates thereof, and their stochastic behavior can be well approximated by classical first-order asymptotics. SDREs exist for a wide range of parameters of interest, particularly in semiparametric missing data and causal inference models. We illustrate our method with a simulation exercise.
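A minimal sketch of the two-step doubly robust structure described above, for the simplest case of estimating a mean under data missing at random, with nonparametric (random forest) first-stage fits; this illustrates the generic AIPW construction, not the paper's SDRE theory, and all model choices are assumptions:

```python
# Sketch: a doubly robust (AIPW) estimate of E[Y] when Y is missing at
# random given X, with nonparametric first-stage estimates.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 2))
p = 1.0 / (1.0 + np.exp(-X[:, 0]))          # true missingness propensity
R = rng.binomial(1, p)                       # R = 1 if Y is observed
Y = X @ np.array([1.0, -0.5]) + rng.normal(size=n)

# First stage: fully nonparametric nuisance estimates.
prop = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, R)
pi_hat = np.clip(prop.predict_proba(X)[:, 1], 0.05, 0.95)
outreg = RandomForestRegressor(n_estimators=200, random_state=0)
outreg.fit(X[R == 1], Y[R == 1])
mu_hat = outreg.predict(X)

# Second stage: AIPW moment, consistent if either nuisance is correct.
psi = mu_hat + R * (Y - mu_hat) / pi_hat
print("AIPW estimate of E[Y]:", psi.mean())
```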

Relevance: 100.00%

Abstract:

In this article we use factor models to describe a certain class of covariance structures for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions to the models presented here in the area of prediction. This discussion leads to open issues on real-time implementation and natural model comparisons.
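A minimal simulation sketch of the model class described above: one common factor whose log-variance follows an AR(1) stochastic volatility process, with loadings evolving as random walks. All parameter values are illustrative assumptions:

```python
# Sketch: simulating a one-factor model with stochastic factor
# volatility and time-varying loadings (values illustrative only).
import numpy as np

rng = np.random.default_rng(2)
T, k = 1000, 3                     # time points, number of series
phi, sigma_eta = 0.95, 0.2         # SV persistence and vol-of-vol
h = np.zeros(T)                    # log factor variance
B = np.zeros((T, k))               # time-varying loadings
B[0] = [1.0, 0.8, 0.5]
y = np.zeros((T, k))

for t in range(1, T):
    h[t] = phi * h[t - 1] + sigma_eta * rng.normal()
    B[t] = B[t - 1] + 0.01 * rng.normal(size=k)   # random-walk loadings
    f_t = np.exp(h[t] / 2) * rng.normal()         # common factor
    y[t] = B[t] * f_t + 0.3 * rng.normal(size=k)  # idiosyncratic noise

# Implied time-t covariance of y: exp(h_t) * B_t B_t' + 0.09 * I.
print("sample covariance of simulated series:\n", np.cov(y.T).round(2))
```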

Relevance: 100.00%

Abstract:

The past decade has witnessed a series of (well accepted and defined) financial crisis periods in the world economy. Most of these events are country specific and eventually spread across neighboring countries, with the concept of vicinity extrapolating geographic maps and entering contagion maps. Unfortunately, what contagion represents and how to measure it are still unanswered questions. In this article we measure the transmission of shocks by cross-market correlation coefficients, following Forbes and Rigobon's (2000) notion of shift-contagion. Our main contribution relies upon the use of traditional factor model techniques combined with stochastic volatility models to study the dependence among Latin American stock price indexes and the North American index. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. From a theoretical perspective, we improve currently available methodology by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in the series' weights over time. By doing this, we believe that changes and interventions experienced by those five countries are well accommodated by our models, which learn and adapt reasonably fast to those economic and idiosyncratic shocks. We show empirically that the time-varying covariance structure can be modeled by one or two common factors and that some sort of contagion is present in most of the series' covariances during periods of economic instability, or crisis. Open issues on real-time implementation and natural model comparisons are thoroughly discussed.
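For reference, the Forbes-Rigobon shift-contagion correction cited above can be stated in a few lines: the crisis-window correlation is adjusted for the higher variance of the source market, and only an increase beyond the adjusted value signals contagion. A sketch with purely illustrative numbers:

```python
# Sketch: the Forbes-Rigobon heteroskedasticity adjustment behind the
# "shift-contagion" notion. A raw crisis-window correlation is biased
# upward by the higher variance of the crisis-origin market.
import numpy as np

def fr_adjusted(rho_crisis, var_crisis, var_calm):
    """Adjusted correlation: rho / sqrt(1 + delta * (1 - rho^2)),
    where delta is the relative variance increase in the source market."""
    delta = var_crisis / var_calm - 1.0
    return rho_crisis / np.sqrt(1.0 + delta * (1.0 - rho_crisis ** 2))

# Example: a raw crisis correlation of 0.80 with a fourfold variance
# increase in the source market shrinks once heteroskedasticity is
# removed; only a rise beyond this would indicate shift-contagion.
print(fr_adjusted(0.80, var_crisis=4.0, var_calm=1.0))  # about 0.55
```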

Relevance: 100.00%

Abstract:

The indiscriminate management and use of soils without moisture control, combined with increasing traffic of agricultural machines over the years, has changed soil structure, causing soil compaction and yield reduction in areas of intensive traffic. The purpose of this work was to estimate and evaluate the performance of soil preconsolidation pressure and shear stress as indicators of changes in soil structure in fields cropped with sugarcane, as well as the impact of management processes on the structure of an Eutrorthox soil located in São Paulo State. The experimental field was located in the rural area of Piracicaba (São Paulo State, Brazil) and had been cropped with sugarcane, in the second harvest cycle. The soil was classified as an Eutrorthox by Empresa Brasileira de Pesquisa Agropecuária (EMBRAPA) [EMBRAPA, 1999. Centro Nacional de Pesquisa de Solos. Sistema Brasileiro de Classificação de Solos. EMBRAPA, Brasília, 412 pp.]. Undisturbed samples were collected and georeferenced in a 60 m x 60 m grid at two depths: 0-0.10 m (superficial layer, SL) and the layer of greatest mechanical resistance (LGMR), previously identified by cone index (CI). The investigated variables were preconsolidation pressure (sigma(p)), apparent cohesion (c) and internal friction angle (phi). The results showed that the soil load support capacity (SLSC) was satisfactorily predicted from sigma(p) as a function of soil moisture; thus, decisions about machinery size and loading (contact pressures) can be made. Apparent cohesion (c), internal friction angle (phi) and the Coulomb equation were significantly altered by traffic intensity. The sigma(p), c and phi maps were shown to be important tools to localize and visualize soil compaction and mechanical resistance zones. They constitute a valuable resource to evaluate the traffic impact in areas cropped with sugarcane in the State of São Paulo, Brazil.
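The Coulomb equation mentioned above relates shear strength to normal stress through the two measured parameters, tau = c + sigma tan(phi). A sketch with hypothetical values (not the paper's measurements):

```python
# Sketch: the Coulomb failure envelope, tau = c + sigma * tan(phi),
# evaluated for illustrative (not measured) parameter values.
import math

def shear_strength(normal_stress_kpa, cohesion_kpa, friction_angle_deg):
    """Soil shear strength (kPa) from the Coulomb equation."""
    return cohesion_kpa + normal_stress_kpa * math.tan(
        math.radians(friction_angle_deg))

# Hypothetical values: c = 25 kPa, phi = 30 degrees, normal stress 100 kPa.
print(shear_strength(100.0, 25.0, 30.0))  # about 82.7 kPa
```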

Relevance: 100.00%

Abstract:

The present paper reports the assessment of the vegetation occupancy rate of the roadside through analysis of aerial photographs. Using this value, the potential of these areas to be employed as carbon (C) sinks was also assessed. Moreover, for the areas suitable for afforestation, the potential for carbon sequestration was estimated considering different species of vegetation, both native (scenario 1) and exotic (formed by Pinus sp. and Eucalyptus sp.; scenario 2). The study was carried out with GIS techniques, and two regions were considered. A set of equations was used to estimate the rate of occupancy over the study areas, as well as the amounts of C fixed under the above scenarios. The average occupancy rate was 0.06%. The simulation showed a higher potential for C sequestration in scenario 2, with estimated amounts of CO2 sequestered from the atmosphere per km of roadside of 131 tons of CO2 km^-1 of highway for native species and 655 tons of CO2 km^-1 of highway for exotic species (over a period of 10 years for both estimates). Considering the whole road network of São Paulo State (approximately 190,000 km), and that a considerable part of this network is suitable to receive this kind of service, a very high potential for C sequestration can be predicted if managers and planners consider the roadside as an area for afforestation.
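The per-kilometre figures above imply straightforward order-of-magnitude totals; the sketch below assumes a hypothetical suitable fraction of the network, since the paper states only that a "considerable part" qualifies:

```python
# Sketch: order-of-magnitude arithmetic from the abstract's figures.
# The suitable fraction is a hypothetical assumption, not from the paper.
NETWORK_KM = 190_000          # São Paulo State road network (from text)
CO2_NATIVE = 131              # t CO2 per km over 10 years (scenario 1)
CO2_EXOTIC = 655              # t CO2 per km over 10 years (scenario 2)
suitable_fraction = 0.5       # hypothetical

for label, rate in [("native", CO2_NATIVE), ("exotic", CO2_EXOTIC)]:
    total_mt = NETWORK_KM * suitable_fraction * rate / 1e6
    print(f"{label}: about {total_mt:.1f} Mt CO2 over 10 years")
```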

Relevance: 100.00%

Abstract:

Commissioning studies of the CMS hadron calorimeter have identified sporadic uncharacteristic noise and a small number of malfunctioning calorimeter channels. Algorithms have been developed to identify and address these problems in the data. The methods have been tested on cosmic ray muon data, calorimeter noise data, and single beam data collected with CMS in 2008. The noise rejection algorithms can be applied to LHC collision data at the trigger level or in the offline analysis. The application of the algorithms at the trigger level is shown to remove 90% of noise events with fake missing transverse energy above 100 GeV, which is sufficient for the CMS physics trigger operation.

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

A robotic control design is developed that considers the inherent nonlinearities of the robot engine configuration. The interactions between the robot and the joint motor drive mechanism are taken into account. The proposed control combines two strategies: a feedforward control to maintain the system at the desired coordinate, and a feedback control to drive the system to that coordinate. The feedback control is obtained using the State-Dependent Riccati Equation (SDRE) approach. For link positioning, two cases are considered. Case 1: only the motor voltage is used for position control; Case 2: both the motor voltage and the torque between the links are used. Simulation results, including parametric uncertainties in the control, show the feasibility of the proposed control for the considered system.
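A minimal sketch of the SDRE feedback idea for a single pendulum-like link: the nonlinear dynamics are written in state-dependent coefficient form and a Riccati equation is solved at the current state to obtain the gain. The plant, weights and parameters are illustrative assumptions, not the paper's robot-motor model:

```python
# Sketch: SDRE feedback u = -R^{-1} B' P(x) x, with P(x) solving the
# continuous algebraic Riccati equation at each state.
import numpy as np
from scipy.linalg import solve_continuous_are

g_over_l, damping = 9.81, 0.5
B = np.array([[0.0], [1.0]])
Q, R = np.diag([10.0, 1.0]), np.array([[1.0]])

def sdre_gain(x):
    theta = x[0]
    # State-dependent coefficient form: sin(theta) = (sin(theta)/theta) * theta
    a21 = -g_over_l * (np.sin(theta) / theta if abs(theta) > 1e-6 else 1.0)
    A = np.array([[0.0, 1.0], [a21, -damping]])
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)      # K(x) = R^{-1} B' P(x)

x = np.array([1.0, 0.0])                    # initial link angle 1 rad
dt = 0.01
for _ in range(500):                        # simple Euler simulation
    u = float(-(sdre_gain(x) @ x))
    theta, omega = x
    x = x + dt * np.array([omega,
                           -g_over_l * np.sin(theta) - damping * omega + u])
print("state after 5 s:", x.round(4))       # should be near the origin
```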

Relevance: 100.00%

Abstract:

An extension of some standard likelihood-based procedures to heteroscedastic nonlinear regression models under scale mixtures of skew-normal (SMSN) distributions is developed. This novel class of models provides a useful generalization of the heteroscedastic symmetrical nonlinear regression models (Cysneiros et al., 2010), since the random-term distributions cover both symmetric and asymmetric heavy-tailed distributions, such as the skew-t, skew-slash and skew-contaminated normal, among others. A simple EM-type algorithm for iteratively computing maximum likelihood estimates of the parameters is presented, and the observed information matrix is derived analytically. Simulation studies examine the performance of the proposed methods, showing the robustness of this flexible class against outlying and influential observations and that the maximum likelihood estimates based on the EM-type algorithm have good asymptotic properties. Furthermore, local influence measures and one-step approximations of the estimates in the case-deletion model are obtained. Finally, the methodology is illustrated on a data set previously analyzed under the homoscedastic skew-t nonlinear regression model.
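A sketch of the EM idea in the simplest symmetric special case, Student-t linear regression with known degrees of freedom: the E-step downweights outlying observations and the M-step is weighted least squares. The full skewed, heteroscedastic, nonlinear SMSN machinery of the paper is not reproduced; all settings are assumptions:

```python
# Sketch: EM for linear regression with Student-t errors, a symmetric
# scale-mixture special case of the SMSN class discussed above.
import numpy as np

def t_regression_em(X, y, nu=4.0, n_iter=50):
    n, _ = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS start
    sigma2 = np.mean((y - X @ beta) ** 2)
    for _ in range(n_iter):
        r = y - X @ beta
        w = (nu + 1.0) / (nu + r ** 2 / sigma2)   # E-step: downweight outliers
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # M-step: WLS
        sigma2 = np.sum(w * (y - X @ beta) ** 2) / n
    return beta, sigma2

rng = np.random.default_rng(3)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=4, size=200)
print(t_regression_em(X, y)[0])   # close to [1, 2] despite heavy tails
```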

Relevance: 100.00%

Abstract:

Despite the scientific achievements of the last decades in astrophysics and cosmology, the majority of the Universe's energy content is still unknown. A potential solution to the "missing mass problem" is the existence of dark matter in the form of WIMPs. Due to the very small cross section for WIMP-nucleon interactions, the number of expected events is very limited (about 1 event per tonne per year), thus requiring detectors with large target mass and low background levels. The aim of the XENON1T experiment, the first tonne-scale LXe-based detector, is to be sensitive to WIMP-nucleon cross sections as low as 10^-47 cm^2. To investigate whether such a detector can reach this goal, Monte Carlo simulations are mandatory to estimate the background. To this aim, the GEANT4 toolkit has been used to implement the detector geometry and to simulate the decays from the various background sources, electromagnetic and nuclear. From the analysis of the simulations, the background level has been found to be fully acceptable for the purposes of the experiment: about 1 background event in a 2 tonne-year exposure. Using the Maximum Gap method, the XENON1T sensitivity has been evaluated, and the minimum of the WIMP-nucleon cross-section limit has been found at 1.87 x 10^-47 cm^2, at 90% CL, for a WIMP mass of 45 GeV/c^2. The results have been independently cross-checked using the Likelihood Ratio method, which confirmed them with agreement within a factor of two; such a result is entirely acceptable considering the intrinsic differences between the two statistical methods. The thesis thus shows that the XENON1T detector will be able to reach its design sensitivity, lowering the limits on the WIMP-nucleon cross section by about two orders of magnitude with respect to current experiments.
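As a much cruder counterpart to the Maximum Gap and likelihood-ratio analyses described above, a simple Poisson counting sketch shows how exposure and expected background translate into a cross-section limit; the reference cross-section scaling is a hypothetical assumption:

```python
# Sketch: a Poisson counting limit, far cruder than the Maximum Gap
# and likelihood-ratio methods used in the thesis. Numbers other than
# those quoted in the abstract are assumptions.
import math

def poisson_ul_90(n_obs, n_bkg):
    """90% CL upper limit on signal counts: smallest s with
    P(N <= n_obs | s + b) <= 0.10."""
    s = 0.0
    while True:
        mu = s + n_bkg
        cdf = sum(math.exp(-mu) * mu ** k / math.factorial(k)
                  for k in range(n_obs + 1))
        if cdf <= 0.10:
            return s
        s += 0.01

# Roughly 1 background event in 2 tonne-years (from the text); suppose
# zero events are observed and a reference cross section of 1e-45 cm^2
# would yield 100 signal events (hypothetical scaling).
s_limit = poisson_ul_90(n_obs=0, n_bkg=1.0)
print("signal upper limit:", round(s_limit, 2), "events")
print("cross-section limit ~", s_limit / 100 * 1e-45, "cm^2")
```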