960 results for Parametric and semiparametric methods


Relevance: 100.00%

Abstract:

This paper presents a comparative analysis of linear and mixed models for short-term forecasting of a real data series with a high percentage of missing data. The data are the series of significant wave heights registered at regular three-hour periods by a buoy placed in the Bay of Biscay. The series is interpolated with a linear predictor that minimizes the forecast mean square error. The linear models are seasonal ARIMA models, and the mixed models have a linear component and a nonlinear seasonal component. The nonlinear component is estimated by a nonparametric regression of data versus time. Short-term forecasts, no more than two days ahead, are of interest because they can be used by the port authorities to notify the fleet. Several models are fitted and compared by their forecasting behavior.
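
A minimal sketch of the mixed-model idea described above, on synthetic data rather than the paper's buoy series; the ARMA order, kernel bandwidth, and sampling setup are illustrative assumptions, not the authors' choices. A kernel estimate of the seasonal component is combined with a linear ARMA model on the residual and forecast 16 three-hour steps (two days) ahead.

# Sketch only: nonparametric seasonal component + linear ARMA residual model.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n, season = 800, 8                                   # 8 observations per day
t = np.arange(n)
wave = 2 + np.sin(2 * np.pi * t / season) + rng.normal(0, 0.3, n)

def nw_smooth(x, y, grid, bw):
    """Nadaraya-Watson kernel regression of y on x, evaluated on grid."""
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / bw) ** 2)
    return (w @ y) / w.sum(axis=1)

phase = (t % season).astype(float)
seasonal = nw_smooth(phase, wave, phase, bw=0.75)    # nonlinear seasonal component
residual = wave - seasonal                           # left for the linear component

arma = SARIMAX(residual, order=(1, 0, 1)).fit(disp=False)
future_phase = (np.arange(n, n + 16) % season).astype(float)
forecast = nw_smooth(phase, wave, future_phase, bw=0.75) + arma.forecast(steps=16)
print(np.round(forecast[:4], 2))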

Relevance: 100.00%

Abstract:

By means of classical Itô calculus we decompose option prices as the sum of the classical Black-Scholes formula, with volatility parameter equal to the root-mean-square future average volatility, plus a term due to correlation and a term due to the volatility of the volatility. This decomposition allows us to develop first- and second-order approximation formulas for option prices and implied volatilities in the Heston volatility framework, as well as to study their accuracy. Numerical examples are given.
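
The leading term of such a decomposition is the Black-Scholes price evaluated at the root-mean-square future average volatility. The sketch below is an illustration under assumed Heston parameters, not the paper's numerical examples: it estimates that average volatility by simulating the Heston variance path and plugs it into a standard Black-Scholes call formula.

# Sketch: Black-Scholes evaluated at the root-mean-square average volatility
# of a simulated Heston variance path (illustrative parameters).
import numpy as np
from scipy.stats import norm

def black_scholes_call(s, k, t, r, sigma):
    d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
    d2 = d1 - sigma * np.sqrt(t)
    return s * norm.cdf(d1) - k * np.exp(-r * t) * norm.cdf(d2)

def heston_rms_vol(v0, kappa, theta, xi, t, n_steps=200, n_paths=20000, seed=0):
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    v = np.full(n_paths, v0)
    avg_var = np.zeros(n_paths)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        v = np.abs(v + kappa * (theta - v) * dt + xi * np.sqrt(v) * dw)
        avg_var += v * dt
    return np.sqrt((avg_var / t).mean())        # root-mean-square average volatility

sigma_bar = heston_rms_vol(v0=0.04, kappa=1.5, theta=0.04, xi=0.3, t=1.0)
print(black_scholes_call(s=100, k=100, t=1.0, r=0.01, sigma=sigma_bar))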

Relevance: 100.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the complexity of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity-penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite-sample performance bounds are established for the estimates, and these bounds are applied to several nonparametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has an infinite VC or pseudo-dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
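
The sketch below caricatures the two-part procedure on synthetic regression data; it is not the authors' estimator. The "empirical cover" is replaced by a crude stand-in (candidates fitted on bootstrap resamples), and the penalty is taken proportional to the log of the cover size, all of which are assumptions made for illustration.

# Sketch: split the data, build a crude cover per class on the first half,
# pick the empirical-risk minimizer per cover on the second half, then
# select by empirical risk plus a complexity penalty ~ log(cover size).
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 400)
y = np.sin(3 * x) + rng.normal(0, 0.2, 400)
x1, y1, x2, y2 = x[:200], y[:200], x[200:], y[200:]   # cover half / selection half

def empirical_cover(deg, n_candidates):
    """Stand-in for an empirical cover of degree-deg polynomials."""
    cover = []
    for _ in range(n_candidates):
        idx = rng.integers(0, len(x1), len(x1))
        cover.append(np.polyfit(x1[idx], y1[idx], deg))
    return cover

def empirical_risk(coef, xs, ys):
    return np.mean((np.polyval(coef, xs) - ys) ** 2)

best = None
for deg, n_cand in [(1, 8), (3, 32), (5, 128), (9, 512)]:
    cover = empirical_cover(deg, n_cand)
    coef = min(cover, key=lambda c: empirical_risk(c, x2, y2))
    penalty = np.log(n_cand) / len(x2)                # complexity term
    score = empirical_risk(coef, x2, y2) + penalty
    if best is None or score < best[0]:
        best = (score, deg, coef)
print("selected degree:", best[1])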

Relevance: 100.00%

Abstract:

We study the existence of moments and the tail behaviour of the densities of storage processes. We give sufficient conditions for the existence and non-existence of moments using the integrability conditions of submultiplicative functions with respect to Lévy measures. We then study the asymptotic behaviour of the tails of these processes using the concave or convex envelope of the release rate function.
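
For context, the classical Lévy-process moment criterion of the kind referred to above can be stated as follows; this is standard background, not a result quoted from the paper itself.

% For a Lévy process $X$ with Lévy measure $\nu$ and a locally bounded,
% submultiplicative function $g$ (classical criterion, stated for context):
\mathbb{E}\,[\,g(X_t)\,] < \infty \ \text{for some (equivalently, all) } t>0
\quad\Longleftrightarrow\quad
\int_{\{|x|>1\}} g(x)\,\nu(\mathrm{d}x) < \infty .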

Relevance: 100.00%

Abstract:

In this paper, generalizing results in Alòs, León and Vives (2007b), we show that jumps in the volatility under a jump-diffusion stochastic volatility model have no effect on the short-time behaviour of the at-the-money implied volatility skew, although the corresponding Hull and White formula depends on the jumps. To this end, we use Malliavin calculus techniques for Lévy processes based on Løkka (2004), Petrou (2006), and Solé, Utzet and Vives (2007).
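
To fix ideas, the quantity whose short-time behaviour is at stake can be written as below; the notation is assumed here for illustration and is not copied from the paper.

% With $I(t,k)$ the implied volatility as a function of log-strike $k$ and
% $k_t^{*}$ the at-the-money log-strike, the at-the-money skew slope is
\left.\frac{\partial I}{\partial k}(t,k)\right|_{k=k_t^{*}},
\qquad \text{studied as } T \downarrow t .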

Relevance: 100.00%

Abstract:

As the prevalence of smoking has decreased to below 20%, health practitioners' interest has shifted towards the prevalence of obesity, and reducing it is one of the major health challenges in the decades to come. In this paper we study the impact that the final product of the anti-smoking campaign, that is, smokers quitting the habit, had on average weight in the population. To this end, we use data from the Behavioral Risk Factor Surveillance System, a large series of independent representative cross-sectional surveys. We construct a synthetic panel that allows us to control for unobserved heterogeneity, and we exploit exogenous changes in taxes and regulations to instrument the endogenous decision to give up smoking. Our estimates are very close to estimates issued in the 1990s by the US Department of Health, and indicate that a 10% decrease in the incidence of smoking leads to an average weight increase of 2.2 to 3 pounds, depending on the choice of specification. In addition, we find evidence that the effect overshoots in the short run, although a significant part remains even after two years. However, when we split the sample between men and women, we only find a significant effect for men. Finally, the implicit elasticity of quitting smoking with respect to the probability of becoming obese is calculated at 0.58. This implies that the net benefit from reducing the incidence of smoking by 1% is positive even though the cost to society is $0.6 billion.
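
The instrumenting strategy can be illustrated with a bare-bones two-stage least squares computation on synthetic data; the variable names (cig_tax, quit, weight) and the data-generating process are assumptions for illustration, not the authors' data or code.

# Sketch: manual 2SLS where a tax shifter instruments the quitting decision.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
cig_tax = rng.normal(0, 1, n)                    # instrument (tax/regulation shock)
u = rng.normal(0, 1, n)                          # unobserved confounder
quit = (0.8 * cig_tax + u + rng.normal(0, 1, n) > 0).astype(float)
weight = 170 + 3.0 * quit + 2.0 * u + rng.normal(0, 5, n)

def two_sls(y, d, z):
    Z = np.column_stack([np.ones_like(z), z])            # first stage: d on z
    d_hat = Z @ np.linalg.lstsq(Z, d, rcond=None)[0]
    X_hat = np.column_stack([np.ones_like(d_hat), d_hat])
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]      # second stage: y on d_hat

intercept, effect = two_sls(weight, quit, cig_tax)
print(f"IV estimate of the effect of quitting on weight: {effect:.2f} lbs")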

Relevance: 100.00%

Abstract:

The generalization of simple (two-variable) correspondence analysis to more than two categorical variables, commonly referred to as multiple correspondence analysis, is neither obvious nor well-defined. We present two alternative ways of generalizing correspondence analysis, one based on the quantification of the variables and their intercorrelations, and the other based on the geometric ideas of simple correspondence analysis. We propose a version of multiple correspondence analysis, with adjusted principal inertias, as the method of choice for the geometric definition, since it contains simple correspondence analysis as an exact special case, which is not the case for the standard generalizations. We also clarify the issue of supplementary point representation and the properties of joint correspondence analysis, a method that visualizes all two-way relationships between the variables. The methodology is illustrated using data on attitudes to science from the International Social Survey Program on Environment in 1993.
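
The special case that the adjusted method is designed to contain exactly, simple two-variable correspondence analysis, can be computed via the SVD of standardized residuals. The sketch below uses a made-up cross-tabulation for illustration; it is not the ISSP data or the adjusted multiple correspondence analysis itself.

# Sketch: simple correspondence analysis of a two-way table via the SVD.
import numpy as np

N = np.array([[30, 10, 5],
              [15, 25, 10],
              [ 5, 15, 35]], dtype=float)      # hypothetical contingency table

P = N / N.sum()                                # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)            # row and column masses
S = np.diag(r**-0.5) @ (P - np.outer(r, c)) @ np.diag(c**-0.5)
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = np.diag(r**-0.5) @ U * sv         # principal coordinates of rows
col_coords = np.diag(c**-0.5) @ Vt.T * sv      # principal coordinates of columns
print("principal inertias:", np.round(sv**2, 4))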

Relevance: 100.00%

Abstract:

We investigate on-line prediction of individual sequences. Given a class of predictors, the goal is to predict as well as the best predictor in the class, where the loss is measured by the self information (logarithmic) loss function. The excess loss (regret) is closely related to the redundancy of the associated lossless universal code. Using Shtarkov's theorem and tools from empirical process theory, we prove a general upper bound on the best possible (minimax) regret. The bound depends on certain metric properties of the class of predictors. We apply the bound to both parametric and nonparametric classes ofpredictors. Finally, we point out a suboptimal behavior of the popular Bayesian weighted average algorithm.
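
The standard definitions behind this setting are spelled out below for context; the notation is assumed here rather than quoted from the paper.

% For a class $\mathcal{F}$ of probability assignments and a prediction
% strategy $q$, the worst-case regret under log loss is
R_n(q,\mathcal{F}) = \sup_{x^n}\Big[-\log q(x^n) - \inf_{f\in\mathcal{F}}\big(-\log p_f(x^n)\big)\Big],
% and Shtarkov's theorem identifies the minimax value with the
% normalized-maximum-likelihood code length:
\inf_q R_n(q,\mathcal{F}) = \log \sum_{x^n} \sup_{f\in\mathcal{F}} p_f(x^n).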

Relevance: 100.00%

Abstract:

We develop a general error analysis framework for the Monte Carlo simulation of densities for functionals in Wiener space. We also study variance reduction methods with the help of Malliavin derivatives. For this, we give some general heuristic principles which are applied to diffusion processes. A comparison with kernel density estimates is made.
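
The Malliavin-calculus identity that makes such simulation possible is the standard integration-by-parts representation of the density, recalled below for context (standard background, not a formula quoted from the paper).

% For a nondegenerate functional $F$ on Wiener space, with Malliavin
% derivative $D$ and Skorohod integral $\delta$, the density is an
% expectation that Monte Carlo can estimate without kernel smoothing:
p_F(x) \;=\; \mathbb{E}\!\left[\mathbf{1}_{\{F > x\}}\,
\delta\!\left(\frac{DF}{\lVert DF\rVert_{H}^{2}}\right)\right].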

Relevance: 100.00%

Abstract:

This paper analyzes the relationship between ethnic fractionalization, polarization, and conflict. In recent years many authors have found empirical evidence that ethnic fractionalization has a negative effect on growth. One mechanism that can explain this nexus is the effect of ethnic heterogeneity on rent-seeking activities and the increase in potential conflict, which is negative for investment. However, the empirical evidence supporting the effect of ethnic fractionalization on the incidence of civil conflicts is very weak. Although ethnic fractionalization may be important for growth, we argue that the channel is not through an increase in potential ethnic conflict. We discuss the appropriateness of indices of polarization to capture conflictive dimensions. We develop a new measure of ethnic heterogeneity that satisfies the basic properties associated with the concept of polarization. The empirical section shows that this index of ethnic polarization is a significant variable in the explanation of the incidence of civil wars. This result is robust to the presence of other indicators of ethnic heterogeneity, other sources of data for the construction of the index, and other data structures.
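
The contrast between fractionalization and polarization indices can be seen in a few lines of code. The polarization formula below is the RQ-type index commonly used in this literature; whether it matches the paper's exact definition is an assumption made here for illustration.

# Sketch: fractionalization vs. an RQ-type polarization index on toy group shares.
import numpy as np

def fractionalization(shares):
    """Probability that two randomly drawn individuals belong to different groups."""
    p = np.asarray(shares, dtype=float)
    return 1.0 - np.sum(p**2)

def polarization_rq(shares):
    """RQ-type polarization: maximal for two groups of equal size."""
    p = np.asarray(shares, dtype=float)
    return 4.0 * np.sum(p**2 * (1.0 - p))

for name, s in [("two equal groups", [0.5, 0.5]),
                ("ten equal groups", [0.1] * 10)]:
    print(name, round(fractionalization(s), 3), round(polarization_rq(s), 3))

# Output illustrates the point in the abstract: many small groups maximize
# fractionalization (0.9 vs 0.5) but not polarization (0.36 vs 1.0).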

Relevance: 100.00%

Abstract:

Neural signatures of humans' movement intention can be exploited by future neuroprostheses. We propose a method for detecting self-paced upper limb movement intention from brain signals acquired with both invasive and noninvasive methods. In the first study, with scalp electroencephalogram (EEG) signals from healthy controls, we report single-trial detection of movement intention using movement-related potentials (MRPs) in the 0.1 to 1 Hz frequency range. Movement intention can be detected above chance level (p<0.05) on average 460 ms before movement onset, with a low detection rate during the non-movement intention period. Using intracranial EEG (iEEG) from one epileptic subject, we detect movement intention as early as 1500 ms before movement onset with accuracy above 90% using electrodes implanted in the bilateral supplementary motor area (SMA). The coherent results obtained with noninvasive and invasive methods, and their generalization across different days of recording, strengthen the case that self-paced movement intention can be detected before movement initiation, supporting advances in robot-assisted neurorehabilitation.
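
A toy version of the detection pipeline is sketched below on synthetic signals: band-pass filtering to the 0.1-1 Hz MRP band followed by a linear discriminant on pre-movement versus rest epochs. The sampling rate, epoch layout, and classifier choice are assumptions for illustration, not the study's actual pipeline.

# Sketch: MRP-band filtering + linear discriminant on synthetic EEG-like epochs.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 256                                             # assumed sampling rate (Hz)
b, a = butter(2, [0.1, 1.0], btype="band", fs=fs)    # 0.1-1 Hz MRP band

rng = np.random.default_rng(3)
def make_epoch(pre_movement):
    t = np.arange(2 * fs) / fs                       # 2-second epoch
    slow_wave = -2e-6 * np.sin(2 * np.pi * 0.5 * t) if pre_movement else 0.0
    return slow_wave + rng.normal(0, 1e-6, t.size)

epochs = [make_epoch(k % 2 == 0) for k in range(200)]
labels = np.array([k % 2 == 0 for k in range(200)], dtype=int)
features = np.array([filtfilt(b, a, e)[-fs // 2:] for e in epochs])  # last 0.5 s

clf = LinearDiscriminantAnalysis().fit(features[:150], labels[:150])
print("held-out accuracy:", clf.score(features[150:], labels[150:]))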

Relevance: 100.00%

Abstract:

I discuss the identifiability of a structural New Keynesian Phillips curve when it is embedded in a small-scale dynamic stochastic general equilibrium model. Identification problems emerge because not all the structural parameters are recoverable from the semi-structural ones and because the objective functions I consider are poorly behaved. The solution and moment mappings are responsible for these problems.
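
For readers unfamiliar with the object, a generic hybrid New Keynesian Phillips curve is written out below; this is the form typically used in the literature and not necessarily the exact specification in the paper.

% Generic hybrid NKPC, stated for context:
\pi_t = \gamma_f\,\mathbb{E}_t\,\pi_{t+1} + \gamma_b\,\pi_{t-1} + \lambda\,mc_t + \varepsilon_t ,
% where $mc_t$ is real marginal cost (or an output-gap proxy) and
% $(\gamma_f,\gamma_b,\lambda)$ are the semi-structural parameters whose
% mapping back to deep parameters creates the identification problem.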

Relevance: 100.00%

Abstract:

Dual scaling of a subjects-by-objects table of dominance data (preferences, paired comparisons and successive categories data) has been contrasted with correspondence analysis, as if the two techniques were somehow different. In this note we show that dual scaling of dominance data is equivalent to the correspondence analysis of a table which is doubled with respect to subjects. We also show that the results of both methods can be recovered from a principal components analysis of the undoubled dominance table, centred with respect to subject means.
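
The third claim, that the results can be recovered from a principal components analysis of the subject-centred dominance table, is easy to sketch via the SVD. The small rankings table below is illustrative, and the doubling construction itself is deliberately omitted here.

# Sketch: PCA of a subject-centred dominance (rankings) table via the SVD.
import numpy as np

# rows = subjects, columns = objects, entries = rank given (1 = most preferred)
R = np.array([[1, 2, 3, 4],
              [2, 1, 4, 3],
              [4, 3, 2, 1],
              [1, 3, 2, 4]], dtype=float)

D = R - R.mean(axis=1, keepdims=True)          # centre with respect to subject means
U, sv, Vt = np.linalg.svd(D, full_matrices=False)

subject_scores = U * sv                        # principal coordinates of subjects
object_loadings = Vt.T                         # directions for the objects
print(np.round(sv**2 / np.sum(sv**2), 3))      # share of variance per dimension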

Relevance: 100.00%

Abstract:

We propose a model, and solution methods, for locating a fixed number of multiple-server, congestible common service centers or congestible public facilities. Locations are chosen so as to minimize consumers' congestion (or queuing) and travel costs, considering that all the demand must be served. Customers choose the facilities to which they travel in order to receive service at minimum travel and congestion cost. As a proxy for this criterion, total travel and waiting costs are minimized. The travel cost is a general function of the origin and destination of the demand, while the congestion cost is a general function of the number of customers in queue at the facilities.
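
A toy version of this objective is sketched below on random planar data: customers repeatedly re-choose among the open facilities given travel plus a convex congestion cost, and candidate facility subsets of fixed size are compared by total cost. The data, the congestion function, and the simple best-response loop are illustrative assumptions, not the authors' formulation or algorithm.

# Sketch: choose p sites minimizing total travel + congestion cost.
import itertools
import numpy as np

rng = np.random.default_rng(4)
customers = rng.uniform(0, 10, (30, 2))
sites = rng.uniform(0, 10, (6, 2))
p = 2                                                # number of facilities to open

def congestion(n_assigned, alpha=0.5):
    return alpha * n_assigned**2                     # convex queuing proxy

def total_cost(open_idx):
    open_sites = sites[list(open_idx)]
    travel = np.linalg.norm(customers[:, None, :] - open_sites[None, :, :], axis=2)
    assign = travel.argmin(axis=1)                   # start from nearest facility
    for _ in range(20):                              # customers re-choose given queues
        counts = np.bincount(assign, minlength=len(open_idx))
        new_assign = (travel + congestion(counts)[None, :]).argmin(axis=1)
        if np.array_equal(new_assign, assign):
            break
        assign = new_assign
    counts = np.bincount(assign, minlength=len(open_idx))
    return travel[np.arange(len(customers)), assign].sum() + congestion(counts).sum()

best = min(itertools.combinations(range(len(sites)), p), key=total_cost)
print("best sites:", best)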

Relevance: 100.00%

Abstract:

PURPOSE: Recent work practices in conservation and restoration involve the use of cyclododecane (CDD, CAS 294-62-2) to protect fragile artifacts during handling or transportation. Little is known about its toxicity, and no previous exposure assessment has been reported. A short field investigation was conducted to characterize exposure conditions for both CDD vapors and aerosols.
METHODS: Measurements were conducted in the conservation and restoration laboratory of the archeological service in Bern (Switzerland). Three indoor and four outdoor typical work situations, during either brush or spray gun application, were investigated. Samples were collected on charcoal adsorbent tubes and analyzed by gas chromatography with flame ionization detection.
RESULTS: Measurements covered both brush and spray gun applications. Indoor exposures were 0.75-15.5 mg/m3, while outdoor exposures were 19.5-53.9 mg/m3. Exposures appear to be extremely localized, owing to both the physicochemical properties of CDD and the application methods. Vapor exposure increases dramatically with the confinement of the workplace.
CONCLUSION: Preventive measures should be undertaken to limit these exposures as much as possible. Fieldwork in confined areas (ditches, underground) is of particular concern. CDD-coated artifacts or materials should be stored in ventilated areas to avoid delayed exposures. [Authors]