172 results for Multivariate White Noise
Abstract:
It has recently been found that a number of systems displaying crackling noise also show a remarkable behavior regarding the temporal occurrence of successive events versus their size: a scaling law for the probability distributions of waiting times as a function of a minimum size is fulfilled, signaling the existence in those systems of self-similarity in time and size. This property is also present in some non-crackling systems. Here, the uncommon character of the scaling law is illustrated with simple marked renewal processes, built by definition with no correlations. Whereas processes with a finite mean waiting time do not fulfill a scaling law in general and tend towards a Poisson process in the limit of very large sizes, processes without a finite mean tend to another class of distributions, characterized by double power-law waiting-time densities. This is somewhat reminiscent of the generalized central limit theorem. A model with short-range correlations is not able to escape from the attraction of those limit distributions. A discussion of open problems in the modeling of these properties is provided.
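The marked renewal picture described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' exact construction: it draws uncorrelated exponential waiting times (finite mean) and power-law-distributed sizes ("marks"), then thins the process by a minimum size and inspects the resulting waiting times. All distribution choices and function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Marked renewal process: i.i.d. waiting times and i.i.d. sizes ("marks"),
# with no correlations by construction.
n = 200_000
waits = rng.exponential(scale=1.0, size=n)   # finite-mean waiting times
sizes = rng.pareto(a=1.5, size=n) + 1.0      # heavy-tailed marks, sizes >= 1
times = np.cumsum(waits)

def thresholded_waits(times, sizes, s_min):
    """Waiting times between successive events with size >= s_min."""
    return np.diff(times[sizes >= s_min])

# Thinning by a growing size threshold keeps fewer events, so the mean
# waiting time grows like the inverse of the retained fraction.  With
# exponential waits the thinned process stays Poisson; the abstract's point
# is that general finite-mean renewal processes tend to Poisson in this limit.
for s_min in (1.0, 5.0, 20.0):
    w = thresholded_waits(times, sizes, s_min)
    frac = (sizes >= s_min).mean()
    print(f"s_min={s_min:5.1f}  kept={frac:.4f}  mean wait={w.mean():.2f}")
```

Replacing the exponential waits with a density lacking a finite mean is the regime where, per the abstract, the double power-law limit distributions appear instead.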
Abstract:
This paper empirically studies the effects of service offshoring on white-collar employment, using data for more than one hundred U.S. occupations. A model of firm behavior based on separability makes it possible to derive the labor demand elasticity with respect to service offshoring for each occupation. Estimation is performed with Quasi-Maximum Likelihood, to account for high degrees of censoring in the employment variable. The estimated elasticities are then related to proxies for the skill level and the degree of tradability of the occupations. Results show that service offshoring increases high-skilled employment and decreases medium- and low-skilled employment. Within each skill group, however, service offshoring penalizes tradable occupations and benefits non-tradable occupations.
Abstract:
When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor insurance or homeowner's insurance policy, they usually assume that the types of claim are independent. However, this assumption may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce different regression models in order to relax the independence assumption, including zero-inflated models to account for excess of zeros and overdispersion. These models have been largely ignored in the multivariate Poisson literature to date, mainly because of their computational difficulties. Bayesian inference based on MCMC helps to solve this problem (and also lets us derive, for several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile insurance claims database with three different types of claims. We analyse the consequences for pure and loaded premiums when the independence assumption is relaxed by using different multivariate Poisson regression models and their zero-inflated versions.
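The two features the abstract relaxes can be seen in a few lines of simulation. The sketch below (illustrative parameters, not the paper's fitted model) builds a bivariate Poisson via the common-shock construction, which induces the positive correlation between claim types, and then adds zero inflation, which produces more (0,0) policies than independent Poisson margins would allow:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Common-shock bivariate Poisson: X1 = Y1 + Y0, X2 = Y2 + Y0.
# The shared component Y0 induces positive correlation between claim types.
y0 = rng.poisson(0.3, n)
x1 = rng.poisson(0.5, n) + y0
x2 = rng.poisson(0.7, n) + y0

# Zero inflation: with probability p_zero a policy produces no claims at all,
# generating an excess of zeros over any pure Poisson model.
p_zero = 0.4
mask = rng.random(n) < p_zero
x1[mask] = 0
x2[mask] = 0

corr = np.corrcoef(x1, x2)[0, 1]
print(f"correlation between claim types: {corr:.3f}")
print(f"fraction of (0,0) pairs: {np.mean((x1 == 0) & (x2 == 0)):.3f}")
```

Fitting such a model with covariates, as the paper does, would replace the fixed rates with regression functions and use MCMC for posterior inference.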
Abstract:
The project consisted of the automatic creation of statistical noise charts for Europe with Open Source technologies inside ETC-LUSI's Noise Map Viewer for Europe. The library used for this process was JFreeChart and the programming language used was Java (object-oriented programming), within the Eclipse integrated development environment. The database used was PostgreSQL. Apache (HTTP server) and Tomcat (application-container server) were used as servers. Once the process was finished, it was integrated into MapFish by changing the corresponding JavaScript code of the original website.
Abstract:
This paper proposes a contemporaneous-threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are also a function of the regime-specific innovation covariance matrix, the model can account for contemporaneous regime-specific co-movements of the variables. The stability and distributional properties of the proposed model are discussed, as well as issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates.
Gaussian estimates for the density of the non-linear stochastic heat equation in any space dimension
Abstract:
In this paper, we establish lower and upper Gaussian bounds for the probability density of the mild solution to the stochastic heat equation with multiplicative noise and in any space dimension. The driving perturbation is a Gaussian noise which is white in time with some spatially homogeneous covariance. These estimates are obtained using tools of the Malliavin calculus. The most challenging part is the lower bound, which is obtained by adapting a general method developed by Kohatsu-Higa to the underlying spatially homogeneous Gaussian setting. Both lower and upper estimates have the same form: a Gaussian density with a variance which is equal to that of the mild solution of the corresponding linear equation with additive noise.
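The statement that both estimates "have the same form" can be written schematically. The display below is a hedged sketch of the shape of such two-sided bounds, not the paper's exact theorem: the constants $c_i$, $C_i$ and the notation $p_{t,x}$ for the density of the mild solution are illustrative, and $\sigma_t^2$ denotes the variance of the mild solution of the corresponding linear equation with additive noise.

```latex
c_1\,\sigma_t^{-1}
\exp\!\left(-\frac{|y-m|^2}{c_2\,\sigma_t^{2}}\right)
\;\le\; p_{t,x}(y) \;\le\;
C_1\,\sigma_t^{-1}
\exp\!\left(-\frac{|y-m|^2}{C_2\,\sigma_t^{2}}\right)
```

The lower bound is the delicate part, obtained in the paper by adapting Kohatsu-Higa's method to the spatially homogeneous Gaussian setting via Malliavin calculus.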
Abstract:
This paper studies frequent monitoring in an infinitely repeated game with imperfect public information and discounting, where players observe the state of a continuous-time Brownian process at time intervals of length Δ. It shows that a limit folk theorem can be achieved with imperfect public monitoring when players monitor each other at the highest frequency, i.e., as Δ → 0. The approach assumes that the expected joint output depends exclusively on the action profile simultaneously and privately decided by the players at the beginning of each period of the game, but not on Δ. The strong decreasing effect on the expected immediate gains from deviation when the interval between actions shrinks, and the associated increased precision of the public signals, make the result possible in the limit. JEL: C72/73, D82, L20. KEYWORDS: Repeated Games, Frequent Monitoring, Public Monitoring, Brownian Motion.
Abstract:
In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA, Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of Pleistocene sands and of 250 major and minor modern rivers draining the southern flank of the Alps from West to East (Garzanti et al, 2004; 2006). Prior to the onset of major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin longitudinally, parallel to the SouthAlpine belt, by a trunk river (Vezzoli and Garzanti, 2008). This scenario rapidly changed during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al, 2003). PCA and similarity analysis from core samples show that the longitudinal trunk river at this time was shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, and glacial sediments transported by Alpine valley glaciers invaded the alluvial plain. Key words: Detrital modes; Modern sands; Provenance; Principal Components Analysis; Similarity; Canberra Distance; palaeodrainage
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
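The power transformations mentioned above are the simplest of the linking devices: a Box-Cox-type transform interpolates continuously between the raw data (power 1) and the logarithm (power 0), which is the limit in which log-ratio methods live. A minimal sketch of the interpolation, assuming only this standard transform:

```python
import numpy as np

def box_cox(x, alpha):
    """Power transform (x**alpha - 1)/alpha, tending to log(x) as alpha -> 0."""
    return np.log(x) if alpha == 0 else (x**alpha - 1.0) / alpha

# A 3-part composition; sweeping alpha in small steps is what produces the
# "frames" of the movie described above.
x = np.array([0.2, 0.3, 0.5])
for a in (1.0, 0.5, 0.1, 0.01, 0.0):
    print(f"alpha={a:4.2f}  ->  {box_cox(x, a)}")
```

Recomputing a map (e.g. a PCA biplot) of the transformed data at each small step of `alpha` gives the smooth morphing between methods that the presentation shows.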
Abstract:
A compositional time series is obtained when a compositional data vector is observed at different points in time. Inherently, then, a compositional time series is a multivariate time series with important constraints on the variables observed at any instance in time. Although this type of data frequently occurs in situations of real practical interest, a trawl through the statistical literature reveals that research in the field is very much in its infancy and that many theoretical and empirical issues still remain to be addressed. Any appropriate statistical methodology for the analysis of compositional time series must take into account the constraints which are not allowed for by the usual statistical techniques available for analysing multivariate time series. One general approach to analysing compositional time series consists in the application of an initial transform to break the positive and unit-sum constraints, followed by the analysis of the transformed time series using multivariate ARIMA models. In this paper we discuss the use of the additive log-ratio, centred log-ratio and isometric log-ratio transforms. We also present results from an empirical study designed to explore how the selection of the initial transform affects subsequent multivariate ARIMA modelling as well as the quality of the forecasts.
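The three transforms compared in the abstract have short standard definitions. The sketch below implements them for a single composition; the Helmert-style basis used for the isometric log-ratio is one common choice among many, so treat it as an illustrative assumption:

```python
import numpy as np

def alr(x):
    """Additive log-ratio: log of each part relative to the last component."""
    return np.log(x[..., :-1] / x[..., -1:])

def clr(x):
    """Centred log-ratio: log of each part relative to the geometric mean."""
    g = np.exp(np.mean(np.log(x), axis=-1, keepdims=True))
    return np.log(x / g)

def ilr(x):
    """Isometric log-ratio via one common orthonormal basis of the clr plane."""
    d = x.shape[-1]
    h = np.zeros((d - 1, d))
    for i in range(d - 1):
        h[i, : i + 1] = 1.0 / (i + 1)
        h[i, i + 1] = -1.0
        h[i] *= np.sqrt((i + 1) / (i + 2))
    return clr(x) @ h.T

# A 3-part composition (positive parts summing to one).
x = np.array([0.2, 0.3, 0.5])
print("alr:", alr(x))   # 2 unconstrained coordinates
print("clr:", clr(x))   # 3 coordinates summing to zero
print("ilr:", ilr(x))   # 2 orthonormal coordinates
```

Applying one of these componentwise to each time point frees the series from the unit-sum constraint, after which ordinary multivariate ARIMA machinery applies.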
Abstract:
Exact closed-form expressions are obtained for the outage probability of maximal ratio combining in η-μ fading channels with antenna correlation and co-channel interference. The scenario considered in this work assumes the joint presence of background white Gaussian noise and independent Rayleigh-faded interferers with arbitrary powers. Outage probability results are obtained through an appropriate generalization of the moment-generating function of the η-μ fading distribution, for which new closed-form expressions are provided.
Abstract:
We consider the application of normal theory methods to the estimation and testing of a general type of multivariate regression models with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables deviate possibly from normality. The various samples to be merged can differ on the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equations models, such as LISREL, EQS, LISCOMP, among others. An illustration with Monte Carlo data is presented.
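The core difficulty with errors-in-variables that motivates this line of work is easy to demonstrate with Monte Carlo data: naive regression on the error-contaminated covariate is biased towards zero (attenuation). The sketch below illustrates the problem the paper's parameterization addresses; all parameter values are illustrative, and this is not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Structural relation y = b*xi + e, but xi is observed only through
# x = xi + u (measurement error with variance 1).
b = 2.0
xi = rng.normal(0.0, 1.0, n)
x = xi + rng.normal(0.0, 1.0, n)
y = b * xi + rng.normal(0.0, 0.5, n)

# Naive OLS slope of y on x is attenuated by the reliability ratio
# var(xi) / (var(xi) + var(u)) = 1 / (1 + 1) = 0.5.
b_ols = np.cov(x, y)[0, 1] / np.var(x)
print(f"naive OLS slope: {b_ols:.2f}  (true slope {b}, expected ~{b * 0.5:.1f})")
```

Consistent estimation here requires modeling the measurement-error structure, which is what structural-equation software such as LISREL or EQS does when the model is parameterized as the paper describes.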
Abstract:
Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviates from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We will show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis, and do not require computing sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables will be presented.
Abstract:
We study a novel class of noisy rational expectations equilibria in markets with a large number of agents. We show that, as long as noise increases with the number of agents in the economy, the limiting competitive equilibrium is well-defined and leads to non-trivial information acquisition, perfect information aggregation, and partially revealing prices, even if per capita noise tends to zero. We find that in such an equilibrium risk sharing and price revelation play different roles than in the standard limiting economy in which per capita noise is not negligible. We apply our model to study information sales by a monopolist, information acquisition in multi-asset markets, and derivatives trading. The limiting equilibria are shown to be perfectly competitive, even when a strategic solution concept is used.