874 results for multivariate stochastic volatility
Abstract:
In the context of Bayesian statistical analysis, elicitation is the process of formulating a prior density f(.) for one or more uncertain quantities to represent a person's knowledge and beliefs. Several different methods of eliciting prior distributions for one unknown parameter have been proposed. However, there are relatively few methods for specifying a multivariate prior distribution, and most are only applicable to specific classes of problems and/or based on restrictive conditions, such as independence of variables. Moreover, many of these procedures require the elicitation of variances and correlations, and sometimes of hyperparameters, which are difficult for experts to specify in practice. Garthwaite et al. (2005) discuss the different methods proposed in the literature and the difficulties of eliciting multivariate prior distributions. We describe a flexible method of eliciting multivariate prior distributions applicable to a wide class of practical problems. Our approach does not assume a parametric form for the unknown prior density f(.); instead we use nonparametric Bayesian inference, modelling f(.) by a Gaussian process prior distribution. The expert is then asked to specify certain summaries of his/her distribution, such as the mean, mode, marginal quantiles and a small number of joint probabilities. The analyst receives that information, treating it as a data set D with which to update his/her prior beliefs to obtain the posterior distribution for f(.). Theoretical properties of joint and marginal priors are derived, and numerical illustrations demonstrating our approach are given. © 2010 Elsevier B.V. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Wind-excited vibrations in the frequency range of 10 to 50 Hz due to vortex shedding often cause fatigue failures in the cables of overhead transmission lines. Damping devices, such as Stockbridge dampers, have long been in use for suppressing these vibrations. The dampers are conveniently modelled by means of their driving-point impedance, measured in the lab over the frequency range under consideration. The cables can be modelled as strings with an additional small bending stiffness. The main problem in modelling the vibrations lies, however, in the aerodynamic forces, which are usually approximated by the forces acting on a rigid cylinder in planar flow. In the present paper, the wind forces are represented by stochastic processes with arbitrary cross-correlation in space; the case of a Kármán vortex street on a rigid cylinder in planar flow is contained as a limit case in this approach. The authors believe that this new view of the problem may yield useful results, particularly concerning the reliability of the lines and the probability of fatigue damage. © 1987.
Abstract:
I analyze two inequalities on entropy and information, one due to von Neumann and a more recent one due to Schiffer, and show that the relevant quantities in these inequalities are related by special doubly stochastic matrices (DSM). I then use a generalization of the first inequality to prove algebraically a generalization of Schiffer's inequality to arbitrary DSM. I also give a second interpretation to the latter inequality, determine its domain of applicability, and illustrate it using Zeeman splitting. This example shows that symmetric (degenerate) systems have less entropy than the corresponding split systems when compared at the same average energy. This seemingly counter-intuitive result is explained thermodynamically. © 1991.
Abstract:
The correspondence between morphometric and isozymic geographic variation patterns of Africanized honey bees in Brazil was analyzed. Morphometric data consisted of mean vectors of 19 wing traits measured in 42 local populations distributed throughout the country. Isozymic data refer to allelic frequencies of malate dehydrogenase (MDH) and were obtained from Lobo and Krieger. The two data sets were analyzed through canonical trend surface, principal components and spatial autocorrelation analyses, and showed north-south clines, demonstrating that Africanized honey bees in southern and southeastern Brazil are more similar to European honey bees than those found in the northern and northeastern regions. Also, the morphometric variation is within the limits established by the racial admixture model, considering the expected values of Africanized honey bee fore wing length (WL) in the southern and northeastern regions of Brazil, estimated by combining average values of WL in the three main subspecies involved in the Africanization process (Apis mellifera scutellata, A. m. ligustica and A. m. mellifera) with racial admixture coefficients.
Abstract:
A methodology to define favorable areas in petroleum and mineral exploration is applied; it consists of weighting the exploratory variables in order to characterize their importance as exploration guides. The exploration data are spatially integrated in the selected area to establish the association between variables and deposits, and the relationships among distribution, topology, and indicator pattern of all variables. Two methods of statistical analysis were compared. The first is Weights of Evidence Modeling, a conditional probability approach (Agterberg, 1989a), and the second is Principal Components Analysis (Pan, 1993). In the conditional method, the favorability estimation is based on the probability of joint occurrence of deposit and variable, with the weights defined as natural logarithms of likelihood ratios. In the multivariate analysis, the cells which contain deposits are selected as control cells and the weights are determined by eigendecomposition, being represented by the coefficients of the eigenvector related to the system's largest eigenvalue. The two weighting techniques and complementary procedures were tested on two case studies: 1. the Recôncavo Basin, Northeast Brazil (for petroleum), and 2. the Itaiacoca Formation of the Ribeira Belt, Southeast Brazil (for Pb-Zn Mississippi Valley Type deposits). The applied methodology proved easy to use and of great assistance in predicting favorability over large areas, particularly in the initial phase of exploration programs. © 1998 International Association for Mathematical Geology.
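As the abstract notes, the weights of evidence are natural logarithms of likelihood ratios relating a binary indicator pattern to deposit occurrence. A minimal sketch of that computation, with illustrative cell counts and a hypothetical function name (not the authors' implementation):

```python
import math

def weights_of_evidence(n_cells, n_deposit, n_pattern, n_overlap):
    """Positive and negative weights for one binary indicator pattern.

    n_cells   : total number of unit cells in the study area
    n_deposit : cells containing a deposit
    n_pattern : cells where the indicator pattern is present
    n_overlap : cells with both the pattern and a deposit
    """
    # Conditional probabilities of the pattern given deposit / no deposit
    p_b_d = n_overlap / n_deposit
    p_b_nd = (n_pattern - n_overlap) / (n_cells - n_deposit)
    # Weights are natural logs of the likelihood ratios
    w_plus = math.log(p_b_d / p_b_nd)                  # pattern present
    w_minus = math.log((1.0 - p_b_d) / (1.0 - p_b_nd))  # pattern absent
    return w_plus, w_minus

# Illustrative numbers: 10,000 cells, 20 deposits, pattern covers 1,000
# cells and coincides with 10 of the deposits.
w_plus, w_minus = weights_of_evidence(10000, 20, 1000, 10)
```

A pattern positively associated with deposits yields W+ > 0 and W- < 0; the contrast W+ - W- is a common measure of the pattern's strength as an exploration guide.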
Abstract:
Power-law distributions, i.e. Lévy flights, have been observed in various economic, biological, and physical systems in the high-frequency regime. These distributions can be successfully explained via the gradually truncated Lévy flight (GTLF). In general, these systems converge to a Gaussian distribution in the low-frequency regime. In the present work, we develop a model for the physical basis of the cut-off length in the GTLF and its variation with respect to the time interval between successive observations. We observe that the GTLF automatically approaches a Gaussian distribution in the low-frequency regime. We applied the present method to analyze time series from some physical and financial systems. The agreement between the experimental results and the theoretical curves is excellent. The present method can be applied to analyze time series in a variety of fields, which in turn provides a basis for the development of further microscopic models for the system. © 2000 Elsevier Science B.V. All rights reserved.
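The gradual-truncation idea can be illustrated with a generic sketch: power-law (Pareto) step sizes with a smooth cutoff factor applied beyond a length l_c, which makes the variance finite so that sums of many steps approach a Gaussian. The functional form and parameter values below are illustrative assumptions, not the authors' exact GTLF model:

```python
import math
import random

random.seed(0)

# Illustrative parameters (assumptions): tail exponent, cutoff length, decay scale
ALPHA, L_CUT, K = 1.5, 10.0, 5.0

def cutoff(x):
    """Gradual truncation factor: 1 below L_CUT, decaying smoothly above it."""
    return 1.0 if x <= L_CUT else math.exp(-((x - L_CUT) / K) ** 2)

def levy_step():
    """Rejection-sample a signed step from the gradually truncated power law."""
    while True:
        u = random.random()
        x = u ** (-1.0 / ALPHA)  # Pareto draw: P(X > x) = x**(-ALPHA), x >= 1
        if random.random() < cutoff(x):
            return x if random.random() < 0.5 else -x

# Aggregating many truncated steps drives the sum toward a Gaussian
# (finite variance + central limit theorem), i.e. the low-frequency regime.
sums = [sum(levy_step() for _ in range(200)) for _ in range(500)]
```

The raw Pareto steps have infinite variance for ALPHA < 2; the cutoff factor restores a finite variance, which is what lets the aggregated sums converge to a Gaussian.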
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper is concerned with the stability of discrete-time linear systems subject to random jumps in the parameters, described by an underlying finite-state Markov chain. In the model studied, a stopping time τΔ is associated with the occurrence of a crucial failure, after which the system is brought to a halt for maintenance. The usual stochastic stability concepts and associated results are not appropriate, since they are tailored to pure infinite-horizon problems. Using the concept named stochastic τ-stability, equivalent conditions ensuring the stochastic stability of the system until the occurrence of τΔ are obtained. In addition, an intermediary and mixed case, in which τ represents the minimum between the occurrence of a fixed number N of failures and the occurrence of a crucial failure τΔ, is also considered. Necessary and sufficient conditions ensuring stochastic τ-stability in this setting are provided as auxiliary to the main result.
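For contrast with the paper's τ-stability, the classical infinite-horizon mean-square stability of a Markov jump linear system x_{k+1} = A_{θ(k)} x_k can be checked via the spectral radius of the second-moment operator. A minimal sketch, with an illustrative two-mode system (the matrices and transition probabilities are assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative two-mode jump linear system (assumed for this sketch)
A = [np.array([[0.5, 0.1], [0.0, 0.6]]),   # mode-1 dynamics
     np.array([[1.1, 0.0], [0.2, 0.4]])]   # mode-2 dynamics (unstable alone)
P = np.array([[0.7, 0.3],                   # P[i, j] = Prob(theta_{k+1}=j | theta_k=i)
              [0.4, 0.6]])

n, m = A[0].shape[0], len(A)

# Second moments X_j(k) = E[x x' 1{theta=j}] evolve linearly:
#   vec X_j(k+1) = sum_i P[i, j] * kron(A_i, A_i) vec X_i(k),
# so stack them into one operator with block (j, i) = P[i, j] * kron(A_i, A_i).
Lam = np.zeros((m * n * n, m * n * n))
for i in range(m):
    for j in range(m):
        Lam[j*n*n:(j+1)*n*n, i*n*n:(i+1)*n*n] = P[i, j] * np.kron(A[i], A[i])

rho = max(abs(np.linalg.eigvals(Lam)))
ms_stable = rho < 1.0   # mean-square stable iff spectral radius < 1
```

This is the standard infinite-horizon test; the paper's point is precisely that such conditions need to be replaced when the horizon is cut short by the stopping time τΔ.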
Abstract:
The effect of combining photocatalytic processes using TiO2 and the photo-Fenton reaction with Fe3+ or ferrioxalate as a source of Fe2+ was investigated in the degradation of 4-chlorophenol (4CP) and dichloroacetic acid (DCA) using solar irradiation. Multivariate analysis was used to evaluate the role of three variables: iron, H2O2 and TiO2 concentrations. The results show that TiO2 plays a minor role when compared to iron and H2O2 in the solar degradation of 4CP and DCA under the studied conditions. However, its presence can improve TOC removal when H2O2 is totally consumed. Iron and peroxide play major roles, especially when Fe(NO3)3 is used in the degradation of 4CP. No significant synergistic effect was observed by the addition of TiO2 in this process. On the other hand, synergistic effects were observed between FeOx and TiO2 and between H2O2 and TiO2 in the degradation of DCA. © IWA Publishing 2004.
Abstract:
The GPS observables are subject to several errors. Among them, the systematic errors have the greatest impact, because they degrade the accuracy of the resulting positioning. These errors are mainly related to GPS satellite orbits, multipath and atmospheric effects. Recently, a method has been suggested to mitigate these errors: the semiparametric model and the penalised least squares technique (PLS). In this method, the errors are modeled as functions varying smoothly in time. This amounts to changing the stochastic model, into which the error functions are incorporated; the results obtained are similar to those achieved by changing the functional model. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method (CLS). In general, the solution requires a shorter data interval, minimizing costs. The method's performance was analyzed in two experiments, using data from single-frequency receivers. The first was carried out with a short baseline, where the main error was multipath. In the second experiment, a baseline of 102 km was used; in this case, the predominant errors were due to ionospheric and tropospheric refraction. In the first experiment, using 5 minutes of data collection, the largest coordinate discrepancies in relation to the ground truth reached 1.6 cm and 3.3 cm in the h coordinate for the PLS and the CLS, respectively. In the second, also using 5 minutes of data, the discrepancies were 27 cm in h for the PLS and 175 cm in h for the CLS. In these tests, it was also possible to verify a considerable improvement in ambiguity resolution using the PLS in relation to the CLS, with a reduced data collection time interval. © Springer-Verlag Berlin Heidelberg 2007.
Abstract:
In this article, we evaluate the performance of the T2 chart based on principal components (PC chart) and the simultaneous univariate control charts based on the original variables (SU X̄ charts) or on the principal components (SUPC charts). The main reason to consider the PC chart lies in its dimensionality reduction. However, depending on the disturbance and on the way the original variables are related, the chart is very slow in signaling, except when all variables are negatively correlated and the principal component is wisely selected. Comparing the SU X̄, the SUPC and the T2 charts, we conclude that the SU X̄ charts (SUPC charts) have a better overall performance when the variables are positively (negatively) correlated. We also develop the expression for the power of two S2 charts designed for monitoring the covariance matrix. These joint S2 charts are, in the majority of cases, more efficient than the generalized variance |S| chart.
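The Hotelling T2 statistic underlying these charts, and its equivalence to a sum of standardized squared principal-component scores, can be sketched as follows (the in-control mean and covariance below are illustrative assumptions, not values from the article):

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed in-control parameters for two positively correlated variables
mu0 = np.array([0.0, 0.0])
Sigma0 = np.array([[1.0, 0.7],
                   [0.7, 1.0]])
Sigma0_inv = np.linalg.inv(Sigma0)

def t2(x):
    """Hotelling T2 = (x - mu0)' Sigma0^{-1} (x - mu0) for one observation."""
    d = x - mu0
    return float(d @ Sigma0_inv @ d)

# Principal-component version: the scores are uncorrelated, and summing the
# squared scores standardized by their eigenvalues reproduces T2 exactly.
eigval, eigvec = np.linalg.eigh(Sigma0)

def t2_pc(x):
    z = eigvec.T @ (x - mu0)              # principal-component scores
    return float(np.sum(z**2 / eigval))   # standardized squared scores

x = rng.multivariate_normal(mu0, Sigma0)  # one simulated observation
```

The equivalence (t2 and t2_pc agree for every x) is why retaining only a subset of components in the PC chart discards information; which component to keep matters, as the abstract's signaling results indicate.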
Abstract:
Includes bibliography