990 results for Dirichlet multivariate processes
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives, Université de Montréal.
Abstract:
We introduce a new class of bivariate distributions of the Marshall-Olkin type, the bivariate Erlang distribution. Its Laplace transform, moments and conditional densities are derived, and potential applications in life insurance and finance are considered. Maximum likelihood estimators of the parameters are computed with the Expectation-Maximization algorithm. The research project is then devoted to the study of multivariate risk processes, which can be useful for studying ruin problems of insurance companies with dependent classes of business. We apply results from the theory of piecewise deterministic Markov processes to obtain the exponential martingales needed to establish computable upper bounds on the ruin probability, whose exact expressions are intractable.
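For orientation, the sketch below simulates the classical Marshall-Olkin shock construction with exponential shocks, in which a common shock induces dependence between the two components. It is only an illustration of the Marshall-Olkin mechanism under assumed exponential marginals, not the bivariate Erlang distribution studied in the thesis; all parameter names are placeholders.

```python
import numpy as np

def marshall_olkin_exponential(n, lam1=1.0, lam2=1.5, lam12=0.5, seed=0):
    """Simulate n draws from the classical Marshall-Olkin bivariate
    exponential: X = min(Z1, Z12), Y = min(Z2, Z12), where Z1, Z2, Z12
    are independent exponential shocks (two individual, one common)."""
    rng = np.random.default_rng(seed)
    z1 = rng.exponential(1.0 / lam1, n)
    z2 = rng.exponential(1.0 / lam2, n)
    z12 = rng.exponential(1.0 / lam12, n)   # common shock creating dependence
    return np.minimum(z1, z12), np.minimum(z2, z12)

x, y = marshall_olkin_exponential(10_000)
print(np.corrcoef(x, y)[0, 1])  # positive correlation from the shared shock
```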
Abstract:
In this thesis we study several fundamental problems in financial and actuarial mathematics, as well as their applications. The thesis consists of three contributions dealing mainly with risk measure theory, the capital allocation problem and fluctuation theory. In Chapter 2 we construct new coherent risk measures and study capital allocation within the framework of collective risk theory; to this end we introduce the family of Cumulative Entropic Risk Measures. Chapter 3 studies the optimal portfolio problem for the Entropic Value at Risk when returns are modeled by a jump-diffusion process. In Chapter 4 we generalize the notion of natural risk statistics to the multivariate setting; this non-trivial extension yields multivariate risk measures built from financial and insurance data. Chapter 5 introduces the concepts of drawdown and speed of depletion in ruin theory, and we study these concepts for risk models described by a family of spectrally negative Lévy processes.
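For reference, the Entropic Value at Risk named in Chapter 3 is commonly defined as follows at confidence level 1-α, assuming the moment generating function of X exists on a neighbourhood of zero; the notation here is ours, not necessarily the thesis's:

\[
\mathrm{EVaR}_{1-\alpha}(X) \;=\; \inf_{z>0}\, \frac{1}{z}\,\ln\!\left(\frac{M_X(z)}{\alpha}\right),
\qquad M_X(z) = \mathbb{E}\!\left[e^{zX}\right].
\]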
Abstract:
A bivariate semi-Pareto distribution is introduced and characterized using geometric minimization. Autoregressive minification models for bivariate random vectors with bivariate semi-Pareto and bivariate Pareto distributions are also discussed. Multivariate generalizations of the distributions and the processes are briefly indicated.
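As a rough illustration of the kind of autoregressive minification model mentioned above, the sketch below simulates a generic first-order recursion of the form X_n = c · min(X_{n-1}, ε_n) with Pareto-type innovations. The constant c, the innovation law and the parameter values are assumptions for illustration only, not the specific (bivariate) semi-Pareto construction of the paper.

```python
import numpy as np

def minification_path(n, c=1.3, alpha=2.0, x0=1.0, seed=0):
    """Simulate a generic first-order minification model
    X_{t+1} = c * min(X_t, eps_t), with Pareto(alpha) innovations."""
    rng = np.random.default_rng(seed)
    eps = rng.pareto(alpha, n) + 1.0   # classical Pareto innovations on [1, inf)
    x = np.empty(n + 1)
    x[0] = x0
    for t in range(n):
        x[t + 1] = c * min(x[t], eps[t])
    return x

path = minification_path(1000)
print(path[:5])
```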
Abstract:
Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop through standard methodology, such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
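A minimal sketch, in the spirit of the compositional singular value decomposition described above: compositions are mapped out of the unit simplex with the centred log-ratio transform and the centred result is decomposed by SVD. The three-part data set and variable names are illustrative, not the geochemical data of the paper.

```python
import numpy as np

def clr(compositions):
    """Centred log-ratio transform: log of each part minus the row-wise
    geometric mean (mapping the unit simplex to real coordinates)."""
    logx = np.log(compositions)
    return logx - logx.mean(axis=1, keepdims=True)

# Illustrative 3-part compositions (each row sums to 1)
X = np.array([[0.6, 0.3, 0.1],
              [0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3]])

Z = clr(X)
Z_centred = Z - Z.mean(axis=0)          # centre over specimens
U, s, Vt = np.linalg.svd(Z_centred, full_matrices=False)
print(s)   # singular values: dominant modes of compositional variability
```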
Abstract:
Phenolic compounds in wastewaters are difficult to treat using conventional biological techniques such as activated sludge processes because of their bio-toxic and recalcitrant properties and the high volumes released from various chemical, pharmaceutical and other industries. In the current work, a modified heterogeneous advanced Fenton process (AFP) is presented as a novel methodology for the treatment of phenolic wastewater. The modified AFP, which combines hydrodynamic cavitation generated in a liquid whistle reactor with the AFP, is a promising technology for wastewaters containing high organic content. The presence of hydrodynamic cavitation in the treatment scheme intensifies the Fenton process by generating additional free radicals. The turbulence produced during hydrodynamic cavitation also increases mass transfer rates and provides better contact between the pseudo-catalyst surfaces and the reactants. A multivariate design of experiments was used to ascertain the influence of hydrogen peroxide dosage and iron catalyst loading on the oxidation performance of the modified AFP. Higher TOC removal rates were achieved with increased concentrations of hydrogen peroxide. In contrast, the effect of catalyst loading on the TOC removal rate was less important under the conditions used in this work, although there is an optimum value of this parameter. The concentration of iron species in the reaction solution was measured at 105 min, and its relationship with the catalyst loading and hydrogen peroxide level is presented.
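To illustrate how a two-factor multivariate design of experiments of this kind can be analysed, the sketch below fits a linear model with interaction to coded factor levels by ordinary least squares. The factor levels and TOC-removal responses are made up for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical coded levels for a 2^2 factorial with centre points:
# x1 = H2O2 dosage, x2 = iron catalyst loading (coded -1, 0, +1)
x1 = np.array([-1.0, -1.0,  1.0,  1.0, 0.0, 0.0])
x2 = np.array([-1.0,  1.0, -1.0,  1.0, 0.0, 0.0])
y  = np.array([22.0, 25.0, 48.0, 55.0, 38.0, 40.0])   # made-up TOC removal (%)

# Design matrix: intercept, two main effects, interaction term
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "b_H2O2", "b_Fe", "b_interaction"], coef.round(2))))
```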
Abstract:
The South American (SA) rainy season is studied in this paper through the application of a multivariate Empirical Orthogonal Function (EOF) analysis to a SA gridded precipitation analysis and to the components of the Lorenz Energy Cycle (LEC) derived from the National Centers for Environmental Prediction (NCEP) reanalysis. The EOF analysis leads to the identification of patterns of the rainy season and the associated mechanisms in terms of their energetics. The first combined EOF represents the northwest-southeast dipole of the precipitation between South and Central America, the South American Monsoon System (SAMS). The second combined EOF represents a synoptic pattern associated with the SACZ (South Atlantic convergence zone) and the third EOF is in spatial quadrature with the second EOF. The phase relationship of the EOFs, as computed from the principal components (PCs), suggests a nonlinear transition from the SACZ to the fully developed SAMS mode by November and between both components describing the SACZ by September-October (the rainy season onset). According to the LEC, the first mode is dominated by the eddy generation term at its maximum, the second by both baroclinic and eddy generation terms and the third by barotropic instability prior to the connection to the second mode by September-October. The predominance of the different LEC components at each phase of the SAMS can be used as an indicator of the onset of the rainy season in terms of physical processes, while the existence of the outstanding spectral peaks in the time dependence of the EOFs at the intraseasonal time scale could be used for monitoring purposes.
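A minimal sketch of how a combined (multivariate) EOF analysis can be computed: the standardized anomaly fields are stacked along the spatial dimension and decomposed with an SVD, so each mode carries a joint precipitation/energetics pattern. The arrays here are random placeholders standing in for the gridded precipitation and LEC fields of the study.

```python
import numpy as np

rng = np.random.default_rng(1)
ntime, nspace = 120, 50                      # e.g. 120 months, 50 grid points
precip = rng.normal(size=(ntime, nspace))    # placeholder anomaly field 1
energy = rng.normal(size=(ntime, nspace))    # placeholder anomaly field 2

def standardize(a):
    return (a - a.mean(axis=0)) / a.std(axis=0)

# Combined EOF: stack the standardized fields along the spatial dimension
combined = np.hstack([standardize(precip), standardize(energy)])
U, s, Vt = np.linalg.svd(combined, full_matrices=False)

pcs  = U * s                     # principal components (time series)
eofs = Vt                        # combined spatial patterns
var_frac = s**2 / np.sum(s**2)
print(var_frac[:3])              # variance fraction of the leading modes
```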
Abstract:
This paper presents a GIS-based multicriteria flood risk assessment and mapping approach applied to coastal drainage basins where hydrological data are not available. It addresses the risk associated with several types of processes: coastal inundation (storm surge), river, estuarine and flash floods, in both urban and natural areas, and fords. Based on the causes of these processes, several environmental indicators were selected to build up the risk assessment. Geoindicators include geological-geomorphological properties of Quaternary sedimentary units, water table, drainage basin morphometry, coastal dynamics, beach morphodynamics and microclimatic characteristics. Bioindicators involve coastal plain and low-slope native vegetation categories and two alteration states. Anthropogenic indicators encompass land use properties such as type, occupation density, urban structure type and degree of occupation consolidation. The selected indicators were stored within an expert Geoenvironmental Information System developed for the State of Sao Paulo Coastal Zone (SIIGAL), whose attributes were classified through deterministic approaches in order to estimate natural susceptibilities (Sn), human-induced susceptibilities (Sa), the return period of rain events (Ri), potential damages (Dp) and the risk classification (R), according to the equation R = (Sn · Sa · Ri) · Dp. Thematic maps were automatically processed within the SIIGAL, in which automata cells ("geoenvironmental management units") aggregating geological-geomorphologic and land use/native vegetation categories were the units of classification. The method has been applied to 32 small drainage basins on the Northern Littoral of the State of Sao Paulo (Brazil), proving very useful for coastal zone public policies, civil defense programs and flood management.
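A small worked example of the risk equation R = (Sn · Sa · Ri) · Dp applied unit by unit; the indicator class values below are hypothetical and only illustrate how the per-unit classification would be evaluated.

```python
import numpy as np

# Hypothetical indicator classes for three geoenvironmental management units
Sn = np.array([2, 3, 1])   # natural susceptibility
Sa = np.array([1, 2, 3])   # human-induced susceptibility
Ri = np.array([2, 2, 1])   # return-period class of rain events
Dp = np.array([3, 1, 2])   # potential damage

R = (Sn * Sa * Ri) * Dp    # risk classification per unit
print(R)                   # higher values -> higher flood risk class
```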
Abstract:
Canalizing genes possess such broad regulatory power, and their action sweeps across such a wide swath of processes, that the full set of affected genes is not highly correlated under normal conditions. When not active, the controlling gene will not be predictable to any significant degree by its subject genes, either alone or in groups, since their behavior will be highly varied relative to the inactive controlling gene. When the controlling gene is active, its behavior is not well predicted by any one of its targets, but can be very well predicted by groups of genes under its control. To investigate this question, we introduce in this paper the concept of intrinsically multivariate predictive (IMP) genes, and present a mathematical study of IMP in the context of binary genes with respect to the coefficient of determination (CoD), which measures the predictive power of a set of genes with respect to a target gene. A set of predictor genes is said to be IMP for a target gene if all properly contained subsets of the predictor set are bad predictors of the target but the full predictor set predicts the target with great accuracy. We show that the logic of prediction, the predictive power, the covariance between predictors, and the entropy of the joint probability distribution of the predictors jointly affect the appearance of IMP genes. In particular, we show that high predictive power, small covariance among predictors, a large entropy of the joint probability distribution of the predictors, and certain logics, such as XOR in the 2-predictor case, are factors that favor the appearance of IMP. The IMP concept is applied to characterize the behavior of the gene DUSP1, which exhibits control over a central, process-integrating signaling pathway, thereby providing preliminary evidence that IMP can be used as a criterion for the discovery of canalizing genes.
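A small numerical sketch of the IMP pattern under the XOR logic mentioned above, taking the CoD in its usual form as the relative decrease in prediction error over the best constant predictor (an assumption of this sketch, since the abstract does not spell out the definition): each single predictor is useless on its own, while the pair predicts the target perfectly.

```python
import itertools
import numpy as np

def cod(X, y):
    """Coefficient of determination for binary prediction:
    CoD = (eps0 - eps) / eps0, where eps0 is the error of the best constant
    predictor of y and eps the error of the optimal predictor given X."""
    eps0 = min(np.mean(y == 0), np.mean(y == 1))
    eps = 0.0
    for pattern in set(map(tuple, X)):
        mask = np.all(X == pattern, axis=1)
        p = mask.mean()
        # optimal predictor: majority value of y within each predictor pattern
        eps += p * min(np.mean(y[mask] == 0), np.mean(y[mask] == 1))
    return (eps0 - eps) / eps0 if eps0 > 0 else 0.0

# Exhaustive truth table, each row equally likely: y = x1 XOR x2
X = np.array(list(itertools.product([0, 1], repeat=2)))
y = X[:, 0] ^ X[:, 1]

print(cod(X[:, [0]], y), cod(X[:, [1]], y))  # 0.0, 0.0 -> bad single predictors
print(cod(X, y))                             # 1.0       -> perfect pair (IMP)
```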
Abstract:
Multivariate affine term structure models have been increasingly used for pricing derivatives in fixed income markets. In these models, uncertainty about the term structure is driven by a state vector, while the short rate is an affine function of this vector. The model is characterized by a specific form for the stochastic differential equation (SDE) governing the evolution of the state vector, whose drift term is restricted so as to rule out arbitrages in the market. In this paper we solve the following inverse problem: suppose the term structure of interest rates is modeled by a linear combination of Legendre polynomials with random coefficients; is there any SDE for these coefficients which rules out arbitrages? This problem is of particular empirical interest because the Legendre model is an example of a factor model with a clear interpretation for each factor as regards movements of the term structure. Moreover, the affine structure of the Legendre model implies knowledge of its conditional characteristic function. From the econometric perspective, we propose arbitrage-free Legendre models to describe the evolution of the term structure. From the pricing perspective, we follow Duffie et al. (2000) in exploiting Legendre conditional characteristic functions to obtain a computationally tractable method for pricing fixed income derivatives. The closing empirical section presents precise evidence on the reward of implementing arbitrage-free parametric term structure models: the ability to obtain a good approximation of the state vector by simply using cross-sectional data.
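A minimal sketch of the cross-sectional step alluded to above: a term structure is approximated by a linear combination of Legendre polynomials in (rescaled) maturity via least squares, so the fitted coefficients play the role of the state vector. The yields and maturity scaling below are illustrative assumptions, and no arbitrage-free dynamics are imposed here.

```python
import numpy as np
from numpy.polynomial import legendre

# Illustrative cross-section of zero-coupon yields (maturity in years, yield in %)
maturities = np.array([0.5, 1, 2, 3, 5, 7, 10])
yields     = np.array([4.1, 4.3, 4.6, 4.8, 5.1, 5.2, 5.3])

# Map maturities to [-1, 1], the natural domain of Legendre polynomials
x = 2 * maturities / maturities.max() - 1

# Least-squares fit of a degree-3 Legendre expansion: the coefficients act as
# the state vector (level, slope, curvature, ...)
coeffs = legendre.legfit(x, yields, deg=3)
fitted = legendre.legval(x, coeffs)
print(coeffs.round(3))
print(np.abs(fitted - yields).max())   # cross-sectional fit error
```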
Abstract:
This study reports the photodegradation of 4-chlorophenol (4-CP) in aqueous solution by the photo-Fenton process under solar irradiation. The influence of solution path length and of Fe(NO3)3 and H2O2 concentrations on the degradation of 4-CP is evaluated by response surface methodology. The degradation process was monitored through the removal of total organic carbon (TOC) and the release of chloride ion. The results showed a very important role of iron concentration both for TOC removal and for dechlorination. On the other hand, a negative effect of increasing solution path length on mineralization was observed, which can be compensated by increasing the iron concentration. This permits an adjustment of the iron concentration according to the irradiation exposure area and path length (depth of a tank reactor). Under optimum conditions of 1.5 mM Fe(NO3)3, 20.0 mM H2O2 and 4.5 cm solution path length, 17 min of solar irradiation were sufficient to reduce a 72 mg C L-1 solution of 4-CP by 91%.
Abstract:
The effect of combining photocatalytic processes using TiO2 with the photo-Fenton reaction, using Fe3+ or ferrioxalate as a source of Fe2+, was investigated in the degradation of 4-chlorophenol (4CP) and dichloroacetic acid (DCA) under solar irradiation. Multivariate analysis was used to evaluate the role of three variables: iron, H2O2 and TiO2 concentrations. The results show that TiO2 plays a minor role compared to iron and H2O2 in the solar degradation of 4CP and DCA under the conditions studied. However, its presence can improve TOC removal when H2O2 is totally consumed. Iron and peroxide play major roles, especially when Fe(NO3)3 is used in the degradation of 4CP. No significant synergistic effect was observed by the addition of TiO2 in this process. On the other hand, synergistic effects were observed between FeOx and TiO2 and between H2O2 and TiO2 in the degradation of DCA.
Abstract:
In this article, we present a new control chart for monitoring the covariance matrix in a bivariate process. In this method, the n observations of the two variables are treated as if they came from a single variable (a sample of 2n observations), and a sample variance is calculated. This statistic is used to build a new control chart, referred to as the VMIX chart. The performance of the new control chart was compared with that of its main competitors: the generalized sample variance chart, the likelihood ratio test, Nagao's test, the probability integral transformation (v(t)), and the recently proposed VMAX chart. Among these, only the VMAX chart was competitive with the VMIX chart. For shifts in both variances, the VMIX chart outperformed VMAX; however, VMAX showed better performance for large shifts (higher than 10%) in one variance.
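A minimal sketch of one plausible reading of the pooled statistic described above: the n observations of each (standardized) variable are pooled into a single sample of 2n values and its sample variance is computed for charting. The exact VMIX definition, control limits, and the standardization step are assumptions here, since the abstract does not give them.

```python
import numpy as np

def pooled_sample_variance(x1, x2):
    """Pool the n observations of the two (standardized) variables into a
    single sample of 2n values and return its sample variance, as the
    abstract describes for the charted statistic."""
    pooled = np.concatenate([x1, x2])
    return np.var(pooled, ddof=1)

rng = np.random.default_rng(0)
n = 5
x1 = rng.normal(size=n)   # standardized sample of variable 1
x2 = rng.normal(size=n)   # standardized sample of variable 2
print(pooled_sample_variance(x1, x2))  # plotted over time against control limits
```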
Abstract:
By using a symbolic method, known in the literature as the classical umbral calculus, a symbolic representation of Lévy processes is given and a new family of time-space harmonic polynomials with respect to such processes, which includes and generalizes the exponential complete Bell polynomials, is introduced. The usefulness of time-space harmonic polynomials with respect to Lévy processes lies in the fact that the stochastic process obtained by replacing the indeterminate x of the polynomials with a Lévy process is a martingale, whereas the Lévy process itself does not necessarily have this property. Finding such polynomials can therefore be particularly meaningful for applications. This new family includes the Hermite polynomials, which are time-space harmonic with respect to Brownian motion; the Poisson-Charlier polynomials, with respect to Poisson processes; the Laguerre and actuarial polynomials, with respect to Gamma processes; the Meixner polynomials of the first kind, with respect to Pascal processes; and the Euler, Bernoulli, Krawtchuk, and pseudo-Narumi polynomials, with respect to suitable random walks. The role played by cumulants is stressed and brought to light, both in the symbolic representation of Lévy processes and their infinite divisibility property, and in the generalization, via the umbral Kailath-Segall formula, of the well-known formulae giving elementary symmetric polynomials in terms of power sum symmetric polynomials. The expression of the family of time-space harmonic polynomials introduced here has some connections with the so-called moment representation of various families of multivariate polynomials. This moment representation is studied here for the first time in connection with the time-space harmonic property with respect to suitable symbolic multivariate Lévy processes. In particular, multivariate Hermite polynomials and their properties are studied in connection with a symbolic version of the multivariate Brownian motion, while multivariate Bernoulli and Euler polynomials are represented as powers of multivariate polynomials which are time-space harmonic with respect to suitable multivariate Lévy processes.
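As a concrete instance of the time-space harmonic property for the Hermite case mentioned above (with the normalization assumed here), the polynomials H_n(x,t) generated by the exponential martingale of Brownian motion satisfy

\[
e^{zx - z^2 t/2} \;=\; \sum_{n \ge 0} H_n(x,t)\,\frac{z^n}{n!},
\qquad
\mathbb{E}\!\left[H_n(B_t, t) \mid \mathcal{F}_s\right] = H_n(B_s, s), \quad 0 \le s \le t,
\]

so that, for each n, the process H_n(B_t, t) is a martingale; for instance H_2(x,t) = x^2 - t recovers the familiar martingale B_t^2 - t.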
Abstract:
The Large Hadron Collider, located at the CERN laboratory in Geneva, is the largest particle accelerator in the world. One of the main research fields at the LHC is the study of the Higgs boson, the latest particle discovered by the ATLAS and CMS experiments. Because of the small production cross section of the Higgs boson, only substantial statistics offer the chance to study this particle's properties. To perform these searches it is essential to limit the contamination of the signal signature by the numerous and varied background processes produced in pp collisions at the LHC. Considerable attention is devoted to the study of multivariate methods which, compared to a standard cut-based analysis, can enhance the selection of a Higgs boson produced in association with a top quark pair in a dileptonic final state (ttH channel). The statistics collected up to 2012 are not sufficient to supply a significant number of ttH events; however, the methods applied in this thesis will provide a powerful tool for the increasing statistics that will be collected during the next LHC data taking.
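A generic sketch of the kind of multivariate method contrasted above with a cut-based selection: a boosted decision tree trained to separate signal from background and scored by ROC AUC. The features, event counts and parameter choices are toy placeholders, not the ttH analysis inputs.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Toy kinematic-like features: background vs. slightly shifted signal
background = rng.normal(0.0, 1.0, size=(n, 4))
signal     = rng.normal(0.4, 1.0, size=(n, 4))
X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_tr, y_tr)
scores = bdt.predict_proba(X_te)[:, 1]   # continuous discriminant, unlike a hard cut
print("ROC AUC:", roc_auc_score(y_te, scores))
```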