947 results for Mixture of non-central chi-square distributions
Abstract:
Exposure to mixtures of contaminants (environmental, dietary, or therapeutic) raises many questions and concerns about the likelihood of toxicokinetic and toxicodynamic interactions. Such co-exposure can influence the mode of action of the components of the cocktail, and hence their toxicity, through an increase in their internal concentrations. Bisphenol A (4,4'-dihydroxy-2,2-diphenylpropane) is a chemical contaminant ubiquitous in our environment, widely used in plastics manufacturing and produced in one of the largest volumes worldwide. It is a prototypical estrogen-mimicking endocrine disruptor. The molecule is biotransformed into non-toxic metabolites through glucuronidation. Concomitant exposure to several xenobiotics, notably co-exposure with drugs, can lower the glucuronidation rate of the chemical pollutant of interest. Since the consumption of therapeutic products is growing in the population, the possibility of simultaneous exposure is all the greater. Given that metabolic inhibition is the most plausible interaction mechanism that could lead to higher internal levels and to a modulation of the expected toxicity, the present study first aimed to confirm and characterize this type of metabolic interaction between bisphenol A and naproxen, a nonsteroidal anti-inflammatory drug (NSAID), at the level of an intact organ using the isolated perfused rat liver (IPRL) system. It then aimed to determine the enzymatic kinetics of each of the two substances, alone and in a binary mixture. In a second step, we also evaluated the influence of the presence of albumin on the metabolic kinetics and behaviour of the two substances studied, following the same rat liver perfusion model. The metabolic constants were determined by nonlinear regression. The metabolism of BPA and of NAP alone showed saturable kinetics, with a maximal velocity (Vmax) of 8.9 nmol/min/mg liver protein and an enzyme-substrate affinity constant (Km) of 51.6 μM for BPA, and 3 nmol/min/mg liver protein and 149.2 μM for NAP. Analysis of the combined exposures suggests a partial competitive inhibition of BPA metabolism by NAP, with a Ki estimated at 0.3542 μM. These results show that risk analysis for environmental pollutants should therefore take the consumption of pharmaceutical products into account as a factor that can increase internal levels for a given exposure. These data on metabolic interactions could be integrated into a physiologically based pharmacokinetic (PBPK) model to predict the toxicokinetic (TK) consequences of an individual's exposure to such chemical mixtures.
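For reference, the saturable kinetics and the competitive-inhibition constant reported above correspond to the standard Michaelis-Menten forms (a textbook relationship, not an equation quoted from the thesis; the "partial" character of the inhibition would add a further term):
\[
v \;=\; \frac{V_{\max}\,[S]}{K_m + [S]},
\qquad
v_{\text{inhibited}} \;=\; \frac{V_{\max}\,[S]}{K_m\left(1 + \frac{[I]}{K_i}\right) + [S]},
\]
where \([S]\) is the substrate (BPA or NAP) concentration, \([I]\) the inhibitor concentration, and \(V_{\max}\), \(K_m\), \(K_i\) the constants estimated by nonlinear regression.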
Abstract:
Spatial characterization of non-Gaussian attributes in earth sciences and engineering commonly requires the estimation of their conditional distribution. The indicator and probability kriging approaches of current nonparametric geostatistics provide approximations for estimating conditional distributions. They do not, however, provide results similar to those obtained from the cumbersome implementation of simultaneous cokriging of indicators. This paper presents a new formulation, termed successive cokriging of indicators, that avoids the classic simultaneous solution and its related computational problems while obtaining results equivalent to the impractical simultaneous solution of cokriging of indicators. A successive minimization of the estimation variance of the probability estimates is performed as additional data are successively included in the estimation process. In addition, the approach leads to an efficient nonparametric simulation algorithm for non-Gaussian random functions based on residual probabilities.
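As background for the estimator the paper builds on, a minimal sketch of the standard indicator kriging approximation of the conditional distribution (standard geostatistics notation, not taken from this paper) is:
\[
\hat F\bigl(z_k;\,\mathbf{u}\mid(n)\bigr)\;=\;\sum_{\alpha=1}^{n}\lambda_\alpha(z_k)\, i(\mathbf{u}_\alpha; z_k),
\qquad
i(\mathbf{u}_\alpha; z_k)=\begin{cases}1,& z(\mathbf{u}_\alpha)\le z_k\\ 0,&\text{otherwise,}\end{cases}
\]
with the weights \(\lambda_\alpha(z_k)\) obtained from the (co)kriging system for each threshold \(z_k\). The successive formulation updates this probability estimate recursively as each additional datum is included, rather than solving the full simultaneous cokriging system at once.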
Abstract:
A common problem encountered during the development of MS methods for the quantitation of small organic molecules by LC/MS is the formation of non-covalently bound species, or adducts, in the electrospray interface. Often the population of the molecular ion is insignificant compared to those of all other forms of the analyte produced in the electrospray, making it difficult to obtain the sensitivity required for accurate quantitation. We have investigated the effects of the following variables: orifice potential, nebulizer gas flow, temperature, solvent composition, and sample pH on the relative distributions of ions of the types MH+, MNa+, MNH4+, and 2MNa+, where M represents a small organic molecule: BAY 11-7082 ((E)-3-[4-methylphenylsulfonyl]-2-propenenitrile). Orifice potential, solvent composition, and sample pH had the greatest influence on the relative distributions of these ions, making these parameters the most useful for optimizing methods for the quantitation of small molecules.
Abstract:
We examine here a particular use of the French periphrasis aller + infinitive which, to our knowledge, has been treated in only one article (Lansari 2010). This "modalizing" use, which Lansari restricts to the formula 'on va dire', deserves closer study for several reasons. On the one hand, the use is described only on the basis of "twenty examples taken from the internet, from blogs or forums" (Lansari 2010: 120), even though, as Lansari herself acknowledges, it belongs to spoken language. It would therefore be useful to enrich the corpus, both quantitatively and qualitatively, and to include occurrences of authentic speech. On the other hand, Lansari restricts the modalizing use to the sequence 'on va dire'; one may ask whether sequences such as 'je vais dire' can fulfil the same discourse functions. In this article, we begin with a necessarily brief review of the literature. After presenting the corpus, we test the previously defended hypotheses against the assembled data: (a) The CFPP2000 corpus from the project Discours sur la ville. Corpus de Français Parlé Parisien des années 2000 (available online at http://cfpp2000.univ-paris3.fr/Corpus.html). CFPP2000 comprises 41 informants in 28 interviews (2198 min) and yielded 96 occurrences of modalizing on va dire. (b) The CLAPI corpus, comprising 45 hours of interactions searchable online at http://clapi.univ-lyon2.fr/analyse_requete_aide.php?menu=outils, in which 12 examples of modalizing on va dire were found. (c) A personal corpus of interviews (163 min) conducted during the 2009-10 academic year with five French Erasmus students, with the support of a grant from the Délégation Générale à la Langue Française et aux Langues de France (DGLFLF). The interviews with a research assistant, based on the following themes, were designed to elicit a variety of verb tenses: dream narratives (imparfait); biographical narratives (historical figure vs autobiography) (PC vs PS); narration of a film vs of a historical episode (PC/PRES vs PS); presentation of future plans vs conjectures (periphrastic or simple future). This corpus contains seventeen occurrences of on va dire, produced by two of the five informants: 15 by A. and 2 by J. Our analysis is therefore based on 125 spoken occurrences of 'on va dire'.
Abstract:
This thesis applies a hierarchical latent trait model system to a large quantity of data. The motivation for it was the lack of viable approaches for analysing High Throughput Screening datasets, which may include thousands of data points with high dimensions. High Throughput Screening (HTS) is an important tool in the pharmaceutical industry for discovering leads which can be optimised and further developed into candidate drugs. Since the development of new robotic technologies, the ability to test the activities of compounds has increased considerably in recent years. Traditional methods, looking at tables and graphical plots to analyse relationships between measured activities and the structure of compounds, are not feasible when facing a large HTS dataset. Instead, data visualisation provides a method for analysing such large datasets, especially those with high dimensions. So far, a few visualisation techniques for drug design have been developed, but most of them cope with only a few properties of compounds at a time. We believe that a latent variable model with a non-linear mapping from the latent space to the data space is a preferred choice for visualising a complex high-dimensional data set. As a type of latent variable model, the latent trait model (LTM) can deal with either continuous or discrete data, which makes it particularly useful in this domain. In addition, with the aid of differential geometry, we can visualise the distribution of the data from magnification factor and curvature plots. Rather than obtaining useful information from just a single plot, a hierarchical LTM arranges a set of LTMs and their corresponding plots in a tree structure. We model the whole data set with an LTM at the top level, which is broken down into clusters at deeper levels of the hierarchy. In this manner, refined visualisation plots can be displayed at deeper levels and sub-clusters may be found. The hierarchy of LTMs is trained using the expectation-maximisation (EM) algorithm to maximise its likelihood with respect to the data sample. Training proceeds interactively in a recursive, top-down fashion: the user subjectively identifies interesting regions on the visualisation plot that they would like to model in greater detail. At each stage of hierarchical LTM construction, the EM algorithm alternates between the E-step and the M-step. Another problem that can occur when visualising a large data set is that there may be significant overlaps of data clusters, making it very difficult for the user to judge where the centres of regions of interest should be placed. We address this problem by employing the minimum message length technique, which can help the user to decide the optimal structure of the model. In this thesis we also demonstrate the applicability of the hierarchy of latent trait models in the field of document data mining.
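To make the E-step/M-step alternation mentioned above concrete, the following is a minimal, self-contained Python sketch of EM for a simple Gaussian mixture; it illustrates the general algorithm only, not the thesis's latent trait model, and the component count and parameter choices are hypothetical.

import numpy as np

def em_gaussian_mixture(X, K=3, n_iter=50, seed=0):
    """Minimal EM for a spherical Gaussian mixture; illustrates the E-/M-step loop."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, K, replace=False)]   # initialise centres from data points
    var = np.full(K, X.var())                    # one (spherical) variance per component
    weights = np.full(K, 1.0 / K)                # mixing proportions
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = p(component k | point i)
        sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        log_p = -0.5 * (sq / var + d * np.log(2 * np.pi * var)) + np.log(weights)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        Nk = r.sum(axis=0)
        means = (r.T @ X) / Nk[:, None]
        sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        var = (r * sq).sum(axis=0) / (d * Nk)
        weights = Nk / n
    return weights, means, var

In a hierarchical scheme, a loop of this kind would be rerun on the user-selected sub-regions at each deeper level of the tree.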
Abstract:
It has been postulated that immunogenicity results from the overall dissimilarity of pathogenic proteins to the host proteome. We sought to use this concept to discriminate between antigens and non-antigens of bacterial origin. Sets of 100 known antigenic and non-antigenic peptide sequences from bacteria were compared to the human and mouse proteomes. Both antigenic and non-antigenic sequences lacked human or mouse homologues. The observed distributions were compared using the non-parametric Mann-Whitney test. The statistical null hypothesis was accepted, indicating that antigens and non-antigens did not differ significantly. Likewise, we were unable to determine a threshold able to meaningfully separate antigens from non-antigens. Thus, antigens cannot be predicted from pathogen genomes based solely on their dissimilarity to the human genome.
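For illustration, a minimal sketch of the kind of distributional comparison described above, using made-up host-similarity scores and scipy's Mann-Whitney implementation (not necessarily the authors' own tooling):

import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# Hypothetical host-similarity scores for antigenic and non-antigenic sequence sets.
antigen_scores = rng.normal(loc=0.30, scale=0.10, size=100)
non_antigen_scores = rng.normal(loc=0.31, scale=0.10, size=100)

stat, p = mannwhitneyu(antigen_scores, non_antigen_scores, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")  # a large p-value means the two distributions do not differ significantly

Accepting the null hypothesis here corresponds to the paper's finding that antigen and non-antigen similarity distributions are statistically indistinguishable.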
Abstract:
Immunogenicity arises via many synergistic mechanisms, yet the overall dissimilarity of pathogenic proteins to the host proteome has been proposed as a key arbiter. We have previously explored this concept in relation to bacterial antigens; here we extend our analysis to antigens of viral and fungal origin. Sets of known viral and fungal antigenic and non-antigenic protein sequences were compared to the human and mouse proteomes. Both antigenic and non-antigenic sequences lacked human or mouse homologues. The observed distributions were compared using the non-parametric Mann-Whitney test. The statistical null hypothesis was accepted, indicating that antigens and non-antigens did not differ significantly. Likewise, we could not determine a threshold able to meaningfully separate non-antigens from antigens. We conclude that viral and fungal antigens cannot be predicted from pathogen genomes based solely on their dissimilarity to mammalian genomes.
Abstract:
Purpose - To develop a non-invasive method for quantification of blood and pigment distributions across the posterior pole of the fundus from multispectral images, using a computer-generated reflectance model of the fundus. Methods - A computer model was developed to simulate light interaction with the fundus at different wavelengths. The distribution of macular pigment (MP) and retinal haemoglobins in the fundus was obtained by comparing the model predictions with multispectral image data at each pixel. Fundus images were acquired from 16 healthy subjects from various ethnic backgrounds, and parametric maps showing the distribution of MP and of retinal haemoglobins throughout the posterior pole were computed. Results - The relative distributions of MP and retinal haemoglobins in the subjects were successfully derived from multispectral images acquired at wavelengths of 507, 525, 552, 585, 596, and 611 nm, provided certain conditions were met and eye movement between exposures was minimal. Recovery of other fundus pigments was not feasible, and further development of the imaging technique and refinement of the software are necessary to understand the full potential of multispectral retinal image analysis. Conclusion - The distributions of MP and retinal haemoglobins obtained in this preliminary investigation are in good agreement with published data on normal subjects. The ongoing development of the imaging system should allow absolute parameter values to be computed. A further study will investigate subjects with known pathologies to determine the effectiveness of the method as a screening and diagnostic tool.
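The per-pixel comparison of model predictions with multispectral data amounts to inverting a forward reflectance model at each pixel. A minimal sketch of such an inversion follows; the forward model here is a purely illustrative placeholder, not the authors' fundus reflectance model, and only the six wavelengths are taken from the abstract.

import numpy as np
from scipy.optimize import least_squares

WAVELENGTHS = np.array([507, 525, 552, 585, 596, 611])  # nm, as listed in the abstract

def forward_model(params, wavelengths):
    """Placeholder forward model: predicted reflectance from two pigment parameters.
    In practice this would be the computer-generated fundus reflectance model."""
    mp, haem = params
    # Purely illustrative spectral shapes for macular pigment and haemoglobin absorption.
    return np.exp(-mp * np.exp(-((wavelengths - 460) / 60.0) ** 2)
                  - haem * np.exp(-((wavelengths - 560) / 40.0) ** 2))

def invert_pixel(measured_reflectance):
    """Estimate (macular pigment, haemoglobin) parameters for one pixel by least squares."""
    res = least_squares(
        lambda p: forward_model(p, WAVELENGTHS) - measured_reflectance,
        x0=[0.1, 0.1], bounds=([0, 0], [5, 5]))
    return res.x

# Example: recover parameters from a synthetic, noise-free measurement.
print(invert_pixel(forward_model([0.8, 1.2], WAVELENGTHS)))

Repeating the inversion over every pixel yields parametric maps of the kind described in the abstract.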
Abstract:
Dependence in the world of uncertainty is a complex concept. It exists, however; it is asymmetric, has magnitude and direction, and can be measured. We use some measures of dependence between random events to illustrate how to apply them in the study of dependence between non-numeric bivariate variables and numeric random variables. Graphics show the inner dependence structure of the Clayton Archimedean copula and the bivariate Poisson distribution. We know this approach is valid for studying the local dependence structure of any pair of random variables determined by its empirical or theoretical distribution. It can also be used to simulate dependent events and dependent random variables, although some restrictions apply. ACM Computing Classification System (1998): G.3, J.2.
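For instance, dependent uniform pairs with the Clayton dependence structure mentioned above can be simulated by conditional inversion; a minimal Python sketch using the standard formula for the bivariate Clayton copula with parameter theta > 0 (not code from the paper):

import numpy as np

def sample_clayton(n, theta, seed=None):
    """Conditional-inversion sampler for the bivariate Clayton copula (theta > 0)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    # Invert the conditional distribution C(v | u) = w to obtain the second coordinate.
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

u, v = sample_clayton(5000, theta=2.0, seed=1)

Plotting (u, v) for a few thousand samples makes the lower-tail clustering characteristic of the Clayton copula visible, which is the kind of local dependence structure the graphics in the paper display.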
Abstract:
Since the seminal works of Markowitz (1952), Sharpe (1964), and Lintner (1965), numerous studies on portfolio selection and performance measurement have been based on the mean-variance framework. However, several researchers (e.g., Arditti (1967, 1971), Samuelson (1970), and Rubinstein (1973)) argue that higher moments cannot be neglected unless there is reason to believe that: (i) the asset returns are normally distributed and the investor's utility function is quadratic, or (ii) the empirical evidence demonstrates that higher moments are irrelevant to the investor's decision. Based on the same argument, this dissertation investigates the impact of higher moments of return distributions on three issues concerning 14 international stock markets. First, portfolio selection with skewness is determined using Polynomial Goal Programming, in which investor preferences for skewness can be incorporated. The empirical findings suggest that the return distributions of international stock markets are not normally distributed, and that incorporating skewness into an investor's portfolio decision causes a major change in the construction of the optimal portfolio. The evidence also indicates that an investor will trade expected return of the portfolio for skewness. Moreover, when short sales are allowed, investors are better off as they attain higher expected return and skewness simultaneously. Second, the performance of international stock markets is evaluated using two types of performance measures: (i) the two-moment performance measures of Sharpe (1966) and Treynor (1965), and (ii) the higher-moment performance measures of Prakash and Bear (1986) and Stephens and Proffitt (1991). The empirical evidence indicates that higher moments of return distributions are significant and relevant to the investor's decision; thus, higher-moment performance measures are more appropriate for evaluating the performance of international stock markets. The evidence also indicates that the various measures provide vastly different performance rankings of the markets, albeit in the same direction. Finally, the inter-temporal stability of the international stock markets is investigated using the Parhizgari and Prakash (1989) algorithm for the Sen and Puri (1968) test, which accounts for non-normality of return distributions. The empirical findings provide strong evidence supporting the stability of international stock market movements. However, when the Anderson test, which assumes normality of return distributions, is employed, stability in the correlation structure is rejected. This suggests that non-normality of the return distribution is an important factor that cannot be ignored when investigating the inter-temporal stability of international stock markets.
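As context for the mean-variance-skewness selection step, a commonly used Polynomial Goal Programming formulation in this literature is sketched below (a standard form; the dissertation's exact specification may differ):
\[
\begin{aligned}
\min_{\mathbf{x},\,d_1,\,d_3}\quad & (1+d_1)^{p} + (1+d_3)^{q} \\
\text{s.t.}\quad & \mathbf{x}^{\top}\bar{\mathbf{R}} + d_1 = Z_1^{*}, \\
& E\!\left[\bigl(\mathbf{x}^{\top}(\mathbf{R}-\bar{\mathbf{R}})\bigr)^{3}\right] + d_3 = Z_3^{*}, \\
& \mathbf{x}^{\top}\mathbf{V}\mathbf{x} = 1,\qquad d_1,\, d_3 \ge 0,
\end{aligned}
\]
where \(Z_1^{*}\) and \(Z_3^{*}\) are the separately maximised expected return and skewness over the unit-variance portfolio set, \(d_1, d_3\) are the goal deviations, and the exponents \(p, q\) encode the investor's relative preference for expected return versus skewness.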
Abstract:
A comprehensive investigation of sensitive ecosystems in South Florida, with the main goal of determining the identity, spatial distribution, and sources of both organic biocides and trace elements in different environmental compartments, is reported. This study presents the development and validation of a method for the fractionation and isolation of twelve polar acidic herbicides commonly applied in the vicinity of the study areas, including 2,4-D, MCPA, dichlorprop, mecoprop, and picloram, in surface water. Solid phase extraction (SPE) was used to isolate the analytes from abiotic matrices containing large amounts of dissolved organic material. Atmospheric-pressure ionization (API) with electrospray ionization in negative mode (ESI-) in a quadrupole ion trap mass spectrometer was used to characterize the herbicides of interest. The application of laser ablation ICP-MS (LA-ICP-MS) to the analysis of soils and sediments is also reported. The analytical performance of the method was evaluated on certified standards and real soil and sediment samples. Residential soils were analyzed to evaluate the feasibility of using this powerful technique as a routine, rapid method for monitoring potentially contaminated sites. Forty-eight sediments were also collected from semi-pristine areas in South Florida to screen baseline levels of bioavailable elements in support of risk evaluation. The LA-ICP-MS data were used to perform a statistical evaluation of the elemental composition as a tool for environmental forensics. A LA-ICP-MS protocol was also developed and optimized for the elemental analysis of a wide range of elements in polymeric filters containing atmospheric dust. A quantitative strategy based on internal and external standards allowed rapid determination of airborne trace elements in filters containing both contemporary African dust and local dust emissions. These distributions were used to qualitatively and quantitatively assess differences in composition and to establish provenance and fluxes to protected regional ecosystems such as coral reefs and national parks.
Abstract:
Tropical coastal marine ecosystems, including mangroves, seagrass beds, and coral reef communities, are undergoing intense degradation in response to natural and human disturbances; understanding the causes and mechanisms therefore presents challenges for scientists and managers. In order to protect our marine resources, determining the effects of nutrient loads on these coastal systems has become a key management goal. Data from monitoring programs were used to detect trends in macroalgae abundances, develop correlations with nutrient availability, and forecast potential responses of the communities monitored. Using eight years of data (1996–2003) from complementary but independent monitoring programs of seagrass beds and water quality in the Florida Keys National Marine Sanctuary (FKNMS), we: (1) described the distribution and abundance of macroalgae groups; (2) analyzed the status and spatiotemporal trends of macroalgae groups; and (3) explored the connection between water quality and macroalgae distribution in the FKNMS. In the seagrass beds of the FKNMS, calcareous green algae were the dominant macroalgae group, followed by the red group; brown and calcareous red algae were present but in lower abundance. Spatiotemporal patterns of the macroalgae groups were analyzed with a non-linear regression model of the abundance data. For the period of record, all macroalgae groups increased in abundance (Abi) at most sites, with calcareous green algae increasing the most. Calcareous green algae and red algae exhibited a seasonal pattern, with peak abundances (Φi) mainly in summer for calcareous green and mainly in winter for red. Macroalgae Abi and long-term trend (mi) were correlated in distinctive ways with water quality parameters. Both the Abi and mi of calcareous green algae had positive correlations with NO3−, NO2−, total nitrogen (TN), and total organic carbon (TOC). Red algae Abi had a positive correlation with NO2−, TN, total phosphorus, and TOC, and the mi of red algae was positively correlated with N:P. In contrast, brown and calcareous red algae Abi had negative correlations with N:P. These results suggest that calcareous green algae and red algae are responding mainly to increases in N availability, a process that is occurring at inshore sites. A combination of spatially variable factors such as local current patterns, nutrient sources, and habitat characteristics results in a complex array of the macroalgae community in the seagrass beds of the FKNMS.
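The non-linear regression described above combines a long-term trend with a seasonal cycle. A minimal Python sketch of fitting such a model is shown below; Ab, m, and the seasonal peak timing correspond loosely to the Abi, mi, and Φi quantities named in the abstract, but the functional form and data here are assumptions for illustration, not the model actually used.

import numpy as np
from scipy.optimize import curve_fit

def abundance_model(t, Ab, m, amp, phi):
    """Generic seasonal-plus-trend model for macroalgae abundance.
    t: time in years; Ab: mean abundance; m: long-term slope;
    amp: seasonal amplitude; phi: timing of the seasonal peak (years)."""
    return Ab + m * t + amp * np.cos(2 * np.pi * (t - phi))

# Example: fit to a hypothetical quarterly series spanning eight years.
t = np.arange(0, 8, 0.25)
rng = np.random.default_rng(1)
y = abundance_model(t, Ab=2.0, m=0.15, amp=0.5, phi=0.6) + rng.normal(0, 0.2, t.size)
params, cov = curve_fit(abundance_model, t, y, p0=[1.0, 0.0, 0.3, 0.0])
print(dict(zip(["Ab", "m", "amp", "phi"], np.round(params, 3))))

Fitting a model of this kind site by site yields the per-site abundance, trend, and peak-timing parameters whose correlations with water quality are reported in the abstract.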
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.