997 results for Basic hypergeometric functions
Abstract:
Manufactured nanoparticles are introduced into industrial processes, but they are suspected to cause negative health effects similar to those of ambient particles. Poor knowledge about the scale of this introduction has so far prevented a global risk analysis. In 2006, a targeted telephone survey among Swiss companies (1) showed the usage of nanoparticles in a few selected companies but did not provide data to extrapolate to the totality of the Swiss workforce. To gain this kind of information, a layered representative questionnaire survey among 1'626 Swiss companies was conducted in 2007. Data were collected on the number of potentially exposed persons in the companies and on their protection strategies. The response rate was 58.3%. The study estimated that 586 companies (95% confidence interval: 145 to 1'027) use nanoparticles in Switzerland. An estimated 1'309 workers (1'073 to 1'545) do their job in the same room as a nanoparticle application. Personal protection was shown to be the predominant type of protection measure. Companies starting production with nanomaterials need to consider incorporating protection measures into their plans. This will not only benefit workers' health, but is also likely to increase the competitiveness of the companies. Technical and organisational protection measures are not only more cost-effective in the long term, but are also easier to control. Guidelines may have to be designed specifically for different industrial applications, including fields outside nanotechnology, and adapted to all sizes of companies.
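The point estimate and confidence interval above come from the study itself. For readers who want to see the arithmetic behind such a figure, the following Python snippet is only a minimal sketch of a stratified ("layered") estimate of a population total with a 95% normal-approximation confidence interval; the per-stratum population sizes and counts are invented for illustration (only the overall survey size of 1'626 respondents follows the study, and the paper's actual estimator may differ).

import math

# Hypothetical stratified survey data: one entry per company-size stratum.
strata = [
    # (companies in population, companies responding, responders using nanoparticles)
    (30_000, 900, 12),   # small companies (invented numbers)
    (8_000, 500, 9),     # medium companies (invented numbers)
    (2_000, 226, 7),     # large companies (invented numbers)
]

estimate = 0.0
variance = 0.0
for n_pop, n_resp, n_users in strata:
    p = n_users / n_resp                  # estimated usage share in the stratum
    estimate += n_pop * p                 # extrapolated number of users in the stratum
    # normal-approximation variance of the stratum total,
    # with finite-population correction
    variance += n_pop**2 * p * (1 - p) / n_resp * (1 - n_resp / n_pop)

half_width = 1.96 * math.sqrt(variance)   # 95% confidence half-width
print(f"estimated companies using nanoparticles: {estimate:.0f}")
print(f"95% CI: {estimate - half_width:.0f} to {estimate + half_width:.0f}")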
Abstract:
Machine Learning for geospatial data: algorithms, software tools and case studies
The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for the purposes of environmental data mining including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with that of geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis.
In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to detect the presence of spatial patterns, at least those describable by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed during the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for carrying out fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessments and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user friendly and easy to use.
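The GRNN singled out above is, in essence, Nadaraya-Watson kernel regression: a prediction at an unsampled location is a Gaussian-kernel-weighted average of the observed values. The following Python snippet is only a minimal sketch of that idea, not the Machine Learning Office implementation; the coordinates, measured values and bandwidth are hypothetical.

import numpy as np

def grnn_predict(coords, values, query, sigma):
    """GRNN (Nadaraya-Watson) prediction: Gaussian-kernel-weighted mean."""
    d2 = np.sum((coords - query) ** 2, axis=1)   # squared distances to the query point
    w = np.exp(-d2 / (2.0 * sigma ** 2))         # Gaussian kernel weights
    return np.sum(w * values) / np.sum(w)        # weighted mean of observations

# Hypothetical monitoring data: (x, y) locations and measured concentrations.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([2.0, 3.5, 1.0, 4.0])

# Predict at the centre of the square.
print(grnn_predict(coords, values, np.array([0.5, 0.5]), sigma=0.7))

Part of the GRNN's appeal for automatic mapping is that the bandwidth sigma is essentially its only free parameter, so it can be tuned automatically, for example by cross-validation.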
Abstract:
The Iowa Department of Education surveyed Iowa's 15 community colleges to gain information about each institution's basic skill assessment requirements for placement into courses and programs. The survey asked which basic skill assessment(s) each institution uses, whether developmental course placement was mandatory, and what scores students needed to obtain to avoid being required or urged to take developmental courses in math, science, and reading. Additionally, staff members at each college were asked what the testing requirements are for students enrolled full time in high school who are taking community college classes.
Abstract:
We experimentally question the assertion of prospect theory that people display risk attraction in choices involving high-probability losses. Indeed, our experimental participants tend to avoid fair risks for large (up to €90), high-probability (80%) losses. Our research hinges on a novel experimental method designed to alleviate the house-money bias that pervades experiments with real (not hypothetical) losses. Our results vindicate Daniel Bernoulli's view that risk aversion is the dominant attitude. But, contrary to the Bernoulli-inspired canonical expected utility theory, we do find frequent risk attraction for small amounts of money at stake. In any event, we attempt neither to test expected utility versus non-expected utility theories, nor to contribute to the important literature that estimates value and weighting functions. The question that we ask is more basic, namely: do people display risk aversion when facing large losses, or large gains? And, at the risk of oversimplifying, our answer is yes.
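To make the notion of a "fair risk" over a high-probability loss concrete, here is a worked illustration using the abstract's own 80% and €90 figures (the €72 certain loss is simply the implied expectation, not a number from the paper). The gamble L loses €90 with probability 0.8 and nothing with probability 0.2, so

\[
  \mathbb{E}[L] \;=\; 0.8 \times (-90) \;+\; 0.2 \times 0 \;=\; -72 .
\]

A fair risk pairs this gamble with a certain loss of €72. Prospect theory predicts risk attraction, i.e. choosing the gamble; the participants described above tend instead to accept the certain €72 loss, i.e. they display risk aversion.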
Abstract:
[eng] In this paper we claim that capital is as important in the production of ideas as in the production of final goods. Hence, we introduce capital in the production of knowledge and discuss the associated problems arising from the public good nature of knowledge. We show that although population growth can affect economic growth, it is not necessary for growth to arise. We derive both the social planner and the decentralized economy growth rates and show the optimal subsidy that decentralizes the social optimum. We also show numerically that the effects of population growth on the market growth rate, the optimal growth rate and the optimal subsidy are small. In addition, we find that physical capital is more important for the production of knowledge than for the production of goods.
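The abstract does not state the model's functional forms, so the following is only a stylized sketch, assuming Cobb-Douglas technologies and generic symbols (not the paper's notation), of what "capital in the production of ideas" can look like:

\[
  Y = K_Y^{\alpha} (A L_Y)^{1-\alpha}, \qquad
  \dot{A} = \delta\, K_A^{\beta} (A L_A)^{1-\beta}, \qquad
  K_Y + K_A = K, \quad L_Y + L_A = L .
\]

In such a setting, the finding that physical capital matters more for the production of knowledge than for the production of goods corresponds to a larger capital share in the ideas sector, \(\beta > \alpha\).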
Abstract:
Low pressure partial melting of basanitic and ankaramitic dykes gave rise to unusual, zebra-like migmatites in the contact aureole of a layered pyroxenite-gabbro intrusion, in the root zone of an ocean island (Basal Complex, Fuerteventura, Canary Islands). These migmatites are characterised by a dense network of closely spaced, millimetre-wide leucocratic segregations. Their mineralogy consists of plagioclase (An(32-36)), diopside, biotite, oxides (magnetite, ilmenite), ± amphibole, dominated by plagioclase in the leucosome and diopside in the melanosome. The melanosome is almost completely recrystallised, with the preservation of large, relict igneous diopside phenocrysts in dyke centres. Comparison of whole-rock and mineral major- and trace-element data allowed us to assess the redistribution of elements between different mineral phases and generations during contact metamorphism and partial melting. Dykes within and outside the thermal aureole behaved like closed chemical systems. Nevertheless, Zr, Hf, Y and REEs were internally redistributed, as deduced by comparing the trace element contents of the various diopside generations. Neocrystallised diopside from the migmatite zone (in the melanosome, in the leucosome and as epitaxial phenocryst rims) is enriched in Zr, Hf, Y and REEs compared to relict phenocrysts. This has been assigned to the liberation of trace elements by the breakdown of enriched primary minerals, kaersutite and sphene, on entering the thermal aureole. Major and trace element compositions of minerals in migmatite melanosomes and leucosomes are almost identical, pointing to syn- or post-solidus re-equilibration during cooling of the migmatite terrain, i.e., mineral-melt equilibria were reset to mineral-mineral equilibria.
Abstract:
Linear spaces consisting of σ-finite probability measures and infinite measures (improper priors and likelihood functions) are defined. The commutative group operation, called perturbation, is the updating given by Bayes' theorem; the inverse operation is the Radon-Nikodym derivative. Bayes spaces of measures are sets of classes of proportional measures. In this framework, basic notions of mathematical statistics get a simple algebraic interpretation. For example, exponential families appear as affine subspaces with their sufficient statistics as a basis. Bayesian statistics, in particular some well-known properties of conjugate priors and likelihood functions, are revisited and slightly extended.
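As a small illustration of the group structure described above (using generic symbols rather than the paper's notation): for densities f and g identified up to proportionality, perturbation is pointwise multiplication, which is exactly the Bayes update of a prior by a likelihood,

\[
  (f \oplus g)(x) \;\propto\; f(x)\, g(x), \qquad
  \pi(\theta \mid y) \;=\; \pi(\theta) \oplus L(\theta; y),
\]

while the inverse operation divides densities, i.e. it is the Radon-Nikodym derivative up to proportionality,

\[
  (f \ominus g)(x) \;\propto\; \frac{f(x)}{g(x)} \;=\; \frac{\mathrm{d}\mu_f}{\mathrm{d}\mu_g}(x) .
\]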
Abstract:
Physicians use clinical scores in their daily practice. These scores generally serve as aids to medical decision-making. The steps required to validate a clinical score, however, are often unknown to physicians. This review covers the theoretical basis of the validation of a clinical score and proposes practical exercises.
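The review itself contains the exercises; as a flavour of what validating a score involves, here is a minimal hypothetical sketch in Python computing two standard validation metrics, sensitivity and specificity, for an invented score cutoff (all data and the threshold are made up for illustration):

# Hypothetical data: clinical score values and observed outcomes (1 = disease).
scores   = [2, 7, 5, 9, 1, 8, 3, 6, 4, 8]
outcomes = [0, 1, 0, 1, 0, 1, 0, 1, 0, 0]
CUTOFF = 6  # invented threshold: score >= CUTOFF counts as "test positive"

tp = sum(1 for s, o in zip(scores, outcomes) if s >= CUTOFF and o == 1)
fn = sum(1 for s, o in zip(scores, outcomes) if s <  CUTOFF and o == 1)
tn = sum(1 for s, o in zip(scores, outcomes) if s <  CUTOFF and o == 0)
fp = sum(1 for s, o in zip(scores, outcomes) if s >= CUTOFF and o == 0)

print(f"sensitivity = {tp / (tp + fn):.2f}")  # true positives / all diseased
print(f"specificity = {tn / (tn + fp):.2f}")  # true negatives / all healthy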
Abstract:
The three peroxisome proliferator-activated receptors (PPAR alpha, PPAR beta, and PPAR gamma) are ligand-activated transcription factors belonging to the nuclear hormone receptor superfamily. They are regarded as sensors of physiological levels of fatty acids and fatty acid derivatives. In the adult mouse skin, they are found in hair follicle keratinocytes but not in interfollicular epidermis keratinocytes. Skin injury stimulates the expression of PPAR alpha and PPAR beta at the site of the wound. Here, we review the spatiotemporal program that triggers PPAR beta expression immediately after an injury and then gradually represses it during epithelial repair. The opposing effects of the tumor necrosis factor-alpha and transforming growth factor-beta-1 signalling pathways on the activity of the PPAR beta promoter are the key elements of this regulation. We then compare the involvement of PPAR beta in the skin in response to an injury and during hair morphogenesis, and underscore the similarity of its action on cell survival in both situations.
Abstract:
Calcineurin signaling plays diverse roles in fungi in regulating stress responses, morphogenesis and pathogenesis. Although calcineurin signaling is conserved among fungi, recent studies indicate important divergences in calcineurin-dependent cellular functions among different human fungal pathogens. Fungal pathogens utilize the calcineurin pathway to effectively survive the host environment and cause life-threatening infections. The immunosuppressive calcineurin inhibitors (FK506 and cyclosporine A) are active against fungi, making targeting calcineurin a promising antifungal drug development strategy. Here we summarize current knowledge on calcineurin in yeasts and filamentous fungi, and review the importance of understanding fungal-specific attributes of calcineurin to decipher fungal pathogenesis and develop novel antifungal therapeutic approaches.
Abstract:
A chronic inflammatory microenvironment favors tumor progression through molecular mechanisms that are still incompletely defined. In inflammation-induced skin cancers, IL-1 receptor- or caspase-1-deficient mice, or mice specifically deficient for the inflammasome adaptor protein ASC (apoptosis-associated speck-like protein containing a CARD) in myeloid cells, had reduced tumor incidence, pointing to a role for IL-1 signaling and inflammasome activation in tumor development. However, mice fully deficient for ASC were not protected, and mice specifically deficient for ASC in keratinocytes developed more tumors than controls, suggesting that, in contrast to its proinflammatory role in myeloid cells, ASC acts as a tumor suppressor in keratinocytes. Accordingly, ASC protein expression was lost in human cutaneous squamous cell carcinoma, but not in psoriatic skin lesions. Stimulation of primary mouse keratinocytes or the human keratinocyte cell line HaCaT with UVB induced an ASC-dependent phosphorylation of p53 and expression of p53 target genes. In HaCaT cells, ASC interacted with p53 at the endogenous level upon UVB irradiation. Thus, ASC in different tissues may influence tumor growth in opposite directions: it has a proinflammatory role in infiltrating cells that favors tumor development, but it also limits keratinocyte proliferation in response to noxious stimuli, possibly through p53 activation, which helps suppress tumors.
Abstract:
While there is evidence that the two ubiquitously expressed thyroid hormone (T3) receptors, TRalpha1 and TRbeta1, have distinct functional specificities, the mechanism by which they discriminate potential target genes remains largely unexplained. In this study, we demonstrate that the thyroid hormone response elements (TRE) from the malic enzyme and myelin basic protein genes (METRE and MBPTRE, respectively) are not functionally equivalent. The METRE, which is a direct repeat motif with a 4-base pair gap between the two half-site hexamers, binds thyroid hormone receptor as a heterodimer with 9-cis-retinoic acid receptor (RXR) and mediates a high T3-dependent activation in response to TRalpha1 or TRbeta1 in NIH3T3 cells. In contrast, the MBPTRE, which consists of an inverted palindrome formed by two hexamers spaced by 6 base pairs, confers an efficient transactivation by TRbeta1 but a poor transactivation by TRalpha1. While both receptors form heterodimers with RXR on MBPTRE, the poor transactivation by TRalpha1 also correlates with its ability to bind efficiently as a monomer. This monomer, which is only observed with TRalpha1 bound to MBPTRE, interacts neither with N-CoR nor with SRC-1, explaining its functional inefficacy. However, in Xenopus oocytes, in which RXR proteins are not detectable, the transactivation mediated by TRalpha1 and TRbeta1 is equivalent and independent of an RXR supply, raising the question of the identity of the thyroid hormone receptor partner in these cells. Thus, in mammalian cells, the binding characteristics of TRalpha1 on MBPTRE (i.e., high monomer binding efficiency and low transactivation activity) might explain the particular pattern of T3 responsiveness of MBP gene expression during central nervous system development.