942 results for categorical and mix datasets


Relevance: 30.00%

Abstract:

Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected from two operational C-band radars in the northern Italian region of Emilia Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. Thereafter, a comparison between different techniques and datasets used to retrieve the vertical profile of the refractive index gradient in the boundary layer is presented, focusing on their capability to detect anomalous propagation conditions. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High resolution radiosounding data are identified as the best available dataset to reproduce accurately the local propagation conditions, while lower resolution standard TEMP data suffer from interpolation degradation and Numerical Weather Prediction model data (Lokal Model) can capture a tendency towards superrefraction but cannot detect ducting conditions. From the ray tracing of the centre and of the lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
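
The effect of the refractivity gradient on the beam trajectory lends itself to a small numerical illustration. The following is a minimal sketch, assuming the standard effective-Earth-radius approximation rather than the multilayer ray-tracing model used in the study; the elevation angle, ranges and gradient values are illustrative only.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def beam_height_km(range_km, elev_deg, dN_dh_per_km=-39.0, antenna_height_km=0.0):
    """Beam centre height above ground using the effective-Earth-radius model.

    dN_dh_per_km: vertical refractivity gradient in N-units per km
    (about -39 N/km for a standard atmosphere, below -157 N/km for ducting).
    """
    # Effective Earth radius factor k = 1 / (1 + a * dN/dh * 1e-6), with a in km
    k = 1.0 / (1.0 + EARTH_RADIUS_KM * dN_dh_per_km * 1e-6)
    re = k * EARTH_RADIUS_KM  # effective Earth radius
    theta = np.radians(elev_deg)
    # Standard beam-height equation for the effective-Earth-radius (4/3 Earth) model
    h = np.sqrt(range_km**2 + re**2 + 2.0 * range_km * re * np.sin(theta)) - re
    return h + antenna_height_km

rng = np.array([25.0, 50.0, 100.0])          # ranges in km
print(beam_height_km(rng, 0.5))              # standard atmosphere (k ~ 4/3)
print(beam_height_km(rng, 0.5, -120.0))      # superrefractive layer: beam stays lower
```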

Relevance: 30.00%

Abstract:

This Ph.D. dissertation studies the work motivation of employees in the delivery of public services. The questioning of work motivation in public services is not new, but it has become central for governments now facing unprecedented public debts. The objective of this research is twofold. First, we want to see whether the work motivation of employees in public services is a continuum (intrinsic and extrinsic motivations cannot coexist) or a bi-dimensional construct (intrinsic and extrinsic motivations coexist simultaneously). Research in the public administration literature has focused on the concept of public service motivation and considered motivation to be uni-dimensional (Perry and Hondeghem 2008). However, no study has yet tackled both types of motivation, intrinsic and extrinsic, at the same time. This dissertation proposes, in Part I, a theoretical assessment and an empirical test of a global work motivational structure, using a self-constructed Swiss dataset of employees from three public services: the education sector, the security sector and the public administrative services sector. Our findings suggest that work motivation in public services is not uni-dimensional but bi-dimensional; intrinsic and extrinsic motivations coexist simultaneously and can be positively correlated (Amabile et al. 1994). Our findings show that intrinsic motivation is as important as extrinsic motivation; thus, the assumption that employees in public services are less attracted by extrinsic rewards is not confirmed for this sample. Another important finding concerns the public service motivation concept, which, as theoretically predicted, represents the major motivational dimension of employees in the delivery of public services. Second, the theory of public service motivation assumes that employees in public services engage in activities that go beyond their self-interest, but never uses this construct as a determinant of their pro-social behavior. At the same time, several studies (Gregg et al. 2011; Georgellis et al. 2011) provide evidence on the pro-social behavior of employees in public services. However, they do not identify which type of motivation is at the origin of this behavior; they only assume an intrinsically motivated behavior. We analyze the pro-social behavior of employees in public services and use public service motivation as a determinant of their pro-social behavior. We add other determinants highlighted by the theory of pro-social behavior (Bénabou and Tirole 2006), by Le Grand (2003) and by fit theories (Besley and Ghatak 2005). We test these determinants in Part II and identify, for each sector of activity, the positive or negative impact on the pro-social behavior of Swiss employees. Contrary to expectations, we find, for this sample, that both intrinsic and extrinsic factors have a positive impact on pro-social behavior; no crowding-out effect is identified. We confirm the hypothesis of Le Grand (2003) about the positive impact of the opportunity cost on pro-social behavior. Our results suggest a mix of action-oriented altruism and output-oriented altruism among employees in public services. These results are relevant when designing incentive schemes for employees in the delivery of public services.

Relevance: 30.00%

Abstract:

Background: Diet plays a role in the development of the immune system, and polyunsaturated fatty acids can modulate the expression of a variety of genes. Human milk contains conjugated linoleic acid (CLA), a fatty acid that seems to contribute to immune development. Indeed, recent studies carried out in our group in suckling animals have shown that immune function is enhanced after feeding them an 80:20 isomer mix of c9,t11 and t10,c12 CLA. However, little work has been done on the effects of CLA on gene expression, and even less regarding immune system development in early life. Results: The expression profile of mesenteric lymph nodes from animals supplemented with CLA during gestation and suckling through the dam's milk (Group A) or by oral gavage (Group B), supplemented only during suckling (Group C), and control animals (Group D) was determined with the aid of the specific GeneChip® Rat Genome 230 2.0 (Affymetrix). Bioinformatics analyses were performed using the GeneSpring GX software package v10.0.2 and led to the identification of 89 genes differentially expressed in all three dietary approaches. Generation of a biological association network evidenced several genes, such as connective tissue growth factor (Ctgf), tissue inhibitor of metalloproteinase 1 (Timp1), galanin (Gal), synaptotagmin 1 (Syt1), growth factor receptor bound protein 2 (Grb2), actin gamma 2 (Actg2) and smooth muscle alpha actin (Acta2), as highly interconnected nodes of the resulting network. Gene underexpression was confirmed by real-time RT-PCR. Conclusions: Ctgf, Timp1, Gal and Syt1, among others, are genes modulated by CLA supplementation that may have a role in mucosal immune responses in early life.
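
A minimal sketch of the kind of cross-group filtering described above (genes differentially expressed in all three dietary approaches), assuming a hypothetical expression matrix with per-group mean intensities; it is not the GeneSpring GX workflow used in the study, and the fold-change threshold is illustrative.

```python
import numpy as np
import pandas as pd

# Hypothetical normalized expression matrix: rows = genes, columns = group means.
# Groups A-C received CLA supplementation; group D is the control.
rng = np.random.default_rng(0)
genes = [f"gene_{i}" for i in range(1000)]
expr = pd.DataFrame(
    rng.lognormal(mean=5.0, sigma=0.5, size=(1000, 4)),
    index=genes, columns=["A", "B", "C", "D"],
)

# log2 fold change of each supplemented group against the control group D
log2fc = np.log2(expr[["A", "B", "C"]].div(expr["D"], axis=0))

# Keep genes under-expressed (log2FC <= -1, i.e. at least 2-fold down)
# in ALL three dietary approaches, mirroring the "common to all groups" criterion.
under_in_all = log2fc[(log2fc <= -1.0).all(axis=1)]
print(f"{len(under_in_all)} genes under-expressed in all three CLA groups")
print(under_in_all.head())
```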

Relevance: 30.00%

Abstract:

With the implementation of the 2000 Q-MC specification, an incentive is provided to produce an optimized gradation that improves placement characteristics. Specifications for slip-formed barrier rail have also changed to require an optimized gradation. Generally, these optimized gradations have been achieved by blending an intermediate aggregate with the coarse and fine aggregates. The demand for this intermediate aggregate has been satisfied by using crushed limestone chips produced when crushing the parent concrete stone. The availability, cost, and physical limitations of crushed limestone chips can be a concern. A viable option for addressing these concerns is the use of gravel as the intermediate aggregate. Unfortunately, gravels of Class 3I durability are limited to a small geographic area of Mississippi River sands north of the Rock River. Class 3 or Class 2 durability gravels are more widely available across the state. The durability classification of gravels is based on the amount and quality of the carbonate fraction of the material. At present, no service histories or research exists to assess the impact that using Class 3 or Class 2 durability gravels would have on the long-term durability of Portland cement concrete (PCC) pavements requiring Class 3I aggregate.
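
Optimized gradations of this kind are produced by combining the percent passing of the coarse, intermediate, and fine stockpiles in chosen proportions. The sketch below shows that weighted-average calculation with entirely hypothetical gradations and blend percentages, not values from any Iowa DOT specification.

```python
import numpy as np

# Sieve sizes (mm) and hypothetical percent-passing curves for each stockpile.
sieves = np.array([25.0, 19.0, 12.5, 9.5, 4.75, 2.36, 1.18, 0.600, 0.300, 0.150, 0.075])
coarse       = np.array([100, 95, 60, 35, 5, 2, 1, 1, 1, 1, 0.5])
intermediate = np.array([100, 100, 98, 85, 30, 8, 4, 3, 2, 2, 1.0])
fine         = np.array([100, 100, 100, 100, 98, 85, 70, 50, 20, 5, 2.0])

def blend(percentages, *gradations):
    """Combined percent passing = sum of (stockpile fraction * percent passing)."""
    fractions = np.asarray(percentages, dtype=float) / 100.0
    assert np.isclose(fractions.sum(), 1.0), "blend percentages must total 100%"
    return sum(f * g for f, g in zip(fractions, gradations))

combined = blend([45, 20, 35], coarse, intermediate, fine)
for size, pp in zip(sieves, combined):
    print(f"{size:>6.3f} mm sieve: {pp:5.1f}% passing")
```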

Relevance: 30.00%

Abstract:

The coverage and volume of geo-referenced datasets are extensive and incessantly growing. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analyzed. Clustering and visualization play a key role in exploratory data analysis and in the extraction of knowledge embedded in these data. However, the special characteristics of these data pose new challenges for visualization and clustering: complex structures, large numbers of samples, variables embedded in a temporal context, high dimensionality and large variability in cluster shapes.

The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization, in order to assist knowledge extraction from spatio-temporal geo-referenced data and thus improve decision-making processes. I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and a second for exploratory visual data analysis, the Tree-structured Self-Organizing Maps Component Planes. In addition, I present methodologies that, combined with the FGHSON and the Tree-structured SOM Component Planes, allow space and time to be integrated seamlessly and simultaneously in order to extract knowledge embedded in a temporal context.

The originality of the FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical, fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability in cluster shapes, variances, densities and numbers of clusters. The most important characteristics of the FGHSON are: (1) it does not require an a-priori setup of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, so when dealing with large datasets the processes can be distributed, reducing the computational cost; and (3) only three parameters are necessary to set up the algorithm. In the case of the Tree-structured SOM Component Planes, the novelty lies in the ability to create a structure that allows the visual exploratory analysis of large, high-dimensional datasets. This algorithm creates a hierarchical structure of Self-Organizing Map Component Planes, arranging similar variables' projections in the same branches of the tree. Hence, similarities in variables' behavior can be easily detected (e.g. local correlations, maximal and minimal values, and outliers). Both the FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving to be very efficient in the exploratory analysis and clustering of spatio-temporal datasets.

In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, namely the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and our original contribution, the FGHSON. Although the algorithms presented here have been used in several areas, to my knowledge no previous work has applied and compared the performance of these techniques on spatio-temporal geospatial data, as is presented in this thesis.

I propose original methodologies to explore spatio-temporal geo-referenced datasets through time. Our approach uses time windows to capture temporal similarities and variations by means of the FGHSON clustering algorithm. The developed methodologies are used in two case studies: in the first, the objective was to find similar agroecozones through time; in the second, it was to find similar environmental patterns shifted in time. Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production. Finally, in the framework of this thesis we developed several software tools: (1) a Matlab toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface tool that integrates the FGHSON algorithm with Google Earth in order to show zones with similar agroecological characteristics.
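
A minimal sketch of the time-window idea described above, using a plain fuzzy c-means clusterer written in NumPy as a stand-in; FGHSON itself is the thesis's own hierarchical algorithm and is not reproduced here, and the window length and data are hypothetical.

```python
import numpy as np

def fuzzy_cmeans(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: returns cluster centres and the fuzzy membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(n_clusters), size=len(X))  # random initial memberships
    for _ in range(n_iter):
        w = u ** m
        centres = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

# Hypothetical spatio-temporal series: rows = locations, columns = monthly observations.
rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=(200, 36)), axis=1)   # 200 sites, 36 months

window = 12  # one-year time windows, as in the "similar agroecozones through time" idea
for start in range(0, series.shape[1] - window + 1, window):
    X = series[:, start:start + window]
    centres, u = fuzzy_cmeans(X, n_clusters=4)
    labels = u.argmax(axis=1)           # hard assignment from the fuzzy memberships
    print(f"window {start}-{start + window}: cluster sizes {np.bincount(labels)}")
```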

Relevance: 30.00%

Abstract:

Recent findings suggest an association between exposure to cleaning products and respiratory dysfunctions, including asthma. However, little information is available about the quantitative airborne exposures of professional cleaners to volatile organic compounds deriving from cleaning products. During the first phases of the study, a systematic review of cleaning products was performed. Safety data sheets were reviewed to assess the most frequently added volatile organic compounds. It was found that professional cleaning products are complex mixtures of different components (compounds per cleaning product: 3.5 ± 2.8), and more than 130 chemical substances listed in the safety data sheets were identified in 105 products. The main groups of chemicals were fragrances, glycol ethers, surfactants and solvents, and, to a lesser extent, phosphates, salts, detergents, pH stabilizers, acids, and bases. Up to 75% of the products contained substances labeled irritant (Xi), 64% harmful (Xn) and 28% corrosive (C). Hazards for eyes (59%), skin (50%) and by ingestion (60%) were the most frequently reported. Monoethanolamine, a strong irritant known to be involved in sensitizing mechanisms as well as allergic reactions, is frequently added to cleaning products. Monoethanolamine determination in air has traditionally been difficult, and the available air sampling and analysis methods were poorly adapted to personal occupational air concentration assessments. A convenient method was developed, with air sampling on impregnated glass fiber filters followed by one-step desorption, gas chromatography and nitrogen-phosphorus selective detection. An exposure assessment was then conducted in the cleaning sector to determine airborne concentrations of monoethanolamine, glycol ethers, and benzyl alcohol during different cleaning tasks performed by professional cleaning workers in different companies, and to determine background air concentrations of formaldehyde, a known indoor air contaminant. The occupational exposure study was carried out in 12 cleaning companies, and personal air samples were collected for monoethanolamine (n=68), glycol ethers (n=79), benzyl alcohol (n=15) and formaldehyde (n=45). Apart from ethylene glycol mono-n-butyl ether, all measured air concentrations were far below (<1/10 of) the Swiss eight-hour occupational exposure limits; for butoxypropanol and benzyl alcohol, no occupational exposure limits were available. Although detected only once, ethylene glycol mono-n-butyl ether air concentrations (n=4) were high (49.5 mg/m3 to 58.7 mg/m3), hovering around the Swiss occupational exposure limit (49 mg/m3). Background air concentrations showed no presence of monoethanolamine, while glycol ethers were often present and formaldehyde was universally detected. Exposures were influenced by the amount of monoethanolamine in the cleaning product, cross ventilation and spraying. The collected data were used to test an existing exposure modeling tool during the last phases of the study. The exposure estimates of this Bayesian tool converged towards the measured exposure range as more measured air concentrations were added, a behaviour best described by an inverse second-order equation. The results suggest that the Bayesian tool is not adapted to predicting low exposures. The Bayesian tool should also be tested with other datasets describing higher exposures.
Low exposures to different chemical sensitizers and irritants should be further investigated to better understand the development of respiratory disorders in cleaning workers. Prevention measures should especially focus on the incorrect use of cleaning products, to avoid high air concentrations near the exposure limits.
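
As a small illustration of the screening criterion used above (measured concentrations compared with one tenth of the Swiss eight-hour occupational exposure limit), here is a hedged sketch: the ethylene glycol mono-n-butyl ether OEL of 49 mg/m3 is quoted in the abstract, while the sample concentrations other than the 49.5 and 58.7 mg/m3 endpoints and the monoethanolamine OEL are made up for illustration.

```python
# Compare measured personal air concentrations against occupational exposure limits (OELs).
SWISS_8H_OEL_MG_M3 = {
    "ethylene glycol mono-n-butyl ether": 49.0,  # quoted in the abstract
    "monoethanolamine": 5.1,                     # assumed value for illustration only
}

samples_mg_m3 = {
    "ethylene glycol mono-n-butyl ether": [49.5, 52.3, 55.1, 58.7],
    "monoethanolamine": [0.12, 0.30, 0.45],
}

for substance, values in samples_mg_m3.items():
    oel = SWISS_8H_OEL_MG_M3[substance]
    for c in values:
        fraction = c / oel
        flag = ("above OEL" if fraction >= 1.0
                else "above 1/10 OEL" if fraction >= 0.1
                else "below 1/10 OEL")
        print(f"{substance}: {c:6.2f} mg/m3 -> {fraction:5.2f} x OEL ({flag})")
```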

Relevance: 30.00%

Abstract:

Evidence is accumulating that total body mass and its relative composition influence the rate of fat utilization in man. This effect can be explained by two factors operating in concert: (i) the size of the tissue mass and (ii) the nature of the fuel mix oxidized, i.e. the proportion of energy derived from fat vs. carbohydrate. In a cross-sectional study of 307 women with increasing degrees of obesity, we observed that the respiratory quotient (RQ) in post-absorptive conditions became progressively lower with increasing body fatness, indicating a shift in substrate utilization. However, the RQ is also known to be influenced by the diet commonly ingested by the subjects. A short-term mixed-diet overfeeding study in lean and obese women also demonstrated the high sensitivity of the RQ to changes in energy balance. Following one day of overfeeding (2500 kcal/day in excess of the previous 24 h energy expenditure), the magnitude of the increase in RQ was identical in lean and obese subjects, and the net efficiency of substrate utilization and storage was not influenced by the state of obesity.
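
The link between the RQ and the fuel mix oxidized can be illustrated with the classical non-protein RQ interpolation (pure fat oxidation at RQ ≈ 0.70, pure carbohydrate oxidation at RQ ≈ 1.00); a minimal, approximate sketch, not the indirect-calorimetry equations used in the study itself.

```python
def nonprotein_fuel_mix(rq, rq_fat=0.70, rq_cho=1.00):
    """Approximate share of non-protein energy from carbohydrate vs. fat for a given RQ.

    Linear interpolation between the RQ of pure fat oxidation (~0.70)
    and pure carbohydrate oxidation (1.00); protein oxidation is ignored.
    """
    cho_fraction = (rq - rq_fat) / (rq_cho - rq_fat)
    cho_fraction = min(max(cho_fraction, 0.0), 1.0)  # clamp to the physiological range
    return {"carbohydrate": cho_fraction, "fat": 1.0 - cho_fraction}

# A lower post-absorptive RQ (as reported with increasing body fatness)
# implies a larger share of energy derived from fat oxidation.
for rq in (0.88, 0.82, 0.76):
    mix = nonprotein_fuel_mix(rq)
    print(f"RQ {rq:.2f}: {mix['fat']:.0%} fat, {mix['carbohydrate']:.0%} carbohydrate")
```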

Relevance: 30.00%

Abstract:

Arbuscular mycorrhizal fungi (AMF) are obligate symbionts of most terrestrial plants. They improve plant nutrition, particularly phosphate acquisition, and are thus able to improve plant growth. In exchange, the fungi obtain photosynthetically fixed carbon. AMF are coenocytic, meaning that many nuclei coexist in a common cytoplasm. Genetic exchange has recently been demonstrated in the AMF Glomus intraradices, allowing nuclei of different Glomus intraradices strains to mix. Such genetic exchange was previously shown to have negative effects on plant growth and to alter fungal colonization. However, no attempt had been made to detect whether genetic exchange in AMF can alter plant gene expression and whether this effect is time dependent. Here, we show that genetic exchange in AMF can also be beneficial for rice growth, and that symbiosis-specific gene transcription is altered by genetic exchange. Moreover, our results show that genetic exchange can change the dynamics of colonization of the plant by the fungus. Our results demonstrate that simple manipulation of the genetics of AMF can have important consequences for their symbiotic effects on plants such as rice, which is considered the most important crop in the world. Exploiting natural AMF genetic variation by generating novel AMF genotypes through genetic exchange is a potentially useful tool in the development of AMF inocula that are more beneficial for crop growth.

Relevance: 30.00%

Abstract:

This document summarizes the discussion and findings of a workshop on intelligent compaction for soils and hot-mix asphalt held in West Des Moines, Iowa, on April 2–4, 2008. The objective of the meeting was to provide a collaborative exchange of ideas for developing research initiatives that accelerate the implementation of intelligent compaction (IC) technologies for soils, aggregates, and hot-mix asphalt. Technical presentations, working breakout sessions, a panel discussion, and a group implementation strategy session comprised the workshop activities. About 100 attendees, representing state departments of transportation, the Federal Highway Administration, contractors, equipment manufacturers, and researchers, participated in the workshop.

Relevance: 30.00%

Abstract:

The purpose of this article is to identify tobacco and cannabis co-consumption patterns and consumers' perceptions of each substance. A qualitative study included 22 youths (14 males) aged 15-21 years in seven individual interviews and five focus groups. Discussions were recorded, transcribed verbatim and transferred to the Atlas.ti software for narrative analysis. The main consumption mode is cannabis cigarettes, which always mix cannabis and tobacco. Participants perceive cannabis much more positively than tobacco, which is considered unnatural, harmful and addictive. Forecasts of future consumption thus more often exclude tobacco smoking than cannabis consumption. A substitution phenomenon often takes place between the two substances. Given the co-consumption of tobacco and cannabis, both substances should be taken into account in a global approach when helping youths quit or reduce their consumption. Cannabis consumers should be made aware of their tobacco use while consuming cannabis and of the risk of inducing nicotine addiction through cannabis use, despite the perceived disconnect between the two substances. Prevention programs should correct misconceptions about cannabis consumption and convey a clear message about its harmful consequences. Our findings support the growing evidence suggesting that nicotine dependence and cigarette smoking may be induced by cannabis consumption.

Relevance: 30.00%

Abstract:

Molecular shape has long been known to be an important property for the process of molecular recognition. Previous studies postulated the existence of a drug-like shape space that could be used to artificially bias the composition of screening libraries, with the aim of increasing the chance of success in Hit Identification. In this work, it was analysed to what extent this assumption holds true. Normalized Principal Moments of Inertia Ratios (NPRs) were used to describe the molecular shape of small molecules. It was investigated whether active molecules of diverse targets are located in preferred subspaces of the NPR shape space. The results showed significantly stronger clustering than could be expected by chance, with parts of the space unlikely to be occupied by active compounds. Furthermore, a strong enrichment of elongated, rather flat shapes could be observed, while globular compounds were highly underrepresented. This was confirmed for a wide range of small-molecule datasets from different origins. Active compounds exhibited a high overlap in their shape distributions across different targets, making a purely shape-based discrimination very difficult. An additional perspective was provided by comparing the shapes of protein binding pockets with those of their respective ligands. Although more globular than their ligands, binding site shapes exhibited a similarly skewed distribution in shape space: spherical shapes were highly underrepresented. This was different for unoccupied binding pockets of smaller size, which were, on the contrary, found to possess a more globular shape. The relation between shape complementarity and exhibited bioactivity was analysed; a moderate correlation between bioactivity and parameters including pocket coverage, distance in shape space, and others could be identified, which reflects the importance of shape complementarity. However, this also suggests that other aspects are relevant for molecular recognition. A subsequent analysis assessed whether and how shape and volume information retrieved from the pocket or from respective reference ligands could be used as a pre-filter in a virtual screening approach. In Lead Optimization, compounds need to be optimized with respect to a variety of parameters. Here, the availability of past success stories is very valuable, as they can guide medicinal chemists during their analogue synthesis plans. However, although of tremendous interest for the public domain, so far only large corporations had the ability to mine historical knowledge in their proprietary databases. With the aim of providing such information, the SwissBioisostere database was developed and released during this thesis. This database contains information on 21,293,355 performed substructural exchanges, corresponding to 5,586,462 unique replacements that have been measured in 35,039 assays against 1,948 molecular targets representing 30 target classes, and on their impact on bioactivity. A user-friendly interface was developed that provides facile access to these data and is accessible at http://www.swissbioisostere.ch. The ChEMBL database was used as the primary source of bioactivity data. Matched molecular pairs were identified in the extracted and cleaned data using a modified version of the Hussain and Rea algorithm. Success-based scores were developed and integrated into the database to allow re-ranking of proposed replacements by their past outcomes. It was analysed to what degree these scores correlate with the chemical similarity of the underlying fragments.
An unexpectedly weak relationship was detected and further investigated. Use cases of this database were envisioned, and functionalities were implemented accordingly: replacement outcomes can be aggregated at the assay level, and it was shown that an aggregation at the target or target class level could also be performed, but should be accompanied by a careful case-by-case assessment. It was furthermore observed that replacement success depends on the activity of the starting compound A within a matched molecular pair A-B: with increasing potency, the probability of losing bioactivity through any substructural exchange was significantly higher than for low-affinity binders. The potential existence of a publication bias could be refuted. Furthermore, frequently performed medicinal chemistry strategies for structure-activity relationship exploration were analysed using the acquired data. Finally, data originating from pharmaceutical companies were compared with those reported in the literature. It could be seen that industrial medicinal chemistry can access replacement information not available in the public domain. In contrast, a large number of frequently performed replacements within companies could also be identified in the literature data. Preferences for particular replacements differed between these two sources. The value of combining different endpoints (e.g. bioactivity and metabolic stability) in an evaluation of molecular replacements was investigated. The performed studies furthermore highlighted that there seems to exist no universal substructural replacement that always retains bioactivity irrespective of the biological environment; a generalization of bioisosteric replacements therefore does not seem possible.
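
The NPR shape descriptors discussed above can be reproduced with standard cheminformatics tooling; a minimal sketch using RDKit (assuming it is installed), where the example SMILES and the single embedded conformer are illustrative rather than the thesis's actual datasets.

```python
from rdkit import Chem
from rdkit.Chem import AllChem, Descriptors3D

def npr_shape(smiles: str, seed: int = 42):
    """Return the normalized principal moments of inertia ratios (NPR1, NPR2).

    NPR1 = I1/I3 and NPR2 = I2/I3; (0, 1) ~ rod, (0.5, 0.5) ~ disc, (1, 1) ~ sphere.
    """
    mol = Chem.AddHs(Chem.MolFromSmiles(smiles))
    AllChem.EmbedMolecule(mol, randomSeed=seed)   # one illustrative 3D conformer
    AllChem.MMFFOptimizeMolecule(mol)
    return Descriptors3D.NPR1(mol), Descriptors3D.NPR2(mol)

for name, smi in {
    "hexane (elongated)": "CCCCCC",
    "benzene (flat)": "c1ccccc1",
    "adamantane (globular)": "C1C2CC3CC1CC(C2)C3",
}.items():
    npr1, npr2 = npr_shape(smi)
    print(f"{name:22s} NPR1={npr1:.2f} NPR2={npr2:.2f}")
```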

Relevance: 30.00%

Abstract:

Moisture sensitivity of Hot Mix Asphalt (HMA) mixtures, generally called stripping, is a major form of distress in asphalt concrete pavement. It is characterized by the loss of the adhesive bond between the asphalt binder and the aggregate (a failure of the bonding of the binder to the aggregate) or by a softening of the cohesive bonds within the asphalt binder (a failure within the binder itself), both of which are due to the action of traffic loading in the presence of moisture. Tests for evaluating HMA moisture sensitivity fall into two categories, visual inspection tests and mechanical tests; however, most of them were developed for pre-Superpave mix designs. This research was undertaken to develop a protocol for evaluating the moisture sensitivity potential of HMA mixtures using the Nottingham Asphalt Tester (NAT). The mechanisms of HMA moisture sensitivity were reviewed and test protocols using the NAT were developed. Different types of blends, grouped as moisture-sensitive and non-moisture-sensitive, were used to evaluate the potential of the proposed test. The test results were analyzed with three performance-based parameters: the retained flow number based on critical permanent deformation failure (RFNP), the retained flow number based on cohesion failure (RFNC), and the energy ratio (ER). Analysis based on the energy ratio of elastic strain (EREE) at the flow number of cohesion failure (FNC) has a higher potential to evaluate HMA moisture sensitivity than the other parameters. If the measurement error in the data-acquisition process were removed, analyses based on RFNP and RFNC would also have a high potential to evaluate HMA moisture sensitivity. The vacuum pressure saturation used in AASHTO T 283 and in the proposed test risks damaging the specimen before the load is applied.
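
Retained-ratio parameters of the kind listed above are generally computed as the moisture-conditioned result divided by the unconditioned (dry) result. The sketch below illustrates that calculation with hypothetical flow numbers and energy values; the field names and formulas are assumptions for illustration, not the exact definitions of RFNP, RFNC, or ER from the proposed protocol.

```python
from dataclasses import dataclass

@dataclass
class NatResult:
    """Hypothetical summary of one NAT specimen result (illustrative fields only)."""
    flow_number_permanent: int    # cycles to critical permanent-deformation failure
    flow_number_cohesion: int     # cycles to cohesion failure
    elastic_strain_energy: float  # elastic strain energy at FNC (arbitrary units)

def retained_ratio(conditioned: float, dry: float) -> float:
    """Generic retained ratio: moisture-conditioned value over dry (unconditioned) value."""
    return conditioned / dry

dry = NatResult(flow_number_permanent=5200, flow_number_cohesion=6100, elastic_strain_energy=14.8)
wet = NatResult(flow_number_permanent=3100, flow_number_cohesion=3900, elastic_strain_energy=8.9)

print(f"RFNP-like ratio: {retained_ratio(wet.flow_number_permanent, dry.flow_number_permanent):.2f}")
print(f"RFNC-like ratio: {retained_ratio(wet.flow_number_cohesion, dry.flow_number_cohesion):.2f}")
print(f"Energy-ratio-like value: {retained_ratio(wet.elastic_strain_energy, dry.elastic_strain_energy):.2f}")
# A lower retained value would indicate a more moisture-sensitive blend.
```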

Relevance: 30.00%

Abstract:

High-precision isotope dilution - thermal ionization mass spectrometry (ID-TIMS) U-Pb zircon and baddeleyite ages from the PX1 vertically layered mafic intrusion, Fuerteventura, Canary Islands, indicate initiation of magma crystallization at 22.10 +/- 0.07 Ma. The magmatic activity lasted a minimum of 0.52 Ma. Ar-40/Ar-39 amphibole dating yielded ages from 21.9 +/- 0.6 Ma to 21.8 +/- 0.3 Ma, identical within errors to the U-Pb ages, despite the expected 1% theoretical bias between Ar-40/Ar-39 and U-Pb dates. This overlap could result from (i) rapid cooling of the intrusion (i.e., in less than the 0.3 to 0.6 Ma Ar-40/Ar-39 age uncertainties) from the closure temperature (T-c) of zircon (699-988 degrees C) to that of amphibole (500-600 degrees C); (ii) lead loss affecting the youngest zircons; or (iii) excess argon shifting the plateau ages towards older values. The combination of the Ar-40/Ar-39 and U-Pb datasets implies that the maximum amount of time the PX1 intrusion took to cool below the amphibole T-c is 0.8 Ma, suggesting a PX1 lifetime of 520,000 to 800,000 years. Age disparities between coexisting baddeleyite and zircon (22.10 +/- 0.07/0.08/0.15 Ma and 21.58 +/- 0.15/0.16/0.31 Ma) in a gabbro sample from the pluton margin suggest complex genetic relationships between the phases. Baddeleyite is found preserved in plagioclase cores and crystallized early from a low silica activity magma. Zircon crystallized later, in a higher silica activity environment, and is found in secondary scapolite that recrystallised from plagioclase, close to calcite veins. Oxygen isotope delta O-18 values of altered plagioclase are high (+7.7), indicating interaction with fluids derived from host-rock carbonatites. The coexistence of baddeleyite and zircon is ascribed to interaction of the PX1 gabbro with CO2-rich carbonatite-derived fluids released during contact metamorphism.
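
The "identical within errors" comparison between the Ar-40/Ar-39 and U-Pb dates above amounts to checking whether the two ages agree once their uncertainties are combined; a minimal sketch of that check, using the values quoted in the abstract and a simple quadrature combination at the 2-sigma level as an assumption.

```python
import math

def ages_overlap(age1, err1, age2, err2, k=2.0):
    """Check whether two dates agree within k times their combined (quadrature) uncertainty."""
    combined = math.sqrt(err1**2 + err2**2)
    return abs(age1 - age2) <= k * combined

# Values (in Ma) quoted in the abstract.
u_pb_zircon   = (22.10, 0.07)
ar_ar_amphib  = (21.9, 0.6)
zircon_margin = (21.58, 0.15)

print(ages_overlap(*u_pb_zircon, *ar_ar_amphib))    # True: identical within errors
print(ages_overlap(*u_pb_zircon, *zircon_margin))   # False: resolvable age disparity
```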

Relevance: 30.00%

Abstract:

Since different pedologists will draw different soil maps of the same area, it is important to compare the differences between mapping by specialists and mapping techniques such as the currently intensively discussed Digital Soil Mapping. Four detailed soil maps (scale 1:10,000) of a 182-ha sugarcane farm in the county of Rafard, São Paulo State, Brazil, were compared. The area has a large variation of soil formation factors. The maps were drawn independently by four soil scientists and compared with a fifth map obtained by a digital soil mapping technique. All pedologists were given the same set of information. As many field expeditions and soil pits as required by each surveyor were provided to define the mapping units (MUs). For the Digital Soil Map (DSM), spectral data were extracted from Landsat 5 Thematic Mapper (TM) imagery, together with six terrain attributes derived from the topographic map of the area. These data were summarized by principal component analysis, and groups were generated through fuzzy k-means clustering to produce the map design. Field observations were made to identify the soils in the MUs and classify them according to the Brazilian Soil Classification System (BSCS). To compare the conventional and digital (DSM) soil maps, they were crossed pairwise to generate confusion matrices, which were mapped. The categorical analysis at each classification level of the BSCS showed that agreement between the maps decreased towards the lower levels of classification and that the surveyor strongly influences both the mapping and the definition of MUs in the soil map. The average correspondence between the conventional maps and the DSM was similar. Therefore, the method used to obtain the DSM yielded results similar to those obtained by the conventional technique, while providing additional information about the landscape of each soil, useful for applications in future surveys of similar areas.
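
The pairwise map comparison described above boils down to crossing two categorical maps cell by cell and tabulating a confusion matrix; a minimal sketch with a hypothetical pair of rasterized class maps and made-up class labels, not the study's actual data.

```python
import numpy as np
import pandas as pd

# Hypothetical rasterized soil-class maps from two surveyors over the same grid,
# using made-up class labels in place of the actual BSCS mapping units.
rng = np.random.default_rng(7)
classes = ["Latossolo", "Argissolo", "Cambissolo", "Nitossolo"]
map_a = rng.choice(classes, size=(120, 150), p=[0.4, 0.3, 0.2, 0.1])
map_b = np.where(rng.random(map_a.shape) < 0.7, map_a,          # 70% of cells agree
                 rng.choice(classes, size=map_a.shape))          # the rest re-drawn

# Cross the two maps pairwise to build a confusion matrix of mapping-unit labels.
confusion = pd.crosstab(pd.Series(map_a.ravel(), name="surveyor A"),
                        pd.Series(map_b.ravel(), name="surveyor B"))
overall_agreement = np.trace(confusion.reindex(index=classes, columns=classes).values) / map_a.size

print(confusion)
print(f"Overall agreement: {overall_agreement:.1%}")
```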

Relevance: 30.00%

Abstract:

Soil properties have an enormous impact on the economic and environmental aspects of agricultural production. Quantitative relationships between soil properties and the factors that influence their variability are the basis of digital soil mapping. The predictive models of soil properties evaluated in this work are statistical (multiple linear regression, MLR) and geostatistical (ordinary kriging and co-kriging). The study was conducted in the municipality of Bom Jardim, RJ, using a soil database with 208 sampling points. Predictive models were evaluated for the sand, silt and clay fractions, pH in water and organic carbon at six depths, according to the specifications of the global digital soil mapping consortium (GlobalSoilMap). Continuous covariates and categorical predictors were used and their contributions to the models assessed. Only the environmental covariates elevation, aspect, stream power index (SPI), soil wetness index (SWI), normalized difference vegetation index (NDVI), and the b3/b2 band ratio were significantly correlated with the soil properties. The predictive models had a mean coefficient of determination of 0.21. The best results were obtained with the geostatistical predictive models, where the highest coefficient of determination (0.43) was obtained for the sand fraction at 60 to 100 cm depth. The use of a sparse dataset of soil properties for digital mapping can explain only part of the spatial variation of these properties. The results may be related to the sampling density and to the quantity and quality of the environmental covariates and predictive models used.
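
A minimal sketch of the MLR part of the workflow above, regressing a soil property on environmental covariates and reporting the coefficient of determination; the synthetic table is a stand-in for the 208-point Bom Jardim database, with covariate names following the abstract and made-up values and relationships.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the soil database: covariates named as in the abstract.
rng = np.random.default_rng(42)
n = 208
covariates = pd.DataFrame({
    "elevation": rng.uniform(600, 1200, n),
    "aspect": rng.uniform(0, 360, n),
    "SPI": rng.gamma(2.0, 1.5, n),
    "SWI": rng.normal(8.0, 2.0, n),
    "NDVI": rng.uniform(0.2, 0.9, n),
    "b3_b2_ratio": rng.normal(1.1, 0.15, n),
})
# Hypothetical clay content (%) loosely driven by elevation and NDVI plus noise.
clay = 20 + 0.02 * covariates["elevation"] + 10 * covariates["NDVI"] + rng.normal(0, 6, n)

X_train, X_test, y_train, y_test = train_test_split(covariates, clay, test_size=0.3, random_state=0)
mlr = LinearRegression().fit(X_train, y_train)
print(f"R2 on held-out points: {r2_score(y_test, mlr.predict(X_test)):.2f}")
print(pd.Series(mlr.coef_, index=covariates.columns).round(3))
```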