976 results for agent knowledge bases


Relevance: 20.00%

Abstract:

Context. Seizures during intoxication with pharmaceuticals are a well-known complication. However, only a few studies report on the drugs most commonly involved or calculate the seizure potential of these drugs. Objectives. To identify the pharmaceutical drugs most commonly associated with seizures after single-agent overdose, the seizure potential of these pharmaceuticals, the age distribution of the cases with seizures, and the ingested doses. Methods. A retrospective review of acute single-agent exposures to pharmaceuticals reported to the Swiss Toxicological Information Centre (STIC) between January 1997 and December 2010 was conducted. Exposures which resulted in at least one seizure were identified. The seizure potential of a pharmaceutical was calculated by dividing the number of cases with seizures by the number of all cases recorded with that pharmaceutical. Data were analyzed using descriptive statistics. Results. We identified 15,441 single-agent exposures. Seizures occurred in 313 cases. The most prevalent pharmaceuticals were mefenamic acid (51 of the 313 cases), citalopram (34), trimipramine (27), venlafaxine (23), tramadol (15), diphenhydramine (14), amitriptyline (12), carbamazepine (11), maprotiline (10), and quetiapine (10). Antidepressants were involved in 136 cases. Drugs with a high seizure potential were bupropion (31.6%, seizures in 6 of 19 cases, 95% CI: 15.4-50.0%), maprotiline (17.5%, 10/57, 95% CI: 9.8-29.4%), venlafaxine (13.7%, 23/168, 95% CI: 9.3-19.7%), citalopram (13.1%, 34/259, 95% CI: 9.5-17.8%), and mefenamic acid (10.9%, 51/470, 95% CI: 8.4-14.0%). In adolescents (15-19 y/o), 23.9% (95% CI: 17.6-31.7%) of the cases involving mefenamic acid resulted in seizures, but only 5.7% (95% CI: 3.3-9.7%) in adults (≥ 20 y/o; p < 0.001). For citalopram these figures were 22.0% (95% CI: 12.8-35.2%) and 10.9% (95% CI: 7.1-16.4%), respectively (p = 0.058). The probability of seizures with mefenamic acid, citalopram, trimipramine, and venlafaxine increased as the ingested dose increased. Conclusions. Antidepressants were frequently associated with seizures in overdose, but other pharmaceuticals, such as mefenamic acid, were also associated with seizures in a considerable number of cases. Bupropion was the pharmaceutical with the highest seizure potential, even though bupropion overdose was uncommon in our sample. Adolescents might be more susceptible to seizures after mefenamic acid overdose than adults. Part of this work was previously published as a conference abstract for the XXXIV International Congress of the European Association of Poisons Centres and Clinical Toxicologists (EAPCCT), 27-30 May 2014, Brussels, Belgium: Abstract 8, Clin Toxicol 2014;52(4):298.
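
The seizure-potential calculation defined above is simple enough to state in code. Below is a minimal Python sketch of the ratio together with a Wilson score interval; the abstract does not say which interval method was used, so the Wilson choice (and the function name) is an assumption that only approximately reproduces the reported confidence limits.

from math import sqrt

def seizure_potential(seizure_cases, all_cases, z=1.96):
    # Seizure potential as defined above: cases with seizures divided by
    # all recorded cases for that pharmaceutical. The Wilson score
    # interval is an assumed choice; the paper does not name its method.
    p = seizure_cases / all_cases
    denom = 1 + z**2 / all_cases
    centre = (p + z**2 / (2 * all_cases)) / denom
    half = z * sqrt(p * (1 - p) / all_cases + z**2 / (4 * all_cases**2)) / denom
    return p, centre - half, centre + half

# Bupropion: seizures in 6 of 19 single-agent exposures.
print(seizure_potential(6, 19))  # ~(0.316, 0.154, 0.540)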

Relevance: 20.00%

Abstract:

The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies minimizing the l2 and l1TV norms of the prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high-resolution (0.65 mm isotropic) brain data acquired at 7T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically changed in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corrMCF = 0.95, r2MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
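
For readers unfamiliar with closed-form QSM, the sketch below shows a generic Tikhonov-regularized closed-form dipole inversion in Python/NumPy. It is not the paper's MCF method (the modulation term is not reproduced here); the kernel definition and the regularization weight lam are standard textbook choices and should be read as assumptions.

import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0), b0_dir=(0.0, 0.0, 1.0)):
    # Unit dipole kernel in k-space: D(k) = 1/3 - (k . b0)^2 / |k|^2.
    ks = [np.fft.fftfreq(n, d=d) for n, d in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[k2 == 0] = np.inf  # avoid 0/0 at the k-space origin
    kb = kx * b0_dir[0] + ky * b0_dir[1] + kz * b0_dir[2]
    return 1.0 / 3.0 - kb**2 / k2

def closed_form_qsm(field_map, lam=1e-2, **kwargs):
    # Tikhonov-regularized closed-form inversion:
    #   chi = F^-1[ D * F(field) / (D^2 + lambda) ]
    # lam trades streaking artifacts against over-regularization.
    D = dipole_kernel(field_map.shape, **kwargs)
    F = np.fft.fftn(field_map)
    return np.real(np.fft.ifftn(D * F / (D**2 + lam)))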

Relevance: 20.00%

Abstract:

The application of microbial biocontrol agents for the control of fungal plant diseases and plant insect pests is a promising approach in the development of environmentally benign pest management strategies. The ideal biocontrol organism would be a bacterium or a fungus with activity against both insect pests and fungal pathogens. Here we demonstrate the oral insecticidal activity of the root-colonizing Pseudomonas fluorescens CHA0, a strain so far known for its capacity to efficiently suppress fungal plant pathogens. Feeding assays with CHA0-sprayed leaves showed that this strain displays oral insecticidal activity and is able to efficiently kill larvae of three important insect pests. We further present data indicating that the Fit insect toxin produced by CHA0, as well as metabolites controlled by the global regulator GacA, contribute to oral insect toxicity.

Relevance: 20.00%

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences based on steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for the mapping of protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unknown artifacts of the method. Sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations will create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
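
Two of the analysis steps described above, hot-spot filtering and random sampling of tags, can be illustrated with a short Python sketch. This is not the thesis's actual pipeline; the Poisson cutoff, bin size, and function names are illustrative assumptions.

import numpy as np
from scipy.stats import poisson

def flag_hotspot_bins(tag_positions, genome_length, bin_size=1000, alpha=1e-7):
    # Bin tag start positions and flag bins whose counts are implausible
    # under a uniform Poisson background -- one simple filter for the
    # artifactual pile-ups described above (the thesis's own filters may
    # differ).
    n_bins = genome_length // bin_size + 1
    counts = np.bincount(np.asarray(tag_positions) // bin_size,
                         minlength=n_bins)
    expected = counts.sum() / n_bins           # uniform background rate
    cutoff = poisson.ppf(1 - alpha, expected)  # per-bin count threshold
    return np.where(counts > cutoff)[0]

def subsample_tags(tag_positions, fraction, seed=0):
    # Unbiased random subsample of the tag list, for saturation-style
    # analyses on smaller datasets.
    rng = np.random.default_rng(seed)
    tags = np.asarray(tag_positions)
    return rng.choice(tags, size=int(len(tags) * fraction), replace=False)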

Relevance: 20.00%

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually made between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. First, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Second, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account of dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness; sufficient conditions for backward induction are then derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
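
Backward induction, whose epistemic foundations the thesis examines, is easy to state operationally. The following Python sketch solves a finite perfect-information game tree; the Node representation is an illustrative assumption, not a formalism used in the thesis.

from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class Node:
    # A leaf carries payoffs; an internal node carries the mover and its
    # children, keyed by action label.
    player: Optional[int] = None
    children: Dict[str, "Node"] = field(default_factory=dict)
    payoffs: Optional[Tuple[float, ...]] = None

def backward_induction(node):
    # Solve subtrees first, then let the mover pick the action whose
    # subtree payoff is best for them.
    if node.payoffs is not None:
        return node.payoffs, None
    best_action, best_payoffs = None, None
    for action, child in node.children.items():
        payoffs, _ = backward_induction(child)
        if best_payoffs is None or payoffs[node.player] > best_payoffs[node.player]:
            best_action, best_payoffs = action, payoffs
    return best_payoffs, best_action

# Player 0 moves first, then player 1; payoffs are (u0, u1).
game = Node(player=0, children={
    "L": Node(player=1, children={"l": Node(payoffs=(2, 1)),
                                  "r": Node(payoffs=(0, 0))}),
    "R": Node(payoffs=(1, 2)),
})
print(backward_induction(game))  # ((2, 1), 'L')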

Relevance: 20.00%

Abstract:

In this paper we investigate the relationship between Catalan knowledge and individual earnings in Catalonia. Using data from 2006, we find a positive earnings return to Catalan proficiency; however, when accounting for self-selection into Catalan knowledge, we find a higher language return (20% of extra earnings), suggesting that individuals who are more prone to know Catalan are otherwise less remunerated than others (a negative selection effect). Moreover, we also find important complementarities between language knowledge and completed education, which means that only more educated individuals benefit from Catalan knowledge.
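
The complementarity claim corresponds to an interaction term in an earnings regression. The sketch below fits such a specification by OLS on synthetic data; all variable names and coefficients are invented for illustration and do not reflect the paper's data or estimates.

import numpy as np

rng = np.random.default_rng(0)
n = 2000
educ = rng.integers(6, 20, n).astype(float)
catalan = (rng.random(n) < 0.5).astype(float)
log_wage = (1.5 + 0.05 * educ + 0.02 * catalan
            + 0.01 * catalan * educ + rng.normal(0.0, 0.3, n))

# Regressors: constant, education, Catalan dummy, and the interaction
# capturing the language-education complementarity.
X = np.column_stack([np.ones(n), educ, catalan, catalan * educ])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
print(beta)  # the last coefficient recovers the interaction (~0.01)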

Relevance: 20.00%

Abstract:

This paper investigates the economic value of Catalan knowledge for national and foreign first- and second-generation immigrants in Catalonia. Specifically, drawing on data from the “Survey on Living Conditions and Habits of the Catalan Population (2006)”, we want to quantify the expected earnings differential between individuals who are proficient in Catalan and those who are not, taking into account the potential endogeneity between knowledge of Catalan and earnings. The results indicate the existence of a positive return to knowledge of Catalan, with a 7.5% increase in earnings estimated by OLS; however, when we account for the presence of endogeneity, monthly earnings are around 18% higher for individuals who are able to speak and write Catalan. In addition, we find that language and education are complementary inputs for generating earnings in Catalonia, given that knowledge of Catalan increases monthly earnings only for more educated individuals.
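
Correcting for endogeneity of language skills is typically done with instrumental variables. The paper does not spell out its estimator here, so the sketch below shows generic two-stage least squares; the requirement that Z contain the exogenous regressors plus at least one excluded instrument is standard, and everything else is an assumption.

import numpy as np

def two_sls(y, X, Z):
    # Generic 2SLS: project the regressors X (including the endogenous
    # Catalan dummy) onto the instrument set Z, then regress y on the
    # fitted values. Z must contain the exogenous columns of X plus at
    # least one excluded instrument.
    P = Z @ np.linalg.solve(Z.T @ Z, Z.T)
    X_hat = P @ X
    beta, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return beta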

Relevance: 20.00%

Abstract:

This study sets out to establish a fact that has also been its driving force. The reality, as it presents itself, is the following: there appears to be no organized, all-encompassing model of competencies, in the manner of an integrated paradigm, that relates the educational processes taking place within the family and within the school. The definition of this problem, so closely related to the teaching task, has led us to formulate a research hypothesis that can be stated as follows. Does our educational system provide for the involvement of the family institution in the competency-based education of schoolchildren? And if so, a second question arises: is it possible to develop a model of competency-based education that articulates the educational processes occurring within the family and the school in an operative and integrated way? To answer these questions, the study begins with the analysis and subsequent presentation of the following points: the most recent legislative framework (from the first post-constitutional regulation governing parents' participation in school up to the present day), the proposals made by the Autonomous and State School Councils at their annual meetings, and the data obtained from several INCE reports on parents' participation in school. Given that the core of the research revolves around the family-school relationship at the level of competencies, the legal and institutional framework, which currently seems favourable to the viability of a competency development shared between family and school, is presented later on. Next, tables with indicators are shown that facilitate the study of the potential relationship between the competencies developed at school and those developed in the family. The family-related aspects are drawn from the contributions found in the works and documents of relevant pedagogues such as Pestalozzi, Froebel, Montessori and Decroly. At the end of the study, through the assignment of indicators or descriptors, the relationship established between family actions and school competencies is presented. This work allows us to postulate not only that it is possible to articulate a child's competency-based education across family and school, but that the two educational institutions are articulable and synergistic in this development, optimizing competency-based education.

Relevance: 20.00%

Abstract:

This paper explores the earnings return to Catalan knowledge for public and private workers in Catalonia. In doing so, we allow for a double simultaneous selection process. We consider, on the one hand, the non-random allocation of workers into one sector or another and, on the other, the potential self-selection into Catalan proficiency. In addition, when correcting the earnings equations, we take into account the correlation between the two selectivity rules. Our findings suggest that the apparently higher language return for public-sector workers is entirely accounted for by selection effects, whereas knowledge of Catalan has a significant positive return in the private sector, which is somewhat higher when the selection processes are taken into account.
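
The selection correction at work here is in the Heckman tradition. As a simplified illustration, the sketch below implements a single-selection two-step correction in Python with statsmodels; the paper's double-selection estimator with two correlated selection rules is more involved, so treat this as a stand-in under stated assumptions.

import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(y, X, d, W):
    # Step 1: probit for the selection indicator d (e.g. works in the
    # sector, or is proficient in Catalan) on selection regressors W.
    probit = sm.Probit(d, W).fit(disp=0)
    index = W @ probit.params
    mills = norm.pdf(index) / norm.cdf(index)  # inverse Mills ratio
    # Step 2: OLS of earnings on X plus the Mills ratio, on the
    # selected subsample only.
    sel = d.astype(bool)
    X_aug = np.column_stack([X[sel], mills[sel]])
    return sm.OLS(y[sel], X_aug).fit().params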

Relevance: 20.00%

Abstract:

This document complements subtasks 2.4.2, 2.4.3 and 2.4.4, which concern the reports on constructive feasibility studies, the results of the intervention and the validation of the intervention, providing the logical link between the characterization of non-destructive and semi-destructive techniques carried out in documents I2.17 and I2.18 and the process improvements addressed more systematically in SP4 in connection with the development and application of multi-criteria optimization models. To achieve this objective and resolve the interoperability problems tied to different databases, it was necessary to extend the ontology, initially oriented in SP6 towards users with some type of disability, to cover the technicians who will carry out the interventions according to the available techniques. This extension includes the entity-relationship diagram that extends to technicians the conceptual schema initially restricted to end users (citizens, possibly with disabilities). The greatest difficulty stems from the exceptional character of many of the interventions performed, which makes it genuinely difficult to standardize the processes aimed at solving the problem of accessibility to heritage sites.
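
As a rough illustration of the kind of entity-relationship extension described, the Python dataclasses below model technicians, techniques and interventions; every entity and attribute name here is hypothetical, since the document does not list the actual schema.

from dataclasses import dataclass
from typing import List

@dataclass
class Technique:
    name: str
    level: str  # e.g. "non-destructive" or "semi-destructive"

@dataclass
class Technician:
    name: str
    qualified_for: List[Technique]  # techniques the technician may apply

@dataclass
class Intervention:
    site: str                 # heritage asset being made accessible
    performed_by: Technician
    applies: Technique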

Relevance: 20.00%

Abstract:

We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.

Relevance: 20.00%

Abstract:

Pyochelin (Pch) and enantiopyochelin (EPch) are enantiomeric siderophores with three chiral centers, produced under iron-limitation conditions by Pseudomonas aeruginosa and Pseudomonas fluorescens, respectively. After iron chelation in the extracellular medium, Pch-Fe and EPch-Fe are recognized and transported by their specific outer-membrane transporters: FptA in P. aeruginosa and FetA in P. fluorescens. Structural analysis of FetA-EPch-Fe and FptA-Pch-Fe, combined with mutagenesis and docking studies, revealed the structural basis of the stereospecific recognition of these enantiomers by their respective transporters. Whereas FetA and FptA have low sequence identity but high structural homology, the Pch and EPch binding pockets do not share any structural homology, but display similar physicochemical properties. The stereospecific recognition of both enantiomers by their corresponding transporters is imposed by the configuration of the siderophore's C4'' and C2'' chiral centers. This recognition involves specific hydrogen bonds between the Arg91 guanidinium group and EPch-Fe for FetA, and between the Leu117-Leu116 main chain and Pch-Fe for FptA. FetA and FptA are the first membrane receptors to be structurally described with opposite binding enantioselectivities for their ligands, giving insights into the structural basis of their enantiospecificity.

Relevance: 20.00%

Abstract:

Nowadays, many health care systems are large, complex, and highly dynamic environments; this is especially true of Emergency Departments (EDs). An ED operates 24 hours a day throughout the year with limited resources, and it is frequently overcrowded. Thus, it is essential to simulate EDs in order to improve their performance both qualitatively and quantitatively. This improvement can be achieved by modelling and simulating EDs with an Agent-Based Model (ABM) and optimising many different staff scenarios. This work optimises the staff configuration of an ED. In order to perform the optimisation, objective functions to minimise or maximise have to be set. One such objective function is to find the optimum staff configuration that minimises patient waiting time. The staff configuration comprises the number and type of doctors, triage nurses, and admissions staff. Finding the optimal staff configuration is a combinatorial problem that can take a long time to solve. HPC was used to run the experiments, and encouraging results were obtained. However, even for the basic ED used in this work the search space is very large; as the problem size increases, more processing resources will be needed to obtain results in an acceptable time.
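
The staff-configuration search described above is, at its core, a minimization over a discrete grid. The Python sketch below makes that explicit with a toy stand-in for the agent-based simulation; the staffing ranges, the waiting-time model, and all names are invented for illustration.

import itertools
import random

def simulate_mean_wait(doctors, triage, admissions, seed=0):
    # Toy stand-in for the agent-based ED simulation: waiting time
    # shrinks with staff, with diminishing returns, plus noise. Any
    # real ABM run can be plugged in here instead.
    rng = random.Random(hash((doctors, triage, admissions, seed)))
    return 100.0 / doctors + 40.0 / triage + 20.0 / admissions + rng.gauss(0, 1)

def best_configuration(max_doctors=5, max_triage=3, max_admissions=3):
    # Exhaustive search over the staff grid -- feasible for a basic ED,
    # but the combinatorial growth of the space is what motivates HPC.
    grid = itertools.product(range(1, max_doctors + 1),
                             range(1, max_triage + 1),
                             range(1, max_admissions + 1))
    return min(grid, key=lambda cfg: simulate_mean_wait(*cfg))

print(best_configuration())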