170 results for Ontology tool

at Université de Lausanne, Switzerland


Relevance:

30.00%

Publisher:

Abstract:

The Gene Ontology (GO) Consortium (http://www.geneontology.org) (GOC) continues to develop, maintain and use a set of structured, controlled vocabularies for the annotation of genes, gene products and sequences. The GO ontologies are expanding both in content and in structure. Several new relationship types have been introduced and used, along with existing relationships, to create links between and within the GO domains. These improve the representation of biology, facilitate querying, and allow GO developers to systematically check for and correct inconsistencies within the GO. Gene product annotation using GO continues to increase both in the number of total annotations and in species coverage. GO tools, such as OBO-Edit, an ontology-editing tool, and AmiGO, the GOC ontology browser, have seen major improvements in functionality, speed and ease of use.
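As a minimal sketch of how the GO terms and the relationship types mentioned above can be read programmatically, the snippet below parses a locally downloaded copy of the GO OBO release (the file name is a placeholder) and collects each term's parents keyed by relationship type; only the id, name, is_a and relationship tags are handled.

```python
# Minimal OBO 1.2 stanza parser: collects each GO term's parents,
# keyed by relationship type (is_a, part_of, regulates, ...).
# "go-basic.obo" is a placeholder for a locally downloaded GO release.
from collections import defaultdict

def parse_obo(path):
    terms, current = {}, None

    def flush(term):
        if term and term.get("id"):
            terms[term["id"]] = term

    with open(path, encoding="utf-8") as fh:
        for raw in fh:
            line = raw.strip()
            if line.startswith("["):          # new stanza: [Term], [Typedef], ...
                flush(current)
                current = {"parents": defaultdict(list)} if line == "[Term]" else None
            elif current is not None and ": " in line:
                tag, value = line.split(": ", 1)
                value = value.split(" ! ")[0]  # drop trailing term-name comments
                if tag in ("id", "name"):
                    current[tag] = value
                elif tag == "is_a":
                    current["parents"]["is_a"].append(value)
                elif tag == "relationship":    # e.g. "relationship: part_of GO:0005575"
                    rel, target = value.split(" ", 1)
                    current["parents"][rel].append(target)
    flush(current)
    return terms

go = parse_obo("go-basic.obo")  # placeholder path
print(len(go), "terms;", go["GO:0008150"]["name"] if "GO:0008150" in go else "")
```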

Relevance:

30.00%

Publisher:

Abstract:

The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, where the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationship with them, the necessary intra- and inter-firm infrastructure and its profit model. Such an ontology is relevant because until now there has been no model that expresses a company's global business logic from a pure business point of view. Previous models essentially take an organizational or process perspective or cover only parts of a firm's business logic. The four main pillars of the ontology, which are inspired by management science and enterprise and process modelling, are product, customer interface, infrastructure and finance. The ontology is validated by case studies and a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis demonstrates the value of the ontology for business strategy and Information Systems (IS) alignment. Structure of this thesis: the dissertation is organized in nine parts. Chapter 1 presents the motivations of this research, the research methodology with which the goals shall be achieved, and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term and the concept of business models; it defines what is meant by business models in this dissertation, how they are situated in the context of the firm, and outlines the possible uses of the business model concept. Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation, the business model ontology; the elements, attributes and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology, giving an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology in a prototype tool, the Business Model Modelling Language (BM2L), an XML-based description language that captures and describes the business model of a firm and has considerable potential for further applications. Chapter 7 covers the evaluation of the business model ontology, which builds on a literature review, a set of interviews with practitioners, and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology; the main areas of interest are the alignment of business with information technology (IT) and information systems (IS), and business model comparison. Finally, Chapter 9 presents some conclusions.
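The abstract does not reproduce the BM2L schema itself, so the sketch below is purely illustrative: it shows how the four pillars of the ontology (product, customer interface, infrastructure, finance) might be captured in an XML document using Python's standard library, with all element names and values being hypothetical placeholders rather than the actual BM2L vocabulary.

```python
# Illustrative only: a BM2L-like XML description of a business model.
# The real BM2L schema is not given in the abstract; element and attribute
# names here are hypothetical placeholders for the ontology's four pillars.
import xml.etree.ElementTree as ET

model = ET.Element("businessModel", firm="ExampleCo")

product = ET.SubElement(model, "product")
ET.SubElement(product, "valueProposition").text = "Streaming analytics platform"

customer = ET.SubElement(model, "customerInterface")
ET.SubElement(customer, "targetCustomer").text = "Mid-size retailers"
ET.SubElement(customer, "relationship").text = "Subscription with dedicated support"

infrastructure = ET.SubElement(model, "infrastructure")
ET.SubElement(infrastructure, "keyActivity").text = "Platform development"
ET.SubElement(infrastructure, "partnerNetwork").text = "Cloud hosting provider"

finance = ET.SubElement(model, "finance")
ET.SubElement(finance, "revenueStream").text = "Monthly subscription fees"
ET.SubElement(finance, "costStructure").text = "Engineering and hosting costs"

ET.indent(model)                      # pretty-print (Python 3.9+)
print(ET.tostring(model, encoding="unicode"))
```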

Relevance:

20.00%

Publisher:

Abstract:

Asthma is a chronic inflammatory disease of the airways that involves many cell types, amongst which mast cells are known to be important. Adenosine, a potent bronchoconstricting agent, acts on the adenosine receptors of mast cells, thereby potentiating mast cell-derived mediator release; histamine is one of the first mediators to be released. The heterogeneity of mast cell sources and the lack of highly potent ligands selective for the different adenosine receptor subtypes have been important hurdles in this area of research. In the present study we describe compound C0036E08, a novel ligand that has high affinity (pK(i) 8.46) for adenosine A(2B) receptors, being 9 times, 1412 times and 3090 times more selective for A(2B) receptors than for A(1), A(2A) and A(3) receptors, respectively. Compound C0036E08 showed antagonist activity at recombinant and native adenosine receptors, and it was able to fully block NECA-induced histamine release in freshly isolated mast cells from human bronchoalveolar fluid. C0036E08 thus proved a valuable tool for identifying adenosine A(2B) receptors as the adenosine receptors responsible for the NECA-induced response in human mast cells. Considering the increasing interest in A(2B) receptors as a therapeutic target in asthma, this chemical tool might provide a basis for the development of new anti-asthmatic drugs.
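Assuming the quoted selectivities are ratios of equilibrium dissociation constants (Ki), fold selectivity relates to pKi differences as fold = 10^(pKi_A2B − pKi_other); the short sketch below back-calculates the pKi values at the other subtypes implied by the figures in the abstract.

```python
# Back-calculate implied pKi values at A1, A2A and A3 from the reported
# A2B affinity (pKi 8.46) and fold selectivities, assuming the selectivity
# is a ratio of Ki values, i.e. fold = 10 ** (pKi_A2B - pKi_other).
from math import log10

pki_a2b = 8.46
fold_selectivity = {"A1": 9, "A2A": 1412, "A3": 3090}

for receptor, fold in fold_selectivity.items():
    implied_pki = pki_a2b - log10(fold)
    print(f"{receptor}: implied pKi ~ {implied_pki:.2f}")
# A1: ~7.51, A2A: ~5.31, A3: ~4.97
```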

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Iron deficiency is a common and undertreated problem in inflammatory bowel disease (IBD). AIM: To develop an online tool to support treatment choice at the patient-specific level. METHODS: Using the RAND/UCLA Appropriateness Method (RUAM), a European expert panel assessed the appropriateness of treatment regimens for a variety of clinical scenarios in patients with non-anaemic iron deficiency (NAID) and iron deficiency anaemia (IDA). Treatment options included adjustment of IBD medication only, oral iron supplementation, high-/low-dose intravenous (IV) regimens, IV iron plus erythropoietin-stimulating agent (ESA), and blood transfusion. The panel process consisted of two individual rating rounds (1148 treatment indications; 9-point scale) and three plenary discussion meetings. RESULTS: The panel reached agreement on 71% of treatment indications. 'No treatment' was never considered appropriate, and repeat treatment after previous failure was generally discouraged. For 98% of scenarios, at least one treatment was appropriate. Adjustment of IBD medication was deemed appropriate in all patients with active disease. Use of oral iron was mainly considered an option in NAID and mildly anaemic patients without disease activity. IV regimens were often judged appropriate, with high-dose IV iron being the preferred option in 77% of IDA scenarios. Blood transfusion and IV+ESA were indicated in exceptional cases only. CONCLUSIONS: The RUAM revealed high agreement amongst experts on the management of iron deficiency in patients with IBD. High-dose IV iron was more often considered appropriate than other options. To facilitate dissemination of the recommendations, panel outcomes were embedded in an online tool, accessible via http://ferroscope.com/.
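As a rough sketch of how the panel's 9-point ratings are typically condensed into an appropriateness category under the RAND/UCLA method, the snippet below uses the conventional cut-offs (median 1-3 inappropriate, 4-6 uncertain, 7-9 appropriate, with disagreement forcing "uncertain"); these defaults are an assumption, since the abstract does not spell out the exact disagreement rule this panel applied.

```python
# Hedged sketch of the conventional RAND/UCLA appropriateness classification.
# The cut-offs and the simple disagreement rule below are common defaults,
# not necessarily the exact rule used by this expert panel.
from statistics import median

def classify(ratings):
    """ratings: individual panelist scores on the 9-point scale."""
    # Simple disagreement rule: ratings fall in both extreme tertiles.
    disagreement = any(r <= 3 for r in ratings) and any(r >= 7 for r in ratings)
    m = median(ratings)
    if disagreement or 4 <= m <= 6:
        return "uncertain"
    return "appropriate" if m >= 7 else "inappropriate"

print(classify([8, 9, 7, 8, 9, 8, 7, 9, 8]))   # appropriate
print(classify([2, 3, 8, 9, 2, 8, 3, 9, 2]))   # uncertain (disagreement)
print(classify([1, 2, 2, 3, 1, 2, 3, 2, 1]))   # inappropriate
```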

Relevance:

20.00%

Publisher:

Abstract:

The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of these spectroscopic data comes from the fact that they are multivariate (a few thousand variables) and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the wavenumber ranges 650-1830 cm−1 and 2730-3600 cm−1, provided the optimal results for the infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to provide evidence of structure in the data, group samples into clusters, and detect outliers. With the FTIR spectra, the principal components (PCs) correspond to binder types and the presence or absence of calcium carbonate; 83% of the total variance is explained by the first four PCs. As for the Raman spectra, plotting the first two PCs, which account for 37% and 20% of the total variance respectively, reveals six clusters corresponding to the different pigment compositions. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and better efficiency, yielding robust results with time-saving data treatments.
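A minimal sketch of the preprocessing and exploration pipeline described above: SNV applied to each spectrum, restriction to the retained wavenumber windows, then PCA. The array shapes, random spectra and wavenumber axis are placeholders, not the study's data, and scikit-learn is used for the PCA step.

```python
# Sketch of the FTIR pipeline described above: SNV, wavenumber-window
# selection (650-1830 and 2730-3600 cm-1), then PCA. The spectra array and
# wavenumber axis are placeholders, not the study's actual data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = np.arange(400, 4001, 2)                  # hypothetical axis (cm-1)
spectra = rng.random((34, wavenumbers.size))           # 34 samples x variables

# Standard Normal Variate: centre and scale each spectrum individually.
snv = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

# Keep only the informative windows used in the study.
mask = ((wavenumbers >= 650) & (wavenumbers <= 1830)) | \
       ((wavenumbers >= 2730) & (wavenumbers <= 3600))
X = snv[:, mask]

pca = PCA(n_components=4)
scores = pca.fit_transform(X)
print("explained variance per PC:", pca.explained_variance_ratio_.round(2))
```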

Relevance:

20.00%

Publisher:

Abstract:

Résumé: The globalization of markets, changes in the economic environment and, finally, the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is commonly accepted today that these play a particularly strategic role in the organization. A company wishing to embark on a policy for managing these assets will face several problems. Indeed, in order to manage this knowledge and these competences, a long capitalization process must be carried out, involving steps such as the identification, extraction and representation of knowledge and competences. Various knowledge and competence management methods exist for this purpose, such as MASK, CommonKADS and KOD. Unfortunately, these methods are cumbersome to implement, are confined to certain types of knowledge and are consequently limited in the functionalities they can offer. Finally, competence management and knowledge management are treated as two separate fields, whereas it would be worthwhile to unify the two approaches into one. Indeed, competences are very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context". We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial of a company's knowledge assets, in particular for avoiding the loss of know-how or anticipating the company's future needs, because the efficiency of the organization lies behind the competences of its employees. Moreover, many other organizational concepts, such as jobs, missions, projects and training, can be described through competences. Unfortunately, there is no real consensus on the definition of competence, and the various existing definitions, even if they fully satisfy the experts, do not make it possible to build an operational system. In our approach, we address competence management by means of a knowledge management method. By their very nature, knowledge and competence are intimately linked, so such a method is well suited to competence management. In order to exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computational way. On this basis, we propose a methodology for building the various company repositories (competence, mission and job repositories, among others). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (training, missions, jobs and so on) onto these ontologies so that it can be exploited and disseminated.
Our approach to knowledge management and competence management made it possible to build a tool offering numerous functionalities such as the management of mobility areas, strategic analysis, directories and CV management. Abstract: The globalization of markets, the easing of economic regulation and, finally, the impact of new information and communication technologies have obliged firms to re-examine the way they manage their knowledge capital (knowledge management) and their human capital (competence management). It is commonly admitted that this knowledge plays a particularly strategic role in the organization. Firms that want to establish a policy for managing these assets will have to face various problems. To manage this knowledge, a long capitalization process must be carried out, involving steps such as the identification, extraction and representation of knowledge and competences. Several knowledge management methods exist, such as MASK, CommonKADS or KOD. Unfortunately, these methods are difficult to implement, cover only certain types of knowledge, and are consequently limited in the functionalities they can offer. Knowledge management and competence management are two separate domains that it would be worthwhile to unify. Indeed, competence is very close to knowledge, as this definition underlines: "a set of knowledge in action in a specified context". In our approach we chose to rely on the concept of competence. Competence is one of the most crucial forms of knowledge in a company, particularly for avoiding the loss of know-how or anticipating future needs, because company efficiency lies behind employees' competences. Unfortunately, there is no real consensus on the definition of competence, and the existing definitions do not make it possible to develop an operational system. Competence can also be used to describe other key concepts such as jobs, missions, projects and training. We approach the problems of competence management from the angle of knowledge management, since knowledge and competence are closely linked. We then propose a method for building the various company repositories (competence, job and project repositories). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while also supporting linguistic diversity. This repository-building method, coupled with the knowledge and competence management approach, enabled the realization of a tool offering functionalities such as mobility management, strategic analysis, yellow pages and CV management.
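The thesis's actual ontology is not reproduced in the abstract, so the sketch below only illustrates the general idea of modelling a competence repository as an ontology and mapping company concepts (jobs, missions, training) onto it, using the rdflib library; all class and property names are hypothetical.

```python
# Illustrative only: a tiny competence ontology and one mapping, expressed
# with rdflib. Class and property names are hypothetical; the thesis's own
# ontology and repositories are not reproduced in the abstract.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/competence#")
g = Graph()
g.bind("ex", EX)

# Core concepts of a competence repository (hypothetical).
for cls in (EX.Competence, EX.Job, EX.Mission, EX.Training):
    g.add((cls, RDF.type, RDFS.Class))
g.add((EX.requiresCompetence, RDF.type, RDF.Property))
g.add((EX.developsCompetence, RDF.type, RDF.Property))

# Map a few company concepts onto the ontology.
g.add((EX.DataAnalysis, RDF.type, EX.Competence))
g.add((EX.DataAnalysis, RDFS.label, Literal("Data analysis", lang="en")))
g.add((EX.BusinessAnalyst, RDF.type, EX.Job))
g.add((EX.BusinessAnalyst, EX.requiresCompetence, EX.DataAnalysis))
g.add((EX.StatisticsCourse, RDF.type, EX.Training))
g.add((EX.StatisticsCourse, EX.developsCompetence, EX.DataAnalysis))

print(g.serialize(format="turtle"))
```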

Relevance:

20.00%

Publisher:

Abstract:

Background and Aims: The international EEsAI study group is currently developing the first activity index specific to Eosinophilic Esophagitis (EoE). None of the existing dysphagia questionnaires takes into account the consistency of the ingested food, which considerably impacts symptom presentation. Goal: To develop an EoE-specific questionnaire assessing dysphagia associated with different food consistencies. Methods: Based on patient chart reviews, an expert panel (EEsAI study group) identified internationally standardized food prototypes typically associated with EoE-related dysphagia. Food consistencies were correlated with EoE-related dysphagia, also taking potential food avoidance into account. This Visual Dysphagia Questionnaire (VDQ) was then pilot-tested in 10 EoE patients. Results: The following 9 food consistency prototypes were identified: water, soft foods (pudding, jelly), grits, toast bread, French fries, dry rice, ground meat, raw fibrous foods (e.g. apple, carrot), and solid meat. Dysphagia was ranked on a 5-point Likert scale (0 = no difficulties, 5 = very severe difficulties, food will not pass). Severity of dysphagia in the 10 EoE patients was related to the eosinophil load and the presence of esophageal strictures. Conclusions: The VDQ will be the first EoE-specific tool for assessing dysphagia related to internationally defined food consistencies. It performed well in a pilot study and will now be further evaluated in a cohort study including 100 adult and 100 pediatric EoE patients.
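The abstract lists the nine food consistency prototypes and the Likert anchors but not how item scores are aggregated, so the sketch below simply records per-consistency ratings and a summed score as a purely hypothetical illustration of the questionnaire's structure.

```python
# Hypothetical illustration only: record a patient's VDQ ratings per food
# consistency prototype. The abstract does not specify how (or whether)
# item scores are aggregated, so the total below is purely illustrative.
FOOD_PROTOTYPES = [
    "water", "soft foods (pudding, jelly)", "grits", "toast bread",
    "French fries", "dry rice", "ground meat",
    "raw fibrous foods (e.g. apple, carrot)", "solid meat",
]

def record_vdq(ratings):
    """ratings: mapping of food prototype -> Likert rating (0 = no difficulties)."""
    missing = [f for f in FOOD_PROTOTYPES if f not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    return {"items": dict(ratings), "total": sum(ratings.values())}

example = record_vdq({f: 0 for f in FOOD_PROTOTYPES[:7]} | {
    "raw fibrous foods (e.g. apple, carrot)": 3,
    "solid meat": 4,
})
print(example["total"])  # 7
```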

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Adequate pain assessment is critical for evaluating the efficacy of analgesic treatment in clinical practice and during the development of new therapies. Yet the currently used scores of global pain intensity fail to reflect the diversity of pain manifestations and the complexity of underlying biological mechanisms. We have developed a tool for a standardized assessment of pain-related symptoms and signs that differentiates pain phenotypes independent of etiology. METHODS AND FINDINGS: Using a structured interview (16 questions) and a standardized bedside examination (23 tests), we prospectively assessed symptoms and signs in 130 patients with peripheral neuropathic pain caused by diabetic polyneuropathy, postherpetic neuralgia, or radicular low back pain (LBP), and in 57 patients with non-neuropathic (axial) LBP. A hierarchical cluster analysis revealed distinct association patterns of symptoms and signs (pain subtypes) that characterized six subgroups of patients with neuropathic pain and two subgroups of patients with non-neuropathic pain. Using a classification tree analysis, we identified the most discriminatory assessment items for the identification of pain subtypes. We combined these six interview questions and ten physical tests in a pain assessment tool that we named Standardized Evaluation of Pain (StEP). We validated StEP for the distinction between radicular and axial LBP in an independent group of 137 patients. StEP identified patients with radicular pain with high sensitivity (92%; 95% confidence interval [CI] 83%-97%) and specificity (97%; 95% CI 89%-100%). The diagnostic accuracy of StEP exceeded that of a dedicated screening tool for neuropathic pain and spinal magnetic resonance imaging. In addition, we were able to reproduce subtypes of radicular and axial LBP, underscoring the utility of StEP for discerning distinct constellations of symptoms and signs. CONCLUSIONS: We present a novel method of identifying pain subtypes that we believe reflect underlying pain mechanisms. We demonstrate that this new approach to pain assessment helps separate radicular from axial back pain. Beyond diagnostic utility, a standardized differentiation of pain subtypes that is independent of disease etiology may offer a unique opportunity to improve targeted analgesic treatment.
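The actual StEP items and the study data are not reproduced here; the sketch below only illustrates the kind of classification-tree analysis the abstract mentions, run on a small synthetic set of binary symptom/sign items with scikit-learn, together with the sensitivity/specificity summary used to describe the tool's performance.

```python
# Sketch of the classification-tree step on synthetic data: binary
# symptom/sign items as features, radicular vs. axial LBP as the label.
# Items and data are invented; the real StEP items are not reproduced here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n_patients, n_items = 200, 10                           # hypothetical sizes
X = rng.integers(0, 2, size=(n_patients, n_items))      # present/absent items
# Synthetic label loosely driven by two "discriminatory" items.
y = ((X[:, 0] + X[:, 3] + rng.random(n_patients) * 0.5) > 1.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

pred = tree.predict(X_test)
sensitivity = ((pred == 1) & (y_test == 1)).sum() / (y_test == 1).sum()
specificity = ((pred == 0) & (y_test == 0)).sum() / (y_test == 0).sum()
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")
print("most discriminatory items:", np.argsort(tree.feature_importances_)[::-1][:3])
```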

Relevance:

20.00%

Publisher:

Abstract:

The discovery of innate immune genes, such as those encoding Toll-like receptors (TLRs), nucleotide-binding oligomerisation domain-like receptors (NLRs), and related signal-transducing molecules, has led to a substantial improvement of our understanding of innate immunity. Recent immunogenetic studies have associated polymorphisms of the genes encoding TLRs, NLRs, and key signal-transducing molecules, such as interleukin-1 receptor-associated kinase 4 (IRAK4), with increased susceptibility to, or outcome of, infectious diseases. With the availability of high-throughput genotyping techniques, it is becoming increasingly evident that analyses of genetic polymorphisms of innate immune genes will further improve our knowledge of the host antimicrobial defence response and help in identifying individuals who are at increased risk of life-threatening infections. This is likely to open new perspectives for the development of diagnostic, predictive, and preventive management strategies to combat infectious diseases.