938 results for Ontology Visualization
Abstract:
The project consisted in the design and implementation of an architecture/platform that integrates the medical-image storage and post-processing services offered by the group, as well as visualization, anonymization, file transfer..., based on a web interface as the platform's frontend. The services that require graphical interaction were implemented using techniques for exporting a desktop remotely to the web, while the others were implemented so that they run on the cluster of machines available at the PIC.
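The abstract gives no implementation details; purely as an illustration of one of the services it names (anonymization), the Python sketch below blanks identifying tags in a DICOM file with pydicom. The tag list and file names are assumptions for the example and are not part of the project described.

```python
# Illustrative sketch only: one possible anonymization step for a
# medical-imaging platform; tag selection and paths are assumptions.
import pydicom

# Tags commonly cleared during anonymization (assumed minimal set).
IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate", "PatientAddress"]

def anonymize_dicom(in_path: str, out_path: str) -> None:
    """Read a DICOM file, blank common identifying tags, and save a copy."""
    ds = pydicom.dcmread(in_path)
    for keyword in IDENTIFYING_TAGS:
        if hasattr(ds, keyword):      # pydicom exposes elements as keyword attributes
            setattr(ds, keyword, "")  # blank the value instead of deleting the element
    ds.remove_private_tags()          # drop vendor-specific private tags
    ds.save_as(out_path)

if __name__ == "__main__":
    anonymize_dicom("study.dcm", "study_anon.dcm")  # hypothetical file names
```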
Abstract:
The globalization of markets, the changes in the economic context and, finally, the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and their human capital (competence management). It is now commonly accepted that these assets play a particularly strategic role in the organization. A company wishing to put in place a policy for managing these assets will face several problems. Indeed, managing this knowledge and these competences requires a long capitalization process, which goes through several stages such as the identification, extraction and representation of knowledge and competences. Several knowledge and competence management methods exist for this purpose, such as MASK, CommonKADS and KOD... Unfortunately, these methods are very cumbersome to implement, are restricted to certain types of knowledge and are therefore limited in the functionalities they can offer. Finally, competence management and knowledge management are treated as two separate fields, whereas it would be worthwhile to unify the two approaches into one. Competences are indeed very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context". We therefore chose to base our proposal on the concept of competence. Among a company's knowledge assets, competence is one of the most crucial, in particular to avoid the loss of know-how and to anticipate the company's future needs, because behind the competences of employees lies the efficiency of the organization. Moreover, many other organizational concepts can be described through competences, such as jobs, missions, projects, trainings... Unfortunately, there is no real consensus on the definition of competence, and the existing definitions, even when fully satisfactory to experts, do not make it possible to build an operational system. In our approach, we address competence management using a knowledge management method: by their very nature, knowledge and competence are intimately linked, so such a method is well suited to competence management. In order to exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computable way. On this basis, we propose a methodology for building the company's various repositories (competence, mission and job repositories...). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (trainings, missions, jobs...) onto these ontologies so that it can be exploited and disseminated.
Our approach to knowledge management and competence management has led to a tool offering numerous functionalities such as the management of mobility areas, strategic analysis, directories and CV management. Abstract: The globalization of markets, the easing of economic regulation and, finally, the impact of new information and communication technologies have forced firms to re-examine the way they manage their knowledge capital (knowledge management) and their human capital (competence management). It is commonly accepted that knowledge plays a particularly strategic role in the organization. Firms that want to establish a policy for managing these assets will have to face various problems. To manage this knowledge, a long capitalization process must be carried out, involving steps such as the identification, extraction and representation of knowledge and competences. Several knowledge management methods exist, such as MASK, CommonKADS or KOD. Unfortunately, these methods are very difficult to implement, cover only certain types of knowledge and are consequently limited in the functionalities they can offer. Knowledge management and competence management are two separate domains, although it would be worthwhile to unify them into one. Indeed, competence is very close to knowledge, as this definition underlines: "a set of knowledge in action in a specified context". In our approach, we chose to rely on the concept of competence. Competence is one of a company's most crucial knowledge assets, particularly for avoiding the loss of know-how and for anticipating future needs, because behind employees' competences lies the company's efficiency. Unfortunately, there is no real consensus on the definition of the concept of competence, and the existing definitions do not make it possible to develop an operational system. Competence can also be used to describe other key organizational concepts such as jobs, missions, projects and trainings... We approach the various problems of competence management from the angle of knowledge management, since knowledge and competence are closely linked. We then propose a method to build the various company repositories (competence, job and project repositories). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while also supporting linguistic diversity. This repository-building method, coupled with our knowledge and competence management approach, led to a tool offering functionalities such as mobility management, strategic analysis, yellow pages and CV management.
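The abstract stays at the methodological level, but the repository-modelling step lends itself to a small illustration. The Python sketch below declares a minimal competence ontology with rdflib; the namespace, class names and properties are invented for the example and are not taken from the thesis.

```python
# Minimal sketch of a competence ontology; the vocabulary and namespace
# are invented for illustration, not prescribed by the thesis.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/competence#")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# Core organizational concepts named in the abstract, modelled as OWL classes.
for cls in ("Competence", "Job", "Mission", "Training"):
    g.add((EX[cls], RDF.type, OWL.Class))

# Illustrative links: a job requires competences, a training develops them.
g.add((EX.requiresCompetence, RDF.type, OWL.ObjectProperty))
g.add((EX.requiresCompetence, RDFS.domain, EX.Job))
g.add((EX.requiresCompetence, RDFS.range, EX.Competence))

g.add((EX.developsCompetence, RDF.type, OWL.ObjectProperty))
g.add((EX.developsCompetence, RDFS.domain, EX.Training))
g.add((EX.developsCompetence, RDFS.range, EX.Competence))

# One concrete competence with a human-readable label.
g.add((EX.JavaProgramming, RDF.type, EX.Competence))
g.add((EX.JavaProgramming, RDFS.label, Literal("Java programming", lang="en")))

print(g.serialize(format="turtle"))
```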
Abstract:
Coronary magnetic resonance angiography (MRA) is a technique aimed at establishing a noninvasive test for the assessment of significant coronary stenoses. There are certain boundary conditions that have hampered the clinical success of coronary MRA and coronary vessel wall imaging. Recent advances in hardware and software allow for consistent visualization of the proximal and mid portions of the native coronary arteries. Current research focuses on the use of intravascular MR contrast agents and black blood coronary angiography. One common goal is to create a noninvasive test which might allow for screening for major proximal and mid coronary artery disease. These novel approaches will represent a major step forward in diagnostic cardiology.
Abstract:
Brain deformations induced by space-occupying lesions may result in unpredictable position and shape of functionally important brain structures. The aim of this study is to propose a method for segmentation of brain structures by deformation of a segmented brain atlas in the presence of a space-occupying lesion. Our approach is based on an a priori model of lesion growth (MLG) that assumes radial expansion from a seeding point and involves three steps: first, an affine registration bringing the atlas and the patient into global correspondence; then, the seeding of a synthetic tumor into the brain atlas providing a template for the lesion; finally, the deformation of the seeded atlas, combining a method derived from optical flow principles and a model of lesion growth. The method was applied to two meningiomas inducing a pure displacement of the underlying brain structures, and segmentation accuracy of ventricles and basal ganglia was assessed. Results show that the segmented structures were consistent with the patient's anatomy and that the deformation accuracy of surrounding brain structures was highly dependent on the accurate placement of the tumor seeding point. Further improvements of the method will optimize the segmentation accuracy. Visualization of brain structures provides useful information for therapeutic consideration of space-occupying lesions, including surgical, radiosurgical, and radiotherapeutic planning, in order to increase treatment efficiency and prevent neurological damage.
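The lesion-growth model is described only qualitatively (radial expansion from a seeding point). The Python sketch below builds a radial displacement field with numpy to make that idea concrete; the exponential decay law and its parameters are assumptions for illustration, not the authors' formulation.

```python
# Illustrative radial "lesion growth" displacement field: voxels are pushed
# away from a seed point, with a magnitude that decays with distance.
# The decay law and parameters are assumed, not taken from the paper.
import numpy as np

def radial_displacement_field(shape, seed, max_disp=5.0, sigma=20.0):
    """Return a (3, Z, Y, X) array of displacements in voxel units."""
    grid = np.indices(shape).astype(float)               # voxel coordinates
    offset = grid - np.array(seed).reshape(3, 1, 1, 1)   # vectors from the seed
    dist = np.linalg.norm(offset, axis=0)
    dist = np.maximum(dist, 1e-6)                        # avoid division by zero at the seed
    unit = offset / dist                                 # radial unit vectors
    magnitude = max_disp * np.exp(-dist / sigma)         # assumed decay with distance
    return unit * magnitude

field = radial_displacement_field((64, 64, 64), seed=(32, 32, 32))
print(field.shape)  # (3, 64, 64, 64)
```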
Abstract:
PADICAT is the web archive created in 2005 in Catalonia (Spain) by the Library of Catalonia (BC), the national library of Catalonia, with the aim of collecting, processing and providing permanent access to the digital heritage of Catalonia. Its harvesting strategy is based on a hybrid model (massive harvesting of the SPA top-level domain; selective compilation of the web site output of Catalan organizations; focused harvesting of public events). The system provides open access to the whole collection on the Internet. We consider it necessary to complement the current search and visualization software with a new open-source software tool, CAT (Curator Archiving Tool), composed of three modules aimed at effectively managing the processes of human cataloguing; publishing directories of the digital resources and special collections; and offering statistical information of added value to end users. Within the framework of the International Internet Preservation Consortium meeting (Vienna 2010), the progress in the development of this new tool, and the philosophy that has motivated its design, are presented to the international community.
Abstract:
The aim of this study was to illustrate the chest radiograph (CR) and CT imaging features and sequential findings of cavitary necrosis in complicated childhood pneumonia. Among 30 children admitted to the Pediatric Intensive Care Unit for persistent or progressive pneumonia, respiratory distress or sepsis despite adequate antibiotic therapy, a study group of 9 children (5 girls and 4 boys; mean age 4 years) who had the radiographic features and CT criteria of pneumonia complicated by cavitary necrosis was identified. The pathogens identified were Streptococcus pneumoniae (n=4), Aspergillus (n=2), Legionella (n=1), and Staphylococcus aureus (n=1). Sequential CR and CT scans were retrospectively reviewed. Follow-up CR and CT were evaluated for persistent abnormalities. Chest radiographs showed consolidations in 8 of the 9 patients. On CT examination, cavitary necrosis was localized to 1 lobe in 2 patients, and 7 patients showed multilobar or bilateral areas of cavitary necrosis. In 3 of the 9 patients, cavitary necrosis was initially shown on CT and its visualization by CR was delayed by 5 to 9 days. In all patients with cavities, a mean of five cavities was seen on antero-posterior CR, contrasting with the multiple cavities seen on CT. Parapneumonic effusions were shown by CR in 3 patients and by CT in 5 patients. Bronchopleural fistulae were demonstrated by CT alone (n=3). No purulent pericarditis was demonstrated. CT showed persistent residual pneumatoceles of the left lower lobe in 2 patients. Computed tomography is able to define a more specific pattern of abnormalities than conventional CR in children with necrotizing pneumonia and allows an earlier diagnosis of this rapidly progressing condition. Lung necrosis and cavitation may also be associated with Aspergillus or Legionella pneumonia in the pediatric population.
Abstract:
The central question of this work is that of the relationship between environmental finitude and individual freedom. By environmental finitude we mean the set of varied ecological constraints that place limits on human action. These limits are of two general types: limits on the availability of natural resources, and limits on the carrying capacity of ecosystems and of the major global biogeochemical cycles (chapter 1). The thesis defended here is that libertarian and liberal conceptions of freedom are in conflict with the need to take such limits into account, and that a neo-republican approach is better able to respond to these ecological challenges. Libertarian theories, whether right-wing or left-wing, are ill-suited to taking the finitude of natural resources into account because they maintain a right of individuals to appropriate those resources without limit. This point contradicts the systemic nature of scarcity and the absence of substitutes for certain resources indispensable to the pursuit of a decent life (chapters 2 and 3). The liberal theory of neutrality, supported by the harm principle, is in turn ill-suited to addressing global environmental problems such as climate change. The causal mechanisms leading to environmental damage are indirect and diffuse, which prevents the assignment of responsibility at the individual level. The justification of binding environmental policies is thereby put at risk (chapter 4). These difficulties stem above all from two characteristic features of these doctrines: their atomistic social ontology and their conception of freedom as freedom of choice. Philip Pettit's neo-republicanism makes it possible to address both problems thanks to its holist ontology and its conception of freedom as non-domination. This theory therefore both offers a conception of freedom compatible with environmental finitude and justifies demanding environmental policies, without the sacrifice in terms of freedom appearing too great (chapter 5).
The central issue of this work is that of the relationship between environmental finiteness and individual liberty. By environmental finiteness one should understand the set of diverse ecological constraints that limit human action. These limits are of two general kinds: on the one hand the availability of natural resources, and on the other hand the carrying capacity of ecosystems and biogeochemical cycles (chapter 1). The thesis defended here is that libertarian and liberal conceptions of liberty conflict with the necessity to take such limits into account, and that a neo-republican approach is best suited to address environmental issues. Libertarian theories, right-wing as well as left-wing, are in particular not able to take resource scarcity into account because they argue for an unlimited right of individuals to appropriate those resources. This point is in contradiction with the systemic nature of scarcity and with the absence of substitutes for some essential resources (chapters 2 and 3). The liberal doctrine of neutrality, as associated with the harm principle, is unsuitable when addressing global environmental issues like climate change. Causal mechanisms leading to environmental harm are indirect and diffuse, which prevents the assignment of individual responsibilities. This makes the justification of coercive environmental policies difficult (chapter 4). These difficulties stem above all from two characteristic features of libertarian and liberal doctrines: their atomistic social ontology and their conception of freedom as liberty of choice. Philip Pettit's neo-republicanism, on the other hand, is able to address these problems thanks to its holist social ontology and its conception of liberty as non-domination. This doctrine offers a conception of liberty compatible with environmental limits and theoretical resources able to justify demanding environmental policies without sacrificing too much in terms of liberty (chapter 5).
Abstract:
Using a template, we have produced a set of teaching materials, whose object of study is the 2006 Statute of Autonomy of Catalonia from a legal perspective, corresponding essentially to the course "Institucions Polítiques de Catalunya" in the fourth year of the Law degree at the UAB. The materials have been published on the website http://www.institucionspolitiques.com. The aim of the teaching initiative has been to consolidate an interactive tool that fosters students' creativity and cooperative work, allows a more active role for the teaching staff and, finally, facilitates the adaptation of teaching to the European Higher Education Area. The experience has been complemented with the use of the bimodal method (the UAB virtual campus), with visits to public institutions and with regular attendance, as members of the audience, at television programmes related to the subject. The template used to develop the various topics of the syllabus has been the following: 1. Writing of the topic; 2. Visualization of concepts: synoptic tables, diagrams, charts; 3. Basic bibliography; 4. Legislation and case law; 5. Reproduced text; 6. Questions; 7. Topics for debate; 8. Test; 9. Complementary audiovisual materials; 10. Useful links on the web. The experience was carried out over the last two academic years, 2006/07 and 2007/08, and revolved around the preparation and application of the materials produced. It has been both theoretical and practical. Special attention has been paid to students' work outside the classroom, with the corresponding prior study and collaboration in the preparation of materials. It is also worth noting that we have managed to increase student participation in the classroom, and in this way the role of the teacher has become more dynamic. The materials have given students easy and fast access, with the aim of making face-to-face sessions more interactive, and have connected the university with society, as well as developing civic attitudes and awareness of the country in a subject as sensitive as the study of the public law of Catalonia.
Abstract:
Genetic determinants of blood pressure are poorly defined. We undertook a large-scale, gene-centric analysis to identify loci and pathways associated with ambulatory systolic and diastolic blood pressure. We measured 24-hour ambulatory blood pressure in 2020 individuals from 520 white European nuclear families (the Genetic Regulation of Arterial Pressure of Humans in the Community Study) and genotyped their DNA using the Illumina HumanCVD BeadChip array, which contains ≈50 000 single nucleotide polymorphisms in >2000 cardiovascular candidate loci. We found a strong association between the rs13306560 polymorphism in the promoter region of MTHFR and CLCN6 and mean 24-hour diastolic blood pressure; each minor allele copy of rs13306560 was associated with 2.6 mm Hg lower mean 24-hour diastolic blood pressure (P=1.2×10⁻⁸). rs13306560 was also associated with clinic diastolic blood pressure in a combined analysis of 8129 subjects from the Genetic Regulation of Arterial Pressure of Humans in the Community Study, the CoLaus Study, and the Silesian Cardiovascular Study (P=5.4×10⁻⁶). Additional analysis of associations between variants in gene ontology-defined pathways and mean 24-hour blood pressure in the Genetic Regulation of Arterial Pressure of Humans in the Community Study showed that cell survival control signaling cascades could play a role in blood pressure regulation. There was also a significant overrepresentation of rare variants (minor allele frequency <0.05) among polymorphisms showing at least nominal association with mean 24-hour blood pressure, indicating that a considerable proportion of its heritability may be explained by uncommon alleles. Through a large-scale gene-centric analysis of ambulatory blood pressure, we identified an association of a novel variant at the MTHFR/CLCN6 locus with diastolic blood pressure and provided new insights into the genetic architecture of blood pressure.
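The reported effect (2.6 mm Hg lower diastolic pressure per minor allele copy) corresponds to an additive per-allele model. As a hedged illustration of such a model, the Python sketch below fits an ordinary least-squares regression of a simulated phenotype on minor-allele counts with statsmodels; the data are synthetic and the covariate and family-structure adjustments of the actual study are omitted.

```python
# Sketch of an additive (per-allele) association test on synthetic data;
# this is not the study's analysis, which also adjusted for covariates
# and family structure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
maf = 0.15                                   # assumed minor allele frequency
genotype = rng.binomial(2, maf, size=n)      # 0, 1 or 2 minor allele copies
# Simulate diastolic BP with a -2.6 mm Hg effect per minor allele (assumed noise level).
dbp = 78.0 - 2.6 * genotype + rng.normal(0.0, 8.0, size=n)

X = sm.add_constant(genotype.astype(float))  # intercept + allele count
model = sm.OLS(dbp, X).fit()
print(model.params)    # estimated intercept and per-allele effect
print(model.pvalues)   # p-value for the additive genetic term
```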
Abstract:
Over the past decades, several sensitive post-electrophoretic stains have been developed for the identification of proteins in general, or for the specific detection of post-translational modifications such as phosphorylation, glycosylation or oxidation. Yet, for the visualization and quantification of protein differences, differential two-dimensional gel electrophoresis, termed DIGE, has become the method of choice for detecting differences between two sets of proteomes. The goal of this review is to evaluate the use of the most common non-covalent and covalent staining techniques in 2D electrophoresis gels, in order to obtain maximal information per electrophoresis gel and to identify potential biomarkers. We also discuss the use of detergents during covalent labeling and the identification of oxidative modifications, and review the influence of detergents on fingerprint analysis and MS/MS identification in relation to 2D electrophoresis.
Abstract:
Somatic copy number aberrations (CNA) represent a mutation type encountered in the majority of cancer genomes. Here, we present the 2014 edition of arrayMap (http://www.arraymap.org), a publicly accessible collection of pre-processed oncogenomic array data sets and CNA profiles, representing a vast range of human malignancies. Since the initial release, we have enhanced this resource both in content and especially with regard to data mining support. The 2014 release of arrayMap contains more than 64,000 genomic array data sets, representing about 250 tumor diagnoses. Data sets included in arrayMap have been assembled from public repositories as well as additional resources, and integrated by applying custom processing pipelines. Online tools have been upgraded for more flexible visualization of array data, including options for processing user-provided, non-public data sets. Data integration has been improved by mapping to multiple editions of the human reference genome, with the majority of the data now being available for the UCSC hg18 as well as GRCh37 versions. The large amount of tumor CNA data in arrayMap can be freely downloaded by users to promote data mining projects and to explore special events such as chromothripsis-like genome patterns.
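The abstract notes that the CNA data can be freely downloaded for local data mining. As one illustration of downstream processing, the Python sketch below summarizes gains and losses from a tab-separated segment file with pandas; the file name, column names and log2-ratio threshold are assumptions, not arrayMap's actual export schema.

```python
# Sketch of summarizing downloaded copy-number segments with pandas.
# The file name and column layout (chro, start, stop, value) are assumed,
# not taken from arrayMap's documentation.
import pandas as pd

segments = pd.read_csv("cna_segments.tsv", sep="\t")

# Flag gains and losses using an assumed log2-ratio threshold of +/- 0.2.
segments["status"] = pd.cut(
    segments["value"],
    bins=[float("-inf"), -0.2, 0.2, float("inf")],
    labels=["loss", "neutral", "gain"],
)

# Total genomic length affected by gains and losses, per chromosome.
segments["length"] = segments["stop"] - segments["start"]
summary = (
    segments[segments["status"] != "neutral"]
    .groupby(["chro", "status"], observed=True)["length"]
    .sum()
)
print(summary)
```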
Abstract:
Mucus and lymph smears collected from leprosy patients (9) and their household contacts (44) in the Caño Mochuelo Indian Reservation, Casanare, Colombia, were examined with monoclonal antibodies (MoAb) against Mycobacterium leprae. The individuals studied were 5 borderline leprosy (BB) patients and 4 lepromatous leprosy (LL) patients, all of whom were undergoing epidemiological surveillance after treatment, and 44 household contacts: 21 contacts of the LL patients and 23 contacts of the BB patients. The MoAb were reactive with the following M. leprae antigens: the 65 kDa heat shock protein (A6), the soluble antigen (G7) and the complete antigen (E11). All samples were tested with each of the MoAb using the avidin-biotin-peroxidase technique with 3,3'-diaminobenzidine as chromogen. The patients and household contacts studied were all recorded as Ziehl-Neelsen stain negative. The MoAb that showed the optimal reaction was G7, which permitted good visualization of the bacilli. Five patients with a BB diagnosis and one with LL were positive for G7; of the BB patients' household contacts, 9 were positive for G7, and 7 of the LL patients' household contacts were positive for the same MoAb. MoAb G7 allowed the detection of bacillary structures compatible with Mycobacterium spp. in both patients and household contacts. G7 permitted visualization of the complete bacillus and could be used for early diagnosis and follow-up of the disease in patients.
Abstract:
Sequential stages in the life cycle of the ionotropic 5-HT(3) receptor (5-HT(3)R) were resolved temporally and spatially in live cells by multicolor fluorescence confocal microscopy. The insertion of the enhanced cyan fluorescent protein into the large intracellular loop delivered a fluorescent 5-HT(3)R fully functional in terms of ligand binding specificity and channel activity, which allowed for the first time a complete real-time visualization and documentation of intracellular biogenesis, membrane targeting, and ligand-mediated internalization of a receptor belonging to the ligand-gated ion channel superfamily. Fluorescence signals of newly expressed receptors were detectable in the endoplasmic reticulum about 3 h after transfection onset. At this stage receptor subunits assembled to form active ligand binding sites, as demonstrated in situ by binding of a fluorescent 5-HT(3)R-specific antagonist. After novel protein synthesis was chemically blocked, the 5-HT(3)R populations in the endoplasmic reticulum and Golgi cisternae moved virtually quantitatively to the cell surface, indicating efficient receptor folding and assembly. Intracellular 5-HT(3) receptors were trafficked in vesicle-like structures along microtubules to the cell surface at a velocity generally below 1 μm/s and were inserted into the plasma membrane in a characteristic cluster distribution overlapping with actin-rich domains. Internalization of cell surface 5-HT(3) receptors was observed within minutes after exposure to an extracellular agonist. Our orchestrated use of spectrally distinguishable fluorescent labels for the receptor, its cognate ligand, and specific organelle markers can be regarded as a general approach allowing subcellular insights into dynamic processes of membrane receptor trafficking.
Abstract:
Amino acids form the building blocks of all proteins. Naturally occurring amino acids are restricted to a few tens of sidechains, even when considering post-translational modifications and rare amino acids such as selenocysteine and pyrrolysine. However, the potential chemical diversity of amino acid sidechains is nearly infinite. Exploiting this diversity by using non-natural sidechains to expand the building blocks of proteins and peptides has recently found widespread applications in biochemistry, protein engineering and drug design. Despite these applications, there is currently no unified online bioinformatics resource for non-natural sidechains. With the SwissSidechain database (http://www.swisssidechain.ch), we offer a central and curated platform about non-natural sidechains for researchers in biochemistry, medicinal chemistry, protein engineering and molecular modeling. SwissSidechain provides biophysical, structural and molecular data for hundreds of commercially available non-natural amino acid sidechains, both in L- and D-configurations. The database can be easily browsed by sidechain names, families or physico-chemical properties. We also provide plugins to seamlessly insert non-natural sidechains into peptides and proteins using molecular visualization software, as well as topologies and parameters compatible with molecular mechanics software.
Abstract:
MOTIVATION: The anatomy of model species is described in ontologies, which are used to standardize the annotations of experimental data, such as gene expression patterns. To compare such data between species, we need to establish relations between the ontologies describing different species. RESULTS: We present a new algorithm, and its implementation in the software Homolonto, to create new relationships between anatomical ontologies, based on the homology concept. Homolonto uses a supervised ontology alignment approach. Several alignments can be merged, forming homology groups. We also present an algorithm to generate relationships between these homology groups. This has been used to build a multi-species ontology for Bgee, a database of gene expression evolution. AVAILABILITY: download section of the Bgee website http://bgee.unil.ch/
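Homolonto merges pairwise ontology alignments into homology groups. The Python sketch below illustrates only that grouping step, treating validated pairwise matches as graph edges and taking connected components with networkx; the term names are invented placeholders and this is not the Homolonto implementation.

```python
# Sketch: merge validated pairwise matches between anatomical ontology
# terms into homology groups via connected components.
# The term identifiers are invented; this is not the Homolonto code.
import networkx as nx

pairwise_matches = [
    ("zebrafish:brain", "xenopus:brain"),
    ("xenopus:brain", "fly:brain"),
    ("zebrafish:pectoral_fin", "xenopus:forelimb"),
]

g = nx.Graph()
g.add_edges_from(pairwise_matches)

# Each connected component becomes one homology group.
homology_groups = [sorted(component) for component in nx.connected_components(g)]
for i, group in enumerate(homology_groups, start=1):
    print(f"homology group {i}: {group}")
```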