952 results for knowledge application


Relevance:

30.00%

Publisher:

Abstract:

This thesis sets out to evaluate the architecture of the "Molecules of Knowledge" (MoK) model, to implement it on a suitably verified and extended TuCSoN infrastructure, and to run experiments with MoK systems in application scenarios such as news management systems.
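
As an illustration of the biochemical coordination style underlying MoK, the following is a minimal Python sketch of knowledge "molecules" decaying and diffusing between compartments. The class names, reaction rates, and the news-atom format are hypothetical, and the real model runs on TuCSoN tuple centres (which are Java artifacts), not in Python.

```python
import random

# Illustrative sketch of MoK-style "chemical" coordination (hypothetical names;
# the actual model runs on TuCSoN tuple centres, not on this toy class).
class Compartment:
    def __init__(self, name):
        self.name = name
        self.molecules = {}          # knowledge molecule -> concentration
        self.neighbours = []

    def inject(self, molecule, concentration=1.0):
        self.molecules[molecule] = self.molecules.get(molecule, 0.0) + concentration

    def step(self, decay=0.05, diffusion=0.10):
        """One 'reaction' step: molecules decay and diffuse to a neighbour."""
        for mol, conc in list(self.molecules.items()):
            conc *= (1.0 - decay)                    # decay reaction
            if self.neighbours:
                moved = conc * diffusion             # diffusion reaction
                random.choice(self.neighbours).inject(mol, moved)
                conc -= moved
            self.molecules[mol] = conc

# Two compartments, e.g. two sources in a news-management scenario
a, b = Compartment("source_a"), Compartment("source_b")
a.neighbours.append(b)
a.inject("atom(news, 'storm hits coast')")
for _ in range(10):
    a.step()
print(a.molecules, b.molecules)
```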

Relevance:

30.00%

Publisher:

Abstract:

The present study aims at assessing the innovation strategies adopted within a regional economic system, the Italian region Emilia-Romagna, as it faced the challenges of a changing international scenario. As the strengthening of regional innovative capabilities is regarded as a keystone of a new phase of economic growth, it is also important to understand how the local industrial, institutional, and academic actors have tackled the problem of innovation in the recent past. In this study we explore the approaches to innovation and the strategies adopted by the main regional actors through three different case studies. Chapter 1 provides a general survey of the innovative performance of the regional industries over the past two decades, as it emerges from statistical data and systematic comparisons at the national and European levels. The chapter also discusses the innovation policies that the regional government has set up since 2001 in order to strengthen the collaboration among local economic actors, including universities and research centres. As mechanics is the most important regional industry, Chapter 2 analyses the combination of knowledge and practices employed from the 1960s to the 1990s in the design of a particular kind of machinery produced by G.D S.p.A., a world leader in the market of tobacco packaging machines. G.D is based in Bologna, the region's capital, and is at the centre of the most important Italian packaging district. Chapter 3 turns to the institutional level, focusing on how the local public administrations and the local, publicly owned utility companies dealt with the creation of new telematic networks on the regional territory during the 1990s and 2000s. Finally, Chapter 4 assesses the technology transfer carried out by the main university of the region, the University of Bologna, by focusing on the patenting activities involving its research personnel in the period 1960-2010.

Relevance:

30.00%

Publisher:

Abstract:

This study focuses on the processes of change that firms undertake to overcome conditions of organizational rigidity and develop new dynamic capabilities, thanks to the contribution of external knowledge. When external contingencies expose firms' core rigidities, external actors can intervene in change projects, providing new competences to firms' managers. Knowledge transfer and organizational learning processes can then lead to the development of new dynamic capabilities. The existing literature does not fully explain how these processes unfold and how external knowledge providers, such as management consultants, influence them. The dynamic capabilities literature has grown very rich in recent years; however, the models that explain how dynamic capabilities evolve remain under-investigated. Adopting a qualitative approach, this research proposes four relevant case studies in which external actors introduce new knowledge within organizations, activating processes of change. Each case study consists of a management consulting project. Data were collected through in-depth interviews with consultants and managers, and a large body of documents supports the evidence from the interviews. A narrative approach is adopted to account for the change processes, and a synthetic approach is proposed to compare the case studies along relevant dimensions. This study presents a model of capability evolution, supported by empirical evidence, to explain how external knowledge intervenes in capability evolution processes: first, external actors close the gaps between environmental demands and firms' capabilities by changing organizational structures and routines; second, a knowledge transfer between consultants and managers leads to the creation of new ordinary capabilities; third, managers can develop new dynamic capabilities through a deliberate learning process that internalizes new tacit knowledge from the consultants. After the end of the consulting project, two elements can influence the deliberate learning process: new external contingencies and changes in the perceptions of the external actors.

Relevance:

30.00%

Publisher:

Abstract:

The aim of the present work is to contribute to a better understanding of the relation between organization theory and management practice. It is organized as a collection of two papers: a theoretical and conceptual contribution and an ethnographic study. The first paper is concerned with systematizing different literatures, inside and outside the field of organization studies, that deal with the theory-practice relation. After identifying a series of positions in the theory-practice debate and unfolding some of their implicit assumptions and limitations, a new position called entwinement is developed in order to overcome the status quo through reconciliation and integration. Accordingly, the paper proposes to reconceptualize theory and practice as a circular, iterative process of action and cognition, science and common sense, enacted in the real world by both organization scholars and practitioners according to the purposes at hand. The second paper is an ethnographic study of an encounter between two groups of expert academics and practitioners occasioned by a one-year executive business master's programme at an international business school. The research articulates a process view of the knowledge exchange between management academics and practitioners in particular, and between individuals belonging to different communities of practice in general, and emphasizes its dynamic, relational, and transformative mechanisms. Findings show that, when given the chance to interact, academics and practitioners set up local provisional relations that enable them to act as change intermediaries vis-a-vis each other's worlds, without tying themselves irremediably to each other or to the scenarios they jointly projected during the master's programme. Finally, the study shows that these provisional relations were accompanied by a recursive shift in knowledge modes: while interacting, academics passed from theory to practical theorizing, practitioners passed from an involved practical mode to a reflexive and quasi-theoretical one, and then, as the exchanges proceeded, the other way around.

Relevance:

30.00%

Publisher:

Abstract:

NGAL (Neutrophil Gelatinase-Associated Lipocalin) is a protein of the lipocalin superfamily. Recent literature has focused on its function as a biomarker in several pathological conditions (acute and chronic kidney damage, autoimmune disease, malignancy), but NGAL's biological role is not well elucidated. Its bacteriostatic role has been demonstrated repeatedly, and recent papers have highlighted a role for NGAL in NF-κB modulation. The aim of this study is to understand whether NGAL may exert a role in the activation (modulation) of the T cell response through the regulation of the HLA-G complex, a mediator of tolerance. We obtained peripheral blood mononuclear cells (PBMCs) from 8 healthy donors and isolated them by centrifugation on a Ficoll gradient. Cells were then treated with four concentrations of NGAL (40-320 ng/ml), with or without iron, and analysed by flow cytometry and ELISA. NGAL increased HLA-G expression on CD4+ T cells in a dose-dependent manner; the effect of iron does not lend itself to a single interpretation. The addition of NGAL affects regulatory T cells, increasing the in vitro expansion of CD4+ CD25+ FoxP3+ cells, while a neutralizing antibody against NGAL decreased HLA-G expression and significantly reduced the percentage of CD4+ CD25+ FoxP3+ cells. In conclusion, we provide in vitro evidence of NGAL involvement in cellular immunity. Evaluating its potential as an immunomodulatory molecule, we show that NGAL plays a pivotal role in the induction of immune tolerance by up-regulating HLA-G and regulatory T cell expression in healthy donors. As a potential future scenario, we highlight the in vivo role of NGAL in immunology and immunomodulation, and its possible relationship with the efficacy of immunosuppressive therapy and tolerance induction in transplant patients and/or in other immunological disorders.

Relevance:

30.00%

Publisher:

Abstract:

Advances in sequencing technologies over the last few years have made it possible to catalogue the genetic variants found in human samples, leading to new discoveries and insights in medical, pharmaceutical, and evolutionary research and in population studies. The amount of sequence data produced is very large, and identifying the variants requires several stages of processing of the genetic information, each of which generates further data. Alongside this immense accumulation of data, the scientific community felt the need to organize the data into repositories, at first merely to share research results, and later to enable statistical studies directly on the genetic data. Large-scale studies involve data on the order of petabytes, whose maintenance remains a challenge for the infrastructure. Given the variety and volume of the data produced, databases play a role of primary importance in this challenge. In this field, data models and data organization can make a difference not only for scalability but above all for suitability for data mining. Indeed, storing these data in files with quasi-standard formats, the size of these files, and the computational requirements involved make it hard to write efficient analysis software and discourage large-scale studies on heterogeneous data. Before designing the database, we therefore studied the evolution, over the last twenty years, of the quasi-standard formats for biological flat files, which contain heterogeneous metadata together with the actual nucleotide sequences, in records lacking structural relations. This evolution recently culminated in the use of the XML standard, but delimited flat files remain the standards best supported by tools and online platforms. This was followed by an analysis of the internal data organization of public biological databases. These databases contain genes, genetic variants, protein structures, phenotype ontologies, disease-gene relations, and drug-gene relations; those studied include OMIM, Entrez, KEGG, UniProt, and GO. The main goal in studying and modelling the genetic database was to structure the data so as to integrate the heterogeneous data produced and to make data-mining processes computationally feasible. The choice of Hadoop/MapReduce technology proves particularly effective in this case, thanks to the scalability it guarantees and its efficiency in the most complex, parallel statistical analyses, such as those concerning multi-locus allelic variants.
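
As a rough illustration of how MapReduce fits the multi-locus variant analyses mentioned above, here is a pure-Python sketch of the map/shuffle/reduce decomposition for counting allele occurrences per locus. The simplified record format is an assumption, and a production job would run on an actual Hadoop cluster rather than in-process.

```python
from collections import defaultdict

# Hypothetical, simplified variant-call records: (sample, locus, allele).
records = [
    ("s1", "chr1:12345", "A"), ("s2", "chr1:12345", "G"),
    ("s3", "chr1:12345", "A"), ("s1", "chr2:67890", "T"),
]

def map_phase(record):
    sample, locus, allele = record
    yield (locus, allele), 1          # emit key -> partial count

def reduce_phase(key, values):
    return key, sum(values)           # aggregate counts per (locus, allele)

# Shuffle: group intermediate pairs by key, as the framework would do.
groups = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        groups[key].append(value)

for key in sorted(groups):
    print(reduce_phase(key, groups[key]))   # e.g. (('chr1:12345', 'A'), 2)
```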

Relevance:

30.00%

Publisher:

Abstract:

The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata, and rules, while maintaining a strong text-to-knowledge morphism between the legal text and the legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four different models that make use of standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between the text and its semantic annotations of legal concepts; a legal core ontology, modelling the abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain concerned by case law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case law. The research relies on previous efforts of the community in the field of legal knowledge representation and rule interchange for applications in the legal domain, and applies the theory to a set of real legal documents, stressing the OWL axiom definitions as much as possible so that they provide a semantically powerful representation of the legal document and a solid ground for an argumentation system using a defeasible subset of predicate logic. It appears that some new features of OWL 2 unlock useful reasoning features for legal knowledge, especially if combined with defeasible rules and argumentation schemes. The main task is thus to formalize the legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate, and reuse the discourse of a judge, and the argumentation he produces, as expressed by the judicial text.
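
The text-to-knowledge morphism described above can be illustrated with a few RDF triples linking a judgement paragraph to an abstract legal concept. This is a hedged sketch using the rdflib Python library; the namespaces, class names, and property names are invented for illustration and are not the thesis's actual vocabulary.

```python
from rdflib import Graph, Namespace, Literal, RDF

# Hypothetical namespaces standing in for the legal core ontology and a judgement.
LKIF = Namespace("http://example.org/legal-core#")
DOC = Namespace("http://example.org/judgement/42#")

g = Graph()
g.bind("lkif", LKIF)

# A paragraph of the judgement text, annotated with an abstract legal concept.
g.add((DOC.para_7, RDF.type, LKIF.Premise))
g.add((DOC.para_7, LKIF.expressesConcept, LKIF.DutyOfCare))
g.add((DOC.para_7, LKIF.text, Literal("The defendant owed a duty of care ...")))

# The morphism back from concept to text: find the paragraphs grounding a concept.
for para in g.subjects(LKIF.expressesConcept, LKIF.DutyOfCare):
    print(para, g.value(para, LKIF.text))
```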

Relevance:

30.00%

Publisher:

Abstract:

The Internet of Things (IoT) is the next industrial revolution: we will interact naturally with real and virtual devices as a key part of our daily life. This technology shift is expected to be greater than the Web and Mobile combined. As extremely different technologies are needed to build connected devices, the Internet of Things field is a junction between electronics, telecommunications, and software engineering. Internet of Things application development happens in silos, often using proprietary and closed communication protocols. There is a common belief that only by solving the interoperability problem can we have a real Internet of Things. After a deep analysis of the IoT protocols, we identified a set of primitives for IoT applications. We argue that each IoT protocol can be expressed in terms of those primitives, thus solving the interoperability problem at the application protocol level. Moreover, the primitives are network- and transport-independent and make no assumptions in that regard. This dissertation presents our implementation of an IoT platform: the Ponte project. Privacy issues follow the rise of the Internet of Things: it is clear that the IoT must ensure resilience to attacks, data authentication, access control, and client privacy. We argue that it is not possible to solve the privacy issue without solving the interoperability problem, since enforcing privacy rules implies the need to limit and filter the data delivery process. However, filtering data requires knowledge of the format and the semantics of the data: after an analysis of the possible data formats and representations for the IoT, we identify JSON-LD and the Semantic Web as the best solution for IoT applications. Finally, this dissertation presents our approach to increasing the throughput of filtering semantic data by a factor of ten.
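
To make the primitive-based interoperability argument concrete, here is a hedged Python sketch of a protocol-independent publish/subscribe interface with a toy in-process adapter. The primitive names are assumptions for illustration, not necessarily the set identified in the dissertation or implemented by Ponte; a real adapter would wrap MQTT, CoAP, or another concrete protocol.

```python
from abc import ABC, abstractmethod
from collections import defaultdict

class IoTChannel(ABC):
    """Hypothetical protocol-independent primitives; concrete protocols
    (MQTT, CoAP, ...) would each be adapted to this common interface."""
    @abstractmethod
    def publish(self, topic: str, payload: bytes): ...
    @abstractmethod
    def subscribe(self, topic: str, callback): ...

class InMemoryChannel(IoTChannel):
    """Toy in-process adapter used to exercise the primitives."""
    def __init__(self):
        self._subs = defaultdict(list)
    def publish(self, topic, payload):
        for cb in self._subs[topic]:
            cb(topic, payload)
    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

ch = InMemoryChannel()
ch.subscribe("home/temp", lambda t, p: print(t, p))
# JSON-LD payload, matching the data representation argued for above.
ch.publish("home/temp", b'{"@type": "Temperature", "value": 21.5}')
```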

Relevance:

30.00%

Publisher:

Abstract:

This thesis aims at investigating methods and software architectures for discovering the typical and frequently occurring structures used for organizing knowledge in the Web. We identify these structures as Knowledge Patterns (KPs). KP discovery needs to address two main research problems: the heterogeneity of sources, formats, and semantics in the Web (i.e., the knowledge soup problem) and the difficulty of drawing a relevant boundary around data that allows capturing meaningful knowledge with respect to a certain context (i.e., the knowledge boundary problem). Hence, we introduce two methods that provide different solutions to these two problems by tackling KP discovery from two different perspectives: (i) the transformation of KP-like artifacts into KPs formalized as OWL 2 ontologies; (ii) the bottom-up extraction of KPs by analyzing how data are organized in Linked Data. The two methods address the knowledge soup and boundary problems in different ways. The first method is based on a purely syntactic transformation step from the original source to RDF, followed by a refactoring step whose aim is to add semantics to the RDF by selecting meaningful RDF triples. The second method draws boundaries around RDF data in Linked Data by analyzing type paths. A type path is a possible route through an RDF graph that takes into account the types associated with the nodes of a path. We then present K~ore, a software architecture conceived as a basis for developing KP discovery systems and designed according to two software architectural styles, namely Component-based and REST. Finally, we provide an example of KP reuse based on Aemoo, an exploratory search tool which exploits KPs for performing entity summarization.
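
The notion of a type path can be illustrated on a toy RDF graph: follow one property from a starting node and record the rdf:type of each node visited. The sketch below uses rdflib; the example data and the function are hypothetical and reflect only a simplified reading of the thesis's notion.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.rome, RDF.type, EX.City))
g.add((EX.rome, EX.locatedIn, EX.italy))
g.add((EX.italy, RDF.type, EX.Country))

def type_path(graph, start, prop, max_len=5):
    """Follow `prop` from `start`, collecting the rdf:type of each node visited."""
    path, node = [], start
    while node is not None and len(path) < max_len:
        path.append(graph.value(node, RDF.type))
        node = graph.value(node, prop)
    return path

print(type_path(g, EX.rome, EX.locatedIn))   # types: City, then Country
```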

Relevance:

30.00%

Publisher:

Abstract:

Our research asked the following main questions: how do the characteristics of professional service firms allow them to innovate successfully, exploiting through exploring, by combining internal and external factors of innovation, and how do these ambidextrous organisations perceive these factors; and how do successful innovators in professional service firms use corporate entrepreneurship models in their new service development processes? With the goal of shedding light on innovation in professional knowledge-intensive business service firms (PKIBS), we conducted a qualitative analysis of ten globally acting law firms providing business legal services. We analyse the internal and external factors of innovation that are critical for PKIBS' innovation and suggest how these firms become ambidextrous in a changing environment. Our findings show that this kind of firm has a particular type of ambidexterity due to its specific characteristics. As PKIBS are highly dependent on their human capital, their governance structure, and the high expectations of their clients, their ambidexterity is structural but also contextual at the same time. In addition, we suggest three types of corporate entrepreneurship models that international PKIBS use to enhance innovation in turbulent environments. We looked at how law firms going through turbulent environments were using corporate entrepreneurship activities as part of their strategies to be more innovative. Using a visual mapping methodology, we identified three types of innovation patterns in the law firms. We suggest that corporate entrepreneurship models depend on the successful application of mainly three elements: who participates in corporate entrepreneurship initiatives; what formal processes enhance these initiatives; and what policies are applied to this type of behaviour.

Relevance:

30.00%

Publisher:

Abstract:

In many areas of industrial manufacturing, such as the automotive industry, digital mock-ups are used so that the development of complex machines can be supported by computer systems as effectively as possible. Motion-planning algorithms play an important role here, guaranteeing that these digital prototypes can be assembled without collisions. Over the last decades, sampling-based methods have proven particularly successful. They generate a large number of random configurations for the object to be installed or removed and use a collision-detection mechanism to check each configuration for validity. Collision detection therefore plays an essential role in the design of efficient motion-planning algorithms. One difficulty for this class of planners is posed by so-called "narrow passages", which occur wherever the freedom of movement of the objects being planned is severely restricted. In such places it can be difficult to find a sufficient number of collision-free samples, and more sophisticated techniques may be required to achieve good performance.

This work is divided into two parts. In the first part we investigate parallel collision-detection algorithms. Since we target an application in sampling-based motion planners, we choose a problem setting in which the same two objects are always tested for collision, but in a large number of different configurations. We implement and compare several methods that use bounding-volume hierarchies (BVHs) and hierarchical grids as acceleration structures. All described methods were parallelized across multiple CPU cores. In addition, we compare several CUDA kernels for performing BVH-based collision tests on the GPU. Besides different distributions of the work across the parallel GPU threads, we investigate the impact of different memory-access patterns on the performance of the resulting algorithms. We further present a set of approximate collision tests based on the described methods; when a lower test accuracy is tolerable, a further performance improvement can be achieved.

In the second part of this work we describe a parallel sampling-based motion planner of our own design for handling highly complex problems with multiple narrow passages. The method works in two phases. The basic idea is to conceptually allow small errors in the first planning phase in order to increase planning efficiency, and then to repair the resulting path in a second phase. The planner used in phase I is based on so-called Expansive Space Trees. In addition, we equipped the planner with an operation that pushes objects free of minor collisions, increasing efficiency in regions with restricted freedom of movement. Optionally, our implementation allows the use of approximate collision tests; this further reduces the accuracy of the first planning phase but also yields a further performance gain. The motion paths resulting from phase I may then not be completely collision-free. To repair these paths, we designed a novel planning algorithm that plans a new, collision-free motion path locally, restricted to a small neighbourhood of the existing path.

We tested the described algorithm on a class of new, difficult metal puzzles, some of which exhibit multiple narrow passages. To the best of our knowledge, a collection of comparably complex benchmarks is not publicly available, and we found no description of comparably complex benchmarks in the motion-planning literature.
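
As a schematic illustration of the sampling workload described above (one pair of objects tested in many random configurations), here is a toy Python sketch using a single-sphere approximation. The real implementations rely on BVHs, hierarchical grids, and CUDA kernels, none of which is reproduced here; the radii and obstacle placement are arbitrary.

```python
import random

def spheres_collide(center_a, center_b, r_a, r_b):
    """Cheap collision test: do two spheres overlap?"""
    dist2 = sum((a - b) ** 2 for a, b in zip(center_a, center_b))
    return dist2 < (r_a + r_b) ** 2

def sample_free_configurations(n, r_obj=0.1, r_obstacle=0.3):
    """Rejection-sample collision-free placements of a moving object
    against one fixed obstacle (stand-in for a full collision checker)."""
    obstacle = (0.5, 0.5, 0.5)
    free = []
    for _ in range(n):
        q = tuple(random.random() for _ in range(3))   # random configuration
        if not spheres_collide(q, obstacle, r_obj, r_obstacle):
            free.append(q)
    return free

free = sample_free_configurations(1000)
print(f"{len(free)} of 1000 sampled configurations are collision-free")
```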

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, the author presents a query language for an RDF (Resource Description Framework) database and discusses its applications in the context of the HELM project (the Hypertextual Electronic Library of Mathematics). This language aims at meeting the main requirements coming from the RDF community. In particular it includes: a human-readable textual syntax and a machine-processable XML (Extensible Markup Language) syntax, both for queries and for query results; a rigorously exposed formal semantics; a graph-oriented RDF data access model capable of exploring an entire RDF graph (including both RDF Models and RDF Schemata); a full set of Boolean operators to compose the query constraints; fully customizable and highly structured query results having a 4-dimensional geometry; and some constructions taken from ordinary programming languages that simplify the formulation of complex queries. The HELM project aims at integrating modern tools for the automation of formal reasoning with the most recent electronic publishing technologies, in order to create and maintain a hypertextual, distributed virtual library of formal mathematical knowledge. In the spirit of the Semantic Web, the documents of this library include RDF metadata describing their structure and content in a machine-understandable form. Using the author's query engine, HELM exploits this information to implement functionalities allowing the interactive and automatic retrieval of documents on the basis of content-aware requests that take into account the mathematical nature of these documents.
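
The thesis defines its own RDF query language; purely as a modern analogue of the kind of content-aware, graph-oriented query involved, the following sketch poses such a query over hypothetical HELM-style metadata using rdflib's SPARQL support. The namespace and property names are invented for illustration.

```python
from rdflib import Graph, Namespace, Literal, RDF

HELM = Namespace("http://example.org/helm#")   # hypothetical metadata vocabulary
g = Graph()
g.add((HELM.thm_1, RDF.type, HELM.Theorem))
g.add((HELM.thm_1, HELM.usesConcept, HELM.NaturalNumber))
g.add((HELM.thm_1, HELM.title, Literal("Commutativity of addition")))

# Content-aware request: theorems that use a given mathematical concept.
results = g.query("""
    PREFIX helm: <http://example.org/helm#>
    SELECT ?doc ?title WHERE {
        ?doc a helm:Theorem ;
             helm:usesConcept helm:NaturalNumber ;
             helm:title ?title .
    }""")
for doc, title in results:
    print(doc, title)
```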

Relevance:

30.00%

Publisher:

Abstract:

This thesis work aims to find a procedure for isolating specific features of the current signal of a plasma focus for medical applications. The structure of the current signal inside a plasma focus is peculiar to this class of machines, so a specific analysis procedure has to be developed. The hope is to find one or more features that show a correlation with the delivered dose. The study of the correlation between the current discharge signal and the dose delivered by a plasma focus could be of some importance not only for the practical application of dose prediction but also for expanding knowledge about plasma focus physics. Various classes of time-frequency analysis techniques are implemented in order to solve the problem.
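
As one hedged example of the kind of time-frequency technique involved, the following Python sketch computes a short-time Fourier spectrogram of a synthetic damped oscillation standing in for a discharge current, then extracts a simple candidate feature. The sampling rate, waveform, and feature choice are assumptions for illustration only, not the thesis's actual procedure.

```python
import numpy as np
from scipy import signal

fs = 10_000_000                       # assumed 10 MHz sampling rate
t = np.arange(0, 5e-4, 1 / fs)
# Synthetic stand-in for a plasma focus current trace: damped oscillation + noise.
current = np.exp(-t / 1e-4) * np.sin(2 * np.pi * 5e5 * t)
current += 0.05 * np.random.randn(t.size)

# Short-time Fourier spectrogram: power as a function of time and frequency.
f, tt, Sxx = signal.spectrogram(current, fs=fs, nperseg=256)

# Example candidate feature: the dominant frequency in each time slice.
dominant = f[np.argmax(Sxx, axis=0)]
print("dominant frequency (first 5 slices):", dominant[:5])
```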