983 results for automated knowledge visualization
Abstract:
The driving forces of technology and globalization continuously transform the business landscape, undermining organizations' existing strategies and innovations. The challenge for organizations is to establish conditions in which they can create new knowledge for innovative business ideas in interaction with other organizations and individuals. Innovation processes continuously need new external stimulation and seek new ideas, information and knowledge that increasingly lie outside traditional organizational boundaries. In several studies, the early phases of the innovation process have been considered the most critical: during these phases, the innovation process can take off or come to an end. External knowledge acquisition and utilization have been found to be important at this stage of the innovation process, providing information about the development of future markets and the needs for new innovative business ideas. To make this possible, new methods and approaches for managing proactive knowledge creation and sharing activities are needed. In this study, knowledge creation and sharing in the early phases of the innovation process is examined, and the understanding of knowledge management in the innovation process in an open and collaborative context is advanced. Furthermore, the innovation management methods in this study are combined in a novel way to establish an open innovation process and are tested in real-life cases. For these purposes, two complementary and sequentially applied group work methods - the heuristic scenario method and the idea generation process - are examined, focusing the research on the support of the open knowledge creation and sharing process. The research objective of this thesis concerns two disciplines: innovation management, including knowledge management, and futures research, concerning the scenario paradigm. The thesis also applies a group decision support system (GDSS) in the idea generation process to utilize the knowledge converged during the scenario process.
Abstract:
Over 70% of the total costs of an end product are consequences of decisions made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and if the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems; an appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and computer program that reduce the modelling and optimisation time in structural design. The program needed an optimisation algorithm suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and an optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed. The programs combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and the steel structures of flat and ridge roofs. This thesis demonstrates that modelling time, the most time-consuming part of the process, is significantly reduced, modelling errors are reduced, and the results are more reliable. A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, is tested with optimisation cases of steel structures that include hundreds of constraints. The tested algorithm can be used almost as a black box, without parameter settings or penalty factors for the constraints.
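The abstract does not spell out the selection rule itself. For illustration, a well-known parameter-free scheme in the same spirit - it needs no constraint weights or penalty factors - is feasibility-based binary tournament selection (Deb's rule); a minimal Python sketch, assuming minimisation and constraints given as functions with g(x) <= 0 meaning feasible:

    import random

    def violation(constraints, x):
        # Total violation of the g(x) <= 0 constraints; 0.0 when x is feasible.
        return sum(max(0.0, g(x)) for g in constraints)

    def tournament_select(population, objective, constraints):
        # Feasible beats infeasible; two feasible candidates compare by
        # objective value; two infeasible ones by total violation.
        a, b = random.sample(population, 2)
        va, vb = violation(constraints, a), violation(constraints, b)
        if va == 0.0 and vb == 0.0:
            return a if objective(a) <= objective(b) else b
        if va == 0.0 or vb == 0.0:
            return a if va == 0.0 else b
        return a if va <= vb else b

Because the rule never mixes objective values with violation magnitudes, it scales to hundreds of constraints without any tuning, which matches the black-box behaviour reported above.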
Abstract:
This thesis supplements the systematic approach to competitive intelligence and competitor analysis by introducing an information-processing perspective on management of the competitive environment and the competitors therein. The cognitive questions connected to the intelligence process, as well as the means that organizational actors use in sharing information, are discussed. The ultimate aim has been to deepen knowledge of the different intraorganizational processes that a corporate organization uses to manage and exploit the vast amount of competitor information it receives from the environment. Competitor information and competitive knowledge management are examined as a process in which organizational actors identify and perceive the competitive environment by using cognitive simplification, make interpretations resulting in learning, and finally utilize competitor information and competitive knowledge in their work processes. The sharing of competitor information and competitive knowledge is facilitated by intraorganizational networks that evolve as a means of developing a shared, organizational-level knowledge structure and of ensuring that the right information is in the right place at the right time. The thesis approaches competitor information and competitive knowledge management both theoretically and empirically. Based on a conceptual framework developed by theoretical elaboration, further understanding of the studied phenomena is sought through an empirical study, which was carried out in a multinationally operating forest industry company. The thesis makes some preliminary suggestions for improving the competitive intelligence process. It is concluded that managing competitor information and competitive knowledge is not simply a question of managing information flow or improving the sophistication of competitor analysis; the crucial question is rather how to improve the cognitive capabilities connected to identifying and interpreting the competitive environment and how to increase learning. It is claimed that competitive intelligence cannot be treated as an organizational function or assigned solely to a specialized intelligence unit.
Abstract:
The Goliath grouper, Epinephelus itajara, a large-bodied (~2.5 m TL, >400 kg) and critically endangered fish (Epinephelidae), is highly vulnerable to overfishing. Although protected from fishing in many countries, its exploitation in Mexico is unregulated, a situation that puts its populations at risk. Fishery records of E. itajara are scarce, which prevents determination of its fishery status. This work aimed to elucidate the E. itajara fishery in the northern Yucatan Peninsula by 1) analyzing available catch records and 2) interviewing veteran fishermen (local ecological knowledge) from two traditional landing sites: Dzilam de Bravo and Puerto Progreso. Historic fishery records from two fishing cooperatives were analyzed in order to elucidate the current situation and offer viable alternatives for conservation and management. Catches have decreased severely. Local knowledge obtained from fishermen represented a very important source of information for reconstructing the fishery history of this species. Conservation measures that incorporate regional and international regulations on critically endangered fish species are suggested.
Abstract:
Background: Reconstruction of gene and/or protein networks from automated analysis of the literature is one of the current targets of text mining in biomedical research. Some user-friendly tools already perform this analysis on precompiled databases of abstracts of scientific papers, and other tools allow expert users to elaborate and analyze the full content of a corpus of scientific documents. However, to our knowledge, no user-friendly tool is available that simultaneously analyzes the latest set of scientific documents available online and reconstructs the set of genes referenced in those documents. Results: This article presents such a tool, Biblio-MetReS, and compares its functioning and results to those of other widely used user-friendly applications (iHOP, STRING). Under similar conditions, Biblio-MetReS creates networks that are comparable to those of the other user-friendly tools. Furthermore, analysis of full-text documents provides more complete reconstructions than those that result from using only the abstract of the document. Conclusions: Literature-based automated network reconstruction is still far from providing complete reconstructions of molecular networks. However, its value as an auxiliary tool is high, and it will increase as standards for reporting biological entities and relationships become more widely accepted and enforced. Biblio-MetReS can be downloaded from http://metres.udl.cat/. It provides an easy-to-use environment for researchers to reconstruct their networks of interest from an always up-to-date set of scientific documents.
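At its core, literature-based reconstruction of this kind links two genes when they are mentioned together in the same document or sentence. A toy Python sketch of that counting step (gene names and sentences are made up; real tools such as Biblio-MetReS add synonym resolution and co-occurrence statistics on top):

    import itertools
    import re
    from collections import Counter

    def cooccurrence_edges(documents, gene_names):
        # Count how often each pair of gene names shares a sentence.
        edges = Counter()
        for doc in documents:
            for sentence in re.split(r"[.!?]+", doc):
                found = sorted({g for g in gene_names if g in sentence})
                edges.update(itertools.combinations(found, 2))
        return edges

    docs = ["TP53 activates CDKN1A. CDKN1A inhibits CDK2.",
            "MDM2 degrades TP53, and TP53 induces MDM2."]
    print(cooccurrence_edges(docs, ["TP53", "CDKN1A", "CDK2", "MDM2"]))
    # Each co-mentioned pair is returned with its sentence count.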
Abstract:
Postmortem MRI (PMMR) examinations are seldom performed in legal medicine due to long examination times, unfamiliarity with the technique, and high costs. Furthermore, it is difficult to obtain access to an MRI device used for patients in clinical settings to image an entire human body. An alternative is available: ex situ organ examination. To our knowledge, there is no standardized protocol that includes ex situ organ preparation and scanning parameters for postmortem MRI. Thus, our objective was to develop a standard procedure for ex situ heart PMMR examinations. We also tested the oily contrast agent Angiofil®, commonly used for PMCT angiography, for its applicability in MRI. We worked with a 3-Tesla MRI device and 32-channel head coils. Twelve porcine hearts were used to test different materials, to find the best way to prepare and place organs in the device, and to test scanning parameters. For coronary MR angiography, we tested different mixtures of Angiofil® and different injection materials. In a second step, 17 human hearts were examined to test the procedure and its applicability to human organs. We established two standardized protocols: one for preparation of the heart and another for scanning parameters based on experience in clinical practice. The established protocols enabled a standardized technical procedure with comparable radiological images, allowing for easy radiological reading. The performance of coronary MR angiography enabled detailed coronary assessment and revealed the utility of Angiofil® as a contrast agent for PMMR. Our simple, reproducible method for performing heart examinations ex situ yields high-quality images and visualization of the coronary arteries.
Abstract:
The correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are independent of negotiation processes, and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the exchange of information between agents.
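As a rough illustration of the monitoring idea, the Python sketch below (a hypothetical message format and protocol representation; the paper's specification language and negotiation model are far richer) checks a log of agent messages against the expected order of protocol actions:

    from dataclasses import dataclass, field

    @dataclass
    class Message:
        sender: str        # e.g. a specialized domain agent for a medical service
        receiver: str      # e.g. a system agent performing monitoring
        performative: str  # request / inform / refuse, as in agent negotiation
        content: dict = field(default_factory=dict)

    def monitor(protocol_steps, log):
        # Flag actions that do not match the next pending step of the protocol.
        pending, deviations = list(protocol_steps), []
        for msg in log:
            action = msg.content.get("action")
            if pending and action == pending[0]:
                pending.pop(0)
            else:
                deviations.append((msg.sender, action))
        return deviations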
Abstract:
The aim of this research is to identify the types of contributions produced in the collaborative virtual environment Knowledge Forum and to verify whether computer-supported collaborative learning (CSCL) is taking place. The contributions to the 30 different forums were analysed and categorised using a coding scheme based on the scaffolds that this environment provides. The results show that, overall, the 308 university students contribute new information and offer opinions, but there is a shortage of messages with differing opinions that lead to discussion and to exchanges of distinct points of view.
Abstract:
This paper aims to explore asynchronous communication in computer-supported collaborative learning (CSCL). Thirty virtual forums are analysed both quantitatively and qualitatively. Quantitatively, the numbers of messages written, message threads, and original and reply messages are counted. Qualitatively, the content of the notes is analysed and catalogued on two levels: on the one hand, into a set of knowledge-building process categories, and on the other, according to the scaffolds that Knowledge Forum offers. The results show that both an exchange of information and collaborative work take place. Nevertheless, the construction of knowledge remains superficial.
Abstract:
The aim of this study was to examine the development of the metacognitive knowledge of a group of higher education students who participated actively in an experiment based on a Computer Supported Collaborative Learning environment called KnowCat. Eighteen university students took part in a 12-month learning project during which the KnowCat learning environment was used to support scaffolding processes among equals during problem-solving tasks. After using KnowCat, the students were interviewed about their work in this shared workspace. Qualitative analysis revealed that the educational application of KnowCat can favour and improve the development of metacognitive knowledge.
Abstract:
Reconstruction of three-dimensional objects is one of the most challenging problems in computer vision, because the three-dimensional distances of objects cannot be determined from a single two-dimensional image. The problem can be solved with stereo vision, in which the three-dimensional structure of a scene is inferred from several images. However, this approach only permits reconstruction of those parts of the objects that are visible in at least two images; reconstruction of occluded parts is not possible with stereo vision alone. In this work, a new method has been developed for the reconstruction of partially occluded three-dimensional planar objects. Using two images of an object composed of planar surfaces, the method can determine its shape and position with good accuracy. The method is based on epipolar geometry, which is used to identify the parts of the objects visible in both images. Partially occluded features are reconstructed using stereo vision together with knowledge of the object's structure. The proposed solution could be used, for example, for the visualization of three-dimensional objects, robot navigation, or object recognition.
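The two-view machinery such a method builds on can be made concrete: corresponding image points x and x' must satisfy the epipolar constraint x'^T F x = 0 for the fundamental matrix F, and points visible in both images can be recovered by linear triangulation. A minimal numpy sketch of these two standard steps (the thesis's additional reasoning about occluded planar parts is not shown):

    import numpy as np

    def epipolar_residual(F, x1, x2):
        # Residual of x2^T F x1 = 0 for pixel coordinates x1, x2.
        h1, h2 = np.append(x1, 1.0), np.append(x2, 1.0)
        return float(h2 @ F @ h1)

    def triangulate(P1, P2, x1, x2):
        # Linear (DLT) triangulation: each view contributes two rows of a
        # homogeneous system whose least-squares solution is the 3-D point.
        A = np.vstack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]   # dehomogenize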
Abstract:
In this article, I address epistemological questions regarding the status of linguistic rules and the pervasive, though seldom discussed, tension that arises between theory-driven object perception by linguists on the one hand, and ordinary speakers' possible intuitive knowledge on the other. Several issues are discussed using examples from French verb morphology, based on the 6500 verbs in Le Petit Robert dictionary (2013).
Abstract:
Knowledge management has proven to be one of the greatest challenges facing today's organizations. The challenge is not only the sheer amount of information that must be managed; rather, knowledge management serves as a competitive advantage in the global business world. The aim of this thesis is to study the suitability of an enterprise portal for knowledge management in a global forest industry company. A further aim is to examine how the portal can be tailored to each user group in the case company. The theoretical part of the thesis discusses the multifaceted nature of knowledge management and the difficulty of describing it unambiguously. It also examines the factors that influence the definition of user groups and user profiles. The empirical part deals with the case company, its relationship to knowledge management, and the use of a knowledge management tool of this kind. Based on the results, an enterprise portal can be said to suit knowledge management well, even in a complex company. The portal makes the company's business processes more transparent, as business-critical information is offered in a single place.