984 results for knowledge modeling
Abstract:
This paper introduces an ontology-based knowledge model for knowledge management. This model can facilitate knowledge discovery that provides users with insight for decision making. The users requiring such insight normally play different roles, with different requirements, in an organisation. To meet these requirements, insights are created by purposefully aggregating transnational data, which involves a semantic data integration process. In this paper, we present a knowledge management system that is capable of representing knowledge requirements in a domain context and enabling semantic data integration through ontology modeling. The knowledge domain context of the United Bible Societies is used to illustrate the features of the knowledge management capabilities.
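As a rough illustration of what ontology-based semantic data integration looks like in practice (a minimal sketch, not the paper's actual model), the Python fragment below uses rdflib to map two hypothetical transactional sources into one shared vocabulary and aggregate them with a SPARQL query; all class and property names are invented for the example.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical mini-ontology; names are illustrative, not from the paper.
EX = Namespace("http://example.org/km#")
g = Graph()
g.bind("ex", EX)

# Two "transactional" sources with different record types, mapped to one vocabulary.
sales = [("t1", "Bibles", 120), ("t2", "Study guides", 45)]
donations = [("d1", "Bibles", 30)]

for tid, product, qty in sales:
    t = EX[tid]
    g.add((t, RDF.type, EX.SaleTransaction))
    g.add((t, EX.product, Literal(product)))
    g.add((t, EX.quantity, Literal(qty)))

for did, product, qty in donations:
    d = EX[did]
    g.add((d, RDF.type, EX.DonationTransaction))
    g.add((d, EX.product, Literal(product)))
    g.add((d, EX.quantity, Literal(qty)))

# Aggregate across both transaction types: a role-specific "insight".
q = """
PREFIX ex: <http://example.org/km#>
SELECT ?product (SUM(?qty) AS ?total) WHERE {
    ?t ex:product ?product ; ex:quantity ?qty .
} GROUP BY ?product
"""
for row in g.query(q):
    print(row.product, row.total)
```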
Abstract:
The top managers of a biotechnology startup firm agreed to participate in a system dynamics modeling project to help them think about the firm's growth strategy. The article describes how the model was created and used to stimulate debate and discussion about growth management. The paper highlights several novel features about the process used for capturing management team knowledge. A heavy emphasis was placed on mapping the operating structure of the factory and distribution channels. Qualitative modeling methods (structural diagrams, descriptive variable names, and friendly algebra) were used to capture the management team's descriptions of the business. Simulation scenarios were crafted to stimulate debate about strategic issues such as capacity allocation, capacity expansion, customer recruitment, customer retention, and market growth, and to engage the management team in using the computer to design strategic scenarios. The article concludes with comments on the impact of the project.
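For readers unfamiliar with system dynamics, the sketch below shows the stock-and-flow core of a customer recruitment and retention model in Python; the parameters and equations are illustrative stand-ins, not the model built for the firm.

```python
# Minimal system-dynamics sketch: customers as a stock,
# recruitment and churn as flows. All numbers are illustrative.
dt = 0.25            # time step (quarters)
horizon = 40         # quarters simulated
market = 10_000      # total addressable customers
recruit_rate = 0.08  # fraction of remaining market won per quarter
churn_rate = 0.05    # fraction of customers lost per quarter

customers = 100.0
for step in range(int(horizon / dt)):
    recruitment = recruit_rate * (market - customers)
    churn = churn_rate * customers
    customers += dt * (recruitment - churn)   # Euler integration of the stock

print(f"Customers after {horizon} quarters: {customers:.0f}")
```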
Abstract:
In a world where organizations are ever more complex, the need for knowledge of the organizational self is a growing necessity. The DEMO methodology sets as its goal the specification of the organizational self, capturing the essence of the organization in a way that is independent of its implementation and is also coherent, consistent, complete, modular, and objective. But such a notion of the organizational self is of little value if it is not shared by the organization's actors. To achieve this in a society that has grown attached to technology and where time is of utmost importance, a tool such as a semantic Wikipedia may be the perfect way of making the information accessible. However, establishing the DEMO methodology on such a platform requires building bridges between its modeling components and the semantic Wikipedia. Our thesis focuses on that aspect: it tries to establish and implement, using a case study, the principles of a method for transforming DEMO methodology diagrams into comprehensible pages on a semantic Wikipedia, while keeping them as abstract as possible to allow expansion and generalization to all diagrams without losing any valuable information, so that, if desired, those diagrams may be recreated from the semantic pages, making the process a full cycle.
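A minimal sketch of the kind of transformation the thesis pursues, assuming a toy DEMO transaction record and invented property names; only Semantic MediaWiki's [[Property::value]] annotation syntax is taken as given.

```python
# Sketch: render a DEMO transaction as a Semantic MediaWiki page.
# Property names (hasInitiator, hasExecutor) are invented for illustration;
# the [[Property::value]] annotation syntax is Semantic MediaWiki's own.
def transaction_page(tx_id: str, name: str, initiator: str, executor: str) -> str:
    return "\n".join([
        f"== Transaction {tx_id}: {name} ==",
        f"* Initiator: [[hasInitiator::{initiator}]]",
        f"* Executor: [[hasExecutor::{executor}]]",
        "[[Category:DEMO Transaction]]",
    ])

print(transaction_page("T01", "Membership admission", "Aspirant member", "Admitter"))
```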
Abstract:
The accurate determination of the thermophysical properties of milk is very important for the design, simulation, optimization, and control of food processing operations such as evaporation, heat exchange, and spray drying. Generally, polynomial methods are used to predict these properties, based on empirical correlation to experimental data. Artificial neural networks are better suited to processing noisy and extensive data. This article proposes the application of neural networks to the prediction of the specific heat, thermal conductivity, and density of milk, with temperature ranging from 2.0 to 71.0 °C, water content from 72.0 to 92.0% (w/w), and fat content from 1.350 to 7.822% (w/w). The artificial neural networks showed better prediction capability for the specific heat, thermal conductivity, and density of milk than polynomial modeling, making them a reasonable alternative to empirical modeling of the thermophysical properties of foods.
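The following sketch shows how such a network could be set up with scikit-learn; the training data here are fabricated stand-ins for the measured milk data, and the small architecture is an assumption, not the one used in the article.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: (temperature degC, water %, fat %) -> density (kg/m3).
# The paper fits measured milk data; these values are fabricated for the sketch.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(2.0, 71.0, 200),    # temperature, degC
    rng.uniform(72.0, 92.0, 200),   # water content, % (w/w)
    rng.uniform(1.35, 7.822, 200),  # fat content, % (w/w)
])
# Toy target loosely shaped like milk density, for illustration only.
y = 1040.0 - 0.3 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 1.0, 200)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[20.0, 88.0, 3.5]]))  # density at 20 degC, 88% water, 3.5% fat
```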
Abstract:
Nowadays, drives combining induction motors and frequency inverters are very common, owing to the practicality and financial viability of purchasing and operating such systems. Modeling and simulating this system becomes important when one wants to evaluate its performance or to calculate and correct parameters, and it plays a fundamental role in analyzing the functionality and viability of new configurations and technologies. This work elaborates a simple induction motor model based on the torque-versus-speed characteristic, using the linearization method, for application within a specific operating range under the control of a frequency inverter.
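A minimal sketch of the linearization idea, with invented motor parameters: in the stable region of the torque-speed curve, torque is approximately proportional to slip, T ≈ k(ω_s − ω), and the inverter sets the synchronous speed through the supply frequency.

```python
import numpy as np

# Linearized induction-motor model near the operating point: in the stable
# region of the torque-speed curve, torque is roughly proportional to slip,
#   T = k * (w_s - w).
# All parameter values below are illustrative, not from the paper.
poles = 4
k = 2.5        # N*m per rad/s of slip (linearized slope)
J = 0.05       # rotor + load inertia, kg*m^2
T_load = 10.0  # constant load torque, N*m

def sync_speed(f_hz: float) -> float:
    """Mechanical synchronous speed in rad/s for a given supply frequency."""
    return 2 * np.pi * f_hz * 2 / poles

# Euler-integrate the mechanical equation J*dw/dt = T_motor - T_load
# while the frequency inverter steps the supply from 30 Hz to 60 Hz.
w, dt = 0.0, 1e-3
for t in np.arange(0.0, 2.0, dt):
    f = 30.0 if t < 1.0 else 60.0
    T_motor = k * (sync_speed(f) - w)
    w += dt * (T_motor - T_load) / J

print(f"final speed: {w:.1f} rad/s (sync: {sync_speed(60.0):.1f})")
```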
Abstract:
Simulating large and complex systems, such as computing grids, is a difficult task. Current simulators, despite providing accurate results, are significantly hard to use: they usually demand strong programming knowledge, which is not typical of today's users of grids and high-performance computing. This need for computing expertise prevents such users from simulating how the environment will respond to their applications, which may imply large losses of efficiency and waste precious computational resources. In this paper we introduce iSPD, the iconic Simulator of Parallel and Distributed systems, in which grid models are produced through an iconic interface. We describe the simulator and its intermediate model languages. The results presented here provide insight into its ease of use and accuracy.
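To give a flavor of what a grid simulator computes behind an iconic interface, here is a minimal, hypothetical dispatch model in Python; it is not iSPD's intermediate model language, and all names and numbers are invented.

```python
import heapq

# Minimal sketch of the kind of model an iconic grid simulator builds
# internally: machines with a relative processing power, tasks with a size,
# and a master that dispatches each task to the first machine to become free.
machines = {"node-a": 2.0, "node-b": 1.0}   # work units processed per second
tasks = [50.0, 30.0, 20.0, 40.0]            # work units per task

free_at = [(0.0, name) for name in machines]  # (time machine is free, name)
heapq.heapify(free_at)
for work in tasks:
    t_free, name = heapq.heappop(free_at)     # earliest-available machine
    finish = t_free + work / machines[name]
    heapq.heappush(free_at, (finish, name))

makespan = max(t for t, _ in free_at)
print(f"simulated makespan: {makespan:.1f} s")
```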
Abstract:
The adoption of ERP systems by small and medium-sized companies may not be possible because of their cost. At the same time, when adapting an ERP to the company's particular needs, the user remains dependent on the system's vendors because of the lack of access to, and knowledge of, the underlying code. Free and open-source software can offer advantages to enterprises; however, its adoption requires the development of techniques and tools that facilitate its deployment and code maintenance. This article emphasizes the importance of defining modeling architectures and reference models for the development and maintenance of open-source ERPs, in particular the ERP5 project.
Abstract:
Background: Cancer is the second leading cause of death in Argentina, yet little is known about its incidence. The first study based on a population-based cancer registry described spatial incidence and indicated that there was at least county-level aggregation. The aim of the present work is to model the incidence patterns of the highest-incidence cancers in Córdoba Province, Argentina, using information from the Córdoba Cancer Registry, by applying a multilevel mixed-model approach to deal with the dependence and unobserved heterogeneity arising from geo-referenced cancer occurrences. Methods: Standardized incidence rates (world standard population) (SIR) by sex, based on 5-year age groups, were calculated for 109 districts nested in 26 counties for the highest-incidence cancers in Córdoba, using the 2004 database. A Poisson two-level random-effects model representing unobserved heterogeneity between districts (first level) and counties (second level) was fitted to assess the spatial distribution of the overall and site-specific cancer incidence rates. Results: SIRs in Córdoba Province showed averages of 263.53±138.34 for men and 200.45±98.30 for women. Considering the ratio of the site-specific mean SIR to the total mean, the breast cancer ratio was 0.25±0.19 and the prostate cancer ratio was 0.12±0.10, with lower values for lung and colon cancer in both sexes. The Poisson two-level random-intercepts model, fitted to SIR data distributed with overdispersion, showed a significant hierarchical structure in the cancer incidence distribution. Conclusions: A strong spatially nested effect on cancer incidence in Córdoba was observed, which will help to begin the study of the factors associated with it.
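A hedged sketch of fitting a two-level random-intercept Poisson model in Python with statsmodels, on fabricated district/county counts; the paper's actual estimation approach and registry data are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import PoissonBayesMixedGLM

# Synthetic stand-in data: case counts for districts nested in counties.
# The numbers are fabricated; the paper uses 109 districts in 26 counties
# from the 2004 Cordoba registry.
rng = np.random.default_rng(1)
n_county, per_county = 10, 4
county = np.repeat(np.arange(n_county), per_county)
county_eff = rng.normal(0, 0.3, n_county)[county]
district_eff = rng.normal(0, 0.2, n_county * per_county)
cases = rng.poisson(np.exp(3.0 + county_eff + district_eff))

df = pd.DataFrame({"cases": cases, "county": county,
                   "district": np.arange(n_county * per_county)})

# Two-level random-intercept Poisson model: district nested in county;
# the district-level intercept also absorbs overdispersion.
model = PoissonBayesMixedGLM.from_formula(
    "cases ~ 1",
    {"county": "0 + C(county)", "district": "0 + C(district)"},
    df)
result = model.fit_vb()
print(result.summary())
```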
Abstract:
Research literature is replete with the importance of collaboration in schools, the lack of its implementation, the centrality of the role of the principal, and the existence of a gap between knowledge and practice--or a "Knowing-Doing Gap." In other words, there is a set of knowledge that principals must have in order to create a collaborative workplace environment for teachers. This study sought to describe what high school principals know about creating such a culture of collaboration. The researcher combed journal articles, studies, and professional literature to identify what principals must know in order to create a culture of collaboration. The result was ten elements of principal knowledge: Staff involvement in important decisions, Charismatic leadership not being necessary for success, Effective elements of teacher teams, Administrators' modeling of professional learning, The allocation of resources, Staff meetings focused on student learning, Elements of continuous improvement, and Principles of Adult Learning, Student Learning, and Change. From these ten elements, the researcher developed a web-based survey intended to measure nine of them (Charismatic leadership was excluded). Principals of accredited high schools in the state of Nebraska were invited to participate in this survey, as high schools are well known for the isolation that teachers experience--particularly as a result of departmentalization. The results indicate that principals have knowledge of eight of the nine measured elements; the one they lacked an understanding of was Principles of Student Learning. Given these findings of what principals do and do not know, the researcher recommends that professional organizations, intermediate service agencies, and district-level support staff engage in systematic and systemic initiatives to increase principals' knowledge of the lacking element. Further, given that eight of the nine elements are understood by principals, it would be wise to examine reasons for the implementation gap (the Knowing-Doing Gap) and how to overcome it.
Abstract:
Purpose - The aim of this study is to investigate whether knowledge management (KM) contributes to the development of strategic orientation and to enhancing innovativeness, and whether these three factors contribute to improved business performance. Design/methodology/approach - A sample of 241 Brazilian companies was surveyed using web-based questionnaires with 54 questions, with ten-point scales measuring the degree of agreement on each item of each construct. Structural equation modeling techniques were applied for model assessment and analysis of the relationships among constructs: exploratory factor analysis, confirmatory factor analysis, and path analysis were applied to the data. Findings - Effective KM contributes positively to strategic orientation. Although there is no significant direct effect of KM on innovativeness, the relationship is significant when mediated by strategic orientation. Similarly, effective KM has no direct effect on business performance, but this relationship becomes statistically significant when mediated by strategic orientation and innovativeness. Research limitations/implications - The findings indicate that KM permeates all relationships among the constructs, corroborating the argument that knowledge is an essential organizational resource that leverages all value-creating activities. The results indicate that both KM and innovativeness produce significant impacts on performance when they are aligned with a strategic orientation that enables the organization to anticipate and respond to changing market conditions. Originality/value - There is a substantial body of research on several types of relationships involving KM, strategic orientation, innovativeness, and performance. This study offers an original contribution by analyzing all of these constructs simultaneously, using established scales so that comparative studies are possible.
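The mediation structure reported in the findings can be sketched as a path model; the fragment below uses the semopy package on fabricated composite scores, so it illustrates the style of analysis rather than reproducing the study's full latent-variable SEM.

```python
import numpy as np
import pandas as pd
import semopy

# Hedged sketch of the mediation structure the paper reports
# (KM -> strategic orientation -> innovativeness -> performance),
# fitted here to fabricated data, not the 241-firm survey.
desc = """
SO ~ KM
INN ~ SO + KM
PERF ~ INN + SO + KM
"""

rng = np.random.default_rng(2)
n = 241
km = rng.normal(size=n)
so = 0.6 * km + rng.normal(scale=0.8, size=n)
inn = 0.5 * so + 0.05 * km + rng.normal(scale=0.8, size=n)
perf = 0.4 * inn + 0.3 * so + 0.05 * km + rng.normal(scale=0.8, size=n)
data = pd.DataFrame({"KM": km, "SO": so, "INN": inn, "PERF": perf})

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())   # path estimates and p-values
```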
Abstract:
The objective of this dissertation is to develop and test a predictive model of the passive kinematics of human joints based on the energy minimization principle. To pursue this goal, the tibio-talar joint is chosen as a reference joint for the reduced number of bones involved and for its simplicity compared with other synovial joints such as the knee or the wrist. Starting from knowledge of the articular surface shapes, the spatial trajectory of passive motion is obtained as the envelope of the joint configurations that maximize surface congruence. An increase in joint congruence corresponds to an improved capability of distributing an applied load, allowing the joint to attain better strength with less material. Thus, maximizing joint congruence is a simple geometric way to capture the idea of joint energy minimization. The results obtained are validated against in vitro measured trajectories, and preliminary comparisons provide strong support for the predictions of the theoretical model.
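A toy two-dimensional version of the congruence-maximization idea, with invented circular profiles standing in for the measured articular surfaces: the pose of one profile is optimized to minimize the squared gap to the other.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 2D congruence maximization: slide one articular profile (a circle of
# radius r_talus) under another (an arc of radius R_tibia) and pick the pose
# minimizing the squared gap between the surfaces, as a simple stand-in for
# maximal congruence. Shapes and radii are invented; the dissertation works
# with measured 3D surfaces.
theta = np.linspace(-0.5, 0.5, 50)
R_tibia = 22.0
tibia = np.column_stack([R_tibia * np.sin(theta),
                         -R_tibia * np.cos(theta)])  # concave lower arc

def gap(pose, r_talus=20.0):
    """Sum of squared distances from tibial points to a talar circle
    centered at pose = (x, y) with radius r_talus."""
    x, y = pose
    d = np.hypot(tibia[:, 0] - x, tibia[:, 1] - y) - r_talus
    return np.sum(d ** 2)

best = minimize(gap, x0=[1.0, -1.0], method="Nelder-Mead")
print("most congruent center:", best.x)  # expect near (0, -2) = R_tibia - r_talus below origin
```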
Abstract:
The term "brain imaging" identifies a set of techniques for analyzing the structure and/or functional behavior of the brain in normal and/or pathological situations. These techniques are widely used in the study of brain activity. In addition to clinical usage, the analysis of brain activity is gaining popularity in other recent fields, e.g. brain-computer interfaces (BCI) and the study of cognitive processes. In this context, the use of classical solutions (e.g. fMRI, PET-CT) can be unfeasible, owing to their low temporal resolution, high cost, and limited portability. For these reasons, alternative low-cost techniques are an object of research, typically based on simple recording hardware and on an intensive data elaboration process. Typical examples are electroencephalography (EEG) and electrical impedance tomography (EIT), in which the electric potential at the patient's scalp is recorded by high-impedance electrodes: in EEG the potentials are generated directly by neuronal activity, while in EIT they arise from the injection of small currents at the scalp. To retrieve meaningful insights on brain activity from the measurements, EIT and EEG rely on detailed knowledge of the underlying electrical properties of the body, obtained from numerical models of the electric field distribution therein. The inhomogeneous and anisotropic electrical properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility that currently limits the capabilities of these techniques severely. Moreover, the elaboration of the recorded data requires computationally intensive regularization techniques, which constrains applications with strict temporal requirements (such as BCI). This work focuses on the parallel implementation of a workflow for EEG and EIT data processing. The resulting software is accelerated using multi-core GPUs, in order to provide solutions in reasonable times and meet the requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.
Abstract:
The field of complex systems is a growing body of knowledge that can be applied to countless different topics, from physics to computer science, biology, information theory, and sociology. The main focus of this work is the use of microscopic models to study the behavior of urban mobility, whose characteristics make it a paradigmatic example of complexity. In particular, simulations are used to investigate phase changes in a finite-size open Manhattan-like urban road network under different traffic conditions, in search of parameters that identify phase transitions and equilibrium and non-equilibrium conditions. It is shown how the flow-density macroscopic fundamental diagram of the simulation exhibits, like real traffic, hysteresis in the transition from the congested phase to the free-flow phase, and how the different regimes can be identified by studying the statistics of road occupancy.
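As an illustration of how a flow-density relation is measured from a microscopic model, here is a standard Nagel-Schreckenberg cellular automaton on a 1D ring road; the paper's network is a 2D open Manhattan-like grid, so this sketch is a methodological analogue only, with illustrative parameters.

```python
import numpy as np

# Nagel-Schreckenberg cellular automaton on a ring road: a minimal standard
# model for measuring a flow-density relation from microscopic dynamics.
def nasch_flow(density, L=1000, vmax=5, p_slow=0.3, steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    n_cars = max(1, int(density * L))
    pos = np.sort(rng.choice(L, n_cars, replace=False))  # positions on the ring
    vel = np.zeros(n_cars, dtype=int)
    moved = 0
    for t in range(steps):
        gap = (np.roll(pos, -1) - pos - 1) % L   # empty cells to the car ahead
        vel = np.minimum(vel + 1, vmax)          # accelerate
        vel = np.minimum(vel, gap)               # brake to keep a safe distance
        slow = rng.random(n_cars) < p_slow       # random slowdowns
        vel[slow] = np.maximum(vel[slow] - 1, 0)
        pos = (pos + vel) % L
        if t >= steps // 2:                      # measure after the transient
            moved += vel.sum()
    return moved / (steps // 2) / L              # mean flow: cars per cell per step

for rho in (0.05, 0.15, 0.35, 0.60):
    print(f"density {rho:.2f} -> flow {nasch_flow(rho):.3f}")
```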
Abstract:
The interaction between proteins and inorganic surfaces is fascinating from both an applied and a theoretical perspective. It is an important aspect of many applications, among them surgical implants and biosensors, and it is also an example of theoretical questions concerning the interface between hard and soft matter. What is certain is that knowledge of the mechanisms involved is required to understand, predict, and optimize the interaction between proteins and surfaces. Recent advances in experimental research have made it possible to investigate direct peptide-metal binding, which has brought the exploration of the theoretical foundations further into the focus of current research. One way to investigate the interaction between proteins and inorganic surfaces is through computer simulations. Although simulations of metal surfaces or proteins as separate systems have long been common, simulating a combination of the two brings new difficulties. Overcoming them requires a multiscale approach: while proteins, as biological systems, can be described adequately with classical molecular dynamics, describing the delocalized electrons of metallic systems requires a quantum mechanical formulation. The most important prerequisite of a multiscale approach is agreement between the simulations at the different scales; in this work, this is achieved by linking simulations at alternating scales. The work begins with an investigation of the thermodynamics of benzene hydration using classical molecular dynamics. The interaction between water and the [111] metal surfaces of gold and nickel is then modeled by means of a multiscale approach. In a further step, the adsorption of benzene on metal surfaces in an aqueous environment is studied. Finally, the modeling is extended to include the amino acids alanine and phenylalanine. This opens up the possibility of considering realistic protein-metal systems in computer simulations and of predicting, on a theoretical basis, the interaction between peptides and surfaces for any kind of peptide and surface.