928 results for Knowledge organization systems


Relevance:

30.00%

Publisher:

Abstract:

The Internet and World Wide Web have had, and continue to have, an incredible impact on our civilization. These technologies have radically influenced the way that society is organised and the manner in which people around the world communicate and interact. The structure and function of individual, social, organisational, economic and political life begin to resemble the digital network architectures upon which they are increasingly reliant. It is increasingly difficult to imagine how our ‘offline’ world would look or function without the ‘online’ world; it is becoming less meaningful to distinguish between the ‘actual’ and the ‘virtual’. Thus, the major architectural project of the twenty-first century is to “imagine, build, and enhance an interactive and ever changing cyberspace” (Lévy, 1997, p. 10). Virtual worlds are at the forefront of this evolving digital landscape. Virtual worlds have “critical implications for business, education, social sciences, and our society at large” (Messinger et al., 2009, p. 204). This study focuses on the possibilities of virtual worlds in terms of communication, collaboration, innovation and creativity. The concept of knowledge creation is at the core of this research. The study shows that scholars increasingly recognise that knowledge creation, as a socially enacted process, goes to the very heart of innovation. However, efforts to build upon these insights have struggled to escape the influence of the information processing paradigm of old and have failed to move beyond the persistent but problematic conceptualisation of knowledge creation in terms of tacit and explicit knowledge. Based on these insights, the study leverages extant research to develop the conceptual apparatus necessary to carry out an investigation of innovation and knowledge creation in virtual worlds. The study derives and articulates a set of definitions (of virtual worlds, innovation, knowledge and knowledge creation) to guide research. The study also leverages a number of extant theories in order to develop a preliminary framework to model knowledge creation in virtual worlds. Using a combination of participant observation and six case studies of innovative educational projects in Second Life, the study yields a range of insights into the process of knowledge creation in virtual worlds and into the factors that affect it. The study’s contributions to theory are expressed as a series of propositions and findings and are represented as a revised and empirically grounded theoretical framework of knowledge creation in virtual worlds. These findings highlight the importance of prior related knowledge and intrinsic motivation in shaping and stimulating knowledge creation in virtual worlds. At the same time, they highlight the importance of meta-knowledge (knowledge about knowledge) in guiding the knowledge creation process, whilst revealing the diversity of behavioural approaches actually used to create knowledge in virtual worlds. This theoretical framework is itself one of the chief contributions of the study, and the analysis explores how it can be used to guide further research in virtual worlds and on knowledge creation. The study’s contributions to practice are presented as an actionable guide to stimulating knowledge creation in virtual worlds. This guide utilises a theoretically based classification of four knowledge-creator archetypes (the sage, the lore master, the artisan, and the apprentice) and derives an actionable set of behavioural prescriptions for each archetype. The study concludes with a discussion of its implications for future research.

Relevance:

30.00%

Publisher:

Abstract:

Due to growing concerns regarding anthropogenic interference with the climate system, countries across the world are being challenged to develop effective strategies to mitigate climate change by reducing or preventing greenhouse gas (GHG) emissions. The European Union (EU) is committed to contributing to this challenge by setting a number of climate and energy targets for the years 2020, 2030 and 2050 and then agreeing effort sharing amongst Member States. This thesis focuses on one Member State, Ireland, which faces specific challenges and is not on track to meet the targets agreed to date. Before this work commenced, there were no projections of energy demand or supply for Ireland beyond 2020. This thesis uses techno-economic energy modelling instruments to address this knowledge gap. It builds and compares robust, comprehensive policy scenarios, providing a means of assessing the implications of different future energy and emissions pathways for the Irish economy, Ireland’s energy mix and the environment. A central focus of this thesis is to explore the dynamics of the energy system moving towards a low carbon economy. The thesis develops an energy systems model (the Irish TIMES model) to assess the implications of a range of energy and climate policy targets and target years. It compares the results generated from the least cost scenarios with official projections and target pathways, and provides useful metrics and indicators to identify key drivers and to support both policy makers and stakeholders in identifying cost optimal strategies. The thesis also extends the functionality of energy system modelling by developing and applying new methodologies that provide additional insights into particular issues emerging from the scenario analysis. Firstly, it develops a methodology for soft-linking an energy systems model (Irish TIMES) with a power systems model (PLEXOS) to improve the interpretation of the electricity sector results in the energy system model; the soft-linking enables higher temporal resolution and improved characterisation of power plants and power system operation. Secondly, it develops a methodology for integrating agriculture and energy systems modelling to enable coherent economy-wide climate mitigation scenario analysis; this provides a very useful starting point for considering the trade-offs between the energy system and agriculture in the context of a low carbon economy and for enabling analysis of land-use competition. Three time-scale perspectives are examined in this thesis (2020, 2030, 2050), aligning with key policy target horizons. The results indicate that Ireland’s short term mandatory emissions reduction target will not be achieved without a significant reassessment of renewable energy policy, and that the current dominant policy focus on wind-generated electricity is misplaced. In the medium to long term, the results suggest that energy efficiency is the first cost effective measure for delivering emissions reduction; that biomass and biofuels are likely to be the most significant fuel source for Ireland in a low carbon future, prompting the need for a detailed assessment of the implications for sustainability and competition with the agri-food sectors; that significant infrastructure changes are required to deliver deep emissions reductions (to enable the electrification of heat and transport, to accommodate carbon capture and storage (CCS) facilities and for biofuels); and that competition between energy and agriculture for land use will become a key issue. The purpose of this thesis is to strengthen the evidence base underpinning energy and climate policy decisions in Ireland. The methodology is replicable in other Member States.
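
As a rough illustration of the soft-linking methodology, the sketch below shows only the information flow between an energy systems model and a power systems model: the least-cost portfolio is handed to an hourly operational check, and any infeasibility is fed back as a constraint. Both model runs are stubbed with toy logic; every function, number and convergence rule here is a hypothetical placeholder, not the Irish TIMES/PLEXOS implementation.

    # Stub for an energy systems model run (TIMES-like): least-cost capacity
    # mix subject to any constraints fed back from the operational check.
    def run_energy_model(constraints):
        wind = min(6.0, constraints.get("wind_cap_limit", 6.0))  # GW, toy numbers
        return {"wind": wind, "gas": 8.0 - wind}

    # Stub for a power systems model run (PLEXOS-like): hourly operation is
    # deemed feasible only if enough firm capacity is available.
    def run_power_model(portfolio):
        shortfall = max(0.0, 3.0 - portfolio["gas"])
        return {"feasible": shortfall == 0.0, "shortfall": shortfall}

    constraints = {}
    for _ in range(5):  # iterate until the portfolio is operationally feasible
        portfolio = run_energy_model(constraints)
        check = run_power_model(portfolio)
        if check["feasible"]:
            break
        constraints["wind_cap_limit"] = portfolio["wind"] - check["shortfall"]
    print(portfolio, check)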

Relevance:

30.00%

Publisher:

Abstract:

Knowledge sharing research typically examines the organizational transfer of knowledge, often from headquarters to subsidiaries, from developed country sites to emerging country sites, or from host to local employees. Yet recent research, such as Prahalad’s Bottom of the Pyramid, raises the question of reverse transfer of knowledge: whether knowledge could and should be transferred from local sites to home country sites within an organization. As several emerging economies build their capabilities in knowledge, research and development, marketing, and the like, it only makes sense to consider what types of knowledge to transfer, and how to transfer them, in reverse or bi-directional manners. This paper takes one step back in the process. Rather than focusing on what knowledge transfer may make sense within an organization, we consider what types of knowledge are important for foreigners to know at the initial stages of engagement abroad, as they consider whether to do business in an emerging country.

Relevance:

30.00%

Publisher:

Abstract:

The Veterans Health Administration (VHA) in the Department of Veteran Affairs (VA) has emerged as a national and international leader in the delivery and research of telehealth-based treatment. Several unique characteristics of care in VA settings intersect to create an ideal environment for telehealth modalities and research. However, the value of telehealth experience and initiatives in VA settings is limited if telehealth strategies cannot be widely exported to other public or private systems. Whereas a hierarchical organization, such as VA, can innovate and fund change relatively quickly based on provider and patient preferences and a growing knowledge base, other health provider organizations and third-party payers will likely require replicable scientific findings over time before incremental investments are made to create infrastructure, reform regulatory barriers, and amend laws to accommodate expansion of telehealth modalities. Accordingly, large-scale scientifically rigorous telehealth research in VHA settings is essential not only to investigate the efficacy of existing and future telehealth practices in VHA, but also to hasten the development of telehealth infrastructure in private and other public health settings. We propose an expanded partnership between the VA, NIH, and other funding agencies to investigate creative and pragmatic uses of telehealth technology. To this end, we identify six specific areas of research we believe to be particularly relevant to the efficient development of telehealth modalities in civilian and military contexts outside VHA.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To demonstrate the feasibility of using a knowledge base of prior treatment plans to generate new prostate intensity modulated radiation therapy (IMRT) plans. Each new case is matched against others in the knowledge base; once the best match is identified, that clinically approved plan is used to generate the new plan. METHODS: A database of 100 prostate IMRT treatment plans was assembled into an information-theoretic system. An algorithm based on mutual information was implemented to identify similar patient cases by matching 2D beam's eye view projections of contours. Ten randomly selected query cases were each matched with the most similar case from the database of prior clinically approved plans. Treatment parameters from the matched case were used to develop new treatment plans. Differences in the dose-volume histograms between the new and the original treatment plans were analyzed. RESULTS: On average, the new knowledge-based plan achieves planning target volume coverage comparable to the original plan, to within 2% as evaluated for D98, D95, and D1. Similarly, the doses to the rectum and bladder are comparable to the original plan. For the rectum, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are 1.8% +/- 8.5%, -2.5% +/- 13.9%, and -13.9% +/- 23.6%, respectively. For the bladder, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are -5.9% +/- 10.8%, -12.2% +/- 14.6%, and -24.9% +/- 21.2%, respectively. A negative percentage difference indicates that the new plan has greater dose sparing than the original plan. CONCLUSIONS: The authors demonstrate a knowledge-based approach of using prior clinically approved treatment plans to generate clinically acceptable treatment plans of high quality. This semiautomated approach has the potential to improve the efficiency of the treatment planning process while ensuring that high quality plans are developed.
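
As a hedged sketch of the matching step, the code below computes the mutual information between two rasterised beam's-eye-view projections and picks the most similar prior case. The histogram binning and the database record layout are assumptions for illustration, not the authors' implementation.

    import numpy as np

    def mutual_information(bev_a, bev_b, bins=32):
        # Joint histogram of the two 2D projections, normalised to a pmf.
        joint, _, _ = np.histogram2d(bev_a.ravel(), bev_b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)   # marginal of projection A
        py = pxy.sum(axis=0, keepdims=True)   # marginal of projection B
        nz = pxy > 0                          # skip empty cells to avoid log(0)
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    def best_match(query_bev, database):
        # `database` is assumed to be a list of {"bev": 2D array, ...} records.
        return max(database, key=lambda case: mutual_information(query_bev, case["bev"]))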

Relevance:

30.00%

Publisher:

Abstract:

Marine protected areas (MPAs) are often implemented to conserve or restore species, fisheries, habitats, ecosystems, and ecological functions and services; buffer against the ecological effects of climate change; and alleviate poverty in coastal communities. Scientific research provides valuable insights into the social and ecological impacts of MPAs, as well as the factors that shape these impacts, providing useful guidance or "rules of thumb" for science-based MPA policy. Both ecological and social factors foster effective MPAs, including substantial coverage of representative habitats and oceanographic conditions; diverse size and spacing; protection of habitat bottlenecks; participatory decision-making arrangements; bounded and contextually appropriate resource use rights; active and accountable monitoring and enforcement systems; and accessible conflict resolution mechanisms. For MPAs to realize their full potential as a tool for ocean governance, further advances in policy-relevant MPA science are required. These research frontiers include MPA impacts on nontarget and wide-ranging species and habitats; impacts beyond MPA boundaries, on ecosystem services, and on resource-dependent human populations, as well as potential scale mismatches of ecosystem service flows. Explicitly treating MPAs as "policy experiments" and employing the tools of impact evaluation holds particular promise as a way for policy-relevant science to inform and advance science-based MPA policy. © 2011 Wiley Periodicals, Inc.

Relevance:

30.00%

Publisher:

Abstract:

© 2013 American Psychological Association. This meta-analysis synthesizes research on the effectiveness of intelligent tutoring systems (ITS) for college students. Thirty-five reports were found containing 39 studies assessing the effectiveness of 22 types of ITS in higher education settings. Most frequently studied were AutoTutor, Assessment and Learning in Knowledge Spaces, eXtended Tutor-Expert System, and Web Interface for Statistics Education. Major findings include (a) overall, ITS had a moderate positive effect on college students' academic learning (g = .32 to g = .37); (b) ITS were less effective than human tutoring, but they outperformed all other instruction methods and learning activities, including traditional classroom instruction, reading printed text or computerized materials, computer-assisted instruction, laboratory or homework assignments, and no-treatment control; (c) ITS effectiveness did not significantly differ by type of ITS, subject domain, or the manner or degree of their involvement in instruction and learning; and (d) effectiveness in earlier studies appeared to be significantly greater than that in more recent studies. In addition, there is some evidence suggesting the importance of teachers and pedagogy in ITS-assisted learning.
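
The reported effect sizes (g) are bias-corrected standardized mean differences; a minimal sketch of the standard Hedges' g computation follows, assuming the usual pooled-standard-deviation formula rather than the authors' exact meta-analytic procedure.

    import math

    def hedges_g(m_treat, m_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
        # Pooled standard deviation across the two groups.
        df = n_treat + n_ctrl - 2
        s_pooled = math.sqrt(((n_treat - 1) * sd_treat ** 2 +
                              (n_ctrl - 1) * sd_ctrl ** 2) / df)
        d = (m_treat - m_ctrl) / s_pooled   # Cohen's d
        j = 1 - 3 / (4 * df - 1)            # small-sample bias correction
        return j * d

    # e.g. an ITS group versus a conventional-instruction control group
    print(round(hedges_g(78.0, 72.0, 12.0, 13.0, 40, 40), 2))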

Relevance:

30.00%

Publisher:

Abstract:

To maintain a strict balance between demand and supply in the US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. This model determines, for each time period and power plant, the times of startup and shutdown, the amount of power production, and the provisioning of spinning and non-spinning power generation reserves. Such a deterministic optimization model takes as input the characteristics of all the generating units, such as their installed power generation capacity, ramp rates, minimum up and down time requirements, and marginal costs of production, as well as the forecast of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. This reserve requirement is determined based on the likelihood of outages on the supply side and on the levels of forecast error in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than using fixed reserve targets as an input, stochastic market clearing models take different scenarios of wind power into consideration and determine the reserve schedule as an output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of stochastic and deterministic models in market clearing. The two models are compared in their ability to contribute to the affordability, reliability and sustainability of the electricity system, measured in terms of total operational costs, load shedding and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain, due to the multi-dimensional performance metrics considered here and the difficulty of setting up the parameters of the models in a way that does not advantage or disadvantage one modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL) and wind spillage costs have on the comparison of the performance of stochastic versus deterministic market clearing models.
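
To make the deterministic-versus-stochastic distinction concrete, here is a toy single-period clearing problem: the deterministic model commits thermal generation against expected wind plus a fixed reserve margin, while the stochastic model picks the commitment that minimises expected cost over the wind scenarios. All numbers (load, costs, VOLL, scenarios) are illustrative and have no connection to the PJM/BPA data used in the paper.

    import numpy as np

    LOAD, MC, VOLL, SPILL_COST = 100.0, 30.0, 3500.0, 5.0  # MW and $/MWh, toy values
    wind = np.array([10.0, 30.0, 50.0])   # wind scenarios (MW)
    prob = np.array([0.3, 0.4, 0.3])      # scenario probabilities

    def expected_cost(g):
        # Commitment g is binding in real time; shortfalls are shed at VOLL,
        # surpluses are spilled at SPILL_COST.
        shed = np.maximum(LOAD - g - wind, 0.0)
        spill = np.maximum(g + wind - LOAD, 0.0)
        return float(prob @ (MC * g + VOLL * shed + SPILL_COST * spill))

    g_det = LOAD - prob @ wind + 10.0  # expected wind plus a fixed 10 MW reserve
    g_sto = min(np.linspace(0.0, LOAD, 1001), key=expected_cost)
    print(f"deterministic: g={g_det:.0f} MW, cost={expected_cost(g_det):.0f}")
    print(f"stochastic:    g={g_sto:.0f} MW, cost={expected_cost(g_sto):.0f}")

In this toy setting the stochastic commitment covers the worst wind scenario because lost load is far more expensive than spillage, which is the behaviour a fixed reserve target can only approximate.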

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Scientists rarely reuse expert knowledge of phylogeny, in spite of years of effort to assemble a great "Tree of Life" (ToL). A notable exception involves the use of Phylomatic, which provides tools to generate custom phylogenies from a large, pre-computed, expert phylogeny of plant taxa. This suggests great potential for a more generalized system that, starting with a query consisting of a list of any known species, would rectify non-standard names, identify expert phylogenies containing the implicated taxa, prune away unneeded parts, and supply branch lengths and annotations, resulting in a custom phylogeny suited to the user's needs. Such a system could become a sustainable community resource if implemented as a distributed system of loosely coupled parts that interact through clearly defined interfaces. RESULTS: With the aim of building such a "phylotastic" system, the NESCent Hackathons, Interoperability, Phylogenies (HIP) working group recruited 2 dozen scientist-programmers to a weeklong programming hackathon in June 2012. During the hackathon (and a three-month follow-up period), 5 teams produced designs, implementations, documentation, presentations, and tests including: (1) a generalized scheme for integrating components; (2) proof-of-concept pruners and controllers; (3) a meta-API for taxonomic name resolution services; (4) a system for storing, finding, and retrieving phylogenies using semantic web technologies for data exchange, storage, and querying; (5) an innovative new service, DateLife.org, which synthesizes pre-computed, time-calibrated phylogenies to assign ages to nodes; and (6) demonstration projects. These outcomes are accessible via a public code repository (GitHub.com), a website (http://www.phylotastic.org), and a server image. CONCLUSIONS: Approximately 9 person-months of effort (centered on a software development hackathon) resulted in the design and implementation of proof-of-concept software for 4 core phylotastic components, 3 controllers, and 3 end-user demonstration tools. While these products have substantial limitations, they suggest considerable potential for a distributed system that makes phylogenetic knowledge readily accessible in computable form. Widespread use of phylotastic systems will create an electronic marketplace for sharing phylogenetic knowledge that will spur innovation in other areas of the ToL enterprise, such as annotation of sources and methods and third-party methods of quality assessment.
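
The "prune away unneeded parts" step can be sketched in a few lines: prune a phylogeny, here a nested list, to a query species list, collapsing internal nodes left with a single child. Real phylotastic components operate on Newick and semantic-web representations via web services, so this shows only the core idea.

    def prune(tree, keep):
        # A tree is a leaf name (str) or a list of child subtrees.
        if isinstance(tree, str):
            return tree if tree in keep else None
        kept = [sub for sub in (prune(child, keep) for child in tree)
                if sub is not None]
        if not kept:
            return None
        return kept[0] if len(kept) == 1 else kept  # collapse unary nodes

    plants = [["Zea_mays", "Oryza_sativa"],
              ["Arabidopsis_thaliana", "Populus_trichocarpa"]]
    print(prune(plants, {"Zea_mays", "Arabidopsis_thaliana"}))
    # -> ['Zea_mays', 'Arabidopsis_thaliana']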

Relevance:

30.00%

Publisher:

Abstract:

New product design challenges, related to customer needs, product usage and environments, face companies when they expand their product offerings to new markets. Some of the main challenges are the lack of quantifiable information, product experience and field data. Designing reliable products under such challenges requires flexible reliability assessment processes that can capture the variables and parameters affecting the product's overall reliability and allow different design scenarios to be assessed. These challenges also suggest that a mechanistic (Physics of Failure, PoF) reliability approach would be a suitable framework for reliability assessment. Mechanistic reliability recognizes the primary factors affecting design reliability. This research views the designed entity as a “system of components required to deliver specific operations”. It addresses the above challenges by, firstly, developing a design synthesis that allows descriptive operations/system-components relationships to be realized; secondly, developing components' mathematical damage models that evaluate their Time to Failure (TTF) distributions given (1) the descriptive design model, (2) customer usage knowledge and (3) design material properties; and lastly, developing a procedure that integrates the components' damage models to assess the mechanical system's reliability over time. Analytical and numerical simulation models were developed to capture the relationships between operations and components, the mathematical damage models, and the assessment of system reliability. The process was able to affect the design form during the conceptual design phase by providing stress goals to meet components' reliability targets, and to numerically assess the reliability of a system based on components' mechanistic TTF distributions while shaping the design of the component during the design embodiment phase. The process was used to assess the reliability of an internal combustion engine manifold during the design phase; results were compared to reliability field data and found to be conservative. The research focused on mechanical systems affected by independent mechanical failure mechanisms that are influenced by the design process; assembly and manufacturing stresses and defects are not a focus of this research.
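
A minimal sketch of the final integration step, assuming independent Weibull-distributed failure mechanisms acting in series, so that system reliability is the product of the component reliabilities, R_sys(t) = prod_i R_i(t). The component names and parameters are illustrative, not those of the engine-manifold study.

    import numpy as np

    # (name, Weibull shape beta, Weibull scale eta in hours) - illustrative only
    components = [("runner_fatigue", 2.0, 8000.0),
                  ("gasket_creep", 1.5, 12000.0),
                  ("boss_cracking", 3.0, 10000.0)]

    def weibull_reliability(t, beta, eta):
        # R(t) = exp(-(t/eta)^beta) for a Weibull time-to-failure model.
        return np.exp(-((t / eta) ** beta))

    t = np.linspace(0.0, 5000.0, 6)
    r_sys = np.prod([weibull_reliability(t, b, e) for _, b, e in components], axis=0)
    for ti, ri in zip(t, r_sys):
        print(f"t = {ti:5.0f} h   R_sys = {ri:.3f}")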

Relevance:

30.00%

Publisher:

Abstract:

Relatives of Planetary Nebulae, such as barium stars or symbiotic systems, can shed light on the connection between Planetary Nebulae and binarity. Because of the observational selection effects against direct spectroscopic detection of binary PNe cores with orbital periods longer than a few dozen days, at present these "awkward relatives" are a critical source of our knowledge about the binary PNe population at longer periods. A few examples are discussed below, posing constraints on attempts to model the nebula ejection process in a binary. © 2006 International Astronomical Union.

Relevance:

30.00%

Publisher:

Abstract:

The performance of loadsharing algorithms for heterogeneous distributed systems is investigated by simulation. The systems considered are networks of workstations (nodes) which differ in processing power. Two parameters are proposed for characterising system heterogeneity, namely the variance and skew of the distribution of processing power among the network nodes. A variety of networks are investigated, with the same number of nodes and total processing power, but with the processing power distributed differently among the nodes. Two loadsharing algorithms are evaluated, at overall system loadings of 50% and 90%, using job response time as the performance metric. Comparison is made with the ideal situation of ‘perfect sharing’, where it is assumed that the communication delays are zero and that complete knowledge is available about job lengths and the loading at the different nodes, so that an arriving job can be sent to the node where it will be completed in the shortest time. The algorithms studied are based on those already in use for homogeneous networks, but were adapted to take account of system heterogeneity. Both algorithms take into account the differences in the processing powers of the nodes in their location policies, but differ in the extent to which they ‘discriminate’ against the slower nodes. It is seen that the relative performance of the two is strongly influenced by the system utilisation and the distribution of processing power among the nodes.
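
A minimal sketch of the two proposed heterogeneity parameters and of the 'perfect sharing' baseline decision, under the same assumptions the paper states for that baseline (zero communication delay, full knowledge of job lengths and node loads). The numbers are illustrative.

    import statistics

    def heterogeneity(powers):
        # Variance and skew of the processing-power distribution across nodes.
        n, mean = len(powers), statistics.mean(powers)
        var = statistics.pvariance(powers)
        skew = (sum((p - mean) ** 3 for p in powers) / (n * var ** 1.5)
                if var else 0.0)
        return var, skew

    def perfect_sharing_node(job_length, powers, queued_work):
        # Send the job to the node where it would complete soonest.
        finish = [(q + job_length) / p for p, q in zip(powers, queued_work)]
        return min(range(len(powers)), key=finish.__getitem__)

    powers = [4.0, 2.0, 1.0, 1.0]  # same total power, unevenly distributed
    print(heterogeneity(powers))
    print(perfect_sharing_node(10.0, powers, queued_work=[8.0, 0.0, 0.0, 0.0]))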

Relevance:

30.00%

Publisher:

Abstract:

A cross-domain workflow application may be constructed using a standard reference model such as the one by the Workflow Management Coalition (WfMC) [7], but the requirements for this type of application are inherently different from one organization to another. The existing models, and the systems built around them, meet some but not all of the requirements of all the organizations involved in a collaborative process. Furthermore, the requirements change over time. This makes the applications difficult to develop and distribute. Service Oriented Architecture (SOA) based approaches such as BPEL (Business Process Execution Language) intend to provide a solution but fail to address the problems sufficiently, especially in situations where the expectations and skill levels of the users (e.g. the participants in the processes) in different organizations are likely to differ. In this paper, we discuss a design pattern that provides a novel approach towards a solution. In the solution, business users can design the applications at a high level of abstraction: the use cases and user interactions. The designs are documented and used, together with the data and events captured later that represent the user interactions with the systems, to feed an intermediate component local to the users, the IFM (InterFace Mapper), which bridges the gaps between the users and the systems. We discuss the main issues faced in the design and prototyping. The approach alleviates the need for re-programming against the APIs of any back-end service, thus easing the development and distribution of the applications.
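
A minimal sketch of the IFM idea under stated assumptions: the front end raises abstract use-case events, and the IFM holds per-organization bindings from those use cases to back-end handlers, so no front-end code calls a service API directly. All class and binding names are hypothetical.

    from typing import Callable, Dict

    class InterFaceMapper:
        # Bridges abstract use cases (what business users design) to whatever
        # back-end services a given organization runs.
        def __init__(self):
            self._bindings: Dict[str, Callable[..., object]] = {}

        def bind(self, use_case: str, handler: Callable[..., object]) -> None:
            self._bindings[use_case] = handler  # re-binding needs no UI change

        def dispatch(self, use_case: str, **event):
            # Forward a captured user interaction to the bound back-end service.
            return self._bindings[use_case](**event)

    ifm = InterFaceMapper()
    ifm.bind("approve_order", lambda order_id: f"workflow engine approved {order_id}")
    print(ifm.dispatch("approve_order", order_id=42))

Swapping the back-end service then means re-binding a use case in the IFM rather than re-programming the user-facing application.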

Relevance:

30.00%

Publisher:

Abstract:

Western manufacturing companies are developing innovative ways of delivering value that compete with the low cost paradigm. One such strategy is to deliver not only products, but systems that are closely aligned with the customer value proposition. These systems are composed of integrated products and services, and are referred to as Product-Service Systems (PSS). A key challenge in PSS is supporting the design activity. In one sense, PSS design is a further extension of concurrent engineering that requires front-end input from the additional downstream sources of product service and maintenance. However, simply developing products and service packages is not sufficient: the new design challenge is the integrated system. This paper describes the development of a PSS data structure that can support this integrated design activity. The data structure is implemented in a knowledge base using the Protégé knowledge base editor.
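
As a rough illustration of the shape such a data structure might take, the sketch below models a PSS as products and services integrated against a customer value proposition. The field names are assumptions for illustration, not the paper's Protégé schema.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Product:
        name: str
        components: List[str] = field(default_factory=list)

    @dataclass
    class Service:
        name: str
        delivered_by: str  # e.g. field maintenance, remote monitoring

    @dataclass
    class ProductServiceSystem:
        # The integrated system, not the product or service alone, is the
        # unit of design, aligned with the customer value proposition.
        value_proposition: str
        products: List[Product] = field(default_factory=list)
        services: List[Service] = field(default_factory=list)

    pss = ProductServiceSystem(
        value_proposition="guaranteed compressed-air availability",
        products=[Product("compressor", ["motor", "pump"])],
        services=[Service("condition monitoring", "remote telemetry")])
    print(pss.value_proposition, len(pss.products), len(pss.services))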

Relevance:

30.00%

Publisher:

Abstract:

The enormous growth of wireless communication systems makes it important to evaluate the capacity of such channels. Multiple Input Multiple Output (MIMO) wireless communication systems are shown to yield significant improvements in data rates when compared to traditional Single Input Single Output (SISO) wireless systems. The benefits of multiple antenna elements at the transmitter and receiver have become central to the research and development of the next generation of mobile communication systems. In this paper we propose the use of relaying MIMO wireless communication systems for transmission over long distances. We investigate how relays can be used in a "demodulate-and-forward" operation when the transmitter is equipped with spatially correlated multiple antenna elements and the receiver has only partial knowledge of the statistics of the channel. We show that relays between the source and destination nodes of a wireless communication system in MIMO configuration improve the throughput of the system when compared to typical MIMO systems, or achieve the desired channel capacity with significantly lower power resources.
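
A hedged numpy sketch of the comparison: per-hop MIMO capacity with equal power allocation and no channel knowledge at the transmitter, and a crude demodulate-and-forward bound in which the end-to-end rate is limited by the weaker hop. The 10 dB direct-path penalty and the omission of the half-duplex factor are simplifying assumptions, not results from the paper.

    import numpy as np

    def mimo_capacity(H, snr):
        # C = log2 det(I + (snr/Nt) * H H^H), bit/s/Hz, equal power per antenna.
        nr, nt = H.shape
        gram = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
        return float(np.log2(np.linalg.det(gram).real))

    rng = np.random.default_rng(0)
    rayleigh = lambda nr, nt: (rng.standard_normal((nr, nt)) +
                               1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

    snr = 10.0  # linear SNR on each relay hop
    H_direct, H_sr, H_rd = rayleigh(2, 2), rayleigh(2, 2), rayleigh(2, 2)

    c_direct = mimo_capacity(H_direct, 0.1 * snr)  # direct path 10 dB weaker
    c_relayed = min(mimo_capacity(H_sr, snr), mimo_capacity(H_rd, snr))
    print(f"direct: {c_direct:.2f} bit/s/Hz   relayed: {c_relayed:.2f} bit/s/Hz")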