754 results for Knowledge-Capital Model
Abstract:
In this paper we show how political uncertainty may impede economic growth by reducing public investment in the formation of human capital, and how this negative effect of political uncertainty can be offset by a government contract. We present a model of growth with accumulation of human capital and government investment in education. We show that in a country with an unstable political system the government is reluctant to invest in human capital. Low government spending on education negatively affects productivity and slows growth. Furthermore, a politically unstable economy may be trapped in a stagnant equilibrium. We also demonstrate the role of a government retirement contract. Public investment in education and economic growth are higher when the future retirement compensation of the government depends on the future national income, in comparison with investment under zero or fixed retirement compensation.
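Purely as an illustrative sketch (not the authors' specification; every symbol below is an assumption introduced here), the mechanism described can be summarised by a reduced-form growth model in which public education spending drives human-capital accumulation and political instability weakens the incumbent's incentive to spend:

```latex
% Illustrative only: generic notation, not the paper's model.
\begin{align*}
  y_t     &= A\,h_t^{\alpha}            && \text{output produced with human capital } h_t \\
  h_{t+1} &= (1-\delta)\,h_t + B\,e_t   && \text{human capital accumulated via public education spending } e_t \\
  e_t     &= \tau\,\pi_t\,y_t           && \text{spending chosen by an incumbent who survives in office with probability } \pi_t
\end{align*}
```

Under this toy formulation, a lower survival probability \(\pi_t\) (greater political instability) reduces education spending \(e_t\), slows the accumulation of \(h_t\), and can leave the economy at a low-human-capital steady state, which is the stagnation trap the abstract refers to.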
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
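For readers unfamiliar with the skill metric quoted above, the following is a minimal sketch of how an area-weighted pattern (anomaly) correlation between a forecast field and observations might be computed; the gridded arrays, climatology and latitude weighting are illustrative assumptions, not the study's verification code.

```python
import numpy as np

def pattern_correlation(forecast, observed, climatology, lats):
    """Area-weighted anomaly pattern correlation between two lat/lon fields."""
    # Work with anomalies relative to a common climatology
    f_anom = forecast - climatology
    o_anom = observed - climatology

    # Weight each grid cell by cos(latitude) to approximate cell area
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(forecast)

    f_mean = np.average(f_anom, weights=w)
    o_mean = np.average(o_anom, weights=w)
    cov = np.average((f_anom - f_mean) * (o_anom - o_mean), weights=w)
    var_f = np.average((f_anom - f_mean) ** 2, weights=w)
    var_o = np.average((o_anom - o_mean) ** 2, weights=w)
    return cov / np.sqrt(var_f * var_o)

# Toy usage on a 2-degree grid with a partially shared anomaly pattern (illustrative only)
lats = np.arange(-89, 90, 2.0)
lons = np.arange(0, 360, 2.0)
rng = np.random.default_rng(0)
clim = rng.normal(14.0, 10.0, size=(lats.size, lons.size))
signal = rng.normal(0.0, 1.0, size=clim.shape)          # anomaly pattern common to both
obs = clim + signal
fcst = clim + 0.7 * signal + rng.normal(0.0, 1.0, size=clim.shape)
print(round(pattern_correlation(fcst, obs, clim, lats), 2))
```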
Abstract:
Iatrogenic errors and patient safety in clinical processes are an increasing concern. The quality of process information in hardcopy or electronic form can heavily influence clinical behaviour and decision making errors. Little work has been undertaken to assess the safety impact of clinical process planning documents guiding the clinical actions and decisions. This paper investigates the clinical process documents used in elective surgery and their impact on latent and active clinical errors. Eight clinicians from a large health trust underwent extensive semi-structured interviews to understand their use of clinical documents, and their perceived impact on errors and patient safety. Samples of the key types of document used were analysed. Theories of latent organisational and active errors from the literature were combined with the EDA semiotics model of behaviour and decision making to propose the EDA Error Model. This model enabled us to identify perceptual, evaluation, knowledge and action error types and approaches to reducing their causes. The EDA Error Model was then used to analyse sample documents and identify error sources and controls. Types of knowledge artefact structures used in the documents were identified and assessed in terms of safety impact. This approach was combined with analysis of the questionnaire findings using existing error knowledge from the literature. The results identified a number of document and knowledge artefact issues that give rise to latent and active errors, as well as issues concerning medical culture and teamwork, together with recommendations for further work.
Abstract:
Knowledge is a valuable asset in organisations that has become significant as a strategic resource in the information age. Many studies have focused on managing knowledge in organisations. In particular, knowledge transfer has become a significant issue concerned with the movement of knowledge across organisational boundaries. One way to capture knowledge in a transferable form is through practice. In this paper, we discuss how organisations can transfer knowledge through practice effectively and propose a model for a semiotic approach to practice-oriented knowledge transfer. In this model, practice is treated as a sign that represents knowledge, and its localisation is analysed as a semiotic process.
Abstract:
Key point summary
• Cerebellar ataxias are progressive debilitating diseases with no known treatment and are associated with defective motor function and, in particular, abnormalities to Purkinje cells.
• Mutant mice with deficits in Ca2+ channel auxiliary α2δ-2 subunits are used as models of cerebellar ataxia.
• Our data in the du2J mouse model show an association between the ataxic phenotype exhibited by homozygous du2J/du2J mice and increased irregularity of Purkinje cell firing.
• We show that both heterozygous +/du2J and homozygous du2J/du2J mice completely lack the strong presynaptic modulation of neuronal firing by cannabinoid CB1 receptors which is exhibited by litter-matched control mice.
• These results show that the du2J ataxia model is associated with deficits in CB1 receptor signalling in the cerebellar cortex, putatively linked with compromised Ca2+ channel activity due to reduced α2δ-2 subunit expression. Knowledge of such deficits may help design therapeutic agents to combat ataxias.
Abstract
Cerebellar ataxias are a group of progressive, debilitating diseases often associated with abnormal Purkinje cell (PC) firing and/or degeneration. Many animal models of cerebellar ataxia display abnormalities in Ca2+ channel function. The ‘ducky’ du2J mouse model of ataxia and absence epilepsy represents a clean knock-out of the auxiliary Ca2+ channel subunit, α2δ-2, and has been associated with deficient Ca2+ channel function in the cerebellar cortex. Here, we investigate effects of the du2J mutation on PC layer (PCL) and granule cell (GC) layer (GCL) neuronal spiking activity and also inhibitory neurotransmission at interneurone–Purkinje cell (IN-PC) synapses. Increased neuronal firing irregularity was seen in the PCL and, to a less marked extent, in the GCL in du2J/du2J, but not +/du2J, mice; these data suggest that the ataxic phenotype is associated with a lack of precision of PC firing that may also impinge on GC activity and requires expression of two du2J alleles to manifest fully. The du2J mutation had no clear effect on spontaneous inhibitory postsynaptic current (sIPSC) frequency at IN-PC synapses, but was associated with increased sIPSC amplitudes. The du2J mutation ablated cannabinoid CB1 receptor (CB1R)-mediated modulation of spontaneous neuronal spike firing and CB1R-mediated presynaptic inhibition of synaptic transmission at IN-PC synapses in both +/du2J and du2J/du2J mutants; effects that occurred in the absence of changes in CB1R expression. These results demonstrate that the du2J ataxia model is associated with deficient CB1R signalling in the cerebellar cortex, putatively linked with compromised Ca2+ channel activity and the ataxic phenotype.
Abstract:
The present article addresses the following question: what variables condition syntactic transfer? Evidence is provided in support of the position that third language (L3) transfer is selective, whereby, at least under certain conditions, it is driven by the typological proximity of the target L3 measured against the other previously acquired linguistic systems (cf. Rothman and Cabrelli Amaro, 2007, 2010; Rothman, 2010; Montrul et al., 2011). To show this, we compare data in the domain of adjectival interpretation between successful first language (L1) Italian learners of English as a second language (L2) at the low to intermediate proficiency level of L3 Spanish, and successful L1 English learners of L2 Spanish at the same levels for L3 Brazilian Portuguese. The data show that, irrespective of the L1 or the L2, these L3 learners demonstrate target knowledge of subtle adjectival semantic nuances obtained via noun-raising, which English lacks and the other languages share. We maintain that such knowledge is transferred to the L3 from Italian (L1) and Spanish (L2) respectively in light of important differences between the L3 learners herein compared to what is known of the L2 Spanish performance of L1 English speakers at the same level of proficiency (see, for example, Judy et al., 2008; Rothman et al., 2010). While the present data are consistent with Flynn et al.’s (2004) Cumulative Enhancement Model, we discuss why a coupling of these data with evidence from other recent L3 studies suggests necessary modifications to this model, offering in its stead the Typological Primacy Model (TPM) for multilingual transfer.
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
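As a hedged illustration of one of the simpler approaches described (prescribed wetland extent combined with a temperature-dependent emission factor), the sketch below is a generic toy scheme, not any WETCHIMP model; the Q10 formulation, parameter values and variable names are assumptions introduced here.

```python
def wetland_ch4_flux(wetland_frac, soil_temp_c, substrate,
                     r_ref=0.05, t_ref=10.0, q10=3.0):
    """
    Toy per-grid-cell wetland CH4 emission estimate (illustrative only).

    wetland_frac : prescribed fraction of the cell covered by wetland (0-1)
    soil_temp_c  : soil temperature in deg C
    substrate    : relative substrate availability index (0-1)
    r_ref        : reference emission rate (g CH4 m-2 day-1) at t_ref
    """
    # Q10 temperature response scales the reference emission rate
    temp_factor = q10 ** ((soil_temp_c - t_ref) / 10.0)
    # Emissions come only from the wetland-covered part of the cell
    return wetland_frac * substrate * r_ref * temp_factor

# Example for a single grid cell
print(wetland_ch4_flux(wetland_frac=0.2, soil_temp_c=15.0, substrate=0.8))
```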
Abstract:
This paper aims to design a collaboration model for a Knowledge Community - SSMEnetUK. The research identifies SSMEnetUK as a socio-technical system and uses the core concepts of Service Science to explore the subject domain. The paper is positioned within the concept of Knowledge Management (KM) and the use of Web 2.0 tools for collaboration. A qualitative case study method was adopted and multiple data sources were used. In doing so, the degree of correlation between knowledge management activities and Web 2.0 tools for collaboration in the scenario is pitted against the concept of value propositions offered by both customer/user and service provider. The proposed model provides a better understanding of how Knowledge Management and Web 2.0 tools can enable effective collaboration within SSMEnetUK. This research is relevant to the wider service design and innovation community because it provides a basis for building a service-centric collaboration platform for the benefit of both customer/user and service provider.
Abstract:
This paper demonstrates that the use of GARCH-type models for the calculation of minimum capital risk requirements (MCRRs) may lead to the production of inaccurate and therefore inefficient capital requirements. We show that this inaccuracy stems from the fact that GARCH models typically overstate the degree of persistence in return volatility. A simple modification to the model is found to improve the accuracy of MCRR estimates in both back- and out-of-sample tests. Given that internal risk management models are currently in widespread use in some parts of the world (most notably the USA), and will soon be permitted for EC banks and investment firms, we believe that our paper should serve as a valuable caution to risk management practitioners who are using, or intend to use, this popular class of models.
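To make the persistence point concrete, the sketch below simulates a GARCH(1,1) return series and reports the persistence measure alpha + beta together with the implied volatility half-life; the parameter values are illustrative assumptions, not estimates from the paper.

```python
import numpy as np

def simulate_garch11(n, omega, alpha, beta, seed=0):
    """Simulate returns r_t = sigma_t * z_t with GARCH(1,1) variance dynamics."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    r = np.empty(n)
    sigma2 = np.empty(n)
    sigma2[0] = omega / (1.0 - alpha - beta)    # start at the unconditional variance
    r[0] = np.sqrt(sigma2[0]) * z[0]
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * z[t]
    return r, sigma2

# Illustrative parameters: persistence alpha + beta close to 1 means volatility
# shocks decay very slowly, which feeds directly into multi-day capital estimates.
omega, alpha, beta = 0.02, 0.08, 0.90
returns, cond_var = simulate_garch11(5000, omega, alpha, beta)

persistence = alpha + beta
half_life = np.log(0.5) / np.log(persistence)   # days for a variance shock to halve
print(f"persistence = {persistence:.2f}, volatility half-life = {half_life:.1f} days")
```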
Abstract:
This paper introduces an ontology-based knowledge model for knowledge management. This model can facilitate knowledge discovery that provides users with insight for decision making. The users requiring the insight normally play different roles with different requirements in an organisation. To meet the requirements, insights are created from purposely aggregated transactional data. This involves a semantic data integration process. In this paper, we present a knowledge management system which is capable of representing knowledge requirements in a domain context and enabling semantic data integration through ontology modelling. The knowledge domain context of the United Bible Societies is used to illustrate the features of the knowledge management capabilities.
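As a minimal sketch of ontology-based semantic data integration of the kind described (assuming Python and the rdflib library, neither of which is named in the paper; the namespace, classes and sample records are hypothetical), transactional records from two sources might be mapped onto a shared ontology and then aggregated for a role-specific insight:

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/ontology#")   # hypothetical domain ontology
g = Graph()
g.bind("ex", EX)

# Two source systems describe the same kind of event with different field names;
# both are mapped onto the shared ontology class ex:Donation.
source_a = [{"id": "a1", "amount": 120.0, "region": "EU"}]
source_b = [{"ref": "b7", "value": 80.0, "area": "Africa"}]

def add_donation(graph, uri, amount, region):
    graph.add((uri, RDF.type, EX.Donation))
    graph.add((uri, EX.amount, Literal(amount, datatype=XSD.double)))
    graph.add((uri, EX.region, Literal(region)))

for rec in source_a:
    add_donation(g, URIRef(f"http://example.org/data/{rec['id']}"), rec["amount"], rec["region"])
for rec in source_b:
    add_donation(g, URIRef(f"http://example.org/data/{rec['ref']}"), rec["value"], rec["area"])

# A role-specific "insight": total donations per region over the integrated graph
totals = {}
for s, _, region in g.triples((None, EX.region, None)):
    amount = float(g.value(s, EX.amount))
    totals[str(region)] = totals.get(str(region), 0.0) + amount
print(totals)
```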
Abstract:
Earthworms are important organisms in soil communities and so are used as model organisms in environmental risk assessments of chemicals. However, current risk assessments of soil invertebrates are based on short-term laboratory studies, of limited ecological relevance, supplemented if necessary by site-specific field trials, which sometimes are challenging to apply across the whole agricultural landscape. Here, we investigate whether population responses to environmental stressors and pesticide exposure can be accurately predicted by combining energy budget and agent-based models (ABMs), based on knowledge of how individuals respond to their local circumstances. A simple energy budget model was implemented within each earthworm Eisenia fetida in the ABM, based on a priori parameter estimates. From broadly accepted physiological principles, simple algorithms specify how energy acquisition and expenditure drive life cycle processes. Each individual allocates energy between maintenance, growth and/or reproduction under varying conditions of food density, soil temperature and soil moisture. When simulating published experiments, good model fits were obtained to experimental data on individual growth, reproduction and starvation. Using the energy budget model as a platform, we developed methods to identify which of the physiological parameters in the energy budget model (rates of ingestion, maintenance, growth or reproduction) are primarily affected by pesticide applications, producing four hypotheses about how toxicity acts. We tested these hypotheses by comparing model outputs with published toxicity data on the effects of copper oxychloride and chlorpyrifos on E. fetida. Both growth and reproduction were directly affected in experiments in which sufficient food was provided, whilst maintenance was targeted under food limitation. Although we only incorporate toxic effects at the individual level, we show how ABMs can readily extrapolate to larger scales by providing good model fits to field population data. The ability of the presented model to fit the available field and laboratory data for E. fetida demonstrates the promise of the agent-based approach in ecology, by showing how biological knowledge can be used to make ecological inferences. Further work is required to extend the approach to populations of more ecologically relevant species studied at the field scale. Such a model could help extrapolate from laboratory to field conditions and from one set of field conditions to another or from species to species.
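A minimal sketch of the kind of individual-level energy budget that might sit inside each agent is given below; the allocation order, parameter values and temperature scaling are illustrative assumptions, not the published model's calibration.

```python
class EarthwormAgent:
    """Toy energy-budget agent: pay maintenance first, then grow, then reproduce."""

    def __init__(self, mass=0.2, max_mass=1.5):
        self.mass = mass              # g wet weight
        self.max_mass = max_mass
        self.energy_reserve = 0.0     # J, buffer carried between days
        self.cocoons = 0

    def step(self, food_density, soil_temp_c, soil_moisture):
        # Energy intake rises with food density; temperature and moisture scale activity
        activity = max(0.0, min(1.0, soil_temp_c / 25.0)) * soil_moisture
        intake = 20.0 * food_density * self.mass ** (2 / 3) * activity   # J per day (assumed)

        maintenance = 3.0 * self.mass            # J per day, paid before anything else
        available = intake + self.energy_reserve - maintenance

        if available <= 0:
            # Starvation: draw on structural mass, i.e. the worm shrinks
            self.mass = max(0.05, self.mass + available / 500.0)
            self.energy_reserve = 0.0
            return

        if self.mass < self.max_mass:
            growth_energy = 0.7 * available      # most of the surplus goes to growth
            self.mass += growth_energy / 500.0   # 500 J per g of new tissue (assumed)
            available -= growth_energy

        # Remaining surplus accumulates towards cocoon production
        self.energy_reserve += available
        if self.energy_reserve >= 50.0:          # assumed energetic cost of one cocoon
            self.cocoons += 1
            self.energy_reserve -= 50.0

# Toy simulation of one individual over 100 days under ample food
worm = EarthwormAgent()
for day in range(100):
    worm.step(food_density=1.0, soil_temp_c=20.0, soil_moisture=0.9)
print(round(worm.mass, 2), worm.cocoons)
```

In an ABM built on this kind of budget, a pesticide effect would be represented by scaling one of the physiological rates (ingestion, maintenance, growth or reproduction), which is how the four hypotheses described in the abstract could be encoded.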
Abstract:
The climate over the Arctic has undergone changes in recent decades. In order to evaluate the coupled response of the Arctic system to external and internal forcing, our study focuses on the estimation of regional climate variability and its dependence on large-scale atmospheric and regional ocean circulations. A global ocean–sea ice model with regionally high horizontal resolution is coupled to an atmospheric regional model and a global terrestrial hydrology model. This way of coupling divides the global ocean model setup into two different domains: one coupled, where the ocean and the atmosphere are interacting, and one uncoupled, where the ocean model is driven by prescribed atmospheric forcing and runs in a so-called stand-alone mode. Therefore, selecting a specific area for the regional atmosphere implies that the ocean–atmosphere system can develop ‘freely’ in that area, whereas for the rest of the global ocean, the circulation is driven by prescribed atmospheric forcing without any feedbacks. Five different coupled setups are chosen for ensemble simulations. The choice of the coupled domains was made to estimate the influences of the Subtropical Atlantic, Eurasian and North Pacific regions on northern North Atlantic and Arctic climate. Our simulations show that the regional coupled ocean–atmosphere model is sensitive to the choice of the modelled area. The different model configurations reproduce both the mean climate and its variability differently. Only two out of five model setups were able to reproduce the Arctic climate as observed under recent climate conditions (ERA-40 Reanalysis). Evidence is found that the main source of uncertainty for Arctic climate variability and its predictability is the North Pacific. The prescription of North Pacific conditions in the regional model leads to significant correlation with observations, even if the whole North Atlantic is within the coupled model domain. However, the inclusion of the North Pacific area into the coupled system drastically changes the Arctic climate variability to a point where the Arctic Oscillation becomes an ‘internal mode’ of variability and correlations of year-to-year variability with observational data vanish. In line with previous studies, our simulations provide evidence that Arctic sea ice export is mainly due to ‘internal variability’ within the Arctic region. We conclude that the choice of model domains should be based on physical knowledge of the atmospheric and oceanic processes and not on ‘geographic’ reasons. This is particularly the case for areas like the Arctic, which has very complex feedbacks between components of the regional climate system.
Abstract:
This article describes a case study involving information technology managers and their new programmer recruitment policy, but the primary interest is methodological. The processes of issue generation and selection and model conceptualization are described. Early use of “magnetic hexagons” allowed the generation of a range of issues, most of which would not have emerged if system dynamics elicitation techniques had been employed. With the selection of a specific issue, flow diagramming was used to conceptualize a model, with computer implementation and scenario generation following naturally. Observations are made on the processes of system dynamics modeling, particularly on the need to employ general techniques of knowledge elicitation in the early stages of interventions. It is proposed that flexible approaches should be used to generate, select, and study the issues, since these reduce any biasing of the elicitation toward system dynamics problems and also allow the participants to take up the most appropriate problem-structuring approach.
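For readers unfamiliar with the step from flow diagram to computer implementation mentioned above, the fragment below is a generic stock-and-flow sketch of a programmer recruitment system, not the model built in the case study; the stocks, rates and parameter values are hypothetical.

```python
# Toy system dynamics run: a trainee stock fed by recruitment and an experienced
# stock fed by training and drained by attrition; Euler integration, monthly steps.
def simulate(months=60, dt=1.0):
    trainees, experienced = 0.0, 40.0
    desired_experienced = 60.0
    history = []
    for _ in range(int(months / dt)):
        recruitment = max(0.0, (desired_experienced - experienced) * 0.2)  # hires/month
        training = trainees / 6.0        # roughly 6 months to become experienced
        attrition = experienced * 0.02   # 2% of experienced staff leave per month

        trainees += (recruitment - training) * dt
        experienced += (training - attrition) * dt
        history.append(round(experienced, 1))
    return history

print(simulate()[-5:])   # experienced stock rises towards, but settles below, the 60-person goal
```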
Abstract:
The knowledge spillover theory of entrepreneurship and the prevailing theory of economic growth treat opportunities as endogenous and generally focus on opportunity recognition by entrepreneurs. New knowledge created endogenously results in knowledge spillovers enabling inventors and entrepreneurs to commercialize it. This article argues that knowledge spillover entrepreneurship depends not only on ordinary human capital, but more importantly also on creativity embodied in creative individuals and diverse urban environments that attract creative classes. This might result in self-selection of creative individuals into entrepreneurship or enable entrepreneurs to recognize creativity and commercialize it. This creativity theory of knowledge spillover entrepreneurship is tested utilizing data on European cities.
Abstract:
This paper addresses one of the issues in contemporary globalisation theory: the extent to which there is ‘one best way’ in which business can be done and organisations managed. It uses Czarniawska’s ‘Travels of Ideas’ model as an organising framework to present and understand how the concept of ‘Quality’, so important in contemporary approaches to manufacturing & services, and their management, travelled to, and impinged on, a newly opened vehicle assembly plant in Poland. The extent to which new meanings were mutually created in the process of translation is discussed, using ethnographic reporting and analysis techniques commonly used in diffusion research. Parallels between the process of translation as an idea becomes embedded into a new cultural location, and the processes which contemporary research has identified as important to organisational learning, are briefly discussed in conclusion.