828 results for Tacit and explicit knowledge
Abstract:
This paper reports on (a) new primary source evidence on, and (b) statistical and econometric analysis of, high technology clusters in Scotland. It focuses on the following sectors: software, life sciences, microelectronics, optoelectronics, and digital media. Evidence from a postal and e-mailed questionnaire is presented and discussed under the headings of: performance, resources, collaboration & cooperation, embeddedness, and innovation. The sampled firms are characterised as being small (viz. micro-firms and SMEs), knowledge intensive (largely graduate staff), research intensive (mean R&D spend of GBP 842k), and internationalised (mainly selling to markets beyond Europe). Preliminary statistical evidence is presented on Gibrat's Law (independence of growth and size) and the Schumpeterian Hypothesis (scale economies in R&D). Estimates suggest a short-run equilibrium size of just 100 employees, but a long-run equilibrium size of 1,000 employees. Further, estimates suggest that, to achieve the Schumpeterian effect of marked scale economies in R&D, firms have to grow to much larger sizes, beyond 3,000 employees. We argue that the principal way of achieving the latter scale may need to be by takeovers and mergers, rather than by internally driven growth.
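For orientation, a minimal sketch of how Gibrat's Law and an implied equilibrium size are typically examined; the specification below is a standard textbook form, not necessarily the exact one estimated in the paper:

\[
\ln S_{i,t} = \alpha + \beta \ln S_{i,t-1} + \varepsilon_{i,t},
\]

where \(S_{i,t}\) is firm \(i\)'s size (e.g. employment) at time \(t\). Gibrat's Law (growth independent of size) corresponds to \(\beta = 1\); if \(\hat\beta < 1\), growth declines with size and the implied equilibrium size is \(S^{*} = \exp\!\big(\hat\alpha/(1-\hat\beta)\big)\), which is the sense in which short-run and long-run equilibrium sizes can be read off the estimates.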
Abstract:
This paper makes three contributions. First, it shows how fieldwork within small firms in PR China has provided new evidence which enables us to measure and calibrate Entrepreneurial Orientation (EO), as 'spirit', and Intangible Assets (IA), as 'material', for use in models of small firm growth. Second, it uses inter-item correlation analysis and both exploratory and confirmatory factor analysis to provide new measures of EO and IA, in index and in vector form, for use in econometric models of firm growth. Third, it estimates two new econometric models of small firm employment growth in PR China, under the null hypothesis of Gibrat's Law, using our two new index-based and vector-based measures of EO and IA. Estimation is by OLS with adjustment for heteroscedasticity and for sample selectivity. Broadly, it finds that EO attributes have had little significant impact on small firm growth, and indeed innovativeness and pro-activity paradoxically may even dampen growth. However, IA attributes have had a positive and significant impact on growth, with networking and technological knowledge being of prime importance, and intellectual property and human capital being of lesser but still significant importance. In the light of these results, Gibrat's Law is generalized, and Jovanovic's learning theory is extended, to emphasise the importance of IA to growth. These findings cast new empirical light on the oft-quoted national slogan in PR China of "spirit and material". So far as small firms are concerned, this paper suggests that their contribution to PR China's remarkable economic growth is not so much attributable to the 'spirit' of enterprise (as suggested by propaganda) as, more prosaically, to the pursuit of the 'material'.
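As an illustration of the kind of specification described (not the authors' exact model or data), a minimal sketch of an employment-growth regression under the Gibrat null with index-based EO and IA regressors and heteroscedasticity-robust standard errors; all variable names and the synthetic data are hypothetical, and the sample-selectivity correction mentioned in the abstract is omitted for brevity.

# Minimal sketch: small-firm growth regressed on initial size (Gibrat null),
# an EO index and an IA index, with heteroscedasticity-robust (HC1) errors.
# Variable names and data are illustrative, not the paper's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "size0": rng.lognormal(mean=2.0, sigma=1.0, size=n),  # initial employment
    "eo_index": rng.normal(size=n),                       # entrepreneurial orientation index
    "ia_index": rng.normal(size=n),                       # intangible assets index
})
# Growth (log-difference of employment), simulated here only so the example runs.
df["growth"] = (0.02 - 0.05 * np.log(df["size0"]) + 0.10 * df["ia_index"]
                + rng.normal(scale=0.2, size=n))

model = smf.ols("growth ~ np.log(size0) + eo_index + ia_index", data=df)
result = model.fit(cov_type="HC1")  # heteroscedasticity-consistent errors
print(result.summary())
# Under Gibrat's Law the coefficient on np.log(size0) should be indistinguishable from zero.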
Abstract:
BACKGROUND AND PURPOSE: Neuromyelitis optica (NMO), or Devic's disease, is a rare inflammatory and demyelinating autoimmune disorder of the central nervous system (CNS) characterized by recurrent attacks of optic neuritis (ON) and longitudinally extensive transverse myelitis (LETM); it is distinct from multiple sclerosis (MS). The guidelines are designed to provide guidance for best clinical practice based on the current state of clinical and scientific knowledge. SEARCH STRATEGY: Evidence for this guideline was collected by searches for original articles, case reports and meta-analyses in the MEDLINE and Cochrane databases. In addition, clinical practice guidelines of professional neurological and rheumatological organizations were studied. RESULTS: Different diagnostic criteria for NMO [Wingerchuk et al., revised NMO criteria, 2006; Miller et al., National Multiple Sclerosis Society (NMSS) task force criteria, 2008] and features potentially indicative of NMO facilitate the diagnosis. In addition, guidance for the work-up and diagnosis of spatially limited NMO spectrum disorders is provided by the task force. Due to a lack of studies fulfilling the requirements for the highest levels of evidence, the task force suggests concepts for treatment of acute exacerbations and attack prevention based on expert opinion. CONCLUSIONS: Studies on diagnosis and management of NMO fulfilling requirements for the highest levels of evidence (class I-III rating) are limited, and diagnostic and therapeutic concepts based on expert opinion and consensus of the task force members were assembled for this guideline.
Abstract:
In this paper we study a model where non-cooperative agents may exchange knowledge in a competitive environment. As a potential factor that could induce knowledge disclosure between humans, we consider the timing of the players' moves. We develop a simple model of a multistage game in which there are only three players and competition takes place only within two stages. Players can share their private knowledge with their opponents, and the knowledge is modelled as influencing their marginal cost of effort. We identify two main mechanisms that work towards knowledge disclosure. The first is that, before the actual competition starts, the stronger player of the first stage of the game may wish to share his knowledge with the "observer", because this reduces the weaker player's valuation of the prize in that stage and, as a result, his effort level and probability of winning in a fight. The second is that the "observer" may sometimes wish to share knowledge with the weaker player of the first stage, because in this way, by increasing that player's probability of winning in that stage, he decreases the probability of winning of the stronger player. As a result, in the second stage the "observer" may have a greater chance of meeting the weaker player rather than the stronger one. Keywords: knowledge sharing, strategic knowledge disclosure, multistage contest game, non-cooperative games
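A minimal formal sketch of the kind of mechanism described, assuming a standard Tullock contest success function; the notation is illustrative and not necessarily the paper's:

\[
p_i(x_i, x_j) = \frac{x_i}{x_i + x_j}, \qquad
u_i = p_i(x_i, x_j)\, V_i - c(k_i)\, x_i ,
\]

where \(x_i\) is player \(i\)'s effort, \(V_i\) his valuation of the prize, and \(c(k_i)\) a marginal cost of effort that is decreasing in the knowledge \(k_i\) he holds. Disclosing knowledge to an opponent lowers that opponent's \(c(k)\), which shifts equilibrium efforts and winning probabilities in the current and later stages; this is what can make disclosure strategically attractive for either the stronger first-stage player or the "observer".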
Abstract:
We explore the relationship between polynomial functors and trees. In the first part we characterise trees as certain polynomial functors and obtain a completely formal but at the same time conceptual and explicit construction of two categories of rooted trees, whose main properties we describe in terms of some factorisation systems. The second category is the category Ω of Moerdijk and Weiss. Although the constructions are motivated and explained in terms of polynomial functors, they all amount to elementary manipulations with finite sets. Included in Part 1 is also an explicit construction of the free monad on a polynomial endofunctor, given in terms of trees. In the second part we describe polynomial endofunctors and monads as structures built from trees, characterising the images of several nerve functors from polynomial endofunctors and monads into presheaves on categories of trees. Polynomial endofunctors and monads over a base are characterised by a sheaf condition on categories of decorated trees. In the absolute case, one further condition is needed, a projectivity condition, which serves also to characterise polynomial endofunctors and monads among (coloured) collections and operads.
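For orientation, the standard single-variable, set-based definition that this line of work builds on (the general setting also allows multi-variable, i.e. coloured, versions): a polynomial endofunctor of Set is specified by a map of sets \(f : E \to B\) and acts as

\[
P_f(X) \;=\; \sum_{b \in B} X^{E_b}, \qquad E_b = f^{-1}(b),
\]

so \(B\) indexes operations (or node shapes) and \(E_b\) their input edges; trees then correspond to certain particularly simple polynomial functors of this kind.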
Abstract:
Research suggests that implicit attitudes play a key role in the occurrence of antisocial behaviours. This study assessed implicit attitudes and self-concepts related to aggression and transgression in community and offender adolescents, using a new set of Implicit Association Tests (IATs), and examined their association with psychopathic traits. Thirty-six offenders and 66 community adolescents performed 4 IATs assessing 1) implicit attitudes about a) aggression and b) transgression as good, and 2) implicit self-concepts about a) aggression and b) transgression as self-descriptive. They also completed self-report questionnaires: the Youth Psychopathic Traits Inventory, the Child Behaviour Checklist, and explicit measures of their attitudes and self-concepts towards transgression and aggression. Results showed few differences between community and offender adolescents on implicit attitudes and self-concepts, and unexpected negative associations between some implicit attitudes and psychopathic traits, while the association was positive for the corresponding explicit attitudes. Possible explanations of these findings are discussed.
Abstract:
Female genital mutilation (FGM) is defined as an injury of the external female genitalia for cultural or non-therapeutic reasons. FGM is mainly performed in sub-Saharan and Eastern Africa. Western health care systems are confronted with migrants from this cultural background. The aim of this article is to offer information on how to approach the subject. The degree of FGM can vary from excision of the prepuce and clitoris to infibulation. Infections, urinary retention, pain, lesions of neighbouring organs, bleeding, psychological trauma and even death are possible acute complications. The long-term complications include the risk of reduced fertility and difficulties during labour, which are key arguments against FGM in the migrant community. Paediatricians often have questions on how to approach the subject. With an open, neutral approach and basic knowledge, discussions with parents are constructive. Talking about the newborn, delivery or traditions may be a good starting point. Once parents feel accepted, they speak surprisingly openly. FGM is performed out of love for their daughters. We have to be aware of their arguments and fears, but we should also stress the parents' responsibility in taking a health risk for their daughters. It is important to know the family's opinion on FGM. Some families may need support, especially against community pressure. As FGM is often performed on newborns or at 4-9 years of age, paediatricians should have an active role in the prevention of FGM, especially as they have repeated close contact with those concerned and the medical consequences are the main arguments against FGM.
Abstract:
This PhD project aims to study paraphrasing, initially understood as the different ways in which the same content is expressed linguistically. We will examine that concept in depth, trying to define and delimit its scope more accurately. In that sense, we also aim to discover which kinds of structures and phenomena it covers. Although some paraphrasing typologies exist, the great majority apply only to English and focus on lexical and syntactic transformations. Our intention is to go further into this subject and propose a paraphrasing typology for Spanish and Catalan combining lexical, syntactic, semantic and pragmatic knowledge. We apply a bottom-up methodology, collecting evidence of this phenomenon from the data. For this purpose, we are initially using the Spanish Wikipedia as our corpus. The internal structure of this encyclopedia makes it a good resource for extracting paraphrasing examples for our investigation. This empirical approach will be complemented with the use of linguistic knowledge, and by comparing and contrasting our results with previously proposed paraphrasing typologies in order to enlarge the set of paraphrase forms found in our corpus. The fact that the same content can be expressed in many different ways presents a major challenge for Natural Language Processing (NLP) applications. Thus, research on paraphrasing has recently been attracting increasing attention in the fields of NLP and Computational Linguistics. The results obtained in this investigation would be of great interest for many of these applications.
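Purely as an illustration of bottom-up candidate collection (not the project's actual pipeline), a small sketch that flags pairs of sentences with partial lexical overlap as paraphrase candidates for manual inspection; the example sentences and the threshold are hypothetical.

# Illustrative sketch only: flag sentence pairs with partial token overlap as
# paraphrase candidates for manual inspection. Thresholds and example sentences
# are hypothetical, not the project's actual method or data.
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two token sets."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

sentences = [
    "El museo fue inaugurado en 1950.",
    "La inauguración del museo tuvo lugar en 1950.",
    "El edificio se encuentra en Barcelona.",
]

tokenized = [set(s.lower().rstrip(".").split()) for s in sentences]

# Keep pairs whose overlap is partial: high enough to share content,
# low enough to suggest a reformulation rather than a near-copy.
candidates = [
    (sentences[i], sentences[j])
    for i, j in combinations(range(len(sentences)), 2)
    if 0.25 <= jaccard(tokenized[i], tokenized[j]) <= 0.8
]

for s1, s2 in candidates:
    print("Candidate pair:")
    print(" ", s1)
    print(" ", s2)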
Multimodel inference and multimodel averaging in empirical modeling of occupational exposure levels.
Abstract:
Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori by taking into account the sample size and prior knowledge of variables influencing exposure levels. The Akaike information criterion is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, interpreted as the probability that the model is the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one over another, perform multimodel prediction, estimate the relative influence of the potential predictors and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data and permits evaluating, to some extent, the model selection uncertainty, which is seldom mentioned in current practice.
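A minimal numerical sketch of the core computation in the Burnham and Anderson approach (Akaike weights and a model-averaged effect estimate); the AIC values and per-model coefficient estimates below are invented purely to make the example runnable.

# Akaike weights and a model-averaged effect estimate (Burnham & Anderson style).
# The AIC values and per-model coefficient estimates are illustrative only.
import numpy as np

aic = np.array([152.3, 153.1, 155.8, 160.2])   # AIC of each candidate model
beta = np.array([0.42, 0.38, 0.51, 0.10])      # one determinant's estimated effect in each model

delta = aic - aic.min()                        # AIC differences relative to the best model
weights = np.exp(-delta / 2)
weights /= weights.sum()                       # Akaike weights: relative support for each model

beta_avg = np.sum(weights * beta)              # multimodel-averaged effect of the determinant
# The relative influence of a predictor can be estimated by summing the weights
# of all models that include it.

print("Akaike weights:", np.round(weights, 3))
print("Model-averaged effect:", round(beta_avg, 3))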
Abstract:
OBJECTIVE: To assess the theoretical and practical knowledge of the Glasgow Coma Scale (GCS) among trained air-rescue physicians in Switzerland. METHODS: Prospective anonymous observational study using a specially designed questionnaire. General knowledge of the GCS and its use in a clinical case were assessed. RESULTS: Of 130 questionnaires sent out, 103 were returned (a response rate of 79.2%) and analyzed. Theoretical knowledge of the GCS was consistent for registrars, fellows, consultants and private practitioners active in physician-staffed helicopters. The clinical case was wrongly scored by 38 participants (36.9%). Wrong evaluation of the motor component occurred in 28 questionnaires (27.2%), and 19 errors were made on the verbal score (18.5%). Errors were made most frequently by registrars (47.5%, p = 0.09), followed by fellows (31.6%, p = 0.67) and private practitioners (18.4%, p = 1.00). Consultants made significantly fewer errors than the rest of the participating physicians (0%, p < 0.05). No statistically significant differences were shown between anesthetists, general practitioners, internal medicine trainees or others. CONCLUSION: Although out-of-hospital physicians' theoretical knowledge of the GCS is correct, significant errors were made in scoring a clinical case. Less experienced physicians had a higher rate of errors. Further emphasis on teaching the GCS is mandatory.
Current millennium biotechniques for biomedical research on parasites and host-parasite interactions
Abstract:
The development of biotechnology in the last three decades has generated the feeling that the newest scientific achievements will deliver a high standard of quality of life through abundance of food and means for successfully combating diseases. Where the new biotechnologies give access to genetic information, there is a common belief that physiological and pathological processes result from subtle modifications of gene expression. Indeed, modern genetics has produced genetic maps, physical maps and complete nucleotide sequences from 141 viruses, 51 organelles, two eubacteria, one archaeon and one eukaryote (Saccharomyces cerevisiae). In addition, during the Centennial Commemoration of the Oswaldo Cruz Institute the nearly complete human genome map was proudly announced, whereas the latest Brazilian keystone contribution to science was the publication of the Xylella fastidiosa genomic sequence, highlighted on a Nature cover issue. There exists a belief among the populace that further scientific accomplishments will rapidly lead to new drugs and methodological approaches to cure genetic diseases and other incurable ailments. Yet, much evidence has been accumulated showing that a large information gap exists between knowledge of genome sequence and knowledge of genome function. Now that many genome maps are available, people wish to know what we are going to do with them. Certainly, all these scientific accomplishments will shed light on many more secrets of life. Nevertheless, parsimony in the weekly announcements of promising scientific achievements is necessary. We also need many more creative experimental biologists to discover new, as yet un-envisaged biotechnological approaches, and the basic resources needed for carrying out the milestone research necessary to lead us to that "promised land" often proclaimed by the mass media.
Abstract:
What allows an armed group in a civil war to prevent desertion? This paper addresses this question with a focus on control in the rearguard. Most past studies focus on motivations for desertion. They explain desertion in terms of where soldiers stand in relation to the macro themes of the war, or in terms of an inability to provide positive incentives to overcome the collective action problem. However, since individuals decide whether and how to participate in civil wars for multiple reasons, responding to a variety of local conditions in an environment of threat and violence, a focus only on macro-level motivations is incomplete. The opportunities side of the ledger deserves more attention. I therefore turn my attention to how control by an armed group eliminates soldiers’ opportunities to desert. In particular, I consider the control that an armed group maintains over soldiers’ hometowns, treating geographic terrain as an important exogenous indicator of the ease of control. Rough terrain at home affords soldiers and their families and friends advantages in ease of hiding, the difficulty of using force, and local knowledge. Based on an original dataset of soldiers from Santander Province in the Spanish Civil War, gathered from archival sources, I find statistical evidence that the rougher the terrain in a soldier’s home municipality, the more likely he is to desert. I find complementary qualitative evidence indicating that soldiers from rough-terrain communities took active advantage of their greater opportunities for evasion. This finding has important implications for the way observers interpret different soldiers’ decisions to desert or remain fighting, for the prospect that structural factors may shape the cohesion of armed groups, and for the possibility that local knowledge may be a double-edged sword, making soldiers simultaneously good at fighting and good at deserting.
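As an illustration of the kind of statistical test implied (not the author's actual specification or data), a minimal logistic-regression sketch relating an individual desertion indicator to the terrain ruggedness of a soldier's home municipality; the variable names and synthetic data are hypothetical.

# Illustrative only: logistic regression of desertion on home-municipality
# terrain ruggedness. Variable names and synthetic data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({"ruggedness": rng.gamma(shape=2.0, scale=1.0, size=n)})
# Simulate a positive ruggedness effect so the example has something to find.
logit_p = -1.5 + 0.4 * df["ruggedness"]
df["deserted"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

result = smf.logit("deserted ~ ruggedness", data=df).fit(disp=False)
print(result.summary())
# A positive coefficient on ruggedness means soldiers from rougher home terrain
# are estimated to be more likely to desert.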
Abstract:
In the current climate context, Mediterranean regions are experiencing an intensification of extreme hydrometeorological events. In Morocco, flood risk has become a significant problem, as rapid and poorly controlled economic and urban development has increased the exposure of people and assets in hazard-prone areas. The Swiss Agency for Development and Cooperation (SDC) is active in natural hazard mitigation in Morocco. As hazard mapping and its integration into land-use planning are considered sound tools for reducing spatial vulnerability, the SDC mandated this project to adapt the Swiss approach to hazard assessment and mapping to a Moroccan case study (the city of Beni Mellal, Tadla-Azilal region). In a knowledge-transfer context, the Swiss method was adapted to the semi-arid environment, the specific piedmont morphology and the socio-economic constraints particular to the study site. Following the Swiss guidelines, a map of flood phenomena was established, containing the geomorphic evidence and the anthropogenic elements relevant to the development and aggravation of floods. Rainfall-runoff modelling for reference events and hydraulic routing of the resulting flood hydrographs were then carried out in order to assess the flood hazard quantitatively. Field-collected discharge estimates and the extent of known past floods were used to verify the model results. Flood intensity and probability maps were obtained. Finally, an indicative flood danger map was produced on the basis of the Swiss hazard matrix, which combines the intensity of an event with its probability of occurrence in order to assign danger degrees to the studied territory. Danger maps become effective risk-mitigation tools when implemented in land-use planning. We therefore examine how institutional risk management currently operates in Beni Mellal, studying the degree of integration of management and the way knowledge about risk influences the management process. An institutional vulnerability "map" was established on the basis of individual interviews with the main institutional actors in flood management. The analysis shows that flood management is marked by top-down decision-making and by the priority given to active protection measures over passive land-use planning measures. Risk knowledge remains sectorial and often disconnected, and relationships between actors are dictated by the institutional hierarchy. Innovation in risk management emerges when actors collaborate horizontally despite the established hierarchy, or when they open up to external knowledge pools (e.g. academia). Methodological and institutional recommendations arising from this study were addressed to risk-management stakeholders in view of implementing the danger maps in planning documents. More than risk-reduction tools, danger maps help to communicate knowledge about hazards to the public and thus contribute to establishing a risk culture.
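A minimal sketch of the matrix logic described, i.e. combining an intensity class with a probability (recurrence) class to obtain a danger degree for a map cell; the class labels and the particular cell assignments below are simplified illustrations and do not reproduce the official Swiss matrix.

# Illustrative hazard-matrix lookup: (intensity class, probability class) -> danger degree.
# Class labels and cell assignments are simplified and not the official Swiss matrix.
DANGER_MATRIX = {
    ("high",   "high"):   "high",
    ("high",   "medium"): "high",
    ("high",   "low"):    "medium",
    ("medium", "high"):   "high",
    ("medium", "medium"): "medium",
    ("medium", "low"):    "low",
    ("low",    "high"):   "medium",
    ("low",    "medium"): "low",
    ("low",    "low"):    "low",
}

def danger_degree(intensity: str, probability: str) -> str:
    """Return the danger degree for a map cell given its intensity and probability class."""
    return DANGER_MATRIX[(intensity, probability)]

# Example: a cell with medium flood intensity but a high probability of occurrence.
print(danger_degree("medium", "high"))  # -> "high"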