64 results for "Unbalanced operation of diode-clamped three-level inverter"


Relevance: 100.00%

Publisher:

Abstract:

The existence of at least three isoforms of Na(+)-K(+)-ATPase in adult brain tissues (alpha 1, kidney type; alpha 2 [or alpha(+)]; alpha 3) suggests that these genes might be regulated in a cell-specific and time-dependent manner during development. We have studied this question in serum-free aggregating cell cultures of mechanically dissociated rat fetal telencephalon. At the protein level, the relative rate of synthesis of the pool of alpha 1-, alpha 2-, and alpha 3-subunits increased approximately twofold over 15 days of culture, leading to a marked increase in the immunochemical pool of alpha-subunits as measured by a panspecific polyclonal antibody. Concomitantly, Na(+)-K(+)-ATPase enzyme-specific activity increased three- (lower forebrain) to sixfold (upper forebrain). The transcripts of all three alpha-isoforms and the beta-subunit were detected in vitro in similar proportion to the levels observed in vivo. alpha 3-mRNA (3.7 kb) was more abundant than alpha 1 (3.7 kb) or alpha 2 (5.3 and 3.4 kb). Cytosine arabinoside (0.4 microM) and cholera toxin (0.1 microM) were used to selectively eliminate glial cells or neurons, respectively. It was found that alpha 2-mRNA is predominantly transcribed in glial cell cultures, whereas alpha 3- and beta 1-mRNA (2.7, 2.3, and 1.8 kb) are predominant in neuronal cultures.

Relevance: 100.00%

Publisher:

Abstract:

A new method is used to estimate the volumes of sediments of glacial valleys. This method is based on the concept of sloping local base level and requires only a digital terrain model and the limits of the alluvial valleys as input data. The bedrock surface of the glacial valley is estimated by a progressive excavation of the digital elevation model (DEM) of the filled valley area. This is performed using an iterative routine that replaces the altitude of a point of the DEM by the mean value of its neighbors minus a fixed value. The result is a curved surface, quadratic in 2D. The bedrock surface of the Rhone Valley in Switzerland was estimated by this method using the free Shuttle Radar Topography Mission (SRTM) digital terrain model (~92 m resolution). The results obtained are in good agreement with previous estimations based on seismic profiles and gravimetric modeling, with the exception of a few particular locations. The results from the present method and those from the seismic interpretation are slightly different from the results of the gravimetric data. This discrepancy may result from the presence of large buried landslides in the bottom of the Rhone Valley.
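The iterative excavation routine described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual implementation: function names, the neighbor stencil, and the stopping rule (a fixed iteration count) are assumptions; the abstract only specifies "mean of the neighbors minus a fixed value".

```python
import numpy as np

def excavate_bedrock(dem, valley_mask, offset=1.0, n_iter=300):
    """Estimate a bedrock surface under a filled valley by iterative
    'excavation': each valley cell is lowered toward the mean of its
    four neighbours minus a fixed offset, producing a smooth concave
    surface pinned to the valley edges (sloping local base level)."""
    z = dem.astype(float).copy()
    for _ in range(n_iter):
        # mean of the 4-neighbours of every cell
        nbr_mean = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
                    np.roll(z, 1, 1) + np.roll(z, -1, 1)) / 4.0
        candidate = nbr_mean - offset
        # only cells inside the valley mask are excavated, never raised
        z[valley_mask] = np.minimum(z[valley_mask], candidate[valley_mask])
    return z

def sediment_volume(dem, bedrock, cell_size):
    """Fill volume = sum of (surface - bedrock) thickness times cell area."""
    thickness = np.clip(dem - bedrock, 0.0, None)
    return thickness.sum() * cell_size ** 2
```

With an SRTM grid the `cell_size` would be roughly 92 m; the `offset` controls the curvature (and hence depth) of the reconstructed bedrock surface.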

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND CONTEXT: Kyphotic deformities with sagittal imbalance of the spine can be treated with spinal osteotomies. Those procedures are known to have a high incidence of neurological complications, in particular at the thoracic level. Motor evoked potentials (MEPs) have been widely used to help avoid major neurological deficits postoperatively. Previous reports have shown that a significant proportion of such cases present with important transcranial MEP (Tc-MEP) changes during surgery, some of them being predictive of postoperative deficits. PURPOSE: Our aim was to study Tc-MEP changes in a consecutive series of patients and correlate them with clinical parameters and radiological changes. STUDY DESIGN/SETTING: Retrospective case notes study from a prospective patient register. PATIENT SAMPLE: Eighteen patients undergoing posterior shortening osteotomies (nine at thoracic and nine at lumbar levels) for kyphosis of congenital, degenerative, inflammatory, or post-traumatic origin were included. OUTCOME MEASURES: Loss of at least 80% of the Tc-MEP signal, expressed as the percentage change in the area under the curve, in at least one muscle. METHODS: We studied the relation between the outcome measure (80% Tc-MEP loss in at least one muscle group) and the amount of posterior vertebral body shortening as well as angular correction measured on computed tomography scans, occurrence of postoperative deficits, intraoperative blood pressure at the time of the osteotomy, and hemoglobin (Hb) change. RESULTS: All patients showed significant Tc-MEP changes. In particular, greater than 80% MEP loss in at least one muscle group was observed in five of nine patients in the thoracic group and four of nine patients in the lumbar group. No surgical maneuver was undertaken as a result of this loss in an effort to improve motor responses, other than verifying the stability of the construct and the extent of the decompression.
Four patients developed postoperative deficits of radicular origin, three of them recovering fully at 3 months. No relation was found between intraoperative blood pressure, Hb changes, and Tc-MEP changes. Severity of Tc-MEP loss did not correlate with postoperative deficits. Shortening of more than 10 mm was linked to more severe Tc-MEP changes in the thoracic group. CONCLUSIONS: Transcranial MEP changes during spinal shortening procedures are common and do not appear to predict severe postoperative deficits. Total loss of Tc-MEP (not witnessed in our series) might require a more drastic approach, with possible reversal of the correction and a wake-up test.
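The study's outcome measure, an 80% loss of the area under the Tc-MEP curve in at least one muscle, can be sketched as follows. This is a hypothetical illustration: the function names, the rectified-sum approximation of the AUC, and the dictionary of muscle groups are assumptions, not the paper's recording pipeline.

```python
import numpy as np

def auc_percent_loss(baseline, response, dt=1.0):
    """Percentage loss of Tc-MEP signal, expressed as the change in
    area under the rectified response curve relative to baseline."""
    base_auc = np.abs(np.asarray(baseline, dtype=float)).sum() * dt
    curr_auc = np.abs(np.asarray(response, dtype=float)).sum() * dt
    return 100.0 * (base_auc - curr_auc) / base_auc

def alarm(losses_by_muscle, threshold=80.0):
    """Study criterion: >= 80% AUC loss in at least one muscle group."""
    return any(loss >= threshold for loss in losses_by_muscle.values())
```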

Relevance: 100.00%

Publisher:

Abstract:

Denervated muscle tissue undergoes morphologic changes that result in atrophy. The amount of muscle atrophy after denervation following free muscle transfer has not been measured so far. Therefore, the amount of muscle atrophy in human free muscle transfer for lower extremity reconstruction was measured in a series of 10 patients. Three-dimensional laser surface scanning was used to measure flap volume changes 2 weeks as well as 6 and 12 months after the operation. None of the muscles transferred was re-innervated. All muscles healed uneventfully, without signs of compromised perfusion resulting in partial flap loss. The muscle volume decreased to 30 ± 4% and 19 ± 4% of its initial value at 6 and 12 months, respectively, after the operation, i.e., the volume decreased by approximately 80% within a 12-month period. Denervated free muscle flap tissue undergoes massive atrophy of approximately 80%, mostly within the first 6 months.
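The reported figures are relative volumes, so the atrophy percentage is just the complement of the retained fraction. A minimal sketch (the function name and the use of the 2-week scan as baseline are assumptions consistent with the abstract):

```python
def percent_atrophy(baseline_volume, current_volume):
    """Volume loss relative to the baseline (2-week) flap volume, in %."""
    return 100.0 * (1.0 - current_volume / baseline_volume)
```

With the reported means, a flap retaining 30% at 6 months corresponds to 70% atrophy, and 19% at 12 months to 81% atrophy, i.e. the "approximately 80%" quoted above.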

Relevance: 100.00%

Publisher:

Abstract:

The aim of our work was to show how a chosen normalisation strategy can affect the outcome of quantitative gene expression studies. As an example, we analysed the expression of three genes known to be upregulated under hypoxic conditions: HIF1A, VEGF and SLC2A1 (GLUT1). Raw RT-qPCR data were normalised using two different strategies: a straightforward normalisation against a single reference gene, GAPDH, using the 2^(-ΔΔCt) algorithm, and a more complex normalisation against a normalisation factor calculated from the quantitative raw data of four previously validated reference genes. We found that the two normalisation strategies revealed contradictory results: normalising against a validated set of reference genes revealed an upregulation of the three genes of interest in three post-mortem tissue samples (cardiac muscle, skeletal muscle and brain) under hypoxic conditions. Interestingly, we found a statistically significant difference in the relative transcript abundance of VEGF in cardiac muscle between donors who died of asphyxia and donors who died from cardiac death. Normalisation against GAPDH alone revealed no upregulation but, in some instances, a downregulation of the genes of interest. To further analyse this discrepancy, the stability of all reference genes used was reassessed, and the very low expression stability of GAPDH was found to originate from the co-regulation of this gene under hypoxic conditions. We concluded that GAPDH is not a suitable reference gene for the quantitative analysis of gene expression in hypoxia and that validation of reference genes is a crucial step for generating biologically meaningful data.
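The two normalisation strategies contrasted above can be sketched as follows: the classic single-reference 2^(-ΔΔCt) fold change, and a geNorm-style normalisation factor taken as the geometric mean of several validated reference genes. This is a minimal sketch of the standard formulas, with assumed function names; it is not the authors' analysis code.

```python
import math

def fold_change_single_ref(ct_target_s, ct_ref_s, ct_target_c, ct_ref_c):
    """2^(-ΔΔCt): target Ct normalised against a single reference gene
    (e.g. GAPDH) in the sample (s) and control (c) conditions."""
    ddct = (ct_target_s - ct_ref_s) - (ct_target_c - ct_ref_c)
    return 2.0 ** -ddct

def normalisation_factor(relative_quantities):
    """Multi-reference strategy: geometric mean of the relative
    quantities of several validated reference genes."""
    logs = [math.log(q) for q in relative_quantities]
    return math.exp(sum(logs) / len(logs))
```

The contrast in the abstract follows directly: if the single reference gene (GAPDH) is itself co-regulated by hypoxia, its Ct shift cancels the real shift of the target, whereas the geometric-mean factor dilutes the instability of any one reference gene.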

Relevance: 100.00%

Publisher:

Abstract:

EXECUTIVE SUMMARY: Evaluating Information Security posture within an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately, this is ineffective because it does not take into consideration the necessity of having a global and systemic multidimensional approach to Information Security evaluation. At the same time, the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; this is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. This dissertation is divided into three parts. Part One: Information Security Evaluation Issues consists of four chapters. Chapter 1 is an introduction to the purpose of this research and the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed in this direction. We then introduce the baseline attributes of our model and set out the expected result of evaluations according to our model. Chapter 2 is focused on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in a holistic, baseline Information Security Program are defined. Based on this, the most common roots-of-trust in Information Security are identified. Chapter 3 focuses on an analysis of the difference and the relationship between the concepts of Information Risk Management and Security Management.
Comparing these two concepts allows us to identify the most relevant elements to be included within our evaluation model, while clearly situating these two notions within a defined framework, which is of the utmost importance for the results that will be obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed in order to provide an assurance-related platform as well as three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers. Then the operation of the model is discussed. Assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two: Implementation of the Information Security Assurance Assessment Model (ISAAM) according to the Information Security Domains consists of four chapters. This is the section where our evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational dimension, Functional dimension, Human dimension, and Legal dimension. Each Information Security dimension is discussed in a separate chapter. For each dimension, the following two-phase evaluation path is followed. The first phase concerns the identification of the elements which will constitute the basis of the evaluation: (1) identification of the key elements within the dimension; (2) identification of the Focus Areas for each dimension, consisting of the security issues identified for that dimension; and (3) identification of the Specific Factors for each dimension, consisting of the security measures or controls addressing the security issues identified for that dimension. The second phase concerns the evaluation of each Information Security dimension by: (1) the implementation of the evaluation model, based on the elements identified for each dimension within the first phase, by identifying the security tasks, processes, procedures, and actions that should have been performed by the organization to reach the desired level of protection; and (2) the maturity model for each dimension as a basis for reliance on security. For each dimension we propose a generic maturity model that could be used by every organization in order to define its own security requirements. Part Three of this dissertation contains the Final Remarks, Supporting Resources and Annexes. With reference to the objectives of our thesis, the Final Remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. Supporting Resources comprise the bibliographic resources that were used to elaborate and justify our approach. Annexes include all the relevant topics identified within the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise which can be combined in order to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that the Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources of information in order to provide a generic model able to be implemented in all kinds of organizations.
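The dimension / Focus Area / Specific Factor hierarchy described above lends itself to a simple data structure. The sketch below is purely illustrative: the class names, maturity scale, and the weakest-link (minimum) aggregation are assumptions chosen to echo the thesis's "weakest link" premise, not ISAAM's actual scoring rules.

```python
from dataclasses import dataclass, field

@dataclass
class SpecificFactor:
    name: str
    maturity: int = 0          # assumed scale: 0 (absent) .. 5 (optimised)

@dataclass
class FocusArea:
    name: str
    factors: list = field(default_factory=list)

    def maturity(self):
        # a focus area is only as strong as its weakest specific factor
        return min((f.maturity for f in self.factors), default=0)

@dataclass
class Dimension:
    name: str                  # Organizational, Functional, Human, Legal
    focus_areas: list = field(default_factory=list)

    def maturity(self):
        # weakest-link aggregation across the dimension's focus areas
        return min((a.maturity() for a in self.focus_areas), default=0)
```

Aggregating by minimum rather than average makes a single weak control visible at the dimension level, which is the behaviour a weakest-link security model wants.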
The value added by our evaluation model is that it is easy to implement and operate and answers concrete needs in terms of reliance upon an efficient and dynamic evaluation tool through a coherent evaluation system. On that basis, our model could be implemented internally within organizations, allowing them to better govern their Information Security. RÉSUMÉ: General context of the thesis. The evaluation of security in general, and of information security in particular, has become for organizations not only a crucial mission but also an increasingly complex one. At present, this evaluation relies mainly on methodologies, best practices, norms or standards that treat the various aspects composing information security separately. We believe this way of evaluating security is inefficient, because it does not account for the interaction between the different dimensions and components of security, even though it has long been accepted that an organization's overall security level is always that of the weakest link in the security chain. We identified the need for a global, integrated, systemic and multidimensional approach to information security evaluation. Indeed, and this is the starting point of our thesis, we show that only a global consideration of security can satisfy the requirements of optimal security as well as the specific protection needs of an organization. Our thesis therefore proposes a new security evaluation paradigm designed to meet a given organization's needs for effectiveness and efficiency.
We then propose a model that aims to evaluate all dimensions of security holistically, in order to minimize the probability that a potential threat could exploit vulnerabilities and cause direct or indirect damage. This model rests on a formalized structure that takes into account all the elements of a security system or program; on this basis, we propose a methodological evaluation framework that considers information security from a global perspective. Structure of the thesis and topics covered: the document is organized in three parts. The first, entitled "The problem of information security evaluation", comprises four chapters. Chapter 1 introduces the object of the research as well as the basic concepts of the proposed evaluation model. The traditional way of evaluating security is critically analysed in order to identify the principal, invariant elements to be taken into account in our holistic approach. The building blocks of our evaluation model and its expected operation are then presented, so as to outline the results expected from this model. Chapter 2 focuses on the definition of the notion of Information Security. It is not a redefinition of the notion of security, but a putting into perspective of the dimensions, criteria and indicators to be used as a reference baseline, in order to determine the object of the evaluation used throughout our work. The concepts inherent in the holistic character of security, as well as the constituents of a security reference level, are defined accordingly. This makes it possible to identify what we have called the "roots of trust".
Chapter 3 presents and analyses the difference and the relationships between the Risk Management and Security Management processes, in order to identify the constituent elements of the protection framework to be included in our evaluation model. Chapter 4 is devoted to the presentation of our evaluation model, the Information Security Assurance Assessment Model (ISAAM), and the way it meets the evaluation requirements presented earlier. In this chapter the underlying concepts of assurance and trust are analysed. Building on these two concepts, the structure of the evaluation model is developed into a platform offering a certain level of guarantee, resting on three evaluation attributes, namely: the "assurance structure", "process quality", and "achievement of requirements and objectives". The issues associated with each of these evaluation attributes are analysed on the basis of the state of the art in research and the literature, the various existing methods, and the most common norms and standards in the security field. On this basis, three different evaluation levels are constructed, namely: the assurance level, the quality level and the maturity level, which together form the basis for evaluating the overall security state of an organization. The second part, "Applying the Information Security Assurance Assessment Model by security domain", also comprises four chapters. The evaluation model already constructed and analysed is here placed in a specific context according to the four predefined security dimensions: the Organizational dimension, the Functional dimension, the Human dimension, and the Legal dimension.
Each of these dimensions and its specific evaluation is the subject of a separate chapter. For each dimension, a two-phase evaluation is constructed as follows. The first phase concerns the identification of the elements that constitute the basis of the evaluation: (1) identification of the key elements of the evaluation; (2) identification of the "Focus Areas" for each dimension, which represent the issues found in that dimension; and (3) identification of the "Specific Factors" for each Focus Area, which represent the security and control measures that help resolve or reduce the impact of the risks. The second phase concerns the evaluation of each of the dimensions presented above. It consists, on the one hand, of applying the general evaluation model to the dimension concerned, by building on the elements specified in the first phase and by identifying the specific security tasks, processes and procedures that should have been carried out to reach the desired level of protection. On the other hand, the evaluation of each dimension is completed by the proposal of a maturity model specific to that dimension, to be regarded as a reference baseline for the overall security level. For each dimension we propose a generic maturity model that can be used by any organization in order to specify its own security requirements. This constitutes an innovation in the field of evaluation, which we justify for each dimension and whose added value we systematically highlight. The third part of the document concerns the overall validation of our proposal and contains, by way of conclusion, a critical perspective on our work together with final remarks. This last part is completed by a bibliography and annexes.
Our security evaluation model integrates and builds on numerous sources of expertise, such as best practices, norms, standards, methods and the scientific research expertise of the field. Our constructive proposal addresses a real and as yet unresolved problem faced by all organizations, regardless of size or profile. It would allow them to specify their particular requirements as to the security level to be met, and to instantiate an evaluation process specific to their needs, so that they can ensure that their information security is managed appropriately, thereby providing a certain level of confidence in the degree of protection delivered. We have integrated into our model the best of the know-how, experience and expertise currently available internationally, with the aim of providing an evaluation model that is simple, generic and applicable to a large number of public or private organizations. The added value of our evaluation model lies precisely in the fact that it is sufficiently generic and easy to implement while providing answers to the concrete needs of organizations. Our proposal thus constitutes a reliable, efficient and dynamic evaluation tool derived from a coherent evaluation approach. Our evaluation system can therefore be implemented internally by the organization itself, without additional resources, also giving it the possibility to better govern its information security.

Relevance: 100.00%

Publisher:

Abstract:

The neuropathology of Alzheimer disease is characterized by senile plaques, neurofibrillary tangles and cell death. These hallmarks develop according to the differential vulnerability of brain networks, senile plaques accumulating preferentially in the associative cortical areas and neurofibrillary tangles in the entorhinal cortex and the hippocampus. We suggest that the main aetiological hypotheses, such as the beta-amyloid cascade hypothesis or its variant, the synaptic beta-amyloid hypothesis, will have to consider neural networks not just as targets of degenerative processes but also as contributors to the disease's progression and phenotype. Three domains of research are highlighted in this review. First, the cerebral reserve and the redundancy of the network's elements are related to brain vulnerability. Indeed, an enriched environment appears to increase the cerebral reserve as well as the threshold of disease onset. Second, disease progression and memory performance cannot be explained by synaptic or neuronal loss alone, but also by the presence of compensatory mechanisms, such as synaptic scaling, at the microcircuit level. Third, some phenotypes of Alzheimer disease, such as hallucinations, appear to be related to progressive dysfunction of neural networks as a result, for instance, of a decreased signal-to-noise ratio, involving a diminished activity of the cholinergic system. Overall, converging results from studies of biological as well as artificial neural networks lead to the conclusion that changes in neural networks contribute strongly to Alzheimer disease's progression.

Relevance: 100.00%

Publisher:

Abstract:

Summary This dissertation explores how stakeholder dialogue influences corporate processes, and speculates about the potential of this phenomenon - particularly with actors, like non-governmental organizations (NGOs) and other representatives of civil society, which have received growing attention against a backdrop of increasing globalisation and which have often been cast in an adversarial light by firms - as a source of learning and a spark for innovation in the firm. The study is set within the context of the introduction of genetically-modified organisms (GMOs) in Europe. Its significance lies in the fact that scientific developments and new technologies are being generated at an unprecedented rate in an era where civil society is becoming more informed, more reflexive, and more active in facilitating or blocking such new developments, which could have the potential to trigger widespread changes in economies, attitudes, and lifestyles, and address global problems like poverty, hunger, climate change, and environmental degradation. In the 1990s, companies using biotechnology to develop and offer novel products began to experience increasing pressure from civil society to disclose information about the risks associated with the use of biotechnology and GMOs, in particular. Although no harmful effects for humans or the environment have been factually demonstrated even to date (2008), this technology remains highly contested and its introduction in Europe catalysed major companies to invest significant financial and human resources in stakeholder dialogue. A relatively new phenomenon at the time, with little theoretical backing, dialogue was seen to reflect a move towards greater engagement with stakeholders, commonly defined as those "individuals or groups with which business interacts who have a 'stake', or vested interest in the firm" (Carroll, 1993:22), with whom firms are seen to be inextricably embedded (Andriof & Waddock, 2002).
Regarding the organisation of this dissertation, Chapter 1 (Introduction) describes the context of the study and elaborates its significance for academics and business practitioners as an empirical work embedded in a sector at the heart of the debate on corporate social responsibility (CSR). Chapter 2 (Literature Review) traces the roots and evolution of CSR, drawing on Stakeholder Theory, Institutional Theory, Resource Dependence Theory, and Organisational Learning to establish what has already been developed in the literature regarding the stakeholder concept, motivations for engagement with stakeholders, the corporate response to external constituencies, and outcomes for the firm in terms of organisational learning and change. I used this review of the literature to guide my inquiry and to develop the key constructs through which I viewed the empirical data that was gathered. In this respect, concepts related to how the firm views itself (as a victim, follower, leader), how stakeholders are viewed (as a source of pressure and/or threat; as an asset: current and future), corporate responses (in the form of buffering, bridging, boundary redefinition), and types of organisational learning (single-loop, double-loop, triple-loop) and change (first order, second order, third order) were particularly important in building the key constructs of the conceptual model that emerged from the analysis of the data. Chapter 3 (Methodology) describes the methodology that was used to conduct the study, affirms the appropriateness of the case study method in addressing the research question, and describes the procedures for collecting and analysing the data. Data collection took place in two phases - extending from August 1999 to October 2000, and from May to December 2001 - which functioned as 'snapshots' in time of the three companies under study.
The data was systematically analysed and coded using ATLAS/ti, a qualitative data analysis tool, which enabled me to sort, organise, and reduce the data into a manageable form. Chapter 4 (Data Analysis) contains the three cases that were developed (anonymised as Pioneer, Helvetica, and Viking). Each case is presented in its entirety (constituting a 'within-case' analysis), followed by a 'cross-case' analysis, backed up by extensive verbatim evidence. Chapter 5 presents the research findings, outlines the study's limitations, describes managerial implications, and offers suggestions for where more research could elaborate the conceptual model developed through this study, as well as suggestions for additional research in areas where managerial implications were outlined. References and Appendices are included at the end. This dissertation results in the construction and description of a conceptual model, grounded in the empirical data and tied to existing literature, which portrays a set of elements and relationships deemed important for understanding the impact of stakeholder engagement for firms in terms of organisational learning and change. This model suggests that corporate perceptions about the nature of stakeholders influence the perceived value of stakeholder contributions. When stakeholders are primarily viewed as a source of pressure or threat, firms tend to adopt a reactive/defensive posture in an effort to manage stakeholders and protect the firm from sources of outside pressure - behaviour consistent with Resource Dependence Theory, which suggests that firms try to get control over external threats by focussing on the relevant stakeholders on whom they depend for critical resources, and try to reverse the control potentially exerted by external constituencies by trying to influence and manipulate these valuable stakeholders.
In situations where stakeholders are viewed as a current strategic asset, firms tend to adopt a proactive/offensive posture in an effort to tap stakeholder contributions and connect the organisation to its environment - behaviour consistent with Institutional Theory, which suggests that firms try to ensure the continuing license to operate by internalising external expectations. In instances where stakeholders are viewed as a source of future value, firms tend to adopt an interactive/innovative posture in an effort to reduce or widen the embedded system and bring stakeholders into systems of innovation and feedback - behaviour consistent with the literature on Organisational Learning, which suggests that firms can learn how to optimize their performance as they develop systems and structures that are more adaptable and responsive to change. The conceptual model moreover suggests that the perceived value of stakeholder contribution drives corporate aims for engagement, which can be usefully categorised as dialogue intentions spanning a continuum running from low-level to high-level to very-high-level. This study suggests that activities aimed at disarming critical stakeholders ('manipulation'), providing guidance and correcting misinformation ('education'), being transparent about corporate activities and policies ('information'), alleviating stakeholder concerns ('placation'), and accessing stakeholder opinion ('consultation') represent low-level dialogue intentions and are experienced by stakeholders as asymmetrical, persuasive, compliance-gaining activities that are not in line with 'true' dialogue. This study also finds evidence that activities aimed at redistributing power ('partnership'), involving stakeholders in internal corporate processes ('participation'), and demonstrating corporate responsibility ('stewardship') reflect high-level dialogue intentions.
This study additionally finds evidence that building and sustaining high-quality, trusted relationships which can meaningfully influence organisational policies inclines a firm towards the type of interactive, proactive processes that underpin the development of sustainable corporate strategies. Dialogue intentions are related to the type of corporate response: low-level intentions can lead to buffering strategies; high-level intentions can underpin bridging strategies; very-high-level intentions can incline a firm towards boundary redefinition. The nature of the corporate response (which encapsulates a firm's posture towards stakeholders, demonstrated by the level of dialogue intention and the firm's strategy for dealing with stakeholders) favours the type of learning and change experienced by the organisation. This study indicates that buffering strategies, where the firm attempts to protect itself against external influences and carry out its existing strategy, typically lead to single-loop learning, whereby the firm learns how to perform better within its existing paradigm and, at most, improves the performance of the established system - an outcome associated with first-order change. Bridging responses, where the firm adapts organisational activities to meet external expectations, typically lead a firm to acquire new behavioural capacities characteristic of double-loop learning, whereby insights and understandings are uncovered that are fundamentally different from existing knowledge, and where stakeholders are brought into problem-solving conversations that enable them to influence corporate decision-making to address shortcomings in the system - an outcome associated with second-order change.
Boundary redefinition suggests that the firm engages in triple-loop learning, whereby the firm changes its relations with stakeholders in profound ways, considers problems from a whole-system perspective, examines the deep structures that sustain the system, and produces innovation to address chronic problems and develop new opportunities - an outcome associated with third-order change. This study supports earlier theoretical and empirical studies (e.g. Weick's (1979, 1985) work on self-enactment; Maitlis and Lawrence's (2007), Maitlis' (2005) and Weick et al.'s (2005) work on sensegiving and sensemaking in organisations; Brickson's (2005, 2007) and Scott and Lane's (2000) work on organisational identity orientation), which indicate that corporate self-perception is a key underlying factor driving the dynamics of organisational learning and change. Such theorising has important implications for managerial practice; namely, that a company which perceives itself as a 'victim' may be highly inclined to view stakeholders as a source of negative influence, and would therefore be potentially unable to benefit from the positive influence of engagement. Such a self-perception can blind the firm from seeing stakeholders in a more positive, contributing light, which suggests that such firms may not be inclined to embrace external sources of innovation and learning, as they are focussed on protecting the firm against disturbing environmental influences (through buffering), and remain more likely to perform better within an existing paradigm (single-loop learning). By contrast, a company that perceives itself as a 'leader' may be highly inclined to view stakeholders as a source of positive influence.
On the downside, such a firm might have difficulty distinguishing when stakeholder contributions are less pertinent, as it is deliberately more open to elements in its operating environment (including stakeholders) as potential sources of learning and change; the firm is oriented towards creating space for fundamental change (through boundary redefinition), opening issues to entirely new ways of thinking and addressing issues from a whole-system perspective. A significant implication of this study is that potentially only those companies that see themselves as leaders are ultimately able to tap the innovation potential of stakeholder dialogue.
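The causal chain the model describes (stakeholder perception, posture, dialogue intention, response strategy, learning mode, change order) can be sketched as a simple lookup table. This is an illustrative encoding only; the keys and field names below are shorthand for the categories named above, not terminology from the study itself.

```python
# Illustrative encoding of the conceptual model's three postures.
# Labels mirror the categories described in the text; the dictionary
# structure is an assumption made purely for clarity.
MODEL = {
    "pressure/threat": {
        "posture": "reactive/defensive",
        "dialogue_intention": "low-level",
        "response": "buffering",
        "learning": "single-loop",
        "change_order": 1,
    },
    "current strategic asset": {
        "posture": "proactive/offensive",
        "dialogue_intention": "high-level",
        "response": "bridging",
        "learning": "double-loop",
        "change_order": 2,
    },
    "source of future value": {
        "posture": "interactive/innovative",
        "dialogue_intention": "very-high-level",
        "response": "boundary redefinition",
        "learning": "triple-loop",
        "change_order": 3,
    },
}

def expected_outcome(stakeholder_view: str) -> str:
    """Trace a stakeholder perception through the model's causal chain."""
    m = MODEL[stakeholder_view]
    return (f"{m['posture']} posture -> {m['response']} -> "
            f"{m['learning']} learning (order-{m['change_order']} change)")

print(expected_outcome("pressure/threat"))
```

The table makes the model's central claim easy to see at a glance: each perception of stakeholders maps to a distinct posture, response, and depth of learning.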

Resumo:

The Internet is becoming more and more popular among drug users. The use of websites and forums to obtain illicit drugs and relevant information about the means of consumption is a growing phenomenon, mainly for new synthetic drugs. Gamma-butyrolactone (GBL), a chemical precursor of gamma-hydroxybutyric acid (GHB), is used as a "club drug" and also in drug-facilitated sexual assaults. Its market operates mainly on the Internet through online websites, but the structure of that market remains unknown. This research aims to combine digital, physical and chemical information to help understand the distribution routes and the structure of the GBL market. Based on an Internet monitoring process, thirty-nine websites selling GBL, mainly in the Netherlands, were detected between January 2010 and December 2011. Seventeen websites were categorized into six groups based on digital traces (e.g. IP addresses and contact information). In parallel, twenty-five bulk GBL specimens were purchased from sixteen websites for packaging comparisons and carbon isotopic measurements. Packaging information showed a high correlation with the digital data, confirming the links previously established, whereas chemical information revealed undetected links and provided complementary information. Indeed, while digital and packaging data give relevant information about the retailers, the supply routes and the distribution close to the consumer, the carbon isotopic data provide upstream information about the production level, in particular the synthesis pathways and the chemical precursors. A three-level structured market has thereby been identified, with a production level mainly located in China and Germany, an online distribution level mainly hosted in the Netherlands, and customers who order on the Internet.
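The grouping of websites by shared digital traces amounts to a link-analysis step: two sites sharing any trace value (an IP address, a contact e-mail) fall into the same group, transitively. A minimal sketch of that step follows, using a union-find structure on hypothetical data; the study's actual trace attributes and grouping criteria are not reproduced here.

```python
# Sketch (hypothetical data): grouping online shops by shared digital
# traces, in the spirit of the six groups formed from IP addresses and
# contact information.
from collections import defaultdict

def group_sites(traces):
    """traces: {site: set of trace values (IPs, e-mails, ...)}.
    Sites sharing any trace value fall into one group, transitively."""
    parent = {s: s for s in traces}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    # Index sites by trace value, then merge all sites sharing a value.
    by_trace = defaultdict(list)
    for site, values in traces.items():
        for v in values:
            by_trace[v].append(site)
    for sites in by_trace.values():
        for other in sites[1:]:
            union(sites[0], other)

    groups = defaultdict(set)
    for s in traces:
        groups[find(s)].add(s)
    return list(groups.values())

# Hypothetical example: A and B share an IP; C shares an e-mail with B.
demo = {"A": {"ip:1.2.3.4"}, "B": {"ip:1.2.3.4", "mail:x@y"},
        "C": {"mail:x@y"}, "D": {"ip:9.9.9.9"}}
print(group_sites(demo))  # one group with A, B, C; D remains alone
```

The transitive merge is what lets indirect links (A to C via B) surface even when the two sites share no trace directly.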

Resumo:

The production and use of false identity and travel documents in organized crime represent a serious and evolving threat. However, the present-day fight against this criminal problem is driven essentially by a case-by-case perspective, which suffers from linkage blindness and a limited analysis capacity. To assist in overcoming these limitations, a process model was developed using a forensic perspective. It guides the systematic analysis and management of seized false documents to generate forensic intelligence that supports strategic and tactical decision-making in an intelligence-led policing approach. The model is articulated on a three-level architecture that aims to assist in detecting and following up on general trends, production methods and links between cases or series. Using analyses of a large dataset of counterfeit and forged identity and travel documents, it is possible to illustrate the model, its three levels and their contribution. Examples point out how the proposed approach assists in detecting emerging trends, in evaluating the black market's degree of structure, in uncovering criminal networks, in monitoring the quality of false documents, and in identifying their weaknesses to orient the conception of more secure travel and identity documents. The process model proposed is thought to have general application in forensic science and can readily be transposed to other fields of study.

Resumo:

The aim of the present study was to determine the cycle length of spermatogenesis in three species of shrew, Suncus murinus, Sorex coronatus and Sorex minutus, and to assess the relative influence of variation in basal metabolic rate (BMR) and mating system (level of sperm competition) on the observed rate of spermatogenesis, incorporating data from previously studied shrew species (Sorex araneus, Crocidura russula and Neomys fodiens). The dynamics of sperm production were determined by tracing 5-bromodeoxyuridine in the DNA of germ cells. As a continuous scaling of mating systems is not evident, the level of sperm competition was evaluated through relative testis size (RTS), with which it is significantly correlated. The cycle durations estimated by linear regression were 14.3 days (RTS 0.3%) in Suncus murinus, 9.0 days (RTS 0.5%) in Sorex coronatus and 8.5 days (RTS 2.8%) in Sorex minutus. In regression and multiple regression analyses including all six studied species of shrew, cycle length was significantly correlated with BMR (r² = 0.73) and RTS (r² = 0.77). Sperm competition as an ultimate factor evidently leads to a reduction in the duration of spermatogenesis in order to increase sperm production. BMR may act in the same way, independently or as a proximate factor, as revealed by the covariation, but other factors (related to testis size and thus to mating system) may also be involved.
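The reported correlations rest on ordinary least-squares regression of cycle length against a single predictor. A minimal sketch of that computation follows; the data arrays are hypothetical placeholders (only three of the six species' RTS/cycle pairs appear in the abstract), so the printed coefficients are not the study's results.

```python
# Ordinary least squares for one predictor, with the coefficient of
# determination r^2.  Data below are partly hypothetical placeholders,
# not the study's actual six-species dataset.
def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

rts = [0.3, 0.5, 2.8, 2.9, 0.2, 1.1]      # relative testis size, % (last three hypothetical)
cycle = [14.3, 9.0, 8.5, 7.8, 12.1, 9.6]  # cycle length, days (last three hypothetical)
slope, intercept, r2 = ols(rts, cycle)
print(f"cycle ≈ {intercept:.1f} + {slope:.2f}·RTS,  r² = {r2:.2f}")
```

The multiple-regression variant (cycle length on BMR and RTS together) follows the same least-squares principle with two predictors.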

Resumo:

AIM: The study aimed to analyse the currently available national and international guidelines for areas of consensus and contrasting recommendations in the treatment of diverticulitis, and thereby to design questions for future research. METHOD: MEDLINE, EMBASE and PubMed were systematically searched for guidelines on diverticular disease and diverticulitis. Inclusion was confined to papers in English published within the previous 10 years. The included topics were classified as areas of consensus or controversy between guidelines, and the highest level of evidence was scored as sufficient (Oxford Centre for Evidence-Based Medicine level of evidence 3a or higher) or insufficient. RESULTS: Six guidelines were included and all topics with recommendations were compared. Overall, consensus was reached on 13 topics and 10 topics were regarded as controversial. On five topics, consensus was reached without sufficient evidence, and on three topics there was neither evidence nor consensus. Clinical staging, the need for intraluminal imaging, dietary restriction, duration of antibiotic treatment, the protocol for abscess treatment, the need for elective surgery in subgroups of patients, the need for surgery after abscess treatment and the level of the proximal resection margin all lack consensus or evidence. CONCLUSION: Evidence on the diagnosis and treatment of diverticular disease and diverticulitis ranged from nonexistent to strong, regardless of consensus. The most relevant research questions were identified and proposed as topics for future research.
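The scoring scheme (consensus versus controversy, best evidence at Oxford CEBM level 3a or higher counting as sufficient) lends itself to a simple two-way tally. The sketch below uses hypothetical topic records; the level ordering and helper names are assumptions made for illustration.

```python
# Illustrative tally of guideline topics by consensus status and
# evidence sufficiency.  Topic records are hypothetical examples,
# not the review's actual 23 topics.
from collections import Counter

LEVELS = ["1a", "1b", "2a", "2b", "3a", "3b", "4", "5"]

def sufficient(level: str) -> bool:
    """Oxford CEBM level 3a or higher counts as sufficient evidence."""
    return LEVELS.index(level) <= LEVELS.index("3a")

topics = [  # hypothetical records
    {"name": "clinical staging", "consensus": False, "level": "4"},
    {"name": "antibiotic duration", "consensus": False, "level": "2b"},
    {"name": "dietary restriction", "consensus": True, "level": "5"},
]
tally = Counter((t["consensus"], sufficient(t["level"])) for t in topics)
for (cons, suff), n in sorted(tally.items()):
    print(f"consensus={cons}, sufficient evidence={suff}: {n} topic(s)")
```

Cells of this tally correspond directly to the review's categories, e.g. "consensus without sufficient evidence" is the (consensus=True, sufficient=False) cell.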

Resumo:

Chlamydia and Chlamydia-related bacteria are known to infect various organisms and may cause a wide range of diseases, especially in ruminants. To gain insight into the prevalence of these bacteria in the ruminant environment, we applied a pan-Chlamydiales PCR followed by sequencing to 72 ruminant environmental samples from water, feed bunks and floors. Chlamydiales from four family-level lineages were detected, indicating a high biodiversity of Chlamydiales in ruminant farms. Parachlamydiaceae were detected in all three types of environmental samples and were the most abundant family-level taxon (60%). In contrast, only one bacterium from each of the following family-level lineages was identified: Chlamydiaceae, Criblamydiaceae and Simkaniaceae. The observed high prevalence of Parachlamydiaceae in water samples may point to water as the main source of contamination for ruminants, as well as for their environment through spoilage. The absence of reported infections in the investigated ruminant farms might indicate either that the detected Chlamydiales are of reduced pathogenicity or that infective doses were not reached.

Resumo:

BACKGROUND: During the last decade, the management of blunt hepatic injury has changed considerably. Three options are available: nonoperative management (NOM), transarterial embolization (TAE), and surgery. We aimed to evaluate in a systematic review the current practice and outcomes in the management of Grade III to V blunt hepatic injury. METHOD: The MEDLINE database was searched using PubMed to identify English-language citations published after 2000 using the key words blunt, hepatic injury, severe, and grade III to V in different combinations. Liver injury was graded according to the American Association for the Surgery of Trauma classification on computed tomography (CT). The primary outcome analyzed was success rate in intention to treat. Critical appraisal of the literature was performed using the validated National Institute for Health and Care Excellence "Quality Assessment for Case Series" system. RESULTS: Twelve articles were selected for critical appraisal (n = 4,946 patients). The median quality score of articles was 4 of 8 (range, 2-6). Overall, the median Injury Severity Score (ISS) at admission was 26 (range, 0.6-75). A median of 66% (range, 0-100%) of patients was managed with NOM, with a success rate of 94% (range, 86-100%). TAE was used in only 3% of cases (range, 0-72%) owing to contrast extravasation on CT, with a success rate of 93% (range, 81-100%); however, 9% to 30% of patients required a laparotomy. Thirty-one percent (range, 17-100%) of patients were managed with surgery, owing to hemodynamic instability in most cases, with 12% to 28% requiring secondary TAE to control recurrent hepatic bleeding. Mortality was 5% (range, 0-8%) after NOM and 51% (range, 30-68%) after surgery. CONCLUSION: NOM of Grade III to V blunt hepatic injury is the first treatment option to manage hemodynamically stable patients.
TAE and surgery are considered in a highly selective group of patients with contrast extravasation on CT or shock at admission, respectively. Additional standardization of the reports is necessary to allow accurate comparisons of the various management strategies. LEVEL OF EVIDENCE: Systematic review, level IV.
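"Success rate in intention to treat" means successes are counted against everyone initially assigned to a strategy, including patients who later crossed over (e.g. NOM patients requiring laparotomy). A minimal sketch with hypothetical counts:

```python
# Minimal sketch (hypothetical counts): success rate "in intention to
# treat".  Failures such as NOM patients converted to laparotomy stay in
# the denominator of the strategy they were originally assigned to.
def itt_success_rate(assigned: int, succeeded: int) -> float:
    """Percentage of initially assigned patients managed successfully."""
    if assigned == 0:
        raise ValueError("no patients assigned")
    return 100.0 * succeeded / assigned

# Hypothetical cohort: 200 patients assigned to NOM, 188 of whom were
# managed without crossover to surgery or TAE.
print(f"NOM success: {itt_success_rate(200, 188):.0f}%")
```

Keeping crossovers in the original denominator is what makes the pooled rates comparable across studies with different crossover policies.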

Resumo:

BACKGROUND: Since recombinant human growth hormone (rhGH) became available in 1985, the spectrum of indications has broadened and the number of treated patients increased. However, long-term health-related quality of life (HRQoL) after childhood rhGH treatment has rarely been documented. We assessed HRQoL and its determinants in young adults treated with rhGH during childhood. METHODOLOGY/PRINCIPAL FINDINGS: For this study, we retrospectively identified former rhGH patients in 11 centers of paediatric endocrinology, including university hospitals and private practices. We sent a questionnaire to all patients treated with rhGH for any diagnosis, who were older than 18 years, and who resided in Switzerland at time of the survey. Three hundred participants (58% of 514 eligible) returned the questionnaire. Mean age was 23 years; 56% were women; 43% had isolated growth hormone deficiency, or idiopathic short stature; 43% had associated diseases or syndromes, and 14% had growth hormone deficiency after childhood cancer. Swiss siblings of childhood cancer survivors and the German norm population served as comparison groups. HRQoL was assessed using the Short Form-36. We found that the Physical Component Summary of healthy patients with isolated growth hormone deficiency or idiopathic short stature resembled that of the control group (53.8 vs. 54.9). Patients with associated diseases or syndromes scored slightly lower (52.5), and former cancer patients scored lowest (42.6). The Mental Component Summary was similar for all groups. Lower Physical Component Summary was associated with lower educational level (coeff. -1.9). Final height was not associated with HRQoL. CONCLUSIONS/SIGNIFICANCE: In conclusion, HRQoL after treatment with rhGH in childhood depended mainly on the underlying indication for rhGH treatment. Patients with isolated growth hormone deficiency/idiopathic short stature or patients with associated diseases or syndromes had HRQoL comparable to peers. 
Patients with growth hormone deficiency after childhood cancer were at high risk for lower HRQoL. This reflects the general impaired health of this vulnerable group, which needs long-term follow-up.