25 results for executive summary

in Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

The International Society for Clinical Densitometry (ISCD) and the International Osteoporosis Foundation (IOF) convened the FRAX(®) Position Development Conference (PDC) in Bucharest, Romania, on November 14, 2010, following a two-day joint meeting of the ISCD and IOF on the "Interpretation and Use of FRAX(®) in Clinical Practice." These three days of critical discussion and debate, led by a panel of international experts from the ISCD, IOF and dedicated task forces, have clarified a number of important issues pertaining to the interpretation and implementation of FRAX(®) in clinical practice. The Official Positions resulting from the PDC are intended to enhance the quality and clinical utility of fracture risk assessment worldwide. Since the field of skeletal assessment is still evolving rapidly, some clinically important issues addressed at the PDCs are not associated with robust medical evidence. Accordingly, some Official Positions are based largely on expert opinion. Despite limitations inherent in such a process, the ISCD and IOF believe it is important to provide clinicians and technologists with the best distillation of current knowledge in the discipline of bone densitometry and provide an important focus for the scientific community to consider. This report describes the methodology and results of the ISCD-IOF PDC dedicated to FRAX(®).

Relevance: 60.00%

Abstract:

[Contents] 1. Executive summary. 2. Introduction. 3. Methods. 4. Main results: 4.1. Participants; 4.2. Estimation of dietary salt intake using 24-hour urine collection; 4.3. Blood pressure and hypertension; 4.4. Anthropometric data (body weight, height and body mass index (BMI); prevalence of overweight and obesity; waist circumference; ...); 4.5. Knowledge and behaviors towards salt. 5. Discussion.
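Section 4.2's estimate rests on a standard conversion from 24-hour urinary sodium excretion to grams of salt; the report's own formula is not quoted here, so the sketch below uses the textbook relationship (about 17.1 mmol of sodium per gram of NaCl), with an illustrative value of our own.

```python
# Illustrative conversion from 24-h urinary sodium to dietary salt (NaCl).
# Textbook chemistry: molar mass of NaCl = 58.44 g/mol, so 1 g of salt
# contains roughly 17.1 mmol of sodium. Function name and example are ours.

def salt_intake_g_per_day(urinary_na_mmol_24h: float) -> float:
    """Estimate salt intake (g/day) from 24-h urinary sodium (mmol)."""
    mmol_na_per_g_nacl = 1000.0 / 58.44  # about 17.1
    return urinary_na_mmol_24h / mmol_na_per_g_nacl

# Example: 150 mmol of sodium excreted over 24 h is roughly 8.8 g of salt.
print(round(salt_intake_g_per_day(150.0), 1))
```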

Relevance: 60.00%

Abstract:

Executive Summary

The first essay of this dissertation investigates whether greater exchange rate uncertainty (i.e., variation over time in the exchange rate) fosters or depresses the foreign investment of multinational firms. In addition to the direct capital financing it supplies, foreign investment can be a source of valuable technology and know-how, which can have substantial positive effects on a host country's economic growth. Thus, it is critically important for policy makers and central bankers, among others, to understand how multinationals base their investment decisions on the characteristics of foreign exchange markets. In this essay, I first develop a theoretical framework to improve our knowledge of how the aggregate level of foreign investment responds to exchange rate uncertainty when an economy consists of many firms, each making its own decision. The analysis predicts a U-shaped effect of exchange rate uncertainty on the total level of foreign investment in the economy: the effect is negative for low levels of uncertainty and positive for higher levels. This pattern emerges because the relationship between exchange rate volatility and the probability of investment is negative for firms with low productivity at home (i.e., firms that find it profitable to invest abroad) and positive for firms with high productivity at home (i.e., firms that prefer exporting their product). This finding stands in sharp contrast to predictions in the existing literature, which consider a single firm's decision to invest in a unique project. The main contribution of this research is to show that aggregation over many firms produces a U-shaped pattern between exchange rate uncertainty and the probability of investment. Using data from industrialized countries for the period 1982-2002, this essay offers a comprehensive empirical analysis that provides evidence in support of the theoretical prediction.

In the second essay, I aim to explain the time variation in sovereign credit risk, which captures the risk that a government may be unable to repay its debt. The importance of correctly evaluating such a risk is illustrated by the central role of sovereign debt in previous international lending crises. In addition, sovereign debt is the largest asset class in emerging markets. In this essay, I provide a pricing formula for the evaluation of sovereign credit risk in which the decision to default on sovereign debt is made by the government. The pricing formula explains the variation across time in daily credit spreads - a widely used measure of credit risk - to a degree not offered by existing theoretical and empirical models. I use information on a country's stock market to compute the prevailing sovereign credit spread in that country. The pricing formula explains a substantial fraction of the time variation in daily credit spread changes for Brazil, Mexico, Peru, and Russia for the 1998-2008 period, particularly during the recent subprime crisis. I also show that when a government's incentive to default is allowed to depend on current economic conditions, one can best explain the level of credit spreads, especially during the recent period of financial distress.

In the third essay, I show that the risk of sovereign default abroad can produce adverse consequences for the U.S. equity market through a decrease in returns and an increase in volatility. The risk of sovereign default, which is no longer limited to emerging economies, has recently become a major concern for financial markets. While sovereign debt plays an increasing role in today's financial environment, the effects of sovereign credit risk on the U.S. financial markets have been largely ignored in the literature. In this essay, I develop a theoretical framework that explores how the risk of sovereign default abroad helps explain the level and the volatility of U.S. equity returns. The intuition for this effect is that negative economic shocks deteriorate the fiscal situation of foreign governments, thereby increasing the risk of a sovereign default that would trigger a local contraction in economic growth. The increased risk of an economic slowdown abroad amplifies the direct effect of these shocks on the level and the volatility of equity returns in the U.S. through two channels. The first channel involves a decrease in the future earnings of U.S. exporters resulting from unfavorable adjustments to the exchange rate. The second channel involves investors' incentives to rebalance their portfolios toward safer assets, which depresses U.S. equity prices. An empirical estimation of the model with monthly data for the 1994-2008 period provides evidence that the risk of sovereign default abroad generates a strong leverage effect during economic downturns, which helps to substantially explain the level and the volatility of U.S. equity returns.
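The aggregation argument of the first essay is easy to illustrate with a toy simulation (our own construction, not the dissertation's model): if the firm-level probability of investing abroad falls with volatility for low-productivity firms and rises for high-productivity firms, averaging over a heterogeneous cross-section can produce the predicted U-shape.

```python
# Toy illustration of the U-shape from aggregation (not the author's model).
import numpy as np

rng = np.random.default_rng(0)
productivity = rng.uniform(0.0, 1.0, size=10_000)  # hypothetical home productivity
CUTOFF = 0.5  # below: firms prefer investing abroad; above: firms prefer exporting

def invest_probability(vol: float, prod: np.ndarray) -> np.ndarray:
    """Assumed convex firm-level responses: steeply decreasing in volatility
    for low-productivity firms, increasing for high-productivity firms."""
    low = prod < CUTOFF
    return np.where(low, 0.8 * np.exp(-3.0 * vol), 0.1 + 0.7 * vol**2)

# The aggregate share first falls, then rises with volatility (U-shape).
for vol in np.linspace(0.0, 1.0, 5):
    share = invest_probability(vol, productivity).mean()
    print(f"volatility={vol:.2f}  aggregate investment share={share:.3f}")
```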

Relevance: 60.00%

Abstract:

Executive Summary

Electricity is crucial for modern societies, so it is important to understand the behaviour of electricity markets in order to be prepared to face the consequences of policy changes. The Swiss electricity market is now in transition from a public monopoly to a liberalised market and is undergoing an "emergent" liberalisation - i.e. liberalisation taking place without proper regulation. The withdrawal of nuclear capacity is also being debated. These two possible changes directly affect the mechanisms for capacity expansion. Thus, in this thesis we concentrate on understanding the dynamics of capacity expansion in the Swiss electricity market. A conceptual model to help understand the dynamics of capacity expansion in the Swiss electricity market is developed and explained in the first essay, in which we identify a potential risk of import dependence. In the second essay, a System Dynamics model, based on the conceptual model, is developed to evaluate the consequences of three scenarios: a nuclear phase-out, the implementation of a policy for avoiding import dependence, and the combination of both. We conclude that the Swiss market is not well prepared to face unexpected changes in supply and demand, and we identify a risk of import dependence, mainly in the case of a nuclear phase-out. The third essay focuses on the opportunity cost of hydro-storage power generation, one of the main generation sources in Switzerland. We use an extended version of our model to test different policies for assigning an opportunity cost to hydro-storage power generation. We conclude that the preferred policies differ across market participants and depend on market structure.
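For readers unfamiliar with System Dynamics, the sketch below shows the kind of stock-and-flow logic such a model formalizes: capacity is a stock fed by construction after a delay, demand grows exogenously, and shortfalls are covered by imports. This is a deliberately minimal toy with invented parameters, not the thesis model.

```python
# Minimal stock-and-flow sketch in the System Dynamics spirit (a toy with
# invented parameters, not the thesis model). Euler integration, yearly steps.
capacity = 100.0          # installed generation capacity (arbitrary units)
pipeline = 0.0            # capacity under construction
demand = 95.0
DEMAND_GROWTH = 0.02      # assumed 2% demand growth per year
CONSTRUCTION_DELAY = 5.0  # years from investment decision to commissioning
RETIREMENT_RATE = 0.03    # fraction of capacity retired each year

for year in range(2025, 2046):
    imports = max(demand - capacity, 0.0)   # shortfall met by imports
    starts = 1.5 * imports                  # assumed gap-driven investment rule
    commissioned = pipeline / CONSTRUCTION_DELAY
    pipeline += starts - commissioned
    capacity += commissioned - RETIREMENT_RATE * capacity
    demand *= 1.0 + DEMAND_GROWTH
    if year % 5 == 0:
        print(f"{year}: capacity={capacity:.1f} demand={demand:.1f} imports={imports:.1f}")
```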

Relevance: 60.00%

Abstract:

EXECUTIVE SUMMARY

This PhD research, funded by the Swiss Sciences Foundation, is principally devoted to enhancing the recognition, visualisation and characterization of geobodies through innovative 3D seismic approaches. A series of case studies from the Australian North West Shelf ensures the development of reproducible integrated 3D workflows and gives new insight into local and regional stratigraphic as well as structural issues. This project was initiated in 2000 at the Geology and Palaeontology Institute of the University of Lausanne (Switzerland). Several collaborations ensured the improvement of the technical approaches as well as the assessment of the geological models:
- Investigations into the Timor Sea structural style were carried out at the Tectonics Special Research Centre of the University of Western Australia and in collaboration with Woodside Energy in Perth.
- The seismic analysis and attribute classification approach was initiated with Schlumberger Oilfield Australia in Perth; assessments and enhancements of the integrated seismic approaches benefited from collaborations with scientists from Schlumberger Stavanger Research (Norway).
Adapting and refining "linear" exploration techniques, a conceptual "helical" 3D seismic approach has been developed. To investigate specific geological issues, this approach, integrating seismic attributes and visualisation tools, has been refined and adjusted, leading to the development of two specific workflows:
- A stratigraphic workflow focused on the recognition of geobodies and the characterization of depositional systems. It can additionally support the modelling of subsidence and, incidentally, help constrain the hydrocarbon maturity of a given area.
- A structural workflow used to quickly and accurately define major and secondary fault systems. The integration of the 3D structural interpretation results enables analysis of the kinematics of the fault networks, which can affect hydrocarbon trapping mechanisms.
The application of these integrated workflows brings new insight into two complex settings on the Australian North West Shelf and yields striking stratigraphic and structural outcomes. The stratigraphic workflow enables the 3D characterization of the Late Palaeozoic glacial depositional system on the Mermaid Nose (Dampier Subbasin, Northern Carnarvon Basin), which presents similarities with the glacial facies along the Neotethys margin up to Oman (chapter 3.1). A subsidence model reveals the Phanerozoic geodynamic evolution of this area (chapter 3.2) and emphasizes two distinct modes of regional extension for the Palaeozoic (Neotethys opening) and the Mesozoic (abyssal plains opening). The structural workflow is used to define the structural evolution of the Laminaria High area (Bonaparte Basin). Following a regional structural characterization of the Timor Sea (chapter 4.1), a thorough analysis of the Mesozoic fault architecture reveals a local rotation of the stress field and the development of reverse structures (flower structures) in an extensional setting, which form potential hydrocarbon traps (chapter 4.2). The definition of the complex Neogene structural architecture, combined with the fault kinematic analysis and a plate flexure model (chapter 4.3), suggests that the Miocene to Pleistocene reactivation phases recorded at the Laminaria High most probably result from the oblique normal reactivation of the underlying Mesozoic fault planes. This episode is associated with the deformation of the subducting Australian plate.

Based on these results, three papers were published in international journals and two additional publications will be submitted; this research also led to several communications at international conferences. Although the different workflows presented in this research were primarily developed and used for the analysis of specific stratigraphic and structural geobodies on the Australian North West Shelf, similar integrated 3D seismic approaches will have applications in hydrocarbon exploration and production, for instance improving the recognition of potential source rocks, secondary migration pathways, additional traps or reservoir breaching mechanisms. The new elements brought by this research further highlight that 3D seismic data contain a tremendous amount of hidden geological information waiting to be revealed, which will undoubtedly bring new insight into the depositional systems, structural evolution and geohistory both of areas reputed to be well explored and constrained and of others yet to be constrained. The further development of 3D texture attributes highlighting specific features of the seismic signal, the integration of quantitative analysis of stratigraphic and structural processes, the automation of the interpretation workflow, and the formal definition of the "seismo-morphologic" characteristics of a wide range of geobodies from various environments represent challenging continuations of the present research. The 21st century will most probably be a transition period between fossil and alternative energies. The next generation of seismic interpreters prospecting for hydrocarbons will undoubtedly face new challenges, mostly due to the shortage of obvious and easy targets, and will probably have to keep integrating techniques and geological processes in order to further capitalise on seismic data and define new potential. Imagination and creativity will most certainly be among the most important qualities required of such geoscientists.

Relevance: 60.00%

Abstract:

Executive Summary

The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model addressing some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the proposed aggregation of performance measures leads to realized portfolio returns that first-order stochastically dominate those resulting from optimization with respect to a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls for a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose leads to a distribution of portfolio returns that second-order stochastically dominates those obtained from virtually all individual performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market: the bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in any attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
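The two dominance checks described above translate directly into code. In the sketch below (synthetic returns and function names of our own), the absolute Lorenz curve is the vector of cumulative averages of sorted returns, first-order differences are screened with the Kolmogorov-Smirnov test, and second-order dominance is the pointwise comparison of the two Lorenz curves.

```python
# Sketch of the comparison recipe described above, on made-up return series.
import numpy as np
from scipy.stats import ks_2samp

def absolute_lorenz(returns: np.ndarray) -> np.ndarray:
    """Cumulative sums of sorted returns, normalized by sample size:
    the absolute Lorenz curve evaluated at quantiles i/n."""
    return np.cumsum(np.sort(returns)) / len(returns)

rng = np.random.default_rng(1)
aggregated = rng.normal(0.006, 0.04, 1000)  # returns, aggregated measures
single = rng.normal(0.004, 0.05, 1000)      # returns, one performance measure

# Step 1: are the two return distributions different at all?
print(ks_2samp(aggregated, single))

# Step 2: second-order stochastic dominance = one absolute Lorenz curve
# lying pointwise above the other.
print("aggregated SSD single:",
      bool(np.all(absolute_lorenz(aggregated) >= absolute_lorenz(single))))
```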

Relevance: 60.00%

Abstract:

This article expresses the current view of the European Society of Gastrointestinal Endoscopy (ESGE) about radiation protection for endoscopic procedures, in particular endoscopic retrograde cholangiopancreatography (ERCP). Particular cases, including pregnant women and pediatric patients, are also discussed. This Guideline was developed by a group of endoscopists and medical physicists to ensure that all aspects of radiation protection are adequately dealt with. A two-page executive summary of evidence statements and recommendations is provided. The target readership for this Guideline mostly includes endoscopists, anesthesiologists, and endoscopy assistants who may be exposed to X-rays during endoscopic procedures.

Relevance: 60.00%

Abstract:

EXECUTIVE SUMMARY: Evaluating Information Security posture within an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately this is ineffective, because it does not take into consideration the necessity of a global and systemic multidimensional approach to Information Security evaluation, even though the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to assess all dimensions of security holistically, in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented, based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. This dissertation is divided into three parts. Part One: Information Security Evaluation Issues consists of four chapters. Chapter 1 is an introduction to the purpose of this research and the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed in this direction. We then introduce the baseline attributes of our model and set out the results expected from evaluations performed according to it. Chapter 2 focuses on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in the contents of a holistic, baseline Information Security Program are defined, and on this basis the most common roots of trust in Information Security are identified. Chapter 3 analyses the difference and the relationship between the concepts of Information Risk and Security Management. Comparing these two concepts allows us to identify the most relevant elements to be included in our evaluation model, while clearly situating these two notions within a defined framework, which is of the utmost importance for the results obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed to provide an assurance-related platform with three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers. The operation of the model is then discussed: assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two: Implementation of the Information Security Assurance Assessment Model (ISAAM) According to the Information Security Domains also consists of four chapters. This is the section where our evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational, Functional, Human and Legal dimensions. Each Information Security dimension is discussed in a separate chapter.
For each dimension, the following two-phase evaluation path is followed. The first phase concerns the identification of the elements which constitute the basis of the evaluation:
- identification of the key elements within the dimension;
- identification of the Focus Areas for each dimension, consisting of the security issues identified for that dimension;
- identification of the Specific Factors for each dimension, consisting of the security measures or controls addressing those security issues.
The second phase concerns the evaluation of each Information Security dimension by:
- implementing the evaluation model, based on the elements identified in the first phase, and identifying the security tasks, processes, procedures, and actions that the organization should have performed to reach the desired level of protection;
- applying the maturity model for each dimension as a basis for reliance on security; for each dimension we propose a generic maturity model that any organization can use to define its own security requirements.
Part Three of this dissertation contains the Final Remarks, the Supporting Resources and the Annexes. With reference to the objectives of our thesis, the Final Remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. The Supporting Resources comprise the bibliographic resources used to elaborate and justify our approach, and the Annexes cover the relevant topics identified in the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise, combined so as to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources in order to provide a generic model that can be implemented in all kinds of organizations. The value added by our evaluation model is that it is easy to implement and operate and answers concrete needs for a reliable, efficient and dynamic evaluation tool within a coherent evaluation system. On that basis, our model can be implemented internally within organizations, allowing them to better govern their Information Security.

RÉSUMÉ: General context of the thesis. Evaluating security in general, and information security in particular, has become for organizations not only a crucial mission but also an increasingly complex one. At present, this evaluation relies mainly on methodologies, best practices, norms or standards that treat the various aspects of information security separately. We believe this way of evaluating security is inefficient, because it does not account for the interactions between the different dimensions and components of security, even though it has long been accepted that an organization's overall security level is always that of the weakest link in the security chain. We identified the need for a global, integrated, systemic and multidimensional approach to information security evaluation. Indeed, and this is the starting point of our thesis, we demonstrate that only a global consideration of security can meet the requirements of optimal security as well as an organization's specific protection needs. Our thesis therefore proposes a new security evaluation paradigm to satisfy a given organization's needs for effectiveness and efficiency. We propose a model that aims to evaluate all dimensions of security holistically, in order to minimize the probability that a potential threat could exploit vulnerabilities and cause direct or indirect damage. This model is based on a formalized structure that takes into account all the elements of a security system or program; we thus propose a methodological evaluation framework that considers information security from a global perspective.

Structure of the thesis and topics addressed. The document is organized in three parts. The first, entitled "The problem of information security evaluation", consists of four chapters. Chapter 1 introduces the object of the research and the basic concepts of the proposed evaluation model. The traditional way of evaluating security is critically analysed to identify the principal, invariant elements to be taken into account in our holistic approach. The basic elements of our evaluation model and its expected operation are then presented in order to outline the results expected from the model. Chapter 2 focuses on the definition of the notion of Information Security. This is not a redefinition of security, but a putting into perspective of the dimensions, criteria and indicators to be used as a reference baseline, in order to determine the object of the evaluation used throughout our work. The concepts inherent in the holistic character of security, as well as the constituent elements of a security baseline, are defined accordingly; this makes it possible to identify what we have called the "roots of trust". Chapter 3 presents and analyses the difference and the relations between the Risk Management and Security Management processes, in order to identify the constituent elements of the protection framework to be included in our evaluation model. Chapter 4 is devoted to presenting our evaluation model, the Information Security Assurance Assessment Model (ISAAM), and the way it meets the evaluation requirements presented earlier. In this chapter the underlying concepts of assurance and trust are analysed; based on these two concepts, the structure of the evaluation model is developed to obtain a platform offering a certain level of guarantee, relying on three evaluation attributes, namely "the assurance structure", "the quality of the process", and "the achievement of requirements and objectives". The issues related to each of these evaluation attributes are analysed on the basis of the state of the art in research and the literature, the various existing methods, and the most common norms and standards in the security field. On this basis, three evaluation levels are constructed, namely the assurance level, the quality level and the maturity level, which together form the basis for evaluating an organization's overall security state.

The second part, "Application of the Information Security Assurance Assessment Model by security domain", also consists of four chapters. In this part, the evaluation model already constructed and analysed is placed in a specific context according to the four predefined security dimensions: the Organizational, Functional, Human and Legal dimensions. Each dimension and its specific evaluation is the subject of a separate chapter. For each dimension, a two-phase evaluation is constructed as follows. The first phase concerns the identification of the elements that form the basis of the evaluation:
- identification of the key elements of the evaluation;
- identification of the "Focus Areas" of each dimension, representing the security issues found in that dimension;
- identification of the "Specific Factors" of each Focus Area, representing the security and control measures that help resolve or reduce the impact of the risks.
The second phase concerns the evaluation of each of these dimensions. It consists, on the one hand, of applying the general evaluation model to the dimension concerned by:
- building on the elements specified in the first phase;
- identifying the specific security tasks, processes and procedures that should have been performed to reach the desired level of protection.
On the other hand, the evaluation of each dimension is completed by proposing a maturity model specific to that dimension, to be considered a reference baseline for the overall security level. For each dimension we propose a generic maturity model that any organization can use to specify its own security requirements. This constitutes an innovation in the field of evaluation, which we justify for each dimension and whose added value we systematically highlight. The third part of the document concerns the overall validation of our proposal and contains, by way of conclusion, a critical perspective on our work and final remarks; it is completed by a bibliography and annexes.

Our security evaluation model integrates and builds on numerous sources of expertise, such as best practices, norms, standards, methods and the expertise of scientific research in the field. Our constructive proposal addresses a genuine, still unresolved problem faced by all organizations, regardless of size and profile. It would allow them to specify their particular requirements regarding the security level to be met and to instantiate an evaluation process specific to their needs, so that they can ensure their information security is managed appropriately, thereby offering a certain level of confidence in the degree of protection provided. We have integrated into our model the best know-how, experience and expertise currently available internationally, with the aim of providing an evaluation model that is simple, generic and applicable to a large number of public or private organizations. The added value of our evaluation model lies precisely in the fact that it is sufficiently generic and easy to implement while answering organizations' concrete needs. Our proposal thus constitutes a reliable, efficient and dynamic evaluation tool resulting from a coherent evaluation approach. Consequently, our evaluation system can be implemented internally by the company itself, without additional resources, and likewise gives it the ability to better govern its information security.
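To make the dimension / Focus Area / Specific Factor hierarchy concrete, here is a minimal data-structure sketch with the weakest-link principle applied at the dimension level. The class names, the 0-5 maturity scale and the min() aggregation are our own illustration, not the published ISAAM specification.

```python
# Hypothetical sketch of the ISAAM-style hierarchy; names, scale and the
# min() aggregation (weakest link) are our illustration, not the thesis spec.
from dataclasses import dataclass, field

@dataclass
class SpecificFactor:        # a security measure or control
    name: str
    maturity: int            # assumed scale: 0 (absent) to 5 (optimized)

@dataclass
class FocusArea:             # a security issue within a dimension
    name: str
    factors: list = field(default_factory=list)

    def maturity(self) -> float:
        return sum(f.maturity for f in self.factors) / len(self.factors)

@dataclass
class Dimension:             # Organizational, Functional, Human or Legal
    name: str
    areas: list = field(default_factory=list)

    def maturity(self) -> float:
        return min(a.maturity() for a in self.areas)  # weakest link

human = Dimension("Human", [FocusArea("Awareness", [
    SpecificFactor("Security training", 3),
    SpecificFactor("Phishing exercises", 2)])])
print(human.name, human.maturity())  # -> Human 2.5
```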

Relevance: 20.00%

Abstract:

Cerebral microangiopathy (CMA) has been associated with executive dysfunction and fronto-parietal neural network disruption. Advances in magnetic resonance imaging allow more detailed analyses of gray matter (e.g., voxel-based morphometry, VBM) and white matter (e.g., diffusion tensor imaging, DTI) than traditional visual rating scales. The current study investigated patients with early CMA and healthy control subjects with all three approaches. Neuropsychological assessment focused on executive functions, the cognitive domain most discussed in CMA. The DTI and age-related white matter changes rating scales revealed convergent results showing widespread white matter changes in early CMA. Correlations were found in frontal and parietal areas exclusively with speeded, but not with speed-corrected, executive measures. The VBM analyses showed reduced gray matter in frontal areas. All three approaches confirmed the hypothesized fronto-parietal network disruption in early CMA. Innovative methods (DTI) converged with results from conventional methods (visual rating) while allowing greater spatial and tissue accuracy; they are thus valid additions to the analysis of the neural correlates of cognitive dysfunction. We found a clear distinction between speeded and nonspeeded executive measures in relationship to imaging parameters. Cognitive slowing is related to disease severity in early CMA and is therefore important for early diagnostics.

Relevance: 20.00%

Abstract:

Background/Aims: Cognitive dysfunction after medical treatment is increasingly being recognized. Studies on this topic require repeated cognitive testing within a short time; with repeated testing, however, practice effects must be expected. We quantified practice effects in a demographically corrected summary score of a neuropsychological test battery repeatedly administered to healthy elderly volunteers. Methods: The Consortium to Establish a Registry for Alzheimer's Disease (CERAD) Neuropsychological Assessment Battery (for which a demographically corrected summary score was developed), phonemic fluency tests, and trail-making tests were administered to healthy volunteers aged 65 years or older on days 0, 7, and 90. This battery allows calculation of a demographically adjusted continuous summary score. Results: Significant practice effects were observed in the CERAD total score and in the word list (learning and recall) subtest. Based on these volunteer data, we developed a threshold for the diagnosis of postoperative cognitive dysfunction (POCD) with the CERAD total score. Conclusion: Practice effects with repeated administration of neuropsychological tests must be accounted for in the interpretation of such tests; ignoring them may lead to an underestimation of POCD. The usefulness of the proposed demographically adjusted continuous score for cognitive function will have to be tested prospectively in patients.
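The abstract does not spell out how the POCD threshold was built from the volunteer data; a common generic approach to repeat testing is a reliable change index, which subtracts the mean practice gain of the control sample before scaling by the variability of those gains. The sketch below illustrates that generic idea with invented numbers; it is our assumption, not necessarily the authors' exact method.

```python
# Generic reliable-change sketch (our assumption; the paper's exact
# threshold construction is not given in the abstract). Invented numbers.
import statistics

def reliable_change_index(pre: float, post: float,
                          practice_gains: list) -> float:
    """Change score corrected for the mean practice effect seen in healthy
    volunteers, scaled by the SD of their retest gains."""
    mean_gain = statistics.mean(practice_gains)
    sd_gain = statistics.stdev(practice_gains)
    return (post - pre - mean_gain) / sd_gain

gains = [0.4, 0.2, 0.5, 0.3, 0.1, 0.6, 0.2, 0.4]  # made-up volunteer gains
rci = reliable_change_index(pre=0.0, post=-0.9, practice_gains=gains)
print(f"RCI = {rci:.2f}:", "possible POCD" if rci < -1.96 else "within expected range")
```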

Relevance: 20.00%

Abstract:

It has been reported in the literature that executive functions may be fractionated into updating, shifting, and inhibition. The present study aimed to explore whether these executive sub-components can be identified in a more age-heterogeneous sample and whether they are prone to age-related decline. We tested the performance of 81 individuals aged 18 to 88 years on each executive sub-component, as well as on working memory, fluid intelligence and processing speed. Correlation analysis revealed only a slight positive relationship between the two updating measures. A linear decrement with age was observed only for two complex executive tests. Tasks indexing working memory, processing speed and fluid intelligence showed a stronger linear decline with age than the executive tasks. In conclusion, our results did not replicate the executive structure known from the literature, and revealed that decline in executive function is not an unavoidable concomitant of aging but rather concerns specific executive tasks.

Relevance: 20.00%

Abstract:

BACKGROUND/AIMS: For many therapeutic decisions in Crohn's disease (CD), high-grade evidence is lacking. To assist clinical decision-making, explicit panel-based appropriateness criteria were developed by an international, multidisciplinary expert panel. METHODS: 10 gastroenterologists, 3 surgeons and 2 general practitioners from 12 European countries assessed the appropriateness of therapy for CD using the RAND Appropriateness Method. Their assessment was based on the study of a recent literature review of the subject, combined with their own expert clinical judgment. Panelists rated clinical indications and treatment options using a 9-point scale (1 = extremely inappropriate; 9 = extremely appropriate). These scenarios were then discussed in detail at the panel meeting and re-rated. Median ratings and disagreement were used to aggregate ratings into three assessment categories: appropriate (A), uncertain (U) and inappropriate (I). RESULTS: 569 specific indications were rated, dealing with 10 clinical presentations: mild/moderate luminal CD (n = 104), severe CD (n = 126), steroid-dependent CD (n = 25), steroid-refractory CD (n = 37), fistulizing CD (n = 49), fibrostenotic CD (n = 35), maintenance of medical remission of CD (n = 84), maintenance of surgical remission (n = 78), drug safety in pregnancy (n = 24) and use of infliximab (n = 7). Overall, 146 indications (26%) were judged appropriate, 129 (23%) uncertain and 294 (52%) inappropriate. Frank disagreement was low (14% overall), with the greatest disagreement (54% of scenarios) observed for treatment of steroid-refractory disease. CONCLUSIONS: Detailed explicit criteria for the appropriate use of therapy for CD were developed for the first time by a European expert panel. Disease location, severity and previous treatments were the main factors taken into account. User-friendly access to the EPACT criteria is available via an Internet site, www.epact.ch, allowing prospective evaluation and improvement of the appropriateness of current CD therapy.
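The aggregation step of the RAND Appropriateness Method can be summarized in a few lines: the panel median places an indication in one of three bands, and "disagreement" (under one common operationalization, roughly a third of panelists in each extreme tertile) forces the uncertain label. The sketch below is a generic illustration, not the EPACT panel's exact rule.

```python
# Generic illustration of RAND Appropriateness Method aggregation
# (one common operationalization, not the EPACT panel's exact rule).
from statistics import median

def classify(ratings: list) -> str:
    """Map 9-point panel ratings to appropriate/uncertain/inappropriate."""
    third = len(ratings) / 3
    low = sum(r <= 3 for r in ratings)   # votes in the 1-3 tertile
    high = sum(r >= 7 for r in ratings)  # votes in the 7-9 tertile
    if low >= third and high >= third:   # polarized panel -> disagreement
        return "uncertain (disagreement)"
    m = median(ratings)
    if m >= 7:
        return "appropriate"
    if m <= 3:
        return "inappropriate"
    return "uncertain"

# A 15-member panel split between the extremes -> disagreement.
print(classify([1, 2, 2, 3, 3, 8, 8, 9, 9, 9, 5, 5, 6, 2, 8]))
```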

Relevance: 20.00%

Abstract:

A growing number of studies have addressed the relationship between theory of mind (TOM) and executive functions (EF) in patients with acquired neurological pathology. To provide a global overview of the main findings, we conducted a systematic review of group studies in which we aimed to (1) evaluate the patterns of impaired and preserved abilities of both TOM and EF in groups of patients with acquired neurological pathology, and (2) investigate the existence of particular relations between different EF domains and TOM tasks. The search was conducted in Pubmed/Medline. A total of 24 articles met the inclusion criteria. We considered for analysis classical, clinically accepted TOM tasks (first- and second-order false belief stories, the Faux Pas test, Happe's stories, the Mind in the Eyes task, and cartoon tasks) and EF domains (updating, shifting, inhibition, and access). The review suggests that (1) EF and TOM appear tightly associated, although the few dissociations observed suggest they cannot be reduced to a single function; (2) no executive subprocess could be specifically associated with TOM performance; (3) the first-order false belief task and Happe's story task seem to be less sensitive to neurological pathologies and less associated with EF. Even though the analysis of the reviewed studies demonstrates a close relationship between TOM and EF in patients with acquired neurological pathology, the nature of this relationship must be further investigated. Studies investigating the ecological consequences of TOM and EF deficits, as well as intervention research, may bring further contributions to this question.

Relevance: 20.00%

Abstract:

Neurocritical care depends, in part, on careful patient monitoring, but as yet there are few data on which processes are the most important to monitor, how these should be monitored, and whether monitoring these processes is cost-effective and impacts outcome. At the same time, bioinformatics is a rapidly emerging field in critical care, but as yet there is little agreement or standardization on what information is important and how it should be displayed and analyzed. The Neurocritical Care Society, in collaboration with the European Society of Intensive Care Medicine, the Society of Critical Care Medicine, and the Latin America Brain Injury Consortium, organized an international, multidisciplinary consensus conference to begin to address these needs. International experts from neurosurgery, neurocritical care, neurology, critical care, neuroanesthesiology, nursing, pharmacy, and informatics were recruited on the basis of their research, publication record, and expertise. They undertook a systematic literature review to develop recommendations about specific topics on physiologic processes important to the care of patients with disorders that require neurocritical care. This review does not make recommendations about treatment, imaging, or intraoperative monitoring. A multidisciplinary jury, selected for their expertise in clinical investigation and the development of practice guidelines, guided this process. The GRADE system was used to develop recommendations based on the literature review, discussion, integration of the literature with the participants' collective experience, and critical review by an impartial jury. Emphasis was placed on the principle that recommendations should be based both on data quality and on trade-offs and translation into clinical practice. Strong consideration was given to providing pragmatic guidance and recommendations for bedside neuromonitoring, even in the absence of high-quality data.