825 results for risk management, interdisciplinarity
Abstract:
[spa] In this article we present a new reinsurance strategy, which we call the threshold reinsurance strategy, that acts differently depending on the level of the reserves. For reserve levels below a given threshold, the manager applies proportional reinsurance; for higher levels, considering that the portfolio has reached a certain solvency, he chooses not to cede any share of the risk. Analysing the effect of introducing threshold reinsurance on the survival probability, and comparing it with proportional reinsurance and with the option of not reinsuring, allows us to find reinsurance strategies that are equivalent from a solvency point of view. Keywords: risk theory, threshold reinsurance, proportional reinsurance, survival probability.
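The comparison described above is easy to probe numerically. The following is a minimal Monte Carlo sketch (a simulation, not the paper's analytic treatment) that estimates finite-time survival probabilities for a compound Poisson surplus process under the threshold strategy; the exponential claim sizes, the parameter values, and all function names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(42)

    def survival_prob(u0, b, k, c, lam, mean_claim, horizon, n_paths=5000):
        # Estimate P(no ruin before `horizon`) for a compound Poisson surplus
        # process under a threshold reinsurance strategy: below reserve level b
        # the insurer retains a proportion k of premiums and claims; at or
        # above b, no risk is ceded (full retention).
        alive = 0
        for _ in range(n_paths):
            u, t, ruined = float(u0), 0.0, False
            while True:
                dt = min(rng.exponential(1.0 / lam), horizon - t)
                # accrue premiums, switching rate if the threshold is crossed
                if u < b:
                    t_cross = (b - u) / (k * c)   # time to reach b at retained rate
                    if dt > t_cross:
                        u = b + c * (dt - t_cross)
                    else:
                        u += k * c * dt
                else:
                    u += c * dt
                t += dt
                if t >= horizon:
                    break
                x = rng.exponential(mean_claim)   # individual claim amount
                u -= (k if u < b else 1.0) * x    # retained share of the claim
                if u < 0:
                    ruined = True
                    break
            alive += not ruined
        return alive / n_paths

    # threshold strategy vs. purely proportional (b = inf) vs. no reinsurance (b = 0)
    for label, b in [("threshold", 10.0), ("proportional", np.inf), ("none", 0.0)]:
        print(label, survival_prob(u0=5.0, b=b, k=0.7, c=1.2, lam=1.0,
                                   mean_claim=1.0, horizon=100.0))

Setting b = 0 reproduces the no-reinsurance case and b = inf the purely proportional case, so the three strategies the article compares come out of the same routine.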
Abstract:
Despite the development of many effective antihypertensive drugs, target blood pressures are reached in only a minority of patients in clinical practice. Poor adherence to drug therapy and the occurrence of side effects are among the main reasons commonly reported by patients and physicians to explain the poor results of current antihypertensive therapies. The development of new effective antihypertensive agents with an improved tolerability profile might help to partly overcome these problems. Lercanidipine is an effective third-generation dihydropyridine calcium channel blocker characterized by a long half-life and high lipophilicity. In contrast to first-generation dihydropyridines, lercanidipine does not induce reflex tachycardia and causes peripheral edema at a lower incidence. Recent data suggest that, in addition to lowering blood pressure, lercanidipine might have some renal protective properties. In this review we shall discuss the problems of drug adherence in the management of hypertension, with a special emphasis on lercanidipine.
Abstract:
In this paper we deal with the identification of dependencies between time series of equity returns. Marginal distribution functions are assumed to be known, and a bivariate chi-square test of fit is applied in a fully parametric copula approach. Several families of copulas are fitted to Spanish stock market data and compared. The results show that the t-copula generally outperforms the other dependence structures, and they highlight the difficulty of fitting a significant number of bivariate data series.
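As a rough sketch of the workflow described (fit several parametric copulas to pseudo-observations, then judge fit with a bivariate chi-square test), the snippet below fits a t-copula and runs a crude grid-based chi-square test. The rank-based margins, the tau-inversion estimate of the correlation, the profile grid over the degrees of freedom, and the simulated cell frequencies are all simplifying assumptions rather than the authors' exact procedure.

    import numpy as np
    from scipy import stats
    from scipy.special import gammaln

    def t_copula_loglik(u, v, rho, nu):
        # Log-likelihood of a bivariate t-copula at pseudo-observations (u, v).
        x, y = stats.t.ppf(u, nu), stats.t.ppf(v, nu)
        q = (x**2 - 2.0 * rho * x * y + y**2) / (1.0 - rho**2)
        log_joint = (gammaln((nu + 2.0) / 2.0) - gammaln(nu / 2.0)
                     - np.log(nu * np.pi) - 0.5 * np.log(1.0 - rho**2)
                     - (nu + 2.0) / 2.0 * np.log1p(q / nu))
        return float(np.sum(log_joint - stats.t.logpdf(x, nu) - stats.t.logpdf(y, nu)))

    def fit_t_copula(r1, r2):
        # Pseudo-observations from ranks; correlation from Kendall's tau
        # inversion; degrees of freedom from a coarse profile-likelihood grid.
        n = len(r1)
        u = stats.rankdata(r1) / (n + 1.0)
        v = stats.rankdata(r2) / (n + 1.0)
        rho = np.sin(0.5 * np.pi * stats.kendalltau(r1, r2)[0])
        nus = np.arange(2, 31)
        nu = nus[int(np.argmax([t_copula_loglik(u, v, rho, d) for d in nus]))]
        return u, v, rho, nu

    def chi2_gof(u, v, rho, nu, m=4, n_mc=200_000, seed=0):
        # Crude chi-square test of fit: observed counts on an m x m grid of the
        # unit square vs. frequencies simulated from the fitted t-copula.
        rng = np.random.default_rng(seed)
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_mc)
        w = rng.chisquare(nu, size=n_mc) / nu
        sim = stats.t.cdf(z / np.sqrt(w)[:, None], nu)   # sample from the t-copula
        edges = np.linspace(0.0, 1.0, m + 1)
        obs, _, _ = np.histogram2d(u, v, bins=[edges, edges])
        ref, _, _ = np.histogram2d(sim[:, 0], sim[:, 1], bins=[edges, edges])
        expected = ref / n_mc * len(u)
        statistic = float(np.sum((obs - expected) ** 2 / expected))
        dof = m * m - 1 - 2                              # two fitted parameters
        return statistic, stats.chi2.sf(statistic, dof)

    # Synthetic heavy-tailed 'returns' stand in for the equity data.
    gen = np.random.default_rng(1)
    z = gen.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=1500)
    r = z / np.sqrt(gen.chisquare(4, size=1500) / 4)[:, None]
    u, v, rho, nu = fit_t_copula(r[:, 0], r[:, 1])
    print("rho =", round(float(rho), 3), "nu =", int(nu), "gof =", chi2_gof(u, v, rho, nu))

Repeating the likelihood and chi-square computation for other copula families (Gaussian, Clayton, Gumbel) and comparing the results mirrors the kind of model comparison the abstract reports.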
Abstract:
Rock slope instabilities such as rock slides, rock avalanches, or deep-seated gravitational slope deformations are widespread in Alpine valleys. These phenomena are at once a major factor controlling the erosion of mountain belts and a significant natural hazard that inflicts heavy losses on mountain communities. However, the potential geometrical and dynamic connections linking outcrop-scale and slope-scale instabilities are often unknown. A more detailed definition of these potential links is essential to improve our understanding of destabilization processes and to obtain a more complete hazard characterization of rock instabilities at different spatial scales. In order to propose an integrated approach to the study of rock slope instabilities, three main themes were analysed in this PhD thesis: (1) the inventory and spatial distribution of rock slope deformations at the regional scale and their influence on landscape evolution, (2) the influence of brittle and ductile tectonic structures on the development of rock slope instabilities, and (3) the characterization of the hazard posed by potential rock slope instabilities through the development of conceptual instability models. To address these topics in an integrated way, several techniques were adopted. In particular, high-resolution digital elevation models proved to be fundamental tools, employed throughout the different stages of the rock slope instability assessment. Special attention was paid to the application of digital elevation models for detailed geometrical modelling of past and potential instabilities and for rock slope monitoring at different spatial scales. Detailed field analyses and numerical models were performed to complement and verify the remote sensing approach. In the first part of this thesis, large slope instabilities in the Rhone valley (Switzerland) were mapped in order to provide a first overview of the tectonic and climatic factors influencing their distribution and characteristics. Our analyses demonstrate the key influence of neotectonic activity and of glacial conditioning on the spatial distribution of rock slope deformations. The volumes of the rock instabilities identified along the main Rhone valley were then used to propose a first estimate of the postglacial denudation and of the filling of the Rhone valley associated with large gravitational movements. In the second part of the thesis, detailed structural analyses of the Frank Slide and the Sierre rock avalanche were performed to characterize the influence of brittle and ductile tectonic structures on the geometry and failure mechanism of large instabilities. Our observations indicate that the geometric characteristics and the variation in rock mass quality associated with ductile tectonic structures, which are often ignored in landslide studies, are important factors that can drastically influence the extent and failure mechanism of rock slope instabilities. In the last part of the thesis, the failure mechanisms and the hazard associated with five potential instabilities were analysed in detail. These case studies clearly highlight the importance of combining different analysis and monitoring techniques in order to obtain reliable hazard scenarios. This information, together with the development of a conceptual instability model, provides the primary data for an integrated risk management of rock slope instabilities.
- Slope movements such as rock falls, rock slides, or slower phenomena such as deep-seated gravitational slope deformations are common occurrences in mountainous regions. Slope movements are at once one of the main factors controlling the progressive destruction of orogenic belts and a concrete natural hazard that can cause significant damage. Yet gravitational phenomena are rarely analysed in their entirety, and the geometrical and mechanical relationships linking slope-scale instabilities to local instabilities remain poorly defined. A better characterization of these links could nevertheless contribute substantially to understanding slope destabilization processes and improve the characterization of gravitational hazards at all spatial scales. With the aim of proposing a more global approach to the problem of gravitational movements, this thesis pursues three main lines of research: (1) the inventory and analysis of the spatial distribution of large rock instabilities at the regional scale, (2) the analysis of brittle and ductile tectonic structures in relation to the failure mechanisms of large rock instabilities, and (3) the characterization of rock hazards through a multidisciplinary approach aimed at developing a conceptual model of the instability and a better appreciation of the hazard. Various techniques were used to address the different problems treated in this thesis. In particular, the digital terrain model proved to be an indispensable tool for most of the analyses carried out, from the identification of an instability to the monitoring of its movements. Field analyses and numerical modelling then complemented the information derived from the digital terrain model. In the first part of this thesis, rock slope movements in the Rhone valley (Switzerland) were mapped in order to study their distribution as a function of regional geological and morphological variables. In particular, the analyses highlighted the influence of neotectonic activity and of glacial phases on the distribution of zones with a high density of rock instabilities. The volumes of the rock instabilities identified along the main valley were then used to estimate the postglacial denudation rate and the filling of the Rhone valley linked to large gravitational movements. In the second part, the study of the structural setting of the Sierre (Switzerland) and Frank (Canada) rock avalanches allowed a better characterization of the passive influence of tectonic structures on the geometry of instabilities. In particular, structures inherited from ductile tectonics, often ignored in the study of gravitational instabilities, were identified as major structures controlling the failure mechanisms of instabilities at different scales. In the last part of the thesis, five different rock instabilities were studied through a multidisciplinary approach aimed at better characterizing the hazard and at developing a three-dimensional conceptual model of these instabilities. These analyses demonstrated the need to combine different analysis and monitoring techniques for a more objective management of the risk associated with large rock instabilities.
Abstract:
The emergence of powerful new technologies, the existence of large quantities of data, and increasing demands for the extraction of added value from these technologies and data have created a number of significant challenges for those charged with both corporate and information technology management. The possibilities are great, the expectations high, and the risks significant. Organisations seeking to employ cloud technologies and exploit the value of the data to which they have access, be this in the form of "Big Data" available from different external sources or data held within the organisation, in structured or unstructured formats, need to understand the risks involved in such activities. Data owners have responsibilities towards the subjects of the data and must also, frequently, demonstrate that they are in compliance with current standards, laws and regulations. This thesis sets out to explore the nature of the technologies that organisations might utilise, identify the most pertinent constraints and risks, and propose a framework for the management of data from discovery to external hosting that will allow the most significant risks to be managed through the definition, implementation, and performance of appropriate internal control activities.
Abstract:
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are therefore of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man-oriented), FMEA (system-oriented), or HAZOP (process-oriented), is not satisfactory. The use of a dynamic modeling approach that allows multiple-oriented analyses may constitute an alternative that overcomes this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized with an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces) describing its constitutive machines. Process-related factors are introduced explicitly through machine interconnections and flow properties. Man-machine interactions are modeled as triggering events for the state spaces of the machines, and the CREAM cognitive behavior model is used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities determine the state changes in each machine. Interconnecting machines implies the composition of such flows and, consequently, the interconnection of the measure constraints. This is reflected in the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization within a clear mathematical framework. The use of Petri nets to perform multiple-oriented analyses opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. Above all, it opens perspectives in the fields of risk comparison and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
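To give a concrete feel for the mechanism described (tokens carrying a measure, constraints on that measure deciding state changes, and interconnection composing the constraints), here is a toy guarded Petri net in Python. It is a deliberately small stand-in for the CO-OPN tool, and every name and number in it is invented for illustration.

    from collections import defaultdict

    class Transition:
        def __init__(self, name, src, dst, guard, update=lambda m: m):
            self.name, self.src, self.dst = name, src, dst
            self.guard = guard      # constraint on the token's measure
            self.update = update    # how the measure changes on firing

    class Net:
        def __init__(self, marking):
            # place name -> list of token measures (e.g. mass of a part in kg)
            self.places = defaultdict(list, {p: list(ms) for p, ms in marking.items()})

        def step(self, transitions):
            # Fire the first transition whose source place holds a token
            # satisfying its measure constraint; return its name, or None.
            for t in transitions:
                for m in self.places[t.src]:
                    if t.guard(m):
                        self.places[t.src].remove(m)
                        self.places[t.dst].append(t.update(m))
                        return t.name
            return None

    # Two "machines" in series: a press accepts parts lighter than 50 kg and
    # halves their mass; a grinder then accepts pressed parts under 20 kg.
    press = Transition("press", "in", "pressed", lambda m: m < 50, lambda m: m * 0.5)
    grind = Transition("grind", "pressed", "done", lambda m: m < 20, lambda m: m * 0.9)

    net = Net({"in": [30.0, 80.0]})
    while (fired := net.step([press, grind])) is not None:
        print(fired, dict(net.places))
    # The 80 kg part never satisfies the press constraint and stays blocked,
    # while the 30 kg part flows through both machines.

Chaining the press and the grinder composes their measure constraints, which is the toy analogue of the constraint enrichment hierarchies mentioned above.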
Abstract:
[spa] In a compound Poisson model, we define a threshold proportional reinsurance strategy: a retention level k1 is applied whenever the reserves are below a given threshold b, and a retention level k2 otherwise. We obtain the integro-differential equation for the Gerber-Shiu function, defined in Gerber-Shiu (1998), in this model, which allows us to derive expressions for the ruin probability and for the Laplace transform of the time of ruin for different distributions of the individual claim amount. Finally, we present some numerical results.
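For context, the Gerber-Shiu function named here has a standard definition (quoted from the general literature, not from this abstract): it is the expected discounted penalty at ruin,

\phi(u) = \mathbb{E}\left[ e^{-\delta T}\, w\big(U(T^{-}),\, |U(T)|\big)\, \mathbf{1}_{\{T<\infty\}} \,\middle|\, U(0)=u \right],

where U(t) is the surplus process, T is the time of ruin, \delta \ge 0 is a discount rate, and w is a penalty function of the surplus immediately before ruin and the deficit at ruin. Choosing \delta = 0 and w \equiv 1 recovers the ruin probability, while w \equiv 1 with \delta > 0 yields the Laplace transform of the time of ruin, which is how both quantities mentioned above follow from the single integro-differential equation.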
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano" legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to the risk assessment of ENMs, encompassing the key parameters for characterising ENMs, appropriate methods of analysis, and the best approach for expressing the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn: Owing to the high batch variability of the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Alongside the OECD priority list of ENMs, other criteria for the selection of ENMs, such as relevance for mechanistic (scientific) or risk-assessment-based studies, widespread availability (and thus high expected volumes of use), or consumer concern (route of consumer exposure depending on the application), could be helpful. The OECD priority list focuses on the validity of OECD tests, so source material will be first in scope for testing. For risk assessment, however, it is much more relevant to have toxicity data for the material as present in the products and matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally, these methods can determine only a single characteristic, and some of them can be rather expensive. In practice, it is currently not feasible to characterise ENMs fully. Many techniques available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques be employed to determine a given metric of ENMs. The first great challenge is to prioritise the metrics that are relevant for assessing biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that a single metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach or protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that protocols be exchanged. The precise methods used to disperse ENMs should be specifically, yet succinctly, described in the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro / in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
This is the second edition of the compendium. Since the first edition, a number of important initiatives have been launched in the shape of large projects targeting the integration of research infrastructure and new technology for toxicity studies and exposure monitoring. The demand for research on the human health and environmental safety management of nanotechnologies has been present for a decade and has been identified by several landmark reports and studies. Several guidance documents have been published; it is not the intention of this compendium to report on these, as they are widely available. Nor is it the intention to publish scientific papers and research results, as that task is covered by scientific conferences and the peer-reviewed press. The intention of the compendium is to bring researchers together, create synergy in their work, and establish links and communication between them, mainly during the actual research phase before the publication of results. To this end, we find it useful to emphasise the communication of projects' strategic aims, extensive coverage of specific work objectives and of the methods used in research, strengthening human capacities and laboratory infrastructure, and supporting collaboration towards common goals and the joint elaboration of future plans, without compromising scientific publication potential or IP rights. These targets are far from being achieved with the publication in its present shape. We shall continue working, though, and hope, with the assistance of the research community, to make significant progress. The publication will take the shape of a dynamic, frequently updated, web-based document available free of charge to all interested parties. Researchers in this domain are invited to join the effort by communicating the work being done.
Abstract:
The European Commission supports the NanoImpactNet project, a European network on the health and environmental impact of nanomaterials, in order to coordinate research on the safe and responsible development of nanomaterials. Since April 2008, NanoImpactNet has organised 14 conferences and workshops for academic toxicologists and ecotoxicologists, as well as occupational hygienists from the industries manufacturing and using nanomaterials in Europe, government officials, and civil society. Communication among its 260 members and all the stakeholders affected by this cross-cutting technology is imperative, above all for the workers in direct contact with these innovative materials. www.nanoimpactnet.eu
Abstract:
France amended its constitution in 2005 to include a Charter for the Environment. The Charter lays out France's commitment to supporting the right to a 'balanced environment'. This article first traces the Charter's origins to a legacy-building presidential initiative. Jacques Chirac decided to invest in a neglected policy domain in which his own majority had shown little interest. He was obliged to intervene repeatedly in order to bring this project to a successful conclusion. In doing so, he staked out environmental affairs as an area of potential presidential supremacy. Next, the content of the Charter is examined. In this document, French traditions of universalism come together with an international movement for anticipatory environmental protection. This is reflected in the constitutionalisation of the precautionary principle, which emerged as the most controversial part of the Charter. The debates this provoked tended to caricature a risk-management principle whose meaning has been carefully refined to forestall objections. Finally, the Charter's potential efficacy is analysed. The post-Charter record of legislative and judicial activity concerning the environment is meagre, but not wholly inauspicious.
Abstract:
OBJECTIVE: The Healthy Heart Kit (HHK) is a risk management and patient education kit for the prevention of cardiovascular disease (CVD) and the promotion of cardiovascular health. There are currently no published data examining predictors of HHK use by physicians. The main objective of this study was to examine the association between physicians' characteristics (socio-demographic, cognitive, and behavioural) and use of the HHK. METHODS: All registered family physicians in Alberta (n=3068) were invited to participate in the "Healthy Heart Kit" Study. Consenting physicians (n=153) received the Kit and were asked to use it for two months. At the end of this period, a questionnaire collected data on the frequency of Kit use by physicians, as well as socio-demographic, cognitive, and behavioural variables pertaining to the physicians. RESULTS: The questionnaire was returned by 115 physicians (follow-up rate = 75%). On a scale ranging from 0 to 100, the mean score of Kit use was 61 [SD=26]. A multiple linear regression showed that "agreement with the Kit" and the degree of "confidence in using the Kit" were strongly associated with Kit use, together explaining 46% of the variability in Kit use. Time since graduation was inversely associated with Kit use, and a trend was observed for smaller practices to be associated with lower use. CONCLUSION: Given these findings, future research and practice should explore innovative strategies for gaining initial agreement among physicians to employ such clinical tools. The participation of older physicians and solo practitioners in this process should be emphasized.