84 results for ANIMAL RISK ANALYSIS
at Université de Lausanne, Switzerland
Abstract:
Commentary on: Gaziano TA, Young CR, Fitzmaurice G, Atwood S, Gaziano JM. Laboratory-based versus non-laboratory-based method for assessment of cardiovascular disease risk: the NHANES I Follow-up Study cohort. Lancet. 2008;371(9616):923-31. PMID: 18342687
Abstract:
This paper presents a prototype of an interactive web-GIS tool for risk analysis of natural hazards, in particular floods and landslides, based on open-source geospatial software and technologies. The aim of the presented tool is to assist experts (risk managers) in analysing the impacts and consequences of a given hazard event in a considered region, providing an essential input to the decision-making process in the selection of risk management strategies by responsible authorities and decision makers. The tool is based on the Boundless (OpenGeo Suite) framework and its client-side environment for prototype development, and it is one of the main modules of a web-based collaborative decision support platform for risk management. Within this platform, users can import the maps and information necessary to analyse areas at risk. Based on the provided information and parameters, loss scenarios (amount of damage and number of fatalities) of a hazard event are generated on the fly and visualized interactively within the web-GIS interface of the platform. The annualized risk is calculated by combining the resulting loss scenarios with the different return periods of the hazard event. The application of this prototype is demonstrated using a regional data set from one of the case study sites of the Marie Curie ITN CHANGES project, the Fella River of northeastern Italy.
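The annualized-risk calculation described above — combining loss scenarios for different return periods — can be sketched as the integral of loss over annual exceedance probability. The sketch below uses the trapezoidal rule; the scenario losses and return periods are hypothetical examples, since the platform's actual formulas are not given in the abstract.

```python
# Annualized risk as the integral of loss over annual exceedance
# probability, approximated with the trapezoidal rule.
# Scenario losses and return periods below are hypothetical examples.

def annualized_risk(scenarios):
    """scenarios: list of (return_period_years, loss) tuples."""
    # Annual exceedance probability of each scenario is 1 / return period
    pts = sorted((1.0 / rp, loss) for rp, loss in scenarios)
    risk = 0.0
    for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
        risk += 0.5 * (l0 + l1) * (p1 - p0)  # trapezoid between scenarios
    return risk

# Example: losses (e.g. EUR) for 30-, 100- and 300-year events
print(annualized_risk([(30, 2.0e6), (100, 8.0e6), (300, 2.0e7)]))
```

With the three example scenarios this yields an expected annual loss on the order of a few hundred thousand per year, which is the quantity a risk map would display per exposed element.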
Abstract:
A computerized handheld procedure is presented in this paper. It is intended as a complementary tool to a database, to enhance prospective risk analysis in the field of occupational health. The Pendragon Forms software (version 3.2) was used to implement acquisition procedures on Personal Digital Assistants (PDAs) and to transfer data to a computer in MS-Access format. The proposed data acquisition strategy relies on the risk assessment method practiced at the Institute of Occupational Health Sciences (IST). It involves the use of a systematic hazard list and semi-quantitative risk assessment scales. A set of 7 modular forms has been developed to cover the basic needs of field audits. Despite the minor drawbacks observed, the results obtained so far show that handhelds are adequate to support field risk assessment and follow-up activities. Further improvements must still be made to increase the tool's effectiveness and field adequacy.
Abstract:
Unlike fragmental rockfall runout assessments, there are only a few robust methods to quantify rock-mass-failure susceptibilities at regional scale. A detailed slope angle analysis of recent Digital Elevation Models (DEM) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper adds to the Slope Angle Distribution of the cliff units its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass-failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Thus, taking both susceptibility results into consideration, this approach can be used to establish, after calibration, hazard and risk maps at regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.
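The normalized cumulative distribution used as a susceptibility weight can be sketched as follows: each DEM cell inside a detected source area is weighted by the fraction of cells whose slope angle does not exceed its own. This is a minimal illustration with hypothetical slope values, not the Flow-R implementation.

```python
# Normalized empirical cumulative distribution of slope angles inside a
# cliff unit, used as a relative rock-mass-failure susceptibility weight.
# Slope values (degrees) are hypothetical, for illustration only.
import bisect

def susceptibility_weights(slopes_deg):
    """Map each cell's slope angle to its normalized empirical CDF value."""
    s = sorted(slopes_deg)
    n = len(s)
    # weight = fraction of cells with slope <= this cell's slope
    return [bisect.bisect_right(s, x) / n for x in slopes_deg]

slopes = [48, 55, 62, 70, 75, 80]   # DEM-derived slope angles (example)
print(susceptibility_weights(slopes))
```

Steeper cells receive weights closer to 1, so block-release susceptibility increases monotonically with slope angle inside the unit.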
Abstract:
Angio-oedema (AE) is a known adverse effect of angiotensin converting enzyme inhibitor (ACE-I) therapy. Over the past several decades, evidence of failure to diagnose this important and potentially fatal reaction is commonly found in the literature. Because this reaction is often seen first in the primary care setting, a review was undertaken to analyse and document the keys to both diagnostic criteria as well as to investigate potential risk factors for ACE-I AE occurrence. A general review of published literature was conducted through Medline, EMBASE, and the Cochrane Database, targeting ACE-I-related AE pathomechanism, diagnosis, epidemiology, risk factors, and clinical decision making and treatment. The incidence and severity of AE appears to be on the rise and there is evidence of considerable delay in diagnosis contributing to significant morbidity and mortality for patients. The mechanism of AE due to ACE-I drugs is not fully understood, but some genomic and metabolomic information has been correlated. Additional epidemiologic data and clinical treatment outcome predictors have been evaluated, creating a basis for future work on the development of clinical prediction tools to aid in risk identification and diagnostic differentiation. Accurate recognition of AE by the primary care provider is essential to limit the rising morbidity associated with ACE-I treatment-related AE. Research findings on the phenotypic indicators relevant to this group of patients as well as basic research into the pathomechanism of AE are available, and should be used in the construction of better risk analysis and clinical diagnostic tools for ACE-I AE.
Abstract:
Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC, the process category (PROC) is the most important factor. A failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
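Local (one-at-a-time) sensitivity analysis, as used in the study above, perturbs each input factor around a baseline and records the relative change in the model output. The toy multiplicative model below is a hypothetical stand-in for tools like Stoffenmanager or ART (whose modifying factors multiply together); the factor names and values are illustrative.

```python
# One-at-a-time local sensitivity analysis on a toy multiplicative
# exposure model (hypothetical stand-in for REACH tools; in such models
# the exposure estimate is a product of modifying factors).
import math

def exposure(factors):
    # Multiplicative model: exposure = product of all modifying factors
    return math.prod(factors.values())

def local_sensitivity(factors, delta=0.1):
    """Relative change in output for a +10% change in each factor."""
    base = exposure(factors)
    sens = {}
    for name in factors:
        perturbed = dict(factors, **{name: factors[name] * (1 + delta)})
        sens[name] = (exposure(perturbed) - base) / base
    return sens

factors = {"emission": 2.0, "local_controls": 0.5, "dilution": 1.5}
print(local_sensitivity(factors))
```

For a purely multiplicative model every factor has the same local sensitivity (+10% in, +10% out); real tools deviate from this because factors enter on different scales, which is exactly what a dominance analysis reveals.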
Abstract:
A review of extinction risk analysis and viability methods is presented. The importance of environmental, demographic and genetic uncertainties, as well as the role of catastrophes are successively considered, and different approaches aiming at the integration of these risk factors in predictive population dynamic models are discussed.
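The risk factors listed above — environmental and demographic stochasticity plus catastrophes — are typically combined in simulation-based viability analyses. The following is a minimal sketch of such a model; the growth-rate variance, catastrophe probability, and quasi-extinction threshold are illustrative assumptions, not values from the review.

```python
# Minimal stochastic population viability sketch: estimate the probability
# of (quasi-)extinction under environmental variability and catastrophes.
# All parameter values are illustrative assumptions.
import math
import random

def extinction_probability(n0=50, years=100, runs=2000, p_cat=0.02, seed=1):
    """Fraction of simulated trajectories falling below 1 individual."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(runs):
        n = float(n0)
        for _ in range(years):
            r = rng.gauss(0.0, 0.3)      # environmental stochasticity
            if rng.random() < p_cat:     # rare catastrophe halves the population
                n *= 0.5
            n *= math.exp(r)             # multiplicative annual growth
            if n < 1.0:                  # quasi-extinction threshold
                extinct += 1
                break
    return extinct / runs

print(extinction_probability())
```

Genetic uncertainty (e.g. inbreeding depression lowering the mean growth rate) could be layered on the same skeleton, which is how the integrated models discussed in the review combine the different risk factors.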
Abstract:
Risk management is often assessed through linear methods which stress positioning and causal logical frameworks: to such events correspond such consequences and such risks accordingly. Consideration of the interrelationships between risks is often overlooked, and risks are rarely analyzed in their dynamic and nonlinear components.
This work shows what systemic methods, including the study of complex systems, can bring to the understanding, management and anticipation of business risks, on both the conceptual and the practical side. Starting from the definitions of systems and risks in various areas, as well as the methods used to manage risk, this work confronts these concepts with approaches from complex-systems analysis and modeling. It highlights the reductive effects of some business risk analysis methods, as well as the limitations of risk universes caused in particular by unsuitable definitions. As a result, this work also provides chief officers with a range of different tools and approaches that give them a better grasp of complexity and thus greater efficiency in their risk management practices. This yields a better fit between strategy and risk management, and ultimately a gain in the firm's risk management maturity.
Abstract:
This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases where the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the problems of analysing and modelling geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns taking into account real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools.
To demonstrate the application of machine learning algorithms several interesting case studies are considered: digital soil mapping using SVM, automatic mapping of soil and water system pollution using ANN; natural hazards risk analysis (avalanches, landslides), assessments of renewable resources (wind fields) with SVM and ANN models, etc. The dimensionality of spaces considered varies from 2 to more than 30. Figures 1, 2, 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional models of geostatistics.
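The core idea — learning a spatial classification directly from empirical point data — can be illustrated with the simplest possible learner. The sketch below uses a k-nearest-neighbour classifier on synthetic 2-D coordinates as a stand-in for the ANN/SVM models discussed; the "hazard"/"safe" labels and coordinates are invented for illustration.

```python
# Learning a spatial classification from point observations:
# k-nearest-neighbour on synthetic 2-D coordinates (illustrative stand-in
# for the ANN/SVM models of the report; all data below is synthetic).
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of ((x, y), label); query: (x, y) to classify."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Synthetic "hazard" / "safe" observations on a map
train = [((0, 0), "safe"), ((1, 0), "safe"), ((0, 1), "safe"),
         ((5, 5), "hazard"), ((6, 5), "hazard"), ((5, 6), "hazard")]
print(knn_predict(train, (0.5, 0.5)))  # → safe
print(knn_predict(train, (5.5, 5.5)))  # → hazard
```

In a real geo-feature space the 2-D coordinates would be replaced by a vector of many derived features (elevation, slope, remote-sensing bands, ...), which is where the high-dimensional advantage of ANN/SVM over classical geostatistics comes in.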
Abstract:
Nanoparticles <100 nanometres are being introduced into industrial processes, but they are suspected to cause similar negative health effects to ambient particles. Poor knowledge about the scale of introduction has not allowed global risk analysis until now. In 2006 a targeted telephone survey among Swiss companies (1) showed the usage of nanoparticles in a few selected companies but did not provide data to extrapolate to the full Swiss workforce. The purpose of the study presented here was to provide a quantitative estimate of the potential occupational exposure to nanoparticles in Swiss industry. Method: A layered representative questionnaire survey among 1626 Swiss companies of the production sector was conducted in 2007. The survey was a written questionnaire, collecting data about the used nanoparticles, the number of potentially exposed persons in the companies and their protection strategy. Results: The response rate of the study was 58.3%. The number of companies estimated to be using nanoparticles in Switzerland was 586 (95% Confidence Interval 145 to 1027). It is estimated that 1309 workers (95% CI 1073 to 1545) do their job in the same room as a nanoparticle application. Personal protection was shown to be the predominant protection means. Such information is valuable for risk evaluation. The low number of companies dealing with nanoparticles in Switzerland suggests that policy makers as well as health, safety and environmental officers within companies can focus their efforts on a relatively small number of companies or workers. The collected data about types of particles and applications may be used for research on prevention strategies and adapted protection means. However, to reflect the most recent trends, the information presented here has to be continuously updated, and a large-scale inventory of the usage should be considered.
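Extrapolating a survey count to a national total with a confidence interval, as done above, can be sketched with a simple normal approximation. All numbers below are illustrative: the actual study used a layered (stratified) design whose weights are not given in the abstract, so this sketch will not reproduce the published 586 (145 to 1027) figures.

```python
# Sketch: extrapolate a survey proportion to a population total with a
# 95% normal-approximation confidence interval. Numbers are illustrative;
# the study's stratified weighting is not reproduced here.
import math

def extrapolate(n_sampled, n_positive, population_total):
    p = n_positive / n_sampled                     # sample proportion
    se = math.sqrt(p * (1 - p) / n_sampled)        # standard error of p
    est = population_total * p                     # extrapolated count
    half = 1.96 * population_total * se            # 95% half-width
    return est, est - half, est + half

est, lo, hi = extrapolate(n_sampled=200, n_positive=12, population_total=5000)
print(round(est), round(lo), round(hi))
```

The wide interval for a small number of positive responses mirrors the study's own result, where the point estimate (586 companies) carries an interval spanning roughly a factor of seven.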
Abstract:
Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of the global system performance is usually lacking to support both design and operation adequately. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, as well as operation in degraded mode. Its implementation on a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, contributing actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are a typical example of large complex systems made of multiple hierarchical layers. The proposed approach appears appropriate for assessing the global risk and availability level of the system as well as for identifying its vulnerabilities. This enriched-FMECA approach makes it possible to overcome some of the limitations and pitfalls previously reported with classical FMECA approaches.
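The criticality ranking at the heart of an FMECA can be sketched as follows: each failure mode is scored on severity, occurrence, and detectability, multiplied into a risk priority number (RPN), and the modes are ranked. The signalling-flavoured failure modes and 1-10 scores below are hypothetical examples, not entries from the study.

```python
# Minimal FMECA-style criticality ranking: risk priority number
# RPN = severity x occurrence x detectability, modes ranked by RPN.
# Failure modes and scores below are illustrative, not from the study.
def rank_failure_modes(modes):
    """modes: dict name -> (severity, occurrence, detectability), 1-10 scales."""
    rpn = {name: s * o * d for name, (s, o, d) in modes.items()}
    return sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)

modes = {
    "signal lamp burnt out":     (8, 3, 2),
    "track circuit false clear": (10, 2, 6),
    "point machine jammed":      (6, 4, 3),
}
print(rank_failure_modes(modes))
```

The enriched approach described in the abstract extends such per-mode scoring with cross-cutting factors (procedures, alarms, degraded-mode operation) linked across the system's functional layers.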
Abstract:
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are, therefore, of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man-oriented), FMEA (system-oriented), or HAZOP (process-oriented), is not satisfactory. The use of a dynamic modeling approach allowing multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized on an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced, in an explicit way, through machine interconnections and flow properties. While man-machine interactions are modeled as triggering events for the state spaces of the machines, the CREAM cognitive behavior model is used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework.
The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. But, most of all, it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
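The place/transition mechanics underlying such models can be sketched in a few lines: a marking assigns tokens to places, and a transition fires by consuming tokens from its pre-places and producing tokens in its post-places. The toy load/process/unload machine cycle below is an invented illustration, not the actual MORM/CO-OPN model.

```python
# Minimal place/transition Petri net sketch, echoing the state-space idea
# behind MORM (the net below is a toy machine cycle, not the actual model).
def enabled(marking, pre):
    """A transition is enabled if every pre-place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume tokens from pre-places, produce tokens in post-places."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Transitions: (pre-places, post-places) for a load -> process -> unload cycle
transitions = {
    "load":    ({"idle": 1, "part_in": 1}, {"busy": 1}),
    "process": ({"busy": 1},               {"done": 1}),
    "unload":  ({"done": 1},               {"idle": 1, "part_out": 1}),
}
m = {"idle": 1, "part_in": 1}
for t in ["load", "process", "unload"]:
    pre, post = transitions[t]
    assert enabled(m, pre)       # fire only when the transition is enabled
    m = fire(m, pre, post)
print(m)
```

Interconnecting machines, as in MORM, amounts to sharing places between subnets so that one machine's output tokens become another's input, with the measure constraints composed along those shared flows.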
Abstract:
Manufactured nanoparticles are being introduced into industrial processes, but they are suspected to cause negative health effects similar to those of ambient particles. Poor knowledge about the scale of this introduction has so far not allowed a global risk analysis. In 2006 a targeted telephone survey among Swiss companies (1) showed the usage of nanoparticles in a few selected companies but did not provide data to extrapolate to the totality of the Swiss workforce. To gain this kind of information, a layered representative questionnaire survey among 1'626 Swiss companies was conducted in 2007. Data were collected about the number of potentially exposed persons in the companies and their protection strategy. The response rate was 58.3%. This study estimated that 586 companies (95% confidence interval 145 to 1'027) use nanoparticles in Switzerland. An estimated 1'309 workers (95% CI 1'073 to 1'545) do their job in the same room as a nanoparticle application. Personal protection was shown to be the predominant type of protection means. Companies starting production with nanomaterials need to consider incorporating protection measures into their plans. This will not only benefit the workers' health, but will also likely increase the competitiveness of the companies. Technical and organisational protection means are not only more cost-effective in the long term, but are also easier to control. Guidelines may have to be designed specifically for different industrial applications, including fields outside nanotechnology, and adapted to all sizes of companies.
Abstract:
SUMMARY Barrett's esophagus (BE) is an acquired condition in which the normal squamous epithelium in the distal esophagus is replaced by a metaplastic columnar epithelium, as a complication of chronic gastroesophageal reflux. The clinical significance of this disease is its associated predisposition to esophageal adenocarcinoma (EAC). EAC is a highly lethal disease. Better understanding of the pathogenesis of columnar metaplasia and its progression to cancer might allow the identification of biomarkers that can be used for early diagnosis, which would improve patient survival. In this study, an improved protocol for methylation-sensitive single-strand conformation analysis, which is used to analyze promoter methylation, is proposed and a methylation-sensitive dot blot assay is described, which allows a rapid, easy, and sensitive detection of promoter methylation. Both methods were applied to study the methylation pattern of the APC promoter in histologically normal-appearing gastric mucosa. The APC promoter showed monoallelic methylation, and because the methylated allele differed between the different gastric cell types, this corresponded to allelic exclusion. The APC methylation pattern was frequently altered in normal gastric mucosa associated with neoplastic lesions, indicating that changes in the pattern of promoter methylation might precede the development of neoplasia, without accompanying histological manifestations. An epigenetic profile of 10 genes important in EAC was obtained in this study; 5 gene promoters (APC, TIMP3, TERT, CDKN2A and SFRP1) were found to be hypermethylated in the tumors. Furthermore, the promoters of APC, TIMP3 and TERT were frequently methylated in BE samples from EAC patients, but rarely in BE samples that did not progress to EAC. These three biomarkers might therefore be considered as potential predictive markers for increased EAC risk.
Analysis of Wnt pathway alterations indicated that the WNT2 ligand is overexpressed as early as the low-grade dysplastic stage and that downregulation of the SFRP1 gene by promoter methylation occurs already in the metaplastic lesions. Moreover, loss of APC expression is not the only factor involved in the activation of the Wnt pathway. These results indicate that a variety of biological, mostly epigenetic, events occur very early in the carcinogenesis of BE. This new information might lead to improved early diagnosis of EAC and thus open the way to a possible application of these biomarkers in predicting an increased risk of progression to EAC.
Abstract:
CANCER CARE FACILITIES: In 2005, the registration area had about 3200 hospital beds available for cancer diagnosis and treatment (about 5 per 1000 residents). There were about 3600 hospital medical residents and private practitioners (1 per 180 residents). The canton has a major, multidisciplinary, public university oncology and radiotherapy centre and two private radiotherapy units (available to all residents), as well as several peripheral (mostly hospital-based) medical and surgical oncology facilities and specialists. REGISTRY STRUCTURE AND METHODS: The registry is part of the Cancer Epidemiology Unit of the Institute of Social and Preventive Medicine within the Faculty of Biology and Medicine of the University of Lausanne. Notification is voluntary. The registry's main sources of information are the University Institute of Pathology at the University of Lausanne and three major private pathology laboratories. Passive and active follow-up are conducted. Data on all deaths in the canton (including cancer deaths) are available. Other features of the registry are good registration of non-melanoma skin cancers, linkage of reports of selected preneoplastic conditions to the registry database (to study subsequent cancer risk), analysis.