849 results for Urban Crash Risk Assessment Tool
Abstract:
The Short Term Assessment of Risk and Treatability (START) is a structured professional judgement tool used to inform risk estimation for multiple adverse outcomes. In research, its risk estimates outperform the tool's strength and vulnerability scales for violence prediction. Little is known about what its component parts contribute to the assignment of risk estimates, or how those estimates fare in predicting non-violent adverse outcomes compared with the structured components. START assessment and outcome data from a secure mental health service (N = 84) were collected. Binomial and multinomial regression analyses determined the contribution of selected elements of the START structured domain and of recent adverse risk events to risk estimates and to outcome prediction for violence, self-harm/suicidality, victimisation, and self-neglect. START vulnerabilities and lifetime history of violence predicted the violence risk estimate; the self-harm and victimisation estimates were predicted only by the corresponding recent adverse events. Recent adverse events uniquely predicted all corresponding outcomes, with the exception of self-neglect, which was predicted by the strength scale. Only for victimisation did the risk estimate outperform prediction based on the START components and recent adverse events. In the absence of recent corresponding risk behaviour, restrictions imposed on the basis of START-informed risk estimates could be unwarranted and may be unethical.
Abstract:
Global climate change is predicted to affect the frequency and severity of flood events. In this study, output from Global Circulation Models (GCMs) for a range of possible future climate scenarios was used to force hydrologic models of four case-study watersheds built with the Soil and Water Assessment Tool (SWAT). GCM output was applied with either the "delta change" method or a bias correction. Potential changes in flood risk are assessed from the modeling results, along with possible relationships to watershed characteristics. Differences in model outputs under the two methods of adjusting GCM output are also compared. Preliminary results indicate that watersheds exhibiting higher proportions of runoff in streamflow are more vulnerable to changes in flood risk. The delta change method appears more useful for simulating extreme events, as it preserves daily climate variability better than bias-corrected GCM output does.
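The "delta change" adjustment described above can be sketched in a few lines. This is a minimal illustration under common conventions, not the study's own implementation: the function names are invented, the additive form is typically used for temperature, and the multiplicative form for precipitation; the change factors come from GCM baseline and future climatologies.

```python
def delta_change_temperature(observed, gcm_baseline_mean, gcm_future_mean):
    """Additive delta change: shift each observed value by the GCM-projected
    change in the mean (future minus baseline)."""
    delta = gcm_future_mean - gcm_baseline_mean
    return [t + delta for t in observed]

def delta_change_precipitation(observed, gcm_baseline_mean, gcm_future_mean):
    """Multiplicative delta change: scale each observed value by the ratio of
    the GCM future mean to the baseline mean."""
    ratio = gcm_future_mean / gcm_baseline_mean
    return [p * ratio for p in observed]
```

Because the observed station series is only shifted or scaled, its day-to-day variability is carried through unchanged, which is the property the abstract credits for better simulation of extremes.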
Abstract:
Background: Genetic polymorphisms of the TCF7L2 gene are strongly associated with large increases in type 2 diabetes risk in populations worldwide. In this study, we aimed to confirm the effect of the TCF7L2 polymorphism rs7903146 on diabetes risk in a Brazilian population and to assess the use of this genetic marker for improving diabetes risk prediction in the general population. Methods: We genotyped the single nucleotide polymorphism (SNP) rs7903146 of the TCF7L2 gene in 560 patients with known coronary disease enrolled in the MASS II (Medicine, Angioplasty, or Surgery Study) trial and in 1,449 residents of Vitoria, in Southeast Brazil. The associations of this gene variant with diabetes risk and metabolic characteristics in these two populations were analyzed. To assess the potential benefit of using this marker for diabetes risk prediction in the general population, we analyzed the impact of this genetic variant on a validated diabetes risk prediction tool based on clinical characteristics developed for the general Brazilian population. Results: SNP rs7903146 of the TCF7L2 gene was significantly associated with type 2 diabetes in the MASS II population (OR = 1.57 per T allele, p = 0.0032), confirming previous reports in the Brazilian population. Addition of this polymorphism to an established clinical risk prediction score did not increase model accuracy (both areas under the ROC curve equal to 0.776). Conclusion: The TCF7L2 rs7903146 T allele is associated with a 1.57-fold increased risk of type 2 diabetes in a Brazilian cohort of patients with known coronary heart disease. However, including this polymorphism in a risk prediction tool developed for the general population produced no improvement in performance. This is, to our knowledge, the first study to confirm this association in a South American population, adding to the consistency of this finding in studies around the world.
Finally, confirming the biological association of a genetic marker does not guarantee improvement over already established screening tools based solely on demographic variables.
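The comparison above rests on the area under the ROC curve (AUC), which was identical (0.776) with and without the SNP. As a hedged illustration of the metric itself (toy data, not the study's models), AUC can be computed from the Mann-Whitney interpretation: the probability that a randomly chosen case outscores a randomly chosen control.

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of positive/negative pairs in which the positive
    receives the higher score (ties count as half a win)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A marker that reorders no case/control pairs relative to the baseline score leaves this quantity unchanged, which is why a genuinely associated SNP can still add no discriminative value.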
Abstract:
Master's dissertation in Environment, Health and Safety.
Abstract:
The impact of effluent wastewaters from four different hospitals (a university hospital with 1,456 beds, a general hospital with 350 beds, a pediatric hospital with 110 beds and a maternity hospital with 96 beds), all conveyed to the same wastewater treatment plant (WWTP), was evaluated in the receiving urban wastewaters. The occurrence of 78 pharmaceuticals belonging to several therapeutic classes was assessed in hospital effluents and in WWTP influent and effluent, as well as the contribution of each hospital to the pharmaceutical load of the WWTP influent. Results indicate that pharmaceuticals are widespread pollutants in both hospital and urban wastewaters. The contribution of hospitals to the input of pharmaceuticals into urban wastewaters varies widely with their size. The estimated total mass loadings were 306 g d⁻¹ for the university hospital, 155 g d⁻¹ for the general hospital, 14 g d⁻¹ for the pediatric hospital and 1.5 g d⁻¹ for the maternity hospital, showing that the largest hospitals contribute most to the total mass load of pharmaceuticals. Furthermore, analysis of the individual contributions of each therapeutic group showed that NSAIDs, analgesics and antibiotics are among the groups with the highest inputs. Removal efficiency ranged from over 90% for pharmaceuticals such as acetaminophen and ibuprofen to no removal for β-blockers and salbutamol. The total mass load of pharmaceuticals into receiving surface waters was estimated at between 5 and 14 g/d per 1,000 inhabitants. Finally, the environmental risk posed by pharmaceuticals detected in hospital and WWTP effluents was assessed by means of hazard quotients toward different trophic levels (algae, daphnids and fish).
Several pharmaceuticals present in the different matrices were identified as potentially hazardous to aquatic organisms, showing that special attention should be paid to antibiotics such as ciprofloxacin, ofloxacin, sulfamethoxazole, azithromycin and clarithromycin, since their hazard quotients in WWTP effluent revealed that they could pose an ecotoxicological risk to algae.
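The hazard quotients used above follow a standard screening convention: measured environmental concentration (MEC) divided by the predicted no-effect concentration (PNEC) for the trophic level of interest. A minimal sketch, with illustrative threshold bands rather than this study's exact classification:

```python
def hazard_quotient(mec_ug_per_l, pnec_ug_per_l):
    """Hazard quotient: measured environmental concentration (MEC)
    over predicted no-effect concentration (PNEC), same units.
    HQ >= 1 flags a potential ecotoxicological risk."""
    return mec_ug_per_l / pnec_ug_per_l

def risk_band(hq):
    """Common screening bands (illustrative cut-offs)."""
    if hq >= 1:
        return "high"
    if hq >= 0.1:
        return "medium"
    return "low"
```

An antibiotic measured at 0.5 μg/L in WWTP effluent against an algal PNEC of 0.1 μg/L would thus score HQ = 5, placing it in the band that warrants attention.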
Abstract:
Estuaries and other transitional waters are complex ecosystems of critical importance as nursery and shelter areas for organisms. Humans also depend on estuaries for multiple socio-economic activities such as urbanism, tourism, heavy industry (taking advantage of shipping), fisheries and aquaculture, whose development has exerted strong historical pressures, most notably pollution. The degradation of estuarine environmental quality carries ecological, economic and social costs, hence the importance of evaluating environmental quality through the identification of stressors and impacts. The Sado Estuary (SW Portugal) has the characteristics of industrialized estuaries, with multiple adverse impacts as a result; still, it has recently been considered only moderately contaminated. Many studies have been conducted in the past few years, albeit scattered, owing to the absence of true biomonitoring programmes, so there is a need to integrate the information into a holistic perspective of the area that can assist management and decision-making. To that end, a geographical information system (GIS) was created based on sediment contamination and biomarker data collected from a decade-long series of publications. Four impacted areas and one reference area were identified, characterized by distinct sediment contamination patterns related to different hot spots and diffuse sources of toxicants. The potential risk of sediment-bound toxicants was determined by contrasting pollutant levels with available sediment quality guidelines, followed by their integration through the Sediment Quality Guideline Quotient (SQG-Q). The SQG-Q estimates per toxicant or class were then georeferenced and analysed statistically across the five areas and between seasons. Biomarker responses were integrated through the Biomarker Consistency Index and likewise georeferenced through GIS.
Overall, despite the multiple biological traits surveyed, the biomarker data (from several organisms) are in accordance with sediment contamination. The most impacted areas were the shipyard area and the adjacent industrial belt, followed by urban and agricultural grounds. The estuary, although globally moderately impacted, is clearly very heterogeneous and affected by a cocktail of contaminants, especially metals and polycyclic aromatic hydrocarbons. Although some elements (such as copper, zinc and even arsenic) may originate from the geology of the Sado River's hydrographic basin, the majority of the remaining contaminants result from human activities. The present work revealed that the estuary should be divided into distinct biogeographic units in order to implement effective measures to safeguard environmental quality.
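The SQG-Q mentioned above is conventionally computed as the mean of per-contaminant quotients of measured sediment concentration over the corresponding guideline value. A hedged sketch (illustrative data; the study's guideline set and class groupings are not reproduced here):

```python
def sqg_q(concentrations, guidelines):
    """Sediment Quality Guideline Quotient: mean of per-contaminant
    quotients (measured concentration / guideline value).
    Values near or above 1 indicate likely adverse effects."""
    quotients = [c / g for c, g in zip(concentrations, guidelines)]
    return sum(quotients) / len(quotients)
```

For example, a sample at twice the guideline for one metal and half the guideline for another averages to an SQG-Q of 1.25, i.e. above the nominal concern threshold despite one compliant analyte.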
Abstract:
Background: Physical stress echocardiography is an established methodology for the diagnosis and risk stratification of coronary artery disease in patients with physical capacity. In obese patients (body mass index ≥ 30 kg/m²) the usefulness of pharmacological stress echocardiography has been demonstrated; however, the use of physical stress echocardiography in this growing population group has not been reported. Objective: To assess the frequency of myocardial ischemia in obese and non-obese patients undergoing physical stress echocardiography and to compare their clinical and echocardiographic differences. Methods: 4,050 patients who underwent treadmill physical stress echocardiography according to the Bruce protocol were studied, divided into two groups: obese (n = 945; 23.3%) and non-obese (n = 3,105; 76.6%). Results: There was no difference regarding gender. Obese patients were younger (55.4 ± 10.9 vs. 57.56 ± 11.67 years) and had a higher frequency of hypertension (75.2% vs. 57.2%; p < 0.0001), diabetes mellitus (15.2% vs. 10.9%; p < 0.0001), dyslipidemia (59.5% vs. 51.9%; p < 0.0001), family history of coronary artery disease (59.3% vs. 55.1%; p = 0.023) and physical inactivity (71.4% vs. 52.9%; p < 0.0001). The obese had greater aortic (3.27 vs. 3.14 cm; p < 0.0001) and left atrial (3.97 vs. 3.72 cm; p < 0.0001) dimensions and greater relative ventricular wall thickness (33.7 vs. 32.8; p < 0.0001). Regarding the presence of myocardial ischemia, there was no difference between groups (19% vs. 17.9%; p = 0.41). In adjusted logistic regression, the presence of myocardial ischemia remained independently associated with age, female gender, diabetes and hypertension. Conclusion: Obesity did not behave as a predictor of ischemia on physical stress echocardiography. The application of this assessment tool in a large-scale sample demonstrates the feasibility of the methodology in obese patients as well.
Abstract:
BACKGROUND: Adequate pain assessment is critical for evaluating the efficacy of analgesic treatment in clinical practice and during the development of new therapies. Yet the currently used scores of global pain intensity fail to reflect the diversity of pain manifestations and the complexity of underlying biological mechanisms. We have developed a tool for a standardized assessment of pain-related symptoms and signs that differentiates pain phenotypes independent of etiology. METHODS AND FINDINGS: Using a structured interview (16 questions) and a standardized bedside examination (23 tests), we prospectively assessed symptoms and signs in 130 patients with peripheral neuropathic pain caused by diabetic polyneuropathy, postherpetic neuralgia, or radicular low back pain (LBP), and in 57 patients with non-neuropathic (axial) LBP. A hierarchical cluster analysis revealed distinct association patterns of symptoms and signs (pain subtypes) that characterized six subgroups of patients with neuropathic pain and two subgroups of patients with non-neuropathic pain. Using a classification tree analysis, we identified the most discriminatory assessment items for the identification of pain subtypes. We combined these six interview questions and ten physical tests in a pain assessment tool that we named Standardized Evaluation of Pain (StEP). We validated StEP for the distinction between radicular and axial LBP in an independent group of 137 patients. StEP identified patients with radicular pain with high sensitivity (92%; 95% confidence interval [CI] 83%-97%) and specificity (97%; 95% CI 89%-100%). The diagnostic accuracy of StEP exceeded that of a dedicated screening tool for neuropathic pain and spinal magnetic resonance imaging. In addition, we were able to reproduce subtypes of radicular and axial LBP, underscoring the utility of StEP for discerning distinct constellations of symptoms and signs. 
CONCLUSIONS: We present a novel method of identifying pain subtypes that we believe reflect underlying pain mechanisms. We demonstrate that this new approach to pain assessment helps separate radicular from axial back pain. Beyond diagnostic utility, a standardized differentiation of pain subtypes that is independent of disease etiology may offer a unique opportunity to improve targeted analgesic treatment.
Abstract:
OBJECTIVE: The aim of this pilot study was to describe problems in functioning and associated rehabilitation needs in persons with spinal cord injury after the 2010 earthquake in Haiti, by applying a newly developed tool based on the International Classification of Functioning, Disability and Health (ICF). DESIGN: Pilot study. SUBJECTS: Eighteen persons with spinal cord injury (11 women, 7 men) participated in the needs assessment. Eleven patients had complete lesions (American Spinal Injury Association Impairment Scale grade A; AIS A), and one patient had tetraplegia. METHODS: Data collection included information from the International Spinal Cord Injury Core Data Set and a newly developed needs assessment tool based on ICF Core Sets. This tool assesses the level of functioning, the corresponding rehabilitation need, and the health professional required. Data were summarized using descriptive statistics. RESULTS: In body functions and body structures, patients showed the problems typical of spinal cord injury. Nearly all patients showed limitations and restrictions in activities and participation related to mobility, self-care and aspects of social integration. Several environmental factors acted as barriers contributing to these limitations and restrictions; however, the availability of products and social support were identified as facilitators. Rehabilitation needs were identified in nearly all aspects of functioning, and addressing them would require a multidisciplinary approach. CONCLUSION: This ICF-based needs assessment provided useful information for rehabilitation planning in the context of a natural disaster. Future studies are required to test and, if necessary, adapt the assessment.
Abstract:
Perioperative cardiac events occurring in patients undergoing non-cardiac surgery are a common cause of morbidity and mortality. Current guidelines recommend an individualized approach to preoperative cardiac risk stratification prior to non-cardiac surgery, integrating risk factors both for the patient (active cardiac conditions, clinical risk factors, functional capacity) and for the planned surgery. Preoperative cardiac investigations are currently limited to high-risk patients in whom they may contribute to modify the perioperative management. A multidisciplinary approach to such patients, integrating the general practitioner, is recommended in order to define an individualized perioperative strategy.
Abstract:
The introduction of engineered nanostructured materials into a rapidly increasing number of industrial and consumer products will result in enhanced exposure to engineered nanoparticles. Workplace exposure has been identified as the most likely source of uncontrolled inhalation of engineered aerosolized nanoparticles, but release of engineered nanoparticles may occur at any stage of the lifecycle of (consumer) products. The dynamic development of nanomaterials with possibly unknown toxicological effects poses a challenge for the assessment of nanoparticle-induced toxicity and safety. In this consensus document from a workshop on in-vitro cell systems for nanoparticle toxicity testing (the workshop 'In-Vitro Exposure Studies for Toxicity Testing of Engineered Nanoparticles', sponsored by the Association for Aerosol Research (GAeF), 5-6 September 2009, Karlsruhe, Germany), an overview is given of the main issues concerning exposure to airborne nanoparticles, lung physiology, biological mechanisms of (adverse) action, in-vitro cell exposure systems, realistic tissue doses, risk assessment, and social aspects of nanotechnology. The workshop participants recognized the large potential of in-vitro cell exposure systems for reliable, high-throughput screening of nanoparticle toxicity. For the investigation of lung toxicity, a strong preference was expressed for air-liquid interface (ALI) cell exposure systems (rather than submerged cell exposure systems), as they more closely resemble in-vivo conditions in the lungs and allow unaltered and dosimetrically accurate delivery of aerosolized nanoparticles to the cells. An important aspect, frequently overlooked, is the comparison of typically used in-vitro dose levels with realistic in-vivo nanoparticle doses in the lung.
If we take average ambient urban exposure and occupational exposure at 5 mg/m³ (the maximum level allowed by the Occupational Safety and Health Administration, OSHA) as the boundaries of human exposure, the corresponding upper-limit range of nanoparticle flux delivered to the lung tissue is 3×10⁻⁵ to 5×10⁻³ μg/h per cm² of lung tissue, or 2-300 particles/h per (epithelial) cell. This range can easily be matched, and even exceeded, by almost all currently available cell exposure systems. The consensus statement includes a set of recommendations for conducting in-vitro cell exposure studies with pulmonary cell systems and identifies urgent needs for future development. As these issues are crucial for the introduction of safe nanomaterials into the marketplace and the living environment, they deserve more attention and more interaction between biologists and aerosol scientists. The members of the workshop believe that further advances in in-vitro cell exposure studies would be greatly facilitated by a more active role for aerosol scientists. The technical know-how for developing and running ALI in-vitro exposure systems is available in the aerosol community, while biologists/toxicologists are required for proper assessment of the biological impact of nanoparticles.
Abstract:
BACKGROUND: Physicians need a specific risk-stratification tool to facilitate safe and cost-effective approaches to the management of patients with cancer and acute pulmonary embolism (PE). The objective of this study was to develop a simple risk score for predicting 30-day mortality in patients with PE and cancer by using measures readily obtained at the time of PE diagnosis. METHODS: Investigators randomly allocated 1,556 consecutive patients with cancer and acute PE from the international multicenter Registro Informatizado de la Enfermedad TromboEmbólica to derivation (67%) and internal validation (33%) samples. The external validation cohort for this study consisted of 261 patients with cancer and acute PE. Investigators compared 30-day all-cause mortality and nonfatal adverse medical outcomes across the derivation and two validation samples. RESULTS: In the derivation sample, multivariable analyses produced the risk score, which contained six variables: age > 80 years, heart rate ≥ 110/min, systolic BP < 100 mm Hg, body weight < 60 kg, recent immobility, and presence of metastases. In the internal validation cohort (n = 508), the 22.2% of patients (113 of 508) classified as low risk by the prognostic model had a 30-day mortality of 4.4% (95% CI, 0.6%-8.2%) compared with 29.9% (95% CI, 25.4%-34.4%) in the high-risk group. In the external validation cohort, the 18% of patients (47 of 261) classified as low risk by the prognostic model had a 30-day mortality of 0%, compared with 19.6% (95% CI, 14.3%-25.0%) in the high-risk group. CONCLUSIONS: The developed clinical prediction rule accurately identifies low-risk patients with cancer and acute PE.
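The six-variable score described above lends itself to a compact sketch. This is an illustrative equal-weight version only: the published rule may assign different point weights to each criterion, and the low-risk cut-off used here (zero points) is an assumption.

```python
def pe_risk_score(age, heart_rate, systolic_bp, weight_kg,
                  recent_immobility, metastases):
    """Equal-weight sketch of the six-variable score: one point per
    criterion met (age > 80 y, HR >= 110/min, SBP < 100 mm Hg,
    weight < 60 kg, recent immobility, metastases). Weights are
    assumed, not taken from the published model."""
    points = 0
    points += age > 80
    points += heart_rate >= 110
    points += systolic_bp < 100
    points += weight_kg < 60
    points += bool(recent_immobility)
    points += bool(metastases)
    return points

def risk_class(score):
    # Assumption: zero points = low risk, any point = high risk.
    return "low" if score == 0 else "high"
```

The design follows the abstract's intent: every input is readily available at the time of PE diagnosis, so the rule can be applied at the bedside without additional testing.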
Abstract:
[Table of contents] 1. Introduction to the control banding method: Nanomaterials and occupational risk assessment; Alternative method known as control banding; Scope and limits of control banding. 2. Control banding process applied to manufactured nanomaterials: General points; Operating principle. 3. Implementation of control banding: Gathering of information; Hazard bands; Exposure bands; Allocation of risk control bands. 4. Bibliography: Publications; Books, reports, opinions, bulletins; Standards and references; Legislation and regulations; Websites. Annexes.
Abstract:
Executive Summary: In Nepal, landslides are one of the major natural hazards after epidemics, killing over 100 people per year. This figure, however, underreports the actual impact that landslides have on livelihoods and food security in rural Nepal. With predictions of more intense rainfall patterns, landslide occurrence in the Himalayas is likely to increase and to remain one of the major impediments to development. Owing to the remoteness of many localities and a lack of resources, responsibility for disaster preparedness and response in mountain areas usually lies with the communities themselves. Everyday life is full of risk in the mountains of Nepal, which is why mountain populations, like other populations living in harsh conditions, have developed a number of coping strategies for dealing with adverse situations. Perhaps because of the dispersed and remote nature of landslides in Nepal, there have been few studies on the vulnerability and the coping and mitigation strategies of landslide-affected populations. There are also few recommendations available to guide authorities and populations on how to reduce losses due to landslides in Nepal, and even fewer on how to operationalize resilience and vulnerability. Many policy makers, international donors, NGOs and national authorities are currently asking what investments are needed to increase the so-called 'resilience' of mountain populations in the face of climate risks. Yet mountain populations are already quite resilient to seasonal fluctuations, temperature variations, rainfall patterns and market prices. In spite of their resilience, they continue to live in places at risk because of high vulnerability caused by structural inequalities: access to land, resources, markets and education.
This interdisciplinary thesis examines the concept of resilience by questioning its usefulness and validity as the current goal of international development and disaster risk reduction policies, its conceptual limitations and its possible scope of action. The goal of this study is twofold: to better define and distinguish the factors and relationships between resilience, vulnerability, capacities and risk; and to test and improve a participatory methodology for evaluating landslide risk that can serve as a guidance tool for improving community-based disaster risk reduction. The objective is to develop a simple methodology that can be used by NGOs, local authorities and communities to reduce losses from landslides. Through six case studies in Central-Eastern Nepal, this study explores the relation between resilience, vulnerability and landslide risk using interdisciplinary methods, including geological assessments of landslides, semi-structured interviews, focus groups and participatory risk mapping. For comparison, the study sites were chosen in the Tehrathum, Sunsari and Dolakha Districts of Central/Eastern Nepal, to reflect a variety of landslide types, from chronic to acute, and a variety of communities, from very marginalized to very high status. The study uses the Sustainable Livelihoods Approach as its conceptual basis, built on the notion that access and rights to resources (natural, human/institutional, economic, environmental, physical) are the basis for coping with adversity such as landslides.
The study is also intended as a contribution to the growing literature and practice on community-based disaster risk reduction specifically adapted to landslide-prone areas. In addition to the six case studies, results include an indicator-based methodology for assessing and measuring vulnerability and resilience, a composite risk assessment methodology, a typology of coping strategies and risk perceptions, and a thorough analysis of the relation between risk, vulnerability and resilience. The methodology for assessing vulnerability, resilience and risk is relatively cost-effective and replicable in a low-data environment. Perhaps the major finding is that resilience is a process that defines a community's (or system's) capacity to rebound following adversity, but it does not necessarily reduce vulnerability or risk, which requires addressing more structural issues related to poverty. The conclusions therefore include a critical view of resilience as a main goal of international development and disaster risk reduction policies: it is a useful concept in the context of recovery after a disaster, but it needs to be addressed in parallel with vulnerability and risk. This research was funded by an interdisciplinary grant (#26083591) from the Swiss National Science Foundation for the period 2009-2011 and a seed grant from the Faculty of Geosciences and Environment at the University of Lausanne in 2008.
Abstract:
Control banding (CB) can be a useful tool for managing the potential risks of nanomaterials. The CB approach proposed here, which should be part of an overall risk control strategy, groups materials by hazard and emission potential. The resulting decision matrix proposes control bands adapted to the risk potential levels and helps define an action plan. If this plan is not practical and financially feasible, a full risk assessment is launched. The hazard banding combines key concepts of nanomaterial toxicology: translocation across biological barriers, fibrous nature, solubility, and reactivity. Existing classifications specific to the nanomaterial can be used "as is"; otherwise, the toxicity of the bulk or an analogous substance gives an initial hazard band, which is increased if the substance is not easily soluble or if it is more reactive than the bulk material. The emission potential bands are defined by the nanomaterial's physical form and the process characteristics. Quantities, frequencies, and existing control measures are taken into account when defining the action plan. Control strategies range from room ventilation to full containment with expert advice. This CB approach, once validated, can easily be embedded in risk management systems. It allows new toxicity data to be integrated and requires no exposure data. [Authors]
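The band-crossing logic described above can be sketched as a small decision matrix. Everything here is illustrative: the band ranges, the conservative "take the higher band" combination rule, and the control-level labels are assumptions, not the scheme's published matrix (which may combine bands differently).

```python
# Illustrative control-level labels, ordered from least to most stringent;
# the endpoints echo the range the abstract describes (room ventilation
# up to full containment with expert advice).
CONTROL_LEVELS = {
    1: "natural or mechanical room ventilation",
    2: "local exhaust ventilation",
    3: "enclosed ventilation / partial containment",
    4: "full containment",
    5: "full containment plus expert advice",
}

def control_band(hazard_band, emission_band):
    """Assumed combination rule: the higher of the hazard band (1-5)
    and the emission-potential band (1-4) sets the control level."""
    if not (1 <= hazard_band <= 5 and 1 <= emission_band <= 4):
        raise ValueError("band out of range")
    return max(hazard_band, emission_band)
```

Taking the maximum makes the matrix conservative: a low-emission process handling a high-hazard material (or vice versa) still lands in the stricter control band, and the action plan is then refined using quantities, frequencies, and existing controls.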