13 results for Environmental impact analysis--Ontario--Lake Gibson.
at Université de Lausanne, Switzerland
Abstract:
NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: the potential toxic and safety hazards of nanomaterials throughout their lifecycles; the fate and persistence of nanoparticles in humans, animals and the environment; the risks associated with nanoparticle exposure; participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; the development of best practice guidelines; voluntary schemes on responsibility; and databases of materials, research topics and themes. The findings show that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Consequently, NIN will encourage stakeholders to become active members. These survey findings will be used to improve NIN's communication tools and to further build interdisciplinary relationships towards a healthy future with nanotechnology.
Abstract:
NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:
• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.
These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, NIN organised a workshop focused on building a sustainable multi-stakeholder dialogue, at which specific questions were put to the different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers, and why? The discussions of this question confirmed the needs identified in the targeted phone calls.
2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics is needed for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have been proved to be unsafe. The question of safety is also about whether the public has confidence; new legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced, so information is needed on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling products that contain nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, and this may have an impact on nanotechnology as a whole.
4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even where voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.
NIN will continue an active stakeholder dialogue to further build interdisciplinary relationships towards a healthy future with nanotechnology.
Abstract:
Neglecting health effects from indoor pollutant emissions and exposure, as is currently done in Life Cycle Assessment (LCA), may result in product or process optimizations at the expense of workers' or consumers' health. To close this gap, methods for considering indoor exposure to chemicals are needed to complement the methods for outdoor human exposure assessment already in use. This paper summarizes the work of an international expert group on the integration of human indoor and outdoor exposure in LCA, within the UNEP/SETAC Life Cycle Initiative. A new methodological framework is proposed for a general procedure to include human-health effects from indoor exposure in LCA. Exposure models from occupational hygiene and household indoor air quality studies and practices are critically reviewed, and recommendations are provided on the appropriateness of various model alternatives in the context of LCA. A single-compartment box model is recommended as the default in LCA, enabling occupational and household exposures to be screened consistently with the existing models used to assess outdoor emissions in a multimedia environment. An initial set of model parameter values was collected. The comparison of indoor and outdoor human exposure per unit of emission shows that, for many pollutants, the intake per unit of indoor emission may be several orders of magnitude higher than per unit of outdoor emission. It is concluded that indoor exposure should be routinely addressed within LCA.
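At steady state, a single-compartment box model collapses to a simple intake-fraction expression. The sketch below illustrates the calculation; the room parameters and the outdoor intake fraction used for comparison are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a one-compartment (well-mixed box) indoor exposure model.
# All parameter values below are illustrative assumptions, not the paper's.

def indoor_intake_fraction(n_people, breathing_rate_m3_h,
                           room_volume_m3, air_changes_per_h):
    """Fraction of an indoor emission inhaled by the room's occupants.

    Steady-state box model: concentration C = S / (V * k_ex) for an
    emission rate S, so the intake fraction is iF = N * BR / (V * k_ex).
    """
    return n_people * breathing_rate_m3_h / (room_volume_m3 * air_changes_per_h)

# Hypothetical workplace: 2 workers, 0.8 m3/h breathing rate,
# 300 m3 workshop, 2 air changes per hour.
iF_indoor = indoor_intake_fraction(2, 0.8, 300.0, 2.0)

# Outdoor intake fractions are often in the 1e-6 to 1e-4 range, so the
# indoor value can be several orders of magnitude higher (assumed figure).
iF_outdoor = 1e-6
print(f"indoor iF = {iF_indoor:.2e}")                       # ~2.7e-03
print(f"indoor/outdoor ratio = {iF_indoor / iF_outdoor:.0f}x")
```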
Abstract:
The ability to detect early molecular responses to various chemicals is central to understanding the biological impact of pollutants in a context of varying environmental cues. To monitor stress responses in a model plant, we used the transgenic moss Physcomitrella patens expressing the beta-glucuronidase (GUS) reporter under the control of the stress-inducible promoter hsp17.3B. Following exposure to pollutants from the dye and paper industry, GUS activity was measured by monitoring a fluorescent product. Chlorophenols, heavy metals and sulphonated anthraquinones were found to specifically activate the hsp17.3B promoter (within hours), in correlation with long-term toxicity effects (within days). At mildly elevated physiological temperatures, the chemical activation of this promoter was strongly amplified, which considerably increased the sensitivity of the bioassay. Together with the activation of the hsp17.3B promoter, chlorophenols induced endogenous chaperones that transiently protected a recombinant thermolabile luciferase (LUC) from severe heat denaturation. This sensitive bioassay provides an early-warning molecular sensor for industrial pollutants under varying environments, in anticipation of long-term toxic effects in plants. Given the strong cross-talk we find between abiotic and chemical stresses, this P. patens line is more likely to serve as a direct toxicity bioassay for pollutants combined with environmental cues than as an indicator of absolute toxicity thresholds for individual pollutants. It is also a powerful tool to study the role of heat shock proteins (HSPs) in plants exposed to combined chemical and environmental stresses.
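As an illustration of how such a reporter readout is typically turned into an early-warning signal, the sketch below computes the fold-induction of GUS activity over an untreated control and flags exceedance of an alarm threshold. All readings and the threshold are invented for illustration, not data from the study.

```python
import numpy as np

# Hypothetical fluorescence readings (arbitrary GUS activity units) from
# replicate moss cultures; all numbers are invented for illustration.
control  = np.array([1.0, 1.1, 0.9, 1.0])       # untreated moss
treated  = np.array([4.8, 5.3, 5.1, 4.9])       # pollutant-exposed moss
heat_amp = np.array([19.5, 21.0, 20.2, 19.8])   # same pollutant, mildly elevated temperature

fold_treated = treated.mean() / control.mean()
fold_heat    = heat_amp.mean() / control.mean()

# A fold-induction threshold (assumed here, not from the paper) turns the
# readout into an early-warning flag preceding long-term toxicity.
THRESHOLD = 2.0
print(f"pollutant alone:      {fold_treated:.1f}x -> alarm: {fold_treated > THRESHOLD}")
print(f"pollutant + mild heat: {fold_heat:.1f}x -> alarm: {fold_heat > THRESHOLD}")
```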
Abstract:
Most available studies on lead smelter emissions deal with the environmental impact of outdoor particles; only a few focus on air quality at workplaces. The objective of this study is to physically and chemically characterize the Pb-rich particles emitted at different workplaces in a lead recycling plant. A multi-scale characterization was conducted, from bulk analysis down to individual particles, to assess particle properties in relation to Pb speciation and availability. Process PM from various origins were sampled and compared: Furnace and Refining PM, present in the smelter and at refinery workplaces respectively, and Emissions PM, present in channeled emissions. These particles first differed by their morphology and size distribution, with finer particles found in emissions. The differences observed in chemical composition could be explained by the industrial processes. All PM contained the same major phases (Pb, PbS, PbO, PbSO4 and PbO·PbSO4) but differed in the nature and amount of minor phases. Owing to the high Pb content of the PM, Pb concentrations in the CaCl2 extractant reached relatively high values (40 mg L-1). However, the soluble/total ratios of CaCl2-exchangeable Pb were relatively low (< 0.02%) in comparison with Cd (up to 18%). These results highlight the value of assessing the soluble fractions of all metals (minor and major) and of discussing both total metal concentrations and soluble/total ratios in risk evaluations. In most cases metal extractability increased with decreasing particle size; in particular, lead exchangeability was highest for channeled emissions. This type of study could help in choosing targeted sanitary protection procedures and in guiding further toxicological investigations. In the present context, particular attention is given to Emissions and Furnace PM. Moreover, exposure to metals other than Pb should be considered. [Authors]
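The soluble/total ratio discussed above is straightforward bookkeeping once the extract concentration and the total metal content are known. In the sketch below all sample values are invented for illustration; only the 40 mg/L extract concentration echoes the abstract.

```python
# Sketch of the soluble/total (exchangeable) ratio used to compare metal
# availability between major (Pb) and minor (Cd) constituents.

def exchangeable_fraction(extract_mg_per_l, extract_volume_l,
                          sample_mass_g, total_mg_per_g):
    """Percent of the total metal content released into the CaCl2 extract."""
    soluble_mg = extract_mg_per_l * extract_volume_l
    total_mg = total_mg_per_g * sample_mass_g
    return 100.0 * soluble_mg / total_mg

# Hypothetical Pb-rich dust: 1 g at 600 mg Pb/g, extracted in 10 mL of CaCl2.
pb = exchangeable_fraction(40.0, 0.010, 1.0, 600.0)
# Hypothetical Cd trace in the same dust: 0.1 mg Cd/g, 0.5 mg/L in the extract.
cd = exchangeable_fraction(0.5, 0.010, 1.0, 0.1)

# A metal can dominate the extract in absolute terms (Pb here) while having
# a far lower soluble/total ratio than a minor metal such as Cd.
print(f"Pb: {pb:.3f}% exchangeable")   # ~0.07%
print(f"Cd: {cd:.1f}% exchangeable")   # ~5%
```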
Abstract:
The current research project is both a process and an impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments, as well as from police internal documents and additional public sources. The impact evaluation uses official crime records and census statistics as contextual variables, and Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data measuring community policing implementation progress over six evaluative dimensions at five-year intervals between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas.
To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. Finally, for the impact analysis, propensity score matching is used to match the survey respondents of the pretest and posttest samples on age, gender, and level of education within each neighborhood type identified in each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess their robustness in the face of potential bias due to unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to unobserved covariates and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
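As a rough illustration of the matching step, the sketch below pairs posttest with pretest respondents by nearest propensity score, estimated with a logistic regression on age, gender, and education. The data are simulated and the code is a generic stand-in for this family of methods, not the study's actual procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Simulated survey respondents: covariates age, gender, education;
# wave = 0 for pretest, 1 for posttest (the "treatment" indicator here).
n = 1000
X = np.column_stack([
    rng.integers(18, 80, n),   # age
    rng.integers(0, 2, n),     # gender
    rng.integers(1, 4, n),     # education level
]).astype(float)
wave = rng.integers(0, 2, n)
fear = rng.normal(5 - 0.3 * wave, 1.5)   # fabricated fear-of-crime score

# 1. Propensity score: probability of belonging to the posttest wave
#    given the covariates.
ps = LogisticRegression().fit(X, wave).predict_proba(X)[:, 1]

# 2. Match each posttest respondent to the pretest respondent with the
#    closest propensity score (1-nearest-neighbor, with replacement).
pre, post = np.where(wave == 0)[0], np.where(wave == 1)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[pre].reshape(-1, 1))
_, idx = nn.kneighbors(ps[post].reshape(-1, 1))
matched_pre = pre[idx.ravel()]

# 3. Difference in the outcome across the balanced samples.
effect = fear[post].mean() - fear[matched_pre].mean()
print(f"matched pre/post difference in fear of crime: {effect:+.2f}")
```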
Abstract:
A radiochemical procedure was developed for the sequential determination of Pu and Am radioisotopes in environmental samples. The radioisotope activities were then used to assess the origin and release date of the environmental plutonium. The radioanalytical procedure is based on the separation of Pu and Am on selective extraction chromatographic resins (Eichrom TEVA and DGA). Alpha sources were prepared by electrodeposition on stainless steel discs, and the alpha-emitting radionuclides (238Pu, 239,240Pu and 241Am) were measured by alpha spectrometry. For the determination of the beta-emitting 241Pu, the Pu alpha source was leached in hot concentrated nitric acid and the Pu fraction further purified by extraction chromatography on a small column of TEVA resin (100 μg of resin in a pipette tip). 241Pu was then measured by ultra-low-level liquid scintillation counting. In the absence of a certified reference material for 241Pu, the proposed radiochemical method was validated using four IAEA reference sediments with information values for 241Pu. The method was then used to determine the 238Pu, 239,240Pu, 241Pu and 241Am activity concentrations in alpine soils of France and Switzerland. Soil is the primary receptor of atmospheric radioactive fallout and, because of the strong binding interaction with soil particles, the isotopes are little fractionated. Therefore, the 241Pu/239,240Pu and 238Pu/239,240Pu activity ratios in soil samples were used to determine the origin (source) and date of the Pu contamination at the investigated alpine sites. These activity ratios confirmed that the main origin of Pu in the alpine soils was global fallout from the nuclear bomb tests (NBT) of the 1950s and 1960s. Furthermore, the 241Pu/241Am activity ratios were used to determine the age of the Pu contamination, which is also important information for distinguishing Pu sources. The estimation of the contamination date by the 241Pu/241Am age-dating method further confirmed the NBT as the Pu source. However, the 241Pu/241Am dating method was limited to samples in which Pu-Am fractionation was insignificant. The contribution of the Chernobyl accident at the studied sites, if any, is negligible.
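Assuming no 241Am was present at the time of deposition (the fractionation-free case the abstract requires), the 241Am ingrowth curve from 241Pu decay can be inverted numerically to date the contamination. The sketch below shows the idea; the half-lives are standard values, but the measured ratio is a made-up example.

```python
import numpy as np
from scipy.optimize import brentq

# 241Pu (beta) decays to 241Am (alpha); standard half-lives in years.
T_HALF_PU241 = 14.33
T_HALF_AM241 = 432.6
lam_pu = np.log(2) / T_HALF_PU241
lam_am = np.log(2) / T_HALF_AM241

def am_pu_activity_ratio(t):
    """241Am/241Pu activity ratio t years after a pure-241Pu deposition,
    from the Bateman ingrowth equation with zero initial 241Am."""
    return lam_am * (1.0 - np.exp((lam_pu - lam_am) * t)) / (lam_am - lam_pu)

def age_from_ratio(measured_ratio):
    """Invert the ingrowth curve numerically for the contamination age."""
    return brentq(lambda t: am_pu_activity_ratio(t) - measured_ratio, 1e-3, 200.0)

# Hypothetical measurement: 241Am activity 0.27 times the residual 241Pu
# activity, which dates the deposition to roughly the 1960s fallout era.
print(f"estimated age: {age_from_ratio(0.27):.0f} years")   # ~47 years
```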
Abstract:
BACKGROUND: Dried blood spot (DBS) sampling has gained popularity in the bioanalytical community as an alternative to conventional plasma sampling, as it provides numerous benefits in terms of sample collection and logistics. The aim of this work was to show that these advantages can be coupled with a simple and cost-effective sample pretreatment and subsequent rapid LC-MS/MS analysis for the quantitation of 15 benzodiazepines, six metabolites and three Z-drugs. For this purpose, a simplified offline procedure was developed that consisted of letting a 5-µl DBS infuse directly into 100 µl of MeOH in a conventional LC vial. RESULTS: The parameters of the DBS pretreatment, such as extraction time and internal standard addition, were investigated and optimized, demonstrating that passive infusion in a regular LC vial was sufficient to quantitatively extract the analytes of interest. The method was validated according to international criteria over the therapeutic concentration ranges of the selected compounds. CONCLUSION: The presented strategy proved efficient for the rapid analysis of the selected drugs. Indeed, the offline sample preparation was reduced to a minimum, using small amounts of organic solvent and consumables, without affecting the accuracy of the method. This approach thus enables simple and rapid DBS analysis, even with a non-DBS-dedicated autosampler, while lowering costs and environmental impact.
Abstract:
Background, aim, and scope: A coupled life cycle costing (LCC) and life cycle assessment (LCA) has been performed for car-bodies of the Korean Tilting Train eXpress (TTX) project using European and Korean databases, with the objective of assessing environmental and cost performance to aid materials and process selection. More specifically, the potential of polymer composite car-body structures for the TTX has been investigated.
Materials and methods: The assessment includes the costs of both the carriage manufacturing and use phases, coupled with the life cycle environmental impacts of all stages from raw material production, through carriage manufacture and use, to end-of-life scenarios. Metallic carriages were compared with two composite options: hybrid steel-composite and full-composite carriages. The total planned production for this regional Korean train was 440 cars, with an annual production volume of 80 cars.
Results and discussion: The coupled analyses were used to generate plots of cost versus energy consumption and environmental impacts. The results show that the raw material and manufacturing phases account for approximately half of the total life cycle costs, whilst their environmental impact is relatively insignificant (3-8%). The use phase of the car-body has the largest environmental impact in all scenarios, with near-negligible contributions from the other phases. Since steel rail carriages weigh more (by 27-51%), their use phase cost is correspondingly higher, resulting in both the greatest environmental impact and the highest life cycle cost. Compared to the steel scenario, the hybrid composite variant has a 16% lower life cycle cost and a 26% lower environmental impact. Although the full-composite rail carriage has the highest manufacturing cost, it results in the lowest total life cycle costs and the lowest environmental impacts.
Conclusions and recommendations: This coupled cost and life cycle assessment showed that the full-composite variant was the optimum solution. The case study showed that coupling technical cost models with life cycle assessment offers an efficient route to evaluating economic and environmental performance accurately and consistently.
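The phase-by-phase bookkeeping behind such coupled cost/impact results can be illustrated in a few lines. All figures below are invented placeholders, not the TTX study's data; only the aggregation pattern matters.

```python
# Toy aggregation of coupled cost / environmental-impact data per life cycle
# phase, for three car-body variants. Numbers are arbitrary units.
variants = {
    #                 (cost, impact) per phase
    "steel":          {"materials": (30, 4), "manufacturing": (25, 3),
                       "use": (60, 90), "end_of_life": (5, 3)},
    "hybrid":         {"materials": (35, 5), "manufacturing": (28, 4),
                       "use": (45, 65), "end_of_life": (5, 3)},
    "full_composite": {"materials": (40, 6), "manufacturing": (32, 5),
                       "use": (35, 50), "end_of_life": (6, 4)},
}

for name, phases in variants.items():
    total_cost = sum(c for c, _ in phases.values())
    total_impact = sum(i for _, i in phases.values())
    use_share = 100 * phases["use"][1] / total_impact
    print(f"{name:>14}: cost={total_cost:>4}  impact={total_impact:>4}  "
          f"use-phase impact share={use_share:.0f}%")
```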
Abstract:
The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concern decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult to handle with a geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms, by analyzing the quality and quantity of the spatially structured information extracted from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in decision-making processes.
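A minimal stand-in for the MLRSS idea: an MLP captures the long-range trend, and the residuals are simulated as a spatially correlated Gaussian field. Everything below (data, covariance parameters) is fabricated for illustration, and the direct Cholesky simulation is a simplification of the sequential simulation the paper actually uses.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Fabricated sample locations and values with a smooth trend plus noise.
coords = rng.uniform(0, 10, size=(200, 2))
values = np.sin(coords[:, 0]) + 0.1 * coords[:, 1] + rng.normal(0, 0.2, 200)

# 1. Long-range trend modeled by a multilayer perceptron.
trend_model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                           random_state=0).fit(coords, values)
residuals = values - trend_model.predict(coords)

# 2. Gaussian simulation of the residuals with an exponential covariance.
#    Sill and range are assumed here; in practice they come from
#    variography of the residuals.
sill, corr_range = residuals.var(), 1.5
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
cov = sill * np.exp(-d / corr_range)
L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(coords)))
realization = trend_model.predict(coords) + L @ rng.standard_normal(len(coords))

# Many such realizations yield the uncertainty and probability maps
# the abstract refers to.
print(realization[:5])
```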
Abstract:
Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered an optimal biomarker for assessing mercury exposure, the lack of harmonization of sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to human biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to the participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), each followed by a web conference, were organized between March 2011 and February 2012. The ICI/EQUAS used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories satisfied the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS round proved to be a new and effective tool for improving analytical performance and increasing capacity building. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale in implementing the Minamata Convention on Mercury.
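The between-laboratory relative standard deviation reported above can be computed as follows; the laboratory results in this sketch are invented for illustration.

```python
import numpy as np

# Sketch of the between-laboratory RSD used to judge an ICI/EQUAS round.
def between_lab_rsd(lab_means):
    """RSD (%) of the laboratory mean results for one hair sample."""
    lab_means = np.asarray(lab_means, dtype=float)
    return 100.0 * lab_means.std(ddof=1) / lab_means.mean()

# Hypothetical results from 16 laboratories on a low-range sample (~0.5):
low_range = np.array([0.47, 0.52, 0.49, 0.55, 0.50, 0.44, 0.53, 0.51,
                      0.48, 0.46, 0.54, 0.50, 0.49, 0.52, 0.47, 0.51])
print(f"between-lab RSD: {between_lab_rsd(low_range):.1f}%")   # ~6%
```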