997 results for Communicating environmental geoscience


Relevance:

30.00%

Publisher:

Abstract:

This article reports major results from collaborative research between France and Brazil on soil and water systems, carried out in the Upper Amazon Basin. It reveals the weathering processes acting in the partly inundated, low-elevation plateaus of the Basin, mostly covered by evergreen forest. Our findings are based on geochemical data and mineral spectroscopy that probe the crystal chemistry of Fe and Al in mineral phases (mainly kaolinite and Al- and Fe-(hydr)oxides) of tropical soils (laterites). These techniques reveal crystal alterations in mineral populations of different ages and changes in metal speciation associated with mineral or organic phases. These results provide an integrated model of soil formation and change (from laterites to podzols) in distinct hydrological compartments of the Amazon landscapes and under altered water regimes. © 2010 Académie des sciences. Published by Elsevier Masson SAS. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Science communication, including extension services, plays a key role in achieving sustainable native vegetation management. One of the pivotal aspects of the debate on sustainable vegetation management is the scientific information underpinning policy-making. In recent years, extension services have shifted their focus from top-down technology transfer to bottom-up participation and empowerment. There has also been a broadening of communication strategies to recognise the range of stakeholders involved in native vegetation management and to encompass environmental concerns. This paper examines the differences between government approaches to extension services to deliver policy and the need for effective communication to address broader science issues that underpin native vegetation management. The importance of knowing the learning styles of the stakeholders involved in native vegetation management is discussed, at a time of increasing reliance on mass communication for information exchange, as is the importance of personal communication for achieving on-ground sustainable management. Critical factors for effective science-management communication are identified, such as: (i) undertaking scientific studies (research) with community involvement, acceptance and an agreed understanding of project objectives; (ii) realistic community consultation periods; (iii) matching communication channels with stakeholder needs; (iv) combining scientific with local knowledge in a holistic (biophysical and social) approach to understanding an issue; and (v) regional partnerships. These communication factors are considered essential to implementing on-ground natural resource management strategies and actions, including those concerned with native vegetation management.

Relevance:

30.00%

Publisher:

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: the potential toxic and safety hazards of nanomaterials throughout their lifecycles; the fate and persistence of nanoparticles in humans, animals and the environment; the risks associated with nanoparticle exposure; participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; the development of best practice guidelines; voluntary schemes on responsibility; and databases of materials, research topics and themes. These findings show that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Consequently, NIN will encourage stakeholders to be active members. These survey findings will be used to improve NIN's communication tools and to further build on interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the problems of analysing and modelling geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the incorporation of real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and models' inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazards risk analysis (avalanches, landslides); and assessments of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional models of geostatistics.
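As a rough, hedged illustration of the regression/mapping task mentioned above, the sketch below fits a Support Vector Machine regressor to synthetic geo-feature data with scikit-learn. The feature names, the synthetic target function and the hyperparameter grid are assumptions made for this example only; they are not the models, data or case studies of the report.

```python
# Minimal sketch (not the authors' code): SVM regression for spatial mapping
# of an environmental variable from geo-features, using scikit-learn.
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic "geo-feature space": x, y coordinates plus terrain-derived features
# (e.g. elevation, slope) that would normally come from a DEM or remote sensing.
n = 500
X = rng.uniform(0.0, 10.0, size=(n, 4))          # [x, y, elevation, slope]
y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * X[:, 2] + rng.normal(0, 0.05, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardise the features and tune SVR hyperparameters by cross-validation,
# mirroring the adaptive, data-driven character of the approach described above.
model = GridSearchCV(
    make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    param_grid={"svr__C": [1.0, 10.0, 100.0], "svr__gamma": ["scale", 0.1, 1.0]},
    cv=5,
)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))
```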

Relevance:

30.00%

Publisher:

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:

• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.

These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, a workshop was organised by NIN focused on building a sustainable multi-stakeholder dialogue. Specific questions were asked of different stakeholder groups to encourage discussion and open communication.

1. What information do stakeholders need from researchers, and why? The discussions about this question confirmed the needs identified in the targeted phone calls.

2. How to communicate information? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise was needed in the areas of commercial law and economics for a well-informed treatment of this communication issue.

3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have been proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced; thus, there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling of products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since the issue of nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, and this may have an impact on nanotechnology as a whole.

4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.

NIN will continue with an active stakeholder dialogue to further build on interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance:

30.00%

Publisher:

Abstract:

Growing Iowa’s economy and sustaining vibrant, healthy communities depends upon protecting the state’s environmental and natural resources. The Iowa Environmental Guide is designed to identify Iowa’s environmental compliance requirements and to provide expert resources to assist you with sustainable business solutions that protect the environment and enhance your bottom line. The Iowa Department of Economic Development (IDED) Business and Regulatory Assistance Team is a non-regulatory, confidential point of contact to assist your company or organization in identifying permitting requirements, finding expert resources, communicating with the appropriate agencies, and establishing a productive partnership with Iowa state government.

Relevance:

30.00%

Publisher:

Abstract:

This research explored environmental sustainability (ES) initiatives at five top-ranked Ontario golf courses that were members of the Audubon Cooperative Sanctuary Program for Golf (ACSP). The research questions were: (1) How are golf courses adapting to safeguard the natural environment? (2) Why are golf courses moving, or not moving, towards ES? and (3) What barriers to ES arise in golf, how can they be overcome, and what role does communication play? Overall, the research was framed with an adaptation of the dimensions of convergence by Houlihan (2012), including the motives, inputs, implementation, momentum, and impact. Additionally, impression management and message framing constructs were utilized to address the issue of communicating ES initiatives. Data collection involved in-depth interviews, observations, and unobtrusive document collection. Environmental aspects of the examination were guided by the Canadian Standards Association (CSA) Requirements and Guidance for Organizers of Sustainable Events and the Sustainable Sport and Event Toolkit (SSET).

Relevance:

30.00%

Publisher:

Abstract:

Laser beams emitted from the Geoscience Laser Altimeter System (GLAS), as well as other spaceborne laser instruments, can only penetrate clouds to a limit of a few optical depths. As a result, only optical depths of thinner clouds (< about 3 for GLAS) are retrieved from the reflected lidar signal. This paper presents a comprehensive study of possible retrievals of optical depth of thick clouds using solar background light and treating GLAS as a solar radiometer. To do so one must first calibrate the reflected solar radiation received by the photon-counting detectors of the GLAS 532-nm channel, the primary channel for atmospheric products. Solar background radiation is regarded as noise to be subtracted in the retrieval process of the lidar products. However, once calibrated, it becomes a signal that can be used in studying the properties of optically thick clouds. In this paper, three calibration methods are presented: (i) calibration with coincident airborne and GLAS observations, (ii) calibration with coincident Geostationary Operational Environmental Satellite (GOES) and GLAS observations of deep convective clouds, and (iii) calibration from first principles using optical depth of thin water clouds over ocean retrieved by GLAS active remote sensing. Results from the three methods agree well with each other. Cloud optical depth (COD) is retrieved from the calibrated solar background signal using a one-channel retrieval. Comparison with COD retrieved from GOES during GLAS overpasses shows that the average difference between the two retrievals is 24%. As an example, the COD values retrieved from GLAS solar background are illustrated for a marine stratocumulus cloud field that is too thick to be penetrated by the GLAS laser. Based on this study, optical depths for thick clouds will be provided as a supplementary product to the existing operational GLAS cloud products in future GLAS data releases.
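To make the idea of a one-channel retrieval concrete, here is a deliberately simplified sketch: it converts solar-background counts to a reflectance using an assumed calibration coefficient and then inverts an assumed, pre-computed reflectance-versus-COD table. None of the numbers, and none of the function names, come from the GLAS 532-nm algorithm; they are placeholders for illustration only.

```python
# Illustrative sketch only: a generic one-channel lookup-table inversion of
# cloud optical depth (COD) from a calibrated background radiance. The table
# values, calibration coefficient and geometry below are placeholders, not
# GLAS coefficients.
import numpy as np

# Hypothetical pre-computed relation: top-of-atmosphere reflectance as a
# monotonically increasing function of COD for fixed sun/view geometry
# (in a real retrieval this comes from a radiative transfer model).
cod_grid = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
reflectance_grid = np.array([0.10, 0.17, 0.28, 0.42, 0.56, 0.67, 0.74])

def retrieve_cod(background_counts, cal_coeff, solar_irradiance, mu0):
    """Convert solar-background counts to reflectance, then invert the table."""
    radiance = cal_coeff * background_counts                 # counts -> radiance
    reflectance = np.pi * radiance / (solar_irradiance * mu0)
    # Interpolate COD from the monotonic reflectance-COD relation.
    return np.interp(reflectance, reflectance_grid, cod_grid)

# Example with made-up numbers:
print(retrieve_cod(background_counts=3.0e4, cal_coeff=1.0e-5,
                   solar_irradiance=1.8, mu0=0.8))
```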

Relevance:

30.00%

Publisher:

Abstract:

Several biotic crises during the past 300 million years have been linked to episodes of continental flood basalt volcanism, and in particular to the release of massive quantities of magmatic sulphur gas species. Flood basalt provinces were typically formed by numerous individual eruptions, each lasting years to decades. However, the environmental impact of these eruptions may have been limited by the occurrence of quiescent periods that lasted hundreds to thousands of years. Here we use a global aerosol model to quantify the sulphur-induced environmental effects of individual, decade-long flood basalt eruptions representative of the Columbia River Basalt Group, 16.5–14.5 million years ago, and the Deccan Traps, 65 million years ago. For a decade-long eruption of Deccan scale, we calculate a decadal-mean reduction in global surface temperature of 4.5 K, which would recover within 50 years after an eruption ceased unless climate feedbacks were very different in deep-time climates. Acid mists and fogs could have caused immediate damage to vegetation in some regions, but acid-sensitive land and marine ecosystems were well-buffered against volcanic sulphur deposition effects even during century-long eruptions. We conclude that magmatic sulphur from flood basalt eruptions would have caused a biotic crisis only if eruption frequencies and lava discharge rates had been high and sustained for several centuries at a time.
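For orientation only: the sketch below is not the global aerosol model used in the study, just a zero-dimensional energy-balance toy with assumed round-number values for the forcing, feedback parameter and heat capacity. It shows qualitatively why a decade-long negative sulphate forcing cools the surface and why the temperature anomaly decays back over a few decades once the eruption ends.

```python
# Toy illustration only (not the study's global aerosol model): a
# zero-dimensional energy-balance model of surface cooling during a
# decade-long eruption and recovery afterwards. All values are assumed.
lam = 1.2       # climate feedback parameter, W m-2 K-1 (assumed)
C = 3.0e8       # effective heat capacity of an ocean mixed layer, J m-2 K-1 (assumed)
F_erupt = -6.0  # sulphate aerosol forcing while the eruption is active, W m-2 (assumed)

dt = 86400.0 * 30            # one-month time step, in seconds
T = 0.0                      # surface temperature anomaly, K
for step in range(80 * 12):  # integrate 80 years
    t_years = step * dt / (86400.0 * 365.0)
    F = F_erupt if t_years < 10.0 else 0.0  # decade-long eruption, then quiescence
    T += dt * (F - lam * T) / C             # C dT/dt = F - lam * T
    if step % 120 == 0:                     # report roughly every 10 years
        print(f"year {t_years:5.1f}: temperature anomaly {T:6.2f} K")
```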

Relevance:

30.00%

Publisher:

Abstract:

Recent developments in service-oriented and distributed computing have created exciting opportunities for the integration of models in service chains to create the Model Web. This offers the potential for orchestrating web data and processing services in complex chains: a flexible approach which exploits the increased access to products and tools, and the scalability offered by the Web. However, the uncertainty inherent in data and models must be quantified and communicated in an interoperable way in order for its effects to be effectively assessed as errors propagate through complex automated model chains. We describe a proposed set of tools for handling, characterizing and communicating uncertainty in this context, and show how they can be used to 'uncertainty-enable' Web Services in a model chain. An example implementation is presented, which combines environmental and publicly-contributed data to produce estimates of sea-level air pressure, with estimates of uncertainty which incorporate the effects of model approximation as well as the uncertainty inherent in the observational and derived data.
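As a hedged illustration of why uncertainty must travel with data through a model chain, the sketch below propagates observational uncertainty through two toy "services" by Monte Carlo sampling and reports a sea-level pressure estimate with a spread. The two functions, the error magnitudes and the station values are assumptions for this example; they are not the paper's tools or Web Service interfaces.

```python
# Minimal sketch: Monte Carlo propagation of observational uncertainty
# through a two-step model chain, so the chained output carries an
# uncertainty estimate rather than a single value.
import numpy as np

rng = np.random.default_rng(42)

def temp_at_station(temp_nearby_c, dz_m, lapse_k_per_m=0.0065):
    """Toy first service: adjust a nearby temperature to the station elevation."""
    return temp_nearby_c - lapse_k_per_m * dz_m

def reduce_to_sea_level(p_station_hpa, elevation_m, temp_c):
    """Toy second service: barometric reduction of station pressure to sea level."""
    g, R = 9.80665, 287.05
    return p_station_hpa * np.exp(g * elevation_m / (R * (temp_c + 273.15)))

# Observations with stated (assumed) standard uncertainties.
n = 10_000
temp_nearby = rng.normal(18.0, 0.8, n)   # deg C, e.g. a publicly-contributed sensor
p_station = rng.normal(1003.0, 1.5, n)   # hPa, station barometer
elevation, dz = 250.0, 120.0             # metres, treated as exact here

# Run the full chain on every sample; the spread of the outputs is the
# propagated uncertainty of the chained product.
out = reduce_to_sea_level(p_station, elevation, temp_at_station(temp_nearby, dz))
print(f"sea-level pressure: {out.mean():.2f} +/- {out.std(ddof=1):.2f} hPa")
```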

Relevance:

30.00%

Publisher:

Abstract:

The automatic interpolation of environmental monitoring network data, such as air quality or radiation levels, in a real-time setting poses a number of practical and theoretical questions. Among the problems found are (i) dealing with and communicating the uncertainty of predictions, (ii) automatic (hyper)parameter estimation, (iii) monitoring network heterogeneity, (iv) dealing with outlying extremes, and (v) quality control. In this paper we discuss these issues in light of the spatial interpolation comparison exercise held in 2004.
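As one possible, hedged illustration of points (i) and (ii) above, the sketch below interpolates a synthetic monitoring-network field with a Gaussian process (closely related to kriging) in scikit-learn, where the covariance (hyper)parameters are estimated automatically during fitting and each prediction comes with a standard deviation. The synthetic stations and values are assumptions; this is not one of the methods compared in the 2004 exercise.

```python
# Minimal sketch: automatic spatial interpolation of monitoring-network data
# with a Gaussian process; kernel (hyper)parameters are fitted automatically
# by maximising the marginal likelihood, and predictions come with an
# uncertainty estimate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Synthetic monitoring network: station coordinates (km) and measured values.
stations = rng.uniform(0.0, 100.0, size=(200, 2))
values = (np.sin(stations[:, 0] / 15.0) + np.cos(stations[:, 1] / 20.0)
          + rng.normal(0.0, 0.1, len(stations)))

# The RBF length scale and noise level are estimated automatically in fit()
# (automatic hyperparameter estimation); the predictive standard deviation
# provides the per-location uncertainty to be communicated.
kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(stations, values)

grid = np.array([[25.0, 40.0], [60.0, 75.0]])   # locations to interpolate
pred, sd = gp.predict(grid, return_std=True)
for (x, y), m, s in zip(grid, pred, sd):
    print(f"({x:.0f}, {y:.0f}) km: {m:.2f} +/- {s:.2f}")
```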