16 results for project state

in CentAUR: Central Archive, University of Reading - UK


Relevance:

40.00%

Publisher:

Abstract:

Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in methods to calculate wetland size and location, with some models simulating wetland area prognostically, while other models relied on remotely sensed inundation datasets, or an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models demonstrate extensive disagreement in their simulations of wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that independently determine wetland area. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. 
In response to increasing global temperatures (+3.4 °C globally spatially uniform), on average, the models decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9 % globally spatially uniform) with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently lack wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate due to extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
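The ±40% figure quoted above is simply the half-range of the model estimates expressed as a fraction of the all-model mean. A minimal sketch of that arithmetic, using hypothetical per-model totals chosen so that the mean matches the quoted 190 Tg CH4/yr (these are not the actual WETCHIMP values):

```python
# Spread of annual global CH4 emissions across an ensemble of models,
# expressed as a percentage of the all-model mean (hypothetical values).

def spread_percent(emissions):
    """Half-range of model estimates as a percentage of the ensemble mean."""
    mean = sum(emissions) / len(emissions)
    half_range = (max(emissions) - min(emissions)) / 2.0
    return 100.0 * half_range / mean

# Hypothetical Tg CH4/yr totals for ten models (not the WETCHIMP numbers):
models = [114, 150, 170, 180, 190, 195, 200, 220, 215, 266]
mean = sum(models) / len(models)
print(f"ensemble mean: {mean:.0f} Tg CH4/yr")
print(f"spread: ±{spread_percent(models):.0f}% of the mean")
```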

Relevance:

40.00%

Publisher:

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.

Relevance:

30.00%

Publisher:

Abstract:

Decadal prediction uses climate models forced by changing greenhouse gases, as in Intergovernmental Panel on Climate Change (IPCC) projections, but unlike longer-range predictions it also requires initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability.
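Since skill depends sensitively on the definition used to measure it, one common definition is worth spelling out: the mean-squared-error skill score against a reference forecast such as climatology. A minimal sketch with hypothetical anomaly values (the actual skill metrics used with DePreSys are not reproduced here):

```python
def mse(pred, obs):
    """Mean squared error between a forecast sequence and observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(forecast, reference, obs):
    """MSE skill score: 1 = perfect, 0 = no better than reference, <0 = worse."""
    return 1.0 - mse(forecast, obs) / mse(reference, obs)

# Hypothetical annual temperature anomalies (K), not real DePreSys output:
obs         = [0.1, 0.3, 0.2, 0.5, 0.4]    # observed
forecast    = [0.0, 0.25, 0.3, 0.45, 0.35] # initialized decadal forecast
climatology = [0.0] * len(obs)             # static climatology reference
print(f"skill vs climatology: {skill_score(forecast, climatology, obs):.2f}")
```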

Relevance:

30.00%

Publisher:

Abstract:

Existing data on animal health and welfare in organic livestock production systems in the European Community countries are reviewed in the light of the demands and challenges of the recently implemented EU regulation on organic livestock production. The main conclusions and recommendations of a three-year networking project on organic livestock production are summarised and the future challenges to organic livestock production in terms of welfare and health management are discussed. The authors conclude that, whilst the available data are limited and the implementation of the EC regulation is relatively recent, there is little evidence to suggest that organic livestock management causes major threats to animal health and welfare in comparison with conventional systems. There are, however, some well-identified areas, like parasite control and balanced ration formulation, where efforts are needed to find solutions that meet with organic standard requirements and guarantee high levels of health and welfare. It is suggested that, whilst organic standards offer an implicit framework for animal health and welfare management, there is a need to solve apparent conflicts between the organic farming objectives in regard to environment, public health, farmer income and animal health and welfare. The key challenges for the future of organic livestock production in Europe are related to the feasibility of implementing improved husbandry inputs and the development of evidence-based decision support systems for health and feeding management.

Relevance:

30.00%

Publisher:

Abstract:

Current e-learning systems are of increasing importance in higher education. However, neither the state of the art nor the state of the practice of e-learning applications achieves the level of interactivity that current learning theories advocate. In this paper, the possibility of enhancing e-learning systems to achieve deep learning has been studied by replicating an experiment in which students had to learn basic software engineering principles. One group learned these principles using a static approach, while the other group learned the same principles using a system-dynamics-based approach, which provided interactivity and feedback. The results show that, quantitatively, the latter group achieved a better understanding of the principles; furthermore, qualitatively, they enjoyed the learning experience.

Relevance:

30.00%

Publisher:

Abstract:

Many weeds occur in patches but farmers frequently spray whole fields to control the weeds in these patches. Given a geo-referenced weed map, technology exists to confine spraying to these patches. Adoption of patch spraying by arable farmers has, however, been negligible partly due to the difficulty of constructing weed maps. Building on previous DEFRA and HGCA projects, this proposal aims to develop and evaluate a machine vision system to automate the weed mapping process. The project thereby addresses the principal technical stumbling block to widespread adoption of site specific weed management (SSWM). The accuracy of weed identification by machine vision based on a single field survey may be inadequate to create herbicide application maps. We therefore propose to test the hypothesis that sufficiently accurate weed maps can be constructed by integrating information from geo-referenced images captured automatically at different times of the year during normal field activities. Accuracy of identification will also be increased by utilising a priori knowledge of weeds present in fields. To prove this concept, images will be captured from arable fields on two farms and processed offline to identify and map the weeds, focussing especially on black-grass, wild oats, barren brome, couch grass and cleavers. As advocated by Lutman et al. (2002), the approach uncouples the weed mapping and treatment processes and builds on the observation that patches of these weeds are quite stable in arable fields. There are three main aspects to the project. 1) Machine vision hardware. Hardware component parts of the system are one or more cameras connected to a single board computer (Concurrent Solutions LLC) and interfaced with an accurate Global Positioning System (GPS) supplied by Patchwork Technology. The camera(s) will take separate measurements for each of the three primary colours of visible light (red, green and blue) in each pixel. 
The basic proof of concept can be achieved in principle using a single camera system, but in practice systems with more than one camera may need to be installed so that larger fractions of each field can be photographed. Hardware will be reviewed regularly during the project in response to feedback from other work packages and updated as required. 2) Image capture and weed identification software. The machine vision system will be attached to toolbars of farm machinery so that images can be collected during different field operations. Images will be captured at different ground speeds, in different directions and at different crop growth stages as well as in different crop backgrounds. Having captured geo-referenced images in the field, image analysis software will be developed to identify weed species by Murray State and Reading Universities with advice from The Arable Group. A wide range of pattern recognition and in particular Bayesian Networks will be used to advance the state of the art in machine vision-based weed identification and mapping. Weed identification algorithms used by others are inadequate for this project as we intend to collect and correlate images collected at different growth stages. Plants grown for this purpose by Herbiseed will be used in the first instance. In addition, our image capture and analysis system will include plant characteristics such as leaf shape, size, vein structure, colour and textural pattern, some of which are not detectable by other machine vision systems or are omitted by their algorithms. Using such a list of features observable using our machine vision system, we will determine those that can be used to distinguish weed species of interest. 3) Weed mapping. Geo-referenced maps of weeds in arable fields (Reading University and Syngenta) will be produced with advice from The Arable Group and Patchwork Technology. 
Natural infestations will be mapped in the fields but we will also introduce specimen plants in pots to facilitate more rigorous system evaluation and testing. Manual weed maps of the same fields will be generated by Reading University, Syngenta and Peter Lutman so that the accuracy of automated mapping can be assessed. The principal hypothesis and concept to be tested is that by combining maps from several surveys, a weed map with acceptable accuracy for end-users can be produced. If the concept is proved and can be commercialised, systems could be retrofitted at low cost onto existing farm machinery. The outputs of the weed mapping software would then link with the precision farming options already built into many commercial sprayers, allowing their use for targeted, site-specific herbicide applications. Immediate economic benefits would, therefore, arise directly from reducing herbicide costs. SSWM will also reduce the overall pesticide load on the crop and so may reduce pesticide residues in food and drinking water, and reduce adverse impacts of pesticides on non-target species and beneficials. Farmers may even choose to leave unsprayed some non-injurious, environmentally beneficial, low-density weed infestations. These benefits fit very well with the anticipated legislation emerging in the new EU Thematic Strategy for Pesticides, which will encourage more targeted use of pesticides and greater uptake of Integrated Crop (Pest) Management approaches, and also with the requirements of the Water Framework Directive to reduce levels of pesticides in water bodies. The greater precision of weed management offered by SSWM is therefore a key element in preparing arable farming systems for the future, where policy makers and consumers want to minimise pesticide use and the carbon footprint of farming while maintaining food production and security.
The mapping technology could also be used on organic farms to identify areas of fields needing mechanical weed control, thereby reducing both carbon footprints and damage to crops by, for example, spring tines. Objectives: i. To develop a prototype machine vision system for automated image capture during agricultural field operations; ii. To prove the concept that images captured by the machine vision system over a series of field operations can be processed to identify and geo-reference specific weeds in the field; iii. To generate weed maps from the geo-referenced weed plants/patches identified in objective (ii).

Relevance:

30.00%

Publisher:

Abstract:

An assessment of the fifth Coupled Model Intercomparison Project (CMIP5) models' simulation of the near-surface westerly wind jet position and strength over the Atlantic, Indian and Pacific sectors of the Southern Ocean is presented. Compared with reanalysis climatologies there is an equatorward bias of 3.7° (inter-model standard deviation of ±2.2°) in the ensemble mean position of the zonal mean jet. The ensemble mean strength is biased slightly too weak, with the largest biases over the Pacific sector (-1.6 ± 1.1 m/s, -22%). An analysis of atmosphere-only (AMIP) experiments indicates that 41% of the zonal mean position bias comes from coupling of the ocean/ice models to the atmosphere. The response to future emissions scenarios (RCP4.5 and RCP8.5) is characterized by two phases: (i) the period of most rapid ozone recovery (2000-2049), during which there is no significant change in summer; and (ii) the period 2050-2098, during which RCP4.5 simulations show no significant change but RCP8.5 simulations show poleward shifts (0.30, 0.19 and 0.28°/decade over the Atlantic, Indian and Pacific sectors respectively) and increases in strength (0.06, 0.08 and 0.15 m/s/decade respectively). The models with larger equatorward position biases generally show larger poleward shifts (i.e. state dependence). This inter-model relationship is strongest over the Pacific sector (r = -0.89) and insignificant over the Atlantic sector (r = -0.50). However, an assessment of jet structure shows that over the Atlantic sector the jet shift is significantly correlated with jet width, whereas over the Pacific sector the distance between the sub-polar and sub-tropical westerly jets appears to be more important.
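The state dependence reported above is a cross-model Pearson correlation between the historical position bias and the projected shift. A sketch of that computation with made-up per-model values (not the CMIP5 numbers):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical values per model: equatorward position bias (degrees)
# against projected poleward shift (degrees/decade, negative = poleward):
bias  = [1.0, 2.5, 3.0, 4.5, 6.0, 7.0]
shift = [-0.10, -0.18, -0.22, -0.30, -0.38, -0.45]
print(f"r = {pearson_r(bias, shift):.2f}")
```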

Relevance:

30.00%

Publisher:

Abstract:

To investigate the effects of the middle atmosphere on climate, the World Climate Research Programme is supporting the project "Stratospheric Processes and their Role in Climate" (SPARC). A central theme of SPARC, to examine model simulations of the coupled troposphere-middle atmosphere system, is being performed through the initiative called GRIPS (GCM-Reality Intercomparison Project for SPARC). In this paper, an overview of the objectives of GRIPS is given. Initial activities include an assessment of the performance of middle atmosphere climate models, and preliminary results from this evaluation are presented here. It is shown that although all 13 models evaluated represent most major features of the mean atmospheric state, there are deficiencies in the magnitude and location of the features, which cannot easily be traced to the formulation (resolution or the parameterizations included) of the models. Most models show a cold bias in all locations, apart from the tropical tropopause region where they can be either too warm or too cold. The strengths and locations of the major jets are often misrepresented in the models. Looking at three-dimensional fields reveals, for some models, more severe deficiencies in the magnitude and positioning of the dominant structures (such as the Aleutian high in the stratosphere), although undersampling might explain some of these differences from observations. All the models have shortcomings in their simulations of the present-day climate, which might limit the accuracy of predictions of the climate response to ozone change and other anomalous forcing.

Relevance:

30.00%

Publisher:

Abstract:

While state-of-the-art models of Earth's climate system have improved tremendously over the last 20 years, nontrivial structural flaws still hinder their ability to forecast the decadal dynamics of the Earth system realistically. Contrasting the skill of these models not only with each other but also with empirical models can reveal the space and time scales on which simulation models exploit their physical basis effectively and quantify their ability to add information to operational forecasts. The skill of decadal probabilistic hindcasts for annual global-mean and regional-mean temperatures from the EU Ensemble-Based Predictions of Climate Changes and Their Impacts (ENSEMBLES) project is contrasted with several empirical models. Both the ENSEMBLES models and a "dynamic climatology" empirical model show probabilistic skill above that of a static climatology for global-mean temperature. The dynamic climatology model, however, often outperforms the ENSEMBLES models. The fact that empirical models display skill similar to that of today's state-of-the-art simulation models suggests that empirical forecasts can improve decadal forecasts for climate services, just as in weather, medium-range, and seasonal forecasting. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast evaluations. Doing so would clarify the extent to which state-of-the-art simulation models provide information beyond that available from simpler empirical models and clarify current limitations in using simulation forecasting for decision support. Ultimately, the skill of simulation models based on physical principles is expected to surpass that of empirical models in a changing climate; their direct comparison provides information on progress toward that goal, which is not available in model–model intercomparisons.
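A "dynamic climatology" benchmark can be as simple as a trailing-window mean that tracks slow change, in contrast to a static long-term mean. A hypothetical sketch of the two reference forecasts (the ENSEMBLES study's actual empirical models are more elaborate):

```python
def static_climatology(history):
    """Forecast = long-term mean of all past values."""
    return sum(history) / len(history)

def dynamic_climatology(history, window=10):
    """Forecast = mean of the most recent `window` values, tracking slow trends."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Hypothetical warming series of annual global-mean temperature anomalies (K):
history = [0.01 * t for t in range(50)]  # steady 0.01 K/yr trend
print(static_climatology(history))   # lags far behind current conditions
print(dynamic_climatology(history))  # closer to the recent state
```

Under a trend, the dynamic reference sits closer to the verifying observations, which is why it is a harder benchmark for initialized forecasts to beat.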

Relevance:

30.00%

Publisher:

Abstract:

A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project has been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters, usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea ice parameters. The eddy-permitting nature of the global reanalyses also makes it possible to estimate eddy kinetic energy. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was done during the MyOcean project and we conclude that data assimilation is crucial for correctly simulating some quantities such as regional trends of sea level as well as the eddy kinetic energy. A second objective is to show that the ensemble mean of reanalyses can be evaluated as one single system regarding its reliability in reproducing the climate signals, where both variability and uncertainties are assessed through the ensemble spread and signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of the uncertainty in the assimilation methods.
This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that an eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
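The ensemble spread and signal-to-noise ratio used to gauge reliability are simple statistics across the reanalyses. A minimal sketch for a single quantity, with hypothetical values from four reanalyses:

```python
import math

def ensemble_stats(members):
    """Ensemble mean, spread (std across members), and signal-to-noise ratio."""
    n = len(members)
    mean = sum(members) / n
    spread = math.sqrt(sum((m - mean) ** 2 for m in members) / (n - 1))
    snr = abs(mean) / spread if spread > 0 else float("inf")
    return mean, spread, snr

# Hypothetical regional sea-level trend estimates (mm/yr) from four reanalyses:
members = [2.8, 3.1, 3.4, 2.9]
mean, spread, snr = ensemble_stats(members)
print(f"mean={mean:.2f} mm/yr, spread={spread:.2f} mm/yr, SNR={snr:.1f}")
```

A high SNR marks a signal on which the systems agree well; a low SNR flags a quantity dominated by differences between the assimilation methods.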

Relevance:

30.00%

Publisher:

Abstract:

Uncertainty in ocean analysis methods and deficiencies in the observing system are major obstacles to the reliable reconstruction of the past ocean climate. The variety of existing ocean reanalyses is exploited in a multi-reanalysis ensemble to improve the ocean state estimation and to gauge uncertainty levels. The ensemble-based analysis of signal-to-noise ratio allows the identification of ocean characteristics for which the estimation is robust (such as tropical mixed-layer depth and upper ocean heat content) and those where large uncertainty exists (deep ocean, Southern Ocean, sea ice thickness, salinity), providing guidance for future enhancement of the observing and data assimilation systems.

Relevance:

30.00%

Publisher:

Abstract:

Ocean–sea ice reanalyses are crucial for assessing the variability and recent trends in the Arctic sea ice cover. This is especially true for sea ice volume, as long-term and large-scale sea ice thickness observations are nonexistent. Results from the Ocean ReAnalyses Intercomparison Project (ORA-IP) are presented, with a focus on Arctic sea ice fields reconstructed by state-of-the-art global ocean reanalyses. Differences between the various reanalyses are explored in terms of the effects of data assimilation, model physics and atmospheric forcing on properties of the sea ice cover, including concentration, thickness, velocity and snow. Amongst the 14 reanalyses studied here, 9 assimilate sea ice concentration, and none assimilate sea ice thickness data. The comparison reveals an overall agreement in the reconstructed concentration fields, mainly because of the constraints in surface temperature imposed by direct assimilation of ocean observations, prescribed or assimilated atmospheric forcing and assimilation of sea ice concentration. However, some spread still exists amongst the reanalyses, due to a variety of factors. In particular, a large spread in sea ice thickness is found within the ensemble of reanalyses, partially caused by the biases inherited from their sea ice model components. Biases are also affected by the assimilation of sea ice concentration and the treatment of sea ice thickness in the data assimilation process. An important outcome of this study is that the spatial distribution of ice volume varies widely between products, with no reanalysis standing out as clearly superior when compared to altimetry estimates. The ice thickness from systems without assimilation of sea ice concentration is not worse than that from systems constrained with sea ice observations. An evaluation of the sea ice velocity fields reveals that ice drifts too fast in most systems.
As an ensemble, the ORA-IP reanalyses capture trends in Arctic sea ice area and extent relatively well. However, the ensemble cannot be used to obtain a robust estimate of recent trends in Arctic sea ice volume. Biases in the reanalyses certainly impact the simulated air–sea fluxes in the polar regions, and call into question the suitability of current sea ice reanalyses to initialize seasonal forecasts.