Abstract:
In 'Avalanche', an object is lowered by a group of players who must stay in contact with it throughout. Normally the task is easily accomplished. However, with larger groups, counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity, a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. The problematic behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations, the Prisoners' Dilemma, and integrative bargaining situations. Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building, and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
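The central mechanism described above, goal-seeking (balancing) loops that can be overwhelmed when there are more players than degrees of freedom, can be caricatured in a few lines. The sketch below is a hypothetical toy, not the paper's model: each player intends to lower by a fixed amount per step, hand tremor adds noise, and the contact requirement places the object on the highest finger. All parameter values are made up for illustration.

import random

def simulate(n_players=8, steps=500, lower_rate=0.002, tremor=0.01, seed=1):
    # Toy one-degree-of-freedom lowering game (illustrative parameters only).
    # Each step every player aims a small distance below the current object
    # height; unintended hand movement (tremor) adds Gaussian noise to where
    # the finger actually ends up.
    rng = random.Random(seed)
    obj = 1.0
    trace = []
    for _ in range(steps):
        fingers = [obj - lower_rate + rng.gauss(0.0, tremor)
                   for _ in range(n_players)]
        obj = max(fingers)    # contact constraint: object rests on the highest finger
        trace.append(obj)
    return trace

# With one player the object descends on average; with eight players the
# expected maximum of the tremor exceeds the intended lowering per step and
# the object drifts upwards, echoing the counter-intuitive mode described above.
print(simulate(n_players=1)[-1], simulate(n_players=8)[-1])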
The AcrAB-TolC efflux system of Salmonella enterica serovar Typhimurium plays a role in pathogenesis
Abstract:
The ability of an isogenic set of mutants of Salmonella enterica serovar Typhimurium L354 (SL1344) with defined deletions in genes encoding components of tripartite efflux pumps, including acrB, acrD, acrF and tolC, to colonize chickens was determined in competition with L354. In addition, the ability of L354 and each mutant to adhere to, and invade, human embryonic intestinal cells and mouse monocyte macrophages was determined in vitro. The tolC and acrB knockout mutants were hyper-susceptible to a range of antibiotics, dyes and detergents; the tolC mutant was also more susceptible to acid pH and bile and grew more slowly than L354. Complementation of either gene ablated the phenotype. The tolC mutant adhered poorly to both cell types in vitro and was unable to invade macrophages. The acrB mutant adhered, but did not invade macrophages. In vivo, both the acrB mutant and the tolC mutant colonized poorly and did not persist in the avian gut, whereas the acrD and acrF mutants colonized and persisted as well as L354. These data indicate that the AcrAB-TolC system is important for the colonization of chickens by S. Typhimurium and that this system has a role in mediating adherence and uptake into target host cells.
Abstract:
An investigation is presented of a quasi-stationary convective system (QSCS) which occurred over the UK Southwest Peninsula on 21 July 2010. This system was remarkably similar in its location and structure to one which caused devastating flash flooding in the coastal village of Boscastle, Cornwall on 16 August 2004. However, in the 2010 case rainfall accumulations were around four times smaller and no flooding was recorded. The more extreme nature of the Boscastle case is shown to be related to three factors: (1) higher rain rates, associated with a warmer and moister tropospheric column and deeper convective clouds; (2) a more stationary system, due to slower evolution of the large-scale flow; and (3) distribution of the heaviest precipitation over fewer river catchments. Overall, however, the synoptic setting of the two events was broadly similar, suggesting that such conditions favour the development of QSCSs over the Southwest Peninsula. A numerical simulation of the July 2010 event was performed using a 1.5-km grid length configuration of the Met Office Unified Model. This reveals that convection was repeatedly initiated through lifting of low-level air parcels along a quasi-stationary coastal convergence line. Sensitivity tests are used to show that this convergence line was a sea breeze front which temporarily stalled along the coastline due to the retarding influence of an offshore-directed background wind component. Several deficiencies are noted in the 1.5-km model’s representation of the storm system, including delayed convective initiation; however, significant improvements are observed when the grid length is reduced to 500 m. These result in part from an improved representation of the convergence line, which enhances the associated low-level ascent allowing air parcels to more readily reach their level of free convection. The implications of this finding for forecasting convective precipitation are discussed.
Abstract:
This study aims to elucidate the key mechanisms controlling phytoplankton growth and decay within the Thames basin through the application of a modified version of an established river-algal model and comparison with observed stream water chlorophyll-a concentrations. The River Thames showed distinct simulated phytoplankton seasonality and behaviour, with high spring, moderate summer and low autumn chlorophyll-a concentrations. Three main sections were identified along the River Thames with different phytoplankton abundance and seasonality: (i) low chlorophyll-a concentrations from source to Newbridge; (ii) a steep concentration increase between Newbridge and Sutton; and (iii) high concentrations with a moderate further increase from Sutton to the end of the study area (Maidenhead). However, local hydrological (e.g. locks) and other conditions (e.g. radiation, water depth, grazer dynamics) affected the simulated growth and losses. The model achieved good simulation results during both calibration and testing across a range of hydrological and nutrient conditions. Simulated phytoplankton growth was controlled predominantly by residence time, but during medium-to-low flow periods available light, water temperature and herbivorous grazing defined algal community development. These results challenge the perceived importance of in-stream nutrient concentrations as the primary control on phytoplankton growth and death.
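To make the controls named above concrete, here is a minimal, hypothetical growth/loss balance for chlorophyll-a in a river reach; it is not the modified river-algal model used in the study, and every coefficient is a placeholder.

def chl_step(chl, light, temp, grazers, residence_time, dt=1.0,
             mu_max=1.5, k_light=100.0, q10=2.0, t_ref=20.0, graze_rate=0.05):
    # One daily step of a toy chlorophyll-a balance; all coefficients are
    # placeholders. Growth: maximum rate scaled by Monod light limitation and
    # a Q10 temperature factor. Losses: grazing proportional to grazer
    # density plus washout at a rate of 1/residence_time.
    light_lim = light / (light + k_light)
    temp_fac = q10 ** ((temp - t_ref) / 10.0)
    growth = mu_max * light_lim * temp_fac * chl
    losses = (graze_rate * grazers + 1.0 / residence_time) * chl
    return chl + (growth - losses) * dt

# Short residence times (high flow) make the washout term dominate whatever
# light and temperature allow; long residence times (low flow) hand control
# to light, temperature and grazing, as the abstract reports.
print(chl_step(10.0, light=250.0, temp=18.0, grazers=2.0, residence_time=5.0))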
Abstract:
The climate of the Earth, like planetary climates in general, is broadly controlled by solar irradiation, planetary albedo and emissivity, as well as the planet's rotation rate and its distribution of land (with its orography) and oceans. However, the majority of climate fluctuations that affect mankind are internal modes of the general circulation of the atmosphere and the oceans. Some of these modes, such as the El Niño-Southern Oscillation (ENSO), are quasi-regular and have some longer-term predictive skill; others, like the Arctic and Antarctic Oscillations, are chaotic and generally unpredictable beyond a few weeks. Studies using general circulation models indicate that internal processes dominate the regional climate and that some, such as ENSO events, even have distinct global signatures. This is one of the reasons why it is so difficult to separate internal climate processes from external ones caused, for example, by changes in greenhouse gases and solar irradiation. However, the accumulation of the warmest seasons in the latest two decades lends strong support to forcing by greenhouse gases. As models become more comprehensive, they reproduce a gradually broader range of internal processes, including those on longer time scales, further challenging the interpretation of the causes of past and present climate events.
Abstract:
As laid out in its Convention, ECMWF has eight different objectives. One of the major objectives will consist of the preparation, on a regular basis, of the data necessary for medium-range weather forecasts. The interpretation of this item is that the Centre will make forecasts once a day for a prediction period of up to 10 days. It is also evident that the Centre should not carry out any real weather forecasting but merely disseminate to the member countries the basic forecasting parameters with an appropriate resolution in space and time. It follows from this that the forecasting system at the Centre must, from the operational point of view, be functionally integrated with the Weather Services of the Member Countries. The operational interface between ECMWF and the Member Countries must be properly specified in order to give reasonable flexibility to both systems. The problem of making numerical atmospheric predictions for periods beyond 4-5 days differs substantially from 2-3 day forecasting. From the physical point of view, we can define a medium-range forecast as a forecast in which the initial disturbances have lost their individual structure. However, we are still interested in predicting the atmosphere in a similar way to short-range forecasting, which means that the model must be able to predict the dissipation and decay of the initial phenomena and the creation of new ones. With this definition, medium-range forecasting is indeed very difficult and generally regarded as more difficult than extended forecasts, where we usually only predict time and space mean values. The predictability of atmospheric flow has been extensively studied in recent years in theoretical investigations and by numerical experiments. As has been discussed elsewhere in this publication (see pp. 338 and 431), a 10-day forecast is apparently on the fringe of predictability.
Abstract:
It is shown how a renormalization technique, which is a variant of classical Krylov–Bogolyubov–Mitropol’skii averaging, can be used to obtain slow evolution equations for the vortical and inertia–gravity wave components of the dynamics in a rotating flow. The evolution equations for each component are obtained to second order in the Rossby number, and the nature of the coupling between the two is analyzed carefully. It is also shown how classical balance models such as quasigeostrophic dynamics and its second-order extension appear naturally as a special case of this renormalized system, thereby providing a rigorous basis for the slaving approach where only the fast variables are expanded. It is well known that these balance models correspond to a hypothetical slow manifold of the parent system; the method herein allows the determination of the dynamics in the neighborhood of such solutions. As a concrete illustration, a simple weak-wave model is used, although the method readily applies to more complex rotating fluid models such as the shallow-water, Boussinesq, primitive, and 3D Euler equations.
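A schematic of the slaving idea referred to above, in generic notation rather than the paper's: split the state into slow vortical variables $s$ and fast inertia-gravity wave variables $w$, with the Rossby number $\epsilon$ measuring the time-scale separation, and expand the fast variables as functions of the slow ones,

\begin{align}
\frac{\partial s}{\partial t} &= S(s,w), \qquad
\frac{\partial w}{\partial t} + \frac{1}{\epsilon}\,L\,w = W(s,w),\\
w &= w_0(s) + \epsilon\, w_1(s) + \epsilon^2\, w_2(s) + \mathcal{O}(\epsilon^3),\\
\frac{\partial s}{\partial t} &= S\bigl(s, w_0(s)\bigr)
  + \epsilon\,\frac{\partial S}{\partial w}\bigl(s, w_0(s)\bigr)\, w_1(s)
  + \mathcal{O}(\epsilon^2).
\end{align}

At leading order this recovers a balance model of quasigeostrophic type, and the higher-order terms give its second-order extension; the renormalization technique in the paper additionally yields a slow evolution equation for the wave component itself, which this schematic omits.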
Abstract:
A programmable data acquisition system allowing novel use of meteorological radiosondes for atmospheric science measurements is described. In its basic form it supports four analogue inputs at 16-bit resolution, and up to two further inputs at lower resolution which can instead be configured for digital instruments. It also provides multiple instrument power supplies (+8 V, +16 V, +5 V and -8 V) from the 9 V radiosonde battery. During a balloon flight encountering air temperatures from +17 °C to -66 °C, the worst-case voltage drift in the 5 V unipolar digitisation circuitry was 20 mV. The system opens up a new range of low-cost atmospheric research measurements by utilising radiosondes routinely launched internationally for weather forecasting purposes. No additional receiving equipment is required. Comparisons between the specially instrumented and standard meteorological radiosondes show a negligible effect of the additional instrumentation on the standard meteorological data.
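For orientation on the figures quoted above (a back-of-envelope reading of the abstract, not the paper's circuit analysis): a 16-bit digitisation of a 5 V unipolar range resolves about 76 microvolts per count, so the 20 mV worst-case drift corresponds to roughly 260 counts, or about 0.4% of full scale.

def counts_to_volts(counts, full_scale=5.0, bits=16):
    # Convert a raw reading from a unipolar ADC to volts. The 5 V range and
    # 16-bit resolution match the figures quoted in the abstract; the
    # conversion itself is generic.
    return counts * full_scale / (2 ** bits)

lsb = counts_to_volts(1)       # one count: about 76.3 microvolts
drift_counts = 0.020 / lsb     # the 20 mV worst-case drift: about 262 counts
print(lsb, drift_counts)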
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several thousand-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the pre-industrial portions of the last millennium simulations are used to assess historical model carbon–climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
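As context for the 2× and 4× CO2 experiments mentioned above, the widely used simplified expression for CO2 radiative forcing, F = 5.35 ln(C/C0) W m-2 (Myhre et al., 1998), gives roughly 3.7 W m-2 per doubling; a toy equilibrium response then follows from an assumed climate sensitivity parameter. The sketch below uses that textbook expression and an assumed sensitivity, not the formulation of any particular EMIC.

import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    # Simplified CO2 radiative forcing in W m-2 (Myhre et al., 1998).
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(forcing, sensitivity=0.8):
    # Equilibrium warming in K for an assumed sensitivity of 0.8 K per W m-2.
    return sensitivity * forcing

print(co2_forcing(560.0))                         # about 3.7 W m-2 for 2x CO2
print(equilibrium_warming(co2_forcing(1120.0)))   # about 5.9 K for 4x CO2 (illustrative)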
Abstract:
Within the project SPURT (trace gas measurements in the tropopause region), a variety of trace gases have been measured in situ in order to investigate the role of dynamical and chemical processes in the extra-tropical tropopause region. In this paper we report on a flight on 10 November 2001 leading from Hohn, Germany (52°N) to Faro, Portugal (37°N) through a strongly developed deep stratospheric intrusion. This streamer was associated with a large convective system over the western Mediterranean with potentially significant troposphere-to-stratosphere transport. Along major parts of the flight we measured unexpectedly high NOy mixing ratios. H2O mixing ratios were also significantly higher than stratospheric background levels, confirming the extraordinary chemical signature of the probed air masses in the interior of the streamer. Backward trajectories encompassing the streamer make it possible to analyze the origin and physical characteristics of the air masses and to trace troposphere-to-stratosphere transport. Near the western flank of the streamer, features caused by long-range transport, such as tropospheric filaments characterized by sudden drops in the O3 and NOy mixing ratios and enhanced CO and H2O, can be reconstructed in great detail using the reverse domain filling technique. These filaments indicate a high potential for subsequent mixing with the stratospheric air. At the south-western edge of the streamer a strong gradient in the NOy and O3 mixing ratios coincides very well with a sharp gradient in potential vorticity in the ECMWF fields. In contrast, in the interior of the streamer the observed highly elevated NOy and H2O mixing ratios, up to a potential temperature level of 365 K and potential vorticity values of at most 10 PVU, cannot be explained in terms of resolved troposphere-to-stratosphere transport along the backward trajectories. Mesoscale simulations with the High Resolution Model (HRM) also reveal no direct evidence for convective H2O injection up to this level. Elevated H2O mixing ratios in the ECMWF and HRM fields are seen only up to about tropopause height, at 340 hPa and 270 hPa respectively, well below the flight altitude of about 200 hPa. However, forward tracing of the convective influence, as identified by satellite brightness temperature measurements and counts of lightning strokes, shows that during this part of the flight the aircraft was closely following the border of an air mass which was heavily impacted by convective activity over Spain and Algeria. This is evidence that deep convection at mid-latitudes may have a large impact on the tracer distribution of the lowermost stratosphere, reaching well above the thunderstorm anvils, as claimed by recent studies using cloud-resolving models.
Abstract:
Earth system models are increasing in complexity and incorporating more processes than their predecessors, making them important tools for studying the global carbon cycle. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes, with coupled climate-carbon cycle models that represent land-use change simulating total land carbon stores by 2100 that vary by as much as 600 Pg C given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous model evaluation methodologies. Here we assess the state-of-the-art with respect to evaluation of Earth system models, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeo data and (ii) metrics for evaluation, and discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute towards the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but it is also a challenge, as more knowledge about data uncertainties is required in order to determine robust evaluation methodologies that move the field of ESM evaluation from "beauty contest" toward the development of useful constraints on model behaviour.
Abstract:
Agro-hydrological models have been widely used for optimizing resource use and minimizing environmental consequences in agriculture. SMCRN is a recently developed, sophisticated model which simulates crop response to nitrogen fertilizer for a wide range of crops, and the associated leaching of nitrate from arable soils. In this paper, we describe improvements to this model in which the existing approximate hydrological cascade algorithm is replaced with a new, simple and explicit algorithm for the basic soil water flow equation. This not only enhanced the model's performance in hydrological simulation, but was also essential for extending the model's application to situations where capillary flow is important. As a result, the updated SMCRN model could be used for more accurate study of water dynamics in the soil-crop system. The success of the update was demonstrated by simulation results showing that the updated model consistently out-performed the original model in drainage simulations and in predicting the time course of soil water content in different layers of the soil-wheat system. Tests of the updated SMCRN model against data from four field crop experiments showed that crop nitrogen offtakes and soil mineral nitrogen in the top 90 cm were in good agreement with the measured values, indicating that the model could make more reliable predictions of nitrogen fate in the crop-soil system, and thus provides a useful platform to assess the impacts of nitrogen fertilizer on crop yield and nitrogen leaching from different production systems.
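The kind of "simple and explicit algorithm for the basic soil water flow equation" described above can be illustrated generically. The sketch below is a textbook explicit step for the 1-D Richards equation in water-content form, not the SMCRN scheme; the diffusivity and conductivity functions are placeholders supplied by the caller.

def soil_water_explicit_step(theta, dz, dt, diffusivity, conductivity):
    # One explicit time step of the 1-D soil water flow equation in
    # water-content form: d(theta)/dt = d/dz[ D(theta) d(theta)/dz ] - dK/dz,
    # with z positive downwards.
    #   theta        volumetric water contents by layer, surface first
    #   dz, dt       layer thickness (m) and time step (s); dt must respect the
    #                usual explicit stability limit dt <= dz**2 / (2 * max D)
    #   diffusivity, conductivity
    #                callables D(theta) and K(theta); placeholders to be taken
    #                from a chosen soil retention relationship
    new = theta[:]
    for i in range(1, len(theta) - 1):
        d_up = 0.5 * (diffusivity(theta[i - 1]) + diffusivity(theta[i]))
        d_dn = 0.5 * (diffusivity(theta[i]) + diffusivity(theta[i + 1]))
        # capillary (diffusive) redistribution between neighbouring layers
        capillary = (d_up * (theta[i - 1] - theta[i])
                     - d_dn * (theta[i] - theta[i + 1])) / dz ** 2
        # gravitational drainage term dK/dz
        dk_dz = (conductivity(theta[i + 1]) - conductivity(theta[i - 1])) / (2 * dz)
        new[i] = theta[i] + dt * (capillary - dk_dz)
    return new   # surface and bottom boundary layers are left to the caller

Unlike a drainage-only cascade, the capillary term lets water move upwards against gravity when the water-content gradient demands it, which is the behaviour such an update is intended to capture.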
Abstract:
The rise of food security up international political, societal and academic agendas has led to increasing interest in novel means of improving primary food production and reducing waste. There are, however, also many ‘post-farm gate’ activities that are critical to food security, including processing, packaging, distributing, retailing, cooking and consuming. These activities all affect a range of important food security elements, notably availability, affordability and other aspects of access, nutrition and safety. Addressing the challenge of universal food security, in the context of a number of other policy goals (e.g. social, economic and environmental sustainability), is of keen interest to a range of UK stakeholders but requires an up-to-date evidence base and continuous innovation. An exercise was therefore conducted, under the auspices of the UK Global Food Security Programme, to identify priority research questions with a focus on the UK food system (though the outcomes may be broadly applicable to other developed nations). Emphasis was placed on incorporating a wide range of perspectives (‘world views’) from different stakeholder groups: policy, private sector, non-governmental organisations, advocacy groups and academia. A total of 456 individuals submitted 820 questions, from which 100 were selected by a process of online voting and a three-stage workshop voting exercise. These 100 final questions were sorted into 10 themes and the ‘top’ question for each theme was identified by a further voting exercise. This step also allowed four different stakeholder groups to select the top 7–8 questions from their perspectives. Results of these voting exercises are presented. It is clear from the wide range of questions prioritised in this exercise that the different stakeholder groups identified specific research needs on a range of post-farm gate activities and food security outcomes. Evidence needs related to food affordability, nutrition and food safety (all key elements of food security) featured highly in the exercise. While there were some questions relating to climate impacts on production, other important topics for food security (e.g. trade, transport, preference and cultural needs) were not ranked as highly by the participants.
Abstract:
Black carbon aerosol plays a unique and important role in Earth’s climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed-phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr-1 in the year 2000, with an uncertainty range of 2000 to 29000 Gg yr-1. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m-2 with 90% uncertainty bounds of (+0.08, +1.27) W m-2. Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m-2. Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m-2 with 90% uncertainty bounds of +0.17 to +2.1 W m-2. Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m-2, is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (-0.50 to +1.08) W m-2 during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (-0.06 W m-2 with 90% uncertainty bounds of -1.45 to +1.29 W m-2). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon.
In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility for near-term mitigation. This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.
Abstract:
In this study, we examine seasonal and geographical variability of the marine aerosol fine-mode fraction (fm) and its impacts on deriving the anthropogenic component of aerosol optical depth (τa) and direct radiative forcing from multispectral satellite measurements. A proxy of fm, empirically derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 5 data, shows large seasonal and geographical variations that are consistent with the Goddard Chemistry Aerosol Radiation Transport (GOCART) and Global Modeling Initiative (GMI) model simulations. The so-derived seasonally and spatially varying fm is then implemented into a method of estimating τa and direct radiative forcing from the MODIS measurements. It is found that the use of a constant value for fm as in previous studies would have overestimated τa by about 20% over the global ocean, with the overestimation up to ~45% in some regions and seasons. The 7-year (2001–2007) global ocean average τa is 0.035, with yearly averages ranging from 0.031 to 0.039. Future improvement in measurements is needed to better separate anthropogenic aerosols from natural ones and to narrow down the wide range of aerosol direct radiative forcing.
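As a schematic of how a fine-mode fraction enters such an estimate (the structure and all coefficients below are placeholders, not the formulation or values used in the study): the fine-mode part of the total optical depth is corrected for natural fine-mode contributions, such as maritime and fine dust aerosol, before being attributed to anthropogenic sources.

def anthropogenic_aod(tau_total, f_fine, tau_marine_fine=0.01, dust_fine_frac=0.0):
    # Schematic estimate of the anthropogenic aerosol optical depth over ocean.
    #   tau_total        total aerosol optical depth from the satellite retrieval
    #   f_fine           fine-mode fraction (seasonally and spatially varying)
    #   tau_marine_fine, dust_fine_frac
    #                    assumed natural fine-mode contributions; the values
    #                    here are placeholders, not those of the paper
    tau_fine = f_fine * tau_total
    return max(tau_fine - tau_marine_fine - dust_fine_frac * tau_total, 0.0)

# A biased-high constant f_fine inflates tau_fine and hence the anthropogenic
# component, which is the sense of the roughly 20% overestimate reported above.
print(anthropogenic_aod(0.12, 0.55))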