873 results for bare public-key model


Relevance: 30.00%

Publisher:

Abstract:

In Part I of this study it was shown that moving from a moisture-convergent- to a relative-humidity-dependent organized entrainment rate in the formulation for deep convection was responsible for significant advances in the simulation of the Madden–Julian Oscillation (MJO) in the ECMWF model. However, the application of traditional MJO diagnostics was not adequate to understand why changing the control on convection had such a pronounced impact on the representation of the MJO. In this study a set of process-based diagnostics is applied to the hindcast experiments described in Part I to identify the physical mechanisms responsible for the advances in MJO simulation. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid troposphere. Due to the modified precipitation–moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this study suggest that a tropospheric moisture control on convection is key to simulating the interaction between convective heating and the large-scale wave forcing associated with the MJO.

Relevance: 30.00%

Publisher:

Abstract:

We analyze here the polar stratospheric temperatures in an ensemble of three 150-year integrations of the Canadian Middle Atmosphere Model (CMAM), an interactive chemistry-climate model which simulates ozone depletion and recovery, as well as climate change. A key motivation is to understand possible mechanisms for the observed trend in the extent of conditions favourable for polar stratospheric cloud (PSC) formation in the Arctic winter lower stratosphere. We find that in the Antarctic winter lower stratosphere, the low temperature extremes required for PSC formation increase in the model as ozone is depleted, but remain steady through the twenty-first century as the warming from ozone recovery roughly balances the cooling from climate change. Thus, ozone depletion itself plays a major role in the Antarctic trends in low temperature extremes. The model trend in low temperature extremes in the Arctic through the latter half of the twentieth century is weaker and less statistically robust than the observed trend. It is not projected to continue into the future. Ozone depletion in the Arctic is weaker in the CMAM than in observations, which may account for the weak past trend in low temperature extremes. In the future, radiative cooling in the Arctic winter due to climate change is more than compensated by an increase in dynamically driven downwelling over the pole.

Relevance: 30.00%

Publisher:

Abstract:

A Lagrangian model of photochemistry and mixing is described (CiTTyCAT, stemming from the Cambridge Tropospheric Trajectory model of Chemistry And Transport), which is suitable for transport and chemistry studies throughout the troposphere. Over the last five years, the model has been developed in parallel at several different institutions, and here those developments have been incorporated into one "community" model and documented for the first time. The key photochemical developments include a new scheme for biogenic volatile organic compounds and updated emissions schemes. The key physical development is to evolve composition following an ensemble of trajectories within neighbouring air masses, including a simple scheme for mixing between them via an evolving "background profile", both within the boundary layer and in the free troposphere. The model runs along trajectories pre-calculated using winds and temperature from meteorological analyses. In addition, boundary layer height and precipitation rates, output from the analysis model, are interpolated to trajectory points and used as inputs to the mixing and wet deposition schemes. The model is most suitable in regimes where the effects of small-scale turbulent mixing are slow relative to advection by the resolved winds, so that coherent air masses form with distinct composition and strong gradients between them. Such air masses can persist for many days while stretching, folding and thinning. Lagrangian models offer a useful framework for picking apart the processes of air-mass evolution over intercontinental distances, without being hindered by the numerical diffusion inherent to global Eulerian models. The model, including its different box and trajectory modes, is described and some output for each of the modes is presented for evaluation. The model is available for download from a Subversion-controlled repository by contacting the corresponding authors.
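The abstract describes mixing between an air mass and an evolving background profile via "a simple scheme". A minimal sketch of one common way to represent such mixing, as linear relaxation of a tracer toward the background with a fixed timescale, is shown below. The relaxation form, the timescale and all concentrations are illustrative assumptions, not the actual CiTTyCAT scheme.

```python
# Toy sketch: mix a Lagrangian air mass toward a "background profile"
# by linear relaxation, dC/dt = -(C - C_bg)/tau, integrated with
# forward Euler. All parameter values are invented for illustration.

def mix_toward_background(c, c_bg, tau, dt, nsteps):
    """Relax tracer concentration c toward background c_bg over nsteps."""
    for _ in range(nsteps):
        c += -(c - c_bg) / tau * dt
    return c

# A hypothetical air mass with elevated ozone (80 ppbv) mixing into a
# 40 ppbv background with a 2-day (48 h) timescale, 1-hour steps:
c_final = mix_toward_background(c=80.0, c_bg=40.0, tau=48.0, dt=1.0, nsteps=240)
```

After ten days the air mass has nearly, but not exactly, reached the background value, consistent with coherent air masses persisting for many days while gradients slowly erode.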

Relevance: 30.00%

Publisher:

Abstract:

Key point summary

• Cerebellar ataxias are progressive debilitating diseases with no known treatment and are associated with defective motor function and, in particular, abnormalities of Purkinje cells.

• Mutant mice with deficits in Ca2+ channel auxiliary α2δ-2 subunits are used as models of cerebellar ataxia.

• Our data from the du2J mouse model show an association between the ataxic phenotype exhibited by homozygous du2J/du2J mice and increased irregularity of Purkinje cell firing.

• We show that both heterozygous +/du2J and homozygous du2J/du2J mice completely lack the strong presynaptic modulation of neuronal firing by cannabinoid CB1 receptors that is exhibited by litter-matched control mice.

• These results show that the du2J ataxia model is associated with deficits in CB1 receptor signalling in the cerebellar cortex, putatively linked with compromised Ca2+ channel activity due to reduced α2δ-2 subunit expression. Knowledge of such deficits may help in the design of therapeutic agents to combat ataxias.

Abstract

Cerebellar ataxias are a group of progressive, debilitating diseases often associated with abnormal Purkinje cell (PC) firing and/or degeneration. Many animal models of cerebellar ataxia display abnormalities in Ca2+ channel function. The ‘ducky’ du2J mouse model of ataxia and absence epilepsy represents a clean knock-out of the auxiliary Ca2+ channel subunit α2δ-2, and has been associated with deficient Ca2+ channel function in the cerebellar cortex. Here, we investigate the effects of the du2J mutation on PC layer (PCL) and granule cell (GC) layer (GCL) neuronal spiking activity and on inhibitory neurotransmission at interneurone–Purkinje cell (IN-PC) synapses.
Increased neuronal firing irregularity was seen in the PCL and, to a less marked extent, in the GCL in du2J/du2J, but not +/du2J, mice; these data suggest that the ataxic phenotype is associated with a lack of precision of PC firing, which may also impinge on GC activity and requires expression of two du2J alleles to manifest fully. The du2J mutation had no clear effect on spontaneous inhibitory postsynaptic current (sIPSC) frequency at IN-PC synapses, but was associated with increased sIPSC amplitudes. The du2J mutation ablated cannabinoid CB1 receptor (CB1R)-mediated modulation of spontaneous neuronal spike firing and CB1R-mediated presynaptic inhibition of synaptic transmission at IN-PC synapses in both +/du2J and du2J/du2J mutants; these effects occurred in the absence of changes in CB1R expression. These results demonstrate that the du2J ataxia model is associated with deficient CB1R signalling in the cerebellar cortex, putatively linked with compromised Ca2+ channel activity and the ataxic phenotype.

Relevance: 30.00%

Publisher:

Abstract:

In public goods experiments, stochastic choice, censoring and motivational heterogeneity give scope for disagreement over the extent of unselfishness, and whether it is reciprocal or altruistic. We show that these problems can be addressed econometrically, by estimating a finite mixture model to isolate types, incorporating double censoring and a tremble term. Most subjects act selfishly, but a substantial proportion are reciprocal with altruism playing only a marginal role. Isolating reciprocators enables a test of Sugden’s model of voluntary contributions. We estimate that reciprocators display a self-serving bias relative to the model.
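The abstract's central method, a finite mixture model that isolates behavioural types while allowing a "tremble" term, can be illustrated with a deliberately simplified two-type version. Everything below (the two types, the uniform tremble, the grid-search estimator, the synthetic data) is an illustrative assumption; the paper's actual specification additionally handles double censoring and more types.

```python
# Sketch of a two-type finite mixture with a tremble term, fitted by
# grid search over the selfish share p. Hypothetical setup: selfish
# players contribute 0, reciprocal players match others' average, and
# with probability EPS a player "trembles" to a uniform random choice.
import math

K = 10          # contributions lie on 0..K tokens
EPS = 0.05      # tremble probability

def type_density(c, predicted):
    """P(contribution c | a type predicting `predicted`), with trembles."""
    p = EPS / (K + 1)                 # uniform tremble mass on each value
    if c == predicted:
        p += 1.0 - EPS                # otherwise the type's prediction
    return p

def log_lik(p_selfish, data, others_avg):
    """Mixture log-likelihood over observed contributions."""
    ll = 0.0
    for c in data:
        f_s = type_density(c, predicted=0)           # selfish: give nothing
        f_r = type_density(c, predicted=others_avg)  # reciprocal: match others
        ll += math.log(p_selfish * f_s + (1 - p_selfish) * f_r)
    return ll

# Synthetic sample: 70 selfish zero-contributors, 30 reciprocators matching 5.
data = [0] * 70 + [5] * 30
grid = [i / 100 for i in range(1, 100)]
p_hat = max(grid, key=lambda p: log_lik(p, data, others_avg=5))
```

With clean synthetic data the estimated selfish share lands on the true 70%, which is the sense in which the mixture "isolates types" before any behavioural test is run on the reciprocators.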

Relevance: 30.00%

Publisher:

Abstract:

Interpersonal interaction in public goods contexts is very different in character to its depiction in economic theory, despite the fact that the standard model is based on a small number of apparently plausible assumptions. Approaches to the problem are reviewed both from within and outside economics. It is argued that quick fixes such as a taste for giving do not provide a way forward. An improved understanding of why people contribute to such goods seems to require a different picture of the relationships between individuals than obtains in standard microeconomic theory, where they are usually depicted as asocial. No single economic model at present is consistent with all the relevant field and laboratory data. It is argued that there are defensible ideas from outside the discipline which ought to be explored, relying on different conceptions of rationality and/or more radically social agents. Three such suggestions are considered, one concerning the expressive/communicative aspect of behaviour, a second the possibility of a part-whole relationship between interacting agents and the third a version of conformism.

Relevance: 30.00%

Publisher:

Abstract:

It is predicted that non-communicable diseases will account for over 73% of global mortality in 2020. Given that the majority of these deaths occur in developed countries such as the UK, and that up to 80% of chronic disease could be prevented through improvements in diet and lifestyle, it is imperative that dietary guidelines and disease prevention strategies are reviewed in order to improve their efficacy. Since the completion of the human genome project our understanding of complex interactions between environmental factors such as diet and genes has progressed considerably, as has the potential to individualise diets using dietary, phenotypic and genotypic data. Thus, there is an ambition for dietary interventions to move away from population-based guidance towards 'personalised nutrition'. The present paper reviews current evidence for the public acceptance of genetic testing and personalised nutrition in disease prevention. Health and clear consumer benefits have been identified as key motivators in the uptake of genetic testing, with individuals reporting personal experience of disease, such as those with specific symptoms, being more willing to undergo genetic testing for the purpose of personalised nutrition. This greater perceived susceptibility to disease may also improve motivation to change behaviour, which is a key barrier in the success of any nutrition intervention. Several consumer concerns have been identified in the literature which should be addressed before the introduction of a nutrigenomic-based personalised nutrition service. Future research should focus on the efficacy and implementation of nutrigenomic-based personalised nutrition.

Relevance: 30.00%

Publisher:

Abstract:

The large scale urban consumption of energy (LUCY) model simulates all components of anthropogenic heat flux (QF) from the global to the individual city scale at 2.5 × 2.5 arc-minute resolution. This includes a database of different working patterns and public holidays, vehicle use and energy consumption in each country. The databases can be edited to include specific diurnal and seasonal vehicle and energy consumption patterns, local holidays and flows of people within a city. If better information about individual cities becomes available within this (open-source) database, the accuracy of the model can only improve, providing the community with data from the global scale down to the individual city scale in the future. The results show that QF varied widely through the year, through the day, and between countries and urban areas. An assessment of the estimated heat emissions revealed that they are reasonably close to those produced by a global model and a number of small-scale city models, so results from LUCY can be used with a degree of confidence. From LUCY, the global mean urban QF has a diurnal range of 0.7–3.6 W m−2, and is greater on weekdays than at weekends. Heat release from buildings is the largest contributor (89–96%) to heat emissions globally. Differences between months are greatest in the middle of the day (up to 1 W m−2 at 1 pm). December to February, the coldest months in the Northern Hemisphere, have the highest heat emissions; July and August are at the higher end, and the least QF is emitted in May. The highest individual grid-cell heat fluxes in urban areas were located in New York (577), Paris (261.5), Tokyo (178), San Francisco (173.6), Vancouver (119) and London (106.7). Copyright © 2010 Royal Meteorological Society
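The decomposition of QF into its three sources (buildings, traffic, human metabolism) can be sketched as a simple component sum. The component magnitudes below are invented for illustration; only the rough 89–96% building share is taken from the text.

```python
# Illustrative decomposition of anthropogenic heat flux QF into the
# three component sources LUCY models: buildings, vehicle traffic and
# human metabolism. The flux values are hypothetical city means.

def total_qf(q_building, q_traffic, q_metabolism):
    """Total anthropogenic heat flux (W m-2) as the sum of its sources."""
    return q_building + q_traffic + q_metabolism

q_b, q_t, q_m = 2.0, 0.15, 0.05    # hypothetical component fluxes, W m-2
qf = total_qf(q_b, q_t, q_m)
building_share = q_b / qf          # fraction of QF released from buildings
```

With these made-up numbers the building share is about 91%, within the 89–96% range the abstract reports for the global building contribution.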

Relevance: 30.00%

Publisher:

Abstract:

The sensitivity of the biological parameters in a nutrient-phytoplankton-zooplankton-detritus (NPZD) model in the calculation of the air-sea CO2 flux, primary production and detrital export is analysed. We explore the effect on these outputs of variation in the values of the twenty parameters that control ocean ecosystem growth in a 1-D formulation of the UK Met Office HadOCC NPZD model used in GCMs. We use and compare the results from one-at-a-time and all-at-a-time perturbations performed at three sites in the EuroSITES European Ocean Observatory Network: the Central Irminger Sea (60° N 40° W), the Porcupine Abyssal Plain (49° N 16° W) and the European Station for Time series in the Ocean Canary Islands (29° N 15° W). Reasonable changes to the values of key parameters are shown to have a large effect on the calculation of the air-sea CO2 flux, primary production, and export of biological detritus to the deep ocean. Changes in the values of key parameters have a greater effect in more productive regions than in less productive areas. The most sensitive parameters are generally found to be those controlling well-established ocean ecosystem parameterisations widely used in many NPZD-type models. The air-sea CO2 flux is most influenced by variation in the parameters that control phytoplankton growth, detrital sinking and carbonate production by phytoplankton (the rain ratio). Primary production is most sensitive to the parameters that define the shape of the photosynthesis-irradiance curve. Export production is most sensitive to the parameters that control the rate of detrital sinking and the remineralisation of detritus.
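The one-at-a-time perturbation strategy described above can be sketched in a few lines: perturb each parameter in turn, rerun, and record the relative change in the output. The toy "ecosystem model" below is a stand-in scalar function, not HadOCC; it only illustrates the loop structure.

```python
# One-at-a-time (OAT) sensitivity sketch. toy_model is a hypothetical
# stand-in for a 1-D ecosystem model returning a single diagnostic
# (say, annual primary production); parameter names are invented.

def toy_model(params):
    growth, grazing, sinking = params["growth"], params["grazing"], params["sinking"]
    return growth / (grazing + sinking)

def oat_sensitivity(model, base, fraction=0.1):
    """Relative output change for a +10% perturbation of each parameter."""
    ref = model(base)
    sens = {}
    for name in base:
        perturbed = dict(base)
        perturbed[name] = base[name] * (1 + fraction)
        sens[name] = (model(perturbed) - ref) / ref
    return sens

base = {"growth": 1.0, "grazing": 0.2, "sinking": 0.1}
sens = oat_sensitivity(toy_model, base)
```

Ranking the entries of `sens` by absolute value is the OAT analogue of identifying the "most sensitive parameters"; all-at-a-time designs perturb every parameter simultaneously and so also capture interactions, which this loop cannot.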

Relevance: 30.00%

Publisher:

Abstract:

The process of global deforestation calls for urgent attention, particularly in South America where deforestation rates have failed to decline over the past 20 years. The main direct cause of deforestation is land conversion to agriculture. We combine data from the FAO and the World Bank for six tropical South American countries over the period 1970–2006, estimate a panel data model accounting for various determinants of agricultural land expansion and derive elasticities to quantify the effect of the different independent variables. We investigate whether agricultural intensification, in conjunction with governance factors, has been promoting agricultural expansion, leading to a "Jevons paradox". The paradox occurs if an increase in the productivity of one factor (here agricultural land) leads to its increased, rather than decreased, utilization. We find that for high values of our governance indicators a Jevons paradox exists even for moderate levels of agricultural productivity, leading to an overall expansion of agricultural area. Agricultural expansion is also positively related to the level of service on external debt and to population growth, while its association with agricultural exports is only moderate. Finally, we find no evidence of an environmental Kuznets curve, as agricultural area is ultimately positively correlated with per-capita income levels.
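The elasticities mentioned above are, in a log-log specification, simply regression coefficients: the slope of log(land area) on log(productivity) is the percentage change in area per 1% change in productivity. The sketch below recovers a known elasticity from synthetic data; it is a plain OLS illustration, not the paper's panel estimator with fixed effects and governance interactions.

```python
# Elasticity as a log-log regression slope. Synthetic data are built
# with a known elasticity of 0.5 (land = 3 * productivity**0.5), so
# the fitted slope should recover 0.5. All numbers are invented.
import math

productivity = [1.0, 2.0, 4.0, 8.0, 16.0]
land = [3.0 * p ** 0.5 for p in productivity]

x = [math.log(p) for p in productivity]
y = [math.log(a) for a in land]

# OLS slope = d(log land)/d(log productivity) = the elasticity
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
```

A slope above zero with no turning point in income is also what "no environmental Kuznets curve" means operationally: area keeps rising with per-capita income rather than peaking and declining.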

Relevance: 30.00%

Publisher:

Abstract:

Geomagnetic activity has long been known to exhibit approximately 27-day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27-day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires the performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to numerical models during solar minimum, despite the 27-day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27-day persistence is no longer a good approximation, and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focusses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the "best" forecast model must be specifically tailored to its intended use.
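The persistence forecast itself is one line of code: the forecast for day t is the observation from day t − 27. The sketch below applies it to a synthetic 27-day-periodic series and scores it point-by-point with a correlation coefficient; real evaluations would use observed solar wind parameters (e.g. speed in km/s), and the signal here is an idealisation of solar-minimum recurrence.

```python
# 27-day persistence forecast and a point-by-point skill score.
# The sinusoidal "solar wind speed" series is an idealised,
# perfectly recurrent stand-in for real observations.
import math

def persistence_forecast(series, lag=27):
    """Forecast series[t] as series[t - lag]; aligns with series[lag:]."""
    return series[:-lag]

def correlation(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

# A solar-rotation-like signal: 27-day periodic wind speed (km/s).
obs = [400 + 100 * math.sin(2 * math.pi * t / 27) for t in range(270)]
fcst = persistence_forecast(obs)
skill = correlation(fcst, obs[27:])
```

For a perfectly recurrent series the correlation is 1; at solar maximum, when coronal structure evolves within a rotation, the same score collapses, which is exactly the solar-cycle dependence the study quantifies.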

Relevance: 30.00%

Publisher:

Abstract:

The primary role of land surface models embedded in climate models is to partition the available energy at the surface into upward radiative, sensible and latent heat fluxes. Partitioning of evapotranspiration, ET, is of fundamental importance: as a major component of the total surface latent heat flux, ET affects the simulated surface water balance, and related energy balance, and consequently the feedbacks with the atmosphere. In this context it is also crucial to credibly represent the CO2 exchange between ecosystems and their environment. In this study, JULES, the land surface model used in UK weather and climate models, has been evaluated for temperate Europe. Compared to eddy covariance flux measurements, the CO2 uptake by the ecosystem is underestimated and the ET overestimated. In addition, the contribution to ET from soil and intercepted water evaporation far outweighs the contribution of plant transpiration. To alleviate these biases, adaptations have been implemented in JULES, based on key literature references. These adaptations have improved the simulation of the spatio-temporal variability of the fluxes and the accuracy of the simulated GPP and ET, including its partitioning. This resulted in a shift of the seasonal soil moisture cycle. These adaptations are expected to increase the fidelity of climate simulations over Europe. Finally, the extreme summer of 2003 was used as an evaluation benchmark for the use of the model in climate change studies. The improved model captures the impact of the 2003 drought on the carbon assimilation and the water use efficiency of the plants. It, however, underestimates the 2003 GPP anomalies. The simulations showed that a reduction of evaporation from the interception and soil reservoirs, albeit not of transpiration, largely explained the good correlation between the carbon and the water fluxes anomalies that was observed during 2003.
This demonstrates the importance of being able to discriminate the response of each individual component of the ET flux to environmental forcing.
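The ET partition at issue can be sketched as a three-way component sum: plant transpiration, soil evaporation and evaporation of canopy-intercepted water. The flux values below are invented to illustrate a transpiration-dominated partition, the regime the text says the unmodified model fails to reproduce.

```python
# Partition of evapotranspiration (ET) into its three components.
# Hypothetical fluxes in mm/day for a temperate forest in summer;
# the 60/15/25 split is illustrative, not a JULES result.

def partition_et(transpiration, soil_evap, interception_evap):
    """Return total ET and each component's fraction of it."""
    et = transpiration + soil_evap + interception_evap
    fractions = {
        "transpiration": transpiration / et,
        "soil": soil_evap / et,
        "interception": interception_evap / et,
    }
    return et, fractions

et, frac = partition_et(transpiration=2.4, soil_evap=0.6, interception_evap=1.0)
```

The bias described in the abstract corresponds to `frac["soil"] + frac["interception"]` exceeding `frac["transpiration"]`; the adaptations move the partition toward the transpiration-dominated split shown here.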

Relevance: 30.00%

Publisher:

Abstract:

Whereas fossil evidence indicates extensive treeless vegetation and diverse grazing megafauna in Europe and northern Asia during the last glacial, experiments combining vegetation models and climate models have to date simulated widespread persistence of trees. Resolving this conflict is key to understanding both last glacial ecosystems and the extinction of most of the mega-herbivores. Using a dynamic vegetation model (DVM) we explored the implications of the differing climatic conditions generated by a general circulation model (GCM) in "normal" and "hosing" experiments. Whilst the former approximate interstadial conditions, the latter, designed to mimic Heinrich Events, approximate stadial conditions. The "hosing" experiments gave simulated European vegetation much closer in composition to that inferred from fossil evidence than did the "normal" experiments. Given the short duration of interstadials, and the rate at which forest cover expanded during the late-glacial and early Holocene, our results demonstrate the importance of millennial variability in determining the character of last glacial ecosystems.

Relevance: 30.00%

Publisher:

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.

Relevance: 30.00%

Publisher:

Abstract:

A stand-alone sea ice model is tuned and validated using satellite-derived, basinwide observations of sea ice thickness, extent, and velocity from the years 1993 to 2001. This is the first time that basin-scale measurements of sea ice thickness have been used for this purpose. The model is based on the CICE sea ice model code developed at the Los Alamos National Laboratory, with some minor modifications, and forcing consists of 40-yr ECMWF Re-Analysis (ERA-40) and Polar Exchange at the Sea Surface (POLES) data. Three parameters are varied in the tuning process: Ca, the air–ice drag coefficient; P*, the ice strength parameter; and α, the broadband albedo of cold bare ice, with the aim being to determine the subset of this three-dimensional parameter space that gives the best simultaneous agreement with observations with this forcing set. It is found that observations of sea ice extent and velocity alone are not sufficient to unambiguously tune the model, and that sea ice thickness measurements are necessary to locate a unique subset of parameter space in which simultaneous agreement is achieved with all three observational datasets.
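The tuning strategy described above, searching the three-dimensional (Ca, P*, α) parameter space for the subset giving simultaneous agreement with thickness, extent and velocity observations, can be sketched as a grid search minimising a combined misfit. The "model" below is a toy stand-in for CICE, and all functional forms and numbers are invented; the point is only the search structure, including why a single observation type can leave the answer ambiguous.

```python
# Grid-search sketch of three-parameter sea ice model tuning.
# toy_ice_model is a hypothetical stand-in returning (thickness,
# extent, speed); the linear forms and values are illustrative.
import itertools

def toy_ice_model(ca, pstar, albedo):
    thickness = 2.0 * albedo + 0.5 * pstar   # albedo and strength set thickness
    extent = 10.0 - 2.0 * ca                 # drag sets extent in this toy
    speed = 0.1 * ca / pstar                 # drag vs strength sets drift speed
    return thickness, extent, speed

obs = toy_ice_model(1.0, 2.0, 0.6)   # pretend observations from known "truth"

def misfit(sim, obs):
    """Sum-of-squares misfit across all three observation types."""
    return sum((s - o) ** 2 for s, o in zip(sim, obs))

grid_ca = [0.5, 1.0, 1.5]
grid_pstar = [1.0, 2.0, 3.0]
grid_albedo = [0.5, 0.6, 0.7]
best = min(itertools.product(grid_ca, grid_pstar, grid_albedo),
           key=lambda p: misfit(toy_ice_model(*p), obs))
```

Note that in this toy, extent and speed alone would pin down Ca and P* but leave the albedo free, mirroring the paper's finding that extent and velocity observations alone cannot unambiguously tune the model and thickness data are needed to isolate a unique region of parameter space.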