917 results for: general circulation model (GCM), ground hydrologic model (GHM), heat and vapor exchange between land and atmosphere


Relevance: 100.00%

Abstract:

Recent high-resolution radiosonde climatologies have revealed a tropopause inversion layer (TIL) in the extratropics: temperature strongly increases just above a sharp local cold point tropopause. Here, it is asked to what extent a TIL exists in current general circulation models (GCMs) and meteorological analyses. Only a weak hint of a TIL exists in NCEP/NCAR reanalysis data. In contrast, the Canadian Middle Atmosphere Model (CMAM), a comprehensive GCM, exhibits a TIL of realistic strength. However, in data assimilation mode CMAM exhibits a much weaker TIL, especially in the Southern Hemisphere where only coarse satellite data are available. The discrepancy between the analyses and the GCM is thus hypothesized to be mainly due to data assimilation acting to smooth the observed strong curvature in temperature around the tropopause. This is confirmed in the reanalysis where the stratification around the tropopause exhibits a strong discontinuity at the start of the satellite era.
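The TIL diagnostic described above can be sketched in a few lines: locate the cold-point tropopause in a temperature profile and measure the warming just above it. This is a minimal illustration on an idealised profile, not the climatology's actual method; the 2 km layer depth and all profile values are assumptions.

```python
import numpy as np

def til_strength(z_km, temp_k):
    """Locate the cold-point tropopause and measure the temperature
    increase in the 2 km layer just above it (a simple TIL proxy)."""
    i_cp = int(np.argmin(temp_k))          # cold-point tropopause index
    above = (z_km > z_km[i_cp]) & (z_km <= z_km[i_cp] + 2.0)
    if not above.any():
        return z_km[i_cp], 0.0
    return float(z_km[i_cp]), float(temp_k[above].max() - temp_k[i_cp])

# Idealised profile: 6.5 K/km lapse rate below 11 km, sharp inversion above.
z = np.linspace(0.0, 20.0, 201)
t = np.where(z < 11.0, 288.0 - 6.5 * z, 288.0 - 6.5 * 11.0 + 4.0 * (z - 11.0))
z_cp, dT = til_strength(z, t)   # tropopause height (km), TIL warming (K)
```

On real radiosonde data the same diagnostic would be applied profile by profile before compositing relative to the tropopause height.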

Relevance: 100.00%

Abstract:

The objective of this work was to evaluate the feasibility of simulating maize yield in a subtropical region of southern Brazil using the General Large Area Model (GLAM). A 16-year time series of daily weather data was used. The model was adjusted and tested as an alternative for simulating maize yield at small and large spatial scales. Simulated and observed grain yields were highly correlated (r above 0.8; p < 0.01) at large scales (greater than 100,000 km²), with variable and mostly lower correlations (r from 0.65 to 0.87; p < 0.1) at small spatial scales (below 10,000 km²). Large-area models can contribute to monitoring or forecasting regional patterns of variability in maize production in the region, providing a basis for agricultural decision making, and GLAM-Maize is one such alternative.
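The scale-dependent skill reported above rests on simple correlation of simulated against observed yield series. A minimal sketch with hypothetical yield numbers (the GLAM data are not reproduced here):

```python
import numpy as np

def pearson_r(sim, obs):
    """Pearson correlation between simulated and observed yield series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    sa, oa = sim - sim.mean(), obs - obs.mean()
    return float((sa * oa).sum() / np.sqrt((sa * sa).sum() * (oa * oa).sum()))

# Hypothetical 16-season yield series (t/ha): a large-area aggregate
# that tracks observations closely, as in the large-scale case above.
obs = np.array([5.1, 4.8, 6.0, 5.5, 3.9, 4.2, 5.8, 6.1,
                5.0, 4.5, 5.9, 6.3, 4.1, 4.7, 5.6, 5.2])
err = np.array([0.1, -0.2, 0.1, 0.0, -0.1, 0.2, -0.1, 0.1,
                0.0, 0.1, -0.2, 0.1, 0.2, -0.1, 0.0, 0.1])
r = pearson_r(obs + err, obs)   # high r, analogous to the >100,000 km² case
```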

Relevance: 100.00%

Abstract:

In the tropical middle atmosphere the climatological radiative equilibrium temperature is inconsistent with gradient-wind balance and the available angular momentum, especially during solstice seasons. Adjustment toward a balanced state results in a type of Hadley circulation that lies outside the downward control view of zonally averaged dynamics. This middle-atmosphere Hadley circulation is reexamined here using a zonally symmetric balance model driven through an annual cycle. It is found that the inclusion of a realistic radiation scheme leads to a concentration of the circulation near the stratopause and to its closing off in the mesosphere, with no need for relaxational damping or a rigid lid. The evolving zonal flow is inertially unstable, leading to a rapid process of inertial adjustment, which becomes significant in the mesosphere. This short-circuits the slower process of angular momentum homogenization by the Hadley circulation itself, thereby weakening the latter. The effect of the meridional circulation associated with extratropical wave drag on the Hadley circulation is considered. It is shown that the two circulations are independent for linear (quasigeostrophic) zonal-mean dynamics, and interact primarily through the advection of temperature and angular momentum. There appears to be no significant coupling in the deep Tropics via temperature advection since the wave-driven circulation is unable to alter meridional temperature gradients in this region. However, the wave-driven circulation can affect the Hadley circulation by advecting angular momentum out of the Tropics. The validity of the zonally symmetric balance model with parameterized inertial adjustment is tested by comparison with a three-dimensional primitive equations model. Fields from a middle-atmosphere GCM are also examined for evidence of these processes. 
While many aspects of the GCM circulation are indicative of the middle-atmosphere Hadley circulation, particularly in the upper stratosphere, it appears that the circulation is obscured in the mesosphere and lower stratosphere by other processes.

Relevance: 100.00%

Abstract:

We use a state-of-the-art ocean general circulation and biogeochemistry model to examine the impact of changes in ocean circulation and biogeochemistry in governing the change in ocean carbon-13 and atmospheric CO2 at the last glacial maximum (LGM). We examine 5 different realisations of the ocean's overturning circulation produced by a fully coupled atmosphere-ocean model under LGM forcing and suggested changes in the atmospheric deposition of iron and phytoplankton physiology at the LGM. Measured changes in carbon-13 and carbon-14, as well as a qualitative reconstruction of the change in ocean carbon export are used to evaluate the results. Overall, we find that while a reduction in ocean ventilation at the LGM is necessary to reproduce carbon-13 and carbon-14 observations, this circulation results in a low net sink for atmospheric CO2. In contrast, while biogeochemical processes contribute little to carbon isotopes, we propose that most of the change in atmospheric CO2 was due to such factors. However, the lesser role for circulation means that when all plausible factors are accounted for, most of the necessary CO2 change remains to be explained. This presents a serious challenge to our understanding of the mechanisms behind changes in the global carbon cycle during the geologic past.

Relevance: 100.00%

Abstract:

Diagnosing the climate of New Zealand from low-resolution General Circulation Models (GCMs) is notoriously difficult due to the interaction of the complex topography and the Southern Hemisphere (SH) mid-latitude westerly winds. Therefore, methods of downscaling synoptic scale model data for New Zealand are useful to help understand past climate. New Zealand also has a wealth of palaeoclimate-proxy data to which the downscaled model output can be compared, and to provide a qualitative method of assessing the capability of GCMs to represent, in this case, the climate 6000 yr ago in the Mid-Holocene. In this paper, a synoptic weather and climate regime classification system using Empirical Orthogonal Function (EOF) analysis of GCM and reanalysis data was used. The climate regimes are associated with surface air temperature and precipitation anomalies over New Zealand. From the analysis in this study, we find at 6000 BP that increased trough activity in summer and autumn led to increased precipitation, with an increased north-south pressure gradient ("zonal events") in winter and spring leading to drier conditions. Opposing effects of increased (decreased) temperature are also seen in spring (autumn) in the South Island, which are associated with the increased zonal (trough) events; however, the circulation induced changes in temperature are likely to have been of secondary importance to the insolation induced changes. Evidence from the palaeoclimate-proxy data suggests that the Mid-Holocene was characterized by increased westerly wind events in New Zealand, which agrees with the preference for trough and zonal regimes in the models.
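The EOF-based regime classification can be sketched via singular value decomposition of the anomaly field, the standard computational route for EOF analysis; the synthetic data below stand in for the GCM/reanalysis fields:

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """EOF analysis of a (time, space) anomaly field via SVD.
    Returns spatial patterns, principal-component time series, and
    the fraction of variance explained by each retained mode."""
    anom = field - field.mean(axis=0)            # remove the time mean
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / (s**2).sum()
    pcs = u[:, :n_modes] * s[:n_modes]           # PC time series
    eofs = vt[:n_modes]                          # spatial patterns
    return eofs, pcs, var_frac[:n_modes]

# Synthetic example: one dominant standing pattern plus weak noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 120)
pattern = np.sin(np.linspace(0, np.pi, 50))
data = np.outer(np.sin(t), pattern) + 0.05 * rng.standard_normal((120, 50))
eofs, pcs, var_frac = eof_analysis(data)         # mode 1 dominates
```

In the regime application, each time step is then assigned to a regime according to the sign and magnitude of its leading PCs.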

Relevance: 100.00%

Abstract:

This manuscript describes the energy and water components of a new community land surface model, the Joint UK Land Environment Simulator (JULES), developed from the Met Office Surface Exchange Scheme (MOSES). It can be used as a stand-alone land surface model driven by observed forcing data, or coupled to an atmospheric global circulation model. The JULES model has been coupled to the Met Office Unified Model (UM) and as such provides a unique opportunity for the research community to contribute research that improves both world-leading operational weather forecasting and climate change prediction systems. In addition, JULES and its forerunner MOSES have been the basis for a number of very high-profile papers on the land surface and climate over the last decade. JULES has a modular structure aligned to physical processes, providing the basis for a flexible modelling platform.

Relevance: 100.00%

Abstract:

The Hadley Centre Global Environmental Model (HadGEM) includes two aerosol schemes: the Coupled Large-scale Aerosol Simulator for Studies in Climate (CLASSIC) and the new Global Model of Aerosol Processes (GLOMAP-mode). GLOMAP-mode is a modal aerosol microphysics scheme that simulates not only aerosol mass but also aerosol number, represents internally mixed particles, and includes aerosol microphysical processes such as nucleation. In this study, both schemes provide hindcast simulations of natural and anthropogenic aerosol species for the period 2000–2006. HadGEM simulations of aerosol optical depth using GLOMAP-mode compare better than CLASSIC against a data-assimilated aerosol re-analysis and ground-based aerosol observations. Because of differences in wet deposition rates, the residence time of GLOMAP-mode sulphate aerosol is two days longer than that of CLASSIC sulphate, whereas the black carbon residence time is much shorter. As a result, CLASSIC underestimates aerosol optical depths in continental regions of the Northern Hemisphere and likely overestimates absorption in remote regions. Aerosol direct and first indirect radiative forcings are computed from simulations with emissions for the years 1850 and 2000. In 1850, GLOMAP-mode predicts lower aerosol optical depths and higher cloud droplet number concentrations than CLASSIC. Consequently, simulated clouds are much less susceptible to natural and anthropogenic aerosol changes when the microphysical scheme is used. In particular, the response of cloud condensation nuclei to an increase in dimethyl sulphide emissions becomes a factor of four smaller. The combined effect of different 1850 baselines, residence times, and abilities to affect cloud droplet number leads to substantial differences in the aerosol forcings simulated by the two schemes. GLOMAP-mode finds a present-day direct aerosol forcing of −0.49 W m⁻² on a global average, 72% stronger than the corresponding forcing from CLASSIC. This difference is compensated by changes in first indirect aerosol forcing: the forcing of −1.17 W m⁻² obtained with GLOMAP-mode is 20% weaker than that obtained with CLASSIC. These results suggest that mass-based schemes such as CLASSIC lack the sophistication needed to provide realistic input to aerosol–cloud interaction schemes. Furthermore, the importance of the 1850 baseline highlights how model skill in predicting present-day aerosol does not guarantee reliable forcing estimates. These findings suggest that the more complex representation of aerosol processes in microphysical schemes improves the fidelity of simulated aerosol forcings.

Relevance: 100.00%

Abstract:

A one-dimensional, thermodynamic, and radiative model of a melt pond on sea ice is presented that explicitly treats the melt pond as an extra phase. A two-stream radiation model, which allows albedo to be determined from bulk optical properties, and a parameterization of the summertime evolution of those optical properties are used. Heat transport within the sea ice is described by an equation for heat transport in a mushy layer of a binary alloy (salt water). The model is tested by comparing numerical simulations with SHEBA data and previous modeling. The presence of melt ponds on the sea ice surface is demonstrated to have a significant effect on the heat and mass balance. Sensitivity tests indicate that the maximum melt pond depth is highly sensitive to optical parameters and drainage. INDEX TERMS: 4207 Oceanography: General: Arctic and Antarctic oceanography; 4255 Oceanography: General: Numerical modeling; 4299 Oceanography: General: General or miscellaneous. KEYWORDS: sea ice, melt pond, albedo, Arctic Ocean, radiation model, thermodynamic
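The link between bulk optical properties and albedo can be illustrated with the classical semi-infinite two-stream result, using the similarity parameter s = sqrt((1 − ω)/(1 − ωg)) and albedo (1 − s)/(1 + s). This is a textbook formula, not necessarily the exact scheme of the melt-pond model, and the optical values are illustrative only:

```python
import numpy as np

def two_stream_albedo(omega, g):
    """Albedo of a semi-infinite scattering layer from bulk optical
    properties: single-scattering albedo `omega` and asymmetry factor `g`.
    Classical two-stream similarity result: s = sqrt((1-w)/(1-w*g)),
    albedo = (1 - s) / (1 + s)."""
    s = np.sqrt((1.0 - omega) / (1.0 - omega * g))
    return (1.0 - s) / (1.0 + s)

# Nearly conservative scattering (bright ice-like surface) versus a
# darker, more absorbing ponded surface (hypothetical values).
a_ice  = two_stream_albedo(0.9995, 0.94)   # high albedo
a_pond = two_stream_albedo(0.99,   0.94)   # markedly lower albedo
```

The strong sensitivity of albedo to small changes in absorption is exactly why the pond model's results are sensitive to the optical parameters.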

Relevance: 100.00%

Abstract:

This article describes a case study involving information technology managers and their new-programmer recruitment policy, but the primary interest is methodological. The processes of issue generation and selection and of model conceptualization are described. Early use of magnetic hexagons allowed the generation of a range of issues, most of which would not have emerged if system dynamics elicitation techniques had been employed. With the selection of a specific issue, flow diagramming was used to conceptualize a model, with computer implementation and scenario generation following naturally. Observations are made on the processes of system dynamics modeling, particularly on the need to employ general techniques of knowledge elicitation in the early stages of interventions. It is proposed that flexible approaches be used to generate, select, and study the issues, since these reduce any biasing of the elicitation toward system dynamics problems and also allow the participants to take up the most appropriate problem-structuring approach.
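The kind of model that emerges from the flow-diagramming stage can be sketched as a small stock-and-flow simulation; the recruitment structure, rates, and parameter values below are entirely hypothetical, not those of the case study:

```python
# Hypothetical stock-and-flow sketch: recruits accumulate as trainees,
# graduate into an experienced-programmer stock, and both stocks suffer
# attrition. Simple Euler integration, as in typical system dynamics tools.

def simulate(months=60, dt=1.0, hire_rate=5.0,
             training_time=12.0, attrition=0.02):
    trainees, experienced = 0.0, 20.0
    history = []
    for _ in range(int(months / dt)):
        graduating = trainees / training_time          # graduation flow
        trainees    += dt * (hire_rate - graduating - attrition * trainees)
        experienced += dt * (graduating - attrition * experienced)
        history.append(experienced)
    return history

staff = simulate()   # experienced-programmer stock grows toward equilibrium
```

Scenario generation then amounts to re-running the simulation with changed policy parameters (hiring rate, training time) and comparing trajectories.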

Relevance: 100.00%

Abstract:

This paper presents a numerical model for predicting the evolution of the pattern of ionospheric convection in response to general time-dependent magnetic reconnection at the dayside magnetopause and in the cross-tail current sheet of the geomagnetic tail. The model quantifies the concepts of ionospheric flow excitation by Cowley and Lockwood (1992), assuming a uniform spatial distribution of ionospheric conductivity. The model is demonstrated using an example in which travelling reconnection pulses commence near noon and then move across the dayside magnetopause towards both dawn and dusk. Two such pulses, 8 min apart, are used and each causes the reconnection to be active for 1 min at every MLT that they pass over. This example demonstrates how the convection response to a given change in the interplanetary magnetic field (via the reconnection rate) depends on the previous reconnection history. The causes of this effect are explained. The inherent assumptions and the potential applications of the model are discussed.

Relevance: 100.00%

Abstract:

The inclusion of the direct and indirect radiative effects of aerosols in high-resolution global numerical weather prediction (NWP) models is increasingly recognised as important for improving the accuracy of short-range weather forecasts. In this study the impacts of increasing aerosol complexity in the global NWP configuration of the Met Office Unified Model (MetUM) are investigated. A hierarchy of aerosol representations is evaluated, including three-dimensional monthly mean speciated aerosol climatologies, fully prognostic aerosols modelled using the CLASSIC aerosol scheme and, finally, initialised aerosols using assimilated aerosol fields from the GEMS project. The prognostic aerosol schemes are better able to predict the temporal and spatial variation of atmospheric aerosol optical depth, which is particularly important in cases of large sporadic aerosol events such as major dust storms or forest fires. Including the direct effect of aerosols improves model biases in outgoing long-wave radiation over West Africa owing to a better representation of dust. However, uncertainties in dust optical properties propagate to its direct effect and the subsequent model response. Inclusion of the indirect aerosol effects improves surface radiation biases at the North Slope of Alaska ARM site owing to lower cloud amounts in high-latitude clean-air regions. This leads to improved temperature and height forecasts in that region. Impacts on the global mean model precipitation and large-scale circulation fields were found to be generally small in the short-range forecasts. However, the indirect aerosol effect leads to a strengthening of the low-level monsoon flow over the Arabian Sea and Bay of Bengal and an increase in precipitation over Southeast Asia. Regional impacts on the African Easterly Jet (AEJ) are also presented, with the large dust loading in the aerosol climatology enhancing the heat low over West Africa and weakening the AEJ. This study highlights the importance of including a more realistic treatment of aerosol–cloud interactions in global NWP models and the potential for improved global environmental prediction systems through the incorporation of more complex aerosol schemes.

Relevance: 100.00%

Abstract:

Multi-model ensembles are frequently used to assess understanding of the response of ozone and methane lifetime to changes in emissions of ozone precursors such as NOx, VOCs (volatile organic compounds) and CO. When these ozone changes are used to calculate radiative forcing (RF) (and climate metrics such as the global warming potential (GWP) and global temperature-change potential (GTP)) there is a methodological choice, determined partly by the available computing resources, as to whether the mean ozone (and methane) concentration changes are input to the radiation code, or whether each model's ozone and methane changes are used as input, with the average RF computed from the individual model RFs. We use data from the Task Force on Hemispheric Transport of Air Pollution source–receptor global chemical transport model ensemble to assess the impact of this choice for emission changes in four regions (East Asia, Europe, North America and South Asia). We conclude that using the multi-model mean ozone and methane responses is accurate for calculating the mean RF, with differences up to 0.6% for CO, 0.7% for VOCs and 2% for NOx. Differences of up to 60% for NOx, 7% for VOCs and 3% for CO are introduced into the 20 year GWP. The differences for the 20 year GTP are smaller than for the GWP for NOx, and similar for the other species. However, estimates of the standard deviation calculated from the ensemble-mean input fields (where the standard deviation at each point on the model grid is added to or subtracted from the mean field) are almost always substantially larger in RF, GWP and GTP metrics than the true standard deviation, and can be larger than the model range for short-lived ozone RF, and for the 20 and 100 year GWP and 100 year GTP. The order of averaging has most impact on the metrics for NOx, as the net values of these quantities are residuals of sums of terms of opposing signs. For example, the standard deviation for the 20 year GWP is 2–3 times larger using the ensemble-mean fields than using the individual models to calculate the RF. The source of this effect lies largely in the construction of the input ozone fields, which overestimate the true ensemble spread. Hence, while averages of multi-model fields are normally appropriate for calculating the mean RF, GWP and GTP, they are not a reliable method for calculating the uncertainty in these fields, and in general overestimate the uncertainty.
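The order-of-averaging effect can be demonstrated with a toy linear radiative kernel: the RF of the ensemble-mean field equals the ensemble-mean RF exactly, yet a "mean plus standard deviation" input field inflates the apparent spread. The kernel and ozone perturbation fields below are synthetic, not the ensemble's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_models, n_grid = 10, 500
kernel = rng.uniform(0.5, 1.5, n_grid)             # toy RF kernel per grid box
dO3 = rng.standard_normal((n_models, n_grid))      # per-model ozone changes

rf_per_model = dO3 @ kernel / n_grid               # RF from each model's field
rf_of_mean   = dO3.mean(axis=0) @ kernel / n_grid  # RF of ensemble-mean field

# True spread: std of the per-model RFs. Inflated spread: RF of the
# (mean + pointwise std) field minus RF of the mean field.
true_spread     = rf_per_model.std()
inflated_spread = ((dO3.mean(axis=0) + dO3.std(axis=0)) @ kernel / n_grid
                   - rf_of_mean)
```

Because the pointwise standard deviations all add with the same sign, the constructed field ignores spatial cancellation between models and so overstates the ensemble spread, which is the mechanism the abstract describes.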

Relevance: 100.00%

Abstract:

The Land surface Processes and eXchanges (LPX) model is a fire-enabled dynamic global vegetation model that performs well globally but has problems representing fire regimes and vegetative mix in savannas. Here we focus on improving the fire module. To improve the representation of ignitions, we introduced a treatment of lightning that allows the fraction of ground strikes to vary spatially and seasonally, realistically partitions strike distribution between wet and dry days, and varies the number of dry days with strikes. Fuel availability and moisture content were improved by implementing decomposition rates specific to individual plant functional types and litter classes, and litter drying rates driven by atmospheric water content. To improve water extraction by grasses, we use realistic plant-specific treatments of deep roots. To improve fire responses, we introduced adaptive bark thickness and post-fire resprouting for tropical and temperate broadleaf trees. All improvements are based on extensive analyses of relevant observational data sets. We test model performance for Australia, first evaluating parameterisations separately and then measuring overall behaviour against standard benchmarks. Changes to the lightning parameterisation produce a more realistic simulation of fires in southeastern and central Australia. Implementation of PFT-specific decomposition rates enhances performance in central Australia. Changes in fuel drying improve fire in northern Australia, while changes in rooting depth produce a more realistic simulation of fuel availability and structure in central and northern Australia. The introduction of adaptive bark thickness and resprouting produces more realistic fire regimes in Australian savannas. We also show that the model simulates biomass recovery rates consistent with observations from several different regions of the world characterised by resprouting vegetation.
The new model (LPX-Mv1) produces an improved simulation of observed vegetation composition and mean annual burnt area, by 33 and 18% respectively compared to LPX.

Relevance: 100.00%

Abstract:

Substantial low-frequency rainfall fluctuations occurred in the Sahel throughout the twentieth century, causing devastating drought. Modeling these low-frequency rainfall fluctuations has remained problematic for climate models for many years. Here we show using a combination of state-of-the-art rainfall observations and high-resolution global climate models that changes in organized heavy rainfall events carry most of the rainfall variability in the Sahel at multiannual to decadal time scales. Ability to produce intense, organized convection allows climate models to correctly simulate the magnitude of late-twentieth century rainfall change, underlining the importance of model resolution. Increasing model resolution allows a better coupling between large-scale circulation changes and regional rainfall processes over the Sahel. These results provide a strong basis for developing more reliable and skilful long-term predictions of rainfall (seasons to years) which could benefit many sectors in the region by allowing early adaptation to impending extremes.

Relevance: 100.00%

Abstract:

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors (BFs) for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating BFs that use simulation to circumvent evaluation of the intractable likelihood, and compares them to existing methods. An initial investigation into the theoretical and empirical properties of this class of methods is presented. In some cases we observe an advantage in the use of biased weight estimates, but we advocate caution in the use of such estimates.
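The core idea — estimating a normalising constant by importance sampling and forming a Bayes factor as a ratio of such estimates — can be sketched on synthetic Gaussian targets where the truth is known. This is not the paper's random-weight or SMC algorithm, just the basic estimator it builds on:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_q(theta, scale):
    """Unnormalised Gaussian density; true Z = scale * sqrt(2*pi)."""
    return -0.5 * (theta / scale) ** 2

def estimate_Z(scale, n=200_000, proposal_scale=3.0):
    """Importance-sampling estimate of Z = integral of exp(log_q)."""
    theta = rng.normal(0.0, proposal_scale, n)             # proposal draws
    log_p = (-0.5 * (theta / proposal_scale) ** 2
             - np.log(proposal_scale * np.sqrt(2 * np.pi)))  # proposal density
    w = np.exp(log_q(theta, scale) - log_p)                # importance weights
    return w.mean()

# Bayes factor between two "models" as a ratio of normalising constants;
# the true value here is (1 * sqrt(2*pi)) / (2 * sqrt(2*pi)) = 0.5.
bf = estimate_Z(1.0) / estimate_Z(2.0)
```

For genuinely intractable likelihoods the weights themselves cannot be evaluated exactly, which is where the paper's random-weight (simulated-weight) constructions come in.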