141 results for model potential
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
A model potential energy function for the ground state of H2CO has been derived which covers the whole space of the six internal coordinates. This potential reproduces the experimental energy, geometry and quadratic force field of formaldehyde, and dissociates correctly to all possible atom, diatom and triatom fragments. Thus there are good reasons for believing it to be close to the true potential energy surface except in regions where both hydrogen atoms are close to the oxygen. It leads to the prediction that there should be a metastable singlet hydroxycarbene HCOH which has a planar trans structure and an energy of 2.31 eV above that of equilibrium formaldehyde. The reaction path for dissociation into H2 + CO is predicted to pass through a low symmetry transition state with an activation energy of 4.8 eV. Both of these predictions are in good agreement with recently published ab initio calculations.
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (Lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which is the most successful statistical method depends on the region considered, GCM data used and prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to regions identified as potentially predictable from variance explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
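The Constructed Analogue method mentioned above can be illustrated with a minimal sketch (not the authors' implementation; the random "library" of SST anomaly states, the grid size and the lead time are all illustrative): the current anomaly field is expressed as a least-squares combination of past states from a long control run, and the same weights are applied to the states a fixed lead time later.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical library of annual-mean SST anomaly maps from a long
# control run: n_years states, each flattened to n_points grid cells.
n_years, n_points, lead = 200, 50, 5
library = rng.standard_normal((n_years, n_points))

def constructed_analogue(library, current_state, lead):
    """Predict the state `lead` years ahead by expressing the current
    anomaly as a least-squares combination of past states, then
    applying the same weights to the states `lead` years later."""
    predictors = library[:-lead]   # past states with a known future
    successors = library[lead:]    # the corresponding states `lead` years on
    # Solve predictors.T @ w ~= current_state for the weights w.
    w, *_ = np.linalg.lstsq(predictors.T, current_state, rcond=None)
    return successors.T @ w        # weighted sum of the successor states

forecast = constructed_analogue(library, library[-1], lead)
print(forecast.shape)  # one predicted anomaly value per grid point
```

In a perfect-model test like the one described, such a forecast would be verified against the model's own future states rather than observations.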
Abstract:
The tropical tropopause is considered to be the main region of upward transport of tropospheric air carrying water vapor and other tracers to the tropical stratosphere. The lower tropical stratosphere is also the region where the quasi-biennial oscillation (QBO) in the zonal wind is observed. The QBO is positioned in the region where the upward transport of tropospheric tracers to the overworld takes place. Hence the QBO can in principle modulate these transports by its secondary meridional circulation. This modulation is investigated in this study by an analysis of general circulation model (GCM) experiments with an assimilated QBO. The experiments show, first, that the temperature signal of the QBO modifies the specific humidity in the air transported upward and, second, that the secondary meridional circulation modulates the velocity of the upward transport. Thus during the eastward phase of the QBO the upward moving air is moister and the upward velocity is less than during the westward phase of the QBO. It was further found that the QBO period is too short to allow an equilibration of the moisture in the QBO region. This causes a QBO signal of the moisture which is considerably smaller than what could be obtained in the limiting case of indefinitely long QBO phases. This also allows a high sensitivity of the mean moisture over a QBO cycle to the El Niño-Southern Oscillation (ENSO) phenomenon or major tropical volcanic eruptions. The interplay of sporadic volcanic eruptions, ENSO, and QBO can produce low-frequency variability in the water vapor content of the tropical stratosphere, which renders the isolation of the QBO signal in observational data of water vapor in the equatorial lower stratosphere difficult.
Abstract:
A simple four-dimensional assimilation technique, called Newtonian relaxation, has been applied to the Hamburg climate model (ECHAM) to enable comparison of model output with observations for short periods of time. The prognostic model variables vorticity, divergence, temperature, and surface pressure have been relaxed toward European Centre for Medium-Range Weather Forecasts (ECMWF) global meteorological analyses. Several experiments have been carried out, in which the values of the relaxation coefficients have been varied to find out which values are most usable for our purpose. To be able to use the method for validation of model physics or chemistry, good agreement of the model-simulated mass and wind fields is required. In addition, the model physics should not be disturbed too strongly by the relaxation forcing itself. Both aspects have been investigated. Good agreement with basic observed quantities, like wind, temperature, and pressure, is obtained for most simulations in the extratropics. Derived variables, like precipitation and evaporation, have been compared with ECMWF forecasts and observations. Agreement for these variables is poorer than for the basic observed quantities. Nevertheless, considerable improvement is obtained relative to a control run without assimilation. Differences between tropics and extratropics are smaller than for the basic observed quantities. Results also show that precipitation and evaporation are affected by a sort of continuous spin-up which is introduced by the relaxation: the bias (ECMWF-ECHAM) increases with increasing relaxation forcing. In agreement with this result, we found that with increasing relaxation forcing the vertical exchange of tracers by turbulent boundary layer mixing and, to a lesser extent, by convection is reduced.
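Newtonian relaxation (nudging) simply adds a restoring term toward the analysis to each prognostic tendency, dx/dt = F(x) + (x_analysis - x)/tau, where 1/tau is the relaxation coefficient varied in the experiments. A toy sketch for a single scalar variable (the tendency function and all values are illustrative):

```python
def step_with_nudging(x, x_analysis, model_tendency, dt, tau):
    """One explicit time step of Newtonian relaxation (nudging):
    dx/dt = F(x) + (x_analysis - x) / tau,
    where tau is the relaxation time scale."""
    return x + dt * (model_tendency(x) + (x_analysis - x) / tau)

# Toy example: a variable that would freely decay toward 0 is nudged
# toward an "analysis" value of 1.0; a short tau keeps it close to it.
tendency = lambda x: -0.1 * x
x = 0.0
for _ in range(100):
    x = step_with_nudging(x, 1.0, tendency, dt=0.1, tau=0.5)
print(round(x, 2))  # settles near the balance value 2/2.1 ~ 0.95
```

The steady state is a compromise between the model's own tendency and the relaxation forcing, which is why too strong a forcing can disturb the model physics, as the abstract notes.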
Abstract:
The impact of climate change on wind power generation potentials over Europe is investigated by considering ensemble projections from two regional climate models (RCMs) driven by a global climate model (GCM). Wind energy density and its interannual variability are estimated based on hourly near-surface wind speeds. Additionally, the possible impact of climatic changes on the energy output of a sample 2.5-MW turbine is discussed. GCM-driven RCM simulations capture the behavior and variability of current wind energy indices, even though some differences exist when compared with reanalysis-driven RCM simulations. Toward the end of the twenty-first century, projections show significant changes of energy density on annual average across Europe that are substantially stronger in seasonal terms. The emergence time of these changes varies from region to region and season to season, but some long-term trends are already statistically significant in the middle of the twenty-first century. Over northern and central Europe, the wind energy potential is projected to increase, particularly in winter and autumn. In contrast, energy potential over southern Europe may experience a decrease in all seasons except for the Aegean Sea. Changes for wind energy output follow the same patterns but are of smaller magnitude. The GCM/RCM model chains project a significant intensification of both interannual and intra-annual variability of energy density over parts of western and central Europe, thus imposing new challenges to a reliable pan-European energy supply in future decades.
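The wind energy density estimated from hourly near-surface wind speeds scales with the cube of the speed, which is why it must be computed from the hourly values before averaging. A small sketch with illustrative wind speeds and the standard surface air density:

```python
import numpy as np

def wind_energy_density(v, rho=1.225):
    """Kinetic energy flux density of the wind (W m**-2) from
    wind speed v (m s**-1); rho is air density (kg m**-3)."""
    return 0.5 * rho * v**3

# Hypothetical hourly wind speeds for one day at one grid point.
hourly_v = np.array([4.0, 5.5, 7.0, 6.2] * 6)
mean_density = float(wind_energy_density(hourly_v).mean())
print(round(mean_density, 1))
```

Because of the cubic dependence, averaging the speeds first and then cubing would understate the resource; the same nonlinearity is what makes energy density sensitive to the intra-annual variability discussed above.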
Abstract:
This study examines, in a unified fashion, the budgets of ocean gravitational potential energy (GPE) and available gravitational potential energy (AGPE) in the control simulation of the coupled atmosphere–ocean general circulation model HadCM3. Only AGPE can be converted into kinetic energy by adiabatic processes. Diapycnal mixing supplies GPE, but not AGPE, whereas the reverse is true of the combined effect of surface buoyancy forcing and convection. Mixing and buoyancy forcing, thus, play complementary roles in sustaining the large-scale circulation. However, the largest globally integrated source of GPE is resolved advection (+0.57 TW) and the largest sink is through parameterized eddy transports (-0.82 TW). The effect of these adiabatic processes on AGPE is identical to their effect on GPE, except for perturbations to both budgets due to numerical leakage exacerbated by non-linearities in the equation of state.
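The GPE/AGPE distinction can be illustrated with a much simpler construction than the one used in the paper: AGPE is the GPE of a column minus the GPE of an adiabatically resorted reference state. The sketch below assumes a single 1D column, constant gravity, and a density-sorting reference state (i.e. it ignores the equation-of-state nonlinearities the abstract highlights); all values are illustrative.

```python
import numpy as np

g = 9.81  # gravitational acceleration (m s**-2)

# Hypothetical 1D water column: in-situ densities (kg m**-3) on equally
# spaced levels, z measured upward from the bottom (m). The column is
# deliberately statically unstable so that AGPE is non-zero.
rho = np.array([1027.0, 1025.5, 1026.5, 1024.0])
dz = 100.0
z = (np.arange(rho.size) + 0.5) * dz

gpe = g * np.sum(rho * z) * dz          # GPE per unit area (J m**-2)

# Reference state: the same water rearranged adiabatically so the
# densest water sits lowest; its GPE is the unavailable part.
rho_ref = np.sort(rho)[::-1]
gpe_ref = g * np.sum(rho_ref * z) * dz
agpe = gpe - gpe_ref                    # available GPE (J m**-2), >= 0
print(agpe > 0.0)
```

Diapycnal mixing raises `gpe` and `gpe_ref` together (no AGPE gain), while buoyancy forcing and convection change the two differently, which is the complementarity the abstract describes.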
Abstract:
A basic data requirement of a river flood inundation model is a Digital Terrain Model (DTM) of the reach being studied. The scale at which modeling is required determines the accuracy required of the DTM. For modeling floods in urban areas, a high resolution DTM such as that produced by airborne LiDAR (Light Detection And Ranging) is most useful, and large parts of many developed countries have now been mapped using LiDAR. In remoter areas, it is possible to model flooding on a larger scale using a lower resolution DTM, and in the near future the DTM of choice is likely to be that derived from the TanDEM-X Digital Elevation Model (DEM). A variable-resolution global DTM obtained by combining existing high and low resolution data sets would be useful for modeling flood water dynamics globally, at high resolution wherever possible and at lower resolution over larger rivers in remote areas. A further important data resource used in flood modeling is the flood extent, commonly derived from Synthetic Aperture Radar (SAR) images. Flood extents become more useful if they are intersected with the DTM, since water level observations (WLOs) at the flood boundary can then be estimated at various points along the river reach. To illustrate the utility of such a global DTM, two examples of recent research involving WLOs at opposite ends of the spatial scale are discussed. The first requires high resolution spatial data, and involves the assimilation of WLOs from a real sequence of high resolution SAR images into a flood model to update the model state with observations over time, and to estimate river discharge and model parameters, including river bathymetry and friction. The results indicate the feasibility of such an Earth Observation-based flood forecasting system. The second example is at a larger scale, and uses SAR-derived WLOs to improve the lower-resolution TanDEM-X DEM in the area covered by the flood extents. The resulting reduction in random height error is significant.
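The intersection of a flood extent with a DTM to obtain WLOs amounts to reading off the ground height at the wet/dry boundary, where the water surface meets the terrain. A minimal 1D sketch (the transect heights and flood mask are invented for illustration; real extraction works on 2D SAR-derived shorelines):

```python
import numpy as np

# Hypothetical cross-section: DTM heights (m) along a transect and a
# SAR-derived flood mask (True = flooded).
dtm = np.array([12.0, 10.5, 9.8, 9.1, 8.7, 9.0, 9.9, 11.2])
flood_mask = np.array([False, False, True, True, True, True, False, False])

def water_level_observations(dtm, flood_mask):
    """DTM heights of the outermost flooded cells: at the wet/dry
    boundary the ground height approximates the water level."""
    wet = np.flatnonzero(flood_mask)
    return dtm[wet[0]], dtm[wet[-1]]  # left and right shoreline heights

left, right = water_level_observations(dtm, flood_mask)
print(left, right)
```

The two shoreline heights need not agree exactly; in practice the differences reflect DTM error and water surface slope, which is what makes WLOs useful both for assimilation and for correcting the DEM itself.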
Abstract:
This article describes a novel algorithmic development extending the contour advective semi-Lagrangian model to include nonconservative effects. The Lagrangian contour representation of finescale tracer fields, such as potential vorticity, allows for conservative, nondiffusive treatment of sharp gradients, permitting very high numerical Reynolds numbers. It has been widely employed in accurate geostrophic turbulence and tracer advection simulations. In the present diabatic version of the model, the constraint of conservative dynamics is overcome by including a parallel Eulerian field that absorbs the nonconservative (diabatic) tendencies. The diabatic buildup in this Eulerian field is limited through regular, controlled transfers of this field to the contour representation. This transfer is done with a fast, newly developed contouring algorithm. The model has been implemented for several idealized geometries. In this paper a single-layer doubly periodic geometry is used to demonstrate the validity of the model. The present model converges faster than analogous semi-Lagrangian models at increased resolutions. At the same nominal spatial resolution the new model is 40 times faster than the analogous semi-Lagrangian model. Results of an orographically forced idealized storm track show a nontrivial dependency of storm-track statistics on resolution and on the numerical model employed. If this result is more generally applicable, it may have important consequences for future high-resolution climate modeling.
Abstract:
The constant-density Charney model describes the simplest unstable basic state with a planetary-vorticity gradient, which is uniform and positive, and baroclinicity that is manifest as a negative contribution to the potential-vorticity (PV) gradient at the ground and positive vertical wind shear. Together, these ingredients satisfy the necessary conditions for baroclinic instability. In Part I it was shown how baroclinic growth on a general zonal basic state can be viewed as the interaction of pairs of ‘counter-propagating Rossby waves’ (CRWs) that can be constructed from a growing normal mode and its decaying complex conjugate. In this paper the normal-mode solutions for the Charney model are studied from the CRW perspective.
Clear parallels can be drawn between the most unstable modes of the Charney model and the Eady model, in which the CRWs can be derived independently of the normal modes. However, the dispersion curves for the two models are very different; the Eady model has a short-wave cut-off, while the Charney model is unstable at short wavelengths. Beyond its maximum growth rate the Charney model has a neutral point at finite wavelength (r=1). Thereafter follows a succession of unstable branches, each with weaker growth than the last, separated by neutral points at integer r (the so-called 'Green branches'). A separate branch of westward-propagating neutral modes also originates from each neutral point. By approximating the lower CRW as a Rossby edge wave and the upper CRW structure as a single PV peak with a spread proportional to the Rossby scale height, the main features of the 'Charney branch' (0 < r < 1) are explained.
Abstract:
We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, nonetheless SCM testing remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial condition uncertainty and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS community, simulating the transitions between active and suppressed periods of tropical convection.
Abstract:
We explore the potential predictability of rapid changes in the Atlantic meridional overturning circulation (MOC) using a coupled global climate model (HadCM3). Rapid changes in the temperature and salinity of surface water in the Nordic Seas, and the flow of dense water through Denmark Strait, are found to be precursors to rapid changes in the model's MOC, with a lead time of around 10 years. The mechanism proposed to explain this potential predictability relies on the development of density anomalies in the Nordic Seas which propagate through Denmark Strait and along the deep western boundary current, affecting the overturning. These rapid changes in the MOC have significant, and widespread, climate impacts which are potentially predictable a few years ahead. Whilst the flow through Denmark Strait is too strong in HadCM3, the presence of such potential predictability motivates the monitoring of water properties in the Nordic Seas and Denmark Strait.
Abstract:
Data assimilation is a sophisticated mathematical technique for combining observational data with model predictions to produce state and parameter estimates that most accurately approximate the current and future states of the true system. The technique is commonly used in atmospheric and oceanic modelling, combining empirical observations with model predictions to produce more accurate and well-calibrated forecasts. Here, we consider a novel application within a coastal environment and describe how the method can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using a technique known as state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme, where the specification of the error covariance matrices is crucial for success. A simple 1D model of bed-form propagation is used to demonstrate the method. The scheme is capable of recovering near-perfect parameter values and, therefore, improves the capability of our model to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
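State augmentation appends the uncertain parameters to the state vector so that a single analysis updates both. For a linear observation operator the 3D-Var cost function has the closed-form minimiser z_a = z_b + K(y - Hz_b) with K = BHᵀ(HBHᵀ + R)⁻¹; the cross-covariances in B, whose specification the abstract stresses, are what carry the observation increment into the unobserved parameters. A toy sketch (the two-variable setup and all matrix values are invented for illustration, not the paper's morphodynamic model):

```python
import numpy as np

def threedvar_augmented(zb, B, H, R, y):
    """3D-Var analysis of the augmented vector zb = [state; parameters]
    with a linear observation operator H:
    za = zb + K (y - H zb),  K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return zb + K @ (y - H @ zb)

# Toy setting: one bathymetry-like state variable (observed) and one
# model parameter (unobserved), with a positive cross-covariance in B.
zb = np.array([1.0, 0.2])                # background [state, parameter]
B = np.array([[1.0, 0.5], [0.5, 1.0]])   # background error covariance
H = np.array([[1.0, 0.0]])               # only the state is observed
R = np.array([[0.1]])                    # observation error covariance
y = np.array([2.0])                      # observation

za = threedvar_augmented(zb, B, H, R, y)
print(za)  # the parameter estimate moves although it is unobserved
```

Setting the off-diagonal entries of B to zero would leave the parameter untouched, which is why the error covariance specification is crucial for the method's success.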
Abstract:
We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial condition uncertainty and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating transitions between active and suppressed periods of tropical convection.
Abstract:
Faced with the realities of a changing climate, decision makers in a wide variety of organisations are increasingly seeking quantitative predictions of regional and local climate. An important issue for these decision makers, and for the organisations that fund climate research, is the potential for climate science to deliver improvements - especially reductions in uncertainty - in such predictions. Uncertainty in climate predictions arises from three distinct sources: internal variability, model uncertainty and scenario uncertainty. Using data from a suite of climate models, we separate and quantify these sources. For predictions of changes in surface air temperature on decadal timescales and regional spatial scales, we show that uncertainty for the next few decades is dominated by sources (model uncertainty and internal variability) that are potentially reducible through progress in climate science. Furthermore, we find that model uncertainty is of greater importance than internal variability. Our findings have implications for managing adaptation to a changing climate. Because the costs of adaptation are very large, and greater uncertainty about future climate is likely to be associated with more expensive adaptation, reducing uncertainty in climate predictions is potentially of enormous economic value. We highlight the need for much more work to compare: a) the cost of various degrees of adaptation, given current levels of uncertainty; and b) the cost of new investments in climate science to reduce current levels of uncertainty. Our study also highlights the importance of targeting climate science investments on the most promising opportunities to reduce prediction uncertainty.
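The three-way separation of prediction uncertainty can be sketched with a simple variance decomposition over a synthetic multi-scenario, multi-model, multi-realisation ensemble (the decomposition below is one common convention, not necessarily the exact method of the study; all data are randomly generated for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical decadal-mean temperature-change projections with
# dimensions (scenario, model, realisation).
n_scen, n_model, n_real = 3, 8, 5
forced = rng.standard_normal((n_scen, n_model, 1))     # forced response
data = forced + 0.3 * rng.standard_normal((n_scen, n_model, n_real))

# Internal variability: spread across realisations of one
# model/scenario pair, averaged over pairs.
internal = data.var(axis=2).mean()
# Model uncertainty: spread of the model means within each scenario.
model_u = data.mean(axis=2).var(axis=1).mean()
# Scenario uncertainty: spread of the multi-model mean across scenarios.
scenario_u = data.mean(axis=(1, 2)).var()

print(internal, model_u, scenario_u)
```

In such a decomposition the scenario term grows with lead time while the internal term does not, which is consistent with the finding that model uncertainty and internal variability dominate over the next few decades.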