999 results for Giorgione, 1477-1511.


Abstract:

In its default configuration, the Hadley Centre climate model (GA2.0) simulates roughly one-half the observed level of Madden–Julian oscillation (MJO) activity, with MJO events often lasting fewer than seven days. We use initialised, climate-resolution hindcasts to examine the sensitivity of the GA2.0 MJO to a range of changes in sub-grid parameterisations and model configurations. All 22 changes are tested for two cases during the Years of Tropical Convection. Improved skill comes only from (a) disabling vertical momentum transport by convection and (b) increasing the mixing entrainment and detrainment rates for deep and mid-level convection. These changes are subsequently tested in a further 14 hindcast cases; only (b) consistently improves MJO skill, from 12 to 22 days. In a 20-year integration, (b) produces near-observed levels of MJO activity, but propagation through the Maritime Continent remains weak. With default settings, GA2.0 produces precipitation too readily, even in anomalously dry columns. Implementing (b) decreases the efficiency of convection, permitting instability to build during the suppressed MJO phase and producing a more favourable environment for the active phase. The distribution of daily rain rates is more consistent with satellite data; default entrainment produces rates of 6–12 mm/day too frequently. These results are consistent with recent studies showing that greater sensitivity of convection to moisture improves the representation of the MJO.
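
To illustrate the kind of rain-rate distribution comparison referred to above, the sketch below bins two synthetic daily rain-rate series (standing in for the model and the satellite data; the gamma parameters are arbitrary assumptions, not values from the study) and reports how often rates fall in each band, including the 6–12 mm/day range:

```python
import numpy as np

# Illustrative check of how often daily rain rates fall in each band,
# comparing two synthetic series standing in for model and satellite data.
rng = np.random.default_rng(0)
model_rain = rng.gamma(shape=1.2, scale=6.0, size=7300)   # assumed, mm/day
obs_rain = rng.gamma(shape=0.8, scale=8.0, size=7300)     # assumed, mm/day

bins = np.array([0, 1, 3, 6, 12, 25, 50, 200])             # mm/day bin edges
for name, series in (("model", model_rain), ("satellite", obs_rain)):
    freq, _ = np.histogram(series, bins=bins)
    print(name, np.round(freq / series.size, 3))            # relative frequency per band
```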

Abstract:

Particle filters are fully non-linear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high-dimensional geophysical applications the number of particles required by the sequential importance resampling (SIR) particle filter to capture the high-probability region of the posterior is too large to make the method usable. However, particle filters can be formulated using proposal densities, which gives greater freedom in how particles are sampled and allows for a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high-probability region of the posterior probability density function. This opens the possibility of non-linear data assimilation in high-dimensional systems. The particle filter formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that when observations are available at every time step, both of these schemes will be degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model.
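
For context, the sketch below implements the standard SIR particle filter on the Lorenz (1963) model, the baseline whose degeneracy motivates proposal-density filters; the observation operator, noise amplitudes and resampling threshold are illustrative assumptions, and the proposal-density scheme introduced in the paper is not reproduced here.

```python
import numpy as np

def lorenz63(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) model (illustrative only)."""
    x, y, z = state[..., 0], state[..., 1], state[..., 2]
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.stack([dx, dy, dz], axis=-1)

def sir_step(particles, weights, obs, obs_std=1.0, model_std=0.1, rng=None):
    """One cycle of a standard SIR particle filter.
    Observation operator: only the x component is observed (an assumption)."""
    rng = np.random.default_rng() if rng is None else rng
    # Forecast: propagate each particle and add model noise. The proposal here
    # is the model itself, i.e. the standard SIR choice, not the paper's scheme.
    particles = lorenz63(particles) + model_std * rng.standard_normal(particles.shape)
    # Update: weight by the Gaussian likelihood of the observation.
    innov = obs - particles[:, 0]
    weights = weights * np.exp(-0.5 * (innov / obs_std) ** 2)
    weights /= weights.sum()
    # Resample when the effective ensemble size collapses (degeneracy).
    n = len(weights)
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Minimal usage: 100 particles, one assimilation cycle with an assumed observation.
rng = np.random.default_rng(0)
parts = np.array([1.0, 1.0, 20.0]) + rng.standard_normal((100, 3))
wts = np.full(100, 0.01)
parts, wts = sir_step(parts, wts, obs=1.2, rng=rng)
```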

Abstract:

Airborne dust affects the Earth's energy balance, an impact that is measured in terms of the implied change in net radiation (or radiative forcing, in W m⁻²) at the top of the atmosphere. There remains considerable uncertainty in the magnitude and sign of direct forcing by airborne dust under the current climate. Much of this uncertainty stems from simplified assumptions about mineral dust particle size, composition and shape, which are applied in remote sensing retrievals of dust characteristics and in dust-cycle models. Improved estimates of direct radiative forcing by dust will require improved characterization of the spatial variability in particle characteristics to provide reliable information on dust optical properties. This includes constraints on: (1) particle-size distribution, including discrimination of particle subpopulations and quantification of the amount of dust in the sub-10 µm to <0.1 µm mass fraction; (2) particle composition, specifically the abundance of iron oxides, and whether particles consist of single- or multi-mineral grains; (3) particle shape, including the degree of sphericity and surface roughness, as a function of size and mineralogy; and (4) the degree to which dust particles are aggregated together. The use of techniques that measure the size, composition and shape of individual particles will provide a better basis for optical modelling.

Abstract:

Climate sensitivity is defined as the change in global mean equilibrium temperature after a doubling of atmospheric CO2 concentration and provides a simple measure of global warming. An early estimate of climate sensitivity, 1.5–4.5°C, has changed little subsequently, including in the latest assessment by the Intergovernmental Panel on Climate Change. The persistence of such large uncertainties in this simple measure casts doubt on our understanding of the mechanisms of climate change and our ability to predict the response of the climate system to future perturbations. This has motivated continued attempts to constrain the range with climate data, alone or in conjunction with models. The majority of studies use data from the instrumental period (post-1850), but recent work has made use of information about the large climate changes experienced in the geological past. In this review, we first outline approaches that estimate climate sensitivity using instrumental climate observations and then summarize attempts to use the record of climate change on geological timescales. We examine the limitations of these studies and suggest ways in which the power of the palaeoclimate record could be better used to reduce uncertainties in our predictions of climate sensitivity.
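
The definition in the first sentence can be made concrete with a zero-dimensional energy balance: at equilibrium the forcing from doubled CO2 is balanced by a feedback term proportional to the warming, so the sensitivity is the ratio of the two. The sketch below uses the commonly quoted forcing of about 3.7 W m⁻² and three illustrative feedback parameters to show how the quoted 1.5–4.5°C range corresponds to feedback strength; the numbers are indicative only, not values from the review.

```python
# Zero-dimensional energy-balance view of climate sensitivity (illustrative).
# At equilibrium the radiative forcing F is balanced by the feedback response
# lambda * dT, so the equilibrium warming for doubled CO2 is dT = F_2x / lambda.
F_2x = 3.7                     # W m-2, commonly quoted forcing for doubled CO2
for lam in (0.8, 1.2, 2.5):    # W m-2 K-1, illustrative feedback parameters
    print(f"lambda = {lam:.1f} W m-2 K-1  ->  sensitivity = {F_2x / lam:.1f} K")
```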

Abstract:

The interannual variability of the stratospheric winter polar vortex is correlated with the phase of the quasi-biennial oscillation (QBO) of tropical stratospheric winds. This dynamical coupling between high and low latitudes, often referred to as the Holton–Tan effect, has been the subject of numerous observational and modelling studies, yet important questions regarding its mechanism remain unanswered. In particular, it remains unclear which vertical levels of the QBO exert the strongest influence on the winter polar vortex, and how QBO–vortex coupling interacts with the effects of other sources of atmospheric interannual variability such as the 11-year solar cycle or the El Niño–Southern Oscillation. As stratosphere-resolving general circulation models begin to resolve the QBO and represent its teleconnections with other parts of the climate system, it seems timely to summarize what is currently known about the QBO’s high-latitude influence. In this review article, we offer a synthesis of the modelling and observational analyses of QBO–vortex coupling that have appeared in the literature, and update the observational record.

Abstract:

Simultaneous scintillometer measurements at multiple wavelengths (pairing visible or infrared with millimetre or radio waves) have the potential to provide estimates of path-averaged surface fluxes of sensible and latent heat. Traditionally, the equations to deduce fluxes from measurements of the refractive index structure parameter at the two wavelengths have been formulated in terms of absolute humidity. Here, it is shown that formulation in terms of specific humidity has several advantages. Specific humidity satisfies the requirement for a conserved variable in similarity theory and inherently accounts for density effects misapportioned through the use of absolute humidity. The validity and interpretation of both formulations are assessed and the analogy with open-path infrared gas analyser density corrections is discussed. Original derivations using absolute humidity to represent the influence of water vapour are shown to misrepresent the latent heat flux. The errors in the flux, which depend on the Bowen ratio (larger for drier conditions), may be of the order of 10%. The sensible heat flux is shown to remain unchanged. It is also verified that use of a single scintillometer at optical wavelengths is essentially unaffected by these new formulations. Where it may not be possible to reprocess two-wavelength results, a density correction to the latent heat flux is proposed for scintillometry, which can be applied retrospectively to reduce the error.
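
To make the distinction between the two humidity variables concrete: absolute humidity is the water-vapour density, while specific humidity is the vapour mass per unit mass of moist air and so behaves as a conserved variable when the air density changes. A minimal conversion sketch follows, using standard thermodynamic constants; the scintillometer flux formulations and correction terms derived in the paper are not reproduced here.

```python
# Convert absolute humidity (water-vapour density, kg m-3) to specific humidity
# (mass of vapour per mass of moist air, kg kg-1). Constants are standard values.
Rd = 287.05   # gas constant for dry air, J kg-1 K-1
Rv = 461.5    # gas constant for water vapour, J kg-1 K-1

def specific_humidity(rho_v, pressure, temperature):
    """rho_v in kg m-3, pressure in Pa, temperature in K."""
    e = rho_v * Rv * temperature                   # vapour pressure (ideal gas)
    rho_d = (pressure - e) / (Rd * temperature)    # dry-air density
    return rho_v / (rho_d + rho_v)                 # q = rho_v / rho (moist air)

# Example: 10 g m-3 of vapour at 1000 hPa and 293 K gives roughly 8.5 g kg-1.
print(round(specific_humidity(0.010, 1.0e5, 293.0), 5))
```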

Abstract:

We demonstrate that summer precipitation biases in the South Asian monsoon domain are sensitive to increasing the convective parametrisation’s entrainment and detrainment rates in the Met Office Unified Model. We explore this sensitivity to improve our understanding of the biases and to inform efforts to improve the convective parametrisation. We perform novel targeted experiments in which we increase the entrainment and detrainment rates in regions of especially large precipitation bias. We use these experiments to determine whether the sensitivity at a given location is a consequence of the local change to convection or is a remote response to the change elsewhere. We find that a local change leads to different mean-state responses in comparable regions. When the entrainment and detrainment rates are increased globally, feedbacks between regions usually strengthen the local responses. We choose two regions of tropical ascent that show different mean-state responses, the western equatorial Indian Ocean and the western north Pacific, and analyse them as case studies to determine the mechanisms leading to the different responses. Our results indicate that several aspects of a region’s mean state, including moisture content, sea surface temperature and circulation, play a role in local feedbacks that determine the response to increased entrainment and detrainment.

Abstract:

We perform simulations of several convective events over the southern UK with the Met Office Unified Model (UM) at horizontal grid lengths ranging from 1.5 km to 200 m. Comparing the simulated storms for these events with observations from the Met Office rainfall radar network allows us to apply a statistical approach to evaluate the properties and evolution of the simulated storms over a range of conditions. Here we present results comparing storm morphology in the model and in reality, which show that the simulated storms become smaller as the grid length decreases and that the grid length that best fits the observations changes with the size of the observed cells. We investigate the sensitivity of storm morphology in the model to the mixing length used in the subgrid turbulence scheme. As the subgrid mixing length is decreased, the number of small storms with high area-averaged rain rates increases. We show that by changing the mixing length we can produce a lower-resolution simulation with storm morphologies similar to those of a higher-resolution simulation.
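
A statistical evaluation of storm morphology of this kind typically begins by identifying contiguous rainy cells in a two-dimensional rain-rate field and summarising their sizes and area-averaged rain rates. The sketch below does this with connected-component labelling; the rain-rate threshold, connectivity and synthetic input field are illustrative choices, not the paper's exact method.

```python
import numpy as np
from scipy import ndimage

def storm_cell_stats(rain_rate, dx_km, threshold=1.0):
    """Label contiguous rainy regions in a 2-D rain-rate field (mm h-1) and
    return the equivalent diameter (km) and area-averaged rain rate of each cell.
    Threshold and connectivity are illustrative choices."""
    mask = rain_rate >= threshold
    labels, n_cells = ndimage.label(mask)            # 4-connected components
    stats = []
    for cell in range(1, n_cells + 1):
        in_cell = labels == cell
        area_km2 = in_cell.sum() * dx_km ** 2
        diameter_km = 2.0 * np.sqrt(area_km2 / np.pi)  # equivalent diameter
        stats.append((diameter_km, rain_rate[in_cell].mean()))
    return stats

# Example with a random field standing in for model or radar data.
rng = np.random.default_rng(0)
field = rng.gamma(shape=0.3, scale=5.0, size=(200, 200))
print(len(storm_cell_stats(field, dx_km=1.5)))        # number of identified cells
```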

Abstract:

Precipitation forecast data from the ERA-Interim reanalysis (33 years) are evaluated against the daily England and Wales Precipitation (EWP) observations obtained from a rain gauge network. Observed and reanalysis daily precipitation data are both described well by Weibull distributions with indistinguishable shape parameters but different scale parameters, such that the reanalysis underestimates the observations by 22% on average. The correlation between the observed and ERA-Interim time series of regional, daily precipitation is 0.91. ERA-Interim also captures the statistics of extreme precipitation, including a slightly lower likelihood of the heaviest precipitation events (>15 mm day⁻¹ for the regional average) than indicated by the Weibull fit. ERA-Interim is also closer to EWP for these high-precipitation events. Since these carry weight in longer accumulations, a smaller underestimation of 19% is found for monthly mean precipitation. The partition between convective and stratiform precipitation in the ERA-Interim forecast is also examined. In summer both components contribute equally to the total precipitation amount, while in winter the stratiform precipitation is approximately double the convective. These results are expected to be relevant to other regions with low orography on the coast of a continent at the downstream end of the mid-latitude storm tracks.
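
As an illustration of the Weibull comparison described above, the sketch below fits two-parameter Weibull distributions to two daily precipitation series and reports their shape and scale parameters together with the correlation; the synthetic data, the wet-day threshold and the choice to fix the location parameter at zero are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stand-ins for observed (EWP) and reanalysis daily precipitation (mm/day).
obs = stats.weibull_min.rvs(c=0.85, scale=3.0, size=12000, random_state=rng)
era = 0.78 * obs + 0.3 * rng.standard_normal(obs.size).clip(min=0)  # crude underestimate

for name, series in (("EWP", obs), ("ERA-Interim", era)):
    wet = series[series > 0.1]                              # fit wet days only (a choice)
    shape, loc, scale = stats.weibull_min.fit(wet, floc=0)  # fix location at zero
    print(f"{name:12s} shape={shape:.2f} scale={scale:.2f} mm/day")

# Correlation of the two daily series, analogous to the 0.91 quoted above.
print("correlation:", np.corrcoef(obs, era)[0, 1].round(2))
```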

Abstract:

The time discretization in weather and climate models introduces truncation errors that limit the accuracy of the simulations. Recent work has yielded a method for reducing the amplitude errors in leapfrog integrations from first-order to fifth-order. This improvement is achieved by replacing the Robert–Asselin filter with the RAW filter and using a linear combination of the unfiltered and filtered states to compute the tendency term. The purpose of the present paper is to apply the composite-tendency RAW-filtered leapfrog scheme to semi-implicit integrations. A theoretical analysis shows that the stability and accuracy are unaffected by the introduction of the implicitly treated mode. The scheme is tested in semi-implicit numerical integrations in both a simple nonlinear stiff system and a medium-complexity atmospheric general circulation model, and yields substantial improvements in both cases. We conclude that the composite-tendency RAW-filtered leapfrog scheme is suitable for use in semi-implicit integrations.
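
For reference, the sketch below shows a plain RAW-filtered leapfrog integration of the oscillation equation dz/dt = iωz, the building block that the composite-tendency scheme extends; the filter strength, RAW parameter and time step are illustrative, and the composite-tendency and semi-implicit aspects described above are not reproduced.

```python
import numpy as np

def leapfrog_raw(omega=1.0, dt=0.2, nsteps=500, nu=0.2, alpha=0.53):
    """Leapfrog integration of dz/dt = i*omega*z with the RAW filter.
    nu is the filter strength and alpha the RAW parameter (alpha = 1 recovers
    the Robert-Asselin filter). Parameter values are illustrative only."""
    z_prev = 1.0 + 0.0j                          # filtered state at step n-1
    z_curr = z_prev * np.exp(1j * omega * dt)    # exact value for the first step
    out = [z_prev, z_curr]
    for _ in range(nsteps - 1):
        z_next = z_prev + 2.0 * dt * 1j * omega * z_curr   # leapfrog step
        d = 0.5 * nu * (z_prev - 2.0 * z_curr + z_next)    # filter displacement
        z_prev = z_curr + alpha * d              # RAW: add alpha*d at level n ...
        z_next = z_next + (alpha - 1.0) * d      # ... and (alpha-1)*d at level n+1
        z_curr = z_next
        out.append(z_curr)
    return np.array(out)

# Amplitude after 500 steps: should remain close to 1 for a well-behaved filter.
print(abs(leapfrog_raw()[-1]))
```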

Abstract:

Sea ice plays a crucial role in the Earth's energy and water budget and substantially impacts local and remote atmospheric and oceanic circulations. Predictions of Arctic sea ice conditions a few months to a few years in advance could be of interest to stakeholders. This article presents a review of the potential sources of Arctic sea ice predictability on these timescales. Predictability mainly originates from the persistence or advection of sea ice anomalies, interactions with the ocean and atmosphere, and changes in radiative forcing. After estimating the inherent potential predictability limit with state-of-the-art models, current sea ice forecast systems are described, together with their performance. Finally, some challenges and issues in sea ice forecasting are presented, along with suggestions for future research priorities.

Abstract:

With the prospect of exascale computing, computational methods requiring only local data become especially attractive. Consequently, the typical domain decomposition of atmospheric models means that horizontally-explicit, vertically-implicit (HEVI) time-stepping schemes warrant further attention. Here, Runge-Kutta implicit-explicit schemes from the literature are assessed for stability and accuracy using a von Neumann analysis of two linear systems. Attention is paid to the numerical phase to indicate the behaviour of phase and group velocities. Where the analysis is tractable, analytically derived expressions are considered. For more complicated cases, amplification factors have been numerically generated and the associated amplitudes and phases diagnosed. Analysis of a system describing acoustic waves has necessitated attributing the three resultant eigenvalues to the three physical modes of the system. To do so, a series of algorithms has been devised to track the eigenvalues across the frequency space. The result enables analysis of whether the schemes exactly preserve the non-divergent mode, and whether there is evidence of spurious reversal in the direction of group velocities or asymmetry in the damping for the pair of acoustic modes. Frequency ranges that span next-generation high-resolution weather models to coarse-resolution climate models are considered, and a comparison is made of the errors accumulated from multiple stability-constrained shorter time steps of the HEVI scheme with those from a single integration of a fully implicit scheme over the same time interval. Two schemes, “Trap2(2,3,2)” and “UJ3(1,3,2)”, both already used in atmospheric models, are identified as offering consistently good stability and representation of phase across all the analyses. Furthermore, according to a simple measure of computational cost, “Trap2(2,3,2)” is the least expensive.
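
The flavour of such a von Neumann analysis can be seen with a deliberately simple implicit-explicit scheme: a first-order forward-backward step for du/dt = i(ω_slow + ω_fast)u, with the slow frequency treated explicitly and the fast one implicitly. The sketch below computes the amplification factor and reports its modulus and phase error across a range of fast frequencies; it is not one of the Runge-Kutta schemes analysed in the paper, and all parameter values are assumptions.

```python
import numpy as np

def imex_amplification(dt, omega_slow, omega_fast):
    """Amplification factor of a first-order forward-backward IMEX Euler step
    applied to du/dt = i*(omega_slow + omega_fast)*u, with the slow frequency
    treated explicitly and the fast one implicitly (illustrative scheme only)."""
    return (1.0 + 1j * dt * omega_slow) / (1.0 - 1j * dt * omega_fast)

dt = 30.0                                 # s, an assumed model time step
omega_fast = np.linspace(0.0, 0.1, 11)    # rad/s, e.g. acoustic frequencies
omega_slow = 1.0e-3                       # rad/s, a slow advective frequency
A = imex_amplification(dt, omega_slow, omega_fast)
exact = np.exp(1j * dt * (omega_slow + omega_fast))
for wf, a, e in zip(omega_fast, A, exact):
    # |A| <= 1 indicates stability; the phase difference measures dispersion error.
    print(f"omega_f={wf:5.2f}  |A|={abs(a):.3f}  phase error={np.angle(a / e):+.3f} rad")
```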

Abstract:

Here we describe general flow processes for the synthesis of alkyl and aryl azides, and the development of a new monolithic triphenylphosphine reagent, which provides a convenient format for the use of this versatile reagent in flow. The utility of these new tools was demonstrated by their application to a flow Staudinger aza-Wittig reaction sequence. Finally, a multistep aza-Wittig, reduction and purification flow process was designed, allowing access to amine products in an automated fashion.

Abstract:

The use of three orthogonally tagged phosphine reagents to assist chemical work-up via phase-switch scavenging in conjunction with a modular flow reactor is described. These techniques (acidic, basic and Click chemistry) are used to prepare various amides and tri-substituted guanidines from in situ generated iminophosphoranes.

Abstract:

The cycloaddition of acetylenes with azides to give the corresponding 1,4-disubstituted 1,2,3-triazoles is reported using immobilised reagents and scavengers in pre-packed glass tubes in a modular flow reactor.