Abstract:
Estimates of the response of crops to climate change rarely quantify the uncertainty inherent in the simulation of both climate and crops. We present a crop simulation ensemble for a location in India, perturbing the response of both crop and climate under both baseline (12 720 simulations) and doubled-CO2 (171 720 simulations) climates. Some simulations used parameter values representing genotypic adaptation to mean temperature change. Firstly, observed and simulated yields in the baseline climate were compared. Secondly, the response of yield to changes in mean temperature was examined and compared to that found in the literature. No consistent response to temperature change was found across studies. Thirdly, the relative contribution of uncertainty in crop and climate simulation to the total uncertainty in projected yield changes was examined. In simulations without genotypic adaptation, most of the uncertainty came from the climate model parameters. Comparison with the simulations with genotypic adaptation and with a previous study suggested that the relatively low crop parameter uncertainty derives from the observational constraints on the crop parameters used in this study. Fourthly, the simulations were used, together with an observed dataset and a simple analysis of crop cardinal temperatures and thermal time, to estimate the potential for adaptation using existing cultivars. The results suggest that the germplasm for complete adaptation of groundnut cultivation in western India to a doubled-CO2 environment may not exist. In conjunction with analyses of germplasm and local management
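The perturbed-parameter design described above (crop and climate parameters varied jointly, with every combination simulated) can be sketched with a toy yield model. The response function, parameter names, and ranges below are invented for illustration and are not those of the crop or climate models used in the study:

```python
import itertools
import numpy as np

def toy_yield(t_opt, duration, mean_temp):
    """Hypothetical yield response: quadratic penalty away from an
    optimum temperature, scaled by season duration (arbitrary units)."""
    return max(0.0, duration * (1.0 - 0.01 * (mean_temp - t_opt) ** 2))

# Perturb two illustrative crop parameters over a regular grid and run
# every combination against every climate realisation.
t_opt_values = np.linspace(26.0, 30.0, 5)          # cardinal temperature (degC)
duration_values = np.linspace(100.0, 120.0, 5)     # thermal duration (days)
climate_realisations = np.linspace(27.0, 33.0, 7)  # mean temperatures (degC)

ensemble = [
    toy_yield(t_opt, dur, temp)
    for t_opt, dur, temp in itertools.product(
        t_opt_values, duration_values, climate_realisations)
]
print(len(ensemble))  # 5 * 5 * 7 = 175 simulations
```

Partitioning the spread of such an ensemble by which parameter set was varied is what allows the relative contributions of crop and climate uncertainty to be compared.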
Abstract:
Increased atmospheric concentrations of carbon dioxide (CO2) will benefit the yield of most crops. Two free air CO2 enrichment (FACE) meta-analyses have shown increases in yield of between 0 and 73% for C3 crops. Despite this large range, few crop modelling studies quantify the uncertainty inherent in the parameterisation of crop growth and development. We present a novel perturbed-parameter method of crop model simulation, constrained in part by observations, that does this. The model used is the groundnut (i.e. peanut; Arachis hypogaea L.) version of the general large-area model for annual crops (GLAM). The conclusions are of relevance to C3 crops in general. The increases in yield simulated by GLAM for doubled CO2 were between 16 and 62%. The difference in mean percentage increase between well-watered and water-stressed simulations was 6.8 percentage points. These results were compared to FACE and controlled environment studies, and to sensitivity tests on two other crop models of differing levels of complexity: CROPGRO, and the groundnut model of Hammer et al. [Hammer, G.L., Sinclair, T.R., Boote, K.J., Wright, G.C., Meinke, H., Bell, M.J., 1995. A peanut simulation model. I. Model development and testing. Agron. J. 87, 1085-1093]. The relationship between CO2 and water stress in the experiments and in the models was examined. From a physiological perspective, water-stressed crops are expected to show greater CO2 stimulation than well-watered crops, and this expectation has been cited in the literature. However, the result is not seen consistently in either the FACE studies or the crop models. In contrast, leaf-level models of assimilation do consistently show this result. An analysis of the evidence from these models and from the data suggests that scale (canopy versus leaf), model calibration, and model complexity are factors in determining the sign and magnitude of the interaction between CO2 and water stress.
We conclude from our study that the statement that 'water-stressed crops show greater CO2 stimulation than well-watered crops' cannot be held to be universally true. We also conclude, preliminarily, that the relationship between water stress and assimilation varies with scale. Accordingly, we provide some suggestions on how studies of a similar nature, using crop models of a range of complexity, could contribute further to understanding the roles of model calibration, model complexity and scale. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Embodied theories of cognition propose that neural substrates used in experiencing the referent of a word, for example perceiving upward motion, should be engaged in weaker form when that word, for example ‘rise’, is comprehended. Motivated by the finding that the perception of irrelevant background motion at near-threshold, but not supra-threshold, levels interferes with task execution, we assessed whether interference from near-threshold background motion was modulated by its congruence with the meaning of words (semantic content) when participants completed a lexical decision task (deciding if a string of letters is a real word or not). Reaction times for motion words, such as ‘rise’ or ‘fall’, were slower when the direction of visual motion and the ‘motion’ of the word were incongruent — but only when the visual motion was at near-threshold levels. When motion was supra-threshold, the distribution of error rates, not reaction times, implicated low-level motion processing in the semantic processing of motion words. As the perception of near-threshold signals is not likely to be influenced by strategies, our results support a close contact between semantic information and perceptual systems.
Abstract:
We assessed the vulnerability of blanket peat to climate change in Great Britain using an ensemble of 8 bioclimatic envelope models. We used 4 published models that ranged from simple threshold models, based on total annual precipitation, to Generalised Linear Models (GLMs, based on mean annual temperature). In addition, 4 new models were developed which included measures of water deficit as threshold, classification tree, GLM and generalised additive models (GAM). Models that included measures of both hydrological conditions and maximum temperature provided a better fit to the mapped peat area than models based on hydrological variables alone. Under UKCIP02 projections for high (A1FI) and low (B1) greenhouse gas emission scenarios, 7 out of the 8 models showed a decline in the bioclimatic space associated with blanket peat. Eastern regions (Northumbria, North York Moors, Orkney) were shown to be more vulnerable than higher-altitude, western areas (Highlands, Western Isles and Argyll, Bute and The Trossachs). These results suggest a long-term decline in the distribution of actively growing blanket peat, especially under the high emissions scenario, although it is emphasised that existing peatlands may well persist for decades under a changing climate. Observational data from long-term monitoring and manipulation experiments in combination with process-based models are required to explore the nature and magnitude of climate change impacts on these vulnerable areas more fully.
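A minimal sketch of a threshold-type bioclimatic envelope, the simplest model class mentioned above. The variables, threshold values, and scenario shifts below are placeholders for illustration, not the calibrated values from any of the eight models:

```python
def in_peat_envelope(annual_precip_mm, max_temp_c,
                     precip_threshold=1000.0, temp_threshold=15.0):
    """Illustrative threshold envelope: a cell supports blanket peat if it
    is wet enough and cool enough. Thresholds here are invented."""
    return annual_precip_mm >= precip_threshold and max_temp_c <= temp_threshold

# Baseline vs. a hypothetical warmer/drier scenario for three grid cells
cells = [(1200.0, 13.0), (950.0, 14.0), (1600.0, 12.0)]
baseline = sum(in_peat_envelope(p, t) for p, t in cells)
scenario = sum(in_peat_envelope(p - 100.0, t + 3.0) for p, t in cells)
print(baseline, scenario)  # the envelope shrinks under the warmer scenario
```

Counting the cells inside the envelope under baseline and projected climates is, in essence, how the decline in bioclimatic space is quantified.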
Abstract:
The present study investigates the initiation of precipitating deep convection in an ensemble of convection-resolving mesoscale models. Results of eight different model runs from five non-hydrostatic models are compared for a case of the Convective and Orographically-induced Precipitation Study (COPS). An isolated convective cell initiated east of the Black Forest crest in southwest Germany, although convective available potential energy was only moderate and convective inhibition was high. Measurements revealed that, due to the absence of synoptic forcing, convection was initiated by local processes related to the orography. In particular, the lifting by low-level convergence in the planetary boundary layer is assumed to be the dominant process on that day. The models used different configurations as well as different initial and boundary conditions. By comparing the different model performance with each other and with measurements, the processes which need to be well represented to initiate convection at the right place and time are discussed. Besides an accurate specification of the thermodynamic and kinematic fields, the results highlight the role of boundary-layer convergence features for quantitative precipitation forecasts in mountainous terrain.
Abstract:
The consistency of ensemble forecasts from three global medium-range prediction systems with the observed transition behaviour of a three-cluster model of the North Atlantic eddy-driven jet is examined. The three clusters consist of a mid jet cluster taken to represent an undisturbed jet and south and north jet clusters representing southward and northward shifts of the jet. The ensemble forecasts span a period of three extended winters (October–February) from October 2007–February 2010. The mean probabilities of transitions between the clusters calculated from the ensemble forecasts are compared with those calculated from a 23-extended-winter climatology taken from the European Centre for Medium-Range Weather Forecasts 40-Year Re-analysis (ERA40) dataset. No evidence of a drift with increasing lead time of the ensemble forecast transition probabilities towards values inconsistent with the 23-extended-winter climatology is found. The ensemble forecasts of transition probabilities are found to have positive Brier skill scores at 15-day lead times. It is found that for the three-extended-winter forecast set, probabilistic forecasts initialized in the north jet cluster are generally less skilful than those initialized in the other clusters. This is consistent with the shorter persistence time-scale of the north jet cluster observed in the ERA40 23-extended-winter climatology. Copyright © 2011 Royal Meteorological Society
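The Brier skill comparison against climatology can be illustrated with a short worked example. The forecast probabilities and outcomes below are invented; only the score definitions follow standard practice:

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and
    binary outcomes (0/1)."""
    probs = np.asarray(probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    return float(np.mean((probs - outcomes) ** 2))

# Hypothetical forecasts of a cluster transition occurring within the lead time
forecast_probs = [0.8, 0.6, 0.9, 0.3, 0.7]
outcomes       = [1,   1,   1,   0,   1]
climatology    = [0.6] * 5  # reference: fixed climatological frequency

bs_forecast = brier_score(forecast_probs, outcomes)
bs_reference = brier_score(climatology, outcomes)
bss = 1.0 - bs_forecast / bs_reference  # positive => skill over climatology
print(round(bss, 3))
```

A positive Brier skill score, as reported above at 15-day lead times, means the ensemble transition probabilities beat the climatological reference.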
Abstract:
The initial condition effect on climate prediction skill over a 2-year hindcast time-scale has been assessed from ensemble HadCM3 climate model runs using anomaly initialization over the period 1990–2001, and making comparisons with runs without initialization (equivalent to climatological conditions), and to anomaly persistence. It is shown that the assimilation improves the prediction skill in the first year globally, and in a number of limited areas out into the second year. Skill in hindcasting surface air temperature anomalies is most marked over ocean areas, and is coincident with areas of high sea surface temperature and ocean heat content skill. Skill improvement over land areas is much more limited but is still detectable in some cases. We found little difference in the skill of hindcasts using three different sets of ocean initial conditions, and we obtained the best results by combining these to form a grand ensemble hindcast set. Results are also compared with the idealized predictability studies of Collins (Clim. Dynam. 2002; 19: 671–692), which used the same model. The maximum lead time for which initialization gives enhanced skill over runs without initialization varies in different regions but is very similar to lead times found in the idealized studies, therefore strongly supporting the process representation in the model as well as its use for operational predictions. The limited 12-year period of the study, however, means that the regional details of model skill should probably be further assessed under a wider range of observational conditions.
Abstract:
A particle filter is a data assimilation scheme that employs a fully nonlinear, non-Gaussian analysis step. Unfortunately, as the size of the state grows, the number of ensemble members required for the particle filter to converge to the true solution increases exponentially. To overcome this, Vaswani [Vaswani N. 2008. IEEE Trans Signal Process 56:4583–97] proposed a new method, known as mode tracking, to improve the efficiency of the particle filter. When mode tracking, the state is split into two subspaces. One subspace is forecast using the particle filter; the other is treated so that its values are set equal to the mode of the marginal pdf. There are many ways to split the state. One hypothesis is that the best results should be obtained from the particle filter with mode tracking when we mode track the maximum number of unimodal dimensions. The aim of this paper is to test this hypothesis using the three-dimensional stochastic Lorenz equations with direct observations. It is found that mode tracking the maximum number of unimodal dimensions does not always provide the best result. The best choice of states to mode track depends on the number of particles used and the accuracy and frequency of the observations.
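For context, one cycle of the underlying bootstrap particle filter (forecast, weight, resample) is sketched below on a scalar random-walk model; mode tracking would then replace this update in a chosen subspace by the mode of the marginal pdf, which is not shown here. The dynamics and noise levels are toy assumptions, not the stochastic Lorenz setup of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, obs, obs_std, model_std):
    """One bootstrap particle filter cycle: forecast, weight, resample."""
    # Forecast: random-walk dynamics with additive model noise
    particles = particles + rng.normal(0.0, model_std, size=particles.shape)
    # Importance weights from a Gaussian observation likelihood
    w = np.exp(-0.5 * ((obs - particles) / obs_std) ** 2)
    w /= w.sum()
    # Multinomial resampling (systematic resampling is a common refinement)
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

particles = rng.normal(0.0, 1.0, size=500)
truth = 2.0
for obs in truth + rng.normal(0.0, 0.2, size=10):  # ten noisy observations
    particles = pf_step(particles, obs, obs_std=0.2, model_std=0.1)
print(round(float(particles.mean()), 2))  # posterior concentrates near the truth
```

The exponential growth in the number of particles needed arises because, in high dimensions, almost all weights `w` collapse onto a few particles; mode tracking reduces the dimension over which this weighting must work.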
Abstract:
The requirement to forecast volcanic ash concentrations was amplified as a response to the 2010 Eyjafjallajökull eruption, when ash safety limits for aviation were introduced in the European area. The ability to provide accurate quantitative forecasts relies to a large extent on the source term, i.e. the emissions of ash as a function of time and height. This study presents source term estimations of the ash emissions from the Eyjafjallajökull eruption derived with an inversion algorithm which constrains modeled ash emissions with satellite observations of volcanic ash. The algorithm is tested with input from two different dispersion models, run on three different meteorological input data sets. The results are robust to which dispersion model and meteorological data are used. Modeled ash concentrations are compared quantitatively to independent measurements from three different research aircraft and one surface measurement station. These comparisons show that the models perform reasonably well in simulating the ash concentrations, and simulations using the source term obtained from the inversion are in overall better agreement with the observations (rank correlation = 0.55, Figure of Merit in Time (FMT) = 25–46%) than simulations using simplified source terms (rank correlation = 0.21, FMT = 20–35%). The vertical structures of the modeled ash clouds mostly agree with lidar observations, and the modeled ash particle size distributions agree reasonably well with observed size distributions. There are occasionally large differences between simulations, but the model mean usually outperforms any individual model. The results emphasize the benefits of using an ensemble-based forecast for improved quantification of uncertainties in future ash crises.
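The structure of such an inversion, a source-receptor matrix from the dispersion model, satellite-like observations, and regularisation toward an a priori source term, can be sketched as a linear least-squares problem. The dimensions, noise levels, and a priori values below are invented, and the Tikhonov form is a generic stand-in for the algorithm actually used:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical source-receptor matrix: each column is the modelled
# contribution of one (time, height) emission element to the observations.
n_obs, n_src = 40, 10
M = rng.uniform(0.0, 1.0, size=(n_obs, n_src))

x_true = rng.uniform(0.0, 5.0, size=n_src)           # "true" emissions
y = M @ x_true + rng.normal(0.0, 0.05, size=n_obs)   # satellite-like obs

# A priori source term and Tikhonov regularisation toward it
x_apriori = np.full(n_src, 2.5)
eps = 0.1
A = np.vstack([M, np.sqrt(eps) * np.eye(n_src)])
b = np.concatenate([y, np.sqrt(eps) * x_apriori])
x_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.round(x_est, 2))  # estimated emissions per (time, height) element
```

In the real application each column of `M` comes from a dispersion-model run, which is why the estimate can be checked for robustness across models and meteorological data sets.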
Abstract:
The prediction of Northern Hemisphere (NH) extratropical cyclones by nine different ensemble prediction systems (EPSs), archived as part of The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE), has recently been explored using a cyclone tracking approach. This paper provides a continuation of this work, extending the analysis to the Southern Hemisphere (SH). While the EPSs have larger error in all cyclone properties in the SH, the relative performance of the different EPSs remains broadly consistent between the two hemispheres. Some interesting differences are also shown. The Chinese Meteorological Administration (CMA) EPS has a significantly lower level of performance in the SH compared to the NH. Previous NH results showed that the Centro de Previsao de Tempo e Estudos Climaticos (CPTEC) EPS underpredicts cyclone intensity. The results of this current study show that this bias is significantly larger in the SH. The CPTEC EPS also has very little spread in both hemispheres. As with the NH results, cyclone propagation speed is underpredicted by all the EPSs in the SH. To investigate this further, the bias was also computed for the ECMWF high-resolution deterministic forecast; the bias was significantly smaller than that of the lower-resolution ECMWF EPS.
Abstract:
In numerical weather prediction (NWP), data assimilation (DA) methods are used to combine available observations with numerical model estimates. This is done by minimising measures of error on both observations and model estimates, with more weight given to data that can be more trusted. For any DA method an estimate of the initial forecast error covariance matrix is required. For convective-scale data assimilation, however, the properties of the error covariances are not well understood. An effective way to investigate covariance properties in the presence of convection is to use an ensemble-based method, for which an estimate of the error covariance is readily available at each time step. In this work, we investigate the performance of the ensemble square root filter (EnSRF) in the presence of cloud growth, applied to an idealised 1D convective column model of the atmosphere. We show that the EnSRF performs well in capturing cloud growth, but the ensemble does not cope well with discontinuities introduced into the system by parameterised rain. The state estimates lose accuracy, and more importantly the ensemble is unable to capture the spread (variance) of the estimates correctly. We also find, counter-intuitively, that by reducing the spatial frequency of observations and/or the accuracy of the observations, the ensemble is able to capture the states and their variability successfully across all regimes.
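A minimal EnSRF update for a single scalar observation, in the serial Whitaker-Hamill square-root form (which may differ in detail from the scheme used in this study), illustrates the deterministic perturbation update that distinguishes the EnSRF from stochastic ensemble filters:

```python
import numpy as np

def ensrf_update(ens, H, obs, obs_var):
    """Ensemble square root filter update for one scalar observation.
    ens has shape (n_state, n_members); H is a linear observation row."""
    xb = ens.mean(axis=1)
    Xb = ens - xb[:, None]                 # forecast perturbations
    n = ens.shape[1]
    HX = H @ Xb                            # obs-space perturbations
    hpht = HX @ HX / (n - 1)               # forecast variance in obs space
    pht = Xb @ HX / (n - 1)                # state-obs covariance
    K = pht / (hpht + obs_var)             # Kalman gain
    alpha = 1.0 / (1.0 + np.sqrt(obs_var / (hpht + obs_var)))
    xa = xb + K * (obs - H @ xb)           # mean update
    Xa = Xb - np.outer(alpha * K, HX)      # deterministic perturbation update
    return xa[:, None] + Xa

rng = np.random.default_rng(2)
ens = rng.normal(0.0, 1.0, size=(3, 50))   # 3 state variables, 50 members
H = np.array([1.0, 0.0, 0.0])              # observe the first variable
analysis = ensrf_update(ens, H, obs=1.0, obs_var=0.25)
# The analysis variance of the observed variable drops below its prior value
print(bool(analysis[0].var() < ens[0].var()))
```

The discontinuities mentioned above are problematic precisely because this update assumes the state-observation relationship is captured by the linear-Gaussian statistics `hpht` and `pht`.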
Abstract:
This paper describes the implementation of a 3D variational (3D-Var) data assimilation scheme for a morphodynamic model applied to Morecambe Bay, UK. A simple decoupled hydrodynamic and sediment transport model is combined with a data assimilation scheme to investigate the ability of such methods to improve the accuracy of the predicted bathymetry. The inverse forecast error covariance matrix is modelled using a Laplacian approximation, calibrated for the required length-scale parameter. Calibration is also performed for the Soulsby-van Rijn sediment transport equations. The data used for assimilation purposes comprises waterlines derived from SAR imagery covering the entire period of the model run, and swath bathymetry data collected by a ship-borne survey for one date towards the end of the model run. A LiDAR survey of the entire bay carried out in November 2005 is used for validation purposes. The comparison of the predictive ability of the model alone with the model-forecast-assimilation system demonstrates that using data assimilation significantly improves the forecast skill. An investigation of the assimilation of the swath bathymetry as well as the waterlines demonstrates that the overall improvement is initially large, but decreases over time as the bathymetry evolves away from that observed by the survey. The result of combining the calibration runs into a pseudo-ensemble provides a higher skill score than for a single optimized model run. A brief comparison of the Optimal Interpolation assimilation method with the 3D-Var method shows that the two schemes give similar results.
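Because the 3D-Var cost function is quadratic for a linear observation operator, a toy 1D version of this setup can be solved directly: the inverse background covariance is modelled with a scaled Laplacian (penalising rough increments), and a few "waterline" points are observed. All sizes, length scales, and error variances below are illustrative, not the calibrated Morecambe Bay values:

```python
import numpy as np

n = 20
xb = np.zeros(n)                         # background bathymetry (anomalies)
# Inverse background covariance modelled with a Laplacian: B_inv = I - l^2 * L
L = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))      # discrete second derivative
lscale = 2.0
B_inv = np.eye(n) - lscale ** 2 * L

# Observe three hypothetical waterline points
obs_idx = [4, 10, 15]
H = np.zeros((3, n))
H[[0, 1, 2], obs_idx] = 1.0
y = np.array([1.0, 0.5, -0.5])
R_inv = np.eye(3) / 0.01                 # observation error variance 0.01

# Quadratic cost: the analysis solves (B_inv + H^T R_inv H) dx = H^T R_inv (y - H xb)
A = B_inv + H.T @ R_inv @ H
b = H.T @ R_inv @ (y - H @ xb)
xa = xb + np.linalg.solve(A, b)
print(np.round(xa[obs_idx], 2))          # analysis drawn toward the observations
```

The Laplacian term is what spreads each waterline observation into a smooth bathymetry increment over the chosen length scale.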
Abstract:
High-resolution ensemble simulations (Δx = 1 km) are performed with the Met Office Unified Model for the Boscastle (Cornwall, UK) flash-flooding event of 16 August 2004. Forecast uncertainties arising from imperfections in the forecast model are analysed by comparing the simulation results produced by two types of perturbation strategy. Motivated by the meteorology of the event, one type of perturbation alters relevant physics choices or parameter settings in the model's parametrization schemes. The other type of perturbation is designed to account for representativity error in the boundary-layer parametrization. It makes direct changes to the model state and provides a lower bound against which to judge the spread produced by other uncertainties. The Boscastle ensemble has genuine skill at scales of approximately 60 km and an ensemble spread which can be estimated to within ∼10% with only eight members. Differences between the model-state perturbation and physics modification strategies are discussed, the former being more important for triggering and the latter for subsequent cell development, including the average internal structure of convective cells. Despite such differences, the spread in rainfall evaluated at skilful scales is shown to be only weakly sensitive to the perturbation strategy. This suggests that relatively simple strategies for treating model uncertainty may be sufficient for practical, convective-scale ensemble forecasting.
Abstract:
The ability to run General Circulation Models (GCMs) at ever-higher horizontal resolutions has meant that tropical cyclone simulations are increasingly credible. A hierarchy of atmosphere-only GCMs, based on the Hadley Centre Global Environmental Model (HadGEM1), with horizontal resolution increasing from approximately 270 km to 60 km (at 50°N), is used to systematically investigate the impact of spatial resolution on the simulation of global tropical cyclone activity, independent of model formulation. Tropical cyclones are extracted from ensemble simulations and reanalyses of comparable resolutions using a feature-tracking algorithm. Resolution is critical for simulating storm intensity and convergence to observed storm intensities is not achieved with the model hierarchy. Resolution is less critical for simulating the annual number of tropical cyclones and their geographical distribution, which are well captured at resolutions of 135 km or higher, particularly for Northern Hemisphere basins. Simulating the interannual variability of storm occurrence requires resolutions of 100 km or higher; however, the level of skill is basin dependent. Higher resolution GCMs are increasingly able to capture the interannual variability of the large-scale environmental conditions that contribute to tropical cyclogenesis. Different environmental factors contribute to the interannual variability of tropical cyclones in the different basins: in the North Atlantic basin the vertical wind shear, potential intensity and low-level absolute vorticity are dominant, while in the North Pacific basins mid-level relative humidity and low-level absolute vorticity are dominant. Model resolution is crucial for a realistic simulation of tropical cyclone behaviour, and high-resolution GCMs are found to be valuable tools for investigating the global location and frequency of tropical cyclones.
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy (EnKBF) filter is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters in highly nonlinear forecast models. Namely, an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble for different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of the medium-term forecasts is increased by using the RAW filter.
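The RAW filter itself is compact enough to sketch: at each leapfrog step a filter displacement is computed as in the Robert-Asselin filter, but a fraction alpha of it is added to the current time level and alpha - 1 to the new one (alpha = 1 recovers the classical filter; alpha near 0.5 preserves the three-level mean). The harmonic-oscillator test below is an illustrative toy, not the SPEEDY configuration:

```python
import numpy as np

def leapfrog_raw(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
    """Leapfrog time stepping with the Robert-Asselin-Williams filter.
    alpha = 1 recovers the classical Robert-Asselin filter."""
    x_prev = x0
    x_curr = x0 + dt * f(x0)              # first step: forward Euler
    for _ in range(nsteps - 1):
        x_next = x_prev + 2.0 * dt * f(x_curr)
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
        x_prev = x_curr + alpha * d       # filtered current time level
        x_curr = x_next + (alpha - 1.0) * d
    return x_curr

# Harmonic oscillator du/dt = i*omega*u: the amplitude |u| should stay near 1,
# since the RAW filter damps only the spurious computational mode
omega = 1.0
u_end = leapfrog_raw(lambda u: 1j * omega * u, 1.0 + 0j, dt=0.01, nsteps=1000)
print(round(abs(u_end), 3))
```

Splitting the displacement between two time levels is what removes most of the amplitude damping of the physical mode that the classical Robert-Asselin filter introduces.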