926 results for Model-Data Integration and Data Assimilation
Abstract:
Large-scale ocean transports of heat and freshwater have not been well monitored, and yet the regional budgets of these quantities are important to understanding the role of the oceans in climate and climate change. In contrast, atmospheric heat and freshwater transports are commonly assessed from atmospheric reanalysis products, despite the non-conserving data assimilation that constrains them with the wealth of distributed atmospheric observations. The ability to carry out ocean reanalyses globally at eddy-permitting resolutions of 1/4° or better, along with new global ocean observation programs, now makes a similar approach viable for the ocean. In this paper we examine the budgets and transports within a global high-resolution ocean model constrained by ocean data assimilation, and compare them with independent oceanic and atmospheric estimates.
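For reference (a standard definition, not an equation taken from the paper), the meridional heat and freshwater transports examined in such budgets are zonal-vertical integrals of the model's velocity and tracer fields:

\[
H(y) = \rho_0 c_p \int_{-H}^{0}\!\!\int_{x_w}^{x_e} v\,\theta \,dx\,dz,
\qquad
F_W(y) = -\int_{-H}^{0}\!\!\int_{x_w}^{x_e} v\,\frac{S - S_0}{S_0}\,dx\,dz,
\]

where $v$ is the meridional velocity, $\theta$ the potential temperature, $S$ the salinity and $S_0$ a reference salinity; in practice any net mass transport through the section is removed before the integrals are formed.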
Abstract:
An analysis of observational data in the Barents Sea along a meridian at 33°30' E between 70°30' and 72°30' N has reported a negative correlation between El Niño/La Niña Southern Oscillation (ENSO) events and water temperature in the top 200 m: the temperature drops by about 0.5 °C during warm ENSO events, while during cold ENSO events the top 200 m layer of the Barents Sea is warmer. Results from 1° and 1/4° global NEMO models show a similar response for the whole Barents Sea. During the strong warm ENSO event in 1997–1998, an anomalous anticyclonic atmospheric circulation over the Barents Sea enhances heat losses and substantially influences the Barents Sea inflow from the North Atlantic via changes in ocean currents. Under normal conditions a warm current enters the Barents Sea from the North Atlantic along the Scandinavian peninsula; however, after the 1997–1998 event this current is weakened. During 1997–1998 the model annual mean temperature in the Barents Sea is decreased by about 0.8 °C, also resulting in a higher sea ice volume. In contrast, during the cold ENSO events in 1999–2000 and 2007–2008 the model shows a lower sea ice volume and annual mean temperatures in the upper layer of the Barents Sea that are higher by about 0.7 °C. An analysis of model data shows that the strength of the Atlantic inflow into the Barents Sea is the main cause of heat content variability, and is forced by changing pressure and winds in the North Atlantic. However, surface heat exchange with the atmosphere provides the means by which the Barents Sea heat budget relaxes back to normal in the year following the ENSO events.
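The closing argument can be made explicit with a generic box heat budget for the upper Barents Sea (a standard statement of heat conservation, not an equation from the paper):

\[
\frac{d}{dt}\int_V \rho c_p\,\theta\,dV \;=\; Q_{\mathrm{adv}} \;+\; \int_A Q_{\mathrm{surf}}\,dA,
\]

where $Q_{\mathrm{adv}}$ is the convergence of heat transport by the inflowing and outflowing currents and $Q_{\mathrm{surf}}$ the net surface heat flux; the abstract attributes the interannual variability mainly to $Q_{\mathrm{adv}}$ and the recovery in the following year to $Q_{\mathrm{surf}}$.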
Abstract:
As wind generation increases, system impact studies rely on predictions of future generation and effective representation of wind variability. A well-established approach to investigating the impact of wind variability is to simulate generation using observations from 10 m meteorological masts. However, there are problems with relying purely on historical wind-speed records or generation histories: mast data are often incomplete, not sited at relevant wind generation sites, and recorded at the wrong altitude above ground (usually 10 m), each of which may distort the generation profile. A possible complementary approach is to use reanalysis data, where data assimilation techniques are combined with state-of-the-art weather forecast models to produce complete gridded wind time series over an area. Previous investigations of reanalysis datasets have placed an emphasis on comparing reanalysis to meteorological site records, whereas this paper compares wind generation simulated using reanalysis data directly against historic wind generation records. Importantly, this comparison is conducted using raw reanalysis data (typical resolution ∼50 km), without relying on computationally expensive “dynamical downscaling” for a particular target region. Although the raw reanalysis data cannot, by nature of its construction, represent the site-specific effects of sub-gridscale topography, it is nevertheless shown to be comparable to or better than the mast-based simulation in the region considered, and it is therefore argued that raw reanalysis data may offer a number of significant advantages as a data source.
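A minimal sketch of the kind of simulation being compared here, assuming hourly reanalysis wind speeds at a single grid point; the hub height, power-law exponent and idealized power curve below are illustrative choices, not values from the paper:

```python
import numpy as np

def hub_height_wind(u_ref, z_ref=10.0, z_hub=80.0, alpha=1.0 / 7.0):
    """Extrapolate wind speed from a reference height to hub height
    using a simple power-law profile (exponent alpha is illustrative)."""
    return u_ref * (z_hub / z_ref) ** alpha

def power_curve(u, cut_in=3.5, rated=13.0, cut_out=25.0, capacity=1.0):
    """Idealized turbine power curve: cubic ramp between cut-in and rated
    speed, constant at capacity up to cut-out, zero elsewhere."""
    p = np.zeros_like(u)
    ramp = (u >= cut_in) & (u < rated)
    p[ramp] = capacity * (u[ramp] ** 3 - cut_in ** 3) / (rated ** 3 - cut_in ** 3)
    p[(u >= rated) & (u < cut_out)] = capacity
    return p

# Example: one year of hourly 10 m wind speeds (synthetic placeholder data)
u10 = np.random.weibull(2.0, size=24 * 365) * 7.0
capacity_factor = power_curve(hub_height_wind(u10)).mean()
print(f"Simulated annual capacity factor: {capacity_factor:.2f}")
```

The same hourly capacity-factor series, aggregated over grid points near real wind farms, is what would be compared against historic generation records.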
Abstract:
An important feature of agribusiness promotion programs is their lagged impact on consumption. Efficient investment in advertising requires reliable estimates of these lagged responses, and it is desirable from both applied and theoretical standpoints to have a flexible method for estimating them. This note derives an alternative Bayesian methodology for estimating lagged responses when investments occur intermittently within a time series. The method exploits a latent-variable extension of the natural-conjugate, normal-linear model, Gibbs sampling and data augmentation. It is applied to a monthly time series on Turkish pasta consumption (1993:5-1998:3) and three nonconsecutive promotion campaigns (1996:3, 1997:3, 1997:10). The results suggest that responses were greatest to the second campaign, which allocated its entire budget to television media; that its impact peaked in the sixth month following expenditure; and that the rate of return (measured in metric tons of additional consumption per thousand dollars expended) was around a factor of 20.
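A schematic of the estimation strategy described, assuming consumption responds to a finite distributed lag of intermittent promotion spending; the priors, lag length and data below are illustrative placeholders, not the paper's specification, and the latent-variable data-augmentation step is omitted from this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative monthly data: consumption responding to three one-off campaigns
T, L = 60, 6                                   # months, maximum lag considered
spend = np.zeros(T)
spend[[10, 25, 40]] = 1.0                      # intermittent promotion expenditures
X = np.column_stack([np.ones(T)] +
                    [np.roll(spend, k) * (np.arange(T) >= k) for k in range(L + 1)])
beta_true = np.array([5.0, 0.2, 0.5, 0.9, 1.2, 1.0, 0.6, 0.3])
y = X @ beta_true + rng.normal(0.0, 0.3, T)

# Gibbs sampler for the conjugate normal-linear model:
#   beta | sigma2, y ~ Normal,   sigma2 | beta, y ~ Inverse-Gamma
p = X.shape[1]
b0, B0_inv = np.zeros(p), np.eye(p) * 0.01     # vague normal prior on beta
a0, d0 = 2.0, 1.0                              # inverse-gamma prior on sigma^2
sigma2, draws = 1.0, []
for it in range(3000):
    V = np.linalg.inv(B0_inv + X.T @ X / sigma2)        # conditional covariance of beta
    m = V @ (B0_inv @ b0 + X.T @ y / sigma2)            # conditional mean of beta
    beta = rng.multivariate_normal(m, V)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + T / 2.0, 1.0 / (d0 + 0.5 * resid @ resid))
    if it >= 1000:                                      # discard burn-in
        draws.append(beta)

lag_response = np.mean(draws, axis=0)[1:]               # posterior mean lag profile
print("Posterior mean response at lags 0..6:", np.round(lag_response, 2))
```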
Abstract:
Three years of meteorological data collected at the WLEF-TV tower were used to drive a revised version of the Simple Biosphere (SiB 2.5) Model. Physiological properties and vegetation phenology were specified from satellite imagery. Simulated fluxes of heat, moisture, and carbon were compared to eddy covariance measurements taken onsite as a means of evaluating model performance on diurnal, synoptic, seasonal, and interannual time scales. The model was very successful in simulating variations of latent heat flux when compared to observations, slightly less so in the simulation of sensible heat flux. The model overestimated peak values of sensible heat flux on both monthly and diurnal scales. There was evidence that the differences between observed and simulated fluxes might be linked to wetlands near the WLEF tower, which were not present in the SiB simulation. The model overestimated the magnitude of the net ecosystem exchange of CO2 in both summer and winter. Mid-day maximum assimilation was well represented by the model, but late afternoon simulations showed excessive carbon uptake due to misrepresentation of within-canopy shading in the model. Interannual variability was not well simulated because only a single year of satellite imagery was used to parameterize the model.
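As a sketch of the kind of multi-timescale evaluation described (not the authors' code), bias and RMSE can be computed from paired model and eddy-covariance flux series after compositing by hour of day or by month; the flux arrays below are synthetic placeholders:

```python
import numpy as np

def composite_stats(model, obs, groups):
    """Bias and RMSE of model vs. observations within each group
    (hour of day for diurnal composites, month for seasonal ones)."""
    out = {}
    for g in np.unique(groups):
        m, o = model[groups == g], obs[groups == g]
        out[g] = {"bias": np.mean(m - o),
                  "rmse": np.sqrt(np.mean((m - o) ** 2))}
    return out

# Placeholder hourly sensible heat flux for one year (W m^-2)
hours = np.tile(np.arange(24), 365)
obs = 150 * np.clip(np.sin((hours - 6) * np.pi / 12), 0, None) \
      + np.random.normal(0, 20, hours.size)
model = 1.15 * obs + 10          # a model with a modest positive bias, as reported

diurnal = composite_stats(model, obs, hours)
print("Midday bias:", round(diurnal[12]["bias"], 1), "W m^-2")
```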
Abstract:
The quasi-biennial oscillation (QBO) in the equatorial zonal wind is an outstanding phenomenon of the atmosphere. The QBO is driven by a broad spectrum of waves excited in the tropical troposphere and modulates transport and mixing of chemical compounds in the whole middle atmosphere. Therefore, the simulation of the QBO in general circulation models and chemistry climate models is an important issue. Here, aspects of the climatology and forcing of a spontaneously occurring QBO in a middle-atmosphere model are evaluated, and its influence on the climate and variability of the tropical middle atmosphere is investigated. Westerly and easterly phases are considered separately, and 40-yr ECMWF Re-Analysis (ERA-40) data are used as a reference where appropriate. It is found that the simulated QBO is realistic in many details. Resolved large-scale waves are particularly important for the westerly phase, while parameterized gravity wave drag is more important for the easterly phase. Advective zonal wind tendencies are important for asymmetries between westerly and easterly phases, as found for the suppression of the easterly phase downward propagation. The simulation of the QBO improves the tropical upwelling and the atmospheric tape recorder compared to a model without a QBO. The semiannual oscillation is simulated realistically only if the QBO is represented. In sensitivity tests, it is found that the simulated QBO is strongly sensitive to changes in the gravity wave sources. The sensitivity to the tested range of horizontal resolutions is small. The stratospheric vertical resolution must be better than 1 km to simulate a realistic QBO.
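The forcing decomposition referred to here (resolved wave driving, parameterized gravity wave drag and advective tendencies) is conventionally framed through the transformed Eulerian mean zonal momentum budget; a standard form, stated for orientation rather than taken from the paper, is

\[
\frac{\partial \bar{u}}{\partial t}
= -\,\bar{v}^{*}\!\left[\frac{1}{a\cos\phi}\frac{\partial(\bar{u}\cos\phi)}{\partial\phi} - f\right]
- \bar{w}^{*}\frac{\partial \bar{u}}{\partial z}
+ \frac{1}{\rho_0\, a\cos\phi}\,\nabla\!\cdot\!\boldsymbol{F}
+ \bar{X},
\]

where $\nabla\cdot\boldsymbol{F}$ is the Eliassen-Palm flux divergence of the resolved waves, $\bar{X}$ the parameterized (gravity wave) drag, and the terms involving the residual circulation $(\bar{v}^{*},\bar{w}^{*})$ are the advective tendencies highlighted in the asymmetry between QBO phases.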
Abstract:
This paper will introduce the Baltex research programme and summarize associated numerical modelling work undertaken during the last five years. The research has broadly managed to clarify the main mechanisms determining the water and energy cycle in the Baltic region, such as the strong dependence upon the large-scale atmospheric circulation. It has further been shown that the Baltic Sea has a positive water balance, albeit with large interannual variations. The focus of the modelling studies has been the use of limited-area models at ultra-high resolution driven by boundary conditions from global models or from reanalysis data sets. The programme has further initiated a comprehensive integration of atmospheric, land surface and hydrological modelling incorporating snow, sea ice and special lake models. Other aspects of the programme include process studies such as the role of deep convection, air-sea interaction and the handling of land surface moisture. Studies have also been undertaken to investigate synoptic and sub-synoptic events over the Baltic region, thus exploring the role of transient weather systems for the hydrological cycle. A special aspect has been the strong interest and commitment of the meteorological and hydrological services because of the potentially large societal interest in operational applications of the research. As a result of this interest, special attention has been paid to data-assimilation aspects and the use of new types of data such as SSM/I, GPS measurements and digital radar. A series of high-resolution data sets are being produced; one of these, a 1/6 degree daily precipitation climatology for the years 1996–1999, is a unique contribution. The specific research achievements presented in this volume of Meteorology and Atmospheric Physics are the result of a cooperative venture between 11 European research groups supported under the EU Framework programmes.
Abstract:
In this paper we report on a study conducted using the Middle Atmospheric Nitrogen TRend Assessment (MANTRA) balloon measurements of stratospheric constituents and temperature and the Canadian Middle Atmosphere Model (CMAM). Three different kinds of data are used to assess the inter-consistency of the combined dataset: single profiles of long-lived species from MANTRA 1998, sparse climatologies from the ozonesonde measurements during the four MANTRA campaigns and from HALOE satellite measurements, and the CMAM climatology. In doing so, we evaluate the ability of the model to reproduce the measured fields and thereby test our ability to describe mid-latitude summertime stratospheric processes. The MANTRA campaigns were conducted at Vanscoy, Saskatchewan, Canada (52° N, 107° W) in late August and early September of 1998, 2000, 2002 and 2004. During late summer at mid-latitudes, the stratosphere is close to photochemical control, providing an ideal scenario for the study reported here. From this analysis we find that: (1) reducing the value of the vertical diffusion coefficient in CMAM to a more physically reasonable value results in the model better reproducing the measured profiles of long-lived species; (2) the existence of compact correlations among the constituents, as expected from independent measurements in the literature and from models, confirms the self-consistency of the MANTRA measurements; and (3) the 1998 measurements show structures in the chemical species profiles that can be associated with transport, adding to the growing evidence that the summertime stratosphere can be much more disturbed than anticipated. The mechanisms responsible for such disturbances need to be understood in order to assess the representativeness of the measurements and to isolate long-term trends.
Abstract:
Recent high-resolution radiosonde climatologies have revealed a tropopause inversion layer (TIL) in the extratropics: temperature strongly increases just above a sharp local cold point tropopause. Here, it is asked to what extent a TIL exists in current general circulation models (GCMs) and meteorological analyses. Only a weak hint of a TIL exists in NCEP/NCAR reanalysis data. In contrast, the Canadian Middle Atmosphere Model (CMAM), a comprehensive GCM, exhibits a TIL of realistic strength. However, in data assimilation mode CMAM exhibits a much weaker TIL, especially in the Southern Hemisphere where only coarse satellite data are available. The discrepancy between the analyses and the GCM is thus hypothesized to be mainly due to data assimilation acting to smooth the observed strong curvature in temperature around the tropopause. This is confirmed in the reanalysis where the stratification around the tropopause exhibits a strong discontinuity at the start of the satellite era.
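For orientation, a TIL is typically diagnosed from a temperature profile by locating the cold-point tropopause and examining the static stability $N^2 = (g/\theta)\,\partial\theta/\partial z$ just above it; a minimal sketch using a synthetic high-resolution profile (not data from the paper) is:

```python
import numpy as np

g, p0, kappa = 9.81, 1000.0, 0.286

def til_diagnostics(z, T, p):
    """Locate the cold-point tropopause and return N^2 on the levels just above it."""
    theta = T * (p0 / p) ** kappa                       # potential temperature (K)
    cp = np.argmin(T)                                   # cold-point index
    n2 = (g / theta[:-1]) * np.diff(theta) / np.diff(z) # buoyancy frequency squared (s^-2)
    return z[cp], n2[cp:cp + 10]

# Synthetic profile: tropospheric lapse rate below 11 km, weak inversion above
z = np.arange(0.0, 20000.0, 100.0)
T = np.where(z < 11000.0, 288.0 - 6.5e-3 * z, 216.5 + 3.0e-3 * (z - 11000.0))
p = 1000.0 * np.exp(-z / 7000.0)

z_cp, n2_above = til_diagnostics(z, T, p)
print(f"Cold point at {z_cp / 1000:.1f} km, N^2 just above: {n2_above[0]:.2e} s^-2")
```

The abstract's point is that this sharp maximum in N^2 just above the cold point survives in a free-running GCM but is smoothed when coarse satellite data are assimilated.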
Abstract:
There is large uncertainty about the magnitude of warming and how rainfall patterns will change in response to any given scenario of future changes in atmospheric composition and land use. The models used for future climate projections were developed and calibrated using climate observations from the past 40 years. The geologic record of environmental responses to climate changes provides a unique opportunity to test model performance outside this limited climate range. Evaluation of model simulations against palaeodata shows that models reproduce the direction and large-scale patterns of past changes in climate, but tend to underestimate the magnitude of regional changes. As part of the effort to reduce model-related uncertainty and produce more reliable estimates of twenty-first century climate, the Palaeoclimate Modelling Intercomparison Project is systematically applying palaeoevaluation techniques to simulations of the past run with the models used to make future projections. This evaluation will provide assessments of model performance, including whether a model is sufficiently sensitive to changes in atmospheric composition, as well as providing estimates of the strength of biosphere and other feedbacks that could amplify the model response to these changes and modify the characteristics of climate variability.
Abstract:
Geophysical fluid models often support both fast and slow motions. As the dynamics are often dominated by the slow motions, it is desirable to filter out the fast motions by constructing balance models. An example is the quasi-geostrophic (QG) model, which is used widely in meteorology and oceanography for theoretical studies, in addition to practical applications such as model initialization and data assimilation. Although the QG model works quite well in the mid-latitudes, its usefulness diminishes as one approaches the equator. Thus far, attempts to derive similar balance models for the tropics have not been entirely successful, as the models generally filter out Kelvin waves, which contribute significantly to tropical low-frequency variability. There is much theoretical interest in the dynamics of planetary-scale Kelvin waves, especially for atmospheric and oceanic data assimilation where observations are generally only of the mass field and thus do not constrain the wind field without some kind of diagnostic balance relation. As a result, estimates of Kelvin wave amplitudes can be poor. Our goal is to find a balance model that includes Kelvin waves for planetary-scale motions. Using asymptotic methods, we derive a balance model for the weakly nonlinear equatorial shallow-water equations. Specifically, we adopt the ‘slaving’ method proposed by Warn et al. (Q. J. R. Meteorol. Soc., vol. 121, 1995, pp. 723–739), which avoids secular terms in the expansion and thus can in principle be carried out to any order. In contrast to previous approaches, our expansion is based on a long-wave scaling and the slow dynamics is described using the height field instead of potential vorticity. The leading-order model is equivalent to the truncated long-wave model considered previously (e.g. Heckley & Gill, Q. J. R. Meteorol. Soc., vol. 110, 1984, pp. 203–217), which retains Kelvin waves in addition to equatorial Rossby waves. Our method allows for the derivation of higher-order models which significantly improve the representation of Rossby waves in the isotropic limit. In addition, the ‘slaving’ method is applicable even when the weakly nonlinear assumption is relaxed, and the resulting nonlinear model encompasses the weakly nonlinear model. We also demonstrate that the method can be applied to more realistic stratified models, such as the Boussinesq model.
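For context (a schematic statement in standard equatorial beta-plane notation, not the paper's exact formulation), the linearized, nondimensional equatorial shallow-water equations under a long-wave scaling read

\[
\frac{\partial u}{\partial t} - y\,v = -\frac{\partial h}{\partial x},\qquad
\epsilon^{2}\,\frac{\partial v}{\partial t} + y\,u = -\frac{\partial h}{\partial y},\qquad
\frac{\partial h}{\partial t} + \frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} = 0,
\]

where $\epsilon \ll 1$ measures the ratio of meridional to zonal length scales. At leading order the meridional momentum equation reduces to the geostrophic balance $y\,u = -\partial h/\partial y$, which filters inertia-gravity and mixed Rossby-gravity waves while retaining the Kelvin wave and long equatorial Rossby waves, the waves kept by the leading-order model described above.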
Abstract:
A recent nonlinear system by Friston et al. (2000, NeuroImage 12: 466–477) links changes in the BOLD response to changes in neural activity. The system consists of five subsystems, linking: (1) neural activity to flow changes; (2) flow changes to oxygen delivery to tissue; (3) flow changes to changes in blood volume and venous outflow; (4) changes in flow, volume, and oxygen extraction fraction to deoxyhemoglobin changes; and finally (5) volume and deoxyhemoglobin changes to the BOLD response. Friston et al. exploit, in subsystem 2, a model by Buxton and Frank coupling flow changes to changes in oxygen metabolism that assumes the tissue oxygen concentration to be close to zero. We describe below a model of the coupling between flow and oxygen delivery that takes into account the modulatory effect of changes in tissue oxygen concentration. The major development has been to extend the original Buxton and Frank model for oxygen transport to a full dynamic capillary model, making the model applicable to both transient and steady-state conditions. Furthermore, our modification enables us to determine the time series of CMRO2 changes under different conditions, including CO2 challenges. We compare the differences in the performance of the “Friston system” using the original model of Buxton and Frank and that of our model. We also compare the data predicted by our model (with appropriate parameters) to data from a series of OIS studies. The qualitative differences in the behaviour of the models are exposed by different experimental simulations and by comparison with the results of OIS data from brief and extended stimulation protocols and from experiments using hypercapnia.
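A minimal sketch of the five-subsystem structure described (neural activity → flow → volume and deoxyhemoglobin → BOLD), using the widely cited balloon-model equations with nominal parameter values; it does not include the oxygen-transport modification that is this paper's contribution, and the parameters are illustrative rather than fitted:

```python
import numpy as np

# Nominal hemodynamic parameters (illustrative values, not the paper's)
tau_s, tau_f, tau_0 = 0.8, 0.4, 1.0   # signal decay, autoregulation, mean transit time (s)
alpha, E0, V0 = 0.32, 0.4, 0.04       # vessel stiffness, resting O2 extraction, resting blood volume

def bold_response(u, dt=0.01):
    """Integrate the flow/volume/deoxyhemoglobin subsystems for a neural
    input series u(t) and return the predicted BOLD signal."""
    s, f, v, q = 0.0, 1.0, 1.0, 1.0
    bold = np.zeros(len(u))
    for i, ut in enumerate(u):
        ds = ut - s / tau_s - (f - 1.0) / tau_f                    # (1) activity -> flow-inducing signal
        df = s                                                     # (2) signal -> blood inflow
        dv = (f - v ** (1.0 / alpha)) / tau_0                      # (3) inflow vs. venous outflow -> volume
        E_f = 1.0 - (1.0 - E0) ** (1.0 / f)                        # oxygen extraction at flow f
        dq = (f * E_f / E0 - v ** (1.0 / alpha) * q / v) / tau_0   # (4) deoxyhemoglobin content
        s, f, v, q = s + dt * ds, f + dt * df, v + dt * dv, q + dt * dq
        k1, k2, k3 = 7.0 * E0, 2.0, 2.0 * E0 - 0.2
        bold[i] = V0 * (k1 * (1 - q) + k2 * (1 - q / v) + k3 * (1 - v))  # (5) BOLD signal
    return bold

# Brief stimulus: 1 s of neural activity followed by recovery
t = np.arange(0.0, 30.0, 0.01)
u = (t < 1.0).astype(float)
y = bold_response(u)
print(f"Peak BOLD change: {100 * y.max():.2f}% at t = {t[y.argmax()]:.1f} s")
```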
Abstract:
To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the weather research and forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods with different degrees of freedom to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer; (2) coupling to fine-scale computational fluid dynamics (Reynolds-averaged Navier–Stokes and large-eddy simulation) models for transport and dispersion (T&D) applications; (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT); and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute it; explores the model sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, to investigate the impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios.
Abstract:
A stand-alone sea ice model is tuned and validated using satellite-derived, basinwide observations of sea ice thickness, extent, and velocity from the years 1993 to 2001. This is the first time that basin-scale measurements of sea ice thickness have been used for this purpose. The model is based on the CICE sea ice model code developed at the Los Alamos National Laboratory, with some minor modifications, and forcing consists of 40-yr ECMWF Re-Analysis (ERA-40) and Polar Exchange at the Sea Surface (POLES) data. Three parameters are varied in the tuning process: Ca, the air–ice drag coefficient; P*, the ice strength parameter; and α, the broadband albedo of cold bare ice, with the aim being to determine the subset of this three-dimensional parameter space that gives the best simultaneous agreement with observations with this forcing set. It is found that observations of sea ice extent and velocity alone are not sufficient to unambiguously tune the model, and that sea ice thickness measurements are necessary to locate a unique subset of parameter space in which simultaneous agreement is achieved with all three observational datasets.
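A schematic of the tuning procedure described, assuming each candidate parameter set can be run (or emulated) and scored against the three observational datasets; the parameter ranges, equal cost weights and the run_sea_ice_model placeholder below are illustrative, not the CICE configuration used in the paper:

```python
import itertools
import numpy as np

def run_sea_ice_model(c_a, p_star, albedo):
    """Placeholder for a forced stand-alone sea ice simulation; a real
    implementation would return simulated thickness, extent and velocity
    fields for 1993-2001 under the chosen parameters."""
    raise NotImplementedError("stand-in for a CICE-based run")

def misfit(sim, obs):
    """Normalized RMS misfit between a simulated and an observed field."""
    return np.sqrt(np.nanmean((sim - obs) ** 2)) / np.nanstd(obs)

def tune(obs_thickness, obs_extent, obs_velocity):
    """Sweep the three tuning parameters and rank them by a combined cost,
    so that only parameter sets agreeing simultaneously with thickness,
    extent and velocity observations score well."""
    c_a_vals = np.linspace(1.0e-3, 2.0e-3, 5)      # air-ice drag coefficient (illustrative range)
    p_star_vals = np.linspace(1.0e4, 4.0e4, 5)     # ice strength parameter (illustrative range)
    albedo_vals = np.linspace(0.55, 0.75, 5)       # cold bare-ice albedo (illustrative range)
    results = []
    for c_a, p_star, alb in itertools.product(c_a_vals, p_star_vals, albedo_vals):
        thick, ext, vel = run_sea_ice_model(c_a, p_star, alb)
        cost = (misfit(thick, obs_thickness)
                + misfit(ext, obs_extent)
                + misfit(vel, obs_velocity))
        results.append(((c_a, p_star, alb), cost))
    return sorted(results, key=lambda r: r[1])
```

The abstract's finding corresponds to the observation that, without the thickness term in the cost, many parameter sets tie for the lowest cost, so no unique optimum can be identified.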
Abstract:
Urban land surface models (LSMs) are commonly evaluated for short periods (a few weeks to months) because of limited observational data. This makes it difficult to distinguish the impact of initial conditions on model performance or to consider the response of a model to a range of possible atmospheric conditions. Drawing on results from the first urban LSM comparison, these two issues are considered. Assessment shows that the initial soil moisture has a substantial impact on performance. Models initialised with soils that are too dry are not able to adjust their surface sensible and latent heat fluxes to realistic values until there is sufficient rainfall. Models initialised with soils that are too wet are not able to restrict their evaporation appropriately for periods in excess of a year. This has implications for short-term evaluation studies and implies the need for soil moisture measurements to improve data assimilation and model initialisation. In contrast, initial conditions influencing the thermal storage have a much shorter adjustment timescale than soil moisture. Most models partition too much of the radiative energy at the surface into the sensible heat flux, at the probable expense of the net storage heat flux.
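The partitioning statement at the end refers to the urban surface energy balance (a standard relation, not an equation taken from the comparison itself):

\[
Q^{*} + Q_F = Q_H + Q_E + \Delta Q_S,
\]

where $Q^{*}$ is the net all-wave radiation, $Q_F$ the anthropogenic heat flux, $Q_H$ and $Q_E$ the turbulent sensible and latent heat fluxes, and $\Delta Q_S$ the net storage heat flux; the finding is that most models put too much of the available energy into $Q_H$, most likely at the expense of $\Delta Q_S$.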