98 results for Dynamical variables
Abstract:
Starting from the classical Saltzman two-dimensional convection equations, we derive, via a severe spectral truncation, a minimal 10-ODE system that includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system containing a decoupled generalized Lorenz system. Including it breaks an important symmetry and couples the dynamics of the fast and slow variables, with ensuing modifications to the structural properties of the attractor and to the spectral features. When the relevant nondimensional number (the Eckert number, Ec) is nonzero, an additional time scale of O(Ec^{-1}) is introduced into the system, as shown by standard multiscale analysis and confirmed by several pieces of numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^{3/2} power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and may have prototypical value for relevant processes in complex-system dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme-value statistics. This analysis shows how neglecting the coupling of slow and fast variables solely on the basis of scale analysis can be catastrophic: it leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and causes the model to lose its ability to describe intrinsically multiscale processes.
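The decoupled limit mentioned above contains a generalized Lorenz system. As a point of reference, the classical Lorenz-63 equations (not the paper's 10-ODE truncation, whose explicit form is not given here) can be integrated with a standard fourth-order Runge-Kutta scheme at the conventional parameter values:

```python
import numpy as np

def lorenz_rhs(state, sigma=10.0, r=28.0, b=8.0 / 3.0):
    # Classical Lorenz-63 system, the kind of chaotic reduced model
    # that the 10-ODE truncation generalizes
    x, y, z = state
    return np.array([sigma * (y - x), x * (r - z) - y, x * y - b * z])

def integrate(state, dt=0.005, n_steps=20000):
    # Fourth-order Runge-Kutta time stepping
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        k1 = lorenz_rhs(state)
        k2 = lorenz_rhs(state + 0.5 * dt * k1)
        k3 = lorenz_rhs(state + 0.5 * dt * k2)
        k4 = lorenz_rhs(state + dt * k3)
        state = state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = state
    return traj

traj = integrate(np.array([1.0, 1.0, 1.0]))
```

The trajectory remains bounded on the attractor, which is the baseline behaviour that the thermal-viscous coupling (Ec > 0) then modifies.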
Abstract:
Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary part of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. 
These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and the related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect our results to have practical relevance for a more general class of systems than those belonging to Axiom A.
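In symbols, the spectral identity described above can be written as (notation assumed, following the wording of the abstract):

```latex
S^{(\sigma)}_{\Phi}(\omega) - S^{(0)}_{\Phi}(\omega) = \sigma^{2}\,\lvert \chi_{\Phi}(\omega) \rvert^{2}
```

where σ² is the variance of the white noise forcing, χ_Φ(ω) is the linear susceptibility of the observable Φ to deterministic perturbations with the same spatial pattern as the noise, and the manifestly non-negative right-hand side makes explicit why the power spectrum of the stochastically perturbed system is larger at all frequencies.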
Abstract:
The technique of relaxation of the tropical atmosphere towards an analysis in a month-season forecast model has previously been successfully exploited in a number of contexts. Here it is shown that when tropical relaxation is used to investigate the possible origin of the observed anomalies in June–July 2007, a simple dynamical model is able to reproduce the observed component of the pattern of anomalies given by an ensemble of ECMWF forecast runs. Following this result, the simple model is used for a range of experiments on the time-scales of relaxation and the variables and regions relaxed, based on a control model run with equatorial heating in a zonal flow. A theory based on scale analysis for the large-scale tropics is used to interpret the results. Typical relationships between scales are determined from the basic equations, and for a specified diabatic heating a chain of deductions for determining the dependent variables is derived. Different critical time-scales are found for tropical relaxation of different dependent variables to be effective. Vorticity has the longest critical time-scale, typically 1.2 days. For temperature and divergence, the time-scales are 10 hours and 3 hours, respectively. However, not all the tropical fields, in particular the vertical motion, are reproduced correctly by the model unless divergence is heavily damped. To obtain the correct extra-tropical fields, it is crucial to have the correct rotational flow in the subtropics to initiate the Rossby wave propagation from there. It is sufficient to relax vorticity or temperature on a time-scale comparable to or less than their critical time-scales to obtain this. However, if the divergent advection of vorticity is important in the Rossby wave source, then strong relaxation of divergence is required to accurately represent the tropical forcing of Rossby waves.
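The relaxation (nudging) technique discussed above amounts to adding a term −(x − x_analysis)/τ to the tendency of a variable, where τ is the relaxation time-scale. A scalar toy sketch (all names and values hypothetical, not the paper's model) illustrates why a time-scale shorter than the critical one pins a variable to the analysis while a long one leaves it nearly untouched:

```python
import numpy as np

def relax(x0, x_analysis, tau, dt=0.01, n_steps=1000, forcing=0.0):
    # Toy scalar nudging: dx/dt = forcing - (x - x_analysis) / tau
    # tau is the relaxation time-scale; integration by forward Euler.
    x = x0
    for _ in range(n_steps):
        x += dt * (forcing - (x - x_analysis) / tau)
    return x

x_strong = relax(x0=5.0, x_analysis=1.0, tau=0.05)  # tau << total time: locked to analysis
x_weak = relax(x0=5.0, x_analysis=1.0, tau=50.0)    # tau >> total time: barely relaxed
```

With the total integration time of 10 units, the strongly relaxed variable converges to the analysis value 1.0, while the weakly relaxed one stays close to its initial value of 5.0.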
Abstract:
Data assimilation aims to incorporate measured observations into a dynamical system model in order to produce accurate estimates of all the current (and future) state variables of the system. The optimal estimates minimize a variational principle and can be found using adjoint methods. The model equations are treated as strong constraints on the problem. In reality, the model does not represent the system behaviour exactly and errors arise due to lack of resolution and inaccuracies in physical parameters, boundary conditions and forcing terms. A technique for estimating systematic and time-correlated errors as part of the variational assimilation procedure is described here. The modified method determines a correction term that compensates for model error and leads to improved predictions of the system states. The technique is illustrated in two test cases. Applications to the 1-D nonlinear shallow water equations demonstrate the effectiveness of the new procedure.
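A minimal sketch of the idea, assuming a scalar linear toy model x_{k+1} = a·x_k with a constant systematic error η (not the paper's shallow-water test case; all names hypothetical): because the trajectory is linear in both the initial state and η, the two can be estimated jointly by least squares, which captures the essence of treating the model-error correction as an extra control variable in the variational problem.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.9                       # assumed toy dynamics: x_{k+1} = a * x_k + eta
true_x0, true_eta = 2.0, 0.3  # true initial state and systematic model error
n = 50

# "Truth" run: the real system carries a constant model-error term eta
x = true_x0
obs = []
for _ in range(n):
    obs.append(x + 0.01 * rng.standard_normal())  # noisy observations
    x = a * x + true_eta

# Weak-constraint estimate: x_k = a^k x0 + eta (1 - a^k)/(1 - a),
# linear in (x0, eta), so both are recovered jointly by least squares.
k = np.arange(n)
A = np.column_stack([a**k, (1 - a**k) / (1 - a)])
(est_x0, est_eta), *_ = np.linalg.lstsq(A, np.array(obs), rcond=None)
```

The recovered η then serves as the correction term that compensates for model error in subsequent predictions.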
Abstract:
This study focuses on the mechanisms underlying water and heat transfer in upper soil layers, and their effects on soil physical prognostic variables and the individual components of the energy balance. The skill of the JULES (Joint UK Land Environment Simulator) land surface model (LSM) to simulate key soil variables, such as soil moisture content and surface temperature, and fluxes such as evaporation, is investigated. The Richards equation for soil water transfer, as used in most LSMs, was updated by incorporating isothermal and thermal water vapour transfer. The model was tested for three sites representative of semi-arid and temperate arid climates: the Jornada site (New Mexico, USA), Griffith site (Australia) and Audubon site (Arizona, USA). Water vapour flux was found to contribute significantly to the water and heat transfer in the upper soil layers. This was mainly due to isothermal vapour diffusion; thermal vapour flux also played a role at the Jornada site just after rainfall events. Inclusion of water vapour flux had an effect on the diurnal evolution of evaporation, soil moisture content and surface temperature. The incorporation of additional processes, such as water vapour flux among others, into LSMs may improve the coupling between the upper soil layers and the atmosphere, which in turn could increase the reliability of weather and climate predictions.
Abstract:
The separate effects of ozone depleting substances (ODSs) and greenhouse gases (GHGs) on forcing circulation changes in the Southern Hemisphere extratropical troposphere are investigated using a version of the Canadian Middle Atmosphere Model (CMAM) that is coupled to an ocean. Circulation-related diagnostics include zonal wind, tropopause pressure, Hadley cell width, jet location, annular mode index, precipitation, wave drag, and eddy fluxes of momentum and heat. As expected, the tropospheric response to the ODS forcing occurs primarily in austral summer, with past (1960-99) and future (2000-99) trends of opposite sign, while the GHG forcing produces more seasonally uniform trends with the same sign in the past and future. In summer the ODS forcing dominates past trends in all diagnostics, while the two forcings contribute nearly equally but oppositely to future trends. The ODS forcing produces a past surface temperature response consisting of cooling over eastern Antarctica, and is the dominant driver of past summertime surface temperature changes when the model is constrained by observed sea surface temperatures. For all diagnostics, the response to the ODS and GHG forcings is additive: that is, the linear trend computed from the simulations using the combined forcings equals (within statistical uncertainty) the sum of the linear trends from the simulations using the two separate forcings. Space-time spectra of eddy fluxes and the spatial distribution of transient wave drag are examined to assess the viability of several recently proposed mechanisms for the observed poleward shift in the tropospheric jet.
Abstract:
A precipitation downscaling method is presented using precipitation from a general circulation model (GCM) as predictor. The method extends a previous method from monthly to daily temporal resolution. The simplest form of the method corrects for biases in wet-day frequency and intensity. A more sophisticated variant also takes account of flow-dependent biases in the GCM. The method is flexible and simple to implement. It is proposed here as a correction of GCM output for applications where sophisticated methods are not available, or as a benchmark for the evaluation of other downscaling methods. Applied to output from reanalyses (ECMWF, NCEP) in the region of the European Alps, the method is capable of reducing large biases in the precipitation frequency distribution, even for high quantiles. The two variants exhibit similar performance, but the ideal choice of method can depend on the GCM/reanalysis, and it is recommended to test the methods in each case. Limitations of the method are found in small areas with unresolved topographic detail that influences higher-order statistics (e.g. high quantiles). When used as a benchmark for three regional climate models (RCMs), the corrected reanalysis and the RCMs perform similarly in many regions, but the added value of the latter is evident for high quantiles in some small regions.
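The simplest variant described above, a wet-day frequency and intensity correction, can be sketched as follows. The 0.1 mm wet-day threshold and the mean-intensity scaling are assumptions for illustration, not necessarily the paper's exact formulation:

```python
import numpy as np

def correct_precip(model, obs):
    """Sketch of a wet-day frequency + intensity bias correction.

    Chooses a threshold on the model series so that its wet-day frequency
    matches the observations, then rescales the remaining wet-day amounts
    so that the mean wet-day intensity matches as well.
    """
    obs_wet_frac = np.mean(obs > 0.1)  # observed wet-day frequency (0.1 mm threshold, assumed)
    # Model quantile that reproduces the observed wet-day frequency
    thresh = np.quantile(model, 1.0 - obs_wet_frac)
    corrected = np.where(model > thresh, model - thresh, 0.0)
    # Rescale wet days so the mean wet-day intensity matches observations
    wet = corrected > 0
    if wet.any():
        scale = obs[obs > 0.1].mean() / corrected[wet].mean()
        corrected[wet] *= scale
    return corrected
```

This removes the typical GCM "drizzle" bias (too many weak wet days) before the intensity distribution is adjusted.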
Abstract:
Accurate decadal climate predictions could be used to inform adaptation actions to a changing climate. The skill of such predictions from initialised dynamical global climate models (GCMs) may be assessed by comparing with predictions from statistical models which are based solely on historical observations. This paper presents two benchmark statistical models for predicting both the radiatively forced trend and internal variability of annual mean sea surface temperatures (SSTs) on a decadal timescale based on the gridded observation data set HadISST. For both statistical models, the trend related to radiative forcing is modelled using a linear regression of the SST time series at each grid box on the time series of equivalent global mean atmospheric CO2 concentration. The residual internal variability is then modelled by (1) a first-order autoregressive model (AR1) and (2) a constructed analogue model (CA). From the verification of 46 retrospective forecasts with start years from 1960 to 2005, the correlation coefficient for anomaly forecasts using trend with AR1 is greater than 0.7 over parts of the extra-tropical North Atlantic, the Indian Ocean and western Pacific. This is primarily related to the prediction of the forced trend. More importantly, both CA and AR1 give skillful predictions of the internal variability of SSTs in the subpolar gyre region over the far North Atlantic for lead times of 2 to 5 years, with correlation coefficients greater than 0.5. For the subpolar gyre and parts of the South Atlantic, CA is superior to AR1 for lead times of 6 to 9 years. These statistical forecasts are also compared with ensemble mean retrospective forecasts by DePreSys, an initialised GCM.
DePreSys is found to outperform the statistical models over large parts of the North Atlantic for lead times of 2 to 5 years and 6 to 9 years; however, trend with AR1 is generally superior to DePreSys in the North Atlantic Current region, while trend with CA is superior to DePreSys in parts of the South Atlantic for lead times of 6 to 9 years. These findings encourage further development of benchmark statistical decadal prediction models, and of methods to combine different predictions.
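The trend-plus-AR1 benchmark can be sketched for a single grid box as follows (a hedged illustration; the paper's exact regression setup, anomaly definitions, and fitting windows may differ):

```python
import numpy as np

def trend_plus_ar1_forecast(sst, co2, co2_future, lead):
    """Benchmark decadal forecast: CO2-regression trend + AR(1) residual decay.

    - Forced trend: least-squares regression of the SST series on
      equivalent CO2 concentration.
    - Internal variability: the last residual damped toward zero at the
      lag-1 autocorrelation rate, i.e. r1**lead.
    """
    # Linear trend: sst ~ b0 + b1 * co2
    A = np.column_stack([np.ones_like(co2), co2])
    (b0, b1), *_ = np.linalg.lstsq(A, sst, rcond=None)
    resid = sst - (b0 + b1 * co2)
    # Lag-1 autocorrelation of the residual internal variability
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    # Forecast = forced trend at the future CO2 value + damped anomaly
    return b0 + b1 * co2_future + (r1 ** lead) * resid[-1]
```

At long leads the AR(1) anomaly term decays away, so the forecast reverts to the forced trend, consistent with the trend dominating skill at most grid boxes.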
Abstract:
A version of the Canadian Middle Atmosphere Model that is coupled to an ocean is used to investigate the separate effects of climate change and ozone depletion on the dynamics of the Southern Hemisphere (SH) stratosphere. This is achieved by performing three sets of simulations extending from 1960 to 2099: 1) greenhouse gases (GHGs) fixed at 1960 levels and ozone depleting substances (ODSs) varying in time, 2) ODSs fixed at 1960 levels and GHGs varying in time, and 3) both GHGs and ODSs varying in time. The response of various dynamical quantities to the GHG and ODS forcings is shown to be additive; that is, trends computed from the sum of the first two simulations are equal to trends from the third. Additivity is shown to hold for the zonal mean zonal wind and temperature, the mass flux into and out of the stratosphere, and the latitudinally averaged wave drag in SH spring and summer, as well as for final warming dates. Ozone depletion and recovery cause seasonal changes in lower-stratosphere mass flux, with reduced polar downwelling in the past followed by increased downwelling in the future in SH spring, and the reverse in SH summer. These seasonal changes are attributed to changes in wave drag caused by ozone-induced changes in the zonal mean zonal winds. Climate change, on the other hand, causes a steady decrease in wave drag during SH spring, which delays the breakdown of the vortex, resulting in increased wave drag in summer.
Abstract:
An El Niño-like steady response is found in a greenhouse warming simulation resulting from coupled ocean-atmosphere dynamical feedbacks similar to those producing the present-day El Niños. There is a strong negative cloud-radiation feedback on the sea surface temperature (SST) anomaly associated with this enhanced eastern equatorial Pacific warm pattern. However, this negative feedback is overwhelmed by the positive dynamical feedbacks and cannot diminish the sensitivity of the tropical SST to enhanced greenhouse gas concentrations. The enhanced eastern-Pacific warming in the coupled ocean-atmosphere system suggests that coupled dynamics can strengthen this sensitivity.
Abstract:
A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model's computations as they become available. In the first experiment the updating is performed every 24th, 12th and 6th hour with a given network. The stations are distributed at random in 4 groups in order to simulate 4 areas with different densities of stations. Optimum interpolation is performed for the difference between the forecast and the valid observations. The RMS error of the analyses is reduced in time, the error being smaller the more frequently the updating is performed. Updating every 6th hour yields an analysis error smaller than the RMS error of the observations. In a second experiment the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the error of the analysis is reduced to a value below the RMS error of the observations already after 24 hours, and this yields as a whole a better result than updating from a fixed network. If the satellite data are introduced without any modification, the error of the analysis is reduced much more slowly, and it takes about 4 days to reach a result comparable to the one where the data have been analysed.
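The optimum-interpolation step used above weights the forecast-minus-observation differences by the forecast- and observation-error covariances. A textbook-style sketch of a single analysis update (matrices and values hypothetical, not taken from the paper):

```python
import numpy as np

def oi_update(forecast, B, obs, H, R):
    """One optimum-interpolation analysis step.

    analysis = forecast + K (obs - H @ forecast),
    with gain K = B H^T (H B H^T + R)^{-1}.
    B: forecast-error covariance, R: observation-error covariance,
    H: observation operator mapping state space to observation space.
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return forecast + K @ (obs - H @ forecast)

# Toy 2-point "grid" with one observation of the first point only;
# the covariance in B spreads the correction to the unobserved point.
forecast = np.array([1.0, 2.0])
B = np.array([[1.0, 0.5], [0.5, 1.0]])  # correlated forecast errors
H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])
analysis = oi_update(forecast, B, np.array([3.0]), H, R)
```

With equal forecast and observation error variances, half of the innovation is drawn toward the observed point, and a quarter is spread to its correlated neighbour.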