110 results for time varying parameter model


Relevance: 100.00%

Abstract:

This paper exploits a structural time series approach to model the time pattern of multiple and resurgent food scares and their direct and cross-product impacts on consumer response. A structural time series Almost Ideal Demand System (STS-AIDS) is embedded in a vector error correction framework to allow for dynamic effects (VEC-STS-AIDS). Italian aggregate household data on meat demand is used to assess the time-varying impact of a resurgent BSE crisis (1996 and 2000) and the 1999 Dioxin crisis. The VEC-STS-AIDS model monitors the short-run impacts and performs satisfactorily in terms of residual diagnostics, overcoming the major problems encountered by the customary vector error correction approach.
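For orientation, a hedged sketch of the kind of specification the abstract describes, in generic notation rather than the paper's exact equations: an Almost Ideal Demand System budget-share equation whose intercept follows a random walk, so that the impact of resurgent food scares on consumer response can vary over time; the VEC embedding then adds lagged differences and an error-correction term around this share equation.

```latex
% Illustrative structural time series AIDS share equation (not the paper's exact form):
% the random-walk intercept alpha_{it} carries the time-varying food-scare effects.
\begin{align}
  w_{it}      &= \alpha_{it} + \sum_{j} \gamma_{ij} \ln p_{jt}
                 + \beta_i \ln\!\left(\frac{x_t}{P_t}\right) + \varepsilon_{it}, \\
  \alpha_{it} &= \alpha_{i,t-1} + \eta_{it}, \qquad \eta_{it} \sim N(0, \sigma^2_{\eta,i}),
\end{align}
% w_{it}: budget share of meat type i; p_{jt}: prices; x_t: total meat expenditure;
% P_t: a price index.
```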

Relevance: 100.00%

Abstract:

This article proposes a new model for autoregressive conditional heteroscedasticity and kurtosis. Via a time-varying degrees of freedom parameter, the conditional variance and conditional kurtosis are permitted to evolve separately. The model uses only the standard Student’s t-density and consequently can be estimated simply using maximum likelihood. The method is applied to a set of four daily financial asset return series comprising U.S. and U.K. stocks and bonds, and significant evidence in favor of the presence of autoregressive conditional kurtosis is observed. Various extensions to the basic model are proposed, and we show that the response of kurtosis to good and bad news is not significantly asymmetric.
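A minimal sketch of how such a model can be written down and estimated by maximum likelihood, assuming a GARCH(1,1) variance recursion and a simple autoregressive recursion for the Student-t degrees of freedom; the recursion for nu, the parameter names, and the starting values are illustrative assumptions, not the paper's specification.

```python
# Hedged sketch: conditional variance (GARCH) and conditional kurtosis (via a
# time-varying Student-t degrees-of-freedom parameter) evolve separately, and the
# whole model is estimated by maximising a standardized-t log-likelihood.
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize

def neg_loglik(params, r):
    omega, alpha, beta, d0, d1, d2 = params
    T = len(r)
    h = np.empty(T); nu = np.empty(T)
    h[0] = np.var(r)
    nu[0] = 8.0                        # arbitrary start above 4 so kurtosis exists
    ll = 0.0
    for t in range(1, T):
        h[t] = omega + alpha * r[t-1]**2 + beta * h[t-1]
        nu[t] = 4.0 + np.exp(d0 + d1 * r[t-1]**2 + d2 * np.log(nu[t-1] - 4.0))
        z2 = r[t]**2 / h[t]
        ll += (gammaln((nu[t] + 1) / 2) - gammaln(nu[t] / 2)
               - 0.5 * np.log(np.pi * (nu[t] - 2) * h[t])
               - (nu[t] + 1) / 2 * np.log1p(z2 / (nu[t] - 2)))
    return -ll

# Maximum likelihood via plain numerical optimisation (hypothetical data file):
# r = np.loadtxt("daily_returns.txt")
# res = minimize(neg_loglik, x0=[1e-6, 0.05, 0.9, 0.5, 0.1, 0.5], args=(r,),
#                method="Nelder-Mead")
```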

Relevance: 100.00%

Abstract:

Aircraft systems are highly nonlinear and time varying. High-performance aircraft at high angles of incidence experience undesired coupling of the lateral and longitudinal variables, resulting in departure from normal controlled flight. The construction of a robust closed-loop control that extends the stable and decoupled flight envelope as far as possible is pursued. For the study of these systems, nonlinear analysis methods are needed. Previously, bifurcation techniques have been used mainly to analyze open-loop nonlinear aircraft models and to investigate control effects on dynamic behavior. Linear feedback control designs constructed by eigenstructure assignment methods at a fixed flight condition are investigated for a simple nonlinear aircraft model. Bifurcation analysis, in conjunction with linear control design methods, is shown to aid control law design for the nonlinear system.

Relevance: 100.00%

Abstract:

Aircraft systems are highly nonlinear and time varying. High-performance aircraft at high angles of incidence experience undesired coupling of the lateral and longitudinal variables, resulting in departure from normal controlled flight. The aim of this work is to construct a robust closed-loop control that optimally extends the stable and decoupled flight envelope. For the study of these systems nonlinear analysis methods are needed. Previously, bifurcation techniques have been used mainly to analyze open-loop nonlinear aircraft models and investigate control effects on dynamic behavior. In this work linear feedback control designs calculated by eigenstructure assignment methods are investigated for a simple aircraft model at a fixed flight condition. Bifurcation analysis in conjunction with linear control design methods is shown to aid control law design for the nonlinear system.
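As a rough illustration of the linear design step, the sketch below places closed-loop eigenvalues for a small linearised model with scipy.signal.place_poles; this is a simplified pole-placement stand-in for full eigenstructure assignment, and the A and B matrices, desired poles, and state dimensions are hypothetical, not the paper's aircraft model.

```python
# Minimal sketch of a linear state feedback u = -K x designed at a fixed flight
# condition; pole placement is used here as a simplified stand-in for
# eigenstructure assignment.  All matrices below are hypothetical placeholders.
import numpy as np
from scipy.signal import place_poles

A = np.array([[-0.5,  1.0,  0.0,  0.0],
              [-2.0, -0.8,  0.0,  0.0],
              [ 0.0,  0.0, -0.3,  1.0],
              [ 0.0,  0.0, -1.5, -0.6]])   # hypothetical linearised dynamics
B = np.array([[0.0, 0.0],
              [1.2, 0.0],
              [0.0, 0.0],
              [0.0, 0.9]])                 # hypothetical control effectiveness

desired = np.array([-1.0, -1.5, -2.0, -2.5])   # target closed-loop eigenvalues
K = place_poles(A, B, desired).gain_matrix

# Closed-loop check: eigenvalues of A - B K should sit at the desired locations.
print(np.sort(np.linalg.eigvals(A - B @ K)))
```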

Relevance: 100.00%

Abstract:

We describe a mathematical model linking changes in cerebral blood flow, blood volume, and blood oxygenation in response to stimulation. The model has three compartments to account for the fact that the cerebral blood flow and volume measured concurrently using laser Doppler flowmetry and optical imaging spectroscopy have contributions from the arterial, capillary, and venous compartments of the vasculature. It extends previous one-compartment hemodynamic models, which assume that the measured blood volume changes come from the venous compartment only. An important assumption of the model is that the tissue oxygen concentration is a time-varying state variable of the system, driven by the changes in metabolic demand resulting from changes in neural activity. The model takes into account pre-capillary oxygen diffusion by flexibly allowing the saturation of the arterial compartment to be less than unity. Simulations are used to explore the sensitivity of the model and to optimise the parameters for experimental data. We conclude that the three-compartment model captures the hemodynamics of the response to changes in neural activation following stimulation better than the one-compartment model.
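For context, a sketch of the widely used one-compartment balloon-type model that this class of work extends; the equations are the generic textbook form with illustrative parameter values, not the paper's three-compartment formulation.

```python
# One-compartment balloon-type hemodynamic sketch: stimulus-driven inflow drives
# normalised blood volume v and deoxyhemoglobin q.  Generic form, illustrative values.
import numpy as np
from scipy.integrate import solve_ivp

tau, alpha, E0 = 2.0, 0.4, 0.4          # transit time (s), vessel stiffness, resting O2 extraction

def f_in(t):
    # Hypothetical stimulus-driven inflow: brief increase above baseline.
    return 1.0 + 0.4 * (2.0 < t < 6.0)

def rhs(t, y):
    v, q = y                             # normalised blood volume and deoxyhemoglobin
    fin = f_in(t)
    fout = v ** (1.0 / alpha)            # outflow depends on volume (balloon effect)
    E = 1.0 - (1.0 - E0) ** (1.0 / fin)  # flow-dependent oxygen extraction fraction
    dv = (fin - fout) / tau
    dq = (fin * E / E0 - fout * q / v) / tau
    return [dv, dq]

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 1.0], max_step=0.05)
```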

Relevance: 100.00%

Abstract:

The use of Bayesian inference for time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through tracking chirps and the analysis of musical data.
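A hedged sketch of the general idea, assuming an AR(1) parameterisation of the local spectrum and a random-walk state model (both simplifications for illustration, not the paper's algorithm): successive blocks of data are scored with a local Whittle-type likelihood and a particle filter tracks the spectral parameter online.

```python
# Sketch: online tracking of a slowly varying spectral density with a particle
# filter weighted by a local Whittle-type likelihood.
import numpy as np

rng = np.random.default_rng(0)

def whittle_loglik(block, phi, sig2):
    """Local Whittle log-likelihood of one data block under an AR(1)-type spectrum."""
    n = len(block)
    I = np.abs(np.fft.rfft(block))**2 / n            # periodogram ordinates
    f = np.fft.rfftfreq(n)[1:-1]                     # drop f = 0 and Nyquist
    S = sig2 / np.abs(1.0 - phi[:, None] * np.exp(-2j * np.pi * f))**2
    return -np.sum(np.log(S) + I[1:-1] / S, axis=1)

def track_spectrum(x, block=64, n_part=500, step=0.02):
    """Sequentially estimate a slowly varying AR(1) spectral parameter phi."""
    phi = rng.uniform(-0.9, 0.9, n_part)
    est = []
    for start in range(0, len(x) - block + 1, block):
        phi = np.clip(phi + step * rng.standard_normal(n_part), -0.99, 0.99)
        logw = whittle_loglik(x[start:start + block], phi, sig2=1.0)
        w = np.exp(logw - logw.max()); w /= w.sum()
        est.append(np.sum(w * phi))                  # posterior-mean estimate
        phi = rng.choice(phi, size=n_part, p=w)      # multinomial resampling
    return np.array(est)

# Demo: a chirp-like AR(1) process whose coefficient drifts over time.
T = 4096
phi_true = np.linspace(0.2, 0.8, T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi_true[t] * x[t - 1] + rng.standard_normal()
print(track_spectrum(x)[:5])
```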

Relevance: 100.00%

Abstract:

Flash floods pose a significant danger to life and property. Unfortunately, in arid and semiarid environments runoff generation shows complex non-linear behavior with strong spatial and temporal non-uniformity. As a result, the predictions made by physically based simulations in semiarid areas are subject to great uncertainty, and failures in the predictive behavior of existing models are common. Better descriptions of physical processes at the watershed scale therefore need to be incorporated into hydrological model structures. For example, terrain relief has systematically been treated as static in flood modelling at the watershed scale. Here we show that small, distributed relief variations originating through concurrent hydrological processes within a storm event have a significant integrated effect on the watershed-scale hydrograph. We model these observations by introducing dynamic formulations of two relief-related parameters at diverse scales: maximum depression storage and the roughness coefficient in channels. In the final (a posteriori) model structure these parameters are allowed to be either time-constant or time-varying. The case under study is a convective storm in a semiarid Mediterranean watershed with ephemeral channels and high agricultural pressure (the Rambla del Albujón watershed; 556 km²), which showed a complex multi-peak response. First, to obtain plausible simulations with the (a priori) model with time-constant relief-related parameters, a spatially distributed parameterization was strictly required. Second, a generalized likelihood uncertainty estimation (GLUE) inference applied to the improved model structure, and conditioned on observed nested hydrographs, showed that accounting for dynamic relief-related parameters led to improved simulations. The discussion is finally broadened by considering the use of the calibrated model both to analyze the sensitivity of the watershed to storm motion and to attempt flood forecasting of a stratiform event with markedly different behavior.
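A minimal sketch of the GLUE step described here, with a placeholder surrogate standing in for the distributed rainfall-runoff model; the Nash-Sutcliffe score, the behavioural threshold, and the parameter ranges are common illustrative choices rather than the paper's.

```python
# GLUE sketch: sample parameter sets, run the model, score each run against the
# observed hydrograph, and keep the "behavioural" sets.
import numpy as np

rng = np.random.default_rng(1)

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

def run_model(depression_storage, channel_roughness, rain):
    # Placeholder linear-reservoir surrogate standing in for the distributed model.
    effective = np.maximum(rain - depression_storage, 0.0)
    q, store = np.zeros_like(rain), 0.0
    for t, p in enumerate(effective):
        store += p
        q[t] = store / channel_roughness
        store -= q[t]
    return q

def glue(rain, q_obs, n_sets=2000, threshold=0.5):
    behavioural = []
    for _ in range(n_sets):
        theta = (rng.uniform(0.0, 5.0),      # max depression storage (mm), assumed range
                 rng.uniform(5.0, 50.0))     # roughness-like recession parameter, assumed range
        score = nash_sutcliffe(run_model(*theta, rain), q_obs)
        if score > threshold:
            behavioural.append((theta, score))
    return behavioural
```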

Relevance: 100.00%

Abstract:

Experiments with CO2 instantaneously quadrupled and then held constant are used to show that the relationship between the global-mean net heat input to the climate system and the global-mean surface air temperature change is nonlinear in Coupled Model Intercomparison Project phase 5 (CMIP5) Atmosphere-Ocean General Circulation Models (AOGCMs). The nonlinearity is shown to arise from a change in the strength of climate feedbacks driven by an evolving pattern of surface warming. In 23 of the 27 AOGCMs examined the climate feedback parameter becomes significantly (95% confidence) less negative, i.e. the effective climate sensitivity increases, as time passes. Cloud feedback parameters show the largest changes. In the AOGCM mean, approximately 60% of the change in feedback parameter comes from the tropics (30°N-30°S). An important region involved is the tropical Pacific, where the surface warming intensifies in the east after a few decades. The dependence of climate feedbacks on an evolving pattern of surface warming is confirmed using the HadGEM2 and HadCM3 atmosphere GCMs (AGCMs). With monthly evolving sea surface temperatures and sea ice prescribed from its AOGCM counterpart, each AGCM reproduces the time-varying feedbacks, but when a fixed pattern of warming is prescribed the radiative response is linear with global temperature change, or nearly so. We also demonstrate that the regression and fixed-SST methods for evaluating effective radiative forcing are in principle different, because rapid SST adjustment when CO2 is changed can produce a pattern of surface temperature change with zero global mean but a non-zero change in net radiation at the top of the atmosphere (approximately -0.5 W m-2 in HadCM3).
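The feedback diagnosis described here amounts to a Gregory-style regression of global-mean net top-of-atmosphere flux against global-mean temperature change over different periods of the abrupt-4xCO2 run; a minimal sketch, in which the array names and the 20-year split are illustrative assumptions:

```python
# Sketch of diagnosing a time-varying climate feedback parameter: regress
# global-mean net TOA flux N against global-mean surface temperature change T,
# separately for early and late parts of the abrupt-4xCO2 experiment.
import numpy as np

def feedback_parameter(N, T):
    """Slope of N against T; more negative means a stronger net negative feedback."""
    slope, _ = np.polyfit(T, N, 1)
    return slope

# N, T: annual global means over, say, 150 years of an abrupt-4xCO2 run (hypothetical arrays).
# lam_early = feedback_parameter(N[:20], T[:20])
# lam_late  = feedback_parameter(N[20:], T[20:])
# A shift toward less negative values from lam_early to lam_late corresponds to the
# increase in effective climate sensitivity reported for most CMIP5 models above.
```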

Relevance: 100.00%

Abstract:

Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the poorly observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble is produced with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which used a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely the subduction regions of the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess global average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. We thus also show the robustness of this result in a historical and observational context.
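The nudging itself can be summarised by a standard relaxation term; the form below is the generic one, and the coefficient is only a typical order of magnitude for a "physically based" value, not necessarily the one used in the study:

```latex
% Generic surface-nudging (relaxation) term applied to the model SST; lambda is
% given only as a typical order of magnitude, not the paper's exact value.
\frac{\partial T}{\partial t}\bigg|_{\mathrm{nudge}}
  = -\,\frac{\lambda}{\rho\, c_p\, h}\,\bigl(T - T_{\mathrm{obs}}\bigr),
\qquad \lambda = \mathcal{O}(40)\ \mathrm{W\,m^{-2}\,K^{-1}},
% where h is the depth of the model's surface layer; earlier studies effectively
% used a much larger lambda, i.e. much stronger restoring.
```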

Relevance: 100.00%

Abstract:

The clustering in time (seriality) of extratropical cyclones is responsible for large cumulative insured losses in western Europe, yet surprisingly little scientific attention has been given to this important property. This study investigates and quantifies the seriality of extratropical cyclones in the Northern Hemisphere using a point-process approach. A possible mechanism for serial clustering is the time-varying effect of the large-scale flow on individual cyclone tracks. Another mechanism is the generation by one parent cyclone of one or more offspring through secondary cyclogenesis. A long cyclone-track database was constructed for extended October–March winters from 1950 to 2003 using 6-h analyses of 850-mb relative vorticity derived from the NCEP–NCAR reanalysis. A dispersion statistic based on the variance-to-mean ratio of monthly cyclone counts was used as a measure of clustering. It reveals extensive regions of statistically significant clustering in the European exit region of the North Atlantic storm track and over the central North Pacific. Monthly cyclone counts were regressed on time-varying teleconnection indices with a log-linear Poisson model. Five independent teleconnection patterns were found to be significant factors over Europe: the North Atlantic Oscillation (NAO), the east Atlantic pattern, the Scandinavian pattern, the east Atlantic/western Russian pattern, and the polar/Eurasian pattern. The NAO alone is not sufficient to explain the variability of cyclone counts in the North Atlantic region and western Europe. Rate dependence on time-varying teleconnection indices accounts for the variability in monthly cyclone counts, and a clustering process did not need to be invoked.
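A brief sketch of the two statistical ingredients named here, the variance-to-mean dispersion statistic and the log-linear Poisson regression of monthly counts on teleconnection indices; the index column names are placeholders.

```python
# Sketch: dispersion statistic for monthly cyclone counts and a log-linear
# Poisson regression of the counts on time-varying teleconnection indices.
import numpy as np
import statsmodels.api as sm

def dispersion(counts):
    """Variance-to-mean ratio; values significantly above 1 indicate clustering."""
    return np.var(counts, ddof=1) / np.mean(counts)

def poisson_fit(counts, indices):
    """counts: (n_months,) cyclone counts; indices: (n_months, k) teleconnection values."""
    X = sm.add_constant(indices)
    return sm.GLM(counts, X, family=sm.families.Poisson()).fit()

# Hypothetical usage with monthly index series (NAO, EA, Scandinavian, EA/WR, polar/Eurasian):
# model = poisson_fit(monthly_counts, np.column_stack([nao, ea, scand, eawr, poleur]))
# print(model.summary())
```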

Relevance: 100.00%

Abstract:

Tropical cyclones have been investigated in a T159 version of the MPI ECHAM5 climate model using a novel technique to diagnose the evolution of the three-dimensional vorticity structure of tropical cyclones, including their full life cycle from weak initial vortex to possible extra-tropical transition. Results have been compared with reanalyses (ERA40 and JRA25) and observed tropical storms for the Northern Hemisphere during the period 1978-1999. There is no indication of any trend in the number or intensity of tropical storms during this period in ECHAM5 or in the reanalyses, but there are distinct inter-annual variations. The storms simulated by ECHAM5 are realistic in both space and time, but the model, and even more so the reanalyses, underestimate the intensities of the most intense storms (in terms of their maximum wind speeds). There is an indication of a response to ENSO, with a smaller number of Atlantic storms during El Niño, in agreement with previous studies. The global divergent circulation responds to El Niño by setting up a large-scale convergent flow centred over the central Pacific, with enhanced subsidence over the tropical Atlantic. At the same time there is an increase in the vertical wind shear in the region of the tropical Atlantic where tropical storms normally develop. There is a good correspondence between the model and ERA40, except that the divergent circulation is somewhat stronger in the model. The model underestimates storms in the Atlantic but tends to overestimate them in the western Pacific and the North Indian Ocean. It is suggested that the overestimation of storms in the Pacific by the model is related to an overly strong response to the tropical Pacific SST anomalies. The overestimation in the North Indian Ocean is likely due to an overprediction of the intensity of monsoon depressions, which are then classified as intense tropical storms. Nevertheless, the overall results are encouraging and will further contribute to increased confidence in simulating intense tropical storms with high-resolution climate models.

Relevance: 100.00%

Abstract:

In most climate simulations used in the Intergovernmental Panel on Climate Change 2007 Fourth Assessment Report, stratospheric processes are only poorly represented. For example, climatological or simple specifications of time-varying ozone concentrations are imposed and the quasi-biennial oscillation (QBO) of equatorial stratospheric zonal wind is absent. Here we investigate the impact of an improved stratospheric representation using two sets of perturbed simulations with the Hadley Centre coupled ocean-atmosphere model HadGEM1 with natural and anthropogenic forcings for the 1979–2003 period. In the first set of simulations, the usual zonal mean ozone climatology with superimposed trends is replaced with a time series of observed zonal mean ozone distributions that includes interannual variability associated with the solar cycle, the QBO and volcanic eruptions. In addition, the second set of perturbed simulations includes a scheme in which the stratospheric zonal wind in the tropics is relaxed to appropriate zonal mean values obtained from the ERA-40 re-analysis, thus forcing a QBO. Both of these changes are applied to the stratosphere only. The improved ozone field results in an improved simulation of the stepwise temperature transitions observed in the lower stratosphere in the aftermath of the two major recent volcanic eruptions. The contribution of the solar cycle signal in the ozone field to this improved representation of the stepwise cooling is discussed. The improved ozone field and the QBO also result in an improved simulation of observed trends, both globally and at tropical latitudes. The Eulerian upwelling in the lower stratosphere in the equatorial region is enhanced by the improved ozone field and is affected by the QBO relaxation, yet neither induces a significant change in the upwelling trend.

Relevance: 100.00%

Abstract:

Counterstreaming electrons (CSEs) are treated as signatures of closed magnetic flux, i.e., loops connected to the Sun at both ends. However, CSEs at 1 AU likely fade as the apex of a closed loop passes beyond some distance R, owing to scattering of the sunward beam along its continually increasing path length. The remaining antisunward beam at 1 AU would then give a false signature of open flux. Subsequent opening of a loop at the Sun by interchange reconnection with an open field line would produce an electron dropout (ED) at 1 AU, as if two open field lines were reconnecting to completely disconnect from the Sun. Thus EDs can be signatures of interchange reconnection as well as the commonly attributed disconnection. We incorporate CSE fadeout into a model that matches time-varying closed flux from interplanetary coronal mass ejections (ICMEs) to the solar cycle variation in heliospheric flux. Using the observed occurrence rate of CSEs at solar maximum, the model estimates R ∼ 8–10 AU. Hence we demonstrate that EDs should be much rarer than CSEs at 1 AU, as EDs can only be detected when the juncture points of reconnected field lines lie sunward of the detector, whereas CSEs continue to be detected in the legs of all loops that have expanded beyond the detector, out to R. We also demonstrate that if closed flux added to the heliosphere by ICMEs is instead balanced by disconnection elsewhere, then ED occurrence at 1 AU would still be rare, contrary to earlier expectations.

Relevance: 100.00%

Abstract:

One of the primary goals of the Center for Integrated Space Weather Modeling (CISM) effort is to assess and improve prediction of the solar wind conditions in near-Earth space, arising from both quasi-steady and transient structures. We compare 8 years of L1 in situ observations to predictions of the solar wind speed made by the Wang-Sheeley-Arge (WSA) empirical model. The mean-square error (MSE) between the observations and the model predictions is used to reach a number of useful conclusions: there is no systematic lag in the WSA predictions, the MSE is found to be highest at solar minimum and lowest during the rise to solar maximum, and the optimal lead time for 1 AU solar wind speed predictions is found to be 3 days. However, MSE is shown to frequently be an inadequate “figure of merit” for assessing solar wind speed predictions. A complementary, event-based analysis technique is therefore developed, in which high-speed enhancements (HSEs) are systematically selected and associated from observed and model time series. The WSA model is validated using comparisons of the number of hit, missed, and false HSEs, along with the timing and speed magnitude errors between the forecast and observed events. Morphological differences between the different HSE populations are investigated to aid interpretation of the results and improvements to the model. Finally, by defining discrete events in the time series, model predictions from above and below the ecliptic plane can be used to estimate an uncertainty in the predicted HSE arrival times.
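A hedged sketch of the two validation strategies described: MSE as a function of lead time, and event-based matching of high-speed enhancements; the peak threshold and the association window below are illustrative choices, not the study's exact criteria.

```python
# Sketch: (1) mean-square error of predicted solar wind speed at different lead
# times, (2) simple event-based association of observed and forecast HSEs.
import numpy as np
from scipy.signal import find_peaks

def mse_vs_lead(obs, pred, max_lead):
    """MSE between observed and predicted speed for each lead time (in samples)."""
    return [np.mean((obs[lead:] - pred[:len(pred) - lead])**2)
            for lead in range(max_lead + 1)]

def hse_skill(obs, pred, dt_hours=1.0, window_days=2.0, vmin=500.0):
    """Count hit / missed / false high-speed enhancements by peak association."""
    w = int(window_days * 24 / dt_hours)
    obs_peaks, _ = find_peaks(obs, height=vmin, distance=w)
    pred_peaks, _ = find_peaks(pred, height=vmin, distance=w)
    hits = sum(np.any(np.abs(pred_peaks - p) <= w) for p in obs_peaks)
    return {"hits": hits,
            "missed": len(obs_peaks) - hits,
            "false": len(pred_peaks) - hits}
```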

Relevance: 100.00%

Abstract:

Optical density measurements were used to estimate the effect of heat treatments on the single-cell lag times of Listeria innocua, which were fitted to a shifted gamma distribution. The single-cell lag time was subdivided into a repair time (the shift of the distribution, assumed to be uniform for all cells) and an adjustment time (varying randomly from cell to cell). After heat treatments in which all of the cells recovered (sublethal), the repair time and the mean and variance of the single-cell adjustment time increased with the severity of the treatment. When the heat treatments resulted in a loss of viability (lethal), the repair time of the survivors increased with the decimal reduction of the cell numbers independently of the temperature, while the mean and variance of the single-cell adjustment times remained the same irrespective of the heat treatment. Based on these observations and on modeling of the effect of the time and temperature of the heat treatment, we propose that the severity of a heat treatment can be characterized by the repair time of the cells whether the heat treatment is lethal or not, an extension of the F value concept to sublethal heat treatments. In addition, the repair time could be interpreted as the extent or degree of injury within a multiple-hit lethality model. Another implication of these results is that the distribution of the time for cells to reach unacceptable numbers in food is not affected by the time-temperature combination producing a given decimal reduction.
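A short sketch of fitting a shifted gamma distribution to single-cell lag times with scipy, where the fitted location plays the role of the repair time (the shift) and the gamma component describes the cell-to-cell adjustment time; the synthetic data and parameter values are made up for the demo.

```python
# Sketch: shifted gamma fit to single-cell lag times; loc = repair time (shift),
# the gamma part = randomly varying adjustment time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic lag times (hours): a fixed repair time plus gamma-distributed
# adjustment times; values are illustrative only.
lags = 1.5 + rng.gamma(shape=2.0, scale=0.8, size=500)

shape, loc, scale = stats.gamma.fit(lags)            # loc estimates the repair time
mean_adjust, var_adjust = shape * scale, shape * scale**2
print(loc, mean_adjust, var_adjust)
```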