959 results for Time-varying system


Relevance:

80.00%

Publisher:

Abstract:

Recent research suggests that the ability of an extraneous formant to impair intelligibility depends on the variation of its frequency contour. This idea was explored using a method that ensures interference occurs only through informational masking. Three-formant analogues of sentences were synthesized using a monotonous periodic source (F0 = 140 Hz). Target formants were presented monaurally; the target ear was assigned randomly on each trial. A competitor for F2 (F2C) was presented contralaterally; listeners had to reject F2C to optimize recognition. In experiment 1, F2Cs with various frequency and amplitude contours were used. F2Cs with time-varying frequency contours were effective competitors; constant-frequency F2Cs had far less impact. Amplitude contour also influenced competitor impact; this effect was additive. In experiment 2, F2Cs were created by inverting the F2 frequency contour about its geometric mean and varying the depth of variation over a range from constant to twice the original (0–200%). The impact on intelligibility was least for constant F2Cs and increased with depth up to ~100%, but changed little thereafter. The effect of an extraneous formant depends primarily on its frequency contour; interference increases as the depth of variation grows, until the range exceeds that typical of F2 in natural speech.
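
The contour manipulation described in experiment 2 can be sketched in a few lines. The snippet below is a minimal illustration, assuming the F2 track is given as a vector of frequencies in Hz and that the inversion and depth scaling are carried out on a log-frequency scale; the function name and the example track are hypothetical.

```python
import numpy as np

def make_f2_competitor(f2_hz, depth=1.0):
    """Invert an F2 frequency track about its geometric mean (in the log
    domain) and scale the depth of variation: depth=0 gives a constant
    contour, 1.0 the original depth, 2.0 twice the original (200%)."""
    log_f2 = np.log(np.asarray(f2_hz, dtype=float))
    log_gm = log_f2.mean()                      # log of the geometric mean
    inverted = log_gm - (log_f2 - log_gm)       # reflect about the mean
    scaled = log_gm + depth * (inverted - log_gm)
    return np.exp(scaled)

# Hypothetical F2 track (Hz) and two competitors: constant and 200% depth.
f2_track = 1500 + 400 * np.sin(np.linspace(0, 4 * np.pi, 200))
f2c_flat = make_f2_competitor(f2_track, depth=0.0)
f2c_deep = make_f2_competitor(f2_track, depth=2.0)
print(f2c_flat[:3].round(1), f2c_deep[:3].round(1))
```

Depth 0 reproduces a constant contour at the geometric mean, while depth 2.0 doubles the excursions of the inverted contour.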

Relevance:

80.00%

Publisher:

Abstract:

We investigate return-to-zero (RZ) to non-return-to-zero (NRZ) format conversion by means of linear time-invariant system theory. It is shown that the problem of converting a random RZ stream to an NRZ stream can be reduced to constructing an appropriate transfer function for the linear filter. This approach is then used to propose a novel, optimally designed single fiber Bragg grating (FBG) filter scheme for RZ-OOK/DPSK/DQPSK to NRZ-OOK/DPSK/DQPSK format conversion. The spectral response of the FBG is designed according to the optical spectra of the algebraic difference between isolated NRZ and RZ pulses, and the filter order is optimized for the maximum Q-factor of the output NRZ signals. Experimental results as well as simulations show that such an optimally designed FBG can successfully perform RZ-OOK/DPSK/DQPSK to NRZ-OOK/DPSK/DQPSK format conversion.
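
To make the LTI idea concrete, the sketch below builds a baseband toy model in which the conversion filter is the regularised spectral ratio that maps an isolated RZ pulse onto an isolated NRZ pulse. This ratio design is a generic stand-in used for illustration only; the paper's FBG response is instead derived from the spectra of the algebraic difference between isolated NRZ and RZ pulses, and the rectangular pulse shapes and sample counts here are assumptions.

```python
import numpy as np

# Sampling setup: 16 samples per bit, 64-bit pseudo-random sequence.
sps, nbits = 16, 64
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, nbits)

# Isolated pulse shapes (baseband, rectangular for simplicity).
rz_pulse = np.zeros(sps)
rz_pulse[:sps // 2] = 1.0            # 50% duty-cycle RZ pulse
nrz_pulse = np.ones(sps)             # full-bit NRZ pulse

# RZ data stream.
rz_stream = np.concatenate([b * rz_pulse for b in bits])

# LTI view: a filter whose transfer function maps an isolated RZ pulse
# onto an isolated NRZ pulse, here a regularised ratio H = NRZ / RZ.
n = len(rz_stream)
RZ = np.fft.fft(rz_pulse, n)
NRZ = np.fft.fft(nrz_pulse, n)
eps = 1e-3 * np.abs(RZ).max()
H = NRZ * np.conj(RZ) / (np.abs(RZ) ** 2 + eps ** 2)

# Apply the filter in the frequency domain (circular convolution) and
# read the approximate NRZ levels at the bit centres.
nrz_est = np.real(np.fft.ifft(np.fft.fft(rz_stream) * H))
centres = nrz_est[sps // 2::sps][:len(bits)]
print("bits:      ", bits[:8])
print("NRZ levels:", np.round(centres[:8], 2))
```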

Relevance:

80.00%

Publisher:

Abstract:

Fermentation processes, as objects of modelling and high-quality control, are characterized by interdependent, time-varying process variables that lead to non-linear models with a very complex structure. This is why conventional optimization methods cannot deliver a satisfactory solution. As an alternative, genetic algorithms, as a stochastic global optimization method, can be applied to overcome these limitations. Genetic algorithms offer robustness and the capacity to reach a global minimum, which makes them suitable and more workable for parameter identification of fermentation models. Different types of genetic algorithms, namely simple, modified and multi-population ones, have been applied and compared for estimation of nonlinear dynamic model parameters of fed-batch cultivation of S. cerevisiae.
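
A minimal sketch of the approach, assuming a toy fed-batch Monod model in place of the S. cerevisiae model actually used, and a simple real-coded genetic algorithm (tournament selection, arithmetic crossover, Gaussian mutation, elitism) standing in for the simple, modified and multi-population variants compared in the study:

```python
import numpy as np

# Toy fed-batch Monod model (illustrative stand-in): X = biomass, S = substrate.
def simulate(params, n_steps=200, dt=0.05, feed=0.05, s_feed=100.0):
    mu_max, Ks, Yxs = params
    X, S, V, traj = 0.5, 10.0, 1.0, []
    for _ in range(n_steps):
        mu = mu_max * S / (Ks + S)
        dX = mu * X - (feed / V) * X
        dS = -mu * X / Yxs + (feed / V) * (s_feed - S)
        X += dt * dX
        S = max(S + dt * dS, 0.0)
        V += dt * feed
        traj.append(X)
    return np.array(traj)

def simple_ga(data, bounds, pop_size=40, gens=80, seed=1):
    """Real-coded GA: tournament selection, arithmetic crossover,
    Gaussian mutation, elitism; fitness = sum of squared errors (minimised)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    pop = rng.uniform(lo, hi, (pop_size, len(lo)))
    for _ in range(gens):
        err = np.array([np.sum((simulate(p) - data) ** 2) for p in pop])
        best = pop[np.argmin(err)].copy()
        # Tournament selection (lower error wins).
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((err[a] < err[b])[:, None], pop[a], pop[b])
        # Arithmetic crossover with a shuffled partner, then mutation.
        mates = parents[rng.permutation(pop_size)]
        w = rng.random((pop_size, 1))
        pop = w * parents + (1 - w) * mates
        pop += rng.normal(0.0, 0.02, pop.shape) * (hi - lo)
        pop = np.clip(pop, lo, hi)
        pop[0] = best                                  # elitism
    err = np.array([np.sum((simulate(p) - data) ** 2) for p in pop])
    return pop[np.argmin(err)]

# Fit the toy model to synthetic biomass data generated with known parameters.
true_params = (0.4, 0.5, 0.5)
data = simulate(true_params)
print(simple_ga(data, bounds=[(0.05, 1.0), (0.05, 5.0), (0.1, 1.0)]))
```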

Relevance:

80.00%

Publisher:

Abstract:

Magnetoencephalographic (MEG) signals, like electroencephalographic (EEG) measures, are direct extracranial manifestations of neuronal activation. The two techniques can detect time-varying changes in electromagnetic activity with sub-millisecond time resolution. Extracranial electromagnetic measures are the cornerstone of the non-invasive diagnostic armamentarium in patients with epilepsy. Their extremely high temporal resolution – comparable to intracranial recordings – is the basis for a precise definition of the onset and propagation of ictal and interictal abnormalities. Given the cost of the infrastructure and equipment, MEG has yet to develop into a routinely applicable diagnostic tool in clinical settings. However, in recent years, an increasing number of patients with epilepsy have been investigated – usually in the context of presurgical evaluation of refractory epilepsies – and initial encouraging results have been reported. We briefly review the principles and technology behind MEG and its contribution to the diagnostic work-up of patients with epilepsy.

Relevance:

80.00%

Publisher:

Abstract:

It is shown that an electromagnetic wave equation in the time domain reduces, in the paraxial approximation, to an equation similar to the Schrödinger equation but in which the time and space variables play opposite roles. This equation has solutions in the form of time-varying pulses with the Airy function as an envelope. The pulses are generated by a source point with an Airy time-varying field and propagate in vacuum preserving their shape and magnitude. The motion follows a quadratic law, with the velocity changing from infinity at the source point to zero at infinity. These one-dimensional results are extended to the 3D+time case, in which a similar Airy-Bessel pulse is excited by the field at a plane aperture. The same behaviour of the pulses, the non-diffractive preservation of shape and the deceleration, is found. © 2011 IEEE.
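
A hedged numerical illustration of the quadratic law: the peak of a one-dimensional Airy envelope Ai(τ − ζ²/4) shifts quadratically with propagation, so the implied velocity dζ/dτ = 2/ζ is infinite at the source and tends to zero at infinity. The dimensionless variables, the omitted phase and normalisation terms, and the function name are assumptions made for illustration, not the paper's exact solution.

```python
import numpy as np
from scipy.special import airy

def airy_envelope(tau, zeta):
    """|Ai(tau - zeta**2/4)|: the envelope keeps its shape while its peak
    shifts quadratically with zeta (tau_peak ~ zeta**2/4 - 1.02)."""
    ai, _, _, _ = airy(tau - zeta ** 2 / 4.0)
    return np.abs(ai)

tau = np.linspace(-10.0, 10.0, 2000)
for zeta in (0.0, 2.0, 4.0):
    env = airy_envelope(tau, zeta)
    print(f"zeta={zeta:.1f}  peak at tau~{tau[np.argmax(env)]:.2f}")
```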

Relevance:

80.00%

Publisher:

Abstract:

This article describes a surgical robotic device that is able to discriminate tissue interfaces and other controlling parameters ahead of the drill tip. The advantage in such surgery is that the tissues at the interfaces can be preserved. A smart tool senses ahead of the tool point and controls the interaction with the flexing tissue, either to avoid penetration or to control the extent of protrusion relative to the position of the tissue. For surgical procedures where precision is required, the tool offers significant benefit. To interpret the drilling conditions and the conditions leading up to breakthrough at a tissue interface, a sensing scheme is used that discriminates between the variety of conditions posed in the drilling environment. The result is a fully autonomous system that is able to respond to tissue type, behaviour, and deflection in real time. The system is also robust to disturbances encountered in the operating theatre. The device is pragmatic: it is intuitive to use, efficient to set up, and uses standard drill bits. The micro-drill, which has been used to prepare cochleostomies in the theatre, was used to remove the bone tissue while leaving the endosteal membrane intact. This has enabled sterility to be preserved and drilling debris to be removed prior to insertion of the electrode. It is expected that this technique will promote the preservation of hearing and reduce the possibility of complications. The article describes the device (including simulated drill progress and hardware set-up) and the stages leading up to its use in the theatre. © 2010 Authors.

Relevance:

80.00%

Publisher:

Abstract:

Since the stagflation periods that followed the oil crises of the 1970s, fears of adverse macroeconomic effects have intensified with practically every major price increase, even though experience shows that oil importers are less and less affected by movements in the real price of oil. Blanchard and Galí [2007] attributed the weakening effects to economies operating more efficiently and flexibly, while according to Kilian [2010] the post-2000 price increase was fuelled by a favourable global economic environment, which offset the negative processes caused by the higher price. By extending the model of Kilian [2009], this study examines the compatibility of the two approaches using a time-varying parameter econometric procedure. The results suggest that the hypotheses complement each other: for the macroeconomic consequences it is not the price itself but its underlying causes that matter, while the impact of these underlying factors has changed continuously over recent decades. _____ Many economists argue that the stagflation periods of the 1970s were related to the two main oil crises. However, experience shows that these effects were eliminated over the decades; e.g., oil-importing economies enjoyed solid growth and low inflation when oil prices surged in the 2000s. Blanchard and Galí (2007) found that economies became more effective and elastic in handling high energy prices, while Kilian (2010) identified the structural differences behind the price changes as the main reason for the weakening macroeconomic effects of oil-price shocks. The article sets out to test the compatibility of the two rival theories using time-varying parameter models. The results show that both hypotheses can be correct concurrently: the structure of the price change matters, but the impulse responses varied over time.
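
A minimal sketch of a time-varying parameter regression of the kind used here: random-walk coefficients estimated with a Kalman filter. This is a generic textbook TVP filter run on simulated data, not the paper's extension of Kilian's (2009) structural model; the variable names, noise variances and the toy data are assumptions.

```python
import numpy as np

def tvp_regression(y, x, q=1e-4, r=1.0):
    """Time-varying parameter regression y_t = x_t' beta_t + e_t with
    random-walk coefficients beta_t = beta_{t-1} + v_t, estimated by a
    Kalman filter. q and r are illustrative state / observation variances."""
    T, k = x.shape
    beta, P = np.zeros(k), np.eye(k)
    betas = np.zeros((T, k))
    for t in range(T):
        P = P + q * np.eye(k)                 # prediction (random-walk state)
        xt = x[t]
        S = xt @ P @ xt + r                   # forecast-error variance
        K = P @ xt / S                        # Kalman gain
        beta = beta + K * (y[t] - xt @ beta)  # update
        P = P - np.outer(K, xt) @ P
        betas[t] = beta
    return betas

# Toy example: the response of "output growth" to an "oil supply shock"
# weakens over time; the filtered coefficient should drift towards zero.
rng = np.random.default_rng(0)
T = 400
shock = rng.normal(size=T)
true_beta = np.linspace(-0.8, -0.1, T)          # weakening impact
growth = true_beta * shock + rng.normal(scale=0.5, size=T)
X = np.column_stack([np.ones(T), shock])
paths = tvp_regression(growth, X, q=1e-3, r=0.25)
print(paths[::100, 1])                           # drifting shock coefficient
```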

Relevance:

80.00%

Publisher:

Abstract:

Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues aiming to improve our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapter Three and Chapter Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model where agents face a cash-in-advance constraint and set prices to the local market; the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model had potential to rationalize the Uncovered Interest Parity Puzzle. Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate negative cross-correlations between real exchange rates and relative consumption across two countries as observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model could replicate the stylized fact that real exchange rates tend to move in an opposite direction with respect to relative consumption.
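
The forecast comparison in Chapter Two can be sketched as rolling one-step-ahead forecasts from a fitted model versus the random-walk (no-change) forecast, scored by out-of-sample RMSE. As a hedge, an AR(1) fitted on returns is used below as a simple stand-in for the fractionally integrated GARCH model with skewed Student-t errors, and the simulated series is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 2000
returns = rng.standard_t(df=5, size=T) * 0.006    # fat-tailed daily returns
log_rate = np.cumsum(returns)                     # toy log exchange rate

window, err_model, err_rw = 500, [], []
for t in range(window, T - 1):
    r = returns[t - window:t]
    phi = np.dot(r[1:], r[:-1]) / np.dot(r[:-1], r[:-1])   # OLS AR(1) slope
    model_fc = log_rate[t] + phi * returns[t]              # model forecast
    rw_fc = log_rate[t]                                    # random-walk forecast
    err_model.append((log_rate[t + 1] - model_fc) ** 2)
    err_rw.append((log_rate[t + 1] - rw_fc) ** 2)

print("RMSE model:      ", np.sqrt(np.mean(err_model)))
print("RMSE random walk:", np.sqrt(np.mean(err_rw)))
```

On a series with unpredictable returns, the fitted model fails to beat the random walk, which is the comparison logic behind the chapter's conclusion.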

Relevance:

80.00%

Publisher:

Abstract:

We develop a new autoregressive conditional process to capture both the changes and the persistency of the intraday seasonal (U-shape) pattern of volatility in essay 1. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise component. This decomposition methodology differs from previous studies in that both the informational variance and the noise variance are time-varying. Furthermore, the covariance of the informational component and the noise component is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
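
The informativeness measure defined in essay 2 can be illustrated with simulated components; a minimal sketch, assuming the informational and noise components (and their time-varying variances and covariance) are known rather than estimated as in the essays:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 1000
sigma_info = 0.5 + 0.4 * np.sin(np.linspace(0, 6 * np.pi, T)) ** 2
sigma_noise = 0.3 + 0.2 * np.cos(np.linspace(0, 4 * np.pi, T)) ** 2
rho = 0.2                                        # info-noise correlation

# Returns = information component + noise component (correlated).
z1, z2 = rng.normal(size=(2, T))
info = sigma_info * z1
noise = sigma_noise * (rho * z1 + np.sqrt(1 - rho ** 2) * z2)
returns = info + noise

# Time-varying decomposition of the return variance and the
# informativeness measure: informational variance / total variance.
var_info = sigma_info ** 2
var_noise = sigma_noise ** 2
cov = rho * sigma_info * sigma_noise
var_total = var_info + var_noise + 2 * cov
informativeness = var_info / var_total

print("empirical return variance:", round(returns.var(), 3),
      " mean theoretical total variance:", round(var_total.mean(), 3))
print("informativeness ranges from",
      round(informativeness.min(), 2), "to", round(informativeness.max(), 2))
```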

Relevance:

80.00%

Publisher:

Abstract:

Prior research has established that the idiosyncratic volatility of securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: (a) increase the efficiency of the portfolio optimization process, (b) implement large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach in the construction of a time-varying covariance matrix. This involves the application of a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to account for the dynamics of the conditional covariance matrices that are employed. The stochastic aspects of the expected returns of the securities are integrated into the technique through Monte Carlo simulations. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time series of the securities are fitted to probability distributions that match their characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation. In addition, the Russell 1000 securities are used to generate 50 sample data sets. The results indicate an increase in risk-return performance. With Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercially available product, as the benchmark, the new greedy technique clearly outperforms the alternatives on samples of the S&P500 and the Russell 1000 securities. The resulting improvements in performance are consistent across five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
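
A hedged sketch of a greedy weight-allocation scheme under a VaR criterion: weight is added in small increments, each time to the asset whose increment yields the lowest portfolio VaR over a set of simulated return scenarios. This generic allocator and the toy scenarios are illustrative assumptions, not the dissertation's algorithm or its IDCC-GARCH covariance machinery.

```python
import numpy as np

def portfolio_var(weights, scenarios, alpha=0.05):
    """Value-at-Risk (loss quantile) of the portfolio over return scenarios."""
    pnl = scenarios @ weights
    return -np.quantile(pnl, alpha)

def greedy_min_var(scenarios, step=0.02):
    """Greedily allocate weight in increments of `step` until fully invested,
    each time choosing the asset that minimises the resulting portfolio VaR."""
    n_assets = scenarios.shape[1]
    weights = np.zeros(n_assets)
    for _ in range(int(round(1.0 / step))):
        best_var, best_j = np.inf, 0
        for j in range(n_assets):
            trial = weights.copy()
            trial[j] += step
            v = portfolio_var(trial, scenarios)
            if v < best_var:
                best_var, best_j = v, j
        weights[best_j] += step
    return weights

# Toy scenarios: 5000 simulated daily returns for 8 assets.
rng = np.random.default_rng(11)
cov = 0.0001 * (0.3 * np.ones((8, 8)) + 0.7 * np.eye(8))
scenarios = rng.multivariate_normal(np.zeros(8), cov, size=5000) + 0.0003
w = greedy_min_var(scenarios)
print("weights:", np.round(w, 2), " 95% VaR:", round(portfolio_var(w, scenarios), 5))
```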

Relevance:

80.00%

Publisher:

Abstract:

My dissertation investigates the financial linkages and the transmission of economic shocks between the US and the smallest emerging markets (frontier markets). The first chapter sets up an empirical model that examines the impact of US market returns and conditional volatility on the returns and conditional volatilities of twenty-one frontier markets. The model is estimated via maximum likelihood, utilizes a GARCH model of errors, and is applied to daily country data from MSCI Barra. We find limited but statistically significant exposure of frontier markets to shocks from the US. Our results suggest that it is not the lagged US market returns that have an impact; rather, it is the expected US market returns that influence frontier market returns. The second chapter sets up an empirical time-varying parameter (TVP) model to explore the time-variation in the impact of mean US returns on mean frontier market returns. The model utilizes the Kalman filter algorithm as well as a GARCH model of errors and is applied to daily country data from MSCI Barra. The TVP model detects statistically significant time-variation in the impact of US returns and a low, but statistically and quantitatively important, impact of US market conditional volatility. The third chapter studies the risk-return relationship in twenty frontier country stock markets by setting up an international version of the intertemporal capital asset pricing model. The systematic risk in this model comes from the covariance of frontier market stock index returns with world returns. Both the systematic risk and the risk premium are time-varying in our model. We also incorporate own-country variances as additional determinants of frontier country returns. Our results suggest a statistically significant impact of both world and own-country risk in explaining frontier country returns. Time-variation in the world risk premium is also found to be statistically significant for most frontier market returns. However, own-country risk is found to be quantitatively more important.
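
The conditional-risk ingredients of the third chapter can be sketched with exponentially weighted moments: a time-varying world beta for a frontier market together with its own-country variance. The EWMA estimator, the 0.94 decay and the simulated returns below are illustrative assumptions, not the chapter's estimation method.

```python
import numpy as np

def ewma_risk_terms(r_frontier, r_world, lam=0.94):
    """Exponentially weighted covariance/variances giving a time-varying
    world beta and the own-country variance of a frontier market."""
    T = len(r_frontier)
    cov, var_w, var_own = np.zeros(T), np.zeros(T), np.zeros(T)
    cov[0] = r_frontier[0] * r_world[0]
    var_w[0] = r_world[0] ** 2
    var_own[0] = r_frontier[0] ** 2
    for t in range(1, T):
        cov[t] = lam * cov[t - 1] + (1 - lam) * r_frontier[t] * r_world[t]
        var_w[t] = lam * var_w[t - 1] + (1 - lam) * r_world[t] ** 2
        var_own[t] = lam * var_own[t - 1] + (1 - lam) * r_frontier[t] ** 2
    beta = cov / var_w            # time-varying systematic (world) risk
    return beta, var_own

# Toy daily data: a frontier market with modest world exposure.
rng = np.random.default_rng(21)
T = 1500
r_world = rng.normal(0, 0.01, T)
r_frontier = 0.3 * r_world + rng.normal(0, 0.015, T)
beta_t, own_var_t = ewma_risk_terms(r_frontier, r_world)
print("mean world beta:", round(beta_t[250:].mean(), 2),
      " mean own variance:", round(own_var_t[250:].mean(), 6))
```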

Relevance:

80.00%

Publisher:

Abstract:

A landfill represents a complex and dynamically evolving structure that can be stochastically perturbed by exogenous factors. Both the thermodynamic (equilibrium) and the time-varying (non-steady-state) properties of a landfill are affected by spatially heterogeneous and nonlinear subprocesses that combine with constraining initial and boundary conditions arising from the associated surroundings. While multiple attempts have been made to model landfill statistics by incorporating spatially dependent parameters on the one hand (the data-based approach) and continuum dynamical mass-balance equations on the other (equation-based modelling), practically no attempt has been made to amalgamate these two approaches while also incorporating the inherent stochastically induced fluctuations affecting the process overall. In this article, we implement a minimalist scheme for modelling the time evolution of a realistic three-dimensional landfill through a reaction-diffusion based approach, focusing on the coupled interactions of four key variables (solid mass density, hydrolysed mass density, acetogenic mass density and methanogenic mass density) that are themselves stochastically affected by fluctuations, coupled with diffusive relaxation of the individual densities, in the ambient surroundings. Our results indicate that, close to the linearly stable limit, the large-time steady-state properties, arising out of a series of complex coupled interactions between the stochastically driven variables, are scarcely affected by the biochemical growth-decay statistics. Our results clearly show that an equilibrium landfill structure is primarily determined by the solid and hydrolysed mass densities alone, rendering the other variables statistically "irrelevant" in this (large-time) asymptotic limit. The other major implication of incorporating stochasticity in the landfill evolution dynamics is the hugely reduced production times of the plants, which are now approximately 20-30 years instead of the 50 years and above predicted by the previous deterministic models. The predictions from this stochastic model are in conformity with available experimental observations.
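
A minimal sketch of the modelling scheme, assuming a simplified reaction chain and illustrative rate constants in place of the paper's kinetics: explicit Euler-Maruyama stepping of the four coupled densities on a small periodic 3-D grid, with a discrete Laplacian for diffusive relaxation and multiplicative noise for the stochastic forcing.

```python
import numpy as np

def laplacian(u):
    """Discrete 3-D Laplacian with periodic boundaries (unit grid spacing)."""
    lap = -6.0 * u
    for axis in range(3):
        lap += np.roll(u, 1, axis) + np.roll(u, -1, axis)
    return lap

def evolve(n=16, steps=2000, dt=0.005, D=0.2, noise=0.05, seed=0):
    rng = np.random.default_rng(seed)
    # Illustrative rates: hydrolysis, acetogenesis, yields, decay.
    k_h, k_a, y_a, y_m, d_a, d_m = 0.5, 0.8, 0.4, 0.3, 0.05, 0.05
    S = rng.uniform(0.8, 1.2, (n, n, n))      # solid mass density
    H = np.full((n, n, n), 0.05)              # hydrolysed mass density
    A = np.full((n, n, n), 0.05)              # acetogenic mass density
    M = np.full((n, n, n), 0.05)              # methanogenic mass density
    for _ in range(steps):
        hyd = k_h * S                          # solid -> hydrolysed
        ace = k_a * H * A                      # hydrolysed consumed by acetogens
        dS = -hyd
        dH = hyd - ace
        dA = y_a * ace - d_a * A
        dM = y_m * ace - d_m * M               # methanogens fed downstream (simplified)
        for X, dX in ((S, dS), (H, dH), (A, dA), (M, dM)):
            X += dt * (dX + D * laplacian(X))                      # reaction + diffusion
            X += noise * X * np.sqrt(dt) * rng.normal(size=X.shape)  # multiplicative noise
            np.clip(X, 0.0, None, out=X)
    return S, H, A, M

S, H, A, M = evolve()
print("mean densities:", [round(float(x.mean()), 4) for x in (S, H, A, M)])
```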