974 results for time-varying channel


Relevance:

80.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.

Relevance:

80.00%

Publisher:

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities and examine some of the main obstacles to exchange rate models' predictive ability. We first consider in Chapter 2 a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals, for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis period, we detect forecast improvements over a random walk (RW) benchmark for at least half, and as many as seven out of ten, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries.

In Chapter 3 we look closely at the role of time variation in parameters and other sources of uncertainty in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond one month. At shorter horizons, however, our methods fail to forecast better than the RW. We identify uncertainty in coefficient estimation, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability.

Chapter 4 focuses on the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the one-month horizon, and outperforms alternative methods, including Bayesian methods, bagging, and standard forecast combinations.

Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices receive strong support.
The chapter also introduces the random walk Metropolis-Hastings technique as a new tool to estimate MIDAS regressions.
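
To make the MIDAS idea concrete, here is a minimal, hypothetical sketch (not the thesis code): a regression of a monthly exchange-rate return on roughly a month of daily commodity-price returns aggregated with exponential Almon lag weights, fitted by nonlinear least squares rather than the Bayesian random walk Metropolis-Hastings estimation used in the chapter. All parameter values and dimensions are illustrative.

```python
# Illustrative MIDAS sketch: map ~22 daily lags onto a monthly return
# using exponential Almon weights, estimated by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon polynomial weights, normalised to sum to one."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

def midas_forecast(params, daily_lags):
    """params = (alpha, beta, theta1, theta2); daily_lags has shape (T, n_lags)."""
    alpha, beta, theta1, theta2 = params
    w = exp_almon_weights(theta1, theta2, daily_lags.shape[1])
    return alpha + beta * daily_lags @ w

def fit_midas(monthly_fx, daily_lags):
    """Nonlinear least squares for the four MIDAS parameters."""
    resid = lambda p: monthly_fx - midas_forecast(p, daily_lags)
    return least_squares(resid, x0=np.array([0.0, 0.1, -0.01, -0.001])).x

# Toy usage with simulated data (120 months, 22 daily lags per month)
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 22))
true_w = exp_almon_weights(-0.05, -0.002, 22)
y = 0.1 + 0.5 * X @ true_w + rng.normal(scale=0.1, size=120)
print(fit_midas(y, X))
```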

Relevance:

80.00%

Publisher:

Abstract:

The under-reporting of cases of infectious diseases is a substantial impediment to the control and management of infectious diseases in both epidemic and endemic contexts. Information about infectious disease dynamics can be recovered from sequence data using time-varying coalescent approaches, and phylodynamic models have been developed to reconstruct changes in the number of infected hosts through time. In this study I have demonstrated the general concordance between empirically observed epidemiological incidence data and viral demography inferred through analysis of foot-and-mouth disease virus VP1 coding sequences belonging to the CATHAY topotype over large temporal and spatial scales. However, a more precise and robust relationship between the effective population size (

Relevance:

80.00%

Publisher:

Abstract:

This thesis consists of three articles on optimal fiscal and monetary policy. In the first article, I study the joint determination of optimal fiscal and monetary policy in a New Keynesian framework with frictional labour markets, money, and distortionary labour income taxes. I find that when workers' bargaining power is low, the Ramsey-optimal policy calls for a significantly higher optimal annual inflation rate, above 9.5%, which is also highly volatile, above 7.4%. The Ramsey government uses inflation to induce efficient fluctuations in labour markets, despite the fact that price changes are costly and despite the presence of time-varying labour taxation. The quantitative results clearly show that the planner relies more heavily on inflation, not on taxes, to smooth distortions in the economy over the business cycle. Indeed, there is a clear trade-off between the optimal inflation rate and its volatility on the one hand and the optimal income tax rate and its variability on the other. The lower the degree of price rigidity, the higher the optimal inflation rate and inflation volatility, and the lower the optimal income tax rate and income tax volatility. For a ten-times smaller degree of price rigidity, the optimal inflation rate and its volatility increase remarkably, to more than 58% and 10%, respectively, and the optimal income tax rate and its volatility decline dramatically. These results are of great importance given that in frictional labour-market models without fiscal policy and money, or in New Keynesian frameworks even with a rich array of real and nominal rigidities and a tiny degree of price rigidity, price stability appears to be the central goal of optimal monetary policy. In the absence of fiscal policy and money demand, the optimal inflation rate falls very close to zero, with a volatility about 97 percent lower, consistent with the literature.

In the second article, I show that the quantitative results imply that workers' bargaining power and the welfare costs of monetary rules are negatively related. That is, the lower the workers' bargaining power, the larger the welfare costs of monetary policy rules. However, in striking contrast to the literature, rules that respond to output and to labour market tightness entail considerably lower welfare costs than the inflation-targeting rule. This is especially the case for the rule that responds to labour market tightness. Welfare costs also fall remarkably as the size of the output coefficient in the monetary rules increases. My results indicate that by raising workers' bargaining power to the Hosios level or above, the welfare costs of the three monetary rules decline significantly, and responding to output or to labour market tightness no longer yields lower welfare costs than the inflation-targeting rule, which is in line with the existing literature.
In the third article, I first show that the Friedman rule in a monetary model with a cash-in-advance constraint on firms is not optimal when the government has access to distortionary consumption taxes to finance its spending. I then argue that the Friedman rule in the presence of these distortionary taxes is optimal if we assume a model with raw and effective labour in which only raw labour is subject to the cash-in-advance constraint and the utility function is homothetic in the two types of labour and separable in consumption. When the production function exhibits constant returns to scale, the Friedman rule is optimal even when wage rates differ, unlike in the cash-credit goods model, where the prices of the two goods are the same. If the production function exhibits increasing or decreasing returns to scale, the wage rates must be equal for the Friedman rule to be optimal.

Relevance:

80.00%

Publisher:

Abstract:

Composite resins have been subjected to structural modifications aiming at improved optical and mechanical properties. The present study consisted of an in vitro evaluation of the staining behavior of two nanohybrid resins (NH1 and NH2), a nanoparticulated resin (NP) and a microhybrid resin (MH). Samples of these materials were prepared and immersed in commonly ingested drinks, i.e., coffee, red wine and acai berry, for periods of time varying from 1 to 60 days. Cylindrical samples of each resin were shaped using a metallic die and polymerized for 30 s on both the bottom and the top of the disk. All samples were polished and immersed in the staining solutions. After 24 hours, three samples of each resin immersed in each solution were removed and placed in a spectrophotometer for analysis. To that end, the samples were first diluted in 50% HCl. Tukey tests were carried out in the statistical analysis of the results. The results revealed a clear difference in the staining behavior of each material. The nanoparticulated resin did not show better color stability than the microhybrid resin. Moreover, all resins stained with time. The degree of staining decreased in the sequence nanoparticulated, microhybrid, nanohybrid NH2 and NH1. Wine was the most aggressive drink, followed by coffee and acai berry. SEM and image analysis revealed significant porosity on the surface of the MH resin and relatively large pores on an NP sample. The NH2 resin was characterized by a homogeneous dispersion of particles and limited porosity. Finally, the NH1 resin showed the lowest porosity level. The results revealed that staining is likely related to the concentration of inorganic particles and to surface porosity.

Relevance:

80.00%

Publisher:

Abstract:

When a task must be executed in a remote or dangerous environment, teleoperation systems may be employed to extend the influence of the human operator. In the case of manipulation tasks, haptic feedback of the forces experienced by the remote (slave) system is often highly useful in improving an operator's ability to perform effectively. In many of these cases (especially teleoperation over the internet and ground-to-space teleoperation), substantial communication latency exists in the control loop and has the strong tendency to cause instability of the system. The first viable solution to this problem in the literature was based on a scattering/wave transformation from transmission line theory. This wave transformation requires the designer to select a wave impedance parameter appropriate to the teleoperation system. It is widely recognized that a small value of wave impedance is well suited to free motion and a large value is preferable for contact tasks. Beyond this basic observation, however, very little guidance exists in the literature regarding the selection of an appropriate value. Moreover, prior research on impedance selection generally fails to account for the fact that in any realistic contact task there will simultaneously exist contact considerations (perpendicular to the surface of contact) and quasi-free-motion considerations (parallel to the surface of contact). The primary contribution of the present work is to introduce an approximate linearized optimum for the choice of wave impedance and to apply this quasi-optimal choice to the Cartesian reality of such a contact task, in which it cannot be expected that a given joint will be either perfectly normal to or perfectly parallel to the motion constraint. The proposed scheme selects a wave impedance matrix that is appropriate to the conditions encountered by the manipulator. This choice may be implemented as a static wave impedance value or as a time-varying choice updated according to the instantaneous conditions encountered. A Lyapunov-like analysis is presented demonstrating that time variation in wave impedance will not violate the passivity of the system. Experimental trials, both in simulation and on a haptic feedback device, are presented validating the technique. Consideration is also given to the case of an uncertain environment, in which an a priori impedance choice may not be possible.
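
To illustrate the kind of direction-dependent impedance choice discussed above, the sketch below (my own construction with made-up gains, not the paper's quasi-optimal formula) builds a wave impedance matrix that is stiff along an assumed contact normal and soft in the tangential, quasi-free-motion directions, then encodes the forward wave variable.

```python
# Minimal sketch of a wave-variable transformation with a direction-dependent
# wave impedance matrix: large impedance along the estimated contact normal,
# small impedance tangentially. Gains are illustrative, not from the paper.
import numpy as np

def sqrtm_psd(M):
    """Matrix square root of a symmetric positive-definite matrix."""
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.T

def wave_impedance_matrix(contact_normal, b_contact=200.0, b_free=5.0):
    """Diagonal in the contact frame: stiff along the normal, soft tangentially."""
    n = contact_normal / np.linalg.norm(contact_normal)
    P = np.outer(n, n)                       # projector onto the contact normal
    return b_contact * P + b_free * (np.eye(3) - P)

def encode_wave(B, velocity, force):
    """Forward wave variable u = (2B)^(-1/2) (B v + F) for impedance matrix B."""
    B_inv_sqrt = np.linalg.inv(sqrtm_psd(2.0 * B))
    return B_inv_sqrt @ (B @ velocity + force)

# Example: contact normal along z, sliding tangentially while pushing on a surface
B = wave_impedance_matrix(np.array([0.0, 0.0, 1.0]))
u = encode_wave(B, velocity=np.array([0.05, 0.02, 0.0]),
                force=np.array([0.0, 0.0, 4.0]))
print(B.round(1), u.round(3), sep="\n")
```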

Relevance:

80.00%

Publisher:

Abstract:

Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
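
A minimal sketch of the two-step TS recipe under simplifying assumptions (daily data, a 365-day running mean and standard deviation as the transformation, annual block maxima, a 100-year return level); the tsEva toolbox cited above is the reference implementation, not this code.

```python
# Transformed-stationary sketch: (i) normalise by running mean/std,
# fit a stationary GEV to annual maxima of the normalised series,
# then (ii) back-transform into a time-varying return level.
import numpy as np
import pandas as pd
from scipy.stats import genextreme

def transformed_stationary_gev(series: pd.Series, window_days: int = 365):
    """Return the time-varying 100-year return level of a daily series."""
    mu_t = series.rolling(window_days, center=True, min_periods=30).mean()
    sd_t = series.rolling(window_days, center=True, min_periods=30).std()
    stationary = (series - mu_t) / sd_t                       # step (i): transform
    annual_max = stationary.groupby(stationary.index.year).max().dropna()
    shape, loc, scale = genextreme.fit(annual_max)            # constant-shape GEV
    z100 = genextreme.ppf(1 - 1 / 100, shape, loc, scale)     # stationary quantile
    return mu_t + sd_t * z100                                 # step (ii): back-transform

# Toy usage on synthetic data with a slow trend and a seasonal cycle
rng = np.random.default_rng(0)
idx = pd.date_range("1980-01-01", "2019-12-31", freq="D")
t = np.arange(len(idx))
x = 2 + 1e-4 * t + np.sin(2 * np.pi * t / 365.25) + rng.gamma(2, 0.3, len(idx))
rl100 = transformed_stationary_gev(pd.Series(x, index=idx))
print(rl100.dropna().iloc[[0, -1]])
```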

Relevance:

80.00%

Publisher:

Abstract:

Doctoral dissertation in Electronics and Computing, Faculdade de Ciências e Tecnologia, Univ. do Algarve, 2004.

Relevance:

80.00%

Publisher:

Abstract:

Aim

To investigate the association between self-rated health and affiliation with a primary care provider (PCP) in New Zealand.
Methods

We used data from a New Zealand panel study of 22,000 adults. The main exposure was self-rated health, and the main outcome measure was affiliation with a PCP. Fixed effects conditional logistic models were used to control for observed time-varying and unobserved time-invariant confounding.
Results

In any given wave, the odds of being affiliated with a PCP were higher for those in good and fair/poor health relative to those in excellent health. While affiliation for Europeans increased as reported health declined, the odds of being affiliated were lower for Māori respondents reporting very good or good health relative to those in excellent health. No significant differences in the association by age or gender were observed.
Conclusions

Our data support the hypothesis that those in poorer health are more likely to be affiliated with a PCP. Variations in affiliation for Māori could arise for several reasons, including differences in care-seeking behaviour and perceived need of care. It may also mean that the message about the benefits of primary health care is not getting through equally to all population groups.
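
As a hypothetical sketch of the fixed-effects (conditional) logit described in the Methods above, the code below assumes a long-format panel with illustrative column names (person_id, wave, affiliated, self_rated_health, plus time-varying covariates); these are not the names used in the New Zealand survey, and the covariate set is made up.

```python
# Conditional (fixed-effects) logit sketch: conditioning on person_id removes
# time-invariant confounding, so only within-person changes identify the effect.
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

def fit_affiliation_model(panel: pd.DataFrame):
    # Dummy-code self-rated health, dropping one category as the reference
    X = pd.get_dummies(panel["self_rated_health"], prefix="srh", drop_first=True)
    # Illustrative time-varying covariates (not from the study)
    X = pd.concat([X, panel[["employed", "household_income"]]], axis=1).astype(float)
    model = ConditionalLogit(panel["affiliated"], X, groups=panel["person_id"])
    return model.fit()

# result = fit_affiliation_model(panel_df)
# print(result.summary())  # exponentiate coefficients for within-person odds ratios
```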

Relevance:

80.00%

Publisher:

Abstract:

Acquiring a disability in adulthood is associated with a reduction in mental health, and access to secure and affordable housing is associated with better mental health. We hypothesised that the association between acquisition of disability and mental health is modified by housing tenure and affordability. We used twelve annual waves of data (2001-2012) (1,913 participants, 13,037 observations) from the Household, Income and Labour Dynamics in Australia survey. Eligible participants reported at least two consecutive waves of disability preceded by two consecutive waves without disability. Effect measure modification, on the additive scale, was tested in three fixed-effects linear regression models (which remove time-invariant confounding) that included a cross-product term between disability and prior housing circumstances: housing tenure by disability; housing affordability by disability; and, in a sub-sample (896 participants, 5,913 observations) with housing costs, tenure/affordability by disability. The outcome was the continuous mental component summary (MCS) of the SF-36. Models were adjusted for time-varying confounders. There was statistical evidence that prior housing modified the effect of disability acquisition on mental health. Our findings suggested that those in affordable housing had a -1.7 point deterioration in MCS (95% CI -2.1, -1.3) following disability acquisition and those in unaffordable housing had a -4.2 point reduction (95% CI -5.2, -1.4). Among people with housing costs, the largest declines in MCS were for people with unaffordable mortgages (-5.3, 95% CI -8.8, -1.9) and private renters in unaffordable housing (-4.0, 95% CI -6.3, -1.6), compared to a -1.4 point reduction (95% CI -2.1, -0.7) for mortgagors in affordable housing. In sum, we used causally robust fixed-effects regression and showed that deterioration in mental health following disability acquisition is modified by prior housing circumstances, with the largest negative associations found for those in unaffordable housing. Future research should test whether providing secure, affordable housing when people acquire a disability prevents deterioration in mental health.
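
As a rough illustration of the modelling strategy, the sketch below (with illustrative variable names, not the HILDA codebook) estimates a fixed-effects linear model of the mental component summary with a disability-by-prior-housing interaction, using within-person demeaning and cluster-robust standard errors.

```python
# Fixed-effects interaction sketch: the within transformation removes
# time-invariant confounding; the cross-product term captures effect
# measure modification by prior (un)affordable housing.
import pandas as pd
import statsmodels.api as sm

def fit_fe_interaction(df: pd.DataFrame):
    """df: long format with person_id, wave, mcs, disability (0/1),
    unaffordable_prior (0/1) and time-varying confounders (illustrative)."""
    df = df.copy()
    df["dis_x_unaff"] = df["disability"] * df["unaffordable_prior"]
    cols = ["mcs", "disability", "unaffordable_prior", "dis_x_unaff",
            "age", "household_income"]
    # Within transformation: subtract each person's mean over waves
    demeaned = df[cols] - df.groupby("person_id")[cols].transform("mean")
    X = sm.add_constant(demeaned.drop(columns="mcs"))
    return sm.OLS(demeaned["mcs"], X).fit(
        cov_type="cluster", cov_kwds={"groups": df["person_id"]})

# res = fit_fe_interaction(hilda_long)
# res.params["dis_x_unaff"] estimates how much larger the post-acquisition
# drop in MCS is when prior housing was unaffordable.
```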

Relevance:

80.00%

Publisher:

Abstract:

We examine stock return predictability for India and find strong evidence of sectoral return predictability over market return predictability. We show that mean-variance investors make statistically significant and economically meaningful profits by tracking financial ratios. For the first time in this literature, we examine the determinants of time-varying predictability and mean-variance profits. We show that both expected and unexpected shocks emanating from most financial ratios explain sectoral return predictability and profits. These are fresh contributions to the understanding of asset pricing.
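
For context, here is a back-of-envelope sketch of the standard recursive predictive-regression and mean-variance exercise this literature relies on (my illustration, not the paper's exact design): forecast next-month sector returns from a lagged financial ratio, then set the risky-asset weight proportional to the forecast scaled by risk aversion and an estimated variance.

```python
# Recursive out-of-sample forecasts and mean-variance weights for one sector;
# the expanding regression of r_{t+1} on ratio_t mimics the usual setup.
import numpy as np
import statsmodels.api as sm

def mean_variance_weights(returns, ratio, gamma=6.0, window=120):
    """returns, ratio: aligned 1-D arrays; gamma is risk aversion (assumed)."""
    weights = []
    for t in range(window, len(returns) - 1):
        X = sm.add_constant(ratio[:t])
        fit = sm.OLS(returns[1:t + 1], X).fit()           # r_{s+1} on ratio_s
        r_hat = fit.params @ np.array([1.0, ratio[t]])    # forecast for t+1
        sigma2 = np.var(returns[t - window + 1:t + 1], ddof=1)
        w = np.clip(r_hat / (gamma * sigma2), -0.5, 1.5)  # leverage bounds (assumed)
        weights.append(w)
    return np.array(weights)
```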

Relevance:

80.00%

Publisher:

Abstract:

This paper is concerned with the problem of stochastic stability analysis of discrete-time two-dimensional (2-D) Markovian jump systems (MJSs) described by the Roesser model with interval time-varying delays. The transition probabilities of the jumping process/Markov chain are assumed to be uncertain, that is, they are not exactly known but can be estimated. A Lyapunov-like scheme is first extended to 2-D MJSs with delays. Based on some novel 2-D summation inequalities proposed in this paper, delay-dependent stochastic stability conditions are derived in terms of linear matrix inequalities (LMIs) which can be computationally solved by various convex optimization algorithms. Finally, two numerical examples are given to illustrate the effectiveness of the obtained results.
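
For intuition only, the sketch below checks mean-square stability of a much simpler object, a plain 1-D discrete-time Markov jump linear system with an exactly known transition matrix and no delays, by solving a feasibility LMI with cvxpy; the paper's delay-dependent conditions for the 2-D Roesser model with uncertain transition probabilities are considerably more involved.

```python
# Simplified 1-D analogue: x_{k+1} = A_{r_k} x_k is mean-square stable if there
# exist P_i > 0 with A_i' (sum_j p_ij P_j) A_i - P_i < 0 for every mode i.
import cvxpy as cp
import numpy as np

def mjls_stable(A_modes, P_trans, eps=1e-6):
    """A_modes: list of (n,n) mode matrices; P_trans[i, j] = Pr(r_{k+1}=j | r_k=i)."""
    n, m = A_modes[0].shape[0], len(A_modes)
    P = [cp.Variable((n, n), symmetric=True) for _ in range(m)]
    cons = []
    for i, Ai in enumerate(A_modes):
        Ei = sum(P_trans[i, j] * P[j] for j in range(m))   # E[P_{r_{k+1}} | r_k = i]
        cons += [P[i] >> eps * np.eye(n),
                 Ai.T @ Ei @ Ai - P[i] << -eps * np.eye(n)]
    prob = cp.Problem(cp.Minimize(0), cons)                # pure feasibility problem
    prob.solve(solver=cp.SCS)
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

# Two-mode toy example with made-up dynamics and transition probabilities
A1 = np.array([[0.9, 0.2], [0.0, 0.7]])
A2 = np.array([[0.5, -0.3], [0.1, 0.8]])
print(mjls_stable([A1, A2], np.array([[0.8, 0.2], [0.3, 0.7]])))
```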

Relevance:

80.00%

Publisher:

Abstract:

Using high-frequency data, we decompose the time-varying beta for stocks into a beta for continuous systematic risk and a beta for discontinuous systematic risk. Estimated discontinuous betas for S&P 500 constituents between 2003 and 2011 generally exceed the corresponding continuous betas. We demonstrate how continuous and discontinuous betas decrease with portfolio diversification. Using an equiweighted broad market index, we assess the speed of convergence of continuous and discontinuous betas in portfolios of stocks as the number of holdings increases. We show that discontinuous risk dissipates faster with fewer stocks in a portfolio compared to its continuous counterpart.
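
One common recipe for such a continuous/discontinuous split, shown here as a stylised sketch (not necessarily the paper's exact estimators): flag intraday intervals where the market return exceeds a bipower-variation-based threshold as jumps, then compute realized betas separately on the jump and non-jump subsets. The threshold constant and the simulated data are illustrative.

```python
# Split a daily realized beta into continuous and jump components from
# intraday returns, using a simple threshold rule on the market return.
import numpy as np

def split_betas(r_stock, r_mkt, c=3.0):
    """r_stock, r_mkt: arrays of intraday returns for one day (same grid)."""
    n = len(r_mkt)
    # Bipower variation of the market: jump-robust estimate of diffusive variance
    bv = (np.pi / 2) * np.sum(np.abs(r_mkt[1:]) * np.abs(r_mkt[:-1]))
    threshold = c * np.sqrt(bv / n)                 # local jump-detection cutoff
    jump = np.abs(r_mkt) > threshold
    cont = ~jump
    beta_c = np.sum(r_stock[cont] * r_mkt[cont]) / np.sum(r_mkt[cont] ** 2)
    beta_d = (np.sum(r_stock[jump] * r_mkt[jump]) / np.sum(r_mkt[jump] ** 2)
              if jump.any() else np.nan)
    return beta_c, beta_d

# Toy example: diffusive co-movement plus one common jump at minute 200
rng = np.random.default_rng(1)
rm = rng.normal(0, 0.001, 390); rm[200] += 0.01
rs = 0.8 * rm + rng.normal(0, 0.0005, 390); rs[200] += 0.015
print(split_betas(rs, rm))
```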