988 results for "typage dynamique"


Relevance: 10.00%

Abstract:

In the context of the institutional reforms introduced by the MAPTAM law, which provides for the creation of Métropoles and strengthens the Pôles Métropolitains, this collective volume takes a dynamic look at the long process of metropolisation of a vast territory. Through its history and identity, and through the successive stages of its planning and governance, the Lyon metropolitan area has constantly questioned itself in the face of the new challenges and transitions it confronts. The Lyon metropolitan experience thus emerges as a common good for facing today's uncertainties, and its narrative helps consolidate a territorial project that must be shared by its institutional and socio-economic actors and by more than 3 million inhabitants.

Relevance: 10.00%

Abstract:

Introduction: Lower limb (LL) fractures in children treated with cast immobilization cause a significant change in mobility, which is exacerbated in cases of obesity. The accelerometer is a scientifically validated tool for assessing a child's level of physical activity (PA), but it has never been studied in children recovering from an LL fracture. The aim of this work was to identify the problems raised by using an accelerometer to measure PA after a fracture requiring non-weight-bearing of the LL. An adaptation of post-traumatic rehabilitation according to BMI could then be proposed.

Method: Identification of children aged 8 to 15 years with a lower limb fracture, seen at the emergency department of the Hôpital de l'Enfance between October 2013 and May 2014 and requiring post-traumatic non-weight-bearing. Children with multiple trauma or a mental deficit were excluded. Patient data collected: age, weight, height, sex, mechanism of the accident, type of fracture, and treatment. Patients were offered an Actiwatch® Spectrum worn at the wrist and ankle during the non-weight-bearing remobilization period. The advantages and problems related to use of the device were identified during the first 30 days of the rehabilitation period.

Importance: The complete absence of studies on post-fracture mobility, the complexity of the problems related to non-weight-bearing walking, the constraints of cast immobilization, and the growing prevalence of pediatric obesity justify the search for a reliable means of quantifying the mobility of a non-weight-bearing child after LL trauma.

Results: Of 43 LL fractures treated at the HEL during the study period, 13 children were identified, of whom 1 was excluded for psychiatric illness, 1 refused to participate, 2 were transferred immediately, and 2 were not included for practical reasons. Seven boys aged 11 to 16 years agreed to wear the Actiwatch® for durations ranging from 7 to 27 days (mean 15). Median activity counts for 5 children: 171.79 ± 105.37 cpm (counts per minute) on day 1 and 219.48 ± 145.52 cpm on day 5. Median total activity counts over 24 h: 114,072 ± 44,791 cpm on day 1 and 234,452 ± 134,775 cpm on day 5. A dynamic of regained mobility was demonstrated, with maximum and minimum activity-count intensities for each child. The median sleep time of the 5 children was 716 ± 45.5 min. The problems encountered were mechanical (one Actiwatch® was defective) and practical (one was lost and returned late, one was worn intermittently, and one child had an allergic reaction to the wristband after 4 days of wear).

Conclusions: Compliance with wearing the Actiwatch® over the whole non-weight-bearing period was not optimal. The children's average mobility could be objectified through its dynamics and its maximum and minimum intensities, and was comparable to certain published studies. A difference is observable for overweight subjects. Each child's sleep duration suggests that the analgesia administered during treatment was sufficient. Used over a longer period and in a large group of children, this sensor would be a reliable and simple way to objectify the dynamics of the resumption of physical activity in these patients.

Study design: case observation.

Relevance: 10.00%

Abstract:

Latent variable models in finance originate both from asset pricing theory and time series analysis. These two strands of literature appeal to two different concepts of latent structures, which are both useful to reduce the dimension of a statistical model specified for a multivariate time series of asset prices. In the CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns, given a small number of state variables. In this paper, we use the concept of Stochastic Discount Factor (SDF) or pricing kernel as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of some state variables which summarize their dynamics. In beta pricing models, it is often said that only the factorial risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, a conditional independence between contemporaneous returns of a large number of assets, given a small number of factors, as in standard Factor Analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates and other state variables. We address the general issue of econometric specifications of dynamic asset pricing models, which cover the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals. We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role relative to the validity of standard CAPM-like stock pricing and preference-free option pricing.
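As a point of reference, the two pricing concepts being unified can be written compactly; this is a minimal sketch in standard SDF notation (the symbols m, F, λ and s are generic, not taken from the paper):

```latex
% Fundamental pricing equation: for any gross return R_{t+1},
% the SDF m_{t+1} satisfies E[m_{t+1} R_{t+1} | I_t] = 1.
% Beta pricing: the factors F_k span the SDF, with coefficients
% that are deterministic functions of the state variables s_t.
\begin{aligned}
  E\left[\, m_{t+1} R_{t+1} \mid I_t \,\right] &= 1, \\
  m_{t+1} &= \sum_{k=1}^{K} \lambda_k(s_t)\, F_{k,t+1}.
\end{aligned}
```

The cross-sectional dimension reduction then operates through the K factors, while the longitudinal one operates through the state variables s_t.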

Relevance: 10.00%

Abstract:

In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
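To make the split-sample idea concrete, here is a minimal sketch for the simplest case, a Gaussian zero-mean AR(1) (the function and variable names are mine, and the paper's procedures are far more general): conditioning on every other observation leaves the remaining interior points independent (intercalary independence), and they satisfy a two-sided regression to which standard t-inference applies exactly.

```python
import numpy as np
from scipy import stats

def two_sided_ar1_test(y, rho0):
    """Exact split-sample test of H0: rho = rho0 in a Gaussian AR(1).

    Conditionally on alternate observations, the remaining interior
    points are independent and satisfy the two-sided regression
        y_t = phi * (y_{t-1} + y_{t+1}) + u_t,  phi = rho / (1 + rho^2),
    with constant conditional variance, so an OLS t-test on phi is
    exact in finite samples.
    """
    y = np.asarray(y, dtype=float)
    t_kept = np.arange(1, len(y) - 1, 2)       # interior points kept
    x = y[t_kept - 1] + y[t_kept + 1]          # regressor: sum of neighbours
    z = y[t_kept]
    phi_hat = (x @ z) / (x @ x)                # OLS without intercept
    resid = z - phi_hat * x
    df = len(z) - 1
    se = np.sqrt(resid @ resid / df / (x @ x))
    phi0 = rho0 / (1.0 + rho0 ** 2)            # H0 mapped to the new model
    t_stat = (phi_hat - phi0) / se
    return t_stat, 2 * stats.t.sf(abs(t_stat), df)
```

The mapping from rho to phi is the "two-sided autoregression" form mentioned above; tests and confidence sets for rho follow by inverting the exact test over a grid of values of rho0.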

Relevance: 10.00%

Abstract:

In this paper, we characterize the asymmetries of the smile through multiple leverage effects in a stochastic dynamic asset pricing framework. The dependence between price movements and future volatility is introduced through a set of latent state variables. These latent variables can capture not only the volatility risk and the interest rate risk which potentially affect option prices, but also any kind of correlation risk and jump risk. The standard financial leverage effect is produced by a cross-correlation effect between the state variables which enter into the stochastic volatility process of the stock price and the stock price process itself. However, we provide a more general framework where asymmetric implied volatility curves result from any source of instantaneous correlation between the state variables and either the return on the stock or the stochastic discount factor. In order to draw the shapes of the implied volatility curves generated by a model with latent variables, we specify an equilibrium-based stochastic discount factor with time non-separable preferences. When we calibrate this model to empirically reasonable values of the parameters, we are able to reproduce the various types of implied volatility curves inferred from option market data.
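For concreteness, the standard single leverage effect referred to above is usually written as an instantaneous correlation between return and variance innovations (generic notation; this is only a special case of the paper's multi-variable framework):

```latex
% Stochastic volatility with a leverage correlation rho:
\begin{aligned}
  \frac{dS_t}{S_t} &= \mu_t\, dt + \sqrt{V_t}\, dW_t^{S}, \\
  dV_t &= \kappa(\theta - V_t)\, dt + \sigma_v \sqrt{V_t}\, dW_t^{V},
  \qquad dW_t^{S}\, dW_t^{V} = \rho\, dt .
\end{aligned}
```

A negative ρ tilts the implied volatility curve downward; the paper's more general point is that a similar asymmetry arises from any instantaneous correlation between the state variables and either the return or the stochastic discount factor.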

Relevance: 10.00%

Abstract:

This paper assesses the empirical performance of an intertemporal option pricing model with latent variables which generalizes the Hull-White stochastic volatility formula. Using this generalized formula in an ad hoc fashion to extract two implicit parameters and forecast next-day S&P 500 option prices, we obtain pricing errors similar to those obtained with implied volatility alone, as in the Hull-White case. When we specialize this model to an equilibrium recursive utility model, we show through simulations that option prices are more informative than stock prices about the structural parameters of the model. We also show that a simple method of moments with a panel of option prices provides good estimates of the parameters of the model. This lays the groundwork for an empirical assessment of this equilibrium model with S&P 500 option prices in terms of pricing errors.
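The Hull-White logic that the paper generalizes can be sketched numerically: when volatility risk is independent of the return innovations, the option price is the expectation of the Black-Scholes price evaluated at the mean variance over the option's life. A minimal Monte Carlo illustration (the square-root variance dynamics and all parameter names are assumptions, not the paper's specification):

```python
import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Standard Black-Scholes European call price (vectorized in sigma)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def hull_white_call(S, K, T, r, v0, kappa, theta, xi,
                    n_paths=20000, n_steps=100):
    """Hull-White style price: average the Black-Scholes price over
    simulated mean-variance paths (volatility independent of returns)."""
    dt = T / n_steps
    rng = np.random.default_rng(0)
    v = np.full(n_paths, v0)
    avg_var = np.zeros(n_paths)
    for _ in range(n_steps):
        # square-root variance dynamics, reflected at zero
        v = np.abs(v + kappa * (theta - v) * dt
                   + xi * np.sqrt(v * dt) * rng.standard_normal(n_paths))
        avg_var += v * dt
    sigma_bar = np.sqrt(avg_var / T)   # volatility averaged over each path
    return black_scholes_call(S, K, T, r, sigma_bar).mean()
```

The generalized formula in the paper adds latent state variables beyond the variance path, which is what allows two implicit parameters to be extracted rather than a single implied volatility.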

Relevance: 10.00%

Abstract:

In this paper, we analyze recent developments in econometrics in the light of the theory of statistical tests. We first review some basic principles of the philosophy of science and of statistical theory, emphasizing parsimony and falsifiability as criteria for evaluating models, the role of test theory as a formalization of the falsification principle for probabilistic models, and the logical foundation of the basic notions of test theory (such as the level of a test). We then show that some of the most widely used statistical and econometric methods are fundamentally inappropriate for the problems and models considered, while many hypotheses for which testing procedures are commonly proposed are not in fact testable at all. Such situations lead to ill-posed statistical problems. We analyze several particular cases of such problems: (1) the construction of confidence intervals in structural models that raise identification problems; (2) the construction of tests for nonparametric hypotheses, including procedures robust to heteroskedasticity, non-normality, or dynamic specification. We point out that these difficulties often stem from the ambition to weaken the regularity conditions required for any statistical analysis, as well as from an inappropriate use of asymptotic distributional results. Finally, we stress the importance of formulating testable hypotheses and models, and of proposing econometric techniques whose properties can be established in finite samples.
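For reference, the notion of the level of a test invoked above has a precise meaning (textbook notation): a test with rejection region W has level α when

```latex
\sup_{\theta \in \Theta_0} P_{\theta}(W) \;\le\; \alpha ,
```

i.e. the probability of rejecting the null hypothesis never exceeds α under any distribution compatible with it. Roughly speaking, the non-testable hypotheses discussed above are those for which no test can satisfy this requirement while retaining non-trivial power.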

Relevance: 10.00%

Abstract:

This paper analyzes the dynamics of wages and workers' mobility within firms with a hierarchical structure of job levels. The theoretical model proposed by Gibbons and Waldman (1999), which combines the notions of human capital accumulation, job rank assignments based on comparative advantage, and learning about workers' abilities, is implemented empirically to measure the importance of these elements in explaining the wage policy of firms. Survey data from the GSOEP (German Socio-Economic Panel) are used to draw conclusions about the common features of firms' wage policies across a large sample of firms. The GSOEP survey also provides information on the worker's rank within his firm, which is usually not available in other surveys. The results are consistent with non-random selection of workers onto the rungs of a job ladder. There is no direct evidence of learning about workers' unobserved abilities, but the analysis reveals that unmeasured ability is an important factor driving wage dynamics. Finally, job rank effects remain significant even after controlling for measured and unmeasured characteristics.
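To indicate the structure being estimated, the Gibbons-Waldman setup can be sketched roughly as follows (my notation, recalled from the 1999 paper, so details may differ):

```latex
% Effective ability grows with labor-market experience x_it,
% and output on job level j is linear in effective ability:
\eta_{it} = \theta_i\, f(x_{it}), \quad f' > 0, \qquad
y_{ijt} = d_j + c_j \left( \eta_{it} + \varepsilon_{ijt} \right),
```

with workers assigned to the level that maximizes expected output given current beliefs about θ_i; learning updates those beliefs over time and generates promotion cutoffs in effective ability η.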

Relevance: 10.00%

Abstract:

Using a real business cycle model, this study seeks to explain, endogenously, the fluctuations of the terms of trade in Côte d'Ivoire. To do so, we mainly address the following two questions: are supply and demand shocks on the export market sufficient to explain the variations in the terms of trade? And what is their relative importance in the dynamics of the terms of trade? The results show that the two shocks considered account well for the volatility of the terms of trade. We also find that these two sources of impulses have a significant impact on economic fluctuations in Côte d'Ivoire.
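For reference, the terms of trade analyzed here are the standard ratio of export to import price indices:

```latex
\mathit{TOT}_t = \frac{P^{x}_t}{P^{m}_t},
```

where P^x_t and P^m_t are the export and import price indices, so export-market supply and demand shocks move the terms of trade through P^x_t.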

Relevance: 10.00%

Abstract:

This paper extends the Competitive Storage Model by incorporating prominent features of the production process and financial markets. A major limitation of this basic model is that it cannot successfully explain the degree of serial correlation observed in actual data. The proposed extensions build on the observation that in order to generate a high degree of price persistence, a model must incorporate features such that agents are willing to hold stocks more often than predicted by the basic model. We therefore allow unique characteristics of the production and trading mechanisms to provide the required incentives. Specifically, the proposed models introduce (i) gestation lags in production with heteroskedastic supply shocks, (ii) multiperiod forward contracts, and (iii) a convenience return to inventory holding. The rational expectations solution is solved numerically for twelve commodities. Simulations are then employed to assess the effects of the above extensions on the time series properties of commodity prices. Results indicate that each of the features above partially accounts for the persistence and occasional spikes observed in actual data. Evidence is presented that the precautionary demand for stocks might play a substantial role in the dynamics of commodity prices.
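The basic model that these extensions modify rests on a no-arbitrage condition for speculative stocks; in standard competitive-storage notation (not taken from the paper), with price p_t, stocks x_t, unit storage cost k and discount factor β:

```latex
\begin{aligned}
  \beta\, E_t[p_{t+1}] - p_t - k &\le 0, \\
  x_t \left( \beta\, E_t[p_{t+1}] - p_t - k \right) &= 0, \qquad x_t \ge 0 ,
\end{aligned}
```

so stocks are held only when the expected discounted price covers the current price plus storage costs. Gestation lags, forward contracts and a convenience return each relax this condition in ways that make holding stocks attractive more often, which is the mechanism behind the added price persistence.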

Relevance: 10.00%

Abstract:

We characterize the solution to a model of consumption smoothing using financing under non-commitment and savings. We show that, under certain conditions, these two different instruments complement each other perfectly. If the rate of time preference is equal to the interest rate on savings, perfect smoothing can be achieved in finite time. We also show that, when random revenues are generated by periodic investments in capital through a concave production function, the level of smoothing achieved through financial contracts can influence the productive investment efficiency. As long as financial contracts cannot achieve perfect smoothing, productive investment will be used as a complementary smoothing device.
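The "equal rates" result can be read through the standard Euler-equation logic (generic notation; the paper's non-commitment constraints are not shown here):

```latex
u'(c_t) = \beta (1+r)\, E_t\!\left[u'(c_{t+1})\right],
\qquad
\beta(1+r) = 1 \;\Longrightarrow\; u'(c_t) = E_t\!\left[u'(c_{t+1})\right],
```

so marginal utility becomes a martingale, which is consistent with consumption settling at a perfectly smooth level in finite time as stated above.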

Relevance: 10.00%

Abstract:

In a recent paper, Bai and Perron (1998) considered theoretical issues related to the limiting distribution of estimators and test statistics in the linear model with multiple structural changes. In this companion paper, we consider practical issues for the empirical applications of the procedures. We first address the problem of estimation of the break dates and present an efficient algorithm to obtain global minimizers of the sum of squared residuals. This algorithm is based on the principle of dynamic programming and requires at most least-squares operations of order O(T²) for any number of breaks. Our method can be applied to both pure and partial structural-change models. Second, we consider the problem of forming confidence intervals for the break dates under various hypotheses about the structure of the data and the errors across segments. Third, we address the issue of testing for structural changes under very general conditions on the data and the errors. Fourth, we address the issue of estimating the number of breaks. We present simulation results pertaining to the behavior of the estimators and tests in finite samples. Finally, a few empirical applications are presented to illustrate the usefulness of the procedures. All methods discussed are implemented in a GAUSS program available upon request for non-profit academic use.
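To illustrate the dynamic-programming idea, here is a minimal sketch for the simplest pure structural-change case, a model with breaks in the mean only (the function names and the mean-shift restriction are mine; the paper's algorithm handles general regressors): segment SSRs are computed once, then optimal partitions are built up recursively.

```python
import numpy as np

def bai_perron_breaks(y, m, h):
    """Global SSR-minimizing break dates for a mean-shift model.

    m = number of breaks, h = minimum segment length.
    """
    T = len(y)
    # ssr[i, j] = SSR of segment y[i:j] around its own mean (inf if too short)
    ssr = np.full((T + 1, T + 1), np.inf)
    for i in range(T):
        s = s2 = 0.0
        for j in range(i + 1, T + 1):
            s += y[j - 1]
            s2 += y[j - 1] ** 2
            if j - i >= h:
                ssr[i, j] = s2 - s * s / (j - i)
    # cost[k][t] = minimal SSR when fitting k+1 segments to y[0:t]
    cost = [ssr[0].copy()]
    cut = []
    for k in range(1, m + 1):
        ck = np.full(T + 1, np.inf)
        bk = np.zeros(T + 1, dtype=int)
        for t in range(T + 1):
            # choose the last break b to minimize cost of k segments + last one
            cand = cost[k - 1][: t + 1] + ssr[: t + 1, t]
            bk[t] = int(np.argmin(cand))
            ck[t] = cand[bk[t]]
        cost.append(ck)
        cut.append(bk)
    # backtrack the optimal break dates from the end of the sample
    breaks, t = [], T
    for k in range(m - 1, -1, -1):
        t = int(cut[k][t])
        breaks.append(t)
    return sorted(breaks)
```

The segment-SSR precomputation and each recursion level involve O(T²) work, consistent with the operation count cited above.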

Relevance: 10.00%

Abstract:

The GARCH and Stochastic Volatility paradigms are often brought into conflict as two competing views of the appropriate conditional variance concept: conditional variance given past values of the same series, or conditional variance given a larger past information set (possibly including unobservable state variables). The main thesis of this paper is that, since in general the econometrician has no idea about something like a structural level of disaggregation, a well-written volatility model should be specified in such a way that one is always allowed to reduce the information set without invalidating the model. In this respect, the debate between observable past information (in the GARCH spirit) and unobservable conditioning information (in the state-space spirit) is irrelevant. In this paper, we stress a square-root autoregressive stochastic volatility (SR-SARV) model which remains true to the GARCH paradigm of ARMA dynamics for squared innovations but weakens the GARCH structure in order to obtain the required robustness properties with respect to various kinds of aggregation. It is shown that the lack of robustness of the usual GARCH setting is due to two very restrictive assumptions: perfect linear correlation between squared innovations and the conditional variance on the one hand, and a linear relationship between the conditional variance of the future conditional variance and the squared conditional variance on the other. By relaxing these assumptions, thanks to a state-space setting, we obtain aggregation results without giving up the conditional variance concept (and related leverage effects), as is the case for the recently suggested weak GARCH model, which obtains aggregation results by replacing conditional expectations with linear projections on symmetric past innovations. Moreover, unlike the weak GARCH literature, we are able to define multivariate models, including higher-order dynamics and risk premiums (in the spirit of GARCH(p,p) and GARCH-in-mean), and to derive conditional moment restrictions well suited for statistical inference. Finally, we are able to characterize the exact relationships between our SR-SARV models (including higher-order dynamics, leverage effects and in-mean effects), usual GARCH models and continuous-time stochastic volatility models, so that previous results on aggregation of weak GARCH and on continuous-time GARCH modeling can be recovered in our framework.
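The weakened structure can be stated compactly; a minimal sketch of an SR-SARV(1)-type specification (my notation, following the usual presentation of such models): given an information set J_t that may contain unobservable state variables, the innovation ε and its conditional variance process f satisfy

```latex
\begin{aligned}
  E[\varepsilon_{t+1} \mid J_t] &= 0, &
  E[\varepsilon_{t+1}^2 \mid J_t] &= f_t, \\
  E[f_{t+1} \mid J_t] &= \omega + \gamma f_t, &
  0 \le \gamma &< 1 .
\end{aligned}
```

Squared innovations then inherit ARMA-type dynamics, as in GARCH, but the affine restriction is placed only on conditional expectations, which is, roughly, what survives when the information set is reduced.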

Relevance: 10.00%

Abstract:

This paper studies the transition between exchange rate regimes using a Markov chain model with time-varying transition probabilities. The probabilities are parameterized as nonlinear functions of variables suggested by the currency crisis and optimal currency area literature. Results using annual data indicate that inflation, and to a lesser extent, output growth and trade openness help explain the exchange rate regime transition dynamics.
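A typical way to parameterize such time-varying transition probabilities for a two-regime chain is a logistic link in the covariates named above; a minimal sketch (the two-regime restriction, the logistic link and all names are assumptions, not necessarily the paper's exact specification):

```python
import numpy as np

def transition_matrix(x_t, beta_stay_fix, beta_stay_float):
    """2x2 regime-transition matrix at time t.

    x_t stacks a constant and covariates such as inflation, output
    growth and trade openness; each 'stay' probability is a logistic
    function of the covariates.
    """
    logistic = lambda z: 1.0 / (1.0 + np.exp(-z))
    p_ff = logistic(x_t @ beta_stay_fix)      # P(fixed  -> fixed)
    p_ll = logistic(x_t @ beta_stay_float)    # P(float  -> float)
    return np.array([[p_ff, 1.0 - p_ff],
                     [1.0 - p_ll, p_ll]])
```

The coefficient vectors are then estimated by maximizing the likelihood of the observed regime sequence, with each row of the matrix automatically summing to one.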

Relevance: 10.00%

Abstract:

This paper employs the one-sector Real Business Cycle model as a testing ground for four different procedures to estimate Dynamic Stochastic General Equilibrium (DSGE) models. The procedures are: 1) Maximum Likelihood, with and without measurement errors and incorporating Bayesian priors; 2) Generalized Method of Moments; 3) Simulated Method of Moments; and 4) Indirect Inference. Monte Carlo analysis indicates that all procedures deliver reasonably good estimates under the null hypothesis. However, there are substantial differences in statistical and computational efficiency in the small samples currently available to estimate DSGE models. GMM and SMM appear to be more robust to misspecification than the alternative procedures. The implications of the stochastic singularity of DSGE models for each estimation method are fully discussed.
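As one illustration, the simulated method of moments step (procedure 3) has this generic shape (a sketch only; `simulate_model` is a hypothetical user-supplied DSGE simulator, and the moment vector is an arbitrary example):

```python
import numpy as np
from scipy.optimize import minimize

def smm_estimate(data_moments, simulate_model, W, theta0, n_sim=10):
    """Simulated Method of Moments sketch.

    simulate_model(theta, seed) returns a simulated series from the
    model at parameters theta; simulated moments are averaged over
    n_sim replications and matched to data_moments under the
    weighting matrix W.
    """
    def moments(y):
        # example moment vector: mean, variance, first autocorrelation
        y = np.asarray(y)
        return np.array([y.mean(), y.var(),
                         np.corrcoef(y[:-1], y[1:])[0, 1]])

    def objective(theta):
        sim = np.mean([moments(simulate_model(theta, seed=s))
                       for s in range(n_sim)], axis=0)
        g = sim - data_moments
        return g @ W @ g

    return minimize(objective, theta0, method="Nelder-Mead")
```

Indirect inference has the same shape, with the parameters of an auxiliary model playing the role of the raw moments.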