942 results for equilibrium asset pricing models with latent variables
Abstract:
We study the incentive to invest to improve marriage prospects, in a frictionless marriage market with non-transferable utility. Stochastic returns to investment eliminate the multiplicity of equilibria in models with deterministic returns, and a unique equilibrium exists under reasonable conditions. Equilibrium investment is efficient when the sexes are symmetric. However, when there is any asymmetry, including an unbalanced sex ratio, investments are generically excessive. For example, if there is an excess of boys, then there is parental over-investment in boys and under-investment in girls, and total investment will be excessive.
Abstract:
Discretionary policymakers cannot manage private-sector expectations and cannot coordinate the actions of future policymakers. As a consequence, expectations traps and coordination failures can occur and multiple equilibria can arise. To utilize the explanatory power of models with multiple equilibria it is first necessary to understand how an economy arrives at a particular equilibrium. In this paper we employ notions of learnability and self-enforceability to motivate and identify equilibria of particular interest. Central among these criteria are whether the equilibrium is learnable by private agents and jointly learnable by private agents and the policymaker. We use two New Keynesian policy models to identify the strategic interactions that give rise to multiple equilibria and to illustrate our methods for identifying equilibria of interest. Importantly, unless the Pareto-preferred equilibrium is learnable by private agents, we find little reason to expect coordination on that equilibrium.
Abstract:
This paper revisits the argument that the stabilisation bias that arises under discretionary monetary policy can be reduced if policy is delegated to a policymaker with redesigned objectives. We study four delegation schemes: price level targeting, interest rate smoothing, speed limits and straight conservatism. These can all increase social welfare in models with a unique discretionary equilibrium. We investigate how these schemes perform in a model with capital accumulation where uniqueness does not necessarily apply. We discuss how multiplicity arises and demonstrate that no delegation scheme is able to eliminate all potential bad equilibria. Price level targeting has two interesting features. It can create a new equilibrium that is welfare dominated, but it can also alter equilibrium stability properties and make coordination on the best equilibrium more likely.
Abstract:
The framework presents how trading in the foreign commodity futures market and the forward exchange market can affect the optimal spot positions of domestic commodity producers and traders. It generalizes the models of Kawai and Zilcha (1986) and Kofman and Viaene (1991) to allow both intermediate and final commodities to be traded in the international and futures markets, and the exporters/importers to face production shocks, domestic factor costs and a random price. Applying mean-variance expected utility, we find that a rise in the expected exchange rate can raise both supply and demand for commodities and reduce domestic prices if the exchange rate elasticity of supply is greater than that of demand. Whether higher volatilities of the exchange rate and the foreign futures price reduce the optimal spot position of domestic traders depends on the correlation between the exchange rate and the foreign futures price. Even when the forward exchange market is unbiased and there is no correlation between commodity prices and exchange rates, the exchange rate can still affect domestic trading and prices through offshore hedging and international trade if traders are interested in their profit in domestic currency. The framework illustrates how world prices and foreign futures prices of commodities, and their volatility, can be transmitted to the domestic market, as well as the dynamic relationship between intermediate and final goods prices. Equilibrium prices depend on trader behaviour, i.e., on who does or does not trade in the foreign commodity futures and domestic forward currency markets. Empirical results from a two-stage least squares approach applied to Thai rice and rubber prices support the theoretical results.
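As a toy illustration of the mean-variance hedging logic in this abstract, the sketch below searches numerically for a domestic trader's optimal foreign futures position when profit is evaluated in domestic currency. All parameter values (risk aversion, endowment Q, futures price F, and the price and exchange-rate distributions) are illustrative assumptions, not taken from the paper; the point is only that the sign of the correlation between the exchange rate and the foreign price shifts the optimal hedge.

```python
import math
import random

random.seed(3)

GAMMA = 0.5   # mean-variance risk aversion (assumed)
Q = 10.0      # physical commodity endowment (assumed)
F = 100.0     # foreign futures price agreed today (assumed)

def draws(rho, n=20000):
    # Correlated draws of the exchange rate e and the foreign spot price p.
    sample = []
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        e = 1.0 + 0.1 * z1
        p = 100.0 + 10.0 * (rho * z1 + math.sqrt(1 - rho ** 2) * z2)
        sample.append((e, p))
    return sample

def optimal_hedge(sample):
    # Grid search for the futures position h maximizing E[pi] - GAMMA/2 * Var[pi],
    # where domestic-currency profit is pi = e * (Q * p + h * (F - p)).
    def objective(h):
        profits = [e * (Q * p + h * (F - p)) for e, p in sample]
        m = sum(profits) / len(profits)
        v = sum((x - m) ** 2 for x in profits) / len(profits)
        return m - 0.5 * GAMMA * v
    return max(range(-5, 31), key=objective)

h_pos = optimal_hedge(draws(+0.6))
h_neg = optimal_hedge(draws(-0.6))
print("optimal hedge with rho=+0.6:", h_pos, "with rho=-0.6:", h_neg)
```

With a positive correlation the futures contract also absorbs exchange-rate risk, so the optimal position is pushed above the naive full hedge h = Q; with a negative correlation it falls below it.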
Predicting random level and seasonality of hotel prices. A structural equation growth curve approach
Abstract:
This article examines the effect on price of different characteristics of holiday hotels in the sun-and-beach segment, from the hedonic function perspective. Monthly prices for the majority of hotels on the Spanish continental Mediterranean coast were gathered from May to October 1999 from tour operator catalogues. Hedonic functions are specified as random-effect models and parametrized as structural equation models with two latent variables: a random peak-season price and a random width of seasonal fluctuations. Characteristics of the hotel and of the region where it is located are used as predictors of both latent variables. Besides hotel category, the region, distance to the beach, availability of parking and room equipment have an effect on peak price and also on seasonality. 3-star hotels have the highest seasonality and hotels located in the southern regions the lowest, which could be explained by a warmer climate in autumn.
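A minimal sketch of the two-latent-variable idea, on simulated data: each hotel has a latent peak price and a latent seasonal width, monthly prices are generated from them, and a per-hotel regression on a seasonal index recovers both latents, which are then averaged by hotel category. All numbers (category effects, noise levels, the seasonal index itself) are invented for illustration and do not come from the article.

```python
import random

random.seed(0)

MONTHS = ["May", "Jun", "Jul", "Aug", "Sep", "Oct"]
# Assumed seasonal index: 0 at the August peak, growing toward the shoulder months.
SEASON = {"May": 3, "Jun": 2, "Jul": 1, "Aug": 0, "Sep": 1, "Oct": 2}

def simulate_hotel(stars):
    # Latent peak price and latent seasonal width (toy category effects).
    peak = 60 + 20 * stars + random.gauss(0, 5)
    width = 15 - 2 * stars + random.gauss(0, 1)
    return {m: peak - width * SEASON[m] + random.gauss(0, 2) for m in MONTHS}

def fit_latents(prices):
    # Per-hotel OLS of price on the seasonal index:
    # the intercept estimates the peak price, minus the slope the width.
    xs = [SEASON[m] for m in MONTHS]
    ys = [prices[m] for m in MONTHS]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    return ybar - slope * xbar, -slope

hotels = [(s, simulate_hotel(s)) for s in (3, 4) for _ in range(50)]
results = {}
for stars in (3, 4):
    fits = [fit_latents(p) for s, p in hotels if s == stars]
    results[stars] = (sum(f[0] for f in fits) / len(fits),
                      sum(f[1] for f in fits) / len(fits))
    print(stars, "stars: peak %.1f, width %.1f" % results[stars])
```

In this toy parameterization, higher-category hotels get a higher peak price and a smaller seasonal width, mirroring the qualitative pattern the article reports.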
Abstract:
First: A continuous-time version of Kyle's model (Kyle 1985) of asset pricing with asymmetric information, known as Back's model (Back 1992), is studied. A larger class of price processes and of noise trader processes is considered. The price process, as in Kyle's model, is allowed to depend on the path of the market order, and the noise traders' process is an inhomogeneous Lévy process. Solutions are found via the Hamilton-Jacobi-Bellman equations. When the insider is risk-neutral, the price pressure is constant and there is no equilibrium in the presence of jumps; when the insider is risk-averse, there is no equilibrium in the presence of either jumps or drifts. The case in which the release time of information is unknown is also analysed. A general relation is established between the problem of finding an equilibrium and the theory of enlargement of filtrations. A random announcement time is also considered; in that case the market is not fully efficient, and an equilibrium exists if the sensitivity of prices with respect to global demand decreases in time in accordance with the distribution of the random time. Second: Power variations. The asymptotic behaviour of the power variation of processes of the form ∫_0^t u(s-) dS(s) is considered, where S is an alpha-stable process with index of stability 0 < alpha < 2 and the integral is an Itô integral. Stable convergence of the corresponding fluctuations is established. These results provide statistical tools to infer the process u from discrete observations. Third: A bond market is studied in which short rates r(t) evolve as ∫ g(t-s) sigma(s) W(ds), where g and sigma are deterministic and W is a Wiener random measure. Processes of this type are particular cases of ambit processes and are in general not semimartingales.
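The power-variation statistic from the second part can be illustrated with a small Monte Carlo. The sketch below simulates X_t = ∫_0^t u(s-) dS(s) on a grid for a symmetric alpha-stable S (Chambers-Mallows-Stuck sampler) and computes the normalized p-th power variation; the integrand u(s) = 1 + s, alpha = 1.5, p = 0.5 and the grid size are all illustrative choices, not values from the thesis.

```python
import math
import random

random.seed(1)

def stable_increment(alpha, scale):
    # Chambers-Mallows-Stuck sampler for a symmetric alpha-stable variable.
    u = random.uniform(-math.pi / 2, math.pi / 2)
    w = random.expovariate(1.0)
    x = (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
         * (math.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))
    return scale * x

def normalized_power_variation(alpha, p, n):
    # X_t = int_0^t u(s-) dS(s) approximated by left-point sums u(t_i) dS_i,
    # with the (assumed) integrand u(s) = 1 + s on [0, 1].
    dt = 1.0 / n
    total = 0.0
    for i in range(n):
        u_t = 1.0 + i * dt
        dS = stable_increment(alpha, dt ** (1.0 / alpha))
        total += abs(u_t * dS) ** p
    # For p < alpha, n**(p/alpha - 1) * sum |dX_i|^p converges in probability
    # to m_p * int_0^1 |u(s)|^p ds, where m_p = E|S_1|^p.
    return n ** (p / alpha - 1) * total

pv = normalized_power_variation(alpha=1.5, p=0.5, n=20000)
print("normalized power variation: %.3f" % pv)
```

Under these choices the limit m_p ∫_0^1 (1+s)^p ds is roughly 1.32, so the realized statistic illustrates how u can be inferred from discrete observations of X.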
Abstract:
The sample dimension, types of variables, format used for measurement, and construction of instruments to collect valid and reliable data must be considered during the research process. In the social and health sciences, and more specifically in nursing, data-collection instruments are usually composed of latent variables or variables that cannot be directly observed. Such facts emphasize the importance of deciding how to measure study variables (using an ordinal scale or a Likert or Likert-type scale). Psychometric scales are examples of instruments that are affected by the type of variables that comprise them, which could cause problems with measurement and statistical analysis (parametric tests versus non-parametric tests). Hence, investigators using these variables must rely on suppositions based on simulation studies or recommendations based on scientific evidence in order to make the best decisions.
Abstract:
OBJECTIVE: To examine predictors of stroke recurrence in patients with a high vs a low likelihood of having an incidental patent foramen ovale (PFO) as defined by the Risk of Paradoxical Embolism (RoPE) score. METHODS: Patients in the RoPE database with cryptogenic stroke (CS) and PFO were classified as having a probable PFO-related stroke (RoPE score of >6, n = 647) and others (RoPE score of ≤6 points, n = 677). We tested 15 clinical, 5 radiologic, and 3 echocardiographic variables for associations with stroke recurrence using Cox survival models with component database as a stratification factor. An interaction with RoPE score was checked for the variables that were significant. RESULTS: Follow-up was available for 92%, 79%, and 57% at 1, 2, and 3 years. Overall, a higher recurrence risk was associated with an index TIA. For all other predictors, effects were significantly different in the 2 RoPE score categories. For the low RoPE score group, but not the high RoPE score group, older age and antiplatelet (vs warfarin) treatment predicted recurrence. Conversely, echocardiographic features (septal hypermobility and a small shunt) and a prior (clinical) stroke/TIA were significant predictors in the high but not low RoPE score group. CONCLUSION: Predictors of recurrence differ when PFO relatedness is classified by the RoPE score, suggesting that patients with CS and PFO form a heterogeneous group with different stroke mechanisms. Echocardiographic features were only associated with recurrence in the high RoPE score group.
Abstract:
This paper fills a gap in the existing literature on least squares learning in linear rational expectations models by studying a setup in which agents learn by fitting ARMA models to a subset of the state variables. This is a natural specification in models with private information because in the presence of hidden state variables, agents have an incentive to condition forecasts on the infinite past record of observables. We study a particular setting in which it suffices for agents to fit a first-order ARMA process, which preserves the tractability of a finite-dimensional parameterization, while permitting conditioning on the infinite past record. We describe how previous results (Marcet and Sargent [1989a, 1989b]) can be adapted to handle the convergence of estimators of an ARMA process in our self-referential environment. We also study "rates" of convergence analytically and via computer simulation.
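The estimation half of this setup can be sketched directly: a hidden AR(1) state observed with noise has an ARMA(1,1) reduced form, and an agent can fit it recursively by extended (pseudo-linear) least squares, using its own running innovation estimate as a regressor. The numbers below (rho = 0.8, unit noise variances) are arbitrary, and the sketch omits the paper's self-referential feedback from forecasts to the data-generating process.

```python
import random

random.seed(2)

RHO, SIG_W, SIG_V = 0.8, 1.0, 1.0
T = 50000

x = 0.0                    # hidden AR(1) state
y_prev, e_prev = 0.0, 0.0  # lagged observable and lagged innovation estimate
a, b = 0.0, 0.0            # ARMA(1,1) estimates: y_t ~ a*y_{t-1} + b*e_{t-1} + e_t
P = [[100.0, 0.0], [0.0, 100.0]]  # recursive least squares covariance

for _ in range(T):
    x = RHO * x + random.gauss(0, SIG_W)
    y = x + random.gauss(0, SIG_V)
    phi = (y_prev, e_prev)          # pseudo-linear regressors
    # Standard recursive least squares update.
    Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
            P[1][0] * phi[0] + P[1][1] * phi[1]]
    denom = 1.0 + phi[0] * Pphi[0] + phi[1] * Pphi[1]
    gain = [Pphi[0] / denom, Pphi[1] / denom]
    err = y - (a * phi[0] + b * phi[1])
    a += gain[0] * err
    b += gain[1] * err
    for i in range(2):
        for j in range(2):
            P[i][j] -= gain[i] * Pphi[j]
    e_prev = y - (a * phi[0] + b * phi[1])  # innovation under updated estimates
    y_prev = y

print("AR estimate: %.2f, MA estimate: %.2f" % (a, b))
```

For these parameters the reduced form has AR coefficient 0.8 and a negative MA coefficient of roughly -0.34, which the recursion approaches as T grows.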
Abstract:
Many revenue management (RM) industries are characterized by (a) fixed capacities in the short term (e.g., hotel rooms, seats on an airline flight), (b) homogeneous products (e.g., two airline flights between the same cities at similar times), and (c) customer purchasing decisions largely influenced by price. Competition in these industries is also very high even with just two or three direct competitors in a market. However, RM competition is not well understood and practically all known implementations of RM software and most published models of RM do not explicitly model competition. For this reason, there has been considerable recent interest and research activity to understand RM competition. In this paper we study price competition for an oligopoly in a dynamic setting, where each of the sellers has a fixed number of units available for sale over a fixed number of periods. Demand is stochastic, and depending on how it evolves, sellers may change their prices at any time. This reflects the fact that firms constantly, and almost costlessly, change their prices (alternately, allocations at a price in quantity-based RM), reacting either to updates in their estimates of market demand, competitor prices, or inventory levels. We first prove existence of a unique subgame-perfect equilibrium for a duopoly. In equilibrium, in each state sellers engage in Bertrand competition, so that the seller with the lowest reservation value ends up selling a unit at a price that is equal to the equilibrium reservation value of the competitor. This structure hence extends the marginal-value concept of bid-price control, used in many RM implementations, to a competitive model. In addition, we show that the seller with the lowest capacity sells all its units first. Furthermore, we extend the results transparently to n firms and perform a number of numerical comparative statics exploiting the uniqueness of the subgame-perfect equilibrium.
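The marginal-value structure of this equilibrium can be sketched by backward induction for a small duopoly. In each state with both sellers in stock, the seller with the lower reservation value (its marginal continuation value of one unit) sells at the rival's reservation value. The arrival probability, horizon, capacities and the monopoly price cap used once one seller stocks out are all invented for the example, and demand is simplified to at most one buyer per period who always purchases.

```python
from functools import lru_cache

LAM = 0.8     # per-period probability that one buyer arrives (assumed)
P_MAX = 10.0  # price a monopolist charges once the rival is sold out (assumed)

@lru_cache(maxsize=None)
def values(t, c1, c2):
    """Expected revenues (V1, V2) with t periods left and capacities (c1, c2)."""
    if t == 0 or (c1 == 0 and c2 == 0):
        return (0.0, 0.0)
    v_stay = values(t - 1, c1, c2)
    if c1 == 0:  # seller 2 is a monopolist
        v_sell = values(t - 1, c1, c2 - 1)
        return (LAM * v_sell[0] + (1 - LAM) * v_stay[0],
                LAM * (P_MAX + v_sell[1]) + (1 - LAM) * v_stay[1])
    if c2 == 0:  # seller 1 is a monopolist
        v_sell = values(t - 1, c1 - 1, c2)
        return (LAM * (P_MAX + v_sell[0]) + (1 - LAM) * v_stay[0],
                LAM * v_sell[1] + (1 - LAM) * v_stay[1])
    # Both in stock: Bertrand competition in marginal (reservation) values.
    r1 = v_stay[0] - values(t - 1, c1 - 1, c2)[0]
    r2 = v_stay[1] - values(t - 1, c1, c2 - 1)[1]
    if r1 <= r2:  # seller 1 undercuts and sells at seller 2's reservation value
        v_sell = values(t - 1, c1 - 1, c2)
        return (LAM * (r2 + v_sell[0]) + (1 - LAM) * v_stay[0],
                LAM * v_sell[1] + (1 - LAM) * v_stay[1])
    v_sell = values(t - 1, c1, c2 - 1)
    return (LAM * v_sell[0] + (1 - LAM) * v_stay[0],
            LAM * (r1 + v_sell[1]) + (1 - LAM) * v_stay[1])

v1, v2 = values(10, 2, 4)
print("V1 = %.2f, V2 = %.2f" % (v1, v2))
```

Pricing at the competitor's marginal value is exactly the bid-price logic the abstract describes, carried over to a competitive state space.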
Abstract:
The interpretation of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) is based on a 4-factor model, which is only partially compatible with the mainstream Cattell-Horn-Carroll (CHC) model of intelligence measurement. The structure of cognitive batteries is frequently analyzed via exploratory factor analysis and/or confirmatory factor analysis. With classical confirmatory factor analysis, almost all cross-loadings between latent variables and measures are fixed to zero in order to allow the model to be identified. However, inappropriate zero cross-loadings can contribute to poor model fit, distorted factors, and biased factor correlations; most importantly, they do not necessarily faithfully reflect theory. To deal with these methodological and theoretical limitations, we used a new statistical approach, Bayesian structural equation modeling (BSEM), with a sample of 249 French-speaking Swiss children (8-12 years). With BSEM, zero-fixed cross-loadings between latent variables and measures are replaced by approximate zeros, based on informative, small-variance priors. Results indicated that a direct hierarchical CHC-based model with 5 factors plus a general intelligence factor represented the structure of the WISC-IV better than did the 4-factor structure and the higher-order models. Because the direct hierarchical CHC model was more adequate, it was concluded that the general factor should be considered a breadth factor rather than a superordinate factor. Because it was possible to estimate the influence of each latent variable on the 15 subtest scores, BSEM improved the understanding of the structure of intelligence tests and the clinical interpretation of the subtest scores.
Abstract:
BACKGROUND: The outcome of Kaposi sarcoma varies. While many patients do well on highly active antiretroviral therapy, others have progressive disease and need chemotherapy. In order to predict which patients are at risk of unfavorable evolution, we established a prognostic score. METHOD: A survival analysis (Kaplan-Meier method; Cox proportional hazards models) of 144 patients with Kaposi sarcoma prospectively included in the Swiss HIV Cohort Study from January 1996 to December 2004 was conducted. OUTCOME ANALYZED: use of chemotherapy or death. VARIABLES ANALYZED: demographics, tumor staging [T0 or T1 (16)], CD4 cell counts and HIV-1 RNA concentration, human herpesvirus 8 (HHV8) DNA in plasma, and serological titers to latent and lytic antigens. RESULTS: Of 144 patients, 54 needed chemotherapy or died. In the univariate analysis, tumor stage T1, CD4 cell count below 200 cells/microl, positive HHV8 DNA and absence of antibodies against the HHV8 lytic antigen at the time of diagnosis were significantly associated with a bad outcome. Using multivariate analysis, the following variables were associated with an increased risk of unfavorable outcome: T1 [hazard ratio (HR) 5.22; 95% confidence interval (CI) 2.97-9.18], CD4 cell count below 200 cells/microl (HR 2.33; 95% CI 1.22-4.45) and positive HHV8 DNA (HR 2.14; 95% CI 1.79-2.85). We created a score with these variables ranging from 0 to 4: T1 stage counted for two points, CD4 cell count below 200 cells/microl for one point, and positive HHV8 viral load for one point. Each point increase was associated with an HR of 2.26 (95% CI 1.79-2.85). CONCLUSION: In the multivariate analysis, staging (T1), CD4 cell count (<200 cells/microl) and positive HHV8 DNA in plasma at the time of diagnosis predict evolution towards death or the need for chemotherapy.
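The score itself is simple arithmetic and can be written down directly from the abstract; the hazard-ratio helper below merely compounds the reported per-point HR of 2.26 and is an illustration, not a validated clinical tool.

```python
def kaposi_score(stage_t1, cd4_below_200, hhv8_dna_positive):
    """Prognostic score from the abstract: T1 stage counts two points,
    CD4 < 200 cells/microl and detectable HHV8 DNA one point each (range 0-4)."""
    return 2 * bool(stage_t1) + bool(cd4_below_200) + bool(hhv8_dna_positive)

def relative_hazard(score, hr_per_point=2.26):
    # Each extra point multiplies the hazard by ~2.26 (95% CI 1.79-2.85).
    return hr_per_point ** score

s = kaposi_score(stage_t1=True, cd4_below_200=True, hhv8_dna_positive=False)
print("score:", s, "relative hazard: %.2f" % relative_hazard(s))
```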
Abstract:
We perform an experiment on a pure coordination game with uncertainty about the payoffs. Our game is closely related to models that have been used in many macroeconomic and financial applications to solve problems of equilibrium indeterminacy. In our experiment each subject receives a noisy signal about the true payoffs. This game has a unique strategy profile that survives the iterative deletion of strictly dominated strategies (thus a unique Nash equilibrium). The equilibrium outcome coincides, on average, with the risk-dominant equilibrium outcome of the underlying coordination game. The behavior of the subjects converges to the theoretical prediction after enough experience has been gained. The data (and the comments) suggest that subjects do not apply through "a priori" reasoning the iterated deletion of dominated strategies. Instead, they adapt to the responses of other players. Thus, the length of the learning phase clearly varies for the different signals. We also test behavior in a game without uncertainty as a benchmark case. The game with uncertainty is inspired by the "global" games of Carlsson and Van Damme (1993).
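The risk-dominance logic the subjects converge to can be illustrated with a toy symmetric 2x2 coordination game (all payoffs invented): action A is payoff-dominant, B is risk-dominant, and a naive adaptive dynamic in which everyone best-responds to the current share of A-players settles on B from most starting beliefs.

```python
# Symmetric 2x2 coordination game with assumed payoffs:
# u(A, A) = a, u(A, B) = b, u(B, A) = c, u(B, B) = d.
a, b, c, d = 4.0, 0.0, 3.0, 2.0   # A is payoff-dominant, B is risk-dominant

def risk_dominant():
    # A is risk-dominant iff it is the best reply to a uniform (1/2, 1/2)
    # belief about the opponent, i.e. iff a + b > c + d.
    return "A" if a + b > c + d else "B"

def best_response(p_a):
    # Best reply to the belief that the opponent plays A with probability p_a.
    return "A" if p_a * a + (1 - p_a) * b > p_a * c + (1 - p_a) * d else "B"

def adaptive_play(p0, rounds=50):
    # Everyone best-responds to the current share of A-players; that choice
    # becomes next period's share. Play settles on one of the two equilibria.
    p = p0
    for _ in range(rounds):
        p = 1.0 if best_response(p) == "A" else 0.0
    return "A" if p == 1.0 else "B"

print(risk_dominant(), adaptive_play(0.5), adaptive_play(0.9))
```

Here A is the best reply only when the believed share of A-players exceeds 2/3, so play started from a diffuse belief drifts to the risk-dominant B even though (A, A) Pareto-dominates it.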
Abstract:
The paper proposes a numerical solution method for general equilibrium models with a continuum of heterogeneous agents, which combines elements of projection and of perturbation methods. The basic idea is to solve first for the stationary solution of the model, without aggregate shocks but with fully specified idiosyncratic shocks, and then to compute a first-order perturbation of the solution in the aggregate shocks. This approach makes it possible to include a high-dimensional representation of the cross-sectional distribution in the state vector. The method is applied to a model of household saving with uninsurable income risk and liquidity constraints. The model includes not only productivity shocks, but also shocks to redistributive taxation, which cause substantial short-run variation in the cross-sectional distribution of wealth. When those shocks are operative, it is shown that a solution method based on very few statistics of the distribution is not suitable, while the proposed method can solve the model with high accuracy, at least for the case of small aggregate shocks. Techniques to reduce the dimension of the state space, so that higher-order perturbations are feasible, are also discussed. Matlab programs to solve the model can be downloaded.
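The two-step logic — solve the stationary model without aggregate shocks, then take a first-order perturbation in the aggregate shock — can be shown schematically on a scalar toy equilibrium condition f(x, z) = 0, a stand-in for the full cross-sectional fixed point; the functional form below is invented for illustration.

```python
def f(x, z):
    # Toy equilibrium condition (assumed): residual of a fixed-point equation
    # in the endogenous variable x, with aggregate shock z.
    return x - (0.9 * x + z + 0.5) / (1.0 + 0.1 * x)

# Step 1: stationary solution with the aggregate shock shut down (z = 0).
x = 1.0
for _ in range(200):          # fixed-point iteration x <- x - f(x, 0)
    x -= f(x, 0.0)

# Step 2: first-order perturbation in z via the implicit function theorem:
# dx/dz = -f_z / f_x, with derivatives taken at the stationary solution.
eps = 1e-6
f_x = (f(x + eps, 0.0) - f(x - eps, 0.0)) / (2 * eps)
f_z = (f(x, eps) - f(x, -eps)) / (2 * eps)
x_z = -f_z / f_x
print("stationary x* = %.4f, first-order response dx/dz = %.4f" % (x, x_z))
```

In the paper the unknown is a whole policy function plus a high-dimensional cross-sectional distribution rather than a scalar, but the division of labor is the same: a nonlinear solve for the economy without aggregate shocks, followed by a linear perturbation in the aggregate shock.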
Abstract:
This paper shows that information effects per se are not responsible for the Giffen goods anomaly affecting competitive traders' demands in multi-asset, noisy rational expectations equilibrium models. The role that information plays in traders' strategies also matters. In a market with risk-averse, uninformed traders, informed agents have a dual motive for trading: speculation and market making. While speculation entails using prices to assess the effect of private signal error terms, market making requires employing them to disentangle noise traders' effects in traders' aggregate orders. In a correlated environment, this complicates a trader's signal-extraction problem and may generate upward-sloping demand curves. Assuming either (i) that competitive, risk-neutral market makers price the assets, or (ii) that the risk tolerance coefficient of uninformed traders grows without bound, removes the market-making component from informed traders' demands, rendering them well behaved in prices.