9 results for Empirical Models

at Duke University

Relevance: 30.00%

Abstract:

Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90. © 1995.
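The estimator's core idea, forcing moments of simulated model output to match their observed counterparts under a GMM-style criterion, can be illustrated with a deliberately simplified simulated-moments sketch. This is a toy stand-in, not the paper's score-matching estimator: the structural model, the target moment, the seed, and the grid are all invented for illustration.

```python
import random

def simulate(theta, n, seed):
    """Toy structural model: data are theta times standard normal shocks."""
    rng = random.Random(seed)
    return [theta * rng.gauss(0, 1) for _ in range(n)]

def smm_objective(theta, target_second_moment, n_sim=5000):
    """Squared distance between a simulated second moment and its
    observed-data counterpart (a one-moment GMM criterion)."""
    sims = simulate(theta, n_sim, seed=42)   # common random numbers across theta
    m = sum(v * v for v in sims) / n_sim
    return (m - target_second_moment) ** 2

# Pretend the observed data have second moment 4.0; search a parameter grid.
observed_second_moment = 4.0
grid = [i * 0.05 for i in range(1, 80)]
theta_hat = min(grid, key=lambda t: smm_objective(t, observed_second_moment))
```

Because the simulated shocks are held fixed across candidate parameters, the objective is smooth in `theta` and the grid minimizer lands near the value whose implied second moment equals the observed one.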

Relevance: 30.00%

Abstract:

This paper considers forecasting the conditional mean and variance from a single-equation dynamic model with autocorrelated disturbances following an ARMA process, and innovations with time-dependent conditional heteroskedasticity as represented by a linear GARCH process. Expressions for the minimum MSE predictor and the conditional MSE are presented. We also derive the formula for all the theoretical moments of the prediction error distribution from a general dynamic model with GARCH(1, 1) innovations. These results are then used in the construction of ex ante prediction confidence intervals by means of the Cornish-Fisher asymptotic expansion. An empirical example relating to the uncertainty of the expected depreciation of foreign exchange rates illustrates the usefulness of the results. © 1992.
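The two building blocks of such intervals, the GARCH(1,1) multi-step variance forecast and the Cornish-Fisher adjustment of a Gaussian quantile, can be sketched as follows. Parameter values are illustrative, not the paper's exchange-rate estimates.

```python
import math

def garch_var_forecast(omega, alpha, beta, sigma2_next, h):
    """h-step-ahead conditional variance for a GARCH(1,1) process:
    the forecast reverts geometrically, at rate (alpha + beta), from
    the one-step value toward the unconditional variance."""
    long_run = omega / (1.0 - alpha - beta)
    return long_run + (alpha + beta) ** (h - 1) * (sigma2_next - long_run)

def cornish_fisher_quantile(z, skew, excess_kurt):
    """Cornish-Fisher expansion: adjust a Gaussian quantile z for the
    skewness and excess kurtosis of the prediction error distribution."""
    return (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * excess_kurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)

# Illustrative values (hypothetical, not estimated from data)
omega, alpha, beta = 0.05, 0.10, 0.85
sigma2_next = 0.8                    # one-step-ahead conditional variance
var_10 = garch_var_forecast(omega, alpha, beta, sigma2_next, 10)
q = cornish_fisher_quantile(1.645, skew=-0.3, excess_kurt=1.2)
half_width = q * math.sqrt(var_10)   # one side of an ex ante 90% interval
```

With zero skewness and excess kurtosis the adjusted quantile collapses back to the Gaussian one, which is a useful sanity check on the expansion.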

Relevance: 30.00%

Abstract:

While there is growing interest in measuring the size and scope of local spillovers, it is well understood that such spillovers cannot be distinguished from unobservable local attributes using solely the observed location decisions of individuals or firms. We propose an empirical strategy for recovering estimates of spillovers in the presence of unobserved local attributes for a broadly applicable class of equilibrium sorting models. Our approach relies on an IV strategy derived from the internal logic of the sorting model itself. We show practically how the strategy is implemented, provide intuition for our instruments, discuss the role of effective choice-set variation in identifying the model, and carry out a series of Monte Carlo simulations to demonstrate performance in small samples. © 2007 The Author(s). Journal compilation Royal Economic Society 2007.
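The basic logic, an endogenous regressor correlated with an unobserved local attribute, corrected by an instrument that shifts choices but is independent of the attribute, can be shown in a minimal Monte Carlo draw. The data-generating process and instrument here are invented for illustration and are not the paper's sorting-model instruments.

```python
import random

def slope(y, x, w):
    """Ratio-of-covariances slope estimator: cov(w, y) / cov(w, x).
    Setting w = x gives OLS; setting w = an instrument gives IV."""
    n = len(y)
    mw, mx, my = sum(w) / n, sum(x) / n, sum(y) / n
    cwy = sum((wi - mw) * (yi - my) for wi, yi in zip(w, y))
    cwx = sum((wi - mw) * (xi - mx) for wi, xi in zip(w, x))
    return cwy / cwx

rng = random.Random(0)
n, beta = 20000, 2.0
u = [rng.gauss(0, 1) for _ in range(n)]                   # unobserved local attribute
z = [rng.gauss(0, 1) for _ in range(n)]                   # excluded instrument
x = [zi + ui + rng.gauss(0, 1) for zi, ui in zip(z, u)]   # endogenous regressor
y = [beta * xi + ui + rng.gauss(0, 1) for xi, ui in zip(x, u)]
b_ols = slope(y, x, x)   # biased upward: x is correlated with u
b_iv = slope(y, x, z)    # consistent: z shifts x but is independent of u
```

In this design OLS converges to roughly 2.33 rather than the true 2.0, while the IV estimate recovers the spillover-free slope.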

Relevance: 30.00%

Abstract:

Gaussian factor models have proven widely useful for parsimoniously characterizing dependence in multivariate data. There is a rich literature on their extension to mixed categorical and continuous variables, using latent Gaussian variables or through generalized latent trait models accommodating measurements in the exponential family. However, when generalizing to non-Gaussian measured variables the latent variables typically influence both the dependence structure and the form of the marginal distributions, complicating interpretation and introducing artifacts. To address this problem we propose a novel class of Bayesian Gaussian copula factor models which decouple the latent factors from the marginal distributions. A semiparametric specification for the marginals based on the extended rank likelihood yields straightforward implementation and substantial computational gains. We provide new theoretical and empirical justifications for using this likelihood in Bayesian inference. We propose new default priors for the factor loadings and develop efficient parameter-expanded Gibbs sampling for posterior computation. The methods are evaluated through simulations and applied to a dataset in political science. The models in this paper are implemented in the R package bfa.
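The key invariance exploited by the extended rank likelihood, that only the ordering of each measured variable matters, can be illustrated with a simple rank-to-normal-scores transform. This is a deterministic sketch of the idea, not the bfa package's Gibbs sampler, and it ignores ties for simplicity.

```python
import math
from statistics import NormalDist

def normal_scores(x):
    """Map observations to the latent Gaussian scale through their ranks.
    Only the ordering of x enters, so the marginal distribution of x
    never has to be modeled -- the core of the extended rank likelihood."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    z = [0.0] * n
    nd = NormalDist()
    for rank, i in enumerate(order, start=1):
        z[i] = nd.inv_cdf(rank / (n + 1))   # inverse normal CDF of scaled rank
    return z

# A heavily skewed margin and a monotone transform of it give identical scores.
skewed = [math.exp(v) for v in (-2.0, -1.0, 0.0, 1.0, 2.0, 3.0)]
shifted = [10.0 + 0.5 * v for v in skewed]
z1, z2 = normal_scores(skewed), normal_scores(shifted)
```

Because the scores depend only on ranks, any monotone change to a variable's margin leaves the latent Gaussian scale, and hence the copula dependence structure, untouched.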

Relevance: 30.00%

Abstract:

The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much public and scientific attention. For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible unforced states of the climate system (the Envelope of Unforced Noise; EUN). Typically, the EUN is derived from climate models themselves, but climate models might not accurately simulate the correct characteristics of unforced GMT variability. Here, we simulate a new, empirical EUN that is based on instrumental and reconstructed surface temperature records. We compare the forced GMT signal produced by climate models to observations while noting the range of GMT values provided by the empirical EUN. We find that the empirical EUN is wide enough so that the interdecadal variability in the rate of global warming over the 20th century does not necessarily require corresponding variability in the rate-of-increase of the forced signal. The empirical EUN also indicates that the reduced GMT warming over the past decade or so is still consistent with a middle emission scenario's forced signal, but is likely inconsistent with the steepest emission scenario's forced signal.
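One simple way to build an empirical envelope of unforced variability is to block-bootstrap detrended temperature residuals and take quantiles of the resampled means, as in this toy sketch. The residuals and tuning values below are invented, and the paper's EUN construction is more involved than this.

```python
import random

def empirical_envelope(residuals, horizon, n_surrogates=2000, block=5,
                       q=0.95, seed=0):
    """Block-bootstrap surrogate series of unforced variability from
    detrended temperature residuals; return a (lo, hi) envelope for the
    mean anomaly over `horizon` years at coverage q."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_surrogates):
        series = []
        while len(series) < horizon:          # stitch together random blocks
            start = rng.randrange(len(residuals) - block + 1)
            series.extend(residuals[start:start + block])
        means.append(sum(series[:horizon]) / horizon)
    means.sort()
    lo = means[int((1 - q) / 2 * n_surrogates)]
    hi = means[min(n_surrogates - 1, int((1 + q) / 2 * n_surrogates))]
    return lo, hi

# Hypothetical detrended residuals in degrees C; not real observations.
resid = [0.1, -0.05, 0.02, -0.12, 0.08, 0.15, -0.1, 0.0, -0.03, 0.06,
         -0.07, 0.11, -0.09, 0.04, -0.02]
lo, hi = empirical_envelope(resid, horizon=10)
```

A forced-signal trend would then be judged consistent with observations whenever the observed-minus-forced anomaly falls inside the `(lo, hi)` band.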

Relevance: 30.00%

Abstract:

The authors address the 4 main points in S. M. Monroe and S. Mineka's (2008) comment. First, the authors show that the Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; American Psychiatric Association, 2000) posttraumatic stress disorder (PTSD) diagnosis includes an etiology and that it is based on a theoretical model with a distinguished history in psychology and psychiatry. Two tenets of this theoretical model are that voluntary (strategic) recollections of the trauma are fragmented and incomplete while involuntary (spontaneous) recollections are vivid and persistent and yield privileged access to traumatic material. Second, the authors describe differences between their model and other cognitive models of PTSD. They argue that these other models share the same 2 tenets as the diagnosis and show that these 2 tenets are largely unsupported by empirical evidence. Third, the authors counter arguments about the strength of the evidence favoring the mnemonic model. Fourth, they show that concerns about the causal role of memory in PTSD are based on views of causality that are generally inappropriate for the explanation of PTSD in the social and biological sciences. © 2008 American Psychological Association.

Relevance: 30.00%

Abstract:

A new modality for preventing HIV transmission is emerging in the form of topical microbicides. Some clinical trials have shown promising results for these methods of protection, while other trials have failed to show efficacy. Because microbicide drug transport is relatively novel, a rigorous, deterministic analysis of that transport can help improve the design of microbicide vehicles and help interpret results from clinical trials. Such analysis aids microbicide product design by organizing the determinants of drug transport and clarifying the potential efficacies of candidate microbicide products.

Microbicide drug transport is modeled as a diffusion process with convection and reaction effects in appropriate compartments. This is applied here to vaginal gels and rings and a rectal enema, all delivering the microbicide drug Tenofovir. Although the focus here is on Tenofovir, the methods established in this dissertation can readily be adapted to other drugs, given knowledge of their physical and chemical properties, such as the diffusion coefficient, partition coefficient, and reaction kinetics. Other dosage forms such as tablets and fiber meshes can also be modeled using the perspective and methods developed here.

The analyses here include convective details of intravaginal flows by both ambient fluid and spreading gels with different rheological properties and applied volumes. These are input to the overall conservation equations for drug mass transport in different compartments. The results are Tenofovir concentration distributions in time and space for a variety of microbicide products and conditions. The Tenofovir concentrations in the vaginal and rectal mucosal stroma are converted, via a coupled reaction equation, to concentrations of Tenofovir diphosphate, which is the active form of the drug that functions as a reverse transcriptase inhibitor against HIV. Key model outputs are related to concentrations measured in experimental pharmacokinetic (PK) studies, e.g. concentrations in biopsies and blood. A new measure of microbicide prophylactic functionality, the Percent Protected, is calculated. This is the time dependent volume of the entire stroma (and thus fraction of host cells therein) in which Tenofovir diphosphate concentrations equal or exceed a target prophylactic value, e.g. an EC50.
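A minimal numerical sketch of this kind of compartmental transport calculation, assuming a single 1-D diffusion layer with first-order loss and invented parameter values rather than the dissertation's fitted Tenofovir model (the real models couple multiple compartments and the diphosphate reaction):

```python
def percent_protected(D, k, c_surface, ec50, L=0.03, nx=40, t_end=3600.0):
    """Explicit finite-difference solution of dC/dt = D*d2C/dx2 - k*C
    across a tissue layer of thickness L (cm), holding the surface
    concentration fixed; return the fraction of the layer at or above
    ec50 at time t_end -- a 1-D analogue of the Percent Protected metric."""
    dx = L / nx
    dt = 0.4 * dx * dx / D              # satisfies the explicit stability limit
    c = [0.0] * (nx + 1)
    c[0] = c_surface                    # gel/luminal boundary condition
    for _ in range(int(t_end / dt)):
        new = c[:]
        for i in range(1, nx):
            new[i] = c[i] + dt * (D * (c[i - 1] - 2 * c[i] + c[i + 1]) / dx**2
                                  - k * c[i])
        new[nx] = new[nx - 1]           # zero-flux deep-tissue boundary
        c = new
    return sum(1 for v in c if v >= ec50) / (nx + 1)

# Illustrative magnitudes only (D in cm^2/s, k in 1/s, concentrations scaled)
pp = percent_protected(D=1e-6, k=1e-4, c_surface=1.0, ec50=0.1)
```

Running the same calculation at increasing `t_end` shows the protected fraction growing as drug penetrates the layer, mirroring the time-dependent character of the Percent Protected measure.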

Results show the prophylactic potentials of the studied microbicide vehicles against HIV infection. Key design parameters for each are addressed in application of the models. For a vaginal gel, fast spreading at small volume is more effective than slower spreading at high volume. Vaginal rings are shown to be most effective if inserted and retained as close to the fornix as possible. Because of the long half-life of Tenofovir diphosphate, temporary removal of the vaginal ring (after achieving steady state) for up to 24 h does not appreciably diminish Percent Protected. However, full steady state (for the entire stromal volume) is not achieved until several days after ring insertion. Delivery of Tenofovir to the rectal mucosa by an enema is dominated by the surface area of coated mucosa and by whether the interiors of rectal crypts are filled with the enema fluid. For the enema, 100% Percent Protected is achieved much more rapidly than for vaginal products, primarily because of the much thinner epithelial layer of the rectal mucosa. For example, 100% Percent Protected can be achieved with a one-minute enema application and a 15-minute wait time.

Results of these models have good agreement with experimental pharmacokinetic data, in animals and clinical trials. They also improve upon traditional, empirical PK modeling, and this is illustrated here. Our deterministic approach can inform design of sampling in clinical trials by indicating time periods during which significant changes in drug concentrations occur in different compartments. More fundamentally, the work here helps delineate the determinants of microbicide drug delivery. This information can be the key to improved, rational design of microbicide products and their dosage regimens.

Relevance: 30.00%

Abstract:

The problem of social diffusion has animated sociological thinking on topics ranging from the spread of an idea, an innovation, or a disease, to the foundations of collective behavior and political polarization. While network diffusion has been a productive metaphor, the reality of diffusion processes is often muddier. Ideas and innovations diffuse differently from diseases, but, with a few exceptions, the diffusion of ideas and innovations has been modeled under the same assumptions as the diffusion of disease. In this dissertation, I develop two new diffusion models for "socially meaningful" contagions that address two of the most significant problems with current diffusion models: (1) that contagions can only spread along observed ties, and (2) that contagions do not change as they spread between people. I augment insights from these statistical and simulation models with an analysis of an empirical case of diffusion: the use of enterprise collaboration software in a large technology company. I focus the empirical study on when people abandon innovations, a crucial and understudied aspect of the diffusion of innovations. Using timestamped posts, I analyze in fine detail when people abandon the software.

To address the first problem, I suggest a latent space diffusion model. Rather than treating ties as stable conduits for information, the latent space diffusion model treats ties as random draws from an underlying social space and simulates diffusion over the social space. To address the second problem, I suggest a diffusion model with schemas. Rather than treating information as though it spreads without changing, the schema diffusion model allows people to modify the information they receive to fit an underlying mental model of the information before they pass it to others. Theoretically, the social space model integrates actor ties and attributes simultaneously in a single social plane, while incorporating schemas into diffusion processes gives explicit form to the reciprocal influences that cognition and social environment have on each other. Practically, the latent space diffusion model produces statistically consistent diffusion estimates where using the network alone does not, and the schema diffusion model shows that introducing cognitive processing into diffusion changes the rate and ultimate distribution of the spreading information. Combining the latent space models with a schema notion for actors improves our models for social diffusion both theoretically and practically.
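A toy simulation of the latent space idea, in which adoption probability decays with distance in a hypothetical 2-D social space rather than following a fixed set of observed ties (coordinates, decay rate, and round count are invented; the dissertation's statistical model is richer than this):

```python
import math
import random

def latent_space_diffusion(positions, p0=0.9, decay=1.0, rounds=50, seed=1):
    """Simulate diffusion over a latent social space: in each round, every
    non-adopter may adopt from each current adopter with probability
    p0 * exp(-decay * distance), so nearby actors infect each other even
    without an observed tie between them."""
    rng = random.Random(seed)
    adopted = {0}                                  # seed the first actor
    for _ in range(rounds):
        new = set()
        for i in range(len(positions)):
            if i in adopted:
                continue
            for j in adopted:
                d = math.dist(positions[i], positions[j])
                if rng.random() < p0 * math.exp(-decay * d):
                    new.add(i)
                    break
        adopted |= new
    return adopted

# A tight cluster around the seed actor plus one distant outlier.
pts = [(0.0, 0.0), (0.4, 0.1), (0.8, 0.0), (6.0, 6.0)]
final = latent_space_diffusion(pts)
```

With these coordinates the contagion almost surely saturates the near cluster while the distant actor is reached only rarely, illustrating how position in the latent space, not an observed edge list, governs exposure.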

The empirical case study focuses on how the changing value of an innovation, introduced by the innovation's network externalities, influences when people abandon it. I find that people are least likely to abandon an innovation when others in their neighborhood currently use the software as well. The effect is particularly pronounced for supervisors' current use and for the number of supervisory team members who currently use the software. This case study not only points to an important process in the diffusion of innovations, but also suggests a new approach, computerized collaboration systems, to collecting and analyzing data on organizational processes.

Relevance: 30.00%

Abstract:

The work presented in this dissertation is focused on applying engineering methods to develop and explore probabilistic survival models for the prediction of decompression sickness in US Navy divers. Mathematical modeling, computational model development, and numerical optimization techniques were employed to formulate and evaluate the predictive quality of models fitted to empirical data. In Chapters 1 and 2 we present general background information relevant to the development of probabilistic models applied to predicting the incidence of decompression sickness. The remainder of the dissertation introduces techniques developed in an effort to improve the predictive quality of probabilistic decompression models and to reduce the difficulty of model parameter optimization.

The first project explored seventeen variations of the hazard function using a well-perfused parallel compartment model. Models were parametrically optimized using the maximum likelihood technique. Model performance was evaluated using both classical statistical methods and model selection techniques based on information theory. Optimized model parameters were overall similar to previously published values. Results favored a novel hazard function definition that included both ambient pressure scaling and individually fitted compartment exponent scaling terms.
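The survival-model backbone shared by these hazard variations, converting an integrated hazard into a probability of decompression sickness, can be sketched as follows. The hazard shown is an invented illustration, not one of the seventeen fitted definitions.

```python
import math

def dcs_probability(hazard, times):
    """Survival-model probability of decompression sickness:
    P(DCS) = 1 - exp(-integral of the hazard r(t) over the exposure),
    with the integral taken by the trapezoid rule over `times`."""
    integral = 0.0
    for t0, t1 in zip(times, times[1:]):
        integral += 0.5 * (hazard(t0) + hazard(t1)) * (t1 - t0)
    return 1.0 - math.exp(-integral)

def toy_hazard(t, gain=0.01, p_amb=1.0):
    """Illustrative hazard: risk proportional to tissue supersaturation,
    with tissue tension decaying toward ambient pressure after surfacing."""
    p_tissue = p_amb + 1.5 * math.exp(-t / 30.0)
    return gain * max(0.0, p_tissue - p_amb) / p_amb

times = [float(t) for t in range(121)]    # minutes after surfacing
p = dcs_probability(toy_hazard, times)
```

Maximum likelihood fitting then amounts to choosing hazard parameters (here `gain` and the tissue time constant) so that these probabilities best match the observed incidence data.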

We developed ten pharmacokinetic compartmental models that included explicit delay mechanics to determine whether predictive quality could be improved through the inclusion of material transfer lags. A fitted discrete delay parameter augmented the inflow to the compartment systems from the environment. Based on the observation that, in many of our models, symptoms are often reported after risk accumulation begins, we hypothesized that the inclusion of delays might improve correlation between the model predictions and observed data. Model selection techniques identified two delay models as having the best overall performance, but comparison against the best-performing no-delay model, together with model selection using our best identified no-delay pharmacokinetic model, indicated that the delay mechanism was not statistically justified and did not substantially improve model predictions.

Our final investigation explored parameter bounding techniques to identify parameter regions in which statistical model failure cannot occur. Statistical model failure occurs when a model predicts zero probability of a diver experiencing decompression sickness for an exposure that is known to produce symptoms. Using a metric related to the instantaneous risk, we identify regions where model failure will not occur and locate the boundaries of those regions with a root bounding technique. Several models are used to demonstrate the techniques, which may be employed to reduce the difficulty of model optimization in future investigations.
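The root-locating step can be sketched as a bisection on a sign-changing risk metric: below the root the model assigns zero instantaneous risk (and would fail statistically on a symptomatic exposure), above it risk is positive. The metric used here is a hypothetical stand-in for the dissertation's instantaneous-risk-based metric.

```python
def failure_boundary(metric, lo, hi, tol=1e-8):
    """Bisection ('root bounding') for the parameter value at which a
    risk metric crosses zero. Requires metric(lo) <= 0 < metric(hi),
    i.e. the bracket straddles the model-failure boundary."""
    assert metric(lo) <= 0.0 < metric(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if metric(mid) > 0.0:
            hi = mid           # positive risk: boundary lies below mid
        else:
            lo = mid           # zero risk: boundary lies above mid
    return 0.5 * (lo + hi)

# Toy metric: risk becomes positive once a gain parameter exceeds 0.25.
boundary = failure_boundary(lambda g: g - 0.25, 0.0, 1.0)
```

Constraining the optimizer to the positive-risk side of such boundaries removes the failure region from the search space, which is what makes subsequent parameter optimization easier.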