910 results for Discrete time pricing model


Relevance:

100.00%

Publisher:

Abstract:

A discrete-time random process is described which can generate bursty sequences of events. A Bernoulli process, where the probability of an event occurring at time t is a fixed probability x, is modified to include a memory effect in which the event probability is increased in proportion to the number of events that occurred within a given amount of time preceding t. For small values of x the interevent time distribution follows a power law with exponent −2−x. We consider a dynamic network where each node forms and breaks connections according to this process. The value of x for each node is drawn from a fitness distribution ρ(x); we find exact solutions for the expectation of the degree distribution for a variety of possible fitness distributions, both with and without the memory effect. This work can potentially lead to methods for uncovering hidden fitness distributions from fast-changing temporal network data, such as online social communications and fMRI scans.
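The memory mechanism described in this abstract can be sketched in a few lines; the window length w and feedback strength nu below are illustrative parameters, not values from the paper:

```python
import random

def simulate_bursty(x, nu, w, steps, seed=0):
    """Bernoulli process with memory: the event probability at time t is
    raised in proportion to the number of events in the preceding w steps.
    The parameters nu and w are illustrative, not taken from the paper."""
    rng = random.Random(seed)
    events = []
    for t in range(steps):
        recent = sum(1 for s in events if t - w <= s < t)
        p = min(1.0, x * (1.0 + nu * recent))  # memory raises the base rate x
        if rng.random() < p:
            events.append(t)
    return events

events = simulate_bursty(x=0.05, nu=2.0, w=20, steps=10_000)
gaps = [b - a for a, b in zip(events, events[1:])]
```

For small x, a histogram of `gaps` should show the heavy interevent-time tail that the paper characterizes as a power law with exponent −2−x.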

Relevance:

100.00%

Publisher:

Abstract:

The article examines whether commodity risk is priced in the cross-section of global equity returns. We employ a long-only, equally weighted portfolio of commodity futures and a term structure portfolio that captures phases of backwardation and contango as mimicking portfolios for commodity risk. We find that equity portfolios with greater sensitivities to the excess returns of the backwardation and contango portfolios command higher average excess returns, suggesting that, when measured appropriately, commodity risk is pervasive in stocks. Our conclusions are robust to the addition of financial, macroeconomic, and business-cycle risk factors to the pricing model.
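The sensitivity used for the portfolio sorts is, in essence, a regression slope of stock excess returns on the commodity factor's excess returns; a minimal sketch on made-up numbers, not the paper's full asset pricing procedure:

```python
def beta(stock_excess, factor_excess):
    """OLS slope of a stock's excess returns on the excess returns of a
    commodity mimicking portfolio: the sensitivity used to sort stocks
    into portfolios. A sketch only; the paper's tests involve more."""
    n = len(factor_excess)
    mx = sum(factor_excess) / n
    my = sum(stock_excess) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(factor_excess, stock_excess))
    var = sum((x - mx) ** 2 for x in factor_excess)
    return cov / var

b = beta([0.02, 0.04, 0.06], [0.01, 0.02, 0.03])  # perfectly correlated toy data
```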

Relevance:

100.00%

Publisher:

Abstract:

A particle filter method is presented for the discrete-time filtering problem with nonlinear Itô stochastic ordinary differential equations (SODEs) with additive noise, assumed to be analytically integrable as a function of the underlying vector Wiener process and time. The Diffusion Kernel Filter is arrived at by a parametrization of small noise-driven state fluctuations within branches of prediction and a local use of this parametrization in the Bootstrap Filter. The method applies for small noise and short prediction steps. With explicit numerical integrators, the operation count in the Diffusion Kernel Filter is shown to be smaller than in the Bootstrap Filter whenever the initial state for the prediction step has sufficiently few moments. The established parametrization is a dual formula for the analysis of sensitivity to Gaussian initial perturbations and of sensitivity to noise perturbations in deterministic models, showing in particular how the stability of a deterministic dynamics is modeled by noise on short times and how the diffusion matrix of an SODE should be modeled (i.e., defined) for a Gaussian-initial deterministic problem to be cast as an SODE problem. From it, a novel definition of prediction may be proposed that coincides with the deterministic path within the branch of prediction whose information entropy at the end of the prediction step is closest to the average information entropy over all branches. Tests are made with the Lorenz-63 equations, showing good results both for the filter and for the definition of prediction.
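For contrast with the Diffusion Kernel Filter, the plain Bootstrap Filter that it builds on can be sketched on a scalar toy model; the model, noise levels, and particle count below are illustrative and this is not the paper's method:

```python
import math
import random

def bootstrap_filter(obs, n_particles=500, q=0.1, r=0.5, seed=0):
    """Plain Bootstrap Filter on a toy scalar model
    x_t = 0.9 x_{t-1} + q w_t,  y_t = x_t + r v_t  (w, v standard normal).
    Illustrative only: the paper's Diffusion Kernel Filter replaces the
    prediction step with a parametrization of small noise-driven fluctuations."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in obs:
        # predict: propagate each particle through the dynamics plus noise
        particles = [0.9 * x + rng.gauss(0.0, q) for x in particles]
        # update: weight by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # resample (multinomial) to equalize weights
        particles = rng.choices(particles, weights=weights, k=n_particles)
        means.append(sum(particles) / n_particles)
    return means

estimates = bootstrap_filter([0.5, 0.4, 0.6, 0.3])
```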

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we develop a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows a compound weighted Poisson distribution. This model is more flexible in terms of dispersion than the promotion time cure model. Moreover, it gives an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of the initial risk factors in a competitive scenario. In other words, what is recorded is only the undamaged portion of the original number of risk factors.
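The Poisson special case that the compound weighted Poisson model generalizes, the promotion time cure model, has a closed-form population survival function; a sketch, with an illustrative exponential cdf for the activation times:

```python
import math

def promotion_time_survival(t, theta, F):
    """Population survival under the promotion time cure model,
    S_pop(t) = exp(-theta * F(t)), the Poisson special case that the
    paper's compound weighted Poisson model generalizes. F is the cdf
    of the activation time of a single risk factor."""
    return math.exp(-theta * F(t))

F_exp = lambda t: 1.0 - math.exp(-t)  # illustrative exponential cdf
cure_fraction = promotion_time_survival(float("inf"), 2.0, F_exp)
```

As t grows, the population survival levels off at exp(-theta), the cured fraction, rather than decaying to zero.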

Relevance:

100.00%

Publisher:

Abstract:

Iron is hypothesized to be an important micronutrient for ocean biota, thus modulating carbon dioxide uptake by the ocean biological pump. Studies have assumed that atmospheric deposition of iron to the open ocean is predominantly from mineral aerosols. For the first time we model the source, transport, and deposition of iron from combustion sources. Iron is produced in small quantities during fossil fuel burning, incinerator use, and biomass burning. The sources of combustion iron are concentrated in the industrialized regions and in biomass burning regions, largely in the tropics. Model results suggest that combustion iron can represent up to 50% of the total iron deposited, but over open ocean regions it is usually less than 5% of the total iron, with the highest values (<30%) close to the East Asian continent in the North Pacific. For ocean biogeochemistry the bioavailability of the iron is important, and this is often estimated by the fraction which is soluble (Fe(II)). Previous studies have argued that atmospheric processing of the relatively insoluble Fe(III) occurs to make it more soluble (Fe(II)). Modeled estimates of soluble iron amounts based solely on atmospheric processing as simulated here cannot match the variability in daily averaged in situ concentration measurements in Korea, which is located close to both combustion and dust sources. The best match to the observations is obtained when there are substantial direct emissions of soluble iron from combustion processes. If we assume that the observed soluble Fe/black carbon ratios in Korea are representative of the whole globe, we obtain the result that deposition of soluble iron from combustion contributes 20-100% of the soluble iron deposition over many ocean regions. This implies that more work should be done refining the emissions and deposition of combustion sources of soluble iron globally.

Relevance:

100.00%

Publisher:

Abstract:

We study stochastic billiards on general tables: a particle moves with constant velocity inside some domain D ⊂ R^d until it hits the boundary and bounces randomly inside according to some reflection law. We assume that the boundary of the domain is locally Lipschitz and almost everywhere continuously differentiable. The angle of the outgoing velocity with the inner normal vector has a specified, absolutely continuous density. We construct the discrete-time and continuous-time processes recording the sequence of hitting points on the boundary and the pair location/velocity. We mainly focus on the case of bounded domains. We then prove exponential ergodicity of these two Markov processes, and we study their invariant distribution and their normal (Gaussian) fluctuations. Of particular interest is the case of the cosine reflection law: the stationary distributions of the two processes are then uniform, the discrete-time chain is reversible, and the continuous-time process is quasi-reversible. Also in this case, we give a natural construction of a chord "picked at random" in D, and we study the angle of intersection of the process with a (d − 1)-dimensional manifold contained in D.
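On the unit disk the discrete-time chain has a particularly simple form, since a chord leaving a boundary point at angle theta from the inner normal advances the boundary angle by pi - 2*theta; a sketch of the cosine-law chain, with this circle-specific geometry (not the paper's general construction):

```python
import math
import random

def billiard_disk(n_steps, seed=0):
    """Discrete-time stochastic billiard on the unit disk with the cosine
    reflection law: the outgoing angle theta from the inner normal has
    density cos(theta)/2 on (-pi/2, pi/2), sampled by inversion as
    theta = arcsin(2u - 1). Returns the sequence of boundary hitting
    angles (an illustrative sketch)."""
    rng = random.Random(seed)
    phi = 0.0
    hits = [phi]
    for _ in range(n_steps):
        theta = math.asin(2.0 * rng.random() - 1.0)  # cosine-law draw
        phi = (phi + math.pi - 2.0 * theta) % (2.0 * math.pi)  # chord geometry
        hits.append(phi)
    return hits

hits = billiard_disk(1000)
```

Consistent with the abstract, a histogram of `hits` should look uniform on the boundary under the cosine law.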

Relevance:

100.00%

Publisher:

Abstract:

The acid hydrolysis of cellulose with crystalline and amorphous fractions is analyzed on the basis of an autocatalytic model with a positive feedback of acid production from the degraded biopolymer. Under the condition of a low acid production rate compared with the hydrolysis rate, both fractions of cellulose decrease exponentially, with linear and cubic time dependence, and the normalized number of scissions per cellulose chain follows a sigmoid behavior with reaction time. The model predicts that self-generated acidic compounds from the cellulose accelerate the degradation of the biopolymer. However, if the acidic compounds produced are volatile species, then their release under low pressure will reduce the global degradation rate of cellulose toward the intrinsic rate value determined by the residual acid catalyst present in the starting material.
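The positive feedback loop can be sketched with a forward Euler integration; the rate constant k0, feedback strength alpha, and the linear feedback form are illustrative, not the paper's kinetics:

```python
def hydrolysis(k0, alpha, dt, steps):
    """Forward Euler sketch of autocatalytic cellulose hydrolysis: the
    scission rate is proportional to the acid level a and the remaining
    cellulose fraction c, and degraded material feeds acid back
    (positive feedback). Parameters and feedback form are illustrative."""
    c, a, degraded = 1.0, 1.0, 0.0
    trajectory = []
    for _ in range(steps):
        rate = k0 * a * c
        c -= rate * dt
        degraded += rate * dt
        a = 1.0 + alpha * degraded  # self-generated acid accelerates decay
        trajectory.append(c)
    return trajectory

traj = hydrolysis(k0=0.5, alpha=2.0, dt=0.01, steps=1000)
```

Setting alpha = 0 removes the feedback and recovers plain first-order decay at the intrinsic rate, mirroring the volatile-acid-release limit described in the abstract.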

Relevance:

100.00%

Publisher:

Abstract:

The increase in foreign students in countries such as the US, the UK and France suggests that the international ‘education industry’ is growing in importance. The purpose of this paper is to investigate the empirical determinants of international student mobility. A secondary purpose is to give tentative policy suggestions to host and source countries, and also to provide some recommendations to students who want to study abroad. Using pooled cross-sectional time-series data for the US over the time period 1993-2006, we estimate an econometric model of enrolment rates of foreign students in the US. Our results suggest that tuition fees, US federal support of education, and the size of the ‘young’ generation of source countries have a significant influence on international student mobility. We also consider other factors that may be relevant in this context.

Relevance:

100.00%

Publisher:

Abstract:

An administrative border might hinder the optimal allocation of a given set of resources by restricting the flow of goods, services, and people. In this paper we address the question: do administrative borders lead to poor accessibility to public services such as hospitals? In answering the question, we have examined the case of Sweden and its regional borders. We have used detailed data on the Swedish road network, its hospitals, and its geo-coded population. We have assessed the population’s spatial accessibility to Swedish hospitals by computing the inhabitants’ distance to the nearest hospital. We have also elaborated several scenarios, ranging from strongly confining regional borders to no confinement by borders, and recomputed the accessibility. Our findings imply that administrative borders only marginally worsen accessibility.
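The scenario comparison reduces to a nearest-facility computation with and without the border constraint; a minimal sketch with straight-line distance standing in for the paper's road-network distance (all names, coordinates, and regions illustrative):

```python
def nearest_hospital_mean(pop, hospitals, region_of, confined):
    """Mean distance from inhabitants to the nearest hospital, optionally
    confined to hospitals in the inhabitant's own region. Straight-line
    distance stands in for the paper's road-network distance."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    total = 0.0
    for point, region in pop:
        candidates = [h for h in hospitals
                      if not confined or region_of[h] == region]
        total += min(dist(point, h) for h in candidates)
    return total / len(pop)

pop = [((0.0, 0.0), "A"), ((3.0, 0.0), "A")]
hospitals = [(1.0, 0.0), (4.0, 0.0)]
region_of = {(1.0, 0.0): "A", (4.0, 0.0): "B"}
d_open = nearest_hospital_mean(pop, hospitals, region_of, confined=False)
d_confined = nearest_hospital_mean(pop, hospitals, region_of, confined=True)
```

In this toy setup confinement raises the mean distance, mirroring the worst-case scenario in the paper; the size of the gap is what the study measures on real data.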

Relevance:

100.00%

Publisher:

Abstract:

This work analyzed the improvement in the characteristics of a soft soil treated with lime, as well as the technical feasibility of using this new material as a bearing layer for shallow foundations. The soil studied is pedologically classified as Humic Gley, and the deposit is located in the municipality of Canoas/RS, on the margins of highway BR 386. The work had the following aims: to study the influence of different lime contents on the stress-strain characteristics of the treated soil; to verify the gain in strength with curing time; to model the stress-strain behavior of the treated material; and to carry out numerical simulations, using the Finite Element Method, of the load-settlement behavior of flexible strip foundations resting on the new material. The optimum lime content of 9% (obtained by the method of Eades & Grim, 1966) was adopted, along with two lower values of 7% and 5%. The following tests were performed on the natural soil and the soil-lime mixtures: Atterberg limits, compaction, grain-size distribution, X-ray diffraction, permeability (triaxial), and consolidated isotropically undrained (CIU) triaxial tests. All tests were carried out for three curing times (7, 28 and 90 days), and the specimens were cured in a humid chamber. To model the stress-strain behavior of the improved soil, the Hyperbolic Model was adopted, and for the natural soil the Modified Cam-Clay Model. The Hyperbolic Model was implemented in the CRISPSO software, developed at the University of Cambridge, England. The software was used in a parametric study to determine the influence of the stabilization process on the load-settlement behavior of shallow foundations.
From the results obtained, it was concluded that: the method of Eades & Grim (1966) did not prove adequate for determining the optimum lime content; there was, in general, an improvement in the physical characteristics with the lime treatment; there was no gain in strength with curing time; the hyperbolic model represented the behavior of the soil-lime mixtures well; and the placement of a layer of treated soil improves the load-settlement behavior of flexible continuous shallow foundations.
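The Hyperbolic Model mentioned above is, in its standard Kondner/Duncan-Chang form, a one-line stress-strain relation; a sketch with illustrative parameter values (the actual values would be fitted to the CIU triaxial test results):

```python
def hyperbolic_stress(strain, E_i, sigma_ult):
    """Hyperbolic stress-strain relation in the standard Kondner /
    Duncan-Chang form: sigma = strain / (1/E_i + strain/sigma_ult),
    with initial tangent modulus E_i and asymptotic stress sigma_ult.
    Parameter values here are illustrative, not from the study."""
    return strain / (1.0 / E_i + strain / sigma_ult)

sigma = hyperbolic_stress(0.01, E_i=100.0, sigma_ult=10.0)
```

At small strains the curve follows the initial modulus E_i; at large strains the stress approaches the asymptote sigma_ult, the hyperbolic shape typically fitted to triaxial data.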

Relevance:

100.00%

Publisher:

Abstract:

The overriding goal in finance for corporate managers is to increase shareholder value. For this to be effectively implemented, an investment must provide the shareholder with a return above the opportunity cost of capital, accounting for the risk of the investment as measured by a valuation model. This work presents the main concepts of asset valuation, highlighting value creation as the most important measure of a company's performance. Discounted cash flow is presented as the method that best captures the concept of value creation, with emphasis on the cash flow to equity and the cash flow to the firm. The work also presents how the cash flows are computed, how growth rates are estimated, some special situations in which discounted cash flow needs adaptation, and other alternative methods of investment analysis, none of which is able to outperform the net present value (NPV) technique, since the NPV method uses all of a project's cash flows and discounts them correctly according to the opportunity cost of capital. The study also gives a brief overview of the main techniques for measuring risk and the return required by investors or owners according to value theory, such as the CAPM (Capital Asset Pricing Model), the APM (Arbitrage Pricing Model) and multifactor models, highlighting among them the difficulty of measuring the cost of equity in privately held companies in Brazil for the proper determination of the discount rate. The proposed methodology is applied to the evaluation of an investment in a new point of sale made by a small family-owned company in the supermarket sector.
Thus, at the end of the study, a management tool based on discounted cash flow is proposed for evaluating the company's future investments, seeking to maximize value for the shareholder or owner.
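The NPV rule that the abstract singles out can be stated in a few lines; the project cash flows and discount rate below are illustrative:

```python
def npv(rate, cashflows):
    """Net present value: each cash flow discounted at the opportunity
    cost of capital; cashflows[0] is the initial outlay at t = 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

project = [-1000.0, 400.0, 400.0, 400.0]  # illustrative figures
value = npv(0.10, project)
```

The decision rule is to accept the project when the NPV is positive at the appropriate risk-adjusted discount rate; a higher cost of capital lowers the NPV.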

Relevance:

100.00%

Publisher:

Abstract:

This work estimates the CCAPM (consumption capital asset pricing model) for three classes of utility functions with Brazilian data, generating robust estimates of risk aversion and the intertemporal elasticity of substitution. The results are analyzed and compared with previous results for Brazilian and US data.

Relevance:

100.00%

Publisher:

Abstract:

After Modigliani and Miller (1958) presented their capital structure irrelevance proposition, analyses of corporate financing choices involving debt and equity instruments have generally followed two trends in the literature, where models either incorporate informational asymmetries or introduce tax benefits in order to explain optimal capital structure determination (Myers, 2002). Neither of these features is present in this paper, which develops an asset pricing model with the purpose of providing a positive theory of corporate capital structure by replicating the main aspects of standard contractual practice observed in real markets. Alternatively, the imperfect market structure of the economy is tailored to match what is most common in corporate reality. Allowance for default on corporate debt, with an associated penalty of seizure of the firm's future cash flows by creditors, is introduced, for instance. In this context, a qualitative assessment of financial managers' decisions is carried out through numerical procedures.

Relevance:

100.00%

Publisher:

Abstract:

This work addresses recent concepts from the economic theory of insurance, applying them specifically to export credit insurance. The aim is to construct a pioneering credit risk pricing model adjusted to the context of the Brazilian export market.

Relevance:

100.00%

Publisher:

Abstract:

This work estimates, using the generalized method of moments and Brazilian data, the structural parameters of the CCAPM (consumption capital asset pricing model) for three distinct classes of utility functions: power (CRRA) utility, utility with external habit, and disappointment aversion (Kreps-Porteus). These structural parameters are associated with risk aversion, the intertemporal elasticity of substitution in consumption, and the intertemporal discount rate of future utility. The results obtained here are analyzed and compared with previous results for Brazilian and US data. Additionally, all the estimated structural models are tested econometrically through tests of overidentifying restrictions, in order to investigate, as comprehensively as possible, whether or not there is an equity premium puzzle for Brazil. The results are surprising, given that the restrictions implied by these models are rejected only on very rare occasions. We therefore conclude that there is no equity premium puzzle for Brazil.
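For the CRRA case, the moment condition that GMM drives toward zero is the Euler equation E[β (c_{t+1}/c_t)^(−γ) R_{t+1} − 1] = 0; a sketch of the sample moment on made-up consumption-growth and return series:

```python
def euler_moment(beta, gamma, cons_growth, returns):
    """Sample CCAPM moment under CRRA utility:
    mean of beta * g_t^(-gamma) * R_t - 1, where g_t is gross consumption
    growth and R_t a gross asset return. GMM picks (beta, gamma) to drive
    such moments toward zero; the series below are illustrative."""
    terms = [beta * g ** (-gamma) * r - 1.0
             for g, r in zip(cons_growth, returns)]
    return sum(terms) / len(terms)

growth = [1.02, 1.01, 1.03, 0.99]  # made-up gross consumption growth
rets = [1.06, 1.04, 1.08, 1.01]    # made-up gross asset returns
m = euler_moment(0.97, 2.0, growth, rets)
```

With more moment conditions (instruments) than parameters, the surplus restrictions give the overidentification test the abstract relies on.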