911 results for Deterministic walkers


Relevance: 10.00%

Publisher:

Abstract:

This thesis concerns the analysis of epidemic models. We adopt the Bayesian paradigm and develop suitable Markov chain Monte Carlo (MCMC) algorithms. This is done by considering an Ebola outbreak in the Democratic Republic of Congo (former Zaïre) in 1995 as a case study for SEIR epidemic models. We model the Ebola epidemic deterministically using ODEs and stochastically through SDEs to take into account a possible bias in each compartment. Since the model has unknown parameters, we use different methods to estimate them, such as least squares, maximum likelihood, and MCMC. The motivation for choosing MCMC over other existing methods in this thesis is its ability to tackle complicated nonlinear problems with a large number of parameters. First, in a deterministic Ebola model, we compute the likelihood function by the sum-of-squared-residuals method and estimate parameters using the LSQ and MCMC methods. We sample parameters and then use them to calculate the basic reproduction number and to study the disease-free equilibrium. For the chain sampled from the posterior, we run convergence diagnostics and confirm the viability of the model. The results show that the Ebola model fits the observed onset data with high precision, and all the unknown model parameters are well identified. Second, we convert the ODE model into an SDE Ebola model. We compute the likelihood function using the extended Kalman filter (EKF) and estimate the parameters again. The motivation for using the SDE formulation here is to consider the impact of modelling errors. Moreover, the EKF approach allows us to formulate a filtered likelihood for the parameters of such a stochastic model. We use the MCMC procedure to obtain the posterior distributions of the parameters of the drift and diffusion parts of the SDE Ebola model. In this thesis, we analyse two cases: (1) the model error covariance matrix of the dynamic noise is close to zero, i.e. only a small amount of stochasticity is added to the model. The results are then similar to those obtained from the deterministic Ebola model, even though the methods of computing the likelihood function differ; (2) the model error covariance matrix is different from zero, i.e. considerable stochasticity is introduced into the Ebola model. This accounts for the situation where we know that the model is not exact. As a result, we obtain parameter posteriors with larger variances. Consequently, the model predictions show larger uncertainties, in accordance with the assumption of an incomplete model.
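As a rough illustration of the deterministic-model workflow described above — forward integration of an SEIR model, a sum-of-squared-residuals likelihood, and a random-walk Metropolis sampler — the following Python sketch fits synthetic onset data. All parameter values and the data are invented for illustration; this is not the thesis's code or its Ebola data.

```python
import numpy as np

def simulate_onsets(params, days, n_pop=5000.0, steps_per_day=10):
    # Forward-Euler integration of the deterministic SEIR ODEs; returns
    # the infectious-compartment trajectory sampled once per day.
    beta, sigma, gamma = params
    dt = 1.0 / steps_per_day
    s, e, i, r = n_pop - 1.0, 0.0, 1.0, 0.0
    daily = []
    for _ in range(days):
        for _ in range(steps_per_day):
            new_inf = beta * s * i / n_pop
            s, e, i, r = (s - dt * new_inf,
                          e + dt * (new_inf - sigma * e),
                          i + dt * (sigma * e - gamma * i),
                          r + dt * gamma * i)
        daily.append(i)
    return np.array(daily)

def log_post(params, data):
    # Sum-of-squared-residuals log-likelihood with flat positivity priors.
    if min(params) <= 0:
        return -np.inf
    resid = simulate_onsets(params, len(data)) - data
    return -0.5 * float(np.sum(resid ** 2))

def metropolis(data, start, n_iter=200, step=0.02, seed=0):
    # Random-walk Metropolis sampler over (beta, sigma, gamma).
    rng = np.random.default_rng(seed)
    chain = np.empty((n_iter + 1, 3))
    chain[0] = start
    lp = log_post(start, data)
    for k in range(n_iter):
        prop = chain[k] + step * rng.standard_normal(3)
        lp_prop = log_post(tuple(prop), data)
        if np.log(rng.uniform()) < lp_prop - lp:
            chain[k + 1], lp = prop, lp_prop
        else:
            chain[k + 1] = chain[k]
    return chain

# Fit synthetic onset data generated from known (illustrative) parameters.
true_params = (0.6, 0.3, 0.2)
data = simulate_onsets(true_params, 30)
chain = metropolis(data, start=(0.5, 0.25, 0.25))
post = chain[100:]                           # discard burn-in
r0 = post[:, 0].mean() / post[:, 2].mean()   # basic reproduction number beta/gamma
```

The sampled chain can then be fed to standard convergence diagnostics, and the posterior draws of beta and gamma give a posterior for the basic reproduction number, as described in the abstract.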

Abstract:

Investments have always been considered an essential backbone and so-called ‘locomotive’ of competitive economies. However, in various countries the state has been placed under tight budget constraints for investments in capital-intensive projects. In response to this situation, cooperation between the public and private sectors has grown, based on the public-private partnership mechanism. Promoting favorable arrangements for collaboration between the public and private sectors in the provision of policies, services, and infrastructure in Russia can help to address problems of dry port development that neither municipalities nor the private sector can solve alone. In particular, stimulating public-private collaboration is significant under exposure to externalities that affect the magnitude of risks during all phases of project realization. In these circumstances, project risk is also increasingly becoming a subject of joint research and of risk management practice, which is viewed as a key approach aiming to act on existing global and project-specific sources of uncertainty. Meanwhile, relatively little progress has been made on including resilience aspects in the planning process of dry port construction in a way that would instruct the capacity planner on how to mitigate disruptions that may lead to millions of dollars of losses when future cash flows deviate from the project’s expected financial flows. Current experience shows that the existing methodological base has been developed in a fragmentary fashion within the separate steps of the supply chain risk management (SCRM) process: the risk identification, risk evaluation, risk mitigation, and risk monitoring and control phases. The lack of a systematic approach hinders the solution of the risk management problem in dry port implementation.
Therefore, the management of various risks during the investment phases of dry port projects still presents a considerable challenge from both practical and theoretical points of view. In this regard, this research is a logical continuation of fundamental research in existing financial models and theories (e.g., the capital asset pricing model and real option theory), and it also complements portfolio theory. The goal of the current study is the design of methods and models to facilitate dry port implementation through the mechanism of public-private partnership on the national market, which implies the need to mitigate, first and foremost, the shortage of investment and the consequences of risks. The research problem was formulated on the basis of the identified contradictions, which arose from the trade-off between the opportunities that investors can gain from the development of the terminal business in Russia (i.e. dry port implementation) and the associated risks. As a rule, the higher the investment risk, the greater the expected return should be. However, investors differ in their tolerance for risk, which is why it is advisable to find an optimum investment. In this study, the optimum relates to the search for an efficient portfolio that satisfies the investor, depending on his or her degree of risk aversion. There are many theories and methods in finance concerning investment choices. Nevertheless, the appropriateness and effectiveness of particular methods should be considered with allowance for the specifics of the investment projects. For example, investments in dry ports involve not only lump-sum financial flows but also long payback periods.
As a result, the capital intensity and longevity of their construction make it necessary for investors to ensure the return on investment (profitability) along with the speed of that return (liquidity), while recognizing that the stochastic nature of the project environment is hardly described by a formula-based approach. The current theoretical base for economic appraisals of dry port projects most often perceives net present value (NPV) as a technique superior to other decision-making criteria. For example, portfolio theory, which considers investors’ differing risk preferences and utility structures, defines net present value as a better criterion of project appraisal than the discounted payback period (DPP). Meanwhile, in business practice the DPP is more popular. Since the NPV is based on the assumption of a certain project life, it cannot by itself be an accurate appraisal approach for determining whether a project should be approved in an environment that is not free of uncertainty. In order to reflect the period of the project’s useful life that is exposed to risks due to changes in political, operational, and financial factors, the second capital budgeting criterion, the discounted payback period, is profoundly important, particularly in the Russian environment. These statements represent contradictions that exist in the theory and practice of the applied science. Therefore, it is desirable to relax the assumptions of portfolio theory and regard the DPP as a no less relevant appraisal approach for assessing investments and measuring risk. At the same time, the rationality of using both project performance criteria depends on the methods and models with which these appraisal approaches are calculated in feasibility studies.
Deterministic methods cannot ensure the required precision of the results, while stochastic models can provide a sufficient level of accuracy and reliability, provided that the risks are properly identified, evaluated, and mitigated. Otherwise, the projected performance indicators may not be confirmed during the phase of project realization. For instance, economic and political instability can result in the undoing of hard-earned gains, leading to the need to attract additional financing for the project. Sources of alternative investment, as well as supportive mitigation strategies, can be studied during the initial phases of project development. During this period, the effectiveness of investment undertakings can also be improved by including various investors, e.g. Russian Railways’ enterprises and other private companies, in the dry port projects. However, the evaluation of the effectiveness of different investors’ participation in the project lacks methods and models that would permit a dedicated feasibility study foreseeing the quantitative characteristics of risks and of mitigation strategies that match investors’ risk tolerance. For this reason, this research proposes combining the Monte Carlo method, the discounted cash flow technique, real option theory, and portfolio theory via a system dynamics simulation approach. This methodology allows for a comprehensive risk management process for dry port development, covering all aspects of the risk identification, risk evaluation, risk mitigation, and risk monitoring and control phases. The designed system dynamics model can be recommended to decision-makers on dry port projects that are financed via a public-private partnership.
It permits investors to make a decision appraisal based on the random variables of net present value and discounted payback period, depending on different risk factors, e.g. revenue risks, land acquisition risks, traffic volume risks, construction hazards, and political risks. In this case, the statistical mean is used to express the expected value of the DPP and NPV; the standard deviation is proposed as a characteristic of risk, while the elasticity coefficient is applied for rating risks. Additionally, the risk of failure of project investments and the guaranteed recoupment of capital investment can be considered with the help of the model. On the whole, the application of these modern simulation methods creates preconditions for controlling the process of dry port development, i.e. making managerial changes and identifying the most stable parameters that contribute to the optimal alternative scenarios of project realization in an uncertain environment. The system dynamics model makes it possible to analyze the interactions in the complex mechanism of the risk management process of dry port development and to propose improvements in investment effectiveness by evaluating different risk management strategies. For the comparison and ranking of these alternatives in their order of preference to the investor, the proposed indicators of investment efficiency, concerning the NPV, DPP, and coefficient of variation, can be used. Thus, rational investors, who are averse to taking increased risks unless compensated by a commensurate increase in the expected utility of a risky prospect of dry port development, can be guided by the deduced marginal utility of investments, which is computed from the results of the system dynamics model.
In conclusion, the outlined theoretical and practical implications for the management of risks, which is a key characteristic of public-private partnerships, can help analysts and planning managers in budget decision-making, substantially alleviating the effects of various risks and avoiding unnecessary cost overruns in dry port projects.
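The Monte Carlo treatment of NPV and DPP described above can be illustrated with a minimal sketch. Everything here (cash-flow distribution, discount rate, investment size) is an invented assumption, not the study's calibrated system dynamics model; the sketch only shows how the expected value, standard deviation, coefficient of variation, and risk of failure are read off simulated cash flows.

```python
import numpy as np

def npv_and_dpp(cash_flows, rate):
    # NPV = sum of discounted cash flows; DPP = first year in which the
    # cumulative discounted cash flow becomes non-negative.
    disc = cash_flows / (1 + rate) ** np.arange(len(cash_flows))
    cum = np.cumsum(disc)
    recouped = np.nonzero(cum >= 0)[0]
    dpp = recouped[0] if recouped.size else np.inf  # never recouped
    return cum[-1], dpp

def simulate_project(n_runs=2000, horizon=20, invest=100.0,
                     mean_cf=12.0, cf_sd=4.0, rate=0.10, seed=1):
    # Risk enters through stochastic annual revenues (illustrative only).
    rng = np.random.default_rng(seed)
    npvs, dpps = [], []
    for _ in range(n_runs):
        cf = np.concatenate(([-invest], rng.normal(mean_cf, cf_sd, horizon)))
        npv, dpp = npv_and_dpp(cf, rate)
        npvs.append(npv)
        dpps.append(dpp)
    npvs = np.array(npvs)
    finite_dpp = np.array([d for d in dpps if np.isfinite(d)])
    return {
        "E[NPV]": npvs.mean(),
        "sd[NPV]": npvs.std(),                # risk characteristic
        "CV": npvs.std() / abs(npvs.mean()),  # coefficient of variation
        "E[DPP]": finite_dpp.mean(),          # over runs that recoup
        "P(loss)": np.mean(npvs < 0),         # risk of investment failure
    }

stats = simulate_project()
```

The same summary statistics (mean, standard deviation, coefficient of variation, probability of loss) are the indicators the abstract proposes for ranking alternative scenarios by investor preference.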

Abstract:

The aim of this study is to propose a stochastic model for commodity markets linked with the Burgers equation from fluid dynamics. We construct a stochastic particle method for commodity markets, in which particles represent market participants. A discontinuity is included in the model through an interaction kernel equal to the Heaviside function, and its link with the Burgers equation is given. The Burgers equation and the connection of this model with stochastic differential equations are also studied. Further, based on the law of large numbers, we prove the convergence, for large N, of a system of stochastic differential equations describing the evolution of the prices of N traders to a deterministic partial differential equation of Burgers type. Numerical experiments highlight the success of the new proposal in modeling some commodity markets, which is confirmed by the ability of the model to reproduce price spikes when their effects occur over a sufficiently long period of time.
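A minimal simulation of the interacting particle system described above, with the Heaviside interaction kernel, might look as follows; the initial distribution, noise level, and step sizes are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def heaviside_drift(x):
    # Each particle's drift is the average of the Heaviside kernel over
    # all other participants, i.e. the fraction of particles below it
    # (the empirical CDF).
    return np.heaviside(x[:, None] - x[None, :], 0.5).mean(axis=1)

def simulate_prices(n=500, steps=100, dt=0.01, eps=0.05, seed=2):
    # Euler-Maruyama integration of the n-trader interacting SDE system.
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n)   # initial prices of the n traders
    for _ in range(steps):
        x = (x + heaviside_drift(x) * dt
               + eps * np.sqrt(dt) * rng.standard_normal(n))
    return x

prices = simulate_prices()
# The pairwise average of the Heaviside kernel is exactly 1/2, so the
# empirical mean advances by ~ steps * dt / 2 = 0.5 regardless of n --
# the kind of n-independent deterministic behaviour that survives in the
# large-N (Burgers-type PDE) limit.
```

Comparing the empirical distribution of `prices` for increasing `n` is the natural numerical check of the law-of-large-numbers convergence the abstract proves.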

Abstract:

Serving the Niagara and surrounding areas for over 120 years, Walker Industries has made its impact not only commercially but also culturally. The company began in 1875 with the erection of a stone-sawing mill on a property John Walker purchased from the Welland Canal Loan Company. One of the first projects Walker cut stone for was the Merritton Town Hall. In 1882 the business expanded to include Walker’s children, changing the name to Walker & Sons. In 1887 the two eldest sons took control of the business, and under their partnership the company’s name changed to Walker Brothers; the same year, the company began operating its first quarry. The quarry was conveniently located alongside the 3rd Welland Canal, offering easy access to Toronto and Hamilton. It was also close to the railway system, which allowed immediate access to Thorold and Niagara Falls and, later, access to parts of Ontario and Quebec. The quarry supplied stone to build numerous halls and armouries across Ontario. A use was also found for the ‘waste products’ of cutting the limestone: leftover stone chips were sent to paper mills, where stone was needed as part of the sulphite pulp process for making paper. Beginning to supply the Ontario Paper Company with stone in 1913 meant not only long, hard work but also more profit for the company. Before mechanization, most of the loading and unloading of stone was done by hand, taking 19 man-hours to load an 18-yard railway car. The plant became fully mechanized in 1947, making the work easier and increasing production rates. In 1957 the company moved from its original location and opened the St. Catharines Crushed Stone Plant.

Abstract:

The aim of this thesis is to price options on equity index futures with an application to standard options on S&P 500 futures traded on the Chicago Mercantile Exchange. Our methodology is based on stochastic dynamic programming, which can accommodate European as well as American options. The model accommodates dividends from the underlying asset. It also captures the optimal exercise strategy and the fair value of the option. This approach is an alternative to available numerical pricing methods such as binomial trees, finite differences, and ad-hoc numerical approximation techniques. Our numerical and empirical investigations demonstrate convergence, robustness, and efficiency. We use this methodology to value exchange-listed options. The European option premiums thus obtained are compared to Black's closed-form formula. They are accurate to four digits. The American option premiums also have a similar level of accuracy compared to premiums obtained using finite differences and binomial trees with a large number of time steps. The proposed model accounts for deterministic, seasonally varying dividend yield. In pricing futures options, we discover that what matters is the sum of the dividend yields over the life of the futures contract and not their distribution.
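The thesis benchmarks its stochastic dynamic programming prices against binomial trees. As a point of reference, a textbook CRR binomial backward induction for an American put on a futures contract (one of the comparison methods, not the thesis's own algorithm) can be sketched as follows; all parameters are illustrative.

```python
import math

def american_put_on_futures(F0, K, sigma, r, T, n_steps):
    # Under Black's model a futures price has zero risk-neutral drift,
    # so the risk-neutral up-probability is p = (1 - d) / (u - d).
    dt = T / n_steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (1.0 - d) / (u - d)
    disc = math.exp(-r * dt)
    # Terminal payoffs at the last time step.
    vals = [max(K - F0 * u**j * d**(n_steps - j), 0.0)
            for j in range(n_steps + 1)]
    # Backward induction with an early-exercise check at every node.
    for step in range(n_steps - 1, -1, -1):
        for j in range(step + 1):
            cont = disc * (p * vals[j + 1] + (1 - p) * vals[j])
            F = F0 * u**j * d**(step - j)
            vals[j] = max(cont, K - F)  # exercise if intrinsic value is higher
    return vals[0]

# At-the-money 1-year American put on a futures contract (illustrative).
price = american_put_on_futures(F0=100, K=100, sigma=0.2, r=0.05,
                                T=1.0, n_steps=500)
```

Dropping the `max(..., K - F)` early-exercise check recovers the European price, which can be checked against Black's closed-form formula (about 7.58 for these parameters).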

Abstract:

The gift plate in the front of the book indicates that the book is from Walker’s Drug Store, Niagara Falls, Ontario. Walker’s Drug Company was founded in 1925 by Ivan T. Walker. The dates in this book indicate that it is more likely to have come from A.C. Thorburn, Chemist and Druggist. A.C. Thorburn purchased Smith’s Pharmacy and the Pursel and Company Dry Goods Store at the corner of Main Street and Lundy’s Lane in Niagara Falls, Ontario. In 1900, Pursel moved out and Thorburn’s Drug Store came into being. Ivan T. Walker, founder of Walker’s Drugs, was employed by Thorburn Drugs in his teen years. The local doctors whose prescriptions appear in the book include: J. H. McGarry; F.W.E. Wilson; C. F. Abraham; W.E. Olmsted; W.W. Thompson; Dr. Robb, dentist; Horace R. Elliot, physician and surgeon; and Dr. Sutherland, eye, ear, nose and throat specialist.

Abstract:

The curse of dimensionality is a major problem in the fields of machine learning, data mining, and knowledge discovery. Exhaustive search for the optimal subset of relevant features in a high-dimensional dataset is NP-hard. Sub-optimal population-based stochastic algorithms such as GP and GA are good choices for searching large search spaces and are usually more feasible than exhaustive and deterministic search algorithms. On the other hand, population-based stochastic algorithms often suffer from premature convergence on mediocre sub-optimal solutions. The Age Layered Population Structure (ALPS) is a novel metaheuristic for overcoming premature convergence in evolutionary algorithms and for improving search in the fitness landscape. The ALPS paradigm uses an age measure to control breeding and competition between individuals in the population. This thesis uses a modification of the ALPS GP strategy called Feature Selection ALPS (FSALPS) for feature subset selection and classification in varied supervised learning tasks. FSALPS uses a novel frequency-count system to rank features in the GP population based on evolved feature frequencies. The ranked features are translated into probabilities, which are used to control evolutionary processes such as terminal-symbol selection for the construction of GP trees and sub-trees. The FSALPS metaheuristic continuously refines the feature subset selection process while simultaneously evolving efficient classifiers through a non-converging evolutionary process that favors the selection of features with high discrimination of class labels. We investigated and compared the performance of canonical GP, ALPS, and FSALPS on high-dimensional benchmark classification datasets, including a hyperspectral image. Using Tukey’s HSD ANOVA test at the 95% confidence level, ALPS and FSALPS dominated canonical GP in evolving smaller but efficient trees with fewer bloated expressions. FSALPS significantly outperformed canonical GP, ALPS, and several feature selection strategies reported in the dimensionality reduction literature.
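The frequency-count ranking at the heart of FSALPS can be sketched in a few lines; the population, feature names, and `floor` smoothing constant below are hypothetical, and real FSALPS operates on full GP trees rather than flattened terminal lists.

```python
from collections import Counter

def feature_probabilities(population, features, floor=0.01):
    # FSALPS-style idea (sketch): tally how often each feature appears as
    # a terminal in the evolved trees, then normalize the counts into
    # terminal-selection probabilities. The floor keeps rarely used
    # features selectable so the search does not converge prematurely.
    counts = Counter(term for tree in population for term in tree)
    weights = {f: counts[f] + floor for f in features}
    total = sum(weights.values())
    return {f: w / total for f, w in weights.items()}

# Hypothetical population: each "tree" flattened to its terminal symbols.
population = [["x1", "x3", "x1"], ["x1", "x2"], ["x3", "x1"]]
probs = feature_probabilities(population, ["x1", "x2", "x3", "x4"])
# x1 appears most often, so it receives the highest selection probability;
# the never-used x4 keeps a small floor probability.
```

In the full algorithm these probabilities would bias terminal-symbol selection in the next generation, closing the feedback loop the abstract describes.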

Abstract:

This paper develops a model of short-range ballistic missile defense and uses it to study the performance of Israel’s Iron Dome system. The deterministic base model allows for inaccurate missiles, unsuccessful interceptions, and civil defense. Model enhancements consider the trade-offs in attacking the interception system, the difficulties militants face in assembling large salvos, and the effects of imperfect missile classification by the defender. A stochastic model is also developed. Analysis shows that system performance can be highly sensitive to the missile salvo size, and that systems with higher interception rates are more “fragile” when overloaded. The model is calibrated using publicly available data about Iron Dome’s use during Operation Pillar of Defense in November 2012. If the systems performed as claimed, they saved Israel an estimated 1,778 casualties and $80 million in property damage, and thereby made preemptive strikes on Gaza about 8 times less valuable to Israel. Gaza militants could have inflicted far more damage by grouping their rockets into large salvos, but this may have been difficult given Israel’s suppression efforts. Counter-battery fire by the militants is unlikely to be worthwhile unless they can obtain much more accurate missiles.
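The core overload effect in a deterministic model of this kind can be sketched as follows; `accuracy` and `p_intercept` are illustrative stand-ins, not the paper's Iron Dome calibration.

```python
def expected_leakers(salvo, interceptors, accuracy=0.3, p_intercept=0.85):
    # Deterministic sketch: only accurate ("threatening") missiles are
    # engaged; leakers are threats left unengaged once the interceptor
    # stock is exhausted, plus engagement failures.
    threats = salvo * accuracy
    engaged = min(threats, interceptors)
    return (threats - engaged) + engaged * (1.0 - p_intercept)

small_salvo = expected_leakers(salvo=20, interceptors=10)   # under capacity
large_salvo = expected_leakers(salvo=80, interceptors=10)   # overloaded
# Quadrupling the salvo produces far more than four times the leakers once
# the interceptor stock saturates -- the "fragility" effect noted above.
```

The higher the interception rate, the smaller the leakage below saturation, and hence the sharper the jump once the salvo exceeds capacity, which is why high-performing systems are the most fragile when overloaded.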

Abstract:

Latent variable models in finance originate both in asset pricing theory and in time series analysis. These two strands of the literature appeal to two different concepts of latent structures, both of which are useful for reducing the dimension of a statistical model specified for a multivariate time series of asset prices. In CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns given a small number of state variables. In this paper, we use the concept of the stochastic discount factor (SDF), or pricing kernel, as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of some state variables that summarize their dynamics. In beta pricing models, it is often said that only factorial risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, conditional independence between the contemporaneous returns of a large number of assets given a small number of factors, as in standard factor analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates, and other state variables. We address the general issue of econometric specifications of dynamic asset pricing models, which cover the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals. We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role in the validity of standard CAPM-like stock pricing and preference-free option pricing.

Abstract:

This paper studies testing for a unit root in large-n and large-T panels in which the cross-sectional units are correlated. To model this cross-sectional correlation, we assume that the data are generated by an unknown number of unobservable common factors. We propose unit root tests in this environment and derive their (Gaussian) asymptotic distribution under the null hypothesis of a unit root and under local alternatives. We show that these tests have significant asymptotic power when the model has no incidental trends. However, when there are incidental trends in the model and it is necessary to remove heterogeneous deterministic components, we show that these tests have no power against the same local alternatives. Through Monte Carlo simulations, we provide evidence on the finite-sample properties of these new tests.
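A toy Monte Carlo version of such a data-generating process — a panel whose units follow an AR(1) and load on unobserved common factors — illustrates the setting; the factor specification is an invented simplification, and the pooled AR(1) coefficient below is a crude diagnostic, not the paper's proposed test statistic.

```python
import numpy as np

def simulate_panel(n=50, t=200, rho=1.0, n_factors=1, seed=3):
    # DGP sketch: y_it = rho * y_i,t-1 + e_it, observed with an added
    # common-factor component lambda_i' F_t, which induces correlation
    # across the cross-sectional units. Factors here are i.i.d. for
    # simplicity (illustrative assumption).
    rng = np.random.default_rng(seed)
    factors = rng.standard_normal((t, n_factors))
    loadings = rng.standard_normal((n_factors, n))
    y = np.zeros((t, n))
    for s in range(1, t):
        y[s] = rho * y[s - 1] + rng.standard_normal(n)
    return y + factors @ loadings

def pooled_rho(y):
    # Pooled first-order autoregressive coefficient across all units;
    # values near 1 point towards a unit root.
    return np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

rho_unit = pooled_rho(simulate_panel(rho=1.0))   # null: unit root
rho_stat = pooled_rho(simulate_panel(rho=0.5))   # stationary alternative
```

A valid test in this environment must separate the persistence of the idiosyncratic component from that of the common factors, which is exactly what naive pooled statistics like `pooled_rho` fail to do.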

Abstract:

In an abstract two-agent model, we show that every deterministic joint choice function compatible with the hypothesis that agents act noncooperatively is also compatible with the hypothesis that they act cooperatively. The converse is false.

Abstract:

Uncertainty about the future supply costs of nonrenewable natural resources, such as oil and gas, raises the issue of the choice of supply sources. In a perfectly deterministic world, efficient use of multiple supply sources requires that any given market exhaust the supply it can draw from a low-cost source before moving on to a higher-cost one; supply sources should be exploited in strict sequence of increasing marginal cost, with a high-cost source left untouched as long as a less costly source is available. We find that this may not be the efficient thing to do in a stochastic world. We show that there exist conditions under which it can be efficient to use a risky supply source in order to conserve a cheaper non-risky source. The benefit of doing so comes from the fact that it leaves open the possibility of using the cheaper source instead of the risky one in the event that the latter’s future cost conditions suddenly deteriorate. There are also conditions under which it is efficient to use a more costly non-risky source while a less costly risky source is still available. The reason is that this conserves the less costly risky source for use in the event of a possible future drop in its cost.
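The first condition described above can be made concrete with a stylized two-period example (all numbers invented): even when the non-risky source is cheaper today, drawing on the risky source first can minimize expected total cost.

```python
def expected_total_cost(order, c_safe, c_risky_now, c_risky_hi, p_jump):
    # Two periods, one unit of demand per period, one unit held by each
    # source. The risky source's cost jumps to c_risky_hi in period 2
    # with probability p_jump. Stylized two-period illustration only.
    if order == "safe_first":
        # The risky source must then be used in period 2, at whatever
        # cost prevails.
        return c_safe + (1 - p_jump) * c_risky_now + p_jump * c_risky_hi
    # "risky_first": pay the risky price now, conserving the safe source
    # as a hedge for period 2.
    return c_risky_now + c_safe

cost_safe_first = expected_total_cost("safe_first", c_safe=8,
                                      c_risky_now=10, c_risky_hi=40,
                                      p_jump=0.5)
cost_risky_first = expected_total_cost("risky_first", c_safe=8,
                                       c_risky_now=10, c_risky_hi=40,
                                       p_jump=0.5)
# Even though the safe source is cheaper today (8 < 10), using the risky
# source first is efficient in expectation: 18 versus 33.
```

The myopic "cheapest source first" rule is exactly the deterministic prescription the paper shows can fail once cost uncertainty is introduced.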

Abstract:

Although some are excited about the possibility of using current scientific research into the biological causes of sexual orientation to ground rights claims, I argue that basing rights claims on this research is unwise, because this research, specifically the hormonal, genetic, and structural research, is organized around the inversion assumption, a conceptual scheme within which some aspect of the biology of gay men and lesbians is thought to be inverted along sex lines. While there are many reasons to worry about the use of the inversion assumption, I focus on problems that arise from a further set of claims that must be assumed in order to make the use of the inversion assumption coherent. This further set of assumptions includes the claims (1) that heterosexuality is the standard state, (2) that this standard state is sexually dimorphic, and (3) that it is deterministic. I argue that this set of assumptions is problematic because it has ideological consequences that are both sexist and heterosexist.

Abstract:

Imputation is often used in surveys to handle partial nonresponse. It is well known that treating imputed values as observed values leads to substantial underestimation of the variance of point estimators. To remedy this problem, several variance estimation methods have been proposed in the literature, including adapted resampling methods such as the bootstrap and the jackknife. We define the concept of double robustness for point and variance estimation under both the nonresponse-model and the imputation-model approaches. We place particular emphasis on jackknife variance estimation, which is often used in practice. We study the properties of different jackknife variance estimators under both deterministic and random regression imputation. We first consider the case of simple random sampling; the cases of stratified and unequal-probability sampling are also studied. A simulation study compares several jackknife variance estimation methods in terms of bias and relative stability when the sampling fraction is not negligible. Finally, we establish the asymptotic normality of imputed estimators under deterministic and random regression imputation.
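A minimal sketch of jackknife variance estimation under deterministic regression imputation, for the simple-random-sampling case discussed above: the data, response mechanism, and sample size are invented, and the delete-one replicates redo the imputation so that its effect is reflected in the variance estimate.

```python
import numpy as np

def regression_impute(y, x, miss):
    # Deterministic regression imputation: fit on respondents, fill in
    # fitted values for the nonrespondents.
    slope, intercept = np.polyfit(x[~miss], y[~miss], 1)
    y_imp = y.copy()
    y_imp[miss] = intercept + slope * x[miss]
    return y_imp

def jackknife_variance(y, x, miss):
    # Delete-one jackknife for the imputed mean; the imputation is redone
    # on every replicate so the estimate accounts for the imputation step.
    n = len(y)
    reps = np.array([
        regression_impute(np.delete(y, i), np.delete(x, i),
                          np.delete(miss, i)).mean()
        for i in range(n)
    ])
    return (n - 1) / n * np.sum((reps - reps.mean()) ** 2)

rng = np.random.default_rng(4)
x = rng.uniform(0.0, 1.0, 100)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, 100)
miss = rng.uniform(size=100) < 0.3            # ~30% partial nonresponse
v_jack = jackknife_variance(y, x, miss)
# Naive variance that treats the imputed values as observed data.
v_naive = regression_impute(y, x, miss).var(ddof=1) / 100
```

Comparing `v_jack` with `v_naive` illustrates the phenomenon the abstract starts from: the naive estimator ignores the variability induced by nonresponse and imputation.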

Abstract:

The degree to which arboreality was retained in the locomotor repertoire of Pliocene fossil hominins is still a matter of debate, with studies having focused mainly on phalangeal curvature and limb proportions. Given the recent discovery of DIK-1-1 (A. afarensis) and its associated scapula, a functional study of this bone is of interest, since it is directly involved in the locomotion of almost all hominoids. The goal of this study is to attempt to establish a link between the superoinferior (SI) and anteroposterior (AP) orientation of the glenoid cavity of the scapula and locomotor behaviors in great apes and modern humans. Comparative analyses of adults were carried out to 1) determine whether differences in the studied morphology exist between species, and 2) determine whether these differences can be explained by body size. Ontogenetic analyses were also carried out to see whether the increase in body size during development, and the locomotor changes associated with it, correspond to a change in glenoid cavity orientation. The results show that humans have a glenoid cavity oriented less superiorly than that of the great apes, but that Pongo, although the most arboreal, does not have the most superior orientation. Knuckle-walkers (Pan and Gorilla) are distinguished from the other hominoids by a more inferior orientation of the glenoid surface relative to the scapular spine. Body size does not seem to influence the studied morphology, except occasionally in gorillas. Only humans and male Pongo show an ontogenetic change in the orientation of the glenoid cavity relative to the spine. On the basis of these results, the orientation of the glenoid cavity appears to partially reflect the function of the upper limb in locomotion, but further research is needed.
Keywords: scapula, glenoid cavity, great apes, humans, locomotion, arboreality.