855 results for Interest rates -- Mathematical models.
Abstract:
The immediacy of this note derives from the recent publication in Hungarian of Thomas Piketty's famous Capital in the Twenty-First Century. The central claim of the book is that, averaged over the past two hundred years, the mean growth rate of the economy, g, falls significantly short of r, the average return on capital. One possible explanation for the famous r > g inequality is that the standard deviation of r is greater than that of g, which may reflect a greater appetite for risk. The implication to draw from this is that financial capital should bear the bulk of the taxes proposed by Piketty.
Abstract:
This Licentiate Thesis presents and discusses new contributions in applied mathematics directed towards scientific computing in sports engineering. It considers inverse problems of biomechanical simulations with rigid-body musculoskeletal systems, especially in cross-country skiing. This contrasts with the main body of research on cross-country skiing biomechanics, which is based mainly on experimental testing alone. The thesis consists of an introduction and five papers. The introduction motivates the context of the papers and places them in a more general framework. Two papers (D and E) consider studies of real questions in cross-country skiing, which are modelled and simulated. The results give some interesting indications concerning these challenging questions, which can be used as a basis for further research; however, the measurements are not accurate enough to give final answers. Paper C is a simulation study that is more extensive than papers D and E and is compared to electromyography measurements from the literature. Validation of biomechanical simulations is difficult, and reducing mathematical errors is one way of getting closer to realistic results. Paper A examines well-posedness for forward dynamics with full muscle dynamics. Paper B is a technical report that describes the problem formulation, mathematical models and simulations of paper A in more detail. The new modelling, together with the simulations, enables new possibilities. As with simulations in other engineering fields, it must be handled with care in order to achieve reliable results. The results in this thesis indicate that mathematical modelling and numerical simulation can be very useful for describing cross-country skiing biomechanics. Hence, this thesis contributes to the possibility of beginning to use and develop such modelling and simulation techniques in this context as well.
Abstract:
This work presents a computational code, called MOMENTS, developed for use in process control to determine a characteristic transfer function of industrial units when radiotracer techniques are applied to study the unit's performance. The methodology is based on measuring the residence time distribution (RTD) function and calculating the first and second temporal moments of the tracer data obtained by two NaI scintillation detectors positioned to register the complete tracer movement inside the unit. Nonlinear regression is used to fit various mathematical models, and a statistical test selects the best result for the transfer function. Using the MOMENTS code, twelve different models can be fitted to a curve and the technical parameters of the unit calculated.
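The abstract does not give the formulas, but the first and second temporal moments of an RTD curve are standard quantities; below is a minimal sketch of how they might be computed from sampled detector data (the time grid and response curve are synthetic placeholders, not data from the MOMENTS code).

```python
import numpy as np

# Synthetic detector data standing in for real tracer counts
t = np.linspace(0, 120, 241)            # sampling times (s)
c = np.exp(-(t - 40.0) ** 2 / 200.0)    # illustrative RTD response curve

def temporal_moments(t, c):
    """First and second temporal moments of an RTD curve c(t)."""
    area = np.trapz(c, t)                          # zeroth moment (normalisation)
    mean = np.trapz(t * c, t) / area               # first moment: mean residence time
    var = np.trapz((t - mean) ** 2 * c, t) / area  # second central moment: variance
    return mean, var

mean_rt, var_rt = temporal_moments(t, c)
print(f"mean residence time = {mean_rt:.2f} s, variance = {var_rt:.2f} s^2")
```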
Abstract:
Economic losses resulting from disease development can be reduced by accurate and early detection of plant pathogens. Early detection can provide the grower with useful information on optimal crop rotation patterns, varietal selection, appropriate control measures, harvest date and post-harvest handling. Classical methods for the isolation of pathogens are commonly used only after disease symptoms appear, which frequently results in a delay in the application of control measures at potentially important periods in crop production. This paper describes the application of both antibody- and DNA-based systems to monitor the infection risk of air- and soil-borne fungal pathogens, and the use of this information with mathematical models describing the risk of disease associated with environmental parameters.
Abstract:
Although tyrosine kinase inhibitors (TKIs) such as imatinib have transformed chronic myelogenous leukemia (CML) into a chronic condition, these therapies are not curative in the majority of cases. Most patients must continue TKI therapy indefinitely, a requirement that is both expensive and that compromises a patient's quality of life. While TKIs are known to reduce leukemic cells' proliferative capacity and to induce apoptosis, their effects on leukemic stem cells, the immune system, and the microenvironment are not fully understood. A more complete understanding of their global therapeutic effects would help us to identify any limitations of TKI monotherapy and to address these issues through novel combination therapies. Mathematical models are a complementary tool to experimental and clinical data that can provide valuable insights into the underlying mechanisms of TKI therapy. Previous modeling efforts have focused on CML patients who show biphasic and triphasic exponential declines in BCR-ABL ratio during therapy. However, our patient data indicates that many patients treated with TKIs show fluctuations in BCR-ABL ratio yet are able to achieve durable remissions. To investigate these fluctuations, we construct a mathematical model that integrates CML with a patient's autologous immune response to the disease. In our model, we define an immune window, which is an intermediate range of leukemic concentrations that lead to an effective immune response against CML. While small leukemic concentrations provide insufficient stimulus, large leukemic concentrations actively suppress a patient's immune system, thus limiting its ability to respond. Our patient data and modeling results suggest that at diagnosis, a patient's high leukemic concentration is able to suppress their immune system. TKI therapy drives the leukemic population into the immune window, allowing the patient's immune cells to expand and eventually mount an efficient response against the residual CML. This response drives the leukemic population below the immune window, causing the immune population to contract and allowing the leukemia to partially recover. The leukemia eventually reenters the immune window, thus stimulating a sequence of weaker immune responses as the two populations approach equilibrium. We hypothesize that a patient's autologous immune response to CML may explain the fluctuations in BCR-ABL ratio that are regularly seen during TKI therapy. These fluctuations may serve as a signature of a patient's individual immune response to CML. By applying our modeling framework to patient data, we are able to construct an immune profile that can then be used to propose patient-specific combination therapies aimed at further reducing a patient's leukemic burden. Our characterization of a patient's anti-leukemia immune response may be especially valuable in the study of drug resistance, treatment cessation, and combination therapy.
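The model equations are not given in the abstract; purely as an illustration of how a nonmonotone immune-stimulation term can encode such an "immune window", here is a minimal hypothetical ODE sketch. The functional forms and parameter values are invented for illustration and are not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters only; not fitted to patient data
p = dict(growth=0.05, tki_kill=0.20,   # leukemic growth rate and TKI-induced kill rate
         expand=0.40, decay=0.10,      # immune expansion and natural decay rates
         kill=0.50, window=1.0)        # immune-mediated kill rate, window scale

def stimulation(L, K):
    """Nonmonotone stimulus: weak for small L, suppressed for large L (the 'immune window')."""
    return L / (1.0 + (L / K) ** 2)

def rhs(t, y):
    L, Z = y  # leukemic concentration, immune effector concentration
    dL = (p["growth"] - p["tki_kill"]) * L - p["kill"] * L * Z
    dZ = p["expand"] * stimulation(L, p["window"]) * Z - p["decay"] * Z
    return [dL, dZ]

# Start above the window: the immune response is suppressed until TKI therapy
# drives the leukemic load down into the window, where Z can expand.
sol = solve_ivp(rhs, (0.0, 400.0), [10.0, 0.01], max_step=0.5)
print("final leukemic load:", sol.y[0, -1], "final immune level:", sol.y[1, -1])
```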
Abstract:
This PhD thesis contains three main chapters on macro finance, with a focus on the term structure of interest rates and the applications of state-of-the-art Bayesian econometrics. Except for Chapter 1 and Chapter 5, which set out the general introduction and conclusion, each of the chapters can be considered as a standalone piece of work. In Chapter 2, we model and predict the term structure of US interest rates in a data rich environment. We allow the model dimension and parameters to change over time, accounting for model uncertainty and sudden structural changes. The proposed time-varying parameter Nelson-Siegel Dynamic Model Averaging (DMA) predicts yields better than standard benchmarks. DMA performs better since it incorporates more macro-finance information during recessions. The proposed method allows us to estimate plausible real-time term premia, whose countercyclicality weakened during the financial crisis. Chapter 3 investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement. Our results suggest that global inflation is the most important factor among global macro fundamentals. Non-fundamental factors are essential in driving global co-movements, and are closely related to sentiment and economic uncertainty. Lastly, we analyze asymmetric spillovers in global bond markets connected to diverging monetary policies. Chapter 4 proposes a no-arbitrage framework of term structure modeling with learning and model uncertainty. The representative agent considers parameter instability, as well as the uncertainty in learning speed and model restrictions. The empirical evidence shows that apart from observational variance, parameter instability is the dominant source of predictive variance when compared with uncertainty in learning speed or model restrictions. When accounting for ambiguity aversion, the out-of-sample predictability of excess returns implied by the learning model can be translated into significant and consistent economic gains over the Expectations Hypothesis benchmark.
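The DMA machinery itself is too involved for a short example, but the Nelson-Siegel factor loadings underlying the time-varying parameter model are standard; below is a minimal sketch with illustrative parameter values, not estimates from the thesis.

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield at maturity tau: level, slope and curvature factors."""
    x = lam * tau
    slope = (1.0 - np.exp(-x)) / x       # loading on the slope factor
    curvature = slope - np.exp(-x)       # loading on the curvature factor
    return beta0 + beta1 * slope + beta2 * curvature

maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0, 30.0])  # years
print(nelson_siegel(maturities, beta0=0.04, beta1=-0.02, beta2=0.01, lam=0.6))
```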
Abstract:
People, animals and the environment can be exposed to multiple chemicals at once from a variety of sources, but current risk assessment is usually carried out for one chemical substance at a time. In human health risk assessment, ingestion of food is considered a major route of exposure to many contaminants, namely mycotoxins, a wide group of fungal secondary metabolites known to potentially cause toxic and carcinogenic outcomes. Mycotoxins are commonly found in a variety of foods, including those intended for consumption by infants and young children, and have been found in processed cereal-based foods available on the Portuguese market. The use of mathematical models, including probabilistic approaches based on Monte Carlo simulations, is a prominent issue in human health risk assessment in general and in mycotoxin exposure assessment in particular. The present study aims to characterize, for the first time, the risk associated with the exposure of Portuguese children to single and multiple mycotoxins present in processed cereal-based foods (CBF). Food consumption data for Portuguese children (0-3 years old, n=103) were collected using a 3-day food diary. Contamination data concerned the quantification of 12 mycotoxins (aflatoxins, ochratoxin A, fumonisins and trichothecenes) in 20 CBF samples marketed in 2014 and 2015 in Lisbon; samples were analyzed by HPLC-FLD, LC-MS/MS and GC-MS. Daily exposure of children to mycotoxins was estimated using deterministic and probabilistic approaches, and different strategies were used to treat the left-censored data. For aflatoxins, as carcinogenic compounds, the margin of exposure (MoE) was calculated as the ratio of the BMDL (benchmark dose lower confidence limit) to the aflatoxin exposure; the magnitude of the MoE gives an indication of the risk level. For the remaining mycotoxins, the exposure estimates were compared to reference dose values (tolerable daily intake, TDI) in order to calculate hazard quotients (the ratio between exposure and a reference dose, HQ). For the cumulative risk assessment of multiple mycotoxins, the concentration addition (CA) concept was used, and the combined margin of exposure (MoET) and the hazard index (HI) were calculated for aflatoxins and the remaining mycotoxins, respectively. 71% of the analyzed CBF samples were contaminated with mycotoxins (at values below the legal limits), and approximately 56% of the studied children consumed CBF at least once during the 3 days. Preliminary results showed that children's exposure to single mycotoxins present in CBF was below the TDI. Aflatoxin MoE and MoET values (around 10000 or more) indicated a reduced potential risk from exposure through consumption of CBF. HQ and HI values for the remaining mycotoxins were below 1. Children are a particularly vulnerable population group with respect to food contaminants, and the present results point to an urgent need to establish legal limits and control strategies regarding the presence of multiple mycotoxins in children's foods in order to protect their health. The development of packaging materials with antifungal properties is a possible solution to control the growth of moulds and consequently reduce mycotoxin production, contributing to guaranteeing the quality and safety of foods intended for consumption by children.
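The risk metrics used above follow standard definitions (HQ = exposure/TDI, HI = sum of HQs under concentration addition, MoE = BMDL/exposure). A minimal sketch with purely illustrative exposure and reference values, not the study's data:

```python
# Hypothetical exposure estimates and reference values (ng per kg body weight per day);
# all numbers are illustrative placeholders, not results from this study.
exposure = {"ochratoxin A": 2.0, "fumonisin B1": 150.0, "deoxynivalenol": 400.0}
tdi      = {"ochratoxin A": 17.0, "fumonisin B1": 1000.0, "deoxynivalenol": 1000.0}

# Hazard quotient per mycotoxin, and hazard index under concentration addition
hq = {m: exposure[m] / tdi[m] for m in exposure}
hi = sum(hq.values())
print(hq, "HI =", round(hi, 2))

# Margin of exposure for a genotoxic carcinogen such as aflatoxin B1
bmdl_afb1 = 170.0        # hypothetical BMDL (ng/kg bw/day), illustrative only
exposure_afb1 = 0.02     # hypothetical exposure estimate
print("MoE =", round(bmdl_afb1 / exposure_afb1))
```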
Abstract:
We summarise the properties and the fundamental mathematical results associated with basic models which describe coagulation and fragmentation processes in a deterministic manner and in which cluster size is a discrete quantity (an integer multiple of some basic unit size). In particular, we discuss Smoluchowski's equation for aggregation, the Becker-Döring model of simultaneous aggregation and fragmentation, and more general models involving coagulation and fragmentation.
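For reference, the discrete Smoluchowski coagulation equation mentioned above can be written in its standard form (generic notation, not tied to any particular paper):

```latex
\frac{\mathrm{d}c_j}{\mathrm{d}t}
  = \frac{1}{2}\sum_{k=1}^{j-1} a_{j-k,\,k}\, c_{j-k}\, c_k
  \;-\; c_j \sum_{k=1}^{\infty} a_{j,\,k}\, c_k ,
  \qquad j = 1, 2, \dots
```

where $c_j(t)$ is the concentration of clusters of size $j$ and $a_{j,k}$ is the coagulation kernel; the first sum counts the creation of $j$-clusters from smaller pieces, and the second their loss through coagulation with other clusters.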
Abstract:
This paper studies monetary policy transmission using several statistical tools. We find that the relationships between the policy interest rate and the financial system's interest rates are positive and statistically significant, and that transmission is complete eight months after policy shocks occur. The speed of transmission varies with the type of interest rate: transmission is faster for interest rates on loans provided to households, and is particularly rapid and complete for rates on preferential commercial loans, whereas it is slower for credit card and mortgage rates due to regulatory issues (interest rate ceilings).
Abstract:
Ecological models written in a mathematical language L(M), or model language, with a given style or methodology can be considered as a text. It is possible to apply statistical linguistic laws, and the experimental results demonstrate that the behaviour of a mathematical model is the same as that of any literary text in any natural language. Such a text has the following characteristics: (a) the variables, their transformed functions and the parameters are the lexic units (LUN) of ecological models; (b) syllables are constituted by a LUN, or a chain of them, separated by operating or ordering LUNs; (c) the flow equations are words; and (d) the distribution of words (LUN and CLUN) according to their lengths follows a Poisson distribution, i.e. Chebanov's law. It is based on Vakar's formula, which is calculated in the same way as the linguistic entropy for L(M). We apply these ideas to practical examples using the MARIOLA model. This paper studies the problem of the lengths of the simple lexic units, composed lexic units and words of text models, expressing these lengths in numbers of primitive symbols and syllables. The use of these linguistic laws makes it possible to indicate the degree of information given by an ecological model.
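As a rough illustration of the length distribution invoked above (Chebanov's law states that word lengths, shifted by one, are approximately Poisson distributed), here is a minimal sketch on invented data; the syllable counts below are placeholders, not lengths extracted from the MARIOLA model.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical syllable counts per word of a model text (illustrative data only)
lengths = np.array([1, 2, 2, 3, 1, 4, 2, 3, 2, 1, 3, 2, 2, 5, 1])

# Chebanov's law: word length minus one is approximately Poisson distributed
lam = (lengths - 1).mean()                         # maximum-likelihood rate estimate
observed = np.bincount(lengths - 1)
expected = poisson.pmf(np.arange(observed.size), lam) * lengths.size

print("lambda =", lam)
print("observed counts:", observed)
print("expected counts:", np.round(expected, 1))
```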
Abstract:
In this thesis we present a mathematical formulation of the interaction between microorganisms, such as bacteria or amoebae, and chemicals often produced by the organisms themselves. This interaction is called chemotaxis and leads to cellular aggregation. We derive several models to describe chemotaxis. The first is the pioneering Keller-Segel parabolic-parabolic model, which is derived within two different frameworks: a macroscopic perspective and a microscopic perspective, in which we start from a stochastic differential equation and perform a mean-field approximation. This parabolic model may be generalized by the introduction of a degenerate diffusion coefficient, which depends on the density itself via a power law. We then derive a model for chemotaxis based on Cattaneo's law of heat propagation with finite speed, which is a hyperbolic model. The last model proposed here is a hydrodynamic model, which takes into account the inertia of the system through a friction force. In the limit of strong friction the model reduces to the parabolic model, whereas in the limit of weak friction we recover a hyperbolic model. Finally, we analyze the instability condition, i.e. the condition that leads to aggregation, and describe the different kinds of aggregates that may be obtained: the parabolic models lead to clusters or peaks, whereas the hyperbolic models lead to the formation of network patterns or filaments. Moreover, we discuss the analogy between bacterial colonies and self-gravitating systems by comparing the chemotactic collapse and the gravitational collapse (Jeans instability).
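For orientation, the parabolic-parabolic Keller-Segel model referred to above is usually written in the following standard form (generic notation):

```latex
\partial_t \rho \;=\; \nabla\cdot\bigl( D_\rho \nabla \rho \;-\; \chi\, \rho\, \nabla c \bigr),
\qquad
\partial_t c \;=\; D_c\, \Delta c \;+\; a\,\rho \;-\; b\, c ,
```

where $\rho(x,t)$ is the cell density, $c(x,t)$ the chemoattractant concentration, $D_\rho$ and $D_c$ the diffusivities, $\chi$ the chemotactic sensitivity, and $a$, $b$ the production and degradation rates of the chemical; the degenerate generalization mentioned in the text replaces $D_\rho$ by a density-dependent coefficient, e.g. $D_\rho(\rho) \propto \rho^{\,m-1}$.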
Abstract:
This thesis comprises three papers that analyze the term structure of interest rates using different data sets and models. Chapter 1 proposes a parametric interest rate model that allows for segmentation and local shocks in the term structure. Using US Treasury data, two versions of this segmented model are implemented. Based on a sequence of 142 forecasting experiments, the proposed models are compared to benchmarks and found to perform better in out-of-sample forecasts, especially for short maturities and for the 12-month forecast horizon. Chapter 2 adds no-arbitrage restrictions to the estimation of a dynamic Gaussian polynomial term structure model for the Brazilian interest rate market. This paper proposes an important approximation for the time series of the term structure risk factors, which allows the extraction of the interest rate risk premium without the need to optimize a full dynamic model. This methodology has the advantage of being easy to implement and provides a good approximation of the term structure risk premium, which can be used in different applications. Chapter 3 models the joint dynamics of nominal and real rates using a no-arbitrage affine term structure model with macroeconomic variables, in order to decompose the difference between nominal and real rates into an inflation risk premium and inflation expectations in the US market. A version without macroeconomic variables and a version with them are implemented; the inflation risk premia obtained are small and stable over the period analyzed, but differ between the two models.
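As a point of reference for Chapter 3, the decomposition involved is the standard one in which (ignoring convexity terms) the nominal-real yield spread splits into expected inflation and an inflation risk premium; in generic notation:

```latex
y^{N}_{t}(\tau) \;-\; y^{R}_{t}(\tau)
  \;=\; \underbrace{\tfrac{1}{\tau}\,\mathbb{E}_{t}\!\left[\pi_{t,t+\tau}\right]}_{\text{expected inflation}}
  \;+\; \underbrace{\phi_{t}(\tau)}_{\text{inflation risk premium}},
```

where $y^{N}_{t}(\tau)$ and $y^{R}_{t}(\tau)$ are the nominal and real zero-coupon yields at maturity $\tau$, $\pi_{t,t+\tau}$ is cumulative inflation over the horizon, and $\phi_t(\tau)$ is the inflation risk premium.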
Abstract:
In this work, the differentiability of the principal eigenvalue $\lambda = \lambda_1(\Gamma)$ of the localized Steklov problem
$$-\Delta u + qu = 0 \ \text{in } \Omega, \qquad \frac{\partial u}{\partial \nu} = \lambda\, \chi_\Gamma(x)\, u \ \text{on } \partial\Omega,$$
where $\Gamma \subset \partial\Omega$ is a smooth subdomain of $\partial\Omega$ and $\chi_\Gamma$ is its characteristic function relative to $\partial\Omega$, is shown. As a key point, the flux subdomain $\Gamma$ is regarded here as the variable with respect to which the differentiation is performed. An explicit formula for the derivative of $\lambda_1(\Gamma)$ with respect to $\Gamma$ is obtained. The lack of regularity up to the boundary of the first derivative of the principal eigenfunctions is a further intrinsic feature of the problem. Therefore, the whole analysis must be done in the weak sense of $H^1(\Omega)$. The study is of interest in mathematical models in morphogenesis.
Abstract:
Financial modelling in the area of option pricing involves understanding the correlations between asset price movements and buy/sell decisions in order to reduce investment risk. Such activities depend on financial analysis tools being available to the trader, with which rapid and systematic evaluations of buy/sell contracts can be made. In turn, these analysis tools rely on fast numerical algorithms for the solution of financial mathematical models. There are many different financial activities apart from the buying and selling of shares. The main aim of this chapter is to discuss a distributed algorithm for the numerical solution of a European option; both linear and non-linear cases are considered. The algorithm is based on the Laplace transform and its numerical inverse. The scalability of the algorithm is examined, and numerical tests are used to demonstrate its effectiveness for financial analysis. Time-dependent functions for volatility and interest rates are also discussed, and applications of the algorithm to the non-linear Black-Scholes equation, where the volatility and the interest rate are functions of the option value, are included. Some qualitative results on the convergence behaviour of the algorithm are examined. The chapter also discusses various computational issues of the Laplace transform method in terms of distributed computing, and introduces the idea of using a two-level temporal mesh in order to achieve distributed computation along the temporal axis. Finally, the chapter ends with some conclusions.
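The chapter's distributed inversion algorithm is not reproduced here; purely to illustrate the general idea of numerically inverting a Laplace transform, the sketch below uses the classical Gaver-Stehfest method (chosen only for illustration, not necessarily the scheme used in the chapter), checked against a known transform pair.

```python
import math

def stehfest_coefficients(N):
    """Gaver-Stehfest weights V_k for an even number of terms N."""
    half = N // 2
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j ** half * math.factorial(2 * j)
                  / (math.factorial(half - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V.append((-1) ** (k + half) * s)
    return V

def invert_laplace(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) via Gaver-Stehfest."""
    ln2 = math.log(2.0)
    V = stehfest_coefficients(N)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

# Sanity check on a known pair: F(s) = 1/(s + 1)  <=>  f(t) = exp(-t)
if __name__ == "__main__":
    for t in (0.5, 1.0, 2.0):
        print(t, invert_laplace(lambda s: 1.0 / (s + 1.0), t), math.exp(-t))
```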