923 results for Exponential financial models
In this paper we describe the main causes of the recent financial crisis as the result of many theoretical, methodological, and practical shortcomings, mostly according to heterodox economists but also including some important orthodox ones. At the theoretical level, there are problems with teaching and using economic models built on overly unrealistic assumptions. On the methodological front, we find the unsuspected shadow of Milton Friedman's 'unrealisticism of assumptions' thesis lurking behind the construction of this kind of model, together with a widespread neglect of methodological issues. Of course, the most evident shortcomings are at the practical level: (i) the huge interests of the participants in the financial markets (banks, central bankers, regulators, rating agencies, mortgage brokers, politicians, governments, executives, economists, etc.), mainly in the US, Canada and Europe, but also in Japan and the rest of the world; (ii) an almost completely free financial and economic market, that is, one (almost) without any regulation or supervision; (iii) decision-making marked by some poorly regarded qualities, such as irresponsibility, ignorance, and inertia; and (iv) difficulties in understanding the current crisis, as well as certain biases directing economic rescues by governments. Following many others, we propose taking this episode as an opportunity to reflect on, and hopefully redirect, economic theory and practice.
Abstract:
This paper addresses the investment decisions, in the presence of financial constraints, of 373 large Brazilian firms from 1997 to 2004, using panel data. A Bayesian econometric model with ridge regression was used to handle multicollinearity among the variables in the model. Prior distributions are assumed for the parameters, classifying the model as random- or fixed-effects. We used a Bayesian approach to estimate the parameters, considering normal and Student-t distributions for the errors, and assumed that the initial values of the lagged dependent variable are not fixed but generated by a random process. The recursive predictive density criterion was used for model comparison. Twenty models were tested, and the results indicated that multicollinearity does influence the values of the estimated parameters. Controlling for capital intensity, financial constraints are found to be more important for capital-intensive firms, probably due to their lower profitability indexes, higher fixed costs, and higher degree of property diversification.
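The abstract does not reproduce the paper's full Bayesian formulation, but the reason ridge-type shrinkage helps under multicollinearity can be illustrated with a minimal hypothetical two-predictor example (all data and values below are invented for illustration):

```python
import math

def ridge_two_predictors(x1, x2, y, k):
    """Solve the normal equations (X'X + kI) b = X'y for two predictors.

    k = 0 gives ordinary least squares; k > 0 gives the ridge estimator.
    Illustrative only -- the paper itself uses a full Bayesian model.
    """
    s11 = sum(a * a for a in x1)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s22 = sum(b * b for b in x2)
    g1 = sum(a * c for a, c in zip(x1, y))
    g2 = sum(b * c for b, c in zip(x2, y))
    a11, a22 = s11 + k, s22 + k
    det = a11 * a22 - s12 * s12
    return ((a22 * g1 - s12 * g2) / det, (a11 * g2 - s12 * g1) / det)

# Nearly collinear predictors: x2 is x1 plus a tiny deterministic wiggle,
# and y carries a small perturbation that OLS amplifies wildly.
n = 30
x1 = [float(i) for i in range(1, n + 1)]
x2 = [v + 0.01 * math.sin(i) for i, v in enumerate(x1)]
y = [a + b + 0.1 * math.cos(i) for i, (a, b) in enumerate(zip(x1, x2))]

b_ols = ridge_two_predictors(x1, x2, y, 0.0)
b_ridge = ridge_two_predictors(x1, x2, y, 1.0)
# The ridge coefficients have a strictly smaller norm than the OLS ones.
```

The ridge norm is guaranteed to shrink as k grows, which is the stabilizing effect the abstract alludes to when it says multicollinearity "does influence the value of the estimated parameters."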
Abstract:
The Conway-Maxwell-Poisson (COMP) distribution, an extension of the Poisson distribution, is a popular model for analyzing count data. For the first time, we introduce a new three-parameter distribution, the so-called exponential-Conway-Maxwell-Poisson (ECOMP) distribution, which contains as sub-models the exponential-geometric and exponential-Poisson distributions proposed by Adamidis and Loukas (Stat Probab Lett 39:35-42, 1998) and Kuş (Comput Stat Data Anal 51:4497-4509, 2007), respectively. The new density function can be expressed as a mixture of exponential density functions. Expansions for the moments, the moment generating function, and some statistical measures are provided. The density function of the order statistics can also be expressed as a mixture of exponential densities, and we derive two formulae for the moments of order statistics. The elements of the observed information matrix are provided. Two applications illustrate the usefulness of the new distribution for analyzing positive data.
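The exponential-geometric sub-model mentioned above has the well-known closed-form density f(x) = λ(1−p)e^(−λx) / (1 − p e^(−λx))², x > 0, and it is indeed a mixture of exponential densities, with weight (1−p)p^k on the Exp((k+1)λ) component. A minimal numerical check of both facts (the parameter values are arbitrary):

```python
import math

def eg_pdf(x, lam, p):
    """Exponential-geometric density of Adamidis & Loukas (1998)."""
    e = math.exp(-lam * x)
    return lam * (1.0 - p) * e / (1.0 - p * e) ** 2

def eg_pdf_mixture(x, lam, p, terms=200):
    """The same density written as a mixture of Exp((k+1)*lam) densities."""
    return sum((1.0 - p) * p ** k * (k + 1) * lam * math.exp(-(k + 1) * lam * x)
               for k in range(terms))

lam, p = 1.5, 0.4
h = 1e-3
grid = [i * h for i in range(1, 40000)]   # covers [0, 40]; the tail is negligible
total = sum(eg_pdf(x, lam, p) for x in grid) * h
mean = sum(x * eg_pdf(x, lam, p) for x in grid) * h
# Closed-form mean of the exponential-geometric distribution.
closed_form_mean = -(1.0 - p) * math.log(1.0 - p) / (lam * p)
```

The Riemann sum of the density comes out near 1, the numerical mean matches the closed form, and the truncated mixture agrees with the closed-form density pointwise.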
Abstract:
The fatigue crack behavior of metals and alloys under constant-amplitude test conditions is usually described by relationships between the crack growth rate da/dN and the stress intensity factor range ΔK. In the present work, an enhanced two-parameter exponential equation of fatigue crack growth was introduced in order to describe the sub-critical crack propagation behavior of Al 2524-T3 alloy, commonly used in aircraft engineering applications. It was demonstrated that, besides adequately correlating the load ratio effects, the exponential model also accounts for the slight deviations from linearity shown by the experimental curves. A comparison with the Elber, Kujawski and "Unified Approach" models confirmed the better performance of the exponential model relative to the other tested models. (C) 2012 Elsevier Ltd. All rights reserved.
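The paper's enhanced two-parameter equation is not given in the abstract. As a generic illustration only, a simple exponential law of the form da/dN = C·exp(m·ΔK) becomes linear after taking logarithms and can be fitted by ordinary least squares (the constants below are invented, not the paper's Al 2524-T3 values):

```python
import math

def fit_exponential_growth_law(delta_k, dadn):
    """Fit da/dN = C * exp(m * dK) by least squares on ln(da/dN)."""
    xs, ys = delta_k, [math.log(v) for v in dadn]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    c = math.exp(ybar - m * xbar)
    return c, m

# Synthetic, noise-free data generated from C = 1e-8, m = 0.5 (hypothetical).
delta_k = [5.0 + i for i in range(26)]          # dK from 5 to 30
dadn = [1e-8 * math.exp(0.5 * dk) for dk in delta_k]
c_hat, m_hat = fit_exponential_growth_law(delta_k, dadn)
```

On noise-free data the fit recovers the generating constants exactly, which is the sense in which a log-linear plot of da/dN versus ΔK diagnoses exponential behavior.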
Abstract:
The issue of assessing variance components is essential in deciding on the inclusion of random effects in the context of mixed models. In this work we discuss this problem for nonlinear elliptical models for correlated data, using the score-type test proposed in Silvapulle and Silvapulle (1995). Being asymptotically equivalent to the likelihood ratio test and requiring estimation only under the null hypothesis, this test provides an easily computable alternative for assessing one-sided hypotheses in the context of the marginal model. To accommodate possibly non-normal distributions, we assume that the joint distribution of the response variable and the random effects lies in the elliptical class, which includes light-tailed and heavy-tailed distributions such as the Student-t, power exponential, logistic, generalized Student-t, generalized logistic, and contaminated normal, as well as the normal itself, among others. We compare the sensitivity of the score-type test under normal, Student-t and power exponential models for the kinetics data set discussed in Vonesh and Carter (1992), fitted using the model presented in Russo et al. (2009). A simulation study is also performed to analyze the consequences of kurtosis misspecification.
Abstract:
In the nutritional context, micromineral supplementation of bird feed is often provided in quantities exceeding requirements, in an attempt to ensure proper animal performance. Dose-response experiments are very common in determining optimal nutrient levels in feed balance and typically rely on regression models for this purpose. Nevertheless, routine regression analysis generally ignores a priori information about the likely ordering of the response with respect to the dose levels. Isotonic regression is a least-squares estimation method that generates estimates preserving the ordering of the data; in isotonic regression theory this ordering information is essential and is expected to increase fitting efficiency. The objective of this work was to use an isotonic regression methodology as an alternative way of analyzing data on Zn deposition in the tibia of male birds of the Hubbard lineage. We considered plateau-response models of quadratic polynomial and linear-exponential forms. In addition, we proposed fitting a logarithmic model to the data, and the efficiency of the methodology was evaluated by Monte Carlo simulations considering different scenarios for the parametric values. Isotonization of the data yielded an improvement in all the fitting-quality parameters evaluated. Among the models used, the logarithmic model presented parameter estimates most consistent with values reported in the literature.
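Isotonic regression itself is a standard technique: the pool-adjacent-violators algorithm (PAVA) produces the least-squares fit that is nondecreasing in the dose. A minimal sketch of the unweighted case (this is the generic algorithm, not the authors' code):

```python
def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y."""
    # Each block holds [sum of values, count]; adjacent blocks whose means
    # violate the ordering are merged, so block means stay nondecreasing.
    blocks = []
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and \
                blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit

# A violation at positions 2-3 is pooled into a common mean of 2.5.
fit = pava([1, 3, 2, 4])   # -> [1.0, 2.5, 2.5, 4.0]
```

The "isotonization" step the abstract refers to is exactly this projection of the raw dose-response means onto the set of monotone sequences.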
Abstract:
A rigorous asymptotic theory for Wald residuals in generalized linear models is not yet available. The authors provide matrix formulae of order O(n⁻¹), where n is the sample size, for the first two moments of these residuals. The formulae can be applied to many regression models widely used in practice. The authors suggest adjusted Wald residuals for these models, with approximately zero mean and unit variance. The expressions were used to analyze a real dataset, and some simulation results indicate that the adjusted Wald residuals are better approximated by the standard normal distribution than the unadjusted ones.
Abstract:
This paper is concerned with energy decay for a class of plate equations with memory and a lower-order perturbation of p-Laplacian type,

u_tt + Δ²u − Δ_p u + ∫₀^t g(t−s) Δu(s) ds − Δu_t + f(u) = 0  in Ω × ℝ⁺,

with simply supported boundary conditions, where Ω is a bounded domain of ℝᴺ, g ≥ 0 is a memory kernel that decays exponentially, and f(u) is a nonlinear perturbation. Without the memory term, this kind of problem models elastoplastic flows.
Abstract:
Over the last decade, Brazil has pioneered an innovative model of branchless banking, known as correspondent banking, involving distribution partnerships between banks, several kinds of retailers, and a variety of other participants. This model has allowed unprecedented growth in bank outreach and has become a reference worldwide. However, despite the extensive number of studies recently devoted to Brazilian branchless banking, there is a clear research gap in the literature: it is still necessary to identify the different business configurations, involving network integration, through which the branchless banking channel can be structured, as well as the way they relate to the range of bank services delivered. Given this gap, our objective is to investigate the relationship between network integration models and the services delivered through the branchless banking channel. Based on twenty interviews with managers involved in the correspondent banking business and on data collected from almost 300 correspondent locations, our research proceeds in two steps. First, we create a qualitative taxonomy that identifies three classes of network integration models. Second, we perform a cluster analysis to explain the groups of financial services that fit each model. By contextualizing correspondents' network integration processes through the lens of transaction cost economics, our results suggest that the more suited the channel is to delivering social-oriented, "pro-poor" services, the more it is controlled by banks. This research offers contributions to managers and policy makers interested in better understanding how different correspondent banking configurations relate to specific portfolios of services. Researchers interested in branchless banking can also benefit from the taxonomy presented and from the transaction cost analysis of this kind of banking channel, which has now been adopted in a number of developing countries around the world. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
This paper presents the results of a simulation using physical objects. This concept integrates the physical dimensions of an entity, such as length, width, and weight, with the usual process-flow paradigm recurrent in discrete event simulation models. Based on a naval logistics system, we applied this technique to the access channel of the largest port in Latin America. The system consists of vessel movements constrained by the access channel's dimensions: vessel length and width dictate whether it is safe to have one or two ships in the channel simultaneously. The proposed methodology delivered an accurate validation of the model, with approximately 0.45% deviation from real data. Additionally, the model supported the design of new terminal operations for Santos, delivering KPIs such as channel utilization, queue time, berth utilization, and throughput capability.
Abstract:
In the first chapter, we consider the joint estimation of objective and risk-neutral parameters for SV option pricing models. We propose a strategy that exploits the information contained in large heterogeneous panels of options, and we apply it to S&P 500 index and index call option data. Our approach breaks the stochastic singularity between contemporaneous option prices by assuming that every observation is affected by measurement error. We evaluate the likelihood function using an MC-IS strategy combined with a particle filter algorithm. The second chapter examines the impact of different categories of traders on market transactions. We estimate a model that takes traders' identities into account at the transaction level, and we find that stock prices follow the direction of institutional trading. These results are obtained with data from an anonymous market. To explain our estimates, we examine the informativeness of a wide set of market variables and find that most of them are unambiguously significant for inferring the identity of traders. The third chapter investigates the relationship between the categories of market traders and three definitions of financial durations. We consider trade, price, and volume durations, and we adopt a Log-ACD model in which we include information on traders at the transaction level. For trade durations, we observe an increase in trading frequency when informed traders and the liquidity provider intensify their presence in the market. For price and volume durations, we find that the same effect depends on the state of market activity. The fourth chapter proposes a strategy for expressing order aggressiveness in quantitative terms. We consider a simultaneous equation model to examine price and volume aggressiveness at Euronext Paris, and we analyse the impact of a wide set of order-book variables on the price-quantity decision.
Abstract:
In this work we address the problem of finding efficient and reliable analytical approximation formulas for the forward implied volatility in LSV models, a problem which reduces to calculating option prices as an expansion around the price of the same financial asset under Black-Scholes dynamics. Our approach involves an expansion of the differential operator whose solution represents the price under local stochastic volatility dynamics. Further calculations then yield an expansion of the implied volatility that requires neither special functions nor anything computationally expensive, giving explicit formulas that are fast to compute and as accurate as possible.
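For context, the quantity being approximated is normally obtained numerically: the implied volatility is the σ at which the Black-Scholes price matches an observed price, and since the call price is strictly increasing in σ it can be recovered by bisection. A baseline sketch (this is the standard numerical inversion the analytical expansions aim to replace, not the paper's method):

```python
import math

def bs_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * cdf(d1) - k * math.exp(-r * t) * cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-6, hi=5.0):
    """Invert the call price (monotone in sigma) by bisection."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip: price an option at sigma = 0.3, then recover that sigma.
price = bs_call(100.0, 95.0, 0.5, 0.01, 0.3)
iv = implied_vol(price, 100.0, 95.0, 0.5, 0.01)
```

Bisection is robust but needs many price evaluations per quote, which is why closed-form expansions of the kind the thesis develops are valuable when whole surfaces must be computed quickly.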