19 results for Real Interest Rate Differentials

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 100.00%

Abstract:

This article connects Lucas' (1978) asset pricing model with the macroeconomic dynamics of selected countries. Relative risk aversion and impatience for postponing consumption, which together synthesize investor behaviour, can help explain key macroeconomic issues across countries, such as the savings decision and the real interest rate. I find that government consumption worsens the so-called 'equity premium-interest rate puzzle'. The first root of the quadratic function explaining the real interest rate can produce this puzzle, but the second root cannot. Thus, Mehra and Prescott (1985) identified only one possible solution.
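A minimal sketch of the quadratic the abstract alludes to, assuming the standard CRRA/lognormal approximation of the risk-free rate (notation mine, not the paper's):

```latex
% Risk-free rate under CRRA utility with lognormal consumption growth:
r_f \approx \delta + \gamma \mu_c - \tfrac{1}{2}\gamma^{2}\sigma_c^{2}
% Read as a quadratic in the risk-aversion coefficient gamma, an observed
% r_f admits two roots, so only one root need produce the puzzle:
\gamma = \frac{\mu_c \pm \sqrt{\mu_c^{2} + 2\sigma_c^{2}\,(\delta - r_f)}}{\sigma_c^{2}}
```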

Relevance: 100.00%

Abstract:

The purpose of this work is to verify the stability of the relationship between real activity and the interest rate spread. The test is based on Chen (1988) and Osorio and Galea (2006). The analysis is applied to Chile and the United States from 1980 to 1999. In general, in both cases the relationship was statistically significant in the early 1980s, but a break point is found in both countries during that decade, suggesting that the relationship depends on the monetary rule followed by the Central Bank.
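The break-point idea can be sketched with a Chow-type F-test at a candidate split; the paper's actual procedure follows Chen (1988) and Osorio and Galea (2006), so the data, variable names, and test form below are assumptions for illustration only:

```python
# Chow-type structural-break test on the activity-spread regression.
import numpy as np
import statsmodels.api as sm
from scipy.stats import f as f_dist

def chow_test(y, x, break_idx):
    """F-test of a pooled OLS fit against separate fits before and
    after a candidate break point."""
    X = sm.add_constant(x)
    k = X.shape[1]
    rss_pooled = sm.OLS(y, X).fit().ssr
    rss_1 = sm.OLS(y[:break_idx], X[:break_idx]).fit().ssr
    rss_2 = sm.OLS(y[break_idx:], X[break_idx:]).fit().ssr
    n = len(y)
    f = ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))
    return f, f_dist.sf(f, k, n - 2 * k)

# Synthetic example: the spread's predictive power fades mid-sample.
rng = np.random.default_rng(0)
spread = rng.normal(size=80)
beta = np.where(np.arange(80) < 40, 0.8, 0.1)
activity = beta * spread + rng.normal(scale=0.5, size=80)
print(chow_test(activity, spread, break_idx=40))  # (F statistic, p-value)
```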

Relevance: 100.00%

Abstract:

A macrodynamic model is proposed in which the real exchange rate and the elasticity of labour supply interact, defining different trajectories of growth and income distribution in a developing economy. Growth depends on imports of capital goods, which are paid for with exports (there are no capital flows), and is hence constrained by equilibrium in the current account. The role of the elasticity of labour supply is to prevent the real exchange rate from appreciating as the economy grows, thereby sustaining international competitiveness. The model allows for endogenous technological change and considers the impact of migration from the subsistence to the modern sector on the cumulative (Kaldor-Verdoorn) process of learning.
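A stylized rendering of the constraint the abstract describes, in notation I am assuming (the paper's model is richer):

```latex
% No capital flows: imports of capital goods must be paid for by exports,
P_x X_t = P_m M_t,
% and capital accumulation depends on those imports,
\dot{K}_t = M_t - \delta K_t,
% so growth is bounded by export performance, which the real exchange
% rate (kept from appreciating by elastic labour supply) sustains.
```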

Relevance: 100.00%

Abstract:

This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that the model can be solved either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these versions are based is solved recursively, and some aspects of it must be simplified to make an inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, the efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that energy-sector and CO2-price behavior are similar in both versions (in the recursive version we impose the inter-temporal efficiency result that abatement should be allocated over time such that the CO2 price rises at the interest rate). The main difference is that macroeconomic costs are substantially lower in the forward-looking version, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required for solving the model as an optimization problem, such as dropping the full vintaging of the capital stock and including fewer explicit technological options, likely affect the results. Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy, where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented.
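The efficiency condition imposed on the recursive version is a Hotelling-style price path. A toy illustration of that allocation rule; the starting price and interest rate are assumptions, not EPPA outputs:

```python
# CO2 price rising at the interest rate, so the discounted marginal
# abatement cost is equalized across periods.
p0 = 25.0   # initial CO2 price in $/ton (assumed)
r = 0.04    # interest rate (assumed)
for i, year in enumerate(range(2025, 2055, 5)):
    print(f"{year}: ${p0 * (1 + r) ** (5 * i):.2f}/ton")
```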

Relevance: 100.00%

Abstract:

Gaseous N losses from soil are considerable, resulting mostly from ammonia volatilization linked to agricultural activities such as pasture fertilization. Simple and accessible methods for measuring such losses are fundamental for evaluating the N cycle in agricultural systems. The purpose of this study was to evaluate methods for quantifying NH3 volatilization from soil fertilized with surface-applied urea, with minimal influence on the volatilization process. The greenhouse experiment was arranged in a completely randomized design with 13 treatments and five replications: (1) polyurethane foam (density 20 kg m-3) with phosphoric acid solution absorber (foam absorber), installed 1, 5, 10, and 20 cm above the soil surface; (2) paper filter with sulfuric acid solution absorber (paper absorber), at 1, 5, 10, and 20 cm above the soil surface; (3) sulfuric acid solution absorber, at 1, 5, and 10 cm above the soil surface; (4) semi-open static collector; and (5) 15N balance (control). The foam absorber placed 1 cm above the soil surface estimated the real daily rate and the accumulated loss of NH3-N, proving efficient in capturing NH3 volatilized from urea-treated soil. The estimates based on acid absorbers at 1, 5, and 10 cm and paper absorbers at 1 and 5 cm above the soil surface were only realistic for accumulated NH3-N losses. Foam absorbers can thus be recommended for quantifying both accumulated and daily NH3 volatilization losses, similarly to an open static chamber, making calibration equations or correction factors unnecessary.

Relevance: 100.00%

Abstract:

This paper analyzes the factors that influenced the issuing price of debentures in Brazil from 2000 to 2004, applying a factor model in which exogenous variables explain return and price behavior. The variables in this study include rating, choice of index, maturity, country risk, the basic interest rate, the long-term versus short-term rate spread, the stock market index, and the foreign exchange rate. Results indicate that the index variable, probability of default, and bond maturity influence pricing, and point to an association between long-term bonds and better-rated issues.
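A minimal sketch of the kind of factor regression the abstract describes: an issue's pricing explained by exogenous bond characteristics. The column names and data below are illustrative, not the paper's dataset:

```python
# Factor regression of a debenture spread on issue characteristics.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "spread": rng.normal(2.0, 0.5, n),        # issue spread, % (synthetic)
    "rating": rng.integers(1, 10, n),         # ordinal proxy for default risk
    "maturity": rng.uniform(1, 12, n),        # years
    "country_risk": rng.normal(700, 150, n),  # basis points
})
model = smf.ols("spread ~ rating + maturity + country_risk", data=df).fit()
print(model.summary().tables[1])
```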

Relevance: 100.00%

Abstract:

Most models currently used to determine optimal foreign reserve holdings take the level of international debt as given. However, given the sovereign's willingness-to-pay incentive problems, reserve accumulation may reduce sustainable debt levels. In addition, assuming constant debt levels does not allow addressing one of the puzzles behind using reserves as a means to avoid the negative effects of crises: why do sovereign countries not reduce their sovereign debt instead? To study the joint decision of holding sovereign debt and reserves, we construct a stochastic dynamic equilibrium model calibrated to a sample of emerging markets. We find that reserve accumulation does not play a quantitatively important role in this model; in fact, the optimal policy is not to hold reserves at all. This finding is robust to considering interest rate shocks, sudden stops, contingent reserves, and reserve-dependent output costs.
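A stylized one-period budget constraint behind the joint debt/reserves decision, in assumed notation (the paper's model adds default risk, shocks, and output costs):

```latex
% Uses = resources: consumption plus new reserves plus debt repayment
% equal income plus maturing reserves plus new borrowing.
c_t + q^{a} a_{t+1} + b_t = y_t + a_t + q_t\, b_{t+1}
% a: reserves bought at price q^a = 1/(1+r^a); b: one-period bonds sold
% at price q_t, which falls with default risk. Holding reserves while
% indebted is a negative carry whenever q_t < q^a, which is consistent
% with the calibrated optimal policy of holding no reserves at all.
```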

Relevance: 40.00%

Abstract:

An accurate estimate of machining time is very important for predicting delivery times and manufacturing costs, and it helps in production process planning. Most commercial CAM software systems estimate the machining time of milling operations simply by dividing the entire tool path length by the programmed feed rate. This estimate differs drastically from the real process time because the feed rate is not always constant, due to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for estimating milling time when machining free-form geometries. The method considers a variable called machine response time (MRT), which characterizes a real CNC machine's capacity to move at high feed rates in free-form geometries. MRT is a global performance feature that can be obtained for any type of CNC machine configuration by carrying out a simple test. To validate the methodology, a workpiece was used to generate NC programs for five different types of CNC machines, and a practical industrial case study was also carried out. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's capability; furthermore, the greater the MRT, the larger the difference between predicted and real milling time. The proposed method achieved an error range from 0.3% to 12% of the real machining time, whereas the CAM estimate erred by 211% to 1244%. The MRT-based process is also suggested as an instrument to help in machine tool benchmarking.
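A toy contrast between the naive CAM estimate and an MRT-style correction. The correction rule below is my simplification for illustration, not the paper's mechanistic method:

```python
# Naive estimate vs a per-segment floor imposed by machine response time.
def naive_time(path_length_mm, feed_mm_min):
    """The CAM default: path length divided by programmed feed rate."""
    return path_length_mm / feed_mm_min  # minutes

def mrt_adjusted_time(segment_lengths_mm, feed_mm_min, mrt_s):
    """Per segment, the machine cannot complete the move faster than its
    response time allows: take the slower of the programmed move time
    and the MRT for each short free-form segment."""
    total_s = 0.0
    for seg in segment_lengths_mm:
        programmed_s = seg / feed_mm_min * 60
        total_s += max(programmed_s, mrt_s)
    return total_s / 60  # minutes

segments = [0.05] * 20000  # a 1 m path made of 0.05 mm free-form segments
print(naive_time(sum(segments), feed_mm_min=3000))                   # ~0.33 min
print(mrt_adjusted_time(segments, feed_mm_min=3000, mrt_s=0.004))    # ~1.33 min
```

Even with hypothetical numbers, the gap shows why dividing path length by feed rate grossly underestimates time on short-segment free-form paths.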

Relevance: 40.00%

Abstract:

ArtinM is a D-mannose-binding lectin that has aroused increasing interest because of its biomedical properties, especially the stimulation of the Th1 immune response, which confers protection against intracellular pathogens. The potential pharmaceutical applications of ArtinM have motivated the production of its recombinant form (rArtinM), so it is important to compare the sugar-binding properties of jArtinM (the native form) and rArtinM in order to take better advantage of the potential applications of the recombinant lectin. In this work, a biosensor framework based on a quartz crystal microbalance (QCM) was established to comparatively study the activity of the native and recombinant ArtinM proteins. The QCM transducer was strategically functionalized so that a simple model of protein-binding kinetics could be used. This approach allowed the determination of the binding/dissociation kinetic rates and affinity equilibrium constants of both forms of ArtinM with horseradish peroxidase (HRP), an N-glycosylated protein that contains the trimannoside Man alpha 1-3[Man alpha 1-6]Man, a known ligand of jArtinM (Jeyaprakash et al., 2004). Real-time monitoring of binding showed that rArtinM was able to bind HRP, leading to an analytical curve similar to that of jArtinM, with statistically equivalent kinetic rates and affinity equilibrium constants for both forms. The lower reactivity of rArtinM with HRP compared to jArtinM was attributed to a difference in the number of carbohydrate recognition domains (CRDs) per molecule of each lectin form, rather than to a difference in the binding energy per CRD.
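The "simple model of protein-binding kinetics" is presumably of the 1:1 Langmuir type common in QCM work; a sketch under that assumption, with illustrative rate constants:

```python
# 1:1 Langmuir binding: dtheta/dt = k_on*C*(1-theta) - k_off*theta,
# with affinity K_D = k_off / k_on. Closed-form occupancy over time:
import numpy as np

def binding_curve(t, C, k_on, k_off):
    """Fractional surface occupancy for pseudo-first-order association."""
    k_obs = k_on * C + k_off
    theta_eq = k_on * C / k_obs
    return theta_eq * (1 - np.exp(-k_obs * t))

t = np.linspace(0, 600, 7)  # seconds
# Analyte at 1 uM with assumed k_on = 1e4 1/(M s), k_off = 1e-3 1/s:
print(binding_curve(t, C=1e-6, k_on=1e4, k_off=1e-3))
```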

Relevance: 40.00%

Abstract:

Historically, the cure rate model has been used for modeling time-to-event data in which a significant proportion of patients are assumed to be cured of illnesses such as breast cancer, non-Hodgkin lymphoma, leukemia, prostate cancer, melanoma, and head and neck cancer. Perhaps the most popular type of cure rate model is the mixture model introduced by Berkson and Gage [1]. In this model, a certain proportion of the patients are assumed to be cured, in the sense that they do not present the event of interest during a long period of time and can be considered immune to the cause of failure under study. In this paper, we propose a general hazard model which accommodates comprehensive families of cure rate models as particular cases, including the model proposed by Berkson and Gage. The maximum likelihood estimation procedure is discussed. A simulation study analyzes the coverage probabilities of the asymptotic confidence intervals for the parameters. A real data set on children exposed to HIV by vertical transmission illustrates the methodology.
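For reference, the Berkson-Gage mixture model that the proposed general hazard model accommodates as a special case takes the standard form:

```latex
% Berkson-Gage mixture cure model:
S_{\mathrm{pop}}(t) = \pi + (1 - \pi)\, S_0(t)
% pi: cured proportion (immune to the failure cause under study);
% S_0(t): survival function of the susceptible subpopulation.
% As t grows, S_pop(t) plateaus at pi > 0 instead of vanishing.
```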

Relevance: 40.00%

Abstract:

In this paper, we develop a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows the Conway-Maxwell Poisson distribution. This model includes as special cases some of the well-known cure rate models discussed in the literature. Next, we discuss maximum likelihood estimation of the parameters of this cure rate survival model. Finally, we illustrate the usefulness of this model by applying it to a real cutaneous melanoma data set.
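A sketch of the standard competing-causes construction behind such models (notation assumed):

```latex
% N latent causes, each with lifetime survival S(t); the event occurs at
% the minimum lifetime, so the population survival is
S_{\mathrm{pop}}(t) = \sum_{n=0}^{\infty} P(N=n)\, S(t)^{n} = A_N\big(S(t)\big),
% where A_N is the probability generating function of N and the cured
% fraction is p_0 = A_N(0) = P(N=0). With N ~ Conway-Maxwell Poisson,
P(N=n) = \frac{\lambda^{n}}{(n!)^{\nu}\, Z(\lambda,\nu)},
% recovering Poisson (nu = 1), geometric (nu = 0), and Bernoulli
% (nu -> infinity) cure rate models as special cases.
```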

Relevance: 40.00%

Abstract:

This paper describes a visual stimulus generator (VSImG) capable of displaying a gray-scale, 256 x 256 x 8 bitmap image at a frame rate of 500 Hz using a boustrophedonic scanning technique. It is designed for experiments with motion-sensitive neurons of the fly's visual system, where the flicker fusion frequency of the photoreceptors can reach up to 500 Hz. Devices with such a high frame rate are not commercially available but are required if sensory systems with high flicker fusion frequencies are to be studied. The implemented hardware approach provides complete real-time control of the displacement sequence and all the signals needed to drive an electrostatic deflection display. With the use of analog signals, very small, high-resolution displacements, not limited by the image's pixel size, can be obtained. Very slow image displacements with visually imperceptible steps can also be generated, which can be of interest for other vision research experiments. Two different stimulus files can be used simultaneously, allowing the system to generate X-Y displacements on one display or independent movements on two displays, as long as they share the same bitmap image.
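The scan order itself is easy to picture; a minimal sketch of boustrophedonic (serpentine) indexing over a bitmap, purely illustrative and unrelated to the VSImG hardware implementation:

```python
# Serpentine scan: even rows left-to-right, odd rows right-to-left, so
# the beam never flies back across the full screen between rows.
def boustrophedon_indices(width, height):
    for row in range(height):
        cols = range(width) if row % 2 == 0 else range(width - 1, -1, -1)
        for col in cols:
            yield row, col

# (row, col) order for a 4x3 image:
print(list(boustrophedon_indices(4, 3)))
# [(0,0),(0,1),(0,2),(0,3),(1,3),(1,2),(1,1),(1,0),(2,0),...]
```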

Relevance: 30.00%

Abstract:

This article discusses the main aspects of the Brazilian real estate market in order to assess whether it would be attractive for a typical American real estate investor to buy office-building portfolios in Brazil. The article emphasizes: [i] the regulatory frontiers, comparing investment securitization using a typical American REIT structure with the Brazilian solution, the Fundo de Investimento Imobiliário (FII); [ii] the investment quality attributes in the Brazilian market, using an office-building prototype; and [iii] the comparison of risk versus yield generated by an investment in the Brazilian market, using an FII, benchmarked against an existing REIT (office sub-sector) in the US market. We conclude that investing dollars exchanged for Reais [the Brazilian currency] in an FII with a triple-A office-building portfolio in the São Paulo marketplace will yield an annual income and a premium return above an American REIT investment. Even under a highly aggressive scenario, with a strong, persistent detachment of the exchange rate from IGP-M variations and with instabilities affecting income generation, and even adopting a 300-point margin for the Brazil-risk level, the investment opportunity in the Brazilian market, in the segment we have analyzed, outperforms an equivalent investment in the American market.

Relevance: 30.00%

Abstract:

This paper studies the relationship between the debt level and the asset structure of Brazilian agribusiness companies, a current and relevant discussion: evaluating the mechanisms for fund-raising and guarantees. Granger causality tests and vector autoregressions were used to conduct a comparative analysis, applied to a financial database of publicly traded Brazilian agribusiness companies, in particular the Agriculture and Fisheries and the Food and Beverages sectors, over a period of 10 years (1997-2007), using quarterly series available in the Economatica® database. The results demonstrate that changes in leverage generate variations in the tangibility of the companies, a fact that can be explained by the widespread use of funding secured by fiduciary transfer of fixed assets, which facilitates access to credit for agribusiness companies, lengthening payment terms and lowering interest rates.
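A minimal sketch of the kind of Granger test involved, on synthetic quarterly series; the paper uses Economatica data, and the lag order and variables here are assumptions:

```python
# Does lagged leverage help predict tangibility?
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
n = 41  # ~10 years of quarterly observations
leverage = rng.normal(size=n).cumsum()
# Tangibility responds to last quarter's leverage plus noise.
tangibility = 0.6 * np.roll(leverage, 1) + rng.normal(scale=0.5, size=n)
df = pd.DataFrame({"tangibility": tangibility[1:], "leverage": leverage[1:]})

# Tests whether the second column Granger-causes the first.
res = grangercausalitytests(df[["tangibility", "leverage"]], maxlag=4)
print(res[1][0]["ssr_ftest"])  # (F statistic, p-value, df_denom, df_num)
```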

Relevance: 30.00%

Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the focus of many studies in neuroscience. The complex structure of neural networks and its correlation with brain function play a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for exploring and characterizing the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROIs) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI so as to avoid the temporal information loss, inherent in averaging (e.g., to yield a single "representative" time series per ROI), that occurs during the identification of Granger causality; such loss may, in turn, reduce the power to detect connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. Using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first-principal-component estimation from ROIs). The usefulness of the CGA approach on real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach detected significantly more connections between the ROIs than were found using a single representative time series in each ROI.
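A simplified sketch of the motivation: keep several principal components per ROI rather than one average series, then test causality between the component sets. This illustrates the idea only; the authors' CGA estimator uses partial canonical correlation with bootstrap inference rather than the plain VAR Wald test shown here:

```python
# Multiple eigen-time series per ROI, then a joint Granger-causality test.
import numpy as np
from sklearn.decomposition import PCA
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)
T, voxels, n_comp = 200, 30, 3
roi_a = rng.normal(size=(T, voxels))                 # ROI A voxel series
roi_b = 0.5 * np.roll(roi_a, 2, axis=0) + rng.normal(size=(T, voxels))

pcs_a = PCA(n_components=n_comp).fit_transform(roi_a)  # eigen-time series
pcs_b = PCA(n_components=n_comp).fit_transform(roi_b)

model = VAR(np.hstack([pcs_a, pcs_b])).fit(maxlags=2)
# Wald test: do A's components (vars 0-2) jointly Granger-cause B's (3-5)?
print(model.test_causality(caused=[3, 4, 5], causing=[0, 1, 2]).summary())
```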