Abstract:
This paper demonstrates that recent influential contributions to monetary policy imply an emerging consensus whereby neither rigid rules nor complete discretion is found optimal. Instead, middle-ground monetary regimes are gaining support in theoretical models and in policy formulation and implementation: regimes based on rules (operative under 'normal' circumstances) to anchor inflation expectations over the long run, but designed with enough flexibility to mitigate the short-run effects of shocks, with communicated discretion temporarily overriding these rules in 'exceptional' circumstances. The opposition of 'rules versus discretion' has thus reappeared as the synthesis of 'rules cum discretion', in essence as inflation-forecast targeting. But this synthesis is not without major theoretical problems, as we argue in this contribution. Furthermore, recent real-world events have made it obvious that the inflation-targeting strategy of monetary policy, which rests upon the new consensus paradigm in modern macroeconomics, is at best a 'fair weather' model. In today's turbulent economic climate of highly unstable inflation, deep financial crisis and abrupt worldwide economic slowdown, this approach needs serious rethinking, to say the least, if not outright abandonment.
Abstract:
Here, we studied the self-assembly of two peptide amphiphiles, C16-Gly-Gly-Gly-Arg-Gly-Asp (PA 1: C16-GGG-RGD) and C16-Gly-Gly-Gly-Arg-Gly-Asp-Ser (PA 2: C16-GGG-RGDS). We showed that PA 1 and PA 2 self-assemble into nanotapes with an internal bilayer structure. The C16 chains were highly interdigitated within the nanotape cores, while the peptide blocks formed water-exposed β-sheets. PA 1 nanotapes were characterized by a single spacing distribution, corresponding to a more regular internal structure than that of PA 2 nanotapes, which presented two different spacing distributions. We showed that it is possible to obtain homogeneous nanotapes in water by co-assembling PA 1 or PA 2 with the negatively charged diluent C16-Glu-Thr-Thr-Glu-Ser (PA 3: C16-ETTES). The homogeneous tapes formed by PA 1–PA 3 or PA 2–PA 3 mixtures presented a structure similar to that observed for the corresponding pure PA 1 or PA 2 nanotapes. The mixed nanotapes, which were able to form a stabilized matrix containing homogeneously distributed cell-adhesive RGD groups, represent promising materials for designing new cell adhesion substrates.
Abstract:
BnF fr. 95 is a late 13th-century manuscript containing Arthurian romances and other fictional and didactic texts. The Estoire del saint Graal and Merlin section is the most highly illuminated, with a rich marginal iconography, an unusual feature in the illustration of lay works and in these texts' manuscript tradition. This article shows how, in Merlin and its Vulgate Sequel, marginal scenes overlap with subjects widespread in courtly and chivalric vernacular romances, in contrast with Latin and religious works. The reuse of similar patterns in principal and marginal miniatures, examined in the episode of the Battle of Danablaise, where King Arthur fights the Saxon King Rion, highlights the need for a comprehensive reading of text and images, taking into account the mise en page and the different levels of illustration in the manuscript.
Abstract:
We examine the empirical impact of trade openness on the short-run underpricing of initial public offerings (IPOs) using city-level real estate data. This paper represents a first attempt to employ a macroeconomic approach to explain IPO performance. We investigate an openness effect in which urban economic openness (UEO) has a significant impact on productivity and on the prices of both direct and indirect real estate, owing to the productivity gains of companies in more open areas. This in turn positively affects firms' profitability, strengthening confidence in the local real estate market and in future company performance, and reducing the uncertainty of the IPO valuation; as a result, issuers have less incentive to underprice the IPO shares. China provides a suitable experimental ground to study the immense underpricing in developing markets, which cannot be accounted for by firm-specific effects alone. First, Chinese real estate companies show strong geographic patterns, focusing their businesses locally, usually at the city level. Second, the degree of openness is significantly heterogeneous across Chinese cities. Controlling for company-specific variables, location and state ownership, we find evidence that companies whose businesses are located in economically more open areas experience less IPO underpricing. Our results show high explanatory power and are robust to diverse specifications.
Abstract:
A rapid, sensitive and specific method for quantifying ciprofibrate in human plasma using bezafibrate as the internal standard (IS) is described. The sample was acidified with formic acid (88%) prior to extraction. The analyte and the IS were extracted from plasma by liquid-liquid extraction using an organic solvent (diethyl ether/dichloromethane, 70/30 (v/v)). The extracts were analyzed by high performance liquid chromatography coupled with electrospray tandem mass spectrometry (HPLC-MS/MS). Chromatography was performed using a Genesis C18 4 μm analytical column (4.6 x 150 mm i.d.) and a mobile phase consisting of acetonitrile/water (70/30, v/v) with 1 mM acetic acid. The method had a chromatographic run time of 3.4 min and a linear calibration curve over the range 0.1-60 μg/mL (r > 0.99). The limit of quantification was 0.1 μg/mL. The intra- and interday accuracy and precision values of the assay were less than 13.5%. The stability tests indicated no significant degradation. The recovery of ciprofibrate was 81.2%, 73.3% and 76.2% for the 0.3, 5.0 and 48.0 ng/mL standard concentrations, respectively. For ciprofibrate, the optimized declustering potential, collision energy and collision exit potential were -51 V, -16 eV and -5 V, respectively. The method was also validated without the use of the internal standard. This HPLC-MS/MS procedure was used to assess the bioequivalence of two ciprofibrate 100 mg tablet formulations in healthy volunteers of both sexes. The following pharmacokinetic parameters were obtained from the ciprofibrate plasma concentration vs. time curves: AUC(last), AUC(0-168 h), C(max) and T(max). The geometric means with corresponding 90% confidence intervals (CI) for the test/reference percent ratios were 93.80% (90% CI = 88.16-99.79%) for C(max), 98.31% (90% CI = 94.91-101.83%) for AUC(last) and 97.67% (90% CI = 94.45-101.01%) for AUC(0-168 h). Since the 90% CI for the AUC(last), AUC(0-168 h) and C(max) ratios were within the 80-125% interval proposed by the US FDA, it was concluded that the ciprofibrate formulation (Lipless® 100 mg tablet) manufactured by Biolab Sanus Farmaceutica Ltda. is bioequivalent to the Oroxadin® (100 mg tablet) formulation with respect to both the rate and the extent of absorption.
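A minimal sketch (not the authors' code; the Cmax values below are made up for illustration) of the bioequivalence criterion applied above: the geometric mean test/reference ratio and its 90% confidence interval are computed on log-transformed pharmacokinetic parameters and compared with the 80-125% acceptance range.

```python
# Minimal sketch (hypothetical data, paired test/reference values from a
# crossover design): geometric mean ratio and 90% CI on the log scale,
# compared with the 80-125% bioequivalence interval.
import math
from statistics import mean, stdev
from scipy import stats

cmax_test = [1.85, 2.10, 1.95, 2.40, 2.05, 1.90]       # hypothetical test Cmax
cmax_reference = [2.00, 2.05, 1.90, 2.30, 2.20, 1.95]  # hypothetical reference Cmax

log_ratios = [math.log(t) - math.log(r) for t, r in zip(cmax_test, cmax_reference)]
n = len(log_ratios)
se = stdev(log_ratios) / math.sqrt(n)
t_crit = stats.t.ppf(0.95, n - 1)          # 90% CI, i.e. two one-sided 5% tests

gmr = math.exp(mean(log_ratios))           # geometric mean test/reference ratio
lower = math.exp(mean(log_ratios) - t_crit * se)
upper = math.exp(mean(log_ratios) + t_crit * se)

print(f"GMR = {gmr:.3f}, 90% CI = [{lower:.3f}, {upper:.3f}]")
print("Within 80-125%:", 0.80 <= lower and upper <= 1.25)
```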
Abstract:
This article addresses the welfare and macroeconomic effects of fiscal policy in a framework where the government chooses tax rates and the distribution of revenues between consumption and investment. We construct and simulate a model where public consumption affects individuals' utility and public capital is an argument of the production function. The simulations suggest that by simply reallocating expenditures from consumption to investment, the government can increase the equilibrium levels of capital stock, hours worked, output and labor productivity. Furthermore, we show that the magnitude and direction of the long-run impact of fiscal policy depend on the size of the elasticity of output with respect to public capital. If this parameter is high enough, it may be the case that the capital stock, within limits, increases with tax rates.
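For concreteness, the two channels described in this abstract can be written in a standard Barro-style specification; the functional forms below are assumed only for illustration, since the abstract does not spell them out:

\[
U_t = u(c_t, \ell_t) + v\!\left(G^{c}_{t}\right), \qquad
Y_t = A\,K_t^{\alpha} H_t^{1-\alpha} \left(G^{k}_{t}\right)^{\gamma},
\]

where \(G^{c}_{t}\) is public consumption entering utility, \(G^{k}_{t}\) is public capital entering production, and \(\gamma\) is the elasticity of output with respect to public capital, the parameter on which the sign and size of the long-run fiscal effects are said to depend.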
Abstract:
This article aims to present an analysis of current monetary policy practice, the policy of monetary rules. It discusses the historical context in which these practices appeared and their implications, especially that the money supply becomes endogenous rather than exogenous, as in the traditional monetarist view. The new doctrine also makes the extreme assumption that although monetary policy may be effective in the short run, it is neutral in the long run. This hypothesis, as well as its implications for monetary policy, is open to several criticisms, which are developed in this paper.
Abstract:
The motivation for this paper comes from the main results of Carvalho and Schwartzman (2008), in which heterogeneity arises from different price-adjustment rules across sectors. The sectoral moments of the duration of nominal rigidity are sufficient to explain certain monetary effects. Once we agree that heterogeneity is relevant for the study of price rigidity, how could we write a model with as few sectors as possible, yet with the minimum heterogeneity sufficient to produce any desired monetary impact, or indeed any three moments of duration? To answer this question, this paper restricts itself to constant-hazard models and takes the cumulative effect and the short-run dynamics of monetary policy as good ways of summarizing large heterogeneous economies. We show that two sectors are sufficient to summarize the cumulative effects of monetary shocks, and that three-sector economies are good approximations for the dynamics of these effects. Numerical exercises for the short-run dynamics of an economy with information rigidity show that approximating 500 sectors with only 3 produces errors below 3%. That is, if a monetary shock reduces output by 5%, the approximating economy will produce an impact between 4.85% and 5.15%. The same holds for the dynamics produced by money-level shocks in an economy with price rigidity. For shocks to the money growth rate, the maximum approximation error is 2.4%.
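As a minimal illustration of the summary statistics discussed above (the sector weights and durations below are made up, not the paper's calibration), the first three weighted sectoral moments of price-spell duration, which a small approximating economy would be chosen to match, can be computed as:

```python
# Minimal sketch with made-up numbers: first three weighted sectoral moments
# of price-spell duration for a 500-sector economy, the targets that a small
# approximating economy would be required to reproduce.
weights = [1 / 500] * 500                           # equal sector weights
durations = [1 + 11 * i / 499 for i in range(500)]  # spells from 1 to 12 periods

def weighted_moment(w, d, k):
    """k-th raw moment of the cross-sector duration distribution."""
    return sum(wi * di ** k for wi, di in zip(w, d))

target_moments = [weighted_moment(weights, durations, k) for k in (1, 2, 3)]
print("Target moments (1st, 2nd, 3rd):", [round(m, 2) for m in target_moments])
```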
Abstract:
This research argues that Brazil should create conditions for the long-run financing of development. The end of the inflationary process is just a first step. A second step is the development of an institutional framework offering good alternatives to investors with long-run targets, such as pension funds. In particular, trading in the Brazilian market is dominated by preferred shares rather than common shares, which grant more prerogatives to investors who plan to hold them on a long-term basis. This makes the market more volatile, and as a result corporations lose the chance to finance their projects with large amounts of capital and have to rely instead on more debt, which weakens their financial strength.
Abstract:
It is well known that cointegration between the levels of two variables (labeled Yt and yt in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced-rank restrictions) before additional tests on the value of parameters. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001); these represent restrictions on dynamic models which allow several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test developed to investigate the presence of reduced-rank models. Their performance is evaluated in a Monte Carlo exercise. Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking the two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
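To fix ideas, the generic linear PVM discussed above can be written in the standard Campbell-Shiller form (the exact specification varies across applications and is assumed here only for exposition):

\[
Y_t = \theta(1-\delta)\sum_{j=0}^{\infty}\delta^{j}\,E_t\,y_{t+j}, \qquad 0<\delta<1 .
\]

Both necessary conditions follow from this expression: the spread \(S_t = Y_t - \theta y_t\) must be stationary when \(y_t\) is I(1), which is the cointegration requirement, and replacing \(E_t\,y_{t+j}\) by realized values introduces a forecast error that, under rational expectations, has zero mean conditional on information dated t or earlier; these are the orthogonality conditions that the GMM-based tests exploit.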
Abstract:
This paper analyzes the evidence of long-run relationships between the real exchange rate (RER), the international investment position (NFA) and the Balassa-Samuelson effect (PREL) in a group of 28 countries at different stages of development. The methodology used was cointegration testing. The tests applied were developed by Bierens (1997), a nonparametric test, and by Saikkonen and Lütkepohl (2000a, b, c), a test that consists of first estimating a deterministic term. Evidence of cointegration is found, in both tests, for most of the countries studied. However, there were relevant differences between the results obtained from the two tests. These differences, as well as some special cases of countries that showed no evidence of cointegration, call for deeper analysis of the long-run behavior of the three variables studied.
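As an illustration only: the Bierens (1997) and Saikkonen-Lütkepohl tests used in the paper are not available in standard Python libraries, so the sketch below shows the general shape of a cointegration test (Johansen's trace test from statsmodels) applied to three simulated series standing in for RER, NFA and PREL.

```python
# Illustrative sketch with simulated data (not the paper's tests or data):
# Johansen trace test for cointegration among three series sharing a trend.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
T = 200
trend = np.cumsum(rng.normal(size=T))       # one shared stochastic trend
rer = trend + rng.normal(size=T)            # hypothetical RER
nfa = 0.5 * trend + rng.normal(size=T)      # hypothetical NFA
prel = -0.3 * trend + rng.normal(size=T)    # hypothetical PREL

result = coint_johansen(np.column_stack([rer, nfa, prel]), det_order=0, k_ar_diff=1)

# Trace statistic vs. 5% critical value for each null hypothesis of rank <= r.
for r, (stat, cv) in enumerate(zip(result.lr1, result.cvt[:, 1])):
    print(f"H0: rank <= {r}: trace = {stat:.2f}, 5% critical value = {cv:.2f}")
```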
Abstract:
This paper aims to estimate an empirical model relating advertising expenditures to firms' revenue, so that it can serve as a decision-making tool; to this end we carry out a case study of the telecommunications industry. The communication (advertising) industry in Brazil, according to 2008 IBGE data, accounts for 4% of GDP, generating revenues of around 115 billion reais, with 113 thousand firms that create 711 thousand jobs, employ 866 thousand people and pay 11.8 billion in wages and payroll charges. Nevertheless, most marketing managers report having no instruments to measure the impact of their actions on company results. The empirical model is estimated on monthly data for Embratel's domestic long-distance call services from January 2009 to December 2011. This information is usually unavailable and could only be used under a confidentiality agreement. Using cointegration techniques, we estimated the long-run elasticity of revenue with respect to advertising expenditures and to price, as well as the respective speeds of adjustment to short-run deviations. The results suggest that revenue responds positively to variations in advertising expenditures, although the percentage is relatively low; through the Dorfman-Steiner theorem we indicate that the optimal ratio of advertising expenditures to revenue would be approximately 20%, subject to the limitations of the model.
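The Dorfman-Steiner theorem invoked at the end of the abstract states that, at the profit-maximizing optimum, the advertising-to-revenue ratio equals the ratio of the advertising elasticity of demand to the absolute price elasticity of demand:

\[
\frac{A}{p\,q} = \frac{\varepsilon_{A}}{\lvert\varepsilon_{p}\rvert},
\]

where \(A\) is advertising expenditure and \(p\,q\) is revenue; the roughly 20% figure reported above is this optimal ratio evaluated at the estimated long-run elasticities.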
Abstract:
This paper studies the macroeconomic implications of the existence of the BNDES in the economy. We build a DSGE model incorporating the characteristics of the BNDES and run exercises on it. This is the first paper to analyze the short-run impact of the BNDES, which is its central contribution. We find that the BNDES amplifies productivity shocks to the economy and reduces the effectiveness of monetary policy.
Abstract:
Exchange rate misalignment assessment has become more relevant in the recent period, particularly after the financial crisis of 2008. There are different methodologies to address real exchange rate misalignment. Real exchange rate misalignment is defined as the difference between the actual real effective exchange rate and some equilibrium norm. Different norms are available in the literature. Our paper aims to contribute to the literature by showing that the Behavioral Equilibrium Exchange Rate (BEER) approach adopted by Clark & MacDonald (1999), Ubide et al. (1999), Faruqee (1994), Aguirre & Calderón (2005) and Kubota (2009), among others, can be improved in the following two ways. The first consists of jointly modeling the real effective exchange rate, the trade balance and the net foreign asset position. The second has to do with the possibility of explicitly testing the over-identifying restrictions implied by economic theory, allowing the analyst to show that these restrictions are not falsified by the empirical evidence. If the economics-based identifying restrictions are not rejected, it is also possible to decompose exchange rate misalignment into two pieces, one related to the long-run fundamentals of the exchange rate and the other related to external account imbalances. We also discuss some necessary conditions that should be satisfied for discarding trade balance information without compromising the exchange rate misalignment assessment. A statistical (but not theoretical) identifying strategy for calculating exchange rate misalignment is also discussed. We illustrate the advantages of our approach by analyzing the Brazilian case. We show that the traditional approach disregards important information on external accounts equilibrium for this economy.
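In the BEER approach described above, the misalignment measure can be written (the notation is assumed here for exposition, not taken from the paper) as the gap between the observed real effective exchange rate and its fitted equilibrium norm:

\[
mis_t = q_t - \hat{q}_t, \qquad \hat{q}_t = \hat{\beta}'F_t,
\]

where \(q_t\) is the (log) real effective exchange rate, \(F_t\) collects the long-run fundamentals (extended in this paper to include the trade balance and the net foreign asset position), and \(\hat{\beta}\) is the estimated cointegrating vector. The decomposition proposed in the paper then splits \(mis_t\) into a part attributable to fundamentals and a part attributable to external-account imbalances.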
Abstract:
This thesis contains three chapters. The first chapter uses a general equilibrium framework to simulate and compare the long-run effects of the Patient Protection and Affordable Care Act (PPACA) and of health care cost reduction policies on macroeconomic variables, the government budget, and the welfare of individuals. We found that all policies were able to reduce the uninsured population, with the PPACA being more effective than cost reductions. The PPACA increased the public deficit, mainly due to the Medicaid expansion, forcing tax hikes. On the other hand, cost reductions alleviated the fiscal burden of public insurance, reducing the public deficit and taxes. Regarding welfare effects, the PPACA as a whole and cost reductions are welfare improving. High welfare gains would be achieved if U.S. medical costs followed the same trend as in OECD countries. Besides, feasible cost reductions are more welfare improving than most of the PPACA components, proving to be a good alternative. The second chapter documents that life cycle general equilibrium models with heterogeneous agents have a very hard time reproducing the American wealth distribution. A common assumption made in this literature is that all young adults enter the economy with no initial assets. In this chapter, we relax this assumption, which is not supported by the data, and evaluate the ability of an otherwise standard life cycle model to account for U.S. wealth inequality. The new feature of the model is that agents enter the economy with assets drawn from an initial distribution of assets. We found that heterogeneity with respect to initial wealth is key for this class of models to replicate the data. According to our results, American inequality can be explained almost entirely by the fact that some individuals are lucky enough to be born into wealth, while others are born with few or no assets. The third chapter notes that a common assumption adopted in life cycle general equilibrium models is that the population is stable at the steady state, that is, that its relative age distribution becomes constant over time. An open question is whether the demographic assumptions commonly adopted in these models in fact imply that the population becomes stable. In this chapter we prove the existence of a stable population in a demographic environment where both the age-specific mortality rates and the population growth rate are constant over time, the setup commonly adopted in life cycle general equilibrium models. Hence, the stability of the population does not need to be taken as an assumption in these models.
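For reference, the stable-population property established in the third chapter can be stated, in the discrete-age notation commonly used in life cycle models (the notation is assumed here), as the existence of a time-invariant age distribution: with constant survival probabilities and a constant population growth rate \(n\), the stable share of age-\(j\) individuals is

\[
\mu_j = \frac{s_j\,(1+n)^{-j}}{\sum_{k=0}^{J} s_k\,(1+n)^{-k}}, \qquad j = 0,\dots,J,
\]

where \(s_j\) is the probability of surviving to age \(j\) implied by the constant age-specific mortality rates and \(J\) is the maximum age.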