932 results for External Time-varying Reference Consumption Level
Abstract:
This paper exploits a structural time series approach to model the time pattern of multiple and resurgent food scares and their direct and cross-product impacts on consumer response. A structural time series Almost Ideal Demand System (STS-AIDS) is embedded in a vector error correction framework to allow for dynamic effects (VEC-STS-AIDS). Italian aggregate household data on meat demand are used to assess the time-varying impact of the resurgent BSE crises (1996 and 2000) and the 1999 dioxin crisis. The VEC-STS-AIDS model monitors the short-run impacts and performs satisfactorily in terms of residual diagnostics, overcoming the major problems encountered by the customary vector error correction approach.
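As an illustrative sketch (not reproduced from the paper), the linear-approximate AIDS budget-share equation with a local-level stochastic intercept, which is the kind of specification the STS-AIDS label points to, can be written as:

```latex
% Stylized STS-AIDS share equation (illustrative; the paper's exact
% specification may differ).
% w_{it}: budget share of good i; p_{jt}: price of good j;
% x_t: total expenditure; P_t: price index.
\begin{align*}
w_{it} &= \alpha_{it} + \sum_{j} \gamma_{ij} \ln p_{jt}
        + \beta_i \ln\!\left(\frac{x_t}{P_t}\right) + \varepsilon_{it},\\
\alpha_{it} &= \alpha_{i,t-1} + \eta_{it},
\qquad \eta_{it} \sim N(0,\sigma_\eta^2),
\end{align*}
```

so food-scare effects show up as movements in the stochastically evolving intercepts $\alpha_{it}$ rather than as fixed dummy-variable shifts.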
Abstract:
We review the procedures and challenges that must be considered when using geoid data derived from the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission in order to constrain the circulation and water mass representation in an ocean general circulation model. It covers the combination of the geoid information with time-mean sea level information derived from satellite altimeter data, to construct a mean dynamic topography (MDT), and considers how this complements the time-varying sea level anomaly, also available from the satellite altimeter. We particularly consider the compatibility of these different fields in their spatial scale content, their temporal representation, and in their error covariances. These considerations are very important when the resulting data are to be used to estimate ocean circulation and its corresponding errors. We describe the further steps needed for assimilating the resulting dynamic topography information into an ocean circulation model using three different operational forecasting and data assimilation systems. We look at methods used for assimilating altimeter anomaly data in the absence of a suitable geoid, and then discuss different approaches which have been tried for assimilating the additional geoid information. We review the problems that have been encountered and the lessons learned in order to help future users. Finally, we present some results from the use of GRACE geoid information in the operational oceanography community and discuss the future potential gains that may be obtained from a new GOCE geoid.
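The central relationship the review builds on can be summarized with the standard definitions (a sketch in textbook notation, not the paper's own):

```latex
% Standard altimetry/geodesy relations (textbook definitions).
% MSS: time-mean sea surface height from altimetry; N: geoid height;
% both measured relative to the same reference ellipsoid.
\begin{align*}
\mathrm{MDT} &= \mathrm{MSS} - N,\\
\eta(t) &= \mathrm{MDT} + \mathrm{SLA}(t),
\end{align*}
```

so the GOCE geoid supplies $N$, the altimeter supplies the MSS and the sea level anomaly $\mathrm{SLA}(t)$, and the total dynamic topography $\eta(t)$ is what the assimilation systems actually constrain.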
Abstract:
The evaluation of investment fund performance has been one of the main developments of modern portfolio theory. Most studies employ the technique developed by Jensen (1968), which compares a particular fund's returns to a benchmark portfolio of equal risk. However, the standard measures of fund manager performance are known to suffer from a number of problems in practice. In particular, previous studies implicitly assume that the risk level of the portfolio is stationary through the evaluation period; that is, unconditional measures of performance do not account for the fact that risk and expected returns may vary with the state of the economy. Many of the problems encountered in previous performance studies therefore reflect the inability of traditional measures to handle the dynamic behaviour of returns. As a consequence, Ferson and Schadt (1996) suggest an approach called conditional performance evaluation, which is designed to address this problem. This paper applies such a conditional measure of performance to a sample of 27 UK property funds over the period 1987-1998. The results suggest that once the time-varying nature of the funds' betas is corrected for, by the addition of the market indicators, average fund performance shows an improvement over that obtained with traditional methods of analysis.
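Ferson and Schadt's conditional model is usually stated as follows (a standard formulation; the notation here is chosen for illustration):

```latex
% Conditional performance evaluation (Ferson and Schadt, 1996).
% r_{pt}: excess fund return; r_{mt}: excess market return;
% z_{t-1}: vector of demeaned, lagged public information variables.
\begin{equation*}
r_{pt} = \alpha_p + \beta_0\, r_{mt}
       + \mathbf{B}'\left(z_{t-1} \otimes r_{mt}\right) + \varepsilon_{pt},
\end{equation*}
```

so the beta $\beta(z_{t-1}) = \beta_0 + \mathbf{B}' z_{t-1}$ moves with the state of the economy, and $\alpha_p$ measures performance after conditioning on public information.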
Abstract:
Rising sea level is perhaps the most severe consequence of climate warming, as much of the world's population and infrastructure is located near current sea level (Lemke et al. 2007). A major rise of a metre or more would cause serious problems. Such possibilities have been suggested by Hansen and Sato (2011), who pointed out that sea level was several metres higher than now during the Holsteinian and Eemian interglacials (about 250,000 and 120,000 years ago, respectively), even though the global temperature was then only slightly higher than it is nowadays. It is consequently of the utmost importance to determine whether such a sea level rise could occur and, if so, how fast it might happen. Sea level undergoes considerable changes due to natural processes such as the wind, ocean currents and tidal motions. On longer time scales, the sea level is influenced by steric effects (sea water expansion caused by temperature and salinity changes of the ocean) and by eustatic effects caused by changes in ocean mass. Changes in the Earth's cryosphere, such as the retreat or expansion of glaciers and land ice areas, have been the dominant cause of sea level change during the Earth's recent history. During the glacial cycles of the last million years, the sea level varied by a large amount, of the order of 100 m. If the Earth's cryosphere were to disappear completely, the sea level would rise by some 65 m. The scientific papers in the present volume address the different aspects of the Earth's cryosphere and how the different changes in the cryosphere affect sea level change. It represents the outcome of the first workshop held within the new ISSI Earth Science Programme. The workshop took place from 22 to 26 March, 2010, in Bern, Switzerland, with the objective of providing an in-depth insight into the future of mountain glaciers and the large land ice areas of Antarctica and Greenland, which are exposed to natural and anthropogenic climate influences, and their effects on sea level change. The participants of the workshop are experts in different fields including meteorology, climatology, oceanography, glaciology and geodesy; they use advanced space-based observational studies and state-of-the-art numerical modelling.
Abstract:
We investigate the role of the anthropogenic heat flux on the urban heat island of London. To do this, the time-varying anthropogenic heat flux is added to an urban surface-energy balance parametrization, the Met Office–Reading Urban Surface Exchange Scheme (MORUSES), implemented in a 1 km resolution version of the UK Met Office Unified Model. The anthropogenic heat flux is derived from energy-demand data for London and is specified on the model's 1 km grid; it includes variations on diurnal and seasonal time-scales. We contrast a spring case with a winter case, to illustrate the effects of the larger anthropogenic heat flux in winter and the different roles played by thermodynamics in the different seasons. The surface-energy balance channels the anthropogenic heat into heating the urban surface, which warms slowly because of the large heat capacity of the urban surface. About one third of this additional warming goes into increasing the outgoing long-wave radiation and only about two thirds goes into increasing the sensible heat flux that warms the atmosphere. The anthropogenic heat flux has a larger effect on screen-level temperatures in the winter case, partly because the anthropogenic flux is larger then and partly because the boundary layer is shallower in winter. For the specific winter case studied here, the anthropogenic heat flux maintains a well-mixed boundary layer through the whole night over London, whereas the surrounding rural boundary layer becomes strongly stably stratified. This finding is likely to have important implications for air quality in winter. On the whole, inclusion of the anthropogenic heat flux improves the comparison between model simulations and measurements of screen-level temperature slightly and indicates that the anthropogenic heat flux is beginning to be an important factor in the London urban heat island.
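The partitioning described here follows the usual urban surface energy balance; in standard notation (a textbook form, not quoted from the paper):

```latex
% Urban surface energy balance with anthropogenic heat (standard form).
% Q^*: net all-wave radiation; Q_F: anthropogenic heat flux;
% Q_H: sensible heat flux; Q_E: latent heat flux;
% \Delta Q_S: net storage heat flux in the urban fabric.
\begin{equation*}
Q^{*} + Q_F = Q_H + Q_E + \Delta Q_S,
\end{equation*}
```

so adding $Q_F$ first warms the high-heat-capacity urban surface via the storage term, and the warmed surface then re-partitions that energy between extra outgoing long-wave radiation (reducing $Q^*$) and extra sensible heat $Q_H$, as the abstract describes.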
Resumo:
The use of Bayesian inference in the inference of time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing spline based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through tracking chirps and the analysis of musical data.
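A minimal sketch of the general idea (not the authors' algorithm): a bootstrap particle filter with random-walk dynamics on the log-spectrum and a local Whittle likelihood over sliding windows. Window length, particle count and step size are arbitrary choices for the demonstration.

```python
# Illustrative sketch: online tracking of a time-varying spectral
# density with a bootstrap particle filter and a Whittle likelihood.
import numpy as np

def whittle_loglik(log_S, periodogram):
    """Whittle log-likelihood of a local periodogram given a log-spectrum."""
    S = np.exp(log_S)
    return -np.sum(log_S + periodogram / S)

def online_spectrum(x, n_win=64, n_particles=200, rw_sd=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    n_freq = n_win // 2
    # Each particle is a log-spectral density on n_freq Fourier bins.
    particles = np.zeros((n_particles, n_freq))
    estimates = []
    for t in range(n_win, len(x), n_win):
        seg = x[t - n_win:t]
        pgram = np.abs(np.fft.rfft(seg)[1:n_freq + 1]) ** 2 / n_win
        # Propagate: random-walk dynamics on the log-spectrum.
        particles += rng.normal(0.0, rw_sd, particles.shape)
        # Weight each particle by the local Whittle likelihood.
        logw = np.array([whittle_loglik(p, pgram) for p in particles])
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Posterior-mean estimate, then multinomial resampling.
        estimates.append(np.exp(w @ particles))
        particles = particles[rng.choice(n_particles, n_particles, p=w)]
    return np.array(estimates)

# Example: a noisy linear chirp whose spectral peak drifts upward.
t = np.arange(20000)
chirp = np.sin(2 * np.pi * (0.05 + 0.10 * t / len(t)) * t)
noise = 0.1 * np.random.default_rng(1).standard_normal(t.size)
S_hat = online_spectrum(chirp + noise)
```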
Abstract:
We pursue the first large-scale investigation of a strongly growing mutual fund type: Islamic funds. Based on a previously unexplored, survivorship-bias-adjusted data set, we analyse the financial performance and investment style of 265 Islamic equity funds from 20 countries. As Islamic funds often have diverse investment regions, we develop a (conditional) three-level Carhart model to simultaneously control for exposure to different national, regional and global equity markets and investment styles. Consistent with recent evidence for conventional funds, we find Islamic funds to display superior learning in more developed Islamic financial markets. While Islamic funds from these markets are competitive with international equity benchmarks, funds from Western nations with fewer Islamic assets, in particular, tend to significantly underperform. Islamic funds' investment style is somewhat tilted towards growth stocks. Funds from predominantly Muslim economies also show a clear small-cap preference. These results are consistent over time and robust to time-varying market exposures and capital market restrictions.
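An illustrative, unconditional form of such a three-level model can be sketched as follows (the paper's exact factor construction and conditioning information are not reproduced here):

```latex
% Stylized three-level Carhart regression (illustrative).
% r_{it}: excess fund return; MKT^{n}, MKT^{r}, MKT^{g}: excess returns
% on national, regional and global market benchmarks; SMB, HML, MOM:
% size, value and momentum factors.
\begin{equation*}
r_{it} = \alpha_i
       + \beta^{n}_i\,\mathrm{MKT}^{n}_t
       + \beta^{r}_i\,\mathrm{MKT}^{r}_t
       + \beta^{g}_i\,\mathrm{MKT}^{g}_t
       + s_i\,\mathrm{SMB}_t + h_i\,\mathrm{HML}_t + m_i\,\mathrm{MOM}_t
       + \varepsilon_{it},
\end{equation*}
```

with the conditional variant letting the betas depend on lagged information variables, in the spirit of Ferson and Schadt (1996).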
Abstract:
In this paper we study the high-latitude plasma flow variations associated with a periodic (∼8 min) sequence of auroral forms moving along the polar cap boundary, which appear to be the most regularly occurring dayside auroral phenomenon under conditions of southward-directed interplanetary magnetic field. Satellite data on auroral particle precipitation and ionospheric plasma drifts from DMSP F10 and F11 are combined with ground-based optical and ion flow measurements for January 7, 1992. Ionospheric flow measurements at 10-s resolution over the range of invariant latitudes from 71° to 76° were obtained by operating the European incoherent scatter (EISCAT) UHF and VHF radars simultaneously. The optical site (Ny Ålesund, Svalbard) and the EISCAT radar field of view were located in the postnoon sector during the observations. The West Greenland magnetometers provided information about temporal variations of high-latitude convection in the prenoon sector. Satellite observations of polar cap convection in the northern and southern hemispheres show a standard two-cell pattern consistent with a prevailing negative By component of the interplanetary magnetic field. The 630.0 nm auroral forms located poleward of the persistent cleft aurora and the flow reversal boundary in the ∼1440–1540 MLT sector were observed to coincide with magnetosheath-like particle precipitation and a secondary population of higher-energy ions, and they propagated eastward/tailward at speeds comparable with the convection velocity. It is shown that these optical events were accompanied by bursts of sunward (return) flow at lower latitudes in both the morning and the afternoon sectors, consistent with a modulation of Dungey cell convection. The background level of convection was low in this case (Kp = 2+). The variability of the high-latitude convection may be explained as resulting from time-varying reconnection at the magnetopause. In that case, this study indicates that time variations of the reconnection rate effectively modulate ionospheric convection.
Abstract:
This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of São Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and Generalized Pareto Distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p quantiles (for example, p of 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted with several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3 and GPD-4) were fitted to each one of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3 and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both annual cycle and linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. For identification of the best model among the four we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant compared to the null hypothesis of no trend, at about the 98% confidence level. The non-parametric Mann-Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with the p-value being virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of São Paulo have been increasing in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amount has increased by about 40 mm between 1933 and 2005. Copyright (C) 2008 Royal Meteorological Society
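A minimal sketch of this kind of model comparison (illustrative only: it uses synthetic data, a single constant threshold rather than the paper's time-dependent ones, and compares only the GPD-1 and GPD-3 variants):

```python
# Illustrative sketch: ML fit of a GPD to threshold excesses with
# constant scale (GPD-1) versus a log-linear trend in scale (GPD-3),
# compared via bias-corrected AIC (AICc). All data here are synthetic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_days = 26000                       # roughly 73 years of daily data
t = np.arange(n_days) / n_days       # time covariate in [0, 1]
rain = rng.gamma(0.4, 12.0, n_days) * (1.0 + 0.4 * t)  # toy series

u = np.quantile(rain, 0.97)          # constant threshold for simplicity
mask = rain > u
y, ty = rain[mask] - u, t[mask]      # threshold excesses and their times

def nll(params, trend):
    """Negative GPD log-likelihood; scale sigma_t = exp(a + b * t)."""
    if trend:
        a, b, xi = params
    else:
        (a, xi), b = params, 0.0
    sigma = np.exp(a + b * ty)
    z = 1.0 + xi * y / sigma
    if abs(xi) < 1e-8 or np.any(z <= 0):
        return np.inf
    return np.sum(np.log(sigma) + (1.0 / xi + 1.0) * np.log(z))

fit1 = minimize(nll, x0=[np.log(y.mean()), 0.1], args=(False,),
                method="Nelder-Mead")
fit3 = minimize(nll, x0=[np.log(y.mean()), 0.0, 0.1], args=(True,),
                method="Nelder-Mead")

def aicc(fit, k, n):
    """AIC with second-order (small-sample) bias correction."""
    return 2 * fit.fun + 2 * k + 2 * k * (k + 1) / (n - k - 1)

n = y.size
print("AICc GPD-1 (constant scale):", aicc(fit1, 2, n))
print("AICc GPD-3 (linear trend)  :", aicc(fit3, 3, n))  # lower wins
```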
Abstract:
Despite the variety of studies of firms' technological capabilities, the literature still lacks work on the dynamics of sectoral evolution and inter-firm technological development, and on their implications for technical and economic-financial performance. Even scarcer is research tracing the evolution of industrial sectors after the institutional reforms of the 1990s. The focus of this dissertation is therefore to analyse the main features of the evolution of the pulp and paper industry from 1970 to 2004, using as reference points the import-substitution policy and the economic deregulation of the 1990s. Furthermore, the work seeks to evaluate how such industry-level changes have been perceived from a firm's point of view, in terms of the accumulation of technological capabilities and the improvement of economic-financial performance. This linkage is tested and examined in the following firms: Aracruz (Barra do Riacho plant), Klabin (Monte Alegre plant) and Votorantim Celulose e Papel, VCP (Jacareí plant), over the same period as the sectoral-level study. The industry-level study is based on the average annual growth rates of selected variables, while the assessment of technological capabilities follows a methodology already established in the literature, suitably adapted to the pulp and paper case. Similarly, the analysis of the improvement in economic-financial performance is based on a set of industry-specific indicators. The work is thus built upon multiple case studies, drawing on both qualitative and quantitative evidence: interviews, direct observations and firm reports. Finally, it is worth emphasizing that this kind of analysis of sectoral change, combined with the above-mentioned methodology for measuring technological capabilities in an evolving industrial regime, is still lacking for emerging economies, including Brazil. According to the empirical evidence, the reforms of the 1990s had a positive impact on the industry's development, from both the national and the international viewpoint. This transformation was evident at the firm level in the accumulation of technological capabilities and the improvement of economic-financial indicators. Indeed, the results show that the speed of accumulation of technological capabilities within the firms positively influences the performance indicators. On the other hand, these indicators are also related to external factors, such as macroeconomic conditions, which have not been considered in detail here.
Abstract:
The concept of covered interest parity suggests that, in the absence of barriers to arbitrage between markets, the interest rate differential between two assets that are identical in every relevant respect except their currency of denomination should, in the absence of exchange rate risk, be equal to zero. However, once non-diversifiable risks exist, represented by the country risk inherent to emerging economies, investors will demand an interest rate higher than the simple difference between the domestic and foreign interest rates. This study aims to evaluate whether adjusting the covered interest parity condition for risk premia is sufficient to validate the no-arbitrage relationship for the Brazilian market over the period from 2007 to 2010. Country risk contaminates all financial assets issued in a given economy and can be described as the sum of the default risk (or sovereign risk) and the convertibility risk perceived by the market. The no-arbitrage equation was estimated using Ordinary Least Squares regressions, time-varying parameters (TVP) and Recursive Least Squares, and the results obtained are not conclusive on the validity of the covered interest parity relationship, even after adjusting for the risk premium. Measurement errors in the data, transaction costs, and interventions and restrictive policies in the foreign exchange market may have contributed to this result.
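The textbook no-arbitrage condition being tested can be written as follows (standard definitions, with the country risk premium term added as the study describes):

```latex
% Covered interest parity with a country risk premium (standard form).
% i: domestic rate; i^*: foreign rate; F: forward exchange rate;
% S: spot exchange rate; \rho: country risk premium.
\begin{equation*}
(1 + i_t) = (1 + i^{*}_t)\,\frac{F_t}{S_t}\,(1 + \rho_t),
\end{equation*}
```

so that with $\rho_t = 0$ the covered interest differential vanishes, and the study asks whether measured sovereign and convertibility risk premia can account for the observed deviations.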
Abstract:
This paper aims at contributing to the research agenda on the sources of price stickiness, showing that the adoption of nominal price rigidity may be an optimal reaction by firms to consumers' behaviour, even if firms face no adjustment costs. Under regular, broadly accepted assumptions on the behaviour of economic agents, we show that competition among firms can lead to the adoption of sticky prices as a (sub-game perfect) equilibrium strategy. We introduce the concept of a consumption-centers model economy, in which there are several complete markets. We also weaken some traditional assumptions used in standard monetary policy models by assuming that households have imperfect information about the inefficient time-varying cost shocks faced by firms, e.g. those regarding inefficient equilibrium output levels under flexible prices. Moreover, the timing of events is such that, in every period, consumers have access to the actual prices prevailing in the market only after choosing a particular consumption center. Since such choices under uncertainty may decrease the expected utility of risk-averse consumers, competitive firms adopt some degree of price stickiness in order to minimize price uncertainty and "attract more customers".
Abstract:
This paper investigates the relationship between consumer demand and corporate performance in several consumer industries in the UK, using two independent datasets. It uses data on consumer expenditures and the retail price index to estimate Almost Ideal Demand Systems on micro-data and compute time-varying price elasticities of demand for disaggregated commodity groups. It then matches the product definitions to the Standard Industrial Classification and uses the estimated elasticities to investigate the impact of consumer behaviour in firm-level profitability equations. The time-varying household characteristics are ideal instruments for the demand effects in the firms' supply equation. The paper concludes that demand elasticities have a significant and tangible impact on the profitability of UK firms, and that this impact can shed some light on the relationship between market structure and economic performance.
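For reference, the uncompensated price elasticities implied by a linear-approximate AIDS system are commonly computed as follows (a standard textbook formula; the paper's exact variant may differ):

```latex
% Uncompensated (Marshallian) price elasticities in a linear-approximate
% AIDS model (standard formula).
% w_i: budget share; \gamma_{ij}, \beta_i: AIDS parameters;
% \delta_{ij}: Kronecker delta.
\begin{equation*}
e_{ij} = -\delta_{ij} + \frac{\gamma_{ij} - \beta_i w_j}{w_i},
\end{equation*}
```

which makes the elasticities time-varying whenever the budget shares $w_{it}$ move over time.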
Abstract:
This work analyses the relationship between monetary policy and inflation persistence in the recent period, after the introduction of the inflation-targeting regime in Brazil. Using a simplified New Keynesian model, the degree of persistence of the inflation gap is modelled as a function of the weights in the monetary policy rule. The evolution of the Taylor rule over time is compared with the estimated inflation-gap persistence curve, showing that changes in the conduct of monetary policy lead to changes in the level of inflation persistence in the economy. An adaptation of the model, with a Taylor rule that incorporates expectations of the output gap, reaches the same results with greater precision.
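A standard Taylor-type rule of the kind referred to here can be written as follows (a textbook form, not the paper's estimated specification), where $\phi_\pi$ and $\phi_y$ are the policy-rule weights whose evolution is tracked:

```latex
% Standard Taylor-type monetary policy rule (textbook form).
% i_t: policy rate; r^*: equilibrium real rate; \pi_t: inflation;
% \pi^*: inflation target; y_t: output gap.
\begin{equation*}
i_t = r^{*} + \pi_t + \phi_\pi\,(\pi_t - \pi^{*}) + \phi_y\, y_t,
\end{equation*}
```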
Abstract:
Composite resins have been subjected to structural modifications aiming at improved optical and mechanical properties. The present study consisted of an in vitro evaluation of the staining behavior of two nanohybrid resins (NH1 and NH2), a nanoparticulated resin (NP) and a microhybrid resin (MH). Samples of these materials were prepared and immersed in commonly ingested drinks, i.e. coffee, red wine and acai berry, for periods of time varying from 1 to 60 days. Cylindrical samples of each resin were shaped using a metallic die and polymerized for 30 s on both the bottom and the top of each disk. All samples were polished and immersed in the staining solutions. After 24 hours, three samples of each resin immersed in each solution were removed and placed in a spectrophotometer for analysis. To that end, the samples were previously diluted in HCl at 50%. Tukey tests were carried out in the statistical analysis of the results. The results revealed a clear difference in the staining behavior of each material. The nanoparticulated resin did not show better color stability than the microhybrid resin. Moreover, all resins stained with time. The degree of staining decreased in the sequence nanoparticulated, microhybrid, nanohybrid NH2 and NH1. Wine was the most aggressive drink, followed by coffee and acai berry. SEM and image analysis revealed significant porosity on the surface of the MH resin and relatively large pores on an NP sample. The NH2 resin was characterized by a homogeneous dispersion of particles and limited porosity. Finally, the NH1 resin exhibited the lowest porosity level. The results revealed that staining is likely related to the concentration of inorganic particles and to surface porosity.