83 results for Earnings and dividend announcements, high frequency data, information asymmetry
Abstract:
This paper reports the results of a 2-year study of water quality in the River Enborne, a rural river in lowland England. Concentrations of nitrogen and phosphorus species and other chemical determinands were monitored both at high frequency (hourly), using automated in situ instrumentation, and by manual weekly sampling and laboratory analysis. The catchment land use is largely agricultural, with a population density of 123 persons km⁻². The river water is largely derived from calcareous groundwater, and there are high nitrogen and phosphorus concentrations. Agricultural fertiliser is the dominant source of annual loads of both nitrogen and phosphorus. However, the data show that sewage effluent discharges have a disproportionate effect on the river nitrogen and phosphorus dynamics. At least 38% of the catchment population use septic tank systems, but the effects are hard to quantify as only 6% are officially registered, and the characteristics of the others are unknown. Only 4% of the phosphorus input and 9% of the nitrogen input are exported from the catchment by the river, highlighting the importance of catchment process understanding in predicting nutrient concentrations. High-frequency monitoring will be a key to developing this vital process understanding.
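The export fractions quoted here (4% of the phosphorus input and 9% of the nitrogen input) rest on annual riverine loads integrated from paired concentration and discharge records. The following minimal sketch uses entirely synthetic hourly data and an assumed catchment input to illustrate that arithmetic only; none of the variable names or values come from the study.

```python
# Hypothetical example: integrate hourly concentration x discharge into an
# annual load and express it as a fraction of an assumed catchment input.
import numpy as np

rng = np.random.default_rng(0)
hours = 24 * 365
flow_m3_s = 0.5 + 0.3 * rng.random(hours)    # hourly river discharge (m^3/s), synthetic
conc_mg_l = 5.0 + 1.0 * rng.random(hours)    # hourly nitrate-N concentration (mg/L), synthetic

# mg/L equals g/m^3, so concentration x discharge gives g/s; integrate over each hour.
hourly_load_kg = conc_mg_l * flow_m3_s * 3600.0 / 1000.0
annual_load_kg = hourly_load_kg.sum()

catchment_input_kg = 500_000.0               # assumed annual nitrogen input to the catchment
print(f"annual riverine load: {annual_load_kg:,.0f} kg")
print(f"exported fraction: {100.0 * annual_load_kg / catchment_input_kg:.1f}%")
```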
Abstract:
Market failure can be corrected using different regulatory approaches ranging from high to low intervention. Recently, classic regulations have been criticized as costly and economically irrational, and thus policy makers are giving more consideration to soft regulatory techniques such as information remedies. However, despite the plethora of food information conveyed by different media, there appears to be a lack of studies exploring how consumers evaluate this information and how trust towards publishers influences their choices for food information. In order to fill such a gap, this study investigates which topics are most relevant to consumers, who should disseminate trustworthy food information, and how communication should be conveyed and segmented. Primary data were collected both through qualitative (in-depth interviews and focus groups) and quantitative research (web and mail surveys). Attitudes, willingness to pay for food information and trust towards public and private sources conveying information through a new food magazine were assessed using both multivariate statistical methods and econometric analysis. The study shows that consumer attitudes towards food information topics can be summarized along three cognitive-affective dimensions: the agro-food system, enjoyment and wellness. Information related to health risks caused by nutritional disorders and food safety issues caused by bacteria and chemical substances is the most important for about 90% of respondents. Food information related to regulations and traditions is also considered important by more than two thirds of respondents, while information about food production and processing techniques, lifestyle and food fads is considered less important by the majority of respondents. Trust towards food information disseminated by public bodies is higher than that observed for private bodies. This behavior directly affects willingness to pay (WTP) for food information provided by public and private publishers when markets are shocked by a food safety incident. WTP for the consumer association (€1.80) and the European Food Safety Authority (€1.30) is higher than WTP for the independent and food-industry publishers, which clusters around zero euros. Furthermore, trust towards the type of publisher also plays a key role in food information market segmentation, together with socio-demographic and economic variables such as gender, age, presence of children and income. These findings invite policy makers to reflect on the possibility of using information remedies conveyed through trusted sources to specific consumer segments as an interesting soft alternative to the classic way of regulating modern food markets.
Abstract:
This paper forecasts daily Sterling exchange rate returns using various naive, linear and non-linear univariate time-series models. The accuracy of the forecasts is evaluated using mean squared error and sign prediction criteria. These show only a very modest improvement over forecasts generated by a random walk model. The Pesaran–Timmermann test and a comparison with artificially generated forecasts show that even the best models display no evidence of market timing ability.
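The two evaluation criteria mentioned, mean squared error and sign prediction, are easy to make concrete. The sketch below uses synthetic daily returns and a toy forecasting rule, with the random-walk benchmark for returns taken as a zero forecast; everything here is an illustrative assumption, not the paper's models or data.

```python
# Hypothetical example: score a forecast against a random-walk benchmark using
# mean squared error, and measure directional (sign) accuracy.
import numpy as np

rng = np.random.default_rng(1)
actual = rng.normal(0.0, 0.006, size=500)                # synthetic daily returns
model_fc = 0.3 * np.concatenate(([0.0], actual[:-1]))    # toy forecast built from the lagged return
rw_fc = np.zeros_like(actual)                            # random walk: expected return of zero

def mse(forecast, outcome):
    return np.mean((forecast - outcome) ** 2)

def sign_hit_rate(forecast, outcome):
    # Fraction of days on which the forecast direction matches the realised direction.
    return np.mean(np.sign(forecast) == np.sign(outcome))

print("MSE, model vs random walk:", mse(model_fc, actual), mse(rw_fc, actual))
print("sign hit rate, model     :", sign_hit_rate(model_fc, actual))
```

A formal check of directional accuracy, such as the Pesaran–Timmermann test, compares this hit rate with what would be expected if forecasts and outcomes were independent.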
Abstract:
We describe some recent advances in the numerical solution of acoustic scattering problems. A major focus of the paper is the efficient solution of high frequency scattering problems via hybrid numerical-asymptotic boundary element methods. We also make connections to the unified transform method due to A. S. Fokas and co-authors, analysing particular instances of this method, proposed by J. A. DeSanto and co-authors, for problems of acoustic scattering by diffraction gratings.
Abstract:
Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees, derived from satellite observations to reduce data volume at the request of data users and to facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the full surface area of the ocean within a grid cell cannot be fully observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability in that subsample. This parameterisation is developed from Advanced Along Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05-0.1 degrees, irrespective of instrument and retrieval algorithm, provided that instrument noise propagated into the SST is accounted for. We also calculate a sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products, using related methods.
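The paper's parameterisation itself is not reproduced here; the sketch below only illustrates the ingredients it names: the fraction of clear-sky pixels in a grid cell, the SST variability of that subsample, and removal of instrument noise in quadrature. The functional form, noise level and all numbers are assumptions for illustration.

```python
# Simplified, assumed form of a cell-level sampling uncertainty; not the
# published parameterisation.
import numpy as np

def sampling_uncertainty(clear_sst, n_total, noise_k=0.08):
    """clear_sst: SSTs (K) of the clear-sky pixels within one grid cell."""
    n_clear = len(clear_sst)
    if n_clear == n_total:
        return 0.0                        # fully observed cell: no sampling uncertainty
    if n_clear < 2:
        return float("nan")               # too few pixels to estimate variability
    geo_var = max(np.var(clear_sst, ddof=1) - noise_k**2, 0.0)   # remove instrument noise
    clear_fraction = n_clear / n_total
    # Standard-error-style estimate that shrinks as coverage improves.
    return float(np.sqrt(geo_var * (1.0 - clear_fraction) / n_clear))

rng = np.random.default_rng(2)
clear_pixels = 290.0 + rng.normal(0.0, 0.15, size=40)   # 40 clear pixels in a 100-pixel cell
print(f"sampling uncertainty: {sampling_uncertainty(clear_pixels, n_total=100):.3f} K")
```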
Abstract:
We consider the problem of scattering of a time-harmonic acoustic incident plane wave by a sound-soft convex polygon. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the computational cost required to achieve a prescribed level of accuracy grows linearly with respect to the frequency of the incident wave. Recently, Chandler-Wilde and Langdon proposed a novel Galerkin boundary element method for this problem for which, by incorporating the products of plane wave basis functions with piecewise polynomials supported on a graded mesh into the approximation space, they were able to demonstrate that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency. Here we propose a related collocation method, using the same approximation space, for which we demonstrate via numerical experiments a convergence rate identical to that achieved with the Galerkin scheme, but with a substantially reduced computational cost.
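The approximation space shared by the Galerkin and collocation methods is built from products of plane-wave factors with piecewise polynomials on a graded mesh. The sketch below shows, under stated assumptions, what one such basis function might look like on a single mesh element parametrised by arc length; it illustrates the general idea and is not the authors' implementation.

```python
# Illustrative hybrid basis function: piecewise polynomial on one element,
# modulated by a plane-wave factor exp(+/- i k s) in the arc-length variable s.
import numpy as np

def hybrid_basis(s, element, degree, direction, k):
    """(Legendre polynomial of given degree on `element`) * exp(1j*direction*k*s).

    s: arc-length points along one polygon side; element: (a, b) support interval;
    direction: +1 or -1 for the two plane-wave families; k: wavenumber.
    """
    a, b = element
    s = np.asarray(s, dtype=float)
    inside = (s >= a) & (s <= b)
    t = np.where(inside, 2.0 * (s - a) / (b - a) - 1.0, 0.0)   # map the element to [-1, 1]
    poly = np.polynomial.legendre.Legendre.basis(degree)(t)
    return np.where(inside, poly * np.exp(1j * direction * k * s), 0.0)

k = 100.0                                    # illustrative nondimensional wavenumber
s_pts = np.linspace(0.0, 1.0, 2001)
phi = hybrid_basis(s_pts, element=(0.2, 0.3), degree=2, direction=+1, k=k)
print(phi[:3])                               # zero outside the supporting element
```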
Abstract:
In this paper we consider the problem of time-harmonic acoustic scattering in two dimensions by convex polygons. Standard boundary or finite element methods for acoustic scattering problems have a computational cost that grows at least linearly as a function of the frequency of the incident wave. Here we present a novel Galerkin boundary element method, which uses an approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh, with smaller elements closer to the corners of the polygon. We prove that the best approximation from the approximation space requires a number of degrees of freedom to achieve a prescribed level of accuracy that grows only logarithmically as a function of the frequency. Numerical results demonstrate the same logarithmic dependence on the frequency for the Galerkin method solution. Our boundary element method is a discretization of a well-known second kind combined-layer-potential integral equation. We provide a proof that this equation and its adjoint are well-posed and equivalent to the boundary value problem in a Sobolev space setting for general Lipschitz domains.
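The graded mesh, with elements shrinking towards the corners of the polygon, is the other key ingredient of this approximation space. Below is a minimal sketch of a geometrically graded mesh on one polygon side; the grading factor and number of layers are illustrative assumptions, not the grading analysed in the paper.

```python
# Hypothetical example: breakpoints on a polygon side [0, L], refined
# geometrically towards the corner at s = 0.
import numpy as np

def graded_mesh(side_length, n_layers, grading=0.5):
    """Element breakpoints on [0, side_length], clustered towards s = 0.

    Each breakpoint lies `grading` times the distance of the previous one
    from the corner; the corner itself is added as the first node.
    """
    pts = side_length * grading ** np.arange(n_layers, -1, -1.0)
    return np.concatenate(([0.0], pts))

print(graded_mesh(side_length=1.0, n_layers=6))
# With grading=0.5 each breakpoint halves its distance from the corner.
```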
Abstract:
In this article we review recent progress on the design, analysis and implementation of numerical-asymptotic boundary integral methods for the computation of frequency-domain acoustic scattering in a homogeneous unbounded medium by a bounded obstacle. The main aim of the methods is to allow computation of scattering at arbitrarily high frequency with finite computational resources.
Abstract:
Current changes in the tropical hydrological cycle, including water vapour and precipitation, are presented over the period 1979-2008 based on a diverse suite of observational datasets and atmosphere-only climate models. Models capture the observed variability in tropical moisture while reanalyses cannot. Observed variability in precipitation is highly dependent upon the satellite instruments employed and shows only cursory agreement with model simulations, primarily relating to the interannual variability associated with the El Niño Southern Oscillation. All datasets display a positive relationship between precipitation and surface temperature but with a large spread. The tendency for wet, ascending regions to become wetter at the expense of dry, descending regimes is in general reproduced. Finally, the frequency of extreme precipitation is shown to rise with warming in the observations and for the model ensemble mean, but with large spread in the model simulations. The influence of the Earth's radiative energy balance in relation to changes in the tropical water cycle is discussed.
Abstract:
An isentropic potential vorticity (PV) budget analysis is employed to examine the role of synoptic transients, advection, and nonconservative processes as forcings for the evolution of the low-frequency PV anomalies locally and those associated with the North Atlantic Oscillation (NAO) and the Pacific–North American (PNA) pattern. Specifically, the rate of change of the low-frequency PV is expressed as a sum of tendencies due to divergence of eddy transport, advection by the low-frequency flow (hereafter referred to as advection), and the residual nonconservative processes. The balance between the variances and covariances of these terms is illustrated using a novel vector representation. It is shown that for most locations, as well as for the PNA pattern, the PV variability is dominantly driven by advection. The eddy forcing explains a small amount of the tendency variance. For the NAO, the role of synoptic eddy fluxes is found to be stronger, explaining on average 15% of the NAO tendency variance. Previous studies have not assessed quantitatively how the various forcings balance the tendency. Thus, such studies may have overestimated the role of eddy fluxes for the evolution of teleconnections by examining, for example, composites and regressions that indicate maintenance, rather than evolution driven by the eddies. The authors confirm this contrasting view by showing that during persistent blocking (negative NAO) episodes the eddy driving is relatively stronger.
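The statement that the eddy forcing explains, on average, 15% of the NAO tendency variance suggests a covariance-based share of variance. The sketch below illustrates one common way such a share can be computed, with synthetic time series standing in for the forcing and tendency terms; the diagnostic and the numbers are assumptions, not the authors' exact analysis.

```python
# Assumed diagnostic: fraction of tendency variance attributable to one forcing,
# measured as cov(forcing, tendency) / var(tendency). With
# tendency = advection + eddy + residual, the three shares sum to one.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
advection = rng.normal(0.0, 1.0, n)        # synthetic forcing time series
eddy      = rng.normal(0.0, 0.4, n)
residual  = rng.normal(0.0, 0.3, n)
tendency  = advection + eddy + residual    # low-frequency PV (or NAO index) tendency

def explained_fraction(forcing, tendency):
    return np.cov(forcing, tendency)[0, 1] / np.var(tendency, ddof=1)

for name, term in [("advection", advection), ("eddy", eddy), ("residual", residual)]:
    print(f"{name:9s}: {explained_fraction(term, tendency):.2f}")
```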
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report which can be found at: http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole, with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06)—since emulated internationally—pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening of access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation plus the skills developed will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.