917 results for: Earnings and dividend announcements, high frequency data, information asymmetry
Abstract:
Sub-seasonal variability, including equatorial waves, significantly influences dehydration and transport processes in the tropical tropopause layer (TTL). This study investigates wave activity in the TTL in 7 reanalysis data sets (RAs; NCEP1, NCEP2, ERA40, ERA-Interim, JRA25, MERRA, and CFSR) and 4 chemistry climate models (CCMs; CCSRNIES, CMAM, MRI, and WACCM) using zonal wavenumber-frequency spectral analysis with equatorially symmetric-antisymmetric decomposition. Analyses are made for temperature and horizontal winds at 100 hPa in the RAs and CCMs, and for outgoing longwave radiation (OLR), a proxy for the convective activity that generates tropopause-level disturbances, in satellite data and the CCMs. Particular focus is placed on equatorial Kelvin waves, mixed Rossby-gravity (MRG) waves, and the Madden-Julian Oscillation (MJO). Wave activity is defined as the variance, i.e., the power spectral density integrated over a particular zonal wavenumber-frequency region. It is found that the TTL wave activities differ significantly among the RAs, ranging from 0.7 (for NCEP1 and NCEP2) to 1.4 (for ERA-Interim, MERRA, and CFSR) with respect to the average over the RAs. The TTL activities in the CCMs lie generally within the range of those in the RAs, with a few exceptions. However, the spectral features in OLR for all the CCMs are very different from those in the observations, and the OLR wave activities are too low for CCSRNIES, CMAM, and MRI. It is concluded that the broad range of wave activity found in the different RAs decreases our confidence in their validity, and in particular their value for validating CCM performance in the TTL, thereby limiting our quantitative understanding of dehydration and transport processes in the TTL.
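The wave-activity metric described above (variance as power spectral density integrated over a zonal wavenumber-frequency window, after equatorially symmetric/antisymmetric decomposition) can be sketched in a few lines. This is an illustrative NumPy reconstruction, not the study's code; the field layout, normalisation, and window bounds are assumptions.

```python
import numpy as np

def wave_activity(field, k_range, f_range, dt=1.0):
    """Wave activity: variance from the power spectrum integrated over a
    zonal wavenumber-frequency window.

    field   : array (time, lat, lon), lat grid assumed symmetric about the equator
    k_range : (k_min, k_max) absolute zonal wavenumbers to integrate over
    f_range : (f_min, f_max) absolute frequencies (cycles per time unit)
    Returns (symmetric, antisymmetric) activity.
    """
    # Decompose into equatorially symmetric and antisymmetric components
    flipped = field[:, ::-1, :]            # mirror about the equator
    sym = 0.5 * (field + flipped)
    asym = 0.5 * (field - flipped)

    nt, _, nlon = field.shape
    freqs = np.fft.fftfreq(nt, d=dt)       # frequencies along the time axis
    ks = np.fft.fftfreq(nlon) * nlon       # integer zonal wavenumbers

    # Select the spectral window (both signs of frequency/wavenumber)
    mask = ((np.abs(ks)[None, :] >= k_range[0]) & (np.abs(ks)[None, :] <= k_range[1]) &
            (np.abs(freqs)[:, None] >= f_range[0]) & (np.abs(freqs)[:, None] <= f_range[1]))

    def activity(x):
        # 2-D FFT in (time, lon), power averaged over latitude,
        # normalised so the full-spectrum sum equals the variance
        spec = np.fft.fft2(x, axes=(0, 2))
        psd = (np.abs(spec) ** 2).mean(axis=1)
        return psd[mask].sum() / (nt * nlon) ** 2

    return activity(sym), activity(asym)
```

A single equatorially symmetric plane wave of unit amplitude inside the window yields a symmetric activity of 0.5 (the variance of a cosine) and zero antisymmetric activity.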
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, design experiments, and analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log(2) units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and of a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
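The multidimensional scaling step used above to expose structure among samples can be illustrated with classical (Torgerson) MDS. The sketch below is not the pipeline's R function; the input layout (a samples-by-genes matrix of log2 signals) is an assumption.

```python
import numpy as np

def classical_mds(X, n_components=2):
    """Classical MDS: embed samples so that pairwise Euclidean distances
    between rows of X are preserved as well as possible.

    X : (n_samples, n_features) matrix, e.g. log2 expression profiles
    Returns (n_samples, n_components) coordinates.
    """
    # Pairwise squared Euclidean distances between samples
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    n = sq.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ sq @ J                    # double-centred Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues ascending
    idx = np.argsort(w)[::-1][:n_components] # keep the largest components
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0, None))
```

When the samples genuinely lie in a low-dimensional configuration, the embedding reproduces their pairwise distances exactly, which is the property that makes the sample map robust.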
Abstract:
Many macroeconomic series, such as U.S. real output growth, are sampled quarterly, although potentially useful predictors are often observed at a higher frequency. We look at whether a mixed-data sampling (MIDAS) approach can improve forecasts of output growth. The MIDAS specification used in the comparison includes an autoregressive term in a novel way. We find that the use of monthly data on the current quarter leads to significant improvements in forecasting current- and next-quarter output growth, and that MIDAS is an effective way to exploit monthly data compared with alternative methods.
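The general shape of a MIDAS regression with an autoregressive term can be sketched as below: monthly predictors enter through a parsimonious lag polynomial (here the common exponential Almon form). This is a stylised illustration of the idea, not the paper's exact specification; the parameter names are illustrative.

```python
import numpy as np

def exp_almon(theta1, theta2, n_lags):
    """Exponential Almon lag weights, normalised to sum to one."""
    j = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * j + theta2 * j ** 2)
    return w / w.sum()

def midas_predict(y_lag, x_monthly, params):
    """One-step MIDAS-AR forecast of quarterly growth.

    y_lag     : previous quarter's growth (autoregressive term)
    x_monthly : recent monthly observations, most recent first
    params    : (const, ar_coef, beta, theta1, theta2)
    """
    c, rho, beta, t1, t2 = params
    w = exp_almon(t1, t2, len(x_monthly))
    return c + rho * y_lag + beta * (w @ x_monthly)
```

With theta1 = theta2 = 0 the weights are flat and the monthly block collapses to a simple average, which is a useful sanity check; in estimation the thetas are chosen by nonlinear least squares.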
Abstract:
Large changes in the extent of northern subtropical arid regions during the Holocene are attributed to orbitally forced variations in monsoon strength and have been implicated in the regulation of atmospheric trace gas concentrations on millennial timescales. Models that omit biogeophysical feedback, however, are unable to account for the full magnitude of African monsoon amplification and extension during the early to middle Holocene (9500–5000 years B.P.). A data set describing land-surface conditions 6000 years B.P. on a 1° × 1° grid across northern Africa and the Arabian Peninsula has been prepared from published maps and other sources of palaeoenvironmental data, with the primary aim of providing a realistic lower boundary condition for atmospheric general circulation model experiments similar to those performed in the Palaeoclimate Modelling Intercomparison Project. The data set includes information on the percentage of each grid cell occupied by specific vegetation types (steppe, savanna, xerophytic woods/scrub, tropical deciduous forest, and tropical montane evergreen forest), open water (lakes), and wetlands, plus information on the flow direction of major drainage channels for use in large-scale palaeohydrological modeling.
Abstract:
Data assimilation (DA) systems are evolving to meet the demands of convection-permitting models in the field of weather forecasting. On 19 April 2013 a special interest group meeting of the Royal Meteorological Society brought together UK researchers looking at different aspects of the data assimilation problem at high resolution, from theory to applications, and researchers creating our future high resolution observational networks. The meeting was chaired by Dr Sarah Dance of the University of Reading and Dr Cristina Charlton-Perez from the MetOffice@Reading. The purpose of the meeting was to help define the current state of high resolution data assimilation in the UK. The workshop assembled three main types of scientists: observational network specialists, operational numerical weather prediction researchers and those developing the fundamental mathematical theory behind data assimilation and the underlying models. These three working areas are intrinsically linked; therefore, a holistic view must be taken when discussing the potential to make advances in high resolution data assimilation.
Abstract:
In this paper we propose and analyse a hybrid numerical-asymptotic boundary element method for the solution of problems of high frequency acoustic scattering by a class of sound-soft nonconvex polygons. The approximation space is enriched with carefully chosen oscillatory basis functions; these are selected via a study of the high frequency asymptotic behaviour of the solution. We demonstrate via a rigorous error analysis, supported by numerical examples, that to achieve any desired accuracy it is sufficient for the number of degrees of freedom to grow only in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods. This appears to be the first such numerical analysis result for any problem of scattering by a nonconvex obstacle. Our analysis is based on new frequency-explicit bounds on the normal derivative of the solution on the boundary and on its analytic continuation into the complex plane.
Abstract:
Since 1999, the National Commission for the Knowledge and Use of Biodiversity (CONABIO) in Mexico has been developing and managing an operational program for the detection of hot spots using remote sensing techniques. This program uses images from the MODerate resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua satellites and from the Advanced Very High Resolution Radiometer of the National Oceanic and Atmospheric Administration (NOAA-AVHRR), which are operationally received through the Direct Readout (DR) station at CONABIO. This allows near-real-time monitoring of fire events in Mexico and Central America. In addition to the detection of active fires, the locations of hot spots are classified with respect to vegetation types, accessibility, and risk to Nature Protection Areas (NPAs). Beyond the fast detection of fires, further analysis is necessary owing to the considerable effects of forest fires on biodiversity and human life. This fire impact assessment is crucial to support the needs of resource managers and policy makers for adequate fire recovery and restoration actions. CONABIO attempts to meet these requirements by providing post-fire assessment products as part of the management system, in particular satellite-based burnt area mapping. This paper provides an overview of the main components of the operational system and presents an outlook on future activities and system improvements, especially the development of a burnt area product. A special focus is also placed on fire occurrence within the NPAs of Mexico.
Abstract:
The order Scorpiones is one of the most cytogenetically interesting groups within Arachnida by virtue of the combination of chromosomal singularities found in the 59 species analyzed so far. In this work, the mitotic and meiotic chromosomes of 2 species of the family Bothriuridae are described in detail. This family occupies a basal position within the superfamily Scorpionoidea. Furthermore, a review of the cytogenetic data on all previously studied scorpions is presented. Light microscopy chromosome analysis showed that Bothriurus araguayae and Bothriurus rochensis possess low diploid numbers compared with those of species belonging to closely related families. Gonadal cells examined under light and transmission electron microscopy revealed, for the first time, that the Bothriuridae species possess typical monocentric chromosomes, and that in male meiosis the chromosomes show synaptic and achiasmatic behavior. Moreover, in the sample of B. araguayae studied, heterozygous translocations were verified. The use of techniques highlighting specific chromosomal regions also revealed additional differences between the 2 Bothriurus species. The results recorded herein, together with the overview elaborated from the available cytogenetic information on Scorpiones, clarify current understanding of the processes of chromosome evolution that have occurred in Bothriuridae and in Scorpiones as a whole.
Abstract:
High-frequency extensions of the magnetorotational instability driven by the Velikhov effect beyond the standard magnetohydrodynamic (MHD) regime are studied. The existence of the well-known Hall regime and of a new electron inertia regime is demonstrated. The electron inertia regime is realized at a lower plasma magnetization of the rotating plasma than the Hall regime, and it includes a subregime of nonmagnetized electrons. It is shown that, in contrast to the standard MHD regime and the Hall regime, the magnetorotational instability in this subregime can be driven only at positive values of d ln Ω / d ln r, where Ω is the plasma rotation frequency and r is the radial coordinate. The permittivity of the rotating plasma beyond the standard MHD regime, including both the Hall regime and the electron inertia regime, is calculated.
Abstract:
This article presents the data-rich findings of an experiment with enlisting patron-driven/demand-driven acquisitions (DDA) of ebooks in two ways. The first experiment entailed comparison of DDA eBook usage against newly ordered hardcopy materials circulation, both overall and ebook vs. print usage within the same subject areas. Secondly, this study experimented with DDA ebooks as a backup plan for unfunded requests left over at the end of the fiscal year.
Abstract:
Initial endogenous growth models emphasized the importance of external effects and increasing returns in explaining growth. Empirically, this hypothesis can be confirmed if the coefficient of physical capital per hour is unity in the aggregate production function. Previous estimates using time series data rejected this hypothesis, although cross-country estimates did not. The problem lies with the techniques employed, which are unable to capture low-frequency movements of high-frequency data. Using cointegration, new time series evidence confirms the theory and conforms to the cross-country evidence. The behavior of the implied Solow residual, which takes into account external effects on aggregate capital, is then analyzed, and the hypothesis that it is explained by government expenditures on infrastructure is confirmed. This suggests a supply-side role for government in affecting productivity.
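The cointegration approach referred to above can be illustrated by the first stage of an Engle-Granger procedure: an OLS regression in levels between two nonstationary series, whose slope is the cointegrating coefficient to be compared with unity. The sketch below uses simulated data and is illustrative only; the paper's actual estimator and data differ.

```python
import numpy as np

def cointegrating_slope(y, k):
    """OLS slope of the levels regression y_t = a + b * k_t + u_t
    (the Engle-Granger first step). Under the external-effects
    hypothesis, b should be close to one."""
    X = np.column_stack([np.ones_like(k), k])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1]
```

With cointegrated series, the levels estimator is superconsistent: even though both series are random walks, the slope converges quickly to the true value, which is why the levels regression is informative where differenced regressions are not.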
Abstract:
This paper investigates the impact of price limits on the Brazilian futures markets using high-frequency data. The aim is to identify whether there is a cool-off or a magnet effect. For that purpose, we examine a tick-by-tick data set that includes all contracts on the Sao Paulo stock index futures traded on the Brazilian Mercantile and Futures Exchange from January 1997 to December 1999. The results indicate that the conditional mean features a floor cool-off effect, whereas the conditional variance increases significantly as the price approaches the upper limit. We then build a trading strategy that accounts for the cool-off effect in the conditional mean to demonstrate that the latter has not only statistical but also economic significance. The in-sample Sharpe ratio is indeed far superior to the buy-and-hold benchmarks we consider, whereas out-of-sample results show similar performance.
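The Sharpe ratio used above to compare the strategy with buy-and-hold benchmarks is a standard calculation; a minimal sketch follows, where the annualisation factor (252 trading days) is an assumption rather than anything specified in the abstract.

```python
import numpy as np

def sharpe_ratio(returns, periods_per_year=252):
    """Annualised Sharpe ratio of an excess-return series: mean over
    sample standard deviation, scaled to a yearly horizon."""
    return returns.mean() / returns.std(ddof=1) * np.sqrt(periods_per_year)
```

A zero-mean return series has a Sharpe ratio of zero regardless of its volatility, so comparisons between strategies only become meaningful once both are measured on the same return frequency and horizon.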
Abstract:
A survey of 320 marketing executives conducted by the CMO Council and released in June 2004 indicated that few high-technology companies (less than 20% of the firms interviewed) have developed useful and meaningful measures and metrics for their marketing organizations. However, the survey also revealed that companies that established formal and comprehensive measures achieved superior financial results and enjoyed greater CEO confidence in the marketing function. This dissertation provides an overview of the information marketing executives need in order to understand and implement marketing performance measurement (MPM) processes in their organizations. It raises questions for marketing managers in the high-technology industry regarding the demands for greater accountability, the value of measurement for improving marketing processes, initiatives to determine the profitability of marketing investments, and the importance of marketing activities in corporate reporting. The dissertation argues for the implementation of MPM, mapping its measurement benefits for both marketing managers and their companies. The work then explores some general concepts of marketing measurement and examines several approaches to MPM proposed by industry, the academic community, and analysts. Finally, the dissertation describes some practices that every marketing manager in the high-technology industry should consider when adopting MPM. The suggestions are general, but they should familiarize the reader with the information needed to enable MPM processes and rigor in their organization.
Abstract:
This paper proposes a two-step procedure to back out the conditional alpha of a given stock using high-frequency data. We first estimate the realized factor loadings of the stocks, and then retrieve their conditional alphas by estimating the conditional expectation of their risk-adjusted returns. We start with the underlying continuous-time stochastic process that governs the dynamics of every stock price and then derive the conditions under which we may consistently estimate the daily factor loadings and the resulting conditional alphas. We also contribute empirically to the conditional CAPM literature by examining the main drivers of the conditional alphas of the S&P 100 index constituents from January 2001 to December 2008. In addition, to confirm whether these conditional alphas indeed relate to pricing errors, we assess the performance of both cross-sectional and time-series momentum strategies based on the conditional alpha estimates. The findings are very promising in that these strategies not only perform well in both absolute and relative terms, but also exhibit virtually no systematic exposure to the usual risk factors (namely, market, size, value and momentum portfolios).
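The two-step idea above (realized factor loadings from intraday returns, then risk-adjusted returns whose conditional mean is the alpha) can be sketched for a single factor. This is a stylised illustration, not the paper's estimator; the sampling frequency and the plain-average stand-in for the conditional expectation are assumptions.

```python
import numpy as np

def realized_beta(r_stock, r_market):
    """Step 1: realized factor loading for one day, computed as realized
    covariance over realized variance of intraday market returns."""
    return (r_stock * r_market).sum() / (r_market ** 2).sum()

def conditional_alpha(daily_stock, daily_market, betas):
    """Step 2: risk-adjust daily returns with the realized betas; the
    average of the adjusted returns stands in for the conditional
    expectation that defines the alpha."""
    adjusted = daily_stock - betas * daily_market
    return adjusted.mean()
```

With enough intraday observations, the realized beta converges to the true loading even when idiosyncratic noise is present, which is what makes the daily risk adjustment in the second step feasible.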