6 results for Information content

at Duke University


Relevance:

70.00%

Abstract:

I demonstrate a powerful tension between acquiring information and incorporating it into asset prices, the two core elements of price discovery. As a salient case, I focus on the transformative rise of algorithmic trading (AT) typically associated with improved price efficiency. Using a measure of the relative information content of prices and a comprehensive panel of 37,325 stock-quarters of SEC market data, I establish instead that algorithmic trading strongly decreases the net amount of information in prices. The increase in price distortions associated with the AT “information gap” is roughly $42.6 billion/year for U.S. common stocks around earnings announcement events alone. Information losses are concentrated among stocks with high shares of algorithmic liquidity takers relative to algorithmic liquidity makers, suggesting that aggressive AT powerfully deters fundamental information acquisition despite its importance for translating available information into prices.
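The abstract does not spell out how the relative information content of prices is measured. Purely as an illustration of one measure in this spirit, the sketch below computes a "jump ratio", the share of the total earnings-window price response that arrives only at the announcement itself, from made-up cumulative abnormal returns; a larger ratio means less of the news was impounded into prices beforehand. This is not necessarily the paper's measure.

    import numpy as np

    # Made-up cumulative abnormal returns (in percent) for a few hypothetical
    # stock-quarters: drift into the earnings announcement vs. the move at the
    # announcement itself.
    pre_announcement_car = np.array([1.5, 0.2, -0.8, 2.4])
    announcement_car = np.array([3.0, 2.5, -2.1, 0.6])

    total_car = pre_announcement_car + announcement_car

    # Jump ratio: fraction of the total response delivered only at the announcement.
    # Values near 1 mean prices anticipated little of the news (low pre-announcement
    # information content); values near 0 mean most of the news was already in prices.
    jump_ratio = announcement_car / total_car
    print(jump_ratio.round(2))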

Relevance:

60.00%

Abstract:

The objective of spatial downscaling strategies is to increase the information content of coarse datasets at smaller scales. In the case of quantitative precipitation estimation (QPE) for hydrological applications, the goal is to close the scale gap between the spatial resolution of coarse datasets (e.g., gridded satellite precipitation products at resolution L × L) and the high resolution (l × l; L ≫ l) necessary to capture the spatial features that determine spatial variability of water flows and water stores in the landscape. In essence, the downscaling process consists of weaving subgrid-scale heterogeneity over a desired range of wavelengths in the original field. The defining question is, which properties, statistical and otherwise, of the target field (the known observable at the desired spatial resolution) should be matched, with the caveat that downscaling methods be as general as possible and therefore ideally without case-specific constraints and/or calibration requirements? Here, the attention is focused on two simple fractal downscaling methods using iterated function systems (IFS) and fractal Brownian surfaces (FBS) that meet this requirement. The two methods were applied to disaggregate spatially 27 summertime convective storms in the central United States during 2007 at three consecutive times (1800, 2100, and 0000 UTC, thus 81 fields overall) from the Tropical Rainfall Measuring Mission (TRMM) version 6 (V6) 3B42 precipitation product (~25-km grid spacing) to the same resolution as the NCEP stage IV products (~4-km grid spacing). Results from bilinear interpolation are used as the control. A fundamental distinction between IFS and FBS is that the latter implies a distribution of downscaled fields and thus an ensemble solution, whereas the former provides a single solution. The downscaling effectiveness is assessed using fractal measures (the spectral exponent β, fractal dimension D, Hurst coefficient H, and roughness amplitude R) and traditional operational skill scores [false alarm rate (FR), probability of detection (PD), threat score (TS), and Heidke skill score (HSS)], as well as bias and the root-mean-square error (RMSE). The results show that both IFS and FBS fractal interpolation perform well with regard to operational skill scores, and they meet the additional requirement of generating structurally consistent fields. Furthermore, confidence intervals can be directly generated from the FBS ensemble. The results were used to diagnose errors relevant for hydrometeorological applications, in particular a spatial displacement with characteristic length of at least 50 km (2500 km²) in the location of peak rainfall intensities for the cases studied. © 2010 American Meteorological Society.
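As a concrete illustration of the fractal relations this abstract relies on (for a two-dimensional fractional Brownian surface the spectral exponent is β = 2H + 2 and the fractal dimension is D = 3 − H), the sketch below generates an FBS-style random field by Fourier (spectral) synthesis. It is a toy under stated assumptions, not the paper's implementation; the grid size and Hurst coefficients are arbitrary.

    import numpy as np

    def fbs_field(n, hurst, seed=0):
        """Generate an n x n fractional-Brownian-type surface by spectral synthesis.

        Fourier amplitudes are scaled as k^-(hurst + 1), i.e. a power spectrum
        S(k) ~ k^-(2*hurst + 2) and a fractal dimension D = 3 - hurst.
        """
        rng = np.random.default_rng(seed)
        kx = np.fft.fftfreq(n)[:, None]
        ky = np.fft.fftfreq(n)[None, :]
        k = np.hypot(kx, ky)
        k[0, 0] = np.inf                        # suppress the mean (k = 0) component
        amplitude = k ** -(hurst + 1.0)
        phase = rng.uniform(0.0, 2.0 * np.pi, size=(n, n))
        # Taking the real part acts as a Hermitian symmetrization of the spectrum.
        field = np.fft.ifft2(amplitude * np.exp(1j * phase)).real
        return (field - field.mean()) / field.std()

    # A rough field (low H, high D) versus a smooth field (high H, low D).
    rough = fbs_field(256, hurst=0.3)
    smooth = fbs_field(256, hurst=0.8)

In an actual downscaling application the synthesized high-resolution field would additionally be constrained so that its block averages reproduce the coarse (TRMM-scale) pixel values; that conditioning step is omitted in this sketch.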

Relevance:

60.00%

Abstract:

Animals communicating via scent often deposit composite signals that incorporate odorants from multiple sources; however, the function of mixing chemical signals remains understudied. We tested both a 'multiple-messages' and a 'fixative' hypothesis of composite olfactory signalling, which, respectively, posit that mixing scents functions to increase information content or prolong signal longevity. Our subjects, adult male ring-tailed lemurs (Lemur catta), have a complex scent-marking repertoire, involving volatile antebrachial (A) secretions, deposited pure or after being mixed with a squalene-rich paste exuded from brachial (B) glands. Using behavioural bioassays, we examined recipient responses to odorants collected from conspecific strangers. We concurrently presented pure A, pure B and mixed A + B secretions, in fresh or decayed conditions. Lemurs preferentially responded to mixed over pure secretions, their interest increasing and shifting over time, from sniffing and countermarking fresh mixtures, to licking and countermarking decayed mixtures. Substituting synthetic squalene (S), a well-known fixative, for B secretions did not replicate prior results: B secretions, which contain additional chemicals that probably encode salient information, were preferred over pure S. Whereas support for the 'multiple-messages' hypothesis underscores the unique contribution from each of an animal's various secretions, support for the 'fixative' hypothesis highlights the synergistic benefits of composite signals.

Relevance:

60.00%

Abstract:

Brain-computer interfaces (BCIs) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electrophysiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electrophysiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs, which rely predominantly on event-related potentials (ERPs) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially in individuals with ALS who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively slow communication speeds compared to other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.
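To make the averaging argument concrete, here is a small self-contained simulation (toy amplitudes and noise levels, not data from this work) showing how averaging repeated stimulus epochs raises the signal-to-noise ratio of an ERP roughly as the square root of the number of trials, which is why additional repetitions trade speed for accuracy.

    import numpy as np

    rng = np.random.default_rng(0)
    fs = 256                                     # sampling rate in Hz (assumed)
    t = np.arange(0, 0.8, 1 / fs)                # one 800 ms post-stimulus epoch
    erp = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))   # toy P300: ~5 uV peak near 300 ms
    noise_sd = 20.0                              # ongoing EEG treated as 20 uV white noise (assumed)

    def averaged_snr(n_trials):
        """SNR of the trial-averaged epoch: true ERP peak over residual noise level."""
        epochs = erp + rng.normal(0.0, noise_sd, size=(n_trials, t.size))
        residual = epochs.mean(axis=0) - erp     # what remains after averaging is noise
        return erp.max() / residual.std()

    for n in (1, 4, 16, 64):
        # Grows roughly as 0.25 * sqrt(n) for these toy numbers.
        print(f"{n:3d} trials -> SNR ~ {averaged_snr(n):.2f}")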

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
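The abstract does not specify how language information enters the target character estimation; one common way to realize the idea, shown below with made-up numbers, is Bayesian fusion of per-character classifier evidence with a language-model prior, so that linguistically likely characters require less EEG evidence to be selected. This is a hedged sketch of the general technique, not the dissertation's algorithm.

    import numpy as np

    # Hypothetical five-character alphabet with a made-up language-model prior
    # (e.g., character probabilities given the text typed so far).
    alphabet = np.array(list("ABCDE"))
    lm_prior = np.array([0.35, 0.25, 0.20, 0.15, 0.05])

    # Hypothetical classifier evidence: log-likelihood of the observed ERP responses
    # under the hypothesis that each character is the attended target, accumulated
    # over the stimulus repetitions collected so far.
    classifier_loglik = np.array([-1.2, -0.4, -1.0, -1.5, -2.0])

    # Bayesian fusion: posterior proportional to prior times likelihood (in log space).
    log_post = np.log(lm_prior) + classifier_loglik
    posterior = np.exp(log_post - log_post.max())
    posterior /= posterior.sum()

    print(dict(zip(alphabet, posterior.round(3))))
    print("selected:", alphabet[np.argmax(posterior)])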

Relevance:

30.00%

Abstract:

Climate change is thought to be one of the most pressing environmental problems facing humanity. However, due in part to failures in political communication and how the issue has been historically defined in American politics, discussions of climate change remain gridlocked and polarized. In this dissertation, I explore how climate change has been historically constructed as a political issue, how conflicts between climate advocates and skeptics have been communicated, and what effects polarization has had on political communication, particularly on the communication of climate change to skeptical audiences. I use a variety of methodological tools to consider these questions, including evolutionary frame analysis, which uses textual data to show how issues are framed and constructed over time; Kullback-Leibler divergence content analysis, which allows for comparison of advocate and skeptical framing over time; and experimental framing methods to test how audiences react to and process different presentations of climate change. I identify six major portrayals of climate change from 1988 to 2012, but find that no single construction of the issue has dominated the public discourse defining the problem. In addition, the construction of climate change may be associated with changes in public political sentiment, such as greater pessimism about climate action when the electorate becomes more conservative. As the issue of climate change has become more polarized in American politics, one proposed causal pathway for the observed polarization is that advocate and skeptic framing of climate change focuses on different facets of the issue and ignores rival arguments, a practice known as “talking past.” However, I find no evidence of increased talking past in 25 years of popular news media reporting on the issue, suggesting either that talking past has not driven public polarization or that polarization is occurring in venues outside of the mainstream public discourse, such as blogs. To examine how polarization affects political communication on climate change, I test the cognitive processing of a variety of messages and sources that promote action against climate change among Republican individuals. Rather than identifying frames that are powerful enough to overcome polarization, I find that Republicans exhibit telltale signs of motivated skepticism on the issue; that is, they reject framing that runs counter to their party line and political identity. This result suggests that polarization constrains political communication on polarized issues, overshadowing traditional message and source effects of framing and increasing the difficulty communicators experience in reaching skeptical audiences.
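As a small illustration of the Kullback-Leibler divergence content analysis mentioned above, the sketch below compares two word-frequency distributions over a shared vocabulary; the vocabulary and counts are invented for the example, and the smoothing constant is an arbitrary choice to avoid division by zero.

    import numpy as np

    def kl_divergence(p_counts, q_counts, eps=1e-9):
        """D_KL(P || Q) in bits between two count vectors over the same vocabulary."""
        p = np.asarray(p_counts, dtype=float) + eps
        q = np.asarray(q_counts, dtype=float) + eps
        p /= p.sum()
        q /= q.sum()
        return float(np.sum(p * np.log2(p / q)))

    # Invented term counts for advocate vs. skeptic texts over four frame-related terms.
    vocab = ["warming", "uncertainty", "economy", "hoax"]
    advocate = [120, 10, 40, 1]
    skeptic = [30, 60, 50, 25]

    # The divergence is asymmetric: how surprising skeptic framing looks under the
    # advocate distribution differs from the reverse comparison.
    print(kl_divergence(advocate, skeptic), kl_divergence(skeptic, advocate))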