65 results for Coding and Information Theory
Abstract:
This thesis focuses on three main questions. The first uses Exchange-Traded Funds (ETFs) to evaluate estimated adverse selection costs obtained from spread decomposition models. The second compares the Probability of Informed Trading (PIN) in Exchange-Traded Funds to that of control securities. The third examines intra-day ETF trading patterns. The spread decomposition models evaluated are Glosten and Harris (1988); George, Kaul, and Nimalendran (1991); Lin, Sanger, and Booth (1995); Madhavan, Richardson, and Roomans (1997); and Huang and Stoll (1997). Using the characteristics of ETFs, it is shown that only the Glosten and Harris (1988) and Madhavan et al. (1997) models provide theoretically consistent results. When the PIN measure is employed, ETFs are shown to have greater PINs than control securities. The investigation of intra-day trading patterns shows that return volatility and trading volume follow a U-shaped intra-day pattern. A study of trading systems shows that ETFs on the American Stock Exchange (AMEX) have a U-shaped intra-day pattern of bid-ask spreads, while ETFs on NASDAQ do not. Specifically, ETFs on NASDAQ have the highest bid-ask spreads at the market opening and the lowest bid-ask spreads in the middle of the day. At the close of the market, the bid-ask spread of ETFs on NASDAQ is slightly elevated compared to mid-day.
Abstract:
In this paper, we propose a text mining method called LRD (latent relation discovery), which extends the traditional vector space model of document representation in order to improve information retrieval (IR) on documents and document clustering. Our LRD method extracts terms and entities, such as person, organization, or project names, and discovers relationships between them by taking into account their co-occurrence in textual corpora. Given a target entity, LRD discovers other entities closely related to the target effectively and efficiently. With respect to such relatedness, a measure of relation strength between entities is defined. LRD uses relation strength to enhance the vector space model, and uses the enhanced vector space model for query-based IR on documents and for clustering documents in order to discover complex relationships among terms and entities. Our experiments on a standard dataset for query-based IR show that our LRD method performed significantly better than the traditional vector space model and five other standard statistical methods for vector expansion.
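The co-occurrence-based relation strength at the heart of LRD can be sketched roughly as follows. The abstract does not give the paper's exact formula, so a Dice-style coefficient over document co-occurrence stands in for it, and the entity names are invented:

```python
from collections import Counter
from itertools import combinations

def relation_strength(corpus):
    """corpus: list of documents, each a list of extracted terms/entities."""
    freq = Counter()   # in how many documents each entity appears
    cooc = Counter()   # in how many documents each pair co-occurs
    for doc in corpus:
        terms = set(doc)
        freq.update(terms)
        for pair in combinations(sorted(terms), 2):
            cooc[pair] += 1
    # Dice-style coefficient as a stand-in relation-strength measure
    return {pair: 2.0 * c / (freq[pair[0]] + freq[pair[1]])
            for pair, c in cooc.items()}

def expand_vector(doc_terms, strength, weight=0.5):
    """Enhance a bag-of-words vector with entities related to its terms."""
    base = set(doc_terms)            # snapshot, so additions do not cascade
    vec = Counter(doc_terms)
    for (a, b), s in strength.items():
        if a in base and b not in base:
            vec[b] += weight * s
        elif b in base and a not in base:
            vec[a] += weight * s
    return dict(vec)

corpus = [["smith", "aston", "project_x"],
          ["smith", "project_x"],
          ["aston", "library"]]
strength = relation_strength(corpus)
expanded = expand_vector(["smith"], strength)
```

Querying for "smith" then also retrieves documents about "project_x", the entity it most often co-occurs with, which is the IR improvement the method targets.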
Abstract:
We introduce a flexible visual data mining framework which combines advanced projection algorithms from the machine learning domain and visual techniques developed in the information visualization domain. The advantage of such an interface is that the user is directly involved in the data mining process. We integrate principled projection algorithms, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates and billboarding, to provide a visual data mining framework. Results on a real-life chemoinformatics dataset using GTM are promising and have been analytically compared with the results from the traditional projection methods. It is also shown that the HGTM algorithm provides additional value for large datasets. The computational complexity of these algorithms is discussed to demonstrate their suitability for the visual data mining framework. Copyright 2006 ACM.
Abstract:
TEST is a novel taxonomy of knowledge representations based on three distinct hierarchically organized representational features: Tropism, Embodiment, and Situatedness. Tropic representational features reflect constraints of the physical world on the agent's ability to form, reactivate, and enrich embodied (i.e., resulting from the agent's bodily constraints) conceptual representations embedded in situated contexts. The proposed hierarchy entails that representations can, in principle, have tropic features without necessarily having situated and/or embodied features. On the other hand, representations that are situated and/or embodied are likely to be simultaneously tropic. Hence, although we propose tropism as the most general term, the hierarchical relationship between embodiment and situatedness is more on a par, such that the dominance of one component over the other relies on the distinction between offline storage versus online generation as well as on representation-specific properties. © 2013 Cognitive Science Society, Inc.
Abstract:
If history matters for organization theory, then we need greater reflexivity regarding the epistemological problem of representing the past; otherwise, history might be seen as merely a repository of ready-made data. To facilitate this reflexivity, we set out three epistemological dualisms derived from historical theory to explain the relationship between history and organization theory: (1) in the dualism of explanation, historians are preoccupied with narrative construction, whereas organization theorists subordinate narrative to analysis; (2) in the dualism of evidence, historians use verifiable documentary sources, whereas organization theorists prefer constructed data; and (3) in the dualism of temporality, historians construct their own periodization, whereas organization theorists treat time as constant for chronology. These three dualisms underpin our explication of four alternative research strategies for organizational history: corporate history, consisting of a holistic, objectivist narrative of a corporate entity; analytically structured history, narrating theoretically conceptualized structures and events; serial history, using replicable techniques to analyze repeatable facts; and ethnographic history, reading documentary sources "against the grain." Ultimately, we argue that our epistemological dualisms will enable organization theorists to justify their theoretical stance in relation to a range of strategies in organizational history, including narratives constructed from documentary sources found in organizational archives. Copyright of the Academy of Management, all rights reserved.
Abstract:
Over the last six years, Aston University Library & Information Services' Induction Team have worked on the Welcome experience for new and returning students to the Library. The article provides an overview of the Induction programme and how it has evolved to engage students pre- and post-arrival at the University.
Abstract:
Orthodox depictions of a fraught labour–environmental relationship privileging class, ideological and programmatic differences are problematised by newly quantified evidence of British unions' pro-environmental policy-making since 1967. The following narrative blends widely accepted accounts of the fortunes of both movements with an evaluation of Britain's shifting political opportunity structure and coalition theory to identify an alternative range of constraints and opportunities influencing the propensity and capacity of both movements to interact effectively, culminating recently in unions' emergence as environmental actors in their own right.
Abstract:
The protection of cyberspace has become one of the highest security priorities of governments worldwide. The EU is not an exception in this context, given its rapidly developing cyber security policy. Since the 1990s, we could observe the creation of three broad areas of policy interest: cyber-crime, critical information infrastructures and cyber-defence. One of the main trends transversal to these areas is the importance that the private sector has come to assume within them. In particular in the area of critical information infrastructure protection, the private sector is seen as a key stakeholder, given that it currently operates most infrastructures in this area. As a result of this operative capacity, the private sector has come to be understood as the expert in network and information systems security, whose knowledge is crucial for the regulation of the field. Adopting a Regulatory Capitalism framework, complemented by insights from Network Governance, we can identify the shifting role of the private sector in this field from one of a victim in need of protection in the first phase, to a commercial actor bearing responsibility for ensuring network resilience in the second, to an active policy shaper in the third, participating in the regulation of NIS by providing technical expertise. By drawing insights from the above-mentioned frameworks, we can better understand how private actors are involved in shaping regulatory responses, as well as why they have been incorporated into these regulatory networks.
Abstract:
Purpose: To assess the compliance of wearers of Daily Disposable Contact Lenses (DDCLs) with replacing lenses at the manufacturer-recommended replacement frequency, and to evaluate the ability of two different health behavioural theories (HBTs), the Health Belief Model (HBM) and the Theory of Planned Behaviour (TPB), to predict compliance. Method: A multi-centre survey was conducted using a questionnaire completed anonymously by contact lens wearers during the purchase of DDCLs. Results: Three hundred and fifty-four questionnaires were returned. The survey comprised 58.5% females and 41.5% males (mean age 34 ± 12 years). Twenty-three percent of respondents were non-compliant with the manufacturer-recommended replacement frequency (re-using DDCLs at least once). The main reason for re-using DDCLs was "to save money" (35%). Prediction of compliance behaviour (past behaviour or future intentions) on the basis of the two HBTs was investigated through logistic regression analysis: both TPB factors (subjective norms and perceived behavioural control) were significant (p < 0.01); the HBM was less predictive, with only severity (past behaviour and future intentions) and perceived benefit (only for past behaviour) as significant factors (p < 0.05). Conclusions: Non-compliance with DDCL replacement is widespread, affecting 1 out of 4 Italian wearers. Results from the TPB model show that the involvement of persons socially close to the wearers (subjective norms) and the improvement of the procedure of behavioural control of daily replacement (behavioural control) are of paramount importance in improving compliance. With reference to the HBM, it is important to warn DDCL wearers of the severity of a contact-lens-related eye infection, and to underline the possibility of its prevention.
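The logistic regression step can be illustrated with a minimal sketch. Everything below is synthetic: the predictor names stand in for the two significant TPB factors, and the data and decision rule are invented, not the study's:

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient logistic regression (intercept + weights)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = yi - p                     # gradient of the log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict(w, xi):
    """Predicted probability of compliance for one respondent."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
# Synthetic respondents: [subjective_norm, behavioural_control] on a 1-5 scale;
# higher scores are made more likely to be compliant (y = 1) by construction.
X = [[random.uniform(1, 5), random.uniform(1, 5)] for _ in range(200)]
y = [1 if 0.8 * sn + 0.6 * bc > 5 else 0 for sn, bc in X]
w = fit_logistic(X, y)
```

A fitted model of this shape is what lets the study report which factors are significant predictors of compliance.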
Abstract:
The adoption of DRG coding may be seen as a central feature of the mechanisms of the health reforms in New Zealand. This paper presents a story of the use of DRG coding by describing the experience of one major health provider. The conventional literature portrays casemix accounting and medical coding systems as rational techniques for the collection and provision of information for management and contracting decisions/negotiations. This paper presents a different perspective on the implications and effects of the adoption of DRG technology; in particular, the part played by DRG coding technology as part of a casemix system is explicated from an actor-network theory perspective. Medical coding and the DRG methodology are argued to represent "black boxes". Such technological "knowledge objects" provide strong points in the networks which are so important to the processes of change in contemporary organisations.
Abstract:
This thesis presents the results of a multi-method investigation of employee perceptions of fairness in relation to their career management experiences. Organisational justice theory (OJT) was developed as a theoretical framework and data were gathered via 325 quantitative questionnaires, 20 semi-structured interviews and the analysis of a variety of company documents and materials. The results of the questionnaire survey provided strong support for the salience of employee perceptions of justice in regard to their evaluations of organisational career management (OCM) practices, with statistical support emerging for both an agent-systems and an interaction model of organisational justice. The qualitative semi-structured interviews provided more detailed analysis of how fairness was experienced in practice, and confirmed the importance of the OJT constructs of fairness within this career management context. Fairness themes to emerge from this analysis included equity, needs, voice, bias suppression, consistency, ethicality, respect and feedback, drawing on many of the central tenets of distributive, procedural, interpersonal and informational justice. For the career management literature there is empirical confirmation of a new theoretical framework for understanding employee evaluations of, and reactions to, OCM practices. For the justice literatures a new contextual domain is explored and confirmed, thus extending further the influence and applicability of the theory. For practitioners a new framework for developing, delivering and evaluating their own OCM policies and systems is presented.
Abstract:
DUE TO INCOMPLETE PAPERWORK, ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
In the agrifood sector, the explosive increase in information about environmental sustainability, often held in uncoordinated information systems, has created a new form of ignorance ('meta-ignorance') that diminishes the effectiveness of information for decision-makers. Flows of information are governed by informal and formal social arrangements that we can collectively call Informational Institutions. In this paper, we have reviewed the recent literature on such institutions. From the perspectives of information theory and new institutional economics, current informational institutions are increasing the information entropy of communications concerning environmental sustainability and stakeholders' transaction costs of using relevant information. In our view, this reduces the effectiveness of informational governance. Future research on informational governance should explicitly address these aspects.
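The entropy argument can be made concrete with a small sketch. The labels below are invented, and Shannon entropy over the labels a decision-maker receives stands in for the paper's "information entropy of communications":

```python
import math
from collections import Counter

def entropy(messages):
    """Shannon entropy, in bits, of the label distribution a
    decision-maker receives: H = -sum(p * log2(p))."""
    counts = Counter(messages)
    n = len(messages)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# One coordinated scheme: every supplier reports under the same label,
# so each message carries no surprise (H = 0 bits).
coordinated = ["carbon-label-A"] * 8

# Uncoordinated schemes: the same attribute reported under many labels,
# so each received message carries more surprise, i.e. higher entropy,
# and the reader pays a transaction cost to reconcile the labels.
uncoordinated = ["carbon-label-A", "CO2-score", "eco-mark", "green-tier",
                 "carbon-label-A", "footprint-idx", "CO2-score", "eco-mark"]
```

On this toy data the uncoordinated stream has strictly higher entropy, which is the sense in which fragmented informational institutions make sustainability information harder to use.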
Abstract:
Erasure control coding has been exploited in communication networks with the aim of improving the end-to-end performance of data delivery across the network. To address concerns over the strengths and constraints of erasure coding schemes in this application, we examine the performance limits of two erasure control coding strategies: forward erasure recovery and adaptive erasure recovery. Our investigation shows that the throughput of a network using an (n, k) forward erasure control code is capped by r = k/n when the packet loss rate p ≤ te/n, and by k(1-p)/(n-te) when p > te/n, where te is the erasure control capability of the code. It also shows that the lower bound of the residual loss rate of such a network is (np-te)/(n-te) for te/n < p ≤ 1. In particular, if the code used is maximum distance separable, the Shannon capacity of the erasure channel, i.e. 1-p, can be achieved, and the residual loss rate is lower bounded by (p+r-1)/r for 1-r < p ≤ 1. To address the requirements of real-time applications, we also investigate the service completion time of the different schemes. It is revealed that the latency of the forward erasure recovery scheme is fractionally higher than that of a scheme without erasure control coding or retransmission mechanisms (using UDP), but much lower than that of the adaptive erasure scheme when the packet loss rate is high. Comparisons between the two erasure control schemes exhibit their advantages as well as disadvantages in delivering end-to-end services. To show the impact of the derived bounds on the end-to-end performance of a TCP/IP network, a case study is provided to demonstrate how erasure control coding could be used to maximize the performance of practical systems. © 2010 IEEE.
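The stated bounds can be checked numerically. The (n, k) = (10, 8) code below is an illustrative choice, not from the paper; taking it as MDS gives te = n - k, in which case (np-te)/(n-te) reduces to the MDS-case bound (p+r-1)/r:

```python
def throughput_cap(n, k, te, p):
    """Throughput cap of an (n, k) forward erasure control code with
    erasure-control capability te, at packet loss rate p."""
    r = k / n
    if p <= te / n:
        return r                       # rate-limited: capped by the code rate
    return k * (1 - p) / (n - te)      # loss-limited region

def residual_loss_lower_bound(n, te, p):
    """Lower bound on the residual loss rate at packet loss rate p."""
    if p <= te / n:
        return 0.0                     # erasures within the code's capability
    return (n * p - te) / (n - te)

n, k = 10, 8
te = n - k                             # MDS: corrects up to n - k erasures
for p in (0.1, 0.2, 0.4):
    print(p, throughput_cap(n, k, te, p), residual_loss_lower_bound(n, te, p))
```

For p up to te/n = 0.2 the code sustains its full rate r = 0.8 with no residual loss; beyond that, throughput falls linearly in p and some loss is unavoidable.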