886 results for Systems and Information Theory
Abstract:
This thesis focuses on three main questions. The first uses Exchange-Traded Funds (ETFs) to evaluate estimated adverse selection costs obtained from spread decomposition models. The second compares the Probability of Informed Trading (PIN) in Exchange-Traded Funds to that of control securities. The third examines intra-day ETF trading patterns. The spread decomposition models evaluated are Glosten and Harris (1988); George, Kaul, and Nimalendran (1991); Lin, Sanger, and Booth (1995); Madhavan, Richardson, and Roomans (1997); and Huang and Stoll (1997). Using the characteristics of ETFs, it is shown that only the Glosten and Harris (1988) and Madhavan et al. (1997) models provide theoretically consistent results. When the PIN measure is employed, ETFs are shown to have greater PINs than control securities. The investigation of intra-day trading patterns shows that return volatility and trading volume have a U-shaped intra-day pattern. A study of trading systems shows that ETFs on the American Stock Exchange (AMEX) have a U-shaped intra-day pattern of bid-ask spreads, while ETFs on NASDAQ do not. Specifically, ETFs on NASDAQ have the highest bid-ask spreads at the market opening and the lowest in the middle of the day; at the close of the market, the bid-ask spread of ETFs on NASDAQ is slightly elevated compared to mid-day.
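The PIN measure referenced above is conventionally computed from the structural parameters of the Easley-Kiefer-O'Hara-Paperman model; a minimal sketch in Python (the parameter values below are hypothetical, not taken from the thesis):

```python
def pin(alpha, mu, eps_b, eps_s):
    """Probability of Informed Trading: expected informed order flow
    divided by expected total order flow.
    alpha  - probability of an information event
    mu     - arrival rate of informed trades
    eps_b  - arrival rate of uninformed buys
    eps_s  - arrival rate of uninformed sells
    """
    return (alpha * mu) / (alpha * mu + eps_b + eps_s)

# Hypothetical daily arrival rates:
print(pin(alpha=0.4, mu=50, eps_b=40, eps_s=40))  # -> 0.2
```

In practice the four parameters are estimated by maximum likelihood from daily buy and sell counts; the snippet only shows how the estimated parameters combine into the PIN statistic.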
Abstract:
WiMAX has been introduced as a competitive alternative among metropolitan broadband wireless access technologies. It is connection oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and the flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging, since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM for the performance evaluation is that MEAM can efficiently model a large-scale system in which the number of stations or connections is generally very high, while traditional simulation and analytical approaches (e.g., Markov models) cannot perform well due to their high computational complexity. We model the bandwidth management scheme as a queuing network model (QNM) that consists of interacting multiclass queues for different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
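The basic tradeoff between sharing one bandwidth pool and partitioning it per class, which motivates schemes like WPSS, can be illustrated with a classical Erlang-B calculation (this is a generic illustration, not the paper's maximum-entropy model; the channel counts and loads are hypothetical):

```python
def erlang_b(servers, load):
    """Erlang-B blocking probability via the standard recursion
    B(0) = 1, B(n) = load*B(n-1) / (n + load*B(n-1))."""
    b = 1.0
    for n in range(1, servers + 1):
        b = load * b / (n + load * b)
    return b

# Hypothetical: 20 bandwidth units, two service classes of 6 Erlangs each.
shared = erlang_b(20, 12.0)       # complete sharing: one pooled set
partitioned = erlang_b(10, 6.0)   # strict per-class partition
print(shared < partitioned)       # -> True: pooling lowers blocking
```

Pooling reduces blocking (trunking efficiency) but lets one class starve another, which is why partial-sharing designs sit between the two extremes.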
Abstract:
Supply Chain Risk Management (SCRM) has become a popular area of research and study in recent years, as highlighted by the number of peer-reviewed articles that have appeared in the academic literature. This, coupled with the realisation by companies that SCRM strategies are required to mitigate the risks they face, makes for challenging research questions in the field of risk management. The challenge that companies face today is not only to identify the types of risks they face, but also to assess the indicators of those risks, allowing them to mitigate risk before any disruption to the supply chain occurs. The use of social network theory can aid in the identification of disruption risk. This thesis proposes the combination of social networks, behavioural risk indicators and information management to uniquely identify disruption risk. The propositions developed from the literature review and an exploratory case study at an aerospace OEM are:
- By improving information flows, through the use of social networks, we can identify supply chain disruption risk.
- The management of information to identify supply chain disruption risk can be explored using push and pull concepts.
The propositions were further explored through four focus group sessions, two within the OEM and two within an academic setting. The literature review conducted by the researcher did not find any studies that have evaluated supply chain disruption risk management in terms of social network analysis or information management. The evaluation of SCRM using these methods is thought to be a unique way of understanding the issues that practitioners in the aerospace industry face today.
Abstract:
We present an information-theory analysis of the tradeoff between bit-error-rate improvement and data-rate loss when skewed channel coding is used to suppress pattern-dependent errors in digital communications. Without loss of generality, we apply the developed general theory to the particular example of a high-speed fiber communication system with a strong patterning effect. © 2007 IEEE.
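The rate-versus-reliability tradeoff described above can be illustrated with a back-of-the-envelope binary symmetric channel calculation (a generic sketch with hypothetical numbers, not the paper's fiber-channel model):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def net_rate(code_rate, residual_ber):
    """Information rate per channel use for a rate-r code whose
    residual errors behave like a BSC with the given crossover."""
    return code_rate * (1 - h2(residual_ber))

# Hypothetical numbers: giving up 5% of the raw rate to suppress
# pattern-dependent errors can still raise the net information rate.
uncoded = net_rate(1.0, 1e-2)    # raw channel, BER 1e-2
skewed = net_rate(0.95, 1e-5)    # rate-0.95 skewed code, BER 1e-5
print(skewed > uncoded)          # -> True
```

The sketch only captures the generic rate/BER tradeoff; the paper's contribution concerns how skewed coding interacts with patterning-induced (pattern-dependent) errors specifically.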
Abstract:
In different fields, the conception of granules is applied both as a group of elements defined by internal properties and as an inseparable whole reflecting external properties. Granular computing may be interpreted in terms of abstraction, generalization, clustering, levels of abstraction, levels of detail, and so on. We propose the use of multialgebraic systems as a mathematical tool for the synthesis and analysis of granules and granule structures. A theorem giving necessary and sufficient conditions for the existence of multialgebraic systems is proved.
Abstract:
The basic structure of the General Information Theory (GIT) is presented in the paper. The main divisions of the GIT are outlined, and some new results are pointed out.
Abstract:
This paper offers a method for clustering complex systems and processes based on a genetic algorithm. Aspects of its implementation and of the shaping of the fitness function are considered. The solution of the task of clustering Ukraine's regions by socio-economic indicators is presented, together with a comparative analysis against the results of classical methods.
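A minimal sketch of genetic-algorithm clustering in the spirit described above (the toy 1-D data and all parameters are hypothetical, not the paper's socio-economic dataset):

```python
import random

def wcss(data, labels, k):
    """Within-cluster sum of squares (lower is better)."""
    total = 0.0
    for c in range(k):
        pts = [x for x, lab in zip(data, labels) if lab == c]
        if pts:
            mean = sum(pts) / len(pts)
            total += sum((x - mean) ** 2 for x in pts)
    return total

def ga_cluster(data, k=2, pop_size=40, generations=80, seed=1):
    """Genetic-algorithm clustering: a chromosome assigns each point a
    cluster label; fitness is negative WCSS; elitist truncation
    selection with uniform crossover and single-gene mutation."""
    rng = random.Random(seed)
    n = len(data)
    pop = [[rng.randrange(k) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ch: wcss(data, ch, k))
        survivors = pop[: pop_size // 2]            # keep the fittest half
        while len(survivors) < pop_size:
            a, b = rng.sample(pop[: pop_size // 2], 2)
            child = [a[i] if rng.random() < 0.5 else b[i] for i in range(n)]
            child[rng.randrange(n)] = rng.randrange(k)   # mutate one gene
            survivors.append(child)
        pop = survivors
    return min(pop, key=lambda ch: wcss(data, ch, k))

# Two well-separated 1-D groups; the GA should recover the split.
data = [1.0, 1.2, 0.9, 1.1, 8.0, 8.3, 7.9, 8.1]
best = ga_cluster(data)
print(best, wcss(data, best, 2))
```

Label-vector encoding with a compactness fitness is one common choice; centroid-encoding chromosomes are the usual alternative for larger datasets.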
Abstract:
AMS subject classification: 49N35, 49N55, 65Lxx.
Abstract:
In this paper we summarize our recently proposed work on the information-theoretic analysis of regenerative channels. We discuss how the design and the transfer-function properties of the regenerator affect the noise statistics and enable Shannon capacities higher than those of the corresponding linear channels (in the absence of regeneration).
Abstract:
Literature on post-socialist transformation usually deals with its political, economic and social sides, although there have also been important changes in the field of technical advance over the last twenty years. One of capitalism's main virtues is the strong incentive it gives to dynamism, enterprise and the innovation process. Every revolutionary new product (for civilian use) has been brought about by the capitalist system; the socialist system was capable at most of developing new military products. The article analyses how far this radical difference can be explained by the innate tendencies and basic attributes of the two systems.
Our daily lives have been transformed by these new products (for instance, the sphere of information and communications by the computer, the mobile phone and the internet). While many people see all these as favourable changes, fewer discern the causal relation between the capitalist system and rapid technical progress. Yet the usual syllabus of microeconomics does not enlighten students on this important virtue of capitalism, and it is not adequately emphasized in the statements of leading politicians either.
Abstract:
With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly under dynamically varying conditions. Yet existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To reduce the complexity of the validation process, the developed solution maps the requirements of the application onto a geometric space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system, ascertaining that segregating the data adaptation and prediction processes augments data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast-converging adaptation process is deployed, data reduction rates are significantly improved.
Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
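The idea of validating a reading against spatially associated sources can be sketched as a toy neighbor-median filter (a stand-in illustration, not the dissertation's scheme; the readings and tolerance are hypothetical):

```python
import statistics

def clean_reading(reading, neighbor_readings, tol=3.0):
    """Validate a reading against spatially correlated neighbors: if it
    deviates from the neighbors' median by more than `tol`, treat it as
    corrupted and substitute the median."""
    med = statistics.median(neighbor_readings)
    return reading if abs(reading - med) <= tol else med

# Hypothetical temperatures from nearby nodes; 85.0 is a corrupted value.
neighbors = [21.2, 20.8, 21.5, 20.9]
print(clean_reading(85.0, neighbors))  # corrupted -> replaced by median
print(clean_reading(21.0, neighbors))  # plausible -> kept as-is
```

A real scheme would weight neighbors by spatial distance and track temporal history rather than using a fixed tolerance.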
Abstract:
The protection of cyberspace has become one of the highest security priorities of governments worldwide. The EU is no exception in this context, given its rapidly developing cyber security policy. Since the 1990s, three broad areas of policy interest have emerged: cyber-crime, critical information infrastructures and cyber-defence. One of the main trends transversal to these areas is the importance that the private sector has come to assume within them. In particular, in the area of critical information infrastructure protection, the private sector is seen as a key stakeholder, given that it currently operates most infrastructures in this area. As a result of this operative capacity, the private sector has come to be understood as the expert in network and information systems (NIS) security, whose knowledge is crucial for the regulation of the field. Adopting a Regulatory Capitalism framework, complemented by insights from Network Governance, we can identify the shifting role of the private sector in this field: from a victim in need of protection in the first phase, to a commercial actor bearing responsibility for ensuring network resilience in the second, to an active policy shaper in the third, participating in the regulation of NIS by providing technical expertise. By drawing insights from the above-mentioned frameworks, we can better understand how private actors are involved in shaping regulatory responses, as well as why they have been incorporated into these regulatory networks.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08