899 results for Computer Networks and Communications
Abstract:
Complex network theory is a framework increasingly used in the study of air transport networks, thanks to its ability to describe the structures created by networks of flights and their influence on dynamical processes such as delay propagation. While many works consider only a fraction of the network, for example the one created by major airports or airlines, it is not clear if and how such a sampling process biases the observed structures and processes. In this contribution, we tackle this problem by studying how some observed topological metrics depend on the way the network is reconstructed, i.e. on the rules used to sample nodes and connections. Both structural and simple dynamical properties are considered, for eight major air networks and different source datasets. Results indicate that using a subset of airports strongly distorts our perception of the network, even when just small ones are discarded; at the same time, considering a subset of airlines yields a better and more stable representation. This allows us to provide some general guidelines on the way airports and connections should be sampled.
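A minimal sketch of the kind of sampling comparison the abstract describes, using networkx; the edge-list file name, the 20% degree cut-off and the metric set are illustrative assumptions, not taken from the paper.

```python
# Compare topological metrics of a full flight network against a subnetwork
# restricted to the highest-degree airports (illustrative only; "flights.csv"
# and the 20% threshold are hypothetical).
import csv
import networkx as nx

def load_network(path):
    G = nx.Graph()
    with open(path) as f:
        for row in csv.DictReader(f):
            G.add_edge(row["origin"], row["destination"])
    return G

def metrics(G):
    return {
        "nodes": G.number_of_nodes(),
        "edges": G.number_of_edges(),
        "avg_degree": 2 * G.number_of_edges() / G.number_of_nodes(),
        "clustering": nx.average_clustering(G),
    }

G_full = load_network("flights.csv")

# Keep only the top 20% of airports by degree, mimicking studies that retain major hubs.
by_degree = sorted(G_full.degree, key=lambda kv: kv[1], reverse=True)
keep = {node for node, _ in by_degree[: len(by_degree) // 5]}
G_sampled = G_full.subgraph(keep)

print("full   :", metrics(G_full))
print("sampled:", metrics(G_sampled))
```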
Abstract:
With the emerging prevalence of smartphones and 4G LTE networks, the demand for faster-better-cheaper mobile services anytime and anywhere is ever growing. The Dynamic Network Optimization (DNO) concept emerged as a solution that optimally and continuously tunes the network settings in response to varying network conditions and subscriber needs. Yet the realization of DNO is still in its infancy, largely hindered by the bottleneck of lengthy optimization runtimes. This paper presents the design and prototype of a novel cloud-based parallel solution that further enhances the scalability of our prior work on parallel solutions for accelerating network optimization algorithms. The solution aims to satisfy the high performance required by DNO, preliminarily on a sub-hourly basis. The paper subsequently illustrates a design and a full cycle of a DNO system. A set of potential solutions for large-network and real-time DNO is also proposed. Overall, this work creates a breakthrough towards the realization of DNO.
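The abstract stays at the system level; the sketch below only illustrates the general idea of parallelizing per-cell parameter optimization across worker processes, which such a cloud-based solution could build on. The objective function, parameter names and cell identifiers are hypothetical placeholders.

```python
# Illustrative sketch: optimize one parameter per cell in parallel across workers.
# The "coverage score" is a toy objective, not a real radio model.
from concurrent.futures import ProcessPoolExecutor

def optimize_cell(cell_id, candidate_tilts=range(0, 11)):
    """Pick the antenna tilt that maximizes a toy coverage score for one cell."""
    def coverage_score(tilt):
        return -(tilt - 6) ** 2 + 100  # placeholder objective
    best_tilt = max(candidate_tilts, key=coverage_score)
    return cell_id, best_tilt

if __name__ == "__main__":
    cells = [f"cell-{i}" for i in range(1000)]
    with ProcessPoolExecutor() as pool:
        settings = dict(pool.map(optimize_cell, cells))
    print(settings["cell-0"])
```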
Abstract:
With the development of information technology, the theory and methodology of complex networks have been introduced into language research, representing the language system as a complex network of nodes and edges for the quantitative analysis of linguistic structure. The development of dependency grammar provides theoretical support for the construction of a treebank corpus, making a statistical analysis of such networks possible. This paper introduces the theory and methodology of complex networks and builds dependency syntactic networks based on the treebank of speeches from the EEE-4 oral test. Through an analysis of the overall characteristics of the networks, including the number of edges, the number of nodes, the average degree, the average path length, network centrality and the degree distribution, it aims to identify potential differences and similarities between various grades of speaking performance. Through cluster analysis, this research intends to demonstrate the discriminating power of the network parameters and to provide a potential reference for scoring speaking performance.
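A minimal sketch of building a dependency syntactic network and computing the overall characteristics listed above; the example dependency pairs are invented and are not drawn from the EEE-4 treebank.

```python
# Build a small dependency network from (head, dependent) pairs and compute
# the global metrics mentioned in the abstract.
import networkx as nx
from collections import Counter

# (head word, dependent word) pairs from a parsed example sentence.
dependencies = [
    ("like", "I"), ("like", "networks"), ("networks", "complex"),
    ("like", "because"), ("because", "useful"), ("useful", "are"),
    ("useful", "they"),
]

G = nx.Graph()
G.add_edges_from(dependencies)

print("nodes:", G.number_of_nodes())
print("edges:", G.number_of_edges())
print("average degree:", 2 * G.number_of_edges() / G.number_of_nodes())
print("average path length:", nx.average_shortest_path_length(G))
print("degree centrality:", nx.degree_centrality(G))
print("degree distribution:", Counter(dict(G.degree).values()))
```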
Abstract:
Communication technologies shape how political activist networks are produced and maintain themselves. In Cuba, despite ideologically and physically oppressive practices by the state, a severe lack of Internet access, and extensive government surveillance, a small network of bloggers and cyberactivists has achieved international visibility and recognition for its critiques of the Cuban government. This qualitative study examines the blogger collective known as Voces Cubanas in Havana, Cuba in 2012, advancing a new approach to the study of transnational activism and the role of technology in the construction of political narrative. Voces Cubanas is analyzed as a network of connections between human and non-human actors that produces and sustains powerful political alliances. Voces Cubanas and its allies work collectively to co-produce contentious political discourses, confronting the dominant ideologies and knowledges produced by the Cuban state. Transnational alliances, the act of translation, and a host of unexpected and improvised technologies play central roles in the production of these narratives, indicating a new breed of cyborg sociopolitical action reliant upon fluid and flexible networks and the act of writing.
Abstract:
Localization is one of the key technologies in Wireless Sensor Networks (WSNs), since it provides fundamental support for many location-aware protocols and applications. Constraints on cost and power consumption make it infeasible to equip each sensor node in the network with a Global Positioning System (GPS) unit, especially for large-scale WSNs. A promising method to localize unknown nodes is to use mobile anchor nodes (MANs), which are equipped with GPS units and move among the unknown nodes, periodically broadcasting their current locations to help nearby unknown nodes localize themselves. A considerable body of research has addressed the Mobile Anchor Node Assisted Localization (MANAL) problem. However, to the best of our knowledge, no updated surveys on MANAL reflecting recent advances in the field have been presented in the past few years. This survey presents a review of the most successful MANAL algorithms, focusing on the achievements made in the past decade, and aims to become a starting point for researchers who are initiating their endeavors in the MANAL research field. In addition, we seek to present a comprehensive review of the recent breakthroughs in the field, providing links to the most interesting and successful advances in this research area.
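One common building block of MANAL schemes is multilateration from an anchor's successive broadcast positions. The sketch below shows a least-squares version with made-up anchor positions and range measurements; it is a generic illustration, not any specific algorithm from the survey.

```python
# Estimate an unknown node's position by least-squares multilateration from the
# mobile anchor's broadcast positions and the measured ranges to the node.
# Anchor path and ranges are invented for illustration.
import numpy as np

# (x, y) positions broadcast by the mobile anchor along its path.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])
# Ranges measured by the unknown node (e.g., from RSSI), with some noise.
ranges = np.array([5.4, 6.9, 7.2, 5.2])

# Linearize by subtracting the circle equation of the last anchor from the others:
# 2*(a_i - a_n) . p = r_n^2 - r_i^2 + |a_i|^2 - |a_n|^2
A = 2 * (anchors[:-1] - anchors[-1])
b = (ranges[-1] ** 2 - ranges[:-1] ** 2
     + np.sum(anchors[:-1] ** 2, axis=1) - np.sum(anchors[-1] ** 2))

estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", estimate)
```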
Abstract:
In this paper, we investigate the effect of the primary network on the secondary network when harvesting energy in cognitive radio, in the presence of multiple power beacons and multiple secondary transmitters. In particular, the influence of the primary transmitter's transmit power on the energy-harvesting secondary network is examined by studying two scenarios for the primary transmitter's location, i.e., near the secondary network and far from it. In the scenario where the primary transmitter is located near the secondary network, although the secondary transmitter can benefit from the energy harvested from the primary transmitter, the interference caused by the primary transmitter degrades the secondary network's performance. Meanwhile, in both scenarios, although the transmit power of the secondary transmitter can be improved with the support of powerful power beacons, the peak interference constraint at the primary receiver limits this advantage. In addition, the deployment of multiple power beacons and multiple secondary transmitters can improve the performance of the secondary network. Analytical expressions for the outage probability of the secondary network in the two scenarios are also provided and verified by numerical simulations.
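The paper derives analytical outage expressions; the sketch below only illustrates how such an outage probability could be estimated by Monte Carlo simulation under a simplified Rayleigh-fading, energy-harvesting model with a peak interference cap. All parameters and the harvesting rule are placeholders, not the paper's model.

```python
# Generic Monte Carlo estimate of a secondary-network outage probability with
# energy harvesting from a power beacon and a peak interference constraint.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000          # Monte Carlo trials
P_beacon = 1.0       # power beacon transmit power
eta = 0.7            # energy-harvesting efficiency
I_peak = 0.5         # peak interference allowed at the primary receiver
rate_threshold = 1.0 # target rate in bit/s/Hz
noise = 0.01

# Rayleigh fading: exponentially distributed channel power gains.
g_harvest = rng.exponential(1.0, N)   # beacon -> secondary transmitter
g_data = rng.exponential(1.0, N)      # secondary transmitter -> secondary receiver
g_interf = rng.exponential(1.0, N)    # secondary transmitter -> primary receiver

# Transmit power: harvested energy, capped by the peak interference constraint.
P_tx = np.minimum(eta * P_beacon * g_harvest, I_peak / g_interf)

snr = P_tx * g_data / noise
outage = np.mean(np.log2(1 + snr) < rate_threshold)
print(f"estimated outage probability: {outage:.4f}")
```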
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The current rapid decline of biodiversity is worrying, and human activities are its direct cause. Numerous protected areas have been established to counter this loss of biodiversity. To maximize their effectiveness, the functional connectivity between them must be improved. Climate change is currently disrupting environmental conditions on a global scale. It is a threat to biodiversity that, until recently, was rarely taken into account when protected areas were established. Species movement, and therefore the functional connectivity of the landscape, is affected by climate change, and studies have shown that improving functional connectivity between protected areas would help species cope with the impacts of climate change. My thesis presents a method for designing protected area networks that accounts for climate change and functional connectivity. My study area is the Gaspésie region of Québec (Canada). The endangered Atlantic-Gaspésie caribou population (Rangifer tarandus caribou) was used as the focal species to define functional connectivity. This small population is in continuous decline due to predation and habitat modification, and climate change could become an additional threat. I first built a spatially explicit individual-based model to explain and simulate caribou movement. I used sparse VHF data from the caribou population and a pattern-oriented modelling strategy to parameterize the model and select the best movement hypothesis. My best model reproduced most of the movement patterns defined from the observed data. This model provides a better understanding of the drivers of Atlantic-Gaspésie caribou movement, as well as a spatial estimate of its landscape use in the region. I concluded that sparse data were sufficient to fit an individual-based model when combined with pattern-oriented modelling. I then estimated the impact of climate change and of different conservation actions on caribou movement potential. I used the individual-based model to simulate caribou movement in hypothetical landscapes representing different climate change and conservation action scenarios. The conservation actions represented the establishment of new protected areas in Gaspésie, as defined by the scenario proposed by the government of Québec, as well as the restoration of secondary roads within protected areas. The impacts of climate change on vegetation, as defined in my scenarios, reduced caribou movement potential. Road restoration was able to mitigate these negative effects, unlike the establishment of the new protected areas. Finally, I presented a method for designing effective protected area networks and proposed new protected areas to be established in Gaspésie to protect biodiversity over the long term. I created numerous protected area network scenarios by expanding the current network to protect 12% of the territory. I calculated the ecological representativeness and two long-term functional connectivity measures for each network.
The functional connectivity measures represented overall access to the protected areas for the Atlantic-Gaspésie caribou as well as its movement potential within them. I used the movement potential estimates for the current period and for the future under different climate change scenarios to represent long-term functional connectivity. The protected area network I proposed was the scenario that maximized the trade-off among the three network characteristics calculated. In this thesis, I explained and predicted the movement of the Atlantic-Gaspésie caribou under different environmental conditions, including landscapes affected by climate change. These results helped me define a protected area network to be established in Gaspésie to protect the caribou over time. I believe this thesis provides new knowledge on the movement behaviour of the Atlantic-Gaspésie caribou, as well as on the conservation actions that can be taken in Gaspésie to improve the protection of the caribou and of other species. I believe the method presented can be applied to other ecosystems with similar characteristics and needs.
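A generic sketch of a spatially explicit individual-based movement model of the kind described above (a habitat-biased random walk on a grid); it is not the thesis's calibrated caribou model, and the grid, step rule and parameters are invented.

```python
# Toy individual-based movement model: an animal steps to neighbouring cells
# with probabilities proportional to habitat quality.
import numpy as np

rng = np.random.default_rng(1)
habitat = rng.random((100, 100))   # habitat quality in [0, 1], higher = preferred
steps = 500
pos = np.array([50, 50])
moves = [(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)]

track = [tuple(pos)]
for _ in range(steps):
    candidates = [(pos + m) % 100 for m in moves]   # wrap at edges for simplicity
    weights = np.array([habitat[tuple(c)] for c in candidates])
    choice = rng.choice(len(candidates), p=weights / weights.sum())
    pos = candidates[choice]
    track.append(tuple(pos))

print("distinct cells visited:", len(set(track)))
```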
Abstract:
In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has been traditionally used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
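A minimal illustration of the neighborhood-centric style of analysis described above: each node's multi-hop ego network is extracted as a subgraph and analysed directly, instead of programming against a single vertex's state. This is plain networkx on a small sample graph, not the NSCALE system itself.

```python
# Neighborhood-centric pass: one subgraph-level computation per node.
import networkx as nx

G = nx.karate_club_graph()   # small stand-in for a large social graph

def ego_density(G, center, radius=2):
    """Density of the 2-hop ego network around `center`."""
    ego = nx.ego_graph(G, center, radius=radius)
    return nx.density(ego)

scores = {v: ego_density(G, v) for v in G.nodes}
top = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:3]
print("densest 2-hop neighborhoods:", top)
```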
Abstract:
This paper describes how the recently developed network-wide real-time signal control strategy TUC has been implemented in three traffic networks with quite different traffic and control infrastructure characteristics: Chania, Greece (23 junctions); Southampton, U.K. (53 junctions); and Munich, Germany (25 junctions), where it has been compared to the respective resident real-time signal control strategies TASS, SCOOT and BALANCE. After a short outline of TUC, the paper describes the three application networks; the application, demonstration and evaluation conditions; as well as the comparative evaluation results. The main conclusion drawn from this high-effort inter-European undertaking is that TUC is an easy-to-implement, interoperable, low-cost real-time signal control strategy whose performance, after limited fine-tuning, proved to be better than or at least similar to that achieved by long-standing strategies that were in most cases very well fine-tuned over the years in the specific networks.
Abstract:
The use of the term "Electronic Publishing" transcends any notions of the paperless office and of a purely electronic transfer and dissemination of information over networks. It now encompasses all computer-assisted methods for the production of documents and includes the imaging of a document on paper as one of the options to be provided by an integrated processing scheme. Electronic publishing draws heavily on techniques from computer science and information technology, but technical, legal, financial and organisational problems have to be overcome before it can replace traditional publication mechanisms. These problems are illustrated with reference to the publication arrangements for the journal "Electronic Publishing Origination, Dissemination and Design". The authors of this paper are the co-editors of this journal, which appears in traditional form and relies on a wide variety of support from electronic technologies in the pre-publication phase.
Abstract:
Seaports play a critical role as gateways and facilitators of economic interchange and logistics processes and thus have become crucial nodes in globalised production networks and mobility systems. Both the physical port infrastructure and its operational superstructure have undergone intensive evolution processes in an effort to adapt to changing economic environments, technological advances, maritime industry expectations and institutional reforms. The results, in terms of infrastructure, operator models and the role of an individual port within the port system, vary by region, institutional and economic context. While ports have undoubtedly developed in scale to respond to the changing volumes and structures in geographies of trade (Wilmsmeier, 2015), the development of hinterland access infrastructure, regulatory systems and institutional structures have in many instances lagged behind. The resulting bottlenecks reflect deficits in the interplay between the economic system and the factors defining port development (e.g. transport demand, the structure of trade, transport services, institutional capacities, etc.; cf. Cullinane and Wilmsmeier, 2011). There is a wide range of case study approaches and analyses of individual ports, but analyses from a port system perspective are less common, and those that exist are seldom critical of the dominant discourse assuming the efficiency of market competition (cf. Debrie et al., 2013). This special section aims to capture the spectrum of approaches in current geography research on port system evolution. Thus, the papers reach from the traditional spatial approach (Rodrigue and Ashar, this volume) to network analysis (Mohamed-Chérif and Ducruet, this volume) to institutional discussions (Vonck and Notteboom, this volume; Wilmsmeier and Monios, this volume). The selection of papers allows an opening of discussion and reflection on current research, necessary critical analysis of the influences on port system evolution and, most importantly, future directions. The remainder of this editorial aims to reflect on these challenges and identify the potential for future research.