301 results for transmissions
Abstract:
The North Atlantic is considered a stronghold for the critically endangered leatherback sea turtle. However, limited information exists regarding the movements of individuals to and from the seas off Europe's northwesterly fringe, an area where leatherbacks have been sighted historically for the past 200 yr. Here, we used satellite telemetry to record the movements and behaviour of 2 individuals bycaught in fisheries off the southwest coast of Ireland. Turtle T1 (tagged 1 September 2005; female; tracked 375 d) immediately travelled south via Madeira and the Canaries, before residing in West African waters for 3 mo. In spring, T1 migrated north towards Newfoundland, where transmissions ceased. T2 (29 June 2006; male; 233 d) travelled south for a short period before spending 66 d west of the Bay of Biscay, an area previously asserted to be a high-use area for leatherbacks. This prolonged high-latitude summer residence corresponded with a mesoscale feature evident in satellite imagery, implying that this turtle had found a rich feeding site. A marked change in dive behaviour was apparent as the turtle exited this feature, providing useful insights into leatherback diving behaviour. T2 headed south in October 2006 and performed the deepest dive ever recorded for a reptile (1280 m) southwest of Cape Verde. Unlike T1, T2 swam southwest towards Brazil before approaching the major nesting beaches of French Guiana and Surinam. Importantly, these tracks document the movement of leatherbacks from one of the remotest foraging grounds in the North Atlantic. © Inter-Research 2008.
Abstract:
This work investigates the end-to-end performance of randomized distributed space-time codes with complex Gaussian distribution when employed in a wireless relay network. The relaying nodes are assumed to adopt a decode-and-forward strategy, and transmissions are affected by both small- and large-scale fading. Extremely tight analytical approximations of the end-to-end symbol error probability and of the end-to-end outage probability are derived and successfully validated through Monte-Carlo simulation. For the high signal-to-noise ratio regime, a simple closed-form expression for the symbol error probability is further provided.
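The validation workflow described above, deriving a closed-form error-probability expression and checking it against Monte-Carlo simulation, can be illustrated with a much simpler stand-in. The sketch below is a toy under stated assumptions, not the paper's decode-and-forward relay model with composite fading: it estimates the symbol error probability of BPSK over a single flat Rayleigh-fading link and compares it with the standard closed-form result Pe = 0.5·(1 − sqrt(g/(1+g))) for average SNR g.

```python
# Illustrative Monte-Carlo SER estimation for BPSK over a flat Rayleigh link,
# compared against the known closed-form expression. Not the relay model or
# the approximations derived in the paper; parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_symbols = 200_000

for snr_db in (0, 5, 10, 15):
    g = 10 ** (snr_db / 10)                       # average SNR (linear)
    bits = rng.integers(0, 2, n_symbols)
    s = 2 * bits - 1                              # BPSK symbols in {-1, +1}
    h = (rng.standard_normal(n_symbols) + 1j * rng.standard_normal(n_symbols)) / np.sqrt(2)
    n = (rng.standard_normal(n_symbols) + 1j * rng.standard_normal(n_symbols)) / np.sqrt(2 * g)
    r = h * s + n                                 # faded, noisy observation
    decided_plus_one = np.real(np.conj(h) * r) > 0   # coherent detection
    ser_mc = np.mean(decided_plus_one != (bits == 1))
    ser_cf = 0.5 * (1 - np.sqrt(g / (1 + g)))     # closed-form Rayleigh BPSK SER
    print(f"SNR {snr_db:2d} dB: Monte-Carlo {ser_mc:.4f}, closed-form {ser_cf:.4f}")
```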
Abstract:
The technological constraints of early British television encouraged drama productions which emphasised the immediate, the enclosed and the close-up, an approach which Jason Jacobs described in the title of his seminal study as 'the intimate screen'. While Jacobs' book showed that this conception of early British television drama was only part of the reality, he did not focus on the role that special effects played in expanding the scope of the early television screen. This article will focus upon this role, showing that special effects were not only of use in expanding the temporal and spatial scope of television, but were also considered to be of interest to the audience as a way of exploring the new medium, receiving coverage in the popular press. These effects included pre-recorded film inserts, pre-recorded narration, multiple sets, model work and animation, combined with the live studio performances. Drawing upon archival research into television production files and scripts as well as audience responses and periodical coverage of television at the time of broadcast, this article will focus on telefantasy. This genre offered particular opportunities for utilising effects in ways that seemed appropriate for the experimentation with the form of television and for the drama narratives. This period also saw a variety of shifts within television as the BBC sought to determine a specific identity and understand the possibilities for the new medium.
This research also incorporates the BBC's own research and internal dialogue concerning audiences and how their tastes should best be met, at a time when the television audience was not only growing in number but also expanding geographically and socially beyond the moneyed Londoners who could afford the first television sets and were within range of the Alexandra Palace transmissions. The primary case study for this article will be the 1949 production of H. G. Wells' The Time Machine, which incorporated pre-recorded audio and film inserts that expanded the narrative beyond the live studio performance both temporally and spatially, with the effects work receiving coverage in the popular magazine Illustrated. Other productions considered will be the 1938 and 1948 productions of RUR, the 1948 production of Blithe Spirit, and the 1950 adaptation of The Strange Case of Dr Jekyll and Mr Hyde. Despite the focus on telefantasy, this article will also include examples from other genres, both dramatic and factual, showing how the BBC's response to the changing television audience was to restrict drama to a more 'realistic' aesthetic and to move experimentation with televisual form to non-drama productions such as variety performances.
Abstract:
A digital directional modulation (DM) transmitter structure is proposed in this paper from a practical implementation point of view. The digital DM architecture is built from several off-the-shelf physical-layer wireless experimentation platform hardware boards. Compared with previous analogue DM transmitter architectures, the digital approach offers more precise and faster control over updates of the array excitations. More importantly, it is an ideal physical arrangement for implementing the most universal DM synthesis algorithm, i.e., the orthogonal vector approach. The practical issues in digital DM system calibration are described and solved. Bit error rates (BERs) are measured via real-time data transmissions to illustrate the advantages of DM, in terms of secrecy performance, over conventional non-DM beam-steering transmitters.
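The orthogonal vector approach mentioned above can be illustrated numerically. The sketch below is a minimal, idealised model; the array size, directions, modulation and received-signal convention are all assumptions, and it does not reproduce the hardware platform or calibration procedure described in the paper. At every symbol update an excitation component orthogonal to the steering vector of the intended direction is added, so the constellation stays clean along that direction and is scrambled elsewhere.

```python
# Toy numpy illustration of the orthogonal-vector idea behind directional
# modulation. Received sample is modelled as y(theta) = steering(theta)^H w.
import numpy as np

rng = np.random.default_rng(1)
N = 8                                   # array elements, half-wavelength spacing (assumed)
theta0 = np.deg2rad(30)                 # intended (secured) direction (assumed)

def steering(theta, n=N):
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta))

h0 = steering(theta0)
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 5)))   # symbols to send

for s in qpsk:
    w_info = h0 * s / N                                 # beam-steered excitation: y(theta0) = s
    r = rng.standard_normal(N) + 1j * rng.standard_normal(N)
    w_orth = r - h0 * (h0.conj() @ r) / N               # orthogonal vector: no effect at theta0
    w = w_info + 0.5 * w_orth / np.linalg.norm(w_orth)  # per-symbol DM excitation update
    y_main = steering(theta0).conj() @ w                # clean symbol along theta0
    y_eave = steering(np.deg2rad(-10)).conj() @ w       # distorted constellation off boresight
    print(f"sent {s:.2f}  boresight {y_main:.2f}  off-axis {y_eave:.2f}")
```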
Abstract:
A unique property of body area networks (BANs) is the mobility of the network as the user moves freely around. This mobility represents a significant challenge for BANs, since, in order to operate efficiently, they need to be able to adapt to the changing propagation environment. A method is presented that allows BAN nodes to classify the current operating environment in terms of multipath conditions, based on received signal strength indicator (RSSI) values collected during normal packet transmissions. A controlled set of measurements was carried out to study the effect that different environments have on on-body link signal strength in a 2.45 GHz BAN. The analysis shows that, by using two statistical parameters gathered over a period of one second, BAN nodes can successfully classify the operating environment over 90% of the time.
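As a rough illustration of the classification idea, summarising about one second of RSSI samples with two statistics and mapping them to an environment class, the sketch below uses the mean and standard deviation with hand-picked thresholds. Both the choice of statistics and the thresholds are assumptions for illustration, not the parameters or decision rule reported in the paper.

```python
# Hypothetical environment classifier from ~1 s of on-body RSSI samples (dBm).
from statistics import mean, stdev

def classify_environment(rssi_window_dbm):
    """Map two window statistics to an assumed multipath environment class."""
    mu = mean(rssi_window_dbm)
    sigma = stdev(rssi_window_dbm)
    if sigma < 2.0 and mu > -70:
        return "open space / low multipath"          # stable, strong on-body link
    if sigma < 5.0:
        return "indoor, moderate multipath"
    return "highly reflective / dense multipath"

# Example: one second of RSSI readings logged by a BAN node during normal traffic
print(classify_environment([-68, -67, -69, -68, -66, -70, -68, -67]))
```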
Abstract:
Flow processing is a fundamental element of stateful traffic classification and has been recognized as an essential factor in delivering today's application-aware network operations and security services. The basic function within a flow processing engine is to search and maintain a flow table, create new flow entries if no existing entry matches, and associate each entry with flow states and actions for future queries. Network state information must be managed on a per-flow basis in an efficient way to enable Ethernet frame transmissions at 40 Gbit/s (Gbps) and, in the near future, 100 Gbps. This paper presents a hardware solution for flow state management that implements large-scale flow tables on commodity DDR3 SDRAM memories. Working with a dedicated flow lookup table at over 90 million lookups per second, the proposed system is able to manage 512-bit state information at run time.
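The lookup-or-create behaviour described above can be sketched in software; the paper implements it in hardware over DDR3 SDRAM at far higher rates, and the key layout, state fields and toy state transition below are illustrative assumptions.

```python
# Software sketch of the basic flow-table function: look up a flow key,
# create an entry if none matches, and keep per-flow state for later queries.
from dataclasses import dataclass

@dataclass
class FlowEntry:
    packets: int = 0
    octets: int = 0
    state: str = "NEW"          # hypothetical per-flow state, e.g. NEW -> ESTABLISHED
    action: str = "INSPECT"     # hypothetical action applied to later packets of the flow

flow_table = {}                 # flow key (5-tuple) -> FlowEntry

def process_packet(src, dst, sport, dport, proto, length):
    key = (src, dst, sport, dport, proto)
    entry = flow_table.get(key)
    if entry is None:                       # no matching entry: create a new flow
        entry = FlowEntry()
        flow_table[key] = entry
    entry.packets += 1                      # update per-flow state for future queries
    entry.octets += length
    if entry.state == "NEW" and entry.packets > 3:
        entry.state = "ESTABLISHED"         # toy transition for illustration only
        entry.action = "FORWARD"
    return entry.action

for _ in range(5):
    print(process_packet("10.0.0.1", "10.0.0.2", 1234, 80, "TCP", 1500))
```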
Abstract:
Power electronics plays an important role in the control and conversion of modern electric power systems. In particular, to integrate various renewable energy sources using DC transmissions and to provide more flexible power control in AC systems, significant efforts have been made in the modulation and control of power electronics devices. Pulse width modulation (PWM) is a well-developed technology for conversion between AC and DC power sources, especially for the purposes of harmonics reduction and energy optimization. As a fundamental decoupled control method, vector control with PI controllers has been widely used in power systems. However, significant power loss occurs during the operation of these devices, and the loss is often dissipated in the form of heat, leading to significant maintenance effort. Although much work has been done to improve power electronics design, little has so far focused on controller design aimed at reducing the controller energy consumption (which leads to power loss in the power electronics) while maintaining acceptable system performance. This paper aims to bridge that gap and investigates their correlation. It is shown that a more careful controller design can achieve a better balance between the energy consumed in power electronics control and system performance, potentially leading to significant energy savings in the integration of renewable power sources.
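For reference, the current-control loops in vector control are typically built around PI regulators of the generic form below. The gains, sampling period and first-order plant are illustrative assumptions rather than the controller design studied in the paper; the toy run simply shows that the same gains that set tracking performance also set the accumulated control effort, which is the trade-off the paper investigates.

```python
# Generic discrete-time PI regulator with output saturation, plus a toy
# first-order plant. All numbers are assumptions for illustration.
class PI:
    def __init__(self, kp, ki, dt, u_max):
        self.kp, self.ki, self.dt, self.u_max = kp, ki, dt, u_max
        self.integral = 0.0

    def step(self, reference, measurement):
        error = reference - measurement
        self.integral += self.ki * error * self.dt
        u = self.kp * error + self.integral
        return max(-self.u_max, min(self.u_max, u))   # saturate the control effort

pi = PI(kp=2.0, ki=20.0, dt=1e-3, u_max=5.0)
x, dt, tau, effort = 0.0, 1e-3, 0.05, 0.0
for _ in range(1000):                 # 1 s of tracking a unit step reference
    u = pi.step(1.0, x)
    x += dt * (-x + u) / tau          # first-order plant x' = (-x + u)/tau
    effort += abs(u) * dt             # crude proxy for controller energy consumption
print(f"final output {x:.3f}, accumulated |u| {effort:.3f}")
```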
Abstract:
Whole genome sequencing (WGS) technology holds great promise as a tool for the forensic epidemiology of bacterial pathogens. It is likely to be particularly useful for studying the transmission dynamics of an observed epidemic involving a largely unsampled 'reservoir' host, as for bovine tuberculosis (bTB) in British and Irish cattle and badgers. bTB is caused by Mycobacterium bovis, a member of the M. tuberculosis complex that also includes the aetiological agent of human TB. In this study, we identified a spatio-temporally linked group of 26 cattle and 4 badgers infected with the same Variable Number Tandem Repeat (VNTR) type of M. bovis. Single-nucleotide polymorphisms (SNPs) between sequences identified differences that were consistent with bacterial lineages persisting on or near farms for several years, despite multiple clear whole-herd tests in the interim. Comparing WGS data to mathematical models showed good correlations between genetic divergence and spatial distance, but poor correspondence to the network of cattle movements or within-herd contacts. Badger isolates showed between zero and four SNP differences from the nearest cattle isolate, providing evidence for recent transmissions between the two hosts. This is the first direct genetic evidence of M. bovis persistence on farms over multiple outbreaks with a continued interaction with local badgers. However, despite unprecedented resolution, directionality of transmission cannot be inferred at this stage. Despite the often notoriously long timescales between time of infection and time of sampling for TB, our results suggest that WGS data alone can provide insights into TB epidemiology even where detailed contact data are not available, and that more extensive sampling and analysis will allow the extent and direction of transmission between cattle and badgers to be quantified. © 2012 Biek et al.
Abstract:
The last decade has seen exponential growth in wireless communication networks, both in the penetration rate of the services provided and in the deployment of new infrastructure across the globe. It is now well established that this trend will not only continue but strengthen, owing to the expected convergence between mobile wireless networks and the provision of broadband services over the fixed Internet, in an evolution towards an integrated architecture based on IP services and applications. For this reason, mobile wireless communications will play a fundamental role in the development of the information society in the medium and long term. The strategy followed in the design and implementation of current-generation (2G and 3G) cellular networks was to stratify the protocol architecture into a modular structure of self-contained layers, in which each layer of the model is responsible for implementing a set of functions. In this model, communication takes place only between adjacent layers, through pre-established communication primitives. Such an architecture makes it easier to implement and introduce new functionality into the network. However, the fact that the lower layers of the protocol model do not use information made available by the upper layers, and vice versa, degrades system performance. This issue is particularly important when multiple-antenna (MIMO) systems are deployed. Multiple-antenna systems introduce an additional degree of freedom in radio resource allocation: the spatial domain. Unlike resource allocation in the time and frequency domains, radio resources mapped onto the spatial domain cannot be assumed to be completely orthogonal, owing to the interference that results from several terminals transmitting on the same channel and/or time slots but on different spatial beams. Consequently, making information on the state of the radio resources available to the upper layers of the protocol model is of fundamental importance for meeting the required quality-of-service criteria. Efficient radio resource management requires low-complexity packet scheduling algorithms, which set the priority levels with which users access those resources based on information provided by both the lower and the upper layers of the model. This new communication paradigm, known as cross-layer design, maximizes the data-carrying capacity of the mobile radio channel while satisfying the quality-of-service requirements derived from the application layer of the model. The IEEE 802.16e standard, known as Mobile WiMAX, was drafted to meet the specifications associated with fourth-generation cellular systems. Its scalable architecture, low implementation cost and high data rates yield efficient data multiplexing and low packet transmission delay, which are fundamental attributes for the provision of broadband services. Likewise, the packet-switched communication inherent in its medium access layer is fully compatible with the quality-of-service demands of such applications.
Mobile WiMAX therefore appears to satisfy the demanding requirements of fourth-generation mobile networks. This thesis investigates, designs and implements packet scheduling algorithms for the efficient management of the radio resources of cellular networks in the time, frequency and spatial domains, taking as a practical case cellular networks based on the IEEE 802.16e standard. The proposed algorithms combine metrics from the physical layer with the quality-of-service requirements of the upper layers, in accordance with the cross-layer network architecture paradigm. The performance of these algorithms is analysed through simulations carried out with a system-level simulator, on a platform that implements the physical and medium access control layers of the IEEE 802.16e standard.
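A minimal sketch of the cross-layer scheduling idea described above: for each radio resource unit, users are ranked by a priority that combines a physical-layer metric (instantaneous channel quality normalised by the throughput already served) with an upper-layer QoS weight. The metric, weights and numbers are illustrative assumptions, not the algorithms proposed in the thesis.

```python
# Toy cross-layer scheduler: allocate resource units one by one to the user
# with the highest combined physical-layer / QoS priority.
def schedule(users, resource_units):
    """users: name -> {'cqi': channel quality, 'avg_tput': served so far, 'qos_weight': QoS weight}"""
    allocation = {}
    for ru in range(resource_units):
        best = max(
            users,
            key=lambda u: users[u]['qos_weight'] * users[u]['cqi'] / max(users[u]['avg_tput'], 1e-6),
        )
        allocation[ru] = best
        users[best]['avg_tput'] += users[best]['cqi']   # served users lose priority next round
    return allocation

users = {
    'voip_user':  {'cqi': 0.6, 'avg_tput': 0.5, 'qos_weight': 3.0},   # delay-sensitive
    'video_user': {'cqi': 1.2, 'avg_tput': 2.0, 'qos_weight': 2.0},
    'data_user':  {'cqi': 1.5, 'avg_tput': 3.0, 'qos_weight': 1.0},   # best effort
}
print(schedule(users, resource_units=4))
```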
Abstract:
Wireless communication technologies have become widely adopted, appearing in heterogeneous applications ranging from tracking victims, responders and equipment in disaster scenarios to machine health monitoring in networked manufacturing systems. Very often, applications demand a strictly bounded timing response which, in distributed systems, is generally highly dependent on the performance of the underlying communication technology. These systems are said to have real-time timeliness requirements, since data communication must be conducted within predefined temporal bounds whose violation may compromise the correct behaviour of the system and cause economic losses or endanger human lives. The potential adoption of wireless technologies for an increasingly broad range of application scenarios has made the operational requirements more complex and heterogeneous than those previously faced by wired technologies. In step with this trend, there is an increasing demand for cost-effective distributed systems with improved deployment, maintenance and adaptation features. These systems tend to require operational flexibility, which can only be ensured if the underlying communication technology provides both time- and event-triggered data transmission services while supporting on-line, on-the-fly parameter modification. Generally, wireless-enabled applications have deployment requirements that can only be addressed through the use of batteries and/or energy-harvesting mechanisms for power supply. These applications usually have stringent autonomy requirements and demand a small form factor, which hinders the use of large batteries. As the communication support may represent a significant part of the energy requirements of a station, the use of power-hungry technologies is not adequate. Hence, in such applications, low-power technologies have been widely adopted. In fact, although low-power technologies provide smaller data rates, they spend just a fraction of the energy of their higher-power counterparts. The timeliness requirements of data communications can, in general, be met by ensuring the availability of the medium for any station initiating a transmission. In controlled (closed) environments this can be guaranteed, as there is strict regulation of which stations are installed in the area and for which purpose. In open environments, however, this is hard to control, because no a priori knowledge is available of which stations and technologies may contend for the medium at any given instant. Hence, the support of wireless real-time communications in unmanaged scenarios is a highly challenging task. Wireless low-power technologies have been the focus of a large research effort, for example in the Wireless Sensor Network domain. Although bringing extended autonomy to battery-powered stations, such technologies are known to be negatively influenced by similar technologies contending for the medium and, especially, by technologies using higher-power transmissions over the same frequency bands. A frequency band that is becoming increasingly crowded with competing technologies is the 2.4 GHz Industrial, Scientific and Medical band, encompassing, for example, Bluetooth and ZigBee, two low-power communication standards on which several real-time protocols are based.
Although these technologies employ mechanisms to improve their coexistence, they are still vulnerable to transmissions from uncoordinated stations using similar technologies or from higher-power technologies such as Wi-Fi, which hinders the support of wireless dependable real-time communications in open environments. The Wireless Flexible Time-Triggered Protocol (WFTT) is a master/multi-slave protocol that builds on the flexibility and timeliness provided by the FTT paradigm and on the deterministic medium capture and maintenance provided by the bandjacking technique. This dissertation presents the WFTT protocol and argues that it supports wireless real-time communication services with high dependability requirements in open environments where multiple contention-based technologies may dispute medium access. Moreover, it claims that it is feasible to provide flexible and timely wireless communications at the same time in open environments. The WFTT protocol was inspired by the FTT paradigm, from which higher-layer services such as admission control have been ported. After establishing that bandjacking is an effective technique for ensuring medium access and maintenance in open environments crowded with contention-based communication technologies, it was recognized that the mechanism could be used to devise a wireless medium access protocol bringing the features offered by the FTT paradigm to the wireless domain. The performance of the WFTT protocol is reported in this dissertation, together with a description of the implemented devices and the test-bed and a discussion of the obtained results.
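For orientation, an FTT-style master organises traffic into elementary cycles, each started by a trigger message that publishes the schedule for a synchronous (time-triggered) window, followed by an asynchronous (event-triggered) window. The sketch below reflects that generic structure only; the bandjacking step is reduced to a placeholder, and all names, durations and the scheduling rule are assumptions rather than the WFTT implementation.

```python
# Simplified FTT-style master loop (illustrative only, not WFTT itself).
import time

ELEMENTARY_CYCLE_S = 0.010            # illustrative 10 ms elementary cycle

def capture_medium():
    # Placeholder for the bandjacking step that deterministically seizes and
    # holds the wireless medium against contention-based traffic.
    pass

def broadcast_trigger_message(cycle_index, schedule):
    print(f"EC {cycle_index}: trigger message, {len(schedule)} synchronous slots")

def transmit_in_slot(msg):
    print(f"  transmitting {msg['id']}")

def run_elementary_cycle(sync_streams, async_queue, cycle_index):
    cycle_start = time.monotonic()
    capture_medium()
    schedule = [m for m in sync_streams if cycle_index % m['period'] == 0]
    broadcast_trigger_message(cycle_index, schedule)          # tells slaves their slots
    for msg in schedule:                                      # synchronous window
        transmit_in_slot(msg)
    while async_queue and time.monotonic() - cycle_start < ELEMENTARY_CYCLE_S * 0.8:
        transmit_in_slot(async_queue.pop(0))                  # asynchronous window
    time.sleep(max(0.0, ELEMENTARY_CYCLE_S - (time.monotonic() - cycle_start)))

streams = [{'id': 'sensor_A', 'period': 1}, {'id': 'sensor_B', 'period': 2}]
for ec in range(2):
    run_elementary_cycle(streams, [{'id': 'alarm'}], ec)
```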
Abstract:
In Mobile Ad hoc NETworks (MANETs), where cooperative behaviour is mandatory, there is a high probability that some nodes become overloaded with packet forwarding operations in order to support neighbour data exchange. This altruistic behaviour leads to an unbalanced load in the network in terms of traffic and energy consumption. In such scenarios, mobile nodes can benefit from an energy-efficient and traffic-aware routing protocol that better suits the limited battery capacity and throughput limitations of the network. This PhD work focuses on proposing energy-efficient and load-balanced routing protocols for ad hoc networks. Whereas most existing routing protocols simply consider the path-length metric when choosing the best route between a source and a destination node, in the proposed mechanism nodes are able to find several routes for each source-destination pair and select the best route according to energy and traffic parameters, effectively extending the lifespan of the network. Our results show that, by applying this mechanism, current flat ad hoc routing protocols can achieve higher energy efficiency and better load balancing. In addition, owing to the broadcast nature of the wireless channel in ad hoc networks, another technique, Network Coding (NC), looks promising for energy efficiency: NC can reduce the number of transmissions and retransmissions and increase the data transfer rate, which directly translates into energy efficiency. However, because NC relies on intermediate nodes to code and forward packets, it needs mitigation techniques against unauthorized access and packet corruption. We therefore proposed mechanisms for handling these security attacks, in particular by serially concatenating codes to support reliability in ad hoc networks. As a solution to this problem, we explored a new security framework that provides an additional degree of protection against eavesdropping attackers based on concatenated encoding, so that malicious intermediate nodes find it computationally intractable to decode the packets in transit. We also adopted another code that uses Luby Transform (LT) codes as a pre-code for NC. Primarily designed for security applications, this code enables the sink nodes to recover corrupted packets even in the presence of Byzantine attacks.
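The claim that network coding reduces the number of transmissions is usually illustrated with the textbook two-way relay example; the sketch below shows that generic example, not the concatenated or LT-based coding schemes proposed in the thesis. Nodes A and B exchange packets through relay R: instead of forwarding each packet separately (four transmissions), R broadcasts the XOR of the two packets and each endpoint cancels its own contribution (three transmissions).

```python
# Classic two-way-relay network coding example: 3 transmissions instead of 4.
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a = b"hello-from-A"
pkt_b = b"hello-from-B"

# Transmission 1: A -> R, transmission 2: B -> R
# Transmission 3: R broadcasts the coded packet to both A and B
coded = xor_bytes(pkt_a, pkt_b)

recovered_at_a = xor_bytes(coded, pkt_a)    # A removes its own packet to obtain B's
recovered_at_b = xor_bytes(coded, pkt_b)    # B removes its own packet to obtain A's
assert recovered_at_a == pkt_b and recovered_at_b == pkt_a
print("3 transmissions instead of 4; both packets recovered")
```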
Abstract:
This thesis investigates the use and significance of X-ray crystallographic visualisations of molecular structures in postwar British material culture across scientific practice and industrial design. It is based on research into artefacts from three areas: X-ray crystallographers’ postwar practices of visualising molecular structures using models and diagrams; the Festival Pattern Group scheme for the 1951 Festival of Britain, in which crystallographic visualisations formed the aesthetic basis of patterns for domestic objects; and postwar furnishings with a ‘ball-and-rod’ form and construction reminiscent of those of molecular models. A key component of the project is methodological. The research brings together subjects, themes and questions traditionally covered separately by two disciplines, the history of design and history of science. This focus necessitated developing an interdisciplinary set of methods, which results in the reassessment of disciplinary borders and productive cross-disciplinary methodological applications. This thesis also identifies new territory for shared methods: it employs network models to examine cross-disciplinary interaction between practitioners in crystallography and design, and a biographical approach to designed objects that over time became mediators of historical narratives about science. Artefact-based, archival and oral interviewing methods illuminate the production, use and circulation of the objects examined in this research. This interdisciplinary approach underpins the generation of new historical narratives in this thesis. It revises existing histories of the cultural transmissions between X-ray crystallography and the production and reception of designed objects in postwar Britain. I argue that these transmissions were more complex than has been acknowledged by historians: they were contingent upon postwar scientific and design practices, material conditions in postwar Britain and the dynamics of historical memory, both scholarly and popular. This thesis comprises four chapters. Chapter one explores X-ray crystallographers’ visualisation practices, conceived here as a form of craft. Chapter two builds on this, demonstrating that the Festival Pattern Group witnesses the encounter between crystallographic practice, design practice and aesthetic ideologies operating within social networks associated with postwar modernisms. Chapters three and four focus on ball-and-rod furnishings in postwar and present-day Britain, respectively. I contend that strong relationships between these designed objects and crystallographic visualisations, for example the appellation ‘atomic design’, have been largely realised through historical narratives active today in the consumption of ‘retro’ and ‘mid-century modern’ artefacts. The attention to contemporary historical narratives necessitates this dual historical focus: the research is rooted in the period from the end of the Second World War until the early 1960s, but extends to the history of now. This thesis responds to the need for practical research on methods for studying cross-disciplinary interactions and their histories. It reveals the effects of submitting historical subjects that are situated on disciplinary boundaries to interdisciplinary interpretation. 
Old models, such as that of unidirectional ‘influence’, subside and the resulting picture is a refracted one: this study demonstrates that the material form and meaning of crystallographic visualisations, within scientific practice and across their use and echoes in designed objects, are multiple and contingent.
Abstract:
The Acoustic Oceanographic Buoy (AOB) Telemetry System has been designed to meet acoustic rapid environmental assessment requirements. It uses a standard Institute of Electrical and Electronics Engineers (IEEE) 802.11 wireless local area network (WLAN) to integrate the air radio network (RaN), and a hydrophone array and acoustic source to integrate the underwater acoustic network (AcN). It offers advantages including local data storage, dedicated signal processing, and global positioning system (GPS) timing and localization. Thanks to its WLAN transceivers, the AOB can also be integrated with other similar systems to form a flexible network and perform on-line high-speed data transmissions. The AOB is a reusable system that requires little maintenance and can work as a salt-water plug-and-play system at sea, as it is designed to operate in free-drifting mode. It is also suitable for performing distributed digital signal processing tasks owing to its on-board digital signal processor.
Abstract:
Pelagic longliners targeting swordfish and tunas in oceanic waters regularly capture sharks as bycatch, including currently protected species such as the bigeye thresher, Alopias superciliosus. Fifteen bigeye threshers were tagged with pop-up satellite archival tags (PSATs) in 2012-2014 in the tropical northeast Atlantic, with successful transmissions received from 12 tags for a total of 907 tracking days. Marked diel vertical movements were recorded for all specimens, with most of the daytime spent in deeper, colder water (mean depth = 353 m, SD = 73; mean temperature = 10.7 °C, SD = 1.8) and the nighttime spent in warmer water closer to the surface (mean depth = 72 m, SD = 54; mean temperature = 21.9 °C, SD = 3.7). The operating depth of the pelagic longline gear was measured with Minilog Temperature and Depth Recorders (TDRs), and its overlap with habitat utilization was calculated. Overlap takes place mainly during the night and is higher for juveniles. The results presented herein can be used as inputs for Ecological Risk Assessments for bigeye threshers captured in oceanic tuna fisheries, and serve as a basis for efficient management and conservation of this vulnerable shark species.
Abstract:
Final Master's project submitted to obtain the degree of Master in Electronics and Telecommunications Engineering.