929 results for Radio broadcast
Abstract:
Vaughn, James, ''Propaganda by Proxy': Britain, America and Arab Radio Broadcasting 1953-1957', Historical Journal of Film, Radio and Television (2002) 22(2) pp.157-172 RAE2008
Abstract:
Pryse, Sian, 'Radio tomography: A new experimental technique', Surveys in Geophysics (2003) 24 pp.1-38 RAE2008
Abstract:
Pryse, Sian; Dewis, K.L.; Middleton, H.R.; Balthazor, R.L., 'The dayside high-latitude trough under quiet geomagnetic conditions: Radio tomography and the CTIP model', Annales Geophysicae (2005) 23(4) pp.1199-1206 RAE2008
Abstract:
Faculty of History: Institute of Ethnology and Cultural Anthropology (Wydział Historyczny: Instytut Etnologii i Antropologii Kulturowej)
Abstract:
The proliferation of mobile computers and wireless networks requires the design of future distributed real-time applications to recognize and deal with the significant asymmetry between downstream and upstream communication capacities, and the significant disparity between server and client storage capacities. Recent research has proposed Broadcast Disks as a scalable mechanism to deal with this problem. In this paper, we propose a new broadcast disks protocol based on our Adaptive Information Dispersal Algorithm (AIDA). Our protocol differs from previous broadcast disks protocols in that it improves communication timeliness, fault-tolerance, and security, while allowing finer control over the multiplexing of prioritized data (broadcast frequencies). We start with a general introduction to broadcast disks. Next, we propose broadcast disk organizations suitable for real-time applications. We then present AIDA and show its fault-tolerance and security properties. We conclude with the description and analysis of AIDA-based broadcast disk organizations that achieve both timeliness and fault-tolerance while preserving downstream communication capacity.
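A minimal sketch of the flat broadcast-disk program generation underlying such protocols (in the style of the Broadcast Disks work this abstract builds on): pages on "faster" disks recur proportionally more often within each broadcast cycle. The function name and the even-chunking assumption are ours for illustration; AIDA's dispersal and coding layer is not shown.

```python
from math import lcm  # Python 3.9+

def broadcast_program(disks, freqs):
    """Interleave the pages of several 'disks' so that disk j's pages
    recur freqs[j] times as often as a disk of frequency 1.
    disks: list of page lists; freqs: relative broadcast frequencies."""
    max_chunks = lcm(*freqs)
    chunked = []
    for pages, f in zip(disks, freqs):
        n = max_chunks // f                  # split this disk into n chunks
        assert len(pages) % n == 0, "pad disks so chunks divide evenly"
        size = len(pages) // n
        chunked.append([pages[k * size:(k + 1) * size] for k in range(n)])
    schedule = []
    for minor in range(max_chunks):          # one minor cycle per iteration
        for chunks in chunked:
            schedule.extend(chunks[minor % len(chunks)])
    return schedule                          # one major cycle; loop on air

# Hot page A is broadcast twice as often as the cold pages:
print(broadcast_program([["A"], ["b1", "b2"]], [2, 1]))  # ['A','b1','A','b2']
```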
Abstract:
The design of programs for broadcast disks which incorporate real-time and fault-tolerance requirements is considered. A generalized model for real-time fault-tolerant broadcast disks is defined. It is shown that designing programs for broadcast disks specified in this model is closely related to the scheduling of pinwheel task systems. Some new results in pinwheel scheduling theory are derived, which facilitate the efficient generation of real-time fault-tolerant broadcast disk programs.
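For orientation, a pinwheel instance (a_1, ..., a_n) asks for an infinite sequence over {1, ..., n} in which symbol i occurs in every window of a_i consecutive slots; read as a broadcast program, a_i is item i's worst-case access latency. The sketch below implements one classical sufficient technique (rounding windows down to powers of two, in the spirit of Holte et al.), not the new results derived in this paper.

```python
def pinwheel_schedule(a):
    """Round each window a[i] down to a power of two b[i]; if the resulting
    density sum(1/b[i]) <= 1, the harmonic instance is schedulable by
    first-fit slot assignment. Since rounding at most doubles density,
    any instance with sum(1/a[i]) <= 1/2 is handled. Returns one cyclic
    round, or None when the density test fails."""
    b = [1 << (x.bit_length() - 1) for x in a]   # largest power of two <= a[i]
    if sum(1.0 / x for x in b) > 1.0:
        return None
    horizon = max(b)                             # lcm of powers of two
    sched = [None] * horizon
    # With harmonic periods, every residue class mod b[i] is either wholly
    # free or wholly occupied, so first-fit never fragments the timeline.
    for i in sorted(range(len(a)), key=lambda i: b[i]):
        for off in range(b[i]):
            slots = range(off, horizon, b[i])
            if all(sched[s] is None for s in slots):
                for s in slots:
                    sched[s] = i
                break
    return sched

print(pinwheel_schedule([2, 4, 4]))              # [0, 1, 0, 2]
```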
Abstract:
There is increasing interest in using broadcast disks to support mobile access to real-time databases. However, previous work has only considered the design of real-time immutable broadcast disks, whose contents do not change over time. This paper considers the design of programs for real-time mutable broadcast disks: broadcast disks whose contents are occasionally updated. Recent scheduling-theoretic results relating to pinwheel scheduling and pfair scheduling are used to design algorithms for the efficient generation of real-time mutable broadcast disk programs.
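For context, a pfair scheduler keeps each item's cumulative allocation within one slot of its fluid share w*t, which is what makes pfair results attractive for broadcast programs whose contents must be refreshed without starving any item. The following earliest-pseudo-deadline-first loop is a generic single-channel illustration of that idea, with hypothetical weights, not the algorithms designed in this paper.

```python
from fractions import Fraction
from math import ceil, floor

def pfair_broadcast(weights, horizon):
    """Pfair-style single-channel schedule: item i of weight w receives
    within one slot of w*t slots over every prefix of length t. Subtask j
    of item i is released at floor((j-1)/w) and due by ceil(j/w); we run
    earliest-pseudo-deadline-first over the released subtasks."""
    done = [0] * len(weights)                # subtasks completed per item
    sched = []
    for t in range(horizon):
        best = None
        for i, w in enumerate(weights):
            j = done[i] + 1                  # next subtask, 1-indexed
            release, deadline = floor((j - 1) / w), ceil(j / w)
            if release <= t and (best is None or deadline < best[0]):
                best = (deadline, i)
        if best is None:                     # idle slot (weights sum < 1)
            sched.append(None)
        else:
            sched.append(best[1])
            done[best[1]] += 1
    return sched

w = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]
print(pfair_broadcast(w, 8))                 # [0, 1, 0, 2, 0, 1, 0, 2]
```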
Abstract:
In an n-way broadcast application, each one of n overlay nodes wants to push its own distinct large data file to all other n-1 destinations as well as download their respective data files. BitTorrent-like swarming protocols are ideal choices for handling such massive data volume transfers. The original BitTorrent targets one-to-many broadcasts of a single file to a very large number of receivers and thus, by necessity, employs an almost random overlay topology. n-way broadcast applications, on the other hand, owing to their inherent n-squared nature, are realizable only in small to medium scale networks. In this paper, we show that we can leverage this scale constraint to construct optimized overlay topologies that take into consideration the end-to-end characteristics of the network and, as a consequence, deliver far superior performance compared to random and myopic (local) approaches. We present the Max-Min and Max-Sum peer-selection policies used by individual nodes to select their neighbors. The first strives to maximize the available bandwidth to the slowest destination, while the second maximizes the aggregate output rate. We design a swarming protocol suitable for n-way broadcast and operate it on top of overlay graphs formed by nodes that employ Max-Min or Max-Sum policies. Using trace-driven simulation and measurements from a PlanetLab prototype implementation, we demonstrate that the performance of swarming on top of our constructed topologies is far superior to the performance of random and myopic overlays. Moreover, we show how to modify our swarming protocol to accommodate selfish nodes.
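A toy rendering of the two objectives may help fix ideas; it is not the paper's exact procedure, and the bandwidth numbers are made up. Max-Min scores a candidate neighbor set by the widest-path (bottleneck) bandwidth to the slowest destination through the overlay, while Max-Sum scores it by the aggregate rate of the chosen direct links; for simplicity, other nodes' links are taken as the full mesh.

```python
from itertools import combinations

def widest_path(adj, src):
    """Bottleneck bandwidth from src to every reachable overlay node:
    a Dijkstra-style sweep maximizing the minimum edge on the path."""
    best = {src: float("inf")}
    todo = {src}
    while todo:
        u = max(todo, key=lambda n: best[n])
        todo.remove(u)
        for v, cap in adj[u].items():
            b = min(best[u], cap)
            if b > best.get(v, 0.0):
                best[v] = b
                todo.add(v)
    return best

def choose_neighbors(bw, u, k, policy="max-min"):
    """Pick k neighbors for node u from pairwise measurements bw[x][y]."""
    others = [v for v in bw if v != u]
    def score(subset):
        if policy == "max-sum":
            return sum(bw[u][v] for v in subset)
        adj = {v: dict(bw[v]) for v in others}   # full mesh elsewhere
        adj[u] = {v: bw[u][v] for v in subset}   # u uses only its choice
        return min(widest_path(adj, u).get(v, 0.0) for v in others)
    return max(combinations(others, k), key=score)

bw = {"a": {"b": 10, "c": 3, "d": 1}, "b": {"a": 10, "c": 8, "d": 2},
      "c": {"a": 3, "b": 8, "d": 9}, "d": {"a": 1, "b": 2, "c": 9}}
print(choose_neighbors(bw, "a", 2, "max-min"))   # ('b', 'c')
print(choose_neighbors(bw, "a", 2, "max-sum"))   # ('b', 'c') here as well
```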
The s-mote: a versatile heterogeneous multi-radio platform for wireless sensor networks applications
Abstract:
This paper presents a novel architecture, and its implementation, for a versatile, miniaturised mote which can communicate concurrently over a variety of combinations of ISM bands, has increased processing capability, and interoperates with mainstream GSM technology. All these features are integrated in a small form factor platform. The platform supports many configurations that can satisfy a variety of application constraints. To the best of our knowledge, it is the first integrated platform of this type reported in the literature. The proposed platform opens the way for enhanced levels of Quality of Service (QoS) with respect to reliability, availability and latency, in addition to facilitating interoperability and power reduction compared to existing platforms. The small form factor also allows for integration with other mobile platforms, including smartphones.
Abstract:
Ultra Wide Band (UWB) transmission has recently been the object of considerable attention in the field of next generation location-aware wireless sensor networks, owing to its fine time resolution, energy efficiency and robustness to interference in harsh environments. This paper presents a thorough applied examination of prototype IEEE 802.15.4a impulse UWB transceiver technology to quantify the effect of line-of-sight (LOS) and non-line-of-sight (NLOS) ranging in real indoor and outdoor environments. The results draw on an extensive array of experiments that, for the first time, fully characterize the 802.15.4a UWB transceiver technology, its reliability and its ranging capabilities. A new two-way (TW) ranging protocol is proposed. The goal of this work is to validate the technology as a dependable wireless communications mechanism for the subset of sensor network localization applications where reliability and positioning precision are key concerns.
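For reference, the textbook single-sided two-way ranging computation that such protocols refine (the TW protocol proposed in the paper differs in its details): the initiator measures the round-trip time, the responder reports its turnaround delay, and half the difference times the speed of light gives the range. The made-up numbers below also show why crystal tolerance matters at these timescales.

```python
C = 299_792_458.0                    # speed of light, m/s

def tw_toa_distance(t_round, t_reply, ppm_offset=0.0):
    """Single-sided two-way time-of-arrival ranging. t_round is measured
    on the initiator's clock, t_reply on the responder's; an offset of
    ppm_offset parts per million on the responder's crystal skews the
    reported turnaround and hence the range estimate."""
    tof = (t_round - t_reply * (1.0 + ppm_offset * 1e-6)) / 2.0
    return C * tof

tof = 100.0 / C                      # 100 m true range, one way
print(tw_toa_distance(2 * tof + 200e-6, 200e-6))        # ~100.0 m
print(tw_toa_distance(2 * tof + 200e-6, 200e-6, 10.0))  # ~99.7 m: 10 ppm costs ~0.3 m
```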
Abstract:
Science Foundation Ireland (CSET - Centre for Science, Engineering and Technology, Grant No. 07/CE/11147)
Abstract:
This thesis investigates the optimisation of Coarse-Fine (CF) spectrum sensing architectures under a distribution of SNRs for Dynamic Spectrum Access (DSA). Three detector architectures are investigated: the Coarse-Sorting Fine Detector (CSFD), the Coarse-Deciding Fine Detector (CDFD) and the Hybrid Coarse-Fine Detector (HCFD). To date, the majority of the work on coarse-fine spectrum sensing for cognitive radio has focused on a single value of the SNR. This approach overlooks the key advantage that CF sensing has to offer, namely that high-powered signals can be detected easily without extra signal processing. By considering a range of SNR values, the detector can be optimised more effectively and greater performance gains realised. This work considers the optimisation of CF spectrum sensing schemes where security and performance are treated separately. Instead of optimising system performance at a single, constant, low SNR value, the system is optimised for the average operating conditions; security is still provided, in that the safety specifications are met at low SNR values. By decoupling security from performance, the system's average performance increases whilst maintaining the protection of licensed users from harmful interference. The different architectures considered in this thesis are investigated in theory, simulation and physical implementation to provide a complete overview of the performance of each system. This thesis provides a method for estimating SNR distributions which is quick, accurate and relatively low cost. The CSFD is modelled and the characteristic equations are found for the CDFD scheme. The HCFD is introduced and optimisation schemes for all three architectures are proposed. Finally, using the Implementing Radio In Software (IRIS) test-bed to confirm simulation results, CF spectrum sensing is shown to be significantly quicker than naive methods, whilst still meeting the required interference probability rates and not requiring substantial increases in receiver complexity.
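A generic two-stage energy detector conveys the coarse-fine idea discussed above (an illustration only; the thesis's CSFD, CDFD and HCFD architectures differ in how the stages are coupled): a short, cheap FFT immediately flags high-SNR channels, and only the remaining ambiguous channels pay for the long FFT needed to decide near the low-SNR safety specification. The thresholds and the known noise floor are assumptions of this sketch.

```python
import numpy as np

def coarse_fine_detect(x, n_coarse=64, n_fine=1024,
                       thr_coarse=16.0, thr_fine=2.0, noise_power=1.0):
    """Return the coarse channel indices declared occupied."""
    coarse = np.abs(np.fft.rfft(x[:n_coarse])) ** 2 / n_coarse
    hits = set(np.nonzero(coarse > thr_coarse * noise_power)[0])
    fine, ratio = None, n_fine // n_coarse
    for b in range(len(coarse)):
        if b in hits:
            continue                         # decided cheaply in stage 1
        if fine is None:                     # pay for the long FFT lazily
            fine = np.abs(np.fft.rfft(x[:n_fine])) ** 2 / n_fine
        if fine[b * ratio:(b + 1) * ratio].mean() > thr_fine * noise_power:
            hits.add(b)
    return sorted(hits)

rng = np.random.default_rng(0)
t = np.arange(1024)
x = rng.normal(size=1024) + 3.0 * np.sin(2 * np.pi * 0.25 * t)  # tone at fs/4
print(coarse_fine_detect(x))                 # [16]: caught in the coarse stage
```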
Abstract:
Poor oxygenation (hypoxia) is a common characteristic of human solid tumours, and is associated with cell survival, metastasis and resistance to radio- and chemotherapies. Hypoxia-induced stabilisation of hypoxia-inducible factor-1α (HIF-1α) leads to changes in the expression of various genes associated with growth, vascularisation and metabolism. However, whether HIF-1α plays a causal role in promoting hypoxic resistance to antitumour therapies remains unclear. In this study we used pharmacological and genetic methods to investigate the contribution of HIF-1α to radio- and chemoresistance in four cancer cell lines derived from cervical, breast, prostate and melanoma human tumours. Under normoxia or hypoxia (<0.2% or 0.5% oxygen) the cells were exposed to either a standard irradiation dose (6.2 Gy) or a chemotherapeutic drug (cisplatin), and subsequent cell proliferation (after 7 days) was measured in terms of resazurin reduction. Oxygen-dependent radio- and chemosensitivity was evident in all wild-type cells, whereas it was reduced or abolished in HIF-1α (siRNA) knockdown cells. The effects of HIF-1α-modulating drugs (EDHB, CoCl2 and deferoxamine to stabilise it, R59949 to destabilise it) reflected both HIF-1α-dependent and -independent mechanisms. Collectively, the data show that HIF-1α played a causal role in our in vitro model of hypoxia-induced radioresistance, whereas its contribution to oxygen-dependent sensitivity to cisplatin was less clear-cut. Although this behaviour is likely to be conditioned by further biological and physical factors operating in vivo, it is consistent with the hypothesis that interventions directed at HIF-1α may improve the clinical effectiveness of tumour treatments.
Abstract:
Very Long Baseline Interferometry (VLBI) polarisation observations of the relativistic jets from Active Galactic Nuclei (AGN) allow the magnetic field environment around the jet to be probed. In particular, multi-wavelength observations of AGN jets allow the creation of Faraday rotation measure maps, which can be used to gain insight into the line-of-sight magnetic field component of the jet. Recent polarisation and Faraday rotation measure maps of many AGN show possible evidence for the presence of helical magnetic fields. The detection of such evidence depends strongly on both the resolution of the images and the quality of the error analysis and statistics used in the detection. This thesis focuses on the development of new methods for high resolution radio astronomy imaging in both of these areas. An implementation of the Maximum Entropy Method (MEM) suitable for multi-wavelength VLBI polarisation observations is presented, and the advantage in resolution it possesses over the CLEAN algorithm is discussed and demonstrated using Monte Carlo simulations. This new polarisation MEM code has been applied to multi-wavelength imaging of the Active Galactic Nuclei 0716+714, Mrk 501 and 1633+382, in each case providing improved polarisation imaging compared to deconvolution using the standard CLEAN algorithm. The first MEM-based fractional polarisation and Faraday rotation VLBI images are presented, using these sources as examples. Recent detections of gradients in Faraday rotation measure are presented, including an observation of a reversal in the direction of a gradient further along a jet. Simulated observations confirming the observability of such a phenomenon are conducted, and possible explanations for a reversal in the direction of the Faraday rotation measure gradient are discussed. These results were originally published in Mahmud et al. (2013). Finally, a new error model for the CLEAN algorithm is developed which takes into account correlation between neighbouring pixels. Comparison of error maps calculated using this new model with Monte Carlo maps shows striking similarities when the sources considered are well resolved, indicating that the method correctly reproduces at least some component of the overall uncertainty in the images. The calculation of many useful quantities using this model is demonstrated, and the advantages it offers over traditional single-pixel calculations are illustrated. The limitations of the model as revealed by Monte Carlo simulations are also discussed; unfortunately, the error model does not work well when applied to compact regions of emission.
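The rotation measure maps discussed above rest on the linear relation chi(lambda^2) = chi_0 + RM * lambda^2 between the polarisation angle and the squared observing wavelength, so the core per-pixel operation is a straight-line least-squares fit. A minimal illustration with synthetic numbers (not data from this thesis):

```python
import numpy as np

# EVPA chi (rad) follows chi(lambda^2) = chi_0 + RM * lambda^2, RM in rad/m^2.
freqs_ghz = np.array([4.6, 5.0, 8.1, 12.9, 15.4])      # hypothetical bands
lam2 = (2.998e8 / (freqs_ghz * 1e9)) ** 2              # wavelength^2, m^2
chi = 0.3 + 250.0 * lam2                               # noiseless toy data
rm, chi0 = np.polyfit(lam2, chi, 1)                    # linear LSQ per pixel
print(f"RM = {rm:.1f} rad/m^2, chi_0 = {chi0:.3f} rad")  # RM = 250.0
# Real data first need the n*pi EVPA ambiguity resolved at each pixel.
```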