836 results for Overhead conductors
Abstract:
In this paper, we describe dynamic unicast, an approach to increase communication efficiency in opportunistic information-centric networks (ICN). The approach broadcasts requests to quickly find content and dynamically creates unicast links to content sources without the need for neighbor discovery. The links are maintained only as long as they deliver content and are quickly removed otherwise. Evaluations in mobile networks show that this approach maintains ICN flexibility to support seamless mobile communication and achieves up to 56.6% shorter transmission times than broadcast in the case of multiple concurrent requesters. Moreover, dynamic unicast unburdens listener nodes from processing unwanted content, resulting in lower processing overhead and power consumption at these nodes. The approach can be easily integrated into existing ICN architectures using only available data structures.
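The request-mode logic the abstract describes can be sketched in a few lines. This is a minimal illustration with invented names, not the paper's implementation: requests go out via broadcast until a content source answers, subsequent requests use a unicast link to that source, and the link is torn down after repeated delivery failures.

```python
# Hypothetical sketch of the dynamic-unicast state machine described above.
class DynamicUnicastRequester:
    def __init__(self, max_failures=2):
        self.unicast_target = None       # current content source, if any
        self.failures = 0
        self.max_failures = max_failures

    def next_request_mode(self):
        # Broadcast only while no working unicast link exists.
        return "unicast" if self.unicast_target else "broadcast"

    def on_content(self, source):
        # A delivery succeeded: (re)establish the unicast link.
        self.unicast_target = source
        self.failures = 0

    def on_timeout(self):
        # A unicast request went unanswered; remove the link quickly.
        if self.unicast_target is not None:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.unicast_target = None
                self.failures = 0

r = DynamicUnicastRequester()
modes = [r.next_request_mode()]      # broadcast first, to find content
r.on_content("nodeA")
modes.append(r.next_request_mode())  # unicast once a source answered
r.on_timeout(); r.on_timeout()
modes.append(r.next_request_mode())  # back to broadcast after failures
print(modes)                         # ['broadcast', 'unicast', 'broadcast']
```

The failure threshold and node names are assumptions; the key property matching the abstract is that links are kept only while they deliver content.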
Abstract:
This paper introduces an area- and power-efficient approach for compressive recording of cortical signals in an implantable system prior to transmission. Recent research on compressive sensing has shown promising results for sub-Nyquist sampling of sparse biological signals. Still, any large-scale implementation of this technique faces critical issues caused by the increased hardware intensity: the area cost of implementing compressive sensing in a multichannel system can be significantly higher than that of a conventional data acquisition system without compression. To tackle this issue, a new multichannel compressive sensing scheme is proposed that exploits the spatial sparsity of the signals recorded from the electrodes of the sensor array. The analysis shows that with this method the power efficiency is preserved to a great extent while the area overhead is significantly reduced, resulting in an improved power-area product. The proposed circuit architecture is implemented in a UMC 0.18 µm CMOS technology. Extensive performance analysis and design optimization have been carried out, resulting in a low-noise, compact and power-efficient implementation. Simulations and subsequent reconstructions show the possibility of recovering fourfold-compressed intracranial EEG signals with an SNR as high as 21.8 dB, while consuming 10.5 µW of power within an effective area of 250 µm × 250 µm per channel.
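The sub-Nyquist sampling idea behind the abstract can be illustrated numerically. The sketch below is generic compressive sensing, not the paper's circuit or reconstruction algorithm: a sparse signal is measured through a random matrix at a fourfold compression ratio (as in the abstract) and recovered with a basic orthogonal matching pursuit loop.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 128, 32, 4           # fourfold compression; K-sparse signal (values assumed)
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random measurement matrix
y = Phi @ x                                      # compressed measurements, M < N

def omp(Phi, y, K):
    """Basic orthogonal matching pursuit: greedily build the support set."""
    residual, support = y.copy(), []
    for _ in range(K):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, K)
snr_db = 10 * np.log10(np.sum(x**2) / np.sum((x - x_hat)**2))
print(round(snr_db, 1))   # high SNR when the support is identified correctly
```

The dimensions and sparsity level here are illustrative; the paper's contribution is the multichannel circuit architecture exploiting spatial sparsity, which this toy example does not model.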
Abstract:
Software corpora facilitate reproducibility of analyses; however, static analysis of an entire corpus still requires considerable effort, often duplicated unnecessarily by multiple users. Moreover, most corpora are designed for a single language, increasing the effort required for cross-language analysis. To address these aspects we propose Pangea, an infrastructure allowing fast development of static analyses on multi-language corpora. Pangea uses language-independent meta-models stored as object-model snapshots that can be loaded directly into memory and queried without any parsing overhead. To reduce the effort of performing static analyses, Pangea provides out-of-the-box support for: creating and refining analyses in a dedicated environment, deploying an analysis on an entire corpus using a runner that supports parallel execution, and exporting results in various formats. In this tool demonstration we introduce Pangea and present several usage scenarios that illustrate how it reduces the cost of analysis.
Abstract:
Information-centric networking (ICN) enables communication in isolated islands where fixed infrastructure is not available, but it also supports seamless communication once the infrastructure is up and running again. In disaster scenarios, when the fixed infrastructure is broken, content discovery algorithms are required to learn what content is locally available: if preferred content is not available, users may also be satisfied with second-best options. In this paper, we describe a new content discovery algorithm and compare it to existing depth-first and breadth-first traversal algorithms. Evaluations in mobile scenarios with up to 100 nodes show that it achieves better performance, i.e., faster discovery times and smaller traffic overhead, than the existing algorithms.
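For reference, the breadth-first baseline the new algorithm is compared against can be sketched as follows. Node names, content names, and the graph are invented for illustration; the paper's own algorithm is not reproduced here.

```python
from collections import deque

def bfs_discover(graph, contents, start):
    """Visit nodes hop by hop from `start`, collecting every content name seen."""
    seen_nodes, found = {start}, set(contents[start])
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in seen_nodes:
                seen_nodes.add(neighbor)
                found |= set(contents[neighbor])
                queue.append(neighbor)
    return found

# Toy 4-node topology with locally stored content.
graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A"], "D": ["B"]}
contents = {"A": ["song1"], "B": [], "C": ["map1"], "D": ["song2"]}
print(sorted(bfs_discover(graph, contents, "A")))  # ['map1', 'song1', 'song2']
```

In a real mobile ICN deployment each "visit" is a request/response exchange, so discovery time and traffic overhead depend heavily on the traversal order, which is what the compared algorithms differ on.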
Abstract:
The shift from host-centric to information-centric networking (ICN) promises seamless communication in mobile networks. However, most existing works either consider well-connected networks with high node density or introduce modifications to ICN message processing for delay-tolerant networking (DTN). In this work, we present agent-based content retrieval, which provides information-centric DTN support as an application module without modifications to ICN message processing, enabling flexible interoperability in changing environments. If no content source can be found via wireless multi-hop routing, requesters may exploit the mobility of neighbor nodes (called agents) by delegating content retrieval to them. Agents that receive a delegation and move closer to content sources can retrieve the data and return it to the requesters. We show that agent-based content retrieval may be even more efficient in scenarios where multi-hop communication is possible. Furthermore, we show that broadcast communication is not necessarily the best option, since dynamic unicast requests have little overhead and can better exploit short contact times between nodes (no broadcast delays are required for duplicate suppression).
Abstract:
With the ongoing shift in the computer graphics industry toward Monte Carlo rendering, there is a need for effective, practical noise-reduction techniques that are applicable to a wide range of rendering effects and easily integrated into existing production pipelines. This course surveys recent advances in image-space adaptive sampling and reconstruction algorithms for noise reduction, which have proven very effective at reducing the computational cost of Monte Carlo techniques in practice. These approaches leverage advanced image-filtering techniques with statistical methods for error estimation. They are attractive because they can be integrated easily into conventional Monte Carlo rendering frameworks, they are applicable to most rendering effects, and their computational overhead is modest.
Abstract:
Digital terrain models (DTMs) typically contain large numbers of postings, from hundreds of thousands to billions. Many algorithms that run on DTMs require topological knowledge of the postings, such as finding nearest neighbors or finding the posting closest to a chosen location. If the postings are arranged irregularly, topological information is costly to compute and to store. This paper offers a practical approach to organizing and searching irregularly-spaced data sets by presenting a collection of efficient algorithms (O(N), O(lg N)) that compute important topological relationships with only a simple supporting data structure. These relationships include finding the postings within a window, locating the posting nearest a point of interest, finding the neighborhood of postings nearest a point of interest, and ordering the neighborhood counter-clockwise. These algorithms depend only on two sorted arrays of two-element tuples, each holding a planimetric coordinate and an integer identification number indicating which posting the coordinate belongs to; there is one array for each planimetric coordinate (eastings and northings). These two arrays incur minimal overhead to create and store, yet permit the data to remain arranged irregularly.
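The window query over the two sorted arrays can be sketched directly from the description above. Coordinates and the query window are made-up example values; the structure (one sorted (coordinate, id) array per axis, binary search on each, then an intersection of id sets) follows the abstract.

```python
import bisect

# Example postings as (easting, northing); ids are array indices.
postings = [(2.0, 1.0), (5.0, 4.0), (3.0, 9.0), (8.0, 2.0)]
eastings  = sorted((e, i) for i, (e, n) in enumerate(postings))
northings = sorted((n, i) for i, (e, n) in enumerate(postings))

def ids_in_range(axis, lo, hi):
    """O(lg N) range search on one sorted (coordinate, id) array."""
    left  = bisect.bisect_left(axis, (lo, -1))          # -1 sorts before any id
    right = bisect.bisect_right(axis, (hi, len(axis)))  # len sorts after any id
    return {i for _, i in axis[left:right]}

def window_query(e_lo, e_hi, n_lo, n_hi):
    # Postings inside the window satisfy both per-axis range conditions.
    return ids_in_range(eastings, e_lo, e_hi) & ids_in_range(northings, n_lo, n_hi)

print(sorted(window_query(1.0, 6.0, 0.0, 5.0)))  # [0, 1]
```

The paper's nearest-posting and neighborhood-ordering queries build on the same two arrays; only the window query is sketched here.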
Abstract:
Child abuse correlated with excessive infant crying affects millions of families each year, with consequences of the abuse lasting a lifetime. The University of Texas School of Medicine's Colic Clinic is currently in the early stages of testing Dr. Harvey Karp's combinatorial soothing technique for infants, called "The Happiest Baby on the Block". To gauge the program's potential effectiveness, the Colic Clinic Protocol was examined to assess the applicability of the intervention to known causal factors of child abuse associated with excessive infant crying. This evaluation also carried out an anticipated cost-benefit breakout analysis for implementing the intervention for 100 children and compared the cost of program implementation to the cost associated with a single instance of child abuse. The analysis revealed that, even accounting for materials, advertising, salaried personnel and other overhead expenses, the cost to implement the intervention was less than half the cost of the medical treatment associated with a single victim of whiplash-shaken infant syndrome. Although the program is still in its early evaluative phase, the future implications of this work are extensive. If this intervention is revealed to be relevant and cost-effective, it will precipitate sweeping changes in medical education and training, public health detection and prevention programs, and law enforcement.
Abstract:
Joint interpretation of magnetotelluric and geomagnetic depth sounding data in the western European Alps offers new insights into the conductivity structure of the Earth's crust and mantle. This first large-scale electromagnetic study in the Alps covers a cross-section from Germany to northern Italy and shows the importance of the Alpine mountain chain as an interrupter of continuous conductors. Poor data quality due to the highly crystalline underground is overcome by Remote Reference and Robust Processing techniques. 3D forward modelling reveals, on the one hand, interrupted dipping crustal conductors with a maximum conductance of 4960 S and, on the other hand, a lithosphere thickening to up to 208 km beneath the central western Alps. Graphite networks arising from Paleozoic sedimentary deposits are considered accountable for the occurrence of high conductivity and the distribution pattern of crustal conductors. The influence of the huge sedimentary molasse basins on the electromagnetic data is suggested to be minor compared with that of the crustal conductors. In conclusion, the electromagnetic results can be attributed to the geological, tectonic and palaeogeographical background. The dipping direction (S-SE) and maximum angle (10.1°) of the northern crustal conductor reveal the main thrusting conditions beneath the Helvetic Alps, whereas the existence of a crustal conductor in the Briançonnais supports hypotheses about its palaeogeographic affinity with the Iberian Peninsula.
Abstract:
The Bounty Trough, east of New Zealand, lies along the southeastern edge of the present-day Subtropical Front (STF), and is a major conduit via the Bounty Channel, for terrigenous sediment supply from the uplifted Southern Alps to the abyssal Bounty Fan. Census data on 65 benthic foraminiferal faunas (>63 µm) from upper bathyal (ODP 1119), lower bathyal (DSDP 594) and abyssal (ODP 1122) sequences, test and refine existing models for the paleoceanographic and sedimentary history of the trough through the last 150 ka (marine isotope stages, MIS 6-1). Cluster analysis allows recognition of six species groups, whose distribution patterns coincide with bathymetry, the climate cycles and displaced turbidite beds. Detrended canonical correspondence analysis and comparisons with modern faunal patterns suggest that the groups are most strongly influenced by food supply (organic carbon flux), and to a lesser extent by bottom water oxygen and factors relating to sediment type. Major faunal changes at upper bathyal depths (1119) probably resulted from cycles of counter-intuitive seaward-landward migrations of the Southland Front (SF) (north-south sector of the STF). Benthic foraminiferal changes suggest that lower nutrient, cool Subantarctic Surface Water (SAW) was overhead in warm intervals, and higher nutrient-bearing, warm neritic Subtropical Surface Water (STW) was overhead in cold intervals. At lower bathyal depths (594), foraminiferal changes indicate increased glacial productivity and lowered bottom oxygen, attributed to increased upwelling and inflow of cold, nutrient-rich, Antarctic Intermediate Water (AAIW) and shallowing of the oxygen-minimum zone (upper Circum Polar Deep Water, CPDW). 
The observed cyclical benthic foraminiferal changes are not a result of associations migrating up and down the slope, as glacial faunas (dominated by Globocassidulina canalisuturata and Eilohedra levicula at upper and lower bathyal depths, respectively) are markedly different from those currently living in the Bounty Trough. On the abyssal Bounty Fan (1122), faunal changes correlate most strongly with grain size, and are attributed to varying amounts of mixing of displaced and in-situ faunas. Most of the displaced foraminifera in turbiditic sand beds are sourced from mid-outer shelf depths at the head of the Bounty Channel. Turbidity currents were more prevalent during, but not restricted to, glacial intervals.
Abstract:
The railway overhead (or catenary) is the system of cables responsible for supplying electric current to the train. This system has been reported as wind-sensitive (Scanlon et al., 2000), and in particular prone to galloping phenomena. Galloping of the railway overhead consists of undamped cable oscillations triggered by aerodynamic forces acting on the contact wire. As is well known, the aerodynamic loads on the contact wire depend on the mean velocity of the incident flow and on the angle of attack. The presence of embankments or hills modifies both the vertical velocity profiles and the angles of attack of the flow (Paiva et al., 2009). These cross-wind-related oscillations can interfere with the safe operation of the railway service (Johnson, 1996). Therefore, correct modelling of the phenomena is required to avoid these unwanted oscillations.
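Galloping susceptibility of a cable section is commonly screened with the classic quasi-steady Den Hartog criterion, shown below. This is a textbook check, not necessarily the model used in the cited works, and the coefficient values are invented: the section is galloping-prone at an angle of attack where the lift-curve slope plus the drag coefficient is negative.

```python
def den_hartog_unstable(dCL_dalpha, CD):
    """True when the quasi-steady Den Hartog criterion predicts galloping:
    dC_L/d(alpha) + C_D < 0 at the operating angle of attack."""
    return dCL_dalpha + CD < 0

# Illustrative values: a grooved contact wire can exhibit a negative
# lift-curve slope over some range of angles of attack.
print(den_hartog_unstable(dCL_dalpha=-1.8, CD=1.0))  # True  -> galloping-prone
print(den_hartog_unstable(dCL_dalpha=0.5,  CD=1.0))  # False -> stable
```

Because embankments change the effective angle of attack, the same wire can move in and out of the unstable range along the line, which is why the terrain effects noted above matter.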
Abstract:
The communication capability of human beings has grown thanks to the evolution of mobile devices that are ever smaller, more manageable, more powerful, longer-lasting and more affordable. This trend suggests that in the near future each person will carry at least one high-performance device. These devices incorporate several forms of communication (telephony networks, wireless networks, Bluetooth, among others), which also allows them to be used to configure mobile ad hoc networks. Mobile ad hoc networks are temporary, self-configuring networks that need no access point for their nodes to exchange information; each node performs routing tasks when required, and nodes may move and change location at will. The autonomy of these devices depends on how their resources are used, so protocols, algorithms and models must be designed efficiently so as not to degrade device performance, always seeking a balance between overhead and usability. Appropriate management of these networks is especially important when they are used in critical scenarios such as emergencies, natural disasters or armed conflicts. This doctoral thesis presents an efficient solution for managing mobile ad hoc networks. The solution comprises two main components: a management model for highly available mobile networks and a hierarchical routing protocol associated with the model. The proposed management model, called High Availability Management Ad Hoc Network (HAMAN), is defined as a four-level structure: access, distribution, intelligence and infrastructure. The components of each level (node types, protocols and operation) are described, as are the communication interfaces between components and their relationship to the defined levels.
As part of the model, an ad hoc routing protocol called Backup Cluster Head Protocol (BCHP) is designed, which uses clusters and hierarchies as its routing strategy. Each cluster has a cluster head that concentrates routing and management information and forwards it to the destination when the destination lies outside the cluster's coverage area. To improve network availability, the protocol uses a backup cluster head, which assumes the functions of the cluster's main node when that node fails. The HAMAN model is validated through simulation of the BCHP protocol. BCHP is implemented in Network Simulator 2 (NS2) and simulated, compared and contrasted with the hierarchical Cluster Based Routing Protocol (CBRP) and with the reactive ad hoc routing protocol Ad Hoc On Demand Distance Vector Routing (AODV).
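The backup-cluster-head failover at the heart of BCHP can be sketched as follows. This is a hypothetical illustration with invented names, not the BCHP specification: each cluster keeps a head and a backup, and when the head fails the backup takes over immediately so routing and management state stays available.

```python
class Cluster:
    def __init__(self, head, backup, members):
        self.head, self.backup, self.members = head, backup, members
        self.routes = {}   # routing/management state concentrated at the head

    def active_head(self):
        return self.head

    def on_head_failure(self):
        # The backup assumes the head role; a new backup is chosen from the
        # remaining members (election policy here is a placeholder).
        self.head = self.backup
        self.backup = self.members.pop(0) if self.members else None

c = Cluster(head="n1", backup="n2", members=["n3", "n4"])
print(c.active_head())            # n1
c.on_head_failure()
print(c.active_head(), c.backup)  # n2 n3
```

The availability gain over a single-head scheme like CBRP is that the cluster never waits for a fresh election before routing resumes; only the backup slot needs refilling.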
Abstract:
Participatory sensing combines the ubiquity of mobile phones with the sensing capabilities of wireless sensor networks. It targets the pervasive collection of information, e.g., temperature, traffic conditions, or health-related data. As users produce measurements from their mobile devices, voluntary participation becomes essential. However, a number of privacy concerns -- due to the personal information conveyed by data reports -- hinder the large-scale deployment of participatory sensing applications. Prior work on privacy protection for participatory sensing has often relied on unrealistic assumptions and offered no provably secure guarantees. The goal of this project is to introduce PEPSI: a Privacy-Enhanced Participatory Sensing Infrastructure. We explore realistic architectural assumptions and a minimal set of (formal) privacy requirements, aiming to protect the privacy of both data producers and consumers. We design a solution that attains provably secure privacy guarantees at very low additional computational cost and with almost no extra communication overhead.
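The matching problem PEPSI solves can be illustrated with a deliberately simplified stand-in. PEPSI's actual construction is based on identity-based encryption; here an HMAC tag under a key shared by producers and consumers plays that role, so the untrusted service provider can match reports to subscriptions without learning the report topic. Key, topic, and field names are all invented.

```python
import hashlib
import hmac

KEY = b"shared-registration-secret"   # assumed distributed during registration

def tag(topic: str) -> str:
    """Opaque per-topic tag; the broker cannot invert it without KEY."""
    return hmac.new(KEY, topic.encode(), hashlib.sha256).hexdigest()

# A producer uploads (tag, encrypted payload); a consumer subscribes by tag.
report = {"tag": tag("air-quality/zurich"), "payload": b"<ciphertext>"}
subscription = tag("air-quality/zurich")

# The untrusted broker compares opaque tags only; it never sees topic strings
# or plaintext measurements.
print(report["tag"] == subscription)  # True -> report is forwarded
```

This sketch conveys only the matching flow; it does not provide PEPSI's formal guarantees (e.g., a compromised consumer key here breaks producer privacy, which the IBE-based design addresses).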
Abstract:
Although context could be exploited to improve performance, elasticity and adaptation in most distributed systems that adopt the publish/subscribe (P/S) model of communication, only very few works have explored domains with highly dynamic context, and most adopted models are context-agnostic. In this paper, we present the key design principles underlying a novel context-aware content-based P/S (CA-CBPS) model of communication in which context is explicitly managed, focusing on minimizing network overhead in domains with recurrent context changes thanks to contextual scoping. We highlight how we dealt with the main shortcomings of most current approaches. Our research is among the first to study the problem of explicitly introducing context-awareness into the P/S model in order to capitalize on contextual information. The envisioned CA-CBPS middleware enables the cloud ecosystem of services to communicate very efficiently, in a decoupled but contextually scoped fashion.
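Contextual scoping can be sketched with a toy broker. The API and field names below are invented for illustration, not the CA-CBPS middleware: a notification is delivered only to subscribers whose declared context matches the event's context scope in addition to the usual content filter, which is what cuts traffic under frequent context changes.

```python
class ContextBroker:
    def __init__(self):
        self.subs = []   # (content predicate, subscriber context, callback)

    def subscribe(self, predicate, context, callback):
        self.subs.append((predicate, context, callback))

    def publish(self, event, scope):
        # Deliver only where both the content filter and the context scope match.
        for predicate, context, callback in self.subs:
            if predicate(event) and all(context.get(k) == v for k, v in scope.items()):
                callback(event)

delivered = []
broker = ContextBroker()
broker.subscribe(lambda e: e["type"] == "temp",
                 context={"location": "lab"}, callback=delivered.append)
broker.subscribe(lambda e: e["type"] == "temp",
                 context={"location": "office"}, callback=delivered.append)

# Both subscriptions match on content, but only one is in the context scope.
broker.publish({"type": "temp", "value": 21}, scope={"location": "lab"})
print(len(delivered))  # 1
```

A context-agnostic broker would have delivered the event to both subscribers; scoping halves the traffic in this small example without changing the content filters.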