90 results for geographical data


Relevance: 20.00%

Publisher:

Abstract:

Sensor/actuator networks promise to extend automated monitoring and control into industrial processes. Avionics is one of the prominent technologies that can benefit greatly from dense sensor/actuator deployments. An aircraft with a smart sensing skin would help fulfill the vision of affordability and environmental friendliness by reducing fuel consumption. Achieving these properties is possible by providing an approximate representation of the air flow across the body of the aircraft and suppressing the detected aerodynamic drag. To the best of our knowledge, obtaining an accurate representation of the physical entity remains one of the most significant challenges in dense sensor/actuator networks. This paper offers an efficient way to acquire sensor readings from a very large sensor/actuator network located in a small area (a dense network). It presents LIA, a Linear Interpolation Algorithm that provides two important contributions. First, it demonstrates the effectiveness of employing a transformation matrix to mimic the environmental behavior. Second, it provides a practical solution for updating the previously defined matrix through a procedure called the learning phase. Simulation results reveal that the average relative error of the LIA algorithm can be reduced by as much as 60% by exploiting the transformation matrix.
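
The abstract does not give the algorithm's details; purely as a rough illustration of the general idea of estimating a full field of readings from a sampled subset through a transformation matrix, and refining that matrix from past observations, here is a minimal sketch (the names, the sampling pattern and the least-squares update are assumptions, not the authors' formulation):

import numpy as np

def estimate_field(T, sampled):
    """Approximate all n readings from the m sampled ones via an n x m transformation matrix T."""
    return T @ sampled

def learning_phase(history_sampled, history_full):
    """Fit T by least squares over past (sampled, full) snapshot pairs --
    a stand-in for the paper's learning phase, not its actual update rule."""
    S = np.column_stack(history_sampled)   # m x k matrix of sampled snapshots
    F = np.column_stack(history_full)      # n x k matrix of full snapshots
    return F @ np.linalg.pinv(S)           # least-squares solution of F ~= T S

# Hypothetical usage: 100 densely deployed sensors, only every 10th one is read each round.
rng = np.random.default_rng(0)
full = [np.sin(np.linspace(0, 3, 100)) + 0.01 * rng.standard_normal(100) for _ in range(20)]
sampled = [snapshot[::10] for snapshot in full]
T = learning_phase(sampled, full)
print(np.mean(np.abs(estimate_field(T, sampled[0]) - full[0])))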

Relevance: 20.00%

Publisher:

Abstract:

Managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). The physical parameters of the data center (such as power, temperature, pressure and humidity) are tightly coupled with computations, even more so in upcoming data centers, where the location of workloads can vary substantially due, for example, to workloads being moved in a cloud infrastructure hosted in the data center. In this paper, we describe a data collection and distribution architecture that enables gathering physical parameters of a large data center at a very high temporal and spatial resolution of the sensor measurements. We think this is an important characteristic for enabling more accurate heat-flow models of the data center and, with them, opportunities to optimize energy consumption. Having a high-resolution picture of the data center conditions also enables minimizing local hotspots, performing more accurate predictive maintenance (pending failures in cooling and other infrastructure equipment can be detected more promptly) and more accurate billing. We detail this architecture and define the structure of the underlying messaging system that is used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
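
The paper itself defines the message structure; as a purely illustrative sketch of what one message carrying a physical parameter with both temporal and spatial context might contain (all field names and types below are assumptions, not the paper's schema):

from dataclasses import dataclass

@dataclass
class SensorSample:
    """One reading published on the data-collection messaging bus (hypothetical schema)."""
    sensor_id: str        # unique identifier of the sensing device
    rack: str             # spatial location: rack identifier ...
    position_u: int       # ... and height within the rack
    metric: str           # "temperature", "humidity", "pressure", "power", ...
    value: float          # measured value in the metric's unit
    timestamp_ns: int     # acquisition time, nanoseconds since epoch

sample = SensorSample("th-0042", "rack-B07", 12, "temperature", 27.4, 1_700_000_000_000_000_000)
print(sample)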

Relevance: 20.00%

Publisher:

Abstract:

Consider the problem of designing an algorithm for acquiring sensor readings. Consider specifically the problem of obtaining an approximate representation of sensor readings where (i) sensor readings originate from different sensor nodes, (ii) the number of sensor nodes is very large, (iii) all sensor nodes are deployed in a small area (dense network) and (iv) all sensor nodes communicate over a communication medium where at most one node can transmit at a time (a single broadcast domain). We present an efficient algorithm for this problem, and our novel algorithm has two desirable properties: (i) it obtains an interpolation based on all sensor readings and (ii) it is scalable, that is, its time-complexity is independent of the number of sensor nodes. Achieving these two properties is possible thanks to the close interlinking of the information processing algorithm, the communication system and a model of the physical world.
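
As a toy illustration of how an interpolation step can stay independent of network size — only a fixed, small number of reported (location, reading) pairs ever enter the computation — here is an inverse-distance weighting sketch; the weighting scheme and the value of k are assumptions, not the paper's algorithm:

import numpy as np

def interpolate(points, values, query, eps=1e-9):
    """Inverse-distance weighted estimate at `query` from k reported (location, reading) pairs."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d < eps):                  # query coincides with one of the reporting nodes
        return float(values[np.argmin(d)])
    w = 1.0 / d**2
    return float(np.sum(w * values) / np.sum(w))

# Hypothetical usage: only k = 4 nodes report, regardless of how many nodes exist.
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([20.0, 21.5, 19.8, 22.0])
print(interpolate(points, values, np.array([0.5, 0.5])))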

Relevance: 20.00%

Publisher:

Abstract:

This paper analyses earthquake data from the perspective of dynamical systems and their Pseudo Phase Plane representation. The seismic data is collected from the Bulletin of the International Seismological Centre. The geological events are characterised by their magnitude and geographical location and described by means of time series of sequences of Dirac impulses. Fifty groups of data series are considered, according to the Flinn-Engdahl seismic regions of the Earth. For each region, Pearson's correlation coefficient is used to find the optimal time delay for reconstructing the Pseudo Phase Plane. The Pseudo Phase Plane plots are then analysed and characterised.
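
A minimal sketch of the delay-selection step described here — choosing a time delay via Pearson's correlation and pairing x(t) with x(t+tau) to form the Pseudo Phase Plane. Synthetic data stands in for the ISC event series, and the "minimal absolute correlation" criterion below is an assumed concrete choice; the abstract only states that the correlation coefficient guides the selection:

import numpy as np

def pearson_delay(x, max_lag):
    """Return the lag at which |Pearson correlation(x(t), x(t+lag))| is smallest."""
    corrs = [abs(np.corrcoef(x[:-lag], x[lag:])[0, 1]) for lag in range(1, max_lag + 1)]
    return 1 + int(np.argmin(corrs))

# Synthetic stand-in for a regional activity series.
t = np.arange(0, 200, 0.25)
x = np.sin(0.3 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
tau = pearson_delay(x, max_lag=100)
ppp = np.column_stack([x[:-tau], x[tau:]])   # points (x(t), x(t+tau)) of the Pseudo Phase Plane
print(tau, ppp.shape)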

Relevance: 20.00%

Publisher:

Abstract:

Wireless Sensor Networks (WSNs) are increasingly used in application domains such as home automation, agriculture, industry and infrastructure monitoring. As applications tend to leverage larger geographical deployments of sensor networks, the availability of an intuitive and user-friendly programming abstraction becomes a crucial factor in enabling faster and more efficient development and reprogramming of applications. We propose a programming pattern named sMapReduce, inspired by the Google MapReduce framework, for mapping application behaviors onto a sensor network and enabling complex data aggregation. The proposed pattern requires a user to create a network-level application in two functions, sMap and Reduce, in order to abstract away from the low-level details without sacrificing the control needed to develop complex logic. Such a two-fold division of programming logic is a natural fit for typical sensor network operation and makes sensing and topological modalities accessible to the user.
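
The abstract names the two user-supplied functions, sMap and Reduce, but not their signatures; below is a hypothetical sketch of what a network-level "average temperature per cluster" application could look like under such a pattern. The runtime, node object and function signatures are invented for illustration and are not the sMapReduce API:

# Each node maps its local sensing modality to a (key, value) pair; the framework
# would route pairs up the tree and apply reduce to the values sharing a key.

def smap(node):
    """Runs on each sensor node; exposes a sensing modality as a keyed reading."""
    return (node.cluster_id, node.read("temperature"))

def reduce(key, values):
    """Runs at aggregation points; combines all readings carrying one key."""
    return (key, sum(values) / len(values))

# Offline stand-in for the in-network runtime, just to show the data flow.
class FakeNode:
    def __init__(self, cluster_id, temp):
        self.cluster_id, self._temp = cluster_id, temp
    def read(self, modality):
        return self._temp

nodes = [FakeNode("c1", 21.0), FakeNode("c1", 23.0), FakeNode("c2", 19.5)]
pairs = [smap(n) for n in nodes]
for key in {k for k, _ in pairs}:
    print(reduce(key, [v for k, v in pairs if k == key]))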

Relevance: 20.00%

Publisher:

Abstract:

Networked control systems (NCSs) are spatially distributed systems in which the communication between sensors, actuators and controllers occurs through a shared, band-limited digital communication network. However, the use of a shared communication network, in contrast to using several dedicated independent connections, introduces new challenges, which are even more acute in large-scale and dense networked control systems. In this paper we investigate a recently introduced technique for gathering information from a dense sensor network to be used in networked control applications. Efficiently obtaining an approximate interpolation of the sensed data is exploited as offering a good trade-off between accuracy in the measurement of the input signals and the delay until actuation, both important aspects for the quality of control. We introduce a variation of the state-of-the-art algorithms which we prove performs better because it takes into account the changes of the input signal over time within the process of obtaining an approximate interpolation.
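
The abstract does not spell out the modification; one way of "taking into account changes of the input signal over time" could be to extrapolate each reported reading to the actuation instant before interpolating, e.g. with a first-order estimate of its rate of change. The sketch below is only this guess, not the published algorithm:

def compensate(reading_now, reading_prev, sample_interval, delay):
    """First-order extrapolation of a sensor reading to the (later) actuation instant."""
    rate = (reading_now - reading_prev) / sample_interval
    return reading_now + rate * delay

# Hypothetical usage: readings taken 0.1 s apart, actuation 0.05 s after the last sample.
print(compensate(21.4, 21.0, 0.1, 0.05))   # -> 21.6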

Relevance: 20.00%

Publisher:

Abstract:

Cluster scheduling and collision avoidance are crucial issues in large-scale cluster-tree Wireless Sensor Networks (WSNs). The paper presents a methodology that provides a Time Division Cluster Scheduling (TDCS) mechanism based on the cyclic extension of the RCPS/TC (Resource Constrained Project Scheduling with Temporal Constraints) problem for a cluster-tree WSN, assuming bounded communication errors. The objective is to meet all end-to-end deadlines of a predefined set of time-bounded data flows while minimizing the energy consumption of the nodes by setting the TDCS period as long as possible. Since each cluster is active only once during the period, the end-to-end delay of a given flow may span several periods when there are flows in opposite directions. The scheduling tool enables system designers to efficiently configure all required parameters of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs at network design time. The performance evaluation of the scheduling tool shows that problems with dozens of nodes can be solved using optimal solvers.

Relevance: 20.00%

Publisher:

Abstract:

Simulation analysis is an important approach to developing and evaluating systems in terms of development time and cost. This paper demonstrates the application of the Time Division Cluster Scheduling (TDCS) tool to the configuration of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs using simulation analysis, as an illustrative example that confirms the practical applicability of the tool. The simulation study analyses how the number of retransmissions impacts the reliability of data transmission, the energy consumption of the nodes and the end-to-end communication delay, based on a simulation model implemented in the Opnet Modeler. The configuration parameters of the network are obtained directly from the TDCS tool. The simulation results show that the number of retransmissions impacts the reliability, the energy consumption and the end-to-end delay in such a way that improving one may degrade the others.

Relevance: 20.00%

Publisher:

Abstract:

Cooperating objects (COs) is a recently coined term used to signify the convergence of classical embedded computer systems, wireless sensor networks, and robotics and control. We present essential elements of a reference architecture for scalable data processing for the CO paradigm.

Relevance: 20.00%

Publisher:

Abstract:

Consider a network where all nodes share a single broadcast domain, such as a wired broadcast network. Nodes take sensor readings, but individual sensor readings are not the most important pieces of data in the system. Instead, we are interested in aggregated quantities of the sensor readings, such as minimum and maximum values, the number of nodes and the median among a set of sensor readings on different nodes. In this paper we show that a prioritized medium access control (MAC) protocol may advantageously be exploited to efficiently compute aggregated quantities of sensor readings. In this context, we propose a distributed algorithm that has a very low time- and message-complexity for computing certain aggregated quantities. Importantly, we show that if every sensor node knows its geographical location, then sensor data can be interpolated with our novel distributed algorithm, and the message-complexity of the algorithm is independent of the number of nodes. Such an interpolation of sensor data can be used to compute any desired function; for example, the temperature gradient in a room (e.g., an industrial plant) densely populated with sensor nodes, or the gas concentration gradient within a pipeline or traffic tunnel.
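
As a sketch of the core trick — on a prioritized MAC where the numerically lowest priority wins arbitration, MIN can be computed in a single contention round by encoding each reading as a priority, and MAX follows by bit-complementing the values — here is an offline simulation. The bitwise arbitration model is a generic dominance protocol, not necessarily the exact MAC assumed in the paper:

def arbitrate(priorities, bits=16):
    """Bitwise dominance arbitration: nodes send their priority MSB first; a dominant
    bit (0) overrides a recessive bit (1), and a node that hears a dominant bit while
    sending a recessive one withdraws. The surviving priority equals min(priorities)."""
    alive = list(priorities)
    for b in reversed(range(bits)):
        dominant = [p for p in alive if not (p >> b) & 1]   # nodes sending 0 at this bit
        if dominant:
            alive = dominant
    return alive[0]

readings = [183, 205, 191, 178, 199]                          # e.g. temperatures in tenths of a degree
print(arbitrate(readings))                                    # MIN of the readings: 178
print(~arbitrate([~r & 0xFFFF for r in readings]) & 0xFFFF)   # MAX via bit-complement: 205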

Relevance: 20.00%

Publisher:

Abstract:

Doctoral Thesis in Information Systems and Technologies, Area of Engineering and Management of Information Systems

Relevance: 20.00%

Publisher:

Abstract:

The goal of this study is the analysis of the dynamical properties of financial data series from worldwide stock market indexes during the period 2000–2009. We analyze, under a regional criterion, ten main indexes at a daily time horizon. The methods and algorithms that have been explored for the description of dynamical phenomena provide an effective background for the analysis of economic data. We start by applying the classical concepts of signal analysis, the fractional Fourier transform, and methods of fractional calculus. In a second phase we adopt the multidimensional scaling approach. Stock market indexes are examples of complex interacting systems for which a huge amount of data exists. Therefore, these indexes, viewed from different perspectives, lead to new classification patterns.
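
For readers unfamiliar with the second phase, a compact sketch of multidimensional scaling applied to a set of index return series: synthetic data stands in for the ten indexes, and the correlation-based distance is one common choice, not necessarily the measure used in the study:

import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(2)
returns = rng.standard_normal((10, 2500))        # 10 hypothetical daily-return series
corr = np.corrcoef(returns)
dist = np.sqrt(2.0 * (1.0 - corr))               # correlation distance between index pairs

coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dist)
print(coords.shape)                              # each index becomes a point in the 2-D map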

Relevance: 20.00%

Publisher:

Abstract:

This paper addresses the calculation of derivatives of fractional order for non-smooth data. The noise is avoided by adopting an optimization formulation using genetic algorithms (GA). Given the flexibility of the evolutionary schemes, a hierarchical GA composed of two GAs in series, each with a distinct fitness function, is established.
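
The abstract does not reproduce the underlying definition; as background, here is a small sketch of the standard Grünwald-Letnikov approximation that numerical fractional derivatives commonly build on (the GA formulation itself is not reproduced here):

def gl_fractional_derivative(f, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha derivative of a uniformly
    sampled signal f (samples spaced h apart); returns the estimate at the last sample."""
    coeff = 1.0                       # (-1)^k * binom(alpha, k), built recursively
    total = coeff * f[-1]
    for k in range(1, len(f)):
        coeff *= (k - 1 - alpha) / k
        total += coeff * f[-1 - k]
    return total / h**alpha

# Half-derivative of f(t) = t on [0, 1]; the exact value at t = 1 is 2/sqrt(pi) ~ 1.128.
h = 1e-3
samples = [k * h for k in range(int(1 / h) + 1)]
print(gl_fractional_derivative(samples, 0.5, h))

On noisy, non-smooth samples this direct sum amplifies the noise, which is presumably what motivates recasting the computation as an optimization problem solved by the hierarchical GA.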

Relevance: 20.00%

Publisher:

Abstract:

Most traditional software and database development approaches tend to be serial, not evolutionary and certainly not agile, especially in data-oriented aspects. Most of the more commonly used methodologies are strict, meaning they are composed of several stages, each with very specific associated tasks. A clear example is the Rational Unified Process (RUP), divided into Business Modeling, Requirements, Analysis & Design, Implementation, Testing and Deployment. But what happens when the need for a well-designed and structured plan meets the reality of a small starting company that aims to build an entire user experience solution? Here, resource control and time productivity are vital, requirements are in constant change, and so is the product itself. To succeed in this environment, a highly collaborative and evolutionary development approach is mandatory, and constantly changing requirements imply an iterative development process. The project focuses on Data Warehouse development and business modeling, which is usually a tricky area: business knowledge is part of the enterprise, and how it works, its goals and what is relevant for analysis are internal business processes. Throughout this document it is explained why Agile Modeling development was chosen, and how an iterative and evolutionary methodology allowed for reasonable planning and documentation while permitting development flexibility, from idea to product. More importantly, it shows how this approach was applied to the development of a retail-focused Data Warehouse: a productized Data Warehouse built on the knowledge of not one but several clients' needs, one that aims not just to store the usual business areas but to create an innovative set of business metrics by joining them with store environment analysis, converting Business Intelligence into Actionable Business Intelligence.

Relevance: 20.00%

Publisher:

Abstract:

Dissertation presented to the Instituto Politécnico do Porto for the degree of Master in Management of Organisations, branch of Business Management. Supervised by Prof. Dr. Maria Rosário Moreira and Prof. Dr. Paulo Sousa.