801 results for Cognitive Radio Sensor Networks (CRSN)


Relevance:

100.00%

Publisher:

Abstract:

Genetic programming is known to provide good solutions for many problems, such as the evolution of network protocols and distributed algorithms. In such cases it is usually a hardwired module of a design framework that assists the engineer in optimizing specific aspects of the system to be developed, and it provides its results in a fixed format through an internal interface. In this paper we show how the utility of genetic programming can be increased remarkably by isolating it as a component and integrating it into the model-driven software development process. Our genetic programming framework produces XMI-encoded UML models that can easily be loaded into widely available modeling tools, which in turn provide code generation as well as additional analysis and test capabilities. We use the evolution of a distributed election algorithm as an example to illustrate how genetic programming can be combined with model-driven development. This example clearly illustrates a key advantage of our approach: the generation of source code in different programming languages.
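The pipeline the abstract describes ends with an evolved model serialized as XMI so that UML tools can load it. A minimal sketch of such serialization is below; the tag names, attributes, and the `ElectionNode` class are illustrative assumptions, not the paper's actual schema (real XMI as consumed by UML tools follows the OMG XMI specification).

```python
import xml.etree.ElementTree as ET

def model_to_xmi(classes):
    """Serialize a toy class model as a minimal XMI-like XML document.
    Element and attribute names are simplified placeholders."""
    root = ET.Element("XMI", {"xmi.version": "1.2"})
    content = ET.SubElement(root, "XMI.content")
    for name, operations in classes.items():
        cls = ET.SubElement(content, "UML.Class", {"name": name})
        for op in operations:
            ET.SubElement(cls, "UML.Operation", {"name": op})
    return ET.tostring(root, encoding="unicode")

# Hypothetical evolved model: one class with two operations.
print(model_to_xmi({"ElectionNode": ["broadcastId", "compareIds"]}))
```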

Relevance:

100.00%

Publisher:

Abstract:

Distributed systems are among the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. In recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous interconnection of such systems impose new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Research into new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed with respect to its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods that copy principles from natural evolution. They use a population of solution candidates which they refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the desired global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process whose solution candidates are distributed programs. The objective functions rate how closely these programs approximate the goal behavior in multiple randomized network simulations. Step by step, the evolutionary process selects the most promising solution candidates and modifies and combines them with mutation and crossover operators.
This way, a description of the global behavior of a distributed system is translated automatically into programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways of representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations designed by us, called Rule-based Genetic Programming (RBGP, eRBGP). We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features which make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches were developed especially to mitigate these difficulties. In our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and was, in most cases, superior to the other representations.
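The evolutionary loop sketched in this abstract (evaluate, select, mutate, recombine) can be illustrated generically. The toy version below evolves fixed-length bit strings against a stand-in objective function; the thesis's actual solution candidates are distributed programs rated in randomized network simulations, so every parameter here is an illustrative assumption.

```python
import random

def evolve(objective, length=20, pop_size=30, generations=60, seed=0):
    """Generic evolutionary loop: tournament selection, one-point
    crossover, bit-flip mutation, with elitism. The objective function
    is a stand-in for simulation-based fitness."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    best = max(pop, key=objective)
    for _ in range(generations):
        nxt = [best[:]]  # elitism: always keep the best candidate
        while len(nxt) < pop_size:
            # tournament selection of two parents
            p1 = max(rng.sample(pop, 3), key=objective)
            p2 = max(rng.sample(pop, 3), key=objective)
            cut = rng.randrange(1, length)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(length):          # bit-flip mutation
                if rng.random() < 1.0 / length:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
        best = max(pop, key=objective)
    return best

# Stand-in objective: maximize the number of ones ("OneMax").
print(sum(evolve(sum)))
```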

Relevance:

100.00%

Publisher:

Abstract:

The content of this paper is a snapshot of a current project looking at producing a real-time sensor-based building assessment tool, and a system that personalises workspaces using multi-agent technology. Both systems derive physical environment information from a wireless sensor network that allows clients to subscribe to real-time sensed data. The guiding principles of this project are energy efficiency and occupant well-being, pursued by leveraging the current state of the art in agent technology, wireless sensor networks and building assessment systems to enable the optimisation and assessment of buildings. Participants in this project come from both industry (construction and research) and academia.

Relevance:

100.00%

Publisher:

Abstract:

The Java language first came to public attention in 1995. Within a year, it was being speculated that Java might be a good language for parallel and distributed computing. Its core features, including object orientation, platform independence, built-in network support and threads, have encouraged this view. Today, Java is being used in almost every type of computer-based system, ranging from sensor networks to high-performance computing platforms, and from enterprise applications through to complex research-based simulations. In this paper the key features that make Java a good language for parallel and distributed computing are first discussed. Two Java-based middleware systems are then discussed: MPJ Express, an MPI-like Java messaging system, and Tycho, a wide-area asynchronous messaging framework with an integrated virtual registry. The paper concludes by highlighting the advantages of using Java as middleware to support distributed applications.


Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the impact of imperfect synchronisation on D-STBC when combined with incremental relay. To suppress this impact, a novel detection scheme is proposed which retains the two key features of the STBC principle: simplicity (i.e. linear computational complexity) and optimality (i.e. maximum likelihood). These two features make the new detector very suitable for low-power wireless networks (e.g. sensor networks).
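For context, the linear-complexity maximum-likelihood detection that STBC schemes inherit from Alamouti's code can be sketched as below. This is the textbook combiner under perfect synchronisation, a baseline for the setting the paper studies, not the paper's proposed detector for the imperfectly synchronised case.

```python
def alamouti_detect(r1, r2, h1, h2, constellation=(1 + 0j, -1 + 0j)):
    """Classical Alamouti combining followed by symbol-wise nearest-point
    decisions: linear in complexity yet maximum-likelihood for orthogonal
    STBC under perfect synchronisation (textbook baseline)."""
    # Combining step: each estimate is a linear function of r1, r2.
    s1_est = h1.conjugate() * r1 + h2 * r2.conjugate()
    s2_est = h2.conjugate() * r1 - h1 * r2.conjugate()
    # Decisions decouple per symbol thanks to the code's orthogonality;
    # the channel gain (|h1|^2 + |h2|^2) scales each candidate point.
    gain = abs(h1) ** 2 + abs(h2) ** 2
    decide = lambda z: min(constellation, key=lambda c: abs(z - gain * c))
    return decide(s1_est), decide(s2_est)

# Noiseless check with BPSK symbols s1 = +1, s2 = -1.
h1, h2 = 0.8 + 0.6j, 0.3 - 0.4j
s1, s2 = 1 + 0j, -1 + 0j
r1 = h1 * s1 + h2 * s2                            # first time slot
r2 = -h1 * s2.conjugate() + h2 * s1.conjugate()   # second time slot
print(alamouti_detect(r1, r2, h1, h2))
```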

Relevance:

100.00%

Publisher:

Abstract:

This paper discusses RFID implants for identification via a sensor network, and brain-computer implants linked into a wireless network. Biometric identification via body sensors is also discussed. The use of a network as a means for remote and distance monitoring of humans opens up a range of potential uses. Where implanted identification is concerned, this immediately offers high-security access to specific areas by means of only an RFID device. If a neural implant is employed, then the information exchanged with a network can take on a much richer form, allowing for identification of, and response to, an individual's needs based on the signals apparent on their nervous system.

Relevance:

100.00%

Publisher:

Abstract:

Wireless-technology-based pervasive healthcare has been proposed for many applications, such as disease management and accident prevention, to save costs and promote citizens' wellbeing. However, the emphasis so far has been on the artefacts, with limited attention to guiding the development of an effective and efficient solution for pervasive healthcare. This paper therefore proposes a framework for multi-agent system design for pervasive healthcare by adopting the concept of pervasive informatics and using the methods of organisational semiotics. The proposed multi-agent system for pervasive healthcare utilises sensory information to support healthcare professionals in providing appropriate care. The key contributions are both theoretical and practical. In theory, this paper articulates the information interactions between the pervasive healthcare environment and stakeholders using the methods of organisational semiotics; in practice, the proposed framework improves healthcare quality by providing appropriate medical attention when and as needed. In this paper, both the system and functional architectures of the multi-agent system are elaborated, with the use of wireless technologies such as RFID and wireless sensor networks. Future work will focus on the implementation of the proposed framework.

Relevance:

100.00%

Publisher:

Abstract:

Safety is of extreme priority in mining operations; many traditional mining countries are currently investing in the implementation of wireless sensors capable of detecting risk factors, using early warning signs to prevent accidents and significant economic losses. The objective of this research is to contribute to the implementation of sensors for continuous monitoring inside underground mines, providing technical parameters for the design of sensor networks applied in underground coal mines. The application of sensors capable of measuring variables of interest in real time promises great impact on safety in the mining industry. The relationship between geological conditions and mining method design establishes how a continuous monitoring system should be implemented. In this paper, the main causes of accidents in underground coal mines are established based on existing worldwide reports. Variables (temperature, gas, structural faults, fires) that can be related to the most frequent causes of disaster, and their relevant measuring ranges, are then presented; the advantages for management and mining operations are also discussed, including an analysis of applying these systems in terms of Benefit, Opportunity, Cost and Risk. The publication focuses on coal mining, given the proportion of these events each year worldwide in which a significant number of workers are seriously injured or killed. Finally, a dynamic assessment of safety in underground mines is proposed; this approach offers a contribution to the design of personalised monitoring networks, and the experience developed in coal mines provides a tool that facilitates the application of this technology within underground coal mines.

Relevance:

100.00%

Publisher:

Abstract:

A flood warning system incorporates telemetered rainfall and flow/water-level data measured at various locations in the catchment area. Real-time, accurate data collection is required for this use, and sensor networks improve the system's capabilities. However, existing sensor nodes struggle to satisfy hydrological requirements in terms of autonomy, sensor hardware compatibility, reliability and long-range communication. We describe the design and development of a real-time measurement system for flood monitoring, and its deployment in a flash-flood-prone 650 km2 semiarid watershed in Southern Spain. A purpose-built low-power, long-range communication device, DatalogV1, provides automatic data gathering and reliable transmission. DatalogV1 incorporates self-monitoring to adapt measurement schedules, both to manage consumption and to capture events of interest. Two tests are used to assess the success of the development. The results show an autonomous and robust monitoring system for long-term collection of water-level data at many sparse locations during flood events.

Relevance:

100.00%

Publisher:

Abstract:

Trust is one of the most important factors influencing the successful application of network service environments, such as e-commerce, wireless sensor networks, and online social networks. Computation models of trust and reputation have received special attention in both the computing and service science communities in recent years. In this paper, a dynamical computation model of reputation for B2C e-commerce is proposed. Firstly, concepts associated with trust and reputation are introduced, and a mathematical formula of trust for B2C e-commerce is given. A dynamical computation model of reputation is then proposed based on this conception of trust and the relationship between trust and reputation. In the proposed model, classical varying processes of reputation in B2C e-commerce are discussed. Furthermore, the iterative trust and reputation computation models are formulated via a set of difference equations based on a closed-loop feedback mechanism. Finally, a group of numerical simulation experiments illustrates the proposed model of trust and reputation. Experimental results show that the proposed model is effective in simulating the dynamical processes of trust and reputation for B2C e-commerce.
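To make the idea of iterating reputation via difference equations with feedback concrete, here is a minimal sketch. The exponential-smoothing form and the parameter `lam` are assumptions chosen for illustration, not the paper's actual formulation.

```python
def iterate_reputation(trust_series, reputation0=0.5, lam=0.2):
    """Hypothetical difference-equation model (illustrative only):
        R[t+1] = (1 - lam) * R[t] + lam * T[t]
    Each step feeds the previous reputation back into the update, so a
    single new trust observation moves reputation only gradually."""
    reputations = [reputation0]
    for t_val in trust_series:
        reputations.append((1 - lam) * reputations[-1] + lam * t_val)
    return reputations

# With consistently high trust ratings, reputation rises toward them.
trace = iterate_reputation([0.9] * 50)
print(round(trace[-1], 3))
```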

Relevance:

100.00%

Publisher:

Abstract:

The assessment of routing protocols for mobile wireless networks is a difficult task because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay-tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph-theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, in view of the construction of least-cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there has been no study of the use of such theoretical results in practical situations. Therefore, the objective of our work is to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator to first implement an evolving-graph-based routing protocol, and then to use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, at least those with predictable dynamics. In order to make this model widely applicable, however, some practical issues, such as adaptive algorithms, still have to be addressed and incorporated into the model. We also discuss such issues in this paper, drawing on our experience.
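To make the evolving-graph idea concrete, below is a minimal sketch of a foremost (earliest-arrival) journey computation on a graph whose edges exist only at given time steps. The representation (per-edge lists of active steps), unit traversal time, and unrestricted waiting at nodes are simplifying assumptions, not the cited model's exact definitions.

```python
import bisect
import heapq

def foremost_journey(edges, source):
    """Earliest arrival time at every node of an evolving graph.
    edges: dict (u, v) -> sorted list of time steps when the edge exists.
    Waiting at a node is free; traversing an edge takes one time step."""
    adj = {}
    for (u, v), times in edges.items():
        adj.setdefault(u, []).append((v, times))
        adj.setdefault(v, []).append((u, times))  # undirected
    arrival = {source: 0}
    pq = [(0, source)]
    while pq:
        t, u = heapq.heappop(pq)
        if t > arrival.get(u, float("inf")):
            continue  # stale queue entry
        for v, times in adj.get(u, []):
            i = bisect.bisect_left(times, t)  # next step the edge is up
            if i < len(times):
                cand = times[i] + 1
                if cand < arrival.get(v, float("inf")):
                    arrival[v] = cand
                    heapq.heappush(pq, (cand, v))
    return arrival

# A-B is up at steps 0 and 5; B-C only at step 5: reaching C means
# waiting at B, something a static shortest-path view cannot express.
edges = {("A", "B"): [0, 5], ("B", "C"): [5]}
print(foremost_journey(edges, "A"))  # → {'A': 0, 'B': 1, 'C': 6}
```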

Relevance:

100.00%

Publisher:

Abstract:

Enriquillo and Azuei are saltwater lakes located in a closed water basin in the southwestern region of the island of Hispaniola; they have been experiencing dramatic changes in total lake-surface area during the period 1980-2012. Lake Enriquillo had a surface area of approximately 276 km2 in 1984, gradually decreasing to 172 km2 in 1996. The surface area of the lake reached its lowest point in the satellite observation record in 2004, at 165 km2. Recent growth then began, with the lake reaching its 1984 size by 2006. Based on surface area measurements for June and July 2013, Lake Enriquillo has a surface area of ~358 km2. Lake Azuei's sizes at both ends of the record are 116 km2 in 1984 and 134 km2 in 2013, an overall 15.8% increase in 30 years. Determining the causes of lake surface area changes is of extreme importance due to their environmental, social, and economic impacts. The overall goal of this study is to quantify the changing water balance in these lakes and their catchment area using satellite and ground observations and a regional atmospheric-hydrologic modeling approach. Data analyses of environmental variables in the region reflect a hydrological imbalance of the lakes due to changing regional hydro-climatic conditions. Historical data show precipitation, land surface temperature and humidity, and sea surface temperature (SST) increasing over the region during the past decades. Salinity levels have also decreased by more than 30% from previously reported baseline levels. Here we present a summary of the historical data obtained, the new sensors deployed in the surrounding sierras and in the lakes, and the integrated modeling exercises, as well as the challenges of gathering, storing, sharing, and analyzing this large volume of data from such a diverse number of sources at a remote location.

Relevance:

100.00%

Publisher:

Abstract:

New business and technology platforms are required to sustainably manage urban water resources [1,2]. However, any proposed solutions must be cognisant of security, privacy and other factors that may inhibit adoption and hence impact. The FP7 WISDOM project (funded by the European Commission - GA 619795) aims to achieve a step change in water and energy savings via the integration of innovative Information and Communication Technologies (ICT) frameworks to optimize water distribution networks and to enable change in consumer behavior through innovative demand management and adaptive pricing schemes [1,2,3]. The WISDOM concept centres on the integration of water distribution, sensor monitoring and communication systems, coupled with semantic modelling (using ontologies, potentially connected to BIM, to serve as intelligent linkages throughout the entire framework) and control capabilities, to provide near real-time management of urban water resources. Fundamental to this framework are the needs and operational requirements of users and stakeholders at domestic, corporate and city levels, which require the interoperability of a number of demand and operational models fed with data from diverse sources such as sensor networks and crowdsourced information. This has implications for the provenance and trustworthiness of such data and how it can be used, not only in understanding system and user behaviours but, more importantly, in the real-time control of such systems. Adaptive and intelligent analytics will be used to produce decision support systems that will drive the ability to increase the variability of both supply and consumption [3]. This in turn paves the way for adaptive pricing incentives and a greater understanding of the water-energy nexus. This integration is complex and uncertain, being typical of a cyber-physical system, and its relevance transcends the water resource management domain.
The WISDOM framework will be modeled and simulated with initial testing at an experimental facility in France (AQUASIM, a full-scale test-bed facility for studying sustainable water management), then deployed and evaluated in two pilots in Cardiff (UK) and La Spezia (Italy). These demonstrators will evaluate the integrated concept, providing insight for wider adoption.