Abstract:
The use of Wireless Sensor Networks (WSNs) in healthcare systems has received considerable attention in recent years. In much of this research, tasks such as sensor data processing, health-state decision making, and emergency message notification are performed by a remote server. Large numbers of patients generating large volumes of sensor data consume considerable communication resources, burden the remote server, and delay both decisions and notifications. In this paper, a healthcare application for elderly people using a WSN is simulated. A WSN designed for the proposed healthcare application needs efficient MAC and routing protocols to guarantee the reliability of the data delivered from the patients to the medical centre. Based on these requirements, a cross-layer protocol built on modified versions of APTEEN and GinMAC has been designed and implemented, with new features such as a mobility module and route discovery algorithms. Simulation results show that the proposed cross-layer protocol conserves node energy and provides the performance required by the healthcare application in terms of network lifetime, delay, and reliability.
Abstract:
The assessment of routing protocols for mobile wireless networks is a difficult task because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay-tolerant networks (DTNs), have more predictable dynamics: the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph-theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, with a view to constructing least-cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there has been no study of the use of these theoretical results in practical situations. The objective of our work is therefore to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator first to implement an evolving-graph-based routing protocol, and then to use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, at least those with predictable dynamics. To make the model widely applicable, however, some practical issues, such as adaptive algorithms, still have to be addressed and incorporated into the model. We also discuss these issues in this paper, drawing on our experience.
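To illustrate the evolving-graph idea behind this abstract, the sketch below computes a foremost (earliest-arrival) journey over a list of timed contacts. It is a minimal stand-in, not the protocol evaluated in the paper: the contact-list representation, the one-step traversal time, and the function name are assumptions made for the example.

```python
from heapq import heappush, heappop

def foremost_journey(contacts, source, target):
    """Earliest arrival time from source to target in an evolving graph.

    contacts: iterable of (u, v, t) meaning the undirected edge u-v
    exists at time step t; traversing an edge takes one time step.
    Returns the earliest arrival time at target, or None if unreachable.
    """
    adj = {}
    for u, v, t in contacts:
        adj.setdefault(u, []).append((v, t))
        adj.setdefault(v, []).append((u, t))
    best = {source: 0}
    heap = [(0, source)]              # (arrival time, node)
    while heap:
        time, node = heappop(heap)
        if node == target:
            return time
        if time > best.get(node, float("inf")):
            continue                  # stale heap entry
        for nxt, t in adj.get(node, []):
            if t >= time:             # contact usable only after arrival
                arrival = t + 1
                if arrival < best.get(nxt, float("inf")):
                    best[nxt] = arrival
                    heappush(heap, (arrival, nxt))
    return None
```

The Dijkstra-like structure reflects the paper's point that, once the topology dynamics are known in advance, least-cost (here, foremost) journeys can be computed deterministically.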
Abstract:
This thesis concentrates on a very interesting problem, the Vehicle Routing Problem (VRP). In this problem, customers or cities have to be visited and packages have to be transported to each of them, starting from a base point on the map. The goal is to solve the transportation problem: to deliver the packages on time, with enough packages for each customer, using the available resources, and, of course, as efficiently as possible. Although this problem may seem easy to solve for a small number of cities or customers, it is not. The algorithm has to cope with several constraints, for example opening hours, package delivery times, and truck capacities. This makes it a so-called Multi-Constraint Optimization Problem (MCOP). What is more, the problem is intractable with the amount of computational power available to most of us: as the number of customers grows, the number of calculations grows exponentially, because all constraints have to be checked for each customer, and the goal is to find a solution that is good enough before the time allowed for the calculation runs out. The first chapter introduces the problem from its basics, the Traveling Salesman Problem; using some theoretical and mathematical background, it shows why the problem is so hard to optimize and why, even though it is so hard and no best algorithm is known for huge numbers of customers, it is worth tackling. Just think of a huge transportation company with tens of thousands of trucks and millions of customers: how much money could be saved if we knew the optimal path for all our packages?
Although no best algorithm is known for this kind of optimization problem, we try to give an acceptable solution in the second and third chapters, where two algorithms are described: the Genetic Algorithm and Simulated Annealing. Both are based on imitating processes of nature and materials science. These algorithms will hardly ever find the best solution to the problem, but in many cases they can give a very good solution within acceptable calculation time. These chapters describe the Genetic Algorithm and Simulated Annealing in detail, from their basis in the "real world" through their terminology to their basic implementation. The work puts emphasis on the limits of these algorithms, their advantages and disadvantages, and a comparison between them. Finally, after this theory is presented, a simulation is executed in an artificial VRP environment with both Simulated Annealing and the Genetic Algorithm; they solve the same problem in the same environment and are compared with each other. The environment, the implementation, and the test results obtained are also described. Lastly, possible improvements to these algorithms are discussed, and the work tries to answer the "big" question, "Which algorithm is better?", if that question even exists.
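As a taste of one of the two algorithms the thesis describes, here is a minimal simulated-annealing sketch for the underlying Traveling Salesman Problem (the thesis's starting point), not the full multi-constraint VRP. The 2-opt move operator, geometric cooling schedule, and all parameter values are illustrative assumptions, not the thesis's implementation.

```python
import math
import random

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def anneal_tsp(points, iters=20000, t0=10.0, cooling=0.9995, seed=0):
    """Minimal simulated-annealing sketch for the TSP.

    Moves are 2-opt style segment reversals; a worse tour is accepted
    with probability exp(-delta / temperature), so the search can escape
    local optima while the temperature is high.
    """
    rng = random.Random(seed)
    order = list(range(len(points)))
    best, best_len = order[:], tour_length(points, order)
    cur_len, temp = best_len, t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(len(points)), 2))
        cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        cand_len = tour_length(points, cand)
        delta = cand_len - cur_len
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            order, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = order[:], cur_len
        temp *= cooling          # geometric cooling schedule
    return best, best_len
```

On a tiny instance such as the four corners of a unit square, the optimal tour (the perimeter, length 4) is found easily; the thesis's point is precisely that quality degrades gracefully, rather than catastrophically, as instances grow.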
Abstract:
This thesis presents DCE, or Dynamic Conditional Execution, as an alternative to reduce the cost of mispredicted branches. The basic idea is to fetch all paths produced by a branch that obeys certain restrictions regarding complexity and size. As a result, fewer predictions are performed and, therefore, fewer branches are mispredicted. DCE fetches through selected branches, avoiding disruptions in the fetch flow when these branches are fetched. Both paths of a selected branch are executed, but only the correct path commits. In this thesis we propose an architecture to execute multiple paths of selected branches. Branches are selected based on their size and other conditions. Simple and complex branches can be dynamically predicated without requiring a special instruction set or special compiler optimizations. Furthermore, a technique to reduce part of the overhead generated by the execution of multiple paths is proposed. The performance gain reaches up to 12% when comparing a local predictor used in DCE against a global predictor used in the reference machine. When both machines use a local predictor, the speedup averages 3-3.5%.
Abstract:
The wavelet transform is used to reduce the high-frequency multipath of pseudorange and carrier-phase GPS double differences (DDs). This transform decomposes the DD signal, thereby separating the high frequencies due to multipath effects. After the decomposition, wavelet shrinkage is performed by thresholding to eliminate the high-frequency component; the signal can then be reconstructed without it. We show how to choose the best threshold. Although high-frequency multipath is not the main multipath error component, correcting it provides improvements of about 30% in the average pseudorange residuals and 24% in the carrier-phase residuals. The results also show that the ambiguity solutions become more reliable after correcting the high-frequency multipath.
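The decompose-threshold-reconstruct cycle described in this abstract can be sketched with a one-level Haar transform and soft thresholding. This is only a toy stand-in for the paper's method: a real implementation would use a deeper decomposition and a carefully chosen wavelet and threshold, whereas everything here (Haar basis, single level, fixed threshold) is an assumption made for illustration.

```python
import math

def haar_shrink(signal, threshold):
    """One-level Haar wavelet shrinkage (sketch).

    Decompose the signal into approximation (low-frequency) and detail
    (high-frequency) coefficients, soft-threshold the details, then
    reconstruct. Signal length must be even.
    """
    s = math.sqrt(2.0)
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s for i in range(half)]
    # Soft thresholding: shrink each detail coefficient toward zero.
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    # Inverse transform from the (approx, thresholded detail) pair.
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out
```

With a large threshold the pairwise high-frequency wiggle is removed and each sample pair collapses to its local mean, which is the "reconstruction without the high-frequency component" the abstract refers to.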
Abstract:
Wavelets are extensively used in geodetic applications. In this paper, Multi-Resolution Analysis (MRA) using wavelets is applied to pseudorange and carrier-phase GPS double differences (DDs) in order to reduce multipath effects. Wavelets have already been applied to GPS carrier-phase DDs, but some questions remain: how good can the results be, and are all multipath effects reduced? These questions are discussed in this paper. The wavelet transform is used to decompose the DD signals, splitting them into lower-resolution components. After the decomposition, wavelet shrinkage is performed by thresholding to eliminate the components related to multipath effects; the DD observation is then reconstructed, and this new DD signal is used in the baseline processing. The daily multipath repeatability was verified. With the proposed approach, the reliability of the ambiguity resolution and the accuracy of the results improved compared with the standard procedure. Furthermore, the method proved computationally very efficient: at a practical level, no difference in processing time is noticed between runs with and without the proposed method. However, only the high-frequency multipath was eliminated.
Abstract:
Integer carrier-phase ambiguity resolution is the key to rapid and high-precision global navigation satellite system (GNSS) positioning and navigation. As important as the integer ambiguity estimation is the validation of the solution, because even when one uses an optimal, or close to optimal, integer ambiguity estimator, an unacceptable integer solution can still be obtained. This can happen, for example, when the data are degraded by multipath effects, which affect the real-valued float ambiguity solution, leading to an incorrect integer (fixed) ambiguity solution. It is therefore important to use a statistical test with a sound theoretical and probabilistic basis, which has become possible with the Ratio Test Integer Aperture (RTIA) estimator. The properties and underlying concept of this test are briefly described. An experiment was performed using data with and without multipath: reflective objects were placed around the receiver antenna to cause multipath, and a method based on multi-resolution analysis by wavelet transform was used to reduce the multipath in the GPS double-difference (DD) observations. The objective of this paper is thus to compare ambiguity resolution and validation in two situations: data with multipath, and data with the multipath reduced by wavelets. Additionally, the accuracy of the estimated coordinates is assessed by comparison with ground-truth coordinates, which were estimated using data free of multipath effects. The success and failure probabilities of the RTIA were, in general, coherent and demonstrated the efficiency and reliability of this statistical test. After multipath mitigation, ambiguity resolution becomes more reliable and the coordinates more precise. © Springer-Verlag Berlin Heidelberg 2007.
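The validation step this abstract revolves around can be sketched with the classic fixed-threshold ratio test. Note this is not the full RTIA estimator, which derives its critical value from a prescribed failure rate; the unweighted distance metric and the threshold of 3 below are simplifying assumptions for illustration.

```python
def sq_dist(float_amb, int_cand):
    # Unweighted squared distance between the float ambiguity vector and
    # an integer candidate; a real GNSS implementation would use the
    # ambiguity covariance matrix as the metric (simplified here).
    return sum((f - z) ** 2 for f, z in zip(float_amb, int_cand))

def ratio_test(float_amb, best_cand, second_cand, threshold=3.0):
    """Classic ambiguity ratio test (sketch, not the full RTIA).

    Accept the best integer candidate only when the second-best candidate
    fits the float solution sufficiently worse than the best one does.
    """
    d_best = sq_dist(float_amb, best_cand)
    d_second = sq_dist(float_amb, second_cand)
    if d_best == 0.0:
        return True   # float solution already coincides with an integer vector
    return d_second / d_best >= threshold
```

When multipath inflates the float solution's errors, the two closest integer candidates fit almost equally well, the ratio drops below the threshold, and the fix is (correctly) rejected, which is why multipath mitigation improves the validation statistics reported above.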
Abstract:
Low-frequency multipath is still one of the major challenges for high-precision GPS relative positioning. In kinematic applications especially, because of geometry changes, low-frequency multipath is difficult to remove or model. Spectral analysis offers a powerful technique for analyzing this kind of non-stationary signal: the wavelet transform. However, several processes and specific processing strategies must work together in order to detect and efficiently mitigate low-frequency multipath, and these processes are discussed in this paper. Experiments were carried out in kinematic mode with a controlled and known vehicle movement; the data were collected in the presence of a reflector surface placed close to the vehicle to cause, mainly, low-frequency multipath. From the analyses performed, the results in terms of double-difference residuals and statistical tests showed that the proposed methodology is very efficient at detecting and mitigating low-frequency multipath effects. © 2008 IEEE.
Abstract:
To ensure high-accuracy results from GPS relative positioning, multipath effects have to be mitigated. Although careful selection of the antenna site and the use of special antennas and receivers can minimize multipath, it cannot always be eliminated, and the residual multipath disturbance frequently remains the major error in GPS results. High-frequency multipath from large delays can be attenuated by double-difference (DD) denoising methods, but low-frequency multipath from short delays is very difficult to reduce or model. This paper proposes a method based on wavelet regression (WR) that can effectively detect and reduce low-frequency multipath. The wavelet technique is first applied to decompose the DD residuals into low-frequency bias and high-frequency noise components. The bias components extracted by WR are then applied directly to the DD observations to remove the trend. The remaining terms, largely characterized by high-frequency measurement noise, are expected to give the best linear unbiased solutions from a least-squares (LS) adjustment. An experiment was carried out with objects placed close to the receiver antenna to cause, mainly, low-frequency multipath. The data were collected over two days to verify the multipath repeatability, and the ground-truth coordinates were computed from data collected in the absence of the reflector objects. The coordinates and ambiguity solutions were compared with and without multipath mitigation by WR. After mitigating the multipath, ambiguity resolution became more reliable and the coordinates more accurate.
Abstract:
The Capacitated Arc Routing Problem (CARP) is a well-known NP-hard combinatorial optimization problem in which, given an undirected graph, the objective is to find a minimum-cost set of tours servicing a subset of required edges under vehicle capacity constraints. The CARP has numerous applications, such as street sweeping, garbage collection, mail delivery, school bus routing, and meter reading. A Greedy Randomized Adaptive Search Procedure (GRASP) with Path-Relinking (PR) is proposed and compared with other successful CARP metaheuristics. Features of this GRASP with PR include (i) reactive parameter tuning, where the parameter value is selected stochastically, biased in favor of the values that have historically produced the best solutions on average; (ii) a statistical filter, which discards initial solutions that are unlikely to improve on the incumbent best solution; (iii) infeasible local search, where high-quality but infeasible solutions are used to explore the boundary between the feasible and infeasible regions of the solution space; and (iv) evolutionary PR, a recent trend in which the pool of elite solutions is progressively improved by successive relinking of pairs of elite solutions. Computational tests were conducted on a set of 81 instances, and the results show that the GRASP is very competitive, achieving the best overall deviation from lower bounds and the highest number of best solutions found. © 2011 Elsevier Ltd. All rights reserved.
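A bare-bones GRASP loop, greedy randomized construction plus local search, can be sketched on a toy routing instance; a small TSP stands in for the CARP here, whose required-edge and capacity structure is omitted. The RCL size, iteration count, and 2-opt move are illustrative assumptions, and the path-relinking and reactive tuning described in the paper are not included.

```python
import math
import random

def tour_len(pts, order):
    """Length of the closed tour visiting pts in the given order."""
    return sum(math.dist(pts[order[i]], pts[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def grasp_tour(pts, iters=50, rcl_size=3, seed=0):
    """GRASP sketch: randomized greedy construction + 2-opt local search.

    Each iteration builds a tour with a nearest-neighbour rule that picks
    randomly from a restricted candidate list (RCL) of the closest
    unvisited points, then improves the tour with first-improvement
    2-opt, keeping the best tour found across iterations.
    """
    rng = random.Random(seed)
    best, best_len = None, float("inf")
    for _ in range(iters):
        # Greedy randomized construction.
        remaining = list(range(len(pts)))
        order = [remaining.pop(rng.randrange(len(remaining)))]
        while remaining:
            remaining.sort(key=lambda c: math.dist(pts[order[-1]], pts[c]))
            order.append(remaining.pop(rng.randrange(min(rcl_size, len(remaining)))))
        # Local search: first-improvement 2-opt.
        improved = True
        while improved:
            improved = False
            for i in range(1, len(order) - 1):
                for j in range(i + 1, len(order)):
                    cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
                    if tour_len(pts, cand) < tour_len(pts, order) - 1e-12:
                        order, improved = cand, True
        if tour_len(pts, order) < best_len:
            best, best_len = order, tour_len(pts, order)
    return best, best_len
```

The randomized RCL is what distinguishes GRASP from plain greedy restarts: each iteration starts from a different high-quality solution, giving the local search diverse basins to explore.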
Abstract:
A significant amount of information stored in different databases around the world can be shared through peer-to-peer databases. This yields a large knowledge base without the need for large investments, because existing databases and infrastructure are used. However, the structural characteristics of peer-to-peer systems make the process of finding such information complex. Moreover, these databases are often heterogeneous in their schemas but semantically similar in their content. A good peer-to-peer database system should allow the user to access information from databases scattered across the network and receive only the information truly related to the topic of interest. This paper proposes using ontologies in peer-to-peer database queries to represent the semantics inherent in the data. The main contributions of this work are to enable integration between heterogeneous databases, to improve the performance of such queries, and to use the Ant Colony optimization algorithm to solve the problem of locating information in peer-to-peer networks, which yields an 18% improvement in the results. © 2011 IEEE.
Abstract:
In a peer-to-peer network, the nodes interact with each other by sharing resources, services, and information. Many applications have been developed on such networks, one class being peer-to-peer databases. Peer-to-peer database systems allow the sharing of unstructured data and can integrate data from several sources without the need for large investments, because existing repositories are used. However, the high flexibility and dynamicity of the network, as well as the absence of centralized management of information, make the process of locating information among the various participants complex. In this context, this paper makes original contributions by proposing an architecture for a routing system that uses the Ant Colony algorithm to optimize the search for the desired information, supported by ontologies that add semantics to the shared data. This enables integration among heterogeneous databases while seeking to reduce message traffic on the network without reducing the number of responses, confirmed by a 22.5% improvement in that number. © 2011 IEEE.
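The ant-colony search for information routes described in these two papers can be sketched, in a much-reduced form, as pheromone-guided path construction on a cost graph. The transition rule, deposit rule, and all parameter values below are generic ACO assumptions for illustration, not the architecture proposed in the papers.

```python
import random

def aco_shortest_path(graph, source, target, n_ants=20, n_iters=30,
                      evaporation=0.5, seed=0):
    """Minimal ant-colony sketch for locating a route to a resource.

    graph: dict node -> dict(neighbour -> cost). Each ant walks from
    source toward target, choosing the next hop with probability
    proportional to pheromone / cost; shorter completed paths then
    receive more pheromone, biasing later ants toward them.
    """
    rng = random.Random(seed)
    pher = {(u, v): 1.0 for u in graph for v in graph[u]}
    best, best_cost = None, float("inf")
    for _ in range(n_iters):
        completed = []
        for _ in range(n_ants):
            node, path, cost, seen = source, [source], 0.0, {source}
            while node != target:
                choices = [v for v in graph[node] if v not in seen]
                if not choices:
                    path = None       # dead end: ant abandons the walk
                    break
                weights = [pher[(node, v)] / graph[node][v] for v in choices]
                node = rng.choices(choices, weights=weights)[0]
                cost += graph[path[-1]][node]
                path.append(node)
                seen.add(node)
            if path is not None:
                completed.append((path, cost))
                if cost < best_cost:
                    best, best_cost = path, cost
        # Evaporate, then deposit pheromone along each completed path.
        for edge in pher:
            pher[edge] *= 1.0 - evaporation
        for path, cost in completed:
            for u, v in zip(path, path[1:]):
                pher[(u, v)] += 1.0 / cost
    return best, best_cost
```

Evaporation is what lets the colony forget stale routes, which is also why ACO suits the dynamic peer-to-peer setting the papers target: when a peer leaves, pheromone on routes through it simply decays.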