920 results for Ad-Hoc Network
Abstract:
Cooperative caching is used in mobile ad hoc networks to reduce the latency perceived by mobile clients while retrieving data and to reduce the traffic load in the network. Caching also increases the availability of data in the face of server disconnections. The implementation of a cooperative caching technique essentially involves four major design considerations: (i) cache placement and resolution, which decides where to place and how to locate the cached data; (ii) cache admission control, which decides which data to cache; (iii) cache replacement, which makes the replacement decision when the cache is full; and (iv) consistency maintenance, i.e. maintaining consistency between the data in the server and the cache. In this paper we propose an effective cache resolution technique which reduces the number of messages flooded into the network to find the requested data. The experimental results are promising with respect to the metrics studied.
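Of the four design considerations, cache replacement is the easiest to make concrete. The abstract does not name a specific policy, so as an illustration only, here is a minimal least-recently-used (LRU) cache in Python, a common baseline replacement strategy:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least-recently-used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used
```

In a cooperative setting, each node would run such a local cache while the resolution mechanism decides which node to query; the policy above is only a common baseline, not the paper's technique.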
Abstract:
Mobile Ad-hoc Networks (MANETs) consist of a collection of mobile nodes without central coordination. In a MANET, node mobility and dynamic topology play an important role in performance. MANETs provide a solution for network connectivity anywhere, at any time. The major features of a MANET are quick setup, self-organization and self-maintenance. Routing is a major challenge in MANETs due to their dynamic topology and high mobility, and several routing algorithms have been developed. This paper studies the AODV protocol and how AODV performs under multiple connections in the network. Several issues have been identified, and bandwidth is recognized as the prominent factor reducing network performance. This paper presents an improvement of standard AODV for simultaneous multiple connections that takes the bandwidth of each node into consideration.
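The abstract does not detail the modification, but the underlying idea, admitting a new connection only if every node on the route has enough residual bandwidth for it, can be sketched as follows (the capacity model and function names are illustrative assumptions, not the paper's mechanism):

```python
def admit_connection(node_capacity, allocated, demand):
    """Bandwidth-aware admission sketch: a node accepts a new flow only
    if its residual capacity covers the flow's demand."""
    return node_capacity - allocated >= demand

def route_admits_flow(route_loads, demand):
    """A route can carry a new flow only if every node along it can.
    route_loads is a list of (capacity, already_allocated) per node."""
    return all(admit_connection(cap, used, demand) for cap, used in route_loads)
```

A bandwidth-aware AODV variant could apply such a check while processing route requests, rejecting routes through already-saturated nodes.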
Abstract:
With the increasing popularity of wireless networks and their applications, mobile ad hoc networks (MANETs) have recently emerged. MANET topology is highly dynamic and nodes are highly mobile, so the rate of link failure is high. There is no central control over the nodes: control is distributed among the nodes, and each can act as either a router or a source. MANETs have been considered isolated, stand-alone networks. Nodes can join or leave at any time, and the network is not infrastructure-dependent, so it can be set up anywhere, at any time, allowing trouble-free communication. Due to the higher chances of link failures, collisions and transmission errors in a MANET, network maintenance becomes costly. Studies show that frequent link failures are an important factor diminishing network performance, and they are not predictable. The main objective of this paper is to study route instability in the AODV protocol and suggest a solution for improvement. This paper proposes a new approach to reduce route failures by storing alternate routes in the intermediate nodes. In this algorithm, intermediate nodes are also involved in the route discovery process. This reduces the route establishment overhead as well as the time to find a new route when a link failure occurs.
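A minimal sketch of the alternate-route idea described above (the data structure and names are illustrative, not the paper's actual design): each node keeps a backup next hop per destination so it can fail over locally instead of re-flooding a route request.

```python
class RouteTable:
    """Per-destination route cache keeping a primary route and one backup."""

    def __init__(self):
        self.routes = {}  # destination -> list of (hop_count, next_hop)

    def add_route(self, dest, next_hop, hop_count):
        entries = self.routes.setdefault(dest, [])
        entries.append((hop_count, next_hop))
        entries.sort()   # best route (fewest hops) first
        del entries[2:]  # keep only primary + one alternate

    def next_hop(self, dest):
        entries = self.routes.get(dest)
        return entries[0][1] if entries else None

    def report_link_break(self, dest):
        """Drop the failed primary and promote the alternate, if any."""
        entries = self.routes.get(dest)
        if entries:
            entries.pop(0)
        return self.next_hop(dest)
```

Only when both stored routes fail would a node fall back to a fresh route discovery, which is the overhead the paper aims to reduce.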
Abstract:
Sensor networks are one of the fastest growing areas in the broad field of wireless ad hoc networking. A sensor node typically contains signal-processing circuits, micro-controllers and a wireless transmitter/receiver antenna. Energy saving is one of the critical issues for sensor networks, since most sensors are equipped with non-rechargeable batteries that have limited lifetime. […] of a packet is in transit at any one time. In GBR, each node in the network can look at its neighbors' hop count (depth) and use this to decide which node to forward the packet on to. If a node's power level drops below a certain level, it will increase its depth to discourage traffic.
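The GBR behaviour described above can be sketched as two small functions; the battery threshold and depth penalty values are illustrative assumptions, not values from the paper:

```python
def choose_next_hop(neighbors):
    """GBR-style forwarding sketch: forward toward the neighbor with the
    smallest advertised depth (hop count to the sink)."""
    return min(neighbors, key=lambda n: n["depth"])

def advertised_depth(true_depth, battery_level, threshold=0.3, penalty=2):
    """Energy shaping: a node low on battery advertises a larger depth
    to discourage neighbors from routing traffic through it."""
    return true_depth + (penalty if battery_level < threshold else 0)
```

A healthy node advertises its true depth; once its battery drops below the threshold, its inflated depth makes neighbors prefer other routes.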
Abstract:
We describe an adaptive, mid-level approach to the wireless device power management problem. Our approach is based on reinforcement learning, a machine learning framework for autonomous agents. We describe how our framework can be applied to the power management problem in both infrastructure and ad hoc wireless networks. From this thesis we conclude that mid-level power management policies can outperform low-level policies and are more convenient to implement than high-level policies. We also conclude that power management policies need to adapt to the user and network, and that a mid-level power management framework based on reinforcement learning fulfills these requirements.
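As an illustration of the reinforcement-learning framing (the thesis's actual state, action and reward design is not given in the abstract, so everything below is a toy assumption), a tabular Q-learning agent can learn when to put the radio to sleep:

```python
import random

def q_learning_power_manager(steps=2000, alpha=0.1, gamma=0.9,
                             epsilon=0.1, seed=0):
    """Toy Q-learning power manager. States: traffic level ('idle'/'busy');
    actions: 'sleep'/'awake'. Rewards are invented for illustration:
    sleeping while idle saves energy (+1), sleeping while busy drops
    packets (-2), staying awake has a small energy cost (-0.1)."""
    rng = random.Random(seed)
    states, actions = ("idle", "busy"), ("sleep", "awake")
    q = {(s, a): 0.0 for s in states for a in actions}

    def reward(state, action):
        if action == "sleep":
            return 1.0 if state == "idle" else -2.0
        return -0.1

    state = "idle"
    for _ in range(steps):
        # epsilon-greedy action selection
        if rng.random() < epsilon:
            action = rng.choice(actions)
        else:
            action = max(actions, key=lambda a: q[(state, a)])
        next_state = rng.choice(states)  # traffic arrives at random
        best_next = max(q[(next_state, a)] for a in actions)
        q[(state, action)] += alpha * (reward(state, action)
                                       + gamma * best_next
                                       - q[(state, action)])
        state = next_state
    return q
```

After training, the learned Q-values prefer sleeping when idle and staying awake when busy, which is the adaptive behaviour the thesis argues for at the mid level.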
Abstract:
Organ transplantation is considered one of the most significant advances of modern medicine and is an increasingly successful procedure in terms of patient survival, currently being the best treatment option for patients with numerous pathologies. The donation process is insufficient to cover the population's transplantation needs; it is therefore necessary to develop new strategies to strengthen the experience and effectiveness of existing programs. Health professionals' lack of knowledge, and their perception of and attitude toward issues related to the donation process, can turn them into either facilitators of or barriers to the identification of potential donors. For this reason, the available resources, attitudes toward donation, legislation, and knowledge of the processes involved in tissue and organ donation are critical. Given the influence of health professionals, the objectives of this thesis project are defined as follows: to determine the knowledge and skills of the health professionals in charge of organ and tissue transplantation in Regional 1, evaluated by means of an educational tool, in order to contribute to an efficient organ and tissue donation program, and likewise to establish recommendations aimed at increasing donation rates, with special emphasis on hospital activity in the country. METHODOLOGY: A study was carried out based on the analysis of the evaluation of knowledge of the organ and tissue donation-transplantation process among the health personnel participating in the educational tool called "Curso taller primer respondiente del potencial donante de órganos y tejidos" (first-responder workshop for the potential organ and tissue donor). This course included an evaluation form that was filled out anonymously by the participants before and after receiving the course content.
The study was conducted among health personnel of IPS institutions belonging to Regional I of the National Network for Organ and Tissue Donation and Transplantation. To determine whether there were differences in participants' knowledge before and after attending the course, the McNemar test was used (p < 0.05). RESULTS: Between July 2011 and June 2012, the "Curso taller primer respondiente del potencial donante de órganos y tejidos" was held, and 303 respondents were surveyed, including physicians, nurses and nursing assistants. At the beginning of the course, the proportion of correct answers regarding legislation, donor selection, brain death and donor maintenance was around 50%. It was not possible to identify a profession that might pose a risk in donor detection and the associated processes. After the course, 72% of the questions were answered correctly, a statistically significant increase. This change was statistically significant under the McNemar test, which yielded p = 0.00. DISCUSSION: The health personnel participating in the workshop, coming from units involved in generating donors, show a knowledge deficit regarding the donation-transplantation process, which may turn them into a limiting factor for that process.
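The McNemar test used in the study compares paired before/after binary outcomes (here, wrong-then-right versus right-then-wrong answers). A self-contained sketch using only the Python standard library; the continuity correction applied below is a common convention and not necessarily the variant the study used:

```python
from math import erfc, sqrt

def mcnemar_test(b, c, correction=True):
    """McNemar's test for paired binary outcomes.
    b = discordant pairs correct only after, c = correct only before.
    Returns (chi-square statistic, two-sided p-value)."""
    if b + c == 0:
        return 0.0, 1.0
    diff = abs(b - c) - (1 if correction else 0)
    chi2 = max(diff, 0) ** 2 / (b + c)
    # Chi-square survival function with 1 df: P(X > x) = erfc(sqrt(x / 2))
    return chi2, erfc(sqrt(chi2 / 2))
```

With many more participants improving than regressing, the statistic is large and the p-value vanishingly small, matching the significant improvement the study reports.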
Abstract:
1) To contribute to public policy decisions by highlighting the importance of higher education for development. 2) To present the current state of higher education financing at the global and regional levels, and to offer a set of recommendations and good practices that can be applied, with the necessary adjustments, in other contexts. 70 people from 43 countries. It examines global issues in the financing of higher education, analyzes regional perceptions of higher education, and studies leaders' perspectives on higher education financing. Delphi survey. The design of the questionnaire followed several premises. First, the subject matter had to be, in line with the publication, the financing of higher education. From there, work was done to identify which elements the questionnaire should contain so that it would be relevant in terms of content and, without being exhaustive, would raise key issues about the financing of higher education. Second, the brevity of the questionnaire was considered important, so it was stipulated that it should contain no more than six questions. With these guidelines, a series of questions related to higher education was identified. The factors influencing trends in higher education financing are: the massive expansion of higher education; the State's inability to finance this massive expansion and the consequent emergence of the private sector; the basis for cost-sharing with parents and students; public demand for accountability and value for money; the emergence of foreign providers through the General Agreement on Trade in Services (GATS); and, finally, the need to adjust State financing to reduce growing disparity.
1) In developed countries, higher education systems and institutions, as well as universities, are in an advantageous position because of their financial resources, because they are at the state of the art in research topics, and because of their easy access to information networks. However, cooperation with universities in developing countries is not only an ethical duty but also an irreplaceable source of knowledge. 2) Higher education should be considered a public service, regardless of its source of financing. This implies that higher education institutions, both public and private, assume a public commitment to society encompassing not only the know-how but also the knowing why and what for. This means enhancing the role of Higher Education Institutions (HEIs) in building and developing social and human resources, for which the State must establish ad hoc policies and promote, regulate and finance higher education. The starting principle must be that no one can be excluded from knowledge and its benefits. Only the State is in a position to adequately coordinate the use of resources and to prioritize and finance areas that are not profitable in the short term, such as research to prevent environmental pollution and promote sustainable development. 3) It is necessary to further diversify the models of higher education financing and the methods of tuition funding, in order to meet demand without affecting quality.
Abstract:
Wireless Personal Area Networks (WPANs) offer high data rates suitable for interconnecting high-bandwidth personal consumer devices (WirelessHD streaming, Wireless USB and Bluetooth EDR). ECMA-368 is the Physical (PHY) and Media Access Control (MAC) backbone of many of these wireless devices. WPAN devices tend to operate in an ad-hoc based network, and it is therefore important to successfully latch onto the network and become part of one of the available piconets. This paper presents a new algorithm for detecting the Packet/Frame Sync (PFS) signal in ECMA-368 to identify piconets and aid symbol timing. The algorithm is based on correlating the received PFS symbols with the expected locally stored symbols over the 24 or 12 PFS symbols, but selecting the likely TFC (Time-Frequency Code) as the statistical mode of the 24 or 12 best correlation results. The results are very favorable, showing an improvement margin of the order of 11.5 dB in reference sensitivity tests between the performance required using this algorithm and the performance of comparable systems.
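A simplified sketch of the mode-based selection idea: for each received PFS symbol, find the candidate TFC whose stored template correlates best, then declare the overall winner to be the statistical mode of the per-symbol winners, which tolerates a few noise-corrupted symbols. A real receiver correlates complex baseband samples; plain integer sequences stand in for the templates here.

```python
from collections import Counter

def detect_tfc(received_symbols, templates):
    """Pick the TFC whose template wins the per-symbol correlation most often.
    templates maps a TFC id to its expected (locally stored) symbol."""
    def correlate(a, b):
        return sum(x * y for x, y in zip(a, b))

    winners = []
    for sym in received_symbols:
        best = max(templates, key=lambda tfc: correlate(sym, templates[tfc]))
        winners.append(best)
    return Counter(winners).most_common(1)[0][0]
```

Taking the mode over 24 (or 12) per-symbol decisions rather than a single aggregate correlation is what gives the scheme its robustness in low-sensitivity conditions.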
Abstract:
The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. The lack of scalable and fault tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (EpidemicK-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art sampling methods and shows that the proposed method overcomes the limitations of the sampling-based approaches for skewed cluster distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.
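The key primitive that lets an epidemic K-Means avoid global communication is gossip averaging: nodes repeatedly exchange and average their local estimates, and every node converges to the global mean with no coordinator. The sketch below shows the primitive on scalars; in a K-Means setting each node would gossip per-cluster (sum, count) pairs and recompute centroids as sum/count. The pairwise-exchange model is a simplification, not the paper's actual protocol.

```python
import random

def gossip_average(local_values, rounds=200, seed=0):
    """Pairwise gossip averaging: at each step two random nodes average
    their estimates. The global mean is preserved at every step, and all
    estimates converge toward it."""
    rng = random.Random(seed)
    est = [float(v) for v in local_values]
    n = len(est)
    for _ in range(rounds):
        i, j = rng.randrange(n), rng.randrange(n)
        avg = (est[i] + est[j]) / 2
        est[i] = est[j] = avg
    return est
```

Because each exchange is local and any single lost message merely delays convergence, the scheme degrades gracefully under the message loss and node failures the abstract mentions.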
Abstract:
The assessment of routing protocols for mobile wireless networks is a difficult task, because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay-tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, in view of the construction of least-cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there has been no study of the use of such theoretical results in practical situations. Therefore, the objective of our work is to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator to first implement an evolving graph based routing protocol, and then to use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, at least those with predictable dynamics. In order to make this model widely applicable, however, some practical issues, such as adaptive algorithms, still have to be addressed and incorporated into the model. We also discuss such issues in this paper, drawing on our experience.
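In the evolving-graph model, a least-cost route is a "journey" that respects edge availability over time. The sketch below computes a foremost (earliest-arrival) journey, assuming unit traversal time and a schedule of availability instants per edge; this is a simplification for illustration, not the paper's benchmark implementation.

```python
import heapq

def earliest_arrival(schedule, source, target, start=0):
    """Foremost-journey computation on an evolving graph.
    schedule maps a directed edge (u, v) to a sorted list of times at
    which the edge is available; traversing an edge takes one time unit.
    Returns the earliest arrival time at target, or None if unreachable."""
    best = {source: start}
    heap = [(start, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == target:
            return t
        if t > best.get(u, float("inf")):
            continue  # stale queue entry
        for (a, b), times in schedule.items():
            if a != u:
                continue
            # earliest availability of edge (u, b) at or after time t
            nxt = next((x for x in times if x >= t), None)
            if nxt is not None and nxt + 1 < best.get(b, float("inf")):
                best[b] = nxt + 1
                heapq.heappush(heap, (nxt + 1, b))
    return best.get(target)
```

Note that, unlike static shortest paths, waiting at a node for an edge to appear can be optimal, which is exactly what makes such journeys a meaningful benchmark for DTN-like routing.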
Abstract:
The increasing capacity to integrate transistors has made it possible to develop complete systems, with several components, on a single chip, called SoCs (Systems-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs when it is a bus, or can be an ad hoc solution, such as a bus hierarchy. Thus, the ideal interconnection subsystem for SoCs is the Network-on-Chip (NoC). NoCs allow simultaneous point-to-point channels between components and can be reused in other designs. However, NoCs can increase design complexity, chip area and dissipated power, so it is necessary either to change the way they are used or to change the development paradigm. Thus, a NoC-based system is proposed in which applications are described as packets and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which finds another destination until all instructions have been executed. The objective, therefore, is to study the viability of developing this system, called the IPNoSys system. In this study, a cycle-accurate simulation tool was developed in SystemC to simulate the system, and applications were implemented in a packet description language also developed for this study. Through the simulation tool, several results were obtained that could be used to evaluate the system's performance. The methodology used to describe an application consists of transforming the high-level application into a data-flow graph that becomes one or more packets. This methodology was applied to three applications: a counter, a 2-D DCT and a floating-point addition. The counter was used to evaluate a deadlock solution and to execute a parallel application. The DCT was used for comparison with the STORM platform.
Finally, the floating-point addition aimed to evaluate the efficiency of a software routine that performs an instruction not implemented in hardware. The simulation results confirm the viability of developing the IPNoSys system. They showed that it is possible to execute applications described as packets, sequentially or in parallel, without interruptions caused by deadlock, and also that the execution time of IPNoSys is better than that of the STORM platform.
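A toy illustration of the packet-driven execution idea (grossly simplified: the instruction set, packet format and accumulator model below are invented for illustration): a packet carries its own instructions, and each router along the path executes the next pending one before forwarding. If instructions remain when the path ends, something like the spiral complement algorithm would have to extend the path, which the returned program counter makes visible.

```python
def packet_driven_execute(packet, path):
    """Execute one instruction of the packet at each router on the path.
    Returns (accumulator value, number of instructions executed)."""
    acc = packet["initial"]
    ops = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
    pc = 0  # index of the next pending instruction
    for router in path:
        if pc < len(packet["instructions"]):
            op, operand = packet["instructions"][pc]
            acc = ops[op](acc, operand)
            pc += 1
    return acc, pc
```

The point of the model is that computation progresses as a side effect of routing, so no conventional processor is needed at the endpoints.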
Abstract:
Vehicular networks ensure that the information received from any vehicle is promptly and correctly propagated to nearby vehicles, to prevent accidents. A crucial point is how to trust the information transmitted when the neighboring vehicles are rapidly changing and moving in and out of range. Current trust management schemes for vehicular networks establish trust by voting on the decision received from several nodes, which might not be required for practical scenarios; it might be enough to check the validity of incoming information. Due to the ephemeral nature of vehicular networks, reputation schemes for mobile ad hoc networks (MANETs) cannot be applied to vehicular ad hoc networks (VANETs). We point out several limitations of trust management schemes for VANETs. In particular, we identify the problems of information cascading and oversampling, which commonly arise in social networks. Oversampling is a situation in which a node observing two or more nodes takes both their opinions into consideration equally, without knowing that they might have influenced each other in their decision making. We show that simple voting for decision making leads to oversampling and gives incorrect results. We propose an algorithm to overcome this problem in VANETs. This is the first paper that applies the concepts of cascading and oversampling effects to ad hoc networks. © 2011 IEEE.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The focus of the activities of the Economic Commission for Latin America and the Caribbean/Caribbean Development and Cooperation Committee (ECLAC/CDCC) secretariat during the 2006-2007 biennium continued to be on assistance to member governments of the subregion with policy-making and development strategies, especially on issues relevant to the promotion of the economic, social, and environmental dimensions of development in the Caribbean. The Subregional Headquarters for the Caribbean worked closely with member countries of the CDCC in an effort to ensure the relevance of outputs which would inform policy options. This involved the strengthening of partnerships with both regional and subregional institutions and relevant agencies of the United Nations system working in the Caribbean. A major decision was taken to refocus the operational aspects of the secretariat to ensure that they were relevant to the development goals of its members. This involved the introduction of a thematic approach to the work of the office. One of the changes resulting from this was the restructuring and renaming of the Caribbean Documentation Centre. The Caribbean Knowledge Management Centre (CKMC), as it is now known, has changed its emphasis from organizing and disseminating documents, and is now a more proactive partner in the research undertaken by staff and other users of the service. The CKMC manages the ECLAC website, the public face of the organization. Newsletters and all other documents, including Information and Communications Technology (ICT) profiles of selected countries, prepared by the secretariat, are now available online at the ECLAC/CDCC website www.eclacpos.org. The Caribbean Knowledge Management Portal was launched at a meeting of information specialists in St. Vincent and the Grenadines in 2007. In addition to reaching a wider public, this measure was introduced as a means of reducing the cost of printing or disseminating publications.
In spite of the unusually high vacancy rate, at both the international and local levels, during the biennium, the subregional headquarters accomplished 98 per cent of the 119 outputs earmarked for the period. Using vacant positions to carry out the assignments was not an easy task, given the complexity in recruiting qualified and experienced persons for short periods. Nevertheless, consultancy services and short-term replacement staff greatly aided the delivery of these outputs. All the same, 35 work months remained unused during the biennium, leaving 301 work months to complete the outputs. In addition to the unoccupied positions, the work of the subprogramme was severely affected by the rising cost of regional and subregional travel which limited the ability of staff to network and interact with colleagues of member countries. This also hampered the outreach programme carried out mainly through ad hoc expert group meetings. In spite of these shortcomings, the period proved to be successful for the subprogramme as it engaged the attention of member countries in its work either through direct or indirect participation. Staff members completed 36 technical papers plus the reports of the meetings and workshops. A total of 523 persons, representing member countries, participated in the 18 intergovernmental and expert meetings convened by the secretariat in the 24-month period. In its effort to build technical capacity, the subprogramme convened 15 workshops/seminars which offered training for 446 persons.
Abstract:
The research presented in this dissertation describes the design of a routing protocol for Wireless Sensor Network (WSN) applications in smart cities with strong energy constraints and high node density. Based on a study of the main goals of data communication and a survey of the state of the art in routing protocols and technologies for WSNs, the proposal addresses requirements such as data throughput, delivery reliability and energy efficiency. The research presents the AODV (Ad hoc On-Demand Distance Vector) protocol in detail, as well as its relevance in the WSN context due to its popularity among commercial device platforms. In addition, protocols derived from AODV are presented, showing the absence of a robust proposal capable of meeting the identified requirements. The REL (Routing by Energy and Link Quality) protocol is the result of this research: an on-demand flat routing solution based on energy efficiency and link quality, designed to provide scalable routing capable of load balancing and of prolonging the network lifetime. REL was evaluated through simulation and testbed experiments, in order to validate the proposal both in a small-scale real environment and in high-density simulated scenarios. The results showed that REL considerably improves data delivery by choosing reliable transmission links that are less error-prone, with moderate energy consumption that prolongs the network lifetime by avoiding premature node saturation.
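A hypothetical sketch of a composite energy/link-quality route cost in the spirit of REL; the equal weights and the linear combination are assumptions for illustration, not the protocol's actual metric:

```python
def rel_route_cost(links, energy_weight=0.5, quality_weight=0.5):
    """Composite route cost: prefer routes whose nodes have high residual
    energy and whose links have high quality. Each link is a pair
    (residual_energy, link_quality), both normalized to [0, 1].
    Lower cost is better."""
    cost = 0.0
    for residual_energy, link_quality in links:
        cost += energy_weight * (1 - residual_energy)
        cost += quality_weight * (1 - link_quality)
    return cost

def best_route(routes):
    """Pick the candidate route with the lowest composite cost."""
    return min(routes, key=rel_route_cost)
```

Penalizing low residual energy spreads traffic away from nearly depleted nodes (load balancing), while penalizing poor link quality favors reliable, less error-prone links, the two effects the abstract credits for REL's improved delivery and network lifetime.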