874 results for "Distributed data access"
Abstract:
Environmental management is a complex task. The amount and heterogeneity of the data needed for an environmental decision-making tool is overwhelming without adequate database systems and innovative methodologies. As far as data management, data interaction and data processing are concerned, we propose the use of a Geographical Information System (GIS), whilst for decision making we suggest a Multi-Agent System (MAS) architecture. With the adoption of a GIS we hope to provide a complementary coexistence between heterogeneous data sets, a correct data structure, good storage capacity and a user-friendly interface. By choosing a distributed architecture such as a Multi-Agent System, where each agent is a semi-autonomous Expert System with the necessary skills to cooperate with the others in order to solve a given task, we hope to ensure dynamic problem decomposition and to achieve better performance compared with standard monolithic architectures. Finally, in view of the partial, imprecise and ever-changing character of the information available for decision making, Belief Revision capabilities are added to the system. Our aim is to present and discuss an intelligent environmental management system capable of suggesting the most appropriate land-use actions based on the existing spatial and non-spatial constraints.
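As an illustration of the cooperative decomposition described in this abstract, the following minimal Python sketch (hypothetical names and rules, not the authors' system) shows expert agents with revisable beliefs answering only the sub-tasks that match their skills.

```python
# Minimal sketch of the multi-agent idea described above (hypothetical, not the
# authors' implementation): each agent keeps a revisable set of beliefs about
# its area of expertise and only answers sub-tasks it has the skills for.

class Agent:
    def __init__(self, name, skills):
        self.name = name
        self.skills = set(skills)      # sub-tasks this expert can solve
        self.beliefs = {}              # (subtask, action) -> suitability

    def revise(self, proposition, value):
        """Belief revision (simplified): newer information overrides older."""
        self.beliefs[proposition] = value

    def solve(self, subtask, candidate_actions):
        """Suggest land-use actions for a sub-task this expert can handle."""
        if subtask not in self.skills:
            return None
        # Trivial rule: keep only actions no current belief marks as unsuitable.
        return [a for a in candidate_actions
                if self.beliefs.get((subtask, a), True)]


def coordinate(agents, task):
    """Dynamic problem decomposition: fan each sub-task out to a capable agent."""
    plan = {}
    for subtask, actions in task.items():
        for agent in agents:
            result = agent.solve(subtask, actions)
            if result is not None:
                plan[subtask] = (agent.name, result)
                break
    return plan


if __name__ == "__main__":
    soil = Agent("soil-expert", {"soil"})
    water = Agent("water-expert", {"water"})
    soil.revise(("soil", "intensive-agriculture"), False)   # new, contradicting data
    task = {"soil": ["forest", "intensive-agriculture"], "water": ["reservoir"]}
    print(coordinate([soil, water], task))
    # {'soil': ('soil-expert', ['forest']), 'water': ('water-expert', ['reservoir'])}
```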
Abstract:
Wireless Body Area Networks (WBANs) have emerged as a promising technology for medical and non-medical applications. WBANs consist of a number of miniaturized, portable, and autonomous sensor nodes that are used for long-term health monitoring of patients. These sensor nodes continuously collect patient information, which is used for ubiquitous health monitoring. In addition, WBANs may be used for managing catastrophic events and increasing the effectiveness and performance of rescue forces. The huge amount of data collected by WBAN nodes demands scalable, on-demand, powerful, and secure storage and processing infrastructure. Cloud computing is expected to play a significant role in achieving these objectives. The cloud computing environment links different devices, ranging from miniaturized sensor nodes to high-performance supercomputers, to deliver people-centric and context-centric services to individuals and industries. The possible integration of WBANs with cloud computing (WBAN-cloud) will introduce a viable hybrid platform that must be able to process the huge amount of data collected from multiple WBANs. This WBAN-cloud will enable users (including physicians and nurses) to globally access the processing and storage infrastructure at competitive costs. Because WBANs forward useful and life-critical information to the cloud, which may operate in distributed and hostile environments, novel security mechanisms are required to prevent malicious interactions with the storage infrastructure. Both the cloud providers and the users must take strong security measures to protect the storage infrastructure.
Abstract:
Thesis presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the subject of Electrical and Computer Engineering
Abstract:
The importance of wind power for energy and environmental policies has been growing in recent years. However, because of its random nature over time, wind generation cannot be reliably dispatched or perfectly forecasted, which becomes a challenge when integrating this production into power systems. In addition, wind energy has to cope with the diversity of production resulting from alternative wind power profiles located in different regions. In 2012, Portugal presented a cumulative installed capacity distributed over 223 wind farms [1]. In this work, circular data statistical methods are used to analyze and compare alternative spatial wind generation profiles. Variables indicating extreme situations are analyzed. The hour(s) of the day at which a farm's production attains its daily maximum is considered. This variable was converted into a circular variable, and the use of circular statistics makes it possible to identify the daily hour distribution for different wind production profiles. The methodology was applied to a real case, considering data from the Portuguese power system for the year 2012 with a 15-minute interval. Six geographical locations were considered, representing different wind generation profiles in the Portuguese system.
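The conversion of the hour of daily maximum into a circular variable can be summarised with standard circular statistics; the sketch below (illustrative data and function names, not the study's code) computes the circular mean hour and the mean resultant length with NumPy.

```python
# Hedged sketch of the circular-statistics step: the hour of daily maximum
# production is mapped onto the unit circle and summarised with NumPy.
import numpy as np

def circular_summary(hours_of_max, period=24.0):
    """Circular mean (in hours) and mean resultant length of hour-of-day data."""
    angles = 2 * np.pi * np.asarray(hours_of_max, dtype=float) / period
    C, S = np.cos(angles).mean(), np.sin(angles).mean()
    mean_angle = np.arctan2(S, C) % (2 * np.pi)
    mean_hour = period * mean_angle / (2 * np.pi)
    resultant_length = np.hypot(C, S)   # 1.0 = fully concentrated, 0.0 = dispersed
    return mean_hour, resultant_length

# Example: a wind farm whose daily peak clusters around the evening hours.
hours = [18.25, 19.0, 20.5, 17.75, 21.0, 19.5]   # illustrative 15-minute data
print(circular_summary(hours))
```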
Topics regarding access to European information institutions: the European Union, so close and yet so far
Abstract:
From the 1990s onwards, the Parliament, the Council and the European Commission adopted a new approach to the disclosure of their working papers. Legal instruments were created to regulate and allow fairly broad access to the internal working documents of these institutions. The European institutions also exploited the potential of Information and Communication Technologies, developing new instruments to register the documents produced and make them accessible to the public. The commitment to transparency sought to show a more credible European government and to reduce the democratic deficit. However, the analysis of data on access to EU institutions' documents shows that the general public is still far from direct contact with European bodies.
Abstract:
This paper presents the legal background that supports the dissemination of and access to documents from the European institutions, namely the Parliament, the Council and the European Commission. Currently, this legal framework is complemented by a set of Internet tools, which are analyzed with regard to the official document types and search options available. Some statistical data on access to European information published in the institutions' annual reports are also evaluated. The relationship between shadow and light in the transparency of access to administrative documents and the marketing concerns of political communication is underlined. The neo-institutional approach, the reputational concept in public organizations and the systemic perspective are used as theoretical background.
Abstract:
In recent years, vehicular cloud computing (VCC) has emerged as a new technology that is being used in a wide range of multimedia-based healthcare applications. In VCC, vehicles act as intelligent machines that can be used to collect and transfer healthcare data to local or global sites for storage and computation purposes, as vehicles have comparatively limited storage and computation power for handling multimedia files. However, due to dynamic changes in topology and the lack of centralized monitoring points, this information can be altered or misused. Such security breaches can result in disastrous consequences such as loss of life or financial fraud. Therefore, to address these issues, a learning automata-assisted distributive intrusion detection system based on clustering is designed. Although there are a number of applications where the proposed scheme can be applied, we have taken a multimedia-based healthcare application to illustrate it. In the proposed scheme, learning automata (LA) are assumed to be stationed on the vehicles; they take clustering decisions intelligently and select one of the members of the group as a cluster-head. The cluster-heads then assist in the efficient storage and dissemination of information through a cloud-based infrastructure. To secure the proposed scheme from malicious activities, a standard cryptographic technique is used, in which the automaton learns from the environment and takes adaptive decisions to identify any malicious activity in the network. A reward or penalty is given by the stochastic environment in which an automaton performs its actions, so that the automaton updates its action probability vector after receiving the reinforcement signal from the environment. The proposed scheme was evaluated using extensive simulations in ns-2 with SUMO. The results obtained indicate that the proposed scheme yields an improvement of 10% in the detection rate of malicious nodes when compared with existing schemes.
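The reward/penalty update of the action probability vector mentioned above corresponds to a standard learning automaton scheme; the sketch below shows a linear reward-penalty update in Python (illustrative parameters, not the paper's implementation).

```python
# Sketch (hypothetical, not the paper's code) of a linear reward-penalty (L_RP)
# learning automaton update of the action probability vector.
import numpy as np

def update(probs, action, reinforcement, a=0.1, b=0.05):
    """Update the action probability vector after the environment's signal.
    reinforcement: 1 = reward, 0 = penalty; a, b: learning rates."""
    p = np.asarray(probs, dtype=float).copy()
    r = len(p)
    if reinforcement == 1:              # reward: favour the chosen action
        p = p - a * p                   # p_j <- (1 - a) p_j for all j
        p[action] += a                  # p_i <- p_i + a (1 - p_i)
    else:                               # penalty: spread probability away from it
        p = (1 - b) * p
        p += b / (r - 1)
        p[action] -= b / (r - 1)        # p_i <- (1 - b) p_i
    return p / p.sum()                  # guard against rounding drift

p = np.array([0.25, 0.25, 0.25, 0.25])  # e.g. candidate cluster-head choices
p = update(p, action=2, reinforcement=1)  # action 2 was rewarded
print(p)
```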
Abstract:
Disaster management is one of the most relevant application fields of wireless sensor networks. In this application, the role of the sensor network usually consists of obtaining a representation or a model of a physical phenomenon spreading through the affected area. In this work we focus on forest firefighting operations, proposing three fully distributed ways of approximating the actual shape of the fire. In the simplest approach, a circular burnt area is assumed around each node that has detected the fire, and the union of these circles gives the overall fire shape. However, as this approach makes intensive use of the wireless sensor network's resources, we propose incorporating two in-network aggregation techniques, which do not require considering the complete set of fire detections. The first technique models the fire by means of a complex shape composed of multiple convex hulls representing different burning areas, while the second technique uses a set of arbitrary polygons. Performance evaluation with realistic fire models in computer simulations reveals that the method based on arbitrary polygons obtains an improvement of 20% in the accuracy of the fire shape approximation, reducing the in-network resource overhead to 10% in the best case.
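For intuition, the sketch below contrasts the simplest approximation with an aggregated one on a handful of detections, using the shapely library; the coordinates and sensing radius are illustrative assumptions, not data from the paper.

```python
# Illustrative sketch (assumed geometry, not the paper's implementation) of two
# fire-shape approximations: a union of circles around detecting nodes versus a
# single convex hull summarizing the same detections.
from shapely.geometry import MultiPoint, Point
from shapely.ops import unary_union

# (x, y) positions of sensor nodes that have detected the fire, plus a sensing radius.
detections = [(0.0, 0.0), (8.0, 2.0), (4.0, 7.0), (10.0, 9.0)]
radius = 3.0

# Approach 1: assume a burnt disc around every detection and merge the discs.
circles_shape = unary_union([Point(x, y).buffer(radius) for x, y in detections])

# Approach 2 (aggregated): a convex hull that can be built and merged in-network
# from far fewer messages than forwarding every individual detection.
hull_shape = MultiPoint(detections).convex_hull

print(f"union of circles area: {circles_shape.area:.1f}")
print(f"convex hull area:      {hull_shape.area:.1f}")
```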
Abstract:
In this manuscript we tackle the problem of semi-distributed user selection with distributed linear precoding for sum-rate maximization in multiuser multicell systems. A set of adjacent base stations (BSs) form a cluster in order to perform coordinated transmission to cell-edge users, and coordination is carried out through a central processing unit (CU). However, the message exchange between the BSs and the CU is limited to scheduling control signaling, and no user data or channel state information (CSI) exchange is allowed. In the considered multicell coordinated approach, each BS has its own set of cell-edge users and transmits only to one intended user, while interference to non-intended users at other BSs is suppressed by signal steering (precoding). We use two distributed linear precoding schemes, Distributed Zero Forcing (DZF) and Distributed Virtual Signal-to-Interference-plus-Noise Ratio (DVSINR). Considering multiple users per cell and the backhaul limitations, the BSs rely on local CSI to solve the user selection problem. First, we investigate how the signal-to-noise ratio (SNR) regime and the number of antennas at the BSs impact the effective channel gain (the magnitude of the channels after precoding) and its relationship with multiuser diversity. Considering that user selection must be based on the type of precoding implemented, we develop metrics of compatibility (estimations of the effective channel gains) that can be computed from local CSI at each BS and reported to the CU for scheduling decisions. Based on such metrics, we design user selection algorithms that can find a set of users that potentially maximizes the sum rate. Numerical results show the effectiveness of the proposed metrics and algorithms for different configurations of users and antennas at the base stations.
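As a rough illustration of the effective channel gain used as a compatibility metric, the following NumPy sketch (random channels, not the paper's setup) computes a zero-forcing precoder from local CSI and the resulting gain toward the intended user.

```python
# Minimal sketch (illustrative, not the paper's algorithm): under zero-forcing
# precoding, a BS steers toward its intended user while nulling the channels of
# non-intended users, and the effective channel gain is the residual magnitude.
import numpy as np

rng = np.random.default_rng(0)
num_antennas, num_users = 4, 3

# Local CSI at one BS: rows are user channels (intended user is row 0).
H = (rng.standard_normal((num_users, num_antennas))
     + 1j * rng.standard_normal((num_users, num_antennas))) / np.sqrt(2)

# Zero-forcing directions: columns of the pseudo-inverse null the other users.
W = np.linalg.pinv(H)
w = W[:, 0] / np.linalg.norm(W[:, 0])     # unit-norm precoder for user 0

effective_gain = np.abs(H[0] @ w) ** 2    # candidate metric reported to the CU
leakage = np.abs(H[1:] @ w) ** 2          # ~0 by construction
print(effective_gain, leakage)
```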
Abstract:
XXXIII Simpósio Brasileiro de Redes de Computadores e Sistemas Distribuídos (SBRC 2015), May 15-19, 2015, III Workshop de Comunicação em Sistemas Embarcados Críticos, Vitória, Brazil.
Abstract:
The technological evolution of recent decades in Information and Communication Technologies (ICT) has contributed to the proliferation of information sources and resource-sharing systems. The various social networks are a paradigmatic example of systems for sharing both information and resources (e.g. audiovisual content). This growing abundance of resources and sources increases the importance of systems capable of recommending personalized resources in a timely manner, based on the user's profile and context. The goal of this project is to share and recommend places, articles and videos according to the user's context, as well as to provide a richer playback experience for shared videos by simulating the conditions under which they were recorded. The system was inspired by two previously developed projects for sharing and recommending tourist places, articles and videos according to the user's location. The developed system consists of a distributed application composed of an Android client module, which includes the user interface and the direct consumption of external support services, and a server module that controls access to the central database and includes the recommendation service based on the user's context. Communication between the client and server modules uses a dedicated application-level protocol. The recommendations generated by the system are based on the user profile and contextual information (the user's position, the current date and time, and the user's current speed) and can be generated on demand or automatically, whenever points of interest highly relevant to the user are found. The recommended points of interest are presented using Google Maps, including opening hours, complementary articles and the immersive playback of related videos. This immersion takes into account the meteorological, temporal and spatial conditions at the time the video was recorded.
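A context-based recommendation step of the kind described above could look like the following Python sketch; the scoring rule, points of interest and field names are hypothetical, not the project's actual service.

```python
# Hedged sketch of a context-based recommendation step: points of interest are
# ranked by combining profile affinity with distance from the user's position.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def recommend(pois, user_profile, user_lat, user_lon, max_km=5.0):
    """Return nearby POIs ranked by interest overlap and proximity."""
    scored = []
    for poi in pois:
        d = haversine_km(user_lat, user_lon, poi["lat"], poi["lon"])
        if d > max_km:
            continue
        affinity = len(set(poi["tags"]) & set(user_profile["interests"]))
        scored.append((affinity - d / max_km, poi["name"]))
    return [name for _, name in sorted(scored, reverse=True)]

pois = [{"name": "Ribeira", "lat": 41.1408, "lon": -8.6110, "tags": ["history", "river"]},
        {"name": "Serralves", "lat": 41.1596, "lon": -8.6600, "tags": ["art", "garden"]}]
profile = {"interests": ["history", "food"]}
print(recommend(pois, profile, user_lat=41.1450, user_lon=-8.6100))
```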
Abstract:
The aim of this study was to undertake a comparative analysis of the practices and information behaviour of European information users who visit information units specialising in European information in Portugal and Spain. The study used a quantitative methodology based on a questionnaire containing closed questions and one open question. The questions covered the general sociological profile of the respondents and their use of European Document Centres, in addition to analysing aspects associated with information behaviour relating to European themes. The study therefore examined data on the preferred means and sources for accessing European information, types of documents and the subjects investigated most. The use of European databases and the Internet to access material on Europe was also studied, together with the reasons which users considered made it easy or difficult to access European information, and the aspects they valued most in accessing this information. The questionnaire was administered in European Document Centres in 2008 and 2010.
Abstract:
The vision of the Internet of Things (IoT) includes the large and dense deployment of interconnected smart sensing and monitoring devices. This vast deployment necessitates the collection and processing of a large volume of measurement data. However, collecting all the measured data from individual devices on such a scale may be impractical and time consuming. Moreover, processing these measurements requires complex algorithms to extract useful information. Thus, it becomes imperative to devise distributed information processing mechanisms that identify application-specific features in a timely manner and with low overhead. In this article, we present a feature extraction mechanism for dense networks that takes advantage of dominance-based medium access control (MAC) protocols to (i) efficiently obtain global extrema of the sensed quantities, (ii) extract local extrema, and (iii) detect the boundaries of events, by using simple transforms that nodes apply to their local data. We extend our results to a large dense network with multiple broadcast domains (MBD). We discuss and compare two approaches for addressing the challenges with MBD, and we show through extensive evaluations that our proposed distributed MBD approach is fast and efficient at retrieving the most valuable measurements, independent of the number of sensor nodes in the network.
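The global-extrema retrieval exploits the fact that, under a dominance-based MAC, the highest priority value on the medium wins the arbitration; the short simulation below (assumed behaviour, not the article's protocol) shows how a single arbitration round yields the network-wide maximum.

```python
# Simplified simulation of a dominance-based MAC round: nodes contend bit by bit
# with the binary encoding of their local reading, and only the node holding the
# global maximum survives to the end of the arbitration.
def dominance_round(readings, bits=8):
    """Return the winning (maximum) reading after bitwise dominance arbitration."""
    contenders = list(range(len(readings)))
    for bit in range(bits - 1, -1, -1):            # most significant bit first
        dominant = [i for i in contenders if (readings[i] >> bit) & 1]
        if dominant:                               # a '1' on the medium dominates
            contenders = dominant                  # nodes that sent '0' back off
    return readings[contenders[0]]

# Example: sensed quantities (scaled to integers) from a dense deployment.
readings = [131, 97, 214, 180, 214, 45]
print(dominance_round(readings))   # 214, obtained in one arbitration round
```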
Abstract:
The growth of information systems and their massive use has created a new reality in the access to remote experiments that are geographically distributed. In recent years, the topic of remote laboratories has appeared in the most diverse fields, such as education or industrial control and monitoring systems. Since access to the laboratories takes place over a permissive medium such as the Internet, the information may be at the mercy of any attacker. It is therefore necessary to guarantee secure access, in order to prevent the tampering of the values obtained as well as unauthorized access. The security mechanisms adopted must take into account the need for authentication and authorization, which are critical points with respect to security, since these laboratories may be controlling sensitive and expensive equipment and may, in certain cases, even compromise the control and monitoring of industrial systems. The objective of this work was the analysis of network security, and a study was carried out on the various security concepts and mechanisms needed to guarantee secure communications between remote laboratories. It resulted in the three secure communication solutions presented for geographically distributed remote laboratories, using the IPSec, OpenVPN and PPTP technologies. To minimize costs, the entire implementation was based on open-source software and on the use of a low-cost computer. Regarding the creation of the VPNs, these were configured so as to obtain the intended results in the creation of a secure connection for remote laboratories. pfSense proved to be the right choice, since it natively supports all of the technologies that were studied and implemented, without requiring very expensive physical resources, and allows the use of open-source technologies without compromising the security of the solutions that support secure communications for the remote laboratories.
Abstract:
Fasciolosis is a disease of importance for both veterinary and public health. For the first time, georeferenced prevalence data of Fasciola hepatica in bovines were collected and mapped for the Brazilian territory and data availability was discussed. Bovine fasciolosis in Brazil is monitored on a Federal, State and Municipal level, and to improve monitoring it is essential to combine the data collected on these three levels into one dataset. Data were collected for 1032 municipalities where livers were condemned by the Federal Inspection Service (MAPA/SIF) because of the presence of F. hepatica. The information was distributed over 11 states: Espírito Santo, Goiás, Minas Gerais, Mato Grosso do Sul, Mato Grosso, Pará, Paraná, Rio de Janeiro, Rio Grande do Sul, Santa Catarina and São Paulo. The highest prevalence of fasciolosis was observed in the southern states, with disease clusters along the coast of Paraná and Santa Catarina and in Rio Grande do Sul. Also, temporal variation of the prevalence was observed. The observed prevalence and the kriged prevalence maps presented in this paper can assist both animal and human health workers in estimating the risk of infection in their state or municipality.
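The kriged prevalence surface mentioned above can be reproduced in spirit with ordinary kriging; the sketch below uses the pykrige package with made-up coordinates and prevalence values purely as an illustration, not the authors' data or tooling.

```python
# Hedged sketch of producing a kriged prevalence surface; pykrige and all values
# here are illustrative assumptions, not the study's dataset or software.
import numpy as np
from pykrige.ok import OrdinaryKriging

# Illustrative municipality centroids (lon, lat) and observed prevalence (%).
lon = np.array([-51.2, -49.3, -48.5, -53.8, -50.9])
lat = np.array([-30.0, -25.4, -27.6, -29.4, -28.3])
prev = np.array([12.0, 8.5, 15.2, 20.1, 17.8])

# Ordinary kriging with a spherical variogram, evaluated on a coarse grid.
ok = OrdinaryKriging(lon, lat, prev, variogram_model="spherical")
grid_lon = np.linspace(-54.0, -48.0, 25)
grid_lat = np.linspace(-31.0, -25.0, 25)
surface, variance = ok.execute("grid", grid_lon, grid_lat)
print(surface.shape)   # 25 x 25 grid of interpolated prevalence values
```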