901 results for Data dissemination and sharing
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Special Education, specialization in Cognition and Multiple Disabilities
Abstract:
Estuaries are perhaps the most threatened environments in the coastal fringe; the coincidence of high natural value and attractiveness for human use has led to conflicts between conservation and development. These conflicts occur in the Sado Estuary, which lies near the industrialised zone of the Setúbal Peninsula while, at the same time, a great part of the estuary is classified as a Natural Reserve due to its high biodiversity. These facts led to the need to implement a model of environmental management and quality assessment, based on methodologies that enable assessing the quality of the Sado Estuary and evaluating the human pressures on it. These methodologies rely on the indicators that best depict the state of the environment, rather than on everything that could be measured or analysed. Sediments have long been considered an important temporary source of some compounds, a sink for other types of material, and an interface where a great diversity of biogeochemical transformations occur. For all these reasons they are of great importance in the formulation of coastal management systems. Many authors have used sediments to monitor aquatic contamination, with clear advantages over traditional water-column sampling. The main objective of this thesis was to develop an estuarine environmental management framework, applied to the Sado Estuary using the DPSIR model (EMMSado) and covering data collection, data processing and data analysis. The supporting infrastructure of EMMSado was a set of spatially contiguous and homogeneous regions of sediment structure (management units). The environmental quality of the estuary was assessed through sediment quality assessment and integrated, at a preliminary stage, with the human pressure for development.
Besides the advantages explained earlier, basing the study of the estuary's quality mainly on indicators and indices of the sediment compartment also makes the methodology easier, faster, and less demanding of human and financial resources. These are essential factors for the efficient environmental management of coastal areas. Data management, visualization, processing and analysis were achieved through the combined use of indicators and indices, sampling optimization techniques, Geographical Information Systems, remote sensing, statistics for spatial data, Global Positioning Systems and best expert judgment. As a global conclusion, of the nineteen management units delineated and analyzed, three showed no ecological risk (18.5% of the study area). The areas of most concern (5.6% of the study area) are located in the North Channel and are under strong human pressure, mainly due to industrial activities. These areas also have low hydrodynamics and are thus associated with high levels of deposition. In particular, the areas near the Lisnave and Eurominas industries can also accumulate contamination coming from the Águas de Moura Channel, since particles coming from that channel can settle in that area due to the residual flow. In these areas the contaminants of concern, among those analyzed, are heavy metals and metalloids (Cd, Cu, Zn and As exceeded the PEL guidelines) and the pesticides BHC isomers, heptachlor, isodrin, DDT and its metabolites, endosulfan and endrin. In the remaining management units (76% of the study area) there is a moderate potential for the occurrence of adverse ecological effects, and in some of these areas no stress agents could be identified. This emphasizes the need for further research, since unmeasured chemicals may be causing or contributing to these adverse effects. Special attention must be paid to the units with a moderate potential for adverse ecological effects that are located inside the natural reserve.
Non-point source pollution from agriculture and aquaculture activities also appears to contribute a significant pollution load to the estuary, entering through the Águas de Moura Channel. This pressure is reflected in the moderate potential for ecological risk in the areas near the entrance of this channel. Pressures may also come from the Alcácer Channel, although these were not quantified in this study. The management framework presented here, including all its methodological tools, may be applied and tested in other estuarine ecosystems, which will also allow comparisons between estuarine ecosystems in other parts of the globe.
Abstract:
Managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). The physical parameters of the data center (such as power, temperature, pressure and humidity) are tightly coupled with computation, even more so in upcoming data centers, where the location of workloads can vary substantially, for example because workloads are moved within a cloud infrastructure hosted in the data center. In this paper, we describe a data collection and distribution architecture that enables gathering the physical parameters of a large data center at a very high temporal and spatial resolution of the sensor measurements. We think this is an important characteristic for enabling more accurate heat-flow models of the data center and, with them, opportunities to optimize energy consumption. Having a high-resolution picture of the data center conditions also enables minimizing local hotspots, performing more accurate predictive maintenance (pending failures in cooling and other infrastructure equipment can be detected more promptly) and billing more accurately. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
Abstract:
With the advent of wearable sensing and mobile technologies, biosignals have seen a growing number of application areas, leading to the collection of large volumes of data. One of the difficulties in dealing with these data sets, and in developing automated machine learning systems that use them as input, is the lack of reliable ground-truth information. In this paper we present a new web-based platform for the visualization, retrieval and annotation of biosignals by non-technical users, aimed at improving the process of ground-truth collection for biomedical applications. Moreover, a novel extendable and scalable data representation model and persistency framework is presented. The results of an experimental evaluation with prospective users have further confirmed the potential of the presented framework.
Abstract:
In global scientific experiments with collaborative scenarios involving multinational teams, there are major challenges related to data access; in particular, data movements to other regions or Clouds are precluded by constraints on latency and cost, data privacy and data ownership. Furthermore, each site processes local data sets using specialized algorithms and produces intermediate results that are useful as inputs to applications running on remote sites. This paper shows how to model such collaborative scenarios as a scientific workflow implemented with AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic), a decentralized framework offering a feasible solution to run the workflow activities on distributed data centers in different regions without the need for large data movements. The AWARD workflow activities are independently monitored and dynamically reconfigured and steered by different users, namely by hot-swapping the algorithms to enhance the computation results or by changing the workflow structure to support feedback dependencies, where an activity receives feedback output from a successor activity. A real implementation of one practical scenario and its execution on multiple data centers of the Amazon Cloud is presented, including experimental results with steering by multiple users.
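The hot-swapping behaviour described in this abstract can be pictured with a minimal sketch. The class and method names below are hypothetical illustrations of the idea, not part of the AWARD API:

```python
class Activity:
    """A workflow activity whose processing algorithm can be
    hot-swapped at runtime without stopping the workflow."""

    def __init__(self, algorithm):
        self.algorithm = algorithm

    def hot_swap(self, new_algorithm):
        # Steering operation: a user replaces the algorithm
        # between iterations of the running activity.
        self.algorithm = new_algorithm

    def process(self, data):
        # Apply whatever algorithm is currently installed.
        return self.algorithm(data)
```

In this simplified picture, a user could steer an activity from a coarse algorithm to a refined one between iterations, while the surrounding workflow keeps running.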
Abstract:
Although the computational power of mobile devices has been increasing, it is still not enough for some classes of applications. At present, these applications delegate the computing burden to servers located on the Internet. This model assumes always-on Internet connectivity and implies non-negligible latency. This thesis addresses the challenges of applying the concept of a mobile collaborative computing environment to wireless networks, and the contributions made in that direction. The goal is to define a reference architecture for high-performance mobile applications. Current work is focused on efficient data dissemination in a highly transitive environment, suitable for many mobile applications, and on the reputation and incentive system available in this mobile collaborative computing environment. To this end, we are improving our already published reputation/incentive algorithm with knowledge of the usage patterns of the eduroam wireless network in the Lisbon area.
Abstract:
ABSTRACT OBJECTIVE To describe methods and challenges faced in the health impact assessment of vaccination programs, focusing on the pneumococcal conjugate and rotavirus vaccines in Latin America and the Caribbean. METHODS For this narrative review, we searched for the terms "rotavirus", "pneumococcal", "conjugate vaccine", "vaccination", "program", and "impact" in the databases Medline and LILACS. The search was extended to the grey literature in Google Scholar. No limits were defined for publication year. Original articles on the health impact assessment of pneumococcal and rotavirus vaccination programs in Latin America and the Caribbean in English, Spanish or Portuguese were included. RESULTS We identified 207 articles. After removing duplicates and assessing eligibility, we reviewed 33 studies, 25 focusing on rotavirus and eight on pneumococcal vaccination programs. The most frequent studies were ecological, with time series analysis or comparing pre- and post-vaccination periods. The main data sources were: health information systems; population-, sentinel- or laboratory-based surveillance systems; statistics reports; and medical records from one or a few health care services. Few studies used primary data. Hospitalization and death were the main outcomes assessed. CONCLUSIONS Over the last years, a significant number of health impact assessments of pneumococcal and rotavirus vaccination programs have been conducted in Latin America and the Caribbean. These studies were carried out a few years after the programs were implemented, meet the basic methodological requirements and suggest a positive health impact. Future assessments should consider the methodological issues and challenges that arose in these first studies conducted in the region.
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
Data analytic applications are characterized by large data sets that are subject to a series of processing phases. Some of these phases are executed sequentially, but others can be executed concurrently or in parallel on clusters, grids or clouds. The MapReduce programming model has been applied to process large data sets in cluster and cloud environments. To develop an application using MapReduce, there is a need to install/configure/access specific frameworks such as Apache Hadoop or Elastic MapReduce in the Amazon Cloud. It would be desirable to have more flexibility in adjusting such configurations to the application characteristics. Furthermore, the composition of the multiple phases of a data analytic application requires the specification of all the phases and their orchestration. The original MapReduce model and environment lack flexible support for such configuration and composition. Recognizing that scientific workflows have been successfully applied to modeling complex applications, this paper describes our experiments on implementing MapReduce as subworkflows in the AWARD framework (Autonomic Workflow Activities Reconfigurable and Dynamic). A text mining data analytic application is modeled as a complex workflow with multiple phases, where individual workflow nodes support MapReduce computations. As in typical MapReduce environments, the end user only needs to define the application algorithms for input data processing and for the map and reduce functions. In the paper we present experimental results obtained when using the AWARD framework to execute MapReduce workflows deployed over multiple Amazon EC2 (Elastic Compute Cloud) instances.
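As the abstract notes, the end user only supplies the map and reduce functions. A minimal word-count sketch of that contract, in the spirit of the text mining application, might look as follows; the sequential driver below is a stand-in for the framework's shuffle phase, not AWARD code:

```python
from collections import defaultdict

def map_fn(document):
    # User-defined map: emit a (word, 1) pair for each word.
    for word in document.lower().split():
        yield word, 1

def reduce_fn(word, counts):
    # User-defined reduce: sum the partial counts for a word.
    return word, sum(counts)

def run_mapreduce(documents):
    # Toy sequential driver: group map outputs by key (the
    # "shuffle" step), then apply the reduce function per key.
    groups = defaultdict(list)
    for doc in documents:
        for key, value in map_fn(doc):
            groups[key].append(value)
    return dict(reduce_fn(k, v) for k, v in groups.items())
```

In a real deployment, the map and reduce calls would be distributed across workers; only the two user functions would stay the same.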
Abstract:
Portugal joined the effort to create the EPOS infrastructure in 2008, and it became immediately apparent that a national network of Earth Sciences infrastructures was required to participate in the initiative. At that time, FCT was promoting the creation of a national infrastructure called RNG - Rede Nacional de Geofísica (National Geophysics Network). A memorandum of understanding had been agreed upon, and it therefore seemed straightforward to use RNG (enlarged to include relevant participants that were not RNG members) as the Portuguese partner to EPOS-PP. However, at the time of signature of the EPOS-PP contract with the European Commission (November 2010), RNG had not yet gained formal identity, and IST (one of the participants) signed the grant agreement on behalf of the Portuguese consortium. During 2011 no progress was made towards the formal creation of RNG, and the composition of the network, based on proposals submitted to a call issued in 2002, had by then become obsolete. In February 2012, the EPOS national contact point was mandated by the representatives of the participating national infrastructures to request from FCT the recognition of a new consortium, C3G (Collaboratory for Geology, Geodesy and Geophysics), as the Portuguese partner to EPOS-PP. This request was supported by formal letters from the following institutions: LNEG (Laboratório Nacional de Energia e Geologia, the National Geological Survey); IGP (Instituto Geográfico Português, the National Geographic Institute); IDL (Instituto Dom Luiz, an Associate Laboratory); CGE (Centro de Geofísica de Évora); FCTUC (Faculdade de Ciências e Tecnologia da Universidade de Coimbra); Instituto Superior de Engenharia de Lisboa; Instituto Superior Técnico; and Universidade da Beira Interior.
While the Instituto de Meteorologia (Meteorological Institute, in charge of the national seismographic network) actively supports the national participation in EPOS, a letter of support was not feasible in view of the organic changes underway at the time. C3G aims at the integration and coordination, at the national level, of existing Earth Sciences infrastructures, namely: seismic and geodetic networks (IM, IST, IDL, CGE); rock physics laboratories (ISEL); geophysical laboratories dedicated to natural resources and environmental studies; geological and geophysical data repositories; and facilities for data storage and computing resources. The C3G (Collaboratory for Geology, Geodesy and Geophysics) will be coordinated by Universidade da Beira Interior, whose Department of Informatics will host the C3G infrastructure.
Abstract:
This dissertation addresses the problem of building a data warehouse for AdClick, a company operating in the digital marketing field. Digital marketing is a form of marketing that uses digital communication channels for the same purpose as the traditional method: promoting goods, businesses and services and attracting new customers. There are several digital marketing strategies for achieving these goals, most notably organic traffic and paid traffic. Organic traffic is characterized by marketing actions that involve no costs for promotion and/or acquisition of potential customers, whereas paid traffic requires investment in campaigns capable of attracting new customers. The dissertation begins with a state-of-the-art review of business intelligence and data warehousing, presenting their main advantages for companies. Business intelligence systems are necessary because companies today hold large volumes of data rich in information, which can only be properly exploited using the capabilities of these systems. Accordingly, the first step in developing a business intelligence system is to concentrate all the data in a single, integrated system capable of supporting decision making. This is where the data warehouse comes in, as the single system ideally suited to this kind of requirement. In this dissertation, the data sources that will feed the data warehouse were surveyed, and the contextualization of the company's existing business processes was initiated. The construction of the data warehouse then began: the dimensions and fact tables were created, the processes for extracting and loading data into the data warehouse were defined, and the various views were created.
Regarding the impact of this dissertation, the main highlights are the business advantages that the partner company gains from the implementation of the data warehouse and of the ETL processes that load all the information sources. These advantages include the centralization of information, greater flexibility for managers in how they access information, and the treatment of data so that information can be extracted from it.
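The kind of star schema, load step and views described in this abstract can be sketched with Python's built-in sqlite3 module. The table, column and campaign names below are hypothetical and are not taken from the AdClick warehouse:

```python
import sqlite3

# In-memory warehouse with one dimension and one fact table:
# a simplified star schema (a real ETL process would read
# from the company's source systems instead of literals).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_campaign ("
             "campaign_id INTEGER PRIMARY KEY, name TEXT, traffic_type TEXT)")
conn.execute("CREATE TABLE fact_visits ("
             "campaign_id INTEGER, day TEXT, visits INTEGER)")

# Load step: insert dimension rows, then the facts referencing them.
conn.executemany("INSERT INTO dim_campaign VALUES (?, ?, ?)",
                 [(1, "newsletter", "organic"), (2, "search-ads", "paid")])
conn.executemany("INSERT INTO fact_visits VALUES (?, ?, ?)",
                 [(1, "2024-01-01", 120),
                  (2, "2024-01-01", 300),
                  (2, "2024-01-02", 250)])

# One of the "views" managers could query: visits per traffic type.
conn.execute("""CREATE VIEW visits_by_type AS
    SELECT d.traffic_type, SUM(f.visits) AS total
    FROM fact_visits f JOIN dim_campaign d USING (campaign_id)
    GROUP BY d.traffic_type""")
rows = dict(conn.execute("SELECT traffic_type, total FROM visits_by_type"))
```

The view illustrates the centralization benefit mentioned above: once facts and dimensions live in one schema, aggregations across sources become single queries.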
Abstract:
This paper appears in the International Journal of Projectics, Vol. 4(1), pp. 39-49.
Abstract:
The goal of the work presented in this paper is to provide mobile platforms on our campus with a GPS-based data service capable of supporting precise outdoor navigation. This can be achieved by providing campus-wide access to real-time Differential GPS (DGPS) data. To this end, we designed and implemented a three-tier distributed system that provides Internet data links between remote DGPS sources and the campus, together with a campus-wide DGPS data dissemination service. The Internet data link service is a two-tier client/server system in which the server side is connected to the DGPS station and the client side is located at the campus. The campus-wide DGPS data provider disseminates the DGPS data received at the campus via the campus Intranet and via a wireless data link. The wireless broadcast is intended for portable receivers equipped with a DGPS wireless interface, while the Intranet link is provided for receivers with a DGPS serial interface. The application is expected to provide adequate support for accurate outdoor campus navigation tasks.
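The dissemination tier described above is essentially a fan-out from the Internet data-link client to the registered campus receivers. The sketch below is illustrative only: the class and method names are hypothetical, and a real implementation would deliver corrections over sockets, serial lines and wireless broadcast rather than in-process callbacks:

```python
class DGPSDisseminator:
    """Fans DGPS correction messages received from a remote source
    out to every registered receiver (Intranet or wireless client)."""

    def __init__(self):
        self.receivers = []

    def register(self, deliver):
        # A receiver subscribes by providing a delivery callback.
        self.receivers.append(deliver)

    def on_correction(self, message):
        # Invoked by the data-link client when a correction frame
        # arrives from the DGPS station; rebroadcast it to everyone.
        for deliver in self.receivers:
            deliver(message)
```

The same pattern covers both delivery paths in the paper: the Intranet link and the wireless broadcast would simply be two kinds of registered receivers.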
Abstract:
Work presented within the scope of the Master's programme in Engenharia Informática (Computer Science and Engineering), as a partial requirement for obtaining the degree of Master in Engenharia Informática.
Abstract:
Dimensionality reduction plays a crucial role in many hyperspectral data processing and analysis algorithms. This paper proposes a new mean-squared-error-based approach to determine the signal subspace in hyperspectral imagery. The method first estimates the signal and noise correlation matrices; it then selects the subset of eigenvalues that best represents the signal subspace in the least-squares sense. The effectiveness of the proposed method is illustrated using simulated and real hyperspectral images.
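The selection step can be pictured with a deliberately simplified sketch. Assume the per-eigendirection signal powers and noise powers have already been estimated from the correlation matrices; a direction is then kept when including it lowers the projection error, which here is reduced to the condition "signal power exceeds noise power". This is a stand-in for the paper's actual least-squares criterion, and the numbers in the usage example are made up:

```python
def signal_subspace_dim(signal_powers, noise_powers):
    """Pick the signal-subspace dimension by keeping every
    eigen-direction whose estimated signal power exceeds its
    estimated noise power (a simplified least-squares rule)."""
    return sum(1 for p, n in zip(signal_powers, noise_powers) if p > n)

# Illustrative values: two strong signal directions, two noise-dominated.
dim = signal_subspace_dim([10.0, 4.0, 0.5, 0.1], [0.3, 0.3, 0.9, 0.4])
```

The point of the sketch is the trade-off being optimized: keeping a noise-dominated direction adds more noise than signal, while dropping a signal-dominated direction discards more signal than it saves in noise.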