816 results for Cloud OS, cloud operating system, cloud computing
Abstract:
The use of 3D data in mobile robotics applications provides valuable information about the robot's environment, but the huge amount of 3D information is usually unmanageable given the robot's storage and computing capabilities. Data compression is necessary to store and manage this information while preserving as much of it as possible. In this paper, we propose a 3D lossy compression system based on plane extraction, which represents the points of each scene plane as a Delaunay triangulation and a set of points/area information. The compression system can be customized to achieve different compression or accuracy ratios. It also supports a color segmentation stage to preserve original scene color information and provides a realistic scene reconstruction. The design of the method provides a fast scene reconstruction useful for further visualization or processing tasks.
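As a sketch of the plane-based representation described (plane parameters plus a Delaunay triangulation of the in-plane points), the following illustrative Python fragment fits a plane by SVD, projects the points into plane coordinates, and triangulates them; the decimation and color-segmentation stages of the actual system are omitted, and all data here are synthetic.

```python
# Minimal sketch: represent a roughly planar 3D patch as plane parameters
# plus a 2D Delaunay triangulation of the projected points.
import numpy as np
from scipy.spatial import Delaunay

def compress_plane(points):
    """Fit a plane via SVD and triangulate the points in plane coordinates."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    basis = vt[:2]                       # two in-plane basis vectors (2x3)
    uv = (points - centroid) @ basis.T   # 2D coordinates on the plane
    tri = Delaunay(uv)
    return centroid, basis, uv, tri.simplices

def reconstruct_plane(centroid, basis, uv):
    """Map the stored 2D vertices back to 3D for visualization."""
    return centroid + uv @ basis

# Toy usage: a noisy planar patch.
rng = np.random.default_rng(0)
pts = np.c_[rng.uniform(0, 1, 200), rng.uniform(0, 1, 200),
            rng.normal(0, 0.005, 200)]
c, B, uv, faces = compress_plane(pts)
recon = reconstruct_plane(c, B, uv)
print("max reconstruction error:", np.abs(recon - pts).max())
```

The reconstruction error equals the out-of-plane residual, which is the accuracy knob such a lossy scheme exposes.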
Abstract:
Recent technological advances have paved the way for developing and offering advanced services for stakeholders in the agricultural sector. A paradigm shift is underway from proprietary, monolithic tools to Internet-based, cloud-hosted, open systems that will enable more effective collaboration between stakeholders. This new paradigm includes the technological support of application developers to create specialized services that will seamlessly interoperate, thus creating a sophisticated and customisable working environment for the end users. We present the implementation of an open architecture that instantiates such an approach, based on a set of domain-independent software tools called "generic enablers" that have been developed in the context of the FI-WARE project. The implementation is used to validate a number of innovative concepts for the agricultural sector, such as the notion of a services marketplace and the system's adaptation to network failures. During the design and implementation phases, the system was evaluated by end users, offering us valuable feedback. The results of the evaluation process validate the acceptance of such a system and the need of farmers for access to sophisticated services at affordable prices. A summary of this evaluation process is also presented in this paper. © 2013 Elsevier B.V.
Abstract:
This paper presents the Accurate Google Cloud Simulator (AGOCS) – a novel high-fidelity cloud workload simulator based on parsing real workload traces, which can be conveniently used on a desktop machine for day-to-day research. Our simulation is based on real-world workload traces from a Google cluster with 12.5K nodes over a period of a calendar month. The framework is able to reveal very precise and detailed parameters of the executed jobs, tasks and nodes, as well as to provide actual resource usage statistics. The system has been implemented in the Scala language with a focus on parallel execution and an easy-to-extend design. The paper presents the detailed structure of the AGOCS framework and discusses our main design decisions, whilst also suggesting alternative and possibly performance-enhancing future approaches. The framework is available via an open-source GitHub repository.
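As an illustration of the trace-parsing approach (AGOCS itself is written in Scala), the sketch below tallies task event types across sharded trace files in parallel; the file paths are placeholders, and the event-type column index is an assumption based on the publicly documented 2011 Google cluster-trace layout.

```python
# Illustrative sketch: parse sharded cluster-trace task-event CSVs in
# parallel on a desktop machine and tally the event types.
import csv
import glob
import gzip
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def count_events(path):
    """Tally task event types (e.g. 0=submit, 1=schedule) in one shard."""
    counts = Counter()
    with gzip.open(path, "rt", newline="") as f:
        for row in csv.reader(f):
            counts[row[5]] += 1   # column 5: event type (assumed layout)
    return counts

if __name__ == "__main__":
    shards = sorted(glob.glob("task_events/part-*.csv.gz"))  # placeholder path
    total = Counter()
    with ProcessPoolExecutor() as pool:
        for partial in pool.map(count_events, shards):
            total.update(partial)
    print(total)
```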
Abstract:
Current trends in broadband mobile networks point towards placing different capabilities at the edge of the mobile network in a centralised way. On the one hand, the split of the eNB between baseband processing units and remote radio heads makes it possible to process some of the protocols on centralised premises, likely with virtualised resources. On the other hand, mobile edge computing makes use of processing and storage capabilities close to the air interface in order to deploy optimised services with minimal delay. The confluence of both trends is a hot topic in the definition of future 5G networks. The full centralisation of both technologies in cloud data centres imposes stringent requirements on the fronthaul connections in terms of throughput and latency; cells with limited network access would therefore be unable to offer these types of services. This paper proposes a solution for these cases, based on placing processing and storage capabilities close to the remote units, which is especially well suited to the deployment of clusters of small cells. The proposed cloud-enabled small cells include a highly efficient microserver with a limited set of virtualised resources offered to the cluster of small cells. As a result, a light data centre is created and shared for deploying centralised eNB and mobile edge computing functionalities. The paper covers the proposed architecture, with special focus on the integration of both aspects, and possible application scenarios.
Abstract:
In a general-purpose cloud system, efficiencies are yet to be gained from supporting diverse applications and their requirements within the storage system of a private cloud. Supporting such diverse requirements poses a significant challenge in a storage system that allows fine-grained configuration of a variety of parameters. This paper uses the Ceph distributed file system, and in particular its global parameters, to show how a single changed parameter can affect performance across a range of access patterns when tested with an OpenStack cloud system.
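A minimal sketch of the kind of experiment described, assuming admin access to a Ceph cluster: set one global option to each candidate value and benchmark a pool for each. The option name (`osd_op_queue`) and pool name below are placeholders to be adapted to the parameter under study, not the parameter used in the paper.

```python
# Sweep one Ceph global parameter and benchmark a pool for each value.
import subprocess

PARAM = "osd_op_queue"                   # placeholder global option
VALUES = ["wpq", "mclock_scheduler"]     # candidate values to compare

for value in VALUES:
    subprocess.run(["ceph", "config", "set", "global", PARAM, value],
                   check=True)
    # 30-second sequential-write benchmark against a test pool
    result = subprocess.run(
        ["rados", "-p", "testpool", "bench", "30", "write", "--no-cleanup"],
        capture_output=True, text=True, check=True)
    print(f"{PARAM}={value}\n{result.stdout}")
```

Repeating the sweep with read and random access patterns (`seq`, `rand`) would reproduce the paper's comparison across access patterns.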
Abstract:
Clouds are important in weather prediction, climate studies, and aviation safety. Important parameters include cloud height, type, and cover percentage. In this paper, recent improvements in the development of a low-cost cloud-height measurement setup are described. It is based on stereo vision with consumer digital cameras. The positioning of the cameras is calibrated using the positions of stars in the night sky. An experimental uncertainty analysis of the calibration parameters is performed. Cloud-height measurement results are presented and compared with LIDAR measurements.
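For a rectified stereo pair, range follows from the usual triangulation relation Z = f B / d (focal length f in pixels, baseline B, disparity d in pixels); the sketch below also propagates a disparity error into a range uncertainty. All numeric values are invented for illustration and are not from the paper.

```python
# Back-of-the-envelope stereo triangulation: Z = f * B / d.
def stereo_range(focal_px, baseline_m, disparity_px):
    """Range to a feature from its disparity in a rectified stereo pair."""
    return focal_px * baseline_m / disparity_px

z = stereo_range(focal_px=3000, baseline_m=100, disparity_px=150)
print(f"cloud range ≈ {z:.0f} m")           # 2000 m

dz = z * 0.2 / 150                          # effect of a ±0.2 px disparity error
print(f"range uncertainty ≈ ±{dz:.1f} m")   # ±2.7 m
```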
Abstract:
Recent technological advancements have played a key role in seamlessly integrating cloud, edge, and Internet of Things (IoT) technologies, giving rise to the Cloud-to-Thing Continuum paradigm. This cloud model connects many heterogeneous resources that generate a large amount of data and collaborate to deliver next-generation services. While it has the potential to reshape several application domains, the number of connected entities remarkably broadens the security attack surface. One of the main problems is the lack of security measures that adapt to the dynamic and evolving conditions of the Cloud-to-Thing Continuum. To address this challenge, this dissertation proposes novel adaptable security mechanisms. Adaptable security is the capability of security controls, systems, and protocols to dynamically adjust to changing conditions and scenarios. However, since the design and development of novel security mechanisms can be explored from different perspectives and levels, we focus on threat modeling and access control. The contributions of the thesis can be summarized as follows. First, we introduce a model-based methodology that secures the design of edge and cyber-physical systems. This solution identifies threats, security controls, and moving target defense techniques based on system features. Then, we focus on access control management. Since access control policies are subject to modification, we evaluate how they can be efficiently shared among distributed areas, highlighting the effectiveness of distributed ledger technologies. Furthermore, we propose a risk-based authorization middleware, adjusting permissions based on real-time data, and a federated learning framework that enhances trustworthiness by weighting each client's contributions according to the quality of their partial models. Finally, since authorization revocation is another critical concern, we present an efficient revocation scheme for verifiable credentials in IoT networks, featuring decentralization and demanding minimal storage and computing capabilities. All the mechanisms have been evaluated under different conditions, demonstrating their adaptability to the Cloud-to-Thing Continuum landscape.
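As an illustration of the risk-based authorization idea described above, here is a minimal sketch in which a permission is granted only while a score computed from live context signals stays below a threshold; the signal names, weights, and threshold are invented for this example and are not taken from the dissertation.

```python
# Toy risk-based authorization: permissions adapt to real-time signals.
from dataclasses import dataclass

@dataclass
class Context:
    failed_logins: int   # recent failed authentication attempts
    new_device: bool     # request from a previously unseen device
    off_hours: bool      # request outside the usual activity window

def risk_score(ctx: Context) -> float:
    """Weighted sum of live risk signals, clamped to [0, 1]."""
    score = 0.15 * min(ctx.failed_logins, 5) / 5
    score += 0.5 * ctx.new_device + 0.35 * ctx.off_hours
    return min(score, 1.0)

def authorize(ctx: Context, threshold: float = 0.6) -> bool:
    """Grant the permission only while live risk stays below the threshold."""
    return risk_score(ctx) < threshold

print(authorize(Context(failed_logins=1, new_device=False, off_hours=True)))  # True
print(authorize(Context(failed_logins=4, new_device=True, off_hours=True)))   # False
```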
Abstract:
A compact frequency standard based on an expanding cold (133)Cs cloud is under development in our laboratory. In a first experiment, cold Cs atoms were prepared by a magneto-optical trap in a vapor cell, and a microwave antenna was used to transmit the radiation for the clock transition. The signal obtained from the fluorescence of the expanding cold-atom cloud is used to lock a microwave chain, and in this way the overall system stability is evaluated. A theoretical model based on a two-level system interacting with the two microwave pulses enables interpretation of the observed features, especially the poor Ramsey fringe contrast. (C) 2008 Optical Society of America.
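For pulse durations much shorter than the free-evolution time T, the two-level, two-pulse model mentioned in the abstract gives the familiar central Ramsey lineshape P(delta) ≈ 0.5 [1 + C cos(delta T)], where a contrast C < 1 is one simple way to represent the degraded fringes reported. The parameter values in this sketch are illustrative only.

```python
# Central-fringe approximation of the two-pulse Ramsey lineshape.
import numpy as np

def ramsey_probability(delta, T, contrast=1.0):
    """Transition probability vs. angular detuning delta (rad/s)."""
    return 0.5 * (1.0 + contrast * np.cos(delta * T))

delta = np.linspace(-200.0, 200.0, 5)   # angular detuning (rad/s)
for C in (1.0, 0.3):                    # ideal vs. reduced fringe contrast
    print(C, np.round(ramsey_probability(delta, T=10e-3, contrast=C), 3))
```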
Abstract:
Cloud-aerosol interaction is a key issue in the climate system, affecting the water cycle, the weather, and the total energy balance, including the spatial and temporal distribution of latent heat release. Information on the vertical distribution of cloud droplet microphysics and thermodynamic phase as a function of temperature or height can be correlated with details of the aerosol field to provide insight on how these particles affect cloud properties and their consequences for cloud lifetime, precipitation, the water cycle, and the general energy balance. Unfortunately, today's experimental methods still lack the observational tools that can characterize the true evolution of the cloud microphysical, spatial, and temporal structure at the cloud droplet scale, and then link these characteristics to environmental factors and properties of the cloud condensation nuclei. Here we propose and demonstrate a new experimental approach (the cloud scanner instrument) that provides the microphysical information missing from current experiments and remote-sensing options. Cloud scanner measurements can be performed from aircraft, ground, or satellite by scanning the side of the clouds from base to top, providing the unique opportunity of obtaining snapshots of the cloud droplet microphysical and thermodynamic states as a function of height and brightness temperature in clouds at several development stages. The brightness temperature profile of the cloud side can be directly associated with the thermodynamic phase of the droplets to provide information on the glaciation temperature as a function of different ambient conditions, aerosol concentration, and type. An aircraft prototype of the cloud scanner was built and flown in a field campaign in Brazil. The CLAIM-3D (3-Dimensional Cloud Aerosol Interaction Mission) satellite concept proposed here combines several techniques to simultaneously measure the vertical profile of cloud microphysics, thermodynamic phase, brightness temperature, and aerosol amount and type in the neighborhood of the clouds. The wide wavelength range and the use of multi-angle polarization measurements proposed for this mission allow us to estimate the availability and characteristics of aerosol particles acting as cloud condensation nuclei, and their effects on the cloud microphysical structure. These results can provide unprecedented details on the response of cloud droplet microphysics to natural and anthropogenic aerosols at the size scale where the interaction really happens.
Abstract:
A procedure for the simultaneous separation/preconcentration of copper, zinc, cadmium, and nickel in water samples, based on cloud point extraction (CPE) as a prior step to their determination by inductively coupled plasma optical emission spectrometry (ICP-OES), has been developed. The analytes reacted with 4-(2-pyridylazo)-resorcinol (PAR) at pH 5 to form hydrophobic chelates, which were separated and preconcentrated in a surfactant-rich phase of octylphenoxypolyethoxyethanol (Triton X-114). The parameters affecting the extraction efficiency of the proposed method, such as sample pH, complexing agent concentration, buffer amount, surfactant concentration, temperature, kinetics of the complexation reaction, and incubation time were optimized; their respective values were 5, 0.6 mmol L(-1), 0.3 mL, 0.15% (w/v), 50 degrees C, 40 min, and 10 min for 15 mL of preconcentrated solution. The method presented precision (R.S.D.) between 1.3% and 2.6% (n = 9). The concentration factors with and without dilution of the surfactant-rich phase for the analytes ranged from 9.4 to 10.1 and from 94.0 to 100.1, respectively. The limits of detection (L.O.D.) obtained for copper, zinc, cadmium, and nickel were 1.2, 1.1, 1.0, and 6.3 mu g L(-1), respectively. The accuracy of the procedure was evaluated through recovery experiments on aqueous samples. (C) 2009 Published by Elsevier B.V.
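As a quick sanity check on the quoted concentration factors: one common definition in CPE work is the ratio of sample volume to final surfactant-rich-phase volume. The phase volumes below are inferred for illustration only and are not reported in the abstract.

```python
# Concentration factor as a volume ratio (one common convention in CPE).
V_SAMPLE = 15.0                 # mL of preconcentrated solution (from abstract)

for v_phase in (1.5, 0.15):     # assumed final phase volumes, with/without dilution
    print(f"final volume {v_phase} mL -> factor {V_SAMPLE / v_phase:.0f}")
# ~10 and ~100, consistent with the reported 9.4-10.1 and 94.0-100.1 ranges
```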
Abstract:
A flow injection (FI) micelle-mediated separation/preconcentration procedure for the determination of lead and cadmium by flame atomic absorption spectrometry (FAAS) is proposed. The analytes reacted with 1-(2-thiazolylazo)-2-naphthol (TAN) to form hydrophobic chelates, which were extracted into the micelles of 0.05% (w/v) Triton X-114 in a solution buffered at pH 8.4. In the preconcentration stage, the micellar solution was continuously injected into a flow system with four mini-columns packed with cotton, glass wool, or TNT compresses for phase separation. The analyte-containing micelles were eluted from the mini-columns by a stream of 3 mol L(-1) HCl solution, and the analytes were determined by FAAS. Chemical and flow variables affecting the preconcentration of the analytes were studied. For 15 mL of preconcentrated solution, the enhancement factors varied between 15.1 and 20.3, and the limits of detection were approximately 4.5 and 0.75 mu g L(-1) for lead and cadmium, respectively. For a solution containing 100 and 10 mu g L(-1) of lead and cadmium, respectively, the R.S.D. values varied from 1.6 to 3.2% (n = 7). The accuracy of the preconcentration system was evaluated by recovery measurements on spiked water samples. The method was susceptible to matrix effects, but these interferences were minimized by adding barium ions as a masking agent to the sample solutions, and recoveries from spiked samples varied in the range of 95.1-107.3%. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
We report the discovery, from the H I Parkes All-Sky Survey (HIPASS), of an isolated cloud of neutral hydrogen, which we believe to be extragalactic. The H I mass of the cloud (HIPASS J1712-64) is very low, 1.7 x 10(7) M-circle dot, using an estimated distance of similar to 3.2 Mpc. Most significantly, we have found no optical companion to this object to very faint limits [mu(B) similar to 27 mag arcsec(-2)]. HIPASS J1712-64 appears to be a binary system similar to, but much less massive than, H I 1225+01 (the Virgo H I cloud), and has a size of at least 15 kpc. The mean velocity dispersion measured with the Australia Telescope Compact Array (ATCA) is only 4 km s(-1) for the main component and, because of the weak or nonexistent star formation, possibly reflects the thermal line width (T < 2000 K) rather than bulk motion or turbulence. The peak column density for HIPASS J1712-64, from the combined Parkes and ATCA data, is only 3.5 x 10(19) cm(-2), which is estimated to be a factor of 2 below the critical threshold for star formation. Apart from its significantly higher velocity, the properties of HIPASS J1712-64 are similar to those of the recently recognized class of compact high-velocity clouds. We therefore consider the evidence for a Local Group or Galactic origin, although a more plausible alternative is that HIPASS J1712-64 was ejected from the interacting Magellanic Cloud-Galaxy system at perigalacticon, similar to 2 x 10(8) yr ago.
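For reference, the quoted H I mass follows from the standard single-dish relation M(HI) = 2.356 x 10(5) D(2) S, with M in solar masses, D the distance in Mpc, and S the integrated flux in Jy km s(-1); inverting it with the values above gives the implied integrated flux, as in this short sketch.

```python
# Invert the standard H I mass relation for the quoted mass and distance.
M_HI = 1.7e7    # H I mass in solar masses (from the abstract)
D = 3.2         # estimated distance in Mpc (from the abstract)

S = M_HI / (2.356e5 * D**2)
print(f"implied integrated flux ≈ {S:.1f} Jy km/s")   # ≈ 7.0
```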
Abstract:
A new cloud-point extraction and preconcentration method, using a cationic surfactant, Aliquat-336 (tricaprylylmethylammonium chloride), has been developed for the determination of cyanobacterial toxins, microcystins, in natural waters. Sodium sulfate was used to induce phase separation at 25 degrees C. The phase behavior of Aliquat-336 with respect to the concentration of Na2SO4 was studied. The cloud-point system revealed a very high phase volume ratio compared to other established systems of nonionic, anionic, and cationic surfactants. At pH 6-7, it showed an outstanding selectivity in analyte extraction for anionic species. Only MC-LR and MC-YR, which are known to be predominantly anionic, were extracted (with averaged recoveries of 113.9 +/- 9% and 87.1 +/- 7%, respectively). MC-RR, which is likely to be amphoteric in the above pH range, was not detectable in the extract. Coupled to HPLC/UV separation and detection, the cloud-point extraction method (with 2.5 mM Aliquat-336 and 75 mM Na2SO4 at 25 degrees C) offered detection limits of 150 +/- 7 and 470 +/- 72 pg/mL for MC-LR and MC-YR, respectively, in 25 mL of deionized water. The repeatability of the method was 7.6% for MC-LR and 7.3% for MC-YR. The cloud-point extraction process can be completed within 10-15 min with no cleanup steps required. The applicability of the new method to the determination of microcystins in real samples was demonstrated using natural surface waters, collected from a local river and a local duck pond, spiked with realistic concentrations of microcystins. The effects of salinity and organic matter (TOC) content in the water sample on the extraction efficiency were also studied.
Abstract:
This project aims to provide a service platform for managing and accounting for paid time, through the recording of working hours, vacations, and absences (with or without justification). The platform is intended to provide reports based on this information, as well as automatic data analysis, for example detecting excessive absences or overlapping vacations among workers. The emphasis of the project is on providing an architecture that facilitates the inclusion of these features. The project is implemented on the Google App Engine (GAE) platform, in order to provide a solution under the Software as a Service paradigm, with guaranteed availability and data replication. The platform was chosen after an analysis of the main existing cloud platforms: Google App Engine, Windows Azure, and Amazon Web Services. The characteristics of each platform were analysed, namely the programming models, the data models provided, the existing services, and their respective costs. The choice of platform was based on its characteristics at the start of this project. The solution is structured in layers, with the following components: platform interface, business logic, and data access logic. The interface is designed in observance of REST architectural principles, supporting data in JSON and XML formats. An authorization component, based on Spring-Security, was added to this base architecture, with authentication delegated to the Google Accounts service. The Dependency Injection pattern was used to decouple the various layers, reducing the dependency on the technologies used in each layer. A prototype was implemented to demonstrate the work carried out, allowing interaction with the implemented service features via AJAX requests. This prototype took advantage of several JavaScript libraries and patterns that simplified its development, such as model-view-viewmodel via data binding. To support the development of the project, an agile development approach based on Scrum was adopted, implementing the system requirements expressed as user stories. To ensure the quality of the service implementation, unit tests were written, with functional analysis carried out beforehand and documentation subsequently produced using UML diagrams.
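To illustrate the layering and Dependency Injection idea described above (the project itself targets Google App Engine), here is a minimal sketch rendered in Python; all class and method names are invented for the example and do not come from the project.

```python
# Constructor injection: the business-logic layer depends only on an
# abstraction of the data-access layer, so storage can be swapped freely.
from abc import ABC, abstractmethod

class TimeEntryRepository(ABC):
    """Data-access layer contract."""
    @abstractmethod
    def hours_for(self, worker_id: str) -> float: ...

class DatastoreRepository(TimeEntryRepository):
    def hours_for(self, worker_id: str) -> float:
        return 37.5   # would query the GAE Datastore in a real system

class TimeTrackingService:
    """Business-logic layer; knows only the repository abstraction."""
    def __init__(self, repo: TimeEntryRepository):
        self.repo = repo

    def weekly_report(self, worker_id: str) -> str:
        return f"{worker_id}: {self.repo.hours_for(worker_id)} h"

service = TimeTrackingService(DatastoreRepository())   # injection point
print(service.weekly_report("worker-42"))
```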
Abstract:
Navigation and interpretation of the surrounding environment by autonomous vehicles in unstructured environments remains a major challenge today. Sebastian Thrun describes in [Thr02] the mapping problem in robotic systems as that of acquiring a spatial model of the robot's environment. In this context, the integration of sensor systems into robotic platforms, enabling the construction of maps of the surrounding world, is extremely important. The information gathered from such data can be interpreted and applied to localization, navigation, and object manipulation tasks. Until very recently, most robotic systems performing mapping or Simultaneous Localization And Mapping (SLAM) tasks used devices such as laser rangefinders and stereo cameras. Besides being expensive, this equipment provides only two-dimensional information, acquired through 2D cross-sections in the case of rangefinders. The paradigm of this type of technology changed considerably with the market launch of RGB-D cameras, such as the one developed by PrimeSense, and the subsequent launch of the Kinect by Microsoft for the Xbox 360 at the end of 2010. The quality of the depth sensor, given its low cost and its real-time data acquisition capability, is undeniable, making the sensor instantly popular among researchers and enthusiasts. This technological advance gave rise to several tools for development and human interaction with this type of sensor, such as the Point Cloud Library (PCL) [RC11]. This library aims to provide support for all the common building blocks that a 3D application needs, with special emphasis on the processing of n-dimensional point clouds acquired from RGB-D cameras, as well as laser scanners, time-of-flight cameras, and stereo cameras. In this context, this dissertation evaluates and compares some of the modules and methods of the PCL library for solving problems inherent to the construction and interpretation of maps in unstructured indoor environments, using data from the Kinect. Based on this evaluation, a system architecture is proposed that systematizes the registration of point clouds, corresponding to partial views of the world, into a consistent global model. The results of the evaluation of the PCL library attest to its viability for solving the proposed problems. Proof of this viability is given by the practical results obtained from the implementation of the proposed system architecture, which shows interesting performance, as well as good prospects for integrating this type of concept and technology into robotic platforms developed within projects of the Laboratório de Sistemas Autónomos (LSA).
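As an illustration of the point-cloud registration step that the proposed architecture systematizes (the dissertation itself uses PCL in C++), here is a minimal iterative-closest-point sketch in NumPy/SciPy; the toy clouds and iteration count are arbitrary.

```python
# Minimal rigid ICP: match nearest neighbours, then solve for the best
# rotation/translation with the Kabsch (SVD) method, and iterate.
import numpy as np
from scipy.spatial import cKDTree

def icp_step(src, dst):
    """One ICP iteration: correspondences via KD-tree, then Kabsch."""
    _, idx = cKDTree(dst).query(src)
    matched = dst[idx]
    mu_s, mu_d = src.mean(axis=0), matched.mean(axis=0)
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t

# Toy example: recover a small rotation + translation.
rng = np.random.default_rng(1)
dst = rng.uniform(-1, 1, (500, 3))
a = 0.1                                           # 0.1 rad rotation about z
Rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0, 0, 1]])
src = (dst - [0.05, 0.02, 0.0]) @ Rz.T            # misaligned copy
for _ in range(20):
    src = icp_step(src, dst)
print("RMS error:", np.sqrt(((src - dst) ** 2).sum(axis=1)).mean())
```

In a full pipeline such pairwise alignments are chained and globally refined to register all partial views into one consistent model, which is the role of the architecture proposed in the dissertation.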