980 results for Point cloud processing


Relevance:

20.00%

Publisher:

Abstract:

Maintaining a high level of data security with a low impact on system performance is particularly challenging in wireless multimedia applications. Protocols used for wireless local area network (WLAN) security are known to significantly degrade performance. In this paper, we propose an enhanced security system for a WLAN. Our new design aims to decrease the processing delay and increase both the speed and throughput of the system, thereby making it more efficient for multimedia applications. Our design is based on the idea of offloading computationally intensive encryption and authentication services to the end systems’ CPUs. The security operations are performed by the host’s central processor (usually a powerful processor) before the data are delivered to the wireless card (which usually has a low-performance processor). By adopting this design, we show that both the delay and the jitter are significantly reduced. At the access point, we improve the performance of the network processing hardware for real-time cryptographic processing by using a specialized processor implemented with field-programmable gate array technology. Furthermore, we use enhanced techniques to implement the Counter (CTR) Mode with Cipher Block Chaining Message Authentication Code Protocol (CCMP) and the CTR protocol. Our experiments show that data encryption and authentication take 20–40 μs on different end-host CPUs (e.g., Intel Core i5, i7, and AMD 6-Core), compared with 10–50 ms when performed on the wireless card. Furthermore, compared with standard WiFi Protected Access II (WPA2), the results show that our proposed security system improves speed by up to 3.7 times.
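
As a rough illustration of the host-side offloading idea, the sketch below (Python, using the `cryptography` package) encrypts and authenticates a frame payload with AES-CCM, the primitive underlying CCMP. The key size, nonce layout and header bytes are assumptions for illustration only, not the frame format or implementation used in the paper.

```python
# Hedged sketch: host-CPU encryption/authentication with AES-CCM (the CCMP primitive).
# Key size, nonce layout and header contents are illustrative assumptions.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

key = AESCCM.generate_key(bit_length=128)   # 128-bit temporal key, as in CCMP
aead = AESCCM(key, tag_length=8)            # CCMP uses an 8-byte MIC

nonce = os.urandom(13)                      # 13-byte nonce (in 802.11: packet number + address)
header = b"illustrative-802.11-header"      # authenticated but not encrypted (AAD)
payload = b"multimedia frame payload"

ciphertext = aead.encrypt(nonce, payload, header)    # encrypt + authenticate on the host CPU
plaintext = aead.decrypt(nonce, ciphertext, header)  # raises if the frame was tampered with
assert plaintext == payload
```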

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, data centers are large energy consumers, and this consumption is expected to increase further in the coming years given the growth of cloud services. A large portion of this power consumption is due to the control of the physical parameters of the data center (such as temperature and humidity). However, these physical parameters are tightly coupled with computations, and even more so in upcoming data centers, where the location of workloads can vary substantially because, for example, workloads are moved within the cloud infrastructure hosted in the data center. Therefore, managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). In this paper, we describe a data collection and distribution architecture that enables gathering the physical parameters of a large data center at very high temporal and spatial resolution of the sensor measurements. We believe this is an important characteristic for building more accurate heat-flow models of the data center and, with them, finding opportunities to optimize energy consumption. A high-resolution picture of data center conditions also makes it possible to minimize local hot-spots, perform more accurate predictive maintenance (failures in infrastructure equipment can be detected more promptly) and bill more accurately. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
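
As a loose illustration of how such a collection layer might shape and fan out high-resolution sensor readings, the sketch below uses a toy in-memory topic bus; the topic scheme, message fields and the "broker" itself are invented stand-ins, not the messaging system defined in the abstract.

```python
# Hedged sketch: shape of a sensor message and topic-based distribution,
# standing in for the data-center messaging system described above.
# Topic names, fields and the in-memory "broker" are illustrative assumptions.
import json
import time
from collections import defaultdict

subscribers = defaultdict(list)          # topic prefix -> list of callbacks (toy broker)

def subscribe(prefix, callback):
    subscribers[prefix].append(callback)

def publish(topic, message):
    payload = json.dumps(message)
    for prefix, callbacks in subscribers.items():
        if topic.startswith(prefix):
            for cb in callbacks:
                cb(topic, payload)

# A consumer interested in one room's readings (e.g., a heat-flow model).
subscribe("dc/room1/", lambda topic, payload: print(topic, payload))

# One message per sample gives high temporal resolution; the topic encodes location.
publish("dc/room1/rackA07/temperature", {
    "rack": "A07",
    "sensor": "inlet-temp-3",
    "value_c": 24.6,
    "ts": time.time(),
})
```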

Relevance:

20.00%

Publisher:

Abstract:

1st ASPIC International Congress

Relevance:

20.00%

Publisher:

Abstract:

Biosensors have opened new horizons in biomedical analysis by ensuring increased assay speed and flexibility and by allowing point-of-care applications, multi-target analyses, automation and reduced testing costs. This has been the result of many studies merging nanotechnology with biochemistry over the years, enabling the creation of more suitable environments for biological receptors and their substitution by synthetic analogue materials. Sol-gel chemistry, among other material platforms, is deeply involved in this process. Sol-gel processing allows the immobilization of organic molecules, biomacromolecules and cells while maintaining their properties and activities, permitting their integration into different transduction devices, of electrochemical or optical nature, for single or multiple analyses. Sol-gel also allows the production of synthetic materials that mimic the activity of natural receptors, while bringing advantages, mostly in terms of cost and stability. Moreover, the biocompatibility of sol-gel materials with structures of biological nature has allowed their use in emerging in vivo applications. In this chapter, biosensors for biomedical applications based on sol-gel derived composites are presented, compared and described, along with current emerging in vivo applications concerning drug delivery and biomaterials. Sol-gel materials are shown to be a promising tool for current, emerging and future medical applications.

Relevance:

20.00%

Publisher:

Abstract:

This work describes a novel use of the polymeric film poly(o-aminophenol) (PAP), which was made responsive to a specific protein. This was achieved through templated electropolymerization of aminophenol (AP) in the presence of the protein. The procedure involved adsorbing the protein on the electrode surface and thereafter electropolymerizing the aminophenol. Proteins embedded at the outer surface of the polymeric film were digested by proteinase K and then washed away, thereby creating vacant sites. The capacity of the templated film to specifically rebind protein was tested with myoglobin (Myo), a cardiac biomarker for ischemia. The films acted as biomimetic artificial antibodies and were produced on a gold (Au) screen-printed electrode (SPE), as a step towards disposable sensors for point-of-care applications. Raman spectroscopy was used to follow the surface modification of the Au-SPE. The ability of the material to rebind Myo was measured by electrochemical techniques, namely electrochemical impedance spectroscopy (EIS) and square wave voltammetry (SWV). The devices displayed linear responses to Myo in EIS and SWV assays down to 4.0 and 3.5 μg/mL, respectively, with detection limits of 1.5 and 0.8 μg/mL. Good selectivity was observed in the presence of troponin T (TnT) and creatine kinase (CKMB) in SWV assays, and accurate results were obtained in applications to spiked serum. The sensor described in this work is a potential tool for screening Myo at the point of care due to its simple fabrication, disposability, short response time, low cost, and good sensitivity and selectivity.
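
For context on the reported detection limits, the sketch below shows one common way such a limit is estimated from a linear calibration (LOD ≈ 3.3·σ_blank/slope). The calibration points and blank noise are invented for illustration and are not the paper's data or its exact procedure.

```python
# Hedged sketch: estimating a limit of detection from a linear calibration curve.
# Concentrations, signals and blank noise are invented for illustration only.
import numpy as np

conc = np.array([3.5, 7.0, 14.0, 28.0])         # μg/mL (hypothetical standards)
signal = np.array([0.9, 1.8, 3.7, 7.3])         # instrument response (arbitrary units)

slope, intercept = np.polyfit(conc, signal, 1)  # least-squares line: signal = slope*conc + intercept
sigma_blank = 0.09                              # std. dev. of blank measurements (assumed)

lod = 3.3 * sigma_blank / slope                 # common 3.3*sigma/slope criterion
print(f"slope = {slope:.3f}, LOD ≈ {lod:.2f} μg/mL")
```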

Relevance:

20.00%

Publisher:

Abstract:

A gold screen-printed electrode (Au-SPE) was modified by merging Molecular Imprinting and Self-Assembled Monolayer techniques for fast screening of cardiac biomarkers at the point of care (POC). For this purpose, Myoglobin (Myo) was selected as the target analyte and its plastic antibody was imprinted over a glutaraldehyde (Glu)/cysteamine (Cys) layer on the gold surface. The imprinting effect was produced by growing a cross-linked polymer of acrylamide (AAM) and N,N′-methylenebisacrylamide (NNMBA) around the Myo template, covalently attached to the biosensing surface. Electrochemical impedance spectroscopy (EIS) and cyclic voltammetry (CV) studies were carried out at every chemical modification step to confirm the surface changes on the Au-SPE. The analytical features of the resulting biosensor were studied by different electrochemical techniques, including EIS, square wave voltammetry (SWV) and potentiometry. The limits of detection ranged from 0.13 to 8 μg/mL. Only the potentiometric assays showed limits of detection covering the cut-off Myo levels. Quantitative information was also produced for Myo concentrations ≥0.2 μg/mL. The linear response of the biosensing device showed an anionic slope of ~70 mV per decade of molar concentration up to 0.3 μg/mL. The interference of coexisting species was tested and good selectivity was observed. The biosensor was successfully applied to biological fluids.
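
To illustrate what a potentiometric slope of about 70 mV per decade means in practice, the sketch below fits E = E0 + S·log10(C) to a few calibration points; the concentrations and potentials are assumptions made up for the example, not the paper's measurements.

```python
# Hedged sketch: fitting a potentiometric calibration E = E0 + S*log10(C).
# Concentrations and potentials are invented for illustration only.
import numpy as np

conc = np.array([0.3, 0.6, 1.2, 2.4])            # μg/mL (hypothetical)
potential = np.array([120.0, 99.0, 78.5, 57.0])  # mV; decreasing response -> anionic slope

slope, e0 = np.polyfit(np.log10(conc), potential, 1)
print(f"slope ≈ {slope:.1f} mV/decade, E0 ≈ {e0:.1f} mV")   # roughly -70 mV/decade here
```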

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted to obtain the Master's degree in Informatics Engineering

Relevance:

20.00%

Publisher:

Abstract:

Underground scenarios are among the most challenging environments for accurate and precise 3D mapping, where hostile conditions such as the absence of Global Positioning System coverage, extreme lighting variations and geometrically smooth surfaces are to be expected. So far, the state-of-the-art methods in underground modelling remain restricted to environments in which pronounced geometric features are abundant. This limitation is a consequence of the scan-matching algorithms used to solve the localization and registration problems. This paper contributes to the expansion of modelling capabilities to structures characterized by uniform geometry and smooth surfaces, as is the case of road and train tunnels. To achieve that, we combine state-of-the-art techniques from mobile robotics and propose a method for 6DOF platform positioning in such scenarios, which is later used for environment modelling. A visual monocular Simultaneous Localization and Mapping (MonoSLAM) approach based on the Extended Kalman Filter (EKF), complemented by the introduction of inertial measurements in the prediction step, allows our system to localize itself over long distances using exclusively sensors carried on board a mobile platform. By feeding the Extended Kalman Filter with inertial data we were able to overcome the major problem of MonoSLAM implementations, known as scale-factor ambiguity. Despite extreme lighting variations, reliable visual features were extracted with the SIFT algorithm and inserted directly into the EKF mechanism according to the Inverse Depth Parametrization. Wrong frame-to-frame feature matches were rejected through 1-Point RANSAC (Random Sample Consensus). The developed method was tested on a dataset acquired inside a road tunnel, and the navigation results were compared with a ground truth obtained by post-processing a high-grade Inertial Navigation System and L1/L2 RTK-GPS measurements acquired outside the tunnel. Results from the localization strategy are presented and analyzed.
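
As a very rough sketch of how inertial measurements can enter an EKF prediction step, the code below propagates a one-dimensional constant-velocity state with a measured acceleration as the control input. The state layout, noise values and dynamics are simplifying assumptions, not the filter used in the paper.

```python
# Hedged sketch: EKF prediction step with an inertial (acceleration) control input,
# for a 1D state [position, velocity]. Model and noise values are assumptions.
import numpy as np

def ekf_predict(x, P, accel, dt, q=0.05):
    """Propagate state x = [p, v] and covariance P using a measured acceleration."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])                  # state transition (constant velocity)
    B = np.array([0.5 * dt**2, dt])             # how acceleration enters the state
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])      # process noise (assumed)

    x_pred = F @ x + B * accel                  # inertial data constrains scale/drift
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

x = np.array([0.0, 0.0])                        # start at rest
P = np.eye(2) * 0.1
x, P = ekf_predict(x, P, accel=0.3, dt=0.02)    # one IMU sample at 50 Hz
print(x, np.diag(P))
```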

Relevance:

20.00%

Publisher:

Abstract:

Development and standardization of reliable methods for the detection of Mycobacterium tuberculosis in clinical samples is an important goal in laboratories throughout the world. In this work, lung and spleen fragments from a patient who died with a diagnosis of miliary tuberculosis were used to evaluate the influence of the type of fixative, as well as of the fixation and paraffin-embedding protocols, on PCR performance in paraffin-embedded specimens. Tissue fragments were fixed for 4 h to 48 h, using either 10% non-buffered or 10% buffered formalin, and embedded in pure paraffin or paraffin mixed with beeswax. Specimens were submitted to PCR for amplification of the human beta-actin gene and, separately, for amplification of the insertion sequence IS6110, specific to the M. tuberculosis complex. Amplification of the beta-actin gene was positive in all samples. No amplicons were generated by PCR-IS6110 when lung tissue fragments were fixed using 10% non-buffered formalin and embedded in paraffin containing beeswax. In conclusion, combined inhibitory factors interfere with the detection of M. tuberculosis in stored material. It is important to control these inhibitory factors in order to implement molecular diagnosis in pathology laboratories.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study several natural and man-made complex phenomena from the perspective of dynamical systems. For each class of phenomena, the system outputs are time-series records obtained under identical conditions. The time series are viewed as manifestations of the system behavior and are processed to analyze the system dynamics. First, we use the Fourier transform to process the data and approximate the amplitude spectra by means of power law (PL) functions. We interpret the power law parameters as a phenomenological signature of the system dynamics. Second, we adopt the techniques of non-hierarchical clustering and multidimensional scaling to visualize hidden relationships between the complex phenomena. Third, we propose a vector-field-based analogy to interpret the patterns unveiled by the PL parameters.
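
As an illustration of the first step, the sketch below approximates the amplitude spectrum of a time series by a power law |F(f)| ≈ a·f^b via a least-squares fit in log-log coordinates; the synthetic random-walk signal is only a stand-in for the experimental records used in the paper.

```python
# Hedged sketch: fitting a power law a * f^b to the amplitude spectrum of a series.
# The synthetic signal below stands in for the paper's experimental records.
import numpy as np

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4096))        # synthetic record (random walk)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, d=1.0)
mask = freqs > 0                                # drop the DC component before taking logs

b, log_a = np.polyfit(np.log(freqs[mask]), np.log(spectrum[mask]), 1)
print(f"power-law exponent b ≈ {b:.2f}, amplitude a ≈ {np.exp(log_a):.2f}")
```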

Relevance:

20.00%

Publisher:

Abstract:

Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as the optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed due to the interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance-deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs in real-world environments while reducing energy costs by as much as 21%.
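
To make the idea of combining a performance-deviation estimate with power awareness more concrete, the sketch below scores candidate hosts for a VM using a weighted penalty. The deviation model, power model, weights and host fields are invented stand-ins, not the estimator or scheduling algorithm evaluated in the paper.

```python
# Hedged sketch: choosing a host for a VM by penalizing both the estimated
# performance deviation from co-hosted workloads and the marginal power cost.
# The interference model, power model and weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    cpu_load: float       # fraction of CPU already used
    net_load: float       # fraction of NIC bandwidth already used

def estimated_deviation(host, vm_cpu, vm_net):
    # Toy interference model: deviation grows with contention on each resource.
    return 0.6 * host.cpu_load * vm_cpu + 0.4 * host.net_load * vm_net

def marginal_power(host, vm_cpu):
    # Toy linear power model: cost of adding this VM's CPU share to the host.
    return 100.0 * vm_cpu * (1.0 - 0.3 * host.cpu_load)

def place(vm_cpu, vm_net, hosts, w_perf=0.7, w_power=0.3):
    def score(h):
        return (w_perf * estimated_deviation(h, vm_cpu, vm_net)
                + w_power * marginal_power(h, vm_cpu) / 100.0)
    return min(hosts, key=score)               # lowest combined penalty wins

hosts = [Host("h1", 0.8, 0.2), Host("h2", 0.3, 0.7), Host("h3", 0.5, 0.5)]
print(place(vm_cpu=0.4, vm_net=0.1, hosts=hosts).name)
```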

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted to obtain the Master's degree in Mechanical Engineering

Relevance:

20.00%

Publisher:

Abstract:

Thesis submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering

Relevance:

20.00%

Publisher:

Abstract:

With the growing concern to boost exports and enhance their effects on the economy, many studies have tried to identify factors that drive firms' success in the international market (size, pre-export productivity, age, stage of the production cycle, prior relationships with foreign markets, etc.). Topics such as natural market selection and learning-by-exporting are transversal and unavoidable in empirical work addressing exports at the firm level. However, we must not forget that one of the main motivations of firms is profit maximization. Indeed, a new wave of studies has turned to the impact that exports have on firms' profitability. Using a fixed-effects model with panel data, applied to a database of Portuguese firms covering the period from 2008 to 2012, this work finds evidence that exports are a weak driver of firm profitability. Regarding the organization of this work, the first chapter presents a brief literature review framing the topic; the second chapter presents the database, data treatment and the econometric approach; finally, a conclusion is presented with the main results of the work and some questions that could be addressed in the future.
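
As a rough illustration of a firm fixed-effects panel regression of profitability on exporting, the sketch below estimates such a model with statsmodels on generated data. The variable names, the profitability proxy and the data are invented for illustration; this is not the thesis's dataset or exact specification.

```python
# Hedged sketch: firm fixed-effects panel regression of profitability on export
# intensity, estimated with firm and year dummies. Data and variables are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
firms, years = 50, 5
df = pd.DataFrame({
    "firm": np.repeat(np.arange(firms), years),
    "year": np.tile(np.arange(2008, 2008 + years), firms),
})
df["export_intensity"] = rng.uniform(0, 1, len(df))          # exports / sales (hypothetical)
df["size"] = rng.normal(10, 2, len(df))                      # log employees (hypothetical)
firm_effect = rng.normal(0, 1, firms)[df["firm"]]
df["roa"] = (0.02 * df["export_intensity"] + 0.1 * df["size"]
             + firm_effect + rng.normal(0, 0.5, len(df)))    # profitability proxy

# Fixed effects via C(firm) and C(year) dummies absorbing firm- and time-specific levels.
model = smf.ols("roa ~ export_intensity + size + C(firm) + C(year)", data=df).fit()
print(model.params[["export_intensity", "size"]])
```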

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, Information Technologies (IT) are increasingly vital within organizations. IT is the engine that supports the business. For most organizations, the operation and development of IT rely on dedicated infrastructures (internal or external) called Data Centers (DC). These infrastructures concentrate an organization's data processing and storage equipment, and for that reason they are, and will increasingly be, challenged with respect to several factors such as scalability, availability, fault tolerance, performance, available or provisioned resources, security, energy efficiency and, inevitably, the associated costs. With the emergence of technologies based on cloud computing and virtualization, a whole range of new ways of addressing the challenges described above opens up. Under this new paradigm, new opportunities for DC consolidation arise, which may pose new challenges for DC managers. It is therefore, at the very least, unrealistic for organizations simply to eliminate their DCs or to transform them according to the highest quality standards. Organizations must optimize their DCs; however, an efficient project of this nature, capable of supporting the demands imposed by the market, the needs of the business and the speed of technological evolution, requires complex and costly solutions both to implement and to manage. This is the context of the present work. With the objective of studying DCs, a study of this topic is first carried out, detailing the concept, its historical evolution, topology, architecture and the standards that govern them. The study then details some of the main trends shaping the future of DCs. Building on the theoretical knowledge resulting from this study, a methodology for evaluating DCs based on decision criteria is developed. The study culminates with an analysis of a new technological solution and the evaluation of three possible implementation scenarios: the first based on keeping the current DC; the second based on implementing the new solution in another DC under an external hosting arrangement; and finally the third based on an IaaS deployment.
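
As a loose illustration of a decision-criteria evaluation of the three scenarios mentioned above, the sketch below ranks them with a weighted-sum score. The criteria, weights and scores are invented for the example and do not reflect the methodology or results of the thesis.

```python
# Hedged sketch: weighted-sum comparison of data-center scenarios against
# decision criteria. Criteria, weights and scores are illustrative assumptions.
criteria_weights = {
    "scalability": 0.20,
    "availability": 0.25,
    "energy_efficiency": 0.20,
    "security": 0.15,
    "cost": 0.20,
}

# Scores on a 1-5 scale (made up) for the three scenarios described in the abstract.
scenarios = {
    "keep current DC": {"scalability": 2, "availability": 3, "energy_efficiency": 2, "security": 4, "cost": 4},
    "external hosting": {"scalability": 4, "availability": 4, "energy_efficiency": 4, "security": 3, "cost": 3},
    "IaaS": {"scalability": 5, "availability": 4, "energy_efficiency": 5, "security": 3, "cost": 2},
}

def weighted_score(scores):
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(scenarios.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```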