Abstract:
Forty children with a diagnosis of visceral toxocariasis were evaluated prospectively from February 1982 to June 1989. Diagnosis was established by clinical, laboratory and serological (ELISA with Toxocara canis ES antigen) evaluation. Great clinical polymorphism was found in our patients, ranging from nonspecific or absent manifestations to exuberant symptomatology. The laboratory findings were leukocytosis, eosinophilia and elevated serum gammaglobulin and isohemagglutinin levels. No significant relationship between clinical findings and laboratory parameters was found. Serology (ELISA) provided strong diagnostic support but did not correlate with clinical and laboratory findings in this study. There was a significant relationship between pulmonary manifestations and the presence of signs and/or symptoms at the time the patients were referred to us. Our findings, especially the high incidence of pulmonary manifestations, suggest that visceral toxocariasis has to be included in the differential diagnosis of children with pulmonary manifestations, characteristic epidemiological data and associated eosinophilia.
Abstract:
BACKGROUND: While the pharmaceutical industry keeps an eye on plasmid DNA production for new-generation gene therapies, real-time monitoring techniques for plasmid bioproduction are as yet unavailable. This work shows the possibility of in situ monitoring of plasmid production in Escherichia coli cultures using a near-infrared (NIR) fiber optic probe. RESULTS: Partial least squares (PLS) regression models based on the NIR spectra were developed for predicting bioprocess critical variables such as the concentrations of biomass, plasmid, carbon sources (glucose and glycerol) and acetate. In order to achieve robust models able to predict the performance of plasmid production processes independently of the composition of the cultivation medium, the cultivation strategy (batch versus fed-batch) and the E. coli strain used, three strategies were adopted, using: (i) E. coli DH5 cultures conducted under different media compositions and culture strategies (batch and fed-batch); (ii) engineered E. coli strains, MG1655endArecApgi and MG1655endArecA, grown on the same medium and with the same culture strategy; (iii) diverse E. coli strains, over batch and fed-batch cultivations and using different media compositions. PLS models showed high accuracy for predicting all variables in the three groups of cultures. CONCLUSION: NIR spectroscopy combined with PLS modeling provides a fast, inexpensive and contamination-free technique to accurately monitor plasmid bioprocesses in real time, independently of the medium composition, cultivation strategy and E. coli strain used.
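As a rough illustration of the calibration approach this abstract describes (not the authors' actual pipeline), a PLS model mapping NIR spectra to process variables can be set up in a few lines; the spectra, targets and component count below are synthetic, hypothetical stand-ins.

```python
# Minimal PLS calibration sketch for spectra-to-concentration prediction.
# Data shapes and the number of latent variables are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((120, 700))   # 120 spectra x 700 NIR wavelengths (synthetic)
Y = rng.random((120, 3))     # biomass, plasmid, glucose concentrations (synthetic)

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=8)  # chosen by cross-validation in practice
pls.fit(X_train, Y_train)
Y_pred = pls.predict(X_test)         # prediction from each new in situ spectrum
```

In a real bioprocess setting, each row of X would be a preprocessed probe spectrum and the fitted model would be queried as new spectra arrive, giving the real-time monitoring described above.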
Abstract:
This paper proposes an FPGA-based architecture for onboard hyperspectral unmixing. The method, based on Vertex Component Analysis (VCA), has several advantages: it is unsupervised, fully automatic, and works without a dimensionality reduction (DR) preprocessing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 SoC FPGA, based on the Artix-7 FPGA programmable logic, and tested using real hyperspectral datasets. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform for implementing high-performance, low-cost embedded systems.
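For context, the core idea of VCA is to repeatedly project the data onto a direction orthogonal to the endmembers found so far and pick the most extreme pixel. The following is a simplified sketch of that loop only, omitting the SNR-dependent projection steps of the full algorithm.

```python
# Simplified VCA-style endmember extraction; a sketch of the core idea,
# not the full published algorithm or the FPGA implementation.
import numpy as np

def vca_sketch(Y, p, seed=0):
    """Y: (bands, pixels) hyperspectral matrix; p: number of endmembers."""
    rng = np.random.default_rng(seed)
    E = np.zeros((Y.shape[0], p))
    indices = []
    for i in range(p):
        w = rng.standard_normal(Y.shape[0])       # random direction
        if i > 0:                                 # project onto the orthogonal
            Ei = E[:, :i]                         # complement of found endmembers
            w = w - Ei @ np.linalg.pinv(Ei) @ w
        v = w / np.linalg.norm(w)
        k = int(np.argmax(np.abs(v @ Y)))         # most extreme pixel wins
        indices.append(k)
        E[:, i] = Y[:, k]
    return E, indices
```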
Abstract:
Remote hyperspectral sensors collect large amounts of data per flight, usually with low spatial resolution. Since the bandwidth of the connection between the satellite/airborne platform and the ground station is limited, an onboard compression method is desirable to reduce the amount of data to be transmitted. This paper presents a parallel implementation of a compressive sensing method, called parallel hyperspectral coded aperture (P-HYCA), for graphics processing units (GPUs) using the compute unified device architecture (CUDA). This method takes into account two main properties of hyperspectral datasets, namely the high correlation existing among the spectral bands and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. Experimental results conducted using synthetic and real hyperspectral datasets on two different GPU architectures by NVIDIA, the GeForce GTX 590 and the GeForce GTX TITAN, reveal that the use of GPUs can provide real-time compressive sensing performance. The achieved speedup is up to 20 times compared with the processing time of HYCA running on one core of an Intel i7-2600 CPU (3.4 GHz) with 16 GB of memory.
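The following toy example illustrates the low-rank property the method exploits: because mixed pixels lie in the subspace spanned by a few endmembers, each pixel can be recovered from a handful of random spectral projections. The data are synthetic and the recovery step is a plain least squares, not the P-HYCA reconstruction itself.

```python
# Toy demonstration: a rank-p hyperspectral cube is recovered exactly from
# m << bands random measurements per pixel, given the endmember subspace.
import numpy as np

rng = np.random.default_rng(1)
bands, pixels, p = 200, 1000, 5
E = rng.random((bands, p))                     # endmember signatures
A = rng.dirichlet(np.ones(p), size=pixels).T   # abundances sum to one per pixel
X = E @ A                                      # mixed data, rank <= p

m = 2 * p                                      # 10 measurements instead of 200 bands
Phi = rng.standard_normal((m, bands))          # random measurement matrix
Y = Phi @ X                                    # compressive measurements

A_hat = np.linalg.lstsq(Phi @ E, Y, rcond=None)[0]  # recover abundances
X_hat = E @ A_hat                              # rebuild the full cube
print(np.max(np.abs(X - X_hat)))               # ~1e-12: exact up to rounding
```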
Abstract:
The application of compressive sensing (CS) to hyperspectral images has been an active area of research over the past few years, both in terms of hardware and signal processing algorithms. However, CS algorithms can be computationally very expensive due to the extremely large volumes of data collected by imaging spectrometers, a fact that compromises their use in applications under real-time constraints. This paper proposes four efficient implementations of hyperspectral coded aperture (HYCA) for CS on commodity graphics processing units (GPUs): two of them, termed P-HYCA and P-HYCA-FAST, plus two implementations of its constrained version (CHYCA), termed P-CHYCA and P-CHYCA-FAST. The HYCA algorithm exploits the high correlation existing among the spectral bands of hyperspectral data sets and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. The proposed P-HYCA and P-CHYCA implementations have been developed using the compute unified device architecture (CUDA) and the cuFFT library. Moreover, in the P-HYCA-FAST and P-CHYCA-FAST implementations this library has been replaced by a fast iterative method, which leads to very significant speedup factors and makes it possible to meet real-time requirements. The proposed algorithms are evaluated not only in terms of reconstruction error for different compression ratios but also in terms of computational performance using two different GPU architectures by NVIDIA: 1) GeForce GTX 590; and 2) GeForce GTX TITAN. Experiments conducted using both simulated and real data reveal considerable acceleration factors and good results in the task of compressing remotely sensed hyperspectral data sets.
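The general idea behind the *-FAST variants, replacing a library solver with a matrix-free iterative method, can be sketched as below; the operator, regularization and sizes are hypothetical stand-ins, not the actual HYCA system solved on the GPU.

```python
# Matrix-free conjugate gradients for a regularized normal-equation system
# (A^T A + mu I) x = b: no factorization, no FFT, just operator applications.
import numpy as np

def conjugate_gradient(apply_A, b, iters=100, tol=1e-10):
    x = np.zeros_like(b)
    r = b - apply_A(x)
    d = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ad = apply_A(d)
        alpha = rs / (d @ Ad)
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return x

rng = np.random.default_rng(2)
M = rng.standard_normal((300, 100))   # hypothetical measurement operator
mu = 0.1                              # hypothetical regularization weight
b = rng.standard_normal(100)
x = conjugate_gradient(lambda v: M.T @ (M @ v) + mu * v, b)
```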
Abstract:
Dissertation presented to the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Geological Engineering (Geotechnics).
Abstract:
One of the main problems of hyperspectral data analysis is the presence of mixed pixels due to the low spatial resolution of such images. Linear spectral unmixing aims at inferring pure spectral signatures and their fractions at each pixel of the scene. The huge data volumes acquired by hyperspectral sensors put stringent requirements on processing and unmixing methods. This letter proposes an efficient implementation of the method called simplex identification via split augmented Lagrangian (SISAL), which exploits the graphics processing unit (GPU) architecture at a low level using the Compute Unified Device Architecture (CUDA). SISAL aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The proposed implementation works in a pixel-by-pixel fashion, using coalesced accesses to memory and exploiting shared memory to store temporary data. Furthermore, the kernels have been optimized to minimize thread divergence, thereby achieving high GPU occupancy. The experimental results obtained for simulated and real hyperspectral data sets reveal speedups of up to 49 times, which demonstrates that the GPU implementation can significantly accelerate the method's execution over big data sets while maintaining its accuracy.
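As background for the unmixing task, the sketch below shows the linear mixing model that SISAL inverts: given endmember signatures, per-pixel abundances can be estimated by least squares with the sum-to-one constraint folded in as an extra, heavily weighted equation. This is context only, not the SISAL algorithm or its CUDA kernels.

```python
# Linear mixing model Y ~ E @ A with a soft sum-to-one constraint per pixel.
import numpy as np

def abundances_sum_to_one(E, Y, weight=1e3):
    """E: (bands, p) endmembers; Y: (bands, pixels) observed spectra."""
    p = E.shape[1]
    # append a weighted row enforcing sum(a) = 1 for every pixel
    E_aug = np.vstack([E, weight * np.ones((1, p))])
    Y_aug = np.vstack([Y, weight * np.ones((1, Y.shape[1]))])
    A, *_ = np.linalg.lstsq(E_aug, Y_aug, rcond=None)
    return A
```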
Abstract:
Work presented within the scope of the Master's in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering.
Abstract:
XX Seminário de Investigação em Educação Matemática (pp. 228-238). Viana do Castelo
Abstract:
Master's dissertation in Migrations, Inter-ethnicities and Transnationalism.
Abstract:
This report results from an internship carried out at Sika Brasil, within the scope of the DIPRE curricular unit of the Master's in Civil Engineering at the Instituto Superior de Engenharia do Porto. The area of specialization was the repair and strengthening of structures (Target Market Refurbishment), which provided the opportunity to deepen knowledge of structural strengthening through externally bonded (EBR - Externally Bonded Reinforcement) carbon fiber reinforced polymer (CFRP) composite systems. The internship allowed working with Sika Brasil's range of repair and strengthening products, with a strong focus on the products related to structural strengthening with carbon fiber composites. This document covers several stages of the internship directly related to structural strengthening with CFRP. A theoretical survey of the characteristics of carbon fiber composites was carried out, presenting the materials involved in the system, their mechanical properties and their scope of application. In order to sustain a fruitful dialogue with designers and installers of fiber composite systems, an analysis of the calculation procedure for the design of CFRP strengthening was performed in light of Bulletin 14 fib:01 (2001), as well as an analysis of the fire situation for composite systems. This document also includes a comparison of the main suppliers of CFRP systems in Brazil, based on the content of the product data sheets for the EBR strengthening system and their comparison with the information required for design according to Bulletin 14 fib:01 (2001). A structural strengthening project is reported as a case study, which could be followed from its design phase through to execution. Finally, this document contains the simulation of two Sika programs for CFRP strengthening design; the simulation was performed for a beam subjected to bending, with previously defined geometric characteristics and loads.
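As a simplified illustration of one quantity such a design procedure evaluates, the sketch below computes the tensile force an externally bonded laminate can develop at its design strain and the moment that force adds about the internal lever arm. All numbers are hypothetical, and the full Bulletin 14 fib:01 (2001) procedure involves strain compatibility, debonding checks and safety factors not shown here.

```python
# Simplified EBR CFRP flexural contribution; illustrative values only.
E_f = 165e3        # CFRP modulus of elasticity [MPa]  (hypothetical laminate)
eps_fd = 0.006     # design strain limit [-]           (assumed)
A_f = 1.2 * 100    # laminate cross-section [mm^2]     (1.2 mm x 100 mm strip)
z = 420            # internal lever arm [mm]           (assumed beam geometry)

F_f = E_f * eps_fd * A_f      # laminate tensile force [N]
delta_M = F_f * z / 1e6       # added moment capacity [kN*m]
print(f"CFRP force = {F_f/1e3:.1f} kN, added moment = {delta_M:.1f} kN.m")
```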
Abstract:
In a Wastewater Treatment Plant (WWTP), the costs not only of treating the wastewater but also of maintaining the equipment are high, so there is a strong incentive to use processes capable of transforming the residues into useful products. Anaerobic Digestion (AD) is a currently available process capable of contributing to the reduction of environmental pollution while valorizing the by-products generated. During the AD process a gas, biogas, is produced, which can be used as an energy source, thus reducing the WWTP's energy dependence and its emission of greenhouse gases into the atmosphere. Optimizing the AD process of the sludge is essential for increasing biogas production, but the complexity of the process is an obstacle to its optimization. In this work, Artificial Neural Networks (ANNs) were applied to the AD process of WWTP sludge. ANNs are simplified models inspired by the functioning of human neuronal cells, which acquire knowledge through experience. Once an ANN is created and trained, it produces approximately correct output values for the inputs provided. This was the motivation for using ANNs to optimize biogas production in digester I of SIMRIA's ETAR Norte (North WWTP), using the NeuralTools program from Palisade to develop the ANNs. To this end, data from the last four years of the digester's operation were analyzed and processed. The results obtained led to the conclusion that the modeled ANNs show a good capacity to generalize the AD process. This case study is considered promising, providing a good basis for the development of possibly more general ANN models which, applied together with the operating characteristics of a digester and the AD process, will make it possible to optimize biogas production in WWTPs.
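A minimal sketch of the modeling approach described above is given below, with hypothetical digester variables and synthetic data; the thesis itself used Palisade's NeuralTools on four years of plant records.

```python
# Small feedforward ANN regressor for a biogas-production target.
# Feature names, network size and data are assumptions for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# hypothetical inputs: sludge feed, temperature, pH, volatile solids
X = rng.random((500, 4))
y = X @ np.array([0.4, 0.3, 0.2, 0.1]) + 0.05 * rng.standard_normal(500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out data:", ann.score(X_te, y_te))  # generalization check
```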
Abstract:
This document presents a tool able to automatically gather data provided by real energy markets, generate scenarios, and capture and improve market players' profiles and strategies by using knowledge discovery processes in databases, supported by artificial intelligence techniques, data mining algorithms and machine learning methods. It provides the means for generating scenarios with different dimensions and characteristics, ensuring the representation of real and adapted markets and their participating entities. The scenario generator module enhances the MASCEM (Multi-Agent Simulator of Competitive Electricity Markets) simulator, making it a more effective tool for decision support. The implementation of the proposed module enables researchers and electricity market participants to analyze data, create realistic scenarios and run experiments on them. On the other hand, applying knowledge discovery techniques to real data also allows improving MASCEM agents' profiles and strategies, resulting in a better representation of real market players' behavior. This work aims to improve the comprehension of electricity markets and the interactions among the involved entities through adequate multi-agent simulation.
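One knowledge-discovery step of the kind described above can be sketched as clustering players' historical bid records into representative profiles for scenario generation. The features and data below are hypothetical; the actual MASCEM module combines several mining and learning methods.

```python
# Deriving player profiles by clustering bid features (illustrative only).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# hypothetical per-player features: mean bid price, bid volume, price volatility
bids = rng.random((200, 3))

profiles = KMeans(n_clusters=4, n_init=10, random_state=0).fit(bids)
for c in range(4):
    members = bids[profiles.labels_ == c]
    print(f"profile {c}: {len(members)} players, "
          f"centroid {profiles.cluster_centers_[c].round(2)}")
```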
Abstract:
Power systems have been through deep changes in recent years, namely due to the operation of competitive electricity markets in the scope of the increasingly intensive use of renewable energy sources and distributed generation. This requires new business models able to cope with the new opportunities that have emerged. Virtual Power Players (VPPs) are a new type of player that aggregates a diversity of players (Distributed Generation (DG), Storage Agents (SA), Electric Vehicles (V2G) and consumers) to facilitate their participation in electricity markets and to provide a set of new services promoting generation and consumption efficiency, while improving players' benefits. A major task of VPPs is the remuneration of generation and services (maintenance, market operation costs and energy reserves), as well as charging for energy consumption. This paper proposes a model to implement fair and strategic remuneration and tariff methodologies, able to allow efficient VPP operation and the accomplishment of VPP goals in the scope of electricity markets.
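An illustrative sketch of a VPP settlement of the kind discussed above: members are paid for delivered energy and reserve at per-member tariffs, and the VPP retains a margin to cover operation costs. All figures and the tariff structure are hypothetical, not the paper's model.

```python
# Toy VPP settlement: per-member remuneration under assumed tariffs.
members = [
    {"name": "wind_dg", "energy_mwh": 12.0, "reserve_mwh": 1.0, "tariff": 55.0},
    {"name": "storage", "energy_mwh": 3.0,  "reserve_mwh": 2.5, "tariff": 70.0},
    {"name": "v2g",     "energy_mwh": 0.8,  "reserve_mwh": 0.5, "tariff": 65.0},
]
reserve_price = 20.0   # EUR/MWh paid for standing reserve (assumed)
vpp_margin = 0.05      # share retained for market operation costs (assumed)

for m in members:
    m["payment"] = (1 - vpp_margin) * (m["tariff"] * m["energy_mwh"]
                                       + reserve_price * m["reserve_mwh"])
print({m["name"]: round(m["payment"], 2) for m in members})
```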
Abstract:
Electricity markets are not only a new reality but an evolving one, as the involved players and rules change at a relatively high rate. Multi-agent simulation combined with artificial intelligence techniques can produce very helpful, sophisticated tools. This paper presents a new methodology for the management of coalitions in electricity markets. The approach is tested using the multi-agent market simulator MASCEM (Multi-Agent Simulator of Competitive Electricity Markets), taking advantage of its ability to model and simulate Virtual Power Players (VPPs). VPPs are represented as coalitions of agents capable of negotiating both in the market and internally with their members, in order to combine and manage the members' individual characteristics and goals with the strategy and objectives of the VPP itself. A case study using real data from the Iberian Electricity Market is performed to validate and illustrate the proposed approach.
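A toy sketch of the internal/external split described above: the coalition commits its cheapest members first and prices the aggregated market offer at the marginal member's cost. Data and pricing rule are hypothetical, not MASCEM's negotiation logic.

```python
# Aggregating member capacities into a single VPP market offer (illustrative).
members = [("hydro", 20.0, 45.0), ("solar", 8.0, 50.0), ("chp", 5.0, 62.0)]
demand = 30.0                          # MWh the VPP wants to offer (assumed)

members.sort(key=lambda m: m[2])       # commit cheapest members first
offer, committed, marginal = 0.0, [], 0.0
for name, capacity, cost in members:
    take = min(capacity, demand - offer)
    if take <= 0:
        break
    committed.append((name, take))
    offer += take
    marginal = cost                    # offer priced at the marginal member

print(f"VPP offer: {offer} MWh at {marginal} EUR/MWh, split {committed}")
```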