991 results for sensor technique
Abstract:
Nowadays, the incredible growth of the mobile device market has led to the need for location-aware applications. However, a person's location is sometimes difficult to obtain, since most of these devices only have a GPS (Global Positioning System) chip to retrieve location. To overcome this limitation and provide location everywhere (even where no structured environment exists), a wearable inertial navigation system is proposed, which is a convenient way to track people in situations where other localization systems fail. The system combines pedestrian dead reckoning with GPS, using widely available, low-cost and low-power hardware components. The system's innovation is the information fusion and the use of probabilistic methods to learn a person's gait behavior in order to correct, in real time, the drift errors produced by the sensors.
Abstract:
Nowadays there is an increasing number of location-aware mobile applications. However, these applications retrieve location only through the mobile device's GPS chip, which means that indoors, or in denser environments, they do not work properly. To provide location information everywhere, a pedestrian Inertial Navigation System (INS) is typically used, but such systems can show a large estimation error because, to keep the system wearable, they rely on low-cost and low-power sensors. In this work a pedestrian INS is proposed in which force sensors are combined with accelerometer data to better detect the stance phase of the human gait cycle, leading to improvements in location estimation. Besides sensor fusion, an information fusion architecture is proposed, based on information from GPS and from several inertial units placed on the pedestrian's body, which is used to learn the pedestrian's gait behavior and correct, in real time, the inertial sensor errors, thus improving location estimation.
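The abstract does not spell out the stance-detection rule; the sketch below is only a plausible illustration of how accelerometer and force-sensor readings could be combined to flag the stance phase (the thresholds and signal layout are assumptions, not the thesis' actual method):

```python
import numpy as np

def detect_stance(acc, force, g=9.81, acc_tol=0.4, force_min=20.0):
    """Flag stance-phase samples by combining two cues:
    (1) accelerometer magnitude close to gravity (foot nearly still), and
    (2) foot force sensor above a contact threshold.
    acc: (N, 3) accelerations in m/s^2; force: (N,) foot force readings.
    All thresholds are illustrative only."""
    acc_mag = np.linalg.norm(acc, axis=1)
    still = np.abs(acc_mag - g) < acc_tol   # low dynamic acceleration
    contact = force > force_min             # foot in contact with the ground
    return still & contact                  # stance when both cues agree
```

During the detected stance intervals a zero-velocity update is typically applied, which is what bounds the drift of the integrated inertial measurements.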
Abstract:
One of the most challenging tasks underlying many hyperspectral imagery applications is linear unmixing. The key to linear unmixing is to find the set of reference substances, also called endmembers, that are representative of a given scene. This paper presents vertex component analysis (VCA), a new method to unmix linear mixtures of hyperspectral sources. The algorithm is unsupervised and exploits a simple geometric fact: endmembers are the vertices of a simplex. The algorithm complexity, measured in floating-point operations, is O(n), where n is the sample size. The effectiveness of the proposed scheme is illustrated using simulated data.
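As a rough illustration of the geometric idea behind VCA (not the paper's full algorithm, which also handles noise and SNR-dependent projections), a bare-bones sketch of the vertex-finding loop might look like this:

```python
import numpy as np

def vca_sketch(Y, p, seed=0):
    """Toy sketch of the VCA idea: repeatedly project the data onto a direction
    orthogonal to the subspace spanned by the endmembers found so far and take
    the most extreme pixel, which is a vertex of the data simplex.
    Y: (bands, pixels) data matrix; p: number of endmembers to extract."""
    rng = np.random.default_rng(seed)
    bands, _ = Y.shape
    A = np.zeros((bands, p))
    A[:, 0] = rng.standard_normal(bands)     # arbitrary starting direction
    indices = []
    for i in range(p):
        w = rng.standard_normal(bands)
        S = A[:, :max(i, 1)]                 # subspace spanned so far
        f = w - S @ np.linalg.pinv(S) @ w    # component of w orthogonal to S
        f /= np.linalg.norm(f)
        v = f @ Y                            # project every pixel onto f
        k = int(np.argmax(np.abs(v)))        # most extreme pixel = vertex
        indices.append(k)
        A[:, i] = Y[:, k]                    # add it to the spanned subspace
    return Y[:, indices], indices
```

Each iteration costs one projection of all n pixels onto a single direction, consistent with the O(n) complexity stated above.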
Abstract:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18].

Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and the subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data.

In most cases, however, the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, to feature extraction, and to unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of the observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
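In the generic notation usually adopted for this setting (the symbols here are not necessarily those of the chapter), the linear mixing model with its abundance constraints reads

\[
\mathbf{y} = \mathbf{M}\,\boldsymbol{\alpha} + \mathbf{n},
\qquad \alpha_i \ge 0, \quad \sum_{i=1}^{p} \alpha_i = 1,
\]

where \(\mathbf{y}\) is the observed L-band pixel spectrum, \(\mathbf{M}\) is the L x p matrix whose columns are the endmember signatures, \(\boldsymbol{\alpha}\) collects the abundance fractions, and \(\mathbf{n}\) accounts for noise. The sum-to-one constraint on \(\boldsymbol{\alpha}\) is precisely the source of the statistical dependence among abundances discussed next.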
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades the ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, the source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, the sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance.

Considering the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms, such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45], still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data.

Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced.

This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of the mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using the minimum description length (MDL) based algorithm [55].
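As a concrete illustration of the dimensionality-reduction step mentioned above (a generic PCA projection, not the particular implementation used in the chapter), the data can be projected onto its leading principal components before unmixing:

```python
import numpy as np

def pca_reduce(Y, k):
    """Project (bands x pixels) hyperspectral data onto its k leading principal
    components, the usual preprocessing step before unmixing."""
    mean = Y.mean(axis=1, keepdims=True)
    Yc = Y - mean                              # remove the mean spectrum
    cov = Yc @ Yc.T / Yc.shape[1]              # band covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    U = eigvecs[:, ::-1][:, :k]                # top-k principal directions
    return U.T @ Yc, U, mean                   # reduced data, basis, mean
```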
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where the abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need to have pure pixels in the observations.

The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief overview of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
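For reference, the Dirichlet density underlying the blind unmixing scheme of Section 6.7 has the standard form (generic notation)

\[
p(\boldsymbol{\alpha}\mid\boldsymbol{\theta})
= \frac{\Gamma\!\left(\sum_{i=1}^{p}\theta_i\right)}{\prod_{i=1}^{p}\Gamma(\theta_i)}
\prod_{i=1}^{p}\alpha_i^{\theta_i-1},
\qquad \alpha_i \ge 0,\quad \sum_{i=1}^{p}\alpha_i = 1,
\]

so the positivity and full-additivity constraints are built into the prior itself rather than imposed afterwards.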
Abstract:
Norfloxacin (NFX) is an antibacterial antibiotic indicated against Gram-negative bacteria and widely used to treat respiratory and urinary tract infections. The need for clinical and pharmacological studies has driven the development of fast and sensitive analytical methods for the determination of Norfloxacin. In this work, a new sensitive and selective electrochemical sensor was developed for the detection of NFX. The sensor was built by modifying a glassy carbon electrode. The electrode was first modified by depositing a suspension of multi-walled carbon nanotubes (MWCNT) to increase the sensitivity of the analytical response. A molecularly imprinted polymer (MIP) film was then prepared by electrodeposition from a solution containing pyrrole (functional monomer) and NFX (template). A non-imprinted control electrode (NIP) was also prepared. The electrochemical response of the sensor to NFX oxidation was studied and characterized by square-wave voltammetry. Several experimental parameters were optimized, such as the polymerization, incubation, and extraction conditions. The sensor shows a linear relationship between peak current intensity and the logarithm of the NFX concentration in the range from 0.1 to 8 µM. The results show good precision, with repeatability below 6% and reproducibility below 9%. A detection limit of 0.2 µM was calculated from the calibration curve. The developed method is selective, fast, and easy to handle. The molecularly imprinted sensor was successfully applied to the detection of NFX in real urine and water samples.
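As a purely illustrative post-processing sketch (the calibration points below are invented, not the thesis' data), the reported log-linear calibration can be fitted and inverted as follows:

```python
import numpy as np

# Hypothetical calibration points: NFX concentration (µM) vs. square-wave
# voltammetry peak current (µA); the numbers are illustrative, not measured data.
conc = np.array([0.1, 0.5, 1.0, 2.0, 4.0, 8.0])
peak_current = np.array([0.8, 1.5, 1.9, 2.4, 2.9, 3.5])

# The abstract reports a linear relation between peak current and log(concentration):
# i_p = a * log10(c) + b
a, b = np.polyfit(np.log10(conc), peak_current, 1)

def nfx_concentration(i_p):
    """Invert the calibration to estimate NFX concentration (µM) from a peak current (µA)."""
    return 10 ** ((i_p - b) / a)

print(round(nfx_concentration(2.0), 2))  # read off an unknown sample, for example
```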
Abstract:
Dissertation presented to obtain the Master's degree in Structural and Functional Biochemistry at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.
Abstract:
Visceral larva migrans (VLM) is a clinical syndrome caused by infection of man by Toxocara spp., the common roundworm of dogs and cats. Tissue migration of the larval stages causes illness, especially in children. Because the larvae are difficult to detect in tissues, diagnosis is mostly based on serology. After the introduction of the enzyme-linked immunosorbent assay (ELISA) using the larval excretory-secretory antigen of T. canis (TES), diagnostic specificity improved greatly, although cross-reactivity with other helminths is still being reported. In Brazil, diagnosis is routinely made after absorption of serum samples with antigens of Ascaris suum, a nematode antigenically related to Ascaris lumbricoides, which is a common intestinal nematode of children. In order to identify T. canis antigens that cross-react with A. suum antigens, we analyzed the TES antigen by SDS-PAGE and Western blotting. When we used serum samples from patients suspected of VLM with a positive ELISA result, as well as a reference serum sample, numerous bands were seen (molecular weights of 210-200 kDa, 116-97 kDa, 55-50 kDa and 35-29 kDa). Among these there is at least one band, with a molecular weight of around 55-66 kDa, that seems to be responsible for the cross-reactivity between T. canis and A. suum, since it disappears when prior absorption of the serum samples with A. suum antigens is performed.
Abstract:
Environmental monitoring is essential for decision-making in both science and industry. In particular, since water is essential to life and the Earth's surface is mostly covered by water, monitoring the climate and water-related parameters in sensitive ecosystems such as oceans, lagoons, rivers, and lakes is of the utmost importance. One of the most common methods of monitoring water is to deploy buoys. The present work is part of a broader project whose goal is to design and develop an autonomous buoy for scientific research with two operating modes: (i) environmental monitoring; and (ii) active regatta mark. The buoy thus has two main applications: data collection and storage, and support for regattas of autonomous sailboats. The project started two years ago with a group of four international students, who designed and built the physical structure, bought and assembled the buoy's mooring system, and chose most of the electronic components for the overall control and measurement system. This year, during the first semester, two Belgian students, Jeroen Vervenne and Hendrick Verschelde, worked on the data collection and storage subsystem (slave control unit) and on the telemetry and configuration subsystem (master control unit), and defined the application-level communication protocol. This thesis continues the development of the telemetry and configuration subsystem, which is responsible for configuring the operating mode and the sensors, as well as for communicating with the base station (environmental monitoring), with the boats (active regatta mark), and with the data collection and storage subsystem. The development of the data collection and storage subsystem, which gathers the data from the selected sensors and stores them on an SD card, is being continued by another Belgian student, Mathias van Flieberge. The goal of this thesis is, on the one hand, to implement the telemetry and configuration subsystem on the master control unit and, on the other hand, to refine and implement, together with Mathias van Flieberge, the application-level protocol that was designed. In particular, the master control unit must process and prioritize the messages received from the base station, request data from the slave control unit, and broadcast messages with position, wind, and water information in regatta mode. While the communication between the master control unit and the base station, and between the master control unit and the boats, is wireless, the master and slave control units communicate over a serial link. The buoy currently has two limitations: (i) the maximum payload is 40 kg; and (ii) it can only be used in rivers or close to the coast, given the range limitation imposed by the chosen wireless communication technique.
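The application-level protocol itself is not described in the abstract; the following is a purely hypothetical sketch of the kind of framing the master control unit could use over the serial link to the slave unit (message types, priorities, and the frame layout are assumptions, not the project's actual design):

```python
import struct
from dataclasses import dataclass

# Hypothetical message identifiers; the protocol defined in the project may differ.
MSG_REQUEST_DATA = 0x01        # master -> slave: request buffered sensor data
MSG_REGATTA_BROADCAST = 0x02   # master -> boats: position / wind / water info

@dataclass
class Frame:
    msg_type: int
    priority: int              # lower value = handled first by the master unit
    payload: bytes

    def encode(self) -> bytes:
        # [type:1][priority:1][length:2][payload][xor checksum:1]
        header = struct.pack(">BBH", self.msg_type, self.priority, len(self.payload))
        body = header + self.payload
        checksum = 0
        for b in body:
            checksum ^= b
        return body + bytes([checksum])

# Example: the master asks the slave unit for the latest stored sensor records.
request = Frame(MSG_REQUEST_DATA, priority=0, payload=b"last:10").encode()
```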
Abstract:
Dissertation presented to the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa in fulfilment of the requirements for the Master's degree in Biomedical Engineering.
Abstract:
We report an adaptation of a technique for blood sample collection (GFM), as well as for the extraction and amplification of Plasmodium DNA, for the diagnosis of malaria infection by PCR/ELISA. The method of blood sample collection requires less expertise and saves both time and money, reducing the cost by more than half. The material is also suitable for genetic analysis of either fresh or stored specimens prepared by this method.
Abstract:
Three GST fusion recombinant antigens of Treponema pallidum, described as GST-rTp47, GST-rTp17 and GST-rTp15, were analyzed by Western blotting. We tested 53 serum samples: 25 from patients at different clinical stages of syphilis, all of them presenting anti-treponemal antibodies, 25 from healthy blood donors, and three from patients with a sexually transmitted disease (STD) other than syphilis. Almost all samples from patients with syphilis presented strong reactivity with the GST-rTp17 antigen. Some samples were non-reactive or showed a weak reaction with GST-rTp47 and/or GST-rTp15, and apparently there was no correlation with the stage of disease. There was no seropositivity among blood donors. No sample reacted with purified GST. We conclude that, due to their specificity, these recombinant antigens can be used as GST fusion proteins for the development of syphilis diagnostic assays.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the Master's degree in Conservation and Restoration.
Abstract:
Ammonia is an important gas in many power plants and industrial processes, so its detection is of great importance in environmental monitoring and process control due to its high toxicity. Ammonia's threshold limit is 25 ppm for an exposure time of 8 h, whereas exposure to 35 ppm is only safe for 10 min. In this work, a brief introduction to ammonia is presented, covering its physical and chemical properties, the hazards of handling it, its production routes, and its sources. The application areas in which ammonia gas detection is important and needed are also reviewed: environmental gas analysis (e.g., intensive farming) and the automotive, chemical, and medical industries. Monitoring ammonia gas in these different areas imposes requirements that determine the choice of sensor; accordingly, several types of sensors with different characteristics have been developed, such as metal oxide, surface acoustic wave, catalytic, and optical sensors, indirect gas analyzers, and conducting polymers. All sensor types are described, but more attention is given to polyaniline (PANI), particularly its characteristics, synthesis, chemical doping processes, deposition methods, transduction modes, and adhesion to inorganic materials. Short descriptions of PANI nanostructures, the use of electrospinning to form nanofibers/microfibers, and graphene and its characteristics are also included. The sensor created here pursues a goal of the medical community: monitoring ammonia levels in breath as an easy, non-invasive method for the diagnosis of kidney malfunction and/or gastric ulcers. For that, the device should be capable of detecting different ammonia gas concentrations. Thus, in the present work an ammonia gas sensor was developed using a conductive polymer composite immobilized on a carbon transducer surface. The experiments targeted ammonia measurements at the ppb level, and ammonia gas measurements were carried out in the concentration range from 1 ppb to 500 ppb. A commercial substrate was used: screen-printed carbon electrodes. After adequate surface pre-treatment of the substrate, its electrodes were covered with a nanofibrous polymeric composite. Conducting polyaniline doped with sulfuric acid (H2SO4) was blended with reduced graphene oxide (RGO) obtained by wet chemical synthesis, and this composite formed the basis for the formation of nanofibers by electrospinning; the nanofibers increase the sensitivity of the sensing material. The electrospun PANI-RGO fibers were placed on the substrate and then dried at ambient temperature. Amperometric measurements were performed at different ammonia gas concentrations (1 to 500 ppb). The I-V characteristics were recorded and some interfering gases (NO2, ethanol, and acetone) were studied. The gas samples were prepared in a custom setup and diluted with dry nitrogen gas. Electrospun nanofibers of the PANI-RGO composite showed enhanced NH3 gas detection compared with electrospun PANI nanofibers alone, with a larger resistance range over concentrations from 1 to 500 ppb. The sensor was also observed to have stable, reproducible, and recoverable properties, as well as better response and recovery times. The new sensing material of the developed sensor proved to be a good candidate for ammonia gas determination.
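As one illustrative way of quantifying the response and recovery behaviour reported above (the signal layout, the exposure timing, and the assumption that resistance rises under NH3, as is typical for acid-doped PANI, are all ours, not the thesis' actual analysis):

```python
import numpy as np

def response_metrics(t, R, t_on, t_off, level=0.9):
    """Relative response and 90% response time of a chemiresistive NH3 sensor,
    computed from a resistance-vs-time trace.
    t: time stamps (s); R: resistance samples; [t_on, t_off]: gas exposure window."""
    R0 = R[t < t_on].mean()                       # baseline before NH3 exposure
    R_gas = R[(t >= t_on) & (t <= t_off)].max()   # plateau under NH3 exposure
    response = (R_gas - R0) / R0                  # relative resistance change
    target = R0 + level * (R_gas - R0)            # 90% of the full change
    reached = (t >= t_on) & (R >= target)
    t_resp = t[np.argmax(reached)] - t_on if reached.any() else np.nan
    return response, t_resp
```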
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the Master's degree in Conservation and Restoration, specialization in painting on canvas.
Abstract:
Dissertation to obtain the Master's degree in Biomedical Engineering.