978 results for Antigen-Antibody Complex -- analysis
Abstract:
Journal of Bacteriology (Apr 2006) 3024-3036
Abstract:
The aim of this work was to assess the influence of meteorological conditions on the dispersion of particulate matter from an industrial zone into urban and suburban areas. The particulate matter concentration was related to the most important meteorological variables, such as wind direction, velocity and frequency. A coal-fired power plant, with two stacks 225 m high, was considered to be the main emission source. The midpoint between the two stacks was taken as the centre of two concentric circles, with radii of 6 and 20 km, delimiting the sampling area. About 40 sampling collectors were placed within this area. Meteorological data were obtained from a portable meteorological station placed approximately 1.7 km SE of the stacks. Additional data were obtained from the electrical company that runs the coal power plant; these data cover the years from 2006 to the present. A detailed statistical analysis was performed to identify the most frequent meteorological conditions, mainly concerning wind speed and direction. This analysis revealed that the most frequent winds blow from the northwest and north, and that the strongest winds blow from the northwest. Particulate matter deposition was measured in two sampling campaigns, carried out in summer and in spring. For the first campaign the monthly average deposition flux was 1.90 g/m2, and for the second campaign it was 0.79 g/m2. Wind dispersion occurred predominantly from north to south, away from the nearest residential area, located about 6 km northwest of the stacks. Nevertheless, the highest deposition fluxes occurred in the NW/N and NE/E quadrants. This study considered only the contribution of particulate matter from coal combustion; however, other sources, such as road traffic, may be present as well. Additional chemical analyses and microanalysis are needed to link the sources to the deposition flux levels.
Abstract:
Sera from 299 fishermen aged 16 to 80 years, residents of the Cananeia and Iguape counties on the southern coast of São Paulo State, Brazil, were studied in order to identify a possible association between the prevalence of specific antibodies to the hepatitis B virus (HBV) and exposure to haematophagous mosquitoes, evaluated by the prevalence of arbovirus antibodies. This professional group presented the highest prevalence of arbovirus antibodies (54.1%) in past investigations carried out in this heavily forested region. Antibody to hepatitis B core antigen (anti-HBc) was detected in the sera by enzyme immunoassay (Roche). The prevalence of anti-HBc antibodies in this group was 31.4% (94/299), which is very high compared with the 7.2% to 15.0% found for different groups of healthy adults in the State of São Paulo. No significant difference was observed between the prevalences of HBV antibodies in Iguape and Cananeia. The prevalence of anti-HBc and anti-arbovirus antibodies increased with age, and the distributions of anti-HBc and anti-arbovirus positive sera across age groups were concordant. HBsAg was detected in 4% of the studied sera. These results support the hypothesis that the transmission of the hepatitis B virus and the arboviruses may be due to the same factor, one possibility being anthropophilic mosquitoes.
Abstract:
In an attempt to be as close as possible to the infected and treated patients of areas endemic for schistosomiasis (S. mansoni), and in order to achieve a long follow-up period, mice were repeatedly infected with low numbers of cercariae. Survival data and histological variables such as schistosomal granuloma, portal changes, hepatocellular necrosis, hepatocellular regeneration, schistosomal pigment, periductal fibrosis and, chiefly, bile duct changes were analysed in infected treated and non-treated mice. Oxamniquine chemotherapy in repeatedly infected mice prolonged survival significantly compared with non-treated animals (chi-square 9.24, p = 0.0024), confirming previous results from a similar experimental model with a shorter follow-up. Furthermore, mortality decreased rapidly after treatment, suggesting an abrupt reduction in the severity of hepatic lesions. A morphological and immunohistochemical study of the liver was carried out. Portal fibrosis, with a pattern resembling human Symmers fibrosis, was present at a late phase in the infected animals. Bile duct lesions were quite close to those described in human Mansonian schistosomiasis. Schistosomal antigen was observed in one isolated altered bile duct cell. The pathogenesis of the bile duct changes and their relation to the parasite infection and/or its antigens are discussed.
Abstract:
Proceedings of the Information Technology Applications in Biomedicine, Ioannina - Epirus, Greece, October 26-28, 2006
Abstract:
A serum sample obtained from a 12-year-old male patient suffering from Guillain-Barré syndrome (GBS) was positive for human T-lymphotropic virus type I (HTLV-I) antibody by enzyme-linked immunosorbent assay (ELISA) and Western blot (WB) analysis. Attempts to isolate enteroviruses (including poliovirus) from faecal material in both tissue culture and suckling mice were unsuccessful; in addition, acute and convalescent paired serum samples showed no evidence of recent poliovirus infection when tested against the three serotypes. Specific tests for the detection of Epstein-Barr virus infection were not performed; however, the Paul-Bunnell test yielded negative results. ELISA for the detection of anti-cytomegalovirus IgM was also negative. The concomitant occurrence of either adult T cell leukemia (ATL) or lymphoma was not recorded in this case.
Abstract:
Measurements in civil engineering load tests usually require considerable time and complex procedures. As a result, measurements are usually constrained by the number of sensors, yielding a restricted monitored area. Image processing is an alternative that enables measurement over the complete area of interest with a simple and effective setup. In this article, photo sequences taken during load displacement tests were captured by a digital camera and processed with image correlation algorithms. Three different image processing algorithms were applied to real images taken from tests on PVC and Plexiglas specimens. The data obtained from the image processing algorithms were also compared with the data from physical sensors. Complete displacement and strain maps were obtained. Results show that the accuracy of the measurements obtained by photogrammetry is equivalent to that of the physical sensors, but with much less equipment and fewer setup requirements.
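As a rough illustration of the image-correlation principle used here, the sketch below tracks a reference subset between two frames with zero-mean normalized cross-correlation; the peak of the correlation gives the local displacement. This is a minimal didactic version, not the authors' algorithms, and all names and window sizes are illustrative.

    # Minimal digital-image-correlation sketch: match a reference subset in
    # the deformed frame by normalized cross-correlation (illustrative only).
    import numpy as np

    def ncc(a, b):
        """Zero-mean normalized cross-correlation of two equal-size patches."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
        return (a * b).sum() / denom if denom > 0 else 0.0

    def track_subset(ref, cur, y, x, half=15, search=10):
        """Displacement (dy, dx) of the subset centred at (y, x) in `ref`."""
        tpl = ref[y - half:y + half + 1, x - half:x + half + 1]
        best, best_dy, best_dx = -np.inf, 0, 0
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                win = cur[y + dy - half:y + dy + half + 1,
                          x + dx - half:x + dx + half + 1]
                score = ncc(tpl, win)
                if score > best:
                    best, best_dy, best_dx = score, dy, dx
        return best_dy, best_dx

Repeating this over a grid of subset centres yields the full-field displacement map; strains then follow from its numerical gradient.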
Abstract:
A previous seroepidemiological study in the rural zone of Vargem Alta (ES), southeastern Brazil, showed a prevalence of up to 9% of hepatitis B surface antigen (HBsAg) in some areas. One hundred susceptible children aged 1 to 5 years were selected and immunized with a recombinant DNA hepatitis B vaccine (Smith-Kline, 20 mcg) using the 0-1-6 months vaccination schedule. Blood samples were collected at the time of the first vaccine dose (month 0), in order to confirm susceptibility, and 1, 3, 6 and 8 months after the first dose, to evaluate the antibody response. Our results showed that two and five months after the second dose, 79% and 88% of children had seroconverted, respectively, reaching 97% after the third dose. The levels of anti-HBs, expressed in milli-International Units/ml (mIU/ml), demonstrated a marked increase in protective antibody levels after the third dose. These data show good immunogenicity of the recombinant DNA hepatitis B vaccine when administered to children of endemic areas.
Abstract:
Two groups of patients undergoing maintenance hemodialysis (HD) were evaluated for their antibody response to the non-structural c100/3 protein and the structural core protein of hepatitis C virus (HCV). Forty-six patients (Group 1) never presented liver abnormalities during HD treatment, while 52 patients (Group 2) had either current or prior liver enzyme elevations. Prevalence rates of 32.6% and 41.3% were found for anti-c100/3 and anti-HCV core antibodies, respectively, in patients with silent infections (Group 1). In Group 2, the rate of anti-c100/3 was 71.15% and reached 86.5% for anti-HCV core antibodies. The recognition of anti-c100/3 and anti-core antibodies was significantly higher in Group 2 than in Group 1. A line immunoassay composed of structural and non-structural peptides was used as a confirmatory assay. HBV infection, measured by the presence of anti-HBc antibodies, was observed in 39.8% of the patients; six were chronic HBsAg carriers and 13 had naturally acquired anti-HBs antibodies. The duration of HD treatment was correlated with anti-HCV positivity: a high prevalence of 96.7% (Group 2) was found in patients who underwent more than 5 years of treatment. Our results suggest that the anti-HCV core ELISA is more accurate for detecting HCV infection than anti-c100/3. Although the risk associated with the duration of HD treatment and blood transfusion was high, additional factors, such as significant non-transfusional spread of HCV, seem to play a role as well. The identification of infective patients by more sensitive methods for HCV genome detection should help to control the transmission of HCV in the unit under study.
Abstract:
The study evaluated six Plasmodium falciparum antigen extracts to be used in IgG and IgM enzyme-linked immunosorbent assays (ELISA) for malaria diagnosis and epidemiological studies. Results obtained with eighteen positive and nine negative control sera indicated statistically significant differences among these antigen extracts (multifactor ANOVA, p < 0.0001). The urea, sodium deoxycholate and Zwittergent antigen extracts performed better than the other three for the detection of IgG antibodies, with very similar features. The urea, alkaline and sodium deoxycholate antigen extracts proved to be better than the others for the detection of IgM antibodies. A straight-line relationship was found between the optical densities (or their respective log10) and the log10 of the antibody dilutions, with a very constant slope. Serum titers could thus be determined by direct titration or by two different equations, requiring only one serum dilution. For IgM antibody detection, the log10 expression gave results that correlated better with direct titration (95% Bonferroni); for IgG antibody detection, the titer differences were not significant. The reproducibility of antibody titers and antigen batches was also evaluated, giving satisfactory results.
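The abstract does not reproduce the two titration equations, but given the reported log-linear relation between optical density and log10 dilution with a near-constant slope, a single-dilution titre estimate can be sketched as below. The cutoff OD, slope, and example values are assumptions for illustration, not the paper's parameters.

    # Single-dilution titre estimation from a log-linear OD/dilution relation:
    # log10(OD) = a + slope * log10(dilution). The titre is the dilution at
    # which OD falls to the cutoff. All numbers are illustrative assumptions.
    import math

    def titre_from_one_dilution(od, dilution, slope, od_cutoff):
        """Extrapolate the dilution at which OD would reach the cutoff."""
        log_titre = math.log10(dilution) + \
            (math.log10(od_cutoff) - math.log10(od)) / slope
        return 10 ** log_titre

    # Example: OD 1.2 at a 1:200 dilution, slope -0.8, cutoff OD 0.2.
    print(round(titre_from_one_dilution(1.2, 200, -0.8, 0.2)))  # ~1880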
Abstract:
This paper analyses forest fires from the perspective of dynamical systems. Forest fires exhibit complex correlations in size, space and time, revealing features often present in complex systems, such as the absence of a characteristic length scale and the emergence of long-range correlations and persistent memory. This study addresses a public domain forest fire catalogue containing information on events in Portugal during the period from 1980 up to 2012. The data are analysed on an annual basis, modelling the occurrences as sequences of Dirac impulses with amplitude proportional to the burnt area. First, we use mutual information to correlate annual patterns, and visualization trees, generated by hierarchical clustering algorithms, to compare and extract relationships among the data. Second, we adopt the Multidimensional Scaling (MDS) visualization tool. MDS generates maps where each object corresponds to a point, and objects perceived as similar to each other form clusters on the map. The results are analysed in order to extract relationships among the data and to identify forest fire patterns.
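The MDS step can be illustrated with a short sketch: given pairwise dissimilarities between annual patterns (here assumed to be derived from normalized mutual information, e.g. d = 1 - I), MDS places each year as a point so that map distances approximate the dissimilarities. The random matrix stands in for the catalogue-derived values; this is not the paper's code.

    # MDS over mutual-information-based dissimilarities (illustrative data).
    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(0)
    n_years = 33                            # 1980-2012
    sim = rng.random((n_years, n_years))    # stand-in for normalized MI
    sim = (sim + sim.T) / 2                 # symmetrize
    np.fill_diagonal(sim, 1.0)
    dist = 1.0 - sim                        # similarity -> dissimilarity

    coords = MDS(n_components=2, dissimilarity='precomputed',
                 random_state=0).fit_transform(dist)
    # Nearby points are years with similar fire-occurrence patterns.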
Abstract:
This paper shows several ways to analyse the performance of a safety barrier, depending on the objective to be achieved, and presents a method to analyse the binary components usually present in the sensor systems of safety barriers. An application example of a water-based fire system is presented, and the Probability of Failure on Demand (PFD) of the sensor system is determined based on the analysis of the pressure switches installed in this safety barrier. This information allows the determination of the safety barrier's availability.
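For a single periodically proof-tested binary component such as a pressure switch, the standard low-demand approximation is PFD_avg ≈ λ_DU·τ/2, with λ_DU the dangerous-undetected failure rate and τ the proof-test interval. The sketch below applies this textbook approximation with illustrative numbers; it is not the paper's method or data.

    # Low-demand PFD approximations for a proof-tested component (assumed data).
    lam_du = 2.0e-6    # dangerous undetected failures per hour (illustrative)
    tau = 8760.0       # proof-test interval: one year, in hours

    pfd_1oo1 = lam_du * tau / 2         # single switch
    pfd_1oo2 = (lam_du * tau) ** 2 / 3  # 1-out-of-2 pair, no common cause

    print(f"PFD_avg (1oo1): {pfd_1oo1:.2e}")
    print(f"PFD_avg (1oo2): {pfd_1oo2:.2e}")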
Abstract:
The investigation of the web of relationships between the different elements of the immune system has proven instrumental in better understanding this complex biological system. This is particularly true of the interactions between B and T lymphocytes, both during cellular development and at the stage of cellular effector functions. Understanding the B–T cell interdependency, and the possibility of manipulating this relationship, may be directly applicable to situations where immunity is deficient, as is the case in cancer or in immune suppression after radio- and chemotherapy. The work presented here started with the development of a novel and accurate tool to directly assess the diversity of the cellular repertoire (Chapter III). Contractions of T cell receptor diversity have been related to a deficient immune status. This method uses gene chip platforms to which nucleic acids coding for lymphocyte receptors are hybridized, and is based on the fact that the frequency of hybridization of nucleic acids to the oligonucleotides on a gene chip varies in direct proportion to diversity. Subsequently, using this new method and other cell quantification techniques, I examined, in an animal model, the role that polyclonal B cells and immunoglobulin exert upon T cell development in the thymus, specifically on the acquisition of broader repertoire diversity by the T cell receptors (Chapters IV and V). The hypothesis tested was whether the presence of more diverse peptides in the thymus, namely polyclonal immunoglobulin, would induce the generation of more diverse T cell precursors. The results demonstrated that the diversity of the T cell compartment is increased by the presence of polyclonal immunoglobulin. Polyclonal immunoglobulin, and particularly the Fab fragments of the molecule, represents the most diverse set of self-molecules in the body, and its peptides are presented by antigen-presenting cells to precursor T cells in the thymus during their development. This probably contributes significantly to the generation of receptor diversity. Furthermore, we also demonstrated that a more diverse repertoire of T lymphocytes is associated with more effective and robust T cell immune function in vivo, as mice with more diverse T cell receptors reject minor-histocompatibility-discordant skin grafts faster than mice with a shrunken T cell receptor repertoire (Chapter V). We believe that broader T cell receptor diversity allows more efficient recognition and rejection of a wider range of external and internal aggressions. In this work it is demonstrated that a reduction of TCR diversity by thymectomy in wild-type mice significantly increased the survival of H-Y incompatible skin grafts, indicating a decrease in T cell function. In addition, reconstitution of T cell diversity with immunoglobulin Fab fragments, in mice with decreased T cell repertoire diversity, led to an increase in TCR diversity and to significantly decreased survival of the skin grafts (Chapter V). These results strongly suggest that increases in T cell repertoire diversity contribute to improved T cell function. Our results may have important implications for therapy and immune reconstitution in the context of AIDS, cancer, autoimmunity and post-myeloablative treatments.
Based on the previous results, we tested the clinical hypothesis that patients with haematological malignancies subjected to stem cell transplantation who recovered a robust immune system would have better survival than patients who did not. This study was undertaken by examining the progression and overall survival of 42 patients with mantle cell non-Hodgkin lymphoma receiving autologous hematopoietic stem cell transplantation (Chapter VI). The results show that patients who recovered higher numbers of lymphocytes soon after autologous transplantation had statistically significantly longer progression-free and overall survival. These results demonstrate the positive impact that a more robust reconstitution of the immune system after stem cell transplantation may have upon the survival of patients with haematological malignancies. In a similar clinical research framework, this dissertation also includes a study of the impact of recovering normal serum levels of polyclonal immunoglobulin on the survival of patients with another B cell haematological malignancy, multiple myeloma, after autologous stem cell transplantation (Chapter VII). The relapse-free survival of the 110 patients with multiple myeloma analysed was associated with their ability to recover normal serum levels of the polyclonal immunoglobulin compartment. These results again suggest the important effect of polyclonal immunoglobulin on the (re)generation of immune competence. We also studied the impact of robust immunity on the response to treatment with the anti-CD20 antibody rituximab in patients with non-Hodgkin's lymphoma (NHL) (Chapter VIII). Patients with higher absolute counts of CD4+ T lymphocytes responded better (in terms of longer progression-free survival) to rituximab than patients with lower numbers of CD4+ T lymphocytes. These observations again highlight the fact that a competent immune system is required for the clinical benefit of rituximab therapy in NHL patients. In conclusion, the work presented in this dissertation demonstrates, for the first time, that diverse B cells and polyclonal immunoglobulin promote T cell diversification in the thymus and improve T lymphocyte function. It also shows that in the setting of immune reconstitution, as after autologous stem cell transplantation for mantle cell lymphoma and in the setting of immune therapy for NHL, the absolute lymphocyte counts are an independent factor predicting progression-free and overall survival. These results can have an important application in clinical practice, since the majority of current cancer treatments are immunosuppressive and entail subsequent immune recovery. Moreover, the effects of a number of antineoplastic treatments, including biological agents, depend on immune system activity. Studies similar to the ones presented here, in which methods to improve immune reconstitution are examined, may thus prove instrumental for a better understanding of the immune system and may guide more efficient treatment options and the design of future clinical trials.
Abstract:
The development of high-spatial-resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial resolution element at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located in that resolution element. This chapter addresses hyperspectral unmixing, i.e., the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]; the nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17], whereas the nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, by the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown by Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases, the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data into statistically independent components. Given that hyperspectral data are, under given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
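Before turning to the second approach, the linear mixing model and the orthogonal subspace projection described above can be made concrete with a minimal numeric sketch; dimensions and data are illustrative, and this is not the chapter's implementation.

    # Linear mixing model y = M a + n, and OSP detection of one endmember.
    import numpy as np

    rng = np.random.default_rng(1)
    L, p = 224, 4                        # spectral bands, endmembers
    M = rng.random((L, p))               # endmember signatures (columns)
    a = rng.dirichlet(np.ones(p))        # abundances: nonnegative, sum to one
    y = M @ a + 0.01 * rng.standard_normal(L)   # observed mixed pixel

    d = M[:, 0]                          # signature of interest
    U = M[:, 1:]                         # undesired signatures
    P = np.eye(L) - U @ np.linalg.pinv(U)    # projector orthogonal to span(U)
    estimate = d @ P @ y / (d @ P @ d)   # OSP/least-squares abundance estimate
    print(estimate, "vs true", a[0])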
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps: first, source densities and noise covariance are estimated from the observed data by maximum likelihood; second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique for unmixing independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises IFA performance, as in the ICA case. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are computationally complex: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at lower computational complexity, some algorithms, such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45], still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets; in any case, these algorithms find the set of purest pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, the processing of hyperspectral data, including unmixing, is very often preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations; to overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55].
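The dimensionality-reduction step just mentioned can be sketched with PCA via the SVD of the centred data; shapes and data are illustrative, not from the chapter.

    # PCA-style reduction of a flattened hyperspectral cube (illustrative).
    import numpy as np

    rng = np.random.default_rng(2)
    n_pixels, L, p = 10000, 224, 5
    X = rng.random((n_pixels, L))        # stand-in for the spectral vectors

    Xc = X - X.mean(axis=0)              # centre the data
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    X_reduced = Xc @ Vt[:p].T            # keep the p strongest components
    print(X_reduced.shape)               # (10000, 5)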
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix that minimizes the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms on experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
Abstract:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix that minimizes the mutual information among the sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions; nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data.
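As a brief aside before the MVT variants, the constrained least-squares route mentioned above can be sketched for one pixel: estimate abundances a ≥ 0 with sum(a) = 1, given known signatures M. The sketch uses a generic solver with illustrative data; it is not any of the cited algorithms.

    # Fully constrained least-squares unmixing of one pixel (illustrative).
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    L, p = 224, 4
    M = rng.random((L, p))               # known endmember signatures
    a_true = rng.dirichlet(np.ones(p))
    y = M @ a_true + 0.005 * rng.standard_normal(L)

    res = minimize(lambda a: np.sum((M @ a - y) ** 2),
                   x0=np.full(p, 1.0 / p),
                   bounds=[(0, 1)] * p,
                   constraints={'type': 'eq', 'fun': lambda a: a.sum() - 1})
    print(np.round(res.x, 3), "vs true", np.round(a_true, 3))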
The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. MVT-type approaches are computationally complex: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at lower computational complexity, some algorithms, such as the pixel purity index (PPI) [35] and N-FINDR [40], still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets; in any case, these algorithms find the set of purest pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored, and a cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that, in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than the volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
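The skewer-counting idea behind PPI, as summarized above, reduces to a few lines; sizes and data are illustrative, not the published implementation.

    # Pixel purity index sketch: count how often each pixel is an extreme
    # of the projection onto random skewers (illustrative data).
    import numpy as np

    rng = np.random.default_rng(4)
    n_pixels, d, n_skewers = 5000, 10, 1000
    X = rng.random((n_pixels, d))        # pixels after MNF reduction

    counts = np.zeros(n_pixels, dtype=int)
    for _ in range(n_skewers):
        skewer = rng.standard_normal(d)
        proj = X @ skewer
        counts[np.argmin(proj)] += 1     # extremes along this skewer
        counts[np.argmax(proj)] += 1

    purest = np.argsort(counts)[::-1][:20]   # candidate pure pixels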
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists of flat-fielding the spectra. Next, the exemplar selection module is used to select the spectral vectors that best represent the smaller convex cone containing the data; the other pixels are rejected when their spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from unmanned aerial vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the purest pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices, the latter based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]; we note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex, and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined; the new endmember signature corresponds to the extreme of this projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR, yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Sections 19.3 and 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
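The iterative projection at the core of VCA, as described above, can be sketched as follows. This is a simplified didactic version under the pure-pixel assumption, not the published algorithm.

    # VCA-style endmember extraction sketch (illustrative, simplified).
    import numpy as np

    def vca_sketch(X, p, seed=0):
        """X: (n_pixels, d) data after subspace projection; p: endmembers."""
        rng = np.random.default_rng(seed)
        _, d = X.shape
        E = np.zeros((d, p))                 # endmember signatures (columns)
        for i in range(p):
            w = rng.standard_normal(d)       # random direction
            if i > 0:
                A = E[:, :i]
                # make w orthogonal to the endmembers found so far
                w = w - A @ np.linalg.pinv(A) @ w
            idx = np.argmax(np.abs(X @ w))   # extreme of the projection
            E[:, i] = X[idx]
        return E

    # Usage: E = vca_sketch(X_reduced, p=5)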