960 results for equivalent web thickness method


Relevance: 20.00%

Abstract:

This paper investigates local fractional Poisson equations in two independent variables, which arise in mathematical physics involving local fractional derivatives. Approximate solutions in terms of nondifferentiable functions are obtained using the local fractional variational iteration method.
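The classical variational iteration method approaches such problems through a correction functional; a hedged sketch for the ordinary Poisson equation u_xx + u_yy = f(x, y) is given below. The local fractional variant replaces the integral and derivatives with their local fractional (order-alpha) counterparts; this illustrates the general scheme, not the paper's exact formulation.

```latex
% Correction functional of the variational iteration method (sketch):
% \tilde{u}_n denotes a restricted variation; \lambda is the Lagrange
% multiplier, identified via variational theory.
u_{n+1}(x, y) = u_n(x, y) + \int_0^x \lambda(s) \left(
    \frac{\partial^2 u_n}{\partial s^2}
  + \frac{\partial^2 \tilde{u}_n}{\partial y^2} - f(s, y) \right) \mathrm{d}s
```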

Relevance: 20.00%

Abstract:

Dissertation presented for obtaining the degree of Doctor in Electrical and Computer Engineering – Digital and Perceptual Systems, Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance: 20.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for obtaining the Master's degree in Informatics Engineering (Engenharia Informática).

Relevance: 20.00%

Abstract:

Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral sensors acquire hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GBs per flight. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is one of the most important tasks in hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive and power-consuming, which compromises their use in applications under onboard constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point processing performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for onboard data processing. In this paper, we propose a parallel implementation of an augmented Lagrangian based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The efficient implementation of the SISAL method presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses.
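SISAL itself is beyond a short sketch, but the linear mixing model it assumes is simple to illustrate. The sketch below (numpy only, all names hypothetical) builds synthetic mixed pixels and recovers the abundances by least squares when the endmember signatures are known; SISAL's harder job is estimating the signatures themselves without pure pixels.

```python
import numpy as np

# Minimal sketch of the linear mixing model underlying spectral unmixing.
# Each observed pixel y is a combination of endmember signatures:
# y = M @ a, with abundances a >= 0 summing to 1.
# (Illustrative only -- SISAL estimates M itself; here M is known.)
rng = np.random.default_rng(0)
bands, endmembers, pixels = 50, 3, 100

M = rng.random((bands, endmembers))               # endmember signatures
A = rng.dirichlet(np.ones(endmembers), pixels).T  # true abundances (sum to 1)
Y = M @ A                                         # noise-free mixed pixels

# With known endmembers, abundances follow from least squares.
A_hat = np.linalg.lstsq(M, Y, rcond=None)[0]

print(np.allclose(A, A_hat, atol=1e-8))           # abundances recovered
```

Because each pixel's system is independent, this per-pixel structure is what makes the problem attractive for GPU parallelization.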

Relevance: 20.00%

Abstract:

Performance evaluation plays an increasingly important role in any organizational environment. In the transport sector, drivers are the company's public face, so it is important to develop and increase their performance and commitment to the company's goals. Such an evaluation can be used to motivate drivers to improve their performance and to discover training needs. This work aims to create a performance appraisal model for drivers based on the multi-criteria decision aid methodology. The MMASSI (Multicriteria Methodology to Support the Selection of Information Systems) methodology was adapted using a template supporting the evaluation according to the freight transportation company under study. The evaluation process involved all drivers (the collaborators being evaluated), their supervisors, and the company management. The final output is a ranking of the drivers, based on their performance, for each of the scenarios used.
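The core of such a multicriteria ranking can be sketched as a weighted sum over criteria. The criteria names, weights, and scores below are hypothetical and not taken from the MMASSI template; the sketch only shows the ranking mechanics.

```python
# Hedged sketch of a weighted-sum multicriteria ranking, in the spirit of
# MCDA driver appraisal. Criteria and weights are hypothetical.
scores = {
    "driver_a": {"punctuality": 8, "fuel_economy": 6, "safety": 9},
    "driver_b": {"punctuality": 7, "fuel_economy": 9, "safety": 7},
}
weights = {"punctuality": 0.4, "fuel_economy": 0.3, "safety": 0.3}

# Each driver's overall score is the weight-adjusted sum of criterion scores.
ranked = sorted(
    scores,
    key=lambda d: sum(weights[c] * v for c, v in scores[d].items()),
    reverse=True,
)
print(ranked)
```

A full MCDA method adds scenario-dependent weights and normalization, but the final output is still a ranking like this one.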

Relevance: 20.00%

Abstract:

Remote hyperspectral sensors collect large amounts of data per flight, usually with low spatial resolution. Since the bandwidth of the connection between the satellite/airborne platform and the ground station is limited, an onboard compression method is desirable to reduce the amount of data to be transmitted. This paper presents a parallel implementation of a compressive sensing method, called parallel hyperspectral coded aperture (P-HYCA), for graphics processing units (GPUs) using the compute unified device architecture (CUDA). This method takes into account two main properties of hyperspectral datasets, namely the high correlation among the spectral bands and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. Experimental results conducted using synthetic and real hyperspectral datasets on two different NVIDIA GPU architectures, the GeForce GTX 590 and GeForce GTX TITAN, reveal that the use of GPUs can provide real-time compressive sensing performance. The achieved speedup is up to 20 times when compared with the processing time of HYCA running on one core of an Intel i7-2600 CPU (3.4 GHz) with 16 GB of memory.
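The key idea, that few endmembers imply few measurements suffice, can be sketched with a toy low-rank example. This is not the HYCA algorithm (which uses coded apertures and constrained optimization); it only demonstrates, under hypothetical dimensions, why a low-dimensional spectral subspace lets far fewer measurements reconstruct the data exactly.

```python
import numpy as np

# Hedged sketch: compressive measurement + least-squares recovery when the
# data live in a low-dimensional subspace (few endmembers).
rng = np.random.default_rng(1)
bands, rank, pixels, measurements = 60, 4, 200, 12

E = rng.random((bands, rank))                 # low-rank spectral basis
X = E @ rng.random((rank, pixels))            # hyperspectral data, rank 4

Phi = rng.standard_normal((measurements, bands))  # measurement matrix
Y = Phi @ X                                   # 12 measurements instead of 60

# Knowing the subspace E, recover X from far fewer measurements:
coeff = np.linalg.lstsq(Phi @ E, Y, rcond=None)[0]
X_hat = E @ coeff
print(np.allclose(X, X_hat, atol=1e-6))
```

In practice the subspace is not known in advance; methods like HYCA estimate it jointly with the reconstruction.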

Relevance: 20.00%

Abstract:

Work presented within the scope of the Master's programme in Informatics Engineering (Engenharia Informática), as a partial requirement for obtaining the Master's degree in Informatics Engineering.

Relevance: 20.00%

Abstract:

Abstract: The investigation of the web of relationships between the different elements of the immune system has proven instrumental to better understand this complex biological system. This is particularly true in the case of the interactions between B and T lymphocytes, both during cellular development and at the stage of cellular effector functions. The understanding of the B–T cell interdependency and the possibility to manipulate this relationship may be directly applicable to situations where immunity is deficient, as is the case in cancer or immune suppression after radio- and chemotherapy. The work presented here started with the development of a novel and accurate tool to directly assess the diversity of the cellular repertoire (Chapter III). Contractions of T cell receptor diversity have been related to a deficient immune status. This method uses gene chip platforms to which nucleic acids coding for lymphocyte receptors are hybridized, and is based on the fact that the frequency of hybridization of nucleic acids to the oligonucleotides on a gene chip varies in direct proportion to diversity. Subsequently, using this new method and other techniques of cell quantification, I examined, in an animal model, the role that polyclonal B cells and immunoglobulin exert upon T cell development in the thymus, specifically on the acquisition of a broader repertoire diversity by the T cell receptors (Chapters IV and V). The hypothesis tested was whether the presence of more diverse peptides in the thymus, namely polyclonal immunoglobulin, would induce the generation of more diverse T cell precursors. The results obtained demonstrated that the diversity of the T cell compartment is increased by the presence of polyclonal immunoglobulin.
Polyclonal immunoglobulin, and particularly the Fab fragments of the molecule, represents the most diverse set of self-molecules in the body, and its peptides are presented by antigen-presenting cells to precursor T cells in the thymus during their development. This probably contributes significantly to the generation of receptor diversity. Furthermore, we also demonstrated that a more diverse repertoire of T lymphocytes is associated with a more effective and robust T cell immune function in vivo, as mice with more diverse T cell receptors reject minor histocompatibility discordant skin grafts faster than mice with a shrunken T cell receptor repertoire (Chapter V). We believe that a broader T cell receptor diversity allows a more efficient recognition and rejection of a wider range of external and internal aggressions. In this work it is demonstrated that a reduction of TCR diversity by thymectomy in wild-type mice significantly increased survival of H-Y incompatible skin grafts, indicating a decrease in T cell function. In addition, reconstitution of T cell diversity with immunoglobulin Fab fragments in mice with a decreased T cell repertoire diversity led to an increase in TCR diversity and to a significantly decreased survival of the skin grafts (Chapter V). These results strongly suggest that increases in T cell repertoire diversity contribute to improved T cell function. Our results may have important implications for therapy and immune reconstitution in the context of AIDS, cancer, autoimmunity, and post-myeloablative treatments. Based on the previous results, we tested the clinical hypothesis that patients with haematological malignancies subjected to stem cell transplantation who recovered a robust immune system would have a better survival compared to patients who did not.
This study was undertaken by examining the progression and overall survival of 42 patients with mantle cell non-Hodgkin lymphoma receiving autologous hematopoietic stem cell transplantation (Chapter VI). The results obtained show that patients who recovered higher numbers of lymphocytes soon after autologous transplantation had statistically significantly longer progression-free and overall survival. These results demonstrate the positive impact that a more robust immune system reconstitution after stem cell transplantation may have upon the survival of patients with haematological malignancies. In a similar clinical research framework, this dissertation also includes the study of the impact of recovering normal serum levels of polyclonal immunoglobulin on the survival of patients with another B cell haematological malignancy, multiple myeloma, after autologous stem cell transplantation (Chapter VII). The relapse-free survival of the 110 patients with multiple myeloma analysed was associated with their ability to recover normal serum levels of the polyclonal compartment of immunoglobulin. These results again suggest the important effect of polyclonal immunoglobulin on the (re)generation of immune competence. We also studied the impact of a robust immunity on the response to treatment with the anti-CD20 antibody rituximab in patients with non-Hodgkin's lymphoma (NHL) (Chapter VIII). Patients with higher absolute counts of CD4+ T lymphocytes responded better (in terms of longer progression-free survival) to rituximab compared to patients with lower numbers of CD4+ T lymphocytes. These observations again highlight the fact that a competent immune system is required for the clinical benefit of rituximab therapy in NHL patients. In conclusion, the work presented in this dissertation demonstrates, for the first time, that diverse B cells and polyclonal immunoglobulin promote T cell diversification in the thymus and improve T lymphocyte function.
Also, it shows that in the setting of immune reconstitution, as after autologous stem cell transplantation for mantle cell lymphoma and in the setting of immune therapy for NHL, the absolute lymphocyte counts are an independent factor predicting progression-free and overall survival. These results can have an important application in clinical practice, since the majority of the current treatments for cancer are immunosuppressive and entail a subsequent immune recovery. Also, the effects of a number of antineoplastic treatments, including biological agents, depend on the immune system's activity. In this way, studies similar to the ones presented here, where methods to improve immune reconstitution are examined, may prove instrumental for a better understanding of the immune system and may guide more efficient treatment options and the design of future clinical trials.
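Repertoire "diversity" can be quantified in several ways; a common one is the Shannon index over clone frequencies. The sketch below is a hypothetical illustration of what a broader versus a shrunken repertoire means numerically; it is not the chip-based hybridization assay the dissertation develops.

```python
import math

# Hypothetical illustration: Shannon index as one measure of T cell
# receptor repertoire diversity (the dissertation's gene-chip assay is
# a different, hybridization-based estimate).
def shannon_diversity(clone_counts):
    total = sum(clone_counts)
    freqs = [c / total for c in clone_counts if c > 0]
    return -sum(f * math.log(f) for f in freqs)

broad = [1] * 1000        # 1000 equally frequent clones
shrunken = [500, 500]     # repertoire collapsed to two clones

print(shannon_diversity(broad) > shannon_diversity(shrunken))  # True
```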

Relevance: 20.00%

Abstract:

The choice of an information system is a critical success factor in an organization's performance; involving multiple decision-makers with often conflicting objectives, and several alternatives backed by aggressive marketing, makes reaching a consensus particularly complex. The main objective of this work is the analysis and selection of an information system to support school management, in its pedagogical and administrative components, using a multicriteria decision aid system – MMASSITI (Multicriteria Methodology to Support the Selection of Information Systems/Information Technologies) – which integrates a multicriteria model that seeks to provide a systematic approach to the choice of information systems, able to produce sustained recommendations within the decision scope. Its application to a case study identified the relevant factors in the selection of a school educational and management information system and yielded a solution that allows the decision maker to compare the quality of the various alternatives.

Relevance: 20.00%

Abstract:

Project work presented as a partial requirement for obtaining the Master's degree in Geographic Information Science and Systems.

Relevance: 20.00%

Abstract:

One of the main problems of hyperspectral data analysis is the presence of mixed pixels due to the low spatial resolution of such images. Linear spectral unmixing aims at inferring pure spectral signatures and their fractions at each pixel of the scene. The huge data volumes acquired by hyperspectral sensors put stringent requirements on processing and unmixing methods. This letter proposes an efficient implementation of the method called simplex identification via split augmented Lagrangian (SISAL), which exploits the graphics processing unit (GPU) architecture at a low level using the Compute Unified Device Architecture. SISAL aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The proposed implementation is performed in a pixel-by-pixel fashion using coalesced accesses to memory and exploiting shared memory to store temporary data. Furthermore, the kernels have been optimized to minimize thread divergence, therefore achieving high GPU occupancy. The experimental results obtained for the simulated and real hyperspectral data sets reveal speedups up to 49 times, which demonstrates that the GPU implementation can significantly accelerate the method's execution over big data sets while maintaining the method's accuracy.

Relevance: 20.00%

Abstract:

The parallel hyperspectral unmixing problem is considered in this paper. A semisupervised approach is developed under the linear mixture model, where the physical constraints on abundances are taken into account. The proposed approach relies on the increasing availability of spectral libraries of materials measured on the ground instead of resorting to endmember extraction methods. Since libraries are potentially very large and hyperspectral datasets are of high dimensionality, a parallel implementation in a pixel-by-pixel fashion is derived that properly exploits the graphics processing unit (GPU) architecture at a low level, thus taking full advantage of the computational power of GPUs. Experimental results obtained for real hyperspectral datasets reveal significant speedup factors, up to 164 times, with regard to an optimized serial implementation.

Relevance: 20.00%

Abstract:

Many hyperspectral imagery applications require a response in real time or near-real time. To meet this requirement, this paper proposes a parallel unmixing method developed for graphics processing units (GPUs). The method is based on vertex component analysis (VCA), a geometry-based method that is highly parallelizable. VCA is a very fast and accurate method that extracts endmember signatures from large hyperspectral datasets without using any a priori knowledge about the constituent spectra. Experimental results obtained for simulated and real hyperspectral datasets reveal considerable acceleration factors, up to 24 times.
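The geometric intuition behind VCA-style extraction can be sketched in a few lines: under the pure-pixel assumption, endmembers are vertices of the data simplex, so extreme points along projections recover them. The toy example below (2-D, three hypothetical "endmembers") uses random projection directions in the spirit of VCA/PPI; the actual VCA uses orthogonal-subspace projections and is far more structured.

```python
import numpy as np

# Toy sketch: pure pixels are the extreme points of the data cloud.
rng = np.random.default_rng(3)
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # 3 "endmembers"
w = rng.dirichlet(np.ones(3), 500)                      # mixture weights
data = np.vstack([w @ verts, verts])                    # mixtures + pure pixels

found = set()
for _ in range(200):
    d = rng.standard_normal(2)        # random projection direction
    proj = data @ d
    found.add(int(proj.argmax()))     # extremes along d are simplex vertices
    found.add(int(proj.argmin()))

# Only the pure pixels (rows 500-502) can be extreme points.
print(sorted(i for i in found if i >= 500))
```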

Relevance: 20.00%

Abstract:

The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]; the nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17], whereas the nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures.
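That projection step can be sketched directly. The example below (hypothetical dimensions and signatures) builds the standard annihilating projector P = I - U(UᵀU)⁻¹Uᵀ for the undesired signatures U, so that applying P to a mixed pixel leaves only the target signature's contribution.

```python
import numpy as np

# Sketch of orthogonal subspace projection (OSP): project each pixel onto
# the subspace orthogonal to the undesired signatures U, so only the
# target signature's contribution survives.
rng = np.random.default_rng(5)
bands = 30
U = rng.random((bands, 2))        # undesired endmember signatures
t = rng.random(bands)             # target signature

P = np.eye(bands) - U @ np.linalg.inv(U.T @ U) @ U.T   # annihilates U
pixel = 0.6 * t + 0.3 * U[:, 0] + 0.1 * U[:, 1]        # mixed pixel

residual = P @ pixel              # undesired components removed
print(np.allclose(residual, 0.6 * (P @ t)))
```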
As shown by Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases, the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises ICA applicability to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance.
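The dependence induced by the sum-to-one constraint is easy to verify numerically: sampling abundance vectors from a Dirichlet distribution (a common model for fractions that sum to one) yields negatively correlated components, which directly contradicts ICA's independence assumption. A minimal check, with hypothetical dimensions:

```python
import numpy as np

# Sketch of why ICA's independence assumption fails for abundances:
# the sum-to-one constraint forces negative correlation among fractions.
rng = np.random.default_rng(4)
abund = rng.dirichlet(np.ones(3), 10000)   # each row sums to 1
corr = np.corrcoef(abund.T)                # 3x3 correlation matrix

# Off-diagonal correlations are negative (about -0.5 for 3 components).
print(bool((corr[0, 1] < 0) and (corr[0, 2] < 0)))
```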
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and the noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, IFA performance. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at lower computational complexity, some algorithms, such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45], still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR).
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations; to overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm.
This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.

Relevance: 20.00%

Abstract:

Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the Master's degree in Audiovisual and Multimedia.