995 results for ALPHA DATA CUBES


Relevance:

100.00%

Publisher:

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it has become possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we will call tomograms. The association of the tomograms (images) with the eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this information is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, filter noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not previously known. Furthermore, we show that it is displaced from the centre of its stellar bulge.
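As a rough illustration of the decomposition described above, the sketch below reshapes a cube into a spectra-by-pixels matrix, extracts the eigenvectors (eigenspectra) via SVD, and projects the data back to form tomograms. The cube is a random stand-in and all names are illustrative; this is not the authors' code.

```python
import numpy as np

# Stand-in for a real observation: axes (y, x, wavelength).
cube = np.random.rand(64, 64, 200)
ny, nx, nwav = cube.shape

X = cube.reshape(ny * nx, nwav)   # one spectrum per row
X = X - X.mean(axis=0)            # subtract the mean spectrum

# SVD of the zero-mean data: rows of Vt are the eigenvectors
# (eigenspectra), ordered by decreasing variance.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Tomograms: projection of every spectrum onto each eigenvector,
# reshaped back into images (one image per principal component).
tomograms = (X @ Vt.T).reshape(ny, nx, -1)

# Fraction of the total variance carried by each component.
variance_fraction = s**2 / np.sum(s**2)
```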

Relevance:

100.00%

Publisher:

Abstract:

* Partially supported by the Bulgarian National Science Fund under Grant MM-1405/2004

Relevance:

100.00%

Publisher:

Abstract:

We obtained new Fabry-Perot data cubes and derived velocity fields, monochromatic maps, and velocity dispersion maps for 28 galaxies in the Hickson compact groups 37, 40, 47, 49, 54, 56, 68, 79, and 93. We also derived rotation curves for 9 of the studied galaxies, 6 of which are strongly asymmetric. Combining these new data with previously published 2D kinematic maps of compact group galaxies, we investigated the differences between the kinematic and morphological position angles for a sample of 46 galaxies. We find that one third of the unbarred compact group galaxies have position angle misalignments between the stellar and gaseous components. This, together with the asymmetric rotation curves, is a clear signature of kinematic perturbations, probably caused by interactions among compact group galaxies. A comparison between the B-band Tully-Fisher relation for compact group galaxies and for the GHASP field-galaxy sample shows that, despite the high fraction of compact group galaxies with asymmetric rotation curves, these lie on the Tully-Fisher relation defined by galaxies in less dense environments, although with more scatter. This agrees with previous results, now confirmed for a larger sample of 41 galaxies. We confirm the tendency for compact group galaxies at the low-mass end of the Tully-Fisher relation (HCG 49b, 89d, 96c, 96d, and 100c) to have a magnitude that is too bright for their mass (suggesting brightening by star formation) and/or a maximum rotational velocity that is too low for their luminosity (suggesting tidal stripping). These galaxies lie outside the Tully-Fisher relation at the 1 sigma level, even when the minimum acceptable values of the inclinations are used to compute their maximum velocities. Including such galaxies with v < 100 km s(-1) in the determination of the zero point and slope of the compact group B-band Tully-Fisher relation would strongly change the fit, making it different from the relation for field galaxies, which has to be kept in mind when studying scaling relations of interacting galaxies, especially at high redshifts.
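For context, the B-band Tully-Fisher relation is a straight line in absolute magnitude versus log maximum velocity. The sketch below, with entirely made-up sample values rather than data from this paper, shows how excluding galaxies below 100 km/s changes the fitted slope and zero point, which is the caveat raised above.

```python
import numpy as np

# Made-up absolute B magnitudes and maximum rotation velocities (km/s);
# real values would come from the Fabry-Perot velocity fields.
M_B = np.array([-18.2, -19.5, -20.1, -20.8, -21.3, -17.6])
v_max = np.array([95.0, 140.0, 175.0, 210.0, 250.0, 80.0])

# Tully-Fisher relation: M_B = a + b * log10(v_max).
b, a = np.polyfit(np.log10(v_max), M_B, 1)

# Refit excluding low-velocity outliers (v < 100 km/s), as the abstract
# cautions; the slope and zero point shift noticeably.
keep = v_max >= 100.0
b_cut, a_cut = np.polyfit(np.log10(v_max[keep]), M_B[keep], 1)
```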

Relevance:

100.00%

Publisher:

Abstract:

We studied, for the first time, the near-infrared, stellar, and baryonic Tully-Fisher relations for a sample of field galaxies taken from a homogeneous Fabry-Perot survey [the Gassendi HAlpha survey of SPirals (GHASP)]. The main advantage of GHASP over other samples is that the maximum rotational velocities were estimated from 2D velocity fields, avoiding assumptions about the inclination and position angle of the galaxies. By combining these data with 2MASS photometry, optical colours, HI masses, and different mass-to-light ratio estimators, we found slopes of 4.48 +/- 0.38 and 3.64 +/- 0.28 for the stellar and baryonic Tully-Fisher relations, respectively. We found that these values do not change significantly when different mass-to-light ratio recipes are used. We also point out, for the first time, that rising rotation curves, as well as asymmetric ones, show a larger dispersion about the Tully-Fisher relation than flat or symmetric curves. Using the baryonic mass and the optical radius of the galaxies, we found that the surface baryonic mass density is almost constant for all the galaxies in this sample. We also emphasize the presence of a break in the NIR Tully-Fisher relation at M(H,K) ~ -20, and we confirm that late-type galaxies present higher total-to-baryonic mass ratios than early-type spirals, suggesting that supernova feedback is indeed an important factor in late-type spirals. Owing to the well-defined sample selection criteria and the homogeneity of the data analysis, the Tully-Fisher relation for GHASP galaxies can be used as a reference for the study of this relation in other environments and at higher redshifts.
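A minimal sketch of how a baryonic mass might be assembled from the quantities mentioned (a luminosity, a mass-to-light ratio estimator, and the HI mass). The ~1.4 helium correction factor and all numbers here are illustrative assumptions, not values from the paper.

```python
def baryonic_mass(lum, ml_ratio, m_hi):
    """Stellar mass as luminosity times a mass-to-light ratio,
    plus gas mass (HI scaled by ~1.4 to account for helium)."""
    return ml_ratio * lum + 1.4 * m_hi

# Illustrative solar-unit values: L = 3e10 L_sun, M/L = 0.6,
# M_HI = 5e9 M_sun.
print(f"M_bar = {baryonic_mass(3e10, 0.6, 5e9):.2e} M_sun")
```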

Relevance:

90.00%

Publisher:

Abstract:

Context. Compact groups of galaxies are entities with high galaxy densities that serve as laboratories for studying galaxy interactions, intergalactic star formation, and galaxy evolution. Aims. The main goal of this study is to search for young objects in the intragroup medium of seven compact groups of galaxies (HCG 2, 7, 22, 23, 92, 100, and NGC 92) and to evaluate the stage of interaction of each group. Methods. We used Fabry-Perot velocity fields and rotation curves together with GALEX NUV and FUV images and optical R-band and HI maps. Results. (i) HCG 7 and HCG 23 are in early stages of interaction; (ii) HCG 2 and HCG 22 are mildly interacting; and (iii) HCG 92, HCG 100, and NGC 92 are in late stages of evolution. We find that all three evolved groups contain populations of young blue objects in the intragroup medium, consistent with ages < 100 Myr, several of which are younger than 10 Myr. We also report the discovery of a tidal dwarf galaxy candidate in the tail of NGC 92. These three groups, besides containing galaxies with peculiar velocity fields, also show extended HI tails. Conclusions. Our results indicate that the advanced stage of evolution of a group, together with the presence of intragroup HI clouds, may lead to star formation in the intragroup medium. A table containing all intergalactic HII regions and tidal dwarf galaxies confirmed to date is appended.

Relevance:

90.00%

Publisher:

Abstract:

Cross sections for (120)Sn(alpha,alpha)(120)Sn elastic scattering have been extracted from the alpha-particle-beam contamination of a recent (120)Sn((6)He,(6)He)(120)Sn experiment. Both reactions are analyzed using systematic double-folding potentials in the real part and smoothly varying Woods-Saxon potentials in the imaginary part. The potential extracted from the (120)Sn((6)He,(6)He)(120)Sn data may be used as the basis for the construction of a simple global (6)He optical potential. The comparison of the (6)He and alpha data shows that the halo nature of the (6)He nucleus leads to a clear signature in the reflection coefficients eta(L): the relevant angular momenta L, those with eta(L) significantly above zero and below unity, are shifted to larger L with a broader distribution. This signature is not present in the alpha-scattering data and can thus be used as a new criterion for the definition of a halo nucleus.

Relevance:

90.00%

Publisher:

Abstract:

LaTeX was used to typeset this thesis.

Relevance:

90.00%

Publisher:

Abstract:

Kinematics is a fundamental tool for inferring the dynamical structure of galaxies and understanding their formation and evolution. Spectroscopic observations of gas emission lines are often used to derive rotation curves and velocity dispersions. It is, however, difficult to disentangle these two quantities in low spatial-resolution data because of beam smearing. In this thesis, we present 3D-Barolo, a new software package to derive the gas kinematics of disk galaxies from emission-line data cubes. The code builds tilted-ring models in the 3D observational space and compares them with the actual data cubes. 3D-Barolo works with data at a wide range of spatial resolutions without being affected by instrumental biases. We use 3D-Barolo to derive rotation curves and velocity dispersions of several galaxies in both the local and the high-redshift Universe. We run our code on HI observations of nearby galaxies and compare our results with traditional 2D approaches. We show that a 3D approach to deriving the gas kinematics is to be preferred over a 2D approach whenever a galaxy is resolved with fewer than about 20 elements across the disk. We moreover analyze a sample of galaxies at z~1, observed in the H-alpha line with the KMOS/VLT spectrograph. Our 3D modeling reveals that the kinematics of these high-z systems is comparable to that of local disk galaxies, with steeply rising rotation curves followed by a flat part and H-alpha velocity dispersions of 15-40 km/s over the whole disk. This evidence suggests that disk galaxies were already fully settled about 7-8 billion years ago. In summary, 3D-Barolo is a powerful and robust tool for separating physical from instrumental effects and deriving reliable kinematics. The analysis of large samples of galaxies at different redshifts with 3D-Barolo will provide new insights into how galaxies assemble and evolve throughout cosmic time.
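The tilted-ring description underlying codes of this kind models the line-of-sight velocity of gas on an inclined ring as v_los = v_sys + v_rot(R) cos(theta) sin(i). Below is a minimal sketch of that geometry with all names illustrative; it is not the actual 3D-Barolo API, which additionally builds full 3D model cubes and convolves them with the instrumental beam.

```python
import numpy as np

def los_velocity(x, y, vsys, vrot, inc, pa, x0=0.0, y0=0.0):
    """Line-of-sight velocity of a flat tilted-ring disk.

    vrot : callable giving rotation velocity as a function of radius.
    inc, pa : inclination and position angle in degrees.
    """
    inc, pa = np.radians(inc), np.radians(pa)
    # Rotate sky coordinates into the galaxy frame.
    xr = -(x - x0) * np.sin(pa) + (y - y0) * np.cos(pa)
    yr = -((x - x0) * np.cos(pa) + (y - y0) * np.sin(pa)) / np.cos(inc)
    r = np.hypot(xr, yr)
    cos_theta = np.divide(xr, r, out=np.zeros_like(r), where=r > 0)
    return vsys + vrot(r) * cos_theta * np.sin(inc)

# Example: rotation curve rising to a flat 200 km/s.
vrot = lambda r: 200.0 * np.tanh(r / 5.0)
y, x = np.mgrid[-50:51, -50:51].astype(float)
vfield = los_velocity(x, y, vsys=1500.0, vrot=vrot, inc=60.0, pa=30.0)
```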

Relevance:

80.00%

Publisher:

Abstract:

Three-dimensional spectroscopy techniques are becoming more and more popular, producing an increasing number of large data cubes. The challenge of extracting information from these cubes requires the development of new techniques for data processing and analysis. We apply the recently developed technique of principal component analysis (PCA) tomography to a data cube from the center of the elliptical galaxy NGC 7097 and show that this technique is effective in decomposing the data into physically interpretable information. We find that the first five principal components of our data are associated with distinct physical characteristics. In particular, we detect a low-ionization nuclear emission region (LINER) with a weak broad component in the Balmer lines. Two images of the LINER are present in our data: one seen through a disk of gas and dust, and the other after scattering by free electrons and/or dust particles in the ionization cone. Furthermore, we extract the spectrum of the LINER, decontaminated from stellar and extended nebular emission, using only the technique of PCA tomography. We anticipate that the light in the scattered image is polarized, owing to its scattered nature.
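One common use of the decomposition hinted at above is to rebuild the cube from only the first few principal components, filtering the remainder as noise. A minimal sketch with a random stand-in cube and illustrative names; the choice of five components mirrors the abstract, not a general rule.

```python
import numpy as np

# Stand-in cube; a real case would load the observation instead.
cube = np.random.rand(60, 60, 300)
ny, nx, nwav = cube.shape

X = cube.reshape(-1, nwav)
mean_spec = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean_spec, full_matrices=False)

k = 5  # keep the first five principal components
Xk = (U[:, :k] * s[:k]) @ Vt[:k] + mean_spec
filtered_cube = Xk.reshape(ny, nx, nwav)  # noise-filtered reconstruction
```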

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES: To describe the process of translation and linguistic and cultural validation of the Evidence Based Practice Questionnaire for the Portuguese context: Questionário de Eficácia Clínica e Prática Baseada em Evidências (QECPBE). METHOD: A methodological, cross-sectional study was developed. Translation and back-translation were performed according to traditional standards. Principal Component Analysis with orthogonal rotation by the Varimax method, followed by confirmatory factor analysis, was used to verify the QECPBE's psychometric characteristics. Internal consistency was determined by Cronbach's alpha. Data were collected between December 2013 and February 2014. RESULTS: 358 nurses delivering care in a hospital facility in northern Portugal participated in the study. The QECPBE contains 20 items and three subscales: Practice (α=0.74); Attitudes (α=0.75); Knowledge/Skills and Competencies (α=0.95), with an overall internal consistency of α=0.74. The tested model explained 55.86% of the variance and presented good fit: χ²(167)=520.009; p=0.0001; χ²/df=3.114; CFI=0.908; GFI=0.865; PCFI=0.798; PGFI=0.678; RMSEA=0.077 (90% CI=0.07-0.08). CONCLUSION: Confirmatory factor analysis revealed the questionnaire is valid and appropriate for use in the studied context.
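Cronbach's alpha, used above to quantify internal consistency, is alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch on made-up response data (the sample size and item count merely echo the abstract):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Made-up responses: 358 respondents, 20 items on a 1-7 scale.
rng = np.random.default_rng(0)
scores = rng.integers(1, 8, size=(358, 20)).astype(float)
print(cronbach_alpha(scores))
```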

Relevance:

80.00%

Publisher:

Abstract:

Inspired by the relational algebra of data processing, this paper addresses the foundations of data analytical processing from a linear algebra perspective. The paper investigates, in particular, how aggregation operations such as cross tabulations and data cubes, essential to the quantitative analysis of data, can be expressed solely in terms of matrix multiplication, transposition, and the Khatri–Rao variant of the Kronecker product. The approach offers a basis for deriving an algebraic theory of data consolidation, handling the quantitative as well as qualitative sides of data science in a natural, elegant, and typed way. It also shows potential for parallel analytical processing, as the parallelization theory of such matrix operations is well established.
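As a toy illustration of the idea, the sketch below builds Boolean projection matrices typed as (distinct values) x (records), obtains a cross tabulation as a single matrix product, and forms the column-wise Khatri–Rao product that projects each record onto an attribute pair. The encoding and names are illustrative, not the paper's notation.

```python
import numpy as np

# Toy dataset: each position is one record with two attributes.
color = ["red", "blue", "red", "red", "blue"]
size = ["S", "S", "L", "S", "L"]

def projection(values):
    """Boolean projection matrix of type (distinct values) x (records)."""
    domain = sorted(set(values))
    return np.array([[v == d for v in values] for d in domain], dtype=int), domain

P_color, colors = projection(color)  # 2 x 5
P_size, sizes = projection(size)     # 2 x 5

# Cross tabulation as one matrix product: counts per (color, size) pair.
crosstab = P_color @ P_size.T

# Column-wise Khatri-Rao product: projects each record onto its
# (color, size) pair, the building block of a data cube.
kr = np.stack([np.kron(P_color[:, j], P_size[:, j])
               for j in range(len(color))], axis=1)
assert np.array_equal(kr.sum(axis=1).reshape(crosstab.shape), crosstab)
```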

Relevance:

80.00%

Publisher:

Abstract:

The ability of multinational enterprises (MNEs) to adapt to the opportunities and threats of their markets is one of the most important strategic factors in today's business dynamics. This need for adaptation is not restricted to developed countries; it also applies to emerging ones. In this context, it becomes necessary to understand the strategy of multinationals from emerging countries through the study of dynamic capabilities and of their management models. The general objective of this work was therefore to analyze the development and transfer of dynamic capabilities between headquarters and subsidiaries. The theoretical foundations of the study were the resource-based view, organizational ambidexterity (exploration vs. exploitation), dynamic capabilities, and strategic management models of MNEs. The chosen definition of dynamic capabilities, "the organization's systematic abilities to integrate, build, and reconfigure its organizational competences according to market threats and opportunities", served as the basis for the entire research. Methodologically, a qualitative approach was adopted, with a single case study as the research method. After defining criteria for selecting the organization to be studied (Brazilian MNEs, technology sector, more than one subsidiary, and operations in several segments), the company Alpha was chosen. Data collection on Alpha drew on multiple sources, combining secondary information with in-depth interviews with key people in the organization, in order to investigate its processes, competences, and resources directly. Two dynamic capabilities were found at Alpha: "software development process" and "development of new services". The first is present at headquarters and in the subsidiaries; it was transferred in full to all subsidiaries, with integration and the competitive context as antecedent factors. The second is present only at headquarters, with entrepreneurial orientation and initiatives as its main antecedent factors; it was not transferred to any subsidiary. Analysis of the results led to the conclusion that both dynamic capabilities, especially the software development process, generate competitive advantage for Alpha.

Relevance:

80.00%

Publisher:

Abstract:

Background: The aim of this study was to assess clinical and inflammatory markers of nonalcoholic fatty liver disease (NAFLD) in postmenopausal women with metabolic syndrome. Methods: This cross-sectional study included 180 Brazilian women (age >= 45 years and amenorrhea >= 12 months). Metabolic syndrome was diagnosed by the presence of at least three of the following indicators: waist circumference (WC) > 88 cm; triglycerides (TGs) >= 150 mg/dL; high-density lipoprotein (HDL) < 50 mg/dL; blood pressure >= 130/85 mmHg; and glucose >= 100 mg/dL. NAFLD was diagnosed by abdominal ultrasound. Participants were divided into three groups: metabolic syndrome alone (n = 53); metabolic syndrome + NAFLD (n = 67); or absence of metabolic syndrome or NAFLD (control, n = 60). Clinical, anthropometric, and biochemical variables were quantified. The inflammatory profile included adiponectin, interleukin-6 (IL-6), and tumor necrosis factor-alpha (TNF-alpha). Data were submitted to statistical analysis using the Tukey test, analysis of variance (ANOVA), the chi-squared test, Pearson correlation, and logistic regression (odds ratio, OR). Results: Women with metabolic syndrome + NAFLD showed greater abdominal obesity, higher glucose, and greater insulin resistance by HOMA-IR than women with metabolic syndrome alone and controls (P < 0.05). High values of IL-6 and TNF-alpha and low values of adiponectin were observed among women with metabolic syndrome alone or metabolic syndrome + NAFLD when compared to controls (P < 0.05). In multivariate analysis, the variables considered risk factors for NAFLD development were: high systolic blood pressure (SBP) (OR 1.02, 95% confidence interval (CI) 1.0-1.04); large WC (OR 1.07, 95% CI 1.01-1.13); insulin resistance (OR 3.81, 95% CI 2.01-7.13); and metabolic syndrome (OR 8.68, 95% CI 3.3-24.1). Adiponectin levels reduced NAFLD risk (OR 0.88, 95% CI 0.80-0.96). Conclusion: In postmenopausal women, metabolic syndrome, abdominal obesity, and insulin resistance were risk markers for the development of NAFLD, whereas higher adiponectin values indicated a protective marker.

Relevance:

80.00%

Publisher:

Abstract:

In the presence of turbulence, magnetic field lines lose their dynamical identity and particles entrained on field lines diffuse through space at a rate determined by the amplitude of the turbulence. In previous work (Lazarian and Vishniac, 1999; Kowal et al., 2009; Eyink et al., 2011) we showed that this leads to reconnection speeds that are independent of resistivity. In particular, in Kowal et al. (2009) we showed that numerical simulations were consistent with the predictions of this model. Here we examine the structure of the current sheet in simulations of turbulent reconnection. Laminar flows consistent with the Sweet-Parker reconnection model produce very thin and well-ordered current sheets. On the other hand, the simulations of Kowal et al. (2009) show a strongly disordered state even for relatively low levels of turbulence. Comparing data cubes with and without reconnection, we find that large-scale field reversals are the cumulative effect of many individual eddies, each of which has magnetic properties not very different from those of turbulent eddies in a homogeneous background. This implies that the properties of stationary and homogeneous MHD turbulence are a reasonable guide to understanding turbulence during large-scale magnetic reconnection events. In addition, dissipation and high-energy particle acceleration during reconnection events take place over a macroscopic volume, rather than being confined to a narrow zone whose properties depend on microscopic transport coefficients.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this doctoral Thesis is to develop a methodology for the automatic detection of anomalies from hyperspectral data, or imaging spectrometry, and for mapping them under different surface and terrain conditions. Hyperspectral technology, or imaging spectrometry, offers the potential to characterize precisely the state of the materials that make up the various surfaces on the basis of their spectral response. This state is usually variable, whereas the observations are limited in number and acquired under particular illumination conditions. As the number of spectral bands grows, so does the number of samples needed to define the classes spectrally, in what is known as the Curse of Dimensionality or Hughes effect (Bellman, 1957); such samples are usually unavailable and costly to obtain, as becomes obvious when one considers what this implies for planetary exploration. Taking the spectral definition of an anomaly as the response of an image pixel that differs significantly from its surroundings, the central problem addressed in the Thesis is, first, how to reduce the dimensionality of the information in hyperspectral data while isolating what is most significant for the detection of anomalous responses, and second, how to establish the relationship between the detected spectral anomalies and what we have called informational anomalies, that is, anomalies that convey some real information about the surfaces or materials that produce them. Anomaly detection assumes no prior knowledge of the targets, so pixels are separated automatically according to spectral information that differs significantly from a background estimated either globally, for the whole scene, or locally, by image segmentation. The methodology developed centres on the statistical definition of the spectral background, proposing a new approach that discriminates anomalies against backgrounds segmented into different groups of wavelengths, exploiting the potential for separation between the reflective and emissive parts of the electromagnetic spectrum. The efficiency of the main anomaly detection algorithms was studied, contrasting the results of the RX algorithm (Reed and Xiaoli, 1990), adopted as the standard by the scientific community, with the UTD method (Uniform Targets Detector), its RXD-UTD variant, subspace-based methods such as SSRX (Subspace RX), and methods based on image subspace projections, such as OSPRX (Orthogonal Subspace Projection RX) and PP (Projection Pursuit). A new method was developed and evaluated against the above: a variant of PP that describes the spectral background through discriminant analysis of bands of the electromagnetic spectrum and separates the anomalies with an algorithm called the Thermal Background Anomaly Detector (DAFT, after its Spanish name), applicable to sensors that record data in the emissive spectrum. The different anomaly detection methods were evaluated in the visible and near infrared (VNIR), shortwave infrared (SWIR), mid-infrared (MIR), and thermal infrared (TIR) ranges of the electromagnetic spectrum.
The response of surfaces at the different wavelengths of the electromagnetic spectrum, together with their surroundings, influences the type and frequency of the spectral anomalies they may produce. For this reason, the research used hyperspectral data cubes from airborne sensors whose strategies and designs for building the spectrometric image differ. Test data sets from the AHS (Airborne Hyperspectral System), HyMAP Imaging Spectrometer, CASI (Compact Airborne Spectrographic Imager), AVIRIS (Airborne Visible Infrared Imaging Spectrometer), HYDICE (Hyperspectral Digital Imagery Collection Experiment), and MASTER (MODIS/ASTER Simulator) sensors were evaluated. Experiments were designed over natural, urban, and semi-urban areas of varying complexity. The behavior of the different anomaly detectors was evaluated through 23 tests corresponding to 15 study areas grouped into 6 spaces or scenarios: Urban (E1), Semi-urban/Industrial/Urban periphery (E2), Forest (E3), Agricultural (E4), Geological/Volcanic (E5), and Other spaces: water, clouds, and shadows (E6). The sensors evaluated are characterized by recording images in a wide range of narrow, contiguous bands of the electromagnetic spectrum. The Thesis focuses on developing techniques to separate and extract automatically pixels, or groups of pixels, whose spectral signature differs discriminantly from those around them, taking as the sample space part or all of the spectral bands in which the hyperspectral sensor recorded radiance. One factor taken into account in the research was the measuring instrument itself, that is, the characterization of the different subsystems, imaging and auxiliary sensors, involved in the process. To use the measured data quantitatively, it was necessary to define the spatial and spectral relationships between the sensor, the observed surface, and the potential anomalies and target detection patterns. The effect of the sensor type on anomaly detection was analyzed, both in its spectral configuration and in its design strategies for recording the radiation coming from the surfaces; the two main types of sensors studied were rotating-mirror (whiskbroom) scanners and push-broom (pushbroom) scanners. Different scenarios were defined in the research, covering a wide variability of geomorphological settings and cover types in Mediterranean, mid-latitude, and tropical environments. In summary, this Thesis presents an anomaly detection technique for hyperspectral data called DAFT, a PP variant, based on dimensionality reduction by projecting the background onto a range of thermal-spectrum wavelengths distinct from the projection of the anomalies, or targets with no known spectral signature. The proposed methodology was tested on real hyperspectral images from different sensors and in different scenarios or spaces, hence with different spectral backgrounds as well; the results show the benefits of the approach in detecting a wide variety of objects whose spectral signatures deviate sufficiently from the background.
The technique is automatic in the sense that no parameter tuning is needed, giving significant results in all cases. Even subpixel-sized objects, which cannot be distinguished by the human eye in the original image, can be detected as anomalies. In addition, a comparison is made between the proposed approach, the popular RX technique, and other detectors, in both their global and local modes. The proposed method outperforms the others in certain scenarios, demonstrating its ability to reduce the false alarm rate. The results of the automatic DAFT algorithm developed demonstrate improved qualitative definition of the spectral anomalies that identify distinct entities on or below the surface, replacing the classical normal-distribution model with a robust method that considers different alternatives from the very moment the hyperspectral data are acquired. Achieving this required analyzing the relationship between biophysical parameters, such as the reflectance and emissivity of materials, and the spatial distribution of detected entities with respect to their surroundings. Finally, the DAFT algorithm was chosen as the most suitable for sensors that acquire data in the TIR, since it shows the best agreement with the reference data, demonstrating great computational efficiency that facilitates its implementation in a mapping system that automatically projects the detected anomalies into a geographic reference frame, a significant step towards what is called real-time mapping. The aim of this Thesis is to develop a specific methodology to be applied in automatic anomaly detection processes using hyperspectral data, also called hyperspectral scenes, and to improve the classification processes. Several scenarios and areas, and their relationship with surfaces and objects, have been tested. The spectral characteristics of the reflectance parameter and emissivity in the pattern recognition of urban materials in several hyperspectral scenes have also been tested. Spectral ranges of the visible-near infrared (VNIR), shortwave infrared (SWIR) and thermal infrared (TIR) from hyperspectral data cubes of AHS (Airborne Hyperspectral System), HyMAP Imaging Spectrometer, CASI (Compact Airborne Spectrographic Imager), AVIRIS (Airborne Visible Infrared Imaging Spectrometer), HYDICE (Hyperspectral Digital Imagery Collection Experiment) and MASTER (MODIS/ASTER Simulator) have been used in this research. It is assumed that there is no prior knowledge of the targets in anomaly detection. Thus, the pixels are automatically separated according to their spectral information, significantly differentiated with respect to a background, either globally for the full scene, or locally by image segmentation. Several experiments on different scenarios have been designed, analyzing the behavior of the standard RX anomaly detector and of different methods based on subspaces, image projections, and segmentation. Results and their consequences for unsupervised classification processes are discussed. Detection of spectral anomalies aims at automatically extracting pixels that show significant responses in relation to their surroundings. This Thesis deals with the unsupervised technique of target detection, also called anomaly detection.
Since this technique assumes no prior knowledge about the target or the statistical characteristics of the data, the only available option is to look for objects that are differentiated from the background. Several methods have been developed in the last decades, allowing a better understanding of the relationships between image dimensionality and the optimization of search procedures, as well as the subpixel differentiation of the spectral mixture and its implications for anomalous responses. In another sense, image spectrometry has proven to be efficient in the characterization of materials, based on statistical methods using specific reflection and absorption bands. Spectral configurations in the VNIR, SWIR and TIR have been successfully used for mapping materials in different urban scenarios. There has been an increasing interest in the use of high resolution data (both spatial and spectral) to detect small objects and to discriminate surfaces in areas with urban complexity. This has come to be known as target detection, which can be either supervised or unsupervised. In supervised target detection, algorithms lean on prior knowledge, such as the spectral signature. The detection process for matching signatures is not straightforward due to the complications of matching data from the airborne sensor with material spectra on the ground. This can be further complicated by the large number of possible objects of interest, as well as uncertainty about the reflectance or emissivity of these objects and surfaces. An important objective in this research is to establish relationships that allow linking spectral anomalies with what can be called informational anomalies and, therefore, to identify information related to anomalous responses in some places rather than simply spotting differences from the background. The development in recent years of new hyperspectral sensors and techniques has widened the possibilities for applications in remote sensing of the Earth. Remote sensing systems measure and record the electromagnetic disturbances that the surveyed objects induce in their surroundings, by means of different sensors mounted on airborne or space platforms. Map updating is important for managers and decision makers because of the fast changes that usually happen in natural, urban and semi-urban areas. It is necessary to optimize the methodology for getting the best out of remote sensing techniques applied to hyperspectral data. The first problem with hyperspectral data is reducing the dimensionality while keeping the maximum amount of information. Hyperspectral sensors increase the amount of information considerably; this allows us to obtain better precision in the separation of materials, but at the same time a larger number of parameters must be estimated, and precision drops as the number of bands increases. This is known as the Hughes effect (Bellman, 1957). Hyperspectral imagery allows us to discriminate between a huge number of different materials; however, some land and urban covers are made up of similar materials and respond similarly, which produces confusion in the classification. The training and the algorithm used for mapping are also important for the final result, and some properties of the thermal spectrum for detecting land cover will be studied.
In summary, this Thesis presents a new technique for anomaly detection in hyperspectral data called DAFT, a PP variant, based on dimensionality reduction by projecting anomalies, or targets with unknown spectral signatures, against the background in a range of thermal-spectrum wavelengths. The proposed methodology has been tested with hyperspectral images from different imaging spectrometers corresponding to several places or scenarios, and therefore with different spectral backgrounds. The results show the benefits of the approach in detecting a variety of targets whose spectral signatures deviate sufficiently from the background. DAFT is an automated technique in the sense that there is no need to adjust parameters, providing significant results in all cases. Even subpixel anomalies, which cannot be distinguished by the human eye in the original image, can be detected as outliers due to the projection of the VNIR endmembers with a very strong thermal contrast. Furthermore, a comparison between the proposed approach and the well-known RX detector is performed in both modes, global and local. The proposed method outperforms the existing ones in particular scenarios, demonstrating its ability to reduce the probability of false alarms. The results of the automatic DAFT algorithm demonstrate an improvement in the qualitative definition of the spectral anomalies, replacing the classical normal-distribution model with a robust method. Achieving this required analyzing the relationship between biophysical parameters, such as reflectance and emissivity, and the spatial distribution of detected entities with respect to their environment, for example buried or semi-buried materials, or building covers of asbestos, cellular polycarbonate-PVC or metal composites. Finally, the DAFT method has been chosen as the most suitable for anomaly detection with imaging spectrometers that acquire data in the thermal infrared spectrum, since it presents the best results in comparison with the reference data, demonstrating great computational efficiency that facilitates its implementation in a mapping system towards what is called Real-Time Mapping.
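Since the RX detector (Reed and Xiaoli, 1990) is the baseline against which DAFT is compared throughout, a minimal global-mode sketch may help fix ideas: each pixel is scored by its Mahalanobis distance to the scene-wide background. The cube and all names are illustrative stand-ins, not the thesis code.

```python
import numpy as np

def rx_global(cube: np.ndarray) -> np.ndarray:
    """Global RX anomaly scores for a (rows, cols, bands) cube.

    Each pixel's score is its Mahalanobis distance to the background,
    estimated from the scene-wide mean and covariance.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)   # pseudo-inverse for stability
    D = X - mu
    scores = np.einsum('ij,jk,ik->i', D, cov_inv, D)
    return scores.reshape(rows, cols)

# Stand-in cube with one injected anomalous pixel.
rng = np.random.default_rng(1)
cube = rng.normal(size=(100, 100, 50))
cube[40, 60] += 5.0
anomaly_map = rx_global(cube)
```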