10 results for 1117

in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

10.00%

Publisher:

Abstract:

Methodological issues in research with children have attracted growing interest within the Sociology of Childhood over the last decades. In Portugal this interest is more recent, but it has grown significantly. Drawing on several research projects, namely master's theses supervised by the authors within the framework of the Sociology of Childhood, this proposal characterizes some methodological complexities of research with children in Portugal when their voice and agency are taken into account in the production of knowledge about them. The goal of this paper is to contribute to the methodological discussion on research with children by identifying a set of challenges related to: (i) the diversity of methodologies used in research with children, (ii) ethical concerns, and (iii) the role of the researcher.

Relevance:

10.00%

Publisher:

Abstract:

Proceedings of International Conference - SPIE 7830, Image and Signal Processing for Remote Sensing XVI, Lorenzo Bruzzone (Ed.) - Toulouse, France, 20 September 2010

Relevance:

10.00%

Publisher:

Abstract:

Proceedings of International Conference - SPIE 7477, Image and Signal Processing for Remote Sensing XV - 28 September 2009

Relevance:

10.00%

Publisher:

Abstract:

Background: Brown adipose tissue (BAT) plays an important role in whole-body metabolism and could potentially mediate weight gain and insulin sensitivity. Although some imaging techniques allow BAT detection, there are currently no viable methods for continuous acquisition of BAT energy expenditure. We present a non-invasive technique for long-term monitoring of BAT metabolism using microwave radiometry. Methods: A multilayer 3D computational model was created in HFSS™ with 1.5 mm skin, 3-10 mm subcutaneous fat, 200 mm muscle and a BAT region (2-6 cm3) located between fat and muscle. Based on this model, a log-spiral antenna was designed and optimized to maximize reception of thermal emissions from the target (BAT). The power absorption patterns calculated in HFSS™ were combined with simulated thermal distributions computed in COMSOL® to predict the radiometric signal measured by an ultra-low-noise microwave radiometer. The power received by the antenna was characterized as a function of different levels of BAT metabolism under cold and noradrenergic stimulation. Results: The optimized frequency band was 1.5-2.2 GHz, with an average antenna efficiency of 19%. The simulated power received by the radiometric antenna increased by 2-9 mdBm (noradrenergic stimulus) and 4-15 mdBm (cold stimulus), corresponding to a 15-fold increase in BAT metabolism. Conclusions: The results demonstrate the ability to detect thermal radiation from small volumes (2-6 cm3) of BAT located up to 12 mm deep and to monitor small changes (0.5°C) in BAT metabolism. As such, the developed miniature radiometric antenna sensor appears suitable for non-invasive long-term monitoring of BAT metabolism.
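
As a rough, order-of-magnitude illustration of the quantities involved, the sketch below estimates the thermal noise power collected over a bandwidth B from tissue at brightness temperature T_B using the Nyquist relation P = η·k·T_B·B and expresses it in dBm. The 19% efficiency and the 1.5-2.2 GHz band come from the abstract; the 37°C baseline, the assumption that the whole observed brightness temperature rises by 0.5°C, and the neglect of the antenna weighting over skin, fat and muscle are simplifications for illustration only.

```python
import math

BOLTZMANN = 1.380649e-23  # J/K

def radiometric_power_dbm(brightness_temp_k, bandwidth_hz, antenna_efficiency):
    """Thermal noise power P = eta * k * T_B * B, expressed in dBm."""
    power_w = antenna_efficiency * BOLTZMANN * brightness_temp_k * bandwidth_hz
    return 10.0 * math.log10(power_w / 1e-3)

# 19% efficiency and the 1.5-2.2 GHz band are quoted in the abstract;
# the 37 C baseline and the uniform 0.5 C rise are illustrative assumptions.
band_hz = (2.2 - 1.5) * 1e9
baseline = radiometric_power_dbm(310.15, band_hz, 0.19)
stimulated = radiometric_power_dbm(310.65, band_hz, 0.19)
print(f"baseline: {baseline:.2f} dBm")
print(f"change:   {(stimulated - baseline) * 1000:.1f} mdBm")  # a few mdBm, the same scale as in the abstract
```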

Relevance:

10.00%

Publisher:

Abstract:

Hyperspectral imaging can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems of hyperspectral data analysis is the presence of mixed pixels, due to the low spatial resolution of such images, which means that several spectrally pure signatures (endmembers) are combined into the same mixed pixel. Linear spectral unmixing follows an unsupervised approach that aims at inferring pure spectral signatures and their material fractions at each pixel of the scene. The huge data volumes acquired by such sensors put stringent requirements on processing and unmixing methods. This paper proposes an efficient implementation of an unsupervised linear unmixing method, simplex identification via split augmented Lagrangian (SISAL), on GPUs using CUDA. The method finds the smallest simplex enclosing the data by solving a sequence of nonsmooth convex subproblems, using variable splitting to obtain a constrained formulation and then applying an augmented Lagrangian technique. The parallel implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses. The results presented herein indicate that the GPU implementation can significantly accelerate the method's execution on large datasets while maintaining its accuracy.
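
For readers unfamiliar with the linear mixture model that SISAL operates on, the toy sketch below generates mixed pixels as convex combinations of endmember signatures plus noise. The dimensions, the random "spectra" and the Dirichlet abundance draw are illustrative assumptions; none of the SISAL solver itself is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
bands, endmembers, pixels = 200, 3, 10_000

# Endmember signatures (columns of M); random positive "spectra" stand in for real ones.
M = rng.uniform(0.1, 1.0, size=(bands, endmembers))

# Abundances on the probability simplex: non-negative and summing to one per pixel.
A = rng.dirichlet(alpha=np.ones(endmembers), size=pixels).T   # (endmembers, pixels)

# Mixed pixels: each observed spectrum is a convex combination of endmembers plus noise.
Y = M @ A + 0.01 * rng.standard_normal((bands, pixels))

print(Y.shape)              # (200, 10000) simulated mixed pixels
print(A.sum(axis=0)[:5])    # abundance fractions sum to one
```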

Relevance:

10.00%

Publisher:

Abstract:

Dimensionality reduction plays a crucial role in many hyperspectral data processing and analysis algorithms. This paper proposes a new mean-squared-error-based approach to determine the signal subspace in hyperspectral imagery. The method first estimates the signal and noise correlation matrices and then selects the subset of eigenvalues that best represents the signal subspace in the least-squares sense. The effectiveness of the proposed method is illustrated using simulated and real hyperspectral images.
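
A schematic illustration of the selection idea, not the authors' estimator: eigendecompose the sample correlation matrix and choose the subspace dimension that minimizes a simple cost combining the signal energy discarded outside the subspace with the noise power retained inside it. The scalar per-band noise variance below is an assumed stand-in for a proper noise-correlation estimate.

```python
import numpy as np

def select_signal_subspace(Y, noise_var):
    """Pick the subspace dimension k minimizing a simple MSE-style cost:
    signal energy left outside the first k eigen-directions plus the
    noise power retained inside them.

    Y         : (bands, pixels) data matrix
    noise_var : assumed per-band noise variance (scalar)
    """
    bands, pixels = Y.shape
    Ry = (Y @ Y.T) / pixels                             # sample correlation matrix
    eigvals = np.linalg.eigvalsh(Ry)[::-1]              # eigenvalues, descending
    signal_energy = np.clip(eigvals - noise_var, 0.0, None)
    costs = [signal_energy[k:].sum() + k * noise_var for k in range(bands + 1)]
    return int(np.argmin(costs))

# Toy usage: synthetic data with a known signal subspace of dimension 5.
rng = np.random.default_rng(1)
Y = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 5000))
Y += 0.1 * rng.standard_normal(Y.shape)
print(select_signal_subspace(Y, noise_var=0.01))        # should recover the true dimension, 5
```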

Relevance:

10.00%

Publisher:

Abstract:

Hyperspectral instruments have been incorporated into satellite missions, providing large amounts of high-spectral-resolution data of the Earth's surface. These data can be used in remote sensing applications that often require a real-time or near-real-time response. To avoid delays between hyperspectral image acquisition and its interpretation, usually done at a ground station, onboard systems have emerged to process the data, reducing the volume of information to transfer from the satellite to the ground station. For this purpose, compact reconfigurable hardware modules, such as field-programmable gate arrays (FPGAs), are widely used. This paper proposes an FPGA-based architecture for hyperspectral unmixing. The method is based on vertex component analysis (VCA) and works without a dimensionality reduction preprocessing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 system-on-chip, whose FPGA programmable logic is based on the Artix-7 family, and has been tested using real hyperspectral data. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform for implementing high-performance, low-cost embedded systems and opens perspectives for onboard hyperspectral image processing.
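
The payoff of onboard unmixing is easy to quantify: transmitting a handful of endmember signatures plus their abundance maps is far smaller than the raw cube. The scene dimensions, number of endmembers and word sizes below are illustrative assumptions only.

```python
# Back-of-the-envelope data-volume comparison for onboard unmixing.
# All scene dimensions and word sizes are illustrative assumptions.
lines, samples, bands = 1024, 1024, 224      # raw hyperspectral cube
endmembers = 10                              # number of extracted signatures
bytes_per_value = 2                          # e.g. 16-bit samples

raw_bytes = lines * samples * bands * bytes_per_value
unmixed_bytes = (endmembers * bands                      # endmember signatures
                 + lines * samples * endmembers) * bytes_per_value  # abundance maps

print(f"raw cube:        {raw_bytes / 2**20:7.1f} MiB")
print(f"unmixing output: {unmixed_bytes / 2**20:7.1f} MiB")
print(f"reduction:       {raw_bytes / unmixed_bytes:7.1f}x")
```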

Relevance:

10.00%

Publisher:

Abstract:

The parallel hyperspectral unmixing problem is considered in this paper. A semisupervised approach is developed under the linear mixture model, where the physical constraints on the abundances are taken into account. The proposed approach relies on the increasing availability of spectral libraries of materials measured on the ground, instead of resorting to endmember extraction methods. Since libraries are potentially very large and hyperspectral datasets are of high dimensionality, a pixel-by-pixel parallel implementation is derived that exploits the graphics processing unit (GPU) architecture at a low level, thus taking full advantage of the computational power of GPUs. Experimental results obtained for real hyperspectral datasets reveal significant speedups, up to 164 times, with regard to an optimized serial implementation.
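
As a plain serial reference for what the GPU version parallelizes, the sketch below solves a non-negative least-squares problem per pixel against a synthetic spectral library, handling the sum-to-one constraint only approximately by appending a weighted row of ones. This is a common device for sketching semisupervised unmixing, not the authors' formulation, and the library, data and SciPy-based solver are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
bands, library_size, pixels = 200, 50, 100

# Synthetic spectral library (columns) and pixels mixed from three of its members.
library = rng.uniform(0.0, 1.0, size=(bands, library_size))
true_idx = [4, 17, 33]
fractions = np.array([0.5, 0.3, 0.2])
Y = library[:, true_idx] @ fractions[:, None] + 0.005 * rng.standard_normal((bands, pixels))

# Soft sum-to-one constraint: append a weighted row of ones to library and data.
delta = 10.0
library_aug = np.vstack([library, delta * np.ones((1, library_size))])

abundances = np.zeros((library_size, pixels))
for j in range(pixels):                       # this per-pixel loop is what the GPU parallelizes
    y_aug = np.concatenate([Y[:, j], [delta]])
    abundances[:, j], _ = nnls(library_aug, y_aug)

print(np.round(abundances[true_idx, 0], 2))   # approximately [0.5, 0.3, 0.2] for the active members
```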

Relevance:

10.00%

Publisher:

Abstract:

Hyperspectral unmixing methods aim at the decomposition of a hyperspectral image into a collection of endmember signatures, i.e., the radiance or reflectance of the materials present in the scene, and the corresponding abundance fractions at each pixel in the image. This paper introduces a new unmixing method termed dependent component analysis (DECA). This method is blind and fully automatic, and it overcomes the limitations of unmixing methods based on independent component analysis (ICA) and on geometry-based approaches. DECA is based on the linear mixture model, i.e., each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the non-negativity and constant-sum constraints imposed by the acquisition process. The endmember signatures are inferred by a generalized expectation-maximization (GEM) type algorithm. The paper illustrates the effectiveness of DECA on synthetic and real hyperspectral images.
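
The generative side of the model is straightforward to illustrate: abundances drawn from a mixture of Dirichlet densities satisfy non-negativity and the sum-to-one constraint by construction. The sketch below only simulates data under such a model with assumed parameters; it does not implement the GEM inference described above.

```python
import numpy as np

rng = np.random.default_rng(3)
bands, endmembers, pixels = 150, 4, 5000

# Endmember signatures (synthetic stand-ins for radiance/reflectance spectra).
M = rng.uniform(0.05, 0.95, size=(bands, endmembers))

# Two-component mixture of Dirichlet densities over the abundance simplex.
weights = np.array([0.6, 0.4])
alphas = np.array([[8.0, 2.0, 1.0, 1.0],     # component favouring endmember 1
                   [1.0, 1.0, 5.0, 5.0]])    # component favouring endmembers 3 and 4
component = rng.choice(2, size=pixels, p=weights)
A = np.stack([rng.dirichlet(alphas[c]) for c in component], axis=1)  # (endmembers, pixels)

# Non-negativity and the sum-to-one constraint hold by construction.
assert (A >= 0).all() and np.allclose(A.sum(axis=0), 1.0)

Y = M @ A + 0.01 * rng.standard_normal((bands, pixels))              # observed mixed pixels
print(Y.shape)
```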

Relevance:

10.00%

Publisher:

Abstract:

Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra, called endmember signatures, and a set of corresponding abundance fractions for the respective spatial coverage. This paper introduces vertex component analysis (VCA), an unsupervised algorithm to unmix linear mixtures of hyperspectral data. VCA exploits the fact that endmembers occupy the vertices of a simplex and assumes the presence of pure pixels in the data. VCA performance is illustrated using simulated and real data. VCA competes with state-of-the-art methods at a much lower computational complexity.
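
A compact sketch of the geometric idea behind VCA: after each endmember is found, project the data onto a direction orthogonal to the subspace already spanned and pick the pixel with the largest absolute projection, which under the pure-pixel assumption sits at a vertex of the simplex. The loop below follows this spirit but omits the SNR-dependent projective steps and other details of the published algorithm; the toy data and planted pure pixels are assumptions for illustration.

```python
import numpy as np

def simple_vca(Y, p, seed=0):
    """Simplified vertex-finding loop in the spirit of VCA (not the full algorithm).

    Y : (bands, pixels) data matrix assumed to contain pure pixels
    p : number of endmembers to extract
    Returns the indices of the selected candidate pure pixels.
    """
    rng = np.random.default_rng(seed)
    bands, _ = Y.shape
    E = np.zeros((bands, p))                  # signatures selected so far
    indices = []
    for i in range(p):
        w = rng.standard_normal(bands)        # random direction ...
        if i > 0:
            Q, _ = np.linalg.qr(E[:, :i])     # ... made orthogonal to the current span
            w -= Q @ (Q.T @ w)
        f = w / np.linalg.norm(w)
        idx = int(np.argmax(np.abs(f @ Y)))   # pixel with the most extreme projection
        indices.append(idx)
        E[:, i] = Y[:, idx]
    return indices

# Toy usage: mixtures of 3 endmembers, with pure pixels planted at columns 0-2.
rng = np.random.default_rng(4)
M = rng.uniform(0.1, 1.0, size=(120, 3))
A = rng.dirichlet(np.ones(3), size=1000).T
A[:, :3] = np.eye(3)
print(simple_vca(M @ A, p=3))                 # should return columns 0, 1 and 2 (in some order)
```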