999 results for basic matrix
Abstract:
Sparse matrix-vector multiplication (SMVM) is a fundamental operation in many scientific and engineering applications. Sparse matrices often have thousands of rows and columns in which most entries are zero, with the non-zero data spread across the matrix. This sparsity reduces data locality and hence the effectiveness of the data cache in general-purpose processors, considerably lowering their performance efficiency compared with dense matrix multiplication. In this paper, we propose a parallel processing solution for SMVM on a many-core architecture. The architecture is tested with known benchmarks on a ZYNQ-7020 FPGA. It is scalable in the number of core elements and limited only by the available memory bandwidth, achieving performance efficiencies of up to almost 70% and better performance than previous FPGA designs.
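For intuition, a minimal sequential sketch of the kernel such an architecture parallelizes is given below, using the common compressed sparse row (CSR) layout; the CSR format and the function shown are illustrative assumptions, not the paper's actual on-chip storage scheme or interface.

    /* y = A*x with A in CSR form: row_ptr has n_rows + 1 entries,
       col_idx and val have one entry per non-zero. */
    #include <stddef.h>

    void spmv_csr(size_t n_rows, const size_t *row_ptr,
                  const size_t *col_idx, const double *val,
                  const double *x, double *y)
    {
        for (size_t i = 0; i < n_rows; i++) {
            double sum = 0.0;
            /* Only the non-zeros of row i are visited; the indirect
               access x[col_idx[k]] is what defeats the cache and makes
               memory bandwidth, not arithmetic, the limiting resource. */
            for (size_t k = row_ptr[i]; k < row_ptr[i + 1]; k++)
                sum += val[k] * x[col_idx[k]];
            y[i] = sum;
        }
    }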
Abstract:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial resolution element at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located in that resolution element. This chapter addresses hyperspectral unmixing, the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]; the nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17], whereas the nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem that can be addressed, for example, by the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest; the basic concept is to project each pixel onto a subspace orthogonal to the undesired signatures. As shown by Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases, however, the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data that yields statistically independent components. Given that hyperspectral data are, under given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. The first approach faces two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, which plays the role of the mixed sources, is not straightforward.
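Both formulations rest on the linear mixing model used throughout this chapter; for concreteness, it can be written as (a standard formulation; the notation is ours, not necessarily the chapter's):

\[
\mathbf{y} = \mathbf{M}\boldsymbol{\alpha} + \mathbf{n},
\qquad \alpha_i \ge 0, \qquad \sum_{i=1}^{p} \alpha_i = 1,
\]

where \(\mathbf{y}\) is the observed spectral vector, the columns of \(\mathbf{M}\) hold the \(p\) endmember signatures, \(\boldsymbol{\alpha}\) collects the abundance fractions, and \(\mathbf{n}\) is the system noise.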
In the second approach, ICA is based on the assumption of mutually independent sources, which does not hold for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps: first, the source densities and noise covariance are estimated from the observed data by maximum likelihood; second, the sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique for unmixing independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises IFA performance, as in the ICA case. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are computationally complex: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at lower computational complexity, algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel per endmember. This is a strong requirement that may not hold in some data sets; in any case, these algorithms find the set of purest pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, the processing of hyperspectral data, including unmixing, is very often preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations; to overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL)-based algorithm [55].
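As a reminder of the criterion underlying this study (standard definition, our notation), ICA seeks an unmixing matrix \(\mathbf{W}\) whose outputs \(\mathbf{s} = \mathbf{W}\mathbf{y}\) minimize the mutual information

\[
I(\mathbf{s}) = \int p(\mathbf{s}) \log \frac{p(\mathbf{s})}{\prod_i p_i(s_i)}\, d\mathbf{s},
\]

which is zero if and only if the components are mutually independent. The full additivity constraint \(\sum_i \alpha_i = 1\) makes the abundance fractions dependent, so this criterion cannot reach zero at the true solution, which is the root of the limitations studied below.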
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, in which the abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing the independent sources represented by MOG with mixtures of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief review of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms on experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA on real data. Section 6.7 describes the new blind unmixing scheme with some illustrative examples. Section 6.8 concludes with some remarks.
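For reference, the mixture-of-Dirichlet source model sketched above encodes both constraints by construction (standard Dirichlet density; the notation is ours):

\[
p(\boldsymbol{\alpha}) = \sum_{k} \epsilon_k\, \mathcal{D}(\boldsymbol{\alpha} \mid \boldsymbol{\theta}_k),
\qquad
\mathcal{D}(\boldsymbol{\alpha} \mid \boldsymbol{\theta}) =
\frac{\Gamma\big(\sum_i \theta_i\big)}{\prod_i \Gamma(\theta_i)}\,
\prod_i \alpha_i^{\theta_i - 1},
\]

with support restricted to the simplex \(\alpha_i \ge 0\), \(\sum_i \alpha_i = 1\), so positivity and full additivity need not be imposed separately.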
Abstract:
The use of technology has grown over the last decades in the most diverse areas, whether in industry or in everyday life, and its benefits are increasingly evident. Sport is no different. New developments appear every day aimed at improving the performance of those who practise physical activities, making it possible to reach results never thought possible before. Moreover, the use of technology in sport allows the acquisition of biomechanical data that can be used both in training and in improving athletes' quality of life, helping, for example, to prevent injuries. The present project thus applies to sport, namely to surfing, an area where scientific work is still scarce, combining electronic technology with sport to quantify information so far unknown. Three basic performance factors were identified: balance, foot positioning, and surfboard motion. These factors led to the development of a system capable of measuring them dynamically through the measurement of the plantar forces and of the surfboard rotation. Besides measuring these factors, the system can store the acquired data locally on a memory card for later analysis, and can also send them over a wireless link, allowing real-time visualization of the plantar centre of pressure, the surfboard rotation angles, and the sensor activation. The device consists of an embedded electronic system composed of an ATMEGA1280 microcontroller, an analog signal acquisition and conditioning circuit, an inertial measurement unit, an RN131 wireless communication module, and a set of Flexiforce force sensors. The embedded firmware was developed in the C language. Matlab was used for data reception and real-time visualization. The tests performed demonstrated that the system meets the proposed requirements, providing information about balance, through the centre of pressure; foot positioning, through the distribution of plantar pressures; and board motion in the pitch and roll axes, through the inertial unit. The mean force measurement error was -0.0012 ± 0.0064 N, while the minimum range achieved by the wireless transmission was 100 m. The measured power consumption of the system was 330 mW.
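As an illustration of the balance measurement described above, the plantar centre of pressure is the force-weighted average of the sensor positions. The sketch below is a minimal example; the sensor count, layout, and sample forces are illustrative assumptions, not the project's actual firmware interface.

    #include <stdio.h>

    #define NUM_SENSORS 4  /* hypothetical; the real board may use more */

    int main(void)
    {
        /* Assumed (x, y) sensor positions on the board (m) and sample forces (N). */
        const double x[NUM_SENSORS] = { 0.00, 0.10, 0.00, 0.10 };
        const double y[NUM_SENSORS] = { 0.00, 0.00, 0.40, 0.40 };
        const double f[NUM_SENSORS] = { 120.0, 80.0, 200.0, 150.0 };

        double sx = 0.0, sy = 0.0, ft = 0.0;
        for (int i = 0; i < NUM_SENSORS; i++) {
            sx += f[i] * x[i];   /* force-weighted x coordinate */
            sy += f[i] * y[i];   /* force-weighted y coordinate */
            ft += f[i];          /* total plantar force */
        }
        if (ft > 0.0)
            printf("centre of pressure = (%.3f, %.3f) m\n", sx / ft, sy / ft);
        return 0;
    }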
Abstract:
Dissertation for obtaining the Master degree in Chemical and Biochemical Engineering
Abstract:
Paper presented at Geo-Spatial Crossroad GI_Forum, Salzburg, Austria.
Abstract:
This article aims to contribute to the study of the historical processes that have singled out Muslim populations as favourite targets for political analysis and governance. Focusing on the Portuguese archives, civil as well as military, the article tries to uncover the most conspicuous identity representations (mainly negative or ambivalent) that members of the Portuguese colonial apparatus built around Muslim communities living in the African colonies, particularly in Guinea-Bissau and Mozambique. The paper shows how these culturally and politically constructed images were related to the more general strategies by which the Portuguese imagined their own national identity, both as ‘European’ and as ‘coloniser’ or ‘imperial people’. The basic assumption of this article is that policies enforced in a context of inter-ethnic and religious competition are better understood when linked to the identity strategies inherent to them. These are conceived as strategic constructions aimed at the preservation, the protection, and the imaginary expansion of the subject, who looks for groups to be included in and out-groups to reject, exclude, aggress, or eliminate. We think that most inter-ethnic relationships and conflicts, as well as the very experience of ethnicity, are born from this identity matrix.
Abstract:
This paper addresses the matrix representation of dynamical systems from the perspective of fractional calculus. Fractional elements and fractional systems are interpreted in the light of the classical Cole–Cole, Davidson–Cole, and Havriliak–Negami heuristic models. Numerical simulations of an electrical circuit illustrate the results for matrix-based models and high fractional orders. The conclusions clarify the distinction between fractional elements and fractional systems.
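For reference, the Cole–Cole model cited above is the classical one-parameter fractional generalization of Debye relaxation (standard form):

\[
\varepsilon(j\omega) = \varepsilon_\infty +
\frac{\varepsilon_s - \varepsilon_\infty}{1 + (j\omega\tau)^{1-\alpha}},
\qquad 0 \le \alpha < 1,
\]

where \(\varepsilon_s\) and \(\varepsilon_\infty\) are the static and high-frequency permittivities, \(\tau\) is the relaxation time, and the fractional exponent \(1-\alpha\) recovers the integer-order Debye model when \(\alpha = 0\).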
Abstract:
Speech interfaces for Assistive Technologies are not common and are usually replaced by other modalities. The market they target is not considered attractive, and speech technologies are still not widespread. Industry still regards them as carrying some performance risks, especially Speech Recognition systems. As speech is the most elemental and natural means of communication, it has strong potential for enhancing inclusion and quality of life for broader groups of users with special needs, such as people with cerebral palsy and elderly people staying in their homes. This work is a position paper in which the authors argue for the need to make speech the basic interface in assistive technologies. Among the main arguments, we can state: speech is the easiest way to interact with machines; there is a growing market for embedded speech in assistive technologies, since the number of disabled and elderly people is expanding; speech technology is already mature enough to be used but needs adaptation to people with special needs; and there is still a lot of R&D to be done in this area, especially when considering the Portuguese market. The main challenges are presented and future directions are proposed.
Abstract:
The aim of the present work was to determine the prevalence of IgG and IgM anti-Toxoplasma gondii antibodies and the factors associated with the infection in pregnant women attended at Basic Health Units in Rolândia, Paraná, Brazil. The sample was divided into two groups: group I (320 pregnant women analyzed from July 2007 to February 2008) and group II (287 pregnant women analyzed from March to October 2008). In group I, 53.1% of the pregnant women were IgG reactive and IgM non-reactive, 1.9% were IgG and IgM reactive, 0.3% were IgG non-reactive and IgM reactive, and 44.7% were IgG and IgM non-reactive. In group II, 55.1% were IgG reactive and IgM non-reactive and 44.9% were IgG and IgM non-reactive. The variables associated with the presence of IgG antibodies were: residence in rural areas, age between 35 and 40 years, low educational level, low family income, more than one pregnancy, drinking water not originating from the public water supply system, and the habit of handling soil or sand. Guidance on primary prevention measures and quarterly serological monitoring of the pregnant women in the risk group are important measures to prevent congenital toxoplasmosis.
Abstract:
The discussion of possible scenarios for the future of Quality is on the priority list of the major quality practitioners' societies. The European Organization for Quality chose "Managing Challenges in Quality Leadership" as the main theme of its 58th EOQ Congress, held in June 2014 in Göteborg (EOQ, 2014), and the American Society for Quality appointed "the Future of Quality" as the theme of the November 2015 issue of Quality Progress magazine (ASQ, 2015). In addition, the ISO 9001:2008 revision process carried out by ISO/TC 176 aims to ensure that the ISO 9001:2015 International Standard remains stable for the next 10 years (ISO, 2014), further contributing to the discussion on the future of quality. The purpose of this research is to review available Quality Management approaches and to outline, adding an academic perspective, expected developments for Quality within the 21st century. This paper follows a qualitative approach, although data from international organizations is used. A literature review has been undertaken on past and potential future trends in quality management. Based on these findings, a model is proposed for the development of organizational quality management, and propositions for the future of quality management are advanced. Firstly, a state of the art of existing Quality Management approaches is presented, covering, for example, Total Quality Management (TQM) and the Quality Gurus, the ISO 9000 International Standards series (with an outline of the expected changes for ISO 9001:2015), Six Sigma, and Business Excellence Models. Secondly, building on theoretical and managerial approaches, a two-dimensional matrix, Quality Engineering (QE: the technical aspects of quality) versus Quality Management (QM: the soft aspects of quality), is presented, outlining five proposed characterizations of Quality maturity levels and giving insights for applications and future developments. The literature review highlights that QM and QE may address similar quality issues, but their approaches differ in breadth of scope and intensity, and they ought to complement and reciprocally reinforce one another. The challenges organizations face within the 21st century carry stronger uncertainty, complexity, and differentiation. Two main propositions are advanced as relevant for 21st-century Quality:
- The importance of QM for the sustainable success of organizations will increase, and organizations should be aware of the larger ecosystem to be managed for improvement, possibly leading to the emergence of a new Quality paradigm, the Civilizational Excellence paradigm.
- QE should get more attention from QM, and Quality professionals will have to: a) master and apply the Quality tools (basic, intermediate, and advanced) in wider contexts and in additional depth; b) have the soft skills needed for success; c) be results-oriented and better understand and demonstrate the relationships between approaches and results.
These propositions challenge both scholars and practitioners to engage in a sustained and supported discussion on the future of Quality. "All things are ready, if our mind be so." (Shakespeare, Henry V, circa 1599).
Abstract:
Protein nutritional status indicators were studied in weanling albino Swiss mice infected with S. mansoni and fed the Regional Basic Diet (RBD) from Northeast Brazil, a multideficient diet of low protein content. Each mouse was infected percutaneously with 80 cercariae. The experiment lasted 63 days. The growth curve, food consumption, protein intake, weight gain, Protein Efficiency Ratio (PER), and Net Protein Ratio (NPR) were the parameters investigated. RBD-fed mice showed marked weight loss, lower food and protein intake, slower body weight gain, and lower rates of food protein utilization when compared to casein-fed animals. Differences between infected and non-infected mice were not consistent. The present results suggest that the effects of RBD-induced malnutrition on the health and nutritional condition of the mice are more severe than those of Manson's schistosomiasis in the initial phase of the disease.
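For reference, the two protein-utilization indices named above are conventionally defined as follows (standard definitions; the study's exact protocol is not given in the abstract):

\[
\mathrm{PER} = \frac{\text{body weight gain (g)}}{\text{protein intake (g)}},
\qquad
\mathrm{NPR} = \frac{\text{weight gain of the test group} + \text{weight loss of a protein-free group}}{\text{protein intake (g)}}.
\]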
Abstract:
Dissertation presented to Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for obtaining the Master degree in Membrane Engineering
Abstract:
Four years after the first visit, seventeen public health units were visited again and evaluated as to the storage standards recommended by the Brazilian Immunization Programme. In 100% of the units, the refrigerators were adequately maintained and the proper placement of vaccines inside the refrigerator was regularly checked. However, when temperature control was checked, only 64.7% presented adequate storage conditions. In 94.1% of the units, health workers complained of a lack of immediate technical support in emergency situations. In 55.2%, the titers of the vaccine samples were below the minimum recommended potency. It is necessary that the factors concerning the cold chain be continually evaluated so that the quality of the vaccines to be used is not affected.