979 results for sparse matrix-vector multiplication


Relevance:

20.00%

Publisher:

Abstract:

In recent papers, the authors obtained formulas for the directional derivatives, of all orders, of the immanant and of the m-th ξ-symmetric tensor power of an operator and of a matrix, when ξ is a character of the full symmetric group. The operator norm of these derivatives was also calculated. In this paper, similar results are established for generalized matrix functions and for every symmetric tensor power.
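
For context, the immanant and, more generally, a generalized matrix function are usually defined as follows (a reference sketch in standard notation; the paper's own notation and normalizations may differ):

```latex
% Generalized matrix function induced by a subgroup G of S_n and a character \chi of G;
% the immanant is the special case G = S_n.
d_{\chi}^{G}(A) \;=\; \sum_{\sigma \in G} \chi(\sigma) \prod_{i=1}^{n} a_{i\,\sigma(i)},
\qquad
\operatorname{Imm}_{\chi}(A) \;=\; \sum_{\sigma \in S_n} \chi(\sigma) \prod_{i=1}^{n} a_{i\,\sigma(i)}.
```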

Relevance:

20.00%

Publisher:

Abstract:

Lutzomyia verrucarum (Townsend, 1913) (Diptera: Psychodidae), the natural vector of Peruvian verruga or Carrión's disease, is a species endemic to Peru. Its geographic distribution lies between parallels 5º and 13º25' South latitude, in the western and inter-Andean valleys of the Andes. The altitudinal distribution of Lu. verrucarum varies among the different valleys: in the western valleys it ranges from 1100 to 2980 m above sea level, and in the inter-Andean valleys from 1200 to 3200 m. In certain verruga-endemic areas there is no correlation between the presence of Lu. verrucarum and Carrión's disease, which suggests the existence of secondary vectors.

Relevance:

20.00%

Publisher:

Abstract:

Single-processor architectures are unable to provide the performance required by high-performance embedded systems. Parallel processing based on general-purpose processors can achieve this performance, but with a considerable increase in the required resources. However, in many cases, simplified optimized parallel cores can be used instead of general-purpose processors, achieving better performance at lower resource utilization. In this paper, we propose a configurable many-core architecture to serve as a co-processor for high-performance embedded computing on Field-Programmable Gate Arrays. The architecture consists of an array of configurable simple cores with support for floating-point operations, interconnected by a configurable interconnection network. For each core it is possible to configure the size of the internal memory, the supported operations and the number of interfacing ports. The architecture was tested on a ZYNQ-7020 FPGA by executing several parallel algorithms. The results show that the proposed many-core architecture achieves better performance than a parallel general-purpose processor and that up to 32 floating-point cores can be implemented in a ZYNQ-7020 SoC FPGA.
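
As a rough illustration of the configuration space described above (a hypothetical sketch: the field names and the feasibility heuristic are drawn only from the abstract's wording, not from the paper's actual configuration interface):

```python
from dataclasses import dataclass

@dataclass
class CoreConfig:
    """Per-core parameters the abstract describes as configurable (names are illustrative)."""
    local_mem_kb: int   # size of the core's internal memory
    ops: tuple          # supported operations, e.g. ("fadd", "fmul")
    num_ports: int      # number of interfacing ports to the interconnect

@dataclass
class ManyCoreConfig:
    cores: list         # one CoreConfig per instantiated core
    network: str        # interconnection network variant, e.g. "ring" or "crossbar"

def fits_zynq7020(cfg: ManyCoreConfig, max_fp_cores: int = 32) -> bool:
    """Crude feasibility check reflecting the reported limit of up to 32
    floating-point cores on a ZYNQ-7020 (an assumption-level heuristic only)."""
    fp_cores = sum(1 for c in cfg.cores if any(op.startswith("f") for op in c.ops))
    return fp_cores <= max_fp_cores

cfg = ManyCoreConfig(
    cores=[CoreConfig(local_mem_kb=8, ops=("fadd", "fmul"), num_ports=2) for _ in range(32)],
    network="ring",
)
print(fits_zynq7020(cfg))  # True
```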

Relevance:

20.00%

Publisher:

Abstract:

The parallel hyperspectral unmixing problem is considered in this paper. A semisupervised approach is developed under the linear mixture model, where the physical constraints on the abundances are taken into account. The proposed approach relies on the increasing availability of spectral libraries of materials measured on the ground, instead of resorting to endmember extraction methods. Since libraries are potentially very large and hyperspectral datasets are of high dimensionality, a parallel implementation in a pixel-by-pixel fashion is derived that properly exploits the graphics processing unit (GPU) architecture at a low level, thus taking full advantage of the computational power of GPUs. Experimental results obtained for real hyperspectral datasets reveal significant speedup factors, up to 164 times, with regard to an optimized serial implementation.
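
A minimal serial sketch of the per-pixel constrained step that such a GPU implementation parallelizes (an assumption-level illustration: it enforces nonnegativity with SciPy's nnls and approximates the sum-to-one constraint with a weighted augmented row, which is not necessarily the paper's exact solver):

```python
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(library, pixel, sum_to_one_weight=1e3):
    """Estimate abundances x >= 0 with sum(x) close to 1 for one pixel.

    library : (num_bands, num_materials) spectral library A
    pixel   : (num_bands,) observed spectrum y, modeled as y ~ A @ x
    """
    num_bands, num_materials = library.shape
    # Append a heavily weighted row of ones so that sum(x) is pushed towards 1.
    A = np.vstack([library, sum_to_one_weight * np.ones(num_materials)])
    y = np.append(pixel, sum_to_one_weight)
    abundances, _residual = nnls(A, y)
    return abundances

# Toy example: 5 bands, 3 library materials, pixel = 70/30 mix of materials 0 and 2.
rng = np.random.default_rng(0)
A = rng.random((5, 3))
y = A @ np.array([0.7, 0.0, 0.3])
print(unmix_pixel(A, y).round(3))
```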

Relevance:

20.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing are enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixing of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (or intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises ICA applicability to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix which minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33].

Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. The MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum-volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution.

Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum-volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that, in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.

ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46].

In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices. The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]; we note, however, that the proposed vertex component analysis (VCA) algorithm works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR, yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
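
To make the projection step concrete, here is a minimal serial sketch in the spirit of the pure-pixel extraction described above (an illustrative simplification, not the chapter's full VCA algorithm: the subspace identification and SNR-dependent projections are omitted, and the helper names are invented):

```python
import numpy as np

def extract_endmembers(data, p, seed=0):
    """Pure-pixel extraction by repeated orthogonal projection, as described above.

    data : (num_bands, num_pixels) spectral vectors
    p    : number of endmembers to extract
    """
    rng = np.random.default_rng(seed)
    num_bands, _ = data.shape
    endmembers = np.zeros((num_bands, p))
    for k in range(p):
        # Pick a direction orthogonal to the subspace spanned by the endmembers found so far.
        f = rng.standard_normal(num_bands)
        if k > 0:
            E = endmembers[:, :k]
            f -= E @ np.linalg.pinv(E) @ f   # remove the component lying in span(E)
        f /= np.linalg.norm(f)
        # The pixel with the extreme projection onto f is taken as the new endmember.
        proj = f @ data
        endmembers[:, k] = data[:, np.argmax(np.abs(proj))]
    return endmembers

# Toy example: 4 bands, mixtures of 3 endmembers with one pure pixel per endmember.
rng = np.random.default_rng(1)
M = rng.random((4, 3))                       # true endmember signatures
abund = rng.dirichlet(np.ones(3), size=500).T
abund[:, :3] = np.eye(3)                     # guarantee the pure pixels assumed by the method
Y = M @ abund
print(extract_endmembers(Y, 3).round(3))
```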

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a new parallel method for sparse spectral unmixing of remotely sensed hyperspectral data on commodity graphics processing units (GPUs) is presented. A semi-supervised approach is adopted, which relies on the increasing availability of spectral libraries of materials measured on the ground instead of resorting to endmember extraction methods. The method is based on sparse unmixing by variable splitting and augmented Lagrangian (SUNSAL), which estimates the abundance fractions of the materials. The parallel method operates in a pixel-by-pixel fashion, and its implementation properly exploits the GPU architecture at a low level, thus taking full advantage of the computational power of GPUs. Experimental results obtained for simulated and real hyperspectral datasets reveal significant speedup factors, up to 164 times, with regard to an optimized serial implementation.
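
For intuition, a serial stand-in for the per-pixel sparse regression that SUNSAL addresses (a projected proximal-gradient / ISTA sketch with an l1 penalty and nonnegativity, used here instead of the variable-splitting and augmented-Lagrangian solver the paper actually parallelizes; data and parameters are illustrative):

```python
import numpy as np

def sparse_unmix(A, y, lam=1e-3, iters=500):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 subject to x >= 0
    by projected ISTA (soft-thresholding folded into the nonnegative projection).

    A : (num_bands, library_size) spectral library
    y : (num_bands,) observed pixel spectrum
    """
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant of the gradient
    for _ in range(iters):
        grad = A.T @ (A @ x - y)
        x = np.maximum(x - step * (grad + lam), 0.0)
    return x

# Toy example: library of 20 signatures, pixel made of 2 of them.
rng = np.random.default_rng(0)
A = rng.random((50, 20))
x_true = np.zeros(20)
x_true[[3, 11]] = [0.6, 0.4]
y = A @ x_true
print(np.nonzero(sparse_unmix(A, y) > 1e-2)[0])  # expected to recover indices 3 and 11
```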

Relevance:

20.00%

Publisher:

Abstract:

More than ever, the number of decision support methods and computer-aided diagnostic systems applied to various areas of medicine is increasing. In breast cancer research, much work has been done to reduce false positives when such systems are used as a double-reading method. In this study, we present a set of data mining techniques applied to build a decision support system for breast cancer diagnosis. The method is geared to assist clinical practice in identifying mammographic findings such as microcalcifications, masses and even normal tissue, in order to avoid misdiagnosis. A reliable database was used, with 410 images from about 115 patients, containing previous reviews performed by radiologists identifying microcalcifications, masses and also normal tissue findings. Two feature extraction techniques were used: the gray-level co-occurrence matrix and the gray-level run-length matrix. For classification, we considered various scenarios according to distinct patterns of lesions and several classifiers, in order to determine the best performance in each case. The classifiers used were Naïve Bayes, Support Vector Machines, k-Nearest Neighbors and Decision Trees (J48 and Random Forests). The results in distinguishing mammographic findings revealed high positive predictive values (PPV) and very good accuracy. Related results are also presented for the classification of breast density and the BI-RADS® scale. The best predictive method across all tested groups was the Random Forest classifier, and the best performance was achieved in distinguishing microcalcifications. The conclusions based on the several tested scenarios represent a new perspective on breast cancer diagnosis using data mining techniques.
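
A minimal sketch of this kind of pipeline, pairing gray-level co-occurrence features with a Random Forest (assuming scikit-image's graycomatrix/graycoprops, named greycomatrix/greycoprops in older releases, and scikit-learn's RandomForestClassifier; the patch size, properties and labels are illustrative, not the study's configuration):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def glcm_features(patch):
    """Texture features from the gray-level co-occurrence matrix of an 8-bit patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Toy data: random "patches" with made-up labels, just to show the flow.
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 3, size=40)         # e.g. 0=normal, 1=mass, 2=microcalcification

X = np.array([glcm_features(p) for p in patches])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(X[:5]))
```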

Relevance:

20.00%

Publisher:

Abstract:

Energy systems worldwide are complex and challenging environments. Multi-agent based simulation platforms are increasing at a high rate, as they have proved to be a good option for studying many issues related to these systems, as well as the players that act in this domain. In this scope, the authors' research group has developed a multi-agent system, MASCEM (Multi-Agent System for Competitive Electricity Markets), which simulates the electricity market environment. MASCEM is integrated with ALBidS (Adaptive Learning Strategic Bidding System), which works as a decision support system for market players. The ALBidS system allows MASCEM's negotiating market players to take the best possible advantage of the market context. This paper presents the application of a Support Vector Machines (SVM) based approach to provide decision support to electricity market players. The strategy is tested and validated by including it in ALBidS and comparing it with an Artificial Neural Network approach, yielding promising results. The proposed approach is tested and validated using real electricity market data from MIBEL, the Iberian market operator.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents several forecasting methodologies, based on the application of Artificial Neural Networks (ANN) and Support Vector Machines (SVM), aimed at predicting the solar radiance intensity. The methodologies differ from each other in the information used to train the methods, i.e., complementary environmental fields such as wind speed, temperature and humidity. Additionally, different ways of incorporating the data series information have been considered. Sensitivity testing has been performed on all methodologies in order to achieve the best parameterizations for the proposed approaches. Results show that the SVM approach using the exponential Radial Basis Function (eRBF) kernel achieves the best forecasting results, in half the execution time of the ANN-based approaches.
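
A rough sketch of this kind of setup (assuming scikit-learn's SVR with a standard RBF kernel as a stand-in for the exponential RBF variant; the feature set and data are synthetic placeholders, not the paper's case study):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training table: past radiance plus complementary weather fields.
rng = np.random.default_rng(0)
n = 500
wind, temp, humid = rng.random(n), 15 + 10 * rng.random(n), rng.random(n)
radiance_prev = rng.random(n)
radiance_next = (0.8 * radiance_prev + 0.1 * temp / 25 - 0.05 * humid
                 + 0.02 * rng.standard_normal(n))

X = np.column_stack([radiance_prev, wind, temp, humid])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X, radiance_next)

# One-step-ahead prediction for a new observation.
print(model.predict([[0.6, 0.3, 22.0, 0.4]]))
```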

Relevance:

20.00%

Publisher:

Abstract:

Wind speed forecasting has become an important field of research to support the electricity industry, mainly due to the increasing use of distributed energy sources, largely based on renewable sources. This type of electricity generation is highly dependent on weather variability, particularly the variability of the wind speed. Therefore, accurate wind power forecasting models are required for the operation and planning of wind plants and power systems. A Support Vector Machines (SVM) model for short-term wind speed forecasting is proposed, and its performance is evaluated and compared with several artificial neural network (ANN) based approaches. A case study based on a real database covering 3 years of data, for predicting wind speed at 5-minute intervals, is presented.
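
A sketch of how a short-term forecaster over 5-minute samples can be framed, using lagged wind-speed values as SVM inputs (the synthetic series, lag depth and hyperparameters are assumptions, not the paper's real database or tuning):

```python
import numpy as np
from sklearn.svm import SVR

def make_lagged(series, lags):
    """Turn a 1-D series into (X, y) pairs where each row of X holds the previous `lags` samples."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    return X, y

# Synthetic 5-minute wind-speed series (one day = 288 samples).
t = np.arange(288)
speed = 6 + 2 * np.sin(2 * np.pi * t / 288) + 0.3 * np.random.default_rng(0).standard_normal(288)

X, y = make_lagged(speed, lags=6)            # last 30 minutes as predictors
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[:-12], y[:-12])
print(model.predict(X[-12:]).round(2))       # forecasts for the held-out final hour
```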

Relevance:

20.00%

Publisher:

Abstract:

Malaria, a complex parasitic disease resulting from the interaction between parasite, human host and vector, is one of the main health problems worldwide. Like other parasitic and infectious diseases, malaria plays an important role in evolution, and the role of human genetic variation in resistance to infection has already been demonstrated. After almost half a century of control, malaria persists on Santiago Island where, despite the low endemicity, individuals generally present moderate manifestations, infections below the level detectable by microscopy are diagnosed, and the vector lives very close to the supposedly susceptible population; the frequency of the main human genetic polymorphisms most related to the disease and the population structure of the mosquito vector remain unknown. The general objectives of this thesis are 1) the study of the two classical human host genetic factors related to malaria, namely those associated with sickle cell anaemia and G6PD deficiency, together with the analysis of a possible involvement of PK, and 2) the genetic analysis of the vector mosquito populations, seeking to contribute to the understanding of the epidemiology of the disease on the island and to the choice of appropriate control measures. The work focused on the detection of the allele responsible for haemoglobin S and of polymorphisms in the G6PD and PK genes in unrelated individuals (infected and uninfected), with analysis of their possible association with infection, and also on the genotyping of microsatellite loci of Anopheles arabiensis using PCR-based techniques. Regarding sickle cell anaemia, the frequency of trait carriers (HbAS individuals) and of the HbS allele was 6% and 5%, respectively; for the G6PD variants, it was 0.8% for G6PDA- and 0.0% for G6PDMed, and no association was found between the genotypes of these two factors and the presence of infection. Concerning the PKLR gene, no clear association was found between the analysed polymorphisms and infection status, but a marked linkage disequilibrium between the loci was detected only in the uninfected, which may mean that this apparently conserved region of the gene has been selected for providing protection against infection and/or disease. The genetic diversity of the A. arabiensis populations at eleven microsatellite loci was moderate, with mean He values ranging from 0.481 to 0.522 and Rs from 4 to 5. Genetic differentiation based on 7 polymorphic loci was low (FST = 0.012; p < 0.001) but significant, varying between 0.001 and 0.023 among population pairs. The resistance alleles associated with the Kdr gene were not detected. The low frequency of the G6PD-associated alleles (A- and Med) has important implications for the control strategies defined by the Programa Nacional de Luta contra o Paludismo (PNLP), since primaquine can continue to be administered as a complement to the therapeutic regimens when needed. The A. arabiensis population in Santiago proved to be relatively homogeneous and weakly structured, which may, on the one hand, represent a disadvantage by allowing the likely dispersal of resistance genes. On the other hand, this relative homogeneity may represent an advantage for the introduction of a control programme based on the release of transgenic mosquitoes.

Relevance:

20.00%

Publisher:

Abstract:

Coarse-Grained Reconfigurable Architectures (CGRAs) are emerging as enabling platforms to meet the high performance demanded by modern applications (e.g. 4G, CDMA, etc.). Recently proposed CGRAs offer time-multiplexing and dynamic application parallelism to enhance device utilization and reduce energy consumption, at the cost of additional memory (up to 50% of the overall platform area). To reduce the memory overhead, novel CGRAs employ either statistical compression, intermediate compact representation, or multicasting. Each compaction technique has different properties (i.e., compression ratio, decompression time and decompression energy) and is best suited to a particular class of applications. However, existing research only deals with these methods separately. Moreover, it only analyzes the compression ratio and does not evaluate the associated energy overheads. To tackle these issues, we propose a polymorphic compression architecture that interleaves these techniques in a single platform. The proposed architecture allows each application to take advantage of a separate compression/decompression hierarchy (consisting of various types and implementations of hardware/software decoders) tailored to its needs. Simulation results, using different applications (FFT, matrix multiplication, and WLAN), reveal that the choice of compression hierarchy has a significant impact on compression ratio (up to 52%), decompression energy (up to 4 orders of magnitude), and configuration time (from 33 n to 1.5 s) for the tested applications. Synthesis results reveal that introducing adaptivity incurs a negligible additional overhead (1%) compared to the overall platform area.
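
To make the trade-off concrete, a toy selection heuristic over the three compaction styles mentioned above (all numbers and the scoring function are invented for illustration; they are not measurements or mechanisms from the paper):

```python
# Hypothetical per-technique properties: compression ratio, decompression time and energy,
# in arbitrary normalized units (purely illustrative values).
techniques = {
    "statistical_compression":     {"ratio": 0.48, "time": 8.0, "energy": 6.0},
    "intermediate_representation": {"ratio": 0.60, "time": 3.0, "energy": 2.5},
    "multicasting":                {"ratio": 0.75, "time": 1.0, "energy": 1.0},
}

def pick_hierarchy(app_profile, weights=(1.0, 1.0, 1.0)):
    """Pick the technique minimizing a weighted cost for one application profile.

    app_profile trades off how large the configuration is (favoring ratio)
    against how often the application reconfigures (favoring time and energy).
    """
    w_ratio, w_time, w_energy = weights
    def cost(props):
        return (w_ratio * props["ratio"] * app_profile["config_size"]
                + w_time * props["time"] * app_profile["reconfig_rate"]
                + w_energy * props["energy"] * app_profile["reconfig_rate"])
    return min(techniques, key=lambda name: cost(techniques[name]))

# e.g. a kernel that reconfigures rarely versus one that reconfigures very often
print(pick_hierarchy({"config_size": 10.0, "reconfig_rate": 0.1}))  # favors high compression
print(pick_hierarchy({"config_size": 2.0,  "reconfig_rate": 5.0}))  # favors cheap decompression
```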

Relevance:

20.00%

Publisher:

Abstract:

The eco-epidemiology of T. cruzi infection was investigated on the eastern border of the Panama Canal, in Central Panama. Between 1999 and 2000, 1110 triatomines were collected: 1050 (94.6%) from palm trees, 27 (2.4%) from peri-urban habitats and 33 (3.0%) inside houses. All specimens were identified as R. pallescens. There was no evidence of vector domiciliation. Salivary glands from 380 R. pallescens revealed a natural trypanosome infection rate of 7.6%, while the rate in the rectal ampoule contents of 373 triatomines was 45%. Isoenzyme profiles of the isolated trypanosomes demonstrated that 85.4% (n = 88) were T. cruzi and 14.6% (n = 15) were T. rangeli. Blood meal analysis of 829 R. pallescens demonstrated a zoophilic vector behavior, with opossums as the preferential blood source. Seroprevalence in human samples from both study sites was less than 2%. Our results demonstrate that T. cruzi survives in the area in balanced association with R. pallescens and with several different species of mammals in their natural niches. However, the area poses an imminent risk of infection for its population; consequently, it is important to implement a community educational program regarding disease knowledge and control measures.

Relevance:

20.00%

Publisher:

Abstract:

Even though Chagas disease is rare in the Brazilian Amazon, the conditions for the establishment of domiciliated cycles prevail in many areas where triatomines occur frequently. In Roraima, a previous serological and entomological survey in three agricultural settlements showed the existence of all elements of the transmission cycle, i.e., individuals infected by Trypanosoma cruzi, triatomine species previously found harboring T. cruzi in the broader Amazon region of neighboring countries, and domicile/peridomicile conditions favorable to triatomine colonization. Triatoma maculata was the most frequent species, found in chicken houses in the peridomicile and sporadically within residences. Aiming to investigate whether T. maculata has the potential to transmit T. cruzi in the area, its bionomic characteristics were studied under laboratory conditions: feeding frequency, time to defecation after a blood meal, time spent in voluntary fasting pre- and post-ecdysis, moulting periods, pre-oviposition and oviposition periods and oviposition index, incubation period, egg viability, longevity and mortality rate. Results show that the Passarão population of T. maculata should be considered a potential vector of T. cruzi, since it is able to infest artificial ecotopes in the peridomicile, takes a large number of blood meals during the nymphal cycle, has a relatively short developmental cycle capable of producing 2.9 generations/year, is eclectic with regard to blood sources, defecates immediately after the blood meal while still on the host, and has previously been found naturally infected by T. cruzi.