10 results for General-purpose computing on graphics processing units (GPGPU)

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

100.00%

Abstract:

This article discusses issues related to the organization and reception of information in the context of technology-driven public information services and systems. It stems from the assumption that, in a "technologized" society, the distance between users and information is almost always of a cognitive and socio-cultural nature, a product of our effort to design communication. In this context, we favor the approach of the information sign, seeking to answer how a documentary message turns into information, i.e. a structure recognized as socially useful. Observing the structural, cognitive and communicative aspects of the documentary message, and drawing on Documentary Linguistics, Terminology and Textual Linguistics, we analyze the knowledge management and innovation policy of the Government of the State of São Paulo, which authorizes the use of Web 2.0, and ask to what extent this initiative represents innovation in the library environment.

Relevance:

100.00%

Abstract:

This paper describes a new food classification which assigns foodstuffs according to the extent and purpose of the industrial processing applied to them. Three main groups are defined: unprocessed or minimally processed foods (group 1), processed culinary and food industry ingredients (group 2), and ultra-processed food products (group 3). The use of this classification is illustrated by applying it to data collected in the Brazilian Household Budget Survey, which was conducted in 2002/2003 on a probabilistic sample of 48,470 Brazilian households. The average daily food availability was 1,792 kcal/person, of which 42.5% came from group 1 (mostly rice, beans, meat, and milk), 37.5% from group 2 (mostly vegetable oils, sugar, and flours), and 20% from group 3 (mostly breads, biscuits, sweets, soft drinks, and sausages). The share of group 3 foods increased with income and represented almost one third of all calories in higher-income households. The impact of the replacement of group 1 foods and group 2 ingredients by group 3 products on overall diet quality, eating patterns and health is discussed.
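
To make the share calculation concrete, the sketch below tallies calorie shares by processing group for a small, hypothetical list of items; the items and amounts are illustrative, not the survey data.

```python
# Illustrative only: a toy calculation of calorie shares by processing group,
# using made-up items and amounts (not the Household Budget Survey data).
items = [
    # (name, group, kcal/person/day)
    ("rice", 1, 400), ("beans", 1, 200), ("milk", 1, 160),
    ("vegetable oil", 2, 330), ("sugar", 2, 340),
    ("bread", 3, 180), ("soft drink", 3, 120), ("biscuits", 3, 62),
]

total = sum(kcal for _, _, kcal in items)
for group in (1, 2, 3):
    kcal = sum(k for _, g, k in items if g == group)
    print(f"group {group}: {kcal} kcal ({100 * kcal / total:.1f}%)")
```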

Relevance:

100.00%

Abstract:

In Part I [""Fast Transforms for Acoustic Imaging-Part I: Theory,"" IEEE TRANSACTIONS ON IMAGE PROCESSING], we introduced the Kronecker array transform (KAT), a fast transform for imaging with separable arrays. Given a source distribution, the KAT produces the spectral matrix which would be measured by a separable sensor array. In Part II, we establish connections between the KAT, beamforming and 2-D convolutions, and show how these results can be used to accelerate classical and state of the art array imaging algorithms. We also propose using the KAT to accelerate general purpose regularized least-squares solvers. Using this approach, we avoid ill-conditioned deconvolution steps and obtain more accurate reconstructions than previously possible, while maintaining low computational costs. We also show how the KAT performs when imaging near-field source distributions, and illustrate the trade-off between accuracy and computational complexity. Finally, we show that separable designs can deliver accuracy competitive with multi-arm logarithmic spiral geometries, while having the computational advantages of the KAT.

Relevance:

100.00%

Abstract:

Identifying the correct sense of a word in context is crucial for many tasks in natural language processing (machine translation is an example). State-of-the-art methods for Word Sense Disambiguation (WSD) build models using hand-crafted features that usually capture shallow linguistic information. Complex background knowledge, such as semantic relationships, is typically either not used or used in a specialised manner, due to the limitations of the feature-based modelling techniques employed. On the other hand, empirical results from the use of Inductive Logic Programming (ILP) systems have repeatedly shown that they can use diverse sources of background knowledge when constructing models. In this paper, we investigate whether this ability of ILP systems can be used to improve the predictive accuracy of models for WSD. Specifically, we examine the use of a general-purpose ILP system to construct a set of features using semantic, syntactic and lexical information. This feature set is then used by a common modelling technique in the field (a support vector machine) to construct a classifier for predicting the sense of a word. In our investigation we examine one-shot and incremental approaches to feature-set construction applied to monolingual and bilingual WSD tasks. The monolingual tasks use 32 verbs and 85 verbs and nouns (in English) from the SENSEVAL-3 and SemEval-2007 benchmarks, while the bilingual WSD task consists of 7 highly ambiguous verbs in translating from English to Portuguese. The results are encouraging: the ILP-assisted models show substantial improvements over those that simply use shallow features. In addition, incremental feature-set construction appears to identify smaller and better sets of features. Taken together, the results suggest that the use of ILP with diverse sources of background knowledge provides a way to make substantial progress in the field of WSD.
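
As a rough sketch of the pipeline shape described here (ILP-induced rules turned into Boolean features, then fed to an SVM), the toy example below uses hypothetical rules and contexts with scikit-learn's SVC; it illustrates the idea only, not the authors' system.

```python
# Illustrative sketch (not the authors' pipeline): ILP-induced rules are
# treated as Boolean feature tests, each example becomes a binary vector,
# and a standard SVM is trained on those vectors.
from sklearn.svm import SVC

# Hypothetical ILP-induced rules, each a predicate over a sentence context.
rules = [
    lambda ctx: "bank" in ctx["lemmas"] and "river" in ctx["lemmas"],
    lambda ctx: ctx["pos_next"] == "NN",
    lambda ctx: "money" in ctx["lemmas"],
]

def featurize(ctx):
    return [int(rule(ctx)) for rule in rules]

# Toy labelled contexts for the ambiguous word "bank".
contexts = [
    {"lemmas": {"river", "bank", "fish"}, "pos_next": "NN"},
    {"lemmas": {"bank", "money", "loan"}, "pos_next": "VB"},
    {"lemmas": {"bank", "river", "walk"}, "pos_next": "DT"},
    {"lemmas": {"money", "bank", "account"}, "pos_next": "NN"},
]
senses = ["riverbank", "institution", "riverbank", "institution"]

X = [featurize(c) for c in contexts]
clf = SVC(kernel="linear").fit(X, senses)
print(clf.predict([featurize({"lemmas": {"bank", "loan"}, "pos_next": "NN"})]))
```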

Relevance:

100.00%

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain functions are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphics processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model each neuron. Communication among neurons located in different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons that received random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards with two GPUs each, compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
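
For reference, the per-neuron work assigned to each thread amounts to integrating the Hodgkin-Huxley equations. The sketch below is a plain-Python, single-neuron forward-Euler integration with the standard textbook parameters; it is not the authors' multi-GPU CUDA kernel.

```python
# Single-neuron Hodgkin-Huxley integration (forward Euler) with standard
# textbook parameters; a CPU sketch of the work each CUDA thread would do,
# not the authors' multi-GPU kernel.
import math

C = 1.0                                 # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3       # max conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.4     # reversal potentials, mV

def alpha_beta(V):
    a_m = 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
    b_m = 4.0 * math.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * math.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
    a_n = 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
    b_n = 0.125 * math.exp(-(V + 65.0) / 80.0)
    return a_m, b_m, a_h, b_h, a_n, b_n

def simulate(I_ext=10.0, dt=0.01, t_end=50.0):
    V, m, h, n = -65.0, 0.05, 0.6, 0.32  # resting state
    for _ in range(int(t_end / dt)):
        a_m, b_m, a_h, b_h, a_n, b_n = alpha_beta(V)
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K = g_K * n**4 * (V - E_K)
        I_L = g_L * (V - E_L)
        V += dt * (I_ext - I_Na - I_K - I_L) / C
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
    return V

print(f"membrane potential after 50 ms: {simulate():.2f} mV")
```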

Relevance:

100.00%

Abstract:

This article describes a new food classification based on the extent and purpose of the industrial processing used in their production. Three groups are defined: unprocessed or minimally processed foods (group 1), processed foods used as ingredients in culinary preparations or by the food industry (group 2), and ultra-processed food products (group 3). The use of the classification is illustrated by applying it to data collected in the Household Budget Survey conducted in 2002/2003 on a probabilistic sample of 48,470 Brazilian households. The daily availability was 1,792 kcal/capita, with 42.5% coming from group 1 foods, 37.5% from group 2 and 20% from group 3. The contribution of group 3 increased with household income, accounting for one third of total calories in the most affluent households. The impact on overall diet quality, eating patterns and health conditions that could result from replacing group 1 foods and group 2 ingredients with group 3 food products is discussed.

Relevance:

100.00%

Abstract:

We study the problem of distributed estimation based on the affine projection algorithm (APA), which is developed from Newton's method for minimizing a cost function. The proposed solution is formulated to ameliorate the limited convergence properties of least-mean-square (LMS) type distributed adaptive filters with colored inputs. The analysis of transient and steady-state performances at each individual node within the network is developed by using a weighted spatial-temporal energy conservation relation and confirmed by computer simulations. The simulation results also verify that the proposed algorithm provides not only a faster convergence rate but also an improved steady-state performance as compared to an LMS-based scheme. In addition, the new approach attains an acceptable misadjustment performance with lower computational and memory cost, provided the number of regressor vectors and filter length parameters are appropriately chosen, as compared to a distributed recursive-least-squares (RLS) based method.
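
The core affine projection update on a single node is standard; the sketch below is a minimal NumPy version with assumed step-size and regularization values, identifying an unknown filter from a colored (AR(1)) input. It illustrates the APA recursion only, not the distributed scheme analyzed in the paper.

```python
# Minimal single-node affine projection algorithm (APA) update in NumPy.
# The K most recent regressors are stacked into U; mu and delta are assumed
# step-size and regularization values, not the paper's distributed scheme.
import numpy as np

rng = np.random.default_rng(0)
M, K, mu, delta = 8, 4, 0.5, 1e-4      # filter length, projection order
w_true = rng.standard_normal(M)        # unknown system to identify
w = np.zeros(M)

# Colored input: first-order autoregressive process.
x = np.zeros(2000)
for i in range(1, x.size):
    x[i] = 0.9 * x[i - 1] + rng.standard_normal()

for i in range(M + K, x.size):
    # Rows of U are the K most recent length-M regressor vectors.
    U = np.array([x[i - k - M + 1:i - k + 1][::-1] for k in range(K)])
    d = U @ w_true + 0.01 * rng.standard_normal(K)   # noisy desired signal
    e = d - U @ w
    w += mu * U.T @ np.linalg.solve(U @ U.T + delta * np.eye(K), e)

print("final error norm:", np.linalg.norm(w - w_true))
```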

Relevance:

100.00%

Abstract:

The influence of the concentration and incorporation time of different drying excipients on the processing yields and physical properties of Eugenia dysenterica DC spray-dried extracts was investigated following a factorial design. Under the established conditions, the process yield ranged from 57.55 to 89.14%, and in most experiments the recovered products presented suitable flowability and compressibility, as demonstrated by the Hausner factor, Carr index, and angle of repose. Additionally, in general, the parameters related to the dried products' flowability varied over a range acceptable for pharmaceutical purposes. An analysis of variance (ANOVA) showed that both factors and some of their interactions significantly affected most of the investigated responses at different levels. Mannitol proved to be an interesting alternative as an excipient for the drying of herbal extracts, even at low concentrations such as 12.5%. Furthermore, these results imply that the best condition to obtain dry extracts of E. dysenterica with high performance and adequate pharmacotechnical properties involves the lowest concentration and the highest incorporation time of mannitol.
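
For reference, the Hausner factor and Carr index mentioned above are simple functions of bulk and tapped density; the snippet below computes both for an illustrative pair of density values, not data from the study.

```python
# Hausner ratio and Carr (compressibility) index from bulk and tapped
# density; the density values are illustrative, not the study's data.
def hausner_ratio(bulk_density, tapped_density):
    return tapped_density / bulk_density

def carr_index(bulk_density, tapped_density):
    return 100.0 * (tapped_density - bulk_density) / tapped_density

bulk, tapped = 0.45, 0.54        # g/mL, illustrative values
print(f"Hausner ratio: {hausner_ratio(bulk, tapped):.2f}")   # ~1.20
print(f"Carr index:    {carr_index(bulk, tapped):.1f}%")     # ~16.7%
```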

Relevance:

100.00%

Abstract:

Abad, CCC, Prado, ML, Ugrinowitsch, C, Tricoli, V, and Barroso, R. Combination of general and specific warm-ups improves leg-press one repetition maximum compared with specific warm-up in trained individuals. J Strength Cond Res 25(8): 2242-2245, 2011. Accurate assessment of muscular strength is critical for exercise prescription and functional evaluation. The warm-up protocol may affect the precision of the 1 repetition maximum (1RM) test. Testing guidelines recommend performing both general and specific warm-ups before strength tests. The general warm-up intends to raise muscle temperature, whereas the specific warm-up aims to increase neuromuscular activation. Although there is scientific evidence for performing the specific warm-up, the effects of the general warm-up on strength tests are still unclear. The purpose of this study was to investigate whether a combined general and specific warm-up (G + SWU) protocol would improve leg-press 1RM values compared with a specific warm-up (SWU) protocol. Thirteen participants were tested for leg-press 1RM under 2 warm-up conditions. In the first condition, participants performed the SWU only, which was composed of 1 set of 8 repetitions at approximately 50% of the estimated 1RM followed by another set of 3 repetitions at 70% of the estimated 1RM. In the second condition (G + SWU), participants performed the 1RM test after a 20-minute general warm-up on a stationary bicycle at 60% of HRmax and the same specific warm-up as in the SWU. Values of 1RM in SWU and in G + SWU were compared by a paired t-test, and the significance level was set at p ≤ 0.05. Strength values were on average 8.4% (p = 0.002) higher in the G + SWU compared with the SWU. These results suggest that the G + SWU induced temperature-dependent neuromuscular adjustments that increased muscle force production capacity. Therefore, these results support the recommendations of the testing guidelines to perform a moderate-intensity general warm-up in addition to the specific warm-up before maximum strength assessments.
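
The reported comparison is a paired t-test between the two warm-up conditions; the snippet below runs that test on made-up 1RM values (not the study's data) using scipy.

```python
# Paired t-test comparing 1RM values under two warm-up conditions,
# on made-up numbers (not the study's data), using scipy.stats.
from scipy import stats

swu   = [180, 200, 175, 190, 210, 185, 195]   # kg, specific warm-up only
g_swu = [195, 214, 188, 205, 228, 200, 212]   # kg, general + specific

t, p = stats.ttest_rel(g_swu, swu)
mean_gain = sum(b - a for a, b in zip(swu, g_swu)) / len(swu)
print(f"t = {t:.2f}, p = {p:.4f}, mean gain = {mean_gain:.1f} kg")
```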

Relevance:

100.00%

Abstract:

A novel cryptography method based on the Lorenz attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, the message and a chaotic system. It ensures that the algorithm yields a secure codification, even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow, and the other parallel and fast. Our algorithm assures the integrity of the ciphertext (we know if it has been altered, which is not assured by traditional algorithms) and consequently its authenticity. Numerical experiments are presented and discussed, showing the behavior of the method in terms of security and performance. The fast version of the algorithm has performance comparable to AES, an encryption standard in widespread commercial use, but it is more secure, which makes it immediately suitable for general-purpose cryptography applications. An internet page has been set up, which enables readers to test the algorithm and also to try to break the cipher.
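
As a toy illustration of the general idea of deriving a keystream from a chaotic trajectory, the sketch below integrates the standard Lorenz system (σ = 10, ρ = 28, β = 8/3) and XORs the resulting bytes with a message; it is not the authors' algorithm and is not cryptographically secure.

```python
# Toy illustration only: deriving a keystream from the Lorenz system
# (sigma=10, rho=28, beta=8/3) and XOR-ing it with a message. This is NOT
# the paper's algorithm and is NOT cryptographically secure.
def lorenz_keystream(n_bytes, x=1.0, y=1.0, z=1.0, dt=0.01):
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    out = []
    for _ in range(n_bytes):
        for _ in range(10):                      # a few Euler steps per byte
            dx = sigma * (y - x)
            dy = x * (rho - z) - y
            dz = x * y - beta * z
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append(int(abs(x) * 1e6) % 256)      # crude byte extraction
    return bytes(out)

message = b"attack at dawn"
key = lorenz_keystream(len(message), x=1.2345)   # initial condition ~ password
cipher = bytes(m ^ k for m, k in zip(message, key))
plain = bytes(c ^ k for c, k in zip(cipher, key))
print(cipher.hex(), plain)
```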