961 results for General-purpose computing on graphics processing units (GPGPU)


Relevance: 100.00%

Abstract:

The effects of five technological procedures and of the contents of total anthocyanins and condensed tannins on 19 fermentation-related aroma compounds of young red Mencia wines were studied. Multifactor ANOVA revealed that levels of those volatiles changed significantly over the length of storage in bottles and, to a lesser extent, due to the other technological factors considered; total anthocyanins and condensed tannins also changed significantly as a result of the five practices assayed. Five aroma compounds possessed an odour activity value >1 in all wines, and another four in some wines. Linear correlation between volatile compounds and general phenolic composition revealed that total anthocyanins were highly related to 14 different aroma compounds. Multifactor ANOVA, considering the content of total anthocyanins as a sixth random factor, revealed that this parameter significantly affected the contents of ethyl lactate, ethyl isovalerate, 1-pentanol and ethyl octanoate. Thus, the aroma of young red Mencia wines may be affected by levels of total anthocyanins.
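The odour activity value (OAV) screening mentioned above reduces to a simple ratio: a compound is considered aroma-active when its concentration exceeds its odour perception threshold, i.e. OAV > 1. A minimal sketch, using compound names from the abstract but entirely made-up concentrations and thresholds (not measurements from the study):

```python
def odour_activity_value(concentration_ug_l: float, threshold_ug_l: float) -> float:
    """OAV is the ratio of a compound's concentration to its perception threshold."""
    return concentration_ug_l / threshold_ug_l

# Placeholder data (illustrative numbers only): compound -> (concentration, threshold) in ug/L.
wines = {
    "ethyl octanoate": (120.0, 5.0),
    "ethyl isovalerate": (1.5, 3.0),
}

# Keep only compounds likely to contribute to the perceived aroma (OAV > 1).
active = {name: odour_activity_value(c, t)
          for name, (c, t) in wines.items()
          if odour_activity_value(c, t) > 1}
print(active)  # compounds with OAV > 1
```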

Relevance: 100.00%

Abstract:

In this paper, an architecture based on a scalable and flexible set of evolvable processing arrays is presented. FPGA-native Dynamic Partial Reconfiguration (DPR) is used for evolution, which is done intrinsically, allowing the system to adapt autonomously to variable run-time conditions, including the presence of transient and permanent faults. The architecture supports different modes of operation, namely independent, parallel, cascaded and bypass modes. These modes of operation can be used during evolution or during normal operation. The evolvability of the architecture is combined with fault-tolerance techniques to enhance the platform with self-healing features, making it suitable for applications that require both high adaptability and reliability. Experimental results show that such a system may benefit from accelerated evolution times, increased performance and improved dependability, mainly by increasing fault tolerance for transient and permanent faults, as well as by providing some fault identification capabilities. The evolvable hardware array shown is tailored for window-based image processing applications.

Relevance: 100.00%

Abstract:

The aim of the paper is to discuss the use of knowledge models to formulate general applications. First, the paper presents the recent evolution of the software field, where increasing attention is paid to conceptual modeling. Then, the current state of knowledge modeling techniques is described, where increased reliability is available through modern knowledge acquisition techniques and supporting tools. The KSM (Knowledge Structure Manager) tool is described next. First, the concept of a knowledge area is introduced as a building block that bundles the methods to perform a collection of tasks together with the bodies of knowledge providing the basic methods to perform the basic tasks. Then, the CONCEL language for defining vocabularies of domains and the LINK language for formulating methods are introduced. Finally, the object-oriented implementation of a knowledge area is described, and a general methodology for application design and maintenance supported by KSM is proposed. To illustrate the concepts and methods, an example system for intelligent traffic management in a road network is described. This example is followed by a proposal for generalizing the resulting architecture for reuse. Finally, some concluding comments are offered on the feasibility of using the knowledge modeling tools and methods for general application design.

Relevance: 100.00%

Abstract:

With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. However, in most of these efforts, cell damage is characterized only by purely mechanistic criteria, as a function of quantities such as stress, strain or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has rarely been explored. In particular, a quantitative mechanically based model of electrophysiological impairment in neuronal cells has only very recently been proposed (Jerusalem et al., 2013). In this paper, we present the implementation details of Neurite: the finite difference parallel program used in this reference. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite is able to simulate the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. The simulation of the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations that increase in complexity as the network of simulated cells grows. The solvers implemented in Neurite, both explicit and implicit, were therefore parallelized using graphics processing units in order to reduce the simulation costs of large scale scenarios. Cable theory and Hodgkin-Huxley models were implemented to account for the electrophysiological passive and active regions of a neurite, respectively, whereas a coupled mechanical model accounting for the neurite's mechanical behavior within its surrounding medium was adopted as the link between electrophysiology and mechanics (Jerusalem et al., 2013). This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon, a segmented dendritic tree, and a damaged axon. The capabilities of the program to deal with large scale scenarios, segmented neuronal structures, and functional deficits under mechanical loading are specifically highlighted.
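The passive part of the electrophysiological model mentioned above (cable theory) lends itself to a compact sketch. The following is a minimal explicit finite-difference update of the passive cable equation only, not Neurite's actual solver: it omits the Hodgkin-Huxley active channels and the mechanical coupling, and all parameter values are illustrative placeholders.

```python
# Passive cable equation:  cm * dV/dt = (a / (2*ri)) * d^2V/dx^2 - V/rm
# discretized with explicit (forward Euler) finite differences.

N = 101          # spatial compartments
DX = 1e-5        # compartment length (m)
DT = 5e-7        # time step (s); kept under the explicit stability limit
CM = 1e-2        # membrane capacitance per unit area (F/m^2)
RM = 1.0         # specific membrane resistance (ohm * m^2)
RI = 1.0         # axial resistivity (ohm * m)
A = 1e-6         # neurite radius (m)

def step(v):
    """One explicit Euler update; the end compartments are held clamped."""
    v_new = list(v)
    for i in range(1, N - 1):
        laplacian = (v[i + 1] - 2 * v[i] + v[i - 1]) / DX**2
        v_new[i] = v[i] + (DT / CM) * ((A / (2 * RI)) * laplacian - v[i] / RM)
    return v_new

# Clamp a 100 mV depolarisation at the left end and let it spread and decay.
v = [0.0] * N
v[0] = 0.1
for _ in range(1000):
    v = step(v)
# v now decays monotonically with distance along the cable.
```

In Neurite, per the abstract, this kind of per-compartment update is what maps naturally onto GPU threads, since each compartment's new voltage depends only on its immediate neighbours.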

Relevance: 100.00%

Abstract:

Video analytics play a critical role in most recent traffic monitoring and driver assistance systems. In this context, the correct detection and classification of surrounding vehicles through image analysis has been the focus of extensive research in recent years. Most of the work reported on image-based vehicle verification makes use of supervised classification approaches and resorts to techniques such as histograms of oriented gradients (HOG), principal component analysis (PCA), and Gabor filters, among others. Unfortunately, existing approaches are lacking in two respects: first, comparison between methods using a common body of work has not been addressed; second, no study of the combination potential of popular features for vehicle classification has been reported. In this study, the performance of the different techniques is first reviewed and compared using a common public database. Then, the combination capabilities of these techniques are explored and a methodology is presented for the fusion of classifiers built upon them, also taking into account the vehicle pose. The study unveils the limitations of single-feature based classification and makes clear that fusion of classifiers is highly beneficial for vehicle verification.
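The late-fusion idea described above can be sketched as a weighted combination of per-feature classifier scores: each single-feature classifier (e.g. one trained on HOG, one on PCA projections, one on Gabor responses) emits a vehicle/non-vehicle posterior score, and a fusion rule combines them before thresholding. The score values below are placeholders, not outputs of the study's actual classifiers:

```python
def fuse_scores(scores, weights=None):
    """Weighted-sum fusion of per-classifier posterior scores in [0, 1].

    With no weights given, this is plain score averaging (soft voting).
    """
    if weights is None:
        weights = [1.0 / len(scores)] * len(scores)
    return sum(w * s for w, s in zip(weights, scores))

def verify_vehicle(scores, threshold=0.5):
    """Final vehicle/non-vehicle decision on the fused score."""
    return fuse_scores(scores) >= threshold

# Example: the HOG-based classifier is fairly sure, the PCA-based one is
# unsure, the Gabor-based one leans positive -> fused decision is "vehicle".
print(verify_vehicle([0.9, 0.5, 0.7]))  # True
```

A pose-aware variant, as the abstract suggests, would select the weights per pose bin (front, rear, side), since each feature's reliability varies with viewpoint.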

Relevance: 100.00%

Abstract:

Many computer vision and human-computer interaction applications developed in recent years need to evaluate complex and continuous mathematical functions as an essential step toward proper operation. However, rigorous evaluation of this kind of function often implies a very high computational cost, unacceptable in real-time applications. To alleviate this problem, functions are commonly approximated by simpler piecewise-polynomial representations. Following this idea, we propose a novel, efficient, and practical technique to evaluate complex and continuous functions using a nearly optimal design of two types of piecewise linear approximations in the case of a large budget of evaluation subintervals. To this end, we develop a thorough error analysis that yields asymptotically tight bounds to accurately quantify the approximation performance of both representations. It improves upon previous error estimates and allows the user to control the trade-off between the approximation error and the number of evaluation subintervals. To guarantee real-time operation, the method is suitable for, but not limited to, an efficient implementation on modern Graphics Processing Units (GPUs), where it outperforms previous alternative approaches by exploiting the fixed-function interpolation routines present in their texture units. The proposed technique is a perfect match for any application requiring the evaluation of continuous functions. We have measured its quality and efficiency in detail on several functions, in particular the Gaussian function, because it is extensively used in many areas of computer vision and cybernetics and is expensive to evaluate.
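The piecewise linear evaluation scheme can be sketched with its simplest (uniform-breakpoint) variant, which the nearly optimal designs above refine; the lookup-and-interpolate step is exactly what a GPU texture unit performs in fixed-function hardware. Everything below is an illustration under that simplification, not the authors' implementation, specialised to the Gaussian exp(-x^2/2) on [0, 4]:

```python
import math

def build_lut(f, lo, hi, n):
    """Sample f at n+1 uniformly spaced breakpoints (a CPU analogue of a 1-D texture)."""
    xs = [lo + i * (hi - lo) / n for i in range(n + 1)]
    return xs, [f(x) for x in xs]

def eval_pwl(x, lo, hi, ys):
    """Linear interpolation between neighbouring samples, as a texture fetch would do."""
    n = len(ys) - 1
    t = (x - lo) / (hi - lo) * n      # fractional breakpoint index
    i = min(int(t), n - 1)
    frac = t - i
    return ys[i] * (1 - frac) + ys[i + 1] * frac

gauss = lambda x: math.exp(-x * x / 2)
lo, hi, n = 0.0, 4.0, 256
_, ys = build_lut(gauss, lo, hi, n)

# Max error over a dense probe grid; for uniform linear interpolation of a
# smooth f it is bounded by h^2/8 * max|f''|, i.e. O(1/n^2).
err = max(abs(eval_pwl(x, lo, hi, ys) - gauss(x))
          for x in [lo + k * (hi - lo) / 10000 for k in range(10001)])
print(err)  # well below 1e-4 for n = 256
```

The h^2/8 * max|f''| bound (with max|f''| = 1 for this Gaussian) predicts roughly 3e-5 here, which illustrates the kind of error/subinterval trade-off the paper's analysis makes precise.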

Relevance: 100.00%

Abstract:

Objective To assess the effect of additional training of practice nurses and general practitioners in patient centred care on the lifestyle and psychological and physiological status of patients with newly diagnosed type 2 diabetes.

Relevance: 100.00%

Abstract:

The spectrum of immunogenic epitopes presented by the H2-IAb MHC class II molecule to CD4+ T cells has been defined for two different (clade B and clade D) HIV envelope (gp140) glycoproteins. Hybridoma T cell lines were generated from mice immunized by a sequential prime and boost regime with DNA, recombinant vaccinia viruses, and protein. The epitopes recognized by reactive T cell hybridomas were then characterized with overlapping peptides synthesized to span the entire gp140 sequence. Evidence of clonality was also assessed with antibodies to T cell receptor Vα and Vβ chains. A total of 80 unique clonotypes were characterized from six individual mice. Immunogenic peptides were identified within only four regions of the HIV envelope. These epitope hotspots comprised relatively short sequences (≈20–80 aa in length) that were generally bordered by regions of heavy glycosylation. Analysis in the context of the gp120 crystal structure showed a pattern of uniform distribution to exposed, nonhelical strands of the protein. A likely explanation is that the physical location of the peptide within the native protein leads to differential antigen processing and consequent epitope selection.

Relevance: 100.00%

Abstract:

Summary. The EU's attempts to adopt an EU-wide instrument on the right of access to legal aid in criminal proceedings have not been successful so far. The issue was originally part of Measure C of the Roadmap for criminal procedural rights,1 but due to political difficulties legal aid was dropped from the agenda. However, agreement was reached on this topic on a different plane when the United Nations General Assembly (UNGA) adopted the world's first international instrument dedicated to access to legal aid in December 2012.2 This policy brief argues that the EU should carry on in the 'spirit' of these recent developments and adopt a directive providing suspects and defendants with access to legal aid.

1 Council Resolution of 30 November 2009 on a Roadmap for strengthening procedural rights of suspected or accused persons in criminal proceedings, OJ C 295/1, 4 December 2009; this Council Resolution will hereafter be referred to as the 'Roadmap'; for further information see M. Jimeno-Bulnes, 'The EU Roadmap for Strengthening Procedural Rights of Suspected or Accused Persons in Criminal Proceedings', 4 EUCrim (2009), 157-161.

2 United Nations Principles and Guidelines on Access to Legal Aid in Criminal Justice Systems, A/Res/67/187, 20 December 2012; this will hereafter be referred to as the 'Resolution'.