50 results for General-purpose computing on graphics processing units (GPGPU)


Relevance:

100.00%

Publisher:

Abstract:

In this paper results are shown to indicate the efficacy of a direct connection between the human nervous system and a computer network. Experimental results obtained thus far from a study lasting over three months are presented, with particular emphasis placed on the direct interaction between the human nervous system and a piece of wearable technology. An overview of the present state of neural implants is given, together with a range of application areas considered thus far. A view is also taken as to what may be possible with implant technology as a general-purpose human-computer interface in the future.

Relevance:

100.00%

Publisher:

Abstract:

This paper investigates the effect of time offset errors on the partial parallel interference canceller (PIC) and compares its performance against that of the standard PIC. The BER performance of the standard and partial interference cancellers is simulated in a near-far environment with varying time offset errors. These simulations indicate that, whilst timing errors significantly affect the performance of both schemes, they do not diminish the gains realised by the partial PIC over the standard PIC.
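
As a rough, self-contained illustration of the partial-cancellation idea (a sketch, not the authors' simulator), the Python fragment below runs one stage of parallel interference cancellation for a synchronous BPSK CDMA model and contrasts full subtraction of the estimated interference with subtraction of only a fraction lam of it; the spreading codes, the near-far power spread and the weight lam are assumed values, and time offset errors, the focus of the paper, are omitted for brevity.

    # One stage of partial parallel interference cancellation for synchronous BPSK CDMA.
    # All parameters (spreading codes, powers, partial weight lam) are illustrative
    # assumptions; time offset errors are not modelled in this sketch.
    import numpy as np

    rng = np.random.default_rng(0)
    K, N, lam, noise_std = 8, 31, 0.7, 0.3      # users, spreading length, partial weight, noise

    codes = rng.choice([-1.0, 1.0], size=(K, N)) / np.sqrt(N)   # unit-energy spreading codes
    amps = rng.uniform(0.5, 2.0, size=K)                        # near-far power spread
    bits = rng.choice([-1.0, 1.0], size=K)                      # transmitted BPSK symbols

    r = (amps * bits) @ codes + noise_std * rng.standard_normal(N)   # received chip vector

    y = codes @ r                        # matched-filter (conventional detector) outputs
    b_hat = np.sign(y)                   # tentative decisions used to rebuild interference

    R = codes @ codes.T                  # code cross-correlation matrix
    mai = (R - np.eye(K)) @ (amps * b_hat)   # estimated multiple-access interference

    y_full = y - mai                     # standard PIC: subtract all estimated MAI
    y_partial = y - lam * mai            # partial PIC: subtract only a fraction lam of it

    print("true bits:   ", bits)
    print("conventional:", np.sign(y))
    print("full PIC:    ", np.sign(y_full))
    print("partial PIC: ", np.sign(y_partial))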

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the design, implementation and testing of a high-speed controlled stereo “head/eye” platform which facilitates the rapid redirection of gaze in response to visual input. It details the mechanical device, which is based around geared DC motors, and describes hardware aspects of the controller and vision system, which are implemented on a reconfigurable network of general-purpose parallel processors. The servo-controller is described in detail and higher-level gaze and vision constructs are outlined. The paper gives performance figures gained both from mechanical tests on the platform alone and from closed-loop tests on the entire system using visual feedback from a feature detector.
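
For illustration only, the sketch below shows a discretised PID position loop of the general kind such a servo-controller might use for a single gaze axis; the gains, sample rate and first-order motor model are assumptions for this sketch, not values from the paper.

    # Illustrative discretised PID position loop for one gaze axis driving a simple
    # first-order model of a geared DC motor. Gains, sample time and the motor model
    # are assumptions, not values taken from the paper.
    dt, kp, ki, kd = 0.001, 40.0, 5.0, 0.8      # 1 kHz servo rate and illustrative gains
    tau, gain = 0.05, 1.0                       # assumed motor time constant and gain

    theta = omega = integ = prev_err = 0.0
    target = 0.3                                # desired gaze angle (rad), e.g. from vision

    for _ in range(500):                        # simulate 0.5 s of closed-loop motion
        err = target - theta
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv  # PID drive signal sent to the motor
        prev_err = err
        omega += dt * (gain * u - omega) / tau  # motor velocity response
        theta += dt * omega                     # integrate velocity to gaze angle

    print(f"gaze angle after 0.5 s: {theta:.3f} rad (target {target} rad)")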

Relevance:

100.00%

Publisher:

Abstract:

The authors compare the performance of two types of controllers: one based on the multilayered network and the other based on the single-layered CMAC (cerebellar model articulation controller) network. The neurons (information processing units) in the multilayered network use Gaussian activation functions. The control scheme considered is a predictive control algorithm, along the lines used by Willis et al. (1991) and Kambhampati and Warwick (1991). The process selected as a test bed is a continuous stirred tank reactor. The reaction taking place is an irreversible exothermic reaction in a constant-volume reactor cooled by a single coolant stream. This reactor is a simplified version of the first tank in the two-tank system given by Henson and Seborg (1989).
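
As a minimal sketch of a network with Gaussian activation functions (assumptions throughout: the centres, width and toy target data are illustrative stand-ins for reactor measurements), the Python fragment below builds a radial-basis network and fits its output weights by least squares, the sort of model that could then serve as the one-step-ahead predictor inside a predictive control loop.

    # Minimal radial-basis network with Gaussian hidden units, trained by linear least
    # squares; the toy target function, centres and width are illustrative stand-ins
    # for the reactor data a one-step-ahead predictor would be fitted to.
    import numpy as np

    rng = np.random.default_rng(1)

    def gaussian_features(x, centres, width):
        """Hidden-layer outputs: one Gaussian bump per centre."""
        d2 = (x[:, None] - centres[None, :]) ** 2
        return np.exp(-d2 / (2.0 * width ** 2))

    x = rng.uniform(0.0, 1.0, 200)                                 # toy "plant" input data
    y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(200)    # noisy nonlinear response

    centres = np.linspace(0.0, 1.0, 12)                   # fixed Gaussian centres
    width = 0.1
    Phi = gaussian_features(x, centres, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)           # output-layer weights

    x_test = np.array([0.25, 0.5, 0.75])
    print(gaussian_features(x_test, centres, width) @ w)  # network predictions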

Relevance:

100.00%

Publisher:

Abstract:

This study investigates the production and on-line processing of English tense morphemes by sequential bilingual (L2) Turkish-speaking children with more than three years of exposure to English. Thirty-nine 6–9-year-old L2 children and 28 typically developing age-matched monolingual (L1) children were administered the production component for third person –s and past tense of the Test for Early Grammatical Impairment (Rice & Wexler, 1996) and participated in an on-line word-monitoring task involving grammatical and ungrammatical sentences with presence/omission of tense (third person –s, past tense –ed) and non-tense (progressive –ing, possessive ‘s) morphemes. The L2 children’s performance on the on-line task was compared to that of children with Specific Language Impairment (SLI) in Montgomery & Leonard (1998, 2006) to ascertain similarities and differences between the two populations. Results showed that the L2 children were sensitive to the ungrammaticality induced by the omission of tense morphemes, despite variable production. This reinforces the claim that underlying syntactic representations are intact in child L2 acquisition despite non-target-like production (Haznedar & Schwartz, 1997).

Relevance:

100.00%

Publisher:

Abstract:

This article analyses the results of an empirical study of the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence on unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial emails (UCE) received by customers as a consequence of their registration and submission of personal information to a website. Part One of the article presents a conceptual and normative account of data protection, with a discussion of the ethical values on which EU data protection law is grounded and an outline of the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which unveil a significant divergence between EU legal theory and practice in data protection. Although a wide majority of the websites in the sample (69%) have in place a system to ask for separate consent for engaging in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The test with UCE shows that only one out of three websites (30.5%) respects the will of the data subject not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and will send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law. In this respect, it suggests that the standard of implementation, information and supervision by the UK authorities is inadequate, especially in light of the clarifications provided at EU level.

Relevance:

100.00%

Publisher:

Abstract:

Explanations of the marked individual differences in elementary school mathematical achievement and mathematical learning disability (MLD or dyscalculia) have involved domain-general factors (working memory, reasoning, processing speed and oral language) and numerical factors that include single-digit processing efficiency and multi-digit skills such as number system knowledge and estimation. This study of third graders (N = 258) finds that both domain-general and numerical factors contribute independently to explaining variation in three significant arithmetic skills: basic calculation fluency, written multi-digit computation, and arithmetic word problems. Estimation accuracy and number system knowledge show the strongest associations with every skill, and their contributions are independent both of each other and of other factors. Different domain-general factors independently account for variation in each skill. Numeral comparison, a single-digit processing skill, uniquely accounts for variation in basic calculation. Subsamples of children with MLD (at or below the 10th percentile, n = 29) are compared with low achievement (LA; 11th to 25th percentiles, n = 42) and typical achievement (above the 25th percentile, n = 187). Examination of these groups, and of subsets with persistent difficulties, supports a multiple-deficits view of number difficulties: most children with number difficulties exhibit deficits in both domain-general and numerical factors. The only factor deficit common to all children with persistent MLD is in multi-digit skills. These findings indicate that many factors matter, but multi-digit skills matter most, in third grade mathematical achievement.

Relevance:

100.00%

Publisher:

Abstract:

The experience of learning and using a second language (L2) has been shown to affect the grey matter (GM) structure of the brain. Importantly, GM density in several cortical and subcortical areas has been shown to be related to performance in L2 tasks. Here we show that bilingualism can lead to increased GM volume in the cerebellum, a structure that has been related to the processing of grammatical rules. Additionally, the cerebellar GM volume of highly proficient L2 speakers is correlated with their performance in a task tapping into grammatical processing in an L2, demonstrating the importance of the cerebellum for the establishment and use of grammatical rules in an L2.

Relevance:

100.00%

Publisher:

Abstract:

During April and May 2010 the ash cloud from the eruption of the Icelandic volcano Eyjafjallajökull caused widespread disruption to aviation over northern Europe. The location and impact of the eruption meant that a wealth of observations of the ash cloud was obtained, which can be used to assess modelling of the long-range transport of ash in the troposphere. The UK FAAM (Facility for Airborne Atmospheric Measurements) BAe-146-301 research aircraft overflew the ash cloud on a number of days during May. The aircraft carries a downward-looking lidar which detected the ash layer through the backscatter of the laser light. In this study ash concentrations derived from the lidar are compared with simulations of the ash cloud made with NAME (Numerical Atmospheric-dispersion Modelling Environment), a general-purpose atmospheric transport and dispersion model. The simulated ash clouds are compared to the lidar data to determine how well NAME simulates the horizontal and vertical structure of the ash clouds. Comparison between the ash concentrations derived from the lidar and those from NAME is used to estimate the fraction of ash emitted in the eruption that is transported over long distances, relative to the total emission of tephra. In making these comparisons, possible position errors in the simulated ash clouds are identified and accounted for. The ash layers seen by the lidar considered in this study were thin, with typical depths of 550–750 m. The vertical structure of the ash cloud simulated by NAME was generally consistent with the observed ash layers, although the layers in the simulated ash clouds that are identified with observed ash layers are about twice the depth of the observed layers. The structure of the simulated ash clouds was sensitive to the assumed profile of ash emissions. In terms of horizontal and vertical structure, the best results were obtained by assuming that the emission occurred at the top of the eruption plume, consistent with the observed structure of eruption plumes. However, early in the period, when the intensity of the eruption was low, assuming that the emission of ash was uniform with height gave better guidance on the horizontal and vertical structure of the ash cloud. Comparison of the lidar concentrations with those from NAME shows that 2–5% of the total mass erupted by the volcano remained in the ash cloud over the United Kingdom.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a software-based study of a hardware-based, non-sorting median calculation method for a set of integer numbers. The method divides the binary representation of each integer element in the set into bit slices in order to find the element located in the middle position. The method exhibits a linear complexity order, and our analysis shows that the best performance in execution time is obtained when 4-bit slices are used for 8-bit and 16-bit integers, for almost any data set size. Results suggest that a software implementation of the bit-slice method for median calculation outperforms sorting-based methods, with increasing improvement for larger data set sizes. For data set sizes of N > 5, our simulations show an improvement of at least 40%.
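
The Python sketch below illustrates the spirit of such a bit-slice selection (the paper's hardware-oriented algorithm may differ in detail): it scans the data once per 4-bit slice, most significant slice first, and uses a small histogram to narrow down the value range that contains the middle element.

    # Non-sorting median selection by 4-bit slices, most significant slice first.
    # A sketch in the spirit of the method described above; it assumes non-negative
    # integers that fit in `bits` bits and returns the lower median.
    def bitslice_median(data, bits=16, slice_bits=4):
        k = (len(data) - 1) // 2                 # 0-based rank of the lower median
        prefix = 0
        candidates = data
        for shift in range(bits - slice_bits, -1, -slice_bits):
            counts = [0] * (1 << slice_bits)     # histogram of the current bit slice
            for v in candidates:
                counts[(v >> shift) & ((1 << slice_bits) - 1)] += 1
            for digit, c in enumerate(counts):   # locate the slice value holding rank k
                if k < c:
                    break
                k -= c
            prefix = (prefix << slice_bits) | digit
            candidates = [v for v in candidates if (v >> shift) == prefix]
        return prefix

    print(bitslice_median([9, 3, 27, 1, 200, 14, 7]))    # -> 9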

Relevance:

100.00%

Publisher:

Abstract:

The ability to match individual patients to tailored treatments has the potential to greatly improve outcomes for individuals suffering from major depression. In particular, while the vast majority of antidepressant treatments affect either serotonin or noradrenaline or a combination of these two neurotransmitters, it is not known whether there are particular patients or symptom profiles which respond preferentially to the potentiation of serotonin over noradrenaline, or vice versa. Experimental medicine models suggest that the primary mode of action of these treatments may be to remediate negative biases in emotional processing. Such models may provide a useful framework for interrogating the specific actions of antidepressants. Here, we therefore review evidence from studies examining the effects of drugs which potentiate serotonin, noradrenaline or a combination of both neurotransmitters on emotional processing. These results suggest that antidepressants targeting serotonin and noradrenaline may have some specific actions on emotion and reward processing which could be used to improve tailoring of treatment or to understand the effects of dual-reuptake inhibition. Specifically, serotonin may be particularly important in alleviating distress symptoms, while noradrenaline may be especially relevant to anhedonia. The data reviewed here also suggest that noradrenergic-based treatments may have earlier effects on emotional memory than those which affect serotonin.

Relevance:

100.00%

Publisher:

Abstract:

IEEE 754 floating-point arithmetic is widely used in modern, general-purpose computers. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. Modifying the IEEE arithmetic so that it uses transreal arithmetic has a number of advantages. It removes one redundant binade from IEEE floating-point objects, doubling the numerical precision of the arithmetic. It removes eight redundant, relational, floating-point operations and removes the redundant total-order operation. It replaces the non-reflexive floating-point equality operator with a reflexive equality operator, and it indicates that some of the exceptions may be removed as redundant, subject to issues of backward compatibility and transient future compatibility as programmers migrate to the transreal paradigm.
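
As an illustration of the behavioural difference rather than the proposal itself, the Python sketch below contrasts total transreal division and reflexive equality with IEEE NaN behaviour; NULLITY is a hypothetical stand-in for the transreal nullity, and only finite operands and division by zero are handled.

    # Sketch contrasting transreal behaviour with IEEE 754: division is total
    # (x/0 is +inf or -inf, 0/0 is nullity) and equality is reflexive, unlike NaN.
    # NULLITY is a stand-in for the transreal nullity, not part of any standard library.
    import math

    NULLITY = object()                 # the single, unordered transreal number

    def t_div(a, b):
        """Total transreal division for finite operands."""
        if a is NULLITY or b is NULLITY:
            return NULLITY
        if b == 0:
            return NULLITY if a == 0 else (math.inf if a > 0 else -math.inf)
        return a / b

    def t_eq(a, b):
        """Reflexive transreal equality: nullity equals itself (IEEE NaN does not)."""
        if a is NULLITY or b is NULLITY:
            return a is b
        return a == b

    print(t_div(1, 0), t_div(-1, 0), t_div(0, 0) is NULLITY)     # inf -inf True
    print(t_eq(t_div(0, 0), t_div(0, 0)), math.nan == math.nan)  # True False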

Relevance:

100.00%

Publisher:

Abstract:

Body size affects nearly all aspects of organismal biology, so it is important to understand the constraints and dynamics of body size evolution. Despite empirical work on the macroevolution and macroecology of minimum and maximum size, there is little general quantitative theory on rates and limits of body size evolution. We present a general theory that integrates individual productivity, the lifestyle component of the slow–fast life-history continuum, and the allometric scaling of generation time to predict a clade's evolutionary rate and asymptotic maximum body size, and the shape of macroevolutionary trajectories during diversifying phases of size evolution. We evaluate this theory using data on the evolution of clade maximum body sizes in mammals during the Cenozoic. As predicted, clade evolutionary rates and asymptotic maximum sizes are larger in more productive clades (e.g. baleen whales), which represent the fast end of the slow–fast lifestyle continuum, and smaller in less productive clades (e.g. primates). The allometric scaling exponent for generation time fundamentally alters the shape of evolutionary trajectories, so allometric effects should be accounted for in models of phenotypic evolution and interpretations of macroevolutionary body size patterns. This work highlights the intimate interplay between the macroecological and macroevolutionary dynamics underlying the generation and maintenance of morphological diversity.

Relevance:

100.00%

Publisher:

Abstract:

We present a general approach based on nonequilibrium thermodynamics for bridging the gap between a well-defined microscopic model and the macroscopic rheology of particle-stabilised interfaces. Our approach is illustrated by starting with a microscopic model of hard ellipsoids confined to a planar surface, which is intended to simply represent a particle-stabilised fluid–fluid interface. More complex microscopic models can be readily handled using the methods outlined in this paper. From the aforementioned microscopic starting point, we obtain the macroscopic, constitutive equations using a combination of systematic coarse-graining, computer experiments and Hamiltonian dynamics. Exemplary numerical solutions of the constitutive equations are given for a variety of experimentally relevant flow situations to explore the rheological behaviour of our model. In particular, we calculate the shear and dilatational moduli of the interface over a wide range of surface coverages, ranging from the dilute isotropic regime, to the concentrated nematic regime.