961 results for General-purpose computing on graphics processing units (GPGPU)
Abstract:
Reading printed materials implies the visual processing of information originating in two distinct semiotic systems. Rapid identification of the rhetorical strategies of redundancy, complementation or contradiction between the two types of information may be crucial for an adequate interpretation of bimodal materials. Hybrid texts (verbal and visual) are particular instances of bimodal materials, in which redundant information is often neglected while complementary and contradictory information is essential. Studies using the ASL 504 eye-tracking system while reading either additive or exhibiting captions (Baptista, 2009) revealed fixations on the verbal material, and transitions between the written and the pictorial, in far greater number and duration than initially foreseen as necessary to read the verbal text. We therefore hypothesized that confirmation strategies for the written information take place, using information available in the other semiotic system. Such eye-gaze patterns obtained from denotative texts and pictures seem to contradict some of the scarce existing data on the visual processing of texts and images, namely cartoons (Carroll, Young and Guertin, 1992), descriptive captions (Hegarty, 1992a and 1992b), and advertising images with descriptive and explanatory texts (cf. Rayner and Rotello, 2001, who report reading of the whole text before looking at the image, or Rayner, Miller and Rotello, 2008, who report an earlier and longer look at the picture), and they seem to consolidate the findings of Radach et al. (2003) on systematic transitions between text and image. By framing interest areas in the printed pictorial material of non-redundant hybrid texts, we have identified the specific areas where transitions take place after fixations on the verbal text. The way those transitions are processed opens new interest for further research.
Abstract:
Building an inclusive school stands as the great challenge of the current century. The general objective of this dissertation was to understand how the strategies used by teachers in their educational practice with students with specific disabilities are facilitating the inclusive educational process in the early years/grades of regular primary education. Its specific objectives were: to gather information on the types of strategies being recommended to support the pedagogical practices developed by regular-school teachers; to identify which pedagogical strategies used by regular-school teachers are favouring the learning of students with disabilities included in everyday classroom life; to determine which implications or obstacles are contributing to the existing problem situations, from the perspective of the work carried out by those teachers; and to compare the pedagogical strategies being employed by regular-school teachers to establish which yield the best results in building the learning of included students. The investigation was a study of a qualitative and quantitative nature, at the descriptive and explanatory levels, methodologically and scientifically grounded in interviews guided by the method of Bardin (2010), in direct classroom observation, adopting Likert-type items to better compile the data and complete the observation form, and in a questionnaire following the guidelines of Richardson (2009).
The setting chosen for the research was a school in the municipal network of Jaboatão dos Guararapes, in the State of Pernambuco, in the Northeast Region of Brazil, selected because it has a significant number of included students, a total of 19 students with various disabilities, and 12 participating subjects who teach in inclusive classrooms. We sought to ground this work in the theoretical constructs of renowned authors in the field of inclusive education, such as Stainback and Stainback (2008), Fonseca (1995), Goffman (2004), Tardif (2008), Beyer (2010), Carvalho (2006 and 2008), Gaio and Meneguetti (2004), Machado (2009), Mantoan (2001, 2006), Pires (2008), Saviani (2005) and Angher (2008), among other advocates of inclusion. The results revealed experiences demonstrating progress in the learning of included students through interaction with their peers, promoting better social and educational coexistence grounded in principles and values of respect for differences, equality of conditions, rights and duties, and active participation in the construction and formation of citizenship for all. Educational strategies such as hiring support interns, making diverse resources available, and adopting conceptions consistent with inclusive education were presented by the research participants as elements that facilitate learning. As obstacles, however, the study identified the participants' lack of specific academic training, the physical inadequacy of the school building, the adoption of a pedagogical proposal that does not embrace the precepts of inclusive education, and the failure to offer specialised educational assistance to students either at school or outside it, with professionals from related fields to complement and supplement learning.
It was thus found that the process of implementing inclusive education under way at the school has already produced favourable results, although much remains to be done to minimise the obstacles encountered.
Abstract:
Uncertainties associated with the representation of various physical processes in global climate models (GCMs) mean that, when projections from GCMs are used in climate change impact studies, the uncertainty propagates through to the impact estimates. A complete treatment of this ‘climate model structural uncertainty’ is necessary so that decision-makers are presented with an uncertainty range around the impact estimates. This uncertainty is often underexplored owing to the human and computer processing time required to perform the numerous simulations. Here, we present a 189-member ensemble of global river runoff and water resource stress simulations that adequately address this uncertainty. Following several adaptations and modifications, the ensemble creation time has been reduced from 750 h on a typical single-processor personal computer to 9 h of high-throughput computing on the University of Reading Campus Grid. Here, we outline the changes that had to be made to the hydrological impacts model and to the Campus Grid, and present the main results. We show that, although there is considerable uncertainty in both the magnitude and the sign of regional runoff changes across different GCMs with climate change, there is much less uncertainty in runoff changes for regions that experience large runoff increases (e.g. the high northern latitudes and Central Asia) and large runoff decreases (e.g. the Mediterranean). Furthermore, there is consensus that the percentage of the global population at risk of water resource stress will increase with climate change.
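The throughput gain described above comes from farming independent ensemble members out to many processors at once. A minimal sketch of that pattern in Python (the `run_member` placeholder is hypothetical; the actual study ran a hydrological impacts model on Campus Grid middleware, not Python threads):

```python
from concurrent.futures import ThreadPoolExecutor

def run_member(member_id):
    # Placeholder for one independent ensemble simulation; the real
    # workload would run the hydrological impacts model under one
    # GCM / climate-scenario combination.
    return member_id, sum(i * i for i in range(1000))

# 189 independent members: with n workers the wall-clock time falls
# roughly as 1/n, which is the source of the reported 750 h -> 9 h
# reduction (many grid nodes instead of one PC).
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_member, range(189)))

print(len(results))
```

Because the members share no state, this is an "embarrassingly parallel" workload, which is exactly what high-throughput campus grids are designed for.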
Abstract:
The International System of Units (SI) is founded on seven base units, the metre, kilogram, second, ampere, kelvin, mole and candela, corresponding to the seven base quantities of length, mass, time, electric current, thermodynamic temperature, amount of substance and luminous intensity. At its 94th meeting in October 2005, the International Committee for Weights and Measures (CIPM) adopted a recommendation on preparative steps towards redefining the kilogram, ampere, kelvin and mole so that these units are linked to exactly known values of fundamental constants. We propose here that these four base units should be given new definitions linking them to exactly defined values of the Planck constant h, elementary charge e, Boltzmann constant k and Avogadro constant NA, respectively. This would mean that six of the seven base units of the SI would be defined in terms of true invariants of nature. In addition, not only would these four fundamental constants have exactly defined values but also the uncertainties of many of the other fundamental constants of physics would be either eliminated or appreciably reduced. In this paper we present the background and discuss the merits of these proposed changes, and we also present possible wordings for the four new definitions. We also suggest a novel way to define the entire SI explicitly using such definitions without making any distinction between base units and derived units. We list a number of key points that should be addressed when the new definitions are adopted by the General Conference on Weights and Measures (CGPM), possibly by the 24th CGPM in 2011, and we discuss the implications of these changes for other aspects of metrology.
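The mechanism of such an explicit-constant definition can be made concrete for the kilogram: with the second and the metre already fixed, assigning h an exact numerical value {h} determines the unit of mass, since h carries the dimension kg m² s⁻¹. Using, purely for illustration, a CODATA-style value for {h}:

```latex
\[
1\,\mathrm{kg} = \frac{h}{\{h\}\,\mathrm{m^{2}\,s^{-1}}},
\qquad \text{e.g.}\quad
1\,\mathrm{kg} = \frac{h}{6.626\,070\,15\times 10^{-34}\,\mathrm{m^{2}\,s^{-1}}}.
\]
```

Analogous rearrangements link the ampere to e, the kelvin to k, and the mole to NA.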
Abstract:
The theory of harmonic force constant refinement calculations is reviewed, and a general-purpose program for force constant and normal coordinate calculations is described. The program, called ASYM20, is available through Quantum Chemistry Program Exchange. It will work on molecules of any symmetry containing up to 20 atoms and will produce results on a series of isotopomers as desired. The vibrational secular equations are solved in either nonredundant valence internal coordinates or symmetry coordinates. As well as calculating the (harmonic) vibrational wavenumbers and normal coordinates, the program will calculate centrifugal distortion constants, Coriolis zeta constants, harmonic contributions to the α's, root-mean-square amplitudes of vibration, and other quantities related to gas electron-diffraction studies and thermodynamic properties. The program will work in either a predict mode, in which it calculates results from an input force field, or in a refine mode, in which it refines an input force field by least squares to fit observed data on the quantities mentioned above. Predicate values of the force constants may be included in the data set for a least-squares refinement. The program is written in FORTRAN for use on a PC or a mainframe computer. Operation is mainly controlled by steering indices in the input data file, but some interactive control is also implemented.
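For the simplest possible case, the kind of quantity such a program predicts can be illustrated directly. The sketch below (hypothetical function names; ASYM20 itself solves the full vibrational secular equations in internal or symmetry coordinates, and is written in FORTRAN) computes a diatomic harmonic wavenumber from a stretching force constant and the two atomic masses:

```python
import math

C = 2.99792458e10        # speed of light, cm/s
NA = 6.02214076e23       # Avogadro constant, 1/mol

def harmonic_wavenumber(k_mdyn_per_A, mass1_amu, mass2_amu):
    """Harmonic wavenumber (cm^-1) of a diatomic, from a stretching
    force constant in mdyn/A (= 100 N/m) and atomic masses in amu.
    This is the 1x1 instance of the vibrational secular problem."""
    k = k_mdyn_per_A * 100.0                       # mdyn/A -> N/m
    mu = (mass1_amu * mass2_amu) / (mass1_amu + mass2_amu)
    mu_kg = mu * 1e-3 / NA                         # amu -> kg
    # omega = sqrt(k/mu) in rad/s; divide by 2*pi*c for cm^-1
    return math.sqrt(k / mu_kg) / (2.0 * math.pi * C)

# CO with k ~ 19 mdyn/A lands near the known harmonic wavenumber
# region around 2.17e3 cm^-1.
print(round(harmonic_wavenumber(19.0, 12.000, 15.995), 1))
```

For a polyatomic, the same physics generalises to diagonalising the GF matrix product, which is what the predict mode described above does across a whole force field.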
Abstract:
The rationale for this review is to provide a coherent formulation of the cognitive neurochemistry of nicotine, with the aim of suggesting research and clinical applications. The first part is a comprehensive review of the empirical studies of the enhancing effects of nicotine on information processing, especially those on attentional and mnemonic processing. Then, these studies are put in the context of recent studies on the neurochemistry of nicotine and cholinergic drugs in general. They suggest a positive effect of nicotine on processes acting on encoded material during the post-acquisition phase, the process of consolidation. Thus, nicotinic receptors appear to be involved in mnemonic processing by modulating the excitability of neurons in the hippocampal formation to enable associative processing.
Abstract:
This paper presents results to indicate the potential applications of a direct connection between the human nervous system and a computer network. Actual experimental results obtained from a human subject study are given, with emphasis placed on the direct interaction between the human nervous system and possible extra-sensory input. A brief overview of the general state of neural implants is given, and a range of application areas is considered. An overall view is also taken as to what may be possible with implant technology as a general-purpose human-computer interface for the future.
Abstract:
In this paper, results are shown to indicate the efficacy of a direct connection between the human nervous system and a computer network. Experimental results obtained thus far from a study lasting over 3 months are presented, with particular emphasis placed on the direct interaction between the human nervous system and a piece of wearable technology. An overview of the present state of neural implants is given, along with the range of application areas considered thus far. A view is also taken as to what may be possible with implant technology as a general-purpose human-computer interface for the future.
Abstract:
This paper investigates the effect of time offset errors on the partial parallel interference canceller (PIC) and compares its performance against that of the standard PIC. The BER performances of the standard and partial interference cancellers are simulated in a near-far environment with varying time offset errors. These simulations indicate that whilst timing errors significantly affect the performance of both these schemes, they do not diminish the gains that are realised by the partial PIC over the standard PIC.
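The cancellation step being compared can be sketched in a toy, noise-free form. In the snippet below (illustrative Python, not the paper's simulator; no timing offsets are modelled), `lam = 1` corresponds to the standard PIC and `0 < lam < 1` to the partial PIC, which subtracts only a fraction of the estimated multiple-access interference:

```python
# Two synchronous CDMA users with non-orthogonal spreading codes,
# in a near-far scenario (user 2 is much stronger than user 1).
def despread(r, code):
    return sum(ri * ci for ri, ci in zip(r, code)) / len(code)

codes = [[1, 1, 1, 1], [1, 1, -1, 1]]  # cross-correlation 0.5
bits = [1, -1]
amps = [1.0, 3.0]                      # near-far power imbalance

# Noise-free received chip sequence: superposition of both users.
r = [sum(a * b * c[i] for a, b, c in zip(amps, bits, codes))
     for i in range(4)]

# Stage 1: conventional matched-filter decisions (the weak user's
# bit is swamped by the strong user's interference here).
stage1 = [1 if despread(r, c) >= 0 else -1 for c in codes]

# Stage 2: subtract a fraction lam of each other user's estimated
# (regenerated) signal, then re-detect.
def pic_decision(user, lam):
    cleaned = [
        r[i] - lam * sum(amps[j] * stage1[j] * codes[j][i]
                         for j in range(2) if j != user)
        for i in range(4)
    ]
    return 1 if despread(cleaned, codes[user]) >= 0 else -1

decisions = [pic_decision(u, lam=0.7) for u in range(2)]
print(decisions)  # recovers the transmitted bits [1, -1]
```

In this clean example cancellation recovers the weak user's bit; the paper's contribution is showing how that gain survives noise and timing offset errors.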
Abstract:
This paper describes the design, implementation and testing of a high speed controlled stereo “head/eye” platform which facilitates the rapid redirection of gaze in response to visual input. It details the mechanical device, which is based around geared DC motors, and describes hardware aspects of the controller and vision system, which are implemented on a reconfigurable network of general purpose parallel processors. The servo-controller is described in detail and higher level gaze and vision constructs outlined. The paper gives performance figures gained both from mechanical tests on the platform alone, and from closed loop tests on the entire system using visual feedback from a feature detector.
Abstract:
The authors compare the performance of two types of controllers: one based on the multi-layered network and the other based on the single-layered CMAC network (cerebellar model articulation controller). The neurons (information processing units) in the multi-layered network use Gaussian activation functions. The control scheme which is considered is a predictive control algorithm, along the lines used by Willis et al. (1991) and Kambhampati and Warwick (1991). The process selected as a test bed is a continuous stirred tank reactor. The reaction taking place is an irreversible exothermic reaction in a constant-volume reactor cooled by a single coolant stream. This reactor is a simplified version of the first tank in the two-tank system given by Henson and Seborg (1989).
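The Gaussian-activation units mentioned above are the defining feature of a radial-basis-function layer. A minimal sketch of such a unit and network forward pass (all centres, widths and weights here are illustrative, not taken from the paper):

```python
import math

def gaussian_unit(x, centre, width):
    # Each hidden unit responds to the distance between the input
    # and its centre: maximal at x == centre, decaying with distance.
    return math.exp(-((x - centre) ** 2) / (2.0 * width ** 2))

def rbf_network(x, centres, widths, weights, bias=0.0):
    # Network output is a weighted sum of the unit activations,
    # so only units whose centres lie near x contribute much.
    return bias + sum(
        w * gaussian_unit(x, c, s)
        for w, c, s in zip(weights, centres, widths)
    )

y = rbf_network(0.5,
                centres=[0.0, 0.5, 1.0],
                widths=[0.3, 0.3, 0.3],
                weights=[1.0, 2.0, 1.0])
print(round(y, 3))
```

The local response of each unit is what makes this family of networks (like the table-lookup CMAC) fast to train compared with a fully distributed multi-layered network.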
Abstract:
This study investigates the production and on-line processing of English tense morphemes by sequential bilingual (L2) Turkish-speaking children with more than three years of exposure to English. Thirty-nine 6- to 9-year-old L2 children and 28 typically developing age-matched monolingual (L1) children were administered the production component for third person –s and past tense of the Test for Early Grammatical Impairment (Rice & Wexler, 1996) and participated in an on-line word-monitoring task involving grammatical and ungrammatical sentences with presence/omission of tense (third person –s, past tense –ed) and non-tense (progressive –ing, possessive ‘s) morphemes. The L2 children’s performance on the on-line task was compared to that of children with Specific Language Impairment (SLI) in Montgomery & Leonard (1998, 2006) to ascertain similarities and differences between the two populations. Results showed that the L2 children were sensitive to the ungrammaticality induced by the omission of tense morphemes, despite variable production. This reinforces the claim about intact underlying syntactic representations in child L2 acquisition despite non-target-like production (Haznedar & Schwartz, 1997).
Abstract:
This article analyses the results of an empirical study on the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence on unlawful processing of personal data. It comprises a survey on the methods used to seek and obtain consent to process personal data for direct marketing and advertisement, and a test on the frequency of unsolicited commercial emails (UCE) received by customers as a consequence of their registration and submission of personal information to a website. Part One of the article presents a conceptual and normative account of data protection, with a discussion of the ethical values on which EU data protection law is grounded and an outline of the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which reveals a significant gap between EU legal theory and practice in data protection. Although a wide majority of the websites in the sample (69%) have in place a system to ask separate consent for engaging in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The test with UCE shows that only one out of three websites (30.5%) respects the will of the data subject not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and will send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law. In this respect, it suggests that the standard of implementation, information and supervision by the UK authorities is inadequate, especially in light of the clarifications provided at EU level.
Abstract:
Explanations of the marked individual differences in elementary school mathematical achievement and mathematical learning disability (MLD or dyscalculia) have involved domain-general factors (working memory, reasoning, processing speed and oral language) and numerical factors that include single-digit processing efficiency and multi-digit skills such as number system knowledge and estimation. This study of third graders (N = 258) finds that both domain-general and numerical factors contribute independently to explaining variation in three significant arithmetic skills: basic calculation fluency, written multi-digit computation, and arithmetic word problems. Estimation accuracy and number system knowledge show the strongest associations with every skill, and their contributions are independent both of each other and of the other factors. Different domain-general factors independently account for variation in each skill. Numeral comparison, a single-digit processing skill, uniquely accounts for variation in basic calculation. Subsamples of children with MLD (at or below 10th percentile, n = 29) are compared with low achievement (LA, 11th to 25th percentiles, n = 42) and typical achievement (above 25th percentile, n = 187). Examination of these subsamples, and of subsets with persistent difficulties, supports a multiple-deficits view of number difficulties: most children with number difficulties exhibit deficits in both domain-general and numerical factors. The only factor deficit common to all persistent MLD children is in multi-digit skills. These findings indicate that many factors matter but multi-digit skills matter most in third grade mathematical achievement.