931 results for computer processing of language
Abstract:
Neural signal processing is a discipline within neuroengineering. This interdisciplinary approach combines principles from machine learning, signal processing theory, and computational neuroscience applied to problems in basic and clinical neuroscience. The ultimate goal of neuroengineering is a technological revolution, where machines would interact in real time with the brain. Machines and brains could interface, enabling normal function in cases of injury or disease, brain monitoring, and/or medical rehabilitation of brain disorders. Much current research in neuroengineering is focused on understanding the coding and processing of information in the sensory and motor systems, quantifying how this processing is altered in the pathological state, and how it can be manipulated through interactions with artificial devices including brain–computer interfaces and neuroprosthetics.
Abstract:
The meaning of a novel word can be acquired by extracting it from linguistic context. Here we simulated the learning of new words associated with concrete and abstract concepts in a variant of the human simulation paradigm that provided linguistic context information, in order to characterize the brain systems involved. Native speakers of Spanish read pairs of sentences in order to derive the meaning of a new word that appeared in the terminal position of the sentences. fMRI revealed that learning the meaning associated with concrete and abstract new words was qualitatively different and recruited brain regions similar to those involved in the processing of real concrete and abstract words. In particular, learning of new concrete words selectively boosted the activation of the ventral anterior fusiform gyrus, a region driven by imageability, which has previously been implicated in the processing of concrete words.
Abstract:
Controversial results have been reported concerning the neural mechanisms involved in the processing of rewards and punishments. On the one hand, there is evidence suggesting that monetary gains and losses activate a similar fronto-subcortical network. On the other hand, results of recent studies imply that reward and punishment may engage distinct neural mechanisms. Using functional magnetic resonance imaging (fMRI), we investigated both regional and interregional functional connectivity patterns while participants performed a gambling task featuring unexpectedly high monetary gains and losses. Classical univariate statistical analysis showed that monetary gains and losses activated a similar fronto-striatal-limbic network, in which the main activation peaks were observed bilaterally in the ventral striatum. Functional connectivity analysis showed similar responses for gain and loss conditions in the insular cortex, the amygdala, and the hippocampus that correlated with the activity observed in the seed region, the ventral striatum, with the connectivity to the amygdala appearing more pronounced after losses. Larger functional connectivity to the medial orbitofrontal cortex was found for negative outcomes. The fact that different functional patterns were obtained with the two analyses suggests that the brain activations observed in the classical univariate approach reflect the involvement of different functional networks in the current task. These results stress the importance of studying functional connectivity in addition to standard fMRI analysis in reward-related studies.
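At its core, the seed-based functional connectivity analysis described above amounts to correlating the time series of a seed region with the time series of other regions. The sketch below illustrates that idea only; the region names and the synthetic signals are hypothetical, not the study's data or pipeline.

```python
import numpy as np

def seed_connectivity(seed_ts, region_ts):
    """Pearson correlation between a seed time series and each region's
    time series: a minimal sketch of seed-based functional connectivity."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    return {name: float(np.mean(seed * (ts - ts.mean()) / ts.std()))
            for name, ts in region_ts.items()}

# Synthetic, illustrative signals only: one region that tracks the seed
# (standing in for the ventral striatum seed) and an unrelated control region
rng = np.random.default_rng(0)
striatum = rng.standard_normal(200)
regions = {
    "amygdala": striatum + 0.5 * rng.standard_normal(200),
    "control": rng.standard_normal(200),
}
conn = seed_connectivity(striatum, regions)
```

In a real analysis the time series would come from preprocessed fMRI data, and significance would be assessed across subjects rather than read off a single correlation.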
Abstract:
During the process of language development, one of the most important tasks children face is identifying the grammatical category to which the words of their language belong. This is essential in order to form grammatically correct utterances. How do children proceed in order to classify the words of their language and assign them to their corresponding grammatical categories? The present study investigates the usefulness of phonological information for the categorization of nouns in English, given that phonology is the first source of information available to prelinguistic infants, who lack access to semantic information or complex morphosyntactic information. We analyse four different corpora containing linguistic samples of English-speaking mothers addressing their children in order to explore the reliability with which words are represented in mothers' speech based on several phonological criteria. The results of the analysis confirm the prediction that most of the words to which English-learning infants are exposed during the first two years of life can be accounted for in terms of their phonological resemblance.
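The reliability of a phonological cue in such a corpus analysis can be quantified as the proportion of cue-bearing tokens that actually belong to the target category, together with how many tokens the cue covers. The toy sample, the noun labels, and the word-final cue below are all invented for illustration; they are not the study's corpora or criteria.

```python
def cue_validity(tokens, is_noun, has_cue):
    """Of the tokens carrying a phonological cue, return what fraction are
    nouns (cue reliability) and how many tokens the cue covers at all."""
    cued = [w for w in tokens if has_cue(w)]
    if not cued:
        return 0.0, 0
    hits = sum(1 for w in cued if is_noun(w))
    return hits / len(cued), len(cued)

# Toy child-directed sample; the noun set and the word-final "y" cue are
# illustrative stand-ins for the corpus criteria analysed in the study
nouns = {"doggy", "mommy", "kitty", "baby"}
tokens = ["doggy", "see", "mommy", "kitty", "go", "baby", "happy"]
reliability, coverage = cue_validity(tokens, nouns.__contains__,
                                     lambda w: w.endswith("y"))
```

On this toy sample, five tokens carry the cue and four of them are nouns, so the cue is informative but not perfect, which is the typical pattern such analyses report.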
Abstract:
A variety of language disturbances, including aphasia, have been described after subcortical stroke, but less is known about the factors that influence the long-term recovery of stroke-induced language dysfunction. We prospectively examined the role of the affected hemisphere and the lesion site in the occurrence and recovery of language deficits in nonthalamic subcortical stroke. Forty patients with unilateral basal ganglia stroke underwent language assessment within 1 week, at 3 months and at 1 year after stroke. Disturbances in at least one language domain were observed in 35 patients during the first week post stroke, including aphasia diagnosed in 11 patients. Importantly, the appearance of deficits after stroke onset and the improvement of language function were not determined by the site of the subcortical lesion, but instead were critically influenced by the affected hemisphere. In fact, the language impairments following left and right basal ganglia stroke mirrored the language dysfunction observed after cortical lesions in the same hemisphere. A significant overall language improvement was observed at 3 months after stroke, although residual deficits in language executive function were the most commonly observed impairment at 1 year follow-up. Although a substantial improvement of language function can be expected after nonthalamic subcortical stroke, our findings suggest that language recovery may not be fully achieved at 1 year post stroke.
Abstract:
This applied linguistic study in the field of second language acquisition investigated the assessment practices of class teachers as well as the challenges and visions of language assessment in bilingual content instruction (CLIL) at primary level in Finnish basic education. Furthermore, pupils' and their parents' perceptions of language assessment and LangPerform computer simulations as an alternative, modern assessment method in CLIL contexts were examined. The study was conducted for descriptive and developmental purposes in three phases: 1) a CLIL assessment survey; 2) simulation 1; and 3) simulation 2. Each phase had a varying number of participants. The population of this mixed methods study comprised CLIL class teachers, their pupils and the pupils' parents. The sampling was multi-staged and based on probability and random sampling, and the data were triangulated. Altogether 42 CLIL class teachers nationwide, 109 pupils from the 3rd, 4th and 5th grades as well as 99 parents from two research schools in South-Western Finland participated in the CLIL assessment survey, which was followed by audio-recorded theme interviews of volunteers (10 teachers, 20 pupils and 7 parents). Simulation experiments 1 and 2 produced 146 pupil and 39 parental questionnaires as well as video interviews of volunteering pupils. The data were analysed both quantitatively, using percentages and numerical frequencies, and qualitatively, employing thematic content analysis. Based on the data, language assessment in primary CLIL is not an established practice. It largely appears to be infrequent, incidental, implicit and based on impressions rather than evidence or the curriculum. The most used assessment methods were teacher observation, bilingual tests and dialogic interaction, and the least used were portfolios, simulations and peer assessment.
Although language assessment was generally perceived as important by teachers, a fifth of them did not gather assessment information systematically, and 38% scarcely gave linguistic feedback to pupils. Both pupils and parents wished to receive more information on CLIL language issues: 91% of pupils claimed to receive feedback rarely or only occasionally, and 63% of them wished to get more information on their linguistic coping in CLIL subjects. Of the parents, 76% wished to receive more information on the English proficiency of their children and their linguistic development. This may be a response to the indirect feedback practices identified in this study. There are several challenges related to assessment; the most notable is the lack of a CLIL curriculum, language objectives and common ground principles of assessment. Three diverse approaches to language in CLIL that appear to affect teachers' views on language assessment were identified: instrumental (language as a tool), dual (language as a tool and an object of learning) and eclectic (miscellaneous views, e.g. affective factors prioritised). LangPerform computer simulations seem to be perceived as an appropriate alternative assessment method in CLIL. It is strongly recommended that the fundamentals of assessment (curricula and language objectives) and a mutual assessment scheme be determined and stakeholders' knowledge base of CLIL strengthened. The principles of adequate assessment in primary CLIL are identified, and several appropriate assessment methods are suggested.
Abstract:
The determination of the sterilization value for low-acid foods in retorts includes a critical evaluation of the factory's facilities and utilities, validation of the heat processing equipment (by heat distribution assays), and finally heat penetration assays with the product. The intensity of the heat process applied to the food can be expressed by the Fo value (sterilization value, in minutes, at a reference temperature of 121.1 °C and a thermal index, z, of 10 °C, for Clostridium botulinum spores). For safety reasons, the lowest Fo value obtained in heat penetration assays is frequently adopted as indicative of the minimum process intensity applied. This lowest Fo value should always be higher than the minimum Fo recommended for the food in question. However, the use of the Fo value at the coldest point can fail to statistically explain all the practical occurrences in food heat treatment processes. Thus, as a result of intensive experimental work, we aimed to develop a new approach to determining the lowest Fo value, which we have termed the critical Fo. The critical Fo is based on a statistical model for interpreting the results of heat penetration assays in packages, and it depends not only on the Fo values found at the coldest point of the package and the coldest point of the equipment, but also on the size of the batch of packages processed in the retort, the total processing time in the retort, and the time between CIPs of the retort. In the present study, we explored the results of physical measurements used in the validation of food heat processes. Three examples of calculations were prepared to illustrate the methodology developed and to introduce the concept of the critical Fo for the processing of canned food.
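The Fo value defined above can be computed from a measured time-temperature history by the General Method: integrating the lethal rate 10**((T - 121.1)/z) with z = 10 °C over the process time. The sketch below is a minimal illustration; the temperature profile is invented, and a real calculation would use the cold-spot history recorded in the heat penetration assay.

```python
def fo_value(times_min, temps_c, t_ref=121.1, z=10.0):
    """Sterilization value Fo (minutes at 121.1 degC, z = 10 degC) by the
    General Method: trapezoidal integration of the lethal rate
    10**((T - Tref)/z) over the process time."""
    fo = 0.0
    for i in range(1, len(times_min)):
        l_prev = 10 ** ((temps_c[i - 1] - t_ref) / z)
        l_curr = 10 ** ((temps_c[i] - t_ref) / z)
        fo += 0.5 * (l_prev + l_curr) * (times_min[i] - times_min[i - 1])
    return fo

# Illustrative check: holding exactly at the reference temperature 121.1 degC
# for 10 min gives a lethal rate of 1 throughout, hence Fo = 10 min
fo = fo_value([0.0, 5.0, 10.0], [121.1, 121.1, 121.1])
```

A temperature 10 °C (one z-value) below the reference contributes at one tenth of the reference lethal rate, which is why come-up and cooling phases add comparatively little to Fo.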
Abstract:
This study aimed to verify hygienic-sanitary working practices and to create and implement a Hazard Analysis and Critical Control Point (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for export. The application of the hygienic-sanitary checklist in the industries analyzed achieved conformity rates over 96% for the aspects evaluated. The HACCP plan identified two critical control points (CCPs), the receiving and classification steps, in the processing of frozen lobster and frozen lobster tails, and an additional CCP, the cooking step, in the processing of the whole frozen cooked lobster. The proper implementation of the HACCP plan in the lobster processing industries studied proved to be the safest and most cost-effective method to monitor the hazards at each CCP.
Abstract:
This doctoral study conducts an empirical analysis of the impact of Word-of-Mouth (WOM) on marketing-relevant outcomes such as attitudes and consumer choice, during a high-involvement and complex service decision. Due to its importance to decision-making, WOM has attracted interest from academia and practitioners for decades. Consumers are known to discuss products and services with one another. These discussions help consumers to form an evaluative opinion, as WOM reduces perceived risk, simplifies complexity, and increases the confidence of consumers in decision-making. These discussions are also highly impactful as WOM is a trustworthy source of information, since it is independent from the company or brand. In responding to the calls for more research on what happens after WOM information is received, and how it affects marketing-relevant outcomes, this dissertation extends prior WOM literature by investigating how consumers process information in a high-involvement service domain, in particular higher education. Further, the dissertation studies how the form of WOM influences consumer choice. The research contributes to WOM and services marketing literature by developing and empirically testing a framework for information processing and studying the long-term effects of WOM. The results of the dissertation are presented in five research publications. The publications are based on longitudinal data. The research leads to the development of a proposed theoretical framework for the processing of WOM, based on theories from social psychology. The framework is specifically focused on service decisions, as it takes into account evaluation difficulty through the complex nature of choice criteria associated with service purchase decisions. Further, other gaps in current WOM literature are taken into account by, for example, examining how the source of WOM and service values affect the processing mechanism.
The research also provides implications for managers aiming to trigger favorable WOM through marketing efforts, such as advertising and testimonials. The results provide suggestions on how to design these marketing efforts by taking into account the mechanism through which information is processed, or the form of social influence.
Abstract:
The effectiveness of various kinds of computer programs is of concern to nurse-educators. Using a 3x3 experimental design, ninety second-year diploma student nurses were randomly selected from the total population at three community colleges in Ontario. Data were collected via a 20-item valid and reliable Likert-type questionnaire developed by the nursing profession to measure nurses' perceptions of computers in the nursing role. The groups were pretested and posttested at the beginning and end of one semester. Subjects attending College A received a computer literacy course comprising word processing with technology awareness. College B students were exposed to computer-aided instruction, primarily nursing simulations, intermittently throughout the semester. College C subjects maintained their regular curriculum with no computer involvement. Student's t-test (two-tailed) was employed to assess the attitude score data, and a one-way analysis of variance was performed on the attitude scores. Posttest analysis revealed a significant difference (p<.05) in attitude scores on the use of computers in the nursing role between Colleges A and C. No significant differences (p>.05) were seen between Colleges B and A in posttesting. Suggestions for the continued computer education of diploma student nurses are provided.
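The two analyses named above, a two-tailed two-sample t-test and a one-way ANOVA on attitude scores, can be sketched directly from their definitions. The attitude scores below are invented for illustration; the study's actual data are not reproduced here.

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across independent groups:
    between-group mean square divided by within-group mean square."""
    data = np.concatenate([np.asarray(g, float) for g in groups])
    grand, k, n = data.mean(), len(groups), data.size
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g, float) - np.mean(g)) ** 2).sum()
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def pooled_t(a, b):
    """Two-sample t statistic with pooled variance, as used in a
    two-tailed Student's t-test."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    sp2 = ((a.size - 1) * a.var(ddof=1) + (b.size - 1) * b.var(ddof=1)) \
        / (a.size + b.size - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / a.size + 1 / b.size))

# Invented posttest attitude scores for the three colleges (illustrative only)
college_a = [78, 82, 75, 80, 85]
college_b = [74, 79, 73, 77, 80]
college_c = [65, 70, 62, 68, 66]
t_ac = pooled_t(college_a, college_c)
f_stat = one_way_anova_f(college_a, college_b, college_c)
```

A useful sanity check on the implementation is the identity that, for exactly two groups, the one-way ANOVA F statistic equals the square of the pooled t statistic.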
Abstract:
Through aggressive legislative and educational policies, Indigenous languages globally have been shifted to the language of the dominant society. Globalization has brought previously geo-politically and/or geo-linguistically isolated peoples and language groups into close proximity, necessitating interaction and at times intense power struggles. There are currently approximately 6,000 spoken languages in the world; more than half are endangered, dying or disappearing altogether. Canadian statistics reveal an overall 3% decline in the intergenerational transmission of language. Of the original 60 Indigenous languages spoken in Canada, 8 are extinct, 13 are nearly extinct, and 23 are critical. The remaining languages have a slim chance of survival; within the next 100 years only 4 Indigenous languages will remain. The Hodenosaunee languages of Southern Ontario are not included among the languages expected to survive the next 100 years. There are, without a doubt, complex challenges in the maintenance of Indigenous languages within a dominant-culture influenced environment. Given the increasing awareness of the social impact of linguistic integrity and preservation of languages on Indigenous people as a whole, this study considers how language is currently being used; the social, economic, and political implications of language shifting; the need to shift our social consciousness in order to understand the urgency in privileging our Hodenosaunee languages; as well as ways in which we might achieve those goals as individuals, as families, and as a community.
Abstract:
Research points clearly to the need for all concerned stakeholders to adopt a preventative approach when intervening with children who are at risk for future reading disabilities. Research has also indicated that a particular sub-group of children at risk for reading impairments comprises preschool children with language impairments (Catts, 1993). Preschool children with language impairments may have difficulties with emergent literacy skills, important prerequisite skills for successful formal reading. Only in the past decade have researchers begun to study the effects of emergent literacy intervention on preschool children with language impairments. As such, the current study continues this investigation of how to effectively implement an emergent literacy therapy aimed at supporting preschool children with language impairments. In addition, the current study explores emergent literacy intervention within an applied clinical setting. The setting presents a host of methodological and theoretical challenges, challenges that will advance the understanding of children within naturalistic settings. This exploratory study included thirty-eight participants who were recruited from Speech Services Niagara, a local preschool speech and language program. Using a between-group pre- and posttest design, this study compared two intervention approaches: an experimental emergent literacy intervention and a traditional language intervention. The experimental intervention was adapted from Read It Again! (Justice, McGinty, Beckman, & Kilday, 2006) and the traditional language intervention was based on the models of language therapy typically used in preschool speech and language programs across Ontario. Results indicated that the emergent literacy intervention was superior to the traditional language therapy in improving the children's alphabet knowledge, print and word awareness, and phonological awareness.
Moreover, results revealed that children with more severe language impairments require greater support and more explicit instruction than children with moderate language impairments. Another important finding indicated that the effects of the preschool emergent literacy intervention used in this study may not be sustainable as children enter grade one. The implications of this study point to the need to support preschool children with language impairments with intensive emergent literacy intervention that extends beyond preschool into formal educational settings.
Abstract:
This thesis presents Zen experience as aesthetic in nature. This is done through an analysis of language, a central concern for Zen Buddhism. The thesis develops two modes of language at work in Zen: representational and indexical. What these modes of language entail, and the kinds of relations that are developed through their use, are explored with recourse to a variety of Zen platforms: poetry, the koan, zazen, music, and suizen. In doing so, a primacy of listening is found in Zen, a listening without a listener. Given this primacy of listening, silence comes to the forefront of the investigation. An analysis of John Cage's 4'33" provides this thesis with justification of the groundlessness of silence, and the groundlessness of subjectivity. Listening allows the abyssal subject to emerge, which in turn allows reality to present itself outside of the constitutive function of language.
Abstract:
Autism spectrum disorders (ASD) are currently characterized by a triad of impairments, including social dysfunction, communication deficits and repetitive behaviors. The simultaneous integration of multiple senses is crucial in everyday life, since it enables the creation of a unified percept. Similarly, the allocation of attention to multiple simultaneous stimuli is critical for processing dynamic environmental information. In everyday interaction with the environment, sensory processing and attentional functions are basic components of typical development (TD). Although they are not part of the current diagnostic criteria, difficulties in attentional functions and sensory processing are very common among autistic individuals. The present thesis therefore assesses these functions in two separate studies. The first study is based on the premise that alterations in basic sensory processing could underlie the atypical sensory behaviors in ASD, as proposed by current theories of ASD. We designed a cross-modal size discrimination task to investigate the integrity and developmental trajectory of visuo-tactile information processing in children with ASD (N = 21, aged 6 to 18 years), compared with TD children matched on age and performance IQ. In a simultaneous two-alternative forced-choice task, participants judged the size of two stimuli based on unisensory (visual or tactile) or multisensory (visuo-tactile) inputs. Difference thresholds assessed the smallest difference at which participants were able to discriminate size.
Children with ASD showed diminished performance and no maturation effect in both the unisensory and multisensory conditions, compared with TD participants. Our first study thus extends previous findings of altered multisensory processing in ASD to the visuo-tactile domain. In our second study, we assessed multiple object tracking (3D-Multiple Object Tracking, 3D-MOT) abilities in autistic adults (N = 15, aged 18 to 33 years), compared with control participants matched on age and IQ, who had to track one or three moving targets among distractors in a virtual reality environment. Performance was measured by speed thresholds, which assess the fastest speed at which observers are able to track moving objects. Autistic individuals showed reduced speed thresholds overall, regardless of the number of objects to track. These results extend previous findings of altered attentional mechanisms in autism regarding the simultaneous allocation of attention to multiple locations. Taken together, the results of our two studies thus reveal alterations in ASD in the simultaneous processing of multiple events, whether within or across modalities, which may have important implications for the clinical presentation of this condition.
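Discrimination and speed thresholds of the kind measured in these two studies are commonly estimated with adaptive staircase procedures; the thesis does not specify its procedure, so the sketch below shows one standard option, a 2-down/1-up staircase, with a hypothetical simulated observer and made-up parameters.

```python
import math
import random

def staircase_threshold(p_correct, start=10.0, step=0.5, n_trials=400, seed=7):
    """2-down/1-up adaptive staircase: the stimulus level (e.g. a size
    difference, or a speed in a tracking task) drops after two consecutive
    correct responses and rises after each error, converging near the
    ~70.7%-correct point. Returns the mean level over the last half of
    trials as the threshold estimate."""
    rng = random.Random(seed)
    level, streak, history = start, 0, []
    for _ in range(n_trials):
        history.append(level)
        if rng.random() < p_correct(level):
            streak += 1
            if streak == 2:
                streak, level = 0, max(level - step, 0.0)
        else:
            streak, level = 0, level + step
    tail = history[n_trials // 2:]
    return sum(tail) / len(tail)

# Hypothetical observer whose accuracy rises from chance (50%) toward 100%
# around a "true" threshold of 5 arbitrary units
observer = lambda d: 0.5 + 0.5 / (1.0 + math.exp(-(d - 5.0)))
estimate = staircase_threshold(observer)
```

For a speed threshold the same logic runs in reverse: the tracked speed increases after correct responses and decreases after errors, so the staircase converges on the fastest speed the observer can still track.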
Abstract:
Atrial fibrillation, the most common clinical arrhythmia, affects 2.3 million patients in North America. To study its mechanisms and potential therapies, animal models of atrial fibrillation have been developed. High-density epicardial electrical mapping is a well-established experimental technique for monitoring atrial activity in vivo in response to electrical stimulation, remodeling, arrhythmias, or modulation of the autonomic nervous system. In regions that are not accessible to epicardial mapping, non-contact endocardial mapping performed with a balloon-shaped catheter could provide a more complete description of atrial activity. In this study, a canine experiment was designed and analyzed. Electro-anatomical reconstruction, epicardial mapping (103 electrodes), non-contact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter), and direct-contact endocardial recordings were performed simultaneously. The recording systems were also simulated in a mathematical model of a canine right atrium. In the simulations and experiments (after ablation of the atrioventricular node), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of repolarization gradients. The results show an epicardial-endocardial correlation coefficient of 0.8 (experiment) and 0.96 (simulation) between activation maps, and a correlation coefficient of 0.57 (experiment) and 0.92 (simulation) between ATa values.
Non-contact endocardial mapping thus appears to be a useful experimental tool for extracting information outside the regions covered by epicardial recording plaques.