912 results for Phonological processing
Abstract:
The phonological loop is a component of the working memory system specifically involved in the processing and manipulation of limited amounts of sound-based, phonological information. Phonological memory can be assessed with the Children's Test of Nonword Repetition (CNRep) in English speakers, but not in Portuguese speakers, owing to phonotactic differences between the two languages. The objectives of the present study were: 1) to develop the Brazilian Children's Test of Pseudoword Repetition (BCPR), a Portuguese version of the CNRep, and 2) to validate the BCPR by correlation with the Auditory Digit Span Test from the Stanford-Binet Intelligence Scale. The BCPR and Digit Span were administered to 182 children aged 4-10 years, 84 from Minas Gerais State (42 from a rural region) and 98 from the city of São Paulo. Subject age and word length effects were found, with repetition accuracy declining as the number of syllables in the pseudowords increased. Correlations between the BCPR and Digit Span forward (r = 0.50; P <= 0.01) and backward (r = 0.43; P <= 0.01) were found, and partial correlation indicated that higher BCPR scores were associated with higher Digit Span scores. The BCPR appears to depend more on schooling, whereas the Digit Span is more related to development. The results demonstrate that the BCPR is a reliable measure of phonological working memory, similar to the CNRep.
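The zero-order and partial correlations reported above can be illustrated with a minimal sketch. The data below are simulated, and the choice of age as the partialled-out variable is an assumption made for illustration only; the abstract does not state which variable was controlled.

```python
import numpy as np
from scipy import stats

# Hypothetical data: BCPR scores, Digit Span forward scores and age
# (the real study tested 182 children aged 4-10 years).
rng = np.random.default_rng(0)
age = rng.uniform(4, 10, size=182)
bcpr = 20 + 2.0 * age + rng.normal(0, 3, size=182)
digit_span = 3 + 0.5 * age + 0.05 * bcpr + rng.normal(0, 1, size=182)

# Zero-order (Pearson) correlation, as reported for BCPR vs Digit Span forward.
r, p = stats.pearsonr(bcpr, digit_span)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")

def partial_corr(x, y, z):
    """Correlation between x and y after removing the linear effect of z."""
    x_res = x - np.polyval(np.polyfit(z, x, 1), z)
    y_res = y - np.polyval(np.polyfit(z, y, 1), z)
    return stats.pearsonr(x_res, y_res)

# Partial correlation controlling for age (an assumption for this sketch).
r_p, p_p = partial_corr(bcpr, digit_span, age)
print(f"Partial r (controlling for age) = {r_p:.2f}, p = {p_p:.3g}")
```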
Abstract:
The aim of this master's thesis is to study how purchase invoice processing can be automated and streamlined in a system renewal project. The impacts of workflow automation on invoice handling are examined in terms of time, cost and quality. Purchase invoice processing has considerable potential for automation because of its labor-intensive and repetitive nature. As a case study combining qualitative and quantitative methods, the topic is approached from a business process management point of view. The current process was first explored through interviews and workshop meetings to create a holistic understanding of the process at hand. Requirements for process streamlining were then researched, focusing on specified vendors and their purchase invoices, which helped to identify the critical factors for successful invoice automation. To optimize the flow from invoice receipt to approval for payment, the invoice receiving process was outsourced and the automation functionalities of the new system were utilized in invoice handling. The quality of invoice data and the need for simple, structured purchase order (PO) invoices were emphasized in the system testing phase. Hence, consolidated invoices containing references to multiple PO or blanket release numbers should be simplified in order to use automated PO matching. With non-PO invoices, it is important to receive the buyer reference details in an applicable invoice data field so that automation rules can be created to route invoices to a review and approval flow. At the beginning of the project, invoice processing was seen as inefficient in terms of both time and cost, and it required a great deal of manual labor to carry out all tasks. Based on the testing results, it was estimated that over half of the invoices could be automated within a year of system implementation. Processing times could be reduced markedly, which would result in savings of up to 40% in annual processing costs. Owing to several improvements in the purchase invoice process, business process quality could also be perceived as improved.
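As a purely illustrative sketch of the routing logic described above, the rule below sends single-PO invoices to automated matching, flags consolidated multi-PO invoices for simplification, and routes non-PO invoices carrying a buyer reference to a review and approval flow. The field names, classes and outcomes are hypothetical and are not taken from the thesis or from any particular invoicing system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Invoice:
    vendor: str
    amount: float
    po_numbers: List[str] = field(default_factory=list)  # purchase order references
    buyer_reference: Optional[str] = None                 # used for non-PO invoices

def route(invoice: Invoice) -> str:
    """Decide how an incoming invoice is handled (illustrative rules only)."""
    if len(invoice.po_numbers) == 1:
        # Simple, structured PO invoice: candidate for automated PO matching.
        return "automated_po_matching"
    if len(invoice.po_numbers) > 1:
        # Consolidated invoice referencing multiple POs: needs simplification first.
        return "manual_handling_consolidated"
    if invoice.buyer_reference:
        # Non-PO invoice with a usable buyer reference: route to review/approval.
        return f"review_and_approval:{invoice.buyer_reference}"
    return "manual_handling_missing_data"

print(route(Invoice(vendor="Acme", amount=1200.0, po_numbers=["PO-1001"])))
print(route(Invoice(vendor="Acme", amount=300.0, buyer_reference="cost-center-42")))
```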
Abstract:
We studied the action of high pressure processing on the inactivation of two foodborne pathogens, Staphylococcus aureus ATCC 6538 and Salmonella enteritidis ATCC 13076, suspended in a culture medium and inoculated into caviar samples. The baroresistance of the two pathogens in a tryptic soy broth suspension at a concentration of 10^8-10^9 colony-forming units/ml was tested for continuous and cycled pressurization in the 150- to 550-MPa range and for 15-min treatments at room temperature. Increasing the number of cycles reduced the pressure level required to totally inactivate both microorganisms in the tryptic soy broth suspension, whereas the effect of different procedure times on complete inactivation of the microorganisms inoculated into caviar was similar.
Abstract:
According to the working memory model, the phonological loop is the component of working memory specialized in processing and manipulating limited amounts of speech-based information. The Children's Test of Nonword Repetition (CNRep) is a suitable measure of phonological short-term memory for English-speaking children, and the Brazilian Children's Test of Pseudoword Repetition (BCPR) has been validated as its Portuguese-language version. The objectives of the present study were: i) to investigate developmental aspects of phonological memory processing by error analysis in the nonword repetition task, and ii) to examine phoneme (substitution, omission and addition) and order (migration) errors made in the BCPR by 180 normal Brazilian children of both sexes aged 4-10 years, from preschool to 4th grade. The dominant error was substitution [F(3,525) = 180.47; P < 0.0001]. Performance was age-related [F(4,175) = 14.53; P < 0.0001]. A length effect, i.e., more errors in long than in short items, was observed [F(3,519) = 108.36; P < 0.0001]. In 5-syllable pseudowords, errors occurred mainly in the middle of the stimuli, before the syllabic stress [F(4,16) = 6.03; P = 0.003]; substitutions appeared more at the end of the stimuli, after the stress [F(12,48) = 2.27; P = 0.02]. In conclusion, the BCPR error analysis supports the idea that phonological loop capacity is relatively constant during development, although school learning increases the efficiency of this system. Moreover, there are indications that long-term memory contributes to holding the memory trace. The findings are discussed in terms of the distinctiveness, clustering and redintegration hypotheses.
Abstract:
Feature extraction is the part of pattern recognition in which sensor data are transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the later stages of the system while preserving the information that is essential for discriminating the data into different classes. For instance, in image analysis the raw image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means of detecting features that are invariant to certain types of illumination change. Finally, classification makes decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features. Low-level Local Binary Pattern (LBP) based features play a central role in the analysis. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely determined by the decisions made during the implementation phase. The implementation alternatives of LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated into this framework by means of a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented. Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which LBPs are seen as combinations of n-tuples is also presented.
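Because Local Binary Patterns are central to the thesis, a minimal reference computation may help readers unfamiliar with the descriptor. The sketch below implements the basic 8-neighbour, radius-1 LBP on a grayscale image in plain NumPy; it is not the embedded, focal-plane, or MIPA4k implementation discussed in the work.

```python
import numpy as np

def lbp_3x3(image: np.ndarray) -> np.ndarray:
    """Basic 8-neighbour, radius-1 Local Binary Pattern for a 2-D grayscale image.

    Each interior pixel is compared against its 8 neighbours; neighbours greater
    than or equal to the centre contribute a 1-bit, and the 8 bits are packed
    into a code in the range 0-255.
    """
    img = image.astype(np.int32)
    center = img[1:-1, 1:-1]
    # Neighbour offsets in clockwise order starting from the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((neighbour >= center).astype(np.int32) << bit)
    return codes

# Example: compute LBP codes and their histogram, the usual texture feature vector.
image = np.random.default_rng(0).integers(0, 256, size=(8, 8))
codes = lbp_3x3(image)
histogram, _ = np.histogram(codes, bins=256, range=(0, 256))
```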
Abstract:
Novel word learning has rarely been studied in people with aphasia (PWA), although it can provide a relatively pure measure of their learning potential and thereby contribute to the development of effective aphasia treatment methods. The main aim of the present thesis was to explore the capacity of PWA for associative learning of word–referent pairings and the cognitive-linguistic factors related to it. More specifically, the thesis examined learning and long-term maintenance of the learned pairings, the role of lexical-semantic abilities in learning, and the acquisition of phonological versus semantic information in associative novel word learning. Furthermore, the effect of modality on associative novel word learning and the neural underpinnings of successful learning were explored. The learning experiments utilized the Ancient Farming Equipment (AFE) paradigm, which employs drawings of unfamiliar referents and their unfamiliar names. Case studies of Finnish- and English-speaking people with chronic aphasia (n = 6) were conducted in the investigation. The learning results of the PWA were compared to those of healthy control participants, and active production of the novel words and their semantic definitions were used as learning outcome measures. PWA learned novel word–novel referent pairings, but the variation between individuals was very wide, from more modest outcomes (Studies I–II) up to levels on a par with healthy individuals (Studies III–IV). In incidental learning of semantic definitions, none of the PWA reached the performance level of the healthy control participants. Some PWA maintained part of the learning outcomes up to months post-training, and one individual showed full maintenance of the novel words at six months post-training (Study IV). Intact lexical-semantic processing skills promoted learning in PWA (Studies I–II), but poor phonological short-term memory capacity did not rule out novel word learning. In two PWA with successful learning and long-term maintenance of novel word–novel referent pairings, learning relied on orthographic input, while auditory input led to significantly inferior learning outcomes (Studies III–IV). In one of these individuals, this previously undetected modality-specific learning ability was successfully translated into training with familiar but inaccessible everyday words (Study IV). Functional magnetic resonance imaging revealed that this individual had a disconnected dorsal speech processing pathway in the left hemisphere, but a right-hemispheric neural network mediated successful novel word learning via reading. Finally, the results of Study III suggested that the cognitive-linguistic profile may not always predict the optimal learning channel for an individual with aphasia. Small-scale learning probes therefore seem useful in revealing functional learning channels in post-stroke aphasia.
Abstract:
No fully effective treatment has been developed since the discovery of Chagas' disease by Carlos Chagas in 1909. Since drug-resistant Trypanosoma cruzi strains are emerging and the current therapy is effective in the acute phase but has various adverse side effects, more studies are needed to characterize the susceptibility of T. cruzi to new drugs. Many natural and/or synthetic substances showing trypanocidal activity have been used, even though they are not likely to be turned into clinically approved drugs. Originally, drug screening was performed using natural products, with only limited knowledge of the molecular mechanisms involved in the development of diseases. Trans-splicing, an unusual RNA processing reaction that occurs in nematodes and trypanosomes, involves the processing of polycistronic transcription units into individual mRNAs; a short spliced leader transcript (SL RNA) is trans-spliced to the acceptor pre-mRNA, giving rise to the mature mRNA. In the present study, permeable cells of T. cruzi epimastigote forms (Y, BOL and NCS strains) were treated to evaluate the interference of two drugs (hydroxymethylnitrofurazone [NFOH-121] and nitrofurazone) with the trans-splicing reaction using silver-stained PAGE analysis. Both drugs induced a significant reduction in RNA processing at concentrations from 5 to 12.5 µM. These data agreed with the biological findings, since the number of parasites decreased, especially with NFOH-121. The proposed methodology allows a rapid and cost-effective screening strategy for detecting drug interference with the trans-splicing mechanism of T. cruzi.
Abstract:
The topic of the present doctoral dissertation is the analysis of the phonological and tonal structures of a previously largely undescribed language, Samue. It is a Gur language belonging to the Niger-Congo phylum and is spoken in Burkina Faso. The data were collected during fieldwork in a Sama village; they include 1800 lexical items, thousands of elicited sentences and 30 oral texts. The data were first transcribed phonetically, and then the phonological and tonal analyses were conducted. The results show that the phonological system of Samue, with its phoneme inventory and phonological processes, has the same characteristics as other related Gur languages, although some particularities were found, such as the voicing and lenition of stop consonants in medial position. Tonal analysis revealed three level tones, which have both lexical and grammatical functions. A particularity of the tonal system is the regressive Mid tone spreading in the verb phrase. The theoretical framework used in the study is Optimality Theory. Optimality Theory is rarely used in the analysis of an entire language system, and thus one objective was to see whether the theory was applicable to this type of work. Within the tonal analysis especially, some language-specific constraints had to be created, although the basic principle of Optimality Theory is the universal nature of the constraints. These constraints define the well-formedness of language structures and are ranked differently in different languages. This study gives new insights into typological phenomena in Gur languages. It is also a fundamental starting point for the Samue language in relation to the establishment of an orthography. From a theoretical point of view, the study shows that Optimality Theory is largely applicable to the analysis of an entire sound system.
Abstract:
Studies have shown that dyslexic children present a deficiency in the temporal processing of auditory stimuli applied in rapid succession. However, discussion continues concerning how this deficiency may be influenced by the temporal variables of auditory processing tests. Therefore, the purpose of the present study was to analyze, using auditory temporal processing tests, the effect of temporal variables such as interstimulus interval, stimulus duration and type of task on dyslexic children compared with a control group. Of the 60 children evaluated, 33 were dyslexic (mean age = 10.5 years) and 27 were normal controls (mean age = 10.8 years). The auditory processing tests assess the abilities to discriminate and order stimuli in relation to their duration and frequency. Results showed a significant difference in average accuracy between the control and dyslexic groups for each variable (interstimulus interval: 47.9 ± 5.5 vs 37.18 ± 6.0; stimulus duration: 61.4 ± 7.6 vs 50.9 ± 9.0; type of task: 59.9 ± 7.9 vs 46.5 ± 9.0), with the dyslexic group performing significantly worse in all situations. Moreover, there was an interaction effect between group and stimulus duration for the frequency-pattern tests, with the dyslexic group showing significantly lower results for short durations (53.4 ± 8.2 vs 48.4 ± 11.1), whereas no such difference was observed for the control group (62.2 ± 7.1 vs 60.6 ± 7.9). These results support the hypothesis associating dyslexia with auditory temporal processing deficits, identifying stimulus duration as the only variable that influenced the performance of the two groups unequally.
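The group-by-duration interaction reported above is the kind of effect typically tested with a two-way ANOVA. The sketch below uses simulated data and assumed factor names; it only illustrates the form of such an analysis and is not the authors' actual analysis script.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Simulated accuracy scores for a 2 (group) x 2 (stimulus duration) design,
# loosely mirroring the dyslexic vs control comparison described above.
rng = np.random.default_rng(1)
rows = []
for group, base in [("control", 61), ("dyslexic", 51)]:
    for duration, shift in [("short", -3 if group == "dyslexic" else -1), ("long", 0)]:
        for score in rng.normal(base + shift, 8, size=30):
            rows.append({"group": group, "duration": duration, "accuracy": score})
df = pd.DataFrame(rows)

# Two-way ANOVA including the group x duration interaction term.
model = ols("accuracy ~ C(group) * C(duration)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```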
Abstract:
Serotonin has been implicated in the neurobiology of depressive and anxiety disorders, but little is known about its role in the modulation of basic emotional processing. The aim of this study was to determine the effect of the selective serotonin reuptake inhibitor escitalopram on the perception of facial emotional expressions. Twelve healthy male volunteers completed two experimental sessions each, in a randomized, balanced-order, double-blind design. A single oral dose of escitalopram (10 mg) or placebo was administered 3 h before the task. Participants were presented with a task composed of six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) that were morphed between neutral and each standard emotion in 10% steps. Escitalopram facilitated the recognition of sadness and inhibited the recognition of happiness in male, but not female, faces. No drug effect on subjective measures was detected. These results confirm that serotonin modulates the recognition of emotional faces and suggest that the gender of the face may play a role in this modulation. Further studies including female volunteers are needed.
Abstract:
The purpose of this study was to determine the middle latency response (MLR) characteristics (latency and amplitude) in children with (central) auditory processing disorder [(C)APD], categorized as such by their performance on a central auditory test battery, and the effect of auditory training on these characteristics. Thirty children with (C)APD, 8 to 14 years of age, were tested using the MLR evoked potential. This group was then enrolled in an 8-week auditory training program and retested at its completion. A control group of 22 children without (C)APD, composed of relatives and acquaintances of those involved in the research, underwent the same testing at equal time intervals but was not enrolled in the auditory training program. Before auditory training, MLR results for the (C)APD group exhibited lower C3-A1 and C3-A2 wave amplitudes in comparison with the control group [C3-A1: 0.84 µV (mean), 0.39 (SD, standard deviation) for the (C)APD group and 1.18 µV (mean), 0.65 (SD) for the control group; C3-A2: 0.69 µV (mean), 0.31 (SD) for the (C)APD group and 1.00 µV (mean), 0.46 (SD) for the control group]. After training, the MLR C3-A1 [1.59 µV (mean), 0.82 (SD)] and C3-A2 [1.24 µV (mean), 0.73 (SD)] wave amplitudes of the (C)APD group increased significantly, so that there was no longer a significant difference in MLR amplitude between the (C)APD and control groups. These findings suggest progress in the use of electrophysiological measurements for the diagnosis and treatment of (C)APD.
Abstract:
This study aimed to analyze the influence of dehydration and different home preparation methods on the stability of alpha-carotene, beta-carotene and total carotenoids in carrots. Vitamin A values were evaluated after the different treatments. Carrots were submitted to steam cooking, water cooking with and without pressure, moist/dry cooking and conventional dehydration. Alpha- and beta-carotene were determined by high-performance liquid chromatography (HPLC), under conditions developed in-house, with UV-visible spectrophotometric detection at 470 nm, an RP-18 column, and methanol:acetonitrile:ethyl acetate (80:10:10) as the mobile phase. Total carotenoids were quantified spectrophotometrically at 449 nm. Retention of the analyzed carotenoids ranged from 60.13 to 85.64%. Water cooking without pressure promoted the highest retention of alpha- and beta-carotene and vitamin A values, while water cooking with pressure promoted the highest retention of total carotenoids. Dehydration promoted the greatest carotenoid losses. The results showed that, among the methods routinely used under domestic conditions, cooking without pressure, if performed under controlled time and temperature, is the best method, as it reduces losses of alpha- and beta-carotene, the main carotenoids present in carrots. Despite the significant carotenoid losses, carrots prepared by domestic methods remain a rich source of provitamin A.
Abstract:
Spent nickel catalyst (SNC) has the potential to harm the quality of the environment in a number of ways, and its disposal has a polluting effect. Optimum recovery of fat from SNC could protect the environment and reduce oil loss. Hexane has been the solvent of choice for oil extraction, and alternative solvents considered safer were evaluated here. Hexane, isopropanol, ethanol and heptane were examined using Soxhlet extraction. While hexane was more efficient in recovering oil from SNC, isopropanol gave a cleaner separation of oil from the waste material and also provided higher solvent recovery than the other solvents. Isopropanol extraction with chill separation of the miscella into a lower, oil-rich phase and an upper, solvent-rich, recyclable phase saves much of the energy of vaporization required for distillation. An aqueous extraction process assisted by an immiscible solvent was also tested: a solvent such as hexane was added to the SNC, and water was added later with continuous stirring. The mixture was stirred for about 30 minutes prior to centrifugation. The aqueous process extracted less oil than solvent extraction.
Abstract:
The aim of this work was to evaluate the effect of processing and roasting on the antioxidant activity of coffee brews. Brews prepared with light, medium and dark roasted coffees were analyzed. The pH, total solids, polyphenol, reducing substance and chlorogenic acid contents were determined. The antioxidant activity of the aqueous extracts, guaiacol decolorization and the capacity to inhibit lipid peroxidation were also analyzed. The antioxidant activity of the coffee brews was concentration-dependent. Antioxidant activity and polyphenol content decreased progressively with roasting: the light-roasted coffee showed the highest antioxidant activity and the dark-roasted coffee the lowest. The results indicate that the ingestion of coffee brews prepared with light and medium roasted coffees might protect cells from oxidative stress damage.
Abstract:
This study proposes alternatives to the current methods of processing round-cooked lobster. Paralyzing lobsters with a direct electric shock consumes 10.526 x 10^-3 kWh, which is significantly less than the 11 kWh required by the traditional thermal-shock method (based on 60 kg of lobsters). Better weight gain was obtained by immersing paralyzed lobsters in brine before cooking. Systematic trials combining 3, 6, or 9% brine concentrations with immersion periods of 15, 30, or 45 minutes were performed in order to determine the best combinations. A mathematical model was designed to predict the weight gain of lobsters of different sizes under any combination of treatments. For small lobsters, a 45-minute immersion in 6% brine gave the best response in terms of weight gain (4.7%), and cooking produced a weight loss of only 1.34% in relation to fresh lobster weight. For medium-sized lobsters, a 45-minute immersion in 9% brine produced a weight gain of 2.64%, and cooking a weight gain of 1.08%. For large lobsters, a 45-minute immersion in 6% brine produced a weight gain of 3.87%, and cooking a weight gain of 1.62%.
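The abstract mentions a mathematical model for predicting weight gain from brine concentration and immersion time but does not give its form. As a hedged illustration, one simple possibility is an ordinary least-squares fit of weight gain on concentration, time and their interaction; the data points below are hypothetical placeholders, not the study's measurements.

```python
import numpy as np

# Hypothetical trial data: brine concentration (%), immersion time (min)
# and observed weight gain (%) for one lobster size class.
conc = np.array([3, 3, 3, 6, 6, 6, 9, 9, 9], dtype=float)
time = np.array([15, 30, 45, 15, 30, 45, 15, 30, 45], dtype=float)
gain = np.array([1.1, 1.9, 2.6, 1.8, 3.2, 4.7, 1.5, 2.4, 3.4])

# Fit gain ~ b0 + b1*conc + b2*time + b3*conc*time by ordinary least squares.
X = np.column_stack([np.ones_like(conc), conc, time, conc * time])
coeffs, *_ = np.linalg.lstsq(X, gain, rcond=None)

def predict_gain(c: float, t: float) -> float:
    """Predicted weight gain (%) for brine concentration c (%) and immersion time t (min)."""
    return float(coeffs @ np.array([1.0, c, t, c * t]))

print(predict_gain(6, 45))  # e.g. the combination reported best for small lobsters
```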