931 results for computer processing of language
Abstract:
Techniques for the production of entomopathogenic bacteria have been developed with the aim of increasing productivity and reducing the cost of the fermentation process. To this end, agroindustrial wastes and by-products have been used as nutrient sources in culture media; in this study, manipueira, a by-product of cassava flour processing, was used. Fermentations were performed in 500 mL Erlenmeyer flasks containing 250 mL of culture medium, kept in a shaker at 180 rpm and 28°C, with the media composed of manipueira at concentrations ranging from 400 to 1000 mL/L. Process time varied between 48 and 120 hours. The following parameters were evaluated: cell growth, spore production, reduction of organic matter (COD analysis), and the variation of reducing sugars. Although cell growth was proportional to the manipueira concentration, spore production was similar in all cases at the end of the process, despite a lower rate of spore production at the highest concentrations. The percentage reduction of COD was also lower at the highest concentrations. In the analysis of reducing sugars, the highest concentrations showed the slowest decrease.
Abstract:
The collection of prices for basic goods is very important for the population; from the collection and processing of these data the CLI (Cost of Living Index), among other indices, is calculated, helping consumers to shop more rationally and with a clearer view of the impact of each product on their household budget, covering not only food but also cleaning and personal hygiene products. Currently, the basic goods price survey is conducted weekly in Botucatu - SP using paper spreadsheets. The aim of this work was to develop software that uses mobile devices in the data collection and storage phase of the basic goods survey in Botucatu - SP, eliminating the need to take notes on paper spreadsheets, increasing efficiency, and speeding up data processing. The work drew on mobile technology and development tools: the ".NET" Compact Framework platform and the Visual Basic ".NET" programming language were used on the handheld, making it possible to develop the system with object-oriented programming techniques and with greater speed and reliability in writing the code. An HP Pavilion dv3 personal computer and an Eten Glofish X500+ handheld computer were used. With the software for collection, data storage, and processing into a report completed, the in loco paper-spreadsheet phase was eliminated, and the whole process proved faster, more consistent, safer, and more efficient, with the data more readily available.
Abstract:
Recent Salmonella outbreaks have prompted the need for new processing options for peanut products. Traditional heating kill-steps have been shown to be ineffective in lipid-rich matrices such as peanut products. High pressure processing is one such option for peanut sauce because of the sauce's high water activity, which has proved to be a major contributing factor in microbial lethality under high pressure processing. Four different formulations of peanut sauce were inoculated with a five-strain Salmonella cocktail and high pressure processed. Results indicate that increasing pressure or increasing hold time increases log10 reductions. The Weibull model was fitted to each kill curve, with b and n values significantly optimized for each curve (p-value < 0.05). Most curves had an n parameter value less than 1, indicating that the population had a dramatic initial reduction but tailed off as time increased, leaving a small resistant population. ANOVA of the b and n parameters shows more significant differences between b parameters than between n parameters, meaning that most treatments showed a similar tailing effect but differed in the magnitude of the reduction. Comparisons between peanut sauce formulations at the same pressure treatments indicate that increasing the amount of organic peanut butter within the sauce formulation decreases log10 reductions. This could be due to a protective effect from the lipids in the peanut butter, or it may be due to other factors such as nutrient availability or water activity. Sauces pressurized at lower temperatures had decreased log10 reductions, indicating that cooler temperatures offered some protective effect. Log10 reductions exceeded 5 logs, indicating that high pressure processing may be a suitable option as a kill-step for Salmonella in industrial processing of peanut sauces. Future research should include high pressure processing of other peanut products with high water activities, such as sauces and syrups, as well as research to determine the effects of water activity and lipid composition within a food matrix such as peanut sauces.
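As a concrete illustration of the model mentioned above, the sketch below fits the Weibull inactivation form log10(N/N0) = -b * t^n to a hypothetical kill curve with SciPy; the data points, initial guesses, and variable names are illustrative assumptions, not values from the study.

```python
# Minimal sketch: fitting the Weibull inactivation model log10(N/N0) = -b * t^n
# to a hypothetical kill curve. Data values are illustrative, not from the study.
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, b, n):
    """Log10 survival ratio as a function of hold time t."""
    return -b * np.power(t, n)

t_obs = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])          # hypothetical hold times (min)
logS_obs = np.array([-1.2, -2.0, -3.1, -4.2, -4.8, -5.1])  # hypothetical log10(N/N0) values

(b, n), _ = curve_fit(weibull_log_survival, t_obs, logS_obs, p0=(1.0, 1.0))
print(f"b = {b:.2f}, n = {n:.2f}")  # n < 1 indicates tailing (a resistant subpopulation)
```

An n below 1 reproduces the tailing behaviour described in the abstract: a steep initial drop that flattens as hold time increases.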
Abstract:
Even though the digital processing of documents is increasingly widespread in industry, printed documents are still largely in use. In order to process the contents of printed documents electronically, information must be extracted from digital images of those documents. When dealing with complex documents, in which the contents of different regions and fields can be highly heterogeneous with respect to layout, printing quality, and the use of fonts and typing standards, reconstructing the contents of documents from digital images can be a difficult problem. In this article we present an efficient solution to this problem, in which the semantic contents of the fields of a complex document are extracted from a digital image.
Abstract:
This article analyzes the role that has been attributed to grammar throughout the history of foreign language teaching, with special emphasis on the methods and approaches of the twentieth century. In order to support our argument, we discuss the notion of grammar by proposing a conceptual continuum that includes the main meanings of the term that are relevant to our research. We also address the issue of "pedagogical grammar" and consider the position of grammar in the different approaches of the "era of the methods" and in the current "post-method condition" in the field of language teaching and learning. The findings presented at the end of the text consist of recognizing the central role that grammar has played throughout the history of the methods and approaches, in which grammar has always been present through the definition of the progression of contents. The rationale we propose for this is the recognition that the dissociation between what is said and how it is said cannot be more than theoretical and, thus, artificial.
Abstract:
Context: The aberrant processing of salience is thought to be a fundamental factor underlying psychosis. Cannabis can induce acute psychotic symptoms, and its chronic use may increase the risk of schizophrenia. We investigated whether its psychotic effects are mediated through an influence on attentional salience processing. Objective: To examine the effects of Delta 9-tetrahydrocannabinol (Delta 9-THC) and cannabidiol (CBD) on regional brain function during salience processing. Design: Volunteers were studied using event-related functional magnetic resonance imaging on 3 occasions after administration of Delta 9-THC, CBD, or placebo while performing a visual oddball detection paradigm that involved allocation of attention to infrequent (oddball) stimuli within a string of frequent (standard) stimuli. Setting: University center. Participants: Fifteen healthy men with minimal previous cannabis use. Main Outcome Measures: Symptom ratings, task performance, and regional brain activation. Results: During the processing of oddball stimuli, relative to placebo, Delta 9-THC attenuated activation in the right caudate but augmented it in the right prefrontal cortex. Delta 9-Tetrahydrocannabinol also reduced the response latency to standard relative to oddball stimuli. The effect of Delta 9-THC in the right caudate was negatively correlated with the severity of the psychotic symptoms it induced and its effect on response latency. The effects of CBD on task-related activation were in the opposite direction of those of Delta 9-THC; relative to placebo, CBD augmented left caudate and hippocampal activation but attenuated right prefrontal activation. Conclusions: Delta 9-Tetrahydrocannabinol and CBD differentially modulate prefrontal, striatal, and hippocampal function during attentional salience processing. These effects may contribute to the effects of cannabis on psychotic symptoms and on the risk of psychotic disorders.
Abstract:
The extraction of information about neural activity timing from BOLD signal is a challenging task as the shape of the BOLD curve does not directly reflect the temporal characteristics of electrical activity of neurons. In this work, we introduce the concept of neural processing time (NPT) as a parameter of the biophysical model of the hemodynamic response function (HRF). Through this new concept we aim to infer more accurately the duration of neuronal response from the highly nonlinear BOLD effect. The face validity and applicability of the concept of NPT are evaluated through simulations and analysis of experimental time series. The results of both simulation and application were compared with summary measures of HRF shape. The experiment that was analyzed consisted of a decision-making paradigm with simultaneous emotional distracters. We hypothesize that the NPT in primary sensory areas, like the fusiform gyrus, is approximately the stimulus presentation duration. On the other hand, in areas related to processing of an emotional distracter, the NPT should depend on the experimental condition. As predicted, the NPT in fusiform gyrus is close to the stimulus duration and the NPT in dorsal anterior cingulate gyrus depends on the presence of an emotional distracter. Interestingly, the NPT in right but not left dorsal lateral prefrontal cortex depends on the stimulus emotional content. The summary measures of HRF obtained by a standard approach did not detect the variations observed in the NPT. Hum Brain Mapp, 2012. (C) 2010 Wiley Periodicals, Inc.
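To make the idea of a neural processing time more tangible, the sketch below simulates, under a simple linear convolution assumption, how lengthening the neural response changes the BOLD curve. It uses a canonical double-gamma HRF rather than the biophysical model estimated in the work, and all parameter values are illustrative.

```python
# Minimal sketch: effect of neural processing time (NPT) on a simulated BOLD response,
# using a canonical double-gamma HRF (not the biophysical model from the study).
import numpy as np
from scipy.stats import gamma

dt = 0.1                                           # time step (s)
t = np.arange(0, 30, dt)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)    # canonical double-gamma shape
hrf /= hrf.sum()

def bold_response(npt_seconds):
    """Convolve a boxcar of duration NPT with the HRF (linear approximation)."""
    neural = (t < npt_seconds).astype(float)
    return np.convolve(neural, hrf)[: t.size]

short = bold_response(0.5)       # e.g. activity lasting only as long as the stimulus
prolonged = bold_response(3.0)   # e.g. prolonged processing of an emotional distracter
print(round(short.max(), 3), round(prolonged.max(), 3))  # longer NPT: larger, later peak
```

The point of the sketch is only that rather similar HRF shapes can correspond to quite different response durations, which is why summary measures of the HRF may miss variations that an explicit NPT parameter captures.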
Abstract:
BECTS accounts for the vast majority of childhood focal epilepsy. Because of the age at which children suffer from this disease, i.e., school age between 6 and 9 years, the condition is often referred to as a school disorder by parents and teachers. Objective: The aim of this study was to evaluate the academic performance of children with BECTS, according to the clinical and electroencephalographic ILAE criteria, and to compare the results of neuropsychological tests of language and attention with the frequency of epileptic discharges. Methods: The performance of 40 school children with BECTS was evaluated by applying a school performance test (SPT), neuropsychological tests (WISC and Trail-Making), and language tests (Illinois Test of Psycholinguistic Abilities - ITPA - and Staggered Spondaic Word - SSW). The same tests were applied in the control group. Results: Children with BECTS, when compared to those in the control group, showed lower scores in academic performance (SPT), the digits and similarities subtests of the WISC, the auditory processing subtest of the SSW, and the ITPA at the representational and automatic levels. The study showed that epileptic discharges did not influence the results. Conclusion: Children with BECTS obtained significantly lower scores in tests of academic performance when compared with those in the control group, probably due to executive dysfunction. (C) 2011 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
Abstract:
Although hydrophobicity is usually a difficult parameter to determine in the field, it has been pointed out as a good option for monitoring the aging of polymeric outdoor insulators. For this purpose, digital processing of photographs of wet insulators has become the main technique. However, important challenges remain to be overcome, such as the interference of uncontrolled illumination conditions in the analysis and the lack of standard surfaces with different levels of hydrophobicity. In this paper, the photographic samples were digitally filtered to reduce the influence of illumination, and hydrophobic surface samples were prepared by wetting silicone surfaces with water-alcohol solutions. Furthermore, no previous studies were found that attempt to quantify and relate these properties through a mathematical function that could be used in the field by electrical utilities. Based on these considerations, high quality images of numerous hydrophobic surfaces were obtained, and three image processing methodologies, the fractal dimension and two Haralick texture descriptors (entropy and homogeneity), combined with several digital filters, were compared. The Haralick entropy descriptor combined with the White Top-Hat filter gave the best results for classifying hydrophobicity.
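A minimal sketch of the kind of pipeline the abstract describes is given below: a White Top-Hat filter to suppress slow illumination variations, followed by two Haralick-style texture measures (entropy and homogeneity) computed from a gray-level co-occurrence matrix. The file name, footprint size, quantization, and offsets are illustrative assumptions, and the sketch is not the authors' exact methodology.

```python
# Minimal sketch: White Top-Hat filtering followed by Haralick texture measures
# (entropy and homogeneity) on a surface photo; parameters are illustrative.
import numpy as np
from skimage import io, img_as_ubyte
from skimage.color import rgb2gray
from skimage.morphology import white_tophat, disk
from skimage.feature import graycomatrix, graycoprops

image = img_as_ubyte(rgb2gray(io.imread("wet_surface_photo.png")))  # hypothetical input
filtered = white_tophat(image, footprint=disk(15))  # reduce slow illumination changes

# Gray-level co-occurrence matrix at distance 1, horizontal offset, 256 gray levels
glcm = graycomatrix(filtered, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))     # Haralick entropy of the GLCM

print(f"entropy = {entropy:.3f}, homogeneity = {homogeneity:.3f}")
```

Higher entropy and lower homogeneity would be expected where water breaks into many small droplets, which is the intuition behind using texture descriptors as hydrophobicity indicators.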
Abstract:
The use of stone and the ways of processing it have been very important in the vernacular architecture of the cross-border Carso. In the Carso this represents a centuries-old legacy with largely uniform typological characteristics. Stone was the main constituent of the local architecture, setting and shaping the human environment and incorporating the history of places through its specific symbolic and constructive language. The primary aim of this research is the recognition of the constructive rules and values embedded in Carso rural architecture through the use and processing of stone. Central to this investigation is the typological reading, aimed at analyzing the constructive language expressed by this legacy through the analysis of the relationship between type, technique and material.
Abstract:
The resolution of infections with Leishmania major depends on the secretion of IFN-γ by both CD4+ and CD8+ T cells. To date, only one epitope from the parasite LACK protein has been described in the literature as driving an effective CD4+ T cell-mediated immune response. The aim of the present work was therefore to investigate possible MHC class I-dependent CD8+ T cell responses. For this approach, the effect of vaccination with LACK protein fused to the protein transduction domain of HIV-1 (TAT) was analyzed first. The efficacy of TAT-LACK with respect to CD8+ T cells was demonstrated in depletion experiments by in vivo protein vaccination of resistant C57BL/6 mice. The processing of proteins prior to the presentation of immunogenic peptides to T cells is strictly required. Therefore, the role of the IFN-γ-inducible immunoproteasome in the processing of parasite proteins and the presentation of peptides bound to MHC class I molecules was investigated in this work by in vivo and in vitro experiments. An immunoproteasome-independent processing pathway could be demonstrated. Furthermore, parasite lysate (SLA) from both promastigotes and amastigotes was fractionated; in follow-up experiments these fractions can be screened for immunodominant proteins/peptides. Finally, epitope predictions for CD8+ T cells were performed for both parasite life stages using computer-based software. Three hundred of these epitopes were synthesized and will be used in further experiments to characterize their immunogenic properties. Taken together, the present work contributes substantially to the understanding of the complex mechanisms of processing and, ultimately, to the identification of potential CD8+ T cell epitopes. A detailed understanding of the processing of CD8+ T cell epitopes of Leishmania major via the MHC class I pathway is of utmost importance. The characterization and identification of these peptides will have a decisive influence on the further development of vaccines against this important human-pathogenic parasite.
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated in an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and timely analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality-of-service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and we present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, by performing a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing the data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
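As a toy illustration of the general idea, and explicitly not the Quasit or LAAR implementation, the sketch below shows how a middleware might map hypothetical per-flow QoS classes onto different operator replication levels, making the trade-off between guarantees and resource cost explicit. The class names and the replication policy are assumptions made for the example.

```python
# Toy sketch, not Quasit/LAAR: mapping hypothetical per-flow QoS classes onto
# replication levels to trade fault-tolerance guarantees against resource cost.
from dataclasses import dataclass

@dataclass
class StreamFlow:
    name: str
    qos: str  # "strict", "relaxed", or "best_effort" (hypothetical classes)

REPLICAS = {"strict": 3, "relaxed": 2, "best_effort": 1}  # illustrative policy

def provision(flows):
    """Return (flow, replica count) pairs; the replica total approximates cost."""
    plan = [(flow.name, REPLICAS[flow.qos]) for flow in flows]
    return plan, sum(r for _, r in plan)

plan, cost = provision([
    StreamFlow("health-monitor", "strict"),         # correctness matters most
    StreamFlow("energy-meter", "relaxed"),          # occasional loss tolerable
    StreamFlow("entertainment-feed", "best_effort"),
])
print(plan, "total operator replicas:", cost)
```

Weakening the guarantee for flows whose application semantics tolerate it, as in the last two entries, is the kind of cost reduction the thesis studies with partial fault-tolerance policies.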
Abstract:
Specific language impairment (SLI) is a complex neurodevelopmental disorder defined as an unexpected failure to develop normal language abilities for no obvious reason. Copy number variants (CNVs) are an important source of variation in the susceptibility to neuropsychiatric disorders. Therefore, a CNV study within SLI families was performed to investigate the role of structural variants in SLI. Among the identified CNVs, we focused on CNVs on chromosome 15q11-q13, recurrently observed in neuropsychiatric conditions, and a homozygous exonic microdeletion in ZNF277. Since this microdeletion falls within the AUTS1 locus, a region linked to autism spectrum disorders (ASD), we investigated a potential role of ZNF277 in SLI and ASD. Frequency data and expression analysis of the ZNF277 microdeletion suggested that this variant may contribute to the risk of language impairments in a complex manner, that is independent of the autism risk previously described in this region. Moreover, we identified an affected individual with a dihydropyrimidine dehydrogenase (DPD) deficiency, caused by compound heterozygosity of two deleterious variants in the gene DPYD. Since DPYD represents a good candidate gene for both SLI and ASD, we investigated its involvement in the susceptibility to these two disorders, focusing on the splicing variant rs3918290, the most common mutation in the DPD deficiency. We observed a higher frequency of rs3918290 in SLI cases (1.2%), compared to controls (~0.6%), while no difference was observed in a large ASD cohort. DPYD mutation screening in 4 SLI and 7 ASD families carrying the splicing variant identified six known missense changes and a novel variant in the promoter region. These data suggest that the combined effect of the mutations identified in affected individuals may lead to an altered DPD activity and that rare variants in DPYD might contribute to a minority of cases, in conjunction with other genetic or non-genetic factors.
Abstract:
This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously while they are fulfilling the tasks for which they were deployed in the first place. Many of these tasks require a deep analysis of language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures with the help of linguistic information encoded in a grammar G and a lexicon L: G + L + C → S (1). The idea that underlies intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon: G + L + S → L' (2). Moreover, the thesis claims that a system can only be considered intelligent if it does not just make maximum use of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. So, one of the central elements in this work is the formulation of a set of criteria for intelligent lexical acquisition systems subsumed under one paradigm: the Learn-Alpha design rule. The thesis describes the design and quality of a prototype for such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora. As an instance of four major challenges in constructing such a system, it should be mentioned that a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity management system, b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment, c) the reliability of these entries depends on the system's decision on whether it has seen 'enough' input, and d) general properties of language might render some lexical features indeterminable if the system tries to acquire them with too high a precision. The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon. This work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars, and learning by unification. Then the postulation of the Learn-Alpha design rule is presented. The second chapter outlines the theory that underlies Learn-Alpha and exposes all the related notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha.
The fourth chapter presents the design and results of a bootstrapping experiment conducted on this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, and selection of prepositions and sentential complements, among others. The thesis concludes with a review of the findings, motivation for further improvements, and proposals for future research on the automatic induction of lexical features.
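A toy illustration of the schematic loop G + L + C → S followed by G + L + S → L' is sketched below; it stands in for the idea only and is not the ANALYZE-LEARN-REDUCE framework. The "analysis" is reduced to a crude valency guess, and all lexical entries and names are invented for the example.

```python
# Toy sketch of the acquisition loop G + L + C -> S, then G + L + S -> L'.
# Not the ANALYZE-LEARN-REDUCE framework: the "analysis" here is a crude stand-in
# for a full grammatical parse, and all lexical entries are invented.
from collections import defaultdict

def analyse(utterance, lexicon):
    """Stand-in for G + L + C -> S: emit (verb, observed complement count) pairs."""
    tokens = utterance.split()
    return [(tok, len(tokens) - i - 1)
            for i, tok in enumerate(tokens)
            if lexicon.get(tok, {}).get("pos") == "verb"]

def learn(corpus, lexicon):
    """Stand-in for G + L + S -> L': revise entries from the observed structures."""
    evidence = defaultdict(list)
    for utterance in corpus:
        for verb, count in analyse(utterance, lexicon):
            evidence[verb].append(count)
    revised = dict(lexicon)
    for verb, counts in evidence.items():
        revised[verb] = {**lexicon[verb], "valency": max(set(counts), key=counts.count)}
    return revised

lexicon = {"gave": {"pos": "verb"}, "slept": {"pos": "verb"}}
corpus = ["she gave him bread", "he slept", "they gave money"]
print(learn(corpus, lexicon))
```

Revising an entry when later evidence contradicts it, rather than only accumulating counts, is what separates the intelligent acquisition systems discussed in the thesis from this toy loop.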
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and to gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third tool is an efficient algorithm for data segmentation implemented as part of Insight. Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining at runtime data from different sources, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, two examples of which are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of a sequence of images for the visualization of cyclone simulations. The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and the results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
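To give a flavour of what feature segmentation of gridded atmospheric data involves, the sketch below applies a simple threshold-and-label segmentation to a synthetic 3D wind-speed field using SciPy's connected-component labelling. The thesis algorithm is far more elaborate (tracking, genesis/lysis/merge/split events, control of under- and over-segmentation); the field, threshold, and grid size here are illustrative assumptions.

```python
# Minimal sketch: threshold-and-label segmentation of a synthetic 3D field.
# Not the thesis algorithm; data, threshold, and grid size are illustrative.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
wind_speed = ndimage.gaussian_filter(rng.normal(30.0, 10.0, size=(40, 60, 90)), sigma=3)

threshold = np.percentile(wind_speed, 99)      # hypothetical criterion: strongest 1% of cells
mask = wind_speed > threshold
labels, n_features = ndimage.label(mask)       # 3D connected components as discrete features

sizes = ndimage.sum(mask, labels, index=range(1, n_features + 1))
print(f"{n_features} candidate features; largest spans {int(sizes.max())} grid cells")
```

Reducing a large gridded field to a handful of labelled objects in this way is what makes the subsequent visualization, statistics, and event detection described in the abstract practical.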