931 results for Processing Graph


Relevance: 20.00%

Abstract:

The usage of digital content, such as video clips and images, has increased dramatically during the last decade. Local image features have been applied increasingly in various image and video retrieval applications. This thesis evaluates local features and applies them to image and video processing tasks. The results of the study show that 1) the performance of different local feature detector and descriptor methods varies significantly in object class matching, 2) local features can be applied in image alignment with results superior to the state of the art, 3) the local feature based shot boundary detection method produces promising results, and 4) the local feature based hierarchical video summarization method opens a promising new research direction. In conclusion, this thesis presents local features as a powerful tool in many applications, and future work should concentrate on improving the quality of the local features.

Relevance: 20.00%

Abstract:

ABSTRACT Five experiments were conducted to evaluate the hypothesis that Solanum americanum density and time of coexistence affect the quality of processing tomato fruit. The tomato crop was established using either the direct drilling or the transplanting technique. The factors evaluated consisted of weed density (from 0 up to 6 plants m-2) and time of weed interference (early bloom stage, full flowering stage, fruit filling, and harvest time). The effects of competition on tomato fruit quality were analysed using a multiple model. Tomato variables evaluated included industrial fruit types (which depended on ripeness and disease infection) and soluble solids level (°Brix). Tomato fruit quality is dependent on the factors tested. Under low densities (< 6 plants m-2) of S. americanum there was a small impact on the quality of the tomato fruits. The percentage of grade A (mature fruit with red color and without pathogen infection) tomato fruits was the variable most affected by the independent variables. The impact of these independent variables on the percentage of grade C (green and/or with more than 15% disease infection) tomato yield was of smaller magnitude and followed an inverse trend to that observed for grade A. The level of soluble solids was influenced by weed interference in only two experiments, and the impact was of small magnitude. The impact of the results on current and future crop management practices is discussed.

Relevance: 20.00%

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 20.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 20.00%

Abstract:

Biomedical natural language processing (BioNLP) is a subfield of natural language processing, an area of computational linguistics concerned with developing programs that work with natural language: written texts and speech. Biomedical relation extraction concerns the detection of semantic relations, such as protein-protein interactions (PPI), from scientific texts. The aim is to enhance information retrieval by detecting relations between concepts, not just individual concepts as with a keyword search. In recent years, events have been proposed as a more detailed alternative to simple pairwise PPI relations. Events provide a systematic, structural representation for annotating the content of natural language texts. Events are characterized by annotated trigger words, directed and typed arguments, and the ability to nest other events. For example, the sentence “Protein A causes protein B to bind protein C” can be annotated with the nested event structure CAUSE(A, BIND(B, C)). Converted to such formal representations, the information in natural language texts can be used by computational applications. Biomedical event annotations were introduced by the BioInfer and GENIA corpora, and event extraction was popularized by the BioNLP'09 Shared Task on Event Extraction. In this thesis we present a method for automated event extraction, implemented as the Turku Event Extraction System (TEES). A unified graph format is defined for representing event annotations, and the problem of extracting complex event structures is decomposed into a number of independent classification tasks. These classification tasks are solved using SVM and RLS classifiers, utilizing rich feature representations built from full dependency parsing. Building on earlier work on pairwise relation extraction and using a generalized graph representation, the resulting TEES system is capable of detecting binary relations as well as complex event structures.
We show that this event extraction system has good performance, reaching first place in the BioNLP'09 Shared Task on Event Extraction. Subsequently, TEES has achieved several first ranks in the BioNLP'11 and BioNLP'13 Shared Tasks, and has shown competitive performance in the binary-relation Drug-Drug Interaction Extraction 2011 and 2013 shared tasks. The Turku Event Extraction System is published as a freely available open-source project, documenting the research in detail as well as making the method available for practical applications. In particular, in this thesis we describe the application of the event extraction method to PubMed-scale text mining, showing how the developed approach not only performs well, but is generalizable and applicable to large-scale real-world text mining projects. Finally, we discuss related literature, summarize the contributions of the work and present some thoughts on future directions for biomedical event extraction. This thesis includes and builds on six original research publications. The first of these introduces the analysis of dependency parses that led to the development of TEES. The entries in the three BioNLP Shared Tasks, as well as in the DDIExtraction 2011 task, are covered in four publications, and the sixth demonstrates the application of the system to PubMed-scale text mining.
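As a minimal illustration of the nested event representation described in the abstract (this is BioNLP-style annotation in general, not TEES's actual data format; the role names "Cause" and "Theme" are conventional but chosen here for the example), an event can be modelled as a typed trigger node with directed, typed arguments, where an argument may itself be another event:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """An event node: a type (trigger) plus directed, typed arguments.
    An argument's target is either an entity name or another Event,
    which is how nested structures like CAUSE(A, BIND(B, C)) arise."""
    etype: str
    args: list = field(default_factory=list)  # (role, target) pairs

    def flatten(self) -> str:
        """Render the nested structure as TYPE(arg, ...) text."""
        parts = [t.flatten() if isinstance(t, Event) else t
                 for _, t in self.args]
        return f"{self.etype}({', '.join(parts)})"

# "Protein A causes protein B to bind protein C"
bind = Event("BIND", [("Theme", "B"), ("Theme2", "C")])
cause = Event("CAUSE", [("Cause", "A"), ("Theme", bind)])
print(cause.flatten())  # CAUSE(A, BIND(B, C))
```

Because events form a directed graph over triggers and entities, the same structure also supports the unified graph format the thesis uses to decompose extraction into classification tasks.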

Relevance: 20.00%

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field; digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an, as small as possible, set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized, in the context of design space exploration, by the development tools to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
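The graph model described above can be sketched in a few lines. The following toy interpreter (a generic dataflow sketch, not RVC-CAL or the thesis's compiler infrastructure) shows actors that communicate only through queues and fire whenever all of their input queues hold tokens, driven by a trivial dynamic scheduler of the kind whose overhead quasi-static scheduling aims to reduce:

```python
from collections import deque

class Node:
    """A dataflow actor: fires when every input queue has a token."""
    def __init__(self, name, fn, inputs, output):
        self.name, self.fn = name, fn
        self.inputs, self.output = inputs, output   # edges are deques

    def can_fire(self):
        return all(q for q in self.inputs)          # non-empty queues

    def fire(self):
        # Consume one token per input edge, compute, produce one token.
        args = [q.popleft() for q in self.inputs]
        self.output.append(self.fn(*args))

# Graph: two source edges feed an adder, whose edge feeds a doubler.
a, b, c, d = deque([1, 2]), deque([10, 20]), deque(), deque()
add = Node("add", lambda x, y: x + y, [a, b], c)
dbl = Node("dbl", lambda x: 2 * x, [c], d)

# Trivial dynamic scheduler: repeatedly fire any ready node.
nodes = [add, dbl]
while any(n.can_fire() for n in nodes):
    for n in nodes:
        if n.can_fire():
            n.fire()
print(list(d))  # [22, 44]
```

In a quasi-static schedule, the `while`/`for` polling loop would largely disappear: sequences of firings whose order is fixed by the graph are pre-computed, leaving only the genuinely data-dependent decisions to run time.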

Relevance: 20.00%

Abstract:

The nucleus tractus solitarii (NTS) receives afferent projections from the arterial baroreceptors, carotid chemoreceptors and cardiopulmonary receptors and as a function of this information produces autonomic adjustments in order to maintain arterial blood pressure within a narrow range of variation. The activation of each of these cardiovascular afferents produces a specific autonomic response by the excitation of neuronal projections from the NTS to the ventrolateral areas of the medulla (nucleus ambiguus, caudal and rostral ventrolateral medulla). The neurotransmitters at the NTS level as well as the excitatory amino acid (EAA) receptors involved in the processing of the autonomic responses in the NTS, although extensively studied, remain to be completely elucidated. In the present review we discuss the role of the EAA L-glutamate and its different receptor subtypes in the processing of the cardiovascular reflexes in the NTS. The data presented in this review related to the neurotransmission in the NTS are based on experimental evidence obtained in our laboratory in unanesthetized rats. The two major conclusions of the present review are that a) the excitation of the cardiovagal component by cardiovascular reflex activation (chemo- and Bezold-Jarisch reflexes) or by L-glutamate microinjection into the NTS is mediated by N-methyl-D-aspartate (NMDA) receptors, and b) the sympatho-excitatory component of the chemoreflex and the pressor response to L-glutamate microinjected into the NTS are not affected by an NMDA receptor antagonist, suggesting that the sympatho-excitatory component of these responses is mediated by non-NMDA receptors.

Relevance: 20.00%

Abstract:

In this thesis, the suitability of different trackers for finger tracking in high-speed videos was studied. Tracked finger trajectories from the videos were post-processed and analysed using various filtering and smoothing methods. Position derivatives of the trajectories, i.e. speed and acceleration, were extracted for the purposes of hand motion analysis. Overall, two methods, Kernelized Correlation Filters and Spatio-Temporal Context Learning tracking, performed better than the others in the tests. Both achieved high accuracy on the selected high-speed videos and also allowed real-time processing, being able to process over 500 frames per second. In addition, the results showed that different filtering methods can be applied to produce more appropriate velocity and acceleration curves calculated from the tracking data. Local Regression filtering and the Unscented Kalman Smoother gave the best results in the tests. Furthermore, the results showed that the tracking and filtering methods are suitable for high-speed hand tracking and trajectory-data post-processing.
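The derivative-extraction step can be sketched as follows. This is a minimal stand-in, assuming 1-D pixel positions sampled at 500 fps; a plain moving average replaces the Local Regression and Unscented Kalman smoothers the thesis actually evaluates:

```python
def smooth(xs, window=3):
    """Centered moving average; a simple placeholder for the Local
    Regression and Unscented Kalman smoothers used in the thesis."""
    half = window // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

def derivative(xs, dt):
    """Central finite differences, one-sided at the endpoints."""
    n = len(xs)
    d = [0.0] * n
    d[0] = (xs[1] - xs[0]) / dt
    d[n - 1] = (xs[n - 1] - xs[n - 2]) / dt
    for i in range(1, n - 1):
        d[i] = (xs[i + 1] - xs[i - 1]) / (2 * dt)
    return d

dt = 1 / 500.0                       # 500 fps high-speed video
pos = [0.0, 1.0, 4.0, 9.0, 16.0]     # toy 1-D finger positions (pixels)
vel = derivative(smooth(pos), dt)    # velocity, pixels/s
acc = derivative(vel, dt)            # acceleration, pixels/s^2
```

Smoothing before differentiation matters because finite differencing amplifies tracking jitter, which is why the choice of filter visibly changes the resulting velocity and acceleration curves.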

Relevance: 20.00%

Abstract:

This study was designed to evaluate the effect of different conditions of collection, transport and storage on the quality of blood samples from normal individuals in terms of the activity of the enzymes β-glucuronidase, total hexosaminidase, hexosaminidase A, arylsulfatase A and β-galactosidase. The enzyme activities were not affected by the different materials used for collection (plastic syringes or vacuum glass tubes). In the evaluation of different heparin concentrations (10% heparin, 5% heparin, and heparinized syringe) in the syringes, it was observed that higher doses resulted in an increase of at least 1-fold in the activities of β-galactosidase, total hexosaminidase and hexosaminidase A in leukocytes, and β-glucuronidase in plasma. When the effects of time and means of transportation were studied, samples that had been kept at room temperature showed greater deterioration with time (72 and 96 h) before processing, and in this case it was impossible to isolate leukocytes from most samples. Comparison of heparin and acid citrate-dextrose (ACD) as anticoagulants revealed that β-glucuronidase and hexosaminidase activities in plasma reached levels near the lower normal limits when ACD was used. In conclusion, we observed that heparin should be used as the preferred anticoagulant when measuring these lysosomal enzyme activities, and we recommend that, when transport time is more than 24 h, samples be shipped by air in a styrofoam box containing wet ice.

Relevance: 20.00%

Abstract:

The aim of this master’s thesis is to research and analyze how purchase invoice processing can be automated and streamlined in a system renewal project. The impacts of workflow automation on invoice handling are studied in terms of time, cost and quality. Purchase invoice processing has a lot of potential for automation because of its labor-intensive and repetitive nature. As a case study combining both qualitative and quantitative methods, the topic is approached from a business process management point of view. The current process was first explored through interviews and workshop meetings to create a holistic understanding of the process at hand. Requirements for process streamlining were then researched focusing on specified vendors and their purchase invoices, which helped to identify the critical factors for successful invoice automation. To optimize the flow from invoice receipt to approval for payment, the invoice receiving process was outsourced and the automation functionalities of the new system were utilized in invoice handling. The quality of invoice data and the need for simple, structured purchase order (PO) invoices were emphasized in the system testing phase. Hence, consolidated invoices containing references to multiple PO or blanket release numbers should be simplified in order to use automated PO matching. With non-PO invoices, it is important to receive the buyer reference details in an applicable invoice data field so that automation rules can be created to route invoices to a review and approval flow. In the beginning of the project, invoice processing was seen as ineffective both time- and cost-wise, and it required a lot of manual labor to carry out all tasks. In accordance with the testing results, it was estimated that over half of the invoices could be automated within a year after system implementation. Processing times could be reduced remarkably, which would then result in savings of up to 40% in annual processing costs. Due to several advancements in the purchase invoice process, business process quality could also be perceived as improved.
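The routing logic described above (automated matching for PO invoices, buyer-reference routing for non-PO invoices) can be sketched roughly as follows. The field names, statuses and tolerance are illustrative assumptions for this sketch, not the case company's actual system schema:

```python
def route_invoice(invoice, open_pos):
    """Decide handling for an incoming invoice record.

    invoice:  dict with 'amount' and optional 'po_number' / 'buyer_ref'
              fields (hypothetical names, not a real system's schema).
    open_pos: mapping of PO number -> expected PO amount.
    """
    po = invoice.get("po_number")
    if po is not None:
        expected = open_pos.get(po)
        if expected is not None and abs(invoice["amount"] - expected) < 0.01:
            return "auto-approve"     # automated PO matching succeeded
        return "manual-review"        # unknown PO or amount mismatch
    if invoice.get("buyer_ref"):
        return "route-to-approver"    # non-PO invoice with buyer reference
    return "return-to-vendor"         # nothing to match or route on

open_pos = {"PO-1001": 250.00}
print(route_invoice({"po_number": "PO-1001", "amount": 250.00}, open_pos))
print(route_invoice({"buyer_ref": "buyer-42", "amount": 99.90}, open_pos))
```

The sketch also makes the thesis's two data-quality points concrete: a consolidated invoice referencing several POs does not fit the single `po_number` match, and a non-PO invoice without a usable buyer reference cannot be routed automatically at all.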

Relevance: 20.00%

Abstract:

We studied the action of high pressure processing on the inactivation of two foodborne pathogens, Staphylococcus aureus ATCC 6538 and Salmonella enteritidis ATCC 13076, suspended in a culture medium and inoculated into caviar samples. The baroresistance of the two pathogens in a tryptic soy broth suspension at a concentration of 10(8)-10(9) colony-forming units/ml was tested under continuous and cycled pressurization in the 150- to 550-MPa range for 15-min treatments at room temperature. Increasing the number of cycles reduced the pressure level required to totally inactivate both microorganisms in the tryptic soy broth suspension, whereas the effect of different procedure times on complete inactivation of the microorganisms inoculated into caviar was similar.

Relevance: 20.00%

Abstract:

Feature extraction is the part of pattern recognition where the sensor data is transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the next stages of the system, while preserving the information essential for discriminating the data into different classes. For instance, in the case of image analysis the actual image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means of detecting features that are invariant to certain types of illumination changes. Finally, classification tries to make decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features. Low-level Local Binary Pattern (LBP) based features play a main role in the analysis. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely affected by the decisions made during the implementation phase. The implementation alternatives of LBP based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated into this framework, by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented.
Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model where LBPs are seen as combinations of n-tuples is also presented.
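For reference, the basic 3x3 LBP operator underlying these descriptors can be sketched as follows (a minimal software version; rotation-invariant and multi-scale variants, and the massively parallel focal-plane implementation, are beyond this sketch):

```python
def lbp_code(patch):
    """Basic 3x3 Local Binary Pattern: threshold each of the 8
    neighbours against the centre pixel and pack the resulting
    bits, walking the neighbours clockwise from the top-left."""
    centre = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2),
             (2, 2), (2, 1), (2, 0), (1, 0)]   # clockwise ring
    code = 0
    for bit, (r, c) in enumerate(order):
        if patch[r][c] >= centre:               # non-parametric test
            code |= 1 << bit
    return code                                 # 0..255

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))
```

Because the code depends only on the sign of intensity differences, a uniform brightness shift of the whole patch leaves it unchanged, which is exactly the illumination invariance the thesis exploits.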

Relevance: 20.00%

Abstract:

No fully effective treatment has been developed since the discovery of Chagas' disease by Carlos Chagas in 1909. Since drug-resistant Trypanosoma cruzi strains are occurring and the current therapy is effective in the acute phase but has various adverse side effects, more studies are needed to characterize the susceptibility of T. cruzi to new drugs. Many natural and/or synthetic substances showing trypanocidal activity have been used, even though they are not likely to be turned into clinically approved drugs. Originally, drug screening was performed using natural products, with only limited knowledge of the molecular mechanisms involved in the development of diseases. Trans-splicing, an unusual RNA processing reaction that occurs in nematodes and trypanosomes, involves the processing of polycistronic transcription units into individual mRNAs; a short spliced leader transcript (SL RNA) is trans-spliced to the acceptor pre-mRNA, giving origin to the mature mRNA. In the present study, permeable cells of T. cruzi epimastigote forms (Y, BOL and NCS strains) were treated to evaluate the interference of two drugs (hydroxymethylnitrofurazone - NFOH-121 and nitrofurazone) in the trans-splicing reaction using silver-stained PAGE analysis. Both drugs induced a significant reduction in RNA processing at concentrations from 5 to 12.5 µM. These data agreed with the biological findings, since the number of parasites decreased, especially with NFOH-121. The proposed methodology provides a rapid and cost-effective screening strategy for detecting drug interference in the trans-splicing mechanism of T. cruzi.

Relevance: 20.00%

Abstract:

Studies have shown that dyslexic children present a deficiency in the temporal processing of auditory stimuli applied in rapid succession. However, discussion continues concerning the way this deficiency can be influenced by temporal variables of auditory processing tests. Therefore, the purpose of the present study was to analyze by auditory temporal processing tests the effect of temporal variables such as interstimulus intervals, stimulus duration and type of task on dyslexic children compared to a control group. Of the 60 children evaluated, 33 were dyslexic (mean age = 10.5 years) and 27 were normal controls (mean age = 10.8 years). Auditory processing tests assess the abilities of discrimination and ordering of stimuli in relation to their duration and frequency. Results showed a significant difference in the average accuracy of control and dyslexic groups considering each variable (interstimulus intervals: 47.9 ± 5.5 vs 37.18 ± 6.0; stimulus duration: 61.4 ± 7.6 vs 50.9 ± 9.0; type of task: 59.9 ± 7.9 vs 46.5 ± 9.0) and the dyslexic group demonstrated significantly lower performance in all situations. Moreover, there was an interactive effect between the group and the duration of stimulus variables for the frequency-pattern tests, with the dyslexic group demonstrating significantly lower results for short durations (53.4 ± 8.2 vs 48.4 ± 11.1), as opposed to no difference in performance for the control group (62.2 ± 7.1 vs 60.6 ± 7.9). These results support the hypothesis that associates dyslexia with auditory temporal processing, identifying the stimulus-duration variable as the only one that unequally influenced the performance of the two groups.