985 results for Video Processing


Relevance: 20.00%

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 20.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 20.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 20.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 20.00%

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes, and thereby also the parallelism, explicit. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs.

Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field; digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it describes how the video data flows through the different coding tools.

While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language and in the general case requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques.

The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an as small as possible set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect the scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.

The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
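
To make the firing model concrete, the sketch below shows, in Python, a toy version of the execution model described above: actors that communicate only through FIFO queues and fire once enough tokens are available, driven by a naive dynamic scheduler. The actor and scheduler here are illustrative assumptions for this listing only; they are not the RVC-CAL language or the compiler infrastructure developed in the thesis, where most of these run-time decisions would be replaced by precomputed quasi-static schedules.

```python
from collections import deque

class Actor:
    """A dataflow node: it may fire only when enough tokens are queued on its inputs."""
    def __init__(self, name, inputs, outputs, consume, fire):
        self.name = name          # for debugging/tracing
        self.inputs = inputs      # list of input FIFOs (deques)
        self.outputs = outputs    # list of output FIFOs (deques)
        self.consume = consume    # tokens required on each input per firing
        self.fire = fire          # firing function: consumed tokens -> one token per output

    def can_fire(self):
        return all(len(q) >= n for q, n in zip(self.inputs, self.consume))

    def step(self):
        tokens = [[q.popleft() for _ in range(n)] for q, n in zip(self.inputs, self.consume)]
        for q, out in zip(self.outputs, self.fire(tokens)):
            q.append(out)

def run(actors, max_firings=1000):
    """Naive dynamic scheduler: keep firing any actor whose firing rule is satisfied.
    A quasi-static scheduler would replace this loop with precomputed firing sequences
    and only a few remaining run-time decisions."""
    for _ in range(max_firings):
        ready = [a for a in actors if a.can_fire()]
        if not ready:
            break
        for a in ready:
            a.step()

# Toy network: a queue of samples flows through a single "scale" actor.
src_q, out_q = deque([1, 2, 3, 4]), deque()
scale = Actor("scale", [src_q], [out_q], consume=[1],
              fire=lambda toks: [toks[0][0] * 2])  # double each sample
run([scale])
print(list(out_q))  # [2, 4, 6, 8]
```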

Relevance: 20.00%

Abstract:

The nucleus tractus solitarii (NTS) receives afferent projections from the arterial baroreceptors, carotid chemoreceptors and cardiopulmonary receptors and as a function of this information produces autonomic adjustments in order to maintain arterial blood pressure within a narrow range of variation. The activation of each of these cardiovascular afferents produces a specific autonomic response by the excitation of neuronal projections from the NTS to the ventrolateral areas of the medulla (nucleus ambiguus, caudal and rostral ventrolateral medulla). The neurotransmitters at the NTS level as well as the excitatory amino acid (EAA) receptors involved in the processing of the autonomic responses in the NTS, although extensively studied, remain to be completely elucidated. In the present review we discuss the role of the EAA L-glutamate and its different receptor subtypes in the processing of the cardiovascular reflexes in the NTS. The data presented in this review related to the neurotransmission in the NTS are based on experimental evidence obtained in our laboratory in unanesthetized rats. The two major conclusions of the present review are that a) the excitation of the cardiovagal component by cardiovascular reflex activation (chemo- and Bezold-Jarisch reflexes) or by L-glutamate microinjection into the NTS is mediated by N-methyl-D-aspartate (NMDA) receptors, and b) the sympatho-excitatory component of the chemoreflex and the pressor response to L-glutamate microinjected into the NTS are not affected by an NMDA receptor antagonist, suggesting that the sympatho-excitatory component of these responses is mediated by non-NMDA receptors.

Relevance: 20.00%

Abstract:

The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data grows rapidly, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments.

First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make the original incomplete data complete, making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation.

Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA).

Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as gene microarray studies. Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within a regular force-directed graph layout algorithm.
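
As a rough illustration of the k-NN imputation mentioned above, the following Python sketch fills each missing entry with the mean of that column over the k most similar rows. The function name, distance measure and toy matrix are assumptions made for this example; the sketch does not reproduce the thesis implementation or its use of external biological information.

```python
import numpy as np

def knn_impute(data, k=5):
    """Minimal k-NN imputation sketch: each missing entry (NaN) in a row is replaced
    by the mean of that column over the k most similar rows, where similarity is the
    Euclidean distance computed on the columns both rows have observed."""
    data = np.asarray(data, dtype=float).copy()
    for i in np.where(np.isnan(data).any(axis=1))[0]:
        observed = ~np.isnan(data[i])
        dists = []
        for j in range(len(data)):
            if j == i:
                continue
            common = observed & ~np.isnan(data[j])
            if common.any():
                d = np.sqrt(np.mean((data[i, common] - data[j, common]) ** 2))
                dists.append((d, j))
        neighbours = [j for _, j in sorted(dists)[:k]]
        for col in np.where(~observed)[0]:
            vals = [data[j, col] for j in neighbours if not np.isnan(data[j, col])]
            if vals:
                data[i, col] = np.mean(vals)
    return data

# Toy expression matrix (genes x samples) with one missing entry
expr = [[1.0, 2.0, 3.0],
        [1.1, np.nan, 3.2],
        [0.9, 2.1, 2.8],
        [5.0, 6.0, 7.0]]
print(knn_impute(expr, k=2))
```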

Relevance: 20.00%

Abstract:

This study was designed to evaluate the effect of different conditions of collection, transport and storage on the quality of blood samples from normal individuals in terms of the activity of the enzymes ß-glucuronidase, total hexosaminidase, hexosaminidase A, arylsulfatase A and ß-galactosidase. The enzyme activities were not affected by the different materials used for collection (plastic syringes or vacuum glass tubes). In the evaluation of different heparin concentrations (10% heparin, 5% heparin, and heparinized syringe) in the syringes, it was observed that higher doses resulted in an increase of at least 1-fold in the activities of ß-galactosidase, total hexosaminidase and hexosaminidase A in leukocytes, and ß-glucuronidase in plasma. When the effects of time and means of transportation were studied, samples that had been kept at room temperature showed greater deterioration with time (72 and 96 h) before processing, and in this case it was impossible to isolate leukocytes from most samples. Comparison of heparin and acid citrate-dextrose (ACD) as anticoagulants revealed that ß-glucuronidase and hexosaminidase activities in plasma reached levels near the lower normal limits when ACD was used. In conclusion, we observed that heparin should be used as the preferred anticoagulant when measuring these lysosomal enzyme activities, and we recommend that, when transport time is more than 24 h, samples should be shipped by air in a styrofoam box containing wet ice.

Relevance: 20.00%

Abstract:

The thesis studies the role of video-based content marketing as part of modern marketing communications.

Relevance: 20.00%

Abstract:

The aim of this master’s thesis is to research and analyze how purchase invoice processing can be automated and streamlined in a system renewal project. The impacts of workflow automation on invoice handling are studied in terms of time, cost and quality. Purchase invoice processing has a lot of potential for automation because of its labor-intensive and repetitive nature. As a case study combining both qualitative and quantitative methods, the topic is approached from a business process management point of view.

The current process was first explored through interviews and workshop meetings to create a holistic understanding of the process at hand. Requirements for process streamlining were then researched, focusing on specified vendors and their purchase invoices, which helped to identify the critical factors for successful invoice automation. To optimize the flow from invoice receipt to approval for payment, the invoice receiving process was outsourced and the automation functionalities of the new system were utilized in invoice handling. The quality of invoice data and the need for simple, structured purchase order (PO) invoices were emphasized in the system testing phase. Hence, consolidated invoices containing references to multiple PO or blanket release numbers should be simplified in order to use automated PO matching. With non-PO invoices, it is important to receive the buyer reference details in an applicable invoice data field so that automation rules can be created to route invoices to a review and approval flow.

In the beginning of the project, invoice processing was seen as ineffective both time- and cost-wise, and it required a lot of manual labor to carry out all tasks. In accordance with the testing results, it was estimated that over half of the invoices could be automated within a year after system implementation. Processing times could be reduced remarkably, which would then result in savings of up to 40% in annual processing costs. Owing to several advancements in the purchase invoice process, business process quality could also be perceived as improved.
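
As a hedged illustration of what such automation rules might look like, the Python sketch below matches PO invoices against open purchase orders within a tolerance and routes non-PO invoices by their buyer reference. All names, tolerances and data structures are invented for this example and do not describe the case company's actual system or the new invoice handling software.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Invoice:
    vendor: str
    amount: float
    po_number: Optional[str] = None        # purchase order reference, if any
    buyer_reference: Optional[str] = None  # e.g. a cost centre for non-PO invoices

# Invented reference data: open purchase orders and reviewers per buyer reference
open_purchase_orders = {"PO-1001": 1200.00, "PO-1002": 350.00}
reviewers_by_reference = {"cost-center-42": "jane.doe", "cost-center-17": "john.smith"}

def route_invoice(inv: Invoice) -> str:
    """Minimal routing rule: PO invoices that match an open order within a tolerance are
    approved automatically; everything else is routed to a review and approval flow."""
    if inv.po_number and inv.po_number in open_purchase_orders:
        expected = open_purchase_orders[inv.po_number]
        if abs(inv.amount - expected) <= 0.05 * expected:  # 5% matching tolerance
            return "auto-approve"
        return "manual review: amount does not match the purchase order"
    if inv.buyer_reference in reviewers_by_reference:
        return f"route to reviewer {reviewers_by_reference[inv.buyer_reference]}"
    return "manual review: missing PO number and buyer reference"

print(route_invoice(Invoice("Acme Oy", 1190.00, po_number="PO-1001")))           # auto-approve
print(route_invoice(Invoice("Acme Oy", 80.00, buyer_reference="cost-center-42")))
```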

Relevance: 20.00%

Abstract:

We studied the action of high pressure processing on the inactivation of two foodborne pathogens, Staphylococcus aureus ATCC 6538 and Salmonella enteritidis ATCC 13076, suspended in a culture medium and inoculated into caviar samples. The baroresistance of the two pathogens in a tryptic soy broth suspension at a concentration of 10^8-10^9 colony-forming units/ml was tested for continuous and cycled pressurization in the 150- to 550-MPa range and for 15-min treatments at room temperature. Increasing the number of cycles reduced the pressure level needed to completely inactivate both microorganisms in the tryptic soy broth suspension, whereas the effect of different procedure times on the complete inactivation of the microorganisms inoculated into caviar was similar.

Relevance: 20.00%

Abstract:

This phenomenological study describes how patients admitted for video-EEG (VEEG) monitoring experience their seizures. The study design applies Giorgi's method of phenomenological psychology, adapted to nursing science research. The purpose of the study was to describe the seizure experiences of patients referred for VEEG examination because of neurological seizure symptoms, and to identify and describe the factors related to those experiences. The aim was to increase healthcare personnel's understanding of the counselling needs of people with neurological seizure symptoms. The material was collected from eight patients through open interviews and analyzed with Giorgi's analysis method. A clinical neurophysiologist's statement was combined with the data, and experience narratives were constructed. Using phenomenological reduction, the central experiences related to the seizures and the illness were identified from the data. The relationships between the concepts and their significance for adaptation were analyzed with the help of the Uncertainty in Illness model. Based on the central experiences, a literature search was carried out and its results were reflected against the findings of this study. Three separate experience narratives emerged from the data: an account of the concrete events, the experience of losing control, and the experience of living with the illness. The central experience contents identified were the experience of managing the health problem, the experience of losing control, the experience of negative attitudes from the surroundings, and concern for loved ones. Previous research was found on experiences of managing a health problem, loss of control, and the attitudes of others.

Relevance: 20.00%

Abstract:

Several methods have been described to measure intraocular pressure (IOP) in clinical and research settings. However, the measurement of time-varying IOP with high accuracy, particularly under conditions that alter corneal properties, has not been reported until now. The present report describes a computerized system capable of recording the transitory variability of IOP which is sufficiently sensitive to reliably measure ocular pulse peak-to-peak values. We also describe its characteristics and discuss its applicability to research and clinical studies. The device consists of a pressure transducer, a signal conditioning unit and an analog-to-digital converter coupled to a video acquisition board. A modified Cairns trabeculectomy was performed in 9 Oryctolagus cuniculus rabbits to obtain changes in IOP decay parameters and to evaluate the utility and sensitivity of the recording system. The device was effective for the study of kinetic parameters of IOP, such as the decay pattern and ocular pulse waves due to cardiac and respiratory rhythms. In addition, there was a significant increase in the derivative of the IOP-versus-time curve when pre- and post-trabeculectomy recordings were compared. The present procedure excludes the influence of corneal thickness and errors related to individual operator ability. Clinical complications due to saline infusion and pressure overload were not observed during biomicroscopic evaluation. Among the disadvantages of the procedure are the requirement of anesthesia and its suitability for acute recordings rather than chronic protocols. Finally, the method described may provide a reliable alternative for the study of dynamic alterations of ocular pressure in man and may facilitate the investigation of the pathogenesis of glaucoma.
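
For illustration only, the following Python sketch shows one way a digitized IOP trace could be post-processed to obtain the ocular pulse peak-to-peak amplitude and the slope of the pressure decay. The sampling rate, window length and synthetic trace are assumptions for this example and are not part of the recording system described in the report.

```python
import numpy as np

def analyze_iop(trace, fs=100.0):
    """Sketch of post-processing a digitized IOP trace (mmHg) sampled at fs Hz:
    estimate the ocular pulse peak-to-peak amplitude in one-second windows and
    the slope of the slow pressure decay from a linear fit of the whole trace."""
    trace = np.asarray(trace, dtype=float)
    window = int(fs)  # one-second windows
    n_windows = len(trace) // window
    peak_to_peak = [np.ptp(trace[i * window:(i + 1) * window]) for i in range(n_windows)]
    t = np.arange(len(trace)) / fs
    decay_slope = np.polyfit(t, trace, 1)[0]  # mmHg per second
    return float(np.mean(peak_to_peak)), float(decay_slope)

# Synthetic 30-second trace: slow exponential decay plus a small pulse component
fs = 100.0
t = np.arange(0, 30, 1 / fs)
trace = 25 * np.exp(-t / 60) + 1.5 * np.sin(2 * np.pi * 3 * t)  # assumed ~3 Hz pulse rate
amp, slope = analyze_iop(trace, fs)
print(f"mean pulse peak-to-peak: {amp:.2f} mmHg, decay slope: {slope:.3f} mmHg/s")
```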