989 results for data flow diagram
Abstract:
Accepted at the 13th IEEE Symposium on Embedded Systems for Real-Time Multimedia (ESTIMedia 2015), Amsterdam, Netherlands.
Abstract:
Short set of slides explaining the workflow from a university website to equipment.data.ac.uk
Abstract:
Abstract interpretation-based data-flow analysis of logic programs is at this point relatively well understood from the point of view of general frameworks and abstract domains. On the other hand, comparatively little attention has been given to the problems which arise when analysis of a full, practical dialect of the Prolog language is attempted, and only a few solutions to these problems have been proposed to date. Such problems relate to dealing correctly with all builtins, including meta-logical and extra-logical predicates, with dynamic predicates (where the program is modified during execution), and with the absence of certain program text during compilation. Existing proposals for dealing with such issues generally restrict in one way or another the classes of programs which can be analyzed if the information from analysis is to be used for program optimization. This paper attempts to fill this gap by considering a full dialect of Prolog, essentially following the recently proposed ISO standard, pointing out the problems that may arise in the analysis of such a dialect, and proposing a combination of known and novel solutions that together allow the correct analysis of arbitrary programs using the full power of the language.
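As a rough illustration of the kind of data-flow analysis meant here, the toy sketch below propagates a simple groundness abstraction through a straight-line clause body. It is an assumption of this edit, not the framework, abstract domains, or builtin handling described in the paper; every transfer rule shown is the simplest possible one (all inputs ground implies all outputs ground).

    # Toy groundness propagation over a straight-line Prolog clause body.
    # Each literal is (predicate, input_vars, output_vars); every literal is
    # treated with the same simple transfer rule: if all inputs are ground,
    # the outputs become ground, otherwise nothing new is learned.
    def analyse(body, ground):
        """body: list of (pred, inputs, outputs); ground: vars known ground on entry."""
        ground = set(ground)
        for pred, inputs, outputs in body:
            if all(v in ground for v in inputs):
                ground |= set(outputs)      # safe over-approximation
        return ground

    # p(X, Z) :- Y is X + 1, Z is Y * 2.   analysed with X ground on entry
    body = [("is/2", ["X"], ["Y"]), ("is/2", ["Y"], ["Z"])]
    print(analyse(body, {"X"}))             # {'X', 'Y', 'Z'}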
Abstract:
A complete workflow specification requires careful integration of many different process characteristics. Decisions must be made as to the definitions of individual activities, their scope, the order of execution that maintains the overall business process logic, the rules governing the discipline of work list scheduling to performers, the identification of time constraints, and more. The goal of this paper is to address an important issue in workflow modelling and specification: data flow, including its modelling, specification and validation. Researchers have neglected this dimension of process analysis for some time, mainly focussing on structural considerations with limited verification checks. In this paper, we identify and justify the importance of data modelling in overall workflow specification and verification. We illustrate and define several potential data flow problems that, if not detected prior to workflow deployment, may prevent the process from executing correctly, cause it to execute on inconsistent data, or even lead to process suspension. A discussion of the essential requirements that a workflow data model must satisfy in order to support data validation is also given.
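To make the notion of a data flow problem concrete, the sketch below checks a sequential workflow for items that are read before any activity produces them, and for items that are produced but never consumed. It is illustrative only; the activity names, data items and anomaly categories are assumptions of this edit, not the paper's taxonomy.

    # Detect two simple data-flow anomalies in a sequential workflow,
    # given each activity's read set and write set.
    def check_data_flow(activities):
        """activities: list of (name, reads, writes) in execution order."""
        written = set()
        missing = []
        for name, reads, writes in activities:
            for item in reads:
                if item not in written:
                    missing.append((name, item))   # read before any upstream write
            written |= set(writes)
        all_reads = set()
        for _, reads, _ in activities:
            all_reads |= set(reads)
        redundant = written - all_reads            # produced but never consumed
        return missing, redundant

    workflow = [
        ("ReceiveOrder", [],           ["order"]),
        ("CheckCredit",  ["customer"], ["credit_ok"]),    # 'customer' is never produced
        ("ShipGoods",    ["order"],    ["tracking_no"]),  # 'tracking_no' is never used
    ]
    print(check_data_flow(workflow))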
Abstract:
* The research is supported in part by the INTAS project 04-77-7173, http://www.intas.be
Abstract:
Computers employing some degree of data flow organisation are now well established as providing a possible vehicle for concurrent computation. Although data-driven computation frees the architecture from the constraints of the single program counter, processor and global memory, inherent in the classic von Neumann computer, there can still be problems with the unconstrained generation of fresh result tokens if a pure data flow approach is adopted. The advantages of allowing serial processing for those parts of a program which are inherently serial, and of permitting a demand-driven, as well as data-driven, mode of operation are identified and described. The MUSE machine described here is a structured architecture supporting both serial and parallel processing which allows the abstract structure of a program to be mapped onto the machine in a logical way.
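The contrast between data-driven and demand-driven operation can be sketched in a few lines. This is a toy illustration, not the MUSE architecture itself: a data-driven node fires as soon as all of its input tokens have arrived, whereas a demand-driven value is computed only when something asks for it.

    # Data-driven firing: a node executes as soon as every input slot holds a token.
    class Node:
        def __init__(self, op, n_inputs):
            self.op = op
            self.slots = [None] * n_inputs
        def receive(self, i, token):
            self.slots[i] = token
            if all(s is not None for s in self.slots):   # node enabled -> fire
                return self.op(*self.slots)

    add = Node(lambda a, b: a + b, 2)
    add.receive(0, 3)
    print(add.receive(1, 4))     # 7, produced the moment both tokens have arrived

    # Demand-driven evaluation: a thunk whose value is computed only when forced.
    thunk = lambda: 3 + 4
    print(thunk())               # 7, computed only at the point of demand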
Abstract:
The aim of this thesis was to create a clear description of how a newspaper's subscription process operates, from order to delivery, to identify the problem areas related to the subscription process, and to present preliminary improvement and development proposals for resolving them. The theoretical basis for the structure of the description is the structured analysis (SA) method presented at the beginning of the thesis, which was applied to produce the actual description, based mainly on information obtained from personal interviews. A newspaper's subscription process is a very broad, multi-stage process, and one purpose of the thesis was therefore to clarify to the staff what actually happens inside the process. While preparing the description, several problem areas and development targets related to the subscription process were identified, and the thesis presents preliminary solution alternatives for addressing them. The motivation for commissioning the thesis was the desire to develop the subscription process from order to delivery in order to improve the quality of the current process. As a recommended course of action, it is proposed that the improvement and development suggestions reviewed in the study, as well as any new ones that may emerge from it, be implemented in the near future. This will help ensure that the subscription process continues to function and that the growing needs of customers can also be met in the future.
Abstract:
Reconfigurable platforms are a promising technology that offers an interesting trade-off between flexibility and performance, which many recent embedded system applications demand, especially in fields such as multimedia processing. These applications typically involve multiple ad-hoc tasks for hardware acceleration, which are usually represented using formalisms such as Data Flow Diagrams (DFDs), Data Flow Graphs (DFGs), Control and Data Flow Graphs (CDFGs) or Petri Nets. However, none of these models is able to capture at the same time the pipeline behavior between tasks (which can therefore coexist in order to minimize the application execution time), their communication patterns, and their data dependencies. This paper shows that knowledge of all this information can be effectively exploited to reduce the resource requirements and improve the timing performance of modern reconfigurable systems, where a set of hardware accelerators is used to support the computation. For this purpose, this paper proposes a novel task representation model, named Temporal Constrained Data Flow Diagram (TCDFD), which includes all this information. This paper also presents a mapping-scheduling algorithm that is able to take advantage of the new TCDFD model. It aims at minimizing the dynamic reconfiguration overhead while meeting the communication requirements among the tasks. Experimental results show that the presented approach achieves up to 75% resource savings and up to an 89% reduction in reconfiguration overhead with respect to other state-of-the-art techniques for reconfigurable platforms.
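As a hedged sketch of why such knowledge matters, the code below greedily assigns tasks to reconfigurable regions and skips a reconfiguration whenever a region already holds the required accelerator. This is a generic scheduler invented for illustration, not the TCDFD mapping-scheduling algorithm of the paper, and all task names and costs are made up.

    RECONF_COST = 5   # invented: time units to load a new accelerator into a region

    def schedule(tasks, n_regions):
        """tasks: list of (name, accelerator, exec_time) in precedence order."""
        regions = [{"acc": None, "free_at": 0} for _ in range(n_regions)]
        total_reconf = 0
        for name, acc, exec_time in tasks:
            # prefer a region that is already configured with this accelerator
            candidates = [r for r in regions if r["acc"] == acc] or regions
            region = min(candidates, key=lambda r: r["free_at"])
            start = region["free_at"]
            if region["acc"] != acc:                 # reconfiguration needed
                start += RECONF_COST
                total_reconf += RECONF_COST
                region["acc"] = acc
            region["free_at"] = start + exec_time
        return total_reconf

    tasks = [("t0", "fft", 10), ("t1", "fir", 8), ("t2", "fft", 10)]
    print("total reconfiguration time:", schedule(tasks, n_regions=2))  # 10: t2 reuses t0's region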
Abstract:
Observational longitudinal research is particularly useful for assessing etiology and prognosis and for providing evidence for clinical decision making. However, there are no structured reporting requirements for studies of this design to assist authors, editors, and readers. The authors developed and tested a checklist of criteria related to threats to the internal and external validity of observational longitudinal studies. The checklist criteria concerned recruitment, data collection, biases, and data analysis and descriptive issues relevant to study rationale, study population, and generalizability. Two raters independently assessed 49 randomly selected articles describing stroke research published from 1999 to 2003 in six journals: American Journal of Epidemiology, Journal of Epidemiology and Community Health, Stroke, Annals of Neurology, Archives of Physical Medicine and Rehabilitation, and American Journal of Physical Medicine and Rehabilitation. On average, 17 of the 33 checklist criteria were reported. Criteria describing the study design were better reported than those related to internal validity. No relation was found between study type (etiologic or prognostic) or word count and quality of reporting. A flow diagram for summarizing participant flow through a study was developed. Editors and authors should consider using a checklist and flow diagram when reporting on observational longitudinal research.
Abstract:
A finite-element method is used to study the elastic properties of random three-dimensional porous materials with highly interconnected pores. We show that Young's modulus, E, is practically independent of the Poisson's ratio of the solid phase, ν_s, over the entire solid fraction range, and Poisson's ratio, ν, becomes independent of ν_s as the percolation threshold is approached. We represent this behaviour of ν in a flow diagram. This interesting but approximate behaviour is very similar to the exactly known behaviour in two-dimensional porous materials. In addition, the behaviour of ν versus ν_s appears to imply that information in the dilute porosity limit can affect behaviour in the percolation threshold limit. We summarize the finite-element results in terms of simple structure-property relations, instead of tables of data, to make it easier to apply the computational results. Without using accurate numerical computations, one is limited to various effective medium theories and rigorous approximations like bounds and expansions. The accuracy of these equations is unknown for general porous media. To verify a particular theory it is important to check that it predicts both isotropic elastic moduli, i.e. prediction of Young's modulus alone is necessary but not sufficient. The subtleties of Poisson's ratio behaviour actually provide a very effective method for showing differences between the theories and demonstrating their ranges of validity. We find that for moderate- to high-porosity materials, none of the analytical theories is accurate and, at present, numerical techniques must be relied upon.
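The remark that predicting Young's modulus alone is necessary but not sufficient reflects the standard fact that an isotropic linear-elastic material has two independent moduli; the usual textbook relations, added here for reference and not taken from the paper's results, read:

    E = \frac{9 K G}{3K + G}, \qquad
    \nu = \frac{3K - 2G}{2(3K + G)}, \qquad
    G = \frac{E}{2(1 + \nu)}, \qquad
    K = \frac{E}{3(1 - 2\nu)}

So a theory that reproduces E but not ν (or vice versa) cannot reproduce the full elastic response of the porous material.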
Abstract:
In this master's thesis, a Balas simulation model was built of the thermomechanical pulping (TMP) plant at the Varkaus mills of Stora Enso Publication Papers Oy Ltd. The mill's heat, fibre and circulation water flows, as well as the flows of dissolved and colloidal substances in the circulation waters, were fitted to the model. The literature part examines how a simulator model is created and what it is used for. It discusses the features of the Balas simulator and the parameterization of process equipment, reviews the different stages and conditions of the TMP process and the operation of the equipment, and covers the energy consumption and heat recovery of the process as well as the measured quantities of dissolved and colloidal substances in the circulation waters and their effects on the process. Building the simulation model involved drawing the flow diagram of the mill and constructing the simulator model, planning the measurement campaign, carrying out the mill measurements, parameterizing the simulator, reconciling the measured and simulated results, and validating the model. The mill's process and instrumentation diagrams were used as the basis for drawing the flow diagram, from which the measurement plan and the simulator model were then prepared. The quantities selected for the measurement plan were flows, consistencies and temperatures, together with determinations of the suspended solids, total organic carbon and dissolved organic matter concentrations of the circulation waters. The measurements provided information on the balances of the process flows, which was used to determine the most important equipment parameter values of the simulator model. Excel was used for tabulating the measured and simulated results, for calculating the equipment parameters, and for passing values back and forth between Excel and the simulator. It was also used for graphically reconciling the measured and simulated results, which made it easy to see how changes in the various input parameters affected the flow results calculated by the simulator. Based on the reconciliation of the model's outputs with the actual process measurements and on the validation of the model, it can be concluded that the model correlates with the actual process fairly well.
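The kind of flow balance referred to here can be illustrated with a minimal mixing-point example; this is generic chemical-engineering arithmetic with invented figures, not data from the Varkaus mill.

    # Mass balance over a mixing point: two streams combine into one.
    # Flows in t/h, consistency as mass fraction of fibre; numbers invented.
    flow_1, cons_1 = 120.0, 0.040    # hypothetical stream 1
    flow_2, cons_2 = 30.0,  0.010    # hypothetical stream 2

    flow_out = flow_1 + flow_2                                  # total flow balance
    cons_out = (flow_1 * cons_1 + flow_2 * cons_2) / flow_out   # fibre balance

    print(f"combined flow = {flow_out} t/h, consistency = {cons_out:.4f}")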
Abstract:
The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The constant search for improvement and for the survival of organizations requires that strategic guidelines be deployed and executed at the operational levels. This work presents a critical analysis of the equipment of a chemical plant through a case study in which each piece of equipment is classified by qualitative and quantitative analysis against the pillars of maintenance cost, production loss, MTBF, contribution margin, and Health, Safety and Environment (SHE). From this study and future data collection, together with the flow diagram, the main equipment that requires special attention can be identified, and an action plan with deadlines and responsible parties can then be prepared for it. With the results, the maintenance costs, production losses and technical availability of the plant can be measured, with gains expected in the future.
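For reference, the standard definitions behind two of the ranking criteria mentioned, MTBF and technical availability, are sketched below. The figures are invented for illustration; none of the numbers come from the case study.

    # Standard MTBF / availability arithmetic with hypothetical figures.
    operating_hours = 8000        # hypothetical observation period
    failures        = 10          # hypothetical failure count
    repair_hours    = 40          # hypothetical total repair downtime

    mtbf = operating_hours / failures        # mean time between failures
    mttr = repair_hours / failures           # mean time to repair
    availability = mtbf / (mtbf + mttr)      # (inherent) technical availability

    print(f"MTBF = {mtbf:.0f} h, MTTR = {mttr:.0f} h, availability = {availability:.3%}")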