76 results for TRANSFORM


Relevance:

10.00%

Publisher:

Abstract:

The device presented in this work is part of a DC-AC switched-mode converter that produces galvanically isolated single-phase (230 VRMS, 50 Hz) mains voltage from a 750 V DC voltage. The DC voltage is converted by a resonant converter into a high-frequency AC voltage, which is fed to an isolation transformer. After the galvanic isolation, the high-frequency AC voltage is converted directly into mains-frequency AC voltage by the cycloconverter presented in this work. The design aims to minimise losses as far as possible, so that the device can compete with non-isolating converters. This has been achieved by using as few components as possible in the current path and by applying soft switching in all situations. Finally, a prototype is presented, whose purpose was to verify the operation of the device and to reveal its problem areas in practice.
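The basic conversion arithmetic implied by the abstract can be sketched as follows; the 750 V DC link and the 230 VRMS / 50 Hz target come from the abstract, while the 10 % design margin and the resulting turns ratio are illustrative assumptions, not figures from the thesis:

```python
import math

# Figures from the abstract: 750 V DC link, 230 V RMS / 50 Hz isolated output.
V_DC = 750.0
V_RMS_OUT = 230.0
F_OUT = 50.0

# Peak of the desired mains-frequency sine.
v_peak_out = V_RMS_OUT * math.sqrt(2)            # ~325 V

# Illustrative (assumed) transformer turns ratio chosen so the secondary
# peak covers the required output peak with a ~10 % margin.
margin = 1.10
turns_ratio = V_DC / (v_peak_out * margin)

# The cycloconverter then shapes the output to follow a 50 Hz reference.
def reference(t):
    return v_peak_out * math.sin(2 * math.pi * F_OUT * t)

print(round(v_peak_out, 1), round(turns_ratio, 2))
```

Under these assumed numbers, a turns ratio of roughly 2:1 would leave headroom for the resonant stage; the actual design values in the thesis may differ.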

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study is to explore longing and its implications for health. The overall purpose is to develop a theory model of longing. The research question is: what is the substance of longing from a caring science perspective? The model is developed on the basis of theoretical and empirical studies comprising three different research materials: hermeneutic readings of texts by Augustine and Kierkegaard, and interpretations of research interviews with nine women in a cancer context. The design of the study is explorative, and Gadamer's ontological hermeneutics is chosen as the guide for understanding. The study is positioned within systematic caring science, which through basic research generates knowledge about human desire as crucial for the deeper health processes. The contextual study provides a link to clinical caring science. In the ontology of systematic caring science, longing touches two different aspects. Longing is rooted in the inner source of love of the human ethos, where an inscrutable depth exists that contains the reality beyond the visible. Further, longing is essential for the human being's becoming in health and suffering, through holiness as a unity of body, soul and spirit. The results of the study are presented in a theory model. Through abduction, the model has provided a new and deeper understanding of the dimensions of longing related to health. On a general level, the forces of longing unfold in two perspectives: suffering and the foundation of love. A relationship between the human being and the source of love appears in all three materials. When human beings open their lives to a larger perspective, resting in love, they can manage to stand in the thrill and acknowledge loss and emptiness. In the transparency of an inner dialogue in which despair unfolds, deeper longing can open up so that life is released from the source of love.

The holiness of human desire has such appeal because the holiness of the source of love is always more than the suffering and the particular. The holiness in longing seems to satisfy the heart's deepest searching. The direction of longing runs between the human being and the source of love. The study reveals how longing is associated with the source of love, where the holiness of longing seems to draw the human being and thereby answer the seeking of the heart. The dynamic forces are directed from human suffering at the foundation, and a release of power is given back to transform, deepen and reconcile life and suffering. The movements of the power released by longing are keys to understanding human suffering in relation to the source of love and becoming in health. The results of this study contribute to deepening the ontological core of caring science. Firstly, human beings in their longing are connected to their inner ethos, and thereby to what is most sacred and absolute in themselves, so that part of the potential of love can be released towards health. Secondly, longing is the road of reconciliation and can expand further into authentic reconciliation, in which the human being becomes towards unity and holiness. Thirdly, spirituality unfolds through longing and the transcendental is received. In longing, human beings are in touch with the mystery; longing exceeds the present, moves towards eternity and infinity, and dwells in what is yet to come. Such deep experiences of moments of longing leave an impression and show longing fulfilled.

Relevance:

10.00%

Publisher:

Abstract:

The requirements set by the market for electrical machines are becoming increasingly demanding, calling for more sophisticated technological solutions. Companies producing electrical machines are challenged to develop machines that provide a competitive edge for the customer, for example through increased efficiency, reliability or some customer-specific special requirement. The objective of this thesis is to derive a proposal for the first steps in transforming the electrical machine product development process of a manufacturing company towards lean product development. The current product development process in the company is presented together with the processes of four other companies interviewed for the thesis. On the basis of the current processes of the electrical machine industry and the related literature, a generalised electrical machine product development process is derived. The management philosophies and tools utilised by the companies are analysed. The adoption of lean pull-event reviews, Oobeya management and knowledge-based product development is suggested as the initial step in implementing the lean product development paradigm in the manufacturing company. Proposals are made for refining the current product development process and increasing stakeholder involvement in the development projects. Lean product development is finding its way into the Finnish electrical machine industry, but the results will be available only after the methods have been implemented and adopted by the companies. There is some enthusiasm about the benefits of the lean approach, and if executed successfully it will provide a competitive edge for the Finnish electrical machine industry.

Relevance:

10.00%

Publisher:

Abstract:

This thesis discusses the opportunities and challenges of cloud computing technology in healthcare information systems by reviewing the existing literature on cloud computing and healthcare information systems, and the impact of cloud computing technology on the healthcare industry. The review shows that if the problems related to data security are solved, cloud computing will positively transform healthcare institutions by benefiting the healthcare IT infrastructure as well as improving healthcare services. This thesis therefore explores the opportunities and challenges associated with cloud computing in the context of Finland, in order to help healthcare organisations and stakeholders determine their direction when deciding to adopt cloud technology in their information systems.

Relevance:

10.00%

Publisher:

Abstract:

Because of the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management, however, is not only about managing technology but also about managing processes and people. This thesis focuses on traditional data management and on the performance management of production processes, both of which can be seen as prerequisites for long-lasting development. Some operative BI solutions are also considered as part of the ideal state of the reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and how these affect their efficiency. The research is executed as a theoretical literature review of the subjects and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through theoretical frameworks and by active participant observation. To obtain a better picture of the ideal state of the reporting system, simple investment calculations are performed. According to the results of the research, the requirements for effective performance management of production processes are automated data collection, integration of operative databases, the use of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and efficient management of processes, data and roles.
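The ETL (Extract, Transform, Load) pattern named among the results can be sketched minimally as follows; the batch records and the yield measure below are invented for illustration and do not come from the Finnsugar case study:

```python
# Minimal ETL sketch: extract rows from an operative source, derive a
# production performance measure, and load the result into a warehouse.

def extract():
    # Stand-in for reads from operative databases (invented data).
    return [
        {"batch": "A1", "input_kg": 1000, "output_kg": 940},
        {"batch": "A2", "input_kg": 1000, "output_kg": 910},
    ]

def transform(rows):
    # Derive a performance measure (yield percentage) per batch.
    return [
        {"batch": r["batch"],
         "yield_pct": 100.0 * r["output_kg"] / r["input_kg"]}
        for r in rows
    ]

def load(rows, warehouse):
    # Stand-in for a data warehouse (DW) load step.
    warehouse.extend(rows)

dw = []
load(transform(extract()), dw)
print(dw[0]["yield_pct"], dw[1]["yield_pct"])  # 94.0 91.0
```

In a real deployment each step would run against the operative databases and the DW automatically, which is exactly the automation requirement the abstract identifies.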

Relevance:

10.00%

Publisher:

Abstract:

In this study, cantilever-enhanced photoacoustic spectroscopy (CEPAS) was applied to different drug detection schemes. The study was divided into two applications: trace detection of vaporised drugs and drug precursors in the gas phase, and detection of cocaine abuse in hair. The main focus, however, was the study of hair samples. In the gas phase, methyl benzoate, a hydrolysis product of cocaine hydrochloride, and benzyl methyl ketone (BMK), a precursor of amphetamine and methamphetamine, were investigated. In the solid phase, hair samples from cocaine overdose patients were measured and compared to a drug-free reference group. As hair consists mostly of long fibrous proteins collectively called keratin, proteins from fingernails and saliva were also studied for comparison. Different measurement setups were applied in this study. The gas measurements were carried out using quantum cascade lasers (QCLs) as the source in the photoacoustic detection. An external-cavity (EC) design was also used for a broader tuning range. Detection limits of 3.4 parts per billion (ppb) for methyl benzoate and 26 ppb for BMK in 0.9 s were achieved with the EC-QCL PAS setup. These detection limits are sufficient for realistic drug detection applications. The measurements on drug overdose patients were carried out using Fourier transform infrared (FTIR) PAS. The drug-containing and drug-free hair samples were both measured with the FTIR-PAS setup, and the measured spectra were analysed statistically with principal component analysis (PCA). With PCA and proper spectral pre-processing, the two groups could be separated by their spectra. To improve the method, EC-QCL measurements of the hair samples and studies using photoacoustic microsampling techniques were performed. High-quality, high-resolution spectra with a broad tuning range were recorded from a single hair fibre. This broad tuning range of an EC-QCL had not previously been used in the photoacoustic spectroscopy of solids. However, no drug detection studies were performed with the EC-QCL solid-phase setup.
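The PCA-based group separation described above can be illustrated with a minimal sketch; the synthetic "spectra" below are stand-ins (a shared baseline plus an extra absorption band in one group) and do not represent the measured FTIR-PAS data:

```python
import numpy as np

# Two groups of synthetic spectra: a drug-free stand-in (baseline + noise)
# and a stand-in with an extra Gaussian absorption band.
rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, 6, 50))
group_a = base + 0.02 * rng.standard_normal((10, 50))           # "drug-free"
band = 0.3 * np.exp(-np.linspace(-2, 2, 50) ** 2)               # extra band
group_b = base + band + 0.02 * rng.standard_normal((10, 50))    # "drug-containing"

X = np.vstack([group_a, group_b])
Xc = X - X.mean(axis=0)            # mean-centring as spectral pre-processing
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[0]                # projection onto the first principal component

# The two groups land on opposite sides of PC1.
print(scores[:10].mean() * scores[10:].mean() < 0)  # True
```

Because the between-group band dominates the within-group noise, PC1 essentially aligns with the band, which is the mechanism by which PCA separated the two hair-sample groups.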

Relevance:

10.00%

Publisher:

Abstract:

Global warming is arguably the greatest environmental challenge facing humanity in the 21st century. It is primarily caused by anthropogenic greenhouse gases (GHGs) that trap heat in the atmosphere. GHG emission mitigation is therefore a critical issue on the political agenda of all high-profile nations. India, like other developing countries, is facing this threat of climate change while dealing with the challenge of sustaining its rapid economic growth. India's economy is closely connected to its natural resource base and to climate-sensitive sectors such as water, agriculture and forestry. Due to climate change, the quality and distribution of India's natural resources may change, with adverse effects on the livelihoods of its people. India is therefore expected to face a major threat from the projected climate change. This study proposes possible solutions for GHG emission mitigation specific to the Indian power sector. The methods discussed here would take the Indian power sector from its present coal-dominant approach to a system centred on renewable energy sources. The study further proposes a future scenario for 2050, based on present Indian government policies and advancements in global energy technologies.

Relevance:

10.00%

Publisher:

Abstract:

An interferometer for a low-resolution portable Fourier transform mid-infrared spectrometer was developed and studied experimentally. The final aim was a concept for a commercial prototype. Because of the portability requirement, the interferometer should be compact and insensitive to external temperature variations and mechanical vibrations. To minimise the size and manufacturing costs, a Michelson interferometer based on plane mirrors and a porch-swing bearing was selected, and no dynamic alignment system was applied. The driving motor was a linear voice-coil actuator, chosen to avoid mechanical contact of the moving parts. The capability of driving at the low mirror velocities required by photoacoustic detectors was studied. In total, four versions of such an interferometer were built and experimentally studied. The thermal stability under external temperature variations and the alignment stability over the mirror travel were measured using the modulation depth of a wide-diameter laser beam. A method for estimating the mirror tilt angle from the modulation depth was developed to take into account the effect of the non-uniform intensity distribution of the laser beam. The spectrometer stability was finally studied using infrared radiation as well. The latest interferometer was assembled into a mid-infrared spectrometer with a spectral range from 750 cm−1 to 4500 cm−1. The interferometer size was (197 × 95 × 79) mm³ with a beam diameter of 25 mm. The alignment stability, expressed as the change of the tilt angle over the mirror travel of 3 mm, was 5 μrad, which decreases the modulation depth in the infrared at 3000 cm−1 by only about 0.7 percent. During a temperature rise, the modulation depth at 3000 cm−1 changed by about 1 to 2 percentage units per degree Celsius over the short term, and by less than 0.2 percentage units per degree Celsius over the total temperature rise of 30 °C. The unapodised spectral resolution was 4 cm−1, limited by the aperture size. The best signal-to-noise ratio achieved was about 38 000:1 with a commercially available DLaTGS detector. Although the vibration sensitivity still requires improvement, the interferometer as a whole performed very well and could be further developed to conform to all the requirements of a portable and stable spectrometer.
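The quoted ~0.7 percent modulation loss can be cross-checked with the standard small-tilt approximation for a Michelson interferometer with a uniformly illuminated circular beam; this formula is a textbook approximation assumed here, not taken from the thesis itself:

```python
import math

# For beam diameter D, mirror tilt theta and wavenumber nu, the fringe
# modulation is commonly approximated for small tilts by
#   M ~ 1 - (2*pi*nu*D*theta)**2 / 8
# (small-argument expansion of the 2*J1(x)/x Airy-type factor).
nu = 3000 * 100        # 3000 cm^-1 expressed in m^-1
D = 25e-3              # beam diameter from the abstract, m
theta = 5e-6           # tilt change over the 3 mm travel, rad

x = 2 * math.pi * nu * D * theta
loss = x ** 2 / 8      # fractional modulation loss

print(round(100 * loss, 2))   # ~0.69 percent, consistent with the quoted ~0.7 %
```

The agreement suggests the 0.7 percent figure follows directly from the measured 5 μrad tilt and the 25 mm beam at 3000 cm−1.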

Relevance:

10.00%

Publisher:

Abstract:

End-user development is a very common but often largely overlooked phenomenon in information systems research and practice. End-user development means that regular people, the end-users of software, and not professional developers, are doing software development. A large number of people are directly or indirectly affected by the results of these non-professional development activities. The number of users performing end-user development activities is difficult to ascertain precisely, but it is very large and still growing. Computer adoption is approaching 100%, and many new types of computational devices are continually being introduced. In addition, devices not previously programmable are becoming so. This means that, at this very moment, hundreds of millions of people are likely struggling with development problems. Furthermore, software itself is continually being adapted for more flexibility, enabling users to change the behaviour of their software themselves. New software and services are helping to transform users from consumers into producers, and much of this is now found online. The problem for the end-user developer is that little of this development is supported by anyone. Organisations often do not notice end-user development and consequently neither provide support for it nor are equipped to do so. Many end-user developers do not belong to any organisation at all. The end-user development process itself may also aggravate the problem: end-users are usually not really committed to the development process, which tends to be iterative and ad hoc, so support becomes a distant third behind getting the job done and figuring out the development issues needed to get the job done. Sometimes the software itself may exacerbate the issue by simplifying the development process and thus de-emphasising the difficulty of the task being undertaken. Online support could be the lifeline the end-user developer needs.

By going online, one can find all the knowledge one could ever need. However, that still does not help end-users apply this information or knowledge in practice. A virtual community, through its ability to adopt the end-user's specific context, could surmount this final obstacle. This thesis explores the concept of end-user development and how it could be supported through online sources, in particular virtual communities, which, it is argued here, seem to fit the end-user developer's needs very well. The experiences of real end-user developers and prior literature were used in this process. The emphasis has been on those end-user developers, e.g. small business owners, who may have literally nowhere to turn for support. Adopting the viewpoint of the end-user developer, the thesis examines how an end-user could use a virtual community effectively, improving the results of the support process, assuming the common situation where the demand for support outstrips the supply.

Relevance:

10.00%

Publisher:

Abstract:

After the socialist revolution, Soviet Russia had to resolve, among many other questions, the defence of the achievements of socialism. At first the planned solution was a Red Guard based on voluntary service, but to secure sufficient manpower, general conscription was adopted. Soon after the Russian Civil War, the direction of military art came to be shaped more by the views of the old army's experts than by the experiences of the heroes of the revolution, even though the doctrine Frunze wrote for the Red Army was based on class struggle and emphasised the operational mobility found effective in the civil war. The foundation of Soviet and Russian military art is the Western orientation initiated by Peter I, complemented, however, by strong national characteristics. Aleksandr Suvorov can with good reason be regarded as the spiritual father of Russian military art; his teachings appear not only in quoted texts but also in its principles and in military education. The Imperial General Staff Academy, founded after the Napoleonic Wars, established military-scientific research and teaching in Russia. The possibilities of military science were not fully exploited in nineteenth-century Russia: an attitude that belittled the growing effectiveness of weaponry led to the stagnation of military art and to catastrophe in the Russo-Japanese War. Analysing its experiences, Aleksandr Neznamov further developed the German concept of the operation and laid the foundation for the operational art developed in the Soviet Union in the 1920s. The goal of Soviet military art was to develop a tactical and operational answer to the superiority of the defence brought about by the growing effectiveness of weaponry. The solution drew on British experience and research; Soviet tactics and operational art were, however, not eastern copies of British mechanised warfare or the German blitzkrieg, but rested on independent solutions.

The theory of deep battle and the deep operation was tested in exercises and developed until Stalin's purges of 1937. In the battles of the Second World War, after the catastrophe of the initial phase, the Red Army applied the doctrines of deep battle and the deep operation. The skill of the commanders and troops was not sufficient to meet the requirements of the theory, and so battle intended to be deep at times became merely dense. On the basis of the experiences of the Great Patriotic War, Soviet military science developed the principles of combined-arms battle, which have remained unchanged to the present day. During the Cold War era, the relative significance of nuclear and conventional weaponry in the conception of war and battle varied. The development of Western military art and weapons technology forced the Soviet Union in the 1980s to shift its military thinking from the offence to the defence. Since the dissolution of the Soviet Union, Russia's military security has been guaranteed by its nuclear arsenal. The conventional aerospace attack capability of the United States requires Russia to develop counter-systems. In building its conventional forces, Russia follows the development of Western military art closely, but holds to original solutions of its own, in whose development its strong military-scientific system and the method of dialectical materialism continue to play an essential role.

Relevance:

10.00%

Publisher:

Abstract:

Developing bioimage informatics – from microscopy to software solutions – with the α2β1 integrin as an application example. When the human genome was sequenced in 2003, the main task of the biosciences became determining the functions of the different genes, and various bioimaging techniques became central research methods. Technological advances led to an explosive growth in the popularity of fluorescence-based light microscopy techniques in particular, but microscopy had to change from a qualitative science into a quantitative one. This change gave rise to a new discipline, bioimage informatics, which has been said to have the potential to revolutionise the biosciences. This dissertation presents a broad, interdisciplinary body of work in the field of bioimage informatics. The first aim of the dissertation was to develop protocols for four-dimensional confocal microscopy of living cells, which was one of the fastest-growing bioimaging methods. The human collagen receptor α2β1 integrin, an important molecule in many physiological and pathological processes, served as the application example. The work achieved clear visualisations of integrin movement, clustering and internalisation into the cell, but tools for the quantitative analysis of the image information did not exist. The second aim of the dissertation therefore became the development of computer software suitable for such analysis. At the same time, bioimage informatics emerged as a field, and what it needed most was specialised software. The most important result of this dissertation work thus became BioImageXD, a novel open-source software package for the visualisation, processing and analysis of multidimensional bioimages. BioImageXD grew into one of the largest and most versatile packages in its field. It was published in the bioimage informatics special issue of Nature Methods and became well known and widely used. The third aim of the dissertation was to apply the developed methods to something more practical.

Synthetic silica nanoparticles were produced, carrying antibodies recognising the α2β1 integrin as "address labels". Using BioImageXD, it was shown that the nanoparticles have potential for targeted drug delivery applications. One fundamental aim of this dissertation work was to advance the new and little-known discipline of bioimage informatics, and this aim was achieved in particular through BioImageXD and its numerous published applications. The dissertation work has significant potential for the future, but bioimage informatics faces serious challenges: the field is too complex for the average biomedical researcher to master, and its most central element, open-source software development, is undervalued. Several improvements are needed in these respects.

Relevance:

10.00%

Publisher:

Abstract:

Can crowdsourcing solutions serve many masters? Can they be beneficial both for laymen and native speakers of minority languages on the one hand, and for serious linguistic research on the other? How did an infrastructure that was designed to support linguistics turn out to be a solution for raising awareness of native languages? Since 2012, the National Library of Finland has been developing the Digitisation Project for Kindred Languages, whose key objective is to support a culture of openness and interaction in linguistic research, but also to promote crowdsourcing as a tool for the participation of the language community in research. In the course of the project, over 1,200 monographs and nearly 111,000 pages of newspapers in Finno-Ugric languages will be digitised and made available in the Fenno-Ugrica digital collection. This material was published in the Soviet Union in the 1920s and 1930s, and users have previously had only sporadic access to it. The publication of open-access, searchable materials from this period is a goldmine for researchers: historians, social scientists and laymen with an interest in specific local publications can now find text materials pertinent to their studies. The linguistically oriented can also find writings to delight them: (1) lexical items specific to a given publication, and (2) orthographically documented specifics of phonetics. In addition to the open-access collection, we developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary because these rare and peripheral prints often include archaic characters that are neglected by modern OCR software developers but belong to the historical context of the kindred languages, and are thus an essential part of the linguistic heritage.

When designing the OCR editor, it was essential to consider both the needs of researchers and the capabilities of lay citizens, and to have them participate in the planning and execution of the project from the very beginning. By implementing the feedback from both groups iteratively, it was possible to turn the requested changes into tools for research that not only supported the work of linguists but also encouraged citizen scientists to take up the challenge and work with the crowdsourcing tools for the benefit of research. This presentation will not only deal with the technical aspects, developments and achievements of the infrastructure, but will also highlight the way in which the user groups, researchers and lay citizens, were engaged in the process as an active and communicative group of users, and how their contributions were turned to mutual benefit.

Relevance:

10.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

10.00%

Publisher:

Abstract:

Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

10.00%

Publisher:

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel: it describes an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit, and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field; digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include network protocols, cryptography and multimedia applications. As an example, the MPEG group standardised a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.

The explicit parallelism of a dataflow program is descriptive and enables improved utilisation of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analysability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an as-small-as-possible set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model for model checking. The model must describe everything that may affect the scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined that can produce quasi-static schedulers for a wide range of applications.

The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimised by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem, and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
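The node-and-queue model described above can be sketched minimally as follows; the node names, firing rule and the trivial dynamic scheduler are invented for illustration (this is plain Python, not RVC-CAL, which has its own actor syntax):

```python
from collections import deque

# Nodes communicate only through queues; a node may fire once its firing
# rule is satisfied, i.e. enough input tokens are available.
class Node:
    def __init__(self, fn, inputs, output, needed=1):
        self.fn, self.inputs, self.output, self.needed = fn, inputs, output, needed

    def can_fire(self):
        return all(len(q) >= self.needed for q in self.inputs)

    def fire(self):
        args = [q.popleft() for q in self.inputs for _ in range(self.needed)]
        self.output.append(self.fn(*args))

# Graph: source queue -> double -> add_pairs (consumes 2 tokens per firing).
q1, q2, q3 = deque([1, 2, 3, 4]), deque(), deque()
double = Node(lambda x: 2 * x, [q1], q2)
add_pairs = Node(lambda a, b: a + b, [q2], q3, needed=2)

# A trivial dynamic scheduler: fire any fireable node until none remains.
# Quasi-static scheduling would pre-compute most of these decisions.
nodes = [double, add_pairs]
while any(n.can_fire() for n in nodes):
    for n in nodes:
        if n.can_fire():
            n.fire()

print(list(q3))  # [6, 14]
```

Because each node's dependencies are explicit in its queues, the two nodes here could fire concurrently on separate cores, which is the parallelism the abstract describes.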