102 results for Minimalist Running
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Resumo:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field; digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
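The firing discipline described above can be sketched in a few lines (a minimal illustration, not RVC-CAL; the class and function names are invented for this example): a node owns its input queues and fires only when every queue holds a token.

```python
from collections import deque

class Node:
    """A dataflow node: fires when each input queue holds enough tokens."""
    def __init__(self, func, n_inputs):
        self.func = func
        self.inputs = [deque() for _ in range(n_inputs)]
        self.outputs = deque()

    def can_fire(self):
        # Firing rule: every input queue must hold at least one token.
        return all(q for q in self.inputs)

    def fire(self):
        # Consume one token from each input, produce one output token.
        args = [q.popleft() for q in self.inputs]
        self.outputs.append(self.func(*args))

# Wire token streams into an adder node.
add = Node(lambda a, b: a + b, n_inputs=2)
add.inputs[0].extend([1, 2, 3])
add.inputs[1].extend([10, 20])
while add.can_fire():
    add.fire()
print(list(add.outputs))  # -> [11, 22]; the third token waits for a matching input
```

Note that the node never inspects anything outside its own queues, which is exactly what makes the dependencies, and hence the parallelism, explicit.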
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic ones where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an as small as possible set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined that is able to produce quasi-static schedulers for a wide range of applications.
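The quasi-static idea can be made concrete with a toy example (the actor names, modes, and schedules below are invented for illustration, not taken from the thesis): the firing order within each mode is fixed offline, and only the mode choice remains a run-time decision.

```python
# Quasi-static scheduling sketch: most of the firing order is pre-computed
# offline (e.g. by model checking); only one data-dependent decision
# (the mode of the incoming packet) is evaluated at run time.

STATIC_SCHEDULES = {
    # mode -> fixed firing sequence found offline
    "intra": ["A", "A", "B", "C"],
    "inter": ["A", "B", "B", "C"],
}

def run(packets, fire):
    for packet in packets:
        mode = packet["mode"]           # the only run-time scheduling decision
        for actor in STATIC_SCHEDULES[mode]:
            fire(actor, packet)         # no firing-rule test inside the sequence

trace = []
run([{"mode": "intra"}, {"mode": "inter"}],
    fire=lambda actor, pkt: trace.append(actor))
print(trace)  # -> ['A', 'A', 'B', 'C', 'A', 'B', 'B', 'C']
```

The dynamic scheduler thus evaluates one condition per packet instead of one firing rule per actor firing, which is where the overhead reduction comes from.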
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
Abstract:
Speed, uncertainty, and complexity are increasing in the business world all the time. When knowledge and skills quickly become irrelevant, new challenges are set for information technology (IT) education. Meta-learning skills (learning how to learn rapidly) and innovation skills have become more essential than single technologies or other specific issues. The drastic changes in the information and communications technology (ICT) sector have created a need to reconsider how IT Bachelor education in Universities of Applied Sciences should be organized to cope with the change. The objective of the study was to evaluate how a new approach to IT Bachelor education, the ICT entrepreneurship study path (ICT-ESP), fits IT Bachelor education in a Finnish University of Applied Sciences. This kind of educational arrangement has not been employed elsewhere in the context of IT Bachelor education. The study presents the results of a four-year period during which IT Bachelor education was renewed in a Finnish University of Applied Sciences. The learning environment was organized into an ICT-ESP based on Nonaka’s knowledge theory and Kolb’s experiential learning. The IT students who studied in the ICT-ESP established a cooperative and learned ICT by running their cooperative at the University of Applied Sciences. The students (called team entrepreneurs) studied by reading theory in books and other sources of explicit information, doing projects for their customers, and reflecting in training sessions on what was learnt by doing and by studying the literature. Action research was used as the research strategy in this study. Empirical data was collected via theme-based interviews, direct observation, and participative observation. The grounded theory method was utilized in the data analysis, and theoretical sampling was used to guide the data collection. The context of the University of Applied Sciences provided a good basis for fostering team entrepreneurship.
However, the results showed that the ICT-ESP did not fit into IT Bachelor education well enough. The ICT-ESP was cognitively too demanding for the team entrepreneurs because they had two different sets of rules to follow in their studies. The conventional courses consumed a lot of energy that should have been spent on professional development in the ICT-ESP. The range of competencies needed in the ICT-ESP for professional development was greater than that needed for any other way of studying: the team entrepreneurs needed to develop skills in ICT, leadership and self-leadership, team development, and entrepreneurship. The entrepreneurship skills included marketing and sales, brand development, productization, and business administration. Considering the three years the team entrepreneurs spent in the ICT-ESP, the challenges were remarkable. Changes to the organization of IT Bachelor education are also suggested in the study. First, it should be acknowledged that the ICT-ESP produces IT Bachelors with a different set of competencies compared to the conventional way of educating IT Bachelors. Second, the number of courses on general topics in mathematics, physics, and languages for team entrepreneurs studying in the ICT-ESP should be reconsidered, and the conventional course-based teaching of these topics should be reorganized to support the team coaching process of the team entrepreneurs with their practice-oriented projects. Third, upcoming team entrepreneurs should be equipped with relevant information about the ICT-ESP and what studying as a team entrepreneur would require in practice. Finally, upcoming team entrepreneurs should be carefully selected before they start in the ICT-ESP, to make it possible to screen out solo players and those who have too romantic a view of being a team entrepreneur.
The results gained in the study provided answers to the original research questions, and the objectives of the study were met. Even though the IT degree programme was terminated during the research process, the amount of qualitative data gathered made it possible to justify the interpretations made.
Abstract:
The procurement of services is an increasingly integral part of companies’ business as, under tightening competition, they concentrate ever more on producing their core functions. General guidelines on developing service quality and on carrying out the procurement process are plentiful, but a comprehensive description of service-focused procurement is hard to find in the research field. The aim of this study is to examine the process and quality of service procurement, and to fill the existing research gap by identifying the elements of the procurement process that are critical to the quality of the outcome. The topic is approached by first reviewing the existing research on the service procurement process and on quality. In addition, procurement specialists were interviewed for the study, and a case study was conducted on the procurement of school transportation for the City of Espoo, in which a QFD matrix analysis is used to identify the critical factors. As its outcome, the study presents a list, structured along the procurement process, of the factors critical to a high-quality result; by focusing on these, the quality of service procurement can be improved. The most central factors in the results are assembling the procurement team, identifying the company’s objectives, understanding the service content and its goals, utilizing sanction models, and fostering cooperation in procurement planning and contract follow-up.
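The QFD matrix analysis used in the case study can be sketched as a weighted relationship table: customer requirements carry importance weights, each requirement-factor pair gets a relationship strength, and the column sums rank the factors by criticality. All requirements, factors, weights, and strengths below are invented for illustration and are not taken from the Espoo case.

```python
requirements = {            # requirement -> importance weight (hypothetical)
    "punctuality": 5,
    "safety": 4,
    "cost control": 3,
}
relationship = {            # (requirement, factor) -> strength 0/1/3/9
    ("punctuality", "contract follow-up"): 9,
    ("punctuality", "service specification"): 3,
    ("safety", "service specification"): 9,
    ("safety", "contract follow-up"): 3,
    ("cost control", "sanction model"): 9,
}
factors = {"contract follow-up", "service specification", "sanction model"}

# Column score = sum over requirements of (weight x relationship strength).
scores = {f: sum(w * relationship.get((r, f), 0)
                 for r, w in requirements.items())
          for f in factors}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

The highest-scoring columns are the process elements most critical to the quality of the outcome, which is how the critical-factor list of the study is derived in spirit.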
Abstract:
Environmental threats are growing nowadays and have become global issues. People around the world try to face these issues by two means: remediating already affected environments and protecting unaffected ones. This thesis describes the design, implementation, and evaluation of an online water quality monitoring system in Lake Saimaa, Finland. The water quality in Lake Saimaa needs to be monitored in order to provide responsible bodies with valuable information that allows them to act fast to prevent any negative impact on the lake’s environment. The objectives were to design a suitable system, implement the system in Lake Saimaa, and then evaluate the applicability and reliability of such systems for this environment. The needs for the system were first identified, after which the design, the necessary modifications, and the construction of the system took place. This was followed by testing of the system in Lake Saimaa at two locations near the city of Mikkeli. The last step was to evaluate the whole system. The main results were that applying online water quality monitoring systems in Lake Saimaa offers many advantages, such as reducing the required manpower, time, and running costs. However, the unreliability of the exact measured values of some parameters is still a drawback of such systems; it could be addressed by using more advanced equipment with more sophisticated features designed specifically for monitoring at the predefined location.
Abstract:
A brain-computer interface (BCI) is a kind of human-machine interface that provides a new interaction method between humans and computers or other equipment. The most significant characteristic of a BCI system is that its control input consists of electrical activities acquired from the brain instead of traditional inputs such as hands or eyes. BCI techniques have developed rapidly during the last two decades, mainly serving as an assistive technology to help disabled people improve their quality of life. With the appearance of low-cost novel electrical devices such as EMOTIV, BCI techniques have been brought to the general public through many useful applications, including video gaming, virtual reality, and virtual keyboards. The purpose of this research is to become familiar with the EMOTIV EPOC system and to use it to build an EEG-based BCI system for controlling an industrial manipulator by means of human thought. To build the BCI system, an acquisition program based on the EMOTIV EPOC system was designed, and an MFC-based dialog that works as an operation panel is presented. Furthermore, the inverse kinematics of the RV-3SB industrial robot was solved. In the last part of this research, the designed BCI system with human-thought input is examined, and the results indicate that the system runs smoothly and clearly displays the motion type and the incremental displacement of the motion.
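The flavor of an inverse kinematics solution can be shown with the closed-form case of a planar two-link arm; this is a simplified stand-in for illustration only, since the RV-3SB is a six-axis robot whose actual solution is considerably more involved.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm:
    given a target (x, y) and link lengths l1, l2, return the
    shoulder and elbow joint angles (elbow-up solution)."""
    d = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(d) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(d)                       # elbow angle
    theta1 = math.atan2(y, x) - math.atan2(     # shoulder angle
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Reach the point (1, 1) with two unit-length links.
t1, t2 = two_link_ik(1.0, 1.0, 1.0, 1.0)
print(round(math.degrees(t1)), round(math.degrees(t2)))  # -> 0 90
```

A controller built on such a solver would convert each incremental displacement commanded through the BCI into updated joint angles for the manipulator.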
Abstract:
The Master’s thesis “Biomass Utilization in PFC Co-firing System with the Slagging and Fouling Analysis” is a study of the modern technologies of different coal-firing systems: the PFC system, the FB system, and the GF system. Biomass co-fired with coal is represented by research from the company Alstom Power Plant. Against the background of air pollution, greenhouse effect problems, and national fuel security today, bioenergy utilization is becoming more and more popular. However, while biomass is promoted for combustion to decrease emissions of carbon dioxide and other air pollutants, new problems arise in the firing systems, such as slagging, fouling, and hot corrosion. The thesis presents a brief overview of the different coal-firing systems used in the world and focuses on biomass-coal co-firing in the PFC system. The biomass supply and the operation of the PFC system are described in the thesis. Additionally, the new problems of hot corrosion, slagging, and fouling are discussed. The slagging and fouling problem is simulated using the software HSC Chemistry 6.1, and the emissions of coal firing and co-firing are compared in simulation as well.