205 results for documenting automation
Abstract:
The Finnish electricity distribution sector, rural areas in particular, is facing major challenges because of economic regulation, tightening supply security requirements and ageing network assets. Therefore, the target in distribution network planning and asset management is to develop and renovate the networks to meet these challenges, in compliance with the regulations, in an economically feasible way. Concerning supply security, the new Finnish Electricity Market Act limits the maximum duration of electricity supply interruptions to six hours in urban areas and 36 hours in rural areas. This has a significant impact on distribution network planning, especially in rural areas where the distribution networks typically require extensive modifications and renovations to meet the supply security requirements. This doctoral thesis introduces a methodology for analysing electricity distribution system development. The methodology is based on and combines elements of reliability analysis, asset management and economic regulation. The analysis results can be applied, for instance, to evaluate the development of distribution reliability and to consider actions to meet the tightening regulatory requirements. Thus, the methodology produces information for strategic decision-making so that distribution system operators (DSOs) can respond to challenges arising in the electricity distribution sector. The key contributions of the thesis are a network renovation concept for rural areas, an analysis to assess supply security, and an evaluation of the effects of economic regulation on strategic network planning. In addition, the thesis demonstrates how the reliability aspect affects the placement of automation devices and how reserve power can be arranged in a rural area network.
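Purely as a hedged illustration of the supply security criterion mentioned in the abstract (not material from the thesis itself), the following Python sketch checks recorded interruption durations against the six-hour urban and 36-hour rural limits; the data format and names are assumptions.

```python
# Illustrative sketch only: checks interruption durations against the
# 6 h (urban) / 36 h (rural) limits mentioned in the abstract. The input
# format and function name are hypothetical, not taken from the thesis.
LIMITS_H = {"urban": 6.0, "rural": 36.0}

def non_compliant_interruptions(interruptions):
    """Return interruptions that exceed the statutory maximum duration.

    `interruptions` is an iterable of (area_type, duration_hours) tuples.
    """
    return [
        (area, hours)
        for area, hours in interruptions
        if hours > LIMITS_H[area]
    ]

if __name__ == "__main__":
    sample = [("urban", 2.5), ("rural", 40.0), ("urban", 7.0)]
    print(non_compliant_interruptions(sample))  # -> [('rural', 40.0), ('urban', 7.0)]
```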
Abstract:
Because of the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management is not only about managing technology but also about managing processes and people. This thesis focuses more on traditional data management and on the performance management of production processes, both of which can be seen as requirements for long-lasting development. Some operative BI solutions are also considered in the ideal state of the reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and to see how these affect its efficiency. The research was executed as a theoretical literature review of the subjects and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through theoretical frameworks and through active participant observation. To get a better picture of the ideal state of the reporting system, simple investment calculations were performed. According to the results of the research, the requirements for effective performance management of production processes are automation of data collection, integration of operative databases, use of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and efficient management of processes, data and roles.
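To make the ETL requirement named in the abstract concrete, here is a minimal, hedged Python sketch of an extract-transform-load step feeding a reporting table; the database names, schema and aggregation are illustrative assumptions, not the case company's actual systems.

```python
# Minimal ETL sketch (illustrative only): extract rows from an operative source,
# transform them to reporting granularity, and load them into a warehouse table.
# The schema, table and database names are hypothetical.
import sqlite3

def etl(source_db: str, warehouse_db: str) -> None:
    src = sqlite3.connect(source_db)
    dwh = sqlite3.connect(warehouse_db)

    # Extract: raw production measurements from the operative database.
    rows = src.execute(
        "SELECT batch_id, line, quantity_kg, date(timestamp) AS day FROM production"
    ).fetchall()

    # Transform: aggregate to daily totals per production line.
    daily = {}
    for batch_id, line, qty, day in rows:
        daily[(line, day)] = daily.get((line, day), 0.0) + qty

    # Load: write the aggregated facts into the data warehouse fact table.
    dwh.execute(
        "CREATE TABLE IF NOT EXISTS fact_daily_output (line TEXT, day TEXT, total_kg REAL)"
    )
    dwh.executemany(
        "INSERT INTO fact_daily_output VALUES (?, ?, ?)",
        [(line, day, total) for (line, day), total in daily.items()],
    )
    dwh.commit()
    src.close()
    dwh.close()
```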
Abstract:
Growing traffic is believed to increase the risk of an accident in the Gulf of Finland. As the consequences of a large oil accident would be devastating in this vulnerable sea area, accident prevention is performed at the international, regional and national levels. The activities of shipping companies are governed by maritime safety policy instruments, which can be categorised into regulatory, economic and information instruments. The maritime regulatory system has been criticised as inefficient because it has not been able to eliminate the violations that enable accidents. This report aims to discover how maritime governance systems, or maritime safety policy instruments, could be made more efficient in the future in order to improve the level of maritime safety. The results of the research are based on a literature review and nine expert interviews, with participants from shipping companies, interest groups and authorities. Based on the literature and the interviews, it can be suggested that in the future, instead of implementing new policy instruments, maritime safety risks should be eliminated by making the existing system more efficient and by influencing shipping companies’ safety culture and seafarers’ safety-related attitudes. Based on this research, it can be stated that the development of maritime safety policy instruments should concentrate on harmonisation, automation and increasing national and cross-border cooperation. These three tasks could be accomplished primarily by developing the existing technology. Human error plays a role in a significant number of maritime accidents. Because of this, improving companies’ safety culture and voluntary activities that go beyond the law are acknowledged as potential ways of improving maritime safety. In the future, the maritime regulatory system should be developed in a direction where the private sector has better opportunities to take part in decision-making.
Abstract:
This study is based on the notion that all students are likely to have a computer of some kind as their primary tool at school within a few years. The overall aim is to contribute to the knowledge of what this development of computer-assisted multimodal text production and communication on and over the net may entail in a school context. The study has an abductive approach drawing on theory from Media and Communication studies and from Pedagogy – particularly on media pedagogy, multimodality, storytelling, conversation research and deliberative democracy – and is based on a DBR project in three schools. The empirical data are retrieved from four school classes, school years 4 and 5, with good access to computers and digital cameras. The classes used class blogs to tell blog visitors about their school work and Skype to communicate with other classes in Sweden and Tanzania. A variety of research methods was employed: content analysis of texts, observations with field notes and camera documentation, interviews with individual students, group interviews with teachers and students, and a small survey. The study is essentially qualitative, focusing on students’ different perceptions. A small quantitative study was conducted to determine if any factors and variables could be linked to each other and to enable comparisons of the surveyed group with other research results. The results suggest that more computers at school offer more opportunities for real-life assignments and the chance to secure an authentic audience for the students’ production; primarily the students’ parents and relatives, students in the same class and at other schools. A theoretical analysis model to determine the degree of reality and authenticity in various school assignments was developed. The results also indicate that having access to cameras for documenting various events in the classes, and to an authentic audience, can create new opportunities for storytelling that have not been practiced previously at school. The documentary photo invites a viewer into the present tense of the image and the location where the picture was taken, whoever took the picture. It is used by the students, and here too a model has been developed to describe this relationship. The study also focuses on freedom of expression and democracy. One of the more unexpected findings is that the students in the study did not see that they could influence other people’s perceptions or change various power structures through communication on the web, either in or outside of school.
Abstract:
In this work, a development and investment plan for automatic traffic safety enforcement for the years 2011–2015 was drawn up for the highway network in Finland Proper (Egentliga Finland) and Satakunta. The plan contains an analysis of the current operating model and proposals for how the operating model could be improved, as well as proposals for which road sections should receive automatic enforcement and which projects should be prioritised during 2011–2015. The plan also describes the automatic enforcement technology and operating model used in Sweden, together with proposed measures for increasing the acceptance of automatic traffic enforcement. The current operating model and the need to develop automatic traffic enforcement were evaluated on the basis of expert interviews. The experts considered the current operating model to be largely functional and effective, since effective results have been achieved at relatively low cost. Automatic traffic enforcement must nevertheless be developed further, and the Swedish operating model was regarded as a good role model and example. The most important development areas were considered to be development of the technology, faster information processing and automation, which would increase the police's resources for carrying out enforcement. The road sections to receive automatic enforcement during 2011–2015 were prioritised mainly on the basis of accident frequency, number of accidents, Tarva calculation results, speed data and traffic volumes. In the accident statistics, attention was paid primarily to personal injury accidents. The Tarva calculations were also used to evaluate the impact and effectiveness of automatic traffic enforcement. According to the interviewed experts, openness about the enforcement, visible enforcement equipment and continuous communication are among the most important ways of increasing acceptance. Communication brings out the benefits of the system and its significance for traffic safety. Rebranding the enforcement camera as a safety camera, following the Swedish model, was also considered a good idea. This work presents a plan for developing the acceptance of automatic traffic safety enforcement.
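Purely as an illustrative aid (not part of the plan itself), a prioritisation of the kind described above could be sketched in Python as a weighted ranking of road sections over the criteria named in the abstract; the weights and field names below are assumptions.

```python
# Hypothetical sketch: rank road sections for automatic enforcement by a weighted
# score over the criteria named in the abstract. Weights and values are invented.
WEIGHTS = {
    "injury_accidents_per_year": 0.4,
    "accident_frequency": 0.3,     # accidents per million vehicle-km
    "mean_overspeed_kmh": 0.2,
    "traffic_volume_norm": 0.1,    # normalised annual average daily traffic
}

def score(section: dict) -> float:
    return sum(section[k] * w for k, w in WEIGHTS.items())

sections = [
    {"name": "Road section A", "injury_accidents_per_year": 4,
     "accident_frequency": 0.12, "mean_overspeed_kmh": 6.0, "traffic_volume_norm": 0.8},
    {"name": "Road section B", "injury_accidents_per_year": 2,
     "accident_frequency": 0.20, "mean_overspeed_kmh": 9.0, "traffic_volume_norm": 0.5},
]

for s in sorted(sections, key=score, reverse=True):
    print(f"{s['name']}: priority score {score(s):.2f}")
```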
Abstract:
The National Library of Finland hosted the international ELAG (European Library Automation Group) library conference at Metsätalo in Helsinki on 9–11 June 2010.
Abstract:
The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore’s law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these pressures, modern-day systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to utilize the tremendous advances of fabrication technology. Platform-based design is a possible solution for addressing these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies pertaining to the platform-based design approach do not provide full automation at every level of the design process, and sometimes the co-design of platform-based systems leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses the aforementioned challenges and discusses the creation of a development framework for platform-based system design, in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, thus ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to allow performance estimation of the solution that is as accurate as possible at high abstraction levels. VHDL code is automatically generated in the form of “snippets” to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
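As a hedged sketch of the kind of structural, correct-by-design check that the OCL constraints express in the approach above, the following simplified Python analogue validates a toy platform model before code generation; the component model and rules are hypothetical, not the actual SegBus profile or tooling.

```python
# Simplified illustration of structural platform validation, loosely analogous to
# the OCL constraint checking described in the abstract. The component model and
# rules are hypothetical, not the actual SegBus UML profile.
from dataclasses import dataclass, field

@dataclass
class Segment:
    name: str
    processing_elements: list = field(default_factory=list)
    has_arbiter: bool = False

@dataclass
class Platform:
    segments: list = field(default_factory=list)

def validate(platform: Platform) -> list:
    """Return a list of structural violations; an empty list means the model passes."""
    errors = []
    for seg in platform.segments:
        if not seg.has_arbiter:
            errors.append(f"Segment {seg.name} has no bus arbiter")
        if not seg.processing_elements:
            errors.append(f"Segment {seg.name} contains no processing elements")
    return errors

platform = Platform(segments=[Segment("seg0", ["PE0", "PE1"], True),
                              Segment("seg1", [], False)])
print(validate(platform))  # lists the violations found in seg1
```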
Abstract:
The aim of this study was to find out how an expert service in the logistics sector can be productised into an entity that is clear from the point of view of both the service provider and the service buyer. The study was carried out with a constructive research approach. Drawing on the literature in the field, a productisation process was constructed with which expert services can be productised at Mantsinen Group. The productisation process was carried out for one service. It included, among other things, drawing up a business model, packaging the service, documenting the service's components, test marketing and pricing. A total of 11 people were interviewed for the productisation, and one larger group work session was also held. Public information sources were used in the competitor review. The result of the study was a productised expert service divided into modules, together with the documentation related to the service. The detailed documentation is for the company's internal use only, but a general description of the service is presented in this work. The study found that productisation is a good way to rationalise expert service business. It was also observed that carrying out test marketing in expert project work is difficult.
Abstract:
The purpose of this Master's thesis was to identify present-day customer needs related to dimensional drawings in the Flow Control Solutions (Virtauksensäätöratkaisut) business line of Metso Automation. Dimensional drawings are an important part of the document package delivered to the customer with the product, and they have been subject to increasing requirements in recent years. The central goals of the study were to understand the significance of 3D dimensional drawings in Metso's business, to identify today's customer needs related to dimensional drawings, and to create a development plan for dimensional drawing operations based on the identified needs. The work consisted of a theoretical literature review of 3D modelling and an empirical study of customer need identification. Customer needs related to valve assembly dimensional drawings were collected through interviews and a web survey; Metso's experts, Metso's customers and CAD system vendors were interviewed. The main result of the work is a development plan for dimensional drawing operations, based on a mapping of customer needs related to dimensional drawings, an assessment of Metso's dimensional drawing tools and the results concerning the data and system requirements of 3D dimensional drawings. The development plan describes how dimensional drawing operations should be developed comprehensively in the near future. The results provide a good basis for developing higher-quality and more customer-oriented operations.
Abstract:
The Laboratory of Intelligent Machines researches and develops energy-efficient power transmissions and automation for mobile construction machines and industrial processes. The laboratory's particular areas of expertise include mechatronic machine design using virtual technologies and simulators, and demanding industrial robotics. The laboratory has collaborated extensively with industrial actors and has participated in significant international research projects, particularly in the field of robotics. For years, dSPACE tools were the only hardware used in the lab to develop different control algorithms in real time. dSPACE's hardware systems are in widespread use in the automotive industry and are also employed in drives, aerospace and industrial automation. However, competitors are developing sophisticated new systems whose features convinced the laboratory to test new products. One of these competitors is National Instruments (NI). In order to get to know the specifications and capabilities of NI tools, an agreement was made to test an NI evaluation system. This system is used to control a 1-D hydraulic slider. The objective of this research project is to develop a control scheme for the teleoperation of a hydraulically driven manipulator, to implement a control algorithm for human-machine interaction and machine-task environment interaction on both the NI and dSPACE systems simultaneously, and to compare the results.
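By way of illustration only (the abstract does not publish the controller), a position-tracking loop for a 1-D hydraulic slider of the kind mentioned above could be sketched as a simple proportional controller in Python; the gains, limits and sample behaviour are assumed values rather than those used on the NI or dSPACE systems.

```python
# Illustrative proportional position controller for a 1-D slider, discretised at a
# fixed sample time. Gains and saturation limits are invented for this sketch; the
# real NI/dSPACE implementations in the thesis are not reproduced here.
def control_step(reference_mm: float, measured_mm: float,
                 kp: float = 0.8, u_max: float = 10.0) -> float:
    """Return a valve command (e.g. volts) limited to +/- u_max."""
    error = reference_mm - measured_mm
    command = kp * error
    return max(-u_max, min(u_max, command))

# Toy simulation of the closed loop with a crude first-order plant approximation.
position = 0.0
for step in range(50):
    u = control_step(reference_mm=100.0, measured_mm=position)
    position += 0.5 * u          # assumed plant response per 10 ms sample
print(f"position after 50 samples: {position:.1f} mm")
```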
Abstract:
In building automation, applications arise in which implementing the system's control, regulation or monitoring solution with programmable logic controllers is not sufficiently affordable. In such cases, an alternative is to design a device of one's own. The goal of this work was to design and implement a cost-effective, freely programmable automation unit that connects to a CAN bus. The design was guided by requirement specifications drawn up by the customer; of these, the device's configuration options and the target size of the circuit board posed the greatest challenges to the design. As a result of the work, an automation unit suited to the customer's needs was implemented. The goals were reached through component selection and by making effective use of the microcontroller's integrated features, which made it possible to eliminate many units that are conventionally implemented with discrete components. The work examined the electronics product development process of an embedded system from idea to prototype, and describes the chosen solutions as well as the mistakes made during the design and how they were resolved.
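For readers unfamiliar with CAN bus integration, a hedged PC-side sketch of sending a frame to such a bus-connected unit is shown below using the python-can library; the channel name, arbitration ID and payload layout are assumptions, not the device's actual protocol.

```python
# Illustrative only: send one CAN frame from a PC to a bus-connected automation
# unit using python-can. The channel name, arbitration ID and payload meaning are
# hypothetical; the actual device protocol in the thesis is not public.
import can

def send_relay_command(channel: str = "can0", relay_on: bool = True) -> None:
    bus = can.interface.Bus(channel=channel, interface="socketcan")
    msg = can.Message(
        arbitration_id=0x120,                 # assumed node/command identifier
        data=[0x01 if relay_on else 0x00],    # assumed single-byte relay command
        is_extended_id=False,
    )
    bus.send(msg)
    bus.shutdown()

if __name__ == "__main__":
    send_relay_command()
```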
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language; UML has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; this is required for authenticating service requests from authorized actors, since not all types of users have access to all resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see what service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required; we do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
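Purely to illustrate the idea of method pre- and post-conditions constraining a stateful REST resource (a hedged Python analogue, not the code produced by the thesis toolchain), consider a minimal booking resource:

```python
# Hedged illustration of behavioural constraints on a stateful REST resource,
# loosely following the hotel booking example in the abstract. States, method
# names and error handling are assumptions, not the thesis's generated code.
class BookingResource:
    def __init__(self):
        self.state = "available"

    def put_reservation(self, guest: str) -> dict:
        # Precondition: a room can only be reserved while it is available.
        assert self.state == "available", "precondition violated: room not available"
        self.state = "reserved"
        self.guest = guest
        # Postcondition: the resource must now be in the reserved state.
        assert self.state == "reserved"
        return {"status": 201, "state": self.state, "guest": guest}

    def delete_reservation(self) -> dict:
        # Precondition: only an existing reservation can be cancelled.
        assert self.state == "reserved", "precondition violated: nothing to cancel"
        self.state = "available"
        return {"status": 200, "state": self.state}

room = BookingResource()
print(room.put_reservation("Alice"))
print(room.delete_reservation())
```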
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Biomedical natural language processing (BioNLP) is a subfield of natural language processing, an area of computational linguistics concerned with developing programs that work with natural language: written texts and speech. Biomedical relation extraction concerns the detection of semantic relations such as protein-protein interactions (PPI) from scientific texts. The aim is to enhance information retrieval by detecting relations between concepts, not just individual concepts as with a keyword search. In recent years, events have been proposed as a more detailed alternative to simple pairwise PPI relations. Events provide a systematic, structural representation for annotating the content of natural language texts. Events are characterized by annotated trigger words, directed and typed arguments and the ability to nest other events. For example, the sentence “Protein A causes protein B to bind protein C” can be annotated with the nested event structure CAUSE(A, BIND(B, C)). Converted to such formal representations, the information of natural language texts can be used by computational applications. Biomedical event annotations were introduced by the BioInfer and GENIA corpora, and event extraction was popularized by the BioNLP'09 Shared Task on Event Extraction. In this thesis we present a method for automated event extraction, implemented as the Turku Event Extraction System (TEES). A unified graph format is defined for representing event annotations, and the problem of extracting complex event structures is decomposed into a number of independent classification tasks. These classification tasks are solved using SVM and RLS classifiers, utilizing rich feature representations built from full dependency parsing. Building on earlier work on pairwise relation extraction and using a generalized graph representation, the resulting TEES system is capable of detecting binary relations as well as complex event structures. We show that this event extraction system has good performance, reaching first place in the BioNLP'09 Shared Task on Event Extraction. Subsequently, TEES has achieved several first ranks in the BioNLP'11 and BioNLP'13 Shared Tasks, as well as showing competitive performance in the binary-relation Drug-Drug Interaction Extraction 2011 and 2013 shared tasks. The Turku Event Extraction System is published as a freely available open-source project, documenting the research in detail as well as making the method available for practical applications. In particular, in this thesis we describe the application of the event extraction method to PubMed-scale text mining, showing that the developed approach not only performs well but is generalizable and applicable to large-scale real-world text mining projects. Finally, we discuss related literature, summarize the contributions of the work and present some thoughts on future directions for biomedical event extraction. This thesis includes and builds on six original research publications. The first of these introduces the analysis of dependency parses that led to the development of TEES. The entries in the three BioNLP Shared Tasks, as well as in the DDIExtraction 2011 task, are covered in four publications, and the sixth demonstrates the application of the system to PubMed-scale text mining.
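To make the nested event representation concrete, the following hedged Python sketch encodes CAUSE(A, BIND(B, C)) as a typed, directed graph of entities, triggers and arguments, in the spirit of the unified graph format described above but not using TEES's actual data structures:

```python
# Hedged sketch: the sentence "Protein A causes protein B to bind protein C"
# encoded as a typed, directed graph of entities, event triggers and arguments.
# Node and edge labels follow the spirit of the abstract, not TEES internals.
nodes = {
    "T1": ("Protein", "A"),
    "T2": ("Protein", "B"),
    "T3": ("Protein", "C"),
    "E1": ("Event", "CAUSE"),
    "E2": ("Event", "BIND"),
}

# Directed, typed argument edges: (source, target, role).
edges = [
    ("E1", "T1", "Cause"),   # CAUSE takes protein A as its cause argument
    ("E1", "E2", "Theme"),   # ...and the nested BIND event as its theme
    ("E2", "T2", "Theme"),   # BIND takes protein B
    ("E2", "T3", "Theme2"),  # ...and protein C
]

def arguments(event_id: str):
    """List the (role, label) arguments of an event node."""
    return [(role, nodes[tgt][1]) for src, tgt, role in edges if src == event_id]

print("CAUSE arguments:", arguments("E1"))
print("BIND arguments:", arguments("E2"))
```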
Abstract:
The aim of the study was to find out how operational efficiency should be developed and measured in logistics services. The empirical material of the study consisted of interviews with seven representatives of a logistics service company and six representatives of the logistics service company's customer companies. After the interviews, each representative was asked to answer a questionnaire whose purpose was to verify the validity of the main findings of the interviews. Based on the study, benefits can be achieved by: 1) using suitable development tools systematically, 2) developing site-specific performance measurement systems and using them for different purposes, 3) planning development activities at the site level, 4) reporting and documenting development activities both within a site and across the whole organisation, and 5) developing processes in different business units with the help of an organisation-wide development team. In addition, the study revealed what kinds of criteria customers value in the operations of a logistics service company, where development efforts should primarily be targeted, and what kinds of metrics and measurement frameworks can be used in measuring operational efficiency.