46 results for Text-to-speech systems

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of various existing software development methods. We reviewed the main features of (1) software engineering methodologies, (2) distributed system characteristics and their effect on software development, and (3) composite materials modeling activities and the requirements they place on software development. Using design science as the research methodology, a distributed system for creating models of composite materials is created and evaluated. The empirical experiments we conducted showed good convergence between the modeled and the real processes. Throughout the study, we paid attention to the complexity and importance of the distributed system and to a deep understanding of modern software engineering methods and tools.

Relevance:

100.00%

Publisher:

Abstract:

The flow of information within modern information society has increased rapidly over the last decade. The major part of this information flow relies on the individual's ability to handle text or speech input. For the majority of us this presents no problems, but there are some individuals who would benefit from other means of conveying information, e.g. a signed information flow. During the last decades, new results from various disciplines have all pointed towards a common background and processing for sign and speech, and this was one of the key issues I wanted to investigate further in this thesis. The basis of this thesis is firmly within speech research, which is why I wanted to design, for signers, test batteries analogous to widely used speech perception tests – to find out whether the results for signers would be the same as those in speakers' perception tests. One of the key findings within biology – and more precisely its effects on speech and communication research – is the mirror neuron system. That finding has enabled us to form new theories about the evolution of communication, and it all seems to converge on the hypothesis that all communication has a common core within humans.

In this thesis, speech and sign are discussed as equal and analogous counterparts of communication, and all research methods used for speech are modified for sign. Both speech and sign are thus investigated using similar test batteries. Furthermore, both production and perception of speech and sign are studied separately. An additional framework for studying production is given by gesture research using cry sounds. The results of the cry sound research are then compared to results from children acquiring sign language, and these results show that individuality manifests itself from very early on in human development. Articulation in adults, both in speech and sign, is studied from two perspectives: normal production and re-learning production when the apparatus has been changed. Normal production is studied in both speech and sign, and the effects of changed articulation are studied with regard to speech. Both of these studies are done using carrier sentences. Furthermore, sign production is studied by giving the informants the possibility for spontaneous production. The production data from the signing informants is also used as the basis for the input of the sign synthesis stimuli used in the sign perception test battery.

Speech and sign perception were studied using the informants' answers to forced-choice identification and discrimination tasks, and these answers were then compared across language modalities. Three different informant groups participated in the sign perception tests: native signers, sign language interpreters, and Finnish adults with no knowledge of any signed language. This gave a chance to investigate which of the characteristics found in the results were due to the language per se and which were due to the change in modality itself. As the analogous test batteries yielded similar results over different informant groups, some common threads could be observed in the results. Starting from very early on in acquiring speech and sign, the results were highly individual; however, the results were the same within one individual when the same test was repeated. This individuality of results followed the same patterns across different language modalities and – on some occasions – across language groups.

As both modalities yield similar answers to analogous study questions, this has led us to provide methods for basic input for sign language applications, i.e. signing avatars. It has also given us answers to questions on the precision of the animation and its intelligibility for the users – what parameters govern the intelligibility of synthesised speech or sign, and how precise the animation or synthetic speech must be in order to be intelligible. The results also give additional support to the well-known fact that intelligibility is not the same as naturalness. In some cases, as shown within the sign perception test battery design, naturalness decreases intelligibility. This also has to be taken into consideration when designing applications. All in all, the results from each of the test batteries, be they for signers or speakers, yield strikingly similar patterns, which lends yet further support to a common core for all human communication. Thus, we can modify and deepen the phonetic framework models of human communication based on the knowledge obtained from the results of the test batteries within this thesis.

Relevance:

100.00%

Publisher:

Abstract:

Developing software is a difficult and error-prone activity. Furthermore, the complexity of modern computer applications is significant. Hence, an organised approach to software construction is crucial. Stepwise Feature Introduction – created by R.-J. Back – is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can be easily reused. Moreover, interaction with the users of the software and correctness concerns are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been successfully applied in an academic environment to a number of small-scale developments. The thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion in the thesis covers various aspects of software development that relate to Stepwise Feature Introduction. More specifically, we evaluate the paradigm based on the common practices of object-oriented programming and design and on agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
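The layered structure that Stepwise Feature Introduction produces can be pictured with a small sketch. The following Python fragment is only an illustration of the general idea, not code from the thesis: each layer extends the one below it with a single new feature while leaving the earlier behaviour intact, and all class and method names are invented for this example.

# Layer 0: a minimal text buffer with only insertion.
class BasicBuffer:
    def __init__(self):
        self.text = ""

    def insert(self, s: str) -> None:
        self.text += s


# Layer 1: introduce one new feature (undo) on top of the previous layer.
class UndoBuffer(BasicBuffer):
    def __init__(self):
        super().__init__()
        self._history = []

    def insert(self, s: str) -> None:
        self._history.append(self.text)   # remember the state before the change
        super().insert(s)                 # reuse the lower layer unchanged

    def undo(self) -> None:
        if self._history:
            self.text = self._history.pop()


# Layer 2: another small increment (word count) built on the layer below.
class CountingBuffer(UndoBuffer):
    def word_count(self) -> int:
        return len(self.text.split())


buf = CountingBuffer()
buf.insert("hello world")
assert buf.word_count() == 2
buf.undo()
assert buf.text == ""

Because every layer is a superclass of the next, the lower layers remain usable and testable on their own, which is what makes the resulting code easy to reuse.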

Relevance:

100.00%

Publisher:

Abstract:

The study develops an approach for validating software functionality against work system needs in SMEs. The approach is constructed using a SaaS-based software, i.e. a work collaboration service (WCS), and SMEs as the elements of study, where the WCS's functionality is qualified against the collaboration needs that exist in operational and project work within SMEs. A constructivist approach and the case study method were selected for this research because the nature of the study requires an in-depth study of the work collaboration service as well as a detailed study of the work systems within different enterprises. Four different companies were selected, in which fourteen interviews were conducted to gather the pertinent data. The work system method and framework are used as the central part of the approach to collect, analyze and interpret the enterprises' work system models and the underlying collaboration needs in operational and project work. On the other hand, the functional model of the WCS and its functionality are determined from functional model analysis, software testing, documentation and meetings with the service vendor. The enterprise work system models and the WCS model are compared to reveal how work progression differs between the two and to make visible the unaddressed stages of work progression. The WCS functionality is compared to the work systems' collaboration needs to ascertain whether the service will satisfy the needs of the project and operational work under study. The unaddressed needs provide opportunities to improve the functionality of the service for better conformity to the needs of the enterprise and the work. The results revealed that the functional models differed in how operational and project work progressed within the stages. The WCS shared similar stages of work progression, apart from the stages of identification and acceptance, and the progress and completion stages were only partially addressed. The conclusion is that the identified unaddressed needs, such as a single point of reference or SLA and OLA inclusion, should be implemented or improved within the WCS at the appropriate stages of work to gain better compliance of the service with the needs of the enterprise and the work itself. The developed approach can hence be used to carry out a similar analysis of the conformance of pre-built software functionality to work system needs within SMEs.

Relevance:

100.00%

Publisher:

Abstract:

This thesis examines the coordination of the systems development process in a contemporary software-producing organization. The thesis consists of a series of empirical studies in which the actions, conceptions and artifacts of practitioners are analyzed using a theory-building case study research approach. The three phases of the thesis provide empirical observations on different aspects of systems development. The first phase examines the role of architecture in coordination and cost estimation in a multi-site environment. The second phase involves two studies on the evolving requirement understanding process and how to measure this process. The third phase summarizes the first two phases and concentrates on the role of methods and how practitioners work with them. All the phases provide evidence that current systems development method approaches are too naïve in looking at the complexity of the real world. In practice, development is influenced by opportunity and other contingent factors. The systems development process is not coordinated using the phases and tasks defined in methods as a universal mechanism for managing the process, as most method approaches assume. Instead, the studies suggest that managing the systems development process happens through coordinating development activities, using methods as tools. These studies contribute to systems development methods by emphasizing support for communication and collaboration between systems development participants. Methods should not describe development activities and phases at a detailed level, but should include higher-level guidance for practitioners on how to act in different systems development environments.

Relevance:

100.00%

Publisher:

Abstract:

In healthcare, the possibilities of information technology (IT) are nowadays used to improve the quality of care, to reduce care-related costs, and to simplify and clarify physicians' workflow. Information systems, which form the core of every IT solution, must be developed to meet numerous requirements, one of which is the ability to integrate seamlessly with other information systems. System integration is, however, still a challenging task, even though several standards have been developed for it. This work describes the interfacing solution of a newly developed medical information system. The requirements placed on such an application are discussed, and the way in which these requirements are fulfilled is also presented. The interfacing solution is divided into two parts: the information system interface and the interfacing engine. The former comprises the basic functionality needed to receive data from and send data to other systems, whereas the latter provides support for the standards used in the production environment. The design of both parts is presented thoroughly in this work. The problem was solved by means of a modular and generic design. This approach is shown in the work to be a durable and flexible solution that can be used to address a wide range of requirements placed on an interfacing solution. In addition, it is shown how, thanks to its flexibility, the solution can easily be adapted to requirements that have not been identified in advance, thereby also providing a foundation for future needs.
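As a purely illustrative sketch of such a modular and generic interfacing design (the class, function and message format below are invented for this example and are not the implementation described in the work), the interfacing engine could be a registry of pluggable handlers, one per supported standard, behind a fixed receiving interface:

# Hypothetical sketch: a generic interface that delegates to per-standard handlers.
class InterfacingEngine:
    """Registry of message converters, one per supported messaging standard."""
    def __init__(self):
        self._handlers = {}

    def register(self, standard: str, handler):
        self._handlers[standard] = handler

    def process(self, standard: str, raw_message: str) -> dict:
        if standard not in self._handlers:
            raise ValueError(f"no handler registered for {standard}")
        return self._handlers[standard](raw_message)


engine = InterfacingEngine()
# An invented pipe-delimited format, parsed into key/value pairs.
engine.register("pipe-delimited",
                lambda msg: dict(field.split("=", 1) for field in msg.split("|")))

print(engine.process("pipe-delimited", "patient=1234|event=admit"))

Support for an additional standard then means registering one more handler, without touching the code that receives and sends messages.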

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study is to examine the current state of knowledge sharing and collaboration between the outbound logistics and sales departments of the plywood business organization of UPM-Kymmene Wood Oy, and to map the departments' information needs related to collaboration. The aim of the work is then to give the target departments proposals for measures and technological solutions to improve their operations. The work first presents concepts related to knowledge and knowledge sharing. After that, an overview of business information and its management is given. In addition, the study examines the organization and the target departments under study. Based on the research material collected through interviews and observation, a great deal of different customer- and market-related information is currently exchanged between the target departments. In the current state, however, the sharing and exchange of information is very strongly based on separately expressed information needs rather than on regular collaboration. Based on the interviews, needs for improvement in the current flow and sharing of information can also be identified in the target departments. The development proposals given concern making the collaboration regular and conversational, sharing existing information more effectively, and utilizing the existing information management systems in the collaboration instead of creating new systems.

Relevance:

100.00%

Publisher:

Abstract:

The primary objective is to identify the critical factors that naturally affect the performance measurement system. It is important to make correct decisions related to measurement systems, decisions which are based on the complex business environment. The performance measurement system is combined with very complex, non-linear factors. The Six Sigma methodology is seen as one potential approach at every organisational level. It is linked to performance and financial measurement as well as to the analytical thinking on which the viewpoint of management depends. The complex systems are connected to the customer relationship study. The primary throughput can be seen in a new, well-defined performance measurement structure, which is facilitated together with an analytical multifactor system. At the same time, these critical factors should also be seen as a business innovation opportunity. This master's thesis is divided into two theoretical parts. The empirical part consists of both action-oriented and constructive research approaches together with an empirical case study. The secondary objective is to seek a competitive advantage factor with a new analytical tool and Six Sigma thinking. Process and product capabilities are linked to the contribution of the complex system, and the critical barriers are identified by the performance measurement system. The secondary throughput can be recognised as product and process cost efficiencies, throughputs that are achieved through the advantage of management. The performance measurement potential is related to different productivity analyses. Productivity can be seen as one essential part of the competitive advantage factor.
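The abstract ties process and product capabilities to Six Sigma thinking. Purely as an illustration of that connection, and with invented data and specification limits that do not come from the case study, the standard capability indices Cp and Cpk can be computed as follows in Python:

import statistics

def capability_indices(samples, lsl, usl):
    """Return the common Six Sigma capability indices (Cp, Cpk)."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)                  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                     # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)    # capability including centring
    return cp, cpk

# Hypothetical measurements of a process characteristic with spec limits 9.0 to 11.0.
data = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7, 10.1, 10.0]
cp, cpk = capability_indices(data, lsl=9.0, usl=11.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")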

Relevance:

100.00%

Publisher:

Abstract:

The necessity of integrating EC (Electronic Commerce) with enterprise systems is perceived from the integrated nature of enterprise systems. The proven benefits of EC in providing competitive advantages to organizations drive enterprises to adopt EC and integrate it with their enterprise systems. Integration is a complex task that must facilitate a seamless flow of information and data between different systems within and across enterprises. Different systems run on different platforms; thus, to integrate systems with different platforms and infrastructures, integration technologies such as middleware, SOA (Service-Oriented Architecture), ESB (Enterprise Service Bus), JCA (J2EE Connector Architecture), and B2B (Business-to-Business) integration standards are required. Major software vendors, such as Oracle, IBM, Microsoft, and SAP, offer various solutions to address EC and enterprise systems integration problems. There is limited literature covering the integration of EC and enterprise systems in detail: most studies in this area have focused on the factors which influence the adoption of EC by enterprises, or they provide only limited information about a specific platform or integration methodology. Therefore, this thesis was conducted to cover the technical details of EC and enterprise systems integration, addressing both the adoption factors and the integration solutions. In this study, a large body of literature was reviewed and different solutions were investigated. Different enterprise integration approaches as well as the most popular integration technologies were investigated. Moreover, various methodologies of integrating EC and enterprise systems were studied in detail and different solutions were examined. The factors influencing the adoption of EC in enterprises were studied based on previous literature and categorized into technical, social, managerial, financial, and human resource factors. Moreover, integration technologies were categorized based on three levels of integration: data, application, and process. In addition, different integration approaches were identified and categorized based on their communication and platform. Different EC integration solutions were also investigated and categorized based on the identified integration approaches. By considering these different aspects of integration, this study is a valuable asset to architects, developers, and system integrators who need to integrate and adopt EC with enterprise systems.

Relevance:

100.00%

Publisher:

Abstract:

Outsourcing is a common strategy for companies looking for cost savings and improvements in performance. This has been especially prevalent in logistics, where warehousing and transporting are typical targets for outsourcing. However, while the benefits of logistics outsourcing are clear on paper, there are several cases where companies fail to realize these benefits. The most commonly cited reasons for this are poor information flow between the company and the third-party logistics partner, and a lack of integration between the two partners. Uncertainty stems from lack of information, and it can cripple the whole outsourcing operation. This is where enterprise resource planning (ERP) systems step in, as they can have a significant role in improving the flow of information and integration, which consequently mitigates uncertainty. The purpose of the study is to examine whether ERP systems have an effect on a company's decision to outsource logistics operations. Alongside the rapid advancements in technology during the past decades, ERP systems have also evolved. Therefore, empirical research on the subject needs constant revision, as it can quickly become outdated due to ERP systems gaining more advanced capabilities every year. The research was conducted as a qualitative single-case study of a Finnish manufacturing firm that had outsourced warehousing and transportation operations in the Swedish market. The empirical data was gathered using semi-structured interviews with three employees of the case company who were closely involved in the outsourcing operation. The theoretical framework used to analyze the empirical data was based on Transaction Cost Economics theory. The results of the study were aligned with the theoretical framework, in that the ERP system of the case company was seen as an enabler of their logistics outsourcing operation. However, the full theoretical benefits of ERP systems concerning extended enterprise functionality and flexibility were not attained because the case company had an older version of its ERP system. This emphasizes the importance of having up-to-date technology in order to overcome the shortcomings of ERP systems in outsourcing situations.

Relevance:

100.00%

Publisher:

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software is of high quality, reliable, fault-tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects for succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace. This leaves the companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software.

It is often not enough to only demonstrate that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, usability, etc., also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is already implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole.

In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than the output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or lacking tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
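To make the idea of generating tests from a behavioural model concrete in its very simplest form, one can enumerate bounded transition paths of a small state machine and treat each path as an abstract test case. The sketch below is not the UML-based tool chain of the thesis; the model, states and action names are invented.

# A toy behavioural model: states and labelled transitions of a login dialog.
transitions = {
    "Idle":     [("enter_credentials", "Ready")],
    "Ready":    [("submit_valid", "LoggedIn"), ("submit_invalid", "Error")],
    "Error":    [("retry", "Ready")],
    "LoggedIn": [],
}

def generate_paths(state, depth, path=()):
    """Enumerate transition sequences up to a bounded depth (abstract test cases)."""
    if depth == 0 or not transitions[state]:
        yield path
        return
    for action, nxt in transitions[state]:
        yield from generate_paths(nxt, depth - 1, path + (action,))

for test_case in generate_paths("Idle", depth=4):
    print(" -> ".join(test_case))

A concrete test adapter would then translate each abstract action into calls against the system under test; performance testing, as described above, would execute many such sequences concurrently while measuring responsiveness and stability.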

Relevance:

100.00%

Publisher:

Abstract:

Many-core systems provide great potential in application performance with their massively parallel structure. Such systems are currently being integrated into most parts of daily life, from high-end server farms to desktop systems, laptops and mobile devices. Yet these systems face increasing challenges, such as high temperature causing physical damage, high electricity bills both for servers and individual users, unpleasant noise levels due to active cooling, and unrealistic battery drainage in mobile devices – factors caused directly by poor energy efficiency. Power management has traditionally been an area of research providing hardware solutions or runtime power management in the operating system in the form of frequency governors. Energy awareness in application software is currently non-existent. This means that applications are not involved in the power management decisions, nor does any interface exist between the applications and the runtime system to provide such facilities. Power management in the operating system is therefore performed purely based on indirect implications of software execution, usually referred to as the workload. It often results in over-allocation of resources and hence in wasted power. This thesis discusses power management strategies in many-core systems in the form of increasing the energy-efficiency awareness of application software. The presented approach allows meta-data descriptions in the applications and is manifested in two design recommendations: (1) energy-aware mapping and (2) energy-aware execution, which allow the applications to directly influence the power management decisions. The recommendations eliminate over-allocation of resources and increase the energy efficiency of the computing system. Both recommendations are fully supported by a provided interface in combination with a novel power management runtime system called Bricktop. The work presented in this thesis allows both new and legacy software to execute with the most energy-efficient mapping on a many-core CPU and at the most energy-efficient performance level. A set of case study examples demonstrates real-world energy savings in a wide range of applications without performance degradation.
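To picture what application-supplied meta-data for power management could look like, here is a small, purely hypothetical Python sketch; the names EnergyHint and request_mapping are invented for illustration and are not the Bricktop interface described in the thesis.

from dataclasses import dataclass

@dataclass
class EnergyHint:
    """Meta-data an application could hand to a power-management runtime."""
    parallel_tasks: int   # how many tasks the application can actually run in parallel
    memory_bound: bool    # memory-bound work gains little from high clock frequencies

def request_mapping(hint: EnergyHint, total_cores: int, freq_levels: list):
    """Pick a core count and frequency level from the application's hint.

    A real runtime would also use measured utilisation; this toy version only
    avoids over-allocation by never granting more cores than the hint allows.
    """
    cores = min(hint.parallel_tasks, total_cores)
    freq = min(freq_levels) if hint.memory_bound else max(freq_levels)
    return cores, freq

cores, freq = request_mapping(
    EnergyHint(parallel_tasks=4, memory_bound=True),
    total_cores=16,
    freq_levels=[800, 1600, 2400],   # MHz
)
print(f"map to {cores} cores at {freq} MHz")

The point of the sketch is only the direction of information flow: the application states what it can use, so the runtime does not have to over-allocate cores or frequency on its behalf.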

Relevance:

100.00%

Publisher:

Abstract:

Mobile malware is increasing with the growing number of mobile users. Mobile malware can perform several operations which lead to cybersecurity threats, such as stealing financial or personal information, installing malicious applications, sending premium SMS messages, creating backdoors, keylogging and crypto-ransomware attacks. Even though many illegitimate applications are available on the app stores, most mobile users remain careless about the security of their mobile devices and become potential victims of these threats. Previous studies have shown that not every antivirus is capable of detecting all threats, because mobile malware uses advanced techniques to avoid detection. A network-based IDS on the operator side brings an extra layer of security to the subscribers and can detect many advanced threats by analyzing their traffic patterns. Machine learning (ML) gives these systems the ability to detect unknown threats for which signatures are not yet known. This research focuses on the evaluation of machine learning classifiers in network-based intrusion detection systems for mobile networks. In this study, different techniques of network-based intrusion detection, their advantages and disadvantages, and the state of the art in hybrid solutions are discussed. Finally, an ML-based NIDS is proposed which works as a subsystem to the network-based IDS deployed by mobile operators and can help in detecting unknown threats and in reducing false positives. In this research, several ML classifiers were implemented and evaluated. The study focuses on Android-based malware, as Android is the most popular OS among users and hence the most targeted by cybercriminals. Classifiers based on supervised ML algorithms were built using a dataset which contained labeled instances of the relevant features. These features were extracted from the traffic generated by samples of several malware families and benign applications. The classifiers were able to detect malicious traffic patterns with a TPR of up to 99.6% in cross-validation tests. In addition, several experiments were conducted to detect unknown malware traffic and to assess false positives; the classifiers were able to detect unknown threats with an accuracy of 97.5%. These classifiers could be integrated with current NIDSs, which use signature-based, statistical or knowledge-based techniques to detect malicious traffic. A technique to integrate the output of the ML classifier with a traditional NIDS is discussed and proposed as future work.
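As an illustration of the kind of supervised pipeline described above, a minimal scikit-learn sketch could look like the following; the synthetic data stands in for the real traffic features, and the model choice is an assumption rather than the classifiers evaluated in the thesis.

# Minimal sketch of training and cross-validating a traffic classifier.
# Synthetic data stands in for labeled flow features (packet sizes, durations,
# byte counts, ...) extracted from malicious and benign traffic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.7, 0.3], random_state=0)  # 1 = malicious, 0 = benign

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=10, scoring="recall")     # recall of the malicious class, i.e. TPR
print("mean TPR over 10-fold cross-validation:", scores.mean())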

Relevance:

100.00%

Publisher:

Abstract:

Sustainability in software systems is still a new practice that most software developers and companies are trying to incorporate into their software development lifecycle, and it has been widely discussed in academia. Sustainability is a complex concept viewed from economic, environmental and social dimensions, and the several definitions that have been proposed sometimes make the concept very fuzzy and difficult to apply and assess in software systems. This has hindered the adoption of sustainability in the software industry. Little research explores sustainability as a quality property of software products and services to answer questions such as: How can sustainability be quantified as a quality construct in the same way as other quality attributes such as security, usability and reliability? How can it be applied to software systems? What are the measures and measurement scales of sustainability? The goal of this research is to investigate the definitions, perceptions and measurement of sustainability from the quality perspective. Grounded in the general theory of software measurement, the aim is to develop a method that decomposes sustainability into factors, criteria and metrics. The result is a method to quantify and assess the sustainability of software systems while incorporating management and user concerns. Conclusion: the method will make it easier for companies to adopt sustainability while facilitating its integration into the software development process and tools. It will also help companies to measure the sustainability of their software products along the economic, environmental, social, individual and technological dimensions.
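A factors, criteria and metrics decomposition can be pictured with a small sketch. The factor names, weights and scores below are invented for illustration and are not the method developed in this research; each metric score is assumed to be normalised to the range 0 to 1 before aggregation.

# Toy factors -> criteria -> metrics hierarchy with weighted aggregation.
sustainability_model = {
    "environmental": {"weight": 0.4, "criteria": {
        "energy_efficiency": {"weight": 0.7, "score": 0.62},
        "resource_usage":    {"weight": 0.3, "score": 0.80},
    }},
    "economic": {"weight": 0.3, "criteria": {
        "maintenance_cost":  {"weight": 1.0, "score": 0.55},
    }},
    "social": {"weight": 0.3, "criteria": {
        "accessibility":     {"weight": 1.0, "score": 0.70},
    }},
}

def sustainability_score(model):
    """Aggregate normalised metric scores through criteria and factor weights."""
    total = 0.0
    for factor in model.values():
        criteria_score = sum(c["weight"] * c["score"] for c in factor["criteria"].values())
        total += factor["weight"] * criteria_score
    return total

print(f"overall sustainability score: {sustainability_score(sustainability_model):.2f}")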

Relevance:

100.00%

Publisher:

Abstract:

The task of a safety-related control system is to transfer and process safety-critical information. For example, when a sensor (a safety device) detects a person entering a danger zone, the information must first be passed to the control system. Based on the received information, the control system must form a command for the power control elements. The power control elements regulate the supply of the machine's operating energy, and through them it is possible to stop the machine's motion before any damage occurs. Traditional safety solutions have been based on force-guided relays and redundancy. With this method, fail-safe safety solutions have been implemented in which single faults are revealed. Nowadays, however, there is an increasing need to integrate safety functions into the automation system and to implement safety solutions with distributed systems. Distributed systems share some of the advantages and disadvantages of programmable systems, but they also introduce new kinds of faults. The purpose of this work was to find out what restrictions machine safety places on the safety-related control systems of slitter-winders in the paper industry, and to examine the structure and properties of safety-related systems. In this work, the AS-i, EsaLan and ProfiSafe safety systems were studied on the basis of their technical specifications. The studied systems are different and are therefore suited to slightly different applications. With all of these systems, however, it is possible to implement a sufficient level of safety in a slitter-winder safety bus system from the machine safety point of view. This requires a properly conducted risk analysis and correct design methods. Based on the technical specifications, the AS-i Safety at Work and EsaLan Compact systems were selected for testing, and tests related to electromagnetic compatibility (EMC) and functionality were performed on them.