838 results for Hardware-based security


Relevance:

30.00%

Publisher:

Abstract:

Towards human security through crisis management – opportunities and challenges for learning. Since the end of the Cold War, armed conflicts have typically broken out in so-called fragile states and poor countries; they have been internal to states and have involved non-state armed groups. They often lead to a cycle of conflict in which war alternates with more stable conditions. Because the death toll in such conflicts can remain below the threshold of the international definition (1,000 deaths per year), I call them "new conflicts". The international community has sought to develop models of crisis management and peacebuilding in order to achieve a lasting state of peace. Human security is based on a view that respects the human rights of every individual, a view that also shapes how crisis management and peacebuilding are carried out. The study comprises two empirical parts: a Delphi futures panel process and interviews with crisis management personnel. Fifteen crisis management experts from different fields took part in the panel, which was conducted in 2008. According to the panel's results, future conflicts will often resemble these new conflicts. In addition, crisis management personnel are required to have interaction and communication skills and, naturally, genuine professional competences as well. The futures panel emphasized interaction and communication skills especially among the competences of civilian crisis management personnel, but the same skills were also stressed for military crisis management personnel. Crisis management also requires a clear division of labour among the different actors. The interview data on personnel who had worked in Kosovo consisted of 27 thematic interviews in total: 9 of the interviewees were professional officers, 10 were peacekeepers recruited from the reserve, and 8 had worked in civilian crisis management. The interviews were conducted between February and June 2008. The interview results underscored the importance of interaction and communication skills, since in many practical situations the interviewees had solved problems in cooperation with other crisis management personnel or with local residents. Learning processes took place during crisis management, and they were often positive and informal in nature. Such successes had a positive effect on the individual's self-image; these processes can be described as "insights into oneself". Learning in crisis management tasks is of particular importance if the aim is to develop activities that promote human security. It is therefore important that crisis management training and on-the-job learning be developed to take into account the different levels and dimensions of learning and their significance. Informal forms of learning should be taken better into account when developing crisis management training and learning in crisis management tasks. The feedback system should be developed in several ways: the crisis management operation as a whole must, when necessary, also receive critical feedback on successes and failures. Many who have worked in crisis management long for proper feedback on their tour of duty, and feedback that is perceived as too routine does not promote individual learning. Many interviewees spontaneously considered it important that those who have worked in crisis management should have the opportunity for a debriefing-type homecoming discussion. The mere availability of such an opportunity would apparently be welcome news to many, even if it were never used.
For many, returning to Finland from crisis management duties is more challenging than starting work in those duties abroad. The results of the study encourage examining crisis management from the perspective of learning. It is also essential that crisis management feedback systems be developed to promote both individual and organizational learning as effectively as possible. A crisis management operation is a learning environment. Developing the communication and interaction skills of crisis management personnel is essential in the pursuit of a sustainable peace process, one that also involves the residents of the conflict area.

Relevance:

30.00%

Publisher:

Abstract:

Cyber security is one of the main topics discussed around the world today. The threat is real, and it is unlikely to diminish. People, businesses, governments, and even armed forces are networked in one way or another, so the cyber threat also faces military networking. On the other hand, the concept of Network Centric Warfare sets high requirements for military tactical data communications and security. A challenging networking environment and cyber threats force us to consider new approaches to building security into military communication systems. The purpose of this thesis is to develop a cyber security architecture for military networks and to evaluate the designed architecture. The architecture is described in terms of technical functionality. As a new approach, the thesis introduces Cognitive Networks (CN), a theoretical concept for building more intelligent, dynamic, and more secure communication networks. A cognitive network is capable of observing the networking environment, making decisions for optimal performance, and adapting its system parameters according to those decisions. As a result, the thesis presents a five-layer cyber security architecture consisting of security elements controlled by a cognitive process. The proposed architecture includes the infrastructure, services, and application layers, which are managed and controlled by the cognitive and management layers. The architecture defines the tasks of the security elements at a functional level without introducing any new protocols or algorithms. Two separate methods were used for evaluation. The first is based on the SABSA framework, which uses a layered approach to analyze the overall security of an organization. The second was a scenario-based method in which a risk severity level is calculated. The evaluation results show that the proposed architecture fulfills the security requirements at least at a high level. However, evaluating the proposed architecture proved to be very challenging, so the evaluation results must be considered critically. The thesis shows that cognitive networks are a promising approach and that they provide many benefits when designing a cyber security architecture for tactical military networks. However, many implementation problems exist, and several details must be considered and studied in future work.
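
The observe-decide-adapt cycle attributed to a cognitive network above can be pictured with a minimal sketch; the class names, metrics, and thresholds are invented for illustration, since the thesis itself defines the security elements only at a functional level, without new protocols or algorithms:

```python
# Minimal sketch of a cognitive control loop for a network security element.
# Names, metrics, and thresholds are illustrative, not from the thesis.

class NetworkStub:
    """Stands in for the managed tactical network."""
    def packet_loss(self):
        return 0.08
    def alert_count(self):
        return 14
    def set_parameter(self, name, value):
        print(f"set {name} = {value}")

class CognitiveController:
    def __init__(self, network):
        self.network = network

    def observe(self):
        # Collect observations from the networking environment.
        return {"loss": self.network.packet_loss(),
                "alerts": self.network.alert_count()}

    def decide(self, obs):
        # Derive a target configuration from the observations.
        actions = {}
        if obs["alerts"] > 10:             # illustrative threshold
            actions["crypto_level"] = "high"
        if obs["loss"] > 0.05:
            actions["routing"] = "reroute"
        return actions

    def adapt(self, actions):
        # Push the decided parameters down to the managed layers.
        for name, value in actions.items():
            self.network.set_parameter(name, value)

    def run_cycle(self):
        self.adapt(self.decide(self.observe()))

CognitiveController(NetworkStub()).run_cycle()
# -> set crypto_level = high
#    set routing = reroute
```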

Relevance:

30.00%

Publisher:

Abstract:

The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. As an answer to these pressures, modern-day systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies for platform-based design do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, with existing design methodologies, the design productivity gap in multiprocessor systems remains a key challenge. This thesis addresses the aforementioned challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to provide performance estimates of the solution that are as accurate as possible at high abstraction levels. VHDL code is automatically generated in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
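
To give a flavour of the kind of structural rule such OCL constraints express, here is a minimal sketch of a correct-by-construction check over a toy platform model; the rules and names are hypothetical and are not the actual SegBus UML profile:

```python
# Illustrative structural check in the spirit of OCL constraints: every
# segment needs an arbiter, and no device may sit on two segments at once.
# Both rules and all names are invented for this sketch.

from dataclasses import dataclass, field

@dataclass
class Segment:
    name: str
    has_arbiter: bool = True
    devices: list = field(default_factory=list)

def check_platform(segments):
    errors, seen = [], {}
    for seg in segments:
        if not seg.has_arbiter:
            errors.append(f"segment {seg.name}: missing arbiter")
        for dev in seg.devices:
            if dev in seen:
                errors.append(f"device {dev}: attached to both "
                              f"{seen[dev]} and {seg.name}")
            seen[dev] = seg.name
    return errors

platform = [Segment("S0", devices=["DSP", "MEM"]),
            Segment("S1", has_arbiter=False, devices=["MEM"])]
print(check_platform(platform))   # reports both violations
```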

Relevance:

30.00%

Publisher:

Abstract:

The Laboratory of Intelligent Machines researches and develops energy-efficient power transmissions and automation for mobile construction machines and industrial processes. The laboratory's particular areas of expertise include mechatronic machine design using virtual technologies and simulators, and demanding industrial robotics. The laboratory has collaborated extensively with industrial actors and has participated in significant international research projects, particularly in the field of robotics. For years, dSPACE tools were the only hardware used in the lab to develop different real-time control algorithms. dSPACE's hardware systems are in widespread use in the automotive industry and are also employed in drives, aerospace, and industrial automation. However, new competitors are developing sophisticated systems whose features convinced the laboratory to test new products. One of these competitors is National Instruments (NI). In order to get to know the specifications and capabilities of NI tools, an agreement was made to test an NI evaluation system, which is used to control a 1-D hydraulic slider. The objective of this research project is to develop a control scheme for the teleoperation of a hydraulically driven manipulator, to implement a control algorithm for both human-machine interaction and machine-task environment interaction on the NI and dSPACE systems simultaneously, and to compare the results.
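
As a rough illustration of the control problem described, the sketch below shows one time step of a simple bilateral teleoperation scheme: a PID controller drives the hydraulic valve toward the operator's master position, and the tracking error is reflected back as force feedback. The structure, gains, and names are assumptions for illustration, not the thesis's actual control scheme:

```python
# One step of a toy bilateral teleoperation loop for a 1-D hydraulic slider.
# Gains and scaling are arbitrary placeholder values.

def pid_step(state, error, dt, kp=8.0, ki=0.5, kd=0.2):
    # Incremental PID on the master-slave position error.
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

def teleoperation_step(master_pos, slave_pos, state, dt=0.001, k_force=2.0):
    error = master_pos - slave_pos              # operator reference vs. slider
    valve_command = pid_step(state, error, dt)  # drives the hydraulic valve
    feedback_force = -k_force * error           # reflected to the operator
    return valve_command, feedback_force

state = {"integral": 0.0, "prev_error": 0.0}
cmd, force = teleoperation_step(master_pos=0.10, slave_pos=0.02, state=state)
```

The same loop could in principle run on either platform, which is what makes a side-by-side NI/dSPACE comparison meaningful.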

Relevance:

30.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

Virtual environments and real-time simulators (VERS) are becoming more and more important tools in the research and development (R&D) process of non-road mobile machinery (NRMM). Virtual prototyping techniques enable faster and more cost-efficient development of machines compared to using real-life prototypes. High energy efficiency has become an important topic in the world of NRMM because of environmental and economic demands. The objective of this thesis is to develop VERS-based methods for the research and development of NRMM. A process using VERS for assessing the effects of human operators on the life-cycle efficiency of NRMM was developed. Human-in-the-loop simulations were run with an underground mining loader to study the developed process. The simulations were run in the virtual environment of the Laboratory of Intelligent Machines of Lappeenranta University of Technology. A physically adequate real-time simulation model of NRMM was shown to be reliable and cost-effective for testing hardware components by means of hardware-in-the-loop (HIL) simulations. A control interface connecting an integrated electro-hydraulic energy converter (IEHEC) with a virtual simulation model of a log crane was developed. The IEHEC consists of a hydraulic pump-motor and an integrated electrical permanent magnet synchronous motor-generator. The results show that state-of-the-art real-time NRMM simulators are capable of resolving factors related to the energy consumption and productivity of NRMM. A significant variation between the test drivers was found. The results show that VERS can be used for assessing human effects on the life-cycle efficiency of NRMM. Comparing the HIL simulation responses with those achieved with a conventional simulation method demonstrates the advantages and drawbacks of the various possible interfaces between the simulator and the hardware part of the system under study. Novel ideas for arranging the interface were successfully tested and compared with the more traditional one. The proposed process for assessing the effects of operators on life-cycle efficiency will be applied to a wider group of operators in the future. The driving styles of the operators can then be analysed statistically from a sufficiently large result data set, and the statistical analysis can find the most life-cycle-efficient driving style for a specific environment and machinery. The proposed control interface for HIL simulation needs to be studied further: the robustness and adaptation of the interface in different situations must be verified. Future work will also include studying the suitability of the IEHEC for different working machines using the proposed HIL simulation method.
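
The basic shape of such a HIL setup can be sketched as a fixed-step loop exchanging signals between the real-time model and the hardware under test; the toy dynamics, signal names, and pacing below are illustrative assumptions only, not the thesis's interface:

```python
# Illustrative HIL loop: a real-time crane model exchanges signals with
# real hardware (here stubbed) once per fixed time step.

import time

STEP = 0.001  # 1 ms real-time step

class CraneModel:
    """Toy real-time model: load pressure follows pump torque with a lag."""
    def __init__(self):
        self.pressure = 0.0
    def advance(self, pump_torque, dt):
        self.pressure += (pump_torque - self.pressure) * dt / 0.05
        return self.pressure

class HardwareStub:
    """Stands in for the I/O of the real converter test bench."""
    def read_torque(self):
        return 100.0
    def write_load_pressure(self, p):
        pass

model, hw = CraneModel(), HardwareStub()
for _ in range(1000):
    t0 = time.perf_counter()
    torque = hw.read_torque()               # measure from the hardware
    pressure = model.advance(torque, STEP)  # advance the simulation one step
    hw.write_load_pressure(pressure)        # command back to the hardware
    while time.perf_counter() - t0 < STEP:  # crude real-time pacing
        pass
```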

Relevance:

30.00%

Publisher:

Abstract:

Due to various advantages such as flexibility, scalability, and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of a chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications; with their computational power, these platforms are likely to be used in various application domains, from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but is becoming an issue at ground level as well, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to design agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform while maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach in which the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, for example, a hardware description language, namely VHDL.
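
As a rough picture of the kind of dynamic reconfiguration such agents could perform, the sketch below migrates tasks away from faulty or overheating cores; the data structures, threshold, and policy are invented here, whereas the thesis derives such behaviour through formal refinement:

```python
# Illustrative reconfiguration agent: tasks are moved off cores that report
# a fault or exceed a temperature threshold. All values are hypothetical.

T_MAX = 85.0  # illustrative hotspot threshold, degrees Celsius

def reconfigure(cores):
    """cores: dict core_id -> {'temp': float, 'faulty': bool, 'tasks': list}.
    Migrate all tasks off faulty or overheating cores onto healthy ones."""
    healthy = [c for c, s in cores.items()
               if not s["faulty"] and s["temp"] < T_MAX]
    for state in cores.values():
        if state["faulty"] or state["temp"] >= T_MAX:
            while state["tasks"]:
                # choose the least-loaded, coolest healthy core as target
                target = min(healthy,
                             key=lambda c: (len(cores[c]["tasks"]),
                                            cores[c]["temp"]))
                cores[target]["tasks"].append(state["tasks"].pop())
    return cores

cores = {0: {"temp": 91.0, "faulty": False, "tasks": ["video"]},
         1: {"temp": 55.0, "faulty": False, "tasks": []},
         2: {"temp": 60.0, "faulty": True,  "tasks": ["crypto"]}}
reconfigure(cores)   # both tasks end up on core 1
```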

Relevance:

30.00%

Publisher:

Abstract:

The main objective of the present study was to upgrade a clinical gamma camera to obtain high-resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target based on a computer-controlled step motor. We developed a software tool to reconstruct the target's three-dimensional distribution of emission from a set of planar projections, based on the maximum likelihood algorithm. We present details on the hardware and software implementation. We imaged phantoms and the heart and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and the pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity to study the target of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good-quality reconstructions, a result that implies a significant reduction of acquisition time and opens the possibility for dynamic radiotracer studies. In conclusion, a high-resolution single photon emission computed tomography (SPECT) system was developed using a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as Cardiology, Neurology or Oncology.
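
The maximum likelihood reconstruction referred to above is commonly implemented as the standard MLEM iteration. In the usual notation (standard in the literature, not necessarily the paper's), f_j is the emission estimate in voxel j, p_i the measured counts in projection bin i, and a_ij the probability that a photon emitted in voxel j is detected in bin i:

\[
  f_j^{(k+1)} = \frac{f_j^{(k)}}{\sum_i a_{ij}} \sum_i a_{ij}\,\frac{p_i}{\sum_{j'} a_{ij'} f_{j'}^{(k)}}
\]

Each iteration forward-projects the current estimate, compares it with the measured projections, and back-projects the ratios; the multiplicative update keeps the estimate non-negative.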

Relevance:

30.00%

Publisher:

Abstract:

Power consumption is still an issue in wearable computing applications today. The aim of the present paper is to raise awareness of the power consumption of wearable computing devices in specific scenarios, in order to enable the future design of energy-efficient wireless sensors for context recognition in wearable computing applications. The approach is based on a hardware study. The objective of this paper is to analyze and compare the total power consumption of three representative wearable computing devices in realistic scenarios such as Display, Speaker, Camera and microphone, Transfer by Wi-Fi, Monitoring outdoor physical activity, and Pedometer. A scenario-based energy model is also developed. The Samsung Galaxy Nexus I9250 smartphone, the Vuzix M100 Smart Glasses, and the SimValley Smartwatch AW-420.RX are the three devices, each representative of its form factor. The power consumption is measured using PowerTutor, an Android energy profiler application with a logging option; because some of its parameters are unknown, it is calibrated against a USB power meter. The results show that screen size is the main parameter influencing power consumption. The power consumption for an identical scenario varies between the wearable devices, meaning that other components, parameters, or processes may also affect power consumption, and further study is needed to explain these variations. The paper also shows that different inputs (a touchscreen is more efficient than button controls) and outputs (speaker output is more efficient than display output) affect energy consumption in different ways. Using the energy model, the paper gives recommendations for reducing energy consumption in healthcare wearable computing applications.
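
A scenario-based energy model of this kind reduces, in its simplest form, to summing each component's average power multiplied by its active time in the scenario. The sketch below uses made-up power figures, not the paper's measurements:

```python
# Illustrative scenario-based energy model: total energy is the sum of each
# component's power draw times its active time. Power values are placeholders.

POWER_MW = {           # average power per component, milliwatts (illustrative)
    "display": 400.0,
    "cpu": 150.0,
    "wifi": 250.0,
    "gps": 140.0,
}

def scenario_energy(active_seconds):
    """active_seconds: dict component -> seconds active in the scenario.
    Returns energy in millijoules."""
    return sum(POWER_MW[c] * t for c, t in active_seconds.items())

# e.g. a 'Monitoring outdoor physical activity' style scenario:
# 10 min GPS + CPU, display on for 1 min
e = scenario_energy({"gps": 600, "cpu": 600, "display": 60})
print(f"{e / 1000:.1f} J")   # -> 198.0 J
```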

Relevance:

30.00%

Publisher:

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault-tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
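
The core idea of generating tests from behavioral models can be sketched as path enumeration over a state machine; the toy model, events, and depth bound below are invented for illustration and are unrelated to the thesis's actual tool chain:

```python
# Illustrative model-based test generation: enumerate event sequences through
# a tiny state-machine model (breadth-first, up to a depth bound) and emit
# each reachable prefix as a test sequence.

from collections import deque

MODEL = {  # state -> list of (event, next_state); a toy behavioral model
    "Idle":      [("connect", "Connected")],
    "Connected": [("send", "Connected"), ("disconnect", "Idle")],
}

def generate_tests(start="Idle", max_len=3):
    tests, queue = [], deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if path:
            tests.append(path)  # every reachable prefix becomes a test
        if len(path) < max_len:
            for event, nxt in MODEL.get(state, []):
                queue.append((nxt, path + [event]))
    return tests

for seq in generate_tests():
    print(" -> ".join(seq))
```

Running many such generated sequences concurrently, while monitoring responsiveness rather than outputs, is exactly the simple form of performance testing the abstract describes.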

Relevance:

30.00%

Publisher:

Abstract:

The IoT consists essentially of thousands of tiny sensor nodes interconnected to the internet, each of which executes its programmed functions under memory and power limitations. The sensor nodes are distributed mainly for gathering data in various situations. The IoT envisions future technologies such as e-health, smart cities, automobile automation, construction site automation, and smart homes. Secure communication of data under memory and energy constraints is a major challenge in the IoT, and authentication is the first and a crucial phase of secure communication. This study presents a protocol to authenticate resource-constrained devices in physical proximity solely by using their shared wireless communication interfaces. This model of authentication relies only on the abundance of ambient radio signals and completes in less than a second. To evaluate the designed protocol, SkyMotes are emulated in a network environment simulated by Contiki/COOJA. The results presented in this study show that the approach is immune to passive and active attacks: an adversary located as near as two meters away can be identified in less than a second, with minimal expense of energy. Since only the radio is required as authentication hardware, the technique is scalable and interoperable with the heterogeneous nature of the IoT.
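
One way to picture ambient-radio proximity authentication is to compare the received-signal-strength (RSSI) traces two devices record over the same window, since co-located devices observe highly correlated ambient signals. The sketch below is only a guess at the general idea, with an invented threshold; it is not the protocol evaluated in the study:

```python
# Illustrative proximity check from ambient RSSI traces: nearby devices see
# similar signal-strength variations, so a high correlation between their
# traces suggests physical proximity. Threshold and data are invented.

from statistics import mean

def correlation(a, b):
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    norm_a = sum((x - ma) ** 2 for x in a) ** 0.5
    norm_b = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (norm_a * norm_b)

def in_proximity(rssi_a, rssi_b, threshold=0.9):
    """Accept the peer if its ambient RSSI trace matches ours closely."""
    return correlation(rssi_a, rssi_b) >= threshold

device_a = [-71, -69, -80, -62, -75, -68]   # dBm samples, same time window
device_b = [-70, -68, -79, -63, -74, -69]   # nearby device: similar trace
print(in_proximity(device_a, device_b))     # -> True
```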

Relevance:

30.00%

Publisher:

Abstract:

Within the framework of state security policy, the focus of this dissertation is the relation between how new security threats are perceived and the policy planning and bureaucratic implementation designed to address them. In addition, the thesis explores some of the inertias that might exist in the core of the state apparatus as it addresses new threats, and how these could be better managed. The dissertation is built on five thematic and interrelated articles highlighting different aspects of the process from the moment new significant national security threats are detected by different governments until, on the policy planning side, the threats translate into protective measures within society. The timeline differs widely between countries, and some key aspects of this process are also studied. One focus concerns mechanisms for adaptability within the Intelligence Community, another the policy planning process within the Cabinet Offices/National Security Councils, and the third the planning process and how policy is implemented within the bureaucracy. The issue of policy transfer is also analysed, revealing that there is some imitation of innovation within governmental structures and policies, for example in the field of cyber defence. The main finding of the dissertation is that this context has built-in inertias and bureaucratic seams found in most government bureaucratic machineries. Because much of the information and many of the planning measures are security classified, transparency and internal debate on these issues are restricted, and alternative assessments become limited. To remedy this situation, the thesis recommends ways to improve the decision-making system in order to streamline the processes involved in making these decisions. Another special focus of the thesis concerns the role of public policy think tanks in the United States as an instrument of change in the country's national security decision-making environment, viewed as a possible source of new ideas and innovation. The findings in this part are based on unique interview data on how think tanks become successful and influence the policy debate in a country such as the United States. It appears clearly that in countries such as the United States think tanks smooth the decision-making processes, and that this model, with some adaptations, might also be transferable to other democratic countries.

Relevance:

30.00%

Publisher:

Abstract:

Hybrid vehicle applications often require both a high-voltage and a low-voltage system. The high-voltage system usually contains an energy storage, which is either a supercapacitor or a high-voltage battery, a diesel generator or range extender, and the traction drive. Various auxiliary drives, such as compressors and hydraulic pumps, are also often connected to the high-voltage system. The low-voltage system typically consists of control units, headlights, and similar devices. Traditionally, the low-voltage system has been fed from the diesel engine's alternator, but with the arrival of high-voltage systems, using a DC/DC converter between the high- and low-voltage systems has attracted interest, because the alternator could then be removed and the low-voltage battery downsized. The inverter bridge of the multi-output power converter described in this work is suitable for driving auxiliary drives, and its isolated DC/DC converter for feeding the low-voltage system. This thesis covers the design of the aforementioned power converter, focusing in particular on the dimensioning of the device's high-voltage parts and on its thermal design. For the DC/DC converter, conventional silicon IGBT transistors are compared with silicon carbide MOSFET transistors. The accuracy of the thermal model calculations is examined by performing an efficiency measurement on a prototype device and comparing the results with the calculated ones. Possibilities for improving the thermal model are also discussed on the basis of the efficiency measurement results.
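
The thermal design step described above amounts to estimating the transistor losses and propagating them through the junction-to-ambient thermal resistance chain. The sketch below shows the textbook form of that calculation with placeholder values, not the thesis's component data:

```python
# Illustrative thermal sizing check for a power transistor: conduction plus
# switching losses, then junction temperature through the thermal resistance
# chain. All numbers are placeholder values, not from the thesis.

def conduction_loss(i_rms, v_on=1.6, r_on=0.010):
    """Approximate conduction loss: on-state voltage drop + resistive term."""
    return v_on * i_rms + r_on * i_rms ** 2

def switching_loss(f_sw, e_on=3e-3, e_off=4e-3):
    """Switching loss: per-cycle turn-on/off energies times frequency."""
    return f_sw * (e_on + e_off)

def junction_temp(p_loss, t_ambient=40.0,
                  r_jc=0.30, r_cs=0.10, r_sa=0.50):  # K/W, junction to ambient
    return t_ambient + p_loss * (r_jc + r_cs + r_sa)

p = conduction_loss(i_rms=50.0) + switching_loss(f_sw=10e3)
print(f"losses {p:.0f} W, Tj {junction_temp(p):.0f} C")
# conduction: 1.6*50 + 0.01*50**2 = 105 W; switching: 10e3*7e-3 = 70 W
# Tj = 40 + 175*0.9 = 197.5 C -> too hot; resize the switch or improve cooling
```

A result above the device's maximum junction temperature, as in this example, would force a redesign; this is also where the IGBT-versus-SiC-MOSFET comparison matters, since SiC MOSFETs generally have lower switching energies and thus a smaller switching-loss term.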

Relevance:

30.00%

Publisher:

Abstract:

The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues in these computers have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build computer software with security in mind. A problem with building secure software is how to define secure software, or how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. This thesis is focused on the perspective of software engineers in the software design phase. The focus on the design phase means that code-level semantics and programming language specifics are not discussed in this work; organizational policy, management issues, and the software development process are also out of scope. The first two research problems were studied using a literature review, while the third was studied using case study research. The target of the case study was a Java-based email server called Apache James, whose changelog and security issue details were available and whose source code was accessible. The research revealed that there is a consensus in the terminology on software security. Security verification activities are commonly divided into evaluation and assurance; the focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good. Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of the software, but in practice they were limited to evaluating different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design, and interpreting the metrics' results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired: the metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out the areas in which security metrics need to improve if verification of security from the design phase is desired.

Relevance:

30.00%

Publisher:

Abstract:

Finnish legislation requires a safe and secure learning environment. However, comprehensive, risk-based safety and security management (SSM) and management commitment to the implementation and development of SSM are not mentioned in the legislation. Multiple institutions, operators, and researchers have studied and developed safety and security in educational institutions over the past decade, but the approach has typically been fragmented, without bringing up the importance of comprehensive SSM. The development needs of safety and security operations in universities have been studied. However, in universities of applied sciences (UASs) and elementary schools (ESs), the performance level, strengths, and weaknesses of comprehensive SSM have not been studied. The objective of this study was to develop the comprehensive, risk-based SSM of educational institutions by developing the new Asteri consultative auditing process and studying its effects on auditees. Furthermore, the performance level of comprehensive SSM in UASs and ESs was studied using Asteri and the TUTOR model developed by the Keski-Uusimaa Department for Rescue Services, and strengths, development needs, and differences were identified. In total, 76 educational institutions were audited between 2011 and 2014. The study is based on logical empiricism, and an observational applied research design was used. Auditing, observation, and an electronic survey were used for data collection, and statistical analysis was used to analyze the collected information. In addition, thematic analysis was used to analyze the development areas of the organizations mentioned by the respondents in the survey. As one of its main contributions, this research presents the new Asteri consultative auditing process. Organizations with low performance levels on the audited subject benefit the most from the Asteri consultative auditing process, and Asteri may be usable in many different types of audits, not only SSM audits. As a new result, this study provides knowledge on attitudes related to auditing. According to the research findings, auditing may generate negative attitudes, and the auditor should take them into account when planning and preparing for audits. Negative attitudes can be compensated for by bringing added value, objectivity, and positivity to the audit, thus improving the positive effects of auditing on knowledge and skills. Moreover, as the results of this study show, auditing safety and security issues does not increase feelings of insecurity; rather, it increases feelings of safety and security when the new Asteri consultative auditing process is used with the TUTOR model. The results showed that SSM in the audited UASs was statistically significantly more advanced than in the audited ESs. However, there is still room for improvement in both the ESs and the UASs, as their approach to SSM was fragmented. It can be assumed that the majority of Finnish UASs and ESs do not meet the basic level of comprehensive, risk-based SSM.