80 results for Quality Management Systems
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The objective of the work has been to study why systems thinking should be used in combination with TQM, what the main benefits of the integration are, and how it could best be done. The work analyzes the development of systems thinking and TQM over time and the main differences between them. The work defines the prerequisites for adopting a systems approach and the organizational factors that embody the development of an efficient learning organization. The work proposes a model, based on a combination of an interactive management model and organizational redesign, to be used for applying the systems approach together with TQM in practice. The results of the work indicate that there are clear differences between systems thinking and TQM which justify their combination. The systems approach provides an additional, complementary perspective to quality management. TQM is focused on optimizing operations at the operational level, while interactive management and redesign of the organization are focused on optimizing operations at the conceptual level, providing a holistic system for value generation. The empirical study demonstrates the applicability of the proposed model in one case study company, but its application is tenable and possible beyond this particular company as well. System dynamics modeling and other systems-based techniques such as cognitive mapping are useful methods for increasing understanding of and learning about the behavior of systems. The empirical study emphasizes the importance of using a proper early warning system.
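To illustrate the kind of system dynamics modeling and early-warning thinking the abstract refers to, the following is a minimal sketch, not taken from the thesis: a single stock (open quality issues) with an inflow and an outflow, integrated with Euler steps. All names, parameter values and the warning threshold are assumptions made only to show the technique.

# Minimal system dynamics sketch: one stock (open quality issues) with an
# inflow (new defects) and an outflow (resolved defects), integrated with
# simple Euler steps. All names and parameter values are illustrative.

def simulate(weeks: int = 52, dt: float = 1.0,
             defect_inflow: float = 20.0,        # new defects per week (assumed)
             resolution_capacity: float = 18.0,  # defects resolvable per week (assumed)
             warning_level: float = 150.0):      # assumed early-warning threshold
    open_issues = 100.0                          # initial stock level (assumed)
    history = []
    for week in range(weeks):
        resolved = min(resolution_capacity, open_issues)
        open_issues += (defect_inflow - resolved) * dt
        history.append((week, open_issues, open_issues > warning_level))
    return history

if __name__ == "__main__":
    for week, issues, warn in simulate()[::13]:
        flag = " <-- early warning" if warn else ""
        print(f"week {week:2d}: {issues:6.1f} open issues{flag}")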
Abstract:
The environmental aspect of corporate social responsibility (CSR), expressed through the process of EMS implementation in oil and gas companies, is identified as the main subject of this research. In the theoretical part, the main attention is paid to justifying the link between CSR and environmental management. The achievement of sustainable competitive advantage as a result of environmental capital growth and the inclusion of socially responsible activities in the corporate strategy is another issue of special significance here. In addition, two basic forms of environmental management systems (environmental decision support systems and environmental information management systems) are explored, and their role in effective stakeholder interaction is addressed. The most crucial benefits of an EMS are also analyzed to underline its importance as a source of sustainable development. The subsequent research is based on a survey of 51 sampled oil and gas companies (both publicly owned and state owned) originating from different countries all over the world and providing openly accessible sustainability reports. To analyze their approach to sustainable development, a specifically designed evaluation matrix with 37 indicators, developed in accordance with the Global Reporting Initiative (GRI) guidelines for non-financial reporting, was prepared. Additionally, the quality of environmental information disclosure was measured on the basis of a quality-quantity matrix. According to the results of the research, oil and gas companies prefer implementing reactive measures over costly and knowledge-intensive proactive techniques for eliminating negative environmental impacts. It was also found that environmental performance disclosure is mostly rather limited, so the quality of non-financial reporting can be judged as quite insufficient. Despite the fact that most of the oil and gas companies in the sample claim that an EMS is currently embedded in their structure, they often do not provide any details about the process of its implementation. As potential for the further development of EMS, the author mentions the possible integration of their different forms into a single entity, the extension of the existing structure by consolidating structural and strategic precautions, and the development of a unified certification standard to replace the several standards that exist today, in order to enhance control over EMS implementation.
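As a concrete sketch of the evaluation-matrix idea described above: the study scores companies against GRI-based disclosure indicators, and the snippet below shows one possible way to do that. The indicator names, the 0-2 scoring scale, the threshold and the sample companies are assumptions for illustration, not the study's 37 indicators or its data.

# Illustrative scoring of sustainability disclosure against indicators.
# Indicator names, scale (0 = not disclosed, 1 = mentioned, 2 = quantified)
# and the sample reports are assumed for this sketch.

INDICATORS = ["emissions", "energy_use", "spills", "water_use", "biodiversity"]

def disclosure_score(report: dict) -> float:
    """Average per-indicator score on the assumed 0-2 scale."""
    return sum(report.get(i, 0) for i in INDICATORS) / len(INDICATORS)

companies = {
    "CompanyA": {"emissions": 2, "energy_use": 2, "spills": 1},
    "CompanyB": {"emissions": 1, "water_use": 1},
}

for name, report in companies.items():
    score = disclosure_score(report)
    label = "proactive disclosure" if score >= 1.0 else "limited disclosure"
    print(f"{name}: {score:.2f} ({label})")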
Abstract:
Software performance is a holistic matter to which every phase of the software life cycle contributes. Performance problems often lead to project delays and cost overruns, and in some cases to the complete failure of a project. Software performance engineering (SPE) is a software-oriented approach that offers techniques for developing software with good performance. This master's thesis examines these techniques and selects from among them those suited to solving performance problems in the development of two IT device management products. The end result of the work is an updated version of the current product development process that takes application performance challenges into account at the different phases of the product life cycle.
Abstract:
The text is based on a presentation given on 24 September 1996 at a seminar on the management of societal risks.
Abstract:
Total Quality Management (TQM) has become one of the most significant concepts in global business, where quality is an important competitive factor. This master's thesis examines the modern concept of total quality management, which raises traditional quality thinking to a new level. Modern quality management thinking has grown to cover all areas of a company's operations. The objective of the work is the comprehensive improvement of quality and business performance in the TietoEnator Käsittely ja Verkkopalvelut (Processing and Network Services) business area. Before addressing the actual quality management concept, the work first introduces the traditional concept of quality at a general level and briefly discusses the ICT business and the standards related to it. Finally, the study presents prioritized improvement proposals and steps that help the organization achieve the aims of the total quality management concept.
Abstract:
Quality is strengthening its position in business as companies compete in international markets on both price and quality. This trend has given rise to a number of quality programs, which are widely used in implementing companies' total quality management (TQM). Quality management covers all of a company's functions and also sets requirements for the development and improvement of the company's support functions. These include the subject of this study, the IT function. The objective of the thesis was to describe the current state of the IT process. The process description prepared in the thesis is based on the theory of process management and on the quality award criteria used by the target company. Thematic interviews were used as the research method for determining the current state of the process. To determine the current state of the process and the requirements set for it, customers of the IT process were interviewed. A process analysis, the identification of the most important sub-processes, and the discovery of areas for improvement are the key results of this thesis. The thesis focused on finding the weaknesses of the IT process and targets for improvement as a basis for continuous development, rather than on a radical redesign of the process. The thesis presents the principles of TQM, quality tools, and the terminology, principles, and systematic implementation of process management. The work also gives a picture of how TQM and process management interlock in a company's quality work.
Abstract:
In a networked business environment the visibility requirements towards supply operations and the customer interface have become tighter. In order to meet those requirements, the master data of the case company is seen as an enabler. However, the current state of the master data and its quality are not considered good enough to meet those requirements. In this thesis, the target of the research was to develop a process for managing master data quality as a continuous process and to find solutions for cleansing the current customer and supplier data to meet the quality requirements defined in that process. Based on the theory of Master Data Management and data cleansing, a small amount of master data was analyzed and cleansed using one commercial data cleansing solution available on the market. This was conducted in cooperation with the vendor as a proof of concept. In the proof of concept, the cleansing solution's applicability for improving the quality of the current master data was demonstrated. Based on those findings and the theory of data management, recommendations and proposals for improving the quality of the data were given. The results also revealed that the biggest reasons for poor data quality are the lack of data governance in the company and the current master data solutions and their restrictions.
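As a sketch of the kind of cleansing step described above: the thesis used a commercial cleansing solution whose internals are not given here, so the snippet below only illustrates the general idea of normalizing customer master records and flagging likely duplicates. All field names, matching rules and sample records are assumptions.

# Illustrative master data cleansing: normalize customer records and flag
# likely duplicates. Field names and matching rules are assumptions, not the
# commercial solution referred to in the abstract.
from collections import defaultdict

def normalize(record: dict) -> dict:
    return {
        "name": " ".join(record.get("name", "").upper().split()),
        "country": record.get("country", "").strip().upper(),
        "vat_id": record.get("vat_id", "").replace(" ", "").upper(),
    }

def find_duplicates(records: list) -> list:
    groups = defaultdict(list)
    for r in map(normalize, records):
        # Assumed matching key: VAT id if present, otherwise name + country.
        key = r["vat_id"] or (r["name"], r["country"])
        groups[key].append(r)
    return [g for g in groups.values() if len(g) > 1]

customers = [
    {"name": "Acme  Oy", "country": "fi", "vat_id": "FI 12345678"},
    {"name": "ACME OY", "country": "FI", "vat_id": "FI12345678"},
    {"name": "Beta Ltd", "country": "UK", "vat_id": ""},
]
print(find_duplicates(customers))  # the two Acme records group together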
Abstract:
For any international company wishing to enter the Chinese market, quality is fundamental. Companies are gradually coming to realize the importance of quality and have put quality problems on the agenda. The competitiveness of companies comes from quality. Quality is the key to success, and it can determine whether a company is accepted or eliminated by the market. Because of these obvious benefits, the demand for methods of achieving high product quality keeps growing. In the process of achieving high quality, the main difficulties arise from the clash between Eastern and Western cultures. Chinese culture, which differs from Western culture, has lasted for as long as five thousand years. Such a culture is deeply rooted in the hearts of Chinese people and has shaped the working style and ways of thinking of generation after generation. This thesis examines how to find a good fit between Eastern and Western cultures: doing the right thing in the right way. Improving quality is, in essence, improving the level of management. The thesis addresses the basic questions of how to manage and who should be managed, and explains the basic and best options for achieving this. It describes a three-dimensional management style for monitoring the working process; this kind of management can inspect the production process in both horizontal and vertical directions. Within this management approach, an effective evaluation system is defined for every subcontractor, enabling companies to achieve the ultimate goal of satisfactory quality. Because of the importance of the human factor, the thesis defines the scope of training for inspectors and welders in view of the current situation in China. The results show that reaching a reliable and effective evaluation of training requires attention not only to the quality of the people involved but also to the ultimate goal of product quality.
Abstract:
The purpose of this study is to examine whether a Web CMS can be used to implement and host an online community. The study is divided into two parts. The theoretical part contains the definition of a Web CMS and clarifies the relation between an online community and social software. The first part also defines the parameters which must be taken into account when choosing a Web CMS for hosting an online community. The practical part of the study contains analyses of three Web CMSs: Drupal, Liferay and Plone. All three Web CMSs were analyzed using the technical and social parameters identified in the theoretical part of the study. The primary objective is to investigate whether the selected Web CMSs can be used to implement and host an online community. If hosting is possible, the secondary objective is to investigate whether the selected Web CMS has an effect on the online community.
Abstract:
The aim of the thesis was to study quality management with a process approach and to find out how to utilize process management to improve quality. The operating environment of organizations has changed. Organizations are focusing on their core competences and networking with suppliers and customers to ensure more effective and efficient value creation for the end customer. Quality management is moving from inspection of the output to prevention of problems from occurring in the first place, and management thinking is changing from a functional approach to a process approach. In the theoretical part of the thesis, it is studied how to define quality, how to achieve good quality, how to improve quality, and how to make sure the improvement continues as a never-ending cycle. A selection of quality tools is introduced. The process approach to quality management is described and compared to the functional approach, which is the traditional way to manage operations and quality. The customer focus is also studied, and it is shown that, to ensure long-term customer commitment, an organization needs to react to changing customer requirements and wishes by constantly improving its processes. In the experimental part, the theories are tested in a process improvement business case. It is shown how to execute a process improvement project, starting from defining the customer requirements and continuing to defining the process ownership, roles and responsibilities, boundaries, interfaces and the actual process activities. The control points and measures are determined for the process, as well as the feedback and corrective action process, to ensure that continual improvement can be achieved and to enable verification that customer requirements are fulfilled.
Abstract:
The purpose of this thesis is to study, investigate and compare the usability of open source CMSs. The thesis examines and compares the usability aspects of selected open source CMSs. The research is divided into two complementary parts, a theoretical part and an analytical part. The theoretical part mainly describes open source web content management systems, usability and the evaluation methods. The analytical part compares and analyzes the results found in the empirical research. The heuristic evaluation method was used to identify usability problems in the interfaces. The study is fairly limited in scope; six tasks were designed and performed in each interface to discover defects in the interfaces. Usability problems were rated according to their level of severity. The time taken by each task, the severity level of each problem and the type of heuristic violated were recorded, analyzed and compared. The results of this study indicate that the compared systems provide usable interfaces, and WordPress is recognized as the most usable system.
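To show how the recorded observations (task time, severity rating, heuristic violated) can be aggregated for comparison across systems, here is a minimal sketch. The severity scale, the sample findings and the system names other than WordPress are assumptions, not the study's data.

# Illustrative aggregation of heuristic evaluation findings.
# Severity scale assumed: 0 (not a problem) .. 4 (usability catastrophe).
# Sample findings and the name "CMS_B" are invented for this sketch.

findings = [
    {"system": "WordPress", "task": 1, "time_s": 42, "severity": 1, "heuristic": "consistency and standards"},
    {"system": "WordPress", "task": 2, "time_s": 35, "severity": 2, "heuristic": "error prevention"},
    {"system": "CMS_B",     "task": 1, "time_s": 75, "severity": 3, "heuristic": "visibility of system status"},
    {"system": "CMS_B",     "task": 2, "time_s": 58, "severity": 2, "heuristic": "user control and freedom"},
]

def summarize(findings: list) -> dict:
    grouped = {}
    for f in findings:
        g = grouped.setdefault(f["system"], {"times": [], "severities": []})
        g["times"].append(f["time_s"])
        g["severities"].append(f["severity"])
    return {
        system: {
            "avg_time_s": sum(g["times"]) / len(g["times"]),
            "max_severity": max(g["severities"]),
            "problem_count": len(g["severities"]),
        }
        for system, g in grouped.items()
    }

for system, stats in summarize(findings).items():
    print(system, stats)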
Abstract:
Due to various advantages such as flexibility, scalability and updatability, software intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications. With their computational power, these platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the utilization of the resources has to be efficient in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but becomes an issue also at ground level, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach for designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach where the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
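The following is a toy sketch of the dynamic reconfiguration idea mentioned above: a platform-level agent monitors core health and migrates tasks away from a core flagged as faulty. The cores, tasks and fault signal are invented for illustration; the thesis develops such mechanisms formally through refinement and targets a hardware description language, not Python.

# Toy sketch of agent-based dynamic reconfiguration on a many-core platform.
# All names, the fault signal and the remapping policy are assumptions.

class CoreAgent:
    def __init__(self, core_id: int):
        self.core_id = core_id
        self.healthy = True
        self.tasks = []

class PlatformAgent:
    def __init__(self, n_cores: int):
        self.cores = [CoreAgent(i) for i in range(n_cores)]

    def report_fault(self, core_id: int) -> None:
        """Mark a core faulty and migrate its tasks to the least-loaded healthy core."""
        faulty = self.cores[core_id]
        faulty.healthy = False
        for task in faulty.tasks:
            target = min((c for c in self.cores if c.healthy),
                         key=lambda c: len(c.tasks))
            target.tasks.append(task)
        faulty.tasks = []

platform = PlatformAgent(n_cores=4)
platform.cores[0].tasks = ["video_decode", "filter"]
platform.report_fault(core_id=0)
print([(c.core_id, c.tasks) for c in platform.cores])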
Abstract:
This master's thesis evaluates the competitors in the welding quality management software market. The competitive field is new, and there is no precise information about what kinds of competitors are on the market. Welding quality management software helps companies guarantee high quality. The software ensures high quality by verifying that the welder is qualified and follows the welding procedure specifications and the given parameters. In addition, the software collects all data from the welding process and generates the required documents from it. The theoretical part of the thesis consists of a literature review of solution business, competitor analysis and competitive forces theory, and welding quality management. The empirical part of the work is a qualitative study that examines competing welding quality management software products and interviews their users. The result of the thesis is a new competitor analysis model for welding quality management software. The model makes it possible to rate the software products on the basis of the primary and secondary features they offer. Secondly, the thesis analyzes the current competitive situation by applying the newly developed competitor analysis model.