33 results for Computer software - Quality control

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

In this thesis, computer software for defining the geometry of a centrifugal compressor impeller is designed and implemented. The project was carried out under the supervision of the Laboratory of Fluid Dynamics at Lappeenranta University of Technology. The thesis is similar to the thesis written by Tomi Putus (2009), in which the flow channel of a centrifugal compressor impeller was studied and commonly used design practices were reviewed. Putus wrote computer software that can be used to define an impeller's three-dimensional geometry based on the basic geometrical dimensions given by a preliminary design. The software designed in this thesis is largely similar, but it uses a different programming language (C++) and a different way of defining the shape of the impeller's meridional projection.
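
To illustrate the kind of parametric definition such a tool works with, here is a minimal sketch of describing the hub contour of a meridional projection as a quadratic Bézier curve derived from basic preliminary-design dimensions. It is purely hypothetical and written in Python rather than C++ for brevity; the names and dimensions are assumptions, not the thesis implementation.

```python
from dataclasses import dataclass

@dataclass
class PreliminaryDesign:
    """Basic dimensions from a preliminary design (metres); names are illustrative."""
    r_hub_inlet: float    # hub radius at the impeller inlet
    r_outlet: float       # impeller radius at the outlet
    axial_length: float   # axial length of the impeller

def hub_contour(design: PreliminaryDesign, n_points: int = 50):
    """Return (z, r) points of the hub contour in the meridional plane,
    modelled here as a quadratic Bezier curve from inlet to outlet."""
    # Control points: axial inlet point, a corner that pulls the curve, radial outlet point.
    p0 = (0.0, design.r_hub_inlet)
    p1 = (design.axial_length, design.r_hub_inlet)
    p2 = (design.axial_length, design.r_outlet)
    contour = []
    for i in range(n_points + 1):
        t = i / n_points
        z = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        r = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        contour.append((z, r))
    return contour

if __name__ == "__main__":
    design = PreliminaryDesign(r_hub_inlet=0.02, r_outlet=0.08, axial_length=0.05)
    for z, r in hub_contour(design, n_points=5):
        print(f"z = {z:.4f} m, r = {r:.4f} m")
```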

Relevance: 100.00%

Abstract:

Software quality has become an important research subject, not only in the Information and Communication Technology spheres but also in other industries at large where software is applied. Software quality is not a happenstance; it is defined, planned and built into the software product throughout the Software Development Life Cycle. The research objective of this study is to investigate the roles of the human and organizational factors that influence software quality construction. The study employs Straussian grounded theory. The empirical data were collected from 13 software companies and comprise 40 interviews. The results of the study suggest that tools, infrastructure and other resources have a positive impact on software quality, but that the human factors involved in the software development processes determine the quality of the products developed. Development methods, on the other hand, were found to have little effect on software quality. The research suggests that software quality construction is an information-intensive process in which organizational structures, the mode of operation and the flow of information within the company variably affect software quality. The results also suggest that software development managers influence the productivity of developers and the quality of the software products. Several challenges of software testing that affect software quality are also brought to light. The findings of this research are expected to benefit the academic community and software practitioners by providing insight into the issues pertaining to software quality construction undertakings.

Relevance: 100.00%

Abstract:

Increased awareness and evolving consumer habits have set more demanding standards for the quality and safety control of food products. The production of foodstuffs which fulfill these standards can be hampered by various low-molecular-weight contaminants. Such compounds include, for example, residues of antibiotics used in animals, and mycotoxins. The extremely small size of these compounds has hindered the development of analytical methods suitable for routine use, and the methods currently in use require expensive instrumentation and qualified personnel to operate them. There is a need for new, cost-efficient and simple assay concepts which can be used for field testing and are capable of processing large sample quantities rapidly. Immunoassays have been considered the gold standard for such rapid on-site screening methods. The introduction of directed antibody engineering and in vitro display technologies has facilitated the development of novel antibody-based methods for the detection of low-molecular-weight food contaminants. The primary aim of this study was to generate and engineer antibodies against low-molecular-weight compounds found in various foodstuffs. The three antigen groups selected as targets of antibody development cause food safety and quality defects in a wide range of products: 1) fluoroquinolones, a family of synthetic broad-spectrum antibacterial drugs used to treat a wide range of human and animal infections; 2) deoxynivalenol, a type B trichothecene mycotoxin and a widely recognized problem for crops and animal feeds globally; and 3) skatole (3-methylindole), one of the two compounds responsible for boar taint, found in the meat of monogastric animals. This study describes the generation and engineering of antibodies with versatile binding properties against low-molecular-weight food contaminants, and the subsequent development of immunoassays for the detection of the respective compounds.

Relevance: 100.00%

Abstract:

Software product metrics aim at measuring the quality of software. Modularity is an essential factor in software quality. In this work, metrics related to modularity, and especially to the cohesion of modules, are considered. The existing metrics are evaluated, and several new alternatives are proposed. The idea of module cohesion is that a module or a class should consist of related parts. The closely related principle of coupling says that the relationships between modules should be minimized. First, internal cohesion metrics are considered. The relations that are internal to classes are shown to be useless for quality measurement. Second, external relationships are considered for cohesion. A detailed analysis using design patterns and refactorings confirms that external cohesion is a better quality indicator than internal cohesion. Third, motivated by the successes (and problems) of external cohesion metrics, another kind of metric is proposed that represents the quality of the modularity of software. This metric can be applied to refactorings related to classes, resulting in a refactoring suggestion system. To describe the metrics formally, a notation for programs is developed. Because of the recursive nature of programming languages, the properties of programs are most compactly represented using grammars and formal languages. The tools that were used for metrics calculation are also described.
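
As a concrete illustration of the kind of internal cohesion metric evaluated in such work, here is a small sketch of an LCOM-style measure (lack of cohesion in methods, a classic internal cohesion metric; not the thesis's own proposal). It counts method pairs in a class that share no attributes and pairs that share at least one.

```python
from itertools import combinations

def lcom(methods_to_attrs: dict[str, set[str]]) -> int:
    """LCOM-style internal cohesion metric: count method pairs that share no
    attributes (P) and pairs that share at least one (Q); return max(P - Q, 0).
    A higher value suggests lower cohesion."""
    p = q = 0
    for (_, attrs_a), (_, attrs_b) in combinations(methods_to_attrs.items(), 2):
        if attrs_a & attrs_b:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

# Usage: methods of a (hypothetical) class mapped to the attributes they access.
example_class = {
    "deposit":  {"balance"},
    "withdraw": {"balance"},
    "rename":   {"owner"},
}
print(lcom(example_class))  # 2 non-sharing pairs - 1 sharing pair = 1
```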

Relevance: 100.00%

Abstract:

The Chinese welding industry is growing every year due to the rapid development of the Chinese economy. Increasingly, companies around the world are looking to use Chinese enterprises as their cooperation partners. However, the Chinese welding industry also has its weaknesses, such as relatively low quality and weak management. A modern, advanced welding management system appropriate for local socio-economic conditions is required to enable Chinese enterprises to further enhance their business development. The thesis researches the design and implementation of a new welding quality management system for China. This new system is called the 'welding production quality control management model in China' (WQMC). Constructed on the basis of an analysis of a survey and in-company interviews, the welding management system comprises the following elements and perspectives: a 'Localized congenital existing problem resolution strategies' (LCEPRS) database, a 'human factor designed training' (HFDT) training strategy, the theory of modular design, ISO 3834 requirements, total welding management (TWM), and lean manufacturing (LEAN) theory. The methods used in the research are a literature review, questionnaires, interviews, and the author's model design experiences and observations, i.e. the approach is primarily qualitative and phenomenological. The thesis describes the design and implementation of an HFDT strategy in Chinese welding companies. Such training is an effective way to increase employees' awareness of quality and of issues associated with quality assurance. The study identified widely existing problems in the Chinese welding industry and constructed an LCEPRS database that can be used in efforts to mitigate and avoid common problems. The work uses the theory of modular design, TWM and LEAN as tools for the implementation of the WQMC system.

Relevance: 100.00%

Abstract:

Continuous quality measurement as part of the software process has become common among software companies in recent years. The ISO 9001:2000 quality standard requires companies to measure and monitor the quality of their products and processes. Selecting quality metrics is a challenging task. Companies often believe they are measuring quality when, in reality, they are measuring various properties of the software, such as size or complexity. In this master's thesis, a benchmarking-based quality measurement process for evaluating, measuring and monitoring the quality of software products is developed for the software validation process. The quality metrics are selected according to predefined criteria, and target values are set for them on the basis of the benchmarking results. In addition to the quality measurement process, the thesis gives a recommendation on deploying and using the process as part of the company's operations, which enables continuous monitoring and further development in the future.

Relevance: 100.00%

Abstract:

Joints intended for welding frequently show variations in geometry and position, and it is unfortunately not possible to apply a single set of operating parameters to ensure constant quality. The cause of this difficulty lies in a number of factors, including inaccurate joint preparation and joint fit-up, tack welds, and thermal distortion of the workpiece. In plasma arc keyhole welding of butt joints, deviations in the gap width may cause weld defects such as an incomplete weld bead, excessive penetration and burn-through. Manual adjustment of welding parameters to compensate for variations in the gap width is very difficult, and unsatisfactory weld quality is often obtained. In this study a control system for plasma arc keyhole welding was developed and used to study the effects of real-time control of welding parameters on gap tolerance during welding of austenitic stainless steel AISI 304L. The welding tests demonstrated the beneficial effect of real-time control on weld quality. Compared with welding using constant parameters, the maximum tolerable gap width with an acceptable weld quality was 47% greater when using real-time controlled parameters for a plate thickness of 5 mm. In addition, burn-through occurred only at significantly larger gap widths when parameters were controlled in real time. Increased gap tolerance allows joints to be prepared and fitted up less accurately, saving time and preparation costs for welding. In addition to the control system, a novel technique for back-face monitoring is described in this study. The test results showed that the technique can be successfully applied to penetration monitoring when welding non-magnetic materials. The results also imply that it is possible to measure the dimensions of the plasma efflux or weld root and to use this information in a feedback control system, thus maintaining the required weld quality.
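
To make the idea of real-time parameter adjustment concrete, here is a minimal sketch of a control rule that adjusts welding current from a measured gap width. It is purely illustrative; the parameter names, gains and limits are assumptions, not values from the study.

```python
def adjust_current(gap_width_mm: float,
                   nominal_current_a: float = 200.0,
                   gain_a_per_mm: float = -15.0,
                   min_current_a: float = 150.0,
                   max_current_a: float = 230.0) -> float:
    """Simple proportional rule: a wider gap lowers the current to avoid
    burn-through, a narrower gap raises it. All numbers are illustrative only."""
    current = nominal_current_a + gain_a_per_mm * gap_width_mm
    return max(min_current_a, min(max_current_a, current))

# Usage: a stream of gap-width measurements from a seam-tracking sensor.
for gap in (0.0, 0.5, 1.0, 2.0):
    print(f"gap {gap:.1f} mm -> current {adjust_current(gap):.0f} A")
```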

Relevance: 100.00%

Abstract:

Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, are easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. At this point the quality of a system is regarded as an essential issue, since any deficiencies may lead to considerable financial loss or endanger lives. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at the early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional developments. Therefore, it is important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of a graphical notation. The understandability of formal modelling is increased by a compact representation of the development and of the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that it is necessary to establish techniques for the evaluation of rigorous developments. Since we study various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting. Our goal is to provide methods for collecting data and to record evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as on related artefacts, e.g. models, since these have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to improve the quality of a system via metrics and measurements, as well as via generic refinement patterns applied to a model and a specification. We argue that these can facilitate the process of creating software systems by, for example, controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, based on the structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perceptions of domain experts.
It is our aspiration to promote measurement as an indispensable part of the quality control process and as a strategy towards quality improvement.
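
As a small illustration of the kind of structural, early-stage measurement discussed above, here is a sketch that counts variables, invariants and events in a toy model representation and reports a crude ratio. It is hypothetical and not the thesis's actual metric suite; all names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Model:
    """Toy representation of an early-stage formal model (names are illustrative)."""
    variables: list[str] = field(default_factory=list)
    invariants: list[str] = field(default_factory=list)
    events: list[str] = field(default_factory=list)

def structural_metrics(model: Model) -> dict[str, float]:
    """Simple structural counts plus a ratio hinting at how constrained the state is."""
    n_vars = len(model.variables)
    n_invs = len(model.invariants)
    n_evts = len(model.events)
    return {
        "variables": n_vars,
        "invariants": n_invs,
        "events": n_evts,
        "invariants_per_variable": n_invs / n_vars if n_vars else 0.0,
    }

m = Model(variables=["queue", "limit"],
          invariants=["len(queue) <= limit"],
          events=["enqueue", "dequeue"])
print(structural_metrics(m))
```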

Relevance: 100.00%

Abstract:

The purpose of this study was to explore the software development methods and quality assurance practices used by the South Korean software industry. Empirical data was collected by conducting a survey that focused on three main areas: software life cycle models and methods; software quality assurance, including quality standards; and the strengths and weaknesses of the South Korean software industry. The results of the completed survey showed that the use of agile methods slightly surpasses the use of traditional software development methods. The survey also revealed the interesting result that almost half of the South Korean companies do not use any software quality assurance plan in their projects. Regarding the state of the South Korean software industry, a large number of respondents thought that, despite these weaknesses, the state of software development in South Korea will improve in the future.

Relevance: 100.00%

Abstract:

The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues in these computers have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build computer software with security in mind. A problem with building software with security in mind is how to define secure software and how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. This thesis focuses on the perspective of software engineers in the software design phase. The focus on the design phase means that code-level semantics and programming language specifics are not discussed in this work. Organizational policy, management issues and the software development process are also out of scope. The first two research problems were studied using a literature review, while the third was studied using case study research. The target of the case study was a Java-based email server called Apache James, whose changelog and security issue history were available and whose source code was accessible. The research revealed that there is a consensus in the terminology on software security. Security verification activities are commonly divided into evaluation and assurance. The focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good. Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of the software, but in practice they were limited to evaluating different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design. Furthermore, interpreting the metrics' results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired. The metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They do, however, provide a basis for further research, since they point out the areas in which security metrics need to improve if verification of security from the design phase is desired.
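
As a rough illustration of what a design-phase security metric can look like, here is a small sketch of an attack-surface-style indicator: the share of methods in a design model that are externally reachable. It is hypothetical and is not one of the metrics surveyed in the thesis; the class and method names are made up.

```python
def accessibility_ratio(class_methods: dict[str, dict[str, bool]]) -> float:
    """Share of methods that are externally reachable (public) across the design.
    A crude attack-surface-style indicator: lower usually means less exposure."""
    total = public = 0
    for methods in class_methods.values():
        for is_public in methods.values():
            total += 1
            public += is_public
    return public / total if total else 0.0

# Usage: per-class map of method name -> "is public?" taken from a design model.
design = {
    "MailQueue":   {"enqueue": True, "flush": False},
    "SmtpHandler": {"onConnect": True, "parseCommand": False, "reply": False},
}
print(f"{accessibility_ratio(design):.2f}")  # 2 public out of 5 methods -> 0.40
```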

Relevance: 100.00%

Abstract:

Sustainability in software systems is still a new practice that most software developers and companies are trying to incorporate into their software development life cycle, and it has been discussed largely in academia. Sustainability is a complex concept viewed from economic, environmental and social dimensions, and the several proposed definitions sometimes make the concept very fuzzy and difficult to apply and assess in software systems. This has hindered the adoption of sustainability in the software industry. Little research explores sustainability as a quality property of software products and services, answering questions such as: How can sustainability be quantified as a quality construct in the same way as other quality attributes such as security, usability and reliability? How can it be applied to software systems? What are the measures and measurement scales of sustainability? The goal of this research is to investigate the definitions, perceptions and measurement of sustainability from the quality perspective. Grounded in the general theory of software measurement, the aim is to develop a method that decomposes sustainability into factors, criteria and metrics. The result is a method to quantify and assess the sustainability of software systems while incorporating the concerns of management and users. Conclusion: the method will improve the ability of companies to adopt sustainability easily while facilitating its integration into the software development process and tools. It will also help companies to measure the sustainability of their software products along the economic, environmental, social, individual and technological dimensions.
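
To make the factor-criteria-metric decomposition concrete, here is a small sketch of such a tree as a data structure. The factor, criterion and metric names are illustrative assumptions, not the method's actual catalogue.

```python
# A minimal factor -> criteria -> metrics tree for "sustainability as a quality
# construct". All names are illustrative, not taken from the thesis.
sustainability_model = {
    "environmental": {
        "energy efficiency": ["CPU-hours per 1000 requests", "idle power draw (W)"],
        "resource use":      ["storage footprint (GB)"],
    },
    "economic": {
        "maintainability":   ["mean time to implement a change (h)"],
    },
    "social": {
        "accessibility":     ["WCAG criteria satisfied (%)"],
    },
}

def count_metrics(model: dict) -> int:
    """Number of leaf metrics in the factor/criteria/metric tree."""
    return sum(len(metrics) for criteria in model.values() for metrics in criteria.values())

print(count_metrics(sustainability_model))  # 5
```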

Relevance: 100.00%

Abstract:

The aim of this work was to compare ways of controlling grade changes on a paper machine. The comparison covered Metso Automation's IQGradeChange grade change software and grade changes made manually by the operators. To obtain comprehensive research material, grade change data from the paper machine was collected over a period of seven months. The collected grade changes were processed in a Matlab environment to determine the grade change times. In addition, the production change rate ((t/h)/min) between the old and the new grade was calculated for each grade change, in order to establish the extent of the grade change and the performance of the different grade change methods. During the trial period, data on a total of 130 grade changes was collected from the paper machine. Of these, 58 were made with the IQGradeChange software and 72 were made manually by the operators. Of the 130 collected grade changes, 27 ended in a web break. One of the tasks was therefore to examine the grade changes that ended in a web break.
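
As a small illustration of the production change figure mentioned above, here is a sketch computing the production change rate in (t/h)/min from the production rates before and after a grade change. The numbers are hypothetical example values, not data from the thesis.

```python
def production_change_rate(rate_before_tph: float,
                           rate_after_tph: float,
                           change_duration_min: float) -> float:
    """Production change rate in (t/h)/min: how quickly the production rate
    moved from the old grade to the new grade."""
    return (rate_after_tph - rate_before_tph) / change_duration_min

# Example: production rises from 42.0 t/h to 45.6 t/h over an 18-minute grade change.
print(f"{production_change_rate(42.0, 45.6, 18.0):.2f} (t/h)/min")  # 0.20
```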

Relevance: 100.00%

Abstract:

The subject of this study is the historical development of auditing in Finland over a period of slightly more than a hundred years. The objective of the study is to analyse the development of the statutory audit of limited companies and to combine the century's lines of development into an overall picture of auditing. The period studied begins at the end of the 19th century and ends at the turn of the 21st century. The study examines the Finnish auditing institution, which is divided into three parts: the norms regulating auditing (norms), the auditor system (actors), and the content of auditing (tasks). The study seeks answers to the questions: what was audited, when was it audited, who audited, and how was the auditing carried out in different eras? The study is based on historical source material consisting of the legislation of the period, legislative preparatory documents, instructions and decisions of the authorities, recommendations of professional organisations, articles in professional journals, and the professional literature of accounting and auditing. Methodologically, the study is a theoretical, qualitative historical study in which the source material is treated with source criticism and partly by means of content analysis. Among the norms regulating auditing, the central laws have been the Companies Act, the Accounting Act and the Auditing Act. Statutory auditing began with the Companies Act of 1895, which was revised in 1978 and again in 1997. Accounting legislation has been reformed five times: in 1925 and 1928, 1945, 1973, 1993 and 1997. The Auditing Act of 1994 gathered the provisions on auditing from several laws into one act. Other norms have included EC directives, the instructions of Kila (the Finnish Accounting Standards Board), the recommendations of the KHT Association, the regulations of the Central Chamber of Commerce and, most recently, the IAS and ISA standards. A professional auditor system was established in Finland thanks to the merchants' assemblies. Suomen Tilintarkastajainyhdistys (the Finnish Auditors' Association) began operating as a professional auditing body in 1911, and its work was continued by the KHT Association from 1925 onwards. The authorisation of auditors was transferred to the Central Chamber of Commerce in 1924. HTM auditors have been in the field since 1950. The chamber of commerce organisation has supervised authorised auditors throughout the era of professional auditing. State supervision is carried out by VALA (the State Auditing Board). Throughout the period studied, lay auditors have also acted as auditors of limited companies alongside authorised auditors. According to the Companies Act of 1895, the tasks of auditing included the audit of the administration and the accounts. Later, the content was specified as the audit of the financial statements, the accounting records and the administration. At the beginning of the period, auditing consisted of manually ticking off all vouchers and searching for errors. Later, auditing shifted to spot checks. In the early 20th century, there was a move from a single year-end audit to continuous interim auditing. Observations of documentation and working papers can be found from the 1930s onwards. Computer-assisted auditing became common in the 1970s and 1980s, when attention also began to be paid to risk analyses. The importance of the audit of administration has grown throughout. At the beginning of the period, auditors' reports were free in form and rich and descriptive in content. The report became public with the Companies Act of 1978. Later, the standard report templates of the KHT Association unified and simplified the reporting.
On the basis of the study, the history of auditing can be divided into three periods: the period of construction of the auditing institution (1895-1950), the period of consolidation (1951-1985), and the period of internationalisation and publicity (from 1986 onwards). In every decade of the period studied there was continuous discussion about the sufficiency of the number of auditors, the difficulty of entering the profession and of the examinations, the level of auditors' professional skill, the content of the audit of administration, the auditor's report, and the position of lay auditors. The central topics of discussion in the 1990s were consulting, independence, the expectation gap, and the level and quality control of auditing. When analysing the changes in auditing over slightly more than a hundred years, it can be concluded that the core tasks of auditing have hardly changed over the decades. The audit of a limited company is still an audit of legal compliance. Its purpose is still the audit of the accounting records, the financial statements and the administration. Auditors look after the interests of the shareholders and report to them on the results of the audit. The external world of auditing, by contrast, has changed over the decades. Internationalisation has increased the number of regulations, there are nowadays more expectations and requirements, new technology enables rapid flow of information, and supervision has increased towards the present day. An auditor's competence is nowadays based on knowledge of information technology, information systems and the client's line of business. From the stewardship requirement of a law more than a hundred years old, we have arrived in a virtual-time world!

Relevance: 100.00%

Abstract:

Although the manufacturing process of ceramic tiles is fully automated, the final stage, quality inspection and grading, is usually done by humans. Automatic quality inspection in tile manufacturing can be justified on economic and safety grounds. The purpose of this work is to describe a research project on the classification of ceramic tiles using various colour features. An essential part of the work was the study of the difference between RGB and spectral images. The theoretical part of the work reviews earlier research on the topic and gives background information on machine vision, pattern recognition, classifiers and colour theory. The material for the practical part consisted of 25 ceramic tiles from five different grades. Classification was performed using the k-nearest-neighbour (k-NN) classifier and the self-organizing map (SOM). The results obtained were also compared with classification performed by humans. Neural computation was found to be an important tool in spectral analysis. The results obtained with the SOM and spectral features were promising, and only the chromaticity RGB features performed better in the classification.
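
To illustrate the k-nearest-neighbour vote used for grading, here is a minimal pure-Python sketch. The mean-RGB feature values and grade labels are hypothetical, not the study's data or implementation.

```python
from collections import Counter
from math import dist

def knn_classify(train: list[tuple[list[float], str]],
                 sample: list[float],
                 k: int = 3) -> str:
    """Classify a feature vector by majority vote among its k nearest
    training samples (Euclidean distance)."""
    nearest = sorted(train, key=lambda item: dist(item[0], sample))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical mean-RGB features of tiles from two quality grades.
training_data = [
    ([0.62, 0.55, 0.48], "grade A"),
    ([0.60, 0.54, 0.47], "grade A"),
    ([0.70, 0.50, 0.40], "grade B"),
    ([0.72, 0.49, 0.41], "grade B"),
]
print(knn_classify(training_data, [0.61, 0.545, 0.475], k=3))  # grade A
```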

Relevance: 100.00%

Abstract:

Requirements engineering is an extremely important area in the development of new software. Requirements engineering is not merely the compilation of a requirements specification document at the start of a software project; it encompasses the elicitation, management and verification of requirements throughout the entire life cycle of the software. In a software service company, the importance of requirements engineering is emphasised even further, and such a company must have a working requirements engineering process. This thesis presents the theory of requirements engineering, quality control related to processes, and models for process assessment and improvement. The work examines the requirements engineering of two software service companies of different types and presents observations on their process models. As a result, the thesis presents conclusions on requirements engineering, on the processes related to it, and on quality control.