24 results for Non Functional Requirements

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

This Master's thesis describes the development of the core of a communication application for the Symbian platform. The requirement for the application as a whole was to respond to unanswered calls with predefined text messages according to user-defined rules. The non-functional requirements were reduced resource usage and support for reuse. The goal of this work was therefore to develop a core that encapsulates the application functionality that is independent of the user interface and reusable. Development was guided by the Unified Process, an iterative, use-case-driven, and architecture-centric software process. It also encouraged the use of other established industry practices, such as design patterns and visual modelling with the Unified Modelling Language. Design patterns were applied during development, and the software was modelled visually to advance and clarify the design. Platform services were exploited to minimize development time and resource usage. The main responsibilities of the core were defined as sending messages and storing and checking the rules. The different parts of the application, namely the application server and the user interface, could use the core, and the core had no dependencies on the user interface layer. Thus, resource usage decreased and reusability increased. Message sending was implemented with the methods of the Symbian platform. For storing the rules, a storage framework was built that isolates the internal and external representation of the rules; in this case, a relational database was chosen as the external storage format. Rule checking was implemented through conventional object collaboration. The main goal was achieved. This and the other outcomes judged to be good, such as reusability and reduced resource usage, were attributed to the use of design patterns and the Unified Process. These methods proved to adapt even to small projects. They were also found to support and encourage learning during development, which was essential in this case.
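As a rough illustration of the storage framework described above, the sketch below separates the rules' in-memory form from their external relational storage. The names (`Rule`, `RuleStore`) and the schema are hypothetical, not taken from the thesis, and SQLite stands in for the relational database:

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Rule:
    caller: str      # number the rule applies to; "*" matches any caller
    message: str     # reply text sent when the call goes unanswered

class RuleStore:
    """External storage for rules; clients only ever see Rule objects."""
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute("CREATE TABLE IF NOT EXISTS rules (caller TEXT, message TEXT)")

    def save(self, rule: Rule) -> None:
        self.db.execute("INSERT INTO rules VALUES (?, ?)", (rule.caller, rule.message))
        self.db.commit()

    def match(self, caller: str):
        """Return the reply message for a missed call, or None if no rule applies."""
        row = self.db.execute(
            "SELECT message FROM rules WHERE caller IN (?, '*') LIMIT 1", (caller,)
        ).fetchone()
        return row[0] if row else None

store = RuleStore()
store.save(Rule(caller="*", message="In a meeting, will call back."))
print(store.match("+358401234567"))  # -> "In a meeting, will call back."
```

Because the user interface and the application server both talk to `RuleStore` rather than to the database directly, the external format can change without touching either of them, which is the point of isolating the two representations.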

Relevance: 100.00%

Abstract:

This thesis describes the development of a software requirements specification for a user-centric event management system. The system is set to satisfy three goals: adding value for the event attendees, adding value for the event organizer, and reducing the costs of arranging and running an event. The requirements are identified by researching the prescriptive traits of the event business and the current state of the case company and its environment. First, the professional and human needs for events are scrutinized. Second, some recent reports about current trends in the event business are reviewed. Then the event life cycle is presented using the model of new service development, with special attention given to the online promotion of events and especially word-of-mouth marketing. Events are also regarded from the perspective of social networks and social media. The case company's current state and its competitors are reviewed to formulate the needs which the system should fulfil. Then the currently available solutions for social-media-oriented event management are reviewed. The result is a set of functional and non-functional requirements. The functional requirements are categorized into social media, social networking, event personalization, event management, and system administration features. The specified features and non-functional requirements satisfy the three goals set for the system.

Relevance: 100.00%

Abstract:

Software quality has become an important research subject, not only in the Information and Communication Technology spheres but also in other industries at large where software is applied. Software quality is not happenstance; it is defined, planned, and built into the software product throughout the Software Development Life Cycle. The research objective of this study is to investigate the roles of the human and organizational factors that influence software quality construction. The study employs Straussian grounded theory. The empirical data were collected from 13 software companies and include 40 interviews. The results of the study suggest that tools, infrastructure, and other resources have a positive impact on software quality, but that the human factors involved in the software development processes determine the quality of the products developed. Development methods, on the other hand, were found to have little effect on software quality. The research suggests that software quality construction is an information-intensive process in which organizational structures, the mode of operation, and the information flow within the company variably affect software quality. The results also suggest that software development managers influence the productivity of developers and the quality of the software products. Several challenges of software testing that affect software quality are also brought to light. The findings of this research are expected to benefit the academic community and software practitioners by providing insight into the issues pertaining to software quality construction undertakings.

Relevance: 90.00%

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault-tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the keys to succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or missing tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains where necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
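To make the idea of generating tests from behavioral models concrete, here is a minimal sketch: a state-machine model of a hypothetical login dialog and a random-walk generator that produces abstract test sequences. It illustrates the general MBT principle only; the thesis itself works from UML models with dedicated tooling, and the model and action names below are invented:

```python
import random

# Behavioral model of a simple login dialog as a state machine:
# state -> list of (action, next_state) transitions.
MODEL = {
    "LoggedOut": [("enter_valid_credentials", "LoggedIn"),
                  ("enter_invalid_credentials", "LoggedOut")],
    "LoggedIn":  [("log_out", "LoggedOut"),
                  ("open_profile", "Profile")],
    "Profile":   [("close_profile", "LoggedIn")],
}

def generate_test(model, start="LoggedOut", steps=6, seed=None):
    """Random walk over the model; each walk is one abstract test case."""
    rng = random.Random(seed)
    state, trace = start, []
    for _ in range(steps):
        action, state = rng.choice(model[state])
        trace.append(action)
    return trace

# Each generated trace would later be mapped to concrete test-script calls.
for i in range(3):
    print(generate_test(MODEL, seed=i))
```

Running many such walks concurrently, while measuring responsiveness instead of checking outputs, is essentially the simple form of model-based performance testing described above.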

Relevance: 80.00%

Abstract:

Object-oriented programming is a widely adopted paradigm for desktop software development. This paradigm partitions software into separate entities, objects, which consist of data and the related procedures used to modify and inspect it. The paradigm has evolved during the last few decades to emphasize decoupling between object implementations, via means such as explicit interface inheritance and event-based implicit invocation. Inter-process communication (IPC) technologies allow applications to interact with each other. This makes it possible to distribute software across multiple processes, resulting in a modular architecture with benefits in resource sharing, robustness, code reuse, and security. The support for object-oriented programming concepts varies between IPC systems. This thesis focuses on the D-Bus system, which has recently gained a lot of users but is still scantily researched. D-Bus supports asynchronous remote procedure calls with return values and a content-based publish/subscribe event delivery mechanism. In this thesis, several patterns for method invocation in D-Bus and similar systems are compared. The patterns that simulate synchronous local calls are shown to be dangerous. We then present a state-caching proxy construct, which avoids the complexity of properly asynchronous calls for object inspection. The proxy and certain supplementary constructs are presented conceptually as generic object-oriented design patterns. The effect of these patterns on non-functional qualities of software, such as complexity, performance, and power consumption, is reasoned about based on the properties of the D-Bus system. The use of the patterns reduces complexity while maintaining the other qualities at a good level. Finally, we present the currently existing means of specifying D-Bus object interfaces for the purposes of code and documentation generation. The interface description language used by the Telepathy modular IM/VoIP framework is found to be a useful extension of the basic D-Bus introspection format.
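The state-caching proxy idea can be sketched as follows, with plain asyncio objects standing in for real D-Bus proxies (the class names and the simulated latency are illustrative, not the D-Bus API): reads are served instantly from a cache that change signals keep fresh, so no synchronous-style round trip is needed for object inspection.

```python
import asyncio

class RemoteObject:
    """Stand-in for a D-Bus service object; real calls would cross processes."""
    def __init__(self):
        self._volume = 50
        self.on_changed = []          # subscribers to the change signal

    async def get_volume(self):
        await asyncio.sleep(0.05)     # simulated IPC round-trip latency
        return self._volume

    async def set_volume(self, value):
        await asyncio.sleep(0.05)
        self._volume = value
        for cb in self.on_changed:    # publish/subscribe change notification
            cb(value)

class CachingProxy:
    """Serves reads from a local cache kept fresh by change signals."""
    def __init__(self, remote):
        self._remote = remote
        self._volume = None
        remote.on_changed.append(self._update)

    def _update(self, value):
        self._volume = value

    async def prime(self):
        self._volume = await self._remote.get_volume()  # one async fetch at startup

    @property
    def volume(self):                 # instant, local, no IPC round trip
        return self._volume

async def main():
    remote = RemoteObject()
    proxy = CachingProxy(remote)
    await proxy.prime()
    print(proxy.volume)               # 50, read from cache
    await remote.set_volume(80)       # signal keeps the cache fresh
    print(proxy.volume)               # 80

asyncio.run(main())
```

The design trades one asynchronous fetch at start-up plus signal traffic for the removal of per-read round trips, which is how the pattern keeps complexity down without hurting performance or power consumption.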

Relevance: 80.00%

Abstract:

Cell division (mitosis) is a fundamental process in the life cycle of a cell. Equal distribution of chromosomes between the daughter cells is essential for the viability and well-being of an organism: loss of fidelity in cell division is a contributing factor in human cancer and also gives rise to miscarriages and genetic birth defects. To maintain the proper chromosome number, a cell must carefully monitor cell division in order to detect and correct mistakes before they are translated into chromosomal imbalance. For this purpose, an evolutionarily conserved mechanism termed the spindle assembly checkpoint (SAC) has evolved. The SAC comprises a complex network of proteins that relay and amplify mitosis-regulating signals created by assemblages called kinetochores (KTs). Importantly, minor defects in SAC signaling can cause loss or gain of individual chromosomes (aneuploidy), which promotes tumorigenesis, while complete failure of the SAC results in cell death. The latter event has raised interest in the discovery of low molecular weight (LMW) compounds targeting the SAC that could be developed into new anti-cancer therapeutics. In this study, we performed a cell-based, phenotypic high-throughput screen (HTS) to identify novel LMW compounds that inhibit SAC function and result in loss of cancer cell viability. Altogether, we screened 65 000 compounds and identified eight that forced the cells prematurely out of mitosis. The flavonoids fisetin and eupatorin, as well as the synthetic compounds termed SACi2 and SACi4, were characterized in more detail utilizing versatile cell-based and biochemical assays. To identify the molecular targets of these SAC-suppressing compounds, we investigated the conditions in which SAC activity became abrogated. Eupatorin, SACi2 and SACi4 preferentially abolished the tension-sensitive arm of the SAC, whereas fisetin also lowered the SAC activity evoked by a lack of attachments between microtubules (MTs) and KTs. Consistent with the abrogation of the SAC in response to low tension, our data indicate that all four compounds inhibited the activity of Aurora B kinase. This essential mitotic protein is required for the correction of erratic MT-KT attachments, normal SAC signaling, and the execution of cytokinesis. Furthermore, eupatorin, SACi2 and SACi4 also inhibited Aurora A kinase, which controls centrosome maturation and separation and the formation of the mitotic spindle apparatus. In line with the established profound mitotic roles of Aurora kinases, these small compounds perturbed SAC function, caused spindle abnormalities, such as multi- and monopolarity and fragmentation of centrosomes, and resulted in polyploidy due to defects in cytokinesis. Moreover, the compounds dramatically reduced the viability of cancer cells. Taken together, using a cell-based HTS we were able to identify new LMW compounds targeting the SAC. We demonstrated for the first time a novel function for flavonoids as cellular inhibitors of Aurora kinases. Collectively, our data support the concept that loss of mitotic fidelity due to a non-functional SAC can reduce the viability of cancer cells, a phenomenon that may possess therapeutic value and fuel the development of new anti-cancer drugs.

Relevance: 80.00%

Abstract:

Context: Web services have been gaining popularity due to the success of service-oriented architecture and cloud computing. Web services offer a tremendous opportunity for service developers to publish their services and applications beyond the boundaries of the organization or company. To fully exploit these opportunities, however, efficient discovery mechanisms are necessary. Web service discovery has therefore attracted considerable attention in Semantic Web research, yet there have been no literature surveys that systematically map the existing results, so the overall impact of these research efforts and the maturity of their results remain unclear. This thesis aims at providing an overview of the current state of research into Web service discovery mechanisms using a systematic mapping study. The work is based on papers published from 2004 to 2013 and elaborates various aspects of the analyzed literature, including a classification in terms of the architectures, frameworks, and methods used for Web service discovery. Objective: The objective of this work is to summarize the current knowledge available on Web service discovery mechanisms and to systematically identify and analyze the published research in order to identify the different approaches presented. Method: A systematic mapping study has been employed to assess the various Web service discovery approaches presented in the literature. Systematic mapping studies are useful for categorizing and summarizing the level of maturity of a research area. Results: The results indicate that numerous approaches are consistently being researched and published in this field. In terms of publication venue, conferences are the major contributing arena, as 48% of the selected papers were published at conferences, illustrating the level of maturity of the research topic. Additionally, the 52 selected papers are categorized into two broad segments, namely functional and non-functional approaches, taking into consideration architectural aspects and information retrieval approaches, semantic matching, syntactic matching, behavior-based matching, as well as QoS and other constraints.

Relevance: 80.00%

Abstract:

Nowadays, when most businesses are moving toward sustainability by providing or consuming services from different vendors, the Service Level Agreement (SLA) has become very important both for business providers/vendors and for users/customers. There are many ways to inform users/customers about various services, their execution functionality, and even their non-functional/Quality of Service (QoS) aspects through negotiating, evaluating, or monitoring SLAs. However, traditional SLAs do not cover eco-efficient green issues or IT ethics issues for sustainability. That is why the green SLA (GSLA) should come into play. A GSLA is a formal agreement incorporating all the traditional commitments as well as green and ethics issues in the IT business sector. This research surveys traditional SLA parameters for various services, such as network, compute, storage, and multimedia, in IT business areas. At the same time, the survey focuses on finding the gaps and on combining these traditional SLA parameters with green issues for all the mentioned services. The research mainly addresses the integration of green parameters into existing SLAs, defining a GSLA with new green performance indicators and their measurable units. A GSLA template is then defined, compiling all the green indicators, such as recycling, radio-wave emission, toxic material usage, obsolescence indication, ICT product life cycles, and energy cost, for sustainable development. Moreover, human-interaction and IT ethics issues such as security and privacy, user satisfaction, intellectual property rights, user reliability, and confidentiality would also need to be added to the proposed GSLA. However, integrating new and existing performance indicators in the proposed GSLA for sustainable development could be difficult for ICT engineers. Therefore, this research also explores the management complexity of the proposed green SLA by designing a general informational model and analyzing all the relationships, dependencies, and effects between the various newly identified services under the sustainability pillars. Sustainability, however, can only be achieved through proper implementation of the newly proposed GSLA, which largely depends on monitoring the performance of the green indicators. Therefore, this research focuses on the monitoring and evaluation phase of GSLA indicators through their interactions with traditional basic SLA indicators, which would help to achieve proper implementation of a future GSLA. Finally, the newly proposed GSLA informational model and monitoring aspects could help different service providers/vendors design their future business strategy in this new transitional sustainable society.
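As a rough sketch of what a GSLA template with measurable green indicators might look like in code, assuming hypothetical indicator names, units, and target values (the thesis's actual informational model is richer than this):

```python
from dataclasses import dataclass

@dataclass
class GreenIndicator:
    name: str          # e.g. "energy_cost"
    unit: str          # measurable unit, e.g. "kWh/month"
    target: float      # value the provider commits to in the GSLA
    measured: float    # value observed during the monitoring phase

    def compliant(self) -> bool:
        """Lower is better for every indicator in this simplified model."""
        return self.measured <= self.target

# Hypothetical GSLA template mixing a traditional and two green indicators.
gsla = [
    GreenIndicator("response_time", "ms", target=200, measured=180),
    GreenIndicator("energy_cost", "kWh/month", target=1200, measured=1350),
    GreenIndicator("non_recycled_hardware", "%", target=40, measured=35),
]

for ind in gsla:
    status = "OK" if ind.compliant() else "VIOLATION"
    print(f"{ind.name}: {ind.measured} {ind.unit} (target {ind.target}) -> {status}")
```

Even this toy version shows the monitoring interaction the abstract describes: green indicators are evaluated with the same compare-against-target machinery as traditional SLA indicators such as response time.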

Relevance: 80.00%

Abstract:

The importance of software testing has grown as software products increasingly affect our everyday lives. The connection between companies' investments and quality assurance is therefore evident. Organizations are investing more and more in non-functional testing, such as security, performance, and usability testing. The purpose of this thesis is to study the current state of software testing in Finland. The motivation is to renew and improve the software testing course offering at the University of Turku so that it matches the needs of companies as well as possible. The thesis was carried out as a replication study. The main part of the survey consists of questions about the software testing methods and tools used during the activities of the testing process, together with general questions about the companies and their software testing environments. The survey also covers the testing levels and testing types used by the companies and the challenges they encounter in testing. The thesis is grounded in testing process standards. Software testing standards play a central role in this work, even though they have recently been the target of strong criticism; doubts about their necessity have arisen from changes in software development. The thesis presents results on software testing practices, compared with the results of an earlier survey on the topic (Lee, Kang, & Lee, 2011). Lack of time is found to be a major challenge in software testing. Agile software development has gained popularity in all of the respondents' companies. Testing methods and tools for test estimation, planning, and reporting are used very little, whereas the use of methods and tools for automated test execution and defect management has increased. System, acceptance, unit, and integration testing are in use in all of the companies represented by the respondents. All respondents consider regression, exploratory, and non-functional testing to be important techniques.

Relevance: 80.00%

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of various natures, whether due to failures of system components or to varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence, a resilient system should have both advanced monitoring and error detection capabilities to recognise changes and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification, and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature: to ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure the scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, together with the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare, and the cloud domain.
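Since the abstract mentions discrete-event simulation in SimPy as the quantitative ingredient, a minimal sketch of that part is shown below: it estimates the availability of a system whose reconfiguration (failover to a spare) takes a fixed time after random component failures. The failure rate and failover time are invented for illustration and say nothing about the thesis's case studies.

```python
import random
import simpy

FAIL_MEAN = 100.0      # mean time between component failures (hypothetical)
RECONF_TIME = 2.0      # time the reconfiguration mechanism needs to fail over

def component(env, stats):
    """A component that fails at random times; on each failure the system
    reconfigures onto a spare, accumulating downtime in the meantime."""
    while True:
        yield env.timeout(random.expovariate(1.0 / FAIL_MEAN))  # up time
        start = env.now
        yield env.timeout(RECONF_TIME)                          # failover
        stats["downtime"] += env.now - start
        stats["failures"] += 1

random.seed(1)
stats = {"downtime": 0.0, "failures": 0}
env = simpy.Environment()
env.process(component(env, stats))
env.run(until=10_000)

availability = 1 - stats["downtime"] / 10_000
print(f"failures={stats['failures']}, availability={availability:.4f}")
```

Re-running the simulation with alternative `RECONF_TIME` values is the simulation-side analogue of comparing reconfiguration strategies, complementing the functional guarantees proved in Event-B.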

Relevance: 80.00%

Abstract:

This work presents a synopsis of efficient strategies used in power management for achieving the most economical power and energy consumption in multicore systems, FPGA, and NoC platforms. A practical approach was taken in an effort to validate the significance of the Adaptive Power Management Algorithm (APMA) proposed for the system developed for this thesis project. This system comprises an arithmetic and logic unit, up and down counters, an adder, a state machine, and a multiplexer. The purpose of carrying out this project was, firstly, to develop a system to be used for this power management project; secondly, to perform an area and power synopsis of the system on various scalable technology platforms, UMC 90 nm technology at 1.2 V, UMC 90 nm technology at 1.32 V, and UMC 0.18 µm technology at 1.80 V, in order to examine the differences in the area and power consumption of the system on these platforms; and thirdly, to explore various strategies that can be used to reduce the system's power consumption and to propose an adaptive power management algorithm to that end. The strategies introduced in this work comprise Dynamic Voltage and Frequency Scaling (DVFS) and task parallelism. After development, the system was run on an FPGA board, basically NoC platforms, and on the technology platforms mentioned above; the system synthesis was successfully accomplished, the simulated result analysis shows that the system meets all functional requirements, and the power consumption and area utilization were recorded and analyzed in Chapter 7 of this work. This work extensively reviews various strategies for managing power consumption drawn from quantitative research by many researchers and companies; it is a mixture of study analysis and experimental lab work, and it condenses and presents the basic concepts of power management strategy from quality technical papers.
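The leverage behind DVFS comes from the standard dynamic-power relation for CMOS logic, P = C · V² · f: lowering the supply voltage and clock frequency together cuts power superlinearly. A small worked sketch follows, with a hypothetical switched capacitance; the two voltages echo the 1.32 V and 1.2 V operating points above, while the frequencies are invented:

```python
# Dynamic power of CMOS logic scales as P = C * V^2 * f (effective switched
# capacitance C, supply voltage V, clock frequency f), which is why DVFS
# lowers voltage and frequency together whenever the workload allows.

def dynamic_power(c_farads, v_volts, f_hz):
    return c_farads * v_volts**2 * f_hz

C = 1e-9  # hypothetical effective switched capacitance, 1 nF

full  = dynamic_power(C, 1.32, 200e6)   # full-speed operating point
saved = dynamic_power(C, 1.20, 100e6)   # scaled-down operating point

print(f"full speed : {full*1e3:.2f} mW")    # 348.48 mW
print(f"scaled down: {saved*1e3:.2f} mW  ({(1 - saved/full):.0%} less)")
```

Halving the frequency alone would halve the power; combined with the voltage drop from 1.32 V to 1.2 V, the scaled-down point here uses roughly 59% less dynamic power, which is the effect an adaptive algorithm such as the proposed APMA exploits at run time.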

Relevance: 80.00%

Abstract:

In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) utilizes abstract behavioral models to automate test generation, thus decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, however, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is one such tool, which utilizes probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a System Under Test (SUT) and in turn carry out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line tool as well as a graphical user interface. The goal of this thesis project is two-fold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, thereby removing the requirement of local installation, and 2) to design a user interface for this web application that adds new user interaction paradigms to the existing feature set of the tool. All phases of the MBPeT process are realized via this single web deployment location, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development. The Vaadin framework handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java. A number of experiments are run in a case-study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the solution in handling multiple concurrent users. The results support a successful solution with regard to the functional and performance criteria defined, while improvements and optimizations are suggested to increase both of these factors.
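A minimal sketch of the probabilistic-model idea behind MBPeT is given below: a Markov-style user model drives synthetic requests and a simple latency KPI is collected. The model, its states, and the `fake_send` stand-in are hypothetical; MBPeT's actual model format, load generation, and KPI set differ.

```python
import random
import time

# Probabilistic user model: from each state, the next action is drawn
# according to (assumed) observed real-world usage probabilities.
MODEL = {
    "browse":   [(0.6, "browse"), (0.3, "search"), (0.1, "checkout")],
    "search":   [(0.7, "browse"), (0.3, "checkout")],
    "checkout": [(1.0, "browse")],
}

def pick(transitions, rng):
    """Draw the next state from a list of (probability, state) pairs."""
    r, acc = rng.random(), 0.0
    for p, state in transitions:
        acc += p
        if r <= acc:
            return state
    return transitions[-1][1]

def virtual_user(requests, rng, send):
    """One synthetic user session; `send` issues the request to the SUT
    and returns its response time, feeding KPIs such as mean latency."""
    state, latencies = "browse", []
    for _ in range(requests):
        latencies.append(send(state))
        state = pick(MODEL[state], rng)
    return sum(latencies) / len(latencies)

def fake_send(action):               # stand-in for a real HTTP call to the SUT
    t = random.uniform(0.01, 0.05)
    time.sleep(t)
    return t

print(f"mean latency: {virtual_user(20, random.Random(42), fake_send):.3f} s")
```

Running many such virtual users concurrently, and streaming the collected KPIs to the browser in real time, is the workload-generation and monitoring loop that the Dashboard exposes through the web interface.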

Relevance: 80.00%

Abstract:

Sustainability in software systems is still a new practice that most software developers and companies are trying to incorporate into their software development life cycle, and it has been largely discussed in academia. Sustainability is a complex concept viewed from economic, environmental, and social dimensions, with several proposed definitions that sometimes make the concept very fuzzy and difficult to apply and assess in software systems. This has hindered the adoption of sustainability in the software industry. Little research explores sustainability as a quality property of software products and services to answer questions such as: How can sustainability be quantified as a quality construct in the same way as other quality attributes such as security, usability, and reliability? How can it be applied to software systems? What are the measures and measurement scales of sustainability? The goal of this research is to investigate the definitions, perceptions, and measurement of sustainability from the quality perspective. Grounded in the general theory of software measurement, the aim is to develop a method that decomposes sustainability into factors, criteria, and metrics. The result is a method to quantify and assess the sustainability of software systems while incorporating management and user concerns. Conclusion: The method will empower companies to easily adopt sustainability while facilitating its integration into the software development process and tools. It will also help companies measure the sustainability of their software products along the economic, environmental, social, individual, and technological dimensions.
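One way to picture the factor-criteria-metric decomposition described above is the toy scoring model below, in the style of classic software quality models. The factors follow the dimensions named in the abstract, while the weights, criteria, and metric scores are invented for illustration and are not the thesis's method:

```python
# Factor -> criteria -> normalized metric scores in [0, 1]; weights sum to 1.
SUSTAINABILITY = {
    "environmental": {"weight": 0.4, "criteria": {
        "energy_efficiency": 0.8,
        "hardware_footprint": 0.6,
    }},
    "economic": {"weight": 0.3, "criteria": {
        "maintenance_cost": 0.7,
    }},
    "social": {"weight": 0.3, "criteria": {
        "accessibility": 0.9,
        "privacy": 0.5,
    }},
}

def score(model):
    """Average metric scores per factor, then weight factors into one index."""
    total = 0.0
    for factor in model.values():
        crit = factor["criteria"]
        total += factor["weight"] * (sum(crit.values()) / len(crit))
    return total

print(f"sustainability index: {score(SUSTAINABILITY):.2f}")  # 0 (worst) .. 1 (best)
```

The point of such a decomposition is exactly the one the abstract makes for security or usability: once sustainability is broken into measurable metrics with defined scales, it can be assessed and tracked like any other quality attribute.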

Relevance: 30.00%

Abstract:

Today's business environment has become increasingly unexpected and fast changing because of the global competition. This new environment requires the companies to organize their control differently, e.g. by logistic process thinking. Logistic process thinking in software engineering applies the principles of production process to immaterial products. Processes must be optimized, so that every phase adds value to the customer, and the lead times can be cut shorter to meet the new customer requirements. The purpose of this thesis is to examine and optimize the testing processes of software engineering concentrating on module testing, functional testing and their interface. The concept of logistic process thinking is introduced through production process, value added model and process management. Also theory of testing based on literature is presented, concentrating on module testing and functional testing. The testing processes of the Case Company are presented together with the project models in which they are implemented. The real life practices in module testing and functional testing and their interface are examined through interviews. These practices are analyzed against the processes and the testing theory, through which ideas for optimizing the testing process are introduced. The project world of the Case Company is also introduced together with two example testing projects in different life cycle phases. The examples give a view of how much effort of the project is put in different types of testing.