15 results for Model Testing
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The aim of this study is to test the accrual-based model suggested by Dechow et al. (1995) in order to detect and compare earnings management practices in Finnish and French companies. The impact of the 2008 financial crisis on earnings management behavior in these countries is also tested by dividing the whole 2003-2012 period into two sub-periods: pre-crisis (2003-2008) and post-crisis (2009-2012). The results support the idea that companies in both countries engage in significant earnings management. During the post-crisis period, companies in Finland show income-inflating practices, while in France the opposite tendency (income deflating) is observed during the same period. Results for the assumption that managers in companies with highly concentrated ownership engage in income-enhancing practices vary between the two countries: in Finland, managers try to show better performance for bonuses or other contractual compensation motivations, while in France they aim to avoid paying dividends or high taxes.
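As a rough illustration of the accrual-based approach described above, the modified Jones model (Dechow et al., 1995) estimates non-discretionary accruals by OLS and treats the residuals as discretionary accruals, the proxy for earnings management. The sketch below uses invented firm-year figures, not the thesis data:

```python
import numpy as np

# Modified Jones model (Dechow et al., 1995) sketch.
# TA = total accruals, A = lagged total assets, dREV = change in revenue,
# dREC = change in receivables, PPE = gross property, plant and equipment.
# All figures below are hypothetical, not the thesis sample.

def discretionary_accruals(TA, A, dREV, dREC, PPE):
    """Fit non-discretionary accruals by OLS; the residuals are the
    discretionary accruals used as the earnings management proxy."""
    TA, A, dREV, dREC, PPE = map(np.asarray, (TA, A, dREV, dREC, PPE))
    y = TA / A
    # Regressors scaled by lagged assets; dREC adjusts revenue growth for
    # credit sales, the modification that gives the model its name.
    X = np.column_stack([1.0 / A, (dREV - dREC) / A, PPE / A])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Toy cross-section of five firm-years.
da = discretionary_accruals(
    TA=[12.0, -5.0, 3.0, 8.0, -2.0],
    A=[100.0, 80.0, 120.0, 90.0, 110.0],
    dREV=[20.0, 5.0, 15.0, 10.0, 8.0],
    dREC=[4.0, 1.0, 3.0, 2.0, 1.0],
    PPE=[60.0, 40.0, 70.0, 50.0, 55.0],
)
print(np.round(da, 4))
```

In the thesis setting, unusually large positive residuals would be read as income-inflating and large negative residuals as income-deflating behavior.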
Abstract:
Industrial services will account for an ever larger share of companies' revenue in the future, which makes their systematic development particularly important. The aim of this study is to build, on a theoretical basis, a model with which companies in the mechatronics cluster can analyze the services they offer from the customer's perspective. The study also seeks to determine empirically how well the model suits the development of the service offering in the participating companies of different sizes. The theoretical part of the study reviews corporate growth strategies. Service innovation and development processes are briefly covered, and starting points are presented for how companies can move from being a product provider to being a service provider. The theoretical part also covers the QFD model, which can be used when analyzing customers' service needs and prioritizing the company's resources. In addition to the QFD model, other quality tools and methods for eliciting customer needs are introduced, which a company can use together with the QFD model. The synthesis part first presents an MS Excel based QFD model. A trial of the model is carried out in a workshop-style meeting, and the research data are collected through observation, a theme interview, and a questionnaire.
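The QFD (House of Quality) prioritization at the core of such a model can be sketched in a few lines: customer-rated importance weights are propagated through a relationship matrix to rank the service characteristics. The needs, characteristics, weights, and 0/1/3/9 relationship scores below are hypothetical examples, not the thesis data:

```python
import numpy as np

# QFD-style prioritization: propagate customer-rated importance through a
# relationship matrix (common 0/1/3/9 scale) to rank service characteristics.
# All needs, characteristics, and scores are hypothetical examples.

customer_needs = ["fast response", "spare part availability", "remote diagnostics"]
need_importance = np.array([5, 3, 4])  # customer-rated importance, scale 1-5

service_characteristics = ["on-call staffing", "stock coverage", "IoT monitoring"]
relationships = np.array([  # rows = needs, columns = characteristics
    [9, 1, 3],
    [0, 9, 0],
    [3, 0, 9],
])

raw_scores = need_importance @ relationships
priorities = raw_scores / raw_scores.sum()  # relative priority of each characteristic

for name, p in zip(service_characteristics, priorities):
    print(f"{name}: {p:.1%}")
```

The resulting shares tell the company which characteristics of its service offering deserve resources first, which is exactly what a spreadsheet-based QFD matrix computes.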
Abstract:
The main objective of this study was to create, at a general level, a cost model for SMEs providing earthmoving machinery services, to support practical decision-making and the cost monitoring of both the decision-making organization and the sales personnel. The model was designed so that it can easily be adapted to examine the profitability of different sales situations; it thereby gives its users an advantage when negotiating long service agreements and sales contracts for various project sites, by simulating equipment transfer costs based on known cost factors. Comparable models have been presented in industry and in the service sector, but cost accounting models for earthmoving and machinery rental in particular are scarcely available publicly. Changes in the cost model were simulated and tested by creating different demand scenarios, one of which is presented in more detail in the testing section of the study. The model makes it easy to compile cost data into equations showing how new work sites become more profitable as the overall number of work sites grows. There was demand for the cost model, and information on the overall change, and on the effective hours at which corresponding acquisitions would be profitable, was considered important. The theoretical part of the study is based mainly on the theory of pricing, profitability, and investment calculations, as well as on articles, studies, and books. The empirical part is based on estimates of current price levels and of cost accumulation in SMEs providing earthmoving services, where the organizational structure is flat and operations are efficient. The key results concern how costs should be taken into account for different customers and what kind of cost model is useful in different situations.
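A minimal sketch of the kind of calculation such a cost model automates: simulating an equipment transfer cost and the resulting margin of a single work site. All rates, distances, and prices below are invented for illustration:

```python
# Equipment transfer cost and single-site margin sketch; all rates,
# distances, and prices are invented for illustration.

def transfer_cost(distance_km, transport_rate_per_km,
                  load_unload_hours, hourly_machine_cost):
    """Cost of moving one machine to a new work site."""
    return (distance_km * transport_rate_per_km
            + load_unload_hours * hourly_machine_cost)

def site_margin(contract_price, effective_hours,
                hourly_machine_cost, transfer):
    """Margin of a work site after operating and transfer costs."""
    return contract_price - effective_hours * hourly_machine_cost - transfer

transfer = transfer_cost(distance_km=120, transport_rate_per_km=3.5,
                         load_unload_hours=2, hourly_machine_cost=85.0)
margin = site_margin(contract_price=12_000, effective_hours=110,
                     hourly_machine_cost=85.0, transfer=transfer)
print(f"transfer cost: {transfer:.2f} EUR, site margin: {margin:.2f} EUR")
```

Sweeping `effective_hours` in such a function is one way to find the break-even utilization at which an acquisition becomes profitable, as discussed above.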
Abstract:
Abstract:
This thesis was carried out for a global electronics company. It addresses the challenge created by increased globalization and intensifying competition: the case company must determine how it can meet its growth targets in the future by acquiring new customers and by being increasingly present worldwide. The objective of the study was to find a suitable model for identifying and selecting potential key accounts, and to test and modify the chosen model according to the case company's needs. In particular, raw data collection, customer attractiveness criteria, and the target market niche required attention in the study. The literature review focused on business-to-business markets, different approaches to customer relationship management, and the definition of key accounts. The fundamentals of CRM, KAM, and Customer Insight thinking were presented together with different key account identification models. The chosen Cheverton model was tested and modified in the empirical part of the thesis. The empirical contribution of the study is a modified model for identifying potential key accounts. It helps decision makers proceed systematically and in an organized manner, step by step, toward a list of potential customers in a given market area. The thesis provides a tool for this process and lays a foundation for future research and actions.
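The attractiveness-scoring step common to key account identification models such as Cheverton's can be sketched as a simple weighted-criteria ranking. The criteria, weights, and candidate customers below are hypothetical, not from the case company:

```python
# Weighted attractiveness scoring for key account candidates; the criteria,
# weights, and customers are hypothetical, not from the case company.

criteria_weights = {"revenue potential": 0.4, "strategic fit": 0.3,
                    "growth rate": 0.2, "reference value": 0.1}

candidates = {
    "Customer A": {"revenue potential": 8, "strategic fit": 6,
                   "growth rate": 9, "reference value": 5},
    "Customer B": {"revenue potential": 6, "strategic fit": 9,
                   "growth rate": 5, "reference value": 8},
    "Customer C": {"revenue potential": 4, "strategic fit": 5,
                   "growth rate": 6, "reference value": 4},
}

def attractiveness(scores, weights):
    """Weighted score on the same 1-10 scale as the inputs."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank candidates from most to least attractive.
shortlist = sorted(candidates,
                   key=lambda n: attractiveness(candidates[n], criteria_weights),
                   reverse=True)
print(shortlist)
```

A real application would add a second axis (relative supplier strength) to place each candidate on a portfolio matrix rather than a flat list.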
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers today are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to only demonstrate that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models to address some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or missing tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools integrated with industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
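The core idea of generating tests from behavioral models can be illustrated with a toy example: treat the model as a labeled transition system and derive one test per transition by searching for a shortest path that exercises it. The login-flow model below is invented for illustration and is not a model from the thesis:

```python
from collections import deque

# A behavioral model as a labeled transition system: (state, event) -> state.
# Test generation = one shortest event sequence per transition (transition
# coverage). The login-flow model is invented for illustration.

transitions = {
    ("idle", "insert_card"): "waiting_pin",
    ("waiting_pin", "correct_pin"): "menu",
    ("waiting_pin", "wrong_pin"): "idle",
    ("menu", "logout"): "idle",
}

def generate_tests(initial="idle"):
    """For each transition, BFS a shortest path from the initial state to
    its source state, then append the transition's own event."""
    tests = []
    for (src, event), _dst in transitions.items():
        queue, seen = deque([(initial, [])]), {initial}
        while queue:
            state, path = queue.popleft()
            if state == src:
                tests.append(path + [event])
                break
            for (s, e), d in transitions.items():
                if s == state and d not in seen:
                    seen.add(d)
                    queue.append((d, path + [e]))
    return tests

tests = generate_tests()
for t in tests:
    print(" -> ".join(t))
```

Each generated event sequence is executed against the implementation, and the observed outputs are compared with those predicted by the model.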
Abstract:
Today's business environment has become increasingly unpredictable and fast-changing because of global competition. This new environment requires companies to organize their control differently, e.g. through logistic process thinking. Logistic process thinking in software engineering applies the principles of the production process to immaterial products. Processes must be optimized so that every phase adds value to the customer, and lead times can be cut shorter to meet the new customer requirements. The purpose of this thesis is to examine and optimize the testing processes of software engineering, concentrating on module testing, functional testing, and their interface. The concept of logistic process thinking is introduced through the production process, the value-added model, and process management. The theory of testing, based on the literature, is also presented, concentrating on module testing and functional testing. The testing processes of the Case Company are presented together with the project models in which they are implemented. The real-life practices in module testing, functional testing, and their interface are examined through interviews. These practices are analyzed against the processes and the testing theory, through which ideas for optimizing the testing process are introduced. The project world of the Case Company is also introduced, together with two example testing projects in different life cycle phases. The examples give a view of how much project effort goes into different types of testing.
Abstract:
Simulators are simplified models of particular parts of a system. They are used to model the external behavior of the components surrounding the component under test, so that the component under test is given an appropriate operating environment. State machines are used to model the behavior of software or its parts. In message-driven state machines, state transitions are triggered by incoming messages. This thesis presents an architecture for testing a subsystem of a software system; the architecture is based largely on using simulators to model the other subsystems. The software under test consists mostly of state machines that exchange messages with each other and thereby drive each other's state transitions. The test environment of this thesis was designed specifically for testing this kind of software. The test environment was also used for several months and was found to work well. Some usage instructions, user experiences, and improvement suggestions are discussed at the end of the thesis. In particular, it was observed how important it is to test the implementation already at the class level before moving on to subsystem-level testing, and it was concluded that the design phase should be more closely tied to subsystem testing.
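The architecture described above can be sketched minimally in code: the subsystem under test is a message-driven state machine, and a simulator answers its outgoing messages so that it runs in a realistic environment. The protocol, messages, and state names below are invented for illustration:

```python
# Hedged sketch: the subsystem under test is a message-driven state machine,
# and a simulator stands in for a neighboring subsystem by answering its
# outgoing messages. Protocol and state names are invented for illustration.

class MessageDrivenStateMachine:
    """State transitions are triggered purely by incoming messages."""
    TRANSITIONS = {
        ("disconnected", "CONNECT_ACK"): "connected",
        ("connected", "DATA_ACK"): "connected",
        ("connected", "DISCONNECT_ACK"): "disconnected",
    }

    def __init__(self):
        self.state = "disconnected"
        self.outbox = []

    def send(self, request):
        self.outbox.append(request)

    def receive(self, message):
        # Unknown (state, message) pairs leave the state unchanged.
        self.state = self.TRANSITIONS.get((self.state, message), self.state)

class PeerSimulator:
    """Simulates the external behavior of the neighboring subsystem:
    it only answers requests and has no internal logic of its own."""
    REPLIES = {"CONNECT": "CONNECT_ACK", "DATA": "DATA_ACK",
               "DISCONNECT": "DISCONNECT_ACK"}

    def react(self, request):
        return self.REPLIES[request]

# Drive one test scenario: connect, exchange data, disconnect.
sut, peer = MessageDrivenStateMachine(), PeerSimulator()
for request in ["CONNECT", "DATA", "DISCONNECT"]:
    sut.send(request)
    sut.receive(peer.react(request))
print(sut.state)
```

A test case then asserts on the state of the machine under test and on the messages it emitted, exactly as the subsystem-level test environment does.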
Abstract:
Novel biomaterials are needed to fill the demand for tailored bone substitutes required by an ever-expanding array of surgical procedures and techniques. Wood, a natural fiber composite, modified with heat treatment to alter its composition, may provide a novel approach to the further development of hierarchically structured biomaterials. The suitability of wood as a model biomaterial, as well as the effects of heat treatment on the osteoconductivity of wood, was studied by placing untreated and heat-treated (at 220 °C, 200 °C, and 140 °C for 2 h) birch implants (size 4 × 7 mm) into drill cavities in the distal femur of rabbits. The follow-up period was 4, 8, and 20 weeks in all in vivo experiments. The flexural properties of wood, as well as dimensional changes and hydroxyapatite formation on the surface of wood (untreated, 140 °C and 200 °C heat-treated wood), were tested using 3-point bending and compression tests and immersion in simulated body fluid. The effect of premeasurement grinding and the effect of heat treatment on the surface roughness and contour of wood were tested with contact stylus and non-contact profilometry. The effects of heat treatment of wood on its interactions with biological fluids were assessed using two different test media and real human blood in liquid penetration tests. The results of the in vivo experiments showed implanted wood to be well tolerated, with no implants rejected due to foreign body reactions. Heat treatment had significant effects on the biocompatibility of wood, allowing host bone to grow into tight contact with the implant, with occasional bone ingrowth into the channels of the wood implant. The results of the liquid immersion experiments showed hydroxyapatite formation only in the most extensively heat-treated wood specimens, which supported the results of the in vivo experiments.
Parallel conclusions could be drawn based on the results of the liquid penetration test, where human blood had the most favorable interaction with the most extensively heat-treated wood of the compared materials (untreated, 140 °C and 200 °C heat-treated wood). The increasing biocompatibility was inferred to result mainly from changes in the chemical composition of wood induced by the heat treatment, namely the altered arrangement and concentrations of functional chemical groups. However, the influence of microscopic changes in the cell walls, surface roughness, and contour cannot be totally excluded. The heat treatment was hypothesized to produce a functional change in the liquid distribution within wood, which could have biological relevance. It was concluded that the highly evolved hierarchical anatomy of wood could yield information for the future development of bulk bone substitutes according to the ideology of bioinspiration. Furthermore, the results of the biomechanical tests established that heat treatment alters various biologically relevant mechanical properties of wood, thus expanding the possibilities of wood as a model material, which could include e.g. scaffold applications, bulk bone applications, and serving as a tool both for mechanical testing and for further development of synthetic fiber-reinforced composites.
Abstract:
This study investigates futures market efficiency and optimal hedge ratio estimation. First, cointegration between spot and futures prices is studied using the Johansen method with two different model specifications. If the prices are found to be cointegrated, restrictions on the cointegrating vector and adjustment coefficients are imposed to account for unbiasedness, weak exogeneity, and the prediction hypothesis. Second, optimal hedge ratios are estimated using static OLS and the time-varying DVEC and CCC models. In-sample and out-of-sample results for one, two, and five periods ahead are reported. The futures used in the thesis are RTS index, EUR/RUB exchange rate, and Brent oil futures, traded on the Futures and Options on RTS (FORTS) exchange. For the in-sample period, data points were acquired from the start of trading of each futures contract (RTS index from August 2005, EUR/RUB exchange rate from March 2009, and Brent oil from October 2008) until the end of May 2011. The out-of-sample period covers the start of June 2011 until the end of December 2011. Our results indicate that all three asset pairs, spot and futures, are cointegrated. We found RTS index futures to be an unbiased predictor of the spot price; for the exchange rate the evidence was mixed, and for Brent oil futures unbiasedness was not supported. Weak exogeneity results for all pairs indicated that the spot price leads in the price discovery process. The prediction hypothesis, i.e. unbiasedness and weak exogeneity of futures, was rejected for all asset pairs. Variance reduction results varied between assets, in-sample in the range of 40-85 percent and out-of-sample in the range of 40-96 percent. Differences between the models were found to be small, except for Brent oil, for which OLS clearly dominated. Out-of-sample results indicated exceptionally high variance reduction for the RTS index, approximately 95 percent.
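The static OLS hedge ratio and the variance-reduction measure used above can be sketched as follows: regress spot returns on futures returns, take the slope as the hedge ratio, and compare hedged and unhedged return variances. The return series below are synthetic, not FORTS data:

```python
import numpy as np

# Static OLS hedge ratio sketch on synthetic spot/futures returns.
rng = np.random.default_rng(0)
futures_ret = rng.normal(0.0, 0.02, 500)
# Spot co-moves with futures plus idiosyncratic noise (true ratio 0.9).
spot_ret = 0.9 * futures_ret + rng.normal(0.0, 0.005, 500)

# OLS with intercept: the slope on futures returns is the hedge ratio.
X = np.column_stack([np.ones_like(futures_ret), futures_ret])
beta, *_ = np.linalg.lstsq(X, spot_ret, rcond=None)
hedge_ratio = beta[1]

# Variance reduction: hedged vs. unhedged return variance.
hedged_ret = spot_ret - hedge_ratio * futures_ret
variance_reduction = 1.0 - hedged_ret.var() / spot_ret.var()
print(f"hedge ratio: {hedge_ratio:.3f}, variance reduction: {variance_reduction:.1%}")
```

The time-varying DVEC and CCC models replace the constant slope with a conditional-covariance-based ratio re-estimated each period, but the variance-reduction comparison is computed the same way.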
Abstract:
Recently, due to increasing total construction and transportation costs and the difficulties associated with handling massive structural components or assemblies, there has been growing financial pressure to reduce structural weight. Furthermore, advances in material technology, coupled with continuing advances in design tools and techniques, have encouraged engineers to vary and combine materials, offering new opportunities to reduce the weight of mechanical structures. These new lower-mass systems, however, are more susceptible to inherent imbalances, a weakness that can result in higher shock and harmonic resonances, which lead to poor structural dynamic performance. The objective of this thesis is the modeling of layered sheet steel elements in order to accurately predict their dynamic performance. During the development of the layered sheet steel model, numerical modeling, Finite Element Analysis, and Experimental Modal Analysis are applied in building a modal model of the layered sheet steel elements. Furthermore, to gain a better understanding of the dynamic behavior of layered sheet steel, several binding methods have been studied to understand and demonstrate how a binding method affects the dynamic behavior of layered sheet steel elements compared to a single homogeneous steel plate. Based on the developed layered sheet steel model, the dynamic behavior of a lightweight wheel structure, to be used as the structure for the stator of an outer-rotor Direct-Drive Permanent Magnet Synchronous Generator designed for high-power wind turbines, is studied.
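The modal-model idea can be illustrated on a minimal lumped-parameter system: natural frequencies follow from the eigenvalue problem K v = ω² M v. The two-mass chain below is a toy stand-in for a layered sheet steel element, with invented mass and stiffness values:

```python
import numpy as np

# Two-mass chain as a toy stand-in for a layered sheet steel element;
# natural frequencies solve K v = w^2 M v. Mass and stiffness are invented.
m = 1.0    # kg per lumped mass
k = 1.0e4  # N/m per spring
M = np.diag([m, m])
K = np.array([[2 * k, -k],
              [-k, 2 * k]])

# eigvalsh is valid here because M is a multiple of the identity, so
# M^-1 K stays symmetric; eigenvalues come back sorted ascending.
eigvals = np.linalg.eigvalsh(np.linalg.solve(M, K))
natural_freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
print(np.round(natural_freqs_hz, 2))
```

An Experimental Modal Analysis estimates the same modal parameters from measured frequency response functions, and comparing them with the finite element prediction is what validates the modal model.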
Abstract:
The open innovation paradigm states that the boundaries of the firm have become permeable, allowing knowledge to flow inwards and outwards to accelerate internal innovations and to take unused knowledge to the external environment, respectively. The successful implementation of open innovation practices in firms like Procter & Gamble, IBM, and Xerox, among others, suggests that it is a sustainable trend which could provide a basis for achieving competitive advantage. However, implementing open innovation can be a complex process which involves several domains of management, and whose terminology, classification, and practices have not been fully agreed upon. Thus, with many possible ways to address open innovation, the following research question was formulated: How could Ericsson LMF assess which open innovation mode to select depending on the attributes of the project at hand? The research followed the constructive research approach, which has the following steps: find a practically relevant problem, obtain a general understanding of the topic, innovate the solution, demonstrate that the solution works, show the theoretical contributions, and examine the scope of applicability of the solution. The research involved three phases of data collection and analysis: an extensive literature review of open innovation, strategy, business models, innovation, and knowledge management; direct observation of the environment of the case company through participative observation; and semi-structured interviews based on six cases involving multiple and heterogeneous open innovation initiatives. Results from the cases suggest that the selection of modes depends on multiple factors, with a stronger influence of factors related to strategy, business models, and resource gaps. Based on these and other factors found in the literature review and observations, it was possible to construct a model that supports approaching open innovation.
The model integrates perspectives from multiple domains of the literature review, observations inside the case company, and factors from the six open innovation cases. It provides steps, guidelines, and tools to approach open innovation and to assess the selection of modes. Measuring the impact of open innovation could take years; thus, implementing and testing the model in its entirety was not possible due to time limitations. Nevertheless, it was possible to validate the core elements of the model with empirical data gathered from the cases. In addition to constructing the model, this research contributed to the literature by increasing the understanding of open innovation, providing suggestions to the case company, and proposing future steps.
Abstract:
Over time, the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008, the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset or fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent development in portfolio management suggests that new innovations are slowly gaining ground, but they still need to be studied carefully. This thesis tries to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tries to reveal whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return.
The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation toward riskier assets while the market is turning bullish, without overweighting investments with high beta. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation remains highly dependent on the quality of the input estimates.
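The Black–Litterman posterior return formula at the heart of such a TAA model combines equilibrium returns with investor (here, VAR-generated) views by inverse-variance weighting. The two-asset covariance, equilibrium returns, and the single view below are illustrative numbers, not the thesis data:

```python
import numpy as np

# Black-Litterman posterior mean sketch; all inputs are illustrative.
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.02]])   # asset covariance
pi = np.array([0.05, 0.03])       # equilibrium (prior) returns
tau = 0.05                        # prior uncertainty scaling

P = np.array([[1.0, 0.0]])        # one absolute view on asset 1
Q = np.array([0.08])              # the view: asset 1 returns 8 %
Omega = np.array([[0.001]])       # view uncertainty (smaller = more confident)

# Posterior mean: inverse-variance weighting of prior and views.
ts_inv = np.linalg.inv(tau * Sigma)
middle = np.linalg.inv(ts_inv + P.T @ np.linalg.inv(Omega) @ P)
mu_bl = middle @ (ts_inv @ pi + P.T @ np.linalg.inv(Omega) @ Q)
print(np.round(mu_bl, 4))
```

With these numbers, the posterior mean for asset 1 lands between the 5 % prior and the 8 % view, pulled toward the view in proportion to its stated confidence; asset 2 also shifts because of its covariance with asset 1.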
Abstract:
In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) utilizes abstract behavioral models to automate test generation, thus decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, however, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is a tool that utilizes probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a System Under Test (SUT) and in turn carry out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line tool as well as a graphical user interface. The goal of this thesis project is twofold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, thereby removing the requirement of local installation, and 2) to design a user interface for this web application which adds new user interaction paradigms to the existing feature set of the tool. All phases of the MBPeT process will be realized via this single web deployment location, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development. The Vaadin framework handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java.
A number of experiments are run in a case study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the implemented solution in handling multiple concurrent users. The results support a successful solution with regard to the functional and performance criteria defined, while improvements and optimizations are suggested to further increase both of these factors.
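The probabilistic-model idea at the core of MBPeT can be illustrated with a small sketch: a Markov-style model of user behavior drives the generation of synthetic user sessions. The states, actions, and probabilities below are invented and are not taken from the tool itself:

```python
import random

# Probabilistic user-behavior model: from each state, a virtual user picks
# the next action with the given probability. States, actions, and
# probabilities are invented for illustration, not taken from MBPeT.
model = {
    "start":  [(0.8, "browse"), (0.2, "exit")],
    "browse": [(0.5, "browse"), (0.3, "buy"), (0.2, "exit")],
    "buy":    [(1.0, "exit")],
}

def generate_session(rng):
    """Random walk through the model; the visited actions form one session."""
    state, session = "start", []
    while state != "exit":
        r, acc = rng.random(), 0.0
        for p, nxt in model[state]:
            acc += p
            if r < acc:
                break
        state = nxt  # falls back to the last listed action on rounding slack
        session.append(state)
    return session

rng = random.Random(42)
sessions = [generate_session(rng) for _ in range(1000)]
buy_rate = sum("buy" in s for s in sessions) / len(sessions)
print(f"sessions ending in a purchase: {buy_rate:.1%}")
```

In a performance-testing setting, each generated action would be mapped to a concrete request against the SUT, with many such virtual users run concurrently while response times and stability are monitored.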