74 results for Data portal performance
Abstract:
The aim of the study is to examine whether equity funds investing in Finland exhibit performance persistence. The data consist of all Finnish equity funds that operated during the period 15 January 1998 to 13 January 2005, and the sample is free of survivorship bias. Performance is measured with the CAPM alpha as well as three- and four-factor alphas. In the empirical part, the performance persistence of the equity funds is tested with Spearman's rank correlation test. The evidence of performance persistence remained weak, although it appeared sporadically for all performance measures in some combinations of ranking and investment periods. Measured with the CAPM alpha, statistically significant performance persistence occurred clearly more often than with the other performance measures. The results support recent international studies according to which performance persistence often depends on the measurement method. Significance tests of the regression models used as performance measures show that the multifactor models explain equity fund returns better than the CAPM; the added variables significantly improve the explanatory power of the CAPM.
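For reference (standard background rather than material from the thesis itself), the alphas used as performance measures above are the intercepts of factor regressions on excess fund returns; the CAPM and the four-factor specification are sketched below, and the three-factor model simply omits the momentum term:

\[ R_{i,t} - R_{f,t} = \alpha_i + \beta_i (R_{m,t} - R_{f,t}) + \varepsilon_{i,t} \]
\[ R_{i,t} - R_{f,t} = \alpha_i + \beta_i (R_{m,t} - R_{f,t}) + s_i\,\mathit{SMB}_t + h_i\,\mathit{HML}_t + m_i\,\mathit{MOM}_t + \varepsilon_{i,t} \]

Persistence is then assessed by ranking the funds on the alpha estimated over a ranking period and testing with Spearman's rank correlation whether the ranking carries over to the subsequent investment period.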
Abstract:
The purpose of the study is to examine how state ownership affects firm performance in Finnish listed state-owned companies in which the state is the majority or a minority owner. Performance is studied with two different methods: first, stock returns are examined using Jensen's alpha, after which the companies' financial ratios are compared against their industries. The theoretical part of the study presents the benefits of privatization for a firm's financial performance, as well as the benefits of state ownership, and reviews the results of earlier empirical research on the effects of state ownership. The data used in the empirical part of this study were obtained from Datastream for the stock data and from Balance Consulting Oy for the financial ratios. The analysis of the full stock-return data with Jensen's alpha did not indicate that the state-owned companies operated inefficiently; rather, it showed that the companies were able to generate abnormal returns relative to their level of risk. Analysis of the data broken down to an annual level, by contrast, produced several negative alphas for the companies, i.e. signs of inefficiency in particular years. In addition, the analysis of the financial ratios showed that some of the state-owned companies were for the most part less efficient than their own industries, while others were able to outperform their industries.
Abstract:
The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged, and various parallel and distributed environments have been designed and implemented. Each of the environments, including hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best environment for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purposes of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of the environments, the networks connecting the processors in these environments were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important in cases where the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications. Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm combines the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm. The scheduling algorithm is executed by the communication thread, so the scheduling does not affect the execution of the parallel application. Performance results from MPIT show considerable improvements over conventional MPI applications.
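To make the overlap idea concrete, the following is a minimal, hypothetical sketch of a dedicated communication thread running alongside computation, written with mpi4py; it is not the thesis's MPIT implementation, and the work function, tags and chunk sizes are invented for illustration. It assumes the MPI library provides thread support (mpi4py requests MPI_THREAD_MULTIPLE by default) and would be launched with, for example, mpiexec -n 4 python mpit_sketch.py.

```python
# Hypothetical sketch of the MPIT idea: a dedicated communication thread
# overlaps message passing with computation. Not the thesis's actual code.
import queue
import threading

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

outbox = queue.Queue()   # results waiting to be shipped to rank 0
STOP = object()          # sentinel that shuts the communication thread down

def communicator():
    """Runs in its own thread: drains the outbox and sends results to rank 0."""
    while True:
        item = outbox.get()
        if item is STOP:
            break
        req = comm.isend(item, dest=0, tag=11)  # non-blocking send
        req.wait()

def compute(chunk):
    """Placeholder for the actual data-parallel work unit."""
    return sum(x * x for x in chunk)

if rank != 0:
    t = threading.Thread(target=communicator)
    t.start()
    # Computation continues while earlier results are being communicated.
    for chunk in ([rank] * 100_000, [rank + 1] * 100_000):
        outbox.put(compute(chunk))
    outbox.put(STOP)
    t.join()
else:
    # Rank 0 collects two results from every worker rank.
    for _ in range(2 * (size - 1)):
        print(comm.recv(source=MPI.ANY_SOURCE, tag=11))
```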
Abstract:
Due to intense international competition, demanding and sophisticated customers, and diverse, transformative technological change, organizations need to renew their products and services by allocating resources to research and development (R&D). Managing R&D is complex, but vital for many organizations to survive in a dynamic, turbulent environment. Thus, the increased interest among decision-makers in finding the right performance measures for R&D is understandable. The measures or evaluation methods of R&D performance can be utilized for multiple purposes: for strategic control, for justifying the existence of R&D, for providing information and improving activities, as well as for motivating and benchmarking. The earlier research in the field of R&D performance analysis has generally focused either on the activities and the factors and dimensions to be considered - e.g. strategic perspectives, purposes of measurement, levels of analysis, types of R&D or phases of the R&D process - prior to the selection of R&D performance measures, or on proposed principles or the actual implementation of the selection or design processes of R&D performance measures or measurement systems. This study aims at integrating the consideration of the essential factors and dimensions of R&D performance analysis into developed selection processes of R&D measures that have been applied in real-world organizations. The earlier models for corporate performance measurement that can be found in the literature are to some extent adaptable also to the development of measurement systems and the selection of measures in R&D activities. However, it is necessary to emphasize the special aspects related to the measurement of R&D performance in a way that makes the development of new approaches, especially for R&D performance measure selection, necessary. First, the special characteristics of R&D - such as the long time lag between the inputs and outcomes, as well as the overall complexity and difficult coordination of activities - give rise to problems in R&D performance analysis, such as the need for more systematic, objective, balanced and multi-dimensional approaches to R&D measure selection, as well as the incompatibility of R&D measurement systems with other corporate measurement systems and vice versa. Secondly, the above-mentioned characteristics and challenges bring forth the significance of the influencing factors and dimensions that need to be recognized in order to derive the selection criteria for measures and choose the right R&D metrics, which is the most crucial step in the measurement system development process. The main purpose of this study is to support the management and control of the research and development activities of organizations by increasing the understanding of R&D performance analysis, clarifying the main factors related to the selection of R&D measures, and providing novel types of approaches and methods for systematizing the whole strategy- and business-based selection and development process of R&D indicators. The final aim of the research is to support management in their R&D decision making with suitable, systematically chosen measures or evaluation methods of R&D performance. Thus, the emphasis in most sub-areas of the present research has been on promoting the selection and development process of R&D indicators with the help of different tools and decision support systems, i.e. the research has normative features, providing guidelines through novel types of approaches.
The gathering of data and the conducting of case studies in metal and electronics industry companies, in the information and communications technology (ICT) sector, and in non-profit organizations helped to formulate a comprehensive picture of the main challenges of R&D performance analysis in different organizations. This is essential, as recognition of the most important problem areas is a crucial element in the constructive research approach utilized in this study. Multiple practical benefits regarding the defined problem areas could be found in the various constructed approaches presented in this dissertation: 1) the selection of R&D measures became more systematic compared with the situation found in the empirical analysis, as it was common that no systematic approaches had been utilized in the studied organizations earlier; 2) the evaluation methods or measures of R&D chosen with the help of the developed approaches can be more directly utilized in decision-making, because of the thorough consideration of the purpose of measurement as well as the other dimensions of measurement; 3) more balance in the set of R&D measures was desired and gained through the holistic approaches to the selection processes; and 4) more objectivity was gained through organizing the selection processes, as the earlier systems were considered subjective in many organizations. Scientifically, this dissertation aims to contribute to the present body of knowledge of R&D performance analysis by facilitating the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing the selection of R&D performance measures, and by integrating these aspects into the developed novel types of approaches, methods and tools for the selection processes of R&D measures applied in real-world organizations. Throughout the research, the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing R&D performance measure selection, is strongly integrated with the constructed approaches. Thus, the research meets the above-mentioned purposes and objectives of the dissertation from both the scientific and the practical point of view.
Abstract:
The aim of the work was to examine how data warehousing can support decision making in a company. After describing the components and processes of a data warehouse, the different phases of a data warehouse project are discussed. The presented theory was applied in practice in a global metal industry company, where the data warehousing concept was tested. Based on the testing, the state of the existing data and the suitability of the two software products used for data warehousing were evaluated. The quality of the data in the company's operational systems was found to be, in the examined respects, inconsistent and incomplete. Therefore, direct company-wide use of the data for producing reliable, high-quality reports is difficult. In addition, inconsistencies were found between units in the business concepts used and in the ways the systems are used. The software products used in the testing handled basic data warehousing well, although some limitations and peculiarities emerged. The work can be regarded as a preliminary study to be carried out before an actual data warehouse project. As further steps, it is proposed that testing be continued with the current tools, targeting concrete results. The importance of data quality should be emphasized throughout the organization, and the quality of the existing data must be improved in the future.
Abstract:
This master's thesis was written for the UPM Net Services sa/nv department of the UPM-Kymmene Group in Brussels and Helsinki. The topic of the work, Data communication in paper sales environment, was defined to cover subjects related to the paper sales system. The current paper sales system is first treated in theory, and the related software products and tool programs are introduced. Improvements to the current system are considered from the viewpoints of software design, efficiency, data management, information security and business. In the practical part of the thesis, two programs are presented. These programs were written for UPM Net Services in order to gain experience of message-based data transfer. The conclusions of the thesis state that data transfer in the paper sales system works reliably today. Future needs and improvements are, however, difficult to implement with the tools currently in use. In particular, exploiting the Internet is seen as important, but it is difficult to introduce in the current system. Message-based systems have proven to work in practice, and the most important development proposal is the introduction of a message-passing system.
Abstract:
The thesis examines threaded programming at the upper hierarchy level of parallel programming, focusing in particular on hyper-threading technology. The work considers the advantages and disadvantages of hyper-threading and its effects on parallel algorithms. The goal of the work was to understand the implementation of hyper-threading in the Intel Pentium 4 processor and to make it possible to exploit it where it provides a performance benefit. Performance data was collected and analyzed by running a large set of benchmarks under different conditions (memory handling, compiler settings, environment variables, and so on). Two types of algorithms were examined: matrix operations and sorting. These applications have a regular memory access pattern, which is a double-edged sword: it is an advantage in arithmetic-logical processing, but on the other hand it degrades memory performance. The reason is that modern processors have very good raw performance when processing regular data, whereas the memory architecture is limited by cache sizes and various buffers. When the problem size exceeds a certain limit, the actual performance can drop to a fraction of the peak performance.
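The performance cliff described at the end of the abstract can be illustrated with a small experiment. The sketch below is illustrative only, not taken from the thesis: it times a regular streaming operation over arrays of growing size, and the per-element cost typically rises once the working set no longer fits in the processor caches.

```python
# Illustrative micro-benchmark: the per-element cost of a regular streaming
# operation often rises once the working set outgrows the processor caches.
import time

import numpy as np

def ns_per_element(n_elements, repeats=20):
    data = np.ones(n_elements)             # 8-byte doubles, regular layout
    out = np.empty_like(data)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        np.multiply(data, 2.0, out=out)    # streams the whole working set
        best = min(best, time.perf_counter() - t0)
    return best / n_elements * 1e9

for n in (10_000, 100_000, 1_000_000, 10_000_000, 40_000_000):
    print(f"{n:>11} elements ({n * 8 / 1024:>9.0f} KiB): "
          f"{ns_per_element(n):.2f} ns/element")
```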
Abstract:
The goal of the work was to determine the performance of the logistics services of the Fenix sales and logistics management system used at Stora Enso Oyj, to produce a client application for managing the data generated by the performance measurements, and to produce an implementation plan for improving performance. Performance was measured using the features provided by TUXEDO. For evaluating the results of the performance measurements, a client application was built that could produce the required summary information on the durations and structures of the services. No ready-made solutions were available, so all the required software was built as part of this work. All component interfaces were implemented so that services other than those related to logistics can also be measured if needed. The average execution times obtained from the measurements were used in preparing the implementation plan. The implementation plan contains development ideas for several logistics areas with which the performance of Fenix's logistics services can be improved and the current response time of the system maintained in the future. The measures presented in the implementation plan will be carried out at TietoEnator Oyj during 2003.
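As an illustration of the kind of summary such a client application produces, the hedged sketch below aggregates per-call service durations into per-service statistics; the CSV input format, column names and file name are hypothetical, and this is not the software built in the thesis.

```python
# Hypothetical sketch: aggregate measured service call durations into
# per-service summary statistics (count, mean, 95th percentile, maximum).
import csv
import statistics
from collections import defaultdict

def summarize(path):
    """Reads rows of 'service,duration_ms' and returns per-service stats."""
    durations = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            durations[row["service"]].append(float(row["duration_ms"]))
    summary = {}
    for service, values in durations.items():
        values.sort()
        summary[service] = {
            "calls": len(values),
            "mean_ms": statistics.fmean(values),
            "p95_ms": values[int(0.95 * (len(values) - 1))],
            "max_ms": values[-1],
        }
    return summary

if __name__ == "__main__":
    for service, stats in sorted(summarize("service_timings.csv").items()):
        print(service, stats)
```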
Abstract:
The primary goal of this study was to determine the current state of the performance of Schauman Wood Oy's purchasing process in the company's units in Finland. The current-state assessment was carried out using both new and existing measurement results. The study compared the purchasing processes of ten production plants with one another. The central research problem was to identify the factors that cause differences in purchasing process performance between the units. The aim of the study was to achieve more uniform ways of working in the company and to extend the use of the group's purchasing organization in procurement. The goal was to streamline the purchasing process and to develop a more effective monitoring system. Continuous improvement of the performance of purchasing operations is partly based on the information provided by new measures and on more precise monitoring. Internal benchmarking was used as a tool for identifying performance differences. Information on different ways of working was collected by interviewing the company's buyers and mill service managers at the different mill sites. With the help of internal benchmarking, the differences in ways of working were identified and a scorecard was developed in which each unit is compared with the best and most developed unit. As results of the work, proposals for new purchasing measures were formed. The new measures are efficiency measures that describe the efficiency of resource use and help to monitor the state of the purchasing process better than before. The new measures also aim to reduce the possibility of manipulating the measures. The technical implementation of the information technology systems was excluded from the scope of the work. One of the company's production plants was also excluded, because its purchasing processes are considerably less developed than those of Schauman Wood's other mills; the development of that unit has to start at the grassroots level. The theoretical part of the study is based on professional literature in the field and on fairly recent scientific articles on the topic. The purpose of the theory is to support the empirical part and to give the reader new views on the many possibilities of purchasing. The results of the study are a current-state analysis, proposals for new purchasing measures, and a proposal for an outsourcing trial for MRO products. The company's purchasing should develop from the operational level towards a more strategic level. Management commitment to procurement development projects is particularly important, and procurement should be seen as a more strategic area in the company. Developing procurement can significantly increase the company's cost efficiency.
Abstract:
Recent advances in machine learning methods increasingly enable the automatic construction of various types of computer-assisted methods that have been difficult or laborious to program by human experts. The tasks for which such tools are needed arise in many areas, here especially in the fields of bioinformatics and natural language processing. Machine learning methods may not work satisfactorily if they are not appropriately tailored to the task in question. However, their learning performance can often be improved by taking advantage of deeper insight into the application domain or the learning problem at hand. This thesis considers developing kernel-based learning algorithms that incorporate this kind of prior knowledge of the task in question in an advantageous way. Moreover, computationally efficient algorithms for training the learning machines for specific tasks are presented. In the context of kernel-based learning methods, prior knowledge is often incorporated by designing appropriate kernel functions. Another well-known way is to develop cost functions that fit the task under consideration. For disambiguation tasks in natural language, we develop kernel functions that take into account the positional information and the mutual similarities of words. It is shown that the use of this information significantly improves the disambiguation performance of the learning machine. Further, we design a new cost function that is better suited to the task of information retrieval and to more general ranking problems than the cost functions designed for regression and classification. We also consider other applications of the kernel-based learning algorithms, such as text categorization and pattern recognition in differential display. We develop computationally efficient algorithms for training the considered learning machines with the proposed kernel functions. We also design a fast cross-validation algorithm for a regularized least-squares type of learning algorithm. Further, an efficient version of the regularized least-squares algorithm that can be used together with the new cost function for preference learning and ranking tasks is proposed. In summary, we demonstrate that the incorporation of prior knowledge is possible and beneficial, and that novel advanced kernels and cost functions can be used efficiently in algorithms.
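As background to the regularized least-squares learner mentioned above (standard kernel methods material, not the thesis's specific algorithms or kernels), the dual solution has a simple closed form, a = (K + lambda*I)^-1 y; a minimal sketch with a Gaussian kernel:

```python
# Minimal kernel regularized least-squares (RLS) sketch as background to the
# abstract; not the thesis's specific algorithms or kernels.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix between the row vectors of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def fit_rls(X, y, lam=1.0, gamma=1.0):
    """Dual coefficients a = (K + lam*I)^-1 y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, a, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ a

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
a = fit_rls(X, y, lam=0.1, gamma=0.5)
print(predict(X, a, np.array([[0.0], [1.5]]), gamma=0.5))
```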
Abstract:
Market orientation is the organizational culture that creates the behaviors necessary for continuously providing additional value for customers and thus continuous superior performance for the business. The field of market orientation has been studied repeatedly during the past two decades. Yet research has concentrated on large firms in large domestic markets, creating a need to diversify the research. The master's thesis at hand examined the general incidence of market orientation among SMEs from five different industries as well as its consequences for SME performance. The empirical part of the thesis was conducted with a web-based survey that resulted in 255 responses. The survey data were analyzed by statistical analysis. The incidence of market orientation varied among dimensions, and market orientation did not show any direct effect on firm performance. Customer orientation was the only dimension that showed a direct (positive) effect. In contrast, moderating effects were found, which indicates that the effect of market orientation in SMEs is influenced by other factors that should receive further attention. Industry-specific differences were also discovered and should be examined further.
Abstract:
Performance standards for positron emission tomography (PET) were developed to make it possible to compare systems from different generations and manufacturers. This resulted in the NEMA methodology in North America and the IEC methodology in Europe. In practice, NEMA NU 2-2001 is the method of choice today. These standardized methods allow assessment of the physical performance of new commercial dedicated PET/CT tomographs. The point spread in image formation is one of the factors that blur the image; the phenomenon is often called the partial volume effect. Several methods for correcting for partial volume are under research, but no real agreement exists on how to solve it. The influence of the effect varies in different clinical settings, and it is likely that new methods are needed to solve this problem. Most clinical PET work is done in the field of oncology. Whole-body PET combined with CT is the standard investigation in oncology today. Despite the progress in PET imaging technique, visualization and especially quantification of small lesions remains a challenge. In addition to partial volume, the movement of the object is a significant source of error. The main causes of movement are respiratory and cardiac motion. Most new commercial scanners are, in addition to cardiac gating, also capable of respiratory gating, and this technique has been used in patients with cancer of the thoracic region and in patients being studied for the planning of radiation therapy. For routine cardiac applications, such as assessment of viability and perfusion, only cardiac gating has been used. However, new targets such as plaque imaging or molecular imaging of new therapies require better control of cardiac motion, including the component caused by respiration. To overcome these problems in cardiac work, a dual gating approach has been proposed. In this study we investigated the physical performance of a new whole-body PET/CT scanner with the NEMA standard, compared methods for partial volume correction in PET studies of the brain, and developed and tested a new robust method for dual cardiac-respiratory gated PET with phantom, animal and human data. The results from the performance measurements showed the feasibility of the new scanner design in 2D and 3D whole-body studies. Partial volume was corrected, but there is no single best method among those tested, as the correction also depends on the radiotracer and its distribution; new methods need to be developed for proper correction. The dual gating algorithm generated is shown to handle dual-gated data, preserving quantification and clearly eliminating the majority of contraction and respiration movement.
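For context (standard PET background rather than a result of this thesis), the partial volume effect mentioned above is commonly modelled as the true activity distribution blurred by the scanner's point spread function, and the resulting underestimation for small hot objects is often quantified with a recovery coefficient:

\[ I_{\text{measured}}(\mathbf{r}) = (I_{\text{true}} * \mathrm{PSF})(\mathbf{r}), \qquad \mathrm{RC} = \frac{\text{measured activity concentration}}{\text{true activity concentration}} \]

with RC falling below 1 as the object size approaches the scanner resolution, which is why small lesions are difficult to quantify.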
Abstract:
The purpose of this study is to investigate the performance persistence of international mutual funds, employing a data sample that includes 2,168 European mutual funds investing in the Asia-Pacific region, excluding Japan. In addition, a number of performance measures are tested and compared; in particular, this study tries to find out whether an iterative Bayesian procedure can be used to provide more accurate predictions of future performance. Finally, this study examines whether the cross-section of mutual fund returns can be explained with simple accounting variables and market risk. To exclude the effect of the Asian currency crisis of 1997, the studied time period covers the years 1999 to 2007. The overall results showed significant performance persistence for repeat winners when performance was tested with contingency tables. The annualized alpha spreads between the top and bottom portfolios were also more than ten percent at their highest. Nevertheless, the results do not confirm the improved prediction accuracy of the Bayesian alphas.
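As background on the contingency-table tests mentioned above (a standard persistence test formulation, not notation taken from the thesis), funds are classified as winners (W) or losers (L) in two consecutive periods, and persistence is judged from the cross-product (odds) ratio of the resulting 2x2 table:

\[ \mathrm{CPR} = \frac{WW \cdot LL}{WL \cdot LW}, \qquad z = \frac{\ln \mathrm{CPR}}{\sqrt{\frac{1}{WW} + \frac{1}{WL} + \frac{1}{LW} + \frac{1}{LL}}} \]

where WW counts repeat winners, LL repeat losers, and so on; a CPR above one with a significantly positive z indicates the kind of repeat-winner persistence reported above.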
Abstract:
The main purpose of this thesis is to investigate winner-loser performance when financial markets are facing a crisis. This is examined by asking whether the prior loser portfolios outperform the prior winner portfolios during three major crises: the depression of the 1990s, the IT bubble and the subprime crisis. First, the relative performance of the winner and loser portfolios is determined using cumulative excess returns over the examination period. The portfolios were formed by computing the excess returns and ranking the stocks by them; the excess returns are computed using one year of data preceding the actual examination period. The results of this part did not support the findings of De Bondt & Thaler's (1985) paper. Second, it is investigated how Finnish and US macroeconomic factors affect stock market valuation in the Finnish stock market during economic crises. This is done to better explain the changes in winner-loser performance. A different set of selected macro factors was available for each crisis, and the two latest crises also included a few selected US macro factors. Of the three, only the IT bubble crisis produced statistically significant results with the US factors; the two other crises did not yield statistically significant results. An additional analysis was carried out to study whether the US macro factors have a more significant impact on the Finnish stock exchange after lags. The selected lags were three, six, nine and twelve months. US macro factors lagged by three and six months improved the results for the IT bubble crisis, but the additional analysis did not improve the results for the subprime crisis.
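For reference (the standard definition rather than the thesis's own notation), the cumulative excess return used to rank the stocks into winner and loser portfolios over a formation period of T months is

\[ \mathit{CAR}_{i,T} = \sum_{t=1}^{T} \left( R_{i,t} - R_{m,t} \right) \]

where R_{i,t} is the return of stock i and R_{m,t} the market return in month t; the highest-CAR stocks form the winner portfolio and the lowest-CAR stocks the loser portfolio.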
Abstract:
The research around performance measurement and management has focused mainly on the design, implementation and use of performance measurement systems. However, there is little evidence about the actual impacts of performance measurement on the different levels of business and operations of organisations, or about the underlying factors that lead to a positive impact of performance measurement. The study thus focuses on this research gap, which can be considered both important and challenging to cover. The first objective of the study was to examine the impacts of performance measurement on different aspects of management, leadership and the quality of working life, after which the factors that facilitate and improve performance and performance measurement at the operative level of an organisation were examined. The second objective was to study how these factors operate in practice. The third objective focused on the construction of a framework for successful operative level performance measurement and the utilisation of the factors in organisations. The research objectives have been studied through six research papers utilising empirical data from three separate studies, comprising two sets of interview data and one set of quantitative data. The study applies mainly the hermeneutical research approach. As a contribution of the study, a framework for successful operative level performance measurement was formed by matching the findings of the current study with performance measurement theory. The study extends the prior research regarding the impacts of performance measurement and the factors that have a positive effect on operative level performance and performance measurement. The results indicate that, under suitable circumstances, performance measurement has positive impacts on different aspects of management, leadership, and the quality of working life. The results also reveal, for example, that the employees' and the management's perceptions of the impacts of performance measurement on leadership style differ considerably. Furthermore, the fragmented literature has been reorganised into six factors that facilitate and improve the performance of the operations and employees, and the use of performance measurement at the operative level of an organisation. Regarding the managerial implications of the study, managers who work with performance measurement can utilise the framework, for example, by putting its different phases into practice.