25 results for Technology Readiness Level
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
This thesis examined the commercialisation of public research through the case of the research organisation VTT. The aim of the study was to determine how public research can generally be commercialised, and what the phases, challenges and critical success factors of the commercialisation process are. The study is qualitative in nature and was carried out through personal interviews, supplemented with written sources. The study showed that at VTT, the results of publicly funded research are commercialised through licensing, commissioned (i.e. contract) research, spin-off activities, alliances, and various services such as manufacturing, testing and analysis services. From VTT's perspective, the most significant commercialisation channel is contract (commissioned) research, followed by licensing and spin-off activities. The choice of commercialisation method depends on numerous factors, such as the organisation's commercialisation strategy, the type of innovation, the maturity of the technology, the nature of the knowledge, the characteristics of the industries and markets, the exploitability and protection of the innovation, the value creation potential, and the motivation of individuals. The choice must therefore be made case by case. The path to market for technologies originating from public research can be challenging, as the technologies have been developed in a non-commercial environment. The study found that the commercialisation success factors perceived as most critical related to the organisation, the technology, market and customer need, the protection of intellectual property, and speed to market.
Abstract:
High availability is an essential part of modern, integrated enterprise systems. As companies become international, information must be available around the clock, which places ever stricter requirements on the availability of individual system components. Growing information system integration, in turn, makes the nodes of the system critical to the business. This thesis examines the characteristics of distributed systems and the challenges they pose. The technologies presented include middleware, clusters and load balancing. The Java 2 Enterprise Edition (J2EE) technology used as the foundation of enterprise applications is covered in its essential parts. The BEA WebLogic Server software is used as the application server platform, and its features are reviewed from the perspective of distribution. In the practical part of the thesis, a high-availability application server environment is implemented for two different existing enterprise applications, taking the constraints set by the applications into account.
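The load-balancing idea mentioned above can be illustrated with a minimal round-robin sketch. The server names are invented, and a real cluster such as WebLogic uses far more elaborate strategies (health checks, session affinity); this only shows how skipping failed nodes keeps the cluster available:

```python
import itertools

class RoundRobinBalancer:
    """Distributes incoming requests evenly over a pool of servers.
    Servers marked unavailable are skipped, so the pool keeps serving
    as long as at least one node is up (the high-availability idea)."""

    def __init__(self, servers):
        self.up = {s: True for s in servers}
        self._cycle = itertools.cycle(servers)

    def mark_down(self, server):
        self.up[server] = False

    def mark_up(self, server):
        self.up[server] = True

    def next_server(self):
        # One full lap over the pool is enough to find any live server.
        for _ in range(len(self.up)):
            s = next(self._cycle)
            if self.up[s]:
                return s
        raise RuntimeError("no servers available")

# Invented three-node cluster
lb = RoundRobinBalancer(["app1", "app2", "app3"])
print([lb.next_server() for _ in range(4)])  # round-robin: app1, app2, app3, app1
lb.mark_down("app2")
print([lb.next_server() for _ in range(3)])  # app2 is skipped
```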
Abstract:
The aim of this Master's thesis is to clarify and align innovation processes with respect to radical technology innovations. A prerequisite for aligning the processes is understanding the radicalness of a technology innovation. The focus is on the early-phase innovation process, during which formal cooperation is not yet possible. With the developed model, the radicalness experienced by a company in the early phase of the innovation process can be assessed. It is important to understand how the interacting companies perceive the radicalness of the innovation in the early phase of the process. The differences between the processes can be said to depend on the companies' perspectives, since their activities are determined by the perceived radicalness. Radicalness perceived as equal creates a basis for open interaction and indicates deeper cooperation in the future. The proposed alignment of the processes of vertically positioned companies is based on mutual radicalness and a corresponding level of internal acceptance. The parties are then involved with equal commitment while the continuation of the process remains uncertain.
Abstract:
There are several alternatives for valuing the future opportunities of firms. The traditional appraisal methods for single projects, such as net present value, internal rate of return and payback rules, have been criticized in recent years. It has been said that they do not take into account all growth opportunities of firms. At the company level, business valuation is traditionally based on financial and market information. Yield estimates, net worth values and market values of shares are commonly used. Naturally, all valuation methods have their own strengths and shortcomings. In the background of most estimation rules there is the idea that the future of the firms is quite clear and predictable. However, in recent times the business environment of most companies has changed in a more unpredictable direction and the effects of uncertainty have increased. There has been a growing interest in estimating the risks and values of future possibilities. The aim of the current paper is to describe the difference between the value of future opportunities in information technology firms and forest companies, and also to analyse the backgrounds for the observed gap.
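The three traditional single-project appraisal rules named above can be sketched briefly. The cash flows and the 10% discount rate are invented for illustration; this is not the paper's own computation:

```python
def npv(rate, cash_flows):
    """Net present value of cash_flows[t] received at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection: the rate at which NPV = 0.
    Assumes NPV is decreasing in the rate (one sign change in the flows)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid  # NPV still positive, so the break-even rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cash_flows):
    """First year in which the cumulative (undiscounted) cash flow
    turns non-negative, or None if the project never pays back."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None

# Invented example: invest 1000 now, receive 400 a year for four years.
flows = [-1000, 400, 400, 400, 400]
print(round(npv(0.10, flows), 2))  # NPV at a 10% discount rate
print(round(irr(flows), 4))        # rate at which NPV crosses zero
print(payback_period(flows))       # year the cumulative flow reaches zero
```

Note that none of the three rules assigns any value to flexibility or growth options beyond the listed cash flows, which is exactly the criticism the abstract refers to.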
Abstract:
There is a broad consensus among economists that technological change has been a major contributor to productivity growth and, hence, to the growth of material welfare in western industrialized countries, at least over the last century. Paradoxically, this issue has not been the focal point of theoretical economics. At the same time, we have witnessed the rise of the importance of technological issues at the strategic management level of business firms. Interestingly, research has not accurately responded to this challenge either. The tension between the overwhelming empirical evidence of the importance of technology and its relative omission in the research offers a challenging target for a methodological endeavor. This study deals with the question of how different theories cope with technology and explain technological change. The focus is at the firm level and the analysis concentrates on metatheoretical issues, except for the last two chapters, which examine the problems of strategic management of technology. Here the aim is to build a new evolutionary-based theoretical framework to analyze innovation processes at the firm level. The study consists of ten chapters. Chapter 1 poses the research problem and contrasts the two basic approaches, neoclassical and evolutionary, to be analyzed. Chapter 2 introduces the methodological framework, which is based on the methodology of isolation. Methodological and ontological commitments of the rival approaches are revealed and basic questions concerning their ways of theorizing are elaborated. Chapters 3-6 deal with the so-called substantive isolative criteria.
The aim is to examine how different approaches cope with such critical issues as the inherent uncertainty and complexity of innovative activities (cognitive isolations, chapter 3), the boundedness of rationality of innovating agents (behavioral isolations, chapter 4), the multidimensional nature of technology (chapter 5), and governance costs related to technology (chapter 6). Chapters 7 and 8 put all these things together and look at the explanatory structures used by the neoclassical and evolutionary approaches in the light of substantive isolations. The last two chapters of the study utilize the methodological framework and tools to appraise different economics-based candidates in the context of strategic management of technology. The aim is to analyze how different approaches answer the fundamental question: how can firms gain competitive advantages through innovations, and how can the rents appropriated from successful innovations be sustained? The last chapter introduces a new evolutionary-based technology management framework. The largely omitted issues of entrepreneurship are also examined.
Abstract:
Information Technology (IT) outsourcing has traditionally been seen as a means to acquire new resources and competencies to perform standard tasks at lowered cost. This dissertation challenges the thought that outsourcing should be limited to non-strategic systems and components, and presents ways to maximize outsourcing-enabled benefits while minimizing associated risks. In this dissertation IT outsourcing is approached as an efficiency improvement and value-creation process rather than a sourcing decision. The study focuses on when and how to outsource information technology, and presents a new set of critical success factors for outsourcing project management. In a case study it re-validates the theory-based proposition that in certain cases and situations it is beneficial to partly outsource also strategic IT systems. The main contribution of this dissertation is the validation of the proposal that in companies where the level of IT competency is high, managerial support is established and planning processes are well defined, it is possible to safely outsource also business-critical IT systems. A model describing the critical success factors in such cases is presented based on existing knowledge in the field and the results of the empirical study. This model further highlights the essence of aligning IT and business strategies, assuming a long-term focus on partnering, and the overall target of outsourcing being to add to the strengths of the company rather than to eliminate weaknesses.
Abstract:
The objective of this thesis is to find out how information and communication technology affects the global consumption of printing and writing papers. Another objective is to find out whether these effects differ between paper grades. The empirical analysis is conducted by linear regression analysis using three sets of country-level panel data from 1990-2006. The newsprint data set contains 95 countries, the uncoated woodfree paper data set 61 countries, and the coated mechanical paper data set 42 countries. The material is based on paper consumption data from RISI's Industry Statistics Database and on information and communication technology data from the GMID database. The results indicate that the number of Internet users has a statistically significant negative effect on the consumption of newsprint and of coated mechanical paper, while the number of mobile telephone users has a positive effect on the consumption of these papers. The results also indicate that information and communication technologies have only a small effect, or no significant effect at all, on the consumption of uncoated woodfree paper, although these results are somewhat more uncertain.
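The kind of pooled linear regression described above, consumption regressed on Internet and mobile penetration, can be sketched as follows. The data here are synthetic (the actual study used RISI and GMID country-level panels), and the coefficient values are invented purely to generate the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic country-year observations

# Invented explanatory variables: Internet and mobile users per 100 people
internet = rng.uniform(0, 80, n)
mobile = rng.uniform(0, 100, n)

# Synthetic "newsprint consumption" with a negative Internet effect
# and a positive mobile effect, plus noise (coefficients are made up)
noise = rng.normal(0, 2, n)
consumption = 50 - 0.30 * internet + 0.10 * mobile + noise

# Ordinary least squares: stack a constant and the regressors, solve for beta
X = np.column_stack([np.ones(n), internet, mobile])
beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
intercept, b_internet, b_mobile = beta
print(b_internet)  # recovered slope, close to the planted -0.30
print(b_mobile)    # recovered slope, close to the planted +0.10
```

A real panel analysis would add country fixed effects and income controls; this sketch only shows the sign pattern the abstract reports for newsprint.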
Abstract:
The aim of the thesis was to develop an internal report used by company management so that it better meets management's needs. The purpose of the theoretical part was to examine, based on the literature, the use of internal reports as a management tool, the requirements of a good report, and general reporting principles. The empirical part was carried out through interviews and by analysing the internal reports of the case company. The aim of the empirical part was to identify the problems of the previous report, find solutions to these problems, and draw up a new reporting model that better serves the report's users. Based on the theoretical part, it can be concluded that the development of reporting is influenced most by the report's intended use and by the people whose working tool the report will be. Developing a report is a continuous process, since the report must adapt to business and organisational changes. The new reporting model presented in the empirical part serves the case company on several levels. It supports the work of management and provides, at the company level, a tool for better cost control and monitoring.
Abstract:
The study examined how scenario analysis can be used in researching new technology. It was found that the suitability of scenario analysis is affected most by the level of technological change and the nature of the available information. The scenario method is well suited to the study of new technologies, particularly in the case of radical innovations. The reason is the great uncertainty, complexity and paradigm shift associated with them, which render many other futures research methods unusable in such a situation. In the empirical part of the thesis, the future of grid computing technology was studied by means of scenario analysis. Grids were seen as a potential disruptive technology which, as a radical innovation, may shift computing from the current product-based purchasing of computing capacity to a service-based model. This would have a major impact on the whole current ICT industry, particularly thanks to the utilisation of on-demand computing. The study examined developments up to 2010. Based on theory and existing knowledge, and drawing on strong expert knowledge, four possible environmental scenarios were formed for grid computing. The scenarios showed that the commercial success of the technology still faces many challenges. In particular, trust and the creation of added value emerged as the most important factors driving the future of grids.
Abstract:
Quality is strengthening its position in business as companies compete in international markets on both price and quality. This trend has given rise to numerous quality programmes, which are widely used in implementing companies' total quality management (TQM). Quality management covers all of a company's operations and also sets requirements for developing and improving the company's support functions. These include the subject of this study, information technology management (IT). The aim of the thesis was to describe the current state of the IT process. The process description produced in the thesis is based on the theory of process management and on the quality award criteria used by the target company. Thematic interviews were used as the research method for determining the current state of the process. To determine the current state of the process and the requirements set for it, customers of the IT process were interviewed. A process analysis, the identification of the most important sub-processes, and the discovery of areas for improvement are the central results of this thesis. The thesis focused on finding the weaknesses of the IT process and targets for improvement as a basis for continuous development, rather than on radical process redesign. The thesis presents the principles of TQM, quality tools, and the terminology, principles and systematic implementation of process management. The work also gives a picture of how TQM and process management fit together in a company's quality work.
Abstract:
As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the powerful expression of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the same tight constraints on e.g. size, power consumption and price with embedded systems, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. The dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and relatively low time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture. The architecture offers a high degree of parallelism and modularity, and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written with SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hw/sw codesign and simulation and an extendable library of automatically configured reusable hardware blocks.
Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. A simulation model of a processor for TCP/IP packet validation was designed and tested as a test case for the environment.
Abstract:
Globalization is a trend realized in all areas of today's business world. Pressure for cost reduction, changes in the market situation and available scale economies have made the business environment more global than ever. To respond to the new situation, companies are establishing global strategies. In this thesis, the global competitive advantages available in the electrical machine industry are studied in the context of gaining them through global technology transfers. In the theory part, establishing a global strategy and competitive advantage is considered in connection with global sourcing and supply chain management. Additionally, market development in the 21st century and its impact on global strategies is studied. In practice, global manufacturing is enabled by technology transfer projects. Smooth and fast project implementation enables a faster and more flexible production ramp-up. Once production starts, the available competitive advantages can be realized. In this thesis the present state of technology transfer projects and the risks and advantages related to global manufacturing are analyzed. The analysis of implemented technology transfer projects indicates that project implementation is at a good level. For further development of project execution, 10 minor suggestions are presented along with two major ones: higher-level standardization and the development of a product information model to better support global manufacturing.
Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. In this thesis, two new distance transforms for gray level images are presented. As a new application for distance transforms, they are applied to gray level image compression. The new distance transforms are both extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-value distance transform (EDTOCS) on gray level images. Both distance transforms, the DTOCS and the EDTOCS, require only two passes over the gray level image and are extremely simple to implement. Only two image buffers are needed: the original gray level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3 to 10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance function algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way. The DTOCS gives a weighted version of the chessboard distance map. The weights are not constant, but gray value differences of the original image. The difference between the DTOCS map and other distance transforms for gray level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray level differences in a different way.
It propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Commonly, distance transforms are used for feature extraction in pattern recognition and learning. Their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group of methods compares the DTOCS distance to the binary image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. the compression ratio. A new morphological image decompression scheme is also presented, the 8 kernels' method. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
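The two-pass idea described above can be sketched as a chamfer-style transform in the spirit of the DTOCS: each step to a chessboard neighbour costs one plus the absolute gray-level difference between the pixels, so on a flat image the result reduces to the plain chessboard distance. This is a simplified illustration, not the thesis's exact algorithm; the function and argument names are invented:

```python
import numpy as np

def dtocs_like(gray, seed_mask, iterations=2):
    """Two-pass chamfer-style transform: distance grows by
    1 + |gray difference| per chessboard step from the seed pixels."""
    h, w = gray.shape
    d = np.where(seed_mask, 0.0, np.inf)  # seeds start at distance 0
    # Neighbours already visited when (y, x) is reached in a forward raster scan
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]
    bwd = [(-dy, -dx) for dy, dx in fwd]

    def relax(y, x, offsets):
        for dy, dx in offsets:
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                step = 1 + abs(float(gray[y, x]) - float(gray[ny, nx]))
                if d[ny, nx] + step < d[y, x]:
                    d[y, x] = d[ny, nx] + step

    for _ in range(iterations):
        for y in range(h):                      # forward pass
            for x in range(w):
                relax(y, x, fwd)
        for y in range(h - 1, -1, -1):          # backward pass
            for x in range(w - 1, -1, -1):
                relax(y, x, bwd)
    return d

# On a flat image the transform is the ordinary chessboard distance
flat = np.zeros((3, 3))
seeds = np.zeros((3, 3), dtype=bool)
seeds[0, 0] = True
print(dtocs_like(flat, seeds))  # corner-to-corner distance is 2
```

Note the two-buffer property the abstract mentions: only the gray image and the distance buffer are needed, and complicated images may require extra iteration rounds.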
Abstract:
The research around performance measurement and management has focused mainly on the design, implementation and use of performance measurement systems. However, there is little evidence about the actual impacts of performance measurement on the different levels of business and operations of organisations, or about the underlying factors that lead to a positive impact of performance measurement. The study thus focuses on this research gap, which can be considered both important and challenging to cover. The first objective of the study was to examine the impacts of performance measurement on different aspects of management, leadership and the quality of working life, after which the factors that facilitate and improve performance and performance measurement at the operative level of an organisation were examined. The second objective was to study how these factors operate in practice. The third objective focused on the construction of a framework for successful operative-level performance measurement and the utilisation of the factors in the organisations. The research objectives have been studied through six research papers utilising empirical data from three separate studies, including two sets of interview data and one set of quantitative data. The study applies mainly the hermeneutical research approach. As a contribution of the study, a framework for successful operative-level performance measurement was formed by matching the findings of the current study with performance measurement theory. The study extends the prior research regarding the impacts of performance measurement and the factors that have a positive effect on operative-level performance and performance measurement. The results indicate that, under suitable circumstances, performance measurement has positive impacts on different aspects of management, leadership, and the quality of working life.
The results reveal that, for example, the employees' and the management's perceptions of the impacts of performance measurement on leadership style differ considerably. Furthermore, the fragmented literature has been reorganised into six factors that facilitate and improve the performance of the operations and employees, and the use of performance measurement at the operative level of an organisation. Regarding the managerial implications of the study, managers who work with performance measurement can utilise the framework, for example, by putting its different phases into practice.
Abstract:
This thesis was produced for the Technology Marketing unit at the Nokia Research Center. Technology marketing was a new function at the Nokia Research Center and needed an established framework with the capacity to take multiple aspects into account when measuring team performance. Technology marketing functions had existed in other parts of Nokia, yet no single method had been agreed upon for measuring their performance. The purpose of this study was to develop a performance measurement system for Nokia Research Center Technology Marketing. The target was that Nokia Research Center Technology Marketing would have a framework of separate metrics, including benchmarking of the starting level and target values for future planning (numeric values were kept confidential within the company). As a result of this research, the Balanced Scorecard model of Kaplan and Norton was chosen as the performance measurement system for Nokia Research Center Technology Marketing. This research selected the indicators which were utilized in the chosen performance measurement system. Furthermore, the performance measurement system was defined to guide the Head of Marketing in managing the Nokia Research Center Technology Marketing team. During the research process the team mission, vision, strategy and critical success factors were outlined.