23 results for Collection development (Libraries)

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The collections policy of the National Library of Finland guides the formation and development of collections. The policy also outlines the National Library’s collection, its national and international significance, and the guidelines for acquisition, selection, donation and disposal.

Relevance:

30.00%

Publisher:

Abstract:

In research on organizational trust, trust is usually seen as a phenomenon between individuals, such as an employee's trust in co-workers, the supervisor, or top management. Organizational trust, however, also has a non-personified dimension, so-called institutional trust. To date, only a few researchers have included institutional trust as part of organizational trust in their studies. The aim of this work is to develop the concept of institutional trust and an instrument for observing it in an organizational environment. The development process consisted of three phases. In the first phase, the items for the instrument were developed and their content validity was assessed. The second phase comprised data collection, item reduction, and the comparison of alternative models. In the third phase, construct validity and reliability were evaluated. The empirical part of the work was carried out as an Internet survey among adult students. Principal component analysis and confirmatory factor analysis were used to analyze the data. Institutional trust consists of two dimensions: capability and fairness. Capability consists of five subcomponents: the organization of operational activities, the stability of the organization, capability in business and people management, technological reliability, and competitiveness. Fairness, in turn, consists of HRM practices, the spirit of fair play prevailing in the organization, and communication. The final instrument comprises 18 items for capability and 13 items for fairness. The instrument developed in this work enables better and more reliable measurement of organizational trust. To the researcher's knowledge, this is the first comprehensive instrument for measuring institutional trust.
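
The two-stage analysis described above, exploratory principal component analysis followed by confirmatory factor analysis, can be illustrated with a minimal sketch of the first stage. Everything in it is hypothetical: the item grouping, the simulated responses, and the use of scikit-learn, which the abstract does not mention.

```python
# Minimal sketch of the exploratory stage of scale development: run PCA
# over Likert-scale survey items and inspect how items load on the
# components. Data and item grouping are simulated placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# 200 respondents x 6 items (1..5 Likert). Items 0-2 are meant to tap
# one dimension (e.g. capability), items 3-5 another (e.g. fairness).
capability = rng.integers(1, 6, size=(200, 1)) + rng.normal(0, 0.5, (200, 3))
fairness = rng.integers(1, 6, size=(200, 1)) + rng.normal(0, 0.5, (200, 3))
responses = np.hstack([capability, fairness])

pca = PCA(n_components=2)
pca.fit(responses)

# If each item loads strongly on exactly one component, the data support
# a two-dimensional structure such as capability/fairness.
for i, component in enumerate(pca.components_):
    print(f"component {i + 1} loadings:", np.round(component, 2))
print("explained variance:", np.round(pca.explained_variance_ratio_, 2))
```

The confirmatory stage would then fix this two-factor structure in advance and test its fit in dedicated structural equation modelling software, which is beyond this sketch.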

Relevance:

30.00%

Publisher:

Abstract:

1. Introduction "The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49.1 These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases,2 the EDD, into Finnish Copyright legislation in 1998. Now in the year 2005, after more than half a decade of the domestic implementation it is yet uncertain as to the proper meaning and construction of the convoluted qualitative criteria the current legislation employs as a prerequisite for the database protection both in Finland and within the European Union. Further, this opaque Pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular and currently peculiarly European new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD and second, to realise the potential and risks inherent in the new legislation in economic, cultural and societal dimensions. 2. Subject-matter of the study: basic issues The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD such as presentation of independent information, what constitutes an essential investment in acquiring data and when the reproduction of a given database reaches either qualitatively or quantitatively the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework remain unclear and call for a careful analysis. As for second task, it is already obvious that the practical importance of the legal protection providedby the database right is in the rapid increase. The accelerating transformationof information into digital form is an existing fact, not merely a reflection of a shape of things to come in the future. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer a markedly easier and faster access to the wanted material and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form or even free by means of public lending libraries providing access to the information online. This also renders it possible for authors and publishers to make available and sell their products to markedly larger, international markets while the production and distribution costs can be kept at minimum due to the new electronic production, marketing and distributionmechanisms to mention a few. The troublesome side is for authors and publishers the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed. 
The fear of illegal copying canlead to stark technical protection that in turn can dampen down the demand for information goods and services and furthermore, efficiently hamper the right of access to the materials available lawfully in electronic form and thus weaken the possibility of access to information, education and the cultural heritage of anation or nations, a condition precedent for a functioning democracy. 3. Particular issues in Digital Economy and Information Networks All what is said above applies a fortiori to the databases. As a result of the ubiquity of the Internet and the pending breakthrough of Mobile Internet, peer-to-peer Networks, Localand Wide Local Area Networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables,3previously protected partially by the old section 49 of the Finnish Copyright act are available free or for consideration in the Internet, and by the same token importantly, numerous databases are collected in order to enable the marketing, tendering and selling products and services in above mentioned networks. Databases and the information embedded therein constitutes a pivotal element in virtually any commercial operation including product and service development, scientific research and education. A poignant but not instantaneously an obvious example of this is a database consisting of physical coordinates of a certain selected group of customers for marketing purposes through cellular phones, laptops and several handheld or vehicle-based devices connected online. These practical needs call for answer to a plethora of questions already outlined above: Has thecollection and securing the validity of this information required an essential input? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, the database comprises works, information and other independent materials, which are arranged in systematic or methodical way andare individually accessible by electronic or other means. Under what circumstances then, are the materials regarded as arranged in systematic or methodical way? Only when the protected elements of a database are established, the question concerning the scope of protection becomes acute. In digital context, the traditional notions of reproduction and making available to the public of digital materials seem to fit ill or lead into interpretations that are at variance with analogous domain as regards the lawful and illegal uses of information. This may well interfere with or rework the way in which the commercial and other operators have to establish themselves and function in the existing value networks of information products and services. 4. International sphere After the expiry of the implementation period for the European Community Directive on legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislations of the current twenty-five Member States within the European Union. On one hand, these fundamental questions readily imply that the problemsrelated to correct construction of the Directive underlying the domestic legislation transpire the national boundaries. On the other hand, the disputes arisingon account of the implementation and interpretation of the Directive on the European level attract significance domestically. 
Consequently, the guidelines on correct interpretation of the Directive importing the practical, business-oriented solutions may well have application on European level. This underlines the exigency for a thorough analysis on the implications of the meaning and potential scope of Database protection in Finland and the European Union. This position hasto be contrasted with the larger, international sphere, which in early 2005 does differ markedly from European Union stance, directly having a negative effect on international trade particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, not at least yet having aSui Generis database regime or its kin, while both the political and academic discourse on the matter abounds. 5. The objectives of the study The above mentioned background with its several open issues calls for the detailed study of thefollowing questions: -What is a database-at-law and when is a database protected by intellectual property rights, particularly by the European database regime?What is the international situation? -How is a database protected and what is its relation with other intellectual property regimes, particularly in the Digital context? -The opportunities and threats provided by current protection to creators, users and the society as a whole, including the commercial and cultural implications? -The difficult question on relation of the Database protection and protection of factual information as such. 6. Dsiposition The Study, in purporting to analyse and cast light on the questions above, is divided into three mainparts. The first part has the purpose of introducing the political and rationalbackground and subsequent legislative evolution path of the European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information andcommunication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of the database protection, reviewing both itscopyright and Sui Generis right facets in detail together with the emergent application of the machinery in real-life societal and particularly commercial context. Furthermore, a general outline of copyright, relevant in context of copyright databases is provided. For purposes of further comparison, a chapter on the precursor of Sui Generi, database right, the Nordic catalogue rule also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinize the implications further in the future with some caveats and tentative recommendations, in particular as regards the convoluted issue concerning the IPR protection of information per se, a new tenet in the domain of copyright and related rights.

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the development of advanced silicon radiation detectors and their characterization by simulations, used in the search for elementary particles at the European Organization for Nuclear Research, CERN. Silicon particle detectors will face extremely harsh radiation in the proposed upgrade of the Large Hadron Collider, the future high-energy physics experiment Super-LHC. The increase in the maximal fluence and beam luminosity, up to 10^16 n_eq/cm^2 and 10^35 cm^-2 s^-1, will require detectors with dramatically improved radiation hardness, since such a fluence is far beyond the operational limits of present silicon detectors. The main goals of detector development concentrate on minimizing radiation degradation. This study contributes mainly to device engineering technology for developing more radiation-hard particle detectors with better characteristics; defect engineering technology is also discussed. In the region nearest to the beam in the Super-LHC, the only choice is 3D detectors, or alternatively replacing other types of detectors every two years. Interest in 3D silicon detectors is continuously growing because of their many advantages over conventional planar detectors: the devices can be fully depleted at low bias voltages, the speed of charge collection is high, and the collection distances are about one order of magnitude shorter than those of planar-technology strip and pixel detectors, whose electrodes are limited to the detector surface. The 3D detectors also exhibit high radiation tolerance, which extends the ability of silicon detectors to operate after irradiation. Two parameters, the full depletion voltage and the electric field distribution, are discussed in more detail in this study. Full depletion of the detector is important because only the depleted volume of the detector is active for particle tracking. Similarly, a high electric field makes the detector volume sensitive, while low-field regions are insensitive to particles. This study presents simulation results for the full depletion voltage and the electric field distribution of various types of 3D detectors. First, a 3D detector with an n-type substrate and partially penetrating p-type electrodes is studied. A detector of this type has a low electric field on the pixel side, and it suffers from type inversion. Next, the substrate is changed to p-type, and detectors having electrodes of a single doping type and of dual doping types are examined. The electric field profile in a dual-column 3D Si detector is more uniform than that in a single-type-column 3D detector. The dual-column detectors are the best in radiation hardness because of their low depletion voltages and short drift distances.
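
For orientation, the full depletion voltage discussed above follows, for a simple planar detector, the standard textbook relation below; it is quoted here as general background, not from the thesis. Because the voltage scales with the square of the electrode distance, replacing the substrate thickness with the roughly ten times shorter inter-electrode spacing of a 3D detector reduces the depletion voltage by about two orders of magnitude.

```latex
% Full depletion voltage of a planar silicon detector (standard relation):
%   q       elementary charge
%   N_eff   effective doping concentration of the substrate
%   d       substrate thickness (for a 3D detector, effectively the
%           inter-electrode spacing, about an order of magnitude smaller)
%   eps_Si  permittivity of silicon
V_{fd} \approx \frac{q\,N_{\mathrm{eff}}\,d^{2}}{2\,\varepsilon_{\mathrm{Si}}}
```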

Relevance:

30.00%

Publisher:

Abstract:

Rapid development in recent years has accelerated the drug development process. Combinatorial chemistry has made it possible to synthesize large collections of structurally diverse molecules, so-called combinatorial libraries, for biological screening. In screening, the structure-related activity of the molecules is examined with a variety of biological assays to find potential "hits", some of which may later be developed into new drug substances. For the results of biological studies to be reliable, the synthesized compounds must be as pure as possible. High-throughput (HTP) purification is therefore needed to guarantee high-quality compounds and reliable biological data. Constantly growing throughput requirements have led to the automation and parallelization of these purification techniques. Preparative LC/MS is well suited to the fast and efficient purification of combinatorial libraries. Many factors, such as the properties of the separation column and the eluent gradient, affect the efficiency of the preparative LC/MS purification process. These parameters must be optimized to obtain the best result. In this work, basic compounds were studied under different eluent conditions. A method for determining the purity level of combinatorial libraries after LC/MS purification was optimized, and the purity of some compounds from different libraries was determined before purification.
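
The purity determination mentioned at the end reduces, in its simplest form, to a peak-area ratio over the chromatogram. The sketch below shows only that arithmetic with invented peak data; the detector choice, integration settings, and acceptance limits of the actual optimized method are not given in the abstract.

```python
# Minimal sketch: purity as the target compound's share of the total
# integrated peak area in a chromatogram. All peak data are invented.
from dataclasses import dataclass

@dataclass
class Peak:
    retention_time_min: float
    area: float        # integrated detector response
    is_target: bool    # identified as the desired compound (e.g. by its m/z)

def purity_percent(peaks: list[Peak]) -> float:
    total = sum(p.area for p in peaks)
    target = sum(p.area for p in peaks if p.is_target)
    return 100.0 * target / total if total > 0 else 0.0

chromatogram = [
    Peak(1.2, 120.0, False),   # early-eluting impurity
    Peak(3.4, 4500.0, True),   # target compound
    Peak(5.1, 380.0, False),   # late-eluting by-product
]
print(f"purity: {purity_percent(chromatogram):.1f} %")  # 90.0 %
```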

Relevance:

30.00%

Publisher:

Abstract:

In the software industry, long and difficult development cycles can be eased by using software frameworks. A framework is a collection of classes that provides generic solutions to the needs of a specific problem domain, freeing software developers to concentrate on application-specific requirements. Using well-designed frameworks increases the reuse of both design solutions and source code more than any other design approach. Knowledge of a given domain can be stored in frameworks, from which finished software products can then be specialized. This thesis describes the design and implementation of a software framework based on software agents. The main focus of the work is on describing the design, corresponding to the requirements specification, and the implementation of a framework from which software capable of various kinds of data collection in the Internet environment can be specialized. The experimental part of the work also presents an example application based on the framework developed in the thesis.
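
The specialization idea, a framework fixing the generic control flow while the application fills in the domain-specific parts, can be shown with a minimal sketch. The class and method names are illustrative inventions, not the ones designed in the thesis.

```python
# Minimal framework sketch in the template-method style: the framework
# class fixes a data-collecting agent's collect-parse-store cycle, and
# an application specializes it by overriding the hook methods.
from abc import ABC, abstractmethod

class DataCollectionAgent(ABC):
    """Framework class: generic control flow of an Internet data collector."""

    def run(self) -> None:
        for source in self.sources():   # hook: where to collect from
            raw = self.fetch(source)    # hook: how to collect
            record = self.parse(raw)    # hook: domain-specific parsing
            self.store(record)          # default behaviour, overridable

    @abstractmethod
    def sources(self) -> list[str]: ...

    @abstractmethod
    def fetch(self, source: str) -> str: ...

    @abstractmethod
    def parse(self, raw: str) -> dict: ...

    def store(self, record: dict) -> None:
        print("stored:", record)

class PriceAgent(DataCollectionAgent):
    """Application code: only the domain-specific parts are written here."""

    def sources(self) -> list[str]:
        return ["https://example.org/prices"]

    def fetch(self, source: str) -> str:
        return "widget;9.90"            # stub standing in for a real HTTP call

    def parse(self, raw: str) -> dict:
        name, price = raw.split(";")
        return {"product": name, "price": float(price)}

PriceAgent().run()
```

The reuse comes from the fact that `run` never changes: every specialized agent inherits the same tested control flow and supplies only its own hooks.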

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to create a business model that supports market creation for wireless mobile services in emerging markets. The theoretical part examines the key elements of developing a business model for wireless mobile services in the CIS countries. The theoretical chapter results in a framework with which a business model for mobile services can be developed. The empirical part of the thesis was carried out as a case study whose objective was market creation for wireless mobile services in the CIS countries. The main source of empirical data was theme interviews. The findings of the empirical part are compared with the corresponding results of the theoretical chapter. The results show that creating a market for a radical high-technology innovation is a slow process that requires patience from the company. Market, technology, and strategy uncertainties bring uncertainty to the emerging industry and market, which makes developing a business model difficult. The most important factor is marketing the services rather than the technology. The key capability in market creation is learning, not knowing.

Relevance:

30.00%

Publisher:

Abstract:

The Kenyan forestry and sawmilling industries have been subject to a changing environment since 1999, when the industrial forest plantations were closed down. This has lowered the raw material supply, and it has reduced sawmill operations and the viability of sawmill enterprises. The capacity of the 276 registered sawmills is not sufficient to fulfill the demand for sawn timber in Kenya. This is because of technological degradation and the lack of a qualified labor force, caused by the absence of sawmilling education and further training in Kenya. The lack of competent sawmill workers has led to low raw material recovery, underutilization of resources, and loss of employment. The objective of the work was to suggest models, methods and approaches for the competence and capacity development of the Kenyan sawmilling industry, the sawmills and their workers. A nationwide field survey, interviews, a questionnaire and a literature review were used for data collection to identify the sawmills' competence development areas and to suggest models and methods for their capacity building. The sampling frame included 22 sawmills that represented 72.5% of all the registered sawmills in Kenya. The results confirmed that the sawmills' technological level was backward, productivity low, raw material recovery unacceptable, and the workers' professional education low. The future challenge will be how to establish the sawmills' capacity building and the workers' competence development. Sawmilling industry development requires various actions through new development models and approaches. Activities should be started for technological development and for workers' competence development. This requires the re-starting of vocational training in sawmilling and the establishment of more effective co-operation between the sawmills and their stakeholder groups. In competence development, the Enterprise Competence Management Model of Nurminen (2007) can be used, whereas the best training model and approach would be a practically oriented learning-at-work model in which short courses, technical assistance and extension services would be the key functions.

Relevance:

30.00%

Publisher:

Abstract:

The increasing incidence of type 1 diabetes has led researchers on a quest to find the reason behind this phenomenon. The rate of increase is too great to be caused simply by changes in the genetic component, and many environmental factors are under investigation for their possible contribution. These studies require, however, the participation of the individuals most likely to develop the disease, and the approach chosen by many is to screen vast populations for persons with increased genetic risk factors. The participating individuals are then followed for signs of disease development, and their exposure to suspected environmental factors is studied. The main purpose of this study was to find a suitable tool for easy and inexpensive screening of certain genetic risk markers for type 1 diabetes. The method should be applicable to whole blood dried on sample collection cards as sample material, since shipping and storing samples in this format is preferred. However, screening vast sample libraries of extracted genomic DNA should also be possible, if such a need should arise, for example when studying the effect of newly discovered genetic risk markers. The method developed in this study is based on homogeneous assay chemistry and an asymmetric polymerase chain reaction (PCR). The generated single-stranded PCR product is probed by lanthanide-labelled, LNA (locked nucleic acid)-spiked short oligonucleotides with exactly complementary sequences. In the case of a perfect match, the probe hybridizes to the product. However, if even a single-nucleotide difference occurs, the probe binds, instead of the PCR product, to a complementary quencher oligonucleotide labelled with a dabcyl moiety, causing the signal of the lanthanide label to be quenched. The method was applied to screening the well-known type 1 diabetes risk alleles of the HLA-DQB1 gene. The method was shown to be suitable as an initial screening step covering thousands of samples in the scheme used in the TEDDY (The Environmental Determinants of Diabetes in the Young) study to identify individuals at increased genetic risk. The method was further developed into a dry-reagent format to allow an even simpler approach to screening: the reagents needed in the assay are stored dry in the reaction vessel, and performing the assay requires only the addition of the sample and, if necessary, water to rehydrate the reagents. This allows the assay to be successfully executed even by a person with minimal laboratory experience.
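
At the data-analysis end, the hybridization logic described above (a probe that matched the PCR product keeps its lanthanide signal, a mismatched probe is quenched) reduces to a threshold rule per allele-specific probe. The sketch below illustrates such a rule; the threshold and signal values are invented, and the actual TEDDY eligibility criteria are more involved than this.

```python
# Hedged sketch of allele calling from probe fluorescence. A probe that
# hybridized to the single-stranded PCR product (perfect match) keeps a
# high signal; a mismatched probe is captured by the dabcyl quencher
# oligonucleotide and its signal stays low. Threshold and data invented.
THRESHOLD = 5.0  # signal-to-background ratio; hypothetical value

def present_alleles(signals: dict[str, float]) -> set[str]:
    """Alleles whose probe signal was not quenched."""
    return {allele for allele, s in signals.items() if s >= THRESHOLD}

RISK_ALLELES = {"DQB1*02", "DQB1*03:02"}  # well-known HLA-DQB1 risk alleles

sample = {"DQB1*02": 11.3, "DQB1*03:02": 1.2, "DQB1*06:02": 0.9}
called = present_alleles(sample)
print("alleles called:", called)
print("increased-risk flag:", bool(called & RISK_ALLELES))
```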

Relevance:

30.00%

Publisher:

Abstract:

The size and complexity of software development projects are growing very fast. At the same time, the proportion of successful projects is still quite low according to previous research. Although almost every project team knows the main areas of responsibility that would help finish a project on time and on budget, this knowledge is rarely used in practice. It is therefore important to evaluate the success of existing software development projects and to suggest a method for evaluating success chances that can be used in software development projects. The main aim of this study is to evaluate the success of projects in the selected geographical region (Russia-Ukraine-Belarus). The second aim is to compare existing models of success prediction and to determine their strengths and weaknesses. The research was carried out as an empirical study. A survey with structured forms and theme-based interviews were used as the data collection methods. Information gathering was done in two stages: in the first stage, the project manager or someone with similar responsibilities answered the questions over the Internet; in the second stage, the participant was interviewed, and his or her answers were discussed and refined. This made it possible to get accurate information about each project and to avoid errors. It was found that there are many problems in software development projects. These problems are widely known and have been discussed in the literature many times. The research showed that most of the projects have problems with schedule, requirements, architecture, quality, and budget. A comparison of two models of success prediction showed that The Standish Group model overestimates problems in a project, while McConnell's model can help to identify problems in time and avoid trouble in the future. A framework for evaluating success chances in distributed projects is suggested. The framework is similar to The Standish Group model, but it is customized for distributed projects.
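
A checklist-style success-prediction model of the kind compared here can be sketched as a weighted score mapped to a risk band. The questions, weights, and band limits below are invented for illustration; they are neither McConnell's actual items nor the framework proposed in the thesis.

```python
# Illustrative sketch of a checklist-based success-chance estimate:
# sum the weights of practices the project follows and map the ratio
# to a band. Questions, weights, and band limits are hypothetical.
QUESTIONS = {
    "requirements are written down and agreed": 3,
    "schedule was estimated, not dictated": 3,
    "architecture was reviewed before coding": 2,
    "quality assurance runs throughout the project": 2,
}

def success_band(answers: dict[str, bool]) -> str:
    score = sum(w for q, w in QUESTIONS.items() if answers.get(q))
    ratio = score / sum(QUESTIONS.values())
    if ratio >= 0.8:
        return "good chances"
    if ratio >= 0.5:
        return "at risk"
    return "in trouble"

project = {
    "requirements are written down and agreed": True,
    "schedule was estimated, not dictated": False,
    "architecture was reviewed before coding": True,
    "quality assurance runs throughout the project": True,
}
print(success_band(project))  # 7/10 of the weight -> "at risk"
```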

Relevance:

30.00%

Publisher:

Abstract:

The target company of this study is a large machinery company which is, inter alia, engaged in the energy and pulp engineering, procurement and construction management (EPCM) supply business. The main objective of this study was to develop the target company's cost estimation by providing more accurate, reliable and up-to-date information through the enterprise resource planning (ERP) system. Another objective was to find cost-effective methods of collecting total cost of ownership information to support more informed supplier selection decision making. This study is primarily action-oriented but also constructive, and it can be divided into two sections: a theoretical literature review and an empirical study of the above-mentioned part of the target company's business. The development of information collection is based, in addition to the literature review, on nearly 30 qualitative interviews with employees at various organizational units, functions and levels of the target company. At the core of the development was making the initial data more accurate, reliable and available, a necessary prerequisite for informed use of the information. Development suggestions and paths were presented in order to regain confidence in the ERP system as an information source, by reorganizing the work breakdown structure and by complementing mere cost information with quantitative, technical and scope information. Several methods of using the information ever more effectively were also discussed. While implementation of the development suggestions was beyond the scope of this study, it was taken forward in a test environment and among interest groups.
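
Total cost of ownership, named above as the basis for supplier selection, is in essence the purchase price plus every other cost the buying organization incurs over the item's life. The component names and figures below are invented placeholders, not the target company's actual cost breakdown.

```python
# Minimal total-cost-of-ownership sketch for comparing two supplier
# offers. Cost components and figures are hypothetical placeholders.
def total_cost_of_ownership(costs: dict[str, float]) -> float:
    return sum(costs.values())

offers = {
    "Supplier A": {"purchase price": 100_000, "freight": 6_000,
                   "installation supervision": 4_000, "expected rework": 9_000},
    "Supplier B": {"purchase price": 108_000, "freight": 3_500,
                   "installation supervision": 4_000, "expected rework": 1_500},
}

for name, costs in offers.items():
    print(f"{name}: {total_cost_of_ownership(costs):,.0f}")
# Supplier B wins on TCO (117,000 vs 119,000) despite its higher
# purchase price, which is the point of collecting TCO information.
```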

Relevance:

30.00%

Publisher:

Abstract:

Because of the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management, however, is not only about managing technology but also about managing processes and people. This thesis focuses on traditional data management and on the performance management of production processes, both of which can be seen as requirements for long-lasting development. Some operative BI solutions are also considered in describing the ideal state of the reporting system. The objectives of this study are to examine what requirements the effective performance management of production processes sets for a company's data management and reporting, and to see how these affect its efficiency. The research was carried out as a theoretical literature review of the subjects and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through theoretical frameworks and through active participant observation. To get a better picture of the ideal state of the reporting system, simple investment calculations were performed. According to the results of the research, the requirements for effective performance management of production processes are automation of data collection, integration of operative databases, the use of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and the efficient management of processes, data and roles.
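
The ETL process named in the results is the mechanical core of such a reporting pipeline: extract rows from an operative source, transform them into the measures the report needs, and load them into the warehouse. The sketch below is a deliberately minimal in-memory illustration; the field names and the yield measure are invented, not the case company's.

```python
# Minimal ETL sketch: extract rows from an operative source, transform
# them into the shape a data warehouse fact table expects, and load.
# All data, field names, and the in-memory "warehouse" are invented.

def extract() -> list[dict]:
    # Stands in for a query against an operative production database.
    return [
        {"batch": "B-101", "kg_in": 1000.0, "kg_out": 932.0},
        {"batch": "B-102", "kg_in": 1000.0, "kg_out": 958.0},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Derive the performance measure the report needs: process yield.
    return [
        {"batch": r["batch"],
         "yield_pct": round(100.0 * r["kg_out"] / r["kg_in"], 1)}
        for r in rows
    ]

warehouse: list[dict] = []

def load(rows: list[dict]) -> None:
    # Stands in for an insert into a data warehouse fact table.
    warehouse.extend(rows)

load(transform(extract()))
print(warehouse)  # [{'batch': 'B-101', 'yield_pct': 93.2}, ...]
```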

Relevance:

30.00%

Publisher:

Abstract:

Social media has become a part of many people's everyday lives. In the library field the adoption of social media has been widespread, and discussion of the development of "Library 2.0" began at an early stage. The aim of this thesis is to study the interface between public libraries, social media, and users, focusing on information activities. The main research question is: how is the interface between public libraries and social media perceived and acted upon by its main stakeholders (library professionals and users)? The background of Library 2.0 is strongly associated with the development of the Web and social media, as well as with public libraries and their user-centered and information-technological development. The theoretical framework builds on research within Library and Information Science concerning information behavior, information practice, and information activities. Earlier research on social media and public libraries is also highlighted in this thesis. The methods of survey and content analysis were applied to map the interface between social media and public libraries: a questionnaire was handed out to users and another was sent to library professionals, and the results were statistically analyzed; in the content analysis, public library Facebook pages were studied. All the empirical investigations were conducted in the region of Finland Proper. An integrated analysis of the results deepens the understanding of the key elements of the social media and public library context: interactivity, information activities, perceptions, and stakeholders. In this context seven information activities were distinguished: reading, seeking, creating, communicating, informing, mediating, and contributing. This thesis contributes to the development of research concerning information activities and draws a realistic picture of the challenges and opportunities in the social media and public library context. It also contributes knowledge on library professionals and library users, and on the existing differences in their perceptions of the interface between libraries and social media.

Relevance:

30.00%

Publisher:

Abstract:

Netnography has been studied from various aspects (e.g. definitions of netnography, applications of netnography, conducting procedures…) within different industrial contexts. There are likewise many studies of new product development from various perspectives, such as new product development models, the management of new product development projects, or the interaction between customers and new product design. However, the connection and interaction between netnography and new product development have not yet been studied. This opens opportunities for the writer to explore unrevealed issues regarding the application of netnography in new product development. Concerning the relation between netnography and new product development, numerous matters need to be explored: for instance, the process of applying netnography so that it benefits new product development, the degree of involvement of netnography in the new product development process, or the elimination of useless information from netnography so that only crucial data is utilized. In this thesis, the writer focuses on exploring how netnography is applied in the new product development process and what benefits netnography can contribute to the success of a project. The aims of this study are to understand how netnography is conducted for new product development purposes and to analyse the contributions of netnography to the new product development process. To do so, a case study strategy is applied with three case studies. The case studies were chosen on the basis of several criteria in order to select the most relevant ones. Eventually, the writer selected three case studies: the Sunless tanning product project (HYVE), Listerine (NetBase), and Nivea co-creation and netnography in black and white deodorant. The case study strategy applied in this thesis includes four steps: case selection, data collection, case study analysis, and generating the research outcomes from the analysis. This study of the contributions of netnography to the new product development process may be useful to readers in many ways. It offers fundamental knowledge of the netnography market research method and a basic understanding of the new product development process. Additionally, it emphasizes the differences between netnography and other market research methods in order to explain why many companies and market research agencies have recently utilized netnography in their market research projects. Furthermore, it highlights the contributions of netnography to the new product development process in order to indicate the importance of netnography in developing new products. Thus, the potential readers of the study include students, marketers, researchers, product developers, and business managers.

Relevance:

30.00%

Publisher:

Abstract:

Adapting and scaling up agile concepts, which are characterized by iterative, self-directed, customer-value-focused methods, may not be a simple endeavor. This thesis concentrates on studying the challenges in a large-scale agile software development transformation in order to enhance understanding of, and bring insight into, the factors underlying such emerging challenges. The topic is approached through the concepts of agility and of different agile methods compared to traditional plan-driven processes, complex adaptive systems theory, and the impact of organizational culture on agile transformation efforts. The empirical part was conducted as a qualitative case study. The internationally operating software development case organization had a year of experience of an agile transformation effort, during which it had also undergone organizational realignment. The primary data collection was conducted through semi-structured interviews supported by participant observation. As a result, the identified challenges were categorized under four broad themes: organizational, management, team dynamics, and process-related. The identified challenges indicate that agility is a multifaceted concept. Agile practices may bring visibility to issues, many of which are embedded in the organizational culture or in the management style. Viewing software development as a complex adaptive system could facilitate understanding of the underpinning philosophy and eventually help solve the issues: interactions are more important than processes, and solving a complex problem, such as novel software development, requires constant feedback and adaptation to changing requirements. Furthermore, an agile implementation seems to be unique in nature, and the agents engaged in the interaction are pivotal to the success of achieving agility. If agility is not a strategic choice for the whole organization, additional issues may arise owing to different ways of working in different parts of the organization. Lastly, detailed suggestions to mitigate the challenges of the case organization are provided.