63 results for Abstracting and Indexing as Topic

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The Internet is the basic infrastructure of electronic mail and has long been an important source of information for academic users. It has also become a significant information source for commercial enterprises as they seek to keep in touch with their customers and to monitor their competitors. The growth of the WWW, both in volume and in diversity, has created a growing demand for advanced information management services. Such services include grouping and classification, information discovery and filtering, and the personalisation and tracking of source usage. Although the amount of scientifically and commercially valuable information available on the WWW has grown considerably in recent years, finding and retrieving it still relies on conventional Internet search engines. Satisfying the growing and changing needs of information retrieval has become a complex task for Internet search engines. Classification and indexing are a significant part of finding and retrieving reliable and precise information. This Master's thesis presents the most common methods used in classification and indexing, together with applications and projects that use them and in which the problems related to information retrieval have been addressed.
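As a rough illustration of one common classification approach of the kind surveyed in such work (a sketch only; the pipeline, example documents and category labels below are invented and are not taken from the thesis):

```python
# Minimal sketch of TF-IDF indexing + naive Bayes text classification.
# Illustrative only: the documents, labels, and category names are invented
# and the thesis itself does not prescribe this particular pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny toy corpus standing in for crawled web pages.
docs = [
    "quarterly earnings and market share of the company",
    "press release about a new product launch",
    "peer-reviewed study on protein folding methods",
    "conference paper describing an indexing algorithm",
]
labels = ["commercial", "commercial", "academic", "academic"]

# TF-IDF turns each document into a weighted term vector (the "index"),
# and the classifier assigns new pages to one of the categories.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(docs, labels)

print(model.predict(["white paper on search engine ranking methods"]))
```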

Relevance:

100.00%

Publisher:

Abstract:

The goal of this thesis is to gain a more in-depth understanding of employer branding and to offer suggestions on how this knowledge could be utilized in the case company. More specifically, the purpose of this research is to provide tools for improving Lindström's organizational attractiveness and for boosting the recruitment and retention of high-performing sales professionals. A strategy for reaching this particular segment has not been drawn up previously, and HR managers strongly believe that it would be very beneficial for the company's development and growth. The topic of this research is very current for Lindström, but it also contributes on a general level, as companies compete against each other in attracting, recruiting and retaining skilled workforce in times of labor shortage. The research is conducted with qualitative methods, and the data collection includes primary data gathered through interviews as well as secondary data in the form of analyses of previous research, websites, recruitment material and discussions with Lindström's HR department. This research provides a good basis for a broader examination of the topic and presents development suggestions for the identified challenges. Based on the key findings, Lindström's HR department was advised to increase the firm's visibility, broaden recruitment channels, provide more hands-on knowledge about the sales positions and investigate the possibilities of developing sales reward systems.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study is to examine and assess the effectiveness of two master's programmes in mechanical engineering from the perspective of the students who participated in the programmes and of their employers. The effectiveness of adult education is assessed through the themes of competence, transfer of knowledge to working life, and career. In addition, the study examines what kind of support an adult student receives and assesses further needs for similar education. The study was carried out with a quantitative questionnaire survey of the students and qualitative theme interviews of the employers. The methods complement each other and provide different perspectives on the topics addressed. The results are presented side by side, which also makes it possible to compare them with each other. Studying clearly has an effect on professional and other competence. Knowledge gained during studies is transferred to the work community through assignments, meetings and, in particular, informal discussions. Studying has a clearly positive effect on careers, so companies should give graduates new challenges if they want to retain them. An adult student needs support from many directions, but especially from his or her family. There is a clear need for similar education, and for other regional education as well, in the future.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work was to compile the history of teaching development at the Department of Industrial Management of Lappeenranta University of Technology, together with the staff's opinion of the development work carried out during the years 2000–2008, for use in further organizational development and communication. The subject area of the work deals with organizational development (growth) from the perspective of individual and organizational learning. The historical data were collected from the department's personnel and from the documents obtained. Staff feedback on the development work was collected as a qualitative study by conducting 32 personal interviews. The central result of the study is a model of successful development, in which the development of the individual and the organization is strongly influenced, at the individual level, by a need and a concrete goal. In the organization as a whole, systematic practices, a sense of community and knowledge (competence) management must also be taken into account as further elements.

Relevance:

100.00%

Publisher:

Abstract:

In a knowledge-intensive economy, effective knowledge transfer is part of the firm's strategy for achieving a competitive advantage in the market. Knowledge transfer relies on a variety of mechanisms, depending on the nature of the knowledge and the context. The topic has, however, received very little empirical study, and there is a research gap in the scientific literature. This study examined and analyzed external knowledge transfer mechanisms in service business, especially in the context of acquisitions. The aim was to find out what kinds of mechanisms were used when the buyer began to transfer knowledge, e.g. its own agendas and practices, to the purchased units. Another major research goal was to identify the critical factors which contributed to knowledge transfer through the different mechanisms. The study was conducted as a multiple-case study in a consultative service business company, in four of its business units acquired by acquisition, in various parts of the country. The empirical part of the study was carried out as focus group interviews in each unit, and the data were analyzed using qualitative methods. The main findings of this study were, firstly, nine different knowledge transfer mechanisms in service business acquisitions: the acquisition management team as an initiator, the unit manager as a translator, formal training, self-directed learning, rooming-in, IT systems implementation, customer relationship management, a codified database and e-communication. The mechanisms used brought up several aspects, such as giving a face to the change, the assurance of receiving the right and correctly interpreted knowledge, a 'we-ness' atmosphere, and an orientation towards a more consultative touch with customers. The study pointed out seven critical factors that contributed to the different mechanisms: absorption, motivation, organizational learning, social interaction, trust, interpretation and time resources. The last two were new findings compared to previous studies. Each of the mechanisms and the related critical factors contributed in different ways to the activity in the different units after the acquisition. The role of knowledge management strategy was the most significant managerial contribution of the study. The phenomenon is not recognized enough, although it is strongly present in knowledge-based companies. Recognizing it would help to develop a better understanding of business through acquisitions, especially in situations where two different knowledge strategies are combined in a new common company.

Relevance:

100.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
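As a small illustration of the invariant-based idea (a minimal sketch using ordinary Python assertions rather than Socos invariant diagrams; the program, its invariant and the function name are invented for illustration and are not taken from the thesis):

```python
# Minimal sketch of invariant-based reasoning about a loop, expressed as
# executable assertions. In the tool-supported workflow these assertions
# become verification conditions (initialization, preservation, exit) that
# a prover such as PVS/Yices would discharge; here they are only checked
# on concrete inputs.

def sum_of_first_n(n: int) -> int:
    """Compute 0 + 1 + ... + (n - 1), checking the loop invariant at run time."""
    assert n >= 0  # precondition

    i, total = 0, 0
    # Invariant: 0 <= i <= n and total == i * (i - 1) // 2
    while True:
        assert 0 <= i <= n and total == i * (i - 1) // 2  # invariant holds here
        if i == n:
            break
        total += i
        i += 1

    assert total == n * (n - 1) // 2  # postcondition follows from invariant and i == n
    return total

print(sum_of_first_n(10))  # 45
```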

Relevance:

100.00%

Publisher:

Abstract:

In the 21st century, agile project management (APM) has emerged as a major evolutionary step in the area of software project management. APM is defined as a conceptual framework consisting of various methods, such as Scrum and extreme programming (XP), together with principles such as quick response to change, better customer collaboration and minimal documentation, which facilitate producing working software in multiple iterations through teamwork. Because agile project management has become more popular in the software industry in recent years, it constitutes an interesting and comprehensive research topic. This thesis presents a systematic literature review (SLR) of published research articles concerning agile project management. Based on a predefined search strategy, 273 such articles were identified, of which 44 were included in the review. The selected 44 articles were published between the years 2005 and 2012. The thesis defines a review process by developing a review protocol and presenting the results of the review. The results are expected to provide researchers, software man

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of the gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?
o Accuracy, i.e. the probability of detecting deception successfully
o Ease of use, i.e. how easy it is to apply the method correctly
o Time required to apply the method reliably
o No need for special equipment
o Unobtrusiveness of the method
In order to answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof? What kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A multi-criteria analysis utilizing the Analytic Hierarchy Process was conducted to compare the scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to get firsthand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that the most applicable methods are not entirely trouble-free either. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most of the current lie detection studies are concentrated around a scenario where roughly half of the assessed people are totally truthful and the other half are liars who present a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones that are still under development.
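As a rough sketch of the Analytic Hierarchy Process weighting step used in such a multi-criteria analysis (the criteria order and the pairwise judgments below are invented for illustration and are not the values used in the thesis):

```python
# Rough sketch of the AHP weighting step: derive criterion weights from a
# pairwise comparison matrix via its principal eigenvector.
import numpy as np

criteria = ["accuracy", "ease of use", "time required",
            "no special equipment", "unobtrusiveness"]

# A[i, j] expresses how much more important criterion i is than criterion j
# (Saaty's 1-9 scale); A[j, i] = 1 / A[i, j]. Values here are illustrative.
A = np.array([
    [1,   3,   5,   5,   3],
    [1/3, 1,   3,   3,   1],
    [1/5, 1/3, 1,   1,   1/3],
    [1/5, 1/3, 1,   1,   1/3],
    [1/3, 1,   3,   3,   1],
])

# AHP priority weights are the normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```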

Relevance:

100.00%

Publisher:

Abstract:

Research on the comovement of business cycles is one of the oldest fields of economic research. The financial crisis and the economic difficulties faced by the euro area have, however, made the topic highly topical again. Over the past twenty years the research area has become very broad, with numerous perspectives and debates. The subject of this thesis is the international comovement of Finland's business cycles with selected comparison countries. The comparison countries are Sweden, Norway, Denmark, Germany, France, the United Kingdom and the United States. The variables chosen to describe the business cycle are real gross domestic product, private total consumption and the industrial production index. The data were collected from the OECD iLibrary database via the Nelli portal of the Lappeenranta academic library and cover the period 1960 Q1 - 2014 Q4. The country-specific business cycle is operationalized by computing the first logarithmic difference, which represents the traditional real business cycle school's view of the business cycle. The perspective chosen for the thesis is that of a single country, which is somewhat rarer than broader regional perspectives. The research methods are the Pearson correlation coefficient, the Engle-Granger and Johansen cointegration tests, and dynamic correlations computed with a VAR-GARCH-BEKK model, all calculated pairwise between Finland and each comparison country. The results are interpreted from the perspective of a Finnish company planning exports to the comparison countries. Based on the results, contemporaneous cointegration between Finland and the comparison countries, as tested with the Engle-Granger method, is unlikely. When cointegration is also allowed to depend on lags, the Johansen method finds cointegration between Finland and the United States in real gross domestic product; between Finland and Germany, Finland and France, and Finland and the United States in private total consumption; and between Finland and Norway in the industrial production index. The interpretation of the results is complicated by their model dependence and by the diverging model recommendations of the information criteria, so cointegration is possible for other country pairs as well. Based on the dynamic correlation plots, the strength of the comovement between country pairs changes over time. During the financial crisis a higher correlation is observable in total output, but the correlation then returns to its baseline level. The correlation of total consumption is lower than that of total output and varies over longer periods.
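As an illustrative sketch of the Engle-Granger cointegration test and the growth-rate correlation mentioned above (run here on simulated quarterly log-GDP series, not the OECD iLibrary data used in the thesis; all series names and parameters are invented):

```python
# Illustrative sketch: Engle-Granger cointegration test and business-cycle
# correlation on simulated quarterly log-GDP series.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
n = 220  # roughly 1960 Q1 - 2014 Q4

# A common stochastic trend plus country-specific noise makes the two
# series cointegrated by construction.
trend = np.cumsum(rng.normal(0.005, 0.01, n))
log_gdp_fin = 10.0 + trend + rng.normal(0, 0.005, n)
log_gdp_swe = 10.5 + 0.9 * trend + rng.normal(0, 0.005, n)

# Engle-Granger test: regress one series on the other and test the residual
# for a unit root; a small p-value suggests cointegration.
t_stat, p_value, crit_values = coint(log_gdp_fin, log_gdp_swe)
print(f"t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")

# Business-cycle comovement itself is measured on first log differences.
growth_corr = np.corrcoef(np.diff(log_gdp_fin), np.diff(log_gdp_swe))[0, 1]
print(f"growth-rate correlation: {growth_corr:.2f}")
```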

Relevance:

40.00%

Publisher:

Abstract:

In the present dissertation, multilingual thesauri were approached as cultural products and the focus was twofold: on the empirical level the focus was placed on the translatability of certain British-English social science indexing terms into the Finnish language and culture at the concept, term and indexing term levels; on the theoretical level the focus was placed on the aim of translation and on the concept of equivalence. In accordance with modern communicative and dynamic translation theories, the interest was in the human dimension. The study is qualitative. In this study, equivalence was understood in a similar way to how dynamic, functional equivalence is commonly understood in translation studies. Translating was seen as a decision-making process, where a translator often has different kinds of possibilities to choose from in order to fulfil the function of the translation. Accordingly, and as a starting point for the construction of the empirical part, the function of the source text was considered to be the same as or similar to the function of the target text, that is, a functional thesaurus in both the source and target contexts. Further, the study approached the challenges of multilingual thesaurus construction from the perspectives of semantics and pragmatics. In semantic analysis the focus was on what the words conventionally mean, and in pragmatics on the 'invisible' meaning, or how we recognise what is meant even when it is not actually said (or written). Languages, and the ideas expressed by languages, are created mainly in accordance with the expressional needs of the surrounding culture, and thesauri were considered to reflect several subcultures and consequently the discourses which represent them. The research material consisted of different kinds of potential discourses: dictionaries, database records and thesauri, Finnish versus British social science researchers, Finnish versus British indexers, simulated indexing tasks with five articles, and Finnish versus British thesaurus constructors. In practice, the professional background of the two last-mentioned groups was rather similar. It became clear that all the material types had their own characteristics, although naturally not entirely separate from each other. It is further noteworthy that the different types and origins of research material were not used to represent true comparison pairs, and that the aim of triangulation of methods and material was to gain a holistic view. The general research questions were: 1. Can differences be found between Finnish and British discourses regarding family roles as thesaurus terms, and if so, what kinds of differences, and what are the implications for multilingual thesaurus construction? 2. What is pragmatic indexing term equivalence? The first question studied how the same topic (family roles) was represented in different contexts and by different users, and further focused on how the possible differences were handled in multilingual thesaurus construction. The second question was based on the findings of the previous one, and answered the final question of what kinds of factors should be considered when defining translation equivalence in multilingual thesaurus construction. The study used multiple cases and several data collection and analysis methods, aiming at theoretical replication and complementarity.
The empirical material and analysis consisted of focused interviews (with Finnish and British social scientists, thesaurus constructors and indexers), simulated indexing tasks with Finnish and British indexers, semantic component analysis of dictionary definitions and translations, co-word analysis and datasets retrieved from databases, and discourse analysis of thesauri. As a terminological starting point, the topic and case of family roles was selected. The results were clear: 1) It was possible to identify different discourses. There also existed subdiscourses. For example, within the group of social scientists the orientation to qualitative versus quantitative research had an impact on the way they reacted to the studied words and discourses; indexers placed more emphasis on the information seekers, whereas thesaurus constructors approached the construction problems from a more material-based standpoint. The differences between the different specialist groups, i.e. the social scientists, the indexers and the thesaurus constructors, were often greater than between the different geo-cultural groups, i.e. Finnish versus British. The differences occurred as a result of different translation aims, diverging expectations for multilingual thesauri and a variety of practices. For multilingual thesaurus construction this means severe challenges. The clearly ambiguous concept of the multilingual thesaurus as well as the different construction and translation strategies should be considered more precisely in order to shed light on focus and equivalence types, which are clearly not self-evident. The research also revealed the close connection between the aims of multilingual thesauri and pragmatic indexing term equivalence. 2) Pragmatic indexing term equivalence is very much context-dependent. Although thesaurus term equivalence is defined and standardised in the field of library and information science (LIS), it is not understood in one established way, and the current LIS tools are inadequate for providing enough analytical tools for constructing and studying different kinds of multilingual thesauri as well as their indexing term equivalence. The tools provided in translation science were more practical and theoretical, and especially the division of the different meanings of a word provided a useful tool for analysing the pragmatic equivalence, which often differs from the ideal model represented in the thesaurus construction literature. The study thus showed that the variety of different discourses should be acknowledged, that there is a need for the operationalisation of new types of multilingual thesauri, and that the factors influencing pragmatic indexing term equivalence should be discussed more precisely than is traditionally done.

Relevance:

30.00%

Publisher:

Abstract:

Globalization of software today is leading many companies in the industrialized nations to outsource their work to low-wage countries. This thesis aims at obtaining an initial general overview of offshore software development in Africa. It seeks to explore the state of offshore software outsourcing in Africa, with a focus on the factors contributing to the successes and challenges of offshore software development practices in Africa. The thesis made use of electronic questionnaires and voice interviews to collect the data. Identified African vendors were interviewed, and the data were analyzed qualitatively. The study found that the African software outsourcing industry is still in its infancy. It is expected that the industry will grow. However, a lot needs to be done, and African governments are called upon to actively implement supportive infrastructures that will promote the growth of the local and export software industries. Further research is recommended to cover the wider context of the topic.

Relevance:

30.00%

Publisher:

Abstract:

The study of fluid flow in pipes is one of the main topics of interest for engineers in industry. In this thesis, an effort is made to study the boundary layers formed near the wall of the pipe and how they act as a resistance to heat transfer. A few decades ago, scientists used to derive analytical and empirical results by hand, as there were limited means available for solving complex fluid flow phenomena. Due to advances in technology, it has now become practically possible to understand and analyze the actual fluid flow in any type of geometry. Several methodologies have been used in the past to analyze the boundary layer equations and to derive expressions for heat transfer. An integral relation approach is used for the analytical solution of the boundary layer equations and is compared with FLUENT simulations for the laminar case. The law of the wall approach is used to derive an empirical correlation between dimensionless numbers and is then compared with the results from FLUENT for the turbulent case. In this thesis, the different approaches (analytical, empirical and numerical) are thus compared for the same set of fluid flow equations.
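For reference, a standard textbook form of the law of the wall invoked above (this is the generic near-wall velocity profile for turbulent pipe flow, not the specific correlation derived in the thesis):

```latex
% Generic law of the wall for turbulent pipe flow (textbook form; the constants
% kappa ~ 0.41 and B ~ 5.0 are conventional values, not results from the thesis).
\[
u^{+} = \frac{u}{u_{\tau}}, \qquad y^{+} = \frac{y\, u_{\tau}}{\nu}, \qquad
u^{+} =
\begin{cases}
y^{+}, & y^{+} \lesssim 5 \ \text{(viscous sublayer)} \\
\dfrac{1}{\kappa}\ln y^{+} + B, & y^{+} \gtrsim 30 \ \text{(log-law region)}
\end{cases}
\]
```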

Relevance:

30.00%

Publisher:

Abstract:

The patent system was created for the purpose of promoting innovation by granting the inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack a one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential to be used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of a patent is dependent on the value of the invention it claims and on how it is commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits. Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investments as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries, such as ICT, that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with the focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which academic, political, legal and business developments that concern software and business-method patents are investigated, and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented, and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change that the patent system is facing, and of how these challenges are reflected in standard setting.

Relevance:

30.00%

Publisher:

Abstract:

This study explores the early phases of intercompany relationship building, which is a very important topic for purchasing and business development practitioners as well as for companies' upper management. There is a lot of evidence that proper engagement with markets increases a company's potential for achieving business success. Taking full advantage of the market possibilities requires, however, a holistic view of managing the related decision-making chain. Most of the literature, as well as the business processes of companies, lacks this holism. Typically they observe the process from the perspective of individual stages and thus lead to discontinuity and sub-optimization. This study contains a comprehensive introduction to and evaluation of the literature related to the various steps of the decision-making process. It is studied from the holistic perspective of determining a company's vertical integration position within its demand/supply network context; translating the vertical integration objectives into feasible strategies and objectives; and operationalizing the decisions made through engagement with collaborative intercompany relationships. The empirical part of the research has been conducted in two sections. First, the phenomenon of intercompany engagement is studied using two complementary case studies. Secondly, a survey has been conducted among the purchasing and business development managers of several electronics manufacturing companies to analyze the processes, decision-making criteria and success factors of engagement for collaboration. The aim has been to identify the reasons why companies and their management act the way they do. As a combination of theoretical and empirical research, an analysis has been produced of what would be an ideal way of engaging with markets. Based on the respective findings, the study concludes by proposing a holistic framework for successful engagement. The evidence presented throughout the study demonstrates clear gaps, discontinuities and limitations in both current research and practical purchasing decision-making chains. The most significant discontinuity is the identified disconnection between the supplier selection process and related criteria and the relationship success factors.