19 results for news business models
in Helda - Digital Repository of University of Helsinki
Abstract:
The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
Abstract:
Suvi Nenonen, "Customer asset management in action: using customer portfolios for allocating resources across business-to-business relationships for improved shareholder value". Customers are crucial assets to all firms, as customers are the ultimate source of all cash flows. Despite the financial importance of customer relationships, for decades there has been a lack of suitable frameworks explaining how customer relationships contribute to firm financial performance and how this contribution can be actively managed. To facilitate a better understanding of the customer asset, contemporary marketing has investigated the use of financial theories and asset management practices in the customer relationship context. Building on this, marketing academics have promoted the customer lifetime value concept as a solution for valuing and managing customer relationships for optimal financial outcomes. However, the empirical investigation of customer asset management lags behind these conceptual developments. Additionally, practitioners have not embraced the use of customer lifetime value in guiding managerial decisions, especially in the business-to-business context. The thesis points out that there are fundamental differences between customer relationships and investment instruments as investment targets, effectively ruling out the use of financial theories in a customer relationship context or the optimization of the customer base as a single investment portfolio. As an alternative, the thesis proposes a customer portfolio approach for allocating resources across the customer base for improved shareholder value. In this approach, the customer base of a firm is divided into multiple portfolios based on the customer relationships' potential to contribute to shareholder value creation.
After this, customer management concepts are tailored to each customer portfolio, each designed to improve shareholder value in its own way. Therefore, effective customer asset management with the customer portfolio approach requires that firms are able to manage multiple parallel customer management concepts, or business models, simultaneously. The thesis is one of the first empirical studies on customer asset management, bringing empirical evidence from multiple business-to-business case studies on how customer portfolio models can be formed, how customer portfolios can be managed, and how customer asset management has contributed to firm financial performance.
Abstract:
Wireless network access is gaining increased heterogeneity in terms of the types of IP-capable access technologies. The access network heterogeneity is an outcome of the incremental and evolutionary approach of building new infrastructure. The recent success of multi-radio terminals drives both the building of new infrastructure and the implicit deployment of heterogeneous access networks. Typically there is no economic reason to replace the existing infrastructure when building a new one. The gradual migration phase usually takes several years. IP-based mobility across different access networks may involve both horizontal and vertical handovers. Depending on the networking environment, the mobile terminal may be attached to the network through multiple access technologies. Consequently, the terminal may send and receive packets through multiple networks simultaneously. This dissertation addresses the introduction of the IP Mobility paradigm into existing mobile operator network infrastructure that was not originally designed for multi-access and IP Mobility. We propose a model for the future wireless networking and roaming architecture that does not require revolutionary technology changes and can be deployed without unnecessary complexity. The model proposes a clear separation of operator roles: (i) access operator, (ii) service operator, and (iii) inter-connection and roaming provider. The separation allows each type of operator to have its own development path and business models without artificial bindings to the others. We also propose minimum requirements for the new model. We present the state of the art of IP Mobility. We also present results of standardization efforts in IP-based wireless architectures. Finally, we present experimentation results of IP-level mobility in various wireless operator deployments.
Abstract:
This study investigates primary and secondary school teachers’ social representations and ways of conceptualising new technologies. The focus is both on teachers’ descriptions, interpretations and conceptions of technology and on the adoption and formation of these conceptions. In addition, the purpose of this study is to analyse how the national objectives of the information society and the implementation of information and communication technologies (ICT) in schools are reflected in teachers’ thinking and everyday practices. The starting point for the study is the idea of a dynamic and mutual relationship between teachers and technology, such that technology does not one-sidedly shape teachers’ thinking. This relationship is described in this study as the teachers’ technology relationship. This concept emphasises that technology cannot be separated from society, social relations and the context where it is used; rather, it is intertwined with societal practices and is therefore formed in interaction with material and social factors. The theoretical part of this study encompasses three different research traditions: 1) the social shaping of technology, 2) research on how schools and teachers use technology and 3) social representations theory. The study was part of the Helmi Project (Holistic development of e-Learning and business models) in 2001–2005 at the Helsinki University of Technology, SimLab research unit. The Helmi Project focused on different aspects of the utilisation of ICT in teaching. The research data consisted of interviews with teachers and principals. Altogether 37 interviews were conducted in 2003 and 2004 in six different primary and secondary schools in Espoo, Finland. The data was analysed applying grounded theory. The results showed that the teachers’ technology relationship was diverse and context specific.
Technology was interpreted differently depending on the context: the teachers’ technology-related descriptions and metaphors illustrated on the one hand the benefits and possibilities, and on the other hand the problems and threats, of different technologies. The dual nature of technology was also expressed in the teachers’ thinking about technology as a deterministic and irrevocable force and as a controllable and functional tool at the same time. Teachers did not consider technology to have a stable character; rather, they interpreted it in relation to the variable context of use. In this way they positioned, or anchored, technology in their everyday practices. The study also analysed the formation of the teachers’ technology relationship and the ways teachers familiarise themselves with new technologies. Comparison of different technologies, as well as technology-related metaphors, turned out to be significant in forming the technology relationship. The ways teachers described the familiarisation process and the interpretations of their own technical skills also affected the formation of the technology relationship. In addition, teachers defined technology together with other teachers, and these discussions reflected teachers’ interpretations and descriptions.
Abstract:
One of the effects of the Internet is that the dissemination of scientific publications has, in a few years, migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.
Abstract:
Earlier studies have shown that the speed of information transmission developed radically during the 19th century. The fast development was mainly due to the change from sailing ships and horse-drawn coaches to steamers and railways, as well as the telegraph. The speed of information transmission has normally been measured by calculating the duration between writing and receiving a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study also brings a postal-historical aspect to the academic discussion. In addition, it includes another new aspect: in business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, at least equally important was the possibility to react rapidly. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool of measurement. The study covers a period of six decades, several of the world's most important trade routes and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and the correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds, with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system.
The evolution proceeded generally as follows: originally there was a more or less irregular system, then a regular system and finally a more frequent regular system of mail services. The trend was from sail to steam, but both these means of communication improved following the same scheme. Faster sailings alone did not radically improve the number of consecutive information circles per year, if the communication was not frequent enough. Neither did improved frequency advance the information circulation if the trip was very long or if the sailings were overlapping instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.) but especially by organizing sailings so that the recipients had the possibility to reply to arriving mails without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts over isthmuses (e.g. Panama, Suez) together with the cooperation between steamships and railways enabled the most effective improvements in global communications before the introduction of the telegraph.
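The "information circles per year" measure described above lends itself to a simple arithmetic sketch: one circle is a letter out plus the reply back, so circles per year follow from the round-trip duration. The function name and day counts below are illustrative, not the study's exact method; the waiting time stands in for the delay until a suitable return sailing.

```python
# Hypothetical sketch of the "consecutive information circles per year"
# measure: one circle = outbound voyage + wait for a return sailing + reply
# voyage. All figures are invented for illustration.

def circles_per_year(outbound_days, reply_wait_days, return_days):
    """Number of consecutive request-reply cycles that fit into one year."""
    round_trip = outbound_days + reply_wait_days + return_days
    return 365.0 / round_trip

# Frequent sailings can beat faster ships: a short wait for the return
# sailing matters more than shaving days off the voyage itself.
slow_frequent = circles_per_year(40, 5, 40)   # slower ships, frequent line
fast_sparse = circles_per_year(30, 30, 30)    # faster ships, sparse sailings
```

This mirrors the abstract's point that faster voyages alone did not improve circulation unless sailings were frequent and complementary.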
Abstract:
This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of the static and dynamic probit models in forecasting the U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictive variables. The empirical results suggest that the recession periods are predictable and that dynamic probit models, especially models with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite sample properties of the LM test are considered with simulation experiments. Results indicate that the two alternative LM test statistics have reasonable size and power in large samples. In small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the stock return sign. The evidence suggests that the signs of U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of the univariate models considered in Chapters 2 to 4 to the case of a bivariate model.
A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained and the bivariate model is found to outperform the univariate models in terms of predictive power.
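The static-versus-dynamic probit comparison described above can be sketched in a few lines. The coefficients below are invented for illustration (the thesis estimates them from financial predictors such as the term spread), and the dynamic variant shown here adds only a lagged recession indicator to the linear index, one common way of building the dynamic structure.

```python
import math

# Sketch of static vs. dynamic probit recession forecasts. Coefficient
# values are hypothetical; the dynamic model adds the lagged recession
# indicator y_{t-1} to the probit's linear index.

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def static_probit(beta0, beta1, spread):
    """P(recession) from a single financial predictor (e.g. term spread)."""
    return norm_cdf(beta0 + beta1 * spread)

def dynamic_probit(beta0, beta1, delta, spread, lagged_state):
    """Dynamic variant: the lagged recession state enters the index."""
    return norm_cdf(beta0 + beta1 * spread + delta * lagged_state)

# Inverted yield curve (negative spread) while already in recession:
p_static = static_probit(-1.2, -0.8, spread=-0.5)
p_dynamic = dynamic_probit(-1.2, -0.8, 1.5, spread=-0.5, lagged_state=1)
```

With a positive persistence coefficient, the dynamic model assigns a higher recession probability when the economy was already in recession last period, which is the intuition behind its better forecast performance.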
Abstract:
Despite thirty years of research in interorganizational networks and project business within the industrial networks approach and relationship marketing, the collective capability of networks of business and other interorganizational actors has not been explicitly conceptualized and studied within these approaches. This is despite the fact that both approaches maintain that networking is one of the core strategies for the long-term survival of market actors. Recently, many scholars within these approaches have emphasized that the survival of market actors is based on the strength of their networks and that inter-firm competition is being replaced by inter-network competition. Furthermore, project business is characterized by the building of goal-oriented, temporary networks whose aims, structures, and procedures are clarified and that are governed by processes of interaction as well as recurrent contracts. This study develops frameworks for studying and analysing collective network capability, i.e. collective capability created for the network of firms. The concept is first justified and positioned within the industrial networks, project business, and relationship marketing schools. Conceptual input is drawn eclectically from four major approaches to interorganizational business relationships. The study uses qualitative research and analysis, and the case report analyses the empirical phenomenon using a large number of qualitative techniques: tables, diagrams, network models, matrices, etc. The study shows the high level of uniqueness and complexity of international project business. While the perceived psychic distance between the parties may be small due to previous project experiences and the benefit of existing relationships, a number of critical events develop due to the economic and local context of the recipient country as well as the coordination demands of the large number of involved actors.
The study shows that the successful creation of collective network capability led to the success of the network for the studied project. The processes and structures for creating collective network capability are encapsulated in a model of governance factors for interorganizational networks. The theoretical and management implications are summarized in seven propositions. The core implication is that project business success in unique and complex environments is achieved by accessing the capabilities of a network of actors, and project management in such environments should be built on both contractual and cooperative procedures with local recipient country parties.
Abstract:
This study examined the Greeks of options and the trading results of delta hedging strategies under three different time units, or option-pricing models. These time units were calendar time, trading time and continuous time using discrete approximation (CTDA) time. The CTDA time model is a pricing model that, among other things, accounts for intraday and weekend patterns in volatility. For the CTDA time model, some additional theta measures, believed to be usable in trading, were developed. The study appears to verify that there were differences in the Greeks under different time units. It also revealed that these differences influence the delta hedging of options or portfolios. Although it is difficult to say which of the different time models is the most usable, as this depends heavily on the trader's view of the passing of time, different market conditions and different portfolios, the CTDA time model can be viewed as an attractive alternative.
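The dependence of the Greeks on the time unit can be illustrated with a standard Black-Scholes delta, where the only change between conventions is how the remaining life T is expressed as a fraction of a year (roughly 365 calendar days versus about 252 trading days). This is a generic sketch, not the study's CTDA model, and the market parameters are invented.

```python
import math

# Illustration (not the study's exact models): Black-Scholes call delta
# under two time conventions. Only the annualization of the option's
# remaining life changes; sigma and the other inputs are hypothetical.

def bs_call_delta(S, K, r, sigma, T):
    """Black-Scholes delta of a European call, T in years."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))

days_left = 30
delta_calendar = bs_call_delta(100, 100, 0.03, 0.25, days_left / 365)
delta_trading = bs_call_delta(100, 100, 0.03, 0.25, days_left / 252)

# The same 30 days is a larger fraction of a trading year, so the
# at-the-money delta (and hence the hedge ratio) differs between units.
```

Even this minimal example shows that hedge ratios shift with the chosen clock, which is the effect the study measures in trading results.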
Abstract:
This paper examines how volatility in financial markets can best be modeled. The examination investigates how good the models for volatility, both linear and nonlinear, are at absorbing skewness and kurtosis. The examination is done on the Nordic stock markets, including Finland, Sweden, Norway and Denmark. Different linear and nonlinear models are applied, and the results indicate that a linear model can almost always be used for modeling the series under investigation, even though nonlinear models perform slightly better in some cases. These results indicate that the markets under study are exposed to asymmetric patterns only to a certain degree. Negative shocks generally have a more prominent effect on the markets, but these effects are not especially strong. However, in terms of absorbing skewness and kurtosis, nonlinear models outperform linear ones.
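The symmetric-versus-asymmetric contrast above can be sketched with the two standard recursions usually compared in this setting: a plain GARCH(1,1) and a GJR-style extension where negative shocks carry extra weight. The paper does not name its exact specifications here, so treat this as a generic illustration with invented parameters.

```python
# Sketch of symmetric vs. asymmetric conditional-variance recursions
# (GARCH(1,1) vs. a GJR-style model). Parameters are illustrative only.

def garch_var(returns, omega, alpha, beta):
    """Plain GARCH(1,1): shocks of either sign raise variance equally."""
    var = omega / (1.0 - alpha - beta)  # unconditional variance as seed
    path = [var]
    for r in returns[:-1]:
        var = omega + alpha * r * r + beta * var
        path.append(var)
    return path

def gjr_var(returns, omega, alpha, gamma, beta):
    """GJR extension: negative shocks get extra weight gamma (leverage)."""
    var = omega / (1.0 - alpha - 0.5 * gamma - beta)
    path = [var]
    for r in returns[:-1]:
        leverage = gamma if r < 0 else 0.0
        var = omega + (alpha + leverage) * r * r + beta * var
        path.append(var)
    return path

rets = [0.01, -0.02, 0.005, -0.015, 0.01]
sym = garch_var(rets, 1e-6, 0.05, 0.90)
asym = gjr_var(rets, 1e-6, 0.03, 0.08, 0.90)
```

In the asymmetric recursion a negative return of a given size raises next-period variance more than a positive one of the same size, which is exactly the modest leverage effect the paper reports for the Nordic markets.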
Abstract:
This study evaluates three different time units in option pricing: trading time, calendar time and continuous time using discrete approximations (CTDA). The CTDA-time model partitions the trading day into 30-minute intervals, where each interval is given a weight corresponding to the historical volatility in the respective interval. Furthermore, the non-trading volatility, both overnight and weekend volatility, is included in the first interval of the trading day in the CTDA model. The three models are tested against market prices. The results indicate that the trading-time model gives the best fit to market prices, in line with the results of previous studies but contrary to expectations under non-arbitrage option pricing. Under non-arbitrage pricing, the option premium should reflect the cost of hedging the expected volatility during the option’s remaining life. The study concludes that the historical patterns in volatility are not fully accounted for by the market; rather, the market prices options closer to trading time.
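The CTDA clock described above can be sketched as a weighting scheme: each 30-minute interval gets a weight equal to its share of historical variance, with overnight and weekend variance folded into the day's first interval. The variance figures and function name below are invented for illustration; only the construction follows the abstract.

```python
# Sketch of a CTDA-style intraday clock: 30-minute intervals weighted by
# their share of historical variance, with non-trading (overnight/weekend)
# variance added to the first interval. All numbers are hypothetical.

def ctda_weights(interval_vars, non_trading_var):
    """Weight per interval = its variance share of the whole trading day."""
    vars_adj = [interval_vars[0] + non_trading_var] + list(interval_vars[1:])
    total = sum(vars_adj)
    return [v / total for v in vars_adj]

# 13 half-hour intervals with the familiar U-shape: busy open and close.
hist_vars = [4.0, 2.0, 1.5, 1.0, 1.0, 0.8, 0.8, 0.8, 1.0, 1.0, 1.5, 2.0, 3.0]
weights = ctda_weights(hist_vars, non_trading_var=5.0)

# Remaining life in CTDA time would sum the weights of intervals still
# ahead, so an option "ages" faster through high-volatility parts of the day.
```

Under this construction the first interval carries the largest weight, reflecting both the volatile open and the accumulated non-trading volatility.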