886 results for Scholarly book market
Abstract:
The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
Abstract:
When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journals on very incomplete information about how well the journals serve the authors’ purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals, providing more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate, and service provided by the journal during the review and publication process. The method draws on data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through author surveys; it has been tested on three different sets of journals, each from a different discipline. We found a number of problems with the different data acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rates). The calculation of some important parameters (for instance average time from submission to publication, or the regional spread of authorship) can be done, but requires quite a lot of work. It can also be difficult to get reasonable response rates to author surveys. All in all, we believe that the method we propose, taking a “service to authors” perspective as a basis for benchmarking scientific journals, is useful and can provide information that is valuable to prospective authors in selected scientific disciplines.
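The "average time from submission to publication" parameter described above is straightforward to compute once submission and publication dates have been collected from article pages; a minimal sketch on invented dates (the `articles` list is hypothetical, not data from the study):

```python
from datetime import date

# Hypothetical sample: (submission date, publication date) pairs for one
# journal. Real data would be collected from article pages, as the study
# describes.
articles = [
    (date(2005, 1, 10), date(2005, 8, 2)),
    (date(2005, 3, 5),  date(2005, 9, 20)),
    (date(2005, 6, 1),  date(2006, 1, 15)),
]

def mean_publication_delay_days(pairs):
    """Average submission-to-publication delay in days."""
    delays = [(pub - sub).days for sub, pub in pairs]
    return sum(delays) / len(delays)

print(round(mean_publication_delay_days(articles), 1))  # -> 210.3 (about 7 months)
```

The same pattern extends to the other parameters the study computes from collected data, such as the regional spread of authorship.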
Abstract:
Open access is a new model for the publishing of scientific journals, enabled by the Internet, in which the published articles are freely available for anyone to read. During the 1990s hundreds of individual open access journals were founded by groups of academics, supported by grants and unpaid voluntary work. During the last five years other types of open access journals, funded by author charges, have started to emerge, and established publishers have also started to experiment with different variations of open access. This article reports on the experiences of one open access journal (The Electronic Journal of Information Technology in Construction, ITcon) over its ten-year history. In addition to a straightforward account of the lessons learned, the journal is benchmarked against a number of competitors in the same research area, and its development is put into the larger perspective of changes in scholarly publishing. The main findings are that a journal publishing around 20-30 articles per year, equivalent to a typical quarterly journal, can sustainably be produced using an open-source-like production model, and that the journal outperforms its competitors in some respects, such as speed of publication, availability of the results and balanced global distribution of authorship, and is on a par with them in most other respects. The key statistics for ITcon are: acceptance rate 55%; average speed of publication 6-7 months; 801 subscribers to email alerts; an average of 21 downloads by human readers per paper per month.
Abstract:
The liquidity crisis that swept through the financial markets in 2007 triggered multi-billion losses and forced buyouts of some large banks. The resulting credit crunch is sometimes compared to the Great Depression of the early twentieth century. But the crisis also serves as a reminder of the significance of the interbank market and of proper central bank policy in this market. This thesis deals with the implementation of monetary policy in the interbank market and examines how central bank tools affect commercial banks' decisions. I answer the following questions:
• What is the relationship between the policy setup and interbank interest rate volatility? (an averaging reserve requirement reduces the volatility)
• What can explain a weak relationship between market liquidity and the interest rate? (a high reserve requirement buffer)
• What determines banks' decisions on when to satisfy the reserve requirement? (market frictions)
• How did the liquidity crisis that began in 2007 affect interbank market behaviour? (it resulted in higher credit risk and trading frictions, as well as an expected liquidity shortage)
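The first answer, that an averaging reserve requirement reduces interest rate volatility, can be illustrated with a toy simulation. This is not the thesis's model: it simply assumes that under a strict daily requirement the overnight rate absorbs each day's liquidity shock one-for-one, while averaging lets banks offset shocks within a maintenance period, so only the period-average shock moves the rate:

```python
import random
import statistics

random.seed(42)

# Hypothetical setup: 200 maintenance periods of 10 days, i.i.d. standard
# normal liquidity shocks each day.
PERIODS, DAYS = 200, 10
shocks = [[random.gauss(0.0, 1.0) for _ in range(DAYS)] for _ in range(PERIODS)]

# Strict daily requirement: the rate deviation tracks each daily shock.
daily_rates = [s for period in shocks for s in period]
# Averaging provision: only the period-average shock matters each day.
avg_rates = [statistics.mean(period) for period in shocks for _ in range(DAYS)]

vol_daily = statistics.pstdev(daily_rates)
vol_avg = statistics.pstdev(avg_rates)
print(f"daily requirement vol: {vol_daily:.2f}, averaging vol: {vol_avg:.2f}")
```

With independent shocks the averaged rate's volatility shrinks roughly by a factor of the square root of the period length, which is the intuition behind the thesis's first answer.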
Abstract:
Using Thomé's procedure, the asymptotic solutions of the Frieman and Book equation for the two-particle correlation in a plasma have been obtained in a complete form. The solution is interpreted in terms of the Lorentz distance. The exact expressions for the internal energy and pressure are evaluated and they are found to be a generalization of the result obtained earlier by others.
Abstract:
This thesis analyzes how matching takes place in the Finnish labor market from three different angles. The Finnish labor market has undergone severe structural changes following the economic crisis of the early 1990s. The labor market has had problems adjusting to these changes, and hence high and persistent unemployment has followed. In this thesis I analyze whether matching problems, and in particular changes in matching, can explain some of this persistence. The thesis consists of three essays. The first essay, “Finnish Evidence of Changes in the Labor Market Matching Process”, analyzes the matching process in the Finnish labor market. The key finding is that the matching process changed thoroughly between the booming 1980s and the post-crisis period. The importance of the number of unemployed, and in particular of the long-term unemployed, for the matching process has vanished. More unemployed do not increase matching as theory predicts, but rather the opposite. The second essay, “The Aggregate Matching Function and Directed Search - Finnish Evidence”, studies stock-flow matching as a potential micro foundation of the aggregate matching function. In the essay I show that the newly unemployed match mainly with the stock of vacancies, while the longer-term unemployed match with the inflow of vacancies. When aggregating, I still find evidence of the traditional aggregate matching function, which could explain the broad support the aggregate matching function has received despite its odd randomness assumption. The third essay, “How Do Registered Job Seekers Really Match? - Finnish Occupational-Level Evidence”, studies matching for nine occupational groups and finds that very different matching problems exist for different occupations. This essay also deals with misspecification stemming from non-corresponding variables by introducing a completely new set of variables: the outflow measure is vacancies filled with registered job seekers, matched by the supply-side measure of registered job seekers.
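The aggregate matching function discussed above is conventionally specified as Cobb-Douglas, M = A·U^α·V^β, and estimated in logs. A self-contained sketch on synthetic data (the parameters and figures are illustrative, not the thesis's Finnish estimates); with noiseless log-linear data, three observations pin down the three parameters exactly:

```python
from math import log

# Synthetic observations from a known Cobb-Douglas matching function
# M = A * U^alpha * V^beta (hypothetical parameters).
A, alpha, beta = 0.5, 0.6, 0.4
data = [(100_000, 10_000), (200_000, 15_000), (150_000, 30_000)]  # (U, V)
obs = [(U, V, A * U**alpha * V**beta) for U, V in data]

def solve3(rows):
    """Gauss-Jordan elimination for a 3x3 system (rows are augmented 1x4)."""
    m = [list(r) for r in rows]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))  # partial pivoting
        m[i], m[p] = m[p], m[i]
        for r in range(3):
            if r != i:
                f = m[r][i] / m[i][i]
                m[r] = [a - f * b for a, b in zip(m[r], m[i])]
    return [m[i][3] / m[i][i] for i in range(3)]

# In logs the function is linear: ln M = ln A + alpha ln U + beta ln V.
rows = [(1.0, log(U), log(V), log(M)) for U, V, M in obs]
lnA_hat, alpha_hat, beta_hat = solve3(rows)
print(round(alpha_hat, 3), round(beta_hat, 3))  # recovers 0.6 and 0.4
```

With real (noisy) data one would run OLS on the same log-linear equation over many monthly observations; the elasticity estimates α̂ and β̂ are then the objects whose stability across periods the first essay examines.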
Abstract:
A functioning stock market is an essential component of a competitive economy, since it provides a mechanism for allocating the economy’s capital stock. In an ideal situation, the stock market will steer capital in a manner that maximizes the total utility of the economy. As prices of traded stocks depend on and vary with information available to investors, it is apparent that information plays a crucial role in a functioning stock market. However, even though information indisputably matters, several issues regarding how stock markets process and react to new information remain unanswered. The purpose of this thesis is to explore the link between new information and stock market reactions. The first essay utilizes new methodological tools to investigate the average reaction of investors to new financial statement information. The second essay explores the behavior of different types of investors when new financial statement information is disclosed to the market. The third essay looks into the interrelation between investor size, behavior and overconfidence. The fourth essay approaches the puzzle of negative skewness in stock returns from an altogether different angle than previous studies. The first essay presents evidence that the second derivatives of some financial statement signals contain more information than the first derivatives. Further, empirical evidence also indicates that some of the investigated signals proxy risk while others contain information priced with a delay. The second essay documents that different categories of investors demonstrate systematic differences in their behavior when new financial statement information arrives on the market. In addition, a theoretical model building on differences in investor overconfidence is put forward to explain the observed behavior. The third essay shows that investor size describes investor behavior very well. This finding is predicted by the model proposed in the second essay, and hence strengthens the model. The behavioral differences between investors of different sizes furthermore have significant economic implications. Finally, the fourth essay finds strong evidence that management news disclosure practices cause negative skewness in stock returns.
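Negative skewness in stock returns, the subject of the fourth essay, is measured by the standardized third moment; a small sketch on invented daily return data (the `returns` series is hypothetical):

```python
def skewness(xs):
    """Population skewness: third central moment over variance^(3/2)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2**1.5

# Hypothetical daily returns with a few large crash-like days.
returns = [0.010, 0.004, -0.002, 0.008, -0.060,
           0.006, 0.003, -0.050, 0.007, 0.002]

print(skewness(returns))  # negative: large drops dominate the third moment
```

A series whose occasional extreme moves are drops rather than jumps, as with the disclosure practices the essay studies, produces exactly this kind of negative skew.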
Abstract:
Market microstructure is “the study of the trading mechanisms used for financial securities” (Hasbrouck (2007)). It seeks to understand the sources of value and the reasons for trade, in a setting with different types of traders and different private and public information sets. The actual mechanisms of trade are a continually changing object of study. These include continuous markets, auctions, limit order books, dealer markets, and combinations of these operating as hybrid markets. Microstructure also has to allow for the possibility of multiple prices. At any given time an investor may be faced with a multitude of different prices, depending on whether he or she is buying or selling, the quantity he or she wishes to trade, and the required speed for the trade. The price may also depend on the relationship that the trader has with potential counterparties. In this research, I touch upon all of the above issues by studying three specific areas, all of which have both practical and policy implications. First, I study the role of information in trading and pricing securities in markets with a heterogeneous population of traders, some of whom are informed and some not, and who trade for different private or public reasons. Second, I study the price discovery of stocks in a setting where they are simultaneously traded in more than one market. Third, I contribute to the ongoing discussion about market design, i.e. the question of which trading systems and ways of organizing trading are most efficient. A common characteristic throughout my thesis is the use of high-frequency datasets, i.e. tick data. These datasets include all trades and quotes in a given security, rather than just the daily closing prices, as in the traditional asset pricing literature. This thesis consists of four separate essays. In the first essay I study price discovery for European companies cross-listed in the United States, as well as explanatory variables for differences in price discovery. In the second essay I contribute to earlier research on two issues of broad interest in market microstructure: market transparency and informed trading. I examine the effects of a change to an anonymous market at the OMX Helsinki Stock Exchange. I broaden my focus slightly in the third essay, to include releases of macroeconomic data in the United States, and analyze the effect of these releases on European cross-listed stocks. The fourth and last essay applies standard methodologies of price discovery analysis in a novel way: I study price discovery within one market, between local and foreign traders.
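As an illustration of the kind of statistic computable from tick data containing both trades and quotes, here is the effective spread, a standard microstructure measure (a sketch, not necessarily a methodology used in the thesis); the quotes, trades and trade directions below are hypothetical:

```python
# Each tick: (trade price, prevailing bid, prevailing ask, trade direction),
# with direction q = +1 for buyer-initiated and -1 for seller-initiated
# trades. All figures are invented for illustration.
ticks = [
    (10.02, 10.00, 10.03, +1),
    (10.00, 10.00, 10.03, -1),
    (10.05, 10.02, 10.06, +1),
]

def effective_spread(price, bid, ask, q):
    """Effective spread: twice the signed distance from the midquote."""
    mid = (bid + ask) / 2
    return 2 * q * (price - mid)

spreads = [effective_spread(*t) for t in ticks]
avg = sum(spreads) / len(spreads)
print(f"average effective spread: {avg:.3f}")
```

Measures of this type are what make tick data, as opposed to daily closing prices, informative about trading costs and transparency.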
Abstract:
A better understanding of stock price changes is important in guiding many economic activities. Since prices often do not change without good reasons, searching for related explanatory variables has attracted many enthusiasts. This book seeks answers from prices per se by relating price changes to their conditional moments. This is based on the belief that prices are the products of a complex psychological and economic process, and that their conditional moments derive ultimately from these psychological and economic shocks. Utilizing information about conditional moments hence makes this an attractive alternative to using other selective financial variables in explaining price changes. The first paper examines the relation between the conditional mean and the conditional variance using information about moments in three types of conditional distributions; it finds that the significance of the estimated mean and variance ratio can be affected by the assumed distributions and the time variations in skewness. The second paper decomposes conditional industry volatility into a concurrent market component and an industry-specific component; it finds that market volatility is on average responsible for a rather small share of total industry volatility: 6 to 9 percent in the UK and 2 to 3 percent in Germany. The third paper looks at the heteroskedasticity in stock returns through an ARCH process supplemented with a set of conditioning information variables; it finds that stock returns exhibit several forms of heteroskedasticity, including deterministic changes in variances due to seasonal factors, random adjustments in variances due to market and macro factors, and ARCH processes with past information. The fourth paper examines the role of higher moments, especially skewness and kurtosis, in determining expected returns; it finds that total skewness and total kurtosis are more relevant non-beta risk measures, and that they are costly to diversify, due either to the possible elimination of their desirable parts or to the unsustainability of diversification strategies based on them.
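The ARCH process used in the third paper can be sketched in a few lines. The parameters below are illustrative, not estimates from the paper; the simulation shows the hallmark property that conditional heteroskedasticity produces fat-tailed unconditional returns:

```python
import random

random.seed(1)

# Minimal ARCH(1) sketch (hypothetical parameters):
#   sigma_t^2 = omega + alpha * r_{t-1}^2,  r_t = sigma_t * z_t,  z_t ~ N(0,1)
omega, alpha = 0.1, 0.5
n = 20_000
returns, prev = [], 0.0
for _ in range(n):
    sigma2 = omega + alpha * prev**2      # conditional variance from last shock
    prev = sigma2**0.5 * random.gauss(0.0, 1.0)
    returns.append(prev)

# Unconditional moments: ARCH dynamics fatten the tails relative to a
# Gaussian (kurtosis 3); the theoretical kurtosis here is 9.
mean = sum(returns) / n
m2 = sum((x - mean) ** 2 for x in returns) / n
m4 = sum((x - mean) ** 4 for x in returns) / n
kurtosis = m4 / m2**2
print(round(kurtosis, 2))
```

Supplementing the recursion with conditioning information variables, as the paper does, amounts to adding further regressors to the sigma_t^2 equation.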
Abstract:
The integrated European debt capital market has undoubtedly broadened the possibilities for companies to access funding from the public and challenged investors to cope with the ever-increasing complexity of its market participants. Well into the Euro era, it is clear that the unified market has created potential for all involved parties: investment opportunities are able to meet a supply of funds from a broad geographical area now summoned under a single currency. Europe’s traditionally heavy dependency on bank lending as a source of debt capital has thus been easing, as corporate residents are able to tap a deep and liquid capital market to satisfy their funding needs. As national barriers eroded with the inauguration of the Euro and interest rates for the EMU members converged towards overall lower yields, a new source of debt capital emerged for the vast majority of corporate residents under the new currency, offering an alternative to the traditionally more maturity-restricted bank debt. With increased sophistication came also an improved knowledge and understanding of the market and its participants. Further, investors became more willing to bear credit risk, which opened the market to firms of ever lower creditworthiness. In the process, the market as a whole saw a change in the profile of issuers, as non-financial firms increasingly sought their funding directly from the bond market. This thesis consists of three separate empirical studies of how corporates fund themselves on the European debt capital markets. The analysis focuses on a firm’s access to and behaviour on the capital market, subsequent to the decision to raise capital through the issuance of arm’s-length debt on the bond market. The specific areas considered contribute to our knowledge of corporate finance and financial markets by explicitly examining firms’ primary market activities within the new market area. The first essay explores how the reputation of an issuer affects its debt issuance. The second essay examines the choice of interest rate exposure on newly issued debt, and the third and final essay explores pricing anomalies in corporate debt issues.
Abstract:
Liquidity, or how easy an investment is to buy or sell, is becoming increasingly important for financial market participants. The objective of this dissertation is to contribute to the understanding of how liquidity affects financial markets. The first essays analyze the actions taken by underwriters immediately after listing to improve the liquidity of IPO stocks. To estimate the impact of underwriter activity on the pricing of the IPOs, the order book during the first weeks of trading in the IPO stock is studied. Evidence of stabilization and liquidity-enhancing activities by underwriters is found. The second half of the dissertation is concerned with the daily trading of stocks, where liquidity may be affected by policy issues such as changes in taxes or exchange fees, and by opening access to the markets to foreign investors. The desirability of a transaction tax on securities trading is addressed; an increase in the transaction tax is found to cause lower prices and higher volatility. The objective of the last essay is to determine whether the liquidity of a security has an impact on the return investors require. The results support the notion that returns are negatively correlated with liquidity.
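Testing whether required returns depend on liquidity presupposes a liquidity measure. One standard proxy, shown here on hypothetical data, is Amihud's illiquidity ratio, the average of |daily return| over daily trading volume; the dissertation itself may use a different measure:

```python
# Hypothetical daily observations for one stock: (return, euro volume).
daily = [
    (0.012, 2_500_000),
    (-0.008, 1_900_000),
    (0.004, 3_100_000),
    (-0.015, 1_200_000),
]

# Amihud illiquidity ratio: average price impact per unit of volume.
# Higher values indicate a less liquid stock.
illiq = sum(abs(r) / v for r, v in daily) / len(daily)
print(f"{illiq:.3e}")
```

Sorting stocks on a proxy like this and comparing average returns across the resulting portfolios is a common way to test the liquidity-return relation the last essay addresses.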
Abstract:
This doctoral dissertation takes a buy-side perspective on third-party logistics (3PL) providers’ service tiering by applying a linear serial dyadic view to transactions. It takes its point of departure not only from the focus on dyads as units of analysis and how to manage them, but also from the characteristics that both create and determine purposeful conditions of longer duration. A conceptual framework is proposed and evaluated on its ability to capture logistics service buyers’ perceptions of service tiering. The problem discussed sits in the theoretical context of logistics and reflects value appropriation, power dependencies, visibility in linear serial dyads, a movement towards more market-governed modes of transactions (i.e. service tiering), and buyers’ risk perception of broader utilisation of the logistics services market. Service tiering in a supply chain setting, with the lack of multilateral agreements between supply chain members, is new. The deductive research approach applied, in which theoretically based propositions are empirically tested with quantitative and qualitative data, provides new insight into (contractual) transactions in 3PL. The study findings imply that the understanding of power dependencies and supply chain dynamics in a 3PL context is still in its infancy. The issues found include the separation of service responsibilities, supply chain visibility, price-making behaviour, and supply chain strategies under changing circumstances or under the influence of non-immediate supply chain actors. Understanding (or failing to understand) these issues may have remarkable implications for the industry. The contingencies may trigger more open-book policies, a larger liability scope for 3PL service providers, or the insourcing of critical logistics activities, from the first-tier buyer’s core business and customer service perspectives. In addition, a sufficient understanding of the issues surrounding service tiering enables proactive responses to devise appropriate supply chain strategies. The author concludes that qualitative research designs facilitating data collection on multiple supply chain actors may capture and increase understanding of the impact of broader supply chain strategies. This would enable pattern-matching through an examination of two or more sides of exchange transactions, to measure relational symmetries across linear serial dyads. Indeed, the performance of the firm depends not only on how efficiently it cooperates with its partners, but also on how well exchange partners cooperate with the organisation’s own business.
Abstract:
In this paper I investigate the exercise policy of executive stock option holders in Finland, and the market reaction to it. The empirical tests are conducted with aggregated firm-level data from 34 firms and 41 stock option programs. I find some evidence of an inverse relation between the exercise intensity of the option holders and the future abnormal return of the company share price. This finding is supported by the view that information about future company prospects seems to be the only theoretical attribute that could delay the exercise of the options. Moreover, a high concentration of exercises at the beginning of the exercise window is predicted, and the market is expected to react to deviations from this. The empirical findings, however, show that the market does not react homogeneously to the information revealed by late exercises.
Abstract:
The use of different time units in option pricing may lead to inconsistent estimates of time decay and spurious jumps in implied volatilities. Different time units in the pricing model lead to different implied volatilities even though the option price itself is the same. The chosen time unit should make it necessary to adjust the volatility parameter only when there are fundamental reasons for doing so, not because of a wrong specification of the model. This paper examined the effects of pricing options under different time hypotheses and empirically investigated which time frame the option markets in Germany employ over weekdays. Specifically, the paper tries to get a picture of how the market prices options. The results seem to verify that the German market behaves in a fashion that deviates from the most traditional time units in option pricing, calendar and trading days. The study also showed that the implied volatility on Thursdays was somewhat higher, and thus differed from the pattern of the other days of the week. Using a GARCH model to investigate this effect further showed that although traditional tests, such as the analysis of variance, indicated a negative return for Thursdays during the same period as the implied volatilities used, this was not supported by the GARCH model.
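The core point, that the same option price backs out different implied volatilities under different time units, can be verified with a standard Black-Scholes calculation. The quote below is hypothetical: 10 calendar days to expiry, of which 6 are trading days, under calendar-day (365) and trading-day (252) year conventions:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes European call price; T in years."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, r, T, lo=1e-4, hi=5.0):
    """Bisection for implied volatility (call price is increasing in sigma)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, r, mid, T) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical at-the-money quote.
S, K, r, price = 100.0, 100.0, 0.03, 2.50
iv_calendar = implied_vol(price, S, K, r, 10 / 365)  # calendar-day clock
iv_trading = implied_vol(price, S, K, r, 6 / 252)    # trading-day clock
print(iv_calendar, iv_trading)  # same price, two different implied vols
```

Because the trading-day convention assigns less time to the same interval here, it backs out a higher volatility from the identical price, which is exactly the inconsistency the paper examines.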
Abstract:
This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through the "world volatility". We operationalize the volatility processes of Finnish leverage, industry, and size portfolio returns by asymmetric GARCH specifications according to Glosten et al. (1993). We use daily return data for January 2, 1987 to December 30, 1998. We find that the world shock significantly enters the domestic models, and that its impact has increased over time. This also applies to the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. The asymmetry parameter is, surprisingly, non-significant, and the leverage hypothesis cannot be verified. The return-generating process of the domestic portfolio returns does not usually include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
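The asymmetric GARCH specification of Glosten et al. (1993), often called GJR-GARCH, adds an indicator for negative past returns to the variance recursion; a minimal sketch with illustrative (not estimated) parameters, showing the asymmetry the paper tests for:

```python
def gjr_next_var(r_prev, sigma2_prev, omega, alpha, gamma, beta):
    """One-step conditional variance in the Glosten-Jagannathan-Runkle (1993)
    model: sigma_t^2 = omega + (alpha + gamma * 1[r < 0]) * r^2 + beta * s^2.
    gamma > 0 means negative shocks raise variance more (leverage effect)."""
    indicator = 1.0 if r_prev < 0 else 0.0
    return omega + (alpha + gamma * indicator) * r_prev**2 + beta * sigma2_prev

# Hypothetical parameters; compare equal-sized positive and negative shocks.
omega, alpha, gamma, beta = 0.02, 0.05, 0.10, 0.90
v_pos = gjr_next_var(+1.0, 1.0, omega, alpha, gamma, beta)
v_neg = gjr_next_var(-1.0, 1.0, omega, alpha, gamma, beta)
print(v_pos, v_neg)  # the negative shock yields the larger next-period variance
```

The paper's finding of a non-significant asymmetry parameter corresponds to gamma being statistically indistinguishable from zero, in which case the two variances above would coincide and the leverage hypothesis fails.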