956 results for Modern portfolio theory
Abstract:
Investors significantly overweight domestic assets in their portfolios. This behavior, commonly called “home bias”, contradicts the prescriptions of portfolio theory. This thesis explores potential reasons for the home bias by examining the characteristics of the investing and the target countries and features of the interaction between them. A common theme of the four essays is a focus on the importance of information about foreign markets in explaining the share of these markets in investors’ portfolios. The results indicate that the size of equity ownership in another country is strongly related to the distance to that country’s financial capital, and to trade in goods with, and direct investment (FDI) in, that country. The first essay empirically investigates the relationship between trade in real goods and portfolio investments. Overall, the evidence indicates a substantial role for trade in reducing the information costs relating to portfolio investments. The second essay examines the implications of the launch of the European Monetary Union (EMU) for international portfolio investments. The evidence on the allocation of Finnish international portfolio investments is more consistent with an information-based explanation than with a diversification motive. The third essay employs new data for a large number of countries and further explores the role of trade in international portfolio investments. The results indicate that trade provides important information, especially on firms in countries in which the corporate governance structure and the information environment of firms generate less reliable information. The fourth essay examines the relationship between direct investments (FDI) and portfolio investments. In contrast to the predictions of portfolio theory, it provides evidence that FDI is a complement rather than a substitute for portfolio investments.
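The kind of cross-country relation these essays estimate can be illustrated by a gravity-style regression of log foreign equity holdings on log distance, trade, and FDI. The following is a minimal sketch on synthetic data; the variable names and coefficients are illustrative assumptions, not the thesis's estimates.

import numpy as np

rng = np.random.default_rng(0)
n = 200                                            # synthetic country pairs
log_dist = rng.normal(8.0, 1.0, n)                 # log distance to the financial capital
log_trade = rng.normal(4.0, 1.0, n)                # log bilateral goods trade
log_fdi = rng.normal(3.0, 1.0, n)                  # log direct investment stock
# Synthetic holdings: decrease with distance, increase with trade and FDI.
log_holdings = (2.0 - 0.8 * log_dist + 0.5 * log_trade + 0.3 * log_fdi
                + rng.normal(0.0, 0.5, n))

X = np.column_stack([np.ones(n), log_dist, log_trade, log_fdi])
coef, *_ = np.linalg.lstsq(X, log_holdings, rcond=None)
print(dict(zip(["const", "log_dist", "log_trade", "log_fdi"], coef.round(2))))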
Abstract:
In this article we review classical and modern Galois theory, tracing its historical evolution, prove Galois's criterion for the solvability of an irreducible separable polynomial of prime degree over an arbitrary field k, and give many illustrative examples.
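For reference, the group-theoretic form of Galois's criterion can be stated as follows (a standard formulation, not quoted from the article). Let $f \in k[x]$ be irreducible and separable of prime degree $p$, with splitting field $K$. Then

\[
\mathrm{Gal}(K/k)\ \text{is solvable}
\iff \mathrm{Gal}(K/k)\ \text{embeds into}\ \{\,x \mapsto ax+b : a \in \mathbb{F}_p^{\times},\ b \in \mathbb{F}_p\,\}
\iff K = k(\alpha,\beta)\ \text{for any two distinct roots}\ \alpha,\beta\ \text{of}\ f .
\]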
Abstract:
Financial markets play a fundamental role in energizing modern economies. They offer listed companies the capital needed to drive their growth, and they give individual investors a way to diversify their portfolios, thereby sharing in the growth and vitality of the world economy. The management of portfolios of financial assets is a field that seeks to provide mechanisms for achieving an optimal trade-off between return and risk. To this end, numerous studies have contributed significantly to the efficiency and practice of this technique. This dissertation analyses the methodology developed by Elton-Gruber for constructing optimized portfolios and applies the underlying techniques to the Portuguese stock market. For this purpose, specialist bibliographic sources were surveyed and databases of historical prices of the stocks and of the national market index were consulted. The application covered stocks listed on the PSI-20 index during the period from 2010 to 2014. To improve the understanding of the sample return series, this quantitative study also relied on statistical analysis. The evidence shows that the optimized portfolio for the period under analysis contains only the shares of the company Portucel. This result is likely conditioned by the effects of the financial crisis that began in 2008.
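A minimal sketch of the Elton-Gruber ranking procedure the dissertation builds on, assuming single-index-model inputs (mean returns, betas, residual variances as NumPy arrays); this is an illustrative reconstruction of the standard method, not the dissertation's own code.

import numpy as np

def elton_gruber_portfolio(mean_ret, beta, resid_var, rf, market_var):
    """Elton-Gruber ranking procedure (single-index model, no short sales).
    mean_ret, beta, resid_var: 1-D arrays per stock; rf: risk-free rate;
    market_var: variance of the market-index returns."""
    erb = (mean_ret - rf) / beta                       # excess return to beta
    order = np.argsort(erb)[::-1]                      # rank stocks by ERB, descending
    num = np.cumsum((mean_ret[order] - rf) * beta[order] / resid_var[order])
    den = np.cumsum(beta[order] ** 2 / resid_var[order])
    c = market_var * num / (1.0 + market_var * den)    # candidate cut-off rates C_i
    included = erb[order] > c                          # keep stocks whose ERB exceeds C_i
    c_star = c[included][-1]                           # cut-off rate C* (last included stock)
    z = np.zeros_like(mean_ret, dtype=float)
    sel = order[included]
    z[sel] = beta[sel] / resid_var[sel] * (erb[sel] - c_star)
    return z / z.sum()                                 # normalised portfolio weights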
Abstract:
Barry Saltzman was a giant in the fields of meteorology and climate science. A leading figure in the study of weather and climate for over 40 years, he was frequently referred to as the "father of modern climate theory." Ahead of his time in many ways, Saltzman made significant contributions to our understanding of the general circulation and spectral energetics budget of the atmosphere, as well as climate change across a wide spectrum of time scales. In his endeavor to develop a unified theory of how the climate system works, he played a role in the development of energy balance models, statistical dynamical models, and paleoclimate dynamical models. He was a pioneer in developing meteorologically motivated dynamical systems, including the progenitor of Lorenz's famous chaos model. In applying his own dynamical-systems approach to long-term climate change, he recognized the potential for using atmospheric general circulation models in a complementary way. In 1998, he was awarded the Carl-Gustaf Rossby Medal, the highest honor of the American Meteorological Society, "for his life-long contributions to the study of the global circulation and the evolution of the earth's climate." In this paper, the authors summarize and place into perspective some of the most significant contributions that Barry Saltzman made during his long and distinguished career. This short review also serves as an introduction to the papers in this special issue of the Journal of Climate dedicated to Barry's memory.
Abstract:
This paper introduces a normative view of corporate reputation management: an algorithmic model for reputation-driven strategic decision making is proposed, and corporate reputation is conceptualized as shaped by the selection among organizational priorities. A portfolio-based approach is put forward; drawing on the foundations of portfolio theory, we create a portfolio-based reputation management model in which reputation components and priorities are weighted by decision makers and shape organizational change, in an attempt to formulate a corporate reputation strategy. The rationale of this paper rests on the foundational view of organizations as choosing the optimal strategy by seeking to maximize performance on corporate reputation capital while maintaining organizational stability and minimizing organizational risk.
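Purely as an illustration of the portfolio-based framing (the paper's actual model, component names, and numbers are not reproduced here), weighting reputation priorities could be sketched as a small mean-variance problem:

import numpy as np

# Hypothetical reputation priorities and decision-maker inputs.
expected_gain = np.array([0.6, 0.4, 0.5])        # expected reputation payoff per priority
cov = np.array([[0.04, 0.01, 0.00],              # "risk" (volatility/co-movement) of priorities
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.05]])
risk_aversion = 3.0

# Unconstrained mean-variance solution: w proportional to inv(cov) @ mu, then normalised.
raw = np.linalg.solve(cov, expected_gain) / risk_aversion
weights = raw / raw.sum()
print(weights.round(2))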
Abstract:
Whether to keep products segregated (e.g., unbundled) or integrate some or all of them (e.g., bundle) has been a problem of profound interest in areas such as portfolio theory in finance, risk capital allocation in insurance, and the marketing of consumer products. Such decisions are inherently complex and depend on factors such as the underlying product values and consumer preferences, the latter frequently being described using value functions, also known as utility functions in economics. In this paper, we develop decision rules for multiple products, which we generally call ‘exposure units’ to naturally cover manifold scenarios extending well beyond ‘products’. Our findings show, for example, that Thaler's celebrated principles of mental accounting hold as originally postulated when the values of all exposure units are positive (i.e. all are gains) or all negative (i.e. all are losses). In the case of exposure units with mixed-sign values, the decision rules are much more complex and rely on cataloguing a Bell number of cases, which grows very quickly with the number of exposure units. Consequently, in the present paper, we provide detailed rules for the integration and segregation decisions for up to three exposure units, and partial rules for an arbitrary number of units.
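A small illustration, not taken from the paper, of how Thaler's same-sign principles follow from a value function that is concave for gains and convex (and steeper) for losses; the parameter values below are the familiar Tversky-Kahneman estimates and are assumptions here.

def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory style value function: concave for gains,
    convex and steeper (loss aversion lam) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def prefer_segregation(outcomes):
    """Compare the perceived value of presenting outcomes separately
    (segregation) versus as one combined amount (integration)."""
    segregated = sum(value(x) for x in outcomes)
    integrated = value(sum(outcomes))
    return segregated > integrated

print(prefer_segregation([50, 50]))     # True: segregate gains
print(prefer_segregation([-50, -50]))   # False: integrate losses
print(prefer_segregation([100, -20]))   # False: a small loss is integrated with a larger gain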
Abstract:
In recent times, technology has advanced in such a manner that the world can now communicate in ways previously never thought possible. Transnational organised crime (TOC) groups have not overlooked this development, growth and globalisation, and have exploited these new technologies as a basis for their criminal success. Law enforcement agencies face an unremitting challenge as they endeavour to intercept, monitor and analyse these communications as a means of disrupting the activities of criminal enterprises. The challenge lies in the ability to recognise and change tactics to match an increasingly sophisticated adversary. The use of communication interception technology (CIT), such as phone taps or email interception, is a tactic that, when used appropriately, has the potential to cause serious disruption to criminal enterprises. Despite the research that exists on CIT and TOC, these two bodies of knowledge rarely intersect. This paper builds on the current literature, drawing the two together to provide a clearer picture of the use of CIT in an enforcement and intelligence capacity. It reviews the literature pertaining to TOC, the structure of criminal enterprises and the vulnerability of the communication used by these crime groups. Identifying the contemporary models of policing, it reviews intelligence-led policing as the emerging framework for modern policing. Finally, it assesses the literature concerning CIT, its uses within Australia and the limitations and arguments that exist. In doing so, the paper provides practitioners with a clearer picture of the uses, barriers and benefits of CIT in the fight against TOC. It helps to bridge current gaps in modern policing theory and offers a perspective that can help drive future research.
Abstract:
"In the past few years, many career theorists have noted the dearth of literature in the area of career development in childhood and adolescence. A growing need for integrating theory and research on the early stages of vocational development within a systemic, life-span developmental approach has been articulated. This volume, the first book dedicated to career development of children and adolescents, provides a broad and comprehensive overview of the current knowledge about the key career processes that take place in this age group. Each of the eighteen chapters represents an in-depth examination of a specific aspect of career development with a focus on integrating modern career theory and ongoing research and further developing theory-practice connections in understanding child and adolescent career behaviour. Twenty-six authors, leading experts from eight countries, provide a state-of-the-art summary of the current thinking in the field and outline directions for future empirical work and practice."--publisher website
Abstract:
A known limitation of the Probability Ranking Principle (PRP) is that it does not cater for dependence between documents. Recently, the Quantum Probability Ranking Principle (QPRP) has been proposed, which implicitly captures dependencies between documents through “quantum interference”. This paper explores whether this new ranking principle leads to improved performance for subtopic retrieval, where novelty and diversity are required. In a thorough empirical investigation, models based on the PRP, as well as other recently proposed ranking strategies for subtopic retrieval (i.e. Maximal Marginal Relevance (MMR) and Portfolio Theory (PT)), are compared against the QPRP. On the given task, it is shown that the QPRP outperforms these other ranking strategies. Unlike MMR and PT, the QPRP requires no parameter estimation or tuning, making it both simple and effective. This research demonstrates that the application of quantum theory to problems within information retrieval can lead to significant improvements.
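In the form usually given in the QPRP literature (the exact estimator of the interference term used in this paper is not reproduced here), the document placed at the next rank position maximizes its own relevance probability plus interference with the documents already ranked:

\[
d_k = \arg\max_{d}\ \Big[\, P(R \mid d) + \sum_{d_j \in \text{ranked}} 2\sqrt{P(R \mid d)\,P(R \mid d_j)}\,\cos\theta_{d,d_j} \,\Big],
\]

where a negative $\cos\theta_{d,d_j}$, typically estimated from inter-document similarity, penalizes redundancy.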
Abstract:
Ranking documents according to the Probability Ranking Principle has been theoretically shown to guarantee optimal retrieval effectiveness in tasks such as ad hoc document retrieval. This ranking strategy assumes independence among document relevance assessments. This assumption, however, often does not hold, for example in scenarios where redundancy in the retrieved documents is of major concern, as is the case in the subtopic retrieval task. In this chapter, we propose a new ranking strategy for subtopic retrieval that builds upon interdependent document relevance and topic-oriented models. With respect to the topic-oriented model, we investigate both static and dynamic clustering techniques, aiming to group topically similar documents. Evidence from the clusters is then combined with information about document dependencies to form a new document ranking. We compare and contrast the proposed method against state-of-the-art approaches, such as Maximal Marginal Relevance, Portfolio Theory for Information Retrieval, and standard cluster-based diversification strategies. The empirical investigation is performed on the ImageCLEF 2009 Photo Retrieval collection, where images are assessed with respect to subtopics of a more general query topic. The experimental results show that our approaches outperform the state-of-the-art strategies with respect to a number of diversity measures.
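For reference, a minimal sketch of the Maximal Marginal Relevance baseline mentioned above, in its standard formulation (not the chapter's implementation); query_sim and doc_sim are assumed to be precomputed similarities.

def mmr_rank(query_sim, doc_sim, k, lam=0.5):
    """Maximal Marginal Relevance re-ranking.
    query_sim: per-document query similarities; doc_sim: pairwise document
    similarity matrix; lam trades off relevance against novelty."""
    candidates = list(range(len(query_sim)))
    ranking = []
    while candidates and len(ranking) < k:
        def mmr_score(d):
            # Redundancy is the highest similarity to any already-ranked document.
            redundancy = max(doc_sim[d][s] for s in ranking) if ranking else 0.0
            return lam * query_sim[d] - (1.0 - lam) * redundancy
        best = max(candidates, key=mmr_score)
        ranking.append(best)
        candidates.remove(best)
    return ranking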
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double-slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double-slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insights into approaches for document ranking, we (1) analyse the PRP, the qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. the PRP, the interactive PRP, and the qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, and (4) study the ranking behaviours of approaches alternative to the PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing the PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad hoc retrieval, the PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of the qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio Theory, and the interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and in retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research.
These include: (1) investigating estimations and approximations of quantum interference in the qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying the qPRP to tasks other than document ranking.
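The deviation from Kolmogorovian additivity referred to above is, in the double-slit setting, the standard interference term:

\[
p_{AB}(x) = \lvert \varphi_A(x) + \varphi_B(x) \rvert^{2} = p_A(x) + p_B(x) + 2\sqrt{p_A(x)\,p_B(x)}\,\cos\theta_{AB},
\]

where $p_A + p_B$ is what the additivity rule for disjoint events would predict, and the cosine term is the quantum interference that the qPRP carries over to document ranking.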
Abstract:
This dissertation studies the language of Latin letters that were written in Egypt and Vindolanda (in northern Britain) during the period from the 1st century BC to the 3rd century AD on papyri, ostraca, and wooden tablets. The majority of the texts are, in one way or another, connected with the Roman army. The focus of the study is on syntax and pragmatics. Besides traditional philological methods, modern syntactic theory is used as well, especially in the pragmatic analysis. The study begins with a critical survey of certain concepts that are current in research on the Latin language, most importantly the concept of 'vulgar Latin', which, it is argued, seems to be used as an abstract noun for variation and change in Latin. Further, it is necessary to treat even the non-literary material primarily as written texts and not as straightforward reflections of spoken language. An examination of letter phraseology shows that there is considerable variation between the two major geographical areas of provenance. Latin letter writing in Egypt was influenced by Greek. The study highlights the importance of seeing the letters as a text type, with recurring phraseological elements appearing in the body text as well. It is argued that recognising these elements is essential for the correct analysis of the syntax. Three areas of syntax are discussed in detail: sentence connection (mainly parataxis), syntactically incoherent structures, and word order (the order of the object and the verb). For certain types of sentence connection we may plausibly posit an origin in spoken Latin, but for many other linguistic phenomena attested in this material the issue of spoken Latin is anything but simple. Concerning the study of historical syntax, the letters offer information about the changing status of the accusative case. Incoherent structures may reflect contaminations in spoken language, but usually the reason for them is the writer's inability to put his thoughts into writing, especially when something more complicated has to be expressed. Many incoherent expressions reflect the need to start the predication with a thematic constituent. Latin word order is seen as resulting from an interaction of syntactic and pragmatic factors. The preference for an order in which the topic is placed sentence-initially can be seen in word order more generally as well. Furthermore, there appears to be a difference between Egypt and Vindolanda. The letters from Vindolanda show the order O(bject) V(erb) clearly more often than the letters from Egypt. Interestingly, this difference correlates with another, namely the use of the anaphoric pronoun is. This is an interesting observation in view of the fact that both of these are traditional Latin features, as opposed to those that foreshadow the Romance development (VO order and the use of the anaphoric ille). However, it is difficult to say whether this is an indication of social or regional variation.
Abstract:
A nonlinear control design based on modern system theory is discussed in this paper for the successful operation of an air-breathing engine operating at supersonic speed. The primary objective of the control design for such an air-breathing engine is to ensure that the engine dynamically produces a thrust that tracks the commanded thrust value as closely as possible, by regulating the fuel flow to the combustion system. However, since the engine operates in the supersonic range, an important secondary objective is to manage the shock-wave configuration in the intake section of the engine, which is manipulated by varying the throat area of the nozzle. A nonlinear sliding mode control technique has been successfully used to achieve both of the above objectives. Because the process is faster than the actuators, independent control designs are also carried out for the actuators to assure satisfactory performance of the system. Moreover, to filter out the sensor and process noise and to estimate the states so that the control design can operate on output feedback, an Extended Kalman Filter based state estimation design is also carried out. The promising simulation results suggest that the proposed control design approach is quite successful in obtaining robust performance of the air-breathing engine.
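A generic first-order sliding-mode tracking law for a second-order plant, as a minimal sketch of the control technique named above; the engine dynamics, sliding surface, and gains used in the paper are not reproduced, and all names and values here are illustrative.

import numpy as np

def smc_step(x, x_dot, x_ref, x_ref_dot, x_ref_ddot, f_hat, b_hat,
             lam=5.0, K=10.0, phi=0.05):
    """One control update for a plant x_ddot = f(x) + b*u.
    f_hat, b_hat: model estimates; lam: surface slope; K: switching gain;
    phi: boundary-layer width used to soften the sign() term."""
    e, e_dot = x - x_ref, x_dot - x_ref_dot
    s = e_dot + lam * e                                  # sliding surface s = e_dot + lam*e
    u_eq = (x_ref_ddot - f_hat - lam * e_dot) / b_hat    # equivalent control (drives s_dot to 0)
    u_sw = -K * np.clip(s / phi, -1.0, 1.0) / b_hat      # boundary-layer switching term
    return u_eq + u_sw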
Abstract:
Starting from the early decades of the twentieth century, evolutionary biology began to acquire mathematical overtones. This took place via the development of a set of models in which the Darwinian picture of evolution was shown to be consistent with the laws of heredity discovered by Mendel. The models, which came to be elaborated over the years, define a field of study known as population genetics. Population genetics is generally looked upon as an essential component of modern evolutionary theory. This article deals with a famous dispute between J. B. S. Haldane, one of the founders of population genetics, and Ernst Mayr, a major contributor to the way we understand evolution. The philosophical undercurrents of the dispute remain relevant today. Mayr and Haldane agreed that genetics provided a broad explanatory framework for how evolution took place but differed over the relevance of the mathematical models that sought to underpin that framework. The dispute began with a fundamental issue raised by Mayr in 1959: in terms of understanding evolution, did population genetics contribute anything beyond the obvious? Haldane's response came just before his death in 1964. It contained a spirited defense, not just of population genetics, but also of the motivations that lie behind mathematical modelling in biology. While the difference of opinion persisted and was not glossed over, the two continued to maintain cordial personal relations.