956 results for Modern portfolio theory
Natural Law and Civil Sovereignty: moral right and state authority in early modern political thought
Abstract:
Theatre, like other subjects in the humanities, has recently undergone profound changes in theory, approach, and research. Modern Drama - a collection of twelve essays from leading theatre and drama scholars - investigates the contemporary meanings and the cultural and political resonances of the terms inherent in the concepts of 'modern' and 'drama,' delving into a range of theoretical questions on the history of modernism, modernity, postmodernism, and postmodernity as they have intersected with the shifting histories of drama, theatre, and performance.
Abstract:
The reweaving and repaving of the modern Silk Road passes through outsourcing and offshoring activities that have a profound impact on both the global business psyche and landscape. Firms, in particular, and their global value chains are being shaped and reshaped through a complex concoction of vertical integration and disintegration. The boundary of the firm and the firm/market interface have been of interest to students of organisation and economics for some time, and have provided the context for Internalisation Theory. Within the new economy, the twin trends of globalisation and advancing technologies are giving rise to a hitherto unknown “worldwide market for market transactions” and increased opportunities for international expansion by firms via market-based modes of organisation. We describe these trends and offer an early modeling approach for explaining why some firms externalise the marginal transaction in the so-called new economy. The paper further draws attention to the need to articulate an “Externalisation Theory” that adequately accounts for firms' offshoring and outsourcing activities, and that parallels as well as complements “Internalisation Theory” for a full explanation of today's firms' behaviour.
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
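To make the factorized approximation concrete, here is a minimal LaTeX sketch of the variational mean-field bound alluded to above; the notation (q, the factorized family, the unnormalized density) is illustrative and not taken from the book.

```latex
% Mean-field variational approximation (illustrative notation, not from the book):
% approximate an intractable joint p(x_1,...,x_N) by a fully factorized q.
\[
  q(x_1,\dots,x_N) \;=\; \prod_{i=1}^{N} q_i(x_i),
  \qquad
  q^\star \;=\; \arg\min_{q \in \mathcal{Q}_{\text{fact}}}
  \mathrm{KL}\!\left( q \,\|\, p \right).
\]
% Minimizing the KL divergence is equivalent to maximizing the variational
% lower bound (the negative free energy):
\[
  \log Z \;\geq\; \mathbb{E}_{q}\!\left[ \log \tilde p(x) \right] \;+\; H[q],
\]
% where \tilde p is the unnormalized distribution, Z its normalizer,
% and H[q] the entropy of q. The TAP approach adds correlation (reaction)
% corrections on top of this factorized starting point.
```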
Abstract:
Improving student training with modern information technologies, such as the collective development of educational software, is discussed. Organizational, technical, and technological recommendations are given, and experience with using information technologies in the educational course “Decision Theory” is described.
Abstract:
Portfolio analysis has existed, perhaps, for as long as people have thought about making rational decisions on the use of limited resources. The emergence of portfolio analysis as a discipline, however, can be dated precisely to the publication of Harry Markowitz's pioneering work (Markowitz, H., Portfolio Selection) in 1952. The model proposed in that work, simple enough in essence, captured the basic features of the financial market from the investor's point of view and supplied the investor with a tool for developing rational investment decisions. The central problem in Markowitz's theory is the choice of a portfolio, that is, a set of operations. In evaluating both individual operations and portfolios, two major factors are considered: the return and the risk of the operations and of their portfolios, with risk receiving a quantitative estimate. An essential element of the theory is the account taken of the mutual correlations between the returns of operations. This allows effective diversification of the portfolio, leading to a substantial decrease in portfolio risk compared with the risk of the operations included in it. Finally, the quantitative characterization of the basic investment properties makes it possible to define and solve the problem of choosing an optimal portfolio as a quadratic optimization problem.
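As a concrete illustration of the quadratic-optimization formulation described above, here is a minimal Python sketch (not from the abstract) that computes the global minimum-variance portfolio in closed form from a covariance matrix; the covariance numbers are purely illustrative.

```python
import numpy as np

# Illustrative covariance matrix of three assets' returns (made-up numbers).
cov = np.array([
    [0.040, 0.006, 0.012],
    [0.006, 0.090, 0.018],
    [0.012, 0.018, 0.160],
])

# Global minimum-variance portfolio (fully invested, shorting allowed):
#   minimize  w' cov w   subject to  sum(w) = 1
# has the closed-form solution  w = cov^{-1} 1 / (1' cov^{-1} 1).
ones = np.ones(cov.shape[0])
w = np.linalg.solve(cov, ones)
w /= ones @ w

portfolio_variance = w @ cov @ w
print("weights:", np.round(w, 3))
print(f"portfolio std dev: {np.sqrt(portfolio_variance):.4f}")
```

Adding a target-return constraint turns this into the general Markowitz mean-variance problem, which is still a quadratic program and is typically solved with a QP solver rather than in closed form.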
Abstract:
AMS subject classification: 93C95, 90A09.
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2014
Abstract:
Economic theories of rational addiction aim to describe consumer behavior in the presence of habit-forming goods. We provide a biological foundation for this body of work by formally specifying conditions under which it is optimal to form a habit. We demonstrate the empirical validity of our thesis with an in-depth review and synthesis of the biomedical literature concerning the action of opiates in the mammalian brain and their effects on behavior. Our results lend credence to many of the unconventional behavioral assumptions employed by theories of rational addiction, including adjacent complementarity and the importance of cues, attention, and self-control in determining the behavior of addicts. We offer evidence for the special case of the opiates that "harmful" addiction is the manifestation of a mismatch between behavioral algorithms encoded in the human genome and the expanded menu of choices faced by consumers in the modern world.
Abstract:
Mainstream economists recognize that the socialist system was marked by chronic shortage, but they consider that the capitalist system exhibits market equilibrium, give or take some greater or lesser fluctuations. This study, however, contrasts two market states. One is dominated by phenomena of excess demand, though instances of excess supply appear as well; this the author calls a shortage economy. The other is dominated by phenomena of excess supply, though instances of excess demand appear as well; this the author terms a surplus economy. Part II of the study starts by summing up its main propositions. Just as the shortage economy is an intrinsic, immanent trait of the socialist system, so the creation and continual reproduction of the surplus economy is an intrinsic, immanent trait of the capitalist system. Such genetic traits may be strengthened or weakened by state intervention, but not eliminated by it. The study reviews the favourable and detrimental effects of the surplus economy. Of the favourable effects, it is emphasized that without surplus (above all surplus capacity) there cannot develop among producers or sellers the rivalry that provides the main impetus for the innovation process. Having examined the general case of the surplus economy, the study turns to various special cases: the trade-cycle fluctuations of the economy, the war economy, the historic changes appearing in modern capitalism, the market-oriented reforms that appeared within the socialist system, and the post-socialist transition.
Abstract:
Critical Management Studies (CMS) as a field of organization studies (OS) has become central internationally, and especially in Europe, yet its appearance is still very rare in the Hungarian OS literature. In this study the authors first discuss how the currently dominant organizational practices, along with the mainstream management and organization theories, are to be criticized from a Critical Management perspective. In the main section, so as to define CMS, they make important theoretical distinctions, first between CMS and mainstream organization theories in general, and then among the different critical approaches that nevertheless all fall under the broad CMS umbrella. In line with a truly critical attitude, however, they do not stop at theoretical discussion: the purpose of this introductory, problem-raising paper (and of the article series planned to follow it) is also to address important problems in both the theory and the practice of organization and management. It could therefore serve as the opening of an important debate or dialogue in the Hungarian academic community (researchers, educators and other professionals), a theoretical discussion that could have real influence on organizational practice too.
Innovációs kalandozások az elmélettől a stratégiáig = Innovation adventuring from theory to strategy
Abstract:
The aim of this article is to bring the reader closer to the topics of innovation and innovation management. The study begins its treatment of innovation from the foundations of the theory of the firm and arrives at concrete strategic considerations by the end. Drawing on a wide range of Hungarian and international literature, it uncovers the roots of innovation in the theory of the firm. The literature used is not left at a theoretical level: the study translates these theoretical concepts into real, practice-oriented business language. Its purpose is to distil the theories scattered across the field and synthesize them with modern management principles. The article examines innovation from the perspective of corporate value creation. It finds that innovation integrates the teachings of numerous theories of the firm, and consequently its strategic implications can span a wide spectrum. As innovation induces change in the organization, complex optimization dilemmas emerge which, in a turbulent economic environment with shortening reaction times, pose an ever greater challenge for managers. The article presents these dilemmas with a debate-opening attitude, synthesizing theory and practice.
Abstract:
Extreme stock price movements are of great concern to both investors and the entire economy. For investors, a single negative return, or a combination of several smaller returns, can possibly wipe out so much capital that the firm or portfolio becomes illiquid or insolvent. If enough investors experience this loss, it could shock the entire economy. An example of such a case is the stock market crash of 1987. Furthermore, there has been a lot of recent interest regarding the increasing volatility of stock prices. This study presents an analysis of extreme stock price movements. The data utilized were the daily returns for the Standard and Poor's 500 index from January 3, 1978 to May 31, 2001. Research questions were analyzed using the statistical models provided by extreme value theory. One of the difficulties in examining stock price data is that there is no consensus regarding the correct shape of the distribution function generating the data. An advantage of extreme value theory is that no detailed knowledge of this distribution function is required to apply the asymptotic theory; we focus on the tail of the distribution. Extreme value theory allows us to estimate a tail index, which we use to derive bounds on the returns for very low probabilities of an excess. Such information is useful in evaluating the volatility of stock prices. There are three possible limit laws for the maximum: Gumbel (thin-tailed), Fréchet (heavy-tailed) or Weibull (bounded tail). Results indicated that extreme returns during the time period studied follow a Fréchet distribution. Thus, this study finds that extreme value analysis is a valuable tool for examining stock price movements and can be more efficient than the usual variance in measuring risk.
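As a hedged illustration of the kind of tail-index estimation that extreme value theory permits, the following Python sketch applies the classical Hill estimator to simulated heavy-tailed returns; the simulated series and the choice of k are placeholders, not the S&P 500 data analyzed in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder data: simulated daily returns with heavy (Student-t) tails,
# standing in for the actual S&P 500 series used in the study.
returns = 0.01 * rng.standard_t(df=3, size=5000)

# Hill estimator of the tail index from the k largest losses.
losses = np.sort(-returns)[::-1]        # positive losses, sorted descending
k = 200                                 # number of upper order statistics (a tuning choice)
top = losses[:k]
threshold = losses[k]
hill_gamma = np.mean(np.log(top / threshold))   # estimate of 1/alpha
alpha_hat = 1.0 / hill_gamma                    # tail index alpha

print(f"Hill estimate of tail index alpha: {alpha_hat:.2f}")
# A finite positive alpha indicates Pareto-type tails, i.e. the Fréchet
# domain of attraction for the maximum.
```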
Abstract:
This dissertation focused on an increasingly prevalent phenomenon in today's global business environment: the strategic alliance portfolio. Building on the resource-based view, resource dependency theory, and real options theory, it adopted a multi-dimensional perspective to examine the performance implications and strategic antecedents of alliance portfolio configuration, and its strategic effects on firms' decision-making about their continuing foreign expansion. The dissertation consisted of three interrelated essays, each of which dealt with a specific research question. In the first essay I applied a two-dimensional construct that embraces both alliance relations' and alliance partners' attributes to characterize alliance portfolio configuration. Based on this framework, a longitudinal study was conducted to explore the performance properties of alliance portfolio configuration. The results revealed that alliance diversity and partner diversity make different relative contributions to firms' economic performance, and that the relationship between alliance portfolio configuration and firm performance is shaped by the degree of multinationality in a curvilinear pattern. The second essay sought to identify the firm-level driving forces of alliance portfolio configuration and how these forces, interacting with firms' internationalization, influence firms' strategic choices on alliance portfolio configuration. The empirical results indicated that past alliance experience, slack resources, and firms' brand images are three critical determinants shaping alliance portfolios, but that these shaping relationships are conditioned by firms' multinationality. The third essay primarily employed real options theory to build a conceptual framework revealing how country-, alliance portfolio-, firm-, and industry-level factors and their interactions influence firms' strategic decision-making on post-entry continuing expansion in foreign markets. The two empirical studies were situated in the global hospitality and travel industries and used panel data to test the relevant theoretical models. Overall, the dissertation advanced and enriched the theoretical domain of the alliance portfolio. In particular, it shed valuable light on three fundamental questions in alliance portfolio research, namely "if and how alliance portfolios contribute to firms' economic performance"; "what determines the appearance of alliance portfolios"; and "how alliance portfolios affect firms' strategic decision-making". The dissertation also extended international business and strategic management research on service multinationals' foreign expansion and performance.
Abstract:
X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high amount of radiation dose to the patient compared to other x-ray imaging modalities and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All things being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.
A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.
Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.
The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer's performance in completing the task at hand (e.g., detection sensitivity/specificity).
First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection – FBP vs Advanced Modeled Iterative Reconstruction – ADMIRE). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.
Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
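To make the comparison between CNR and model-observer metrics concrete, here is a small Python sketch (my own simplification, not the dissertation's implementation) computing CNR and a non-prewhitening matched-filter detectability index under an assumed white-noise model with made-up lesion and noise parameters.

```python
import numpy as np

def cnr(lesion_pixels, background_pixels):
    """Contrast-to-noise ratio: |mean difference| over background noise."""
    return abs(lesion_pixels.mean() - background_pixels.mean()) / background_pixels.std()

def npw_dprime_white_noise(signal_template, noise_sigma):
    """
    Non-prewhitening matched-filter detectability index under the
    simplifying assumption of white (uncorrelated) noise: d' = ||s|| / sigma,
    where s is the noiseless expected signal (lesion minus background).
    Real CT noise is correlated, so this is only an illustrative special case.
    """
    return np.sqrt(np.sum(signal_template ** 2)) / noise_sigma

# Toy example: a faint 10 HU disc on a flat background with 15 HU noise.
yy, xx = np.mgrid[-32:32, -32:32]
disc = (xx ** 2 + yy ** 2) <= 8 ** 2
signal = 10.0 * disc
sigma = 15.0

rng = np.random.default_rng(0)
noisy_image = signal + rng.normal(0.0, sigma, size=signal.shape)

print("CNR:", round(cnr(noisy_image[disc], noisy_image[~disc]), 2))
print("NPW d' (white-noise approximation):",
      round(npw_dprime_white_noise(signal, sigma), 2))
```

Unlike CNR, the matched-filter index accounts for the lesion's size and shape, which is one reason task-based indices track human performance more closely.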
The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
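The image-subtraction noise measurement mentioned above has a simple standard form; the sketch below shows it for two repeated scans of the same phantom, assuming independent, identically distributed noise in the two acquisitions (the function name and interface are mine, not the dissertation's).

```python
import numpy as np

def quantum_noise_from_subtraction(scan_a, scan_b):
    """
    Estimate per-image noise standard deviation from two repeated scans of
    the same (static) phantom. Subtracting the scans removes the common
    deterministic structure; the difference image has twice the single-image
    variance, hence the division by sqrt(2). Assumes independent,
    identically distributed noise in the two acquisitions.
    """
    diff = scan_a.astype(float) - scan_b.astype(float)
    return diff.std() / np.sqrt(2.0)
```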
To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
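For context, the conventional square-ROI form of the NPS estimate is sketched below in Python; this is the standard textbook calculation under an ensemble of detrended ROIs, not a reproduction of the dissertation's irregular-ROI method.

```python
import numpy as np

def nps_2d(rois, pixel_size_mm):
    """
    Conventional 2D noise power spectrum estimate from square, detrended
    ROIs extracted from repeated scans (the standard square-ROI method;
    the dissertation's irregular-ROI variant is not reproduced here).
    """
    rois = np.asarray(rois, dtype=float)
    n_roi, ny, nx = rois.shape
    nps = np.zeros((ny, nx))
    for roi in rois:
        detrended = roi - roi.mean()                  # remove the mean (DC) term
        dft = np.fft.fftshift(np.fft.fft2(detrended))
        nps += np.abs(dft) ** 2
    # Normalization: pixel area over the number of samples, averaged over ROIs.
    return nps * (pixel_size_mm ** 2) / (nx * ny) / n_roi
```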
To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.
The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
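As a hedged illustration of what such an analytical lesion model might look like, the sketch below voxelizes a spherically symmetric lesion whose contrast rolls off through a sigmoid edge; the functional form, parameter names, and numbers are mine, not the dissertation's actual models.

```python
import numpy as np

def spherical_lesion(shape, center, radius_mm, contrast_hu,
                     edge_width_mm, voxel_size_mm):
    """
    Voxelize a simple analytical lesion model: a spherically symmetric
    object of given contrast whose edge rolls off through a logistic
    (sigmoid) profile. Illustrative form only, not the dissertation's model.
    """
    grids = np.indices(shape).astype(float)
    r_sq = np.zeros(shape)
    for axis in range(3):
        r_sq += ((grids[axis] - center[axis]) * voxel_size_mm) ** 2
    r = np.sqrt(r_sq)
    # Contrast is ~contrast_hu well inside the radius and decays across the edge.
    return contrast_hu / (1.0 + np.exp((r - radius_mm) / edge_width_mm))

# Example: a 10 mm diameter, -15 HU lesion on a 64^3 grid of 0.5 mm voxels.
lesion = spherical_lesion((64, 64, 64), center=(32, 32, 32), radius_mm=5.0,
                          contrast_hu=-15.0, edge_width_mm=0.8,
                          voxel_size_mm=0.5)
print(lesion.min(), lesion.shape)
```

A voxelized model like this can then be added to (or forward-projected into) patient data to form the “hybrid” images described above, with the lesion's true size, contrast, and location known exactly.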
Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.
The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Also, lesion-less images were reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard of care dose.
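For readers unfamiliar with how a detectability index connects to a two-alternative forced-choice (2AFC) experiment, the standard signal-detection relation (general theory, not specific to this dissertation) links expected percent correct to d' via the normal cumulative distribution function.

```latex
% Standard relation between the detectability index d' and the expected
% percent correct (PC) in a 2AFC experiment:
\[
  \mathrm{PC} \;=\; \Phi\!\left( \frac{d'}{\sqrt{2}} \right),
\]
% where \Phi is the standard normal CDF.
% For example, d' = 1 gives PC \approx 0.76.
```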
In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.