990 results for pricing strategies
Abstract:
Purpose: The paper aims to further extend our understanding by assessing the extent to which two prominent cultural values in East Asia, i.e. face saving and group orientation, drive consumers' perceptions of luxury goods across four East Asian markets. Design/methodology/approach: A multi-methods research approach was adopted, consisting of: an expert panel of close to 70 participants, group discussions with five extended East Asian families, personal interviews with eight East Asian scholars, a pilot test with over 50 East Asian graduate students and a multi-market survey of 443 consumer respondents in Beijing, Tokyo, Singapore and Hanoi. Findings: The authors extend previous conceptual studies by empirically investigating the impact of these two cultural values on the perception of luxury among East Asian societies. Specifically, the study reveals that across all four markets face saving has the strongest influence on the conspicuous and hedonistic dimensions of luxury, while group orientation is the strongest predictor of the quality, extended-self and exclusivity dimensions of luxury. Collectively, these two cultural values significantly influence East Asian perceptions of luxury. Overall, the findings reiterate the importance of understanding different cultural values and their influence across different East Asian societies. Practical implications: The findings have important implications for managers of western luxury branded goods who seek to penetrate East Asian markets or to serve East Asian consumers, specifically by assisting with the development of suitable brand positioning, products, services, communications and pricing strategies. Originality/value: This study contributes to our understanding of the subject by exploring the impact of face saving and group orientation on the perception of luxury goods across four East Asian countries. Several directions for future research are suggested. © Emerald Group Publishing Limited.
Abstract:
This dissertation discussed resource allocation mechanisms in several network topologies, including infrastructure wireless networks, non-infrastructure wireless networks and wire-cum-wireless networks. Different networks may have different resource constraints. Based on actual technologies and implementation models, utility functions, game theory and a modern control algorithm have been introduced to balance power, bandwidth and customer satisfaction in the system. In infrastructure wireless networks, a utility function was used in the Third Generation (3G) cellular network and the network sought to maximize the total utility. In this dissertation, revenue maximization was set as the objective. Compared with previous work on utility maximization, revenue maximization is more practical for cellular network operators to implement. Pricing strategies were studied and algorithms were given to find the optimal price combination of power and rate that maximizes profit without degrading Quality of Service (QoS) performance. In non-infrastructure wireless networks, power capacity is limited by the small size of the nodes. In such a network, nodes need to transmit traffic not only for themselves but also for their neighbors, so power management becomes the most important issue for overall network performance. Our innovative routing algorithm based on a utility function sets up a flexible framework for different users with different concerns in the same network. This algorithm allows users to make trade-offs between multiple resource parameters. Its flexibility makes it a suitable solution for large-scale non-infrastructure networks. This dissertation also covers non-cooperation problems. By combining game theory and utility functions, equilibrium points can be found among rational users, which can enhance cooperation in the network. Finally, a wire-cum-wireless network architecture was introduced.
This network architecture can support multiple services over multiple networks with smart resource allocation methods. Although a SONET-to-WiMAX case was used for the analysis, the mathematical procedure and resource allocation scheme could be universal solutions for all infrastructure, non-infrastructure and combined networks.
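The revenue-maximization step described above can be illustrated with a minimal sketch: a grid search over the prices charged for power and for rate, maximizing operator revenue subject to a QoS floor. The linear demand model, the price grid and the QoS threshold are all illustrative assumptions, not the dissertation's actual formulation.

```python
import itertools

def demand(price_power, price_rate):
    """Hypothetical linear demand: users buy less power/rate as prices rise."""
    power = max(0.0, 10.0 - 2.0 * price_power)
    rate = max(0.0, 8.0 - 1.5 * price_rate)
    return power, rate

def revenue(price_power, price_rate):
    """Operator revenue at the given price pair."""
    power, rate = demand(price_power, price_rate)
    return price_power * power + price_rate * rate

def qos_ok(price_power, price_rate, min_rate=2.0):
    """QoS constraint: the rate actually purchased must stay above a floor."""
    _, rate = demand(price_power, price_rate)
    return rate >= min_rate

# Grid search for the optimal (power price, rate price) combination.
grid = [i * 0.1 for i in range(1, 60)]
best = max(
    ((pp, pr) for pp, pr in itertools.product(grid, grid) if qos_ok(pp, pr)),
    key=lambda p: revenue(*p),
)
print(best, round(revenue(*best), 2))
```

In practice the dissertation derives the optimum analytically rather than by enumeration; the grid search only makes the trade-off concrete: raising prices increases margin per unit but shrinks demand, and the QoS floor caps how far the rate price can be pushed.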
Abstract:
[Excerpt] Today’s hospitality and tourism companies face complex, dramatically shifting challenges, most notably the need to compete for increasingly sophisticated customers in a global, fluid marketplace. To attract and retain the loyal cadre of customers that will ensure the organization’s success, service companies such as hospitality organizations must employ technologically advanced, yet margin-sensitive, product and pricing strategies and practices that will differentiate them in their intended market. Even more importantly, these service organizations need to devise strategies that will capture and retain the most important yet, from a financial perspective, unrecognized asset on the balance sheet: the employees who design and deliver the service to the customer base. Human resource strategists (e.g. Becker & Gerhart, 1996; Cappelli & Crocker-Hefter, 1996; O’Reilly & Pfeffer, 2000; Pfeffer, 1998; Ulrich, 1997), including those who take a hospitality perspective (e.g. Baumann, 2000; Hume, 2000; Worcester, 1999), advocate renewed attention to investment in employees or “human capital” as a source of strategic competitive advantage.
Abstract:
National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.
Abstract:
Transportation Systems Center, Cambridge, Mass.
Abstract:
Computer simulation was used to suggest potential selection strategies for beef cattle breeders with different mixes of clients between two potential markets. The traditional market paid on the basis of carcass weight (CWT), while a new market considered marbling grade in addition to CWT as a basis for payment. Both markets instituted discounts for CWT in excess of 340 kg and light carcasses below 300 kg. Herds were simulated for each price category on the carcass weight grid for the new market. This enabled the establishment of phenotypic relationships among the traits examined [CWT, percent intramuscular fat (IMF), carcass value in the traditional market, carcass value in the new market, and the expected proportion of progeny in elite price cells in the new market pricing grid]. The appropriateness of breeding goals was assessed on the basis of client satisfaction. Satisfaction was determined by the equitable distribution of available stock between markets combined with the assessment of the utility of the animal within the market to which it was assigned. The best goal for breeders with predominantly traditional clients was a CWT in excess of 330 kg, while that for breeders with predominantly new market clients was a CWT of between 310 and 329 kg and with a marbling grade of AAA in the Ontario carcass pricing system. For breeders who wished to satisfy both new and traditional clients, the optimal CWT was 310-329 kg and the optimal marbling grade was AA-AAA. This combination resulted in satisfaction levels of greater than 75% among clients, regardless of the distribution of the clients between the traditional and new marketplaces.
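The pricing grid described above can be sketched as a small function. The weight thresholds (discounts above 340 kg and below 300 kg) come from the abstract; the base price, the discount size and the marbling premiums are hypothetical illustration values, not the study's actual grid.

```python
def carcass_value(cwt_kg, marbling=None, base_price=4.0):
    """Value of a carcass under a grid with weight discounts.

    Carcasses over 340 kg and light carcasses under 300 kg are
    discounted, as in the abstract. The discount size, base price
    and marbling premiums below are hypothetical.
    """
    price = base_price
    if cwt_kg > 340 or cwt_kg < 300:
        price -= 0.40  # out-of-spec weight discount per kg
    # New market only: marbling grade adds a premium on top of weight pricing.
    premiums = {"A": 0.0, "AA": 0.15, "AAA": 0.30}
    if marbling is not None:
        price += premiums[marbling]
    return cwt_kg * price

# Traditional market pays on weight alone; the new market adds marbling.
traditional = carcass_value(320)
new_market = carcass_value(320, marbling="AAA")
```

A carcass in the 310-329 kg, AAA band avoids both weight discounts and earns the top marbling premium, which is why that combination satisfies clients of both markets in the simulation.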
Abstract:
Recent literature has proved that many classical pricing models (Black and Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to “pathological meaningless situations”, since traders can build sequences of portfolios whose risk level tends to −infinity and whose expected return tends to +infinity, i.e., (risk = −infinity, return = +infinity). Such a sequence of strategies may be called a “good deal”. This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions the explicit expression of a good deal is given, and its sensitivity with respect to some possible measurement errors is provided too. We point out that a critical property is the absence of short sales. In such a case we first construct a “shadow riskless asset” (SRA) without short sales, and then the good deal is given by borrowing more and more money so as to invest in the SRA. It is also shown that the SRA is of interest in itself, even if there are short-selling restrictions.
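The leverage mechanism behind the good deal can be illustrated numerically: if some asset's worst-case return still beats the borrowing rate, then borrowing ever more to invest in it drives expected return toward +infinity while VaR falls toward −infinity. The one-period outcomes and rates below are hypothetical stand-ins for the paper's shadow riskless asset, chosen only to show the leverage effect.

```python
r = 0.01  # riskless borrowing rate
# Hypothetical stand-in for the SRA: every outcome exceeds the borrowing rate.
outcomes = [0.03, 0.05, 0.08]

def leveraged_position(l):
    """Borrow (l - 1) at rate r and invest l in the asset (initial wealth 1)."""
    returns = [l * x - (l - 1) * r for x in outcomes]
    expected = sum(returns) / len(returns)
    var = -min(returns)  # with a finite scenario set, VaR is the worst loss
    return expected, var

for leverage in (1, 10, 100, 1000):
    exp_ret, var = leveraged_position(leverage)
    print(leverage, round(exp_ret, 3), round(var, 3))
```

As leverage grows, the expected return grows without bound while VaR becomes ever more negative (a guaranteed gain), reproducing the (risk = −infinity, return = +infinity) pathology the paper analyzes.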
Abstract:
This paper presents a Multi-Agent Market simulator designed for developing new agent market strategies based on a complete understanding of buyer and seller behaviors, preference models and pricing algorithms, considering user risk preferences and game theory for scenario analysis. This tool studies negotiations based on different market mechanisms and on time- and behavior-dependent strategies. The results of the negotiations between agents are analyzed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies. The system also includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions, and of considering other agents' reactions.
Abstract:
This paper reviews the literature on reference pricing (RP) in pharmaceutical markets. The RP strategy for containment of expenditure on drugs is analyzed as part of the procurement mechanism. We review the existing literature and the state of the art regarding RP by focusing on its economic effects. In particular, we consider: (1) the institutional context and problem-related factors which appear to underlie the need to implement an RP strategy, i.e., its nature, characteristics and the sort of health care problems commonly addressed; (2) how RP operates in practice, that is, how third-party payers (the insurers/buyers) have established the RP systems existing on the international scene (i.e., information methods, monitoring procedures and legislative provisions); (3) the range of effects resulting from particular RP strategies (including effects on the choice of appropriate pharmaceuticals, insurer savings, total drug expenditures, prices of referenced and non-referenced products, and dynamic efficiency); (4) the market failures which an RP policy is supposed to address and the main advantages and drawbacks which emerge from an analysis of its effects. Results suggest that RP systems better achieve their postulated goals (1) if cost inflation in pharmaceuticals is due to high prices rather than to excessive prescription rates, (2) the larger the existing difference in prices among equivalent drugs, and (3) the more important the actual market for generics.
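The core RP mechanism, how it operates in practice, can be sketched in a few lines: the insurer reimburses at most the reference price set for the drug's equivalence group, and the patient pays the full excess above it. The function name, the optional coinsurance rate and the example prices are illustrative assumptions.

```python
def copayment(price, reference_price, coinsurance=0.0):
    """Patient's out-of-pocket cost under reference pricing.

    The insurer covers at most the reference price of the drug's
    equivalence group; the patient pays the excess above it, plus
    any ordinary coinsurance on the covered part (hypothetical rate).
    """
    covered = min(price, reference_price)
    excess = max(0.0, price - reference_price)
    return coinsurance * covered + excess

# A branded drug priced above its group's reference price...
brand = copayment(price=50.0, reference_price=30.0)
# ...versus a generic priced below the reference price.
generic = copayment(price=28.0, reference_price=30.0)
```

This is what creates the incentive effects the review discusses: demand shifts toward drugs priced at or below the reference price, pressuring manufacturers of referenced products to cut prices toward it.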
Abstract:
Executive Summary The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics. First in the context of a general asset pricing model, then across models and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broad scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measurement of financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify the risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model, based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model to address some long-lasting issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests as a special case the well-known time-state separable utility, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings.
The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics relative to those of realized returns from portfolio strategies optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which combined with a visual inspection allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate the ones that result from optimization only with respect to, for example, the Treynor ratio and Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, or the sequence of expected shortfalls for a range of quantiles.
Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio returns distribution that second-order stochastically dominates the ones obtained from virtually all individual performance measures considered. Chapter 3 proposes a measure of financial integration, based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
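The second-order dominance check described above can be sketched as follows, reading the absolute Lorenz curve as the thesis does: the sequence of expected shortfalls (mean of the worst q-fraction of returns) over a range of quantiles. The function names and the quantile grid are illustrative assumptions.

```python
def absolute_lorenz(returns, quantiles):
    """Expected-shortfall profile: for each quantile q, the mean of the
    worst q-fraction of realized returns (sorted ascending)."""
    srt = sorted(returns)
    n = len(srt)
    curve = []
    for q in quantiles:
        k = max(1, int(q * n))
        curve.append(sum(srt[:k]) / k)
    return curve

def second_order_dominates(a, b, quantiles=None):
    """a dominates b if a's expected-shortfall curve lies pointwise above b's."""
    qs = quantiles or [i / 20 for i in range(1, 21)]
    return all(x >= y for x, y in zip(absolute_lorenz(a, qs),
                                      absolute_lorenz(b, qs)))
```

With two realized-return series in hand, `second_order_dominates(aggregated, single)` performs the pointwise comparison; a `True` result corresponds to the aggregated measure's curve lying above the individual measure's curve at every quantile checked.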
Abstract:
Generic or own-brand products were initially only less expensive copies of the branded-label alternative, but nowadays pricing alone is not enough to survive in the Fast Moving Consumer Goods (FMCG) or Consumer Packaged Goods (CPG) markets. With this in mind, manufacturers of generic brands have adapted to this rapidly growing niche by investing in design and marketing during the initial phase in order to be perceived as having a quality product comparable to that of the branded products. In addition, they have gone further with a second phase and resorted to innovative product differentiation strategies and even pure innovation in many cases. These strategies have granted generic brands constantly increasing market shares and a position of equals relative to national brands. Using previous analyses and case studies, this paper will provide conceptual and empirical evidence to explain the surprisingly fast growth and penetration of generic supermarket brands, which in their relatively short lifespan have grown to rival the historical market leaders, the branded products. According to this analysis, the main conclusion is that the growth in generic brands can be explained not only by price competition, but also by the use of innovative product differentiation strategies.
Abstract:
The main objective of this master's thesis was to improve the features of the developed cost-based transfer pricing tool for use in the department-level cost estimation process. The work was made more difficult by the recent poor response rate to price inquiries. The main problem of the work was to collect reliable cost data from the production control system out of partly outdated machining and material data for standard valves. The main research methods used in the study can be divided into a literature review of transfer pricing and cost estimation processes, a field analysis, and the development of the existing Microsoft Excel transfer pricing tool at the interface between the different departments. Transfer pricing methods are commonly divided into cost-, market- and negotiation-based models, which rarely as such meet the objectives set for transfer pricing. This can lead to situations in which two separate methods merge into one. In addition, the actual transfer pricing system is usually affected by several internal and external factors. The final transfer pricing method should also firmly support the company's vision and the other strategies set for the business. The result of the work was an extended Microsoft Excel application, which requires both yearly and monthly updating of the price and delivery time data for special valve materials. This approach clearly improves the cost estimation process, because subcontractor data must also be examined systematically. After this, the whole transfer pricing process can be developed further by converting the cost structure of the assembly and testing work phases to an activity-based costing model.
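The cost-based pricing model underlying such a tool reduces to a cost-plus calculation over the department cost components. The component breakdown, the markup rate and the example figures below are hypothetical illustrations, not the thesis's actual cost structure.

```python
def cost_based_transfer_price(material, machining, assembly, testing,
                              markup=0.10):
    """Cost-plus transfer price: sum of the department cost components
    (material, machining, assembly, testing) plus a markup.
    All rates and figures here are hypothetical."""
    return (material + machining + assembly + testing) * (1 + markup)

# Example with hypothetical per-valve costs:
price = cost_based_transfer_price(material=100.0, machining=50.0,
                                  assembly=30.0, testing=20.0)
```

Converting the assembly and testing components to activity-based costing, as the thesis proposes, would replace those two lump inputs with sums of activity-driver quantities times activity rates, leaving the cost-plus structure itself unchanged.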
Abstract:
The aim of this thesis is to examine whether pricing anomalies exist in the Finnish stock markets by comparing the performance of quantile portfolios that are formed on the basis of either individual valuation ratios, composite value measures or combined value and momentum indicators. All the research papers included in the thesis show evidence of value anomalies in the Finnish stock markets. In the first paper, the sample of stocks over the 1991-2006 period is divided into quintile portfolios based on four individual valuation ratios (i.e., E/P, EBITDA/EV, B/P, and S/P) and three hybrids of them (i.e. composite value measures). The results show the superiority of composite value measures as a selection criterion for value stocks, particularly when EBITDA/EV is employed as the earnings multiple. The main focus of the second paper is on the impact of the holding period length on the performance of value strategies. As an extension to the first paper, two more individual ratios (i.e. CF/P and D/P) are included in the comparative analysis. The sample of stocks over the 1993-2008 period is divided into tercile portfolios based on six individual valuation ratios and three hybrids of them. The use of either the dividend yield criterion or one of the three composite value measures examined results in the best value portfolio performance according to all performance metrics used. Parallel to the findings of many international studies, our results from performance comparisons indicate that for the sample data employed, the yearly reformation of portfolios is not necessarily optimal for gaining maximally from the value premium. Instead, the value investor may extend his holding period up to 5 years without any decrease in long-term portfolio performance. The same holds also for the results of the third paper, which examines the applicability of the data envelopment analysis (DEA) method in discriminating undervalued stocks from overvalued ones.
The fourth paper examines the added value of combining price momentum with various value strategies. Taking account of price momentum improves the performance of value portfolios in most cases. The performance improvement is greatest for value portfolios that are formed on the basis of the 3-composite value measure which consists of the D/P, B/P and EBITDA/EV ratios. The risk-adjusted performance can be enhanced further by following a 130/30 long-short strategy in which the long position in value winner stocks is leveraged by 30 percent while simultaneously selling short glamour loser stocks by the same amount. The average return of the long-short position proved to be more than double the stock market average, coupled with a decrease in volatility. The fifth paper offers a new approach to combining value and momentum indicators into a single portfolio-formation criterion using different variants of DEA models. The results throughout the 1994-2010 sample period show that the top-tercile portfolios outperform both the market portfolio and the corresponding bottom-tercile portfolios. In addition, the middle-tercile portfolios also outperform the comparable bottom-tercile portfolios when DEA models are used as a basis for stock classification criteria. To my knowledge, such strong performance differences have not been reported in earlier peer-reviewed studies that have employed the comparable quantile approach of dividing stocks into portfolios. Consistently with the previous literature, the division of the full sample period into bullish and bearish periods reveals that the top-quantile DEA portfolios lose far less of their value during bearish conditions than do the corresponding bottom portfolios. The sixth paper extends the sample period employed in the fourth paper by one year (i.e. 1993-2009), covering also the first years of the recent financial crisis. It contributes to the fourth paper by examining the impact of stock market conditions on the main results.
Consistently with the fifth paper, value portfolios lose much less of their value during bearish conditions than do stocks on average. The inclusion of a momentum criterion somewhat adds value for an investor during bullish conditions, but this added value turns negative during bearish conditions. During bear market periods some of the value loser portfolios perform even better than their value winner counterparts. Furthermore, the results show that the recent financial crisis has reduced the added value of using combinations of momentum and value indicators as portfolio formation criteria. However, since the stock markets have historically been bullish more often than bearish, the combination of the value and momentum criteria has paid off for the investor despite the fact that its added value during bearish periods is, on average, negative.
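The portfolio-formation step used throughout the thesis can be sketched as follows: score each stock on a composite value measure (here, the mean of its per-ratio ranks across E/P, B/P and EBITDA/EV, one of the hybrids described above) and split the ranking into terciles. The stock data and ratio values are hypothetical.

```python
def composite_value_terciles(stocks):
    """Rank stocks on a composite value measure (mean of per-ratio ranks of
    E/P, B/P and EBITDA/EV; higher ratio = cheaper = better rank) and split
    the ranking into value / middle / glamour terciles."""
    ratios = ["ep", "bp", "ebitda_ev"]
    names = list(stocks)
    comp = {n: 0.0 for n in names}
    for r in ratios:
        ordered = sorted(names, key=lambda n: stocks[n][r], reverse=True)
        for rank, n in enumerate(ordered):
            comp[n] += rank / len(ratios)
    ordered = sorted(names, key=lambda n: comp[n])
    k = len(names) // 3
    return ordered[:k], ordered[k:2 * k], ordered[2 * k:]

stocks = {  # hypothetical valuation ratios
    "A": {"ep": 0.10, "bp": 1.2, "ebitda_ev": 0.15},
    "B": {"ep": 0.04, "bp": 0.5, "ebitda_ev": 0.06},
    "C": {"ep": 0.08, "bp": 0.9, "ebitda_ev": 0.12},
    "D": {"ep": 0.02, "bp": 0.3, "ebitda_ev": 0.04},
    "E": {"ep": 0.06, "bp": 0.7, "ebitda_ev": 0.09},
    "F": {"ep": 0.12, "bp": 1.5, "ebitda_ev": 0.18},
}
value, middle, glamour = composite_value_terciles(stocks)
```

Rank averaging is one common way to build such composites; the value strategies in the thesis then hold the top tercile (and, in the 130/30 variant, short the glamour tercile) over the chosen holding period.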
Abstract:
In this paper we introduce a financial market model based on continuous-time random motions with alternating constant velocities and with jumps occurring when the velocity switches. If the jump directions are in a certain correspondence with the velocity directions of the underlying random motion with respect to the interest rate, the model is free of arbitrage. The replicating strategies for options are constructed in detail. Closed-form formulas for the option prices are obtained.
Abstract:
This paper develops a financial market model based on continuous-time random motions with alternating constant velocities and with jumps occurring when the velocity switches. If the jump directions are in a certain correspondence with the velocity directions of the underlying random motion with respect to the interest rate, the model is arbitrage-free and complete. The replicating strategies for options are constructed in detail, and a closed-form expression for option prices is obtained. Quantile hedging strategies for options are also constructed. This methodology is applied to risk control and to the pricing of insurance instruments.
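The driving process of this model can be simulated directly: the log-price driver moves at one of two constant velocities, switches regime at exponentially distributed times, and takes a fixed jump at each switch. The switching rates, velocities and jump sizes below are hypothetical illustration values; the no-arbitrage condition in the abstract requires the jump in each regime to oppose that regime's velocity direction relative to the interest rate, as in this parameterization.

```python
import random

def jump_telegraph_path(t_end, rates=(2.0, 1.0), velocities=(1.0, -0.5),
                        jumps=(-0.1, 0.2), x0=0.0, seed=1):
    """Terminal value of a jump-telegraph motion: drift at the current
    regime's constant velocity, with a fixed jump applied at each
    exponentially timed velocity switch. (Parameters are hypothetical.)"""
    rng = random.Random(seed)
    t, x, state = 0.0, x0, 0
    while True:
        wait = rng.expovariate(rates[state])
        if t + wait >= t_end:
            # Horizon reached before the next switch: drift only, no jump.
            return x + velocities[state] * (t_end - t)
        x += velocities[state] * wait + jumps[state]
        t += wait
        state = 1 - state  # alternate regime
```

Note how the upward-velocity regime carries a downward jump and vice versa; it is exactly this opposition of jump and velocity directions that rules out arbitrage in the model.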