939 results for Return-based pricing kernel
Abstract:
This thesis studies the development of a service offering model that creates added value for customers in the field of logistics services. The study focuses on the classification of offerings and the structure of the model. The purpose of the model is to provide value-added solutions for customers and enable a superior service experience. The aim of the thesis is to define what customers expect from a logistics solution provider and what value customers appreciate so highly that they would invest in value-added services. Value propositions, the cost structures of offerings, and appropriate pricing methods are studied. First, a literature review on creating a solution business model and on customer value is conducted. Customer value is identified through customer interviews, and qualitative empirical data is used. To exploit expert knowledge of logistics, an innovation workshop tool is utilized. Customers and experts are involved in the design process of the model. As a result of the thesis, a three-level value-added service offering model is created based on empirical and theoretical data. Offerings with value propositions are proposed, and the level of the model reflects the depth of the customer-provider relationship and the amount of added value. Performance efficiency improvements and cost savings create the most added value for customers. Value-based pricing methods, such as performance-based models, are suggested. The results indicate interest in benefiting from networks and partnerships in the field of logistics services. Further investigation of network development is proposed.
Abstract:
Electricity distribution fees cover the operations of the local electricity distribution company, which operates as a natural regional monopoly. Because distribution network companies hold a monopoly position, their operations are supervised by the Energy Authority (Energiavirasto), which mainly monitors the companies' revenue and operational efficiency. In Finland, the operations of network companies are governed by the Electricity Market Act. The cost structure of network companies is closely tied to the costs of the network itself. Because network costs are, owing to long service lives and large initial investments, essentially fixed and substantial from year to year, it would be important for a network company that its annual revenue not fluctuate greatly. Under the current pricing model, however, network companies' revenue remains highly dependent on how cold the winter is. The pricing model currently in common use in Finland for distribution pricing also fails to account for the fact that a network company's costs depend more on the power transferred in the network than on the energy. Significant changes in distribution pricing have taken place and are under way. With the introduction of remotely read electricity meters, considerably more information on customers' consumption is available, which helps in determining cost-reflective prices. At the same time, customers' electricity use may change considerably, especially as electric vehicles and heat pumps become more common. The purpose of this work is to create for Mäntsälän Sähkö Oy a calculation application for determining cost-reflective electricity distribution prices, and to examine the options for implementing distribution pricing in the future. Short-term planning focuses on determining cost-reflective prices using the average-price principle, while longer-term planning focuses on the effects of implementing pricing based on customers' peak power demand, from the perspective of both the network company and the customer.
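The energy-versus-power cost argument above can be sketched with a toy calculation. All figures below (rates, consumption, peaks) are assumed for illustration and are not Mäntsälän Sähkö Oy's actual tariffs:

```python
# Toy comparison (all numbers assumed, not Mäntsälän Sähkö Oy's tariffs)
# of an energy-based versus a peak-power-based distribution charge for
# two customers with the same annual energy but very different peaks.
energy_kwh = {"steady": 5000.0, "peaky": 5000.0}   # annual energy
peak_kw = {"steady": 1.5, "peaky": 10.0}           # annual peak demand

energy_rate = 0.05   # EUR/kWh   (assumed)
power_rate = 40.0    # EUR/kW/yr (assumed)

for name in energy_kwh:
    energy_bill = energy_rate * energy_kwh[name]
    power_bill = power_rate * peak_kw[name]
    print(f"{name}: energy tariff {energy_bill:.2f} EUR, "
          f"power tariff {power_bill:.2f} EUR")
# Under the energy tariff both customers pay the same, even though the
# peaky customer drives network dimensioning; the power-based tariff is
# the more cost-reflective of the two.
```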
Abstract:
The topic of this bachelor's thesis was value-based pricing in industrial markets. The aim was to identify, from the literature, the different forms of value-based pricing and the main factors preventing its adoption. The research method was a literature review of earlier studies on the topic. The work was carried out by examining numerous articles, on the basis of which the thesis itself was written. The thesis defines the three most common pricing models: cost-based, competition-based, and value-based pricing. Value-based pricing was further divided into four subcategories: performance-based and license-based pricing, and skimming and penetration pricing. Five problems related to implementing value-based pricing were identified: determining the value of the product, communicating that value, market segmentation, sales force management, and top management support. Solutions to these problems were found in several different sources. The thesis concludes that value-based pricing, despite its challenges, is, when implemented correctly, the most profitable of the main pricing models in use today. Thus, a company that wants to price effectively should strive toward a value-based pricing model and make its prices stick.
Abstract:
The bachelor's thesis "Choosing a pricing strategy in the steel industry – Case Teräsyhtiö Oy" examines the choice of pricing strategy at a carbon steel company manufacturing industrial goods, by mirroring its actual pricing against the theoretical principles of pricing. The objective of the study was to determine how well the target company's pricing follows theoretical pricing principles. A further aim was to examine how pricing is carried out in the target company and which factors influence this strategic decision. The theoretical part of the thesis consists of the theory of price and the pricing process, together with the thesis's actual framework, which comprises the traditional models of cost-based and market-based pricing. In the thesis, market-based models refer to demand- and competition-driven models. The research data was collected through thematic interviews with three people involved in pricing at the case company. The study was carried out as qualitative research, using theory-driven content analysis. In the results, the division between the domestic market and areas outside it played an important role: these areas largely determined whether pricing had to be carried out by following prices or by setting them. With respect to regional price levels in the industry, the open price lists of steel companies were significant, strongly steering pricing. Within the pricing process, the most important goals were ensuring profitability and practicing consistent pricing. Of the external factors steering market-based pricing, the most important was compliance with the Finnish Competition Act (948/2011). The customer's role in pricing was also very significant. The study showed that the target company's pricing emphasizes market-based methods, while still accounting for costs through the margin. The study also showed that customer-perceived value is not taken into account in the groundwork of pricing to the extent that might be necessary. The conclusions of the study emphasize how taking customer value into account could enable higher profitability for the company.
Abstract:
Digitalization and technology megatrends such as Cloud services have provided SMEs with a suitable atmosphere and conditions to internationalize and seek further business growth. There is limited research on Cloud services from the business perspective and on the limitations and challenges SMEs encounter when pursuing international business growth. Thus, the main research question of this study was how Cloud services may enable Finnish SMEs to overcome international growth challenges. The research question was further divided into three sub-questions dealing with the features and characteristics of Cloud services, the limitations and challenges Finnish SMEs experience when pursuing international growth, and the benefits and advantages of utilizing Cloud services to mitigate and suppress international growth challenges. First, the theoretical framework of this study was constructed based on the existing literature on Cloud services, SMEs, and international growth challenges. After this, a qualitative research approach and methodology were applied. The data was collected through six semi-structured expert interviews conducted in person with representatives of IBM, Exidio, Big Data Solutions, and Comptel. After analyzing the collected data using thematic analysis, the results were compared with the existing theory, and the original framework was modified and complemented accordingly. Resource scarcity, customer base expansion and retention, and a lack of courage to try new things and take risks turned out to be major international growth challenges of Finnish SMEs.
Given the many benefits and advantages of utilizing Cloud services, including service automation, a consumption-based pricing model, the absence of capital expenditures (capex) and large upfront investments, a lightened organization structure, cost savings, speed, accessibility, scalability, agility, geographical expansion potential, global reach and coverage, credibility, partners, enhanced CRM, freedom, and flexibility, it can be concluded that Cloud services can directly and indirectly help Finnish SMEs mitigate and overcome international growth challenges and enable further business growth.
Abstract:
Toward the end of the 19th century, the Hindu monk and reformer Swami Vivekananda asserted that modern science was converging toward Advaita Vedanta, an important philosophical and religious current of Hinduism. Over the following decades, in the wake of the revolutionary scientific contributions of Einstein's theory of relativity and quantum physics, a growing number of authors maintained that important "parallels" could be drawn between Advaita Vedanta and modern physics. Such comparisons are still made today, particularly in relation to quantum physics. This thesis critically examines these comparisons through a detailed comparative study of two concepts: the concept of akasa in Advaita Vedanta and that of the vacuum in quantum physics. The claim examined is that these two concepts point toward the same reality: an omnipresent and subtle substratum from which the various constituents of the universe emerge and to which they ultimately return. On the basis of this comparative study, the thesis argues that comparisons of a conceptual nature rarely foster a genuine dialogue between Advaita Vedanta and modern physics. Another approach would be to consider the epistemological limits that these disciplines respectively encounter in their approach to the "real-in-itself" or "ultimate reality." Particular attention is paid to epistemology and the problem of the nature of reality in Advaita Vedanta, as well as to scientific realism and the philosophical implications of non-separability in quantum physics.
Abstract:
The author grounds his argument in the decisive economic consequences of digitization for the evolution of music copyright. Since digitized music is a public good, its negotiated prices tend toward zero, and only legal constraints, such as copyright or price-fixing agreements generally prohibited by competition law, can save the intrepid entrant or the stranded incumbent. While copyright owners maximize their profits by advocating the extension of copyright's scope and pursuing its enforcement in the courts, its social value is measured in terms of its effectiveness in promoting innovation. The music industry has projected the scope of copyright so grossly beyond the limits of reason with respect to digitized music that its legal position will be attacked relentlessly on all fronts, whether through the trivialization of infringement, through resistance in the courts, or through campaigns for legislative reform.
Abstract:
This research aimed to examine the process of allocating indirect costs in three food-industry companies in the state of Paraíba, in the city of Campina Grande, comparing these companies' allocation methods with the relevant literature (Chapter I). For a better understanding of the relationship between the theory and practice of indirect cost allocation, a literature review is presented, covering concepts and classifications of these costs as well as comments on the allocation process, in view of four basic purposes of cost accounting: inventory valuation, pricing, performance evaluation, and analysis for decision-making (Chapter II). Owing to the scarcity of empirical data in this area, the restriction of the research to three companies, and the need for a deeper description of current allocation practices, the "case study" method was chosen (Chapter III). There follows a presentation of the cases studied and a description of the allocation methods adopted by the companies (Chapter IV). The results obtained made possible an analysis of the allocation methods studied, in comparison with the literature reviewed (Chapter V). Finally, a summary is presented, reaching important conclusions and including recommendations and suggestions for further studies in this area of knowledge (Chapter VI).
Abstract:
The Forward Premium Puzzle (FPP) is the name given to the empirical observation of a negative relation between future changes in spot rates and the forward premium. Modeling this forward bias as a risk premium, and under weak assumptions on the behavior of the pricing kernel, we characterize the potential bias present in the regressions where the FPP is observed, and we identify the necessary and sufficient conditions the pricing kernel has to satisfy to account for the predictability of exchange rate movements. Next, we estimate the pricing kernel applying two methods: i) one, due to Araújo et al. (2005), that exploits the fact that the pricing kernel is a serial-correlation common feature of asset prices, and ii) a traditional principal component analysis used as a procedure to generate a statistical factor model. Then, using in-sample and out-of-sample exercises, we are able to show that the same kernel that explains the Equity Premium Puzzle (EPP) accounts for the FPP in all our data sets. This suggests that the quest for an economic model that generates a pricing kernel which solves the EPP may double its prize by simultaneously accounting for the FPP.
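The regression in which the FPP appears is the classic Fama regression of spot-rate changes on the forward premium. A minimal simulated sketch (not the thesis's estimation, with made-up parameters) looks like this:

```python
import numpy as np

# Illustrative simulated sketch of the Fama regression behind the FPP:
#     ds_{t+1} = a + b * (f_t - s_t) + e,
# where uncovered interest parity would imply b = 1, but under the FPP
# the estimated b is negative.  All parameter values below are made up.
rng = np.random.default_rng(0)
T = 500
premium = rng.normal(0.0, 0.01, T)        # forward premium f_t - s_t
# A risk premium that co-moves negatively with the forward premium
# produces the puzzle in the simulated spot-rate changes:
ds = -0.8 * premium + rng.normal(0.0, 0.02, T)

X = np.column_stack([np.ones(T), premium])
a_hat, b_hat = np.linalg.lstsq(X, ds, rcond=None)[0]
print(f"estimated slope b = {b_hat:.2f}")  # negative, far from the UIP value 1
```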
Abstract:
This thesis consists of three essays. The first essay analyzes the public information available on the risk of Brazilian banks' credit portfolios and is divided into two chapters. The first chapter analyzes the limitations of the public information disclosed by banks and by the Central Bank, compared with the managerial information available internally to banks. It concludes that there is room for greater transparency in disclosure, something that has been gradually happening in Brazil through new rules related to Pillar 3 of Basel II and to the disclosure of more detailed information by the Central Bank (Bacen), such as the "Top50" data. The second part of the first essay shows the discrepancy between the accounting non-performing loan (NPL) ratio and the probability of default (PD), and also discusses the relationship between provisions and expected loss. Using migration matrices and a simulation based on overlapping vintages of the credit portfolios of large banks, it concludes that the NPL ratio underestimates the PD and that the provisions set aside by banks are smaller than the expected loss of the National Financial System (SFN). The second essay relates risk management to price discrimination. A model was developed consisting of a Cournot duopoly in a retail credit market in which banks can practice third-degree price discrimination. In this model, potential borrowers can be of two types, low or high risk, with low-risk borrowers having more elastic demand. According to the model, if the cost of observing a customer's type is high, the banks' strategy will be not to discriminate (a pooling equilibrium). But if this cost is sufficiently low, it will be optimal for banks to charge different rates to each group. It is argued that the Basel II Accord acted as an exogenous shock that shifted the equilibrium toward greater discrimination.
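As a much-simplified illustration of the third-degree price discrimination trade-off described above, the sketch below uses a single seller with linear demands (the essay itself uses a Cournot duopoly) and made-up parameters; discriminating pays only when the extra profit exceeds the cost of observing the borrower's type:

```python
import numpy as np

# Simplified single-seller illustration (the essay's model is a Cournot
# duopoly) of third-degree price discrimination with linear demands
# q_i = a_i - b_i * p for a low-risk (elastic) and a high-risk
# (inelastic) group.  All parameter values are made up.
a = np.array([10.0, 10.0])
b = np.array([2.0, 0.5])          # low-risk demand is more elastic

# Optimal price per group under discrimination (zero marginal cost):
p_star = a / (2 * b)
profit_disc = np.sum(p_star * (a - b * p_star))

# Best single (pooled) price for the combined linear demand:
p_pool = a.sum() / (2 * b.sum())
profit_pool = p_pool * np.sum(a - b * p_pool)

c = 1.0                           # cost of observing the borrower's type
discriminate = profit_disc - c > profit_pool
print(profit_disc, profit_pool, discriminate)
```

With these numbers the discriminatory profit (62.5) exceeds the pooled profit (40.0) by more than the observation cost, so discrimination is the optimal strategy; raising `c` enough flips the decision back to the pooling outcome, mirroring the essay's comparative statics.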
The third essay is divided into two chapters. The first discusses the application of the concepts of subjective probability and Knightian uncertainty to VaR models, and the importance of assessing "model risk", which comprises estimation, specification, and identification risk. The essay proposes that the "four elements" methodology of operational risk (internal data, external data, business environment, and scenarios) be extended to the measurement of other risks (market risk and credit risk). The second part of this last essay deals with applying the scenario-analysis element to the measurement of conditional volatility on dates of relevant economic announcements, specifically on the days of Copom meetings.
Abstract:
We study cash allocation ability as a possible explanatory factor that allows equity fund managers to produce high levels of adjusted returns (returns not explained by the risk factors they are exposed to). To do so, we examine the non-indexed Brazilian equity fund industry during the period from January 2006 to February 2015, evaluating cash allocation ability by the level and effectiveness of cash deployment, using return-based and holdings-based approaches to explore a database of monthly invested assets and returns. We find that even though market timing is a rare skill in the industry, the flexibility to hold high levels of cash played a significant role in the results of outperforming managers.
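One way to make the idea of cash-timing skill concrete is a holdings-based covariance measure; the sketch below is a hypothetical construction on simulated data, not the paper's exact metric:

```python
import numpy as np

# Hypothetical holdings-based cash-timing measure on simulated data (not
# the paper's exact metric): a manager times the market well if the
# fund's cash weight is high just before poor market months, i.e. if
# cov(cash_t, r_market_{t+1}) < 0.
rng = np.random.default_rng(1)
months = 110                                   # ~ Jan 2006 to Feb 2015
market = rng.normal(0.01, 0.05, months)        # monthly market returns

# Fabricate a skilled manager by letting cash react to next month's
# return (deliberate look-ahead, purely to generate an example of skill):
next_month = np.append(market[1:], 0.0)
cash = np.clip(0.10 - 2.0 * next_month, 0.0, 0.4)

timing_score = np.cov(cash[:-1], market[1:])[0, 1]
print(f"timing covariance: {timing_score:+.5f}")  # negative => good timing
```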
Abstract:
The objective of this study is to analyze the durations of the fixed-income portfolios of pension funds, which are paradoxically short relative to the long-term objectives inherent to retirement saving, and the possible effects of the retention incentives present in employer-instituted group plans, such as sponsor contributions and vesting rules, on the lengthening of those portfolios. As a way of overcoming the difficulty of directly observing the duration of the portfolios of the funds analyzed, a duration index was proposed, based on the Returns-Based Style Analysis developed by Sharpe (1992), using the principal components of Anbima's Constant Duration Indices (IDkA) to assess the sensitivity of the funds' monthly returns to the real and nominal yield curves. The results show no evidence that funds receiving resources exclusively from instituted plans have longer duration than those receiving resources from individual and registered group plans. On the other hand, funds classified by Anbima as "Previdência Data Alvo" (target-date pension funds) stand out, with longer duration indices than the average of funds classified as "Previdência Renda Fixa" or "Previdência Balanceado", and a positive correlation between their duration indices and the fund's target year, which suggests that policies acting on the information set of agents, investors, and managers can change investment allocation. Information alone is enough to improve allocation.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The developmental processes and functions of an organism are controlled by its genes and the proteins derived from them. Identifying key genes and reconstructing gene networks can provide a model that helps us understand the regulatory mechanisms behind the initiation and progression of biological processes or functional abnormalities (e.g. diseases) in living organisms. In this dissertation, I developed statistical methods to identify the genes and transcription factors (TFs) involved in biological processes, constructed their regulatory networks, and evaluated several existing association methods to find robust methods for coexpression analyses. Two kinds of data sets were used for this work: genotype data and gene expression microarray data. On the basis of these data sets, the dissertation has two major parts, spanning six chapters. The first part deals with developing association methods for rare variants using genotype data (chapters 4 and 5). The second part deals with developing and/or evaluating statistical methods to identify genes and TFs involved in biological processes, and with constructing their regulatory networks using gene expression data (chapters 2, 3, and 6). For the first part, I developed two methods to find the groupwise association of rare variants with given diseases or traits. The first method is based on kernel machine learning and can be applied to both quantitative and qualitative traits. Simulation results showed that the proposed method has improved power over the existing weighted sum (WS) method in most settings. The second method uses multiple phenotypes to select a few top significant genes. It then finds the association of each gene with each phenotype while controlling for population stratification by adjusting the data for ancestry using principal components. This method was applied to the GAW 17 data and was able to find several disease risk genes.
For the second part, I worked on three problems. The first involved the evaluation of eight gene association methods. A comprehensive comparison of these methods, with further analysis, clearly demonstrates their distinct and common performance. For the second problem, an algorithm named the bottom-up graphical Gaussian model was developed to identify the TFs that regulate pathway genes and to reconstruct their hierarchical regulatory networks. This algorithm produced very significant results, and this is the first report of such hierarchical networks for these pathways. The third problem involved developing another algorithm, called the top-down graphical Gaussian model, that identifies the network governed by a specific TF. The network produced by the algorithm was shown to be highly accurate.
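The core step shared by both graphical Gaussian model algorithms can be sketched as reading direct edges off partial correlations, i.e. off the inverse covariance (precision) matrix. The example below uses simulated genes, not the dissertation's data; the bottom-up and top-down algorithms add hierarchy on top of this step:

```python
import numpy as np

# Minimal sketch of the idea behind a graphical Gaussian model (GGM):
# edges between genes correspond to nonzero partial correlations,
# obtained from the inverse of the covariance (precision) matrix.
# Gene names and data are simulated.
rng = np.random.default_rng(3)
n = 200
tf = rng.normal(size=n)                      # a transcription factor
g1 = 0.9 * tf + 0.1 * rng.normal(size=n)     # regulated by tf
g2 = 0.9 * g1 + 0.1 * rng.normal(size=n)     # regulated by g1, not tf directly
data = np.column_stack([tf, g1, g2])

prec = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(prec))
partial = -prec / np.outer(d, d)             # partial correlation matrix
np.fill_diagonal(partial, 1.0)

# The tf-g2 marginal correlation is high, but the partial correlation
# (conditioning on g1) is near zero: no direct edge tf -> g2.
print(np.round(partial, 2))
```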
Abstract:
This chapter provides a detailed discussion of the evidence on housing and mortgage lending discrimination, as well as the potential impacts of such discrimination on minority outcomes such as homeownership and neighborhood environment. The paper begins by discussing conceptual issues surrounding empirical analyses of discrimination, including explanations for why discrimination takes place, definitions of different forms of discrimination, and the appropriate interpretation of observed racial and ethnic differences in treatment or outcomes. Next, the paper reviews evidence on housing market discrimination, starting with evidence of segregation and price differences in the housing market, followed by direct evidence of discrimination by real estate agents in paired testing studies. Finally, mortgage market discrimination and barriers in access to mortgage credit are discussed. This discussion begins with an assessment of the role credit barriers play in explaining racial and ethnic differences in homeownership, and follows with discussions of analyses of underwriting and the price of credit based on administrative and private sector data sources, including analyses of the subprime market. The paper concludes that housing discrimination has declined, especially in the market for owner-occupied housing, and does not appear to play a large role in limiting the neighborhood choices of minority households or the concentration of minorities in central cities. On the other hand, the patterns of racial centralization and the lower homeownership rates of African-Americans appear to be related to each other, and lower minority homeownership rates are in part attributable to barriers in the market for mortgage credit. The paper presents considerable evidence of racial and ethnic differences in mortgage underwriting, as well as additional evidence suggesting these differences may be attributable to the differential provision of coaching, assistance, and support by loan officers.
So far, innovation in loan products, the shift toward risk-based pricing, and the growth of the subprime market have not mitigated the role credit barriers play in explaining racial and ethnic differences in homeownership. Further, the growth of the subprime lending industry appears to have segmented the mortgage market geographically, increasing the costs of relying on local or neighborhood sources of mortgage credit and affecting the integrity of many low-income minority neighborhoods through increased foreclosure rates.