939 results for Return-based pricing kernel
Abstract:
This project provides an in-depth competitive assessment of the Portuguese indoor location-based analytics market and elaborates an entry-pricing strategy for implementing the Business Intelligence Positioning System (BIPS) in Portuguese shopping-centre stores. The role of industry forces and of the company's organizational resource platform in sustaining its competitive advantage was explored. A customer value-based pricing approach was adopted to assess the value of BIPS to retailers and to maximize Sonae Sierra's profitability. The exploratory quantitative research found a market opportunity to address every store-area type with tailored proposals and to set higher-than-tested membership fees that allow a rapid ROI, concluding that conditions are propitious for Sierra to succeed with the BIPS in-store business model in Portugal.
Abstract:
PURPOSE: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map indoor radon concentration (IRC) in Switzerland, taking into account architectural factors, spatial relationships between the measurements, and geological information. METHODS: We looked at about 240,000 IRC measurements carried out in about 150,000 houses. As predictor variables we included building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature and lithology in the kernel estimation models. We developed predictive maps as well as a map of the local probability of exceeding 300 Bq/m³. Additionally, we developed a map of a confidence index in order to estimate the reliability of the probability map. RESULTS: Our models were able to explain 28% of the variation in the IRC data. All variables added information to the model. The model estimation yielded a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimate. Furthermore, we assessed the mapping characteristics of kernel estimation overall as well as by municipality. Overall, our model reproduces spatial IRC patterns obtained in earlier work. At the municipal level, we show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps: maps corresponding to detached houses with concrete foundations indicate systematically lower IRC than maps corresponding to farms with earth foundations. CONCLUSIONS: IRC mapping based on kernel estimation is a powerful tool for predicting and analyzing IRC at large scale as well as at the local level. This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions while accounting for geological information and spatial relations between IRC measurements.
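As a minimal sketch of the kind of estimator the abstract describes (kernel regression with one bandwidth per predictor variable), the following Nadaraya-Watson predictor with a product Gaussian kernel is illustrative only; the variable names, toy data and bandwidth values are assumptions, not the study's actual model or data.

```python
# Hedged sketch: Nadaraya-Watson kernel regression with a per-variable bandwidth,
# illustrating how each predictor's bandwidth controls its influence on the estimate.
import numpy as np

def nw_predict(X_train, y_train, X_query, bandwidths):
    """Predict y at X_query using a product Gaussian kernel.

    X_train: (n, d) predictors, y_train: (n,) responses,
    X_query: (m, d) query points, bandwidths: (d,) one bandwidth per variable.
    """
    h = np.asarray(bandwidths, dtype=float)
    preds = np.empty(len(X_query))
    for i, x in enumerate(X_query):
        u = (X_train - x) / h                      # scaled distance per variable
        w = np.exp(-0.5 * np.sum(u ** 2, axis=1))  # product Gaussian kernel weights
        preds[i] = np.sum(w * y_train) / np.sum(w)
    return preds

# Toy example: two predictors (e.g. altitude and year of construction) and a
# synthetic log-concentration response; all values are invented.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = 5.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500)
print(nw_predict(X, y, X[:5], bandwidths=[0.3, 0.5]))
```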
Abstract:
Background: Combining different sources of information to improve the available biological knowledge is a current challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: first, a suitable kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyze the integration of data from several sources of information using kernel PCA, from the point of view of dimensionality reduction. Moreover, we improve the interpretability of kernel PCA by adding to the plot a representation of the input variables belonging to each dataset. In particular, for each input variable or linear combination of input variables, we can represent the local direction of maximum growth, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables together give a better understanding of the biological knowledge.
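A hedged sketch of the two-step integration outlined above: one kernel per data set, then the kernel matrices combined (an unweighted average is used here, one common choice) and kernel PCA run on the result. The RBF/linear kernel choices, the normalisation, and the synthetic data are assumptions for illustration, not the paper's pipeline.

```python
# Illustrative kernel-PCA data integration: choose a kernel per data set,
# combine the kernel matrices, and project samples with kernel PCA.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(1)
n = 100
expr = rng.normal(size=(n, 200))   # expression-like data set (synthetic)
clin = rng.normal(size=(n, 10))    # clinical-like data set (synthetic)

K1 = rbf_kernel(expr, gamma=1.0 / expr.shape[1])  # kernel chosen for data set 1
K2 = linear_kernel(clin)                          # kernel chosen for data set 2
K1 /= np.trace(K1) / n                            # crude scale normalisation
K2 /= np.trace(K2) / n
K = 0.5 * (K1 + K2)                               # combined representation

scores = KernelPCA(n_components=2, kernel="precomputed").fit_transform(K)
print(scores[:3])                                 # sample coordinates for plotting
```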
Abstract:
The liberalization of the electricity market, which began in 1995, has significantly changed how electricity companies price their sales contracts. Before the market reform, electricity sales contracts could be priced on the basis of the company's own generation costs and the desired margin. Today, the electricity price quoted on the power exchange forms the basis for pricing all sales contracts. In addition to the market price of electricity, the pricing of sales contracts must take into account the risks that the specific characteristics of the electricity market pose to the seller. This thesis models the market-based pricing methods of Lappeenrannan Energia Oy for two types of electricity sales contracts. In addition, the significance of the most important risk components of market-based pricing, the area price difference and the profile premium, is examined for the pricing of market-based sales contracts. The use of the area price difference in contract pricing was studied by determining the risk premia of CfD derivatives for the Finnish price area. The significance of the profile premium in contract pricing was studied by observing how the profile premium changes with the price and consumption time series and with changes in the hedge price and the power level. The differences between forecast and realized profile premiums were studied by calculating them for seven major customers of Lappeenrannan Energia Oy. Furthermore, the effect of the prices of the forward products used to model the price time series needed in the profile premium calculation on the size of the calculated profile premium was examined. Finally, two alternative pricing methods for electricity contracts are presented and compared with each other. The thesis finds that the CfD derivatives used to hedge against the area price difference could have increased the profitability of market-based sales contracts over the past three years. Based on the sensitivity analyses performed, the differences between realized and forecast profile premiums are due to inaccuracies in the price and consumption time series used in the calculation. In an ex-post examination, the profile premiums used by Lappeenrannan Energia Oy turned out to be too high for all but one customer. In addition, the profile premiums calculated at the time of the offer can be considered to change in direct proportion to the volatilities of the forward products used to model the price time series. The alternative contract pricing methods presented can be considered to give quite similar results to Lappeenrannan Energia Oy's current pricing method. The results are, however, affected by the volatility of the year used for estimating the weighting coefficients and by the method used to calculate the profile premium.
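As an illustrative sketch (not the company's method) of the profile premium discussed above, one simple definition is the difference between the customer's load-weighted spot price and the flat, time-weighted average price over the delivery period. The hourly series below are synthetic; in practice, forward-curve-based price scenarios and metered consumption profiles would be used.

```python
# Toy profile premium: load-weighted spot price minus flat average price (EUR/MWh).
import numpy as np

rng = np.random.default_rng(2)
hours = 8760
spot = 40 + 10 * np.sin(np.arange(hours) * 2 * np.pi / 24) + rng.normal(0, 3, hours)  # EUR/MWh
load = 1.0 + 0.4 * np.sin(np.arange(hours) * 2 * np.pi / 24)                          # MWh/h

load_weighted_price = np.sum(spot * load) / np.sum(load)
flat_price = spot.mean()
profile_premium = load_weighted_price - flat_price  # premium added to the quoted price
print(round(profile_premium, 2))
```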
Abstract:
Little research has been conducted to guide the management of marketing variables, such as pricing, in a systems business context. Furthermore, given that international partnering has become a popular mode of operation for SMEs, the objective of the current thesis was to explore the little-researched topic of managing the pricing of integrated solutions in an export partnership. Specifically, the thesis synthesizes literature findings from the three areas of export pricing, systems business, and export partnerships. The empirical section of the study consists of a qualitative single-case study of a Finnish systems integrator that has recently launched its export operations. Primary data were collected by conducting four interviews with the case company's managers and by organizing one group interview session. The study findings indicate that a systems integrator's pricing strategy in an export partnership can be very multidimensional and dependent on the international pricing environment and partner characteristics, that an export partnership appears to have unique implications for a systems integrator's pricing process, and that customer value-based pricing strategies may be particularly suited to pricing integrated solutions.
Abstract:
Over the past decade, organizations worldwide have begun to widely adopt agile software development practices, which offer greater flexibility in the face of frequently changing business requirements, better cost-effectiveness through the minimization of waste, faster time-to-market, and closer collaboration between business and IT. At the same time, IT services continue to be increasingly outsourced to third parties, giving organizations the ability to focus on their core capabilities as well as to take advantage of better demand scalability, access to specialized skills, and cost benefits. An output-based pricing model, in which customers pay directly for the functionality delivered rather than for the effort spent, is quickly becoming a new trend in IT outsourcing: it transfers risk away from the customer while offering much stronger incentives for the supplier to optimize processes and improve efficiency, and consequently produces a true win-win outcome. Despite the widespread adoption of both agile practices and output-based outsourcing, there is little formal research available on how the two can be effectively combined in practice. Moreover, little practical guidance exists on how companies can measure the performance of agile projects delivered in an output-based outsourced environment. This research attempted to shed light on this issue by developing a practical project monitoring framework that organizations can readily apply to monitor the performance of agile projects in an output-based outsourcing context, thus taking advantage of the combined benefits of such an arrangement. Adapted from the action research approach, this research was divided into two cycles, each consisting of Identification, Analysis, Verification, and Conclusion phases. During Cycle 1, a list of six Key Performance Indicators (KPIs) was proposed and accepted by the professionals in the studied multinational organization; this list formed the core of the proposed framework and answered the first research sub-question of what needs to be measured. In Cycle 2, a more in-depth analysis was provided for each of the suggested KPIs, including the techniques for capturing, calculating, and evaluating the information provided by each KPI. In the course of Cycle 2, the second research sub-question was answered, clarifying how the data for each KPI needed to be measured, interpreted, and acted upon. Consequently, after two incremental research cycles, the primary research question was answered, describing the practical framework that may be used for monitoring the performance of agile IT projects delivered in an output-based outsourcing context. This framework was evaluated by professionals within the context of the studied organization and received positive feedback across all four evaluation criteria set forth in this research: low overhead of data collection, high value of the provided information, ease of understanding of the metric dashboard, and high generalizability of the proposed framework.
Abstract:
In this Master's thesis, agent-based modeling is used to analyze maintenance strategy related phenomena. The main research question answered was: what does the agent-based model built for this study tell us about how different maintenance strategy decisions affect the profitability of equipment owners and maintenance service providers? Thus, the main outcome of this study is an analysis of how profitability can be increased in an industrial maintenance context. To answer that question, a literature review of maintenance strategy, agent-based modeling, and maintenance modeling and optimization was first conducted. This review provided the basis for building the agent-based model, and the modeling followed a standard simulation modeling procedure. The research question was then answered using the simulation results from the agent-based model. Specifically, the results of the modeling and of this study are: (1) optimizing the point at which a machine is maintained increases profitability for the owner of the machine and, under certain conditions, also for the maintainer; (2) time-based pricing of maintenance services leads to a zero-sum game between the parties; (3) value-based pricing of maintenance services leads to a win-win game between the parties, if the owners of the machines share a substantial amount of their value with the maintainers; and (4) error in machine condition measurement is a critical parameter in optimizing the maintenance strategy, and there is real systemic value in more accurate machine condition measurement systems.
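A toy numerical illustration (not the thesis model) of the structural difference behind findings (2) and (3): under time-based pricing any fee increase is the owner's loss one-for-one, while under value-based pricing both parties' profits grow with the value created. All figures are invented.

```python
# Zero-sum vs value-sharing structure of maintenance contract pricing (toy numbers).
owner_value_created = 100.0   # extra profit the owner gains from improved uptime
maintainer_cost = 30.0        # maintainer's cost of providing the service

# Time-based pricing: fee negotiated on hours; raising it only shifts profit.
fee_time_based = 45.0
print("time-based  owner:", owner_value_created - fee_time_based,
      "maintainer:", fee_time_based - maintainer_cost)

# Value-based pricing: the owner shares a fraction of the value created,
# so both profits increase when more value is created.
value_share = 0.4
fee_value_based = value_share * owner_value_created
print("value-based owner:", owner_value_created - fee_value_based,
      "maintainer:", fee_value_based - maintainer_cost)
```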
Development of a service model for analyzing a distribution network company's reasonable return and distribution pricing
Abstract:
With the new Electricity Market Act, electricity distribution system operators became obliged to improve the security of supply of their networks to a level at which a network fault caused by a storm or snow load does not cause an interruption of more than 6 hours for customers in local-plan (urban) areas, or more than 36 hours in other areas. The act, which entered into force on 1 September 2013, sets a time frame of 15 years for improving security of supply. Network companies may decide on their network renovation strategy independently, but meeting the security-of-supply requirements will require many companies to undertake large-scale cabling of the network and to increase investment volumes. Network investments also have an effect in the economic regulation of network companies, which is the responsibility of the Energy Authority. The regulation covers the reasonableness of distribution pricing, operational efficiency, and the quality of electricity. The purpose of the service model developed for ElMil Oy is therefore to model the effect of investments from the perspective of the regulatory model for the electricity network business. In the service model, an optimal investment programme is constructed around predefined investment targets on the basis of their profitability. Based on the programme, the effect of the investments on the reasonable return obtainable from the network can be estimated, and the potential for raising distribution tariffs can be modelled on this basis. When forming the optimal investment programme, the calculation tool developed in this work can be used to vary different scenarios and to weight annual investment volumes in different ways. The calculation also tracks the sufficiency of internal financing, so optimizing the investment programme makes it possible to find the solution requiring the least additional financing and thereby to minimize the growth in debt caused by the security-of-supply investments. As part of the service model, the security-of-supply development plan required for regulatory reporting is updated.
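A minimal sketch (assumed, not ElMil Oy's tool) of the kind of scheduling step described above: candidate projects are ranked by profitability and placed year by year within an internal-financing budget, with any remaining cost indicating the need for external financing. Project names, costs and scores are illustrative.

```python
# Greedy year-by-year investment programme under an annual internal-financing budget.
candidates = [  # (name, cost in kEUR, profitability score) -- illustrative figures
    ("remote switches", 150, 2.1),
    ("feeder A cabling", 400, 1.8),
    ("feeder B cabling", 650, 1.5),
    ("substation rebuild", 900, 1.2),
]
annual_internal_financing = 700          # kEUR available per year from operations
years = 3

plan = []
queue = sorted(candidates, key=lambda c: c[2], reverse=True)  # most profitable first
for year in range(1, years + 1):
    budget = annual_internal_financing
    for project in list(queue):
        name, cost, _ = project
        if cost <= budget:               # fits within this year's internal financing
            budget -= cost
            plan.append((year, name))
            queue.remove(project)

unscheduled = sum(cost for _, cost, _ in queue)  # would require external financing
print(plan, "unscheduled kEUR:", unscheduled)
```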
Abstract:
The main objective of this study was to examine the pricing of customized industrial products in international markets, and to understand what pricing decision making consists of. Another purpose of the study was to identify the main factors that affect the pricing decisions of industrial companies, as well as the different pricing strategies industrial companies may choose when pricing customized products. The research was conducted as a qualitative single case study, and a Finnish industrial company specializing in indoor environment solutions, Halton Marine Oy, was used as the case company in the study. The primary data was collected through semi-structured theme interviews with the key management personnel of the company, and the results were discussed and analyzed in the light of the existing literature. The results of this study indicate that the pricing of customized industrial products consists of several dimensions, and is influenced by a large variety of factors that are both internal and external to the firm. In addition, it was found that the choice of a pricing strategy is largely dependent on the chosen segment, the product category, and the stage in the product life cycle. The results also suggest that customizing companies should consider using the value-based pricing orientation, since customization is closely linked to customer value.
Abstract:
Latent variable models in finance originate both from asset pricing theory and from time series analysis. These two strands of literature appeal to two different concepts of latent structure, both of which are useful for reducing the dimension of a statistical model specified for a multivariate time series of asset prices. In CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns given a small number of state variables. In this paper, we use the concept of the Stochastic Discount Factor (SDF), or pricing kernel, as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of state variables which summarize their dynamics. In beta pricing models, it is often said that only factor risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, conditional independence between the contemporaneous returns of a large number of assets given a small number of factors, as in standard factor analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates and other state variables. We address the general issue of econometric specification of dynamic asset pricing models, which covers the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals. We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role with respect to the validity of standard CAPM-like stock pricing and preference-free option pricing.
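A hedged sketch of the standard relations the abstract builds on (notation assumed): the SDF prices all returns, it is spanned by the factors with state-dependent coefficients, and this delivers a conditional beta pricing relation.

```latex
\[
  E_t\!\left[m_{t+1} R_{i,t+1}\right] = 1, \qquad
  m_{t+1} = a(Y_t) + b(Y_t)^{\prime} f_{t+1},
\]
\[
  E_t\!\left[R_{i,t+1}\right] - r_{f,t} = \beta_{i,t}^{\prime}\,\lambda_t, \qquad
  \beta_{i,t} = \mathrm{Var}_t(f_{t+1})^{-1}\,
                \mathrm{Cov}_t\!\left(f_{t+1}, R_{i,t+1}\right),
\]
```

where $Y_t$ collects the state variables and $\lambda_t$ the factor risk premia.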
Abstract:
We study the asset pricing implications of an endowment economy when agents can default on contracts that would otherwise leave them worse off. We specialize and extend the environment studied by Kocherlakota (1995) and Kehoe and Levine (1993) to make it comparable to standard studies of asset pricing. We completely characterize efficient allocations for several special cases. We introduce a competitive equilibrium with complete markets and with endogenous solvency constraints. These solvency constraints are such as to prevent default, at the cost of reduced risk sharing. We show a version of the classical welfare theorems for this equilibrium definition. We characterize the pricing kernel and compare it with the one for economies without participation constraints: interest rates are lower and risk premia can be bigger, depending on the covariance of the idiosyncratic and aggregate shocks. Quantitative examples show that for reasonable parameter values the relevant marginal rates of substitution fall within the Hansen-Jagannathan bounds.
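For reference, a sketch of the Hansen-Jagannathan bound invoked in the last sentence (standard form, notation assumed): any admissible pricing kernel $m$ must be volatile enough, relative to its mean, to price the observed excess returns.

```latex
\[
  \frac{\sigma(m)}{E[m]} \;\ge\; \frac{\left|E[R^{e}]\right|}{\sigma(R^{e})},
\]
```

so the model-implied marginal rates of substitution are checked against the region spanned by the excess returns $R^{e}$.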
Abstract:
We study an intertemporal asset pricing model in which a representative consumer maximizes expected utility derived from both the ratio of his consumption to some reference level and this level itself. If the reference consumption level is assumed to be determined by past consumption levels, the model generalizes the usual habit formation specifications. When the reference level growth rate is made dependent on the market portfolio return and on past consumption growth, the model mixes a consumption CAPM with habit formation together with the CAPM. It therefore provides, in an expected utility framework, a generalization of the non-expected recursive utility model of Epstein and Zin (1989). When we estimate this specification with aggregate per capita consumption, we obtain economically plausible values of the preference parameters, in contrast with the habit formation or the Epstein-Zin cases taken separately. All tests performed with various preference specifications confirm that the reference level enters significantly in the pricing kernel.
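An illustrative sketch only: one concrete way to write a pricing kernel in which both the consumption-to-reference ratio $C_t/X_t$ and the reference level $X_t$ itself enter utility. The power specification below is an assumption for exposition, not necessarily the paper's estimated form.

```latex
\[
  u(C_t, X_t) = \frac{(C_t/X_t)^{1-\gamma}}{1-\gamma}\, X_t^{1-\varphi},
  \qquad
  m_{t+1} = \beta\,\frac{u_C(C_{t+1},X_{t+1})}{u_C(C_t,X_t)}
          = \beta \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma}
            \left(\frac{X_{t+1}}{X_t}\right)^{\gamma-\varphi},
\]
```

with the reference level $X_t$ treated as external; letting its growth depend on past consumption growth and the market portfolio return yields the mixed consumption-CAPM/CAPM interpretation described above.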
Abstract:
In da Costa et al. (2006) we have shown how the same pricing kernel can account for the excess returns of the S&P 500 over the US short-term bond and of uncovered over covered trading of foreign government bonds. In this paper we estimate and test the overidentifying restrictions of the Euler equations associated with six different versions of the Consumption Capital Asset Pricing Model. Our main finding is that the same (however often unreasonable) parameter values are estimated for all models in both markets. In most cases, the rejection or non-rejection of the overidentifying restrictions occurs in both markets, suggesting that success and failure stories for the equity premium repeat themselves in foreign exchange markets. Our results corroborate the findings in da Costa et al. (2006), which indicate a strong similarity between the behavior of excess returns in the two markets when modeled as risk premiums, providing empirical grounds to believe that the proposed preference-based solutions to puzzles in domestic financial markets can certainly shed light on the Forward Premium Puzzle.
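A sketch of the moment conditions typically estimated and tested in this setting (standard power-utility CCAPM notation assumed, commonly estimated by GMM): the Euler equation must hold for each asset, and adding instruments $z_t$ yields the overidentifying restrictions that can be tested market by market.

```latex
\[
  E_t\!\left[\beta \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma} R_{j,t+1}\right] = 1,
  \qquad
  E\!\left[\left(\beta \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma} R_{j,t+1} - 1\right) z_t\right] = 0,
\]
```

for each gross return $R_{j,t+1}$ (equity over the short bond, uncovered over covered foreign bond trading), with the $J$-test of the overidentifying restrictions applied to each market.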
Abstract:
The problems of wave propagation and power flow in a distribution network composed of an overhead wire parallel to the surface of the ground have not been satisfactorily solved. While a complete solution of the actual problem is impossible, as explained in Carson's famous paper (1926), the solution of the problem in which the actual earth is replaced by a plane homogeneous semi-infinite solid is of considerable interest. In this paper, a power flow algorithm for distribution networks with earth return, based on the backward-forward sweep technique, is discussed. In this novel use of the technique, the ground is explicitly represented. In addition, an iterative method for determining the impedance used to model the ground effect in the extended power flow algorithm is suggested. Results obtained from single-wire and three-wire studies using IEEE test networks are presented and discussed.
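As a minimal sketch of the backward-forward sweep on which the paper's algorithm is based, the single-phase radial feeder below is illustrative only; the paper's explicit ground representation and iterative earth-return impedance are not reproduced, and the series impedances are placeholders where a Carson-type earth-return correction would enter.

```python
# Backward-forward sweep power flow on a small radial feeder (single-phase equivalent).
import numpy as np

V_source = 11000 + 0j                       # slack-bus voltage (V)
Z = [0.5 + 0.9j, 0.4 + 0.7j, 0.6 + 1.0j]    # branch impedances, bus i-1 -> i (ohm)
S_load = [200e3 + 80e3j, 150e3 + 60e3j, 100e3 + 40e3j]  # bus loads (VA)
n = len(Z)

V = np.full(n + 1, V_source, dtype=complex)  # bus 0 is the source
for _ in range(30):                          # sweep iterations
    # Backward sweep: accumulate branch currents from the loads toward the source
    I_load = [np.conj(S_load[i] / V[i + 1]) for i in range(n)]
    I_branch = np.zeros(n, dtype=complex)
    for i in reversed(range(n)):
        I_branch[i] = I_load[i] + (I_branch[i + 1] if i + 1 < n else 0)
    # Forward sweep: update bus voltages from the source outward
    for i in range(n):
        V[i + 1] = V[i] - Z[i] * I_branch[i]

print(np.round(np.abs(V), 1))               # converged bus voltage magnitudes
```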