910 results for Discrete time pricing model


Relevance:

100.00%

Publisher:

Abstract:

This paper compares two linear programming (LP) models for shift scheduling in services where homogeneously-skilled employees are available at limited times. Although both models are based on set covering approaches, one explicitly matches employees to shifts, while the other imposes this matching implicitly. Each model is used in three forms—one with complete meal break placement flexibility, another with very limited flexibility, and a third without meal breaks—to provide initial schedules to a completion/improvement heuristic. The term completion/improvement heuristic describes a construction/improvement heuristic operating on a starting schedule. On 80 test problems varying widely in scheduling flexibility, employee staffing requirements, and employee availability characteristics, all six LP-based procedures generated lower cost schedules than a comparison from-scratch construction/improvement heuristic. This heuristic, which perpetually maintains an explicit matching of employees to shifts, consists of three phases which add, drop, and modify shifts. In terms of schedule cost, schedule generation time, and model size, the procedures based on the implicit model performed better, as a group, than those based on the explicit model. The LP model with complete break placement flexibility and implicit matching of employees to shifts generated schedules costing 6.7% less than those developed by the from-scratch heuristic.
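The set-covering structure shared by both LP models can be illustrated on a toy instance (shifts, coverage sets, and costs are all invented; brute-force enumeration stands in for the LP solver):

```python
from itertools import combinations

# Toy set-covering shift-scheduling instance (invented for illustration).
# Periods 0..3 each need at least one employee; each candidate shift
# covers a block of periods and has a cost.
shifts = {
    "early":  ({0, 1}, 3.0),
    "middle": ({1, 2}, 3.0),
    "late":   ({2, 3}, 3.0),
    "full":   ({0, 1, 2, 3}, 5.0),
}

def cheapest_cover(shifts, periods):
    """Enumerate shift subsets and return the cheapest one covering all periods."""
    best, best_cost = None, float("inf")
    names = list(shifts)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            covered = set().union(*(shifts[n][0] for n in combo))
            cost = sum(shifts[n][1] for n in combo)
            if covered >= periods and cost < best_cost:
                best, best_cost = set(combo), cost
    return best, best_cost

solution, cost = cheapest_cover(shifts, {0, 1, 2, 3})
print(solution, cost)  # one "full" shift (cost 5.0) beats any covering pair (cost 6.0)
```

The real models add meal break placement and employee availability constraints on top of this covering core, which is why an LP rather than enumeration is needed at scale.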

Abstract:

The meteorological and chemical transport model WRF-Chem was implemented to forecast PM10 concentrations over Poland. WRF-Chem version 3.5 was configured with three one-way nested domains, using GFS meteorological data and the TNO MACC II emissions. The 48-hour forecasts were run for each day of the winter and summer periods of 2014, and model performance for winter decreases only slightly with forecast lead time. The model in general captures the variability in observed PM10 concentrations for most of the stations. However, for some locations and specific episodes the model performance is poor, and the results cannot yet be used by official authorities. We argue that higher-resolution, sector-based emission data would help this analysis, together with a focus on planetary boundary layer processes in WRF-Chem and their impact on the initial distribution of emissions in both time and space.

Abstract:

This thesis presents a dividend version of the Capital Asset Pricing Model (CAPM). According to the model developed here, there is an equilibrium relation between dividend yield and systematic risk. This relation is linear and negative and can be derived in a world with or without taxes. An application of this model is possible when estimating the theoretical value of a common share using the net discount rate. Overall, the empirical test indicates an observable agreement between the model's major implications and the facts.
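The thesis's own dividend-based derivation is not reproduced in the abstract; as background only, a minimal sketch of the standard CAPM required return feeding a constant-growth dividend valuation (all figures invented, and `gordon_price` is an illustrative helper, not the thesis model):

```python
def capm_required_return(rf, beta, market_return):
    """Standard CAPM: required return is linear in systematic risk (beta)."""
    return rf + beta * (market_return - rf)

def gordon_price(next_dividend, required_return, growth):
    """Constant-growth dividend discount: P = D1 / (k - g), requires k > g."""
    if required_return <= growth:
        raise ValueError("required return must exceed growth")
    return next_dividend / (required_return - growth)

k = capm_required_return(rf=0.03, beta=1.2, market_return=0.08)  # 0.09
print(round(k, 4), round(gordon_price(2.0, k, 0.04), 2))
```

The required return k is the kind of discount rate to which the thesis's "net discount rate" application would apply tax adjustments.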

Abstract:

The solvency of banks differs from that of other corporations. A bank's equity ratio is lower than in firms in other lines of business, yet a functioning banking industry has a huge impact on society as a whole. A bank's equity ratio needs to be higher because it makes the banking industry more stable: the probability of bank failures decreases. If a bank fails, the government compensates the deposits, since it has granted the bank's depositors deposit insurance; in the last resort, the payment therefore comes from taxpayers. Economic debate has long concentrated on the costs of raising the equity ratio. It has been a common belief that raising the equity ratio increases banks' funding costs at the same pace, and that these costs are passed on to bank customers as higher service charges. Despite this belief, the actual reaction of funding costs to a higher equity ratio has been studied only a little in Europe, and no study has been conducted in Finland. Before it can be determined whether the greater stability of the banking industry brought by higher equity levels compensates for the extra funding costs, the actual increase in funding costs must be calculated. The banking industry is currently governed by complex and heavy regulation, and maintaining such a complex system inflicts major costs in itself. This research builds on the Modigliani and Miller theory, which shows that a firm's financing structure is irrelevant to its funding costs. In addition, this research follows the calculations of Miller, Yang and Marcheggiano (2012) and Vale (2011), who compute funding costs after doubling specific banks' equity ratios.
The Finnish banks studied in this research are Nordea and Danske Bank, because they are the two largest banks operating in Finland and both have a company form that enables the calculations. To calculate the costs of halving their leverage, this study used the Capital Asset Pricing Model. Halving the leverage of Danske Bank raised its funding costs by 16–257 basis points, depending on the method of assessment. For Nordea, the increase in funding costs was 11–186 basis points when its leverage was halved. Based on the results of this study, doubling the equity ratio does not increase a bank's funding costs one-for-one; the increase is in fact quite modest. More solvent banks would greatly increase the stability of the banking industry, while the increase in funding costs is low. If the costs of bank regulation exceed the increase in funding costs brought by a higher equity ratio, this can be considered a better way of stabilizing the banking industry than heavy regulation.
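A minimal sketch of the Modigliani-Miller logic behind such calculations (all figures invented, not the study's data): when debt is priced at the risk-free rate, relevering the CAPM equity beta leaves the weighted funding cost unchanged after the equity ratio is doubled; debt that is cheaper than its fair price (e.g. because it is insured) is what makes measured increases small but nonzero.

```python
def levered_beta(asset_beta, debt_to_equity):
    """Modigliani-Miller (no-tax) relation: equity beta rises with leverage."""
    return asset_beta * (1.0 + debt_to_equity)

def funding_cost(asset_beta, equity_ratio, rf, premium, cost_of_debt):
    """Weighted average funding cost with CAPM-priced equity (illustrative)."""
    d_over_e = (1.0 - equity_ratio) / equity_ratio
    cost_of_equity = rf + levered_beta(asset_beta, d_over_e) * premium
    return equity_ratio * cost_of_equity + (1.0 - equity_ratio) * cost_of_debt

# Double the equity ratio from 4% to 8%, with debt priced at the risk-free rate.
before = funding_cost(0.05, 0.04, rf=0.02, premium=0.05, cost_of_debt=0.02)
after = funding_cost(0.05, 0.08, rf=0.02, premium=0.05, cost_of_debt=0.02)
print((after - before) * 1e4, "bp")  # essentially zero under strict MM assumptions
```

Holding `cost_of_debt` fixed below its MM-fair level while leverage falls is what would produce the modest positive basis-point increases the study reports.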

Abstract:

This work proposes the implementation of a methodology for measuring liquidity risk in the internal model adopted by CONFIAR Cooperativa Financiera, with emphasis on statistical projections of deposits and withdrawals in demand savings accounts, deposits and withdrawals of member capital contributions, funds raised through term-deposit contracts, funds raised through contractual deposits, and loan disbursements, using time-series models to construct the cash flow, as required by the Superintendencia Financiera de Colombia.
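As a stand-in for the fuller time-series models the methodology relies on, a minimal moving-average projection of deposits and withdrawals into a net cash flow (the series and window are invented for illustration):

```python
# Hypothetical monthly series (invented): demand savings account
# deposits and withdrawals, in millions.
deposits    = [100, 104, 103, 108, 110, 112]
withdrawals = [ 90,  95,  92,  97, 100,  99]

def sma_forecast(series, window=3, steps=3):
    """Naive moving-average forecast: each step extends the series with the
    mean of the last `window` observations (a simple stand-in for the
    time-series models the internal model would actually fit)."""
    extended = list(series)
    for _ in range(steps):
        extended.append(sum(extended[-window:]) / window)
    return extended[len(series):]

# Projected net cash flow = projected deposits minus projected withdrawals.
net_cash_flow = [d - w for d, w in
                 zip(sma_forecast(deposits), sma_forecast(withdrawals))]
print([round(x, 2) for x in net_cash_flow])
```

Each projected inflow/outflow pair would feed one line of the cash-flow statement the regulator requires.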

Abstract:

The challenge of detecting a change in the distribution of data is a sequential decision problem that is relevant to many engineering solutions, including quality control and machine and process monitoring. This dissertation develops techniques for exact solution of change-detection problems with discrete time and discrete observations. Change-detection problems are classified as Bayes or minimax based on the availability of information on the change-time distribution. A Bayes optimal solution uses prior information about the distribution of the change time to minimize the expected cost, whereas a minimax optimal solution minimizes the cost under the worst-case change-time distribution. Both types of problems are addressed. The most important result of the dissertation is the development of a polynomial-time algorithm for the solution of important classes of Markov Bayes change-detection problems. Existing techniques for epsilon-exact solution of partially observable Markov decision processes have complexity exponential in the number of observation symbols. A new algorithm, called constellation induction, exploits the concavity and Lipschitz continuity of the value function, and has complexity polynomial in the number of observation symbols. It is shown that change-detection problems with a geometric change-time distribution and identically- and independently-distributed observations before and after the change are solvable in polynomial time. Also, change-detection problems on hidden Markov models with a fixed number of recurrent states are solvable in polynomial time. A detailed implementation and analysis of the constellation-induction algorithm are provided. Exact solution methods are also established for several types of minimax change-detection problems. Finite-horizon problems with arbitrary observation distributions are modeled as extensive-form games and solved using linear programs. 
Infinite-horizon problems with linear penalty for detection delay and identically- and independently-distributed observations can be solved in polynomial time via epsilon-optimal parameterization of a cumulative-sum procedure. Finally, the properties of policies for change-detection problems are described and analyzed. Simple classes of formal languages are shown to be sufficient for epsilon-exact solution of change-detection problems, and methods for finding minimally sized policy representations are described.
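The cumulative-sum procedure mentioned for the infinite-horizon case can be sketched in a few lines (a generic Bernoulli CUSUM, not the dissertation's epsilon-optimal parameterization; the stream and threshold are invented):

```python
from math import log

def cusum_alarm(observations, p0, p1, threshold):
    """One-sided CUSUM for i.i.d. Bernoulli observations: accumulate the
    log-likelihood ratio of the post-change vs pre-change distribution,
    clamp at zero, and alarm when the statistic crosses the threshold."""
    stat = 0.0
    for t, x in enumerate(observations):
        llr = log(p1 / p0) if x else log((1 - p1) / (1 - p0))
        stat = max(0.0, stat + llr)
        if stat >= threshold:
            return t  # first index at which a change is declared
    return None

# Change from p0=0.1 to p1=0.8 after index 4.
stream = [0, 0, 1, 0, 0, 1, 1, 1, 1, 1]
print(cusum_alarm(stream, p0=0.1, p1=0.8, threshold=3.0))  # alarms at index 6
```

The threshold trades detection delay against false-alarm rate, the two quantities the linear-penalty formulation balances.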

Abstract:

The aim of this thesis was to explore possibilities for manufacturing cost savings and improved competitiveness for the compact KONE Renova Slim elevator door. Compact slim doors are designed especially for the EMEA market. The EMEA market area is characterized by highly competitive pricing and lead times, which translate into pressure to decrease the manufacturing costs and lead times of the compact elevator door. The new elevator safety code EN81-20, coming into force in spring 2016, will also have a negative impact on cost and competitiveness development, making the situation more acute. As a sheet metal product, the KONE Renova Slim is highly variable. The manufacturing methods used in production are common and robust. Due to the low volumes, high variability, and tight lead times, manufacturing of the doors faces difficulties. Manufacturing of the doors is outsourced to two individual suppliers, Stera and Wittur. This thesis was carried out in collaboration with Stera. KONE and Stera pursue a close, long-term partnership in which the benefits of the collaboration are shared equally. Despite these aims, the collaboration between the companies is not fully transparent, and various barriers hamper development toward more efficient ways of working. Based on the empirical studies related to this thesis, an efficient standardized (A+) process was developed for the main variations of the compact elevator door. Using the standardized process, KONE is able to order the most important AMDS door variations from Stera with increased quality, lower manufacturing costs, and shorter manufacturing lead time compared to the current situation. In addition to these benefits, the standardized (A+) process also carries practical risks. KONE and the door supplier need to consider these risks together before decisions are made.


Abstract:

Doctoral thesis, Universidade de Brasília, Instituto de Geociências, Pós-Graduação em Geociências Aplicadas, 2015.

Abstract:

Given that landfills are depletable and replaceable resources, the right approach to landfill management is to design an optimal sequence of landfills rather than to design every single landfill separately. In this paper we use Optimal Control models, with mixed elements of both continuous- and discrete-time problems, to determine an optimal sequence of landfills with respect to their capacities and lifetimes. The resulting optimization problems involve splitting a planning horizon into several subintervals, the lengths of which have to be decided. In each subinterval some costs, the amount of which depends on the value of the decision variables, have to be borne. The results obtained may be applied to other economic problems such as private and public investment, consumption decisions on durable goods, etc.
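The splitting of the horizon into subintervals can be sketched as a small dynamic program (the cost structure, discount rate, and horizon are invented; a convex capacity cost is what makes splitting the horizon worthwhile):

```python
from functools import lru_cache

# Invented cost structure: each landfill serves `life` years of a constant
# waste flow; building it costs a fixed part plus a convex capacity part,
# paid up front and discounted to time 0.
SETUP, CAP_COST, RATE, HORIZON = 10.0, 1.5, 0.05, 10

def build_cost(life):
    return SETUP + CAP_COST * life ** 2

@lru_cache(maxsize=None)
def best(start):
    """Cheapest discounted cost of covering years [start, HORIZON) with a
    sequence of landfills; returns (cost, lifetime of the first landfill)."""
    if start >= HORIZON:
        return 0.0, None
    discount = (1 + RATE) ** -start
    options = []
    for life in range(1, HORIZON - start + 1):
        tail, _ = best(start + life)
        options.append((discount * build_cost(life) + tail, life))
    return min(options)

total, first_life = best(0)
print(round(total, 2), first_life)
```

A single landfill covering the whole horizon is one of the enumerated options, so the optimum is never worse than building once; with convex capacity costs it is strictly better to build a sequence of smaller landfills.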

Abstract:

Queueing theory provides models, structural insights, problem solutions, and algorithms to many application areas. Due to its practical applicability to production, manufacturing, home automation, communications technology, etc., increasingly complex systems require more elaborate models, techniques, and algorithms to be developed. Discrete-time models are very suitable in many situations, yet the analysis of discrete-time systems is technically more involved than that of their continuous-time counterparts. In this paper we consider a discrete-time queueing system in which server failures can occur, as well as priority messages. The possibility of server failures with a general lifetime distribution is considered. We carry out an extensive study of the system by computing generating functions for the steady-state distribution of the number of messages in the queue and in the system. We also obtain generating functions for the stationary distributions of the busy period and of the sojourn times of a message in the server and in the system. Performance measures of the system are also provided.
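The paper's analysis is via generating functions; as a companion, a minimal discrete-time Geo/Geo/1 simulation (no failures or priorities, parameters invented) shows the slot-by-slot dynamics such models describe:

```python
import random

def simulate_geo_geo_1(p_arrival, p_service, slots, seed=1):
    """Discrete-time Geo/Geo/1 queue: in each slot, the message in service
    (if any) completes with probability p_service, then at most one new
    message arrives with probability p_arrival. Returns the time-averaged
    number of messages in the system."""
    random.seed(seed)
    in_system, area = 0, 0
    for _ in range(slots):
        if in_system > 0 and random.random() < p_service:
            in_system -= 1  # departure processed at the start of the slot
        if random.random() < p_arrival:
            in_system += 1  # Bernoulli arrival in this slot
        area += in_system
    return area / slots

avg = simulate_geo_geo_1(p_arrival=0.3, p_service=0.5, slots=200_000)
print(round(avg, 2))
```

The simulated average can be checked against the closed-form mean that the steady-state generating function yields for this simple special case.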

Abstract:

Master's dissertation in Economic and Business Sciences, Unidade de Ciências Económicas e Empresariais, Univ. do Algarve, 1996.

Abstract:

Company valuation models attempt to estimate the value of a company in two stages: (1) a period of explicit analysis, and (2) an unlimited production period of cash flows obtained through a mathematical approach to perpetuity, which is the terminal value. In general these models, whether they belong to the Dividend Discount Model (DDM), Discounted Cash Flow (DCF), or Residual Income Model (RIM) group, discount one attribute (dividends, free cash flow, or results) at a given discount rate. This discount rate, obtained in most cases by the CAPM (Capital Asset Pricing Model) or APT (Arbitrage Pricing Theory), allows the analysis to include the cost of invested capital based on the riskiness of the attributes. However, one cannot ignore that the second stage of valuation usually represents 53-80% of the company value (Berkman et al., 1998) and is loaded with uncertainty. In this context, particular attention is needed when estimating the value of this portion of the company, since otherwise the assessment may produce a high level of error. Mindful of this concern, this study collected the perceptions of European and North American financial analysts on the key features of a company that they believe contribute most to its value, using a survey with closed answers. From the analysis of 123 valid responses using factor analysis, the authors conclude that great importance is attached (1) to the life expectancy of the company, (2) to liquidity and operating performance, (3) to innovation and the ability to allocate resources to R&D, and (4) to management capacity and capital structure, in determining the long-term value of a company or business. These results support our belief that a valuation model for companies and businesses can be formulated whose results are as close as possible to those found in the stock market.
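The two-stage structure can be sketched as follows (all figures invented; the terminal value uses a standard Gordon-growth perpetuity):

```python
def two_stage_dcf(cash_flows, discount_rate, terminal_growth):
    """Two-stage valuation: discount an explicit-period cash-flow forecast,
    then add a Gordon-growth terminal value for the perpetuity stage.
    Returns (present value of explicit stage, present value of terminal value)."""
    explicit = sum(cf / (1 + discount_rate) ** t
                   for t, cf in enumerate(cash_flows, start=1))
    last = cash_flows[-1]
    terminal = last * (1 + terminal_growth) / (discount_rate - terminal_growth)
    terminal_pv = terminal / (1 + discount_rate) ** len(cash_flows)
    return explicit, terminal_pv

explicit, terminal = two_stage_dcf([10, 11, 12], discount_rate=0.10,
                                   terminal_growth=0.02)
share = terminal / (explicit + terminal)
print(round(share, 2))  # the terminal value dominates, as the 53-80% range suggests
```

Even with a short three-year explicit period, the perpetuity stage carries most of the value, which is why the study focuses analysts' attention on the drivers of that stage.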

Abstract:

This article presents an analysis of the capital structure decisions of the company Merck Sharp & Dome S.A.S from the perspective of behavioral finance, comparing the methods currently used by the selected company with traditional finance theory in order to evaluate theoretical and actual performance. Incorporating behavioral elements into the study allows a deeper examination of corporate decisions in a context closer to current research advances in behavioral finance, which leads the analysis in this article to focus on identifying and understanding the overconfidence and status quo biases, and above all their implications for financing decisions. According to traditional theory, the capital structuring process is guided by costs, but this case study showed that in practice the cost-decision relation takes second place to the risk-decision relation in the capital structuring process.