910 results for Discrete time pricing model
Abstract:
The power required to operate large mills is typically 5-10 MW. Hence, optimisation of power consumption will have a significant impact on overall economic performance and environmental impact. Power draw modelling results using the discrete element code PFC3D have been compared with results derived from the widely used empirical model of Morrell. This is achieved by calculating the power draw for a range of operating conditions for constant mill size and fill factor using the two modelling approaches. The discrete element modelling results show that, apart from density, selection of the appropriate material damping ratio is critical for accurate modelling of the mill power draw. The relative insensitivity of the power draw to the material stiffness allows selection of moderate stiffness values, which result in acceptable computation time. The results obtained confirm that modelling of the power draw for a vertical slice of the mill, of thickness 20% of the mill length, is a reliable substitute for modelling the full mill. The power draw predictions from PFC3D show good agreement with those obtained using the empirical model. Due to its inherent flexibility, power draw modelling using PFC3D appears to be a viable and attractive alternative to empirical models where the necessary code and computing power are available.
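As an illustration of how power draw can be read off discrete element output, the sketch below sums the torque that liner-particle contact forces exert about the mill axis and multiplies by the rotational speed. It is only a generic sketch: the contact data, mill dimensions and forces are hypothetical, and this is not the PFC3D model used in the study.

```python
import numpy as np

# Illustrative sketch (not the paper's PFC3D code): estimate mill power draw from
# particle-liner contact forces, assuming the mill rotates about the z-axis.
# In a real run the contact positions/forces would come from the DEM solver.

def power_draw(contact_points, contact_forces, omega, axis_point=np.zeros(3)):
    """Net power transmitted to the charge by the rotating mill shell.

    contact_points : (N, 3) contact locations on the liner [m]
    contact_forces : (N, 3) forces exerted by the liner on particles [N]
    omega          : mill angular speed about the z-axis [rad/s]
    """
    r = contact_points - axis_point                        # lever arms about the mill axis
    torque_z = np.sum(np.cross(r, contact_forces)[:, 2])   # torque component along z
    return torque_z * omega                                # P = T * omega [W]

# Synthetic contacts, just to exercise the function
points = np.random.uniform(-2.5, 2.5, size=(1000, 3))
forces = np.random.uniform(-1e4, 1e4, size=(1000, 3))
print(f"instantaneous power draw: {power_draw(points, forces, omega=1.6) / 1e6:.2f} MW")
```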
Abstract:
Purpose: Precise needle puncture of the renal collecting system is an essential but challenging step for successful percutaneous nephrolithotomy. We evaluated the efficiency of a new real-time electromagnetic tracking system for in vivo kidney puncture. Materials and Methods: Six anesthetized female pigs underwent ureterorenoscopy to place a catheter with an electromagnetic tracking sensor into the desired puncture site and ascertain puncture success. A tracked needle with a similar electromagnetic tracking sensor was subsequently navigated into the sensor in the catheter. Four punctures were performed by each of 2 surgeons in each pig, including 1 each in the kidney, middle ureter, and right and left sides. Outcome measurements were the number of attempts and the time needed to evaluate the virtual trajectory and perform percutaneous puncture. Results: A total of 24 punctures were easily performed without complication. Surgeons required more time to evaluate the trajectory during ureteral than kidney puncture (median 15 seconds, range 14 to 18 vs 13, range 11 to 16, p = 0.1). Median renal and ureteral puncture time was 19 seconds (range 14 to 45) and 51 seconds (range 45 to 67), respectively (p = 0.003). Two attempts were needed to achieve a successful ureteral puncture. The technique requires the presence of a renal stone for testing. Conclusions: The proposed electromagnetic tracking solution for renal collecting system puncture proved to be highly accurate, simple and quick. This method might represent a paradigm shift in percutaneous kidney access techniques.
Abstract:
Portugal has the largest LPG (Liquefied Petroleum Gas) share of primary energy demand in the EU (about 5%). Owing to the rising international cost of LPG in recent years and the high price sensitivity of consumers, the preference for substitute energy sources among new and existing consumers has been increasing. When selecting an energy source, some consumers estimate and compare total costs while others follow agents' (equipment sellers') recommendations. It takes time to build agents' perception of the most advantageous source of energy, which is seen as an important resource that drives the accumulation and retention of the client resource. Marketing strategies have to take into consideration market dynamic effects derived from the accumulation and depletion of these resources. A simple system dynamics model, combined with the Economic Value Added framework, was built to evaluate pricing strategies under different scenarios of LPG international cost.
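As a rough illustration of combining a stock-and-flow view with the Economic Value Added framework, the sketch below lets agents' perception accumulate or deplete with the price gap to a substitute fuel and scores a pricing policy by cumulative EVA. Every variable name and parameter value is hypothetical; this is not the model built in the work above.

```python
# Minimal system-dynamics sketch, assuming hypothetical parameters throughout.

def simulate(lpg_margin, lpg_cost, years=10, dt=0.25):
    perception, clients = 0.5, 100_000      # stocks: agent perception (0..1), client base
    substitute_price = 1.0                  # reference price of the competing energy source
    invested_capital, wacc = 5e6, 0.08
    eva_total = 0.0
    for _ in range(int(years / dt)):
        price_gap = substitute_price - (lpg_cost + lpg_margin)
        # perception drifts up when LPG is cheaper than the substitute, down otherwise
        perception += dt * 0.5 * (max(min(0.5 + price_gap, 1.0), 0.0) - perception)
        churn = 0.10 * (1.0 - perception)   # poor perception depletes the client base
        acquisition = 0.05 * perception
        clients += dt * clients * (acquisition - churn)
        nopat = clients * 0.4 * lpg_margin * dt * (1 - 0.25)   # hypothetical volume and tax rate
        eva_total += nopat - wacc * invested_capital * dt      # EVA = NOPAT - WACC * capital
    return eva_total

for margin in (0.10, 0.20, 0.30):
    print(f"margin {margin:.2f}: cumulative EVA = {simulate(margin, lpg_cost=0.70):,.0f}")
```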
Abstract:
The growth experienced in recent years in both the variety and volume of structured products means that banks and other financial institutions have become increasingly exposed to model risk. In this article we focus on the model risk associated with the local volatility (LV) model and with the Variance Gamma (VG) model. The results show that the LV model performs better than the VG model in terms of its ability to match the market prices of European options. Nevertheless, both models are subject to significant pricing errors when compared with the stochastic volatility framework.
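For readers unfamiliar with the Variance Gamma model, a minimal Monte Carlo pricer for a European call under the Madan-Carr-Chang parameterisation is sketched below. The parameter values are purely illustrative; this is not the calibration or pricing engine used in the article.

```python
import numpy as np

def vg_call_mc(S0, K, T, r, sigma, theta, nu, n_paths=200_000, seed=0):
    """European call under the Variance Gamma model via Monte Carlo."""
    rng = np.random.default_rng(seed)
    # martingale correction so that E[S_T] = S0 * exp(r T)
    omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu
    G = rng.gamma(shape=T / nu, scale=nu, size=n_paths)   # gamma time change over [0, T]
    Z = rng.standard_normal(n_paths)
    X = theta * G + sigma * np.sqrt(G) * Z                # VG increment over [0, T]
    ST = S0 * np.exp((r + omega) * T + X)
    payoff = np.maximum(ST - K, 0.0)
    return np.exp(-r * T) * payoff.mean()

# hypothetical inputs
price = vg_call_mc(S0=100.0, K=105.0, T=0.5, r=0.02, sigma=0.2, theta=-0.14, nu=0.2)
print(f"VG call price: {price:.4f}")
```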
Abstract:
Moving towards autonomous operation and management of increasingly complex open distributed real-time systems poses very significant challenges. This is particularly true when reaction to events must be done in a timely and predictable manner while guaranteeing Quality of Service (QoS) constraints imposed by users, the environment, or applications. In these scenarios, the system should be able to maintain a globally feasible QoS level while allowing individual nodes to autonomously adapt under different constraints of resource availability and input quality. This paper shows how decentralised coordination of a group of autonomous interdependent nodes can emerge with little communication, based on the robust self-organising principles of feedback. Positive feedback is used to reinforce the selection of the new desired global service solution, while negative feedback discourages nodes from acting in a greedy fashion, as this adversely impacts the service levels provided at neighbouring nodes. The proposed protocol is general enough to be used in a wide range of scenarios characterised by a high degree of openness and dynamism where coordination tasks need to be time dependent. As the reported results demonstrate, it requires fewer messages to be exchanged and reaches a globally acceptable near-optimal solution faster than other available approaches.
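A toy version of the positive/negative feedback idea, with hypothetical node levels, gains and targets (not the paper's protocol), could look like this:

```python
# Each node raises its service level when the global target is under-served
# (positive feedback) and is damped when it exceeds what its neighbours sustain
# (negative feedback on greedy behaviour). All constants are hypothetical.

def coordinate(levels, capacities, global_target, rounds=50, gain=0.2):
    levels = list(levels)
    for _ in range(rounds):
        for i, cap in enumerate(capacities):
            total = sum(levels)
            reinforce = gain * (global_target - total) / len(levels)   # positive feedback
            neighbour_mean = (total - levels[i]) / (len(levels) - 1)
            penalty = gain * max(levels[i] - neighbour_mean, 0.0)      # negative feedback
            levels[i] = min(max(levels[i] + reinforce - penalty, 0.0), cap)
    return [round(level, 3) for level in levels]

print(coordinate(levels=[1.0, 4.0, 2.0], capacities=[3.0, 5.0, 4.0], global_target=9.0))
```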
Abstract:
Cloud SLAs compensate customers with credits when average availability drops below certain levels. This is too inflexible because consumers lose non-measurable amounts of performance and are only compensated later, in subsequent charging cycles. We propose to schedule virtual machines (VMs) driven by range-based non-linear reductions of utility, different for classes of users and across different ranges of resource allocations: partial utility. This customer-defined metric allows providers to transfer resources between VMs in meaningful and economically efficient ways. We define a comprehensive cost model incorporating the partial utility assigned by clients to a given level of degradation when VMs are allocated in overcommitted environments (public, private and community clouds). CloudSim was extended to support our scheduling model. Several simulation scenarios with synthetic and real workloads are presented, using datacenters with different dimensions regarding the number of servers and computational capacity. We show that partial utility-driven scheduling allows more VMs to be allocated. It brings benefits to providers, regarding revenue and resource utilization, allowing for more revenue per resource allocated and scaling well with the size of datacenters when compared with a utility-oblivious redistribution of resources. Regarding clients, their workloads' execution time is also improved by incorporating an SLA-based redistribution of their VMs' computational power.
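A minimal sketch of the range-based partial-utility idea follows; the user classes, breakpoints and prices are hypothetical and do not reproduce the paper's cost model or its CloudSim extension.

```python
# A customer-defined piecewise function maps the fraction of requested capacity
# actually allocated to the utility the client still attaches to the VM.
PARTIAL_UTILITY = {
    # (lower bound of allocation ratio, utility retained at or above that ratio)
    "gold":   [(1.0, 1.0), (0.8, 0.9), (0.6, 0.5), (0.0, 0.0)],
    "silver": [(1.0, 1.0), (0.7, 0.8), (0.4, 0.4), (0.0, 0.0)],
}

def utility(user_class, requested_mips, allocated_mips):
    ratio = allocated_mips / requested_mips
    for lower, u in PARTIAL_UTILITY[user_class]:
        if ratio >= lower:
            return u
    return 0.0

def revenue(vms, price_per_mips=0.01):
    # provider revenue when VMs run degraded on an overcommitted host
    return sum(price_per_mips * requested * utility(cls, requested, allocated)
               for cls, requested, allocated in vms)

vms = [("gold", 1000, 850), ("silver", 2000, 1200), ("gold", 500, 500)]
print(f"revenue under degradation: {revenue(vms):.2f}")
```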
Abstract:
Computer simulation has grown rapidly since its emergence and is currently one of the most widely used management science and operational research techniques. Its principle is the replication of the operation of processes or systems over periods of time, making it an indispensable methodology for solving a wide variety of real-world problems, regardless of their complexity. Among its many areas of application, in the most diverse fields, the most prominent is its use in production systems, where the range of available applications is very broad. It has been applied to solve problems in production systems because it allows companies to adjust and plan their operations and systems quickly, effectively and deliberately, enabling rapid adaptation to the constantly changing needs of the global economy. Simulation applications and packages have followed technological trends, and the use of object-oriented technologies in their development is notable. This study was based, in a first phase, on gathering information to support the concepts of modelling and simulation and their application to real-time production systems. It then focused on the development of a prototype application for simulating manufacturing environments in real time. The tool was developed with possible pedagogical purposes and academic use in mind; it is capable of simulating a model of a production system and also provides animation. Without ruling out the integration of other modules, or even integration with other platforms, particular care was taken to implement it using object-oriented development methodologies.
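To make the discrete-event principle concrete, the sketch below replicates the operation of a single-machine production line with an event queue; the timing parameters are hypothetical and this is not the prototype developed in the dissertation.

```python
import heapq
import random

def simulate(horizon=480.0, mean_interarrival=6.0, mean_service=5.0, seed=1):
    """Single-machine line: jobs arrive, queue and are processed until the horizon."""
    random.seed(seed)
    events = [(random.expovariate(1.0 / mean_interarrival), "arrival")]
    queue, busy, completed = 0, False, 0
    while events:
        now, kind = heapq.heappop(events)
        if now > horizon:
            break
        if kind == "arrival":
            queue += 1
            heapq.heappush(events, (now + random.expovariate(1.0 / mean_interarrival), "arrival"))
        else:  # "departure": the machine finished a job
            busy = False
            completed += 1
        if queue and not busy:          # start the next job if the machine is idle
            queue -= 1
            busy = True
            heapq.heappush(events, (now + random.expovariate(1.0 / mean_service), "departure"))
    return completed

print(f"jobs completed in one shift: {simulate()}")
```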
Abstract:
Many-core platforms are an emerging technology in the real-time embedded domain. These devices offer various options for power savings and cost reductions and contribute to overall system flexibility; however, issues such as unpredictability, scalability and analysis pessimism pose serious challenges to their integration into this domain. The focus of this work is on many-core platforms using a limited migrative model (LMM). LMM is an approach based on the fundamental concepts of the multi-kernel paradigm, which is a promising step towards scalable and predictable many-cores. In this work, we formulate the problem of real-time application mapping on a many-core platform using LMM, and propose a three-stage method to solve it. An extended version of the existing analysis is used to assure that derived mappings (i) guarantee the fulfilment of timing constraints posed on worst-case communication delays of individual applications, and (ii) provide an environment to perform load balancing for, e.g., energy/thermal management, fault tolerance and/or performance reasons.
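As a simple illustration of mapping applications onto clusters under utilisation bounds (not the paper's three-stage method or its communication-delay analysis), consider the hypothetical sketch below.

```python
# Applications are assigned in decreasing order of utilisation, always to the
# least-loaded cluster, subject to a per-cluster capacity bound. Names and
# utilisation figures are hypothetical.

def map_applications(apps, clusters, capacity=1.0):
    """apps: list of (name, utilisation); clusters: number of available clusters."""
    load = [0.0] * clusters
    mapping = {}
    for name, util in sorted(apps, key=lambda a: a[1], reverse=True):  # heaviest first
        target = min(range(clusters), key=lambda c: load[c])           # least-loaded cluster
        if load[target] + util > capacity:
            raise ValueError(f"no feasible mapping for {name}")
        load[target] += util
        mapping[name] = target
    return mapping, load

apps = [("ctrl", 0.45), ("video", 0.60), ("logger", 0.20), ("net", 0.35)]
print(map_applications(apps, clusters=2))
```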
Abstract:
The 30th ACM/SIGAPP Symposium on Applied Computing (SAC 2015), Embedded Systems track, 13-17 April 2015, Salamanca, Spain.
Abstract:
3rd Workshop on High-performance and Real-time Embedded Systems (HIRES 2015), 21 January 2015, Amsterdam, Netherlands.
Abstract:
It is the purpose of the present thesis to emphasize, through a series of examples, the need for and value of appropriate pre-analysis of the impact of health care regulation. Specifically, the thesis presents three papers on the theme of regulation in different aspects of health care provision and financing. The first two consist of economic analyses of the impact of health care regulation and the third comprises the creation of an instrument for supporting economic analysis of health care regulation, namely in the field of evaluation of health care programs. The first paper develops a model of health plan competition and pricing in order to understand the dynamics of health plan entry and exit in the presence of switching costs and alternative health premium payment systems. We build an explicit model of death spirals, in which profit-maximizing competing health plans find it optimal to adopt a pattern of increasing relative prices culminating in health plan exit. We find the steady-state numerical solution for the price sequence and the plan's optimal length of life through simulation and perform some comparative statics. This allows us to show that using risk-adjusted premiums and imposing price floors are effective at reducing death spirals and switching costs, while having employees pay a fixed share of the premium enhances death spirals and increases switching costs. Price regulation of pharmaceuticals is one of the cost control measures adopted by the Portuguese government, as in many European countries. When such regulation decreases the products' real price over time, it may create an incentive for product turnover. Using panel data for the period of 1997 through 2003 on drug packages sold in Portuguese pharmacies, the second paper addresses the question of whether price control policies create an incentive for product withdrawal. Our work builds on the product survival literature by accounting for unobservable product characteristics and heterogeneity among consumers when constructing quality, price control and competition indexes. These indexes are then used as covariates in a Cox proportional hazards model. We find that, indeed, price control measures increase the probability of exit, and that such an effect is not verified in the OTC market, where no such price regulation measures exist. We also find quality to have a significant positive impact on product survival. In the third paper, we develop a microsimulation discrete-event model (MSDEM) for cost-effectiveness analysis of Human Immunodeficiency Virus treatment, simulating individual paths from antiretroviral therapy (ART) initiation to death. Four driving forces determine the course of events: CD4+ cell count, viral load, resistance and adherence. A novel feature of the model with respect to previous MSDEMs is that distributions of time to event depend on individuals' characteristics and past history. Time to event was modeled using parametric survival analysis. Events modeled include viral suppression, regimen switch due to virological failure, regimen switch due to other reasons, resistance development, hospitalization, AIDS events, and death. Disease progression is structured according to therapy lines and the model is parameterized with Portuguese observational cohort data.
An application of the model is presented comparing the cost-effectiveness of ART initiation with two nucleoside analogue reverse transcriptase inhibitors (NRTI) plus one non-nucleoside reverse transcriptase inhibitor (NNRTI) versus two NRTI plus a boosted protease inhibitor (PI/r) in HIV-1-infected individuals. We find 2NRTI+NNRTI to be a dominant strategy. Results predicted by the model reproduce those of the data used for parameterization and are in line with those published in the literature.
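A minimal sketch of the parametric time-to-event idea behind such a microsimulation is shown below: each patient's time to the next event is drawn from a Weibull distribution whose scale depends on current covariates. The coefficients, covariate effects and event logic are hypothetical, not the fitted Portuguese cohort parameters or the thesis model.

```python
import numpy as np

rng = np.random.default_rng(42)

def time_to_event(cd4, suppressed, shape=1.3, base_scale=24.0):
    # proportional effect of covariates on the Weibull scale (months); hypothetical coefficients
    scale = base_scale * np.exp(0.002 * (cd4 - 350) + (0.5 if suppressed else -0.4))
    return scale * rng.weibull(shape)

def simulate_patient(months=120):
    t, cd4, suppressed, events = 0.0, 250.0, False, []
    while t < months:
        t += time_to_event(cd4, suppressed)
        suppressed = rng.random() < 0.8          # crude chance the next regimen suppresses
        cd4 += 50.0 if suppressed else -30.0     # crude CD4 response to the event
        events.append((round(t, 1), "suppression" if suppressed else "failure"))
    return events

print(simulate_patient()[:5])
```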
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Double Degree in Finance from the NOVA School of Business and Economics / Master's Degree in Economics from Insper.
Abstract:
This project characterizes the accuracy of the escrowed dividend model for valuing European options on a stock paying a discrete dividend. A description of the escrowed dividend model is provided, and a comparison between this model and the benchmark model is carried out. It is concluded that options on stocks with low volatility, low dividend yield, or a low ex-dividend-date-to-maturity ratio, or that are deep in or out of the money, are reasonably priced by the escrowed dividend model.
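For reference, a minimal sketch of the escrowed dividend approach is given below: the present value of the dividends paid before maturity is subtracted from the spot price and Black-Scholes is applied to the adjusted spot. The numbers are illustrative only and do not reproduce the project's benchmark comparison.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def escrowed_dividend_call(S0, K, T, r, sigma, dividends):
    """dividends: list of (payment time in years, cash amount) with time < T."""
    pv_divs = sum(d * exp(-r * t) for t, d in dividends if t < T)
    S_adj = S0 - pv_divs                              # escrow the dividends out of the spot
    d1 = (log(S_adj / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S_adj * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# one dividend of 2.0 paid after 3 months, option maturing in 6 months (hypothetical inputs)
print(f"call price: {escrowed_dividend_call(50.0, 50.0, 0.5, 0.03, 0.25, [(0.25, 2.0)]):.4f}")
```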