23 results for Return-based pricing kernel

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

There is currently great interest in the natural gas market, and there are many reasons why this fuel is positioned as one of the most important in the global energy landscape. Besides filling the gap left by coal and oil, it is a much cleaner alternative that could be developed further at the domestic and industrial levels as well as in transport. The natural gas industry is changing rapidly, driven mainly by the emergence of unconventional gas and its extraction techniques, which is transforming the economics of gas production as well as the dynamics and movements of LNG across the globe. The purpose of this study is to survey the state of the natural gas sector and market worldwide, highlighting the main regions that set the general price trends for the whole planet. The study also presents the forecasts expected for the coming years, together with a summary of the trends followed to date. Particular attention is paid to the move towards hub-based systems, which began in the US and reached the United Kingdom and continental Europe at the beginning of the 21st century. This is the trend that Spain intends to follow in order to achieve greater competitiveness, flexibility and liquidity in prices and in the gas system, gradually building the structure towards the Single European Market that the member states' institutions have set as the final objective. However, launching this new model requires a series of changes to the system, such as amending the Hydrocarbons Law, designating a Market Operator, drawing up rules to regulate the market, and fostering market liquidity. Once the regulatory change takes place, the liquidity of the Spanish system will increase, creating opportunities for new ways to balance gas portfolios and new strategies to manage risk. Before the legislative changes take effect, however, one of the models proposed in the "Gas Target Model", the so-called "Implicit Capacity Allocation Model", would be implemented. Introducing this model would be a first step towards an integrated gas market without the need for legislative change, and would serve as an impulse towards the "Market Area Model", which would be the best fit for the Spanish gas system and would connect it broadly with the rest of the European markets. Regarding the formation of the new hub-based model, the study concludes that the new situation should be exploited to the full and the hub implemented as soon as possible, in order to endow the system with greater competition and liquidity. In addition, the Spanish system should take advantage of its large capacity and modern infrastructure to turn the country into the gas gateway of south-western Europe, thereby strengthening the security of supply of the member states. A further conclusion of the report is the need to increase the penetration rate of gas in Spain and to encourage its consumption over other fossil fuels such as coal and oil.
This would position natural gas as the main backup energy for renewables and would allow the price per kilowatt hour of natural gas to fall. Studying and analysing the dynamics of the global gas industry is essential in order to anticipate the changes that will gradually reshape the gas sector and market and to plan the best strategies for them.

ABSTRACT

There is a great deal of focus on the natural gas market at the moment. Whether you view natural gas as bridging the gap between coal/oil and an altogether cleaner solution yet to be determined, or as a destination fuel which will be used not only for heating and gas-fired generation but also as transportation fuel, there is no doubt that natural gas will have an increasingly important role to play in the global energy landscape. The natural gas industry is changing rapidly, as shale gas exploration changes the economics of gas production and LNG connects regions across the globe. The purpose of this study is to outline the present state of the global gas industry, highlighting the differing models around the world. It pays particular attention to the move towards hub-based pricing that took hold first in the US and, over the past decade, across the UK and continental Europe. In the coming years the Spanish model will move towards hub-based pricing. As gas market regulatory change takes hold, liquidity in the Spanish gas market will increase, bringing with it new ways to balance gas portfolios and placing an increasing focus on managing price risk. This study establishes the links between the changes that have taken place in other markets as a way to better understand how the Spanish market will evolve in the coming years.

Relevance:

100.00%

Publisher:

Abstract:

The evolution of smartphones equipped with digital cameras is driving a growing demand for increasingly complex applications that require real-time computer vision algorithms. Since video signals keep growing in size while the performance of single-core processors has stagnated, new computer vision algorithms must be parallel in order to run on multiple processors and be computationally scalable. One of the most interesting classes of processors today is found in graphics cards (GPUs), devices that offer a high degree of parallelism, excellent numerical performance and growing versatility, which makes them attractive for scientific computing. This thesis explores two computer vision applications of great computational complexity that cannot be executed in real time on traditional processors. As the thesis demonstrates, parallelising their subtasks and implementing them on a GPU achieves the desired goal of execution at interactive frame rates. A technique is also proposed for the fast evaluation of functions of arbitrary complexity that is especially suited to GPUs. First, the thesis studies the synthesis of virtual images from only two distant, non-parallel cameras providing colour and depth information, in contrast to the usual 3D TV configuration of close, parallel cameras. Using modified median filters to build a virtual depth map, together with inverse projections, these techniques are shown to be adequate for free viewpoint selection. It is also shown that encoding depth information with respect to a global reference system is highly detrimental and should be avoided. Second, a moving-object detection system is proposed based on density estimation techniques with local kernel functions. Such techniques are well suited to modelling complex scenes with multimodal backgrounds, but have seen little use owing to their high computational complexity. The proposed system, implemented in real time on a GPU, includes proposals for the dynamic estimation of the bandwidths of the local functions, selective updating of the background model, updating the positions of the reference samples of the foreground model using a multi-region particle filter, and automatic selection of regions of interest to reduce the computational cost. The results, evaluated on several databases and compared with other state-of-the-art algorithms, demonstrate the great versatility and quality of the proposal. Finally, a method is proposed for approximating arbitrary functions with continuous piecewise linear functions, especially suited to GPU implementation using the texture filtering units, which are normally not used for numerical computation. The proposal includes a rigorous mathematical analysis of the approximation error as a function of the number of samples, as well as a method for obtaining a quasi-optimal partition of the function's domain to minimise the error.
ABSTRACT

The evolution of smartphones, all equipped with digital cameras, is driving a growing demand for ever more complex applications that need to rely on real-time computer vision algorithms. However, video signals are only increasing in size, whereas the performance of single-core processors has somewhat stagnated in the past few years. Consequently, new computer vision algorithms will need to be parallel to run on multiple processors and be computationally scalable. One of the most promising classes of processors nowadays can be found in graphics processing units (GPU). These are devices offering a high degree of parallelism, excellent numerical performance and increasing versatility, which makes them interesting for scientific computing. In this thesis, we explore two computer vision applications whose high computational complexity precludes them from running in real time on traditional uniprocessors. However, we show that by parallelizing subtasks and implementing them on a GPU, both applications attain their goals of running at interactive frame rates. In addition, we propose a technique for fast evaluation of arbitrarily complex functions, specially designed for GPU implementation. First, we explore the application of depth-image-based rendering techniques to the unusual configuration of two convergent, wide-baseline cameras, in contrast to the usual configuration used in 3D TV, which is narrow-baseline, parallel cameras. By using a backward mapping approach with a depth inpainting scheme based on median filters, we show that these techniques are adequate for free-viewpoint video applications. In addition, we show that referring depth information to a global reference system is ill-advised and should be avoided. Then, we propose a background subtraction system based on kernel density estimation techniques. These techniques are very well suited to modelling complex scenes featuring multimodal backgrounds, but have not been so popular due to their huge computational and memory complexity. The proposed system, implemented in real time on a GPU, features novel proposals for dynamic kernel bandwidth estimation for the background model, selective update of the background model, update of the position of reference samples of the foreground model using a multi-region particle filter, and automatic selection of regions of interest to reduce computational cost. The results, evaluated on several databases and compared to other state-of-the-art algorithms, demonstrate the high quality and versatility of our proposal. Finally, we propose a general method for the approximation of arbitrarily complex functions using continuous piecewise linear functions, specially formulated for GPU implementation by leveraging their texture filtering units, normally unused for numerical computation. Our proposal features a rigorous mathematical analysis of the approximation error as a function of the number of samples, as well as a method to obtain a quasi-optimal partition of the domain of the function to minimize approximation error.
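
The third contribution maps naturally to a few lines of NumPy: GPU texture filtering units perform exactly this kind of linear interpolation in hardware, so a CPU-side sketch of the idea only needs uniform sampling plus interpolation. This is a minimal illustration under assumed sample counts and a hypothetical test function, not the thesis implementation:

```python
import numpy as np

def pwl_approx(f, a, b, n_samples):
    """Build a continuous piecewise linear approximation of f on [a, b]
    from n_samples uniform samples (what a GPU texture fetch interpolates)."""
    xs = np.linspace(a, b, n_samples)
    ys = f(xs)
    return lambda x: np.interp(x, xs, ys)

f = lambda x: np.exp(-x) * np.sin(4 * x)   # hypothetical test function
x = np.linspace(0.0, np.pi, 10_000)
for n in (8, 32, 128):                     # error falls as the sample count grows
    g = pwl_approx(f, 0.0, np.pi, n)
    print(n, float(np.max(np.abs(f(x) - g(x)))))
```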

Relevance:

40.00%

Publisher:

Abstract:

The seismic hazard of the Iberian Peninsula is analysed using a nonparametric methodology based on statistical kernel functions; the activity rate is derived from the catalogue data, both its spatial dependence (without a seismogenic zonation) and its magnitude dependence (without using Gutenberg–Richter's law). The catalogue is that of the Instituto Geográfico Nacional, supplemented with other catalogues around the periphery; the quantification of events has been homogenised, and spatially or temporally interrelated events have been removed so that a Poisson process can be assumed. The activity rate is determined by the kernel function, the bandwidth and the effective periods. The resulting rate is compared with that produced using Gutenberg–Richter statistics and a zoned approach. Three attenuation laws have been employed, one for deep sources and two for shallower events, depending on whether their magnitude was above or below 5. The results are presented as seismic hazard maps for different spectral frequencies and for return periods of 475 and 2475 yr, which allows uniform hazard spectra to be constructed.
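
The zoneless activity rate described here is, in essence, a kernel-smoothed sum over catalogue epicentres. The following sketch shows that idea with a fixed 2D Gaussian kernel and per-event effective-period weights; the paper's actual kernel, bandwidth selection and magnitude dependence are not reproduced, and the mini-catalogue values are assumptions:

```python
import numpy as np

def activity_rate(grid_xy, events_xy, eff_periods, h):
    """Nonparametric spatial activity rate: a Gaussian kernel of bandwidth h (km)
    centred on each epicentre, weighted by 1/effective period (events/yr)."""
    d2 = ((grid_xy[:, None, :] - events_xy[None, :, :]) ** 2).sum(axis=-1)
    kern = np.exp(-0.5 * d2 / h**2) / (2.0 * np.pi * h**2)
    return kern @ (1.0 / eff_periods)      # events / (yr * km^2) at each grid point

# hypothetical mini-catalogue: 3 epicentres (km) with assumed effective periods (yr)
events = np.array([[0.0, 0.0], [50.0, 20.0], [80.0, -10.0]])
periods = np.array([40.0, 120.0, 300.0])
grid = np.array([[10.0, 5.0], [60.0, 10.0]])
print(activity_rate(grid, events, periods, h=25.0))
```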

Relevance:

40.00%

Publisher:

Abstract:

A multivariate analysis of flood variables is needed to design some hydraulic structures, such as dams, because the complexity of the routing process in a reservoir requires a representation of the full hydrograph. In this work, a bivariate copula model was used to obtain the joint distribution of flood peak and volume, in order to estimate the probability of occurrence of a given inflow hydrograph. However, the risk of dam overtopping is given by the maximum water elevation reached during the routing process, which depends on the hydrograph variables, the reservoir volume and the spillway crest length. Consequently, an additional bivariate return period, the so-called routed return period, was defined in terms of the risk of dam overtopping, based on the maximum water elevation obtained after routing the inflow hydrographs. The theoretical return periods, which give the probability of occurrence of a hydrograph prior to accounting for the reservoir routing, were compared with the routed return period; in both cases, hydrographs with the same probability draw a curve in the peak-volume space. The procedure was applied to the case study of the Santillana reservoir in Spain. Different reservoir volumes and spillway lengths were considered to investigate the influence of the dam and reservoir characteristics on the results. The methodology improves the estimation of the Design Flood Hydrograph and can be applied to assess the risk of dam overtopping.
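
The bivariate joint distribution at the core of the method can be sketched compactly. The example below assumes a Gumbel-Hougaard copula purely for illustration (the abstract does not name the copula family) and computes the joint "AND" return period of a flood whose peak and volume both exceed given marginal quantiles:

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v), with dependence parameter theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def and_return_period(u, v, theta, mu=1.0):
    """Return period of {peak > qp AND volume > qv}, given the marginal
    non-exceedance probabilities u, v and mu floods per year on average."""
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_and

# hypothetical values: 100-yr peak and 100-yr volume, dependence theta = 2
print(and_return_period(0.99, 0.99, 2.0))   # joint return period in years (~170)
```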

Relevance:

30.00%

Publisher:

Abstract:

Forecasts based on current energy models predict that global electric energy consumption in 2050 will double present rates. Using distributed procedures for control and integration, the expected needs can be halved; the implementation of Smart Grids is therefore necessary. Interaction between final consumers and utilities is a key factor of future Smart Grids, aimed at achieving efficient and responsible energy consumption. Energy Residential Gateways (ERG) are new in-building devices that will govern the communication between user and utility and will control electric loads. Utilities will offer new services that empower residential customers to lower their electricity bills, such as Smart Metering, Demand Response and Dynamic Pricing. This paper presents a practical development of an ERG for residential buildings.
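
As a rough illustration of the Demand Response service such a gateway could expose, consider a rule that sheds deferrable loads whenever the utility's dynamic price rises above a user-set threshold. This is a hypothetical sketch, not the paper's ERG design; all load names and groupings are assumptions:

```python
# Hypothetical demand-response rule an ERG could apply (not the paper's design).
DEFERRABLE = {"water_heater", "ev_charger"}    # loads that may be shed
CRITICAL = {"fridge", "alarm"}                 # loads that are never touched

def loads_to_keep_on(price_eur_kwh, threshold_eur_kwh, loads_on):
    """Keep everything on at normal prices; shed deferrable loads above the threshold."""
    if price_eur_kwh <= threshold_eur_kwh:
        return set(loads_on)
    return {load for load in loads_on if load not in DEFERRABLE}

print(loads_to_keep_on(0.35, 0.20, {"fridge", "water_heater", "ev_charger"}))
# -> {'fridge'}
```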

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the classic oscillator design methods are reviewed and their strengths and weaknesses are shown. Provisos for avoiding the misuse of the classic methods are also proposed. If the required provisos are satisfied, the solutions provided by the classic methods (the linear approximation of oscillator start-up) will be correct. Verifying the provisos requires the NDF (Network Determinant Function). Using the NDF, or the closely related RRT (Transpose Return Relation) where more convenient, as a tool to analyse oscillators leads to a new oscillator design method; the RRT is the "true" loop gain of oscillators. The use of the new method is demonstrated with examples. Finally, a comparison of NDF/RRT results with HB (Harmonic Balance) simulation and measurements of practical implementations proves the general applicability of the new methods.
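
For context, the classic linear start-up check that these provisos guard is easy to state in code: sweep the loop gain over frequency, find the zero-phase crossings, and require a magnitude above one there. The sketch below uses a hypothetical single-resonator loop gain; it illustrates the classic test itself, not the NDF/RRT computation:

```python
import numpy as np

def startup_candidates(loop_gain, freqs):
    """Classic linear start-up check: locate frequencies where the loop-gain phase
    crosses zero and report whether |G| > 1 near the crossing. The paper's point
    is that this test alone can return spurious solutions; the NDF/RRT provisos
    weed them out."""
    g = loop_gain(freqs)
    mag, phase = np.abs(g), np.angle(g)
    out = []
    for i in np.where(np.diff(np.sign(phase)) != 0)[0]:
        t = -phase[i] / (phase[i + 1] - phase[i])      # linear root of the phase
        f0 = freqs[i] + t * (freqs[i + 1] - freqs[i])
        out.append((f0, mag[i] > 1.0))
    return out

# hypothetical resonator loop: G(f) = A / (1 + jQ(f/f0 - f0/f)), A and Q assumed
f0, Q, A = 1e9, 20.0, 2.5
loop = lambda f: A / (1 + 1j * Q * (f / f0 - f0 / f))
print(startup_candidates(loop, np.linspace(0.5e9, 1.5e9, 20001)))  # ~1 GHz, True
```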

Relevance:

30.00%

Publisher:

Abstract:

Abstract. This paper describes a new and original method for designing oscillators based on the Normalized Determinant Function (NDF) and the Transpose Return Relations (RRT). First, a review of the loop-gain method is performed, showing its pros and cons together with some examples of the wrong solutions this method can provide. The method produces wrong solutions in some cases because certain necessary conditions have not been fulfilled; the necessary conditions required to assure a correct solution are described. The necessity of using the NDF, or the Transpose Return Relations (RRT), which are related to the true loop gain, to test these additional conditions is demonstrated. To conclude, the steps for oscillator design and analysis using the proposed NDF/RRT method are presented. The wrong solutions of the loop-gain method are compared with the NDF/RRT ones, and the accuracy of the method in estimating the oscillation frequency and the loaded quality factor QL is demonstrated. Additional examples of reference-plane oscillators (Z/Y/T) are added and analysed with the new NDF/RRT method, even though these oscillators cannot be analysed using the classic loop-gain method.
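
The loaded quality factor QL that both papers estimate can be related to the phase slope of the loop gain at the oscillation frequency through the classic relation QL = (f0/2)·|dφ/df|. A small numerical sketch of that relation, reusing the hypothetical resonator loop from the previous example (this is the textbook estimator, not the papers' NDF-based one):

```python
import numpy as np

def loaded_q(loop_gain, f0, df=1e3):
    """Estimate QL from the loop-gain phase slope at f0,
    using the classic relation QL = (f0 / 2) * |d(phase)/df|."""
    phi = lambda f: np.angle(loop_gain(f))
    dphi_df = (phi(f0 + df) - phi(f0 - df)) / (2 * df)   # central difference
    return 0.5 * f0 * abs(dphi_df)

# hypothetical resonator loop with assumed A and Q, as before
f0, Q, A = 1e9, 20.0, 2.5
loop = lambda f: A / (1 + 1j * Q * (f / f0 - f0 / f))
print(loaded_q(loop, f0))   # ~20, recovering the Q used to build the loop
```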

Relevance:

30.00%

Publisher:

Abstract:

The implementation of a charging policy for heavy goods vehicles in European Union (EU) member countries has been imposed to reflect the costs of constructing and maintaining infrastructure, as well as externalities such as congestion, accidents and environmental impact. In this context, EU countries approved the Eurovignette Directive (1999/62/EC) and its amending Directive (2006/38/EC), which established a legal framework to regulate tolling systems. Even if this regulation seeks to increase the efficiency of freight transport, it will trigger direct and indirect effects on Spain's regional economies by increasing transport costs. This paper presents the development of a multiregional input-output (MRIO) methodology with elastic trade coefficients to predict interregional trade, using transport attributes integrated in multinomial logit models. This method is highly useful for the ex-ante evaluation of transport policies because it captures the cost sensitivity of road freight transport and determines the regional distributive and substitution effects in countries like Spain, whose socio-demographic and economic attributes differ region by region. It is thus possible to identify cost-effective strategies under different policy scenarios. The MRIO model is then used to determine the impact on employment of imposing a charge in the Madrid-Sevilla corridor in Spain. Measuring the impact on employment matters because it is one of the main macroeconomic indicators of Spain's regional and national economic situation. A previous research project (DESTINO) used an MRIO method to estimate the employment impacts of a road pricing policy across Spanish regions, considering a fuel tax charge (€/litre) on the entire shortest-cost-path network for freight transport. It found that the expected variation in employment is substantial for some regions and negligible for others: regional employment reductions ranged between 16.1% (Rioja) and 1.4% (Madrid region). This range seems to be related either to the intensity of freight transport in each region or to the dependency of regions on transport-intensive economic sectors. In fact, regions with freight-transport-intensive sectors lose more jobs, while regions with predominantly service economies undergo a fairly insignificant loss of employment. This paper focuses on evaluating a freight transport vehicle-kilometre charge (€/km) on a non-tolled motorway corridor (A-4) between Madrid and Sevilla (517 km). The consequences of implementing this road pricing policy show that the employment reductions are not as high as those found in the previous research, because this corridor does not affect the whole freight transport system of Spain.
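
The "elastic trade coefficients" are the crux: interregional purchase shares respond to generalized transport costs through a multinomial logit, so a toll re-routes trade before the input-output accounting is applied. A toy sketch of that mechanism, with assumed costs and an assumed cost-sensitivity parameter (not the paper's calibrated model):

```python
import numpy as np

def logit_trade_shares(costs, beta):
    """Multinomial logit share of a region's demand supplied by each origin,
    given generalized transport costs per origin and cost sensitivity beta < 0."""
    u = np.exp(beta * costs)
    return u / u.sum()

# hypothetical 3-origin example: a toll raises the cost of supplying from origin 0
base = np.array([10.0, 12.0, 15.0])          # EUR/t generalized cost (assumed)
tolled = base + np.array([2.0, 0.0, 0.0])    # EUR/t corridor charge on origin 0
beta = -0.3                                  # assumed cost sensitivity
print(logit_trade_shares(base, beta))        # shares before the charge
print(logit_trade_shares(tolled, beta))      # trade shifts away from origin 0
```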

Relevance:

30.00%

Publisher:

Abstract:

On December 20th 2006, the European Commission approved a law proposal to include the civil aviation sector in the European market of carbon dioxide emission rights (European Union Emissions Trading System, EU ETS). On July 8th 2009, the European Parliament and the Council agreed that all flights departing from or landing at EU airports from January 1st 2012 onwards should be included in the EU ETS. On November 19th 2008, EU Directive 2008/101/EC [1] included civil aviation activities in the EU ETS, and this directive was transposed into Spanish law 13/2010 of July 5th 2010 [2]. Thus, in 2012 the aviation sector must reduce its emissions to 97% of the mean values registered in the period 2004-2006, and by 2013 these emission reductions must reach 95% of the mean values for that same period. Faced with this situation, the aviation companies are seriously considering the use of alternative jet fuels to reduce their greenhouse gas emissions and to lower their costs. However, some US airlines have filed a lawsuit before the European Court of Justice arguing that this EU action violates a long-standing worldwide aviation treaty, the Chicago Convention of 1944, and Chinese aviation companies have also refused to pay any EU carbon dioxide tax [3]. Moreover, the US Departments of Agriculture and Energy and the Navy will invest a total of up to $150 million over three years to spur production of aviation and marine biofuels for commercial and military applications [4]. However, jet fuels must fulfil an extraordinarily demanding set of properties to guarantee the safety of planes and passengers during all flights.

Relevance:

30.00%

Publisher:

Abstract:

This thesis introduces some fundamental concepts underlying option valuation theory, including the implementation of computational tools. In many cases no analytical solution for option pricing exists, so the following numerical methods are used: binomial trees, Monte Carlo simulations and finite difference methods. First, an algorithm based on Hull and Wilmott is written for every method. These algorithms are then improved in different ways. For the binomial tree, both speed and memory usage are significantly improved by using only one vector instead of a whole price-storing matrix. Computational time in the Monte Carlo simulations is reduced by implementing a parallel algorithm (in C) capable of improving speed by a factor equal to the number of processors used. Furthermore, the MATLAB code for Monte Carlo was made faster by vectorising the simulation process. Finally, the option values obtained are compared to those obtained with popular finite difference methods, and it is discussed which of the algorithms is more appropriate for which purpose.
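
The one-vector binomial optimisation mentioned above replaces the O(n²) price matrix with a single payoff vector that is overwritten while stepping backwards through the tree. A minimal sketch for a European put under a standard CRR parameterisation (an illustration of the technique, not the thesis code):

```python
import numpy as np

def binomial_put(S0, K, r, sigma, T, n):
    """European put on a CRR binomial tree using one O(n) vector instead of an
    O(n^2) price matrix: the payoff vector is overwritten level by level."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)         # risk-neutral up-move probability
    disc = np.exp(-r * dt)
    j = np.arange(n + 1)                       # number of up-moves at expiry
    v = np.maximum(K - S0 * u**j * d**(n - j), 0.0)
    for _ in range(n):                         # step back one level at a time
        v = disc * (p * v[1:] + (1 - p) * v[:-1])
    return float(v[0])

print(binomial_put(100.0, 100.0, 0.05, 0.2, 1.0, 1000))  # ~5.57 (Black-Scholes ref.)
```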

Relevance:

30.00%

Publisher:

Abstract:

The characteristics of CC and CLP systems are in principle very different. However, a recent trend towards convergence in the implementation techniques for these systems can be observed: while CLP and Prolog systems have been incorporating capabilities to deal with user-defined suspension and coroutining, CC compilers have been trying to coalesce fine-grained tasks into coarser-grained sequential threads. This convergence of techniques opens up the possibility of having a general-purpose kernel language and abstract machine to serve as a compilation target for a variety of user-level languages. We propose a transformation technique directed towards such an objective. In particular, we report on techniques to support the Andorra computational model, essentially emulating the Andorra-I system via program transformation into a sequential language with delay primitives. The system is automatic, comprising an optional program analyzer and a basic transformer to the kernel language. It turns out that a simple parallel CLP or Prolog system with dynamic scheduling is sufficient as a kernel language for this purpose. The preliminary results are quite encouraging: performance of the resulting system is comparable to the current Andorra-I implementation.

Relevance:

30.00%

Publisher:

Abstract:

Assessing the social benefits of transport policy implementation has been studied by many researchers using theoretical or empirical measures. However, few of them measure social benefit using different discount rates: the intertemporal preference rate of users, the private investment discount rate and the intertemporal preference rate of the government. In general, the same social discount rate is used for all social actors. This paper therefore assesses a new method that integrates the different discount rates of the different social actors in order to measure the real benefits of each actor in the short, medium and long term. A dynamic simulation is provided by a strategic Land-Use and Transport Interaction (LUTI) model. The method is tested by optimising a cordon toll scheme in Madrid considering socioeconomic efficiency and environmental criteria. Based on the modified social welfare function (WF), the effects on the measure of social benefits are estimated and compared with the classical WF results. The results of this research could be key to understanding the relationship between transport system policies and the distribution of benefits among social actors in a metropolitan context. They show that using discount rates better suited to each social actor affects the selection and definition of the optimal congestion pricing strategy, and that the usefulness of the congestion toll measure declines more quickly over time.
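
The modification to the welfare function amounts to discounting each actor's benefit stream at that actor's own rate before summing, instead of applying one social discount rate to everything. A toy sketch of the arithmetic, where every stream and rate is an illustrative assumption rather than a value from the paper:

```python
def npv(flows, rate):
    """Present value of a stream of annual benefits at a constant discount rate."""
    return sum(b / (1.0 + rate) ** t for t, b in enumerate(flows, start=1))

# hypothetical 30-year benefit streams (M EUR/yr) for each social actor
users_flow = [12.0] * 30       # time savings net of toll payments (assumed)
operator_flow = [8.0] * 30     # toll revenue minus operating costs (assumed)
government_flow = [3.0] * 30   # externality and accident savings (assumed)

# modified WF: each stream discounted at its actor's own rate (rates assumed)
welfare = (npv(users_flow, 0.035)        # users' intertemporal preference rate
           + npv(operator_flow, 0.06)    # private investment discount rate
           + npv(government_flow, 0.02)) # government's intertemporal rate

# classical WF for comparison: one social discount rate for everyone
classical = npv([12.0 + 8.0 + 3.0] * 30, 0.045)
print(welfare, classical)
```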

Relevance:

30.00%

Publisher:

Abstract:

Purpose: A fully three-dimensional (3D), massively parallelizable list-mode ordered-subsets expectation-maximization (LM-OSEM) reconstruction algorithm has been developed for high-resolution PET cameras. System response probabilities are calculated online from a set of parameters derived from Monte Carlo simulations. The shape of the system response for a given line of response (LOR) has been shown to be asymmetrical around the LOR. This work focuses on the development of efficient region-search techniques to sample the system response probabilities, suitable for asymmetric kernel models, including elliptical Gaussian models that allow for high accuracy and high parallelization efficiency. The novel region-search scheme using variable kernel models is applied in the proposed PET reconstruction algorithm. Methods: A novel region-search technique has been used to sample the probability density function over a small dynamic subset of the field of view that constitutes the region of response (ROR). The ROR is identified around the LOR by searching for any voxel within a dynamically calculated contour. The contour condition is currently defined as a fixed threshold over the posterior probability, and arbitrary kernel models can be applied using a numerical approach. The processing of the LORs is distributed in batches among the available computing devices; individual LORs are then processed within different processing units. In this way, both multicore and multiple many-core processing units can be efficiently exploited. Tests have been conducted with probability models that take into account the noncolinearity, positron range and crystal penetration effects, which produce tubes of response with varying elliptical sections whose axes are a function of the crystal's thickness and the angle of incidence of the given LOR. The algorithm treats the probability model as a 3D scalar field defined within a reference system aligned with the ideal LOR. Results: This new technique provides superior image quality in terms of signal-to-noise ratio compared with the histogram-mode method based on precomputed system matrices available for a commercial small-animal scanner. Reconstruction times can be kept low by using multicore and many-core architectures, including multiple graphics processing units. Conclusions: A highly parallelizable LM reconstruction method has been proposed, based on Monte Carlo simulations and new parallelization techniques aimed at improving the reconstruction speed and the image signal-to-noise ratio of a given OSEM algorithm. The method has been validated using simulated and real phantoms. A special advantage of the new method is the possibility of dynamically defining the cut-off threshold over the calculated probabilities, thus allowing direct control of the trade-off between speed and quality during the reconstruction.
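
For orientation, the LM-OSEM update that such a reconstructor parallelizes can be written in a few lines once each event's system probabilities are in hand. This CPU sketch assumes precomputed sparse event-voxel weights (standing in for the online, ROR-sampled probabilities) and omits all GPU specifics; it is the generic algorithm, not the paper's kernel:

```python
import numpy as np

def lm_osem(lor_voxels, lor_weights, sens, n_vox, n_iters=4, n_subsets=8):
    """Minimal list-mode OSEM update.

    lor_voxels[e] : indices of the voxels event e intersects (its ROR sample)
    lor_weights[e]: matching system response probabilities a_ej
    sens          : voxel sensitivity image used for normalisation
    """
    img = np.ones(n_vox)
    n_events = len(lor_voxels)
    for _ in range(n_iters):
        for s in range(n_subsets):
            acc = np.zeros(n_vox)
            for e in range(s, n_events, n_subsets):    # one ordered subset of events
                vox, w = lor_voxels[e], lor_weights[e]
                proj = float(w @ img[vox])             # forward projection of event e
                if proj > 0.0:
                    acc[vox] += w / proj               # back-project the ratio
            img *= acc / np.maximum(sens, 1e-12)       # multiplicative update
    return img
```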

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we address the problem of dynamic pricing to optimise the revenue from the sales of a limited inventory over a finite time horizon. The demand is assumed to be unknown a priori, so the seller must learn on the fly. We first deal with the simplest case, involving only one class of product for sale; the general situation, with a finite number of product classes for sale, is then considered. A case in point is the sale of tickets for cultural and leisure events, which are typically sold months before the event, so uncertainty over actual demand levels is a very common occurrence. We propose a heuristic strategy of adaptive dynamic pricing based on experience gained from the past, taking into account, for each time period, the available inventory, the time remaining to reach the horizon, and the profit made in previous periods. In the computational simulations performed, the demand is updated dynamically based on the prices being offered, as well as on the remaining time and inventory. The simulations show a significant profit over the fixed-price strategy, confirming the practical usefulness of the proposed strategy. We develop a tool that allows us to test different dynamic pricing strategies designed to fit market conditions and the seller's objectives, which will facilitate data analysis and decision-making in the face of the dynamic pricing problem.
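
In the same spirit as the simulations described (though not the paper's tool), the skeleton of such an experiment fits in a few lines: price-sensitive stochastic demand, a finite inventory and horizon, and a rule that nudges the price according to the profit of the previous periods. All demand parameters and the adaptive step size below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(pricing_rule, inventory=100, horizon=50, base=30.0, elasticity=1.5):
    """Sell a finite inventory over a finite horizon with price-sensitive
    Poisson demand; base and elasticity are illustrative assumptions."""
    revenue, price = 0.0, 1.0
    profits = [0.0, 0.0]                       # per-period profit history
    for t in range(horizon):
        sales = min(inventory, rng.poisson(base * price ** -elasticity))
        inventory -= sales
        revenue += sales * price
        profits.append(sales * price)
        price = pricing_rule(price, profits, inventory, horizon - t - 1)
        if inventory == 0:
            break
    return revenue

fixed = lambda price, profits, inventory, periods_left: price

def adaptive(price, profits, inventory, periods_left):
    # nudge the price up if the last period out-earned the one before, else down
    step = 0.05 if profits[-1] >= profits[-2] else -0.05
    return max(price * (1 + step), 0.05)

print(simulate(fixed), simulate(adaptive))
```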