14 results for Wholesale price indexes

at Universidad Politécnica de Madrid


Relevance:

20.00%

Abstract:

An analysis of the price-formation process in the Lisbon residential market, from the standpoint of eliminating the subjective aspects of the appraiser's assessment of property characteristics.

Relevance:

20.00%

Abstract:

The constellation of adverse cardiovascular disease (CVD) and metabolic risk factors, including elevated abdominal obesity, blood pressure (BP), glucose, and triglycerides (TG), and lowered high-density lipoprotein-cholesterol (HDL-C), has been termed the metabolic syndrome (MetSyn) [1]. A number of different definitions have been developed by the World Health Organization (WHO) [2], the National Cholesterol Education Program Adult Treatment Panel III (ATP III) [3], the European Group for the Study of Insulin Resistance (EGIR) [4] and, most recently, the International Diabetes Federation (IDF) [5]. Since there is no universal definition of the metabolic syndrome, several authors have derived different risk scores to represent the clustering of its components [6-11].
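
Where the cited scores differ in detail, a common device behind many of them is a continuous composite built from standardized components. The sketch below illustrates that generic idea in Python; it is not any of the specific scores in [6-11], and the choice and direction of components are assumptions.

```python
import numpy as np

def metsyn_zscore(waist, bp_systolic, glucose, tg, hdl):
    """Continuous MetSyn-style risk score as the mean of standardized
    components (HDL-C is inverted, since higher HDL-C is protective).
    Each argument is a 1-D array of subject values (hypothetical units)."""
    components = np.column_stack([waist, bp_systolic, glucose, tg, -hdl])
    z = (components - components.mean(axis=0)) / components.std(axis=0)
    return z.mean(axis=1)  # one composite score per subject
```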

Relevance:

20.00%

Abstract:

In this work, the dimensional synthesis of a spherical Parallel Manipulator (PM) with a -1S kinematic chain is presented. The goal of the synthesis is to find a set of parameters that defines the PM with the best performance in terms of workspace capabilities, dexterity and isotropy. The PM is parametrized in terms of a reference element, and a non-directed search over these parameters is carried out. First, the inverse kinematics and instantaneous kinematics of the mechanism are presented; the latter is derived using the screw-theory formulation. An algorithm that explores a bounded set of parameters and determines the corresponding values of the global indexes is presented. The concepts of a novel global performance index and a compound index are introduced. Simulation results are shown and discussed. The best PMs found for each performance index evaluated are locally analyzed in terms of their workspace and local dexterity. The relationship between the performance of the PM and its parameters is discussed, and a prototype with the best performance in terms of the compound index is presented and analyzed.
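
As one illustration of the kind of global index such a synthesis relies on, the sketch below averages a local dexterity measure (the inverse condition number of the kinematic Jacobian) over sampled workspace poses. The `jacobian` callable and the grid-search comment are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def global_conditioning_index(jacobian, poses):
    """Average the local dexterity 1/kappa(J) over sampled workspace poses;
    values near 1 indicate isotropy, values near 0 ill-conditioning.
    `jacobian(pose)` is a user-supplied kinematic Jacobian (assumed)."""
    k = [1.0 / np.linalg.cond(jacobian(p)) for p in poses]
    return float(np.mean(k))

# Non-directed search over a bounded design-parameter grid (illustrative):
# best = max(param_grid, key=lambda q: global_conditioning_index(
#     lambda pose: numeric_jacobian(q, pose), sampled_poses))
```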

Relevance:

20.00%

Abstract:

The analysis of large amounts of data is a field with many years of research behind it, centred on extracting significant values that make the data easier to understand and interpret. The analysis of interdependence between time series is an important branch of this field, mainly as a result of advances in characterizing dynamical systems from the signals they produce. In medicine, many studies try to understand the behaviour of the brain, its mode of operation and its internal connections. The human brain comprises approximately 10^11 neurons, each of which makes about 10^3 synaptic connections. This huge number of connections between individual processing elements provides the fundamental substrate for neuronal ensembles to become transiently synchronized or functionally connected. A similarly complex network configuration and dynamics can also be found at the macroscopic scales of systems neuroscience and brain imaging. The emergence of dynamically coupled cell assemblies represents the neurophysiological substrate for cognitive functions such as perception, learning and thinking.
Understanding the complex network organization of the brain on the basis of neuroimaging data represents one of the most formidable challenges for systems neuroscience. Brain connectivity is an elusive concept that refers to different interrelated aspects of brain organization: structural connectivity, functional connectivity (FC) and effective connectivity (EC). Structural connectivity refers to the network of physical connections linking sets of neurons: the anatomical structure of brain networks. FC, in contrast, refers to the statistical dependence between the signals stemming from two distinct units within a nervous system, while EC refers to the causal interactions between them. This research opens the door to tackling diseases related to the brain, such as Parkinson's disease, senile dementia and mild cognitive impairment. One of the most important projects associated with research into Alzheimer's and other diseases is the European Blue Brain project, in which the Centre for Biomedical Technology (CTB) of Universidad Politécnica de Madrid (UPM) takes part. The CTB researchers have developed a magnetoencephalography (MEG) data-processing tool that allows data to be visualised and analysed in an intuitive way. This tool is named HERMES, and it is presented in this document.
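
As a concrete illustration of an FC measure of the kind a connectivity toolbox computes, the sketch below estimates the Phase Locking Value (PLV) between two signals. This is a generic textbook implementation, not necessarily HERMES's own code.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two signals: 1 means perfectly phase-locked,
    0 means no consistent phase relation across samples."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

# Example: two noisy signals sharing a 10 Hz component
t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(phase_locking_value(x, y))  # close to 1 for phase-locked signals
```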

Relevance:

20.00%

Abstract:

The mineral price assumed in mining project design is critical to determining the economic feasibility of a project. Nevertheless, although it is not difficult to find literature on market metal prices, it is much harder to find a specific methodology for selecting the value to use, or guidance on which justifications are appropriate to include. This study presents an analysis of various methods for selecting metal prices and investigates the mechanisms and motives underlying price selections. The results describe the various attitudes adopted by the designers of mining investment projects, and how the price can be determined not just by forecasting but also by considering other relevant parameters.
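
To see why the assumed price is so decisive for feasibility, a minimal discounted-cash-flow sketch suffices; all figures below are hypothetical.

```python
def npv(price, output_tpy, opex_per_t, capex, rate=0.08, years=10):
    """Project NPV for a flat metal-price assumption (illustrative
    figures only; every parameter here is hypothetical)."""
    cash = (price - opex_per_t) * output_tpy  # annual operating margin
    return -capex + sum(cash / (1 + rate) ** t for t in range(1, years + 1))

for p in (5000, 6500, 8000):  # assumed metal prices, USD/t
    print(p, round(npv(p, output_tpy=20_000, opex_per_t=4500, capex=150e6)))
# The same deposit flips from unfeasible to attractive with the price choice.
```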

Relevance:

20.00%

Abstract:

The price formation of the Iberian Energy Derivatives Market (the power futures market), which started in July 2006, is assessed up to November 2011 through the evolution of the difference between forward and spot prices in the delivery period (the "ex-post forward risk premium") and a comparison with the forward generation costs from natural gas (the "clean spark spread"). The premium tends to be positive in all existing mechanisms (futures, over-the-counter trading, and auctions for covering part of the last-resort supplies). Since 2011 the values have been smaller, owing to regulatorily recognized prices for coal power plants. The power futures are strongly correlated with European gas prices. The spreads built with prompt contracts also tend to be positive; the biggest are for the month contract, followed by the quarter contract and then by the year contract. Gas-fired generation companies can therefore maximize profits by trading contracts of shorter maturity.
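
The two quantities the study tracks can be written down directly. The sketch below computes them under assumed CCGT efficiency and emission-factor values; all numbers are illustrative, not the paper's data.

```python
import numpy as np

def ex_post_forward_risk_premium(futures_price, spot_prices):
    """Futures price agreed before delivery minus the average spot
    price realised over the delivery period (EUR/MWh)."""
    return futures_price - np.mean(spot_prices)

def clean_spark_spread(power, gas, co2, efficiency=0.55,
                       emission_factor=0.37):
    """Forward power price minus fuel and CO2 costs of a gas plant.
    The efficiency and tCO2/MWh figures are typical CCGT assumptions."""
    return power - gas / efficiency - co2 * emission_factor

print(ex_post_forward_risk_premium(52.0, [48.5, 50.1, 49.3]))  # > 0
print(clean_spark_spread(power=52.0, gas=22.0, co2=8.0))       # EUR/MWh
```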

Relevance:

20.00%

Abstract:

We can say without hesitation that in energy markets a thorough data analysis is crucial when designing sophisticated models that are able to capture most of the critical market drivers. In this study we investigate the structure of Spanish natural gas prices to improve the understanding of the role they play in the determination of electricity prices, and to inform future decisions about price-modelling aspects. To further understand the potential for modelling, this study focuses on the nature and characteristics of the different gas price data available. The fact that the existing gas market in Spain does not offer enough trading liquidity makes it even more important to analyze in detail the available gas price data, which ultimately provides the information needed to understand how electricity prices are affected by natural gas markets. Representative Spanish gas prices are typically difficult to explore, given that there is no transparent gas market yet and all the gas imported into the country is negotiated and purchased by private companies on confidential terms.

Relevance:

20.00%

Abstract:

Electricity price forecasting is an interesting problem for all the agents involved in electricity market operation. For instance, every profit-maximisation strategy is based on the computation of accurate one-day-ahead forecasts, which is why electricity price forecasting has been a growing field of research in recent years. In addition, the increasing concern about environmental issues has led to a high penetration of renewable energies, particularly wind. In some European countries such as Spain, Germany and Denmark, renewable energy is having a deep impact on the local power markets. In this paper, we propose a model that is optimal from the perspective of forecasting accuracy; it consists of a combination of several univariate and multivariate time series methods that account for the amount of energy produced with clean energies, particularly wind and hydro, which are the most relevant renewable energy sources in the Iberian Market. This market is used to illustrate the proposed methodology, as it is one of the markets in which wind power production is most relevant in terms of its percentage of total demand, but our method can of course be applied to any other liberalised power market. As far as our contribution is concerned, first, the methodology proposed by García-Martos et al. (2007, 2012) is generalised in two ways: we allow the incorporation of wind power production and hydro reservoirs, and we do not impose the restriction of using the same model for all 24 hours. A computational experiment and a Design of Experiments (DOE) are performed for this purpose. Then, for those hours in which there are two or more models without statistically significant differences in forecasting accuracy, a combination of forecasts is proposed by weighting the best models (according to the DOE) and minimising the Mean Absolute Percentage Error (MAPE), which is the most popular accuracy metric for comparing electricity price forecasting models. We construct the combination of forecasts by solving several nonlinear optimisation problems that yield the optimal weights for the combination. The results are obtained from a large computational experiment that entails calculating out-of-sample forecasts for every hour of every day in the period from January 2007 to December 2009. In addition, to reinforce the value of our methodology, we compare our results with those reported in recent published works in the field; this comparison shows the superiority of our methodology in terms of forecasting accuracy.
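
The weighting step can be posed as a small constrained optimisation problem. The sketch below finds convex weights that minimise the MAPE of a combination of hourly forecasts; it captures the general idea only, not the authors' exact optimisation problems.

```python
import numpy as np
from scipy.optimize import minimize

def combine_forecasts(forecasts, actual):
    """Find convex weights for candidate forecasts that minimise the
    MAPE of the combination. `forecasts` is an (m models, T hours)
    array; `actual` is the realised price series of length T."""
    m = forecasts.shape[0]

    def mape(w):
        combo = w @ forecasts
        return np.mean(np.abs((actual - combo) / actual)) * 100

    cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1},)
    res = minimize(mape, np.full(m, 1 / m),
                   bounds=[(0, 1)] * m, constraints=cons)
    return res.x  # optimal weight per model
```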

Relevance:

20.00%

Abstract:

This paper applies an integrated modeling approach to the case of Spain; the approach is based on a random utility-based multiregional input-output model and a road transport network model for assessing the effect of introducing longer and heavier vehicles (LHVs) on the regional consumer price index (CPI) and on the transportation system. The approach strongly supports the concept that changes in transport costs derived from the LHV allowance, as well as the economic structure of the regions, have direct and indirect effects on the economy and on the transportation system. Results show that the introduction of LHVs might reduce the prices paid by consumers for a representative basket of goods and services in the regions of Spain and would also lead to a reduction in the regional CPI. In addition, the magnitude and extent of changes in the transportation system are estimated by using the commodity-based structure of the approach to identify the effect of traffic changes on traffic flows and on pollutant emissions over the whole network.
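
The direct and indirect price effects can be illustrated with the classic Leontief price model p = (I - A^T)^(-1) v, a far simpler relative of the random utility-based multiregional model used in the paper; all coefficients below are made up.

```python
import numpy as np

# How a drop in the transport sector's input coefficients (cheaper
# freight with LHVs) propagates to all sector prices. Illustrative only.
A = np.array([[0.10, 0.05, 0.20],   # agriculture inputs per unit output
              [0.15, 0.10, 0.10],   # manufacturing inputs per unit output
              [0.08, 0.12, 0.05]])  # transport inputs per unit output
v = np.array([0.60, 0.50, 0.55])    # value added per unit output

def prices(A, v):
    """Leontief price model: p = (I - A^T)^(-1) v."""
    return np.linalg.solve(np.eye(len(v)) - A.T, v)

base = prices(A, v)
A_lhv = A.copy()
A_lhv[2, :] *= 0.95                  # 5% cheaper transport inputs
print((prices(A_lhv, v) / base - 1) * 100)  # % price change by sector
```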

Relevance:

20.00%

Abstract:

The purpose of this work is to provide a statistical description of the heavy-rainfall phenomenon in a Spanish region. We want to quantify the effect of climate change and to track the speed of its evolution through the variation of the probability distributions. Our conclusions are of special interest for agrarian insurers, who may thereby estimate costs more realistically. The analysis mainly focuses on two aspects. First, the distribution of consecutive days without rain for each gauge station and season: we estimate kernel density functions and fit a Generalized Pareto Distribution (GPD) to exceedances above a threshold value u for a network of stations in the Ebro River basin, which lets us establish a relation between the distributional parameters and regional characteristics. We pay particular attention to the tail of the probability distribution. These tails are governed by a power law, meaning that the number of events n can be expressed as a power of another quantity x: n(x) = x^(-alpha), where alpha can be estimated as the slope of a log-log plot of the number of events against their size. The most convenient way to analyze n(x) is through the empirical probability distribution, Pr(X > x) ∝ x^(-alpha). Second, the distribution of rainfall above the 0.95 percentile of wet days, at the seasonal and yearly scales, with the same treatment of the tails as above.
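
A minimal sketch of the two estimation steps, peaks-over-threshold GPD fitting and tail-slope estimation from the empirical survival function, is shown below on synthetic data; it is not the study's code, and the geometric stand-in for dry-spell lengths is an assumption.

```python
import numpy as np
from scipy.stats import genpareto

# Fit a GPD to dry-spell lengths exceeding a threshold u, then estimate
# the tail exponent alpha from the empirical survival function.
spells = np.random.geometric(0.15, size=2000)  # stand-in for dry spells
u = 10
exceedances = spells[spells > u] - u
shape, loc, scale = genpareto.fit(exceedances, floc=0)

x = np.sort(spells[spells > u])
survival = 1 - np.arange(1, x.size + 1) / x.size   # empirical Pr(X > x)
mask = survival > 0                                # drop the log(0) point
alpha = -np.polyfit(np.log(x[mask]), np.log(survival[mask]), 1)[0]
print(shape, scale, alpha)  # Pr(X > x) ~ x^(-alpha) in the tail
```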

Relevance:

20.00%

Abstract:

Melon is traditionally cultivated in fertigated farmland in the centre of Spain, with high inputs of water and N fertilizer. Excess N can have a negative impact from several points of view: economically, it can diminish the production and quality of the fruit; environmentally, N is a very mobile element in the soil and can contaminate groundwater; and from a health perspective, nitrate can accumulate in the fruit pulp, while groundwater is the fundamental supply source for human populations. Best management practices are particularly necessary in this region, as many zones have been declared vulnerable to NO3- pollution (Directive 91/676/CEE). Over successive years, a melon crop (Cucumis melo L.) was grown under field conditions, applying mineral and organic fertilizers under drip irrigation. Different doses of ammonium nitrate were used, as well as compost derived from the wine-distillery industry, which is relevant in this area. The present study reviews the most common N efficiency indexes under the different management options, with a view to maximizing yield and minimizing N losses.
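
For reference, the sketch below computes three of the most common agronomic N-use efficiency indexes; it is a generic formulation with hypothetical figures, not the specific set of indexes reviewed in the study.

```python
def n_efficiency_indexes(yield_fert, yield_zero, n_uptake_fert,
                         n_uptake_zero, n_applied):
    """Common agronomic N-use efficiency indexes, comparing a fertilized
    plot against an unfertilized control (kg/kg and %)."""
    return {
        'agronomic_efficiency': (yield_fert - yield_zero) / n_applied,
        'apparent_recovery_%': 100 * (n_uptake_fert - n_uptake_zero)
                               / n_applied,
        'partial_factor_productivity': yield_fert / n_applied,
    }

# Hypothetical melon plot: yields in kg fruit/ha, N in kg N/ha
print(n_efficiency_indexes(42000, 35000, 150, 95, 120))
```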

Relevance:

20.00%

Abstract:

Air mass and atmospheric components (basically aerosol optical depth (AOD) and precipitable water (PW)) determine the absorption of the sunlight that arrives at the Earth's surface. Radiative models such as SMARTS or MODTRAN use these parameters to generate an equivalent spectrum. However, complex and expensive instruments (such as the devices of the AERONET network) are needed to obtain AOD and PW. On the other hand, the use of isotype cells is a convenient way to characterize a site spectrally for CPV, considering that they provide the photocurrent of each internal subcell individually. By crossing data from an AERONET station with data from a Tri-band Spectroheliometer, a model that correlates Spectral Mismatch Ratios with atmospheric parameters is proposed. Given the number of stations in the AERONET network, this model may be used to estimate the spectral influence on the energy performance of CPV systems close to any of the stations worldwide.
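
A minimal sketch of how a spectral ratio of this kind is built from isotype-cell photocurrents is given below; the function name and the calibration values are assumptions for illustration.

```python
def spectral_matching_ratio(isc_top, isc_mid, isc_top_ref, isc_mid_ref):
    """Ratio of the top/middle subcell photocurrent ratio under the
    prevailing spectrum to the same ratio under the reference spectrum.
    A value of 1 indicates a reference-like spectrum; > 1 a blue-rich
    one. Reference currents are device calibration values (assumed)."""
    return (isc_top / isc_mid) / (isc_top_ref / isc_mid_ref)

# Hypothetical isotype-cell readings (A) and calibration values
print(spectral_matching_ratio(0.104, 0.100, 0.102, 0.101))
```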

Relevance:

20.00%

Abstract:

The initial premise of this thesis examines how the aftermath of the Second World War motivated a general revision of science and fostered a new relationship between mankind and its environment. Mathematics, physics and biology gave birth to the computer sciences as a discipline of convergence. At a moment when the object of the major sciences was being redefined, a number of architects glimpsed the opportunity to transform certain disciplinary conventions. By incorporating ontologies and procedures from cybernetics and computing, they traced a new architectural space. Legitimized by an unquestionable technological take-off, they challenged the limits of the profession, exploring fields open to new programs and actions, and extended the natural domain of architecture beyond the (finished) object towards the (open) process.
The thesis begins by describing the background that leads to this scenario of change, noting aspects of systems theory, computing, biology and certain architectural references relevant to this new approach; in that background lie the arguments for orienting the discipline towards working with processes. The central line of argument addresses the work of Christopher Alexander, Nicholas Negroponte and Cedric Price through a theoretical and practical production transformed by computation, and examines the conceptual contribution of each author. The comparative analysis of their models is arranged by dissecting three converging concepts: system, code and process. The critical discussion is articulated through a triangulation of the authors, identifying pairwise the coincidences and controversies between them. This procedure serves to build a conceptual bridge to the current architectural scene, assessing the impact of their proposals: their contribution to the drift from the closed program to speculation, from the formal to the informal, from the unique to the multiple, and from the architecture studio to the research laboratory. To guide this journey through the significance of each author in the digital development of the discipline, two essential figures join the scene: computing experts who worked as links between the authors, nuancing the meaning of their models. The work of Gordon Pask and John Frazer constitutes the vehicle that transmits the findings of those years, extending the paths opened then into the architecture of today and the architecture already being designed for tomorrow.

Relevance:

20.00%

Abstract:

The second half of the 1960s, that convulsive period and generator of long-revisited experiences, also witnessed a curious phenomenon in Italy. Linked to the boom of nightclubs in the United States and to an intense climate of social emancipation, these new "pleasure palaces" served as a source of ideological inspiration: young radical Italian architects and designers perceived them as a stylistic and functional experimental laboratory capable of generating models for a new social order tied to entertainment. The productive intensity of these years yielded a multitude of proposals in which architecture acted as the catalyst of a social drive that mixed, in the same space, the most radical cultural and experimental avant-garde with the mass phenomenon of the society of the spectacle, allowing the pleasure industry to occupy, without complexes, a key position in the discourse of a new generation that deliberately shifted its interest from built form to the production of artificial, electronically amplified environments, while demonstrating its commitment to the forms and logic of the new technologies.