40 results for SMOOTHING SPLINE


Relevance:

20.00%

Publisher:

Abstract:

The presence of subcentres cannot be captured by an exponential function. Cubic spline functions seem more appropriate for depicting the polycentricity pattern of modern urban systems. Using data from the Barcelona Metropolitan Region, two possible procedures for delimiting population subcentres are discussed: one takes an estimated derivative equal to zero, the other a density gradient equal to zero. It is argued that, when using a cubic spline function, a delimitation strategy based on derivatives is more appropriate than one based on gradients, because the estimated density can be negative in sections with very low densities and few observations, leading to sudden changes in the estimated gradients. It is also argued that using a second derivative equal to zero as the delimitation criterion captures a more restricted subcentre area than using a first derivative equal to zero. This methodology can also be used to delimit intermediate rings.
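
As a hedged illustration of the two criteria discussed above, the sketch below fits a cubic smoothing spline to a hypothetical density-distance profile and locates zeros of the first and second derivatives numerically; the data, smoothing factor and evaluation grid are assumptions, not the paper's.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical density-distance profile: an exponential core plus a secondary peak.
rng = np.random.default_rng(0)
distance = np.linspace(0, 40, 81)                      # km from the CBD
density = (12_000 * np.exp(-0.15 * distance)
           + 3_000 * np.exp(-0.5 * ((distance - 22) / 3) ** 2)
           + rng.normal(0, 150, distance.size))

# Cubic smoothing spline of density against distance.
spline = UnivariateSpline(distance, density, k=3, s=distance.size * 150**2)

# Candidate subcentres: zeros of the first derivative (density peaks) versus the
# tighter criterion based on zeros of the second derivative (inflection points).
grid = np.linspace(0, 40, 4001)
d1 = spline.derivative(1)(grid)
d2 = spline.derivative(2)(grid)
d1_zeros = grid[:-1][np.sign(d1[:-1]) != np.sign(d1[1:])]
d2_zeros = grid[:-1][np.sign(d2[:-1]) != np.sign(d2[1:])]
print("first-derivative zeros (km):", d1_zeros.round(1))
print("second-derivative zeros (km):", d2_zeros.round(1))
```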

Relevance:

20.00%

Publisher:

Abstract:

Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as "functional data" and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra. The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below. First of all, we adapt the well-known spline smoothing techniques to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory; clustering algorithms are then applied to these smooth curves. The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only. A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
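
A minimal sketch of the pipeline described above, under assumptions of our own (3-part compositions, an ilr transform, plain spline smoothing and a simple L2 distance between smoothed curves rather than the shape/level metrics proposed in the essay):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 30)                       # common domain of the trajectories
n_traj = 12

def ilr(x):
    """Isometric log-ratio coordinates of a 3-part composition (2 coordinates)."""
    return np.column_stack([
        np.log(x[:, 0] / x[:, 1]) / np.sqrt(2),
        np.log(np.sqrt(x[:, 0] * x[:, 1]) / x[:, 2]) * np.sqrt(2.0 / 3.0),
    ])

def smooth(traj):
    """Spline-smooth each ilr coordinate of one trajectory, return the smooth part."""
    return np.column_stack([UnivariateSpline(t, traj[:, j], k=3, s=0.5)(t)
                            for j in range(traj.shape[1])])

# Hypothetical noisy compositional trajectories in two groups.
curves = []
for i in range(n_traj):
    base = np.column_stack([0.2 + 0.3 * t if i < 6 else 0.5 - 0.3 * t,
                            np.full_like(t, 0.3),
                            np.zeros_like(t)])
    base[:, 2] = 1.0 - base[:, 0] - base[:, 1]
    comp = np.clip(base + rng.normal(0, 0.02, base.shape), 1e-3, None)
    comp /= comp.sum(axis=1, keepdims=True)     # re-close to the simplex
    curves.append(smooth(ilr(comp)).ravel())    # smoothed curve as a feature vector

# Hierarchical clustering of the smoothed curves on an L2 dissimilarity.
labels = fcluster(linkage(pdist(np.vstack(curves)), method="ward"),
                  t=2, criterion="maxclust")
print(labels)
```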

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to examine the pros and cons of book and fair value accounting from the perspective of the theory of banking. We consider the implications of the two accounting methods in an overlapping generations environment. As observed by Allen and Gale (1997), in an overlapping generations model banks act as intergenerational connectors, as they allow for intertemporal smoothing. Our main result is that when dividends depend on profits, book value ex ante dominates fair value, as it provides better intertemporal smoothing. This contrasts with the standard view that fair value yields a better allocation because it reflects the real opportunity cost of assets. Banking regulation plays an important role by providing the right incentives for banks to smooth intertemporal consumption, whereas market discipline improves intratemporal efficiency.

Relevance:

10.00%

Publisher:

Abstract:

This comment corrects errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function; at the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may fall outside the interval [0,1]. We have solved the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
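
As a toy illustration of why local smoothing keeps probabilities in the unit interval (not the comment's estimator or data): a local-constant kernel fit of a binary outcome with a nonnegative Gaussian kernel is a weighted average of zeros and ones, so it cannot leave [0,1].

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=500)                              # hypothetical single-index values
p_true = 1.0 / (1.0 + np.exp(-2.0 * x))
y = (rng.uniform(size=500) < p_true).astype(float)    # binary participation outcome

def local_constant(x0, x, y, h=0.3):
    """Kernel-weighted average of y around each x0; weights are nonnegative."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(-3, 3, 61)
p_hat = local_constant(grid, x, y)
assert (p_hat >= 0).all() and (p_hat <= 1).all()      # always inside [0, 1]
print(p_hat.round(3))
```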

Relevance:

10.00%

Publisher:

Abstract:

The RMB (Barcelona Metropolitan Region) is a polycentric city in which a number of medium-sized cities stand out for their high concentration of economic activity and, in many cases, for their endogenous growth dynamics. The aim of this research was to find empirical evidence in the RMB on the determinants of the location of economic activity. This objective, in turn, required studying the urban structure of the region in order to assess the effect that the location determinants exert on it. Although the results obtained with the exponential function are good, the inclusion of polynomial functional forms to capture density clumps has proved effective. The Cubic-Spline also yields good results, but it has the drawback that its coefficients cannot be interpreted. Our proposal, the Linear Spline, allows us to detect the subcentres that make up the region on the basis of the existence of positive density gradients.

Relevance:

10.00%

Publisher:

Abstract:

The metropolitan spatial structure displays various patterns, sometimes monocentric and sometimes multicentric, which seem much more complicated than the exponential density function used in classic works such as Clark (1961), Muth (1969) or Mills (1973), among others, can effectively represent. A more flexible density function, such as the cubic spline function (Anderson (1982), Zheng (1991), etc.), seems to be needed to describe the density-accessibility relationship. Also, accessibility, the fundamental determinant of density variations, is only partly captured by including distance to the city centre as an explanatory variable. Steen (1986) proposed correcting that misspecification by including an additional gradient for distance to the nearest transportation axis. In identifying the determinants of urban spatial structure in the context of inter-urban systems, some of the variables proposed by Muth (1969), Mills (1973) and Alperovich (1983), such as city age or population, make no sense in the case of a single urban system. All three criticisms of the exponential density function and its determinants apply to the Barcelona Metropolitan Region, a polycentric conurbation structured on well-defined transportation axes.
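
A hedged sketch of the correction attributed to Steen (1986), on hypothetical data: regress log density on both distance to the city centre and distance to the nearest transportation axis, so each accessibility dimension gets its own gradient.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
dist_cbd = rng.uniform(0, 30, n)             # km to the city centre
dist_axis = rng.uniform(0, 8, n)             # km to the nearest transportation axis
log_density = 9.5 - 0.10 * dist_cbd - 0.25 * dist_axis + rng.normal(0, 0.2, n)

# Two-gradient density function: log D = a + b1 * dist_cbd + b2 * dist_axis.
X = np.column_stack([np.ones(n), dist_cbd, dist_axis])
beta, *_ = np.linalg.lstsq(X, log_density, rcond=None)
print("intercept, CBD gradient, axis gradient:", beta.round(3))
```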

Relevance:

10.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
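
A stylized toy of the idea (not the proposed estimator itself): for a trial parameter, simulate a long path from the model, kernel-smooth the simulation to obtain the conditional moment E[y_t | y_{t-1}], and pick the parameter whose smoothed moment best matches the one computed from the data. The AR(1) model, kernel, bandwidth and grid below are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulate_ar1(rho, n, rng):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + rng.normal()
    return y

def cond_mean(x_eval, x, y, h=0.3):
    """Nadaraya-Watson estimate of E[y | x] at the points x_eval."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

data = simulate_ar1(0.6, 1_000, np.random.default_rng(4))      # "observed" data
grid = np.linspace(-2, 2, 21)                                   # conditioning points
m_data = cond_mean(grid, data[:-1], data[1:])

def objective(rho):
    # Fixed seed: common random numbers keep the objective smooth in rho.
    sim = simulate_ar1(rho, 20_000, np.random.default_rng(5))
    m_sim = cond_mean(grid, sim[:-1], sim[1:])
    return np.sum((m_data - m_sim) ** 2)

res = minimize_scalar(objective, bounds=(0.0, 0.95), method="bounded")
print("estimated rho:", round(res.x, 3))
```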

Relevance:

10.00%

Publisher:

Abstract:

During the last two decades there has been an increase in the use of dynamic tariffs for billing household electricity consumption. This has called into question the suitability of traditional pricing schemes, such as two-part tariffs, since they contribute to creating marked peak and off-peak demands. The aim of this paper is to assess whether two-part tariffs are an efficient pricing scheme, using Spanish household electricity microdata. An ordered probit model with instrumental variables on the determinants of power-level choice and non-parametric spline regressions on the electricity price distribution allow us to distinguish between the tariff structure choice and the simultaneous demand decisions. We conclude that electricity consumption and dwellings' and individuals' characteristics are key determinants of the fixed charge paid by Spanish households. Finally, the results point to the inefficiency of the two-part tariff, as those consumers who consume more electricity pay a lower price than the others.
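
A minimal sketch of the spline-regression ingredient only, on hypothetical two-part-tariff data: smooth the average unit price as a function of consumption to see whether heavier consumers face lower unit prices. The tariff parameters and sample are invented for illustration, not taken from the paper's microdata.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(6)
kwh = np.sort(rng.uniform(50, 600, 400))                 # monthly consumption
fixed_charge, energy_price = 10.0, 0.15                  # hypothetical two-part tariff
avg_price = fixed_charge / kwh + energy_price + rng.normal(0, 0.005, kwh.size)

# Non-parametric (smoothing spline) regression of average unit price on consumption.
fit = UnivariateSpline(kwh, avg_price, k=3, s=kwh.size * 0.005**2)
for q in (100, 300, 500):
    print(f"average unit price at {q} kWh: {float(fit(q)):.3f} per kWh")
```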

Relevance:

10.00%

Publisher:

Abstract:

The objective of this study is the empirical identification of the monetary policy rules pursued in individual EU countries before and after the launch of the European Monetary Union. In particular, we have estimated an augmented version of the Taylor rule (TR) for 25 EU countries in two periods (1992-1998, 1999-2006). While single-equation estimation methods have been used to identify the policy rules of individual central banks, a dynamic panel setting has been employed for the rule of the European Central Bank. We have found that most central banks indeed followed some interest rate rule, but its form was usually different from the original TR (which proposes that the domestic interest rate responds only to the domestic inflation rate and the output gap). Crucial features of the policy rules in many countries have been the presence of interest rate smoothing as well as a response to the foreign interest rate. Any response to domestic macroeconomic variables has been missing in the rules of countries with inflexible exchange rate regimes, whose rules consisted in mimicking foreign interest rates. While we have found a response to long-term interest rates and the exchange rate in the rules of some countries, the importance of monetary growth and asset prices has been generally negligible. The Taylor principle (the response of interest rates to the domestic inflation rate must be greater than unity as a necessary condition for achieving price stability) has been confirmed only in large economies and in economies troubled by unsustainable inflation rates. Finally, the deviations of the actual interest rate from the rule-implied target rate can be interpreted as policy shocks (these deviations often coincided with actual turbulent periods).
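
For reference, one common augmented form of the rule with interest rate smoothing and a foreign-rate term, consistent with the features described above (the notation is ours and need not match the authors' exact specification):

```latex
i_t = \rho\, i_{t-1} + (1-\rho)\left(\alpha + \beta \pi_t + \gamma y_t + \delta i_t^{f}\right) + \varepsilon_t
```

Here \rho is the smoothing parameter, \pi_t domestic inflation, y_t the output gap and i_t^{f} the foreign interest rate; the Taylor principle requires \beta > 1.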

Relevance:

10.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.

Relevance:

10.00%

Publisher:

Abstract:

Remote laboratories are the solution to students' scheduling problems when carrying out practical sessions: they allow students to interact with the equipment installed in the laboratories without needing to be physically present. This project aims to create a remote laboratory for the course "Robótica y Automatización Industrial" (Robotics and Industrial Automation) taught at the ETSE, UAB, in which students can execute cubic-spline trajectories on a robot arm and watch the robot's movements through real-time video from any location with an Internet connection.
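
An illustrative sketch of the kind of cubic-spline joint trajectory a student could send to the robot arm: interpolate joint-angle waypoints with a clamped cubic spline (zero velocity at both ends) and sample position, velocity and acceleration at the controller rate. Waypoints and rates are hypothetical, not the lab's actual configuration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

t_way = np.array([0.0, 1.0, 2.5, 4.0])                 # waypoint times (s)
q_way = np.radians([0.0, 30.0, -15.0, 45.0])           # joint-angle waypoints (rad)

# Clamped boundary conditions: zero joint velocity at start and end of the motion.
traj = CubicSpline(t_way, q_way, bc_type="clamped")

t = np.arange(0.0, 4.0 + 1e-9, 0.01)                   # 100 Hz controller sampling
position = traj(t)
velocity = traj(t, 1)                                  # first derivative
acceleration = traj(t, 2)                              # second derivative
print(position[:5].round(4), float(velocity.max().round(3)))
```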

Relevance:

10.00%

Publisher:

Abstract:

We examine the evolution of monetary policy rules in a group of inflation targeting countries (Australia, Canada, New Zealand, Sweden and the United Kingdom), applying a moment-based estimator to a time-varying parameter model with endogenous regressors. Using this novel flexible framework, our main findings are threefold. First, monetary policy rules change gradually, pointing to the importance of applying a time-varying estimation framework. Second, the interest rate smoothing parameter is much lower than what previous time-invariant estimates of policy rules typically report. External factors matter for all countries, albeit the importance of the exchange rate diminishes after the adoption of inflation targeting. Third, the response of interest rates to inflation is particularly strong during periods when central bankers want to break a record of high inflation, such as in the U.K. or in Australia at the beginning of the 1980s. Contrary to common wisdom, the response becomes less aggressive after the adoption of inflation targeting, suggesting a positive effect of this regime on anchoring inflation expectations. This result is supported by our finding that inflation persistence, as well as the policy neutral rate, typically decreased after the adoption of inflation targeting.
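
For intuition only, a much cruder way to see time variation than the moment-based estimator with endogenous regressors used in the paper: rolling-window least squares of a smoothing-augmented rule on simulated data (endogeneity is ignored, and the data-generating process is invented).

```python
import numpy as np

rng = np.random.default_rng(7)
T = 160                                                # 40 years of quarters
infl = 2 + 0.1 * np.cumsum(rng.normal(0, 0.3, T))
gap = rng.normal(0, 1, T)
beta_true = np.linspace(1.8, 1.2, T)                   # inflation response drifts down
rate = np.zeros(T)
for t in range(1, T):
    rate[t] = (0.8 * rate[t - 1]
               + 0.2 * (1.0 + beta_true[t] * infl[t] + 0.5 * gap[t])
               + rng.normal(0, 0.1))

# Rolling-window OLS of i_t on i_{t-1}, inflation and the output gap.
window = 40
for start in range(0, T - window, 30):
    idx = np.arange(start + 1, start + window)
    X = np.column_stack([np.ones(idx.size), rate[idx - 1], infl[idx], gap[idx]])
    b, *_ = np.linalg.lstsq(X, rate[idx], rcond=None)
    print(f"quarters {start}-{start + window}: smoothing={b[1]:.2f}, "
          f"long-run inflation response={b[2] / (1 - b[1]):.2f}")
```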

Relevance:

10.00%

Publisher:

Abstract:

A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double transformation kernel estimation. We derive optimal bandwidth selection methods that have a direct expression for the smoothing parameter. The bandwidth can adapt to the given quantile level. The procedure is useful for large data sets and improves quantile estimation compared to other methods for heavy-tailed distributions. Implementation is straightforward and R programs are available.
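
A simplified, single-transformation sketch of transformed kernel estimation of the cdf for an extreme quantile; the paper's method uses a double transformation and a closed-form bandwidth rule, so this toy only conveys the idea, with an invented heavy-tailed sample and a rule-of-thumb bandwidth.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
x = stats.pareto(b=2.5).rvs(2_000, random_state=rng)          # heavy-tailed sample

# 1) Transform the data towards normality via a fitted parametric cdf.
b_hat, loc_hat, scale_hat = stats.pareto.fit(x, floc=0)
z = stats.norm.ppf(stats.pareto.cdf(x, b_hat, loc_hat, scale_hat))

# 2) Kernel estimate of the cdf of the transformed data (Gaussian kernel).
h = 1.06 * z.std() * z.size ** (-1 / 5)                       # rule-of-thumb bandwidth
def kernel_cdf(t):
    return stats.norm.cdf((t[:, None] - z[None, :]) / h).mean(axis=1)

# 3) Invert the composition numerically to recover the quantile on the original scale.
level = 0.995
grid = np.linspace(-5, 6, 1101)
t_star = grid[np.searchsorted(kernel_cdf(grid), level)]
q_hat = stats.pareto.ppf(stats.norm.cdf(t_star), b_hat, loc_hat, scale_hat)
print("estimated 99.5% quantile:", round(float(q_hat), 2))
```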

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
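
A 1D toy of the core numerical step mentioned above, not the registration algorithm itself: a non-stationary velocity field built from smooth B-spline kernels is integrated forward in time with Euler steps to recover displacements of material points. The field, time modulation and step size are assumptions made for illustration.

```python
import numpy as np
from scipy.interpolate import BSpline

# Hypothetical velocity field v(x, t): a cubic B-spline profile in space, modulated
# by a time-varying amplitude (non-stationary in time).
knots_x = np.array([0, 0, 0, 0, 0.5, 1, 1, 1, 1])      # clamped cubic knot vector
coeffs = np.array([0.0, 0.3, 0.3, 0.1, 0.0])           # spatial velocity profile
spatial = BSpline(knots_x, coeffs, k=3, extrapolate=False)

def velocity(x, t):
    amp = np.sin(np.pi * t)                            # time modulation
    return amp * np.nan_to_num(spatial(np.clip(x, 0.0, 1.0)))

# Forward Eulerian integration of particle positions over t in [0, 1].
x0 = np.linspace(0.0, 1.0, 11)                         # material points at t = 0
dt, n_steps = 0.02, 50
pos = x0.copy()
for step in range(n_steps):
    pos = pos + dt * velocity(pos, step * dt)

displacement = pos - x0
print(displacement.round(4))
```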