62 results for ellipse fitting

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 20.00%

Summary:

When dealing with sustainability we are concerned with the biophysical as well as the monetary aspects of economic and ecological interactions. This multidimensional approach requires that special attention be given to dimensional issues in relation to curve fitting practice in economics. Unfortunately, many empirical and theoretical studies in economics, as well as in ecological economics, apply dimensional numbers in exponential or logarithmic functions. We show first that it is an analytical error to put a dimensional quantity x into exponential functions (a^x) or logarithmic functions (log_a x). Secondly, we investigate the conditions on data sets under which a particular logarithmic specification is superior to the usual regression specification. This analysis shows that the superiority of the logarithmic specification in terms of the least-squares norm is heavily dependent on the available data set. The last section deals with economists' "curve fitting fetishism". We propose that a distinction be made between curve fitting over past observations and the development of a theoretical or empirical law capable of maintaining its fitting power for any future observations. Finally, we conclude the paper with several epistemological issues in relation to dimensions and curve fitting practice in economics.
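
As a brief illustration of the dimensional argument (the standard reasoning, not a quotation from the paper), the series expansion of the exponential makes the problem visible:

\[
a^x = e^{x \ln a} = 1 + x\ln a + \frac{(x\ln a)^2}{2!} + \cdots
\]

If x carries a unit (say, years), the terms on the right have different dimensions (dimensionless, years, years squared, and so on), so the sum is meaningless. The same holds for ln x, since \( \ln(\lambda x) = \ln\lambda + \ln x \) would make the result depend on the arbitrary choice of unit; only a dimensionless ratio such as \( \ln(x/x_0) \), with a reference value \( x_0 \), is well defined.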

Relevance: 20.00%

Summary:

This paper examines the proper use of dimensions and curve fitting practices, elaborating on Georgescu-Roegen's economic methodology in relation to the three main concerns of his epistemological orientation. Section 2 introduces two critical issues concerning dimensions and curve fitting practices in economics in view of Georgescu-Roegen's economic methodology. Section 3 deals with the logarithmic function (ln z) and shows that z must be a dimensionless pure number, otherwise it is nonsensical. Several unfortunate examples of this analytical error are presented, including macroeconomic data analysis conducted by a representative figure in this field. Section 4 deals with the standard Cobb-Douglas function. It is shown that no operational meaning can be obtained for capital or labor within the Cobb-Douglas function. Section 4 also deals with economists' "curve fitting fetishism". Section 5 concludes the paper with several epistemological issues in relation to dimensions and curve fitting practices in economics.
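
For reference (the standard textbook form, not quoted from the paper), the Cobb-Douglas function at issue is

\[
Q = A\, K^{\alpha} L^{\beta},
\]

where Q is output, K capital, L labor and A a constant. Since K and L are dimensioned quantities raised to non-integer, empirically fitted exponents, the dimensions of A must be adjusted case by case to restore homogeneity; this is the kind of dimensional difficulty the abstract alludes to.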

Relevance: 10.00%

Summary:

In this work we present a proposal for a course in translation from German into Spanish following the task-based approach known from second language acquisition. The aim is to improve the translation competence of translation students. We start from the hypothesis that some students select inappropriate translation strategies when faced with certain translation problems, leading them to translation errors. In order to avoid these translation errors originating in the wrong application of such strategies, we propose a didactic method which helps to prevent them by a) raising awareness of the different subcompetences required while translating, b) improving the ability to identify translation problems and relate them to the different subcompetences, and c) encouraging the use of the most adequate strategy according to the characteristics of each problem. With regard to translation and how translation competence is acquired, our work follows the communicative approach to translation theory as defended, among others, by Hatim & Mason (1990), Lörscher (1992) and Kiraly (1995), where translation is seen as a communicative activity which can be analyzed from a psycholinguistic perspective. In this sense we give operative definitions of what we understand by "translation problem", "translation strategy", "translation error", "translation competence" and "translation". Our approach to didactics adapts recent developments in second language teaching within the communicative paradigm, such as the task-based approach of Nunan (1989), to translation teaching. Meeting the requirements of this pedagogic approach, we present a plan for a translation course which is compatible with present translation studies.

Relevance: 10.00%

Summary:

One of the main problems in the interaction of autonomous robots is knowledge of the scene. Recognition is fundamental to solving this problem and to allowing robots to interact in an uncontrolled environment. In this document we present a practical application of object capture, normalization and classification of triangular and circular signs. The system is deployed on Sony's Aibo robot to improve its interaction. The presented methodology has been tested in simulations and on real categorization problems, such as traffic-sign classification, with very promising results.
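
The abstract gives no implementation details; purely as a hypothetical sketch (not the authors' pipeline), circular and triangular sign candidates can be separated by approximating each contour with a low-vertex polygon and checking how well the contour fills a fitted ellipse, for example with OpenCV:

```python
# Hypothetical illustration of circular vs. triangular sign classification;
# not the method described in the abstract.
import cv2
import numpy as np

def classify_shapes(binary_image):
    """Label each contour in a binary mask as 'triangle', 'circle' or 'other'."""
    contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    labels = []
    for cnt in contours:
        if len(cnt) < 5:          # cv2.fitEllipse needs at least 5 points
            continue
        peri = cv2.arcLength(cnt, True)
        poly = cv2.approxPolyDP(cnt, 0.04 * peri, True)
        (cx, cy), (w, h), angle = cv2.fitEllipse(cnt)
        ellipse_area = np.pi * (w / 2) * (h / 2)
        fill = cv2.contourArea(cnt) / ellipse_area if ellipse_area > 0 else 0.0
        if len(poly) == 3:
            labels.append(("triangle", poly))
        elif fill > 0.9:          # contour nearly fills its fitted ellipse
            labels.append(("circle", (cx, cy, w, h, angle)))
        else:
            labels.append(("other", poly))
    return labels
```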

Relevance: 10.00%

Summary:

The main objective of this work is to provide a methodology to reduce the computation time of the kriging interpolation method without losing quality in the resulting model. The solution adopted has been the parallelization of the algorithm by means of MPI in the C language. Beforehand it was necessary to automate the fitting of the variogram that best adapts to the spatial distribution of the variable under study. The experimental results demonstrate the validity of the implemented solution, significantly reducing the final execution time of the whole process.
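
A minimal sketch of the same idea, written in Python with mpi4py rather than the C + MPI implementation the abstract describes: an automated spherical-variogram fit followed by ordinary kriging of a prediction grid split across MPI ranks. The helper names and the toy data are illustrative assumptions, not the authors' code.

```python
# Illustrative parallel kriging: each MPI rank kriges a slice of the grid.
import numpy as np
from mpi4py import MPI
from scipy.optimize import curve_fit
from scipy.spatial.distance import cdist

def spherical(h, nugget, sill, rng_a):
    """Spherical variogram model."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3)
    return np.where(h < rng_a, g, sill)

def fit_variogram(lags, gamma):
    """Automated fit of the spherical model to an empirical variogram."""
    p0 = [0.0, gamma.max(), lags.max() / 2]
    params, _ = curve_fit(spherical, lags, gamma, p0=p0, maxfev=10000)
    return params

def ordinary_kriging(obs_xy, obs_z, pred_xy, params):
    """Ordinary kriging of pred_xy from the observations, one point at a time."""
    n = len(obs_z)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = spherical(cdist(obs_xy, obs_xy), *params)
    K[n, n] = 0.0
    out = np.empty(len(pred_xy))
    for i, p in enumerate(pred_xy):
        k = np.ones(n + 1)
        k[:n] = spherical(cdist(obs_xy, p[None, :]).ravel(), *params)
        w = np.linalg.solve(K, k)
        out[i] = w[:n] @ obs_z
    return out

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    gen = np.random.default_rng(0)
    obs_xy = gen.uniform(0, 100, (200, 2))
    obs_z = np.sin(obs_xy[:, 0] / 20) + gen.normal(0, 0.1, 200)
    lags = np.linspace(1, 50, 20)                # toy empirical variogram
    gamma = spherical(lags, 0.05, 1.0, 30)       # stand-in for estimated values
    params = fit_variogram(lags, gamma)
    grid = np.array(np.meshgrid(np.linspace(0, 100, 60),
                                np.linspace(0, 100, 60))).reshape(2, -1).T
    chunks = np.array_split(grid, size)
else:
    obs_xy = obs_z = params = chunks = None

obs_xy = comm.bcast(obs_xy, root=0)
obs_z = comm.bcast(obs_z, root=0)
params = comm.bcast(params, root=0)
my_points = comm.scatter(chunks, root=0)

my_pred = ordinary_kriging(obs_xy, obs_z, my_points, params)
all_pred = comm.gather(my_pred, root=0)
if rank == 0:
    print(np.concatenate(all_pred).shape)
```

Run with, e.g., `mpirun -n 4 python kriging_mpi.py`; the grid slices are independent, which is what makes the speedup essentially linear in the number of processes.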

Relevance: 10.00%

Summary:

Half-lives of radionuclides span more than 50 orders of magnitude. We characterize the probability distribution of this broad-range data set and at the same time explore a method for fitting power laws and testing goodness of fit. It is found that the procedure proposed recently by Clauset et al. [SIAM Rev. 51, 661 (2009)] does not perform well, as it rejects the power-law hypothesis even for power-law synthetic data. In contrast, we establish the existence of a power-law exponent with a value around 1.1 for the half-life density, which can be explained by the sharp relationship between decay rate and released energy for the different disintegration types. For the case of alpha emission, this relationship constitutes an original mechanism of power-law generation.
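
For readers unfamiliar with the procedure being criticized, a compact version of the Clauset-style recipe (continuous maximum-likelihood exponent plus a Kolmogorov-Smirnov p-value from synthetic samples) is sketched below. It is a generic illustration, not the paper's code, and it fixes xmin in advance, whereas the full procedure also selects xmin by minimizing the KS distance.

```python
# Generic sketch of power-law fitting with a KS goodness-of-fit test
# (in the spirit of Clauset et al. 2009); not the paper's own code.
import numpy as np

def fit_alpha(x, xmin):
    """Continuous MLE for the power-law exponent above xmin."""
    tail = x[x >= xmin]
    return 1.0 + len(tail) / np.sum(np.log(tail / xmin)), tail

def ks_distance(tail, xmin, alpha):
    """KS distance between the tail sample and the fitted power law."""
    s = np.sort(tail)
    emp = np.arange(1, len(s) + 1) / len(s)
    model = 1.0 - (s / xmin) ** (1.0 - alpha)
    return np.max(np.abs(emp - model))

def power_law_pvalue(x, xmin, n_synth=200, seed=None):
    """Monte Carlo p-value: fraction of synthetic fits with a larger KS distance."""
    rng = np.random.default_rng(seed)
    alpha, tail = fit_alpha(x, xmin)
    d_obs = ks_distance(tail, xmin, alpha)
    count = 0
    for _ in range(n_synth):
        u = rng.random(len(tail))
        synth = xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))  # inverse-CDF sampling
        a_s, t_s = fit_alpha(synth, xmin)
        count += ks_distance(t_s, xmin, a_s) >= d_obs
    return alpha, d_obs, count / n_synth

# toy usage with synthetic power-law data (true exponent 1.5)
data = (1.0 - np.random.default_rng(1).random(5000)) ** (-2.0)
print(power_law_pvalue(data, xmin=1.0))
```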

Relevance: 10.00%

Summary:

Research project carried out during a stay at the University of Bonn, Germany, between August and December 2008. Recently, following the creation of the Cancer Registry of Catalonia, a new "state of the question" of cancer in Catalonia has been prepared, giving a complete picture of cancer incidence, mortality and survival in Catalonia, based on the data obtained by the population-based cancer registries of Girona and Tarragona as regards incidence, and by the Mortality Registry of Catalonia as regards cancer mortality. The project had two main objectives. First, to develop an integrated set of functions for the automated calculation of incidence, mortality and survival, as well as the fitting of the statistical models used to assess trends and obtain cancer projections for future years. Second, to apply these functions to the available data and obtain results for Catalonia, including projections of cancer incidence and mortality in Catalonia up to the year 2020. Both objectives have been substantially achieved. Regarding the first, a source file in R has been developed containing the macros and functions used. Regarding the second, the analyses carried out have been used to produce a monograph on cancer in Catalonia, which has now been accepted for publication. The results show that cancer incidence has increased and is expected to continue doing so, although a slowdown of the increase is foreseen for men. As for mortality, a recent decrease is observed and is expected to continue in the future, being larger for men than for women.

Relevance: 10.00%

Summary:

Caustics are curves with the property that a billiard trajectory, once tangent to one of them, stays tangent after every reflection at the boundary of the billiard table. When the billiard table is an ellipse, any nonsingular billiard trajectory has a caustic, which can be either a confocal ellipse or a confocal hyperbola. Resonant caustics, the ones whose tangent trajectories are closed polygons, are destroyed under generic perturbations of the billiard table. We prove that none of the resonant elliptical caustics persists under a large class of explicit perturbations of the original ellipse. This result follows from a standard Melnikov argument and the analysis of the complex singularities of certain elliptic functions.
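
For orientation, the standard elliptic-billiard facts behind the abstract (not quoted from the paper): if the table is the ellipse

\[
\frac{x^2}{a^2}+\frac{y^2}{b^2}=1, \qquad a>b>0,
\]

then every nonsingular trajectory stays tangent to a confocal conic

\[
\frac{x^2}{a^2-\lambda}+\frac{y^2}{b^2-\lambda}=1,
\]

which is a confocal ellipse for \(0<\lambda<b^2\) and a confocal hyperbola for \(b^2<\lambda<a^2\). A caustic is resonant when its tangent trajectories close up after finitely many bounces; these are the caustics whose non-persistence under perturbation the paper establishes.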

Relevance: 10.00%

Summary:

Tropical cyclones are affected by a large number of climatic factors, which translates into complex patterns of occurrence. The variability of annual metrics of tropical-cyclone activity has been intensively studied, in particular since the sudden activation of the North Atlantic in the mid-1990s. We first provide a swift overview of previous work by diverse authors on these annual metrics for the North-Atlantic basin, where the natural variability of the phenomenon, the existence of trends, the drawbacks of the records, and the influence of global warming have been the subject of interesting debates. Next, we present an alternative approach that does not focus on seasonal features but on the characteristics of single events [Corral et al., Nature Phys. 6, 693 (2010)]. It is argued that the individual-storm power dissipation index (PDI) constitutes a natural way to describe each event, and further, that the statistics of the PDI yield a robust law for the occurrence of tropical cyclones in terms of a power law. In this context, methods of fitting these distributions are discussed. As an important extension to this work we introduce a distribution function that models the whole range of the PDI density (excluding incompleteness effects at the smallest values): the gamma distribution, consisting of a power law with an exponential decay at the tail. The characteristic scale of this decay, represented by the cutoff parameter, provides very valuable information on the finite size of the basin, via the largest values of the PDI that the basin can sustain. We use the gamma fit to evaluate the influence of sea surface temperature (SST) on the occurrence of extreme PDI values, for which we find an increase of around 50% in the values of these basin-wide events for an average SST difference of 0.49 °C. Similar findings are observed for the effects of the positive phase of the Atlantic Multidecadal Oscillation and the number of hurricanes in a season on the PDI distribution. In the case of the El Niño-Southern Oscillation (ENSO), positive and negative values of the multivariate ENSO index do not have a significant effect on the PDI distribution; however, when only extreme values of the index are used, it is found that the presence of El Niño decreases the PDI of the most extreme hurricanes.
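
A minimal illustration of the kind of fit described, a power law truncated by an exponential cutoff (a gamma-type density), using scipy rather than the authors' own fitting procedure; the variable names and the synthetic data are placeholders:

```python
# Illustrative fit of a gamma-type density (power law with exponential cutoff)
# to positive "PDI-like" values; not the authors' fitting code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
pdi = stats.gamma.rvs(a=0.6, scale=3e10, size=2000, random_state=rng)  # fake PDI values

# Fit with the location fixed at zero, so the density is
#   f(x) proportional to x**(a - 1) * exp(-x / scale),
# i.e. a power law with exponent (1 - a) truncated by an exponential cutoff `scale`.
a_hat, loc_hat, scale_hat = stats.gamma.fit(pdi, floc=0)

print(f"power-law exponent ~ {1 - a_hat:.2f}, cutoff scale ~ {scale_hat:.3g}")

# Comparing the fitted cutoff between, e.g., high-SST and low-SST years is the
# kind of comparison the abstract describes for basin-wide extreme events.
```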

Relevance: 10.00%

Summary:

Definition of a citizen services portal ("Portal de Tràmits") project and its adjustment to the reference model taken, the Portal de Tràmits of the Ajuntament de Barcelona.

Relevance: 10.00%

Summary:

This internship report describes the project carried out over the last three months in the Road Safety Education Department of the Montmeló local police. The project was implemented by advising the officers, with the aim of providing them with a set of basic tools so that in the future they can review and modify their educational interventions, or design new ones, in a much better adjusted and fully autonomous way.

Relevance: 10.00%

Summary:

The statistical analysis of literary style is the part of stylometry that compares measurable characteristics in a text that are rarely controlled by the author with those in other texts. When the goal is to settle authorship questions, these characteristics should relate to the author's style and not to the genre, epoch or editor, and they should be such that their variation between authors is larger than the variation within comparable texts from the same author. For an overview of the literature on stylometry and some of the techniques involved, see for example Mosteller and Wallace (1964, 82), Herdan (1964), Morton (1978), Holmes (1985), Oakes (1998) or Lebart, Salem and Berry (1998).

Tirant lo Blanc, a chivalry book, is the main work in Catalan literature and it was hailed as "the best book of its kind in the world" by Cervantes in Don Quixote. Considered by writers like Vargas Llosa or Dámaso Alonso to be the first modern novel in Europe, it has been translated several times into Spanish, Italian and French, with modern English translations by Rosenthal (1996) and La Fontaine (1993). The main body of the book was written between 1460 and 1465, but it was not printed until 1490.

There is an intense and long-lasting debate around its authorship, sprouting from its first edition, whose introduction states that the whole book is the work of Martorell (1413?-1468), while at the end it is stated that the last one fourth of the book is by Galba (?-1490), after the death of Martorell. Some of the authors that support the theory of single authorship are Riquer (1990), Chiner (1993) and Badia (1993), while some of those supporting the double authorship are Riquer (1947), Coromines (1956) and Ferrando (1995). For an overview of this debate, see Riquer (1990). Neither of the two candidate authors left any text comparable to the one under study, and therefore discriminant analysis cannot be used to help classify chapters by author. By using sample texts encompassing about ten percent of the book, and looking at word length and at the use of 44 conjunctions, prepositions and articles, Ginebra and Cabos (1998) detect heterogeneities that might indicate the existence of two authors. By analyzing the diversity of the vocabulary, Riba and Ginebra (2000) estimate that stylistic boundary to be near chapter 383.

Following the lead of the extensive literature, this paper looks into word length, the use of the most frequent words, and the use of vowels in each chapter of the book. Given that the features selected are categorical, this leads to three contingency tables of ordered rows and therefore to three sequences of multinomial observations. Section 2 explores these sequences graphically, observing a clear shift in their distribution. Section 3 describes the problem of estimating a sudden change-point in those sequences, and in the following sections we propose various ways to estimate change-points in multinomial sequences: the method in Section 4 involves fitting models for polytomous data, the one in Section 5 fits gamma models onto the sequence of chi-square distances between each row profile and the average profile, and the one in Section 6 fits models onto the sequence of values taken by the first component of the correspondence analysis, as well as onto sequences of other summary measures like the average word length. In Section 7 we fit models onto the marginal binomial sequences to identify the features that distinguish the chapters before and after that boundary. Most methods rely heavily on the use of generalized linear models.
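
A compact way to see the kind of change-point estimation the abstract describes (one simple candidate approach, not necessarily the models used in the paper) is to maximize the profile multinomial log-likelihood over all possible breakpoints of the chapter-by-category count matrix:

```python
# Illustrative change-point estimation for a sequence of multinomial rows
# (chapters x categories); a simple approach, not the paper's exact models.
import numpy as np

def multinomial_loglik(counts):
    """Log-likelihood of the rows pooled under a single multinomial profile."""
    totals = counts.sum(axis=0)
    p = totals / totals.sum()
    with np.errstate(divide="ignore", invalid="ignore"):
        ll = counts * np.log(np.where(p > 0, p, 1.0))
    return ll.sum()

def estimate_changepoint(counts):
    """Return the split index k maximizing loglik(rows < k) + loglik(rows >= k)."""
    best_k, best_ll = None, -np.inf
    for k in range(1, len(counts)):
        ll = multinomial_loglik(counts[:k]) + multinomial_loglik(counts[k:])
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k, best_ll

# toy example: 487 "chapters" with a profile shift after chapter 383
rng = np.random.default_rng(0)
before = rng.multinomial(500, [0.50, 0.30, 0.20], size=383)
after = rng.multinomial(500, [0.45, 0.30, 0.25], size=104)
counts = np.vstack([before, after])
print(estimate_changepoint(counts))   # estimated split, expected near 383
```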

Relevance: 10.00%

Summary:

Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below the detection limit (non-detects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large body of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, best results are obtained when imputations are made using the distribution best fitting the readings above the detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
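
To make the two tools named in the abstract concrete (the isometric logratio transformation and the bootstrap), here is a toy Python sketch for a three-part composition. The data are synthetic, and the simple substitution-style imputation of non-detects used here is exactly the kind of widely used practice the study compares against distribution-based imputation; it is shown only to make the bootstrap step concrete.

```python
# Toy illustration of the ilr transform plus a bootstrap of a distributional
# parameter after imputing non-detects; not the study's actual workflow.
import numpy as np

def ilr(parts):
    """Isometric logratio transform of the rows of an (n, D) compositional array."""
    parts = np.asarray(parts, dtype=float)
    n, D = parts.shape
    out = np.empty((n, D - 1))
    logs = np.log(parts)
    for i in range(1, D):
        gmean_log = logs[:, :i].mean(axis=1)       # log geometric mean of first i parts
        out[:, i - 1] = np.sqrt(i / (i + 1.0)) * (gmean_log - logs[:, i])
    return out

rng = np.random.default_rng(3)
comp = rng.dirichlet([8.0, 3.0, 1.0], size=500)     # synthetic 3-part compositions

detection_limit = 0.02
censored = comp.copy()
below = censored[:, 2] < detection_limit
censored[below, 2] = 0.65 * detection_limit         # simple substitution-style imputation
censored /= censored.sum(axis=1, keepdims=True)     # re-close to the simplex

# Bootstrap the mean of the first ilr coordinate from the imputed sample
coords = ilr(censored)
boot = np.array([coords[rng.integers(0, len(coords), len(coords)), 0].mean()
                 for _ in range(1000)])
print(boot.mean(), np.percentile(boot, [2.5, 97.5]))
```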

Relevance: 10.00%

Summary:

In this final degree project, the plumbing, solar energy and natural gas installations needed to supply water to the facilities of a hotel have been designed and budgeted. The hotel is located in the municipality of Platja d'Aro, in the province of Girona. Its category is four stars, which entails a high level of quality in the services offered, in this case the supply of cold water and domestic hot water (DHW) to its guests.

Relevance: 10.00%

Summary:

A graphics processing unit (GPU) is a hardware device normally used to manipulate computer memory for the display of images. GPU computing is the practice of using a GPU device for scientific or general purpose computations that are not necessarily related to the display of images. Many problems in econometrics have a structure that allows for the successful use of GPU computing. We explore two examples. The first is simple: repeated evaluation of a likelihood function at different parameter values. The second is a more complicated estimator that involves simulation and nonparametric fitting. We find speedups from 1.5 up to 55.4 times compared to computations done on a single CPU core. These speedups can be obtained with very little expense, energy consumption, and time dedicated to system maintenance, compared to equivalent-performance solutions using CPUs. Code for the examples is provided.
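
The first example the abstract mentions, repeated evaluation of a likelihood at many parameter values, is a naturally data-parallel pattern. A minimal CPU sketch in NumPy is shown below; the same array code can often be moved to a GPU by swapping in a drop-in array library such as CuPy. This is a generic illustration, not the code accompanying the paper.

```python
# Repeated evaluation of a normal log-likelihood over a parameter grid.
# Written with NumPy; an array library with the same interface (e.g. CuPy)
# could replace `np` to run the same computation on a GPU. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=1_000)      # observed data

mus = np.linspace(-1.0, 3.0, 81)                    # parameter grid (mean)
sigmas = np.linspace(0.5, 4.0, 61)                  # parameter grid (std. dev.)
M, S = np.meshgrid(mus, sigmas, indexing="ij")

# Broadcast the data against every (mu, sigma) pair and reduce over the data axis.
z = (y[None, None, :] - M[..., None]) / S[..., None]
loglik = (-0.5 * (z ** 2).sum(axis=-1)
          - y.size * np.log(S)
          - 0.5 * y.size * np.log(2 * np.pi))

i, j = np.unravel_index(np.argmax(loglik), loglik.shape)
print(f"grid MLE: mu ~ {mus[i]:.2f}, sigma ~ {sigmas[j]:.2f}")
```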