999 results for Premis literaris -- Catalunya -- Girona
Abstract:
Review of the books: Cons i neocons. El rerefons filosòfic, edited by Joan Vergés Gifra; Les pintures rupestres prehistòriques del Zemmur (Sahara Occidental), by Joaquim Soler i Subils; Tratado de la cabrevación, by Jaume Tos i Urgellés; La patrimonialització de la materialitat etrusca de la Toscana
Abstract:
The common objectives defined by the European Higher Education Area have inevitably meant that Catalan universities have had to open up to the outside. The universities of our country have found themselves fully immersed in an unstoppable process of internationalisation
Abstract:
This booklet is the first instalment of the Guidelines for adaptation to the European Higher Education Area. It originated in the discussions of the Monitoring Committee for the UdG Pilot Plan for Adaptation to the European Higher Education Area and of the working group set up in the winter of 2005-2006 specifically to address the topic of competences
Abstract:
This booklet forms part of the Guide for adaptation to the European Higher Education Area
Abstract:
This booklet forms part of the Guide for adaptation to the European Higher Education Area
Abstract:
This booklet is the fourth instalment of the Guide for adaptation to the European Higher Education Area. It originated in the discussions of the Monitoring Committee for the UdG Pilot Plan for Adaptation to the European Higher Education Area and of the working group set up in the summer of 2006 specifically to address the topic of learning activities
Abstract:
This booklet is the fifth instalment of the Guide for adaptation to the European Higher Education Area. It originated in the discussions of the Monitoring Committee for the UdG Pilot Plan for Adaptation to the European Higher Education Area and of the working group set up specifically to address the topic of the assessment of learning
Abstract:
This booklet is the sixth instalment of the Guide for adaptation to the European Higher Education Area. It originated in the discussions of the Monitoring Committee for the UdG Pilot Plan for Adaptation to the European Higher Education Area and of the working group set up specifically to address the topic of the assessment of learning
Abstract:
We take stock of the present position of compositional data analysis, of what has been achieved in the last 20 years, and then make suggestions as to what may be sensible avenues of future research. We take an uncompromisingly applied mathematical view: the challenge of solving practical problems should motivate our theoretical research, and any new theory should be thoroughly investigated to see whether it may provide answers to previously abandoned practical considerations. Indeed, a main theme of this lecture will be to demonstrate this applied mathematical approach through a number of challenging examples
Abstract:
This paper is a first draft of the principle of statistical modelling on coordinates. Several causes, which would take too long to detail, have led to this situation close to the deadline for submitting papers to CODAWORK'03. The main one is the fast development of the approach over the last months, which has made previous drafts appear obsolete. The present paper contains the essential parts of the state of the art of this approach from my point of view. I would like to acknowledge many clarifying discussions with the group of people working in this field in Girona, Barcelona, Carrick Castle, Firenze, Berlin, Göttingen, and Freiberg. They have given a lot of suggestions and ideas. Nevertheless, there may still be errors or unclear aspects, which are exclusively my fault. I hope this contribution serves as a basis for further discussions and new developments
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently, when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition is refined, so that the probability density comes to represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities
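The Euclidean structure referred to above can be sketched numerically. The following is an illustrative sketch, not from the source; the function names are my own. It implements the centred log-ratio (clr) transform and the Aitchison distance it induces on the simplex.

```python
import numpy as np

def closure(x):
    """Close a vector of positive parts so it sums to 1 (a point in the simplex)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centred log-ratio transform: log of the parts minus their mean log."""
    lx = np.log(closure(x))
    return lx - lx.mean()

def aitchison_distance(x, y):
    """Aitchison distance = ordinary Euclidean distance between clr coordinates."""
    return float(np.linalg.norm(clr(x) - clr(y)))

x = [1.0, 2.0, 3.0]
y = [2.0, 2.0, 2.0]
# clr coordinates sum to zero, so compositions map to a hyperplane of R^D
# where ordinary inner products and distances apply
d = aitchison_distance(x, y)
```

Because the closure step cancels out of every log-ratio, the distance is unchanged if either argument is rescaled, which is the scale-invariance property the later abstracts build on.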
Abstract:
The simplex, the sample space of compositional data, can be structured as a real Euclidean space. This fact allows us to work with the coefficients with respect to an orthonormal basis. On these coefficients we apply standard real analysis; in particular, we define two different laws of probability through the density function and study their main properties
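A sketch of how such coefficients are obtained: the contrast matrix below is one standard Helmert-type orthonormal basis for the D-part simplex (the abstract does not specify a basis; this choice and the names are mine). The resulting real coordinates are the ones on which ordinary real analysis applies.

```python
import numpy as np

def ilr_basis(D):
    """A Helmert-type contrast matrix for the D-part simplex, expressed in clr
    coordinates: its D-1 rows are orthonormal and each sums to zero."""
    V = np.zeros((D - 1, D))
    for i in range(D - 1):
        V[i, : i + 1] = 1.0 / (i + 1)   # balance of the first i+1 parts ...
        V[i, i + 1] = -1.0              # ... against the next part
        V[i] *= np.sqrt((i + 1) / (i + 2))  # normalise the row
    return V

def coords(x):
    """Coefficients of a composition with respect to the orthonormal basis."""
    lx = np.log(np.asarray(x, dtype=float))
    return ilr_basis(len(x)) @ (lx - lx.mean())

V = ilr_basis(4)
```

The neutral element of the simplex, the uniform composition, maps to the zero vector of coefficients, as one would expect of an origin.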
Abstract:
Traditionally, compositional data have been identified with closed data, and the simplex has been considered the natural sample space for this kind of data. In our opinion, the emphasis on the constrained nature of compositional data has contributed to masking their real nature. More crucial than the constraining property of compositional data is their scale-invariance property. Indeed, when we consider only a few parts of a full composition we are not working with constrained data, yet our data are still compositional. We believe it is necessary to give a more precise definition of composition. This is the aim of this oral contribution
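The scale-invariance point can be made concrete with a toy example (the numbers are my own): log-ratios between parts are unaffected by closure or by any overall rescaling, and they survive when passing to a subcomposition, which is no longer constrained by the full-composition closure yet remains compositional.

```python
import numpy as np

raw = np.array([12.0, 30.0, 18.0, 40.0])   # positive data, not summing to 1
closed = raw / raw.sum()                   # the same composition, closed to 1

# The log-ratio of any two parts does not depend on the overall scale
lr_raw = np.log(raw[0] / raw[1])
lr_closed = np.log(closed[0] / closed[1])

# A subcomposition of parts 0 and 2 keeps the same relative information
sub = raw[[0, 2]] / raw[[0, 2]].sum()
lr_sub = np.log(sub[0] / sub[1])
```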
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data
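The data decomposition underlying the two-stage approach can be sketched as follows; the toy data and names are mine, and the actual fitting of the binomial conditional logistic normal models is not shown, only the incidence matrix and the conditional compositional matrix the abstract describes.

```python
import numpy as np

# Toy compositional data with essential zeros (rows are observations)
X = np.array([
    [0.20, 0.00, 0.50, 0.30],
    [0.00, 0.40, 0.60, 0.00],
    [0.10, 0.30, 0.40, 0.20],
])

# Stage 1 input: the incidence matrix records where the zeros occur
Z = (X > 0).astype(int)

# Stage 2 input: each row closed over its non-zero parts only,
# i.e. how the unit available is distributed among the non-zero parts
def conditional_composition(row):
    nz = row[row > 0]
    return nz / nz.sum()

conditional = [conditional_composition(row) for row in X]
```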
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centred log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning
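The claim that Bayesian updating is a perturbation can be checked numerically in the finite discrete case (a sketch with made-up numbers, not from the source): multiplying a prior by a likelihood and renormalising, which is Bayes' rule, is exactly the Aitchison perturbation of the prior composition by the likelihood vector.

```python
import numpy as np

def closure(x):
    return np.asarray(x, dtype=float) / np.sum(x)

def perturb(x, y):
    """Aitchison perturbation: componentwise product followed by closure."""
    return closure(np.asarray(x, dtype=float) * np.asarray(y, dtype=float))

prior = closure([0.5, 0.3, 0.2])            # discrete prior over 3 states
like = np.array([0.2, 0.5, 0.3])            # likelihood of the observed data

posterior_bayes = closure(prior * like)     # Bayes' rule
posterior_perturb = perturb(prior, like)    # perturbation in the simplex
```

The renormalising constant discarded by the closure is the marginal likelihood, which is the finite analogue of the "closing constant" the abstract links to the cumulant function.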