Abstract:
Serious risks can materialize as collective emergencies in which the impact on the population, material assets, infrastructures and basic services can be very intense and very rapid. When these vulnerable elements are located very close to the focus of the emergency, the operation of the emergency management system can be hampered; in particular, protection and self-protection measures for the population may not be applied in time, or may prove insufficient. The location of vulnerable elements relative to potential emergency sources is defined by territorial planning and urbanism. Therefore, an adequate location of vulnerable elements that facilitates emergency management, as regards the application of protection and self-protection measures, requires incorporating the prevention of serious risks into territorial and urban planning. At present this prevention process is at an embryonic stage and is scattered among various professional groups and sectoral bodies in the field of safety and risk. This research study reviews the current state of the question and assesses and justifies the social and economic importance of the problem of preventing collective emergencies. On this basis, and given the cross-cutting, unifying character of the civil protection service, it is proposed that this service define a system and methodology for incorporating risk prevention into the parameters that condition territorial planning and urbanism. It is proposed that this system incorporate the views of the various sectoral expert bodies that manage specific serious risks, and that it have a structure and implementation process similar to the fire prevention model for buildings, which is widely implemented and effective.
Abstract:
Patients with recurrent subocclusive episodes, or with digestive symptoms that prevent them from maintaining a normal weight in the absence of a structural cause that would explain the symptoms, may suffer from a motility disorder of the small bowel. Gastrointestinal manometry is the technique of choice for studying the motility of the gastrointestinal tract, and in these patients it can reveal aberrant patterns that explain the symptoms. However, gastrointestinal manometry is a specific but not very sensitive technique. The general aim of this work is to determine the intestinal motor response to a chyme overload. To this end, a group of healthy subjects will be studied, and the motor activity of the small bowel will be measured by manometry during the continuous infusion of a nutrient solution, either alone or thickened with a non-absorbable compound.
Abstract:
When we speak of user-centred design, we mean designing with the user's wants in mind. This is why users are consulted: to find out what they want, what they need, what they like and how they like it best, what they dislike, and what they find hardest to understand. In short, the product is evaluated so that it ends up useful and practical for as many people as possible. In this work we discuss designing with the user in mind in touch environments. Touch environments are just one of the many possibilities offered by human-computer interaction.
Abstract:
This final-year degree project (Projecte de Final de Carrera) consists of studying the format of OpenOffice files and extracting certain information contained in the documents, in order to meet a need raised by the recently created Agencia Catalana de Seguretat, by means of a software tool built for that purpose.
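The abstract does not detail the extraction mechanism, but OpenOffice documents (both the legacy OpenOffice.org 1.x formats and ODF) are ZIP archives containing XML streams such as meta.xml and content.xml. A minimal sketch of pulling document metadata from such a file, assuming an ODF-style layout; the input file name is hypothetical:

```python
import zipfile
import xml.etree.ElementTree as ET

# Namespaces used inside the meta.xml stream of an ODF document.
NS = {
    "office": "urn:oasis:names:tc:opendocument:xmlns:office:1.0",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def read_odf_metadata(path):
    """Return title, author and date stored in the meta.xml stream."""
    with zipfile.ZipFile(path) as archive:
        root = ET.fromstring(archive.read("meta.xml"))
    office_meta = root.find("office:meta", NS)

    def text(tag):
        node = office_meta.find(tag, NS)
        return node.text if node is not None else None

    return {
        "title": text("dc:title"),
        "creator": text("dc:creator"),
        "date": text("dc:date"),
    }

if __name__ == "__main__":
    print(read_odf_metadata("report.odt"))  # hypothetical input file
```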
Abstract:
The project consists of studying and evaluating the different alternatives available on the market, in order to analyse and develop a set of components that make up a framework for simplifying and speeding up the development of the presentation layer of a given framework's thin-client applications, developed on the J2EE platform and based on the Model-View-Controller design pattern.
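The pattern at the heart of the abstract is Model-View-Controller with a front controller dispatching requests. As a language-neutral illustration (the project itself targets J2EE, not Python), a schematic sketch; all class, function and route names are hypothetical:

```python
# Schematic Model-View-Controller dispatch; not the project's J2EE framework.

class GreetingModel:                      # model: holds application state
    def __init__(self, name):
        self.name = name

def greeting_view(model):                 # view: renders the model
    return f"<h1>Hello, {model.name}!</h1>"

def greeting_controller(request):         # controller: maps input to model + view
    model = GreetingModel(request.get("name", "world"))
    return greeting_view(model)

ROUTES = {"/greet": greeting_controller}  # the front controller's dispatch table

def front_controller(path, request):
    handler = ROUTES.get(path)
    return handler(request) if handler else "404 Not Found"

print(front_controller("/greet", {"name": "J2EE"}))
```

The point of such a framework is that application developers write only the model and view pieces; routing, dispatch and rendering plumbing live in reusable components.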
Abstract:
This report is based on a request made by the Acadèmia del Cinema Andorrà (ACA) to the final-year project (TFC) students of the ETIS and ETIG programmes for the creation of a data warehouse. With this new tool the aim is, on the one hand, to store in a single place all the information received from the various international film festivals (the Cannes International Film Festival, the Berlin International Film Festival (Berlinale), the César Awards, the MTV Movie Awards and the Academy Awards) and, on the other, to make it possible to analyse all the film information they hold.
Abstract:
Management system for disciplinary warnings and sanctions in schools.
Abstract:
In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and the evaluation of the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydro-hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used and the cosine-theta coefficient served as similarity measure. During the last decade considerable progress in compositional data analysis was made, and many case studies were published using new tools for exploratory analysis of these data. It therefore makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors, and visual interpretation of the factor scores would lead to a revision of the earlier results and to answers to open questions. In this paper we follow the lines of the paper by R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transforming the components, and visualizing the factor scores in a spatial context: the compositional factors will be plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process.
Key words: compositional data analysis, biplot, deep sea sediments
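For reference, a standard formulation of the ilr transformation mentioned above (notation is ours, not the paper's): the centred log-ratio maps a D-part composition into real space, and ilr expresses it in an orthonormal basis.

```latex
\operatorname{clr}(\mathbf{x}) = \Bigl(\ln\frac{x_1}{g(\mathbf{x})},\,\dots,\,\ln\frac{x_D}{g(\mathbf{x})}\Bigr),
\qquad g(\mathbf{x}) = \Bigl(\prod_{i=1}^{D} x_i\Bigr)^{1/D},
\qquad
\operatorname{ilr}(\mathbf{x}) = \mathbf{V}^{\mathsf{T}}\operatorname{clr}(\mathbf{x}) \in \mathbb{R}^{D-1}
```

Here V is a D×(D−1) matrix whose orthonormal columns span the clr hyperplane, so statistical methods can be applied to the ilr coordinates as ordinary real variables.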
Abstract:
The theory of compositional data analysis is often focused on the composition alone. In practical applications, however, we often treat a composition together with covariables on some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a colored set of ternary diagrams might be a good idea for a first look at the data, but it will quickly hide important aspects if the composition has many parts or takes extreme values. On the other hand, colored scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used.
Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on the one hand, show the compositional geometry and, on the other hand, remain comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, e.g., by defining special balance displays with carefully selected axes. Following this idea, we need to ask systematically how to display, explore, describe, and test the relation to complementary or explanatory data of categorical, real, ratio or again compositional scales.
This contribution shows that it is sufficient to use some basic concepts and very few advanced tools from multivariate statistics (principal covariances, multivariate linear models, trellis or parallel plots, etc.) to build appropriate procedures for all these combinations of scales. This has some fundamental implications for their software implementation, and for how they might be taught to analysts who are not already experts in multivariate analysis.
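The "balance displays with carefully selected axes" rest on the notion of a balance between two disjoint groups of parts; the standard definition, in our notation rather than the paper's:

```latex
% Balance between disjoint groups of parts R (with r parts) and S (with s parts):
b_{R|S} = \sqrt{\frac{rs}{r+s}}\;
          \ln \frac{\bigl(\prod_{i \in R} x_i\bigr)^{1/r}}
                   {\bigl(\prod_{j \in S} x_j\bigr)^{1/s}}
```

Choosing R and S to match interpretable groupings of parts is what makes the resulting axis readable to a non-expert, in contrast to an arbitrary ilr basis.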
Abstract:
The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure of the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a Hermite-polynomials-based basis.
To get the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. We therefore propose a weighted linear regression approach, where all k-order polynomials are used as predictor variables and the weights are proportional to the reference density. Finally, for the case of 2-order Hermite polynomials (normal reference) and 1-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance.
Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison among different rocks of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
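A minimal numerical sketch of the weighted-regression route to the coordinates, assuming a standard normal reference density and probabilists' Hermite polynomials; the grid, test density and truncation order are our illustrative choices, not the paper's:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from scipy.stats import norm

def hermite_coordinates(x, f, order=4):
    """Estimate coordinates of a density f (positive on the grid x) w.r.t.
    a standard normal reference f0: regress log(f/f0) on He_0..He_order by
    least squares with weights proportional to f0, then drop the He_0 term,
    which only carries the normalisation constant."""
    f0 = norm.pdf(x)                       # reference density
    y = np.log(f / f0)                     # log-ratio to be expanded
    # Design matrix: probabilists' Hermite polynomials He_0 ... He_order.
    X = np.column_stack([
        hermeval(x, [0] * k + [1]) for k in range(order + 1)
    ])
    w = np.sqrt(f0)                        # sqrt-weights for weighted LS
    coef, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
    return coef[1:]

# Illustrative check: a normal density shifted by 0.5 should load 0.5 on
# He_1 (the mean direction) and essentially zero elsewhere.
x = np.linspace(-5, 5, 401)
print(hermite_coordinates(x, norm.pdf(x, loc=0.5), order=2))
```

This avoids the discretized-scalar-product route entirely: the regression absorbs the quadrature error into the residual instead of into each coordinate separately.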
Abstract:
”compositions” is a new R-package for the analysis of compositional and positive data.It contains four classes corresponding to the four different types of compositional andpositive geometry (including the Aitchison geometry). It provides means for computation,plotting and high-level multivariate statistical analysis in all four geometries.These geometries are treated in an fully analogous way, based on the principle of workingin coordinates, and the object-oriented programming paradigm of R. In this way,called functions automatically select the most appropriate type of analysis as a functionof the geometry. The graphical capabilities include ternary diagrams and tetrahedrons,various compositional plots (boxplots, barplots, piecharts) and extensive graphical toolsfor principal components. Afterwards, ortion and proportion lines, straight lines andellipses in all geometries can be added to plots. The package is accompanied by ahands-on-introduction, documentation for every function, demos of the graphical capabilitiesand plenty of usage examples. It allows direct and parallel computation inall four vector spaces and provides the beginner with a copy-and-paste style of dataanalysis, while letting advanced users keep the functionality and customizability theydemand of R, as well as all necessary tools to add own analysis routines. A completeexample is included in the appendix
Abstract:
Kriging is an interpolation technique whose optimality criteria are based on normality assumptions either for observed or for transformed data. This is the case of normal, lognormal and multigaussian kriging. When kriging is applied to transformed scores, optimality of the obtained estimators becomes a cumbersome concept: back-transformed optimal interpolations in transformed scores are not optimal in the original sample space, and vice versa. This lack of compatible criteria of optimality induces a variety of problems in both point and block estimates. For instance, lognormal kriging, widely used to interpolate positive variables, has no straightforward way to build consistent and optimal confidence intervals for estimates. These problems are ultimately linked to the assumed space structure of the data support: for instance, positive values, when modelled with lognormal distributions, are assumed to be embedded in the whole real space, with the usual real space structure and Lebesgue measure.
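A concrete instance of the incompatibility described above is the lognormal kriging back-transform; in standard notation, which is ours rather than this abstract's:

```latex
% Simple kriging of Y = \ln Z yields \hat{Y} with kriging variance \sigma^2_{SK}.
% The naive back-transform \exp(\hat{Y}) is biased; the unbiased estimator is
\hat{Z} = \exp\!\bigl(\hat{Y} + \tfrac{1}{2}\,\sigma^2_{SK}\bigr)
% yet \hat{Z} is no longer a least-squares-optimal estimator in the original
% space, and confidence intervals for Y do not back-transform into optimal
% intervals for Z.
```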
Abstract:
The statistical analysis of compositional data is commonly used in geological studies. As is well known, compositions should be treated using logratios of parts, which are difficult to use correctly in standard statistical packages. In this paper we describe the new features of our freeware package, named CoDaPack, which implements most of the basic statistical methods suitable for compositional data. An example using real data is presented to illustrate the use of the package.
Abstract:
In order to successfully deploy multicast services in QoS-aware networks, pricing architectures must take into account the particular characteristics of multicast sessions. With this objective, we propose a charging scheme for QoS multicast services, assuming that the unicast cost of each interconnecting link has been determined and is expressed in terms of quality of service (QoS) parameters. Our scheme makes it possible to determine the cost distribution of a multicast session along a cost distribution tree (CDT), basing that distribution on the pre-existing unicast cost functions. The paper discusses in detail the main characteristics of the problem in a realistic interdomain scenario and how the proposed scheme would contribute to its solution.
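To make the idea of distributing link costs over a cost distribution tree concrete, here is one plausible rule, equal split of each link's cost among the receivers it serves; the topology, costs and names are hypothetical illustrations, not the paper's scheme:

```python
# Equal-split cost sharing on a cost distribution tree (CDT).
# Each link's unicast-derived cost is divided equally among the
# receivers reached through that link. Topology and costs are made up.

TREE = {            # node -> list of (child, link_cost)
    "root": [("a", 4.0), ("b", 2.0)],
    "a":    [("r1", 1.0), ("r2", 1.0)],
    "b":    [("r3", 3.0)],
}
RECEIVERS = {"r1", "r2", "r3"}

def receivers_below(node):
    """Set of receivers in the subtree rooted at node."""
    if node in RECEIVERS:
        return {node}
    out = set()
    for child, _ in TREE.get(node, []):
        out |= receivers_below(child)
    return out

def split_costs(node):
    """Per-receiver charges: each link's cost shared by those it serves."""
    charges = {r: 0.0 for r in receivers_below(node)}
    for child, cost in TREE.get(node, []):
        below = receivers_below(child)
        for r in below:
            charges[r] += cost / len(below)
        for r, c in split_costs(child).items():
            charges[r] += c
    return charges

print(split_costs("root"))   # {'r1': 3.0, 'r2': 3.0, 'r3': 5.0}
```

The per-receiver charges sum exactly to the total tree cost (11.0 here), so the operator recovers cost while receivers sharing links share the expense.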
Abstract:
This paper presents a new charging scheme for cost distribution along a point-to-multipoint connection when the destination nodes are responsible for the cost. The scheme focuses on QoS considerations, and a complete range of choices is presented, from a scheme that is safe for the network operator to one that is fair to the customer; the in-between cases are also covered. Specific and general problems, such as the effect of users disconnecting dynamically, are also discussed. The aim of the scheme is to encourage users to disperse their resource demand instead of opening a large number of direct connections to the source of the data, which would result in higher-than-necessary bandwidth use at the source; this benefits the overall performance of the network. Its implementation must balance the need to offer a competitive service against the risk that the network operator does not recover the cost of that service. Throughout this paper, multicast charging is discussed without reference to any specific category of service. The proposed scheme is also evaluated against the criteria set proposed in the European ATM charging project CANCAN.
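The safe-to-fair range of choices can be pictured as interpolating between two endpoint rules; a hypothetical sketch reusing TREE, RECEIVERS and split_costs from the previous example (the blend parameter and both endpoint rules are our illustration, not the paper's definitions):

```python
def path_cost(node, target, acc=0.0):
    """Total link cost from node down to a given receiver (None if absent)."""
    if node == target:
        return acc
    for child, cost in TREE.get(node, []):
        found = path_cost(child, target, acc + cost)
        if found is not None:
            return found
    return None

def blended_charge(receiver, lam):
    """lam = 1: the receiver pays its full upstream path (safe for the
    operator, who recovers a link even if other receivers leave);
    lam = 0: equal split (fair to the customer). Values in between
    trade the two concerns off."""
    safe = path_cost("root", receiver)
    fair = split_costs("root")[receiver]
    return lam * safe + (1 - lam) * fair

print([round(blended_charge(r, 0.5), 2) for r in sorted(RECEIVERS)])
# [4.0, 4.0, 5.0] for the example tree
```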