995 results for Statistical Concepts
Abstract:
There are two principal chemical concepts that are important for studying the natural environment. The first is thermodynamics, which describes whether a system is at equilibrium or can change spontaneously through chemical reactions. The second is how fast chemical reactions take place once they start (kinetics, or the rate of chemical change). In this work we examine a natural system in which both thermodynamic and kinetic factors are important in determining the abundance of NH4+, NO2- and NO3- in surface waters. Samples were collected in the Arno Basin (Tuscany, Italy), a system in which natural and anthropic effects both contribute to strongly modifying the chemical composition of the water. Thermodynamic modelling based on the reduction-oxidation reactions involving the sequence NH4+ -> NO2- -> NO3- under equilibrium conditions has allowed us to determine the redox potential (Eh) values that characterise the state of each sample and, consequently, of the fluid environment from which it was drawn. Just as pH expresses the concentration of H+ in solution, the redox potential expresses the tendency of an environment to receive or supply electrons. In this context, oxic environments, such as those of river systems, are said to have a high redox potential because O2 is available as an electron acceptor. The principles of thermodynamics and chemical kinetics yield a model that often does not completely describe the reality of natural systems. Chemical reactions may fail to reach equilibrium because the products escape from the site of the reaction, or because the reactions involved in the transformation are very slow, so that non-equilibrium conditions persist for long periods. Moreover, reaction rates can be sensitive to poorly understood catalytic or surface effects, while variables such as concentration (a large number of chemical species can coexist and interact concurrently), temperature and pressure can show large gradients in natural systems.
Taking this into account, data from 91 water samples have been modelled using statistical methodologies for compositional data. Log-contrast analysis has provided statistical parameters that can be correlated with the calculated Eh values. In this way, natural conditions under which chemical equilibrium, and hence fast underlying reactions, are hypothesised can be compared with those described by a stochastic approach.
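A log-contrast of the kind this abstract describes can be sketched as follows. This is a minimal illustration on synthetic data, not the Arno Basin measurements: the Dirichlet parameters, the Eh relationship and the noise level are all invented for the example.

```python
import numpy as np

# Synthetic stand-in for 91 samples of (NH4+, NO2-, NO3-) closed to 1,
# plus hypothetical Eh values in mV (NOT the Arno Basin data).
rng = np.random.default_rng(0)
parts = rng.dirichlet([2.0, 1.0, 5.0], size=91)          # NH4+, NO2-, NO3-
eh = 300 + 120 * np.log(parts[:, 2] / parts[:, 0]) + rng.normal(0, 20, 91)

# A log-contrast: coefficients sum to zero; here the NO3-/NH4+ contrast
coef = np.array([-1.0, 0.0, 1.0])
contrast = np.log(parts) @ coef

# Correlate the log-contrast with the (here synthetic) Eh values
r = np.corrcoef(contrast, eh)[0, 1]
```

Because the coefficients sum to zero, the contrast is invariant under closure of the composition, which is what makes it a legitimate statistic for compositional data.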
Abstract:
Most of the economic literature has presented its analysis under the assumption of a homogeneous capital stock. However, capital composition differs across countries. What patterns of capital composition are associated with the world's economies? We carry out an exploratory statistical analysis based on compositional data transformed by Aitchison log-ratio transformations, using tools for visualising and measuring statistical estimators of association among the components. The goal is to detect distinctive patterns in the composition. Initial findings include: 1. Sectorial components behaved in a correlated way, with building industries on one side and, less clearly, equipment industries on the other. 2. Full-sample estimation shows a negative correlation between the durable goods and other buildings components, and between the transportation and building industries components. 3. Countries with zeros in some components are mainly low-income countries at the bottom of the income category; they behaved in an extreme way, distorting the main results observed in the full sample. 4. After removing these extreme cases, the conclusions do not seem very sensitive to the presence of other isolated cases.
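The Aitchison centred log-ratio (clr) transformation mentioned above can be sketched in a few lines. The capital-composition shares below are hypothetical placeholders, not the study's data:

```python
import numpy as np

def clr(x):
    """Centred log-ratio: log of each part minus the row-wise mean log
    (i.e. log of the part divided by the geometric mean of the parts)."""
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

# Hypothetical shares: dwellings, other buildings, equipment, transportation
shares = np.array([
    [0.40, 0.25, 0.25, 0.10],
    [0.35, 0.30, 0.20, 0.15],
    [0.50, 0.20, 0.20, 0.10],
])
z = clr(shares)

# Associations among components can then be inspected with standard
# multivariate tools on the clr coordinates
corr = np.corrcoef(z, rowvar=False)
```

Each row of clr coordinates sums to zero by construction, so the usual caveat applies: the clr covariance matrix is singular, and correlations must be read with that constraint in mind.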
Abstract:
Several eco-toxicological studies have shown that insectivorous mammals, because of their feeding habits, accumulate higher amounts of pollutants than other mammal species. To assess the bio-accumulation levels of toxic metals and their influence on essential metals, we quantified the concentration of 19 elements (Ca, K, Fe, B, P, S, Na, Al, Zn, Ba, Rb, Sr, Cu, Mn, Hg, Cd, Mo, Cr and Pb) in the bones of 105 greater white-toothed shrews (Crocidura russula) from a polluted area (Ebro Delta) and a control area (Medas Islands). Since the chemical contents of a bio-indicator are essentially compositional data, the conventional statistical analyses currently used in eco-toxicology can give misleading results. Therefore, to improve the interpretation of the data obtained, we used statistical techniques for compositional data analysis to define groups of metals and to evaluate the relationships between them from an inter-population viewpoint. Hypothesis testing on suitable balance-coordinates allowed us to confirm intuition-based hypotheses and some previous results. The main statistical goal was to test the equality of the means of the balance-coordinates for the two defined populations. After checking normality, one-way ANOVA or Mann-Whitney tests were carried out on the inter-group balances.
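The testing workflow described (compute a balance-coordinate per group, check normality, then apply a parametric or non-parametric two-sample test) can be sketched as below. The data, the element grouping and the 0.05 threshold are illustrative assumptions, not the shrew data set:

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for two populations of 3-part bone compositions,
# e.g. (Ca, Zn, Pb) closed to 1 (NOT the Crocidura russula data)
rng = np.random.default_rng(1)
polluted = rng.dirichlet([4.0, 2.0, 1.0], size=50)
control  = rng.dirichlet([4.0, 2.0, 0.5], size=55)

def balance(x, num, den):
    """ilr balance-coordinate between a numerator and a denominator
    group of parts (columns of x)."""
    r, s = len(num), len(den)
    scale = np.sqrt(r * s / (r + s))
    g_num = np.log(x[:, num]).mean(axis=1)   # log geometric mean, numerator
    g_den = np.log(x[:, den]).mean(axis=1)   # log geometric mean, denominator
    return scale * (g_num - g_den)

b1 = balance(polluted, num=[0, 1], den=[2])  # essential vs toxic parts
b2 = balance(control,  num=[0, 1], den=[2])

# Check normality of each group's balance, then choose the test
if stats.shapiro(b1).pvalue > 0.05 and stats.shapiro(b2).pvalue > 0.05:
    pval = stats.ttest_ind(b1, b2).pvalue    # parametric route
else:
    pval = stats.mannwhitneyu(b1, b2).pvalue # non-parametric route
```

The same pattern extends to several balances by looping over the rows of a sequential binary partition.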
Abstract:
The identification of compositional changes in the fumarolic gases of active and quiescent volcanoes is one of the most important targets of monitoring programmes. In general, many systematic (often cyclic) and random processes control the chemistry of gas discharges, making it difficult to produce a convincing mathematical-statistical model. Changes in the chemical composition of volcanic gases sampled at Vulcano Island (Aeolian Arc, Sicily, Italy) from eight different fumaroles located in the northern sector of the summit crater (La Fossa) have been analysed by considering their dependence on time over the period 2000-2007. Using inverse modelling methodologies for compositional data, each intermediate chemical composition has been treated as potentially derived from the contributions of the two temporal extremes represented by the 2000 and 2007 samples. Data from fumaroles F5 and F27, located on the rim and in the inner part of the La Fossa crater respectively, have been used for this purpose. The statistical approach has allowed us to highlight the presence of random and non-random fluctuations, features useful for understanding how the volcanic system works, opening new perspectives in sampling strategies and in the evaluation of the natural risk related to a quiescent volcano.
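The core of such an end-member (inverse mixing) model can be sketched as follows. The gas compositions below are invented three-part examples, not the Vulcano measurements; the point is only to show how a mixing weight is recovered by least squares in clr coordinates:

```python
import numpy as np

def clr(x):
    """Centred log-ratio of a single composition."""
    logx = np.log(x)
    return logx - logx.mean()

# Hypothetical end-member compositions, e.g. (H2O, CO2, SO2) parts
end_2000 = np.array([0.80, 0.15, 0.05])
end_2007 = np.array([0.60, 0.30, 0.10])

# An intermediate sample built as a perturbation-power combination
# of the end-members, with weight a_true on the 2007 end-member
a_true = 0.4
sample = np.exp((1 - a_true) * np.log(end_2000) + a_true * np.log(end_2007))
sample /= sample.sum()

# Recover the weight by projecting onto the end-member difference
# in clr coordinates (one-parameter least squares)
d = clr(end_2007) - clr(end_2000)
a_hat = float(d @ (clr(sample) - clr(end_2000)) / (d @ d))
```

Because mixing along a compositional line is linear in clr space, the projection recovers the weight exactly here; with real, noisy time series the residual of this fit is what separates random from non-random fluctuations.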
Abstract:
In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia over the last five years. Quantitative provenance analysis (QPA; Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of the Pleistocene sands and of 250 major and minor modern rivers draining the southern flank of the Alps from west to east (Garzanti et al., 2004, 2006). Prior to the onset of the major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin by a trunk river flowing longitudinally, parallel to the South-Alpine belt (Vezzoli and Garzanti, 2008). This scenario changed rapidly during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al., 2003). PCA and similarity analysis of the core samples show that the longitudinal trunk river was shifted southward at this time by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, while glacial sediments carried by Alpine valley glaciers invaded the alluvial plain. Key words: detrital modes; modern sands; provenance; principal component analysis; similarity; Canberra distance; palaeodrainage
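PCA on compositional detrital modes is typically run on log-ratio coordinates rather than raw percentages. A minimal sketch, on synthetic four-part modes rather than the Po Plain data:

```python
import numpy as np

# Synthetic detrital modes, e.g. quartz / feldspar / volcanic / metamorphic
# lithic fractions for 30 samples (NOT the borehole data)
rng = np.random.default_rng(2)
comp = rng.dirichlet([5.0, 3.0, 2.0, 1.0], size=30)

# clr transform, then column-centre and take the SVD: that is PCA
logc = np.log(comp)
clr_coords = logc - logc.mean(axis=1, keepdims=True)
centred = clr_coords - clr_coords.mean(axis=0)

U, s, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U * s                       # sample scores on the components
explained = s**2 / np.sum(s**2)      # proportion of variance per component
```

Plotting the first two columns of `scores`, coloured by stratigraphic position, is the usual way such provenance shifts are visualised; similarity analysis (e.g. with the Canberra distance the keywords mention) would operate on the same coordinates.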
Abstract:
The theory of compositional data analysis is often focused on the composition alone. In practical applications, however, we often treat a composition together with covariables on some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a coloured set of ternary diagrams may be a good idea for a first look at the data, but it quickly hides important aspects if the composition has many parts or takes extreme values. On the other hand, coloured scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used. Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on the one hand, show the compositional geometry and, on the other, remain comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, for example, by defining special balance displays with carefully selected axes. Following this idea, we systematically ask how to display, explore, describe and test the relation of a composition to complementary or explanatory data on categorical, real, ratio or again compositional scales. This contribution shows that a few basic concepts and very few advanced tools from multivariate statistics (principal covariances, multivariate linear models, trellis or parallel plots, etc.) suffice to build appropriate procedures for all these combinations of scales. This has fundamental implications for their software implementation and for how they might be taught to analysts who are not already experts in multivariate analysis.
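The difference between a "black-box" ilr and an interpretable balance display comes down to choosing the contrast matrix deliberately. A sketch of building one from a sequential binary partition (the sign matrix below is an illustrative choice, not one from this contribution):

```python
import numpy as np

# Sequential binary partition of a 4-part composition, one row per balance:
# +1 parts go to the numerator, -1 to the denominator, 0 parts are skipped
signs = np.array([
    [ 1,  1, -1, -1],   # balance 1: parts {1,2} vs {3,4}
    [ 1, -1,  0,  0],   # balance 2: part 1 vs part 2
    [ 0,  0,  1, -1],   # balance 3: part 3 vs part 4
])

def contrast_matrix(signs):
    """Turn an SBP sign matrix into an orthonormal ilr contrast matrix."""
    V = np.zeros(signs.shape, dtype=float)
    for i, row in enumerate(signs):
        r, s = np.sum(row == 1), np.sum(row == -1)
        scale = np.sqrt(r * s / (r + s))
        V[i, row == 1] = scale / r
        V[i, row == -1] = -scale / s
    return V

V = contrast_matrix(signs)
x = np.array([0.4, 0.3, 0.2, 0.1])     # one composition
balances = V @ np.log(x)               # ilr balance-coordinates of x
```

Each coordinate now answers a named question ("parts 1-2 versus parts 3-4", and so on), which is what makes scatterplots of these coordinates against covariables readable by a non-expert.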
Abstract:
The two preceding editions of CoDaWork included talks on the possible treatment of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure on the set of densities within a bounded interval, and van den Boogaart (2005) generalised this to the set of densities bounded by an arbitrary reference density. From the many variations of the available Hilbert structures, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardise them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as the reference, and coordinates are obtained with respect to a basis derived from Hermite polynomials. Several approaches can be considered for obtaining the coordinates. A numerical accuracy problem occurs if one estimates the coordinates directly using discretised scalar products. We therefore propose a weighted linear regression approach, in which all polynomials up to order k are used as predictor variables and the weights are proportional to the reference density. Finally, for second-order Hermite polynomials (normal reference) and first-order Laguerre polynomials (exponential reference), the coordinates can also be derived from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain-size distributions, and the comparison, among different rocks, of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
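The weighted-regression idea for the normal-reference case can be sketched as follows. This is a simplified illustration under stated assumptions: the target density is N(0.5, 1), the reference is N(0, 1), the log-density ratio is fitted on the probabilists' Hermite polynomials {1, He1, He2}, and the weights are taken proportional to the reference density. The exact relation to the classical mean then predicts a first-order coefficient equal to the mean shift 0.5:

```python
import numpy as np

mu = 0.5                       # mean of the target density N(mu, 1)
x = np.linspace(-4, 4, 201)    # evaluation grid

# Log-ratio of the target density N(mu,1) to the reference N(0,1):
# log f/phi = mu*x - mu^2/2 (a polynomial, so the fit is exact)
logratio = mu * x - mu**2 / 2

# Probabilists' Hermite basis up to order 2: 1, He1 = x, He2 = x^2 - 1
B = np.column_stack([np.ones_like(x), x, x**2 - 1])

# Weighted least squares, weights proportional to the reference density
w = np.exp(-x**2 / 2)
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(B * sw[:, None], logratio * sw, rcond=None)
```

Here `coef[1]` recovers the mean shift and `coef[2]` is zero because the variances of target and reference coincide, which is exactly the mean-and-variance relationship the abstract mentions for second-order Hermite coordinates.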
Abstract:
Abstract taken from the publication.
Abstract:
Abstract taken from the publication.
Abstract:
Exercises and solutions for a second year statistics course.
Abstract:
Lecture notes for a first year statistical modelling course.
Abstract:
Pelvic floor prolapse is a frequent condition, especially in postmenopausal patients, and in the great majority of cases it requires surgical treatment. In this study we compared the occurrence of early postoperative complications between anterior colporrhaphy with the classical technique (TC) and anterior colporrhaphy with the site-specific technique (CSE). We conducted a retrospective, analytical observational study of two cohorts of patients who required anterior colporrhaphy between August 2009 and June 2012. The characteristics of the two groups were homogeneous and comparable. The most frequent outcome was dehiscence of the suture line; however, no statistically significant differences were found between the two techniques. The occurrence of early re-prolapse and the diagnosis of abscesses or haematomas showed frequencies without significant differences. There were no serious early complications such as major intraoperative bleeding or vesicourethral injuries. The results suggest that both techniques have a low incidence of early postoperative complications and therefore appear to be safe options in the surgical management of prolapse of the anterior compartment of the pelvic floor.
Abstract:
INTRODUCTION: There is controversy regarding the surgical technique for managing tumours of the conjunctival limbus. Primary closure with a contact lens may offer better healing and additional advantages over the traditional technique using a graft. OBJECTIVES: To compare the two surgical techniques in terms of degree of pain, stinging, itching, percentage of epithelialisation and healing, patient comfort, degree of chemosis, and time to return to daily activities. MATERIALS AND METHODS: Randomised controlled clinical trial with two groups: the first group underwent resection of the lesion plus graft; the second group underwent resection of the lesion with primary closure and a contact lens. Follow-up took place on the first and fourth postoperative days and weekly during the first postoperative month. SPSS 20.0® was used for the statistical analysis, applying non-parametric statistics. RESULTS: There were 10 patients per group. Pain and percentage of healing on the first postoperative day were greater in the contact-lens group (p=0.048). On the fourth postoperative day a higher percentage of healing was found in the contact-lens group (p=0.075). CONCLUSIONS: Primary closure with a contact lens showed greater pain and stinging on the first and fourth postoperative days, but epithelialisation and healing occurred earlier, with a short return to daily activities.
Abstract:
The fast-food sector is growing rapidly worldwide owing to changes in consumers' daily lives, in which time is the priority, producing accelerated lifestyles (Sirgado & Lamas, 2011). This growth is what this research aims to demonstrate, through figures that confirm the expansion of the fast-food sector worldwide, in Latin America and at the national level; this information will be useful for entrepreneurs seeking to analyse the scenario facing the brands in the sector. The sector's expansion has generated an increase in fast-food outlets, facilitated by the adoption of the franchise model as a corporate growth strategy, owing to the advantages it offers and the possibility of diversifying the offer and investing in new markets (Portafolio, 2006). Accordingly, we present the fundamental concepts of this type of business, as well as its advantages and the types of franchise that exist, for those who wish to implement the model in their company or who are considering acquiring one. To illustrate the above, we examine a national organisation that has introduced the franchise model and belongs to the fast-food sector, a case that can serve as a reference for other brands in the sector.