13 results for Melt Compositions
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The theory of compositional data analysis often focuses on the composition alone. However, in practical applications we often treat a composition together with covariables measured on some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a colored set of ternary diagrams might be a good idea for a first look at the data, but it will quickly hide important aspects if the composition has many parts or takes extreme values. On the other hand, colored scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used. Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on the one hand, show the compositional geometry and, on the other hand, are still comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, e.g., by defining special balance displays with carefully selected axes. Following this idea, we need to ask systematically how to display, explore, describe, and test the relation of a composition to complementary or explanatory data of categorical, real, ratio or again compositional scales. This contribution shows that it is sufficient to use some basic concepts and very few advanced tools from multivariate statistics (principal covariances, multivariate linear models, trellis or parallel plots, etc.) to build appropriate procedures for all these combinations of scales. This has some fundamental implications for their software implementation and for how they might be taught to analysts who are not already experts in multivariate analysis.
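As a reading aid, the balance coordinates mentioned above can be written down in a few lines. The Python sketch below is ours (the function name, part grouping and example composition are illustrative, not taken from the paper); it evaluates a single balance between two groups of parts, the kind of carefully chosen axis a balance display would plot against a covariate.

```python
import numpy as np

def balance(x, num_idx, den_idx):
    """One ilr balance between two groups of parts of a composition x.

    b = sqrt(r*s/(r+s)) * ln( g(x[num]) / g(x[den]) ),
    where g() is the geometric mean and r, s are the group sizes.
    """
    r, s = len(num_idx), len(den_idx)
    g_num = np.exp(np.mean(np.log(x[num_idx])))   # geometric mean of numerator parts
    g_den = np.exp(np.mean(np.log(x[den_idx])))   # geometric mean of denominator parts
    return np.sqrt(r * s / (r + s)) * np.log(g_num / g_den)

# Illustrative 4-part composition: balance of parts {0,1} against parts {2,3}
comp = np.array([0.2, 0.3, 0.4, 0.1])
print(balance(comp, [0, 1], [2, 3]))
```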
Abstract:
The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure of the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a basis derived from Hermite polynomials. To get the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. We therefore propose a weighted linear regression approach, where all polynomials up to order k are used as predictor variables and the weights are proportional to the reference density. Finally, for the case of second-order Hermite polynomials (normal reference) and first-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison, among different rocks, of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
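The weighted-regression route to the coordinates can be sketched roughly as follows. This is our own minimal rendition under assumed details (a kernel density estimate of the data, the log-ratio to the reference as the regression target, the probabilists' Hermite basis), not the authors' implementation.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from scipy.stats import norm, gaussian_kde

# Sketch of the weighted-regression idea for the unbounded (normal-reference) case.
rng = np.random.default_rng(0)
sample = rng.normal(loc=0.5, scale=1.3, size=2000)   # hypothetical data

grid = np.linspace(-4.0, 6.0, 400)
f_hat = gaussian_kde(sample)(grid)                   # empirical density estimate
f_ref = norm.pdf(grid)                               # standard-normal reference density

y = np.log(f_hat / f_ref)                            # log-ratio of data density to reference
X = hermevander(grid, 3)                             # probabilists' Hermite polynomials, orders 0..3
w = f_ref                                            # weights proportional to the reference density

# Weighted least squares: solve (X' W X) beta = X' W y
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print("Hermite-basis coordinates (orders 1-3):", beta[1:])  # order 0 absorbs normalisation
```

For the exponential and bounded references, the same pattern would apply with Laguerre or Legendre polynomials and the corresponding reference density as weight.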
Abstract:
We propose to analyze shapes as "compositions" of distances in Aitchison geometry as an alternative and complementary tool to classical shape analysis, especially when size is non-informative. Shapes are typically described by the location of user-chosen landmarks. However, the shape, considered as invariant under scaling, translation, mirroring and rotation, does not uniquely define the location of landmarks. A simple approach is to use the distances between landmarks instead of the landmark locations themselves. Distances are positive numbers defined up to joint scaling, a mathematical structure quite similar to compositions. The shape fixes only the ratios of distances. Perturbations correspond to relative changes of the size of subshapes and of aspect ratios. The power transform increases the expression of the shape by increasing distance ratios. In analogy to subcompositional consistency, results should not depend too much on the choice of distances, because different subsets of the pairwise distances of landmarks uniquely define the shape. Various compositional analysis tools can be applied to sets of distances, directly or after minor modifications concerning the singularity of the covariance matrix, and yield results with direct interpretations in terms of shape changes. The remaining problem is that not all sets of distances correspond to a valid shape. Nevertheless, interpolated or predicted shapes can be back-transformed by multidimensional scaling (when all pairwise distances are used) or free geodetic adjustment (when sufficiently many distances are used).
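The multidimensional-scaling back-transform mentioned at the end can be illustrated with a minimal classical-MDS sketch; the square-shaped example and function name are ours, not from the paper.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Recover point coordinates (up to rotation, reflection and translation)
    from a complete matrix of pairwise distances via classical MDS."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred squared distances
    eigval, eigvec = np.linalg.eigh(B)
    order = np.argsort(eigval)[::-1][:dim]       # keep the largest eigenvalues
    return eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0.0))

# Hypothetical example: 4 landmarks forming a unit square
pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
print(classical_mds(D))                          # reproduces the square up to a rigid motion
```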
Abstract:
Aitchison and Bacon-Shone (1999) considered convex linear combinations of compositions. In other words, they investigated compositions of compositions, where the mixing composition follows a logistic Normal distribution (or a perturbation process) and the compositions being mixed follow a logistic Normal distribution. In this paper, I investigate the extension to situations where the mixing composition varies with a number of dimensions. Examples would be where the mixing proportions vary with time or distance, or a combination of the two. Practical situations include a river where the mixing proportions vary along the river, or across a lake, possibly with a time trend. This is illustrated with a dataset similar to that used in the Aitchison and Bacon-Shone paper, which looked at how pollution in a loch depended on the pollution in the three rivers that feed the loch. Here, I explicitly model the variation in the linear combination across the loch, assuming that the mean of the logistic Normal distribution depends on the river flows and the relative distance from the source origins.
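The kind of spatially varying convex combination described here can be conveyed schematically. The sketch below is a toy construction of ours (hypothetical river compositions and a simple flow-over-distance rule for the mixing proportions), not the paper's logistic-Normal model.

```python
import numpy as np

# Hypothetical 3-part pollutant compositions of the three rivers feeding the loch.
rivers = np.array([
    [0.70, 0.20, 0.10],   # river 1
    [0.10, 0.60, 0.30],   # river 2
    [0.25, 0.25, 0.50],   # river 3
])

def mixed_composition(distances, flows):
    """Composition at a point in the loch: convex combination of the river
    compositions, with mixing proportions ~ flow / distance (one simple choice)."""
    w = np.asarray(flows) / np.asarray(distances)
    w = w / w.sum()                       # mixing composition at this location
    mix = w @ rivers                      # convex linear combination
    return mix / mix.sum()

print(mixed_composition(distances=[1.0, 2.0, 4.0], flows=[3.0, 1.0, 2.0]))
```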
Abstract:
This paper examines a dataset which is modeled well by the Poisson-Log Normal process, and by this process mixed with Log Normal data, which are both turned into compositions. This generates compositional data that have zeros without any need for conditional models or for assuming that there are missing or censored data needing adjustment. It also enables us to model dependence on covariates and within the composition.
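A minimal simulation conveys how such zeros arise naturally when Poisson-Log Normal counts are closed to a composition; the parameters below are hypothetical and purely illustrative.

```python
import numpy as np

# Draw Poisson counts whose log-means are multivariate normal, then close the
# counts to a composition.  Zero counts give genuine zeros in the composition
# without any censoring or conditional model.
rng = np.random.default_rng(1)
n, parts = 5, 4
mu = np.array([2.0, 1.0, -1.0, -2.0])                  # hypothetical log-mean vector
cov = 0.5 * np.eye(parts)                              # hypothetical covariance
log_lambda = rng.multivariate_normal(mu, cov, size=n)
counts = rng.poisson(np.exp(log_lambda))               # Poisson-Log Normal counts
compositions = counts / counts.sum(axis=1, keepdims=True)
print(counts)
print(compositions)                                    # rows sum to one; zeros appear naturally
```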
Abstract:
The R package "compositions" is a tool for advanced compositional analysis. Its basic functionality has seen some conceptual improvement, now containing facilities to work with and represent ilr bases built from balances, and an elaborate subsystem for dealing with several kinds of irregular data: (rounded or structural) zeroes, incomplete observations and outliers. The general approach to these irregularities is based on subcompositions: for an irregular datum, one can distinguish a "regular" subcomposition (where all parts are actually observed and the datum behaves typically) and a "problematic" subcomposition (with those unobserved, zero or rounded parts, or else where the datum shows an erratic or atypical behaviour). Systematic classification schemes are proposed for both outliers and missing values (including zeros), focusing on the nature of irregularities in the datum subcomposition(s). To compute statistics with values missing at random and structural zeros, a projection approach is implemented: a given datum contributes to the estimation of the desired parameters only on the subcomposition where it was observed. For data sets with values below the detection limit, two different approaches are provided: the well-known imputation technique, and also the projection approach. To compute statistics in the presence of outliers, robust statistics are adapted to the characteristics of compositional data, based on the minimum covariance determinant approach. The outlier classification is based on four different models of outlier occurrence and Monte-Carlo-based tests for their characterization. Furthermore, the package provides special plots helping to understand the nature of outliers in the dataset.
Keywords: coda-dendrogram, lost values, MAR, missing data, MCD estimator, robustness, rounded zeros
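To make the projection idea concrete, here is a minimal Python rendition of it, ours rather than the package's R code: a pairwise log-ratio statistic is estimated using, for each pair of parts, only the observations in which both parts are present.

```python
import numpy as np

def variation_matrix_projected(X):
    """Variation matrix T[i, j] = var(log(x_i / x_j)), where each pair (i, j)
    uses only the rows in which both parts are observed (missing = NaN)."""
    n, d = X.shape
    T = np.full((d, d), np.nan)
    for i in range(d):
        for j in range(d):
            ok = ~np.isnan(X[:, i]) & ~np.isnan(X[:, j])
            if ok.sum() > 1:
                T[i, j] = np.var(np.log(X[ok, i] / X[ok, j]), ddof=1)
    return T

# Hypothetical 3-part data with one missing value
X = np.array([[0.2, 0.3,    0.5],
              [0.1, np.nan, 0.6],
              [0.4, 0.4,    0.2],
              [0.3, 0.2,    0.5]])
print(variation_matrix_projected(X))
```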
Abstract:
The aim of this talk is to convince the reader that there are a lot of interesting statistical problems in present-day life science data analysis which seem ultimately connected with compositional statistics.
Key words: SAGE, cDNA microarrays, (1D-)NMR, virus quasispecies
Abstract:
The kinetics of crystallization of four amorphous (or partially amorphous) melt-spun Nd-Fe-B alloys induced by thermal treatment is studied by means of differential scanning calorimetry and scanning electron microscopy. In the range of temperatures explored experimentally, the crystallization process is thermally activated and generally proceeds in various stages. The Curie temperature and the crystallization behavior have been measured. The apparent activation energy of most of the crystallization stages has been determined for each melt-spun alloy. The explicit form of the kinetic equation that best describes the first stage of crystallization has been found. It follows in general the Johnson-Mehl-Avrami-Erofe'ev model, but clear deviations from that model occur for one alloy. Scanning electron microscopy demonstrates that heterogeneous nucleation occurs preferentially at the ribbon surface which was in contact with the wheel. From the crystallization kinetics results, the lower part of the experimental time-temperature-transformation curves for all studied alloys is deduced and extrapolated to the high-temperature limit of their range of validity, which is also deduced.
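For reference, the Johnson-Mehl-Avrami-Erofe'ev model named above has the standard form quoted below; the fitted Avrami exponents and activation energies for these alloys are reported in the paper itself and are not reproduced here.

```latex
% Standard JMA form: x(t) is the crystallized fraction, n the Avrami exponent,
% k(T) a thermally activated rate constant with apparent activation energy E_a.
x(t) = 1 - \exp\!\bigl[-\bigl(k(T)\,t\bigr)^{n}\bigr],
\qquad
k(T) = k_{0}\,\exp\!\left(-\frac{E_a}{R\,T}\right)
```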
Abstract:
NdFeB melt-spun amorphous or partially amorphous alloys of four compositions were prepared. Their crystallization kinetics induced by thermal treatment was studied by differential scanning calorimetry and by scanning and transmission electron microscopy. Scanning electron microscopy demonstrated that heterogeneous nucleation occurs preferentially at the ribbon surface which was in contact with the wheel. The explicit form of the kinetic equation that best describes the first stage of crystallization under high undercooling conditions was obtained for each alloy. From the crystallization results, the lower part of the experimental time-temperature-transformation curves was deduced for each alloy and extrapolated up to the high-temperature limit of their validity. Microstructural observations showed a typical size of the microcrystals obtained by heat treatment of ~100 nm. From the magnetic properties measured with a vibrating sample magnetometer, the partially crystallized alloys show the same magnetic behavior regardless of the annealing temperature, provided the same crystallization fraction, x, is achieved, at least for small values of x (typically ~10%).
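The step from the kinetic equation to time-temperature-transformation curves is a standard inversion, shown here for context under the isothermal JMA form quoted above (not an expression taken from the paper): the time needed to reach a fixed crystallized fraction x at temperature T traces one iso-transformation curve per chosen x.

```latex
% Time to reach a fixed crystallized fraction x at temperature T under the
% isothermal JMA form; each chosen x traces one iso-transformation (TTT) curve.
t_x(T) = \frac{\bigl[-\ln(1-x)\bigr]^{1/n}}{k(T)}
       = \frac{\bigl[-\ln(1-x)\bigr]^{1/n}}{k_{0}}\,
         \exp\!\left(\frac{E_a}{R\,T}\right)
```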
Abstract:
Magnetic, structural, and transport properties of as-quenched and annealed Co10Cu90 samples have been investigated using x-ray diffraction and a SQUID magnetometer. The largest value of magnetoresistance (MR) change was observed for the as-quenched sample annealed at 450°C for 30 min. The magnetic and transport properties correlate closely with the microstructure, mainly with the Co magnetic particle size and its distribution. On thermal annealing of the as-quenched samples below 600°C, the Co particle diameters increase from 4.0 to 6.0 nm, with an MR drop from 33.0% to 5.0% at 10 K. Comparison with theory indicates that the interfacial electron spin-dependent scattering mechanism correlates with GMR for Co particle diameters up to about 6.0 nm.
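For context, the MR percentages quoted above follow the usual magnetoresistance-ratio definition; one common convention is given below, and the paper may use the alternative normalisation by the in-field resistance instead.

```latex
% One common convention for the magnetoresistance ratio, with R(0) the
% zero-field resistance and R(H) the resistance in applied field H.
\mathrm{MR} = \frac{R(0) - R(H)}{R(0)} \times 100\%
```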
Abstract:
A series of poly(butylene terephthalate) copolyesters containing 5-tert-butyl isophthalate units up to 50 mol%, as well as the homopolyester entirely made of these units, were prepared by polycondensation from the melt. The microstructure of the copolymers was determined by NMR to be random over the whole range of compositions. The effect exerted by the 5-tert-butyl isophthalate units on thermal, tensile and gas transport properties was evaluated. Both Tm and crystallinity, as well as the mechanical moduli, were found to decrease steadily with copolymerization, whereas Tg increased and the polyesters became more brittle. Permeability and solubility also increased slightly with the content of substituted units, whereas the diffusion coefficient remained practically constant. For the homopolyester poly(5-tert-butyl isophthalate), all these properties were found to deviate significantly from the general trend displayed by the copolyesters, suggesting that a different mode of chain packing in the amorphous phase is likely adopted in this case.
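For context, the three transport quantities discussed are linked by the usual solution-diffusion relation (a standard relation, not a result of the paper), which is why a roughly constant diffusion coefficient combined with a slight solubility increase translates into a slight permeability increase.

```latex
% Solution-diffusion relation between permeability P, diffusion coefficient D
% and solubility S.
P = D \, S
```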
Abstract:
Materials science is a multidisciplinary research topic related to the development of physics and technology. Mechanical alloying of ribbon flakes is a two-step route to develop advanced materials. In this work, an Fe-based alloy was obtained using three pathways: mechanical alloying, melt spinning, and mechanical alloying of previously melt-spun samples. The processing conditions allow us to obtain amorphous or nanocrystalline structures. Furthermore, a bibliographic review of mechanical alloying is presented here.