Abstract:
It can be assumed that the composition of Mercury's thin gas envelope (exosphere) is related to the composition of the planet's crustal materials. If this relationship is true, then inferences regarding the bulk chemistry of the planet might be made from a thorough exospheric study. The most vexing of all unsolved problems is the uncertainty in the source of each component. Historically, it has been believed that H and He come primarily from the solar wind, while Na and K originate from volatilized materials partitioned between Mercury's crust and meteoritic impactors. The processes that eject atoms and molecules into the exosphere of Mercury are generally considered to be thermal vaporization, photon-stimulated desorption (PSD), impact vaporization, and ion sputtering. Each of these processes has its own temporal and spatial dependence. The exosphere is strongly influenced by Mercury's highly elliptical orbit and rapid orbital speed. As a consequence, the surface undergoes large fluctuations in temperature and experiences differences of insolation with longitude. We will discuss these processes but focus more on the expected surface composition and on solar wind particle sputtering, which releases material such as Ca and other elements from the surface minerals, and we discuss the relevance of composition modelling.
Abstract:
The log-ratio methodology makes available powerful tools for analyzing compositional data. Nevertheless, the use of this methodology is only possible for those data sets without null values. Consequently, in data sets where zeros are present, a previous treatment becomes necessary. Recent advances in the treatment of compositional zeros have centered especially on zeros of a structural nature and on rounded zeros. These tools do not contemplate the particular case of count compositional data sets with null values. In this work we deal with "count zeros" and we introduce a treatment based on a mixed Bayesian-multiplicative estimation. We use the Dirichlet probability distribution as a prior and we estimate the posterior probabilities. Then we apply a multiplicative modification for the non-zero values. We present a case study where this new methodology is applied.
Key words: count data, multiplicative replacement, composition, log-ratio analysis
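The Bayesian-multiplicative treatment described in this abstract can be sketched as follows. The uniform Dirichlet prior and its total strength are illustrative assumptions for this sketch, not necessarily the exact prior the authors adopt.

```python
def bayes_mult_replace(counts, s=None):
    """Bayesian-multiplicative replacement of count zeros (sketch).

    counts: non-negative integer counts for one composition.
    s: Dirichlet prior strength; defaults to the number of parts
       (a uniform prior -- an assumption made for illustration).
    Returns proportions in which every zero is replaced by its
    posterior-mean share, while the ratios between the originally
    non-zero parts are preserved by a multiplicative rescaling.
    """
    D = len(counts)
    n = sum(counts)
    if s is None:
        s = D  # uniform prior t_i = 1/D with total strength s = D
    t = [1.0 / D] * D
    # posterior-mean value used to replace the zero parts
    repl = [s * t[i] / (n + s) for i in range(D)]
    zero_mass = sum(repl[i] for i in range(D) if counts[i] == 0)
    out = []
    for i in range(D):
        if counts[i] == 0:
            out.append(repl[i])
        else:
            # multiplicative modification keeps non-zero ratios intact
            out.append((counts[i] / n) * (1.0 - zero_mass))
    return out

p = bayes_mult_replace([12, 0, 3, 5])
```

Note how the replaced composition still sums to one and the ratio 12:3 between the first and third parts is untouched, which is the point of the multiplicative step.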
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory and combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
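The ilr-plus-normal-kernel idea can be illustrated for three-part compositions. The basis choice for the ilr coordinates and the scalar bandwidth are assumptions of this sketch; the paper itself argues for a full bandwidth matrix.

```python
import math

def ilr(x):
    """ilr coordinates of a 3-part composition (one standard basis choice)."""
    x1, x2, x3 = x
    z1 = math.log(x1 / x2) / math.sqrt(2.0)
    z2 = math.log((x1 * x2) / x3**2) / math.sqrt(6.0)
    return (z1, z2)

def kde(point, data, h=0.5):
    """Gaussian kernel density estimate evaluated in ilr coordinates.

    A single scalar bandwidth h is used only to keep the sketch short;
    the method discussed above uses a full bandwidth matrix instead.
    """
    u, v = ilr(point)
    total = 0.0
    for x in data:
        a, b = ilr(x)
        total += math.exp(-((u - a) ** 2 + (v - b) ** 2) / (2.0 * h * h))
    # bivariate normal kernel normalisation
    return total / (len(data) * 2.0 * math.pi * h * h)

sample = [(0.2, 0.3, 0.5), (0.25, 0.35, 0.4), (0.3, 0.3, 0.4)]
```

The estimate is higher near the cloud of sample compositions than far from it, as expected of any density estimator on the simplex.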
Abstract:
In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under such a unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002–03. The aim of these surveys is to understand human behavior and the lifestyle of people. Time allocation data are of compositional nature in origin, that is, they are subject to non-negativity and constant-sum constraints. Thus, standard multivariate techniques cannot be directly applied to analyze them. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where objects are allocated to only a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Concretely, the probabilistic fuzzy c-means algorithm is conveniently adapted to deal with the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn.
Key words: time use data, fuzzy clustering, FCM, simplex space, Aitchison distance
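One convenient way to adapt fuzzy c-means to compositions, sketched below, exploits the fact that the Aitchison distance between compositions equals the Euclidean distance between their clr images, so the standard FCM updates apply unchanged in clr coordinates. The two-cluster restriction and the deterministic initialisation are simplifications of this sketch, not part of the adaptation described in the abstract.

```python
import math

def clr(x):
    """Centred log-ratio transform of a composition."""
    logs = [math.log(v) for v in x]
    m = sum(logs) / len(logs)
    return [v - m for v in logs]

def fuzzy_cmeans(data, m=2.0, iters=50):
    """Minimal two-cluster probabilistic fuzzy c-means for compositions,
    run in clr coordinates. Initialised from the first and last points
    purely for reproducibility of this illustration."""
    pts = [clr(x) for x in data]
    n, d = len(pts), len(pts[0])
    centers = [list(pts[0]), list(pts[-1])]
    U = [[0.5, 0.5] for _ in range(n)]
    for _ in range(iters):
        for i, p in enumerate(pts):
            dist = [max(math.dist(p, c), 1e-12) for c in centers]
            for k in range(2):
                # standard FCM membership update
                U[i][k] = 1.0 / sum((dist[k] / dist[j]) ** (2.0 / (m - 1.0))
                                    for j in range(2))
        for k in range(2):
            w = [U[i][k] ** m for i in range(n)]
            tot = sum(w)
            centers[k] = [sum(w[i] * pts[i][j] for i in range(n)) / tot
                          for j in range(d)]
    return U

memberships = fuzzy_cmeans([(0.8, 0.1, 0.1), (0.7, 0.2, 0.1),
                            (0.1, 0.1, 0.8), (0.1, 0.2, 0.7)])
```

Each row of memberships sums to one, and the two well-separated groups of compositions receive high membership in opposite clusters.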
Abstract:
The quantitative estimation of Sea Surface Temperatures (SST) from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern coretop samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as distance measure. Modern coretop datasets are characterised by a large amount of zeros. Zero replacement was carried out by adopting a Bayesian approach, based on a posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which the SST is reconstructed was determined by means of a multiple approach, considering the proxies correlation matrix, the Standardized Residual Sum of Squares and the Mean Squared Distance. This new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea.
Key words: modern analogues, Aitchison distance, proxies correlation matrix, Standardized Residual Sum of Squares
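The core of a Modern Analogue Technique under the Aitchison metric can be sketched as below: rank the coretop assemblages by Aitchison distance to the fossil assemblage and average the SSTs of the k closest ones. The toy data, the fixed k, and the unweighted mean are assumptions of this sketch; the paper's analogue-selection criteria are richer.

```python
import math

def aitchison_dist(x, y):
    """Aitchison distance: Euclidean distance between clr images."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    mx = sum(lx) / len(lx)
    my = sum(ly) / len(ly)
    return math.sqrt(sum(((a - mx) - (b - my)) ** 2 for a, b in zip(lx, ly)))

def mat_sst(fossil, coretops, ssts, k=2):
    """SST estimate as the mean over the k closest coretop assemblages
    in the Aitchison metric. Zeros are assumed already replaced
    (e.g. by the Bayesian approach mentioned above)."""
    order = sorted(range(len(coretops)),
                   key=lambda i: aitchison_dist(fossil, coretops[i]))
    return sum(ssts[i] for i in order[:k]) / k

# hypothetical three-taxon coretop data and associated temperatures
coretops = [(0.7, 0.2, 0.1), (0.6, 0.3, 0.1), (0.1, 0.2, 0.7)]
ssts = [20.0, 18.0, 6.0]
est = mat_sst((0.65, 0.25, 0.1), coretops, ssts, k=2)
```

Here the fossil assemblage resembles the two warm coretops, so the estimate is their mean rather than being pulled toward the dissimilar cold sample.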
Abstract:
This paper deals with the problem of semiactive vibration control of civil engineering structures subject to unknown external disturbances (for example, earthquakes, winds, etc.). Two kinds of semiactive controllers are proposed based on the backstepping control technique. The experimental setup used is a 6-story test structure equipped with shear-mode semiactive magnetorheological dampers, installed in the Washington University Structural Control and Earthquake Engineering Laboratory (WUSCEEL). The experimental results obtained have verified the effectiveness of the proposed control algorithms.
Abstract:
In the B-ISDN there is a provision for four classes of services, all of them supported by a single transport network (the ATM network). Three of these services, the connection-oriented (CO) ones, permit connection access control (CAC), but the fourth, the connectionless-oriented (CLO) one, does not. Therefore, when the CLO service and CO services have to share the same ATM link, a conflict may arise. This is because a bandwidth allocation that obtains maximum statistical gain can damage the contracted ATM quality of service (QOS); and vice versa, in order to guarantee the contracted QOS, the statistical gain has to be sacrificed. The paper presents a performance evaluation study of the influence of the CLO service on a CO service (a circuit emulation service or a variable bit-rate service) when sharing the same link.
Abstract:
This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate study of the system load in statistical terms by accumulated calculations, since probabilistic results of the bandwidth allocation can be obtained. Nevertheless, the convolution approach has a high cost in terms of calculation and storage requirements. This aspect makes real-time calculations difficult, so many authors do not consider this approach. With the aim of reducing the cost, we propose to use the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is used in connection acceptance control, and some results are presented.
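The single-class case of this idea reduces the multinomial to a binomial and can be sketched directly: with n identical on/off sources, the distribution of the number of simultaneously active sources gives the instantaneous bandwidth distribution in closed form, with no convolution. All the numeric parameters below are hypothetical.

```python
from math import comb

def occupancy_dist(n, p):
    """P{k of n identical on/off sources active}, k = 0..n:
    the single-class (binomial) case of the multinomial shortcut
    to the convolution approach."""
    return [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

def overload_prob(n, p, rate, capacity):
    """Probability that the instantaneous aggregate demand
    (k active sources, each needing `rate`) exceeds the link capacity."""
    dist = occupancy_dist(n, p)
    return sum(pr for k, pr in enumerate(dist) if k * rate > capacity)

dist = occupancy_dist(10, 0.3)
p_over = overload_prob(10, 0.3, 2.0, 10.0)
```

Admitting or releasing a source only changes n by one, which is the kind of simple multiplicative update that replaces the full deconvolution step the abstract alludes to.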
Abstract:
This paper proposes a multicast implementation based on adaptive routing with anticipated calculation. Three different cost measures can be considered for a point-to-multipoint connection: bandwidth cost, connection establishment cost and switching cost. The application of the method, based on pre-evaluated routing tables, makes it possible to reduce bandwidth cost and connection establishment cost individually.
Abstract:
The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the cost in terms of calculation and storage requirements, they propose the use of the multinomial distribution function. This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements, which in turn makes a simple deconvolution process possible. Moreover, under certain conditions, additional improvements may be achieved.
Abstract:
The objective of this project is to achieve energy self-sufficiency for the rural house "Les Vinyes Grosses". This rural house is located in Sant Agustí de Lluçanès, in the comarca of Osona. The aim is to convert the installations that use non-renewable energy into installations that run on renewable energy. The heating and hot-water installation, which runs on diesel oil, is to be replaced by one fuelled with Scots pine wood chips sourced from the estate itself. The electrical installation, which draws electricity from the grid, is to be replaced by one that uses photovoltaic solar energy. Finally, all the water consumed in the rural house is to be rainwater instead of mains water. The total cost calculated for these changes of installations is €58,825, with a payback period of 15.32 years.
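The quoted payback figure is consistent with a simple (undiscounted) calculation, sketched below. Treating the payback as cost divided by annual savings is an assumption of this sketch; the project report may use a more detailed cash-flow model.

```python
def simple_payback(cost_eur, annual_savings_eur):
    """Simple (undiscounted) payback period in years."""
    return cost_eur / annual_savings_eur

# the quoted figures imply annual savings of roughly 58825 / 15.32 EUR/year
implied_savings = 58825 / 15.32
```

Under this model the €58,825 investment corresponds to annual energy savings of a little under €3,900.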
Abstract:
In Catalonia, according to the Nitrate Directive (91/676/EU), nine areas have been declared vulnerable to nitrate pollution from agricultural sources (Decret 283/1998 and Decret 479/2004). Five of these areas have been studied by coupling hydrochemical data with a multi-isotopic approach (Vitòria et al. 2005, Otero et al. 2007, Puig et al. 2007), in an ongoing research project looking for an integrated application of classical hydrochemistry data with a comprehensive isotopic characterisation (δ15N and δ18O of dissolved nitrate, δ34S and δ18O of dissolved sulphate, δ13C of dissolved inorganic carbon, and δD and δ18O of water). Within this general frame, the contribution presented explores compositional ways of: (i) distinguishing agrochemical and manure N pollution, and (ii) quantifying natural attenuation of nitrate (denitrification) and identifying possible controlling factors. To achieve this two-fold goal, the following techniques have been used. Separate biplots of each suite of data show that each studied region has distinct δ34S and pH signatures, but the regions are homogeneous with regard to the NO3-related variables. Also, the geochemical variables were projected onto the compositional directions associated with the possible denitrification reactions in each region. The resulting balances can be plotted together with some isotopes, to assess their likelihood of occurrence.
Abstract:
Geochemical data derived from the whole or partial analysis of various geologic materials represent a composition of mineralogies or solute species. Minerals are composed of structured relationships between cations and anions which, through atomic and molecular forces, keep the elements bound in specific configurations. The chemical compositions of minerals have specific relationships that are governed by these molecular controls. In the case of olivine, there is a well-defined relationship between Mn-Fe-Mg and Si. Balances between the principal elements defining olivine composition and other significant constituents in the composition (Al, Ti) have been defined, resulting in a near-linear relationship between the logarithmic relative proportion of Si versus (Mg, Mn, Fe) and of Mg versus (Mn, Fe), which is typically described but poorly illustrated in the simplex. The present contribution corresponds to ongoing research, which attempts to relate stoichiometry and geochemical data using compositional geometry. We describe here the approach by which stoichiometric relationships based on mineralogical constraints can be accounted for in the space of simplicial coordinates, using olivines as an example. Further examples for other mineral types (plagioclases and more complex minerals such as clays) are needed. Issues that remain to be dealt with include the reduction of a bulk chemical composition of a rock composed of several minerals, from which appropriate balances can be used to describe the composition in a realistic mineralogical framework. The overall objective of our research is to answer the question: in the cases where the mineralogy is unknown, are there suitable proxies that can be substituted?
Key words: Aitchison geometry, balances, mineral composition, oxides
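The balances mentioned above are isometric log-ratios between groups of parts, and the Si-versus-(Mg, Mn, Fe) and Mg-versus-(Mn, Fe) contrasts can be computed directly. The example olivine composition below is hypothetical (a forsterite-rich molar cation proportion), chosen only to exercise the formula.

```python
import math

def balance(num, den):
    """Isometric log-ratio balance between two groups of parts:
    sqrt(r*s/(r+s)) * ln(gm(num)/gm(den)), with r, s the group sizes
    and gm the geometric mean."""
    r, s = len(num), len(den)
    gm = lambda parts: math.exp(sum(math.log(p) for p in parts) / len(parts))
    return math.sqrt(r * s / (r + s)) * math.log(gm(num) / gm(den))

# hypothetical forsterite-rich olivine, molar cation proportions
comp = {"Si": 1.0, "Mg": 1.8, "Fe": 0.18, "Mn": 0.02}
b_si = balance([comp["Si"]], [comp["Mg"], comp["Mn"], comp["Fe"]])
b_mg = balance([comp["Mg"]], [comp["Mn"], comp["Fe"]])
```

Because balances are log-ratios, they are invariant under rescaling of the composition, which is what lets raw analytical totals and closed proportions give the same coordinates.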
Abstract:
Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as “functional data” and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra. The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below. First of all, we adapt the well-known spline smoothing techniques in order to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory: clustering algorithms are then applied to these smooth curves. The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only. A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
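The distinction between a shape-and-level metric and a shape-only metric can be sketched for two-part compositional trajectories by working in ilr coordinates and, for the shape-only case, centring each coordinate sequence before comparing. The centring recipe is an assumption of this sketch, not necessarily the authors' construction.

```python
import math

def ilr2(x):
    """ilr coordinate of a 2-part composition."""
    a, b = x
    return math.log(a / b) / math.sqrt(2.0)

def traj_dist(t1, t2, shape_only=False):
    """Distance between two compositional trajectories, computed on their
    ilr coordinate sequences (assumed sampled at the same domain points).
    With shape_only=True each curve is centred first, so a constant level
    shift between the trajectories is ignored."""
    z1 = [ilr2(x) for x in t1]
    z2 = [ilr2(x) for x in t2]
    if shape_only:
        m1 = sum(z1) / len(z1)
        m2 = sum(z2) / len(z2)
        z1 = [v - m1 for v in z1]
        z2 = [v - m2 for v in z2]
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(z1, z2)))

t1 = [(0.2, 0.8), (0.3, 0.7), (0.4, 0.6)]
t2 = [(1/3, 2/3), (6/13, 7/13), (4/7, 3/7)]  # same shape, shifted level
```

The second trajectory doubles every part ratio of the first, so the two have identical shape: the shape-only distance vanishes while the full metric still sees the level shift.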
Abstract:
In CoDaWork’05, we presented an application of discriminant function analysis (DFA) to four different compositional datasets and modelled the first canonical variable using a segmented regression model, based solely on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration, as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set, or onto missing samples in between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This newly proposed methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra to relevant eruptive events rather than estimating the age of the unknown tephra.
Key words: tephrochronology, segmented regression
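A segmented regression with one breakpoint can be fitted by brute force: try every interior split point, fit ordinary least squares to each side, and keep the split with the smallest total residual sum of squares. This exhaustive search is a stand-in for whatever fitting procedure the paper uses, and the data below are synthetic.

```python
def fit_segmented(xs, ys):
    """Two-segment linear fit with the breakpoint chosen by exhaustive
    search over interior sample points (a brute-force sketch of
    segmented regression). Returns (breakpoint_x, total_sse), where
    breakpoint_x is the first x of the second segment."""
    def ols(px, py):
        # simple least-squares line fit returning its residual SSE
        n = len(px)
        mx = sum(px) / n
        my = sum(py) / n
        sxx = sum((x - mx) ** 2 for x in px) or 1e-12
        b = sum((x - mx) * (y - my) for x, y in zip(px, py)) / sxx
        a = my - b * mx
        return sum((y - (a + b * x)) ** 2 for x, y in zip(px, py))
    best = None
    for i in range(2, len(xs) - 1):  # keep at least 2 points per segment
        total = ols(xs[:i], ys[:i]) + ols(xs[i:], ys[i:])
        if best is None or total < best[1]:
            best = (xs[i], total)
    return best

xs = list(range(8))
ys = [0, 1, 2, 3, 4, 9, 14, 19]  # slope 1 up to x = 4, then slope 5
bp, sse = fit_segmented(xs, ys)
```

On this synthetic series the search recovers the true kink exactly, with zero residual error on both segments.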