26 results for "refined multiscale entropy"
Abstract:
New high-precision niobium (Nb) and tantalum (Ta) concentration data are presented for early Archaean metabasalts, metabasaltic komatiites and their erosion products (mafic metapelites) from SW Greenland and the Acasta gneiss complex, Canada. Individual datasets consistently show sub-chondritic Nb/Ta ratios averaging 15.1+/-11.6. This finding is discussed with regard to two competing models for resolving the Nb deficit that characterises the accessible Earth. Firstly, we test whether Nb could have been sequestered into the core owing to its slightly siderophile (or chalcophile) character under very reducing conditions, as recently proposed from experimental evidence. We demonstrate that troilite inclusions of the Canyon Diablo iron meteorite have Nb and V concentrations in excess of typical chondrites, but that the metal phase of the Grant, Toluca and Canyon Diablo iron meteorites does not have significant concentrations of these lithophile elements. We find that if the entire accessible-Earth Nb deficit were explained by Nb in the core, only ca. 17% of the mantle could be depleted, and that by 3.7 Ga continental crust would already have achieved ca. 50% of its present mass. Nb/Ta systematics of late Archaean metabasalts compiled from the literature would further require that by 2.5 Ga, 90% of the present mass of continental crust was already in existence. As an alternative to this explanation, we propose that the average Nb/Ta ratio (15.1+/-11.6) of Earth's oldest mafic rocks is a valid approximation for the bulk silicate Earth. This would require that ca. 13% of the terrestrial Nb resided in the Ta-free core. Since the partitioning of Nb between silicate and metal melts depends largely on oxygen fugacity and pressure, this finding could mean that metal/silicate segregation did not occur at the base of a deep magma ocean, or that the early mantle was slightly less reducing than generally assumed. A bulk silicate Earth Nb/Ta ratio of 15.1 allows for depletion of up to 40% of the total mantle. This could indicate that, in addition to the upper mantle, a portion of the lower mantle is also depleted; alternatively, if only the upper mantle were depleted, an additional hidden high-Nb/Ta reservoir must exist. Comparison of Nb/Ta systematics between early and late Archaean metabasalts supports the latter idea and indicates that deeply subducted high-Nb/Ta eclogite slabs could reside in the mantle transition zone or the lower mantle. Accumulation of such slabs appears to have commenced between 2.5 and 2.0 Ga. Regardless of these complexities of terrestrial Nb/Ta systematics, it is shown that the depleted-mantle Nb/Th ratio is a very robust proxy for the amount of extracted continental crust, because the temporal evolution of this ratio is dominated by Th loss to the continents and not Nb retention in the mantle. We present a new parameterisation of the continental crust volume versus age curve that specifically explores the possibility of lithophile element loss to the core and storage of eclogite slabs in the transition zone.
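A one-line mass balance makes the quoted "ca. 13%" explicit. Because Ta is taken to be fully lithophile (a Ta-free core), any Nb hidden in the core lowers the bulk silicate Earth (BSE) Nb/Ta below the bulk-Earth (chondritic) value; the chondritic ratio of ca. 17.4 below is an assumed reference value for illustration, not one quoted in the abstract:

```latex
\left(\frac{\mathrm{Nb}}{\mathrm{Ta}}\right)_{\mathrm{BSE}}
  = (1 - f)\left(\frac{\mathrm{Nb}}{\mathrm{Ta}}\right)_{\mathrm{chon}}
\quad\Rightarrow\quad
f = 1 - \frac{15.1}{17.4} \approx 0.13,
```

where f is the fraction of terrestrial Nb residing in the core.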
Abstract:
The first terrestrial Pb-isotope paradox refers to the fact that, on average, rocks from the Earth's surface (i.e. the accessible Earth) plot significantly to the right of the meteorite isochron in a common Pb-isotope diagram. The Earth as a whole, however, should plot close to the meteorite isochron, implying the existence of at least one terrestrial reservoir that plots to the left of it. The core and the lower continental crust are the two candidates that have been widely discussed in the past. Here we propose that subducted oceanic crust and associated continental sediment, stored as garnetite slabs in the mantle Transition Zone or mid-lower mantle, constitute an additional potential reservoir that requires consideration. We present evidence from the literature indicating that neither the core nor the lower crust contains sufficient unradiogenic Pb to balance the accessible Earth. Of all mantle magmas, only rare alkaline melts plot significantly to the left of the meteorite isochron. We interpret these melts to be derived from the missing mantle reservoir that plots to the left of the meteorite isochron but, significantly, above the mid-ocean ridge basalt (MORB)-source mantle evolution line. Our solution to the paradox predicts the bulk silicate Earth to be more radiogenic in Pb-207/Pb-204 than present-day MORB-source mantle, which opens the possibility that undegassed primitive mantle might be the source of certain ocean island basalts (OIB). Further implications for mantle dynamics and oceanic magmatism are discussed based on a previously justified proposal that lamproites and associated rocks could derive from the Transition Zone.
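For readers unfamiliar with the construction, the meteorite isochron (geochron) follows from the standard Pb growth equations; the decay constants and present-day U isotope ratio below are textbook values, and the Earth age of ca. 4.55 Gyr is the conventional choice:

```latex
\frac{^{206}\mathrm{Pb}}{^{204}\mathrm{Pb}}
 = \left(\frac{^{206}\mathrm{Pb}}{^{204}\mathrm{Pb}}\right)_{\!0} + \mu\left(e^{\lambda_{238}T} - 1\right),
\qquad
\frac{^{207}\mathrm{Pb}}{^{204}\mathrm{Pb}}
 = \left(\frac{^{207}\mathrm{Pb}}{^{204}\mathrm{Pb}}\right)_{\!0} + \frac{\mu}{137.88}\left(e^{\lambda_{235}T} - 1\right),
```

with mu = 238U/204Pb, lambda_238 = 1.55125e-10/yr and lambda_235 = 9.8485e-10/yr. Eliminating mu at fixed T gives a straight line through the primordial composition: reservoirs to its right are balanced only if some reservoir lies to its left, which is the mass-balance argument the abstract develops.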
Abstract:
In this paper, we present a technique for characterizing adsorption equilibria in activated carbons having slit-shaped pores. This method was first developed by Do (Do, D. D. A new method for the characterisation of micro-mesoporous materials. Presented at the International Symposium on New Trends in Colloid and Interface Science, September 24-26, 1998, Chiba, Japan) and applied by his group and other groups for characterization of pore size distribution (PSD) as well as adsorption equilibria determination of a wide range of hydrocarbons. It is refined in this paper and compared with grand canonical Monte Carlo (GCMC) simulation and density functional theory (DFT). The refined theory yields pore-filling pressures as a function of pore width that agree well with those obtained by GCMC and DFT. Furthermore, our local isotherms are qualitatively in good agreement with those obtained by the GCMC simulations. The main advantage of this method is that it is about 4 orders of magnitude faster than the GCMC simulations, making it suitable for optimization studies and design purposes. Finally, we apply our method and GCMC in the derivation of the PSD of a commercial activated carbon. The PSD derived from our method is comparable to that derived from the GCMC simulations.
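The PSD step described here amounts to inverting the adsorption integral equation: the measured isotherm is a pore-width-weighted sum of local (single-pore) isotherms. A minimal sketch of that inversion follows; the Langmuir-style local isotherm is a stand-in for the refined local isotherms of the paper, and all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import nnls

# Discretize relative pressure and slit-pore width (nm).
P = np.logspace(-5, 0, 60)          # relative pressures
w = np.linspace(0.4, 3.0, 25)       # candidate pore widths

def local_isotherm(P, w):
    """Placeholder single-pore isotherm: narrower pores fill at lower
    pressure (stronger potential overlap). NOT the refined local
    isotherms of the paper -- a Langmuir-style stand-in."""
    b = np.exp(4.0 / w)             # width-dependent affinity (illustrative)
    return b * P / (1.0 + b * P)

# Kernel matrix: A[i, j] = uptake in a pore of width w[j] at pressure P[i].
A = np.array([[local_isotherm(p, wj) for wj in w] for p in P])

# Synthetic "measured" isotherm from a known bimodal PSD, plus noise.
g_true = np.exp(-(w - 0.8)**2 / 0.02) + 0.5 * np.exp(-(w - 1.8)**2 / 0.1)
V_meas = A @ g_true + 0.01 * np.random.default_rng(0).standard_normal(len(P))

# Non-negative least squares recovers the PSD weights g(w) >= 0.
g_fit, resid = nnls(A, V_meas)
print("residual:", resid)
print("recovered PSD weights:", np.round(g_fit, 3))
```

The speed claim in the abstract is about evaluating the kernel: with an analytical local isotherm, building A is essentially free, whereas each column would otherwise require a GCMC simulation.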
Abstract:
A quantum random walk on the integers exhibits pseudo memory effects, in that its probability distribution after N steps is determined by reshuffling the first N distributions that arise in a classical random walk with the same initial distribution. In a classical walk, entropy increase can be regarded as a consequence of the majorization ordering of successive distributions. The Lorenz curves of successive distributions for a symmetric quantum walk reveal no majorization ordering in general. Nevertheless, entropy can increase, and computer experiments show that it does so on average. Varying the stages at which the quantum coin system is traced out leads to new quantum walks, including a symmetric walk for which majorization ordering is valid but the spreading rate exceeds that of the usual symmetric quantum walk.
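The distributions and entropies in question are straightforward to reproduce numerically. Below is a minimal sketch of the symmetric (Hadamard-coin) walk with the Shannon entropy of the position distribution tracked at each step; the symmetric initial coin state is the conventional choice, not something prescribed by the abstract:

```python
import numpy as np

N = 100                                   # number of steps
pos = 2 * N + 1                           # positions -N..N
# psi[x, c]: amplitude at position x with coin state c in {0 (left), 1 (right)}
psi = np.zeros((pos, 2), dtype=complex)
psi[N, 0] = 1 / np.sqrt(2)                # (|0> + i|1>)/sqrt(2): symmetric
psi[N, 1] = 1j / np.sqrt(2)               # Hadamard walk initial coin state

def shannon_entropy(p):
    p = p[p > 1e-15]
    return float(-(p * np.log2(p)).sum())

entropies = []
for _ in range(N):
    # Coin step: Hadamard on the coin register.
    a, b = psi[:, 0].copy(), psi[:, 1].copy()
    psi[:, 0] = (a + b) / np.sqrt(2)
    psi[:, 1] = (a - b) / np.sqrt(2)
    # Shift step: coin 0 moves left, coin 1 moves right.
    psi[:, 0] = np.roll(psi[:, 0], -1)
    psi[:, 1] = np.roll(psi[:, 1], +1)
    p = (np.abs(psi) ** 2).sum(axis=1)    # position distribution
    entropies.append(shannon_entropy(p))

print("entropy after steps 1, 10, 100:",
      entropies[0], entropies[9], entropies[-1])
```

Sorting each distribution and plotting its cumulative sums gives the Lorenz curves mentioned in the abstract; comparing consecutive curves tests majorization directly.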
Abstract:
Chemical engineers are turning to multiscale modelling to extend traditional modelling approaches into new application areas and to achieve higher levels of detail and accuracy. There is, however, little advice available on the best strategy to use in constructing a multiscale model. This paper presents a starting point for the systematic analysis of multiscale models by defining several integrating frameworks for linking models at different scales. It briefly explores how the nature of the information flow between the models at the different scales is influenced by the choice of framework, and presents some restrictions on model-framework compatibility. The concepts are illustrated with reference to the modelling of a catalytic packed bed reactor.
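To make the framework idea concrete, here is a minimal sketch of one simple serial/embedded coupling for the packed bed example: a particle-scale model is evaluated inside the reactor-scale balance at each integration point. The kinetics, geometry, and parameter values are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.integrate import solve_ivp

# --- Micro scale: diffusion-reaction in a spherical catalyst pellet. ---
def effectiveness_factor(k, D_eff, R):
    """Standard Thiele-modulus result for a first-order reaction
    in a spherical pellet."""
    phi = R * np.sqrt(k / D_eff)          # Thiele modulus
    return (3.0 / phi**2) * (phi / np.tanh(phi) - 1.0)

# --- Macro scale: steady plug-flow balance along the bed. ---
k, D_eff, R = 0.5, 1e-6, 2e-3             # rate const (1/s), diffusivity (m^2/s), radius (m)
u, eps = 0.1, 0.4                         # superficial velocity (m/s), bed voidage

def dCdz(z, C):
    # The macro model calls the micro model here: local conditions flow
    # "down", the effective reaction rate flows back "up".
    eta = effectiveness_factor(k, D_eff, R)
    return -(1.0 - eps) * eta * k * C / u

sol = solve_ivp(dCdz, (0.0, 1.0), [1.0])  # 1 m bed, inlet concentration 1
print("outlet concentration:", sol.y[0, -1])
```

In this isothermal first-order case the micro result is a constant, so the coupling is trivially serial; with state-dependent kinetics the pellet problem must be re-solved at every macro point, which is exactly the kind of information-flow distinction between frameworks that the paper formalizes.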
Abstract:
We introduce a novel way of measuring the entropy of a set of values undergoing changes. Such a measure becomes useful when analyzing the temporal development of an algorithm designed to numerically update a collection of values, such as artificial neural network weights undergoing adjustments during learning. We measure the entropy as a function of the phase space of the values, i.e. their magnitude and velocity of change, using a method based on the abstract measure of entropy introduced by the philosopher Rudolf Carnap. By constructing a time-dynamic two-dimensional Voronoi diagram whose cell generators have coordinates of value and value velocity (rate of change of magnitude), the entropy becomes a function of the cell areas. We term this measure teleonomic entropy, since it can be used to describe changes in any end-directed (teleonomic) system. The usefulness of the method is illustrated by comparing the different approaches of two search algorithms: a learning artificial neural network and a population of discovering agents.
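A minimal sketch of the core computation, assuming points in the two-dimensional (value, value-velocity) phase space and using only bounded Voronoi cells; how the paper treats boundary (unbounded) cells is not specified in the abstract, so they are simply excluded here:

```python
import numpy as np
from scipy.spatial import Voronoi

def polygon_area(verts):
    """Shoelace area of a convex 2-D cell; vertices are sorted by angle
    around the centroid to guarantee a consistent ordering."""
    c = verts.mean(axis=0)
    order = np.argsort(np.arctan2(verts[:, 1] - c[1], verts[:, 0] - c[0]))
    x, y = verts[order, 0], verts[order, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def cell_area_entropy(points):
    """Entropy over Voronoi-cell areas of (value, value-velocity) points;
    unbounded cells are skipped."""
    vor = Voronoi(points)
    areas = []
    for region_idx in vor.point_region:
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:   # unbounded cell
            continue
        areas.append(polygon_area(vor.vertices[region]))
    p = np.array(areas) / np.sum(areas)        # areas -> probabilities
    return float(-(p * np.log(p)).sum())

# Example: 200 (weight, weight-change) pairs at one training step.
rng = np.random.default_rng(1)
pts = np.column_stack([rng.normal(0, 1, 200), rng.normal(0, 0.1, 200)])
print("cell-area entropy:", cell_area_entropy(pts))
```

Tracking this quantity over training steps gives the temporal entropy profile the abstract uses to compare the two search algorithms.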
Abstract:
A systematic goal-driven top-down modelling methodology is proposed that is capable of developing a multiscale model of a process system for given diagnostic purposes. The diagnostic goal-set and the symptoms are extracted from HAZOP analysis results, where the possible actions to be performed in a fault situation are also described. The multiscale dynamic model is realized in the form of a hierarchical coloured Petri net by using a novel substitution place-transition pair. Multiscale simulation that focuses automatically on the fault areas is used to predict the effect of the proposed preventive actions. The notions and procedures are illustrated on simple case studies, including a heat exchanger network and a more complex wet granulation process.
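As a rough, heavily simplified illustration of the hierarchical idea (ordinary place-transition nets rather than the paper's coloured formalism), the sketch below shows a transition whose firing is refined by a finer-scale subnet, mimicking how a substitution element lets simulation zoom into a fault area; all unit and place names are invented:

```python
class PetriNet:
    """Minimal place-transition net: a transition consumes one token per
    input place and produces one token per output place."""
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        ins, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in ins)

    def fire(self, name):
        ins, outs = self.transitions[name]
        assert self.enabled(name), f"{name} not enabled"
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Coarse model of a heat exchanger unit (names invented).
net = PetriNet({"feed": 1, "hot_out": 0})
net.add_transition("exchange_heat", ["feed"], ["hot_out"])

# Finer-scale subnet that refines "exchange_heat"; it is only simulated
# when diagnosis focuses on this unit.
subnet = PetriNet({"tube_in": 1, "wall": 0, "tube_out": 0})
subnet.add_transition("convect_in", ["tube_in"], ["wall"])
subnet.add_transition("convect_out", ["wall"], ["tube_out"])

def fire_substituted(coarse, name, fine, fine_sequence):
    """Fire a coarse transition by first running its refining subnet."""
    for t in fine_sequence:
        fine.fire(t)
    coarse.fire(name)

fire_substituted(net, "exchange_heat", subnet, ["convect_in", "convect_out"])
print(net.marking, subnet.marking)
```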
Abstract:
The cross-entropy (CE) method is a new generic approach to combinatorial and multi-extremal optimization and rare event simulation. The purpose of this tutorial is to give a gentle introduction to the CE method. We present the CE methodology, the basic algorithm and its modifications, and discuss applications in combinatorial optimization and machine learning.
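For readers who want the "basic algorithm" in executable form, here is a minimal sketch of CE on a toy combinatorial problem (recovering a hidden binary string); the Bernoulli parameterization, elite fraction, and smoothing constant are standard tutorial choices, not values from this paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N, rho, alpha = 40, 200, 0.1, 0.7      # dims, samples/iter, elite frac, smoothing
target = rng.integers(0, 2, n)            # hidden optimum (toy objective)

def score(x):
    return (x == target).sum()            # number of matching bits

p = np.full(n, 0.5)                       # Bernoulli sampling parameters
for it in range(50):
    X = (rng.random((N, n)) < p).astype(int)          # 1. sample from current model
    S = np.array([score(x) for x in X])
    elite = X[S >= np.quantile(S, 1 - rho)]           # 2. keep the elite samples
    p = alpha * elite.mean(axis=0) + (1 - alpha) * p  # 3. CE update with smoothing
    if score((p > 0.5).astype(int)) == n:
        break

print("iterations:", it + 1, "best score:", score((p > 0.5).astype(int)))
```

The same loop does rare-event simulation when the score is replaced by a level-crossing indicator and the update by a likelihood-ratio-weighted maximum likelihood step.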
Abstract:
Consider a network of unreliable links, modelling for example a communication network. Estimating the reliability of the network, expressed as the probability that certain nodes in the network are connected, is a computationally difficult task. In this paper we study how the Cross-Entropy method can be used to obtain more efficient network reliability estimation procedures. Three estimation techniques are considered: Crude Monte Carlo and the more sophisticated Permutation Monte Carlo and Merge Process. We show that the Cross-Entropy method yields a speed-up over all three techniques.
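To fix ideas, here is a minimal crude Monte Carlo sketch of the quantity being estimated: two-terminal reliability of a small graph with independently failing links. The CE speed-up studied in the paper would tilt the sampling distribution of link states and reweight, which this baseline deliberately omits; the graph and probabilities are invented:

```python
import numpy as np

# Invented example network: edges with independent "up" probabilities.
edges = [(0, 1, 0.9), (1, 2, 0.9), (0, 2, 0.8), (2, 3, 0.95), (1, 3, 0.7)]
source, terminal, trials = 0, 3, 100_000
rng = np.random.default_rng(42)

def connected(up_edges):
    """Depth-first search from source over currently-up edges."""
    adj = {}
    for u, v in up_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, stack = {source}, [source]
    while stack:
        node = stack.pop()
        for nb in adj.get(node, []):
            if nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return terminal in seen

hits = 0
for _ in range(trials):
    up = [(u, v) for u, v, p in edges if rng.random() < p]  # sample link states
    hits += connected(up)
print("crude MC reliability estimate:", hits / trials)
```

Crude MC degrades badly when links are highly reliable and disconnection is rare, which is precisely the regime where the variance-reduction techniques compared in the paper pay off.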
Abstract:
The buffer allocation problem (BAP) is a well-known difficult problem in the design of production lines. We present a stochastic algorithm for solving the BAP, based on the cross-entropy method, a new paradigm for stochastic optimization. The algorithm involves the following iterative steps: (a) the generation of buffer allocations according to a certain random mechanism, followed by (b) the modification of this mechanism on the basis of cross-entropy minimization. Through various numerical experiments we demonstrate the efficiency of the proposed algorithm and show that the method can quickly generate (near-)optimal buffer allocations for fairly large production lines.
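A minimal sketch of steps (a) and (b) under one simple parameterization: allocations of N buffers over the line's internal locations are sampled from a multinomial whose cell probabilities are the CE parameters, and the throughput evaluator is a hypothetical stand-in for the production-line simulation the paper uses:

```python
import numpy as np

rng = np.random.default_rng(0)
m_sites, N_buf = 5, 20                    # buffer locations, total buffers
samples, rho, alpha = 300, 0.1, 0.7       # samples/iter, elite frac, smoothing

def estimate_throughput(alloc):
    """HYPOTHETICAL placeholder for the production-line simulator.
    Here: a noisy score that rewards balanced allocations."""
    return -np.var(alloc) + 0.1 * rng.standard_normal()

p = np.full(m_sites, 1.0 / m_sites)       # multinomial cell probabilities
for _ in range(40):
    # (a) generate buffer allocations from the current random mechanism
    allocs = rng.multinomial(N_buf, p, size=samples)
    scores = np.array([estimate_throughput(a) for a in allocs])
    elite = allocs[scores >= np.quantile(scores, 1 - rho)]
    # (b) CE update: cell probabilities re-estimated from elite allocations
    p = alpha * (elite.mean(axis=0) / N_buf) + (1 - alpha) * p

print("final allocation probabilities:", np.round(p, 3))
print("suggested allocation:", np.round(p * N_buf).astype(int))
```

Replacing the placeholder with a discrete-event simulation of the line turns this into the algorithm the abstract describes; the noisy objective is exactly the setting CE is designed to handle.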
Abstract:
We explore both the rheology and complex flow behavior of monodisperse polymer melts. Adequate quantities of monodisperse polymer were synthesized so that both the material's rheology and its microprocessing behavior could be established. In parallel, we employ a molecular theory for the polymer rheology that is suitable for comparison with experimental rheometric data and with numerical simulation of microprocessing flows. The model is capable of matching both shear and extensional data with minimal parameter fitting. Experimental data for the processing behavior of monodisperse polymers are presented for the first time as flow birefringence and pressure difference data obtained using a Multipass Rheometer with an 11:1 constriction entry and exit flow. Matching of experimental processing data was obtained using the constitutive equation with the Lagrangian numerical solver, FLOWSOLVE. The results show the direct coupling between molecular constitutive response and macroscopic processing behavior, and differentiate flow effects that arise separately from orientation and stretch.
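The paper's molecular model is not reproduced in the abstract; as a much simpler stand-in, the sketch below integrates the upper-convected Maxwell (UCM) conformation-tensor equations in start-up of simple shear, illustrating how a constitutive equation turns a flow history into stresses (the quantity a solver such as FLOWSOLVE couples to the flow field). Parameter values are illustrative:

```python
import numpy as np
from scipy.integrate import solve_ivp

G, lam, gdot = 1.0e4, 1.0, 2.0   # modulus (Pa), relaxation time (s), shear rate (1/s)

def ucm(t, y):
    """UCM in simple shear: dA/dt = kappa.A + A.kappa^T - (A - I)/lam,
    kappa = [[0, gdot], [0, 0]]; y = (Axx, Axy, Ayy)."""
    Axx, Axy, Ayy = y
    return [2 * gdot * Axy - (Axx - 1) / lam,
            gdot * Ayy - Axy / lam,
            -(Ayy - 1) / lam]

sol = solve_ivp(ucm, (0, 10 * lam), [1.0, 0.0, 1.0])   # start from equilibrium
Axx, Axy, Ayy = sol.y[:, -1]
print("steady shear stress sigma_xy =", G * Axy)          # -> G*lam*gdot
print("steady first normal stress N1 =", G * (Axx - Ayy)) # -> 2*G*(lam*gdot)^2
```

UCM cannot separate orientation from chain stretch; the tube-based molecular theories of the kind the paper uses add exactly that distinction, which is why the abstract can attribute different flow effects to each.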
Abstract:
We consider the problem of estimating P(Y_1 + ... + Y_n > x) by importance sampling when the Y_i are i.i.d. and heavy-tailed. The idea is to exploit the cross-entropy method as a tool for choosing good parameters in the importance sampling distribution; in doing so, we use the asymptotic description that, given Y_1 + ... + Y_n > x, n - 1 of the Y_i have distribution F and one has the conditional distribution of Y given Y > x. We show in some specific parametric examples (Pareto and Weibull) how this leads to precise answers which, as demonstrated numerically, are close to being variance-minimal within the parametric class under consideration. Related problems for M/G/1 and GI/G/1 queues are also discussed.
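A minimal sketch of the generic CE approach to choosing the importance-sampling parameter within a Pareto class; this is the standard multilevel CE recipe, not the paper's specific asymptotically-motivated change of measure, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, n, x = 2.0, 5, 100.0               # Pareto tail index, summands, threshold
N, rho = 10_000, 0.1                      # samples per CE iteration, elite fraction

def sample_pareto(v, size):
    return (1 - rng.random(size)) ** (-1.0 / v)   # Pareto(v) on [1, inf)

def log_lr(Y, v):
    """log likelihood ratio f(Y; alpha) / f(Y; v) for a sample vector,
    with f(y; v) = v * y**(-v - 1)."""
    return n * np.log(alpha / v) + (v - alpha) * np.log(Y).sum(axis=1)

v = alpha
for _ in range(50):                        # multilevel CE iterations
    Y = sample_pareto(v, (N, n))
    S = Y.sum(axis=1)
    level = min(x, np.quantile(S, 1 - rho))
    w = np.where(S >= level, np.exp(log_lr(Y, v)), 0.0)
    # Weighted Pareto MLE = CE-optimal parameter within the class.
    v = w.sum() * n / (w * np.log(Y).sum(axis=1)).sum()
    if level >= x:
        break

Y = sample_pareto(v, (N, n))               # final IS run with the tuned v
S = Y.sum(axis=1)
est = np.mean(np.where(S > x, np.exp(log_lr(Y, v)), 0.0))
print("tuned v:", v, "IS estimate of P(S_n > x):", est)
```

The tuned v is smaller than alpha, i.e. the proposal is heavier-tailed than F, which is consistent with the asymptotic description above: the rare event is typically caused by one very large summand.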
Abstract:
In recent years, the cross-entropy method has been successfully applied to a wide range of discrete optimization tasks. In this paper we consider the cross-entropy method in the context of continuous optimization. We demonstrate the effectiveness of the cross-entropy method for solving difficult continuous multi-extremal optimization problems, including those with non-linear constraints.
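A minimal sketch of CE for continuous multi-extremal minimization with a Gaussian sampling model (the common choice in this literature); the test function and constants are illustrative, and constraint handling is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
d, N, rho, alpha = 5, 500, 0.1, 0.8       # dim, samples, elite frac, smoothing

def rastrigin(X):
    """Classic multi-extremal test function; global minimum 0 at the origin."""
    return 10 * X.shape[1] + (X**2 - 10 * np.cos(2 * np.pi * X)).sum(axis=1)

mu, sigma = np.full(d, 3.0), np.full(d, 3.0)   # initial Gaussian sampling model
for _ in range(200):
    X = rng.normal(mu, sigma, (N, d))          # 1. sample candidate solutions
    S = rastrigin(X)
    elite = X[S <= np.quantile(S, rho)]        # 2. elite = best rho fraction
    mu = alpha * elite.mean(axis=0) + (1 - alpha) * mu        # 3. CE update
    sigma = alpha * elite.std(axis=0) + (1 - alpha) * sigma   #    with smoothing
    if sigma.max() < 1e-6:                     # sampling model has collapsed
        break

print("estimated minimizer:", np.round(mu, 4), "value:", rastrigin(mu[None])[0])
```

Non-linear constraints of the kind the abstract mentions are typically handled by rejecting or penalizing infeasible samples before the elite selection step.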