4 results for Computations

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance: 10.00%

Abstract:

The main goal of this thesis is to understand and link together some of the early works of Michel Rumin and Pierre Julg. The work centres on the so-called Rumin complex, a construction in sub-Riemannian geometry. A Carnot manifold is a manifold endowed with a horizontal distribution; if a metric is also given, one obtains a sub-Riemannian manifold. Such data arise in several contexts, including:

- the formulation of the second principle of thermodynamics;
- optimal control;
- propagation of singularities for sums of squares of vector fields;
- real hypersurfaces in complex manifolds;
- ideal boundaries of rank-one symmetric spaces;
- asymptotic geometry of nilpotent groups;
- modelling of human vision.

Differential forms on a Carnot manifold carry weights, which produces a filtered complex. With a view to applications to nilpotent groups, Rumin defined a substitute for the de Rham complex adapted to this filtration. The presence of a filtered complex also suggests using the formal machinery of spectral sequences in the study of cohomology. The goal was indeed to understand the link between Rumin's operator and the differentials that appear in the various spectral sequences we have worked with:

- the weight spectral sequence;
- a special spectral sequence introduced by Julg and called by him Forman's spectral sequence;
- Forman's spectral sequence (which turns out to be unrelated to the previous one).

We will see that in general Rumin's operator depends on choices. However, in some special cases it does not, because it admits an alternative interpretation as a differential in a natural spectral sequence. After defining Carnot groups and analysing their main properties, we will introduce the concept of the weight of a form, which produces a splitting of the exterior differential operator d. We shall see how the Rumin complex arises from this splitting and proceed to carry out the complete computations in some key examples.

From the third chapter onwards we will focus on Julg's paper, describing his new filtration and its relationship with the weight spectral sequence. We will study the connection between the spectral sequences and Rumin's complex in the n-dimensional Heisenberg group and the 7-dimensional quaternionic Heisenberg group, and then generalise the result to Carnot groups using the weight filtration. Finally, we shall explain why Julg required the independence of choices for some special Rumin operators, introducing the Szegő map and describing its main properties.
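As a brief sketch of the weight splitting mentioned above (standard notation for Carnot groups, assumed here rather than taken from the thesis): forms decompose by weight, and the exterior differential splits into pieces that raise the weight by a fixed amount.

```latex
% Forms on a Carnot group split by weight, \Omega^* = \bigoplus_w \Omega^{*,w},
% and the exterior differential decomposes accordingly:
d = d_0 + d_1 + d_2 + \cdots, \qquad
d_i \colon \Omega^{k,\,w} \longrightarrow \Omega^{k+1,\,w+i}.
% The lowest-order piece d_0 is algebraic (of order zero), and the Rumin
% complex is carried by the cohomology of d_0, with an induced differential d_c.
```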

Relevance: 10.00%

Abstract:

Almost ten years after “The Free Lunch Is Over”, the article in which the need to parallelize programs became a real and mainstream issue, a lot has happened:

• Processor manufacturers are reaching the physical limits of most of their approaches to boosting CPU performance, and are instead turning to hyperthreading and multicore architectures;
• Applications increasingly need to support concurrency;
• Programming languages and systems are increasingly forced to deal well with concurrency.

This thesis is an attempt to give an overview of a paradigm that aims to properly abstract the problem of propagating data changes: Reactive Programming (RP). This paradigm proposes an asynchronous, non-blocking approach to concurrency and computations, abstracting away from the low-level concurrency mechanisms.
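A minimal sketch of the change-propagation idea described above, in plain Python (the `Signal` and `derive` names are hypothetical, not from the thesis or any RP library): a value pushes its changes to subscribers, so derived values stay in sync without the caller recomputing them by hand.

```python
class Signal:
    """A mutable value that notifies subscribers whenever it changes."""

    def __init__(self, value):
        self._value = value
        self._subscribers = []

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        self._value = new_value
        # Push the change to every dependent computation.
        for callback in self._subscribers:
            callback(new_value)

    def subscribe(self, callback):
        self._subscribers.append(callback)


def derive(fn, *sources):
    """Build a Signal that is recomputed whenever any source changes."""
    out = Signal(fn(*(s.value for s in sources)))
    for s in sources:
        s.subscribe(lambda _v: setattr(
            out, "value", fn(*(src.value for src in sources))))
    return out


# Usage: `total` tracks price * quantity automatically.
price = Signal(10)
qty = Signal(3)
total = derive(lambda p, q: p * q, price, qty)
price.value = 12  # total becomes 36 with no explicit recomputation
```

Real RP libraries add scheduling, back-pressure and error channels on top of this push-based core; the sketch only shows the propagation itself.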

Relevance: 10.00%

Abstract:

The first part of this essay investigates the available and promising technologies for biogas and bio-hydrogen production from the anaerobic digestion of different organic substrates. It aims to show all the peculiarities of this complicated process, such as continuity, number of stages, moisture, biomass preservation and rate of feeding. The main outcome of this part is an awareness of the huge number of reactor configurations, each suitable for only a few types of substrate and circumstance. Among the most remarkable options one may consider, first of all, the wet continuous stirred-tank reactor (CSTR), suited to the high waste-production rates of urbanised and industrialised areas. Then there is the up-flow anaerobic sludge blanket reactor (UASB), aimed at biomass preservation in the case of highly heterogeneous feedstock, which can also be treated in a wise co-digestion scheme. On the other hand, smaller and scattered rural communities can be served either by wet low-rate digesters for homogeneous agricultural by-products (e.g. the fixed-dome digester) or by cheap dry batch reactors for lignocellulosic waste and energy crops (e.g. the hybrid batch-UASB). The biological and technical aspects raised in the first chapters are later supported by bibliographic research on the important and varied large-scale applications that the products of anaerobic digestion may have. After the upgrading techniques, particular care is devoted to their importance as biofuels, highlighting a further and more flexible solution consisting in reforming to syngas. Electricity generation and the associated heat conversion are then discussed, stressing the high potential of fuel cells (FC) as electricity converters. Last but not least, both use as a vehicle fuel and injection into the gas grid are considered promising applications. Consideration of the still significant issues of bio-hydrogen management (e.g. storage and delivery) leads to the conclusion that it would be far more challenging to implement than bio-methane, which can potentially “inherit” the assets of the similar fossil natural gas.

Building on this knowledge, a chapter is devoted to the energetic and financial study of a hybrid power system supplied by biogas and made up of different pieces of equipment (a natural gas thermocatalytic unit, a molten carbonate fuel cell and a combined-cycle gas turbine). A parallel analysis of a bio-methane-fed CCGT system is carried out in order to compare the two solutions. Both studies show that the apparent inconvenience of the hybrid system actually emphasises the importance of extending the computations to a broader reality, i.e. the upstream processes for biofuel production and the environmental and social drawbacks of fossil-derived emissions. Thanks to this “boundary widening”, one can see the hidden benefits of the hybrid system over the CCGT.
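The “boundary widening” argument can be sketched as a toy computation. Every number below is an invented placeholder purely for illustration, not a result from the thesis: the point is only that a comparison restricted to plant-level cost can reverse once upstream processes and emission-related drawbacks enter the balance.

```python
def total_cost(plant_cost, upstream_cost, emission_cost, widen_boundary):
    """Plant-level cost, optionally widened to upstream and emission costs."""
    if widen_boundary:
        return plant_cost + upstream_cost + emission_cost
    return plant_cost


# Placeholder figures (notionally EUR/MWh), purely illustrative.
hybrid = dict(plant_cost=95.0, upstream_cost=10.0, emission_cost=2.0)
ccgt = dict(plant_cost=80.0, upstream_cost=15.0, emission_cost=25.0)

narrow = (total_cost(**hybrid, widen_boundary=False),
          total_cost(**ccgt, widen_boundary=False))
wide = (total_cost(**hybrid, widen_boundary=True),
        total_cost(**ccgt, widen_boundary=True))

# Narrow boundary: the CCGT looks cheaper (80 < 95).
# Wide boundary: the hybrid comes out ahead (107 < 120).
```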

Relevance: 10.00%

Abstract:

Our generation of computational scientists lives in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards for how computational research should be conducted and published. From Euclid’s reasoning and Galileo’s experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of “replication by other scientists” applied to computations is more commonly known as “reproducible research”. In this context, the journal “EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems” had the exciting and original idea of letting scientists submit, together with the article, the computational materials (software, data, etc.) that were used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper, reproducing it on the platform independently of the chosen OS, to confirm or invalidate it, and above all to allow its reuse to produce new results. This procedure is of little help, however, without a minimum of methodological support: raw data sets and software are difficult to exploit without the logic that guided their use or production. This led us to think that, in addition to the data sets and the software, a further element must be provided: the workflow that ties them all together.
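A minimal sketch of the missing “workflow” element discussed above (the `run_workflow` helper and the step names are hypothetical, not the journal's actual submission format): an explicit, ordered list of named steps records the logic connecting the raw data to the final result, which the data and software alone would not convey.

```python
def run_workflow(steps, data):
    """Apply each named step to the data in order, recording the provenance."""
    provenance = []
    for name, fn in steps:
        data = fn(data)       # each step consumes the previous step's output
        provenance.append(name)
    return data, provenance


# Hypothetical pipeline: clean the raw measurements, then summarise them.
steps = [
    ("clean", lambda xs: [x for x in xs if x is not None]),
    ("summarise", lambda xs: sum(xs) / len(xs)),
]

result, provenance = run_workflow(steps, [1.0, None, 2.0, 3.0])
# result is the mean of the cleaned data; provenance records the step order,
# so a reviewer can re-run or audit the exact chain of computations.
```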