10 results for Hellinger-Reissner generalized variational principle in complementary energy form

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

The last decade has witnessed the establishment of a Standard Cosmological Model, which is based on two fundamental assumptions: the first is the existence of a new, non-relativistic kind of particle, i.e. Dark Matter (DM), which provides the potential wells in which structures form; the second is the presence of Dark Energy (DE), the simplest form of which is the Cosmological Constant Λ, which sources the accelerated expansion of our Universe. These two features are summarized by the acronym ΛCDM, the abbreviation used to refer to the present Standard Cosmological Model. Although the Standard Cosmological Model shows remarkably good agreement with most of the available observations, it presents some longstanding unsolved problems. A possible way to solve these problems is the introduction of a dynamical Dark Energy in the form of a scalar field ϕ. In coupled DE models, the scalar field ϕ features a direct interaction with matter in different regimes. Cosmic voids are large under-dense regions of the Universe devoid of matter. Being nearly empty of matter, their dynamics are expected to be dominated by DE, to whose nature the properties of cosmic voids should therefore be very sensitive. This thesis work is devoted to the statistical and geometrical analysis of cosmic voids in large N-body simulations of structure formation in the context of alternative, competing cosmological models. In particular, we used the ZOBOV code (see ref. Neyrinck 2008), a publicly available void-finder algorithm, to identify voids in the halo catalogues extracted from the CoDECS simulations (see ref. Baldi 2012). The CoDECS are, to date, the largest N-body simulations of interacting Dark Energy (DE) models. We identify suitable criteria to produce void catalogues with the aim of comparing the properties of these objects in interacting DE scenarios to the standard ΛCDM model at different redshifts. This thesis work is organized as follows: in chapter 1, the Standard Cosmological Model as well as the main properties of cosmic voids are introduced. In chapter 2, we present the scalar field scenario. In chapter 3, the tools, methods and criteria by which a void catalogue is created are described, while in chapter 4 we discuss the statistical properties of the cosmic voids included in our catalogues. In chapter 5, the geometrical properties of the catalogued cosmic voids are presented by means of their stacked profiles. In chapter 6, we summarize our results and propose further developments of this work.
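The stacked-profile analysis mentioned for chapter 5 can be illustrated with a short sketch. The following Python/NumPy function is a minimal, generic example of how tracer counts around many void centres can be rescaled by each void's effective radius and averaged into a single stacked density profile; the function name, arguments and binning choices are assumptions made for illustration and do not reproduce the thesis pipeline or the ZOBOV output format.

```python
import numpy as np

def stacked_void_profile(void_centres, void_radii, tracer_positions,
                         box_size, n_bins=20, r_max=3.0):
    """Stack tracer density profiles of many voids in units of r/R_v.

    void_centres     : (N_v, 3) array of void centre coordinates
    void_radii       : (N_v,)   array of effective void radii R_v
    tracer_positions : (N_t, 3) array of halo/tracer coordinates
    box_size         : side of the periodic simulation box
    """
    edges = np.linspace(0.0, r_max, n_bins + 1)          # bins in r/R_v
    mean_density = len(tracer_positions) / box_size**3    # global tracer density
    profiles = np.zeros((len(void_centres), n_bins))

    for i, (c, rv) in enumerate(zip(void_centres, void_radii)):
        # periodic (minimum-image) distances from the void centre, rescaled by R_v
        d = tracer_positions - c
        d -= box_size * np.round(d / box_size)
        r = np.linalg.norm(d, axis=1) / rv

        counts, _ = np.histogram(r, bins=edges)
        shell_vol = 4.0 / 3.0 * np.pi * np.diff(edges**3) * rv**3
        profiles[i] = counts / shell_vol / mean_density    # density contrast rho/rho_mean

    return 0.5 * (edges[1:] + edges[:-1]), profiles.mean(axis=0)
```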

Relevance:

100.00%

Publisher:

Abstract:

In the coming years a substantial upgrade of the LHC is expected, which foresees increasing the integrated luminosity by a factor of 10 with respect to the current one. This parameter is proportional to the number of collisions per unit time. For this reason, the computational resources needed at all levels of the reconstruction will grow considerably. Hence, the CMS collaboration has for some years been exploring the possibilities offered by heterogeneous computing, i.e. the practice of distributing the computation between CPUs and other dedicated accelerators, such as graphics cards (GPUs). One of the difficulties of this approach is the need to write, validate and maintain different code for every device on which it will have to run. This thesis presents the possibility of using SYCL to port event-reconstruction code so that it can run efficiently on different devices without substantial modifications. SYCL is an abstraction layer for heterogeneous computing that complies with the ISO C++ standard. This study focuses on the porting of an algorithm for clustering calorimetric energy deposits, CLUE, using oneAPI, the SYCL implementation supported by Intel. Initially, the algorithm was translated in its standalone version, mainly to become familiar with SYCL and for the convenience of comparing performance with the already existing versions. In this case, the performance is very similar to that of native CUDA code on the same hardware. To validate the physics, the algorithm was integrated into a reduced version of the framework used by CMS for reconstruction. The physics results are identical to those of the other implementations, while, in terms of computational performance, in some cases SYCL produces faster code than other abstraction layers adopted by CMS, making it an interesting option for the future of heterogeneous computing in high-energy physics.
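CLUE belongs to the family of density-based clustering algorithms. Purely as an illustration of that family (and not of the actual CMS implementation, nor of its SYCL/oneAPI port), the following NumPy sketch implements a generic density-peak style clustering: a local energy density is computed within a cutoff, each hit is linked to its nearest denser neighbour, and hits above assumed density and separation thresholds are promoted to seeds. All names, thresholds and the 2-D geometry are illustrative assumptions.

```python
import numpy as np

def density_peak_clustering(xy, energy, dc=1.0, rho_c=5.0, delta_c=2.0):
    """Toy density-peak clustering on one 2-D detector layer.

    xy      : (N, 2) hit positions
    energy  : (N,)   hit energies (weights for the local density)
    dc      : distance cut used to compute the local density
    rho_c   : minimum density for a hit to be promoted to a seed
    delta_c : minimum separation from any denser hit for a seed
    """
    n = len(xy)
    dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)

    # local energy density: sum of energies of neighbours within dc
    rho = np.where(dist < dc, energy[None, :], 0.0).sum(axis=1)

    # delta: distance to the nearest hit with strictly higher density
    delta = np.full(n, np.inf)
    nearest_higher = np.full(n, -1)
    for i in range(n):
        higher = np.where(rho > rho[i])[0]
        if higher.size:
            j = higher[np.argmin(dist[i, higher])]
            delta[i], nearest_higher[i] = dist[i, j], j

    # seeds start new clusters; the others follow their nearest denser hit
    labels = np.full(n, -1)
    seeds = np.where((rho >= rho_c) & (delta >= delta_c))[0]
    labels[seeds] = np.arange(seeds.size)
    for i in np.argsort(-rho):                 # process hits in descending density
        if labels[i] == -1 and nearest_higher[i] != -1:
            labels[i] = labels[nearest_higher[i]]
    return labels                               # -1 marks unassigned/outlier hits
```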

Relevance:

100.00%

Publisher:

Abstract:

Stress recovery techniques have been an active research topic in the last few decades since, in 1987, Zienkiewicz and Zhu proposed a procedure called Superconvergent Patch Recovery (SPR). This procedure is a least-squares fit of stresses at superconvergent points over patches of elements, and it leads to enhanced stress fields that can be used for evaluating finite element discretization errors. In subsequent years, numerous improved forms of this procedure were proposed, attempting to add equilibrium constraints to improve its performance. Later, another superconvergent technique, called Recovery by Equilibrium in Patches (REP), was proposed. In this case the idea is to impose equilibrium in a weak form over patches and to solve the resulting equations by a least-squares scheme. In recent years another procedure, based on the minimization of complementary energy and called Recovery by Compatibility in Patches (RCP), has been proposed. This procedure can in many ways be seen as the dual form of REP, as it essentially imposes compatibility in a weak form among a set of self-equilibrated stress fields. In this thesis a new insight into RCP is presented, and the procedure is improved with the aim of obtaining convergent second-order derivatives of the stress resultants. In order to achieve this result, two different strategies and their combination have been tested. The first one is to consider larger patches, in the spirit of what is proposed in [4]; the second one is to perform a second recovery on the recovered stresses. Some numerical tests in plane stress conditions are presented, showing the effectiveness of these procedures. Afterwards, a new recovery technique called Least Squares Displacements (LSD) is introduced. This new procedure is based on a least-squares interpolation of the nodal displacements resulting from the finite element solution. In fact, it has been observed that the major part of the error affecting the stress resultants is introduced when the shape functions are differentiated in order to obtain the strain components from the displacements. The procedure proves to be ultraconvergent and is extremely cost effective, as it requires as input only the nodal displacements coming directly from the finite element solution, avoiding any other post-processing otherwise needed to obtain the stress resultants with the traditional method. Numerical tests in plane stress conditions are then presented, showing that the procedure is ultraconvergent and leads to convergent first- and second-order derivatives of the stress resultants. Finally, the reconstruction of transverse stress profiles using First-order Shear Deformation Theory for laminated plates and the three-dimensional equilibrium equations is presented. The accuracy of this reconstruction depends on the accuracy of the first and second derivatives of the stress resultants, which is not guaranteed by most of the available low-order plate finite elements. The RCP and LSD procedures are then used to compute convergent first- and second-order derivatives of the stress resultants, ensuring the convergence of the reconstructed transverse shear and normal stress profiles, respectively. Numerical tests are presented and discussed, showing the effectiveness of both procedures.
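The least-squares patch fit at the heart of SPR-type recovery can be sketched in a few lines. The snippet below is a minimal illustration, assuming a single stress component, a 2-D patch and a complete linear polynomial basis; it is not the RCP or LSD procedure developed in the thesis.

```python
import numpy as np

def spr_patch_recovery(gauss_xy, gauss_stress, node_xy):
    """Least-squares patch fit of one stress component (SPR-type recovery).

    gauss_xy     : (n_g, 2) coordinates of superconvergent (Gauss) sampling points
    gauss_stress : (n_g,)   stress values sampled at those points
    node_xy      : (n_n, 2) coordinates of the patch nodes where the smoothed
                   stress is to be evaluated

    The recovered field is the complete linear polynomial
        sigma*(x, y) = a0 + a1*x + a2*y
    fitted in the least-squares sense over the patch.
    """
    gauss_xy = np.asarray(gauss_xy, float)
    node_xy = np.asarray(node_xy, float)

    def basis(xy):
        x, y = xy[:, 0], xy[:, 1]
        return np.column_stack([np.ones_like(x), x, y])

    P = basis(gauss_xy)                                   # (n_g, 3) design matrix
    a, *_ = np.linalg.lstsq(P, np.asarray(gauss_stress, float), rcond=None)
    return basis(node_xy) @ a                             # smoothed nodal values
```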

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, data handling and data analysis in High Energy Physics require a vast amount of computational power and storage. In particular, the worldwide LHC Computing Grid (LCG), an infrastructure and pool of services developed and deployed by a large community of physicists and computer scientists, proved to be a game changer in the efficiency of data analyses during Run-I at the LHC, playing a crucial role in the Higgs boson discovery. Recently, the Cloud computing paradigm has been emerging and reaching a considerable level of adoption by many different scientific organizations, and beyond. Clouds allow access to large, non-owned computing resources shared among many scientific communities. Considering the challenging requirements of LHC physics in Run-II and beyond, the LHC computing community is interested in exploring Clouds and seeing whether they can provide a complementary approach - or even a valid alternative - to the existing technological solutions based on the Grid. In the LHC community, several experiments have been adopting Cloud approaches, and in particular the experience of the CMS experiment is of relevance to this thesis. The LHC Run-II has just started, and Cloud-based solutions are already in production for CMS. However, other ways of using Clouds are being explored and are at the prototype level, such as the work done in this thesis. This effort is of paramount importance in equipping CMS with the capability to elastically and flexibly access and utilize the computing resources needed to face the challenges of Run-III and Run-IV. The main purpose of this thesis is to present forefront Cloud approaches that allow the CMS experiment to extend its computing to on-demand resources, dynamically allocated as needed. Moreover, direct access to Cloud resources is presented as a use case suitable for meeting the needs of the CMS experiment. Chapter 1 presents an overview of High Energy Physics at the LHC and of the CMS experience in Run-I, as well as the preparation for Run-II. Chapter 2 describes the current CMS Computing Model, and Chapter 3 presents the Cloud approaches pursued and used within the CMS Collaboration. Chapter 4 and Chapter 5 discuss the original and forefront work done in this thesis to develop and test working prototypes of elastic extensions of CMS computing resources on Clouds, and of HEP Computing “as a Service”. The impact of such work on benchmark CMS physics use cases is also demonstrated.
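As a purely illustrative companion to the idea of elastically extending CMS resources onto Clouds, the toy loop below sizes a hypothetical on-demand worker pool proportionally to the length of a job queue. Both helper functions are invented stand-ins; they do not correspond to any real CMS, WLCG or Cloud-provider API.

```python
import random
import time

# Hypothetical stand-ins for a site's batch-system query and Cloud provisioning
# call; they do not correspond to any real CMS or WLCG interface.
def pending_jobs() -> int:
    return random.randint(0, 500)            # simulated length of the job queue

def set_cloud_workers(n: int) -> None:
    print(f"(pretend) resizing Cloud worker pool to {n} nodes")

def elastic_loop(jobs_per_worker: int = 50, cycles: int = 5, poll_s: float = 1.0) -> None:
    """Toy control loop for the elastic-extension idea: size an on-demand Cloud
    worker pool proportionally to the number of queued jobs."""
    for _ in range(cycles):
        queued = pending_jobs()
        target = -(-queued // jobs_per_worker)   # ceil(queued / jobs_per_worker)
        set_cloud_workers(target)
        time.sleep(poll_s)

elastic_loop()
```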

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis work is to study the multi-frequency properties of the Ultra Luminous Infrared Galaxy (ULIRG) IRAS 00183-7111 (I00183) at z = 0.327, connecting ALMA sub-mm/mm observations with those at high energies in order to place constraints on the properties of its central power source and to verify whether the gas traced by the CO may be responsible for the obscuration observed in X-rays. I00183 was selected from the so-called Spoon diagnostic diagram (Spoon et al. 2007) for mid-infrared spectra of infrared galaxies, based on the equivalent width of the 6.2 μm Polycyclic Aromatic Hydrocarbon (PAH) emission feature versus the 9.7 μm silicate strength. Such features are a powerful tool to investigate the contribution of star formation and AGN activity in this class of objects. I00183 was selected from the top-left region of the plot, where the most obscured sources, characterized by a strong silicate absorption feature, are located. To link the sub-mm/mm to the X-ray properties of I00183, ALMA archival Cycle 0 data in Band 3 (87 GHz) and Band 6 (270 GHz) were calibrated and analyzed using the CASA software. ALMA Cycle 0 was the Early Science program, for which data reprocessing is strongly recommended. The main work of this thesis consisted of reprocessing the raw data to provide an improvement with respect to the available archival products and results, which were obtained using standard procedures. The high-energy data consist of Chandra, XMM-Newton and NuSTAR observations, which provide a broad coverage of the spectrum in the 0.5-30 keV energy range. Chandra and XMM-Newton archival data were used, with exposure times of 22 and 22.2 ks, respectively; their reduction was carried out using the CIAO and SAS software. The 100 ks NuSTAR data are still private, and the spectra were obtained courtesy of the PI (K. Iwasawa). A detailed spectral analysis was performed using the XSPEC software; the spectral shape was reproduced starting from simple phenomenological models, and then more physical models were introduced to account for the complex mechanisms at work in this source. In Chapter 1, an overview of the scientific background is given, with a focus on the target, I00183, and on the Spoon diagnostic diagram from which it was originally selected. In Chapter 2, the basic principles of interferometry are briefly introduced, with a description of the calibration theory applied to interferometric observations. In Chapter 3, ALMA and its capabilities, both current and future, are presented, also explaining the complex structure of the ALMA archive. In Chapter 4, the calibration of the ALMA data is presented and discussed, showing also the resulting imaging products. In Chapter 5, the analysis and discussion of the main results obtained from the ALMA data are presented. In Chapter 6, the X-ray observations, data reduction and spectral analysis are reported, with a brief introduction to the basic principles of X-ray astronomy and to the instruments with which the observations were carried out. Finally, the overall work is summarized, with particular emphasis on the main results obtained and on possible future perspectives.
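For the phenomenological starting point of the X-ray analysis, a minimal PyXspec sketch of an absorbed, redshifted power-law fit is given below. The file name, energy band, model string and starting parameter values are illustrative assumptions, not the models or datasets actually used in the thesis.

```python
# Minimal PyXspec sketch: fit an absorbed, redshifted power law to a reduced
# X-ray spectrum. The file name and starting values are placeholders.
from xspec import Fit, Model, Spectrum

spec = Spectrum("i00183_chandra.pha")        # hypothetical reduced spectrum
spec.ignore("**-0.5 10.0-**")                # keep the 0.5-10 keV band

model = Model("phabs*zpowerlw")              # Galactic absorption x redshifted power law
model.phabs.nH = 0.05                        # 10^22 cm^-2, placeholder Galactic column
model.zpowerlw.PhoIndex = 1.8                # typical AGN photon index as a starting value
model.zpowerlw.Redshift = 0.327              # redshift of I00183

Fit.statMethod = "cstat"                     # Poisson-appropriate fit statistic
Fit.perform()
print("fit statistic:", Fit.statistic, "d.o.f.:", Fit.dof)
```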

Relevance:

100.00%

Publisher:

Abstract:

In the last few decades, the offshore field has grown rapidly, especially following the notable development of technologies, the exploration of oil and gas in deep water, and the growing interest of offshore companies in renewable energy, mainly wind energy. Fatigue damage has been identified as one of the main problems causing failure of offshore structures. The purpose of this research is to evaluate the Stress Concentration Factor and its influence on the fatigue life of two tubular KT-joints in an offshore jacket structure, using different calculation methods. The work is carried out using analytical calculations, mainly Efthymiou's formulations, and numerical solutions (FEM analyses) using the ABAQUS software. As for the analytical formulations, the calculations were performed according to the geometrical parameters of each method using Excel sheets. As for the numerical models, two different types of tubular KT-joints are considered; for each, five shell-element, three solid-element and three solid-with-weld-element models were built in ABAQUS. Meshing followed the International Institute of Welding (IIW) recommendations, with five types of mesh elements, in order to evaluate the hot-spot stresses. 23 different unitary loading conditions were applied: 9 axial, 7 in-plane bending moment and 7 out-of-plane bending moment loads. The extraction of the hot-spot stresses and the evaluation of the Stress Concentration Factor were carried out using Python scripting and MATLAB. Then, the fatigue damage of a critical tubular KT-joint was evaluated with the Simplified Fatigue Damage Rule and with Local Approaches (strain damage parameter and stress damage parameter), according to the maximum Stress Concentration Factor obtained from the DNV and FEA methods. In conclusion, this research allowed us to compare Stress Concentration Factor and fatigue life results obtained with different methods, and provided a general overview of what to study next.
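To make the fatigue-damage step concrete, the sketch below applies the Palmgren-Miner rule to hot-spot stress ranges obtained by scaling nominal ranges with a Stress Concentration Factor. The S-N curve constants, the SCF and the load spectrum are placeholder values chosen only for illustration; they are not the DNV curve parameters or the results used in the thesis.

```python
import numpy as np

def fatigue_damage(scf, nominal_stress_ranges, cycle_counts,
                   log_a=12.48, m=3.0):
    """Palmgren-Miner damage sum from hot-spot stress ranges and a one-slope S-N curve.

    scf                   : stress concentration factor at the critical hot spot
    nominal_stress_ranges : nominal stress ranges [MPa]
    cycle_counts          : applied cycles for each stress range
    log_a, m              : S-N curve intercept and slope (log N = log_a - m log S);
                            placeholder values in the style of a hot-spot curve,
                            not the constants used in the thesis.
    """
    hot_spot = scf * np.asarray(nominal_stress_ranges, float)   # hot-spot stress ranges
    n_allow = 10.0 ** (log_a - m * np.log10(hot_spot))          # allowable cycles per range
    damage = np.sum(np.asarray(cycle_counts, float) / n_allow)  # Miner's rule
    return damage                                               # failure expected when D >= 1

# Example: one joint with SCF = 4 and three stress-range blocks
D = fatigue_damage(4.0, [40.0, 60.0, 80.0], [2e5, 5e4, 1e4])
print(f"cumulative damage D = {D:.3f}")
```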

Relevance:

100.00%

Publisher:

Abstract:

Cyanoacetylene, HC3N, is a molecule of great astronomical importance and has been observed in many interstellar environments. Its deuterated form, DC3N, has been detected in a number of sources, from external galaxies to Galactic interstellar clouds, star-forming regions and planetary atmospheres. All these detections relied on previous laboratory investigations, which, however, still lack some essential information concerning its infrared spectrum. In this project, high-resolution ro-vibrational spectra of DC3N have been recorded in two energy regions: 150–450 cm⁻¹ and 1800–2800 cm⁻¹. In the first window the ν7 ← GS, 2ν7 ← ν7, ν5 ← ν7, ν5+ν7 ← 2ν7, ν6+ν7 ← 2ν7 and 4ν7 ← 2ν7 bands have been assigned, while in the second region the three stretching fundamental bands ν1, ν2 and ν3 have been observed and analysed. The 150–450 cm⁻¹ spectra were recorded at the AILES beamline of the SOLEIL synchrotron (France), while the 1800–2800 cm⁻¹ spectra were recorded at the Department of Industrial Chemistry “Toso Montanari” in Bologna. In total, 2299 transitions have been assigned. These experimental transitions, together with data previously recorded for DC3N, were included in a least-squares fitting procedure from which several spectroscopic parameters, including rotational, vibrational and resonance constants, have been determined with high precision and accuracy. The spectroscopic data of DC3N have been compiled into a line catalog for this molecule, in order to assist future astronomical observations and data interpretation. A paper including this research work has been published (M. Melosso, L. Bizzocchi, A. Adamczyk, E. Canè, P. Caselli, L. Colzi, L. Dore, B. M. Giuliano, J.-C. Guillemin, M.-A. Martin-Drumel, O. Pirali, A. Pietropolli Charmet, D. Prudenzano, V. M. Rivilla, F. Tamassia, “Extensive ro-vibrational analysis of deuterated-cyanoacetylene (DC3N) from millimeter wavelengths to the infrared domain”, J. Quant. Spectrosc. Radiat. Transf. 254, 107221, 2020).
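The least-squares fitting step can be illustrated with a deliberately simplified example: a linear fit of a single band's line positions to a band origin and upper/lower-state rotational constants, neglecting centrifugal distortion and the resonances treated in the actual analysis. Function and variable names are assumptions for illustration.

```python
import numpy as np

def fit_band_constants(j_low, j_up, freqs):
    """Linear least-squares fit of one ro-vibrational band of a linear molecule.

    Model (centrifugal distortion neglected for brevity):
        nu = nu0 + B_up * J'(J'+1) - B_low * J''(J''+1)

    j_low, j_up : lower- and upper-state rotational quantum numbers of each line
    freqs       : assigned transition wavenumbers (same length, e.g. in cm^-1)
    Returns (nu0, B_up, B_low).
    """
    j_low = np.asarray(j_low, float)
    j_up = np.asarray(j_up, float)
    freqs = np.asarray(freqs, float)

    # design matrix: columns multiply nu0, B_up and B_low respectively
    A = np.column_stack([np.ones_like(freqs),
                         j_up * (j_up + 1.0),
                         -j_low * (j_low + 1.0)])
    params, *_ = np.linalg.lstsq(A, freqs, rcond=None)
    return tuple(params)
```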

Relevance:

100.00%

Publisher:

Abstract:

We present a new quantum description of the Oppenheimer-Snyder model for the gravitational collapse of a ball of dust. Starting from the geodesic equation for dust in spherical symmetry, we introduce a time-independent Schrödinger equation for the radius of the ball. The resulting spectrum is similar to that of the hydrogen atom, with Newtonian gravity in place of the Coulomb interaction. However, the non-linearity of General Relativity implies that the ground state is characterised by a principal quantum number proportional to the square of the ADM mass of the dust. For a ball with ADM mass much larger than the Planck scale, the collapse is therefore expected to end in a macroscopically large core, and the singularity predicted by General Relativity is avoided. Mathematical properties of the spectrum are investigated, and the ground state is found to have support essentially inside the gravitational radius, which makes it a quantum model for the matter core of Black Holes. In fact, the scaling of the ADM mass with the principal quantum number agrees with the Bekenstein area law and with the corpuscular model of Black Holes. Finally, the uncertainty on the size of the ground state is interpreted within the framework of an Uncertainty Principle.
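For orientation, the schematic relations below recall the standard gravitational-atom analogy and the scalings quoted in the abstract; the precise coefficients and the relativistic corrections are derived in the thesis and are not reproduced here.

```latex
% Schematic gravitational-atom analogy (Newtonian limit), not the thesis's exact result:
% for a constituent of mass $\mu$ bound to a total mass $M$, the hydrogen-like levels read
\[
  E_n \simeq -\,\frac{G^2 M^2 \mu^3}{2\,\hbar^2\, n^2}\,,
  \qquad n = 1, 2, \dots
\]
% The ground state quoted in the abstract corresponds to a principal quantum number
% growing with the square of the ADM mass, in line with the Bekenstein area law
% and the corpuscular picture of black holes, with support inside the gravitational radius:
\[
  n_{\rm ground} \;\propto\; \frac{M^2}{m_{\rm p}^2}\,,
  \qquad
  R \;\sim\; R_{\rm H} = \frac{2\,G M}{c^2}\,.
\]
```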

Relevance:

100.00%

Publisher:

Abstract:

The discovery of the neutrino mass is direct evidence of new physics. Several questions arise from this observation, regarding the mechanism that originates the neutrino masses and their hierarchy, the violation of lepton number conservation and the generation of the baryon asymmetry. These questions can be addressed by the experimental search for neutrinoless double beta (0νββ) decay, a nuclear decay consisting of two simultaneous beta emissions without the emission of the two antineutrinos. 0νββ decay is possible only if neutrinos are identical to antineutrinos, namely if they are Majorana particles. Several experiments are searching for 0νββ decay. Among these, CUORE employs 130Te embedded in TeO2 bolometric crystals. The experiment needs an accurate understanding of the background contribution in the energy region around the Q-value of 130Te. One of the main contributions is given by particles from the decay chains of contaminating nuclei (232Th, 235-238U) present in the active crystals or in the support structure. This thesis uses the 1 ton yr CUORE dataset to study these contaminations by looking for events belonging to sub-chains of the Th and U decay chains and reconstructing their energy and time-difference distributions in a delayed coincidence analysis. These results, in combination with studies on simulated data, are then used to evaluate the contaminations. This is the first time such an analysis has been applied to CUORE data; this thesis demonstrates its feasibility and provides a starting point for further studies. Part of the obtained results agrees with those from previous analyses, demonstrating that delayed coincidence searches might improve the understanding of the CUORE experiment background. This kind of delayed coincidence analysis can also be reused in the future, once data from CUPID, the CUORE upgrade, are ready to be analyzed, with the aim of improving the sensitivity to the 0νββ decay of 100Mo.
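A delayed coincidence search of this kind can be sketched generically: select candidate parent events in one energy window, then look in the same crystal for a daughter event within a maximum delay, and histogram the time differences. The function below is a toy NumPy version with assumed inputs and windows; the real CUORE analysis additionally handles pile-up, efficiencies and multiple sub-chains.

```python
import numpy as np

def delayed_coincidences(times, channels, energies,
                         parent_window, daughter_window, dt_max):
    """Toy delayed-coincidence search in a segmented bolometric detector.

    times, channels, energies : per-event arrays (time [s], crystal id, energy [keV])
    parent_window             : (E_min, E_max) accepted for the parent decay
    daughter_window           : (E_min, E_max) accepted for the delayed daughter decay
    dt_max                    : maximum delay accepted between the two events [s]

    Returns the time differences of the selected pairs; their distribution
    should follow the exponential decay law of the intermediate nucleus.
    """
    t = np.asarray(times, float)
    ch = np.asarray(channels)
    e = np.asarray(energies, float)
    order = np.argsort(t)
    t, ch, e = t[order], ch[order], e[order]

    is_parent = (e >= parent_window[0]) & (e <= parent_window[1])
    is_daughter = (e >= daughter_window[0]) & (e <= daughter_window[1])

    delays = []
    for i in np.where(is_parent)[0]:
        # candidate daughters: same crystal, later in time, within dt_max
        j = np.where(is_daughter & (ch == ch[i]) &
                     (t > t[i]) & (t - t[i] <= dt_max))[0]
        delays.extend(t[j] - t[i])
    return np.array(delays)
```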

Relevance:

100.00%

Publisher:

Abstract:

In the framework of the energy transition, acquiring proper knowledge of the fundamental aspects characterizing the use of alternative fuels is paramount, as is the development of optimized know-how and technologies. In this sense, the use of hydrogen has been indicated as a promising route for decarbonization at the end-user stage of the energy supply chain. However, the elevated reactivity and the low density of hydrogen at atmospheric conditions pose new challenges. Among the possible routes, the dilution of hydrogen with carbon dioxide from carbon capture and storage systems is an option; however, the interactions between these species have been poorly studied so far. For these reasons, this thesis, carried out in collaboration between the University of Bologna and the Technische Universität Bergakademie Freiberg in Saxony (Germany), investigates the laminar flames of hydrogen-based premixed gases diluted with carbon dioxide. An experimental system, called a heat flux burner, was adopted at different operating conditions. The presence of the cellularity phenomenon, forming so-called cellular flames, was observed and analysed. Theoretical and visual methods have allowed the characterization of the investigated flames, opening new alternatives for sustainable energy production via hydrogen transformation.
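In the heat flux method, the laminar burning velocity is commonly identified as the unburned-gas velocity at which the burner-plate radial temperature profile becomes flat (zero net heat flux). The sketch below, with hypothetical function and variable names, fits the parabolic coefficient of the plate temperature profile for each velocity and interpolates it to zero; it is only an illustration of the principle, not the processing actually used in the thesis.

```python
import numpy as np

def laminar_burning_velocity(radii, plate_temps_per_velocity, velocities):
    """Heat-flux-method estimate of the laminar burning velocity (illustrative).

    radii                    : radial positions of the thermocouples in the plate [m]
    plate_temps_per_velocity : list of temperature arrays, one per unburned-gas velocity [K]
    velocities               : unburned-gas velocities of the recorded profiles [m/s]

    Each plate temperature profile is fitted as T(r) = T_c + C * r**2; the burning
    velocity is taken as the velocity at which the parabolic coefficient C crosses
    zero, i.e. where the profile is flat and the net heat flux vanishes.
    """
    r2 = np.asarray(radii, float) ** 2
    A = np.column_stack([np.ones_like(r2), r2])        # basis for [T_c, C]
    coeffs = []
    for temps in plate_temps_per_velocity:
        (_, c), *_ = np.linalg.lstsq(A, np.asarray(temps, float), rcond=None)
        coeffs.append(c)

    # interpolate the velocity at which C = 0 (sorting keeps np.interp well defined)
    coeffs = np.asarray(coeffs)
    velocities = np.asarray(velocities, float)
    order = np.argsort(coeffs)
    return float(np.interp(0.0, coeffs[order], velocities[order]))
```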