9 results for Fermi-density distribution with two parameters

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

Unlike metallic materials, composite structures lack standard verification procedures, which calls for continuous research in this field, aimed at significant results that may culminate in a standardisation of procedures. The research behind this thesis was carried out in this context at the DASML laboratory of TU Delft, in the Netherlands. The material studied is a prepreg (pre-impregnated) composed of carbon fibres (M30SC) and an epoxy matrix (DT120), with the particular layup [0°/90°/±45°/±45°/90°/0°]. The adhesive used for bonding is of the epoxy type (FM94K). The material was assembled in the laboratory to produce the specimens to be tested, of the DCB, ENF and CCP types. Two different qualities of the same material were obtained: a good one, produced by following the manufacturer's instructions, and a poor one, produced by modifying the suggested manufacturing process, which results in a bond of markedly lower quality than the first. The aim was to study the behaviour of both qualities under two different loading modes, Mode I (opening mode) and Mode II (shear mode), through both quasi-static and fatigue tests, so as to obtain mutually comparable results that will make it possible, in the future, to determine whether a material is of good quality before proceeding with the design of the whole structure. The approach chosen to study the development of delamination is an adaptation of Linear Elastic Fracture Mechanics (LEFM) theory.
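As an illustration of the LEFM-based data reduction commonly used for opening-mode tests of this kind, the sketch below computes the Mode I strain energy release rate G_I from DCB test readings with the standard Modified Beam Theory formula of ASTM D5528; the specimen numbers are hypothetical, not taken from the thesis:

```python
# Modified Beam Theory (MBT) reduction for a DCB test (ASTM D5528):
#   G_I = 3 * P * delta / (2 * b * (a + |Delta|))
# where P is the load, delta the load-point opening, b the specimen width,
# a the crack length, and Delta a crack-tip rotation correction.
def mode_I_serr(load_N, opening_m, width_m, crack_length_m, delta_corr_m=0.0):
    """Mode I strain energy release rate G_I in J/m^2."""
    return (3.0 * load_N * opening_m
            / (2.0 * width_m * (crack_length_m + abs(delta_corr_m))))

# Hypothetical specimen readings, for illustration only:
G_I = mode_I_serr(load_N=60.0, opening_m=3.0e-3,
                  width_m=25.0e-3, crack_length_m=50.0e-3)
print(f"G_I = {G_I:.1f} J/m^2")  # -> G_I = 216.0 J/m^2
```

In a fatigue test, the same reduction is applied at each cycle block so that the delamination growth rate can be plotted against G_I.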

Relevance: 100.00%

Abstract:

Ordinary matter accounts for only a few percent of the total mass-energy of the Universe, which is instead largely dominated by "dark" components. The standard model used to describe them is the LambdaCDM model. Although it appears consistent with most of the currently available data, it suffers from some fundamental problems that remain unsolved to this day, leaving room for the study of alternative cosmological models. This thesis aims to study a recently proposed model, called "Multi-coupled Dark Energy" (McDE), which features modified interactions with respect to the LambdaCDM model. In particular, Dark Matter is composed of two different particle species with opposite couplings to a scalar field responsible for Dark Energy. The evolution of the background and of linear perturbations turns out to be indistinguishable from that of the LambdaCDM model. This thesis presents, for the first time, a series of "zoomed" numerical simulations. These feature several regions at different resolutions, centred on a single cluster of interest, which make it possible to study a single structure in detail without excessively increasing the required computing time. A code called ZInCo, which I developed specifically for this thesis, is also presented for the first time. The code produces initial conditions suitable for cosmological simulations, with different resolution regions, independent of the chosen cosmological model and preserving all the features of the power spectrum imposed on them. The ZInCo code was used to produce initial conditions for a series of numerical simulations of the McDE model, which show for the first time, thanks to the high resolution achieved, that the segregation effect in clusters occurs significantly earlier than previously estimated.
Moreover, the radial density profiles obtained show a central flattening in the early stages of segregation. This latter effect could help solve the "cusp-core" problem of the LambdaCDM model and place limits on the possible coupling values.
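The "central flattening" mentioned above can be made concrete with a toy comparison (illustrative profiles, not the thesis simulations): a cuspy NFW-like density has logarithmic slope tending to -1 at small radii, whereas a cored profile flattens to slope 0, which is the behaviour the cusp-core problem calls for:

```python
import numpy as np

# Cuspy NFW profile: rho ~ 1 / [(r/rs) * (1 + r/rs)^2]  -> inner slope -1
# Cored pseudo-isothermal profile: rho ~ 1 / (1 + (r/rc)^2) -> inner slope 0
def nfw(r, rs=1.0, rho0=1.0):
    x = r / rs
    return rho0 / (x * (1.0 + x) ** 2)

def cored(r, rc=1.0, rho0=1.0):
    return rho0 / (1.0 + (r / rc) ** 2)

r = np.logspace(-2, 1, 50)
# logarithmic slope d ln(rho) / d ln(r), evaluated numerically
slope_nfw = np.gradient(np.log(nfw(r)), np.log(r))
slope_cored = np.gradient(np.log(cored(r)), np.log(r))
print(slope_nfw[0], slope_cored[0])  # cusp: ~ -1; core: ~ 0
```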

Relevance: 100.00%

Abstract:

Slope failure occurs in many areas throughout the world, and it becomes an important problem when it interferes with human activity, with disasters that cause loss of life and property damage. In this research we investigate slope failure through centrifuge modeling, in which a reduced-scale model, N times smaller than the full-scale prototype, is used while the acceleration is increased N times (compared with the gravitational acceleration) to preserve the stress and strain behavior. The aims of this research, "Centrifuge modeling of sandy slopes", are, in brief: 1) to test the reliability of centrifuge modeling as a tool to investigate the behavior of sandy slope failure; 2) to understand how the failure mechanism is affected by changing the slope angle, and to obtain useful information for design. In order to achieve this scope, the work is arranged as follows. Chapter one: centrifuge modeling of slope failure. In this chapter we provide a general view of the context in which we are working. Basically, we explain what a slope failure is, how it happens and which tools are available to investigate this phenomenon. Afterwards we introduce the technology used to study this topic, that is, the geotechnical centrifuge. Chapter two: testing apparatus. In the first section of this chapter we describe all the procedures and facilities used to perform a test in the centrifuge. Then we explain the characteristics of the soil (Nevada sand), such as the dry unit weight, water content and relative density, and its strength parameters (c, φ), which were determined in the laboratory through triaxial tests. Chapter three: centrifuge tests. This part of the document presents all the results from the tests performed in the centrifuge. By results we mean the acceleration at failure for each model tested and its failure surface. In our case study we tested models with the same soil and geometric characteristics but different slope angles.
The angles tested in this research were 60°, 75° and 90°. Chapter four: slope stability analysis. We introduce the features and the concept of the software ReSSA (2.0). This software allows us to calculate the theoretical failure surfaces of the prototypes. We then show the comparisons between the experimental failure surfaces of the prototypes, traced in the laboratory, and those calculated by the software. Chapter five: conclusion. The conclusion presents the results obtained in relation to the two main aims mentioned above.
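The scaling principle described above (a 1/N-scale model spun at N times gravity) can be sketched as follows, assuming the standard scaling law stress = ρ·g·h; the density, depth and scale factor are hypothetical values, not those of the tests:

```python
# Centrifuge scaling sketch: vertical stress at depth h is rho * g * h.
# In a 1/N model, h scales as 1/N while g scales as N, so the product
# (and hence the stress field) matches the full-scale prototype.
def vertical_stress(rho, g, depth):
    return rho * g * depth  # Pa

N = 50                       # scale factor (hypothetical)
rho = 1600.0                 # dry density of a sand, kg/m^3 (illustrative)
g = 9.81                     # m/s^2
h_prototype = 5.0            # prototype depth, m
h_model = h_prototype / N    # model depth: 0.1 m

s_proto = vertical_stress(rho, g, h_prototype)
s_model = vertical_stress(rho, N * g, h_model)
print(s_proto, s_model)      # equal: the stress state is preserved
```

This is why the acceleration at failure measured on a small model can be mapped directly to the stability of a much larger prototype slope.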

Relevance: 100.00%

Abstract:

The purpose of this study is to investigate two candidate waveforms for next-generation wireless systems: filtered Orthogonal Frequency Division Multiplexing (f-OFDM) and Universal Filtered Multi-Carrier (UFMC). The evaluation is based on the power spectral density of the signal and on performance measurements in synchronous and asynchronous transmission. In f-OFDM we implement a soft truncated filter with a length of 1/3 of the OFDM symbol. In UFMC we use a Dolph-Chebyshev filter, limited to the length of the zero padding (ZP). The simulation results demonstrate that both waveforms have better spectral behaviour than conventional OFDM. However, the inter-symbol interference (ISI) induced by the filter in f-OFDM, and the inter-carrier interference (ICI) induced in UFMC by the cyclic prefix (CP) reduction, should be kept under control. In addition, in a synchronous transmission case with ideal parameters, f-OFDM and UFMC show performance similar to OFDM. When a carrier frequency offset (CFO) is imposed on the transmission, UFMC outperforms OFDM and f-OFDM.
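A minimal sketch of the UFMC-style filtering described above, assuming a Dolph-Chebyshev FIR prototype whose length equals the zero-padding interval (the FFT size, filter length and sidelobe level are illustrative, not the parameters of the study):

```python
import numpy as np
from scipy.signal import windows

# UFMC-style filtering sketch: an IFFT-modulated symbol is convolved with
# a Dolph-Chebyshev filter of length equal to the zero padding (ZP).
n_fft = 256
zp = 16                                # ZP / filter length (assumed)
f = windows.chebwin(zp, at=60)         # Dolph-Chebyshev taps, 60 dB sidelobes
f = f / np.sum(f)                      # normalise DC gain

rng = np.random.default_rng(0)
# random QPSK-like symbols on all subcarriers (illustrative)
qam = rng.choice([-1, 1], n_fft) + 1j * rng.choice([-1, 1], n_fft)
symbol = np.fft.ifft(qam, n_fft)       # one multicarrier symbol
tx = np.convolve(symbol, f)            # filtered symbol, length n_fft + zp - 1
print(tx.shape)
```

The filtered symbol fits within the n_fft + ZP samples that a CP-OFDM symbol would occupy, which is what makes the comparison with conventional OFDM fair.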

Relevance: 100.00%

Abstract:

In this thesis project, I present stationary models of rotating fluids with toroidal distributions that can be used to represent the central obscurers of active galactic nuclei (AGN), i.e. molecular tori (Combes et al., 2019), as well as geometrically thick accretion discs, such as ADAF discs (Narayan and Yi, 1995) or Polish doughnuts (Abramowicz, 2005). In particular, I study stationary rotating systems with a more general baroclinic distribution (with a vertical gradient of the angular velocity), which are often more realistic, but less studied due to their complexity, than barotropic ones (with cylindrical rotation), which are easier to construct. In the thesis, I compute analytically the main intrinsic and projected properties of power-law tori based on the potential-density pairs of Ciotti and Bertin (2005). I study the density distribution and the resulting gravitational potential for different values of α in the range 2 < α < 5. For the same models, I compute the surface density of the systems when seen face-on and edge-on. I then apply the stationary Euler equations to obtain the rotational velocity and temperature distributions of the self-gravitating models in the absence of an external gravitational potential. I also consider the power-law tori with a central black hole in addition to the gas self-gravity and, solving the stationary Euler equations analytically, I compute how the properties of the system are modified by the black hole and how they vary as a function of the black hole mass. Finally, applying the Solberg-Høiland criterion, I show that these baroclinic stationary models are linearly stable in the absence of the black hole. In the presence of the black hole I derive the analytical condition for stability, which depends on α and on the black hole mass. I also study the stability of the tori under the hypothesis that they are weakly magnetised, finding that they are always subject to the resulting instability.
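As a rough illustration of the face-on projection step, the sketch below integrates a *hypothetical* stand-in toroidal density along the line of sight; it is not the Ciotti and Bertin (2005) potential-density pair, only a simple density that vanishes on the symmetry axis and decays as a power law r^(-α) at large radii:

```python
import numpy as np

def rho(R, z, alpha=3.0, a=1.0, rho0=1.0):
    # Hypothetical toroidal density (NOT the thesis model): zero on the
    # z-axis (R = 0), peaked on a ring, falling off as ~ r^(-alpha).
    r2 = R ** 2 + z ** 2
    return rho0 * R ** 2 / (a ** 2 + r2) ** (alpha / 2.0 + 1.0)

# Face-on surface density Sigma(R) = integral of rho(R, z) over z,
# evaluated numerically along the line of sight.
z = np.linspace(-50, 50, 20001)
R = np.array([0.0, 0.5, 1.0, 2.0])
Sigma = np.array([np.trapz(rho(Ri, z), z) for Ri in R])
print(Sigma)  # zero on the axis, peaked off-centre: a torus in projection
```

The edge-on projection works the same way, integrating along a horizontal line of sight instead of the symmetry axis.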

Relevance: 100.00%

Abstract:

In the Massive IoT vision, millions of devices need to be connected to the Internet through a wireless access technology. However, current IoT-focused standards are not fully prepared for this future. In this thesis, a novel approach to non-orthogonal techniques for Random Access, which is the main bottleneck in high-density systems, is proposed. First, the most popular wireless access standards are presented, with a focus on Narrowband-IoT. Then, the Random Access procedure as implemented in NB-IoT is analysed. The Non-Orthogonal Random Access technique is presented next, along with two potential algorithms for the detection of non-orthogonal preambles. Finally, the performance of the proposed solutions is evaluated through numerical simulations.
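As a toy example of the preamble-detection problem (not the thesis' non-orthogonal algorithms; note also that NB-IoT's actual NPRACH uses frequency-hopping single-tone preambles), the sketch below detects which Zadoff-Chu preambles are present in a received superposition by correlating against each candidate:

```python
import numpy as np

def zadoff_chu(root, length):
    # Zadoff-Chu sequence of odd length (a common random-access preamble)
    n = np.arange(length)
    return np.exp(-1j * np.pi * root * n * (n + 1) / length)

L = 139                          # odd prime preamble length (illustrative)
roots = [1, 2, 3, 5]
preambles = {u: zadoff_chu(u, L) for u in roots}

# Two devices transmit simultaneously; the base station sees the sum
# of their preambles plus noise.
rng = np.random.default_rng(1)
active = [2, 5]
rx = sum(preambles[u] for u in active)
rx = rx + 0.05 * (rng.standard_normal(L) + 1j * rng.standard_normal(L))

# Detection: correlate with every candidate preamble and threshold.
# For prime L the cross-correlation between distinct roots is only
# sqrt(L), so matched roots stand out clearly.
scores = {u: abs(np.vdot(p, rx)) / L for u, p in preambles.items()}
detected = sorted(u for u, s in scores.items() if s > 0.5)
print(detected)  # -> [2, 5]
```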

Relevance: 100.00%

Abstract:

Despite the success of the ΛCDM model in describing the Universe, a possible tension between early- and late-Universe cosmological measurements is calling for new independent cosmological probes. Amongst the most promising ones, gravitational waves (GWs) can provide a self-calibrated measurement of the luminosity distance. However, to obtain cosmological constraints, additional information is needed to break the degeneracy between parameters in the gravitational waveform. In this thesis, we exploit the latest LIGO-Virgo-KAGRA Gravitational Wave Transient Catalog (GWTC-3) of GW sources to constrain the background cosmological parameters together with the astrophysical properties of Binary Black Holes (BBHs), using information from their mass distribution. We expand the public code MGCosmoPop, previously used for the application of this technique, by implementing a state-of-the-art model for the mass distribution, needed to account for the presence of non-trivial features: a truncated power law with two additional Gaussian peaks, referred to as Multipeak. We then analyse GWTC-3, comparing this model with simpler and more commonly adopted ones, both with fixed and with varying cosmology, and assess their goodness of fit with different model selection criteria, as well as their constraining power on the cosmological and population parameters. We also begin to explore different sampling methods, namely Markov Chain Monte Carlo and Nested Sampling, comparing their performance and evaluating the advantages of each. We find concurring evidence that the Multipeak model is favoured by the data, in line with previous results, and show that this conclusion is robust to variation of the cosmological parameters. We find a constraint on the Hubble constant of H0 = 61.10 (+38.65/−22.43) km/s/Mpc (68% C.L.), which shows the potential of this method for providing independent constraints on cosmological parameters. The results obtained in this work have been included in [1].
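The Multipeak mass model described above (a truncated power law plus two Gaussian peaks) can be sketched as a probability density; the parameter values below are illustrative placeholders, not the GWTC-3 fit:

```python
import numpy as np

def multipeak_pdf(m, alpha=3.4, m_min=5.0, m_max=87.0,
                  mu1=34.0, s1=3.0, mu2=68.0, s2=3.0,
                  lam1=0.05, lam2=0.01):
    """Truncated power law plus two Gaussian peaks ("Multipeak"-style).

    All parameter values are hypothetical placeholders.
    lam1, lam2 are the mixing weights of the two peaks.
    """
    m = np.asarray(m, dtype=float)
    # truncated power law m^(-alpha), normalised on [m_min, m_max]
    pl = np.where((m >= m_min) & (m <= m_max), m ** (-alpha), 0.0)
    norm = (m_min ** (1 - alpha) - m_max ** (1 - alpha)) / (alpha - 1)
    pl = pl / norm

    def gauss(mu, s):
        return np.exp(-0.5 * ((m - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

    return (1 - lam1 - lam2) * pl + lam1 * gauss(mu1, s1) + lam2 * gauss(mu2, s2)

m = np.linspace(2, 100, 4901)
p = multipeak_pdf(m)
total = np.trapz(p, m)
print(total)  # ~1: the mixture is approximately normalised
```

In a hierarchical population analysis, a density of this form enters the likelihood for every event, which is how the mass features end up constraining H0.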

Relevance: 100.00%

Abstract:

Tsunamis are rare events. However, their impact can be devastating, and it may extend over large geographical areas. For low-probability, high-impact events like tsunamis, it is crucial to implement all possible actions to mitigate the risk. Tsunami hazard assessment is the result of a scientific process that integrates traditional geological methods, numerical modelling, and the analysis of tsunami sources and historical records. For this reason, analysing past events and understanding how they interacted with the land is the only way to inform tsunami source and propagation models, and to quantitatively test forecast models such as hazard analyses. The primary objective of this thesis is to establish an explicit relationship between the macroscopic intensity, derived from historical descriptions, and the quantitative physical parameters measuring tsunami waves. This is done first by defining an approximate estimation method, based on a simplified 1D physical onshore propagation model, to convert the available observations into one reference physical metric. Wave height at the coast was chosen as the reference due to its stability and its independence from inland effects. This method was then applied to a set of well-known past events to build a homogeneous dataset containing both macroseismic intensity and wave height. By performing an orthogonal regression, a direct and invertible empirical relationship could be established between the two parameters, accounting for their relevant uncertainties. The resulting relationship is extensively tested and finally applied to the Italian Tsunami Effect Database (ITED), providing a homogeneous estimation of the wave height for all existing tsunami observations in Italy. This provides the opportunity for meaningful comparisons with models and simulations, as well as for quantitatively testing tsunami hazard models for the Italian coasts and informing tsunami risk management initiatives.
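The orthogonal-regression step can be sketched with synthetic data (hypothetical slope and intercept, with errors on both variables, which is what distinguishes orthogonal regression from ordinary least squares), e.g. using SciPy's ODR implementation:

```python
import numpy as np
from scipy import odr

# Synthetic stand-in for the intensity / wave-height dataset (NOT the
# thesis data): log wave height depends linearly on intensity, and both
# variables carry measurement error.
rng = np.random.default_rng(0)
true_a, true_b = 0.5, -1.0                 # hypothetical slope / intercept
intensity = rng.uniform(2, 6, 40)
log_h = true_a * intensity + true_b
x = intensity + rng.normal(0, 0.3, 40)     # uncertain intensity
y = log_h + rng.normal(0, 0.1, 40)         # uncertain log wave height

# Orthogonal distance regression accounts for errors on both axes,
# so the fitted relationship can be inverted consistently.
model = odr.Model(lambda beta, xx: beta[0] * xx + beta[1])
data = odr.RealData(x, y, sx=0.3, sy=0.1)
fit = odr.ODR(data, model, beta0=[1.0, 0.0]).run()
a, b = fit.beta
print(a, b)  # close to the true slope and intercept
```

Because the relationship is fitted orthogonally, it can be applied in either direction: from historical intensities to wave heights, or from modelled wave heights back to expected intensities.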