8 results for CMF, molecular cloud, extraction algorithm

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

The aim of this work was to study the dense cloud structures and to obtain the mass distribution of the dense cores (the core mass function, CMF) within the NGC6357 complex, from observations of the dust continuum at 450 and 850~$\mu$m of a 30 $\times$ 30 arcmin$^2$ region containing the H\textsc{ii} regions G353.2+0.9 and G353.1+0.6.
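
The abstract does not state the functional form used for the CMF; purely as context, a common parametrization in the literature is a power law in core mass,
\[
\frac{dN}{dM} \propto M^{-\alpha},
\]
where $N$ is the number of cores of mass $M$ and $\alpha$ is the slope to be fitted (for reference, the Salpeter value for the stellar initial mass function is $\alpha \approx 2.35$).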

Relevance:

100.00%

Publisher:

Abstract:

Feedback from the most massive components of a young stellar cluster deeply affects the surrounding ISM, driving an expanding, over-pressured hot gas cavity into it. In spiral galaxies these structures may have enough energy to break out of the disk and eject large amounts of material into the halo. The cycling of this gas, which eventually falls back onto the disk, is known as a galactic fountain. We aim to better understand the dynamics of such fountain flows in a Galactic context, to frame the problem in a more dynamic environment, possibly learning about its connection to, and regulation by, the local driving mechanism, and to understand its role as a channel for metal diffusion. The interaction of the fountain with a hot corona is analyzed here, in an attempt to understand the properties and evolution of the extraplanar material. We perform high-resolution hydrodynamical simulations with the moving-mesh code AREPO to model the multi-phase ISM of a Milky Way-type galaxy. A non-equilibrium chemical network is included to self-consistently follow the evolution of the main coolants of the ISM. Spiral-arm perturbations in the potential are included so that large molecular gas structures can form dynamically, self-shielded from the interstellar radiation field. We model the effect of SN feedback from a newborn stellar cluster inside such a giant molecular cloud as the driving force of the fountain. Passive Lagrangian tracer particles are used in conjunction with the SN energy deposition to model and study the diffusion of freshly synthesized metals. We find that interactions with the hot coronal gas and the properties and motions of the local ISM are equally important in shaping the fountain. We observe a bimodal morphology in which most of the ejected gas is in a cold, clumpy $10^4$ K state, while the majority of the affected volume is occupied by a hot diffuse medium. Only about 20\% of the produced metals stay local; most of them quickly diffuse through this hot phase out to large scales.
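
To illustrate the tracer-based metal bookkeeping described above, here is a minimal sketch, entirely separate from the AREPO setup: passive tracers are tagged with metal mass at a supernova site, advected through a toy velocity field, and the fraction of metals remaining local is then measured. The flow field and every parameter below are illustrative assumptions, not the simulation's.

    import numpy as np

    rng = np.random.default_rng(0)
    n_tracers = 10_000
    # tracers start clustered at the SN injection site (positions in pc)
    pos = rng.normal(0.0, 10.0, size=(n_tracers, 3))
    # each tracer carries an equal share of the synthesized metal mass
    metal_mass = np.full(n_tracers, 1.0 / n_tracers)

    dt, n_steps = 0.1, 500  # Myr per step, toy integration
    for _ in range(n_steps):
        # toy flow: radial outflow plus turbulent kicks, standing in for
        # the fountain velocity field a real simulation would provide
        r = np.linalg.norm(pos, axis=1, keepdims=True) + 1e-6
        v = 5.0 * pos / r + rng.normal(0.0, 2.0, size=pos.shape)  # pc/Myr
        pos += v * dt

    # fraction of metals still "local" (within 100 pc of the injection site)
    local = np.linalg.norm(pos, axis=1) < 100.0
    print(f"local metal fraction: {metal_mass[local].sum():.2f}")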

Relevance:

30.00%

Publisher:

Abstract:

Complex-network analysis is a very popular topic in computer science. Unfortunately, the networks extracted from different contexts are usually very large, and their analysis can be difficult: computing metrics on these structures may be very expensive. Among all such analyses, we focus on the extraction of subnetworks called communities: groups of nodes that probably play the same role within the whole structure. Community extraction is of interest in many different fields (biology, economics, ...). In this work we present a parallel community detection algorithm that can operate on networks with huge numbers of nodes and edges. After an introduction to graph theory and high-performance computing, we explain our design strategies and our implementation. We then show a performance evaluation carried out on a distributed-memory architecture, namely the IBM BlueGene/Q supercomputer "Fermi" at the CINECA supercomputing centre, Italy, and comment on our results.
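
The abstract does not name the specific detection scheme, so as a stand-in here is a minimal serial sketch of label propagation, one of the simplest community detection algorithms, using networkx for the graph. The thesis parallelizes detection across MPI ranks, which this sketch does not attempt.

    import random
    import networkx as nx

    def label_propagation(G, max_iter=100, seed=0):
        rng = random.Random(seed)
        labels = {v: v for v in G}           # start: each node its own community
        nodes = list(G)
        for _ in range(max_iter):
            rng.shuffle(nodes)
            changed = False
            for v in nodes:
                if not G[v]:                 # skip isolated nodes
                    continue
                # adopt the most frequent label among neighbours
                counts = {}
                for u in G[v]:
                    counts[labels[u]] = counts.get(labels[u], 0) + 1
                best = max(counts, key=counts.get)
                if best != labels[v]:
                    labels[v], changed = best, True
            if not changed:                  # converged: no label changed
                break
        return labels

    G = nx.karate_club_graph()
    communities = label_propagation(G)
    print(len(set(communities.values())), "communities found")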

Relevance:

30.00%

Publisher:

Abstract:

The aim of my thesis is to parallelize the Weighted Histogram Analysis Method (WHAM), a popular algorithm used to calculate the free energy of a molecular system in Molecular Dynamics simulations. WHAM works in post-processing in cooperation with another technique called Umbrella Sampling. Umbrella Sampling adds a bias to the potential energy of the system in order to force it to sample a specific region of configurational space. N independent simulations are performed in order to sample the whole region of interest. Subsequently, the WHAM algorithm is used to estimate the original system's free energy starting from the N atomic trajectories. The parallelization of WHAM has been carried out with CUDA, a language that allows programs to run on the parallel architecture of NVIDIA GPUs. The parallel implementation can speed up WHAM execution considerably compared to previous serial CPU implementations. However, the WHAM CPU code shows timing bottlenecks when the number of interactions becomes very large. The algorithm has been written in C++ and executed on UNIX systems equipped with NVIDIA graphics cards. The results were satisfactory, with a performance increase when the model was executed on graphics cards of higher compute capability. Nonetheless, the GPUs used to test the algorithm are quite old and not designed for scientific computation. A further performance increase would likely be obtained if the algorithm were executed on clusters of GPUs with a high level of computational efficiency. The thesis is organized as follows: I first describe the mathematical formulation of Umbrella Sampling and of the WHAM algorithm, with their applications to the study of ion channels and to Molecular Docking (Chapter 1); then I present the CUDA architectures used to implement the model (Chapter 2); finally, the results obtained on model systems are presented (Chapter 3).
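
For context, the core of WHAM is a self-consistent iteration over the biased histograms, and the double loop over windows and bins is the part that maps naturally onto GPU threads. Below is a minimal serial NumPy sketch of that standard iteration, not the thesis code: array names and convergence settings are illustrative assumptions, and bias energies are expressed in units of kT.

    import numpy as np

    def wham(hist, bias, n_iter=5000, tol=1e-7):
        # hist[k, b]: histogram counts of window k in bin b
        # bias[k, b]: bias energy w_k at the bin centre, in kT units
        n_win, n_bins = hist.shape
        N = hist.sum(axis=1)                    # samples per window
        f = np.zeros(n_win)                     # free-energy shifts f_k (kT)
        for _ in range(n_iter):
            # P(b) = sum_k n_k(b) / sum_k N_k exp(f_k - w_k(b))
            denom = (N[:, None] * np.exp(f[:, None] - bias)).sum(axis=0)
            p = hist.sum(axis=0) / denom
            # exp(-f_k) = sum_b P(b) exp(-w_k(b))
            f_new = -np.log((p[None, :] * np.exp(-bias)).sum(axis=1))
            f_new -= f_new[0]                   # fix the arbitrary offset
            if np.max(np.abs(f_new - f)) < tol:
                break
            f = f_new
        return p / p.sum(), f                   # unbiased distribution, shifts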

Relevance:

30.00%

Publisher:

Abstract:

The present work belongs to the PRANA project, the first extensive field campaign of observation of atmospheric emission spectra covering the far-infrared (FIR) spectral region for more than two years. The principal deployed instrument is REFIR-PAD, a Fourier transform spectrometer that we use to study Antarctic cloud properties. A dataset covering the whole of 2013 has been analyzed. First, a selection of good-quality spectra is performed, using radiance values in a few chosen spectral regions as thresholds. These spectra are described in a synthetic way by averaging radiances in selected intervals, converting them into brightness temperatures (BTs) and finally taking the differences between each pair of them. A supervised feature selection algorithm is implemented in order to select the features that are genuinely informative about the presence, phase and type of cloud. Training and test sets are then collected by means of Lidar quick-looks. The supervised classification of the overall monthly datasets is performed with a Support Vector Machine (SVM). On the basis of this classification, and with the help of Lidar observations, 29 non-precipitating ice-cloud case studies are selected. A single spectrum, or at most an average over two or three spectra, is processed with the retrieval algorithm RT-RET, exploiting a few main IR window channels in order to extract cloud properties. The retrieved effective radii and optical depths are analyzed, both for comparison with literature studies and to evaluate possible seasonal trends. Finally, the atmospheric profiles output by the retrieval are used as inputs for simulations, assuming two different crystal habits, with the aim of examining our ability to reproduce radiances in the FIR. Substantial mis-estimates are found for the FIR micro-windows: a high variability is observed in the spectral pattern of the deviations of the simulations from the measured spectra, and an effort has been made to link these deviations to cloud parameters.
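
As a hedged illustration of the classification step only, here is a minimal scikit-learn sketch: brightness-temperature differences as features, Lidar-derived class labels as targets. The arrays are random placeholders, and the feature dimensionality, class set and SVM settings are assumptions, not the thesis configuration.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 10))    # BT-difference features (placeholder)
    y_train = rng.integers(0, 3, size=200)  # e.g. clear / ice cloud / mixed phase
    X_month = rng.normal(size=(5000, 10))   # one month of quality-screened spectra

    # standardize features, then fit an RBF-kernel SVM on the labelled set
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_train, y_train)
    labels = clf.predict(X_month)           # per-spectrum cloud class
    print(np.bincount(labels))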

Relevance:

30.00%

Publisher:

Abstract:

This thesis aims to assess the similarities and mismatches between the outputs of two independent methods for cloud-cover quantification and classification that rest on quite different physical bases. One is the SAFNWC software package, designed to process radiance data acquired by the SEVIRI sensor in the VIS/IR. The other is the MWCC algorithm, which uses the brightness temperatures acquired by the AMSU-B and MHS sensors in their channels centred on the MW water-vapour absorption band. In a first stage, their cloud detection capability has been tested by comparing the cloud masks they produce. These show good agreement between the two methods, although some critical situations stand out: the MWCC fails to reveal clouds that, according to SAFNWC, are fractional, cirrus, very low, or high opaque clouds. In the second stage of the inter-comparison, the pixels classified as cloudy by both software packages have been compared class by class. The overall tendency of the MWCC method is an overestimation of the lower cloud classes. Conversely, the higher the cloud top, the larger the cloud portion that the MWCC misses but the SAFNWC tool detects. The same picture emerges from a series of tests carried out using the cloud-top height information to evaluate the height ranges within which each MWCC category is defined. Therefore, although the two methods are intended to provide the same kind of information, in reality they return quite different details on the same atmospheric column. The SAFNWC retrieval, being very sensitive to the cloud-top temperature, returns the actual level reached by the cloud. The MWCC, exploiting the penetrating capability of microwaves, is able to give information about levels located more deeply within the atmospheric column.
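
A minimal sketch of the first-stage mask comparison, under the assumption that the two masks have already been collocated onto a common pixel grid; the random arrays below merely stand in for the SAFNWC and MWCC cloud masks.

    import numpy as np

    rng = np.random.default_rng(0)
    # placeholder binary masks standing in for the two products
    mask_safnwc = rng.random((512, 512)) < 0.40
    mask_mwcc   = rng.random((512, 512)) < 0.35

    both    = np.sum(mask_safnwc & mask_mwcc)    # cloudy in both
    only_s  = np.sum(mask_safnwc & ~mask_mwcc)   # seen by SAFNWC only
    only_m  = np.sum(~mask_safnwc & mask_mwcc)   # seen by MWCC only
    neither = np.sum(~mask_safnwc & ~mask_mwcc)  # clear in both

    agreement = (both + neither) / mask_safnwc.size
    print(f"agreement: {agreement:.2%}, SAFNWC-only: {only_s}, MWCC-only: {only_m}")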

Relevance:

30.00%

Publisher:

Abstract:

Surface-based measurement systems play a key role in defining the ground truth for climate modelling and satellite product validation. The Italian-French station Concordia has operated year-round since 2005 at Dome C (75°S, 123°E, 3230 m) on the East Antarctic Plateau. A Baseline Surface Radiation Network (BSRN) site was deployed there and has been operational since January 2006, measuring the downwelling components of the radiation budget; it was expanded in April 2007 to measure upwelling radiation as well. Hence, almost a decade of measurements is now available, suitable for defining a statistically significant climatology for the radiation budget of Concordia, including possible trends, by specifically assessing the effects of clouds and water vapour on the SW and LW net radiation. A well-known and robust clear-sky identification algorithm (Long and Ackerman, 2000) has been applied operationally to the downwelling SW components to identify cloud-free events and to fit a parametric equation determining the clear-sky reference over the Antarctic daylight periods (September to April). A new model for surface broadband albedo has been developed in order to better describe the features of the area. A novel clear-sky LW parametrization, based on a priori assumptions about the inversion-layer structure combined with the daily and annual oscillations of the surface temperature, has then been adopted and validated. The longwave-based method is subsequently exploited to extend the cloud radiative forcing (CRF) study to the nighttime period (winter). The results indicate an inter-annual and intra-annual warming behaviour, 13.70 W/m$^2$ on average, approaching a neutral effect in summer, when the SW CRF compensates the LW CRF, and warming over the rest of the year due predominantly to the CRF induced on the LW component.
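
For context, the surface cloud radiative forcing referred to above is conventionally defined as the difference between the measured all-sky net flux and the clear-sky reference; sign conventions vary, and this generic form is shown only as an assumption about the standard definition:
\[
\mathrm{CRF} = \left( F^{\downarrow}_{\mathrm{SW}} - F^{\uparrow}_{\mathrm{SW}} + F^{\downarrow}_{\mathrm{LW}} - F^{\uparrow}_{\mathrm{LW}} \right)_{\mathrm{all\ sky}} - \left( F^{\downarrow}_{\mathrm{SW}} - F^{\uparrow}_{\mathrm{SW}} + F^{\downarrow}_{\mathrm{LW}} - F^{\uparrow}_{\mathrm{LW}} \right)_{\mathrm{clear\ sky}},
\]
so a positive CRF indicates that clouds warm the surface, and the SW and LW terms can be separated to give the SW CRF and LW CRF discussed in the abstract.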

Relevance:

30.00%

Publisher:

Abstract:

This thesis project aims at developing an algorithm for obstacle detection and for the interaction between the safety areas of an Automated Guided Vehicle (AGV) and a map derived from a point cloud, within the context of a CAD software package. The first part of the project focuses on the implementation of a clipping algorithm for general polygons, which makes it possible to construct the safety-area polygon, derive the sweep of this area along the navigation path by performing a union, and detect intersections with the lines or polygons representing obstacles. The second part concerns the construction of a map in terms of geometric entities (lines and polygons) starting from the point cloud given by a 3D scan of the environment. The point cloud is processed using filters, clustering algorithms and concave/convex-hull-derived algorithms in order to extract the line and polygon entities representing obstacles. Finally, the last part uses the a priori knowledge of possible obstacle detections on a given segment to predict the behaviour of the AGV, and exploits this prediction to optimize the choice of the vehicle's assigned velocity on that segment, minimizing the travel time.
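
As a hedged illustration of the safety-area sweep and intersection test, here is a minimal sketch using shapely as a stand-in (the thesis implements its own general-polygon clipping inside the CAD environment); the path, safety half-width and obstacle below are invented example data.

    from shapely.geometry import LineString, Polygon

    # AGV navigation segment and safety half-width (invented example data)
    path = LineString([(0, 0), (5, 0), (8, 3)])
    safety_half_width = 0.8                                # metres

    # sweep of the safety area along the path: the buffer is the union of
    # the area translated along every point of the segment (flat end caps)
    swept_area = path.buffer(safety_half_width, cap_style=2)

    # polygonal obstacle as it might come out of the point-cloud processing
    obstacle = Polygon([(4, -0.5), (6, -0.5), (6, 0.5), (4, 0.5)])

    hit = swept_area.intersection(obstacle)
    print("collision risk:", not hit.is_empty, "- overlap area:", round(hit.area, 3))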