5 results for Sandbox - physical models

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

80.00%

Abstract:

The aim of this thesis work is to study the multi-frequency properties of the Ultra-Luminous Infrared Galaxy (ULIRG) IRAS 00183-7111 (I00183) at z = 0.327, connecting ALMA sub-mm/mm observations with those at high energies in order to place constraints on the properties of its central power source and to verify whether the gas traced by the CO may be responsible for the obscuration observed in X-rays. I00183 was selected from the so-called Spoon diagnostic diagram (Spoon et al. 2007) for mid-infrared spectra of infrared galaxies, which plots the equivalent width of the 6.2 μm Polycyclic Aromatic Hydrocarbon (PAH) emission feature against the 9.7 μm silicate strength. These features are a powerful tool to investigate the relative contributions of star formation and AGN activity in this class of objects. I00183 lies in the top-left region of the plot, where the most obscured sources, characterized by a strong Si absorption feature, are located.

To link the sub-mm/mm and X-ray properties of I00183, ALMA archival Cycle 0 data in Band 3 (87 GHz) and Band 6 (270 GHz) were calibrated and analyzed using the CASA software. ALMA Cycle 0 was the Early Science program, for which data reprocessing is strongly recommended. The main work of this thesis consisted of reprocessing the raw data to improve on the available archival products and results, which had been obtained using standard procedures. The high-energy data consist of Chandra, XMM-Newton and NuSTAR observations, which provide broad coverage of the spectrum in the 0.5-30 keV energy range. Chandra and XMM-Newton archival data were used, with exposure times of 22 and 22.2 ks, respectively; their reduction was carried out using the CIAO and SAS software. The 100 ks NuSTAR data are still proprietary, and the spectra were obtained by courtesy of the PI (K. Iwasawa). A detailed spectral analysis was performed using the XSPEC software: the spectral shape was first reproduced with simple phenomenological models, and more physical models were then introduced to account for the complex mechanisms at work in this source.

In Chapter 1, an overview of the scientific background is given, with a focus on the target, I00183, and the Spoon diagnostic diagram from which it was originally selected. In Chapter 2, the basic principles of interferometry are briefly introduced, with a description of the calibration theory applied to interferometric observations. In Chapter 3, ALMA and its capabilities, both current and future, are presented, together with an explanation of the complex structure of the ALMA archive. In Chapter 4, the calibration of the ALMA data is presented and discussed, along with the resulting imaging products. In Chapter 5, the main results obtained from the ALMA data are analyzed and discussed. In Chapter 6, the X-ray observations, data reduction and spectral analysis are reported, with a brief introduction to the basic principles of X-ray astronomy and to the instruments with which the observations were carried out. Finally, the overall work is summarized, with particular emphasis on the main results obtained and on possible future perspectives.
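As an illustration of the spectral fitting step described above, the following is a minimal sketch in PyXspec (the Python interface to XSPEC); the spectrum file names, the energy band cut, and the choice of phenomenological model components are assumptions for illustration, not the actual configuration used in the thesis.

# Minimal PyXspec sketch of a joint phenomenological fit; file names and
# model components are illustrative assumptions.
from xspec import AllData, Model, Fit

# Load the Chandra, XMM-Newton and NuSTAR spectra as separate data groups
AllData("1:1 chandra.pha 2:2 xmm.pha 3:3 nustar.pha")
AllData.ignore("**-0.5 30.0-**")   # keep the 0.5-30 keV band

# Phenomenological starting point: Galactic absorption, intrinsic
# absorption at the source redshift, a power law and an Fe K Gaussian
m = Model("phabs*zphabs*(zpowerlw + zgauss)")
m.zphabs.Redshift = 0.327          # redshift of I00183
m.zpowerlw.Redshift = 0.327
m.zgauss.Redshift = 0.327
m.zgauss.LineE = 6.4               # rest-frame Fe K-alpha energy (keV)

Fit.statMethod = "cstat"           # robust for low-count spectra
Fit.perform()
Fit.show()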

Relevance:

80.00%

Abstract:

Cosmic X-ray background synthesis models (Gilli 2007) require a significant fraction of obscured AGN, some of which are expected to be heavily obscured (Compton-thick), but the number density of obscured sources found observationally is still an open issue (Vignali 2010, 2014). This thesis work takes advantage of recent NuSTAR data and is based on a multiwavelength research approach. Gruppioni et al. (2016) compared, for a sample of local 12 micron Seyfert galaxies, the AGN bolometric luminosity derived from SED decomposition with the same quantity obtained from the 2-10 keV luminosity (IPAC-NED). For some sources, these quantities differ by up to two orders of magnitude. Thus, the intrinsic X-ray luminosity obtained by correcting for the obscuration may have been underestimated. In this thesis we have tested this hypothesis by re-analysing the X-ray spectra of three of these sources (UGC05101, NGC1194 and NGC3079), for which observations from NuSTAR and Chandra and/or XMM-Newton were available. This allows the analysis to be extended to energies above 10 keV and the AGN column density to be estimated as reliably as possible. For the spectral fitting we made use of the widely used XSPEC package, together with the two recent MYTorus and BNtorus physical models. The wide available bandpass allowed us to obtain new and more robust insights into the X-ray spectral properties of these sources. The measured absorption column densities are highly suggestive of heavy obscuration. After correcting the X-ray AGN luminosity for the obscuration estimated through our spectral analysis, we compared the L(X) values in the 2-10 keV band with those derived from the MIR band by means of the relation of Gandhi (2009). As expected, the values derived from this relation are in good agreement with those we measured, indicating that the column densities had been underestimated in previous works in the literature.
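As a minimal sketch of the final consistency check, the snippet below inverts a MIR-to-X-ray relation of the form used by Gandhi (2009) to predict the intrinsic 2-10 keV luminosity from the nuclear 12 micron luminosity. The coefficients are indicative values quoted from memory and should be verified against the paper; the input luminosity is a hypothetical example.

# Sketch of the MIR vs. X-ray luminosity comparison; the coefficients of the
# assumed relation log(L_MIR/1e43) = A + B*log(L_X/1e43) are indicative only.
import math

A, B = 0.19, 1.11   # indicative Gandhi (2009)-like coefficients (assumption)

def lx_from_mir(l_mir):
    """Predicted intrinsic 2-10 keV luminosity (erg/s) from the
    nuclear 12 micron luminosity (erg/s), inverting the relation."""
    return 10.0 ** (43.0 + (math.log10(l_mir / 1e43) - A) / B)

# Hypothetical nuclear MIR luminosity of 3e43 erg/s
print(f"Predicted L_X(2-10 keV) ~ {lx_from_mir(3e43):.2e} erg/s")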

Relevance:

30.00%

Abstract:

In the collective imagination, a robot is a human-like machine, like the androids of science fiction. However, the robots encountered most frequently are machines that do work that is too dangerous, boring, or onerous for humans. Most of the robots in the world are of this type; they can be found in the automotive, medical, manufacturing, and space industries. A robot, then, is a system comprising sensors, control systems, manipulators, power supplies, and software, all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, including the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object, i.e. the physical attributes that may influence a grasp. Humans solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine learning perspective, inferring the grasp of an object from information about already known objects. But humans can select the best grasp from a vast repertoire by considering not only the physical attributes of the object but also the effect they want to obtain. For this reason, our study in the area of robot manipulation focuses on grasping and on integrating symbolic tasks with data gathered through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This representation has several advantages: it takes into account the uncertainty of the real world, allowing sensor noise to be handled; it encodes a notion of causality; and it provides a unified network for learning. Since the network currently implemented is based on human expert knowledge, it is very interesting to implement an automated method for learning its structure, as more tasks and object features may be introduced in the future, and a complex network designed only from human expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modeled by the human expert, implement a feasible structure learning approach, and compare the results with the network designed by the expert, in order to possibly enhance it.
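To make the structure learning step concrete, the following is a minimal sketch using the pgmpy library, with hill-climbing search guided by a BIC score; the data file and column names are hypothetical placeholders for the sensor and task variables of the thesis.

# Score-based structure learning sketch with pgmpy; the CSV file and its
# columns (object features, task, outcome) are hypothetical placeholders.
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

# Discretized observations, e.g. columns: size, shape, grasp_type, task, success
data = pd.read_csv("grasp_observations.csv")

# Hill-climbing search over DAGs, scored with BIC
search = HillClimbSearch(data)
learned_dag = search.estimate(scoring_method=BicScore(data))

# Compare the learned edges with the expert-designed network
print("Learned edges:", sorted(learned_dag.edges()))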

Relevance:

30.00%

Abstract:

The Scilla rock avalanche occurred on 6 February 1783 along the coast of the Calabria region (southern Italy), close to the Messina Strait. It was triggered by a mainshock of the Terremoto delle Calabrie seismic sequence, and it induced a tsunami wave responsible for more than 1500 casualties along the neighboring Marina Grande beach. The main goal of this work is the application of semi-analytical and numerical models to simulate this event. The first is a MATLAB code, written expressly for this work, that solves the equations of motion for particles sliding on a two-dimensional surface with a fourth-order Runge-Kutta method. The second is a code developed by the Tsunami Research Team of the Department of Physics and Astronomy (DIFA) of the University of Bologna that describes a slide as a chain of blocks able to interact while sliding down a slope, adopting a Lagrangian point of view. An extensive description of landslide phenomena, and in particular of earthquake-induced landslides with tsunamigenic potential, is given in the first part of the work. Subsequently, the physical and mathematical background is presented; in particular, a detailed study of the discretization of derivatives is provided. The dynamics of a point mass sliding on a surface is then described, together with several applications of the numerical and analytical models to ideal topographies. In the last part, the dynamics of points sliding on a surface and interacting with each other is presented, again with applications to an ideal topography. Finally, the applications to the 1783 Scilla event are shown and discussed.
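The thesis code is written in MATLAB; as a language-neutral illustration of the method, the Python sketch below integrates the along-slope equation of motion of a single point mass with Coulomb friction using a fourth-order Runge-Kutta step. The slope profile, friction coefficient, and time step are placeholder assumptions, not the Scilla topography.

# RK4 sketch for a point mass sliding down a slope with Coulomb friction;
# the slope profile and parameters are illustrative, not the Scilla data.
import math

g, mu = 9.81, 0.2   # gravity (m/s^2) and friction coefficient (assumption)

def slope_angle(s):
    """Local slope angle (rad): 35 degrees easing to flat over ~1 km."""
    return math.radians(35.0) * math.exp(-s / 1000.0)

def deriv(state):
    """Time derivatives (ds/dt, dv/dt) of the along-slope state (s, v)."""
    s, v = state
    theta = slope_angle(s)
    sign = math.copysign(1.0, v if v != 0.0 else 1.0)  # friction opposes motion
    a = g * (math.sin(theta) - mu * math.cos(theta) * sign)
    return (v, a)

def rk4_step(state, dt):
    """One fourth-order Runge-Kutta step of size dt."""
    s, v = state
    k1 = deriv(state)
    k2 = deriv((s + 0.5 * dt * k1[0], v + 0.5 * dt * k1[1]))
    k3 = deriv((s + 0.5 * dt * k2[0], v + 0.5 * dt * k2[1]))
    k4 = deriv((s + dt * k3[0], v + dt * k3[1]))
    return (s + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0,
            v + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0)

state, dt = (0.0, 0.0), 0.1   # start at rest at the crest
for _ in range(600):          # 60 s of motion
    state = rk4_step(state, dt)
print(f"runout ~ {state[0]:.0f} m, final speed ~ {state[1]:.1f} m/s")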

Relevance:

30.00%

Abstract:

In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and Cyber-Physical Systems (CPS) are nowadays a category of particular interest: these are systems composed of a cyber part (computer-based algorithms) that monitors and controls physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and the physical entities: many models, written in different languages, need to exchange information with each other. Normally, an orchestrator takes care of simulating the models and exchanging the information. This orchestrator is developed manually, which is tedious and time-consuming work. Our proposal is to generate the orchestrator automatically through co-modeling, i.e. by modeling the coordination itself. Before achieving this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied a technology employed for co-simulation in industry: FMI. To better understand the FMI standard, I implemented an automatic export, in the FMI format, of the models produced by TimeSquare, an existing tool for discrete modeling. I also developed a simple physical model in the existing open-source OpenModelica tool. Finally, to understand how an orchestrator works, I developed a simple one, which will be useful in the future for generating an orchestrator automatically.
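As a concrete sketch of what such a hand-written orchestrator does, the snippet below implements a fixed-step co-simulation master for two FMUs using the FMPy library; the FMU file names and the exchanged variable names (u, y) are hypothetical placeholders.

# Fixed-step co-simulation master sketch using FMPy; FMU files and the
# exchanged variable names are hypothetical placeholders.
from fmpy import read_model_description, extract
from fmpy.fmi2 import FMU2Slave

def load_slave(path, name):
    """Extract an FMU, instantiate it as a co-simulation slave and
    return the instance plus a name -> value-reference map."""
    md = read_model_description(path)
    fmu = FMU2Slave(guid=md.guid,
                    unzipDirectory=extract(path),
                    modelIdentifier=md.coSimulation.modelIdentifier,
                    instanceName=name)
    fmu.instantiate()
    fmu.setupExperiment(startTime=0.0)
    fmu.enterInitializationMode()
    fmu.exitInitializationMode()
    vrs = {v.name: v.valueReference for v in md.modelVariables}
    return fmu, vrs

# Hypothetical cyber (controller) and physical (plant) models
ctrl, ctrl_vr = load_slave("controller.fmu", "ctrl")
plant, plant_vr = load_slave("plant.fmu", "plant")

t, h, t_end = 0.0, 0.01, 10.0
while t < t_end:
    # exchange data at the communication point, then advance both slaves
    u = ctrl.getReal([ctrl_vr["u"]])[0]      # controller output
    plant.setReal([plant_vr["u"]], [u])
    y = plant.getReal([plant_vr["y"]])[0]    # plant measurement
    ctrl.setReal([ctrl_vr["y"]], [y])
    ctrl.doStep(currentCommunicationPoint=t, communicationStepSize=h)
    plant.doStep(currentCommunicationPoint=t, communicationStepSize=h)
    t += h

for fmu in (ctrl, plant):
    fmu.terminate()
    fmu.freeInstance()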