928 results for Ground sloths


Relevance: 20.00%

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a goal and provide results such as "the goal is derivable from the KB (of the theory)". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic we usually try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather similar to human reasoning: indeed, human reasoning requires an extension of first-order logic able to reach a conclusion from premises that are not definitely true and belong to an incomplete set of knowledge. Thus, we implemented a defeasible-logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). Applications of this kind are useful in the legal domain, especially if they offer an implementation of an argumentation framework that provides a formal model of a game. Roughly speaking, let the theory be the set of laws, and let a key claim be the conclusion that one of the parties wants to prove (and the other wants to defeat); by adding dynamic assertion of rules, i.e. facts put forward by the parties, we can then play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the strategies performed by the players. Implementing a game model requires one more meta-interpreter, able to evaluate the defeasible-logic framework: indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a meta-level containing different meta-evaluators: the first has been explained above, the second is needed to run the game model, and the last is used to change the game-execution and tree-derivation strategies.
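
Roughly, the derivability check described above can be sketched as follows. This is a drastically simplified, illustrative Python rendering of defeasible derivation (no strict rules or defeaters, propositional literals only); it is not the thesis's meta-interpreter, and all rule and literal names are made up. A goal is defeasibly provable if some applicable rule supports it and every applicable rule for its negation is beaten via the superiority relation.

```python
# Minimal propositional defeasible-reasoning sketch (illustrative only).
# A rule is (name, antecedents, consequent); "~p" denotes the negation of "p".

def neg(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def provable(goal, facts, rules, superior, seen=frozenset()):
    """Defeasibly prove `goal`: facts win outright; otherwise some applicable
    rule for `goal` must be superior to every applicable rule for neg(goal)."""
    if goal in facts:
        return True
    if goal in seen:              # avoid cyclic derivations
        return False
    seen = seen | {goal}
    applicable = lambda r: all(provable(a, facts, rules, superior, seen)
                               for a in r[1])
    pro = [r for r in rules if r[2] == goal and applicable(r)]
    con = [r for r in rules if r[2] == neg(goal) and applicable(r)]
    return any(all((s[0], c[0]) in superior for c in con) for s in pro)

# Toy theory: birds fly, penguins don't, and the penguin rule is superior.
facts = {"bird", "penguin"}
rules = [("r1", ["bird"], "flies"), ("r2", ["penguin"], "~flies")]
superior = {("r2", "r1")}
print(provable("flies", facts, rules, superior))   # False: r2 defeats r1
print(provable("~flies", facts, rules, superior))  # True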

Relevance: 20.00%

Abstract:

One of the main research areas in artificial intelligence concerns the design of agents (in particular, robots) able to help or replace humans in the execution of certain activities. To this end, two different design methods can be followed: manual design and automatic design. The latter may be preferred to the former in contexts where requirements such as flexibility and adaptation must be taken into account, which are often essential for carrying out non-trivial tasks in real-world settings. Automatic design considers a model with which to represent the agent's behaviour and a search (or learning) technique that iteratively modifies the model in order to make it as suitable as possible for the task at hand. In this work, the model used to represent the robot's behaviour is a Boolean network (also known as a Kauffman network). This model was chosen because its simple structure makes the complex dynamics that nevertheless arise within it easy to study. Moreover, the recent literature shows that network models, such as artificial neural networks, have proven effective in robot programming. The methodology for evolving this model relies on metaheuristic search techniques able to find good solutions in limited time despite large search spaces. Previous works have already demonstrated the applicability of the methodology and investigated it on a single robot. The aim of this work is to provide a proof of principle for a set of robots, opening new avenues for design in swarm robotics. In this scenario, simple autonomous agents, interacting with one another, give rise to coordinated behaviour, accomplishing tasks impossible for a single unit. This work also provides useful and interesting opportunities for studying interactions between Boolean networks: each robot is controlled by a Boolean network that determines its output as a function of its own internal configuration as well as of the inputs received from neighbouring robots. We define a task in which the swarm must discriminate between two different patterns on the floor of the arena using only locally exchanged information. After a first series of preliminary experiments, which allowed us to identify the parameters and the best search algorithm, we simplified the problem instance in order to better investigate the criteria that can affect performance. A particular combination of information was thus identified which, when exchanged locally among robots, improves performance. This hypothesis was confirmed by subsequently applying the result to a harder instance of the problem. The work concludes by suggesting new tools for the study of emergent phenomena in contexts where Boolean networks interact with one another.
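
To make the model concrete, here is a minimal random Boolean network in Python. This is an illustrative sketch, not the thesis's evolved controller: the node count, in-degree and the idea of clamping some nodes with sensor bits received from neighbours are all assumptions.

```python
import random

class BooleanNetwork:
    """Random Boolean (Kauffman) network: n nodes, each reading k inputs."""
    def __init__(self, n, k, seed=0):
        rng = random.Random(seed)
        self.state = [rng.randint(0, 1) for _ in range(n)]
        self.inputs = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
        # Each node's Boolean function, stored as a truth table over k inputs.
        self.tables = [[rng.randint(0, 1) for _ in range(2 ** k)]
                       for _ in range(n)]

    def step(self, clamped=None):
        """One synchronous update; `clamped` maps node index -> 0/1 and
        overrides those nodes first (e.g. with values sensed from neighbours)."""
        for node, value in (clamped or {}).items():
            self.state[node] = value
        new_state = []
        for node in range(len(self.state)):
            idx = 0
            for src in self.inputs[node]:
                idx = (idx << 1) | self.state[src]
            new_state.append(self.tables[node][idx])
        self.state = new_state
        return self.state

bn = BooleanNetwork(n=16, k=2)
for _ in range(5):
    print(bn.step(clamped={0: 1}))   # node 0 driven by a (made-up) sensor bit
```

In a swarm setting, some output nodes would drive the actuators while the clamped nodes carry locally exchanged information, which is exactly the coupling between networks the abstract describes.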

Relevance: 20.00%

Abstract:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and if necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves. In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality-control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light-curve production and analysis.
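
As a concrete illustration of the aperture-photometry step, here is a minimal numpy sketch. It is generic, with made-up star position, radii and frame values; the actual pipeline and its quality controls are far more elaborate. The idea: sum the counts inside a circular aperture around a star and subtract the sky level estimated in a surrounding annulus.

```python
import numpy as np

def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
    """Background-subtracted flux in a circular aperture of radius `r_ap`
    centred on (x0, y0); the sky is the median in an annulus [r_in, r_out]."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    aperture = r <= r_ap
    annulus = (r >= r_in) & (r <= r_out)
    sky_per_pixel = np.median(image[annulus])
    return image[aperture].sum() - sky_per_pixel * aperture.sum()

# Toy frame: flat sky plus one star-like blob (all values illustrative).
rng = np.random.default_rng(1)
frame = rng.normal(100.0, 1.0, (64, 64))
frame[30:33, 40:43] += 500.0
print(aperture_photometry(frame, x0=41, y0=31, r_ap=5, r_in=8, r_out=12))
```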

Relevance: 20.00%

Abstract:

Studies in regions of the nuclear chart where model predictions of nuclear properties fail can bring a better understanding of the strong interaction in the nuclear medium. To such regions belongs the so-called "island of inversion", centred around the Ne, Na and Mg isotopes with 20 neutrons, in which unexpected ground-state spins, large deformations and dense low-energy spectra appear. This is a strong argument that the magic number N = 20 does not correspond to a closed shell in this area. In this thesis, investigations of the isotope shifts of stable ²⁴,²⁵,²⁶Mg, as well as of the spins and magnetic moments of short-lived ²⁹,³¹Mg, are presented. The studies were successfully performed at the ISOLDE facility at CERN using collinear laser spectroscopy and beta-NMR techniques. The isotopes were investigated as singly charged ions in the 280-nm transition from the atomic ground state ²S₁/₂ to one of the two lowest excited states ²P₁/₂,₃/₂, using continuous-wave laser beams. The isotope-shift measurements with fluorescence detection for the three stable isotopes show that it is feasible to perform the same studies on radioactive Mg isotopes up to the "island of inversion". This will make it possible to determine differences in the mean square charge radii and to interpret them in terms of deformation. The high detection efficiency for beta particles and optical pumping close to saturation made it possible to obtain very good beta-asymmetry signals for ²⁹Mg and ³¹Mg, which have half-lives around 1 s and production yields of about 10^5 ions/s. For this purpose the ions were implanted into a host crystal lattice. This kind of detection of the atomic resonances revealed their hyperfine structure, which gives the sign and a first estimate of the value of the magnetic moment. Nuclear magnetic resonance also gave their g-factors, with a relative uncertainty smaller than 0.2%. By combining the two techniques, the nuclear spin of both isotopes could also be unambiguously determined. The measured spins and g-factors show that ²⁹Mg, with 17 neutrons, lies outside the "island of inversion". On the other hand, ³¹Mg, with 19 neutrons, has an unexpected ground-state spin which can be explained only by promoting at least two neutrons across the N = 20 shell gap. This places this nucleus inside the "island". However, modern shell-model approaches cannot predict this level as the ground state, but only as one of the low-lying states, even though they reproduce the experimental g-factor very well. This indicates that modifications to the available interactions are required. Future plans include isotope-shift measurements on radioactive Mg isotopes and beta-NMR studies on ³³Mg.
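
For reference, the standard NMR relations behind the g-factor extraction (textbook formulas, not specific to this work): the resonance lies at the Larmor frequency in the known static field B, and the magnetic moment follows from the g-factor and the spin I,

```latex
\nu_L = \frac{|g|\,\mu_N B}{h}, \qquad \mu = g\, I\, \mu_N
```

so locating the resonance fixes |g|, the hyperfine structure supplies the sign, and combining g with the measured spin I yields the magnetic moment.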

Relevance: 20.00%

Abstract:

The radio communication system is one of the most critical systems of the overall satellite platform: it often represents the only means of communication between a spacecraft and the Ground Segment, or among a constellation of satellites. This thesis focuses on specific innovative architectures for on-board and on-ground radio systems. In particular, this work is an integral part of a space programme started in 2004 at the University of Bologna, Forlì campus, which led to the completion of the microsatellite ALMASat-1, successfully launched on board the VEGA maiden flight. The success of this programme led to the development of a second microsatellite, named ALMASat-EO, a three-axis-stabilized microsatellite able to capture images of the Earth's surface. The first objective of this study was therefore the investigation of an innovative, efficient and low-cost architecture for on-board radio communication systems. The design and realization of the TT&C system and of the high-data-rate transmitter for image downlink are thoroughly described in this work, together with the development of the embedded hardware and the adopted antenna systems. Moreover, considering the increasing interest in the development of constellations of microsatellites, in particular those flying in close formation, a careful analysis has been carried out for the development of innovative communication protocols for inter-satellite links. Furthermore, in order to investigate the system aspects of space communications, a study has been carried out at ESOC with the objective of designing, implementing and testing two experimental devices for the enhancement of the ESA Ground Segment (GS). Thus, a significant portion of this thesis is dedicated to describing the results of a method for improving the phase stability of GS radio-frequency equipment by means of real-time phase compensation, and a new way to perform two-antenna arraying tracking using existing ESA tracking station facilities.
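
As a generic illustration of the antenna-arraying principle mentioned above (a numpy sketch under simplified complex-baseband assumptions, not the ESOC devices themselves; all signal parameters are made up): estimate the relative phase between the two received signals, derotate one, and sum coherently, which for two equal antennas yields up to a 3 dB SNR gain.

```python
import numpy as np

def array_two_antennas(s1, s2):
    """Coherently combine two complex-baseband antenna signals: estimate the
    phase of s2 relative to s1 from their correlation, derotate s2, and sum."""
    phase = np.angle(np.vdot(s1, s2))     # vdot conjugates its first argument
    return s1 + s2 * np.exp(-1j * phase)

# Toy check: the same tone received with a 40-degree offset plus noise.
rng = np.random.default_rng(0)
t = np.arange(4096)
tone = np.exp(2j * np.pi * 0.01 * t)
noise = lambda: 0.5 * (rng.standard_normal(t.size)
                       + 1j * rng.standard_normal(t.size))
s1 = tone + noise()
s2 = tone * np.exp(1j * np.deg2rad(40)) + noise()

def est_snr(s):
    """Crude SNR estimate: project onto the known tone, compare residual."""
    sig = (np.vdot(tone, s) / np.vdot(tone, tone)) * tone
    return (np.abs(sig) ** 2).mean() / (np.abs(s - sig) ** 2).mean()

print(est_snr(array_two_antennas(s1, s2)) / est_snr(s1))   # ≈ 2, i.e. ~3 dB
```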

Relevance: 20.00%

Abstract:

Throughout the Alpine domain, shallow landslides represent a serious geological hazard, often causing severe damage to infrastructure, private property and natural resources and, in the most catastrophic events, threatening human lives. Landslides are a major factor of landscape evolution in mountainous and hilly regions and represent a critical issue for mountainous land management, since they cause the loss of pastoral land. In several Alpine contexts, the distribution of shallow landsliding is strictly connected to the presence and condition of vegetation on the slopes. With the aid of high-resolution satellite images, it is possible to automatically divide the mountainous territory into land cover classes, which contribute with different magnitudes to the stability of the slopes. The aim of this research is to combine EO (Earth Observation) land cover maps with ground-based measurements of the land cover properties. To achieve this goal, a new procedure has been developed to automatically detect grass-mantle degradation patterns from satellite images. Moreover, innovative surveying techniques and instruments are tested to measure in situ the shear strength of the grass mantle and the geomechanical and geotechnical properties of these alpine soils. The distribution of shallow landsliding is then assessed with the aid of physically based models, which use the EO-based map to distribute the resistance parameters across the landscape.
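
Physically based shallow-landslide models are commonly variants of the infinite-slope stability analysis; the sketch below (with illustrative parameter values, not the thesis's model) shows how a land-cover-dependent root cohesion taken from an EO map could be propagated into a factor-of-safety grid.

```python
import numpy as np

def factor_of_safety(slope_deg, soil_depth, wet_frac, c_root,
                     c_soil=2e3, phi_deg=32.0, gamma=18e3, gamma_w=9.81e3):
    """Infinite-slope factor of safety, FS = resisting / driving stress.
    `c_root` is the land-cover-dependent root cohesion (Pa); `wet_frac` is
    the saturated fraction of the soil column. All values illustrative."""
    beta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    normal = (gamma - wet_frac * gamma_w) * soil_depth * np.cos(beta) ** 2
    resisting = c_soil + c_root + normal * np.tan(phi)
    driving = gamma * soil_depth * np.sin(beta) * np.cos(beta)
    return resisting / driving

# Root cohesion per land-cover class (Pa), e.g. from an EO-based map:
cover = np.array([[0, 1], [1, 2]])           # 0=bare, 1=degraded, 2=grass
c_root = np.array([0.0, 2e3, 5e3])[cover]
fs = factor_of_safety(slope_deg=35.0, soil_depth=1.0, wet_frac=0.8,
                      c_root=c_root)
print(fs)   # cells with FS < 1 are flagged as potentially unstable
```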

Relevance: 20.00%

Abstract:

In recent years, thanks to technological advances, electromagnetic methods for non-invasive shallow-subsurface characterization have been increasingly used in many areas of environmental and geoscience applications. Among the geophysical electromagnetic methods, Ground Penetrating Radar (GPR) has received unprecedented attention over the last few decades because of its capability to obtain high-resolution electromagnetic parameter information in space and time, its versatility, its ease of handling, its non-invasive nature, its high resolving power, and its fast deployment. The main focus of this thesis is to perform dielectric site characterization in an efficient and accurate way, studying in depth the physical phenomenon behind a recently developed GPR approach, the so-called early-time technique, which infers the electrical properties of the soil in the proximity of the antennas. In particular, the early-time approach is based on the amplitude analysis of the early-time portion of the GPR waveform, using a fixed-offset ground-coupled antenna configuration in which the separation between the transmitting and receiving antennas is on the order of the dominant pulse wavelength. Amplitude information can be extracted from the early-time signal through complex trace analysis, computing the instantaneous-amplitude attributes over a selected time duration of the early-time signal. Basically, if the acquired GPR signal is taken to be the real part of a complex trace whose imaginary part is the quadrature component obtained by applying a Hilbert transform to the GPR trace, then the amplitude envelope is the absolute value of the resulting complex trace (also known as the instantaneous amplitude). Analysing laboratory data, numerical simulations and natural field conditions, and summarising the overall results embodied in this thesis, it is possible to propose the early-time GPR technique as an effective method to estimate the physical properties of the soil in a fast and non-invasive way.
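
A minimal numpy/scipy sketch of the envelope computation just described (the synthetic trace, sampling step and window length are made up): the analytic signal is the trace plus j times its Hilbert transform, and the early-time attribute is the mean envelope over the first few nanoseconds.

```python
import numpy as np
from scipy.signal import hilbert

def early_time_amplitude(trace, dt, window_ns):
    """Mean instantaneous amplitude over the early-time window of a GPR
    trace: the envelope is |analytic signal| = |trace + j*Hilbert(trace)|."""
    envelope = np.abs(hilbert(trace))
    n = int(window_ns * 1e-9 / dt)
    return envelope[:n].mean()

# Toy trace: an early wavelet followed by a weaker later reflection.
dt = 0.05e-9                                # 0.05 ns sampling step
t = np.arange(1024) * dt
trace = np.exp(-((t - 2e-9) / 0.5e-9) ** 2) * np.cos(2 * np.pi * 1e9 * t)
trace += 0.3 * np.exp(-((t - 20e-9) / 1e-9) ** 2) * np.cos(2 * np.pi * 1e9 * t)
print(early_time_amplitude(trace, dt, window_ns=5.0))
```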

Relevance: 20.00%

Abstract:

A critical point in the analysis of ground-displacement time series is the development of data-driven methods that allow the different sources generating the observed displacements to be discerned and characterised. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which makes it possible to reduce the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Independent Component Analysis (ICA) is a popular technique adopted to approach this problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, I use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here I present the application of the vbICA technique to GPS position time series. First, I use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise) and a volcanic source, and I study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, I apply vbICA to different tectonically active scenarios, such as the 2009 L'Aquila (central Italy) earthquake, the 2012 Emilia (northern Italy) seismic sequence, and the 2006 Guerrero (Mexico) Slow Slip Event (SSE).
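
To illustrate the BSS setting, the sketch below uses scikit-learn's FastICA as a stand-in for vbICA (a different, non-variational ICA algorithm) on made-up two-source data; it only shows that an ICA decomposition can recover the original sources (up to sign and scale) from mixed "station" records.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic "GPS-like" sources: a seasonal term and a postseismic-style decay.
t = np.linspace(0, 4, 800)                                   # time in years
sources = np.c_[np.sin(2 * np.pi * t),                       # annual signal
                np.where(t > 2, 1 - np.exp(-(t - 2) / 0.3), 0.0)]
mixing = np.array([[1.0, 0.5], [0.4, 1.0], [0.8, -0.6]])     # 3 "stations"
rng = np.random.default_rng(0)
X = sources @ mixing.T + 0.05 * rng.standard_normal((800, 3))

S_est = FastICA(n_components=2, random_state=0).fit_transform(X)
for k in range(2):  # each true source should match one component (up to sign)
    r = max(abs(np.corrcoef(sources[:, k], S_est[:, j])[0, 1])
            for j in range(2))
    print(f"source {k}: best |corr| with an ICA component = {r:.2f}")
```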

Relevance: 20.00%

Abstract:

The present work belongs to the PRANA project, the first extensive field campaign observing atmospheric emission spectra covering the far-infrared (FIR) spectral region for more than two years. The principal deployed instrument is REFIR-PAD, a Fourier-transform spectrometer that we use to study Antarctic cloud properties. A dataset covering the whole of 2013 has been analyzed. First, a selection of good-quality spectra is performed, using radiance values in a few chosen spectral regions as thresholds. These spectra are then described in a compact way by averaging radiances over selected intervals, converting them into brightness temperatures (BTs) and finally considering the differences between each pair of them. A supervised feature-selection algorithm is implemented with the purpose of selecting the features that are really informative about the presence, the phase and the type of cloud. Training and test sets are then collected by means of Lidar quick-looks. The supervised classification of the overall monthly datasets is performed using an SVM. On the basis of this classification, and with the help of Lidar observations, 29 non-precipitating ice-cloud case studies are selected. A single spectrum, or at most an average over two or three spectra, is processed by means of the retrieval algorithm RT-RET, exploiting the main IR window channels, in order to extract cloud properties. The retrieved effective radii and optical depths are analyzed to compare them with literature studies and to evaluate possible seasonal trends. Finally, the atmospheric profiles output by the retrieval are used as inputs for simulations, assuming two different crystal habits, with the aim of examining our ability to reproduce radiances in the FIR. Substantial mis-estimations are found for the FIR micro-windows: a high variability is observed in the spectral pattern of the deviations of the simulations from the measured spectra, and an effort has been made to link these deviations to cloud parameters.
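
A minimal scikit-learn sketch of the supervised classification step. All features and labels here are synthetic stand-ins for the BT-difference features and Lidar-derived labels, and the real class set (clear / cloud phase / cloud type) and feature selection differ.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical feature matrix: pairwise BT differences per spectrum, with
# binary labels (e.g. 0 = clear, 1 = ice cloud) from Lidar quick-looks.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 10))              # stand-in BT-difference features
y = (X[:, 0] - 0.8 * X[:, 3]
     + rng.normal(scale=0.5, size=600) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```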

Relevance: 20.00%

Abstract:

With reference to the construction of tunnels for underground services, the uncertainty that characterises the geological picture, besides affecting costs, plays a key role in preliminary design. Although an in-depth geotechnical and geological characterisation of the volume of ground involved in the excavation is generally an integral part of the project, it is nevertheless impossible to eliminate such uncertainties entirely, owing to the extent of the volume involved and to the inhomogeneity that always characterises the ground. Generally, investigations during construction and stabilisation measures must be planned in order to contain excavation costs and optimise the design. For example, pilot tunnels are among the geotechnical exploration methods able to guarantee an optimal characterisation of the geotechnical picture of the subsoil. As for ground stabilisation measures, which can be adopted where traditional excavation would not allow tunnelling, there is a wide range of choices. A first analysis of the problems connected with tunnelling therefore shows that the stabilisation of excavation faces is of primary importance and of great practical relevance. This thesis is part of a project that promotes an innovative and economical technique for stabilising tunnels by suction, taking into account the influence of suction on undrained cohesion.

Relevance: 20.00%

Abstract:

Hardware characteristics of a ground rover (SHERPA project). Implementation, using the ROS framework, of a high-level autonomous navigation algorithm based on one of two possible low-level algorithms: LOS (Lightweight Object Streaming, developed by BlueBotics) or the Navigation Stack. Development of a Control Ground Station (Java) based either on the SSH2 protocol or on the LOS library.
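
As an illustration of how a high-level layer typically hands goals to the ROS Navigation Stack, here is a minimal rospy sketch using the standard move_base action interface. The frame name and coordinates are made up, and this is not the project's code, which may instead use the LOS path.

```python
#!/usr/bin/env python
# Minimal example of commanding the ROS Navigation Stack from a high level:
# send a pose goal to the standard move_base action server (illustrative).
import actionlib
import rospy
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("high_level_nav")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0       # illustrative waypoint
goal.target_pose.pose.orientation.w = 1.0    # face forward

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("navigation result: %s", client.get_state())
```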

Relevance: 20.00%

Abstract:

We report the case of a 46-year-old man found dead in a chair in his apartment with a gunshot wound in his chest. All circumstances pointed to suicide as the manner of death. However, the discovery of the weapon, a SIG Sauer P228 pistol, about 2 m away from the decedent, with an obstacle between weapon and corpse, generated speculation about third-party involvement. Scene investigations and ballistic calculations showed that, with high probability, the weapon must have been purposefully thrown away by the decedent after he fired the lethal shot.