945 results for MULTIVARIATE DISTRIBUTIONS
Abstract:
In this paper, we have used Geographical Information Systems (GIS) to solve the planar Huff problem considering different demand distributions and forbidden regions. Most papers on competitive location problems assume that the demand is aggregated at a finite set of points. In a few other cases, the models assume that the demand is distributed over the feasible region according to a functional form, mainly a uniform distribution. Here, in addition to the discrete and uniform demand distributions, we consider demand represented by a population surface model, that is, a raster map where each pixel has an associated value corresponding to the population living in the area it covers...
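As a purely illustrative sketch of the pixel-based formulation (not the GIS implementation used in the paper), the expected demand captured by a candidate facility under the Huff model can be computed by summing, over all raster pixels, the population weighted by the facility's capture probability. The Euclidean pixel distances, unit attractiveness, and distance-decay exponent below are assumptions made for the example:

    import numpy as np

    def huff_capture(pop, facilities, new_site, attract=1.0, lam=2.0):
        """Expected demand captured by a new facility at `new_site`.

        pop        -- 2-D array: population per pixel (the raster surface model)
        facilities -- list of (row, col, attractiveness) of existing competitors
        lam        -- distance-decay exponent of the Huff model
        Forbidden regions could be handled by restricting candidate sites."""
        rows, cols = np.indices(pop.shape)
        eps = 1e-9                              # avoid division by zero

        def utility(r, c, a):
            return a / (np.hypot(rows - r, cols - c) + eps) ** lam

        u_new = utility(new_site[0], new_site[1], attract)
        u_all = u_new + sum(utility(r, c, a) for r, c, a in facilities)
        return float(np.sum(pop * u_new / u_all))

    # toy example: uniform demand, one competitor in a corner
    pop = np.ones((100, 100))
    print(huff_capture(pop, [(0, 0, 1.0)], new_site=(50, 50)))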
Abstract:
Proton-nucleus elastic scattering at intermediate energies is a well-established method for the investigation of the nuclear matter distribution in stable nuclei, and was recently also applied to the investigation of radioactive nuclei using the method of inverse kinematics. In the current experiment, the differential cross sections for proton elastic scattering on the isotopes $^{7,9,10,11,12,14}$Be and $^8$B were measured. The experiment was performed using the fragment separator at GSI, Darmstadt, to produce the radioactive beams. The main part of the experimental setup was the time projection ionization chamber IKAR, which served simultaneously as hydrogen target and detector for the recoil protons. Auxiliary detectors for projectile tracking and isotope identification were also installed. From the experiment, the absolute differential cross sections d$\sigma$/d$t$ as a function of the four-momentum transfer $t$ were obtained. In this work the differential cross sections for elastic p-$^{12}$Be, p-$^{14}$Be and p-$^{8}$B scattering at low $t$ ($t \leq 0.05$~(GeV/c)$^2$) are presented. The measured cross sections were analyzed within the Glauber multiple-scattering theory using different density parameterizations, and the nuclear matter density distributions and radii of the investigated isotopes were determined. The analysis of the differential cross section for the isotope $^{14}$Be shows that a good description of the experimental data is obtained when density distributions consisting of separate core and halo components are used. The determined {\it rms} matter radius is $3.11 \pm 0.04 \pm 0.13$~fm. In the case of the $^{12}$Be nucleus the results showed an extended matter distribution as well. For this nucleus a matter radius of $2.82 \pm 0.03 \pm 0.12$~fm was determined. An interesting result is that the free $^{12}$Be nucleus behaves differently from the core of $^{14}$Be, being much more extended. The data were also compared with theoretical densities calculated within the FMD and the few-body models. In the case of $^{14}$Be, the calculated cross sections describe the experimental data well, while in the case of $^{12}$Be there are discrepancies in the region of high momentum transfer. Preliminary experimental results for the isotope $^8$B are also presented. An extended matter distribution was obtained (though much more compact than the neutron halos). A proton halo structure was observed for the first time with the proton elastic scattering method. The deduced matter radius is $2.60 \pm 0.02 \pm 0.26$~fm. The data were compared with microscopic calculations in the framework of the FMD model, and reasonable agreement was observed. The results obtained in the present analysis are in most cases consistent with previous experimental studies of the same isotopes with different experimental methods (total interaction and reaction cross section measurements, momentum distribution measurements). For future investigation of the structure of exotic nuclei, a universal detector system, EXL, is being developed. It will be installed at the NESR at the future FAIR facility, where higher-intensity beams of radioactive ions are expected. The use of storage-ring techniques provides high-luminosity, low-background experimental conditions. Results from the feasibility studies of the EXL detector setup, performed at the present ESR storage ring, are presented.
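To make the connection between a density parameterization and the quoted radii concrete: for a spherically symmetric density, the rms matter radius is $\langle r^2 \rangle^{1/2}$ with $\langle r^2 \rangle = \int r^4 \rho(r)\,dr / \int r^2 \rho(r)\,dr$. The minimal numerical sketch below uses a two-component Gaussian core + halo form whose widths are illustrative assumptions, not the parameterizations fitted in this work:

    import numpy as np

    def rms_radius(rho, r_max=20.0, n=4000):
        """Root-mean-square radius of a spherically symmetric density rho(r)."""
        r = np.linspace(1e-6, r_max, n)          # fm
        w = rho(r) * r**2                        # volume element; 4*pi cancels
        return np.sqrt(np.sum(w * r**2) / np.sum(w))

    def gaussian(r, n_nuc, width):
        """Gaussian density normalized to n_nuc nucleons."""
        return n_nuc * np.exp(-r**2 / (2 * width**2)) / (2 * np.pi * width**2) ** 1.5

    # core (12 nucleons) + halo (2 neutrons); widths are made up for the example
    rho = lambda r: gaussian(r, 12, 1.3) + gaussian(r, 2, 3.0)
    print(f"rms matter radius: {rms_radius(rho):.2f} fm")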
Abstract:
This thesis describes studies on the development of physical analysis methods coupled with multivariate statistical techniques to assess the quality and authenticity of vegetable oils and dairy products. The application of physical instruments reduces the costs and times required by classical analyses and, at the same time, can provide a different set of information concerning both the quality and the authenticity of products. For these methods to work well, robust statistical models must be built from correctly collected datasets that are representative of the field of application. In this thesis, vegetable oils and several types of cheese were analyzed (in particular Pecorino for two research projects and Parmigiano-Reggiano for another). Several analytical tools (physical methods) were used, in particular spectroscopy, differential thermal analysis and the electronic nose, in addition to traditional separation techniques. The data obtained from the analyses were processed with several statistical techniques, above all: partial least squares; multiple linear regression; and linear discriminant analysis.
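As an illustration of the kind of chemometric pipeline described above, the sketch below couples simulated "spectra" with partial least squares regression for a quality parameter and linear discriminant analysis for an authenticity class; the data shapes, the number of latent variables, and the labels are hypothetical, not the thesis datasets:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))       # 60 samples x 200 spectral variables
    y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)  # quality parameter
    labels = (y > np.median(y)).astype(int)                    # authentic / not

    pls = PLSRegression(n_components=5)
    print("PLS R^2 (CV):", cross_val_score(pls, X, y, cv=5).mean())

    lda = LinearDiscriminantAnalysis()
    print("LDA accuracy (CV):", cross_val_score(lda, X, labels, cv=5).mean())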
Abstract:
In the large-maturity limit, we compute explicitly the local volatility surface for the Heston model through Dupire's formula, with Fourier pricing of the respective derivatives of the call price. Then we verify that the prices of European call options produced by the Heston model coincide with those given by the local volatility model, where the local volatility is computed as described above.
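For reference, Dupire's formula in the zero-rate, zero-dividend case reads $\sigma_{\mathrm{loc}}^2(K,T) = \frac{\partial C/\partial T}{\frac{1}{2} K^2\, \partial^2 C/\partial K^2}$, where the derivatives of the call price $C(K,T)$ are obtained here by Fourier pricing under Heston; with a nonzero rate $r$, an additional $rK\,\partial C/\partial K$ term enters the numerator.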
Abstract:
A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow the different sources generating the observed displacements to be discerned and characterised. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Independent Component Analysis (ICA) is a popular technique adopted to approach this problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, I use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdfs of the sources, yielding a more reliable estimate of them. Here I present the application of the vbICA technique to GPS position time series. First, I use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise) and a volcanic source, and I study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, I apply vbICA to different tectonically active scenarios, such as the 2009 L'Aquila (central Italy) earthquake, the 2012 Emilia (northern Italy) seismic sequence, and the 2006 Guerrero (Mexico) Slow Slip Event (SSE).
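The following toy example illustrates the BSS point made above: PCA only returns decorrelated mixtures, while an ICA algorithm recovers the underlying sources (up to sign and scale). FastICA is used as a readily available stand-in for vbICA, and the synthetic "seasonal" and "step + trend" sources are invented for the example:

    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(1)
    t = np.linspace(0, 4, 2000)
    s1 = np.sin(2 * np.pi * t)                    # seasonal-like source
    s2 = np.where(t > 2, 1.0, 0.0) + 0.2 * t      # coseismic-like step + trend
    S = np.c_[s1, s2]
    A = rng.normal(size=(10, 2))                  # mixing matrix: 10 "stations"
    X = S @ A.T + 0.05 * rng.normal(size=(2000, 10))

    # ICA recovers the sources (up to sign/scale); PCA only decorrelates them
    sources_ica = FastICA(n_components=2, random_state=0).fit_transform(X)
    sources_pca = PCA(n_components=2).fit_transform(X)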
Abstract:
Since its discovery, the top quark has been one of the most investigated subjects in particle physics. The aim of this thesis is the reconstruction of hadronically decaying top quarks with high transverse momentum (boosted tops) using the Template Overlap Method (TOM). Because of the high energy, the decay products of boosted tops partially or totally overlap and are thus contained in a single large-radius jet (fat-jet). TOM compares the internal energy distribution of the candidate fat-jet to a sample of tops obtained from a MC simulation (templates). The algorithm is based on the definition of an overlap function, which quantifies the level of agreement between the fat-jet and the template, allowing an efficient discrimination of signal from the background contributions. A working point was chosen to obtain a signal efficiency close to 90% with a corresponding background rejection of 70%. TOM performance was tested on MC samples in the muon channel and compared with previous methods in the literature. All the methods will be merged into a multivariate analysis to give a global top tagger, which will be used in the ttbar production differential cross section measurement performed on the data acquired in 2012 at sqrt(s)=8 TeV in the high-momentum region of phase space, where new physics processes could appear. Since its performance improves with increasing pT, the Template Overlap Method will play a crucial role in the next data taking at sqrt(s)=13 TeV, where almost all top quarks will be produced at high energy, making the standard reconstruction methods inefficient.
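A much-simplified, illustrative version of the overlap function is sketched below: for each parton-level template, the energy of the fat-jet constituents falling within a cone around each template parton is compared with the parton energy through a Gaussian chi-square, and the best-matching template defines the overlap score. The cone size and resolution parameter are arbitrary choices for the example, not the thesis settings:

    import numpy as np

    def overlap(constituents, templates, cone=0.2, sigma_frac=0.33):
        """Maximum overlap between a fat-jet and a sample of templates.

        constituents -- array of (pt, eta, phi) rows for the fat-jet
        templates    -- list of arrays of (E, eta, phi) rows, one per parton"""
        pt, eta, phi = constituents.T
        best = 0.0
        for tpl in templates:
            chi2 = 0.0
            for e_a, eta_a, phi_a in tpl:
                dphi = np.angle(np.exp(1j * (phi - phi_a)))   # wrap to [-pi, pi]
                in_cone = np.hypot(eta - eta_a, dphi) < cone
                # compare parton energy with the constituent energy in its cone
                chi2 += (e_a - pt[in_cone].sum()) ** 2 / (2 * (sigma_frac * e_a) ** 2)
            best = max(best, float(np.exp(-chi2)))
        return best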
Abstract:
The Large Hadron Collider, located at the CERN laboratory in Geneva, is the largest particle accelerator in the world. One of the main research fields at the LHC is the study of the Higgs boson, the most recently discovered particle, observed by the ATLAS and CMS experiments. Due to the small production cross section of the Higgs boson, only a substantial amount of data offers the chance to study this particle's properties. In order to perform these searches, it is desirable to reduce the contamination of the signal signature by the numerous and varied background processes produced in pp collisions at the LHC. Great importance is therefore attached to the study of multivariate methods which, compared to the standard cut-based analysis, can enhance the selection of a Higgs boson produced in association with a top quark pair through a dileptonic final state (ttH channel). The statistics collected up to 2012 are not sufficient to supply a significant number of ttH events; however, the methods applied in this thesis will provide a powerful tool for the increasing statistics that will be collected during the next LHC data taking.
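As a generic illustration of such a multivariate selection (not the ttH analysis itself), the sketch below trains a gradient-boosted decision tree to separate invented "signal" and "background" samples with kinematic-like features and evaluates it with a ROC curve; the dataset and feature count are hypothetical:

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    n = 5000
    signal = rng.normal(loc=1.0, size=(n, 4))      # e.g. jet/lepton kinematics
    background = rng.normal(loc=0.0, size=(n, 4))
    X = np.vstack([signal, background])
    y = np.r_[np.ones(n), np.zeros(n)]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    bdt = GradientBoostingClassifier().fit(X_tr, y_tr)
    print("ROC AUC:", roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1]))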
Abstract:
Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects from interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model that allows for separate tuning of the coupling and the nonlinearity of the interrelation. To demonstrate the applicability of the approach to multivariate real-world time series, we investigate resting-state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that, for our rsfMRI data, interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
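A minimal sketch of the surrogate logic follows, on invented data and with a crude test statistic: applying independent random phases to each channel's Fourier transform destroys all interrelation, whereas applying the same random phases to both channels preserves the cross-spectrum (and hence linear cross-correlation) while destroying nonlinear structure, so an exceedance of a nonlinear statistic over this null indicates interrelation beyond linear correlation:

    import numpy as np

    rng = np.random.default_rng(3)

    def surrogate(x, shared):
        """Fourier surrogates of the channels (columns) of x. With shared=True
        the same random phases are applied to every channel, which preserves
        the cross-spectrum (linear interrelation) and destroys nonlinear
        structure; with shared=False all interrelation is destroyed."""
        fx = np.fft.rfft(x, axis=0)
        shape = (fx.shape[0], 1) if shared else fx.shape
        ph = np.exp(1j * rng.uniform(0, 2 * np.pi, size=shape))
        ph[0] = 1.0                                  # leave the mean untouched
        return np.fft.irfft(fx * ph, n=len(x), axis=0)

    def nl_stat(x):
        """Crude nonlinear interrelation statistic (quadratic cross-correlation)."""
        return abs(np.corrcoef(x[:, 0] ** 2, x[:, 1])[0, 1])

    # toy bivariate data: linear coupling plus a nonlinear (quadratic) term
    s = rng.standard_normal(4096)
    x = np.c_[s, 0.7 * s + 0.3 * s ** 2 + 0.5 * rng.standard_normal(4096)]

    null = [nl_stat(surrogate(x, shared=True)) for _ in range(99)]
    print("observed:", nl_stat(x), "null 95th percentile:", np.quantile(null, 0.95))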