833 results for Probabilistic methodology
Abstract:
The objective of this work is the evaluation of the potential of navigation satellite signals to retrieve basic atmospheric parameters. A thorough study has been performed of the assumptions more or less explicitly contained in the common processing steps of navigation signals. A probabilistic procedure has been designed for measuring discretised vertical profiles of pressure, temperature and water vapour, together with their associated errors. Numerical experiments on a synthetic dataset have been performed with the main objective of quantifying the information that could be gained from such an approach, using entropy and relative entropy as test parameters. To this aim, a simulator of the phase delay and bending of a GNSS signal travelling across the atmosphere has been developed.
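As a hedged illustration (not the thesis's actual code), the information gain used as a test parameter here can be computed as the relative entropy (Kullback-Leibler divergence) between a prior and a retrieved posterior distribution over one discretised profile bin; the distributions below are hypothetical:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p log p of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q): the information gained
    when a prior q is updated to a posterior p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Hypothetical example: prior vs. posterior over a discretised
# temperature bin at one vertical level.
prior = np.array([0.25, 0.25, 0.25, 0.25])
posterior = np.array([0.05, 0.70, 0.20, 0.05])
print(entropy(prior), entropy(posterior), relative_entropy(posterior, prior))
```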
Abstract:
The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. Security testing methodologies are the first step towards standardized security evaluation processes and understanding of how the security threats evolve over time. This dissertation analyzes some of the most used identifying differences and commonalities, useful to compare them and assess their quality. The dissertation then proposes a new enhanced methodology built by keeping the best of every analyzed methodology. The designed methodology is tested over different systems with very effective results, which is the main evidence that it could really be applied in practical cases. Most of the dissertation discusses and proves how the presented testing methodology could be applied to such different systems and even to evade security measures by inverting goals and scopes. Real cases are often hard to find in methodology' documents, in contrary this dissertation wants to show real and practical cases offering technical details about how to apply it. Electronic voting systems are the first field test considered, and Pvote and Scantegrity are the two tested electronic voting systems. The usability and effectiveness of the designed methodology for electronic voting systems is proved thanks to this field cases analysis. Furthermore reputation and anti virus engines have also be analyzed with similar results. The dissertation concludes by presenting some general guidelines to build a coordination-based approach of electronic voting systems to improve the security without decreasing the system modularity.
Abstract:
The thesis applies ICC techniques to the probabilistic polynomial complexity classes in order to obtain an implicit characterization of them. The main contribution lies in the implicit characterization of the class PP (Probabilistic Polynomial Time): a syntactical characterisation of PP and a static complexity analyser able to recognise whether an imperative program computes in probabilistic polynomial time. The thesis is divided into two parts. The first part approaches the problem by creating a prototype functional language (a probabilistic variation of the lambda calculus with bounded recursion) that is sound and complete with respect to Probabilistic Polynomial Time. The second part reverses the problem and develops a feasible way to verify whether a program, written in a prototype imperative programming language, runs in probabilistic polynomial time or not. This thesis can be characterised as one of the first steps for Implicit Computational Complexity over probabilistic classes. There are still hard open problems to investigate, and many theoretical aspects strongly connected with these topics; I expect that in the future wide attention will be devoted to ICC and probabilistic classes.
Abstract:
DI Diesel engines are widely used in both industrial and automotive applications due to their durability and fuel economy. Nonetheless, increasing environmental concerns force this type of engine to comply with increasingly demanding emission limits, so it has become mandatory to develop a robust design methodology for the DI Diesel combustion system focused on reducing soot and NOx simultaneously while maintaining reasonable fuel economy. In recent years, genetic algorithms (GAs) and three-dimensional CFD combustion simulations have been successfully applied to this kind of problem. However, combining GA optimization with actual three-dimensional CFD combustion simulations can be too onerous, since a large number of calculations is usually needed for the genetic algorithm to converge, resulting in a high computational cost and thus limiting the suitability of this method for industrial processes. To make the optimization process less time-consuming, the CFD simulations can instead be used to generate a training set for an artificial neural network which, once correctly trained, can forecast the engine outputs as a function of the design parameters during a GA optimization, performing a so-called virtual optimization. In the current work, a numerical methodology for the multi-objective virtual optimization of the combustion of an automotive DI Diesel engine, relying on artificial neural networks and genetic algorithms, was developed.
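A minimal sketch of such a surrogate-based "virtual" optimization loop, under stated assumptions: the `surrogate` function below is a toy stand-in for the trained ANN, the fitness is a naive scalarisation rather than the thesis's multi-objective ranking, and all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the trained artificial neural network: maps design
# parameters (e.g. injection timing, EGR rate) to predicted outputs
# (e.g. soot, NOx). In the described methodology this surrogate is
# trained on a set of CFD combustion simulations.
def surrogate(x):
    return np.array([np.sum((x - 0.3) ** 2), np.sum((x + 0.3) ** 2)])

def fitness(x):
    # Toy scalarisation of the two objectives; a real multi-objective
    # GA would instead rank candidates by Pareto dominance.
    soot, nox = surrogate(x)
    return soot + nox

# Minimal generational GA over a 2-parameter design space: evaluate,
# keep the best half, and mutate it to produce offspring.
pop = rng.uniform(-1, 1, size=(40, 2))
for generation in range(100):
    scores = np.array([fitness(x) for x in pop])
    parents = pop[np.argsort(scores)[:20]]                    # truncation selection
    children = parents + rng.normal(0, 0.05, parents.shape)   # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(x) for x in pop])]
print("best design parameters:", best)
```

The design point of the approach is visible even in this toy: every fitness evaluation costs a surrogate call (microseconds) instead of a full CFD run, which is what makes the thousands of evaluations a GA needs affordable.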
Abstract:
Over the last 60 years, computers and software have driven incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult – if not impossible – to understand whether they meet some requirement or are able to exhibit some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking, identifying any deviation from the desired behaviour as soon as possible and, where possible, applying corrections. The declarative framework that implements our approach – entirely developed on the promising open source forward-chaining Production Rule System (PRS) named Drools – consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology allows any deviation from the desired behaviour to be reconciled as soon as it is detected. In conclusion, the proposed methodology advances the state of conformance checking, helping to fill the gap between humans and increasingly complex technology.
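The monitoring module itself is implemented in Drools; purely as an illustrative sketch of the Event Calculus query it rests on (a fluent holds at time t if it was initiated earlier and not terminated since), here is the core axiom in plain Python with hypothetical event and fluent names:

```python
# Minimal Event Calculus sketch (illustrative only; the thesis
# implements EC inside the Drools production rule system).
# An event log is a list of (time, event) pairs; `initiates` and
# `terminates` say which fluent an event switches on or off.

initiates = {"login": "session_open"}
terminates = {"logout": "session_open"}

def holds_at(fluent, t, log):
    """EC axiom: fluent holds at t if some earlier event initiated it
    and no event in between terminated it (persistence with clipping)."""
    state = False
    for time, event in sorted(log):
        if time >= t:
            break
        if initiates.get(event) == fluent:
            state = True
        elif terminates.get(event) == fluent:
            state = False
    return state

log = [(1, "login"), (5, "logout")]
print(holds_at("session_open", 3, log))  # True
print(holds_at("session_open", 6, log))  # False
```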
Abstract:
This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast, then explain the consistency and comparison tests used in CSEP experiments to evaluate model performance. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to take into account the epistemic uncertainty in PSHA. The most widely used is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
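For context, the Poissonian assumption discussed here is what licenses the standard PSHA conversion of exceedance rates into exceedance probabilities; the ensemble combination shown below is one plausible scheme (a weighted mixture of model rates), and the thesis's exact combination rule may differ:

```latex
% Poisson model of exceedances: with annual exceedance rate \lambda,
% the probability of at least one exceedance in an exposure time t is
P_{\mathrm{exc}}(t) = 1 - e^{-\lambda t}.
% An ensemble forecast combines K models with weights
% w_k \ge 0, \sum_k w_k = 1, e.g. as a weighted mixture of their rates:
\lambda_{\mathrm{ens}} = \sum_{k=1}^{K} w_k \, \lambda_k
```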
Abstract:
The thesis presented here falls within the broad context of computer security; in particular, it addresses the problem of testing security systems designed to counter today's threats: targeted attacks and, more generally, advanced persistent threats (APTs). The main objective of the work is the development and discussion of a testing methodology for security systems focused on this class of problems. The proposed guidelines aim to help bridge the gap between what is tested and what actually has to be faced. The activities carried out during the preparation of the thesis were both theoretical, concerning the development of a methodology to best address the testing of security systems against targeted attacks, and experimental, in that these concepts were applied to test several defence tools in a real-world scenario of interest.
Abstract:
In this thesis we provide a characterization of probabilistic computation in itself, from a recursion-theoretical perspective, without reducing it to deterministic computation. More specifically, we show that probabilistic computable functions, i.e., those functions computed by Probabilistic Turing Machines (PTMs), can be characterized by a natural generalization of Kleene's partial recursive functions which includes, among its initial functions, one that returns the identity or the successor with probability 1/2 each. We then prove the equi-expressivity of the obtained algebra and the class of functions computed by PTMs. In the second part of the thesis we investigate the relations between our recursion-theoretical framework and sub-recursive classes, in the spirit of Implicit Computational Complexity. More precisely, endowing predicative recurrence with a random base function is proved to lead to a characterization of the polynomial-time computable probabilistic functions.
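The random base function described here is simple to state concretely; a minimal sketch (the composition example is our own, not from the thesis):

```python
import random

def rand_base(n):
    """The probabilistic initial function of the characterization:
    returns its argument unchanged or its successor, each with
    probability 1/2 (a fair coin flip)."""
    return n if random.random() < 0.5 else n + 1

# Composing it k times yields n plus a Binomial(k, 1/2) increment:
# a toy example of a probabilistic computable function built from
# the base function by ordinary recursion.
def add_noise(n, k):
    for _ in range(k):
        n = rand_base(n)
    return n

print(add_noise(10, 4))  # 10..14, binomially distributed
```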
Abstract:
Cardiotocography (CTG) is a widespread foetal diagnostic method. However, it lacks objectivity and reproducibility because of its dependence on the observer's expertise. To overcome these limitations, more objective methods for CTG interpretation have been proposed. In particular, many of the developed techniques aim to assess foetal heart rate variability (FHRV). Among them, some methodologies from nonlinear systems theory have been applied to the study of FHRV. All of these techniques have proved helpful in specific cases; nevertheless, none of them is more reliable than the others, so an in-depth study is necessary. The aim of this work is to deepen FHRV analysis through Symbolic Dynamics Analysis (SDA), a nonlinear technique already successfully employed for HRV analysis; thanks to its simplicity of interpretation, it could be a useful tool for clinicians. We performed a literature study involving about 200 references on HRV and FHRV analysis, approximately 100 of which focused on nonlinear techniques. Then, in order to compare linear and nonlinear methods, we carried out a multiparametric study on 580 antepartum recordings of healthy foetuses. Signals were processed using an updated software package for CTG analysis and newly developed software for generating simulated CTG traces. Finally, statistical tests and regression analyses were carried out to estimate relationships among the extracted indexes and other clinical information. The results confirm that none of the employed techniques is more reliable than the others. Moreover, in agreement with the literature, each analysis should take into account two relevant parameters: the foetal status and the week of gestation. Regarding SDA, the results show its promising capabilities for FHRV analysis: it allows the recognition of foetal status, gestation week and global variability of FHR signals, even better than other methods. Nevertheless, further studies, which should also involve pathological cases, are necessary to establish its reliability.
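A hedged sketch of Symbolic Dynamics Analysis as it is commonly applied to heart rate series (not the thesis's exact software or parameter choices): quantise the signal into a small alphabet, form short overlapping words, and summarise the word distribution by its Shannon entropy:

```python
import numpy as np

def symbolic_dynamics(fhr, levels=4, word_len=3):
    """Quantise an FHR series into `levels` symbols over its range,
    build overlapping words of length `word_len`, and return the
    Shannon entropy of the word distribution (one common SDA index)."""
    fhr = np.asarray(fhr, dtype=float)
    lo, hi = fhr.min(), fhr.max()
    symbols = np.minimum(((fhr - lo) / (hi - lo + 1e-12) * levels).astype(int),
                         levels - 1)
    words = {}
    for i in range(len(symbols) - word_len + 1):
        w = tuple(symbols[i:i + word_len])
        words[w] = words.get(w, 0) + 1
    p = np.array(list(words.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p))

# Hypothetical FHR trace (beats per minute): baseline plus
# oscillation plus noise, standing in for a real CTG recording.
rng = np.random.default_rng(1)
fhr = 140 + 5 * np.sin(np.linspace(0, 20, 600)) + rng.normal(0, 2, 600)
print(symbolic_dynamics(fhr))
```

Low word entropy flags reduced variability (few recurring patterns), which is part of what makes the index easy for clinicians to interpret.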
Abstract:
This work aims to evaluate the reliability of levee systems by calculating the probability of failure of given levee stretches under different loads, using probabilistic methods based on fragility curves obtained through the Monte Carlo method. Overtopping and piping are considered as failure mechanisms (since these are the most frequent), and the major levee system of the Po River is analysed, with a primary focus on the section between Piacenza and Cremona, in the lower-middle area of the Padana Plain. The novelty of this approach is that it checks the reliability of individual embankment stretches, not just a single cross-section, while taking into account the variability of the levee system geometry from one stretch to another. For each levee stretch analysed, this work also considers a probability distribution of the load variables involved in the definition of the fragility curves, influenced by the differences in the topography and morphology of the riverbed along the analysed reach as they pertain to the levee system in its entirety. A classification is proposed, for both failure mechanisms, to give an indication of the reliability of the levee system based on the information obtained from the fragility curve analysis. To accomplish this, a hydraulic model has been developed in which a 500-year flood is simulated to determine the residual hazard of failure for each levee stretch at the corresponding water depth, and the results are then compared with the obtained classifications. This work has the additional aim of acting as an interface between the worlds of applied geology and environmental hydraulic engineering, where strong collaboration between the two professions is needed to improve the estimation of hydraulic risk.
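A minimal Monte Carlo sketch of how a fragility curve for the overtopping mechanism can be built (the limit state and the crest-elevation distribution below are hypothetical; the thesis uses the surveyed geometry of each actual stretch):

```python
import numpy as np

rng = np.random.default_rng(42)

def fragility_overtopping(water_levels, n_samples=10_000):
    """For each water level h, estimate P(failure) = P(crest < h) by
    sampling the uncertain crest elevation of one levee stretch (here
    a normal distribution standing in for surveyed geometry)."""
    crest = rng.normal(loc=8.0, scale=0.3, size=n_samples)  # m a.s.l.
    return [(crest < h).mean() for h in water_levels]

levels = np.linspace(7.0, 9.0, 9)
for h, pf in zip(levels, fragility_overtopping(levels)):
    print(f"water level {h:4.2f} m -> P(failure) = {pf:.3f}")
```

Evaluating such a curve at the water depth predicted by the 500-year flood model for each stretch is what yields the per-stretch residual hazard the classification is built on.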
Abstract:
The human brain is composed of a complex network of axon bundles that connect the different cerebral areas. The arcuate fasciculus links the area responsible for language comprehension with the one dedicated to language production. The arcuate fasciculus is present in both cerebral hemispheres, although the left one is often predominantly used. In this thesis, the differences between the right and left arcuate fasciculus were evaluated in a sample of healthy subjects using tractography, an advanced, non-invasive technique that allows the reconstruction of fibre trajectories from diffusion-weighted MR (Magnetic Resonance) images. To this end, I used a probabilistic algorithm that estimates the probability of connection of the fibre under study with the different cerebral areas, even where it crosses fibres of other bundles. Thanks to the implementation of this method, it was possible to obtain an accurate reconstruction of the arcuate fasciculus even in the right hemisphere, where reconstruction is often so critical as to be impossible with other tractographic algorithms. By parametrising the geometry of the tract, I then divided the arcuate fasciculus into twenty segments and compared the diffusion measures evaluated in the right and left hemispheres. These analyses reveal a wide variability in the geometry of the arcuate fasciculus, both between subjects and between hemispheres. In the right hemisphere, the arcuate fasciculus crosses fibres belonging to other bundles to a greater extent. In the left hemisphere, the fibres of the arcuate fasciculus are more compact, and greater connectivity is also measured with other brain areas involved in language functions. In the second phase of the study, I applied the same method to two patients with brain lesions, with the objective of assessing the damage to the arcuate fasciculus ipsilateral to the lesion and of estimating whether mechanisms of structural plasticity were triggered in the contralateral hemisphere. This method can be implemented, in a homogeneous group of patients, to identify diagnostic MR markers for the pre-surgical planning phase and prognostic MR markers of functional language recovery.
Abstract:
The Swiss Federal Office of Public Health commissioned a nationwide health technology assessment registry for cervical and lumbar total disc arthroplasty and for balloon kyphoplasty (BKP) in order to decide on the reimbursement of these interventions.