8 results for Signature Verification, Forgery Detection, Fuzzy Modeling

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

This thesis tackles the problem of the automated detection of the atmospheric boundary layer (BL) height, h, from aerosol lidar/ceilometer observations. A new method, the Bayesian Selective Method (BSM), is presented. It implements a Bayesian statistical inference procedure which combines different sources of information in a statistically optimal way. First, atmospheric stratification boundaries are located from discontinuities in the ceilometer backscattered signal. The BSM then identifies the discontinuity edge that has the highest probability of effectively marking the BL height. Information from contemporaneous physical boundary-layer model simulations and a climatological dataset of BL height evolution is combined in the assimilation framework to assist this choice. The BSM algorithm has been tested on four months of continuous ceilometer measurements collected during the BASE:ALFA project and is shown to realistically diagnose the BL depth evolution under many different weather conditions. The BASE:ALFA dataset is then used to investigate the boundary-layer structure in stable conditions. Functions from the Obukhov similarity theory are used as regression curves to fit observed velocity and temperature profiles in the lower half of the stable boundary layer. Surface fluxes of heat and momentum are the best-fitting parameters in this exercise and are compared with those measured by a sonic anemometer. The comparison shows remarkable discrepancies, more evident in cases for which the bulk Richardson number is quite large. This analysis supports earlier results indicating that surface turbulent fluxes are not the appropriate scaling parameters for profiles of mean quantities in very stable conditions. One practical consequence is that boundary-layer height diagnostic formulations which rely mainly on surface fluxes disagree with what is obtained by inspecting co-located radiosounding profiles.
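A minimal sketch of the BSM idea, assuming a Gaussian prior on h stands in for the model and climatological information; the function names, the synthetic backscatter profile, and all numerical parameters below are illustrative, not the thesis implementation:

```python
# Hypothetical sketch: score candidate gradient discontinuities in a
# ceilometer backscatter profile against a prior on the BL height,
# then pick the most probable edge.
import numpy as np

def candidate_edges(backscatter, heights, n_candidates=5):
    """Locate the strongest negative-gradient discontinuities."""
    grad = np.gradient(backscatter, heights)
    idx = np.argsort(grad)[:n_candidates]    # most negative gradients
    return heights[idx], -grad[idx]          # edge heights and strengths

def posterior_bl_height(backscatter, heights, h_prior, sigma_prior):
    """Combine edge strength (likelihood proxy) with a Gaussian prior on h
    (e.g. from a model forecast or a climatological diurnal cycle)."""
    h_cand, strength = candidate_edges(backscatter, heights)
    prior = np.exp(-0.5 * ((h_cand - h_prior) / sigma_prior) ** 2)
    post = strength * prior                  # unnormalised posterior
    return h_cand[np.argmax(post)]

# Synthetic profile: well-mixed layer up to ~800 m, then clean air.
z = np.linspace(50, 3000, 300)
beta = np.where(z < 800, 1.0, 0.1) + 0.02 * np.random.randn(z.size)
print(posterior_bl_height(beta, z, h_prior=900.0, sigma_prior=300.0))
```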

Relevance:

100.00%

Publisher:

Abstract:

The research presented here is a broad exploration of possible applications of concepts, methods, and procedures of Fuzzy Logic to Materials Engineering. This new approach is justified by the inadequacy of the results obtained with traditional methods alone regarding rheology and durability, by the use of laboratory data in design, and by the need for a (computable) language that allows a joint evaluation of the technical, cultural, economic, and landscape aspects of a design project. In particular, Fuzzy Logic makes it possible to deal rationally with the randomness of variables and data that, in the specific field of materials in service in built Cultural Heritage, cannot be treated with ordinary statistical methods. The choice to focus on materials and structures in service at archaeological sites stems not only from the cultural and economic interest connected to the ever more numerous interventions in this new field of Materials Engineering, but also from the fact that, in such contexts, the issues of sampling representativeness, of the complexity of the interactions among (physical and other) variables, of time, and hence of durability are evident and extreme. Within this research, an extensive experimental laboratory campaign was also carried out to acquire the data used in the fuzzy modeling procedures. In these cases, standard experimental protocols were followed: acquisition of the mineralogical composition by X-ray diffraction (XRD), definition of the microstructural texture by microscopic observation (OM, SEM) and porosimetry by mercury intrusion (MIP), physical determinations such as ultrasonic pulse velocity and rotational viscometry, and technological measurements of uniaxial compressive strength, workability, etc. In the data processing and fuzzy modeling, the research is organized on three levels: (a) that of individual chemical-physical phenomena of a complex nature which have not yet found a satisfactory and generally accepted treatment; the applications concern the rheology of high-solid-content dispersions in water (limes, cements, mortars, SCC concretes), the correlation of compressive strength, the frost susceptibility of porous materials, and some aspects of the durability of reinforced concrete; (b) that of modeling the durability of materials at the scale of the archaeological site; the applications presented concern the Nuragic culture centres of Su Monte-Sorradile, GennaMaria-Villanovaforru, and Is Paras-Isili; (c) that of the strategic choice constituted by the selection of the best conservation project, considering the aspects connected to Materials Engineering jointly with cultural, landscape, and economic ones; the applications concerned two important monuments (the Amphitheatre and the Terme a Mare) of the Roman site of Nora-Pula.
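As a toy illustration of the kind of fuzzy modeling described above, the sketch below maps a measured property (here, an assumed MIP porosity percentage) to a qualitative durability grade through triangular membership functions and a simple rule base; all breakpoints and grades are invented for illustration:

```python
# Minimal fuzzy-inference sketch: triangular memberships over porosity,
# a Mamdani-style rule base, and centre-of-gravity defuzzification.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def durability_grade(porosity_pct):
    """Rule base: low porosity -> high durability, etc. (illustrative)."""
    low  = tri(porosity_pct,  0.0,  5.0, 15.0)
    mid  = tri(porosity_pct,  5.0, 15.0, 30.0)
    high = tri(porosity_pct, 15.0, 30.0, 45.0)
    # Weighted defuzzification on grades 3 (good) .. 1 (poor).
    num = 3.0 * low + 2.0 * mid + 1.0 * high
    den = low + mid + high
    return num / den if den else None

print(durability_grade(12.0))  # partial membership in 'low' and 'mid'
```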

Relevance:

30.00%

Publisher:

Abstract:

Machines with moving parts give rise to vibrations and consequently noise. The set-up and condition of each machine yield a distinctive vibration signature. Therefore, a change in the vibration signature, due to a change in the machine state, can be used to detect incipient defects before they become critical. This is the goal of condition monitoring, in which the information obtained from a machine's signature is used to detect faults at an early stage. A large number of signal processing techniques can be used to extract useful information from a measured vibration signal. This study seeks to detect rotating-machine defects using a range of techniques including synchronous time averaging, Hilbert transform-based demodulation, the continuous wavelet transform, the Wigner-Ville distribution, and the spectral correlation density function. The detection and diagnostic capability of these techniques are discussed and compared on the basis of experimental results concerning gear tooth faults, i.e. fatigue cracks at the tooth root and tooth spalls of different sizes, as well as assembly faults in a diesel engine. Moreover, the sensitivity to fault severity is assessed by applying these signal processing techniques to gear tooth faults of different sizes.
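Of the techniques listed, Hilbert transform-based demodulation is easy to sketch: the envelope of the analytic signal exposes low-frequency fault modulation riding on a gear-mesh tone. The synthetic signal and all parameters below are illustrative assumptions:

```python
# Envelope analysis via the Hilbert transform on a synthetic signal:
# a 1 kHz gear-mesh tone amplitude-modulated at a 25 Hz "fault" rate.
import numpy as np
from scipy.signal import hilbert

fs = 10_000                                       # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 1000 * t)            # gear-mesh tone
fault_mod = 1 + 0.3 * np.sin(2 * np.pi * 25 * t)  # fault modulation
x = fault_mod * carrier + 0.05 * np.random.randn(t.size)

envelope = np.abs(hilbert(x))                     # analytic-signal magnitude
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope line: %.1f Hz" % freqs[spectrum.argmax()])  # ~25 Hz
```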

Relevance:

30.00%

Publisher:

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed for the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring, and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
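For a flavor of the declarative style, the sketch below checks one classic ConDec-like constraint, response(A, B) (every occurrence of A is eventually followed by a B), over a finite trace; this is a toy checker, not the CLIMB reasoning machinery:

```python
# Toy compliance check of a declarative response(A, B) constraint
# against a completed execution trace of event names.
def response_holds(trace, a, b):
    pending = False              # is there an A still awaiting its B?
    for event in trace:
        if event == a:
            pending = True
        elif event == b:
            pending = False
    return not pending

print(response_holds(["a", "x", "b", "a", "b"], "a", "b"))  # True
print(response_holds(["a", "x"], "a", "b"))                 # False
```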

Relevance:

30.00%

Publisher:

Abstract:

In the present thesis, a new diagnosis methodology based on advanced time-frequency analysis is presented. A new fault index is defined that allows individual fault components to be tracked in a single frequency band. In detail, a frequency sliding is applied to the signals being analyzed (currents, voltages, vibration signals), so that each fault frequency component is shifted into a prefixed frequency band. The discrete wavelet transform is then applied to the resulting signal to extract the fault signature in the chosen band. Once the state of the machine has been qualitatively diagnosed, a quantitative evaluation of the fault degree is necessary. For this purpose, a fault index based on the energy of the approximation and/or detail signals resulting from the wavelet decomposition is introduced to quantify the fault extent. The main advantages of the new method over existing diagnosis techniques are the following:
- capability of monitoring the fault evolution continuously over time under any transient operating condition;
- no need for speed/slip measurement or estimation;
- higher accuracy in filtering frequency components around the fundamental in the case of rotor faults;
- reduced likelihood of false indications, since confusion with other fault harmonics is avoided (the contributions of the most relevant fault frequency components under speed-varying conditions are confined to a single frequency band);
- low memory requirement due to the low sampling frequency;
- reduced processing latency (no repeated sampling operations required).
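A hedged sketch of the frequency-sliding idea: heterodyne the signal so that the tracked fault component lands in a fixed low-frequency band, then take the energy of the discrete-wavelet approximation as the fault index. The wavelet, the decomposition level, and all frequencies below are illustrative assumptions:

```python
# Frequency sliding + DWT energy as an illustrative fault index.
import numpy as np
import pywt

fs = 8192                               # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
f_fault = 1237.0                        # fault component to be tracked
x = 0.2 * np.sin(2 * np.pi * f_fault * t) + 0.05 * np.random.randn(t.size)

# Slide the fault component down to (near) DC, keep the real part.
x_slid = np.real(x * np.exp(-2j * np.pi * f_fault * t))

# After 5 DWT levels, the approximation covers roughly [0, fs/2**6) Hz,
# i.e. the prefixed band the slid component now falls into.
coeffs = pywt.wavedec(x_slid, "db4", level=5)
fault_index = np.sum(coeffs[0] ** 2)    # energy of the approximation
print("fault index: %.3f" % fault_index)
```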

Relevance:

30.00%

Publisher:

Abstract:

Intelligent systems are now pervasive in society, supporting a synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and poor software behavior may invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis for human-centered applications, with interpretability being one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic and life-long system operation. As most software registers its inner events by means of logs, log analysis is an approach to keeping systems operational. Logs are Big Data assembled in high-flow streams, being unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods that provide maintenance solutions for anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses. LP provides a deeper semantic interpretation of anomalous occurrences. The solutions evolve over time and are general-purpose, being highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed to automate the parsing of system logs. All the methods perform recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving intelligent systems literature, the proposed method, eLP, is able to process streams of words and sentences. In terms of AD accuracy, FBeM achieved (85.64 ± 3.69)%, eGNN reached (96.17 ± 0.78)%, eGFC obtained (92.48 ± 1.21)%, and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP additionally generates a log grammar and presents a higher level of model interpretability.
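The create-or-expand loop common to these evolving granular methods can be illustrated with a toy interval-granule classifier; the class names, the slack parameter, and the tiny data stream below are invented for illustration and are not the FBeM/eGNN/eGFC algorithms themselves:

```python
# Toy evolving-granular classifier: interval granules are created or
# expanded online as labeled samples arrive; prediction uses the
# granule whose midpoint is nearest to the query.
import numpy as np

class Granule:
    def __init__(self, x, label):
        self.lo, self.hi, self.label = x.copy(), x.copy(), label

    def contains(self, x, slack=0.0):
        return np.all(x >= self.lo - slack) and np.all(x <= self.hi + slack)

    def expand(self, x):
        self.lo, self.hi = np.minimum(self.lo, x), np.maximum(self.hi, x)

class EvolvingClassifier:
    def __init__(self, slack=0.1):
        self.granules, self.slack = [], slack

    def learn(self, x, label):
        for g in self.granules:
            if g.label == label and g.contains(x, self.slack):
                g.expand(x)                       # update an existing granule
                return
        self.granules.append(Granule(x, label))   # create a new one

    def predict(self, x):
        mids = [(g, np.linalg.norm(x - (g.lo + g.hi) / 2))
                for g in self.granules]
        return min(mids, key=lambda t: t[1])[0].label

clf = EvolvingClassifier()
for x, y in [([0.1, 0.2], "normal"), ([0.15, 0.25], "normal"),
             ([0.9, 0.8], "anomaly")]:
    clf.learn(np.array(x), y)
print(clf.predict(np.array([0.12, 0.22])))        # "normal"
```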

Relevance:

30.00%

Publisher:

Abstract:

In recent years, there has been exponential growth in the use of virtual spaces, including dialogue systems, that handle personal information. The concept of personal privacy is much discussed and controversial in the literature, whereas in the technological field it directly influences the degree of reliability perceived in an information system (privacy 'as trust'). This work aims to protect the right to privacy over personal data (GDPR, 2018) and to avoid the loss of sensitive content by exploring the sensitive information detection (SID) task. It is grounded on the following research questions: (RQ1) What does sensitive data mean? How can a personal sensitive information domain be defined? (RQ2) How can a state-of-the-art model for SID be created? (RQ3) How should the model be evaluated? RQ1 theoretically investigates the concept of privacy and the ontological state-of-the-art representation of personal information. The Data Privacy Vocabulary (DPV) is the taxonomic resource taken as the authoritative reference for the definition of the knowledge domain. Concerning RQ2, we investigate two approaches to classifying sensitive data: the first, bottom-up, explores automatic learning methods based on transformer networks; the second, top-down, proposes logical-symbolic methods through the construction of privaframe, a knowledge graph of compositional frames representing personal data categories. Both approaches are tested. For the evaluation (RQ3), we create SPeDaC, a sentence-level labeled resource. It can be used as a benchmark or as training data for the SID task, filling the gap left by the lack of a shared resource in this field. While the approach based on artificial neural networks confirms the validity of the direction taken by the most recent studies on SID, the logical-symbolic approach emerges as the preferred way to classify fine-grained personal data categories, thanks to the semantically grounded, tailored modeling it allows. At the same time, the results highlight the strong potential of hybrid architectures for solving automatic tasks.
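The bottom-up approach can be sketched with an off-the-shelf transformer text-classification pipeline; the checkpoint below is a generic public model standing in only as a placeholder for one fine-tuned on sensitivity labels (e.g. on SPeDaC), so the labels and scores it produces are purely illustrative:

```python
# Sentence-level classification with a transformer encoder, as a stand-in
# for an SID classifier. Replace the checkpoint with a model fine-tuned
# on sensitivity labels for real use; this one outputs sentiment labels.
from transformers import pipeline

clf = pipeline("text-classification",
               model="distilbert-base-uncased-finetuned-sst-2-english")
print(clf("My home address is 12 Example Street."))
```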

Relevance:

30.00%

Publisher:

Abstract:

Clear cell sarcoma of the kidney (CCSK) is the second most common pediatric renal tumor, characterized in 90% of cases by the presence of internal tandem duplications (ITDs) localized in the last exon of the BCOR gene. The BCOR protein constitutes a core component of the non-canonical Polycomb Repressive Complex 1 (PRC1.1), which performs a fundamental silencing activity. ITDs in the last BCOR exon at the level of the PUFD domain have been identified in many tumor subtypes and could affect PCGF1 binding and the subsequent PRC1.1 activity, although the exact oncogenic mechanism of the ITD remains poorly understood. This project aims to investigate the molecular mechanisms underlying the oncogenesis of CCSK, approaching the study with different methodologies. A first model in HEK-293 cells yielded important information about BCOR functionality, suggesting that the presence of the ITD generates an altered activity that is very different from a loss of function. It was also observed that BCOR function within the PRC1.1 complex varies with different ITDs. Moreover, this model allowed the identification of molecular signatures evoked by the presence of BCOR-ITD, including its role in extracellular matrix interactions and the promotion of invasiveness. A parallel analysis of WTS data from 8 CCSK cases permitted the identification of a peculiar signature for metastatic CCSKs, highlighting a 20-fold overexpression of FGF3. This factor promoted a significant increase in invasive ability in the cellular model. To study the effects of BCOR-ITD on cell stemness and differentiation, an inducible model is being developed in H1 cells. This will make it possible to study the functionality of BCOR-ITD in a context closer to the origin of CCSKs, evaluating both the specific interactome and the phenotypic consequences caused by the mutation.