876 results for bigdata, data stream processing, dsp, apache storm, cyber security
Abstract:
Artificial immune systems have previously been applied to the problem of intrusion detection. The aim of this research is to develop an intrusion detection system based on the function of Dendritic Cells (DCs). DCs are antigen-presenting cells and key to the activation of the human immune system; this behaviour has been abstracted to form the Dendritic Cell Algorithm (DCA). In algorithmic terms, individual DCs perform multi-sensor data fusion, asynchronously correlating the fused data signals with a secondary data stream. The aggregate output of a population of cells is analysed and forms the basis of an anomaly detection system. In this paper, the DCA is applied to the detection of outgoing port scans using TCP SYN packets. Results show that detection can be achieved with the DCA, yet some false positives can be encountered when simultaneously scanning and using other network services. Suggestions are made for using adaptive signals to alleviate this newly uncovered problem.
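Purely illustrative: a minimal Python sketch of the per-cell signal fusion described above, assuming example signal categories (PAMP, danger, safe), toy fusion weights, and a fixed migration threshold; the published DCA uses empirically derived weight matrices and randomised per-cell thresholds not given in this abstract.

# Sketch of DCA-style multi-sensor fusion (illustrative; the weights,
# threshold, and signal values below are assumptions, not published values).
WEIGHTS = {
    "csm":    {"pamp": 2.0, "danger": 1.0, "safe": 2.0},   # co-stimulation
    "mature": {"pamp": 2.0, "danger": 1.0, "safe": -2.0},  # anomaly context
}

class DendriticCell:
    def __init__(self, migration_threshold):
        self.threshold = migration_threshold
        self.csm = 0.0       # cumulative co-stimulation signal
        self.mature = 0.0    # cumulative anomaly ("mature") output
        self.antigens = []   # items sampled from the secondary data stream

    def sample(self, antigen, signals):
        """Fuse one time-step of signals and store the antigen."""
        self.antigens.append(antigen)
        for name, value in signals.items():
            self.csm += WEIGHTS["csm"][name] * value
            self.mature += WEIGHTS["mature"][name] * value
        return self.csm >= self.threshold   # True -> the cell migrates

# A single cell consuming a stream of (antigen, signals) observations:
cell = DendriticCell(migration_threshold=4.0)
stream = [("port:22",    {"pamp": 0.1, "danger": 0.2, "safe": 0.8}),
          ("port:31337", {"pamp": 0.9, "danger": 0.7, "safe": 0.0})]
for antigen, signals in stream:
    if cell.sample(antigen, signals):
        context = "anomalous" if cell.mature > 0 else "normal"
        print(f"cell migrated; presents {cell.antigens} as {context}")
        break

In the full algorithm, many such cells run concurrently and each antigen's anomaly score is derived from the fraction of cells that presented it in an anomalous context.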
Abstract:
Artificial immune systems, more specifically the negative selection algorithm, have previously been applied to intrusion detection. The aim of this research is to develop an intrusion detection system based on a novel concept in immunology, the Danger Theory. Dendritic Cells (DCs) are antigen-presenting cells and key to the activation of the human immune system. DCs perform the vital role of combining signals from the host tissue and correlating these signals with proteins known as antigens. In algorithmic terms, individual DCs perform multi-sensor data fusion based on time-windows. The whole population of DCs asynchronously correlates the fused signals with a secondary data stream. The behaviour of human DCs is abstracted to form the DC Algorithm (DCA), which is implemented using an immune-inspired framework, libtissue. This system is used to detect context switching for a basic machine learning dataset and to detect outgoing port scans in real-time. Experimental results show a significant difference between an outgoing port scan and normal traffic.
Abstract:
As introduced by Bentley et al. (2005), artificial immune systems (AIS) lack tissue, which is present in one form or another in all living multi-cellular organisms. Some have argued that, in the context of AIS, this concept brings little novelty to the already saturated field of immune-inspired computational research. This article aims to show that such a component of an AIS has the potential to bring an advantage to a data processing algorithm in terms of data pre-processing, clustering, and extraction of the features desired by the immune-inspired system. The proposed tissue algorithm is based on self-organizing networks, such as the self-organizing maps (SOM) developed by Kohonen (1996), and on an analogy of the so-called Toll-Like Receptors (TLR) affecting the activation function of the clusters formed by the SOM.
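A hedged sketch of how such a tissue layer might look: a small NumPy SOM whose cluster activations are gated by a TLR-like signal. The map size, learning schedule, and gating rule are all assumptions for illustration, not the article's actual algorithm.

# Sketch of a SOM-based "tissue" layer whose cluster activations are gated by
# a TLR-like signal (illustrative; sizes, rates, and gating are assumptions).
import numpy as np

rng = np.random.default_rng(0)
grid, dim = 4, 3                       # 4x4 map over 3-D inputs
w = rng.random((grid * grid, dim))     # codebook vectors

def best_matching_unit(x):
    return np.argmin(np.linalg.norm(w - x, axis=1))

def train(data, epochs=20, lr=0.5, radius=1.5):
    for _ in range(epochs):
        for x in data:
            bmu = best_matching_unit(x)
            by, bx = divmod(bmu, grid)
            for i in range(grid * grid):
                iy, ix = divmod(i, grid)
                d2 = (iy - by) ** 2 + (ix - bx) ** 2
                h = np.exp(-d2 / (2 * radius ** 2))   # neighbourhood kernel
                w[i] += lr * h * (x - w[i])

def activation(x, tlr_signal):
    """Cluster activation modulated by a TLR-like danger signal in [0, 1]."""
    base = 1.0 / (1.0 + np.linalg.norm(w[best_matching_unit(x)] - x))
    return base * tlr_signal

train(rng.random((100, dim)))
print(activation(rng.random(dim), tlr_signal=0.8))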
Abstract:
The use of the electrocardiogram (ECG) is nowadays very important in the diagnosis of heart disease. The rapid growth of portable technology enables medical monitoring of vital signs, allowing freedom of movement and observation during the patient's normal activity. This study describes the development of a prototype of an ambulatory cardiac monitoring system using 3 leads. The system converts the analog signal, previously processed and conditioned, into a digital ECG signal, which is then processed by a microcontroller (MCU). The heartbeat rate can be observed on an LCD display, which is also used as the interface during the setup process. The entire digital data stream can be stored on an SD memory card, allowing the ECG signal to be accessed later on a PC.
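As a rough illustration of the processing step the MCU performs, the sketch below estimates heart rate from a digitized ECG stream by simple threshold-based R-peak detection; the sampling rate, threshold, and refractory period are assumptions, not the prototype's actual parameters.

# Sketch of the heart-rate estimation an MCU might perform on the digitized
# ECG stream (illustrative; sampling rate and threshold are assumptions).
FS = 250                      # assumed sampling rate in Hz
THRESHOLD = 0.6               # assumed normalized R-peak threshold
REFRACTORY = int(0.25 * FS)   # ignore re-crossings for 250 ms

def heart_rate_bpm(samples):
    """Count upward threshold crossings (R peaks), convert to beats/minute."""
    peaks, last = [], -REFRACTORY
    for i in range(1, len(samples)):
        if samples[i - 1] < THRESHOLD <= samples[i] and i - last >= REFRACTORY:
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return None
    rr = [(b - a) / FS for a, b in zip(peaks, peaks[1:])]   # R-R intervals, s
    return 60.0 / (sum(rr) / len(rr))

demo = ([0.0] * 40 + [1.0] + [0.0] * 209) * 4   # one synthetic beat per second
print(heart_rate_bpm(demo))                     # ~60 bpm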
Abstract:
Wind energy is one of the most promising and fastest growing sectors of energy production. Wind is an ecologically friendly and relatively cheap energy resource, available for development in practically all corners of the world (wherever the wind blows). Today, wind power has gained broad development in the Scandinavian countries. Three important challenges concerning sustainable development, i.e. energy security, climate change, and energy access, make a compelling case for large-scale utilization of wind energy. In Finland, according to the climate and energy strategy adopted in 2008, electricity generated by wind farms should reach 6 - 7% of total consumption in the country by 2020 [1]. The main challenges associated with wind energy production are the harsh operational conditions that often accompany turbine operation in northern climates and poor accessibility for maintenance and service. One of the major problems requiring a solution is the icing of turbine structures. Icing reduces the performance of wind turbines, which, in conditions of a long cold period, can significantly affect the reliability of power supply. In order to predict and control power performance, the process of ice accretion has to be carefully tracked. There are two ways to detect icing: directly or indirectly. The first relies on dedicated ice detection instruments; the second uses indirect characteristics of turbine performance. One such indirect method for ice detection and power loss estimation is proposed and used in this paper, and its results are compared to those obtained directly from the ice sensors. The data used were measured at the Muukko wind farm, southeast Finland, during the project 'Wind power in cold climate and complex terrain'. The project was carried out in 9/2013 - 8/2015 with the partners Lappeenranta University of Technology, Alstom Renovables España S.L., TuuliMuukko, and TuuliSaimaa.
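The abstract does not spell out the indirect method, so the sketch below shows one common approach of this kind, assumed for illustration: comparing measured output power against a reference power curve and flagging icing when the deficit exceeds a threshold. The curve points and the 15% threshold are invented.

# Sketch of indirect ice detection: flag icing when measured power falls well
# below the reference power curve for the observed wind speed (illustrative;
# the curve points and 15% threshold are assumptions).
import bisect

# (wind speed m/s, expected power kW) -- hypothetical reference curve
CURVE = [(3, 25), (5, 150), (7, 450), (9, 900), (11, 1500), (13, 2000)]
LOSS_THRESHOLD = 0.15    # assumed relative power deficit indicating icing

def expected_power(wind_speed):
    """Linear interpolation on the reference power curve."""
    speeds = [s for s, _ in CURVE]
    i = bisect.bisect_left(speeds, wind_speed)
    if i == 0:
        return CURVE[0][1]
    if i == len(CURVE):
        return CURVE[-1][1]
    (s0, p0), (s1, p1) = CURVE[i - 1], CURVE[i]
    return p0 + (p1 - p0) * (wind_speed - s0) / (s1 - s0)

def icing_suspected(wind_speed, measured_power):
    exp = expected_power(wind_speed)
    deficit = (exp - measured_power) / exp
    return deficit > LOSS_THRESHOLD, deficit

flag, deficit = icing_suspected(wind_speed=9.0, measured_power=650.0)
print(f"icing suspected: {flag} (power deficit {deficit:.0%})")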
Abstract:
Due to the sensitive nature of patient data, the secondary use of electronic health records (EHR) is restricted in scientific research and product development. Such restrictions aim to preserve the privacy of the respective patients by limiting the availability and variety of sensitive patient data, but the current limitations do not correspond to the actual needs of potential secondary users. In this thesis, the secondary use of Finnish and Swedish EHR data is explored with the purpose of enhancing the availability of such data for clinical research and product development. The EHR-related procedures and technologies involved are analysed to identify the issues limiting the secondary use of patient data; successful secondary use of patient data increases the data's value. To explore the identified circumstances, a case study of potential secondary users and use intentions regarding EHR data was carried out in Finland and Sweden. Data collection for the case study was performed using semi-structured interviews: in total, 14 Finnish and Swedish experts representing scientific research, health management, and business were interviewed. The motivation for these interviews was to evaluate the protection of EHR data used for secondary purposes, and the efficiency of the implemented procedures and technologies was analysed in terms of data availability and privacy preservation. The results of the case study show that the factors affecting EHR availability fall into three categories: management of patient data, preservation of patients' privacy, and potential secondary users. Identified issues regarding data management included laborious and inconsistent data request procedures and the role and effect of external service providers. Based on the study findings, two approaches enabling the secondary use of EHR data are identified: data alteration and the protected processing environment. Data alteration increases the availability of relevant EHR data while decreasing its value; the protected processing approach restricts the number of potential users and use intentions while providing more valuable data content.
Abstract:
This research aims to analyse tools and methods for the application of H-BIM, understanding their critical issues and providing useful solutions in this field. At the same time, its purpose is not limited to the simple production of semantically structured, parametric 3D models from a point cloud obtained through a digital survey; rather, it seeks to define the criteria and methods for applying H-BIM within the entire process. The chosen methodological approach starts from the state of the art in H-BIM, with a study of the current regulations on the subject and of the most relevant case studies. A comprehensive critical review of the literature on BIM and H-BIM technology was conducted, analysing experiences of BIM adoption in the global construction sector. Furthermore, in order to promote smart solutions within Facility Management, it was necessary to analyse the critical issues present in current procedures, to review the processes and methods for collecting and managing data, and to identify the procedures needed to ensure a successful implementation. The procedural and operational potential linked to the systematic use of digital innovations from a Facility Management perspective was highlighted, together with a study of the tools for data acquisition, processing, and post-production. Testing was then carried out on specific cases for the analysis of the Scan-to-BIM phase, differentiated by type of use, construction date, ownership, and location. This path made it possible to highlight the meaning and implications of using BIM in Facility Management, based on a differentiation of the applications of the BIM model as existing conditions vary. Finally, conclusions were drawn and recommendations formulated regarding the future use of H-BIM technology in the construction sector, in particular by defining the emerging frontier of the Digital Twin as a necessary vehicle for the future of Construction 4.0.
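For flavour only, a minimal sketch of a typical first Scan-to-BIM operation: isolating a planar element (e.g. a wall) from a survey point cloud with RANSAC. Open3D, the file names, and the 2 cm tolerance are assumptions for illustration, not the toolchain used in the thesis.

# Sketch of a typical first Scan-to-BIM step: segmenting a planar surface
# (e.g. a wall) out of a survey point cloud with RANSAC. Open3D is an assumed
# tool here, and the file names are hypothetical.
import open3d as o3d

pcd = o3d.io.read_point_cloud("survey_scan.ply")      # hypothetical input file
plane_model, inlier_idx = pcd.segment_plane(
    distance_threshold=0.02,   # 2 cm tolerance, an assumed survey accuracy
    ransac_n=3,
    num_iterations=1000,
)
a, b, c, d = plane_model
print(f"plane: {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0")

wall = pcd.select_by_index(inlier_idx)                # candidate wall surface
rest = pcd.select_by_index(inlier_idx, invert=True)   # remaining structure
o3d.io.write_point_cloud("wall_segment.ply", wall)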
Abstract:
In the last decade, manufacturing companies have been facing two significant challenges. First, digitalization requires adopting Industry 4.0 technologies and makes it possible to create smart, connected, self-aware, and self-predictive factories. Second, the focus on sustainability requires evaluating and reducing the impact of the implemented solutions from economic and social points of view. In manufacturing companies, the maintenance of physical assets assumes a critical role: increasing the reliability and availability of production systems minimizes system downtime, and proper system functioning avoids production waste and potentially catastrophic accidents. Digitalization and new ICT technologies have assumed a relevant role in maintenance strategies. They allow assessing the health condition of machinery at any point in time; moreover, they allow predicting the future behavior of machinery so that maintenance interventions can be planned and the useful life of components can be exploited up to the instant before failure. This dissertation provides insights on Predictive Maintenance goals and tools in Industry 4.0 and proposes a novel data acquisition, processing, sharing, and storage framework that addresses typical issues machine producers and users encounter. The research elaborates on two research questions that narrow down the potential approaches to data acquisition, processing, and analysis for fault diagnostics in evolving environments. The research activity is developed according to a research framework, where the research questions are addressed by research levers that are explored according to research topics. Each topic requires a specific set of methods and approaches; however, the overarching methodological approach presented in this dissertation includes three fundamental aspects: the maximization of the quality level of input data, the use of Machine Learning methods for data analysis, and the use of case studies deriving from both controlled environments (laboratory) and real-world instances.
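To make the Machine Learning aspect concrete, here is a hedged sketch of a fault-diagnostics classifier trained on condition-monitoring features. The features, data, and model choice are placeholders; the dissertation's actual methods and datasets are not specified in this abstract.

# Sketch of ML-based fault diagnostics: a classifier trained on
# condition-monitoring features (synthetic placeholder data throughout).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
# Hypothetical features: RMS vibration, peak acceleration, bearing temperature
healthy = rng.normal([1.0, 3.0, 55.0], [0.2, 0.5, 3.0], (n, 3))
faulty = rng.normal([1.8, 6.5, 70.0], [0.4, 1.0, 5.0], (n, 3))
X = np.vstack([healthy, faulty])
y = np.array([0] * n + [1] * n)       # 0 = healthy, 1 = incipient fault

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te),
                            target_names=["healthy", "fault"]))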
Abstract:
The central aim of this dissertation is to introduce innovative methods, models, and tools to enhance the overall performance of supply chains responsible for handling perishable products. This concept of improved performance encompasses several critical dimensions, including enhanced efficiency in supply chain operations, product quality, safety, sustainability, minimization of waste generation, and compliance with norms and regulations. The research is structured around three specific research questions that provide a solid foundation for delving into and narrowing down the array of potential solutions. These questions primarily concern enhancing the overall performance of distribution networks for perishable products and optimizing the package hierarchy, extending to unconventional packaging solutions. To address these research questions effectively, a well-defined research framework guides the approach, and the dissertation adheres to an overarching methodological approach comprising three fundamental aspects. The first aspect centers on the necessity of systematic data sampling and categorization, including the identification of critical points within food supply chains. The data collected in this context must then be organized within a customized data structure designed to feed both cyber-physical and digital twins, so as to quantify and analyze supply chain failures from a preventive perspective.
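As a hedged illustration of such a data structure, the sketch below defines a record for critical-point readings that a digital twin could consume; the field names and the 0-4 °C cold-chain band are assumptions, not the dissertation's schema.

# Sketch of a record structure for critical-point monitoring in a perishable
# supply chain, of the kind that could feed a digital twin (field names and
# the 0-4 °C threshold are assumptions).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CriticalPointReading:
    shipment_id: str
    critical_point: str          # e.g. "loading dock", "reefer truck"
    temperature_c: float
    humidity_pct: float
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def cold_chain_violation(self, low=0.0, high=4.0):
        """Flag readings outside an assumed 0-4 °C cold-chain band."""
        return not (low <= self.temperature_c <= high)

reading = CriticalPointReading("SHP-001", "reefer truck", 6.2, 78.0)
print(reading.cold_chain_violation())   # True -> candidate failure event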
Abstract:
Riding the wave of recent groundbreaking achievements, artificial intelligence (AI) is currently the buzzword on everybody's lips, and, by allowing algorithms to learn from historical data, Machine Learning (ML) has emerged as its pinnacle. The multitude of algorithms, each with unique strengths and weaknesses, highlights the absence of a universal solution and poses a challenging optimization problem. In response, automated machine learning (AutoML) navigates vast search spaces within minimal time constraints. By lowering entry barriers, AutoML has emerged as a promise of the democratization of AI, yet it still faces some challenges. In data-centric AI, the discipline of systematically engineering the data used to build an AI system, the challenge of configuring data pipelines is comparatively simple. We devise a methodology for building effective data pre-processing pipelines in supervised learning as well as a data-centric AutoML solution for unsupervised learning. In human-centric AI, many current AutoML tools were built not around the user but around algorithmic ideas, raising ethical and social bias concerns. We contribute by deploying AutoML tools that aim at complementing, instead of replacing, human intelligence. In particular, we provide solutions for single-objective and multi-objective optimization and showcase the challenges and potential of novel interfaces featuring large language models. Finally, there are application areas that rely on numerical simulators, often related to earth observations; these tend to be particularly high-impact and address important challenges such as climate change and crop life cycles. We commit to coupling these physical simulators with (Auto)ML solutions towards a physics-aware AI. Specifically, in precision farming, we design a smart irrigation platform that allows real-time monitoring of soil moisture, predicts future moisture values, and estimates water demand to schedule irrigation.
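A hedged sketch of what a data-centric pipeline search can look like: random search over a toy space of pre-processing configurations, scored by cross-validation. The search space, budget, and model are illustrative stand-ins, far smaller than what an actual AutoML system explores.

# Sketch of data-centric pipeline search: randomly sample pre-processing
# configurations and keep the best under cross-validation (illustrative).
import random
from sklearn.datasets import load_breast_cancer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler, StandardScaler

X, y = load_breast_cancer(return_X_y=True)
space = {
    "impute": [SimpleImputer(strategy="mean"),
               SimpleImputer(strategy="median")],
    "scale": [StandardScaler(), MinMaxScaler(), "passthrough"],
}

best_score, best_cfg = -1.0, None
random.seed(0)
for _ in range(6):                           # tiny budget for illustration
    cfg = {k: random.choice(v) for k, v in space.items()}
    pipe = Pipeline([("impute", cfg["impute"]), ("scale", cfg["scale"]),
                     ("model", LogisticRegression(max_iter=5000))])
    score = cross_val_score(pipe, X, y, cv=5).mean()
    if score > best_score:
        best_score, best_cfg = score, cfg
print(best_score, best_cfg)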
Abstract:
As part of their digital transformation, many organizations are adopting new technologies to support the development, deployment, and management of their microservice-based architectures in cloud environments and across cloud providers. In this scenario, service and event meshes are emerging as dynamic, configurable infrastructure layers that facilitate complex interactions and the management of microservice-based applications and cloud services. The goal of this work is to analyse open-source mesh solutions (Istio, Linkerd, Apache EventMesh) from a performance standpoint, when used to manage the communication between workflow applications based on microservices within the cloud environment. To this end, a system was built to deploy each of the components both within a single cluster and in a multi-cluster environment. Metric collection and aggregation were implemented with a custom system compatible with the Prometheus data format. The tests allowed us to evaluate the performance of each component together with its effectiveness. Overall, while the maturity of the tested service mesh implementations could be confirmed, the event mesh solution we used appeared to be a technology that is not yet mature, owing to numerous operational problems.
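As a hedged sketch of Prometheus-compatible metric collection of the kind described, the snippet below exposes a latency histogram for one workflow step using the prometheus_client library; the metric and step names are hypothetical, and the thesis used its own custom collector rather than this library.

# Sketch of Prometheus-compatible metrics for a microservice workflow step
# (metric names and the prometheus_client library choice are assumptions).
import random
import time
from prometheus_client import Histogram, start_http_server

REQUEST_LATENCY = Histogram(
    "workflow_step_latency_seconds",
    "Latency of one workflow step as routed through the mesh",
    ["step"],
)

def handle_step(step_name):
    with REQUEST_LATENCY.labels(step=step_name).time():
        time.sleep(random.uniform(0.01, 0.05))   # stand-in for real work

if __name__ == "__main__":
    start_http_server(8000)        # metrics scrapeable at :8000/metrics
    while True:
        handle_step("order-validation")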
Abstract:
The World Health Organization estimates that in the most industrialized countries one in three people suffers from a foodborne illness every year. According to data from the European Food Safety Authority, the 27 Member States of the European Union reported a total of 5,363 foodborne outbreaks in 2012, with the food service sector prevailing as the setting where foodborne disease outbreaks most often occurred. For the same year, Portugal reported 7 foodborne outbreaks, involving 135 people and 42 hospitalizations. In this context, the application of good hygiene practices, particularly in the food service sector, is essential to protect consumers from foodborne illness. This study sought to identify the constructs of Icek Ajzen's Theory of Planned Behaviour (TPB) that best explain food handlers' intention to adopt hygiene behaviours, namely: i) wearing gloves and a hair-protection cap, and ii) removing personal adornments, while handling food. To this end, a questionnaire based on the Theory of Planned Behaviour was administered to a sample of one hundred and twenty-three workers from the various canteens of a Portuguese university, the vast majority of them female (91.1%) and handling food on a daily basis; a preliminary qualitative study, or pre-survey, was first conducted to better select the essential themes and main categories to consider in constructing the questionnaire. The surveys were treated statistically using descriptive statistics, factor analysis, and assessment of the internal consistency of the resulting factors, followed by linear regression and path modeling with a view to validating the TPB. The results indicate that Attitude is the best predictor of the Intention to adopt the behaviours under study. It was also found that the motivation to comply results from pressure exerted by hierarchical superiors or colleagues, positively influencing intention, with normative beliefs emerging as the second-best predictor of intention.
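For illustration only, a minimal sketch of the linear-regression step of a TPB validation: Intention regressed on Attitude, Subjective Norm, and Perceived Behavioural Control. The data below are synthetic Likert-style scores; the study's actual data and full path model are not reproduced.

# Sketch of the TPB regression step on synthetic 7-point Likert scores
# (illustrative; not the study's data or path model).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 123                                   # matching the sample size above
attitude = rng.integers(1, 8, n).astype(float)
norm = rng.integers(1, 8, n).astype(float)
pbc = rng.integers(1, 8, n).astype(float)
intention = 0.6 * attitude + 0.3 * norm + 0.1 * pbc + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([attitude, norm, pbc]))
model = sm.OLS(intention, X).fit()
print(model.summary(xname=["const", "attitude", "subj_norm", "pbc"]))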
Abstract:
Master's degree in International Relations.
Abstract:
Immune systems have inspired approaches to several computational problems in recent years. This paper focuses on enhancing the accuracy of behavioural biometric authentication algorithms by applying them more than once, with different thresholds, in order to first simulate the protection provided by the skin and then look for known outside entities, as lymphocytes do. The paper describes the principles that support the application of this approach to Keystroke Dynamics, a biometric authentication technology that decides on the legitimacy of a user based on the typing pattern captured as he enters the username and/or the password. As a proof of concept, the accuracy levels of one keystroke dynamics algorithm applied to five legitimate users of a system are calculated for both the traditional and the immune-inspired approaches, and the results are compared.
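A hedged sketch of the two-pass idea described above: a loose first threshold plays the role of the skin, and a second pass compares the sample against known outsider patterns, as lymphocytes do. The distance metric, thresholds, and timing vectors are assumptions for illustration.

# Sketch of the immune-inspired two-pass check: a loose "skin" threshold
# against the user's template, then a "lymphocyte" pass against known
# outsider patterns (metric and thresholds are assumptions).
def mean_abs_distance(a, b):
    """Mean absolute difference between two key-timing vectors (ms)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

SKIN_THRESHOLD = 40.0        # loose first barrier (assumed, in ms)
LYMPHOCYTE_THRESHOLD = 25.0  # tight match against known outsiders (assumed)

def authenticate(sample, user_template, known_outsiders):
    if mean_abs_distance(sample, user_template) > SKIN_THRESHOLD:
        return False                         # rejected at the "skin"
    for outsider in known_outsiders:         # "lymphocyte" pass
        if mean_abs_distance(sample, outsider) < LYMPHOCYTE_THRESHOLD:
            return False                     # matches a known intruder
    return True

template = [120.0, 95.0, 150.0, 110.0]       # hypothetical hold/flight times
outsiders = [[210.0, 60.0, 90.0, 180.0]]
print(authenticate([125.0, 98.0, 145.0, 112.0], template, outsiders))  # True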
Abstract:
Master's dissertation in Law and Informatics.