10 results for bigdata, data stream processing, dsp, apache storm, cyber security
at Universidade do Minho
Abstract:
Inspired by the relational algebra of data processing, this paper addresses the foundations of data analytical processing from a linear algebra perspective. The paper investigates, in particular, how aggregation operations such as cross tabulations and data cubes, essential to the quantitative analysis of data, can be expressed solely in terms of matrix multiplication, transposition and the Khatri–Rao variant of the Kronecker product. The approach offers a basis for deriving an algebraic theory of data consolidation, handling the quantitative as well as the qualitative sides of data science in a natural, elegant and typed way. It also shows potential for parallel analytical processing, as the parallelization theory of such matrix operations is well established.
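The core construction can be illustrated with a small, hypothetical NumPy sketch (not the paper's own formalism or notation): each attribute is encoded as a 0/1 value-by-record projection matrix, a cross tabulation is then just a matrix product, and the Khatri–Rao product pairs attributes.

```python
import numpy as np

# Toy dataset: 5 records with two categorical attributes.
city   = ["Porto", "Braga", "Porto", "Porto", "Braga"]
season = ["Summer", "Summer", "Winter", "Summer", "Winter"]

def projection_matrix(column):
    """Encode an attribute as a 0/1 matrix: one row per distinct value,
    one column per record (value-by-record incidence)."""
    values = sorted(set(column))
    return np.array([[1 if v == x else 0 for x in column] for v in values]), values

C, city_vals = projection_matrix(city)
S, season_vals = projection_matrix(season)

# Cross tabulation of city vs. season: co-occurrence counts obtained purely
# by matrix multiplication and transposition.
crosstab = C @ S.T
print(city_vals, season_vals)
print(crosstab)            # e.g. the (Porto, Summer) cell counts 2 records

# Khatri-Rao (column-wise Kronecker) product pairs the two attributes,
# yielding the projection matrix of the combined attribute (city, season).
khatri_rao = np.vstack([C[i] * S[j]
                        for i in range(C.shape[0])
                        for j in range(S.shape[0])])
totals = khatri_rao @ np.ones(len(city))   # the same counts, flattened
print(totals)
```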
Abstract:
Immune systems have inspired approaches to several computational problems in recent years. This paper focuses on enhancing the accuracy of behavioural biometric authentication algorithms by applying them more than once, with different thresholds, in order to first simulate the protection provided by the skin and then look for known outside entities, as lymphocytes do. The paper describes the principles that support the application of this approach to Keystroke Dynamics, a biometric authentication technology that decides on the legitimacy of a user based on the typing pattern captured as the username and/or password is entered. As a proof of concept, the accuracy levels of one keystroke dynamics algorithm applied to five legitimate users of a system are calculated for both the traditional and the immune-inspired approaches, and the obtained results are compared.
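The following is a minimal, hypothetical Python sketch of the two-pass idea, assuming a simple timing-distance matcher and made-up thresholds (it is not the paper's actual algorithm or parameters): a permissive "skin" pass rejects samples far from the legitimate user's template, and a stricter "lymphocyte" pass rejects samples close to known outsiders.

```python
from statistics import mean

def distance(sample, template):
    """Mean absolute difference between keystroke timing vectors (ms)."""
    return mean(abs(s - t) for s, t in zip(sample, template))

def authenticate(sample, user_template, impostor_templates,
                 skin_threshold=40.0, lymph_threshold=15.0):
    # "Skin" pass: reject anything that does not resemble the legitimate user.
    if distance(sample, user_template) > skin_threshold:
        return False
    # "Lymphocyte" pass: reject anything that resembles a known outsider.
    return all(distance(sample, imp) > lymph_threshold
               for imp in impostor_templates)

# Usage with made-up hold-time vectors (ms):
user_template = [110, 95, 130, 102]
known_impostors = [[150, 140, 170, 135]]
print(authenticate([115, 99, 128, 100], user_template, known_impostors))   # True
print(authenticate([148, 142, 168, 133], user_template, known_impostors))  # False
```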
Abstract:
Master's dissertation in Law and Informatics (Direito e Informática)
Abstract:
As increasingly sophisticated materials and products are being developed and times-to-market need to be minimized, it is important to make available fast-response characterization tools that use small amounts of sample and can convey data on the relationships between rheological response, process-induced material structure and product characteristics. For this purpose, a single/twin-screw mini-extrusion system of modular construction, with well-controlled outputs in the range 30-300 g/h, was coupled to an in-house developed rheo-optical slit die able to measure shear viscosity and normal-stress differences, as well as to perform rheo-optical experiments, namely small-angle light scattering (SALS) and polarized optical microscopy (POM). In addition, the mini-extruder is equipped with ports that allow sample collection, and the extrudate can be further processed into products to be tested later. Here, we present the concept and experimental set-up [1, 2]. As a typical application, we report on the characterization of the processing of a polymer blend and of the properties of extruded sheets. The morphological evolution of a PS/PMMA industrial blend along the extruder, the flow-induced structures developed and the corresponding rheological characteristics are presented, together with the mechanical and structural characteristics of the produced sheets. The application of this experimental tool to other material systems will also be discussed.
Abstract:
Background: Abnormalities in emotional prosody processing have been consistently reported in schizophrenia and are related to poor social outcomes. However, the role of stimulus complexity in abnormal emotional prosody processing is still unclear. Method: We recorded event-related potentials in 16 patients with chronic schizophrenia and 16 healthy controls to investigate: 1) the temporal course of emotional prosody processing; and 2) the relative contribution of prosodic and semantic cues in emotional prosody processing. Stimuli were prosodic single words presented in two conditions: with intelligible (semantic content condition—SCC) and unintelligible semantic content (pure prosody condition—PPC). Results: Relative to healthy controls, schizophrenia patients showed reduced P50 for happy PPC words, and reduced N100 for both neutral and emotional SCC words and for neutral PPC stimuli. Also, increased P200 was observed in schizophrenia for happy prosody in SCC only. Behavioral results revealed higher error rates in schizophrenia for angry prosody in SCC and for happy prosody in PPC. Conclusions: Together, these data further demonstrate the interactions between abnormal sensory processes and higher-order processes in bringing about emotional prosody processing dysfunction in schizophrenia. They also suggest that impaired emotional prosody processing is dependent on stimulus complexity.
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management (Engenharia e Gestão de Sistemas de Informação)
Abstract:
The environmental and socio-economic importance of coastal areas is widely recognized, but at present these areas face severe weaknesses and high-risk situations. The increased demand and growing human occupation of coastal zones have greatly contributed to exacerbating such weaknesses. Today, throughout the world, in all countries with coastal regions, episodes of wave overtopping and coastal flooding are frequent. These episodes are usually responsible for property losses and often put human lives at risk. The floods are caused by coastal storms, primarily through the action of very strong winds. The propagation of these storms towards the coast induces high water levels, and climate change is expected to contribute to the intensification of coastal storms. In this context, estimating coastal flooding hazards is of paramount importance for the planning and management of coastal zones. Consequently, simulating a series of storm scenarios and analyzing their impacts through numerical modeling is of prime interest to coastal decision-makers. Firstly, throughout this work, historical storm tracks and intensities are characterized for the northeastern United States coast in terms of probability of occurrence. Secondly, several storm events with a high potential of occurrence are generated using a dedicated tool of the DelftDashboard interface for the Delft3D software. Hydrodynamic models are then used to generate ensemble simulations to assess the storms' effects on coastal water levels. For the northeastern United States coast, a highly refined regional domain is considered, surrounding the area of The Battery, New York, situated in New York Harbor. Based on statistical analysis of the numerical modeling results, an assessment of the impact of coastal storms on different locations within the study area is performed.
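As an illustration of the ensemble post-processing step only (the list layout, variable names and the 2.0 m flood threshold are assumptions, and this is not the Delft3D or DelftDashboard API), a short Python sketch:

```python
import numpy as np

def peak_water_levels(runs):
    """runs: list of 1-D arrays of modelled water level (m), one per storm run."""
    return np.array([np.max(r) for r in runs])

def ensemble_summary(peaks, flood_threshold=2.0):
    """Simple ensemble statistics of the per-run peak water levels."""
    return {
        "mean_peak_m": float(np.mean(peaks)),
        "p90_peak_m": float(np.percentile(peaks, 90)),
        "max_peak_m": float(np.max(peaks)),
        "fraction_above_threshold": float(np.mean(peaks > flood_threshold)),
    }

# Usage with synthetic series standing in for model output at one station:
rng = np.random.default_rng(0)
runs = [1.0 + 1.5 * rng.random() * np.sin(np.linspace(0.0, np.pi, 145))
        for _ in range(20)]
print(ensemble_summary(peak_water_levels(runs)))
```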
Abstract:
Recently, there has been growing interest in the field of metabolomics, materialized in a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as Nuclear Magnetic Resonance, Gas or Liquid Chromatography, Mass Spectrometry, Infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as with other omics data, the analysis of metabolomics datasets poses multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, none of the available software tools addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple-to-use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines.
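For readers unfamiliar with such pipelines, the stages listed above can be outlined in a generic Python sketch on synthetic data; this is not the specmine API (specmine is an R package), only an illustration of the same kind of workflow.

```python
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic data standing in for a metabolite intensity matrix:
# 40 samples x 200 metabolites, two classes, first 5 metabolites shifted in class 1.
X = rng.lognormal(mean=2.0, sigma=0.5, size=(40, 200))
y = np.repeat([0, 1], 20)
X[y == 1, :5] *= 1.8

# Pre-processing: log-transform and autoscale.
Xs = StandardScaler().fit_transform(np.log1p(X))

# Multivariate analysis: PCA scores give a first look at group separation.
scores = PCA(n_components=2).fit_transform(Xs)

# Univariate analysis / feature selection: rank metabolites by t-test p-value.
pvals = stats.ttest_ind(Xs[y == 0], Xs[y == 1]).pvalue
top_features = np.argsort(pvals)[:10]
print(top_features)        # should recover (mostly) the shifted metabolites
```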
Abstract:
Current data mining engines are difficult to use, requiring optimization by data mining experts in order to provide optimal results. To address this problem, a new concept was devised that retains the functionality of current data mining tools while adding pervasive characteristics, such as invisibility and ubiquity, that focus on the user, improving ease of use and usefulness through autonomous and intelligent data mining processes. This article introduces an architecture to implement such a data mining engine, composed of four major components: database, control middleware, processing middleware, and interface. These components are interlinked but scale independently, allowing for a system that adapts to the user's needs. A prototype has been developed in order to test the architecture. The results are very promising, demonstrating the architecture's functionality as well as the need for further improvements.
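A hypothetical Python sketch of how the four components could sit behind small, independently replaceable interfaces (names and methods are illustrative, not the prototype's actual code):

```python
from dataclasses import dataclass, field

@dataclass
class Database:
    tasks: list = field(default_factory=list)
    results: dict = field(default_factory=dict)

class ControlMiddleware:
    """Decides, without user tuning, which mining task to run and with
    which parameters (the 'invisible' part of the engine)."""
    def __init__(self, db: Database):
        self.db = db
    def schedule(self, dataset, goal):
        task = {"id": len(self.db.tasks), "dataset": dataset, "goal": goal,
                "params": {"algorithm": "auto"}}   # chosen autonomously
        self.db.tasks.append(task)
        return task["id"]

class ProcessingMiddleware:
    """Executes scheduled tasks; can be scaled out independently."""
    def __init__(self, db: Database):
        self.db = db
    def run_pending(self):
        for task in self.db.tasks:
            if task["id"] not in self.db.results:
                self.db.results[task["id"]] = f"model for {task['goal']}"

class Interface:
    """User-facing layer: submit a goal, fetch a result."""
    def __init__(self, control: ControlMiddleware, db: Database):
        self.control, self.db = control, db
    def ask(self, dataset, goal):
        return self.control.schedule(dataset, goal)
    def result(self, task_id):
        return self.db.results.get(task_id, "pending")

# Usage: the user only states a goal; scheduling and execution stay hidden.
db = Database()
ui = Interface(ControlMiddleware(db), db)
tid = ui.ask("sales.csv", "churn prediction")
ProcessingMiddleware(db).run_pending()
print(ui.result(tid))
```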
Abstract:
Real-time data acquisition is fundamental to providing appropriate services and improving health professionals' decision-making. In this paper, a pervasive, adaptive data acquisition architecture for medical devices (e.g. vital signs monitors, ventilators and sensors) is presented. The architecture was deployed in a real context, an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network. The paper presents the ventilation acquisition process as an example. The clients are installed on a machine near the patient's bed; they are connected to the ventilators, and the monitored data are sent to a multithreaded server which records the data in the database using Health Level Seven (HL7) protocols. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant mechanism that ensures data are stored in the database even if the agents are disconnected. The gateway is pervasive, universal and interoperable, and it is able to adapt to any service using streaming data.
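A simplified Python sketch of the ingestion path described above, assuming line-delimited JSON messages instead of full HL7 framing and a placeholder database layer (both assumptions); it shows the multithreaded server and the local fallback buffer that provides the fault tolerance:

```python
import json
import queue
import socketserver

local_buffer = queue.Queue()          # fallback store used while the DB is down

def database_insert(record):
    """Placeholder for the real persistence layer (an HL7-fed database)."""
    raise ConnectionError("database unreachable in this sketch")

def store(record):
    """Try the database first; on failure keep the record locally so nothing
    is lost when agents or the database disconnect (fault tolerance)."""
    try:
        database_insert(record)
    except Exception:
        local_buffer.put(record)      # to be flushed to the DB once it is back

class VitalSignsHandler(socketserver.StreamRequestHandler):
    # Each agent connection is served in its own thread by the server below.
    def handle(self):
        for line in self.rfile:                     # one JSON message per line
            store(json.loads(line))                 # e.g. {"bed": 3, "spo2": 97}

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("0.0.0.0", 9000), VitalSignsHandler) as srv:
        srv.serve_forever()
```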