Abstract:
Since the invention of photography, humans have used images to capture, store and analyse the events they are interested in. With developments in this field, aided by more powerful computers, image processing has become an accurate method of analysis and measurement. Its principal qualities are flexibility, adaptability and the ability to process large amounts of information quickly and easily. Successful applications can be found in many areas of human life, such as biomedicine, industry, surveillance, the military and mapping; indeed, several Nobel prizes are related to imaging. The accurate measurement of deformations, displacements, strain fields and surface defects is challenging in many Civil Engineering material tests, because these measurements traditionally require complex and expensive equipment plus time-consuming calibration. Image processing can be an inexpensive and effective tool for load-displacement measurement: with an adequate image acquisition system and the computational power of modern computers, very small displacements can be measured with high precision. Several commercial software packages already exist on the market, but they are sold at high cost. In this work, block-matching algorithms are used to compare image-processing results with the data obtained from physical transducers during laboratory load tests. To test the proposed solutions, several load tests were carried out in partnership with researchers from the Civil Engineering Department at Universidade Nova de Lisboa (UNL).
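The block-matching idea described above can be sketched in a few lines: a small block from the reference image is searched for inside a window of the deformed image, and the offset that minimises a similarity cost is taken as the displacement. A minimal Python illustration using the sum of absolute differences (SAD) as the cost; the grid sizes and the SAD criterion are assumptions for the sketch, not necessarily the exact configuration used in this work.

```python
# Minimal block-matching sketch: estimate the displacement of a small
# block between a reference and a deformed image using the sum of
# absolute differences (SAD). Images are plain 2-D lists of grey levels.

def sad(ref, img, top, left, r, c, size):
    """SAD between the size x size block at (top, left) in `ref`
    and the block at (r, c) in `img`."""
    total = 0
    for i in range(size):
        for j in range(size):
            total += abs(ref[top + i][left + j] - img[r + i][c + j])
    return total

def match_block(ref, img, top, left, size, search):
    """Return the (dy, dx) displacement within +/- `search` pixels
    that minimises the SAD for the block at (top, left)."""
    best = None
    best_disp = (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            r, c = top + dy, left + dx
            if r < 0 or c < 0 or r + size > len(img) or c + size > len(img[0]):
                continue  # candidate block falls outside the image
            score = sad(ref, img, top, left, r, c, size)
            if best is None or score < best:
                best, best_disp = score, (dy, dx)
    return best_disp
```

In practice sub-pixel accuracy is obtained by interpolating the cost surface around the integer minimum, which is what makes very small displacements measurable.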
Abstract:
Introduction: In Brazil, little data exist regarding the distribution of genotypes in relation to basal core promoter (BCP) and precore/core mutations among chronic hepatitis B virus (HBV) carriers from different regions of the country. The aim of this study was to identify HBV genotypes and the frequency of mutations at the BCP and precore/core region among the prevalent genotypes in chronic carriers from southern Brazil. Methods: Nested-polymerase chain reaction (nested-PCR) products amplified from the S-polymerase gene, BCP and precore/core region from 54 samples were sequenced and analyzed. Results: Phylogenetic analysis of the S-polymerase gene sequences showed that 66.7% (36/54) of the patients were infected with genotype D (D1, D2, D3), 25.9% (14/54) with genotype A (A1, A2), 5.6% (3/54) with subgenotype C2, and 2% (1/54) with genotype E. A comparison of virological characteristics showed significant differences between genotypes A, C and D. The comparison between HBeAg status and the G1896A stop codon mutation in patients with genotype D revealed a relationship between HBV G1896A precore mutants and genotype D and hepatitis B e antigen (HBeAg) seroconversion. Genotype D had a higher prevalence of the G1896A mutation and the presence of a thymine at position 1858. Genotype A was associated with a higher prevalence of the G1862T mutation and the presence of a cytosine at position 1858. Conclusions: HBV genotype D (D3) is predominant in HBV chronic carriers from southern Brazil. The presence of mutations in the BCP and precore/core region was correlated with the HBV genotype and HBeAg negative status.
Abstract:
Master's internship report in Pre-School Education and Teaching in the 1st Cycle of Basic Education
Abstract:
As increasingly sophisticated materials and products are developed and times-to-market must be minimized, it is important to make available fast-response characterization tools that use small amounts of sample and can convey data on the relationships between rheological response, process-induced material structure and product characteristics. For this purpose, a single-/twin-screw mini-extrusion system of modular construction, with well-controlled outputs in the range 30-300 g/h, was coupled to an in-house-developed rheo-optical slit die able to measure shear viscosity and normal-stress differences, as well as to perform rheo-optical experiments, namely small-angle light scattering (SALS) and polarized optical microscopy (POM). In addition, the mini-extruder is equipped with ports that allow sample collection, and the extrudate can be further processed into products to be tested later. Here, we present the concept and the experimental set-up [1, 2]. As a typical application, we report on the characterization of the processing of a polymer blend and of the properties of extruded sheets. The morphological evolution of a PS/PMMA industrial blend along the extruder, the flow-induced structures developed and the corresponding rheological characteristics are presented, together with the mechanical and structural characteristics of the produced sheets. The application of this experimental tool to other material systems is also discussed.
Abstract:
Fibre-reinforced thermoplastic pre-impregnated materials, produced continuously by diverse methods and processing conditions, were used to produce composites by pultrusion. The processing windows used to produce these materials and composite profiles were optimized using Taguchi design-of-experiments (DOE) methods. Composites were manufactured by pultrusion and compression moulding and subsequently submitted to mechanical testing and microscopy analysis. The results were compared with the theoretical values predicted by the Rule of Mixtures (ROM) and with those of similar conventional engineering materials. The results show that the produced composites have adequate properties for applications in common and structural engineering markets.
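The Rule of Mixtures mentioned above predicts a composite property as a volume-weighted average of the fibre and matrix properties. A minimal sketch for the longitudinal elastic modulus; the numerical values in the usage note are illustrative, not data from this study.

```python
# Rule of Mixtures (ROM) for the longitudinal modulus of a
# unidirectional composite: E_c = Vf * E_f + (1 - Vf) * E_m,
# where Vf is the fibre volume fraction.

def rule_of_mixtures(fibre_modulus, matrix_modulus, fibre_fraction):
    """Volume-weighted average of the fibre and matrix moduli."""
    return (fibre_fraction * fibre_modulus
            + (1 - fibre_fraction) * matrix_modulus)
```

For example, glass fibres (~70 GPa) in a thermoplastic matrix (~3 GPa) at 50% fibre volume fraction give a predicted longitudinal modulus of 36.5 GPa; measured values below this prediction typically indicate imperfect impregnation or fibre misalignment.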
Abstract:
Background: Abnormalities in emotional prosody processing have been consistently reported in schizophrenia and are related to poor social outcomes. However, the role of stimulus complexity in abnormal emotional prosody processing is still unclear. Method: We recorded event-related potentials in 16 patients with chronic schizophrenia and 16 healthy controls to investigate: 1) the temporal course of emotional prosody processing; and 2) the relative contribution of prosodic and semantic cues in emotional prosody processing. Stimuli were prosodic single words presented in two conditions: with intelligible (semantic content condition—SCC) and unintelligible semantic content (pure prosody condition—PPC). Results: Relative to healthy controls, schizophrenia patients showed reduced P50 for happy PPC words, and reduced N100 for both neutral and emotional SCC words and for neutral PPC stimuli. Also, increased P200 was observed in schizophrenia for happy prosody in SCC only. Behavioral results revealed higher error rates in schizophrenia for angry prosody in SCC and for happy prosody in PPC. Conclusions: Together, these data further demonstrate the interactions between abnormal sensory processes and higher-order processes in bringing about emotional prosody processing dysfunction in schizophrenia. They further suggest that impaired emotional prosody processing is dependent on stimulus complexity.
Abstract:
The spatial distribution of forest biomass in the Amazon is heterogeneous, with temporal and spatial variation, especially across the different vegetation types of this biome. Biomass estimates for this region vary significantly depending on the approach applied and the data set used for modeling. In this context, this study evaluated three geostatistical techniques for estimating the spatial distribution of aboveground biomass (AGB): 1) ordinary least-squares regression (OLS), 2) geographically weighted regression (GWR) and 3) geographically weighted regression-kriging (GWR-K). The techniques were applied to the same field dataset, using the same environmental variables derived from cartographic information and high-resolution remote-sensing data (RapidEye). The study was carried out in the Amazon rainforest of Sucumbíos, Ecuador. The results showed that GWR-K, a hybrid technique, provided statistically satisfactory estimates with the lowest prediction error of the three. Furthermore, 75% of the AGB was explained by the combination of remote-sensing data and environmental variables, with forest type being the most important variable for estimating AGB. It should be noted that, while high-resolution images significantly improve the estimation of the spatial distribution of AGB, processing this information is computationally demanding.
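Of the techniques compared above, GWR differs from OLS in that each location gets its own weighted least-squares fit, with observation weights decaying with distance from that location. A minimal single-predictor sketch in Python; the Gaussian kernel, the fixed bandwidth and the variable names are illustrative assumptions, not the study's actual configuration.

```python
import math

# Weighted least squares for one predictor, and a GWR-style local
# prediction built on top of it: observations are weighted by a
# Gaussian kernel of their distance to the prediction location.

def weighted_ols(x, y, w):
    """Weighted least-squares slope and intercept for one predictor."""
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, x)) / sw   # weighted mean of x
    yb = sum(wi * yi for wi, yi in zip(w, y)) / sw   # weighted mean of y
    num = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
    den = sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x))
    slope = num / den
    return slope, yb - slope * xb

def gwr_predict(coords, x, y, at, x_at, bandwidth):
    """Fit a local weighted OLS around location `at`, then predict
    the response for predictor value `x_at` at that location."""
    w = [math.exp(-0.5 * (math.dist(at, c) / bandwidth) ** 2)
         for c in coords]
    slope, intercept = weighted_ols(x, y, w)
    return intercept + slope * x_at
```

With uniform weights the local fit reduces to a single global OLS model, which is why OLS can be seen as the limiting case of GWR as the bandwidth grows; GWR-K additionally kriges the GWR residuals.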
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Doctoral thesis (Doctoral Programme in Biomedical Engineering)
Abstract:
Football is nowadays one of the most popular sports. In the betting world it holds an outstanding position, moving millions of euros during a single football match. The lack of profitability of football betting users has been identified as a problem, and it gave origin to this research, which analyses whether there is a way to help users increase the profits from their bets. Data mining models were induced with the purpose of supporting gamblers in increasing their profits in the medium/long term. Being conscious that the models can fail, the results achieved by four of the seven targets are encouraging and suggest that the system can help increase profits. All defined targets have two possible classes to predict, for example, whether there are more or fewer than 7.5 corners in a single game. The data mining models for the targets more or fewer than 7.5 corners, 8.5 corners, 1.5 goals and 3.5 goals achieved the pre-defined thresholds. The models were implemented in a prototype, a pervasive decision support system. This system was developed to serve as an interface for any user, from experts to users with no knowledge of football games.
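The two-class over/under targets described above, and the profit measure the models are judged on, can be made concrete in a few lines. A minimal sketch; the decimal odds and stakes in the test values are hypothetical, as the abstract does not state them.

```python
# Over/under targets: each target asks whether a match statistic
# falls above or below a half-value threshold (7.5 or 8.5 corners,
# 1.5 or 3.5 goals), so the classes are never ambiguous.

def over_under_label(value, threshold):
    """Binary class for an over/under target."""
    return "over" if value > threshold else "under"

def bet_profit(stake, odds, predicted, actual):
    """Net profit of a single bet at decimal odds: a correct
    prediction returns stake * (odds - 1), a wrong one loses the stake."""
    return stake * (odds - 1) if predicted == actual else -stake

def bankroll_change(bets):
    """Cumulative profit over (stake, odds, predicted, actual) tuples."""
    return sum(bet_profit(*b) for b in bets)
```

Medium/long-term profitability then amounts to `bankroll_change` staying positive over many matches, which is why per-target accuracy thresholds were pre-defined before deployment.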
Abstract:
Current data mining engines are difficult to use, requiring optimizations by data mining experts to provide optimal results. To solve this problem, a new concept was devised: maintaining the functionality of current data mining tools while adding pervasive characteristics, such as invisibility and ubiquity, that focus on the users, providing better ease of use and usefulness through autonomous and intelligent data mining processes. This article introduces an architecture for a data mining engine composed of four major components: database, control middleware, processing middleware and interface. These components are interlinked but scale independently, allowing the system to adapt to the user's needs. A prototype was developed to test the architecture. The results are very promising, demonstrating the architecture's functionality as well as the need for further improvements.
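The four-component split described above can be illustrated by decoupling the stages with queues, so that each stage can be replicated independently. A toy sketch under stated assumptions: the component boundaries, the dict standing in for the database, and storing results under the task's name are all illustrative choices, not the article's actual design.

```python
from queue import Queue

# Toy pipeline mirroring the four components of the abstract:
# interface -> control middleware -> processing middleware -> database.
# Queues decouple the stages, which is what allows independent scaling.

class Engine:
    def __init__(self):
        self.requests = Queue()   # interface -> control
        self.tasks = Queue()      # control -> processing
        self.database = {}        # stand-in for the database component

    def interface_submit(self, dataset, task):
        """Interface: accept a dataset and a mining task from the user."""
        self.requests.put((dataset, task))

    def control_step(self):
        """Control middleware: take one request and route it onward
        (a real engine would also validate and prioritise here)."""
        self.tasks.put(self.requests.get())

    def processing_step(self):
        """Processing middleware: run one task and persist its result."""
        dataset, task = self.tasks.get()
        self.database[task.__name__] = task(dataset)
```

Because each stage only touches its own queue, more `processing_step` workers can be added without changing the interface or control code.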
Abstract:
Real-time data acquisition is fundamental to providing appropriate services and improving health professionals' decision-making. In this paper, a pervasive, adaptive architecture for acquiring data from medical devices (e.g. vital-signs monitors, ventilators and sensors) is presented. The architecture was deployed in a real context, an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network; the paper shows the ventilation acquisition process as an example. The clients are installed on a machine near the patient's bed and connected to the ventilators, and the monitored data are sent to a multithreaded server that records them in the database using Health Level Seven (HL7) protocols. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant system that ensures data are stored in the database even if the agents are disconnected. The gateway is pervasive, universal and interoperable, and it is able to adapt to any service using streaming data.
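The fault-tolerance property described above, i.e. no reading is lost while the database is unreachable, comes down to buffering locally and replaying the buffer on reconnection. A minimal sketch with assumed names; the in-memory `FlakyDatabase` stands in for the real repository and is not part of the INTCare system.

```python
# Sketch of the fault-tolerant store: the gateway keeps readings in a
# local buffer whenever the database is unreachable and replays them,
# in order, once a write succeeds again.

class Gateway:
    def __init__(self, database):
        self.database = database   # object exposing .insert(record)
        self.buffer = []           # local fallback store

    def store(self, record):
        """Try the database first; fall back to the local buffer."""
        try:
            self.flush()                    # replay earlier readings
            self.database.insert(record)
        except ConnectionError:
            self.buffer.append(record)      # keep for the next attempt

    def flush(self):
        """Replay buffered readings; each is removed only after
        its insert succeeds, so a mid-flush failure loses nothing."""
        while self.buffer:
            self.database.insert(self.buffer[0])
            self.buffer.pop(0)

class FlakyDatabase:
    """In-memory stand-in that can simulate a dropped connection."""
    def __init__(self):
        self.rows, self.online = [], True

    def insert(self, record):
        if not self.online:
            raise ConnectionError("database unreachable")
        self.rows.append(record)
```

Ordering matters clinically: replaying the buffer before the new reading keeps the time series in the repository monotone.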
Abstract:
Healthcare organizations often benefit from information technologies and embedded decision support systems, which improve the quality of services and help prevent complications and adverse events. In Centro Materno Infantil do Norte (CMIN), the maternal and perinatal care unit of Centro Hospitalar of Oporto (CHP), an intelligent pre-triage system is implemented to prioritize patients in need of gynaecology and obstetrics care into two classes: urgent and consultation. The system is designed to avoid emergency-department problems such as incorrect triage outcomes and long triage waiting times. The current study aims to improve the triage system, and therefore optimize patient workflow through the emergency room, by predicting the triage waiting time, i.e. the interval between a patient's triage and their medical admission. For this purpose, data mining (DM) techniques were applied to selected information provided by the information technologies implemented in CMIN. The DM models achieved accuracy values of approximately 94% with a five-range target distribution, which not only yields confident prediction models but also identifies the variables that act as direct inducers of triage waiting times.
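The five-range target mentioned above means the continuous waiting time is discretised into five ordinal classes before modelling. A minimal sketch of that step; the cut-points below are illustrative assumptions, as the study does not state CMIN's actual ranges here.

```python
# Discretising a continuous waiting time (minutes) into the kind of
# five-class ordinal target the abstract describes. Cut-points are
# hypothetical; only the five-way split itself comes from the abstract.

RANGES = [(0, 15), (15, 30), (30, 60), (60, 120), (120, float("inf"))]

def waiting_time_class(minutes):
    """Map a non-negative waiting time to one of five ordinal classes."""
    for label, (low, high) in enumerate(RANGES):
        if low <= minutes < high:
            return label
    raise ValueError("waiting time must be non-negative")
```

Discretising the target turns the prediction into a classification task, which is what makes a single accuracy figure (here ~94%) a meaningful summary of model quality.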
Abstract:
In this paper we investigate various algorithms for performing the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT), and appropriate techniques for maximizing FFT/IFFT execution speed, such as pipelining, parallel processing, and the use of memory structures with pre-computed values (look-up tables, LUTs) or other dedicated hardware components (usually multipliers). Furthermore, we discuss the optimal hardware architectures for various FFT/IFFT algorithms, along with their ability to exploit parallel processing with minimal data dependencies in the FFT/IFFT calculations. An interesting approach also considered in this paper is the application of the integrated processing-in-memory Intelligent RAM (IRAM) chip to high-speed FFT/IFFT computing. The results of the assessment study emphasize that the execution speed of the FFT/IFFT algorithms is tightly connected to the ability of the FFT/IFFT hardware to support the parallelism provided by the given algorithm. Therefore, we suggest that the basic Discrete Fourier Transform (DFT)/Inverse Discrete Fourier Transform (IDFT) can also achieve high performance when implemented on a specialized hardware architecture that can exploit the parallelism of the DFT/IDFT operations. The proposed improvements include simplified multiplications over symbols given in a polar coordinate system, using sine and cosine look-up tables, and an approach for performing parallel addition of N input symbols.
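The look-up-table optimisation named above amounts to computing all twiddle factors once, before the transform, so no sine or cosine is evaluated inside the butterfly loop. A software sketch of the idea; the recursive radix-2 decimation-in-time form is a standard illustration, not the paper's hardware design.

```python
import math

# Radix-2 decimation-in-time FFT whose twiddle factors come from a
# precomputed sine/cosine look-up table. A sub-transform of size m
# reads every (n/m)-th LUT entry, so one table serves all stages.

def twiddle_lut(n):
    """Precompute e^{-2*pi*i*k/n} for k = 0 .. n/2 - 1."""
    return [complex(math.cos(-2 * math.pi * k / n),
                    math.sin(-2 * math.pi * k / n))
            for k in range(n // 2)]

def _fft(x, lut, n):
    m = len(x)
    if m == 1:
        return list(x)
    even = _fft(x[0::2], lut, n)
    odd = _fft(x[1::2], lut, n)
    stride = n // m                      # LUT step for this stage
    out = [0j] * m
    for k in range(m // 2):              # butterfly: no trig calls here
        t = lut[k * stride] * odd[k]
        out[k] = even[k] + t
        out[k + m // 2] = even[k] - t
    return out

def fft(x):
    """FFT of a sequence whose length is a power of two."""
    lut = twiddle_lut(len(x))
    return _fft(x, lut, len(x))
```

In a hardware realisation the same table sits in dedicated memory (or IRAM), and the butterfly loop becomes the parallel datapath; the table costs n/2 complex words in exchange for removing all runtime trigonometric evaluations.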