38 results for Graphical processing units
at Instituto Politécnico do Porto, Portugal
Abstract:
Swarm Intelligence generally refers to a problem-solving ability that emerges from the interaction of simple information-processing units. The concept of Swarm suggests multiplicity, distribution, stochasticity, randomness, and messiness. The concept of Intelligence suggests that the problem-solving approach is successful, drawing on learning, creativity and cognition capabilities. This paper introduces some of the theoretical foundations, the biological motivation and the fundamental aspects of swarm-intelligence-based optimization techniques such as Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO) and Artificial Bee Colony (ABC) algorithms for scheduling optimization.
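For illustration, a minimal sketch of the PSO velocity and position update on a generic continuous objective; the objective function, bounds and coefficient values below are assumptions for the example, not parameters taken from the paper:

```java
import java.util.Random;

/** Minimal Particle Swarm Optimization sketch for minimizing a continuous function. */
public class PsoSketch {
    static double objective(double[] x) {            // sphere function as a stand-in objective
        double s = 0;
        for (double v : x) s += v * v;
        return s;
    }

    public static void main(String[] args) {
        int particles = 20, dims = 2, iters = 100;
        double w = 0.7, c1 = 1.5, c2 = 1.5;           // inertia and acceleration coefficients
        Random rnd = new Random(42);

        double[][] pos = new double[particles][dims], vel = new double[particles][dims];
        double[][] pBest = new double[particles][dims];
        double[] pBestVal = new double[particles];
        double[] gBest = new double[dims];
        double gBestVal = Double.MAX_VALUE;

        for (int i = 0; i < particles; i++) {         // random initialization in [-5, 5]
            for (int d = 0; d < dims; d++) pos[i][d] = rnd.nextDouble() * 10 - 5;
            pBest[i] = pos[i].clone();
            pBestVal[i] = objective(pos[i]);
            if (pBestVal[i] < gBestVal) { gBestVal = pBestVal[i]; gBest = pos[i].clone(); }
        }

        for (int t = 0; t < iters; t++) {
            for (int i = 0; i < particles; i++) {
                for (int d = 0; d < dims; d++) {      // velocity update toward personal and global bests
                    vel[i][d] = w * vel[i][d]
                            + c1 * rnd.nextDouble() * (pBest[i][d] - pos[i][d])
                            + c2 * rnd.nextDouble() * (gBest[d] - pos[i][d]);
                    pos[i][d] += vel[i][d];
                }
                double f = objective(pos[i]);
                if (f < pBestVal[i]) { pBestVal[i] = f; pBest[i] = pos[i].clone(); }
                if (f < gBestVal)    { gBestVal = f;    gBest = pos[i].clone(); }
            }
        }
        System.out.printf("best value after %d iterations: %.6f%n", iters, gBestVal);
    }
}
```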
Abstract:
The best places to locate Gas Supply Units (GSUs) in a natural gas system and their optimal allocation to loads are the key factors in organizing an efficient upstream gas infrastructure. The number of GSUs and their optimal location in a gas network is a decision problem that can be formulated as a linear programming problem. Our emphasis is on the formulation and use of a suitable location model, reflecting real-world operations and constraints of a natural gas system. This paper presents a heuristic model, based on a Lagrangian approach, developed for finding the optimal GSU locations on a natural gas network, minimizing expenses and maximizing throughput and security of supply. The location model is applied to the Iberian high-pressure natural gas network, a system modelled with 65 demand nodes. These nodes are linked by physical and virtual pipelines – road trucks carrying gas in liquefied form. The location model identifies the best places to locate GSUs, together with the optimal demand allocation and the most economical gas transport mode: by pipeline or by road truck.
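As a sketch of the kind of model described — the notation, constraints and the relaxation shown are generic assumptions for illustration, not the paper's exact formulation — let binary variables y_j select GSU sites and x_ij be the fraction of the demand of node i served from GSU j:

```latex
\begin{align*}
\min \quad & \sum_{j} f_j\, y_j \;+\; \sum_{i}\sum_{j} c_{ij}\, d_i\, x_{ij} \\
\text{s.t.} \quad & \sum_{j} x_{ij} = 1 && \forall i \quad \text{(each demand node fully served)}\\
& \sum_{i} d_i\, x_{ij} \le Q_j\, y_j && \forall j \quad \text{(GSU supply capacity)}\\
& x_{ij} \ge 0,\; y_j \in \{0,1\},
\end{align*}
%
% A Lagrangian heuristic dualizes the assignment constraints with multipliers \lambda_i,
% leaving a subproblem that decomposes by candidate site j (subject to the remaining constraints):
\begin{equation*}
L(\lambda) \;=\; \min_{x,\,y}\; \sum_j f_j\, y_j \;+\; \sum_i \sum_j \bigl(c_{ij}\, d_i - \lambda_i\bigr)\, x_{ij} \;+\; \sum_i \lambda_i .
\end{equation*}
```

Here f_j would be the investment cost of opening a GSU at candidate site j, d_i the demand at node i, c_ij the unit transport cost (by pipeline or by road truck) and Q_j the supply capacity; solving the relaxed subproblem for updated multipliers yields lower bounds and candidate locations for the heuristic.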
Abstract:
The natural gas industry has been confronted with big challenges: strong growth in demand, investment in new GSUs – gas supply units – and efficient technical system management. The right number of GSUs, their best location on the network and the optimal allocation to loads is a decision problem that can be formulated as a combinatorial programming problem, with the objective of minimizing system expenses. Our emphasis is on the formulation, interpretation and development of a solution algorithm that analyses the trade-off between infrastructure investment expenditure and operating system costs. The location model was applied to a 12-node natural gas network, and its effectiveness was tested in five different operating scenarios.
Abstract:
We describe a novel approach to explore DNA nucleotide sequence data, aiming to produce high-level categorical and structural information about the underlying chromosomes, genomes and species. The article starts by analyzing chromosomal data through histograms using fixed length DNA sequences. After creating the DNA-related histograms, a correlation between pairs of histograms is computed, producing a global correlation matrix. These data are then used as input to several data processing methods for information extraction and tabular/graphical output generation. A set of 18 species is processed and the extensive results reveal that the proposed method is able to generate significant and diversified outputs, in good accordance with current scientific knowledge in domains such as genomics and phylogenetics.
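A minimal sketch of the kind of computation described — counting fixed-length subsequences (assumed here to be overlapping k-mers) and correlating two resulting histograms; the exact windowing and the correlation measure used in the article may differ:

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch: fixed-length (k-mer) histograms of DNA sequences and their Pearson correlation. */
public class DnaHistogramSketch {

    /** Count occurrences of every overlapping subsequence of length k. */
    static Map<String, Integer> histogram(String dna, int k) {
        Map<String, Integer> h = new HashMap<>();
        for (int i = 0; i + k <= dna.length(); i++) {
            h.merge(dna.substring(i, i + k), 1, Integer::sum);
        }
        return h;
    }

    /** Pearson correlation between two histograms over the union of their keys. */
    static double correlation(Map<String, Integer> a, Map<String, Integer> b) {
        java.util.Set<String> keys = new java.util.HashSet<>(a.keySet());
        keys.addAll(b.keySet());
        int n = keys.size();
        double sumA = 0, sumB = 0, sumAA = 0, sumBB = 0, sumAB = 0;
        for (String key : keys) {
            double va = a.getOrDefault(key, 0), vb = b.getOrDefault(key, 0);
            sumA += va; sumB += vb;
            sumAA += va * va; sumBB += vb * vb; sumAB += va * vb;
        }
        double cov  = sumAB - sumA * sumB / n;
        double varA = sumAA - sumA * sumA / n;
        double varB = sumBB - sumB * sumB / n;
        return cov / Math.sqrt(varA * varB);
    }

    public static void main(String[] args) {
        Map<String, Integer> h1 = histogram("ATGCGATACGCTTGAATGC", 3);
        Map<String, Integer> h2 = histogram("ATGCGATTCGCTAGAATGC", 3);
        System.out.printf("correlation = %.4f%n", correlation(h1, h2));
    }
}
```

Applying this pairwise over all chromosomes or species yields the global correlation matrix that feeds the downstream processing methods.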
Abstract:
Over time, the XML markup language has acquired considerable importance in application development, standards definition and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a large range of applications, which imposes choosing the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language such as Java for XML processing, it becomes necessary to use effective mechanisms, e.g. APIs, which allow reading and processing of large documents in an appropriate manner. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
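For context, the Java APIs typically compared in such studies are DOM (in-memory tree), SAX (push streaming) and StAX (pull streaming). A streaming parser keeps memory use roughly constant regardless of document size, which is why it tends to win for very large files. The element-counting task and file path argument below are illustrative assumptions, not the paper's benchmark code:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;
import java.io.FileInputStream;
import java.io.IOException;

/** Sketch: stream over a large XML file with StAX, counting elements without building a tree. */
public class StaxCountSketch {
    public static void main(String[] args) throws IOException, XMLStreamException {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        long elements = 0;
        try (FileInputStream in = new FileInputStream(args[0])) {   // pass the XML file path as the first argument
            XMLStreamReader reader = factory.createXMLStreamReader(in);
            while (reader.hasNext()) {
                if (reader.next() == XMLStreamConstants.START_ELEMENT) {
                    elements++;                                      // only the current event is held in memory
                }
            }
            reader.close();
        }
        System.out.println("element count: " + elements);
    }
}
```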
Abstract:
Master's in Electrical and Computer Engineering (Mestrado em Engenharia Electrotécnica e de Computadores)
Abstract:
An electrocardiogram (ECG) monitoring system deals with several challenges related to noise sources. The main goal of this work was the study of adaptive signal processing algorithms for ECG noise reduction applied to real signals. This document presents an adaptive filtering technique based on the Least Mean Square (LMS) algorithm to remove the artefacts introduced into the ECG signal by electromyography (EMG) and power-line noise. Real noise signals were used in the experiments, mainly to observe the difference between real and simulated noise sources. Very good results were obtained, owing to the noise-removal capability of this technique.
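A minimal sketch of the LMS update used in this kind of adaptive noise cancellation; the filter length, step size and synthetic test signals below are assumptions for the example, whereas the thesis works with recorded ECG and real noise:

```java
/** Sketch: LMS adaptive noise canceller — primary input = signal + noise, reference input = correlated noise. */
public class LmsSketch {

    /** One pass of the LMS algorithm; returns the error signal, i.e. the cleaned estimate. */
    static double[] lms(double[] primary, double[] reference, int taps, double mu) {
        double[] w = new double[taps];                // adaptive filter weights
        double[] out = new double[primary.length];
        for (int n = taps; n < primary.length; n++) {
            double y = 0;                             // filter output: estimate of the noise in the primary input
            for (int k = 0; k < taps; k++) y += w[k] * reference[n - k];
            double e = primary[n] - y;                // error = primary minus estimated noise ≈ clean signal
            for (int k = 0; k < taps; k++) w[k] += 2 * mu * e * reference[n - k];  // LMS weight update
            out[n] = e;
        }
        return out;
    }

    public static void main(String[] args) {
        int n = 2000;
        double[] clean = new double[n], noise = new double[n], primary = new double[n];
        for (int i = 0; i < n; i++) {
            clean[i] = Math.sin(2 * Math.PI * i / 200.0);              // stand-in for the ECG component
            noise[i] = 0.5 * Math.sin(2 * Math.PI * 50 * i / 1000.0);  // stand-in for 50 Hz power-line noise
            primary[i] = clean[i] + noise[i];
        }
        double[] cleaned = lms(primary, noise, 8, 0.01);
        System.out.printf("last sample: primary=%.3f cleaned=%.3f clean=%.3f%n",
                primary[n - 1], cleaned[n - 1], clean[n - 1]);
    }
}
```

In the real setting the reference channel would carry a measured noise source (e.g. a power-line pickup or an EMG electrode) rather than a synthetic sinusoid.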
Abstract:
Solvent extraction is considered as a multi-criteria optimization problem, since several chemical species with similar extraction kinetic properties are frequently present in the aqueous phase and the selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions, as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet and a comparison of Pareto curves for different flow-sheets is made. The positive weight sum approach linked to the sequential quadratic programming method is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while the purity has an opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment in purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and reduces the most efficient organic flow-rate.
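As a sketch of the scalarization described (generic notation; the actual design variables and objective definitions are those of the mixer–settler model): with recovery R(x) and purity P(x) as the two criteria, each point of the Pareto set is obtained by solving, for a sweep of weights,

```latex
\begin{align*}
\max_{\mathbf{x}} \quad & w\, R(\mathbf{x}) + (1 - w)\, P(\mathbf{x}), \qquad 0 \le w \le 1,\\
\text{s.t.} \quad & \mathbf{g}(\mathbf{x}) \le \mathbf{0}, \quad \mathbf{h}(\mathbf{x}) = \mathbf{0},
\end{align*}
```

where x collects the operating conditions (hold-up, residence time, agitation speed, flow rates) and g, h the process constraints; each single-objective problem is solved with sequential quadratic programming, and varying w from 0 to 1 traces the Pareto curve compared across flow-sheets.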
Abstract:
Background: Temporal lobe epilepsy (TLE) is a neurological disorder that directly affects cortical areas responsible for auditory processing. The resulting abnormalities can be assessed using event-related potentials (ERP), which have high temporal resolution. However, little is known about TLE in terms of dysfunction of early sensory memory encoding or possible correlations between EEGs, linguistic deficits, and seizures. Mismatch negativity (MMN) is an ERP component – elicited by introducing a deviant stimulus while the subject is attending to a repetitive behavioural task – which reflects pre-attentive sensory memory function and indexes neuronal auditory discrimination and perceptual accuracy. Hypothesis: We propose an MMN protocol for future clinical application and research based on the hypothesis that children with TLE may have abnormal MMN for speech and non-speech stimuli. The MMN can be elicited with a passive auditory oddball paradigm, and the abnormalities might be associated with the location and frequency of epileptic seizures. Significance: The suggested protocol might contribute to a better understanding of the neuropsychophysiological basis of MMN. We suggest that in TLE the central representation of sound may be decreased for speech and non-speech stimuli. Discussion: MMN arises as a difference response to speech and non-speech stimuli across electrode sites. TLE in childhood might be a good model for studying topographic and functional auditory processing and its neurodevelopment, pointing to MMN as a possible clinical tool for prognosis, evaluation, follow-up, and rehabilitation in TLE.
Abstract:
TLE in infancy has been the subject of varied research. Topographical and structural evidence coincides with the neuronal systems responsible for auditory processing of the highest specialization and complexity. Recent studies have shown the need for hemispheric asymmetry for the optimization of central auditory processing (CAP) and for the acquisition and learning of a language system. A new functional research paradigm is required to study mental processes, using methods that analyse cognitive-sensory information processed over very short periods of time (milliseconds), such as ERPs. Thus, in this article, we hypothesize that TLE in infancy could be a good model for the topographic and functional study of CAP and its development, contributing to a better understanding of the learning difficulties that children with this neurological disorder have.
Abstract:
Alheiras are traditional smoked, fermented meat sausages produced in Portugal, with an undeniable cultural and gastronomic legacy. In this study, we assessed the nutritional value of this product, as well as the influence of different types of thermal processing. Alheiras from Mirandela were submitted to six different procedures: microwave, skillet, oven, charcoal grill, electric fryer and electric grill. Protein, fat, carbohydrate, mineral, NaCl and cholesterol contents, as well as the fatty acid profile, were evaluated. The results show that alheiras are not hypercaloric but are an unbalanced foodstuff (high levels of proteins and lipids), and that the type of processing has a major impact on their nutritional value. The charcoal grill is the healthiest option: less fat (12.5 g/100 g) and cholesterol (29.3 mg/100 g), corresponding to a lower caloric intake (231.8 kcal, 13% less than the raw product). Conversely, fried alheiras presented the worst nutritional profile, with the highest levels of fat (18.1 g/100 g) and cholesterol (76.0 mg/100 g).
Abstract:
The main purpose of this work is to present and to interpret the change of structure and physical properties of tantalum oxynitride (TaNxOy) thin films, produced by dc reactive magnetron sputtering, by varying the processing parameters. A set of TaNxOy films was prepared by varying the reactive gases flow rate, using a N2/O2 gas mixture with a concentration ratio of 17:3. The different films, obtained by this process, exhibited significant differences. The obtained composition and the interpretation of the X-ray diffraction results show that, depending on the partial pressure of the reactive gases, the films are: essentially dark grey metallic, when the atomic ratio (N + O)/Ta < 0.1, evidencing a tetragonal β-Ta structure; grey-brownish, when 0.1 < (N + O)/Ta < 1, exhibiting a face-centred cubic (fcc) TaN-like structure; and transparent oxide-type, when (N + O)/Ta > 1, evidencing the existence of Ta2O5, but with an amorphous structure. These transparent films exhibit refractive indices, in the visible region, always higher than 2.0. The wear resistance of the films is relatively good. The best behaviour was obtained for the films with (N + O)/Ta ≈ 0.5 and (N + O)/Ta ≈ 1.3.
Abstract:
Prostate cancer (PCa) is one of the most incident malignancies worldwide. Although efficient therapy is available for early-stage PCa, treatment of advanced disease is mainly ineffective and remains a clinical challenge. microRNA (miRNA) dysregulation is associated with PCa development and progression. In fact, several studies have reported a widespread downregulation of miRNAs in PCa, which highlights the importance of studying compounds capable of restoring the global miRNA expression. The main aim of this study was to define the usefulness of enoxacin as an anti-tumoral agent in PCa, due to its ability to induce miRNA biogenesis in a TRBP-mediated manner. Using a panel of five PCa cell lines, we observed that all of them were wild type for the TARBP2 gene and expressed TRBP protein. Furthermore, primary prostate carcinomas displayed normal levels of TRBP protein. Remarkably, enoxacin was able to decrease cell viability, induce apoptosis, cause cell cycle arrest, and inhibit the invasiveness of cell lines. Enoxacin was also effective in restoring the global expression of miRNAs. This study is the first to show that PCa cells are highly responsive to the anti-tumoral effects of enoxacin. Therefore, enoxacin constitutes a promising therapeutic agent for PCa.
Abstract:
Networked control systems (NCSs) are spatially distributed systems in which the communication between sensors, actuators and controllers occurs through a shared band-limited digital communication network. However, the use of a shared communication network, in contrast to using several dedicated independent connections, introduces new challenges, which are even more acute in large-scale and dense networked control systems. In this paper we investigate a recently introduced technique for gathering information from a dense sensor network to be used in networked control applications. Efficiently obtaining an approximate interpolation of the sensed data offers a good trade-off between accuracy in the measurement of the input signals and the delay to actuation, both important aspects for the quality of control. We introduce a variation of the state-of-the-art algorithms which we prove performs better because it takes into account the changes of the input signal over time within the process of obtaining the approximate interpolation.
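As a generic illustration only — this is not the algorithm proposed in the paper, and all names and parameters below are assumptions — one simple way to obtain an approximate interpolation of spatially distributed sensor readings is inverse-distance weighting, here with an exponential discount on stale samples so that changes of the input signal over time are accounted for:

```java
/** Sketch: inverse-distance-weighted interpolation of sensor samples, discounting stale readings. */
public class SensorInterpolationSketch {

    /** A single sensor reading: position, value and the time at which it was taken. */
    record Sample(double x, double y, double value, double time) {}

    /**
     * Estimate the field value at (qx, qy) at time `now`, weighting each sample by the inverse of its
     * distance to the query point and by an exponential decay in its age (time constant tau).
     */
    static double interpolate(Sample[] samples, double qx, double qy, double now, double tau) {
        double num = 0, den = 0;
        for (Sample s : samples) {
            double dist = Math.hypot(s.x() - qx, s.y() - qy) + 1e-9;   // avoid division by zero
            double w = (1.0 / dist) * Math.exp(-(now - s.time()) / tau);
            num += w * s.value();
            den += w;
        }
        return num / den;
    }

    public static void main(String[] args) {
        Sample[] samples = {
            new Sample(0.0, 0.0, 20.1, 9.5),
            new Sample(1.0, 0.0, 21.4, 9.9),
            new Sample(0.0, 1.0, 19.8, 8.0)    // older reading: contributes less to the estimate
        };
        System.out.printf("estimate at (0.5, 0.5): %.2f%n", interpolate(samples, 0.5, 0.5, 10.0, 1.0));
    }
}
```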