7 results for "Parallel or distributed processing"

at the Universidade Federal do Rio Grande do Norte (UFRN)


Relevance: 100.00%

Abstract:

Metaheuristic techniques are known for solving optimization problems classified as NP-complete, and they are successful in obtaining good-quality solutions. They use non-deterministic approaches to generate near-optimal solutions, without any guarantee of finding the global optimum. Motivated by the difficulty of these problems, this work proposes the development of parallel hybrid methods combining reinforcement learning with the metaheuristics GRASP and Genetic Algorithms, aiming to improve the efficiency with which good solutions are obtained. Instead of using the Q-learning reinforcement-learning algorithm merely as a technique for generating the initial solutions of the metaheuristics, we use it in cooperative and competitive arrangements with the Genetic Algorithm and GRASP, in a parallel implementation. The implementations developed in this study showed satisfactory results under both strategies, that is, both cooperation and competition between the individual methods and cooperation and competition between groups of methods. For some instances the global optimum was found; for others the implementations came close to it. A performance analysis of the proposed approach was carried out and showed good results on the criteria that demonstrate the efficiency and speedup (the gain in speed from parallel processing) of the implementations.
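
To make the cooperative idea concrete, here is a minimal, hypothetical C sketch (not the authors' implementation): a Q-learning policy constructs initial solutions for a toy knapsack problem, a GRASP-style local search improves them, and the improvement found is fed back to the policy as the reward. The problem data, parameters, and the simplified one-step Q-update are all illustrative assumptions.

```c
/* Minimal sketch (not the dissertation's code): Q-learning seeding a
 * GRASP-style local search on a toy 0/1 knapsack instance. */
#include <stdio.h>
#include <stdlib.h>

#define N 8            /* hypothetical binary decision variables */
#define EPISODES 500
#define ALPHA 0.1      /* learning rate */
#define GAMMA 0.9      /* discount factor */
#define EPS   0.2      /* exploration probability */

static double Q[N][2];                     /* Q[i][a]: value of setting bit i to a */
static double weight[N] = {2,3,4,5,6,7,8,9};
static double value[N]  = {3,4,5,8,9,10,12,13};
static const double CAP = 20.0;            /* toy capacity */

/* objective: total value if feasible, heavily penalised otherwise */
static double evaluate(const int *x) {
    double w = 0, v = 0;
    for (int i = 0; i < N; i++) { w += x[i]*weight[i]; v += x[i]*value[i]; }
    return (w <= CAP) ? v : v - 100.0*(w - CAP);
}

/* epsilon-greedy construction: the Q-learning agent builds the initial solution */
static void construct(int *x) {
    for (int i = 0; i < N; i++) {
        if ((double)rand()/RAND_MAX < EPS) x[i] = rand() % 2;
        else x[i] = (Q[i][1] > Q[i][0]) ? 1 : 0;
    }
}

/* GRASP-style local search: first-improvement bit flips */
static double local_search(int *x) {
    double best = evaluate(x);
    int improved = 1;
    while (improved) {
        improved = 0;
        for (int i = 0; i < N; i++) {
            x[i] ^= 1;
            double f = evaluate(x);
            if (f > best) { best = f; improved = 1; }
            else x[i] ^= 1;                /* undo the flip */
        }
    }
    return best;
}

int main(void) {
    srand(42);
    int x[N];
    double best = -1e9;
    for (int ep = 0; ep < EPISODES; ep++) {
        construct(x);
        double before = evaluate(x);
        double after  = local_search(x);
        if (after > best) best = after;
        /* cooperative feedback: the improvement achieved by the local
         * search is the reward that updates the construction policy */
        double r = after - before;
        for (int i = 0; i < N; i++) {
            double maxq = Q[i][0] > Q[i][1] ? Q[i][0] : Q[i][1];
            Q[i][x[i]] += ALPHA * (r + GAMMA*maxq - Q[i][x[i]]);
        }
    }
    printf("best objective found: %.1f\n", best);
    return 0;
}
```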

Relevance: 100.00%

Abstract:

We study magnetic interface roughness in F/AF (ferromagnet/antiferromagnet) bilayers. Two kinds of roughness were considered. The first consists of isolated defects that divide the substrate into two regions, each with a different AF sub-lattice. The interface exchange coupling is considered uniform and changes abruptly at the defect line, favoring Néel wall nucleation. Our results show how the threshold thickness for the reorientation of the magnetization in the ferromagnetic film depends on the interface field. Angular profiles show the relaxation of the magnetization from a Néel wall at the interface to the reoriented state at the surface. An external magnetic field perpendicular to the easy axis of the substrate favors the reoriented state. Depending on the intensity of an external magnetic field parallel to the easy axis of the AF, the magnetization profile at the surface can be parallel or perpendicular to the field direction. The second kind of roughness consists of periodically distributed defects. The shape of the hysteresis curves, the exchange bias, and the coercivity were characterized as functions of the interface field intensity and the roughness pattern. Our results show that dipolar effects decrease both the exchange bias and the coercivity.
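
For orientation, one common energy functional for such F/AF reorientation problems (an assumption on our part; the abstract does not state the model's exact Hamiltonian) treats the in-plane magnetization angle \(\theta(z)\) across the ferromagnetic film of thickness \(d\):

\[
E = \int_0^{d} \left[ A \left(\frac{\partial\theta}{\partial z}\right)^{2} + K\sin^{2}\theta - M H \cos(\theta - \theta_H) \right] dz \;-\; J_{\mathrm{int}}\cos\theta(0),
\]

where \(A\) is the exchange stiffness, \(K\) the uniaxial anisotropy constant, \(H\) the external field applied at angle \(\theta_H\), and \(J_{\mathrm{int}}\) the interface exchange (interface field) coupling the F layer to the AF substrate at \(z=0\). The competition between \(J_{\mathrm{int}}\), which pins \(\theta(0)\), and the bulk terms is what sets the threshold thickness for reorientation discussed above.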

Relevance: 100.00%

Abstract:

Currently, there are several power converter topologies applied to wind power generation. These converters allow wind turbines to operate at variable speed, enabling better use of the available wind. Ever higher converter performance is demanded, mainly because of the growth in the power generation capacity of wind turbines, which has given rise to various converter topologies, such as parallel and multilevel converters. Converters allow effective control of the power injected into the grid: partial control, when a partial-scale converter is used, or total control, when a full-scale converter is used. The back-to-back converter is one of the most widely used topologies in the market today, owing to its simple structure with few components, which contributes to robust and reliable operation. In this work, the implementation of a wind cogeneration system using a permanent magnet synchronous generator (PMSG) associated with a back-to-back power converter is proposed, in order to inject active power into an electric power system. The control strategy for the active power delivered to the grid is based on the philosophy of indirect control.
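
As an illustration of what "indirect control" of active power typically means in a grid-side converter (a hedged sketch, not the dissertation's controller), the C snippet below converts a power set-point into a d-axis current reference, assuming the grid voltage is aligned with the d-axis so that P = 1.5·Vd·Id, and tracks it with a PI regulator. The gains, grid values, and the crude first-order plant model are all assumptions.

```c
/* Illustrative indirect active-power control loop (assumed parameters). */
#include <stdio.h>

int main(void) {
    const double Ts = 1e-4;              /* control period [s] */
    const double Vd = 311.0;             /* grid voltage, d-axis [V] */
    const double Kp = 2.0, Ki = 400.0;   /* hypothetical PI gains */
    const double L  = 5e-3, R = 0.1;     /* assumed grid-filter model */

    double P_ref = 10e3;                 /* desired active power [W] */
    double id = 0.0, integ = 0.0;

    for (int k = 0; k < 2000; k++) {
        double id_ref = P_ref / (1.5 * Vd);  /* indirect step: power -> current */
        double err = id_ref - id;
        integ += Ki * err * Ts;
        double vd_cmd = Kp * err + integ;    /* converter voltage command */

        /* crude first-order plant: L di/dt = v - R i, discretised */
        id += Ts * (vd_cmd - R * id) / L;

        if (k % 500 == 0)
            printf("t=%.3fs  id=%.2fA  P=%.0fW\n", k*Ts, id, 1.5*Vd*id);
    }
    return 0;
}
```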

Relevance: 100.00%

Abstract:

Underground natural gas, whether or not associated with oil, is characterized as a mixture of hydrocarbons plus residual components such as carbon dioxide (CO2), nitrogen (N2), and hydrogen sulfide (H2S), called contaminants. H2S in particular stands out as a natural gas contaminant because it is associated with pipeline corrosion, human toxicity, and restrictions on the final applications of natural gas (NG). The sulfur present in NG must be fully or partially removed in order to meet specifications for the market, safety, transport, or further processing. There are distinct and varied desulfurization methods used in natural gas processing units (UPGN). Existing approaches such as caustic washing, absorption, membranes, and conventional adsorption processes are costly and energy-intensive, hence the need for research into economically feasible and efficient processes. This work studied the adsorption of hydrogen sulfide gas on pure and modified polymer matrices. Substrates of poly(vinyl chloride) (PVC), poly(methyl methacrylate) (PMMA), and sodium alginate (NaALG) were coated with vanadyl phosphate (VOPO4·2H2O), vanadium pentoxide (V2O5), rhodamine B (C28H31N2O3Cl), and Co2+ and Cu2+ ions, aiming at the adsorption of hydrogen sulfide (H2S). The adsorption tests used a continuous flow of H2S in a laboratory-scale column system (fixed-bed reactor). The techniques used to characterize the adsorbents were infrared spectroscopy (FTIR), thermogravimetric analysis (TGA), X-ray fluorescence (XRF), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The results indicate that the modified PMMA, PVC, and NaALG adsorbents have a significant adsorptive capacity. The matrix that stood out with the best adsorption capacity was NaALG modified with Co2+, at 12.79 mg H2S/g of matrix.
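
For reference, the adsorption capacity of a fixed-bed column is commonly obtained from the breakthrough curve; this is the standard definition, and the abstract does not state which exact method was used:

\[
q = \frac{Q\,C_0}{m}\int_0^{t_s}\left(1 - \frac{C(t)}{C_0}\right)dt,
\]

where \(Q\) is the volumetric gas flow rate, \(C_0\) the inlet H2S concentration, \(C(t)\) the outlet concentration, \(m\) the adsorbent mass, and \(t_s\) the saturation time. A figure such as 12.79 mg H2S/g then corresponds to the value of \(q\) measured for the Co2+-modified alginate bed.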

Relevance: 40.00%

Abstract:

Recent years have seen an increase in the acceptance and adoption of parallel processing, both for high-performance scientific computing and for general-purpose applications. This acceptance has been favored mainly by the development of massively parallel processing (MPP) environments and of distributed computing. A common point between distributed systems and MPP architectures is the notion of message passing, which allows communication between processes. A message-passing environment consists basically of a communication library that acts as an extension of the programming languages used to write parallel applications, such as C, C++, and Fortran. A fundamental aspect of developing parallel applications is analyzing their performance. Several metrics can be used in this analysis: execution time, efficiency in the use of the processing elements, and scalability of the application with respect to the number of processors or to the size of the problem instance. Establishing models or mechanisms that allow this analysis can be a rather complicated task, given the parameters and degrees of freedom involved in the implementation of a parallel application. One alternative that has been adopted is the use of tools for collecting and visualizing performance data, which allow the user to identify bottlenecks and sources of inefficiency in an application. Efficient visualization requires identifying and collecting data about the execution of the application, a stage called instrumentation. This work first presents a study of the main techniques used to collect performance data, followed by a detailed analysis of the main tools available for Beowulf-type parallel clusters running Linux on the x86 platform with MPI (Message Passing Interface) communication libraries such as LAM and MPICH. The analysis is validated on parallel applications that train perceptron-type neural networks with backpropagation. The conclusions show the potential and ease of use of the analyzed tools.
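
As a minimal illustration of the kind of data these tools collect automatically, the C/MPI sketch below hand-instruments a trivially parallel loop with MPI_Wtime. Run with different process counts, the recorded times \(T_1\) and \(T_p\) give the usual speedup \(S_p = T_1/T_p\) and efficiency \(E_p = S_p/p\). The workload and its size are illustrative assumptions.

```c
/* Hand-instrumented MPI timing sketch (illustrative; the surveyed tools
 * gather this kind of execution data automatically). */
#include <mpi.h>
#include <stdio.h>

#define WORK 100000000L

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    MPI_Barrier(MPI_COMM_WORLD);        /* align all processes */
    double t0 = MPI_Wtime();            /* instrumentation point */

    /* toy workload, statically divided among the processes */
    double local = 0.0;
    for (long i = rank; i < WORK; i += size)
        local += 1.0 / (double)(i + 1);

    double total;
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    double t1 = MPI_Wtime();
    if (rank == 0)                       /* S_p = T_1/T_p, E_p = S_p/p */
        printf("p=%d  time=%.3fs  sum=%.6f\n", size, t1 - t0, total);

    MPI_Finalize();
    return 0;
}
```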

Relevance: 40.00%

Abstract:

The seismic method is extremely important in geophysics. Mainly associated with oil exploration, this line of research attracts most of the investment in the area. Acquisition, processing, and interpretation of seismic data are the stages that constitute a seismic study. Seismic processing in particular is focused on imaging, that is, on producing an image of the geological structures in the subsurface. Seismic processing has evolved significantly in recent decades, driven by the demands of the oil industry and by hardware advances that provided greater storage and processing capacity for digital information, enabling more sophisticated processing algorithms such as those that exploit parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults, salt domes, and other structures of interest, such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy can be very time-consuming, owing to the heuristics of the mathematical algorithms and the extensive amount of input and output data involved; it may take days, weeks, or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make these methods impractical. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique. Analyses of speedup and efficiency were performed, and the degree of algorithmic scalability was identified with respect to the technological advances expected of future processors.
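
The compute core of RTM is the repeated finite-difference time stepping of the wave equation over the velocity model, and that loop nest is the natural target for OpenMP parallelization. Below is a simplified, hypothetical 2D sketch of such a kernel (not the dissertation's code; the grid sizes and the second-order stencil are assumptions):

```c
/* Simplified 2D acoustic time step with OpenMP (illustrative only). */
#include <omp.h>

#define NX 1000
#define NZ 1000

/* one finite-difference time step: p_next from p (current) and p_prev */
void rtm_step(float p_next[NZ][NX], const float p[NZ][NX],
              const float p_prev[NZ][NX], const float vel[NZ][NX],
              float dt, float h) {
    const float c = (dt * dt) / (h * h);
    /* the hot loop nest: rows are independent, so they are shared among
     * threads; this is the core-level parallelization applied to RTM */
    #pragma omp parallel for schedule(static)
    for (int z = 1; z < NZ - 1; z++) {
        for (int x = 1; x < NX - 1; x++) {
            float lap = p[z-1][x] + p[z+1][x] + p[z][x-1] + p[z][x+1]
                        - 4.0f * p[z][x];
            p_next[z][x] = 2.0f * p[z][x] - p_prev[z][x]
                           + c * vel[z][x] * vel[z][x] * lap;
        }
    }
}
```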

Relevance: 40.00%

Abstract:

Currently, several psychological and non-psychological tests can be found in publications without standardized procedures, covering different psychological areas such as intelligence, emotional states, attitudes, social skills, vocation, and preferences. Computerized psychological testing is an extension of traditional psychological testing practices; however, it has psychometric qualities of its own, arising both from the transposition to a computerized environment and from the extensions that can be developed within it. The present research, motivated by the need to study validity and reliability in a computerized test, designed a methodological structure for parallel applications in several kinds of operational groups, evaluating the influence of time and approach on the computerization process. This validation covers normative group values, reproducibility of the computerized application process, and data processing. Not every psychological test can be computerized; our need for a sound instrument, with quality and properties suitable for computerized application, led us to use the Millon Personality Inventory, created by Theodore Millon. This inventory assesses personality according to 12 bipolarities distributed over 24 factors, organized into the categories of motivational styles, cognitive targets, and interpersonal relations. The instrument does not diagnose pathological features but assesses normal and non-adaptive aspects of human personality against Millon's theory of personality. To situate this research in the Brazilian context of psychological testing, we discuss the theme, evaluating the advantages and disadvantages of such practices, the current forms of computerization of psychological testing, and the main criteria specific to this psychometric specialty. The test was available on-line, hosted at http://www.planetapsi.com, during 2007 and 2008, preceded by a questionnaire on social characteristics; a report was generated from each user's data entry. The test was applied across all Brazilian regions, yielding 1,508 applications. Nine groups were organized, reaching 180 test-retest subjects, with three time intervals and three retest formats for the study of on-line tests. In parallel, a multi-session off-line group of 20 subjects received the tests by e-mail. The subjects were distributed over the five Brazilian regions and were informed about the test via the Internet. The performance of the traditional and on-line tested groups supports the conclusion that on-line application provides significant consistency on all validity criteria studied and justifies its use. The on-line test results were not only consistent among themselves but also similar to those of paper-and-pencil tests (0.82). The retest results showed correlations between 0.92 and 1, while the multi-session group showed good correlations in these comparisons. Moreover, we assessed the adequacy of the operational criteria used, such as security, user performance, environmental characteristics, database organization, operational costs, and limitations of the on-line inventory. Performance on these items was excellent, leading also to the conclusion that a self-administered psychometric test is feasible. The results of this work serve as a guide for questioning and establishing methodologies for the computerization of psychological testing software in the country.
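
The reliability figures quoted above (0.82 between on-line and paper-and-pencil scores; 0.92 to 1 on retest) are correlation coefficients. Assuming the usual Pearson test-retest coefficient (the abstract does not name the statistic), it is computed as

\[
r = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}},
\]

where \(x_i\) and \(y_i\) are a subject's scores on the first and second applications of the test.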