65 results for "Algoritmo de Colisão de Partículas" (Particle Collision Algorithm)
Abstract:
This work proposes a collaborative system for marking dangerous points along transport routes and generating alerts for drivers. It consists of a proximity warning system for danger points, fed by drivers via GPS-equipped mobile devices. The system consolidates the data provided by several different drivers and generates a set of common points to be used by the warning system. Although the application is designed to protect drivers, the data it generates can also serve as input for the responsible authorities to improve signage and the maintenance of public roads.
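The consolidation step lends itself to a short illustration. Below is a minimal Python sketch, not the thesis's actual algorithm, of merging individual GPS reports into common danger points by greedy clustering within a fixed radius; the function names, the 50 m radius, and the 3-report threshold are assumptions made here for illustration.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def consolidate(reports, radius_m=50.0, min_reports=3):
    """Merge individual driver reports into common danger points.

    A point is kept only if at least `min_reports` drivers reported
    within `radius_m` of it (naive O(n^2) greedy clustering).
    """
    clusters = []  # each entry: [centroid, [member reports]]
    for p in reports:
        for c in clusters:
            if haversine_m(p, c[0]) <= radius_m:
                c[1].append(p)
                # recompute the centroid of the cluster
                lat = sum(m[0] for m in c[1]) / len(c[1])
                lon = sum(m[1] for m in c[1]) / len(c[1])
                c[0] = (lat, lon)
                break
        else:
            clusters.append([p, [p]])
    return [c[0] for c in clusters if len(c[1]) >= min_reports]
```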
Abstract:
The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research attracts most of the investment in the area. The acquisition, processing, and interpretation of seismic data are the parts that make up a seismic study. Seismic processing, in particular, focuses on imaging the geological structures in the subsurface. It has evolved significantly in recent decades, driven by the demands of the oil industry and by hardware advances that delivered greater storage and processing capacity, enabling more sophisticated processing algorithms, such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section image that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest such as potential hydrocarbon reservoirs. However, a high-quality, accurate migration can be very time-consuming, due to the heuristics of the mathematical algorithms and the large volume of input and output data involved; it may take days, weeks, or even months of uninterrupted execution on supercomputers, representing computational and financial costs that could make these methods impractical. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique. Furthermore, speedup and efficiency analyses were performed and, finally, the degree of algorithmic scalability was assessed with respect to the technological advances expected of future processors.
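As a rough illustration of the kernel such a migration parallelizes, here is a minimal Python/numpy sketch of one explicit finite-difference time step of the 2D acoustic wave equation, the propagation engine at the heart of RTM. The thesis parallelizes this kind of loop nest with OpenMP in a compiled language; numpy vectorization stands in for that loop here, and all grid values are illustrative.

```python
import numpy as np

def step(u_prev, u_curr, c, dt, dx):
    """One explicit time step of the 2D acoustic wave equation
    u_tt = c^2 (u_xx + u_yy), second order in time and space.
    This interior-point update is the hot loop an RTM code would
    parallelize (e.g. with OpenMP over the spatial grid)."""
    lap = np.zeros_like(u_curr)
    lap[1:-1, 1:-1] = (u_curr[2:, 1:-1] + u_curr[:-2, 1:-1]
                       + u_curr[1:-1, 2:] + u_curr[1:-1, :-2]
                       - 4.0 * u_curr[1:-1, 1:-1]) / dx**2
    return 2.0 * u_curr - u_prev + (c * dt) ** 2 * lap

# toy propagation: a point source in a constant-velocity medium
nx, dt, dx, c = 201, 1e-3, 10.0, 2000.0   # grid size, s, m, m/s (CFL-stable)
u0, u1 = np.zeros((nx, nx)), np.zeros((nx, nx))
u1[nx // 2, nx // 2] = 1.0                # initial impulse
for _ in range(100):
    u0, u1 = u1, step(u0, u1, c, dt, dx)
```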
Abstract:
This work presents a scalable and efficient parallel implementation of the standard Simplex algorithm on multicore architectures to solve large-scale linear programming problems. We present a general scheme explaining how each step of the standard Simplex algorithm was parallelized, pointing out important aspects of the parallel implementation. Performance analyses were conducted by comparing the sequential time of the Simplex tableau with that of IBM's CPLEX Simplex. The experiments were executed on a shared-memory machine with 24 cores. The scalability analysis was performed on problems of different dimensions, finding evidence that our parallel standard Simplex algorithm achieves better parallel efficiency on problems with more variables than constraints. In comparison with CPLEX, the proposed parallel algorithm was up to 16 times more efficient.
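For reference, here is a minimal numpy sketch of one pivot of the dense standard Simplex tableau, under one common tableau convention; the per-row elimination marked below is the step a multicore implementation distributes among threads. This is a generic tableau pivot, not the thesis's implementation.

```python
import numpy as np

def pivot_step(T):
    """One iteration of the standard (tableau) Simplex for a max problem.

    T is the (m+1) x (n+1) tableau; the last row holds reduced costs,
    the last column the right-hand side. Returns False at optimality.
    The elimination over the m+1 rows is embarrassingly parallel,
    which is what a multicore implementation exploits.
    """
    cost = T[-1, :-1]
    if np.all(cost >= 0):
        return False                      # optimal: no entering column
    j = int(np.argmin(cost))              # entering variable (Dantzig rule)
    col, rhs = T[:-1, j], T[:-1, -1]
    ratios = np.where(col > 1e-12,
                      rhs / np.where(col > 1e-12, col, 1.0), np.inf)
    i = int(np.argmin(ratios))            # leaving row (min-ratio test)
    if not np.isfinite(ratios[i]):
        raise ValueError("problem is unbounded")
    T[i] /= T[i, j]                       # normalize the pivot row
    for r in range(T.shape[0]):           # row elimination: parallel region
        if r != i:
            T[r] -= T[r, j] * T[i]
    return True
```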
Abstract:
We use a finite-difference Eulerian numerical code, ZEUS 3D, to simulate the collision between two magnetized molecular clouds, aiming to evaluate the rate of star formation triggered by the collision and to analyze how that rate varies with the relative orientations of the cloud magnetic fields before the shock. ZEUS 3D is not an easy code to handle; we had to create two subroutines, one to set up the cloud-cloud collision and the other for data output. ZEUS is a modular code, and we explain its hierarchical way of working as well as how our subroutines work. We adopt two sets of initial values for the density, temperature, and magnetic field of the clouds and of the external medium. For each set, we analyze in detail six cases with different directions and orientations of the cloud magnetic fields relative to the direction of motion of the clouds. The analysis of these twelve cases allowed us to confirm analytical and theoretical proposals found in the literature and to obtain several original results. Previous works indicate that if the cloud magnetic fields before the collision are orthogonal to the direction of motion, star formation is strongly inhibited during a cloud-cloud shock, whereas if the fields are parallel to the direction of motion, star formation is stimulated. Our treatment of the problem confirmed those results numerically and further allowed us to quantify the relative star-forming efficiency in each case. Moreover, we propose and analyze an intermediate case in which the field of one cloud is orthogonal to the motion and the field of the other is parallel to it; we conclude that in this case the star formation rate is also intermediate between the two extreme cases mentioned above. We also study the case in which the fields are orthogonal to the direction of motion but anti-parallel to each other rather than parallel, and we obtain the corresponding variation of the star formation rate due to this change in field configuration; this last case had not been studied in the literature before. From the simulations we obtain the star formation rate in each case, as well as the temporal evolution of that rate as each collision proceeds, which we examine in detail for one particular case. The star formation rates we obtain are in accordance with those expected from the existing observational data.
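ZEUS 3D itself is a Fortran code, so the following is only a schematic Python sketch of the kind of initial condition such a collision subroutine imposes: two dense spheres approaching head-on along x, each with a uniform magnetic field set parallel, orthogonal, or anti-parallel to the motion. All magnitudes and names here are illustrative assumptions, not the values or routines used in the thesis.

```python
import numpy as np

def init_collision(n=64, b_dirs=((1, 0, 0), (0, 1, 0))):
    """Toy initial condition for a head-on cloud-cloud collision.

    Two dense spheres move toward each other along x; b_dirs gives
    each cloud's uniform field direction relative to the motion,
    e.g. (1,0,0) = parallel, (0,1,0) = orthogonal, (-1,0,0) =
    anti-parallel. Units and magnitudes are arbitrary.
    """
    x, y, z = np.meshgrid(*[np.linspace(-1, 1, n)] * 3, indexing="ij")
    rho = np.full((n, n, n), 0.1)          # ambient density
    vx = np.zeros_like(rho)                # velocity along x
    B = np.zeros((3, n, n, n))             # magnetic field components
    for center, vel, b in [(-0.5, +1.0, b_dirs[0]), (+0.5, -1.0, b_dirs[1])]:
        cloud = (x - center) ** 2 + y**2 + z**2 < 0.3**2
        rho[cloud] = 10.0                  # dense cloud
        vx[cloud] = vel                    # approach velocity
        for k in range(3):
            B[k][cloud] = b[k]             # uniform cloud field
    return rho, vx, B
```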
Abstract:
This work presents strategies for improving a successful evolutionary metaheuristic for the Asymmetric Traveling Salesman Problem: a Memetic Algorithm designed specifically for this problem. The improvement applies the optimization techniques known as Path-Relinking and Vocabulary Building, the latter used in two different ways in order to evaluate its effect on the evolutionary metaheuristic. The methods were implemented in C++ and the experiments were run on instances from the TSPLIB library; the results show that the proposed procedures were successful in the tests performed.
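A minimal sketch of the Path-Relinking step follows: starting from one tour, repeatedly place the guiding tour's city into position and keep the best intermediate tour found along the trajectory. The swap-based walk is one common choice, not necessarily the thesis's exact operator, and the memetic machinery and Vocabulary Building around it are omitted.

```python
def tour_length(tour, dist):
    """Total length of a closed tour, given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def path_relink(start, guide, dist):
    """Move from `start` toward `guide` one city-position at a time,
    returning the best tour seen along the trajectory."""
    current = list(start)
    best, best_len = list(current), tour_length(current, dist)
    for i in range(len(guide)):
        if current[i] != guide[i]:
            j = current.index(guide[i])    # bring guide's city into slot i
            current[i], current[j] = current[j], current[i]
            length = tour_length(current, dist)
            if length < best_len:
                best, best_len = list(current), length
    return best, best_len
```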
Abstract:
The present study describes the stability and rheological behavior of suspensions of crosslinked poly(N-isopropylacrylamide) (PNIPAM), poly(N-isopropylacrylamide)-chitosan (PNIPAM-CS), and poly(N-isopropylacrylamide)-chitosan-poly(acrylic acid) (PNIPAM-CS-PAA) particles sensitive to pH and temperature. These dual-sensitive materials were obtained by a simple one-pot method, via free-radical precipitation copolymerization with potassium persulfate, using N,N′-methylenebisacrylamide (MBA) as the crosslinking agent. Incorporation of the precursor materials into the chemical networks was confirmed by elemental analysis and infrared spectroscopy. The influence of external stimuli such as pH, temperature, or both on particle behavior was investigated through rheological measurements, visual stability tests, and analytical centrifugation. The PNIPAM-CS particles showed higher stability in acidic and neutral media, whereas the PNIPAM-CS-PAA particles were more stable in neutral and alkaline media, both below and above the LCST of poly(N-isopropylacrylamide), as shown by the stability data. This is due to different interparticle interactions, as well as interactions between the particles and the medium (also evidenced by the rheological data), which were likewise influenced by the pH and temperature of the medium. Based on these results, we found that introducing pH-sensitive polymers into crosslinked poly(N-isopropylacrylamide) particles not only produced dual-sensitive materials but also allowed particle stability to be tuned, making phase separation faster or slower depending on the desired application. Thus, it is possible to adapt the material to different media.
Abstract:
Among the polymers that have stood out most in recent decades is chitosan, a biopolymer with promising physico-chemical and biological properties that has been the subject of a broad field of research. Chitosan is a strong candidate in the field of adsorption due to its adsorbent properties, low cost, and abundance. The amino groups in its chain govern most of its properties and define the applications for which a given chitosan sample may be used, so determining its average degree of deacetylation is essential. In this work we carried out kinetic and equilibrium studies to monitor and characterize the adsorption of two drugs, tetracycline hydrochloride and sodium cromoglycate, on chitosan particles. Kinetic models and adsorption isotherms were fitted to the experimental data, and zeta potential analyses were performed for both studies. The adsorption of each drug showed distinct features. The studies developed in this work made it possible to describe a kinetic model for the adsorption of tetracycline on chitosan particles, demonstrating that it can be described by two adsorption kinetics, one for protonated tetracycline and another for unprotonated tetracycline. For the adsorption of sodium cromoglycate on chitosan particles, equilibrium studies were carried out at different temperatures, allowing the determination of thermodynamic parameters.
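The two-kinetics description suggests a simple illustration: fitting a sum of two pseudo-first-order terms, one per tetracycline species, to uptake data. The sketch below uses made-up data and illustrative parameter names, assuming scipy's curve_fit; it shows the form of such a model, not the thesis's actual fit.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_site_pfo(t, q1, k1, q2, k2):
    """Sum of two pseudo-first-order uptakes, one per species
    (e.g. protonated and unprotonated tetracycline)."""
    return q1 * (1 - np.exp(-k1 * t)) + q2 * (1 - np.exp(-k2 * t))

# illustrative data: time (min) and uptake q(t) (mg/g), not thesis data
t = np.array([0, 5, 10, 20, 40, 60, 120, 240, 480], float)
q = np.array([0, 11, 18, 27, 36, 41, 48, 52, 54], float)

popt, _ = curve_fit(two_site_pfo, t, q, p0=[30, 0.1, 25, 0.01],
                    bounds=(0, np.inf))
q1, k1, q2, k2 = popt
print(f"fast component: q1 = {q1:.1f} mg/g, k1 = {k1:.3f} 1/min")
print(f"slow component: q2 = {q2:.1f} mg/g, k2 = {k2:.4f} 1/min")
```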
Abstract:
Oil-water separation by flotation is characterized by the interaction between the liquid and gas phases. To understand this process, it is necessary to analyze the physical and chemical properties that govern flotation, defining the nature of the forces acting on the particles. Interface chemistry plays an important role in flotation technology: by dispersing a gas phase into a liquid mixture, the desired particles attach to air bubbles and are carried to a surface layer where they can be physically separated. Through the study of the interfacial interactions involved in the system used in this work, it was possible to apply the results to a mathematical model able to determine the probability of flotation, from a perspective oriented toward petroleum emulsions such as oil-in-water. The probability of flotation correlates collision and attachment between oil droplets and air bubbles: the more collisions, the higher the probability of flotation. The attachment probability was analyzed with the Freundlich adsorption isotherm, which represents the probability of attachment between air bubbles and oil droplets. The mathematical model of flotation involved the injected air flow, the size and number of bubbles per second, the volume of the flotation cell, the viscosity of the medium, and the demulsifier concentration. The results showed that the flotation agent developed from castor oil, after variation of pH, salt content, temperature, concentration, and water-oil ratio, extracted oil from water efficiently, up to 95%, at demulsifier concentrations around 11 ppm. The best results were compared with two commercial products, coded "W" and "Z": the demulsifying power of Agflot was equivalent to that of commercial product "W" and superior to that of commercial product "Z".
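As a sketch of the attachment-probability ingredient, the Freundlich isotherm q = K_F · C^(1/n) can be fitted to equilibrium data by non-linear regression, as below; the overall flotation probability is then the product of a collision term and an attachment term derived from it. The data points and starting values are invented for illustration and are not the thesis's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c, kf, n):
    """Freundlich isotherm: q = K_F * C**(1/n)."""
    return kf * c ** (1.0 / n)

# illustrative equilibrium data (oil concentration in ppm vs. uptake)
c = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
q = np.array([1.1, 2.0, 3.1, 4.8, 7.2, 11.0])

(kf, n), _ = curve_fit(freundlich, c, q, p0=[1.0, 2.0])
print(f"K_F = {kf:.2f}, 1/n = {1 / n:.2f}")
# P(flotation) ~ P(collision) * P(attachment), the latter taken
# from the fitted isotherm at the operating concentration
```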
Abstract:
Due to its physico-chemical and biological properties, and to the abundance and low cost of its raw material, chitosan has found wide application in various fields, such as drug delivery systems. Many of these properties are associated with the amino groups in its polymer chain, so a proper determination of these groups is very important in order to establish whether a given chitosan sample can be used in a particular application. Thus, in this work, a comparison between the determination of the degree of deacetylation by conductometry and by elemental analysis was first carried out, using a detailed error-propagation analysis. Conductometric analysis was shown to be a simple and reliable method for determining the degree of deacetylation of chitosan. Subsequently, experiments were performed to monitor and characterize the adsorption of tetracycline on chitosan particles through kinetic and equilibrium studies. The main kinetic models and adsorption isotherms, widely used to describe adsorption in wastewater treatment and drug loading, were used to treat the experimental data. First, it was shown that an apparently linear t/q(t) × t relationship does not imply pseudo-second-order adsorption kinetics, contrary to what has been repeatedly reported in the literature, and that this misinterpretation can be avoided by using non-linear regression. Finally, the adsorption of tetracycline on chitosan particles was analyzed using the insights obtained from the theoretical analysis, and the resulting parameters were used to analyze the adsorption kinetics and isotherm and to propose an adsorption mechanism.
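The warning about the linearized plot is easy to reproduce. In the sketch below, data generated from a true pseudo-first-order process still give a nearly perfect straight line in the t/q(t) × t plot, while honest non-linear fits of both models identify the correct one. The data are synthetic and the script is only illustrative of the point, not the thesis's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

pfo = lambda t, qe, k1: qe * (1 - np.exp(-k1 * t))          # pseudo-first-order
pso = lambda t, qe, k2: qe**2 * k2 * t / (1 + qe * k2 * t)  # pseudo-second-order

# synthetic data that are *truly* pseudo-first-order (plus noise)
rng = np.random.default_rng(0)
t = np.linspace(1, 300, 25)
q = pfo(t, 50.0, 0.03) + rng.normal(0, 0.5, t.size)

# the linearized pseudo-second-order plot looks excellent anyway...
r = linregress(t, t / q)
print(f"linear t/q vs t: R^2 = {r.rvalue**2:.4f}")  # deceptively close to 1

# ...while non-linear fits reveal which model actually describes the data
for name, model in [("PFO", pfo), ("PSO", pso)]:
    p, _ = curve_fit(model, t, q, p0=[40.0, 0.01])
    sse = np.sum((q - model(t, *p)) ** 2)
    print(f"{name}: SSE = {sse:.1f}")
```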
Abstract:
Textile activity produces effluents containing a variety of dyes. Among the several processes for removing dyes from these wastewaters, sorption is one of the most effective, and chitosan is a very promising alternative for this purpose. The sorption of Methyl Orange by crosslinked chitosan particles was studied through equilibrium and kinetic analyses at different pH values. Besides the standard pseudo-order analyses usually performed (i.e., pseudo-first-order and pseudo-second-order), a novel approach involving pseudo-nth-order kinetics was used, with n determined via non-linear regression using the Levenberg-Marquardt method. Zeta potential measurements indicated that electrostatic interactions were important in the sorption process. Regarding the equilibrium experiments, the data were well fitted by a hybrid Langmuir-Freundlich isotherm, and the estimated Gibbs free energy of adsorption as a function of mass of dye per area of chitosan showed that adsorption becomes more homogeneous as the pH of the continuous phase decreases. As for the sorption kinetics, although the pseudo-nth-order description yielded good fits, a kinetic equation combining diffusion and adsorption phenomena proved more consistent as a physicochemical description of the sorption process.
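A minimal sketch of the pseudo-nth-order fit follows, using the closed-form solution of dq/dt = k(qe − q)^n and scipy's Levenberg-Marquardt driver (least_squares with method="lm"). The uptake data, starting point, and clipping safeguard are assumptions made here; the thesis's actual data and implementation may differ.

```python
import numpy as np
from scipy.optimize import least_squares

def q_nth(t, qe, k, n):
    """Closed form of dq/dt = k (qe - q)**n with q(0) = 0, for n != 1
    (n -> 1 recovers the pseudo-first-order limit). The clip keeps the
    base positive while the optimizer explores the parameter space."""
    base = np.clip(qe ** (1 - n) + (n - 1) * k * t, 1e-12, None)
    return qe - base ** (1.0 / (1.0 - n))

# illustrative Methyl Orange uptake data (min, mg/g), not thesis data
t = np.array([2, 5, 10, 20, 40, 80, 160, 320], float)
q = np.array([8, 15, 22, 30, 37, 42, 45, 46], float)

res = least_squares(lambda p: q_nth(t, *p) - q,
                    x0=[50.0, 0.01, 1.5], method="lm")  # Levenberg-Marquardt
qe, k, n = res.x
print(f"qe = {qe:.1f} mg/g, k = {k:.4f}, fitted order n = {n:.2f}")
```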
Abstract:
Traditional applications of feature selection in areas such as data mining, machine learning, and pattern recognition aim to improve accuracy and reduce the computational cost of the model. This is done by removing redundant, irrelevant, or noisy data, finding a representative subset of the data that reduces its dimensionality without loss of performance. With the development of research on classifier ensembles, and the observation that this type of model outperforms individual models when the base classifiers are diverse, a new field of application for feature selection research has emerged: finding diverse subsets of features for the construction of the base classifiers of ensemble systems. This work proposes an approach that maximizes ensemble diversity by selecting feature subsets using a model that is independent of the learning algorithm and has low computational cost, employing bio-inspired metaheuristics with filter-based evaluation criteria.
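A minimal sketch of the idea: a toy genetic algorithm (standing in here for the bio-inspired metaheuristics) searches binary feature masks scored by a CFS-like filter criterion, relevance to the target minus redundancy among the selected features, with no classifier in the loop. The operators, score, and constants are illustrative assumptions, not the thesis's design.

```python
import numpy as np

rng = np.random.default_rng(1)

def merit(mask, X, y):
    """CFS-like filter score: relevance to the target minus
    redundancy among the selected features (no learner needed)."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return -1.0
    rel = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in idx])
    red = 0.0
    if idx.size > 1:
        sub = np.corrcoef(X[:, idx], rowvar=False)
        red = (np.abs(sub).sum() - idx.size) / (idx.size * (idx.size - 1))
    return rel - red

def select(X, y, pop=30, gens=50, p_mut=0.05):
    """Toy genetic algorithm over binary feature masks; runs with
    different seeds yield different subsets, giving an ensemble its
    diverse base classifiers."""
    P = rng.random((pop, X.shape[1])) < 0.3
    for _ in range(gens):
        fit = np.array([merit(m, X, y) for m in P])
        P = P[np.argsort(fit)[::-1]]       # sort best-first
        children = P[:pop // 2].copy()
        rng.shuffle(children)
        flip = rng.random(children.shape) < p_mut
        P[pop // 2:] = children ^ flip     # mutated copies of the elite
    fit = np.array([merit(m, X, y) for m in P])
    return P[int(np.argmax(fit))]

# e.g.: X = rng.normal(size=(200, 40)); y = X[:, 0] + X[:, 3]
#       mask = select(X, y)
```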
Abstract:
This thesis proposes the architecture of a new multiagent-system framework for the hybridization of metaheuristics, inspired by the general Particle Swarm Optimization (PSO) framework. The main contribution is an effective approach to solving hard combinatorial optimization problems. PSO was chosen as inspiration because it is inherently multiagent, which allows the features of multiagent systems, such as learning and cooperation techniques, to be explored. In the proposed architecture, particles are autonomous agents with memory and with methods for learning and decision-making, using search strategies to move through the solution space. The concepts of position and velocity originally defined in PSO are redefined in this approach. The proposed architecture was applied to the Traveling Salesman Problem and to the Quadratic Assignment Problem, and computational experiments were performed to test its effectiveness. The experimental results were promising, with satisfactory performance, even though the potential of the proposed architecture has not been fully explored. In future research, the proposed approach will also be applied to multiobjective combinatorial optimization problems, which are closer to real-world problems. In the context of applied research, we intend to work with undergraduate and technical-level students on applying the proposed architecture to real-world problems.
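One common way to redefine position and velocity for tours, sketched below as an assumption about the flavor of redefinition rather than the thesis's exact operators, is to treat a particle's position as a permutation and its velocity as a sequence of swaps pulling it toward the personal-best and global-best tours.

```python
import random

def swaps_toward(source, target):
    """Swap sequence transforming tour `source` into `target`:
    the 'velocity' pointing from one position to the other."""
    s, seq = list(source), []
    for i in range(len(s)):
        if s[i] != target[i]:
            j = s.index(target[i])
            seq.append((i, j))
            s[i], s[j] = s[j], s[i]
    return seq

def move(tour, pbest, gbest, c1=0.5, c2=0.5):
    """One 'particle' step: probabilistically apply the swaps that
    pull the tour toward its personal and global bests. Partial
    application keeps the result a valid tour while adding the
    stochastic exploration PSO relies on."""
    new = list(tour)
    for (i, j) in swaps_toward(new, pbest):
        if random.random() < c1:
            new[i], new[j] = new[j], new[i]
    for (i, j) in swaps_toward(new, gbest):
        if random.random() < c2:
            new[i], new[j] = new[j], new[i]
    return new
```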
Abstract:
Due to the great difficulty of solving Combinatorial Optimization Problems exactly, several heuristic methods have been developed, yet for many years the performance of these approaches was not analyzed in a systematic way. This work proposes a statistical analysis of heuristic approaches to the Traveling Salesman Problem (TSP). The analysis focuses on evaluating the performance of each approach in terms of the computational time needed to reach the optimal solution of a given TSP instance. Survival Analysis was used, supported by hypothesis tests for the equality of survival functions. The evaluated approaches were divided into three classes: Lin-Kernighan algorithms, Evolutionary Algorithms, and Particle Swarm Optimization. In addition, the analysis included a memetic algorithm (for symmetric and asymmetric TSP instances) that uses the Lin-Kernighan heuristic as its local search procedure.
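The survival-analysis setup can be illustrated compactly: treat each run's time-to-optimum as a survival time, censor runs stopped at the cutoff, and estimate S(t) = P(optimum not yet found by t) with a Kaplan-Meier curve, one curve per heuristic class. The hand-rolled estimator and the exponential run times below are purely illustrative.

```python
import numpy as np

def kaplan_meier(times, found):
    """Kaplan-Meier estimate of S(t) = P(optimum not yet found by t).

    times : run times (time-to-optimum, or the cutoff if not reached)
    found : True if the run reached the optimum (event observed),
            False if it was stopped at the cutoff (censored)
    Returns the event times and the survival curve at those times.
    """
    order = np.argsort(times)
    times, found = np.asarray(times)[order], np.asarray(found)[order]
    n = len(times)
    s, ts, surv = 1.0, [], []
    for i, (t, ev) in enumerate(zip(times, found)):
        at_risk = n - i                 # runs still "alive" at time t
        if ev:                          # an actual time-to-optimum
            s *= 1.0 - 1.0 / at_risk
            ts.append(t)
            surv.append(s)
    return np.array(ts), np.array(surv)

# e.g. 30 runs of one heuristic on one instance, 600 s cutoff
rng = np.random.default_rng(2)
t = np.minimum(rng.exponential(200.0, 30), 600.0)  # illustrative only
ts, surv = kaplan_meier(t, t < 600.0)
```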
Abstract:
Combinatorial optimization problems aim to maximize or minimize functions defined over a finite domain. Metaheuristics are methods designed to find good solutions, sometimes the optimal one, in this finite domain, using a subordinate heuristic modeled for each particular problem. This work presents algorithms based on the particle swarm optimization metaheuristic applied to two combinatorial optimization problems: the Traveling Salesman Problem and the Multicriteria Degree-Constrained Minimum Spanning Tree Problem. The first problem optimizes a single objective, while the second deals with many objectives. To evaluate the performance of the proposed algorithms, they are compared with other approaches in terms of the quality of the solutions found.
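For the spanning-tree problem, here is a minimal single-objective sketch of a degree-constrained tree construction, a greedy Prim-style heuristic that caps each vertex's degree; the multicriteria variant the thesis addresses would rank edge-weight vectors instead of scalars. The greedy rule is illustrative only (the problem is NP-hard, so this gives a feasible tree, not the optimum) and may fail to span when the degree constraint is very tight.

```python
import heapq

def degree_constrained_tree(w, d_max):
    """Greedy Prim-style heuristic for a spanning tree in which every
    vertex has degree <= d_max. w is a symmetric weight matrix
    (list of lists). Returns the chosen edges as (u, v, cost)."""
    n = len(w)
    deg = [0] * n
    in_tree = [False] * n
    in_tree[0] = True
    edges = []
    heap = [(w[0][j], 0, j) for j in range(1, n)]
    heapq.heapify(heap)
    while heap and len(edges) < n - 1:
        c, u, v = heapq.heappop(heap)
        if in_tree[v] or deg[u] >= d_max:
            continue                      # stale offer or saturated vertex
        in_tree[v] = True
        deg[u] += 1
        deg[v] += 1
        edges.append((u, v, c))
        for j in range(n):                # offer v's edges to the frontier
            if not in_tree[j]:
                heapq.heappush(heap, (w[v][j], v, j))
    return edges
```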