1000 results for Particle Collision Algorithm
Abstract:
The separation of oil from water by flotation is characterized by the interaction between the liquid and gas phases. Understanding this process requires analyzing the physical and chemical properties that govern flotation, defining the nature of the forces acting on the particles. Interface chemistry plays an important role in flotation technology: by dispersing a gas phase into a liquid mixture, the desired particles attach to air bubbles and are carried to a surface layer, where they can be physically separated. Through the study of the interfacial interactions involved in the system used in this work, it was possible to apply the results to a mathematical model able to determine the probability of flotation, from a perspective oriented to petroleum emulsions such as oil-water. The probability of flotation correlates the collision and attachment between oil particles and air bubbles: the more collisions, the higher the probability of flotation. The attachment probability was analyzed through the Freundlich adsorption isotherm, which represents the probability of adhesion between air bubbles and oil particles. The mathematical model for flotation involved the injected air flow, the size and number of bubbles per second, the volume of the flotation cell, the viscosity of the medium and the concentration of demulsifier. The results showed that the flotation agent developed from castor oil, under variations of pH, salt content, temperature, concentration and water-oil ratio, achieved efficient extraction of oil from water, up to 95%, using concentrations around 11 ppm of demulsifier. The best results were compared to other commercial products, coded "W" and "Z": the demulsifying power of Agflot was equivalent to that of commercial product "W" and superior to that of commercial product "Z".
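As a rough numerical illustration of such a product-form model, the Python sketch below multiplies a collision term that grows with the particle-to-bubble size ratio by a Freundlich-type attachment term. The functional forms and all parameter values are illustrative assumptions, not the quantities fitted in the thesis.

    # Minimal sketch of a product-form flotation probability model,
    # P_f = P_c * P_a; every constant below is illustrative.

    def collision_probability(d_particle, d_bubble):
        """Collision term growing with the particle/bubble diameter
        ratio (illustrative Stokes-regime exponent of 2)."""
        return min(1.0, 1.5 * (d_particle / d_bubble) ** 2)

    def attachment_probability(c_demulsifier, K=0.3, n=2.0, q_max=1.0):
        """Freundlich-type loading q = K * C**(1/n), clipped to [0, 1]."""
        q = K * c_demulsifier ** (1.0 / n)
        return min(1.0, q / q_max)

    def flotation_probability(d_particle, d_bubble, c_demulsifier):
        return (collision_probability(d_particle, d_bubble)
                * attachment_probability(c_demulsifier))

    # Example: 50 um oil droplet, 600 um bubble, 11 ppm of demulsifier.
    print(flotation_probability(50e-6, 600e-6, 11.0))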
Abstract:
Due to its physicochemical and biological properties, together with the abundance and low cost of its raw material, chitosan has been recognized as a material of wide application in various fields, such as drug delivery systems. Many of these properties are associated with the presence of amino groups in its polymer chain. A proper determination of these amino groups is very important in order to specify whether a given chitosan sample can be used in a particular application. Thus, in this work, a comparison between the determination of the degree of deacetylation by conductometry and by elemental analysis was first carried out, using a detailed analysis of error propagation. It was shown that conductometry provides a simple and reliable method for determining the degree of deacetylation of chitosan. Subsequently, experiments were performed to monitor and characterize the adsorption of tetracycline on chitosan particles through kinetic and equilibrium studies. The main kinetic models and adsorption isotherms, widely used to describe adsorption in wastewater treatment systems and drug loading, were used to treat the experimental data. First, it was shown that an apparently linear t/q(t) × t relationship does not imply pseudo-second-order adsorption kinetics, contrary to what has been repeatedly reported in the literature. It was found that this misinterpretation can be avoided by using non-linear regression. Finally, the adsorption of tetracycline on chitosan particles was analyzed using insights obtained from the theoretical analysis, and the generated parameters were used to analyze the kinetics of adsorption, the isotherm of adsorption and to propose a mechanism of adsorption.
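The pitfall described above can be reproduced in a few lines. In the sketch below, synthetic pseudo-first-order data stand in for the tetracycline measurements; both models are fitted by non-linear regression with scipy and compared by residuals, rather than trusting the near-linear look of the t/q(t) × t plot.

    import numpy as np
    from scipy.optimize import curve_fit

    # Pseudo-first-order (PFO) and pseudo-second-order (PSO) kinetics.
    def pfo(t, qe, k1):
        return qe * (1.0 - np.exp(-k1 * t))

    def pso(t, qe, k2):
        return (qe ** 2 * k2 * t) / (1.0 + qe * k2 * t)

    # Synthetic PFO data standing in for the tetracycline measurements.
    rng = np.random.default_rng(0)
    t = np.linspace(1.0, 300.0, 30)
    q = pfo(t, 8.0, 0.05) + rng.normal(0.0, 0.05, t.size)

    # Fit both models non-linearly and compare residuals; data that
    # follow PFO kinetics can also yield a near-linear t/q(t) x t plot.
    for model in (pfo, pso):
        params, _ = curve_fit(model, t, q, p0=[q.max(), 0.01])
        sse = np.sum((q - model(t, *params)) ** 2)
        print(model.__name__, params, sse)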
Abstract:
Textile activity results in effluents containing a variety of dyes. Among the several processes for removing dyes from these wastewaters, sorption is one of the most effective, and chitosan is a very promising alternative for this end. The sorption of Methyl Orange by crosslinked chitosan particles was approached using equilibrium and kinetic analyses at different pH values. Besides the standard pseudo-order analyses normally performed (i.e. pseudo-first-order and pseudo-second-order), a novel approach involving pseudo-nth-order kinetics was used, n being determined via non-linear regression using the Levenberg-Marquardt method. Zeta potential measurements indicated that electrostatic interactions were important for the sorption process. Regarding the equilibrium experiments, the data were well fitted by a hybrid Langmuir-Freundlich isotherm, and the estimated Gibbs free energy of adsorption as a function of mass of dye per area of chitosan showed that the adsorption process becomes more homogeneous as the pH of the continuous phase decreases. Concerning the kinetics of sorption, although a pseudo-nth-order description yielded good fits, a kinetic equation involving diffusion-adsorption phenomena was found to be more consistent in terms of a physicochemical description of the sorption process.
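A minimal sketch of such a pseudo-nth-order fit, assuming the closed-form solution of dq/dt = k(qe - q)^n with q(0) = 0 and synthetic data in place of the Methyl Orange measurements:

    import numpy as np
    from scipy.optimize import curve_fit

    # Pseudo-nth-order model: dq/dt = k * (qe - q)**n with q(0) = 0.
    # Closed form for n != 1; it tends to pseudo-first-order as n -> 1.
    def pno(t, qe, k, n):
        return qe - (qe ** (1.0 - n) + (n - 1.0) * k * t) ** (1.0 / (1.0 - n))

    # Synthetic uptake data (mg/g vs min), placeholders for the
    # experimental points.
    rng = np.random.default_rng(1)
    t = np.linspace(5.0, 240.0, 25)
    q = pno(t, 8.3, 0.02, 1.6) + rng.normal(0.0, 0.05, t.size)

    # method='lm' selects Levenberg-Marquardt, the solver named in the
    # abstract; n is estimated together with qe and k.
    (qe, k, n), _ = curve_fit(pno, t, q, p0=[8.0, 0.01, 1.5], method="lm")
    print(f"qe = {qe:.2f} mg/g, k = {k:.3g}, n = {n:.2f}")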
Abstract:
Traditional applications of feature selection in areas such as data mining, machine learning and pattern recognition aim to improve the accuracy and reduce the computational cost of models. This is done by removing redundant, irrelevant or noisy data, finding a representative subset of the data that reduces its dimensionality without loss of performance. With the development of research on ensembles of classifiers, and the verification that this type of model performs better than individual models when the base classifiers are diverse, a new field of application for feature selection research emerges. In this new field, the goal is to find diverse subsets of features for building the base classifiers of ensemble systems. This work proposes an approach that maximizes the diversity of ensembles by selecting subsets of features using a model that is independent of the learning algorithm and has low computational cost. This is done using bio-inspired metaheuristics with filter-based evaluation criteria.
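A minimal sketch of the idea, assuming a mutual-information filter score and a plain random search standing in for the bio-inspired metaheuristic; the diversity threshold and all names are illustrative:

    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    # Filter criterion: mean mutual information of the subset with the
    # class label (no learning algorithm involved, hence low cost).
    def filter_score(X, y, subset):
        return mutual_info_classif(X[:, subset], y).mean()

    # Diversity between two subsets: 1 - Jaccard similarity.
    def diversity(a, b):
        a, b = set(a), set(b)
        return 1.0 - len(a & b) / len(a | b)

    # Random search standing in for the metaheuristic: keep candidate
    # subsets that score well AND differ from those already chosen,
    # one subset per base classifier of the ensemble.
    def select_diverse_subsets(X, y, n_subsets=5, k=10, iters=200, seed=0):
        rng = np.random.default_rng(seed)
        chosen = []
        for _ in range(iters):
            cand = rng.choice(X.shape[1], size=k, replace=False)
            if all(diversity(cand, c) > 0.5 for c in chosen) and \
               filter_score(X, y, cand) > 0.0:
                chosen.append(cand)
            if len(chosen) == n_subsets:
                break
        return chosen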
Abstract:
This thesis proposes the architecture of a new multiagent-system framework for the hybridization of metaheuristics, inspired by the general Particle Swarm Optimization (PSO) framework. The main contribution is an effective approach to solving hard combinatorial optimization problems. PSO was chosen as inspiration because it is inherently multiagent, which allows exploring features of multiagent systems such as learning and cooperation techniques. In the proposed architecture, particles are autonomous agents with memory and methods for learning and decision making, using search strategies to move through the solution space. The concepts of position and velocity originally defined in PSO are redefined in this approach. The proposed architecture was applied to the Traveling Salesman Problem and to the Quadratic Assignment Problem, and computational experiments were performed to test its effectiveness. The experimental results were promising, with satisfactory performance, even though the potential of the proposed architecture has not been fully explored. In future research, the proposed approach will also be applied to multiobjective combinatorial optimization problems, which are closer to real-world problems. In the context of applied research, we intend to work with both undergraduate and technical-level students on the application of the proposed architecture to real-world problems.
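One common way to redefine velocity for permutation problems, shown here only as an illustrative sketch (the thesis defines its own agent-based operators), is a sequence of swaps that moves a tour toward a guiding solution:

    import random

    def swaps_toward(current, target):
        """Swap sequence transforming `current` into `target`; this
        sequence plays the role of a 'velocity' between the two."""
        cur, swaps = current[:], []
        for i in range(len(cur)):
            if cur[i] != target[i]:
                j = cur.index(target[i])
                cur[i], cur[j] = cur[j], cur[i]
                swaps.append((i, j))
        return swaps

    def apply_velocity(tour, swaps, keep_prob=0.5):
        """Apply each swap with some probability (an 'inertia' weight)."""
        tour = tour[:]
        for i, j in swaps:
            if random.random() < keep_prob:
                tour[i], tour[j] = tour[j], tour[i]
        return tour

    # One particle move: drift toward the personal and global bests.
    def move(tour, pbest, gbest):
        tour = apply_velocity(tour, swaps_toward(tour, pbest))
        return apply_velocity(tour, swaps_toward(tour, gbest))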
Abstract:
Due to the great difficulty of solving Combinatorial Optimization Problems exactly, several heuristic methods have been developed, and for many years the performance analysis of these approaches was not carried out in a systematic way. The purpose of this work is to perform a statistical analysis of heuristic approaches to the Traveling Salesman Problem (TSP). The focus of the analysis is to evaluate the performance of each approach with respect to the computational time required to reach the optimal solution of a given TSP instance. Survival Analysis was used, supported by methods for testing the hypothesis of equality between survival functions. The evaluated approaches were divided into three classes: Lin-Kernighan Algorithms, Evolutionary Algorithms and Particle Swarm Optimization. In addition, the analysis included a memetic algorithm (for symmetric and asymmetric TSP instances) that uses the Lin-Kernighan heuristic as its local search procedure.
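In this setting the survival function is S(t) = P(optimum not yet reached by time t), and runs stopped at the time limit are right-censored. A self-contained Kaplan-Meier sketch (a log-rank test would then compare the survival curves of two algorithms); the example data are made up:

    import numpy as np

    def kaplan_meier(times, found):
        """times: run times; found: 1 if the optimum was reached,
        0 if the run was censored (time limit hit). Assumes no ties."""
        order = np.argsort(times)
        times, found = np.asarray(times)[order], np.asarray(found)[order]
        at_risk, s, curve = len(times), 1.0, []
        for t, d in zip(times, found):
            if d:                      # an "event": optimum was reached
                s *= (at_risk - 1) / at_risk
            curve.append((t, s))
            at_risk -= 1               # censored runs also leave the risk set
        return curve

    # Five runs of one heuristic; the fourth hit the time limit.
    print(kaplan_meier([12.1, 30.5, 7.3, 60.0, 25.2], [1, 1, 1, 0, 1]))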
Abstract:
Combinatorial optimization problems aim to maximize or minimize functions defined over a finite domain. Metaheuristics are methods designed to find good solutions in this finite domain, sometimes the optimal solution, using a subordinate heuristic that is modeled for each particular problem. This work presents algorithms based on the particle swarm optimization metaheuristic applied to two combinatorial optimization problems: the Traveling Salesman Problem and the Multicriteria Degree Constrained Minimum Spanning Tree Problem. The first problem optimizes a single objective, while the second deals with several objectives. To evaluate their performance, the proposed algorithms are compared to other approaches in terms of the quality of the solutions found.
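For the multicriteria problem, candidate solutions are naturally compared by Pareto dominance. A minimal non-dominated filter for minimization objectives (illustrative only, not the thesis code):

    def dominates(a, b):
        """a dominates b: no worse in every objective, better in one."""
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    def non_dominated(solutions):
        """O(n^2) filter keeping only non-dominated objective vectors."""
        return [s for s in solutions
                if not any(dominates(o, s) for o in solutions if o is not s)]

    # Example with hypothetical (weight, delay) vectors of spanning trees:
    print(non_dominated([(10, 4), (8, 6), (9, 3), (12, 2), (9, 5)]))
    # -> [(8, 6), (9, 3), (12, 2)]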
Abstract:
The distribution of petroleum products through pipeline networks is an important problem that arises in the production planning of refineries. It consists in determining what will be done in each production stage over a given time horizon, concerning the distribution of products from source nodes to demand nodes, passing through intermediate nodes. Constraints on storage limits, delivery time, source availability and limits on sending or receiving, among others, must be satisfied. This problem can be viewed as a biobjective problem that aims at minimizing both the time needed to transport the set of packages through the network and the fragmentation, i.e., the successive transmission of different products in the same pipe. In this work, three algorithms are developed and applied to this problem: the first is a discrete algorithm based on Particle Swarm Optimization (PSO), with local search procedures and path-relinking proposed as velocity operators; the second and third are two versions based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The proposed algorithms are compared to other approaches for the same problem in terms of solution quality and computational time, so that the efficiency of the developed methods can be evaluated.
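A sketch of path-relinking used as a velocity operator for permutations: walk from an initiating solution toward a guiding one, one position at a time, and keep the best intermediate. The toy cost function is a placeholder for the biobjective evaluation (e.g. a scalarization of delivery time and fragmentation):

    def path_relinking(start, guide, cost):
        best, current = start[:], start[:]
        best_cost = cost(best)
        for i in range(len(current)):
            if current[i] != guide[i]:
                j = current.index(guide[i])          # fix position i
                current[i], current[j] = current[j], current[i]
                c = cost(current)
                if c < best_cost:
                    best, best_cost = current[:], c
        return best, best_cost

    # Toy usage with a placeholder cost (distance of each item from
    # its index); a real run would evaluate the pipeline schedule.
    toy_cost = lambda p: sum(abs(v - i) for i, v in enumerate(p))
    print(path_relinking([3, 1, 2, 0], [0, 1, 2, 3], toy_cost))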
Abstract:
This work proposes and evaluates a modification of Ant Colony Optimization based on the results of experiments performed on the Selective Ride Robot problem (PRS), a new problem also proposed in this work. Four metaheuristics are implemented, GRASP, VNS and two versions of Ant Colony Optimization, and their results are analyzed by running the algorithms on 32 instances created during this work. The metaheuristics also have their results compared to an exact approach. The results show that the algorithm implemented with the GRASP metaheuristic performs well. The multicolony version of the ant colony algorithm, proposed and evaluated in this work, shows the best results.
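For reference, the construction phase that characterizes GRASP, sketched under illustrative assumptions (a generic incremental cost function and a standard alpha-based restricted candidate list, not the thesis implementation):

    import random

    def grasp_construct(candidates, cost, alpha=0.3):
        """Greedy randomized construction: at each step, build the
        restricted candidate list (RCL) from the incremental costs and
        pick a member at random; local search would refine the result."""
        solution, remaining = [], list(candidates)
        while remaining:
            costs = {c: cost(solution, c) for c in remaining}
            lo, hi = min(costs.values()), max(costs.values())
            rcl = [c for c in remaining if costs[c] <= lo + alpha * (hi - lo)]
            pick = random.choice(rcl)
            solution.append(pick)
            remaining.remove(pick)
        return solution

    # Toy usage: an additive cost that simply prefers smaller items.
    print(grasp_construct(range(5), lambda sol, c: c))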
Abstract:
This work addresses the optimization problem in high-dose-rate brachytherapy for the treatment of cancer patients, aiming at the definition of the set of dwell times. The solution technique adopted was Computational Transgenetics supported by the L-BFGS method. The developed algorithm was employed to generate non-dominated solutions whose dose distributions were able to eliminate the cancer while preserving the normal regions.
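A minimal sketch of the dwell-time subproblem, assuming a hypothetical quadratic dose objective (the dose matrix D and the prescription are made up); it only illustrates how L-BFGS with nonnegativity bounds applies to such a problem:

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical linear dose model: dose = D @ t, where t are the
    # dwell times; penalize squared deviation from the prescription.
    rng = np.random.default_rng(0)
    D = rng.uniform(0.0, 1.0, size=(40, 8))   # 40 dose points, 8 dwell positions
    prescribed = np.ones(40)

    def objective(t):
        return np.sum((D @ t - prescribed) ** 2)

    # L-BFGS-B handles the t >= 0 bounds that dwell times require.
    res = minimize(objective, x0=np.full(8, 0.5), method="L-BFGS-B",
                   bounds=[(0.0, None)] * 8)
    print(res.x.round(3), res.fun)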
Abstract:
Web services are computational solutions designed according to the principles of Service-Oriented Computing. Web services can be built upon pre-existing services available on the Internet by using composition languages. We propose a method to generate WS-BPEL processes from abstract specifications provided with high-level control-flow information. The proposed method allows the composition designer to concentrate on high-level specifications, in order to increase productivity and generate specifications that are independent of specific web services. We consider service orchestrations, that is, compositions where a central process coordinates all the operations of the application. The process of generating compositions is based on a rule-rewriting algorithm, which has been extended to support basic control-flow information. We created a prototype of the extended refinement method and performed experiments on simple case studies.
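A toy rewriting step in the spirit of the method, with a hypothetical rule set mapping abstract control-flow operators to WS-BPEL-like constructs; the names are illustrative, not the actual rules of the thesis:

    # An abstract composition is a nested tuple; rules map abstract
    # operators to executable constructs (hypothetical rule set).
    RULES = {
        "abstract_seq": "sequence",   # ordered execution
        "abstract_par": "flow",       # parallel execution
    }

    def rewrite(node):
        if isinstance(node, tuple):
            op, *children = node
            return (RULES.get(op, op), *map(rewrite, children))
        return node  # a concrete service invocation, left unchanged

    spec = ("abstract_seq", "checkStock", ("abstract_par", "bill", "ship"))
    print(rewrite(spec))
    # -> ('sequence', 'checkStock', ('flow', 'bill', 'ship'))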
Abstract:
Data clustering is applied in various fields, such as data mining, image processing and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The Fuzzy C-Means (FCM) algorithm is the fuzzy clustering algorithm most used and discussed in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers, so the choice of a good set of initial centers is very important for the performance of the algorithm. However, in FCM the initial centers are chosen randomly, which makes it difficult to find a good set. This work proposes three new methods to obtain initial cluster centers deterministically for the FCM algorithm, which can also be used in FCM variants; here, these initialization methods were also applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers. These new initialization approaches aim to reduce the number of iterations required for convergence and the processing time, without degrading the quality of the clustering, or even improving it in some cases. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the FCM and ckMeans algorithms modified with the proposed initialization methods when applied to various data sets.
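One deterministic initialization in the spirit of the proposal, shown as an illustrative maximin scheme (not necessarily one of the three methods of the thesis): start near the grand mean, then repeatedly add the point farthest from the centers chosen so far.

    import numpy as np

    def deterministic_centers(X, c):
        """Deterministic seeds for FCM/ckMeans: first the point nearest
        the grand mean, then greedily the farthest point (maximin)."""
        centers = [X[np.argmin(np.linalg.norm(X - X.mean(0), axis=1))]]
        while len(centers) < c:
            d = np.min([np.linalg.norm(X - m, axis=1) for m in centers], axis=0)
            centers.append(X[np.argmax(d)])
        return np.array(centers)

    # Three well-separated synthetic groups; the seeds land one per group.
    np.random.seed(0)
    X = np.vstack([np.random.normal(m, 0.3, (50, 2)) for m in (0, 3, 6)])
    print(deterministic_centers(X, 3))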
Abstract:
The Traveling Purchaser Problem is a variant of the Traveling Salesman Problem in which there is a set of markets and a set of products. Each product is available in a subset of the markets, and its unit cost depends on the market where it is available. The objective is to buy all the products, departing from and returning to a domicile, at the least possible cost, defined as the sum of the weights of the edges in the tour plus the cost paid to acquire the products. A Transgenetic Algorithm, an evolutionary algorithm based on endosymbiosis, is applied to the capacitated and uncapacitated versions of this problem. Evolution in Transgenetic Algorithms is simulated through the interaction and information sharing between populations of individuals from distinct species. The computational results show that this is a very effective approach for the TPP regarding solution quality and runtime. Seventeen and nine new best results are presented for instances of the capacitated and uncapacitated versions, respectively.
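The cost function described above is easy to state in code. The sketch below assumes the uncapacitated version, that market 0 is the domicile, and that every product is available in at least one visited market:

    def tpp_cost(tour, dist, price, n_products):
        """Tour length plus the cheapest purchase of each product among
        the visited markets. dist[i][j]: travel cost; price[m][p]: cost
        of product p in market m, or None when unavailable."""
        travel = sum(dist[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))
        purchase = 0
        for p in range(n_products):
            offers = [price[m][p] for m in tour if price[m][p] is not None]
            purchase += min(offers)   # assumes p is available on the tour
        return travel + purchase

    # Toy instance: 3 markets (0 = domicile), 2 products.
    dist = [[0, 2, 4], [2, 0, 3], [4, 3, 0]]
    price = [[None, None], [5, None], [6, 1]]
    print(tpp_cost([0, 1, 2], dist, price, n_products=2))   # 9 + 5 + 1 = 15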
Abstract:
In this work we present the mathematical and computational modeling of electrokinetic phenomena in electrically charged porous media. We consider a porous medium composed of three different scales (nanoscopic, microscopic and macroscopic). At the microscopic scale the domain is composed of a porous matrix and a solid phase. The pores are filled with an aqueous phase consisting of fully diluted ionic solutes, and the solid matrix consists of electrically charged particles. We first present the mathematical model that governs the electrical double layer, in order to quantify the electric potential, the electric charge density, ion adsorption and chemical adsorption at the nanoscopic scale. We then derive the microscopic model, in which the adsorption of ions due to the electrical double layer, the protonation/deprotonation reactions and the zeta potential obtained from the nanoscopic modeling appear at the microscopic scale through interface conditions in the Stokes and Nernst-Planck equations, which govern, respectively, the movement of the aqueous solution and the transport of ions. We carry out the upscaling of the nano/microscopic problem using the homogenization technique for periodic structures, deducing the macroscopic model together with the respective cell problems for the effective parameters of the macroscopic equations. Considering a clayey porous medium consisting of kaolinite clay plates distributed in parallel, we rewrite the macroscopic model in a one-dimensional version. Finally, using a sequential algorithm, we discretize the macroscopic model via the finite element method, together with the iterative Picard method for the nonlinear terms. Numerical simulations in the transient regime with variable pH in the one-dimensional case are presented, aiming at the computational modeling of the electroremediation of contaminated clayey soils.
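A minimal sketch of a Picard (successive-substitution) iteration on a toy nonlinear 1-D problem, using finite differences instead of the thesis's finite element discretization of the coupled Stokes/Nernst-Planck system; the coefficient is frozen nodewise for simplicity:

    import numpy as np

    def picard(k, f, n=50, tol=1e-8, max_iter=100):
        """Solve -(k(u) u')' = f on (0,1) with u=0 at both ends: freeze
        k at the previous iterate, solve the resulting linear system,
        repeat until the iterates stop changing."""
        h = 1.0 / (n + 1)
        u = np.zeros(n)
        for _ in range(max_iter):
            kf = k(u)                              # frozen coefficient
            A = (np.diag(2.0 * kf)
                 - np.diag(kf[1:], 1)
                 - np.diag(kf[:-1], -1)) / h ** 2
            u_new = np.linalg.solve(A, f)
            if np.linalg.norm(u_new - u) < tol:
                return u_new
            u = u_new
        return u

    # Toy nonlinearity k(u) = 1 + u^2 with a unit source term.
    u = picard(k=lambda u: 1.0 + u ** 2, f=np.ones(50))
    print(u.max())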