5 results for "Simulated experiment"
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
This paper analyzes the performance of a parallel implementation of Coupled Simulated Annealing (CSA) for the unconstrained optimization of continuous-variable problems. Parallel processing is an efficient form of information processing that emphasizes the exploitation of simultaneous events in the execution of software. It arises primarily from the demand for high computational performance and from the difficulty of increasing the speed of a single processing core. Although multicore processors are easily found nowadays, several algorithms are not yet suited to running on parallel architectures. The CSA algorithm consists of a group of Simulated Annealing (SA) optimizers working together to refine the solution, each SA optimizer running in its own thread executed on a different processor. In the analysis of parallel performance and scalability, the following metrics were investigated: execution time; the speedup of the algorithm with respect to the number of processors; and the efficiency of the processing elements with respect to the size of the treated problem. Furthermore, the quality of the final solution was verified. For this study, the paper proposes a parallel version of CSA and an equivalent serial version. Both algorithms were analyzed on 14 benchmark functions, and for each function the CSA was evaluated using 2 to 24 optimizers. The results are presented and discussed in light of these metrics. The paper concludes that CSA is a good parallel algorithm, both in the quality of its solutions and in its parallel scalability and efficiency.
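As an illustration of the coupling described above, the sketch below implements a minimal single-process CSA in Python: m SA optimizers share their current energies through a common coupling term in the acceptance probability. The benchmark function, schedules, and parameter values are placeholders chosen for illustration, not the dissertation's settings, and the actual study runs each optimizer in its own thread rather than in a loop.

    import numpy as np

    def rastrigin(x):
        # a common continuous benchmark (assumed; the abstract does not
        # list the paper's 14 functions)
        return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    def csa(f, dim=2, m=6, iters=2000, t_gen=1.0, t_ac=10.0, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5.12, 5.12, size=(m, dim))  # m coupled optimizers
        e = np.array([f(xi) for xi in x])
        for k in range(iters):
            # coupling term: every optimizer's energy enters each acceptance test
            gamma = np.sum(np.exp((e - e.max()) / t_ac))
            for i in range(m):
                y = x[i] + t_gen * rng.standard_cauchy(dim)  # candidate move
                ey = f(y)
                p = np.exp((e[i] - e.max()) / t_ac) / gamma  # coupled acceptance
                if ey < e[i] or rng.random() < p:
                    x[i], e[i] = y, ey
            t_gen = 1.0 / (k + 2)  # generation schedule (simple assumed form)
            t_ac *= 0.999          # acceptance schedule (simple assumed form)
        best = int(np.argmin(e))
        return x[best], e[best]

    print(csa(rastrigin))

The parallel metrics named in the abstract are then the usual ones: for p processing elements, speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p.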
Abstract:
Expanded Bed Adsorption (EBA) is an integrative process that combines concepts of chromatography and fluidization of solids. The many parameters involved and their synergistic effects complicate the optimization of the process. Fortunately, some mathematical tools have been developed to guide the investigation of the EBA system. In this work, the application of experimental design, phenomenological modeling, and artificial neural networks (ANN) to understanding chitosanase adsorption on the ion exchange resin Streamline® DEAE was investigated. The strain Paenibacillus ehimensis NRRL B-23118 was used for chitosanase production. EBA experiments were carried out using a column of 2.6 cm inner diameter and 30.0 cm height, coupled to a peristaltic pump; at the bottom of the column there was a 3.0 cm high distributor of glass beads. Residence time distribution (RTD) assays revealed a high degree of mixing; however, the Richardson-Zaki coefficients showed that the column was on the threshold of stability. Isotherm models fitted the adsorption equilibrium data in the presence of lyotropic salts. The experimental design results indicated that ionic strength and superficial velocity are important for the recovery and purity of the chitosanases. The molecular masses of the two chitosanases were approximately 23 kDa and 52 kDa, as estimated by SDS-PAGE. The phenomenological modeling aimed to describe the batch and column chromatography operations; the simulations were performed in Microsoft Visual Studio. The kinetic rate constant model fitted the kinetic curves well at initial enzyme activities of 0.232, 0.142, and 0.079 UA/mL. The simulated breakthrough curves showed some differences from the experimental data, especially regarding the slope. Sensitivity tests of the model with respect to superficial velocity, axial dispersion, and initial concentration agreed with the literature. The neural network was built in MATLAB with the Neural Network Toolbox, and cross-validation was used to improve its generalization ability. The ANN parameters were tuned to the settings 6-6 (enzyme activity) and 9-6 (total protein), with the tansig transfer function and the Levenberg-Marquardt training algorithm. The neural network simulations, covering all steps of the cycle, showed good agreement with the experimental data, with a correlation coefficient of approximately 0.974. The effects of the input variables on the profiles of the loading, washing, and elution stages were consistent with the literature.
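As a concrete example of the equilibrium step mentioned above, the sketch below fits a Langmuir isotherm, a standard choice for ion-exchange equilibrium, to synthetic data with SciPy. The abstract does not say which isotherm models or parameter values the thesis used, so both are assumptions here.

    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c, q_max, k_d):
        # Langmuir isotherm: q = q_max * c / (k_d + c)
        return q_max * c / (k_d + c)

    # synthetic equilibrium data generated from the model plus noise
    # (placeholders; the thesis' measured values are not in the abstract)
    rng = np.random.default_rng(1)
    c_eq = np.linspace(0.01, 0.25, 8)  # liquid-phase enzyme activity, UA/mL
    q_eq = langmuir(c_eq, 5.0, 0.05) + rng.normal(0.0, 0.05, c_eq.size)

    (q_max, k_d), _ = curve_fit(langmuir, c_eq, q_eq, p0=(4.0, 0.1))
    print(f"q_max = {q_max:.2f}, K_d = {k_d:.3f}")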
Abstract:
This study investigates the chemical species in water produced from reservoir zones of oil-producing wells in the Monte Alegre field (onshore production), with the aim of developing a model for identifying produced water from different zones or groups of zones. Using the concentrations of anions and cations in the produced water as input parameters for Linear Discriminant Analysis, it was possible to estimate models and compare their predictions, respecting the particularities of each method, in order to determine which would be most appropriate. The resubstitution, holdout, and Lachenbruch methods were used for fitting and overall evaluation of the built models. Among the models estimated for wells producing water from a single production zone, the most suitable was the one evaluated by the holdout method, with a hit rate of 90%. The discriminant functions (CV1, CV2, and CV3) estimated in this model were used to model new functions for samples of artificial mixtures of produced water (prepared in our laboratory) and for samples of actual mixtures of produced water (collected in wells producing from more than one zone). The experiment with these mixtures was carried out according to a simplex-centroid mixture design, and the presence of water from steam injection in these reservoirs was also simulated for part of the samples. Using two- and three-dimensional plots, it was possible to estimate the proportion of water from each production zone.
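The three evaluation schemes named above differ only in which samples score the fitted classifier, as the sketch below illustrates with scikit-learn. The data are a synthetic stand-in, since the ion concentrations are not given in the abstract, and Lachenbruch's method is implemented as the leave-one-out cross-validation it amounts to.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import (LeaveOneOut, cross_val_score,
                                         train_test_split)

    # synthetic stand-in: ion concentrations as features, producing zone as label
    rng = np.random.default_rng(0)
    n_per_zone, n_ions = 30, 6
    X = np.vstack([rng.normal(mu, 1.0, size=(n_per_zone, n_ions))
                   for mu in (0.0, 2.0, 4.0)])
    y = np.repeat(["zone_A", "zone_B", "zone_C"], n_per_zone)

    lda = LinearDiscriminantAnalysis()

    # resubstitution: fit and score on the same samples (optimistic)
    resub = lda.fit(X, y).score(X, y)

    # holdout: score on samples kept out of the fit
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    holdout = lda.fit(X_tr, y_tr).score(X_te, y_te)

    # Lachenbruch: leave-one-out cross-validation
    loo = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut()).mean()

    print(f"resubstitution={resub:.2f}  holdout={holdout:.2f}  leave-one-out={loo:.2f}")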
Abstract:
Genetic Algorithms (GA) and Simulated Annealing (SA) are algorithms built to find the maximum or minimum of a function that represents some characteristic of the process being modeled. Both algorithms have mechanisms that let them escape local optima; however, they evolve over time in completely different ways. In its search process, SA works with a single point, always generating from it a new solution that is tested and may or may not be accepted, whereas GA works with a set of points, called a population, from which it generates another population that is always accepted. What the two algorithms have in common is that the way the next point or the next population is generated obeys stochastic properties. In this work we show that the mathematical theory describing the evolution of these algorithms is the theory of Markov chains: the GA is described by a homogeneous Markov chain, while the SA is described by a non-homogeneous Markov chain. Finally, some computational examples comparing the performance of the two algorithms are presented.
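A minimal SA implementation makes the non-homogeneity concrete: the Metropolis acceptance probability depends on a temperature that changes at every iteration, so the transition kernel of the chain changes with time. The cooling schedule, step size, and test function below are illustrative assumptions, not the dissertation's choices.

    import numpy as np

    def simulated_annealing(f, x0, iters=5000, t0=1.0, seed=0):
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for k in range(1, iters + 1):
            t = t0 / np.log(k + 1)  # cooling: the kernel depends on k
            y = x + rng.normal(scale=0.5, size=x.shape)  # candidate point
            fy = f(y)
            # Metropolis rule: accept improvements always, otherwise with
            # probability exp(-(fy - fx) / t)
            if fy < fx or rng.random() < np.exp(-(fy - fx) / t):
                x, fx = y, fy
        return x, fx

    # example: a simple multimodal test function (assumed)
    f = lambda x: float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10))
    print(simulated_annealing(f, [3.0, -2.0]))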
Abstract:
Two-level factorial designs are widely used in industrial experimentation. However, a design with many factors requires a large number of runs, and replicating the treatments many times may not be feasible given limitations of resources and time, making the experiment expensive. In these cases, unreplicated designs are used. With only one replicate, however, there is no internal estimate of experimental error with which to judge the significance of the observed effects. One possible solution to this problem is to use normal plots or half-normal plots of the effects. Many experimenters use the normal plot, while others prefer the half-normal plot, often, in both cases, without justification. The controversy about the use of these two graphical techniques motivates this work, since there is no record of a formal procedure or statistical test that indicates "which one is best". The choice between the two plots seems to be a subjective issue. The central objective of this master's thesis is, then, to perform an experimental comparative study of the normal plot and the half-normal plot in the context of the analysis of unreplicated 2^k factorial experiments. This study involves the construction of simulated scenarios in which the plots' performance in detecting significant effects and identifying outliers is evaluated, in order to answer the following questions: Can one plot be better than the other? In which situations? What information does one plot add to the analysis of the experiment that might complement the information provided by the other? What are the restrictions on the use of the plots? In this way, this work intends to confront the two techniques, examining them simultaneously in order to identify similarities, differences, or relationships that contribute to building a theoretical reference to justify, or to aid in, the experimenter's decision about which of the two graphical techniques to use and why. The simulation results show that the half-normal plot is better for assisting in the judgment of the effects, while the normal plot is recommended for detecting outliers in the data.
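As a concrete instance of the simulated scenarios described above, the sketch below generates one unreplicated 2^3 experiment in which only the effects A and AB are active, computes all seven effect contrasts, and draws the half-normal plot of |effect| against half-normal quantiles. The design size, active effects, and noise level are illustrative assumptions.

    import numpy as np
    import matplotlib.pyplot as plt
    from itertools import combinations
    from scipy import stats

    # full 2^3 design in -1/+1 coding; response with A and AB active (assumed)
    levels = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
    rng = np.random.default_rng(0)
    y = 5 + 3 * levels[:, 0] + 2 * levels[:, 0] * levels[:, 1] + rng.normal(0, 0.5, 8)

    # all main-effect and interaction contrast columns
    names, cols = [], []
    for r in (1, 2, 3):
        for idx in combinations(range(3), r):
            names.append("".join("ABC"[i] for i in idx))
            cols.append(np.prod(levels[:, list(idx)], axis=1))

    # effect = (contrast . y) / (n/2); with one replicate there is no
    # internal error estimate, hence the (half-)normal plot
    effects = np.array([c @ y / 4 for c in cols])

    order = np.argsort(np.abs(effects))
    abs_eff = np.abs(effects)[order]
    q = stats.halfnorm.ppf((np.arange(1, 8) - 0.5) / 7)
    plt.scatter(q, abs_eff)
    for qi, ei, nm in zip(q, abs_eff, np.array(names)[order]):
        plt.annotate(nm, (qi, ei))
    plt.xlabel("half-normal quantiles")
    plt.ylabel("|effect|")
    plt.show()

In such a plot, inert effects fall near a line through the origin while active ones stand apart; the normal plot keeps the signs of the effects, which is part of what makes it useful for spotting outliers.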