121 results for Cloud


Abstract:

The generation of effluent from the finishing process in the textile industry is a serious environmental problem and has become the object of study in several scientific papers. Contamination with dyes and the presence of substances toxic to the environment make this effluent difficult to treat. Several processes have already been evaluated to remove or even degrade such pollutants, for example coagulation-flocculation, biological treatment and advanced oxidative processes, but none of them allows the recovery of the dye, or at least of the recovery agent. An alternative to this problem is cloud point extraction, which applies nonionic surfactants at temperatures above the cloud point, at which water becomes a poor solvent for the surfactant; the surfactant molecules then agglomerate around the dye molecules through their affinity with the organic phase. Two phases are formed: a dilute phase, poor in dye and surfactant, and a coacervate phase, with higher concentrations of both. The subsequent reuse of the coacervate to recycle dye and surfactant shows the technical and economic viability of this process. In this paper, cloud point extraction is used to remove the dye Reactive Blue from water, using the nonionic surfactant nonylphenol with 9.5 ethoxylations. The aim is to solubilize the dye molecules in the surfactant, varying concentration and temperature to study their effects. By evaluating the dye concentration in the dilute phase after extraction, it is possible to analyze thermodynamic variables, build Langmuir isotherms, and determine the behavior of the coacervate volume as a function of surfactant concentration and temperature, the distribution coefficient, and the dye removal efficiency. The surfactant concentration proved crucial to the success of the treatment. Removal efficiencies reached 91.38%, 90.69%, 89.58%, 87.22% and 84.18% at temperatures of 65.0, 67.5, 70.0, 72.5 and 75.0 °C, respectively, showing that cloud point extraction is an efficient alternative for the treatment of wastewater containing Reactive Blue.
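The removal efficiency and distribution coefficient mentioned above have standard definitions, sketched below; every concentration value in this snippet is illustrative and not data from the study.

```python
def removal_efficiency(c_initial, c_dilute):
    """Percentage of dye removed from the dilute phase."""
    return 100.0 * (c_initial - c_dilute) / c_initial

def distribution_coefficient(c_coacervate, c_dilute):
    """Ratio of dye concentration between the coacervate and dilute phases."""
    return c_coacervate / c_dilute

# Illustrative concentrations (mg/L), chosen to reproduce the 91.38% figure:
print(removal_efficiency(50.0, 4.31))         # ~ 91.38 %
print(distribution_coefficient(300.0, 4.31))  # ~ 69.6
```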

Abstract:

Environmental sustainability has become one of the topics of greatest interest in industry, mainly because of effluent generation. Phenols are found in the effluents of many industries, such as refineries, coal processing, pharmaceutical, plastics, paint, and pulp and paper plants. Because phenolic compounds are toxic to humans and aquatic organisms, Federal Resolution CONAMA No. 430 of May 13, 2011 limits the maximum phenol content to 0.5 mg·L⁻¹ for discharge into freshwater bodies. In effluent treatment, liquid-liquid extraction is the most economical process for phenol recovery because it consumes little energy, but in most cases it employs an organic solvent, whose high toxicity can itself cause environmental problems. There is therefore a need for new methodologies that replace these solvents with biodegradable ones. Literature studies demonstrate the feasibility of removing phenolic compounds from aqueous effluents with biodegradable solvents. In this kind of extraction, called cloud point extraction, a nonionic surfactant is used as the extracting agent for the phenolic compounds. In order to optimize the phenol extraction process, this work studies the mathematical modeling and optimization of the extraction parameters and investigates the effect of the independent variables on the process. A 3² full factorial design was carried out with operating temperature and surfactant concentration as independent variables, and the extraction parameters (volumetric fraction of the coacervate phase, residual concentrations of surfactant and phenol in the dilute phase after phase separation, and phenol extraction efficiency) as dependent variables.
To achieve these objectives, the work was carried out in five steps: (i) selection of literature data; (ii) use of the Box-Behnken model to find mathematical models that describe the phenol extraction process; (iii) data analysis with STATISTICA 7.0, using analysis of variance to assess model significance and prediction; (iv) model optimization using the response surface method; and (v) validation of the mathematical models with additional measurements, from samples different from those used to build the models. The results showed that the mathematical models can predict the effect of surfactant concentration and operating temperature on each extraction parameter studied, within the boundaries used. The model optimization yielded consistent and applicable results in a simple and quick way, leading to high efficiency in process operation.
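The response-surface step above can be sketched as an ordinary least-squares fit of a full quadratic model on the 3² design. The response values below are synthetic, generated from made-up coefficients (the thesis fits its models to cloud point extraction data in STATISTICA).

```python
import numpy as np

# 3^2 factorial design in coded levels -1, 0, +1 for the two factors:
levels = np.array([-1.0, 0.0, 1.0])
T, C = np.meshgrid(levels, levels)   # coded temperature and concentration
T, C = T.ravel(), C.ravel()          # the 9 runs of the design

# Synthetic "extraction efficiency" generated from known coefficients:
y = 90 + 3*T - 2*C + 1.5*T*C - 0.5*T**2 + 0.2*C**2

# Full quadratic model: y = b0 + b1*T + b2*C + b12*T*C + b11*T^2 + b22*C^2
X = np.column_stack([np.ones_like(T), T, C, T*C, T**2, C**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)   # recovers [90, 3, -2, 1.5, -0.5, 0.2]
```

The 3² design supports the full quadratic model (9 runs, 6 coefficients), so the fit recovers the generating coefficients exactly.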

Abstract:

This paper presents a study of coastal hydrodynamics and the spread of an oil spill in the waters off Macau and Galinhos, on the east coast of the state of Rio Grande do Norte in Northeast Brazil. The area has a very marked coastal dynamic owing to the complexity of its geomorphological features, developed in a regime of semidiurnal mesotides and involving reefs, spits, estuaries, mangroves, lakes and dunes. The region also plays an important role in the socioeconomic development of the state, given that the production of oil, natural gas, salt and shrimp is concentrated there. The oil platforms are interconnected by a pipeline system that carries oil to the local terminal, and this pipeline could leak at any moment, causing immense ecological damage. To gauge the risks of an oil leak and the resulting contamination of the coastal region, two hydrodynamic scenarios were simulated. The results were used to implement a contaminant transport model, with various oil leak scenarios created at different volumes (from small to large) and intensities (sporadic and continuous), at points considered critical for the model (on two platforms and at two pipeline intersections), under different wind (summer and winter) and tidal (high and low at new, full and quarter moon phases) conditions. The use of hydrodynamic circulation computer models as a tool for representing real design problems has become increasingly frequent in recent years, given that they enable the realistic simulation of the hydrodynamic circulation pattern in bodies of water and the analysis of the impacts caused by contaminants released into the water. This study used the computer models contained in SisBAHIA®, under continuous development in the area of Coastal Engineering and Oceanography at COPPE/UFRJ.
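A contaminant transport model of the kind described above is often realized as Lagrangian particle tracking: advection by the simulated current plus a random-walk term for turbulent diffusion. The sketch below is a deliberately minimal stand-in with a uniform, made-up current; SisBAHIA solves the full hydrodynamic circulation, which this snippet does not attempt.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt, steps = 1000, 60.0, 100    # particles, time step [s], number of steps
u, v = 0.30, 0.05                 # hypothetical uniform current [m/s]
D = 1.0                           # hypothetical turbulent diffusivity [m^2/s]

x = np.zeros(n)                   # all particles start at the leak point
y = np.zeros(n)
for _ in range(steps):
    # advection + random-walk diffusion (step std = sqrt(2*D*dt)):
    x += u * dt + rng.normal(0.0, np.sqrt(2 * D * dt), n)
    y += v * dt + rng.normal(0.0, np.sqrt(2 * D * dt), n)

# The plume centroid drifts with the current: ~u*dt*steps = 1800 m east.
print(x.mean(), y.mean())
```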

Abstract:

The improvement in the spatial resolution of orbital sensors has considerably broadened the applicability of their images to urban problems. But as the spatial resolution improves, shadows become an even more serious problem, especially when detailed information under the shadows is required. Besides the shadows cast by buildings and houses, shadows projected by clouds are likely to occur. In this case, information is occluded by the cloud itself, in association with the low-illumination, low-contrast areas produced by the cloud shadow on the ground. It is therefore important to use efficient methods to detect shadow and cloud areas in digital images, taking into account that these areas require special processing. This paper proposes the application of Mathematical Morphology (MM) to shadow and cloud detection. Two parts of a panchromatic QuickBird image of the urban area of Cuiabá-MT were used. The proposed method takes advantage of the fact that shadows (low-intensity, dark areas) and clouds (high-intensity, bright areas) represent, respectively, the bottoms and tops of the image when it is regarded as a topographic surface. This characteristic allows MM area opening and closing operations to be applied to reduce or eliminate the bottoms and tops of that surface.
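A hedged sketch of the binary form of the area opening used above: keep only connected regions larger than a size threshold. Production code would use e.g. scikit-image's `area_opening`; this pure-Python version, on a tiny made-up image, only illustrates the idea.

```python
import numpy as np
from collections import deque

def area_opening(mask, min_area):
    """Keep only 4-connected True components with at least min_area pixels."""
    out = np.zeros_like(mask, dtype=bool)
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                comp, queue = [(i, j)], deque([(i, j)])
                seen[i, j] = True
                while queue:                      # BFS over the component
                    a, b = queue.popleft()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if 0 <= na < h and 0 <= nb < w and mask[na, nb] and not seen[na, nb]:
                            seen[na, nb] = True
                            comp.append((na, nb))
                            queue.append((na, nb))
                if len(comp) >= min_area:         # small components are dropped
                    for a, b in comp:
                        out[a, b] = True
    return out

# Dark pixels are shadow candidates; a lone dark pixel is discarded:
img = np.array([[10, 10, 200],
                [10, 200, 200],
                [200, 200, 10]])
shadows = area_opening(img < 50, min_area=2)
```

Area closing is the dual operation (apply the same filter to the inverted mask), which is how the bright cloud regions would be handled.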

Abstract:

We use a finite-difference Eulerian numerical code, ZEUS 3D, to simulate the collision between two magnetized molecular clouds, aiming to evaluate the rate of star formation triggered by the collision and to analyse how that rate varies with the relative orientation of the cloud magnetic fields before the shock. ZEUS 3D is not an easy code to handle: we had to create two subroutines, one to set up the cloud-cloud collision and another for data output. ZEUS is a modular code, and we explain its hierarchical way of working as well as how our subroutines operate. We adopt two sets of initial values for the density, temperature and magnetic field of the clouds and of the external medium. For each set, we analyse in detail six cases with different directions and orientations of the cloud magnetic fields relative to the direction of motion of the clouds. The analysis of these twelve cases allowed us to confirm analytical-theoretical proposals found in the literature and to obtain several original results. Previous works indicate that if the cloud magnetic fields before the collision are orthogonal to the direction of motion, star formation is strongly inhibited during a cloud-cloud shock, whereas if those fields are parallel to the direction of motion, star formation is stimulated. Our treatment of the problem confirmed those results numerically, and further allowed us to quantify the relative star-forming efficiency in each case. Moreover, we propose and analyse an intermediate case in which the field of one cloud is orthogonal to the motion and the field of the other is parallel to it; we conclude that in this case the star formation rate also takes a value intermediate between the two extreme cases mentioned above.
Besides that, we study the case in which the fields are orthogonal to the direction of motion but anti-parallel to each other instead of parallel, and we obtain the corresponding variation of the star formation rate due to this change of field configuration; this last case had not been studied in the literature before. Our simulations yield the star formation rate in each case, as well as the temporal evolution of that rate as each collision proceeds, which we follow in detail for one of the cases in particular. The values we obtain for the star formation rate are in accordance with those expected from the presently existing observational data.

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico

Abstract:

Traditional applications of feature selection in areas such as data mining, machine learning and pattern recognition aim to improve accuracy and reduce the computational cost of the model. This is done by removing redundant, irrelevant or noisy data, finding a representative subset of the data that reduces its dimensionality without loss of performance. With the development of research on ensembles of classifiers, and the verification that this type of model performs better than individual models when the base classifiers are diverse, a new field of application for feature selection research has emerged: finding diverse subsets of features for building the base classifiers of ensemble systems. This work proposes an approach that maximizes the diversity of the ensemble by selecting feature subsets using a model that is independent of the learning algorithm and has low computational cost. This is done using bio-inspired metaheuristics with filter-based evaluation criteria.
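A filter-based subset search of the kind described can be sketched as follows. The thesis uses bio-inspired metaheuristics; as a simplified, deterministic stand-in this uses greedy single-flip search on a CFS-like filter score (mean relevance penalized by redundancy). All relevance values are made up.

```python
relevance  = [0.9, 0.8, 0.1, 0.7, 0.05]   # hypothetical feature-class correlations
redundancy = 0.3                           # hypothetical inter-feature redundancy

def fitness(subset):
    """Filter criterion: mean relevance minus a redundancy penalty."""
    if not subset:
        return 0.0
    rel = sum(relevance[i] for i in subset) / len(subset)
    return rel - redundancy * (len(subset) - 1) / len(subset)

def greedy_filter_search(n_features=5):
    """Flip one feature in or out per step while the filter score improves."""
    best, improved = frozenset(), True
    while improved:
        improved = False
        for i in range(n_features):
            cand = best ^ {i}                 # add or remove feature i
            if fitness(cand) > fitness(best):
                best, improved = cand, True
    return best

print(greedy_filter_search())   # frozenset({0}): the most relevant feature wins
```

Because the criterion never touches a learning algorithm, evaluating a candidate subset is cheap, which is the point of the filter approach; a swarm or evolutionary search would simply explore the same fitness landscape with a population instead of one greedy trajectory.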

Abstract:

Owing to the great difficulty of solving Combinatorial Optimization Problems exactly, heuristic methods have been developed, yet for many years the performance of these approaches was not analysed in a systematic way. The proposal of this work is to carry out a statistical analysis of heuristic approaches to the Traveling Salesman Problem (TSP). The focus of the analysis is to evaluate the performance of each approach with respect to the computational time needed to attain the optimal solution for a given TSP instance. Survival Analysis was used, supported by hypothesis tests for the equality of survival functions. The evaluated approaches were divided into three classes: Lin-Kernighan algorithms, Evolutionary Algorithms and Particle Swarm Optimization. In addition, the analysis included a memetic algorithm (for symmetric and asymmetric TSP instances) that uses the Lin-Kernighan heuristic as its local search procedure.
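In this setting the "survival time" of a run is the time until it finds the optimum, and runs cut off at a time limit are censored. A minimal Kaplan-Meier estimator (the standard nonparametric survival-function estimate) is sketched below; the run times are made up for illustration.

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time; events[i] is False if censored."""
    data = sorted(zip(times, events))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e)    # optima found at t
        at_risk = sum(1 for tt, _ in data if tt >= t)    # runs still searching
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)
    return curve

# 5 runs of one heuristic: optimum found at 2 s, 4 s, 4 s;
# two runs censored at the 5 s time limit.
curve = kaplan_meier([2, 4, 4, 5, 5], [True, True, True, False, False])
print(curve)   # survival drops to 0.8 after 2 s and to 0.4 after 4 s
```

Comparing such curves between heuristic classes is what the equality-of-survival-functions hypothesis tests (e.g. the log-rank test) formalize.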

Abstract:

The distribution of petroleum products through pipeline networks is an important problem that arises in the production planning of refineries. It consists in determining what will be done in each production stage over a given time horizon, concerning the distribution of products from source nodes to demand nodes through intermediate nodes, while satisfying constraints on storage limits, delivery time, source availability, and limits on sending or receiving, among others. The problem can be viewed as a biobjective problem that aims at minimizing both the time needed to transport the set of packages through the network and the fragmentation, that is, the successive transmission of different products in the same pipe. In this work three algorithms are developed and applied to this problem: the first is a discrete algorithm based on Particle Swarm Optimization (PSO), with local search and path-relinking procedures proposed as velocity operators; the second and third are two versions based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The proposed algorithms are compared with other approaches to the same problem in terms of solution quality and computational time, so that the efficiency of the developed methods can be evaluated.
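The step at the core of NSGA-II, non-dominated sorting, can be sketched for the two minimization objectives here (transport time and fragmentation). The solution points below are illustrative, not from the thesis, and this naive quadratic version stands in for NSGA-II's faster bookkeeping.

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_fronts(points):
    """Peel off successive non-dominated fronts (both objectives minimized)."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# (transport time, fragmentation) pairs for five candidate schedules:
pts = [(3, 9), (5, 5), (9, 2), (6, 6), (8, 8)]
fronts = nondominated_fronts(pts)
print(fronts[0])   # [(3, 9), (5, 5), (9, 2)] -- the Pareto front
```

Front membership is what drives NSGA-II's selection pressure: rank 1 solutions (the Pareto front) are preferred over rank 2, and so on.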

Abstract:

Separation methods have limited application as a result of operational costs, low throughput and the long time required to separate the fluids. Nevertheless, these treatment methods are important because of the need to extract unwanted contaminants in oil production. The water content, and the concentration of oil in that water, must be minimal (around 20 to 40 ppm) before discharge to the sea. Given this need for primary treatment, the objective of this work is to study and implement algorithms for closed-loop identification of polynomial NARX (Nonlinear Auto-Regressive with eXogenous input) models, to implement structure identification, and to compare control strategies using PI control and on-line updated NARX predictive models on a three-phase separator in series with three hydrocyclone batteries. The main goals are: to obtain an optimized phase separation process that keeps the system regulated even in the presence of oil gushes; to show that it is possible to obtain optimized controller tunings by analysing the loop as a whole; and to evaluate and compare the PI and predictive control strategies applied to the process. To accomplish these goals, a simulator was used to represent the three-phase separator and the hydrocyclones. Algorithms were developed for system identification of NARX models using RLS (Recursive Least Squares), along with methods for model structure detection. Predictive control algorithms were also implemented with the NARX model updated on-line, together with optimization algorithms using PSO (Particle Swarm Optimization). The work ends with a comparison of the results obtained with the PI and predictive controllers (both tuned through the particle swarm algorithm) in the simulated system. We conclude that the optimizations performed make the system less sensitive to external perturbations and that, when optimized, the two controllers show similar results, with the predictive controller somewhat less sensitive to disturbances.
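The RLS identification of a polynomial NARX model can be sketched on a toy plant. The model structure, true parameters and excitation below are all illustrative stand-ins for the separator model used in the work.

```python
import numpy as np

# Toy plant: y(k) = a*y(k-1) + b*u(k-1) + c*u(k-1)^2  (polynomial NARX)
rng = np.random.default_rng(1)
a_true, b_true, c_true = 0.6, 0.8, 0.3

theta = np.zeros(3)          # parameter estimates [a, b, c]
P = np.eye(3) * 1000.0       # covariance (large = uninformative prior)
lam = 1.0                    # forgetting factor (1.0 = plain RLS)

y_prev, u_prev = 0.0, 0.0
for _ in range(500):
    u = rng.uniform(-1.0, 1.0)                            # exciting input
    y = a_true*y_prev + b_true*u_prev + c_true*u_prev**2  # simulated plant
    phi = np.array([y_prev, u_prev, u_prev**2])           # NARX regressors
    k = P @ phi / (lam + phi @ P @ phi)                   # RLS gain
    theta = theta + k * (y - phi @ theta)                 # parameter update
    P = (P - np.outer(k, phi @ P)) / lam                  # covariance update
    y_prev, u_prev = y, u

print(theta)   # converges to ~[0.6, 0.8, 0.3]
```

With noiseless data and persistent excitation the estimates converge to the true parameters; on-line model updating for predictive control, as in the work, would run the same recursion during closed-loop operation, typically with a forgetting factor below 1.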