10 results for Discarding

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

10.00%

Publisher:

Abstract:

This dissertation on the solidarity economy analyzes the four unions responsible for selective municipal garbage collection in Natal. It seeks to verify the consolidation of these unions as solidarity-economy enterprises, showing the progress they have made, the social and economic insertion of the garbage collectors, and their process of winning citizenship. The four unions were founded and are constituted, in their majority, by collectors who came from the Cidade Nova lixão (open garbage dump); when it was closed in August 2004, they decided to organize into unions in order to keep collecting garbage. As for the theoretical and methodological procedures, the research was developed from a critical perspective with a qualitative approach, without discarding a quantitative one. The central analytical categories of this work are association, work, social exclusion and citizenship. The research followed three articulated axes whose aim was to apprehend and disclose the subject. The exposition of the investigative results is divided into four chapters. The first addresses the main aspects of the crisis of capital and its reflexes in the world of work: the structural unemployment resulting from the present economic model, the main changes observed in the Brazilian labor market, and the unemployment levels affecting the labor market in Natal's metropolitan region. The second chapter treats the origin, concept and revival in Brazil of the tradition of cooperative economic thought and organization, which has recovered the central elements of associative thought and is studied today in Latin America under the name of solidarity economy. The third chapter deals with the formation of the collectors' unions: the history, emergence and development of each one. The fourth chapter presents the dimensions of the analytical categories, supported by the reports of institutional actors and by the collectors' perception of recyclable materials, of how they face daily life, and so on, bringing out the contradictions present in their reality. The final remarks sum up the main trends and particularities of the unions researched in the light of the solidarity economy, and disclose the real prospects of social and economic insertion of these collectors and the process by which they win social recognition.

Relevance:

10.00%

Publisher:

Abstract:

This exploratory thesis presents an analysis of the e-waste of the mobile (cellular) telephone industry, evaluating the evolution of telecommunication networks and the behavior of the global and Brazilian cellular telephony markets. It examines the elements present in cellular handsets that can harm the environment and human health when the devices are discarded at the end of their life cycle. It analyzes the new European regulation on waste electrical and electronic equipment, the WEEE directive, how it has influenced the strategies of mobile phone manufacturers, and how a Brazilian national industry for recycling cellular handsets could be created with the capacity to compete globally. To that end, several models that could be implemented in Brazil are presented. Bill 203/91 on solid waste is discussed, along with the proposals presented to the bill that it would be worth retaining in order to create a Brazilian recycling market capable of global competition, taking advantage of the European regulation to obtain a competitive advantage.

Relevance:

10.00%

Publisher:

Abstract:

In this work, Markov chains are the tool used in the modeling and convergence analysis of the genetic algorithm, both in its standard version and in the other versions the genetic algorithm admits. In addition, we intend to compare the performance of the standard version with that of a fuzzy version, in the belief that the fuzzy version gives the genetic algorithm the strong ability to find a global optimum that is characteristic of global optimization algorithms. The choice of this algorithm is due to the fact that, over the past thirty years, it has become one of the most important tools for solving optimization problems: it is effective at finding a good-quality solution, and a good-quality solution is acceptable given that there may be no algorithm able to obtain the optimal solution for many of these problems. The algorithm can be configured in many ways, since it depends not only on how the problem is represented but also on how some of its operators are defined, ranging from the standard version, in which the parameters are kept fixed, to versions with variable parameters. Achieving good performance with the algorithm therefore requires an adequate criterion for choosing its parameters, especially the mutation rate and the crossover rate, or even the population size. It is important to remember that in implementations in which the parameters are kept fixed throughout the execution, modeling the algorithm by a Markov chain yields a homogeneous chain, whereas when the parameters are allowed to vary during the execution, the modeling Markov chain becomes non-homogeneous. Hence, in an attempt to improve performance, a few studies have tried to set the parameters through strategies that capture intrinsic characteristics of the problem. These characteristics are extracted from the current state of the execution, in order to identify and preserve patterns related to good-quality solutions while discarding low-quality patterns. Strategies for feature extraction can use either precise techniques or fuzzy techniques, the latter implemented through a fuzzy controller. A Markov chain is used for the modeling and convergence analysis of the algorithm, in its standard version as well as the others. To evaluate the performance of the non-homogeneous algorithm, tests are applied comparing the standard genetic algorithm with the fuzzy genetic algorithm, whose mutation rate is adjusted by a fuzzy controller. To that end, we pick optimization problems whose number of solutions grows exponentially with the number of variables.
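
To make the parameter-adaptation idea concrete, the sketch below is a minimal genetic algorithm whose mutation rate is recomputed each generation from the population's diversity by a toy two-rule, fuzzy-style controller. The test problem (OneMax), the binary encoding, the controller rules and every numeric constant are illustrative assumptions, not the thesis's setup; the point is that, because the mutation rate depends on the execution state, a Markov chain modeling this run is non-homogeneous.

```python
import random

# Minimal GA with a state-dependent mutation rate (illustrative only;
# problem, encoding and controller rules are assumptions).

def fitness(bits):
    return sum(bits)  # OneMax: count of 1-bits (assumed test problem)

def fuzzy_mutation_rate(diversity, base=0.01, boost=0.10):
    # Two-rule controller: low diversity -> high mutation,
    # high diversity -> low mutation, interpolated linearly in between.
    low = max(0.0, 1.0 - diversity / 0.5)  # membership of "diversity is low"
    return base + low * (boost - base)

def diversity_of(pop):
    # Fraction of loci that are not unanimous across the population.
    n = len(pop[0])
    mixed = sum(1 for i in range(n) if len({ind[i] for ind in pop}) > 1)
    return mixed / n

def run_ga(n_bits=50, pop_size=40, generations=200, cx_rate=0.8):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Rate varies with the population state: the non-homogeneous case.
        mut_rate = fuzzy_mutation_rate(diversity_of(pop))
        def select():                       # binary tournament selection
            return max(random.sample(pop, 2), key=fitness)
        children = []
        while len(children) < pop_size:
            a, b = select(), select()
            if random.random() < cx_rate:   # one-point crossover
                cut = random.randrange(1, n_bits)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):            # bit-flip mutation
                children.append([bit ^ (random.random() < mut_rate) for bit in child])
        pop = children[:pop_size]
    return max(pop, key=fitness)

best = run_ga()
print(fitness(best), "of 50 bits set")
```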

Relevance:

10.00%

Publisher:

Abstract:

Image compression consists in representing an image with a small amount of data without loss of visual quality. Compression matters when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, 8 bits for each of the primary components red, green and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reduce transmission, processing and storage time, and many applications depend on it: medical imaging, satellite imaging, sensing, and so on. In this work a new method for compressing color images is proposed, based on a measure of the information in each band. The technique, called Self-Adaptive Compression (SAC), compresses each band of the image with a different threshold so as to preserve information with better results: SAC applies strong compression to highly redundant bands, that is, those with less information, and soft compression to bands with a greater amount of information. Two image transforms are used in the technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step converts the data into decorrelated bands with PCA; the DCT is then applied to each band. Loss occurs when a threshold discards coefficients. The threshold is calculated from two elements: the PCA result and a user parameter that defines the compression ratio. The system produces three different thresholds, one for each band of the image, proportional to its amount of information. For image reconstruction, the inverse DCT and inverse PCA are applied. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in MSE (mean squared error). Tests showed that SAC gives better quality under strong compression, with two advantages: (a) being adaptive, it is sensitive to the image type, presenting good results for diverse kinds of images (synthetic, landscapes, people, etc.); and (b) it needs only one user parameter, so very little human intervention is required.
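
As an illustration of the pipeline the abstract describes, the sketch below decorrelates the bands with PCA, applies a DCT to each band, and discards coefficients below a per-band threshold proportional to that band's share of the variance; reconstruction inverts both transforms. This is a minimal sketch under stated assumptions, not the thesis's SAC implementation: the user parameter, the variance-based weighting and the magnitude-based threshold are all assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def compress_decompress(img, user_ratio=0.05):
    """img: H x W x 3 float array; user_ratio: overall fraction of DCT
    coefficients kept, split among bands by explained variance."""
    h, w, _ = img.shape
    pixels = img.reshape(-1, 3)
    mean = pixels.mean(axis=0)
    # PCA: eigendecomposition of the band covariance decorrelates the bands.
    cov = np.cov(pixels - mean, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)       # ascending eigenvalues
    bands = ((pixels - mean) @ eigvec).reshape(h, w, 3)
    share = eigval / eigval.sum()              # "information" per band (assumed proxy)
    out = np.empty_like(bands)
    for b in range(3):
        coeffs = dctn(bands[:, :, b], norm="ortho")
        keep = max(1, int(coeffs.size * user_ratio * 3 * share[b]))
        # Per-band threshold: zero all but the `keep` largest coefficients.
        thresh = np.partition(np.abs(coeffs).ravel(), -keep)[-keep]
        coeffs[np.abs(coeffs) < thresh] = 0.0
        out[:, :, b] = idctn(coeffs, norm="ortho")
    # Inverse PCA brings the decorrelated bands back to RGB.
    return (out.reshape(-1, 3) @ eigvec.T + mean).reshape(h, w, 3)

rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))
rec = compress_decompress(img)
print("MSE:", np.mean((img - rec) ** 2))
```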

Relevance:

10.00%

Publisher:

Abstract:

The decontamination of materials has been the subject of several studies. One of the factors that increases pollution is the irresponsible discarding of toxic waste, for example the release of PCBs (polychlorinated biphenyls) into the environment. Under Brazilian regulations, material contaminated with PCBs at concentrations above 50 ppm must be stored in special facilities or destroyed, usually by incineration in a dual-stage plasma furnace. Due to the high cost of that procedure, new methodologies for PCB removal have been studied. The objective of this study was to develop an experimental and analytical methodology for quantifying the removal of PCBs by supercritical fluid extraction and by the Soxhlet method, and to compare the technical efficiency of the two extraction processes in the treatment of PCB-contaminated materials. The materials studied were soil and wood, both with simulated contamination at concentrations of 6,000, 33,000 and 60,000 mg of PCB per kg of material. Soxhlet extractions were performed using 100 mL of hexane at 180 °C. Supercritical fluid extractions were performed at 200 bar and 70 °C, with a supercritical CO2 flow rate of 3 g/min for 1-3 hours. The extracts obtained were quantified by gas chromatography-mass spectrometry (GC/MS). The conventional extractions followed a 2² factorial experimental design, in order to study the influence of two variables of the Soxhlet extraction process, contaminant concentration and extraction time, on the maximum removal of PCBs from the materials. Soxhlet extraction was efficient for PCBs in soil and wood with both solvents studied (hexane and ethanol). In the soil extractions, the best removal efficiency using ethanol as solvent was 81.3%, against 95% for hexane, at equal extraction times. The results for wood showed no statistically significant difference between the two solvents. Under the conditions studied, supercritical fluid extraction was more efficient for PCBs in the wood matrix than in soil: for two-hour extractions, 43.9 ± 0.5% of total PCBs were extracted from soil against 95.1 ± 0.5% from wood. The results show that both techniques studied performed satisfactorily.
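
For reference, the sketch below shows how the two main effects (contaminant concentration and extraction time) and their interaction are estimated in a 2² factorial design from four runs at coded ±1 levels. The removal percentages are placeholders for illustration, not the thesis's measurements.

```python
import numpy as np

# Effect estimation in a 2^2 factorial design. Responses are PLACEHOLDERS.
#                 conc  time  removal(%)
runs = np.array([
    [-1, -1, 70.0],   # low concentration, short time (placeholder)
    [+1, -1, 75.0],   # high concentration, short time (placeholder)
    [-1, +1, 88.0],   # low concentration, long time  (placeholder)
    [+1, +1, 91.0],   # high concentration, long time (placeholder)
])
x1, x2, y = runs[:, 0], runs[:, 1], runs[:, 2]

# Each effect is the mean response at the high level minus the mean
# at the low level of the corresponding coded factor.
effect_conc = y @ x1 / 2          # main effect of concentration
effect_time = y @ x2 / 2          # main effect of extraction time
effect_int  = y @ (x1 * x2) / 2   # concentration x time interaction

print(f"concentration effect: {effect_conc:+.1f}")
print(f"time effect:          {effect_time:+.1f}")
print(f"interaction:          {effect_int:+.1f}")
```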

Relevance:

10.00%

Publisher:

Abstract:

An evaluation was conducted of treatment techniques for the oily effluent derived from cashew-nut processing. The following techniques were evaluated: advanced wet-oxidation processes, oxidative processes, biological treatment processes and adsorption processes. The assays were carried out on kinetic models, with the quality of each process assessed by determining the chemical oxygen demand (COD), adopted as the control parameter for a comparative study of the available techniques. The results showed that natural biodegradation of the effluent is limited: relying on the natural flora present in the effluent proved impracticable for industrial treatment systems, regardless of the evaluation environment (with or without oxygen). The use of specific microorganisms for the degradation of the oily compounds made this route technically viable, with quite good levels of COD removal at acceptable levels of inclusion in the treatment system of the cashew-processing effluent. Combined with other pre-treatment techniques, it proved even more efficient for treating the effluent and discharging it into receiving bodies within the standards accepted by resolution CONAMA 357/2005. Despite the significant generation of solid residues, adsorption on agro-industrial residues (in particular chitosan) is a technically viable alternative; however, when applied only to treat effluent for discharge into bodies of water, its economic viability is compromised and the environmental gains are minimized, whereas when used for reuse purposes the viability evens out and justifies the investment. The photochemical processes applicable to the treatment of the effluent were studied, with results more satisfactory than those obtained by UV-peroxide techniques. The catalysts used in the photochemical process gave results different from those expected: catalysts based on mixed cerium and manganese oxides, incorporating potassium promoters, presented the best results in the decomposition of the pollutants involved. Combining the photochemical pre-treatment with subsequent chlorine disinfection guaranteed characteristics close to water potability. Wet oxidation presented significant pollutant removal; however, its high cost makes it viable only for reuse projects, in areas of water scarcity and high water capture/acquisition costs, in particular for industrial and potable use. The route with the best economic and technical conditions for treating cashew-processing effluent follows this sequence: conventional water-oil separation, then a photochemical process and, finally, complementary biological treatment.

Relevance:

10.00%

Publisher:

Abstract:

Looking at the body and finding it engraved by cultural, imaginary and power-related texts through a discourse embodied in itself, this research proposes a new, vertiginous approach to it by analyzing the following works by the Portuguese writer Maria Teresa Horta (1937-): Poesia completa II, more specifically Educação sentimental, and Novas cartas portuguesas. In these transgressive and performative works Horta proposes a new education through a renewed language: a sentimental education spawned from the erotic element. Starting from the deconstruction of the view of the spoiled and exposed body in Novas cartas portuguesas, and from the poetic texts in Educação sentimental as well as the remaining ones in Poesia completa II, Horta disassembles and reassembles the body, giving new meaning to the symbols that surround us and to our experiences. Besides proposing to all, men and women, this new meaning for current models of behavior and practice, Horta's education allows, through performative action, the construction of a stage for a female identity free from phallic influence: a new identity able to embrace all the holy and profane characteristics of women, discarding the chromatic lens of sin. Horta's poetry emerges as a new proposal of literary labor.

Relevance:

10.00%

Publisher:

Abstract:

This work aims to detect polycyclic aromatic hydrocarbons (PAHs) through optimized analytical techniques such as gas chromatography with flame-ionization detection (GC-FID), gas chromatography coupled to mass spectrometry (GC-MS), molecular fluorescence spectroscopy, and the determination of oils and greases (POG). Chemometrics, in the form of a 2³ factorial design, was applied to sample preparation by liquid-liquid extraction. The factors studied in the 2³ design were the use of ultrasound, the solvent (dichloromethane, hexane or chloroform), and the solvent-to-synthetic-sample ratio; each factor was assigned two levels, positive and negative, and a cube representation was used to better analyze the responses. The responses of the eight combinations were obtained by spectrofluorimetric readings. The optimized equipment was used to identify the PAHs in samples collected in the Potengi River: all 16 PAHs were observed, and the PAHs found in the samples came from contamination of the river. The contamination comes from organic household waste, hospital waste, and other discharges from the industries installed around the river. The factorial design proved highly valid and led to a more effective sample preparation: the liquid-liquid extraction design showed how to spend less solvent in less time using an ideal solvent, and also how to extract more analyte from a matrix that is itself water. In the design, the factor with the smallest effect on extraction was the use of ultrasound; the best ratio was 1:3 (solvent to sample); and the best solvent was dichloromethane, which presented a viable extraction, without discarding the possibility of also using hexane. Chloroform, besides being potentially toxic, did not give a good extraction.
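
The same contrast arithmetic extends to the 2³ design used here: the sketch below enumerates the eight coded runs of the cube and estimates the three main effects (ultrasound, solvent, solvent-to-sample ratio). Run ordering and the response values are placeholders, not the study's data.

```python
import numpy as np
from itertools import product

# 2^3 factorial design matrix for the three extraction factors named in
# the abstract. The responses below are PLACEHOLDERS, not the study's data.
factors = ["ultrasound", "solvent", "ratio"]
levels = list(product([-1, +1], repeat=3))               # 8 corner runs of the cube
y = np.array([52., 55., 70., 74., 58., 60., 78., 83.])   # placeholder responses

X = np.array(levels, dtype=float)
effects = 2 * X.T @ y / len(y)   # mean at +1 minus mean at -1, per factor
for name, e in zip(factors, effects):
    print(f"{name:10s} main effect: {e:+.1f}")
```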

Relevance:

10.00%

Publisher:

Abstract:

Multi-objective problems may have many optimal solutions, which together form the Pareto-optimal set. A class of heuristic algorithms for those problems, in this work called optimizers, produces approximations of this optimal set. The approximation set kept by the optimizer may be bounded or unbounded. The benefit of an unbounded archive is the guarantee that all nondominated solutions generated in the process will be saved; however, due to the large number of solutions that can be generated, keeping such an archive and frequently comparing new solutions with the stored ones may demand a high computational cost. The alternative is a bounded archive, which raises the problem of having to discard nondominated solutions when the archive is full. Some techniques have been proposed to handle this problem, but investigations show that none of them can reliably prevent deterioration of the archive. This work investigates a technique to be used together with the ideas previously proposed in the literature for dealing with bounded archives. The technique consists in keeping discarded solutions in a secondary archive and periodically recycling them, bringing them back into the optimization. Three recycling methods are presented. To verify whether these ideas can improve the archive content during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D and NSGA-III algorithms, applied to many classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated through statistical tests.
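
The mechanism the abstract describes, a bounded nondominated archive backed by a secondary archive that stores discarded solutions for later recycling, can be sketched as follows. The dominance test assumes minimization, and the random discard and recycling policies are illustrative assumptions; the thesis's three recycling methods are not reproduced here.

```python
import random

def dominates(a, b):
    # a dominates b (minimization): no worse in every objective, better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class RecyclingArchive:
    def __init__(self, capacity):
        self.capacity = capacity
        self.primary, self.secondary = [], []

    def add(self, sol):
        if any(dominates(p, sol) for p in self.primary):
            return  # dominated by a stored solution: reject
        # Move newly dominated members to the secondary archive.
        self.secondary.extend(p for p in self.primary if dominates(sol, p))
        self.primary = [p for p in self.primary if not dominates(sol, p)]
        self.primary.append(sol)
        if len(self.primary) > self.capacity:
            # Bounded archive is full: discard one member (here, at random)
            # into the secondary archive instead of losing it.
            victim = random.randrange(len(self.primary))
            self.secondary.append(self.primary.pop(victim))

    def recycle(self, k=1):
        # Periodically reinject up to k stored solutions into the search.
        return [self.secondary.pop(random.randrange(len(self.secondary)))
                for _ in range(min(k, len(self.secondary)))]

archive = RecyclingArchive(capacity=3)
for sol in [(1, 5), (2, 4), (3, 3), (4, 2), (5, 1), (0, 6)]:
    archive.add(sol)
print("primary:", archive.primary, "secondary:", archive.secondary)
```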
