35 results for Reuso


Relevance:

10.00%

Publisher:

Abstract:

The technology of anaerobic reactors for sanitary wastewater treatment has been extensively developed in Brazil and is now practically consolidated. Because they offer several advantages, such as low construction and operating costs and low sludge production, anaerobic reactors are an attractive alternative for mitigating the lack of basic sanitation in urban and rural areas. Anaerobic filters, in particular, have been widely used in Brazil. They produce an effluent with low concentrations of organic matter and suspended solids while conserving nutrients, which makes the effluent suitable for irrigation, provided the practice is supported by knowledge of the pathogens present. The main objective of this study was to evaluate the efficiency of anaerobic filters in removing faecal coliforms and helminth eggs, and to verify whether the effluent can be used for agricultural purposes according to the World Health Organization guidelines (WHO, 1989). Helminth eggs were enumerated with the modified Bailenger method (Ayres and Mara, 1996), recommended by the WHO for the evaluation of raw and treated effluent, and faecal coliform concentrations were determined by membrane filtration. Three different sewage treatment systems based on anaerobic filters were analyzed. Overall, all systems achieved helminth egg removal greater than 93%, producing effluents with an average of less than 1 egg/L. One of them, Sistema RN, reached removal greater than 99%, confirming the good performance of anaerobic filters in removing helminth eggs. Even with low egg concentrations in the influent, the filters removed this parameter efficiently. Regarding faecal coliforms, all systems produced effluents with about 10^6 CFU/100 mL. These high faecal coliform concentrations allow reuse only for restricted irrigation, in accordance with the WHO guidelines. Although the systems did not remove faecal coliforms efficiently, the results indicate that anaerobic filters remove helminth eggs effectively
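For reference, below is a minimal Java sketch of the two checks behind these figures: removal efficiency and the WHO (1989) reuse thresholds. The Category A limits used in the sketch (at most 1 helminth egg/L and 1000 faecal coliforms/100 mL for unrestricted irrigation) are the commonly cited values and are an assumption here, not taken from the study.

```java
// Minimal sketch: removal efficiency and a check against assumed WHO (1989)
// Category A limits for unrestricted irrigation. Confirm the numeric limits
// against the original guideline before relying on them.
public class ReuseCheck {

    /** Percent removal between influent and effluent concentrations. */
    static double removalPercent(double influent, double effluent) {
        return 100.0 * (influent - effluent) / influent;
    }

    /** Assumed WHO 1989 limits for unrestricted (Category A) irrigation. */
    static boolean suitableForUnrestrictedIrrigation(double eggsPerLitre,
                                                     double fcPer100mL) {
        return eggsPerLitre <= 1.0 && fcPer100mL <= 1_000.0;
    }

    public static void main(String[] args) {
        // Hypothetical values consistent with the abstract: <1 egg/L after
        // >93% removal, but ~10^6 CFU/100 mL of faecal coliforms.
        double eggsIn = 12.0, eggsOut = 0.8;   // eggs/L
        double fcOut = 1.0e6;                  // CFU/100 mL

        System.out.printf("Helminth egg removal: %.1f%%%n",
                removalPercent(eggsIn, eggsOut));
        System.out.println("Unrestricted irrigation allowed: "
                + suitableForUnrestrictedIrrigation(eggsOut, fcOut));
        // Prints ~93.3% and 'false' -> only restricted irrigation, as in the study.
    }
}
```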

Relevance:

10.00%

Publisher:

Abstract:

Polyester fibers are the most widely used fibers in the world, and disperse dyes are used to dye them. After dyeing, the colored dyebath is discharged into effluent streams and requires special treatment for color removal. The interaction of surfactants with dyes has been evaluated in several studies, including in the textile field, specifically for separating dyes from textile wastewater. In this work, a cationic surfactant was used in a microemulsion system to extract anionic dyes (disperse dyes) from textile wastewater. The microemulsion system was composed of dodecylammonium chloride (surfactant), kerosene (organic phase), isoamyl alcohol (cosurfactant) and the wastewater (aqueous phase). The wastewater resulting from the dyeing process is acidic (pH 5). It was observed that raising the pH above 12.8 enabled the extraction, yielding an aqueous phase with a low color level. A Scheffé network experimental design was used to optimize the extraction process, and the results were evaluated with the Statistica 7.0 software. The optimal microemulsion system was composed of 59.8 wt.% wastewater, 30.1 wt.% kerosene, 3.37 wt.% surfactant and 6.73 wt.% cosurfactant, providing extraction above 96%. A mixture of reactive dyebath (50%) and disperse dyebath (50%) was also used as the aqueous phase and yielded extraction above 98%. The aqueous phase obtained after the extraction process can be reused in a new dyeing, with satisfactory results according to the limits established by the textile industry for a good dyeing. Tests were carried out to study the influence of salt addition and temperature; an experimental design applied for this purpose showed that the extraction does not depend on these factors. Thus, color removal from textile wastewater by microemulsion is a viable technique that does not depend on external factors such as salinity and temperature, providing good extraction results even with wastewater mixtures
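A minimal sketch of the extraction-efficiency calculation underlying the percentages above, assuming dye concentration is proportional to the measured color (absorbance) of the aqueous phase; the readings used are hypothetical.

```java
// Minimal sketch: extraction efficiency estimated from absorbance of the
// aqueous phase before and after contact with the microemulsion, assuming
// concentration is proportional to absorbance (Beer-Lambert).
public class DyeExtraction {

    /** Extraction efficiency (%) from initial/final dye absorbance. */
    static double extractionPercent(double absorbanceBefore, double absorbanceAfter) {
        return 100.0 * (absorbanceBefore - absorbanceAfter) / absorbanceBefore;
    }

    public static void main(String[] args) {
        // Hypothetical readings; an efficiency above 96% would match the
        // optimal composition reported in the abstract.
        double before = 1.25, after = 0.04;
        System.out.printf("Extraction: %.1f%%%n", extractionPercent(before, after));
    }
}
```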

Relevance:

10.00%

Publisher:

Abstract:

Enzymatic synthesis of peptides using proteases has attracted a great deal of attention in recent years. One key challenge in peptide synthesis is finding supports for protease immobilization that work in aqueous medium at high performance to produce water-soluble oligopeptides. At present, few reports describe this strategy. Therefore, the aim of this thesis was to immobilize proteases using different methods (covalent binding, entrapment in polymeric PVA gels, and immobilization on glycidyl methacrylate magnetic nanoparticles) in order to produce water-soluble oligopeptides derived from lysine. Three proteases were used: trypsin, α-chymotrypsin and bromelain. Considering the immobilization strategies combined with the type of protease, the trypsin-resin systems showed the best performance in terms of hydrolytic activity and oligopeptide synthesis. Hydrolytic activities of the free and immobilized enzymes were determined spectrophotometrically from the absorbance change at 660 nm at 25 °C (casein method). Oligolysine yield and average degree of polymerization (DPavg) were monitored by 1H-NMR analysis. Trypsin was covalently immobilized onto four different resins (Amberzyme, Eupergit C, Eupergit CM and Grace 192). Maximum yields of bound protein were 92 mg/g, 82 mg/g and 60 mg/g of support, respectively. The effectiveness of these trypsin-resin systems was evaluated by casein hydrolysis and by the synthesis of water-soluble oligolysine. Most systems were able to catalyze oligopeptide synthesis in aqueous medium, albeit at different efficiencies: 40, 37 and 35% for Amberzyme, Eupergit C and Eupergit CM, respectively, compared with the free enzyme. These systems produced oligomers in only 1 hour, with DPavg higher than that of the free enzyme. Among them, the Eupergit C-trypsin system showed the greatest hydrolytic activity and thermal stability; however, this did not hold for oligolysine synthesis. Trypsin-Amberzyme proved more successful in oligopeptide synthesis and exhibited excellent reusability, retaining 90% of its initial hydrolytic and synthetic activity after 7 reuses. Hydrophobic interactions between trypsin and the Amberzyme support protect the enzyme against strong conformational changes in the medium. In addition, the high concentration of oxirane groups on the support surface promoted multi-covalent attachment and, consequently, prevented leaching of the immobilized enzyme. These results suggest that trypsin immobilized on the evaluated supports can be efficiently used for oligopeptide synthesis in aqueous media

Relevance:

10.00%

Publisher:

Abstract:

This work presents a proposal for a multi-middleware environment for developing distributed applications that abstracts the underlying middleware platforms. It describes: (i) the reference architecture designed for the environment; (ii) an implementation that validates the specified architecture by integrating CORBA and EJB; (iii) a case study illustrating the use of the environment; and (iv) a performance analysis. The proposed environment provides interoperability across middleware platforms, allowing the reuse of components from different kinds of middleware in a way that is transparent to the developer and without major performance losses. As part of the implementation, we also developed an Eclipse plugin that helps developers achieve greater productivity when building distributed applications with the proposed environment
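As an illustration only (the dissertation's actual API is not reproduced here), the sketch below shows one way such an environment could hide CORBA and EJB behind a common facade; the names RemoteService, CorbaAdapter, EjbAdapter and MiddlewareRegistry are hypothetical.

```java
// Illustrative sketch: a common facade over platform-specific middleware
// adapters, so client code never touches CORBA or EJB APIs directly.
import java.util.HashMap;
import java.util.Map;

interface RemoteService {
    Object invoke(String operation, Object... args);
}

// Each adapter would wrap the platform-specific lookup and invocation code.
class CorbaAdapter implements RemoteService {
    public Object invoke(String operation, Object... args) {
        // Here: resolve the object via the ORB and call it through its stub.
        return "CORBA result of " + operation;
    }
}

class EjbAdapter implements RemoteService {
    public Object invoke(String operation, Object... args) {
        // Here: look up the bean via JNDI and delegate to its remote interface.
        return "EJB result of " + operation;
    }
}

// The environment registers adapters so callers only name the service.
class MiddlewareRegistry {
    private final Map<String, RemoteService> services = new HashMap<>();

    void register(String name, RemoteService service) { services.put(name, service); }

    Object call(String name, String operation, Object... args) {
        return services.get(name).invoke(operation, args);
    }
}
```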

Relevance:

10.00%

Publisher:

Abstract:

The increasing complexity of applications has demanded hardware that is ever more flexible and able to achieve higher performance. Traditional hardware solutions have not been successful in meeting these constraints. General-purpose processors are inherently flexible, since they can perform many different tasks, but they cannot reach the performance of application-specific devices; conversely, application-specific devices achieve high performance because they perform only a few tasks, at the cost of flexibility. Reconfigurable architectures emerged as an alternative to these traditional approaches and have become an area of growing interest over the last decades. The purpose of this paradigm is to modify the device's behavior according to the application, making it possible to balance flexibility and performance and to meet application constraints. This work presents the design and implementation of a coarse-grained hybrid reconfigurable architecture for stream-based applications. The architecture, named RoSA, consists of reconfigurable logic attached to a processor. Its goal is to exploit the instruction-level parallelism of data-flow-intensive applications in order to accelerate their execution on the reconfigurable logic. Instruction-level parallelism is extracted at compile time, so this work also presents an optimization phase for the RoSA architecture to be included in the GCC compiler. To design the architecture, this work also proposes a methodology based on hardware reuse of datapaths, named RoSE. RoSE views the reconfigurable units through reusability levels, which saves area and simplifies the datapath. The architecture was implemented in a hardware description language (VHDL) and validated through simulation and prototyping. Performance was characterized with a set of benchmarks, which showed a speedup of 11x for some applications
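As a rough illustration of compile-time instruction-level parallelism extraction (not RoSA's actual GCC optimization phase), the sketch below groups the operations of a small data-flow graph into levels of mutually independent operations that a reconfigurable array could execute side by side.

```java
// Generic illustration: topological levels of a data-flow graph. Operations
// on the same level have no dependencies on each other and can run in
// parallel on a reconfigurable fabric. The graph is hypothetical.
import java.util.*;

public class LevelScheduler {

    /** op -> list of ops it depends on. */
    static Map<Integer, List<Integer>> deps = new HashMap<>();

    static int level(int op, Map<Integer, Integer> memo) {
        if (memo.containsKey(op)) return memo.get(op);
        int lvl = 0;
        for (int d : deps.getOrDefault(op, List.of())) {
            lvl = Math.max(lvl, level(d, memo) + 1);
        }
        memo.put(op, lvl);
        return lvl;
    }

    public static void main(String[] args) {
        // Ops 0 and 1 are independent, op 2 uses both, op 3 uses op 2.
        deps.put(2, List.of(0, 1));
        deps.put(3, List.of(2));

        Map<Integer, Integer> memo = new HashMap<>();
        Map<Integer, List<Integer>> levels = new TreeMap<>();
        for (int op = 0; op < 4; op++) {
            levels.computeIfAbsent(level(op, memo), k -> new ArrayList<>()).add(op);
        }
        System.out.println(levels); // {0=[0, 1], 1=[2], 2=[3]}
    }
}
```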

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico

Relevance:

10.00%

Publisher:

Abstract:

Nowadays, many electronic devices support digital video; examples include cell phones, digital cameras, camcorders and digital televisions. However, raw video, represented exactly as captured, comprises a huge amount of data, millions of bits, so storing it in this primary form would require enormous disk space and transmitting it would require enormous bandwidth. Video compression therefore becomes essential to make storage and transmission feasible. Motion estimation is a technique used in video coders that exploits the temporal redundancy present in video sequences to reduce the amount of data needed to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution video according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented here was developed to provide a high degree of data reuse; the adopted data reuse scheme reduces the bandwidth required to execute motion estimation. Since motion estimation accounts for the largest share of the gains obtained with H.264/AVC, this module is essential for the final coder's performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System
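The sketch below illustrates the core of motion estimation: full-search block matching that minimizes the sum of absolute differences (SAD). The actual H.264/AVC module uses far more elaborate search and data-reuse schemes, so this is only a conceptual reference.

```java
// Minimal sketch of full-search block matching over a search window,
// returning the motion vector that minimizes the SAD between a block of the
// current frame and candidate blocks of the reference frame.
public class BlockMatcher {

    static int sad(int[][] cur, int[][] ref, int bx, int by, int dx, int dy, int n) {
        int sum = 0;
        for (int y = 0; y < n; y++)
            for (int x = 0; x < n; x++)
                sum += Math.abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx]);
        return sum;
    }

    /** Full search in a +/-range window; returns the best (dx, dy) vector. */
    static int[] bestVector(int[][] cur, int[][] ref, int bx, int by, int n, int range) {
        int bestSad = Integer.MAX_VALUE;
        int[] best = {0, 0};
        for (int dy = -range; dy <= range; dy++) {
            for (int dx = -range; dx <= range; dx++) {
                int yy = by + dy, xx = bx + dx;
                if (yy < 0 || xx < 0 || yy + n > ref.length || xx + n > ref[0].length)
                    continue; // candidate block would fall outside the frame
                int s = sad(cur, ref, bx, by, dx, dy, n);
                if (s < bestSad) { bestSad = s; best = new int[]{dx, dy}; }
            }
        }
        return best;
    }
}
```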

Relevance:

10.00%

Publisher:

Abstract:

This dissertation presents a model-driven and integrated approach to the variability management, customization and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering supports the specification of software processes and their transformation into workflow specifications, while software product line techniques allow the automatic variability management of process elements and fragments. In addition, workflow technologies enable process execution in workflow engines. To evaluate the feasibility of the approach, we implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes was implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are used to transform EPF processes into jPDL workflow specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP)
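A generic sketch of the two automated steps described above: variability resolution followed by a model-to-text emission of a workflow description. EPF, ATL/Acceleo and jPDL syntax are not reproduced, and the element names below are placeholders.

```java
// Generic sketch: (1) resolve process variability from a feature selection,
// (2) emit a workflow-like description from the resolved process.
import java.util.*;

public class ProcessDerivation {

    record Activity(String name, String requiredFeature) {} // null = mandatory

    /** Step 1: keep mandatory activities and those whose feature was selected. */
    static List<Activity> resolve(List<Activity> process, Set<String> features) {
        return process.stream()
                .filter(a -> a.requiredFeature() == null
                          || features.contains(a.requiredFeature()))
                .toList();
    }

    /** Step 2: naive model-to-text step producing a workflow-like listing. */
    static String toWorkflow(List<Activity> resolved) {
        StringBuilder sb = new StringBuilder("<process>\n");
        for (Activity a : resolved)
            sb.append("  <task name=\"").append(a.name()).append("\"/>\n");
        return sb.append("</process>").toString();
    }

    public static void main(String[] args) {
        List<Activity> openUpPm = List.of(
                new Activity("Plan Iteration", null),
                new Activity("Assess Results", null),
                new Activity("Manage Risks", "risk-management")); // optional
        System.out.println(toWorkflow(resolve(openUpPm, Set.of("risk-management"))));
    }
}
```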

Relevance:

10.00%

Publisher:

Abstract:

Adopting the software product line (SPL) approach brings several benefits compared with conventional development processes based on creating a single software system at a time. Developing an SPL differs from traditional software construction in that it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. Testing is also fundamental and aims to detect defects in the artifacts produced during SPL development; however, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they remain limited and provide only general guidelines. There is also a lack of tools to support variability management and the customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines supporting the implementation and reuse of automated test cases at the unit, integration and system levels in domain and application engineering; and (iii) tool support for automating variability management and test case customization. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges that SPL characteristics impose on the testing process
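The sketch below illustrates guideline (ii) in JUnit style, with hypothetical names: a test written once in domain engineering against a variation point and reused in application engineering by binding a product-specific variant.

```java
// Illustrative sketch: a domain-engineering test reused per product.
import static org.junit.jupiter.api.Assertions.assertTrue;
import org.junit.jupiter.api.Test;

// Variation point shared by every product of the (hypothetical) SPL.
interface Authenticator {
    boolean login(String user, String password);
}

// Domain engineering: the test is abstract and knows only the interface.
abstract class AuthenticatorDomainTest {
    protected abstract Authenticator createVariant();

    @Test
    void acceptsValidCredentials() {
        assertTrue(createVariant().login("admin", "secret"));
    }
}

// Application engineering: each derived product only binds its own variant.
class LdapProductTest extends AuthenticatorDomainTest {
    @Override
    protected Authenticator createVariant() {
        return (user, pass) -> "admin".equals(user) && "secret".equals(pass); // stub
    }
}
```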

Relevance:

10.00%

Publisher:

Abstract:

With the increase in processing power and storage, and the variety of communication technologies available (Bluetooth, infrared, wireless networks, etc.), mobile devices are no longer single-purpose devices and have become tools with multiple functionalities. In the business field, the benefits these devices can offer are considerable, because portability allows tasks that previously could only be performed within the work environment to be performed anywhere. In the context of oil exploration companies, mobile applications allow petroleum engineers and technicians to act quickly, using their mobile devices to avoid potential disasters such as an unexpected stop or the breakdown of important equipment. In general, oil extraction equipment is configured in the work environment using desktop systems; after the configuration is obtained, an employee goes to the equipment and applies the modifications produced on the desktop system. This management process is time-consuming and does not guarantee maintenance in time to avoid problems. With mobile devices, management and maintenance of oil extraction equipment can be performed more quickly, since the engineer or technician can perform the configuration at the time and place where the request arises, for example, near the oil well where the equipment is located. The wide variety of mobile devices, however, makes developing mobile applications difficult: for an application to run on several types of device it must be adapted to each one, which makes development quite costly. This work defines and implements a software product line for designing sucker-rod pumping systems on mobile devices. This software product line, called BMMobile, aims to produce products capable of performing the calculations that determine the possible equipment configurations in sucker-rod pumping design, while managing the variabilities of the various products that can be generated. In addition, two evaluations are performed: the first verifies the consistency of the products produced by the software product line, and the second verifies the reuse of some products generated by the developed SPL
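As a hedged illustration of the kind of calculation such products perform, the sketch below uses the classic textbook expression for theoretical pump displacement, PD = 0.1166 · D² · S · N (D = plunger diameter in inches, S = effective stroke in inches, N = strokes per minute, result in barrels per day); BMMobile's actual models and units are not reproduced here.

```java
// Hedged illustration: theoretical pump displacement of a sucker-rod unit,
// using the standard textbook constant 0.1166 (bbl/day from inch/spm inputs).
public class SuckerRodSizing {

    /** Theoretical pump displacement in barrels per day. */
    static double pumpDisplacementBpd(double plungerDiameterIn,
                                      double strokeIn,
                                      double strokesPerMinute) {
        return 0.1166 * plungerDiameterIn * plungerDiameterIn
                * strokeIn * strokesPerMinute;
    }

    public static void main(String[] args) {
        // Hypothetical configuration: 1.5 in plunger, 64 in stroke, 8 spm.
        double pd = pumpDisplacementBpd(1.5, 64, 8);
        System.out.printf("Theoretical displacement: %.1f bbl/day%n", pd);
        // ~134 bbl/day; an SPL product would compare candidate configurations
        // like this against the target production rate.
    }
}
```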

Relevance:

10.00%

Publisher:

Abstract:

The increasing complexity of integrated circuits has boosted the development of communication architectures such as Networks-on-Chip (NoCs) as an architectural alternative for interconnecting Systems-on-Chip (SoCs). Networks-on-Chip favor component reuse, parallelism and scalability, enhancing reusability in projects for dedicated applications. Many proposals in the literature suggest different configurations for network-on-chip architectures. Among them, the IPNoSys architecture is unconventional, since it executes operations while the communication process is performed. This study evaluates the execution of data-flow-based applications on IPNoSys, focusing on how they can be adapted to its design constraints. Data-flow-based applications are characterized by a continuous stream of data on which operations are executed; we expect such applications to benefit from running on IPNoSys, because their programming model is similar to the network's execution model. By observing the behavior of these applications on IPNoSys, changes were made to the network's execution model, enabling a form of instruction-level parallelism. To this end, implementations of data-flow applications were analyzed and compared
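Purely as a conceptual illustration (the real IPNoSys packet format is not reproduced), the sketch below shows the idea of executing while communicating: a packet carries pending operations, and each node it traverses consumes one of them before forwarding.

```java
// Conceptual illustration only: a packet carries pending operations and a
// partial result; each hop executes one operation, so computation happens
// during communication.
import java.util.ArrayDeque;
import java.util.Deque;

public class ComputeWhileRouting {

    record Op(char kind, double operand) {} // '+', '*', ...

    static class Packet {
        double value;
        final Deque<Op> pending = new ArrayDeque<>();
        Packet(double initial) { value = initial; }
    }

    /** One hop: the node consumes a single pending operation, then forwards. */
    static void hop(Packet p) {
        Op op = p.pending.poll();
        if (op == null) return;                   // nothing left: pure routing
        p.value = switch (op.kind()) {
            case '+' -> p.value + op.operand();
            case '*' -> p.value * op.operand();
            default  -> p.value;
        };
    }

    public static void main(String[] args) {
        Packet p = new Packet(2.0);
        p.pending.add(new Op('+', 3.0));          // executed at the first node
        p.pending.add(new Op('*', 4.0));          // executed at the next node
        hop(p); hop(p);
        System.out.println(p.value);              // (2 + 3) * 4 = 20.0
    }
}
```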

Relevance:

10.00%

Publisher:

Abstract:

The target domain of this work is distributed collaborative systems, where the focus is on the exchange of messages among remotely distributed users. In these systems, messages must support multimedia content and must be deliverable both to a specific user and to one or more groups of users. The goal of this work is to develop a framework that eases the construction of this kind of system and reduces development time through reuse. This work presents the N2N Framework, a platform for the development of distributed collaborative systems. The framework was conceived by analyzing the behavior of applications with collaborative multimedia characteristics, such as multi-user virtual environments, chats, polls, and virtual fan groups. The framework was implemented on the Java platform. The N2N Framework eases the design and implementation of distributed collaborative systems by handling message delivery, allowing application developers to concentrate on implementing their specific messages and the processing they require
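The sketch below illustrates the delivery abstraction described above, addressing a message with an arbitrary payload to a single user or to a group; the class names are illustrative and do not come from the N2N Framework API.

```java
// Illustrative sketch: message delivery to one user or to every member of a
// group, with the fan-out hidden from the application developer.
import java.util.*;
import java.util.function.Consumer;

public class MessageBus {

    record Message(String from, Object payload) {}

    private final Map<String, Consumer<Message>> users = new HashMap<>();
    private final Map<String, Set<String>> groups = new HashMap<>();

    public void registerUser(String id, Consumer<Message> inbox) { users.put(id, inbox); }

    public void joinGroup(String group, String userId) {
        groups.computeIfAbsent(group, g -> new HashSet<>()).add(userId);
    }

    /** Delivery to one specific user. */
    public void sendToUser(String userId, Message m) {
        Consumer<Message> inbox = users.get(userId);
        if (inbox != null) inbox.accept(m);
    }

    /** Delivery to every member of a group. */
    public void sendToGroup(String group, Message m) {
        for (String member : groups.getOrDefault(group, Set.of())) sendToUser(member, m);
    }
}
```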

Relevance:

10.00%

Publisher:

Abstract:

The main goal of regression testing (RT) is to reuse the test suite of the previous version of a software system in its current version, in order to maximize the value of the tests already developed and to ensure that old features continue to work after new changes. Even with reuse, it is common that not all tests need to be executed again; for this reason, Regression Test Selection (RTS) techniques are encouraged. They aim to select, from the full suite, only the tests that reveal faults, which reduces costs and makes RTS an attractive practice for testing teams. Several recent studies evaluate the quality of the selections performed by RTS techniques, identifying which one presents the best results as measured by metrics such as inclusion and precision. Ideally, RTS techniques would search the System Under Test (SUT) for tests that reveal faults; since this problem has no viable general solution, they instead look for tests that traverse changes, where faults may occur. However, these changes may modify the execution flow of the program itself, so that some tests no longer exercise the same code. In this context, this dissertation investigates whether changes made to an SUT affect the quality of the test selection performed by an RTS technique and, if so, which characteristics of the changes cause errors that lead the technique to include or exclude tests wrongly. For this purpose, a tool was developed in Java to automate the measurement of the average inclusion and precision achieved by a regression test selection technique for a particular kind of change. To validate the tool, an empirical study was conducted to evaluate the RTS technique Pythia, based on textual differencing, on a large web information system, analyzing the types of tasks performed to evolve the SUT
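For reference, the sketch below computes the two metrics in the formulation commonly attributed to Rothermel and Harrold: inclusion as the percentage of modification-revealing tests that were selected, and precision as the percentage of non-modification-revealing tests that were omitted. The dissertation's tool may define them slightly differently.

```java
// Illustrative sketch of inclusion and precision for a regression test
// selection, given the full suite, the selected tests and the tests known to
// be modification-revealing.
import java.util.HashSet;
import java.util.Set;

public class RtsMetrics {

    static double inclusion(Set<String> selected, Set<String> revealing) {
        if (revealing.isEmpty()) return 100.0;
        Set<String> hit = new HashSet<>(revealing);
        hit.retainAll(selected);
        return 100.0 * hit.size() / revealing.size();
    }

    static double precision(Set<String> allTests, Set<String> selected, Set<String> revealing) {
        Set<String> nonRevealing = new HashSet<>(allTests);
        nonRevealing.removeAll(revealing);
        if (nonRevealing.isEmpty()) return 100.0;
        Set<String> omitted = new HashSet<>(nonRevealing);
        omitted.removeAll(selected);
        return 100.0 * omitted.size() / nonRevealing.size();
    }

    public static void main(String[] args) {
        Set<String> all = Set.of("t1", "t2", "t3", "t4");
        Set<String> revealing = Set.of("t1", "t2");           // traverse the change
        Set<String> selected = Set.of("t1", "t3");            // what the RTS picked
        System.out.println(inclusion(selected, revealing));   // 50.0
        System.out.println(precision(all, selected, revealing)); // 50.0
    }
}
```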

Relevance:

10.00%

Publisher:

Abstract:

The search for sustainable solutions through proper environmental management of the available natural resources, meeting the aspirations of preserving both the environment and human well-being and allowing environmental and social problems to be diagnosed and solved with the smallest possible impact on nature and on people, is the great challenge for this generation and for future ones. The study of the environmental problems of water, and of the participation and environmental understanding of the social actors as a whole, belongs to the international environmental agenda and reflects the strategic need for proper management of this natural resource, through a program aimed at diagnosing the problems and seeking compatible, sustainable solutions within a social and environmental policy of planning and environmental education, centered above all on the voice of the citizen who uses the system. This thesis studies the problem of sustainable water management, focusing on the participation and environmental understanding of citizens in the use of this natural resource for urban residential activities. It addresses and analyzes variables that measure general knowledge, sense of community, access to means of information, and environmental attitudes and behaviors, in addition to socio-demographic variables used to characterize the respondents of an exploratory survey. The survey used stratified random sampling, the strata being each of the four Political-Administrative Regions of the city of Natal, with data collected from February to April 2002. The methodology consists of questionnaires with Likert-type scales to measure the study variables, in addition to a socio-demographic scale to characterize the sample. For the analysis of the results, an exploratory descriptive study was first carried out, followed by multivariate statistical techniques such as factor analysis via principal components and multiple linear regression. To complement the study, Pearson's chi-square tests of independence were performed to verify the associations between the socio-demographic variables and the main variables present in the factors resulting from the factor analysis. The results point to a low level of environmental knowledge, access to information and sense of community, and indicate that the main resulting factors call for greater emphasis, in management programs and actions, on environmental understanding, on behaviors and attitudes addressing information and environmental education, and on the reuse of water
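As a small illustration of the Pearson chi-square test of independence mentioned above, the sketch below computes the statistic for a hypothetical 2x2 contingency table; the counts are invented and only the statistic (not the p-value) is computed.

```java
// Illustrative sketch: Pearson chi-square statistic for a contingency table.
// The p-value would come from the chi-square distribution with
// (rows - 1) * (cols - 1) degrees of freedom.
public class ChiSquareIndependence {

    static double chiSquare(double[][] observed) {
        int rows = observed.length, cols = observed[0].length;
        double total = 0;
        double[] rowSum = new double[rows], colSum = new double[cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++) {
                rowSum[i] += observed[i][j];
                colSum[j] += observed[i][j];
                total += observed[i][j];
            }
        double chi2 = 0;
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++) {
                double expected = rowSum[i] * colSum[j] / total;
                chi2 += Math.pow(observed[i][j] - expected, 2) / expected;
            }
        return chi2;
    }

    public static void main(String[] args) {
        double[][] table = {{30, 20}, {10, 40}}; // hypothetical counts
        System.out.printf("chi-square = %.2f (df = 1)%n", chiSquare(table));
    }
}
```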

Relevance:

10.00%

Publisher:

Abstract:

Reverberation is caused by the reflection of sound off surfaces near the sound source as it propagates to the listener. The impulse response of an environment represents its reverberation characteristics. Being environment-dependent, reverberation conveys to the listener the characteristics of the space where the sound originates, and its absence usually does not sound "natural". When recording, it is not always possible to obtain the desired reverberation characteristics of an environment, so methods for artificial reverberation have been developed, always seeking implementations that are more efficient and more faithful to real environments. This work presents an FPGA (Field Programmable Gate Array) implementation of a classic digital audio reverberation structure, based on Manfred Schroeder's proposal, using sets of all-pass and comb filters. The developed system explores the use of reconfigurable hardware as a platform for developing and implementing digital audio effects, focusing on modularity and reuse
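The sketch below shows the two Schroeder building blocks in software (Java rather than the VHDL of the actual FPGA design): a feedback comb filter and an all-pass filter. Delay lengths and gains are illustrative; Schroeder-style designs typically combine several parallel combs followed by all-pass stages in series.

```java
// Minimal software sketch of the Schroeder reverberator building blocks.
public class SchroederBlocks {

    /** Feedback comb filter: y[n] = x[n] + g * y[n - d]. */
    static class CombFilter {
        private final double[] delay;   // stores past outputs
        private final double gain;
        private int idx;

        CombFilter(int delaySamples, double gain) {
            this.delay = new double[delaySamples];
            this.gain = gain;
        }

        double process(double x) {
            double y = x + gain * delay[idx]; // delay[idx] holds y[n - d]
            delay[idx] = y;
            idx = (idx + 1) % delay.length;
            return y;
        }
    }

    /** All-pass filter: y[n] = -g * x[n] + x[n - d] + g * y[n - d]. */
    static class AllPassFilter {
        private final double[] inDelay, outDelay; // past inputs and outputs
        private final double gain;
        private int idx;

        AllPassFilter(int delaySamples, double gain) {
            this.inDelay = new double[delaySamples];
            this.outDelay = new double[delaySamples];
            this.gain = gain;
        }

        double process(double x) {
            double y = -gain * x + inDelay[idx] + gain * outDelay[idx];
            inDelay[idx] = x;
            outDelay[idx] = y;
            idx = (idx + 1) % inDelay.length;
            return y;
        }
    }
}
```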