838 results for "Pellets reuse"
Abstract:
Perovskite-type ceramic materials have the general formula ABO3, where A is a rare-earth or alkaline-earth element and B is a transition metal. These materials are strong candidates for the cathode of Solid Oxide Fuel Cells (SOFC), because they are thermally stable at elevated temperatures and exhibit interesting chemical and physical properties, such as superconductivity, dielectric behavior, magnetoresistance, piezoelectricity, catalytic activity, and electrocatalytic and optical properties. In this work, SOFC cathodes with the perovskite structure La1-xSrxMnO3 (x = 0.15, 0.22, 0.30) and an electrolyte of yttria-stabilized zirconia were synthesized by the Pechini method. The resulting resins were heat treated at 300 ºC for 2 h, and the precursors obtained were characterized by thermal analysis (DTA and TG/DTG). The precursor powders were calcined at temperatures from 450 to 1350 ºC and analyzed by XRD, FTIR, laser granulometry, XRF, BET surface area measurement, and SEM. Pellets were sintered from the powders for the study of bulk density and thermal expansion.
Abstract:
Different NixZn1-xFe2O4 (0.4 ≤ x ≤ 0.6) ferrite compositions were synthesized by the citrate precursor method. Initially, the citrate precursors of iron, nickel, and zinc were mixed and homogenized. The stoichiometric compositions were calcined at 350 °C without atmosphere control, and the calcined powders were pressed into pellets and toroids. The pressed material was sintered from 1100 to 1200 ºC in an argon atmosphere. The calcined powders were characterized by XRD, TGA/DTG, FTIR, SEM, and vibrating sample magnetometry (VSM). All sintered samples were characterized by XRD, SEM, and VSM, and measurements of magnetic permeability and loss factor were obtained. A pure ferrimagnetic phase was formed at all temperatures used. Rietveld analysis allowed the calculation of cation site occupancy and crystallite size, yielding nanometric crystallites (12-20 nm) for the calcined powders. By SEM, the sintered samples showed grain sizes from 1 to 10 μm. Sintered densities (ρ) were measured by the Archimedes method; the bulk density decreased with increasing Zn content. The best magnetization results (105-110 emu/g) were obtained for x = 0.6 at all sintering temperatures. The hysteresis loops show the characteristics of a soft magnetic material. Two magnetization processes were considered: superparamagnetism at low temperature and the formation of magnetic domains at high temperatures. The sintered toroids present a relative magnetic permeability (μr) from 7 to 32 and a loss factor (tan δ) of about 1. The frequency response of the toroids ranges from 0.3 kHz to 0.2 GHz. The composition x = 0.5 presents the highest μr and tan δ values, and x = 0.6 the broadest frequency response. Several microstructural factors influence the behavior of μr and tan δ, such as grain size, inter- and intragranular porosity, grain boundary content, and domain wall movement during magnetization at high frequencies (0.3 kHz to 0.2 GHz).
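The Archimedes measurement mentioned above reduces to a simple calculation from three weighings; a minimal Python sketch (the masses below are hypothetical values, not data from this work):

```python
def archimedes_density(m_dry, m_sat, m_sub, rho_liquid=0.9982):
    """Bulk density in g/cm^3 via the Archimedes (immersion) method.

    m_dry: mass of the dry pellet (g)
    m_sat: mass of the liquid-saturated pellet weighed in air (g)
    m_sub: apparent mass of the pellet submerged in the liquid (g)
    rho_liquid: density of the immersion liquid (g/cm^3); water at 20 C by default
    """
    # the buoyancy difference gives the exterior volume, open pores included
    exterior_volume = (m_sat - m_sub) / rho_liquid
    return m_dry / exterior_volume

# hypothetical pellet: 2.500 g dry, 2.600 g saturated, 2.100 g submerged
rho = archimedes_density(2.500, 2.600, 2.100)  # about 4.99 g/cm^3
```

Comparing this bulk density against the theoretical (X-ray) density of the spinel phase is the usual way to express the sintered densification as a percentage.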
Abstract:
The quantitative chemical characterization of the inorganic fraction of scale products is very relevant for studying, monitoring, and controlling corrosive processes in oil pipelines. X-ray fluorescence spectrometry (XRF) is a very versatile analytical technique that can be used for the quantitative analysis of solid samples at low concentrations of a chemical element, on the order of a few ppm. A methodology involving sample dilution in a 1:7 proportion (one part sample to seven parts wax), pressed as pellets, was used in the XRF calibration for the chemical analysis of scale products from oil pipelines. The calibration involved the preparation of reference samples from mixtures of P.A. (analytical-grade) reagents, aiming to optimize the time consumed in the sample preparation and analysis steps for Al, Ba, Ca, Fe, K, Mg, Mn, Na, P, S, Si, Sr, and Ti, using the same pressed pellet for both trace and major element analysis.
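Because a constant 1:7 sample-to-wax dilution is used, converting a concentration measured on the pressed pellet back to the original scale sample is a multiplication by the dilution factor; a minimal sketch (function name and sample values are illustrative, not from this work):

```python
def undiluted_concentration(measured_ppm, sample_parts=1, binder_parts=7):
    """Back-calculate the element concentration in the original scale sample
    from the concentration measured in the diluted pressed pellet.

    A 1:7 sample-to-wax dilution gives a total dilution factor of 8,
    assuming the wax binder contributes nothing to the analyte signal.
    """
    dilution_factor = (sample_parts + binder_parts) / sample_parts
    return measured_ppm * dilution_factor

# a hypothetical 12.5 ppm reading on the pellet corresponds to 100 ppm
# in the undiluted scale sample
original = undiluted_concentration(12.5)  # -> 100.0
```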
Abstract:
Different compositions of Ni0.5-xCuxZn0.5Fe2O4 and Ni0.5-xCoxZn0.5Fe2O4 (the A and B Families, respectively), with 0 ≤ x ≤ 0.3, were synthesized by the citrate precursor method. The stoichiometric compositions were calcined in air at 350 °C and then pressed into pellets and toroids. The pressed samples were sintered at 1000, 1050, and 1100 °C for 3 h in air, with controlled heating and cooling rates. The calcined powders were characterized by XRD, TGA/DTG, FTIR, SEM, and vibrating sample magnetometry (VSM), and the sintered samples by XRD, SEM, VSM, density measurements, and measurements of magnetic permeability and losses. A pure ferrimagnetic phase was formed at all sintering temperatures, except for composition A-I (at all sintering temperatures) and A-II (which was pure only at 1100 °C). Crystallite sizes obtained by Rietveld analysis were nanometric, from 11 to 20 nm, for the calcined powders. By SEM, the sintered samples showed grain sizes between 1 and 10 μm. The bulk density (ρ) of the sintered A Family showed nearly linear behavior with increasing temperature and a tendency to decrease with increasing copper concentration; the B Family behaved differently, with the density decreasing as the temperature increased. The magnetic measurements revealed that the powders have the characteristics of a soft ferrimagnetic material. Two magnetization processes were considered: superparamagnetism at low temperatures (350 °C) and the formation of magnetic domains at higher temperatures. The best magnetic parameters were obtained for the P and B-II ferrites at high temperatures. The samples sintered at 1000 °C showed a relative permeability (μ) from 50 to 800 for the A Family and from 10 to 600 for the B Family. For the samples sintered at 1100 °C, the B Family showed a variation from 10 to 1000, and the magnetic loss (tan δ) of both Families was around 1. The frequency response of the toroidal cores begins at 0.3 kHz.
Several microstructural factors influence μ and tan δ, such as grain size, inter- and intragranular porosity, the amount of grain boundary, and the dynamics of domain walls at high frequencies.
Abstract:
The environmental impact of the improper disposal of metal-bearing industrial effluents imposes the need for wastewater treatment, since heavy metals are non-biodegradable, hazardous substances that may cause undesirable effects on humans and the environment. The use of microemulsion systems for the extraction of metal ions from wastewater is effective when it occurs in a Winsor II (WII) domain, where a microemulsion phase is in equilibrium with an aqueous phase in excess. However, the microemulsion phase formed in this system has a higher amount of active matter than a WIII system (a microemulsion in equilibrium with both aqueous and oil phases in excess). This motivated a comparative study to evaluate the efficiency of two-phase and three-phase microemulsion systems (WII and WIII) in the extraction of Cu2+ and Ni2+ from aqueous solutions. The systems were composed of saponified coconut oil (SCO) as surfactant, n-butanol as cosurfactant, kerosene as oil phase, and synthetic solutions of CuSO4·5H2O and NiSO4·6H2O, with 2 wt.% NaCl, as aqueous phase. Pseudoternary phase diagrams were obtained, and the systems were characterized by surface tension measurements, particle size determination, and scanning electron microscopy (SEM). The concentrations of metal ions before and after extraction were determined by atomic absorption spectrometry. The extraction study of Cu2+ and Ni2+ in the WIII domain contributed to a better understanding of microemulsion extraction, elucidating the various behaviors reported in the literature for these systems. Furthermore, since WIII systems presented high extraction efficiencies, similar to those of Winsor II systems, they represent an economic and technological advantage in heavy metal extraction, due to the small amounts of surfactant and cosurfactant used in the process and to the formation of a reduced volume of aqueous phase with a high concentration of metal.
Regarding the re-extraction process, the WIII system proved more effective because re-extraction is performed in the oil phase, unlike in WII, where it is performed in the aqueous phase. The presence of the metal-surfactant complex in the oil phase makes it possible to regenerate only the surfactant present in the organic phase, rather than all the surfactant in the process, as in the WII system. This allows the reuse of the microemulsion phase in a new extraction process, reducing surfactant regeneration costs.
Abstract:
Nowadays, the importance of using software processes is well consolidated and considered fundamental to the success of software development projects. Large and medium-sized software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures, and scales is a recurrent challenge in the software industry. It involves adapting software process models to the reality of each project, and it must also promote the reuse of past experience when defining and developing software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the software systems produced. This work explores the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review was conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines was proposed and developed; and (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. A first, qualitative study compared the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. A second, quantitative study considered internal attributes of software process line specifications, such as modularity, size, and complexity.
Finally, a controlled experiment evaluated the effort required to use the investigated approaches, and their understandability, when modeling and evolving software process line specifications. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
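The annotative idea evaluated above can be pictured with a toy sketch: each process element carries the set of features it requires, and a variant is derived by filtering. This is an assumed, simplified model for illustration only, not the tool developed in this work:

```python
# Toy annotative software-process-line derivation: each process element is
# annotated with the set of features that must be enabled for it to be kept.
process_line = [
    ("Elicit requirements", set()),            # mandatory (no annotation)
    ("Formal inspection",   {"high_ceremony"}),
    ("Daily stand-up",      {"agile"}),
    ("Write unit tests",    set()),
    ("Code review",         {"agile", "high_ceremony"}),  # needs both features
]

def derive_variant(elements, enabled_features):
    """Keep an element when all of its annotated features are enabled."""
    return [name for name, required in elements
            if required <= enabled_features]

agile_process = derive_variant(process_line, {"agile"})
# -> ['Elicit requirements', 'Daily stand-up', 'Write unit tests']
```

In the annotative style the whole process line lives in one artifact with the annotations inline, which is what makes properties such as traceability and granularity interesting to compare against a compositional (fragment-assembly) approach.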
Abstract:
The development of information and communication technologies, in particular the Internet and its Web 2.0 information environment, has led to significant changes in contemporary society in the ways informational content is produced. Collaboration and remix, favored by the new services and applications resulting from the development of the Web, are practices that contribute to the exponential growth in the number of information producers. An important part of humanity ceases to be a mere consumer of symbolic goods and becomes a member of a society that sees in collaboration and remix a new form of creation, use, and dissemination of intellectual content. However, such practices involve the production and use of intellectual content, and they are ruled by legislation that determines under what conditions authors and users may produce and use intellectual works. This legislation, established for a context prior to the development of the Web, has created an imbalance in the context of Web 2.0 that needs to be resolved in some way so as to restore the required balance to the flow of information. This study explores the collaborative Web environment, the scope of copyright law in the Web environment, and the Creative Commons licenses as an alternative for producers and users of information to create, recreate, share, use, reuse, and legally disseminate intellectual production for the benefit of the construction of knowledge.
Abstract:
This work presents a proposal for a multi-middleware environment for developing distributed applications, which abstracts different underlying middleware platforms. It describes: (i) the reference architecture designed for the environment; (ii) an implementation that validates the specified architecture by integrating CORBA and EJB; (iii) a case study illustrating the use of the environment; and (iv) a performance analysis. The proposed environment provides interoperability across middleware platforms, allowing components from different kinds of middleware to be reused in a way that is transparent to the developer and without major performance losses. As part of the implementation, we also developed an Eclipse plugin that helps developers gain productivity when building distributed applications with the proposed environment.
Abstract:
The increasing complexity of applications has demanded hardware that is ever more flexible and able to achieve higher performance. Traditional hardware solutions have not been successful in meeting these constraints. General-purpose processors are inherently flexible, since they can perform several tasks; however, they cannot reach high performance when compared to application-specific devices. Conversely, since application-specific devices perform only a few tasks, they achieve high performance but offer less flexibility. Reconfigurable architectures emerged as an alternative to these traditional approaches and have become an area of rising interest over the last decades. The purpose of this paradigm is to modify the device's behavior according to the application, making it possible to balance flexibility and performance and to meet application constraints. This work presents the design and implementation of a coarse-grained hybrid reconfigurable architecture for stream-based applications. The architecture, named RoSA, consists of reconfigurable logic attached to a processor. Its goal is to exploit the instruction-level parallelism of data-flow-intensive applications to accelerate their execution on the reconfigurable logic. Instruction-level parallelism is extracted at compile time, so this work also presents an optimization phase for the RoSA architecture to be included in the GCC compiler. To design the architecture, this work also presents a methodology based on hardware reuse of datapaths, named RoSE. RoSE organizes the reconfigurable units into reusability levels, which provides area savings and datapath simplification. The architecture was implemented in a hardware description language (VHDL) and validated through simulation and prototyping. Benchmarks used to characterize performance demonstrated a speedup of 11x in the execution of some applications.
Abstract:
Research on Wireless Sensor Networks (WSN) has evolved, with potential applications in several domains. However, building WSN applications is hampered by the need to program with the low-level abstractions provided by sensor operating systems and by the need for specific knowledge about each application domain and each sensor platform. We propose an MDA (Model-Driven Architecture) approach to develop WSN applications. This approach allows domain experts to contribute directly to application development without low-level knowledge of WSN platforms and, at the same time, allows network experts to program WSN nodes to meet application requirements without specific knowledge of the application domain. Our approach also promotes the reuse of developed software artifacts, allowing an application model to be reused across different sensor platforms and a platform model to be reused for different applications.
Abstract:
Nowadays, several electronic devices support digital video; examples include cell phones, digital cameras, video cameras, and digital televisions. However, raw video represents a huge amount of data, millions of bits, when stored in the form in which it was captured; storing and transmitting it would require enormous disk space and bandwidth. Video compression is therefore essential to make information storage and transmission feasible. Motion estimation is a technique used in the video encoder that exploits the temporal redundancy present in video sequences to reduce the amount of data needed to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution videos according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was developed to provide a high degree of data reuse; the adopted data reuse scheme reduces the bandwidth required to execute motion estimation. Motion estimation is responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the final encoder performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System.
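The motion estimation task described above is, at its core, block matching: for each block of the current frame, the encoder searches the reference frame for the most similar block. A pure-Python full-search sketch using the sum of absolute differences (SAD) cost, illustrative only since the thesis implements the module in VHDL hardware:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equally-sized blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
                          for a, b in zip(row_a, row_b))

def full_search(ref, cur, top, left, n, radius):
    """Exhaustive block matching: find the motion vector (dy, dx) that
    minimizes the SAD between the n x n current block at (top, left)
    and candidate blocks in the reference frame within +/- radius."""
    block = [row[left:left + n] for row in cur[top:top + n]]
    best = (None, float("inf"))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            # skip candidates that fall outside the reference frame
            if 0 <= y and y + n <= len(ref) and 0 <= x and x + n <= len(ref[0]):
                candidate = [row[x:x + n] for row in ref[y:y + n]]
                cost = sad(block, candidate)
                if cost < best[1]:
                    best = ((dy, dx), cost)
    return best

# hypothetical 8x8 frames: cur is ref shifted up-left by one pixel, so the
# best match for the current block lies at offset (1, 1) in the reference
ref = [[y * 8 + x for x in range(8)] for y in range(8)]
cur = [[ref[min(y + 1, 7)][min(x + 1, 7)] for x in range(8)] for y in range(8)]
mv, cost = full_search(ref, cur, 2, 2, 2, 2)  # -> ((1, 1), 0)
```

Full search visits every candidate in the window, which is exactly why data reuse matters in hardware: neighboring candidates overlap heavily, so reusing fetched reference pixels cuts the memory bandwidth dramatically.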
Abstract:
This dissertation presents a model-driven, integrated approach to the variability management, customization, and execution of software processes. The approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering supports the specification of software processes and their transformation into workflow specifications, while software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. To evaluate the feasibility of the approach, we implemented it using existing model-driven engineering technologies. Software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes was implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are used to transform EPF processes into jPDL workflow specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
Abstract:
The adoption of the software product line (SPL) approach brings several benefits compared to conventional development processes based on creating a single software system at a time. Developing an SPL differs from traditional software construction in that it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development; however, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proved limited and provide only general guidelines. In addition, there is a lack of tools to support the variability management and customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration, and system levels in domain and application engineering; and (iii) tool support for automating the variability management and customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
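The customization of automated test cases for a derived product can be pictured with a toy sketch: domain engineering provides reusable test assets bound to features, and application engineering selects them according to the product's feature selection. Feature and test names here are hypothetical, not from the proposed tooling:

```python
# Toy test-asset derivation for a software product line: domain engineering
# provides reusable test cases bound to features; application engineering
# derives the suite for one product from its feature selection.
domain_test_assets = {
    "checkout":       ["test_cart_total", "test_checkout_flow"],
    "wishlist":       ["test_add_to_wishlist"],
    "guest_browsing": ["test_anonymous_catalog"],
}
core_tests = ["test_login", "test_catalog_search"]  # common to every product

def derive_test_suite(selected_features):
    """Product-specific suite = core tests + tests of each selected feature."""
    suite = list(core_tests)
    for feature in selected_features:
        suite.extend(domain_test_assets.get(feature, []))
    return suite

basic_store = derive_test_suite(["checkout"])
# -> ['test_login', 'test_catalog_search', 'test_cart_total', 'test_checkout_flow']
```

The point of the sketch is the division of labor: the reusable assets are written once in domain engineering, and each product's suite in application engineering is a selection, not a rewrite.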