Resumo:
This work addresses the analysis and development of multivariable predictive controllers based on bilinear multi-models. Monovariable and multivariable linear Generalized Predictive Control (GPC) is reviewed, highlighting its properties, key features, and industrial applications. Bilinear GPC, the basis for this thesis, is presented using the time-step quasilinearization approach. Results are presented showing that this controller outperforms linear GPC, since bilinear models better represent the dynamics of certain processes. Because time-step quasilinearization is an approximation, it introduces a prediction error that limits the controller's performance as the prediction horizon increases. To minimize this error, Bilinear GPC with iterative compensation is presented, seeking better performance than classic Bilinear GPC, and results of the iterative compensation algorithm are shown. The use of multi-models is discussed as a way to correct the deficiency of single-model controllers when they are applied over large operating ranges. Methods of measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications are carried out on simulated distillation columns that closely reproduce the behaviour of real ones, and the results are satisfactory.
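The receding-horizon idea underlying GPC can be sketched on a toy first-order linear model. Everything below (the model y(k+1) = a·y(k) + b·u(k), the horizon, the weight lambda, and the absence of the Delta-u formulation and of constraints) is an illustrative simplification, not the controller developed in the thesis:

```python
import numpy as np

# Simplified predictive-control sketch for y(k+1) = a*y(k) + b*u(k).
# a, b, N and lam are illustrative choices, not values from the thesis.
a, b = 0.9, 0.5
N, lam = 10, 0.01
setpoint = 1.0

# Prediction over the horizon: y_hat = f + G @ u_future, where f is the
# free response and G the lower-triangular forced-response matrix.
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a ** (i - j) * b
K = np.linalg.solve(G.T @ G + lam * np.eye(N), G.T)  # unconstrained gain

y, history = 0.0, []
for _ in range(30):
    f = np.array([a ** (i + 1) * y for i in range(N)])  # free response
    w = np.full(N, setpoint)                            # reference over horizon
    u = (K @ (w - f))[0]        # receding horizon: apply the first move only
    y = a * y + b * u           # plant (here identical to the model)
    history.append(y)
print(f"y after 30 steps: {history[-1]:.3f}")
```

The bilinear case discussed in the thesis replaces the constant model by one re-linearized at every time step, which is where the horizon-dependent prediction error comes from.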
Resumo:
The use of wireless sensor and actuator networks in industry has been increasing in recent years, bringing multiple benefits over wired systems, such as network flexibility and manageability. Such networks consist of a possibly large number of small, autonomous sensor and actuator devices with wireless communication capabilities. The data collected by sensors are sent, directly or through intermediate nodes, to a base station called the sink node. Data routing in this environment is an essential matter, since it is strictly bound to energy efficiency and thus to the network lifetime. This work investigates the application of a routing technique based on Reinforcement Learning's Q-Learning algorithm to a wireless sensor network, using an NS-2 simulated environment. Several metrics, such as energy consumption, data packet delivery rate, and delay, are used to validate the proposal by comparing it with other solutions in the literature.
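The Q-Learning routing idea can be sketched outside NS-2 on a toy topology: each node keeps a Q-value per next hop, the reward is the negative energy cost of the hop, and packets gradually settle on an energy-efficient route to the sink. The five-node graph, costs, and learning parameters below are illustrative assumptions, not the thesis setup:

```python
import random

# Hypothetical 5-node topology; edge weights are per-hop energy costs.
NEIGHBORS = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: []}
COST = {(0, 1): 2.0, (0, 2): 1.0, (1, 2): 1.0, (1, 3): 3.0,
        (2, 3): 1.0, (3, 4): 1.0}
COST.update({(b, a): c for (a, b), c in list(COST.items())})
SINK = 4

ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1
Q = {(n, m): 0.0 for n in NEIGHBORS for m in NEIGHBORS[n]}

def choose(node):
    """Epsilon-greedy next-hop selection."""
    if random.random() < EPS:
        return random.choice(NEIGHBORS[node])
    return max(NEIGHBORS[node], key=lambda m: Q[(node, m)])

random.seed(1)
for _ in range(2000):                 # training: packets originate at node 0
    node = 0
    while node != SINK:
        nxt = choose(node)
        reward = -COST[(node, nxt)]   # penalize energy spent on the hop
        future = 0.0 if nxt == SINK else max(Q[(nxt, m)] for m in NEIGHBORS[nxt])
        Q[(node, nxt)] += ALPHA * (reward + GAMMA * future - Q[(node, nxt)])
        node = nxt

# Greedy route after learning: follow the best Q-value at each node.
route, node = [0], 0
while node != SINK and len(route) < 10:
    node = max(NEIGHBORS[node], key=lambda m: Q[(node, m)])
    route.append(node)
print(route)
```

In a real deployment each node would hold only its own row of the Q-table and update it from acknowledgements, which is what makes the approach attractive for distributed, energy-constrained networks.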
Resumo:
In recent decades, researchers have made a concentrated effort to reconcile continued urban development with environmental preservation. The recycling and reuse of materials in industry have been regarded as the best route to sustainable development. One relevant aspect in this case is the rational use of electrical energy, and the role of engineering here is to conceive new processes and materials that reduce energy consumption while maintaining the benefits of the technology. In this context, the objective of the present research is to analyze quantitatively the thermal behavior of walls built with concrete blocks whose composition aggregates expanded polystyrene (EPS), reused in the form of flakes or boards, resulting in a "lightweight concrete". Systematic experiments were conducted with a wall (taken as the standard) built with ordinary concrete blocks; two walls built with lightweight concrete blocks, distinguished by their EPS/sand ratio; a wall of ceramic bricks ("eight-hole" type); and a wall of ordinary cement blocks, so as to obtain a comparative analysis of the thermal behavior of the systems. Further tests on the blocks included stress analysis and measurement of thermal properties (ρ, cp, and k). Based on the results, it was possible to establish a quantitative relationship between the concentration (density) of EPS in the constructive elements and the decrease in the heat transfer rate, which also changes the other thermal properties of the material, as was demonstrated. The lightweight concrete walls showed better thermal behavior than the other two constructive systems in worldwide use.
The results demonstrate the viability of EPS as an aggregate (raw material) in concrete for the fabrication of blocks for non-structural masonry that act as thermal insulation in buildings. A direct consequence is the possibility of reducing the electrical energy consumed in the climatization of buildings. Another noteworthy aspect of the investigation is the reuse of EPS as a raw material for civil construction, with a clear benefit in reducing environmental problems.
Resumo:
The structural ceramics industries are among the most important production chains in the state of Rio Grande do Norte. The industry and other interest groups aim to replace firewood with natural gas. Studies on this matter have concluded that the simple change of fuel does not guarantee products of superior quality, and that the increase in fuel cost can make gas firing economically unviable for the majority of the products currently manufactured. However, some proposed innovations in process and product are being studied in an attempt to justify the use of natural gas in the structural ceramics industry. One of the aspects investigated is the development of differentiated ceramic products, with new designs and greater added value. In this context, this work investigates the potential use of light-firing clays in the fabrication of "dry-joint facing bricks", a new ceramic product with an innovative shape. The work was developed in three stages. In the initial stage, the raw materials were characterized, gathering physical, chemical, mineralogical, and mechanical information about the samples. In the second stage, five ceramic bodies were formulated using two of the nine clays characterized in the first stage. The bodies were analyzed and compared with respect to particle size distribution, plasticity, and technological properties. In the last stage, tests were carried out on solid bricks manufactured on an industrial scale. The results show that the nine clays can be used in the manufacture of the new ceramic products, either as the sole constituent of the ceramic body or mixed with other clay(s).
Resumo:
Environmental sustainability has become one of the topics of greatest interest in industry, mainly due to effluent generation. Phenols are found in the effluents of many industries, such as refineries, coal processing, pharmaceutical, plastics, paints, and pulp and paper plants. Because phenolic compounds are toxic to humans and aquatic organisms, Federal Resolution CONAMA No. 430 of 13/05/2011 limits the maximum content of phenols to 0.5 mg.L-1 for release into freshwater bodies. Among effluent treatment options, liquid-liquid extraction is the most economical process for phenol recovery, because it consumes little energy; in most cases, however, it employs an organic solvent, whose high toxicity can itself cause environmental problems. Hence there is a need for new methodologies that replace these solvents with biodegradable ones. Some literature studies demonstrate the feasibility of removing phenolic compounds from aqueous effluents with biodegradable solvents. In this kind of extraction, called cloud point extraction, a nonionic surfactant is used as the extracting agent for the phenolic compounds. In order to optimize the phenol extraction process, this work studies the mathematical modeling and optimization of the extraction parameters and investigates the effect of the independent variables on the process. A 3² full factorial design was carried out, with operating temperature and surfactant concentration as independent variables, and the extraction parameters (volumetric fraction of the coacervate phase, residual concentrations of surfactant and phenol in the dilute phase after phase separation, and phenol extraction efficiency) as dependent variables.
To achieve these objectives, the work was carried out in five steps: (i) selection of literature data; (ii) use of the Box-Behnken model to find mathematical models that describe the phenol extraction process; (iii) data analysis using STATISTICA 7.0, with analysis of variance to assess model significance and prediction; (iv) model optimization using the response surface method; and (v) validation of the mathematical models using additional measurements, from samples different from those used to build the models. The results showed that the mathematical models found are able to calculate the effect of surfactant concentration and operating temperature on each extraction parameter studied, within the boundaries used. The model optimization allowed consistent and applicable results to be obtained in a simple and quick way, leading to high efficiency in process operation.
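The fit-then-optimize workflow of steps (ii)-(iv) can be sketched with a second-order response-surface model over the two coded factors. The nine factorial points and response values below are synthetic, standing in for the literature extraction data used in the thesis:

```python
import numpy as np

# Fit y = b0 + b1*T + b2*C + b11*T^2 + b22*C^2 + b12*T*C to a 3^2 factorial.
T = np.array([-1, -1, -1, 0, 0, 0, 1, 1, 1], float)   # coded temperature
C = np.array([-1, 0, 1, -1, 0, 1, -1, 0, 1], float)   # coded surfactant conc.
# Synthetic efficiency (%) generated from a quadratic with an interior maximum:
y = 80 + 5*T + 3*C - 5*T**2 - 6*C**2 + 2*T*C

X = np.column_stack([np.ones_like(T), T, C, T**2, C**2, T*C])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Response-surface optimization on a fine grid within the region [-1, 1]^2.
g = np.linspace(-1, 1, 201)
TT, CC = np.meshgrid(g, g)
pred = (beta[0] + beta[1]*TT + beta[2]*CC +
        beta[3]*TT**2 + beta[4]*CC**2 + beta[5]*TT*CC)
i = np.unravel_index(np.argmax(pred), pred.shape)
print(f"optimum at T={TT[i]:.2f}, C={CC[i]:.2f}, predicted y={pred[i]:.1f}")
```

With real data the fitted coefficients carry noise, which is why step (iii) applies analysis of variance before the model is trusted for optimization.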
Resumo:
The babassu (Orbignya phalerata) is a native tree found in northern Brazil, and extracts of the babassu coconut have been widely used in industry. Babassu flour contains about 60% starch, so besides its use as a foodstuff it can serve as an alternative biofuel source. However, the properties of this starch lack study and understanding. The main purpose of this study was to investigate the thermal behavior of raw babassu flour and its solid hydrolyzed fraction. The analyses were carried out using SHIMADZU DSC and TG thermal analyzers. The results demonstrated a reduction in thermal stability of the solid hydrolyzed fraction compared to the raw matter. The kinetic parameters were investigated using non-isothermal methods; the decomposition process showed an activation energy E(a) of 166.86 kJ mol(-1) and a frequency factor (β) of 6.283 x 10(14) min(-1), and was determined to be a first-order reaction (n = 1).
Resumo:
Over the last two decades, advanced ceramics have been used extensively in industrial applications due to their high wear resistance and hardness. However, the finishing of these parts still carries a high cost. This finishing is usually performed by grinding, the only economically viable process that produces surfaces of high quality and geometric precision. In this context, companies have sought to optimize the grinding process, for example by reducing the flow of cutting fluid, which also addresses worldwide environmental-preservation requirements. This project therefore explored the Minimum Quantity Lubrication (MQL) technique in external cylindrical plunge grinding of ceramics with diamond wheels. Two cooling methods were used, conventional and MQL, with three infeed rates for each case. A conventional nozzle and an MQL nozzle were used, the latter fitted with a jet-output uniformizer. The output variables analyzed were acoustic emission, G ratio, surface aspect via scanning electron microscopy (SEM), roughness, and roundness. Although conventional cooling still gives the best results compared with MQL cooling, the latter can meet the requirements of several applications, especially at low equivalent cutting thicknesses (h eq). Moreover, the MQL technique has the advantage of a lower environmental impact than conventional lubrication, due to the minimal use of cutting fluid, whose disposal is increasingly regulated and costly.
Resumo:
During the storage of oil, sludge forms at the bottom of tanks by decantation; this sludge is composed of a large quantity of oil (heavy petroleum fractions), water, and solids. Oil sludge is a complex viscous mixture which is considered a hazardous waste, so it is necessary to develop methods and technologies that optimize the cleaning process, oil extraction, and industrial applications. This study therefore aimed to determine the composition of oil sludge, to obtain and characterize microemulsion systems (MES), and to study their application in sludge treatment. Soxhlet extraction of crude oil sludge and aged sludge was carried out, allowing quantification of the oil (43.9 % and 84.7 % - 13 ºAPI), water (38.7 % and 9.15 %), and solids (17.3 % and 6.15 %) contents, respectively. The residues were characterized by X-ray fluorescence (XRF), X-ray diffraction (XRD), and transmission infrared spectroscopy (FT-IR). XRF showed iron and sulfur in higher proportions, and XRD confirmed the presence of the minerals pyrite (FeS2), pyrrhotite (FeS), and magnetite (Fe3O4). FT-IR showed the presence of heavy oil fractions. In parallel, twelve MES were prepared by combining the following constituents: two nonionic surfactants (Unitol L90 and Renex 110 - S), three cosurfactants (butanol, sec-butanol, and isoamyl alcohol - C), three aqueous phases (tap water - ADT, 6 % HCl acidic solution, and 3.5 % NaCl saline solution - AP), and one oil phase (kerosene - OP). From the systems obtained, a common point in the microemulsion region (25 % [C+S], 5 % OP, and 70 % AP) was chosen and characterized at room temperature (25 °C) by viscosity (Haake Mars rheometer), particle diameter (Zeta Plus), and thermal stability. Mixtures with this composition were applied to oil sludge solubilization under agitation at a ratio of 1:4, varying time and temperature.
The solubilization efficiencies, excluding solids, ranged between 73.5 % and 95 %. Two particular systems with oil sludge solubilization efficiencies over 90 % were selected for use in storage tanks, proving the effectiveness of the MES. A factorial design delimited within the domain showed, through predictive models, how the MES constituents affect the solubilization of aged oil sludge. MES A was chosen as the best system, solubilizing a high amount of aged crude oil sludge (~151.7 g/L per MES).
Resumo:
For consumer safety, the presence of pathogenic contaminants in foods must be monitored, because they are responsible for foodborne outbreaks which, depending on the level of contamination, can ultimately cause the death of those who consume the food. In industry, this identification must be fast and cost-effective. This study shows the utility and application of near-infrared (NIR) transflectance spectroscopy as an alternative method for the identification and classification of Escherichia coli and Salmonella Enteritidis in commercial fruit pulp (pineapple). Principal Component Analysis (PCA), Soft Independent Modeling of Class Analogy (SIMCA), and Partial Least Squares Discriminant Analysis (PLS-DA) were used in the analysis. It was not possible to obtain total separation between samples using PCA and SIMCA. PLS-DA showed good prediction performance, reaching 87.5% for E. coli and 88.3% for S. Enteritidis. The best models were obtained with PLS-DA on second-derivative spectra, with a sensitivity and specificity of 0.87 and 0.83, respectively. These results suggest that NIR spectroscopy and PLS-DA can be used to discriminate and detect bacteria in fruit pulp.
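PLS-DA amounts to PLS regression on a dummy-coded class variable followed by thresholding the predicted value. The sketch below implements a minimal NIPALS PLS1 on synthetic "spectra"; the class-specific absorption bands and noise levels are invented for illustration and are not the NIR data of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic spectra: 40 samples x 50 wavelengths, two classes whose mean
# spectra differ in two bands (purely illustrative).
n, p = 40, 50
labels = np.repeat([0, 1], n // 2)            # 0 = class A, 1 = class B
base = np.sin(np.linspace(0, 3, p))
X = base + 0.05 * rng.standard_normal((n, p))
X[labels == 1, 10:15] += 0.10                 # class-specific bands
X[labels == 1, 30:35] -= 0.08

def pls1_fit(X, y, ncomp):
    """Minimal NIPALS PLS1; returns regression vector and centering terms."""
    Xc, xm = X - X.mean(0), X.mean(0)
    yc, ym = y - y.mean(), y.mean()
    W, P, Q = [], [], []
    for _ in range(ncomp):
        w = Xc.T @ yc                     # weight vector
        w /= np.linalg.norm(w)
        t = Xc @ w                        # scores
        tt = t @ t
        pvec = Xc.T @ t / tt              # loadings
        q = yc @ t / tt
        Xc = Xc - np.outer(t, pvec)       # deflate
        yc = yc - q * t
        W.append(w); P.append(pvec); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)   # regression vector
    return B, xm, ym

B, xm, ym = pls1_fit(X, labels.astype(float), ncomp=3)
scores = (X - xm) @ B + ym
pred = (scores > 0.5).astype(int)         # PLS-DA decision threshold
print("training accuracy:", (pred == labels).mean())
```

On real spectra the model would be validated on held-out samples, as in the study, rather than scored on the training set.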
Resumo:
PLCs (Programmable Logic Controllers) perform control operations: they receive information from the environment, process it, and modify that same environment according to the results produced. They are commonly used in industry in several applications, from mass transport to the petroleum industry. As the complexity of these applications increases, and as many of them are safety-critical, a need arises to ensure that they are reliable. Testing and simulation are the de facto methods used in industry to do so, but they can leave flaws undiscovered. Formal methods can provide more confidence in an application's safety, since they permit its mathematical verification. We make use of the B Method, which has been successfully applied in the formal verification of industrial systems, is supported by several tools, and can handle decomposition, refinement, and verification of correctness with respect to the specification. The method we developed and present in this work automatically generates B models from PLC programs and verifies them against safety constraints manually derived from the system requirements. The scope of our method is the PLC programming languages of the IEC 61131-3 standard, although we are also able to verify programs not fully compliant with the standard. Our approach aims to ease the integration of formal methods in industry by reducing the effort needed to formally verify PLCs.
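The role of safety constraints derived from requirements can be illustrated on a toy interlock program. Instead of proof with the B Method, the sketch below simply enumerates bounded input sequences and checks the constraint after every scan cycle; the program, the constraint, and the bound are all hypothetical:

```python
from itertools import product

# Hypothetical interlock: the motor must never run while the guard is open.
def scan(running, start, stop, guard_open):
    """One PLC scan cycle: latch start/stop, force the motor off if the
    guard opens or stop is pressed."""
    if stop or guard_open:
        return False
    if start:
        return True
    return running

# Safety constraint derived (by hand) from the requirements:
def safe(running, guard_open):
    return not (running and guard_open)

# Explore every input sequence of 4 cycles from the initial state.
violations = 0
for seq in product([False, True], repeat=12):       # 3 inputs x 4 cycles
    running = False
    for i in range(4):
        start, stop, guard = seq[3*i:3*i+3]
        running = scan(running, start, stop, guard)
        if not safe(running, guard):
            violations += 1
print("violations found:", violations)
```

Brute-force enumeration only works for tiny examples; the point of translating to B is that the same constraint can be discharged by proof for programs whose state space is far too large to enumerate.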
Resumo:
Brazil has about 8,500 km of coastline, along which fishing is a historically important source of animal protein for human consumption. National fishing records show growth in marine fishery production until 1985, followed by a steady decline. From 2003 on, fishing statistics point to some "recovery" of total fisheries production, probably related to a change in industry practice: commercial fishing began to target smaller species of low commercial value but great abundance. The coney, Cephalopholis fulva (Serranidae), is one of the species that have suffered increasing fishing pressure in recent years. To provide data on the current state of the genetic diversity of these populations, several molecular markers have been employed, since prior knowledge of genetic variability is crucial for management and biodiversity conservation. To this end, mtDNA control region (d-loop) sequences of Cephalopholis fulva from five geographical points along the coast of Brazil (Ceará, Rio Grande do Norte, Bahia, and Espírito Santo) and the Archipelago of Fernando de Noronha (FN) were sequenced and their genetic diversity analyzed. The FST values were very low (0.0246 to 0.000), indicating high gene flow between the sampled spots. The diversity indices (h and π) indicate either secondary contact between previously differentiated allopatric lineages or large, stable populations with a long evolutionary history. Tajima's and Fu's tests showed expansion for all populations; in contrast, the mismatch distribution and SSD indicated expansion only for the coastal populations. Unlike other Atlantic species that were deeply affected by late Pleistocene events, the population-genetic patterns of C. fulva may be related to more recent events, approximately 130,000 years ago. Moreover, the geographical samples of C. fulva showed high genetic diversity, indicating the absence of deleterious effects of over-exploitation on this species, as well as evidence of complete panmixia among all sampled populations.
Resumo:
Factorial experiments are widely used in industry to investigate the effects of process factors on quality response variables. Many food processes, for example, are not only subject to variation between days, but also between different times of the day. Removing this variation using blocking factors leads to row-column designs. In this paper, an algorithm is described for constructing factorial row-column designs when the factors are quantitative, and the data are to be analysed by fitting a polynomial model. The row-column designs are constructed using an iterative interchange search, where interchanges that result in an improvement in the weighted mean of the efficiency factors corresponding to the parameters of interest are accepted. Some examples illustrating the performance of the algorithm are given.
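The interchange loop can be sketched on a toy criterion. The imbalance score below stands in for the paper's weighted mean of efficiency factors, and the 3 × 6 layout with 3 treatments is an arbitrary example; only the accept-improving-swaps structure mirrors the algorithm described:

```python
import random

random.seed(3)
ROWS, COLS, TRT = 3, 6, 3

# Start: each column is a random permutation of the treatments, so column
# blocking holds by construction; rows may be badly unbalanced.
design = [random.sample(range(TRT), TRT) for _ in range(COLS)]  # design[col][row]

def imbalance(design):
    """Sum of squared deviations of per-row treatment counts from balance."""
    target = COLS / TRT
    score = 0.0
    for r in range(ROWS):
        counts = [0] * TRT
        for col in design:
            counts[col[r]] += 1
        score += sum((k - target) ** 2 for k in counts)
    return score

# Interchange search: swap two cells within a column (preserving the column
# blocking), keep the swap only if the score improves, and repeat until a
# full pass yields no improvement.
init_score = score = imbalance(design)
improved = True
while improved:
    improved = False
    for c in range(COLS):
        for r1 in range(ROWS):
            for r2 in range(r1 + 1, ROWS):
                design[c][r1], design[c][r2] = design[c][r2], design[c][r1]
                new = imbalance(design)
                if new < score:
                    score, improved = new, True
                else:  # revert a non-improving swap
                    design[c][r1], design[c][r2] = design[c][r2], design[c][r1]
print("imbalance:", init_score, "->", score)
```

The paper's algorithm follows the same loop but evaluates candidate swaps under a polynomial model for quantitative factors, accepting those that raise the weighted mean of the efficiency factors of the parameters of interest.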
Resumo:
This article describes a technique for partitioning Large Scale Virtual Environments (LSVEs) into hexagonal cells, using portals on the cell interfaces to reduce the number of messages on the network and the complexity of the virtual world. These environments usually demand a high volume of data, which must be sent only to the users who need the information [Greenhalgh, Benford 1997].
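The cell bookkeeping behind such a partition can be sketched with axial hexagon coordinates: a user's message is relayed only to users in the sender's cell or one of its six neighbours. The cell size, coordinate layout, and helper names below are illustrative assumptions, not the article's implementation:

```python
import math

SIZE = 10.0  # hexagon circumradius (illustrative)

def hex_round(q, r):
    """Round fractional axial coordinates to the nearest hex via cube coords."""
    s = -q - r
    rq, rr, rs = round(q), round(r), round(s)
    dq, dr, ds = abs(rq - q), abs(rr - r), abs(rs - s)
    if dq > dr and dq > ds:
        rq = -rr - rs
    elif dr > ds:
        rr = -rq - rs
    return (rq, rr)

def point_to_hex(x, y):
    """Map a world position to axial hex coordinates (pointy-top layout)."""
    q = (math.sqrt(3) / 3 * x - y / 3) / SIZE
    r = (2 / 3 * y) / SIZE
    return hex_round(q, r)

def neighbors(cell):
    """The six adjacent hexagons in axial coordinates."""
    q, r = cell
    return [(q + 1, r), (q - 1, r), (q, r + 1),
            (q, r - 1), (q + 1, r - 1), (q - 1, r + 1)]

def recipients(sender_pos, users):
    """Users in the sender's cell or one of its six neighbours; cells further
    away are reachable only through portals and receive no messages."""
    cell = point_to_hex(*sender_pos)
    zone = {cell, *neighbors(cell)}
    return [u for u, pos in users.items() if point_to_hex(*pos) in zone]

users = {"a": (0.0, 0.0), "b": (5.0, 5.0), "c": (200.0, 200.0)}
print(recipients((0.0, 0.0), users))
```

Filtering by cell adjacency is what cuts message volume: the distant user "c" never sees traffic from "a", regardless of how many users populate the world.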