25 results for Brams, Steven J.: The win-win solution
Abstract:
This is a qualitative, reflexive study focused on digital literacy. Among the digital media that could support the teaching of argumentation in the Science & Technology and Information Technology undergraduate programs of the Federal University of Rio Grande do Norte, we chose a serious game as the object of research. Given the object of study of the course Reading and Writing II (argumentation and genres of the argumentative order), common to both programs, we invested in the development of a serious game named ArgumentACTION, believing that it may indeed become a promising didactic instrument. We therefore seek to understand whether and how this game can help students develop their reading and writing skills more independently, specifically with regard to a genre of the argumentative order: the opinion piece. With this research, we intend to contribute to the teaching of the Portuguese language on three fronts: extending the theoretical scope, in order to make the teaching-learning process of argumentation more intelligible; proposing a new methodological possibility, with the incorporation of serious games into teaching; and refining the game we are working on, so as to build, and make available, a more polished digital tool to support the teaching and learning of reading and writing opinion pieces. To that end, we draw on the following theoretical-methodological foundations: Literacy Studies (KLEIMAN, 2012b; TINOCO, 2008; OLIVEIRA, 2010; GEE, 2009; 2010; ROJO, 2012), Applied Linguistics (KLEIMAN, 1998; BUSH-LEE, 2009), the Philosophy of Language (BAKHTIN; VOLOSHINOV, 2012) and Critical Pedagogy (DEWEY, 2010). A group of students from the aforementioned undergraduate programs collaborated with this research by playing and analyzing the game; they were also interviewed about the experience.
From the data generated, we established the following categories of analysis: de-collection, interest, multimodality/multisemiosis and interactivity, agent of literacy, and learning principles. The conclusions show that investment in applications, especially games, can bring real benefits to the teaching and learning of the Portuguese language; they also reveal that work on argumentation has much to gain from the incorporation of serious games. The possible advantages, however, depend on a focused teaching practice, on constant improvement and updating of this type of interactive tool, and on the pedagogical practice of those who use and develop the games.
Abstract:
This thesis describes and demonstrates a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on building performance, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts about the performance of certain design choices. These situations require some kind of additional knowledge to be properly addressed. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas: doubts about the design that cannot be fully understood or solved without quantitative information. The concept intends to combine the analytical power of computer simulation tools with the architect's capacity for synthesis. Three types of simulation tools are considered: solar analysis, thermal/energy simulation, and CFD. Design dilemmas are formulated and framed according to the architect's process of reflection on performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical, and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) review preceding research on the use of simulation tools, and iii) draw analogies between the proposed concept and concepts developed or described in previous work on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands. The three processes investigated are houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved.
The practical application, despite its success in the research context, revealed some limitations to the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.
Abstract:
Portland-polymer composites are promising candidates for use as cementing material in oil wells of Northeastern Brazil that contain heavy oils and are submitted to steam injection. It is therefore necessary to evaluate their degradation by the commonly used acidizing agents. In addition, identifying how aggressive the different hostile environments are is an important contribution to the choice of the acid systems to be used. The performance of Portland-polymer composites prepared with powdered polyurethane, aqueous polyurethane, rubber tire residues, and a biopolymer was investigated; the specimens were reinforced with polished SAE 1045 carbon steel for the electrochemical measurements. The degrading environments and electrolytes were 15.0% HCl, 6.0% HCl + 1.5% HF (soft mud acid), 12.0% HCl + 3.0% HF (regular mud acid), and 10% HAc + 1.5% HF. The acid solution most aggressive to the plain hardened Portland cement paste was the regular mud acid, with weight loss of around 23.0%, followed by the soft mud acid with 11.0%, 15.0% HCl with 7.0% and, last, 10.0% HAc + 1.5% HF with just 1.0%. The powdered-polyurethane and aqueous-polyurethane composites showed the greatest durability, with a reduction of around 87.0% in the weight loss in regular mud acid. The acid attack is superficial and proceeds as an action layer, in which the degraded layer is responsible for slowing the kinetics of the degradation process. This behavior is seen mainly in the Portland-aqueous polyurethane composite, because its degraded layer is impregnated with chemically modified polymer. The fact that the acid attack has, in general, no influence on the compressive strength or fractography of the samples confirms that theory.
The efficiency mechanism of the Portland-polymer composites subjected to acid attack is due to their decreased porosity and permeability relative to the plain Portland paste, the smaller quantity of Ca2+ (the element preferentially leached into the acid solution), the wave effect, and the substitution of part of the degradable bulk by polymer. The electrolyte 10% HAc + 1.5% HF was the least aggressive to the external corrosion of the casing, showing open-circuit potentials of around +250 mV, compared to -130 mV for the simulated pore solution, over the first 24 hours of immersion; this behavior persisted for at least two months. Similar corrosion rates, around 0.01 μA/cm², were observed in both electrolytes. High total impedance values, incipient arcs, and large capacitive polarization-resistance arcs in the Nyquist plots, indicating a passivation process, confirm its efficiency. Portland-polymer composites are thus possible solutions for cementing oil wells submitted simultaneously to steam injection and acidizing operations, and 10.0% HAc + 1.5% HF is the least aggressive solution with respect to the external corrosion of the casing.
Abstract:
Boron is a semi-metal present in certain types of soils and natural waters. It is essential to the healthy development of plants and, depending on its concentration, non-toxic to humans. It is used in various industries and is present in the produced water that accompanies oil production. In Rio Grande do Norte specifically, one of the largest onshore oil producers in Brazil, the water/oil ratio in some fields exceeds 90%. The most common destination of this produced water is disposal in the open sea, after processing to meet the legal specification. In this context, this research studies the extraction of boron from produced water by microemulsion systems, for industrial utilization. The boron extraction efficiency was evaluated as a function of the surfactant (DDA and OCS, both characterized by FT-IR), cosurfactant (butanol and isoamyl alcohol), organic phase (kerosene and heptane), and aqueous phase (3.6 ppm boron solution at alkaline pH). The cosurfactant/surfactant ratio used was four, and the percentage of organic phase for all points of study was set at 5%; the points with the highest percentage of aqueous phase were chosen. Each system was designed around three points of different compositions relative to the constituents of a pseudoternary diagram. These points were chosen, according to phase-behavior studies in pseudoternary diagrams from previous work, within the Winsor II region. The excess aqueous solution obtained from these systems was separated and analyzed by ICP OES. For the data set obtained, the best boron extraction efficiency was achieved by the system with DDA, isoamyl alcohol, and heptane, which extracted 49% in a single step. OCS was not viable for the extraction of boron by microemulsion systems under the conditions defined in this study.
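The 49% figure is the fraction of boron removed from the aqueous phase in one extraction step. As a minimal illustration (the helper function and the residual concentration below are ours, chosen to match the reported numbers, not data from the thesis):

```python
def extraction_efficiency(c_initial_ppm, c_residual_ppm):
    """Percentage of solute removed from the aqueous phase in one extraction step."""
    return 100.0 * (c_initial_ppm - c_residual_ppm) / c_initial_ppm

# a 3.6 ppm boron feed leaving ~1.84 ppm in the excess aqueous phase
# corresponds to the reported 49% single-step efficiency
print(round(extraction_efficiency(3.6, 1.836), 1))  # → 49.0
```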
Abstract:
Benznidazole (BNZ) is the only alternative for the treatment of Chagas disease in Brazil. The drug has low solubility, which restricts its dissolution rate. The present work therefore studied the interactions of BNZ in binary systems with beta-cyclodextrin (β-CD) and hydroxypropyl-beta-cyclodextrin (HP-β-CD), in order to increase the apparent aqueous solubility of the drug. The influence of seven hydrophilic polymers, triethanolamine (TEA), and 1-methyl-2-pyrrolidone (NMP) on the apparent aqueous solubility of benznidazole, as well as on the formation of inclusion complexes, was also investigated. The interactions in solution were predicted and investigated using the phase-solubility diagram methodology, proton nuclear magnetic resonance (¹H NMR), and molecular modeling. Complexes were obtained in the solid phase by spray drying, and their physicochemical characterization included UV-Vis spectrophotometry, spectroscopy in the infrared region, scanning electron microscopy, X-ray diffraction, and drug dissolution tests on the different systems. An increase in the apparent aqueous solubility of the drug, with a linear (AL-type) diagram, was achieved in the presence of both cyclodextrins at different pH values. The hydrophilic polymers and 1-methyl-2-pyrrolidone contributed to the formation of inclusion complexes, whereas triethanolamine decreased the complex stability constant (Kc). The log-linear model applied to the solubility diagrams revealed that triethanolamine and 1-methyl-2-pyrrolidone both act as cosolvents, and 1-methyl-2-pyrrolidone also as a complexing agent. The best results were obtained with complexes involving 1-methyl-2-pyrrolidone and hydroxypropyl-beta-cyclodextrin, which increased benznidazole solubility by 27.9 and 9.4 times, respectively.
The effectiveness of the complexes was proven by dissolution tests, in which the ternary complexes and physical mixtures involving 1-methyl-2-pyrrolidone and both cyclodextrins showed the best results, demonstrating their potential as a novel pharmaceutical ingredient leading to increased benznidazole bioavailability.
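For an AL-type (linear) phase-solubility diagram, the 1:1 stability constant Kc mentioned above is conventionally obtained from the diagram's slope and the drug's intrinsic solubility via the Higuchi-Connors relation. A small sketch with made-up values (the S0 and slope below are illustrative, not the thesis's data):

```python
def stability_constant(slope, s0_molar):
    """Higuchi-Connors 1:1 stability constant (M^-1) for an A_L-type
    phase-solubility diagram: Kc = slope / (S0 * (1 - slope))."""
    if not 0.0 < slope < 1.0:
        raise ValueError("a 1:1 A_L-type diagram requires 0 < slope < 1")
    return slope / (s0_molar * (1.0 - slope))

# hypothetical: intrinsic solubility S0 = 0.8 mM and slope 0.2 give Kc = 312.5 M^-1
print(stability_constant(0.2, 8e-4))
```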
Abstract:
A foam was developed as a novel vehicle for streptokinase, with the purpose of increasing the contact time and area between the fibrinolytic and the target thrombus, which would lead to greater therapeutic efficacy at lower doses, decreasing the drug's potential to cause bleeding. Fibrinolytic foams were prepared using CO2 and human albumin (at different v:v ratios) as the gas and liquid phases, respectively; streptokinase at a low total dose (100,000 IU) was used as the fibrinolytic agent, conveyed in 1 mL of foam and in isotonic saline solution. The foams were characterized in terms of foam stability and apparent viscosity. The thrombolytic effect of the streptokinase foam was determined in vitro as thrombus lysis, and the results were compared to those of a fibrinolytic solution (prepared with the same dose of streptokinase) and of foam without the fibrinolytic. In the in vitro tests, fresh clots were weighed and placed in test tubes kept at 37 °C; all samples were injected intrathrombus using a multiperforated catheter. The results showed that both foam stability and apparent viscosity increased with the CO2:albumin solution ratio, and the 3:1 ratio was therefore used for the incorporation of streptokinase. The thrombus lysis results showed that the streptokinase foam presented the highest thrombolytic activity (44.78 ± 9.97%), compared to the streptokinase solution (32.07 ± 3.41%) and the foam without the drug (19.2 ± 7.19%). We conclude that the fibrinolytic foam produced a statistically significant enhancement of the lytic activity of streptokinase compared to the saline solution preparation, and may thus be a promising alternative in the treatment of thrombosis. In vivo studies are needed, however, to corroborate the results obtained in vitro.
Abstract:
This thesis results from research on the social representations of constructivism propagated on the Internet, whose basic premise is Moscovici's (1961) idea that the media play a determinant role in the popularization of scientific theories, in the formation and propagation of social representations, and in the construction of human behaviors. Understanding the Internet as the newest space for the circulation of social communication, and therefore a privileged (and still unexplored) field for studies of social representations, we chose this medium as the investigative field of our research: it constitutes the "great ocean of the new informational planet" (LÉVY, 2000, p. 126), besides making possible interaction with different forms of images, different individuals, and different 'worlds', configuring itself as an important space of symbolic production and for the analysis of representational processes in the contemporary world. Based on these questions, we set out to analyze the circulating discourse propagated online about constructivism, seeking to apprehend the social representations shared about it. This objective was constructed in view of the insertion of constructivist theory in education for more than two decades, as well as its consolidation as a reference in different social contexts, mainly in the media. The question that directed the research was: how does this theory manage to penetrate various layers of our society and influence the world readings and behaviors of different people? What modifications does it undergo along this trajectory, and what is the role of the Internet in this process? The corpus of the research consisted of the materials on the theory found on the Internet, in Portuguese-language pages on Google, in the period from 27 July 2004 to 17 August 2004.
The data were analyzed on the basis of the content-analysis design elaborated by Moscovici (1961) and on the theoretical assumptions of social communication, in particular of hypermedia. The results revealed that the Internet participates decisively in the process of popularization of constructivist theory, not only spreading its theoretical-methodological assumptions into the public domain (diffusion), but also propagating positive beliefs and images of its postulates (propagation) and putting them at the service of political and financial interests (propaganda). The popularization of constructivism on the Internet, and the social representations of the theory thus propagated, configure a process in which the theory passes from pedagogical theoretical framework to brand, assuming an extremely commercial character and the status of a solution for the problems of education and society, inciting attitudes and behaviors in the broadest possible public in ways that serve political and marketing interests. Recognizing the diversity, the permanent renewal of information, and the negotiation of meanings present on the Internet, as well as the innumerable forms of access to information it affords, we regard the results of the research as fruits of the characteristics inherent to this medium. At the same time, we call attention to the relativity of these results, to the plasticity of the knowledge and meanings propagated online, to the great challenge that hypermedia imposes on the study of communication phenomena and of social representations and on educational debates, and to the need for new research investigating the forms of socialization of information and communication afforded by hypermedia and their implications for the construction of symbolic universes and for social practices.
Abstract:
Optimization techniques known as metaheuristics have achieved success in the resolution of many problems classified as NP-hard. These methods use non-deterministic approaches that find very good solutions but do not guarantee the determination of the global optimum. Beyond the inherent difficulties related to the complexity of optimization problems, metaheuristics also face the exploration/exploitation dilemma, which consists of choosing between a greedy search and a wider exploration of the solution space. One way to guide such algorithms while searching for better solutions is to supply them with more knowledge of the problem through an intelligent agent able to recognize promising regions and to identify when the direction of the search should be diversified. This work therefore proposes the use of a reinforcement learning technique, the Q-learning algorithm, as an exploration/exploitation strategy for the GRASP (Greedy Randomized Adaptive Search Procedure) and Genetic Algorithm metaheuristics. The GRASP metaheuristic uses Q-learning instead of the traditional greedy-random algorithm in its construction phase. This replacement aims to improve the quality of the initial solutions used in GRASP's local search phase, and also provides the metaheuristic with an adaptive memory mechanism that allows the reuse of good previous decisions and avoids the repetition of bad ones. In the Genetic Algorithm, the Q-learning algorithm was used to generate an initial population of high fitness and, after a determined number of generations in which the diversity rate of the population falls below a certain limit L, it was also applied to supply one of the parents used by the genetic crossover operator. Another significant change in the hybrid genetic algorithm is the proposal of a mutually interactive cooperation process between the genetic operators and the Q-learning algorithm.
In this interactive/cooperative process, the Q-learning algorithm receives an additional update to its matrix of Q-values based on the current best solution of the Genetic Algorithm. The computational experiments presented in this thesis compare the results obtained with traditional implementations of the GRASP metaheuristic and the Genetic Algorithm against those obtained with the proposed hybrid methods. Both algorithms were applied successfully to the symmetric Traveling Salesman Problem, which was modeled as a Markov decision process.
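The construction-phase idea can be sketched with the standard Q-learning update, taking the state as the current city, the action as the next unvisited city, and the reward as the negative edge length. This is a minimal sketch of the concept for the symmetric TSP; the hyperparameters and structure are illustrative, not the thesis's implementation:

```python
import random

def q_learning_construction(dist, episodes=300, alpha=0.4, gamma=0.9, eps=0.2):
    """Learn Q-values over (current city, next city) pairs via
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)), r = -distance,
    with eps-greedy exploration over the unvisited cities."""
    n = len(dist)
    q = [[0.0] * n for _ in range(n)]
    for _ in range(episodes):
        cur = random.randrange(n)
        visited = {cur}
        while len(visited) < n:
            choices = [c for c in range(n) if c not in visited]
            if random.random() < eps:
                nxt = random.choice(choices)
            else:
                nxt = max(choices, key=lambda c: q[cur][c])
            future = [q[nxt][c] for c in range(n) if c not in visited and c != nxt]
            target = -dist[cur][nxt] + gamma * (max(future) if future else 0.0)
            q[cur][nxt] += alpha * (target - q[cur][nxt])
            visited.add(nxt)
            cur = nxt
    return q

def construct_tour(q, start=0):
    """Greedy decoding of the learned Q-table into an initial tour,
    which would then feed GRASP's local search phase."""
    tour = [start]
    while len(tour) < len(q):
        cur = tour[-1]
        tour.append(max((c for c in range(len(q)) if c not in tour),
                        key=lambda c: q[cur][c]))
    return tour

# toy symmetric instance
dist = [[0, 1, 4, 3], [1, 0, 2, 5], [4, 2, 0, 1], [3, 5, 1, 0]]
q = q_learning_construction(dist)
print(construct_tour(q))  # a permutation of all four cities starting at 0
```

Unlike a one-shot greedy-random constructor, the Q-table persists across GRASP iterations, which is what gives the hybrid its adaptive memory.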
Abstract:
We propose a new paradigm for collective learning in multi-agent systems (MAS) as a solution to the problem in which several agents acting on the same environment must learn to perform tasks simultaneously, based on feedback given by each of the other agents. We introduce the proposed paradigm in the form of a reinforcement learning algorithm named reinforcement learning with influence values. While learning from rewards, each agent evaluates the relation between its current state and/or the action executed in that state (its current belief) together with the reward obtained after all interacting agents perform their actions; the reward results from the interference of the others. The agent considers the opinions of all its colleagues when attempting to change the values of its states and/or actions. The idea is that the system as a whole must reach an equilibrium in which all agents are satisfied with the results obtained, meaning that the values of the state/action pairs match the reward obtained by each agent. This dynamic way of setting the values of states and/or actions makes this new reinforcement learning paradigm the first to include, naturally, the fact that the presence of other agents in the environment makes it dynamic. As a direct result, the internal states, actions, and rewards of all other agents are implicitly included in the internal state of each agent. This makes our proposal the first complete solution to the conceptual problem that arises when applying reinforcement learning to multi-agent systems, caused by the difference between the environment and agent models. Based on the proposed model, we created the IVQ-learning algorithm, which is exhaustively tested in repeated games with two, three, and four agents, and in stochastic games that require cooperation or collaboration.
The algorithm proves to be a good option for obtaining solutions that guarantee convergence to the optimal Nash equilibrium in cooperative problems. The experiments performed clearly show that the proposed paradigm is theoretically and experimentally superior to the traditional approaches. Moreover, the creation of this new paradigm enlarges the set of reinforcement learning applications in MAS: besides traditional learning problems such as the coordination of tasks in multi-robot systems, it becomes possible to apply reinforcement learning to problems that are essentially collaborative.
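The influence-value mechanism itself is specific to the thesis, but the baseline it extends, independent Q-learners that ignore each other's internal values, is easy to sketch in a two-agent repeated coordination game; IVQ-learning adds to this an influence term built from the other agents' evaluations. Everything below is an illustrative baseline, not the thesis's algorithm:

```python
import random

def independent_q_learners(payoff, episodes=3000, alpha=0.1, eps=0.1):
    """Two independent stateless Q-learners in a 2x2 repeated game with a
    common payoff; each updates only from its own action and the shared
    reward, with no influence term. All parameters are illustrative."""
    q1, q2 = [0.0, 0.0], [0.0, 0.0]
    for _ in range(episodes):
        a1 = random.randrange(2) if random.random() < eps else q1.index(max(q1))
        a2 = random.randrange(2) if random.random() < eps else q2.index(max(q2))
        r = payoff[a1][a2]                 # reward depends on BOTH actions
        q1[a1] += alpha * (r - q1[a1])     # stateless (one-shot game) update
        q2[a2] += alpha * (r - q2[a2])
    return q1.index(max(q1)), q2.index(max(q2))

# pure coordination game: matching actions pay 1, mismatching pay 0
print(independent_q_learners([[1, 0], [0, 1]]))
```

In this simple game the independent learners usually coordinate; the conceptual problem the thesis addresses is that each learner's environment is non-stationary because it contains the other learner, which is what the influence values are meant to capture.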
Abstract:
This work proposes a formulation for the layout optimization of 2D structures submitted to mechanical and thermal loads, and applies an h-adaptive filtering process that leads to low computational cost and high-definition structural layouts. The main goal of the formulation is to minimize the mass of the structure subjected to an effective von Mises stress constraint, with stability and side-constraint variants. A global measurement criterion was used to impose a parametric condition on the stress field. To avoid singularity problems, a relaxation of the stress constraint was considered. The optimization used a material approach in which the homogenized constitutive equation is a function of the relative material density, with the effective properties at intermediate densities represented by a SIMP-type artificial model. The problem was discretized by the Galerkin finite element method using triangles with a linear Lagrangian basis. The optimization problem was solved with the augmented Lagrangian method, which consists of solving a sequence of minimization problems with box constraints, each handled by a second-order projection method that uses a memoryless quasi-Newton algorithm. This process reduces computational cost and proves more effective and robust. The results materialize in more refined layouts, with accurate definition of the structure's topology and shape. On the other hand, the mass minimization formulation with a global stress criterion yields ready-to-model structural layouts at the price of violating the criterion of homogeneously distributed stress.
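The SIMP-type artificial model mentioned above interpolates the effective stiffness as a power law of the relative density, so that the penalization drives intermediate densities toward void or solid. A minimal sketch (the small stiffness floor eps is a common modified-SIMP variant; the thesis's exact artificial model may differ):

```python
def simp_modulus(rho, e0, p=3.0, eps=1e-3):
    """SIMP interpolation of effective stiffness from relative density:
    E(rho) = (eps + (1 - eps) * rho**p) * E0, with penalization p ~ 3
    and a small floor eps to keep the stiffness matrix non-singular."""
    return (eps + (1.0 - eps) * rho ** p) * e0

# a 50% dense element retains only ~12.6% of the solid stiffness at p = 3,
# which is why intermediate densities are uneconomical for the optimizer
print(round(simp_modulus(0.5, 210e9) / 210e9, 4))
```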
Abstract:
This work was motivated by the importance of studying vehicle emissions in captive diesel-engine fleets in conjunction with a predictive maintenance plan. This type of maintenance comprises techniques designed to meet the growing market demand to reduce maintenance costs by increasing the reliability of diagnoses, which has increased interest in automated predictive maintenance of diesel engines, preventing routine problems from evolving into serious situations that can be solved only with complex and costly repairs. Reliability-Centered Maintenance is the methodology through which this goal is pursued, in addition to keeping the vehicles regulated with respect to fuel consumption and emissions. To this end, technical improvements capable of penetrating the automotive market were estimated, relating the onshore fleet's smoke-opacity emission rates directly to the condition of the lubricating oil, thus contributing to reduced maintenance costs, to significantly lower pollutant emissions, and to better air quality in large cities.
This criterion was adopted and implemented in 241 buses and produced a diagnosis of possible failures through the correlation between the characterization of used lubricating oils and the opacity analysis, with the objective of aiding the detection and solution of failures in the maintenance of subsystems according to design criteria. For this, a deductive methodology to determine the potential causes of failures was automated in order to implement a predictive maintenance system. Our study used a mobile unit equipped with an opacimeter and a kit for the collection and analysis of lubricating oil; for the construction of the diagnostic network, a computer program was written on the Microsoft Office Access 2007 platform, an indispensable tool for creating the database. This method has been used and successfully implemented in seven (7) bus companies of the city of Natal (RN), Brazil.
Abstract:
The lubricants found on the market are of mineral or synthetic origin and are harmful to humans and to the environment, mainly because of their improper disposal. Industries are therefore seeking to develop products that cause less environmental impact; to decrease operator aggression in particular, cutting fluids became oil/water or water/oil emulsions. The emulsion, however, was not considered the most suitable solution to the environmental question, so the search for biodegradable, non-toxic lubricants continues, and vegetable oils are again seen as a basis for the production of lubricants. The biggest problem with these oils is their oxidative instability, which is intensified when working at high temperatures. The transesterification process decreases the oxidation but changes some physical and chemical properties. Soybean oil after the transesterification process was therefore subjected to tests of density, dynamic viscosity, kinematic viscosity (calculated from the two preceding parameters), flash point, and acidity. Besides the physico-chemical tests, the soybean oil was subjected to a dynamic test in a tribometer adapted from a bench vise, in which the induced wear was adhesive, and finally it was used as cutting fluid in a turning process on two different materials, SAE 1045 steel and cast iron. In some of these tests the oil performed below the mineral cutting fluid against which it was compared, while in others the results were satisfactory; chemical additives can therefore be added to the analyzed oil in an attempt to balance all the parameters and thus formulate a non-toxic biolubricant for application in the machining processes of the metalworking industry.
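Kinematic viscosity is derived from the two measured parameters as ν = μ/ρ; in the customary units (mPa·s for dynamic viscosity and g/cm³ for density) the ratio comes out directly in centistokes. A small illustration with made-up values, not measurements from the thesis:

```python
def kinematic_viscosity_cst(dynamic_mpas, density_g_cm3):
    """Kinematic viscosity in centistokes (cSt = mm^2/s) from dynamic
    viscosity in mPa·s (= cP) and density in g/cm^3: nu = mu / rho."""
    return dynamic_mpas / density_g_cm3

# e.g. a transesterified vegetable oil at ~8 mPa·s and 0.88 g/cm^3
print(round(kinematic_viscosity_cst(8.0, 0.88), 2))  # → 9.09
```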
Abstract:
Copper is one of the metals most used in the plating processes of galvanic industries. The presence of copper, a heavy metal, in galvanic effluents is harmful to the environment. The main objective of this research was the removal of copper from galvanic effluents using anionic surfactants. The removal process is based on the interaction between the polar head group of the anionic surfactant and the divalent copper in solution. The surfactants used in this study were derived from soybean oil (OSS), coconut oil (OCS), and sunflower oil (OGS). A synthetic copper solution (280 ppm Cu2+) was used to simulate the rinse water from the acid copper bath of a galvanic industry. 2³ and 3² factorial designs were developed to evaluate the parameters that influence the removal process. For each surfactant (OSS, OCS, and OGS), the independent variables evaluated were: surfactant concentration (1.25 to 3.75 g/L), pH (5 to 9), and the presence of an anionic polymer (0 to 0.0125 g/L). From the results obtained in the 2³ factorial design and from the calculated estimate of the stoichiometric relationship between surfactants and copper in solution, new experimental tests were developed, varying the surfactant concentration in the range of 1.25 to 6.8 g/L (3² factorial design). The results of the experimental designs were subjected to statistical evaluation to obtain Pareto charts and mathematical models for the copper removal efficiency (%). The statistical evaluation of the 2³ and 3² factorial designs using saponified coconut oil (OCS) produced the mathematical model that best described the copper removal process. It can be concluded that OCS was the most efficient anionic surfactant, removing 100% of the copper present in the synthetic galvanic solution.
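In a 2³ full factorial design, each factor's main effect is the mean response at its high level minus the mean at its low level over the eight runs. A sketch with invented removal efficiencies (the response values are ours, not the thesis's data):

```python
from itertools import product

def main_effects(responses):
    """Main effects of a 2^3 full factorial design. Runs are enumerated
    in itertools.product order over coded levels (-1, +1) for factors
    (A, B, C); effect_f = mean(y | f = +1) - mean(y | f = -1)."""
    runs = list(product((-1, 1), repeat=3))  # 8 runs, first factor slowest
    effects = []
    for f in range(3):
        hi = [y for lv, y in zip(runs, responses) if lv[f] == 1]
        lo = [y for lv, y in zip(runs, responses) if lv[f] == -1]
        effects.append(sum(hi) / 4 - sum(lo) / 4)
    return effects

# hypothetical removal efficiencies (%) for the 8 runs
print(main_effects([60, 70, 62, 75, 58, 68, 61, 74]))  # → [-1.5, 4.0, 11.5]
```

With real data, the effects ranked by absolute value are what a Pareto chart of the design displays.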
Abstract:
In heavy oil fields, the oil flows from the reservoir to the well with great difficulty, making its production harder and more expensive. Most of the original volumes of oil found in the world are considered unrecoverable by current methods. The injection of micellar solutions acts directly on the interfacial properties of the oil, resulting in enhanced oil recovery. The objective of this research was the study and selection of micellar solutions able to decrease the interfacial interactions between the fluids and the reservoir formation, increasing oil production. The selected micellar solutions were obtained using commercial surfactants and surfactants synthesized in the laboratory, chosen on the basis of the intrinsic properties of these molecules for use in enhanced oil recovery. Petroleum reservoirs were simulated using sandstone plugs from the Botucatu formation, and experiments with both conventional and enhanced oil recovery techniques were carried out. The results showed that all micellar solutions were able to enhance oil recovery, and the micellar solution prepared with the SB anionic surfactant in 2% KCl solution showed the best recovery factor. An economic analysis was also carried out for the SB surfactant solution. With the injection of 20% of a porous volume of micellar solution, followed by brine injection, the increment in petroleum recovery can reach a recovery factor of 81% by the third porous volume injected. The increment in total cost from adding surfactant to the injection water represents R$ 7.50 per ton of injected fluid.
Abstract:
This dissertation presents a methodology for the optimization of a building cold-water distribution system. It is a case study applied to the Tropical Búzios Residential Condominium, located at Búzios Beach in the municipality of Nísia Floresta, on the east coast of the state of Rio Grande do Norte, twenty kilometers from Natal. Designing cold-water distribution networks according to standard NBR 5626 of the ABNT (Brazilian Association of Technical Standards) does not guarantee that the solution found is the optimal, least-cost one; an optimization methodology is needed that supplies, among all the possible solutions, the minimum-cost solution. In the optimization of the water distribution system of the Tropical Búzios Condominium, the Granados method is used: an iterative optimization algorithm, based on dynamic programming, that supplies the minimum-cost network as a function of the piezometric head of the reservoir. For the application of this method to branched networks, a computer program in the C language is used. The process is divided into two stages: obtaining a previous solution and reducing the piezometric head at the source. In obtaining the previous solution, the smallest possible diameters that respect the maximum velocity limit and the minimum pressure requirements are used, and the source head is raised to guarantee these requirements. In the second stage of the Granados method, an iterative process gradually reduces the source head by considering the substitution of stretches of network pipe with the subsequent diameters, at a minimum increase in the network cost; the diameter change is made in the optimal stretch, the one presenting the smallest Exchange Gradient. The process terminates when the desired source head is reached.
The material costs of the optimized network are calculated and analyzed by comparison with the costs of the conventional network.
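The two-stage logic (start from the smallest feasible diameters, then upgrade one stretch at a time, always the one with the smallest Exchange Gradient, until the target source head is reached) can be sketched for a single series pipeline. This is a simplified illustration with linear per-metre head loss and invented diameter costs, not the thesis's C program for branched networks:

```python
def granados_sketch(lengths, options, target_head):
    """Simplified exchange-gradient iteration for a series pipeline.
    `options` is a list of (unit_cost, unit_headloss) per diameter,
    ordered so that cost rises and head loss falls. Every stretch starts
    at the cheapest diameter; while the total head loss exceeds
    `target_head`, enlarge the stretch whose upgrade has the smallest
    Exchange Gradient (extra cost per metre of head recovered)."""
    choice = [0] * len(lengths)

    def head():
        return sum(L * options[c][1] for L, c in zip(lengths, choice))

    while head() > target_head:
        best_i, best_grad = None, float("inf")
        for i, c in enumerate(choice):
            if c + 1 < len(options):
                dcost = lengths[i] * (options[c + 1][0] - options[c][0])
                dhead = lengths[i] * (options[c][1] - options[c + 1][1])
                if dhead > 0 and dcost / dhead < best_grad:
                    best_i, best_grad = i, dcost / dhead
        if best_i is None:
            break  # every stretch is already at the largest diameter
        choice[best_i] += 1
    cost = sum(L * options[c][0] for L, c in zip(lengths, choice))
    return choice, head(), cost

# two stretches (100 m and 50 m), three diameters: (cost per m, head loss m/m)
options = [(10, 0.05), (15, 0.03), (22, 0.02)]
print(granados_sketch([100, 50], options, target_head=5.0))
```

Each upgrade buys the cheapest available head reduction, which is why the iteration stops at a near-minimum-cost network for the requested source head.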