937 results for ISE and ITSE optimization
Abstract:
Over the last few years, the data center market has grown exponentially, and this trend continues today. As a direct consequence, the industry is pushing the development and implementation of new technologies to improve the energy efficiency of data centers. An adaptive dashboard allows the user to monitor the most important parameters of a data center in real time. For that reason, monitoring companies work with IoT big-data filtering tools and cloud computing systems to handle the volumes of data obtained from the sensors placed in a data center. Analyzing market trends in this field, we can affirm that the study of predictive algorithms has become an essential area for competitive IT companies. Complex algorithms are used to forecast risk situations from historical data and warn the user in case of danger. Considering that several different users will interact with this dashboard, from IT experts or maintenance staff to accounting managers, it is vital to personalize it automatically. Following that line of thought, the dashboard should show only the metrics relevant to each user, in formats such as overlaid maps or representative graphs, among others. These maps show all the information needed in a visual, easy-to-evaluate way. To sum up, this dashboard allows the user to visualize and control a wide range of variables. Monitoring essential factors such as average temperature, gradients, or hotspots, as well as energy and power consumption and savings by rack or building, allows clients to understand how their equipment is behaving, helping them to optimize the energy consumption and efficiency of the racks. It also helps them to prevent possible damage to the equipment through predictive high-tech algorithms.
Abstract:
In this paper a consistent analysis of reinforced concrete (RC) two-dimensional (2-D) structures, namely slab structures subjected to in-plane and out-of-plane forces, is presented. With this method of analysis, the well-established methodology for dimensioning and verifying RC sections of beam structures is extended to 2-D structures. The validity of the proposed analysis is checked by comparing its results with published experimental test results. Several examples illustrate features of the proposed analysis, such as the influence of the reinforcement layout on the service and ultimate behavior of a slab structure and the non-straightforward problem of optimal dimensioning at a slab point subjected to several loading cases. These examples also discuss the method's application to design situations such as multiple steel families and non-orthogonal reinforcement layouts.
Abstract:
We develop a heuristic model for chaperonin-facilitated protein folding, the iterative annealing mechanism, based on theoretical descriptions of "rugged" conformational free energy landscapes for protein folding, and on experimental evidence that (i) folding proceeds by a nucleation mechanism whereby correct and incorrect nucleation lead to fast and slow folding kinetics, respectively, and (ii) chaperonins optimize the rate and yield of protein folding by an active ATP-dependent process. The chaperonins GroEL and GroES catalyze the folding of ribulose bisphosphate carboxylase at a rate proportional to the GroEL concentration. Kinetically trapped folding-incompetent conformers of ribulose bisphosphate carboxylase are converted to the native state in a reaction involving multiple rounds of quantized ATP hydrolysis by GroEL. We propose that chaperonins optimize protein folding by an iterative annealing mechanism; they repeatedly bind kinetically trapped conformers, randomly disrupt their structure, and release them in less folded states, allowing substrate proteins multiple opportunities to find pathways leading to the most thermodynamically stable state. By this mechanism, chaperonins greatly expand the range of environmental conditions in which folding to the native state is possible. We suggest that the development of this device for optimizing protein folding was an early and significant evolutionary event.
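The compounding effect of iterative annealing can be caricatured in a few lines: if each ATP-driven chaperonin round gives a kinetically trapped conformer a fresh, independent chance to reach the native state, the native yield grows with the number of rounds. This is a toy sketch with an assumed per-round success probability, not a model from the paper:

```python
def iterative_annealing_yield(p_native=0.05, rounds=20):
    """Fraction of protein in the native state after repeated rounds of
    chaperonin binding, random disruption, and release, assuming each
    ATP-driven round gives a trapped conformer an independent chance
    p_native of finding a productive folding pathway (illustrative numbers)."""
    trapped = 1.0
    for _ in range(rounds):
        trapped *= 1.0 - p_native  # conformers surviving this round stay trapped
    return 1.0 - trapped

# Yield grows monotonically with the number of rounds of quantized ATP hydrolysis:
print(round(iterative_annealing_yield(rounds=1), 3))
print(round(iterative_annealing_yield(rounds=20), 3))
```

Even a small per-round success probability accumulates: the trapped fraction decays geometrically, which is why multiple rounds of binding and release expand the conditions under which folding succeeds.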
Abstract:
Small, single-module proteins that fold in a single cooperative step may be paradigms for understanding early events in protein-folding pathways generally. Recent experimental studies of the 64-residue chymotrypsin inhibitor 2 (CI2) support a nucleation mechanism for folding, as do some computer simulations. CI2 has a nucleation site that develops only in the transition state for folding. The nucleus is composed of a set of adjacent residues (an alpha-helix), stabilized by long-range interactions that are formed as the rest of the protein collapses around it. A simple analysis of the optimization of the rate of protein folding predicts that rates are highest when the denatured state has little residual structure under physiological conditions and no intermediates accumulate. This implies that any potential nucleation site that is composed mainly of adjacent residues should be just weakly populated in the denatured state and become structured only in a high-energy intermediate or transition state when it is stabilized by interactions elsewhere in the protein. Hierarchical mechanisms of folding in which stable elements of structure accrete are unfavorable. The nucleation-condensation mechanism of CI2 fulfills the criteria for fast folding. On the other hand, stable intermediates do form in the folding of more complex proteins, and this may be an unavoidable consequence of increasing size and nucleation at more than one site.
Abstract:
It has become clear that many organisms possess the ability to regulate their mutation rate in response to environmental conditions. So the question of finding an optimal mutation rate must be replaced by that of finding an optimal mutation schedule. We show that this task cannot be accomplished with standard population-dynamic models. We then develop a "hybrid" model for populations experiencing time-dependent mutation that treats population growth as deterministic but the time of first appearance of new variants as stochastic. We show that the hybrid model agrees well with a Monte Carlo simulation. From this model, we derive a deterministic approximation, a "threshold" model, that is similar to standard population dynamic models but differs in the initial rate of generation of new mutants. We use these techniques to model antibody affinity maturation by somatic hypermutation. We had previously shown that the optimal mutation schedule for the deterministic threshold model is phasic, with periods of mutation between intervals of mutation-free growth. To establish the validity of this schedule, we now show that the phasic schedule that optimizes the deterministic threshold model significantly improves upon the best constant-rate schedule for the hybrid and Monte Carlo models.
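The hybrid treatment described above, deterministic population growth but stochastic first appearance of new variants, can be sketched in a few lines. The exponential growth law, rates, and time step below are illustrative assumptions, not the paper's parameter values:

```python
import math
import random

def first_mutant_time(mu, growth_rate=1.0, n0=1.0, dt=0.01, t_max=20.0, rng=random):
    """One Monte Carlo draw of the time the first new variant appears.

    Population size grows deterministically, N(t) = n0 * exp(growth_rate * t),
    while mutants arise stochastically as an inhomogeneous Poisson process
    with intensity mu * N(t) -- the 'hybrid' treatment."""
    threshold = -math.log(rng.random())  # Exp(1) hazard threshold
    t, hazard = 0.0, 0.0
    while t < t_max:
        hazard += mu * n0 * math.exp(growth_rate * t) * dt
        if hazard >= threshold:
            return t
        t += dt
    return t_max  # no mutant appeared within the horizon

random.seed(0)
times = [first_mutant_time(mu=1e-3) for _ in range(2000)]
mean_t = sum(times) / len(times)
print(f"mean first-appearance time ~ {mean_t:.2f}")
```

Averaging many such draws recovers a smooth first-appearance distribution, which is what the deterministic "threshold" approximation replaces with a single cutoff.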
Abstract:
Increasing economic competition drives industry to implement tools that improve process efficiency. Process automation is one of these tools, and Real-Time Optimization (RTO) is an automation methodology that considers economic aspects to update the process control in accordance with market prices and disturbances. Basically, RTO uses a steady-state phenomenological model to predict the process behavior and then optimizes an economic objective function subject to this model. Although largely implemented in industry, there is no general agreement about the benefits of implementing RTO, due to some limitations discussed in the present work: structural plant/model mismatch, identifiability issues, and low frequency of set-point updates. Some alternative RTO approaches have been proposed in the literature to handle the problem of structural plant/model mismatch. However, there has been no systematic comparison evaluating the scope and limitations of these RTO approaches under different aspects. For this reason, the classical two-step method is compared to more recent derivative-based methods (Modifier Adaptation; Integrated System Optimization and Parameter Estimation; and Sufficient Conditions of Feasibility and Optimality) using a Monte Carlo methodology. The results of this comparison show that the classical RTO method is consistent, provided that the model is flexible enough to represent the process topology, that the parameter estimation method is appropriate for the measurement noise characteristics, and that a method to improve the quality of the sample information is used. At each iteration, the RTO methodology updates some key parameters of the model; here it is possible to observe identifiability issues caused by lack of measurements and by measurement noise, resulting in poor prediction ability.
Therefore, four different parameter estimation approaches (Rotational Discrimination; Automatic Selection and Parameter Estimation; Reparametrization via Differential Geometry; and classical nonlinear Least Squares) are evaluated with respect to their prediction accuracy, robustness, and speed. The results show that the Rotational Discrimination method is the most suitable for implementation in an RTO framework, since it requires less a priori information, is simple to implement, and avoids the overfitting caused by the Least Squares method. The third RTO drawback discussed in the present thesis is the low frequency of set-point updates, which increases the period in which the process operates at suboptimal conditions. An alternative to handle this problem is proposed in this thesis, integrating classical RTO and Self-Optimizing Control (SOC) using a new Model Predictive Control strategy. The new approach demonstrates that it is possible to reduce the problem of low-frequency set-point updates, improving the economic performance. Finally, the practical aspects of RTO implementation are explored in an industrial case study: a Vapor Recompression Distillation (VRD) process located at the Paulínia refinery of Petrobras. The conclusions of this study suggest that the model parameters are successfully estimated by the Rotational Discrimination method; that RTO is able to improve the process profit by about 3%, equivalent to 2 million dollars per year; and that the integration of SOC and RTO may be an interesting control alternative for the VRD process.
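Both halves of the classical two-step idea, re-estimate a model parameter from plant measurements and then re-optimize the updated model, and its structural-mismatch limitation can be seen in a toy loop. The "plant", the mismatched model, and all numbers below are invented for illustration; they are not from the thesis or the VRD case study:

```python
def plant(u):
    # "True" plant profit, unknown to the model; optimum at u = 2
    return 4.0 * u - u ** 2

def model(u, k):
    # Structurally mismatched model: profit = k*u - 0.8*u^2
    return k * u - 0.8 * u ** 2

def rto_two_step(u0=0.5, iterations=10):
    """Classical two-step RTO: measure, fit, optimize, repeat."""
    u = u0
    for _ in range(iterations):
        y = plant(u)                   # step 1: measure plant profit at u
        k = (y + 0.8 * u ** 2) / u     # fit k so model(u, k) == y
        u = k / 1.6                    # step 2: model optimum, d(model)/du = k - 1.6u = 0
    return u

u_star = rto_two_step()
print(f"converged set point = {u_star:.4f}  (true plant optimum at 2.0)")
```

The loop converges (to u = 20/9, about 2.22) but not to the true plant optimum at u = 2: with structural plant/model mismatch, fitting the parameter cannot remove the offset, which is exactly the limitation motivating the derivative-based alternatives compared above.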
Abstract:
This article continues the investigation of stationarity and regularity properties of infinite collections of sets in a Banach space started in Kruger and López (J. Optim. Theory Appl. 154(2), 2012), and is mainly focused on the application of the stationarity criteria to infinitely constrained optimization problems. We consider several settings of optimization problems which involve (explicitly or implicitly) infinite collections of sets and deduce for them necessary conditions characterizing stationarity in terms of dual space elements—normals and/or subdifferentials.
Abstract:
We present a derivative-free optimization algorithm coupled with a chemical process simulator for the optimal design of individual and complex distillation processes using a rigorous tray-by-tray model. The proposed approach serves as an alternative tool to the various models based on nonlinear programming (NLP) or mixed-integer nonlinear programming (MINLP). This is accomplished by combining the advantages of using a commercial process simulator (Aspen Hysys), including especially suited numerical methods developed for the convergence of distillation columns, with the benefits of the particle swarm optimization (PSO) metaheuristic algorithm, which does not require gradient information and has the ability to escape from local optima. Our method inherits the superstructure developed in Yeomans, H.; Grossmann, I. E. Optimal design of complex distillation columns using rigorous tray-by-tray disjunctive programming models. Ind. Eng. Chem. Res. 2000, 39 (11), 4326–4335, in which the nonexisting trays are considered as simple bypasses of liquid and vapor flows. The implemented tool provides the optimal configuration of distillation column systems, which includes continuous and discrete variables, through the minimization of the total annual cost (TAC). The robustness and flexibility of the method are proven through the successful design and synthesis of three distillation systems of increasing complexity.
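A minimal PSO loop of the kind coupled to the simulator is easy to sketch. The toy quadratic objective below merely stands in for the simulator-evaluated TAC; in the actual framework each evaluation would converge an Aspen Hysys flowsheet instead. The inertia and acceleration constants are common defaults, not the paper's settings:

```python
import random

def pso(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (gradient-free), minimizing f."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clip to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in for a TAC surface (hypothetical; a real run would call the simulator):
best, val = pso(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2 + 2.0,
                bounds=[(-5, 5), (-5, 5)])
print(best, val)
```

Because the update uses only objective values, the same loop works whether `f` is an algebraic function or a black-box flowsheet evaluation, which is the point of pairing PSO with a process simulator.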
Abstract:
This paper introduces a new optimization model for the simultaneous synthesis of heat and work exchange networks. The work integration is performed in the work exchange network (WEN), while the heat integration is carried out in the heat exchanger network (HEN). In the WEN synthesis, streams at high pressure (HP) and low pressure (LP) are subjected to pressure-manipulation stages via turbines and compressors running on common shafts or as stand-alone equipment. The model allows the use of several single-shaft turbine-compressor (SSTC) units, as well as helper motors and generators to respond to any shortage or excess of energy, respectively, on the SSTC axes. The heat integration of the streams occurs in the HEN between the WEN stages. Thus, as the inlet and outlet stream temperatures in the HEN depend on the WEN design, they must be treated as optimization variables. The proposed multi-stage superstructure is formulated as a mixed-integer nonlinear programming (MINLP) problem that minimizes the total annualized cost, composed of capital and operational expenses. A case study is conducted to verify the accuracy of the proposed approach. The results indicate that heat integration between the WEN stages is essential to enhance the work integration and to reduce the total process cost, owing to the need for smaller amounts of hot and cold utilities.
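The role of the helper motors and generators on each SSTC axis is a simple power balance, sketched below with illustrative duties (not values from the paper):

```python
def sstc_axis_balance(turbine_work, compressor_work):
    """Power balance on one single-shaft turbine-compressor (SSTC) axis:
    the turbine drives the compressor; a helper motor covers any power
    shortage and a generator absorbs any excess. Duties in kW, illustrative."""
    net = turbine_work - compressor_work
    if net >= 0.0:
        return 0.0, net   # (motor duty, generator duty): excess goes to the generator
    return -net, 0.0      # shortage is supplied by the helper motor

print(sstc_axis_balance(500.0, 420.0))  # excess: generator absorbs 80 kW
print(sstc_axis_balance(300.0, 450.0))  # shortage: motor supplies 150 kW
```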
Abstract:
Biopolymers do not have competitive prices, which has so far prevented their industrial exploitation on a global scale. In this context, improvements in certain biopolymer properties, mainly mechanical and thermal, have been achieved using nanoclays. However, research has been much less focused on changing optical properties through the incorporation of nanoclays. At the same time, current research has focused on obtaining nanopigments by adsorbing organic dyes onto different nanoclays in order to achieve sustainable colouring and high-performance materials. By combining advances in these lines of research, biodegradable composites with optimal mechanical and optical properties can be obtained. The aim of this work is to find the optimal formulation of naturally sourced nanopigments, incorporate them into an epoxy resin of biological origin, and obtain a significant improvement in its mechanical and optical properties. We combine three structural modifiers in the nanopigment synthesis: a surfactant, a silane, and a mordant salt. The latter was selected to replicate mordant textile dyeing with natural dyes. Using a Taguchi L8 design, we investigate the effect of the presence of the modifiers, of pH acidification, and of the interactions between the synthesis factors. Three natural dyes were selected: chlorophyll, beta-carotene, and beetroot extract. Furthermore, we used two kinds of laminar nanoclays, differentiated by their ion-exchange charge: montmorillonite and hydrotalcite. The thermal, mechanical, and colorimetric characterization of the bionanocomposite materials was then carried out. The optimal conditions for obtaining the best bionanocomposite materials are an acid pH and modification of the nanoclays with mordant and surfactant.
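A Taguchi L8 screening of this kind can be sketched directly. The array below is the standard L8 (2^7) two-level orthogonal array; the factor-to-column assignments and the response values are hypothetical stand-ins, not the measured colour data from this work:

```python
# Standard L8 (2^7) orthogonal array, levels coded 0/1; each column is
# balanced (four runs per level) and orthogonal to every other column.
L8 = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

def main_effect(responses, column):
    """Mean response at level 1 minus mean response at level 0 for one column."""
    lo = [r for row, r in zip(L8, responses) if row[column] == 0]
    hi = [r for row, r in zip(L8, responses) if row[column] == 1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Hypothetical responses (e.g. a colour-strength measure) for the eight runs:
y = [12.1, 14.3, 15.0, 16.8, 11.9, 13.8, 15.2, 16.5]
for name, col in [("surfactant", 0), ("silane", 1), ("mordant", 2)]:
    print(name, round(main_effect(y, col), 2))
```

With only eight runs, the balanced columns let each main effect be estimated as a simple difference of means, which is what makes the L8 attractive for screening three modifiers plus pH and selected interactions.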
Abstract:
Many models proposed in the literature over the last decades aim at assessing the reliability, availability and maintainability (RAM) of safety equipment, many of them focused on assessing the risk level of a technological system or on searching for appropriate design and/or surveillance and maintenance policies that assure an optimum level of RAM of safety systems throughout the plant's operational life. This paper proposes a new approach to RAM modelling that accounts, in an integrated manner, for equipment ageing and for the maintenance and testing effectiveness of equipment consisting of multiple items. This model is then used to perform the simultaneous optimization of testing and maintenance for ageing equipment consisting of multiple items. An example of application is provided, which considers a simplified High Pressure Injection System (HPIS) of a typical Pressurized Water Reactor (PWR). Basically, this system consists of motor-driven pumps (MDP) and motor-operated valves (MOV), where both types of components consist of two items each. These components present different failure modes, causes, and behaviours, and they also undergo complex test and maintenance activities depending on the item involved. The results of the example of application demonstrate that the optimization algorithm provides the best solutions when the optimization problem is formulated and solved considering full flexibility in the implementation of the testing and maintenance activities that take part in such an integrated RAM model.
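The core trade-off in testing/maintenance optimization, test too rarely and undetected failures accumulate with ageing, test too often and the test downtime itself dominates, can be sketched for a single periodically tested standby component. The linear-ageing unavailability model and all rates below are textbook-style assumptions, far simpler than the integrated multi-item RAM model of the paper:

```python
def mean_unavailability(test_interval, failure_rate, ageing, test_downtime):
    """Average unavailability of a periodically tested standby component.

    Linear ageing within a test interval T: lambda(t) = failure_rate + ageing*t,
    so the time-averaged undetected-failure probability over [0, T] is
    failure_rate*T/2 + ageing*T**2/6; the test itself removes the component
    from service for test_downtime hours each interval. Illustrative model."""
    T = test_interval
    avg_failure = failure_rate * T / 2 + ageing * T ** 2 / 6
    return avg_failure + test_downtime / T

# Crude scan (hypothetical rates, hours) for the interval minimizing unavailability:
best_T = min(range(24, 8761, 24),
             key=lambda T: mean_unavailability(T, 1e-5, 1e-9, 2.0))
print(best_T, mean_unavailability(best_T, 1e-5, 1e-9, 2.0))
```

The scan exhibits the U-shaped curve the paper's optimization exploits; the integrated model additionally couples such curves across items, failure modes, and maintenance effectiveness.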