7 results for Integrated operation and maintenance

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 100.00%

Publisher:

Abstract:

This article describes a real-world production planning and scheduling problem occurring at an integrated pulp and paper (P&P) mill which manufactures paper for cardboard out of produced pulp. During the cooking of wood chips in the digester, two by-products are produced: the pulp itself (virgin fibers) and the waste stream known as black liquor. The former is then mixed with recycled fibers and processed in a paper machine; here, due to significant sequence-dependent setups in paper-type changeovers, lot sizing and sequencing must be decided simultaneously in order to use capacity efficiently. The latter is converted into electrical energy using a set of evaporators, recovery boilers and counter-pressure turbines. The planning challenge is then to synchronize the material flow as it moves through the pulp mill, the paper mill and the energy plant, maximizing satisfied customer demand (backlogging is allowed) while minimizing operation costs. Because P&P production is capital intensive, the output of the digester must be maximized. As the production bottleneck is not fixed, we propose a new model that, for the first time, integrates the critical production units of the pulp mill, the paper mill and the energy plant. Simple stochastic mixed-integer-programming-based local search heuristics are developed to obtain good feasible solutions for the problem. The benefits of integrating the three stages are discussed, and the proposed approaches are tested on real-world data. Our work may help P&P companies increase their competitiveness and responsiveness in dealing with demand fluctuations. (C) 2012 Elsevier Ltd. All rights reserved.
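The lot-sizing core of such a problem can be written as a small mixed integer program. The sketch below, using the PuLP modeling library, is only a minimal illustration under assumed data (two paper grades, four periods, invented capacities and costs); it omits the sequence-dependent setups, the digester and energy-plant stages, and the MIP-based local search heuristics that the article actually develops.

```python
# Minimal lot-sizing sketch for the paper machine, assuming invented data.
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

papers = ["P1", "P2"]                      # paper grades (illustrative)
periods = list(range(4))                   # planning periods
demand = {(p, t): d for p, d in [("P1", 40), ("P2", 25)] for t in periods}
capacity = 120                             # paper-machine capacity per period (tons)
setup_cost, hold_cost, backlog_cost = 500, 2, 10

m = LpProblem("paper_machine_lot_sizing", LpMinimize)
x = LpVariable.dicts("prod", (papers, periods), lowBound=0)      # tons produced
y = LpVariable.dicts("setup", (papers, periods), cat=LpBinary)   # grade set up?
inv = LpVariable.dicts("inv", (papers, periods), lowBound=0)     # inventory carried
back = LpVariable.dicts("back", (papers, periods), lowBound=0)   # backlogged demand

# minimize setup, holding and backlogging costs
m += lpSum(setup_cost * y[p][t] + hold_cost * inv[p][t] + backlog_cost * back[p][t]
           for p in papers for t in periods)

for p in papers:
    for t in periods:
        prev_inv = inv[p][t - 1] if t > 0 else 0
        prev_back = back[p][t - 1] if t > 0 else 0
        # inventory balance with backlogging allowed
        m += prev_inv - prev_back + x[p][t] - demand[p, t] == inv[p][t] - back[p][t]
        m += x[p][t] <= capacity * y[p][t]            # production requires a setup
for t in periods:
    m += lpSum(x[p][t] for p in papers) <= capacity   # shared machine capacity

m.solve()
print("total cost:", m.objective.value())
```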

Relevance: 100.00%

Publisher:

Abstract:

Morphometric methods permit identification of insect species and are an aid to taxonomy. Quantitative wing traits were used to identify male euglossine bees. Landmark-based and outline-based methods have mostly been used independently; here, we combine the two methods using five Euglossa species. Landmark-based methods correctly classified 84% of samples and outline-based methods 77%, whereas an integrated analysis correctly classified 91%. Some species presented significantly high reclassification percentages when only the wing cell contour was considered, and specimens with damaged wings were also correctly identified using this methodology.
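A minimal sketch of the integration idea, in Python with scikit-learn: landmark coordinates and outline (e.g. elliptic Fourier) descriptors are simply concatenated into one feature matrix before discriminant analysis. The arrays below are random stand-ins, not real wing data, so the printed percentages will not match those in the abstract.

```python
# Combine landmark and outline descriptors, then classify with LDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_specimens = 150
species = rng.integers(0, 5, n_specimens)           # 5 species labels (stand-ins)
landmarks = rng.normal(size=(n_specimens, 2 * 12))  # x,y of 12 aligned landmarks
outline = rng.normal(size=(n_specimens, 4 * 10))    # 10 elliptic Fourier harmonics

lda = LinearDiscriminantAnalysis()
for name, X in [("landmarks only", landmarks),
                ("outline only", outline),
                ("integrated", np.hstack([landmarks, outline]))]:
    acc = cross_val_score(lda, X, species, cv=5).mean()
    print(f"{name}: {acc:.0%} correctly classified")
```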

Relevance: 100.00%

Publisher:

Abstract:

Introduction: Many experimental models using lung lavage have been developed for the study of acute respiratory distress syndrome (ARDS). The original technique has been modified by many authors, resulting in difficulties with reproducibility. Insufficient detail is reported on the lung injury models used, including hemodynamic stability during animal preparation and drawbacks encountered, such as mortality. The authors studied the effects of pulmonary recruitment and of the use of a fixed tidal volume (Vt) or a fixed inspiratory pressure on the establishment of the experimental ARDS model. Methods: Adult rabbits were submitted to repeated lung lavages with 30 ml/kg of warm saline until the ARDS definition (PaO2/FiO2 <= 100) was reached. The animals were divided into three groups according to the mechanical ventilation technique used: 1) fixed Vt of 10 ml/kg; 2) fixed inspiratory pressure (IP), set to deliver a tidal volume of 10 ml/kg prior to the first lung lavage; and 3) fixed Vt of 10 ml/kg with pulmonary recruitment before the first lavage. Results: The use of alveolar recruitment maneuvers and the use of a fixed Vt or IP between lung lavages did not change the number of lung lavages necessary to obtain the experimental model of ARDS, nor the hemodynamic stability of the animals during the procedure. A trend toward an increased mortality rate was observed with the recruitment maneuver and with the use of a fixed IP. Discussion: There were no differences between the three study groups: neither the lung recruitment maneuver nor the choice between fixed tidal volume and fixed inspiratory pressure affected the number of lung lavages necessary to obtain the ARDS animal model. Furthermore, the three procedures resulted in good hemodynamic stability of the animals and a low mortality rate. (C) 2012 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Publisher:

Abstract:

To understand the regulatory dynamics of transcription factors (TFs) and their interplay with other cellular components, we integrated the transcriptional, protein-protein and allosteric (or equivalent) interactions that mediate the physiological activity of TFs in Escherichia coli. To study this integrated network we computed a set of network measurements followed by principal component analysis (PCA), investigated the correlations between network structure and dynamics, and carried out motif detection. In particular, we show that outliers identified in the integrated network on the basis of their network properties correspond to previously characterized global transcriptional regulators. Furthermore, outliers are highly and widely expressed across conditions, supporting their global role in controlling many genes in the cell. Motif analysis revealed that TFs not only interact physically with each other but also receive feedback from signals delivered by signaling proteins, supporting extensive cross-talk between the different types of networks. Our analysis can lead to a general framework for detecting and understanding global regulatory factors in regulatory networks, and it reinforces the importance of integrating multiple types of interactions to uncover the interrelationships between them.
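The node-level part of such a pipeline (network measures, then PCA, then outlier detection) can be illustrated in a few lines of Python with NetworkX and scikit-learn. The random graph below merely stands in for the integrated E. coli network, and the particular measures chosen are an assumption, not the paper's exact feature set.

```python
# Compute node measures on a toy graph, project with PCA, flag outliers.
import networkx as nx
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

g = nx.gnm_random_graph(200, 600, seed=1)   # stand-in for the integrated network

measures = np.column_stack([
    [d for _, d in g.degree()],                    # degree
    list(nx.betweenness_centrality(g).values()),   # betweenness
    list(nx.closeness_centrality(g).values()),     # closeness
    list(nx.clustering(g).values()),               # clustering coefficient
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(measures))
dist = np.linalg.norm(scores, axis=1)
outliers = np.argsort(dist)[-5:]   # nodes farthest from the origin in PC space
print("candidate global-regulator-like nodes:", outliers)
```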

Relevance: 100.00%

Publisher:

Abstract:

In this paper, the effects of uncertainty and of expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem; with RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, the results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems: the broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. The results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established: optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
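The contrast between the three formulations can be stated schematically. The notation below (d for design variables; C_man, C_op, C_fail for manufacturing, operation and failure costs; lambda for safety coefficients; P_f for failure probability) is generic and only illustrates the structure described in the abstract, not the paper's exact models.

```latex
\begin{align*}
\text{DDO:}  \quad & \min_{d}\; C_{\mathrm{man}}(d)
  && \text{s.t. } \lambda_i(d) \ge \lambda_i^{\min} \\
\text{RBDO:} \quad & \min_{d}\; C_{\mathrm{man}}(d)
  && \text{s.t. } P_f(d) \le P_f^{\max} \\
\text{RO:}   \quad & \min_{d}\; C_{\mathrm{man}}(d) + C_{\mathrm{op}}(d)
  + \sum_i P_{f,i}(d)\, C_{\mathrm{fail},i}
\end{align*}
```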

Relevance: 100.00%

Publisher:

Abstract:

Abstract Background: Over the last years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks were designed around the abstractions provided by this paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring mainly two technical skills: (i) knowing syntactic details of the programming language used to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only be initiated once the development process reaches the implementation phase, preventing it from starting earlier. Method: To solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter supports the reuse process; as soon as the application engineer has filled in the RM, the reuse code can be generated automatically. Results: We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions. The results show an improvement of 97% in productivity; however, little difference was perceived in the effort required to maintain the applications. Conclusion: Using the approach presented herein, it was possible to conclude that (i) the instantiation of CFs can be automated, and (ii) developer productivity improves when a model-based instantiation approach is used.
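A purely hypothetical sketch of the model-based idea follows: the Reuse Model is represented here as a plain Python dictionary filled in by the application engineer, and the "reuse code" is generated from it rather than written by hand. The field names, the Persistence-CF binding syntax and the generator are invented for illustration and do not reproduce the RRM/RM notation of the paper.

```python
# Hypothetical Reuse Model filled in by the application engineer.
reuse_model = {
    "concern": "Persistence",
    "bindings": [
        {"app_class": "Customer", "id_attribute": "customerId"},
        {"app_class": "Order",    "id_attribute": "orderNumber"},
    ],
}

def generate_reuse_code(model: dict) -> str:
    """Emit illustrative, aspect-like binding code from the Reuse Model."""
    lines = [f"// generated bindings for the {model['concern']} CF"]
    for b in model["bindings"]:
        lines.append(
            f"bind persistent entity {b['app_class']} "
            f"using key {b['id_attribute']};"
        )
    return "\n".join(lines)

print(generate_reuse_code(reuse_model))
```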

Relevance: 100.00%

Publisher:

Abstract:

Nowadays, the attainment of microsystems that integrate most of the stages involved in an analytical process has raised enormous interest in several research fields. This approach provides experimental set-ups of increased robustness and reliability, which simplifies their application to in-line and continuous biomedical and environmental monitoring. In this work, a novel, compact and autonomous microanalyzer aimed at multiwavelength colorimetric determinations is presented. It integrates the microfluidics (a three-dimensional mixer and a 25 mm long "Z-shape" optical flow cell), a highly versatile multiwavelength optical detection system and the associated electronics for signal processing and drive, all in the same device. The flexibility provided by its design allows the microanalyzer to be operated either in single fixed-wavelength mode, providing a dedicated photometer, or in multiple-wavelength mode, to obtain discrete pseudospectra. To increase its reliability, automate its operation and allow it to work under unattended conditions, a multicommutation sub-system was developed and integrated with the experimental set-up. The device was initially evaluated in the absence of chemical reactions using four acidochromic dyes and later applied to determine key environmental parameters such as the phenol index, chromium(VI) and nitrite ions. The results were comparable with those obtained with commercial instrumentation and demonstrate the versatility of the proposed microanalyzer as an autonomous and portable device that can be applied to other analytical methodologies based on colorimetric determinations.
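The data treatment behind a colorimetric determination of this kind can be sketched as follows: measured light intensities are converted to absorbance via the Beer-Lambert relation A = log10(I0/I), and the analyte concentration is read off a linear calibration built from standards. All numbers below are invented; the script is a conceptual illustration, not firmware of the described microanalyzer.

```python
# Absorbance from raw intensities, then concentration from a linear calibration.
import numpy as np

def absorbance(i_sample: np.ndarray, i_blank: np.ndarray) -> np.ndarray:
    """A = log10(I0 / I) for each wavelength channel."""
    return np.log10(i_blank / i_sample)

# calibration: absorbances of standards of known concentration (mg/L, invented)
std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
std_abs = np.array([0.00, 0.11, 0.21, 0.43, 0.85])
slope, intercept = np.polyfit(std_conc, std_abs, 1)

# unknown sample measured at the selected wavelength of the flow cell
a_sample = absorbance(np.array([412.0]), np.array([660.0]))[0]
print(f"estimated concentration: {(a_sample - intercept) / slope:.2f} mg/L")
```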