915 results for Computer Aided Process


Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new mathematical programming model for the retrofit of heat exchanger networks (HENs), in which pressure recovery of process streams is used to enhance heat integration. Applied particularly to cryogenic processes, HEN retrofit with combined heat and work integration aims chiefly at reducing the use of expensive cold utilities. The proposed multi-stage superstructure allows both the expansion of existing heat transfer area and the installation of new equipment for heat exchange and pressure manipulation. Pressure recovery is carried out simultaneously with the HEN design, so that the process conditions (stream pressures and temperatures) are optimization variables. The model is formulated using generalized disjunctive programming (GDP) and solved as a mixed-integer nonlinear programming (MINLP) problem by minimizing the retrofit total annualized cost, with the turbine and compressor coupled to a helper motor. Three case studies, including a real industrial example from liquefied natural gas (LNG) production, assess the accuracy of the developed approach. The results show that stream pressure recovery is effective for energy savings and, consequently, for decreasing the total HEN retrofit cost, especially in sub-ambient processes.
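
The abstract does not include the formulation itself; the following is a minimal, illustrative Pyomo/GDP sketch of one disjunctive retrofit decision (reuse an existing exchanger shell with added area versus install a new unit). All cost coefficients, bounds and the lumped U·LMTD value are assumptions for illustration, not values from the paper.

```python
# A minimal illustrative sketch (not the paper's model): one disjunctive
# retrofit decision -- reuse an existing exchanger shell with added area,
# or install a new unit -- posed in Pyomo/GDP and reformulated to a
# mixed-integer program. All coefficients and bounds are assumptions.
import pyomo.environ as pyo
from pyomo.gdp import Disjunct, Disjunction

m = pyo.ConcreteModel()
m.Q = pyo.Var(bounds=(0, 500))      # heat duty of the match [kW]
m.A = pyo.Var(bounds=(0, 200))      # heat transfer area [m2]
m.cost = pyo.Var(bounds=(0, 1e6))   # annualized capital cost [$/y]

A_EXIST = 80.0  # area of the existing shell [m2], assumed

# Disjunct 1: reuse the existing shell and pay only for added area.
m.reuse = Disjunct()
m.reuse.area = pyo.Constraint(expr=m.A >= A_EXIST)
m.reuse.added_area_cost = pyo.Constraint(expr=m.cost >= 120 * (m.A - A_EXIST))

# Disjunct 2: install a new exchanger (fixed charge plus area cost).
m.new_unit = Disjunct()
m.new_unit.install_cost = pyo.Constraint(expr=m.cost >= 8000 + 300 * m.A)

m.choice = Disjunction(expr=[m.reuse, m.new_unit])

# Duty-area coupling with an assumed lumped U*LMTD of 4 kW/m2,
# and a required heat-recovery target for the match.
m.duty = pyo.Constraint(expr=m.Q == 4.0 * m.A)
m.target = pyo.Constraint(expr=m.Q >= 360)
m.obj = pyo.Objective(expr=m.cost)

# Reformulate the disjunctions (big-M) and hand off to any MIP/MINLP solver:
pyo.TransformationFactory('gdp.bigm').apply_to(m)
# pyo.SolverFactory('glpk').solve(m)
```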

Relevance:

30.00%

Publisher:

Abstract:

Federal Highway Administration, Arlington, Va.

Relevance:

30.00%

Publisher:

Abstract:

"UILU-ENG 77 1716."

Relevance:

30.00%

Publisher:

Abstract:

Thesis (M.A.)--University of Illinois at Urbana-Champaign.

Relevance:

30.00%

Publisher:

Abstract:

"Supported in part by Contract no. U.S. AEC (11-1)l469."

Relevance:

30.00%

Publisher:

Abstract:

"UIUCDCS-R-75-717"

Relevance:

30.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

30.00%

Publisher:

Abstract:

Cybercrime and related malicious activity in our increasingly digital world have become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with challenges, the most prominent being the increasing amounts of data and the diversity of evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting case, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. Matters are not helped by the fact that digital forensics today still involves manual, time-consuming tasks when identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, industry-standard tools are largely evidence-oriented, have limited support for evidence integration, and only automate certain precursory tasks such as indexing and text searching.

This study seeks efficiency in digital investigations in highly networked environments, in the form of reduced time and human effort, through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments. Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed to automate it further by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves the machine-to-machine communication that supports several of the digital investigation processes enabled by the architecture, by harnessing the properties of various peer-to-peer overlays.

Remote evidence acquisition improves the efficiency (time and effort involved) of digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition, and that a multi-TCP-connection paradigm is required. The automated integration, correlation and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing time-consuming manual correlation. Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
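
As an illustration of the multi-TCP-connection finding, the sketch below splits an evidence image into byte ranges and pulls each range over its own TCP connection, so a stalled or dropped connection only forces a retry of one chunk rather than of the whole transfer. The endpoint, byte-range request format and chunk size are hypothetical; this is not the LEIA protocol.

```python
# Hypothetical sketch of multi-connection remote acquisition: each byte
# range of the evidence image travels over its own TCP connection, with
# per-chunk retries. Endpoint and wire format are placeholders.
import socket
from concurrent.futures import ThreadPoolExecutor

HOST, PORT = "192.0.2.10", 9000   # placeholder acquisition endpoint
CHUNK = 4 * 1024 * 1024           # 4 MiB per connection

def fetch_range(offset: int, length: int, retries: int = 3) -> bytes:
    """Fetch one byte range of the evidence image over a dedicated socket."""
    for _ in range(retries):
        try:
            with socket.create_connection((HOST, PORT), timeout=30) as s:
                s.sendall(f"GET {offset} {length}\n".encode())
                buf = bytearray()
                while len(buf) < length:
                    data = s.recv(min(65536, length - len(buf)))
                    if not data:           # peer closed early: retry chunk
                        break
                    buf.extend(data)
                if len(buf) == length:
                    return bytes(buf)
        except OSError:
            pass                           # timeout/reset: retry this chunk
    raise IOError(f"chunk at offset {offset} failed after {retries} attempts")

def acquire(total_size: int, workers: int = 8) -> bytes:
    """Acquire the full image using several concurrent TCP connections."""
    offsets = range(0, total_size, CHUNK)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(
            lambda off: fetch_range(off, min(CHUNK, total_size - off)),
            offsets)
        return b"".join(parts)
```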

Relevance:

30.00%

Publisher:

Abstract:

A computer model of the mechanical alloying process has been developed to simulate phase formation during the mechanical alloying of Mo and Si elemental powders with a ternary addition of Al, Mg, Ti or Zr. Using the Arrhenius equation, the model balances the formation rates of the competing reactions observed during milling. These reactions include the formation of tetragonal C11(b) MoSi2 (t-MoSi2) by combustion, the formation of the hexagonal C40 MoSi2 polymorph (h-MoSi2), the transformation of the tetragonal to the hexagonal form, and the recovery of t-MoSi2 from h-MoSi2 and deformed t-MoSi2. The ternary additions change the free energy of formation of the associated MoSi2 alloys, i.e. Mo(Si,Al)2, Mo(Mg,Al)2, (Mo,Ti)Si2, (Mo,Zr)Si2 and (Mo,Fe)Si2, respectively. Variation of the energy of formation alone is sufficient for the simulation to accurately model the observed phase formation.
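
The rate-balancing step lends itself to a small worked example: Arrhenius rate constants k = A·exp(−Ea/RT) are evaluated for the competing milling reactions and compared. The pre-exponential factors, activation energies and effective milling temperature below are illustrative assumptions, not the model's fitted values.

```python
# Worked sketch of Arrhenius rate balancing between competing reactions.
# All parameters are illustrative assumptions, not fitted values.
import math

R = 8.314  # gas constant [J/(mol K)]

def arrhenius(A: float, Ea: float, T: float) -> float:
    """Rate constant k = A * exp(-Ea / (R*T)) for a thermally activated process."""
    return A * math.exp(-Ea / (R * T))

# Competing processes during milling (assumed kinetic parameters):
reactions = {
    "combustion_to_t_MoSi2": dict(A=1e9, Ea=150e3),
    "formation_of_h_MoSi2":  dict(A=1e8, Ea=140e3),
    "t_to_h_transformation": dict(A=1e7, Ea=160e3),
    "recovery_to_t_MoSi2":   dict(A=1e6, Ea=120e3),
}

T_mill = 600.0  # effective local temperature during impacts [K], assumed
rates = {name: arrhenius(p["A"], p["Ea"], T_mill) for name, p in reactions.items()}

# The relative rates decide which phase dominates at this temperature:
total = sum(rates.values())
for name, k in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{name:24s} k = {k:10.3e} 1/s  ({100 * k / total:5.1f}% of total)")
```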

Relevance:

30.00%

Publisher:

Abstract:

Biological wastewater treatment is a complex, multivariate process in which a number of physical and biological processes occur simultaneously. In this study, principal component analysis (PCA) and parallel factor analysis (PARAFAC) were used to profile and characterise Lagoon 115E, a multistage biological lagoon treatment system at Melbourne Water's Western Treatment Plant (WTP) in Melbourne, Australia, with the objective of increasing our understanding of the multivariate processes taking place in the lagoon. The data span a 7-year period during which samples were collected, as often as weekly, from the ponds of Lagoon 115E and analysed. The resulting database, involving 19 chemical and physical variables, was studied using PCA and PARAFAC. With these methods, alterations in the state of the wastewater due to intrinsic and extrinsic factors could be discerned, and the complex purification stages and cyclic changes occurring along the lagoon system could be illustrated and visually represented. The two methods proved complementary, each having its own beneficial features.
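
A minimal sketch of the PCA step follows, using scikit-learn on a stand-in matrix shaped like the lagoon data set (weekly samples by 19 chemical/physical variables). The data here are random placeholders; PARAFAC would additionally require a three-way (pond × variable × time) array and a tensor library such as tensorly.

```python
# PCA on a placeholder matrix shaped like the lagoon data set.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(364, 19))   # ~7 years of weekly samples, 19 variables

X_std = StandardScaler().fit_transform(X)    # autoscale: zero mean, unit variance
pca = PCA(n_components=3)
scores = pca.fit_transform(X_std)            # sample scores on the first 3 PCs

print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
# Plotting the scores against sampling date is what reveals the seasonal
# and cyclic changes along the lagoon that the abstract describes.
```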

Relevance:

30.00%

Publisher:

Abstract:

Granulation is one of the fundamental operations in particulate processing, with a very ancient history and widespread use. Much fundamental particle science has emerged in the last two decades to help understand the underlying phenomena, yet until recently the development of granulation systems was based mostly on popular practice. The use of process systems approaches to the integrated understanding of these operations is providing improved insight into the complex nature of the processes. Improved mathematical representations, new solution techniques and the application of the models to industrial processes are yielding better designs, improved optimisation and tighter control of these systems. The parallel development of advanced instrumentation and the use of inferential approaches provide real-time access to the system parameters necessary for improvements in operation. The use of advanced models to help develop real-time plant diagnostic systems provides further evidence of the utility of process systems approaches to granulation processes. This paper highlights some of these aspects of granulation.
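
The abstract does not name the mathematical representations it refers to; as background rather than as the paper's own model, the workhorse representation in granulation process systems is the population balance. A common one-dimensional, aggregation-only form over particle volume v is

$$\frac{\partial n(v,t)}{\partial t} = \frac{1}{2}\int_0^{v} \beta(v-u,\,u)\, n(v-u,t)\, n(u,t)\, \mathrm{d}u \;-\; n(v,t)\int_0^{\infty} \beta(v,u)\, n(u,t)\, \mathrm{d}u,$$

where n(v,t) is the number density of granules of volume v and β is the aggregation kernel.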

Relevance:

30.00%

Publisher:

Abstract:

Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. This flexibility, however, cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification, intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification is made at runtime and may be unique to each instance. The verification of dynamically built models is essential: whereas ensuring that a model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself carries no conflicts or redundancy is an interesting and challenging problem. We discuss both the static and dynamic verification aspects, and briefly present Chameleon, a prototype workflow engine that implements these concepts.
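
To make the conflict-checking idea concrete, the sketch below treats ordering ("A before B") constraints as edges of a directed graph and reports any cycle, which would make the constraint set unsatisfiable by every instance-level model. The constraint syntax is invented for illustration; the paper's constraint language is richer than plain ordering.

```python
# Hypothetical sketch: detect conflicts in a set of ordering constraints
# by finding cycles in the "must precede" graph via depth-first search.
from collections import defaultdict

constraints = [
    ("review", "approve"),    # review must precede approve
    ("approve", "archive"),
    # ("archive", "review"),  # uncommenting this creates a conflict (cycle)
]

def find_conflict(order_constraints):
    """Return a cyclic chain of 'before' constraints, or None if consistent."""
    graph = defaultdict(list)
    for a, b in order_constraints:
        graph[a].append(b)

    WHITE, GREY, BLACK = 0, 1, 2   # unvisited / on current path / done
    colour = defaultdict(int)

    def dfs(node, path):
        colour[node] = GREY
        for nxt in graph[node]:
            if colour[nxt] == GREY:              # back edge: cycle found
                return path[path.index(nxt):] + [nxt]
            if colour[nxt] == WHITE:
                found = dfs(nxt, path + [nxt])
                if found:
                    return found
        colour[node] = BLACK
        return None

    for start in list(graph):
        if colour[start] == WHITE:
            cycle = dfs(start, [start])
            if cycle:
                return cycle
    return None

print(find_conflict(constraints) or "constraint set is consistent")
```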

Relevance:

30.00%

Publisher:

Abstract:

In the last decade, with the expansion of organizational scope and the tendency toward outsourcing, there has been an increasing need for Business Process Integration (BPI), understood as the sharing of data and applications among business processes. The research efforts and development paths in BPI pursued by many academic groups and system vendors, targeting heterogeneous system integration, continue to face several conceptual and technological challenges. This article begins with a brief review of the major approaches and emerging standards addressing BPI. We then introduce a rule-driven messaging approach to BPI, based on the harmonization of messages in order to compose a new, often cross-organizational, process, and present the design of a temporal first-order language (Harmonized Messaging Calculus) that provides the formal foundation for general rules governing business process execution. Definitions of the language's terms, formulae, safety and expressiveness are introduced and considered in detail.
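
As a concrete, deliberately simplified illustration of rule-driven message harmonization, the sketch below applies declarative field-mapping rules to an incoming partner message to produce one canonical message that a cross-organizational process could consume. The field names, rule form and canonical schema are assumptions for illustration, not the Harmonized Messaging Calculus.

```python
# Hypothetical sketch of rule-driven message harmonization: declarative
# rules map fields of heterogeneous partner messages onto one canonical
# message. Field names and schema are invented for illustration.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Rule:
    source_field: str                  # field in the incoming message
    target_field: str                  # field in the canonical message
    transform: Callable[[Any], Any]    # value-level harmonization

# Rules for one (hypothetical) partner's purchase-order messages:
partner_rules = [
    Rule("po_no",     "order_id",   str),
    Rule("amt_cents", "amount_eur", lambda cents: cents / 100.0),
    Rule("cust",      "customer",   str.strip),
]

def harmonize(message: dict, rules: list[Rule]) -> dict:
    """Apply every applicable rule; unmapped fields are deliberately dropped."""
    out = {}
    for r in rules:
        if r.source_field in message:
            out[r.target_field] = r.transform(message[r.source_field])
    return out

incoming = {"po_no": 4711, "amt_cents": 129900, "cust": "  ACME GmbH "}
print(harmonize(incoming, partner_rules))
# {'order_id': '4711', 'amount_eur': 1299.0, 'customer': 'ACME GmbH'}
```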