946 results for Design Processes


Relevance:

30.00%

Publisher:

Abstract:

Historically, business process design has been driven by business objectives, specifically process improvement. However, this cannot come at the price of control objectives, which stem from various legislative, standards, and business partnership sources. Ensuring compliance with regulations and industrial standards is an increasingly important issue in the design of business processes. In this paper, we advocate that control objectives should be addressed at an early stage, i.e., design time, so as to minimize the problems of runtime compliance checking and the consequent violations and penalties. To this end, we propose supporting mechanisms for business process designers. This paper specifically presents a support method that allows the process designer to quantitatively measure the compliance degree of a given process model against a set of control objectives. This allows process designers to comparatively assess the compliance degree of their designs as well as be better informed of the cost of non-compliance.
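The quantitative measure described above can be pictured as a weighted coverage score over the set of control objectives. The sketch below is an illustrative assumption, not the paper's actual method: the process model, the check functions and the weights are all invented.

```python
# Hypothetical compliance-degree sketch: each control objective carries
# a weight and a boolean check; the compliance degree of a process
# model is the weighted fraction of objectives it satisfies.
def compliance_degree(process_model, objectives):
    """objectives: list of (weight, check) where check(model) -> bool."""
    total = sum(w for w, _ in objectives)
    met = sum(w for w, check in objectives if check(process_model))
    return met / total if total else 1.0

# Invented process model and control objectives for illustration
model = {"tasks": ["approve", "pay"], "logged": True}
objectives = [
    (2.0, lambda m: "approve" in m["tasks"]),   # approval step required
    (1.0, lambda m: m["logged"]),               # audit trail required
    (1.0, lambda m: "archive" in m["tasks"]),   # retention requirement
]
print(compliance_degree(model, objectives))     # 3.0 / 4.0 = 0.75
```

A score below 1.0 flags the design-time gap, and the unmet weights indicate where the cost of non-compliance would accrue.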

Relevance:

30.00%

Publisher:

Abstract:

The development of large-scale solid-state fermentation (SSF) processes is hampered by the lack of simple tools for the design of SSF bioreactors. The use of semifundamental mathematical models to design and operate SSF bioreactors can be complex. In this work, dimensionless design factors are used to predict the effects of scale and of operational variables on the performance of rotating drum bioreactors. The dimensionless design factor (DDF) is the ratio of the rate of heat generation to the rate of heat removal at the time of peak heat production. It can be used to predict the maximum temperature reached within the substrate bed for given operational variables. Alternatively, given the maximum temperature that can be tolerated during the fermentation, it can be used to explore the combinations of operating variables that prevent that temperature from being exceeded. Comparison of the predictions of the DDF approach with literature data for the operation of rotating drums suggests that the DDF is a useful tool. The DDF approach was used to explore the consequences of three scale-up strategies for the required air flow rates and the maximum temperatures reached in the substrate bed as the bioreactor size was increased on the basis of geometric similarity. The first of these strategies was to hold the superficial flow rate of the process air through the drum constant. The second was to hold the ratio of air volume to bioreactor volume constant. The third was to adjust the air flow rate with increasing scale so as to hold constant the maximum temperature attained in the substrate bed during the fermentation. (C) 2000 John Wiley & Sons, Inc.
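The DDF idea can be sketched as a simple energy balance at peak activity: heat generated by the fermentation against sensible heat carried away by the process air. All names and parameter values below are illustrative assumptions, not the paper's correlations (a full treatment would also account for evaporative cooling).

```python
# Hypothetical DDF sketch: ratio of peak heat generation to heat
# removal by the process air. DDF > 1 means generation outruns removal
# and the bed temperature will exceed the allowed rise dT, so scale-up
# must increase the air flow f_air.
def ddf(q_peak, m_substrate, f_air, rho_air, cp_air, dT):
    generation = q_peak * m_substrate          # W, at peak activity
    removal = f_air * rho_air * cp_air * dT    # W, sensible heat in air
    return generation / removal

print(ddf(q_peak=80.0,        # W per kg dry substrate at peak (assumed)
          m_substrate=50.0,   # kg dry substrate in the drum (assumed)
          f_air=0.12,         # m^3/s of process air (assumed)
          rho_air=1.15,       # kg/m^3
          cp_air=1006.0,      # J/(kg K)
          dT=20.0))           # allowed bed-to-inlet temperature rise, K
```

With these assumed numbers the ratio exceeds 1, i.e. this combination of drum size and air flow would let the bed overshoot the tolerated temperature, which is exactly the kind of screening the DDF is meant for.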

Relevance:

30.00%

Publisher:

Abstract:

The effects of ionizing radiation on different compositions of polymer gel dosimeters are investigated using FT-Raman spectroscopy and NMR T-2 relaxation times. The dosimeters are manufactured from different concentrations of comonomers (acrylamide and N,N'-methylene-bis-acrylamide) dispersed in different concentrations of an aqueous gelatin matrix. Results are analysed using a model of fast exchange of magnetization between three proton pools. The fraction of protons in each pool is determined from the known chemical composition of the dosimeter and FT-Raman spectroscopy. Based on these results, the physical and chemical processes at play in the dosimeters are examined in view of their effect on the changes in T-2. The precipitation of growing macroradicals and the scavenging of free radicals by gelatin are used to explain the rate of polymerization. The model describes the changes in T-2 as a function of the absorbed dose up to 50 Gy for the different compositions. This is expected to aid the theoretical design of new, more efficient dosimeters, since it was demonstrated that the optimum dosimeter (i.e., the one with the lowest dose resolution) must have a range of relaxation times that matches the range of T-2 values that can be determined with the lowest uncertainty using an MRI scanner.
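Under fast exchange, the observed relaxation rate is the fraction-weighted sum of the pool relaxation rates, 1/T2_obs = sum_i f_i / T2_i. The pool fractions and T2 values in this sketch are invented for illustration; in the paper they come from the dosimeter composition and the FT-Raman data.

```python
# Fast-exchange sketch: magnetization exchanging rapidly between proton
# pools yields a single observed T2 whose rate is the fraction-weighted
# sum of the individual pool rates.
def t2_observed(fractions, t2_pools):
    assert abs(sum(fractions) - 1.0) < 1e-9   # fractions must sum to 1
    rate = sum(f / t2 for f, t2 in zip(fractions, t2_pools))
    return 1.0 / rate

# Assumed pools, e.g. water, monomer/polymer and gelatin protons;
# fractions and T2 values (in seconds) are illustrative only.
print(t2_observed([0.90, 0.04, 0.06], [2.0, 0.02, 0.2]))
```

Because the small, fast-relaxing polymer pool grows with absorbed dose, even a modest shift of protons into it pulls the observed T2 down sharply, which is what makes T2 a usable dose readout.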

Relevance:

30.00%

Publisher:

Abstract:

Ecological interface design (EID) is proving to be a promising approach to the design of interfaces for complex dynamic systems. Although the principles of EID and examples of its effective use are widely available, few readily available examples exist of how the individual displays that constitute an ecological interface are developed. This paper presents the semantic mapping process within EID in the context of prior theoretical work in this area. The semantic mapping process that was used in developing an ecological interface for the Pasteurizer II microworld is outlined, and the results of an evaluation of the ecological interface against a more conventional interface are briefly presented. Subjective reports indicate features of the ecological interface that made it particularly valuable for participants. Finally, we outline the steps of an analytic process for using EID. The findings presented here can be applied in the design of ecological interfaces or of configural displays for dynamic processes.

Relevance:

30.00%

Publisher:

Abstract:

Coastal wetlands are dynamic and include the freshwater-intertidal interface. In many parts of the world such wetlands are under pressure from increasing human populations and from predicted sea-level rise. Their complexity and the limited knowledge of processes operating in these systems combine to make them a management challenge. Adaptive management is advocated for complex ecosystem management (Hackney 2000; Meretsky et al. 2000; Thom 2000; National Research Council 2003). Adaptive management identifies management aims, makes an inventory/environmental assessment, plans management actions, implements these, assesses outcomes, and provides feedback to iterate the process (Holling 1978; Walters and Holling 1990). This allows for a dynamic management system that is responsive to change. In the area of wetland management, recent adaptive approaches are exemplified by Natuhara et al. (2004) for wild bird management, Bunch and Dudycha (2004) for a river system, Thom (2000) for restoration, and Quinn and Hanna (2003) for seasonal wetlands in California. There are many wetland habitats for which we currently have only rudimentary knowledge (Hackney 2000), emphasizing the need for good information as a prerequisite for effective management. The management framework must also provide a way to incorporate the best available science into management decisions and to use management outcomes as opportunities to improve scientific understanding and provide feedback to the decision system. Figure 9.1 shows a model developed by Anorov (2004), based on the process-response model of Maltby et al. (1994), that forms a framework for the science underlying an adaptive management system in the wetland context.

Relevance:

30.00%

Publisher:

Abstract:

In the design of lattice domes, design engineers need expertise in areas such as configuration processing, nonlinear analysis, and optimization. These are extensive numerical, iterative, and time-consuming processes that are prone to error without an integrated design tool. This article presents the application of a knowledge-based system to solving lattice-dome design problems. An operational prototype knowledge-based system, LADOME, has been developed using a combined knowledge representation approach that employs rules, procedural methods, and an object-oriented blackboard concept. The system's objective is to assist engineers in lattice-dome design by integrating all design tasks into a single computer-aided environment implemented with the knowledge-based system approach. For system verification, results from design examples are presented.
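The combined representation LADOME is described as using (rules reading and posting facts on a shared blackboard) can be caricatured in a few lines. The rule contents below are invented for illustration and are not taken from the actual system.

```python
# Minimal blackboard sketch: rules fire when their input facts are on
# the blackboard and post derived facts back, chaining design steps.
blackboard = {"span_m": 40.0, "rise_to_span": 0.15}   # invented inputs

def rule_geometry(bb):
    # Derive the dome rise from span and rise-to-span ratio
    if "rise_m" not in bb and {"span_m", "rise_to_span"} <= bb.keys():
        bb["rise_m"] = bb["span_m"] * bb["rise_to_span"]

def rule_type(bb):
    # Pick a (hypothetical) lattice type once the geometry is known
    if "rise_m" in bb and "dome_type" not in bb:
        bb["dome_type"] = "lamella" if bb["span_m"] > 30 else "ribbed"

for rule in (rule_geometry, rule_type):   # simple forward chaining
    rule(blackboard)
print(blackboard["rise_m"], blackboard["dome_type"])  # 6.0 lamella
```

The appeal of the blackboard style for this kind of tool is that configuration, analysis and optimization rules can be added independently, each reacting to facts the others post.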

Relevance:

30.00%

Publisher:

Abstract:

High-sided vehicles striking low bridges is a widespread problem in many countries, especially the UK. This paper describes an experiment to evaluate a new design of markings for low bridges. A full-size bridge was constructed whose overhead clearance could be adjusted. Subjects sat in a truck cab as it drove towards the bridge and were asked to judge whether the vehicle could pass safely underneath. The main objective of the research was to determine whether marking the bridge with a newly devised experimental marking would elicit more cautious decisions from subjects, regarding whether or not the experimental bridge structure could be passed under safely, than the currently used UK bridge marking standard. The results show that the type of bridge marking influenced the level of caution associated with decisions regarding bridge navigation, with the new marking design producing the most cautious decisions for the two bridge heights used, at all distances from the bridge structure. Additionally, the distance before the bridge at which decisions were given affected the level of caution: the closer to the bridge, the more cautious the decisions became, irrespective of the marking design. The implications of these results for reducing the number of bridge strikes are discussed. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses robust model-order reduction of a high-dimensional nonlinear partial differential equation (PDE) model of a complex biological process. Based on a nonlinear, distributed-parameter model of the same process, which was validated against experimental data from an existing pilot-scale BNR activated sludge plant, we developed a state-space model with 154 state variables. A general algorithm for robustly reducing the nonlinear PDE model is presented, and based on an investigation of five state-of-the-art model-order reduction techniques, we are able to reduce the original model to one with only 30 states without incurring pronounced modelling errors. The singular perturbation approximation balanced truncation technique is found to give the lowest modelling errors in low frequency ranges and hence is deemed most suitable for controller design and other real-time applications. (C) 2002 Elsevier Science Ltd. All rights reserved.
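For a linear surrogate of such a state-space model, the mechanics of balanced truncation (a close relative of the singular perturbation variant the paper favours) can be sketched with the square-root algorithm. The toy 6-state system below is an invented stand-in for the 154-state model; this is plain truncation, not the paper's exact method.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of a stable LTI system to r states."""
    # Gramians: A P + P A^T + B B^T = 0 and A^T Q + Q A + C^T C = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lp = cholesky(P, lower=True)           # P = Lp Lp^T
    Lq = cholesky(Q, lower=True)           # Q = Lq Lq^T
    U, hsv, Vt = svd(Lq.T @ Lp)            # Hankel singular values in hsv
    S = np.diag(hsv[:r] ** -0.5)
    T = Lp @ Vt[:r].T @ S                  # maps reduced state -> full state
    Ti = S @ U[:, :r].T @ Lq.T             # left inverse of T
    return Ti @ A @ T, Ti @ B, C @ T, hsv

# Toy stable system standing in for the 154-state model (assumed data)
rng = np.random.default_rng(0)
A = np.diag(-np.arange(1.0, 7.0))          # stable 6-state system
B = rng.standard_normal((6, 2))
C = rng.standard_normal((2, 6))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=3)
print(Ar.shape, np.all(np.diff(hsv) <= 1e-12))
```

The decay of the Hankel singular values is what justifies cutting 154 states to 30: states beyond the knee of `hsv` contribute little to the input-output behaviour.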

Relevance:

30.00%

Publisher:

Abstract:

Measurement while drilling (MWD) techniques can provide a useful tool to aid drill and blast engineers in open cut mining. By avoiding time-consuming tasks such as scan-lines and rock sample collection for laboratory tests, MWD techniques can not only save time but also improve the reliability of the blast design by providing the drill and blast engineer with information specifically tailored for this use. While most mines use a standard blast pattern and charge per blasthole, based on a single rock factor for the entire bench or blast region, information derived from the MWD parameters can improve the blast design by providing more accurate rock properties for each individual blasthole. From this, decisions can be made on the most appropriate type and amount of explosive charge to place in each blasthole, or the inter-hole detonation timing of different decks and blastholes can be optimised. Where real-time calculations are feasible, the system could extend the present blast design and even be used to determine the placement of subsequent holes, moving towards a more appropriate blasthole pattern design such as asymmetrical blasting.
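As a caricature of the per-blasthole idea, one can derive a relative hardness index from MWD signals and scale the charge accordingly. The index, the reference value and the linear scaling rule below are invented assumptions for illustration, not an established MWD correlation.

```python
# Hypothetical per-blasthole charge sketch: harder rock (high torque,
# slow penetration) gets proportionally more explosive than the single
# bench-wide charge a standard design would use.
REF_HARDNESS = 10.0   # torque/penetration ratio of the design rock (assumed)

def charge_per_hole(pen_rate_m_min, torque_knm, base_charge_kg=50.0):
    hardness = torque_knm / pen_rate_m_min   # invented hardness index
    return base_charge_kg * hardness / REF_HARDNESS

# A hard hole and a soft hole get different charges from the same base
print(charge_per_hole(0.5, 8.0))   # hardness 16 -> 80.0 kg
print(charge_per_hole(1.6, 8.0))   # hardness 5  -> 25.0 kg
```

Whatever the actual correlation, the structural point stands: the charge becomes a function of each hole's measured signals rather than a single bench-wide constant.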

Relevance:

30.00%

Publisher:

Abstract:

Blasting has been the most frequently used method for rock breakage since black powder was first used to fragment rocks, more than two hundred years ago. This paper is an attempt to reassess standard design techniques used in blasting by providing an alternative approach to blast design. The new approach has been termed asymmetric blasting. Based on real-time rock recognition provided by measurement while drilling (MWD) techniques, asymmetric blasting is an approach that deals with rock properties as they occur in nature, i.e., randomly and asymmetrically spatially distributed. It is well accepted that the performance of basic mining operations, such as excavation and crushing, relies on a broken rock mass which has been pre-conditioned by the blast. By pre-conditioned we mean well fragmented, sufficiently loose and with an adequate muckpile profile. These muckpile characteristics affect loading and hauling [1]. The influence of blasting does not end there. Under the Mine to Mill paradigm, blasting has significant leverage on downstream operations such as crushing and milling, and there is a body of evidence that blasting affects mineral liberation [2]. Thus, the importance of blasting has grown from simply fragmenting and loosening the rock mass to a broader role that encompasses many aspects of mining and affects the cost of the end product. A new approach is proposed in this paper which facilitates this trend: 'to treat non-homogeneous media (the rock mass) in a non-homogeneous manner (an asymmetrical pattern) in order to achieve an optimal result (in terms of muckpile size distribution).' It is postulated that there are no logical reasons (besides the current lack of means to infer rock mass properties in the blind zones of the bench, and on-site precedents) for drilling a regular blast pattern over a rock mass that is inherently heterogeneous. Real and theoretical examples of such a method are presented.

Relevance:

30.00%

Publisher:

Abstract:

This paper considers the question of which is better: the batch or the continuous activated sludge process? It is an important question because dissension still exists in the wastewater industry as to the relative merits of each process. A review of perceived differences between the processes from the point of view of two related disciplines, process engineering and biotechnology, is presented, together with the results of previous comparative studies. These reviews highlight possible areas where more understanding is required. This is provided in the paper by applying the flexibility index to two case studies. The flexibility index is a useful process design tool that measures the ability of a process to cope with long-term changes in operation.

Relevance:

30.00%

Publisher:

Abstract:

The two steps of nitrification, namely the oxidation of ammonia to nitrite and of nitrite to nitrate, often need to be considered separately in process studies. For a detailed examination, it is desirable to monitor the two-step sequence using online measurements. In this paper, the use of online titrimetric and off-gas analysis (TOGA) methods for examining the process is presented. Using the known reaction stoichiometry, the combination of the measured signals (rates of hydrogen ion production, oxygen uptake and carbon dioxide transfer) allows the determination of the three key process rates, namely the ammonia consumption rate, the nitrite accumulation rate and the nitrate production rate. Individual reaction rates determined with the TOGA sensor under a number of operating conditions are presented. The rates calculated directly from the measured signals are compared with those obtained from offline liquid sample analysis. Statistical analysis confirms that the results from the two approaches match well, a result that could not have been guaranteed using alternative online methods. As a case study, the influences of pH and dissolved oxygen (DO) on nitrite accumulation are tested using the proposed method. It is shown that nitrite accumulation decreased with increasing DO and pH. Possible reasons for these observations are discussed. (C) 2003 Elsevier Science Ltd. All rights reserved.
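The rate reconstruction can be illustrated with the standard two-step nitrification stoichiometry: the measured hydrogen ion production rate (HPR) and oxygen uptake rate (OUR) give two linear equations in the two step rates, from which the three process rates follow. The measured values below are invented, and the sketch omits the carbon dioxide transfer signal that the full TOGA method also uses.

```python
import numpy as np

# Two-step nitrification stoichiometry:
#   NH4+ + 1.5 O2 -> NO2- + H2O + 2 H+   (rate r1)
#   NO2- + 0.5 O2 -> NO3-                (rate r2)
S = np.array([[2.0, 0.0],     # mol H+ produced per unit r1, r2
              [1.5, 0.5]])    # mol O2 consumed per unit r1, r2
measured = np.array([0.40, 0.35])   # HPR, OUR in mmol/h (assumed values)

r1, r2 = np.linalg.solve(S, measured)
ammonia_consumption = r1            # step 1 rate
nitrate_production = r2             # step 2 rate
nitrite_accumulation = r1 - r2      # difference between the two steps
print(ammonia_consumption, nitrite_accumulation, nitrate_production)
```

A positive `nitrite_accumulation` means step 1 is outrunning step 2, which is the quantity the pH/DO case study tracks.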