987 results for Process variability
Abstract:
It is common for organizations to maintain multiple variants of a given business process, such as multiple sales processes for different products or multiple bookkeeping processes for different countries. Conventional business process modeling languages do not explicitly support the representation of such families of process variants. This gap triggered significant research efforts over the past decade, leading to an array of approaches to business process variability modeling. This survey examines existing approaches in this field based on a common set of criteria and illustrates their key concepts using a running example. The analysis shows that existing approaches are characterized by the fact that they extend a conventional process modeling language with constructs that enable it to capture customizable process models. A customizable process model represents a family of process variants in such a way that each variant can be derived by adding or deleting fragments according to configuration parameters or according to a domain model. The survey highlights an abundance of customizable process modeling languages, embodying a diverse set of constructs. In contrast, there is comparatively little tool support for analyzing and constructing customizable process models, as well as a scarcity of empirical evaluations of languages in the field.
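To illustrate the mechanism described above, the following minimal sketch derives process variants by adding fragments to a base process according to configuration parameters. The process activities, fragment names, and configuration options are hypothetical and do not correspond to any particular language covered by the survey.

```python
# Minimal sketch of variant derivation from a customizable process model.
# Fragment names and configuration options are hypothetical illustrations.

BASE_PROCESS = ["receive_order", "check_credit", "ship_goods", "invoice"]

# Optional fragments keyed by the configuration parameter that enables them.
OPTIONAL_FRAGMENTS = {
    "export_sale": ("customs_clearance", 3),   # (activity, insert position)
    "new_customer": ("register_customer", 1),
}

def derive_variant(config):
    """Derive one process variant by adding the fragments enabled in `config`."""
    variant = list(BASE_PROCESS)
    # Insert later positions first so earlier indices stay valid.
    for option, (activity, position) in sorted(
            OPTIONAL_FRAGMENTS.items(), key=lambda kv: -kv[1][1]):
        if config.get(option, False):
            variant.insert(position, activity)
    return variant

# Two variants of the same process family, driven purely by configuration.
print(derive_variant({"export_sale": True}))
print(derive_variant({"new_customer": True, "export_sale": False}))
```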
Abstract:
Knowledge of the pollutant build-up process is a key requirement for developing stormwater pollution mitigation strategies. In this context, process variability is a concept which needs to be understood in depth. Analysis of particulate build-up on three road surfaces in an urban catchment confirmed that particles <150µm and >150µm have characteristically different build-up patterns, and that these patterns are consistent over different field conditions. Three theoretical build-up patterns were developed based on the size-fractionated particulate build-up patterns, and these explain the variability in particle behavior and the variation in particle-bound pollutant load and composition over the antecedent dry period. The behavioral variability of particles <150µm was found to exert the most significant influence on build-up process variability. As characterization of process variability is particularly important in stormwater quality modeling, it is recommended that the influence of the behavioral variability of particles <150µm on pollutant build-up be specifically addressed. This would eliminate model deficiencies in the replication of the build-up process and facilitate accounting for the inherent process uncertainty, thereby enhancing water quality predictions.
Abstract:
Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains the accurate accounting of the uncertainty associated with pollutant processes. This acts as a significant limitation to effective decision making in relation to stormwater pollution mitigation. This study developed three theoretical scenarios based on research findings that variations in particle size fractions <150µm and >150µm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the associated pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of the variability characteristics of pollutant build-up and wash-off processes in stormwater quality models. The research outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes.
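The abstract does not reproduce the mathematical formulation itself. As a hedged illustration, the sketch below uses exponential build-up and wash-off equations of the kind commonly used in stormwater quality models, parameterised separately for the <150µm and >150µm fractions, to show how size-fractionated variability could be carried through consecutive events on a continuous timeline. All parameter values are hypothetical.

```python
import numpy as np

# Hedged sketch: exponential build-up and wash-off per particle size fraction.
# B(t) = B_0 + (B_max - B_0) * (1 - exp(-k_b * t))   (build-up over dry days t)
# W    = B   * (1 - exp(-k_w * i * d))               (wash-off for intensity i, duration d)
# Parameter values below are hypothetical, not taken from the study.

FRACTIONS = {
    # fraction: (B_max [g/m2], k_b [1/day], k_w [1/mm])
    "<150um": (4.0, 0.40, 0.20),
    ">150um": (6.0, 0.25, 0.08),
}

def build_up(fraction, dry_days, initial=0.0):
    b_max, k_b, _ = FRACTIONS[fraction]
    return initial + (b_max - initial) * (1.0 - np.exp(-k_b * dry_days))

def wash_off(fraction, available, intensity_mm_h, duration_h):
    _, _, k_w = FRACTIONS[fraction]
    washed = available * (1.0 - np.exp(-k_w * intensity_mm_h * duration_h))
    return washed, available - washed   # (load removed, load remaining)

# Two consecutive events on a continuous timeline: build-up -> storm -> build-up -> storm.
for frac in FRACTIONS:
    residual = 0.0
    for dry_days, (i_mm_h, dur_h) in [(5, (10, 1.0)), (2, (25, 0.5))]:
        available = build_up(frac, dry_days, initial=residual)
        removed, residual = wash_off(frac, available, i_mm_h, dur_h)
        print(f"{frac}: available={available:.2f} g/m2, washed-off={removed:.2f} g/m2")
```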
Abstract:
A generalized technique is proposed for modeling the effects of process variations on dynamic power by directly relating the variations in process parameters to variations in the dynamic power of a digital circuit. The dynamic power of a 2-input NAND gate is characterized by mixed-mode simulations, to be used as a library element for a 65 nm gate length technology. The proposed methodology is demonstrated with a multiplier circuit built using the NAND gate library, by characterizing its dynamic power through Monte Carlo analysis. The statistical techniques of Response Surface Methodology (RSM), using Design of Experiments (DOE) and the Least Squares Method (LSM), are employed to generate a "hybrid model" for gate power that accounts for simultaneous variations in multiple process parameters. We demonstrate that our hybrid-model-based statistical design approach results in considerable savings in the power budget of low power CMOS designs with an error of less than 1%, and with significant reductions in uncertainty, by at least 6X on a normalized basis, against worst case design.
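As a rough, hedged illustration of the RSM/DOE/LSM idea, the sketch below draws Monte Carlo samples of a few process parameters, evaluates a stand-in gate power function, and fits a second-order response surface by least squares. The power function, parameter set, and variation ranges are placeholders, not the paper's 65 nm NAND characterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical process parameters (normalized deviations from nominal):
# gate length, oxide thickness, threshold voltage. Ranges are illustrative only.
N = 2000
X = rng.normal(0.0, 1.0, size=(N, 3))

# Stand-in for the simulated dynamic power of a gate (placeholder, not BSIM/mixed-mode).
def dynamic_power(x):
    dl, dtox, dvth = x
    return 1.0 + 0.08 * dl - 0.05 * dtox - 0.03 * dvth + 0.02 * dl * dtox + 0.01 * dl**2

y = np.array([dynamic_power(x) for x in X])

# Second-order response surface: intercept, linear, pairwise, and squared terms.
def design_matrix(X):
    cols = [np.ones(len(X))]
    for j in range(X.shape[1]):
        cols.append(X[:, j])
    for j in range(X.shape[1]):
        for k in range(j, X.shape[1]):
            cols.append(X[:, j] * X[:, k])
    return np.column_stack(cols)

A = design_matrix(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)       # least-squares fit (LSM)
y_hat = A @ coef

print("max relative error of the response surface: "
      f"{np.max(np.abs(y_hat - y) / y) * 100:.3f}%")
print("mean power (statistical):", y.mean().round(4),
      " vs worst-case corner:", dynamic_power(np.array([3.0, -3.0, -3.0])).round(4))
```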
Abstract:
With the rapid scaling down of semiconductor process technology, process variation aware circuit design has become essential today. Several statistical models have been proposed to deal with process variation. We propose an accurate BSIM model for handling variability in a 45nm CMOS technology. The MOSFET is designed to meet the low standby power technology specification of the International Technology Roadmap for Semiconductors (ITRS). Variations in the process parameters of annealing temperature, oxide thickness, halo dose and tilt angle of the halo implant are considered for the model development. One parameter variation at a time is considered in developing the model. The model is validated by matching its performance against device simulation results, and the reported error is less than 10%. © (2012) Society of Photo-Optical Instrumentation Engineers (SPIE).
Abstract:
A promising technique for the large-scale manufacture of micro-fluidic devices and photonic devices is hot embossing of polymers such as PMMA. Micro-embossing is a deformation process where the workpiece material is heated to permit easier material flow and then forced over a planar patterned tool. While there has been considerable attention paid to process feasibility, very little effort has been put into production issues such as process capability and eventual process control. In this paper, we present initial studies aimed at identifying the origins and magnitude of variability when embossing features at the micron scale in PMMA. Test parts with features ranging from 3.5-630 µm wide and 0.9 µm deep were formed. Measurements at this scale proved very difficult, and only atomic force microscopy was able to provide resolution sufficient to identify process variations. It was found that standard deviations of widths at the 3-4 µm scale were on the order of 0.5 µm, leading to a coefficient of variation as high as 13%. Clearly, the transition from test to manufacturing for this process will require understanding the causes of this variation and devising control methods to minimize its magnitude over all types of parts.
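A quick worked check of the reported coefficient of variation, using only the figures quoted in the abstract (a standard deviation of about 0.5 µm for feature widths of 3.5-4 µm):

```python
# Coefficient of variation: CoV = sigma / mean, with sigma ~ 0.5 um
# for nominal feature widths of 3.5-4 um, as quoted in the abstract.
for width_um in (3.5, 4.0):
    cov = 0.5 / width_um
    print(f"width {width_um} um -> CoV ~ {cov:.1%}")
# ~14% and ~12.5%, i.e. of the order of the reported "as high as 13%".
```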
Abstract:
Variability in the pollutant wash-off process is a concept which needs to be understood in depth in order to better assess the outcomes of stormwater quality models, and thereby strengthen stormwater pollution mitigation strategies. Current knowledge about the wash-off process does not extend to a clear understanding of the influence of the initially available pollutant build-up on the variability of the pollutant wash-off load and composition. Consequently, pollutant wash-off process variability is poorly characterised in stormwater quality models, which can result in inaccurate stormwater quality predictions. Mathematical simulation of particulate wash-off from three urban road surfaces confirmed that the wash-off loads of particle size fractions <150µm and >150µm after a storm event vary with the build-up of the respective particle size fractions available at the beginning of the storm event. Furthermore, the pollutant load and composition associated with the initially available build-up of <150µm particles predominantly influence the variability in washed-off pollutant load and composition. The influence of the build-up of pollutants associated with >150µm particles on wash-off process variability is significant only for relatively short duration storm events.
Abstract:
The main source of protein for human and animal consumption is the agricultural sector, where production is vulnerable to diseases, fluctuations in climatic conditions and deteriorating hydrological conditions due to water pollution. Therefore, Single Cell Protein (SCP) production has evolved as an excellent alternative. Among all sources of microbial protein, yeast has attained global acceptability and has been preferred for SCP production. The screening and evaluation of nutritional and other culture variables of microorganisms are very important in the development of a bioprocess for SCP production. The application of statistical experimental design in bioprocess development can result in improved product yields, reduced process variability, closer confirmation of the output response to target requirements, and reduced development time and overall cost. The present work was undertaken to develop a bioprocess technology for the mass production of a marine yeast, Candida sp. S27. Yeasts isolated from the offshore waters of the south west coast of India and maintained in the Microbiology Laboratory were subjected to various tests for the selection of a potent strain for biomass production. The selected marine yeast was identified based on ITS sequencing, and biochemical/nutritional characterization of Candida sp. S27 was carried out. Using Response Surface Methodology (RSM), the process parameters (pH, temperature and salinity) were optimized. For mass production of yeast biomass, a chemically defined medium (Barnett and Ingram, 1955) and a crude medium (Molasses-Yeast extract) were optimized using RSM. Scale-up of biomass production was done in a bench-top fermenter using these two optimized media. The comparative efficacy of the defined and crude media was estimated, in addition to a nutritional evaluation of the biomass produced using these two optimized media.
Abstract:
In an industrial environment, knowing the process one is working with is crucial to ensuring that it functions well. In the present work, developed at the Prio Biocombustíveis S.A. facilities, the methanol recovery process was characterized using process data collected during this work together with historical process data, starting with the characterization of key process streams. Based on the information retrieved from the stream characterization, the Aspen Plus® process simulation software was used to replicate the process and perform a sensitivity analysis with the objective of assessing the relative importance of certain key process variables (reflux/feed ratio, reflux temperature, reboiler outlet temperature, and methanol, glycerol and water feed compositions). The work proceeded with the application of a set of statistical tools, starting with Principal Components Analysis (PCA), from which the interactions between process variables and their contribution to the process variability were studied. Next, Design of Experiments (DoE) was to be used to acquire experimental data and, with it, create a model for the water amount in the distillate; however, the necessary conditions to perform this method were not met and so it was abandoned. The Multiple Linear Regression (MLR) method was then used with the available data, creating several empirical models for the water in the distillate, the one with the highest fit having an R2 of 92.93% and an AARD of 19.44%. Although the AARD is still relatively high, the model is adequate for making fast estimates of the distillate's quality. As for fouling, its presence was noticed many times during this work. Since it was not possible to measure the fouling directly, the reboiler inlet steam pressure was used as an indicator of fouling growth and of how this growth varies with the amount of Used Cooking Oil incorporated in the overall process. Comparing the steam cost associated with the reboiler's operation when fouling is low (steam pressure of 1.5 bar) with that when fouling is high (steam pressure of 3 bar), an increase of about 58% occurs as fouling builds up.
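A minimal sketch of the MLR step together with the two fit metrics reported above (R2 and AARD, the average absolute relative deviation). The predictors and data below are synthetic placeholders; the actual model was fitted to plant data from the methanol recovery column.

```python
import numpy as np

# Placeholder operating data: reflux/feed ratio, reflux temperature (C),
# reboiler outlet temperature (C) -> water fraction in the distillate.
rng = np.random.default_rng(1)
X = np.column_stack([
    rng.uniform(0.5, 2.0, 50),      # reflux/feed ratio
    rng.uniform(40, 70, 50),        # reflux temperature
    rng.uniform(95, 120, 50),       # reboiler outlet temperature
])
y = 0.02 + 0.01 * X[:, 0] - 0.0002 * X[:, 1] + 0.0004 * X[:, 2] \
    + rng.normal(0, 0.002, 50)      # synthetic response with noise

# Multiple linear regression via least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
aard = np.mean(np.abs((y - y_hat) / y)) * 100   # average absolute relative deviation, %

print(f"R2 = {r2:.2%}, AARD = {aard:.2f}%")
```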
Abstract:
Assessing build-up and wash-off process uncertainty is important for accurate interpretation of model outcomes, to facilitate informed decision making for developing effective stormwater pollution mitigation strategies. Uncertainty inherent to pollutant build-up and wash-off processes influences the variations in pollutant loads entrained in stormwater runoff from urban catchments. However, build-up and wash-off predictions from stormwater quality models do not adequately represent such variations, due to poor characterisation of the variability of these processes in mathematical models. Changes to the mathematical form of current models to incorporate process variability facilitate accounting for process uncertainty without significantly affecting model prediction performance. Moreover, the investigation of uncertainty propagation from build-up to wash-off confirmed that uncertainty in the build-up process significantly influences wash-off process uncertainty. Specifically, the behaviour of particles <150µm during build-up primarily influences uncertainty propagation, resulting in appreciable variations in pollutant load and composition during a wash-off event.
Abstract:
Uncertainty inherent to heavy metal build-up and wash-off stems from process variability, which results in inaccurate interpretation of stormwater quality model predictions. This research study characterised the variability in heavy metal build-up and wash-off processes based on the temporal variations in particle-bound heavy metals commonly found on urban roads. The study outcomes show that the distributions of Al, Cr, Mn, Fe, Ni, Cu, Zn, Cd and Pb were consistent over the particle size fractions <150µm and >150µm, with most metals concentrated in the particle size fraction <150µm. When build-up and wash-off are considered as independent processes, the temporal variations in these processes in relation to the heavy metals load are consistent with variations in the particulate load. However, the temporal variations in the build-up and wash-off loads of heavy metals and particulates are not consistent for consecutive build-up and wash-off events that occur on a continuous timeline. These inconsistencies are attributed to interactions between heavy metals and particulates <150µm and >150µm, which are influenced by particle characteristics such as organic matter content. The behavioural variability of particles determines the variations in the heavy metals load entrained in stormwater runoff. Accordingly, the variability in build-up and wash-off of particle-bound pollutants needs to be characterised in the description of pollutant attachment to particulates in stormwater quality modelling. This will ensure the accounting of process uncertainty, and thereby enhance the interpretation of the outcomes derived from modelling studies.
Abstract:
Reducing energy consumption is a major challenge for "energy-intensive" industries such as papermaking. A commercially viable energy saving solution is to employ data-based optimization techniques to obtain a set of "optimized" operational settings that satisfy certain performance indices. The difficulties of this are: 1) problems of this type are inherently multicriteria, in the sense that improving one performance index might compromise other important measures; 2) practical systems often exhibit unknown complex dynamics and several interconnections, which make the modeling task difficult; and 3) as the models are acquired from existing historical data, they are valid only locally, and extrapolation incorporates the risk of increasing process variability. To overcome these difficulties, this paper presents a new decision support system for robust multiobjective optimization of interconnected processes. The plant is first divided into serially connected units to model the process, product quality, energy consumption, and corresponding uncertainty measures. Then a multiobjective gradient descent algorithm is used to solve the problem in line with the user's preference information. Finally, the optimization results are visualized for analysis and decision making. In practice, if further iterations of the optimization algorithm are considered, the validity of the local models must be checked before proceeding. The method is implemented by a MATLAB-based interactive tool, DataExplorer, supporting a range of data analysis, modeling, and multiobjective optimization techniques. The proposed approach was tested in two U.K.-based commercial paper mills, where the aim was to reduce steam consumption and increase productivity while maintaining product quality by optimizing vacuum pressures in the forming and press sections. The experimental results demonstrate the effectiveness of the method.
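As a hedged sketch of the optimization step, the code below applies a weighted-sum form of multiobjective gradient descent to two locally valid surrogate models (steam consumption and a product quality penalty) in two decision variables, with a simple bound that keeps iterates inside the region where the data-based models are trusted. The objective functions, weights, and bound are illustrative placeholders, not the plant models.

```python
import numpy as np

# Locally valid surrogate models (placeholders) in two decision variables,
# e.g. vacuum pressures in the forming and press sections (normalized units).
def steam(x):            # energy objective: lower is better
    return (x[0] - 0.2) ** 2 + 0.5 * (x[1] - 0.1) ** 2

def quality_penalty(x):  # quality objective: lower is better
    return (x[0] + 0.3) ** 2 + (x[1] - 0.4) ** 2

def grad(f, x, eps=1e-6):
    # Central-difference gradient of f at x.
    g = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = eps
        g[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return g

# User preference: relative importance of energy vs quality.
weights = np.array([0.7, 0.3])
x = np.array([0.0, 0.0])          # current operating point
valid_radius = 0.5                # models trusted only near the historical data

for _ in range(200):
    g = weights[0] * grad(steam, x) + weights[1] * grad(quality_penalty, x)
    x_new = x - 0.1 * g
    # Stay inside the region where the local models are considered valid.
    if np.linalg.norm(x_new) > valid_radius:
        x_new *= valid_radius / np.linalg.norm(x_new)
    x = x_new

print("suggested settings:", x.round(3),
      "| steam:", round(steam(x), 4),
      "| quality penalty:", round(quality_penalty(x), 4))
```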
Abstract:
In this paper, we present a unified approach to an energy-efficient, variation-tolerant design of the Discrete Wavelet Transform (DWT) in the context of image processing applications. It is worth noting that most image processing applications do not require exactly correct numerical outputs. We exploit this important feature and propose a design methodology for the DWT which exposes energy-quality tradeoffs at each level of the design hierarchy, from the algorithm level down to the architecture and circuit levels, by taking advantage of the limited perceptual ability of the Human Visual System. A unique feature of this design methodology is that it guarantees robustness under process variability and facilitates aggressive voltage over-scaling. Simulation results show significant energy savings (74%-83%) with minor degradation in output image quality, and avert the catastrophic failures that a conventional design can suffer under process variations. © 2010 IEEE.
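The premise that exact numerical correctness is unnecessary can be illustrated with a small sketch: a one-level Haar DWT is computed, the perceptually less important detail subbands are coarsely quantized to emulate error-prone (aggressively voltage-scaled) computation, and the reconstruction quality is measured as PSNR. The error model and quantization step are illustrative, not the paper's circuit-level scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)   # stand-in image

def haar_dwt2(a):
    # One-level 2-D Haar transform: averages/differences along columns, then rows.
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    # Exact inverse of haar_dwt2.
    lo = np.zeros((ll.shape[0] * 2, ll.shape[1]))
    hi = np.zeros_like(lo)
    lo[0::2, :], lo[1::2, :] = ll + lh, ll - lh
    hi[0::2, :], hi[1::2, :] = hl + hh, hl - hh
    a = np.zeros((lo.shape[0], lo.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = lo + hi, lo - hi
    return a

ll, lh, hl, hh = haar_dwt2(img)
step = 16.0                                  # coarse step emulating computation error
lh, hl, hh = (np.round(b / step) * step for b in (lh, hl, hh))
recon = haar_idwt2(ll, lh, hl, hh)

mse = np.mean((img - recon) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
print(f"PSNR with approximate detail subbands: {psnr:.1f} dB")
```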