989 results for Bellman-Harris Branching Processes
Abstract:
Iteration is unavoidable in the design process and should be incorporated when planning and managing projects in order to minimize surprises and reduce schedule distortions. However, planning and managing iteration is challenging because the relationships between its causes and effects are complex. Most approaches that use mathematical models to analyze the impact of iteration on the design process focus on a relatively small number of its causes and effects. Insights derived from these analytical models may therefore not be robust under a broader consideration of potential influencing factors. In this article, we synthesize an explanatory framework which describes the network of causes and effects of iteration identified from the literature, and introduce an analytic approach that combines task network modeling with System Dynamics simulation. Our approach models the network of causes and effects of iteration alongside the process architecture, which is required to analyze the impact of iteration on design process performance. We show how this allows managers to assess the impact of changes to the process architecture and to management levers which influence iterative behavior, accounting for the fact that these changes can occur simultaneously and can accumulate in non-linear ways. We also discuss how the insights resulting from this analysis can be visualized for easier consumption by project participants not familiar with simulation methods. Copyright © 2010 by ASME.
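A minimal sketch of the underlying idea, with hypothetical task names, durations, and rework probabilities (not the authors' model): a task network in which completing a downstream task can probabilistically trigger rework of an upstream task, simulated by Monte Carlo to see how iteration inflates the schedule.

    import random

    # Hypothetical three-task process: nominal durations, and the probability
    # that completing a downstream task triggers rework of an upstream one.
    durations = {"concept": 5.0, "design": 10.0, "test": 4.0}
    rework_prob = {("concept", "design"): 0.2, ("design", "test"): 0.4}

    def simulate_schedule(max_activities=50):
        """Execute tasks in sequence, re-queueing upstream work when rework fires."""
        queue = ["concept", "design", "test"]
        total, done = 0.0, 0
        while queue and done < max_activities:
            task = queue.pop(0)
            total += durations[task]
            done += 1
            for (upstream, downstream), p in rework_prob.items():
                if downstream == task and random.random() < p:
                    # iteration: redo the upstream task and its dependent
                    queue.extend([upstream, downstream])
        return total

    # Monte Carlo estimate of the schedule distribution under iteration
    samples = [simulate_schedule() for _ in range(10_000)]
    print("mean duration:", sum(samples) / len(samples))

Layering System Dynamics feedback on top of such a task network (e.g., workload or learning effects modulating durations and rework probabilities) is the kind of coupling the article analyzes.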
Abstract:
Reducing energy consumption is a major challenge for energy-intensive industries such as papermaking. A commercially viable energy-saving solution is to employ data-based optimization techniques to obtain a set of optimized operational settings that satisfy certain performance indices. The difficulties are: 1) problems of this type are inherently multicriteria, in the sense that improving one performance index might compromise other important measures; 2) practical systems often exhibit unknown complex dynamics and several interconnections, which make the modeling task difficult; and 3) as the models are acquired from existing historical data, they are valid only locally, and extrapolation incurs the risk of increased process variability. To overcome these difficulties, this paper presents a new decision support system for robust multiobjective optimization of interconnected processes. The plant is first divided into serially connected units to model the process, product quality, energy consumption, and corresponding uncertainty measures. A multiobjective gradient-descent algorithm is then used to solve the problem in line with the user's preference information. Finally, the optimization results are visualized for analysis and decision making. In practice, if further iterations of the optimization algorithm are considered, the validity of the local models must be checked before proceeding. The method is implemented in a MATLAB-based interactive tool, DataExplorer, supporting a range of data analysis, modeling, and multiobjective optimization techniques. The proposed approach was tested in two U.K.-based commercial paper mills, where the aim was to reduce steam consumption and increase productivity while maintaining product quality by optimizing vacuum pressures in the forming and press sections. The experimental results demonstrate the effectiveness of the method. © 2006 IEEE.
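As a minimal illustration of preference-guided multiobjective gradient descent, here is a generic weighted-sum sketch with toy quadratic objectives (not the paper's algorithm or its plant models):

    import numpy as np

    # Toy stand-ins for two competing objectives, e.g. steam consumption
    # and a product-quality penalty (both hypothetical).
    def f1(x): return float(np.sum((x - 1.0) ** 2))
    def f2(x): return float(np.sum((x + 1.0) ** 2))
    def grad_f1(x): return 2.0 * (x - 1.0)
    def grad_f2(x): return 2.0 * (x + 1.0)

    def preference_descent(x, weights=(0.7, 0.3), lr=0.05, steps=200):
        """Step along the preference-weighted combination of the gradients.
        Different weight choices trace out different Pareto trade-offs."""
        w1, w2 = weights
        for _ in range(steps):
            x = x - lr * (w1 * grad_f1(x) + w2 * grad_f2(x))
        return x

    x_star = preference_descent(np.zeros(3))
    print(x_star, f1(x_star), f2(x_star))

In the paper's setting, each accepted step would additionally be checked against the validity region of the locally fitted data-based models.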
Abstract:
A new approximate solution for the first passage probability of a stationary Gaussian random process is presented, based on an estimate of the mean clump size. A simple expression for the mean clump size is derived in terms of the cumulative normal distribution function, which avoids the lengthy numerical integrations required by similar existing techniques. The method is applied to a linear oscillator and an ideal bandpass process, and good agreement with published results is obtained. By making a slight modification to an existing analysis, it is shown that a widely used empirical result for the asymptotic form of the first passage probability can be deduced theoretically.
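For context, a minimal sketch of the standard clump-corrected Poisson approximation that such methods refine: Rice's upcrossing rate divided by the mean clump size (the paper's specific closed-form clump-size expression is not reproduced here).

    import math

    def upcrossing_rate(b, sigma, sigma_dot):
        """Rice's mean rate of upcrossings of level b for a stationary,
        zero-mean Gaussian process with std sigma and derivative std sigma_dot."""
        return (sigma_dot / (2.0 * math.pi * sigma)) * math.exp(-b**2 / (2.0 * sigma**2))

    def survival_probability(b, T, sigma, sigma_dot, mean_clump_size=1.0):
        """Poisson approximation to P(no exceedance of level b in [0, T]).
        Dividing the rate by the mean clump size corrects for exceedances
        arriving in clumps rather than independently."""
        nu_eff = upcrossing_rate(b, sigma, sigma_dot) / mean_clump_size
        return math.exp(-nu_eff * T)

    print(survival_probability(b=3.0, T=100.0, sigma=1.0, sigma_dot=1.0))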
Abstract:
In recent years, many industrial firms have used roadmapping as an effective process methodology for projecting future technology and for coordinating technology planning and strategy. Firms potentially realize a number of benefits in deploying technology roadmapping (TRM) processes. Roadmaps provide information identifying which new technologies will meet firms' future product demands, allowing companies to leverage R&D investments by choosing appropriately from a range of alternative technologies. Moreover, the roadmapping process serves as an important communication tool, helping to bring about consensus among roadmap developers, as well as between participants brought in during the development process, who may communicate their understanding of shared corporate goals through the roadmap. However, few conceptual accounts or case studies have argued that roadmapping processes may be used effectively as communication tools. This paper therefore seeks to elaborate a theoretical foundation for identifying the factors that must be considered in setting up a roadmap and for analyzing the effect of these factors on technology roadmap credibility as perceived by its users. Based on survey results from 120 different R&D units, this empirical study found that firms need to explore further how they can enable frequent interactions between the TRM development team and TRM participants. A high level of interaction will improve the credibility of a TRM, with communication channels selected by the organization also positively affecting TRM credibility. © 2011 Elsevier Inc.
Abstract:
A novel test method for the characterisation of flexible forming processes is proposed and applied to four such processes: Incremental Sheet Forming (ISF), conventional spinning, the English wheel, and the power hammer. The proposed method is developed in analogy with time-domain control engineering, where a system is characterised by its impulse response. The spatial impulse response is used to characterise the change in workpiece deformation created by a process and, combined with a strain spectrogram, offers a novel way to characterise a process and the physical effect it has on the workpiece. Physical and numerical trials to study the effects of process and material parameters on the spatial impulse response lead to three main conclusions. Incremental sheet forming is particularly sensitive to process parameters. The English wheel and power hammer are strongly similar and largely insensitive to both process and material parameters. Spinning develops in two stages and is sensitive to most process parameters, but insensitive to prior deformation. Finally, the proposed method could be applied to modelling, classification of existing and novel processes, product-process matching, and closed-loop control of flexible forming processes. © 2012 Elsevier B.V.
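A minimal sketch of the underlying idea (the grids and the Gaussian bump are hypothetical illustrations, not data from the paper): the spatial impulse response is the difference between the workpiece deformation field after and before a single localized application of the tool.

    import numpy as np

    # Hypothetical deformation fields sampled on a grid over the workpiece,
    # e.g. out-of-plane displacement before and after one tool increment.
    x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
    before = np.zeros_like(x)
    after = 0.1 * np.exp(-(x**2 + y**2) / 0.02)  # localized dent from one pass

    # Spatial impulse response: deformation change from one localized input.
    impulse_response = after - before

    # A spatial-frequency view of the response, loosely analogous to the
    # strain spectrogram used to characterise the process.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(impulse_response)))
    print(impulse_response.max(), spectrum.max())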
Resumo:
Modelling dialogue as a Partially Observable Markov Decision Process (POMDP) enables a dialogue policy robust to speech understanding errors to be learnt. However, a major challenge in POMDP policy learning is to maintain tractability, so the use of approximation is inevitable. We propose applying Gaussian Processes in Reinforcement learning of optimal POMDP dialogue policies, in order (1) to make the learning process faster and (2) to obtain an estimate of the uncertainty of the approximation. We first demonstrate the idea on a simple voice mail dialogue task and then apply this method to a real-world tourist information dialogue task. © 2010 Association for Computational Linguistics.
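The core idea can be sketched generically (toy features and targets, not the paper's GP-based policy learner): Gaussian process regression over belief-state features returns both a value estimate and a variance, and the variance is what quantifies the uncertainty of the approximation.

    import numpy as np

    rng = np.random.default_rng(0)

    def rbf(X1, X2, lengthscale=1.0):
        """Squared-exponential kernel between two sets of feature vectors."""
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale**2)

    # Toy (belief-feature, observed return) pairs gathered from dialogues.
    X = rng.random((20, 3))
    y = rng.random(20)

    def gp_posterior(X, y, X_new, noise=0.1):
        """Posterior mean and variance at X_new; the variance is the
        uncertainty estimate that can guide exploration during learning."""
        K = rbf(X, X) + noise**2 * np.eye(len(X))
        K_new = rbf(X_new, X)
        mean = K_new @ np.linalg.solve(K, y)
        v = np.linalg.solve(K, K_new.T)
        var = 1.0 - np.sum(K_new * v.T, axis=1)  # prior variance is 1 for RBF
        return mean, var

    mean, var = gp_posterior(X, y, rng.random((5, 3)))
    print(mean, var)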
Abstract:
Fibrous collagenous networks are not only stiff but also tough, owing to their complex microstructures. This stiff yet tough behavior is desirable for both medical and military applications, but it is difficult to reproduce in engineering materials. While the nonlinear hyperelastic behavior of fibrous networks has been studied extensively, the understanding of toughness is still incomplete. Here, we identify a microstructure mimicking the branched bundles of a natural type I collagen network, in which partially cross-linked long fibers give rise to novel combinations of stiffness and toughness. Finite element analysis shows that the stiffness of fully cross-linked fibrous networks is amplified by increasing the fibril length and cross-link density. However, a trade-off of such stiff networks is reduced toughness. With partially cross-linked networks of long fibrils, the networks achieve comparable stiffness and improved toughness relative to fully cross-linked networks. Further, the partially cross-linked networks avoid the formation of kinks, which cause fibril rupture during deformation. As a result, the branching allows the networks to be both stiff and tough.
Abstract:
We introduce a Gaussian process model of functions which are additive. An additive function is one which decomposes into a sum of low-dimensional functions, each depending on only a subset of the input variables. Additive GPs generalize both Generalized Additive Models and the standard GP models which use squared-exponential kernels. Hyperparameter learning in this model can be seen as Bayesian Hierarchical Kernel Learning (HKL). We introduce an expressive but tractable parameterization of the kernel function, which allows efficient evaluation of all input interaction terms, whose number is exponential in the input dimension. The additional structure discoverable by this model results in increased interpretability, as well as state-of-the-art predictive power in regression tasks.
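The exponentially many interaction terms can be evaluated efficiently because the n-th order additive kernel is the n-th elementary symmetric polynomial of the one-dimensional kernels, computable with the Newton-Girard recursion. A minimal sketch under that construction (toy kernel values, illustrative only):

    import numpy as np

    def additive_kernels(z):
        """Given 1-D kernel evaluations z[i] = k_i(x_i, x'_i) for one input pair,
        return the n-th order additive kernel for every order n = 1..D via the
        Newton-Girard recursion over elementary symmetric polynomials.
        Cost is O(D^2) instead of the naive O(2^D) over all interaction terms."""
        D = len(z)
        s = np.array([np.sum(z ** k) for k in range(D + 1)])  # power sums
        e = np.zeros(D + 1)
        e[0] = 1.0
        for n in range(1, D + 1):
            e[n] = sum((-1) ** (k - 1) * e[n - k] * s[k] for k in range(1, n + 1)) / n
        return e[1:]

    z = np.array([0.9, 0.5, 0.7])   # toy one-dimensional kernel values
    print(additive_kernels(z))      # 1st-, 2nd-, and 3rd-order additive kernels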
Abstract:
This paper reports an extensive analysis of the defect-related localized emission processes occurring in InGaN/GaN-based light-emitting diodes (LEDs) at low reverse- and forward-bias conditions. The analysis is based on combined electrical characterization and spectrally and spatially resolved electroluminescence (EL) measurements. Results of this analysis show that: (i) under reverse bias, LEDs can emit a weak luminescence signal, which is directly proportional to the injected reverse current. Reverse-bias emission is localized in submicrometer-size spots; the intensity of the signal is strongly correlated with the threading dislocation (TD) density, since TDs are preferential paths for leakage current conduction. (ii) Under low forward-bias conditions, the intensity of the EL signal is not uniform over the device area. Spectrally resolved EL analysis of green LEDs identifies the presence of localized spots emitting at 600 nm (i.e., in the yellow spectral region), whose origin is ascribed to localized tunneling occurring between the quantum wells and the barrier layers of the diodes, with subsequent defect-assisted radiative recombination. The role of defects in determining yellow luminescence is confirmed by the high activation energy of the thermal quenching of yellow emission (Ea = 0.64 eV). © 2012 IEEE.
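For reference, activation energies of this kind are commonly extracted by fitting the temperature dependence of the integrated emission intensity to a standard Arrhenius-type thermal-quenching form (a generic model, not necessarily the paper's exact fit):

    \[ I(T) = \frac{I_0}{1 + C\,\exp(-E_a / k_B T)} \]

where I_0 is the low-temperature intensity, C a process-dependent constant, and k_B the Boltzmann constant; E_a then follows from the slope of ln(I_0/I - 1) versus 1/(k_B T).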
Abstract:
Decisions about noisy stimuli require evidence integration over time. Traditionally, evidence integration and decision making are described as a one-stage process: a decision is made when evidence for the presence of a stimulus crosses a threshold. Here, we show that one-stage models cannot explain psychophysical experiments on feature fusion, where two visual stimuli are presented in rapid succession. Paradoxically, the second stimulus biases decisions more strongly than the first one, contrary to predictions of one-stage models and intuition. We present a two-stage model where sensory information is integrated and buffered before it is fed into a drift diffusion process. The model is tested in a series of psychophysical experiments and explains both accuracy and reaction time distributions. © 2012 Rüter et al.
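A minimal simulation sketch of the two-stage idea (illustrative parameters, not the paper's fitted values): sensory input is first integrated with leak while the two stimuli are presented, and the buffered total then serves as the drift rate of a diffusion process racing to threshold. In this toy version the leak weights the buffer toward the more recent input, which is one way the second stimulus can come to dominate the decision.

    import numpy as np

    rng = np.random.default_rng(1)

    def two_stage_trial(drive1=1.0, drive2=-1.0, t_stim=0.03, dt=0.001,
                        leak=20.0, gain=50.0, noise=1.0, threshold=0.5):
        """Stage 1: leaky integration of two successive stimuli into a buffer.
        Stage 2: the buffered value drives a diffusion process to threshold."""
        v = 0.0
        for drive in (drive1, drive2):   # stimuli presented in rapid succession
            for _ in range(int(t_stim / dt)):
                v += (-leak * v + drive) * dt
        buffered_drift = gain * v        # hypothetical gain into a drift rate

        x, t = 0.0, 0.0
        while abs(x) < threshold and t < 10.0:
            x += buffered_drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        return x > 0, t  # choice (True = first-stimulus direction) and RT

    choices, rts = zip(*(two_stage_trial() for _ in range(1000)))
    print("P(choose stimulus-1 direction):", np.mean(choices))

Running this yields a choice probability below 0.5 despite the first stimulus arriving first, reproducing the paradoxical bias toward the second stimulus, along with a full reaction-time distribution from the diffusion stage.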