981 results for Mean-Reverting Process


Relevance: 30.00%

Publisher:

Abstract:

This paper reports on the fabrication of cantilever silicon-on-insulator (SOI) optical waveguides and presents solutions to the challenges of using a very thin 260-nm active silicon layer in the SOI structure to enable single-transverse-mode operation of the waveguide with minimal optical transmission losses. In particular, to ameliorate the anchor effect caused by the mean stress difference between the active silicon layer and the buried oxide layer, a cantilever flattening process based on Ar plasma treatment is developed and presented. Vertical deflections of 0.5 µm for 70-µm-long cantilevers are reduced to within a few nanometers. Experimental investigations of cantilever mechanical resonance characteristics confirm the absence of significant detrimental side effects. Optical and mechanical modeling is used extensively to supplement the experimental observations. This approach can satisfy the requirements for on-chip simultaneous readout of many integrated cantilever sensors, in which the displacement or resonant-frequency changes induced by analyte absorption are measured using an optical-waveguide-based division-multiplexed system.

Abstract:

Exascale systems of the future are predicted to have a mean time between failures (MTBF) of less than one hour. At such low MTBFs, periodic checkpointing alone results in low efficiency: the high number of application failures leads to a large amount of work lost to rollbacks. In such scenarios, proactive fault tolerance mechanisms that help avoid a significant number of failures are essential. In this work, we develop a mechanism for proactive fault tolerance using partial replication of a set of application processes. Our fault tolerance framework adaptively changes the set of replicated processes periodically, based on failure predictions, to avoid failures. We have developed an MPI prototype implementation, PAREP-MPI, that allows the replica set to be changed. We show that our strategy of adaptive process replication significantly outperforms existing mechanisms, providing up to 20 percent improvement in application efficiency even for exascale systems.
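A back-of-the-envelope model makes the efficiency claim concrete. The sketch below uses Young's first-order approximation for the optimal checkpoint interval together with a simple waste model (checkpoint overhead plus expected rework); the MTBF and checkpoint-cost numbers are illustrative, not taken from the paper:

```python
import math

def young_interval(delta, mtbf):
    """Young's first-order approximation of the optimal checkpoint interval."""
    return math.sqrt(2 * delta * mtbf)

def efficiency(tau, delta, mtbf):
    """First-order efficiency: 1 minus checkpoint overhead per interval
    minus expected rework (half an interval lost per failure, on average)."""
    return 1.0 - delta / tau - tau / (2.0 * mtbf)

# MTBF of 30 minutes (sub-hour, as predicted for exascale) and a
# checkpoint cost of 2 minutes -- hypothetical numbers for illustration.
mtbf, delta = 30.0, 2.0
tau = young_interval(delta, mtbf)
print(f"optimal interval: {tau:.1f} min, efficiency: {efficiency(tau, delta, mtbf):.2f}")
```

Even at the optimal interval, well over a third of the machine time is lost at a 30-minute MTBF in this model, which is the motivation for proactive replication.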

Abstract:

The dynamics of the survival of recruiting fish are analyzed as evolving random processes of aggregation and mortality. The analyses draw on recent advances in the physics of complex networks and, in particular, the scale-free degree distribution arising from growing random networks with preferential attachment of links to nodes. In this study, simulations were conducted in which recruiting fish (1) were subjected to mortality using alternative mortality encounter models and (2) aggregated according to either random encounters (two schools randomly encountering one another join into a single school) or preferential attachment (the probability of a successful aggregation of two schools is proportional to the school sizes). The simulations started from either a "disaggregated" (every school comprises a single fish) or an aggregated initial condition. The results showed the school-size distribution with preferential attachment evolving toward a scale-free distribution, whereas random attachment evolved toward an exponential distribution. Preferential attachment strategies outperformed random attachment strategies in terms of recruitment survival when mortality encounters were weighted toward schools rather than toward individual fish. Mathematical models were developed whose solutions (analytic or numerical) mimicked the simulation results. The resulting models include both Beverton-Holt and Ricker-like recruitment, predicting recruitment as a function of initial mean school size as well as initial stock size. The results suggest that school-size distributions during recruitment may provide information on recruitment processes, and the models provide a template for expanding both theoretical and empirical recruitment research.
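The two aggregation rules can be sketched as a minimal merging simulation, starting from the disaggregated condition; the fish counts, step counts, and seed below are illustrative choices, not values from the paper:

```python
import random

def simulate(n_fish, n_steps, preferential, seed=0):
    """Merge schools by uniform random pairing or by size-weighted
    (preferential attachment) pairing."""
    rng = random.Random(seed)
    schools = [1] * n_fish  # disaggregated start: every school is one fish
    for _ in range(n_steps):
        if len(schools) < 2:
            break
        if preferential:
            # probability of a school being picked is proportional to its size
            i, j = rng.choices(range(len(schools)), weights=schools, k=2)
            if i == j:
                continue
        else:
            i, j = rng.sample(range(len(schools)), 2)
        a, b = schools[i], schools[j]
        for idx in sorted((i, j), reverse=True):
            schools.pop(idx)
        schools.append(a + b)  # the two schools join into one
    return sorted(schools, reverse=True)

pref = simulate(2000, 1500, preferential=True)
rand = simulate(2000, 1500, preferential=False)
# Preferential attachment tends toward a heavier-tailed size distribution.
print("largest school (preferential):", pref[0])
print("largest school (random):     ", rand[0])
```

Size-weighted sampling is what drives the heavy (scale-free-like) tail: large schools are increasingly likely to be chosen again, mirroring preferential attachment of links in growing networks.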

Abstract:

In standard Gaussian process regression, input locations are assumed to be noise free. We present a simple yet effective GP model for training on input points corrupted by i.i.d. Gaussian noise. To make the computations tractable, we use a local linear expansion about each input point. This allows the input noise to be recast as output noise proportional to the squared gradient of the GP posterior mean. The input noise variances are inferred from the data as extra hyperparameters and are trained alongside the other hyperparameters by the usual maximisation of the marginal likelihood. Training uses an iterative scheme that alternates between optimising the hyperparameters and calculating the posterior gradient. Analytic predictive moments can then be found for Gaussian-distributed test points. We compare our model with others over a range of regression problems and show that it improves over current methods.
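The central identity here, input noise recast as output noise proportional to the squared gradient of the posterior mean, can be illustrated numerically. In this sketch `np.sin` stands in for a trained GP posterior mean, and the two variance values are hypothetical:

```python
import numpy as np

def effective_output_noise(x, posterior_mean, sigma_y2, sigma_x2, h=1e-4):
    """Recast i.i.d. Gaussian input noise as heteroscedastic output noise
    via a local linear expansion:
        var(x) ~= sigma_y^2 + sigma_x^2 * f'(x)^2
    The gradient of the posterior mean is estimated by central differences."""
    grad = (posterior_mean(x + h) - posterior_mean(x - h)) / (2 * h)
    return sigma_y2 + sigma_x2 * grad ** 2

# Hypothetical posterior mean standing in for a trained GP.
f = np.sin
x = np.linspace(0, 2 * np.pi, 5)
print(effective_output_noise(x, f, sigma_y2=0.01, sigma_x2=0.1))
# Noise is inflated most where the posterior mean is steepest.
```

In the actual model the gradient is available analytically from the GP posterior, and the iterative scheme refits the hyperparameters against this inflated noise.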

Abstract:

Today's fast-paced, dynamic environments mean that for organizations to keep "ahead of the game", engineering managers need to maximize current opportunities and avoid repeating past mistakes. This article describes the development of a collaborative strategic management tool, the Experience Scan, which captures past experience and applies the resulting learning to present and future situations. Experience Scan workshops were held in a number of technology organizations, and the tool was developed and refined until its format stabilized. From participants' feedback, the workshop-based tool was judged a useful and efficient mechanism for communication and knowledge management, contributing to organizational learning.

Abstract:

This paper presents a numerical study of the impact of process-induced variations on the achievable motional resistance Rx of one-dimensional, cyclic, and cross-coupled architectures of electrostatically transduced MEMS resonators operating in the 250 kHz range. Monte Carlo numerical simulations accounting for up to 0.75% variation in critical resonator feature sizes were performed on 1, 2, 3, 4, 5, and 9 coupled MEMS resonators for three distinct coupling architectures. Improvements of 100X in the spread of Rx and 2.7X in the mean achievable Rx are reported for the case of 9 resonators implemented in the cross-coupled topology, as opposed to the traditional one-dimensional chain. © 2013 IEEE.

Abstract:

Surface micro-roughness, surface chemical properties, and surface wettability are three important aspects of wafer surfaces during wafer cleaning, and they determine the bonding quality of ordinary direct wafer bonding. In this study, InP wafers were divided into four groups and treated with different chemical processes. The characteristics of the treated InP surfaces were then carefully studied by X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM), and contact angle measurements. The optimal wafer treatment method for wafer bonding was determined by comparing the results of the processes as a whole. This optimization was later evaluated by scanning electron microscopy (SEM), and ridge-waveguide 1.55-µm Si-based InP/InGaAsP multi-quantum-well laser chips were also fabricated. (c) 2005 Elsevier B.V. All rights reserved.

Abstract:

By combining self-consistent field theory with the lattice Boltzmann method, we propose a model for polymer melts. Compared with models based on Ginzburg-Landau free energy, our model does not employ phenomenological free energies to describe the system and can account for the topological details of the polymer chains. We use this model to study the effects of hydrodynamic interactions on the dynamics of microphase separation in block copolymers. In the early stage of phase separation, the exponential growth predicted by the Cahn-Hilliard treatment is found. The simulation results also show that the effect of hydrodynamic interactions can be neglected in the early stage.

Abstract:

Wave breaking in the open ocean and coastal zones remains an intriguing yet incompletely understood process, with a strong observed association with wave groups. A recent numerical study of the evolution of fully nonlinear, two-dimensional deep-water wave groups identified a robust threshold of a diagnostic growth-rate parameter that separated nonlinear wave groups that evolved to breaking from those that evolved with recurrence. This paper investigates whether these deep-water wave-breaking results apply more generally, particularly in finite-water-depth conditions. For unforced nonlinear wave groups in intermediate water depths over a flat bottom, it was found that the upper bound of the diagnostic growth-rate threshold parameter established for deep-water wave groups is also applicable in intermediate water depths, given by k₀h ≥ 2, where k₀ is the mean carrier wavenumber and h is the mean depth. For breaking onset over an idealized circular-arc sandbar located in an otherwise flat, intermediate-depth (k₀h ≥ 2) environment, the deep-water breaking diagnostic growth rate was found to be applicable provided that the height of the sandbar is less than one-quarter of the ambient mean water depth. Thus, for this range of intermediate-depth conditions, these two classes of bottom topography only marginally modify the diagnostic growth rate found for deep-water waves. However, when intermediate-depth wave groups (k₀h ≥ 2) shoal over a sandbar whose height exceeds one-half of the ambient water depth, the waves can steepen significantly without breaking. In such cases, the breaking threshold level and the maximum of the diagnostic growth rate increase systematically with the height of the sandbar. The dimensions and position of the sandbar also influenced the evolution and breaking threshold of the wave groups. For sufficiently high sandbars, the bottom topography can induce additional nonlinearity into the wave-field geometry and associated dynamics, modifying the otherwise robust deep-water breaking-threshold results.

Abstract:

Objective: To develop sedation, pain, and agitation quality measures using process control methodology and evaluate their properties in clinical practice. Design: A Sedation Quality Assessment Tool was developed and validated to capture data for 12-hour periods of nursing care. Domains included pain/discomfort and sedation-agitation behaviors; sedative, analgesic, and neuromuscular blocking drug administration; ventilation status; and conditions potentially justifying deep sedation. Predefined sedation-related adverse events were recorded daily. Using an iterative process, algorithms were developed to describe the proportion of care periods with poor limb relaxation, poor ventilator synchronization, unnecessary deep sedation, agitation, and an overall optimum sedation metric. Proportion charts described processes over time (2-monthly intervals) for each ICU. The numbers of patients treated between sedation-related adverse events were described with G charts. Automated algorithms generated charts for 12 months of sequential data. Mean values for each process were calculated, and variation within and between ICUs was explored qualitatively. Setting: Eight Scottish ICUs over a 12-month period. Patients: Mechanically ventilated patients. Interventions: None. Measurements and Main Results: The Sedation Quality Assessment Tool agitation-sedation domains correlated with the Richmond Sedation Agitation Scale score (Spearman ρ = 0.75) and were reliable in clinician-clinician (weighted κ = 0.66) and clinician-researcher (κ = 0.82) comparisons. The limb movement domain had fair correlation with the Behavioral Pain Scale (ρ = 0.24) and was reliable in clinician-clinician (κ = 0.58) and clinician-researcher (κ = 0.45) comparisons. Ventilator synchronization correlated with the Behavioral Pain Scale (ρ = 0.54), and reliability in clinician-clinician (κ = 0.29) and clinician-researcher (κ = 0.42) comparisons was fair to moderate.
Eight hundred twenty-five patients were enrolled (range, 59-235 across ICUs), providing 12,385 care periods for evaluation (range, 655-3,481 across ICUs). The mean proportion of care periods with each quality metric varied between ICUs: excessive sedation 12-38%; agitation 4-17%; poor relaxation 13-21%; poor ventilator synchronization 8-17%; and overall optimum sedation 45-70%. Mean adverse event intervals ranged from 1.5 to 10.3 patients treated. The quality measures appeared relatively stable during the observation period. Conclusions: Process control methodology can be used to simultaneously monitor multiple aspects of pain-sedation-agitation management within ICUs. Variation within and between ICUs could be used as a trigger to explore practice variation, improve quality, and monitor this over time.
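The proportion charts mentioned here follow standard statistical process control practice; a minimal sketch of three-sigma p-chart limits is below. The monthly proportions and sample size are illustrative values, not the study's data:

```python
import math

def p_chart_limits(proportions, n):
    """Three-sigma control limits for a proportion (p) chart, given
    per-period proportions and a constant period sample size n."""
    p_bar = sum(proportions) / len(proportions)
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    lcl = max(0.0, p_bar - 3 * sigma)  # proportions cannot go below 0
    ucl = min(1.0, p_bar + 3 * sigma)  # or above 1
    return p_bar, lcl, ucl

# Illustrative: proportions of care periods with excessive sedation
# in six successive 2-monthly intervals, 150 care periods per interval.
monthly = [0.24, 0.19, 0.28, 0.22, 0.25, 0.21]
centre, lcl, ucl = p_chart_limits(monthly, n=150)
print(f"centre {centre:.3f}, limits [{lcl:.3f}, {ucl:.3f}]")
```

A period falling outside the limits would flag special-cause variation, which is the trigger-for-exploration role the conclusions describe.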

Abstract:

This study considered the optimisation of granola breakfast cereal manufacturing by wet granulation and pneumatic conveying. Granola is an aggregated food product used as a breakfast cereal and in cereal bars. Processing involves mixing the dry ingredients (typically oats, nuts, etc.) followed by the addition of a binder, which can contain honey, water and/or oil. In this work, two parallel wet granulation processes for producing aggregate granola products were investigated: (a) a high shear mixing granulation process followed by drying/toasting in an oven; (b) a continuous fluidised bed process followed by drying/toasting in an oven. In high shear granulation, the influence of process parameters on key granule quality attributes, such as granule size distribution and the textural properties of granola, was investigated. The experimental results show that impeller rotational speed is the single most important process parameter influencing the physical and textural properties of granola; binder addition rate and wet massing time also have significant effects on granule properties. Increasing the impeller speed and wet massing time increases the median granule size and correlates positively with density. The combination of high impeller speed and low binder addition rate produced granules with the highest levels of hardness and crispness. In the fluidised bed granulation process, the effects of nozzle air pressure and binder spray rate on key aggregate quality attributes were studied. The results show that a decrease in nozzle air pressure leads to a larger mean granule size, and that the combination of the lowest nozzle air pressure and lowest binder spray rate produces granules with the highest levels of hardness and crispness. Overall, the high shear granulation process led to larger, denser, less porous and stronger (less likely to break) aggregates than the fluidised bed process.
The study also examined the particle breakage during pneumatic conveying of granola produced by both the high shear and the fluidised bed granulation processes. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three different conveying rig configurations were employed: a straight pipe, a rig consisting of two 45° bends, and one with a 90° bend. Particle breakage increases with applied pressure drop, and the 90° bend pipe results in more attrition at all conveying velocities relative to the other pipe geometries. Additionally, for the granules produced in the high shear granulator, those produced at the highest impeller speed, while being the largest, also have the lowest levels of proportional breakage, whereas smaller granules produced at the lowest impeller speed have the highest levels of breakage. This effect clearly shows the importance of shear history (during granule production) on breakage during subsequent processing. For fluidised bed granulation, no single operating parameter was deemed to have a significant effect on breakage during subsequent conveying. Finally, a simple power-law breakage model based on process input parameters was developed for both manufacturing processes. It was found suitable for predicting the breakage of granola breakfast cereal at various applied air velocities using a number of pipe configurations, taking shear histories into account.
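A power-law breakage model of the kind described can be fitted as a linear regression in log space. The functional form breakage = k * v**a and the data below are assumed purely for illustration (the study's actual model also includes process input parameters and shear history):

```python
import numpy as np

def fit_power_law(velocity, breakage):
    """Fit breakage = k * v**a by least squares on log-transformed data."""
    log_v, log_b = np.log(velocity), np.log(breakage)
    a, log_k = np.polyfit(log_v, log_b, 1)  # slope = exponent, intercept = log k
    return np.exp(log_k), a

# Hypothetical attrition data: proportional breakage vs conveying air velocity.
v = np.array([10.0, 15.0, 20.0, 25.0])
b = 0.002 * v ** 1.8  # synthetic data following an exact power law
k, a = fit_power_law(v, b)
print(f"k = {k:.4f}, exponent a = {a:.2f}")
```

With real data the fit would not be exact, but the log-space slope still gives the velocity exponent, quantifying how sharply attrition grows with conveying velocity.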

Abstract:

The increasing complexity of new manufacturing processes and the continuously growing range of fabrication options mean that critical decisions about the insertion of new technologies must be made as early as possible in the design process. Mitigating technology risks under limited knowledge is a key factor, and a major requirement, for the successful development of new technologies. Addressing this challenge requires a risk mitigation methodology that incorporates both qualitative and quantitative analysis. This paper outlines the methodology being developed under a major UK grand challenge project, 3D-Mintegration. The main focus is on identifying risks through the identification of the product's key characteristics using a product breakdown approach. The assessment of the identified risks uses quantification and prioritisation techniques to evaluate and rank them. Traditional statistical process control, based on process capability and six sigma concepts, is applied to measure the process capability resulting from the identified risks. The paper also details a numerical approach to risk analysis, based on a computational framework in which modelling and statistical techniques are integrated. An example modelling and simulation exercise is given for focused ion beam processing, one of the manufacturing processes investigated in the project.
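The process capability measurement referred to here rests on the standard Cp/Cpk indices from statistical process control. A minimal sketch follows; the specification limits and process statistics are illustrative, not values from the project:

```python
def process_capability(mean, std, lsl, usl):
    """Standard capability indices used in six-sigma style assessment.
    Cp compares the spec width to the 6-sigma process spread;
    Cpk additionally penalises off-centre processes."""
    cp = (usl - lsl) / (6 * std)
    cpk = min(usl - mean, mean - lsl) / (3 * std)
    return cp, cpk

# Illustrative: a product key characteristic with spec limits 9.0-11.0.
cp, cpk = process_capability(mean=10.2, std=0.25, lsl=9.0, usl=11.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

A Cpk well below Cp, as here, signals that the process is capable in spread but off-centre, which is exactly the kind of early quantitative risk indicator the methodology aims to surface.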

Abstract:

This study investigates the influence of process parameters on the fluidised hot melt granulation of lactose and PEG 6000, and the subsequent tablet pressing of the granules. Granulation experiments were performed to assess the effect of granulation time and binder content of the feed on the resulting granule properties, such as mass-mean granule size, size distribution, granule fracture stress, and granule porosity. These data were correlated using the granule growth regime model. The dominant granule growth mechanisms in this melt granulation system were found to be nucleation followed by steady growth (PEG 10–20% w/w). With binder contents greater than 20% w/w, however, the granulation moved into the "over-wet massing" regime, in which discrete granule formation could not be obtained. The granules produced in the melt fluidised bed process were subsequently pressed into tablets using an industrial tablet press. The physical properties of the tablets (fracture stress, disintegration time, and friability) were assessed using industry standards. These analyses indicated that the particle size and binder content of the initial granules influenced the mechanical properties of the tablets. Notably, a decrease in initial granule size resulted in an increase in the fracture stress of the tablets formed.

Abstract:

This paper introduces a fast algorithm for moving window principal component analysis (MWPCA) that adapts a principal component model. It incorporates the concept of recursive adaptation within a moving window to (i) adapt the mean and variance of the process variables, (ii) adapt the correlation matrix, and (iii) adjust the PCA model by recomputing the decomposition. The paper shows that the new algorithm is computationally faster than conventional moving window techniques if the window size exceeds three times the number of variables, and that its cost is not affected by the window size. A further contribution is the introduction of an N-step-ahead horizon into process monitoring: the PCA model identified N steps earlier is used to analyze the current observation. For monitoring complex chemical systems, this work shows that the use of the horizon improves the ability to detect slowly developing drifts.
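For orientation, a naive (non-recursive) moving-window PCA can be sketched as below. This version recomputes the window statistics and decomposition from scratch at every step, which is precisely the per-window cost the paper's recursive mean/variance/correlation updates avoid; the data are synthetic:

```python
import numpy as np

def moving_window_pca(X, window):
    """Naive moving-window PCA: at each step, standardise the current
    window and eigendecompose its correlation matrix.
    (The paper's fast algorithm instead updates the mean, variance,
    and correlation matrix recursively as the window slides.)"""
    models = []
    for t in range(window, X.shape[0] + 1):
        W = X[t - window:t]
        mu, sd = W.mean(axis=0), W.std(axis=0, ddof=1)
        Z = (W - mu) / sd                      # standardised window
        corr = Z.T @ Z / (window - 1)          # correlation matrix
        eigvals, eigvecs = np.linalg.eigh(corr)
        # store the model in descending eigenvalue order
        models.append((mu, sd, eigvals[::-1], eigvecs[:, ::-1]))
    return models

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                  # 200 samples, 4 process variables
models = moving_window_pca(X, window=50)
print(len(models), "window models; leading eigenvalues:", models[-1][2][:2])
```

An N-step-ahead horizon would then score observation t against `models[t - N]` rather than the most recent model, so a slow drift cannot be absorbed into the model before it is detected.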

Abstract:

The effects of the process variables (pH of the aqueous phase; rate of addition of the organic, polymeric, drug-containing phase to the aqueous phase; organic:aqueous phase volume ratio; and aqueous phase temperature) on the entrapment of propranolol hydrochloride in ethylcellulose (N4) microspheres prepared by the solvent evaporation method were examined using a factorial design. The observed range of drug entrapment was 1.43 ± 0.02% w/w (pH 6, 25 °C, phase volume ratio 1:10, fast rate of addition) to 16.63 ± 0.92% w/w (pH 9, 33 °C, phase volume ratio 1:10, slow rate of addition), corresponding to mean entrapment efficiencies of 2.86 and 33.26, respectively. Increased pH, increased temperature, and a decreased rate of addition significantly enhanced entrapment efficiency; the organic:aqueous phase volume ratio did not significantly affect drug entrapment. Statistical interactions were observed between pH and rate of addition, pH and temperature, and temperature and rate of addition. The observed interactions involving pH are attributed to the abilities of increased temperature and slow rate of addition to sufficiently enhance the solubility of dichloromethane in the aqueous phase, which at pH 9, but not pH 6, allows partial polymer precipitation before drug partitioning into the aqueous phase. The interaction between temperature and rate of addition is due to the relative lack of effect of increased temperature on drug entrapment following slow addition of the organic phase. Compared with the effects of pH on drug entrapment, the contributions of the other physical factors examined were limited.
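In a two-level factorial design, a factor's main effect is the mean response at its high level minus the mean at its low level. The sketch below computes main effects for a hypothetical 2x2 design; the run values are illustrative and do not reproduce the study's full design or data:

```python
def main_effects(results):
    """Main effects from a full two-level factorial design.
    results maps tuples of coded factor levels (-1/+1) to responses."""
    k = len(next(iter(results)))  # number of factors
    effects = []
    for f in range(k):
        hi = [y for levels, y in results.items() if levels[f] == +1]
        lo = [y for levels, y in results.items() if levels[f] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical 2^2 design: factors (pH, temperature) at coded levels,
# response = % w/w drug entrapment (illustrative numbers only).
runs = {(-1, -1): 1.4, (+1, -1): 9.0, (-1, +1): 3.0, (+1, +1): 16.6}
print(main_effects(runs))
```

Interactions would be computed the same way using products of coded levels, which is how the pH-by-temperature and pH-by-addition-rate interactions reported above are quantified.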