985 results for control engineering computing


Relevance: 80.00%

Abstract:

Basic research related to heavy-ion cancer therapy has been carried out at the Institute of Modern Physics (IMP), Chinese Academy of Sciences, since 1995, and a clinical trial programme with heavy ions has now been launched at IMP. First, treatment of superficially placed tumors with heavy ions is expected to begin at the therapy terminal of the Heavy Ion Research Facility in Lanzhou (HIRFL), where carbon-ion beams with energies up to 100 MeV/u can be supplied. The shallow-seated tumor therapy terminal at HIRFL is equipped with a passive beam delivery system comprising two orthogonal dipole magnets, which continuously scan pencil beams laterally to generate a broad, uniform irradiation field, a motor-driven energy degrader, and a multi-leaf collimator. Two types of range modulator, a ripple filter and a ridge filter, with which Gaussian-shaped physical-dose and uniform biological-effective-dose Bragg peaks, respectively, can be shaped for therapeutic ion beams, have been designed and manufactured. Two-dimensional and three-dimensional conformal irradiation of tumors can therefore be performed with the passive beam delivery system at this therapy terminal. Both conformal irradiation methods have been verified experimentally, and carbon-ion conformal irradiation of patients with superficially placed tumors has been carried out at HIRFL since November 2006.
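
As a rough, purely illustrative sketch of how a ridge filter spreads the Bragg peak, the toy Python example below superposes range-shifted pristine peaks and solves for non-negative weights that flatten the summed dose over a target depth interval. The pristine-peak model and all numerical values are invented for illustration and are not taken from the HIRFL design.

```python
# Toy sketch: weight range-shifted pristine Bragg peaks so their sum approximates
# a flat spread-out Bragg peak (SOBP). The peak model below is a crude stand-in,
# not a real carbon-ion depth-dose curve.
import numpy as np
from scipy.optimize import nnls

depth = np.linspace(0.0, 300.0, 600)             # depth in water [mm], arbitrary

def pristine_peak(depth, r):
    """Crude pristine peak: low entrance plateau plus a sharp peak at range r [mm]."""
    plateau = 0.3 * (depth < r)
    peak = np.exp(-0.5 * ((depth - r) / 8.0) ** 2)
    return plateau + peak

ranges = np.linspace(180.0, 260.0, 17)           # ranges produced by the modulator steps
A = np.column_stack([pristine_peak(depth, r) for r in ranges])

target = np.ones_like(depth)                     # flat unit dose over the SOBP region
region = (depth >= 180.0) & (depth <= 260.0)

weights, _ = nnls(A[region], target[region])     # non-negative least-squares weights
sobp = A @ weights
ripple = sobp[region].max() - sobp[region].min()
print(f"peak-to-valley ripple across the SOBP region: {ripple:.3f}")
```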

Relevance: 80.00%

Abstract:

This paper presents a statistically based fault diagnosis scheme for application to internal combustion engines. The scheme relies on an identified model that describes the relationships between a set of recorded engine variables using principal component analysis (PCA). Since combustion cycles are complex in nature and produce nonlinear relationships between the recorded engine variables, the paper proposes the use of nonlinear PCA (NLPCA). The paper further justifies the use of NLPCA by comparing the accuracy of the NLPCA model with that of a linear PCA model. A new nonlinear variable reconstruction algorithm and bivariate scatter plots are proposed for fault isolation following the application of NLPCA. The proposed technique allows the diagnosis of different fault types under steady-state operating conditions. More precisely, nonlinear variable reconstruction can remove the fault signature from the recorded engine data, which allows the identification and isolation of the root cause of abnormal engine behaviour. The paper shows that this can lead to (i) an enhanced identification of potential root causes of abnormal events and (ii) the masking of faulty sensor readings. The effectiveness of the enhanced NLPCA-based monitoring scheme is illustrated by its application to a sensor fault and a process fault. The sensor fault relates to a drift in the fuel flow reading, whilst the process fault relates to a partial blockage of the intercooler. These faults are introduced on a Volkswagen TDI 1.9-litre diesel engine mounted on an experimental engine test bench facility.
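
As a minimal sketch of the general idea behind detection and reconstruction-based isolation (using ordinary linear PCA in place of the paper's NLPCA, and simulated rather than engine data), the hypothetical Python example below fits a PCA model to normal data, detects a simulated sensor drift through the squared prediction error, and isolates it by checking which variable's reconstruction removes the fault signature:

```python
# Minimal sketch of PCA-based fault detection and reconstruction-based isolation.
# Ordinary linear PCA stands in for the paper's NLPCA, and the data are simulated
# correlated channels, not recorded engine variables.
import numpy as np

rng = np.random.default_rng(0)

# "Normal" operating data: five correlated pseudo-sensor channels
scores = rng.normal(size=(500, 2))
loadings_true = rng.normal(size=(2, 5))
data = scores @ loadings_true + 0.05 * rng.normal(size=(500, 5))
mean, std = data.mean(0), data.std(0)
X = (data - mean) / std

# Fit a PCA model retaining two principal components
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:2].T                                     # loading matrix, shape (5, 2)

def spe(x):
    """Squared prediction error (Q statistic) of a scaled sample against the model."""
    r = x - P @ (P.T @ x)
    return float(r @ r)

def spe_after_reconstruction(x, i, iters=50):
    """SPE after iteratively reconstructing variable i from the model; a large drop
    relative to spe(x) points to variable i as the source of the fault."""
    x = x.copy()
    for _ in range(iters):
        x[i] = (P @ (P.T @ x))[i]
    return spe(x)

# Simulated sensor drift: bias added to channel 3 of an otherwise normal sample
sample = data[0].copy()
sample[3] += 4.0 * std[3]
x = (sample - mean) / std

print("SPE of the faulty sample:", round(spe(x), 2))
for i in range(5):
    print(f"SPE after reconstructing variable {i}:",
          round(spe_after_reconstruction(x, i), 2))
```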

Relevance: 80.00%

Abstract:

In this paper, NOx emissions modelling for real-time operation and control of a 200 MWe coal-fired power generation plant is studied. Three model types are compared. For the first model, the fundamentals governing the NOx formation mechanisms are combined with a system identification technique to develop a grey-box model. A linear AutoRegressive with eXogenous inputs (ARX) model and a nonlinear ARX model (NARX) are then built. Plant operation data are used for modelling and validation. Model cross-validation tests show that the developed grey-box model consistently produces better overall long-term prediction performance than the other two models.
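
For illustration, a minimal sketch of the black-box ARX part of such a comparison is given below: a single-input ARX model is fitted by ordinary least squares to simulated data. The model orders, signals, and numbers are assumptions made for the sketch, not the 200 MWe plant data or the grey-box structure used in the paper.

```python
# Minimal ARX identification sketch: y(k) = a1*y(k-1) + a2*y(k-2) + b1*u(k-1) + e(k),
# fitted by ordinary least squares on simulated data (not the plant data).
import numpy as np

rng = np.random.default_rng(1)
N = 2000
u = rng.normal(size=N)                      # exogenous input, e.g. a normalised load signal
y = np.zeros(N)
for k in range(2, N):                       # "true" system used only to generate data
    y[k] = 0.7 * y[k - 1] - 0.2 * y[k - 2] + 0.5 * u[k - 1] + 0.02 * rng.normal()

# Build the regression matrix from lagged outputs and inputs
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
target = y[2:]
theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
print("estimated [a1, a2, b1]:", np.round(theta, 3))

# One-step-ahead prediction error on the identification data
rmse = np.sqrt(np.mean((target - Phi @ theta) ** 2))
print("one-step RMSE:", round(rmse, 4))
```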

Relevance: 80.00%

Abstract:

Subspace monitoring has recently been proposed as a condition monitoring tool that requires considerably fewer variables to be analysed than dynamic principal component analysis (PCA). This paper analyses subspace monitoring in identifying and isolating fault conditions and reveals that the existing work suffers from inherent limitations when complex fault scenarios arise. Based on the assumption that the fault signature is deterministic while the monitored variables are stochastic, the paper introduces a regression-based reconstruction technique to overcome these limitations. The utility of the proposed fault identification and isolation method is shown using a simulation example and the analysis of experimental data from an industrial reactive distillation unit.
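
The sketch below illustrates, under simplifying assumptions, the flavour of regression-based reconstruction for fault isolation: the deterministic fault magnitude along each candidate direction is estimated by least squares in the residual subspace of a PCA model (standing in here for the subspace model), and the direction yielding the smallest reconstructed index is taken as the isolated fault. Data, dimensions, and the fault itself are simulated placeholders.

```python
# Sketch of reconstruction-based fault isolation by regression: for each candidate
# fault direction, the fault magnitude is estimated by least squares in the residual
# subspace, and the direction giving the smallest reconstructed SPE is selected.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2)) @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(300, 6))
X = (X - X.mean(0)) / X.std(0)

_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:2].T
C_res = np.eye(6) - P @ P.T                    # projection onto the residual subspace

def reconstructed_spe(x, xi):
    """SPE after removing the best least-squares fit along fault direction(s) xi."""
    f_hat = np.linalg.lstsq(C_res @ xi, C_res @ x, rcond=None)[0]
    r = C_res @ (x - xi @ f_hat)
    return float(r @ r)

# Faulty sample: deterministic step added to variable 4 of a normal sample
x = X[0] + 5.0 * np.eye(6)[4]

candidates = [np.eye(6)[:, [i]] for i in range(6)]   # single-sensor fault directions
scores = [reconstructed_spe(x, xi) for xi in candidates]
print("isolated fault direction:", int(np.argmin(scores)))
```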

Relevance: 80.00%

Abstract:

In polymer extrusion, delivery of a melt that is homogeneous in composition and temperature is important for good product quality. However, the process is inherently prone to temperature fluctuations, which are difficult to monitor and control with conventional single-point thermocouples. In this work, the die melt temperature profile was monitored by a thermocouple mesh, and the data obtained were used to generate a model to predict the die melt temperature profile. A novel nonlinear model was then proposed and shown to be in good agreement with both training and unseen data. Furthermore, the proposed model was used to select optimum process settings to achieve the desired average melt temperature across the die while improving temperature homogeneity. The simulation results indicate a reduction in melt temperature variation of up to 60%.
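
As a hypothetical illustration of the final optimisation step, the sketch below selects process settings that hit a target average melt temperature while penalising the spread across the die. The function predict_profile(), the settings, and all numbers are invented placeholders, not the thermocouple-mesh model identified in the paper.

```python
# Hypothetical sketch: choose extruder settings (screw speed, barrel set temperature)
# that reach a target mean die melt temperature while minimising the temperature
# spread across the die. predict_profile() is a stand-in for a fitted process model.
import numpy as np
from scipy.optimize import minimize

positions = np.linspace(-1.0, 1.0, 11)         # normalised positions across the die

def predict_profile(screw_speed, barrel_temp):
    """Placeholder model: mean temperature rises with both settings, and higher
    screw speed exaggerates a centre-to-wall non-uniformity."""
    base = barrel_temp + 0.15 * screw_speed
    nonuniform = 0.04 * screw_speed * positions ** 2
    return base + nonuniform

target_mean = 210.0                            # desired average melt temperature [deg C]

def cost(settings):
    screw_speed, barrel_temp = settings
    profile = predict_profile(screw_speed, barrel_temp)
    return (profile.mean() - target_mean) ** 2 + 10.0 * profile.var()

res = minimize(cost, x0=[60.0, 200.0], bounds=[(20.0, 120.0), (180.0, 230.0)])
print("suggested settings (screw speed rpm, barrel temp degC):", np.round(res.x, 1))
```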

Relevance: 80.00%

Abstract:

This paper describes the deployment on GPUs of PROP, a program of the 2DRMP suite which models electron collisions with H-like atoms and ions. Because performance on GPUs is better in single precision than in double precision, the numerical stability of the PROP program in single precision has been studied. The numerical quality of PROP results computed in single precision, and their impact on the next program of the 2DRMP suite, has been analyzed. Successive versions of the PROP program on GPUs have been developed in order to improve its performance, with particular attention paid to the optimization of data transfers and of linear algebra operations. The performance obtained on several architectures (including NVIDIA Fermi) is presented.
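
As a generic illustration of the kind of single- versus double-precision check described (not the actual PROP/2DRMP kernels), the short sketch below evaluates the same matrix product in float32 and float64 and reports the relative difference; the same comparison could be run on a GPU with a library such as CuPy.

```python
# Generic single- vs double-precision check: evaluate the same linear-algebra
# operation in float32 and float64 and compare the results.
import numpy as np

rng = np.random.default_rng(3)
n = 512
A64 = rng.normal(size=(n, n)).astype(np.float64)
B64 = rng.normal(size=(n, n)).astype(np.float64)

C64 = A64 @ B64                                # double-precision reference
C32 = (A64.astype(np.float32) @ B64.astype(np.float32)).astype(np.float64)

rel_err = np.linalg.norm(C64 - C32) / np.linalg.norm(C64)
print(f"relative error of the single-precision product: {rel_err:.2e}")
```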

Relevance: 80.00%

Abstract:

Future digital signal processing (DSP) systems must provide robustness at the algorithm and application level to the reliability issues that accompany implementations in modern semiconductor process technologies. In this paper, we address this issue by investigating the impact of unreliable memories on general DSP systems. In particular, we propose a novel framework to characterize the effects of unreliable memories, which enables us to devise novel methods to mitigate the associated performance loss. We propose to deploy specifically designed data representations, which can substantially improve system reliability compared with conventional data representations used in digital integrated circuits, such as 2's-complement or sign-magnitude number formats. To demonstrate the efficacy of the proposed framework, we analyze the impact of unreliable memories on coded communication systems and show that the deployment of optimized data representations substantially improves the error-rate performance of such systems.
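
As a toy illustration of why the stored data representation matters under memory faults (an invented experiment, not the paper's framework), the sketch below flips one random bit in 8-bit words stored in two's-complement and in sign-magnitude form and compares the average magnitude of the resulting value error.

```python
# Toy experiment: flip one random bit in 8-bit words stored in two's-complement
# and in sign-magnitude form, and compare the average magnitude of the value error.
import random

random.seed(0)
BITS = 8

def flip_bit(word, pos):
    return word ^ (1 << pos)

def to_twos_complement(value):
    return value & ((1 << BITS) - 1)

def from_twos_complement(word):
    return word - (1 << BITS) if word & (1 << (BITS - 1)) else word

def to_sign_magnitude(value):
    sign = (1 << (BITS - 1)) if value < 0 else 0
    return sign | (abs(value) & ((1 << (BITS - 1)) - 1))

def from_sign_magnitude(word):
    magnitude = word & ((1 << (BITS - 1)) - 1)
    return -magnitude if word & (1 << (BITS - 1)) else magnitude

errors_tc, errors_sm = [], []
for _ in range(10000):
    value = random.randint(-100, 100)          # representable in both 8-bit formats
    pos = random.randrange(BITS)               # position of the flipped bit
    errors_tc.append(abs(from_twos_complement(flip_bit(to_twos_complement(value), pos)) - value))
    errors_sm.append(abs(from_sign_magnitude(flip_bit(to_sign_magnitude(value), pos)) - value))

print("mean |error| after one bit flip, two's complement:", sum(errors_tc) / len(errors_tc))
print("mean |error| after one bit flip, sign-magnitude:  ", sum(errors_sm) / len(errors_sm))
```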