34 results for Shipping process with debit

in Aston University Research Archive


Relevance: 100.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance: 100.00%

Abstract:

Deep hole drilling is one of the most complicated metal cutting processes and one of the most difficult to perform on CNC machine tools or machining centres under conditions of limited manpower or unmanned operation. This research investigates aspects of the deep hole drilling process with small-diameter twist drills and presents a prototype system for real-time process monitoring and adaptive control. Two main research objectives are fulfilled. The first is the experimental investigation of the mechanics of the deep hole drilling process, using twist drills without internal coolant supply, in the range of diameters Ø2.4 to Ø4.5 mm and working lengths up to 40 diameters. This includes the definition of the problems associated with the low strength of these tools and the study of the mechanisms of catastrophic failure, which manifest themselves well before, and along with, the classic mechanism of tool wear. The relationships of drilling thrust and torque with the depth of penetration and the various machining conditions are also investigated, and the experimental evidence suggests that the process is inherently unstable at depths beyond a few diameters. The second objective is the design and implementation of a system for intelligent CNC deep hole drilling, the main task of which is to ensure the integrity of the process and the safety of the tool and the workpiece. This task is achieved by interfacing the CNC system of the machine tool to an external computer which performs the following functions: on-line monitoring of the drilling thrust and torque; adaptive control of feed rate, spindle speed and tool penetration (Z-axis); indirect monitoring of tool wear by pattern recognition of variations of the drilling thrust with cumulative cutting time and drilled depth; operation as a database for tools and workpieces; and, finally, the issuing of alarms and diagnostic messages.
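As a hedged illustration of the monitoring-and-adaptive-control task described above, the sketch below shows a supervisory rule of the kind such a system might apply: back the feed off as thrust approaches a safe limit, and retract before catastrophic failure. The thresholds, units and function names are invented for the example, not taken from the thesis.

```python
# Hypothetical supervisory rule for deep hole drilling (all limits assumed,
# not the thesis system): adapt feed rate from measured thrust and torque.

THRUST_LIMIT_N = 180.0     # assumed safe thrust for a small twist drill
TORQUE_LIMIT_NCM = 60.0    # assumed safe torque

def supervise(thrust_n, torque_ncm, feed_mm_min):
    """Return the adapted feed rate, or None to trigger a retract and alarm."""
    if thrust_n > THRUST_LIMIT_N or torque_ncm > TORQUE_LIMIT_NCM:
        return None                     # limit exceeded: retract + alarm
    if thrust_n > 0.8 * THRUST_LIMIT_N:
        return feed_mm_min * 0.5        # approaching the limit: halve the feed
    return feed_mm_min                  # nominal conditions: keep the feed
```

In a real implementation this rule would run inside the external computer's sampling loop, with the thrust signal also logged against cumulative cutting time for the wear-pattern recognition described above.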

Relevance: 100.00%

Abstract:

This technical report builds on previous reports to derive the likelihood and its derivatives for a Gaussian process with a covariance function based on the modified Bessel function. The full derivation is shown. The likelihood (with gradient information) can be used in maximum likelihood procedures (i.e. gradient-based optimisation) and in hybrid Monte Carlo sampling (i.e. within a Bayesian framework).
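The quantity being derived can be illustrated numerically. The sketch below computes the log marginal likelihood of a zero-mean Gaussian process using the ν = 3/2 member of the Matérn (modified Bessel) covariance family, which has a closed form; the data and hyper-parameter values are invented for the example, and the report's own covariance function and derivation may differ.

```python
import numpy as np

def matern32(x1, x2, lengthscale=1.0, variance=1.0):
    # Matern nu=3/2 covariance, a member of the modified-Bessel (Matern)
    # family that reduces to closed form for half-integer nu.
    r = np.abs(x1[:, None] - x2[None, :]) / lengthscale
    return variance * (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def gp_log_likelihood(x, y, lengthscale=1.0, variance=1.0, noise=1e-2):
    # Log marginal likelihood of a zero-mean GP:
    # log p(y) = -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2 pi)
    n = len(x)
    K = matern32(x, x, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2.0 * np.pi))

# Invented 1-D example data.
x = np.linspace(0.0, 1.0, 10)
y = np.sin(2.0 * np.pi * x)
ll = gp_log_likelihood(x, y, lengthscale=0.3)
```

The gradients derived in the report would be used to optimise `lengthscale`, `variance` and `noise` by maximising this quantity, or inside a hybrid Monte Carlo sampler.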

Relevance: 100.00%

Abstract:

Improving healthcare quality is a growing need of any society. Although various quality improvement projects are routinely deployed by healthcare professionals, they are characterised by a fragmented approach; that is, they are not linked with the strategic intent of the organisation. This study introduces a framework which integrates all quality improvement projects with the strategic intent of the organisation. It first derives the strengths, weaknesses, opportunities and threats (SWOT) matrix of the system with the involvement of the concerned stakeholders (clinical professionals), which helps identify a few projects whose implementation ensures achievement of the desired quality. The projects are then prioritised using the analytic hierarchy process, again with the involvement of the concerned stakeholders, and implemented in order to improve system performance. The effectiveness of the method has been demonstrated using a case study in the intensive care unit of Queen Elizabeth Hospital in Bridgetown, Barbados.
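The analytic hierarchy process step can be sketched as follows: given a pairwise-comparison matrix elicited from stakeholders, project priorities are the normalised principal eigenvector, with a consistency-ratio check. The matrix below is hypothetical, not taken from the case study.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three candidate quality
# improvement projects: entry [i, j] is how strongly project i is preferred
# to project j on Saaty's 1-9 scale.
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   2.0],
    [1/5.0, 1/2.0, 1.0],
])

# AHP priorities: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# with random index RI = 0.58 for n = 3; CR < 0.1 is conventionally acceptable.
lam_max = np.max(np.real(eigvals))
ci = (lam_max - 3.0) / (3.0 - 1.0)
cr = ci / 0.58
```

With these (assumed) judgements the first project receives the highest priority and would be implemented first.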

Relevance: 100.00%

Abstract:

The work describes the programme of activities relating to a mechanical study of the Conform extrusion process. The main objective was to provide a basic understanding of the mechanics of the Conform process, with particular emphasis placed on modelling using experimental and theoretical considerations. The experimental equipment used includes a state-of-the-art computer-aided data-logging system and high-temperature load cells (up to 260 °C) manufactured from tungsten carbide. Full details of the experimental equipment are presented in Sections 3 and 4. A theoretical model is given in Section 5. The model presented is based on the upper bound theorem, using a variation of the existing extrusion theories combined with temperature changes in the feed metal across the deformation zone. In addition, the constitutive equations used in the model have been generated from existing experimental data. Theoretical and experimental data are presented in tabular form in Section 6. The discussion of results includes a comprehensive graphical presentation of the experimental and theoretical data. The main findings are: (i) the establishment of stress/strain relationships and an energy balance in order to study the factors affecting redundant work, and hence a model suitable for design purposes; (ii) optimisation of the process, by determination of the extrusion pressure for the range of reductions and changes in the extrusion chamber geometry at lower wheel speeds; and (iii) an understanding of the control of the peak temperature reached during extrusion.

Relevance: 100.00%

Abstract:

This research was undertaken to develop a process for the direct solvent extraction of castor oil seeds. A literature survey confirmed the desirability of establishing such a process, with emphasis on the decortication, size reduction, detoxification-deallergenization, and solvent extraction operations. A novel process was developed for the dehulling of castor seeds, which consists of pressurizing the beans and then suddenly releasing the pressure to vacuum. The degree of dehulling varied according to the pressure applied and the size of the beans. Some of the batches were difficult to hull, and this phenomenon was investigated using the scanning electron microscope and by thickness and compressive strength measurements. The other variables studied to lesser degrees included residence time, moisture content, and temperature. The method was successfully extended to cocoa beans and, with modifications, to peanuts. The possibility of continuous operation was looked into, and a mechanism was suggested to explain how the method works. The work on toxins and allergens included an extensive literature survey on the properties of these substances and the methods developed for their deactivation. Part of the work involved setting up an assay method for measuring their concentration in the beans and cake, but technical difficulties prevented the completion of this aspect of the project. An appraisal of the existing deactivation methods was made in the course of searching for new ones. A new method of reducing the size of oilseeds was introduced in this research: it involved freezing the beans in cardice and milling them in a coffee grinder. The method was found to be quick, efficient, and reliable. An application of the freezing technique was successful in dehulling soybeans and de-skinning peanut kernels. The literature on the solvent extraction of oilseeds, especially castor, was reviewed; the survey covered processes, equipment, solvents, and the mechanism of leaching. Three solvents were experimentally investigated: cyclohexane, ethanol, and acetone. Extraction with liquid ammonia and liquid butane was not effective under the conditions studied. Based on the results of the research, a process has been suggested for the direct solvent extraction of castor seeds, the various sections of the process have been analysed, and the factors affecting the economics of the process are discussed.

Relevance: 100.00%

Abstract:

This research describes a computerized model of human classification which has been constructed to represent the process by which assessments are made for psychodynamic psychotherapy. The model assigns membership grades (MGs) to clients so that the most suitable ones have high values in the therapy category. Categories consist of a hierarchy of components, one of which, ego strength, is analysed in detail to demonstrate the way it has captured the psychotherapist's knowledge. The bottom of the hierarchy represents the measurable factors being assessed during an interview. A questionnaire was created to gather the identified information and was completed by the psychotherapist after each assessment. The results were fed into the computerized model, demonstrating a high correlation between the model MGs and the suitability ratings of the psychotherapist (r = .825 for 24 clients). The model has successfully identified the relevant data involved in assessment and simulated the decision-making process of the expert. Its cognitive validity enables decisions to be explained, which means that it has potential for therapist training and also for enhancing the referral process, with benefits in cost effectiveness as well as in the reduction of trauma to clients. An adapted version measuring client improvement would give quantitative evidence for the benefit of therapy, thereby supporting auditing and accountability. © 1997 The British Psychological Society.
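A minimal sketch of the hierarchical idea described above follows; the factor names, scores and weights are hypothetical, not the questionnaire items or aggregation rules of the actual model.

```python
# Hypothetical sketch: measurable factors at the bottom of the hierarchy are
# aggregated upward into component scores (e.g. ego strength) and finally
# into a membership grade (MG) in the "suitable for therapy" category.

def aggregate(scores, weights):
    """Weighted mean of [0, 1] scores -- one possible aggregation rule."""
    return sum(w * s for s, w in zip(scores, weights)) / sum(weights)

# Bottom level: invented factor scores elicited from an assessment interview.
ego_strength = aggregate([0.8, 0.6, 0.9], [2.0, 1.0, 1.0])
motivation = aggregate([0.7, 0.9], [1.0, 1.0])

# Top level: membership grade for the therapy category.
mg = aggregate([ego_strength, motivation], [3.0, 2.0])
```

Because every component score is explicit, a decision can be explained by tracing which factors pulled the membership grade up or down, which is the cognitive-validity property the abstract highlights.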

Relevance: 100.00%

Abstract:

Most machine-learning algorithms are designed for datasets with features of a single type, whereas very little attention has been given to datasets with mixed-type features. We recently proposed a model to handle mixed types with a probabilistic latent variable formalism. This model, called the generalised generative topographic mapping (GGTM), describes the data by type-specific distributions that are conditionally independent given the latent space. It has often been observed that visualisations of high-dimensional datasets can be poor in the presence of noisy features. In this paper we therefore propose to extend the GGTM to estimate feature saliency values (GGTMFS) as an integrated part of the parameter learning process, using an expectation-maximisation (EM) algorithm. The efficacy of the proposed GGTMFS model is demonstrated on both synthetic and real datasets.

Relevance: 100.00%

Abstract:

Atomisation of an aqueous solution for tablet film coating is a complex process with multiple factors determining droplet formation and properties. The importance of droplet size for an efficient process and a high-quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogeneous coatings whilst simultaneously avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film coat characteristics was investigated using X-ray microcomputed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality by design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for production of droplets at a small (20 μm) and large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction and the volume median diameter taken as a response. DOE yielded information on the relationship that three critical process parameters (pump rate, atomisation pressure and coating-polymer concentration) had with droplet size. The model generated was robust, scoring highly for model fit (R2 = 0.977), predictability (Q2 = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised bed coating of tablet cores was performed with either small or large droplets, followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence revealing that smaller droplets formed thinner, more uniform and less porous film coats.
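The DOE modelling step can be sketched as an ordinary least-squares fit of a quadratic response surface with an interaction term, scored by R² as in the reported model fit. The factor settings and droplet sizes below are synthetic, and the study's full design also varies coating-polymer concentration.

```python
import numpy as np

# Coded factor settings (-1..+1) for pump rate (x1) and atomisation
# pressure (x2) on a full 3x3 grid, with synthetic droplet sizes (um)
# generated from a known quadratic surface plus noise.
rng = np.random.default_rng(0)
x1 = np.repeat([-1.0, 0.0, 1.0], 3)
x2 = np.tile([-1.0, 0.0, 1.0], 3)
true = 45.0 + 10.0 * x1 - 8.0 * x2 + 4.0 * x1 * x2 + 3.0 * x1**2
y = true + rng.normal(0.0, 0.5, size=9)

# Quadratic model with interaction:
# y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2
X = np.column_stack([np.ones(9), x1, x2, x1 * x2, x1**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Model fit: R^2 = 1 - SS_res / SS_tot (the analogue of the reported 0.977).
resid = y - X @ beta
r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
```

The fitted coefficient on `x1 * x2` plays the role of the pump-rate/atomisation-pressure interaction the study reports; predictability (Q²) would additionally require cross-validation.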

Relevance: 100.00%

Abstract:

In many problems in spatial statistics it is necessary to infer a global problem solution by combining local models. A principled approach to this problem is to develop a global probabilistic model for the relationships between local variables and to use this as the prior in a Bayesian inference procedure. We show how a Gaussian process with hyper-parameters estimated from Numerical Weather Prediction Models yields meteorologically convincing wind fields. We use neural networks to make local estimates of wind vector probabilities. The resulting inference problem cannot be solved analytically, but Markov Chain Monte Carlo methods allow us to retrieve accurate wind fields.
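The inference pattern, a Gaussian prior combined with non-Gaussian local evidence and sampled by MCMC because the posterior has no analytic form, can be sketched in one dimension; the densities below are invented stand-ins, not the paper's wind-field model.

```python
import numpy as np

# Minimal Metropolis sketch (all densities hypothetical): a Gaussian prior
# stands in for the GP prior, and a bimodal density stands in for the
# neural-network-derived local evidence on a wind vector component.
rng = np.random.default_rng(1)

def log_post(v):
    log_prior = -0.5 * v**2                          # zero-mean Gaussian prior
    log_lik = np.log(0.5 * np.exp(-0.5 * (v - 1.5)**2) +
                     0.5 * np.exp(-0.5 * (v + 1.5)**2))  # bimodal local evidence
    return log_prior + log_lik

v, samples = 0.0, []
for _ in range(5000):
    prop = v + rng.normal(0.0, 1.0)                  # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(v):
        v = prop                                     # accept
    samples.append(v)
samples = np.asarray(samples)
```

The chain visits both posterior modes, which is exactly the behaviour that makes a simple point estimate inadequate and motivates sampling in the paper's setting.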

Relevance: 100.00%

Abstract:

Most traditional methods for extracting the relationships between two time series are based on cross-correlation. In a non-linear non-stationary environment, these techniques are not sufficient. We show in this paper how to use hidden Markov models (HMMs) to identify the lag (or delay) between different variables for such data. We first present a method using maximum likelihood estimation and propose a simple algorithm which is capable of identifying associations between variables. We also adopt an information-theoretic approach and develop a novel procedure for training HMMs to maximise the mutual information between delayed time series. Both methods are successfully applied to real data. We model the oil drilling process with HMMs and estimate a crucial parameter, namely the lag for return.
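The information-theoretic idea can be illustrated without a full HMM: scan candidate delays and score each by the estimated mutual information between the aligned series, picking the delay that maximises it. The data below are synthetic with a known lag of 5 samples; the paper's HMM-based procedure handles the non-linear, non-stationary case this simple scan cannot.

```python
import numpy as np

# Synthetic pair of series: y lags x by exactly 5 samples, plus noise.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.roll(x, 5) + 0.1 * rng.normal(size=2000)

def mutual_information(a, b, bins=16):
    # Histogram estimate of I(A; B) = sum p(a,b) log[p(a,b) / (p(a)p(b))].
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

# Score each candidate lag by MI between x and the back-shifted y.
lags = range(0, 11)
scores = [mutual_information(x[:-10], np.roll(y, -k)[:-10]) for k in lags]
best_lag = int(np.argmax(scores))  # recovers the true lag of 5
```

Maximising mutual information rather than cross-correlation is what lets the approach detect non-linear dependencies between the delayed series.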
