938 results for Optimistic data replication system


Relevance:

100.00%

Publisher:

Abstract:

We describe a method of recognizing handwritten digits by fitting generative models that are built from deformable B-splines with Gaussian "ink generators" spaced along the length of the spline. The splines are adjusted using a novel elastic matching procedure based on the Expectation Maximization (EM) algorithm, which maximizes the likelihood of the model generating the data. This approach has many advantages. (1) After identifying the model most likely to have generated the data, the system produces not only a classification of the digit but also a rich description of the instantiation parameters, which can yield information such as the writing style. (2) During the process of explaining the image, generative models can perform recognition-driven segmentation. (3) The method involves a relatively small number of parameters, and hence training is relatively easy and fast. (4) Unlike many other recognition schemes, it does not rely on some form of pre-normalization of input images, but can handle arbitrary scalings, translations and a limited degree of image rotation. We have demonstrated that our method of fitting models to images does not get trapped in poor local minima. The main disadvantage of the method is that it requires much more computation than more standard OCR techniques.
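The elastic matching step can be sketched in a few lines. The following is a heavily simplified illustration, not the authors' implementation: it uses a single quadratic Bézier segment in place of deformable B-splines, isotropic Gaussians with a shared variance, and "ink generator" beads at fixed parameter values; all data and settings are invented. It preserves the key property that the bead means are linear in the control points, so the M-step reduces to weighted least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

def bez(tt):
    # Quadratic Bezier basis: [(1-t)^2, 2t(1-t), t^2]
    return np.stack([(1 - tt) ** 2, 2 * tt * (1 - tt), tt ** 2], axis=1)

# Toy "ink": points scattered around a curved stroke (invented data).
true_ctrl = np.array([[0.0, 0.0], [0.5, 1.0], [1.0, 0.0]])
t = rng.uniform(0.0, 1.0, 300)
ink = bez(t) @ true_ctrl + 0.03 * rng.standard_normal((300, 2))

K = 8                                  # Gaussian "ink generators" (beads)
B = bez(np.linspace(0.0, 1.0, K))      # (K, 3): bead means = B @ ctrl

ctrl = np.array([[0.0, 0.5], [0.5, 0.5], [1.0, 0.5]])  # deliberately poor start
sigma2 = 0.05

for _ in range(50):
    mu = B @ ctrl                                           # (K, 2) bead centres
    d2 = ((ink[:, None, :] - mu[None, :, :]) ** 2).sum(-1)  # (N, K)
    # E-step: posterior responsibility of each bead for each ink point.
    logp = -d2 / (2 * sigma2)
    logp -= logp.max(axis=1, keepdims=True)
    r = np.exp(logp)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: the bead means are linear in the control points, so the
    # control-point update is a weighted least-squares solve.
    W = r.sum(axis=0)
    ctrl = np.linalg.solve((B * W[:, None]).T @ B, B.T @ (r.T @ ink))
    sigma2 = (r * d2).sum() / (2 * r.sum())
```

After convergence, `ctrl` describes the instantiation of the stroke and `sigma2` the residual "ink spread", loosely mirroring the rich description of instantiation parameters mentioned above.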

Relevance:

100.00%

Publisher:

Abstract:

The work describes the programme of activities relating to a mechanical study of the Conform extrusion process. The main objective was to provide a basic understanding of the mechanics of the Conform process, with particular emphasis placed on modelling using experimental and theoretical considerations. The experimental equipment used includes a state-of-the-art computer-aided data-logging system and high-temperature loadcells (up to 260 °C) manufactured from tungsten carbide. Full details of the experimental equipment are presented in Sections 3 and 4. A theoretical model is given in Section 5. The model presented is based on the upper bound theorem, using a variation of the existing extrusion theories combined with temperature changes in the feed metal across the deformation zone. In addition, constitutive equations used in the model have been generated from existing experimental data. Theoretical and experimental data are presented in tabular form in Section 6. The discussion of results includes a comprehensive graphical presentation of the experimental and theoretical data. The main findings are: (i) the establishment of stress/strain relationships and an energy balance in order to study the factors affecting redundant work, and hence a model suitable for design purposes; (ii) optimisation of the process, by determination of the extrusion pressure for the range of reductions and changes in the extrusion chamber geometry at lower wheel speeds; and (iii) an understanding of the control of the peak temperature reached during extrusion.

Relevance:

100.00%

Publisher:

Abstract:

It is known that distillation tray efficiency depends on the liquid flow pattern, particularly for large diameter trays. Scale-up failures due to liquid channelling have occurred, and it is known that fitting flow control devices to trays sometimes improves tray efficiency. Several theoretical models which explain these observations have been published. Further progress in understanding is at present blocked by lack of experimental measurements of the pattern of liquid concentration over the tray. Flow pattern effects are expected to be significant only on commercial-size trays of large diameter, and the lack of data is a result of the costs, risks and difficulty of making these measurements on full-scale production columns. This work presents a new experiment which simulates distillation by water cooling, and provides a means of testing commercial-size trays in the laboratory. Hot water is fed on to the tray and cooled by air forced through the perforations. The analogy between heat and mass transfer shows that the water temperature at any point is analogous to liquid concentration and the enthalpy of the air is analogous to vapour concentration. The effect of the liquid flow pattern on mass transfer is revealed by the temperature field on the tray. The experiment was implemented and evaluated in a column of 1.2 m diameter. The water temperatures were measured by thermocouples interfaced to an electronic computerised data logging system. The "best surface" through the experimental temperature measurements was obtained by the mathematical technique of B-splines, and presented in terms of lines of constant temperature. The results revealed that in general liquid channelling is more important in the bubbly "mixed" regime than in the spray regime. However, it was observed that severe channelling also occurred for intense spray at incipient flood conditions. This is an unexpected result.
A computer program was written to calculate point efficiency as well as tray efficiency, and the results were compared with distillation efficiencies for similar loadings. The theoretical model of Porter and Lockett for predicting distillation was modified to predict water cooling, and the theoretical predictions were shown to be similar to the experimental temperature profiles. A comparison of the repeatability of the experiments with an error analysis revealed that accurate tray efficiency measurements require temperature measurements to better than ±0.1 °C, which is achievable with conventional techniques. This was not achieved in this work, and resulted in considerable scatter in the efficiency results. Nevertheless it is concluded that the new experiment is a valuable tool for investigating the effect of the liquid flow pattern on tray mass transfer.
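As a rough illustration of the two calculations described (a smooth surface through scattered temperature readings, and an efficiency from the heat/mass-transfer analogy), the sketch below fits a least-squares polynomial surface, standing in for the B-spline surface of the thesis, to synthetic thermocouple data, and evaluates a Murphree-style efficiency. All numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic noisy water temperatures (deg C) at thermocouple positions:
# x along the flow path, y across the tray.
x = rng.uniform(0.0, 1.0, 120)
y = rng.uniform(-0.5, 0.5, 120)
temps = 60.0 - 15.0 * x + 4.0 * x * y ** 2 + 0.05 * rng.standard_normal(120)

# Least-squares tensor-product quadratic surface through the readings
# (a polynomial stand-in for the bicubic B-spline "best surface").
A = np.stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2,
              x ** 2 * y, x * y ** 2, (x * y) ** 2], axis=1)
coef, *_ = np.linalg.lstsq(A, temps, rcond=None)

def t_surface(xq, yq):
    # Evaluate the fitted surface at a query point.
    return np.array([1.0, xq, yq, xq * yq, xq ** 2, yq ** 2,
                     xq ** 2 * yq, xq * yq ** 2, (xq * yq) ** 2]) @ coef

# Heat/mass-transfer analogy: a Murphree-style efficiency from inlet and
# outlet water temperatures and the adiabatic-saturation (wet-bulb)
# temperature of the air, which plays the role of equilibrium composition.
def tray_efficiency(t_in, t_out, t_wb):
    return (t_in - t_out) / (t_in - t_wb)

eff = tray_efficiency(t_surface(0.0, 0.0), t_surface(1.0, 0.0), t_wb=25.0)
```

Lines of constant temperature would then be contours of `t_surface` over the tray area.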

Relevance:

100.00%

Publisher:

Abstract:

Serial and parallel interconnection of photonic devices is integral to the construction of any all-optical data processing system. This thesis presents results from a series of experiments centering on the use of the nonlinear-optical loop mirror (NOLM) switch in architectures for the manipulation and generation of ultrashort pulses. Detailed analysis of soliton switching in a single NOLM and in a cascade of two NOLMs is performed, centering on the primary limitations to device operation, the effect of cascading on the amplitude response, and the impact of switching on the characteristics of incident pulses. By using relatively long input pulses, device failure due to stimulated Raman generation is postponed to demonstrate multiple-peaked switching for the first time. It is found that while cascading leads to a sharpening of the overall switching characteristic, pulse spectral and temporal integrity is not significantly degraded, and emerging pulses retain their essential soliton character. In addition, by including an asymmetrically placed in-fibre Bragg reflector as a wavelength-selective loss element in the basic NOLM configuration, both soliton self-switching and dual-wavelength control-pulse switching are spectrally quantised. Results are presented from a novel dual-wavelength laser configuration generating pulse trains with an ultra-low rms inter-pulse-stream timing jitter of 630 fs, enabling application in ultrafast switching environments at data rates as high as 130 Gbit/s. In addition, the fibre NOLM is included in architectures for all-optical memory, demonstrating storage and logical inversion of a 0.5 kByte random data sequence, and ultrafast phase-locking of a gain-switched distributed feedback laser at 1.062 GHz, the fourteenth harmonic of the system baseband frequency.
The stringent requirements for environmental robustness of these architectures highlight the primary weaknesses of the NOLM in its fibre form, and recommendations to overcome its inherent drawbacks are presented.

Relevance:

100.00%

Publisher:

Abstract:

A detailed investigation has been undertaken into the field induced electron emission (FIEE) mechanism that occurs at microscopically localised 'sites' on uncoated and dielectric-coated metallic electrodes. These processes have been investigated using two dedicated experimental systems that were developed for this study. The first is a novel combined photo/field emission microscope, which employs a UV source to stimulate photo-electrons from the sample surface in order to generate a topographical image. This system utilises an electrostatic lens column to provide identical optical properties under the different operating conditions required for purely topographical and combined photo/field imaging. The system has been demonstrated to have a resolution approaching 1 µm. Emission images have been obtained from carbon emission sites using this system, revealing that emission may occur from the edge triple junction or from the bulk of the carbon particle. An existing UHV electron spectrometer has been extensively rebuilt to incorporate a computer control and data acquisition system, improved sample handling and manipulation, and a specimen heating stage. Details are given of a comprehensive study into the effects of sample heating on the emission process under conditions of both bulk and transient heating. Similar studies were also performed under conditions of both zero and high applied field. These show that the properties of emission sites are strongly temperature and field dependent, thus indicating that the emission process is 'non-metallic' in nature. The results have been shown to be consistent with an existing hot-electron emission model.

Relevance:

100.00%

Publisher:

Abstract:

Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. This process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was studied first experimentally under carefully selected operating conditions, which resemble the ranges employed practically under stable and efficient conditions. Data were collected at steady state conditions using adequate sampling techniques for the dispersed and continuous phases, as well as during the transients of the column, with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by addition of stages at the column entrances. Two parameters were incorporated in the model, namely a mass transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimizing the differences between the experimental and the model-predicted concentration profiles at steady state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of operating parameters and were incorporated in the model equations. The model equations comprise a stiff differential-algebraic system. This system was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those experimentally measured.
A very good agreement of the two profiles was achieved within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables; raffinate concentration and extract concentration as controlled variables; and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) system as well as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). The control performance of each scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection. For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loop pairings rotor speed-raffinate concentration and solvent flowrate-extract concentration showed weak interaction.
Multivariable MPC showed more effective performance than the conventional techniques, since it accounts for loop interactions, time delays, and input and output variable constraints.
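The RGA pairing analysis mentioned above is compact enough to illustrate directly. The gain matrix below is invented for illustration; the calculation itself, the elementwise product of the gain matrix with the transpose of its inverse, is the standard one.

```python
import numpy as np

# Illustrative steady-state gain matrix K: rows = outputs (raffinate conc.,
# extract conc.), columns = inputs (rotor speed, solvent flowrate).
K = np.array([[2.0, -0.4],
              [0.3,  1.5]])

# Relative Gain Array: RGA = K * (K^{-1})^T (elementwise product).
RGA = K * np.linalg.inv(K).T
```

Diagonal RGA elements close to 1, as here, indicate weak interaction and support pairing each input with the output on the diagonal, consistent with the pairings reported above.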

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to design, construct, commission and operate a laboratory-scale gasifier system that could be used to investigate the parameters that influence the gasification process. The gasifier is of the open-core variety and is fabricated from 7.5 cm bore quartz glass tubing. Gas cleaning is by a centrifugal contacting scrubber, with the product gas being flared. The system employs an on-line dedicated gas analysis system, monitoring the levels of H2, CO, CO2 and CH4 in the product gas. The gas composition data, as well as the gas flowrate, temperatures throughout the system and pressure data, are recorded using a BBC microcomputer-based data-logging system. Ten runs have been performed using the system, of which six were predominantly commissioning runs. The main emphasis in the commissioning runs was placed on the gas clean-up, the product gas cleaning and the reactor bed temperature measurement. The reaction was observed to occur in a narrow band, about 3 to 5 particle diameters thick. Initially the fuel was pyrolysed, with the volatiles produced being combusted and providing the energy to drive the process, and then the char product was gasified by reaction with the pyrolysis gases. Normally, the gasifier is operated with the reaction zone supported on a bed of char, although it has been operated for short periods without a char bed. At steady state the depth of char remains constant, but by adjusting the air inlet rate it has been shown that the depth of char can be increased or decreased. It has been shown that increasing the depth of the char bed effects some improvement in the product gas quality.
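A quick arithmetic illustration of "product gas quality": the lower heating value of the gas follows directly from the measured composition. The composition below is invented, and the per-gas heating values are typical literature figures, not data from this study.

```python
# Lower heating values of the combustible components, MJ per normal m^3
# (typical literature figures, not measurements from this work).
LHV = {"H2": 10.8, "CO": 12.6, "CH4": 35.8, "CO2": 0.0}

# Invented product-gas composition (volume fractions; balance is N2).
composition = {"H2": 0.12, "CO": 0.18, "CH4": 0.02, "CO2": 0.12}

# Gas heating value = sum of component LHVs weighted by volume fraction.
lhv_gas = sum(LHV[g] * x for g, x in composition.items())
```

A value of roughly 4-6 MJ/Nm^3 is typical of air-blown gasifier product gas, so a rise in `lhv_gas` with deeper char beds is the kind of improvement the abstract reports.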

Relevance:

100.00%

Publisher:

Abstract:

Sensitive and precise radioimmunoassays for insulin and glucagon have been established. Although it was possible to employ similar precepts to the development of both hormone assays, the establishment of a reliable glucagon radioimmunoassay was complicated by the poor immunogenicity and instability of the peptide. Thus, unlike insulin antisera which were prepared by monthly injection of guinea pigs with crystalline insulin emulsified in adjuvant, the successful production of glucagon antisera was accomplished by immunisation of rabbits and guinea pigs with glucagon covalently linked to bovine plasma albumin. The conventional chloramine-T iodination with purification by gel chromatography was only suitable for the production of labelled insulin. Quality tracer for use in the glucagon radioimmunoassay was prepared by trace iodination, with subsequent purification of monoiodinated glucagon by anion exchange chromatography. Separation of free and antibody bound moieties by coated charcoal was applicable to both hormone assays, and a computerised data processing system, relying on logit-log transformation, was used to analyse all assay results. The assays were employed to evaluate the regulation of endocrine pancreatic function and the role of insulin and glucagon in the pathogenesis of the obese hyperglycaemic syndrome in mice. In the homozygous (ob/ob) condition, mice of the Birmingham strain were characterised by numerous abnormalities of glucose homeostasis, several of which were detected in heterozygous (ob/+) mice. Obese mice exhibited pancreatic alpha cell dysfunction and hyperglucagonaemia. Investigation of this defect revealed a marked insensitivity of an insulin dependent glucose sensing mechanism that inhibited glucagon secretion. 
Although circulating glucagon was of minor importance in the maintenance of hyperinsulinaemia, lack of suppression of alpha cell function by glucose and insulin contributed significantly to both the insulin insensitivity and the hyperglycaemia of obese mice.
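The logit-log data processing mentioned above can be sketched as a standard-curve fit. The dose-response numbers below are invented; the transformation (a linear fit of logit(B/B0) against log10(dose), inverted to read unknown doses off the curve) is the standard one.

```python
import math

# Invented standard curve: fraction of tracer bound (B/B0) at known doses
# (e.g. pg/ml) of unlabelled hormone.
doses = [10.0, 30.0, 100.0, 300.0, 1000.0]
b_over_b0 = [0.85, 0.70, 0.50, 0.30, 0.15]

def logit(p):
    return math.log(p / (1.0 - p))

# Linear least squares of logit(B/B0) against log10(dose).
xs = [math.log10(d) for d in doses]
ys = [logit(p) for p in b_over_b0]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar

def dose_from_binding(p):
    """Invert the fitted line to read an unknown sample's dose."""
    return 10.0 ** ((logit(p) - intercept) / slope)
```

The slope is negative because binding falls as unlabelled hormone displaces tracer; 50% binding reads back close to the mid-curve dose.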

Relevance:

100.00%

Publisher:

Abstract:

A study of conveying practice demonstrates that belt conveyors provide a versatile and much-used method of transporting bulk materials, but a review of belting manufacturers' design procedures shows that belt design and selection rules are often based on experience with all-cotton belts no longer in common use, and are not completely relevant to modern synthetic constructions. In particular, provision of the property "load support", which was not critical with cotton belts, is shown to determine the outcome of most belt selection exercises and lead to gross over-specification of other design properties in many cases. The results of an original experimental investigation into this property, carried out to determine the belt and conveyor parameters that affect it, show the major role that belt stiffness plays in its provision; the basis for a belt stiffness test relevant to service conditions is given. A proposal for a more rational method of specifying load support data results from the work, but correlation of the test results with service performance is necessary before the absolute load support capability required from a belt for given working conditions can be quantified. A study to attain this correlation is the major proposal for future work resulting from the present investigation, but a full review of the literature on conveyor design and a study of present practice within the belting industry demonstrate other, less critical, factors that could profitably be investigated. It is suggested that the most suitable method of studying these would be a rational data collection system to provide information on various facets of belt service behaviour; a basis for such a system is proposed. In addition to the work above, proposals for simplifying the present belt selection methods are made, and a strain transducer suitable for use in future experimental investigations is developed.

Relevance:

100.00%

Publisher:

Abstract:

Computer-integrated monitoring is a very large area in engineering, in which on-line, real-time data acquisition with the aid of sensors solves many problems in the manufacturing industry that the old method of data logging by graphical analysis could not. The raw data collected this way is, however, useless in the absence of a proper computerized management system. The transfer of data between the management and the shop-floor processes has been impossible in the past unless all the computers in the system were totally compatible with each other. This limits the efficiency of such systems, because they are governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers. MAP is still in its early development stages and is currently very expensive. This research programme shows how a shop-floor data acquisition system and a complete management system on entirely different computers can be integrated to form a single system, achieving data transfer communications using a cheaper but superior alternative to MAP. Standard communication character sets and hardware, such as ASCII and UARTs, have been used in this method, but the technique is so powerful that totally incompatible computers are shown to run different programs (in different languages) simultaneously, and yet receive data from each other and process it in their own CPUs with no human intervention.
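The ASCII/UART approach can be illustrated with a minimal framing scheme. This is a hypothetical sketch, not the protocol developed in the research programme: frames carry an STX/ETX-delimited ASCII payload followed by a two-hex-digit checksum, so any machine that can send and receive ASCII bytes can exchange data regardless of its architecture.

```python
# ASCII control characters used as frame delimiters.
STX, ETX = '\x02', '\x03'

def encode(payload: str) -> str:
    """Wrap an ASCII payload (which must not contain STX/ETX) in a frame:
    STX <payload> ETX <checksum as two hex digits>."""
    checksum = sum(payload.encode('ascii')) % 256
    return f"{STX}{payload}{ETX}{checksum:02X}"

def decode(frame: str) -> str:
    """Validate delimiters and checksum, returning the payload."""
    if frame[0] != STX or frame[-3] != ETX:
        raise ValueError("bad framing")
    payload, received = frame[1:-3], int(frame[-2:], 16)
    if sum(payload.encode('ascii')) % 256 != received:
        raise ValueError("checksum mismatch")
    return payload
```

Because everything on the wire is 7-bit ASCII, the same frames can be produced and parsed by programs written in different languages on incompatible machines.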

Relevance:

100.00%

Publisher:

Abstract:

A mathematical model has been developed for predicting the spectral distribution of solar radiation incident on a horizontal surface. The solar spectrum in the wavelength range 0.29 to 4.0 micrometres has been divided into 144 intervals. Two variables in the model are the atmospheric water vapour content and the atmospheric turbidity. After allowing for absorption and scattering in the atmosphere, the spectral intensities of the direct and diffuse components of radiation are computed. When the predicted radiation levels are compared with the measured values for the total radiation, and with the values obtained with glass filters RG715, RG630 and OG530, a close agreement (±5%) is achieved under clear sky conditions. A solar radiation measuring facility, close to the centre of Birmingham, has been set up utilising a microcomputer-based data logging system. A suite of computer programs in the BASIC programming language has been developed and extensively tested for solar radiation data logging, analysis and plotting. Two commonly used instruments, the Eppley PSP pyranometer and the Kipp and Zonen CM5 pyranometer, have been compared under different experimental conditions. Three models for computing the inclined-plane irradiation, using total and diffuse radiation on a horizontal surface, have been tested for Birmingham. The anisotropic all-sky model proposed by Klucher provides a good agreement between the measured and the predicted radiation levels. Measurements of solar spectral distribution, using glass filters, are also reported for a number of south-facing inclines.
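The Klucher model referred to above has a compact closed form. The sketch below follows the commonly cited formulation, in which the isotropic sky-view term is multiplied by a modulating function F built from the diffuse fraction; treat the exact form and the sample inputs as assumptions to be checked against the original reference.

```python
import math

def klucher_diffuse(i_total, i_diffuse, beta, theta, theta_z):
    """Diffuse irradiance on a tilted plane (Klucher anisotropic all-sky
    model, as commonly cited). i_total, i_diffuse: horizontal total and
    diffuse irradiance (W/m^2); beta: tilt angle; theta: incidence angle
    on the tilted plane; theta_z: solar zenith angle (all radians)."""
    f = 1.0 - (i_diffuse / i_total) ** 2   # modulating function
    return (i_diffuse
            * (1.0 + math.cos(beta)) / 2.0
            * (1.0 + f * math.sin(beta / 2.0) ** 3)
            * (1.0 + f * math.cos(theta) ** 2 * math.sin(theta_z) ** 3))
```

Under a fully overcast sky (diffuse equals total) F vanishes and the expression collapses to the isotropic sky-view factor, while on clear skies the extra horizon- and circumsolar-brightening terms raise the tilted-plane estimate.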

Relevance:

100.00%

Publisher:

Abstract:

With the competitive challenge facing business today, the need to keep costs down and quality up is a matter of survival. One way in which wire manufacturers can meet this challenge is to possess a thorough understanding of deformation, friction and lubrication during the wire drawing process, and therefore to make good decisions regarding the selection and application of lubricants as well as the die design. Friction, lubrication and die design during wire drawing are thus the subject of this study. Although theoretical and experimental investigations have been carried out ever since the establishment of wire drawing technology, many problems remain unsolved. It is therefore necessary to conduct further research on traditional and fundamental subjects such as the mechanics of deformation, friction, lubrication and die design in wire drawing. Drawing experiments were carried out on an existing bull-block under different cross-sectional area reductions, different speeds and different lubricants. The instrumentation to measure drawing load and drawing speed was set up and connected to the wire drawing machine, together with a data acquisition system. A die box connected to the existing die holder for using dry soap lubricant was designed and tested. The experimental results, in terms of drawing stress versus percentage area reduction curves under different drawing conditions, were analysed and compared. The effects on drawing stress of friction, lubrication, drawing speed and the pressure die nozzle are discussed. In order to determine the flow stress of the material during deformation, tensile tests were performed on an Instron universal test machine, using the wires drawn under different area reductions. A polynomial function is used to correlate the flow stress of the material with the plastic strain, and a general computer program has been written to find the coefficients of the stress-strain function.
The residual lubricant film on the steel wire after drawing was examined both radially and longitudinally using an SEM and an optical microscope. The lubricant film on the drawn wire was clearly observed. The micro-analysis by SEM therefore provides a means of assessing friction and lubrication in wire drawing.
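The polynomial stress-strain correlation can be sketched directly. The strain/stress pairs below are invented, and a quadratic is used for illustration; the thesis does not state the polynomial order it adopted.

```python
import numpy as np

# Invented tensile-test data: plastic strain vs flow stress (MPa) for wires
# drawn to different area reductions.
strain = np.array([0.05, 0.10, 0.20, 0.35, 0.55, 0.80])
stress = np.array([420.0, 480.0, 545.0, 610.0, 665.0, 705.0])

# Correlate flow stress with plastic strain by a least-squares polynomial,
# as described in the text (order chosen here arbitrarily).
coeffs = np.polyfit(strain, stress, deg=2)
flow_stress = np.poly1d(coeffs)
```

The fitted `flow_stress` function can then be evaluated at any strain inside the tested range when computing deformation work; a power-law (Hollomon-type) form is a common alternative to a polynomial for this kind of data.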

Relevance:

100.00%

Publisher:

Abstract:

Bayesian decision theory is increasingly applied to support decision-making processes under environmental variability and uncertainty. Researchers from application areas like psychology and biomedicine have applied these techniques successfully. However, in the area of software engineering and specifically in the area of self-adaptive systems (SASs), little progress has been made in the application of Bayesian decision theory. We believe that techniques based on Bayesian Networks (BNs) are useful for systems that dynamically adapt themselves at runtime to a changing environment, which is usually uncertain. In this paper, we discuss the case for the use of BNs, specifically Dynamic Decision Networks (DDNs), to support the decision-making of self-adaptive systems. We present how such a probabilistic model can be used to support the decision making in SASs and justify its applicability. We have applied our DDN-based approach to the case of an adaptive remote data mirroring system. We discuss results, implications and potential benefits of the DDN to enhance the development and operation of self-adaptive systems, by providing mechanisms to cope with uncertainty and automatically make the best decision.
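A minimal sketch of the decision-making step, with invented numbers and a deliberately tiny state space: a belief over the environment state is combined with a utility table to choose an adaptation, and a Bayes update of the belief after an observation can reverse the choice. This illustrates the general idea, not the paper's DDN model.

```python
# Hypothetical self-adaptive data-mirroring example. All numbers invented.
belief = {"reliable": 0.7, "degraded": 0.3}           # P(network state)
utility = {
    "sync_mirroring":  {"reliable": 8.0, "degraded": 2.0},
    "async_mirroring": {"reliable": 6.0, "degraded": 5.0},
}

def best_decision(belief):
    # Maximum expected utility over the candidate adaptations.
    eu = {d: sum(belief[s] * u[s] for s in belief) for d, u in utility.items()}
    return max(eu, key=eu.get)

first = best_decision(belief)          # decision under the prior belief

# An observation (e.g. dropped packets) arrives; Bayes-update the belief
# and re-decide -- the "dynamic" part of a Dynamic Decision Network.
likelihood = {"reliable": 0.1, "degraded": 0.6}   # P(observation | state)
post = {s: likelihood[s] * belief[s] for s in belief}
z = sum(post.values())
belief = {s: p / z for s, p in post.items()}

second = best_decision(belief)         # decision under the posterior
```

Under the prior the synchronous option wins; after the evidence shifts the belief towards a degraded network, the asynchronous option becomes the maximum-expected-utility choice.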

Relevance:

100.00%

Publisher:

Abstract:

Timing jitter is a major factor limiting the performance of any high-speed, long-haul data transmission system. It arises for a number of reasons, such as interaction with accumulated spontaneous emission, inter-symbol interference (ISI) and electrostriction. Some effects causing timing jitter can be reduced by means of nonlinear filtering, using, for example, a nonlinear optical loop mirror (NOLM) [1]. The NOLM has been shown to reduce timing jitter by suppressing the ASE and by stabilising the pulse duration [2, 3]. In this paper, we investigate the dynamics of timing jitter in a 2R regenerated system, nonlinearly guided by NOLMs, at bit rates of 10, 20, 40 and 80 Gbit/s. The transmission performance of an equivalent non-regenerated (generic) system is taken as a reference.
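For context, the power-dependent transmission of a single NOLM is often written in the textbook form T(P) = 1 - 2a(1-a)[1 + cos((1-2a)·gamma·P·L)] for coupling ratio a, fibre nonlinear coefficient gamma and loop length L. The sketch below uses that form with invented parameter values; it is not the simulation model of the paper.

```python
import math

# Invented NOLM parameters.
alpha = 0.45        # coupler splitting ratio
gamma = 3.0e-3      # nonlinear coefficient, 1/(W*m)
length = 100.0      # loop length, m

def nolm_transmission(p_in):
    """Fraction of input power P (W) transmitted by the loop mirror,
    using the textbook CW transfer function."""
    dphi = (1.0 - 2.0 * alpha) * gamma * p_in * length
    return 1.0 - 2.0 * alpha * (1.0 - alpha) * (1.0 + math.cos(dphi))

# First full-switching peak: nonlinear phase difference of pi.
p_switch = math.pi / ((1.0 - 2.0 * alpha) * gamma * length)
```

Low-power light is mostly reflected while pulses near `p_switch` are fully transmitted; it is this intensity discrimination that lets the NOLM strip ASE and act as a 2R regenerator.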

Relevance:

100.00%

Publisher:

Abstract:

The value of knowing about data availability and system accessibility is analyzed through theoretical models of Information Economics. When a user places an inquiry for information, it is important for the user to learn whether the system is inaccessible or the data is unavailable, rather than to receive no response at all. In reality, various outcomes can be provided by the system: nothing is displayed to the user (e.g., a traffic light that does not operate, a browser that keeps browsing, a telephone that does not answer); random noise is displayed (e.g., a traffic light that displays random signals, a browser that provides disorderly results, an automatic voice message that does not clarify the situation); or a special signal indicates that the system is not operating (e.g., a blinking amber light indicating that the traffic light is down, a browser responding that the site is unavailable, a voice message regretting to tell that the service is not available). This article develops a model to assess the value of the information for the user in such situations by employing the information structure model prevailing in Information Economics. Examples related to data accessibility in centralized and in distributed systems are provided for illustration.
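A toy version of the value computation, with invented costs: compare the user's expected cost with and without a status signal. This is a drastic simplification of the information-structure model, for illustration only.

```python
# Invented parameters: the system is up with probability p_up; waiting out
# a timeout costs wait_cost, retrying elsewhere immediately costs retry_cost.
p_up = 0.8
wait_cost, retry_cost = 5.0, 2.0

# Without a status signal the user cannot distinguish "down" from "slow"
# and always waits out the timeout.
cost_no_signal = wait_cost

# With a signal the user waits only when the system is up, and retries
# immediately when told the system is down.
cost_signal = p_up * wait_cost + (1.0 - p_up) * retry_cost

# The value of the signal is the expected cost it saves the user.
value_of_signal = cost_no_signal - cost_signal
```

The signal is worth exactly the retry saving weighted by the probability of a down system; richer information structures refine this by distinguishing "system inaccessible" from "data unavailable".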