73 results for Multi-phase Modelling

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

A second-harmonic direct current (DC) ripple compensation technique is presented for a multi-phase, fault-tolerant, permanent magnet machine. The analysis has been undertaken in a general manner for any pair of phases in operation with the remaining phases inactive. The compensation technique determines the required alternating currents in the machine to eliminate the second-harmonic DC-link current, while at the same time minimising the total rms current in the windings. An additional benefit of the compensation technique is a reduction in the magnitude of the electromagnetic torque ripple. Practical results are included from a 70 kW, five-phase generator system to validate the analysis and illustrate the performance of the compensation technique.
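
As a hedged illustration of the underlying optimisation (a numerical sketch, not the paper's closed-form analysis): with two phases live, the DC-link power contains a DC term and a second-harmonic term, and the current amplitudes and phase angles can be chosen to null the latter while holding average power and minimising total rms current. The EMF amplitude, the EMF angles (adjacent phases of a five-phase machine) and the power target below are assumptions for illustration.

    import numpy as np
    from scipy.optimize import minimize

    E = 100.0                                # assumed phase-EMF amplitude [V]
    alpha = np.array([0.0, 2 * np.pi / 5])   # assumed EMF angles of the two live phases
    P_target = 5e3                           # assumed average DC-link power [W]

    def avg_power(x):
        Ia, Ib, tha, thb = x
        return 0.5 * E * (Ia * np.cos(alpha[0] - tha) + Ib * np.cos(alpha[1] - thb))

    def second_harmonic(x):
        # complex phasor of the 2w power ripple (common factor E/2 dropped);
        # both real and imaginary parts must vanish
        Ia, Ib, tha, thb = x
        ph = Ia * np.exp(1j * (alpha[0] + tha)) + Ib * np.exp(1j * (alpha[1] + thb))
        return np.array([ph.real, ph.imag])

    def total_rms_sq(x):
        Ia, Ib = x[0], x[1]
        return 0.5 * (Ia ** 2 + Ib ** 2)     # objective: total winding rms current squared

    cons = [{'type': 'eq', 'fun': lambda x: avg_power(x) - P_target},
            {'type': 'eq', 'fun': second_harmonic}]
    res = minimize(total_rms_sq, x0=[80.0, 80.0, 0.2, -0.2], constraints=cons)
    print(res.x)                             # amplitudes and phase angles of i_a, i_b

The solver converges to equal amplitudes with the current phasors arranged so their second-harmonic contributions cancel, which is the qualitative behaviour the paper derives analytically for any pair of phases.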

Relevance:

100.00%

Publisher:

Abstract:

The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation: the only technique capable of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and in both the speed and accuracy of evaluation. A hierarchical, or multi-level, modelling procedure has therefore been developed to provide an effective control mechanism, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick and easy, and that models can expand in line with design developments. However, current approaches to computer simulation are wholly inappropriate for supporting such hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterised by a requirement for very specialist expertise, a lengthy model development phase and a correspondingly high expenditure, resulting in very little, and rather inappropriate, use of the technique. Simulation, when used, is generally applied only to check or verify a final design proposal; rarely is its full potential utilised to aid, support or complement the manufacturing system design procedure. To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted, since such systems require no specialist expertise and instead facilitate quick and easy model creation, execution and modification through simple data inputs. Previous generic simulators have tended to be too restricted, lacking the flexibility to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
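
The generic-simulator idea is that the model is data rather than code, which is what makes quick creation and modification, and hence multi-level refinement, practical. A minimal sketch of that idea follows; the station names, machine counts, process times and routing are illustrative assumptions, not ATOMS input.

    import heapq

    model = {
        'stations': {'turn': 1, 'mill': 2, 'inspect': 1},             # name -> machines
        'route':    [('turn', 5.0), ('mill', 3.0), ('inspect', 1.0)], # (station, time)
        'arrivals': [i * 4.0 for i in range(20)],                     # part release times
    }

    def simulate(model):
        free = dict(model['stations'])                # idle machines per station
        queue = {s: [] for s in model['stations']}    # parts waiting: (part, step)
        ev = [(t, 'arrive', i, 0) for i, t in enumerate(model['arrivals'])]
        heapq.heapify(ev)
        finish_times = {}
        while ev:
            t, kind, part, step = heapq.heappop(ev)
            if kind == 'arrive':
                if step == len(model['route']):
                    finish_times[part] = t            # part leaves the system
                    continue
                station, ptime = model['route'][step]
                if free[station] > 0:                 # machine available: start now
                    free[station] -= 1
                    heapq.heappush(ev, (t + ptime, 'finish', part, step))
                else:                                 # otherwise wait in queue
                    queue[station].append((part, step))
            else:                                     # 'finish': release the machine
                station, _ = model['route'][step]
                free[station] += 1
                heapq.heappush(ev, (t, 'arrive', part, step + 1))
                if queue[station]:                    # pull the next waiting part
                    p2, s2 = queue[station].pop(0)
                    _, pt2 = model['route'][s2]
                    free[station] -= 1
                    heapq.heappush(ev, (t + pt2, 'finish', p2, s2))
        return finish_times

    times = simulate(model)
    print('makespan:', max(times.values()))

Refining the design from a coarse to a detailed model here means editing the dictionary, mirroring the quick, data-driven model evolution the multi-level procedure assumes.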

Relevance:

90.00%

Publisher:

Abstract:

The Self-Organizing Map (SOM) algorithm has been extensively studied and has been applied with considerable success to a wide variety of problems. However, the algorithm is derived from heuristic ideas and this leads to a number of significant limitations. In this paper, we consider the problem of modelling the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. We introduce a novel form of latent variable model, which we call the GTM algorithm (for Generative Topographic Mapping), which allows general non-linear transformations from latent space to data space, and which is trained using the EM (expectation-maximization) algorithm. Our approach overcomes the limitations of the SOM, while introducing no significant disadvantages. We demonstrate the performance of the GTM algorithm on simulated data from flow diagnostics for a multi-phase oil pipeline.
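
A minimal numpy sketch of the GTM training loop described here (after Bishop, Svensén and Williams): a regular grid of latent points is mapped through fixed RBF basis functions into data space, and the weights W and noise precision beta are fitted by EM. The grid sizes, basis width and toy data are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3)); X[:, 2] = np.sin(X[:, 0])   # toy 3-D data on a 2-D sheet
    N, D = X.shape

    def grid(n):                                    # n*n points in [-1, 1]^2
        g = np.linspace(-1, 1, n)
        return np.array([(a, b) for a in g for b in g])

    Z, C = grid(10), grid(4)                        # latent points and RBF centres
    sigma2 = 0.5                                    # assumed RBF width
    Phi = np.exp(-((Z[:, None] - C[None]) ** 2).sum(-1) / (2 * sigma2))
    Phi = np.hstack([Phi, np.ones((len(Z), 1))])    # bias column
    K, M = Phi.shape

    W = rng.normal(scale=0.1, size=(M, D))          # small random initialisation
    beta = 1.0
    for it in range(50):                            # EM iterations
        Y = Phi @ W                                 # K x D mixture-centre positions
        d2 = ((X[None] - Y[:, None]) ** 2).sum(-1)  # K x N squared distances
        logR = -0.5 * beta * d2
        R = np.exp(logR - logR.max(0))              # E-step: responsibilities
        R /= R.sum(0)
        G = np.diag(R.sum(1))                       # M-step: weighted least squares for W
        W = np.linalg.solve(Phi.T @ G @ Phi + 1e-3 * np.eye(M), Phi.T @ R @ X)
        d2 = ((X[None] - (Phi @ W)[:, None]) ** 2).sum(-1)
        beta = N * D / (R * d2).sum()               # update noise precision

    post_mean = R.T @ Z                             # N x 2 latent posterior means
    print(post_mean[:5])                            # points for the visualisation map

The posterior-mean latent coordinates play the role of the SOM's winning units, but here they fall out of a proper probabilistic model with a likelihood that EM provably increases.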

Relevance:

80.00%

Publisher:

Abstract:

There is currently considerable interest in developing general non-linear density models based on latent, or hidden, variables. Such models have the ability to discover the presence of a relatively small number of underlying 'causes' which, acting in combination, give rise to the apparent complexity of the observed data set. Unfortunately, training such models generally requires large computational effort. In this paper we introduce a novel latent variable algorithm which retains the general non-linear capabilities of previous models but which uses a training procedure based on the EM algorithm. We demonstrate the performance of the model on a toy problem and on data from flow diagnostics for a multi-phase oil pipeline.
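
The density model underlying this family of algorithms, stated explicitly: with a prior over the latent variable z concentrated on a regular grid of points z_k, a non-linear map y(z; W) and isotropic Gaussian noise of precision beta, the marginal density becomes a constrained mixture whose likelihood EM can maximise:

    p(\mathbf{x}) = \int p(\mathbf{x} \mid \mathbf{z})\, p(\mathbf{z})\, d\mathbf{z}
                  \approx \frac{1}{K} \sum_{k=1}^{K}
                  \mathcal{N}\!\big(\mathbf{x} \,\big|\, \mathbf{y}(\mathbf{z}_k; \mathbf{W}),\, \beta^{-1}\mathbf{I}\big).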

Relevance:

80.00%

Publisher:

Abstract:

Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis, which is based on a linear transformation between the latent space and the data space. In this paper we introduce a form of non-linear latent variable model called the Generative Topographic Mapping, for which the parameters of the model can be determined using the EM algorithm. GTM provides a principled alternative to the widely used Self-Organizing Map (SOM) of Kohonen (1982), and overcomes most of the significant limitations of the SOM. We demonstrate the performance of the GTM algorithm on a toy problem and on simulated data from flow diagnostics for a multi-phase oil pipeline.
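
The contrast with factor analysis can be written in one line: factor analysis restricts the latent-to-data map to be linear, while the GTM passes z through a fixed set of non-linear basis functions phi,

    \text{factor analysis: } \mathbf{x} = \mathbf{W}\mathbf{z} + \boldsymbol{\mu} + \boldsymbol{\epsilon},
    \qquad
    \text{GTM: } \mathbf{x} = \mathbf{W}\,\boldsymbol{\phi}(\mathbf{z}) + \boldsymbol{\epsilon},

with Gaussian noise epsilon in both cases; the EM update for W stays tractable because W still enters linearly.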

Relevance:

80.00%

Publisher:

Abstract:

Visualization has proven to be a powerful and widely applicable tool for the analysis and interpretation of data. Most visualization algorithms aim to find a projection from the data space down to a two-dimensional visualization space. However, for complex data sets living in a high-dimensional space it is unlikely that a single two-dimensional projection can reveal all of the interesting structure. We therefore introduce a hierarchical visualization algorithm which allows the complete data set to be visualized at the top level, with clusters and sub-clusters of data points visualized at deeper levels. The algorithm is based on a hierarchical mixture of latent variable models, whose parameters are estimated using the expectation-maximization algorithm. We demonstrate the principle of the approach first on a toy data set, and then apply the algorithm to the visualization of a synthetic data set in 12 dimensions obtained from a simulation of multi-phase flows in oil pipelines, and to data in 36 dimensions derived from satellite images.
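
A hedged sketch of the two-level idea using stand-ins: the paper builds a hierarchical mixture of latent variable models trained by EM, whereas here a Gaussian mixture supplies the soft cluster responsibilities and per-cluster weighted PCA supplies the second-level 2-D views. The component count and data are illustrative assumptions, not the paper's models.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(m, 1.0, size=(200, 12)) for m in (0.0, 4.0, 8.0)])

    top = PCA(n_components=2).fit(X)              # top-level view of the whole set
    V0 = top.transform(X)

    gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
    R = gmm.predict_proba(X)                      # soft responsibilities, N x 3

    views = []
    for k in range(R.shape[1]):                   # one second-level view per cluster
        w = R[:, k]
        mu = (w[:, None] * X).sum(0) / w.sum()    # responsibility-weighted mean
        Xc = (X - mu) * np.sqrt(w)[:, None]       # responsibility-weighted centring
        pca_k = PCA(n_components=2).fit(Xc)       # weighted PCA (approximate)
        views.append((X - mu) @ pca_k.components_.T)
    print(V0.shape, [v.shape for v in views])

Each point appears in every child view with an opacity proportional to its responsibility, which is how the hierarchy lets sub-cluster structure emerge without hard assignments.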

Relevance:

80.00%

Publisher:

Abstract:

Organizations today face intense competitive and economic pressures, leading to large-scale transformation of existing business operations and transactions. In addition, organizations have adopted automated business processes to deal with partners and customers. E-business diffusion is a multi-phase process, moving from initiation through to routinisation, and insight into the adoption process helps organizations adopt e-business more effectively. It is imperative that organizations effectively manage the e-business environment and all associated changes, to accommodate the changing relationships with customers and business partners and, more importantly, to improve performance. This chapter discusses the effect of e-business implementation, usage and diffusion (the routinisation stage) on business performance. © 2010, IGI Global.

Relevance:

50.00%

Publisher:

Abstract:

A technique is presented for the development of a high precision and resolution Mean Sea Surface (MSS) model. The model utilises radar altimetric sea surface heights extracted from the geodetic phase of the ESA ERS-1 mission. The methodology uses a modified Le Traon et al. (1995) cubic-spline fit of dual ERS-1 and TOPEX/Poseidon crossovers for the minimisation of radial orbit error. The procedure then uses Fourier-domain processing techniques for spectral optimal interpolation of the mean sea surface in order to reduce residual errors within the model. Additionally, a multi-satellite mean sea surface integration technique is investigated to supplement the first model with additional enhanced data from the GEOSAT geodetic mission. The methodology employs a novel technique that combines the Stokes and Vening Meinesz transformations, again in the spectral domain. This allows the presentation of a new enhanced GEOSAT gravity anomaly field.
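
For reference, one standard planar-approximation form of the spectral relations involved (stated as background, not necessarily the thesis's exact formulation): with geoid height N derived from the MSS, normal gravity gamma, 2-D wavenumber vector k and vertical deflection components (xi, eta), the inverse-Stokes and Vening Meinesz relations in the Fourier domain read

    \widetilde{\Delta g}(\mathbf{k}) = 2\pi\gamma\,|\mathbf{k}|\,\tilde{N}(\mathbf{k}),
    \qquad
    (\tilde{\xi}, \tilde{\eta}) = -2\pi i\,(k_x, k_y)\,\tilde{N}(\mathbf{k}),

so gravity anomalies follow from either the heights or the deflections by simple multiplications in the spectral domain, which is what makes the combined transformation efficient.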

Relevance:

50.00%

Publisher:

Abstract:

The work described in this thesis focuses on the use of a design-of-experiments approach in a multi-well mini-bioreactor to enable the rapid establishment of high-yielding production-phase conditions in yeast, an increasingly popular host system in both academic and industrial laboratories. Using green fluorescent protein secreted from the yeast Pichia pastoris, a scalable predictive model of protein yield per cell was derived from 13 sets of conditions, each combining three factors (temperature, pH and dissolved oxygen) at three levels, and was directly transferable to a 7 L bioreactor. This was in clear contrast to the situation in shake flasks, where the process parameters cannot be tightly controlled. By further optimising both the accumulation of cell density in the batch phase and the fed-batch induction regime, additional yield improvements were found to be additive to the per-cell yield of the model. A separate study also demonstrated that improving biomass accumulation improved product yield in a second yeast species, Saccharomyces cerevisiae. Investigations of cell wall hydrophobicity in high-cell-density P. pastoris cultures indicated that cell wall hydrophobin (protein) composition changes with growth phase, the cell wall becoming more hydrophobic in log growth than in the lag or stationary phases, possibly due to an increased occurrence of proteins associated with cell division. Finally, the modelling approach was validated in mammalian cells, showing its flexibility and robustness. In summary, the strategy presented in this thesis has the benefit of reducing process development time in recombinant protein production, directly from bench to bioreactor.
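
The abstract does not name the design, but 13 runs over three factors at three coded levels is exactly the size of a Box-Behnken layout (the three factor pairs at +/-1 with the third factor centred, plus a centre point). A hedged sketch of generating such a design and fitting a quadratic response surface by least squares; the simulated yields are for illustration only, not the thesis data.

    import itertools
    import numpy as np

    pairs = list(itertools.combinations(range(3), 2))
    runs = [np.zeros(3)]                              # centre point
    for i, j in pairs:                                # +/-1 on each factor pair
        for a, b in itertools.product((-1, 1), repeat=2):
            r = np.zeros(3); r[i], r[j] = a, b
            runs.append(r)
    X = np.array(runs)                                # 13 x 3 coded design matrix

    rng = np.random.default_rng(2)                    # fake yields for illustration
    y = 10 - (X ** 2).sum(1) + X[:, 0] + 0.1 * rng.normal(size=len(X))

    def quad_features(X):                             # [1, x, x^2, cross terms]
        cols = [np.ones(len(X))] + [X[:, i] for i in range(3)]
        cols += [X[:, i] ** 2 for i in range(3)]
        cols += [X[:, i] * X[:, j] for i, j in pairs]
        return np.column_stack(cols)

    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    print(np.round(beta, 2))                          # 10 fitted coefficients

With 13 runs and 10 coefficients the quadratic surface is identifiable, which is why designs of this size can locate optimal temperature, pH and dissolved-oxygen settings so economically.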

Relevance:

40.00%

Publisher:

Abstract:

A conventional neural network approach to regression problems approximates the conditional mean of the output vector. For mappings which are multi-valued this approach breaks down, since the average of two solutions is not necessarily a valid solution. In this article mixture density networks, a principled method to model conditional probability density functions, are applied to retrieving Cartesian wind vector components from satellite scatterometer data. A hybrid mixture density network is implemented to incorporate prior knowledge of the predominantly bimodal function branches. An advantage of a fully probabilistic model is that more sophisticated and principled methods can be used to resolve ambiguities.
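
A minimal sketch of a plain mixture density network in PyTorch on a toy bimodal mapping of the kind described (this is the generic MDN, not the paper's hybrid architecture; the layer sizes, component count and data are illustrative assumptions).

    import math
    import torch
    import torch.nn as nn

    class MDN(nn.Module):
        def __init__(self, d_in, d_out, n_comp=2, hidden=32):
            super().__init__()
            self.body = nn.Sequential(nn.Linear(d_in, hidden), nn.Tanh())
            self.pi = nn.Linear(hidden, n_comp)          # mixing-coefficient logits
            self.mu = nn.Linear(hidden, n_comp * d_out)  # component means
            self.log_sigma = nn.Linear(hidden, n_comp)   # isotropic component widths
            self.n_comp, self.d_out = n_comp, d_out

        def forward(self, x):
            h = self.body(x)
            return (torch.log_softmax(self.pi(h), dim=-1),
                    self.mu(h).view(-1, self.n_comp, self.d_out),
                    self.log_sigma(h))

    def mdn_nll(log_pi, mu, log_sigma, y):
        # negative log-likelihood of y under the predicted Gaussian mixture
        d = y.shape[-1]
        sq = ((y[:, None, :] - mu) ** 2).sum(-1)         # N x K squared errors
        log_norm = (-0.5 * sq * torch.exp(-2 * log_sigma)
                    - d * log_sigma - 0.5 * d * math.log(2 * math.pi))
        return -torch.logsumexp(log_pi + log_norm, dim=-1).mean()

    # toy bimodal problem: two equally valid 2-D targets for the same input,
    # mimicking the directional ambiguity in scatterometer wind retrieval
    x = torch.rand(512, 4)
    y = torch.where(torch.rand(512, 1) < 0.5, x[:, :2], -x[:, :2])
    net = MDN(4, 2)
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    for step in range(500):
        opt.zero_grad()
        loss = mdn_nll(*net(x), y)
        loss.backward()
        opt.step()
    print(float(loss))

A conditional-mean regressor would collapse the two branches to their average; the two mixture components instead track both solutions, and their predicted mixing coefficients quantify the ambiguity so it can be resolved downstream.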

Relevance:

40.00%

Publisher:

Abstract:

For the first time, fiber Bragg grating (FBG) structures have been inscribed in single-core passive germanate and three-core passive and active tellurite glass fibers using an 800 nm femtosecond (fs) laser and the phase mask technique. With fs peak power intensities of the order of 10¹¹ W/cm², FBG spectra with 2nd- and 3rd-order resonances at 1540 and 1033 nm in the germanate glass fiber, and 2nd-order resonances at ~1694 and ~1677 nm with strengths up to 14 dB in all three cores of the tellurite fiber, were observed. The thermal responsivities of the FBGs made in these mid-IR glass fibers were characterised, showing an average temperature responsivity of ~20 pm/°C. The strain responsivity of the FBGs in the germanate glass fiber was measured to be 1.219 pm/µε.
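
As a consistency check (stated here as background, with effective-index dispersion explaining the small departure from an exact ratio): for a grating of period Lambda, roughly half the phase-mask period, the m-th order Bragg condition is

    m\,\lambda_m = 2\,n_{\mathrm{eff}}\,\Lambda,

so for a fixed grating the 2nd- and 3rd-order resonances should satisfy lambda_2/lambda_3 = 3/2, and indeed 1540/1033 ≈ 1.49.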

Relevance:

40.00%

Publisher:

Abstract:

We propose several all-pass, spectrally periodic optical structures composed of simple optical cavities for the implementation of repetition-rate multipliers that act on periodic pulse trains by phase-only filtering and produce a uniform output train envelope, and we analyze them in terms of robustness and accuracy.
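
For context, the standard phase-only route to rate multiplication in this literature is the fractional temporal Talbot effect (background, not necessarily the exact filters proposed here): for an input train of period T, applying to comb line k the spectral phase

    \varphi_k = \pi\,\frac{s}{m}\,k^2, \qquad \gcd(s, m) = 1,

multiplies the repetition rate by m (up to well-known parity conditions on s and m) while leaving the magnitude of every spectral line untouched, which is why a phase-only, all-pass filter can keep the output train envelope uniform.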