13 results for Transforms

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine the wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches to combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that the forecasting accuracy is significantly improved by using the WT and adaptive models. The best models for the electricity demand and gas price forecasts are the adaptive MLP and adaptive GARCH respectively, with the multicomponent forecast; their MSEs are 0.02314 and 0.15384 respectively.
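As a toy illustration of the multicomponent idea (not the paper's actual models), the sketch below decomposes a series with a one-level Haar wavelet transform, fits a simple AR(1) predictor to each component in place of the MLP/GARCH models, and recombines the component forecasts. All names are mine, and the input length is assumed even.

```python
import math

def haar_dwt(x):
    """One-level Haar decomposition into approximation and detail coefficients."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x) - 1, 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse one-level Haar transform."""
    s = math.sqrt(2.0)
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x

def ar1_forecast(series):
    """Least-squares AR(1) one-step forecast: y[t] = phi * y[t-1]."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    phi = num / den if den else 0.0
    return phi * series[-1]

def multicomponent_forecast(x):
    """Forecast each wavelet component separately, then recombine."""
    approx, detail = haar_dwt(x)
    a_next = ar1_forecast(approx)
    d_next = ar1_forecast(detail)
    # Reconstructing the next coefficient pair yields the next two samples;
    # the first of them is the one-step-ahead forecast.
    return haar_idwt([a_next], [d_next])[0]
```

For a purely geometric series the component-wise AR(1) fits are exact, so the recombined forecast matches the true next value.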

Relevance:

20.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

10.00%

Publisher:

Abstract:

This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, which provide fault-tolerant and real-time behaviour, were developed. Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the analysis of the complete system in a unified manner. A common problem for Petri net based techniques is that of state space explosion; a modular approach to both design and analysis would help cope with this problem. Although extensions to Petri nets that allow module construction exist, the modularisation is generally restricted to the specification, and analysis must be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered, and a hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules.
In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour on the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Thus local events are concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events allows the management of state space. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. This hybrid approach is applied to Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
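For reference, a minimal sketch of the standard two-phase commit protocol that the thesis extends; the fault-tolerance, timing and Petri net aspects developed in the research are deliberately omitted, and the class names are illustrative.

```python
from enum import Enum

class Decision(Enum):
    COMMIT = "commit"
    ABORT = "abort"

class Participant:
    """A process voting on whether its local action can be made permanent."""
    def __init__(self, can_commit):
        self.can_commit = can_commit
        self.decision = None

    def prepare(self):
        # Phase 1: vote. A crashed or late participant would count as a 'no'.
        return self.can_commit

    def complete(self, decision):
        # Phase 2: apply the coordinator's global decision locally.
        self.decision = decision

def two_phase_commit(participants):
    """Coordinator: commit only if every participant votes yes."""
    votes = [p.prepare() for p in participants]
    decision = Decision.COMMIT if all(votes) else Decision.ABORT
    for p in participants:
        p.complete(decision)
    return decision
```

The atomicity property described in the abstract corresponds to every participant ending up with the same decision.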

Relevance:

10.00%

Publisher:

Abstract:

Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is very often difficult. In this thesis, methods for solving some of these queueing problems by use of digital signal processing techniques are presented. The z-transform of the queue length distribution for the M/G^Y/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
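The thesis's own inversion techniques are not reproduced here, but the underlying idea of numerically inverting a queueing transform can be illustrated with a standard DFT-based inversion of a probability generating function, checked against the M/M/1 queue, whose inverse is known in closed form. The function names are mine.

```python
import cmath

def invert_pgf(pgf, n_terms, samples=256):
    """Numerically invert a probability generating function P(z) = sum p_n z^n
    by sampling it on the unit circle and applying an inverse DFT.
    The aliasing error in p_n is sum_{m>=1} p_(n + m*samples), which is
    negligible for light-tailed queue length distributions."""
    vals = [pgf(cmath.exp(2j * cmath.pi * k / samples)) for k in range(samples)]
    probs = []
    for n in range(n_terms):
        acc = sum(vals[k] * cmath.exp(-2j * cmath.pi * k * n / samples)
                  for k in range(samples))
        probs.append((acc / samples).real)
    return probs

# Check against M/M/1: its queue length PGF is P(z) = (1 - rho) / (1 - rho*z),
# whose exact inverse is the geometric distribution p_n = (1 - rho) * rho**n.
rho = 0.5
p = invert_pgf(lambda z: (1 - rho) / (1 - rho * z), n_terms=5)
```

The recovered probabilities agree with the geometric distribution to within the aliasing error.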

Relevance:

10.00%

Publisher:

Abstract:

Database systems have a user interface, one component of which will normally be a query language based on a particular data model. Typically, data models provide primitives to define, manipulate and query databases, and often these primitives are designed to form self-contained query languages. This thesis describes a prototype implementation of a system which allows users to specify queries against the database in a query language whose primitives are not those provided by the actual model on which the database system is based, but those provided by a different data model. The implementation chosen is the Functional Query Language Front End (FQLFE). This uses the Daplex functional data model and query language. Using FQLFE, users can specify the underlying database (based on the relational model) in terms of Daplex. Queries against this specified view can then be made in Daplex. FQLFE transforms these queries into the query language (Quel) of the underlying target database system (Ingres). The automation of part of the Daplex function definition phase is also described and its implementation discussed.
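As a rough illustration of the kind of mapping FQLFE performs (not its actual implementation), the toy translator below turns a single-entity Daplex-style retrieval into Quel's range/retrieve form. The function, its interface and the variable-naming rule are invented for this sketch; joins, nested function applications and updates are out of scope.

```python
def daplex_to_quel(entity, attributes, predicate=None):
    """Translate a simple Daplex-style 'for each ... such that ...' query
    over one entity into a Quel range/retrieve statement."""
    var = entity[0]  # use the entity's first letter as the Quel tuple variable
    target = ", ".join(f"{var}.{a}" for a in attributes)
    quel = f"range of {var} is {entity}\nretrieve ({target})"
    if predicate:
        quel += f" where {var}.{predicate}"
    return quel

# Daplex-style source:  for each s in student such that year(s) = 3
#                       print name(s), dept(s)
query = daplex_to_quel("student", ["name", "dept"], "year = 3")
```

The functional notation `name(s)` becomes the tuple-variable notation `s.name` of the relational target.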

Relevance:

10.00%

Publisher:

Abstract:

Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, they cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method.

The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. The method uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and utilising this figure to predict the effort and hence cost of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects.

The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
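The core top-down step, turning a size metric into an effort estimate via historical productivity, can be sketched as follows. The figures are hypothetical, and the counting of JSD specification attributes that produces the size metric is omitted.

```python
def productivity(past_projects):
    """Average productivity (size units per person-month) over past projects,
    each given as a (size_metric, effort_person_months) pair."""
    return sum(size / effort for size, effort in past_projects) / len(past_projects)

def estimate_effort(size_metric, past_projects):
    """Top-down estimate in the JSD-FPA style: size / historical productivity."""
    return size_metric / productivity(past_projects)

# Hypothetical history: (size metric, effort in person-months) per project.
history = [(120, 10.0), (90, 7.5), (150, 12.5)]
effort = estimate_effort(100, history)  # 100 / 12.0, about 8.33 person-months
```

With three past projects all at 12 size units per person-month, a new 100-unit project is estimated at 100/12 person-months.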

Relevance:

10.00%

Publisher:

Abstract:

The locus of origin of the pattern evoked electroretinogram (PERG) has been the subject of considerable discussion. A novel approach was adopted in this study to further elaborate the nature of the PERG evoked by pattern onset/offset presentation. The PERG was found to be linearly related to stimulus contrast and, in particular, linearly related to the temporal contrast of the retinal image when elicited by patterns of low spatial frequency. At high spatial frequencies the retinal image contrast is significantly reduced because of optical degradation, which is described by the eye's modulation transfer function (MTF). The retinal contrast of square-wave grating and chequerboard patterns of increasing spatial frequency was found by filtering their Fourier transforms by the MTF. The filtered pattern harmonics were then resynthesised to constitute a profile of retinal image illuminance from which the temporal and spatial contrast of the image could be calculated. If the PERG is a pure illuminance response it should be spatially insensitive and dependent upon the temporal contrast of stimulation. The calculated loss of temporal contrast for finer patterns was expressed as a space-averaged temporal contrast attenuation factor. This factor, applied to PERGs evoked by low spatial frequency patterns, was used to predict the retinal illuminance response elicited by a finer pattern. The predicted response was subtracted from the recorded signal, and the residual waveform was proposed to represent pattern specific activity. An additional correction for the attenuation of spatial contrast was applied to the extracted pattern specific response. Pattern specific responses computed for different spatial frequency patterns in this way are the predicted result of iso-contrast pattern stimulation. The pattern specific responses demonstrate a striking bandpass spatial selectivity which peaks at higher spatial frequencies in the more central retina.
The variation of spatial sensitivity with eccentricity corresponds closely with estimated ganglion receptive field centre separation and psychophysical data. The variation of retinal structure with eccentricity, in the form of the volumes of the nuclear layers, was compared with the amplitudes of the computed retinal illuminance and pattern specific responses. The retinal illuminance response corresponds more closely to the outer and inner nuclear layers whilst the pattern specific response appears more closely related to the ganglion cell layer. In general the negative response transients correspond to the more proximal retinal layers. This thesis therefore supports the proposed contribution of proximal retinal cell activity to the PERG and describes techniques which may be further elaborated for more detailed studies of retinal receptive field dimensions.
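The filtering-and-resynthesis step described above can be sketched as follows, with an assumed exponential MTF standing in for the eye's measured one: each odd harmonic of a square-wave grating is attenuated by the MTF, the luminance profile is resynthesised, and the surviving Michelson contrast is computed. All parameter values are illustrative.

```python
import math

def mtf(f, f0=20.0):
    """Assumed exponential modulation transfer function of the eye's optics,
    f in cycles/degree; the study would use a measured or published MTF."""
    return math.exp(-f / f0)

def filtered_profile(fund, contrast=1.0, harmonics=99, points=200):
    """Square-wave grating of fundamental 'fund' c/deg: attenuate each odd
    Fourier harmonic by the MTF and resynthesise the retinal profile."""
    profile = []
    for i in range(points):  # one spatial period
        x = i / points
        y = 1.0  # normalised mean luminance
        for n in range(1, harmonics + 1, 2):
            amp = contrast * (4 / (math.pi * n)) * mtf(n * fund)
            y += amp * math.sin(2 * math.pi * n * x)
        profile.append(y)
    return profile

def michelson_contrast(profile):
    hi, lo = max(profile), min(profile)
    return (hi - lo) / (hi + lo)

coarse = michelson_contrast(filtered_profile(2.0))   # low spatial frequency
fine = michelson_contrast(filtered_profile(20.0))    # high spatial frequency
```

As expected, the resynthesised contrast of the fine grating falls well below that of the coarse grating, which is the attenuation the study corrects for.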

Relevance:

10.00%

Publisher:

Abstract:

We address the question of how to communicate among distributed processes values such as real numbers, continuous functions and geometrical solids with arbitrary precision, yet efficiently. We extend the established concept of lazy communication using streams of approximants by introducing explicit queries. We formalise this approach using protocols of a query-answer nature. Such protocols enable processes to provide valid approximations with certain accuracy and focusing on certain locality as demanded by the receiving processes through queries. A lattice-theoretic denotational semantics of channel and process behaviour is developed. The query space is modelled as a continuous lattice in which the top element denotes the query demanding all the information, whereas other elements denote queries demanding partial and/or local information. Answers are interpreted as elements of lattices constructed over suitable domains of approximations to the exact objects. An unanswered query is treated as an error and denoted using the top element. The major novel characteristic of our semantic model is that it reflects the dependency of answers on queries. This enables the definition and analysis of an appropriate concept of convergence rate, by assigning an effort indicator to each query and a measure of information content to each answer. Thus we capture not only what function a process computes, but also how a process transforms the convergence rates from its inputs to its outputs. In future work these indicators can be used to capture further computational complexity measures. A robust prototype implementation of our model is available.
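A minimal sketch of the query-answer idea, not the authors' formal model or prototype: a process holds rational approximants to the square root of 2 and refines them (here by Newton's method) only as far as each incoming accuracy query demands, so a later, finer query reuses the work done for earlier ones.

```python
from fractions import Fraction

def sqrt2_stream():
    """A process answering accuracy queries with rational approximants to
    sqrt(2). Each query supplies an accuracy 'eps'; refinement is lazy and
    demand-driven, stopping as soon as the query is satisfied."""
    x = Fraction(2)
    eps = yield  # wait for the first query
    while True:
        # For x > sqrt(2), the gap x - 2/x bounds the error x - sqrt(2),
        # so it serves as a certified stopping criterion.
        while x - 2 / x > eps:
            x = (x + 2 / x) / 2  # Newton step for x^2 = 2
        eps = yield x  # answer the query, then wait for the next one

proc = sqrt2_stream()
next(proc)                             # start the process
rough = proc.send(Fraction(1, 10))     # query: accuracy 0.1
sharp = proc.send(Fraction(1, 10**6))  # finer query refines the same state
```

Each answer is valid for its query's accuracy, mirroring the protocol in which answers depend explicitly on queries.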

Relevance:

10.00%

Publisher:

Abstract:

With the buzzwords of knowledge-based economy and knowledge-driven economy, policy-makers, as well as journalists and management consultants, are pushing forward a vision of change that transforms the way advanced economies work. Yet little is understood about how the knowledge-based economy differs from the old, traditional economy. It is generally agreed that the phenomenon has grown out of the branch of economic thought known as new growth theory. Digesting up-to-date thinking in economics, management, innovation studies and economic geography, this significant volume provides an account of these developments and how they have transformed advanced economies.

Relevance:

10.00%

Publisher:

Abstract:

Natural dolomitic rock has been investigated in the transesterification of C4 and C8 triglycerides and olive oil with a view to determining its viability as a solid base catalyst for use in biodiesel synthesis. XRD reveals that the dolomitic rock comprised 77% dolomite and 23% magnesian calcite. The generation of basic sites requires calcination at 900 °C, which increases the surface area and transforms the mineral into MgO nanocrystallites dispersed over CaO particles. Calcined dolomitic rock exhibits high activity towards the liquid phase transesterification of glyceryl tributyrate and trioctanoate, and even olive oil, with methanol for biodiesel production. © The Royal Society of Chemistry 2008.

Relevance:

10.00%

Publisher:

Abstract:

Size-controlled MgO nanocrystals were synthesised via a simple sol-gel method and their bulk and surface properties characterised by powder XRD, HRTEM and XPS. Small, cubic MgO single crystals, generated by low temperature processing, expose weakly basic (100) surfaces. High temperature annealing transforms these into large, stepped cuboidal nanoparticles of periclase MgO which terminate in more basic (110) and (111) surfaces. The size dependent evolution of surface electronic structure correlates directly with the associated catalytic activity of these MgO nanocrystals towards glyceryl tributyrate transesterification, revealing a pronounced structural preference for (110) and (111) facets. © 2009 The Royal Society of Chemistry.

Relevance:

10.00%

Publisher:

Abstract:

The annealing properties of Type IA Bragg gratings are investigated and compared with Type I and Type IIA Bragg gratings. The transmission properties (mean and modulated wavelength components) of gratings held at predetermined temperatures are recorded, from which decay characteristics are inferred. Our data show critical results concerning the high temperature stability of Type IA gratings, as they undergo a drastic initial decay at 100°C, with a consequent mean index change that is severely reduced at this temperature. However, the modulated index change of Type IA gratings remains stable at lower annealing temperatures of 80°C, and the mean index change decays at a rate comparable to Type I gratings at 80°C. Extending this work to include the thermal decay of Type IA gratings inscribed under strain shows that the application of strain quite dramatically transforms the temperature characteristics of the Type IA grating, modifying the temperature coefficient and annealing curves; the grating shows a remarkable improvement in high temperature stability, leading to a robust grating that can survive temperatures exceeding 180°C. Under conditions of inscription under strain it is found that the temperature coefficient increases, but is maintained at a value considerably different from that of the Type I grating. Therefore, the combination of Type I and Type IA (strained) gratings makes it possible to decouple temperature and strain over larger temperature excursions.
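The decoupling argument amounts to inverting a 2x2 sensitivity matrix: two gratings with distinct temperature coefficients give two wavelength-shift equations in the two unknowns, temperature change and strain change. The coefficient values below are illustrative, not the measured values from this work.

```python
def decouple(d_lambda1, d_lambda2, K):
    """Recover (delta_T, delta_strain) from the wavelength shifts of two
    gratings with distinct sensitivities, by inverting the 2x2 system:
        [d_lambda1]   [K_T1  K_e1] [dT]
        [d_lambda2] = [K_T2  K_e2] [de]"""
    (kt1, ke1), (kt2, ke2) = K
    det = kt1 * ke2 - ke1 * kt2  # must be nonzero: sensitivities must differ
    dT = (d_lambda1 * ke2 - ke1 * d_lambda2) / det
    de = (kt1 * d_lambda2 - d_lambda1 * kt2) / det
    return dT, de

# Illustrative coefficients (pm/degC and pm/microstrain), chosen only to show
# the two grating types having different temperature coefficients:
K = ((10.0, 1.2),   # Type I grating
     (7.0, 1.0))    # Type IA (strained) grating
dT, de = decouple(130.0, 95.0, K)
```

With these numbers the shifts 130 pm and 95 pm resolve to a 10 degC temperature change and 25 microstrain.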

Relevance:

10.00%

Publisher:

Abstract:

This chapter contributes to the anthology on learning to research - researching to learn because it emphasises the need to design curricula that enable living research and on-going researcher development, rather than curricula that restrict student and staff activities within a marketised approach towards time. In recent decades higher education (HE) has come to be valued for its contribution to the global economy. Referred to as the neo-liberal university, a strong prioritisation has been placed on meeting the needs of industry by providing a better workforce. This perspective emphasises the role of a degree in HE as securing future material affluence, rather than study as an on-going investment in the self (Molesworth, Nixon & Scullion, 2009: 280). Students are treated primarily as consumers in this model: through their tuition fees they purchase a product, rather than benefit from the transformative potential university education offers for the whole of life.

Given that HE is now measured by the numbers of students it attracts, and later places into well-paid jobs, there is an intense pressure on time, which has led to a method where the learning experiences of students are broken down into discrete modules. Whilst this provides consistency, students can come to view research processes in a fragmented way within the modular system. Topics are presented chronologically, week by week, and students simply complete a set of tasks to ‘have a degree’, rather than to ‘be learners’ (Molesworth, Nixon & Scullion, 2009: 277) who are living their research in relation to their own past, present and future. The idea of living research in this context is my own adaptation of an approach suggested by C. Wright Mills (1959) in The Sociological Imagination. Mills advises that successful scholars do not split their work from the rest of their lives, but treat scholarship as a choice of how to live, as well as a choice of career.
The marketised slant in HE thus creates a tension, firstly, for students who are learning to research. Mills would encourage them to be creative, not instrumental, in their use of time, yet they are journeying through a system that is structured for swift progression towards a highly paid job, rather than crafted for the reflexive inquiry that transforms their understanding throughout life. Many universities are placing a strong focus on discrete skills for student employability, but I suggest that embedding the transformative skills emphasised by Mills empowers students and builds their confidence, helping them make connections that aid their employability. Secondly, the marketised approach creates a problem for staff designing the curriculum, if students do not easily make links across time over their years of study and whole programmes. By researching to learn, staff can discover new methods to apply in their design of the curriculum, to help students make important and creative connections across their programmes of study.