26 results for Two Approaches
in Aston University Research Archive
Abstract:
In Spring 2009, the School of Languages and Social Sciences (LSS) at Aston University responded to a JISC (Joint Information Systems Committee) and Higher Education Academy (HEA) call for partners in Open Educational Resources (OER) projects. This led to participation in not one, but two different OER projects from within one small School of the University. This paper will share, from this unusual position, the experience of our English tutors, who participated in the HumBox Project, led by Languages, Linguistics and Area Studies (LLAS), and will compare the approach taken with the Sociology partnership in the C-SAP OER Project, led by the Centre for Sociology, Anthropology and Politics (C-SAP). These two HEA Subject Centre-led projects have taken different approaches to the challenges of encouraging tutors to deposit teaching resources, as an ongoing process, for others to openly access, download and re-purpose. As the projects draw to a close, findings will be discussed, in relation to the JISC OER call, with an emphasis on examining the language and discourses from the two collaborations to see where there are shared issues and outcomes, or different subject-specific concerns to consider.
Abstract:
In the present state of the art of authorship attribution there seems to be an opposition between two approaches: cognitive and stylistic methodologies. It is proposed in this article that these two approaches are complementary and that the apparent gap between them can be bridged using Systemic Functional Linguistics (SFL) and, in particular, some of its theoretical constructs, such as codal variation. This article deals with the theoretical explanation of why such a theory would solve the debate between the two approaches and shows how these two views of authorship attribution are indeed complementary. Although the article is fundamentally theoretical, two example experimental trials are reported to show how this theory can be developed into a workable methodology for authorship attribution. In Trial 1, an SFL analysis was carried out on a small dataset consisting of three 300-word texts collected from three different authors whose socio-demographic backgrounds matched across a number of parameters. This trial led to some conclusions about developing a methodology based on SFL and suggested the development of another trial, which might hint at a more accurate and useful methodology. In Trial 2, Biber's (1988) multidimensional framework is employed, and a final methodology of authorship analysis based on this kind of analysis is proposed for future research. © 2013, EQUINOX PUBLISHING.
Abstract:
We employ two different methods, based on belief propagation and TAP, for decoding corrupted messages encoded by employing Sourlas's method, where the code word comprises products of K bits selected randomly from the original message. We show that the equations obtained by the two approaches are similar and provide the same solution as the one obtained by the replica approach in some cases (K = 2). However, we also show that for K ≥ 3 and unbiased messages the iterative solution is sensitive to the initial conditions and is likely to provide erroneous solutions, and that it is generally beneficial to use Nishimori's temperature, especially in the case of biased messages.
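For reference, Sourlas's construction is usually written as follows (standard notation assumed here, not taken from the paper): a message of N bits is represented as Ising spins \(\xi_i \in \{-1,+1\}\), and each code-word component is a product of K randomly selected message spins,
\[
J_{\langle i_1 \dots i_K \rangle} = \xi_{i_1}\,\xi_{i_2}\cdots\,\xi_{i_K}.
\]
Decoding then amounts to estimating the marginals of the posterior over candidate messages \(\boldsymbol{\sigma}\),
\[
P(\boldsymbol{\sigma} \mid \mathcal{J}) \;\propto\; \exp\!\Big(\beta \sum_{\langle i_1 \dots i_K \rangle} \mathcal{J}_{\langle i_1 \dots i_K \rangle}\, \sigma_{i_1}\cdots\sigma_{i_K}\Big),
\]
where \(\mathcal{J}\) denotes the received (possibly corrupted) couplings; Nishimori's temperature corresponds to choosing \(\beta\) to match the true channel noise level.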
Abstract:
This thesis is a study of three techniques to improve the performance of some standard forecasting models, with application to energy demand and prices. We focus on forecasting demand and price one day ahead. First, the wavelet transform was used as a pre-processing procedure with two approaches: multicomponent forecasts and direct forecasts. We have empirically compared these approaches and found that the former consistently outperformed the latter. Second, adaptive models were introduced to continuously update model parameters in the testing period by combining filters with standard forecasting methods. Among these adaptive models, the adaptive LR-GARCH model was proposed for the first time in this thesis. Third, with regard to the noise distributions of the dependent variables in the forecasting models, we used either Gaussian or Student-t distributions. This thesis proposed a novel algorithm to infer the parameters of Student-t noise models. The method is an extension of earlier work for models that are linear in parameters to the non-linear multilayer perceptron. Therefore, the proposed method broadens the range of models that can use a Student-t noise distribution. Because these techniques cannot stand alone, they must be combined with prediction models to improve their performance. We combined these techniques with some standard forecasting models: multilayer perceptron, radial basis functions, linear regression, and linear regression with GARCH. These techniques and forecasting models were applied to two datasets from the UK energy markets: daily electricity demand (which is stationary) and gas forward prices (non-stationary). The results showed that these techniques provided a good improvement to prediction performance.
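As a rough sketch of the two wavelet-based schemes compared above (the wavelet family, lag order and forecasting model here are assumptions for illustration, not the thesis's actual configuration), the multicomponent approach forecasts each wavelet component separately and sums the component forecasts, while the direct approach fits a single model to the series itself:

import numpy as np
import pywt
from sklearn.linear_model import LinearRegression

def one_step_ar_forecast(series, n_lags=7):
    # Fit a simple autoregression y[t] ~ y[t-1..t-n_lags] and predict the next value.
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    model = LinearRegression().fit(X, y)
    return float(model.predict(series[-n_lags:].reshape(1, -1))[0])

def wavelet_components(series, wavelet="db4", level=3):
    # Additive multiresolution decomposition: reconstruct each coefficient band on its
    # own, so the components sum (up to boundary effects) to the original series.
    coeffs = pywt.wavedec(series, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        isolated = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(isolated, wavelet)[:len(series)])
    return components

def direct_forecast(series):
    # Direct approach: one model fitted to the series itself.
    return one_step_ar_forecast(series)

def multicomponent_forecast(series):
    # Multicomponent approach: forecast each component, then recombine by summing.
    return sum(one_step_ar_forecast(c) for c in wavelet_components(series))

demand = np.sin(np.linspace(0, 20, 400)) + 0.1 * np.random.randn(400)  # toy daily series
print(direct_forecast(demand), multicomponent_forecast(demand))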
Abstract:
Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research work concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigations using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems. Fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels performing various control functions, which include the setting-up and clearing-down of calls, the hand-over of calls between cells and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers interconnected via voice-grade data channels would be capable of providing satisfactory control.
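For context only (the thesis derives modified forms of these results), the classical Erlang B formula gives the blocking probability for m channels offered E erlangs of traffic:
\[
B(E, m) = \frac{E^{m}/m!}{\sum_{k=0}^{m} E^{k}/k!}
\]
The Engset formula plays the analogous role when the number of traffic sources is finite.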
Abstract:
To reduce global biodiversity loss, there is an urgent need to determine the most efficient allocation of conservation resources. Recently, there has been a growing trend for many governments to supplement public ownership and management of reserves with incentive programs for conservation on private land. This raises important questions, such as the extent to which private land conservation can improve conservation outcomes, and how it should be mixed with more traditional public land conservation. We address these questions using a general framework for modelling environmental policies and a case study examining the conservation of endangered native grasslands to the west of Melbourne, Australia. Specifically, we examine three policies that involve i) spending all resources on creating public conservation areas; ii) spending all resources on an ongoing incentive program in which private landholders are paid to manage vegetation on their property under 5-year contracts; and iii) splitting resources between these two approaches. The performance of each strategy is quantified with a vegetation condition change model that predicts future changes in grassland quality. Of the policies tested, no single policy was always best; performance depended on the objectives of those enacting the policy. Although policies to promote conservation on private land are proposed and implemented in many areas, they are rarely evaluated in terms of their ecological consequences. This work demonstrates a general method for evaluating environmental policies and highlights the utility of a model that combines ecological and socioeconomic processes.
Abstract:
This work studies the development of polymer membranes for the separation of hydrogen and carbon monoxide from a syngas produced by the partial oxidation of natural gas. The CO product is then used for the large-scale manufacture of acetic acid by reaction with methanol. A method of economic evaluation has been developed for the process as a whole and a comparison is made between separation of the H2/CO mixture by a membrane system and the conventional method of cryogenic distillation. Costs are based on bids obtained from suppliers for several different specifications for the purity of the CO fed to the acetic acid reactor. When the purity of the CO is set at that obtained by cryogenic distillation, it is shown that the membrane separator offers only a marginal cost advantage. Cost parameters for the membrane separation systems have been defined in terms of effective selectivity and cost permeability. These new parameters, obtained from an analysis of the bids, are then used in a procedure which defines the optimum degree of separation and recovery of carbon monoxide for a minimum cost of manufacture of acetic acid. It is shown that a significant cost reduction is achieved with a membrane separator at the optimum process conditions. A method of "targeting" the properties of new membranes has been developed. This involves defining the properties for new (hypothetical, yet-to-be-developed) membranes such that their use for the hydrogen/carbon monoxide separation will produce a reduced cost of acetic acid manufacture. The use of the targeting method is illustrated in the development of new membranes for the separation of hydrogen and carbon monoxide. The selection of polymeric materials for new membranes is based on molecular design methods which predict the polymer properties from the molecular groups making up the polymer molecule. Two approaches have been used. One method develops the analogy between gas solubility in liquids and that in polymers; the UNIFAC group contribution method is then used to predict gas solubility in liquids. In the second method, the polymer Permachor number, developed by Salame, has been correlated with hydrogen and carbon monoxide permeabilities. These correlations are used to predict the permeabilities of gases through polymers. Materials have been tested for hydrogen and carbon monoxide permeabilities and improvements in expected economic performance have been achieved.
Abstract:
This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine the wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches to combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that the forecasting accuracy is significantly improved by using the WT and adaptive models. The best models for the electricity demand and gas price forecasts are the adaptive MLP and adaptive GARCH, respectively, each with the multicomponent forecast; their MSEs are 0.02314 and 0.15384, respectively.
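The adaptive models above update parameters observation by observation on the test set. As a loose illustration of that idea (recursive least squares standing in for the extended Kalman and particle filters of the paper; all names and settings here are assumptions):

import numpy as np

class RecursiveLeastSquares:
    # Online update of linear-model weights; a Kalman-filter special case with a
    # static state, used here only to illustrate refreshing parameters as each
    # new test observation arrives.
    def __init__(self, n_features, forgetting=0.99):
        self.w = np.zeros(n_features)
        self.P = np.eye(n_features) * 1e3   # large initial uncertainty
        self.lam = forgetting               # forgetting factor (< 1 tracks drift)

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y):
        # Standard RLS recursion: gain, weight correction, covariance update.
        Px = self.P @ x
        gain = Px / (self.lam + x @ Px)
        error = y - self.w @ x
        self.w = self.w + gain * error
        self.P = (self.P - np.outer(gain, Px)) / self.lam

# Toy walk-forward use: predict one step ahead, then update with the outcome.
rng = np.random.default_rng(0)
rls = RecursiveLeastSquares(n_features=3)
for _ in range(200):
    x = rng.normal(size=3)
    y = 0.5 * x[0] - 0.2 * x[1] + 0.1 * rng.normal()
    y_hat = rls.predict(x)
    rls.update(x, y)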
Abstract:
We address the question of how to obtain effective fusion of identification information such that it is robust to the quality of this information. As well as technical issues, data fusion is encumbered with a collection of (potentially confusing) practical considerations. These considerations are described in the early chapters, in which a framework for data fusion is developed. Following this process of diversification it becomes clear that the original question is not well posed and requires more precise specification. We use the framework to focus on some of the technical issues relevant to the question being addressed. We show that fusion of hard decisions through use of an adaptive version of the maximum a posteriori decision rule yields acceptable performance. Better performance is possible using probability-level fusion as long as the probabilities are accurate. Of particular interest is the prevalence of overconfidence and the effect it has on fused performance. The production of accurate probabilities from poor quality data forms the latter part of the thesis. Two approaches are taken. Firstly, the probabilities may be moderated at source (either analytically or numerically). Secondly, the probabilities may be transformed at the fusion centre. In each case an improvement in fused performance is demonstrated. We therefore conclude that in order to obtain robust fusion, care should be taken to model the probabilities accurately, either at the source or centrally.
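As background for the decision-level fusion mentioned above (a textbook formulation under a conditional-independence assumption, not necessarily the thesis's exact rule), fusing local decisions \(d_1, \dots, d_S\) from S sources about a class c with a maximum a posteriori rule gives
\[
\hat{c} \;=\; \arg\max_{c}\; P(c) \prod_{s=1}^{S} P(d_s \mid c),
\]
where \(P(d_s \mid c)\) is source s's error (confusion) model; an adaptive variant would re-estimate these per-source terms from data as it arrives.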
Abstract:
The introduction of agent technology raises several security issues that are beyond the capability and considerations of conventional security mechanisms, but research in protecting the agent from malicious host attack is evolving. This research proposes two approaches to protecting an agent from being attacked by a malicious host. The first approach consists of an obfuscation algorithm that is able to protect the confidentiality of an agent and make it more difficult for a malicious host to spy on the agent. The algorithm uses multiple polynomial functions with multiple random inputs to convert an agent's critical data to a value that is meaningless to the malicious host. The effectiveness of the obfuscation algorithm is enhanced by the addition of noise code. The second approach consists of a mechanism that is able to protect the integrity of the agent using state information, recorded during the agent execution process in a remote host environment, to detect a manipulation attack by a malicious host. Both approaches are implemented using a master-slave agent architecture that operates on a distributed migration pattern. Two sets of experimental tests were conducted. The first set measures the migration and migration+computation overheads of the itinerary and distributed migration patterns. The second set measures the security overhead of the proposed approaches. The protection of the agent is assessed by analysis of its effectiveness under known attacks. Finally, an agent-based application, known as the Secure Flight Finder Agent-based System (SecureFAS), was developed in order to demonstrate the function of the proposed approaches.
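The abstract only outlines the obfuscation idea, so the sketch below is purely illustrative of masking a critical value with randomly parameterised polynomials (the scheme, names and parameters are assumptions, not the thesis's algorithm):

import secrets

MODULUS = 2**61 - 1  # a large prime; arbitrary choice for this sketch

def _poly_eval(coeffs, x, p=MODULUS):
    # Evaluate a polynomial with the given coefficients at x, modulo p.
    return sum(c * pow(x, k, p) for k, c in enumerate(coeffs)) % p

def obfuscate(value, n_polys=3, degree=4):
    # Hide an integer by adding the outputs of several randomly generated
    # polynomials evaluated at random inputs. Only the holder of `params`
    # (the agent's owner) can undo the masking.
    params = []
    masked = value % MODULUS
    for _ in range(n_polys):
        coeffs = [secrets.randbelow(MODULUS) for _ in range(degree + 1)]
        r = secrets.randbelow(MODULUS)
        masked = (masked + _poly_eval(coeffs, r)) % MODULUS
        params.append((coeffs, r))
    return masked, params

def deobfuscate(masked, params):
    # Reverse the masking by subtracting each polynomial contribution.
    value = masked
    for coeffs, r in reversed(params):
        value = (value - _poly_eval(coeffs, r)) % MODULUS
    return value

secret_data = 123456789
masked, params = obfuscate(secret_data)
assert deobfuscate(masked, params) == secret_data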
Abstract:
Jackson System Development (JSD) is an operational software development method which addresses most of the software lifecycle either directly or by providing a framework into which more specialised techniques can fit. The method has two major phases: first, an abstract specification is derived that is in principle executable; second, the specification is implemented using a variety of transformations. The object-oriented paradigm is based on data abstraction and encapsulation coupled to an inheritance architecture that is able to support software reuse. Its claims of improved programmer productivity and easier program maintenance make it an important technology to be considered for building complex software systems. The mapping of JSD specifications into procedural languages typified by Cobol, Ada, etc., involves techniques such as inversion and state vector separation to produce executable systems of acceptable performance. However, at present, no strategy exists to map JSD specifications into object-oriented languages. The aim of this research is to investigate the relationship between JSD and the object-oriented paradigm, and to identify and implement transformations capable of mapping JSD specifications into an object-oriented language typified by Smalltalk-80. The transformational strategy follows a direction whereby the concurrency of a specification is removed. Two approaches implementing inversion, an architectural transformation that generates a simulated coroutine mechanism, are described in detail. The first approach directly realises inversions by manipulating Smalltalk-80 system contexts. This is possible in Smalltalk-80 because contexts are first-class objects and are accessible to the user like any other system object. However, problems associated with this approach are expounded. The second approach realises coroutine-like behaviour in a structure called a "followmap". A followmap is the result of a transformation on a JSD process in which a collection of followsets is generated. Each followset represents all possible state transitions a process can undergo from the current state of the process. Followsets, together with exploitation of the class/instance mechanism for implementing state vector separation, form the basis for mapping JSD specifications into Smalltalk-80. A tool, which is also built in Smalltalk-80, supports these derived transformations and enables a user to generate Smalltalk-80 prototypes of JSD specifications.
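Inversion, as described above, turns a process with its own thread of control into a resumable routine. A loose, language-neutral illustration of that coroutine-like behaviour (Python generators standing in for inverted JSD processes; Smalltalk-80 contexts are not shown, and all names are invented):

def stock_process():
    # A JSD-style long-running process written as if it owned its own control flow;
    # after "inversion" it suspends at each yield and is resumed with the next input.
    total = 0
    while True:
        delivery = yield total      # suspend until the scheduler supplies an input
        total += delivery

process = stock_process()
next(process)                       # prime the coroutine up to its first yield
for delivery in (5, 3, 7):
    level = process.send(delivery)  # resume the suspended process with one record
    print("stock level:", level)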
Abstract:
Many of the applications of geometric modelling are concerned with the computation of well-defined properties of the model. The applications which have received less attention are those which address questions to which there is no unique answer. This thesis describes such an application: the automatic production of a dimensioned engineering drawing. One distinctive feature of this operation is the requirement for sophisticated decision-making algorithms at each stage in the processing of the geometric model. Hence, the thesis is focussed upon the design, development and implementation of such algorithms. Various techniques for geometric modelling are briefly examined and then details are given of the modelling package that was developed for this project. The principles of orthographic projection and dimensioning are treated and some published work on the theory of dimensioning is examined. A new theoretical approach to dimensioning is presented and discussed. The existing body of knowledge on decision-making is sampled and the author then shows how methods which were originally developed for management decisions may be adapted to serve the purposes of this project. The remainder of the thesis is devoted to reports on the development of decision-making algorithms for orthographic view selection, sectioning and crosshatching, the preparation of orthographic views with essential hidden detail, and two approaches to the actual insertion of dimension lines and text. The thesis concludes that the theories of decision-making can be applied to work of this kind. It may be possible to generate computer solutions that are closer to the optimum than some man-made dimensioning schemes. Further work on important details is required before a commercially acceptable package could be produced.
Abstract:
Cold roll forming is an extremely important but little studied sheet metal forming process. In this thesis, the process of cold roll forming is introduced and it is seen that form roll design is central to the cold roll forming process. The conventional design and manufacture of form rolls is discussed and it is observed that surrounding the design process are a number of activities which, although peripheral, are time consuming and a possible source of error. A CAD/CAM system is described which alleviates many of the problems traditional to form roll design. New techniques for the calculation of strip length and for controlling the means of forming bends are detailed. The CAD/CAM system's advantages and limitations are discussed and, whilst the system has numerous significant advantages, its principal limitation is the need to manufacture form rolls and test them on a mill before a design can be declared satisfactory. A survey of the previous theoretical and experimental analysis of cold roll forming is presented and is found to be limited. By considering the previous work, a method of numerical analysis of the cold roll forming process is proposed, based on a minimum energy approach. In parallel with the numerical analysis, a comprehensive range of software has been developed to enhance the designer's visualisation of the effects of his form roll design. A complementary approach to the analysis of form roll designs is their generation; a method for the partial generation of designs is described. It is suggested that the two approaches should continue in parallel and that the limitation of each approach is knowledge of the cold roll forming process. Hence, an initial experimental investigation of the rolling of channel sections is described. Finally, areas of potential future work are discussed.
Abstract:
For analysing financial time series, two main opposing viewpoints exist: either capital markets are completely stochastic, and therefore prices follow a random walk, or they are deterministic and consequently predictable. For each of these views a great variety of tools exists with which one can attempt to confirm the corresponding hypothesis. Unfortunately, these methods are not well suited to dealing with data characterised in part by both paradigms. This thesis investigates these two approaches in order to model the behaviour of financial time series. In the deterministic framework, methods are used to characterise the dimensionality of embedded financial data. The stochastic approach includes an estimation of the unconditional and conditional return distributions using parametric, non-parametric and semi-parametric density estimation techniques. Finally, it will be shown how elements from these two approaches could be combined to achieve a more realistic model for financial time series.
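As a small illustration of the density-estimation side described above (the estimators actually used in the thesis are not specified here; a Gaussian fit and scipy's Gaussian kernel density estimator on a toy heavy-tailed series serve only as an example of parametric versus non-parametric estimation):

import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(42)
prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_t(df=4, size=1000)))  # toy heavy-tailed series
returns = np.diff(np.log(prices))

# Parametric estimate: fit a Gaussian to the log-returns.
mu, sigma = returns.mean(), returns.std(ddof=1)

# Non-parametric estimate: Gaussian kernel density of the same returns.
kde = gaussian_kde(returns)

grid = np.linspace(returns.min(), returns.max(), 200)
parametric_density = norm.pdf(grid, mu, sigma)
nonparametric_density = kde(grid)   # typically heavier-tailed than the Gaussian fit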
Abstract:
Decomposition of domestic wastes in an anaerobic environment results in the production of landfill gas. Public concern about landfill disposal, and particularly the production of landfill gas, has been heightened over the past decade. This has been due in large part to the increased quantities of gas being generated as a result of modern disposal techniques, and also to their increasing effect on modern urban developments. In order to avert disasters, effective means of preventing gas migration are required. This, in turn, requires accurate detection and monitoring of gas in the subsurface. Point sampling techniques have many drawbacks, and accurate measurement of gas is difficult. Some of the disadvantages of these techniques could be overcome by assessing the impact of gas on biological systems. This research explores the effects of landfill gas on plants, and hence on the spectral response of vegetation canopies. Examination of the landfill gas/vegetation relationship is covered, both by review of the literature and by statistical analysis of field data. The work showed that, although vegetation health was related to landfill gas, it was not possible to define a simple correlation. In the landfill environment, contributions from other variables, such as soil characteristics, frequently confused the relationship. Two sites are investigated in detail, the sites contrasting in terms of the data available, site conditions, and the degree of damage to vegetation. Gas migration at the Panshanger site was dominantly upwards, affecting crops being grown on the landfill cap. The injury was expressed as an overall decline in plant health. Discriminant analysis was used to account for the variations in plant health, and hence the differences in spectral response of the crop canopy, using a combination of soil and gas variables. Damage to both woodland and crops at the Ware site was severe, and could be easily related to the presence of gas. Air photographs, aerial video, and airborne thematic mapper data were used to identify damage to vegetation, and to relate this to soil type. The utility of different sensors for this type of application is assessed, and possible improvements that could lead to more widespread use are identified. The situations in which remote sensing data could be combined with ground survey are identified. In addition, a possible methodology for integrating the two approaches is suggested.
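The abstract mentions discriminant analysis over combined soil and gas variables; a minimal sketch of that kind of analysis (the variable names and data below are entirely invented, not the study's measurements):

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n = 120
# Invented predictors: soil moisture, soil pH, CH4 and CO2 concentrations in soil gas.
X = np.column_stack([
    rng.normal(25, 5, n),     # moisture (%)
    rng.normal(6.8, 0.4, n),  # pH
    rng.gamma(2.0, 3.0, n),   # CH4 (% v/v)
    rng.gamma(2.0, 4.0, n),   # CO2 (% v/v)
])
# Invented response: vegetation health class (0 healthy, 1 stressed, 2 dead),
# loosely driven by the gas concentrations for the sake of the example.
health = np.digitize(0.6 * X[:, 2] + 0.4 * X[:, 3] + rng.normal(0, 2, n), [6, 12])

lda = LinearDiscriminantAnalysis().fit(X, health)
print(lda.score(X, health))   # resubstitution accuracy
print(lda.coef_)              # weights on each soil/gas variable per class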