5 results for Exponential smoothing methods

in Aston University Research Archive


Relevance:

80.00%

Publisher:

Abstract:

Technology changes rapidly over the years, continuously providing more computing options and making economic and other transactions easier. However, the introduction of new technology "pushes" old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use and is a bivariate function of the quantities sold and the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions so as to compute obsolete computer quantities. In order to provide robust results for the forecasted quantities, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, was applied, and for each country the model giving the best results in terms of minimum error indices (Mean Absolute Error and Mean Square Error) for the in-sample estimation was selected. As new technology does not diffuse through all regions of the world at the same speed, owing to differing socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers is considered obsolete, is not adequately modeled in the literature. The time horizon for the forecasted quantities is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, owing to decreasing computer lifespans and increasing sales.
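As an illustration of one of the models listed above, here is a minimal sketch of simple exponential smoothing together with an in-sample Mean Absolute Error, one of the error indices the abstract mentions. The function names and sales figures below are hypothetical, not taken from the paper:

```python
def simple_exponential_smoothing(series, alpha):
    """Simple exponential smoothing: each smoothed value is a weighted
    average of the current observation and the previous smoothed value."""
    smoothed = [series[0]]  # initialise with the first observation
    for y in series[1:]:
        smoothed.append(alpha * y + (1 - alpha) * smoothed[-1])
    return smoothed

def mean_absolute_error(actual, fitted):
    """In-sample MAE, one of the error indices used for model selection."""
    return sum(abs(a - f) for a, f in zip(actual, fitted)) / len(actual)

# Hypothetical annual computer sales (millions of units), illustration only.
sales = [10.0, 12.5, 13.1, 15.8, 17.2, 19.9]
fit = simple_exponential_smoothing(sales, alpha=0.5)
# One-step-ahead in-sample errors: compare each observation with the
# previous period's smoothed value.
mae = mean_absolute_error(sales[1:], fit[:-1])
```

In a model-selection exercise of the kind the abstract describes, such an MAE would be computed for each candidate model and country, and the model with the smallest error indices retained.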

Relevance:

30.00%

Publisher:

Abstract:

Non-linear relationships are common in microbiological research and often necessitate the use of the statistical techniques of non-linear regression or curve fitting. In some circumstances, the investigator may wish to fit an exponential model to the data, i.e., to test the hypothesis that a quantity Y either increases or decays exponentially with increasing X. This type of model is straightforward to fit, as taking logarithms of the Y variable linearises the relationship, which can then be treated by the methods of linear regression.
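The log-transform approach described above can be sketched as follows. The helper name `fit_exponential` and the synthetic decay data are illustrative, not from the source:

```python
import math

def fit_exponential(x, y):
    """Fit Y = a * exp(b * X) by taking logs, ln(Y) = ln(a) + b * X,
    then applying ordinary least-squares linear regression to (X, ln Y)."""
    ly = [math.log(v) for v in y]
    n = len(x)
    mx = sum(x) / n
    my = sum(ly) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, ly))
         / sum((xi - mx) ** 2 for xi in x))
    a = math.exp(my - b * mx)
    return a, b  # Y ≈ a * exp(b * X)

# Synthetic example: exact decay Y = 5 * exp(-0.3 * X), illustration only.
xs = [0, 1, 2, 3, 4, 5]
ys = [5 * math.exp(-0.3 * xi) for xi in xs]
a, b = fit_exponential(xs, ys)  # recovers a ≈ 5, b ≈ -0.3
```

Note that regression on ln(Y) minimises errors on the log scale, so this shortcut weights observations differently from a direct non-linear fit on the original scale.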

Relevance:

30.00%

Publisher:

Abstract:

1. The techniques associated with regression, whether linear or non-linear, are some of the most useful statistical procedures that can be applied in clinical studies in optometry. 2. In some cases, there may be no scientific model of the relationship between X and Y that can be specified in advance and the objective may be to provide a ‘curve of best fit’ for predictive purposes. In such cases, the fitting of a general polynomial type curve may be the best approach. 3. An investigator may have a specific model in mind that relates Y to X and the data may provide a test of this hypothesis. Some of these curves can be reduced to a linear regression by transformation, e.g., the exponential and negative exponential decay curves. 4. In some circumstances, e.g., the asymptotic curve or logistic growth law, a more complex process of curve fitting involving non-linear estimation will be required.
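Point 2 above, fitting a general polynomial "curve of best fit" for predictive purposes, can be sketched as below. The (X, Y) values are hypothetical and not from any clinical study:

```python
import numpy as np

# Least-squares fit of a quadratic polynomial to hypothetical (X, Y) data
# when no scientific model of the relationship is specified in advance.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 8.2, 13.8, 22.3, 31.9])

coeffs = np.polyfit(x, y, 2)   # coefficients, highest power first
predict = np.poly1d(coeffs)    # callable polynomial for prediction

# Residuals indicate how well the chosen degree describes the data;
# in practice the lowest adequate degree would be preferred.
residuals = y - predict(x)
```

For the curves in point 4, such as the logistic growth law, no transformation reduces the model to a linear form, and iterative non-linear estimation would be used instead.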

Relevance:

30.00%

Publisher:

Abstract:

The available literature concerning dextransucrase and dextran production and purification has been reviewed, along with the reaction mechanisms of the enzyme. A discussion of basic fermentation theory is included, together with a brief description of bioreactor hydrodynamics and general biotechnology. The various fermenters used in this research work are described in detail, along with the experimental techniques employed. The micro-organism Leuconostoc mesenteroides NRRL B512 (F) secretes dextransucrase in the presence of sucrose, the only known inducer of the enzyme. Dextransucrase is a growth-related product, and a series of fed-batch fermentations was carried out to extend the exponential growth phase of the organism. These experiments were performed in vessels ranging from 2.5 to 1,000 litres. Using a 16 litre vessel, dextransucrase activities in excess of 450 DSU/cm³ (21.67 U/cm³) were obtained under non-aerated conditions. It was also possible to achieve 442 DSU/cm³ (21.28 U/cm³) in the 1,000 litre vessel, although not consistently. A 1 litre and a 2.5 litre vessel were used for the continuous fermentations of dextransucrase. The 2.5 litre vessel was a sophisticated MBR MiniBioreactor and was used for the majority of the continuous fermentations. An enzyme activity of approximately 108 DSU/cm³ (5.20 U/cm³) was achieved at a dilution rate of 0.50 h⁻¹, which corresponds to the maximum growth rate of the cells under the process conditions. A number of continuous fermentations were operated for prolonged periods, with run-times of up to 389 h recorded without any incidence of contamination.
An enhancement of enzyme activity of up to 100% on hold-up was also noted during these fermentations, with dextransucrase activity of 89.7 DSU/cm³ (4.32 U/cm³) being boosted to 155.7 DSU/cm³ (7.50 U/cm³) after 24 hours of hold-up. These findings support the recommendation that a second reactor be placed in series with the existing vessel.

Relevance:

30.00%

Publisher:

Abstract:

Removing noise from signals that are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory, building on results from the image processing community, to show that the majority of existing PWC denoising algorithms, and many more proposed in the wider literature, are each associated with a special case of a generalized functional that, when minimized, solves the PWC denoising problem; part I also showed how the minimizer can be obtained by a range of computational solver algorithms. In this second paper (part II), using the understanding developed in part I, we introduce several novel PWC denoising methods, which, for example, combine the global behaviour of mean shift clustering with the local smoothing of total variation diffusion, and we show example solver algorithms for these new methods. Comparisons between these methods are performed on synthetic and real signals, revealing that our new methods have a useful role to play. Finally, overlaps between the generalized methods of these two papers and others, such as wavelet shrinkage, hidden Markov models, and piecewise smooth filtering, are touched on.
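As a minimal illustration of edge-preserving PWC denoising, here is a running median filter, a classic member of the family of methods discussed above, not one of the novel methods the paper introduces. The signal and window size are hypothetical:

```python
import statistics

def running_median(signal, half_width=3):
    """Running median filter: an edge-preserving smoother for
    piecewise-constant signals. Impulsive outliers are rejected because
    the median of each local window ignores extreme values."""
    n = len(signal)
    out = []
    for i in range(n):
        lo = max(0, i - half_width)            # window shrinks at the ends
        hi = min(n, i + half_width + 1)
        out.append(statistics.median(signal[lo:hi]))
    return out

# Synthetic two-level PWC signal with a few impulsive noise samples,
# for illustration only.
clean = [0.0] * 10 + [1.0] * 10
noisy = clean[:]
noisy[3] = 0.9
noisy[15] = 0.1
denoised = running_median(noisy, half_width=2)
```

Unlike a linear moving average, the median does not blur the step between the two levels, which is the defining requirement for PWC denoising.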