24 results for Forecasting and replenishment (CPFR)

in Aston University Research Archive


Relevance: 100.00%

Abstract:

The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
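
The abstract names the DHSS cumulative cubic cost model as the baseline the thesis improves on. As a rough illustration of that baseline only, the hedged sketch below fits a generic cubic cumulative-cost S-curve to expenditure observed part-way through a project and reads off the implied cost at completion; the exact DHSS parameterisation, the thesis's new growth-curve techniques and all figures are assumptions, not taken from the work.

```python
# A minimal sketch, assuming a generic cubic cumulative-cost curve
# C(t) = a*t + b*t^2 + c*t^3 (t = fraction of planned duration elapsed),
# fitted by least squares to cumulative expenditure observed so far and
# extrapolated to t = 1 to forecast the final cost. Data are hypothetical.
import numpy as np

def fit_cubic_cost_curve(t, cumulative_cost):
    """Least-squares fit of C(t) = a*t + b*t**2 + c*t**3 (C(0) = 0 by construction)."""
    X = np.column_stack([t, t**2, t**3])
    coeffs, *_ = np.linalg.lstsq(X, cumulative_cost, rcond=None)
    return coeffs

# Hypothetical monthly cumulative expenditure (in £k) over the first 40% of the programme.
t_obs = np.linspace(0.05, 0.40, 8)
cost_obs = np.array([1.0, 2.6, 5.1, 8.4, 12.9, 18.2, 24.3, 31.0])

a, b, c = fit_cubic_cost_curve(t_obs, cost_obs)
forecast_at_completion = a + b + c          # fitted curve evaluated at t = 1
print("forecast cost at completion: %.1f (in £k)" % forecast_at_completion)
```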

Relevance: 100.00%

Abstract:

Routine cell line maintenance involves removal of waste products and replenishment of nutrients via replacement of cell culture media. Here, we report that routine maintenance of three discrete cell lines (HSB-CCRF-2 and Jurkat T cells, and phaeochromocytoma PC12 cells) decreases the principal cellular antioxidant, glutathione, by up to 42% in HSB-CCRF-2 cells between 60 and 120 min after media replenishment. However, cellular glutathione levels returned to baseline within 5 h after passage. The decrease in glutathione was associated with modulation of the response of Jurkat T cells to apoptotic and mitogenic signals. Methotrexate (MTX)-induced apoptosis over 16 h, measured as accumulation of apoptotic nucleoids, was decreased from 22 to 17% if cells were exposed to the cytotoxic agent 30 min after passage, compared with cells exposed to MTX in the absence of passage. In contrast, interleukin-2 (IL-2) production over 24 h in response to the toxin phytohaemagglutinin (PHA) was increased by 34% if cells were challenged 2 h after passage compared with PHA treatment in the absence of passage. This research highlights a window of time after passage of non-adherent cells that may lead to over- or under-estimation of subsequent cell responses to toxins, which is dependent on cellular antioxidant capacity or redox state. © 2007 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

The thesis deals with the background, development and description of a mathematical stock control methodology for use within an oil and chemical blending company, where demand and replenishment lead-times are generally non-stationary. The stock control model proper relies, as input, on adaptive forecasts of demand determined for an economical forecast/replenishment period precalculated on an individual stock-item basis. The control procedure is principally that of the continuous review, reorder level type, where the reorder level and reorder quantity 'float', that is, each changes in accordance with changes in demand. Two versions of the Methodology are presented: a cost minimisation version and a service level version. Realising the importance of demand forecasts, four recognised variations of the Trigg and Leach adaptive forecasting routine are examined. A fifth variation, developed in this work, is proposed as part of the stock control methodology. The results of testing the cost minimisation version of the Methodology with historical data, by means of a computerised simulation, are presented together with a description of the simulation used. The performance of the Methodology is, in addition, compared favourably with a rule-of-thumb approach considered by the Company as an interim solution for reducing stock levels. The contribution of the work to the field of scientific stock control is felt to be significant for the following reasons: (1) The Methodology is designed specifically for use with non-stationary demand and for this reason alone appears to be unique. (2) The Methodology is unique in its approach and the cost-minimisation version is shown to work successfully with the demand data presented. (3) The Methodology and the thesis as a whole fill an important gap between complex mathematical stock control theory and practical application. A brief description of a computerised order processing/stock monitoring system, designed and implemented as a prerequisite for the Methodology's practical operation, is presented as an appendix.
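
Since the abstract names the Trigg and Leach adaptive forecasting routine and a floating reorder level, the hedged sketch below illustrates the classical versions of both: the Trigg-Leach tracking signal sets the smoothing constant, and the reorder level is recomputed from the current demand forecast plus a safety allowance. The thesis's fifth variation, its cost-minimisation logic and all parameter values are assumptions for illustration, not reproduced from the work.

```python
# A minimal sketch of adaptive forecasting feeding a floating reorder level,
# using the classic Trigg and Leach (1967) rule: smoothing constant =
# |smoothed error / smoothed absolute error|. All parameter values are illustrative.
import numpy as np

def trigg_leach_forecast(demand, gamma=0.2):
    """One-step-ahead adaptive exponential smoothing of a demand series."""
    f = demand[0]                                 # current forecast
    e_s, mad = 0.0, 1e-9                          # smoothed error, smoothed |error|
    forecasts, mads = [], []
    for d in demand:
        forecasts.append(f)
        err = d - f
        e_s = gamma * err + (1 - gamma) * e_s
        mad = gamma * abs(err) + (1 - gamma) * mad
        alpha = min(abs(e_s / mad), 1.0)          # Trigg-Leach tracking signal
        f = f + alpha * err
        mads.append(mad)
    return np.array(forecasts), np.array(mads)

def reorder_level(forecast_per_period, mad, lead_time, k=2.0):
    """Floating reorder level: expected lead-time demand plus safety stock.
    The safety factor k would come from the chosen service level."""
    return lead_time * forecast_per_period + k * mad * np.sqrt(lead_time)

demand = np.array([120, 135, 128, 160, 170, 155, 190, 210, 205, 230], float)
f, m = trigg_leach_forecast(demand)
print("current reorder level:", round(reorder_level(f[-1], m[-1], lead_time=3), 1))
```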

Relevance: 100.00%

Abstract:

This paper analyzes the impact of Research and Development (R&D) on the productivity of China's high technology industry. In order to capture important differences in the effect of R&D on output that arise from geographic and socioeconomic differences across three major regions in China, we use a novel semiparametric approach that allows us to model heterogeneities across provinces and time. Using a unique provincial-level panel dataset spanning the period 2000–2007, we find that the impact of R&D on output varies substantially in magnitude and significance across regions. Results show that the eastern region benefits the most from R&D investments but the least from technical progress, while the western region benefits the least from R&D investments but enjoys the highest benefits from technical progress. The central region benefits from R&D investments more than the western region and from technical progress more than the eastern region. Our results suggest that R&D investments would significantly increase output in both the eastern and central regions, and that technical progress in the central region may further compound the effects of R&D on output within the region.
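
The abstract does not reproduce the semiparametric estimator, so the sketch below shows only the simplest parametric stand-in: a log-linear (Cobb-Douglas style) production function fitted separately per region so the R&D coefficient can differ across the eastern, central and western regions. The regression form, variable names and synthetic data are assumptions for illustration, not the paper's method or results.

```python
# A hedged sketch: OLS estimate of the R&D output elasticity per region from
# log Y = b0 + b1 log L + b2 log K + b3 log R&D. Panel data are synthetic.
import numpy as np

def rd_elasticity(log_output, log_labour, log_capital, log_rd):
    X = np.column_stack([np.ones_like(log_output), log_labour, log_capital, log_rd])
    coef, *_ = np.linalg.lstsq(X, log_output, rcond=None)
    return coef[3]                                   # coefficient on log R&D

rng = np.random.default_rng(4)
regions = {"eastern": 0.35, "central": 0.22, "western": 0.12}   # assumed true elasticities
for name, beta_rd in regions.items():
    n = 80                                           # province-year observations per region
    log_l, log_k, log_rd = (rng.normal(size=n) for _ in range(3))
    log_y = 1.0 + 0.5 * log_l + 0.3 * log_k + beta_rd * log_rd + rng.normal(0, 0.1, n)
    print(f"{name:8s} estimated R&D elasticity: {rd_elasticity(log_y, log_l, log_k, log_rd):.2f}")
```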

Relevance: 100.00%

Abstract:

To compare the accuracy of different forecasting approaches, an error measure is required. Many error measures have been proposed in the literature; however, in practice there are situations where different measures yield different decisions on forecasting approach selection, and there is no agreement on which approach should be used. Forecasting error measures are generally ratios or percentages that provide an overall picture of how well the forecasting technique fits the observations. This paper proposes a multiplicative Data Envelopment Analysis (DEA) model in order to rank several forecasting techniques. We demonstrate the proposed model by applying it to the set of yearly time series of the M3 competition, where five error measures are computed and aggregated into a single DEA score.
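
As a concrete illustration of the ingredients described above, the sketch below computes five common error measures for a few simple forecasting methods and collapses them into one score per method. The paper's multiplicative DEA model requires a proper optimiser and is not reproduced here; the geometric best-to-own ratio aggregation and all numbers below are only illustrative stand-ins.

```python
# Hedged sketch: several error measures per method, aggregated to one score.
import numpy as np

def error_measures(actual, forecast):
    err = actual - forecast
    return {
        "MAE":   np.mean(np.abs(err)),
        "RMSE":  np.sqrt(np.mean(err**2)),
        "MAPE":  np.mean(np.abs(err / actual)) * 100,
        "sMAPE": np.mean(2 * np.abs(err) / (np.abs(actual) + np.abs(forecast))) * 100,
        "MdAE":  np.median(np.abs(err)),
    }

actual = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])
methods = {
    "naive": np.r_[actual[0], actual[:-1]],                    # last observed value
    "mean":  np.full_like(actual, actual.mean()),              # overall mean
    "trend": np.linspace(actual[0], actual[-1], actual.size),  # straight line through endpoints
}

table = {name: error_measures(actual, f) for name, f in methods.items()}
best = {m: min(t[m] for t in table.values()) for m in next(iter(table.values()))}

for name, t in table.items():
    # Geometric mean of best/own ratios over the five measures (1.0 = best on all).
    score = np.exp(np.mean([np.log(best[m] / t[m]) for m in t]))
    print(f"{name:6s} score = {score:.3f}")
```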

Relevance: 100.00%

Abstract:

Online model order complexity estimation remains one of the key problems in neural network research. The problem is further exacerbated in situations where the underlying system generator is non-stationary. In this paper, we introduce a novelty criterion for resource allocating networks (RANs) which is capable of being applied to both stationary and slowly varying non-stationary problems. The deficiencies of existing novelty criteria are discussed and the relative performances are demonstrated on two real-world problems: electricity load forecasting and exchange rate prediction.
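
For context, the hedged sketch below implements the standard novelty criterion that resource allocating networks build on: a new radial basis unit is allocated only when the prediction error and the distance to the nearest existing centre both exceed thresholds, otherwise the existing weights are adapted. The paper's improved criterion for non-stationary data is not reproduced, and all thresholds are illustrative.

```python
# Minimal sketch of a resource allocating network with the classic (Platt-style)
# novelty criterion; thresholds and learning rate are illustrative.
import numpy as np

class SimpleRAN:
    def __init__(self, err_thresh=0.05, dist_thresh=0.5, width_scale=0.7, lr=0.05):
        self.centres, self.widths, self.weights = [], [], []
        self.bias = 0.0
        self.err_thresh, self.dist_thresh = err_thresh, dist_thresh
        self.width_scale, self.lr = width_scale, lr

    def _phi(self, x):
        return np.array([np.exp(-np.sum((x - c)**2) / (2 * w**2))
                         for c, w in zip(self.centres, self.widths)])

    def predict(self, x):
        if not self.centres:
            return self.bias
        return self.bias + np.dot(self.weights, self._phi(x))

    def update(self, x, y):
        err = y - self.predict(x)
        dists = [np.linalg.norm(x - c) for c in self.centres]
        nearest = min(dists) if dists else np.inf
        if abs(err) > self.err_thresh and nearest > self.dist_thresh:
            # Novelty: allocate a new unit centred on the current input.
            self.centres.append(np.array(x, float))
            self.widths.append(max(self.width_scale * nearest, 1e-3) if dists else self.width_scale)
            self.weights.append(err)
        elif self.centres:
            # Otherwise adapt output weights and bias with an LMS step.
            phi = self._phi(x)
            self.weights = list(np.array(self.weights) + self.lr * err * phi)
            self.bias += self.lr * err
        else:
            self.bias += self.lr * err

# Toy online regression: y = sin(x) observed one point at a time.
ran = SimpleRAN()
for x in np.random.default_rng(0).uniform(0, 2 * np.pi, 300):
    ran.update(np.array([x]), np.sin(x))
print("hidden units allocated:", len(ran.centres))
```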

Relevance: 100.00%

Abstract:

Obtaining wind vectors over the ocean is important for weather forecasting and ocean modelling. Several satellite systems used operationally by meteorological agencies utilise scatterometers to infer wind vectors over the oceans. In this paper we present the results of using novel neural network based techniques to estimate wind vectors from such data. The problem is partitioned into estimating wind speed and wind direction. Wind speed is modelled using a multi-layer perceptron (MLP) and a sum of squares error function. Wind direction is a periodic variable and a multi-valued function for a given set of inputs; a conventional MLP fails at this task, and so we model the full periodic probability density of direction conditioned on the satellite derived inputs using a Mixture Density Network (MDN) with periodic kernel functions. A committee of the resulting MDNs is shown to improve the results.
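
To make the periodic kernel idea concrete, the hedged sketch below evaluates a mixture of circular-normal (von Mises) kernels as a conditional density of wind direction and averages the densities of two committee members. The MLP that maps scatterometer inputs to the mixture parameters is omitted, and the parameter values are assumptions rather than outputs of the paper's networks.

```python
# Hedged sketch of a periodic mixture density for wind direction and a
# committee average; all mixture parameters are illustrative.
import numpy as np

def von_mises_pdf(theta, mu, kappa):
    """Circular-normal density on [-pi, pi)."""
    return np.exp(kappa * np.cos(theta - mu)) / (2 * np.pi * np.i0(kappa))

def mixture_density(theta, mix, mus, kappas):
    """Periodic mixture density p(theta | x) for one network's outputs."""
    return sum(m * von_mises_pdf(theta, mu, k) for m, mu, k in zip(mix, mus, kappas))

# Hypothetical outputs of two committee members for one satellite cell
# (two modes roughly 180 degrees apart, a known directional ambiguity).
members = [
    dict(mix=[0.6, 0.4], mus=[0.3, 0.3 + np.pi], kappas=[8.0, 8.0]),
    dict(mix=[0.7, 0.3], mus=[0.4, 0.4 + np.pi], kappas=[6.0, 5.0]),
]

theta = np.linspace(-np.pi, np.pi, 721)
committee = np.mean([mixture_density(theta, **m) for m in members], axis=0)
print("most probable direction (rad): %.2f" % theta[np.argmax(committee)])
```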

Relevance: 100.00%

Abstract:

The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBA) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. It is the purpose of this paper to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with equal properties to those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data that can be considered as useful when defining ecosystems and their future persistence under different climatic or development scenarios. The paper will present the architecture and illustrate the concepts through case studies which forecast the impact of climate change on protected areas or on the ecological niche of an African bird.
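
The abstract does not state how the likelihood of finding ecosystems with equal properties is computed; one common choice for this kind of habitat-similarity score is a Mahalanobis distance from each candidate cell's environmental variables to those of a reference area. The sketch below assumes that approach and uses synthetic data throughout; it is not eHabitat's implementation.

```python
# Hedged sketch: Mahalanobis-distance habitat similarity, assumed approach.
import numpy as np

def habitat_similarity(reference, candidates):
    """Score candidate cells against a reference area's environmental variables.

    reference  : (n_ref, n_vars) variables sampled inside the reference area
    candidates : (n_cells, n_vars) the same variables for cells to be scored
    Returns a value in [0, 1] per candidate (1 = indistinguishable from the reference).
    """
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(reference, rowvar=False))
    diff = candidates - mu
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)          # squared Mahalanobis distance
    # Empirical exceedance probability against the reference sample itself
    # (avoids depending on an exact chi-square CDF).
    ref_diff = reference - mu
    ref_d2 = np.einsum("ij,jk,ik->i", ref_diff, cov_inv, ref_diff)
    return np.array([(ref_d2 >= v).mean() for v in d2])

rng = np.random.default_rng(1)
reference = rng.normal([15.0, 800.0, 0.6], [2.0, 120.0, 0.1], size=(200, 3))  # temp, rain, NDVI
candidates = np.array([[15.5, 820.0, 0.62], [22.0, 400.0, 0.30]])
print(habitat_similarity(reference, candidates))   # high score, then near zero
```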

Relevance: 100.00%

Abstract:

One of the main challenges of emergency management lies in communicating risks to the public. On some occasions, risk communicators might seek to increase awareness of emerging risks, while on others the aim might be to avoid escalation of public reactions. Social media accounts offer an opportunity to distribute critical information rapidly and, in doing so, to mitigate the impact of emergencies by influencing public reactions. This article draws on theories of risk and emergency communication to consider the impact of Twitter as a tool for communicating risks to the public. We analyse 10,020 Twitter messages posted by the official accounts of UK local government authorities (councils) in the context of two major emergencies: the heavy snow of December 2010 and the riots of August 2011. Twitter was used in a variety of ways to communicate and manage the associated risks, including messages to provide official updates, encourage protective behaviour, increase awareness and guide public attention to mitigating actions. We discuss the importance of social media as a means of increasing confidence in emergency management institutions.

Relevance: 100.00%

Abstract:

The relationship between uncertainty and firms’ risk-taking behaviour has been a focus of investigation since early discussion of the nature of enterprise activity. Here, we focus on how firms’ perceptions of environmental uncertainty and their perceptions of the risks involved impact on their willingness to undertake green innovation. Analysis is based on a cross-sectional survey of UK food companies undertaken in 2008. The results reinforce the relationship between perceived environmental uncertainty and perceived innovation risk and emphasise the importance of macro-uncertainty in shaping firms’ willingness to undertake green innovation. The perceived (market-related) riskiness of innovation also positively influences the probability of innovating, suggesting either a proactive approach to stimulating market disruption or an opportunistic approach to innovation leadership.

Relevance: 40.00%

Abstract:

This paper presents a technique for forecasting forward energy prices one day ahead. The technique combines a wavelet transform with forecasting models such as multi-layer perceptrons, linear regression or GARCH. These techniques are applied to real data from the UK gas markets to evaluate their performance. The results show that forecasting accuracy is improved significantly by using the wavelet transform. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
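
A hedged sketch of the overall pipeline is given below: the price series is decomposed with a one-level Haar transform, each coefficient series is forecast separately (a least-squares AR(1) stands in for the paper's MLP, regression and GARCH stages), and the inverse transform yields the day-ahead price. The wavelet family, decomposition depth, models and data used in the paper are not reproduced; the price series is synthetic.

```python
# Hedged sketch: Haar decomposition -> per-component forecast -> inverse transform.
import numpy as np

def haar_decompose(x):
    x = np.asarray(x, float)[: len(x) // 2 * 2]           # truncate to even length
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def ar1_forecast(series):
    """One-step AR(1) forecast y_t = a*y_{t-1} + b fitted by least squares."""
    y, y_lag = series[1:], series[:-1]
    X = np.column_stack([y_lag, np.ones_like(y_lag)])
    (a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
    return a * series[-1] + b

rng = np.random.default_rng(2)
prices = 40 + np.cumsum(rng.normal(0, 0.8, 120))          # synthetic forward gas prices

approx, detail = haar_decompose(prices)
a_next, d_next = ar1_forecast(approx), ar1_forecast(detail)
next_pair = np.array([a_next + d_next, a_next - d_next]) / np.sqrt(2)   # inverse Haar step
print("day-ahead price forecast: %.2f" % next_pair[0])
```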

Relevance: 40.00%

Abstract:

In this paper, the exchange rate forecasting performance of neural network models is evaluated against random walk, autoregressive moving average and generalised autoregressive conditional heteroskedasticity models. There are no guidelines available for choosing the parameters of neural network models, and therefore the parameters are chosen according to what the researcher considers to be best. Such an approach, however, implies that the risk of making bad decisions is extremely high, which could explain why, in many studies, neural network models do not consistently perform better than their time series counterparts. In this paper, through extensive experimentation, the level of subjectivity in building neural network models is considerably reduced, giving them a better chance of performing well. The results show that, in general, neural network models perform better than the traditionally used time series models in forecasting exchange rates.
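
The sketch below illustrates the shape of such an out-of-sample comparison: rolling one-step-ahead forecasts of a synthetic exchange-rate series from a random walk benchmark and a small least-squares autoregression, scored by RMSE. The paper's neural network, ARMA and GARCH models are not reproduced, and the data are not real.

```python
# Hedged sketch of a rolling out-of-sample forecast comparison on synthetic data.
import numpy as np

def rolling_rmse(series, forecaster, window=100):
    errors = []
    for t in range(window, len(series) - 1):
        errors.append(series[t + 1] - forecaster(series[: t + 1]))
    return float(np.sqrt(np.mean(np.square(errors))))

def random_walk(history):
    return history[-1]                                   # no-change forecast

def ar_p(history, p=3):
    """AR(p) one-step forecast fitted by least squares on the history so far."""
    y = history[p:]
    X = np.column_stack([history[p - i - 1: len(history) - i - 1] for i in range(p)]
                        + [np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.dot(coef[:-1], history[-1: -p - 1: -1]) + coef[-1])

rng = np.random.default_rng(3)
rate = 1.30 + np.cumsum(rng.normal(0, 0.004, 400))       # synthetic daily exchange rate

print("random walk RMSE:", rolling_rmse(rate, random_walk))
print("AR(3) RMSE:      ", rolling_rmse(rate, lambda h: ar_p(h, 3)))
```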

Relevance: 40.00%

Abstract:

Linear models reach their limitations in applications with nonlinearities in the data. In this paper new empirical evidence is provided on the relative Euro inflation forecasting performance of linear and non-linear models. The well established and widely used univariate ARIMA and multivariate VAR models are used as linear forecasting models, whereas neural networks (NN) are used as non-linear forecasting models. The level of subjectivity in the NN building process is kept to a minimum in an attempt to exploit the full potential of the NNs. It is also investigated whether the historically poor performance of the theoretically superior measure of the monetary services flow, Divisia, relative to the traditional Simple Sum measure could be attributed, to a certain extent, to the evaluation of these indices within a linear framework. The results obtained suggest that non-linear models provide better within-sample and out-of-sample forecasts, and that linear models are simply a subset of them. The Divisia index also outperforms the Simple Sum index when evaluated in a non-linear framework. © 2005 Taylor & Francis Group Ltd.
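
To make the Divisia versus Simple Sum distinction concrete, the hedged sketch below builds both aggregates from the same hypothetical component data: the Simple Sum index adds the asset quantities, while the Tornqvist-Theil Divisia index weights each component's growth rate by its average user-cost expenditure share. The asset quantities, own rates and benchmark rate are illustrative assumptions, not the data used in the paper.

```python
# Hedged sketch: Simple Sum vs Tornqvist-Theil Divisia monetary aggregates.
import numpy as np

def user_cost(benchmark_rate, own_rates):
    """Opportunity cost of holding each asset instead of the benchmark asset."""
    return (benchmark_rate - own_rates) / (1.0 + benchmark_rate)

def divisia_index(quantities, own_rates, benchmark_rates, base=100.0):
    """quantities: (T, n) asset holdings; own_rates: (T, n); benchmark_rates: (T,)."""
    pi = np.array([user_cost(R, r) for R, r in zip(benchmark_rates, own_rates)])
    expenditure = pi * quantities
    shares = expenditure / expenditure.sum(axis=1, keepdims=True)
    avg_shares = 0.5 * (shares[1:] + shares[:-1])
    growth = (avg_shares * np.log(quantities[1:] / quantities[:-1])).sum(axis=1)
    return base * np.exp(np.concatenate([[0.0], np.cumsum(growth)]))

T = 6                                          # periods; columns: cash, sight deposits, time deposits
quantities = np.array([[100, 250, 400],
                       [102, 260, 395],
                       [105, 268, 390],
                       [110, 280, 380],
                       [118, 300, 370],
                       [126, 330, 355]], float)
own_rates = np.tile([0.00, 0.01, 0.04], (T, 1))
benchmark = np.full(T, 0.06)

simple_sum = quantities.sum(axis=1) / quantities[0].sum() * 100
print("Simple Sum index:", np.round(simple_sum, 1))
print("Divisia index:   ", np.round(divisia_index(quantities, own_rates, benchmark), 1))
```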