926 results for dynamic time warping (DTW)
Abstract:
The amplification of demand variation up a supply chain, widely termed 'the Bullwhip Effect', is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies and organisational structure, and delays in material and information flow, all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exhaustive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even when customer demand and lead time are deterministic. This widening range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system the measures were intended to assess. To explore this, a plausible abstraction of the operational responses to the Supply Chain Council's SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared with established targets. Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes, with modelled operational responses, in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact on supply chain dynamics of applying a representative performance measurement system.
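As a minimal illustration of the bullwhip measure described above (a sketch only; the study's Simul8 model is far more detailed), the ratio can be computed directly from order and demand series. The series below are synthetic placeholders, not data from the study:

```python
import numpy as np

def bullwhip_ratio(upstream_orders, downstream_demand):
    # Ratio of the standard deviation of upstream demand (orders placed)
    # to that of downstream (customer) demand; > 1 indicates amplification.
    return np.std(upstream_orders, ddof=1) / np.std(downstream_demand, ddof=1)

# Synthetic placeholder series for illustration only
rng = np.random.default_rng(0)
demand = 100 + rng.normal(0, 5, size=52)    # weekly customer demand
orders = 100 + rng.normal(0, 12, size=52)   # amplified upstream orders
print(f"bullwhip ratio: {bullwhip_ratio(orders, demand):.2f}")
```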
Abstract:
Commentary on the target article "From simple associations to systematic reasoning: a connectionist representation of rules, variables, and dynamic bindings using temporal synchrony", by L. Shastri and V. Ajjanagadde, pp. 417-494
Abstract:
In this paper, we discuss some practical implications for implementing adaptable network algorithms applied to non-stationary time series problems. Using electricity load data and training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise and forgetting factors for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. We also find that a recently proposed alternative novelty criterion, found to be more robust in stationary environments, does not fare so well in the non-stationary case due to the need for filter adaptability during training.
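For context, the RAN model-order increment (novelty) criterion whose parameter sensitivity the abstract reports can be sketched as follows. This follows Platt's original two-part test, with `delta` and `e_min` standing in for the thresholds in question:

```python
import numpy as np

def ran_should_grow(x, centres, error, delta, e_min):
    """RAN novelty criterion (after Platt, 1991): allocate a new RBF unit
    only if the input is far from every existing centre AND the current
    prediction error is large. The abstract's point is that behaviour is
    highly sensitive to the thresholds delta and e_min."""
    if len(centres) == 0:
        return True
    nearest = min(np.linalg.norm(x - c) for c in centres)
    return nearest > delta and abs(error) > e_min
```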
Abstract:
We present in this paper ideas for tackling the problem of analysing and forecasting non-stationary time series within the financial domain. Accepting the stochastic nature of the underlying data generator, we assume that the evolution of the generator's parameters is restricted to a deterministic manifold. We therefore propose methods for determining the characteristics of the time-localised distribution. Starting with the assumption of a static normal distribution, we refine this hypothesis according to the empirical results obtained with these methods, and conclude by indicating dynamic non-Gaussian behaviour with varying dependency for the time series under consideration.
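A generic way to estimate characteristics of a time-localised distribution (an illustration of the general idea only, not the paper's specific methods) is via rolling-window moments; the window length and test data here are arbitrary assumptions:

```python
import numpy as np
import pandas as pd

# Rolling-window estimates of the time-localised distribution: if the
# generator's parameters drift smoothly, local moments computed over a
# sliding window track that drift. Heavy-tailed synthetic data for show.
returns = pd.Series(np.random.default_rng(1).standard_t(df=4, size=2000))
window = 250
local_mean = returns.rolling(window).mean()
local_std = returns.rolling(window).std()
local_kurt = returns.rolling(window).kurt()  # excess kurtosis; > 0 hints at non-Gaussian tails
```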
Abstract:
The deficiencies of stationary models applied to financial time series are well documented. A special form of non-stationarity, where the underlying generator switches between (approximately) stationary regimes, seems particularly appropriate for financial markets. We combine dynamic switching (modelled by a hidden Markov model) with a linear dynamical system in a hybrid switching state space model (SSSM), and discuss the practical details of training such models with a variational EM algorithm due to [Ghahramani and Hinton, 1998]. The performance of the SSSM is evaluated on several financial data sets and it is shown to improve on a number of existing benchmark methods.
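One common formulation of such a hybrid model, written out for orientation (the exact parameterisation of Ghahramani and Hinton's SSSM differs in detail; they run several linear dynamical systems in parallel and let the switch select which one is observed):

```latex
\begin{aligned}
P(s_t = j \mid s_{t-1} = i) &= \Pi_{ij}, \quad s_t \in \{1,\dots,M\} && \text{(hidden Markov switch)}\\
x_t &= A^{(s_t)} x_{t-1} + w_t, \quad w_t \sim \mathcal{N}\!\left(0, Q^{(s_t)}\right) && \text{(linear dynamics)}\\
y_t &= C^{(s_t)} x_t + v_t, \quad v_t \sim \mathcal{N}\!\left(0, R^{(s_t)}\right) && \text{(observation)}
\end{aligned}
```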
Abstract:
In the analysis and prediction of many real-world time series, the assumption of stationarity is not valid. A special form of non-stationarity, where the underlying generator switches between (approximately) stationary regimes, seems particularly appropriate for financial markets. We introduce a new model which combines dynamic switching (controlled by a hidden Markov model) with a non-linear dynamical system. We show how to train this hybrid model within a maximum likelihood framework and evaluate its performance on both synthetic and financial data.
Abstract:
This paper explores the importance of collaboration between different types of organizations within an enterprise. Achieving successful collaboration requires both endogenous and exogenous factors of each organization to be considered, together with a shared meta-strategy supported by shared cross-organizational processes and technology. A rolling business plan would periodically review, assess and reposition each organization within this meta-strategy according to how well it has contributed. We show that recent technological advances have made organizational structures more agile, organizational infrastructure more connected and the sharing of real-time information an operational reality; we also discuss the challenges and risks.
Abstract:
Purpose: Meibomian-derived lipid secretions are well characterised, but their subsequent fate in the ocular environment is less well understood. Phospholipids are thought to facilitate the interface between aqueous and lipid layers of the tear film and to be involved in ocular lubrication processes. We have extended our previous studies on phospholipid levels in the tear film to encompass the fate of polar and non-polar lipids in progressive accumulation and aging processes on both conventional and silicone-modified hydrogel lenses. This is an important aspect of the developing understanding of the role of lipids in the clinical performance of silicone hydrogels. Method: Several techniques were used to identify lipids in the tear film. Mass-spectrometric methods included Agilent 1100-based liquid chromatography coupled to mass spectrometry (LCMS) and Perkin Elmer gas chromatography mass spectrometry (GCMS). Thin layer chromatography (TLC) was used for separation of lipids on the basis of increasing solvent polarity. Routine assay of lipid extractions from patient-worn lenses was carried out using a Hewlett Packard 1090 liquid chromatograph coupled to both UV and Agilent 1100 fluorescence detection. A range of histological, optical and electron microscopy techniques was used in deposit analysis. Results: Progressive lipid uptake was assessed in various ways, including composition changes with wear time, differential lipid penetration into the lens matrix and, particularly, the extent to which lipids become unextractable as a function of wear time. Solvent-based separation and HPLC gave consistent results indicating that the polarity of lipid classes decreased as follows: phospholipids/fatty acids > triglycerides > cholesterol/cholesteryl esters. Tear lipids were found to show autofluorescence, which underpinned the value of fluorescence microscopy and of fluorescence detection coupled with HPLC separation. The most fluorescent lipids were found to be cholesteryl esters; histological techniques coupled with fluorescence microscopy indicated that the white spots ("jelly bumps") formed on silicone hydrogel lenses contain a high proportion of cholesteryl esters. Lipid profiles averaged for 30 symptomatic and 30 asymptomatic contact lens wearers were compiled. Peak classes were split into cholesterol (C), cholesteryl esters (CE), glycerides (G) and polar fatty acids/phospholipids (PL). The asymptomatic/symptomatic lipid ratio was 0.6 ± 0.1 for all classes except one: the cholesterol ratio was 0.2 ± 0.05. Significantly, the PL ratio was no different from that of any other class except cholesterol. Chromatography indicated that lipid polarity decreased with depth of penetration and that lipid extractability decreased with wear time. Conclusions: Meibomian lipid composition differs from that in the tear film and on worn lenses. Although the same broad lipid classes were obtained by extraction from all lenses and all patients studied, quantities vary with wear and material. Lipid extractability diminishes with wear time regardless of the use of cleaning regimes. Dry eye symptoms in contact lens wear are frequently linked to lipid layer behaviour but seem to relate more to total lipid than to specific composition. Understanding the detail of lipid-related processes is an important element of improving the clinical performance of materials and care solutions.
Abstract:
Färe, Grosskopf, Norris and Zhang developed a non-parametric productivity index, the Malmquist index, using data envelopment analysis (DEA). The Malmquist index is a measure of productivity progress (or regress) and it can be decomposed into components such as 'efficiency catch-up' and 'technology change'. However, the Malmquist index and its components are based on only two periods of time, which can capture only part of the impact of investment in long-lived assets. The effects of lags in the investment process on the capital stock have been ignored in the current Malmquist index model. This paper extends the recent dynamic DEA models introduced by Emrouznejad and Thanassoulis and by Emrouznejad to a dynamic Malmquist index. This paper shows that the dynamic productivity results for Organisation for Economic Cooperation and Development countries should reflect reality better than those based on the conventional model.
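For reference, the standard Färe, Grosskopf, Norris and Zhang decomposition of the output-oriented Malmquist index between periods t and t+1, in terms of the output distance functions D_o, is:

```latex
M_o = \underbrace{\frac{D_o^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D_o^{t}\!\left(x^{t},y^{t}\right)}}_{\text{efficiency catch-up}}
\times
\underbrace{\left[\frac{D_o^{t}\!\left(x^{t+1},y^{t+1}\right)}{D_o^{t+1}\!\left(x^{t+1},y^{t+1}\right)}
\cdot \frac{D_o^{t}\!\left(x^{t},y^{t}\right)}{D_o^{t+1}\!\left(x^{t},y^{t}\right)}\right]^{1/2}}_{\text{technology change}}
```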
Abstract:
In this paper we propose a data envelopment analysis (DEA) based method for assessing the comparative efficiencies of units operating production processes where input-output levels are inter-temporally dependent. One cause of inter-temporal dependence between input and output levels is capital stock, which influences output levels over many production periods. Such units cannot be assessed by traditional or 'static' DEA, which assumes input-output correspondences are contemporaneous in the sense that the output levels observed in a time period are the product solely of the input levels observed during that same period. The method developed in the paper overcomes the problem of inter-temporal input-output dependence by using input-output 'paths' mapped out by operating units over time as the basis for assessing them. As an application we compare the results of the dynamic and static models for a set of UK universities; the paper suggests that the dynamic model captures efficiency better than the static model.
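To make the 'static' baseline concrete, here is a minimal sketch of a standard input-oriented CCR envelopment LP solved with scipy (generic textbook DEA, not the paper's dynamic path-based formulation; the toy data are invented):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0 via the envelopment LP:
        min theta  s.t.  X @ lam <= theta * x0,  Y @ lam >= y0,  lam >= 0.
    X is (m inputs x n units), Y is (s outputs x n units)."""
    m, n = X.shape
    s = Y.shape[0]
    x0, y0 = X[:, j0], Y[:, j0]
    c = np.r_[1.0, np.zeros(n)]               # decision vector z = [theta, lam]
    A_ub = np.block([[-x0[:, None], X],        # X lam - theta x0 <= 0
                     [np.zeros((s, 1)), -Y]])  # -Y lam <= -y0
    b_ub = np.r_[np.zeros(m), -y0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                             # efficiency score in (0, 1]

# Tiny invented example: 2 inputs, 1 output, 4 units
X = np.array([[2., 4., 3., 5.], [3., 1., 2., 4.]])
Y = np.array([[1., 1., 1., 1.]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```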
Abstract:
Liposomes have been imaged using a plethora of techniques. However, few of these methods offer the ability to study these systems in their natural hydrated state without drying, staining and fixation of the vesicles. The ability to image a liposome in its hydrated state is the ideal scenario for visualising these dynamic lipid structures, and environmental scanning electron microscopy (ESEM), with its ability to image wet systems without prior sample preparation, offers potential advantages over the above methods. In our studies, we have used ESEM not only to investigate the morphology of liposomes and niosomes but also to dynamically follow the changes in structure of lipid films and liposome suspensions as water condenses on to, or evaporates from, the sample. In particular, changes in liposome morphology were studied using ESEM in real time to investigate the resistance of liposomes to coalescence during dehydration, thereby providing an alternative assay of liposome formulation and stability. Based on this protocol, we have also studied niosome-based systems and cationic liposome/DNA complexes.
Abstract:
This preliminary report describes work carried out as part of work package 1.2 of the MUCM research project. The report is split into two parts: the first part (Sections 1 and 2) summarises the state of the art in emulation of computer models, while the second presents some initial work on the emulation of dynamic models. In the first part, we describe the basics of emulation, introduce the notation and put together the key results for the emulation of models with single and multiple outputs, with or without the use of a mean function. In the second part, we present preliminary results on the chaotic Lorenz 63 model. We look at emulation of a single time step, and repeated application of the emulator for sequential prediction. After some design considerations, the emulator is compared with the exact simulator on a number of runs to assess its performance. Several general issues related to emulating dynamic models are raised and discussed. Current work on the larger Lorenz 96 model (40 variables) is presented in the context of dimension reduction, with results to be provided in a follow-up report. The notation used in this report is summarised in the appendix.
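A minimal sketch of the single-time-step emulation idea on Lorenz 63, with repeated application for sequential prediction (assumptions: a scikit-learn Gaussian process with a shared RBF kernel across the three outputs and a uniform random design; the report's emulator, design and multi-output treatment differ):

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def lorenz63(t, u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def step(u, dt=0.05):
    """One simulator time step: integrate Lorenz 63 forward by dt."""
    return solve_ivp(lorenz63, (0, dt), u, rtol=1e-8).y[:, -1]

# Design: random points in a box around the attractor; outputs are the
# state one time step later.
rng = np.random.default_rng(0)
train_in = rng.uniform([-20, -25, 0], [20, 25, 50], size=(200, 3))
train_out = np.array([step(u) for u in train_in])

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([5.0, 5.0, 5.0]),
                              normalize_y=True).fit(train_in, train_out)

# Sequential prediction: feed the emulator its own output repeatedly
# and compare against the exact simulator.
u_em = u_sim = np.array([1.0, 1.0, 20.0])
for t in range(50):
    u_em, u_sim = gp.predict(u_em.reshape(1, -1))[0], step(u_sim)
print("error after 50 steps:", np.linalg.norm(u_em - u_sim))
```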
Abstract:
A key objective of autonomic computing is to reduce the cost and expertise required for the management of complex IT systems. As a growing number of these systems are implemented as hierarchies or federations of lower-level systems, techniques that support the development of autonomic systems of systems are required. This article introduces one such technique, which involves the run-time synthesis of autonomic system connectors. These connectors are specified by means of a new type of autonomic computing policy termed a resource-definition policy, and enable the dynamic realisation of collections of collaborating autonomic systems, as envisaged by the original vision of autonomic computing. We propose a framework for the formal specification of autonomic computing policies, and use it to define the new policy type and to describe its application to the development of autonomic systems of systems. To validate the approach, we present a sample data-centre application that was built using connectors synthesised from resource-definition policies.
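Purely as a hypothetical illustration of what a resource-definition policy conveys (the article's formal specification framework defines the actual syntax; all names and fields below are invented):

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: a resource-definition policy tells an
# autonomic manager which resource interface (readable sensors,
# writable effectors) a connector must expose, so that collaborating
# autonomic systems can be assembled at run time.
@dataclass
class ResourceDefinitionPolicy:
    resource_type: str                        # e.g. "data-centre-server"
    readable_properties: list = field(default_factory=list)   # sensor data
    writable_properties: list = field(default_factory=list)   # effector knobs

policy = ResourceDefinitionPolicy(
    resource_type="data-centre-server",
    readable_properties=["cpu_load", "active_requests"],
    writable_properties=["allocated_capacity"],
)
```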
Abstract:
A study has been made of the dynamic behaviour of a nuclear fuel reprocessing plant utilising pulsed solvent extraction columns. A flowsheet is presented and the choice of an extraction device is discussed. The plant is described by a series of modules, each module representing an item of equipment and consisting of a series of differential equations describing the dynamic behaviour of that equipment. The model is written in PMSP, a language developed for dynamic simulation models. The differential equations are solved to predict plant behaviour over time. The dynamic response of the plant to a range of disturbances has been assessed, and the interactions between pulsed columns have been demonstrated and illustrated. The importance of auxiliary items of equipment to plant performance is demonstrated. Control of the reprocessing plant is considered and the effect of control parameters on performance assessed.
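The modular ODE approach can be illustrated generically (a toy sketch, not the PMSP model): each module contributes its own differential equations and modules are coupled through their streams, here two well-mixed stages in series responding to a step disturbance in feed concentration:

```python
from scipy.integrate import solve_ivp

def two_stage_plant(t, c, flow=1.0, vol=(5.0, 8.0)):
    c_feed = 1.0 if t < 10 else 1.5           # step disturbance at t = 10
    dc1 = flow * (c_feed - c[0]) / vol[0]     # module 1: stage mass balance
    dc2 = flow * (c[0] - c[1]) / vol[1]       # module 2: fed by module 1
    return [dc1, dc2]

# Solve the coupled modules to predict plant behaviour over time
sol = solve_ivp(two_stage_plant, (0, 60), [1.0, 1.0], max_step=0.5)
```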
Abstract:
Amongst all the objectives in the study of time series, uncovering the dynamic law of its generation is probably the most important. When the underlying dynamics are not available, time series modelling consists of developing a model which best explains a sequence of observations. In this thesis, we consider hidden space models for analysing and describing time series. We first provide an introduction to the principal concepts of hidden state models and draw an analogy between hidden Markov models and state space models. Central ideas such as hidden state inference and parameter estimation are reviewed in detail. A key part of multivariate time series analysis is identifying the delay between different variables. We present a novel approach to time delay estimation in a non-stationary environment. The technique makes use of hidden Markov models and we demonstrate its application to estimating a crucial parameter in the oil industry. We then focus on hybrid models that we call dynamical local models. These models combine and generalise hidden Markov models and state space models. Probabilistic inference is unfortunately computationally intractable, and we show how to make use of variational techniques for approximating the posterior distribution over the hidden state variables. Experimental simulations on synthetic and real-world data demonstrate the application of dynamical local models for segmenting a time series into regimes and providing predictive distributions.
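As a reminder of the exact hidden-state inference that remains tractable for a plain HMM but is lost in hybrid models such as the dynamical local models above (hence the variational approximation), a log-space forward pass looks like this:

```python
import numpy as np
from scipy.special import logsumexp

def hmm_forward(log_obs, log_trans, log_init):
    """Forward pass for a hidden Markov model in log space, returning the
    log-likelihood of an observation sequence. log_obs[t, k] is the log
    probability of observation t under hidden state k; log_trans[i, j] is
    the log transition probability from state i to state j."""
    T, K = log_obs.shape
    alpha = log_init + log_obs[0]
    for t in range(1, T):
        # alpha_t(j) = logsum_i [alpha_{t-1}(i) + log A_ij] + log b_j(o_t)
        alpha = logsumexp(alpha[:, None] + log_trans, axis=0) + log_obs[t]
    return logsumexp(alpha)
```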