11 results for historical data

in Aston University Research Archive


Relevance:

60.00%

Publisher:

Abstract:

The potential for using DEA and simulation in a mutually supporting role to guide operating units towards improved performance is presented. A three-stage process is suggested. Stage one obtains the data for the DEA analysis, which can be sourced from historical data, simulated data or a combination of the two. Stage two performs the DEA analysis that identifies benchmark operating units. In stage three, simulation is used to offer practical guidance to operating units towards improved performance. This can be achieved through sensitivity analysis of the benchmark unit, using a simulation model to assess directly the feasibility and efficiency of any variations in operating practice to be tested. Alternatively, simulation can be used as a mechanism to transmit the practices of the benchmark unit to weaker-performing units, by building a simulation model of the weaker unit configured to the process design of the benchmark unit. The model can then compare the performance of the current and benchmark process designs. Quantifying improvement in this way provides a useful driver for any process change initiative required to bring the performance of weaker units up to the best in class. © 2005 Operational Research Society Ltd. All rights reserved.
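The DEA stage of the three-stage process can be illustrated with the standard input-oriented CCR model solved as a linear programme. This is a minimal generic sketch, not the paper's own implementation; the `dea_efficiency` function name and the two-unit data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k.
    X: (n_units, n_inputs); Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                    # minimise theta
    # inputs:  sum_j lam_j * x_ij <= theta * x_ik
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    # outputs: sum_j lam_j * y_rj >= y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(0, None)] * (n + 1))
    return float(res.x[0])

# Two units, one input and one output each: unit 1 uses twice the input
# of unit 0 for the same output, so its CCR efficiency is half
X = np.array([[2.0], [4.0]])
Y = np.array([[2.0], [2.0]])
```

A unit with efficiency 1.0 is a benchmark; for any other unit, the optimal lambda weights identify the benchmark mix whose process design the simulation stage would then transmit.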

Relevance:

60.00%

Publisher:

Abstract:

We investigate boardroom governance using UK historical data for 1935. We demonstrate a negative relationship between risk and incentives in this year, where prior research has produced anomalous results (Prendergast, 2002). Second, we show that average (median) board ownership of ordinary shares is about 7.95% (2.88%); this figure is lower than previously reported estimates for the US, also based on 1935 data. Finally, we document the phenomenon of multiple board membership: UK directors in 1935 hold many directorships, sometimes exceeding 10 concurrent memberships.

Relevance:

60.00%

Publisher:

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
The results of applying these measurement and planning models to the British motor vehicle manufacturing companies are presented and discussed.
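The core idea of the risk-simulation model, propagating independent and normally distributed component variables through an added-value productivity ratio, can be sketched as follows. All figures and variable names are illustrative assumptions for the sketch, not British Leyland data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

# Hypothetical component variables of an added-value productivity ratio:
# added value = sales output less bought-in materials and services, and
# productivity = added value / employment costs.  Means and standard
# deviations are illustrative, chosen here for the sketch.
output    = rng.normal(120.0, 8.0, n_trials)  # sales value per period
materials = rng.normal(55.0, 5.0, n_trials)   # bought-in materials/services
labour    = rng.normal(40.0, 2.0, n_trials)   # employment costs

productivity = (output - materials) / labour

# Risk profile: chance that productivity falls below a planning target
target = 1.4
risk = float((productivity < target).mean())
```

Repeating the draw for each planning period, with the component means taken from the polynomial forecasts, yields a full probability distribution of productivity growth rather than a single point estimate.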

Relevance:

60.00%

Publisher:

Abstract:

This thesis describes an investigation by the author into the spares operation of CompAir BroomWade Ltd. Whilst the complete system, including the warehousing and distribution functions, was investigated, the thesis concentrates on the provisioning aspect of the spares supply problem. Analysis of the historical data showed the presence of significant fluctuations in all the measures of system performance. Two Industrial Dynamics simulation models were developed to study this phenomenon. The models showed that any fluctuation in end-customer demand would be amplified as it passed through the distributor and warehouse stock control systems. The evidence from the historical data available supported this view of the system's operation. The models were used to determine which parts of the total system could be expected to exert a critical influence on its performance. The lead time parameters of the supply sector were found to be critical, and further study showed that the manner in which the lead time changed with work-in-progress levels was also an important factor. The problem therefore resolved into the design of a spares manufacturing system which exhibited the appropriate dynamic performance characteristics. The gross level of entity representation inherent in the Industrial Dynamics methodology was found to limit the value of these models in the development of detailed design proposals. Accordingly, an interacting job shop simulation package was developed to allow detailed evaluation of organisational factors on the performance characteristics of a manufacturing system. The package was used to develop a design for a pilot spares production unit. The need for a manufacturing system to perform successfully under conditions of fluctuating demand is not limited to the spares field.
Thus, although the spares exercise provides an example of the approach, the concepts and techniques developed can be considered to have broad application throughout batch manufacturing industry.
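The demand-amplification mechanism the Industrial Dynamics models exposed can be sketched with a toy two-echelon simulation: each stage follows a simple order-up-to rule driven by a moving-average demand forecast. The replenishment rule, lead time and forecast window below are illustrative assumptions, not the thesis's model.

```python
import numpy as np

def simulate(demand, lead_time=2, window=4):
    """One stock-controlled echelon: order up to a target set by a
    moving-average demand forecast; returns the orders placed upstream."""
    demand = np.asarray(demand, dtype=float)
    inventory = demand[0] * (lead_time + 1)
    pipeline = [demand[0]] * lead_time       # orders already in transit
    orders = []
    for t, d in enumerate(demand):
        inventory += pipeline.pop(0) - d     # receive oldest order, ship demand
        forecast = demand[max(0, t - window + 1):t + 1].mean()
        target = forecast * (lead_time + 1)  # cover lead time plus review period
        order = max(0.0, target - inventory - sum(pipeline))
        pipeline.append(order)
        orders.append(order)
    return np.array(orders)

rng = np.random.default_rng(1)
end_demand = 100 + rng.normal(0, 5, 200)        # noisy end-customer demand
distributor_orders = simulate(end_demand)       # distributor orders on warehouse
warehouse_orders = simulate(distributor_orders) # warehouse orders on factory
```

After the initial transient, the variance of the order stream grows at each upstream stage, which is exactly the fluctuation-amplification behaviour the two simulation models demonstrated.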

Relevance:

60.00%

Publisher:

Abstract:

The thesis deals with the background, development and description of a mathematical stock control methodology for use within an oil and chemical blending company, where demand and replenishment lead-times are generally non-stationary. The stock control model proper relies, as input, on adaptive forecasts of demand determined for an economical forecast/replenishment period precalculated on an individual stock-item basis. The control procedure is principally of the continuous review, reorder level type, where the reorder level and reorder quantity 'float', that is, each changes in accordance with changes in demand. Two versions of the Methodology are presented: a cost minimisation version and a service level version. Realising the importance of demand forecasts, four recognised variations of the Trigg and Leach adaptive forecasting routine are examined. A fifth variation, developed here, is proposed as part of the stock control methodology. The results of testing the cost minimisation version of the Methodology with historical data, by means of a computerised simulation, are presented together with a description of the simulation used. The performance of the Methodology is in addition compared favourably with a rule-of-thumb approach considered by the Company as an interim solution for reducing stock levels. The contribution of the work to the field of scientific stock control is felt to be significant for the following reasons: (1) the Methodology is designed specifically for use with non-stationary demand and for this reason alone appears to be unique; (2) the Methodology is unique in its approach, and the cost minimisation version is shown to work successfully with the demand data presented; (3) the Methodology and the thesis as a whole fill an important gap between complex mathematical stock control theory and practical application.
A brief description of a computerised order processing/stock monitoring system, designed and implemented as a prerequisite for the Methodology's practical operation, is presented as an appendix.
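The Trigg and Leach routine at the heart of the forecasting comparison sets the smoothing constant each period to the absolute value of the tracking signal, the ratio of smoothed error to smoothed absolute error. A minimal sketch of the basic routine (the error-smoothing constant `gamma` is an illustrative choice, and the variations the thesis compares are not reproduced here):

```python
def trigg_leach(demand, gamma=0.2):
    """Adaptive exponential smoothing: the smoothing constant each period
    is the absolute tracking signal, smoothed error / smoothed |error|."""
    forecast = float(demand[0])
    e_bar = 0.0        # exponentially smoothed error
    m_bar = 1e-9       # exponentially smoothed absolute error
    forecasts = []
    for d in demand:
        forecasts.append(forecast)
        err = d - forecast
        e_bar = gamma * err + (1 - gamma) * e_bar
        m_bar = gamma * abs(err) + (1 - gamma) * m_bar
        alpha = abs(e_bar) / m_bar          # tracking signal, in [0, 1]
        forecast += alpha * err
    return forecasts
```

A step change in demand drives the tracking signal towards one, so the forecast adapts almost immediately rather than at a fixed smoothing rate; this responsiveness under non-stationary demand is what makes the routine a natural input to the floating reorder level and reorder quantity.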

Relevance:

60.00%

Publisher:

Abstract:

Around 80% of the 63 million people in the UK live in urban areas, where demand for affordable housing is highest. The supply of new dwellings falls a long way short of demand, and with an average annual replacement rate of 0.5%, more than 80% of the existing residential housing stock will still be in use by 2050. A high proportion of owner-occupiers, a weak private rental sector and a lack of sustainable financing models render England’s housing market one of the least responsive in the developed world. As exploratory research, this paper examines the provision of social housing in the United Kingdom, with a particular focus on England, and sets out implications for housing associations delivering sustainable community development. The paper is based on an analysis of historical data series (Census data), current macro-economic data and population projections to 2033. It identifies a chronic undersupply of affordable housing in England which is likely to be exacerbated by demographic development, changes in household composition and reduced availability of finance to develop new homes. Based on the housing market trends analysed in this paper, opportunities are identified for policy makers to remove barriers to the delivery of new affordable homes, and for social housing providers to evolve their business models by taking a wider role in sustainable community development.

Relevance:

60.00%

Publisher:

Abstract:

This paper considers the global synchronisation of a stochastic version of coupled map lattice networks through an innovative stochastic adaptive linear quadratic pinning control methodology. In a stochastic network, each state receives only noisy measurements of its neighbours' states. For such networks we derive a generalised Riccati solution that quantifies and incorporates uncertainty of the forward dynamics and inverse controller in the derivation of the stochastic optimal control law. The generalised Riccati solution is derived using the Lyapunov approach. A probabilistic approximation type algorithm is employed to estimate the conditional distributions of the state and inverse controller from historical data and to quantify model uncertainties. The theoretical derivation is complemented by its validation on a set of representative examples.
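In the noise-free limit, a generalised Riccati solution of this kind reduces to the standard discrete-time LQR recursion. The sketch below iterates that Riccati difference equation to a fixed point for an illustrative double-integrator system; the matrices are assumptions for the sketch, not the paper's network model.

```python
import numpy as np

def dare_iterate(A, B, Q, R, iters=500):
    """Iterate the discrete-time Riccati difference equation to a fixed
    point; returns the cost matrix P and the optimal feedback gain K."""
    P = Q.copy()
    for _ in range(iters):
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)   # optimal gain
        P = Q + A.T @ P @ (A - B @ K)
    return P, K

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # illustrative double integrator
B = np.array([[0.0], [1.0]])
Q = np.eye(2)                            # state cost
R = np.array([[1.0]])                    # control cost
P, K = dare_iterate(A, B, Q, R)
```

The fixed point `P` satisfies the discrete algebraic Riccati equation and the closed loop `A - B K` is stable; the paper's contribution lies in generalising this recursion so that `P` also accounts for the uncertainty of dynamics estimated from noisy historical data.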

Relevance:

60.00%

Publisher:

Abstract:

Rework strategies that involve different checking points as well as rework times can be applied to a reconfigurable manufacturing system (RMS) under certain constraints, and an effective rework strategy can significantly improve the mission reliability of the manufacturing process. The mission reliability of the process is a measure of the production ability of an RMS, serving as an integrated performance indicator of the production process under specified technical constraints, including time, cost and quality. To quantitatively characterise the mission reliability and basic reliability of an RMS under different rework strategies, a rework model of the RMS was established based on logistic regression. Firstly, the functional relationship between capability and work load of the manufacturing process was studied by statistically analysing a large number of historical data obtained in actual machining processes. Secondly, the output, mission reliability and unit cost on different rework paths were calculated and taken as the decision variables, based on different input quantities and the rework model mentioned above. Thirdly, optimal rework strategies for different input quantities were determined by calculating the weighted decision values and analysing the advantages and disadvantages of each rework strategy. Finally, a case application is demonstrated to prove the efficiency of the proposed method.
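The capability-versus-work-load relationship fitted by logistic regression can be sketched generically. The gradient-descent fit and the synthetic data below are illustrative assumptions, not the paper's machining data or its exact model.

```python
import numpy as np

def fit_logistic(x, y, lr=0.5, steps=5000):
    """Fit P(conforming) = sigmoid(w*x + b) by gradient descent on log-loss."""
    xm = x.mean()
    xc = x - xm                  # centre the regressor for stable convergence
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(w * xc + b)))
        w -= lr * np.mean((p - y) * xc)
        b -= lr * np.mean(p - y)
    return w, b - w * xm         # translate intercept back to the raw scale

# Synthetic data: probability of a conforming part falls as work load rises
rng = np.random.default_rng(0)
load = rng.uniform(0.0, 10.0, 500)
p_true = 1.0 / (1.0 + np.exp(0.8 * load - 4.0))
ok = (rng.uniform(size=500) < p_true).astype(float)
w, b = fit_logistic(load, ok)
```

The fitted curve then gives, for any planned input quantity, the conforming probability feeding the output, mission reliability and unit cost calculations on each rework path.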

Relevance:

60.00%

Publisher:

Abstract:

Purpose—This article considers North Korea and the notion of crisis, by linking historical development over the Korean peninsula to the conflict resolution literature, and investigates why, despite a large number of destabilizing events, a war involving Pyongyang has yet to erupt. Design/methodology—This article uses historical data and a framework developed by Aggarwal et al. in order to highlight patterns of interaction between states such as the United States, North Korea and South Korea, organizations such as the United Nations, and processes such as the Six-Party Talks and the Agreed Framework. The article then develops a crisis framework based on the conflict resolution and negotiation literature, and applies it to three North Korean administrations. Findings—Findings suggest that an open-ended understanding of time (for all parties involved on the peninsula) makes it impossible to reach a threshold where full-scale war would be triggered, leaving parties in a stable state of crisis for which escalating moves and de-escalating techniques may become irrelevant. Practical implications—It is hoped that this article will help further endeavors linking conflict resolution theoretical frameworks to the Korean peninsula security situation. In-depth analysis of particular security sectors such as nuclear energy, food security, or missile testing would prove particularly useful in understanding the complexity of the Korean peninsula situation to a greater extent.
Originality/value—This research suggests that, regarding the Korean peninsula, time has been understood as open-ended, leading parties to a lingering state of heightened hostilities that oscillates toward war but is controlled enough not to reach it.

Relevance:

60.00%

Publisher:

Abstract:

This article considers North Korea and the notion of crisis, by linking historical development over the Korean peninsula to the conflict resolution literature, and investigates why, despite a large number of destabilising events, a war involving Pyongyang has yet to erupt. The paper considers historical data and uses a framework developed by Aggarwal et al. in order to highlight patterns of interaction between states such as the United States, North Korea and South Korea, organisations such as the United Nations, and processes such as the Six-Party Talks and the Agreed Framework. The paper then develops a crisis framework based on the conflict resolution and negotiation literature, and applies it to three North Korean administrations. Findings suggest that an elastic understanding of time (for all parties involved on the peninsula) makes it impossible to reach a threshold where full-scale war would be triggered, leaving parties in a stable state of crisis for which escalating moves and de-escalating techniques may become irrelevant.

Relevance:

60.00%

Publisher:

Abstract:

The ontology engineering research community has focused for many years on supporting the creation, development and evolution of ontologies. Ontology forecasting, which aims at predicting semantic changes in an ontology, represents instead a new challenge. In this paper, we contribute to this novel endeavour by focusing on the task of forecasting semantic concepts in the research domain. Indeed, ontologies representing scientific disciplines contain only research topics that are already popular enough to be selected by human experts or automatic algorithms. They are thus unfit to support tasks which require the ability to describe and explore the forefront of research, such as trend detection and horizon scanning. We address this issue by introducing the Semantic Innovation Forecast (SIF) model, which predicts new concepts of an ontology at time t + 1 using only data available at time t. Our approach relies on lexical innovation and adoption information extracted from historical data. We evaluated the SIF model on a very large dataset consisting of over one million scientific papers belonging to the Computer Science domain: the outcomes show that the proposed approach offers a competitive boost in mean average precision-at-ten compared to the baselines when forecasting over 5 years.