29 results for IDEAL Reference Model
in Aston University Research Archive
Abstract:
The amplification of demand variation up a supply chain, widely termed ‘the Bullwhip Effect’, is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisation structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even if customer demand and lead time are deterministic. This widening range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system the metrics were intended to measure. To explore this, a plausible abstraction of the operational responses to the Supply Chain Council’s SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared to established targets.
Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact, on supply chain dynamics, of applying a representative performance measurement system.
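The amplification measure described above can be sketched in a few lines. The weekly figures below are hypothetical and the function name is illustrative, not code from the thesis; a ratio above one indicates that an echelon is amplifying demand variation.

```python
import statistics

def bullwhip_ratio(upstream_orders, downstream_demand):
    """Ratio of the standard deviation of upstream orders to the standard
    deviation of downstream demand; > 1 signals demand amplification."""
    return statistics.stdev(upstream_orders) / statistics.stdev(downstream_demand)

# Hypothetical weekly figures: retail demand and the orders placed upstream.
demand = [100, 102, 98, 101, 99, 103, 97, 100]
orders = [100, 110, 85, 108, 92, 115, 80, 105]

ratio = bullwhip_ratio(orders, demand)
```

With these figures the upstream orders swing far more widely than the underlying demand, so the ratio comes out well above one.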
Abstract:
This thesis starts with a literature review, outlining the major issues identified in the literature concerning virtual manufacturing enterprise (VME) transformation. It then details the research methodology used: a systematic approach for empirical research. Next, based on the conceptual framework proposed, the thesis builds three modules to form a reference model, with the purpose of clarifying the important issues relevant to transforming a traditional manufacturing company into a VME. The first module proposes a mechanism of VME transformation: operating along the VME metabolism. The second module builds a management function within a VME to ensure the proper operation of this mechanism. This function helps identify six areas as closely related to VME transformation: lean manufacturing; competency protection; internal operation performance measurement; alliance performance measurement; knowledge management; and alliance decision making. The third module continues and proposes an alliance performance measurement system which includes 14 categories of performance indicators. An analysis template for alliance decision making is also proposed and integrated into the first module. To validate these three modules, seven manufacturing organisations (five in China and two in the UK) were investigated, and these field case studies are analysed in this thesis. The evidence found in these organisations, together with the evidence collected from the literature, including both researcher views and literature case studies, provides triangulated support for the modules. In addition, this thesis identifies the strength and weakness patterns of the manufacturing companies within the theoretical niche of this research, and clarifies the relationships among some major research areas from the perspective of virtual manufacturing. Finally, the research findings are summarised, together with their theoretical and practical implications.
Research limitations and recommendations for future work conclude this thesis.
Abstract:
Because poor quality semantic metadata can destroy the effectiveness of semantic web technology by hampering applications from producing accurate results, it is important to have frameworks that support its evaluation. However, no such framework has been developed to date. In this context, we proposed i) an evaluation reference model, SemRef, which sketches some fundamental principles for evaluating semantic metadata, and ii) an evaluation framework, SemEval, which provides a set of instruments to support the detection of quality problems and the collection of quality metrics for these problems. A preliminary case study of SemEval shows encouraging results.
Abstract:
Purpose – The purpose of this paper is to outline a seven-phase simulation conceptual modelling procedure that incorporates existing practice and embeds a process reference model (i.e. SCOR). Design/methodology/approach – An extensive review of the simulation and SCM literature identifies a set of requirements for a domain-specific conceptual modelling procedure. The associated design issues for each requirement are discussed and the utility of SCOR in the process of conceptual modelling is demonstrated using two development cases. Ten key concepts are synthesised and aligned to a general process for conceptual modelling. Further work is outlined to detail, refine and test the procedure with different process reference models in different industrial contexts. Findings – Simulation conceptual modelling is often regarded as the most important yet least understood aspect of a simulation project (Robinson, 2008a). Even today, there has been little research into guidelines to aid in the creation of a conceptual model. Design issues are discussed for building an ‘effective’ conceptual model, and the domain-specific requirements for modelling supply chains are addressed. The ten key concepts are incorporated to aid in describing the supply chain problem (i.e. components and relationships that need to be included in the model), model content (i.e. rules for determining the simplest model boundary and level of detail to implement the model) and model validation. Originality/value – The paper addresses Robinson’s (2008a) call for research in defining and developing new approaches for conceptual modelling and Manuj et al.’s (2009) discussion on improving the rigour of simulation studies in SCM. It is expected that more detailed guidelines will yield benefits to both expert modellers (i.e. averting typical modelling failures) and novice modellers (i.e. guided practice; less reliance on hopeful intuition).
Abstract:
This thesis is concerned with the role of diagenesis in forming ore deposits. Two sedimentary ‘ore-types’ have been examined: the Proterozoic copper-cobalt orebodies of the Konkola Basin on the Zambian Copperbelt, and the Permian Marl Slate of North East England. Facies analysis of the Konkola Basin shows the Ore-Shale to have formed in a subtidal to intertidal environment. A sequence of diagenetic events is outlined from which it is concluded that the sulphide ores are an integral part of the diagenetic process. Sulphur isotope data establish that the sulphides formed as a consequence of the bacterial reduction of sulphate, while the isotopic and geochemical composition of carbonates is shown to reflect changes in the compositions of diagenetic pore fluids. Geochemical studies indicate that the copper- and cobalt-bearing mineralising fluids probably had different sources. Veins which crosscut the orebodies contain hydrocarbon inclusions, and are shown to be of late diagenetic lateral secretion origin. Rb-Sr dating indicates that the Ore-Shale was subject to metamorphism at 529 ± 20 Myr. The sedimentology and petrology of the Marl Slate are described. Textural and geochemical studies suggest that much of the pyrite (framboidal) in the Marl Slate formed in an anoxic water column, while euhedral pyrite and base metal sulphides formed within the sediment during early diagenesis. Sulphur isotope data confirm that conditions were almost "ideal" for sulphide formation during Marl Slate deposition, the limiting factor in ore formation being the restricted supply of chalcophile elements. Carbon and oxygen isotope data, along with petrographic observations, indicate that much of the calcite and dolomite occurring in the Marl Slate is primary, and probably formed in isotopic equilibrium. A depositional model is proposed which explains all of the data presented and links the lithological variations with fluctuations in the anoxic/oxic boundary layer of the water column.
Abstract:
The purpose of this thesis is twofold: to examine the validity of the rotating-field and cross-field theories of the single-phase induction motor when applied to a cage rotor machine; and to examine the extent to which skin effect is likely to modify the characteristics of a cage rotor machine. A mathematical analysis is presented for a single-phase induction motor in which the rotor parameters are modified by skin effect. Although this is based on the usual type of ideal machine, a new form of model rotor allows approximations for skin effect phenomena to be included as an integral part of the analysis. Performance equations appropriate to the rotating-field and cross-field theories are deduced, and the corresponding explanations for the steady-state mode of operation are critically examined. The evaluation of the winding currents and developed torque is simplified by the introduction of new dimensionless factors which are functions of the resistance/reactance ratios of the rotor and the speed. Tables of the factors are included for selected numerical values of the parameter ratios, and these are used to deduce typical operating characteristics for both cage and wound rotor machines. It is shown that a qualitative explanation of the mode of operation of a cage rotor machine is obtained from either theory; but the operating characteristics must be deduced from the performance equations of the rotating-field theory, because of the restrictions on the values of the rotor parameters imposed by skin effect.
Abstract:
Suboptimal maternal nutrition during gestation results in the establishment of long-term phenotypic changes and an increased disease risk in the offspring. To elucidate how such environmental sensitivity results in physiological outcomes, the molecular characterisation of these offspring has become the focus of many studies. However, the likely modification of key cellular processes such as metabolism in response to maternal undernutrition raises the question of whether the genes typically used as reference constants in gene expression studies are suitable controls. Using a mouse model of maternal protein undernutrition, we have investigated the stability of seven commonly used reference genes (18s, Hprt1, Pgk1, Ppib, Sdha, Tbp and Tuba1) in a variety of offspring tissues including liver, kidney, heart, retro-peritoneal and inter-scapular fat, extra-embryonic placenta and yolk sac, as well as in the preimplantation blastocyst and blastocyst-derived embryonic stem cells. We find that although the selected reference genes are all highly stable within this system, they show tissue, treatment and sex-specific variation. Furthermore, software-based selection approaches rank reference genes differently and do not always identify genes which differ between conditions. Therefore, we recommend that reference gene selection for gene expression studies should be thoroughly validated for each tissue of interest. © 2011 Elsevier Inc.
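The observation that software-based selection approaches rank reference genes differently can be illustrated with even the simplest stability score, the coefficient of variation. The gene names below match those in the abstract, but the expression values and the helper function are hypothetical; real tools such as geNorm and NormFinder use pairwise-variation and model-based measures rather than a bare CV.

```python
import statistics

def stability_rank(expression):
    """Rank candidate reference genes from most to least stable by the
    coefficient of variation (stdev / mean) of their expression values."""
    cv = {gene: statistics.stdev(vals) / statistics.mean(vals)
          for gene, vals in expression.items()}
    return sorted(cv, key=cv.get)

# Hypothetical normalised expression values across four liver samples.
liver = {
    "Hprt1": [1.00, 1.05, 0.97, 1.02],
    "Sdha":  [1.00, 1.40, 0.70, 1.10],
    "Tbp":   [1.00, 1.02, 0.99, 1.01],
}
ranking = stability_rank(liver)
```

A different tissue or treatment would supply different values and could reorder the ranking, which is why the abstract recommends validating the selection for each tissue of interest.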
Abstract:
Video streaming via Transmission Control Protocol (TCP) networks has become a popular and highly demanded service, but its quality assessment in both objective and subjective terms has not been properly addressed. In this paper, based on statistical analysis, a full analytic model of a no-reference objective metric, namely pause intensity (PI), for video quality assessment is presented. The model characterizes the video playout buffer behavior in connection with the network performance (throughput) and the video playout rate. This allows for instant quality measurement and control without requiring a reference video. PI specifically addresses the need for assessing quality in terms of the continuity of playout of TCP streaming videos, which cannot be properly measured by other objective metrics such as peak signal-to-noise ratio, structural similarity, and buffer underrun or pause frequency. The performance of the analytical model is rigorously verified by simulation results and subjective tests using a range of video clips. It is demonstrated that PI is closely correlated with viewers' opinion scores regardless of the vastly different composition of individual elements, such as pause duration and pause frequency, which jointly constitute this new quality metric. It is also shown that the correlation performance of PI is consistent and content independent. © 2013 IEEE.
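The buffer relationship the model captures can be illustrated with a toy discrete-time playout-buffer simulation. This is not the paper's analytic model: the function name, parameters and the simplified definition used here (fraction of session time spent stalled) are assumptions for illustration only.

```python
def pause_intensity(throughput, playout_rate, duration, start_threshold):
    """Fraction of a streaming session spent stalled (paused).

    Toy one-second time-step model: data arrives at `throughput`, playout
    consumes `playout_rate`, and playback (re)starts only once the buffer
    holds `start_threshold` worth of data. Units are arbitrary but
    consistent (e.g. kbit and kbit/s).
    """
    buffer, paused, pause_time = 0.0, True, 0
    for _ in range(duration):
        buffer += throughput
        if paused:
            pause_time += 1
            if buffer >= start_threshold:
                paused = False          # enough data buffered: resume
        elif buffer >= playout_rate:
            buffer -= playout_rate      # normal playout
        else:
            buffer, paused = 0.0, True  # underrun: playout stalls
            pause_time += 1
    return pause_time / duration
```

When throughput exceeds the playout rate only the initial buffering contributes pauses; when it falls short, recurring underruns drive the metric up, which is the throughput/playout-rate connection the abstract describes.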
Abstract:
A sieve plate distillation column has been constructed and interfaced to a minicomputer with the necessary instrumentation for dynamic, estimation and control studies, with special bearing on low-cost and noise-free instrumentation. A dynamic simulation of the column with a binary liquid system has been compiled using deterministic models that include fluid dynamics via Brambilla's equation for tray liquid holdup calculations. The simulation predictions have been tested experimentally under steady-state and transient conditions. The simulator's predictions of the tray temperatures have shown reasonably close agreement with the measured values under steady-state conditions and in the face of a step change in the feed rate. A method of extending linear filtering theory to highly nonlinear systems with very nonlinear measurement functional relationships has been proposed and tested by simulation on binary distillation. The simulation results have proved that the proposed methodology can overcome the typical instability problems associated with the Kalman filters. Three extended Kalman filters have been formulated and tested by simulation. The filters have been used to refine a much simplified model sequentially and to estimate parameters such as the unmeasured feed composition using information from the column simulation. It is first assumed that corrupted tray composition measurements are made available to the filter and then corrupted tray temperature measurements are accessed instead. The simulation results have demonstrated the powerful capability of the Kalman filters to overcome the typical hardware problems associated with the operation of on-line analyzers in relation to distillation dynamics and control by, in effect, replacing them. A method of implementing estimator-aided feedforward (EAFF) control schemes has been proposed and tested by simulation on binary distillation.
The results have shown that the EAFF scheme provides much better control and energy conservation than the conventional feedback temperature control in the face of a sustained step change in the feed rate or multiple changes in the feed rate, composition and temperature. Further extensions of this work are recommended as regards simulation, estimation and EAFF control.
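The sequential estimation idea (refining a simplified model from noisy measurements) can be sketched with a scalar Kalman filter. This is a generic textbook filter, not the extended formulations developed in the thesis; the random-walk parameter model, the noise settings and the variable names are illustrative assumptions.

```python
import random

def kalman_step(x, P, z, Q, R):
    """One predict/update cycle of a scalar Kalman filter.

    State model: the unmeasured parameter follows a random walk (x stays
    constant apart from small process noise Q); z is a direct, noisy
    measurement of x with variance R.
    """
    P_pred = P + Q                   # predict: uncertainty grows slightly
    K = P_pred / (P_pred + R)        # Kalman gain
    x_new = x + K * (z - x)          # correct by the innovation
    P_new = (1 - K) * P_pred         # uncertainty shrinks after the update
    return x_new, P_new

# Track a hypothetical constant parameter (e.g. a feed composition of 0.5)
# starting from a poor initial guess with high uncertainty.
random.seed(1)
x, P = 0.0, 1.0
for _ in range(200):
    z = 0.5 + random.gauss(0, 0.05)  # corrupted measurement
    x, P = kalman_step(x, P, z, Q=1e-6, R=0.0025)
```

After a few hundred corrupted measurements the estimate converges close to the true value and the filter's uncertainty collapses, which is the sense in which such a filter can, in effect, replace an on-line analyzer.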
Abstract:
Background: The importance of appropriate normalization controls in quantitative real-time polymerase chain reaction (qPCR) experiments has become more apparent as the number of biological studies using this methodology has increased. In developing a system to study gene expression from transiently transfected plasmids, it became clear that normalization using chromosomally encoded genes is not ideal, as it does not take into account the transfection efficiency and the significantly lower expression levels of the plasmids. We have developed and validated a normalization method for qPCR using a co-transfected plasmid. Results: The best chromosomal gene for normalization in the presence of the transcriptional activators used in this study (cadmium, dexamethasone, forskolin and phorbol 12-myristate 13-acetate) was first identified. qPCR data were analyzed using geNorm, NormFinder and BestKeeper. Each software application was found to rank the normalization controls differently, with no clear correlation. Including a co-transfected plasmid encoding the Renilla luciferase gene (Rluc) in this analysis showed that its calculated stability was not as good as that of the optimised chromosomal genes, most likely as a result of the lower expression levels and transfection variability. Finally, we validated these analyses by testing two chromosomal genes (B2M and ActB) and a co-transfected gene (Rluc) under biological conditions. When analyzing co-transfected plasmids, Rluc normalization gave the smallest errors compared to the chromosomal reference genes. Conclusions: Our data demonstrate that transfected Rluc is the most appropriate normalization reference gene for transient transfection qPCR analysis; it significantly reduces the standard deviation within biological experiments, as it takes into account transfection efficiencies and has easily controllable expression levels. This improves reproducibility, data validity and, most importantly, enables accurate interpretation of qPCR data. © 2010 Jiwaji et al; licensee BioMed Central Ltd.
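The advantage of a co-transfected reference can be illustrated with the standard delta-Ct calculation. The Ct values below are hypothetical; the point is that normalising to co-transfected Rluc cancels a well-to-well transfection-efficiency difference that a chromosomal gene, unaffected by transfection, could not correct.

```python
def relative_expression(ct_target, ct_reference):
    """Delta-Ct method: expression of the target relative to the reference,
    assuming ~100% amplification efficiency (one doubling per cycle)."""
    return 2.0 ** -(ct_target - ct_reference)

# Two hypothetical wells; well B received half as much plasmid, so both the
# target and the co-transfected Rluc reference shift by one cycle together.
well_a = relative_expression(ct_target=24.0, ct_reference=20.0)
well_b = relative_expression(ct_target=25.0, ct_reference=21.0)
# Normalised values agree despite the transfection difference.
```

Had the reference been chromosomal, well B's target Ct would still shift but the reference Ct would not, and the two wells would report different expression levels for the same biology.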
Abstract:
This chapter examines the contexts in which people will process more deeply, and therefore be more influenced by, a position that is supported by either a numerical majority or minority. The chapter reviews the major theories of majority and minority influence with reference to which source condition is associated with most message processing (and where relevant, the contexts under which this occurs) and experimental research examining these predictions. The chapter then presents a new theoretical model (the source-context-elaboration model, SCEM) that aims to integrate the disparate research findings. The model specifies the processes underlying majority and minority influence, the contexts under which these processes occur and the consequences for attitudes changed by majority and minority influence. The chapter then describes a series of experiments that address each of the aspects of the theoretical model. Finally, a range of research-related issues are discussed and future issues for the research area as a whole are considered.
Abstract:
This study has been conceived with the primary objective of identifying and evaluating the financial aspects of the transformation in country/company relations of the international oil industry from the traditional concessionary system to the system of governmental participation in the ownership and operation of oil concessions. The emphasis of the inquiry was placed on assembling a case study of the oil exploitation arrangements of Libya. Through a comprehensive review of the literature, the sociopolitical factors surrounding the international oil business were identified and examined in an attempt to see their influence on contractual arrangements and particularly to gauge the impact of any induced contractual changes on the revenue benefit accruing to the host country from its oil operations. Some comparative analyses were made in the study to examine the viability of the Libyan participation deals both as an investment proposal and as a system of conducting oil activities in the country. The analysis was carried out in the light of specific hypotheses to assess the relative impact of the participation scheme in comparison with the alternative concessionary model on the net revenue resulting to the government from oil operations and the relative effect on the level of research and development within the industry. A discounted cash flow analysis was conducted to measure inputs and outputs of the comparative models and judge their revenue benefits. Then an empirical analysis was carried out to detect any significant behavioural changes in the exploration and development effort associated with the different oil exploitation systems. Results of the investigation of revenues support the argument that the mere introduction of the participation system has not resulted in a significant revenue benefit to the host government. 
Though there has been a significant increase in government revenue, associated with the period following the emergence of the participation agreements, this increase was mainly due to socio-economic factors other than the participation scheme. At the same time the empirical results have shown an association of the participation scheme with a decline of the oil industry's research and development efforts.
Abstract:
‘New Approach’ Directives now govern the health and safety of most products, whether destined for workplace or domestic use. These Directives have been enacted into UK law by various specific legislation principally relating to work equipment, machinery and consumer products. This research investigates whether the risk assessment approach used to ensure the safety of machinery may be applied to consumer products. Crucially, consumer products are subject to the Consumer Protection Act (CPA) 1987, where there is no direct reference to “assessing risk”. This contrasts with the law governing the safety of products used in the workplace, where risk assessment underpins the approach. New Approach Directives are supported by European harmonised standards and, in the case of machinery, further supported by the risk assessment standard, EN 1050. The system regulating consumer product safety is discussed, its key elements identified and a graphical model produced. This model incorporates such matters as conformity assessment, the system of regulation, and near miss and accident reporting. A key finding of the research is that New Approach Directives share a common feature of specifying essential performance requirements that provide a hazard prompt-list which can form the basis for a risk assessment (the hazard identification stage). Drawing upon 272 prosecution cases, with thirty examples examined in detail, this research provides evidence that despite the high degree of regulation, unsafe consumer products still find their way onto the market. The research presents a number of risk assessment tools to help Trading Standards Officers (TSOs) prioritise their work at the initial inspection stage and when dealing with subsequent enforcement action.
Abstract:
A study has been made of drugs acting at 5-HT receptors in animal models of anxiety. An elevated X-maze was used as a model of anxiety for rats, and the actions of various ligands for the 5-HT receptor, and its subtypes, were examined in this model. 5-HT agonists, with varying affinities for the 5-HT receptor subtypes, were demonstrated to have anxiogenic-like activity. The 5-HT2 receptor antagonists ritanserin and ketanserin exhibited an anxiolytic-like profile. The new putative anxiolytics ipsapirone and buspirone, which are believed to be selective for 5-HT1 receptors, were also examined. The former had an anxiolytic profile whilst the latter was without effect. Antagonism studies showed the anxiogenic response to 8-hydroxy-2-(di-n-propylamino)tetralin (8-OH-DPAT) to be antagonised by ipsapirone, pindolol, alprenolol and para-chlorophenylalanine, but not by diazepam, ritanserin, metoprolol, ICI 118,551 or buspirone. To confirm some of the results obtained in the elevated X-maze, the Social Interaction Test of anxiety was used. Results in this test mirrored the effects seen with the 5-HT agonists, ipsapirone and pindolol, whilst the 5-HT2 receptor antagonists were without effect. Studies using operant conflict models of anxiety produced marginal and varying results, which appear to be in agreement with recent criticisms of such models. Finally, lesions of the dorsal raphe nucleus (DRN) were performed in order to investigate the mechanisms involved in the production of the anxiogenic response to 8-OH-DPAT. Overall the results lend support to the involvement of 5-HT, and more precisely 5-HT1, receptors in the manifestation of anxiety in such animal models.