921 results for Voice of Process
Abstract:
Proper maintenance of plant items is crucial for the safe and profitable operation of process plants. The relevant maintenance policies fall into the following four categories: (i) preventive/opportunistic/breakdown replacement policies, (ii) inspection/inspection-repair-replacement policies, (iii) restorative maintenance policies, and (iv) condition-based maintenance policies. For correlating failure times of component equipment and complete systems, the Weibull failure distribution has been used. A new, powerful method, SEQLIM, has been proposed for the estimation of the Weibull parameters, particularly when maintenance records contain very few failures and many successful operation times. When a system consists of a number of replaceable, ageing components, an opportunistic replacement policy has been found to be cost-effective, and a simple opportunistic model has been developed. Inspection models with various objective functions have been investigated. It was found that, on the assumption of a negative exponential failure distribution, all models converge to the same optimal inspection interval, provided the safety components are very reliable and the demand rate is low. When deterioration becomes a contributory factor to some failures, periodic inspections calculated from the above models are too frequent; a case of safety trip systems has been studied. A highly effective restorative maintenance policy can be developed if the performance of the equipment under this category can be related to some predictive modelling; a novel fouling model has been proposed to determine cleaning strategies for condensers. Condition-based maintenance policies have been investigated, and a simple gauge has been designed for condition monitoring of relief valve springs. A typical case of an exothermic inert gas generation plant has been studied to demonstrate how the various policies can be applied to devise overall maintenance actions.
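For context only, the sketch below shows a generic maximum-likelihood fit of a two-parameter Weibull distribution to maintenance records containing both failures and right-censored "successful operation" times. It is not the SEQLIM method proposed in the thesis; the data values and starting guesses are hypothetical.

```python
# Generic censored-data MLE for a two-parameter Weibull distribution.
# Illustrative only; NOT the thesis's SEQLIM estimator. Data are hypothetical.
import numpy as np
from scipy.optimize import minimize

failures = np.array([1200.0, 1850.0, 2400.0])          # hours to failure
censored = np.array([900.0, 1500.0, 2000.0, 2600.0])   # successful operation times

def neg_log_likelihood(params):
    beta, eta = params                                   # shape, scale
    if beta <= 0 or eta <= 0:
        return np.inf
    # log-density terms for observed failures
    ll_fail = np.sum(np.log(beta / eta)
                     + (beta - 1) * np.log(failures / eta)
                     - (failures / eta) ** beta)
    # log-survival terms for units still running at the censoring time
    ll_cens = np.sum(-(censored / eta) ** beta)
    return -(ll_fail + ll_cens)

res = minimize(neg_log_likelihood, x0=[1.5, 2000.0], method="Nelder-Mead")
beta_hat, eta_hat = res.x
print(f"shape (beta) = {beta_hat:.2f}, scale (eta) = {eta_hat:.2f}")
```

The censored times enter the likelihood only through the Weibull survival function, which is why records dominated by successful operation times, as described above, still constrain the estimates.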
Abstract:
The aim of this work has been to investigate the principle of combined centrifugal bioreaction-separation. The production of dextran and fructose by the action of the enzyme dextransucrase on sucrose was employed to elucidate some of the principles of this type of process. Dextran is a valuable pharmaceutical product used mainly as a blood volume expander and blood flow improver whilst fructose is an important dietary product. The development of a single step process capable of the simultaneous biosynthesis of dextran and the separation of the fructose by-product should improve dextran yields whilst reducing capital and processing costs. This thesis shows for the first time that it is possible to conduct successful bioreaction-separations using a rate-zonal centrifugation technique. By layering thin zones of dextransucrase enzyme onto sucrose gradients and centrifuging, very high molecular weight (MW) dextran-enzyme complexes were formed that rapidly sedimented through the sucrose substrate gradients under the influence of the applied centrifugal field. The low MW fructose by-product sedimented at reduced rates and was thus separated from the enzyme and dextran during the reaction. The MW distribution of dextran recovered from the centrifugal bioreactor was compared with that from a conventional batch bioreactor. The results indicated that the centrifugal bioreactor produced up to 100% more clinical dextran with MWs between 12 000 and 98 000 at 20% w/w sucrose concentrations than conventional bioreactors. This was due to the removal of acceptor fructose molecules from the sedimenting reaction zone by the action of the centrifugal field. Higher proportions of unwanted lower MW dextran were found in the conventional bioreactor than in the centrifugal bioreactor-separator. The process was studied on a number of alternative centrifugal systems. A zonal rotor fitted with a reorienting gradient core proved most successful for the evaluation of bioreactor performance. Results indicated that viscosity build-up in the reactor must be minimised in order to increase the yields of dextran per unit time and improve product separation. A preliminary attempt at modelling the process has also been made.
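As background to the separation principle, the standard Stokes-law sedimentation velocity in a centrifugal field (a textbook relation, not the thesis's own model) indicates why the very large dextran-enzyme complexes outrun the small fructose molecules:

$$ v = \frac{d^{2}\,(\rho_{p} - \rho_{m})}{18\,\mu}\,\omega^{2} r $$

where $d$ is the effective particle diameter, $\rho_{p}$ and $\rho_{m}$ are the particle and medium densities, $\mu$ is the medium viscosity, $\omega$ is the angular velocity and $r$ is the radial position. The quadratic dependence on size, and the inverse dependence on viscosity, are consistent with the abstract's observation that viscosity build-up limits both yield and separation.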
Abstract:
The initial aim of this research was to investigate the application of Expert Systems, or Knowledge Base Systems technology, to the automated synthesis of Hazard and Operability Studies. Due to the generic nature of Fault Analysis problems and the way in which Knowledge Base Systems work, this goal has evolved into a consideration of automated support for Fault Analysis in general, covering HAZOP, Fault Tree Analysis, FMEA and Fault Diagnosis in the Process Industries. This thesis describes a proposed architecture for such an Expert System. The purpose of the system is to produce a descriptive model of faults and fault propagation from a description of the physical structure of the plant. From these descriptive models, the desired Fault Analysis may be produced. The way in which this is done reflects the complexity of the problem which, in principle, encompasses the whole of the discipline of Process Engineering. An attempt is made to incorporate the perceived method that an expert uses to solve the problem; keywords, heuristics and guidelines from techniques such as HAZOP and Fault Tree Synthesis are used. In a true Expert System, the performance of the system is strongly dependent on the quality of the knowledge that is incorporated. This expert knowledge takes the form of heuristics, or rules of thumb, which are used in problem solving. This research has shown that, for the application of fault analysis heuristics, it is necessary to have a representation of the details of fault propagation within a process. This helps to ensure the robustness of the system - a gradual rather than abrupt degradation at the boundaries of the domain knowledge.
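As an illustration of the kind of representation discussed above (not the architecture actually proposed in the thesis), the minimal sketch below encodes fault-propagation heuristics as rules applied over a description of plant structure; all unit names, deviations and rules are hypothetical.

```python
# Minimal illustrative sketch: rule-based fault propagation over a plant
# structure description. Hypothetical units and rules; not the thesis system.

# plant structure: which unit feeds which
connections = {"pump_P1": ["pipe_L1"], "pipe_L1": ["reactor_R1"], "reactor_R1": []}

# local heuristics: how a fault or incoming deviation becomes an outgoing deviation
local_rules = {
    "pump_P1":    {"pump stopped": "no flow"},
    "pipe_L1":    {"no flow": "no flow"},
    "reactor_R1": {"no flow": "low level"},
}

def propagate(unit, cause, seen=None):
    """Trace how an initiating fault propagates downstream through the plant."""
    seen = [] if seen is None else seen
    deviation = local_rules.get(unit, {}).get(cause)
    if deviation is None or (unit, deviation) in seen:
        return seen
    seen.append((unit, deviation))
    for downstream in connections.get(unit, []):
        propagate(downstream, deviation, seen)
    return seen

print(propagate("pump_P1", "pump stopped"))
# [('pump_P1', 'no flow'), ('pipe_L1', 'no flow'), ('reactor_R1', 'low level')]
```

An explicit fault-propagation model of this kind is what allows heuristics to degrade gracefully rather than fail abruptly at the edge of the encoded knowledge.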
Abstract:
This thesis begins with a sociolinguistic correlational study of three phonetic variables - (h), (t) and (ing) - as used by four occupational groups - nurses, chefs, hairdressers and taxi-drivers. The groups were selected to incorporate three independent variables: sex (male-dominated versus female-dominated occupations); training (length and specialisation - nurses and chefs being more specialised than hairdressers and taxi-drivers); and location (the populations were selected from two cities - Liverpool and Birmingham). Although the correlational work demonstrates intra-sex and occupational consistency in speakers' choice of linguistic variants (females (particularly nurses) being significantly closer to the prestige norm), it is essentially non-explanatory and cannot account for narrative dynamics and style shift. Therefore, an in-depth qualitative examination of the data (which draws mainly on Narrative and Discourse Analysis) forms the major part of the analysis. The study first analyses features common to all the narratives, with direct speech, expressive phonology and linguistic ambiguity emerging as characteristic of all humorous storytelling. Secondly, three major sources of inter-personal variation are investigated: narrator perspective, sex and occupational role. Perspective is found to vary with topic and personality, greater narrator involvement coinciding with a higher proportion of internal evaluation devices. Sex differences include topic choice and bonding in the storytelling sessions. Sex differences are also evident in style shifting, where the narrator mimics the voice of a character in the narrative (adopting segmental and/or prosodic tokens to signal a change of persona). The research finds that female narrators rarely employ segmental accommodation downwards on the social scale (whereas men do), but are on the other hand adept at using prosodic effects for mimicry. Taxi-drivers emerge as the group with the most distinctive narrative flair, a fact which is related to their occupation. The conclusion stresses a need for both quantitative and qualitative approaches to data; the importance of occupational role, as opposed to sex role per se, in determining narrative conventions; the view of narrative as a negotiable entity, which is the product of relationships among participants; and the importance of considering the totality of the communicative act.
Abstract:
This thesis is based upon a case study of the adoption of digital, electronic, microprocessor-based control systems by Albright & Wilson Limited - a UK chemical producer. It offers an explanation of the company's changing technology policy between 1978 and 1981, by examining its past development, internal features and industrial environment. Part One of the thesis gives an industry-level analysis which relates the development of process control technology to changes in the economic requirements of production. The rapid diffusion of microcomputers and other microelectronic equipment in the chemical industry is found to be a response to the general need to raise the efficiency of all processes, imposed by the economic recession following 1973. Part Two examines the impact of these technical and economic changes upon Albright & Wilson Limited. The company's slowness in adopting new control technology is explained by its long history, in which trends are identified which produced the conservatism of the 1970s. By contrast, a study of Tenneco Incorporated, a much more successful adopter of automating technology, is offered, with an analysis of the new technology policy of adoption of such equipment which it imposed upon Albright & Wilson following the latter's takeover by Tenneco in 1978. Some indications of the consequences of this new policy of widespread adoption of microprocessor-based control equipment are derived from a study of the first Albright & Wilson plant to use such equipment. The thesis concludes that companies which fail to adopt the new control technology rapidly may not survive in the recessionary environment, that long-established British companies may lack the flexibility to make such necessary changes, and that multi-national companies may have an important role in the planned transfer and adoption of new production technology through their subsidiaries in the UK.
Abstract:
Time after time… and aspect and mood. Over the last twenty-five years, the study of time, aspect and - to a lesser extent - mood acquisition has enjoyed increasing popularity and a constant widening of its scope. In such a teeming field, what can be the contribution of this book? We believe that it is unique in several respects. First, this volume encompasses studies from different theoretical frameworks: functionalism vs generativism, or function-based vs form-based approaches. It also brings together various sub-fields (first and second language acquisition, child and adult acquisition, bilingualism) that tend to evolve in parallel rather than learn from each other. A further originality is that it focuses on a wide range of typologically different languages, and features less studied languages such as Korean and Bulgarian. Finally, the book gathers some well-established scholars, young researchers, and even research students, in a rich inter-generational exchange that ensures the survival but also the renewal and refreshment of the discipline. The book at a glance: the first part of the volume is devoted to the study of child language acquisition in monolingual, impaired and bilingual acquisition, while the second part focuses on adult learners. In this section, we provide an overview of each chapter. The first study, by Aviya Hacohen, explores the acquisition of compositional telicity in Hebrew L1. Her psycholinguistic approach contributes valuable data to refine theoretical accounts. Through an innovative methodology, she gathers information from adults and children on the influence of definiteness, number, and the mass vs countable distinction on the constitution of a telic interpretation of the verb phrase. She notices that the notion of definiteness is mastered by children as young as 10, while the mass/count distinction does not appear before 10;7. However, this does not entail an adult-like use of telicity. She therefore concludes that, beyond definiteness and noun type, pragmatics may play an important role in the derivation of Hebrew compositional telicity. For the second chapter we move from a Semitic language to a Slavic one. Milena Kuehnast focuses on the acquisition of negative imperatives in Bulgarian, a form that presents the specificity of being grammatical only with the imperfective form of the verb. The study examines how 40 Bulgarian children, distributed in two age groups (15 between 2;11 and 3;11, and 25 between 4;00 and 5;00), develop with respect to the acquisition of imperfective viewpoints and the use of imperfective morphology. It shows an evolution in the recourse to expression of force in the use of negative imperatives, as well as the influence of morphological complexity on the successful production of forms. With Yi-An Lin's study, we turn both to a different type of informant and to a different framework. Indeed, he studies the production of children suffering from Specific Language Impairment (SLI), a developmental language disorder the causes of which exclude cognitive impairment, psycho-emotional disturbance, and motor-articulatory disorders. Using the Leonard corpus in CLAN, Lin aims to test two competing accounts of SLI (the Agreement and Tense Omission Model [ATOM] and his own Phonetic Form Deficit Model [PFDM]) that conflict on the role attributed to spellout in the impairment.
Spellout is the point at which the Computational System for Human Language (CHL) hands over the most recently derived part of the derivation to the interface components, Phonetic Form (PF) and Logical Form (LF). ATOM claims that SLI sufferers have a deficit in their syntactic representation, while PFDM suggests that the problem only occurs at the spellout level. After studying the corpus from the point of view of tense/agreement marking, case marking, argument movement and auxiliary inversion, Lin finds further support for his model. Olga Gupol, Susan Rohstein and Sharon Armon-Lotem's chapter offers a welcome bridge between child language acquisition and multilingualism. Their study explores the influence of intensive exposure to L2 Hebrew on the development of L1 Russian tense and aspect morphology through an elicited narrative. Their informants are 40 Russian-Hebrew sequential bilingual children distributed in two age groups, 4;0-4;11 and 7;0-8;0. They come to the conclusion that bilingual children anchor their narratives in the perfective, like monolinguals. However, while aware of grammatical aspect, bilinguals lack the full form-function mapping and tend to overgeneralize the imperfective on the principles of simplicity (as imperfectives are the least morphologically marked forms), universality (as the imperfective covers more functions) and interference. Rafael Salaberry opens the second section on foreign language learners. In his contribution, he reflects on the difficulty L2 learners of Spanish encounter when it comes to distinguishing between iterativity (conveyed with the use of the preterite) and habituality (expressed through the imperfect). He examines in turn the theoretical views that see, on the one hand, habituality as part of grammatical knowledge and iterativity as pragmatic knowledge, and, on the other hand, both habituality and iterativity as grammatical knowledge. He comes to the conclusion that the use of the preterite as a default past-tense marker may explain the impoverished system of aspectual distinctions, not only at beginner but also at advanced levels, which may indicate that the system is differentially represented between L1 and L2 speakers. Acquiring the vast array of functions conveyed by a form is therefore no mean feat, as confirmed by the next study. Based on prototype theory, Kathleen Bardovi-Harlig's chapter focuses on the development of the progressive in L2 English. It opens with an overview of the functions of the progressive in English. Then a review of acquisition research on the progressive in English and other languages is provided. The bulk of the chapter reports on a longitudinal study of 16 learners of L2 English and shows how their use of the progressive expands from the prototypical uses of process and continuousness to the less prototypical uses of repetition and future. The study concludes that the progressive spreads in interlanguage in accordance with prototype accounts. However, it suggests additional stages, not predicted by the Aspect Hypothesis, in the development from activities and accomplishments, at least for the meaning of repeatedness. A similar theoretical framework is adopted in the following chapter, but it deals with a lesser-studied language. Hyun-Jin Kim revisits the claims of the Aspect Hypothesis in relation to the acquisition of L2 Korean by two L1 English learners.
Inspired by studies on L2 Japanese, she focuses on the emergence and spread of the past/perfective marker -ess- and the progressive -ko iss- in the interlanguage of her informants throughout their third and fourth semesters of study. The data, collected through six sessions of conversational interviews and picture description tasks, seem to support the Aspect Hypothesis. Indeed, learners show a strong association between past tense and accomplishments/achievements at the start and a gradual extension to other types; a limited use of the past/perfective marker with states; and an affinity of the progressive with activities/accomplishments and later achievements. In addition, -ko iss- moves from progressive to resultative in the specific category of Korean verbs meaning wear/carry. While the previous contributions focus on function, Evgeniya Sergeeva and Jean-Pierre Chevrot's contribution is interested in form. The authors explore the acquisition of verbal morphology in L2 French by 30 instructed native speakers of Russian distributed across low and high proficiency levels. They use an elicitation task for verbs with different models of stem alternation and study how token frequency and base forms influence stem selection. The analysis shows that frequency affects correct production, especially among learners with high proficiency. As for substitution errors, it appears that forms with a simple structure are systematically more frequent than the target form they replace. When a complex form serves as a substitute, it is more frequent only when it is replacing another complex form. As regards the use of base forms, the 3rd person singular of the present - and to some extent the infinitive - play this role in the corpus. The authors therefore conclude that the processing of surface forms can be influenced positively or negatively by the frequency of the target forms and of other competing stems, and by the proximity of the target stem to a base form. Finally, Martin Howard's contribution takes up the challenge of focusing on the poorer relation of the TAM system. On the basis of L2 French data obtained through sociolinguistic interviews, he studies the expression of futurity, conditional and subjunctive in three groups of university learners with classroom teaching only (two or three years of university teaching) or with a mixture of classroom teaching and naturalistic exposure (2 years at university + 1 year abroad). An analysis of relative frequencies leads him to suggest a continuum of use going from the futurate present to the conditional with past hypothetical conditional clauses in si, which needs to be confirmed by further studies. Acknowledgements: The present volume was inspired by the conference Acquisition of Tense - Aspect - Mood in First and Second Language, held on 9th and 10th February 2008 at Aston University (Birmingham, UK), where over 40 delegates from four continents and over a dozen countries met for lively and enjoyable discussions.
This collection of papers was double peer-reviewed by an international scientific committee made up of Kathleen Bardovi-Harlig (Indiana University), Christine Bozier (Lund Universitet), Alex Housen (Vrije Universiteit Brussel), Martin Howard (University College Cork), Florence Myles (Newcastle University), Urszula Paprocka (Catholic University of Lublin), †Clive Perdue (Université Paris 8), Michel Pierrard (Vrije Universiteit Brussel), Rafael Salaberry (University of Texas at Austin), Suzanne Schlyter (Lund Universitet), Richard Towell (Salford University), and Daniel Véronique (Université d’Aix-en-Provence). We are very much indebted to that scientific committee for their insightful input at each step of the project. We are also grateful to the Association for French Language Studies for its financial support through its workshop grant, and to the Aston Modern Languages Research Foundation for funding the proofreading of the manuscript.
Abstract:
In today's market, global competition has put manufacturing businesses under great pressure to respond rapidly to dynamic variations in demand patterns across products and changing product mixes. To achieve substantial responsiveness, the manufacturing activities associated with production planning and control must be integrated dynamically, efficiently and cost-effectively. This paper presents an iterative agent bidding mechanism, which performs dynamic integration of process planning and production scheduling to generate optimised process plans and schedules in response to dynamic changes in the market and production environment. The iterative bidding procedure is carried out based on currency-like metrics, in which all operations (e.g. machining processes) to be performed are assigned virtual currency values, and resource agents bid for the operations if the costs incurred in performing them are lower than the currency values. The currency values are adjusted iteratively and resource agents re-bid for the operations based on the new set of currency values until the total production cost is minimised. A simulated annealing optimisation technique is employed to optimise the currency values iteratively. The feasibility of the proposed methodology has been validated using a test case, and the results obtained show that the method outperforms non-agent-based methods.
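A highly simplified sketch of the currency-based bidding idea described in the abstract is given below; the operations, agent costs, penalty value and annealing schedule are hypothetical and do not reproduce the paper's algorithm.

```python
# Toy sketch of iterative agent bidding with simulated annealing over
# currency values. Hypothetical numbers; not the paper's implementation.
import math
import random

random.seed(0)
operations = ["op1", "op2", "op3"]
agent_costs = {                       # cost for each agent to perform each operation
    "machine_A": {"op1": 8.0, "op2": 12.0, "op3": 6.0},
    "machine_B": {"op1": 9.0, "op2": 10.0, "op3": 7.5},
}

def total_cost(currency):
    """An operation is won by the cheapest agent whose cost is below its
    currency value; unassigned operations incur a fixed penalty."""
    cost = 0.0
    for op in operations:
        bids = [c[op] for c in agent_costs.values() if c[op] <= currency[op]]
        cost += min(bids) if bids else 50.0
    return cost

current = {op: 5.0 for op in operations}   # deliberately low: all ops start unassigned
best = dict(current)
temperature = 5.0
for _ in range(2000):
    candidate = dict(current)
    op = random.choice(operations)
    candidate[op] = max(0.0, candidate[op] + random.uniform(-2.0, 2.0))
    delta = total_cost(candidate) - total_cost(current)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        current = candidate
        if total_cost(current) < total_cost(best):
            best = dict(current)
    temperature *= 0.999

print(best, total_cost(best))
```

Here the annealing loop simply searches for currency values under which every operation attracts a feasible low-cost bid; the mechanism in the paper couples this adjustment to the process planning and scheduling decisions themselves.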
Abstract:
Stanley Hoffmann is one of the most eminent political scholars of our age—a renowned authority in the study of French, European, and world politics over half a century, an influential theorist of international relations, a critical analyst of US foreign policy, and a voice of moral conscience in many public debates of his time. Hoffmann has always asked big questions—and to those questions he brings an encyclopedic mind that crosses boundaries between politics, history, sociology, law, philosophy, ethics, and literature. This brief article highlights some aspects of his life and work, and introduces a symposium in his honor bringing together five leading scholars on France, Europe, international relations, and international law—each with an enduring debt to the teaching, writings and example of Stanley Hoffmann.
Abstract:
Liposomes provide an efficient delivery system for the solubilisation and delivery of both small molecules and macromolecules. However, they suffer from the disadvantage of instability when stored as aqueous dispersions. Cryoprotection of liposomal systems provides an effective approach to overcoming poor stability whilst maintaining formulation characteristics, although the formulation of a freeze-dried product requires not only the selection of an appropriate cryoprotectant but also careful consideration of the processing parameters, including pre-freezing conditions and primary and secondary drying protocols, along with optimisation of the cryoprotectant concentration. The current work investigates the application of amino acids as potential cryoprotectants for the stabilisation of liposomes, and the results indicate that amino acids show a biphasic pattern of stabilisation, with 4 mol of cryoprotectant per mole of lipid exhibiting optimum cryoprotection. The investigation of process parameters showed that pre-freezing temperatures below the glass transition of the amino acids, followed by drying for over 6 h, resulted in stable formulations. Studies investigating the efficiency of drug retention showed that the cryoprotection offered by lysine was similar to that shown by trehalose, suggesting that amino acids act as effective stabilisers. ESEM analysis was carried out to monitor the morphology of the rehydrated liposomes. © 2007 Elsevier B.V. All rights reserved.
Abstract:
Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We acquired electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) data in separate recording sessions during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG-BA11) activation, followed by identification of the contradictory statement, associated with right inferior frontal gyrus (rIFG-BA47) activation. The right medial frontal gyrus (rMeFG, BA10) and anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing times and stronger brain responses for inductive logic suggested that examples are easier to process than general principles and are more likely to simplify communication. © 2014 Porcaro et al.
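For readers unfamiliar with the Aristotelian Square of Opposition, the short sketch below lists the four categorical proposition forms and their contradictory pairs; it is standard logical background, not the stimulus material used in the study.

```python
# The four categorical forms of the Aristotelian Square of Opposition and
# their contradictory pairs (standard background, not the study's stimuli).
forms = {
    "A": "All S are P",        # universal affirmative
    "E": "No S is P",          # universal negative
    "I": "Some S are P",       # particular affirmative
    "O": "Some S are not P",   # particular negative
}
contradictories = [("A", "O"), ("E", "I")]   # in each pair, exactly one is true

for x, y in contradictories:
    print(f"'{forms[x]}' contradicts '{forms[y]}'")
```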
Abstract:
Reactive, but not a reactant. Heterogeneous catalysts play an unseen role in many of today's processes and products. With the increasing emphasis on sustainability in both products and processes, this handbook is the first to combine the hot topics of heterogeneous catalysis and clean technology. It focuses on the development of heterogeneous catalysts for use in clean chemical synthesis, dealing with how modern spectroscopic techniques can aid the design of catalysts for use in liquid phase reactions, their application in industrially important chemistries - including selective oxidation, hydrogenation, solid acid- and base-catalyzed processes - as well as the role of process intensification and use of renewable resources in improving the sustainability of chemical processes. With its emphasis on applications, this book is of high interest to those working in the industry.
Abstract:
Purpose – This paper describes a “work in progress” research project being carried out with a public health care provider in the UK, a large NHS hospital Trust. Enhanced engagement with patients is one of the Trust’s core principles, but it is recognised that much more needs to be done to achieve this, and that ICT systems may be able to provide some support. The project is intended to find ways to better capture and evaluate the “voice of the patient” in order to lead to improvements in health care quality, safety and effectiveness. Design/methodology/approach – We propose to investigate the use of a patient-orientated knowledge management system (KMS) in managing knowledge about and from patients. The study is a mixed-methods (quantitative and qualitative) investigation based on traditional action research, intended to answer the following three research questions: (1) How can a KMS be used as a mechanism to capture and evaluate patient experiences to provoke patient service change? (2) How can the KMS assist in providing a mechanism for systematising patient engagement? (3) How can patient feedback be used to stimulate improvements in care, quality and safety? Originality/value – This methodology aims to involve patients at all phases of the study from its initial design onwards, thus leading to an understanding of the issues associated with using a KMS to manage knowledge about and for patients that is driven by the patients themselves. Practical implications – The outcomes of the project for the collaborating hospital will be, firstly, a system for capturing and evaluating knowledge about and from patients, and then, as a consequence, improved outcomes for both the patients and the service provider. More generally, it will produce a set of guidelines for managing patient knowledge in an NHS hospital that have been tested in one case example.
Abstract:
* By Knowledge Infrastructure we mean all the means that enable effective knowledge management within an organization - knowledge process support.
Abstract:
Introduction: Production of functionalised particles using dry powder coating is a one-step, environmentally friendly process that paves the way for the development of particles with targeted properties and diverse functionalities. Areas covered: By applying first principles of the physical science of powders, fine guest particles can be homogeneously dispersed over the surface of larger host particles to develop functionalised particles. Multiple functionalities can be modified, including flowability, dispersibility, fluidisation, homogeneity, content uniformity and dissolution profile. The current publication seeks to understand the fundamental principles and science governing the dry coating process, evaluates key technologies developed to produce functionalised particles, outlines their advantages, limitations and applications, and discusses in detail the resultant functionalities and their applications. Expert opinion: Dry particle coating is a promising solvent-free manufacturing technology for producing particles with targeted functionalities. Progress within this area requires the development of continuous processing devices that can overcome challenges encountered with current technologies, such as heat generation and particle attrition. Growth within this field requires extensive research to further understand the impact of process design and material properties on the resultant functionalities.
Abstract:
Rework strategies that involve different checking points as well as rework times can be applied to a reconfigurable manufacturing system (RMS) under certain constraints, and an effective rework strategy can significantly improve the mission reliability of the manufacturing process. The mission reliability of the process is a measure of the production ability of the RMS, serving as an integrated performance indicator of the production process under specified technical constraints, including time, cost and quality. To quantitatively characterize the mission reliability and basic reliability of an RMS under different rework strategies, a rework model of the RMS was established based on logistic regression. Firstly, the functional relationship between the capability and work load of the manufacturing process was studied by statistically analyzing a large amount of historical data obtained from actual machining processes. Secondly, the output, mission reliability and unit cost of different rework paths were calculated and taken as the decision variables, based on different input quantities and the rework model mentioned above. Thirdly, optimal rework strategies for different input quantities were determined by calculating the weighted decision values and analyzing the advantages and disadvantages of each rework strategy. Finally, a case application was presented to demonstrate the efficiency of the proposed method.
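The sketch below illustrates, in schematic form, the two ingredients named in the abstract: a logistic relationship between process capability and work load, and a weighted decision value for comparing rework strategies. All coefficients, weights and strategy figures are hypothetical and are not taken from the paper.

```python
# Schematic sketch only: logistic capability-vs-workload model plus a weighted
# decision value for comparing rework strategies. All numbers are hypothetical.
import numpy as np

def capability(work_load, a=4.0, b=0.05):
    """Logistic relationship: capability decays as work load increases."""
    return 1.0 / (1.0 + np.exp(-(a - b * work_load)))

# candidate rework strategies: (output, mission reliability, unit cost)
strategies = {
    "rework_after_step_2": (950, 0.96, 12.4),
    "rework_after_step_4": (930, 0.98, 13.1),
    "no_rework":           (990, 0.90, 11.8),
}
weights = np.array([0.3, 0.5, -0.2])   # reward output and reliability, penalize cost
data = np.array(list(strategies.values()), dtype=float)
lo, hi = data.min(axis=0), data.max(axis=0)

def weighted_decision_value(metrics):
    scaled = (np.array(metrics, dtype=float) - lo) / (hi - lo)   # normalize to [0, 1]
    return float(weights @ scaled)

best = max(strategies, key=lambda s: weighted_decision_value(strategies[s]))
print(f"capability at work load 60: {capability(60):.3f}")
print("preferred strategy:", best)
```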