21 results for "Shipping process without debit"
in Aston University Research Archive
Abstract:
OBJECTIVES: The aim of this study was to investigate the influence of process parameters during dry coating on particle and dosage form properties upon varying the surface-adsorbed moisture of microcrystalline cellulose (MCC), a model filler/binder for orally disintegrating tablets (ODTs). METHODS: The moisture content of MCC was optimised using the spray water method and analysed using thermogravimetric analysis. Micro- and macro-scale properties were assessed using atomic force microscopy, nano-indentation, scanning electron microscopy, tablet hardness and disintegration testing. KEY FINDINGS: The results showed that MCC demonstrated its best flowability at a moisture content of 11.2% w/w when compared to the control, which comprised 3.9% w/w moisture. The use of the composite powder coating process (without air) resulted in up to an 80% increase in tablet hardness when compared to the control. The study also demonstrated that surface-adsorbed moisture can be displaced upon the addition of excipients during dry processing, circumventing the need for particle drying before tabletting. CONCLUSIONS: It was concluded that MCC with a moisture content of 11% w/w provides a good balance between powder flowability and favourable ODT characteristics.
Abstract:
This special issue of the Journal of the Operational Research Society is dedicated to papers on the related subjects of knowledge management and intellectual capital. These subjects continue to generate considerable interest amongst both practitioners and academics. This issue demonstrates that operational researchers have many contributions to offer to the area, especially by bringing multi-disciplinary, integrated and holistic perspectives. The papers included are both theoretical and practical, and include a number of case studies showing how knowledge management has been implemented in practice that may assist other organisations in their search for a better means of managing what is now recognised as a core organisational activity. It has been accepted by a growing number of organisations that the precise handling of information and knowledge is a significant factor in facilitating their success, but that there is a challenge in how to implement a strategy and processes for this handling. It is here, in the particular area of knowledge process handling, that we can see the contributions of operational researchers most clearly, as is illustrated in the papers included in this journal edition. The issue comprises nine papers, contributed by authors based in eight different countries on five continents. Lind and Seigerroth describe an approach that they call team-based reconstruction, intended to help articulate knowledge in a particular organisational context. They illustrate the use of this approach with three case studies, two in manufacturing and one in public sector health care. Different ways of carrying out reconstruction are analysed, and the benefits of team-based reconstruction are established. Edwards and Kidd, and Connell, Powell and Klein both concentrate on knowledge transfer.
Edwards and Kidd discuss the issues involved in transferring knowledge across borders of various kinds, from those within organisations to those between countries. They present two examples, one in distribution and the other in manufacturing. They conclude that trust and culture both play an important part in facilitating such transfers, that IT should be kept in a supporting role in knowledge management projects, and that a staged approach to this IT support may be the most effective. Connell, Powell and Klein consider the oft-quoted distinction between explicit and tacit knowledge, and argue that such a distinction is sometimes unhelpful. They suggest that knowledge should rather be regarded as a holistic systemic property. The consequences of this for knowledge transfer are examined, with a particular emphasis on what this might mean for the practice of OR. Their view of OR in the context of knowledge management very much echoes Lind and Seigerroth's focus on knowledge for human action. This is an interesting convergence of views given that, broadly speaking, one set of authors comes from within the OR community, and the other from outside it. Hafeez and Abdelmeguid present the nearest to a 'hard' OR contribution of the papers in this special issue. In their paper they construct and use system dynamics models to investigate alternative ways in which an organisation might close a knowledge gap or skills gap. The methods they use have the potential to be generalised to any other quantifiable aspects of intellectual capital. The contribution by Revilla, Sarkis and Modrego is also at the 'hard' end of the spectrum. They evaluate the performance of public–private research collaborations in Spain, using an approach based on data envelopment analysis. They found that larger organisations tended to perform relatively better than smaller ones, even though the approach used takes into account scale effects.
Perhaps more interesting was that many factors that might have been thought relevant, such as the organisation's existing knowledge base or how widely applicable the results of the project would be, had no significant effect on performance. It may be that how well the partnership between the collaborators works (not a factor it was possible to take into account in this study) is more important than most other factors. Mak and Ramaprasad introduce the concept of a knowledge supply network. This builds on existing ideas of supply chain management, but also integrates the design chain and the marketing chain, to address all the intellectual property connected with the network as a whole. The authors regard the knowledge supply network as the natural focus for considering knowledge management issues. They propose seven criteria for evaluating knowledge supply network architecture, and illustrate their argument with an example from the electronics industry—integrated circuit design and fabrication. The paper by Hasan and Crawford takes a holistic approach to knowledge management. They demonstrate their argument—that there is no simple IT solution for organisational knowledge management efforts—through two case study investigations. These case studies, in Australian universities, are investigated through cultural historical activity theory, which focuses the study on the activities that are carried out by people in support of their interpretations of their role, the opportunities available and the organisation's purpose. Human activities, it is argued, are mediated by the available tools, including IT and IS and, in this particular context, KMS. It is this argument that places the available technology into the knowledge activity process and permits the future design of KMS to be improved through the lessons learnt by studying these knowledge activity systems in practice.
Wijnhoven concentrates on knowledge management at the operational level of the organisation. He is concerned with studying the transformation of certain inputs to outputs—the operations function—and the consequent realisation of organisational goals via the management of these operations. He argues that the inputs and outputs of this process in the context of knowledge management are different types of knowledge, names the operation method 'knowledge logistics', and calls the method of transformation learning. This theoretical paper discusses the operational management of four types of knowledge objects (explicit understanding, information, skills, and norms and values) and shows how, through the proposed framework, learning can transfer these objects to clients in a logistical process without a major transformation in content. Millie Kwan continues this theme with a paper about process-oriented knowledge management. In her case study she discusses an implementation of knowledge management where the knowledge is centred around an organisational process, and the mission, rationale and objectives of the process define the scope of the project. Her case concerns the effective use of real estate (property and buildings) within a Fortune 100 company. In order to manage the knowledge about this property, and the process by which the best 'deal' for internal customers and the overall company was reached, a KMS was devised. She argues that process knowledge is a source of core competence and thus needs to be strategically managed. Finally, you may also wish to read a related paper originally submitted for this Special Issue, 'Customer knowledge management' by Garcia-Murillo and Annabi, which was published in the August 2002 issue of the Journal of the Operational Research Society, 53(8), 875-884.
Abstract:
Recent surveys reveal that many university students in the U.K. are not satisfied with the timeliness and usefulness of the feedback given by their tutors. Ensuring timeliness in marking can result in a reduction in the quality of feedback. Though suitable use of Information and Communication Technology should alleviate this problem, existing Virtual Learning Environments are inadequate to support detailed marking scheme creation and they provide little support for giving detailed feedback. This paper describes a unique new web-based tool called e-CAF for facilitating coursework assessment and feedback management directed by marking schemes. Using e-CAF, tutors can create or reuse detailed marking schemes efficiently without sacrificing the accuracy or thoroughness in marking. The flexibility in marking scheme design also makes it possible for tutors to modify a marking scheme during the marking process without having to reassess the students’ submissions. The resulting marking process will become more transparent to students.
Abstract:
Spin coating polymer blend thin films provides a method to produce multiphase functional layers of high uniformity covering large surface areas. Applications for such layers include photovoltaics and light-emitting diodes, where performance relies upon the nanoscale phase separation morphology of the spun film. Furthermore, at micrometer scales, phase separation provides a route to produce self-organized structures for templating applications. Understanding the factors that determine the final phase-separated morphology in these systems is consequently an important goal. However, it has to date proved problematic to fully test theoretical models for phase separation during spin coating, due to the high spin speeds involved, which have limited the spatial resolution of experimental data obtained during the coating process. Without this fundamental understanding, production of optimized micro- and nanoscale structures is hampered. Here, we have employed synchronized stroboscopic illumination together with the high light-gathering sensitivity of an electron-multiplying charge-coupled device camera to optically observe structure evolution in such blends during spin coating. Furthermore, the use of monochromatic illumination has allowed interference reconstruction of three-dimensional topographies of the spin-coated film, with nanometer precision, as it dries and phase separates. We have used this new method to directly observe the phase separation process during spinning for a polymer blend (PS-PI) for the first time, providing new insights into the spin-coating process and opening up a route to understand and control phase separation structures. © 2011 American Chemical Society.
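As a minimal sketch of the interference principle this reconstruction relies on (not the authors' actual analysis pipeline): with monochromatic illumination in reflection, each full fringe corresponds to a thickness change of λ/(2n). The wavelength and refractive index below are illustrative assumptions, not values from the paper.

```python
def fringe_to_thickness(num_fringes, lam_nm=532.0, n_film=1.59):
    """Convert a count of reflection interference fringes to a film
    thickness change in nanometres.

    Assumes monochromatic illumination at lam_nm and a film of
    refractive index n_film (both hypothetical example values here).
    Each full fringe corresponds to one wavelength of extra optical
    path, i.e. a thickness change of lam / (2 * n).
    """
    return num_fringes * lam_nm / (2.0 * n_film)


# Example: three fringes observed as a polystyrene-like film thins
print(fringe_to_thickness(3))
```

Counting fringes at many points across the stroboscopic images, and interpolating between them, is what allows a full 3D topography to be rebuilt with sub-wavelength height precision.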
Abstract:
We used microwave plasma enhanced chemical vapor deposition (MPECVD) to carbonize an electrospun polyacrylonitrile (PAN) precursor to form carbon fibers. Scanning electron microscopy, Raman spectroscopy, and Fourier transform infrared spectroscopy were used to characterize the fibers at different evolution stages. It was found that MPECVD-carbonized PAN fibers do not exhibit any significant change in fiber diameter, whilst conventionally carbonized PAN fibers show a 33% reduction in fiber diameter. An additional coating of carbon nanowalls (CNWs) was formed on the surface of the carbonized PAN fibers during the MPECVD process without the assistance of any metallic catalysts. The results presented here may offer a novel, economical, and straightforward approach towards the mass production of carbon fibrous materials containing CNWs. © 2013 American Institute of Physics.
Abstract:
Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process-oriented holonic modeling. Design/methodology/approach: The paper describes key systems thinking axioms that are built upon in an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. These were produced in an 18-month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that this new methodology can support process depiction and improvement in industrial sectors which are characterized by environments of high variety and low volume (e.g. projects such as the design and manufacture of a radar system, or a hybrid production process) which do not provide repetitive learning opportunities. In such circumstances, the methodology has not only been able to deliver holonic-based process diagrams but has also been able to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.
Abstract:
The application of systems thinking to designing, managing, and improving business processes has developed a new "holonic-based" process modeling methodology. The theoretical background and the methodology are described using examples taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. A key point of differentiation attributed to this methodology is that it allows a set of models to be produced without taking a task breakdown approach but instead uses systems thinking and a construct known as the "holon" to build process descriptions as a system of systems (i.e., a holarchy). The process-oriented holonic modeling methodology has been used for total quality management and business process engineering exercises in different industrial sectors and builds models that connect the strategic vision of a company to its operational processes. Exercises have been conducted in response to environmental pressures to make operations align with strategic thinking as well as becoming increasingly agile and efficient. This unique methodology is best applied in environments of high complexity, low volume, and high variety, where repeated learning opportunities are few and far between (e.g., large development projects). © 2007 IEEE.
Abstract:
Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple criteria decision making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings or priorities of alternative warehouses with respect to both deliverer oriented and customer oriented criteria. Then, the goal programming (GP) model incorporating the constraints of system, resource, and AHP priority is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
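As a hedged illustration of the AHP step described above (the abstract's authors used the Expert Choice package; this is a generic sketch, and the pairwise comparison values for three hypothetical warehouses are invented for the example):

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three candidate warehouses
# on one criterion, using Saaty's 1-9 scale. A[i, j] states how strongly
# warehouse i is preferred to warehouse j; values are illustrative only.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priorities are the principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# with Saaty's random index RI = 0.58 for n = 3. Judgements are usually
# accepted when CR < 0.1.
n = A.shape[0]
lam = eigvals.real[k]
CI = (lam - n) / (n - 1)
CR = CI / 0.58

print(w)   # priority weights of the three warehouses
print(CR)  # consistency ratio of the judgements
```

In the approach the abstract describes, priorities like `w` would then enter a goal programming model as coefficients, so that warehouse selection respects both the AHP rankings and the resource constraints.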
Abstract:
The literature on the potential use of liquid ammonia as a solvent for the extraction of aromatic hydrocarbons from mixtures with paraffins, and the application of reflux, has been reviewed. Reference is made to extractors suited to this application. A pilot-scale extraction plant was designed comprising a 5 cm diameter by 125 cm high, 50-stage Rotating Disc Contactor with two external settlers. Provision was made for operation with, or without, reflux at a pressure of 10 bar and ambient temperature. The solvent recovery unit consisted of an evaporator, compressor and condenser in a refrigeration cycle. Two systems were selected for study, Cumene-n-Heptane-Ammonia and Toluene-Methylcyclohexane-Ammonia. Equilibrium data for the first system were determined experimentally in a specially designed equilibrium bomb. A technique was developed to withdraw samples under pressure for analysis by chromatography and titration. The extraction plant was commissioned with a kerosene-water system; detailed operating procedures were developed based on a Hazard and Operability Study. Experimental runs were carried out with both ternary ammonia systems. With the system Toluene-Methylcyclohexane-Ammonia, the extraction plant and the solvent recovery facility operated satisfactorily, and safely, in accordance with the operating procedures. Experimental data gave reasonable agreement with theory. Recommendations are made for further work with the plant.
Abstract:
Pyrolysis is one of several thermochemical technologies that convert solid biomass into more useful and valuable bio-fuels. Pyrolysis is thermal degradation in the complete or partial absence of oxygen. Under carefully controlled conditions, solid biomass can be converted to a liquid known as bio-oil in 75% yield on dry feed. Bio-oil can be used as a fuel but has the drawback of a high oxygen level due to the presence of a complex mixture of molecular fragments of cellulose, hemicellulose and lignin polymers. Bio-oil also presents a number of problems in use, including high initial viscosity, instability resulting in increased viscosity or phase separation, and high solids content. Much effort has been spent on upgrading bio-oil into a more usable liquid fuel, either by modifying the liquid or by major chemical and catalytic conversion to hydrocarbons. The overall primary objective was to improve oil stability by exploring three approaches. The first was to determine the effect of feed moisture content on bio-oil stability. The second was to try to improve bio-oil stability by partially oxygenated pyrolysis. The third was to improve stability by co-pyrolysis with methanol. The project was carried out on an existing laboratory pyrolysis reactor system, which suited this project without requiring major redesign or modification. During the finishing stages of this project, it was found that the temperature of the condenser in the product collection system had a marked impact on pyrolysis liquid stability. This is discussed in this work and further recommendations given. The quantity of water coming from the feedstock and the pyrolysis reaction is important to liquid stability. In the present work the feedstock moisture content was varied and pyrolysis experiments were carried out over a range of temperatures. The quality of the bio-oil produced was measured as water content, initial viscosity and stability.
The results showed that moderate feedstock moisture (7.3-12.8%) led to more stable bio-oil. One of the drawbacks of bio-oil is its instability, due to the unstable oxygenated chemicals it contains. Catalytic hydrotreatment of the oil and zeolite cracking of pyrolysis vapour have been discussed by many researchers; these processes are intended to eliminate oxygen in the bio-oil. In this work an alternative approach, partially oxygenated pyrolysis, was introduced to reduce oil instability by oxidising unstable oxygenated chemicals in the bio-oil. The results showed that liquid stability was improved by oxygen addition during the pyrolysis of beech wood at an optimum air factor of about 0.09-0.15. Methanol as a post-production additive to bio-oil has been studied by many researchers, and the most effective result came from adding methanol to the oil just after production. Co-pyrolysis of spruce wood with methanol was undertaken in the present work, and it was found that methanol improved liquid stability as a co-pyrolysis solvent but was no more effective than when used as a post-production additive.
Abstract:
This research examines the role of the information management process within a process-oriented enterprise, Xerox Ltd. The research approach is based on a post-positive paradigm and has resulted in thirty-five idiographic statements. The three major outcomes are: 1. The process-oriented holistic enterprise is an organisation that requires a long-term management commitment to its development. It depends on the careful management of people, tasks, information and technology. A complex integration of business processes is required; this can be managed through the use of consistent documentation techniques and clarity in the definition of process responsibilities, while management attention to global metrics and the centralisation of the management of the process model are critical to its success. 2. The role of the information management process within the context of a process-oriented enterprise is to provide flexible and cost-effective applications, technological, and process support to the business. This is best achieved through a centralisation of the management of information management and of the process model. A business-led approach combined with the consolidation of applications, information, process, and data architectures is central to providing effective business and process-focused support. 3. In a process-oriented holistic enterprise, process and information management are inextricably linked. The model of process management depends heavily on information management, whilst the model of information management is totally focused around supporting and creating the process model. The two models are mutually creating - one cannot exist without the other. There is a duality concept of process and information management.
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements are about how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
Deep hole drilling is one of the most complicated metal cutting processes and one of the most difficult to perform on CNC machine-tools or machining centres under conditions of limited manpower or unmanned operation. This research work investigates aspects of the deep hole drilling process with small-diameter twist drills and presents a prototype system for real-time process monitoring and adaptive control; two main research objectives are fulfilled. The first objective is the experimental investigation of the mechanics of the deep hole drilling process, using twist drills without internal coolant supply, in the range of diameters Ø2.4 to Ø4.5 mm and working lengths up to 40 diameters; the definition of the problems associated with the low strength of these tools; and the study of the mechanisms of catastrophic failure, which manifest themselves well before, and along with, the classic mechanism of tool wear. The relationships of drilling thrust and torque with the depth of penetration and the various machining conditions are also investigated, and the experimental evidence suggests that the process is inherently unstable at depths beyond a few diameters. The second objective is the design and implementation of a system for intelligent CNC deep hole drilling, the main task of which is to ensure the integrity of the process and the safety of the tool and the workpiece. This task is achieved by interfacing the CNC system of the machine tool to an external computer which performs the following functions: on-line monitoring of the drilling thrust and torque; adaptive control of feed rate, spindle speed and tool penetration (Z-axis); indirect monitoring of tool wear by pattern recognition of variations of the drilling thrust with cumulative cutting time and drilled depth; operation as a database for tools and workpieces; and finally the issuing of alarms and diagnostic messages.
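The adaptive-control idea above can be sketched as a single control-cycle rule: when measured thrust exceeds a safe limit for the tool, back off the feed rate, and if the feed is already at its floor, retract to clear chips. This is a hypothetical illustration of the kind of logic such a monitoring computer runs, not the thesis's actual controller; all names and limits are invented.

```python
def adapt_feed(thrust_N, feed_mm_rev, thrust_limit_N=120.0,
               min_feed=0.01, step=0.8):
    """One monitoring cycle of a thrust-limited feed controller.

    Returns (new_feed, retract). If the measured thrust exceeds the
    limit, the feed rate is scaled down by `step`; if it is already at
    the minimum feed, a retraction (peck) is requested instead so the
    flutes can clear chips. All thresholds are illustrative.
    """
    if thrust_N > thrust_limit_N:
        new_feed = max(min_feed, feed_mm_rev * step)
        retract = new_feed == min_feed  # feed floor reached: peck instead
        return new_feed, retract
    return feed_mm_rev, False
```

A real system of the kind described would combine this with torque monitoring and a record of thrust versus drilled depth, so that gradual drift (tool wear) can be distinguished from the sudden rises that precede catastrophic failure.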
Abstract:
High velocity oxyfuel (HVOF) thermal spraying is one of the most significant developments in the thermal spray industry since the development of the original plasma spray technique. The first investigation uses the combustion and discrete particle models within the general-purpose commercial CFD code FLUENT to solve the combustion of kerosene and couple the motion of fuel droplets with the gas flow dynamics in a Lagrangian fashion. The effects of liquid fuel droplets on the thermodynamics of the combusting gas flow are examined thoroughly, showing that the combustion process of kerosene is independent of the initial fuel droplet sizes. The second analysis deals with the full water-cooling numerical model, which can assist in thermal performance optimisation or in determining the best method for heat removal without the cost of building physical prototypes. The numerical results indicate that the water flow rate and direction have a noticeable influence on the cooling efficiency but no noticeable effect on the gas flow dynamics within the thermal spraying gun. The third investigation deals with the development and implementation of discrete phase particle models. The results indicate that most powder particles are not melted upon hitting the substrate to be coated. The oxidation model confirms that HVOF guns can produce metallic coatings with low oxidation within the typical stand-off distance of about 30 cm. Physical properties such as porosity, microstructure, surface roughness and adhesion strength of coatings produced by droplet deposition in a thermal spray process are determined to a large extent by the dynamics of deformation and solidification of the particles impinging on the substrate. Therefore, one of the objectives of this study is to present a complete numerical model of droplet impact and solidification. The modelling results show that solidification of droplets is significantly affected by the thermal contact resistance and substrate surface roughness.
Abstract:
This research aimed to identify any common factors that have enabled and/or motivated SMEs to successfully implement ISO 14001 whilst the majority have not. It also identified what challenges and barriers SMEs face in doing so and how some have overcome them. The existing literature suggests that the majority of SMEs perceive their environmental impacts to be proportional to their size; have a poor understanding of environmental issues; have a poor awareness of environmental regulations; do not have the necessary expertise or leadership to address environmental issues; and that SMEs with an environmental management system such as ISO 14001 are very much the minority. The main factors that influenced whether an SME had implemented ISO 14001 were: competitive advantage, regulatory compliance, supply chain pressures, leadership, expertise, resources and external support. This research used qualitative analysis of interviews with managers and directors from eight SMEs with ISO 14001 and four without. All of the SMEs were based in the West Midlands or Staffordshire. Interviews were also conducted with three organisations offering support to businesses on environmental issues and with one large business that was engaging its suppliers (which included SMEs within this sample) on environmental issues. The research found that four main factors enabled or motivated the SMEs to implement ISO 14001: leadership, supply chain pressures, external support, and the SMEs' history and experience of accredited management systems. The main challenges that these businesses had to overcome, and that prevented the other SMEs from achieving ISO 14001, were: achieving regulatory compliance, perceived financial cost, lack of perceived competitive advantage, access to relevant and affordable support and, for those SMEs without ISO 14001, very little perceived external pressure or need to do so.