21 results for CASE tools

in Aston University Research Archive


Relevance:

30.00%

Publisher:

Abstract:

In recent years, it has become increasingly common for companies to improve their competitiveness and find new markets by extending their operations through international new product development collaborations involving technology transfer. Technology development, cost reduction and market penetration are seen as the foci of such collaborative operations, with the aim of improving the competitive position of both partners. In this paper, the case of technology transfer through collaborative new product development in the machine tool sector is used to provide a typical example of such partnerships. The paper outlines the links between the operational aspects of collaborations and their strategic objectives. It is based on empirical data collected from the machine tool industries in the UK and China. The evidence includes longitudinal case studies and questionnaire surveys of machine tool manufacturers in both countries. The specific case of BSA Tools Ltd and its Chinese partner, the Changcheng Machine Tool Works, is used to provide an in-depth example of the operational development of a successful collaboration. The paper concludes that a phased coordination of commercial, technical and strategic interactions between the two partners is essential for such collaborations to work.

Relevance:

30.00%

Publisher:

Abstract:

The materials management function is always a major concern to the management of any industrial organisation, as high inventory and an inefficient procurement process affect profitability to a great extent. Problems multiply in the changing business environment in India. Hence, existing materials planning and procurement processes and inventory management systems require re-examination in light of this changing business environment. This study shows a radical improvement in the materials management function of an Indian petroleum refinery through business process re-engineering (BPR), by analysing current processes, identifying key issues, deriving paradigm shifts and developing re-engineered processes through customer value analysis. BPR was carried out on the existing processes of “materials planning and procurement” and “warehousing and surplus disposal”. The re-engineered processes for the materials management function triggered several improvement projects, identified by the group of executives who took part in the re-engineering exercise. Those projects were implemented in an integrated framework with the application of state-of-the-art information technology tools.

Relevance:

30.00%

Publisher:

Abstract:

Purpose - The purpose of the paper is to develop an integrated quality management model, which identifies problems, suggests solutions, develops a framework for implementation and helps evaluate the performance of healthcare services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. This has been applied to three acute healthcare services (operating room utilisation, accident and emergency, and intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This paper shows LFA applied to three service processes in one hospital; ideally it needs to be tested in several hospitals and across other services as well. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, and corrective measures are taken for superior performance, there is an absence of an integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework (planning, monitoring and evaluating) to implement those solutions in order to improve process performance. This study introduces an integrated and uniform quality management tool that integrates operations with organisational strategies. © Emerald Group Publishing Limited.
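For readers unfamiliar with the technique, an LFA exercise is usually recorded as a four-level logical framework (logframe) matrix. The sketch below shows only the generic structure in Python form; the entries are hypothetical examples for an operating room utilisation project, not content from the paper:

# Generic logframe structure with illustrative (hypothetical) entries
logframe = {
    "goal":       {"summary": "Improved quality of acute hospital care",
                   "indicators": ["patient outcome measures"],
                   "verification": ["hospital statistics"],
                   "assumptions": ["stable funding"]},
    "purpose":    {"summary": "Increased operating room utilisation",
                   "indicators": ["% of scheduled theatre sessions used"],
                   "verification": ["theatre management system"],
                   "assumptions": ["staff availability"]},
    "outputs":    {"summary": "Revised scheduling process in place",
                   "indicators": ["reduced session cancellations"],
                   "verification": ["audit reports"],
                   "assumptions": ["clinician engagement"]},
    "activities": {"summary": "Map current process, identify bottlenecks, pilot new schedule",
                   "indicators": ["milestones met on time"],
                   "verification": ["project plan reviews"],
                   "assumptions": ["access to scheduling data"]},
}

print(logframe["purpose"]["indicators"])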

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is twofold: first, we compute quality-adjusted measures of productivity change for the three most important diagnostic technologies (i.e., the computerised tomography scan, electrocardiogram and echocardiogram) in the major Portuguese hospitals. We use the Malmquist–Luenberger index, which allows productivity growth to be measured while controlling for the quality of production. Second, using non-parametric tests, we analyse whether the implementation of the Prospective Payment System (PPS) may have had a positive impact on the movements of productivity over time. The results show that the PPS has helped hospitals to use these tools more efficiently and to improve their effectiveness.
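For reference, the Malmquist–Luenberger index is conventionally defined as the geometric mean of two period-specific ratios of directional output distance functions (this is the standard form following Chung, Färe and Grosskopf, not necessarily the exact specification used in the paper):

ML_t^{t+1} = \left[ \frac{1 + \vec{D}_o^{\,t}(x^t, y^t, b^t; g)}{1 + \vec{D}_o^{\,t}(x^{t+1}, y^{t+1}, b^{t+1}; g)} \times \frac{1 + \vec{D}_o^{\,t+1}(x^t, y^t, b^t; g)}{1 + \vec{D}_o^{\,t+1}(x^{t+1}, y^{t+1}, b^{t+1}; g)} \right]^{1/2}

where x, y and b denote inputs, desirable outputs and the quality-related outputs being controlled for, \vec{D}_o is the directional output distance function and g the direction vector; values greater than one indicate productivity improvement.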

Relevance:

30.00%

Publisher:

Abstract:

Aston University has recently made PebblePad, an e-portfolio or personal learning system, available to all students within the University. The customisable Profiles within PebblePad allow students to self-declare their skills in particular areas, attaching evidence of their skills or an action plan for improvement to each statement. Formal Information Literacy (IL) teaching within Aston University is currently limited to Library & Information Services (LIS) Information Specialists delivering a maximum of one session to each student during each level of their degree. However, many of the skills are continually developed by students during the course of their academic studies. For this project, an IL skills profile was created within PebblePad, which was then promoted to groups of staff and students to complete during the academic session 2009-10. Functionality within PebblePad allowed students to share their IL skills profile, evidence, action plans or any other items they felt were appropriate with an LIS Information Specialist who was able to add comments and offer suggestions for activities to help the student to develop further. Activities were closely related to students’ coursework where possible: suggesting a student kept a short reflective log of their information searching and evaluating process for an upcoming essay, for example. Feedback on the usefulness of the IL Profile will be sought from students through focus groups and the communication tools in PebblePad. In this way, we hope to make students more aware of their IL skills and to offer IL skills support over a longer period of time than a single session can provide. We will present preliminary conclusions about the practicalities and benefits of a self-declaration approach to developing IL skills in students at Aston University.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the present study is to test the case linkage principles of behavioural consistency and behavioural distinctiveness using serial vehicle theft data. Data from 386 solved vehicle thefts committed by 193 offenders were analysed using Jaccard's coefficient, regression and receiver operating characteristic (ROC) analyses to determine whether objectively observable aspects of crime scene behaviour could be used to distinguish crimes committed by the same offender from those committed by different offenders. The findings indicate that spatial behaviour, specifically the distance between theft locations and between dump locations, is a highly consistent and distinctive aspect of vehicle theft behaviour; thus, intercrime and interdump distance represent the most useful aspects of vehicle theft for the purpose of case linkage analysis. The findings have theoretical and practical implications for the understanding of criminal behaviour and for the development of decision-support tools to assist police investigation and apprehension of serial vehicle theft offenders.
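As an illustration only (the behaviour features and scores below are hypothetical, not taken from the study), Jaccard's coefficient measures the proportion of crime-scene behaviours shared by two offences, and ROC analysis summarises how well such similarity scores separate linked from unlinked crime pairs. A minimal Python sketch, assuming scikit-learn is available:

from sklearn.metrics import roc_auc_score  # used for the ROC analysis step

def jaccard(a, b):
    # proportion of behaviours shared by two crime-scene checklists
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical behaviour checklists for three vehicle thefts
theft_1 = {"window_forced", "night", "residential_street", "vehicle_burnt"}
theft_2 = {"window_forced", "night", "car_park", "vehicle_burnt"}
theft_3 = {"keys_stolen", "daytime", "driveway", "abandoned_intact"}

print(jaccard(theft_1, theft_2))  # same-offender pair: higher similarity expected
print(jaccard(theft_1, theft_3))  # different-offender pair: lower similarity expected

# ROC analysis over hypothetical similarity scores for candidate crime pairs
pair_is_linked = [1, 1, 0, 1, 0, 0]
pair_similarity = [0.75, 0.60, 0.20, 0.55, 0.40, 0.10]
print(roc_auc_score(pair_is_linked, pair_similarity))  # area under the ROC curve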

Relevance:

30.00%

Publisher:

Abstract:

Industry cluster policies are a current trend in local economic development programmes and represent a major shift from traditional approaches. This trend has been coupled with an increasing interest in the new media industry as a significant focus for regional development strategies. In England, clusters and the new media industry have therefore come to be seen as important tools in promoting local and regional economic development. This study aimed to ascertain the success of these policies, taking the Birmingham new media industry as its case. In addition to an extensive review of the literature, semi-structured interviews were conducted with new media firms and with Business Support Agencies (BSAs) offering programmes to promote the development of the new media industry cluster. The key findings of the thesis are that the concerns of new media firms when choosing their location do not conform to industry cluster theory. Moreover, close geographical proximity of the firms does not mean there is collaboration, and any costs saved as a result of close proximity to similar firms are at present seen as irrelevant because of the type of products they offer. Building trust between firms is the key to developing the new media industry cluster, and the BSAs can act as a broker and provide neutral ground to develop it. The key policy recommendations are that the new media industry is continually changing, and research must continuously track and analyse cluster dynamics in order to be aware of emerging trends and future developments that can positively and negatively affect the cluster. Policy makers need to keep in mind that there is no uniform toolkit to foster the different sectors in cluster development. It is also important for them to win the support and trust of new media firms, since this is key to the success of the cluster. When cluster programmes are introduced, their benefits must be explained to the industry more effectively in order to encourage firms to participate. The general conclusions of the thesis are that clusters are a potentially important tool in local economic development policy and that the new media industry has considerable growth potential. The kinds of relationships which cluster theory suggests develop between firms do not, as yet, appear to exist within the new media cluster. There are, however, steps that the BSAs can take to encourage their development. Thus, the BSAs need to ensure that they establish an environment that enables growth of the industry.

Relevance:

30.00%

Publisher:

Abstract:

The importance of non-technical factors in the design and implementation of information systems has been increasingly recognised by both researchers and practitioners, and recent literature highlights the need for new tools and techniques with an organisational, rather than technical, focus. The gap between what is technically possible and what is generally practised is particularly wide in the sales and marketing field. This research describes the design and implementation of a decision support system (DSS) for marketing planning and control in a small but complex company and examines the nature of the difficulties encountered. An intermediary with functional, rather than technical, expertise is used as a strategy for overcoming these difficulties by taking control of the whole of the systems design and implementation cycle. Given the practical nature of the research, an action research approach is adopted, with the researcher undertaking this role. This approach provides a detailed case study of what actually happens during the DSS development cycle, allowing the influence of organisational factors to be captured. The findings of the research show how the main focus of the intermediary's role needs to be adapted over the systems development cycle: from coordination and liaison in the pre-design and design stages, to systems champion during the first part of the implementation stage, and finally to catalyst to ensure that the DSS is integrated into the decision-making process. Two practical marketing exercises are undertaken which illustrate the nature of the gap between the provision of information and its use. The lack of a formal approach to planning and control is shown to have a significant effect on the way the DSS is used, and the role of the intermediary is extended successfully to accommodate this factor. This leads to the conclusion that for the DSS to play a fully effective role, small firms may need to introduce more structure into their marketing planning, and that the role of the intermediary, or Information Coordinator, should include the responsibility for introducing new techniques and ideas to aid with this.

Relevance:

30.00%

Publisher:

Abstract:

Despite expectations being high, the industrial take-up of Semantic Web technologies in developing services and applications has been slower than expected. One of the main reasons is that many legacy systems have been developed without considering the potential of the Web in integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a tedious and expensive process, which carries a significant risk of failure. There is an urgent need to provide strategies allowing the migration of legacy systems to Semantic Web Services platforms, and also tools to support such strategies. In this paper we propose a methodology and its tool support for transitioning these applications to Semantic Web Services, which allow users to migrate their applications to Semantic Web Services platforms automatically or semi-automatically. The transition of the GATE system is used as a case study. © 2009 - IOS Press and the authors. All rights reserved.
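Purely as an illustration of the kind of artefact such a migration produces (the namespace, property names and legacy operation identifier below are placeholders, not the vocabulary defined in the paper or by GATE), a legacy operation might end up described as a semantic service in RDF. A minimal Python sketch using rdflib:

from rdflib import RDF, Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/services#")   # hypothetical vocabulary
g = Graph()

svc = URIRef("http://example.org/services#AnnotateDocument")
g.add((svc, RDF.type, EX.SemanticWebService))
g.add((svc, EX.hasInput, EX.PlainTextDocument))        # semantic type of the input
g.add((svc, EX.hasOutput, EX.AnnotatedDocument))       # semantic type of the output
g.add((svc, EX.wrapsLegacyOperation, Literal("LegacySystem.annotateDocument")))  # placeholder identifier

print(g.serialize(format="turtle"))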

Relevance:

30.00%

Publisher:

Abstract:

Technological capabilities in Chinese manufacturing have been transformed in the last three decades. However, the extent to which domestic-market-oriented state-owned enterprises (SOEs) have developed their capabilities is not clear. Six SOEs in the automotive, steel and machine tools sectors in Beijing and Tianjin have been studied since the mid-1990s to assess the capability levels attained and the role of external sources and internal efforts in developing them. Aided by government policies, acquisition of technology and their own efforts, the case study companies appear to be broadly following the East Asian late-industrialisation model. All six enterprises demonstrate competences in operating established technology, managing investment and making product and process improvements. The evidence suggests that companies without foreign joint venture (JV) collaborations have made more progress in this respect.

Relevance:

30.00%

Publisher:

Abstract:

Simulation is an effective method for improving supply chain performance. However, there is limited advice available to assist practitioners in selecting the most appropriate method for a given problem. Much of the advice that does exist relies on custom and practice rather than a rigorous conceptual or empirical analysis. An analysis of the different modelling techniques applied in the supply chain domain was conducted, and the three main approaches to simulation used were identified: System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). This research has examined these approaches in two stages. Firstly, a first-principles analysis was carried out in order to challenge the received wisdom about their strengths and weaknesses, and a series of propositions was developed from this initial analysis. The second stage was to use the case study approach to test these propositions and to provide further empirical evidence to support their comparison. The contributions of this research are both in terms of knowledge and practice. In terms of knowledge, this research is the first holistic cross-paradigm comparison of the three main approaches in the supply chain domain. Case studies have involved building ‘back-to-back’ models of the same supply chain problem using SD and a discrete approach (either DES or ABM). This has led to contributions concerning the limitations of applying SD to operational problem types. SD has also been found to have risks when applied to strategic and policy problems. Discrete methods have been found to have potential for exploring strategic problem types. It has been found that discrete simulation methods can model material and information feedback successfully. Further insights have been gained into the relationship between modelling purpose and modelling approach. In terms of practice, the findings have been summarised in the form of a framework linking modelling purpose, problem characteristics and simulation approach.
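To make the distinction concrete, a System Dynamics treatment of a supply chain works with aggregate stocks and flows integrated over time, whereas DES and ABM track individual entities or agents. The sketch below is a generic single-echelon stock-management model written purely for illustration; the equations and parameter values are assumptions, not the back-to-back case models built in this research:

# Minimal System Dynamics-style sketch in Python: Euler integration of two stocks
# (inventory and supply line). All parameters are illustrative assumptions.
def simulate(weeks=40, dt=0.25):
    inventory = 100.0          # stock: on-hand inventory (units)
    supply_line = 20.0         # stock: orders placed but not yet received (units)
    target_inventory = 100.0
    adjustment_time = 4.0      # weeks taken to correct an inventory gap
    lead_time = 2.0            # supplier delivery delay (first-order)
    demand = 10.0              # units per week
    trace = []
    for step in range(int(weeks / dt)):
        if step * dt >= 10:    # step increase in demand at week 10
            demand = 15.0
        arrivals = supply_line / lead_time
        orders = max(demand + (target_inventory - inventory) / adjustment_time, 0.0)
        supply_line += (orders - arrivals) * dt   # flow in minus flow out of the supply line
        inventory += (arrivals - demand) * dt     # flow in minus flow out of inventory
        trace.append(inventory)
    return trace

print(min(simulate()))  # the undershoot reflects oscillation produced by the feedback structure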

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the work undertaken in the Scholarly Ontologies Project. The aim of the project has been to develop a computational approach to support scholarly sensemaking, through interpretation and argumentation, enabling researchers to make claims: to describe and debate their view of a document's key contributions and relationships to the literature. The project has investigated the technicalities and practicalities of capturing conceptual relations within and between conventional documents in terms of abstract ontological structures. In this way, we have developed a new kind of index to distributed digital library systems. This paper reports a case study undertaken to test the sensemaking tools developed by the Scholarly Ontologies project. The tools used were ClaiMapper, which allows the user to sketch argument maps of individual papers and their connections; ClaiMaker, a server on which such models can be stored and which provides interpretative services to assist the querying of argument maps across multiple papers; and ClaimFinder, a novice interface to the search services in ClaiMaker.
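As a purely illustrative sketch (the relation names and paper identifiers are invented, not the ClaiMaker ontology), the underlying idea is a graph of typed claims linking documents, which can then be queried across papers:

# Hypothetical argument map: typed claims connecting papers
claims = [
    ("paper_A", "extends", "paper_B"),
    ("paper_C", "refutes", "paper_B"),
    ("paper_D", "uses_method_of", "paper_A"),
]

# A simple query across the map: which papers respond to paper_B, and how?
responses = [(subject, relation) for subject, relation, target in claims if target == "paper_B"]
print(responses)  # [('paper_A', 'extends'), ('paper_C', 'refutes')]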

Relevance:

30.00%

Publisher:

Abstract:

Completing projects faster than the normal duration is always a challenge to the management of any project, as it often demands many paradigm shifts. The opportunities of globalization and competition from the private sector and multinationals force the management of public sector organizations in the Indian petroleum sector to adopt various aggressive strategies to maintain their profitability. Constructing infrastructure for handling petroleum products is one of them. Moreover, these projects are required to be completed faster than normal schedules to remain competitive, to achieve a faster return on investment, and to give a longer project life. However, using conventional project management tools and techniques, it is impossible to reduce project duration below the normal period. This study proposes the use of concurrent engineering in managing projects to radically reduce project duration. The phases of the project are accomplished concurrently or simultaneously instead of in series. The complexities that arise in managing projects are tackled by restructuring the project organization, improving management commitment, strengthening project-planning activities, ensuring project quality, managing project risk objectively and integrating project activities through management information systems. These measures would not only ensure fast-track completion of projects, but also improve project effectiveness in terms of quality, cost-effectiveness, team building and so on, and in turn the overall productivity of the project organization would improve.

Relevance:

30.00%

Publisher:

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [ 1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [ 3] followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. 
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification analysis has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins and ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular toolbox, and review its applications to the solution NMR analysis of large proteins. 
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological function are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: The requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance:

30.00%

Publisher:

Abstract:

High precision manufacturers continuously seek out disruptive technologies to improve the quality, cost, and delivery of their products. With the advancement of machine tool and measurement technology, many companies are ready to capitalise on the opportunity of on-machine measurement (OMM). Coupled with the business case, manufacturing engineers are now questioning whether OMM can soon eliminate the need for post-process inspection systems. Metrologists will, however, argue that the machining environment is too hostile and that there are numerous process variables which need consideration before traceable measurement on the machine can be achieved. In this paper we test the measurement capability of five new multi-axis machine tools enabled as OMM systems via on-machine probing. All systems are tested under various operating conditions in order to better understand the effects of potentially significant variables. This investigation has found that key process variables such as machine tool warm-up and tool-change cycles can have an effect on machine tool measurement repeatability. The new data presented here is important to many manufacturers who are considering utilising their high precision multi-axis machine tools for both the creation and verification of their products.
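As an illustration of how repeatability under different operating conditions might be quantified (the readings below are hypothetical, not the paper's data), a simple approach is to compare the mean and standard deviation of repeated probed measurements of the same feature before and after a warm-up cycle:

import statistics

# Hypothetical on-machine probing readings (mm) of one feature under two conditions
readings = {
    "cold_start":    [25.0012, 25.0019, 25.0007, 25.0022, 25.0015],
    "after_warm_up": [25.0010, 25.0011, 25.0009, 25.0012, 25.0010],
}

for condition, values in readings.items():
    mean = statistics.mean(values)
    spread = statistics.stdev(values)   # sample standard deviation as a simple repeatability measure
    print(f"{condition}: mean = {mean:.4f} mm, std dev = {spread:.5f} mm")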