895 results for Process Re-engineering
Abstract:
Product verification has become a cost-intensive and time-consuming aspect of modern electronics production, and with ever-increasing miniaturisation these aspects will become even more cumbersome. One may also point out that certain precision assembly, such as within the biomedical sector, is legally required to have zero defects in production. Since miniaturisation and precision assembly will soon become part of almost any product, the verification phases of assembly need to be optimised in both functionality and cost. Another aspect relates to the stability and robustness of processes, a prerequisite for flexibility. Furthermore, as the re-engineering cycle becomes ever more important, all information gathered within the ongoing process becomes vital. In view of these points, product or process verification may be assumed to be an important and integral part of precision assembly. In this paper, product verification is defined as the process of determining whether or not the products, at a given phase in the life-cycle, fulfil the established specifications. Since the product is given its final form and function in assembly, product verification normally takes place somewhere in the assembly line, which is the focus of this paper.
Abstract:
This article reports a longitudinal study that examined mergers between three large multi-site public-sector organizations. Both qualitative and quantitative methods of analysis are used to examine the effect of leadership and change management strategies on acceptance of cultural change by individuals. Findings indicate that in many cases the change that occurs as a result of a merger is imposed on the leaders themselves, and it is often the pace of change that inhibits the successful re-engineering of the culture. In this respect, the success or otherwise of any merger hinges on individual perceptions about the manner in which the process is handled and the direction in which the culture is moved. Communication and a transparent change process are important, as this will often determine not only how a leader will be regarded, but who will be regarded as a leader. Leaders need to be competent and trained in the process of transforming organizations to ensure that individuals within the organization accept the changes prompted by a merger.
Abstract:
Dynamic binary translation is the process of translating, modifying and rewriting executable (binary) code from one machine to another at run-time. This process of low-level re-engineering consists of a reverse engineering phase followed by a forward engineering phase. UQDBT, the University of Queensland Dynamic Binary Translator, is a machine-adaptable translator. Adaptability is provided through the specification of properties of machines and their instruction sets, allowing the support of different pairs of source and target machines. Most binary translators are closely bound to a pair of machines, making analyses and code hard to reuse. Like most virtual machines, UQDBT performs generic optimizations that apply to a variety of machines. Frequently executed code is translated to native code by the use of edge weight instrumentation, which makes UQDBT converge more quickly than systems based on instruction speculation. In this paper, we describe the architecture and run-time feedback optimizations performed by the UQDBT system, and present results obtained on the x86 and SPARC® platforms.
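The edge-weight instrumentation idea can be illustrated with a small sketch (the class and method names are hypothetical, not UQDBT internals): edges between basic blocks are counted at run time, and a block is promoted to native translation once an incoming edge crosses a hotness threshold.

```python
# Sketch of edge-weight instrumentation for a dynamic binary translator
# (illustrative names only; UQDBT's actual internals are not shown here).

HOT_THRESHOLD = 3  # promote a block after its incoming edge fires this often

class Translator:
    def __init__(self):
        self.edge_counts = {}   # (src_block, dst_block) -> execution count
        self.native = set()     # blocks already retranslated to native code

    def record_edge(self, src, dst):
        """Called each time control flows from block src to block dst."""
        key = (src, dst)
        self.edge_counts[key] = self.edge_counts.get(key, 0) + 1
        if self.edge_counts[key] >= HOT_THRESHOLD and dst not in self.native:
            self.translate_to_native(dst)

    def translate_to_native(self, block):
        # A real translator would emit optimized target-machine code here.
        self.native.add(block)

t = Translator()
for _ in range(4):          # a loop back-edge executes repeatedly
    t.record_edge("loop_head", "loop_body")
print(sorted(t.native))     # the hot loop body has been promoted
```

Counting edges rather than speculating on individual instructions is what lets such a system converge on the genuinely hot paths quickly.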
Abstract:
Creativity is increasingly recognised as an essential component of engineering design. This paper describes an exploratory study into the nature and importance of creativity in engineering design problem solving in relation to the possible impact of software design tools. The first stage of the study involved an empirical investigation in the form of a case study of the use of standard CAD tool sets and the development of a systems engineering software support tool. It was found that CAD influenced the creative process in several ways, including enhanced visualisation and communication, but also premature fixation, circumscribed thinking and bounded ideation. The tool development experience uncovered the difficulty of supporting creative processes from the developer's perspective. The issues were the necessity of making assumptions, achieving a balance between structure and flexibility, and the pitfalls of satisfying user wants and needs. The second part of the study involved the development of a model of the creative problem solving process in engineering design. This provided a possible explanation for why purpose-designed engineering software tools might encourage an analytical problem solving approach and discourage a more creative one.
Abstract:
The international economic and business environment continues to develop at a rapid rate. Increasing interactions between economies, particularly between Europe and Asia, have raised many important issues regarding transport infrastructure, logistics and broader supply chain management. The potential exists to further stimulate trade provided that these issues are addressed in a logical and systematic manner. However, if this potential is to be realised in practice there is a need to re-evaluate current supply chain configurations. A mismatch currently exists between the technological capability and the supply chain or logistical reality. This mismatch has sharpened the focus on the need for robust approaches to supply chain re-engineering. Traditional approaches to business re-engineering have been based on manufacturing systems engineering and business process management. A recognition that all companies exist as part of bigger supply chains has fundamentally changed the focus of re-engineering. Inefficiencies anywhere in a supply chain result in the chain as a whole being unable to reach its true competitive potential. This reality, combined with the potentially radical impact on business and supply chain architectures of the technologies associated with electronic business, requires organisations to adopt innovative approaches to supply chain analysis and re-design. This paper introduces a systems approach to supply chain re-engineering which is aimed at addressing the challenges that the evolving business environment brings with it. The approach, which is based on work with a variety of both conventional and electronic supply chains, comprises underpinning principles, a methodology and guidelines on good working practice, as well as a suite of tools and techniques. The adoption of approaches such as that outlined in this paper helps to ensure that robust supply chains are designed and implemented in practice.
This facilitates an integrated approach, with involvement of all key stakeholders throughout the design process.
Abstract:
The thermoforming industry has been relatively slow to embrace modern measurement technologies. As a result, researchers have struggled to develop accurate thermoforming simulations, as some of the key aspects of the process remain poorly understood. For the first time, this work reports the development of a prototype multivariable instrumentation system for use in thermoforming. The system contains sensors for plug force, plug displacement, air pressure and temperature, plug temperature, and sheet temperature. Initially, it was developed to fit the tooling on a laboratory thermoforming machine, but later its performance was validated by installing it on a similar industrial tool. Throughout its development, providing access for the various sensors and their cabling was the most challenging task. In testing, all of the sensors performed well and the data collected has given a powerful insight into the operation of the process. In particular, it has shown that both the air and plug temperatures stabilize at more than 80 °C during the continuous thermoforming of amorphous polyethylene terephthalate (aPET) sheet at 110 °C. The work also highlighted significant differences in the timing and magnitude of the cavity pressures reached in the two thermoforming machines. The prototype system has considerable potential for further development.
Abstract:
Mass Customization (MC) is not a mature business strategy, and hence it is not clear that a single operational model, or a small group of them, dominates. Companies tend to approach MC from either a mass production or a customization origin, and this in itself gives reason to believe that several operational models will be observable. This paper reviews actual and theoretical fulfilment systems that enterprises could apply when offering a pre-engineered catalogue of customizable products and options. The issues considered are:
- how product flows are structured in relation to processes, inventories and decoupling point(s);
- characteristics of the order fulfilment (OF) process that inhibit or facilitate fulfilment;
- the logic of how products are allocated to customers;
- customer factors that influence OF process design and operation.
Diversity in the order fulfilment structures is expected and is found in the literature. The review has identified four structural forms that have been used in a Catalogue MC context:
- fulfilment from stock;
- fulfilment from a single fixed decoupling point;
- fulfilment from one of several fixed decoupling points;
- fulfilment from several locations, with floating decoupling points.
From the review it is apparent that producers are being imaginative in coping with the demands of high variety, high volume, customization and short lead times. These demands have encouraged the relationship between product, process and customer to be re-examined. Not only has this strengthened interest in commonality and postponement but, as reported in the paper, it has led to the re-engineering of the order fulfilment process to create models with multiple fixed decoupling points and the floating decoupling point system.
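The "one of several fixed decoupling points" structure can be sketched as a simple allocation rule. The product attributes, point names and compatibility test below are illustrative assumptions, not taken from the paper: an order is served from the most-downstream stock point whose already-fixed attributes are compatible with it, minimising the remaining customisation work.

```python
# Toy illustration of fulfilment from one of several fixed decoupling points.
# Points are listed upstream -> downstream, with the product attributes
# already fixed at each point (all names here are invented for illustration).
DECOUPLING_POINTS = [
    ("raw_material",   set()),
    ("machined_body",  {"body"}),
    ("painted_body",   {"body", "colour"}),
    ("finished_goods", {"body", "colour", "trim"}),
]

def choose_decoupling_point(order_spec, custom_attrs):
    """Pick the most-downstream point whose fixed attributes all match the
    order; attributes in custom_attrs still require customisation, so they
    must NOT already be fixed at the chosen point."""
    best = DECOUPLING_POINTS[0][0]
    for name, fixed in DECOUPLING_POINTS:
        if fixed <= order_spec and not (fixed & custom_attrs):
            best = name   # further downstream and still compatible
    return best

# An order wanting a custom colour must decouple before the paint stage:
print(choose_decoupling_point({"body", "colour", "trim"}, {"colour"}))
# A fully standard order ships straight from finished goods:
print(choose_decoupling_point({"body", "colour", "trim"}, set()))
```

The rule captures why holding stock at several fixed points lets standard orders ship quickly while customised orders still avoid starting from raw material.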
Abstract:
This work falls within one of the major fields of organizational studies: strategy. The classical perspective in this field promoted the idea that projecting toward the future implies designing a plan (a series of deliberate actions). Later advances showed that strategy could be understood in other ways. However, the evolution of the field to some extent privileged the classical view, establishing, for example, multiple models for 'formulating' a strategy while relegating to second place the way in which it can 'emerge'. The purpose of this research is therefore to contribute to the current level of understanding of emergent strategies in organizations. To do so, a concept opposed to, though complementary with, 'planning' was considered, one in fact very close in nature to this type of strategy: improvisation. Since this concept has drawn on valuable contributions from the world of music, the knowledge of that domain was consulted, using 'metaphor' as a theoretical device to understand it and achieve the proposed objective. The results show that 1) deliberate and emergent strategies coexist and complement each other, 2) improvisation is always present in the organizational context, 3) improvisation is more intense in the 'how' of strategy than in the 'what' and, contrary to the conventional idea on the matter, 4) a certain amount of preparation is required in order to improvise adequately.
Abstract:
The present study was carried out on six different ore types from the Salitre Alkaline Complex, aiming to determine their mineralogical composition and the major features that are relevant to mineral processing. The P2O5 grades vary from 9 to 25%. The slime content (-0.020 mm) varies between 20 and 34% (w/w) and carries 17-22% of the P2O5 content. The samples essentially consist of apatite, iron oxi-hydroxides, ilmenite, clay minerals, carbonate, quartz, pyroxene, perovskite, secondary phosphates and other minor accessory minerals. Below 0.21 mm, apatite essentially occurs as free particles showing a clean surface or a weak coating of iron oxi-hydroxides; the highly covered apatite (not recoverable by flotation) varies from 6 to 9%. In the deslimed fraction (above 0.020 mm) more than 97% of the total phosphorus content occurs as apatite; the estimated P2O5 potential recovery in flotation concentration is over 90% (71-76% overall recovery).
Abstract:
Fault detection and isolation (FDI) are important steps in the monitoring and supervision of industrial processes. Biological wastewater treatment (WWT) plants are difficult to model, and hence to monitor, because of the complexity of the biological reactions and because plant influent and disturbances are highly variable and/or unmeasured. Multivariate statistical models have been developed for a wide variety of situations over the past few decades, proving successful in many applications. In this paper we develop a new monitoring algorithm based on Principal Components Analysis (PCA). It can be seen equivalently as making Multiscale PCA (MSPCA) adaptive, or as a multiscale decomposition of adaptive PCA. Adaptive Multiscale PCA (AdMSPCA) exploits the changing multivariate relationships between variables at different time-scales. Adaptation of scale PCA models over time permits them to follow the evolution of the process, inputs or disturbances. Performance of AdMSPCA and adaptive PCA on a real WWT data set is compared and contrasted. The most significant difference observed was the ability of AdMSPCA to adapt to a much wider range of changes. This was mainly due to the flexibility afforded by allowing each scale model to adapt whenever it did not signal an abnormal event at that scale. Relative detection speeds were examined only summarily, but seemed to depend on the characteristics of the faults/disturbances. The results of the algorithms were similar for sudden changes, but AdMSPCA appeared more sensitive to slower changes.
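The core monitoring step can be sketched with plain PCA, a deliberately simplified stand-in for AdMSPCA (the synthetic data, single retained component and the ad hoc SPE threshold below are all illustrative, not the paper's algorithm): a model of normal operation is fitted, and samples whose residual, the squared prediction error (SPE), exceeds a limit are flagged as faults.

```python
# Minimal PCA-based process monitoring sketch (NOT the AdMSPCA algorithm):
# fit PCA on normal operating data, flag samples with abnormal residuals.
import numpy as np

rng = np.random.default_rng(0)

# "Normal" training data: two correlated process variables plus noise.
t = rng.normal(size=(200, 1))
X_train = np.hstack([t, 2 * t]) + 0.05 * rng.normal(size=(200, 2))

mean = X_train.mean(axis=0)
Xc = X_train - mean
# Principal directions come from the SVD of the centred data.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:1].T                      # retain 1 principal component

def spe(x):
    """Squared prediction error of sample x under the PCA model."""
    xc = x - mean
    residual = xc - P @ (P.T @ xc)
    return float(residual @ residual)

# Control limit from the training residuals (mean + 6 std; ad hoc here).
train_spe = np.array([spe(x) for x in X_train])
limit = train_spe.mean() + 6 * train_spe.std()

normal_sample = np.array([1.0, 2.0])    # follows the x2 = 2*x1 relation
faulty_sample = np.array([1.0, -2.0])   # breaks the correlation structure
print(spe(normal_sample) <= limit, spe(faulty_sample) > limit)
```

AdMSPCA layers two refinements on this skeleton: the data are first decomposed over several time-scales (so slow drifts and sharp spikes are monitored separately), and each scale's PCA model is updated over time whenever that scale shows no abnormality.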
Abstract:
We are witnessing enormous growth in biological nitrogen removal from wastewater. It presents specific challenges beyond traditional COD (carbon) removal. One possibility for optimised process design is the use of biomass-supporting media. In this paper, attached growth processes (AGP) are evaluated using dynamic simulations. The advantages of these systems, which were qualitatively described elsewhere, are validated quantitatively based on a simulation benchmark for activated sludge treatment systems. This simulation benchmark is extended with a biofilm model that allows fast and accurate simulation of the conversion of different substrates in a biofilm. The economic feasibility of this system is evaluated using the data generated with the benchmark simulations. Capital savings due to volume reduction and reduced sludge production are weighed against increased aeration costs. In this evaluation, effluent quality is integrated as well.
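The kind of dynamic simulation such benchmarks rest on can be sketched with a single Monod growth equation integrated by explicit Euler. This is a minimal one-substrate sketch, not the benchmark model itself, and the parameter values are illustrative.

```python
# Minimal dynamic simulation in the spirit of activated-sludge benchmarking:
# Monod growth of biomass X on substrate S in a batch reactor, integrated
# with explicit Euler. Parameter values are illustrative only.
MU_MAX = 4.0    # 1/d, maximum specific growth rate
K_S    = 10.0   # g/m^3, half-saturation constant
Y      = 0.67   # g biomass formed per g substrate consumed (yield)

def simulate(S0, X0, t_end, dt=0.001):
    """Integrate substrate S and biomass X from t=0 to t_end (days)."""
    S, X = S0, X0
    for _ in range(int(t_end / dt)):
        mu = MU_MAX * S / (K_S + S)      # Monod kinetics
        dX = mu * X                      # biomass growth rate
        dS = -dX / Y                     # substrate consumption rate
        X += dt * dX
        S = max(S + dt * dS, 0.0)        # concentrations cannot go negative
    return S, X

S_end, X_end = simulate(S0=100.0, X0=50.0, t_end=1.0)
print(round(S_end, 3), round(X_end, 2))
```

Note the built-in mass balance: every gram of substrate removed appears as Y grams of biomass, so X + Y*S stays constant over the run; checks like this are a quick way to validate an extended biofilm model before trusting its economics.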
Abstract:
This paper reports on the development of specific slicing techniques for functional programs and their use in identifying possible coherent components within monolithic code. An associated tool is also introduced. This piece of research is part of a broader project on program understanding and re-engineering of legacy code supported by formal methods.
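The core idea of a backward slice can be sketched on a toy dependency graph. The definitions and the dependency map below are invented for illustration and are not the paper's functional-program analysis: a slice collects everything the chosen criterion transitively depends on, and whatever falls outside it is a candidate for a separate component.

```python
# Illustrative backward slicing over a toy dependency graph.
# Map each definition to the definitions it directly uses (invented example).
DEPENDS_ON = {
    "total":    {"price", "tax"},
    "tax":      {"price", "rate"},
    "price":    set(),
    "rate":     set(),
    "greeting": set(),          # unrelated to the slicing criterion
}

def backward_slice(criterion, deps):
    """Return all definitions the criterion transitively depends on."""
    slice_set, work = set(), [criterion]
    while work:
        node = work.pop()
        if node in slice_set:
            continue
        slice_set.add(node)
        work.extend(deps.get(node, ()))
    return slice_set

print(sorted(backward_slice("total", DEPENDS_ON)))
# -> ['price', 'rate', 'tax', 'total']; "greeting" is outside the slice,
# so it is a candidate for extraction into a separate coherent component.
```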
Abstract:
This paper presents an optimization study of a distillation column for methanol and aqueous glycerol separation in a biodiesel production plant. Using the available physical data of the column configuration, a steady-state model of the column was built in Aspen-HYSYS as the process simulator. Several sensitivity analyses were performed in order to better understand the relations between the variables of the distillation process. With the information obtained from the simulator, it is possible to define the best range for some operational variables that keep the composition of the desired product within specifications, and to choose operational conditions that minimize energy consumption.
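A sensitivity sweep of this kind can be sketched generically. The "column model" below is a toy surrogate with made-up coefficients, not an Aspen-HYSYS simulation; it only shows the pattern of sweeping one operating variable, checking the product specification, and picking the cheapest feasible setting.

```python
# Generic sensitivity-analysis sketch: sweep one operating variable over a
# grid and record the response. Both response functions are toy surrogates
# with invented coefficients (NOT a rigorous column model).
import math

def methanol_purity(reflux_ratio):
    """Toy surrogate: purity rises with reflux and saturates."""
    return 1.0 - 0.25 * math.exp(-1.2 * reflux_ratio)

def reboiler_duty(reflux_ratio):
    """Toy surrogate: energy use grows roughly linearly with reflux."""
    return 100.0 + 80.0 * reflux_ratio   # arbitrary energy units

def sweep(spec=0.99):
    """Find the smallest reflux ratio on the grid that meets the purity
    spec; since duty increases with reflux, it is also the cheapest."""
    for i in range(1, 51):
        rr = 0.1 * i
        if methanol_purity(rr) >= spec:
            return rr, reboiler_duty(rr)
    return None

best_rr, duty = sweep()
print(best_rr, duty)
```

The same loop shape applies when the response comes from a real simulator call instead of a surrogate: the sweep maps out the feasible operating range, and the optimum is read off where the specification is just met.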
Abstract:
Dissertation presented to obtain the Doutoramento (Ph.D.) degree in Biochemistry at the Instituto de Tecnologia Química e Biológica da Universidade Nova de Lisboa.
Abstract:
The objective of this project is to use the new aspect-oriented programming (AOP) paradigm to perform re-engineering tasks. The aim is that, with the help of this technology, information can be extracted from the execution of an application, so that from this information the use-case diagram can be obtained.
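The idea can be sketched in Python using a decorator as a stand-in for AOP "around" advice (the project itself targets AOP frameworks proper; the function names and the scenario below are illustrative): calls are intercepted and logged without touching the application logic, and the resulting trace is the raw material for reconstructing a use-case diagram.

```python
# Sketch of execution tracing for use-case recovery, with a decorator
# playing the role of AOP advice (illustrative names and scenario).
import functools

execution_trace = []   # ordered log of intercepted operations

def traced(func):
    """Advice-like wrapper: record every call, leave behaviour unchanged."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        execution_trace.append(func.__name__)
        return func(*args, **kwargs)
    return wrapper

@traced
def login(user):
    return f"session:{user}"

@traced
def list_orders(session):
    return ["order-1", "order-2"]

@traced
def logout(session):
    return None

# Running one user scenario yields the call sequence for that use case:
s = login("ana")
list_orders(s)
logout(s)
print(execution_trace)   # ['login', 'list_orders', 'logout']
```

In a real AOP setting the same interception is woven in without decorating each function by hand, which is what makes the approach attractive for instrumenting legacy applications.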