854 results for Process Design
Abstract:
This paper describes how dimensional variation management could be integrated throughout design, manufacture and verification to improve quality while reducing cycle times and manufacturing cost in the Digital Factory environment. Initially, variation analysis is used to optimize tolerances during product and tooling design; this also results in the creation of a simplified representation of product key characteristics. This simplified representation can then be used to carry out measurability analysis and process simulation. The link established between the variation analysis model and measurement processes can subsequently be used throughout the production process to automatically update the variation analysis model in real time with measurement data. This 'live' simulation of variation during manufacture will allow early detection of quality issues and facilitate autonomous measurement-assisted processes such as predictive shimming. A study is described showing how these principles can be demonstrated using commercially available software combined with a number of prototype applications operating as discrete modules. The commercially available modules include Catia/Delmia for product and process design, 3DCS for variation analysis and Spatial Analyzer for measurement simulation. Prototype modules are used to carry out measurability analysis and instrument selection. Realizing the full potential of metrology in the Digital Factory will require that these modules be integrated, and a software architecture to facilitate this is described. Crucially, this integration must facilitate the use of real-time metrology data describing the emerging assembly to update the digital model.
Abstract:
The conventional, geometrically lumped description of the physical processes inside a high shear granulator is not reliable for process design and scale-up. In this study, a compartmental Population Balance Model (PBM) with spatial dependence is developed and validated on two lab-scale high shear granulation processes, using a 1.9L MiPro granulator and a 4L DIOSNA granulator. The compartmental structure is built using a heuristic approach based on computational fluid dynamics (CFD) analysis, which accounts for the overall flow pattern, velocity and solids concentration. The constant volume Monte Carlo approach is implemented to solve the multi-compartment population balance equations. Different spatially dependent mechanisms are included in the compartmental PBM to describe granule growth. It is concluded that for both cases (low and high liquid content), the adjustment of parameters (e.g. layering, coalescence and breakage rates) can provide a quantitative prediction of the granulation process.
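The constant-number Monte Carlo idea used to solve the population balance can be sketched in a few lines of Python. This is an illustrative pure-coalescence toy with an assumed size-independent kernel, not the authors' multi-compartment implementation; the function name and parameters are hypothetical.

```python
import random

def constant_volume_mc(volumes, coalescence_rate, t_end, seed=0):
    """Constant-number Monte Carlo for a pure-coalescence population balance.

    After each coalescence event a randomly chosen particle is duplicated,
    so the sample size (and statistical resolution) stays constant.
    """
    rng = random.Random(seed)
    v = list(volumes)
    n = len(v)
    t = 0.0
    while True:
        # Size-independent kernel: the total event rate scales with pair count.
        total_rate = coalescence_rate * n * (n - 1) / 2
        t += rng.expovariate(total_rate)
        if t > t_end:
            return v
        i, j = rng.sample(range(n), 2)
        v[i] += v[j]                # particles i and j coalesce
        v[j] = v[rng.randrange(n)]  # restore the sample size by duplication
```

Tracking one such evolving volume distribution per compartment, with compartment-specific rates for layering, coalescence and breakage, is the essence of the multi-compartment extension.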
Abstract:
The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems together with enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. It is claimed that enterprise 2.0 can support flexible business process management and so incorporate informal and less structured interactions. A traditional view, however, is that efficiency and flexibility objectives are incompatible, as they are different business objectives pursued separately in different organizational environments. Thus an ERP system, whose primary objective is improving efficiency, and an enterprise 2.0 system, whose primary aim is improving flexibility, may represent a contradiction and lead to a high risk of failure if adopted simultaneously. This chapter uses case study analysis to investigate the combined use of ERP and enterprise 2.0 in a single enterprise with the aim of improving both efficiency and flexibility in operations. The chapter provides an in-depth analysis of the combination of ERP with enterprise 2.0 based on socio-technical information systems management theory. It also summarizes the benefits of combining ERP systems and enterprise 2.0 and how they could contribute to the development of a new generation of business management that combines both formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, while retaining the flexibility to react agilely to internal and external events.
Abstract:
The mixing performance of three passive milli-scale reactors with different geometries was investigated at different Reynolds numbers. The effects of design and operating characteristics such as mixing channel shape and volume flow rate were investigated. The main objective of this work was to demonstrate a process design method that uses Computational Fluid Dynamics (CFD) for modeling and Additive Manufacturing (AM) technology for manufacture. The reactors were designed and simulated using SolidWorks and Fluent 15.0 software, respectively. Manufacturing of the devices was performed with an EOS M-series AM system. Step response experiments with distilled Millipore water and sodium hydroxide solution provided time-dependent concentration profiles. Villermaux-Dushman reaction experiments were also conducted for additional verification of the CFD results and for evaluating the mixing efficiency of the different geometries. Time-dependent concentration data and reaction evaluation showed that the performance of the AM-manufactured reactors matched the CFD results reasonably well. The proposed design method allows the implementation of new and innovative solutions, especially in the process design phase, for industrial-scale reactor technologies. In addition, it enables rapid implementation, owing to the virtual flow design and to fast manufacturing that uses the same geometric file formats.
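A step-response concentration profile is often reduced to a single mixing-performance number, e.g. the time to first reach a given fraction of the fully mixed outlet value. The sketch below is a generic reduction of that kind, not necessarily the metric used in this study:

```python
def mixing_time(times, conc, threshold=0.95):
    """Return the first time at which the outlet concentration reaches
    `threshold` times its final (fully mixed) value, or None if it never does."""
    c_final = conc[-1]
    for t, c in zip(times, conc):
        if c >= threshold * c_final:
            return t
    return None
```

Applying the same reduction to both the simulated and the measured profiles gives one comparable number per geometry and flow rate.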
Abstract:
Designing for users rather than with users is still a common practice in technology design and innovation, as opposed to taking them on board in the process. Design for inclusion aims to define and understand end-users, their needs and their context of use, and, by doing so, ensure that end-users are catered for and included, while the results are geared towards universality of use. We describe the central role of end-user and designer participation, immersion and perspective-taking in building user-driven solutions. These approaches provided a critical understanding of the counterpart's role. Designers could understand what the users' needs were, experience physical impairments, and see the interaction with the environment from the other's perspective. Users could understand the challenges of designing for physical impairments, build a sense of ownership of the technology and explore it from a creative perspective. The understanding of the peer's role (user and designer), needs and perspective enhanced user participation and inclusion.
Abstract:
Single-cell oils (SCO) have been considered a promising source of third-generation biofuels, mainly in the final form of biodiesel. However, high production costs have been a barrier to the commercialization of this commodity. The fast-growing yeast Rhodosporidium toruloides NCYC 921 has been widely reported as a potential SCO-producing yeast. In addition to its well-known high lipid content (which can be converted into biodiesel), it is rich in high-value-added products such as carotenoids of commercial interest. Process design and integration may contribute to reducing the overall cost of biofuel and carotenoid production and is a mandatory step towards their commercialization. The present work addresses biomass disruption, extraction, fractionation and recovery of products, with special emphasis on high-value-added carotenoids (beta-carotene, torulene, torularhodin) and fatty acids directed to biodiesel. The chemical structure of torularhodin, with a terminal carboxylic group, imposes an extra challenge in what concerns its separation from fatty acids. The proposed feedstock is a fresh biomass pellet obtained directly by centrifugation from a 5L fed-batch fermentation culture broth. The use of wet instead of lyophilised biomass feedstock is a way to decrease processing energy costs and reduce downstream processing time. These results will contribute to a detailed process design. The gathered data will be of crucial importance for a further study on Life-Cycle Assessment (LCA).
Abstract:
Mass Customization (MC) is not a mature business strategy, and hence it is not yet clear whether a single operational model, or a small group of them, will dominate. Companies tend to approach MC from either a mass production or a customization origin, and this in itself gives reason to believe that several operational models will be observable. This paper reviews actual and theoretical fulfilment systems that enterprises could apply when offering a pre-engineered catalogue of customizable products and options. Issues considered are:
- how product flows are structured in relation to processes, inventories and decoupling point(s);
- characteristics of the order fulfilment (OF) process that inhibit or facilitate fulfilment;
- the logic of how products are allocated to customers;
- customer factors that influence OF process design and operation.
Diversity in order fulfilment structures is expected and is found in the literature. The review has identified four structural forms that have been used in a Catalogue MC context:
- fulfilment from stock;
- fulfilment from a single fixed decoupling point;
- fulfilment from one of several fixed decoupling points;
- fulfilment from several locations, with floating decoupling points.
From the review it is apparent that producers are being imaginative in coping with the demands of high variety, high volume, customization and short lead times. These demands have encouraged the relationship between product, process and customer to be re-examined. Not only has this strengthened interest in commonality and postponement, but, as reported in the paper, it has led to the re-engineering of the order fulfilment process to create models with multiple fixed decoupling points and the floating decoupling point system.
Abstract:
The knowledge of the liquid-liquid equilibria (LLE) between ionic liquids (ILs) and water is of utmost importance for environmental monitoring, process design and optimization. Therefore, in this work, the mutual solubilities with water, for the ILs combining the 1-methylimidazolium, [C(1)im](+); 1-ethylimidazolium, [C(2)im](+); 1-ethyl-3-propylimidazolium, [C(2)C(3)im](+); and 1-butyl-2,3-dimethylimidazolium, [C(4)C(1)C(1)im](+) cations with the bis(trifluoromethylsulfonyl)imide anion, were determined and compared with the isomers of the symmetric 1,3-dialkylimidazolium bis(trifluoromethylsulfonyl)imide ([C(n)C(n)im][NTf2], with n = 1-3) and of the asymmetric 1-alkyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide ([C(n)C(1)im][NTf2], with n = 2-5) series of ILs. The results obtained provide a broad picture of the impact of the IL cation structural isomerism, including the number of alkyl side chains at the cation, on the water-IL mutual solubilities. Despite the hydrophobic behaviour associated with the [NTf2](-) anion, the results show a significant solubility of water in the IL-rich phase, while the solubility of ILs in the water-rich phase is much lower. The thermodynamic properties of solution indicate that the solubility of ILs in water is entropically driven and highly influenced by the cation size. Using the results obtained here in addition to literature data, a correlation between the solubility of [NTf2]-based ILs in water and their molar volume, for a large range of cations, is proposed. The COnductor-like Screening MOdel for Real Solvents (COSMO-RS) was also used to estimate the LLE of the investigated systems and proved to be a useful predictive tool for the a priori screening of ILs, aimed at finding suitable candidates before extensive experimental measurements.
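The proposed solubility vs. molar-volume correlation amounts to a least-squares fit of log-solubility against cation-dependent molar volume. A minimal sketch, with placeholder data points rather than the measured values from this work:

```python
import numpy as np

# Illustrative (molar volume / cm^3 mol^-1, ln x_IL) pairs -- placeholders,
# not the experimental solubilities reported in the paper.
Vm = np.array([230.0, 250.0, 270.0, 290.0, 310.0])
lnx = np.array([-8.1, -8.9, -9.7, -10.5, -11.3])

# Least-squares line: ln x_IL = a * Vm + b
a, b = np.polyfit(Vm, lnx, 1)

def predict_ln_solubility(vm):
    """Estimate ln(solubility) of an [NTf2]-based IL from its molar volume."""
    return a * vm + b
```

A negative fitted slope reflects the trend that solubility in water decreases as the cation grows.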
Abstract:
The project goal was to determine plant operations and maintenance workers' levels of exposure to mercury during routine and non-routine (i.e. turnarounds and inspections) maintenance events in eight gas processing plants. The project team prepared sampling and analysis plans tailored to each plant's process design and scheduled maintenance events. Occupational exposure sampling and monitoring efforts focused on the measurement of mercury vapor concentration in worker breathing zone air during specific maintenance events, including pipe scraping, process filter replacement, and process vessel inspection. Similar exposure groups were identified, and worker breathing zone and ambient air samples were collected and analyzed for total mercury. Occupational exposure measurement techniques included portable field monitoring instruments, standard passive and active monitoring methods, and an emerging passive absorption technology. Process sampling campaigns focused on inlet gas streams, mercury removal unit outlets, treated gas, acid gas and sales gas. The results were used to identify process areas with increased potential for mercury exposure during maintenance events. Sampling methods used for the determination of total mercury in gas-phase streams were based on US EPA Methods 30B, 1631 and 1669. The results of four six-week sampling campaigns have been evaluated, and conclusions and recommendations have been made. The author's role in this project included the direction of all field phases of the project and the development and implementation of the sampling strategy. Additionally, the author participated in the development and implementation of the Quality Assurance Project Plan, the Data Quality Objectives, and the identification of Similar Exposure Groups. All field-generated data were reviewed by the author, along with laboratory reports, in order to generate conclusions and recommendations.
Abstract:
Companies operating in the wood processing industry need to increase their productivity by implementing automation technologies in their production systems: increasing global competition and rising raw material prices challenge their competitiveness. Yet overly extensive automation brings risks, such as deteriorated situation awareness and operator deskilling. The concept of Levels of Automation is generally seen as a means to achieve a balanced task allocation between the operators' skills and competences and the need for automation technology relieving humans of repetitive or hazardous work activities. The aim of this thesis was to examine to what extent existing methods for assessing Levels of Automation in production processes are applicable in the wood processing industry when focusing on improved competitiveness of production systems. This was done by answering the following research questions (RQ): RQ1: Which method is most appropriate for measuring Levels of Automation in the wood processing industry? RQ2: How can the measurement of Levels of Automation contribute to improved competitiveness of the wood processing industry's production processes? Literature reviews were used to identify the main characteristics of the wood processing industry affecting its automation potential, and appropriate assessment methods for Levels of Automation, in order to answer RQ1. When selecting the most suitable method, factors such as relevance to the target industry, application complexity and the operational level the method addresses were important. The DYNAMO++ method, which covers both a rather quantitative technical-physical dimension and a more qualitative social-cognitive dimension, was seen as most appropriate when taking these factors into account.
To answer RQ2, a case study was undertaken at a major Swedish manufacturer of interior wood products to show how the measurement of Levels of Automation can contribute to improved competitiveness of the wood processing industry. The focus was on the task level on the shop floor, and concrete improvement suggestions were elaborated after applying the measurement method for Levels of Automation. The main aspects considered for generalization were enhancements regarding ergonomics in process design and cognitive support tools for shop-floor personnel through task standardization. Furthermore, difficulties in automating grading and sorting processes, due to the heterogeneous material properties of wood, argue for a suitable arrangement of human intervention options in terms of work task allocation. The application of a modified version of DYNAMO++ revealed its pros and cons during a case study featuring high operator involvement in the improvement process, as well as the distinct predisposition of DYNAMO++ to be applied in an assembly system.
Abstract:
The design optimization of industrial products has always been an essential activity to improve product quality while reducing time-to-market and production costs. Although cost management is very complex and spans all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), allows compliance with product and process requirements. Hence, tolerance-cost optimization becomes the main practice providing an effective application of Design for Tolerancing (DfT) and Design to Cost (DtC) approaches, by enabling a connection between product tolerances and associated manufacturing costs. However, despite the growing interest in this topic, a profitable industrial application of these techniques is hampered by their complexity: the definition of a systematic framework is the key element for improving design optimization, enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. The present doctoral research aims to define and develop an integrated methodology for product/process design optimization that better exploits the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided Integrated framework for tolerance-cost optimization is proposed to allow the integration of DfT and DtC approaches and their direct application to the design of automotive components. Several case studies have been considered, with the final application of the integrated framework to a high-performance V12 engine assembly, achieving both functional targets and cost reduction. From a scientific point of view, the proposed methodology provides an improvement for the tolerance-cost optimization of industrial components. The integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs.
The case studies proved the suitability of the methodology for application in the industrial field, identifying further areas for improvement and refinement.
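The tolerance-cost link at the core of such frameworks can be illustrated with a classic reciprocal cost model and a root-sum-square (RSS) stack-up budget. This is a minimal sketch under those assumptions, not the thesis' Computer-Aided Integrated framework; all names and coefficients are hypothetical:

```python
import math

def rss_stack(tols):
    """Root-sum-square stack-up of component tolerances."""
    return math.sqrt(sum(t * t for t in tols))

def scale_to_budget(tols, budget):
    """Uniformly scale component tolerances so the RSS stack-up exactly
    meets the assembly budget (tightening them raises manufacturing cost)."""
    factor = budget / rss_stack(tols)
    return [t * factor for t in tols]

def reciprocal_cost(tols, cost_coeffs):
    """Reciprocal cost model: manufacturing cost grows as tolerances tighten."""
    return sum(k / t for k, t in zip(cost_coeffs, tols))
```

Comparing `reciprocal_cost` before and after `scale_to_budget` makes the trade-off between a tighter assembly requirement and manufacturing cost explicit, which is the quantity a tolerance-cost optimizer minimizes.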
Abstract:
This study evaluated the effect of specimen design and manufacturing process on microtensile bond strength, internal stress distributions (Finite Element Analysis, FEA) and specimen integrity by means of Scanning Electron Microscopy (SEM) and Laser Scanning Confocal Microscopy (LCM). Excite was applied to flat enamel surfaces, and resin composite build-ups were made incrementally with 1-mm increments of Tetric Ceram. Teeth were cut using a diamond disc or a diamond wire, obtaining 0.8 mm² stick-shaped specimens, or were shaped with a Micro Specimen Former, obtaining dumbbell-shaped specimens (n = 10). Samples were randomly selected for SEM and LCM analysis. The remaining samples underwent the microtensile test, and results were analyzed with ANOVA and the Tukey test. The FEA dumbbell-shaped model resulted in a more homogeneous stress distribution. Nonetheless, dumbbell-shaped specimens failed at lower bond strengths (21.83 ± 5.44 MPa)^c than stick-shaped specimens (sectioned with wire: (42.93 ± 4.77 MPa)^a; sectioned with disc: (36.62 ± 3.63 MPa)^b), due to geometric irregularities related to the manufacturing process, as noted in the microscopic analyses (different superscript letters indicate statistically different groups). It could be concluded that stick-shaped, non-trimmed specimens, sectioned with diamond wire, are preferred for enamel specimens as they can be prepared in a less destructive, easier, and more precise way.
Abstract:
The simultaneous design of the steady-state and dynamic performance of a process can satisfy much more demanding dynamic performance criteria than designing the dynamics only through the connection of a control system. A method for designing process dynamics based on the eigenvalues of a linearised system has been developed. The eigenvalues are associated with system states using the unit perturbation spectral resolution (UPSR), characterising the dynamics of each state. The design method uses a homotopy approach to determine a final design which satisfies both steady-state and dynamic performance criteria. A highly interacting single-stage forced circulation evaporator system, including control loops, was designed by this method with the goal of reducing the time taken for the liquid composition to reach steady state. Initially the system was successfully redesigned to speed up the eigenvalue associated with the liquid composition state, but this did not result in improved startup performance. Further analysis showed that the integral action of the composition controller was the source of the limiting eigenvalue. Design changes made to speed up this eigenvalue did result in improved startup performance. The proposed approach provides a structured way to address the design-control interface, giving significant insight into the dynamic behaviour of the system, such that a systematic design or redesign of an existing system can be undertaken with confidence.
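The core step, relating the eigenvalues of a linearised model to the settling behaviour of individual states, can be illustrated on a hypothetical two-state system dx/dt = Ax. The matrix entries below are assumptions for illustration, not the evaporator model:

```python
import numpy as np

# Jacobian of a hypothetical linearised two-state process, dx/dt = A x.
A = np.array([[-0.2,  0.05],
              [ 0.01, -2.0 ]])

eigvals, eigvecs = np.linalg.eig(A)

# The eigenvalue with the smallest |Re| limits the settling time; its
# eigenvector indicates which state dominates the slow mode (the role
# played by the UPSR association in the paper's method).
slow = np.argmin(np.abs(eigvals.real))
settling_time = 4.0 / abs(eigvals[slow].real)  # ~4 time constants to ~2%
```

Speeding up the design then means modifying parameters that push this limiting eigenvalue further into the left half-plane, while rechecking the steady-state criteria.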
Abstract:
Ecological interface design (EID) is proving to be a promising approach to the design of interfaces for complex dynamic systems. Although the principles of EID and examples of its effective use are widely available, few readily available examples exist of how the individual displays that constitute an ecological interface are developed. This paper presents the semantic mapping process within EID in the context of prior theoretical work in this area. The semantic mapping process that was used in developing an ecological interface for the Pasteurizer II microworld is outlined, and the results of an evaluation of the ecological interface against a more conventional interface are briefly presented. Subjective reports indicate features of the ecological interface that made it particularly valuable for participants. Finally, we outline the steps of an analytic process for using EID. The findings presented here can be applied in the design of ecological interfaces or of configural displays for dynamic processes.
Abstract:
The work presented herein follows ongoing research that aims to analyze methodological practices to be applied in Design Education. A reflection about methodological strategies in Design Education and the function of drawing in Design represents the beginning of this study. We then developed an interdisciplinary pedagogical experience with the Graphic Design 1st-grade students of our institution (IPCA). In the current academic year, 2013/2014, we continued to evolve this project, introducing changes to the initial proposal. The major alterations focused on the aspects that could be strengthened in terms of interdisciplinarity. In this article, the authors describe those changes and discuss the outcomes of the new proposal. As we have already reported, this investigation follows a reflection about working methods to be adopted in Design Education. This is in accordance with other previously published works that propose the enlargement of Design into new knowledge fields such as Experience or Service Design, changing not only the role of the graphic designer but also the skills required to be a professional designer (Findelli, 2001; Lawson, 2006; Ciampa-Brewer, 2010). Furthermore, concepts such as cooperation or multidisciplinary design, amongst others, have been frequently debated as design teaching strategies (Heller and Talarico, 2011, pp. 82-85). These educational approaches also have an impact on our research. The analysis of all these authors' contributions, together with a reflection on our teaching practice, allowed us to propose an improved interdisciplinary intervention.