Abstract:
Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features that are present in a complete solid/liquid separation process. Furthermore, most of the filtration theories require experimental work to be performed in order to obtain critical parameters required by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is practically impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed, it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables. It also helps to perform experimental work more economically. Design of experiments is a prerequisite for creating empirical models that can describe how the measured response is related to changes in the values of the variables. A software package was developed that provides a filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with the graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter and ceramic capillary action disc filter. It is also possible to create experimental designs for cases where the variables are totally user defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data and to create models for each of the measured responses. Introducing the structure of the software in more detail and showing some of its practical applications forms the main part of this thesis. This approach to the study of cake filtration processes, as presented in this thesis, has been shown to have good practical value when performing filtration tests.
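The abstract above describes, at a high level, what the LTDoE and LTRead modules do: generate an experimental design and fit linear regression models to the measured responses. The snippet below is only a minimal sketch of that kind of workflow (a two-level full factorial design fitted by ordinary least squares); the factor names and response values are made up, and it is not the thesis software.

```python
# Minimal sketch (not the LTDoE/LTRead code): build a 2^3 full factorial
# design in coded units, take hypothetical filtration test results, and fit
# a first-order linear regression model by ordinary least squares.
import itertools
import numpy as np

factors = ["pressure", "slurry_concentration", "filtration_time"]  # assumed variables
design = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))

# Hypothetical measured response (e.g. cake moisture, %) for the 8 runs.
response = np.array([22.1, 20.4, 19.8, 18.2, 21.5, 19.9, 19.1, 17.6])

# Model matrix: intercept + main effects (interaction columns could be added the same way).
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)

for name, b in zip(["intercept"] + factors, coeffs):
    print(f"{name:>22s}: {b:+.3f}")
```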
Abstract:
This thesis describes the creation of a pipework data structure for design system integration. The work was carried out in a pulp and paper plant delivery company, with global engineering network operations in mind. A use case running from process design to 3D pipework design is introduced, including the influence of subcontracting engineering offices. The company's data elements were gathered through key person interviews, and the results were processed into a pipework data element list. Inter-company co-operation was carried out in a standardization association, where a common standard for pipework data elements was agreed. As a result, the jointly created pipework data element list is introduced. Further usage and development of the list, as well as its relation to design software vendors, are evaluated.
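The abstract does not list the actual data elements agreed in the standardization work. Purely as a hypothetical illustration, the record below shows the kind of attributes a pipework data element exchanged between process design and 3D pipework design might carry.

```python
# Hypothetical illustration only: the thesis defines the actual element list.
# A pipework "data element" record of the kind exchanged between process
# design and 3D pipework design systems.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PipeworkDataElement:
    line_id: str               # pipeline identification code
    nominal_diameter_mm: int   # DN size
    pressure_class: str        # e.g. "PN16"
    material: str              # pipe material specification
    insulation: Optional[str]  # insulation class, if any
    design_system: str         # originating system (process design, 3D CAD, ...)

element = PipeworkDataElement("PL-1001", 150, "PN16", "EN 1.4404", None, "process design")
print(element)
```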
Abstract:
TRIZ is one of the well-known tools based on analytical methods for creative problem solving. This thesis suggests an adapted version of the contradiction matrix, a powerful tool of TRIZ, together with a few principles based on the concepts of the original TRIZ. It is believed that the proposed version would aid in problem solving, especially for problems encountered in the chemical process industries with unit operations. In addition, this thesis should help fresh process engineers to recognize the importance of the various available methods for creative problem solving and to learn the TRIZ method. The work mainly provides an idea of how to modify a TRIZ-based method according to one's requirements so that it fits a particular niche area and solves problems efficiently in a creative way. In this case, the contradiction matrix developed is based on a review of common problems encountered in the chemical process industry, particularly in unit operations, and the resolutions are based on approaches used in the past to handle those issues.
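As a rough illustration of how a contradiction matrix is used (the improving/worsening parameters and suggested principles below are invented, not the adapted matrix developed in the thesis):

```python
# Illustrative only: the actual adapted matrix is defined in the thesis.
# A contradiction matrix maps a pair (parameter to improve, parameter that
# worsens) to a short list of suggested inventive principles.
matrix = {
    ("separation efficiency", "energy consumption"): ["segmentation", "periodic action"],
    ("throughput", "product purity"): ["preliminary action", "intermediary"],
}

def suggest_principles(improving: str, worsening: str):
    """Return the suggested principles for a contradiction, if tabulated."""
    return matrix.get((improving, worsening), ["no entry - consult the full matrix"])

print(suggest_principles("throughput", "product purity"))
```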
Improving the competitiveness of electrolytic Zinc process by chemical reaction engineering approach
Abstract:
This doctoral thesis describes the development work performed on the leach and purification sections of the electrolytic zinc plant in Kokkola to increase the efficiency of these two stages, and thus the competitiveness of the plant. Since metallic zinc is a typical bulk product, improving the competitiveness of a plant is mostly an issue of decreasing unit costs. The problems in leaching were the low recovery of valuable metals from raw materials and the fact that the available technology offered only complicated and expensive processes to overcome this. In the purification, the main problem was the consumption of zinc powder - up to four to six times the stoichiometric demand. This reduced the capacity of the plant, as this zinc is re-circulated through the electrolysis, which is the absolute bottleneck in a zinc plant. Low selectivity gave low-grade and low-value precipitates for further processing to metallic copper, cadmium, cobalt and nickel. Knowledge of the underlying chemistry was poor, and process interruptions causing losses of zinc production were frequent. Studies on leaching comprised the kinetics of ferrite leaching and jarosite precipitation, as well as the stability of jarosite in acidic plant solutions. A breakthrough came with the finding that jarosite could precipitate under conditions where ferrite would leach satisfactorily. Based on this discovery, a one-step process for the treatment of ferrite was developed. In the plant, the new process almost doubled the recovery of zinc from ferrite in the same equipment in which the two-step jarosite process had been operated. In a later expansion of the plant, investment savings were substantial compared to the other available technologies. In the solution purification, the key finding was that Co, Ni and Cu formed specific arsenides in the “hot arsenic zinc dust” step. This was utilized in the development of a three-step purification stage based on fluidized bed technology in all three steps, i.e. removal of Cu, Co and Cd. Both precipitation rates and selectivity increased, which strongly decreased the zinc powder consumption through a substantially suppressed hydrogen gas evolution. Better selectivity improved the value of the precipitates: cadmium, which caused environmental problems in the copper smelter, was reduced from the normally reported 1-3% down to 0.05%, and a cobalt cake with 15% Co was easily produced in laboratory experiments in the cobalt removal. The zinc powder consumption in the plant for a solution containing Cu, Co, Ni and Cd (1000, 25, 30 and 350 mg/l, respectively) was around 1.8 g/l, i.e. only 1.4 times the stoichiometric demand – or about a 60% saving in powder consumption (a rough stoichiometric check is sketched after this abstract). Two processes for direct leaching of the concentrate under atmospheric conditions were developed, one of which was implemented in the Kokkola zinc plant. Compared to the existing pressure leach technology, savings were obtained mostly in investment. The scientific basis for the most important processes and process improvements is given in the doctoral thesis. This includes mathematical modeling and thermodynamic evaluation of the experimental results and of the hypotheses developed. Five of the processes developed in this research and development program were implemented in the plant and are still in operation.
Even though these processes were developed with a focus on the plant in Kokkola, they can also be implemented at low cost in most zinc plants globally, and thus have great significance for the development of the electrolytic zinc process in general.
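The zinc powder figure quoted above can be checked with a back-of-the-envelope calculation, assuming simple cementation stoichiometry (one mole of Zn displaces one mole of each divalent impurity metal, Zn + Me2+ -> Zn2+ + Me) and standard molar masses:

```python
# Back-of-the-envelope check of the reported zinc powder consumption,
# assuming one mole of Zn cements one mole of each divalent impurity
# (Zn + Me2+ -> Zn2+ + Me). Concentrations are those quoted in the abstract.
MOLAR_MASS = {"Cu": 63.55, "Co": 58.93, "Ni": 58.69, "Cd": 112.41, "Zn": 65.38}  # g/mol
impurities_mg_per_l = {"Cu": 1000, "Co": 25, "Ni": 30, "Cd": 350}

moles_per_l = sum(c / 1000 / MOLAR_MASS[m] for m, c in impurities_mg_per_l.items())
stoich_zn_g_per_l = moles_per_l * MOLAR_MASS["Zn"]

actual_zn_g_per_l = 1.8
print(f"stoichiometric Zn demand: {stoich_zn_g_per_l:.2f} g/l")                     # ~1.3 g/l
print(f"ratio actual/stoichiometric: {actual_zn_g_per_l / stoich_zn_g_per_l:.2f}")  # ~1.4
```

Under these assumptions the stoichiometric demand comes out at roughly 1.3 g/l, so 1.8 g/l is indeed about 1.4 times the stoichiometric amount, consistent with the figure in the abstract.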
Abstract:
The middle section module of the InnoTrack™ moving walk was re-engineered according to the value analysis process. A self-supporting steel structure for the moving walk was created as a result of this process. The designed structure was verified and validated by prototype tests and finite element method calculations. The self-supporting steel structure replaces the original design of the middle section module in InnoTrack™. The designed structure better satisfies customers’ needs while using fewer resources. The redesigned middle section module thus provides higher value to the customer.
Abstract:
Finnish design has attracted global attention lately, and companies within the industry have potential in international markets. Because networks have been found to be extremely helpful in a firm’s international business operations and their usefulness is not yet fully exploited, their role in Finnish design companies is investigated. Accordingly, this study concentrates on understanding the role of networks in the internationalization process of Finnish design companies. This was investigated by describing the internationalization process of Finnish design companies, analyzing what kinds of networks are related to their internationalization process, and analyzing how networks are utilized in it. The theoretical framework explores the Finnish design industry, the internationalization process and networks. The Finnish design industry is introduced in general, and the concept of design is defined to refer to the textile, furniture, clothing and lighting equipment industries in this research. The theories of the internationalization process, the Uppsala model and Luostarinen’s operation modes, are explored in detail. The Born Global theory, which is a contrary view to stage models, is also discussed. The concept of a network is investigated, networks are classified into business and social networks, and the network approach to internationalization is discussed. The research is conducted empirically and the research method is a descriptive case study. Four case companies are investigated: the interior decoration unit of L-Fashion Group, Globe Hope, Klo Design, and Melaja Ltd. Data is collected by semi-structured interviews, and the analysis is done in the following way: the case companies are introduced, their internationalization processes and networks are described and, finally, the case companies are compared in the form of a cross-case analysis. This research showed that cooperation with social networks, such as locals or employees who have experience of the target market, can be extremely helpful in the beginning of a Finnish design company’s internationalization process. This study also indicated that public organizations do not necessarily enhance the internationalization process from a design company’s point of view. In addition, the research showed that there is cooperation between small Finnish design companies, whereas large design companies are not as open to cooperation with competitors.
Abstract:
The purpose of this thesis was to study the design of demand forecasting processes and the management of demand. In the literature review, different processes were identified and forecasting methods and techniques were reviewed. The role of the bullwhip effect in the supply chain was also identified, along with how to manage it through information sharing. The empirical part of the study first describes the current situation and challenges in the case company. After that, a new way to handle demand is introduced through target budget creation, and it is shown how information sharing for five products and a few customers would benefit the company. A new S&OP process, and an organization for it, were also created within this study.
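The abstract does not name the forecasting techniques reviewed; as a generic illustration of the kind of method such a review typically covers, the sketch below applies simple exponential smoothing to a hypothetical monthly demand series.

```python
# Illustration only: simple exponential smoothing, one of the classical
# demand forecasting techniques this kind of review typically discusses.
def exponential_smoothing(demand, alpha=0.3):
    """Return one-step-ahead forecasts; alpha weights the most recent observation."""
    forecast = [demand[0]]  # initialise with the first observation
    for d in demand[:-1]:
        forecast.append(alpha * d + (1 - alpha) * forecast[-1])
    return forecast

monthly_demand = [120, 135, 128, 150, 160, 155, 170]  # hypothetical units/month
print(exponential_smoothing(monthly_demand))
```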
Abstract:
Environmental issues, including global warming, have been serious challenges realized worldwide, and they have become particularly important for iron and steel manufacturers during the last decades. Many sites have been shut down in developed countries due to environmental regulation and pollution prevention, while a large number of production plants have been established in developing countries, which has changed the economy of this business. Sustainable development is a concept that today affects economic growth, environmental protection and social progress in setting up the basis for the future ecosystem. A sustainable approach may attempt to preserve natural resources, recycle and reuse materials, prevent pollution, enhance yield and increase profitability. To achieve these objectives, numerous alternatives should be examined in sustainable process design. Conventional engineering work cannot address all of these alternatives effectively and efficiently to find an optimal processing route. A systematic framework is needed as a tool to guide designers to make decisions based on an overall view of the system, identifying the key bottlenecks and opportunities, which leads to an optimal design and operation of the system. Since the 1980s, researchers have made great efforts to develop tools for what today is referred to as Process Integration. Advanced mathematics has been used in simulation models to evaluate the various available alternatives considering physical, economic and environmental constraints. Improvements in feed material and operation, a competitive energy market, environmental restrictions and the role of Nordic steelworks as energy suppliers (electricity and district heat) provide strong motivation for integration among industries towards more sustainable operation, which could increase the overall energy efficiency and decrease environmental impacts. In this study, a model is developed in several steps for primary steelmaking, with the Finnish steel sector as a reference, to evaluate future operation concepts of a steelmaking site with regard to sustainability. The research started with a potential study on increasing energy efficiency and reducing carbon dioxide emissions through the integration of steelworks with chemical plants, for possible utilization of the available off-gases in the system as chemical products. These off-gases from the blast furnace, basic oxygen furnace and coke oven mainly consist of carbon monoxide, carbon dioxide, hydrogen, nitrogen and partially methane (in coke oven gas); they have a relatively low heating value but are currently used as fuel within these industries. A nonlinear optimization technique is used to assess integration with a methanol plant under novel blast furnace technologies and (partial) substitution of coal with other reducing agents and fuels such as heavy oil, natural gas and biomass in the system. The technical aspects of integration and its effect on blast furnace operation, regardless of the capital expenditure of new operational units, are studied to evaluate the feasibility of the idea behind the research. Later on, the concept of a polygeneration system was added and a superstructure was generated with alternative routes for off-gas pretreatment and further utilization in a polygeneration system producing electricity, district heat and methanol.
(Vacuum) pressure swing adsorption, membrane technology and chemical absorption for gas separation; partial oxidation, carbon dioxide and steam methane reforming for methane conversion; and gas- and liquid-phase methanol synthesis are the main alternative process units considered in the superstructure. Due to the high degree of integration in process synthesis and the optimization techniques used, equation-oriented modeling is chosen as an effective alternative to the previous sequential modeling strategy for process analysis in order to investigate the suggested superstructure. A mixed integer nonlinear programming (MINLP) model is developed to study the behavior of the integrated system under different economic and environmental scenarios. Net present value and specific carbon dioxide emission are used to compare the economic and environmental aspects, respectively, of the integrated system for different fuel systems, alternative blast furnace reductants, implementation of new blast furnace technologies, and carbon dioxide emission penalties. Sensitivity analysis, carbon distribution and the effect of external seasonal energy demand are investigated with different optimization techniques. This tool can provide useful information concerning techno-environmental and economic aspects for decision-making and can estimate the optimal operational conditions of current and future primary steelmaking under alternative scenarios. The results of the work have demonstrated that it is possible in the future to develop steelmaking towards more sustainable operation.
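Neither the thesis model nor its data are reproduced here; the snippet below merely illustrates the two comparison indicators named in the abstract, net present value and specific carbon dioxide emission, using invented figures.

```python
# Minimal illustration (not the thesis model): the two indicators named in
# the abstract, net present value and specific CO2 emission, computed for a
# hypothetical integrated steelmaking concept.
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (index 0 = year 0), discount rate as a fraction."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical figures: investment in year 0, then constant yearly net revenue (MEUR).
cash_flows = [-250] + [45] * 15
print(f"NPV at 8% discount rate: {npv(cash_flows, 0.08):.1f} MEUR")

# Specific emission = total fossil CO2 emitted divided by steel produced.
co2_emitted_t = 2.6e6      # t CO2 per year (hypothetical)
steel_produced_t = 1.5e6   # t liquid steel per year (hypothetical)
print(f"specific emission: {co2_emitted_t / steel_produced_t:.2f} t CO2 / t steel")
```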
Abstract:
In this work, the separation of multicomponent mixtures in counter-current columns with supercritical carbon dioxide has been investigated using a process design methodology. First the separation task must be defined, then phase equilibria experiments are carried out, and the data obtained are correlated with thermodynamic models or empirical functions. Mutual solubilities, Ki values, and separation factors αij are determined. Based on these data, possible operating conditions for further extraction experiments can be determined. Separation analysis using graphical methods is performed to optimize the process parameters. Hydrodynamic experiments are carried out to determine the flow capacity diagram. Extraction experiments at laboratory scale are planned and carried out in order to determine HETP values, to validate the simulation results, and to provide new materials for additional phase equilibria experiments, needed to determine the dependence of the separation factors on concentration. Numerical simulation of the separation process and auxiliary systems is carried out to optimize the number of stages, solvent-to-feed ratio, product purity, yield, and energy consumption. Scale-up and cost analysis close the process design. The separation of palmitic acid and (oleic + linoleic) acids from PFAD (palm fatty acid distillates) was used as a case study.
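As a small worked illustration of the quantities mentioned above, the snippet below computes distribution coefficients Ki = yi/xi and the separation factor αij = Ki/Kj from hypothetical phase compositions (not data from the thesis).

```python
# Illustration with hypothetical equilibrium data (not measurements from the thesis):
# distribution coefficients K_i = y_i / x_i and separation factors alpha_ij = K_i / K_j.
# x = mass fraction in the CO2-lean (raffinate) phase, y = in the CO2-rich (extract) phase.
x = {"palmitic acid": 0.30, "oleic+linoleic acids": 0.70}
y = {"palmitic acid": 0.012, "oleic+linoleic acids": 0.018}

K = {comp: y[comp] / x[comp] for comp in x}
alpha = K["palmitic acid"] / K["oleic+linoleic acids"]

print("K values:", {c: round(k, 4) for c, k in K.items()})
print(f"separation factor alpha(palmitic / oleic+linoleic) = {alpha:.2f}")
```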
Business process re-engineering methods in improving the efficiency of the sales process (case company)
Abstract:
The aim of this Master's thesis is to study Business Process Re-engineering methods in improving the efficiency of sales processes. The theoretical framework of the study is built around sales management, sales processes, Business Process Management and Business Process Re-engineering. IT systems are also an essential area for the study, and their role is described both in sales processes and in connection with Business Process Re-engineering methods. The thesis reviews earlier research material and academic literature in the above-mentioned areas. The aim is to find earlier studies on improving the efficiency of sales processes and on the role of BPR in these cases. The effect of sales management on an efficient sales process is also studied, as are the different roles of IT systems in efficient sales processes. The empirical part of the thesis is a qualitative case study in a financial sector company. The study is conducted by interviewing sales personnel and managers. In addition, other material related to the company's sales process is analysed. The results of the case study are compared with earlier academic research, and the aim is to find out how BPR methods can be used to improve the efficiency of the company's sales process.
Abstract:
The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues in these computers have followed the same trend, and nearly everyone is now affected by such issues. How could the situation be improved? For software engineers, an obvious answer is to build computer software with security in mind. A problem with building secure software is how to define secure software and how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. This thesis is focused on the perspective of software engineers in the software design phase. The focus on the design phase means that code-level semantics or programming language specifics are not discussed in this work. Organizational policy, management issues and the software development process are also out of scope. The first two research problems were studied using a literature review, while the third was studied using case study research. The target of the case study was a Java-based email server called Apache James, for which changelog and security issue details were available and the source code was accessible. The research revealed that there is a consensus in the terminology on software security. Security verification activities are commonly divided into evaluation and assurance. The focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good. Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of the software, but in practice they were limited to evaluating different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design. Furthermore, interpreting the metrics’ results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired. The metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out areas where security metrics need to be improved if verification of security at the design stage is desired.
Abstract:
Software performance is a holistic matter that is affected by all phases of the software life cycle. Performance problems often lead to project delays, cost overruns and, in some cases, to the complete failure of the project. Software performance engineering (SPE) is a software-oriented approach that offers techniques for developing software with good performance. This Master's thesis examines these techniques and selects from among them those that are suitable for solving performance problems in the development of two IT device management products. The end result of the work is an updated version of the current product development process that takes the performance-related challenges of the applications into account at the different stages of the product life cycle.
Abstract:
The "Java Intelligent Tutoring System" (JITS) research project focused on designing, constructing, and determining the effectiveness of an Intelligent Tutoring System for beginner Java programming students at the postsecondary level. The participants in this research were students in the School of Applied Computing and Engineering Sciences at Sheridan College. This research involved consistently gathering input from students and instructors using JITS as it developed. The cyclic process involving designing, developing, testing, and refinement was used for the construction of JITS to ensure that it adequately meets the needs of students and instructors. The second objective in this dissertation determined the effectiveness of learning within this environment. The main findings indicate that JITS is a richly interactive ITS that engages students on Java programming problems. JITS is equipped with a sophisticated personalized feedback mechanism that models and supports each student in his/her learning style. The assessment component involved 2 main quantitative experiments to determine the effectiveness of JITS in terms of student performance. In both experiments it was determined that a statistically significant difference was achieved between the control group and the experimental group (i.e., JITS group). The main effect for Test (i.e., pre- and postiest), F( l , 35) == 119.43,p < .001, was qualified by a Test by Group interaction, F( l , 35) == 4.98,p < .05, and a Test by Time interaction, F( l , 35) == 43.82, p < .001. Similar findings were found for the second experiment; Test by Group interaction revealed F( 1 , 92) == 5.36, p < .025. In both experiments the JITS groups outperformed the corresponding control groups at posttest.
Abstract:
Changes are continuously made to the source code of software systems to take customer needs into account and to correct faults. These continuous changes can lead to code and design defects. Design defects are poor solutions to recurring design or implementation problems, typically in object-oriented development. During comprehension and change activities, and because of time-to-market pressure, lack of understanding, and their level of experience, developers cannot always follow design standards and coding techniques such as design patterns. Consequently, they introduce design defects into their systems. In the literature, several authors have argued that design defects make object-oriented systems harder to understand, more fault-prone, and harder to change than systems without design defects. Yet only a few of these authors have conducted an empirical study of the impact of design defects on comprehension, and none of them has studied the impact of design defects on the effort developers need to correct faults. In this thesis, we make three main contributions. The first contribution is an empirical study providing evidence of the impact of design defects on comprehension and change. We design and conduct two experiments with 59 subjects to evaluate the impact of the composition of two occurrences of Blob or two occurrences of spaghetti code on the performance of developers carrying out comprehension and change tasks. We measure developer performance using: (1) the NASA task load index for their effort, (2) the time they spent completing their tasks, and (3) their percentages of correct answers. The results of the two experiments showed that two occurrences of Blob or spaghetti code are a significant obstacle to developer performance during comprehension and change tasks. The results obtained justify earlier research on the specification and detection of design defects. Software development teams should warn developers against high numbers of design defect occurrences and recommend refactorings at each step of the development process to remove these defects when possible. In the second contribution, we study the relation between design defects and faults, investigating the impact of the presence of design defects on the effort needed to correct faults. We measure the fault-fixing effort using three indicators: (1) the duration of the fixing period, (2) the number of fields and methods touched by the fix, and (3) the entropy of the fixes in the source code. We conduct an empirical study with 12 design defects detected in 54 releases of four systems: ArgoUML, Eclipse, Mylyn, and Rhino. Our results showed that the fixing period is longer for faults involving classes with design defects. Moreover, fixing faults in classes with design defects changes more files, fields and methods.
We also observed that, after a fault is fixed, the number of design defect occurrences in the classes involved in the fix decreases. Understanding the impact of design defects on the effort developers need to fix faults is important in order to help development teams better evaluate and predict the impact of their design decisions, and thus channel their efforts towards improving the quality of their systems. Development teams should monitor and remove design defects from their systems, because they are likely to increase change effort. The third contribution concerns the detection of design defects. During maintenance activities, it is important to have a tool able to detect design defects incrementally and iteratively. Such an incremental and iterative detection process could reduce costs, effort and resources by allowing practitioners to identify and address design defect occurrences as they find them during comprehension and change. Researchers have proposed approaches to detect design defect occurrences, but these approaches currently have four limitations: (1) they require an in-depth knowledge of the design defects, (2) they have limited precision and recall, (3) they are not iterative and incremental, and (4) they cannot be applied to subsets of systems. To overcome these limitations, we introduce SMURF, a new approach to detect design defects based on a machine learning technique, support vector machines, that takes practitioners' feedback into account. Through an empirical study on three systems and four design defects, we showed that the precision and recall of SMURF are higher than those of DETEX and BDTEX when detecting design defect occurrences. We also showed that SMURF can be applied in both intra-system and inter-system configurations. Finally, we showed that the precision and recall of SMURF improve when practitioners' feedback is taken into account.
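SMURF itself is defined in the thesis; the sketch below only illustrates, with scikit-learn and invented class metrics, the general idea of training a support vector machine to flag design-defect candidates and re-training it when practitioner feedback relabels a candidate.

```python
# Illustration of the general idea only (not SMURF itself): train an SVM on
# class-level metrics to flag design-defect candidates, then re-train once
# practitioner feedback relabels a candidate.
import numpy as np
from sklearn.svm import SVC

# Made-up metrics per class: [lines of code, number of methods, coupling].
X = np.array([[1200, 85, 30], [150, 8, 4], [2300, 120, 45], [90, 5, 2],
              [1700, 95, 38], [300, 12, 6]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = defect occurrence (e.g. Blob), 0 = clean

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X, y)

candidate = np.array([[1400, 70, 25]])
print("flagged as defect:", bool(clf.predict(candidate)[0]))

# Practitioner feedback: the candidate was reviewed and labelled clean,
# so it is added to the training set and the model is re-trained.
X = np.vstack([X, candidate])
y = np.append(y, 0)
clf.fit(X, y)
```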