873 results for Requirements engineering process
Abstract:
The purpose of this work was to study how organizational capabilities can be measured in the engineering and consulting industry using a so-called capability audit method. The main motives for measuring intangible assets were identified on the basis of a literature review. The advantages and disadvantages of different methods were examined in order to identify the challenges and requirements involved in conducting a capability audit. Constructing the capability audit required identifying the distinctive characteristics of the industry, which were found to be knowledge intensity and project orientation. The implementation process of the audit consisted of four parts, and the case company made a significant contribution to carrying out the first three of them. Once the critical success factors had been determined, the organizational capabilities affecting them could be identified and the assessment carried out. The assessments were collected from internal and external evaluators, and they formed the basis for an analysis that clarified the company's development needs. The benefits of the capability audit included increased knowledge of the company's strengths and weaknesses, as well as the possibility to regularly monitor and improve its overall performance.
Abstract:
Quality is strengthening its position in business as companies compete in international markets on both price and quality. This trend has given rise to a number of quality programmes that are widely used in implementing companies' total quality management (TQM). Quality management covers all of a company's operations and also sets requirements for developing and improving its support functions. These include the subject of this study, information technology management (IT). The aim of the thesis was to describe the current state of the IT process. The process description produced in the thesis is based on process management theory and on the quality award criteria used by the target company. Thematic interviews were used as the research method to establish the current state of the process. Customers of the IT process were interviewed to determine the current state of the process and the requirements set for it. The process analysis, the identification of the most important sub-processes and the discovery of areas for improvement are the central results of this thesis. The thesis focused on finding the weaknesses of the IT process and targets for improvement as a basis for continuous development, rather than on radical redesign of the process. The thesis presents the principles of TQM, quality tools, and the terminology, principles and systematic implementation of process management. The work also gives a picture of how TQM and process management interlock in a company's quality work.
Abstract:
The aim of the thesis was to study quality management with a process approach and to find out how to utilize process management to improve quality. The operating environment of organizations has changed. Organizations are focusing on their core competences and networking with suppliers and customers to ensure more effective and efficient value creation for the end customer. Quality management is moving from inspection of the output to preventing problems from occurring in the first place, and management thinking is shifting from a functional approach to a process approach. The theoretical part of the thesis examines how to define quality, how to achieve good quality, how to improve quality, and how to make sure the improvement continues as a never-ending cycle. A selection of quality tools is introduced. The process approach to quality management is described and compared to the functional approach, which is the traditional way to manage operations and quality. Customer focus is also studied, and it is argued that, to ensure long-term customer commitment, an organization needs to react to changing customer requirements and wishes by constantly improving its processes. In the experimental part the theories are tested in a process improvement business case. It is shown how to execute a process improvement project, starting from defining the customer requirements and continuing to defining the process ownership, roles and responsibilities, boundaries, interfaces and the actual process activities. Control points and measures are determined for the process, as well as the feedback and corrective action process, to ensure that continual improvement can be achieved and to enable verification that customer requirements are fulfilled.
Improving the competitiveness of electrolytic Zinc process by chemical reaction engineering approach
Abstract:
This doctoral thesis describes the development work performed on the leach and purification sections of the electrolytic zinc plant in Kokkola to increase the efficiency of these two stages, and thus the competitiveness of the plant. Since metallic zinc is a typical bulk product, improving the competitiveness of a plant is mostly a matter of decreasing unit costs. The problems in the leaching were the low recovery of valuable metals from raw materials, and that the available technology offered only complicated and expensive processes to overcome this problem. In the purification, the main problem was the consumption of zinc powder - up to four to six times the stoichiometric demand. This reduced the capacity of the plant, as this zinc is re-circulated through the electrolysis, which is the absolute bottleneck in a zinc plant. Low selectivity gave low-grade and low-value precipitates for further processing to metallic copper, cadmium, cobalt and nickel. Knowledge of the underlying chemistry was poor, and process interruptions causing losses of zinc production were frequent. Studies on leaching comprised the kinetics of ferrite leaching and jarosite precipitation, as well as the stability of jarosite in acidic plant solutions. A breakthrough came with the finding that jarosite could precipitate under conditions where ferrite would leach satisfactorily. Based on this discovery, a one-step process for the treatment of ferrite was developed. In the plant, the new process almost doubled the recovery of zinc from ferrite in the same equipment in which the two-step jarosite process had been operated until then. In a later expansion of the plant, investment savings were substantial compared to other available technologies. In the solution purification, the key finding was that Co, Ni and Cu formed specific arsenides in the "hot arsenic zinc dust" step. This was utilized in the development of a three-step purification stage based on fluidized bed technology in all three steps, i.e. removal of Cu, Co and Cd. Both precipitation rates and selectivity increased, which strongly decreased the zinc powder consumption through a substantially suppressed hydrogen gas evolution. Better selectivity improved the value of the precipitates: cadmium, which caused environmental problems in the copper smelter, was reduced from the normally reported 1-3% down to 0.05%, and a cobalt cake with 15% Co was easily produced in laboratory experiments on cobalt removal. The zinc powder consumption in the plant for a solution containing Cu, Co, Ni and Cd (1000, 25, 30 and 350 mg/l, respectively) was around 1.8 g/l, i.e. only 1.4 times the stoichiometric demand - or about a 60% saving in powder consumption. Two processes for direct leaching of the concentrate under atmospheric conditions were developed, one of which was implemented in the Kokkola zinc plant. Compared to the existing pressure leach technology, savings were obtained mostly in investment. The scientific basis for the most important processes and process improvements is given in the doctoral thesis. This includes mathematical modeling and thermodynamic evaluation of the experimental results and the hypotheses developed. Five of the processes developed in this research and development program were implemented in the plant and are still in operation.
Even though these processes were developed with a focus on the plant in Kokkola, they can also be implemented at low cost in most zinc plants globally, and thus have great significance for the development of the electrolytic zinc process in general.
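As a rough check of the powder consumption quoted above, the stoichiometric zinc demand for cementing out the listed impurity levels can be computed from the 1:1 molar displacement reactions (e.g. Cu2+ + Zn → Cu + Zn2+). The short sketch below is only an illustrative back-of-the-envelope calculation based on the concentrations given in the abstract, not a computation from the thesis itself.

```python
# Back-of-the-envelope check of the zinc powder demand quoted in the abstract.
# Cementation of each divalent impurity consumes one mole of Zn per mole of
# metal (M2+ + Zn -> M + Zn2+), so the stoichiometric demand follows directly
# from the impurity concentrations and molar masses.

MOLAR_MASS = {"Zn": 65.38, "Cu": 63.55, "Co": 58.93, "Ni": 58.69, "Cd": 112.41}
impurities_mg_per_l = {"Cu": 1000, "Co": 25, "Ni": 30, "Cd": 350}  # from the abstract

stoich_zn_mg = sum(
    c / MOLAR_MASS[m] * MOLAR_MASS["Zn"] for m, c in impurities_mg_per_l.items()
)
print(f"stoichiometric Zn demand: {stoich_zn_mg / 1000:.2f} g/l")      # about 1.29 g/l
print(f"1.4 x stoichiometric:     {1.4 * stoich_zn_mg / 1000:.2f} g/l")  # about 1.8 g/l
```

With these inputs the stoichiometric demand comes out at roughly 1.3 g/l, and 1.4 times that is indeed close to the 1.8 g/l consumption reported for the plant, compared with the four- to six-fold excess quoted for the earlier practice.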
Abstract:
The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming a more and more competitive and efficient service provider, while still being able to provide unique value opportunities for customers, requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore an emerging phenomenon in the service context is information awareness. Terms like big data and the Internet of things are not only modern buzzwords; they also describe urgent requirements for a new type of competences and solutions. When the amount of information increases and the systems processing information become more efficient and intelligent, it is human understanding and objectives that may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: what kind of information is created, possessed and utilized in the service context, and even more importantly, what information exists but is not acknowledged or used? In this dissertation the focus is on the relationship between service design and service operations. Reframing this relationship refers to viewing the service system from the architectural perspective. The selected perspective allows analysing the relationship between design activities and operational activities as an information system, while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations, and the testing of the design artifact components in real service contexts. The relationship between design and operations is analysed in health care and social care service systems. Existing contributions in service research tend to abstract services and service systems as value creation, working or interactive systems. This dissertation adds an important information processing system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication and ad-hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than existing scientific and managerial models tend to suggest. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unknown managerial task. On the architectural level, service-system-specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also provide very practical implications for service providers. The personal, visual and hidden activities of a service, and more importantly all changes that take place in any service system, also have an information dimension.
Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to increase activities and to offer a new type of service potential for customers.
Business process re-engineering methods in improving the efficiency of the sales process (case company)
Abstract:
The aim of this Master's thesis is to study Business Process Re-engineering methods in improving the efficiency of sales processes. The theoretical framework of the study is built around sales management, sales processes, Business Process Management and Business Process Re-engineering. IT systems are also an essential area for the study, and their role is described both in sales processes and in connection with Business Process Re-engineering methods. The thesis reviews earlier research material and academic literature in the areas mentioned above. The goal is to find earlier studies on improving the efficiency of sales processes and on the role of BPR in these cases. The effect of sales management on an efficient sales process is also studied, as are the different roles of IT systems in efficient sales processes. The empirical part of the thesis is a qualitative case study in a financial sector company. The study is carried out by interviewing sales personnel and managers. In addition, other material related to the company's sales process is analysed. The results of the case study are compared with earlier academic research, and the aim is to identify how BPR methods can be used to make the company's sales process more efficient.
Abstract:
To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal cutting conditions. However, attaining optimum values every time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective methods of optimization. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes. In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes are identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turn Master 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter. Using S/N analysis, the optimum machining parameters were obtained from the experiments. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively evaluate new design solutions in the relevant search spaces in order to reach the true optimum solution. A mathematical model for surface roughness was developed using response surface analysis, and the model was validated using published results from the literature. Optimization methods such as Simulated Annealing (SA), Particle Swarm Optimization (PSO), a Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA) were applied to optimize the machining parameters for dry turning of SS420 material. All of the above algorithms were tested for efficiency, robustness and accuracy, and it was observed how they often outperform conventional optimization methods on difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB. For each evolutionary algorithmic method, optimum cutting conditions are provided to achieve a better surface finish. The computational results using SA clearly demonstrate that the proposed solution procedure is capable of solving such complicated problems effectively and efficiently. Particle Swarm Optimization (PSO) is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations. The results show that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm, incorporating a stochastic crossover technique and an artificial initial population scheme, was developed to provide a faster search mechanism.
Finally, a comparison among these algorithms was made for the specific example of dry turning of SS 420 material, arriving at the optimum machining parameters of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion. To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating evolutionary procedures seen in nature that optimize its own systems.
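The thesis implements these algorithms in MATLAB; purely as an illustration of the kind of search loop involved, the sketch below shows a generic Particle Swarm Optimization routine in Python minimizing a placeholder surface-roughness model Ra(feed, speed, depth of cut). The objective function, parameter bounds and PSO constants are hypothetical and are not the values or models from the thesis.

```python
import random

# Hypothetical empirical surface-roughness model (a stand-in for the thesis'
# response-surface model); variables are feed (mm/rev), speed (m/min), depth (mm).
def roughness(x):
    feed, speed, depth = x
    return 2.0 * feed**0.9 * depth**0.3 / speed**0.2

bounds = [(0.05, 0.5), (50.0, 250.0), (0.5, 2.5)]   # assumed parameter ranges
n_particles, n_iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5

# Initialize particle positions, velocities, and personal/global bests.
pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
vel = [[0.0] * len(bounds) for _ in range(n_particles)]
pbest = [p[:] for p in pos]
pbest_val = [roughness(p) for p in pos]
gbest = min(pbest, key=roughness)

for _ in range(n_iters):
    for i in range(n_particles):
        for d, (lo, hi) in enumerate(bounds):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)   # clamp to bounds
        val = roughness(pos[i])
        if val < pbest_val[i]:                  # update personal best
            pbest[i], pbest_val[i] = pos[i][:], val
            if val < roughness(gbest):          # update global best
                gbest = pos[i][:]

print("best parameters (feed, speed, depth):", gbest)
print("predicted Ra:", roughness(gbest))
```

The same loop structure carries over to SA or GA variants; only the move-generation and acceptance rules change, which is why the thesis can compare the algorithms on an identical objective.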
Abstract:
Requirements analysis focuses on stakeholders' concerns and their influence on e-government systems. Certain characteristics of stakeholders' concerns clearly show their complexity and the conflicts among them. This raises a number of questions for requirements analysis, such as: how are the concerns relevant to the stakeholders? What are their needs? How can conflicts among the different stakeholders be resolved? And what coherent requirements can be methodically produced? This paper describes the problem articulation method in organizational semiotics, which can be used to conduct such complex requirements analysis. The outcomes of the analysis enable e-government systems development and management to meet users' needs. A case study of the Yantai Citizen Card is chosen to illustrate the process of analysing stakeholders over the lifecycle of requirements analysis.
Abstract:
Process scheduling techniques consider the current load situation when allocating computing resources. Such techniques use approximations, such as averages of communication, processing, and memory access, to improve process scheduling, even though processes may present different behaviors during their execution: they may start with high communication requirements and later perform only processing. By discovering how processes behave over time, we believe it is possible to improve resource allocation. This has motivated this paper, which adopts chaos theory concepts and nonlinear prediction techniques in order to model and predict process behavior. The results confirm that the radial basis function technique provides good predictions with low processing demands, which is essential in a real distributed environment.
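As a rough illustration of the kind of nonlinear prediction the paper refers to, the sketch below fits a radial basis function model to a time-delay embedding of a synthetic process trace and predicts the next value. The embedding dimension, kernel width and synthetic data are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def rbf_predict(series, m=3, width=1.0, reg=1e-6):
    """Fit a Gaussian RBF model on a time-delay embedding of `series`
    (embedding dimension m) and predict the value after the last window."""
    X = np.array([series[i:i + m] for i in range(len(series) - m)])
    y = np.array(series[m:])
    centers = X                                    # one RBF centre per training window
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    Phi = np.exp(-(d / width) ** 2)                # design matrix of Gaussian kernels
    w = np.linalg.solve(Phi.T @ Phi + reg * np.eye(len(centers)), Phi.T @ y)
    last = np.array(series[-m:])
    phi = np.exp(-(np.linalg.norm(centers - last, axis=1) / width) ** 2)
    return float(phi @ w)

# Synthetic "process behavior" trace: a nonlinear oscillation with noise,
# standing in for e.g. the communication demand of a process over time.
t = np.arange(200)
trace = np.sin(0.2 * t) + 0.3 * np.sin(0.05 * t) + 0.05 * np.random.randn(200)
print("predicted next value:", rbf_predict(list(trace)))
```

In a scheduler, the predicted next value (or a short predicted horizon) would feed the allocation decision instead of the plain historical average.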
Abstract:
One of the first questions to consider when designing a new roll forming line is the number of forming steps required to produce a profile. The number depends on the material properties, the cross-section geometry and the tolerance requirements, but the tool designer also wants to minimize the number of forming steps in order to reduce the investment costs for the customer. There are several computer aided engineering systems on the market that can assist the tool design process. These include more or less simple formulas to predict deformation during forming as well as the number of forming steps. In recent years it has also become possible to use finite element analysis for the design of roll forming processes. The objective of the work presented in this thesis was to answer the following question: how should the roll forming process be designed for complex geometries and/or high strength steels? The work approach included literature studies as well as experimental and modelling work. The experimental part gave direct insight into the process and was also used to develop and validate models of the process. Starting with simple geometries and standard steels, the work progressed to more complex profiles of variable depth and width, made of high strength steels. The results obtained are published in seven papers appended to this thesis. In the first study (see paper 1) a finite element model for investigating the roll forming of a U-profile was built. It was used to investigate the effect on the longitudinal peak membrane strain and the deformation length when the yield strength increases, see papers 2 and 3. The simulations showed that the peak strain decreases whereas the deformation length increases when the yield strength increases. The studies described in papers 4 and 5 measured roll load, roll torque, springback and strain history during the U-profile forming process. The measurement results were used to validate the finite element model of paper 1. The results presented in paper 6 show that the formability of stainless steel (e.g. AISI 301), which in the cold rolled condition has a large martensite fraction, can be substantially increased by heating the bending zone. The heated area then becomes austenitic and ductile before the roll forming. Thanks to the phenomenon of strain-induced martensite formation, the steel regains its martensite content and strength during the subsequent plastic straining. Finally, a new tooling concept for profiles with variable cross-sections is presented in paper 7. The overall conclusions of the present work are that today it is possible to successfully develop profiles of complex geometries (3D roll forming) in high strength steels, and that finite element simulation can be a useful tool in the design of the roll forming process.
Abstract:
During the development of system requirements, software system specifications are often inconsistent. Inconsistencies may arise for different reasons, for example, when multiple conflicting viewpoints are embodied in the specification, or when the specification itself is at a transient stage of evolution. These inconsistencies cannot always be resolved immediately. As a result, we argue that a formal framework for the analysis of evolving specifications should be able to tolerate inconsistency by allowing reasoning in the presence of inconsistency without trivialisation, and circumvent inconsistency by enabling impact analyses of potential changes to be carried out. This paper shows how clustered belief revision can help in this process. Clustered belief revision allows for the grouping of requirements with similar functionality into clusters and the assignment of priorities between them. By analysing the result of a cluster, an engineer can either choose to rectify problems in the specification or to postpone the changes until more information becomes available.
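The paper's framework is a formal, logic-based one. Purely as a toy illustration of the underlying idea of grouping requirements into prioritized clusters and resolving conflicts in favour of the higher-priority cluster, one might sketch something like the following; the requirement names, conflict relation and resolution rule are invented for illustration and are not the paper's clustered belief revision operator.

```python
from dataclasses import dataclass, field

@dataclass
class Cluster:
    name: str
    priority: int                  # lower number = higher priority
    requirements: set = field(default_factory=set)

# Invented example: two viewpoints whose requirements partially conflict.
security = Cluster("security", 1, {"encrypt_all_traffic", "two_factor_login"})
usability = Cluster("usability", 2, {"one_click_login", "encrypt_all_traffic"})

# Invented conflict relation between individual requirements.
conflicts = {frozenset({"two_factor_login", "one_click_login"})}

def revise(clusters, conflicts):
    """Keep requirements of higher-priority clusters; drop any lower-priority
    requirement that conflicts with one that has already been accepted."""
    accepted = set()
    for cluster in sorted(clusters, key=lambda c: c.priority):
        for req in sorted(cluster.requirements):
            if not any(frozenset({req, a}) in conflicts for a in accepted):
                accepted.add(req)
    return accepted

print(revise([usability, security], conflicts))
# {'encrypt_all_traffic', 'two_factor_login'} -- the usability requirement that
# conflicts with the higher-priority security cluster is postponed, mirroring the
# idea of tolerating inconsistency until more information becomes available.
```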
Abstract:
Enterprises need continuous product development activities to remain competitive in the marketplace. Their product development process (PDP) must manage stakeholders' needs - technical, financial, legal and environmental aspects, customer requirements, corporate strategy, etc. - making it a multidisciplinary and strategic issue. An approach that uses real options to support the decision-making process in the PDP phases is taken. The real option valuation method is often presented as an alternative to the conventional net present value (NPV) approach. It is based on the same principles as financial options: the right to buy or sell financial assets (mostly stocks) at a predetermined price, with no obligation to do so. In the PDP, this translates into a multi-period approach that takes into account the flexibility of, for instance, being able to postpone prototyping and design decisions while waiting for more information about technologies, customer acceptance, funding, etc. In the present article, the state of the art of real options theory is surveyed and a model for using real options in the PDP is proposed, so that financial aspects can be properly considered at each project phase of product development. The conclusion is that such a model can make the decision processes within the PDP more robust.
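The abstract does not specify a valuation technique beyond real options in general. As a minimal sketch, assuming a standard one-period binomial model with risk-neutral valuation, the option to postpone a development phase could be valued as below; the project values, up/down factors, investment cost and rate are hypothetical.

```python
# Minimal one-period binomial valuation of an option to defer a PDP investment.
# All figures are hypothetical: project value today V, investment cost I,
# multiplicative up/down moves after one period, and the risk-free rate r.
V, I = 100.0, 95.0          # project value today and required investment
u, d = 1.3, 0.8             # up/down factors for the project value
r = 0.05                    # risk-free rate per period

npv_now = V - I             # invest-now NPV ignores the value of waiting

# Risk-neutral probability and payoff of deferring: invest next period only if
# the project value then exceeds the investment cost.
q = ((1 + r) - d) / (u - d)
defer_value = (q * max(u * V - I, 0) + (1 - q) * max(d * V - I, 0)) / (1 + r)

print(f"NPV of investing now:        {npv_now:.2f}")
print(f"Value with option to defer:  {defer_value:.2f}")
print(f"Value of flexibility:        {defer_value - npv_now:.2f}")
```

With these numbers the deferral alternative is worth about 16.7 versus an invest-now NPV of 5.0, which is the kind of flexibility premium the proposed model would make visible at each PDP phase.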