914 results for Using Lean tools


Relevance: 30.00%

Publisher:

Abstract:

The main goal of this study is to create a seamless chain of actions and a more detailed structure for the front end of innovation (FEI) in order to increase front end performance and, ultimately, to influence the renewal of companies. This goal is pursued through a new concept: an integrated model of the early activities of FEI that leads to the discovery of new elements of opportunities and the identification of new business and growth areas. The procedure offers one possible solution to a dynamic strategy formation process in the innovation development cycle. In this study, the front end of innovation is positioned between strategy reviews and concept creation, with the needed procedures, tools, and frameworks. The starting point of the study is that the origins of innovation are not well enough understood, so the study focuses on the early activities of FEI. These first activities are conceptualized in order to identify successful innovation initiatives and strategic renewal agendas. What is needed is a seamless chain of activities resulting in faster and more precise identification of opportunities and growth areas available on the markets and inside companies. Three case studies were conducted in order to examine company views on the available theory and to identify the first practical experiences and procedures at the beginning of the front end of innovation. Successful innovation requires a focus on renewal in both internal and external directions, and these should be carefully balanced for best results. Instead of an inside-out mode of action, the studied companies have a strong outside-in thinking mode and mainly co-develop their innovation initiatives in close proximity with customers; that is, successful companies are an integral part of their customers' business and success. Companies have tailor-made innovation processes, combined with their way of working and linked to their business goals and to the priorities of their actual transformation needs. The result of this study is a new modular FEI platform which companies can configure against their actual business needs and drivers. This platform includes new elements of FEI and documents an architecture presenting how the system components work together. The system is a conceptual approach drawing on theories of emergent strategy formation, opportunity identification and creation, the interpretation-analysis-experimentation triad and present FEI theories. The platform includes new features compared with current models of FEI. It allows managers to better understand the importance of FEI in the whole innovation development process and to see FEI as a phase and procedure for discovering and implementing emergent strategy. An adaptable company rethinks and redirects its strategy proactively from time to time. Different parts of the business model are changed to remove identified obstacles to growth and renewal, which opens avenues to find the right reforms for renewal.

Relevance: 30.00%

Publisher:

Abstract:

Switching power supplies are usually implemented with control circuitry that uses a constant clock frequency for turning the power semiconductor switches on and off. A drawback of this customary operating principle is that the switching frequency and its harmonics are present in both the conducted and the radiated EMI spectrum of the power converter. Various variable-frequency techniques have been introduced during the last decade to overcome this EMC problem. The main objective of this study was to compare the EMI and steady-state performance of a switch-mode power supply with different spread-spectrum/variable-frequency methods. Another goal was to find suitable tools for variable-frequency EMI analysis. This thesis can be divided into three main parts: firstly, some aspects of spectral estimation and measurement are presented; secondly, selected spread spectrum generation techniques are presented with simulations and background information; finally, simulations and prototype measurements of the EMC and the steady-state performance are carried out in the last part of this work. A combination of the autocorrelation function, the Welch spectrum estimate and the spectrogram was used as a substitute for ordinary Fourier methods in the EMC analysis. It was also shown that the switching function can be used in preliminary EMC analysis of an SMPS, and that the spectrum and autocorrelation sequence of the switching function correlate with the final EMI spectrum. This work is based on numerous simulations and measurements made with a prototype, all of which were carried out on a boost DC/DC converter. Four different variable-frequency modulation techniques in six different configurations were analyzed, and their EMI performance was compared to constant-frequency operation. Output voltage and input current waveforms were also analyzed in the time domain to see the effect of the spread spectrum operation on these quantities. According to the results presented in this work, spread spectrum modulation can be utilized in a power converter for EMI mitigation. The results from steady-state voltage measurements show that the variable-frequency operation of the SMPS has an effect on the voltage ripple, but the ripple measured from the prototype is still acceptable in some applications. Both current and voltage ripple can be controlled with proper main circuit and controller design.
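As a minimal illustration of the analysis chain described above (not the thesis's code; the switching waveform, frequencies and parameters are illustrative assumptions), the Welch spectrum estimate and the autocorrelation sequence of an ideal spread-spectrum switching function can be computed like this:

```python
import numpy as np
from scipy import signal

fs = 10e6                       # sampling frequency, 10 MHz
t = np.arange(0, 0.02, 1 / fs)  # 20 ms of signal

# Ideal switching function: 100 kHz square wave whose frequency is spread
# by a slow triangular sweep of +/-10 kHz (illustrative values).
f_sw, f_dev, f_mod = 100e3, 10e3, 1e3
inst_freq = f_sw + f_dev * signal.sawtooth(2 * np.pi * f_mod * t, width=0.5)
switching = (signal.square(2 * np.pi * np.cumsum(inst_freq) / fs) + 1) / 2

# Welch spectrum estimate of the switching function (mean removed per segment).
f, Pxx = signal.welch(switching, fs=fs, nperseg=8192)

# Normalized autocorrelation sequence for non-negative lags.
x = switching - switching.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:] / (x.size * x.var())

print(f"spectral peak near {f[np.argmax(Pxx)] / 1e3:.1f} kHz")
```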

Relevance: 30.00%

Publisher:

Abstract:

The objective of this research is to demonstrate the use of Lean Six Sigma methodology in a manufacturing lead time improvement project. Moreover, the goal is to develop working solutions for the target company to improve its manufacturing lead time. The theoretical background is achieved through exploring the literature of Six Sigma, Lean and Lean Six Sigma. The development will be done in collaboration with the related stakeholders, by following the Lean Six Sigma improvement process DMAIC and by analyzing the process data from the target company. The focus of this research is in demonstrating how to use Lean Six Sigma improvement process DMAIC in practice, rather than in comparing Lean Six Sigma to other improvement methodologies. In order to validate the manufacturing system’s current state, improvement potential and solutions, statistical tools such as linear regression analysis were used. This ensured that all the decisions were as heavily based on actual data as possible. As a result of this research, a set of solutions were developed and implemented in the target company. These solutions included batch size reduction, bottleneck shift, first-in first-out queuing and shifting a data entry task from production planners to line workers. With the use of these solutions, the target company was able to reduce its manufacturing lead time by over one third.
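As a generic illustration of the kind of statistical check mentioned above (the data below are invented, not the target company's), a linear regression of lead time against batch size could be run as follows:

```python
import numpy as np
from scipy import stats

# Invented example data: batch size vs. observed manufacturing lead time (days).
batch_size = np.array([10, 20, 30, 40, 50, 60, 80, 100])
lead_time = np.array([3.1, 4.0, 5.2, 6.1, 7.4, 8.2, 10.9, 13.0])

result = stats.linregress(batch_size, lead_time)
print(f"lead_time ~ {result.intercept:.2f} + {result.slope:.3f} * batch_size")
print(f"R^2 = {result.rvalue**2:.3f}, p-value = {result.pvalue:.2e}")
```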

Relevance: 30.00%

Publisher:

Abstract:

Recent research has shown that receptor-ligand interactions between the surfaces of communicating cells are necessary prerequisites for cell proliferation, cell differentiation and immune defense. Cell-adhesion events have also been implicated in pathological conditions such as cancer growth, metastasis, and host-cell invasion by parasites such as Trypanosoma cruzi. RNA and DNA aptamers (from the Latin aptus, 'to fit') selected from combinatorial nucleic acid libraries are capable of binding to cell-adhesion receptors, leading to a halt in cellular processes induced by outside signals as a consequence of the blockage of receptor-ligand interactions. We outline here a novel approach using RNA aptamers that bind to T. cruzi receptors and interrupt host-cell invasion, in analogy to existing procedures for blocking selectin adhesion and function in vitro and in vivo.

Relevance: 30.00%

Publisher:

Abstract:

The aim of this Master's thesis is to develop a tool for the ex-ante profitability assessment of a Lean project by applying a return distribution method. In addition, the study seeks to determine what related academic research has previously been carried out and what challenges are involved in performing such an assessment. The study is motivated by a research gap identified in the academic literature on the ex-ante profitability assessment of Lean projects. The empirical research was carried out as a qualitative case study in cooperation with a consulting company specializing in Lean projects. The empirical research was guided by the chosen methodology, whose aim was to systematically develop a tool consistent with the objective of the study. Data were collected through thematic interviews conducted in two parts, and the resulting material was analyzed using the Grounded Theory method. The results of the study show that the developed tool, which applies the return distribution method, makes it possible to carry out an ex-ante profitability assessment of a Lean project. Based on the results, it also makes it possible to address the challenges identified in the study that have previously limited the execution of such an assessment. According to the study, the tool can also be used to support the partner company's sales of Lean projects.
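The abstract does not describe the tool's internals; purely as a hypothetical illustration of what a return-distribution style pre-assessment can produce, the sketch below simulates a distribution of net returns from assumed savings and cost ranges (all figures, distributions and names are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Illustrative assumptions (not from the thesis): annual savings and one-off
# implementation cost of a Lean project, modeled as triangular distributions.
annual_savings = rng.triangular(left=50_000, mode=120_000, right=200_000, size=N)
implementation_cost = rng.triangular(left=40_000, mode=60_000, right=100_000, size=N)

# Simple three-year net return for each simulated scenario.
net_return = 3 * annual_savings - implementation_cost

print(f"P(net return < 0): {np.mean(net_return < 0):.1%}")
print(f"Median net return: {np.median(net_return):,.0f}")
print(f"5th-95th percentile: {np.percentile(net_return, 5):,.0f} "
      f"- {np.percentile(net_return, 95):,.0f}")
```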

Relevance: 30.00%

Publisher:

Abstract:

At present, one of the main concerns of green networking is to minimize the power consumption of network infrastructure. Surveys show that the highest amount of power is consumed by network devices during their runtime. To control this power consumption, however, it is important to know which factors have the highest impact on it. This paper focuses on measuring and modeling the power consumption of an Ethernet switch during its runtime, considering various types of input parameters in all possible combinations. For the experiment, three input parameters are chosen: bandwidth, link load and number of connections. The output to be measured is the power consumption of the Ethernet switch. Due to the uncertain power consumption pattern of the Ethernet switch, a fully comprehensive experimental evaluation would require an unfeasible and cumbersome experimental phase. Because of that, the design of experiments (DoE) method has been applied to obtain adequate information on the effect of each input parameter on the power consumption. The work consists of three parts. In the first part, a test bed is planned with the input parameters and the power consumption of the switch is measured. The second part is about generating a mathematical model with the help of design of experiments tools; this model can be used for estimating the power consumption in different scenarios and also pinpoints the parameters with the highest influence on power consumption. In the last part, the mathematical model is evaluated by comparing it with the experimental values.
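As a minimal illustration of the kind of model DoE tools produce (not the paper's actual model; the design, data and coefficients are invented), the sketch below fits a first-order model with two-factor interactions to a 2^3 full factorial design over coded factors for bandwidth, link load and number of connections:

```python
import numpy as np
from itertools import product

# 2^3 full factorial design in coded units (-1 = low level, +1 = high level)
# for the three factors: bandwidth (A), link load (B), number of connections (C).
design = np.array(list(product([-1, 1], repeat=3)), dtype=float)

# Hypothetical measured switch power (W) for the eight runs - invented numbers.
power = np.array([33.1, 33.4, 34.0, 34.6, 33.2, 33.7, 34.3, 35.1])

A, B, C = design.T
# Model: P = b0 + b1*A + b2*B + b3*C + b12*A*B + b13*A*C + b23*B*C
X = np.column_stack([np.ones(8), A, B, C, A * B, A * C, B * C])
coeffs, *_ = np.linalg.lstsq(X, power, rcond=None)

for name, b in zip(["b0", "A", "B", "C", "AB", "AC", "BC"], coeffs):
    print(f"{name:>2}: {b:+.3f}")
```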

Relevance: 30.00%

Publisher:

Abstract:

This thesis introduces heat demand forecasting models generated using data mining algorithms. The forecast spans one full day, and it can be used for regulating the heat consumption of buildings. For training the data mining models, two years of heat consumption data from a case building and weather measurement data from the Finnish Meteorological Institute are used. The thesis utilizes Microsoft SQL Server Analysis Services data mining tools for generating the data mining models and the CRISP-DM process framework to implement the research. The results show that the built models can predict heat demand at best with mean absolute percentage errors of 3.8% for the 24-h profile and 5.9% for the full day. A deployment model for integrating the generated data mining models into an existing building energy management system is also discussed.
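For reference, the mean absolute percentage error used to score such forecasts can be computed as in the sketch below (a generic illustration with invented values, not the thesis's data):

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Invented hourly heat demand (kW) for one day and a corresponding forecast.
actual = np.array([42, 41, 40, 40, 43, 48, 55, 60, 58, 54, 50, 48,
                   47, 46, 46, 48, 52, 57, 60, 58, 54, 50, 46, 44])
forecast = actual * (1 + 0.04 * np.sin(np.arange(24)))

print(f"MAPE over the 24-h profile: {mape(actual, forecast):.1f}%")
```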

Relevance: 30.00%

Publisher:

Abstract:

Press forming is nowadays one of the most common industrial methods for producing deeper trays from paperboard. Demands for material properties such as recyclability and sustainability have increased in the packaging industry as well, but there are still limitations related to the formability of paperboard. A majority of recent studies have focused on material development, but the potential of the package manufacturing process can also be improved by the development of tooling and process control. In this study, advanced converting tools (die cutting tools and the press forming mould) are created for production-scale paperboard tray manufacturing. Monitoring methods that enable the production of paperboard trays with enhanced quality, and that can be utilized in process control, are also developed. The principles of tray blank preparation, including creasing pattern and die cutting tool design, are introduced. The mould heating arrangement and the determination of mould clearance are investigated to improve the quality of the press-formed trays. The effect of the spring-back of the tray walls on the tray dimensions can be managed by adjusting the heat-related process parameters and by estimating it at the mould design stage. This enables production speed optimization, as the process parameters can be adjusted more freely. Real-time monitoring of the pressing force, using multiple force sensors embedded in the mould structure, can be utilized in the evaluation of material characteristics on modified production machinery. Comprehensive process control can be achieved with a combination of measurement of the outer dimensions of the trays and pressing force monitoring. The control method enables the detection of defects and the tracking of changes in the material properties. The optimized converting tools provide a basis for effective operation of the control system.
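As one hypothetical illustration of how embedded force-sensor readings could feed defect detection (this is not the control method developed in the study; all values are invented), the sketch below flags press cycles whose peak force drifts outside simple control limits derived from a reference run:

```python
import numpy as np

# Invented peak pressing forces (kN) from a reference run of good trays.
reference_peaks = np.array([11.8, 12.1, 12.0, 11.9, 12.2, 12.0, 11.7, 12.1])

mean = reference_peaks.mean()
sigma = reference_peaks.std(ddof=1)
lower, upper = mean - 3 * sigma, mean + 3 * sigma   # 3-sigma control limits

# Invented peak forces from subsequent production cycles.
production_peaks = np.array([12.0, 11.9, 12.3, 13.4, 11.8, 10.9])

for i, peak in enumerate(production_peaks, start=1):
    status = "OK" if lower <= peak <= upper else "CHECK: possible defect or material change"
    print(f"cycle {i}: peak force {peak:.1f} kN -> {status}")
```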

Relevance: 30.00%

Publisher:

Abstract:

The traditional business models and the traditionally successful development methods that were distinctive of the industrial era do not satisfy the needs of modern IT companies. Due to the rapid nature of IT markets, the uncertainty of new innovations' success and the overwhelming competition with established companies, startups need to make quick decisions and eliminate wasted resources more effectively than ever before. There is a need for an empirical basis on which to build business models, as well as to evaluate the presumptions regarding value and profit. Less than ten years ago, the Lean software development principles and practices became widely known in academic circles. Those practices help startup entrepreneurs to validate their learning, test their assumptions and become increasingly dynamic and flexible. What is special about today's software startups is that they are increasingly individual. Quantitative research studies are available regarding the details of Lean startups. Broad research with hundreds of companies presented in a few charts is informative, but a detailed study of fewer examples gives an insight into the way software entrepreneurs see the Lean startup philosophy and how they describe it in their own words. This thesis focuses on the early phases of Lean software startups, namely Customer Discovery (discovering a valuable solution to a real problem) and Customer Validation (being in a good market with a product which satisfies that market). The thesis first offers a sufficiently compact insight into the Lean software startup concept for a reader who is not previously familiar with the term. The Lean startup philosophy is then put to a real-life test, based on interviews with four Finnish Lean software startup entrepreneurs. The interviews reveal 1) whether the Lean startup philosophy is actually valuable for them, 2) how the theory can be practically implemented in real life and 3) whether theoretical Lean startup knowledge compensates for a lack of entrepreneurship experience. The reader becomes familiar with the key elements and tools of Lean startups, as well as their mutual connections. The thesis explains why Lean startups waste less time and money than many other startups. The thesis, especially its research sections, aims at providing data and analysis simultaneously.


Relevance: 30.00%

Publisher:

Abstract:

Protein evolution is an important area of bioinformatics research, and it motivates the search for alignment tools that can be used reliably and that model the evolution of a protein family accurately. TM-align (Zhang and Skolnick, 2005) is considered the tool of choice for such a task, in terms of both speed and accuracy. Consequently, in this study, TM-align was used as a reference point to help detect other alignment tools that are able to capture protein evolution accurately. In parallel, we extended the existing protein secondary structure exploration tool, Helix Explorer (Marrakchi, 2006), so that it can also be used as a tool for modeling protein evolution.
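For context (this is not stated in the abstract), the structural similarity measure optimized by TM-align, the TM-score, is commonly defined as

\[
\text{TM-score} \;=\; \max\!\left[\frac{1}{L_{\text{target}}}\sum_{i=1}^{L_{\text{ali}}}\frac{1}{1+\bigl(d_i/d_0(L_{\text{target}})\bigr)^{2}}\right],
\qquad
d_0(L_{\text{target}}) \;=\; 1.24\,\sqrt[3]{L_{\text{target}}-15}\;-\;1.8,
\]

where \(L_{\text{target}}\) is the length of the target protein, \(L_{\text{ali}}\) the number of aligned residue pairs, \(d_i\) the distance between the \(i\)-th pair of aligned residues, and the maximum is taken over all alignments; scores range from 0 to 1, with values above roughly 0.5 usually indicating the same fold.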

Relevance: 30.00%

Publisher:

Abstract:

Autophagy is a highly conserved pathway for the lysosomal degradation of cellular constituents that is essential to cellular homeostasis and contributes to antigen processing and presentation. The relatively recently described roles of autophagy in innate and acquired immunity underpin new immunological paradigms that may facilitate the development of new therapies where the deregulation of autophagy is associated with autoimmune diseases. However, studying the autophagic response in vivo is difficult because of the limited number of analytical methods that can provide a dynamic picture of the key proteins involved in this pathway. Accordingly, we developed an integrated proteomics research program to identify and quantify autophagy-associated proteins and to determine the molecular mechanisms governing the functions of the autophagosome in antigen presentation, using a systems biology approach. To study how autophagy and antigen presentation are actively regulated in macrophages, we first carried out a large-scale proteomic study under different conditions known to stimulate autophagy, such as cytokine activation and viral infection. The cytokine tumor necrosis factor-alpha (TNF-alpha) is one of the main pro-inflammatory cytokines mediating local and systemic reactions in order to mount an adaptive immune response. Quantitative proteomics of membrane extracts from control and TNF-alpha-stimulated macrophages revealed that macrophage activation led to the degradation of mitochondrial proteins and to changes in the abundance of several proteins involved in vesicular trafficking and the immune response. We found that the degradation of mitochondrial proteins was under the control of the ATG5 pathway and was specific to TNF-alpha. Furthermore, using a new antigen presentation system, we found that the induction of mitophagy by TNF-alpha led to the processing and presentation of mitochondrial antigens by MHC class I molecules, thereby contributing to the variation of the immunopeptidome repertoire at the cell surface. These results highlight an unsuspected role of TNF-alpha in mitophagy and provide a better understanding of the mechanisms responsible for the presentation of self-antigens by MHC class I molecules. A complex interplay also exists between viral infection and autophagy. Recently, our laboratory provided the first evidence suggesting that macroautophagy can contribute to the presentation of viral proteins by MHC class I molecules during infection with herpes simplex virus type 1 (HSV-1). HSV-1 is among the most complex and most widespread human viruses. Although the composition of viral particles has been studied previously, less is known about the expression of the entire viral proteome during the infection of host cells. To characterize the dynamic changes in viral protein expression during infection, we analyzed the HSV-1 proteome of infected macrophages by LC-MS/MS. These analyses allowed us to identify a total of 67 structural and non-structural viral proteins (82% of the HSV-1 proteome) using the LTQ-Orbitrap mass spectrometer.
We also identified 90 new phosphorylation sites and ten new ubiquitylation sites on different viral proteins. Following ubiquitylation, viral proteins can localize to the nucleus or participate in fusion events with the nuclear membrane, suggesting that this modification could influence the vesicular trafficking of viral proteins. Treatment with inhibitors of DNA replication induced changes in the abundance and modification of viral proteins, highlighting the interdependence of viral proteins during the viral life cycle. Given the importance of expression dynamics, ubiquitylation and phosphorylation for the function of viral proteins, these results will open the way to new studies on the biology of herpes viruses. Interestingly, HSV-1 infection of macrophages triggers a new form of autophagy that differs remarkably from macroautophagy. This process, called nuclear envelope derived autophagy (NEDA), leads to the formation of membrane vesicles containing four lipid layers derived from the nuclear envelope, in which a large proportion of certain viral proteins, such as glycoprotein B, is found. The mechanisms governing NEDA and their importance during viral infection are still poorly understood. Using an antigen presentation assay, we were able to show that the NEDA pathway is independent of ATG5 and participates in the processing and presentation of viral antigens by MHC class I. To understand the involvement of NEDA in antigen presentation, it is essential to characterize the proteome of autophagosomes isolated from HSV-1-infected macrophages. We therefore developed a new fractionation approach based on the isolation of latex-bead-loaded lysosomes, allowing us to obtain cell extracts enriched in autophagosomes. The transfer of HSV-1 antigens into autophagosomes was determined by quantitative proteomics. Proteins derived from the nuclear envelope were preferentially transferred into autophagosomes during the infection of macrophages by HSV-1. Proteomic analyses of autophagosomes formed through NEDA or macroautophagy uncovered mechanisms playing a key role in the immunodominance of glycoprotein B during HSV-1 infection. These analyses also revealed that different autophagic pathways can be induced to promote the selective capture of viral proteins, dynamically shaping the nature of the immune response during an infection. In conclusion, the application of quantitative proteomics methods played a key role in identifying and quantifying proteins with important roles in the regulation of autophagy in macrophages, and allowed us to identify the changes that occur during autophagosome formation in inflammatory diseases or viral infection. In addition, our systems biology approach, which combines mass-spectrometry-based quantitative proteomics with functional assays such as antigen presentation, allowed us to gain new insight into the molecular mechanisms governing the functions of autophagy in antigen presentation.
A better understanding of these mechanisms will make it possible to reduce the harmful effects of immunodominance following viral infection or during cancer development by mounting an appropriate immune response.

Relevance: 30.00%

Publisher:

Abstract:

The present thesis can be divided into three areas: 1) the fabrication of a low-temperature photo-luminescence and photoconductivity measuring unit, 2) photo-luminescence in the chalcopyrite CuInSe2 and CuInS2 system for defect and composition analysis, and 3) photo-luminescence and photo-conductivity of In2S3. This thesis shows that photo-luminescence is one of the most essential semiconductor characterization tools for a scientific group working on photovoltaics. Tools that are robust and non-destructive, require minimal sample preparation and are most informative for device applications are sought after by industry, and this thesis works towards establishing photo-luminescence as "THE" tool for semiconductor characterization. The possible application of photo-luminescence as a tool for compositional and quality analysis of semiconductor thin films is also examined. Photo-conductivity complements photo-luminescence, and together they provide all the information required for the fabrication of an opto-electronic device.
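As a small generic illustration of how a photo-luminescence peak position relates to an estimated optical transition energy (not code from the thesis; the example wavelength is invented), the emission wavelength can be converted to photon energy with E [eV] ≈ 1239.84 / λ [nm]:

```python
# Convert a photo-luminescence peak wavelength to photon energy.
# E [eV] = h*c / lambda  ~=  1239.84 eV*nm / lambda[nm]

HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def peak_energy_ev(wavelength_nm: float) -> float:
    """Photon energy (eV) of an emission peak at the given wavelength (nm)."""
    return HC_EV_NM / wavelength_nm

# Invented example: a near-band-edge emission peak at 1215 nm.
print(f"{peak_energy_ev(1215.0):.2f} eV")  # ~1.02 eV
```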

Relevance: 30.00%

Publisher:

Abstract:

Artificial neural networks (ANNs) are relatively new computational tools that have found extensive use in solving many complex real-world problems. This paper describes how an ANN can be used to identify the spectral lines of elements. The spectral lines of cadmium (Cd), calcium (Ca), iron (Fe), lithium (Li), mercury (Hg), potassium (K) and strontium (Sr) in the visible range are chosen for the investigation. One of the unique features of this technique is that it uses the whole spectrum in the visible range instead of individual spectral lines. The spectrum of a sample taken with a spectrometer contains both original peaks and spurious peaks, and it is a tedious task to identify these peaks to determine the elements present in the sample. The ANN's capability of retrieving original data from a noisy spectrum is also explored, and the importance of sufficient training data for obtaining accurate results is emphasized. Two networks are examined: one trained on all spectral lines and the other on the persistent lines only. The network trained on all spectral lines is found to be superior in analyzing the spectrum, even in a noisy environment.
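A minimal sketch of the general approach (not the network or data from the paper; the synthetic spectra, example line positions and network size are illustrative assumptions) might look as follows, training a small feed-forward network on whole noisy spectra labelled by element:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
wavelengths = np.linspace(400, 700, 300)          # visible range, nm
elements = {"Ca": [422.7, 612.2], "Hg": [435.8, 546.1], "Li": [610.4, 670.8]}

def synth_spectrum(lines, noise=0.05):
    """Sum of Gaussian peaks at the given line positions plus random noise."""
    s = sum(np.exp(-0.5 * ((wavelengths - l) / 1.5) ** 2) for l in lines)
    return s + noise * rng.standard_normal(wavelengths.size)

X, y = [], []
for name, lines in elements.items():
    for _ in range(200):                          # 200 noisy spectra per element
        X.append(synth_spectrum(lines))
        y.append(name)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(np.array(X), np.array(y))

print(clf.predict([synth_spectrum(elements["Hg"], noise=0.2)]))  # -> ['Hg']
```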

Relevance: 30.00%

Publisher:

Abstract:

To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal cutting conditions; however, attaining optimum values every time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective methods of optimization. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes. In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirlosker Turn Master 35 lathe. Analyses using the S/N ratio and ANOVA were performed to find the optimum level and the percentage contribution of each parameter; the S/N analysis yields the optimum machining parameters from the experimentation. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively check new design solutions in the relevant search space in order to reach the true optimum solution. A mathematical model for surface roughness was developed using response surface analysis, and the model was validated against published results from the literature. Optimization methodologies such as Simulated Annealing (SA), Particle Swarm Optimization (PSO), a Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA) were applied to optimize the machining parameters for dry turning of SS420 material. All of the above algorithms were tested for efficiency, robustness and accuracy, and it was observed that they often outperform the conventional optimization method when applied to difficult real-world problems. The SA, PSO, CGA and IGA codes were developed using MATLAB. For each evolutionary algorithmic method, optimum cutting conditions are provided to achieve a better surface finish. The computational results using SA clearly demonstrated that the proposed solution procedure is quite capable of solving such complicated problems effectively and efficiently. Particle Swarm Optimization is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations; the results show that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm, incorporating a stochastic crossover technique and an artificial initial population scheme, was developed to provide a faster search mechanism. Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420 material, arriving at optimum machining parameters of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion. To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating the evolutionary procedures that nature uses to optimize its own systems.
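A minimal sketch of the particle swarm idea applied to such a problem (not the thesis's MATLAB code; the roughness model, parameter bounds and PSO constants are illustrative assumptions) could look like this:

```python
import numpy as np

rng = np.random.default_rng(42)

def roughness(x):
    """Hypothetical surface-roughness model Ra(speed, feed, depth) - illustrative only."""
    speed, feed, depth = x.T
    return 2.0 - 0.004 * speed + 12.0 * feed + 0.8 * depth + 4e-6 * speed**2 + 30.0 * feed**2

# Parameter bounds: cutting speed (m/min), feed (mm/rev), depth of cut (mm).
lower = np.array([100.0, 0.05, 0.5])
upper = np.array([400.0, 0.30, 2.0])

n_particles, n_iter, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.uniform(lower, upper, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = roughness(pbest)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lower, upper)
    val = roughness(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("optimum parameters (speed, feed, depth):", np.round(gbest, 3))
print("predicted minimum roughness:", round(roughness(gbest[None, :])[0], 3))
```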