9 results for Optimization techniques

at Cochin University of Science


Relevance:

70.00%

Publisher:

Abstract:

To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal cutting conditions; however, attaining optimum values consistently is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective optimization methods. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature offers and to use them to solve real-world optimization problems in manufacturing processes.

In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes have been identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turnmaster 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter, and the optimum machining parameters were obtained from the experiments using S/N analysis.

Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively explore new design solutions within the search space in order to reach the true optimum. A mathematical model for surface roughness has been developed using response surface analysis, and the model was validated against published results from the literature. Simulated Annealing (SA), Particle Swarm Optimization (PSO), a Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA) are applied to optimize the machining parameters for dry turning of SS420. All of the above algorithms were tested for efficiency, robustness and accuracy, and it was observed how often they outperform conventional optimization methods on difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB, and for each evolutionary method optimum cutting conditions are provided to achieve better surface finish.

The computational results using SA clearly demonstrate that the proposed solution procedure is capable of solving such complicated problems effectively and efficiently. PSO is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behaviour of biological populations; the results show that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm, incorporating a stochastic crossover technique and an artificial initial population scheme, is developed to provide a faster search mechanism.

Finally, these algorithms were compared for the specific example of dry turning of SS420, arriving at optimum values of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion. To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating evolutionary procedures that nature uses to optimize its own systems.
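
A minimal sketch of the kind of particle swarm search applied above to cutting-parameter selection is shown below. The quadratic response-surface model Ra() and the parameter bounds are hypothetical placeholders for illustration; they are not the thesis's fitted model or experimental ranges.

```python
# Minimal PSO sketch for minimizing a surface-roughness response surface over
# cutting speed, feed and depth of cut. The Ra() model and the bounds below are
# assumed placeholders, not values from the thesis.
import numpy as np

rng = np.random.default_rng(0)

def Ra(x):
    """Hypothetical surface-roughness response surface in (speed, feed, depth of cut)."""
    v, f, d = x[..., 0], x[..., 1], x[..., 2]
    return 2.0 - 0.01 * v + 12.0 * f + 0.8 * d + 0.00002 * v**2 + 30.0 * f**2

lo = np.array([60.0, 0.05, 0.5])    # cutting speed (m/min), feed (mm/rev), depth of cut (mm)
hi = np.array([180.0, 0.30, 2.0])

n_particles, iters = 30, 100
x = rng.uniform(lo, hi, size=(n_particles, 3))   # particle positions
vel = np.zeros_like(x)                           # particle velocities
pbest, pbest_val = x.copy(), Ra(x)
gbest = pbest[np.argmin(pbest_val)]

w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration coefficients
for _ in range(iters):
    r1, r2 = rng.random((n_particles, 3)), rng.random((n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + vel, lo, hi)
    val = Ra(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best (speed, feed, depth):", np.round(gbest, 4), " predicted Ra:", round(float(Ra(gbest)), 4))
```

The same objective could equally be handed to an SA or GA routine; in the work summarized above, all four metaheuristics were coded in MATLAB and compared on the dry-turned SS420 case.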

Relevance:

40.00%

Publisher:

Abstract:

The thesis deals with the preparation and dielectric characterization of polyaniline and its analogues in the ISM band frequency range of 2-4 GHz, which forms part of the microwave region (300 MHz to 300 GHz) of the electromagnetic spectrum, together with an initial dielectric study at high frequency (0.05 MHz to 13 MHz). Polyaniline has been synthesized by an in situ doping reaction at different temperatures in the presence of inorganic dopants such as HCl, H2SO4, HNO3 and HClO4, and organic dopants such as camphorsulphonic acid [CSA], toluenesulphonic acid [TSA] and naphthalenesulphonic acid [NSA]. The variation in dielectric properties with change in reaction temperature, dopant and frequency has been studied. The effect of codopants and microemulsions on the dielectric properties has also been studied in the ISM band. The ISM band of frequencies (2-4 GHz) is of great utility in Industrial, Scientific and Medical (ISM) applications. Microwave heating is a very efficient method of heating dielectric materials and is extensively used in industrial as well as household heating applications.

Relevance:

30.00%

Publisher:

Abstract:

Analog-to-Digital Converters (ADCs) have an important impact on the overall performance of signal processing systems. This research explores efficient techniques for the design of sigma-delta ADCs, especially for multi-standard wireless transceivers. In particular, the aim is to develop novel models and algorithms to address this problem and to implement software tools that are able to assist the designer's decisions in the system-level exploration phase. To this end, this thesis presents a framework of techniques for designing sigma-delta analog-to-digital converters. A 2-2-2 reconfigurable sigma-delta modulator is proposed which can meet the design specifications of three wireless communication standards, namely GSM, WCDMA and WLAN. A sigma-delta modulator design tool is developed using the Graphical User Interface Development Environment (GUIDE) in MATLAB. A Genetic Algorithm (GA) based search method is introduced to find the optimum values of the scaling coefficients and to maximize the dynamic range of the sigma-delta modulator.
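
To make the GA-based coefficient search concrete, here is a minimal sketch. The fitness function is a hypothetical placeholder standing in for a behavioural simulation of the 2-2-2 modulator, and the coefficient count, bounds and GA settings are assumptions for illustration only.

```python
# Minimal GA sketch for searching modulator scaling coefficients. The fitness()
# below is an assumed placeholder; a real flow would evaluate dynamic range from
# a behavioural simulation of the reconfigurable sigma-delta modulator.
import numpy as np

rng = np.random.default_rng(1)
n_coeff, pop_size, gens = 6, 40, 200
lo, hi = 0.1, 1.0

def fitness(pop):
    # Placeholder: reward coefficient sets near an assumed well-scaled profile.
    target = np.linspace(0.8, 0.3, n_coeff)
    return -np.sum((pop - target) ** 2, axis=-1)

pop = rng.uniform(lo, hi, size=(pop_size, n_coeff))
for _ in range(gens):
    fit = fitness(pop)
    # Binary tournament selection
    idx = rng.integers(0, pop_size, size=(pop_size, 2))
    parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Uniform crossover followed by small Gaussian mutation
    mates = parents[rng.permutation(pop_size)]
    mask = rng.random((pop_size, n_coeff)) < 0.5
    children = np.where(mask, parents, mates) + rng.normal(0.0, 0.02, (pop_size, n_coeff))
    pop = np.clip(children, lo, hi)

print("best scaling coefficients:", np.round(pop[np.argmax(fitness(pop))], 3))
```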

Relevance:

30.00%

Publisher:

Abstract:

The present study aimed at the utilisation of microbial organisms for the production of good quality chitin and chitosan. The three strains used for the study were Lactobacillus plantarum, Lactobacillus brevis and Bacillus subtilis. These strains were selected on the basis of their acid-producing ability, which reduces the pH of the fermenting substrate to prevent spoilage and thereby causes demineralisation of the shell. Besides, the proteolytic enzymes of these strains act on the proteinaceous covering of the shrimp and thus cause deproteinisation of the shrimp shell waste. Thus the two processes involved in chitin production can be effected to a certain extent using bacterial fermentation of shrimp shell.

Optimization parameters for fermentation with the three different strains, such as fermentation period, quantity of inoculum, type of sugar and concentration of sugar, were studied. For this, parameters such as pH, total titratable acidity (TTA), changes in sugar concentration, changes in microbial count and sensory changes were monitored.

The fermentation study with Lactobacillus plantarum was carried out in 20% w/v jaggery broth for 15 days. The inoculum prepared yielded a cell concentration of approximately 10^8 CFU/ml. In the present study, lactic acid and dilute hydrochloric acid were used for initial pH adjustment because, without adjusting the initial pH, it took more than 5 hours for the lactic acid bacteria to convert glucose to lactic acid, and during this delay spoilage occurred due to putrefying enzymes active at neutral or higher pH. During the fermentation study, pH first decreased in correspondence with an increase in TTA values, a clear indication of acid production by the strain. This trend continued while the proteolytic activity showed an increasing trend. When the available sugar source started depleting, proteolytic activity decreased and pH increased, which was clearly reflected in the sensory evaluation results. Lactic acid treated samples showed a greater extent of demineralisation and deproteinisation at the end of the fermentation study than hydrochloric acid treated samples. This can be attributed to the effect of the strong hydrochloric acid on the initial microbial count, which directly affects the fermentation process. At the end of fermentation, about 76.5% of the ash was removed in lactic acid treated samples and 71.8% in hydrochloric acid treated samples; 72.8% of the protein was removed in lactic acid treated samples and 70.6% in hydrochloric acid treated samples. The residual protein and ash in the fermented residue were reduced to permissible limits by treatment with 0.8 N HCl and 1 M NaOH. Characteristics of the chitin, such as chitin content, ash content, protein content and degree of N-acetylation, were studied. Quality characteristics such as viscosity, degree of deacetylation and molecular weight of the prepared chitosan were also compared. The chitosan samples prepared from lactic acid treated shells showed higher viscosity than the HCl treated samples, but the degree of deacetylation was higher in the HCl treated samples than in the lactic acid treated ones. Characteristics of the protein liquor obtained, such as its biogenic composition, amino acid composition, total volatile base nitrogen and alpha amino nitrogen, were also studied to find out its suitability as an animal feed supplement.

Optimization of fermentation parameters for the Lactobacillus brevis fermentation study was also conducted and the parameters were standardized. A detailed fermentation study was then done in 20% w/v jaggery broth for 17 days. The effect of the two different acid treatments (mild HCl and lactic acid) used for initial pH adjustment on chitin production was also studied. In this study too, the trends of changes in pH, sugar concentration and microbial count were similar to the Lactobacillus plantarum studies. At the end of fermentation, the residual protein was only 32.48% in HCl treated samples and 31.85% in lactic acid treated samples, while the residual ash content was about 33.68% in HCl treated samples and 32.52% in lactic acid treated ones. The fermented residue was converted to chitin with good characteristics by treatment with 1.2 M NaOH and 1 N HCl. Characteristics of the prepared chitin samples were studied; the extent of N-acetylation assessed from the FTIR spectrum was about 84% in HCl treated chitin and 85% in lactic acid treated chitin. Chitosan was prepared from these samples by the usual chemical method and its solubility, degree of deacetylation, viscosity and molecular weight were studied. The viscosity and molecular weight of these samples were comparatively lower than those of the chitosan prepared by Lactobacillus plantarum fermentation. Characteristics of the protein liquor obtained were analyzed to determine its quality and its suitability as an animal feed supplement.

The third strain used for the study was Bacillus subtilis, and fermentation was carried out in 20% w/v jaggery broth for 15 days. It was found that Bacillus subtilis was more efficient than the Lactobacillus species for deproteinisation and demineralisation, mainly due to the difference in the proteolytic nature of the strains. About 84% of the protein and 72% of the ash were removed at the end of fermentation. Considering the statistical significance (P
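
For reference, the demineralisation and deproteinisation percentages quoted above follow from a simple removal calculation; the sketch below shows it with illustrative initial and residual values rather than measured data from this thesis.

```python
# Percentage removal of ash (demineralisation) and protein (deproteinisation).
# The initial/residual figures are assumed, illustrative values only.
def percent_removed(initial, residual):
    return 100.0 * (initial - residual) / initial

initial_ash, residual_ash = 30.0, 7.0            # g per 100 g dry shell (assumed)
initial_protein, residual_protein = 35.0, 9.5    # g per 100 g dry shell (assumed)

print(f"demineralisation: {percent_removed(initial_ash, residual_ash):.1f}%")
print(f"deproteinisation: {percent_removed(initial_protein, residual_protein):.1f}%")
```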

Relevance:

30.00%

Publisher:

Abstract:

Controlling inorganic nitrogen by manipulating the carbon/nitrogen ratio is a method gaining importance in aquaculture systems. Nitrogen control is induced by feeding bacteria with carbohydrates and through the subsequent uptake of nitrogen from the water for the synthesis of microbial proteins. The relationship between the addition of carbohydrates, the reduction of ammonium and the production of microbial protein depends on the microbial conversion coefficient, and the carbon/nitrogen ratio in the microbial biomass is related to the carbon content of the added material. The addition of a carbonaceous substrate was found to reduce inorganic nitrogen in shrimp culture ponds, and the resulting microbial proteins are taken up by the shrimp. Thus part of the feed protein is replaced and feeding costs are reduced in culture systems.

The use of various locally available substrates for periphyton-based aquaculture practices increases production and profitability. However, these techniques have not so far been evaluated for extensive shrimp farming. Moreover, an evaluation of artificial substrates together with a carbohydrate-source-based farming system in reducing inorganic nitrogen production in culture systems has not yet been carried out, and variations in water and soil quality, periphyton production and shrimp production of the whole system have also not been determined so far.

This thesis starts with a general introduction and a brief review of the most relevant literature, presents the results of various experiments, and concludes with a summary (Chapter 9). The chapters are organised according to the objectives of the present study. The major objectives of this thesis are to improve the sustainability of shrimp farming through carbohydrate addition and periphyton-substrate-based shrimp production, and to improve nutrient utilisation in aquaculture systems.
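
As a rough illustration of the carbohydrate-addition arithmetic behind this C/N manipulation, the sketch below estimates the carbohydrate needed to immobilise a given amount of inorganic nitrogen. The coefficients (carbohydrate carbon fraction, microbial conversion efficiency, microbial C:N ratio) are assumed textbook-style values, not figures taken from this thesis.

```python
# Estimate carbohydrate addition needed to convert inorganic N into microbial protein.
# All coefficients below are assumed illustrative values.
def carbohydrate_needed(delta_n_g, carb_c_fraction=0.5, conversion_eff=0.4, microbial_cn=4.0):
    """Grams of carbohydrate needed to immobilise delta_n_g grams of inorganic nitrogen."""
    # N taken up = (carbohydrate added * C fraction * microbial conversion efficiency) / microbial C:N
    return delta_n_g * microbial_cn / (carb_c_fraction * conversion_eff)

# e.g. to assimilate 1 g of ammonium-N from the water column:
print(carbohydrate_needed(1.0), "g carbohydrate per g inorganic N")   # -> 20.0
```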

Relevance:

30.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This motivates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis can significantly improve software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making the early detection of software bugs that are otherwise hard to detect more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs. Incorrect machine-code sequences are identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly from machine-code patterns, which drastically reduces state space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
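
To make the redundant bank-switching detection concrete, the sketch below checks a single straight-line instruction sequence (no control flow) and flags bank-select operations that leave the active bank unchanged. The instruction tuples and the BANKSEL abstraction are illustrative assumptions, not the PIC16F87X encoding or the thesis's relation-matrix implementation.

```python
# Flag bank-select instructions that are redundant because the selected bank is
# already active. 'BANKSEL n' abstracts the RP0/RP1 status-bit updates.
def find_redundant_bank_switches(program):
    """program: list of (mnemonic, operand) tuples for one straight-line path."""
    active_bank = None          # bank state unknown at entry
    redundant = []
    for i, (op, arg) in enumerate(program):
        if op == "BANKSEL":
            if active_bank == arg:      # state transition to the same bank
                redundant.append(i)     # candidate for elimination
            active_bank = arg
    return redundant

program = [
    ("BANKSEL", 1), ("MOVWF", "TRISB"),
    ("BANKSEL", 1), ("MOVWF", "TRISC"),   # redundant: bank 1 is already active
    ("BANKSEL", 0), ("MOVWF", "PORTB"),
]
print(find_redundant_bank_switches(program))   # -> [2]
```

Extending this to the general case means tracking the possible bank states along every path of the control flow graph, which is what the relation matrix and state transition diagram in the proposed algorithm capture.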

Relevance:

30.00%

Publisher:

Abstract:

Short-term load forecasting is one of the key inputs for optimizing the management of a power system. Almost 60-65% of the revenue expenditure of a distribution company is against power purchase, and the cost of power depends on the source of power. Hence any optimization strategy involves optimization in scheduling power from various sources. As the scheduling involves many technical and commercial considerations and constraints, the efficiency of scheduling depends on the accuracy of the load forecast. Load forecasting is a much-visited topic in the research literature, and a number of papers using different techniques have already been presented. The accuracy of the forecast for the purpose of merit-order dispatch decisions depends on the extent of the permissible variation in generation limits. For a system with a low load factor, the peak and the off-peak trough are prominent, and the forecast should identify these points with greater accuracy rather than merely minimizing the error in the energy content. In this paper an attempt is made to apply an Artificial Neural Network (ANN) with a supervised-learning-based approach to short-term load forecasting for a power system with a comparatively low load factor. Such power systems are usual in tropical areas with a concentrated rainy season for a considerable part of the year.
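
A minimal sketch of such a supervised ANN forecaster is shown below, using scikit-learn's MLPRegressor on synthetic hourly data with a pronounced evening peak (a low-load-factor shape). The feature set, network size and synthetic load profile are assumptions for illustration, not the paper's data or model.

```python
# Minimal supervised-ANN short-term load forecast on synthetic hourly data.
# Features: hour-of-day encoding plus the load 1 hour and 24 hours earlier.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
hours = np.arange(24 * 120)                                     # ~4 months of hourly points
load = 300 + 120 * np.sin(2 * np.pi * (hours % 24 - 13) / 24)   # evening-peaked daily profile (MW, assumed)
load += 20 * rng.normal(size=hours.shape)                       # noise

X = np.column_stack([
    np.sin(2 * np.pi * (hours % 24) / 24),
    np.cos(2 * np.pi * (hours % 24) / 24),
    np.roll(load, 1),                                           # load one hour earlier
    np.roll(load, 24),                                          # load same hour, previous day
])[24:]
y = load[24:]

split = len(y) - 24                                             # hold out the final day
scaler = StandardScaler().fit(X[:split])
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(scaler.transform(X[:split]), y[:split])
pred = model.predict(scaler.transform(X[split:]))
print("MAPE on held-out day: %.1f%%" % (100 * np.mean(np.abs((pred - y[split:]) / y[split:]))))
```

Because a low load factor makes the peak and off-peak trough prominent, evaluating the forecast hour by hour on a held-out day, rather than on total energy error alone, matches the dispatch-oriented accuracy requirement discussed above.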

Relevance:

30.00%

Publisher:

Abstract:

The aim of the thesis was to design and develop spatially adaptive denoising techniques with edge and feature preservation, for images corrupted with additive white Gaussian noise and for SAR images affected by speckle noise. Image denoising is a well-researched topic and has found multifaceted applications in our day-to-day life. Image denoising based on multiresolution analysis using the wavelet transform has received considerable attention in recent years. The directionlet-based denoising schemes presented in this thesis are effective in preserving image-specific features such as edges and contours during denoising. The scope of this research remains open in areas such as further optimization in terms of speed and extension of the techniques to related areas such as colour and video denoising. Such studies would further augment the practical use of these techniques.
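
The directionlet-based schemes developed in the thesis are not available in standard libraries, so the sketch below shows the related multiresolution baseline they build on: wavelet-domain soft thresholding of an image corrupted with additive white Gaussian noise, using PyWavelets. The universal threshold with a MAD noise estimate is a common textbook choice here, not the thesis's spatially adaptive rule.

```python
# Wavelet soft-threshold denoising of an AWGN-corrupted image (baseline illustration).
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Noise sigma from the finest diagonal subband via the median absolute deviation
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(img.size))                 # universal threshold
    den = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in detail) for detail in coeffs[1:]
    ]
    rec = pywt.waverec2(den, wavelet)
    return rec[: img.shape[0], : img.shape[1]]                  # trim any reconstruction padding

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 128), (128, 1))             # simple synthetic test image
noisy = clean + rng.normal(0, 20, clean.shape)
print("noisy RMSE:   ", round(float(np.sqrt(np.mean((noisy - clean) ** 2))), 2))
print("denoised RMSE:", round(float(np.sqrt(np.mean((wavelet_denoise(noisy) - clean) ** 2))), 2))
```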

Relevance:

30.00%

Publisher:

Abstract:

Post-transcriptional gene silencing by RNA interference is mediated by small interfering RNA (siRNA). This gene silencing mechanism can be exploited therapeutically against a wide variety of disease-associated targets, especially AIDS, neurodegenerative diseases, cholesterol-related disorders and cancer in mice, with the hope of extending these approaches to treat humans. Over the recent past, a significant amount of work has been undertaken to understand the gene silencing mediated by exogenous siRNA. The design of efficient exogenous siRNA sequences is challenging because of many issues related to siRNA. While designing efficient siRNA, target mRNAs must be selected such that their corresponding siRNAs are likely to be efficient against that target and unlikely to accidentally silence other transcripts due to sequence similarity. So before performing gene silencing with siRNAs, it is essential to analyze their off-target effects in addition to their inhibition efficiency against the intended target. Hence designing exogenous siRNA with good knock-down efficiency and target specificity is an area of concern to be addressed. Some methods have already been developed that consider both the inhibition efficiency and the off-target possibility of an siRNA against a gene, but only a few of them achieve good inhibition efficiency, specificity and sensitivity.

The main focus of this thesis is to develop computational methods to optimize the efficiency of siRNA in terms of inhibition capacity and off-target possibility against target mRNAs with improved efficacy, which may be useful in the areas of gene silencing and drug design for tumour development. This study investigates the currently available siRNA prediction approaches and devises a better computational approach to tackle the problem of siRNA efficacy in terms of inhibition capacity and off-target possibility. The strengths and limitations of the available approaches are investigated and taken into consideration in developing an improved solution. The approaches proposed in this study extend some of the better-scoring previous state-of-the-art techniques by incorporating machine learning and statistical approaches, and thermodynamic features such as whole stacking energy, to improve the prediction accuracy, inhibition efficiency, sensitivity and specificity.

We propose one Support Vector Machine (SVM) model and two Artificial Neural Network (ANN) models for siRNA efficiency prediction. In the SVM model, the classification property is used to classify whether an siRNA is efficient or inefficient in silencing a target gene. The first ANN model, named siRNA Designer, is used for optimizing the inhibition efficiency of siRNA against target genes. The second ANN model, named Optimized siRNA Designer (OpsiD), produces efficient siRNAs with high inhibition efficiency to degrade target genes with improved sensitivity and specificity, and identifies the off-target knockdown possibility of an siRNA against non-target genes. The models are trained and tested against a large data set of siRNA sequences. The validations are conducted using the Pearson Correlation Coefficient, the Matthews Correlation Coefficient, Receiver Operating Characteristic analysis, prediction accuracy, sensitivity and specificity. It is found that OpsiD is capable of predicting the inhibition capacity of siRNA against a target mRNA with improved results over the state-of-the-art techniques. We are also able to understand the influence of whole stacking energy on the efficiency of siRNA.

The model is further improved by including the ability to identify the off-target possibility of a predicted siRNA on non-target genes. Thus the proposed model, OpsiD, can predict optimized siRNA by considering both inhibition efficiency on target genes and off-target possibility on non-target genes, with improved inhibition efficiency, specificity and sensitivity. Since efforts have been taken to optimize siRNA efficacy in terms of inhibition efficiency and off-target possibility, we hope that the risk of off-target effects in gene silencing across various bioinformatics applications can be reduced to a great extent. These findings may provide new insights into cancer diagnosis, prognosis and therapy by gene silencing, and the approach may be useful for designing exogenous siRNA for therapeutic applications and gene silencing techniques in different areas of bioinformatics.
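
A minimal sketch of the SVM classification step described above follows: deciding from simple sequence-derived features whether an siRNA is likely to be efficient or inefficient. The feature set (GC content, a 5' A/U on the guide strand, a U at position 10) and the tiny synthetic training set are illustrative stand-ins, not the thesis's feature model, thermodynamic features or data set.

```python
# Classify siRNA guide sequences as efficient/inefficient with an SVM on toy features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def features(seq):
    seq = seq.upper()
    gc = (seq.count("G") + seq.count("C")) / len(seq)            # GC content
    au_5prime = 1.0 if seq[0] in "AU" else 0.0                   # 5' A/U (rule-of-thumb design feature)
    u_pos10 = 1.0 if len(seq) > 9 and seq[9] == "U" else 0.0     # U at position 10
    return [gc, au_5prime, u_pos10]

# Toy training set of (19-nt guide sequence, efficient? 1/0); synthetic examples only.
data = [
    ("UUCGAAGUAUUCCGCGUAC", 1), ("GCGCGGCGGCGGCGGCGGC", 0),
    ("AUGGAUUUCGAUCGAUUAC", 1), ("CCGGCCGGCCGGAAGGCCG", 0),
    ("UAGCAUUAGCGAUAUCGUA", 1), ("GGGCCCGGGCCCGGGCCCG", 0),
    ("AUUCGAUAGCUAGCUAGCU", 1), ("CGCGCGCGCGCGCGCGCGC", 0),
]
X = np.array([features(s) for s, _ in data])
y = np.array([label for _, label in data])

model = SVC(kernel="rbf", C=1.0, gamma="scale")
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=4).mean())
model.fit(X, y)
print("prediction for a new sequence:", model.predict([features("AUCGAUUAGCGAUCGAAUU")]))
```

In the thesis the SVM classifier is complemented by the two ANN models (siRNA Designer and OpsiD), which score inhibition efficiency and off-target possibility rather than only the efficient/inefficient label.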