28 results for Hybrid heuristic algorithm

at Cochin University of Science


Relevance:

30.00%

Publisher:

Abstract:

A novel and fast technique for cryptographic applications is designed and developed using the symmetric key algorithm “MAJE4” and the popular asymmetric key algorithm “RSA”. The MAJE4 algorithm is used for encryption/decryption of files, since it is much faster and occupies less memory than RSA. The RSA algorithm is used to solve the problem of key exchange, as well as to accomplish scalability and message authentication. The focus is to develop a new hybrid system called MARS4 by combining the two cryptographic methods, with the aim of obtaining the advantages of both. The performance of MARS4 is evaluated in comparison with MAJE4 and RSA.
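As a rough illustration of the hybrid scheme described above, the sketch below wraps a random session key with textbook RSA and uses it to key a stream cipher. Since MAJE4's internals are not given here, a toy keyed XOR keystream stands in for it, and the tiny RSA parameters are for demonstration only (real deployments use far larger moduli and padded RSA):

```python
import random

# Toy RSA key pair (illustrative small primes; real RSA uses >= 2048-bit moduli).
P, Q = 61, 53
N = P * Q                      # public modulus
PHI = (P - 1) * (Q - 1)
E = 17                         # public exponent, coprime with PHI
D = pow(E, -1, PHI)            # private exponent (Python 3.8+ modular inverse)

def keystream(seed, n):
    """Stand-in keyed stream cipher; MAJE4 itself is not reproduced here."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

def encrypt(data: bytes):
    session_key = random.randrange(2, N)          # fresh symmetric session key
    wrapped = pow(session_key, E, N)              # RSA solves the key exchange
    ct = bytes(b ^ k for b, k in zip(data, keystream(session_key, len(data))))
    return wrapped, ct

def decrypt(wrapped, ct):
    session_key = pow(wrapped, D, N)              # RSA-recover the session key
    return bytes(b ^ k for b, k in zip(ct, keystream(session_key, len(ct))))

w, c = encrypt(b"hello MARS4")
assert decrypt(w, c) == b"hello MARS4"
```

The fast symmetric cipher does the bulk encryption while RSA is used only on the short session key, which is the design trade-off the abstract describes.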

Relevance:

20.00%

Publisher:

Abstract:

A genetic algorithm has been used for null steering in phased and adaptive arrays. It has been shown that it is possible to steer the array nulls precisely to the required interference directions and to achieve any prescribed null depths. A comparison with the results obtained from the analytic solution shows the advantages of using the genetic algorithm for null steering in linear array patterns.
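A minimal sketch of GA-based null steering, under assumed parameters (an 8-element half-wavelength-spaced array, uniform amplitudes, null direction 24°); the abstract's exact array model and GA operators are not specified, so uniform crossover and Gaussian mutation of the element phases are used for illustration:

```python
import cmath, math, random

N_EL = 8                               # array elements, half-wavelength spacing (assumed)
THETA_NULL = math.radians(24)          # required interference direction (assumed)

def af(phases, theta):
    """Magnitude of the array factor for a uniform-amplitude linear array."""
    psi = math.pi * math.sin(theta)    # inter-element phase shift, d = lambda/2
    return abs(sum(cmath.exp(1j * (n * psi + phases[n])) for n in range(N_EL)))

def fitness(phases):
    # Drive a deep null at the interference angle while rewarding broadside gain.
    return af(phases, THETA_NULL) - 0.1 * af(phases, 0.0)

random.seed(1)
pop = [[random.uniform(-0.5, 0.5) for _ in range(N_EL)] for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness)              # lower fitness is better
    elite = pop[:10]
    children = []
    for _ in range(30):
        a, b = random.sample(elite, 2)
        child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]  # uniform crossover
        children.append([g + random.gauss(0, 0.05) for g in child])        # mutation
    pop = elite + children

best = min(pop, key=fitness)
# af(best, THETA_NULL) is now well below the unweighted array's level there
```

Elitism keeps the best phase vectors across generations, mirroring how the GA in the abstract converges on prescribed null depths without an analytic pattern-synthesis step.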

Relevance:

20.00%

Publisher:

Abstract:

Nanoscale silica was synthesized by a precipitation method using sodium silicate and dilute hydrochloric acid under controlled conditions. The synthesized silica was characterized by Scanning Electron Microscopy (SEM), Transmission Electron Microscopy (TEM), BET adsorption and X-Ray Diffraction (XRD). The particle size of the silica was calculated to be 13 nm from the XRD results, and the surface area was found to be 295 m2/g by the BET method.

The performance of this synthesized nanosilica as a reinforcing filler in natural rubber (NR) compound was investigated, with commercial silica used as the reference material. Nanosilica was found to be an effective reinforcing filler in natural rubber compound, and filler-matrix interaction was better for nanosilica than for the commercial silica. The synthesized nanosilica was used in place of conventional silica in the HRH (hexamethylene tetramine, resorcinol and silica) bonding system for natural rubber and styrene butadiene rubber / Nylon 6 short fiber composites, and the efficiency of the HRH bonding system based on nanosilica was better. Nanosilica was also used as reinforcing filler in rubber / Nylon 6 short fiber hybrid composites. The cure, mechanical, ageing, thermal and dynamic mechanical properties of nanosilica / Nylon 6 short fiber / elastomeric hybrid composites were studied in detail. The matrices used were natural rubber (NR), nitrile rubber (NBR), styrene butadiene rubber (SBR) and chloroprene rubber (CR). Fiber loading was varied from 0 to 30 parts per hundred rubber (phr) and silica loading was varied from 0 to 9 phr. The Hexa:Resorcinol:Silica (HRH) ratio was maintained at 2:2:1, and the HRH loading was adjusted to 16% of the fiber loading. Minimum torque, maximum torque and cure time increased with silica loading. Cure rate increased with fiber loading and decreased with silica content. The hybrid composites showed improved mechanical properties in the presence of nanosilica.

Tensile strength showed a dip at 10 phr fiber loading in the case of NR and CR, while it continuously increased with fiber loading in the case of NBR and SBR. The nanosilica improved the tensile strength, modulus and tear strength more than the conventional silica. Abrasion resistance and hardness were also better for the nanosilica composites, while resilience and compression set were adversely affected. The hybrid composites showed anisotropy in mechanical properties. Retention on ageing improved with fiber loading and was better for nanosilica-filled hybrid composites. The nanosilica also improved the thermal stability of the hybrid composites more than the commercial silica. All the composites underwent two-step thermal degradation, and kinetic studies showed that the degradation of all the elastomeric composites followed a first-order reaction. Dynamic mechanical analysis revealed that storage modulus (E′) and loss modulus (E″) increased with nanosilica content, fiber loading and frequency for all the composites, independent of the matrix. The highest rate of increase was registered for NBR rubber.
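The HRH dosing rule stated above (a 2:2:1 ratio, with total HRH equal to 16% of the fiber loading) can be turned into a quick calculation:

```python
def hrh_loading(fiber_phr):
    """Split the HRH dose (16% of fiber loading) in the stated 2:2:1 ratio."""
    total = 0.16 * fiber_phr
    unit = total / 5          # ratio parts: 2 + 2 + 1
    return {"hexa": 2 * unit, "resorcinol": 2 * unit, "silica": unit}

# e.g. at 20 phr fiber: 3.2 phr HRH total -> 1.28 / 1.28 / 0.64 phr
print(hrh_loading(20))
```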

Relevance:

20.00%

Publisher:

Abstract:

To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal cutting conditions; however, attaining optimum values each time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective methods of optimization. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes.

In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turn Master 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter; using the S/N analysis, the optimum machining parameters were obtained from the experiments. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively generate and check new design solutions within the relevant search space in order to approach the true optimum solution.

A mathematical model for surface roughness was developed using response surface analysis, and the model was validated against published results from the literature. Optimization methodologies, namely Simulated Annealing (SA), Particle Swarm Optimization (PSO), the Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA), were applied to optimize the machining parameters for dry turning of SS420 material. All the above algorithms were tested for efficiency, robustness and accuracy, and it was observed how they often outperform conventional optimization methods applied to difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB. For each evolutionary algorithmic method, optimum cutting conditions are provided to achieve better surface finish. The computational results using SA clearly demonstrate that the proposed solution procedure is quite capable of solving such complicated problems effectively and efficiently. Particle Swarm Optimization is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behaviour of biological populations; the results show that PSO provides better solutions and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm, incorporating a stochastic crossover technique and an artificial initial population scheme, was developed to provide a faster search mechanism. Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420 material, arriving at optimum machining parameters of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion.

To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating evolutionary procedures seen in nature that optimize its own systems.
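A minimal PSO sketch in the spirit of the approach above. The thesis's fitted response-surface coefficients are not reproduced here, so the quadratic Ra model and the parameter bounds below are hypothetical placeholders:

```python
import random

# Hypothetical response-surface model for surface roughness Ra;
# x = (cutting speed, feed, depth of cut). Coefficients are illustrative only.
def ra(x):
    v, f, d = x
    return 2.0 - 0.01 * v + 40.0 * f + 0.5 * d + 0.00002 * v * v + 100.0 * f * f

BOUNDS = [(50.0, 150.0), (0.05, 0.25), (0.5, 1.5)]   # assumed parameter ranges

random.seed(0)
n, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5          # swarm size and PSO constants
pos = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n)]
vel = [[0.0] * 3 for _ in range(n)]
pbest = [p[:] for p in pos]                           # personal bests
gbest = min(pos, key=ra)[:]                           # global best

for _ in range(iters):
    for i in range(n):
        for j, (lo, hi) in enumerate(BOUNDS):
            vel[i][j] = (w * vel[i][j]
                         + c1 * random.random() * (pbest[i][j] - pos[i][j])
                         + c2 * random.random() * (gbest[j] - pos[i][j]))
            pos[i][j] = min(hi, max(lo, pos[i][j] + vel[i][j]))  # clamp to bounds
        if ra(pos[i]) < ra(pbest[i]):
            pbest[i] = pos[i][:]
            if ra(pos[i]) < ra(gbest):
                gbest = pos[i][:]
# gbest now holds the cutting conditions minimizing the model's Ra
```

The same loop structure accommodates SA or a GA by swapping the update rule, which is essentially how the thesis compares the four metaheuristics on one roughness model.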

Relevance:

20.00%

Publisher:

Abstract:

Data mining is one of the most active research areas today, with a wide variety of applications in everyday life. It is all about finding interesting hidden patterns in a huge historical database. As an example, from a sales database one can use data mining to find an interesting pattern like “people who buy magazines tend to buy newspapers also”. From the sales point of view, the advantage is that these items can then be placed together in the shop to increase sales. In this research work, data mining is applied to a domain called placement chance prediction, since taking a wise career decision is crucial for anybody. In India, technical manpower analysis is carried out by an organization named the National Technical Manpower Information System (NTMIS), established in 1983-84 by India's Ministry of Education & Culture. The NTMIS comprises a lead centre in the IAMR, New Delhi, and 21 nodal centres located in different parts of the country. The Kerala State Nodal Centre is located at Cochin University of Science and Technology. The Nodal Centre collects placement information by sending postal questionnaires to graduated students on a regular basis. From this raw data available at the Nodal Centre, a historical database was prepared. Each record in this database includes entrance rank range, reservation, sector, sex and a particular engineering branch; for each such combination of attributes, the corresponding placement chance is computed and stored in the database. From these data, various popular data mining models are built and tested. These models can be used to predict the most suitable branch for a new student with one of the above combinations of criteria.

A detailed performance comparison of the various data mining models is also done. This research work proposes to use a combination of data mining models, namely a hybrid stacking ensemble, for better predictions. A strategy to predict the overall absorption rate for various branches, as well as the time it takes for all the students of a particular branch to get placed, is also proposed. Finally, this research work puts forward a new data mining algorithm, namely C4.5*stat, for numeric data sets, which has been shown to have competitive accuracy on the standard benchmark data sets known as the UCI data sets. It also proposes an optimization strategy called parameter tuning to improve the standard C4.5 algorithm. In summary, this research work passes through all four dimensions of a typical data mining research work, namely application to a domain, development of classifier models, optimization and ensemble methods.
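A toy sketch of the stacking idea mentioned above: base classifiers produce predictions, and a meta-learner is trained on those predictions rather than on the raw features. The two base rules, the miniature dataset and the majority-vote meta-learner are illustrative stand-ins, not the thesis's actual models:

```python
from collections import Counter

# Toy dataset: (feature_vector, placed?); stands in for the placement records
# (entrance rank, reservation, sector, sex, branch) described above.
data = [([1, 0], 1), ([2, 0], 1), ([8, 1], 0), ([9, 1], 0),
        ([1, 1], 1), ([2, 1], 1), ([8, 0], 0), ([9, 0], 0)]

def stump(x):            # base learner 1: threshold on the first feature
    return 1 if x[0] < 5 else 0

def weak_rule(x):        # base learner 2: a weak rule on the second feature
    return x[1]

def meta_features(x):
    return (stump(x), weak_rule(x))

# Stacking: the meta-learner is fit on the base learners' output patterns.
# Here it simply memorises the majority label for each prediction pattern.
table = {}
for x, y in data:
    table.setdefault(meta_features(x), []).append(y)
meta = {k: Counter(v).most_common(1)[0][0] for k, v in table.items()}

def stacked_predict(x):
    return meta.get(meta_features(x), stump(x))   # fall back to the stronger base

assert stacked_predict([1, 0]) == 1
```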

Relevance:

20.00%

Publisher:

Abstract:

At present there is no baseline data available on the nature of the various diseases that occur in an orchid population under cultivation in commercial orchid farms maintained by small-scale entrepreneurs, who invest a considerable amount of money, effort and time. In the absence of data on the types of disease symptoms, the causative agents (whether bacteria, fungi or other biological agents) and their sources, appropriate and effective control measures could not be devised for large-scale implementation and effective management, although arbitrary methods are practiced by a very few farms. Further, the influence of seasonal variations and environmental factors on disease outbreaks is not scientifically documented, nor has its authenticity been statistically verified. In this context, the primary objective of the present study was to create a data bank on the following aspects:

1. Occurrence of different disease symptoms in Dendrobium hybrids over a period of one year covering all seasons
2. Variations in the environmental parameters at the orchid farms
3. Variations in the characteristics of the water used for irrigation in the selected orchid farm
4. Microbial populations associated with the various disease symptoms
5. Isolation and identification of bacteria from diseased plants
6. Statistical treatment of the quantitative data and evolving a statistical model

Relevance:

20.00%

Publisher:

Abstract:

The work is intended to study the following important aspects of document image processing and to develop new methods:

(1) Segmentation of document images using an adaptive interval-valued neuro-fuzzy method
(2) Improvement of the segmentation procedure using the Simulated Annealing technique
(3) Development of optimized compression algorithms using a Genetic Algorithm and a parallel Genetic Algorithm
(4) Feature extraction from document images
(5) Development of IV (interval-valued) fuzzy rules

This work also helps with feature extraction and with foreground and background identification. The proposed work incorporates evolutionary and hybrid methods for the segmentation and compression of document images. A study of the different neural networks used in image processing and of developments in the area of fuzzy logic is also carried out in this work.
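As a stand-in illustration of how simulated annealing can refine a segmentation step (the thesis's actual method is the interval-valued neuro-fuzzy segmentation, not shown here), the sketch below anneals a binarization threshold against an Otsu-style between-class-variance objective on a toy histogram:

```python
import math, random

# Toy grayscale samples: two populated intensity clusters (text vs background).
pixels = [20, 25, 30, 22, 28, 200, 210, 205, 198, 215, 26, 207]

def between_class_variance(t):
    """Otsu-style objective: higher means a better foreground/background split."""
    fg = [p for p in pixels if p < t]
    bg = [p for p in pixels if p >= t]
    if not fg or not bg:
        return 0.0
    wf, wb = len(fg) / len(pixels), len(bg) / len(pixels)
    mf, mb = sum(fg) / len(fg), sum(bg) / len(bg)
    return wf * wb * (mf - mb) ** 2

random.seed(0)
t, temp = 128.0, 100.0
best = t
for _ in range(500):
    cand = min(255.0, max(1.0, t + random.gauss(0, 10)))   # perturb the threshold
    delta = between_class_variance(cand) - between_class_variance(t)
    if delta > 0 or random.random() < math.exp(delta / temp):
        t = cand                                            # accept (maybe uphill)
        if between_class_variance(t) > between_class_variance(best):
            best = t
    temp *= 0.99                                            # cooling schedule
# best now separates the two intensity clusters
```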

Relevance:

20.00%

Publisher:

Abstract:

The assembly job shop scheduling problem (AJSP) is one of the most complicated combinatorial optimization problems, involving the simultaneous scheduling of the processing and assembly operations of complex structured products. The problem becomes even more complicated if a combination of two or more optimization criteria is considered. This thesis addresses an assembly job shop scheduling problem with multiple objectives: the simultaneous minimization of makespan and total tardiness. Two approaches, viz. a weighted approach and a Pareto approach, are used for solving the problem. However, it is quite difficult to achieve an optimal solution with traditional optimization approaches owing to the high computational complexity. Two metaheuristic techniques, namely genetic algorithm and tabu search, are investigated in this thesis for solving multi-objective assembly job shop scheduling problems (MOAJSP). Three algorithms based on the two metaheuristic techniques, covering the weighted approach and the Pareto approach, are proposed. A new pairing mechanism is developed for the crossover operation in the genetic algorithm, which leads to improved solutions and faster convergence. The performances of the proposed algorithms are evaluated through a set of test problems and the results are reported. The results reveal that the proposed algorithms based on the weighted approach are feasible and effective for solving MOAJSP instances according to the weight assigned to each objective criterion, and that the proposed algorithms based on the Pareto approach are capable of producing a number of good Pareto-optimal scheduling plans for MOAJSP instances.
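The two solution notions used above can be sketched briefly: the weighted approach scalarizes (makespan, total tardiness), while the Pareto approach keeps all non-dominated schedules. The objective pairs below are hypothetical:

```python
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and differs in one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(schedules):
    """Keep only the non-dominated (makespan, tardiness) scheduling plans."""
    return [s for s in schedules if not any(dominates(t, s) for t in schedules)]

# Hypothetical (makespan, total tardiness) values for candidate schedules.
scheds = [(120, 30), (100, 50), (110, 35), (100, 45), (130, 20)]

front = pareto_front(scheds)                               # Pareto approach
weighted_best = min(scheds, key=lambda s: 0.5 * s[0] + 0.5 * s[1])  # weighted approach
```

With equal weights the weighted approach returns a single compromise plan, whereas the Pareto front retains every trade-off for the decision-maker, which is exactly the distinction the abstract draws.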

Relevance:

20.00%

Publisher:

Abstract:

Decimal multiplication is an integral part of financial, commercial and internet-based computations. A novel design for single-digit decimal multiplication that reduces the critical path delay and area of an iterative multiplier is proposed in this research. The partial products are generated using single-digit multipliers and are accumulated based on a novel RPS algorithm. This design uses n single-digit multipliers for an n × n multiplication. The latency for the multiplication of two n-digit Binary Coded Decimal (BCD) operands is (n + 1) cycles, and a new multiplication can begin every n cycles. The accumulation of the final partial products and the first iteration of partial product generation for the next set of inputs are done simultaneously. This iterative decimal multiplier offers low latency and high throughput, and can be extended to decimal floating-point multiplication.
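The RPS accumulation scheme itself is not reproduced here, but the digit-level structure the design parallelizes, n single-digit multiplications per cycle feeding an accumulator, can be sketched functionally:

```python
def bcd_digits(x, n):
    """Little-endian decimal digits of x, padded to n digits (a BCD operand)."""
    return [(x // 10 ** i) % 10 for i in range(n)]

def iterative_decimal_multiply(a, b, n):
    """Schoolbook n x n digit multiplication: in each of the n iterations one
    digit of b multiplies every digit of a (the n single-digit multipliers in
    hardware), and the shifted partial product joins the running accumulator."""
    A, B = bcd_digits(a, n), bcd_digits(b, n)
    acc = 0
    for cycle, bd in enumerate(B):                       # n iterations ~ n cycles
        partial = sum(ad * bd * 10 ** i for i, ad in enumerate(A))
        acc += partial * 10 ** cycle                     # accumulate shifted product
    return acc

assert iterative_decimal_multiply(1234, 5678, 4) == 1234 * 5678
```

In the hardware design the final accumulation overlaps with the first partial-product cycle of the next operand pair, which is what allows a new multiplication to start every n cycles despite the (n + 1)-cycle latency.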

Relevance:

20.00%

Publisher:

Abstract:

Decision trees are very powerful tools for classification in data mining tasks that involve different types of attributes. When handling numeric data sets, the attributes are usually first converted to categorical types and then classified using information gain concepts. Information gain is a very popular and useful concept which indicates whether any benefit, as far as information content is concerned, occurs after splitting on a given attribute. But this process is computationally intensive for large data sets. Also, popular decision tree algorithms like ID3 cannot handle numeric data sets directly. This paper proposes statistical variance as an alternative to information gain, together with the statistical mean as the split point, for completely numerical data sets. The new algorithm has been shown to be competitive with its information gain counterpart C4.5 and with many existing decision tree algorithms on the standard UCI benchmark datasets, using the ANOVA test in statistics. The specific advantages of the proposed algorithm are that it avoids the computational overhead of information gain computation for large data sets with many attributes, and that it avoids the conversion of huge numeric data sets to categorical data, which is also a time-consuming task. In summary, huge numeric datasets can be directly submitted to this algorithm without any attribute mappings or information gain computations. It also blends the two closely related fields of statistics and data mining.
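A minimal sketch of the proposed split criterion: split a numeric attribute at its statistical mean and score the split by the reduction in label variance instead of information gain (toy data; the full algorithm involves more than this single step):

```python
from statistics import mean, pvariance

def variance_reduction(values, labels, split):
    """Score a numeric split by the drop in label variance, the paper's
    alternative to information gain."""
    left = [y for x, y in zip(values, labels) if x <= split]
    right = [y for x, y in zip(values, labels) if x > split]
    if not left or not right:
        return 0.0                     # degenerate split: no benefit
    n = len(labels)
    weighted = (len(left) * pvariance(left) + len(right) * pvariance(right)) / n
    return pvariance(labels) - weighted

xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]   # a numeric attribute
ys = [0, 0, 0, 1, 1, 1]                  # numeric class labels
split_point = mean(xs)                   # split at the statistical mean
assert variance_reduction(xs, ys, split_point) > 0
```

No sorting of candidate thresholds and no entropy logarithms are needed, which is the computational saving the paper claims for large numeric data sets.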

Relevance:

20.00%

Publisher:

Abstract:

This work proposes a parallel genetic algorithm for compressing scanned document images. A fitness function is designed with the Hausdorff distance, which determines the terminating condition. The algorithm helps to locate the text lines. A greater compression ratio has been achieved with less distortion.
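The Hausdorff distance underlying the fitness function can be sketched directly; the point sets below are hypothetical pixel coordinates, not data from the paper:

```python
def hausdorff(A, B):
    """Symmetric Hausdorff distance between two 2-D point sets (Euclidean),
    the distance used in the fitness function described above."""
    def d(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    def directed(X, Y):
        # largest distance from a point of X to its nearest point in Y
        return max(min(d(x, y) for y in Y) for x in X)
    return max(directed(A, B), directed(B, A))

# Pixel coordinates of an original and a reconstructed text-line shape (hypothetical):
orig = [(0, 0), (1, 0), (2, 0)]
recon = [(0, 0), (1, 1), (2, 0)]
assert hausdorff(orig, recon) == 1.0
```

A small Hausdorff distance means the decompressed glyph shapes stay close to the originals, so it serves naturally as both a distortion measure and a stopping criterion for the GA.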

Relevance:

20.00%

Publisher:

Abstract:

Detection of objects in video is a highly demanding area of research. Background subtraction algorithms can yield good results in foreground object detection. This work presents a hybrid codebook-based background subtraction to extract the foreground ROI from the background. Codebooks are used to store compressed information, demanding less memory and allowing high-speed processing. This hybrid method, which uses block-based and pixel-based codebooks, provides efficient detection results: the high-speed processing capability of block-based background subtraction as well as the high precision rate of pixel-based background subtraction are exploited to yield an efficient background subtraction system. The block stage produces a coarse foreground area, which is then refined by the pixel stage. The system's performance is evaluated with different block sizes and with different block descriptors such as the 2D-DCT and FFT. The experimental analysis based on statistical measurements yields precision, recall, similarity and F-measure of the hybrid system as 88.74%, 91.09%, 81.66% and 89.90% respectively, and thus demonstrates the efficiency of the novel system.
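A minimal pixel-stage sketch of the codebook idea described above, reduced to grayscale brightness ranges; the paper's actual codewords, block stage and matching criteria are richer than this illustration:

```python
# Minimal pixel-wise codebook background model (grayscale, illustrative only).
EPS = 10          # brightness tolerance per codeword (assumed value)

def train_codebook(history):
    """Build a codebook (list of [lo, hi] brightness ranges) for one pixel."""
    book = []
    for v in history:
        for cw in book:
            if cw[0] - EPS <= v <= cw[1] + EPS:
                cw[0], cw[1] = min(cw[0], v), max(cw[1], v)   # widen the codeword
                break
        else:
            book.append([v, v])                               # new codeword
    return book

def is_foreground(v, book):
    """A pixel is foreground if it matches no background codeword."""
    return not any(cw[0] - EPS <= v <= cw[1] + EPS for cw in book)

book = train_codebook([100, 102, 98, 101, 99])   # background training samples
assert not is_foreground(103, book)               # matches a background codeword
assert is_foreground(200, book)                   # a foreground pixel
```

Storing a few compact ranges per pixel instead of full sample histories is what gives codebook methods their low memory footprint and fast matching.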

Relevance:

20.00%

Publisher:

Abstract:

Magnetic heterostructures with carbon nanotubes having multiple functionalities are fascinating materials which can be manipulated by means of an external magnetic field. In this paper we report our investigations on the synthesis and optical limiting properties of pristine cobalt nanotubes and high-coercivity cobalt-in-carbon nanotubes (a new nanosystem in which carbon nanotubes are filled with cobalt nanotubes). A general mobility-assisted growth mechanism for the formation of one-dimensional nanostructures inside nanopores is verified in the case of carbon nanotubes. The open-aperture z-scan technique is employed for the optical limiting measurements, in which nanosecond laser pulses at 532 nm are used for optical excitation. Compared to the benchmark pristine carbon nanotubes, these materials show enhanced nonlinear optical absorption, and the nonlinear optical parameters calculated from the data show that these materials are efficient optical limiters. To the best of our knowledge this is the first report in which the optical limiting properties of metal nanotubes are compared to those of carbon nanotubes.
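For reference, the standard open-aperture z-scan transmittance expression for two-photon-type absorption can be evaluated numerically; the on-axis strength q00 and Rayleigh range z0 below are assumed values, not the fitted parameters of the materials reported above:

```python
def transmittance(z, q00=0.5, z0=1.0, terms=30):
    """Open-aperture z-scan normalized transmittance for two-photon-type
    absorption (standard series form, valid for |q| < 1); q00 and z0 are
    assumed illustrative values."""
    q = q00 / (1.0 + (z / z0) ** 2)          # on-axis absorbance parameter at z
    return sum((-q) ** m / (m + 1) ** 1.5 for m in range(terms))

# The transmittance dips as the sample approaches the focus (z = 0),
# which is the signature of an optical limiter in an open-aperture scan.
dip = transmittance(0.0)
wings = transmittance(5.0)
```

Fitting this curve to the measured z-scan trace is how the nonlinear absorption coefficient is typically extracted from such data.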

Relevance:

20.00%

Publisher:

Abstract:

The development of methods to economically synthesize single-wire structured multiferroic systems with room-temperature spin-charge coupling is expected to be important for building next-generation multifunctional devices with ultralow power consumption. We demonstrate the fabrication of a single-nanowire multiferroic system, a new geometry, exhibiting room-temperature magnetodielectric coupling. A coaxial nanotube/nanowire heterostructure of barium titanate (BaTiO3, BTO) and cobalt (Co) has been synthesized using a template-assisted method. Room-temperature ferromagnetism and ferroelectricity were exhibited by this coaxial system, indicating the coexistence of more than one ferroic interaction in this composite system.