41 results for Density-based Scanning Algorithm


Relevance: 30.00%

Abstract:

The dynamic mechanical properties, such as storage modulus, loss modulus and damping, of blends of a nylon copolymer (PA6,66) with ethylene propylene diene (EPDM) rubber were investigated, with special reference to the effect of blend ratio and compatibilisation, over the temperature range –100°C to 150°C at different frequencies. The effect of changes in blend composition on tan δ was studied to assess the extent of polymer miscibility and the damping characteristics. The loss tangent curves of the blends exhibited two transition peaks, corresponding to the glass transition temperatures (Tg) of the individual components, indicating incompatibility of the blend systems. The morphology of the blends was examined using scanning electron microscopy. The Arrhenius relationship was used to calculate the activation energy for the glass transition of the blends. Finally, the experimental data were compared with theoretical models.
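The Arrhenius treatment mentioned above amounts to fitting ln f against 1/Tg, where the slope of the line is −Ea/R. A minimal sketch in Python; the frequencies and Tg values shown are hypothetical illustrations, not the measured data:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def activation_energy(freqs_hz, tg_kelvin):
    """Least-squares slope of ln f vs 1/Tg; returns Ea = -slope * R (J/mol).

    Arrhenius form: ln f = ln f0 - Ea / (R * Tg).
    """
    x = [1.0 / t for t in tg_kelvin]
    y = [math.log(f) for f in freqs_hz]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    return -slope * R

# hypothetical Tg values (K) observed at four test frequencies (Hz)
freqs = [0.1, 1.0, 10.0, 100.0]
tgs = [318.0, 323.0, 328.5, 334.0]
ea_kj_per_mol = activation_energy(freqs, tgs) / 1000.0
```

Higher test frequency shifts Tg upward, so the fitted slope is negative and Ea comes out positive.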

Relevance: 30.00%

Abstract:

Analog-to-digital converters (ADCs) have an important impact on the overall performance of signal processing systems. This research explores efficient techniques for the design of sigma-delta ADCs, especially for multi-standard wireless transceivers. In particular, the aim is to develop novel models and algorithms to address this problem, and to implement software tools able to assist the designer's decisions in the system-level exploration phase. To this end, this thesis presents a framework of techniques for designing sigma-delta analog-to-digital converters. A 2-2-2 reconfigurable sigma-delta modulator is proposed which can meet the design specifications of three wireless communication standards, namely GSM, WCDMA and WLAN. A sigma-delta modulator design tool is developed using the Graphical User Interface Development Environment (GUIDE) in MATLAB. A genetic algorithm (GA) based search method is introduced to find the optimum values of the scaling coefficients and to maximize the dynamic range of the sigma-delta modulator.
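The GA-based coefficient search can be sketched as a minimal real-coded genetic algorithm with tournament selection, uniform crossover and Gaussian mutation. The thesis's true objective is the simulated dynamic range of the modulator; here a placeholder quadratic objective stands in for it, and the population size, bounds and rates are all assumptions:

```python
import random

def ga_optimize(fitness, n_coeffs, pop_size=40, generations=60,
                bounds=(0.1, 1.0), mutation_rate=0.2, seed=1):
    """Minimal real-coded GA: elitism, 3-way tournament selection,
    uniform crossover, clamped Gaussian mutation. Maximizes `fitness`."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(n_coeffs)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        new_pop = scored[:2]  # keep the two best unchanged (elitism)
        while len(new_pop) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)  # tournament
            p2 = max(rng.sample(pop, 3), key=fitness)
            # uniform crossover: each coefficient from either parent
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            if rng.random() < mutation_rate:
                i = rng.randrange(n_coeffs)
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1)))
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# placeholder objective with its peak at all coefficients = 0.5,
# standing in for the simulated dynamic range of the modulator
best = ga_optimize(lambda c: -sum((x - 0.5) ** 2 for x in c), n_coeffs=6)
```

In the real tool, `fitness` would invoke a behavioural simulation of the 2-2-2 modulator and return its dynamic range.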

Relevance: 30.00%

Abstract:

The aim of the investigation is to develop new high performance adhesive systems based on neoprene-phenolic blends. Initially, the effects of adding ingredients such as fillers, adhesion promoters and curing agents to the neoprene solution, and their optimum compositions, are investigated. The phenolic resin used is a phenol-cardanol-formaldehyde copolymer prepared in the laboratory. The optimum ratio between phenol and cardanol that gives the maximum bond strength in metal-metal, rubber-rubber and rubber-metal specimens has been identified. Further, the ratio between total phenols and formaldehyde is also optimised. The adhesive system is then modified by the addition of epoxidized phenolic novolacs; for this purpose, phenolic novolac resins are prepared in different stoichiometric ratios and subsequently epoxidized. The effectiveness of the adhesive for bonding different metal and rubber substrates is another part of the study. To study the ageing behaviour, different bonded specimens are exposed to high temperature, hot water and salt water, and the adhesive properties are evaluated. The synthesized resins have been characterized by FTIR and ¹H NMR spectroscopy, and their molecular weights obtained by GPC. Thermogravimetric analysis and differential scanning calorimetry are used to study the thermal properties, and fractured surfaces are examined by scanning electron microscopy. The study has brought to light, among other things, the influence of the phenol/formaldehyde stoichiometric ratio, the addition of cardanol (a renewable resource), adhesion promoters, the suitability of the adhesive for different substrates and the age resistance of the adhesive joints.

Relevance: 30.00%

Abstract:

Timely detection of sudden changes in dynamics that adversely affect the performance of systems and the quality of products has great scientific relevance. This work focuses on effective detection of dynamical changes in real-time signals from mechanical as well as biological systems using the fast and robust technique of permutation entropy (PE). The results are used to detect chatter onset in machine turning and to identify vocal disorders from speech signals. Permutation entropy is a nonlinear complexity measure which can efficiently distinguish regular and complex behaviour of a signal and extract information about changes in the dynamics of a process through sudden changes in its value. Here we propose the use of PE to detect dynamical changes in two nonlinear processes: turning, a mechanical system, and speech, a biological system. The effectiveness of PE in detecting changes in the dynamics of the turning process is studied from time series generated from samples of audio and current signals. Experiments are carried out on a lathe for both a sudden increase and a continuous increase in depth of cut on mild steel workpieces, keeping the speed and feed rate constant. The results are applied to detect chatter onset in machining and are verified using frequency spectra of the signals and the nonlinear measure normalized coarse-grained information rate (NCIR). PE analysis is also carried out to investigate the variation in surface texture caused by chatter on the machined workpiece. A statistical parameter from the optical grey-level intensity histogram of the laser speckle pattern, recorded using a charge coupled device (CCD) camera, is used to generate the time series required for this PE analysis, and a standard optical roughness parameter is used to confirm the results. The application of PE to identifying vocal disorders is studied from speech signals recorded using a microphone. 
Here the analysis is carried out using speech signals of subjects with different pathological conditions and of normal subjects, and the results are used to identify vocal disorders; the standard linear technique of FFT is used to substantiate the results. The results of the PE analysis in all three cases clearly indicate that this complexity measure is sensitive to changes in the regularity of a signal and hence can suitably be used for the detection of dynamical changes in real-world systems. This work establishes the application of the simple, inexpensive and fast PE algorithm for the benefit of advanced manufacturing processes as well as clinical diagnosis of vocal disorders.
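The PE measure itself follows directly from its Bandt-Pompe definition: count the ordinal patterns of short windows of the signal and take the normalized Shannon entropy of their distribution. A minimal sketch (not the authors' code; order and delay defaults are common choices, not taken from the thesis):

```python
import math

def permutation_entropy(signal, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D signal.

    Counts the relative frequency of ordinal patterns of length `order`
    and returns the Shannon entropy of that distribution, normalized to
    [0, 1] by log(order!). Low values indicate regular dynamics; values
    near 1 indicate noise-like complexity.
    """
    counts = {}
    for i in range(len(signal) - (order - 1) * delay):
        window = tuple(signal[i + j * delay] for j in range(order))
        # ordinal pattern: positions of the samples sorted by value
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))
```

A sudden rise in PE along a sliding window over the audio or current signal is what flags the chatter onset.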

Relevance: 30.00%

Abstract:

Various compositions of linear low density polyethylene (LLDPE) containing a bio-filler (either starch or dextrin) of various particle sizes were prepared. Mechanical, thermal, FTIR, morphological (SEM), water absorption and melt flow (MFI) studies were carried out. Biodegradability of the compositions was determined using shake culture flasks containing amylase-producing bacteria (vibrios), which were isolated from a marine benthic environment, and by a soil burial test. The effect of low quantities of metal oxides and metal stearates as pro-oxidants in LLDPE and in the LLDPE-biofiller compositions was established by exposing the samples to ultraviolet light. The combination of a bio-filler and a pro-oxidant improves the degradation of linear low density polyethylene. Maleation of LLDPE improves the compatibility of the blend components, and the pro-oxidants enhance the photodegradability of the compatibilised blends. Reprocessability studies on the partially biodegradable LLDPE containing bio-fillers and pro-oxidants suggest that the blends can be repeatedly reprocessed without deterioration in mechanical properties.

Relevance: 30.00%

Abstract:

Computational biology is the research area that contributes to the analysis of biological data through the development of algorithms which address significant research problems. The data from molecular biology include DNA, RNA, protein and gene expression data. Gene expression data provide the expression levels of genes under different conditions. Gene expression is the process of transcribing the DNA sequence of a gene into mRNA sequences, which in turn are later translated into proteins; the number of copies of mRNA produced is called the expression level of the gene. Gene expression data are organized in the form of a matrix: rows represent genes, columns represent experimental conditions (different tissue types or time points), and entries are real values. Through the analysis of gene expression data it is possible to determine behavioural patterns of genes, such as the similarity of their behaviour, the nature of their interactions, and their respective contributions to the same pathways. Similar expression patterns are exhibited by genes participating in the same biological process. These patterns have immense relevance and application in bioinformatics and clinical research; in the medical domain they aid more accurate diagnosis, prognosis, treatment planning, drug discovery and protein network analysis. To identify such patterns from gene expression data, data mining techniques are essential. Clustering is an important data mining technique for the analysis of gene expression data. To overcome the problems associated with clustering, biclustering is introduced: the simultaneous clustering of both rows and columns of a data matrix. 
Clustering is a global model whereas biclustering is a local model. Discovering local expression patterns is essential for identifying many genetic pathways that are not apparent otherwise, so it is necessary to move beyond the clustering paradigm towards approaches capable of discovering local patterns in gene expression data. A bicluster is a submatrix of the gene expression data matrix; its rows and columns need not be contiguous in the original matrix, and biclusters are not disjoint. Computation of biclusters is costly because all combinations of rows and columns must be considered in order to find all the biclusters: the search space is 2^(m+n), where m and n are the numbers of genes and conditions respectively, and usually m + n exceeds 3000. The biclustering problem is NP-hard. Biclustering is nevertheless a powerful analytical tool for the biologist, and the research reported in this thesis addresses this problem. Ten algorithms are developed for the identification of coherent biclusters from gene expression data. All these algorithms use a measure called the mean squared residue to search for biclusters; the objective is to identify biclusters of maximum size whose mean squared residue is below a given threshold. 
All these algorithms begin the search from tightly coregulated submatrices called seeds, which are generated by the K-means clustering algorithm. The algorithms developed can be classified as constraint based, greedy and metaheuristic. The constraint based algorithms use one or more constraints, namely the MSR threshold and the MSR difference threshold. The greedy approach makes a locally optimal choice at each stage with the objective of finding the global optimum. In the metaheuristic approaches, Particle Swarm Optimization (PSO) and variants of the Greedy Randomized Adaptive Search Procedure (GRASP) are used for the identification of biclusters. These algorithms are applied to the Yeast and Lymphoma datasets. Biologically relevant and statistically significant biclusters are identified by all the algorithms and validated against the Gene Ontology database. All the algorithms are compared with other biclustering algorithms and overcome some of the problems associated with existing ones. With the help of some of the algorithms developed in this work, biclusters with very high row variance, higher than that of any other algorithm using the mean squared residue, are identified from both the Yeast and Lymphoma datasets. Such biclusters, which reflect significant changes in expression level, are highly relevant biologically.
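The mean squared residue that all ten algorithms minimize can be sketched directly from the Cheng-Church definition: the residue of entry (i, j) of a bicluster is the entry minus its row mean, minus its column mean, plus the bicluster mean. A minimal sketch, not the thesis's implementation:

```python
def mean_squared_residue(matrix, rows, cols):
    """Mean squared residue (Cheng & Church) of the bicluster (rows, cols).

    residue(i, j) = a_ij - a_iJ - a_Ij + a_IJ, where a_iJ and a_Ij are
    the row and column means within the bicluster and a_IJ is its
    overall mean. MSR is 0 for a perfectly additive-coherent bicluster.
    """
    sub = [[matrix[i][j] for j in cols] for i in rows]
    nr, nc = len(rows), len(cols)
    row_mean = [sum(r) / nc for r in sub]
    col_mean = [sum(sub[i][j] for i in range(nr)) / nr for j in range(nc)]
    total_mean = sum(row_mean) / nr
    return sum((sub[i][j] - row_mean[i] - col_mean[j] + total_mean) ** 2
               for i in range(nr) for j in range(nc)) / (nr * nc)
```

A submatrix in which every row is the same profile shifted by a constant has MSR exactly zero, which is why the search accepts submatrices whose MSR falls below the threshold.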

Relevance: 30.00%

Abstract:

This work aims to study the variation in subduction zone geometry along and across the arc and the fault pattern within the subducting plate. The depth of penetration as well as the dip of the Benioff zone varies considerably along the arc, corresponding to the curvature of the fold-thrust belt, which varies from concave to convex in different sectors of the arc. The entire arc is divided into 27 segments, and the depth sections thus prepared are utilized to investigate the average dip of the Benioff zone in different parts of the arc, the penetration depth of the subducting lithosphere, the subduction zone geometry underlying the trench, the arc-trench gap, etc. The study also describes how different seismogenic sources are identified in the region, the estimation of moment release rates and the deformation pattern. Based on previous studies and the seismicity pattern, the region is divided into several broad, distinct seismogenic belts/sources: 1) the outer arc region consisting of the Andaman-Nicobar islands; 2) the back-arc Andaman Sea; 3) the Sumatran fault zone (SFZ); 4) the Java onshore region, termed the Java fault zone (JFZ); 5) the Sumatran fore-arc sliver plate containing the Mentawai fault (MFZ); 6) the offshore Java fore-arc region; and 7) the Sunda Strait region. As the seismicity is variable, it is difficult to demarcate individual seismogenic sources. Hence, we employed a moving-window method with a window length of 3-4° and 50% overlap, proceeding from one end of the arc to the other. We defined 4 sources each in the Andaman fore-arc and back-arc regions, 9 such sources (moving windows) in the Sumatran fault zone (SFZ), 9 sources in the offshore SFZ region and 7 sources in the offshore Java region. Because of the low seismicity along the JFZ, it is separated into three seismogenic sources, namely West Java, Central Java and East Java. 
The Sunda Strait is considered a single seismogenic source. The deformation rates for each of the seismogenic zones have been computed. A detailed error analysis of the velocity tensors using a Monte Carlo simulation method has been carried out in order to obtain uncertainties. The eigenvalues and the respective eigenvectors of the velocity tensor are computed to analyze the actual deformation pattern for the different zones. The results obtained have been discussed in the light of regional tectonics, and their implications in terms of geodynamics have been enumerated. In the light of the recent major earthquakes (the 26 December 2004 and 28 March 2005 events) and the ongoing seismic activity, we have recalculated the variation in crustal deformation rates before and after these earthquakes in the Andaman-Sumatra region, including data up to 2005, and the significant results have been presented. The downgoing lithosphere along the subduction zone is then modeled using free-air gravity data, taking into consideration the thickness of the crustal layer, the thickness of the subducting slab, the sediment thickness, the presence of volcanism, the proximity of the continental crust, etc. A systematic and detailed gravity interpretation, constrained by seismicity and seismic data in the Andaman arc and Andaman Sea region, is presented in order to delineate the crustal structure and density heterogeneities along and across the arc and their correlation with seismogenic behaviour.
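The Monte Carlo error analysis of the velocity tensors can be sketched for the simplest case of a symmetric 2×2 tensor: perturb each component with Gaussian noise, recompute the eigen-decomposition, and take the scatter of the eigenvalues as their uncertainty. The tensor values and error level below are hypothetical illustrations, not the study's data:

```python
import math
import random

def eigen2x2_sym(t):
    """Closed-form eigenvalues and principal-axis azimuth of a symmetric
    2x2 tensor given as (a, b, c) for [[a, b], [b, c]]."""
    a, b, c = t
    mean = 0.5 * (a + c)
    r = math.hypot(0.5 * (a - c), b)
    return mean + r, mean - r, 0.5 * math.atan2(2 * b, a - c)

def monte_carlo_eigen_std(t, sigma, n=5000, seed=0):
    """Propagate independent Gaussian errors (per-component std `sigma`)
    through the eigen-decomposition; returns the std of each eigenvalue."""
    rng = random.Random(seed)
    samples = [eigen2x2_sym([x + rng.gauss(0, s) for x, s in zip(t, sigma)])
               for _ in range(n)]
    stds = []
    for k in (0, 1):
        vals = [s[k] for s in samples]
        m = sum(vals) / n
        stds.append(math.sqrt(sum((v - m) ** 2 for v in vals) / n))
    return stds

# hypothetical deformation-rate tensor with 0.1 uncertainty per component
s1, s2 = monte_carlo_eigen_std((2.0, 0.5, -1.0), (0.1, 0.1, 0.1))
```

For well-separated eigenvalues the propagated uncertainty stays close to the per-component error, which is what the simulation confirms.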

Relevance: 30.00%

Abstract:

Demand for magnesium and its alloys has increased significantly in the automotive industry because of their great potential for reducing the weight of components, resulting in improved fuel efficiency. To date, most Mg products have been fabricated by casting, especially die-casting, because of its high productivity, suitable strength, and acceptable quality and dimensional accuracy; components produced through sand, gravity and low pressure die casting account for only a small share. In fact, a high solidification rate is possible only in high pressure die casting, which results in a finer grain size. Achieving a high cooling rate in gravity casting using sand and permanent moulds is difficult, which leads to a coarser grain structure and poor mechanical properties, an important aspect of performance in industrial applications. Grain refinement is technologically attractive because, contrary to most other strengthening methods, it generally does not adversely affect ductility and toughness. The formation of a fine grain structure in these castings is therefore crucial to improving the mechanical properties of the cast components. Hence the present investigation, “GRAIN REFINEMENT STUDIES ON Mg AND Mg-Al BASED ALLOYS”. Its primary objective is to study the effect of various grain refining inoculants (Al-4B and Al-5TiB2 master alloys, Al4C3, and charcoal particles) on pure Mg and on Mg-Al alloys such as AZ31 and AZ91, and to study their grain refining mechanisms. The second objective is to study the effect of the superheating process on the grain size of AZ31 and AZ91 Mg alloys with and without inoculant additions, and, in addition, the effect of grain refinement on the mechanical properties of Mg and Mg-Al alloys. The thesis is organized into seven chapters, and the studies are detailed below.

Relevance: 30.00%

Abstract:

LLDPE was blended with poly(vinyl alcohol) (PVA), and the mechanical, thermal and spectroscopic properties and the biodegradability of the blends were investigated. The biodegradability of the LLDPE/PVA blends was studied in two environments, viz. (1) a culture medium containing Vibrio sp. and (2) a soil environment, over a period of 15 weeks. Nanoanatase with photocatalytic activity was synthesized by a hydrothermal method using titanium isopropoxide. The synthesized TiO2 was characterized by X-ray diffraction (XRD), BET studies, FTIR and scanning electron microscopy (SEM). The crystallite size of the titania was calculated to be ≈ 6 nm from the XRD results, and the surface area was found to be about 310 m²/g by the BET method. SEM shows that the nanoanatase particles prepared by this method are spherical. Linear low density polyethylene films containing poly(vinyl alcohol) and a pro-oxidant (TiO2 or cobalt stearate, with or without vegetable oil) were prepared. The films were subjected to natural weathering and UV exposure, followed by biodegradation in the culture medium as well as in the soil environment. Degradation was monitored by mechanical property measurements, thermal studies, rate of weight loss, FTIR and SEM. Higher weight loss, texture change and greater increases in carbonyl index were observed in samples containing cobalt stearate and vegetable oil. The present study demonstrates that combining LLDPE/PVA blends with (i) nanoanatase/vegetable oil or (ii) cobalt stearate/vegetable oil leads to extensive photodegradation, and these samples degrade substantially on subsequent exposure to Vibrio sp. Thus a combined photodegradation and biodegradation process is a promising step towards a biodegradable grade of LLDPE.

Relevance: 30.00%

Abstract:

Biodegradation is the chemical degradation of materials brought about by the action of naturally occurring microorganisms. Biodegradation is a relatively rapid process under suitable conditions of moisture, temperature and oxygen availability. The logic behind blending biopolymers such as starch with inert polymers like polyethylene is that if the biopolymer component is present in sufficient amount, and if it is removed by microorganisms in the waste disposal environment, then the base inert plastic should slowly degrade and disappear. The present work focuses on the preparation of biodegradable and photodegradable blends based on low density polyethylene incorporating small quantities of ionomers as compatibilizers. The thesis consists of eight chapters. The first chapter presents an introduction to the present research work and literature survey. The details of the materials used and the experimental procedures undertaken for the study are described in the second chapter. Preparation and characterization of low density polyethylene (LDPE)-biopolymer (starch/dextrin) blends are described in the third chapter. The result of investigations on the effect of polyethylene-co-methacrylic acid ionomers on the compatibility of LDPE and starch are reported in chapter 4. Chapter 5 has been divided into two parts. The first part deals with the effect of metal oxides on the photodegradation of LDPE. The second part describes the function of metal stearates on the photodegradation of LDPE. The results of the investigations on the role of various metal oxides as pro-oxidants on the degradation of ionomer compatibilized LDPE-starch blends are reported in chapter 6. Chapter 7 deals with the results of investigations on the role of various metal stearates as pro-oxidants on the degradation of ionomer compatibilized LDPE-starch blends. The conclusion of the investigations is presented in the last chapter of the thesis.

Relevance: 30.00%

Abstract:

Biometrics deals with the physiological and behavioural characteristics of an individual in order to establish identity. Fingerprint based authentication is the most advanced biometric authentication technology. The minutiae based fingerprint identification method offers a reasonable identification rate; however, the minutiae map consists of about 70-100 minutia points, and matching accuracy drops as the database grows. Hence it is desirable to make the fingerprint feature code as small as possible so that identification becomes easier. In this research, a novel global singularity based fingerprint representation is proposed. The fingerprint baseline, the line between the distal and intermediate phalangeal joint lines in the fingerprint, is taken as the reference line. A polygon is formed from the singularities and the fingerprint baseline, and the feature vector comprises the polygon's angles, sides, area and type, together with the ridge counts between the singularities. A 100% recognition rate is achieved with this method. The method is compared with the conventional minutiae based recognition method in terms of computation time, receiver operating characteristics (ROC) and feature vector length. Speech is a behavioural biometric modality and can be used to identify a speaker. In this work, MFCCs of text-dependent speech are computed and clustered using the k-means algorithm. A backpropagation based artificial neural network is trained to identify the clustered speech code, and the performance of the neural network classifier is compared with a VQ based Euclidean minimum-distance classifier. Biometric systems that use a single modality are usually affected by problems such as noisy sensor data, non-universality and/or lack of distinctiveness of the biometric trait, unacceptable error rates, and spoof attacks. Multi-finger, feature-level fusion based fingerprint recognition is therefore developed, and its performance is measured in terms of the ROC curve. 
Score-level fusion of the fingerprint and speech based recognition systems is also carried out, and 100% accuracy is achieved over a considerable range of matching thresholds.
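Score-level fusion of the two matchers can be sketched as min-max normalization of each raw score followed by a weighted sum and a threshold decision. The score ranges, weight and threshold below are illustrative assumptions, not the thesis's tuned values:

```python
def min_max_normalize(score, lo, hi):
    """Map a raw matcher score from [lo, hi] into [0, 1]."""
    return (score - lo) / (hi - lo)

def fused_score(fp_score, sp_score, w=0.6):
    """Weighted-sum score-level fusion of fingerprint and speech scores
    (both already normalized to [0, 1])."""
    return w * fp_score + (1 - w) * sp_score

def decide(fp_raw, sp_raw, fp_range=(0, 100), sp_range=(0, 10),
           w=0.6, threshold=0.5):
    """Accept the claimed identity when the fused score clears the
    matching threshold."""
    fp = min_max_normalize(fp_raw, *fp_range)
    sp = min_max_normalize(sp_raw, *sp_range)
    return fused_score(fp, sp, w) >= threshold
```

Sweeping `threshold` and recording the accept/reject outcomes over genuine and impostor score pairs is what produces the ROC curve used for the comparison.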

Relevance: 30.00%

Abstract:

The assembly job shop scheduling problem (AJSP) is one of the most complicated combinatorial optimization problems, involving the simultaneous scheduling of the processing and assembly operations of complex structured products. The problem becomes even more complicated if a combination of two or more optimization criteria is considered. This thesis addresses an assembly job shop scheduling problem with multiple objectives: the simultaneous minimization of makespan and total tardiness. Two approaches, a weighted approach and a Pareto approach, are used for solving the problem. However, it is quite difficult to achieve an optimal solution with traditional optimization approaches owing to the high computational complexity, so two metaheuristic techniques, genetic algorithm and tabu search, are investigated for solving the multi-objective assembly job shop scheduling problem (MOAJSP). Three algorithms based on these two metaheuristics are proposed for the weighted and Pareto approaches. A new pairing mechanism is developed for the crossover operation in the genetic algorithm, which leads to improved solutions and faster convergence. The performance of the proposed algorithms is evaluated through a set of test problems and the results are reported. The results reveal that the proposed weighted-approach algorithms are feasible and effective for solving MOAJSP instances according to the weight assigned to each objective criterion, and that the proposed Pareto-approach algorithms are capable of producing a number of good Pareto-optimal scheduling plans for MOAJSP instances.
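The weighted approach scalarizes the two criteria into a single objective. A minimal sketch of its evaluation on a hypothetical three-job schedule (job names, completion times, due dates and the weight w are all illustrative assumptions):

```python
def makespan(completion_times):
    """Completion time of the last job to finish."""
    return max(completion_times.values())

def total_tardiness(completion_times, due_dates):
    """Sum of positive lateness over all jobs."""
    return sum(max(0, completion_times[j] - due_dates[j])
               for j in completion_times)

def weighted_objective(completion_times, due_dates, w=0.5):
    """Scalarized objective of the weighted approach:
    w * makespan + (1 - w) * total tardiness."""
    return (w * makespan(completion_times)
            + (1 - w) * total_tardiness(completion_times, due_dates))

# hypothetical schedule: job -> completion time, job -> due date
c = {"J1": 7, "J2": 12, "J3": 9}
d = {"J1": 8, "J2": 10, "J3": 9}
obj = weighted_objective(c, d, w=0.5)  # 0.5 * 12 + 0.5 * (0 + 2 + 0) = 7.0
```

In the Pareto approach, by contrast, the pair (makespan, total tardiness) is kept as a vector and schedules are compared by dominance rather than by this scalar.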

Relevance: 30.00%

Abstract:

This thesis summarizes the results of studies on a syntax-based approach for translation between Malayalam, one of the Dravidian languages, and English, and on the development of the major modules of a prototype machine translation system from Malayalam to English. The development of the system is a pioneering effort for the Malayalam language, unattempted by previous researchers, and the computational models chosen for the system are the first of their kind for Malayalam. An in-depth study has been carried out in the design of the computational models and data structures needed for the different modules: a morphological analyzer, a parser, a syntactic structure transfer module and a target language sentence generator required for the prototype system. The generation of the list of part-of-speech tags, chunk tags and the hierarchical dependencies among chunks required for the translation process has also been done. In the development process, the major goals are (a) accuracy of translation, (b) speed and (c) space. Accuracy-wise, smart tools for handling the transfer grammar and translation standards, including equivalent words, expressions, phrases and styles in the target language, are to be developed, and the grammar should be optimized with a view to obtaining a single correct parse and hence a single translated output. Speed-wise, innovative use of corpus analysis, an efficient parsing algorithm, the design of efficient data structures and run-time frequency-based rearrangement of the grammar, which substantially reduce parsing and generation time, are required. The space requirement also has to be minimised.

Relevance: 30.00%

Abstract:

Clustering schemes improve the energy efficiency of wireless sensor networks. The inclusion of mobility as a new criterion for cluster creation and maintenance adds new challenges to these clustering schemes. In most algorithms, cluster formation and cluster head selection are done on a stochastic basis. In this paper we introduce a cluster formation and routing algorithm based on a mobility factor. The proposed algorithm is compared with the LEACH-M protocol on metrics such as the number of cluster head transitions, average residual energy, number of alive nodes and number of messages lost.
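One plausible form of a mobility factor is a node's mean displacement over its recent position history, combined with residual energy when electing cluster heads. The scoring rule and node data below are illustrative assumptions, not necessarily the paper's exact formulation:

```python
import math

def mobility_factor(history):
    """Mean displacement per time step over a node's recent positions
    [(x, y), ...]; lower means the node is more stable."""
    if len(history) < 2:
        return 0.0
    steps = [math.dist(a, b) for a, b in zip(history, history[1:])]
    return sum(steps) / len(steps)

def elect_cluster_head(nodes):
    """Pick the node maximizing residual_energy / (1 + mobility):
    favours energetic, slow-moving nodes, instead of the purely
    stochastic rotation used by LEACH-style protocols."""
    def score(node):
        return node["energy"] / (1.0 + mobility_factor(node["history"]))
    return max(nodes, key=score)

nodes = [
    {"id": "A", "energy": 0.9, "history": [(0, 0), (5, 5), (10, 10)]},
    {"id": "B", "energy": 0.8, "history": [(3, 3), (3, 4), (4, 4)]},
    {"id": "C", "energy": 0.3, "history": [(1, 1), (1, 1), (1, 1)]},
]
head = elect_cluster_head(nodes)  # "B": good energy and nearly static
```

Penalizing fast movers this way is what reduces cluster head transitions, one of the comparison metrics above.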

Relevance: 30.00%

Abstract:

Decimal multiplication is an integral part of financial, commercial, and internet-based computations. A novel design for single-digit decimal multiplication that reduces the critical path delay and area of an iterative multiplier is proposed in this research. The partial products are generated using single-digit multipliers and are accumulated based on a novel RPS algorithm. The design uses n single-digit multipliers for an n × n digit multiplication. The latency for the multiplication of two n-digit Binary Coded Decimal (BCD) operands is (n + 1) cycles, and a new multiplication can begin every n cycles. The accumulation of the final partial products and the first iteration of partial product generation for the next set of inputs are done simultaneously. This iterative decimal multiplier offers low latency and high throughput, and can be extended to decimal floating-point multiplication.
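The iterative digit-by-digit scheme (one single-digit product per digit pair, partial products accumulated with decimal carries) can be sketched in software. This models only the schoolbook accumulation, not the hardware RPS accumulation scheme itself:

```python
def digits_lsd(n, width):
    """Decimal digits of n, least significant first, padded to `width`."""
    return [(n // 10 ** i) % 10 for i in range(width)]

def decimal_multiply(a, b, n):
    """Schoolbook n-digit decimal multiplication: one iteration per digit
    of `a`, each producing a row of single-digit products that is
    accumulated into `acc` with decimal carry propagation."""
    da, db = digits_lsd(a, n), digits_lsd(b, n)
    acc = [0] * (2 * n)
    for i, x in enumerate(da):          # one iteration per digit of a
        carry = 0
        for j, y in enumerate(db):      # single-digit decimal products
            s = acc[i + j] + x * y + carry
            acc[i + j] = s % 10
            carry = s // 10
        acc[i + n] += carry
    # final decimal carry propagation over the accumulator
    carry = 0
    for k in range(2 * n):
        s = acc[k] + carry
        acc[k] = s % 10
        carry = s // 10
    return sum(d * 10 ** k for k, d in enumerate(acc))
```

Each outer iteration corresponds to one cycle of the hardware's n single-digit multipliers; overlapping the final accumulation with the first iteration of the next operand pair is what lets a new multiplication start every n cycles.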