16 results for System-based
at Cochin University of Science
Abstract:
This work aims to study the variation in subduction zone geometry along and across the arc and the fault pattern within the subducting plate. The depth of penetration as well as the dip of the Benioff zone varies considerably along the arc, corresponding to the curvature of the fold-thrust belt, which changes from concave to convex in different sectors of the arc. The entire arc is divided into 27 segments, and the depth sections thus prepared are used to investigate the average dip of the Benioff zone in different parts of the arc, the penetration depth of the subducting lithosphere, the subduction zone geometry underlying the trench, the arc-trench gap, etc.

The study also describes how different seismogenic sources are identified in the region, the estimation of the moment release rate and the deformation pattern. The region is divided into broad seismogenic belts. Based on these previous studies and the seismicity pattern, we identified several broad, distinct seismogenic belts/sources: 1) the outer arc region consisting of the Andaman-Nicobar islands, 2) the back-arc Andaman Sea, 3) the Sumatran Fault Zone (SFZ), 4) the Java onshore region, termed the Java Fault Zone (JFZ), 5) the Sumatran fore-arc sliver plate consisting of the Mentawai fault (MFZ), 6) the offshore Java fore-arc region, and 7) the Sunda Strait region. As the seismicity is variable, it is difficult to demarcate individual seismogenic sources. Hence, we employed a moving-window method with a window length of 3-4° and 50% overlap, starting from one end to the other. We succeeded in defining 4 sources each in the Andaman fore-arc and back-arc regions, 9 such sources (moving windows) in the Sumatran Fault Zone (SFZ), 9 sources in the offshore SFZ region and 7 sources in the offshore Java region. Because of the low seismicity along the JFZ, it is separated into three seismogenic sources, namely West Java, Central Java and East Java. The Sunda Strait is considered a single seismogenic source. The deformation rates for each of the seismogenic zones have been computed. A detailed error analysis of the velocity tensors using the Monte Carlo simulation method has been carried out in order to obtain uncertainties. The eigenvalues and the respective eigenvectors of the velocity tensor are computed to analyze the actual deformation pattern for the different zones. The results obtained have been discussed in the light of regional tectonics, and their implications in terms of geodynamics have been enumerated. In the light of the recent major earthquakes (the 26 December 2004 and 28 March 2005 events) and the ongoing seismic activity, we have recalculated the variation in the crustal deformation rates before and after these earthquakes in the Andaman-Sumatra region, including data up to 2005, and the significant results have been presented.

In this chapter, the downgoing lithosphere along the subduction zone is modeled using free-air gravity data, taking into consideration the thickness of the crustal layer, the thickness of the subducting slab, the sediment thickness, the presence of volcanism, the proximity of the continental crust, etc. A systematic and detailed gravity interpretation, constrained by seismicity and seismic data in the Andaman arc and the Andaman Sea region, is presented in order to delineate the crustal structure and density heterogeneities along and across the arc and their correlation with the seismogenic behaviour.
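The abstract refers to a Monte Carlo error analysis of the velocity tensor and to its eigen decomposition. The sketch below is a minimal illustration of that idea, not the author's code: a symmetric horizontal strain-rate tensor with hypothetical values and uncertainties is repeatedly perturbed and re-decomposed to estimate uncertainties on the principal deformation rates.

```python
import numpy as np

# Hypothetical symmetric horizontal strain-rate tensor (1/yr) and 1-sigma errors
rng = np.random.default_rng(0)
eps = np.array([[ 3.0e-8, -1.2e-8],
                [-1.2e-8,  5.5e-8]])
sigma = np.full_like(eps, 0.5e-8)

# Monte Carlo propagation: perturb the tensor and repeat the eigen decomposition
n_trials = 10_000
eigvals = np.empty((n_trials, 2))
for i in range(n_trials):
    noise = rng.normal(0.0, sigma)
    noise = (noise + noise.T) / 2.0          # keep the perturbed tensor symmetric
    eigvals[i] = np.linalg.eigvalsh(eps + noise)

mean, std = eigvals.mean(axis=0), eigvals.std(axis=0)
print("principal strain rates:", mean, "+/-", std)
```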
Abstract:
Biometrics deals with the physiological and behavioral characteristics of an individual to establish identity. Fingerprint-based authentication is the most advanced biometric authentication technology. The minutiae-based fingerprint identification method offers a reasonable identification rate. The minutiae feature map consists of about 70-100 minutia points, and matching accuracy drops as the size of the database grows. Hence it is essential to make the fingerprint feature code as small as possible so that identification becomes easier. In this research, a novel global-singularity-based fingerprint representation is proposed. The fingerprint baseline, which is the line between the distal and intermediate phalangeal joint lines in the fingerprint, is taken as the reference line. A polygon is formed with the singularities and the fingerprint baseline. The feature vectors are the polygon's angles, sides, area and type and the ridge counts between the singularities. A 100% recognition rate is achieved with this method. The method is compared with the conventional minutiae-based recognition method in terms of computation time, receiver operating characteristics (ROC) and feature vector length. Speech is a behavioural biometric modality and can be used for speaker identification. In this work, MFCCs of text-dependent speech are computed and clustered using the k-means algorithm. A backpropagation-based Artificial Neural Network is trained to identify the clustered speech code. The performance of the neural network classifier is compared with that of the VQ-based minimum-Euclidean-distance classifier. Biometric systems that use a single modality are usually affected by problems like noisy sensor data, non-universality and/or lack of distinctiveness of the biometric trait, unacceptable error rates, and spoof attacks. A multi-finger, feature-level fusion based fingerprint recognition system is developed and its performance is measured in terms of the ROC curve. Score-level fusion of the fingerprint- and speech-based recognition systems is carried out, and 100% accuracy is achieved over a considerable range of matching thresholds.
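As an illustration of the speech branch described above (MFCC extraction followed by k-means clustering into a compact speech code), here is a minimal sketch; the audio file name, sampling rate and cluster count are assumptions, not values from the thesis.

```python
import librosa
import numpy as np
from sklearn.cluster import KMeans

# Load a (hypothetical) text-dependent utterance and extract 13 MFCCs per frame
y, sr = librosa.load("speaker01_phrase.wav", sr=16000)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T      # shape: (frames, 13)

# Cluster the frame-level MFCC vectors; the centroids act as the speech code
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(mfcc)
speech_code = codebook.cluster_centers_                    # 16 x 13 code vectors

def vq_distortion(test_mfcc, centers):
    """VQ-style match score: mean distance from test frames to a codebook."""
    d = np.linalg.norm(test_mfcc[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1).mean()

print("self-distortion:", vq_distortion(mfcc, speech_code))
```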
Abstract:
Any automatically measurable, robust and distinctive physical characteristic or personal trait that can be used to identify an individual or verify the claimed identity of an individual, referred to as a biometric, has gained significant interest in the wake of heightened concerns about security and rapid advancements in networking, communication and mobility. Multimodal biometrics is expected to be ultra-secure and reliable, owing to the presence of multiple, independent verification clues. In this study, a multimodal biometric system utilising audio and facial signatures has been implemented and an error analysis has been carried out. A total of one thousand face images and 250 sound tracks of 50 users are used for training the proposed system. To account for attempts by unregistered users, data of 25 new users are tested. Short-term spectral features were extracted from the sound data and vector quantization was performed using the k-means algorithm. Face images are identified using the eigenface approach based on Principal Component Analysis. The success rate of the multimodal system using speech and face is higher than that of the individual unimodal recognition systems.
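A minimal sketch of the eigenface step mentioned above, assuming a pre-cropped, aligned training set stored in a NumPy file (the file name and component count are hypothetical); it illustrates PCA-based face identification in general, not the system implemented in the study.

```python
import numpy as np
from sklearn.decomposition import PCA

# faces: hypothetical array of grayscale training images, shape (n_images, h, w)
faces = np.load("train_faces.npy")               # assumption: pre-cropped, aligned
X = faces.reshape(len(faces), -1).astype(float)

# Eigenface approach: PCA on the flattened, mean-centred face vectors
pca = PCA(n_components=50, whiten=True).fit(X)
train_proj = pca.transform(X)                    # each face as 50 PCA coefficients

def identify(test_face, train_proj, labels):
    """Nearest-neighbour match in eigenface space."""
    w = pca.transform(test_face.reshape(1, -1))
    d = np.linalg.norm(train_proj - w, axis=1)
    return labels[np.argmin(d)]
```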
Abstract:
Nowadays, email has become the most widely used means of communication in daily life. The main reason for using email is the convenience and speed with which it can be transmitted, irrespective of geographical distance. To improve the security and efficiency of email systems, most of them adopt PKI or IBE encryption schemes. However, both PKI and IBE encryption schemes have their own shortcomings and consequently bring security issues to email systems. This paper proposes a new secure email system based on IBE that combines fingerprint authentication with a proxy service for encryption and decryption.
Abstract:
The present work is aimed at the development of an appropriate microbial technology for the protection of larvae of Macrobrachium rosenbergii from disease and for increasing the survival rate in hatcheries. The application of immunostimulants to activate the immune system of cultured animals against pathogens is the widely accepted alternative to antibiotics in aquaculture. The most important immunostimulant is glucan. Therefore, a research programme was undertaken on the extraction of glucan from Acremonium diospyri and its application, along with bacterins as microspheres, in the Macrobrachium rosenbergii larval rearing system. The main objectives of the study are the development of aquaculture-grade glucan from Acremonium diospyri, a microencapsulated drug delivery system for the larvae of M. rosenbergii, and a microencapsulated glucan with bacterin preparation for the enhanced production of M. rosenbergii in the larval rearing system. Based on the results of field trials of the microencapsulated glucan with bacterin preparation, it is concluded that the preparation, applied at a concentration of 25 g per million larvae once every seven days, will enhance the production and quality of M. rosenbergii seed.
Abstract:
The thesis introduces the octree and addresses the full range of problems encountered while building an imaging system based on octrees. An efficient bottom-up recursive algorithm and its iterative counterpart are developed for the raster-to-octree conversion of CAT scan slices. To improve the speed of generating the octree from the slices, the possibility of exploiting the inherent parallelism in the conversion programme is explored in this thesis. The octree node, which stores the volume information of a cube, often stores only the average density, which can lead to a "patchy" distribution of density during image reconstruction. To alleviate this problem, the possibility of using vector quantization (VQ) to represent the information contained within a cube is explored. Considering the ease of incorporating compression while generating octrees from CAT scan slices, the use of wavelet transforms to generate the compressed information in a cube is proposed. The modified algorithm for generating octrees from the slices is shown to accommodate the wavelet compression easily. Rendering the information stored in the octree is a complex task, essentially because of the requirement to display volumetric information. The rays traced from each cube in the octree sum up the density en route, accounting for the opacities and transparencies produced by variations in density.
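To make the raster-to-octree idea concrete, here is a minimal bottom-up construction sketch for a cubic density volume. It is an illustration under the assumption of a power-of-two volume, not the thesis's algorithm, and the toy "slice stack" at the end is hypothetical.

```python
import numpy as np

class Node:
    """Octree node: either a leaf holding one density value or eight children."""
    def __init__(self, value=None, children=None):
        self.value = value
        self.children = children

def build_octree(volume, tol=0.0):
    """Bottom-up raster-to-octree conversion of a cubic density volume
    (side length must be a power of two). Eight child cubes with
    (near-)equal density are merged into a single leaf with their average."""
    n = volume.shape[0]
    if n == 1:
        return Node(value=float(volume[0, 0, 0]))
    h = n // 2
    kids = [build_octree(volume[x:x+h, y:y+h, z:z+h], tol)
            for x in (0, h) for y in (0, h) for z in (0, h)]
    if all(k.children is None for k in kids):
        vals = [k.value for k in kids]
        if max(vals) - min(vals) <= tol:
            return Node(value=sum(vals) / 8.0)   # merge homogeneous children
    return Node(children=kids)

# Toy stack of "CAT scan slices": an 8x8x8 volume with an embedded dense block
vol = np.zeros((8, 8, 8))
vol[2:6, 2:6, 2:6] = 1.0
root = build_octree(vol)
```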
Abstract:
Nanoscale silica was synthesized by a precipitation method using sodium silicate and dilute hydrochloric acid under controlled conditions. The synthesized silica was characterized by Scanning Electron Microscopy (SEM), Transmission Electron Microscopy (TEM), BET adsorption and X-Ray Diffraction (XRD). The particle size of the silica was calculated to be 13 nm from the XRD results and the surface area was found to be 295 m²/g by the BET method. The performance of this synthesized nanosilica as a reinforcing filler in natural rubber (NR) compound was investigated, with commercial silica used as the reference material. Nanosilica was found to be an effective reinforcing filler in the natural rubber compound, and the filler-matrix interaction was better for nanosilica than for the commercial silica. The synthesized nanosilica was used in place of conventional silica in the HRH (hexamethylene tetramine, resorcinol and silica) bonding system for natural rubber and styrene butadiene rubber / Nylon 6 short fiber composites. The efficiency of the HRH bonding system based on nanosilica was better. Nanosilica was also used as a reinforcing filler in rubber / Nylon 6 short fiber hybrid composites. The cure, mechanical, ageing, thermal and dynamic mechanical properties of nanosilica / Nylon 6 short fiber / elastomer hybrid composites were studied in detail. The matrices used were natural rubber (NR), nitrile rubber (NBR), styrene butadiene rubber (SBR) and chloroprene rubber (CR). Fiber loading was varied from 0 to 30 parts per hundred rubber (phr) and silica loading was varied from 0 to 9 phr. The hexa:resorcinol:silica (HRH) ratio was maintained at 2:2:1, and the HRH loading was adjusted to 16% of the fiber loading. Minimum torque, maximum torque and cure time increased with silica loading. Cure rate increased with fiber loading and decreased with silica content. The hybrid composites showed improved mechanical properties in the presence of nanosilica. Tensile strength showed a dip at 10 phr fiber loading in the case of NR and CR, while it increased continuously with fiber loading in the case of NBR and SBR. The nanosilica improved the tensile strength, modulus and tear strength more than the conventional silica. Abrasion resistance and hardness were also better for the nanosilica composites, while resilience and compression set were adversely affected. The hybrid composites showed anisotropy in mechanical properties. Retention on ageing improved with fiber loading and was better for nanosilica-filled hybrid composites. The nanosilica also improved the thermal stability of the hybrid composites more than the commercial silica. All the composites underwent two-step thermal degradation. Kinetic studies showed that the degradation of all the elastomeric composites followed a first-order reaction. Dynamic mechanical analysis revealed that the storage modulus (E') and loss modulus (E") increased with nanosilica content, fiber loading and frequency for all the composites, independent of the matrix. The highest rate of increase was registered for the NBR-based composites.
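For reference, the first-order degradation kinetics mentioned above corresponds to the standard rate law with the usual Arrhenius temperature dependence (a textbook form, not an equation quoted from the thesis):

\[ \frac{d\alpha}{dt} = k\,(1-\alpha), \qquad k = A\,e^{-E_a/RT}, \]

where \(\alpha\) is the fractional degree of degradation (conversion), \(A\) the pre-exponential factor, \(E_a\) the activation energy, \(R\) the gas constant and \(T\) the absolute temperature.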
Abstract:
Global Positioning System (GPS), with its high integrity, continuous availability and reliability, has revolutionized navigation systems based on radio ranging. With four or more GPS satellites in view, a GPS receiver can find its location anywhere over the globe with an accuracy of a few meters. Higher accuracy, within centimeters or even millimeters, is achievable by correcting the GPS signal with an external augmentation system. The use of satellites for critical applications like navigation has become a reality through the development of these augmentation systems (such as WAAS, SDCM and EGNOS), whose primary objective is to provide the essential integrity information needed for navigation service in their respective regions. Apart from these, many countries have initiated the development of space-based regional augmentation systems, such as GAGAN and IRNSS of India, MSAS and QZSS of Japan, and COMPASS of China. In future, these regional systems will operate simultaneously and emerge as a Global Navigation Satellite System (GNSS) to support a broad range of activities in the global navigation sector.

Among the different error sources in GPS precise positioning, the propagation delay due to atmospheric refraction is a limiting factor on the achievable accuracy. Although WADGPS, aimed at accurate positioning over a large area, broadcasts corrections for the different errors involved in GPS ranging, including ionospheric and tropospheric errors, the broadcast tropospheric corrections are not sufficiently accurate because of the large temporal and spatial variations of the atmospheric parameters, especially in the lower atmosphere (troposphere). This necessitates the estimation of the tropospheric error based on realistic values of tropospheric refractivity. Presently available methodologies for estimating the tropospheric delay are mostly based on atmospheric data and GPS measurements from mid-latitude regions, where the atmospheric conditions are significantly different from those over the tropics; no such attempts had been made over the tropics. In a practical approach, when measured atmospheric parameters are not available, only analytical models evolved using data from mid-latitudes can be used for this purpose. The major drawback of these existing models is that they neglect the seasonal variation of the atmospheric parameters at stations near the equator, and in the tropics they underestimate the delay on quite a few occasions. In this context, the present study is a first and major step towards the development of models for the tropospheric delay over the Indian region, which is a prime requisite for the future space-based navigation programmes (GAGAN and IRNSS). Apart from the models based on measured surface parameters, a region-specific model that does not require any measured atmospheric parameter as input, but depends only on latitude and day of the year, was developed for the tropical region with emphasis on the Indian sector.

The large variability of atmospheric water vapor content over short spatial and/or temporal scales makes its measurement rather involved and expensive. A local network of GPS receivers is an effective tool for water vapor remote sensing over land, and this recently developed technique proves effective for measuring precipitable water (PW). The potential of using GPS to estimate atmospheric water vapor under all-weather conditions and with high temporal resolution is explored; this will be useful for retrieving columnar water vapor from ground-based GPS data. A good GPS network could be a major source of water vapor information for Numerical Weather Prediction models and could act as a surrogate for the data gap in microwave remote sensing of water vapor over land.
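To make the delay-to-water-vapor chain concrete, here is a brief sketch using the widely used Saastamoinen zenith hydrostatic delay model and a Bevis-type conversion from zenith wet delay to PW. It is illustrative only, is not the region-specific model developed in the thesis, and the surface values at the end are assumed.

```python
import numpy as np

def zhd_saastamoinen(p_hpa, lat_deg, h_km):
    """Zenith hydrostatic delay (m) from the standard Saastamoinen model,
    given surface pressure (hPa), latitude (deg) and station height (km)."""
    return 0.0022768 * p_hpa / (1.0 - 0.00266 * np.cos(2 * np.radians(lat_deg))
                                - 0.00028 * h_km)

def pw_from_zwd(zwd_m, tm_k):
    """Convert zenith wet delay (m) to precipitable water (m) using a
    Bevis-type conversion factor; refractivity constants in SI units."""
    k2p, k3 = 0.221, 3.739e3          # K/Pa and K^2/Pa (assumed standard values)
    rv, rho_w = 461.5, 1000.0         # J/(kg K), kg/m^3
    pi_factor = 1.0e6 / (rho_w * rv * (k3 / tm_k + k2p))   # about 0.15
    return pi_factor * zwd_m

# Example with assumed surface values for a tropical coastal station
ztd = 2.45                            # total zenith delay estimated from GPS (m)
zhd = zhd_saastamoinen(p_hpa=1008.0, lat_deg=10.0, h_km=0.03)
zwd = ztd - zhd
print("PW (mm):", 1000 * pw_from_zwd(zwd, tm_k=278.0))
```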
Abstract:
In the present scenario of energy demand overtaking energy supply, top priority is given to energy conservation programmes and policies. Most process plants are operated on a continuous basis and consume large quantities of energy. Efficient management of the process system can lead to energy savings, improved process efficiency, lower operating and maintenance costs, and greater environmental safety. Reliability and maintainability of the system are usually considered at the design stage and depend on the system configuration. However, with the growing need for energy conservation, most existing process systems are either modified or in a state of modification with a view to improving energy efficiency. Often these modifications result in a change in system configuration, thereby affecting the system reliability. It is important that system modifications for improving energy efficiency are not made at the cost of reliability. Any new proposal for improving the energy efficiency of a process or equipment should prove economically feasible to gain acceptance for implementation. In order to arrive at the economic feasibility of the new proposal, the general trend is to compare the benefits that can be derived over the lifetime, as well as the operating and maintenance costs, with the investment to be made. Quite often the reliability aspects (or the loss due to unavailability) are not taken into consideration, even though plant availability is a critical factor in the economic performance evaluation of any process plant.

The focus of the present work is to study the effect of system modification for improving energy efficiency on system reliability. A generalized model for the valuation of a process system incorporating reliability is developed and used as a tool for the analysis. It can provide an awareness of the potential performance improvements of the process system and can be used to arrive at the change in process system value resulting from system modification. The model also arrives at the payback period of the modified system by taking reliability aspects into consideration, and it is used to study the effect of various operating parameters on system value. The concept of breakeven availability is introduced, and an algorithm for the allocation of component reliabilities of the modified process system based on the breakeven system availability is also developed. The model was applied to various industrial situations.
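As a rough illustration of how availability can enter such an economic evaluation (a simplified sketch, not the valuation model developed in the thesis; all figures are hypothetical): annual savings are scaled by availability, and the breakeven availability is the value at which lifetime benefits just cover the investment.

```python
def annual_net_benefit(energy_saving, availability, o_and_m_cost):
    """Benefit actually realised per year: savings accrue only while the
    modified system is available."""
    return energy_saving * availability - o_and_m_cost

def simple_payback(investment, energy_saving, availability, o_and_m_cost):
    """Payback period (years) including the loss due to unavailability."""
    return investment / annual_net_benefit(energy_saving, availability, o_and_m_cost)

def breakeven_availability(investment, lifetime_yr, energy_saving, o_and_m_cost):
    """Availability at which lifetime benefits just equal the investment."""
    return (investment / lifetime_yr + o_and_m_cost) / energy_saving

# Hypothetical modification: investment 50, gross saving 12 and O&M cost 1.5 per year
print(simple_payback(50.0, 12.0, availability=0.92, o_and_m_cost=1.5))
print(breakeven_availability(50.0, lifetime_yr=10, energy_saving=12.0, o_and_m_cost=1.5))
```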
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems or anomalies arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to take care of buffer overflow attacks. For protection, Stack Guards and Heap Guards are also used in a wide variety of forms.

This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection.

Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that the Bayesian network on frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced due to errors.
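As a toy illustration of the frequency-deviation idea described above (not the dissertation's sequence-set profiling or its Bayesian model), the sketch below scores a trace by how far its system call frequencies depart from a stored normal profile; the call names, profile values and threshold are hypothetical.

```python
from collections import Counter

def freq_vector(trace, alphabet):
    """Relative frequency of each system call in a trace."""
    counts = Counter(trace)
    total = max(len(trace), 1)
    return {c: counts[c] / total for c in alphabet}

def anomaly_score(trace, normal_profile):
    """Deviation of a trace's call frequencies from the stored normal profile
    (sum of absolute differences); larger values mean more anomalous."""
    obs = freq_vector(trace, normal_profile.keys())
    return sum(abs(obs[c] - normal_profile[c]) for c in normal_profile)

# Hypothetical profile learned from normal runs, and a suspicious trace
normal = {"read": 0.40, "write": 0.30, "open": 0.15, "mmap": 0.10, "execve": 0.05}
trace = ["read", "read", "execve", "execve", "execve", "mmap", "write"]
score = anomaly_score(trace, normal)
print("anomalous" if score > 0.6 else "normal", round(score, 2))
```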
Abstract:
In the present work, the author has designed and developed solar air heaters of the porous and non-porous collector types. The developed solar air heaters were subjected to different air mass flow rates in order to standardize the flow per unit area of the collector. Much attention was given to investigating the performance of solar air heaters fitted with baffles. The output obtained from the experiments on pilot models helped in the installation of a solar air heating system for industrial drying applications as well. Apart from these, various types of solar dryers for small and medium scale drying applications were also built. The feasibility of a latent heat thermal energy storage system based on a Phase Change Material was also studied. The application of a solar greenhouse for drying industrial effluent was analyzed in the present study and a solar greenhouse was developed. The effectiveness of Computational Fluid Dynamics (CFD) in the field of solar air heaters was also analyzed. The thesis is divided into eight chapters.
Abstract:
The thesis mainly focuses on material characterization in different environments: freely available samples taken in planar form, biological samples available in small quantities, and buried objects. The free space method finds many applications in the fields of industry, medicine and communication. As it is a non-contact method, it can be employed for monitoring the electrical properties of materials moving on a conveyor belt in real time, and measurements on such systems at high temperature are possible. NID theory can be applied to the characterization of thin films: the dielectric properties of thin films deposited on any dielectric substrate can be determined. In the chemical industry, the stages of a chemical reaction can be monitored online; online monitoring is more efficient as it saves time and avoids the risk of sample collection.

Dielectric contrast is one of the main factors that decide the detectability of a system. It could be noted that two dielectric objects of the same dielectric constant, 3.2 (εr of a plastic mine), placed in a medium of dielectric constant 2.56 (εr of sand), could still be detected by employing time domain analysis of the reflected signal. This type of detection is of strategic importance as it provides a solution to the problem of clearance of non-metallic mines, whose demining using conventional techniques has proved futile. The studies on the detection of voids and leakage in pipes also find many applications.

The determined electrical properties of tissues can be used for numerical modeling of cells, microwave imaging, SAR tests, etc. All these techniques need an accurate determination of the dielectric constant. In the modern world, the use of cellular and other wireless communication systems is booming, and at the same time people are concerned about the hazardous effects of microwaves on living cells. The effect is usually studied on human phantom models, whose construction requires knowledge of the dielectric parameters of the various body tissues. It is in this context that the present study gains significance. The case study on biological samples shows that the properties of normal and infected body tissues are different. Even though the change in the dielectric properties of infected samples relative to normal ones may not be clear evidence of an ailment, it is an indication of some disorder.

In the medical field, the free space method may be adapted for imaging biological samples. This method can also be used in wireless technology: the evaluation of the electrical properties and attenuation of obstacles in the path of RF waves can be done using free waves, and an intelligent system for controlling the power output or frequency, depending on the feedback values of the attenuation, may be developed.

The simulation employed in GPR can be extended to explore the effects of factors such as the proportion of water content in the soil and the level and roughness of the soil on the reflected signal; this may find applications in geological exploration. In the detection of mines, a state-of-the-art technique for scanning and imaging an active minefield can be developed using GPR. The probing antenna can be attached to a robotic arm capable of three degrees of rotation and the whole detecting system can be housed in a military vehicle. In industry, a system based on the GPR principle can be developed for monitoring liquid or gas flow through a pipe, as a pipe with and without the sample gives different reflection responses. It may also be implemented for the online monitoring of the different stages of extraction and purification of crude petroleum in a plant. Since biological samples show fluctuation in their dielectric nature with time and other physiological conditions, more investigation in this direction should be done. The infected cells at various stages of advancement and normal cells should be analysed, and the results from these comparative studies can be utilized for the detection of the onset of such diseases. By studying the properties of infected tissues at different stages, the threshold of detectability of infected cells can be determined.
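To illustrate the dielectric contrast argument quantitatively: the εr values 3.2 and 2.56 are those quoted in the abstract, while the formula below is the standard normal-incidence reflection coefficient for lossless, non-magnetic dielectrics, not an expression taken from the thesis.

```python
import math

def reflection_coefficient(eps1, eps2):
    """Normal-incidence amplitude reflection coefficient at the interface
    between two lossless, non-magnetic dielectrics (wave travels 1 -> 2)."""
    n1, n2 = math.sqrt(eps1), math.sqrt(eps2)
    return (n1 - n2) / (n1 + n2)

# Values quoted in the abstract: sand (2.56) over a buried plastic mine (3.2)
gamma = reflection_coefficient(2.56, 3.2)
print(f"amplitude reflection coefficient: {gamma:.3f}")   # about -0.06
```

The small magnitude of this coefficient is exactly why such low-contrast targets are challenging and why time domain analysis of the weak reflected signal matters.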
Abstract:
Learning Disability (LD) is a general term that describes specific kinds of learning problems. It is a neurological condition that affects a child's brain and impairs the ability to carry out one or many specific tasks. Learning disabled children are neither slow nor mentally retarded; the disorder can, however, make it difficult for a child to learn as quickly or in the same way as a child who is not affected by a learning disability. An affected child can have normal or above-average intelligence, yet may have difficulty paying attention, with reading or letter recognition, or with mathematics. It does not mean that children who have learning disabilities are less intelligent; in fact, many of them are more intelligent than the average child. Learning disabilities vary from child to child, and one child with LD may not have the same kind of learning problems as another. There is no cure for learning disabilities and they are life-long; however, children with LD can be high achievers and can be taught ways to work around the learning disability.

In this research work, data mining using machine learning techniques is used to analyze the symptoms of LD, establish the interrelationships between them and evaluate the relative importance of these symptoms. To increase the diagnostic accuracy of learning disability prediction, a knowledge-based tool built on statistical machine learning (data mining) techniques, with high accuracy according to the knowledge obtained from the clinical information, is proposed. The basic idea of the developed knowledge-based tool is to increase the accuracy of the learning disability assessment and to reduce the time required for it. Different statistical machine learning techniques in data mining are used in the study. Identifying the important parameters of LD prediction using data mining techniques, identifying the hidden relationships between the symptoms of LD and estimating the relative significance of each symptom of LD are also among the objectives of this research work. The developed tool has many advantages compared to the traditional methods of using checklists for the determination of learning disabilities. For improving the performance of the various classifiers, some preprocessing methods were developed for the LD prediction system. A new system based on fuzzy and rough set models is also developed for LD prediction, and here too the importance of pre-processing is studied. A Graphical User Interface (GUI) is designed for developing an integrated knowledge-based tool for the prediction of LD as well as its degree. The designed tool stores the details of the children in a student database and retrieves their LD reports as and when required.

The present study demonstrates the effectiveness of the tool developed using various machine learning techniques. It also identifies the important parameters of LD and accurately predicts learning disability in school-age children. The thesis makes several major contributions in technical, general and social areas. The results are found to be very beneficial to parents, teachers and institutions, who are able to diagnose a child's problem at an early stage and arrange the proper treatment or counseling at the right time so as to avoid academic and social losses.
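As a small, hedged sketch of the kind of classifier pipeline described above (pre-processing, classification and symptom importance): the symptom features, labels and model choice here are hypothetical stand-ins, not the thesis's data or its actual classifiers.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Hypothetical checklist data: each row holds a child's symptom scores
# (e.g. attention, reading, letter recognition, arithmetic); label 1 = LD
rng = np.random.default_rng(42)
X = rng.integers(0, 5, size=(200, 4)).astype(float)
y = (X.sum(axis=1) > 9).astype(int)            # toy rule standing in for clinical labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Pre-processing (scaling) followed by a decision tree; the tree's feature
# importances give the relative significance of each symptom
model = make_pipeline(StandardScaler(),
                      DecisionTreeClassifier(max_depth=3, random_state=0))
model.fit(X_tr, y_tr)
print("accuracy:", model.score(X_te, y_te))
print("symptom importances:",
      model.named_steps["decisiontreeclassifier"].feature_importances_)
```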
Abstract:
The work presented in this thesis concerns the development and evaluation of new bonding agents for short polyester fiber - polyurethane elastomer composites. The conventional bonding system based on hexamethylenetetramine, resorcinol and hydrated silica was not effective as a bonding agent for the composite, as the water eliminated during the formation of the RF resin hydrolysed the urethane linkages. Four bonding agents based on MDI/TDI and polypropylene glycol, propylene glycol and glycerol were prepared, and the composite recipe was optimised with respect to the cure characteristics and mechanical properties. The flow properties, stress relaxation pattern and thermal degradation characteristics of the composites containing the different bonding agents were then studied in detail to evaluate the new bonding systems. The optimum loading of resin was 5 phr and the ratio of -OH to isocyanate was 1:1. The cure characteristics showed that the optimum combination of cure rate and processability was given by the composite with the resin based on polypropylene glycol/glycerol/4,4'-diphenylmethane diisocyanate (PPG/GL/MDI). From the rheological studies of the composites with and without bonding agents it was observed that all the composites showed pseudoplastic behaviour and that the activation energy of flow of the composite was not altered by the presence of bonding agents. Mechanical properties such as tensile strength, modulus, tear resistance and abrasion resistance were improved in the presence of bonding agents, and the effect was most pronounced in the case of abrasion resistance. The composites based on MDI/GL showed better initial properties, while composites with resins based on MDI/PPG showed better ageing resistance. Stress relaxation showed a multistage relaxation behaviour for the composite. Within the strain levels studied, the initial rate of relaxation was higher and the crossover time shorter for the composites containing bonding agents. The bonding agent based on MDI/PPG/GL was found to be the better choice for improving the stress relaxation characteristics with better interfacial bonding. Thermogravimetric analysis showed that the presence of fiber and bonding agents improved the thermal stability of the polyurethane elastomer marginally, the improvement being greatest for the MDI/GL based bonding agent. The kinetics of degradation was not altered by the presence of bonding agents.
Abstract:
In this paper we address the problem of face detection and recognition in grey-scale frontal-view images. We propose a face recognition system based on a probabilistic neural network (PNN) architecture. The system is implemented using Voronoi/Delaunay tessellations and template matching. Images are segmented successfully into homogeneous regions by virtue of the properties of the Voronoi diagram. Face verification is achieved using matching scores computed by correlating the edge gradients of reference images. The advantage of classification using PNN models is the short training time. The correlation-based template matching guarantees good classification results.
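A minimal sketch of correlation-based matching on edge-gradient maps, in the spirit of the verification step described above; it uses OpenCV's normalised cross-correlation and is only an illustration, with hypothetical image file names, not the paper's implementation.

```python
import cv2

# Load a grey-scale test image and a reference face template (hypothetical files)
image = cv2.imread("test_face.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("reference_face.png", cv2.IMREAD_GRAYSCALE)

def edge_map(img):
    """Sobel gradient magnitude, so matching is driven by edge structure."""
    gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)
    return cv2.magnitude(gx, gy)

# Normalised cross-correlation of the edge maps; the peak is the matching score
result = cv2.matchTemplate(edge_map(image), edge_map(template), cv2.TM_CCORR_NORMED)
_, max_score, _, max_loc = cv2.minMaxLoc(result)
print("matching score:", max_score, "at", max_loc)
```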