Abstract:
Studies of the urban water supply system in the state of Kerala are few; it is a little-researched area. On water pricing a number of studies are available, but with the exception of Jacob John's study "Economics of Public Water Supply System", a 1997 case study of the Trivandrum water supply system, no exhaustive research work has so far come out in this field in Kerala. Moreover, no in-depth research study has so far come up relating to household water demand analysis and the distribution system of urban piped water supply. The proposed study is the first of its kind to focus on the distributional and availability problems of piped water supply in an urban centre in Kerala state. Hence there is a felt need for enquiring into the sufficiency of potable water supplied to people in urban areas and the efficiency maintained in providing this scarce resource and preventing its misuse by consumers. It is against this backdrop that the study was undertaken; its empirical part was conducted in Calicut city in the state of Kerala, and the study is confined to the water supply system of the city of Calicut.
Abstract:
The base concept from which the entire research problem emerged is as follows: the lack of spatial planning and an effective development management system leads to urban sprawl with a non-optimal population density to support urban infrastructure, causing a lower quality of life in urban areas on the one hand. On the other hand, it causes a loss of productivity of natural ecosystems and agricultural areas owing to disturbance of those ecosystems. Planned, compact, high-density development with compatible mixed land use can go a long way towards achieving environmental efficiency in the development management system.
Abstract:
In this paper we discuss our research in developing a general and systematic method for anomaly detection. The key ideas are to represent normal program behaviour using system call frequencies and to incorporate probabilistic techniques for classification to detect anomalies and intrusions. Using experiments on the sendmail system call data, we demonstrate that we can construct concise and accurate classifiers to detect anomalies. We provide an overview of the approach that we have implemented.
Abstract:
This paper discusses our research in developing a generalized and systematic method for anomaly detection. The key ideas are to represent normal program behaviour using system call frequencies and to incorporate probabilistic techniques for classification to detect anomalies and intrusions. Using experiments on the sendmail system call data, we demonstrate that concise and accurate classifiers can be constructed to detect anomalies. An overview of the approach that we have implemented is provided.
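The frequency-based representation described in these two abstracts can be illustrated with a minimal sketch. This is not the authors' implementation: the system-call list, the z-score style scoring and all names here are assumptions, and the papers' probabilistic classifiers trained on the sendmail traces would replace this toy model.

```python
from collections import Counter
import math

# Hypothetical system-call vocabulary; a real monitor would enumerate the OS's calls.
SYSCALLS = ["open", "read", "write", "mmap", "close", "fork", "exec"]

def freq_vector(trace):
    """Normalised system-call frequency vector for one program trace."""
    counts = Counter(trace)
    total = len(trace)
    return [counts[s] / total for s in SYSCALLS]

def train(normal_traces):
    """Per-call mean and standard deviation of frequencies over normal traces."""
    vecs = [freq_vector(t) for t in normal_traces]
    model = []
    for i in range(len(SYSCALLS)):
        col = [v[i] for v in vecs]
        mean = sum(col) / len(col)
        var = sum((x - mean) ** 2 for x in col) / len(col)
        model.append((mean, math.sqrt(var) + 1e-6))  # epsilon avoids division by zero
    return model

def anomaly_score(model, trace):
    """Sum of squared z-scores: large values mean far from normal behaviour."""
    v = freq_vector(trace)
    return sum(((x - m) / s) ** 2 for x, (m, s) in zip(v, model))
```

A trace dominated by calls that never appear in training (e.g. repeated `exec`) scores orders of magnitude higher than a normal trace, which is the intuition behind flagging intrusions from call frequencies.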
Abstract:
Recent trends envisage multi-standard architectures as a promising solution for future wireless transceivers to attain higher system capacities and data rates. The computationally intensive decimation filter plays an important role in channel selection for multi-mode systems. An efficient reconfigurable implementation is key to achieving low power consumption. To this end, this paper presents a dual-mode Residue Number System (RNS) based decimation filter which can be programmed for the WCDMA and 802.16e standards. Decimation is done using multistage, multirate finite impulse response (FIR) filters. These FIR filters, implemented in the RNS domain, offer high speed because of carry-free operation on smaller residues in parallel channels. The FIR filters also exhibit programmability to a selected standard by reconfiguring the hardware architecture. The total area is increased by only 24% to include WiMAX compared to a single-mode WCDMA transceiver. In each mode, the unused parts of the overall architecture are powered down and bypassed to attain power savings. The performance of the proposed decimation filter in terms of critical path delay and area is tabulated.
Abstract:
Recent trends envisage multi-standard architectures as a promising solution for future wireless transceivers. The computationally intensive decimation filter plays an important role in channel selection for multi-mode systems. An efficient reconfigurable implementation is key to achieving low power consumption. To this end, this paper presents a dual-mode Residue Number System (RNS) based decimation filter which can be programmed for the WCDMA and 802.11a standards. Decimation is done using multistage, multirate finite impulse response (FIR) filters. These FIR filters, implemented in the RNS domain, offer high speed because of carry-free operation on smaller residues in parallel channels. The FIR filters also exhibit programmability to a selected standard by reconfiguring the hardware architecture. The total area is increased by only 33% to include WLANa compared to a single-mode WCDMA transceiver. In each mode, the unused parts of the overall architecture are powered down and bypassed to attain power savings. The performance of the proposed decimation filter in terms of critical path delay and area is tabulated.
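The carry-free RNS arithmetic both decimation-filter abstracts rely on can be sketched in a few lines: each multiply-accumulate proceeds independently per residue channel, and the result is recovered via the Chinese Remainder Theorem. The moduli set and function names are hypothetical; a hardware design would choose moduli to match the filter's dynamic range.

```python
from math import prod

MODULI = (31, 32, 33)   # pairwise coprime moduli (assumed set); dynamic range M = 32736
M = prod(MODULI)

def to_rns(x):
    """Encode an integer as its residues modulo each channel's modulus."""
    return tuple(x % m for m in MODULI)

def rns_mac(acc, coeff, sample):
    """Multiply-accumulate done independently (carry-free) in each residue channel."""
    return tuple((a + c * s) % m
                 for a, c, s, m in zip(acc, to_rns(coeff), to_rns(sample), MODULI))

def from_rns(r):
    """Chinese Remainder Theorem reconstruction back to a single integer."""
    x = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        x += ri * Mi * pow(Mi, -1, mi)  # modular inverse of Mi mod mi
    return x % M

def fir(coeffs, samples):
    """One FIR output sample, accumulated entirely in the RNS domain."""
    acc = to_rns(0)
    for c, s in zip(coeffs, samples):
        acc = rns_mac(acc, c, s)
    return from_rns(acc)
```

The speed advantage in hardware comes from the small, independent residue channels; this sketch only demonstrates the arithmetic's correctness, not its timing.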
Abstract:
This paper proposes a content-based image retrieval (CBIR) system using the local colour and texture features of selected image sub-blocks and the global colour and shape features of the image. The image sub-blocks are roughly identified by segmenting the image into partitions of different configurations, finding the edge density in each partition using edge thresholding and morphological dilation, and finding the corner density in each partition. The colour and texture features of the identified regions are computed from the histograms of the quantized HSV colour space and the Gray Level Co-occurrence Matrix (GLCM) respectively. A combined colour and texture feature vector is computed for each region. The shape features are computed from the Edge Histogram Descriptor (EHD). The Euclidean distance measure is used for computing the distance between the features of the query and target images. Experimental results show that the proposed method provides better retrieval results than some of the existing methods.
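The colour part of the pipeline above can be sketched minimally: quantize each pixel's HSV value into a small number of bins, histogram them, and compare images by Euclidean distance. The 8×3×3 bin counts are an assumption, and the paper's GLCM texture and EHD shape features are omitted from this sketch.

```python
import colorsys
import math

def hsv_histogram(pixels, h_bins=8, s_bins=3, v_bins=3):
    """Quantized HSV colour histogram, normalised to unit sum.
    `pixels` is a list of (r, g, b) tuples in 0..255."""
    hist = [0.0] * (h_bins * s_bins * v_bins)
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)  # all in [0, 1]
        hi = min(int(h * h_bins), h_bins - 1)
        si = min(int(s * s_bins), s_bins - 1)
        vi = min(int(v * v_bins), v_bins - 1)
        hist[(hi * s_bins + si) * v_bins + vi] += 1
    total = len(pixels)
    return [c / total for c in hist]

def euclidean(f1, f2):
    """Euclidean distance between two feature vectors (query vs. target)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))
```

In a full system the colour histogram would be concatenated with GLCM texture statistics per sub-block before the distance computation; identical images score a distance of zero and images of different dominant hue fall into different bins.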
Effectiveness of Feature Detection Operators on the Performance of an Iris Biometric Recognition System
Abstract:
Iris recognition is a highly efficient biometric identification system with great possibilities for the future in the security systems area. Its robustness and unobtrusiveness, as opposed to most currently deployed systems, make it a good candidate to replace most of the security systems around. By making use of the distinctiveness of iris patterns, iris recognition systems obtain a unique mapping for each person; identification of that person is then possible by applying an appropriate matching algorithm. In this paper, Daugman's Rubber Sheet model is employed for iris normalization and unwrapping, a descriptive statistical analysis of different feature detection operators is performed, the extracted features are encoded using Haar wavelets, and the Hamming distance is used as the matching algorithm for classification. The system was tested on the UBIRIS database. The Canny edge detection algorithm is found to be the best at extracting most of the iris texture. The success rate of feature detection using Canny is 81%, the False Accept Rate is 9% and the False Reject Rate is 10%.
Abstract:
Biometrics has become important in security applications. In comparison with many other biometric features, iris recognition has very high recognition accuracy because it relies on the iris, which is located in a place that remains stable throughout human life, and the probability of finding two identical irises is close to zero. The identification system consists of several stages, including the segmentation stage, which is the most critical one. Current segmentation methods still have limitations in localizing the iris due to the assumption of a circular pupil shape. In this research, Daugman's method is applied to investigate the segmentation techniques. Eyelid detection is another step included in this study as part of the segmentation stage, to localize the iris accurately and remove unwanted areas that might otherwise be included. The obtained iris region is encoded using Haar wavelets to construct the iris code, which contains the most discriminating features of the iris pattern. The Hamming distance is used for the comparison of iris templates in the recognition stage. The dataset used for the study is the UBIRIS database. A comparative study of different edge detector operators is performed, and it is observed that the Canny operator is best suited to extract most of the edges needed to generate the iris code for comparison. A recognition rate of 89% and a rejection rate of 95% are achieved.
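The matching step both iris studies share, comparing binary iris codes by fractional Hamming distance while ignoring bits masked out by eyelids or eyelashes, can be sketched as follows. The 0.32 acceptance threshold is a commonly quoted figure for Daugman-style systems and is an assumption here, not a value taken from either abstract.

```python
def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance over bits that are valid in both masks.
    Codes and masks are equal-length sequences of 0/1 values."""
    usable = disagree = 0
    for a, b, ma, mb in zip(code_a, code_b, mask_a, mask_b):
        if ma and mb:               # skip bits occluded in either image
            usable += 1
            disagree += (a != b)
    return disagree / usable

def matches(code_a, code_b, mask_a, mask_b, threshold=0.32):
    """Accept the pair as the same iris if the distance falls below the threshold."""
    return hamming_distance(code_a, code_b, mask_a, mask_b) <= threshold
```

Two codes from the same eye should disagree on only a small fraction of usable bits, while codes from different eyes disagree on roughly half of them, which is what makes a fixed threshold workable.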
Abstract:
A novel and fast technique for cryptographic applications is designed and developed using the symmetric key algorithm MAJE4 and the popular asymmetric key algorithm RSA. The MAJE4 algorithm is used for encryption/decryption of files, since it is much faster and occupies less memory than RSA. The RSA algorithm is used to solve the problem of key exchange, as well as to accomplish scalability and message authentication. The focus is to develop a new hybrid system, called MARS4, by combining the two cryptographic methods with the aim of obtaining the advantages of both. The performance of MARS4 is evaluated in comparison with MAJE4 and RSA.
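The hybrid pattern MARS4 follows, a fast symmetric cipher for the bulk data with RSA used only to transport the session key, can be sketched as below. MAJE4 is not reproduced here; a toy XOR stream cipher stands in for it, and the RSA parameters are tiny textbook demo values. Neither is secure, and nothing here is the authors' actual algorithm.

```python
import secrets

def stream_encrypt(key, data):
    """Toy stand-in for the symmetric cipher: XOR with a key-derived keystream.
    (MAJE4 itself is not public; any fast stream cipher plays this role.)"""
    keystream = bytes((key[i % len(key)] * 131 + i) % 256 for i in range(len(data)))
    return bytes(d ^ k for d, k in zip(data, keystream))

stream_decrypt = stream_encrypt   # XOR keystream is its own inverse

# Textbook RSA with demo primes, used only to wrap the session key.
P, Q, E = 61, 53, 17              # real deployments use 2048-bit or larger keys
N = P * Q
D = pow(E, -1, (P - 1) * (Q - 1))  # private exponent

def rsa_encrypt(m):
    return pow(m, E, N)

def rsa_decrypt(c):
    return pow(c, D, N)

def hybrid_encrypt(message):
    """Fresh random session key, RSA-wrapped per byte; bulk data via the fast cipher."""
    session_key = [secrets.randbelow(255) + 1 for _ in range(8)]
    wrapped_key = [rsa_encrypt(k) for k in session_key]       # slow RSA on key only
    ciphertext = stream_encrypt(bytes(session_key), message)  # fast cipher on the data
    return wrapped_key, ciphertext

def hybrid_decrypt(wrapped_key, ciphertext):
    session_key = bytes(rsa_decrypt(c) for c in wrapped_key)
    return stream_decrypt(session_key, ciphertext)
```

The design point the abstract makes is visible in the structure: the expensive asymmetric operation touches only a few key bytes, while the cheap symmetric cipher handles the file contents.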
Abstract:
Underwater target localization and tracking attract tremendous research interest due to the various impediments to the estimation task caused by the noisy ocean environment. This thesis envisages the implementation of a prototype automated system for underwater target localization, tracking and classification using passive listening buoy systems and target identification techniques. An autonomous three-buoy system has been developed and field trials have been conducted successfully. Inaccuracies in the localization results, due to changes in the environmental parameters, measurement errors and theoretical approximations, are refined using the Kalman filter approach. Simulation studies have been conducted for the tracking of targets under different scenarios, including maneuvering situations. This system can also be used for classifying unknown targets by extracting features of the noise emanations from the targets.
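The Kalman-filter refinement step can be illustrated with a scalar sketch that smooths a sequence of noisy position fixes. The noise variances `r` and `q` are assumed values, and a real tracker would carry a state vector with position and velocity (and handle maneuvers); this shows only the predict/update cycle.

```python
def kalman_1d(measurements, r=4.0, q=0.5):
    """Scalar Kalman filter over noisy position fixes.
    r: measurement noise variance, q: process noise variance (both assumed)."""
    x, p = measurements[0], 1.0      # initial state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                       # predict: uncertainty grows between fixes
        k = p / (p + r)              # Kalman gain balances model vs. measurement
        x += k * (z - x)             # update with the measurement residual
        p *= (1 - k)                 # posterior variance shrinks after the update
        estimates.append(x)
    return estimates
```

Fed position fixes scattered around a true location, the filtered estimates stay much closer to that location than the raw measurements, which is exactly the refinement role the thesis assigns to the filter.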
Abstract:
One thousand two hundred and sixty-four samples of individually quick-frozen (IQF) peeled and deveined raw shrimp and 914 samples of cooked ready-to-eat shrimp, produced from farm-raised black tiger shrimp (Penaeus monodon) obtained from a seafood unit working under the HACCP concept, were analysed for total aerobic plate count (APC), coliform count, Escherichia coli, coagulase-positive staphylococci and Salmonella. The overall bacteriological quality of the product was found to be good. Of the frozen raw shrimp, 96% of samples showed an APC below 10⁵, while 99% of the frozen cooked ready-to-eat samples showed an APC of less than 10⁴. The APC ranged from 1.0×10² to 4.2×10⁶ cfu/g in frozen raw shrimp and from 1.0×10² to 6.4×10⁴ cfu/g in the frozen cooked shrimp. The prevalences of coliforms in raw and cooked shrimp samples were 14.4% and 2.9% respectively. The coliform count ranged from 1.0×10¹ to 2.5×10³ cfu/g in raw products and from 1.0×10¹ to 1.8×10² cfu/g in cooked products. Although all the cooked shrimp samples were free of coagulase-positive staphylococci, E. coli and Salmonella, 1.0%, 2.0% and 0.1% of the frozen raw shrimp samples tested positive for coagulase-positive staphylococci, E. coli and Salmonella respectively. The Salmonella strain was identified as Salmonella Typhimurium. The results of the present study highlight the importance of implementing the HACCP system in the seafood industry to ensure consistent quality of frozen seafood.
Abstract:
The library professional in an academic institution has to anticipate the changing expectations of users and be flexible in adopting new skills and levels of awareness. Technology has drastically changed the way librarians define themselves and the way they think about their profession and the institutions they serve. In addition to technical and professional skills, a commitment to user-centred services, and skills for effective oral and written communication, they must have other skills, including business and management, teaching, and leadership. Ultimately, library and information professionals in academic libraries need to update their knowledge and skills in Information and Communication Technology (ICT), as ICT is a key success factor in enabling the library to perform its role as an information support system for society.
Abstract:
One of the key sectors identified by the Department of Industries, Government of Kerala, for the cluster development initiative is Handloom, which gives direct employment to over 50,000 people. Despite its age-old tradition and fame, the performance of the sector vis-à-vis power looms is not very rosy, owing to (i) competition from cheap power-loom cloth from other states, (ii) scarcity of quality yarn, (iii) price escalation of yarn, dyes, chemicals and other raw materials, (iv) the shrinking market for handlooms in Kerala, (v) non-demand-based production and the inadequacy of new designs, and (vi) inefficiencies in the system, particularly in the co-operative sector. A cluster-based approach has been adopted in the handloom sector with the objective of providing the necessary support mechanism to come out of the crisis the sector now faces. While four cluster schemes are being implemented in Kerala, it is under IHDS-CDP that the State got a sizeable number of clusters benefiting a large number of societies and weavers: 24 handloom clusters, bringing 152 handloom co-operative societies and over 19,800 handloom workers under the Programme. This research revisits the underlying rationale and context of the new direction and attempts to broadly analyse the growth trends under the influence of the cluster model adopted by the State under IHDS-CDP for the revival of the handloom sector, through a detailed study of the handloom co-operative societies in Kerala. If the handloom sector in Kerala can be revived using a cluster-based approach, it can be concluded that clusters are capable of taking the MSMEs in Kerala onto a 'high growth path'. The study is aimed at understanding how clusters emerge as an industrial organization suited to the current global structure of manufacturing.
Abstract:
Cerebral glioma is the most prevalent primary brain tumor; gliomas are classified broadly into low and high grades according to the degree of malignancy. High-grade gliomas are highly malignant, carry a poor prognosis, and patients survive less than eighteen months after diagnosis. Low-grade gliomas are slow growing, least malignant and have a better response to therapy. To date, histological grading is used as the standard technique for diagnosis, treatment planning and survival prediction. The main objective of this thesis is to propose novel methods for the automatic extraction of low- and high-grade glioma and other brain tissues, grade detection techniques for glioma using conventional magnetic resonance imaging (MRI) modalities, and 3D modelling of glioma from segmented tumor slices in order to assess the growth rate of tumors. Two new methods are developed for extracting tumor regions, of which the second, named the Adaptive Gray level Algebraic set Segmentation Algorithm (AGASA), can also extract white matter and grey matter from T1 FLAIR and T2-weighted images. The methods were validated against manual ground-truth images and showed promising results. The developed methods were compared with the widely used Fuzzy c-means clustering technique, and the robustness of the algorithm with respect to noise was checked for different noise levels. Image texture can provide significant information on the (ab)normality of tissue, and this thesis extends this idea to tumour texture grading and detection. Based on thresholds of discriminant first-order and Gray Level Co-occurrence Matrix (GLCM) based second-order statistical features, three feature sets were formulated and a decision system was developed for grade detection of glioma from the conventional T2-weighted MRI modality. A quantitative performance analysis using the ROC curve showed 99.03% accuracy in distinguishing between advanced (aggressive) and early-stage (non-aggressive) malignant glioma.
The developed brain texture analysis techniques can improve the physician’s ability to detect and analyse pathologies leading to a more reliable diagnosis and treatment of disease. The segmented tumors were also used for volumetric modelling of tumors which can provide an idea of the growth rate of tumor; this can be used for assessing response to therapy and patient prognosis.
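The GLCM-based second-order statistics underpinning the grade detection above can be sketched in a few lines. The 4-level gray quantisation and the single (1, 0) pixel offset are simplifying assumptions, and contrast, energy and homogeneity are representative Haralick-style features rather than the thesis's exact feature sets.

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalised gray-level co-occurrence matrix for one pixel offset.
    `image` is a 2-D list of integer gray levels in 0..levels-1."""
    rows, cols = len(image), len(image[0])
    m = [[0.0] * levels for _ in range(levels)]
    pairs = 0
    for y in range(rows):
        for x in range(cols):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < rows and 0 <= x2 < cols:
                m[image[y][x]][image[y2][x2]] += 1
                pairs += 1
    return [[v / pairs for v in row] for row in m]

def glcm_features(p):
    """Contrast, energy and homogeneity computed from a normalised GLCM."""
    n = len(p)
    contrast = sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
    energy = sum(p[i][j] ** 2 for i in range(n) for j in range(n))
    homogeneity = sum(p[i][j] / (1 + abs(i - j)) for i in range(n) for j in range(n))
    return contrast, energy, homogeneity
```

A uniform region yields zero contrast and maximal energy, while heterogeneous tumour texture pushes contrast up; thresholding such features per region is the basis of the decision system described for grade detection.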