880 results for Network Security System
Abstract:
Article published in the Journal of Information Security Research, March 2012.
Abstract:
Social networks welcome millions of users every day. The users of these networks, whether individuals or businesses, are directly affected by their dramatic expansion. Some have even developed a degree of dependence on social networks, to the point of transforming their everyday habits. This enthusiasm for social networks, however, is not without danger. Their expansion naturally fosters and serves the expansion of online attacks as well. Social networks are an ideal opportunity for offenders and fraudsters to harm users: they have access to millions of potential victims. The threats that social network users face from their friends are numerous, including cyberbullying, fraud, criminal harassment, threats, incitement to suicide, distribution of compromising content, promotion of hatred, and moral and physical harm. There is also one "very close friend" who can be very threatening on social networks: oneself. When users disclose too much information about themselves, they unintentionally attract scammers who are constantly on the lookout for prey. This thesis presents a new approach to protecting Facebook users. We built a platform based on two systems: Protect_U and Protect_UFF. The first system protects users from themselves by analysing the content of their profiles and offering them a set of recommendations aimed at reducing the amount of private information they publish. The second system aims to protect users from "friends" whose profiles show alarming symptoms (psychopaths, fraudsters, criminals, etc.), relying essentially on three main parameters: narcissism, lack of emotion, and aggressive behaviour.
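A minimal sketch of how a friend-profile risk score along the three parameters mentioned above (narcissism, lack of emotion, aggressive behaviour) might be combined; the weights, threshold, and indicator names are illustrative assumptions, not the actual Protect_UFF model.

```python
from dataclasses import dataclass

@dataclass
class FriendProfile:
    """Hypothetical per-friend indicators, each normalised to [0, 1]."""
    narcissism: float        # e.g. derived from self-referential posts
    lack_of_emotion: float   # e.g. derived from sentiment/affect analysis
    aggressiveness: float    # e.g. derived from offensive-language detection

def risk_score(p: FriendProfile, weights=(0.4, 0.3, 0.3)) -> float:
    """Weighted combination of the three indicators (illustrative weights)."""
    w_n, w_e, w_a = weights
    return w_n * p.narcissism + w_e * p.lack_of_emotion + w_a * p.aggressiveness

def flag_risky_friends(profiles: dict[str, FriendProfile],
                       threshold: float = 0.6) -> list[str]:
    """Return the friends whose combined score exceeds the threshold."""
    return [name for name, p in profiles.items() if risk_score(p) >= threshold]

if __name__ == "__main__":
    friends = {
        "alice": FriendProfile(0.2, 0.1, 0.1),
        "mallory": FriendProfile(0.8, 0.7, 0.9),
    }
    print(flag_risky_friends(friends))  # ['mallory']
```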
Abstract:
The large number of vehicles on the road network can lead to congestion and safety problems. The road users of interest here are truckers transporting goods, who may drive non-compliant vehicles or take prohibited routes to save time. The transport of dangerous goods is regulated, and access to certain locations, especially bridges and tunnels, is forbidden to them. To help enforce the laws in force, there is a roadside inspection system made up of fixed stations and mobile patrols. The strategic deployment of these control resources relies on knowledge of trucker behaviour, which we study through an analysis of their route choices. A route choice problem can be modelled using discrete choice theory, which is itself founded on random utility theory. Treating this type of problem with that theory is complex. The models we use are such that we must deal with correlation problems, since several routes are likely to share arcs. Moreover, since we work on the Quebec road network, the route choice is made from a set of routes whose number is potentially infinite if routes with loops are considered. Finally, studying the choices made by a human is not trivial. With the chosen route choice model, we can compute an expression for the probability that a route is taken by a trucker. We approached this behavioural study by first describing the collected data. The questionnaire used by the inspectors collects data about the truckers, their vehicles and the location of the inspection. Describing the observed data is an essential step, because it presents clearly to a potential analyst what is available for studying trucker behaviour. The data observed during an inspection constitute what we will call an observation. With the network attributes, it is possible to model the Quebec road network. A selection of certain attributes makes it possible to specify the utility function and, consequently, the function used to compute the probabilities of route choices by a trucker. It then becomes possible to study behaviour on the basis of observations. Those coming from the field do not currently give us enough information, and even with a well-specified model, parameter estimation is not possible. That estimation is based on the maximum likelihood method. We have the tool, but we lack the raw material, the observations, to continue the study. The idea is therefore to continue with synthetic observations. We perform estimations with complete observations and then, to get closer to real conditions, we continue with partial observations, which constitutes a major challenge. For the latter, we propose to use the results of (Bierlaire and Frejinger, 2008) combined with those of (Fosgerau, Frejinger and Karlström, 2013).
Although the observations we use are synthetic in nature, they lead to results such that we are able to make a concrete proposal that could help optimize the decisions of those responsible for roadside inspections. Indeed, we succeeded in estimating, on the real Quebec network and at a significance level of 0.05, the parameter values of a discrete route choice model, even when the observations are partial. These results give rise to recommendations on the changes to be made to the data collection questionnaire.
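As a reference point, a standard way to express the route choice probability under random utility theory is the multinomial logit model over a set of candidate paths, with a path size correction as one common remedy for the overlap (shared arcs) mentioned above; this is the textbook formulation, not necessarily the exact specification estimated in the thesis.

```latex
% Deterministic utility of path i for observation n: linear in parameters
% \beta over path attributes x_{in} (length, tolls, prohibited links, ...).
% Multinomial logit choice probability over choice set C_n, with a path size
% term PS_{in} correcting for overlap between paths sharing links a of
% length \ell_a (L_i is the total length of path i, \Gamma_i its link set):
\[
  P_n(i \mid C_n) =
  \frac{\exp\!\big(\beta^{\top} x_{in} + \beta_{PS}\,\ln PS_{in}\big)}
       {\sum_{j \in C_n} \exp\!\big(\beta^{\top} x_{jn} + \beta_{PS}\,\ln PS_{jn}\big)},
  \qquad
  PS_{in} = \sum_{a \in \Gamma_i} \frac{\ell_a}{L_i}\,
  \frac{1}{\sum_{j \in C_n} \mathbf{1}\{a \in \Gamma_j\}} .
\]
% The parameters are then estimated by maximum likelihood,
% \max_{\beta} \sum_n \ln P_n(i_n \mid C_n), over observed or synthetic choices i_n.
```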
Abstract:
The study is a close scrutiny of the process of investigation of offences in India, along with an analysis of the powers and functions of the investigating agency. Offences that are prejudicial to the sovereignty, integrity and security of the nation, or to its friendly relations with foreign states, are generally called offences against national security. Being prejudicial to the very existence of the nation and its legal system, offences against national security are heinous and grave. As early as 1971, the Law Commission of India had pointed out the need to treat offences relating to national security and their perpetrators on a totally different procedural footing, recommending that all offences coming under that category be brought under the purview of a single enactment so as to confront them effectively. The discrepancies in and inadequacies of the criminal justice system in India, insofar as they relate to the investigation of offences against national security, are examined and reforms are suggested. The quality of criminal justice is closely linked with the caliber of the prosecution system, and many of the acquittals in courts can be ascribed not only to poor investigation but also to poor quality of prosecution.
Abstract:
A new design of a dual-frequency dual-polarized square microstrip antenna fed along the diagonal, embedded with a square slot having three extended stubs for frequency tuning, is introduced. The proposed antenna was fabricated using a standard photolithographic method and tested using an HP 8510C Vector Network Analyser. The antenna is capable of generating dual resonant frequencies with mutually perpendicular polarizations and broad radiation pattern characteristics. Such dual-frequency designs find wide application in personal mobile handsets combining GSM and DCS 1800 modes, and in systems in which different frequencies are used for transmission and reception, such as personal satellite communications and cellular network systems.
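For orientation, the standard textbook design relations for a rectangular (square) microstrip patch are reproduced below; they describe the plain patch geometry only, not the slot-loaded, diagonally fed dual-frequency design reported in the paper.

```latex
% Effective permittivity for a patch of width W on a substrate of height h
% and relative permittivity \varepsilon_r (valid for W/h > 1):
\[
  \varepsilon_{\mathrm{eff}} = \frac{\varepsilon_r + 1}{2}
  + \frac{\varepsilon_r - 1}{2}\left(1 + 12\,\frac{h}{W}\right)^{-1/2}
\]
% Fringing-field length extension and resonant frequency of the dominant
% mode for a patch of physical length L (c is the speed of light):
\[
  \Delta L = 0.412\,h\,
  \frac{(\varepsilon_{\mathrm{eff}} + 0.3)(W/h + 0.264)}
       {(\varepsilon_{\mathrm{eff}} - 0.258)(W/h + 0.8)},
  \qquad
  f_r = \frac{c}{2\,(L + 2\Delta L)\sqrt{\varepsilon_{\mathrm{eff}}}} .
\]
```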
Abstract:
The present research problem is to study existing encryption methods and to develop a new technique that is superior in performance to existing techniques and, at the same time, can be readily incorporated into the communication channels of fault-tolerant hard real-time systems along with existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available today, and each has its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
In India, food security has meant the supply of food grains, and the medium has been the Public Distribution System. The Public Distribution System (PDS) is a rationing mechanism that entitles households to specified quantities of selected commodities at subsidized prices. The objectives of the PDS are maintaining price stability, rationing during times of scarcity, welfare of the poor, and keeping a check on private trade. Kerala has registered remarkable improvement in poverty reduction over the years among all social sections, including the scheduled caste and scheduled tribe population. As part of the structural adjustment intended to reduce public expenditure, the PDS was modified into the Revamped PDS (RPDS) in 1992 and later into the Targeted PDS (TPDS) in 1997, intended to target households on the basis of an income criterion, classifying people as Below Poverty Line (BPL) and Above Poverty Line (APL). The TPDS provides 25 kg of food grains per month through the Fair Price Shops, at Rs.3 per kg of rice/wheat for the BPL category and at Rs.8.90 and Rs.6.70 per kg for rice and wheat respectively for the APL category. Since the TPDS is intended to target the poor, the subsidy spent by the government on the scheme should benefit them, and they should naturally utilize the benefits by purchasing the food grains allotted under the scheme. Several studies have shown that the allotments under the TPDS are underutilized. Therefore, the extent of utilization of the TPDS in food grains, and how and why, remains a major hurdle in improving the structure and system of the PDS. With the livelihood of the tribal population under threat due to increasing degradation of resources, the targeting system ought to be effective among the tribal population. Therefore, the performance of the TPDS in food grains, in terms of utilization by the tribal population in Kerala, its impact, and the factors, if any, affecting proper utilization were taken as the research problem of this study. The study concentrated on the pattern of consumption of food grains by the tribal people, whether their hunger needs are met by the distribution of food grains through the TPDS, the extent to which the TPDS in food grains reduces their share of expenditure on food in total household expenditure, and the factors affecting the utilization of the TPDS in food grains by the tribal population. Going through the literature, it was noted that only a few studies have concentrated on the utilization of the TPDS in food grains among the tribal population in Kerala. The research design used in this study is descriptive in nature, but exploratory in some aspects. Idukki, Palakkad and Wayanad have more than 60% of the tribal population of the state. Within these three districts, 14 villages with scheduled tribe concentration were selected for the study, and 95 tribal colonies were selected from among the various tribal settlements. Primary data were collected from 1231 households within these tribal colonies. Analysis of data on the socio-economic factors of the tribal people, the pattern of food consumption, the extent of reduction in the share of expenditure on food in the household expenditure of the tribal people, and the impact of the TPDS on tribal families, together with the testing of hypotheses to find the relation/association of each of the six variables, using the data on BPL and APL categories of households separately, resulted in the following findings: six percent of the tribal families do not have ration cards; the average per capita consumption of food grains by the tribal people utilizing the TPDS meets 62% of their minimum requirement, while their per capita consumption of food grains is higher than the national average; a 63% deficiency in food grains would be felt by the tribal people in general if the TPDS were withdrawn, and the deficit for BPL tribal people would be 82%; the TPDS facilitates a reduction of 9.71% in food expenditure within the total household expenditure of the tribal people in general; the share of food to non-food expenditure is 55:45 among the BPL category of tribals and 40:60 among the APL; variables such as household income, number of members in the family and distance of the FPS from the tribal settlement influence the quantity of rice purchased by the tribal people from the Fair Price Shops; and household income and distance of the FPS from the tribal settlement influence the quantity of rice purchased by the tribal people from the open market. Rationing with differential pricing on phased allotments, rectification of errors in targeting and of anomalies in the norms and procedures for classifying tribal people as BPL/APL, exclusive income generation programmes for the tribal population, paddy cultivation on the landholdings possessed by the tribal people, a special drive for the allotment of ration cards to the tribal people, especially those belonging to the BPL category, mobile Fair Price Shops in tribal settlements, ensuring the quality of the food grains distributed through the TPDS, and the distribution of packed wheat flour instead of wheat through the Fair Price Shops are recommended to address the shortcomings and weaknesses of the TPDS vis-à-vis the tribal population in Kerala.
Abstract:
Biometrics deals with the physiological and behavioural characteristics of an individual to establish identity. Fingerprint-based authentication is the most advanced biometric authentication technology. The minutiae-based fingerprint identification method offers a reasonable identification rate. The minutiae feature map consists of about 70-100 minutia points, and matching accuracy drops as the size of the database grows. Hence it is desirable to make the fingerprint feature code as small as possible so that identification becomes easier. In this research, a novel global singularity-based fingerprint representation is proposed. The fingerprint baseline, which is the line between the distal and intermediate phalangeal joint lines in the fingerprint, is taken as the reference line. A polygon is formed from the singularities and the fingerprint baseline. The feature vectors comprise the polygonal angles, sides, area, type and the ridge counts between the singularities. A 100% recognition rate is achieved with this method. The method is compared with the conventional minutiae-based recognition method in terms of computation time, receiver operating characteristics (ROC) and feature vector length. Speech is a behavioural biometric modality and can be used to identify a speaker. In this work, MFCCs of text-dependent speech are computed and clustered using the k-means algorithm. A backpropagation-based artificial neural network is trained to identify the clustered speech code. The performance of the neural network classifier is compared with that of a VQ-based Euclidean minimum-distance classifier. Biometric systems that use a single modality are usually affected by problems such as noisy sensor data, non-universality and/or lack of distinctiveness of the biometric trait, unacceptable error rates, and spoof attacks. Multi-finger feature-level fusion based fingerprint recognition is developed and its performance is measured in terms of the ROC curve. Score-level fusion of the fingerprint and speech based recognition systems is performed, and 100% accuracy is achieved over a considerable range of matching thresholds.
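A minimal sketch of the kind of polygon-based feature vector described above, computed from singularity coordinates and a baseline; the feature ordering, the ridge-count inputs and the helper names are illustrative assumptions, not the exact representation proposed in the thesis.

```python
import math

def polygon_features(singularities, baseline_pts, ridge_counts):
    """Build a compact feature vector from fingerprint singularities.

    singularities : list of (x, y) core/delta points
    baseline_pts  : two (x, y) points defining the fingerprint baseline
    ridge_counts  : ridge counts between consecutive singularities
                    (assumed to come from a separate ridge-counting step)
    """
    # Polygon vertices: the singularities plus the two baseline endpoints.
    verts = list(singularities) + list(baseline_pts)
    n = len(verts)

    # Side lengths of the polygon, walking the vertices in order.
    sides = [math.dist(verts[i], verts[(i + 1) % n]) for i in range(n)]

    # Interior angle at each vertex, from the angle between adjacent edges.
    def angle(p_prev, p, p_next):
        a = (p_prev[0] - p[0], p_prev[1] - p[1])
        b = (p_next[0] - p[0], p_next[1] - p[1])
        cos = (a[0] * b[0] + a[1] * b[1]) / (math.hypot(*a) * math.hypot(*b))
        return math.degrees(math.acos(max(-1.0, min(1.0, cos))))
    angles = [angle(verts[i - 1], verts[i], verts[(i + 1) % n]) for i in range(n)]

    # Polygon area via the shoelace formula.
    area = 0.5 * abs(sum(verts[i][0] * verts[(i + 1) % n][1]
                         - verts[(i + 1) % n][0] * verts[i][1]
                         for i in range(n)))

    # Feature vector: angles, sides, area, vertex count (a proxy for "type"),
    # and the ridge counts between singularities.
    return angles + sides + [area, len(singularities)] + list(ridge_counts)
```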
Abstract:
In this paper we discuss our research in developing a general and systematic method for anomaly detection. The key ideas are to represent normal program behaviour using system call frequencies and to incorporate probabilistic techniques for classification to detect anomalies and intrusions. Using experiments on the sendmail system call data, we demonstrate that we can construct concise and accurate classifiers to detect anomalies. We provide an overview of the approach that we have implemented.
Abstract:
This paper discusses our research in developing a generalized and systematic method for anomaly detection. The key ideas are to represent normal program behaviour using system call frequencies and to incorporate probabilistic techniques for classification to detect anomalies and intrusions. Using experiments on the sendmail system call data, we demonstrate that concise and accurate classifiers can be constructed to detect anomalies. An overview of the approach that we have implemented is provided.
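A minimal sketch, in Python with scikit-learn, of the general approach described above: traces of system calls are turned into per-trace frequency vectors and a probabilistic classifier is trained to separate normal from anomalous behaviour. The toy traces and the choice of a multinomial naive Bayes model are illustrative assumptions, not necessarily the exact classifier used in the paper.

```python
from collections import Counter
from sklearn.feature_extraction import DictVectorizer
from sklearn.naive_bayes import MultinomialNB

def trace_to_frequencies(trace):
    """Map a list of system call names to a frequency dictionary."""
    return dict(Counter(trace))

# Toy traces standing in for sendmail system call sequences.
normal_traces = [
    ["open", "read", "read", "write", "close"],
    ["open", "read", "write", "close", "close"],
]
anomalous_traces = [
    ["open", "execve", "execve", "chmod", "socket"],
]

X_dicts = [trace_to_frequencies(t) for t in normal_traces + anomalous_traces]
y = [0] * len(normal_traces) + [1] * len(anomalous_traces)

# Turn frequency dictionaries into a numeric feature matrix.
vec = DictVectorizer(sparse=False)
X = vec.fit_transform(X_dicts)

# Probabilistic classifier over system call counts.
clf = MultinomialNB()
clf.fit(X, y)

# Classify a new trace: a probability close to 1 flags it as anomalous.
new_trace = ["open", "execve", "socket", "chmod"]
x_new = vec.transform([trace_to_frequencies(new_trace)])
print(clf.predict_proba(x_new)[0][1])
```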
Abstract:
Biometrics has become important in security applications. In comparison with many other biometric features, iris recognition has very high recognition accuracy because it depends on the iris, which remains stable throughout human life, and the probability of finding two identical irises is close to zero. The identification system consists of several stages, including the segmentation stage, which is the most critical one. Current segmentation methods still have limitations in localizing the iris because they assume a circular shape for the pupil. In this research, the Daugman method is used to investigate the segmentation techniques. Eyelid detection is included in this study as a part of the segmentation stage, to localize the iris accurately and remove unwanted areas that might otherwise be included. The obtained iris region is encoded using Haar wavelets to construct the iris code, which contains the most discriminating features of the iris pattern. The Hamming distance is used to compare iris templates in the recognition stage. The dataset used in the study is the UBIRIS database. A comparative study of different edge detection operators is performed, and it is observed that the Canny operator is best suited to extract most of the edges needed to generate the iris code for comparison. A recognition rate of 89% and a rejection rate of 95% are achieved.
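A minimal sketch of the matching step described above: two binary iris codes, together with occlusion masks (e.g. from eyelid detection), are compared by a normalised Hamming distance and accepted as a match if the distance falls below a threshold. The array shapes, mask convention and threshold value are illustrative assumptions rather than the exact parameters of the study.

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Normalised Hamming distance between two binary iris codes.

    code_a, code_b : boolean arrays of equal shape (the iris codes)
    mask_a, mask_b : boolean arrays, True where the bit is valid
                     (False where eyelids/eyelashes occlude the iris)
    """
    valid = mask_a & mask_b
    if valid.sum() == 0:
        return 1.0  # nothing comparable: treat as maximally distant
    disagreements = (code_a ^ code_b) & valid
    return disagreements.sum() / valid.sum()

def is_match(code_a, code_b, mask_a, mask_b, threshold=0.32):
    """Accept the pair as the same iris if the distance is below threshold."""
    return hamming_distance(code_a, code_b, mask_a, mask_b) < threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    code = rng.integers(0, 2, size=(20, 480)).astype(bool)
    mask = np.ones_like(code, dtype=bool)
    noisy = code.copy()
    flip = rng.random(code.shape) < 0.05        # simulate 5% bit noise
    noisy[flip] = ~noisy[flip]
    print(is_match(code, noisy, mask, mask))    # True: small distance, same iris
```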
Abstract:
Diagnosis of Hridroga (cardiac disorders) in Ayurveda requires the combination of many different types of data, including personal details, patient symptoms, patient histories, general examination results, Ashtavidha pareeksha results, etc. Computer-assisted decision support systems must be able to combine these data types into a seamless system. Intelligent agents, an approach that has been used chiefly in business applications, are used here for medical diagnosis. This paper presents a multi-agent system named "Distributed Ayurvedic Diagnosis and Therapy System for Hridroga using Agents" (DADTSHUA) and describes the architecture of the DADTSHUA model. The system uses mobile agents and an ontology for passing data through the network, so transport delay can be minimized. It is a system that will be very helpful to beginning physicians in eliminating ambiguity in diagnosis and therapy. The system is implemented using the Java Agent DEvelopment framework (JADE), a Java-compliant mobile agent platform from TILab.
Abstract:
Short-term load forecasting is one of the key inputs for optimizing the management of a power system. Almost 60-65% of the revenue expenditure of a distribution company goes to power purchase, and the cost of power depends on its source. Hence any optimization strategy involves optimizing the scheduling of power from the various sources. As the scheduling involves many technical and commercial considerations and constraints, the efficiency of scheduling depends on the accuracy of the load forecast. Load forecasting is a much-visited topic in the research world, and a number of papers using different techniques have already been presented. The forecast accuracy required for merit order dispatch decisions depends on the extent of the permissible variation in generation limits. For a system with a low load factor, the peak and the off-peak trough are prominent, and the forecast should identify these points accurately rather than merely minimizing the error in the energy content. In this paper an attempt is made to apply an Artificial Neural Network (ANN) with a supervised-learning-based approach to short-term load forecasting for a power system with a comparatively low load factor. Such power systems are common in tropical areas with a concentrated rainy season for a considerable part of the year.
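A minimal sketch of a supervised ANN set-up for short-term load forecasting of the kind described above: a small multilayer perceptron maps the previous day's hourly loads (plus a day-type flag) to the next day's hourly loads. The feature choice, network size, synthetic data and use of scikit-learn's MLPRegressor are illustrative assumptions, not the paper's exact model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in data: 201 days of hourly loads with a daily peak pattern.
hours = np.arange(24)
base = 100 + 40 * np.sin((hours - 6) / 24 * 2 * np.pi)   # daily load shape
days = base + rng.normal(0, 5, size=(201, 24))            # noisy history

# Supervised pairs: features = yesterday's 24 loads + weekend flag,
# targets = today's 24 loads.
weekend = (np.arange(200) % 7 >= 5).astype(float).reshape(-1, 1)
X = np.hstack([days[:-1], weekend])
y = days[1:]

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Small MLP trained with supervised learning (backpropagation).
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X_scaled, y)

# Forecast the next day from the most recent day's profile (weekday assumed).
x_next = scaler.transform(np.hstack([days[-1], [0.0]]).reshape(1, -1))
forecast = model.predict(x_next)[0]
print("Predicted peak load:", forecast.max())
```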
Abstract:
The Towed Array electronics is a multi-channel, simultaneous, real-time, high-speed data acquisition system. Since its assembly is highly manpower intensive, the costs of arrays are prohibitive, and therefore any attempt to reduce the manufacturing, assembly, testing and maintenance costs is a welcome proposition. The Network Based Towed Array is an innovative concept, and its implementation has remarkably simplified fabrication, assembly and testing and revolutionised the Towed Array scenario. The focus of this paper is to give a good insight into the reliability aspects of the Network Based Towed Array. A case study comparing the conventional array and the network based towed array is also presented.
Abstract:
Nowadays, email has become the most widely used means of communication in daily life. The main reason for using email is probably the convenience and speed with which it can be transmitted, irrespective of geographical distance. To improve security and efficiency, most email systems adopt PKI or IBE encryption schemes. However, both PKI and IBE encryption schemes have their own shortcomings and consequently bring security issues to email systems. This paper proposes a new secure email system based on IBE which combines fingerprint authentication and a proxy service for encryption and decryption.
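A minimal sketch of the flow suggested above: the user unlocks access with a fingerprint check and then delegates IBE encryption/decryption of mail to a proxy service. All class and method names here (FingerprintVerifier, IBEProxy, SecureMailClient) are hypothetical placeholders for whatever concrete IBE library and biometric matcher the proposed system actually uses.

```python
from dataclasses import dataclass

class FingerprintVerifier:
    """Stands in for a biometric matcher returning True on a successful match."""
    def __init__(self, enrolled_template: bytes):
        self.enrolled_template = enrolled_template

    def verify(self, probe_template: bytes) -> bool:
        # Real matchers compare features with a tolerance; byte equality is
        # only a stand-in for that comparison.
        return probe_template == self.enrolled_template

class IBEProxy:
    """Stands in for a proxy service holding IBE system parameters: it
    encrypts to an identity (the recipient's email address) and decrypts
    with the user's identity-based private key."""
    def encrypt(self, recipient_identity: str, plaintext: bytes) -> bytes:
        raise NotImplementedError("delegate to a concrete IBE scheme")

    def decrypt(self, identity_private_key: bytes, ciphertext: bytes) -> bytes:
        raise NotImplementedError("delegate to a concrete IBE scheme")

@dataclass
class SecureMailClient:
    verifier: FingerprintVerifier
    proxy: IBEProxy

    def send(self, probe: bytes, recipient: str, body: bytes) -> bytes:
        # Fingerprint authentication gates every use of the proxy.
        if not self.verifier.verify(probe):
            raise PermissionError("fingerprint authentication failed")
        return self.proxy.encrypt(recipient, body)

    def read(self, probe: bytes, private_key: bytes, ciphertext: bytes) -> bytes:
        if not self.verifier.verify(probe):
            raise PermissionError("fingerprint authentication failed")
        return self.proxy.decrypt(private_key, ciphertext)
```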