998 results for database security


Relevance:

100.00%

Publisher:

Relevance:

70.00%

Publisher:

Abstract:

This project is about retrieving data by range, without allowing the server to read it, when the database is stored on the server. Our goal is to build a database that allows the client to keep the stored data confidential even though all of it resides somewhere other than the client's own hard disk. Anything written to that disk can easily be read by a third party, who could sell the information or log into accounts to steal money or identities. We therefore need to protect the data from eavesdroppers by encrypting what is stored on the drive, so that only the holder of the key can read the information while everyone else sees only ciphertext. Accordingly, all data management must be done by the client; otherwise a malicious party could easily retrieve the data and misuse it. All the methods analysed here rely on encrypting data in transit. At the end of this project we analyse two theoretical and practical methods for building such databases and test them on three datasets with 10, 100 and 1000 queries. The aim of this work is to extract a trend that can inform future work based on this project.
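The abstract does not disclose the scheme used, but the general idea of serving range queries over data the server cannot read can be sketched as follows. This is a toy illustration only: the affine "order-preserving" key encoding and the hash-based XOR stream are assumptions for the sketch and are not cryptographically secure.

```python
import hashlib

# Toy sketch: the client encrypts values and stores them under a monotonic
# key encoding, so the server can answer range queries over ciphertext.

class Client:
    def __init__(self, secret: bytes):
        self.secret = secret
        self.a, self.b = 1000, 7  # toy monotonic-encoding parameters

    def _keystream(self, n: int, nonce: bytes) -> bytes:
        out, counter = b"", 0
        while len(out) < n:
            out += hashlib.sha256(
                self.secret + nonce + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:n]

    def encode_key(self, k: int) -> int:
        # strictly increasing, so the server can compare order
        # without learning the actual key values
        return self.a * k + self.b

    def crypt(self, data: bytes, nonce: bytes) -> bytes:
        # XOR stream: applying it twice recovers the plaintext
        return bytes(x ^ y for x, y in zip(data, self._keystream(len(data), nonce)))

class Server:
    def __init__(self):
        self.rows = []  # (encoded_key, nonce, ciphertext); never sees plaintext

    def insert(self, ek: int, nonce: bytes, ct: bytes):
        self.rows.append((ek, nonce, ct))

    def range_query(self, lo: int, hi: int):
        return [(n, c) for ek, n, c in self.rows if lo <= ek <= hi]

client, server = Client(b"client-only-secret"), Server()
for k, v in [(3, "alice"), (7, "bob"), (12, "carol")]:
    nonce = k.to_bytes(8, "big")  # toy nonce; real schemes use random nonces
    server.insert(client.encode_key(k), nonce, client.crypt(v.encode(), nonce))

# Retrieve all rows whose key lies in [5, 15]; decryption happens client-side.
hits = server.range_query(client.encode_key(5), client.encode_key(15))
names = sorted(client.crypt(c, n).decode() for n, c in hits)
print(names)  # ['bob', 'carol']
```

The key design point is that only the encoded keys, which the client alone can produce, carry the order information the server needs for range filtering.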

Relevance:

60.00%

Publisher:

Abstract:

Software security has recently taken on an increasingly important role. Software must be designed with security in mind from the very beginning. Ease of use should not take precedence over security, nor should a user's failure to read the instructions lead to a loss of security. An important part of software security is also the lawful use of the software; how unlawful use is prevented, however, is very difficult to implement in current systems. The purpose of this work was to examine Intellitel Communications Oy's messaging gateway, Intellitel Messaging Gateway, from a product-security perspective, to find any flaws in it, and also to fix them.

Relevance:

60.00%

Publisher:

Abstract:

This Applied Research Project seeks to answer the question: What requirements must be implemented in a relational database of information security controls for Units, Establishments or Bodies of the Portuguese Army? To answer this central question, it was subdivided into four derived questions: 1. What are the main dimensions of information security at the organisational level? 2. What are the main categories of information security at the organisational level? 3. What are the main information security controls to implement in a military organisation? 4. What functional requirements must be implemented in a database of information security controls for a military organisation? To address these research questions, this work is based on applied research, aiming to develop a practical application of the knowledge acquired, materialised as a database. As to its objective, the research is descriptive, explanatory and exploratory, since it aims to describe the main dimensions, categories and controls of information security, to explain which functional requirements must be implemented in a database of information security controls, and, finally, to carry out an exploratory study demonstrating the effectiveness of the database. The investigation follows the inductive method, proceeding from particular premises to general conclusions: from document analysis and interview surveys, the functional requirements to implement will be identified and then generalised to all Units, Establishments or Bodies of the Portuguese Army.
Regarding the method of procedure, the comparative method will be used to identify which international information security management standard is most suitable to record in the database. Finally, as noted above, the research techniques used will be the interview survey, identifying the requirements to implement, and document analysis, identifying the main dimensions, categories or controls needed in a database of information security controls. In the first phase of the research, document analysis identifies the main dimensions, categories and controls of information security to apply in the Units, Establishments or Bodies of the Portuguese Army, so as to contribute to the successful management of military information security. In addition, interviews with specialists in information security and Information Systems in military units will identify the functional requirements to implement in a database of information security controls for a military organisation. Finally, in a second phase, using the revised waterfall software development model, a relational database of information security controls will be developed in Microsoft Access for deployment in Units, Establishments or Bodies of the Portuguese Army. After the database is developed, an exploratory study will validate it, confirming whether it meets the needs for which it was developed.
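The abstract does not disclose the actual schema, but the dimension → category → control hierarchy it describes maps naturally onto a relational design. Below is a minimal sketch using SQLite for illustration (the real work targets Microsoft Access); all table and column names are assumptions.

```python
import sqlite3

# Hypothetical relational sketch of the dimension -> category -> control
# hierarchy; names are illustrative, not the project's actual schema.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dimension (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE category (
    id INTEGER PRIMARY KEY,
    dimension_id INTEGER NOT NULL REFERENCES dimension(id),
    name TEXT NOT NULL
);
CREATE TABLE control (
    id INTEGER PRIMARY KEY,
    category_id INTEGER NOT NULL REFERENCES category(id),
    name TEXT NOT NULL,
    implemented INTEGER NOT NULL DEFAULT 0  -- deployment status in a given unit
);
""")
con.execute("INSERT INTO dimension VALUES (1, 'Organisational')")
con.execute("INSERT INTO category VALUES (1, 1, 'Access control')")
con.execute("INSERT INTO control VALUES (1, 1, 'Review user access rights', 0)")

# Walk the hierarchy from control up to its dimension.
row = con.execute("""
    SELECT d.name, c.name, k.name
    FROM control k
    JOIN category c ON c.id = k.category_id
    JOIN dimension d ON d.id = c.dimension_id
""").fetchone()
print(row)  # ('Organisational', 'Access control', 'Review user access rights')
```

Foreign keys from control to category and from category to dimension keep each control traceable to the organisational dimension it serves, which is the functional requirement the derived questions circle around.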

Relevance:

60.00%

Publisher:

Abstract:

There is a constant concern for database security, and security is often compromised by the configuration of the connection to the database. This paper studies how to establish a secure connection to a database, explaining common mistakes and giving recommendations to keep the risk associated with this process low.
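The paper's checklist is not reproduced here, but the kind of connection-configuration mistakes it targets can be illustrated with a small audit helper. The rule set and field names below are assumptions for the sketch, not the paper's actual recommendations.

```python
# Hypothetical audit of a database connection configuration, flagging
# common risky settings; the rules and keys are illustrative assumptions.
def audit_connection_config(cfg: dict) -> list:
    """Return a list of findings for risky connection settings."""
    findings = []
    if not cfg.get("ssl", False):
        findings.append("connection is not encrypted (enable TLS/SSL)")
    if cfg.get("user") in ("root", "sa", "admin"):
        findings.append("administrative account used for application access")
    if not cfg.get("password"):
        findings.append("empty or missing password")
    if cfg.get("host") == "0.0.0.0":
        findings.append("database listens on all network interfaces")
    return findings

risky = {"host": "0.0.0.0", "user": "root", "password": "", "ssl": False}
for finding in audit_connection_config(risky):
    print("-", finding)
```

Each finding corresponds to a mistake class commonly cited for database connections: unencrypted transport, over-privileged accounts, weak credentials, and over-broad network exposure.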

Relevance:

40.00%

Publisher:

Relevance:

40.00%

Publisher:

Abstract:

Statistics can be useful when assessing the practical relevance of varying rules and practices on the involuntary loss of nationality across EU member states. Yet while much progress has been made within the EU in recent years with regard to the collection of comparable and reliable information on the acquisition of nationality, statistics on the loss of nationality are hard to find and, where available, difficult to interpret. In this comparative report, the authors explore the landscape of existing statistical data on loss of nationality in the European Union. They identify challenges to the existing methods of data collection and data interpretation and introduce an online statistical database, bringing together all existing statistical data on loss of nationality in the EU. These data are summarised in tables and graphs and discussed with reference to the relevant national and European sources. The authors conclude with recommendations to policy-makers on how to improve data collection in this area.

Relevance:

40.00%

Publisher:

Abstract:

With the recent explosion in the complexity and amount of digital multimedia data, there has been a huge impact on the operations of various organizations in distinct areas, such as government services, education, medical care, business, entertainment, etc. To satisfy the growing demand of multimedia data management systems, an integrated framework called DIMUSE is proposed and deployed for distributed multimedia applications to offer a full scope of multimedia related tools and provide appealing experiences for the users. This research mainly focuses on video database modeling and retrieval by addressing a set of core challenges. First, a comprehensive multimedia database modeling mechanism called Hierarchical Markov Model Mediator (HMMM) is proposed to model high dimensional media data including video objects, low-level visual/audio features, as well as historical access patterns and frequencies. The associated retrieval and ranking algorithms are designed to support not only the general queries, but also the complicated temporal event pattern queries. Second, system training and learning methodologies are incorporated such that user interests are mined efficiently to improve the retrieval performance. Third, video clustering techniques are proposed to continuously increase the searching speed and accuracy by architecting a more efficient multimedia database structure. A distributed video management and retrieval system is designed and implemented to demonstrate the overall performance. The proposed approach is further customized for a mobile-based video retrieval system to solve the perception subjectivity issue by considering individual user's profile. Moreover, to deal with security and privacy issues and concerns in distributed multimedia applications, DIMUSE also incorporates a practical framework called SMARXO, which supports multilevel multimedia security control. 
SMARXO efficiently combines role-based access control (RBAC), XML and an object-relational database management system (ORDBMS) to achieve proficient security control. A distributed multimedia management system named DMMManager (Distributed MultiMedia Manager) is developed with the proposed framework DIMUSE to support multimedia capturing, analysis, retrieval, authoring and presentation in a single framework.
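The RBAC component that SMARXO builds on can be reduced to a small sketch: users are assigned roles, roles carry permissions, and an access check walks that mapping. The roles, permissions and checking API below are illustrative assumptions, not SMARXO's actual interface.

```python
# Minimal RBAC sketch: access decisions go through roles, never directly
# from user to permission. All names here are hypothetical examples.
ROLE_PERMISSIONS = {
    "viewer": {"video:play"},
    "editor": {"video:play", "video:annotate"},
    "admin":  {"video:play", "video:annotate", "video:delete"},
}

USER_ROLES = {"alice": {"editor"}, "bob": {"viewer"}}

def is_allowed(user: str, permission: str) -> bool:
    """A user holds a permission iff one of their roles grants it."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_allowed("alice", "video:annotate"))  # True
print(is_allowed("bob", "video:delete"))      # False
```

The indirection through roles is what makes multilevel control manageable: revoking or upgrading a user means editing one role assignment rather than many per-object permissions.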

Relevance:

30.00%

Publisher:

Abstract:

With the growth of new technologies, using online tools has become part of everyday life. This has had a particular impact on researchers, as the data obtained from various experiments needs to be analysed, and knowledge of programming has become mandatory even for pure biologists. Hence, VTT developed a new tool, R Executables (REX), a web application designed to provide a graphical interface for biological data functions such as image analysis, gene expression data analysis, plotting, and disease and control studies, which employs R functions to produce results. REX offers an interactive application in which biologists can directly enter values and run the required analysis with a single click; the program processes the given data in the background and prints results rapidly. Owing to the growth of data and the load on the server, the interface developed problems with time consumption, a poor GUI, data storage issues, security, a minimally interactive user experience, and crashes with large amounts of data. This thesis describes the methods by which these problems were resolved, making REX a better application for the future. The old REX was developed using Python Django; for the new version, Vaadin was adopted. Vaadin is a Java framework for developing web applications whose programming language is essentially Java with new rich components, and it provides better security, better speed, and a good, interactive interface. In this thesis, a subset of REX functionality was selected, including IST bulk plotting and image segmentation, and implemented using Vaadin. I programmed 662 lines of code, with Vaadin as the front-end handler and R used for back-end data retrieval, computation and plotting. The application is optimised to allow further functionality to be migrated with ease from the old REX.
Future development will focus on including high-throughput screening functions along with gene expression database handling.

Relevance:

30.00%

Publisher:

Abstract:

In today's complicated computing environment, managing data has become the primary concern of all industries. Information security is the greatest challenge, and it has become essential to secure enterprise system resources such as databases and operating systems from attacks by unknown outsiders. Our approach plays a major role in detecting and managing vulnerabilities in complex computing systems. As a vulnerability scanner tool, it allows enterprises to assess two primary tiers through a single interface, providing a secure system that is also compatible with industry security compliance. It gives an overall view of the vulnerabilities in the database by scanning them automatically with minimum overhead, together with a detailed view of the risks involved and their corresponding ratings. Based on these priorities, an appropriate mitigation process can be implemented to ensure a secured system. The results show that our approach can effectively optimise the time and cost involved compared to existing systems.
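The prioritisation step described above, turning scan findings into risk ratings so that mitigation starts with the worst items, can be sketched as follows. The likelihood-times-impact scoring and the example findings are assumptions for illustration, not the paper's actual method or data.

```python
# Illustrative ranking of scan findings by a simple risk rating;
# the scoring scheme and findings are hypothetical.
findings = [
    {"issue": "default admin password",  "likelihood": 0.9, "impact": 10},
    {"issue": "unpatched DBMS version",  "likelihood": 0.6, "impact": 8},
    {"issue": "verbose error messages",  "likelihood": 0.8, "impact": 3},
]

for f in findings:
    f["risk"] = f["likelihood"] * f["impact"]  # likelihood x impact score

prioritised = sorted(findings, key=lambda f: f["risk"], reverse=True)
print([f["issue"] for f in prioritised])
# ['default admin password', 'unpatched DBMS version', 'verbose error messages']
```

Mitigation then proceeds down the sorted list, which is what lets the approach trade a small scanning overhead for reduced time and cost overall.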

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

The Continuous Plankton Recorder (CPR) survey, operated by the Sir Alister Hardy Foundation for Ocean Science (SAHFOS), is the largest plankton monitoring programme in the world and has spanned > 70 yr. The dataset contains information from ~200 000 samples, with over 2.3 million records of individual taxa. Here we outline the evolution of the CPR database through changes in technology, and how this has increased data access. Recent high-impact publications and the expanded role of CPR data in marine management demonstrate the usefulness of the dataset. We argue that solely supplying data to the research community is not sufficient in the current research climate; to promote wider use, additional tools need to be developed to provide visual representation and summary statistics. We outline 2 software visualisation tools, SAHFOS WinCPR and the digital CPR Atlas, which provide access to CPR data for both researchers and non-plankton specialists. We also describe future directions of the database, data policy and the development of visualisation tools. We believe that the approach at SAHFOS to increase data accessibility and provide new visualisation tools has enhanced awareness of the data and led to the financial security of the organisation; it also provides a good model of how long-term monitoring programmes can evolve to help secure their future.