816 results for Information security
Abstract:
The principal methods of compression and the different types of non-encrypted objects are described. An analysis is made of the results obtained from tests of the compression speed for these objects when passwords of different lengths are used. The size of the new file obtained after compression is also analyzed. Some evaluations are made of the methods and the objects used in the tests. In conclusion, some deductions are drawn, as well as recommendations for future work.
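The kind of measurement the abstract describes can be illustrated with a small benchmark harness. The sketch below is not the authors' tool: it compresses two sample objects, protects the result with a password-derived key, and records elapsed time and output size for passwords of different lengths. The sample objects, password lengths, zlib level and the PBKDF2/Fernet combination are all assumptions made for illustration.

```python
# A minimal benchmarking sketch (not the authors' tool): it times compression plus
# password-based protection of sample objects and records the size of the output
# for passwords of different lengths. Sample objects, password lengths, the zlib
# level and the PBKDF2/Fernet combination are assumptions for illustration.
import base64
import hashlib
import os
import time
import zlib

from cryptography.fernet import Fernet  # pip install cryptography


def compress_and_protect(data: bytes, password: str, level: int = 6):
    """Compress `data`, then encrypt it with a key derived from `password`."""
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    salt = os.urandom(16)
    # Derive a 32-byte key from the password (PBKDF2-HMAC-SHA256).
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    token = Fernet(base64.urlsafe_b64encode(key)).encrypt(compressed)
    elapsed = time.perf_counter() - start
    return elapsed, len(token)


if __name__ == "__main__":
    sample_objects = {
        "text": b"lorem ipsum " * 10_000,   # highly compressible object
        "random": os.urandom(120_000),      # essentially incompressible object
    }
    for name, blob in sample_objects.items():
        for pwd_len in (4, 8, 16, 32):
            password = "x" * pwd_len        # placeholder password
            seconds, size = compress_and_protect(blob, password)
            print(f"{name:6s} pwd_len={pwd_len:2d} "
                  f"time={seconds * 1000:7.2f} ms size={size} bytes")
```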
Abstract:
Evgeni Nikolov, Dimitrina Polimirova - The paper presents the current state of "cloud computing" and "cloud information attacks" in the light of computer virology and information security. The categories "possible cloud information attacks" and "successful cloud information attacks" are discussed. The architecture of "cloud computing" is examined, together with the main components that make up its infrastructure, namely "clients", "datacenters" and "distributed servers". The services offered by "cloud computing" (SaaS, HaaS and PaaS) are also discussed. The advantages and disadvantages of the components and services with respect to "cloud information attacks" are pointed out. An analysis is made of the current state of "cloud information attacks" on the territory of Bulgaria, the Balkan Peninsula and Southeastern Europe with respect to the components and the services. The results are presented in the form of 3D graphical objects. Finally, the corresponding conclusions and recommendations are drawn.
Abstract:
This paper deals with presentations of the Bulgarian Cultural and Historical Heritage in cyberspace. The study took place in the Information Management course with bachelor students in Information Technologies, Information Brokerage and Information Security at the University of Library Studies and Information Technologies. The students described about 300 different objects, cultural and historical, material and immaterial.
Abstract:
The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has mostly focused on the Internet and fixed networks, but the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitates the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is considered essential to documenting network incidents; however, in network topology spaces, location cannot be measured due to the absence of a 'distance metric'. Therefore, a novel solution was proposed to label the locations of nodes within network topology spaces and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollable recursive traffic, a new mechanism was introduced that overcomes this recursive process. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. These techniques were a starting point for further research and development that would result in equipping future ad hoc networks with forensic components to complement existing security mechanisms.
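The reporting idea described in the abstract can be illustrated with a toy example. The sketch below is not the dissertation's protocol: it merely shows a DHT-style assignment of incident reports to logger nodes via consistent hashing, with a hop counter standing in for the mechanism that prevents uncontrolled recursive traffic. Node names, the hash function and the hop limit are assumptions.

```python
# An illustrative sketch, not the dissertation's protocol: incident reports are
# assigned to logger nodes via a DHT-style consistent hash, and a hop counter
# stands in for the mechanism that prevents uncontrolled recursive traffic.
# Node names, the hash function and the hop limit are assumptions.
import hashlib
from bisect import bisect_right
from dataclasses import dataclass


def _hash(value: str) -> int:
    return int(hashlib.sha1(value.encode()).hexdigest(), 16)


@dataclass
class IncidentReport:
    incident_id: str
    origin: str
    hops: int = 0   # guards against uncontrolled recursion


class ReportingRing:
    """Consistent-hash ring mapping incident ids to responsible logger nodes."""

    def __init__(self, node_ids):
        self.ring = sorted((_hash(n), n) for n in node_ids)
        self.keys = [h for h, _ in self.ring]

    def responsible_node(self, incident_id: str) -> str:
        idx = bisect_right(self.keys, _hash(incident_id)) % len(self.ring)
        return self.ring[idx][1]

    def route(self, report: IncidentReport, max_hops: int = 3) -> str:
        if report.hops >= max_hops:
            raise RuntimeError("report dropped: hop limit reached")
        report.hops += 1
        return self.responsible_node(report.incident_id)


if __name__ == "__main__":
    ring = ReportingRing(["node-a", "node-b", "node-c", "node-d"])
    report = IncidentReport(incident_id="evt-2013-0042", origin="node-a")
    print("log this incident at:", ring.route(report))
```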
Abstract:
This work aims to understand how cloud computing enters the government IT and decision agenda, in the light of the multiple streams model, considering the current status of public IT policies, the dynamics of agenda setting for the area, the interface between the various institutions, and existing initiatives on the use of cloud computing in government. A qualitative study was conducted through interviews with two groups, one of policy makers and one of IT managers. As analysis techniques, the work made use of content analysis and document analysis, with some results presented as word clouds. Among the main results is the over-regulation of the area, with rules usually scattered across various federal government agencies, which hinders managers' performance. A lack of knowledge of standards, government programs, regulations and guidelines was identified; in particular, respondents highlighted a lack of understanding of the TI Maior Program, the perceived ineffectiveness of the National Broadband Plan, and the influence of the Internet regulatory framework (Marco Civil da Internet) as an element that can hold back advances in the use of cloud computing in the Brazilian government. Also noteworthy is the bureaucratization of the procurement of IT goods and services, which in many cases limits technological advances. Regarding the influence of the actors, it was not possible to identify the presence of a policy entrepreneur, and a lack of political force was noticed; the political stream was affected only by changes within the government. Fragmentation was a major factor in weakening the formation of the agenda around the theme. Information security was pointed out by the respondents as the main limitation, coupled with the lack of training of public servants. In terms of benefits, savings of resources are highlighted, followed by improved efficiency. Finally, the discussion about cloud computing needs to advance within the public sphere, whereas international experience is already far more advanced, framing cloud computing as an element responsible for improving processes and services and for saving public resources.
Abstract:
Information constitutes one of the most valuable strategic assets of an organization. However, the organizational environment in which it is embedded is very complex and heterogeneous, making issues related to information technology (IT) governance and information security increasingly relevant. Academic studies and market surveys indicate that most incidents involving information assets originate in the behavior of people within the organization itself rather than in external attacks. To promote a security culture among users and ensure the protection of information in its properties of confidentiality, integrity and availability, organizations must establish an Information Security Policy (PSI). This policy formalizes the guidelines for the security of corporate information resources, so that asset vulnerabilities are not exploited by threats that could bring negative consequences to the business. For the PSI to be effective, however, users must be willing to accept and follow its procedures and security standards. In this context, the present study investigates which extrinsic and intrinsic motivators affect users' willingness to comply with the organization's security policies. The theoretical framework addresses issues related to IT governance, information security, deterrence theory, motivation and pro-social behavior. A theoretical model was created based on the studies of Herath and Rao (2009) and D'Arcy, Hovav and Galletta (2009), which are grounded in General Deterrence Theory and propose the following factors influencing compliance with the policy: Severity of Punishment, Certainty of Detection, Peer Behaviour, Normative Beliefs, Perceived Effectiveness and Moral Commitment. The research used a quantitative, descriptive approach. Data were collected through a questionnaire with 18 variables on a five-point Likert scale representing the influencing factors proposed by the theory. The sample consisted of 391 students entering courses at the Center for Applied Social Sciences of the Universidade Federal do Rio Grande do Norte. For data analysis, Exploratory Factor Analysis, hierarchical and non-hierarchical Cluster Analysis, Logistic Regression and Multiple Linear Regression were used. Among the main results, the Severity of Punishment factor contributes the most to the theoretical model and also drives the division of the sample into more and less compliance-prone users. As a practical implication, the research model allows organizations to identify less compliance-prone users, direct awareness and training actions at them, and write more effective security policies.
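As an illustration of the kind of analysis described (not the study's actual data or code), the sketch below clusters synthetic five-point Likert scores for the six named factors into two groups and fits a logistic regression to separate them. The factor labels used as variable names, the synthetic data and the scikit-learn settings are assumptions.

```python
# An illustrative sketch, not the study's actual data or analysis: synthetic
# five-point Likert scores for the six factors named in the abstract are split
# into two groups and a logistic regression separates them. Factor labels,
# synthetic data and scikit-learn settings are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
factors = ["severity_of_punishment", "certainty_of_detection", "peer_behaviour",
           "normative_beliefs", "perceived_effectiveness", "moral_commitment"]

# 391 respondents x 6 factor scores on a 1-5 Likert scale (synthetic).
X = rng.integers(1, 6, size=(391, len(factors)))

# Two-group split analogous to "more predisposed" vs "less prone" users.
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Which factors drive the split?
model = LogisticRegression(max_iter=1000).fit(X, groups)
for name, coef in zip(factors, model.coef_[0]):
    print(f"{name:24s} coefficient = {coef:+.3f}")
```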
Abstract:
This research seeks to understand how the problem of information security is treated in Brazil through public thematization, and how it can affect political and economic aspects of both Brazilian companies and the government, using a case study of the leak of National Security Agency documents by Snowden. To this end, a case study of the coverage by sites, blogs and news portals was carried out from the perspective of the evidential paradigm and of studies of the concepts of movement and event. We are interested in examining how the media handles the information security topic and its impact on national and international political relations. The subject matter is considered the largest data leak in the history of the NSA, which ranks as the world's largest intelligence agency. The leak caused great repercussions in Brazil, since it revealed that the country was the most closely watched by the United States of America, second only to the USA itself. The consequences were considerable tension between Brazil and the US and a public discussion about privacy and freedom on the Internet. The research analyzed 256 publications released by Brazilian media outlets in digital media between June and July 2013.
Abstract:
This project is about retrieving data within a range without allowing the server to read it, when the database is stored on the server. Our goal is to build a database that allows the client to maintain the confidentiality of the stored data, even though all the data are stored in a location other than the client's hard disk. Information written on a remote disk could be read by another party, who could sell it or log into accounts to steal money or identities, so the data must be hidden from eavesdroppers and other parties. To achieve this, the data stored on the disk must be encrypted, so that only the possessor of the key can read the information, while everyone else can only read ciphertext. Consequently, all data management must be done by the client; otherwise a malicious party could easily retrieve the data and use it for malicious purposes. All the methods analysed here rely on encrypting the data in transit. At the end of this project we analyse two methods, theoretically and practically, for the creation of such databases and test them with 3 datasets and with 10, 100 and 1000 queries. The goal of this work is to identify a trend that can be useful for future work based on this project.
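One simple way to realize the client-held-key, server-blind range retrieval the abstract describes is bucketization. The sketch below illustrates that general approach and is not necessarily one of the two methods evaluated in the project; the bucket width, the Fernet cipher and the class names are assumptions, and the plaintext bucket ids deliberately leak coarse ordering information in exchange for efficient range queries.

```python
# A minimal bucketization sketch of one common approach to server-blind range
# retrieval; it is not necessarily one of the two methods evaluated in the
# project. The bucket width, the Fernet cipher and the class names are
# assumptions, and the plaintext bucket ids deliberately leak coarse ordering.
from collections import defaultdict

from cryptography.fernet import Fernet  # pip install cryptography

BUCKET_WIDTH = 10   # coarser buckets leak less but cost more bandwidth


class UntrustedServer:
    """Stores only ciphertexts, indexed by a coarse bucket id."""

    def __init__(self):
        self.buckets = defaultdict(list)

    def insert(self, bucket_id: int, ciphertext: bytes):
        self.buckets[bucket_id].append(ciphertext)

    def fetch(self, lo_bucket: int, hi_bucket: int):
        for b in range(lo_bucket, hi_bucket + 1):
            yield from self.buckets[b]


class Client:
    """Holds the key; encrypts on write, filters decrypted candidates on read."""

    def __init__(self, server: UntrustedServer):
        self.fernet = Fernet(Fernet.generate_key())
        self.server = server

    def put(self, value: int):
        self.server.insert(value // BUCKET_WIDTH,
                           self.fernet.encrypt(str(value).encode()))

    def range_query(self, lo: int, hi: int):
        candidates = self.server.fetch(lo // BUCKET_WIDTH, hi // BUCKET_WIDTH)
        values = (int(self.fernet.decrypt(c)) for c in candidates)
        return sorted(v for v in values if lo <= v <= hi)


if __name__ == "__main__":
    server = UntrustedServer()
    client = Client(server)
    for v in (3, 17, 25, 42, 58, 99):
        client.put(v)
    print(client.range_query(10, 60))   # -> [17, 25, 42, 58]
```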
Abstract:
This thesis addresses a topic that has become increasingly interesting, especially in recent years: the firmware and hardware integrity of a system. Nowadays millions of people rely completely on their systems, entrusting them with large amounts of personal and other data, and many rely on modern antivirus products which, however, are unable to detect and handle attacks that involve firmware alteration. Several attacks of this kind are presented in order to show how important this aspect of security is, and several projects considered interesting are discussed. Based on the research carried out, the design and implementation of a software tool able to detect hardware and firmware alterations in a system is then presented.
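A common building block for detecting firmware alteration is comparing cryptographic digests against a trusted baseline. The sketch below illustrates that idea and is not the thesis's tool: it hashes the files under an assumed firmware directory, stores a baseline, and later flags files whose digest changes. The scanned path and baseline file name are assumptions.

```python
# A baseline-hashing sketch, not the thesis's tool: firmware image files under an
# assumed directory are hashed once to build a trusted baseline, and later scans
# flag any file whose digest changes. The scanned path and baseline file name are
# assumptions.
import hashlib
import json
from pathlib import Path

FIRMWARE_ROOT = Path("/lib/firmware")           # assumption: directory to monitor
BASELINE_FILE = Path("firmware_baseline.json")  # assumption: where to keep digests


def digest(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def scan() -> dict:
    return {str(p): digest(p) for p in FIRMWARE_ROOT.rglob("*") if p.is_file()}


def build_baseline():
    BASELINE_FILE.write_text(json.dumps(scan(), indent=2))


def check_integrity():
    baseline = json.loads(BASELINE_FILE.read_text())
    current = scan()
    for path, expected in baseline.items():
        if current.get(path) != expected:
            print(f"ALTERED or MISSING: {path}")
    for path in current.keys() - baseline.keys():
        print(f"NEW (not in baseline): {path}")


if __name__ == "__main__":
    if BASELINE_FILE.exists():
        check_integrity()
    else:
        build_baseline()
```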
Abstract:
Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on the specific features of the malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to the WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks). In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of them during execution.
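The whitelisting idea behind KQguard can be illustrated in user space, although the real system operates inside the kernel. The sketch below learns a set of legitimate (callback, module) pairs during a training phase and rejects any request not seen in training; the request fields and example callback names are assumptions made for illustration.

```python
# An illustrative user-space sketch of the whitelisting idea behind KQguard; the
# real system operates inside the kernel. Legitimate kernel-queue requests are
# learned during a training phase and any request whose (callback, module) pair
# was not observed is rejected. Request fields and example names are assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class KQRequest:
    callback: str   # e.g. symbol name of the callback function
    module: str     # owner of the callback


class KQGuard:
    def __init__(self):
        self.whitelist = set()

    def train(self, requests):
        """Learning phase: record requests observed under benign workloads."""
        self.whitelist.update(requests)

    def validate(self, request: KQRequest) -> bool:
        """Runtime phase: accept only requests seen during training."""
        return request in self.whitelist


if __name__ == "__main__":
    guard = KQGuard()
    guard.train([KQRequest("flush_to_ldisc", "tty"),
                 KQRequest("vmstat_update", "mm")])

    benign = KQRequest("vmstat_update", "mm")
    suspicious = KQRequest("hidden_callback", "unknown_driver")
    print("benign accepted:    ", guard.validate(benign))      # True
    print("suspicious accepted:", guard.validate(suspicious))  # False
```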
Abstract:
After years of deliberation, the EU Commission sped up the reform process of a common EU digital policy considerably in 2015 by launching the EU digital single market strategy. In particular, two core initiatives of the strategy were agreed upon: the General Data Protection Regulation and the Network and Information Security (NIS) Directive. A new initiative was additionally launched addressing the role of online platforms. This paper focuses on the platform privacy rationale behind the data protection legislation, primarily based on the proposal for a new EU-wide General Data Protection Regulation. We analyse the rationale of the legislation from an information systems perspective to understand the role user data plays in creating platforms that we identify as "processing silos". Generative digital infrastructure theories are used to explain the innovative mechanisms that are thought to govern digitalization and the successful business models affected by it. We foresee continued judicial data protection challenges with the proposed Regulation as the adoption of the "Internet of Things" continues. The findings of this paper illustrate that many of the existing issues can be addressed through legislation from a platform perspective. We conclude by proposing three modifications to the governing rationale, which would improve not only platform privacy for the data subject, but also entrepreneurial efforts in developing intelligent service platforms. The first modification is aimed at improving service differentiation on platforms by lessening the ability of incumbent global actors to lock in the user base to their service or platform. The second modification posits limiting the current unwanted tracking ability of syndicates by separating authentication and data store services from any processing entity. Thirdly, we propose a change in how security and data protection policies are reviewed, suggesting a third-party auditing procedure.
Abstract:
The benefits of information technology (IT) have become more perceptible over the last decades. Both IT and business managers place subjects such as governance, IT-business alignment and information security among their top priorities. Governance, specifically, is often approached by managers from a technical angle that emphasizes protection against intrusions, antivirus systems, access controls and other technical issues. IT risk management is commonly treated under this approach, that is, its importance is reduced and it is delegated to IT departments. Over the last two decades, a new IT risk management perspective has emerged, bringing a holistic view of IT risk to the organization. According to this new perspective, the strategy formulation process should take IT risks into account. With the growing dependence of most organizations on IT, the need for a better comprehension of the subject becomes clearer. This work presents a study in three public organizations of the State of Pernambuco that investigates how those organizations manage their IT risks. Structured interviews were conducted with IT managers and later analyzed and compared with conceptual categories found in the literature. The results show that the IT risk culture and IT governance are weakly understood and implemented in those organizations, where no IT risk methodology is formally defined or executed. In addition, most of the practices suggested in the literature were found, even without alignment with an IT risk management process.