820 results for information security management assessment
Abstract:
The benefits of information technology (IT) have become more apparent over the last decades. Both IT and business managers rank subjects such as governance, IT-business alignment, and information security among their top priorities. Regarding governance specifically, managers tend to approach it technically, emphasizing protection against intrusions, antivirus systems, access controls, and other technical issues. IT risk management is commonly treated under this approach; that is, its importance is reduced and it is delegated to IT departments. Over the last two decades, a new IT risk management perspective has emerged, bringing a holistic view of IT risk to the organization. According to this perspective, the strategy formulation process should take IT risks into account. As most organizations grow more dependent on IT, the need for a better understanding of the subject becomes clearer. This work presents a study of three public organizations of the State of Pernambuco that investigates how those organizations manage their IT risks. Structured interviews were conducted with IT managers and then analyzed and compared against conceptual categories found in the literature. The results show that IT risk culture and IT governance are weakly understood and implemented in those organizations, where no IT risk methodology is formally defined or executed. Nevertheless, most of the practices suggested in the literature were found, even without alignment with an IT risk management process.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Security Onion is a Network Security Monitoring (NSM) platform that provides multiple Intrusion Detection Systems (IDS), including Host IDS (HIDS) and Network IDS (NIDS). Many types of data can be acquired using Security Onion for analysis, including data related to hosts, networks, sessions, assets, alerts and protocols. Security Onion can be implemented as a standalone deployment, with server and sensor included, or with a master server and multiple sensors, allowing the system to be scaled as required. Many interfaces and tools are available for managing the system and analysing data, such as Sguil, Snorby, Squert and Enterprise Log Search and Archive (ELSA). These interfaces can be used to analyse alerts and captured events, which can then be exported for further analysis in Network Forensic Analysis Tools (NFAT) such as NetworkMiner, CapME or Xplico. The platform also provides various methods of management, such as Secure Shell (SSH) for managing the server and sensors, and web client remote access. All of this, together with the ability to replay and analyse example malicious traffic, makes Security Onion a suitable low-cost alternative for network security monitoring. In this paper, we review Security Onion's features and functionality in terms of types of data, configuration, interfaces, tools and system management.
Abstract:
The main objective of this study was to create a calculation model for the cost and profit effects of identity and access management systems. The model was intended to serve as a tool for system vendors, helping them convince potential customers of the system's cost benefits during the sales process. Very few comparable models measuring cost effects have been built, and the model constructed in this study differs from them in accounting for both the vendor's labour costs and information security risks. To verify its validity, the resulting calculation model was tested in two companies that use a centralized identity management system. Testing was performed by entering each company's data into the model and comparing the model's results with the cost effects the company had observed. Based on both the literature review and the model testing, it can be concluded that the most significant cost drivers of the identity management process are the working time spent on creating and modifying identities and the loss of employee productivity these operations cause during the process. The study indicates that centralized identity management systems can achieve significant cost savings in identity management operations, licence costs and IT service costs. Not all of these savings are tangible, however; some relate, for example, to improved work efficiency enabled by the system. Beyond cost effects, identity management systems offer other benefits whose monetary value is very difficult to calculate. The challenges in using the model thus lie in identifying and valuing tangible and indirect cost savings, and in the difficulty of assessing the total benefits of the investment.
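The cost model described above can be illustrated with a minimal sketch. All parameter names and figures below are hypothetical illustrations, not values or formulas from the thesis: annual tangible savings are estimated from working time saved on identity creation and changes, plus reclaimed licence and IT service costs.

```python
# Minimal sketch of an identity-management cost-savings estimate.
# All parameters are hypothetical illustrations, not values from the study.

def annual_iam_savings(
    identity_events_per_year: int,   # identity creations + modifications
    minutes_saved_per_event: float,  # manual vs. centralized provisioning
    hourly_labour_cost: float,       # fully loaded cost of admin time
    licence_savings: float,          # unused licences reclaimed per year
    it_service_savings: float,       # fewer password-reset tickets etc.
) -> float:
    """Tangible yearly savings from a centralized IAM system."""
    labour_savings = (identity_events_per_year
                      * (minutes_saved_per_event / 60)
                      * hourly_labour_cost)
    return labour_savings + licence_savings + it_service_savings

savings = annual_iam_savings(
    identity_events_per_year=2000,
    minutes_saved_per_event=30,
    hourly_labour_cost=50.0,
    licence_savings=10000.0,
    it_service_savings=5000.0,
)
print(savings)  # 2000 * 0.5 h * 50 = 50000 labour, + 15000 = 65000.0
```

As the abstract notes, such a sketch captures only the tangible side; indirect effects such as improved work efficiency resist this kind of arithmetic.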
Abstract:
Business Process Management (BPM) has been a practice adopted by public and private organizations. BPM enables effective identification of the needs and information required to support the operationalization or automation of business processes. Since organizations have increasingly needed to outsource this service, the objective of this work was to propose a measurement model for contracting BPM services. To meet this objective, the conceptual model was built on the premise that managing a BPM contract must provide criteria to measure the demand (or service), assess the quality of the services provided, and assess the quality of the product received. The study adopted documentary research and a systematic review as data collection instruments. Based on the objectives and research questions, search strings were identified, and search sources and inclusion and exclusion criteria were defined. All selected works were read and analyzed, and a mind map was used to consolidate the results. GQM (Goal, Question, Metric) was used to elaborate the measurements, and a case study approach was adopted. Part of the proposed measurements were applied to 13 real-world business process models in order to verify their coherence and compare results. An interview with a business process modeling expert was conducted to evaluate the results obtained; in this expert's perception, most of the metrics proposed by the research are suited to market reality, considering the context of outsourcing this service.
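As a hypothetical illustration of a GQM-style measurement, a size metric for a business process model can be computed by counting its elements. The metric, weights and element names below are assumptions for illustration, not metrics proposed in the work:

```python
# Hypothetical GQM-style metric for a business process model.
# Goal: assess modelling effort; Question: how large is the model?
# Metric: weighted count of activities, gateways and events
# (weights are illustrative assumptions).

from collections import Counter

def process_size_metric(elements: list[str]) -> float:
    """Weighted element count; gateways weigh more as they add branching."""
    weights = {"activity": 1.0, "gateway": 2.0, "event": 0.5}
    counts = Counter(elements)
    return sum(weights.get(kind, 0.0) * n for kind, n in counts.items())

model = ["event", "activity", "gateway", "activity", "activity", "event"]
print(process_size_metric(model))  # 3*1.0 + 1*2.0 + 2*0.5 = 6.0
```

A contracting scenario would compare such metric values across the delivered process models to price and audit the outsourced service.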
Abstract:
Understanding and predicting patterns of distribution and abundance of marine resources is important for conservation and management purposes in small-scale artisanal fisheries and industrial fisheries worldwide. The goose barnacle (Pollicipes pollicipes) is an important shellfish resource and its distribution is closely related to wave exposure at different spatial scales. We modelled the abundance (percent coverage) of P. pollicipes as a function of a simple wave exposure index based on fetch estimates from digitized coastlines at different spatial scales. The model accounted for 47.5% of the explained deviance and indicated that barnacle abundance increases non-linearly with wave exposure at both the smallest (metres) and largest (kilometres) spatial scales considered in this study. Distribution maps were predicted for the study region in SW Portugal. Our study suggests that the relationship between fetch-based exposure indices and P. pollicipes percent cover may be used as a simple tool for providing stakeholders with information on barnacle distribution patterns. This information may improve assessment of harvesting grounds and the dimension of exploitable areas, aiding management plans and supporting decision making on conservation, harvesting pressure and surveillance strategies for this highly appreciated and socio-economically important marine resource.
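A fetch-based exposure index of the kind used above can be sketched as follows. The sector layout, cap distance and values below are assumptions for illustration, not the study's actual parameterization: for each compass sector, the open-water distance to the nearest coastline is measured, capped, and the distances are summed.

```python
# Minimal sketch of a fetch-based wave exposure index.
# For each compass sector, fetch is the open-water distance (km) from the
# site to the nearest coastline, capped at a maximum; the index is the sum.
# Sector count, cap and fetch values are illustrative assumptions.

def exposure_index(fetch_km: list[float], cap_km: float = 100.0) -> float:
    """Sum of per-sector fetch distances, each capped at cap_km."""
    return sum(min(f, cap_km) for f in fetch_km)

# 8 compass sectors; large values mean open ocean in that direction.
site_fetches = [500.0, 300.0, 12.0, 0.5, 0.2, 1.0, 40.0, 250.0]
print(exposure_index(site_fetches))  # 100 + 100 + 12 + 0.5 + 0.2 + 1 + 40 + 100
```

In the study, an index like this would be the predictor in a regression of P. pollicipes percent cover, computed from digitized coastlines at several spatial scales.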
Abstract:
In database applications, access control security layers are mostly built from tools provided by vendors of database management systems and deployed on the same servers that contain the data to be protected. This solution has several drawbacks. Among them we emphasize: 1) if policies are complex, their enforcement can degrade the performance of database servers; 2) when modifications to the established policies imply modifications to the business logic (usually deployed at the client side), there is no option but to modify the business logic accordingly; and 3) malicious users can issue CRUD expressions systematically against the DBMS, hoping to identify a security gap. To overcome these drawbacks, in this paper we propose an access control stack characterized by three features: most of the mechanisms are deployed at the client side; whenever security policies evolve, the security mechanisms are automatically updated at runtime; and client-side applications do not handle CRUD expressions directly. We also present an implementation of the proposed stack to prove its feasibility. This paper thus presents a new approach to enforcing access control in database applications, expecting to contribute positively to the state of the art in the field.
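The key idea, client-side applications invoking vetted operations instead of issuing raw CRUD expressions, can be sketched as follows. Class, method and policy names are hypothetical illustrations, not the paper's API:

```python
# Sketch of a client-side access-control layer: applications call named,
# policy-checked operations; they never build CRUD expressions themselves.
# Names and the in-memory policy table are illustrative assumptions.

class AccessControlStack:
    def __init__(self, role: str):
        self.role = role
        # Policy: which roles may run which named operations.
        # In the paper's approach, policies can evolve at runtime.
        self.policy = {"read_orders": {"clerk", "manager"},
                       "delete_order": {"manager"}}

    def update_policy(self, operation: str, roles: set[str]) -> None:
        self.policy[operation] = roles  # runtime policy update

    def execute(self, operation: str, **params):
        if self.role not in self.policy.get(operation, set()):
            raise PermissionError(f"{self.role} may not run {operation}")
        # A real implementation would run a pre-built, parameterized
        # statement against the DBMS; here we just echo the request.
        return (operation, params)

stack = AccessControlStack(role="clerk")
print(stack.execute("read_orders", customer_id=42))  # allowed for clerk
# stack.execute("delete_order", order_id=7) would raise PermissionError
```

Because the application only sees named operations, a compromised client cannot probe the DBMS with arbitrary CRUD expressions, and a policy change only requires updating the table, not the business logic.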
Abstract:
This thesis presents security issues and vulnerabilities in home and small-office local area networks that can be used in cyber-attacks. Previous research exists on single vulnerabilities and attack vectors, but few papers present full-scale attack examples against a LAN. The thesis first categorizes different security threats and later demonstrates, by example, methods to launch the attacks. Offensive security and penetration testing are used as the research methods. As a result, an attack is conducted using vulnerabilities in WLAN, the ARP protocol and the browser, as well as social engineering methods; in the end, reverse shell access is gained to the target machine. Ready-made tools are used in the attack and their inner workings are described. Finally, prevention methods against these attacks are presented.
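One prevention method relevant to the ARP-based attack above can be sketched as a simple detector. This is a toy illustration, not a tool from the thesis: it flags any IP address whose MAC mapping changes between observations, a common symptom of ARP cache poisoning.

```python
# Toy ARP-spoofing detector: warn when an IP's MAC address changes,
# which can indicate ARP cache poisoning on the LAN.
# The observation list is a hypothetical stand-in for parsed ARP traffic.

def detect_arp_changes(observations: list[tuple[str, str]]) -> list[str]:
    """Return alerts for IPs whose observed MAC mapping changed."""
    table: dict[str, str] = {}
    alerts: list[str] = []
    for ip, mac in observations:
        if ip in table and table[ip] != mac:
            alerts.append(f"{ip} changed from {table[ip]} to {mac}")
        table[ip] = mac
    return alerts

traffic = [("192.168.1.1", "aa:bb:cc:00:00:01"),
           ("192.168.1.10", "aa:bb:cc:00:00:02"),
           ("192.168.1.1", "de:ad:be:ef:00:99")]  # gateway MAC changed
print(detect_arp_changes(traffic))
```

Real defenses discussed in this context typically go further, e.g. static ARP entries or switch-level dynamic ARP inspection, but the principle of watching for mapping changes is the same.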
Abstract:
In the subprime lending market, Ameriquest Mortgage Company is one of the leading lenders, widely known for its advertising slogan "proud sponsor of the American dream." Yet in 2006, an investigation into unlawful mortgage lending practices and the subsequent $325 million multi-state settlement brought even more attention to the company. What caused this lawsuit, which brought irreparable damage to Ameriquest's reputation along with financial loss? This study focuses on the company's information system security management. It first introduces Ameriquest, then briefly describes the lawsuit and settlement, and then discusses IS security control at Ameriquest, covering the company's internal, external, and technical controls.
Abstract:
This applied research work is titled "User Awareness Process in Military Networks", and its objective is to identify the most efficient and effective way to design an awareness process that sensitizes users of the Army's e-mail system to phishing attacks. To this end, specific objectives supporting this main goal were defined at the outset. It was established that we need to know the main behavioural theories that influence the success of phishing attacks, in order to understand and counter them. It was also necessary to understand the main methods and techniques for teaching attitudes, so as to make user awareness possible, and to define the medium through which the awareness effort would be delivered. Finally, the awareness process required evaluation criteria, and defining them was important to validate the research. To address these four specific objectives and the general objective, the central research question was formulated: "How to design an awareness process for the Army that reduces the impact of phishing attacks executed through its e-mail system?" Given the theoretical-practical character of this research, the hypothetico-deductive research method was chosen, with the case study as the procedural method. This was an exploratory investigation, using bibliographic research and document analysis to carry out a complete literature review that supports the investigation and grounds all the fieldwork performed. Conducting this study required examining the topic of information security, since it underpins the research.
For information security to exist, its core properties must be preserved: confidentiality, integrity and availability. The fieldwork consisted of two parts: building the questionnaires and the awareness presentation, and applying and evaluating them (the research outputs). These products were used in the awareness session by applying the baseline questionnaire, followed by the awareness presentation, and ending with the validation questionnaire (the awareness process). After the session, participants were able to identify phishing attacks more accurately. To achieve this, the session used the active teaching method, which incorporates good practices for building awareness products using auditory, kinaesthetic and visual learning styles, enabling behaviour change.
Abstract:
Global climate change is predicted to affect the frequency and severity of flood events. In this study, output from General Circulation Models (GCMs) for a range of possible future climate scenarios was used to force hydrologic models for four case study watersheds built using the Soil and Water Assessment Tool (SWAT). GCM output was applied with either the "delta change" method or a bias correction. Potential changes in flood risk are assessed based on modeling results and possible relationships to watershed characteristics. Differences in model outputs under the two methods of adjusting GCM output are also compared. Preliminary results indicate that watersheds exhibiting higher proportions of runoff in streamflow are more vulnerable to changes in flood risk. The delta change method appears to be more useful when simulating extreme events, as it better preserves daily climate variability than bias-corrected GCM output.
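The delta change approach mentioned above can be sketched schematically for a temperature series. This is an illustrative sketch with made-up values, not the study's actual procedure: the observed series is shifted by the GCM-simulated mean change, so the observed day-to-day variability is preserved.

```python
# Schematic of the additive "delta change" method for temperature:
# apply the GCM-simulated mean change to the observed series, preserving
# observed daily variability. All values are illustrative assumptions.

def delta_change(observed: list[float],
                 gcm_baseline_mean: float,
                 gcm_future_mean: float) -> list[float]:
    """Shift observations by the GCM's simulated mean change (delta)."""
    delta = gcm_future_mean - gcm_baseline_mean
    return [t + delta for t in observed]

observed_temps = [10.0, 15.0, 9.0, 22.0]        # station data (deg C)
future = delta_change(observed_temps,
                      gcm_baseline_mean=12.0,
                      gcm_future_mean=14.5)      # GCM warms by 2.5 deg C
print(future)  # [12.5, 17.5, 11.5, 24.5]
```

Bias correction works in the opposite direction, adjusting the GCM series toward observed statistics, which is why the two methods can yield different extremes when driving the SWAT models.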
Abstract:
The objective of this research is to identify the factors that influence migration from free software to proprietary software, or vice versa. The theoretical framework was developed in light of the Diffusion of Innovations Theory (DIT) proposed by Rogers (1976, 1995) and the Unified Theory of Acceptance and Use of Technology (UTAUT) proposed by Venkatesh, Morris, Davis and Davis (2003). The research was structured in two phases: the first, exploratory, phase adjusted the reviewed theory to the Brazilian reality and identified companies that could be subjects of investigation; the second, qualitative, phase comprised case studies conducted at ArcelorMittal Tubarão (AMT), a private company that migrated from proprietary software (Unix) to free software (Linux), and the city government of Serra, in Espírito Santo state, a public organization that migrated from free software (OpenOffice) to proprietary software (MS Office). The results show that the software migration decision takes into account factors beyond technical or cost aspects, such as cultural barriers, user rejection and resistance to change. These results underscore the importance of social aspects, which can play a decisive role in the decision regarding software migration and its successful implementation.
Abstract:
The main purpose of this paper is to propose and test a model to assess how favourable conditions are for adopting agile software development methods where traditional methods predominate. To achieve this aim, a survey was applied to software developers at a Brazilian public retail bank. Two statistical techniques were used to assess the quantitative data from the survey's closed questions: exploratory factor analysis, which validated the structure of perspectives related to the agile model of the proposed assessment, and frequency distribution analysis, which categorized the answers. Qualitative data from the survey's open question were analyzed with qualitative thematic content analysis. As a result, the paper proposes a model to assess the degree of favourability of conditions for adopting agile practices within the context of the proposed study.