816 results for Data security principle
Abstract:
Numerous statements and declarations have been made over recent decades in support of open access to research data. The growing recognition of the importance of open access to research data has been accompanied by calls on public research funding agencies and universities to facilitate better access to publicly funded research data so that it can be re-used and redistributed as public goods. International and inter-governmental bodies such as the ICSU/CODATA, the OECD and the European Union are strong supporters of open access to and re-use of publicly funded research data. This thesis focuses on the research data created by university researchers in Malaysian public universities whose research activities are funded by the Federal Government of Malaysia. Malaysia, like many countries, has not yet formulated a policy on open access to and re-use of publicly funded research data. Therefore, the aim of this thesis is to develop a policy to support the objective of enabling open access to and re-use of publicly funded research data in Malaysian public universities. Policy development is very important if the objective of enabling open access to and re-use of publicly funded research data is to be successfully achieved. In developing the policy, this thesis identifies a myriad of legal impediments arising from intellectual property rights, confidentiality, privacy and national security laws, novelty requirements in patent law and lack of a legal duty to ensure data quality. Legal impediments such as these have the effect of restricting, obstructing, hindering or slowing down the objective of enabling open access to and re-use of publicly funded research data. A key focus in the formulation of the policy was the need to resolve the various legal impediments that have been identified. This thesis analyses the existing policies and guidelines of Malaysian public universities to ascertain to what extent the legal impediments have been resolved. An international perspective is adopted by making a comparative analysis of the policies of public research funding agencies and universities in the United Kingdom, the United States and Australia to understand how they have dealt with the identified legal impediments. These countries have led the way in introducing policies which support open access to and re-use of publicly funded research data. As well as proposing a policy supporting open access to and re-use of publicly funded research data in Malaysian public universities, this thesis provides procedures for the implementation of the policy and guidelines for addressing the legal impediments to open access and re-use.
Abstract:
The purpose of this study was to develop a measurement of information security culture in developing countries such as Saudi Arabia. To achieve this goal, the study commenced with a comprehensive review of the literature, the outcome being the development of a conceptual model as a reference base. The literature review revealed a lack of academic and professional research into information security culture in developing countries, and more specifically in Saudi Arabia. Given the increasing importance of, and significant investment in, information technology in developing countries, there is a clear need to investigate information security culture from the perspective of developing countries such as Saudi Arabia. Furthermore, our analysis indicated a lack of clear conceptualisation of, and distinction between, the factors that constitute information security culture and the factors that influence it. Our research aims to fill this gap by developing and validating a measurement model of information security culture, as well as developing an initial understanding of the factors that influence security culture. A sequential mixed method, consisting of a qualitative phase to explore the conceptualisation of information security culture and a quantitative phase to validate the model, is adopted for this research. In the qualitative phase, eight interviews with information security experts in eight different Saudi organisations were conducted, revealing that security culture can be constituted as a reflection of security awareness, security compliance and security ownership. The interviews also revealed that the factors that influence security culture are top management involvement, policy enforcement, policy maintenance, training and ethical conduct policies. These factors were confirmed by the literature review as critical to the creation of security culture, and they formed the basis for our initial information security culture model, which was operationalised and tested in different Saudi Arabian organisations. Using data from 254 valid responses, we demonstrated the validity and reliability of the information security culture model through Exploratory Factor Analysis (EFA), followed by Confirmatory Factor Analysis (CFA). In addition, using Structural Equation Modelling (SEM), we were able to demonstrate the validity of the model in a nomological net, as well as provide some preliminary findings on the factors that influence information security culture. The study contributes to the existing body of knowledge in two major ways: firstly, it develops an information security culture measurement model; secondly, it presents empirical evidence for the nomological validity of the security culture measurement model and for the factors found to influence information security culture. The study also identifies possible directions for future related research.
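The validation pipeline described here (EFA, followed by CFA and SEM) can be illustrated for the exploratory step with a minimal Python sketch. This is not the study's instrument: the survey file, item columns and the three-factor structure are assumptions for illustration, using the factor_analyzer package.

```python
# Minimal sketch of an EFA step for validating a measurement model.
# The file name and three-factor structure are illustrative assumptions.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

# Hypothetical survey responses: rows are respondents, columns are Likert items.
responses = pd.read_csv("security_culture_survey.csv")

# Check sampling adequacy before factoring (KMO above ~0.6 is commonly required).
_, kmo_total = calculate_kmo(responses)
print(f"KMO: {kmo_total:.2f}")

# Extract three factors, matching the three posited constructs; an oblique
# rotation allows the factors (awareness, compliance, ownership) to correlate.
fa = FactorAnalyzer(n_factors=3, rotation="promax")
fa.fit(responses)
print(fa.loadings_)  # each item should load cleanly on its intended construct
```

In a full study, the confirmatory stage (CFA/SEM with a dedicated package) would then test the hypothesised structure on a separate sample.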
Abstract:
Non-linear feedback shift register (NLFSR) ciphers are the industry's cryptographic tools of choice, especially for mobile communication. Their attractive feature is high efficiency when implemented in hardware or software. However, the main problem with NLFSR ciphers is that their security is still not well investigated. This paper makes progress in the study of the security of NLFSR ciphers. In particular, we show a distinguishing attack on linearly filtered NLFSR (LF-NLFSR) ciphers and extend the attack to a linear combination of LF-NLFSRs. We also investigate the security of a modified version of the Grain stream cipher and show its vulnerability to both key recovery and distinguishing attacks.
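To make the attacked construction concrete, here is a toy sketch of a linearly filtered NLFSR: a register updated by a non-linear feedback function, whose keystream is a linear (XOR) filter of selected stages. The register size, feedback function and filter taps are arbitrary assumptions, not the parameters of Grain or of the ciphers analysed in the paper.

```python
# Toy linearly filtered NLFSR (LF-NLFSR) keystream generator.
# All taps and the register size are illustrative assumptions.

def lf_nlfsr_keystream(state, n_bits):
    """Generate n_bits of keystream from an initial register state (list of 0/1 bits)."""
    out = []
    for _ in range(n_bits):
        # Linear filter: the keystream bit is the XOR of a few register stages.
        out.append(state[0] ^ state[3] ^ state[7])
        # Non-linear feedback: linear taps plus a degree-2 (AND) term.
        fb = state[0] ^ state[2] ^ (state[5] & state[6])
        state = state[1:] + [fb]  # shift left and insert the feedback bit
    return out

print(lf_nlfsr_keystream([1, 0, 1, 1, 0, 0, 1, 0], 16))
```

Because the output filter is linear, linear relations among keystream bits can leak biased information about the register contents, which is the kind of property a distinguishing attack exploits.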
Abstract:
Drawing on data from the Australian Business Assessment of Computer User Security (ABACUS) survey, this paper examines a range of factors that may influence businesses’ likelihood of being victimised by a computer security incident. It has been suggested that factors including business size, industry sector, level of outsourcing, expenditure on computer security functions and types of computer security tools and/or policies used may influence the probability of particular businesses experiencing such incidents. This paper uses probability modelling to test whether this is the case for the 4,000 businesses that responded to the ABACUS survey. It was found that the industry sector that a business belonged to, and business expenditure on computer security, were not related to businesses’ likelihood of detecting computer security incidents. Instead, the number of employees that a business has and whether computer security functions were outsourced were found to be key indicators of businesses’ likelihood of detecting incidents. Some of the implications of these findings are considered in this paper.
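The kind of probability modelling described can be sketched, under loose assumptions, as a logit regression of incident detection on business characteristics. The variable and file names below are hypothetical placeholders, not the ABACUS schema; the formula simply mirrors the factors listed in the abstract.

```python
# Hedged sketch of a logit model of computer security incident detection.
# Column names and the data file are hypothetical, not the ABACUS variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("abacus_responses.csv")  # hypothetical survey extract

# Binary outcome: whether the business detected a computer security incident.
model = smf.logit(
    "detected_incident ~ n_employees + outsourced_security"
    " + security_spend + C(industry)",
    data=df,
).fit()

# The paper's reported pattern: employee numbers and outsourcing are key
# indicators, while industry sector and expenditure are not significant.
print(model.summary())
```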
Abstract:
In this paper we examine passenger actions and activities at the security screening points of Australian domestic and international airports. Our findings and analysis provide a more complete understanding of the current airport passenger security screening experience. The data comprise field studies conducted at two Australian airports, one domestic and one international. Video data was collected by cameras situated on either side of the security screening point. A total of 196 passengers were observed. Two methods of analysis are used. First, the activities of passengers are coded and analysed to reveal the common activities in domestic and international security regimes and in quiet and busy periods. Second, observation of passenger activities is used to reveal uncommon aspects. The results show that passengers do more at security screening than being passively scanned. Passengers queue, unpack the required items from their bags and pockets, walk through the metal detector, re-pack and occasionally return to be re-screened. For each of these activities, passengers must understand the procedures at the security screening point and must coordinate various actions and objects in time and space. Through this coordination passengers are active participants in making the security checkpoint function – they are co-producers of the security screening process.
Abstract:
As support grows for greater access to information and data held by governments, so does awareness of the need for appropriate policy, technical and legal frameworks to achieve the desired economic and societal outcomes. Since the late 2000s, numerous international organizations, inter-governmental bodies and governments have issued open government data policies, which set out key principles underpinning access to, and the release and reuse of, data. These policies reiterate the value of government data and establish the default position that it should be openly accessible to the public under transparent and non-discriminatory conditions that are conducive to innovative reuse of the data. A key principle stated in open government data policies is that legal rights in government information must be exercised in a manner that is consistent with, and supports, the open accessibility and reusability of the data. In particular, where government information and data are protected by copyright, access should be provided under licensing terms that clearly permit their reuse and dissemination. This principle has been further developed in the policies issued by Australian Governments into a specific requirement that government agencies apply the Creative Commons Attribution licence (CC BY) as the default licensing position when releasing government information and data. A wide-ranging survey of the practices of Australian Government agencies in managing their information and data, commissioned by the Office of the Australian Information Commissioner in 2012, provides valuable insights into progress towards the achievement of open government policy objectives and the adoption of open licensing practices. The survey results indicate that Australian Government agencies are embracing open access and a proactive disclosure culture, and that open licensing under Creative Commons licences is increasingly prevalent. However, the finding that ‘[t]he default position of open access licensing is not clearly or robustly stated, nor properly reflected in the practice of Government agencies’ points to the need to further develop the policy framework and the principles governing information access and reuse, and to provide practical guidance tools on open licensing, if the broadest range of government information and data is to be made available for innovative reuse.
Abstract:
For decades, Supervisory Control and Data Acquisition (SCADA) and Industrial Control Systems (ICS) have used computers to monitor and control physical processes in many critical industries, including electricity generation, gas pipelines, water distribution, waste treatment, communications and transportation. Increasingly these systems are interconnected with corporate networks via the Internet, leaving them vulnerable and exposed to the same risks as conventional networks facing cyber-attacks. SCADA network services are often viewed as a specialty subject, more relevant to engineers than to standard IT personnel. Educators from two Australian universities have recognised these cultural issues and highlighted the gap between specialists with SCADA systems engineering skills and network security specialists with an IT background. This paper describes a learning approach designed to help students bridge this gap, gain theoretical knowledge of SCADA systems' vulnerabilities to cyber-attacks via experiential learning, and acquire practical skills through actively participating in hands-on exercises.
Abstract:
This paper analyses the probabilistic linear discriminant analysis (PLDA) speaker verification approach with limited development data. It investigates the use of the median as the central tendency of a speaker’s i-vector representation, and the effectiveness of weighted discriminative techniques, on the performance of state-of-the-art length-normalised Gaussian PLDA (GPLDA) speaker verification systems. The analysis shows that the median (via a median Fisher discriminator (MFD)) provides a better representation of a speaker when the number of representative i-vectors available during development is reduced, and that the pair-wise weighting approach in weighted LDA and weighted MFD provides further improvement in limited development conditions. Best performance is obtained using a weighted MFD approach, which shows over 10% improvement in EER over the baseline GPLDA system in mismatched and interview-interview conditions.
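The central idea, representing each development speaker by the element-wise median of their i-vectors rather than the mean, can be shown in a minimal sketch. The dimensions and the synthetic outlier below are illustrative assumptions; the MFD and weighted-LDA machinery are not reproduced.

```python
# Minimal sketch: median vs mean as a speaker's central i-vector representation.
# Dimensions and the outlier session are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical development data: five i-vectors of dimension 400 for one
# speaker, one of which is an outlier session.
ivectors = rng.normal(0.0, 1.0, size=(5, 400))
ivectors[4] += 5.0  # outlier session

mean_rep = ivectors.mean(axis=0)          # pulled towards the outlier
median_rep = np.median(ivectors, axis=0)  # robust when few i-vectors exist

# The median representation stays closer to the bulk of the sessions.
print(np.linalg.norm(mean_rep), np.linalg.norm(median_rep))
```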
Abstract:
The geographic location of cloud data storage centres is an important issue for many organisations and individuals, owing to regulations that require data and operations to reside in specific geographic locations. Cloud users may therefore want to be sure that their stored data have not been relocated to unknown geographic regions where their security may be compromised. Albeshri et al. (2012) combined proof of storage (POS) protocols with distance-bounding protocols to address this problem. However, their scheme incurs unnecessary delay when typical POS schemes are used, owing to computational overhead at the server side. The aim of this paper is to improve the basic GeoProof protocol by reducing the computation overhead at the server side. We show how this can maintain the same level of security while achieving more accurate geographic assurance.
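The distance-bounding half of the idea can be sketched in a few lines: the verifier times challenge-response round trips and converts the best round-trip time into an upper bound on the prover's distance, since no reply can travel faster than light. This toy sketch simulates the prover locally and omits the POS layer; it is an illustration of the general technique, not GeoProof itself.

```python
# Toy distance-bounding check: bound the prover's distance from round-trip times.
# The simulated prover and parameters are illustrative assumptions.
import time

SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_upper_bound(respond, n_rounds=10):
    """Return an upper bound (metres) on the responder's distance."""
    best_rtt = float("inf")
    for nonce in range(n_rounds):
        start = time.perf_counter()
        respond(nonce)                 # prover must answer the fresh challenge
        rtt = time.perf_counter() - start
        best_rtt = min(best_rtt, rtt)  # the fastest round gives the tightest bound
    return SPEED_OF_LIGHT * best_rtt / 2

# Simulated prover whose processing adds ~1 ms of delay. Any server-side
# computation inflates the bound, which is why reducing POS overhead matters.
print(distance_upper_bound(lambda nonce: time.sleep(0.001)))
```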
Abstract:
To enhance the performance of the k-nearest neighbors approach in forecasting short-term traffic volume, this paper proposes and tests a two-step approach capable of forecasting multiple steps ahead. In selecting the k nearest neighbors, a time-constraint window is introduced, and local minima of the distances between state vectors are then ranked to avoid overlap among candidates. Moreover, to control the undesirable impact of extreme values, a novel algorithm with attractive analytical features is developed based on the principal component. The enhanced KNN method has been evaluated using field data, and our comparative analysis shows that it outperformed the competing algorithms in most cases.
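A minimal sketch of a time-constrained KNN forecast may help fix ideas: candidate neighbours are restricted to historical state vectors observed near the same time of day, and the forecast is the average successor of the k closest candidates. The interval count, lag, k and window below are illustrative assumptions; the paper's local-minima ranking and principal-component control of extreme values are not reproduced.

```python
# Toy time-constrained KNN forecast for short-term traffic volume.
# Parameters (288 intervals/day, lag, k, window) are illustrative assumptions.
import numpy as np

def knn_forecast(history, t, lag=3, k=5, window=12, per_day=288):
    """Forecast the volume at interval t+1 from a 1-D history array."""
    query = history[t - lag + 1 : t + 1]  # current state vector
    candidates = []
    for s in range(lag, t):
        offset = (t - s) % per_day
        if min(offset, per_day - offset) > window:
            continue  # outside the time-of-day constraint window
        state = history[s - lag + 1 : s + 1]
        candidates.append((np.linalg.norm(query - state), history[s + 1]))
    candidates.sort(key=lambda c: c[0])  # rank neighbours by distance
    return float(np.mean([nxt for _, nxt in candidates[:k]]))

# Toy data: a smooth daily cycle over one week of 5-minute intervals.
volumes = np.abs(np.sin(np.arange(288 * 7) * 2 * np.pi / 288)) * 100
print(knn_forecast(volumes, t=288 * 6 + 100))
```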
Abstract:
In Responsibility to Protect and Women, Peace and Security: Aligning the Protection Agendas, editors Davies, Nwokora, Stamnes and Teitt address the intersections of the Responsibility to Protect (R2P) principle and the Women, Peace and Security (WPS) agenda. Widespread or systematic sexual or gender-based violence is a war crime, a crime against humanity and an act of genocide, all of which are clearly addressed in the R2P principle. The protection of those at risk of widespread sexual violence is therefore not only relevant to the WPS agenda, but also a fundamental sovereign obligation for all states as part of their commitment to R2P. Contributions from policy-makers and academics consider both the merits and the utility of aligning the protection agendas of R2P and WPS. Ultimately, a number of actionable recommendations are made concerning a unification of the agendas to best support the global empowerment of women and the prevention of mass atrocities.
Abstract:
A new era of cyber warfare appeared on the horizon with the discovery and detection of Stuxnet. Allegedly planned, designed and created by the United States and Israel, Stuxnet is considered the first known cyber weapon to attack an adversary state. Its discovery drew considerable attention to the outdated and obsolete security of critical infrastructure. It became apparent that the electronic devices used to control and operate critical infrastructure, such as programmable logic controllers (PLCs) and supervisory control and data acquisition (SCADA) systems, lack even basic security and protection measures. This is partly because, when these devices were designed, exposing them to the Internet was never envisaged. Now that they are exposed, however, these devices and systems are easy prey for adversaries.
Abstract:
Espionage, surveillance and clandestine operations by secret agencies and governments were something of an East–West obsession in the second half of the twentieth century, a fact reflected in literature and film. In the twenty-first century, concerns of the Cold War and the threat of Communism have been rearticulated in the wake of 9/11. Under the rubric of ‘terror’ attacks, the discourses of security and surveillance are now framed within an increasingly global context. As this article illustrates, surveillance fiction written for young people engages with the cultural and political tropes that reflect a new social order that is different from the Cold War era, with its emphasis on spies, counter espionage, brainwashing and psychological warfare. While these tropes are still evident in much recent literature, advances in technology have transformed the means of tracking, profiling and accumulating data on individuals’ daily activities. Little Brother, The Hunger Games and Article 5 reflect the complex relationship between the real and the imaginary in the world of surveillance and, as this paper discusses, raise moral and ethical issues that are important questions for young people in our age of security.
Abstract:
Enterprises, both public and private, have rapidly begun to combine enterprise resource planning (ERP) with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term dating from the 1960s, long used to describe the problems of unqualified dependence on information systems. A more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP and open datasets used in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need to enhance identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies that are offered and may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the control of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The more than 50-year-old phrase expressing mistrust in computer systems, “garbage in, garbage out” or “GIGO”, describes the problems of unqualified and unquestioning dependence on information systems. A more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP and open datasets as well as “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable and unverifiable results. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need to enhance identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some potentially appropriate technologies currently on offer are also examined. However, severe limitations in addressing the identified problems are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
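One basic authenticity control implied by this abstract and the preceding one, checking an ingested open data set against a digest published by the issuing authority, can be sketched in a few lines. The file name and digest value are hypothetical placeholders; real deployments would need signed manifests and the naming/registry services whose absence these papers identify.

```python
# Minimal sketch: verify an open data set against a published SHA-256 digest
# before analytics ingest it. File name and digest are hypothetical placeholders.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file from disk and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

PUBLISHED_DIGEST = "0" * 64  # placeholder: the authority's published value

if sha256_of("open_dataset.csv") != PUBLISHED_DIGEST:
    raise ValueError("Data set does not match its published digest; refusing to ingest.")
```

A digest only establishes integrity against the published value; establishing who published it (identity and authenticity) is the harder service gap the paper argues remains open.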