908 results for Security-critical software
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being used increasingly by scholars and practitioners. This article seeks to provide those wishing to engage with securitization an alternative application of the theory, one which is sensitive to and self-reflective of the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing counter or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
The use of open source software continues to grow daily. Today, enterprise applications contain 40% to 70% open source code, a fact that has legal, development, IT security, risk management and compliance organizations focusing their attention on its use as never before. They increasingly understand that the open source content within an application must be detected. Once uncovered, decisions regarding compliance with intellectual property licensing obligations must be made, and known security vulnerabilities must be remediated. From a risk perspective, it is no longer sufficient to leave either of these open source issues unaddressed.
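As a rough illustration of the detection step described above, the sketch below scans a pip-style requirements file against a hard-coded, made-up issue database; the package name, CVE identifier and licence are hypothetical, and a real software composition analysis tool would query curated vulnerability and licence feeds instead.

```python
# A minimal sketch, assuming a pip-style requirements file and a hypothetical
# in-memory issue database standing in for real vulnerability/licence feeds.
KNOWN_ISSUES = {
    "examplelib": {"license": "GPL-3.0", "vulnerabilities": ["CVE-XXXX-0001"]},
}

def scan_requirements(path):
    """Report licence and known-vulnerability info for each listed dependency."""
    findings = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue                      # skip blanks and comments
            name = line.split("==")[0].lower()  # crude: ignores other specifiers
            issue = KNOWN_ISSUES.get(name)
            if issue:
                findings.append((name, issue["license"], issue["vulnerabilities"]))
    return findings
```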
Abstract:
The subject matter of the analysis conducted in the text is the information and anti-terrorist security of Poland, presented within the context of a clash between two spheres: the state and the private sphere. Furthermore, the issues of security have been supplemented with a description of the tasks and activity of the Internal Security Agency, as well as a synthetic appraisal of the terrorist threat to Poland. The main parts of this work concern: (1) the state and the private sphere, (2) "terrorism" and terrorist offences, (3) the tasks and activity of the Internal Security Agency, and (4) an appraisal of the terrorist threat to Poland. To elaborate the research problem, the text poses the following research questions: (1) To what extent does referring to a threat to security influence the limitation of rights and freedoms in Poland (with regard to the clash between the state and the private sphere)? (2) To what extent do the tasks and activity of the Internal Security Agency influence the effectiveness of anti-terrorist security in Poland?
Abstract:
Dominant paradigms of causal explanation for why and how Western liberal-democracies go to war in the post-Cold War era remain versions of the 'liberal peace' or 'democratic peace' thesis. Yet such explanations have been shown to rest upon deeply problematic epistemological and methodological assumptions. Of equal importance, however, is the failure of these dominant paradigms to account for the 'neoliberal revolution' that has gripped Western liberal-democracies since the 1970s. The transition from liberalism to neoliberalism remains neglected in analyses of the contemporary Western security constellation. Arguing that neoliberalism can be understood simultaneously through the Marxian concept of ideology and the Foucauldian concept of governmentality – that is, as a complementary set of 'ways of seeing' and 'ways of being' – the thesis goes on to analyse British security in policy and practice, considering it as an instantiation of a wider neoliberal way of war. In so doing, the thesis draws upon, but also challenges and develops, established critical discourse analytic methods, incorporating within its purview not only the textual data that is usually considered by discourse analysts, but also material practices of security. This analysis finds that contemporary British security policy is predicated on a neoliberal social ontology, morphology and morality – an ideology or 'way of seeing' – focused on the notion of a globalised 'network-market', and is aimed at rendering circulations through this network-market amenable to neoliberal techniques of government. It is further argued that security practices shaped by this ideology imperfectly and unevenly achieve the realisation of neoliberal 'ways of being' – especially modes of governing self and other or the 'conduct of conduct' – and the re-articulation of subjectivities in line with neoliberal principles of individualism, risk, responsibility and flexibility. The policy and practice of contemporary British 'security' is thus recontextualised as a component of a broader 'neoliberal way of war'.
Abstract:
The narrative of the United States is of a "nation of immigrants" in which the language shift patterns of earlier ethnolinguistic groups have tended towards linguistic assimilation through English. In recent years, however, changes in the demographic landscape and language maintenance by non-English speaking immigrants, particularly Hispanics, have been perceived as threats and have led to calls for an official English language policy. This thesis aims to contribute to the study of language policy-making from a societal security perspective, as expressed in attitudes regarding language and identity originating in the daily interaction between language groups. The focus is on the role of language and American identity in relation to immigration. The study takes an interdisciplinary approach combining language policy studies, security theory, and critical discourse analysis. The material consists of articles collected from four newspapers, namely USA Today, The New York Times, Los Angeles Times, and San Francisco Chronicle, between April 2006 and December 2007. Two discourse types are evident from the analysis, namely Loyalty and Efficiency. The former is mainly marked by concerns of national identity and contains speech acts of security related to language shift, choice and English for unity. Immigrants are represented as dehumanised and harmful. Immigration is framed as sovereignty-related, racial, and as war. The discourse type of Efficiency is mainly instrumental and contains speech acts of security related to cost, provision of services, health and safety, and social mobility. Immigrants are further represented as a labour resource. These discourse types reflect how the construction of the linguistic 'we' is expected to be maintained. Loyalty is triggered by arguments that the collective identity is threatened and is itself used in reproducing the collective 'we' through hegemonic expressions of monolingualism in the public space and semi-public space. The denigration of immigrants is used as a tool for enhancing societal security through solidarity and as a possible justification for the denial of minority rights. Also, although language acquisition patterns still follow the historical trend of language shift, factors indicating cultural separateness, such as the appearance of speech communities or the use of minority languages in the public space and semi-public space, have led to manifestations of intolerance. Examples of discrimination and prejudice towards minority groups indicate that the perceived worth of a shared language differs from the actual worth of dominant language acquisition for integration purposes. The study further indicates that the efficient working of the free market by using minority languages to sell services or buy labour is perceived as conflicting with nation-building notions, since it may create separately functioning sub-communities with a new cultural capital recognised as legitimate competence. The discourse types mainly represent securitising moves constructing existential threats. The perception of threat and ideas of national belonging are primarily based on a zero-sum notion favouring monolingualism. Further, the identity of the immigrant individual is seen as dynamic and adaptable to assimilationist measures, whereas the identity of the state and its members is perceived as static. Also, the study shows that debates concerning language status are linked to extra-linguistic matters.
To conclude, policy makers in the US need to consider the relationship between four factors, namely societal security based on collective identity, individual/human security, human rights, and a changing linguistic demography, for proposed language intervention measures to be successful.
Abstract:
We describe a tool for analysing information flow in security hardware. It identifies both sub-circuits critical to the preservation of security and the potential for information flow due to hardware failure. The tool allows for the composition of both logical and physical views of circuit designs. An example based on a cryptographic device is provided.
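A minimal sketch of the kind of analysis such a tool performs, assuming a netlist modelled as a directed graph; the signal names and the secret/output tagging below are hypothetical, not taken from the paper. Signals lying on any path from a secret input to an externally visible output form the security-critical sub-circuit.

```python
# A minimal sketch: reachability-based information-flow analysis over a
# hypothetical netlist (signal -> signals it drives).
from collections import deque

netlist = {
    "key": ["xor1"], "data_in": ["xor1"],
    "xor1": ["reg1"], "reg1": ["bus_out"],
    "debug_in": ["dbg_mux"], "dbg_mux": ["bus_out"],
}
secrets, outputs = {"key"}, {"bus_out"}

def reachable(graph, start):
    """All signals reachable from `start` (potential information flow)."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        queue.extend(graph.get(node, []))
    return seen

# Security-critical sub-circuit: every signal on a path from a secret
# input to an externally visible output.
critical = set()
for s in secrets:
    flow = reachable(netlist, s)
    if flow & outputs:
        critical |= flow
print(critical)
```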
Abstract:
This paper describes the implementation of a TMR (Triple Modular Redundant) microprocessor system on an FPGA. The system exhibits true redundancy in that three instances of the same processor system (both software and hardware) are executed in parallel. The described system uses software to control external peripherals, and a voter is used to output correct results. An error indication is asserted whenever only two of the three outputs match or all three outputs disagree. The software has been implemented to conform to a particular safety-critical coding guideline/standard that is popular in industry. The system was verified by injecting various faults into it.
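A minimal sketch of the 2-of-3 majority voting described above, written as Python voter logic rather than the paper's actual hardware/software implementation: the majority value is output and the error indication is asserted unless all three channels agree.

```python
def vote(a, b, c):
    """2-of-3 majority voter: returns (output_value, error_flag)."""
    if a == b == c:
        return a, False   # all channels agree: no error
    if a == b or a == c:
        return a, True    # one channel disagrees: mask it, flag error
    if b == c:
        return b, True
    return a, True        # all three disagree: output unreliable, flag error

# Example: channel c has been corrupted by an injected fault.
print(vote(0x2A, 0x2A, 0x7F))  # -> (42, True)
```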
Abstract:
Increasingly, users are seen as the weak link in the chain when it comes to the security of corporate information. Should the users of computer systems act in any inappropriate or insecure manner, they may put their employers in danger of financial losses, information degradation or litigation, and themselves in danger of dismissal or prosecution. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of inappropriate behaviours, and in so doing protecting corporate information, is the formulation and application of a formal 'acceptable use policy' (AUP). Whilst the AUP has attracted some academic interest, the existing literature has tended to be prescriptive and overly focussed on the role of the Internet, and there is relatively little empirical material that explicitly addresses the purpose, positioning or content of real acceptable use policies. The broad aim of the study, reported in this paper, is to fill this gap in the literature by critically examining the structure and composition of a sample of authentic policies – taken from the higher education sector – rather than simply making general prescriptions about what they ought to contain. There are two important conclusions to be drawn from this study: (1) the primary role of the AUP appears to be as a mechanism for dealing with unacceptable behaviour, rather than proactively promoting desirable and effective security behaviours, and (2) the wide variation found in the coverage and positioning of the reviewed policies is unlikely to be fostering a coherent approach to security management across the higher education sector.
Abstract:
Ensuring the security of corporate information, that is increasingly stored, processed and disseminated using information and communications technologies [ICTs], has become an extremely complex and challenging activity. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of security breaches, and in so doing, protecting corporate information, is through the formulation and application of a formal information security policy (InSPy). Whilst a great deal has now been written about the importance and role of the information security policy, and approaches to its formulation and dissemination, there is relatively little empirical material that explicitly addresses the structure or content of security policies. The broad aim of the study, reported in this paper, is to fill this gap in the literature by critically examining the structure and content of authentic information security policies, rather than simply making general prescriptions about what they ought to contain. Having established the structure and key features of the reviewed policies, the paper critically explores the underlying conceptualisation of information security embedded in the policies. There are two important conclusions to be drawn from this study: (1) the wide diversity of disparate policies and standards in use is unlikely to foster a coherent approach to security management; and (2) the range of specific issues explicitly covered in university policies is surprisingly low, and reflects a highly techno-centric view of information security management.
Abstract:
The main requirements for DRM platforms, namely delivering an effective user experience and strong security measures to prevent unauthorized use of content, are discussed. A comparison of hardware-based and software-based platforms is made, showing the general inherent advantages of hardware DRM solutions. An analysis and evaluation of the main flaws of hardware platforms is conducted, pointing out possibilities for overcoming them. An overview of the existing concepts for the practical realization of hardware DRM protection reveals their advantages and disadvantages, as well as the increasing demand for a multi-core architecture that could ensure effective DRM protection without restricting the user's freedom or introducing risks to end-system security.
Abstract:
In recent years, a surprising new phenomenon has emerged in which globally-distributed online communities collaborate to create useful and sophisticated computer software. These open source software groups are comprised of generally unaffiliated individuals and organizations who work in a seemingly chaotic fashion and who participate on a voluntary basis without direct financial incentive. The purpose of this research is to investigate the relationship between the social network structure of these intriguing groups and their level of output and activity, where social network structure is defined as 1) closure or connectedness within the group, 2) bridging ties which extend outside of the group, and 3) leader centrality within the group. Based on well-tested theories of social capital and centrality in teams, propositions were formulated which suggest that social network structures associated with successful open source software project communities will exhibit high levels of bridging and moderate levels of closure and leader centrality. The research setting was the SourceForge hosting organization and a study population of 143 project communities was identified. Independent variables included measures of closure and leader centrality defined over conversational ties, along with measures of bridging defined over membership ties. Dependent variables included source code commits and software releases for community output, and software downloads and project site page views for community activity. A cross-sectional study design was used and archival data were extracted and aggregated for the two-year period following the first release of project software. The resulting compiled variables were analyzed using multiple linear and quadratic regressions, controlling for group size and conversational volume. Contrary to theory-based expectations, the surprising results showed that successful project groups exhibited low levels of closure and that the levels of bridging and leader centrality were not important factors of success. These findings suggest that the creation and use of open source software may represent a fundamentally new socio-technical development process which disrupts the team paradigm and which triggers the need for building new theories of collaborative development. These new theories could point towards the broader application of open source methods for the creation of knowledge-based products other than software.
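A minimal sketch of how the study's three network measures could be computed, assuming hypothetical conversational and membership data for one project community and using networkx purely for illustration; this is not necessarily the study's actual toolchain or data format.

```python
# A minimal sketch: closure, leader centrality and bridging for one
# hypothetical project community.
import networkx as nx

# Conversational ties within the project (hypothetical participants).
talk = nx.Graph()
talk.add_edges_from([("lead", "dev1"), ("lead", "dev2"),
                     ("dev1", "dev2"), ("dev2", "dev3")])

# Closure: how interconnected the group is (graph density).
closure = nx.density(talk)

# Leader centrality: degree centrality of the most central participant.
leader_centrality = max(nx.degree_centrality(talk).values())

# Bridging: membership ties extending outside the project
# (hypothetical counts of co-memberships in other projects).
external_memberships = {"lead": 2, "dev1": 0, "dev2": 1, "dev3": 0}
bridging = sum(external_memberships.values())

print(closure, leader_centrality, bridging)
```

In the study these kinds of measures served as independent variables, regressed (with linear and quadratic terms) against output and activity measures such as commits, releases, downloads and page views, controlling for group size and conversational volume.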