785 results for “Governance of security”
Abstract:
Increasingly, it has been argued that senior management teams (SMTs), comprising principals, deputy heads and other personnel, play a critical role in the governance of schools. In recent years, many researchers have drawn upon the tools of micropolitics to illuminate the relationships, dynamics and power plays between and amongst members of SMTs. The paper has two foci. Firstly, it provides an overview of some of the seminal literature in the field of SMTs and micropolitics in an attempt to identify the working practices of, and challenges facing, members of SMTs. Secondly, it discusses an instrument, the TEAM Development Questionnaire, that emerged from a synthesis of this writing and research. The questionnaire presented here was devised specifically for use with members of SMTs to help them (i) identify the dynamics amongst team members; and (ii) identify areas for the team to improve. A set of procedures for implementing the TEAM Development Questionnaire is provided to demonstrate its application to the field.
Abstract:
Understanding micropolitics has become an important part of understanding leadership and power relations within schools. In this paper we review some of the pertinent literature and writing in the field, particularly as it relates to school leadership. Drawing on existing models, we present a new model that highlights three central power-based leadership approaches: ‘power with’, ‘power through’ and ‘power over’. We put forward two contrasting vignettes that reveal a variety of micropolitical strategies used by school principals in the governance of their schools. These strategies range from favouritism and control at one end to empowerment and collaboration at the other. The vignettes are analysed in the light of the model and the micropolitical literature presented in this article.
Abstract:
As with the broader field of education research, most writing on the subject of school excursions and field trips has centred on progressive/humanist concerns for building pupils’ self-esteem and for the development of the ‘whole child’. Such research has also stressed the importance of a broad, grounded, and experiential curriculum - as exemplified by subjects containing these extra-school activities - as well as the possibility of strengthening the relationship between student and teacher. Arguing that this approach to the field trip is both exhausted of ideas and conceptually flawed, this paper proposes some alternative routes into the area for the prospective researcher. First, it is argued that by historicising the subject matter, it can be seen that school excursions are not simply the product of the contemporary humanist desire for diverse and fulfilling educational experiences; rather, they can, in part, be traced to eighteenth-century beliefs among the English gentry that travel formed a crucial component of a good education, to the advent of an affordable public rail system, and to school tours associated with the Temperance movement. Second, field trips can be understood from within the associated framework of concerns over the governance of tourism and the organisation of disciplinary apparatuses for the production of an educated and regulated citizenry. Far from being a simple learning experience, museums and art galleries form part of a complex of disciplinary and power relations designed to produce a populace with very specific capacities, aspirations and styles of public conduct. Finally, rather than allowing children ‘freedom’ from the constraints of the classroom, the field trip accustoms children to having their activities governed in the broader domain of the generalised community. School excursions thereby constitute an effective tactic through which young people have their conduct managed, and their social and scholastic identities shaped and administered.
Abstract:
This chapter seeks to develop an analysis of the contemporary use of the ePortfolio (electronic portfolio) in education practices. Unlike other explorations of this new technology, which are deterministic in their approach, the authors seek to reveal the techniques and practices of government which underpin the implementation of the ePortfolio. By interrogating a specific case study from a large Australian university’s preservice teacher program, the authors find that the ePortfolio is represented as an eLearning technology but serves to govern students via autonomization and self-responsibilization. Using policy data and other key documents, they are able to reveal the ePortfolio as a delegated authority in the governance of preservice teachers. However, despite this ongoing trend, they suggest that, like other practices of government, the ePortfolio will eventually fail. This, the authors conclude, opens up space for critical thought and engagement which is not presently afforded.
Abstract:
Airports have, over time, emerged as separate, independent entities often described as ‘enclaves’. As such, airports have regularly planned and implemented developments within their boundaries with limited inclusion of local actors in decision-making processes. Urban encroachment on airport boundaries has increasingly focused the planning interests of airports on what their neighbouring cities are doing. Likewise, city planners are progressively more interested in the development activities of airports. Despite shared interests in what happens on either side of the fence line, relationships between airports and their neighbouring cities have often been strained, if not, at times, hostile. A number of strategies and conceptualisations for the co-existence of urban and airport environs have been put forward. However, these models are likely to have a limited effect unless they can be implemented to maximise opportunities for both cities and airports while not confounding their long-term interests. The isolation of airport planning from local and regional planning agencies, and the resulting power struggles, are not new. Under current conditions, the need to ‘bridge the gap’ between airports and their urban surrounds has become an increasing, yet underexplored, imperative. This paper examines the decision-making arena for airport-region development to define the barriers, enablers, tensions and puzzles for the governance of airport-region development from a cross-country perspective. Findings suggest that while there are many embedded rule structures that foster airport-region tensions, there are nonetheless a number of pathways for moving airports beyond decision-making enclaves towards more integrated mechanisms for city and regional planning. In providing preliminary answers for overcoming the barriers, tensions and intractable issues of mutually agreeable airport and city development, the research makes a primary contribution to the ground-level governance of collaborative planning. This research also serves as a launching point for future, more detailed research into airport-region decision making and collaborative planning for airport regions. This work was carried out through the Airport Metropolis Research Project under the Australian Research Council’s Linkage Projects funding scheme (LP0775225).
Abstract:
Successful repair of wounds and tissues remains a major healthcare and biomedical challenge in the 21st century. In particular, chronic wounds often lead to loss of functional ability, increased pain and decreased quality of life, and can be a burden on carers and health-system resources. Advanced healing therapies employing biological dressings, skin substitutes, growth factor-based therapies and synthetic acellular matrices, all of which aim to correct the irregular and dysfunctional cellular pathways present in chronic wounds, are becoming more popular. This review focuses on recent advances in biologically inspired devices for wound healing and includes a commentary on the challenges facing the regulatory governance of such products.
Abstract:
The most costly operations encountered in pairing computations are those that take place in the full extension field F_{p^k}. At high levels of security, the complexity of operations in F_{p^k} dominates the complexity of the operations that occur in the lower-degree subfields. Consequently, full extension field operations have the greatest effect on the runtime of Miller’s algorithm. Many recent optimizations in the literature have focussed on improving the overall operation count by presenting new explicit formulas that reduce the number of subfield operations encountered throughout an iteration of Miller’s algorithm. Unfortunately, almost all of these improvements tend to suffer for larger embedding degrees, where the expensive extension field operations far outweigh the operations in the smaller subfields. In this paper, we propose a new way of carrying out Miller’s algorithm, involving new explicit formulas that reduce the number of full extension field operations occurring in an iteration of the Miller loop, resulting in significant speed-ups of between 5 and 30 percent in most practical situations.
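For readers unfamiliar with where these costs arise, the textbook double-and-add form of Miller’s loop can be sketched as follows. This is a generic illustration only, not the new formulas proposed in the paper; the line functions and curve arithmetic are assumed to be supplied by a pairing library.

```python
# Generic sketch of Miller's loop in textbook double-and-add form. It is not
# the optimised variant proposed in the paper; line_func, point_double and
# point_add are placeholders for curve/pairing-library routines.

def miller_loop(P, Q, r_bits, line_func, point_double, point_add):
    """Evaluate the Miller function f_{r,P}(Q), with r given MSB-first in r_bits."""
    T = P
    f = 1  # accumulator lives in the full extension field F_{p^k}
    for bit in r_bits[1:]:
        # Doubling step: one squaring and one multiplication in F_{p^k},
        # plus the subfield work hidden inside line_func and point_double.
        f = f * f * line_func(T, T, Q)
        T = point_double(T)
        if bit == 1:
            # Addition step: one further F_{p^k} multiplication.
            f = f * line_func(T, P, Q)
            T = point_add(T, P)
    return f
```

Each iteration thus mixes cheap subfield work with expensive F_{p^k} squarings and multiplications; reducing the latter, rather than the former, is the focus of the optimisations described above.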
Abstract:
Authorised users (insiders) are behind the majority of security incidents with high financial impacts. Because authorisation is the process of controlling users’ access to resources, improving authorisation techniques may mitigate the insider threat. Current approaches to authorisation suffer from the assumption that users will not (or cannot) depart from the expected behaviour implicit in the authorisation policy. In reality, however, users can and do depart from the canonical behaviour. This paper argues that the conflict of interest between insiders and authorisation mechanisms is analogous to the subset of problems formally studied in the field of game theory. It proposes a game-theoretic authorisation model that ensures users’ potential misuse of a resource is explicitly considered when making an authorisation decision. The resulting authorisation model is dynamic in the sense that its access decisions vary according to changes in the explicit factors that influence the cost of misuse for both the authorisation mechanism and the insider.
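As a rough illustration of the kind of decision rule such a cost-aware model might yield, consider the hypothetical sketch below; the function, parameters and numbers are illustrative assumptions, not the model defined in the paper.

```python
# Hypothetical sketch of a cost-aware authorisation decision: access is granted
# only while the expected benefit outweighs the expected loss from misuse.
# All parameter names and values are illustrative, not the paper's model.

def authorise(benefit_of_access: float, cost_of_misuse: float, prob_misuse: float) -> bool:
    expected_loss = prob_misuse * cost_of_misuse
    return benefit_of_access > expected_loss

# The same request is granted or denied as the estimated misuse probability changes.
print(authorise(benefit_of_access=100.0, cost_of_misuse=1000.0, prob_misuse=0.05))  # True
print(authorise(benefit_of_access=100.0, cost_of_misuse=1000.0, prob_misuse=0.20))  # False
```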
Abstract:
RFID has been widely used in today’s commercial and supply chain industry, owing to the significant advantages it offers and its relatively low production cost. However, this ubiquitous technology has inherent problems in security and privacy. This calls for the development of simple, efficient and cost-effective mechanisms against a variety of security threats. This paper proposes a two-step authentication protocol based on the randomized hash-lock scheme proposed by S. Weis in 2003. By introducing additional measures during the authentication process, this new protocol is shown to enhance the security of RFID significantly and to protect passive tags from almost all major attacks, including tag cloning, replay, full disclosure, tracking, and eavesdropping. Furthermore, no significant changes to the tags are required to implement this protocol, and the low complexity of the randomized hash-lock algorithm is retained.
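For context, the basic randomised hash-lock exchange that the protocol extends can be sketched as below. SHA-256 stands in for whatever lightweight tag hash would actually be used, and the paper’s additional second authentication step is not shown.

```python
# Sketch of the basic randomised hash-lock exchange the proposed protocol
# builds on. SHA-256 is used purely for illustration; real passive tags would
# use a much lighter hash, and the paper's extra second step is omitted.
import hashlib
import os

def tag_response(tag_id: bytes):
    """Tag picks a fresh nonce r and replies with (r, h(id || r))."""
    r = os.urandom(16)
    return r, hashlib.sha256(tag_id + r).digest()

def reader_identify(response, known_ids):
    """Back-end searches its ID list for the entry that reproduces the hashed reply."""
    r, digest = response
    for tag_id in known_ids:
        if hashlib.sha256(tag_id + r).digest() == digest:
            return tag_id
    return None

known = [b"tag-0001", b"tag-0002"]
print(reader_identify(tag_response(b"tag-0002"), known))  # b'tag-0002'
```

Because the tag’s reply changes with every nonce, an eavesdropper cannot link responses to a fixed identifier, which is the property the two-step protocol above builds upon.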
Abstract:
Tracking/remote monitoring systems using GNSS are a proven method of enhancing the safety and security of personnel and vehicles carrying precious or hazardous cargo. While GNSS tracking appears to mitigate some of these threats, if not adequately secured it can be a double-edged sword, allowing adversaries to obtain sensitive shipment and vehicle position data to better coordinate their attacks, and providing a false sense of security to monitoring centers. Tracking systems must be designed with the ability to monitor route compliance and to thwart attacks ranging from low-level attacks, such as the cutting of antenna cables, to medium- and high-level attacks involving radio jamming and signal/data-level simulation, especially where the goods transported have a potentially high value to terrorists. This paper discusses the use of GNSS in critical tracking applications, addressing the mitigation of GNSS security issues, augmentation systems and communication systems in order to provide highly robust and survivable tracking systems.
Abstract:
Video surveillance technology, based on Closed Circuit Television (CCTV) cameras, is one of the fastest growing markets in the field of security technologies. However, existing video surveillance systems are still not at a stage where they can be used for crime prevention. The systems rely heavily on human observers and are therefore limited by factors such as fatigue and monitoring capabilities over long periods of time. To overcome this limitation, it is necessary to have “intelligent” processes which are able to highlight the salient data and filter out normal conditions that do not pose a threat to security. In order to create such intelligent systems, an understanding of human behaviour, specifically suspicious behaviour, is required. One of the challenges in achieving this is that human behaviour can only be understood correctly in the context in which it appears. Although context has been exploited in the general computer vision domain, it has not been widely used in the automatic suspicious behaviour detection domain. It is therefore essential that context be formulated, stored and used by the system in order to understand human behaviour. Finally, since surveillance systems can be modelled as large-scale data stream systems, it is difficult to have a complete knowledge base. Such systems need not only to continuously update their knowledge but also to be able to retrieve the extracted information related to a given context. To address these issues, a context-based approach for detecting suspicious behaviour is proposed. In this approach, contextual information is exploited in order to make better detections. The proposed approach utilises a data stream clustering algorithm to discover the behaviour classes and their frequencies of occurrence from the incoming behaviour instances. Contextual information is then used in addition to this information to detect suspicious behaviour. The proposed approach is able to detect observed, unobserved and contextual suspicious behaviour. Two case studies, using video feeds taken from the CAVIAR dataset and from the Z-block building, Queensland University of Technology, are presented in order to test the proposed approach. From these experiments, it is shown that by using information about context, the proposed system is able to make more accurate detections, especially for behaviours which are suspicious only in some contexts while being normal in others. Moreover, this information gives critical feedback to the system designers to refine the system. Finally, the proposed modified Clustream algorithm enables the system both to continuously update its knowledge and to effectively retrieve the information learned in a given context. The outcomes from this research are: (a) a context-based framework for automatically detecting suspicious behaviour which can be used by an intelligent video surveillance system in making decisions; (b) a modified Clustream data stream clustering algorithm which continuously updates the system knowledge and is able to retrieve contextually related information effectively; and (c) an update-describe approach which extends the capability of the existing human local motion features, known as interest point based features, to the data stream environment.
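The core intuition, that a behaviour is suspicious when it is rare in its context, can be conveyed with a much-simplified sketch. This is illustrative only and stands in for the thesis’s modified Clustream pipeline and video features, which are considerably more involved.

```python
# Much-simplified stand-in for context-based suspicious behaviour detection:
# behaviours are flagged when their relative frequency in the current context
# is low. The thesis's actual pipeline (modified Clustream over video features)
# is far richer than this toy model.
from collections import defaultdict

class ContextualBehaviourModel:
    def __init__(self, rarity_threshold: float = 0.05):
        self.counts = defaultdict(lambda: defaultdict(int))  # context -> behaviour -> count
        self.rarity_threshold = rarity_threshold

    def observe(self, context: str, behaviour: str) -> None:
        self.counts[context][behaviour] += 1

    def is_suspicious(self, context: str, behaviour: str) -> bool:
        total = sum(self.counts[context].values())
        if total == 0:
            return True  # nothing known about this context yet: escalate for review
        return self.counts[context][behaviour] / total < self.rarity_threshold

model = ContextualBehaviourModel()
for _ in range(100):
    model.observe("corridor_daytime", "walking")
print(model.is_suspicious("corridor_daytime", "running"))  # True: rare in this context
print(model.is_suspicious("corridor_daytime", "walking"))  # False: normal here
```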
Abstract:
We investigate known security flaws in the context of security ceremonies to gain an understanding of the ceremony analysis process. The term security ceremony is used to describe a system of protocols and humans which interact for a specific purpose. Security ceremonies and ceremony analysis constitute an area of research in its infancy, and we explore the basic principles involved in order to better understand the issues. We analyse three ceremonies, HTTPS, EMV and Opera Mini, and use the information gained from this experience to establish a list of typical flaws in ceremonies. Finally, we use that list to analyse a protocol proven secure for human use. This leads to a realisation of the strengths and weaknesses of ceremony analysis.
Abstract:
In dynamic and uncertain environments such as healthcare, where the needs of security and information availability are difficult to balance, an access control approach based on a static policy will be suboptimal regardless of how comprehensive it is. The uncertainty stems from the unpredictability of users’ operational needs as well as their private incentives to misuse permissions. In Role Based Access Control (RBAC), a user’s legitimate access request may be denied because the need for it has not been anticipated by the security administrator. Alternatively, even when the policy is correctly specified, an authorised user may accidentally or intentionally misuse a granted permission. This paper introduces a novel approach to access control under uncertainty and presents it in the context of RBAC. Taking insights from the field of economics, in particular the insurance literature, we propose a formal model in which the values of resources are explicitly defined and an RBAC policy (capturing the predictable access needs) is used only as a reference point to determine the price each user has to pay for access, as opposed to representing hard and fast rules that are always rigidly applied.
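A hypothetical, much-reduced sketch of this pricing idea is given below; the function, parameters and numbers are illustrative assumptions, not the paper’s formal model. Requests anticipated by the RBAC reference policy are cheap, while exceptions carry a premium scaled by the declared value of the resource.

```python
# Hypothetical sketch of insurance-style access pricing: the RBAC policy is a
# reference point rather than a hard rule, and out-of-policy requests pay a
# premium proportional to the resource's declared value. Values are illustrative.

def access_price(resource_value: float, in_policy: bool,
                 base_fee: float = 1.0, risk_loading: float = 0.1) -> float:
    if in_policy:
        return base_fee                                # anticipated need: nominal price
    return base_fee + risk_loading * resource_value   # exception: premium reflects potential loss

print(access_price(resource_value=500.0, in_policy=True))   # 1.0
print(access_price(resource_value=500.0, in_policy=False))  # 51.0
```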
Abstract:
In dynamic and uncertain environments, where the needs of security and information availability are difficult to balance, an access control approach based on a static policy will be suboptimal regardless of how comprehensive it is. Risk-based approaches to access control attempt to address this problem by allocating a limited budget to users, through which they pay for the exceptions they deem necessary. So far the primary focus has been on how to incorporate the notion of budget into access control, rather than on what, or whether there is, an optimal amount of budget to allocate to users. In this paper we discuss the problems that arise from a suboptimal allocation of budget and introduce a generalised characterisation of an optimal budget allocation function that maximises the organisation’s expected benefit in the presence of self-interested employees and costly audits.
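A toy sketch of the trade-off being optimised is shown below; the concave benefit term, the linear misuse and audit costs, and all constants are assumptions for illustration, not the generalised characterisation given in the paper.

```python
# Toy model of budget allocation: a larger exception budget enables more
# legitimate work (with diminishing returns) but also more misuse and audit
# cost. All functional forms and constants are illustrative assumptions.
import math

def expected_benefit(budget: int, legit_gain: float = 10.0, misuse_rate: float = 0.3,
                     misuse_loss: float = 4.0, audit_cost: float = 0.5) -> float:
    legit = legit_gain * math.sqrt((1 - misuse_rate) * budget)  # diminishing returns
    losses = misuse_loss * misuse_rate * budget + audit_cost * budget
    return legit - losses

# Scan candidate budgets and keep the one maximising expected benefit.
best = max(range(0, 101), key=expected_benefit)
print(best, round(expected_benefit(best), 2))  # an interior optimum, neither 0 nor 100
```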
Abstract:
This paper examines some of the central global ethical and governance challenges of climate change and carbon emissions reduction in relation to globalization, the “global financial crisis” (GFC), and unsustainable conceptions of the “good life”, and argues in favour of the development of a global carbon “integrity system”. It is argued that a fundamental driver of our climate problems is the incipient spread of an unsustainable Western version of the “good life”, where resource-intensive, high-carbon Western lifestyles, although frequently criticized as unsustainable and deeply unsatisfying, appear to have established an unearned ethical legitimacy. While the ultimate solution to climate change is the development of low-carbon lifestyles, the paper argues that it is also important that economic incentives support and stimulate that search: sustainable versions of the good life provide an ethical pull, whilst the incentives provide an economic push. Yet, if we are going to secure sustainable low-carbon lifestyles, it is argued, we need more than the ethical pull and the economic push. Each needs to be institutionalized: built into the governance of global, regional, national, sub-regional, corporate and professional institutions. Since weakness in each currently exacerbates weaknesses in the others, it is argued that governance reform is required in all areas to support sustainable, low-carbon versions of the good life.