814 results for Policy of Memory
Abstract:
Information security has been recognized as a core requirement of corporate governance, expected not only to facilitate the management of risks but also to act as a corporate enabler that supports and contributes to the sustainability of organizational operations. In implementing information security, the enterprise information security policy is the set of principles and strategies that guides the course of action for security activities; it may be expressed as a brief statement that defines program goals and sets information security and risk requirements. The enterprise information security policy (referred to in this paper simply as the security policy), which represents the meta-policy of information security, is an element of corporate ICT governance and is derived from the strategic requirements for risk management and corporate governance. Consistent alignment between the security policy and the other corporate business policies and strategies has to be maintained if information security is to be implemented according to evolving business objectives. This alignment may be facilitated by managing the security policy alongside other corporate business policies within the strategic management cycle. There are, however, limitations in current approaches to developing and managing the security policy that hinder consistent strategic alignment. This paper proposes a conceptual framework for security policy management, presenting propositions intended to positively affect the alignment of the security policy with business policies and prescribing a security policy management approach that expounds on these propositions.
Abstract:
Over the past five years, Australia has accepted approximately 50 000 individuals through its Humanitarian program. To support the integration of these individuals, specialised medical and psychological services have been established in Australia's major centres. Australia has also been engaged in a heated and partisan debate over government policy in responding to the refugee situation. Regardless of the outcome of that debate, it is imperative that Australia establishes and develops effective policies and processes to respond to the mental health needs of refugees and asylum seekers. To this end, the current review provides an overview of published studies relating to the psychological treatment of refugees and asylum seekers, as well as studies covering the delivery of related services in response to the needs of this group. In this review we aim to provide a perspective informed by research evidence where it is available. The reported research is supported by findings from local focus groups conducted in Queensland, Australia. The overall aim is to provide an optimum response to facilitate the development of effective and humane programs for a significantly disadvantaged group in our community.
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. The computation is not only intensive but also demands significant memory resources, as typically only a few gene sequences can be held in primary memory at once. The standard practice in such computation is to rely on frequent input/output (I/O) operations, so minimizing the number of these operations yields much faster run-times. This paper develops an approach for the faster and scalable computation of large correlation matrices through full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and can be applied to problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
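The abstract does not give implementation details, but the blocking idea it describes can be sketched briefly. This is a minimal illustration only: the on-disk layout (one NumPy array per sequence-derived feature vector), the block size, and the use of Pearson correlation are assumptions, not details taken from the paper.

```python
# Illustrative sketch of a blocked, out-of-core correlation computation.
# Larger blocks mean fewer disk reads; the file layout and the correlation
# measure are assumptions for illustration, not the paper's method.
import numpy as np

def load_block(paths):
    """Read one block of per-sequence feature vectors from disk."""
    return np.vstack([np.load(p) for p in paths])

def blocked_correlation(paths, block_size):
    n = len(paths)
    corr = np.zeros((n, n))
    blocks = [list(range(i, min(i + block_size, n)))
              for i in range(0, n, block_size)]
    for bi, rows in enumerate(blocks):
        X = load_block([paths[i] for i in rows])        # block held in memory
        for bj in range(bi, len(blocks)):
            cols = blocks[bj]
            Y = X if bj == bi else load_block([paths[j] for j in cols])
            Xc = X - X.mean(axis=1, keepdims=True)      # Pearson correlation
            Yc = Y - Y.mean(axis=1, keepdims=True)
            block = (Xc @ Yc.T) / (np.linalg.norm(Xc, axis=1)[:, None]
                                   * np.linalg.norm(Yc, axis=1)[None, :])
            corr[np.ix_(rows, cols)] = block
            corr[np.ix_(cols, rows)] = block.T
    return corr
```

In this sketch each sequence is read from disk roughly once per block pair it participates in, i.e. about n/block_size times rather than once per pairwise comparison, which is the kind of I/O reduction the abstract targets.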
Abstract:
The adsorption of low-energy C20 isomers on the diamond (0 0 1)–(2×1) surface was investigated by molecular dynamics simulation using the Brenner potential. The energy dependence of the chemisorption characteristics was studied, and we found that an energy threshold exists for chemisorption of C20 to occur. Between 10 and 20 eV, the C20 fullerene has a high probability of chemisorption and the adsorbed cage retains its original structure, which supports the experimental observations of memory effects. However, the structures of the adsorbed bowl and ring C20 differed from their original ones. In this case, the local order in cluster-assembled films would differ from that of the free clusters.
Abstract:
Introduction: Road safety researchers rely heavily on self-report data to explore the aetiology of crash risk. However, researchers consistently acknowledge a range of limitations associated with this methodological approach (e.g., self-report bias), which has been hypothesised to reduce the predictive efficacy of scales. Although well researched in other areas, one important factor often neglected in road safety studies is the fallibility of human memory. Given that accurate recall is a key assumption in many studies, the validity and consistency of self-report data warrant investigation. The aim of the current study was to examine the consistency of self-reported crash history, and of the details of the most recent reported crash, on two separate occasions. Materials & Method: A repeated measures design was used to examine the self-reported crash involvement history of 214 general motorists over a two-month period. Results: A number of discrepancies were noted in the number of lifetime crashes reported by participants and in the descriptions of their most recent crash across the two occasions. Of the 214 participants who reported having been involved in a crash, 35 (22.3%) reported a lower number of lifetime crashes at Time 2 than at Time 1. Of the 88 drivers who reported no change in the number of lifetime crashes, 10 (11.4%) described a different most recent crash. Additionally, of the 34 reporting an increase in the number of lifetime crashes, 29 (85.3%) described the same crash on both occasions. Assessed as a whole, at least 47.1% of participants made a confirmed mistake at Time 1 or Time 2. Conclusions: These results raise some doubt about the accuracy of memory recall across time. Given that self-reported crash involvement is the predominant dependent variable in the majority of road safety research, this issue warrants further investigation. Replication of the study with a larger sample that includes multiple recall periods would enhance understanding of the significance of this issue for road safety methodology.
Abstract:
This thesis is a study of how the contents of volatile memory on the Windows operating system can be better understood and utilised for the purposes of digital forensic investigations. It proposes several techniques to improve the analysis of memory, with a focus on improving the detection of unknown code such as malware. These contributions allow a more complete reconstruction of the state of a computer at acquisition time, including whether or not it has been infected by malicious code.
Abstract:
The study of memory in most behavioral paradigms, including emotional memory paradigms, has focused on the feed-forward components that underlie Hebb’s first postulate, associative synaptic plasticity. Hebb’s second postulate argues that activated ensembles of neurons reverberate in order to provide temporal coordination of different neural signals, and thereby facilitate coincidence detection. Recent evidence from our groups has suggested that the lateral amygdala (LA) contains recurrent microcircuits and that these may reverberate. Additionally, this reverberant activity is precisely timed, with latencies that would facilitate coincidence detection between cortical and subcortical afferents to the LA. Thus, recent data at the microcircuit level in the amygdala provide some physiological evidence in support of the second Hebbian postulate.
Abstract:
In this paper we present a cryptanalysis of a new 256-bit hash function, FORK-256, proposed by Hong et al. at FSE 2006. The cryptanalysis is based on some unexpected differentials existing for the step transformation. We show their possible uses in different attack scenarios by giving a 1-bit (resp. 2-bit) near-collision attack against the full compression function of FORK-256 with complexity 2^125 (resp. 2^120) and negligible memory, and by exhibiting a 22-bit near pseudo-collision. We also show that we can find collisions for the full compression function with a small amount of memory, with complexity not exceeding 2^126.6 hash evaluations. We further show how to reduce this complexity to 2^109.6 hash computations by using 2^73 memory words. Finally, we show that this attack can be extended at no additional cost to find collisions for the full hash function, i.e. with the predefined IV.
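For readers unfamiliar with the terminology, a k-bit near collision is a pair of distinct inputs whose digests differ in at most k bit positions. The minimal sketch below only illustrates that success criterion; it is not the differential attack described in the abstract, and the helper names are our own.

```python
# Illustrates the k-bit near-collision criterion only: two distinct digests
# that differ in at most k bit positions. Not the attack from the paper.
def hamming_distance(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length digests."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def is_near_collision(d1: bytes, d2: bytes, k: int) -> bool:
    """True if the digests are distinct but differ in at most k bits."""
    return d1 != d2 and hamming_distance(d1, d2) <= k
```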
Abstract:
The generation of a correlation matrix for a set of genomic sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. Each sequence may be millions of bases long and there may be thousands of such sequences to compare, so not all sequences may fit into main memory at the same time. Each sequence needs to be compared with every other sequence, so some sequences will generally need to be paged in and out more than once. To minimize execution time, this I/O must be minimized. This paper develops an approach for faster and scalable computation of large correlation matrices through maximal exploitation of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on computing platforms with different amounts of memory and can be applied to bioinformatics problems with different correlation matrix sizes. The significant performance improvement of the approach over previous work is demonstrated through benchmark examples.
Abstract:
The hippocampus is an anatomically distinct region of the medial temporal lobe that plays a critical role in the formation of declarative memories. Here we show that a computer simulation of simple compartmental cells organized with basic hippocampal connectivity is capable of producing stimulus-intensity-sensitive wide-band fluctuations of spectral power similar to those seen in real EEG. While previous computational models have been designed to assess the viability of the putative mechanisms of memory storage and retrieval, they have generally been too abstract to allow comparison with empirical data. Furthermore, while the anatomical connectivity and organization of the hippocampus are well defined, many questions regarding the mechanisms that mediate large-scale synaptic integration remain unanswered. For this reason we focus less on the specifics of changing synaptic weights and more on the population dynamics. Spectral power in four distinct frequency bands was derived from simulated field potentials of the computational model and found to depend on the intensity of a random input. The majority of power occurred in the lowest frequency band (3-6 Hz) and was greatest for the lowest-intensity stimulus condition (1% of the maximal stimulus). In contrast, higher frequency bands in the 7-45 Hz range show an increase in power directly related to an increase in stimulus intensity. This trend continues up to a stimulus level of 15% to 20% of the maximal input, above which power falls dramatically. These results suggest that the relative power of intrinsic network oscillations is dependent upon the level of activation and that above threshold levels all frequencies are damped, perhaps due to overactivation of inhibitory interneurons.
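Band-limited spectral power of the kind reported here is commonly estimated by integrating a power spectral density over each band. The sketch below shows one such pipeline; the sampling rate, the split of the 7-45 Hz range into sub-bands, and the use of SciPy's Welch estimator are illustrative assumptions, not the model's actual analysis.

```python
# Sketch of deriving band-limited spectral power from a simulated field
# potential. Sampling rate, band edges and Welch's estimator are assumptions.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, bands):
    """Integrate the Welch power spectral density over each frequency band."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
    return {name: np.trapz(psd[(freqs >= lo) & (freqs <= hi)],
                           freqs[(freqs >= lo) & (freqs <= hi)])
            for name, (lo, hi) in bands.items()}

# Example with a surrogate field potential (random input, as in the model)
fs = 500                                   # Hz, assumed sampling rate
lfp = np.random.randn(10 * fs)             # stand-in for the simulated signal
bands = {"3-6 Hz": (3, 6), "7-15 Hz": (7, 15),
         "16-30 Hz": (16, 30), "31-45 Hz": (31, 45)}
print(band_power(lfp, fs, bands))
```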
Abstract:
Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to hash-then-sign digital signature schemes such as DSS and RSA in order to free their reliance on the collision resistance property of the hash functions. They have shown that to forge an RMX-hash-then-sign signature scheme, one has to solve a cryptanalytical task related to finding second preimages for the hash function. In this article, we show how to use Dean’s method of finding expandable messages for finding a second preimage in the Merkle-Damgård hash function to existentially forge a signature scheme based on a t-bit RMX-hash function which uses Davies-Meyer compression functions (e.g., MD4, MD5, the SHA family), in 2^(t/2) chosen messages plus 2^(t/2+1) off-line operations of the compression function and a similar amount of memory. This forgery attack also works on signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
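The hash functions named here (MD4, MD5, the SHA family) all iterate a Davies-Meyer compression function in the Merkle-Damgård construction, which is the structure the forgery attack exploits. The toy sketch below shows that generic structure only; the stand-in "block cipher", the block and state sizes, and the padding details are simplifications for illustration, not any real primitive or the paper's attack.

```python
# Toy sketch of the Merkle-Damgard iteration with a Davies-Meyer compression
# function: the structure targeted by the attack. The stand-in "block cipher"
# is deliberately simple and is NOT MD4/MD5/SHA.
import hashlib

BLOCK_BYTES = 32   # assumed message-block size for the sketch
STATE_BYTES = 32   # t/8 bytes of chaining state

def toy_block_cipher(key: bytes, plaintext: bytes) -> bytes:
    """Stand-in keyed transformation (not a real block cipher)."""
    return hashlib.sha256(key + plaintext).digest()

def davies_meyer(h: bytes, m: bytes) -> bytes:
    """Davies-Meyer compression: E_m(h) XOR h, with the message block as key."""
    e = toy_block_cipher(m, h)
    return bytes(a ^ b for a, b in zip(e, h))

def merkle_damgard(message: bytes, iv: bytes) -> bytes:
    """Pad, then iterate the compression function over message blocks."""
    length = (8 * len(message)).to_bytes(8, "big")      # MD strengthening
    padded = message + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % BLOCK_BYTES) + length
    h = iv
    for i in range(0, len(padded), BLOCK_BYTES):
        h = davies_meyer(h, padded[i:i + BLOCK_BYTES])
    return h

# Example: digest = merkle_damgard(b"message", iv=b"\x00" * STATE_BYTES)
```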
Abstract:
Peggy Shaw’s RUFF (USA, 2013) and Queensland Theatre Company’s collaboration with Queensland University of Technology, Total Dik! (Australia, 2013), overtly and evocatively draw on an aestheticized use of the cinematic techniques and technologies of chroma key to reveal the tensions in their production and to add layers to their performances. In doing so they offer invaluable insight into where filmic and theatrical approaches overlap. This paper draws on Eckersall, Grehan and Scheer’s New Media Dramaturgy (2014) to reposition the frame as a contribution to intermedial theatre and performance practices in light of increasing convergence between seemingly disparate discourses. In RUFF, the scenic environment replicates a chroma-key ‘studio’ which facilitates the reconstruction of memory displaced after a stroke. RUFF uses the screen and projections to recall crooners, lounge singers, movie stars, rock and roll bands, and an eclectic line of eccentric family members living inside Shaw. While the show pays tribute to those who have kept her company across decades of theatrical performance, the use of a non-composited chroma-key technique as a theatrical device, and the work’s taciturn revelation of the production process during performance, play a central role in its exploration of the juxtaposition between its reconstructed form and content. In contrast, Total Dik! uses real-time green-screen compositing during performance as a scenic device. Actors manipulate scale models, refocus cameras and generate scenes within scenes in the construction of the work’s examination of an isolated Dictator. The ‘studio’ is again replicated as a site for (re)construction, only in this case Total Dik! actively seeks to reveal the process of production as the performance plays out. Building on RUFF, and on other works such as By the Way, Meet Vera Stark (2012) and Hotel Modern’s God’s Beard (2012), this work blends mobile technologies, scale models and green-screen capture to explore aspects of transmedia storytelling in a theatrical environment (Jenkins, 2009, 2013). When a green screen is placed on stage, it reads at once as metaphor and as a challenge to the language of theatre. It becomes, or rather acts as, a ‘sign’ that alludes to the nature of the reconstructed, recomposited, manipulated and controlled. In RUFF and in Total Dik!, it is also a place where, as a mode of production and subsequent reveal, it adds weight to the performance. These works are informed by Auslander (1999) and Giesekam (2007), and speak to and echo Lehmann’s Postdramatic Theatre (2006). This paper’s consideration of the integration of studio technique and live performance as a dynamic approach to multi-layered theatrical production develops our understanding of their combinatory use in a live performance environment.
Abstract:
In May 2011, the Australian Federal Education Minister announced there would be a unique, innovative and new policy of performance pay for teachers, Rewards for Great Teachers (Garrett, 2011a). In response, this paper uses critical policy historiography to argue that the unintended consequences of performance pay for teachers make it unlikely it will deliver improved quality or efficiency in Australian schools. What is new, in the Australian context, is that performance pay is one of a raft of education policies being driven by the federal government within a system that constitutionally and historically has placed the responsibility for schooling with the states and territories. Since 2008, a key platform of the Australian federal Labor government has been a commitment to an Education Revolution that would promote quality, equity and accountability in Australian schools. This commitment has resulted in new national initiatives impacting on Australian schools, including a high-stakes testing regime, the National Assessment Program - Literacy and Numeracy (NAPLAN); a mandated national curriculum (the Australian Curriculum); professional standards for teachers and teacher accreditation under the Australian Institute for Teaching and School Leadership (AITSL); and the idea of rewarding excellent teachers through performance pay (Garrett, 2011b). These reforms demonstrate the increased influence of the federal government in education policy processes and the growth of a “coercive federalism” that pits the state and federal governments against each other (Harris-Hart, 2010). Central to these initiatives is the measuring, or auditing, of educational practices and relationships. While this shift in education policy hegemony from state to federal governments has been occurring in Australia at least since the 1970s, it has escalated and been transformed in more recent times with a greater emphasis on national human capital agendas which link education and training to Australia’s international economic competitiveness (Lingard & Sellar, in press). This paper uses historically informed critical analysis to critique claims about the effects of such policies. We argue that performance pay has a detailed and complex historical trajectory both internationally and within Australian states. Using Gale’s (2001) critical policy historiography, we illuminate some of the effects that performance pay policies have had on education internationally and, in particular, within Australia. This critical historical lens also provides opportunities to highlight how teachers have, in the past, tactically engaged with such policies.
Abstract:
In the recent decision of Hunter and New England Local Health District v McKenna; Hunter and New England Local Health District v Simon, the High Court of Australia held that a hospital and its medical staff owed no common law duty of care to third parties claiming for mental harm, against the background of statutory powers to detain mentally ill patients. This conclusion was based in part on the statutory framework and in part on the inconsistency that would arise if such a duty were imposed. Had such a duty been imposed in these circumstances, the consequence may have been that doctors would generally detain rather than discharge mentally ill persons in order to avoid the foreseeable risk of harm to others. Such an approach would be inconsistent with the policy of the mental health legislation, which favours personal liberty and discharge rather than detention unless no other care of a less restrictive kind is appropriate and reasonably available.
Abstract:
The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, achieved by a system of software filters and random number generators. The model represents neither the neurological mechanism responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, a simulator which would enable at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given. The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
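The generating principle described here (independent random number sequences shaped by rational-transfer-function filters, scaled to the desired band powers and summed) can be sketched briefly. The filter design, band edges, gains and sampling rate below are illustrative assumptions, not the parameters of Zetterberg's model or of the original program.

```python
# Sketch of the simulator's principle: shape independent random number
# sequences with rational-transfer-function filters, scale each band's power,
# and sum the results into a 'stationary EEG'. All parameters are illustrative.
import numpy as np
from scipy.signal import butter, lfilter

fs = 128                        # Hz, assumed sampling rate
duration = 25                   # seconds of simulated EEG, as in the abstract
n = fs * duration
rng = np.random.default_rng(0)  # fixed seed: same parameters, same statistics

bands = {                       # (low Hz, high Hz, relative amplitude), assumed
    "delta": (1.0, 4.0, 1.0),
    "alpha": (8.0, 13.0, 0.6),
    "beta":  (13.0, 30.0, 0.3),
}

eeg = np.zeros(n)
for low, high, gain in bands.values():
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    noise = rng.standard_normal(n)             # independent random sequence
    component = lfilter(b, a, noise)           # rational transfer function
    eeg += gain * component / component.std()  # scale to the desired band power
```

Summing the scaled components yields a signal whose band powers follow the chosen proportions, matching the abstract's description of scaling the filtered outputs before summation.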