982 results for Nudging, Choice Architecture, Libertarian Paternalism, Regulation
Abstract:
Supervisory Control And Data Acquisition (SCADA) systems are widely used in the management of critical infrastructure such as electricity and water distribution systems. Currently there is little understanding of how best to protect SCADA systems from malicious attacks. We review the constraints and requirements for SCADA security and propose a suitable architecture (SKMA) for secure SCADA communications. The architecture includes a proposed key management protocol (SKMP). We compare the architecture with a previous proposal from Sandia Labs.
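The abstract does not spell out the SKMP mechanics, so the following is only a minimal sketch of the kind of key management commonly proposed for SCADA links: per-session keys derived from a long-term key shared between the master station and a field device. The function and device names are illustrative assumptions, not details of the SKMA/SKMP proposal.

import hmac
import hashlib
import os

def derive_session_key(long_term_key: bytes, device_id: bytes, nonce: bytes) -> bytes:
    # Bind each session key to the device identity and a fresh nonce so that
    # compromise of one session key does not expose traffic from other sessions.
    return hmac.new(long_term_key, device_id + nonce, hashlib.sha256).digest()

# Illustrative exchange: the master station picks a nonce and sends it to the RTU;
# both ends then derive the same session key from the pre-shared long-term key.
long_term_key = os.urandom(32)   # provisioned out-of-band during commissioning (assumed)
device_id = b"RTU-07"            # hypothetical field device identifier
nonce = os.urandom(16)           # sent in the clear with the key-establishment message

master_key = derive_session_key(long_term_key, device_id, nonce)
rtu_key = derive_session_key(long_term_key, device_id, nonce)
assert master_key == rtu_key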
Abstract:
This chapter explores the development of concepts of interactive environments by comparing two major projects that frame the period of this book. The Fun Palace of 1960 and the Generator of 1980 both proposed interactive environments responsive to the needs and behaviour of their users, but the contrast in terms of the available technology and what it enabled could not be more marked. The Fun Palace broke new architectural, organizational and social ground and was arguably the first proposition for cybernetic architecture; the Generator demonstrated how it could be achieved. Both projects are now acknowledged as seminal architectural propositions of the twentieth century, and both were designed by Cedric Price.
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture.

Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984.

In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer.

Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria. He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form.

Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
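Frazer's five elements (a code script, development rules, a mapping to a virtual model, an environment and selection criteria) correspond closely to the components of a standard genetic algorithm. The sketch below is a minimal, hypothetical illustration of that correspondence in Python; the bit-string genome, the toy "development" into module heights and the fitness function are assumptions made for illustration, not Frazer's actual tools.

import random

# A "genetic code script": a fixed-length string of bits.
GENOME_LENGTH = 16
POPULATION = 30
GENERATIONS = 40

def develop(genome):
    # Development rules + mapping: interpret pairs of bits as a simple
    # virtual model, here a list of module heights.
    return [genome[i] + 2 * genome[i + 1] for i in range(0, GENOME_LENGTH, 2)]

def fitness(model):
    # Criteria for selection: a toy "environment" that rewards a gently
    # stepped profile (purely illustrative).
    return -sum(abs(model[i + 1] - model[i] - 1) for i in range(len(model) - 1))

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LENGTH)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)] for _ in range(POPULATION)]
for _ in range(GENERATIONS):
    scored = sorted(population, key=lambda g: fitness(develop(g)), reverse=True)
    parents = scored[: POPULATION // 2]
    population = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(POPULATION - len(parents))]

best = max(population, key=lambda g: fitness(develop(g)))
print("best model:", develop(best))

In this reading, the selection step plays the role of Frazer's "criteria for selection", while mutation and crossover stand in for the "rules for the development of the code".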
Abstract:
The notion of designing with change constitutes a fundamental and foundational theoretical premise for much of what constitutes landscape architecture, notably through engagement with ecology, particularly since the work of Ian McHarg in the 1960s and his key text Design with Nature. However, while most if not all texts in landscape architecture cite this engagement with change theoretically, few go any further than citation, and when they do, their methods seem fixated on utilising empirical, quantitative scientific tools rather than the tools of design in an architectural sense, as implied by the name of the discipline: landscape architecture.
Abstract:
The notion of recombinant architecture signals a loosening of spatial connections between physical and digital-online environments (Mitchell, 1996; 2000; 2003). Such an idea also points to the transformative nature of designing approaches concerned with creating spaces where bits meet bodies to fulfil human needs and desires while, at the same time, pursuing those human dimensions of space and place which are so important to our senses of belonging, physical comfort and amenity. This paper proposes that recombinant spaces and places draw on familiar architectural forms and functions and on the transforming functions of digital-online modes. Perspectives, approaches and resources outlined in the paper support designing and re-designing enterprises and aim to stimulate discussion in the Digital Environments strand of this online conference: 'Under Construction: a world without walls'.
Abstract:
Major infrastructure assets are often governed by a mix of public and private organizations, each fulfilling a specific and separate role, i.e. policy, ownership, operation or maintenance. This mix of entities is a legacy of the Public Choice Theory-influenced New Public Management (NPM) reforms of the late 20th century. The privatization of the public sector has resulted in agency-theory-based ‘self-interest’ relationships and governance arrangements for major infrastructure assets which emphasize economic efficiency but which do not advance non-economic public values and the collective Public Interest. The community is now requiring that governments fulfill their stewardship role of also satisfying non-economic public values such as sustainability and intergenerational responsibility. In the 21st century, governance arrangements have emerged which look beyond individual self-interest alone and also pursue the interests of other stakeholders. Relational contracts, Public-Private Partnerships (PPPs) and hybrid mixes of organizations from the state, market and network modes (Keast et al., 2006) provide options for governance which better meet the interests of contractors, government and the community. There is an emerging body of research which extends the consideration of the immediate governance configuration to the metagovernance environment constituted by hierarchy, regulation, industry standards, trust, culture and values. Stewardship theory has re-emerged as a valuable aid in understanding the features of governance configurations which establish relationships between principal and agent that maximize the agent acting in the interests of the principal, even to the detriment of the agent. This body of literature suggests that an improved stewardship outcome from infrastructure governance configurations can be achieved by applying the emerging options for both the immediate governance configuration and the surrounding metagovernance environment. Stewardship theory provides a framework for the design of the relationships within that total governance environment, focusing on the achievement of a better, more complete stewardship outcome. This paper explores the directions future research might take in seeking to improve the understanding of the design of the governance of major, critical infrastructure assets.
Abstract:
The overall rate of omission of items for 28,331 17-year-old Australian students on a high-stakes test of achievement in the common elements or cognitive skills of the senior school curriculum is reported for a subtest in multiple-choice format and a subtest in short-response format. For the former, the omit rates were minuscule and there was no significant difference by gender or by type of school attended. For the latter, where an item can be 'worth' up to five times that of a single multiple-choice item, the omit rates were between 10 and 20 times those for multiple choice, and the difference between male and female omit rates was significant, as was the difference between students from government and non-government schools. For both formats, females from single-sex schools omitted significantly fewer items than did females from co-educational schools. Some possible explanations of omit behaviour are alluded to.
Abstract:
This issue of the Griffith Law Review focuses on consumer law, and the pervasive nature of this area of law. We are all consumers, but do not necessarily identify as such, nor are we a homogeneous group. The boundaries of
Abstract:
The automation of various aspects of air traffic management has many wide-reaching benefits, including: reducing the workload for Air Traffic Controllers; increasing the flexibility of operations (both civil and military) within the airspace system by facilitating automated dynamic changes to en-route flight plans; and ensuring safe aircraft separation for a complex mix of airspace users within a highly complex and dynamic airspace management system architecture. These benefits accumulate to increase the efficiency and flexibility of airspace use (1). Such functions are critical for the anticipated increase in the volume of manned and unmanned aircraft traffic. One significant challenge facing the advancement of airspace automation lies in convincing air traffic regulatory authorities that the level of safety achievable through the use of automation concepts is comparable to, or exceeds, the accepted safety performance of the current system.
Abstract:
Australia is currently well placed to contribute to the global growth of human stem cell research. However, as the science has progressed, authorities have had to deal with the ongoing challenges of regulating such a fast-moving field of scientific endeavour. Australia’s past and current approach to regulating the use of embryos in human embryonic stem cell research provides an insight into how Australia may continue to adapt to future regulatory challenges presented by human stem cell research. In the broader context, a number of issues have been identified that may impact upon the success of future human stem cell research in Australia.