51 results for The standard model
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria.
He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless, his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
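The five elements Frazer lists map directly onto the skeleton of a classical genetic algorithm. The sketch below is a hedged illustration only: the genome encoding, the development rule, the target "environment" and the fitness function are invented for this example and are not taken from Frazer's own systems.

```python
import random

# Frazer's five elements, each labelled where it appears:
# a code script (genome), rules for development, a mapping to a
# virtual model, an environment, and the criteria for selection.

GENOME_LEN = 8

def develop(genome):
    """Development rule + mapping: the code script becomes a 'virtual model',
    here simply the cumulative heights of a growing form."""
    heights, h = [], 0
    for gene in genome:
        h += gene
        heights.append(h)
    return heights

def fitness(model, target=20):
    """Selection criterion: closeness of the final height to the
    environment's target value (the 'environment' of the model)."""
    return -abs(model[-1] - target)

def evolve(pop_size=30, generations=40, seed=1):
    rng = random.Random(seed)
    # initial population of random code scripts
    pop = [[rng.randint(0, 5) for _ in range(GENOME_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(develop(g)), reverse=True)
        parents = pop[: pop_size // 2]          # selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(GENOME_LEN)
            child = a[:cut] + b[cut:]           # crossover
            if rng.random() < 0.2:              # mutation
                child[rng.randrange(GENOME_LEN)] = rng.randint(0, 5)
            children.append(child)
        pop = parents + children
    best = max(pop, key=lambda g: fitness(develop(g)))
    return develop(best)

print(evolve()[-1])  # final height of the fittest evolved form
```

Because the result is evolved rather than specified, rerunning with a different seed or fitness target yields a different form from the same machinery, which is the "always dynamic, always evolving and always different" quality described above.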
Abstract:
Abstract—Corneal topography estimation based on the Placido disk principle relies on good precorneal tear film quality and a sufficiently wide eyelid (palpebral) aperture to avoid reflections from eyelashes. However, in practice, these conditions are not always fulfilled, resulting in missing regions, smaller corneal coverage, and subsequently poorer estimates of corneal topography. Our aim was to enhance the standard operating range of a Placido disk videokeratoscope to obtain reliable corneal topography estimates in patients with poor tear film quality, such as those diagnosed with dry eye, and with narrower palpebral apertures, as in the case of Asian subjects. This was achieved by incorporating into the instrument’s own topography estimation algorithm an image processing technique that comprises a polar-domain adaptive filter and a morphological closing operator. The experimental results from measurements of test surfaces and real corneas showed that the incorporation of the proposed technique results in better estimates of corneal topography and, in many cases, a significant increase in the estimated coverage area, making such an enhanced videokeratoscope a better tool for clinicians.
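The morphological closing mentioned in the pipeline is dilation followed by erosion; on a binary mask it fills gaps narrower than the structuring element, which is how small missing regions (e.g. eyelash shadows interrupting a Placido ring) can be bridged. The toy mask and the 3x3 structuring element below are illustrative, not videokeratoscope data or the authors' exact filter.

```python
# Morphological closing on a binary mask: dilate, then erode,
# both with an implicit 3x3 structuring element.

def dilate(img):
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # pixel turns on if any 3x3 neighbour is on
            out[r][c] = int(any(
                img[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))))
    return out

def erode(img):
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # pixel survives only if its whole 3x3 neighbourhood is on
            out[r][c] = int(all(
                img[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))))
    return out

def close(img):
    return erode(dilate(img))

# A horizontal "ring" segment with a one-pixel gap (a missing region):
mask = [[0, 0, 0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0, 0, 0],
        [1, 1, 1, 0, 1, 1, 1],
        [0, 0, 0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0, 0, 0]]
closed = close(mask)
print(closed[2])  # [1, 1, 1, 1, 1, 1, 1] — the gap at column 3 is filled
```

Closing fills the gap without thickening the line in the result, since the erosion removes the extra pixels the dilation added everywhere except inside the gap.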
Abstract:
In this chapter, a rationale is presented for incorporating philosophy into teacher training programs as a means of both preparing quality teachers for the 21st century and meeting the expectations detailed in the professional standards established by the statutory authority that regulates the profession in Queensland, the Queensland College of Teachers. Furthermore, in-service teachers from Buranda State School, a Brisbane primary school that has successfully taught philosophy to its students for over 10 years, share their experiences of teaching philosophy and how it has enhanced student learning and the quality of teaching and professionalism of the teachers. Finally, the implications of embedding philosophy into teacher training programs are explored in terms of developing the personal integrity of beginning teachers.
Abstract:
The design of driven pile foundations involves an iterative process requiring an initial estimate of the refusal level to determine the depth of boreholes for subsequent analyses. Current procedures for determining borehole depths incorporate parameters typically unknown at the investigation stage. Thus, a quantifiable procedure more applicable at this preliminary stage would provide greater confidence in estimating the founding level of driven piles. This paper examines the effectiveness of the Standard Penetration Test (SPT) in directly estimating driven pile refusal levels. A number of significant correlations were obtained between SPT information and pile penetration records, demonstrating the potential application of the SPT. Results indicated pile penetration was generally best described as a function of both the pile toe and cumulative shaft SPT values. The influence of the toe SPT increased when piles penetrated rock. A refusal criterion was established from the results to guide both the estimation of borehole depths and likely pile lengths during the design stage.
Abstract:
In the past two decades there has been increasing interest in branding tourism destinations in an effort to meaningfully differentiate against a myriad of competing places that offer similar attractions and facilities. The academic literature relating to destination branding commenced only as recently as 1998, and there remains a dearth of empirical data that tests the effectiveness of brand campaigns, particularly in terms of enhancing destination loyalty. This paper reports the results of an investigation into destination brand loyalty for Australia as a long-haul destination in a South American market. In spite of the high level of academic interest in the measurement of perceptions of destinations since the 1970s, few previous studies have examined perceptions held by South American consumers. Drawing on a model of consumer-based brand equity (CBBE), antecedents of destination brand loyalty were tested with data from a large Chilean sample of travelers, comprising a mix of previous visitors and non-visitors to Australia. Findings suggest that destination brand awareness, brand image, and brand value are positively related to brand loyalty for a long-haul destination. However, destination brand quality was not significantly related. The results also indicate that Australia is a more compelling destination brand for previous visitors compared to non-visitors.
Abstract:
The field of literacy studies has always been challenged by the changing technologies that humans have used to express, represent and communicate their feelings, ideas, understandings and knowledge. However, while the written word has remained central to literacy processes over a long period, it is generally accepted that there have been significant changes to what constitutes ‘literate’ practice. In particular, the status of the printed word has been challenged by the increasing dominance of the image, along with the multimodal meaning-making systems facilitated by digital media. For example, Gunther Kress and other members of the New London Group have argued that the second half of the twentieth century saw a significant cultural shift from the linguistic to the visual as the dominant semiotic mode. This, in turn, they suggest, was accompanied by a cultural shift ‘from page to screen’ as a dominant space of representation (e.g. Cope & Kalantzis, 2000; Kress, 2003; New London Group, 1996). In a similar vein, Bill Green has noted that we have witnessed a shift from the regime of the print apparatus to a regime of the digital electronic apparatus (Lankshear, Snyder and Green, 2000). For these reasons, the field of literacy education has been challenged to find new ways to conceptualise what is meant by ‘literacy’ in the twenty-first century and to rethink the conditions under which children might best be taught to be fully literate so that they can operate with agency in today’s world.
Abstract:
Forest policy and forestry management in Tasmania have undergone a number of changes in the last thirty years, many explicitly aimed at improving industry sustainability, job security, and forest biodiversity conservation. Yet forestry remains a contentious issue in Tasmania, due to a number of interacting factors, most significant of which is the prevalence of a ‘command and control’ governance approach by policymakers and managers. New approaches such as multiple-stakeholder decision-making, adaptive management, and direct public participation in policymaking are needed. Such an approach has been attempted in Canada in the last decade, through the Canadian Model Forest Program, and may be suitable for Tasmania. This paper seeks to describe what the Canadian Model Forest approach is, how it may be implemented in Tasmania, and what role it may play in the shift to a new forestry paradigm. Until such a paradigm shift occurs, contentions and confrontations are likely to continue.
Abstract:
The contributions of this thesis fall into three areas of certificateless cryptography. The first area is encryption, where we propose new constructions for both identity-based and certificateless cryptography. We construct an n-out-of-n group encryption scheme for identity-based cryptography that does not require any special means to generate the keys of the trusted authorities that are participating. We also introduce a new security definition for chosen ciphertext secure multi-key encryption. We prove that our construction is secure as long as at least one authority is uncompromised, and show that the existing constructions for chosen ciphertext security from identity-based encryption also hold in the group encryption case. We then consider certificateless encryption as the special case of 2-out-of-2 group encryption and give constructions for highly efficient certificateless schemes in the standard model. Among these is the first construction of a lattice-based certificateless encryption scheme. Our next contribution is a highly efficient certificateless key encapsulation mechanism (KEM), which we prove secure in the standard model. We introduce a new way of proving the security of certificateless schemes that are based on identity-based schemes. We leave the identity-based part of the proof intact, and just extend it to cover the part that is introduced by the certificateless scheme. We show that our construction is more efficient than any instantiation of generic constructions for certificateless key encapsulation in the standard model. The third area where the thesis contributes to the advancement of certificateless cryptography is key agreement. Swanson showed that many certificateless key agreement schemes are insecure if considered in a reasonable security model. We propose the first provably secure certificateless key agreement schemes in the strongest model for certificateless key agreement.
We extend Swanson's definition for certificateless key agreement and give more power to the adversary. Our new schemes are secure as long as each party has at least one uncompromised secret. Our first construction is in the random oracle model and gives the adversary slightly more capabilities than our second construction in the standard model. Interestingly, our standard model construction is as efficient as the random oracle model construction.
Abstract:
This was the question that confronted Wilson J in Jarema Pty Ltd v Michihiko Kato [2004] QSC 451. Facts: The plaintiff was the buyer of a commercial property at Bundall. The property comprised a six-storey office building with a basement car park containing 54 car parking spaces. The property was sold for $5 million with the contract being the standard REIQ/QLS form for Commercial Land and Buildings (2nd ed GST reprint). The contract provided for a “due diligence” period. During this period, the buyer’s solicitors discovered that there was no direct access from a public road to the car park entrance. Access to the car park was over a lot of which the Gold Coast City Council was the registered owner under a nomination of trustees, the Council holding the property on trust for car parking and town planning purposes. Due to the absence of a registered easement over the Council’s land, the buyer’s solicitors sought a reduction in the purchase price. The seller would not agree to this. Finally, the sale was completed with the buyer reserving its rights to seek compensation.
Abstract:
A basic tenet of ecological economics is that economic growth and development are ultimately constrained by environmental carrying capacities. It is from this basis that notions of a sustainable economy and of sustainable economic development emerge to undergird the “standard model” of ecological economics. However, the belief in “hard” environmental constraints may be obscuring the important role of the entrepreneur in the co-evolution of economic and environmental relations, and hence limiting or distorting the analytic focus of ecological economics and the range of policy options that are considered for sustainable economic development. This paper outlines a co-evolutionary model of the dynamics of economic and ecological systems as connected by entrepreneurial behaviour. We then discuss some of the key analytic and policy implications.
Abstract:
Discrete Markov random field models provide a natural framework for representing images or spatial datasets. They model the spatial association present while providing a convenient Markovian dependency structure and strong edge-preservation properties. However, parameter estimation for discrete Markov random field models is difficult due to the complex form of the associated normalizing constant for the likelihood function. For large lattices, the reduced dependence approximation to the normalizing constant is based on the concept of performing computationally efficient and feasible forward recursions on smaller sublattices which are then suitably combined to estimate the constant for the whole lattice. We present an efficient computational extension of the forward recursion approach for the autologistic model to lattices that have an irregularly shaped boundary and which may contain regions with no data; these lattices are typical in applications. Consequently, we also extend the reduced dependence approximation to these scenarios enabling us to implement a practical and efficient non-simulation based approach for spatial data analysis within the variational Bayesian framework. The methodology is illustrated through application to simulated data and example images. The supplemental materials include our C++ source code for computing the approximate normalizing constant and simulation studies.
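The forward recursion underlying the approach can be shown on a lattice small enough to check by brute force. The sketch below computes the exact log normalizing constant of a toy autologistic model on a regular 3x4 lattice by sweeping column by column over the 2^R column states; the parameter values and lattice size are illustrative, and the paper's reduced dependence approximation goes further by stitching such recursions over sublattices (and handling irregular boundaries, which this toy omits).

```python
from itertools import product
from math import exp, log

# Autologistic model on an R x C lattice, y_i in {0, 1}:
#   p(y) ∝ exp(ALPHA * sum_i y_i + BETA * sum_{i~j} y_i y_j)
R, C = 3, 4
ALPHA, BETA = -0.2, 0.4

def col_energy(col):
    """Singleton terms plus vertical neighbour terms within one column."""
    return ALPHA * sum(col) + BETA * sum(
        col[r] * col[r + 1] for r in range(R - 1))

def inter_energy(left, right):
    """Horizontal neighbour terms between two adjacent columns."""
    return BETA * sum(a * b for a, b in zip(left, right))

def log_z_recursion():
    """Forward recursion over columns: f[s] accumulates the contribution of
    all configurations of the columns seen so far that end in state s."""
    states = list(product((0, 1), repeat=R))
    f = {s: exp(col_energy(s)) for s in states}
    for _ in range(C - 1):
        f = {s: exp(col_energy(s)) * sum(
                 f[t] * exp(inter_energy(t, s)) for t in states)
             for s in states}
    return log(sum(f.values()))

def log_z_brute():
    """Direct enumeration of all 2^(R*C) configurations, for checking."""
    total = 0.0
    for y in product((0, 1), repeat=R * C):
        cols = [y[c * R:(c + 1) * R] for c in range(C)]
        e = sum(col_energy(col) for col in cols)
        e += sum(inter_energy(cols[c], cols[c + 1]) for c in range(C - 1))
        total += exp(e)
    return log(total)

print(round(log_z_recursion(), 6) == round(log_z_brute(), 6))  # True
```

The recursion costs O(C * 4^R) instead of O(2^(R*C)), which is why sweeping narrow sublattices and recombining them, as the reduced dependence approximation does, makes large lattices tractable.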
Abstract:
Encryption is a well-established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried, and the performance of the database depends on how the sensitive data are encrypted. In this paper we review conventional encryption methods, which can only be partially queried, and propose an encryption method for numerical data that can be queried effectively. The proposed system includes the design of the service scenario and the metadata.
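One classic way to make encrypted numeric columns queryable is bucketization: the server stores only ciphertext plus a coarse bucket id (the metadata), answers range queries over bucket ids, and the client decrypts the candidate rows and applies the exact predicate. The sketch below is a generic illustration of that idea, not the paper's specific scheme, and its keystream cipher is a toy for demonstration only, not a secure construction.

```python
import hmac, hashlib
from bisect import bisect_right

KEY = b"demo-key"                 # toy client-side key
BUCKETS = [0, 100, 1000, 10000]   # metadata: bucket boundaries

def keystream(n, nonce):
    """Toy HMAC-counter keystream; for illustration only, NOT secure."""
    out, counter = b"", 0
    while len(out) < n:
        out += hmac.new(KEY, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:n]

def encrypt(value, nonce):
    data = str(value).encode()
    return bytes(a ^ b for a, b in zip(data, keystream(len(data), nonce)))

def decrypt(ct, nonce):
    return int(bytes(a ^ b for a, b in zip(ct, keystream(len(ct), nonce))).decode())

def bucket(value):
    """Map a plaintext value to its coarse bucket id (stored in the clear)."""
    return bisect_right(BUCKETS, value) - 1

# Client encrypts; the server stores only (bucket id, nonce, ciphertext).
salaries = [50, 250, 800, 4200, 9000]
table = [(bucket(v), i.to_bytes(8, "big"), encrypt(v, i.to_bytes(8, "big")))
         for i, v in enumerate(salaries)]

# Server side: the range query 100 <= v < 1000 becomes a bucket-id filter.
lo_b, hi_b = bucket(100), bucket(999)
candidates = [row for row in table if lo_b <= row[0] <= hi_b]

# Client side: decrypt the candidates and apply the exact predicate.
result = sorted(decrypt(ct, nonce) for _, nonce, ct in candidates
                if 100 <= decrypt(ct, nonce) < 1000)
print(result)  # [250, 800]
```

The trade-off is between leakage and performance: finer buckets mean fewer false candidates shipped to the client but reveal more about the plaintext distribution, which is exactly the kind of design decision the service scenario and metadata must capture.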
Abstract:
Gaining a competitive edge in the engagement, success and retention of commencing students is a significant issue in higher education, made more pressing by the considerable and increasing pressure on teaching and learning from the new standards framework and performance funding. This paper introduces the concept of maturity models (MMs) and their application to assessing the capability of higher education institutions (HEIs) to address student engagement, success and retention (SESR). A concise description of the features of maturity models is presented with reference to an SESR-MM currently being developed. The SESR-MM is proposed as a viable instrument for assisting HEIs in the management and improvement of their SESR activities.