927 results for Precautionary Principle
Abstract:
Since 1995 the International Alliance for Interoperability (buildingSMART) has developed a robust standard called the Industry Foundation Classes (IFC). IFC is an object-oriented data model with a related file format that has facilitated the efficient exchange of data in the development of building information models (BIM). The Cooperative Research Centre for Construction Innovation has contributed to the international effort in the development of the IFC standard, and specifically the reinforced concrete part of the latest IFC 2x3 release. Industry Foundation Classes have been endorsed by the International Organization for Standardization as a Publicly Available Specification (PAS) under the label ISO/PAS 16739. For more details, go to http://www.tc184-sc4.org/About_TC184-SC4/About_SC4_Standards/. The current IFC model covers the building itself to a useful level of detail. The next stage of development for the IFC standard is where the building meets the ground (terrain) and civil and external works such as pavements, retaining walls, bridges and tunnels. With the current focus in Australia on infrastructure projects over the next 20 years, a logical extension to this standard is the area of site and civil works. This proposal recognises that there is an existing body of work on the specification of road representation data. In particular, LandXML is recognised, as are TransXML in the broader context of transportation and CityGML in the common interfacing of city maps, buildings and roads. Examination of interfaces between IFC and these specifications is therefore within the scope of this project. That such interfaces can be developed has already been demonstrated in principle within the IFC for Geographic Information Systems (GIS) project. National road standards that are already in use should be carefully analysed and contacts established in order to gain from this knowledge.
The Object Catalogue for the Road Transport Sector (OKSTRA) should be noted as an example. It is also noted that buildingSMART Norway has submitted a proposal
Abstract:
Damage localization induced by strain softening can be predicted by the direct minimization of a global energy function. This article concerns the computational strategy for implementing this principle for softening materials such as concrete. Instead of using heuristic global optimization techniques, our strategy is a hybrid of local optimization methods and a path-finding approach to ensure a global optimum. With admissible nodal displacements being independent variables, it is easy to deal with the geometric (mesh) constraint conditions. The direct search optimization methods recover the localized solutions for a range of softening lattice models which are representative of quasi-brittle structures.
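The strategy described above can be sketched on a toy problem: local optimization restarted from a path of starting points so that a global minimum of a nonconvex energy is recovered. The two-well function below is an illustrative stand-in for a softening lattice energy, not the article's actual model.

```python
import numpy as np
from scipy.optimize import minimize

def energy(u):
    # Toy nonconvex "global energy" with two wells: a shallow one near
    # u = -1 and the global minimum near u = 2 (illustrative only).
    u = u[0]
    return (u + 1.0) ** 2 * (u - 2.0) ** 2 - 0.5 * u

# A local minimizer started in the wrong basin stalls in the shallow well,
# so restart it along a path of initial guesses and keep the best result.
starts = np.linspace(-3.0, 3.0, 13)
results = [minimize(energy, x0=[s], method="Nelder-Mead") for s in starts]
best = min(results, key=lambda r: r.fun)
# best.x lies near the global minimum at u ~ 2
```

The path of restarts plays the role of the path-finding component: each restart is cheap local optimization, and scanning the admissible range guards against settling in a local (non-global) energy well.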
Abstract:
This study established that the core principle underlying the categorisation of activities has the potential to provide more comprehensive outcomes than the recognition of activities, because it takes into consideration activities other than directional locomotion.
Abstract:
The purpose of this chapter is to discuss the relationship between crime and morality, with a specific focus on crimes against morality. While we argue that all crimes have a general moral basis, condemned as wrong or bad and proscribed by society, there is a specific group of offences in modern democratic nations labelled crimes against morality. Included within this group are offences related to prostitution, pornography and homosexuality. What do these crimes have in common? Most clearly, they tend to have a sexual basis and are often argued to do sexual harm, in a moral and/or psychological sense, as well as physically. Conversely, they are often argued to be victimless crimes, especially when the acts occur between consenting adults. Finally, they are considered essentially private acts, but they often occur, and are regulated, in the public domain. Most importantly, each of these crimes against morality has only relatively recently (i.e. in the past 150 years) become identified and regulated by the state as a criminal offence.
Abstract:
The aim of this paper is to show how principles of ecological psychology and dynamical systems theory can underpin a philosophy of coaching practice in a nonlinear pedagogy. Nonlinear pedagogy is based on a view of the human movement system as a nonlinear dynamical system. We demonstrate how this perspective of the human movement system can aid understanding of skill acquisition processes and underpin practice for sports coaches. We provide a description of nonlinear pedagogy followed by a consideration of some of the fundamental principles of ecological psychology and dynamical systems theory that underpin it as a coaching philosophy. We illustrate how each principle impacts on nonlinear pedagogical coaching practice, demonstrating how each principle can substantiate a framework for the coaching process.
Abstract:
Public private partnerships (PPPs) have been widely used as a method for public infrastructure project delivery, both locally and internationally; however, the adoption of PPPs in social infrastructure procurement has still been very limited. The objective of this paper is to investigate the potential of implementing the current PPP framework in social affordable housing projects in South East Queensland. Data were collected from 22 interviewees with rich experience in the industry. The findings of this study show that affordable housing investment has been considered by industry practitioners to be a risky business in comparison to other private rental housing investment. The main deterrents to the adoption of PPPs in social infrastructure projects are tenant-related factors, such as the inability to pay rent and the inability to care for the property. The study also suggests the importance of seeking strategic partnerships with community-based organisations that have experience in managing similar tenant profiles. The current PPP guideline is also viewed as inappropriate for affordable housing projects, but the principles of the VFM framework and of risk allocation in PPPs can still be applied to them. This study helps to understand the viability of PPPs in social housing procurement, and points out the importance of developing a guideline for multi-stakeholder partnerships and of expanding the current VFM and PPP guidelines.
Abstract:
There are various principles for layout design, such as balance, rhythm, unity and harmony, but each principle has often been introduced as a separate concept rather than within an integrated and systematic structure, so that designers and design students must rely on repeated practice to acquire these skills. The paper seeks to develop a conceptual framework for a systematic mapping of layout design principles by using Yin and Yang and the Five Elements. Yin and Yang theory explains all natural phenomena with its own conceptual model and facilitates finding harmony and balance between visual elements in terms of systematic and organic relations. The most common and well-known layout design principles were identified from 10 different resources, such as design books and articles, and remapped following the structure of Yin and Yang and the Five Elements. A systematic framework explaining the relationships of design principles was created, and 32 design students participated in its efficiency test. The outcome suggests a high possibility that the framework can be used in professional fields and design education.
Abstract:
Mobile robots are widely used in many industrial fields, and path planning is one of the most important aspects of mobile robot research. Path planning for a mobile robot is to find a collision-free route, through the robot's environment with obstacles, from a specified start location to a desired goal destination, while satisfying certain optimization criteria. Most of the existing path planning methods, such as the visibility graph, cell decomposition, and potential field methods, are designed with a focus on static environments, in which there are only stationary obstacles. However, in practical systems such as marine science research, mining robots, and RoboCup games, robots usually face dynamic environments, in which both moving and stationary obstacles exist. Because of the complexity of dynamic environments, research on path planning in environments with dynamic obstacles is limited; only a small number of papers have been published in this area, in comparison with hundreds of reports on path planning in stationary environments in the open literature. Recently, a genetic algorithm based approach was introduced to plan the optimal path for a mobile robot in a dynamic environment with moving obstacles. However, as the number of obstacles in the environment increases, and as the moving speed and direction of the robot and obstacles change, the size of the problem to be solved grows sharply, and the performance of the genetic algorithm based approach deteriorates significantly. This motivates the present research, which develops and implements a simulated annealing algorithm based approach to find the optimal path for a mobile robot in a dynamic environment with moving obstacles. The simulated annealing algorithm is an optimization algorithm similar to the genetic algorithm in principle.
However, our investigation and simulations have indicated that the simulated annealing algorithm based approach is simpler and easier to implement. Its performance is also shown to be superior to that of the genetic algorithm based approach, in both online and offline processing times as well as in obtaining the optimal solution for path planning of the robot in the dynamic environment. The first step of many path planning methods is to find an initial feasible path for the robot. A commonly used method for finding the initial path is to randomly pick vertices of the obstacles in the search space. This is time consuming in both static and dynamic path planning, and has an important impact on the efficiency of dynamic path planning. This research proposes a heuristic method to find a feasible initial path efficiently. The heuristic method is then incorporated into the proposed simulated annealing algorithm based approach for dynamic robot path planning. Simulation experiments have shown that, with the incorporation of the heuristic method, the developed simulated annealing algorithm based approach requires much shorter processing time to obtain the optimal solutions in the dynamic path planning problem. Furthermore, the quality of the solution, as characterized by the length of the planned path, is also improved with the incorporated heuristic method, for both online and offline path planning.
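The core of a simulated annealing planner can be sketched as follows: a path is a chain of waypoints between fixed start and goal points, each move perturbs one waypoint, and a length-plus-collision cost is accepted under the usual Metropolis rule. The single obstacle, cost weights and cooling schedule below are illustrative assumptions, not the thesis's implementation.

```python
import math, random

random.seed(0)
START, GOAL = (0.0, 0.0), (10.0, 0.0)
OBSTACLE, RADIUS = (5.0, 0.0), 2.0   # one circular obstacle on the direct line

def cost(path):
    # Path length plus a heavy penalty for waypoints inside the obstacle.
    pts = [START] + path + [GOAL]
    length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    penalty = sum(max(0.0, RADIUS - math.dist(p, OBSTACLE)) for p in pts)
    return length + 50.0 * penalty

path = [(x, 0.0) for x in (2.5, 5.0, 7.5)]   # initial path cuts through the obstacle
temp, best = 5.0, list(path)
while temp > 1e-3:
    i = random.randrange(len(path))          # perturb one random waypoint
    cand = list(path)
    x, y = cand[i]
    cand[i] = (x + random.uniform(-1, 1), y + random.uniform(-1, 1))
    d = cost(cand) - cost(path)
    if d < 0 or random.random() < math.exp(-d / temp):   # Metropolis acceptance
        path = cand
        if cost(path) < cost(best):
            best = list(path)
    temp *= 0.999                            # geometric cooling schedule

# `best` should now bend around the obstacle instead of crossing it
```

A heuristic initial path, as proposed in the thesis, would replace the straight-line initialisation above so that annealing starts from a feasible (collision-free) chain of waypoints and converges faster.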
Abstract:
Corneal topography estimation based on the Placido disk principle relies on good precorneal tear film quality and a sufficiently wide eyelid (palpebral) aperture to avoid reflections from eyelashes. In practice, however, these conditions are not always fulfilled, resulting in missing regions, smaller corneal coverage, and subsequently poorer estimates of corneal topography. Our aim was to enhance the standard operating range of a Placido disk videokeratoscope to obtain reliable corneal topography estimates in patients with poor tear film quality, such as those diagnosed with dry eye, and with narrower palpebral apertures, as in the case of Asian subjects. This was achieved by incorporating into the instrument's own topography estimation algorithm an image processing technique that comprises a polar-domain adaptive filter and a morphological closing operator. The experimental results from measurements of test surfaces and real corneas showed that the incorporation of the proposed technique results in better estimates of corneal topography and, in many cases, a significant increase in the estimated coverage area, making such an enhanced videokeratoscope a better tool for clinicians.
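The role of the morphological closing operator can be illustrated on a one-dimensional intensity profile across the Placido rings: a closing with a structuring element wider than the gap fills a short dark dropout (such as an eyelash shadow) while leaving the ring pattern itself intact. The sizes and values below are illustrative assumptions, not the instrument's parameters.

```python
import numpy as np
from scipy.ndimage import grey_closing

# Alternating dark/bright rings, three samples wide each (illustrative).
profile = np.tile([0.0, 0.0, 0.0, 1.0, 1.0, 1.0], 10)

corrupted = profile.copy()
corrupted[4] = 0.0          # a one-sample dropout in the middle of a bright ring

# Closing (dilation then erosion) with a 3-sample structuring element fills
# dark gaps narrower than the element but preserves the genuine dark rings.
restored = grey_closing(corrupted, size=3)
```

In the videokeratoscope the same operation runs in two dimensions on the ring image, after the polar-domain adaptive filtering step; the structuring element must stay smaller than the genuine inter-ring spacing so that real dark rings are not filled in.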
Abstract:
A wide range of screening strategies have been employed to isolate antibodies and other proteins with specific attributes, including binding affinity, specificity, stability and improved expression. However, there remains no high-throughput system to screen for target-binding proteins in a mammalian, intracellular environment. Such a system would allow binding reagents to be isolated against intracellular clinical targets such as cell signalling proteins associated with tumour formation (p53, ras, cyclin E), proteins associated with neurodegenerative disorders (huntingtin, beta-amyloid precursor protein), and various proteins crucial to viral replication (e.g. HIV-1 proteins such as Tat, Rev and Vif-1), which are difficult to screen by phage, ribosome or cell-surface display. This study used the β-lactamase protein complementation assay (PCA) as the display and selection component of a system for screening a protein library in the cytoplasm of HEK 293T cells. The colicin E7 (ColE7) and Immunity protein 7 (Imm7) Escherichia coli proteins were used as model interaction partners for developing the system. These proteins drove effective β-lactamase complementation, resulting in a signal-to-noise ratio (9:1 – 13:1) comparable to that of other β-lactamase PCAs described in the literature. The model Imm7-ColE7 interaction was then used to validate protocols for library screening. Single positive cells that harboured the Imm7 and ColE7 binding partners were identified and isolated using flow cytometric cell sorting in combination with the fluorescent β-lactamase substrate CCF2/AM. A single-cell PCR was then used to amplify the Imm7 coding sequence directly from each sorted cell. With the screening system validated, it was then used to screen a protein library based on the Imm7 scaffold against a proof-of-principle target.
The wild-type Imm7 sequence, as well as mutants with wild-type residues in the ColE7-binding loop, were enriched from the library after a single round of selection, which is consistent with other eukaryotic screening systems such as yeast and mammalian cell-surface display. In summary, this thesis describes a new technology for screening protein libraries in a mammalian, intracellular environment. This system has the potential to complement existing screening technologies by allowing access to intracellular proteins and expanding the range of targets available to the pharmaceutical industry.
Abstract:
The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools tends to take a deficit perspective of schools and teachers, with the lack of resources and teacher ‘technophobia’ most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting. To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys’ school in urban Australia. The study employed an ‘explanatory’ two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation. In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population to determine adoption trends and predictors of SMC usage (N=481). 
Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were conducted using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors. Peer support emerged as the best predictor of SMC usage. Other salient predictors included perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported having high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students’ engagement with the Web 2.0 learning initiative. The qualitative phase that followed provided insights into these tensions by shifting the analytics from individual attitudes and behaviours to shared social and cultural reasoning practices that explain students’ engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed. Textual data were analysed using Membership Categorisation Analysis. Students’ accounts converged around a key proposition: the Web 2.0 learning initiative was useful-in-principle but useless-in-practice.
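The incremental CART modelling described above can be sketched with scikit-learn's tree learner, fitted on nested predictor sets. The synthetic data, variable roles and effect sizes below are illustrative assumptions, not the study's instrument or results.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 481  # matches the reported sample size
individual = rng.normal(size=(n, 3))     # e.g. learning goals, playfulness, innovativeness
social = rng.normal(size=(n, 1))         # e.g. peer support
technological = rng.normal(size=(n, 2))  # e.g. perceived usefulness, ease of use

# Synthetic outcome dominated by the "peer support" column, mirroring the
# finding that peer support was the best predictor of usage.
usage = (social[:, 0] + 0.3 * individual[:, 0]
         + 0.1 * rng.normal(size=n) > 0).astype(int)

scores = {}
for name, X in [
    ("individual", individual),
    ("individual+social", np.hstack([individual, social])),
    ("full", np.hstack([individual, social, technological])),
]:
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, usage)
    scores[name] = tree.score(X, usage)   # training accuracy of each nested model
```

On data like this, adding the social predictor block raises the fit markedly, which is the pattern an incremental CART analysis is designed to expose.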
While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that obfuscated the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of social and cultural imperatives at play within the school: (i) ‘cool/uncool’, (ii) ‘dominant staff/compliant student’, and (iii) ‘digital learning/academic performance’. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students’ perception of the school culture as authoritarian and punitive, with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines. Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being ‘cool’ (or at least ‘not uncool’), (ii) being sufficiently ‘compliant’, and (iii) achieving good academic grades. Their reasoned response, therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others.
These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility – a dispositional construct that comprises personal innovativeness, cognitive playfulness and learning goals orientation. The logic then is ‘both/and’ rather than ‘either/or’ for these individuals, who have a capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school. In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for the low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a ‘third way’ of theorising technological and pedagogical innovation in schools, one which is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school, and to its complex relationship with students’ lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at one and the same time, be digital kids and analogue students.
Abstract:
An alternative approach to port decoupling and matching of arrays with tightly coupled elements is proposed. The method is based on the inherent decoupling effect obtained by feeding the orthogonal eigenmodes of the array. For this purpose, a modal feed network is connected to the array. The decoupled external ports of the feed network may then be matched independently by using conventional matching circuits. Such a system may be used in digital beam forming applications with good signal-to-noise performance. The theory is applicable to arrays with an arbitrary number of elements, but implementation is only practical for smaller arrays. The principle is illustrated by means of two examples.
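The decoupling principle can be sketched numerically: for a reciprocal array the symmetric impedance matrix couples the element ports, but in the basis of its orthogonal eigenvectors it becomes diagonal, so each modal port sees an independent impedance that can be matched on its own. The 2x2 impedance values below are illustrative assumptions (a real, lossless two-element example), not measured data.

```python
import numpy as np

Z = np.array([[50.0, 20.0],    # self impedance of each element: 50 ohm
              [20.0, 50.0]])   # strong mutual coupling of 20 ohm

# Columns of `modes` are the orthogonal eigenmode excitations of the array;
# feeding these through a modal feed network diagonalizes the coupling.
eigvals, modes = np.linalg.eigh(Z)

# Impedance matrix seen at the modal (external) ports of the feed network.
Z_modal = modes.T @ Z @ modes

# Z_modal is diagonal: the odd mode sees 30 ohm and the even mode 70 ohm,
# and each can now be matched independently with a conventional circuit.
```

For a symmetric two-element array the eigenmodes are simply the even (in-phase) and odd (anti-phase) excitations; larger arrays follow the same algebra, which is why the method is general in theory even though the feed network becomes impractical for many elements.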
Abstract:
Biomechanical and biophysical principles can be applied to study biological structures in their modern or fossil form. Bone is an important tissue in paleontological studies, as it is a commonly preserved element in most fossil vertebrates and often allows its microstructures, such as lacunae and canaliculi, to be studied in detail. In this context, the principles of fluid mechanics and scaling laws have previously been applied to enhance the understanding of bone microarchitecture and its implications for the evolution of hydraulic structures to transport fluid. It has been shown that the microstructure of bone has evolved to maintain efficient transport between the nutrient supply and cells, the living components of the tissue. Application of the principle of minimal expenditure of energy to this analysis shows that a path distance comprising five or six lamellar regions represents an effective limit for fluid and solute transport between the nutrient supply and cells; beyond this threshold, hydraulic resistance in the network increases and additional energy expenditure is necessary for further transportation. This suggests an optimization of the size of bone's building blocks (such as osteon or trabecular thickness) to meet metabolic demand with minimal expenditure of energy. This biomechanical aspect of bone microstructure is corroborated by the ratio of osteon to Haversian canal diameters and the scaling constants of the several mammals considered in this study. This aspect of vertebrate bone microstructure and physiology may provide a basis for understanding the form and function relationship in both extinct and extant taxa.
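The dependence of hydraulic resistance on transport path length, central to the minimal-energy argument above, follows from Poiseuille flow through a narrow channel, where resistance grows linearly with length and with the inverse fourth power of radius. The dimensions below are round illustrative numbers, not measured canalicular geometry.

```python
import math

def poiseuille_resistance(mu, length, radius):
    """Hydraulic resistance of a cylindrical channel, R = 8*mu*L / (pi*r^4)."""
    return 8.0 * mu * length / (math.pi * radius ** 4)

# Illustrative values: water-like viscosity, micron-scale channel.
base = poiseuille_resistance(mu=1e-3, length=1e-6, radius=1e-7)

# Doubling the path length doubles the resistance ...
longer = poiseuille_resistance(1e-3, 2e-6, 1e-7)

# ... while halving the radius multiplies it by 2**4 = 16.
narrower = poiseuille_resistance(1e-3, 1e-6, 5e-8)
```

The linear growth of resistance with path length is what makes a transport distance of more than five or six lamellar regions energetically unfavourable: each added lamella adds resistance, and maintaining flux then demands extra pumping energy.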
Abstract:
• The quality of the environment is important to client recovery and rehabilitation.
• The preferred environment for the care of the mentally ill over time has been the home.
• Environmental strategies in the care of the mentally ill became more important in the eighteenth century, when it was noticed that patients were more manageable in a pleasant environment.
• Confinement of the mentally ill in large public asylums was largely an innovation of the nineteenth century.
• The therapeutic milieu is a consciously organised environment.
• Maxwell Jones in the United States and Thomas Main in the United Kingdom pioneered the concept of the hospital and its environment as treatment tools.
• The goals of the therapeutic milieu are containment, structure, support, involvement, validation, symptom management, and maintaining links with family and the community.
• The principles on which the therapeutic milieu is based include: open communication, democratisation, reality confrontation, permissiveness, group cohesion and the multidisciplinary team.
• The principle guiding the care of clients in the community is that of the least-restrictive alternative.
• The therapeutic community residence is an environment that encourages the development of the client as a person in interaction with others, rather than as someone suffering from a health problem or disability.
• The preferred contemporary setting for the provision of mental health care is the community.
• The predominant form of service delivery in the community is case management, which has been found to be most effective for people with severe mental illnesses.
• The principles of caring in the community are self-determination, normalisation, a focus on client strengths, and the community as a resource.
Abstract:
Information retrieval (IR) is an important albeit imperfect component of information technologies. Insufficient diversity of retrieved documents is one of the primary issues studied in this research; this study shows that the problem leads to a decrease in precision and recall, the traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is the optimization of retrieval precision after all feedback has been issued. This is done by increasing the diversity of retrieved documents; the study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the “bedrock” of current probabilistic information retrieval theory. Neither the proposed approach nor other methods of diversification of retrieved documents from the literature conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (for which it may not have been designed, but is actively used). Retrieval precision of the search session should be optimized with a multistage stochastic programming model to accomplish this aim; however, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, in which the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that information retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents.
The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR; the main reason for this was the insufficient quality of the clusters generated from the TREC collection, which violated the underlying assumption.
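The baseline's Rocchio relevance feedback step can be sketched as the classic query update: the query vector moves toward the centroid of documents judged relevant and away from the centroid of non-relevant ones. The alpha, beta and gamma weights below are textbook defaults, assumed here rather than taken from the thesis.

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    # Move the query toward the relevant centroid and away from the
    # non-relevant centroid; clip negative term weights to zero, as is common.
    q = alpha * query
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q = q - gamma * np.mean(nonrelevant, axis=0)
    return np.clip(q, 0.0, None)

# Toy 3-term vector space: the judged-relevant documents emphasise term 2.
query = np.array([1.0, 0.0, 0.0])
rel = np.array([[0.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
nonrel = np.array([[0.0, 0.0, 1.0]])
new_q = rocchio(query, rel, nonrel)
```

In the TREC baseline this update would be applied to BM-25-weighted document vectors after each feedback round; the dual control approach differs precisely in optimizing the whole session rather than greedily reweighting the next query.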