170 results for coherent scatter
Abstract:
Few coherent and logical frameworks exist for the teaching and assessment of programming subjects, and those that do are not sufficiently generic and adaptable to be used outside the particular tertiary institutions in which they were developed. This paper presents the Teaching and Assessment of Software Development (TASD) framework. We describe its development and implementation at an Australian university and demonstrate, with examples and supporting data, how it has been used. Extracts of criteria sheets (grading rubrics) for a variety of assessment tasks are included. The numerous advantages of this new framework are discussed, with comparisons made to those reported in the published literature.
Abstract:
While the need for teamwork skills consistently appears in job advertisements across all sectors, the development of these skills remains, for many university students (and some academic staff), one of the most painful and most complained-about experiences. This presentation introduces the final phase of a project that has investigated and analysed the design of teamwork assessment across all discipline areas in order to provide a university-wide protocol for this important graduate capability. The protocol consolidates best-practice guidelines and resources across a range of approaches to team assessment and includes an online diagnostic tool for evaluating the quality of assessment design. Guidelines are provided for all aspects of the design process, such as developing real-world relevance; choosing the ideal team structure; planning for intervention and conflict resolution; and selecting appropriate marking options. While still allowing academic staff to exercise creativity in assessment design, the guidelines increase the likelihood that students experience a consistent and explicit approach to teamwork throughout their course. If implementation of the protocol is successful, the project team predicts that the resulting consistency and explicitness in approaches to teamwork will lead to more coherent skill development across units, more realistic expectations for students and staff, and better communication between all those participating in the process.
Abstract:
There is no single, coherent jurisprudence for civil society organisations. Pressure for a clearly enunciated body of law applying to the whole of this sector of society continues to increase. The rise of third sector scholarship, the retreat of the welfare state, the rediscovery of the concept of civil society and pressures to strengthen social capital have all contributed to an ongoing stream of inquiry into the laws that regulate and favour civil society organisations. There have been almost thirty inquiries over the last sixty years into the doctrine of charitable purpose in common law countries. Those inquiries have established that problems with the law applying to civil society organisations are rooted in the common law adopting a ‘technical’ definition of charitable purpose and in the failure of this body of law to develop in response to societal changes. Even though it is now well recognised that problems with law reform stem from problems inherent in the doctrine of charitable purpose, statutory reforms have merely ‘bolted on’ additions to the flawed ‘technical’ definition. In this way the scope of operation of the law has been incrementally expanded to include a larger number of civil society organisations. This piecemeal approach continues the exclusion of most civil society organisations from the law of charities discourse, and fails to address the underlying jurisprudential problems. Comprehensive reform requires revisiting the foundational problems embedded in the doctrine of charitable purpose, being informed by recent scholarship, and a paradigm shift that extends the doctrine to include all civil society organisations. Scholarly inquiry into civil society organisations, particularly from within the discipline of neoclassical economics, has elucidated insights that can inform legal theory development. This theory development requires decoupling the two distinct functions performed by the doctrine of charitable purpose, which are: setting the scope of regulation, and determining entitlement to favours, such as tax exemption. If the two different functions of the doctrine are considered separately in the light of theoretical insights from other disciplines, the architecture for a jurisprudence emerges that facilitates regulation, but does not necessarily favour all civil society organisations. Informed by that broader discourse, it is argued that when determining the scope of regulation, civil society organisations are identified by reference to charitable purposes that are not technically defined. These charitable purposes are in essence purposes which are: Altruistic, for public Benefit, pursued without Coercion. These charitable purposes differentiate civil society organisations from organisations in the three other sectors, namely: Business, which is characterised by a lack of altruism; Government, which is characterised by coercion; and Family, which is characterised by benefits being private, not public. When determining entitlement to favour, it is theorised that it is the extent or nature of the public benefit evident in the pursuit of a charitable purpose that justifies entitlement to favour. Entitlement to favour based on the extent of public benefit is theoretically the simpler: the greater the public benefit, the greater the justification for favour.
To be entitled to favour based on the nature of a purpose being charitable, the purpose must fall within one of three categories developed from the first three heads of Pemsel’s case (the landmark categorisation case on taxation favour). The three categories proposed are: Dealing with Disadvantage; Encouraging Edification; and Facilitating Freedom. In this alternative paradigm a recast doctrine of charitable purpose underpins a jurisprudence for civil society in a way similar to the way contract underpins the jurisprudence for the business sector, the way freedom from arbitrary coercion underpins the jurisprudence of the government sector, and the way equity within families underpins succession and family law jurisprudence for the family sector. This alternative architecture for the common law, developed from the doctrine of charitable purpose but inclusive of all civil society purposes, is argued to cover the field of the law applying to civil society organisations and to warrant its own third space as a body of law between public law and private law in jurisprudence.
Abstract:
In the quest for shorter time-to-market, higher quality and reduced cost, model-driven software development has emerged as a promising approach to software engineering. The central idea is to promote models to first-class citizens in the development process. Starting from a set of very abstract models in the early stage of the development, they are refined into more concrete models and finally, as a last step, into code. As early phases of development focus on different concepts compared to later stages, various modelling languages are employed to most accurately capture the concepts and relations under discussion. In light of this refinement process, translating between modelling languages becomes a time-consuming and error-prone necessity. This is remedied by model transformations providing support for reusing and automating recurring translation efforts. These transformations typically can only be used to translate a source model into a target model, but not vice versa. This poses a problem if the target model is subject to change. In this case the models get out of sync and therefore do not constitute a coherent description of the software system anymore, leading to erroneous results in later stages. This is a serious threat to the promised benefits of quality, cost-saving, and time-to-market. Therefore, providing a means to restore synchronisation after changes to models is crucial if the model-driven vision is to be realised. This process of reflecting changes made to a target model back to the source model is commonly known as Round-Trip Engineering (RTE). While there are a number of approaches to this problem, they impose restrictions on the nature of the model transformation. Typically, in order for a transformation to be reversed, for every change to the target model there must be exactly one change to the source model. While this makes synchronisation relatively “easy”, it is ill-suited for many practically relevant transformations as they do not have this one-to-one character. To overcome these issues and to provide a more general approach to RTE, this thesis puts forward an approach in two stages. First, a formal understanding of model synchronisation on the basis of non-injective transformations (where a number of different source models can correspond to the same target model) is established. Second, detailed techniques are devised that allow the implementation of this understanding of synchronisation. A formal underpinning for these techniques is drawn from abductive logic reasoning, which allows the inference of explanations from an observation in the context of a background theory. As non-injective transformations are the subject of this research, there might be a number of changes to the source model that all equally reflect a certain target model change. To help guide the procedure in finding “good” source changes, model metrics and heuristics are investigated. Combining abductive reasoning with best-first search and a “suitable” heuristic enables efficient computation of a number of “good” source changes. With this procedure Round-Trip Engineering of non-injective transformations can be supported.
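In outline, the final stage described above amounts to a best-first search over abduced candidate edits. The following Python sketch illustrates that shape only; the function names (candidate_edits, explains, heuristic) and the priority-queue formulation are illustrative assumptions, not the thesis's actual interfaces.

```python
import heapq

def synchronise(source_model, target_change, candidate_edits, explains, heuristic, k=3):
    """Best-first search for source-model edits that explain a target-model change.

    candidate_edits(source_model, target_change): abduced candidate source edits;
    explains(edit): True if applying the edit and re-running the transformation
    reproduces the changed target model;
    heuristic(edit): lower scores mark "better" edits (e.g. from model metrics).
    Returns up to k acceptable edits, best first.
    """
    frontier = []
    for i, edit in enumerate(candidate_edits(source_model, target_change)):
        # The index i breaks ties so heapq never compares edit objects directly.
        heapq.heappush(frontier, (heuristic(edit), i, edit))
    accepted = []
    while frontier and len(accepted) < k:
        _, _, edit = heapq.heappop(frontier)
        if explains(edit):
            accepted.append(edit)
    return accepted
```

Because the transformation is non-injective, several edits may pass the explains check; returning the k best-scoring candidates mirrors the thesis's goal of offering a ranked set of "good" source changes rather than a single answer.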
Abstract:
An experimental investigation has been made of a round, non-buoyant plume of nitric oxide, NO, in a turbulent grid flow of ozone, O3, using the Turbulent Smog Chamber at the University of Sydney. The measurements have been made at a resolution not previously reported in the literature. The reaction is conducted at non-equilibrium, so there is significant interaction between turbulent mixing and chemical reaction. The plume has been characterized by a set of constant initial reactant concentration measurements consisting of radial profiles at various axial locations. Whole-plume behaviour can thus be characterized, and parameters are selected for a second set of fixed physical location measurements in which the effects of varying the initial reactant concentrations are investigated. Careful experiment design and specially developed chemiluminescent analysers, which measure fluctuating concentrations of reactive scalars, ensure that spatial and temporal resolutions are adequate to measure the quantities of interest. Conserved scalar theory is used to define a conserved scalar from the measured reactive scalars and to define frozen, equilibrium and reaction-dominated cases for the reactive scalars. Reactive scalar means and the mean reaction rate are bounded by the frozen and equilibrium limits, but this is not always the case for the reactant variances and covariances. The plume reactant statistics are closer to the equilibrium limit than those for the ambient reactant. The covariance term in the mean reaction rate is found to be negative and significant for all measurements made. The Toor closure was found to overestimate the mean reaction rate by 15 to 65%. Gradient model turbulent diffusivities had significant scatter and were not observed to be affected by reaction. The ratio of turbulent diffusivities for the conserved scalar mean and that for the r.m.s. was found to be approximately 1. Estimates of the ratio of the dissipation timescales of around 2 were found downstream. Estimates of the correlation coefficient between the conserved scalar and its dissipation (parallel to the mean flow) were found to be between 0.25 and the significant value of 0.5. Scalar dissipations for non-reactive and reactive scalars were found to be significantly different. Conditional statistics are found to be a useful way of investigating the reactive behaviour of the plume, effectively decoupling the interaction of chemical reaction and turbulent mixing. It is found that conditional reactive scalar means lack significant transverse dependence, as has previously been found theoretically by Klimenko (1995). It is also found that the conditional variance around the conditional reactive scalar means is relatively small, simplifying the closure for the conditional reaction rate. These properties are important for the Conditional Moment Closure (CMC) model for turbulent reacting flows recently proposed by Klimenko (1990) and Bilger (1993). Preliminary CMC model calculations are carried out for this flow using a simple model for the conditional scalar dissipation. Model predictions and measured conditional reactive scalar means compare favorably. The reaction-dominated limit is found to indicate the maximum reactedness of a reactive scalar and is a limiting case of the CMC model. Conventional (unconditional) reactive scalar means obtained from the preliminary CMC predictions using the conserved scalar p.d.f. compare favorably with those found from experiment, except where the measuring position is relatively far upstream of the stoichiometric distance. Recommendations include applying a full CMC model to the flow and investigating both the less significant terms in the conditional mean species equation and the small variation of the conditional mean with radius. Forms for the p.d.f.s, in addition to those found from experiments, could be useful for extending the CMC model to reactive flows in the atmosphere.
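For orientation, the central quantities above can be written out for a single-step reaction A + B → products with second-order kinetics; this is the standard conserved-scalar formulation, with notation chosen here rather than drawn from the thesis itself.

```latex
% Conserved scalar: for reactant concentrations a (plume, NO) and b (ambient, O3)
% with equal diffusivities, the combination a - b has no chemical source term, so
\[
  \xi = \frac{a - b + b_0}{a_0 + b_0}
\]
% is conserved, where a_0 and b_0 are the source and ambient concentrations.
% Reynolds decomposition of the mean rate for an instantaneous rate w = k a b:
\[
  \overline{w} = k\,\overline{ab} = k\left(\bar{a}\,\bar{b} + \overline{a'b'}\right)
\]
% A negative covariance \overline{a'b'} (unmixedness) lowers the mean rate below
% k \bar{a} \bar{b}; closures such as Toor's estimate this covariance from
% conserved-scalar statistics, and here that estimate overshot the measured
% mean reaction rate by 15 to 65%.
```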
Abstract:
This paper describes a biologically inspired approach to vision-only simultaneous localization and mapping (SLAM) on ground-based platforms. The core SLAM system, dubbed RatSLAM, is based on computational models of the rodent hippocampus, and is coupled with a lightweight vision system that provides odometry and appearance information. RatSLAM builds a map in an online manner, driving loop closure and relocalization through sequences of familiar visual scenes. Visual ambiguity is managed by maintaining multiple competing vehicle pose estimates, while cumulative errors in odometry are corrected after loop closure by a map correction algorithm. We demonstrate the mapping performance of the system on a 66 km car journey through a complex suburban road network. Using only a web camera operating at 10 Hz, RatSLAM generates a coherent map of the entire environment at real-time speed, correctly closing more than 51 loops of up to 5 km in length.
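As a schematic of what "multiple competing pose estimates" means in practice, the Python sketch below keeps a set of weighted pose hypotheses, advances each by odometry, and strengthens those consistent with a recognised visual scene. It is an illustrative toy, not the RatSLAM pose-cell network; all names and constants are assumptions.

```python
import math

class PoseHypothesis:
    """One candidate vehicle pose with a confidence weight."""
    def __init__(self, x, y, theta, weight=1.0):
        self.x, self.y, self.theta, self.weight = x, y, theta, weight

def update(hypotheses, v, omega, dt, recalled_pose=None, radius=5.0):
    """Advance hypotheses by odometry (v, omega) over dt seconds; reward those
    near recalled_pose, the place where the current visual scene was last seen
    (None if the scene is unfamiliar). Constants are placeholders."""
    for h in hypotheses:
        h.theta += omega * dt
        h.x += v * dt * math.cos(h.theta)
        h.y += v * dt * math.sin(h.theta)
        h.weight *= 0.99  # slow decay: unsupported hypotheses fade out
        if recalled_pose is not None:
            if math.hypot(h.x - recalled_pose[0], h.y - recalled_pose[1]) < radius:
                h.weight *= 1.5  # this hypothesis agrees with the familiar scene
    total = sum(h.weight for h in hypotheses)
    for h in hypotheses:
        h.weight /= total  # renormalise so weights stay comparable
    return hypotheses
```

On loop closure, the hypothesis repeatedly supported by familiar scenes comes to dominate, at which point accumulated odometric error can be redistributed around the loop by a map-correction step.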
Abstract:
This paper shows initial results in deploying the biologically inspired Simultaneous Localisation and Mapping system, RatSLAM, in an outdoor environment. RatSLAM has been widely tested in indoor environments on the task of producing topologically coherent maps based on a fusion of odometric and visual information. This paper details the changes required to deploy RatSLAM on a small tractor equipped with odometry and an omnidirectional camera. The principal changes relate to the vision system, with others required for RatSLAM to use omnidirectional visual data. The initial results from mapping around a 500 m loop are promising, with many improvements still to be made.
Abstract:
Simultaneous Localization And Mapping (SLAM) is one of the major challenges in mobile robotics. Probabilistic techniques using high-end range-finding devices are well established in the field, but recent work has investigated vision-only approaches. This paper presents a method for generating approximate rotational and translational velocity information from a single vehicle-mounted consumer camera, without the computationally expensive process of tracking landmarks. The method is tested by employing it to provide the odometric and visual information for the RatSLAM system while mapping a complex suburban road network. RatSLAM generates a coherent map of the environment during an 18 km trip through suburban traffic at speeds of up to 60 km/h. This result demonstrates the potential of ground-based vision-only SLAM using low-cost sensing and computational hardware.
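One landmark-free way to obtain such approximate velocities, sketched below under stated assumptions, is to collapse each greyscale frame to a one-dimensional intensity profile and compare consecutive profiles: the horizontal shift that best aligns them serves as a rotation estimate, and the residual difference as a crude speed cue. The calibration constants and function names are placeholders, not the paper's exact pipeline.

```python
import numpy as np

def column_profile(frame):
    """Collapse a greyscale frame (H x W array) into a 1-D column-mean profile."""
    return frame.mean(axis=0)

def frame_odometry(prev_frame, curr_frame, max_shift=40,
                   deg_per_pixel=0.1, speed_gain=1.0):
    """Estimate per-frame rotation (degrees) and a translational speed proxy.

    deg_per_pixel and speed_gain are camera-specific calibration constants;
    the defaults here are placeholders for illustration.
    """
    p, q = column_profile(prev_frame), column_profile(curr_frame)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # Compare the overlapping parts of the two profiles at offset s.
        a = p[max(0, s): len(p) + min(0, s)]
        b = q[max(0, -s): len(q) + min(0, -s)]
        err = float(np.mean(np.abs(a - b)))
        if err < best_err:
            best_shift, best_err = s, err
    rotation = best_shift * deg_per_pixel   # yaw change between frames
    speed = best_err * speed_gain           # residual change ~ forward motion
    return rotation, speed
```

Matching whole-image profiles in this way avoids feature detection and landmark tracking entirely, which is what keeps the computational cost within reach of low-cost hardware.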
Abstract:
Vietnam's draft of the proposed new Law on Competition is currently in its ninth version. Although there is a need to enact the legislation as quickly as possible, Vietnam cannot rush the drafting process. Under its Bilateral Trade Agreement with the USA, Vietnam has committed to improving the quality of its laws and the consistency of its legislative framework. Since the Law on Competition will be fundamental in establishing the legal framework for a more coherent and effective competition regime, and will have a profound influence on Vietnam's objective of becoming a socialist-oriented market economy, its provisions must be well constructed and well considered, and this takes time. This article shows how the proposed Law is being crafted, comparing the current draft with older drafts to shed light on changes in policy during the drafting process. Where possible, the Draft is also compared with the laws of other jurisdictions for any assistance they might lend. In this author's opinion, not all the changes are positive, but any defects in the draft are not intractable and can be remedied prior to promulgation.
Abstract:
Current rapid increases in the scope of regional development and the reach of technology have combined with the expanding scale of modern settlements to focus growing attention on infrastructure provision needs, including organisational and funding systems, the management of new technologies and regional-scale social provisions. In this chapter, the evolution of urban and regional infrastructure is traced from its earliest origins in the growth of organised societies 5,000 years ago. Infrastructure needs and provision are illustrated for metropolitan, provincial and rural regions. Rural infrastructure examples and lessons are drawn from global case studies. Recent expansions of the scope of infrastructure are examined, and issues of governance and process are discussed. Phased planning processes are related to cycles of programme adoption, objective formulation, option evaluation and programme budgeting. Issues of privatisation and public interest are considered. Matters of contemporary global significance are explored, including the current economic contraction and the effects of global climate change. Conclusions are drawn about the role and importance of linking regional planning to coherent regional infrastructure programmes and budgets.
Abstract:
Home Automation (HA) has emerged as a prominent field for researchers and investors confronting the challenge of penetrating the average home user market with products and services emerging from technology-based vision. In spite of many technology contributions, there is a latent demand for affordable and pragmatic assistive technologies for pro-active handling of complex lifestyle-related problems faced by home users. This study develops an Initial Technology Roadmap for HA (ITRHA) that formulates a need-based vision of 10-15 years, identifying market, product and technology investment opportunities and focusing on those aspects of HA contributing to efficient management of home and personal life. The concept of the Family Life Cycle is developed to understand the temporal needs of the family. In order to formally describe a coherent set of family processes, their relationships, and their interaction with external elements, a reference model named the Family System is established that identifies External Entities, 7 major Family Processes, and 7 subsystems: Finance, Meals, Health, Education, Career, Housing, and Socialisation. Analysis of these subsystems reveals Soft, Hard and Hybrid processes. Rectifying the lack of formal methods for eliciting future user requirements and reassessing evolving market needs, this study develops a novel method called Requirement Elicitation of Future Users by Systems Scenario (REFUSS), integrating process modelling and scenario techniques within the framework of roadmapping. REFUSS is used to systematically derive process automation needs, relating the process knowledge to future user characteristics identified from scenarios created to visualise different futures with richly detailed information on lifestyle trends, thus enabling learning about future requirements. Revealing an addressable market size estimated at billions of dollars per annum, this research has developed innovative ideas for software-based products, including Document Management Systems facilitating automated collection and easy retrieval of all documents, an Information Management System automating information services, and a Ubiquitous Intelligent System empowering highly mobile home users with ambient intelligence. Other product ideas include the versatile robotic devices Kitchen Hand and Cleaner Arm, which can save time. Materialisation of these products requires technology investment initiating further research in the areas of data extraction and information integration, as well as manipulation and perception, sensor-actuator systems, tactile sensing, odour detection, and robotic controllers. This study recommends new policies on electronic data delivery from service providers, as well as new standards for XML-based document structure and format.
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone and has poorer precision (approximately twice the coefficient of variation) than the standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: (1) demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; (2) demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique; (3) demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral; and (4) provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
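The tissue decomposition underlying these models reduces to a small linear system: at each beam energy, the measured log-attenuation is a weighted sum of the basis-material area densities, weighted by mass attenuation coefficients. The Python sketch below illustrates the two-component DEXA case with placeholder coefficients (real values depend on the beam spectra and come from standard tables); the closing comment notes how the DPA(+) path-length measurement extends the system to three components.

```python
import numpy as np

# Mass attenuation coefficients (cm^2/g), placeholder values for illustration:
# rows = [low energy, high energy], columns = [bone mineral, lean soft tissue].
MU = np.array([[0.60, 0.25],
               [0.30, 0.18]])

def dexa_decompose(I0, I):
    """Solve ln(I0/I) = MU @ m for area densities m (g/cm^2) of the two tissues.

    I0, I: transmitted intensities without and with the body in the beam,
    one value per energy.
    """
    log_att = np.log(np.asarray(I0, float) / np.asarray(I, float))
    return np.linalg.solve(MU, log_att)

# DPA(+): the extra linear measurement of path length t along the ray adds a
# third equation, t = m_bone/rho_bone + m_lean/rho_lean + m_fat/rho_fat,
# so the system grows to 3x3 and fat becomes resolvable as a third component.

if __name__ == "__main__":
    m = np.array([1.2, 18.0])              # assumed area densities (g/cm^2)
    I = np.exp(-MU @ m)                    # simulated transmission with I0 = 1
    print(dexa_decompose([1.0, 1.0], I))   # recovers [1.2, 18.0]
```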
Abstract:
Australian privacy law regulates how government agencies and private sector organisations collect, store and use personal information. A coherent conceptual basis of personal information is an integral requirement of information privacy law as it determines what information is regulated. A 2004 report conducted on behalf of the UK’s Information Commissioner (the 'Booth Report') concluded that there was no coherent definition of personal information currently in operation because different data protection authorities throughout the world conceived the concept of personal information in different ways. The authors adopt the models developed by the Booth Report to examine the conceptual basis of statutory definitions of personal information in Australian privacy laws. Research findings indicate that the definition of personal information is not construed uniformly in Australian privacy laws and that different definitions rely upon different classifications of personal information. A similar situation is evident in a review of relevant case law. Despite this, the authors conclude the article by asserting that a greater jurisprudential discourse is required based on a coherent conceptual framework to ensure the consistent development of Australian privacy law.
Abstract:
This work is focussed on developing a commissioning procedure so that a Monte Carlo model, which uses BEAMnrc’s standard VARMLC component module, can be adapted to match a specific BrainLAB m3 micro-multileaf collimator (μMLC). A set of measurements are recommended, for use as a reference against which the model can be tested and optimised. These include radiochromic film measurements of dose from small and offset fields, as well as measurements of μMLC transmission and interleaf leakage. Simulations and measurements to obtain μMLC scatter factors are shown to be insensitive to relevant model parameters and are therefore not recommended, unless the output of the linear accelerator model is in doubt. Ultimately, this note provides detailed instructions for those intending to optimise a VARMLC model to match the dose delivered by their local BrainLAB m3 μMLC device.