934 results for Spatial modelling
Abstract:
Literacy studies have begun to examine the spatial dimension of literacy practices in a way that foregrounds space and considers it constitutive of human relations and practices. This chapter provides an introduction to spatial literacy research, offering a guide to key theorists, themes, and studies that have shaped historical and new developments in spatial approaches to literacy practice and pedagogy. It begins by reconceptualising socio-spatial approaches to literacy research and defining terms. Intersections with related social theories are examined, with an emphasis on critical approaches and the politics of space. The chapter clarifies the relationship between socio-spatial and socio-cultural paradigms, revisiting the spatial in seminal socio-cultural research. It covers new ground, including networks, flows, and the deterritorialisation of literacy practice. The chapter concludes with challenges and recommendations for future language research and educational practice.
Abstract:
The Council of Australian Governments (COAG) in 2003 gave in-principle approval to a best-practice report recommending a holistic approach to managing natural disasters in Australia, incorporating a move from a traditional response-centric approach to a greater focus on mitigation, recovery and resilience, with community well-being at the core. Since that time, there has been a range of complementary developments that have supported the COAG-recommended approach. These developments have been administrative, legislative and technological, both in reaction to the COAG initiative and resulting from regular natural disasters. This paper reviews the characteristics of the spatial data that are becoming increasingly available at federal, state and regional jurisdictions with respect to their fitness for purpose in disaster planning and mitigation and in strengthening community resilience. In particular, Queensland foundation spatial data, which are increasingly accessible to the public under the provisions of the Right to Information Act 2009, the Information Privacy Act 2009 and recent open data reform initiatives, are evaluated. The Fitzroy River catchment and floodplain is used as a case study for the review. The catchment covers an area of 142,545 km², the largest river catchment flowing to the eastern coast of Australia. The Fitzroy River basin experienced extensive flooding during the 2010–2011 Queensland floods. The basin is an area of important economic, environmental and heritage values and contains significant infrastructure critical for the mining and agricultural sectors, the two most important economic sectors for the State of Queensland. Consequently, the spatial datasets for this area play a critical role in disaster management and in protecting critical infrastructure essential for economic and community well-being. The foundation spatial datasets are assessed for disaster planning and mitigation purposes using data quality indicators such as resolution, accuracy, integrity, validity and audit trail.
Abstract:
While there are many similarities between the languages of the various workflow management systems, there are also significant differences. One particular area of difference arises from the fact that different systems impose different syntactic restrictions. In such cases, business analysts have to choose between conforming to the language in their specifications or transforming those specifications afterwards. The latter option is preferable, as it allows for a separation of concerns. In this paper we investigate to what extent such transformations are possible in the context of various syntactic restrictions (the most restrictive of which will be referred to as structured workflows). We also provide insight into the consequences, particularly in terms of expressive power, of imposing such restrictions.
Abstract:
This paper deals with the failure of high-adhesive, low-compressive-strength, thin layered polymer mortar joints in masonry through contact modelling in a finite element framework. Failure due to combined shear, tensile and compressive stresses is considered through a constitutive damaging contact model that incorporates traction–separation as a function of displacement discontinuity. The modelling method is verified using single and multiple contact analyses of thin mortar layered masonry specimens under shear, tensile and compressive stresses and their combinations. Using this verified method, the failure of thin mortar layered masonry under a range of shear-to-tension and shear-to-compression ratios is examined. Finally, the model is applied to thin bed masonry wallettes to study their behaviour under biaxial tension–tension and compression–tension loadings perpendicular and parallel to the bed joints.
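For orientation, a common bilinear form of such a traction–separation damage law (a generic illustration; the abstract does not give the paper's exact constitutive form) is

\[
t = (1 - d)\,K\,\delta, \qquad
d = \frac{\delta_f\,(\delta_{\max} - \delta_0)}{\delta_{\max}\,(\delta_f - \delta_0)},
\]

where \(K\) is the initial contact stiffness, \(\delta_0\) the separation at damage onset, \(\delta_f\) the separation at complete failure, and \(\delta_{\max}\) the largest separation reached during loading, so that the damage variable \(d\) grows from 0 to 1 as the interface degrades.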
Abstract:
The article focuses on how the information seeker makes decisions about relevance. It employs a novel decision theory based on quantum probabilities. This direction derives from mounting research within cognitive science showing that decision theories based on quantum probabilities are superior to standard probability models for modelling human judgements [2, 1]. By quantum probabilities, we mean that the decision event space is modelled as a vector space rather than the usual Boolean algebra of sets. In this way, incompatible perspectives around a decision can be modelled, leading to an interference term that modifies the law of total probability. The interference term is crucial in modifying the probability judgements made by current probabilistic systems so that they align better with human judgement. The goal of this article is thus to model the information seeker as a decision maker. For this purpose, signal detection models are sketched which are in principle applicable in a wide variety of information seeking scenarios.
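Concretely, for two incompatible perspectives \(A\) and \(B\), the quantum-like model replaces the classical law of total probability with the standard interference form from the quantum cognition literature (sketched here for orientation; the article's own formulation may differ in detail):

\[
P(D) = P(A)\,P(D \mid A) + P(B)\,P(D \mid B)
+ 2\sqrt{P(A)\,P(D \mid A)\,P(B)\,P(D \mid B)}\,\cos\theta,
\]

where the cosine term is the interference contribution and \(\theta = \pi/2\) recovers the classical law.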
Abstract:
Modelling business processes for analysis or redesign usually requires the collaboration of many stakeholders. These stakeholders may be spread across locations or even companies, making co-located collaboration costly and difficult to organize. Modern process modelling technologies support remote collaboration but lack support for visual cues used in co-located collaboration. Previously we presented a prototype 3D virtual world process modelling tool that supports a number of visual cues to facilitate remote collaborative process model creation and validation. However, the added complexity of having to navigate a virtual environment and using an avatar for communication made the tool difficult to use for novice users. We now present an evolved version of the technology that addresses these issues by providing natural user interfaces for non-verbal communication, navigation and model manipulation.
Abstract:
Finite element modelling of bone fracture fixation systems allows computational investigation of the deformation response of the bone to load. Once validated, these models can easily be adapted to explore changes in the design or configuration of a fixator. The deformation of the tissue within the fracture gap determines its healing and is often summarised as the stiffness of the construct. FE models capable of reproducing this behaviour would provide valuable insight into the healing potential of different fixation systems. Current model validation techniques lack depth in 6D load and deformation measurements, and other aspects of FE model creation, such as the definition of interfaces between components, have also not been explored. This project investigated the mechanical testing and FE modelling of a bone–plate construct for the determination of stiffness. In-depth 6D measurement and analysis of the generated forces, moments and movements showed large out-of-plane behaviours which had not previously been characterised. Stiffness calculated from the interfragmentary movement was found to be an unsuitable summary parameter, as the error propagation is too large. Current FE modelling techniques were applied in compression and torsion, mimicking the experimental setup. Compressive stiffness was well replicated, though torsional stiffness was not, and the out-of-plane behaviours prevalent in the experimental work were not replicated in the model. The interfaces between the components were investigated experimentally and through modification of the FE model. Incorporating the interface modelling techniques into the full construct models had no effect in compression but did reduce torsional stiffness, bringing it closer to that of the experiment. The interface definitions had no effect on out-of-plane behaviours, which were still not replicated. Neither current nor novel FE modelling techniques were able to replicate the out-of-plane behaviours evident in the experimental work. New techniques for modelling loads and boundary conditions need to be developed to mimic the effects of the entire experimental system.
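The error-propagation point can be made explicit. Treating construct stiffness as \(k = F/\delta\) with independent errors in force and displacement, first-order propagation gives (a textbook sketch, not the project's own derivation)

\[
\frac{\sigma_k}{k} = \sqrt{\left(\frac{\sigma_F}{F}\right)^2 + \left(\frac{\sigma_\delta}{\delta}\right)^2},
\]

so for the small interfragmentary movements \(\delta\) of a stiff construct, a fixed absolute displacement error \(\sigma_\delta\) dominates and the relative error in \(k\) becomes large.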
Abstract:
This timely and thorough book seeks to provide evidence-based assessments of ways in which spatial planning may develop and deliver new strategies for addressing both the causes and impacts of climate change. The authors state that much of the analysis is informed by experiences and learning from their own involvement with climate change projects. The book aims to be relevant to a wide audience and nominates its intended readership to include planning practitioners, scholars, post-graduate students of built environment courses, politicians and the ‘interested’ public. In this regard, the authors skilfully deliver a comprehensive and accessible dissemination of the nexus between spatial planning and climate change...
Abstract:
The use of public space by young people and children is a major issue in a number of countries, and a range of measures that restrict their social and spatial citizenship rights is deployed to control public space.
Abstract:
A demo video showing the BPMVM prototype using several natural user interfaces, such as multi-touch input, full-body tracking and virtual reality.
Abstract:
This paper describes a risk model for estimating the likelihood of collisions at low-exposure railway level crossings, demonstrating the effect that differences in safety integrity can have on the likelihood of a collision. The model facilitates the comparison of safety benefits between level crossings with passive controls (stop or give-way signs) and level crossings that have been hypothetically upgraded with conventional or low-cost warning devices. The scenario presented illustrates how treatment of a cross-section of level crossings with low-cost devices can provide a greater safety benefit compared to treatment with conventional warning devices for the same budget.
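As a rough illustration of that budget argument (all figures below are hypothetical assumptions, not values from the paper), the comparison can be sketched as follows:

# Sketch: expected safety benefit per budget for two upgrade strategies
# across a set of passive level crossings. Unit costs, risk reductions and
# baseline risk are illustrative assumptions only, not values from the paper.

UNIT_COST = {"conventional": 500_000, "low_cost": 50_000}   # AUD per crossing
RISK_REDUCTION = {"conventional": 0.80, "low_cost": 0.55}   # fraction of baseline collision likelihood removed

def collisions_averted(budget, device, baseline_risk=0.01):
    """Expected collisions averted per year when a budget is spent
    upgrading passive crossings with the given device type."""
    crossings_treated = budget // UNIT_COST[device]
    return crossings_treated * baseline_risk * RISK_REDUCTION[device]

budget = 5_000_000
for device in UNIT_COST:
    print(f"{device}: {collisions_averted(budget, device):.2f} collisions averted/year")

Under these assumed numbers the low-cost devices treat ten times as many crossings and avert more collisions in total, which is the effect the paper's scenario demonstrates.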
Abstract:
Keeping exotic plant pests out of our country relies on good border control or quarantine. However, with increasing globalization and mobilization, some things slip through. The back-up systems then become important. These can include an expensive form of surveillance that purposively targets particular pests. A much wider net is provided by general surveillance, which is assimilated into everyday activities, like farmers checking the health of their crops. In fact, farmers and even home gardeners have provided a front-line warning system for some pests (e.g. the European wasp) that could otherwise have wreaked havoc. Mathematics is used to model how surveillance works in various situations. Within this virtual world we can play with various surveillance and management strategies to "see" how they would work, or how to make them work better. One of our greatest challenges is estimating some of the input parameters: because the pest hasn't been here before, it is hard to predict how it might behave in establishing and spreading, and what types of symptoms it might express. So we rely on experts to help us with this. This talk will look at the mathematical, psychological and logical challenges of helping experts to quantify what they think. We show how the subjective Bayesian approach is useful for capturing expert uncertainty, ultimately providing a more complete picture of what they think... and what they don't!
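A minimal sketch of that subjective Bayesian step, assuming the expert's uncertainty about a pest's per-inspection detection probability is elicited as a Beta prior (the prior parameters and data below are invented for illustration):

from scipy import stats

# Expert judgement, e.g. "detection is most likely around 0.3, plausibly
# 0.1 to 0.6", encoded via an elicitation exercise as a Beta(3, 7) prior.
prior = stats.beta(3, 7)

# Hypothetical general-surveillance outcome: 4 detections in 20 inspections.
detections, inspections = 4, 20
posterior = stats.beta(3 + detections, 7 + (inspections - detections))

print(f"prior mean:     {prior.mean():.2f}")
print(f"posterior mean: {posterior.mean():.2f}")
print("95% credible interval:", posterior.interval(0.95))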
Abstract:
This book provides a general framework for specifying, estimating, and testing time series econometric models. Special emphasis is given to estimation by maximum likelihood, but other methods are also discussed, including quasi-maximum likelihood estimation, generalized method of moments estimation, nonparametric estimation, and estimation by simulation. An important advantage of adopting the principle of maximum likelihood as the unifying framework for the book is that many of the estimators and test statistics proposed in econometrics can be derived within a likelihood framework, thereby providing a coherent vehicle for understanding their properties and interrelationships. In contrast to many existing econometric textbooks, which deal mainly with the theoretical properties of estimators and test statistics through a theorem-proof presentation, this book squarely addresses implementation to provide direct conduits between the theory and applied work.
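As a flavour of the unifying likelihood approach (an illustrative sketch, not an example reproduced from the book), a Gaussian AR(1) model can be estimated by maximising its conditional log-likelihood:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(1, 500):                  # simulate y_t = 0.7 * y_{t-1} + e_t
    y[t] = 0.7 * y[t - 1] + rng.normal()

def neg_loglik(params, y):
    """Negative conditional Gaussian log-likelihood of an AR(1)."""
    phi, log_sigma = params
    sigma = np.exp(log_sigma)            # log-scale parameterisation keeps sigma > 0
    resid = y[1:] - phi * y[:-1]
    return 0.5 * np.sum(np.log(2 * np.pi * sigma**2) + resid**2 / sigma**2)

result = minimize(neg_loglik, x0=[0.0, 0.0], args=(y,))
phi_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(f"phi_hat = {phi_hat:.3f}, sigma_hat = {sigma_hat:.3f}")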
Abstract:
Within-building spatial variability of indoor air quality may substantially influence the reliability of human exposure assessments based on single-point samples, but has hitherto been little studied. To investigate and understand the within-building spatial variation of air pollutants, field measurements were conducted in a seven-level office building in Brisbane, Australia. The building consists of three sections (A side, middle and B side).
Abstract:
Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
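For reference, one common statement of the Ross–Macdonald basic reproduction number, against which the surveyed models' assumptions are compared (notation varies across the literature), is

\[
R_0 = \frac{m\,a^2\,b\,c\,e^{-g n}}{g\,r},
\]

where \(m\) is the ratio of mosquitoes to hosts, \(a\) the rate at which a mosquito bites humans, \(b\) and \(c\) the mosquito-to-human and human-to-mosquito transmission efficiencies, \(n\) the extrinsic incubation period, \(g\) the mosquito death rate, and \(r\) the human recovery rate.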