Abstract:
Global awareness of cleaner and renewable energy is transforming the electricity sector at many levels. New technologies are being increasingly integrated into the electricity grid at high, medium and low voltage levels, new taxes on carbon emissions are being introduced, and individuals can now produce electricity, mainly through rooftop photovoltaic (PV) systems. While leading to improvements, these changes also introduce challenges, and a question that often arises is ‘how can we manage this constantly evolving grid?’ The Queensland Government and Ergon Energy, one of the two Queensland distribution companies, have partnered with some Australian and German universities on a project to answer this question in a holistic manner. The project investigates the impact that the integration of renewables and other new technologies has on the physical structure of the grid, and how this evolving system can be managed in a sustainable and economical manner. To aid understanding of what the future might bring, a software platform has been developed that integrates two modelling techniques: agent-based modelling (ABM) to capture the characteristics of the different system units accurately and dynamically, and particle swarm optimisation (PSO) to find the most economical mix of network extension and integration of distributed generation over long periods of time. Using data from Ergon Energy, two types of networks (three-phase, and Single Wire Earth Return or SWER) have been modelled; three-phase networks are usually used in dense networks such as urban areas, while SWER networks are widely used in rural Queensland. Simulations can be performed on these networks to identify the required upgrades, following a three-step process: (a) what is already in place and how it performs under current and future loads, (b) what can be done to manage it and plan the future grid, and (c) how these upgrades/new installations will perform over time. The number of small-scale distributed generators, e.g. PV and battery, is now sufficient (and expected to increase) to impact the operation of the grid, which in turn needs to be considered by the distribution network manager when planning for upgrades and/or installations to stay within regulatory limits. Different scenarios can be simulated, with different levels of distributed generation, in place as well as expected, so that a large number of options can be assessed (Step a). Once the location, sizing and timing of asset upgrades and/or installations are found using optimisation techniques (Step b), it is possible to assess the adequacy of their daily performance using agent-based modelling (Step c). One distinguishing feature of this software is that it is possible to analyse a whole area at once, while still having a tailored solution for each of the sub-areas. To illustrate this, consider the impact that battery and PV installations can have on the two types of networks mentioned above; three design conditions can be identified (amongst others):
· Urban conditions
  o Feeders that have a low take-up of solar generators may benefit from adding solar panels
  o Feeders that need voltage support at specific times may be assisted by installing batteries
· Rural conditions (SWER network)
  o Feeders that need voltage support as well as peak lopping may benefit from both battery and solar panel installations.
This small example demonstrates that no single solution can be applied across all three design conditions, and there is a need to be selective in which one is applied to each branch of the network.
This is currently the function of the engineer, who can define various scenarios against a configuration, test them and iterate towards an appropriate solution. Future work will focus on increasing the level of automation in identifying areas where particular solutions are applicable.
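The abstract does not detail the optimisation step, so below is a minimal PSO sketch illustrating Step b under toy assumptions. The cost function, feeder peaks and dollar figures are entirely hypothetical stand-ins for the project's real network-planning objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cost: capital cost of battery capacity per feeder plus a
# crude penalty proxy for unmet peak demand (all figures illustrative).
PEAK_KW = np.array([120.0, 80.0, 200.0])   # assumed feeder peaks
COST_PER_KWH = 300.0                        # assumed $/kWh of storage

def cost(x):
    capex = COST_PER_KWH * x.sum()
    unmet = np.maximum(PEAK_KW - x, 0.0).sum()
    return capex + 1e4 * unmet

def pso(n_particles=30, n_dims=3, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(0.0, 300.0, (n_particles, n_dims))   # positions
    v = np.zeros_like(x)                                  # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(cost, 1, x)
    g = pbest[pbest_f.argmin()].copy()                    # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, 0.0, 500.0)
        f = np.apply_along_axis(cost, 1, x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

best_x, best_f = pso()
print("battery kWh per feeder:", best_x.round(1), "cost:", round(best_f))
```

In the platform described above, the fitness evaluation would be driven by the agent-based network simulation rather than a closed-form cost.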
Abstract:
Stigmergy is a biological term used when discussing a subset of insect swarm behaviour, describing the apparent organisation seen during their activities. Stigmergy describes a communication mechanism based on environment-mediated signals which trigger responses among the insects. This phenomenon is demonstrated in the behaviour of ants and their food-gathering process when following pheromone trails, where the pheromones are a form of environment-mediated communication. What is interesting about this phenomenon is that highly organised societies are achieved without an apparent management structure. Stigmergy is also observed in human environments, both natural and engineered. It is implicit in the Web, where sites provide a virtual environment supporting coordinative contributions. Researchers in varying disciplines appreciate the power of this phenomenon and have studied how to exploit it. As stigmergy becomes more widely researched, we see its definition mutate as papers citing the original work become referenced themselves. Each paper interprets these works in ways very specific to the research being conducted. Our own research aims to better understand what improves the collaborative function of a Web site when exploiting the phenomenon. However, when researching stigmergy to develop our understanding, we discovered the lack of a standardised and abstract model for the phenomenon. Papers frequently cite the same generic descriptions before becoming intimately focused on formal specifications of an algorithm, or on esoteric discussions of sub-facets of the topic. None provides a holistic, macro-level view to model and standardise the nomenclature. This paper provides a content analysis of influential literature, documenting the numerous theoretical and experimental papers that have focused on stigmergy. We establish that stigmergy is a phenomenon that transcends the insect world and is more than just a metaphor when applied to the human world. We present from our own research our general theory and abstract model of the semantics of stigma in stigmergy. We hope our model will distil the nuances of the phenomenon into a useful road-map, and standardise vocabulary that we witness becoming confused and divergent. Furthermore, this paper documents the analysis on which we base our next paper: Special Theory of Stigmergy: A Design Pattern for Web 2.0 Collaboration.
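To make the mechanism concrete, here is a minimal, illustrative simulation of environment-mediated coordination: agents deposit a decaying signal ("pheromone") on a grid and bias their movement toward it. All rates and sizes are arbitrary choices for the sketch, not drawn from the literature analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
SIZE, N_AGENTS, STEPS = 30, 40, 200
EVAPORATION, DEPOSIT = 0.95, 1.0   # assumed decay/deposit rates

pheromone = np.zeros((SIZE, SIZE))
agents = rng.integers(0, SIZE, (N_AGENTS, 2))

for _ in range(STEPS):
    for i, (r, c) in enumerate(agents):
        # Candidate moves: 4-neighbourhood, wrapping at the edges.
        moves = [((r + dr) % SIZE, (c + dc) % SIZE)
                 for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))]
        # Environment-mediated choice: nearby pheromone biases the move.
        weights = np.array([pheromone[m] + 0.1 for m in moves])
        agents[i] = moves[rng.choice(4, p=weights / weights.sum())]
        pheromone[tuple(agents[i])] += DEPOSIT   # stigma left in environment
    pheromone *= EVAPORATION                      # the signal decays over time

print("trail concentration (top 5 cells):",
      np.sort(pheromone.ravel())[-5:].round(2))
```

No agent communicates with another directly; any emergent trails arise purely from marks left in the shared environment, which is the essence of the phenomenon.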
Abstract:
Many mature term-based or pattern-based approaches have been used in the field of information filtering to generate users’ information needs from a collection of documents. A fundamental assumption of these approaches is that the documents in the collection are all about one topic. In reality, however, users’ interests can be diverse and the documents in the collection often involve multiple topics. Topic modelling, such as Latent Dirichlet Allocation (LDA), was proposed to generate statistical models to represent multiple topics in a collection of documents, and has been widely utilised in fields such as machine learning and information retrieval. However, its effectiveness in information filtering has not been so well explored. Patterns are generally thought to be more discriminative than single terms for describing documents. However, the enormous number of discovered patterns hinders them from being effectively and efficiently used in real applications; selecting the most discriminative and representative patterns from the huge number of discovered patterns therefore becomes crucial. To deal with the above limitations and problems, this paper proposes a novel information filtering model, the Maximum matched Pattern-based Topic Model (MPBTM). The main distinctive features of the proposed model are: (1) user information needs are generated in terms of multiple topics; (2) each topic is represented by patterns; (3) patterns are generated from topic models and are organised in terms of their statistical and taxonomic features; and (4) the most discriminative and representative patterns, called Maximum Matched Patterns, are proposed to estimate the document relevance to the user’s information needs in order to filter out irrelevant documents. Extensive experiments are conducted to evaluate the effectiveness of the proposed model using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model significantly outperforms both state-of-the-art term-based models and pattern-based models.
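The MPBTM's pattern-mining machinery is not reproduced here, but the LDA step it builds on can be sketched with scikit-learn. The toy corpus, topic count and similarity-based relevance scoring below are illustrative stand-ins; the paper's actual relevance estimation uses Maximum Matched Patterns.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny illustrative corpus; the real experiments used RCV1.
docs = [
    "solar panel battery grid energy storage",
    "battery storage inverter grid voltage",
    "topic model latent dirichlet allocation inference",
    "pattern mining frequent itemset discovery",
]

X = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)          # per-document topic proportions

# Crude relevance scoring: cosine similarity between each document's
# topic distribution and a "user profile" built from relevant documents.
profile = doc_topic[:2].mean(axis=0)      # pretend docs 0-1 are relevant
scores = doc_topic @ profile / (
    np.linalg.norm(doc_topic, axis=1) * np.linalg.norm(profile))
print("relevance ranking:", np.argsort(scores)[::-1])
```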
Abstract:
Aims: The Medical Imaging Training Immersive Environment (MITIE) system is a recently developed virtual reality (VR) platform that allows students to practise a range of medical imaging techniques. The aim of this pilot study was to gather user feedback about the educational value of the application and inform future pedagogical development. This presentation explores the use of this technology for skills training and blurring the boundaries between academic learning and clinical skills training.
Background: MITIE is a 3D VR environment that allows students to manipulate a patient and radiographic equipment in order to produce a VR-generated image for comparison with a gold standard. As with VR initiatives in other health disciplines (1-6), the software mimics clinical practice as much as possible and uses 3D technology to enhance immersion and realism. The software was developed by the Medical Imaging Course Team at a provider university with funding from a Health Workforce Australia “Simulated Learning Environments” grant.
Methods: Over 80 students undertaking the Bachelor of Medical Imaging course were randomised to receive practical experience with either MITIE or radiographic equipment in the medical radiation laboratory. Student feedback about the educational value of the software was collected, and performance on an assessed setup was measured for both groups for comparison. Ethical approval for the project was provided by the university ethics panel.
Results: This presentation provides qualitative analysis of student perceptions relating to satisfaction, usability and educational value, as well as comparative quantitative performance data. Students reported high levels of satisfaction, and both feedback and assessment results confirmed the application’s significance as a pre-clinical training tool. There was a clear emerging theme that MITIE could be a useful learning tool that students could access to consolidate their clinical learning, either during their academic timetables or their clinical placement.
Conclusion: Student feedback and performance data indicate that MITIE has a valuable role to play in clinical skills training for medical imaging students, in both the academic and the clinical environment. Future work will establish a framework for an appropriate supporting pedagogy that can cross the boundary between the two environments. This project was possible due to funding made available by Health Workforce Australia.
Abstract:
Spreading cell fronts play an essential role in many physiological processes. Classically, models of this process are based on the Fisher-Kolmogorov equation; however, such continuum representations are not always suitable, as they do not explicitly represent behaviour at the level of individual cells. Additionally, many models examine only the large-time asymptotic behaviour, where a travelling wave front with a constant speed has been established. Many experiments, such as a scratch assay, never display this asymptotic behaviour, and in these cases the transient behaviour must be taken into account. We examine the transient and asymptotic behaviour of moving cell fronts using techniques that go beyond the continuum approximation, via a volume-excluding birth-migration process on a regular one-dimensional lattice. We approximate the averaged discrete results using three methods: (i) mean-field, (ii) pair-wise, and (iii) one-hole approximations. We discuss the performance of these methods, in comparison to the averaged discrete results, across a range of parameter space, examining both the transient and asymptotic behaviours. The one-hole approximation, based on techniques from statistical physics, is not capable of predicting transient behaviour but provides excellent agreement with the asymptotic behaviour of the averaged discrete results, provided that cells are proliferating fast enough relative to their rate of migration. The mean-field and pair-wise approximations give indistinguishable asymptotic results, which agree with the averaged discrete results when cells are migrating much more rapidly than they are proliferating. The pair-wise approximation performs better in the transient region than does the mean-field, despite having the same asymptotic behaviour. Our results show that each approximation only works in specific situations; we must therefore be careful to use a suitable approximation for a given system, otherwise inaccurate predictions could be made.
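A minimal sketch of the discrete process and its mean-field approximation, under assumed parameters: agents on a periodic one-dimensional lattice attempt migration and proliferation into adjacent sites, with moves into occupied sites aborted (volume exclusion). For a spatially uniform initial density, the mean-field approximation reduces to logistic growth.

```python
import numpy as np

rng = np.random.default_rng(2)
L, C0 = 200, 0.1           # lattice sites and initial density (illustrative)
PM, PP = 1.0, 0.05         # migration/proliferation probabilities per sweep
STEPS, REPS = 200, 20

def simulate():
    occ = rng.random(L) < C0
    dens = [occ.mean()]
    for _ in range(STEPS):
        for _ in range(int(occ.sum())):      # random sequential update
            site = rng.integers(L)
            if not occ[site]:
                continue
            if rng.random() < PM:            # migration attempt
                tgt = (site + rng.choice((-1, 1))) % L
                if not occ[tgt]:             # volume exclusion
                    occ[site], occ[tgt] = False, True
                    site = tgt
            if rng.random() < PP:            # proliferation attempt
                tgt = (site + rng.choice((-1, 1))) % L
                if not occ[tgt]:             # exclusion blocks the birth
                    occ[tgt] = True
        dens.append(occ.mean())
    return np.array(dens)

avg = np.mean([simulate() for _ in range(REPS)], axis=0)

# Mean-field approximation for uniform density: logistic growth
# dC/dt = PP * C * (1 - C), with the exact solution used below.
t = np.arange(STEPS + 1)
mf = C0 * np.exp(PP * t) / (1 - C0 + C0 * np.exp(PP * t))
print("final density: discrete %.3f, mean-field %.3f" % (avg[-1], mf[-1]))
```

The pair-wise and one-hole approximations discussed in the abstract track spatial correlations that this mean-field sketch deliberately ignores.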
Abstract:
Philosophical inquiry in the teaching and learning of mathematics has received continued, albeit limited, attention over many years (e.g., Daniel, 2000; English, 1994; Lafortune, Daniel, Fallascio, & Schleider, 2000; Kennedy, 2012a). The rich contributions these communities can offer school mathematics, however, have not received the deserved recognition, especially from the mathematics education community. This is a perplexing situation given the close relationship between the two disciplines and their shared values for empowering students to solve a range of challenging problems, often unanticipated, and often requiring broadened reasoning. In this article, I first present my understanding of philosophical inquiry as it pertains to the mathematics classroom, taking into consideration the significant work that has been undertaken on socio-political contexts in mathematics education (e.g., Skovsmose & Greer, 2012). I then consider one approach to advancing philosophical inquiry in the mathematics classroom, namely, through modelling activities that require interpretation, questioning, and multiple approaches to solution. The design of these problem activities, set within life-based contexts, provides an ideal vehicle for stimulating philosophical inquiry.
Abstract:
The number of office building retrofit projects is increasing. These projects are characterised by processes which have a close relationship with waste generation, and therefore demand a high level of waste management. In a preliminary study reported separately, we identified seven critical factors of on-site waste generation in office building retrofit projects. Through semi-structured interviews and Interpretive Structural Modelling (ISM), this research further investigated the interrelationships among these critical waste factors, to identify each factor’s level of influence on waste generation and to propose effective solutions for waste minimisation. “Organizational commitment” was identified as the fundamental issue for waste generation in the ISM system. Factors related to planning, design and construction processes were found to be located in the middle levels of the ISM model, but still had significant impacts on the system as a whole. Based on the interview findings and ISM analysis results, some practical solutions were proposed for waste minimisation in building retrofit projects: (1) reusable and adaptable fit-out design; (2) a system for as-built drawings and building information; (3) integrated planning for the retrofitting work process and waste management; and (4) waste benchmarking development for retrofit projects. This research will provide a better understanding of the waste issues associated with building retrofit projects and facilitate enhanced waste minimisation.
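The ISM level-partitioning step can be sketched briefly. The adjacency matrix below is hypothetical, not the study's seven-factor data; the algorithm (transitive closure, then iterative partitioning by comparing reachability and antecedent sets) is the standard ISM procedure.

```python
import numpy as np

# Hypothetical direct-influence (adjacency) matrix for 7 waste factors;
# entry [i][j] = 1 means factor i influences factor j. Illustrative only.
A = np.array([
    [0, 1, 1, 0, 0, 0, 0],
    [0, 0, 1, 1, 0, 0, 0],
    [0, 0, 0, 1, 1, 0, 0],
    [0, 0, 0, 0, 1, 1, 0],
    [0, 0, 0, 0, 0, 1, 1],
    [0, 0, 0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0, 0, 0],
])

n = len(A)
M = (A | np.eye(n, dtype=int)).astype(bool)   # initial reachability matrix
for k in range(n):                             # Warshall transitive closure
    M = M | (M[:, [k]] & M[[k], :])

# Iterative level partition: a factor sits at the current (top) level when
# its reachability set is contained in its antecedent set.
remaining, levels = set(range(n)), []
while remaining:
    level = [i for i in remaining
             if {j for j in remaining if M[i, j]}
             <= {j for j in remaining if M[j, i]}]
    levels.append(level)
    remaining -= set(level)

for depth, lv in enumerate(levels, 1):
    # Level 1 holds the most dependent factors; the deepest level holds
    # driving factors (analogous to "organizational commitment" above).
    print(f"Level {depth}: factors {lv}")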
Abstract:
Energy auditing is an effective but costly approach for reducing the long-term energy consumption of buildings. When well-executed, energy loss can be quickly identified in the building structure and its subsystems. This then presents opportunities for improving energy efficiency. We present a low-cost, portable technology called "HeatWave" which allows non-experts to generate detailed 3D surface temperature models for energy auditing. This handheld 3D thermography system consists of two commercially available imaging sensors and a set of software algorithms which can be run on a laptop. The 3D model can be visualized in real-time by the operator so that they can monitor their degree of coverage as the sensors are used to capture data. In addition, results can be analyzed offline using the proposed "Spectra" multispectral visualization toolbox. The presence of surface temperature data in the generated 3D model enables the operator to easily identify and measure thermal irregularities such as thermal bridges, insulation leaks, moisture build-up and HVAC faults. Moreover, 3D models generated from subsequent audits of the same environment can be automatically compared to detect changes in conditions and energy use over time.
Abstract:
Object classification is plagued by the issue of session variation. Session variation describes any variation that makes one instance of an object look different from another, for instance due to pose or illumination changes. Recent work on the challenging task of face verification has shown that session variability modelling provides a mechanism to overcome some of these limitations. However, for computer vision purposes it has only been applied in the limited setting of face verification. In this paper we propose a local region-based intersession variability (ISV) modelling approach, termed Local ISV, so that local session variations can be modelled, and apply it to challenging real-world data. We demonstrate the efficacy of this technique on a challenging real-world fish image database which includes images taken underwater, providing significant real-world session variations. The Local ISV approach provides a relative performance improvement of, on average, 23% on the challenging MOBIO, Multi-PIE and SCface face databases. It also provides a relative performance improvement of 35% on our challenging fish image dataset.
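The paper's ISV model operates on GMM mean supervectors; as a much simpler stand-in, the sketch below applies a nuisance-attribute-projection-style linear compensation: estimate the dominant within-class (session) directions from synthetic features and project them out. All data, dimensions and rates here are synthetic and illustrative, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic features: 5 objects, 20 sessions each, 16-D (illustrative).
n_obj, n_sess, d = 5, 20, 16
means = rng.normal(0, 1, (n_obj, d))                  # identity component
session_dir = rng.normal(0, 1, (2, d))                # low-rank session noise
X = np.vstack([means[i] + rng.normal(0, 0.3, (n_sess, 2)) @ session_dir
               + rng.normal(0, 0.05, (n_sess, d))
               for i in range(n_obj)])
labels = np.repeat(np.arange(n_obj), n_sess)

# Estimate the session subspace from within-class scatter (features centred
# per object), then project those directions out of every feature vector.
within = np.vstack([X[labels == i] - X[labels == i].mean(0)
                    for i in range(n_obj)])
_, _, Vt = np.linalg.svd(within, full_matrices=False)
U = Vt[:2].T                                          # top-2 session directions
X_comp = X - (X @ U) @ U.T                            # remove session variation

def within_over_between(Z):
    w = np.mean([Z[labels == i].var(0).sum() for i in range(n_obj)])
    b = Z.reshape(n_obj, n_sess, d).mean(1).var(0).sum()
    return w / b

print("within/between variance ratio before: %.3f after: %.3f"
      % (within_over_between(X), within_over_between(X_comp)))
```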
Abstract:
Food waste is a current challenge faced by both developing and developed countries. This project applied a novel combination of available methods in mechanical, agricultural and food engineering to address these challenges. A systematic approach was devised to investigate possibilities for reducing food waste and increasing the efficiency of industry by applying engineering concepts and theories, including experimental, mathematical and computational modelling methods. This study highlights how a comprehensive understanding of the response of agricultural and food materials to mechanical operations relates directly to the volume of food wasted globally.
Abstract:
Terrorists usually target high-occupancy iconic and public buildings using vehicle-borne incendiary devices in order to claim a maximum number of lives and cause extensive damage to public property. While initial casualties are due to the direct shock of the explosion, the collapse of structural elements may greatly increase the total figure. Most of these buildings have been or are built without consideration of their vulnerability to such events. Therefore, assessing the vulnerability and residual capacity of buildings subjected to deliberately exploded bombs is important for developing mitigation strategies to protect building occupants and property. Explosive loads and their effects on a building have therefore attracted significant attention in the recent past, and comprehensive and economical design strategies must be developed for future construction. This research investigates the response and damage of reinforced concrete (RC) framed buildings, together with their load-bearing key structural components, under a near-field blast event. Finite element method (FEM) based analysis was used to investigate the structural framing system and components for global stability, followed by a rigorous analysis of key structural components for damage evaluation, using the codes SAP2000 and LS-DYNA respectively. The research involved four important areas of structural engineering: blast load determination; numerical modelling with FEM techniques; material performance under high strain rates; and non-linear dynamic structural analysis. The response and damage of an RC framed building under different blast load scenarios were investigated. The blast influence region for a two-dimensional RC frame was investigated for different load conditions, and the critical region for each loading case was identified. Two design methods are recommended for RC columns to provide superior residual capacity: detailing RC columns with multi-layer steel reinforcement cages, and composite columns incorporating a central structural steel core. Both provide post-blast gravity load-resisting capacity, compared with a typical RC column, guarding against catastrophic collapse. Overall, this research broadens the current knowledge of blast and residual capacity analysis of RC framed structures and recommends methods to evaluate and mitigate blast impact on key elements of multi-storey buildings.
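The study's analyses used SAP2000 and LS-DYNA; as a far simpler illustration of the blast load determination and dynamic analysis areas, the sketch below integrates an elastic single-degree-of-freedom column model under an idealised Friedlander blast pulse. Every parameter is illustrative and not taken from the research.

```python
import numpy as np

# Idealised Friedlander blast pulse acting on an elastic SDOF column model.
# All parameters below are illustrative placeholders.
P0, TD, B = 500e3, 0.01, 1.5      # peak overpressure (Pa), duration (s), decay
AREA = 0.5                         # loaded area (m^2)
M, K, ZETA = 500.0, 2.0e7, 0.05    # mass (kg), stiffness (N/m), damping ratio
C = 2 * ZETA * np.sqrt(K * M)      # viscous damping coefficient

def load(t):
    """Friedlander pulse: P0*(1 - t/td)*exp(-b*t/td) for t < td, else 0."""
    return P0 * AREA * (1 - t / TD) * np.exp(-B * t / TD) if t < TD else 0.0

# Central-difference time stepping of m*u'' + c*u' + k*u = F(t), u(0)=u'(0)=0.
dt, n = 1e-5, 5000
u = np.zeros(n)
u[1] = 0.5 * dt**2 * load(0.0) / M          # start-up step
for i in range(1, n - 1):
    F = load(i * dt)
    u[i + 1] = (dt**2 * (F - K * u[i]) / M
                + 2 * u[i] - u[i - 1]
                - dt * C * (u[i] - u[i - 1]) / M)

print("peak displacement: %.2f mm at t = %.2f ms"
      % (1e3 * u.max(), 1e3 * dt * u.argmax()))
```

A real assessment of the kind described above would add material non-linearity, strain-rate effects and member-level damage criteria, which is what the LS-DYNA models provide.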
Abstract:
During the evolution of the music industry, developments in the media environment have required music firms to adapt in order to survive. Changes in broadcast radio programming during the 1950s; the Compact Cassette during the 1970s; and the deregulation of media ownership during the 1990s are all examples of changes which have heavily affected the music industry. This study explores similar contemporary dynamics, examines how decision makers in the music industry perceive and make sense of the developments, and reveals how they revise their business strategies, based on their mental models of the media environment. A qualitative system dynamics model is developed in order to support the reasoning brought forward by the study. The model is empirically grounded, but is also based on previous music industry research and a theoretical platform constituted by concepts from evolutionary economics and sociology of culture. The empirical data primarily consist of 36 personal interviews with decision makers in the American, British and Swedish music industrial ecosystems. The study argues that the proposed model explains contemporary music industry dynamics more effectively than the music industry models presented by previous research initiatives. Supported by the model, the study is able to show how “new” media outlets make old music business models obsolete and challenge the industry’s traditional power structures. It is no longer possible to expose music at one outlet (usually broadcast radio) in the hope that it will lead to sales of the same music at another (e.g. a compact disc). The study shows that many music industry decision makers still have not embraced the new logic, and have not yet challenged their traditional mental models of the media environment. Rather, they remain focused on preserving the pivotal role held by the CD and other physical distribution technologies. Further, the study shows that while many music firms remain attached to the old models, other firms, primarily music publishers, have accepted the transformation, and have reluctantly recognised the realities of a virtualised environment.
Abstract:
Fire safety has become an important part of structural design due to the ever-increasing loss of property and lives during fires. Conventionally, the fire rating of load-bearing wall systems made of light gauge steel frames (LSF) is determined using fire tests based on the standard time-temperature curve in ISO834 [1]. However, modern commercial and residential buildings make use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed fire research study into the fire performance of LSF walls was undertaken using realistic design fire curves developed based on the Eurocode parametric [2] and Barnett’s BFD [3] curves, using both full-scale fire tests and numerical studies. It included LSF walls without cavity insulation, and the recently developed externally insulated composite panel system. This paper presents the details of finite element models developed to simulate the full-scale fire tests of LSF wall panels under realistic design fires. Finite element models of LSF walls exposed to realistic design fires were developed, and analysed under both transient- and steady-state fire conditions using the measured stud time-temperature curves. Transient-state analyses were performed to simulate fire test conditions, while steady-state analyses were performed to obtain the load ratio versus time and failure temperature curves of LSF walls. Details of the developed finite element models and the results, including the axial deformation and lateral deflection versus time curves, and the stud failure modes and times, are presented in this paper. Comparison with fire test results demonstrates the ability of the developed finite element models to predict the performance and fire resistance ratings of LSF walls under realistic design fires.
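For reference, the heating phase of the Eurocode parametric curve [2] (EN 1991-1-2, Annex A) can be computed directly. The compartment parameters below (opening factor, lining thermal inertia) are illustrative placeholders, and the cooling phase is omitted in this sketch.

```python
import numpy as np

# Heating phase of the Eurocode parametric fire curve (EN 1991-1-2, Annex A).
O = 0.04             # opening factor (m^0.5), illustrative
B_THERMAL = 1160.0   # sqrt(rho*c*lambda) of linings (J/m^2 s^0.5 K)
GAMMA = (O / B_THERMAL) ** 2 / (0.04 / 1160.0) ** 2   # time-scale factor

def gas_temperature(t_hours):
    """Gas temperature (deg C) during the heating phase."""
    t_star = t_hours * GAMMA
    return 20 + 1325 * (1 - 0.324 * np.exp(-0.2 * t_star)
                          - 0.204 * np.exp(-1.7 * t_star)
                          - 0.472 * np.exp(-19 * t_star))

for t in (0.25, 0.5, 1.0, 2.0):
    print("t = %.2f h -> %.0f C" % (t, gas_temperature(t)))
```

With the reference values above, GAMMA equals 1 and the curve closely tracks the standard fire; changing the opening factor or lining properties rescales the time axis, which is what makes the curve "realistic" for a given compartment.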
Abstract:
This thesis developed a method for real-time and handheld 3D temperature mapping using a combination of off-the-shelf devices and efficient computer algorithms. It contributes a new sensing and data processing framework to the science of 3D thermography, unlocking its potential for application areas such as building energy auditing and industrial monitoring. New techniques for the precise calibration of multi-sensor configurations were developed, along with several algorithms that ensure both accurate and comprehensive surface temperature estimates can be made for rich 3D models as they are generated by a non-expert user.
Abstract:
Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which are primarily influenced by surface characteristics and rainfall intensity.
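A minimal sketch of the Bayesian/Gibbs sampling idea for a wash-off regression, with conjugate priors and entirely synthetic data; the study's actual OLSR/WLSR formulations and priors are not reproduced here. For WLSR, each observation's residual variance would additionally be scaled by a known weight.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical wash-off data: y = a + b*x + noise, where x might be
# rainfall intensity and y (log) pollutant wash-off. Synthetic only.
n = 50
x = rng.uniform(0, 5, n)
y = 1.0 + 0.6 * x + rng.normal(0, 0.4, n)
X = np.column_stack([np.ones(n), x])

# Gibbs sampler for (beta, sigma^2) with conjugate priors:
# beta ~ N(0, tau^2 I), sigma^2 ~ InvGamma(a0, b0).
tau2, a0, b0 = 100.0, 2.0, 1.0
sigma2, draws = 1.0, []
XtX, Xty = X.T @ X, X.T @ y
for it in range(2000):
    # Draw beta | sigma^2 from its multivariate normal full conditional.
    prec = XtX / sigma2 + np.eye(2) / tau2
    cov = np.linalg.inv(prec)
    beta = rng.multivariate_normal(cov @ (Xty / sigma2), cov)
    # Draw sigma^2 | beta from its inverse-gamma full conditional.
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * resid @ resid))
    if it >= 500:                      # discard burn-in
        draws.append(np.r_[beta, sigma2])

draws = np.array(draws)
print("posterior means [a, b, sigma^2]:", draws.mean(0).round(3))
```

The posterior spread of the sampled coefficients is what provides the uncertainty estimates discussed above, in contrast to the single point estimate a deterministic fit would give.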