572 results for Bayesian adaptive design
Abstract:
This paper explains why the reuse of building components after demolition or deconstruction is critical to the future of the construction industry. An examination of the historical causes of, and responses to, climate change sets the scene as to why governance is becoming increasingly focused on the built environment as a mechanism for controlling the waste generated by demolition, construction and operation. Through an annotated description of the evolving design and construction methodology of a range of timber dwellings (typically 'Queenslanders' of the eras 1880-1900, 1900-1920 and 1920-1940), the paper evaluates the variety of materials that can be used advantageously by those wishing to 'regenerate' a Queenslander. This analysis of 'regeneration' details the constraints on relocation and/or reuse by adaptation, including deconstruction of building components, against the legislative framework requirements of the Queensland Building Act 1975 and the Queensland Sustainable Planning Act 2009, with a specific examination of those of the Building Codes of Australia. The paper concludes with a discussion of these constraints, their impacts on 'regeneration', and the need for further research into the practicalities and drivers of relocation, adaptive reuse, and the suitability of building components for reuse after deconstruction.
Abstract:
Small element spacing in compact arrays results in strong mutual coupling between array elements. Performance degradation associated with the strong coupling can be avoided through the introduction of a decoupling network consisting of interconnected reactive elements. We present a systematic design procedure for decoupling networks of symmetrical arrays with more than three elements and characterized by circulant scattering parameter matrices. The elements of the decoupling network are obtained through repeated decoupling of the characteristic eigenmodes of the array, which allows the calculation of element values using closed-form expressions.
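The key computational fact behind this procedure, that a circulant matrix is diagonalised by the DFT so each characteristic eigenmode can be read off in closed form, can be sketched as follows (a minimal illustration with hypothetical scattering values, not the authors' full decoupling-network design procedure):

```python
import numpy as np

def circulant_eigenmodes(first_row):
    """Eigen-decomposition of a circulant matrix from its first row alone:
    eigenvalues are the DFT of the row; eigenvectors are the DFT columns."""
    n = len(first_row)
    lam = np.fft.fft(first_row)
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    Q = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)
    return lam, Q

# Hypothetical 4-element symmetric array: first row of the circulant S-matrix.
row = np.array([0.10, 0.40j, 0.05, 0.40j])
lam, Q = circulant_eigenmodes(row)

# Full circulant S-matrix, built only to verify the modes.
S = np.array([np.roll(row, k) for k in range(len(row))])
assert np.allclose(S @ Q, Q * lam)   # each column of Q is an eigenmode
```

Because the eigenvectors are fixed by the array symmetry, decoupling each mode in turn reduces to matching one scalar modal reflection coefficient at a time, which is what permits closed-form element values.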
Abstract:
The use of adaptive wing/aerofoil designs is being considered, as they are promising techniques in aeronautics/aerospace, since they can reduce aircraft emissions and improve the aerodynamic performance of manned or unmanned aircraft. This paper investigates robust design and optimization for one type of adaptive technique: an active flow control bump at transonic flow conditions on a natural laminar flow aerofoil. The concept of using a shock control bump is to control supersonic flow on the suction/pressure side of a natural laminar flow aerofoil, which delays shock occurrence (weakening its strength) or boundary-layer separation. Such an active flow control technique reduces total drag at transonic speeds through a reduction of wave drag. The location of boundary-layer transition can influence the position and structure of the supersonic shock on the suction/pressure side of the aerofoil. The boundary-layer transition position is treated as an uncertain design parameter in aerodynamic design because of the many factors that affect it, such as surface contamination or surface erosion. This paper studies shock-control-bump shape design optimization using robust evolutionary algorithms with uncertainty in boundary-layer transition locations. The optimization method is based on a canonical evolution strategy and incorporates the concepts of hierarchical topology, parallel computing, and asynchronous evaluation. Two test cases are conducted: the first assumes the boundary-layer transition position is at 45% of chord from the leading edge, and the second considers robust design optimization of the shock control bump under variability in boundary-layer transition positions. The numerical results show that the optimization method, coupled with uncertainty design techniques, produces Pareto optimal shock-control-bump shapes that have low sensitivity and high aerodynamic performance while achieving significant total drag reduction.
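The robust-design idea, scoring a candidate shape against sampled values of the uncertain transition location inside a canonical evolution strategy, can be sketched as follows (a toy surrogate objective and hypothetical parameters stand in for the CFD model and the paper's parallel, hierarchical setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def drag(x, t):
    """Hypothetical surrogate objective (NOT a CFD model): penalises bump
    parameters x that sit far from the transition location t."""
    return float(np.sum((x - t) ** 2))

def robust_fitness(x, n_samples=16):
    # Transition location is uncertain: sample it around 45% chord.
    ts = rng.normal(0.45, 0.05, n_samples)
    vals = np.array([drag(x, t) for t in ts])
    return vals.mean() + vals.std()      # trade off performance vs sensitivity

def evolve(dim=3, mu=5, lam=20, sigma=0.05, generations=60):
    """Canonical (mu + lambda) evolution strategy."""
    pop = rng.normal(0.5, 0.2, (mu, dim))
    for _ in range(generations):
        parents = pop[rng.integers(0, mu, lam)]
        children = parents + rng.normal(0.0, sigma, (lam, dim))
        both = np.vstack([pop, children])
        scores = np.array([robust_fitness(x) for x in both])
        pop = both[np.argsort(scores)[:mu]]     # (mu + lambda) selection
    return pop[0]

best = evolve()
```

Averaging the objective over sampled transition locations (and penalising its spread) is what steers the search toward shapes with both high performance and low sensitivity, rather than shapes tuned to a single nominal transition point.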
Abstract:
The research objectives of this thesis were to contribute to Bayesian statistical methodology, both in risk assessment and in spatial and spatio-temporal modelling, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas, and use these applications as a springboard for developing new statistical methods as well as undertaking analyses which might give answers to particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems: risk assessment analyses for wastewater, and a four-dimensional dataset assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and to use Bayesian hierarchical models to explore the necessarily complex modelling of four-dimensional agricultural data. The specific objectives of the research were: to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure incorporating all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day's data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days' data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations.
This work forms five papers, two of which have been published, two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of that data as an 'errors-in-variables' problem [Fuller, 1987]. This illustrated a simple method for the incorporation of experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models were used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and non-structured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so would allow little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and estimation of the contrast of interest together with its credible intervals.
These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that, with large datasets, the use of WinBUGS becomes problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that moisture in the soil for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with variances that increase with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, it does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage being analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
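The neighbourhood structure of the CAR layered model, neighbours only within the same depth layer, can be sketched as a block-diagonal precision matrix (a minimal illustration using an assumed proper-CAR form Q = D − ρW and hypothetical grid sizes, not the thesis's WinBUGS or pyMCMC code):

```python
import numpy as np

def car_precision(nrow, ncol, nlayer, rho=0.9):
    """Proper-CAR-style precision Q = D - rho*W with 4-nearest-neighbour
    adjacency inside each depth layer and no neighbours across layers."""
    n = nrow * ncol
    idx = lambda r, c: r * ncol + c
    W_layer = np.zeros((n, n))
    for r in range(nrow):
        for c in range(ncol):
            for dr, dc in ((1, 0), (0, 1)):        # right and down neighbours
                r2, c2 = r + dr, c + dc
                if r2 < nrow and c2 < ncol:
                    W_layer[idx(r, c), idx(r2, c2)] = 1.0
                    W_layer[idx(r2, c2), idx(r, c)] = 1.0
    W = np.kron(np.eye(nlayer), W_layer)           # block-diagonal: layered
    D = np.diag(W.sum(axis=1))                     # neighbour counts
    return D - rho * W

Q = car_precision(3, 3, 4)    # 3x3 sites, 4 depth layers -> 36x36 precision
```

Because the off-diagonal blocks of W are zero, sites in different layers are conditionally independent given their own layer, which is what lets both the spatially structured and unstructured variance parameters differ freely by depth.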
Abstract:
An introduction to the design of methods for eliciting knowledge from experts.
Abstract:
Computer resource allocation represents a significant challenge, particularly for multiprocessor systems, which consist of shared computing resources to be allocated among co-runner processes and threads. While efficient resource allocation yields a highly efficient and stable overall multiprocessor system and good individual thread performance, poor resource allocation causes significant performance bottlenecks even for a system with abundant computing resources. This thesis proposes a cache-aware adaptive closed-loop scheduling framework as an efficient resource allocation strategy for this highly dynamic resource management problem, which requires instant estimation of highly uncertain and unpredictable resource patterns. Many different approaches to this problem have been developed, but neither the dynamic nature nor the time-varying and uncertain characteristics of the resource allocation problem are well considered. These approaches employ either static or dynamic optimization methods, or advanced scheduling algorithms such as the Proportional Fair (PFair) scheduling algorithm. Some of the approaches that do consider the dynamic nature of multiprocessor systems apply only a basic closed-loop system; hence, they fail to take the time-varying and uncertain character of the system into account. Therefore, further research into multiprocessor resource allocation is required. Our closed-loop cache-aware adaptive scheduling framework takes resource availability and resource usage patterns into account by measuring time-varying factors such as cache miss counts, stalls and instruction counts. More specifically, the cache usage pattern of a thread is identified using the QR recursive least squares (RLS) algorithm and cache miss count time-series statistics. For the identified cache resource dynamics, our framework enforces instruction fairness for the threads.
Fairness, in the context of our research project, is defined as resource allocation equity, which reduces co-runner thread dependence in a shared resource environment. In this way, instruction count degradation due to shared cache resource conflicts is overcome. In this respect, our closed-loop cache-aware adaptive scheduling framework contributes to the research field in two major and three minor aspects. The two major contributions lead to the cache-aware scheduling system. The first major contribution is the development of the execution fairness algorithm, which reduces the co-runner cache impact on thread performance. The second major contribution is the development of relevant mathematical models, such as thread execution pattern and cache access pattern models, which formulate the execution fairness algorithm in terms of mathematical quantities. Following the development of the cache-aware scheduling system, our adaptive self-tuning control framework is constructed to add an adaptive closed-loop aspect to it. This control framework consists of two main components: the parameter estimator and the controller design module. The first minor contribution is the development of the parameter estimators; the QR recursive least squares (RLS) algorithm is applied within our framework to estimate the highly uncertain and time-varying cache resource patterns of threads. The second minor contribution is the design of the controller design module; an algebraic controller design algorithm, pole placement, is utilized to design the relevant controller, which is able to provide the desired time-varying control action. The adaptive self-tuning control framework and the cache-aware scheduling system together constitute our final framework, the closed-loop cache-aware adaptive scheduling framework.
The third minor contribution is the validation of this framework's efficiency in overcoming co-runner cache dependency. Time-series statistical counters were developed for the M-Sim multi-core simulator, and the theoretical findings and mathematical formulations were implemented as MATLAB m-files. In this way, the overall framework was tested and the experimental outcomes analyzed. According to our experimental outcomes, we conclude that our closed-loop cache-aware adaptive scheduling framework successfully drives the co-runner cache-dependent thread instruction count to the co-runner-independent instruction count, with an error margin of up to 25% when the cache is highly utilized. In addition, the thread cache access pattern is estimated with 75% accuracy.
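The pattern-identification step can be illustrated with a recursive least squares estimator on a synthetic miss-count series (plain RLS with a forgetting factor rather than the thesis's QR-factorised variant, and a hypothetical AR(2) signal standing in for simulator counters):

```python
import numpy as np

def rls_fit(y, order=2, lam=0.99, delta=100.0):
    """Recursive least squares: online fit of y[t] ~ theta . (y[t-1..t-order])."""
    theta = np.zeros(order)
    P = delta * np.eye(order)                 # inverse-correlation estimate
    for t in range(order, len(y)):
        phi = y[t - order:t][::-1]            # regressor, most recent first
        k = P @ phi / (lam + phi @ P @ phi)   # gain vector
        e = y[t] - theta @ phi                # a-priori prediction error
        theta = theta + k * e                 # coefficient update
        P = (P - np.outer(k, phi @ P)) / lam  # forgetting-factor update
    return theta

# Synthetic "miss-count" series from a known AR(2) process.
rng = np.random.default_rng(1)
true = np.array([0.6, 0.3])
y = np.zeros(500)
for t in range(2, 500):
    y[t] = true[0] * y[t - 1] + true[1] * y[t - 2] + rng.normal(0.0, 0.1)

theta = rls_fit(y)
```

The forgetting factor `lam < 1` down-weights old samples exponentially, which is what lets the estimator track a time-varying cache usage pattern rather than a single stationary one.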
Abstract:
In this paper we present a sequential Monte Carlo algorithm for Bayesian sequential experimental design applied to generalised non-linear models for discrete data. The approach is computationally convenient in that the information in newly observed data can be incorporated through a simple re-weighting step. We also consider a flexible parametric model for the stimulus-response relationship, together with a newly developed hybrid design utility that can produce more robust estimates of the target stimulus in the presence of substantial model and parameter uncertainty. The algorithm is applied to hypothetical clinical trial or bioassay scenarios. In the discussion, potential generalisations of the algorithm are suggested to possibly extend its applicability to a wide variety of scenarios.
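The re-weighting step can be sketched as follows (a hypothetical logistic stimulus-response model and randomly chosen design points stand in for the paper's model and design utility; the resampling and move steps of a full SMC algorithm are omitted):

```python
import numpy as np

rng = np.random.default_rng(2)

def loglik(theta, x, y):
    """Log-likelihood of one binary response y at stimulus x, for every
    particle theta = (intercept, slope) of a hypothetical logistic model."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, 0] + theta[:, 1] * x)))
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return np.where(y, np.log(p), np.log(1.0 - p))

n = 2000
particles = rng.normal(0.0, 2.0, (n, 2))   # draws from the prior
logw = np.zeros(n)                          # log-weights start uniform

true_intercept, true_slope = -1.0, 2.0      # "unknown" truth for simulation
for _ in range(40):                         # sequential observations
    x = rng.uniform(-2.0, 2.0)              # stimulus chosen at this step
    y = rng.random() < 1.0 / (1.0 + np.exp(-(true_intercept + true_slope * x)))
    logw += loglik(particles, x, y)         # the simple re-weighting step

w = np.exp(logw - logw.max())
w /= w.sum()
post_mean = w @ particles                   # posterior mean estimate
```

Each new observation updates the posterior by a single weight multiplication per particle, with no refitting, which is the computational convenience the abstract refers to; in practice the weights degenerate over time, which is why full SMC interleaves resampling and move steps.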
Abstract:
A wireless sensor network collected real-time water-quality measurements to investigate how current irrigation practices—in particular, underground water salination—affect the environment. New protocols provided high end-to-end packet delivery rates in the hostile deployment environment.
Abstract:
Most existing requirements engineering approaches focus on the modelling and specification of IT artefacts, ignoring the environment in which the application is deployed. Although some requirements engineering approaches consider stakeholders' goals, they still focus on the specification of the IT artefacts. However, IT artefacts are embedded in a dynamic organisational environment, and their design and specification cannot be separated from the environment's constant evolution. Therefore, during the initial stages of a requirements engineering process it is advantageous to consider the integration of IT design with organisational design. We have proposed the ADMITO (Analysis, Design and Management of IT and Organisations) approach to represent the dynamic relations between social and material entities, where the latter are divided into technological and organisational entities. In this paper we show how, by using ADMITO in a concrete case, the Queensland Health Payroll (QHP) case, it is possible to have an integrated representation of IT and organisational design supporting organisational change and IT requirements specification.
Abstract:
This book presents a comprehensive study of the learning environment, adopting Grounded Theory methodology in a qualitative, comparative way. It explores the limitations and benefits of face-to-face and virtual design studios as experienced by architecture students and educators at an Australian university, in order to find the optimal combination for a blended environment that enhances the students' experience. The main outcome, a holistic multidimensional blended learning model, provides, through its various modalities, adaptive capacity in a range of settings. The model facilitates learning through self-determination, self-management, and the personalisation of the learning environment. A second outcome, a conceptual design education framework, provides a basic tool for educators to evaluate existing learning environments and to develop new ones flexible enough to respond effectively to a highly dynamic and increasingly technological world. Together these provide a practical framework to assist design schools in improving their educational settings according to a pedagogy that meets today's needs and accommodates tomorrow's changes.
Abstract:
The Old Government House, a former residence of the Queen's representatives in Brisbane, Australia, symbolises the British cultural heritage of colonial Queensland. Located on the campus of the Queensland University of Technology, it is one of the oldest surviving examples of a stately residence in Queensland. Built in the 1860s, the Old Government House was originally intended as a temporary residence for the first governor of the newly independent colony of Queensland. However, it remained the vice-regal residence until 1909, serving eleven succeeding governors. Nearly seven decades later, it became the first building in Queensland to be protected under heritage legislation, establishing its importance as an exemplar of the significance of cultural heritage. The Old Government House has survived 150 years of restoration work, refurbishments, and additions, and through these years it has served the people of Queensland in a multitude of roles. This paper investigates the survival of heritage-listed buildings through their adaptive re-use, focusing on the adaptive re-use of the Old Government House through its refurbishments and additions over a period of 150 years. Through a qualitative research process, the paper endeavours to establish the significance of the restoration work on the Old Government House; the new opportunities that have opened up as a result of that work; the continued maintenance and management of the building through adaptive re-use; the economic benefits of restoration work; and its contribution to the ongoing interest in the preservation of tangible cultural heritage.
Abstract:
This paper reports the outcomes of a pilot study to develop a conceptual framework allowing people to retrofit a building layer to gain better control of their own built environments. The study was initiated by the realisation that discussions surrounding the improvement of building performance tend to be about top-down technological solutions rather than helping and encouraging the bottom-up involvement of building users. While users are the ultimate beneficiaries and their feedback is always appreciated, their direct involvement in managing buildings is often regarded as obstruction or distraction, largely because casual interventions by uninformed building users tend to disrupt the system. Some earlier research has shown, however, that the direct and active participation of users can improve building performance if appropriate training and/or systems are introduced. We also speculate that, in the long run, this would make the built environment more sustainable. With this in mind, we looked for opportunities to retrofit our own office with an interactive layer to study how we could introduce ad-hoc systems for building users. The aim of this paper is to describe our vision and initial attempts, followed by discussion.
Abstract:
Reasoning with uncertain knowledge and belief has long been recognized as an important research issue in Artificial Intelligence (AI). Several methodologies have been proposed in the past, including knowledge-based systems, fuzzy sets, and probability theory. The probabilistic approach became popular mainly due to a knowledge representation framework called Bayesian networks. Bayesian networks have earned a reputation as powerful tools for modeling complex problems involving uncertain knowledge. Uncertain knowledge exists in domains such as medicine, law, geographical information systems and design, as it is difficult to retrieve all knowledge and experience from experts. In the design domain, experts believe that design style is an intangible concept and that its knowledge is difficult to present in a formal way. The aim of this research is to find ways to represent design style knowledge in Bayesian networks. We show that these networks can be used for diagnosis (inference) and classification of design style. Furniture design style is selected as the example domain; however, the method can be used for any other domain.
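The kind of inference involved can be illustrated with a toy discrete network for the furniture example (the structure and every probability below are invented for illustration, and the query is answered by brute-force enumeration rather than a real inference engine):

```python
# Toy Bayesian network: style -> leg shape, style -> ornamentation.
p_style = {"baroque": 0.5, "modern": 0.5}                         # prior P(style)
p_leg = {("curved", "baroque"): 0.9, ("straight", "baroque"): 0.1,
         ("curved", "modern"): 0.2, ("straight", "modern"): 0.8}  # P(leg | style)
p_orn = {("rich", "baroque"): 0.8, ("plain", "baroque"): 0.2,
         ("rich", "modern"): 0.1, ("plain", "modern"): 0.9}       # P(orn | style)

def posterior_style(leg, orn):
    """P(style | observed features), by enumerating the small joint."""
    joint = {s: p_style[s] * p_leg[(leg, s)] * p_orn[(orn, s)] for s in p_style}
    z = sum(joint.values())
    return {s: v / z for s, v in joint.items()}

post = posterior_style("curved", "rich")
# Curved legs with rich ornamentation point strongly to "baroque" here.
```

Classification of a piece's style is then just the arg-max of this posterior, while diagnosis runs the same computation with different variables observed; elicited expert beliefs would supply the conditional probability tables.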