572 results for Bayesian adaptive design
Abstract:
This paper presents the Mossman Mill District Practices Framework, developed in the Wet Tropics region of the Great Barrier Reef catchment in north-eastern Australia to describe the environmental benefits of agricultural management practices in the sugar cane industry. The framework translates complex, unclear, and overlapping environmental plans, policies, and legal arrangements into a simple framework of management practices that landholders can use to improve their management actions. Practices range from old or outdated practices through to aspirational practices with the potential to achieve desired resource condition targets. The framework has been applied by stakeholders at multiple scales to better coordinate and integrate a range of policy arrangements and so improve natural resource management. It has been used to structure monitoring and evaluation in order to underpin a more adaptive approach to planning at mill district and property scales. The framework and approach can potentially be applied across fields of planning where adaptive management is needed, and could overcome many of the criticisms of property-scale and regional Natural Resource Management.
Abstract:
Scaffolds play a pivotal role in tissue engineering (TE), promoting the synthesis of neo-extracellular matrix (ECM) and providing temporary mechanical support for cells during tissue regeneration. Advances introduced by additive manufacturing techniques have significantly improved the ability to regulate scaffold architecture, enhancing control over scaffold shape and porosity. Considerable research effort has therefore been devoted to the fabrication of 3D porous scaffolds with optimized micro-architectural features. This chapter gives an overview of methods for the design of additively manufactured scaffolds and their applicability in TE. Along with a survey of the state of the art, the authors also present a recently developed method, called Load-Adaptive Scaffold Architecturing (LASA), which returns scaffold architectures optimized for a given applied mechanical load system, once the specific stress distribution has been evaluated through Finite Element Analysis (FEA).
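The LASA algorithm itself is not reproduced in the abstract, but its core idea, redistributing scaffold material according to an FEA stress field, can be sketched. The function name, density bounds, and linear stress-to-density mapping below are illustrative assumptions, not the published method:

```python
import numpy as np

def adapt_scaffold_density(von_mises, rho_min=0.1, rho_max=0.6):
    """Map element-wise von Mises stress from a prior FEA run to a local
    relative density: highly stressed regions receive denser struts, while
    lightly loaded regions stay more porous for nutrient transport.
    (Hypothetical mapping for illustration only.)"""
    s = (von_mises - von_mises.min()) / (np.ptp(von_mises) + 1e-12)
    return rho_min + s * (rho_max - rho_min)

# Toy stress field for a 10x10x10 voxel scaffold domain
stress = np.random.default_rng(0).random((10, 10, 10))
density = adapt_scaffold_density(stress)
porosity = 1.0 - density  # local porosity per voxel
```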
Abstract:
This was a comparative study of the feasibility of a net zero energy house in Queensland, Australia. It examines the actual energy use and thermal comfort conditions of an occupied Brisbane home and compares its performance against the 10-star rating scheme for Australian residential buildings. An adaptive comfort psychrometric chart was developed for this analysis, and the house's capacity for natural ventilation was studied through CFD modelling. The study showed that the house met the definition of net zero energy on an annual and monthly basis for lighting, cooking, and space heating/cooling, and on 70% of days for lighting, hot water, and cooking services.
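The abstract does not state which adaptive comfort model underlies the chart; analyses of this kind commonly use the ASHRAE Standard 55 adaptive model, so the following sketch assumes that model:

```python
def adaptive_comfort_band(t_out_mean, acceptability=80):
    """ASHRAE 55 adaptive comfort model: neutral operative temperature as
    a linear function of the prevailing mean outdoor temperature (deg C),
    valid roughly for 10 <= t_out_mean <= 33.5 deg C."""
    t_neutral = 0.31 * t_out_mean + 17.8
    half_width = 3.5 if acceptability == 80 else 2.5  # 80% vs 90% limits
    return t_neutral - half_width, t_neutral + half_width

low, high = adaptive_comfort_band(25.0)  # a warm Brisbane month
print(f"80% acceptability band: {low:.1f}-{high:.1f} deg C")
```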
Abstract:
Spatial data are now prevalent in a wide range of fields, including environmental and health science, which has led to the development of a range of approaches for analysing patterns in these data. In this paper, we compare several Bayesian hierarchical models for analysing point-based data based on discretization of the study region, resulting in grid-based spatial data. The approaches considered include two parametric models and a semiparametric model; we highlight the methodology and computation for each. Two simulation studies compare the performance of these models on various structures of simulated point-based data resembling environmental data, and a case study of a real dataset demonstrates a practical application of the modelling approaches. Goodness-of-fit statistics are computed to compare estimates of the intensity functions, and the deviance information criterion is considered as an alternative model evaluation criterion. The results suggest that the adaptive Gaussian Markov random field model performs well for highly sparse point-based data with large variations or clustering across space, whereas the discretized log Gaussian Cox process produces a good fit for dense, clustered point-based data. One should generally consider the nature and structure of the point-based data in order to choose an appropriate model for discretized spatial point-based data.
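All of the compared models start from the same preprocessing step: discretizing the point pattern into grid-cell counts. A minimal sketch of that step (the grid size and the Poisson reading of the counts are illustrative assumptions):

```python
import numpy as np

def grid_counts(x, y, bounds, nx=20, ny=20):
    """Discretize a spatial point pattern into grid-cell counts, the input
    shared by the grid-based models compared in the paper."""
    xmin, xmax, ymin, ymax = bounds
    counts, _, _ = np.histogram2d(x, y, bins=[nx, ny],
                                  range=[[xmin, xmax], [ymin, ymax]])
    return counts  # cell counts, typically modelled as Poisson(lambda_ij)

rng = np.random.default_rng(1)
x, y = rng.uniform(0, 1, 500), rng.uniform(0, 1, 500)
cells = grid_counts(x, y, bounds=(0, 1, 0, 1))
```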
Abstract:
Quantifying the impact of biochemical compounds on collective cell spreading is an essential element of drug design, with applications including treatments for chronic wounds and cancer. Scratch assays are a technically simple and inexpensive method used to study collective cell spreading; however, most previous interpretations of scratch assays are qualitative and do not provide estimates of the cell diffusivity, D, or the cell proliferation rate, λ. Estimating D and λ is important for investigating the efficacy of a potential treatment and provides insight into the mechanism through which the potential treatment acts. While a few methods for estimating D and λ have been proposed, they lead only to point estimates and provide no insight into the uncertainty in these estimates. Here, we compare various types of information that can be extracted from images of a scratch assay, and quantify D and λ using discrete computational simulations and approximate Bayesian computation. We show that it is possible to robustly recover estimates of D and λ from synthetic data, as well as from a new set of experimental data. For the first time, our approach also provides a method to estimate the uncertainty in our estimates of D and λ. We anticipate that our approach can be generalized to more realistic experimental scenarios in which we are interested in estimating D and λ as well as additional relevant parameters, such as the strength of cell-to-cell or cell-to-substrate adhesion.
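A minimal sketch of the inference idea: rejection-based approximate Bayesian computation over (D, λ). The simulator below is a placeholder, not the paper's discrete random-walk model, and the prior ranges, summary statistic, and tolerance are illustrative assumptions:

```python
import numpy as np

def simulate_front(D, lam, steps=50):
    """Placeholder for the paper's discrete simulation: returns a summary
    of the spreading front over time (illustrative dynamics only)."""
    t = np.arange(1, steps + 1)
    return np.sqrt(4 * D * t) * (1 + lam * t / steps)

def abc_rejection(observed, n=10_000, eps=50.0, rng=None):
    """Rejection ABC: draw (D, lambda) from the prior, keep draws whose
    simulated summary lies within eps of the observed one; the accepted
    draws approximate the joint posterior and hence quantify uncertainty."""
    rng = rng or np.random.default_rng(0)
    kept = []
    for _ in range(n):
        D, lam = rng.uniform(100, 3000), rng.uniform(0.01, 0.10)
        if np.linalg.norm(simulate_front(D, lam) - observed) < eps:
            kept.append((D, lam))
    return np.array(kept)
```

The spread of the accepted (D, λ) pairs, rather than a single best fit, is what yields the uncertainty estimates the paper emphasizes.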
Abstract:
We present a systematic, practical approach to developing risk prediction systems, suitable for use with large databases of medical information. An important part of this approach is a novel feature selection algorithm which uses the area under the receiver operating characteristic (ROC) curve to measure the expected discriminative power of different sets of predictor variables. We describe this algorithm and use it to select variables to predict risk of a specific adverse pregnancy outcome: failure to progress in labour. Neural network, logistic regression and hierarchical Bayesian risk prediction models are constructed, all of which achieve close to the limit of performance attainable on this prediction task. We show that better prediction performance requires more discriminative clinical information rather than improved modelling techniques. It is also shown that better diagnostic criteria in clinical records would greatly assist the development of systems to predict risk in pregnancy.
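A sketch of AUC-driven forward selection in the spirit of the described algorithm; the paper's exact scoring of "expected discriminative power" is not reproduced, and cross-validated logistic regression is used here as a stand-in scorer:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

def greedy_auc_selection(X, y, max_features=10):
    """Forward selection that scores each candidate variable set by the
    cross-validated area under the ROC curve, stopping when no remaining
    variable improves discriminative power."""
    selected, remaining, best_auc = [], list(range(X.shape[1])), 0.0
    while remaining and len(selected) < max_features:
        scores = {}
        for j in remaining:
            cols = selected + [j]
            probs = cross_val_predict(LogisticRegression(max_iter=1000),
                                      X[:, cols], y, cv=5,
                                      method="predict_proba")[:, 1]
            scores[j] = roc_auc_score(y, probs)
        j_best = max(scores, key=scores.get)
        if scores[j_best] <= best_auc:
            break  # no candidate improves the AUC
        best_auc = scores[j_best]
        selected.append(j_best)
        remaining.remove(j_best)
    return selected, best_auc
```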
Abstract:
In this chapter, we explore methods for automatically generating game content—and games themselves—adapted to individual players in order to improve their playing experience or achieve a desired effect. This goes beyond notions of mere replayability and involves modeling player needs to maximize their enjoyment, involvement, and interest in the game being played. We identify three main aspects of this process: generation of new content and rule sets, measurement of this content and the player, and adaptation of the game to change player experience. This process forms a feedback loop of constant refinement, as games are continually improved while being played. Framed within this methodology, we present an overview of our recent and ongoing research in this area. This is illustrated by a number of case studies that demonstrate these ideas in action over a variety of game types, including 3D action games, arcade games, platformers, board games, puzzles, and open-world games. We draw together some of the lessons learned from these projects to comment on the difficulties, the benefits, and the potential for personalized gaming via adaptive game design.
Abstract:
Utilities worldwide are focused on supplying peak electricity demand reliably and cost-effectively, which requires a thorough understanding of all the factors influencing residential electricity use at peak times. An electricity demand reduction project based on comprehensive residential consumer engagement was established within an Australian community in 2008, and by 2011 peak demand had decreased to below pre-intervention levels. This paper applied field data, gathered through qualitative in-depth interviews with 22 residential households in the community, to a Bayesian Network complex-system model, to examine whether the model could explain the successful peak demand reduction at the case study location. Insights into the major influential factors, and into the potential impact of changes to these factors on peak demand, would underpin demand reduction intervention strategies for a wider target group.
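A toy fragment showing how a Bayesian Network of this kind can encode influences on peak demand and answer what-if queries, sketched with the pgmpy library; the variable names, structure, and probabilities are invented for illustration and are not the study's model:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical fragment: engagement and weather drive behaviour and demand.
model = BayesianNetwork([("Engagement", "Behaviour"),
                         ("Behaviour", "PeakDemand"),
                         ("Weather", "PeakDemand")])
model.add_cpds(
    TabularCPD("Engagement", 2, [[0.6], [0.4]]),
    TabularCPD("Weather", 2, [[0.7], [0.3]]),
    TabularCPD("Behaviour", 2, [[0.8, 0.3], [0.2, 0.7]],
               evidence=["Engagement"], evidence_card=[2]),
    TabularCPD("PeakDemand", 2,
               [[0.9, 0.6, 0.5, 0.2], [0.1, 0.4, 0.5, 0.8]],
               evidence=["Behaviour", "Weather"], evidence_card=[2, 2]))

# What-if query: posterior peak demand given high engagement
posterior = VariableElimination(model).query(
    ["PeakDemand"], evidence={"Engagement": 1})
print(posterior)
```

Querying the posterior of PeakDemand under different evidence mirrors the paper's use of the model to explore how changes to influential factors shift peak demand.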
Abstract:
In this paper we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method incorporates an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC^2. The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series, and the cumulative number of prion disease cases in mule deer.
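A sketch of the alive particle filter at the heart of the method, for a model where transitions can be simulated but the observation likelihood is intractable. Matching here is exact (a simulated count must equal the observed count), and the generic `step` function is an assumed stand-in for the model-specific transition simulator:

```python
import numpy as np

def alive_particle_filter(y, step, x0, N=100, rng=None):
    """Alive particle filter for a low-count time series: at each time,
    keep simulating transitions from randomly chosen surviving particles
    until N simulated observations match the observed count y[t]. The
    likelihood contribution uses the negative-binomial estimator
    (N - 1) / (trials - 1), which keeps the estimate unbiased."""
    rng = rng or np.random.default_rng(0)
    particles, loglik = list(x0), 0.0
    for y_t in y:
        alive, trials = [], 0
        while len(alive) < N:
            parent = particles[rng.integers(len(particles))]
            x_new, y_sim = step(parent, rng)  # model-specific simulator
            trials += 1
            if y_sim == y_t:
                alive.append(x_new)
        loglik += np.log((N - 1) / (trials - 1))
        particles = alive
    return loglik
```

Embedding this unbiased likelihood estimator within an SMC sampler over the model parameters is what yields the pseudo-marginal alive SMC^2 algorithm.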
Abstract:
Purpose – The purpose of this paper is to describe an innovative compliance control architecture for hybrid multi-legged robots. The approach was verified on the hybrid legged-wheeled robot ASGUARD, which was inspired by quadruped animals. The adaptive compliance controller allows the system to cope with a variety of stairs and very rough terrain, and is also able to move with high velocity on flat ground without changing the control parameters.

Design/methodology/approach – The paper shows how this adaptivity results in a versatile controller for hybrid legged-wheeled robots. For locomotion control we use an adaptive model of motion pattern generators. The control approach takes into account the proprioceptive information of the torques applied to the legs. The controller itself is embedded on an FPGA-based, custom-designed motor control board. An additional proprioceptive inclination feedback is used to make the same controller more robust in terms of stair-climbing capabilities.

Findings – The robot is well suited for disaster mitigation as well as for urban search and rescue missions, where it is often necessary to place sensors or cameras in dangerous or inaccessible areas to give rescue personnel better situational awareness before they enter a possibly dangerous area. A rugged, waterproof and dust-proof corpus and the ability to swim are additional features of the robot.

Originality/value – Contrary to existing approaches, no pre-defined walking pattern for stair-climbing was used; instead, the approach adapts based only on internal sensor information. In contrast to many other walking-pattern-based robots, direct proprioceptive feedback was used to modify the internal control loop, thus adapting the compliance of each leg online.
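The controller itself is not given in the abstract; the sketch below only illustrates the general pattern it describes, a motion pattern generator per leg whose compliance is adapted online from measured joint torques. All constants and the linear update rule are assumptions, not ASGUARD's controller:

```python
import numpy as np

def cpg_leg_command(t, phase_offset, freq=1.0, amp=1.0):
    """Phase-oscillator motion pattern generator for one leg: returns the
    commanded joint position at time t (seconds)."""
    return amp * np.sin(2 * np.pi * freq * t + phase_offset)

def adapt_compliance(torque_measured, torque_nominal, gain, k=0.05):
    """Illustrative proprioceptive compliance adaptation: soften the
    position gain when measured joint torque exceeds the nominal load
    (e.g. a leg striking a stair edge), stiffen it when torque is low."""
    error = torque_measured - torque_nominal
    return float(np.clip(gain - k * error, 0.1, 2.0))
```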
Abstract:
Piezoelectric polymers based on polyvinylidene fluoride (PVDF) are of interest as smart materials for novel space-based telescope applications. Dimensional adjustments of adaptive thin polymer films are achieved via controlled charge deposition. Predicting their long-term performance requires a detailed understanding of the piezoelectric property changes that develop during space environmental exposure. The overall materials performance is governed by a combination of chemical and physical degradation processes occurring in low Earth orbit as established by our past laboratory-based materials performance experiments (see report SAND 2005-6846). Molecular changes are primarily induced via radiative damage, and physical damage from temperature and atomic oxygen exposure is evident as depoling, loss of orientation and surface erosion. The current project extension has allowed us to design and fabricate small experimental units to be exposed to low Earth orbit environments as part of the Materials International Space Station Experiments program. The space exposure of these piezoelectric polymers will verify the observed trends and their degradation pathways, and provide feedback on using piezoelectric polymer films in space. This will be the first time that PVDF-based adaptive polymer films will be operated and exposed to combined atomic oxygen, solar UV and temperature variations in an actual space environment. The experiments are designed to be fully autonomous, involving cyclic application of excitation voltages, sensitive film position sensors and remote data logging. This mission will provide critically needed feedback on the long-term performance and degradation of such materials, and ultimately the feasibility of large adaptive and low weight optical systems utilizing these polymers in space.
Abstract:
The era of knowledge-based urban development has led to an unprecedented increase in the mobility of people and the subsequent growth of new typologies of agglomerated enclaves of knowledge, such as knowledge and innovation spaces. Within this context, a new role has been assigned to contemporary public spaces: to attract and retain the mobile knowledge workforce by creating a sense of place. This paper investigates place making in the globalized knowledge economy, understood as place making that develops a spatio-temporal sense of permanence for knowledge workers with a particular set of characteristics, and that is simultaneously process-dependent, shaped by internal and external flows, and contributing substantially to the development of the broader context with which it stands in relation. The paper reviews the literature and highlights observations from Kelvin Grove Urban Village, located in Australia's new world city, Brisbane, to understand the application of urban design as a vehicle to create and sustain place making in knowledge and innovation spaces. This research seeks to analyze the modified permeable typology of public spaces that makes knowledge and innovation spaces more viable and adaptive to the changing needs of the contemporary globalized knowledge society.
Abstract:
So far, most Phase II trials have been designed and analysed under a frequentist framework, in which a trial is designed so that its overall Type I and Type II errors are controlled at desired levels. Recently, a number of articles have advocated the use of Bayesian designs in practice. Under a Bayesian framework, a trial is designed so that it stops when the posterior probability of the treatment effect lies within certain prespecified thresholds. In this article, we argue that trials under a Bayesian framework can also be designed to control frequentist error rates, and we introduce a Bayesian version of Simon's well-known two-stage design to achieve this goal. We also consider two other errors, called Bayesian errors in this article because of their similarity to posterior probabilities, and show that our method can control these Bayesian-type errors as well. We compare our method with other recent Bayesian designs in a numerical study and discuss the implications of the different designs for error rates. An example of a clinical trial for patients with nasopharyngeal carcinoma is used to illustrate the differences between the designs.
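A sketch of how such a design can be evaluated: a posterior probability of efficacy under a Beta prior drives the stopping decision, while frequentist error rates are checked by simulation. The prior, thresholds, and two-stage rule below are illustrative choices, not the paper's calibrated design:

```python
import numpy as np
from scipy import stats

def posterior_prob_efficacy(successes, n, p0, a=1.0, b=1.0):
    """P(p > p0 | data) under a Beta(a, b) prior: the stopping quantity
    in a Bayesian two-stage design (prior chosen for illustration)."""
    return 1.0 - stats.beta.cdf(p0, a + successes, b + n - successes)

def frequentist_error(p_true, n1, n2, p0, go_threshold=0.9,
                      n_sim=100_000, rng=None):
    """Monte Carlo estimate of the probability of declaring efficacy when
    the true response rate is p_true, for a simple two-stage rule:
    continue past stage 1 unless the interim posterior drops below 0.1,
    then declare efficacy if the final posterior exceeds go_threshold."""
    rng = rng or np.random.default_rng(0)
    x1 = rng.binomial(n1, p_true, n_sim)
    x2 = rng.binomial(n2, p_true, n_sim)
    cont = posterior_prob_efficacy(x1, n1, p0) >= 0.1
    go = posterior_prob_efficacy(x1 + x2, n1 + n2, p0) > go_threshold
    return np.mean(cont & go)

# Type I error at the null rate p0 = 0.2 for a hypothetical 15+15 design
print(frequentist_error(0.2, 15, 15, p0=0.2))
```

Evaluating the same rule at p_true above p0 gives the power, so thresholds can be tuned until both frequentist error rates hit their targets.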
Abstract:
Stallard (1998, Biometrics 54, 279-294) recently used Bayesian decision theory for sample-size determination in phase II trials. His design maximizes the expected financial gain in the development of a new treatment; however, it results in a very high probability (0.65) of recommending an ineffective treatment for phase III testing. On the other hand, the expected gain using his design is more than 10 times that of a design that tightly controls the false positive error (Thall and Simon, 1994, Biometrics 50, 337-349). Stallard's design maximizes the expected gain per phase II trial, but it does not maximize the rate of gain, or the total gain over a fixed length of time, because the rate of gain depends on the proportion of treatments forwarded to the phase III study. We suggest maximizing the rate of gain instead; the resulting optimal one-stage design is twice as efficient as Stallard's one-stage design. Furthermore, the new design has a probability of only 0.12 of passing an ineffective treatment to the phase III study.
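The argument can be formalized with a simple renewal-reward calculation (the notation here is assumed, not taken from the paper): if a design yields expected gain E[G] per phase II trial, phase II takes time t_2, and a proportion q of treatments is forwarded to a phase III study of duration t_3, the long-run rate of gain is roughly

```latex
\[
  R \;=\; \frac{\mathbb{E}[G]}{t_2 + q\, t_3},
\]
```

so a design with a higher expected gain per trial but a much higher forwarding probability q (0.65 versus 0.12 in the comparison above) can still achieve a lower rate of gain than a stricter design.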