247 results for D-optimal design
Abstract:
This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the expected value of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, together with the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple, tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
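The core computation the abstract describes, choosing the design that maximises an expected utility over draws from the prior and the data, can be sketched with nested Monte Carlo. Everything below is an illustrative assumption, not the authors' II/MCMC algorithm: the death-process simulator, the gamma prior, the crude rejection-style posterior approximation (standing in for the II posterior), and the negative-posterior-variance utility.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_deaths(theta, design_time, n0=50):
    # Toy death process: each of n0 individuals survives to design_time
    # independently with probability exp(-theta * design_time).
    p_survive = np.exp(-theta * design_time)
    return rng.binomial(n0, p_survive)

def expected_utility(design_time, n_outer=200, n_inner=200):
    # U(d) = E_{theta, y}[ u(d, y) ], estimated by nested Monte Carlo.
    # Utility here: negative posterior variance of theta (illustrative).
    utilities = []
    for _ in range(n_outer):
        theta_true = rng.gamma(2.0, 0.5)           # draw from the prior
        y = simulate_deaths(theta_true, design_time)
        # Crude rejection posterior: keep prior draws whose simulated
        # data fall close to the observed count.
        thetas = rng.gamma(2.0, 0.5, size=n_inner)
        sims = np.array([simulate_deaths(t, design_time) for t in thetas])
        kept = thetas[np.abs(sims - y) <= 2]
        if kept.size > 1:
            utilities.append(-kept.var())
    return float(np.mean(utilities)) if utilities else 0.0

# Compare two candidate observation times; the better design is the one
# with the larger (less negative) expected utility.
u1, u2 = expected_utility(0.5), expected_utility(2.0)
```

In the paper's setting the rejection step would be replaced by the II posterior built from the auxiliary model, and the maximisation over designs by the Müller et al. MCMC scheme.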
Abstract:
A power-electronics-based buffer is examined in which, through control of its PWM converters, the buffer-load combination is driven to operate in either constant-power or constant-impedance mode. A battery incorporated within the buffer provides the energy storage needed to facilitate the necessary power flow control. Real power demand from the upstream supply is regulated under fault conditions, reducing the possibility of voltage or network instability. The proposed buffer is also applied to a wind farm, where it is shown to stabilize the power contribution from the farm. Based on a battery cost-benefit analysis, a method is developed to determine the optimal level of power supplied from the wind farm and the corresponding capacity of the battery storage system.
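The distinction between the two operating modes can be shown with a toy calculation (all numbers invented, not from the paper): under a voltage sag, a constant-impedance load sheds power quadratically with voltage, while a constant-power load holds its demand, which is why the choice of mode matters for voltage stability.

```python
# Constant-impedance load: P = V^2 / R, so power falls with the square
# of the voltage during a sag.
def power_constant_impedance(v, r=10.0):
    return v ** 2 / r

# Constant-power load: the converter regulates to a fixed setpoint
# regardless of terminal voltage (within its rating).
def power_constant_power(v, p_set=1000.0):
    return p_set

v_nominal, v_sag = 100.0, 70.0                   # volts, hypothetical
p_z_nom = power_constant_impedance(v_nominal)    # 1000 W at nominal
p_z_sag = power_constant_impedance(v_sag)        # 490 W during the sag
p_p_sag = power_constant_power(v_sag)            # still 1000 W
```

The constant-power mode draws more current as voltage falls, deepening the sag; the buffer's battery lets it switch toward constant-impedance behaviour under fault conditions instead.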
Abstract:
Purpose: Two diodes which do not require correction factors for small-field relative output measurements are designed and validated using experimental methodology. This was achieved by adding an air layer above the active volume of the diode detectors, which canceled out the increased response of the diodes in small fields relative to standard field sizes.
Methods: Due to the increased density of silicon and other components within a diode, additional electrons are created. In very small fields, a very small air gap acts as an effective filter of electrons with a high angle of incidence. The aim was to design a diode that balanced these perturbations to give a response similar to a water-only geometry. Three thicknesses of air were placed at the proximal end of a PTW 60017 electron diode (PTWe) using an adjustable "air cap". A set of output ratios (OR_Det^{fclin}) for square field sizes of side length down to 5 mm was measured using each air thickness and compared to OR_Det^{fclin} measured using an IBA stereotactic field diode (SFD). The correction factor k_{Qclin,Qmsr}^{fclin,fmsr} was transferred from the SFD to the PTWe diode and plotted as a function of air gap thickness for each field size. This enabled the optimal air gap thickness to be obtained by observing which thickness of air was required such that k_{Qclin,Qmsr}^{fclin,fmsr} was equal to 1.00 at all field sizes. A similar procedure was used to find the optimal air thickness required to make a modified Sun Nuclear EDGE detector (EDGEe) which is "correction-free" in small-field relative dosimetry. In addition, the feasibility of experimentally transferring k_{Qclin,Qmsr}^{fclin,fmsr} values from the SFD to unknown diodes was tested by comparing the experimentally transferred k_{Qclin,Qmsr}^{fclin,fmsr} values for unmodified PTWe and EDGEe diodes to Monte Carlo simulated values.
Results: 1.0 mm of air was required to make the PTWe diode correction-free. This modified diode (PTWe_air) produced output factors equivalent to those in water at all field sizes (5-50 mm). The optimal air thickness required for the EDGEe diode was found to be 0.6 mm. The modified diode (EDGEe_air) produced output factors equivalent to those in water, except at field sizes of 8 and 10 mm, where it measured approximately 2% greater than the relative dose to water. The experimentally calculated k_{Qclin,Qmsr}^{fclin,fmsr} for both the PTWe and EDGEe diodes (without air) matched Monte Carlo simulated results, showing that it is feasible to transfer k_{Qclin,Qmsr}^{fclin,fmsr} from one commercially available detector to another using experimental methods and the recommended experimental setup.
Conclusions: It is possible to create a diode which does not require corrections for small-field output factor measurements. This has been performed and verified experimentally. The ability of a detector to be "correction-free" depends strongly on its design and composition. A non-water-equivalent detector can only be "correction-free" if competing perturbations of the beam cancel out at all field sizes. This should not be confused with true water equivalency of a detector.
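The transfer procedure described above rests on the generic small-field formalism: the true field output factor equals a detector's measured output ratio multiplied by its correction factor, so a detector with known corrections (the SFD here) can calibrate an unknown one. A minimal arithmetic sketch, with invented numbers rather than the paper's measured values:

```python
# Field output factor: OF = OR_det * k_det (Alfonso-style formalism).
# Equating the OF obtained from two detectors in the same field gives:
#     k_unknown = (OR_known * k_known) / OR_unknown

def transfer_k(or_known, k_known, or_unknown):
    return (or_known * k_known) / or_unknown

# Hypothetical 5 mm field example:
or_sfd, k_sfd = 0.640, 1.020   # SFD under-responds; k corrects upward
or_ptwe = 0.680                # unmodified diode over-responds
k_ptwe = transfer_k(or_sfd, k_sfd, or_ptwe)   # < 1, i.e. reads high
```

A "correction-free" detector is one whose k comes out as 1.00 at every field size, which is exactly the criterion used to pick the optimal air-gap thickness.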
Abstract:
CONTEXT AND OBJECTIVE: Suboptimal vitamin D status can be corrected by vitamin D supplementation, but individual responses to supplementation vary. We aimed to examine genetic and nongenetic determinants of change in serum 25-hydroxyvitamin D (25(OH)D) after supplementation. DESIGN AND PARTICIPANTS: We used data from a pilot randomized controlled trial in which 644 adults aged 60 to 84 years were randomly assigned to monthly doses of placebo, 30 000 IU, or 60 000 IU vitamin D3 for 12 months. Baseline characteristics were obtained from a self-administered questionnaire. Eighty-eight single-nucleotide polymorphisms (SNPs) in 41 candidate genes were genotyped using Sequenom MassArray technology. Serum 25(OH)D levels before and after the intervention were measured using the Diasorin Liaison platform immunoassay. We used linear regression models to examine associations between genetic and nongenetic factors and change in serum 25(OH)D levels. RESULTS: Supplement dose and baseline 25(OH)D level explained 24% of the variability in response to supplementation. Body mass index, self-reported health status, and ambient UV radiation made a small additional contribution. SNPs in CYP2R1, IRF4, MC1R, CYP27B1, VDR, TYRP1, MCM6, and HERC2 were associated with change in 25(OH)D level, although only CYP2R1 remained significant after adjustment for multiple testing. Models including SNPs explained a similar proportion of variability in response to supplementation as models that included personal and environmental factors. CONCLUSION: Stepwise regression analyses suggest that genetic variability may be associated with response to supplementation, implying that some people might need higher doses to reach optimal 25(OH)D levels, or that there is variability in the physiologically normal level of 25(OH)D.
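The style of analysis described, a linear model regressing change in serum 25(OH)D on dose and baseline level with the fraction of variability explained read off as R², can be sketched on synthetic data. The coefficients, noise level, and baseline distribution below are invented for illustration; they are not the trial's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 644  # same sample size as the trial, data entirely synthetic

dose = rng.choice([0.0, 30000.0, 60000.0], size=n)   # monthly IU
baseline = rng.normal(55.0, 15.0, size=n)            # nmol/L, assumed
# Assumed response: rises with dose, falls with baseline, plus noise.
change = 0.0006 * dose - 0.3 * baseline + rng.normal(0.0, 12.0, size=n)

# Ordinary least squares with an intercept.
X = np.column_stack([np.ones(n), dose, baseline])
beta, *_ = np.linalg.lstsq(X, change, rcond=None)
resid = change - X @ beta
r2 = 1.0 - resid.var() / change.var()   # proportion of variability explained
```

In the study itself, SNP genotypes would enter as additional columns of X, and the comparison of R² with and without them is what supports the conclusion about genetic contributions.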
Abstract:
Integrating renewable energy into public space is becoming more common as a climate change solution. However, this approach is often guided by the environmental pillar of sustainability, with less focus on the economic and social pillars. The purpose of this paper is to examine this issue in the speculative renewable energy propositions for Freshkills Park in New York City submitted for the 2012 Land Art Generator Initiative (LAGI) competition. This paper first proposes an optimal electricity distribution (OED) framework in and around public spaces based on relevant ecology and energy theory (Odum's fourth and fifth laws of thermodynamics). This framework addresses social engagement related to public interaction, and economic engagement related to the estimated quantity of electricity produced, in conjunction with environmental engagement related to the embodied energy required to construct the renewable energy infrastructure. Next, the study uses the OED framework to analyse the top twenty-five projects submitted for the LAGI 2012 competition. The findings reveal an electricity distribution imbalance and suggest a lack of in-depth understanding about sustainable electricity distribution within public space design. The paper concludes with suggestions for future research.
Abstract:
The hemodynamic response function (HRF) describes the local response of brain vasculature to functional activation. Accurate HRF modeling enables the investigation of cerebral blood flow regulation and improves our ability to interpret fMRI results. Block designs have been used extensively as fMRI paradigms because detection power is maximized; however, block designs are not optimal for HRF parameter estimation. Here we assessed the utility of block design fMRI data for HRF modeling. The trueness (relative deviation), precision (relative uncertainty), and identifiability (goodness-of-fit) of different HRF models were examined, and test-retest reproducibility of HRF parameter estimates was assessed, using computer simulations and fMRI data from 82 healthy young adult twins acquired on two occasions 3 to 4 months apart. The effects of systematically varying attributes of the block design paradigm were also examined. In our comparison of five HRF models, the model comprising the sum of two gamma functions with six free parameters had the greatest parameter accuracy and identifiability. Hemodynamic response function height and time to peak were highly reproducible between studies, and width was moderately reproducible, but the reproducibility of onset time was low. This study established the feasibility and test-retest reliability of estimating HRF parameters using data from block design fMRI studies.
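The best-performing model in the comparison, a combination of two gamma functions, is commonly implemented as a positive gamma lobe minus a scaled undershoot (the SPM-style canonical HRF). A sketch with conventional default parameters; the peak times, dispersions, and undershoot ratio below are standard defaults, not the paper's fitted estimates.

```python
import numpy as np
from scipy.stats import gamma

def double_gamma_hrf(t, peak1=6.0, peak2=16.0, disp1=1.0, disp2=1.0,
                     ratio=6.0):
    # Positive response lobe minus a scaled post-stimulus undershoot.
    # With scale = dispersion, gamma.pdf peaks at (shape - 1) * scale,
    # i.e. near peak1 - disp1 and peak2 - disp2 respectively.
    pos = gamma.pdf(t, peak1 / disp1, scale=disp1)
    neg = gamma.pdf(t, peak2 / disp2, scale=disp2)
    return pos - neg / ratio

t = np.arange(0.0, 32.0, 0.1)        # seconds
h = double_gamma_hrf(t)
t_peak = t[np.argmax(h)]             # time to peak of the response
```

Fitting the six free parameters of such a model to block-design data, and checking how reproducibly height, time to peak, width, and onset come back across sessions, is the core of the study's assessment.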
Abstract:
Several articles in this journal have studied optimal designs for testing a series of treatments to identify promising ones for further study. These designs formulate testing as an ongoing process that continues until a promising treatment is identified. This formulation is considered to be more realistic but substantially increases the computational complexity. In this article, we show that these new designs, which control the error rates for a series of treatments, can be reformulated as conventional designs that control the error rates for each individual treatment. This reformulation leads to a more meaningful interpretation of the error rates and hence easier specification of the error rates in practice. The reformulation also allows us to use conventional designs from published tables or standard computer programs to design trials for a series of treatments. We illustrate this reformulation using a study in soft tissue sarcoma.
Abstract:
Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas that seem self-evident and inevitable after the event. Yet the next step is anything but obvious for the artist/creator/inventor/designer stuck at that point just before the creative leap. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity; it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques. By the time the problem has been defined, it has been solved; indeed, the solution is often the very definition of the problem. Design must be creative, or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question must arise: can we find some way of searching the space ahead? Of course there are serious problems of knowing what we are looking for and of the vastness of the search space. It may be better to discard altogether the term "searching" in the context of the design process. Conceptual analogies such as search, search spaces, and fitness landscapes aim to elucidate the design process. However, the vastness of the multidimensional spaces involved makes these analogies misleading, and they thereby further confound the issue. The term "search" becomes a misnomer, since it carries the connotation that it is possible to find what you are looking for; in such vast spaces the term must be discarded. Thus, any attempt at searching for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious.
Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless, we still have this tantalizing possibility: if a creative idea seems inevitable after the event, then might the process somehow be reversed? This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal-directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). Nevertheless, Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as evolutionary algorithms, which are usually thought of as search algorithms. It is necessary to abandon such connections with searching and to see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments.
Most importantly, nature has all the time in the world. As designers we cannot afford such profligate prototyping and ruthless experiment, nor can we operate on the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).