85 results for Central composite experimental design


Relevance: 100.00%

Abstract:

Traffic-generated semi-volatile and non-volatile organic compounds (SVOCs and NVOCs) pose a serious threat to human and ecosystem health when washed off into receiving water bodies by stormwater. Rainfall characteristics influenced by climate change make the estimation of these pollutants in stormwater quite complex. The research study discussed in the paper developed a prediction framework for such pollutants under the dynamic influence of climate change on rainfall characteristics. Principal component analysis (PCA) established that the intensity and duration of low to moderate rain events induced by climate change mainly affect the wash-off of SVOCs and NVOCs from urban roads. The approach overcame the need for stringent laboratory preparation of calibration matrices by extracting uncorrelated underlying factors in the data matrices through systematic application of PCA and factor analysis (FA). Based on the initial findings from PCA and FA, the framework incorporated an orthogonal rotatable central composite experimental design to set up the calibration matrices and partial least squares regression to identify significant variables for predicting the target SVOCs and NVOCs in four particulate fractions spanning >300 μm to 1 μm and one dissolved fraction of <1 μm. For the particulate fractions in the >300-1 μm range, similar distributions of predicted and observed concentrations of the target compounds were achieved from the minimum to the 75th percentile. The inter-event coefficients of variation for these particulate fractions were 5% to 25%. The limited solubility of the target compounds in stormwater restricted the predictive capacity of the proposed method for the dissolved fraction of <1 μm.
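
A minimal sketch of the general PCA-then-PLS workflow described above, using scikit-learn on synthetic data. The predictor names, factor counts, and data are illustrative assumptions, not the study's calibration matrices.

```python
# Illustrative PCA + PLS regression workflow on synthetic data (not the study's data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical predictors: e.g. rainfall intensity, duration, antecedent dry days, runoff volume.
n_events = 120
X = rng.normal(size=(n_events, 4))
# Hypothetical response: SVOC/NVOC load driven mainly by two underlying factors plus noise.
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.3 * rng.normal(size=n_events)

# Step 1: PCA/FA-style screening to see how many uncorrelated factors dominate the predictors.
pca = PCA()
pca.fit(X)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

# Step 2: partial least squares regression to predict the target compound loads.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=2)
pls.fit(X_train, y_train)
print("held-out R^2:", round(pls.score(X_test, y_test), 3))
```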

Relevance: 100.00%

Abstract:

This study examines a matrix of synthetic water samples designed to include conditions that favour brominated disinfection by-product (Br-DBP) formation, in order to provide predictive models suitable for high Br-DBP-forming waters such as salinity-impacted waters. Br-DBPs are generally more toxic than their chlorinated analogues, and their formation may be favoured under specific conditions by routine water treatment practices such as coagulation/flocculation; therefore, the circumstances surrounding their formation must be understood. The chosen factors were bromide concentration, mineral alkalinity, bromide to dissolved organic carbon (Br/DOC) ratio and Suwannee River natural organic matter concentration. The relationships between these parameters and DBP formation were evaluated by response surface modelling of data generated using a face-centred central composite experimental design. Predictive models for ten brominated and/or chlorinated DBPs are presented, as well as models for total trihalomethanes (tTHMs) and total dihaloacetonitriles (tDHANs), and bromide substitution factors for the THM and DHAN classes. The relationships described revealed that increasing alkalinity and increasing Br/DOC ratio were associated with increasing bromination of THMs and DHANs, suggesting that DOC-lowering treatment methods that do not also remove bromide, such as enhanced coagulation, may create optimal conditions for Br-DBP formation in waters in which bromide is present.
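
A sketch of how a face-centred central composite design can be generated in coded units and a quadratic response surface fitted by least squares. The four factors are taken from the abstract in coded form, but the response values and coefficients are simulated assumptions, not the study's data.

```python
# Sketch: face-centred central composite design (coded units) and a quadratic response surface fit.
import itertools
import numpy as np

def face_centred_ccd(k, n_center=3):
    """Factorial corners (+/-1), axial points at +/-1 on each axis (alpha = 1), centre replicates."""
    corners = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.vstack([v * np.eye(k)[i] for i in range(k) for v in (-1.0, 1.0)])
    centre = np.zeros((n_center, k))
    return np.vstack([corners, axial, centre])

def quadratic_terms(X):
    """Model matrix with intercept, linear, squared, and two-factor interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)] + [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
X = face_centred_ccd(k=4)                        # e.g. Br-, alkalinity, Br/DOC, NOM in coded units
beta_true = rng.normal(size=quadratic_terms(X).shape[1])
y = quadratic_terms(X) @ beta_true + 0.1 * rng.normal(size=X.shape[0])   # simulated response

beta_hat, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
print("design size:", X.shape[0], "fitted coefficients:", np.round(beta_hat, 2))
```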

Relevance: 100.00%

Abstract:

The climate in the Arctic is changing faster than anywhere else on Earth. Poorly understood feedback processes relating to Arctic clouds and aerosol–cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from measurements in situ in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007–2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were undertaken in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time, extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper ocean physics. ASCOS provides a unique interdisciplinary data set for development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean and associated physical, chemical, and biological processes and interactions. For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material, polymer gels with an origin in the ocean, inside cloud droplets suggests the possibility of primary marine organically derived cloud condensation nuclei in Arctic stratocumulus clouds. Direct observations of surface fluxes of aerosols could not, however, explain the observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains open. A lack of cloud condensation nuclei (CCN) was at times a controlling factor in low-level cloud formation, and hence for the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from late summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can be, and is being, used for validation of satellite retrievals, operational models, and reanalysis data sets.

Relevance: 100.00%

Abstract:

Background: In health-related research, it is critical not only to demonstrate the efficacy of an intervention, but also to show that this efficacy is not due to chance or confounding variables. Content: Single case experimental design is a useful quasi-experimental method for achieving these goals when participants and research funds are limited. This type of design has various advantages over group experimental designs, one of which is the capacity to focus on individual performance outcomes rather than group performance outcomes. Conclusions: This comprehensive review discusses the benefits and limitations of single case experimental design, its various design methods, and data collection and analysis for research purposes.

Relevance: 100.00%

Abstract:

In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
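
A stripped-down sketch of the central ingredient described above, not the full MCMC-based robust design algorithm: pre-computed simulations from a toy stochastic death process are used to form an ABC rejection posterior, and a candidate observation time is scored by the average precision of that posterior over prior predictive data sets. The model, prior, and tolerance rule are illustrative assumptions.

```python
# Sketch: likelihood-free design utility from an ABC rejection posterior (toy death process).
import numpy as np

rng = np.random.default_rng(2)
N0 = 50                                          # initial population size (assumed)
prior_draws = rng.gamma(2.0, 0.5, size=5000)     # prior on the per-capita death rate theta

def simulate(theta, t):
    """Survivors of a pure death process at time t: Binomial(N0, exp(-theta * t))."""
    return rng.binomial(N0, np.exp(-theta * t))

def expected_abc_precision(t, n_datasets=200, keep=100):
    """Average 1/Var of the ABC rejection posterior over prior predictive data sets."""
    sims = simulate(prior_draws, t)                     # pre-computed model simulations
    precisions = []
    for theta0 in rng.choice(prior_draws, size=n_datasets):
        y_obs = simulate(theta0, t)                     # a 'future' data set from the prior predictive
        idx = np.argsort(np.abs(sims - y_obs))[:keep]   # ABC rejection: keep the closest simulations
        precisions.append(1.0 / np.var(prior_draws[idx]))
    return np.mean(precisions)

# Compare a few candidate observation times by their expected ABC posterior precision.
for t in (0.5, 1.0, 2.0, 4.0):
    print(f"t = {t}: utility ~ {expected_abc_precision(t):.1f}")
```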

Relevance: 100.00%

Abstract:

The use of Bayesian methodologies for solving optimal experimental design problems has increased. Many of these methods have been found to be computationally intensive for design problems that require a large number of design points. A simulation-based approach is presented that can be used to solve optimal design problems in which one is interested in finding a large number of (near) optimal design points for a small number of design variables. The approach involves the use of lower-dimensional parameterisations that consist of a few design variables which generate multiple design points. Using this approach, one simply has to search over a few design variables rather than over a large number of optimal design points, thus providing substantial computational savings. The methodologies are demonstrated on four applications involving nonlinear models, including the selection of sampling times for pharmacokinetic and heat transfer studies. Several Bayesian design criteria are also compared and contrasted, as are several different lower-dimensional parameterisation schemes for generating the many design points.
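
A sketch of the core idea: instead of optimising, say, 15 individual sampling times, the whole schedule is generated from two design variables (a first sample time and a geometric spacing factor), so the search runs over two dimensions only. The parameterisation scheme and the stand-in utility are illustrative assumptions, not those used in the paper.

```python
# Sketch: a 2-variable parameterisation that generates a 15-point sampling schedule.
import numpy as np

def sampling_times(t1, ratio, n_times=15):
    """Geometric schedule t1, t1*ratio, t1*ratio^2, ... (one possible low-dimensional scheme)."""
    return t1 * ratio ** np.arange(n_times)

def pseudo_utility(times, theta=0.3):
    """Stand-in utility: log Fisher information for the decay rate of y = exp(-theta * t) + noise."""
    sens = (-times * np.exp(-theta * times)) ** 2     # squared sensitivities d/dtheta
    return np.log(np.sum(sens))

# Search over the two design variables rather than over 15 individual sampling times.
best = max(
    ((t1, r) for t1 in np.linspace(0.1, 2.0, 20) for r in np.linspace(1.1, 2.0, 20)),
    key=lambda d: pseudo_utility(sampling_times(*d)),
)
print("best (t1, ratio):", best, "schedule:", np.round(sampling_times(*best), 2))
```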

Relevance: 100.00%

Abstract:

Aim. A protocol for a new peer-led self-management programme for community-dwelling older people with diabetes in Shanghai, China. Background. The increasing prevalence of type 2 diabetes poses major public health challenges. Appropriate education programmes could help people with diabetes to achieve self-management and better health outcomes. Providing education programmes to the fast-growing number of people with diabetes presents a real challenge to the Chinese healthcare system, which is strained by personnel and funding shortages. Empirical literature and expert opinions suggest that peer education programmes are promising. Design. Quasi-experimental. Methods. This study is a non-equivalent control group design (protocol approved in January 2008). A total of 190 people, with 95 participants in each group, will be recruited from two different, but similar, communities. The programme, based on Social Cognitive Theory, will consist of basic diabetes instruction and social support and self-efficacy-enhancing group activities. Basic diabetes instruction sessions will be delivered by health professionals, whereas the social support and self-efficacy-enhancing group activities will be led by peer leaders. Outcome variables include self-efficacy, social support, self-management behaviours, depressive status, quality of life and healthcare utilization, which will be measured at baseline and at 4 and 12 weeks. Discussion. This theory-based programme tailored to Chinese patients has potential for improving diabetes self-management and subsequent health outcomes. In addition, the delivery mode, through involvement of peer leaders and existing community networks, is especially promising given the healthcare resource shortages in China.

Relevance: 100.00%

Abstract:

Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future data set drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature to rapidly obtain samples from the posterior is importance sampling, using the prior as the importance distribution. However, importance sampling will tend to break down if there is a reasonable number of experimental observations and/or the model parameter is high dimensional. In this paper we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times which produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
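
A sketch of the general Laplace-as-importance-distribution idea for a toy one-parameter decay model, not the sheep pharmacokinetic study: the posterior mode and curvature define a Gaussian approximation, which is then used as the importance distribution when estimating posterior summaries. The model, prior, and data are illustrative assumptions.

```python
# Sketch: Laplace approximation as an importance distribution for posterior summaries.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
t = np.linspace(0.5, 10, 10)                       # candidate plasma sampling times (assumed)
theta_true, sigma = 0.4, 0.05
y = np.exp(-theta_true * t) + sigma * rng.normal(size=t.size)   # simulated 'observed' data

def neg_log_post(theta):
    """Gaussian likelihood for y = exp(-theta * t) + noise, with a N(0, 1) prior on theta."""
    resid = y - np.exp(-theta * t)
    return 0.5 * np.sum(resid ** 2) / sigma ** 2 + 0.5 * theta ** 2

# Laplace approximation: Gaussian centred at the posterior mode, variance from the BFGS Hessian.
fit = optimize.minimize(lambda th: neg_log_post(th[0]), x0=[0.1], method="BFGS")
mode, var = fit.x[0], float(fit.hess_inv[0, 0])
laplace = stats.norm(loc=mode, scale=np.sqrt(var))

# Importance sampling with the Laplace approximation (instead of the prior) as the proposal.
draws = rng.normal(mode, np.sqrt(var), size=5000)
log_w = np.array([-neg_log_post(d) for d in draws]) - laplace.logpdf(draws)
w = np.exp(log_w - log_w.max())
w /= w.sum()
post_mean = np.sum(w * draws)
post_var = np.sum(w * (draws - post_mean) ** 2)
print(f"posterior mean ~ {post_mean:.3f}, sd ~ {np.sqrt(post_var):.3f}")
```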

Relevance: 100.00%

Abstract:

In this paper, we present fully Bayesian experimental designs for nonlinear mixed effects models, in which we develop simulation-based optimal design methods to search over both continuous and discrete design spaces. Although Bayesian inference has commonly been performed on nonlinear mixed effects models, there is a lack of research into performing Bayesian optimal design for nonlinear mixed effects models that require searches to be performed over several design variables. This is likely due to the fact that it is much more computationally intensive to perform optimal experimental design for nonlinear mixed effects models than it is to perform inference in the Bayesian framework. In this paper, the design problem is to determine the optimal number of subjects and samples per subject, as well as the (near) optimal urine sampling times for a population pharmacokinetic study in horses, so that the population pharmacokinetic parameters can be precisely estimated, subject to cost constraints. The optimal sampling strategies, in terms of the number of subjects and the number of samples per subject, were found to be substantially different between the examples considered in this work, which highlights the fact that the designs are rather problem-dependent and require optimisation using the methods presented in this paper.
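
A sketch of the discrete part of such a search: enumerate (number of subjects, samples per subject) combinations that satisfy a cost constraint and score each with a crude simulated utility for a simple exponential model with between-subject variability. The cost model, sampling schedule, and utility are illustrative assumptions, not the horse population pharmacokinetic study or its design method.

```python
# Sketch: choosing the number of subjects and samples per subject under a cost constraint.
import numpy as np

rng = np.random.default_rng(4)
cost_subject, cost_sample, budget = 10.0, 1.0, 200.0   # hypothetical cost model

def pseudo_utility(n_subjects, n_samples, n_prior_draws=200):
    """Crude stand-in utility: mean log of summed squared sensitivities for
    y_ij = exp(-theta_i * t_j) + noise, with subject rates theta_i drawn from a log-normal."""
    times = np.linspace(1.0, 24.0, n_samples)          # evenly spaced sampling times (assumed)
    thetas = np.exp(rng.normal(np.log(0.2), 0.3, size=(n_prior_draws, n_subjects)))
    sens = -times[None, None, :] * thetas[..., None] * np.exp(-thetas[..., None] * times[None, None, :])
    info = np.sum(sens ** 2, axis=(1, 2)) / 0.05 ** 2  # aggregated over subjects and times
    return float(np.mean(np.log(info)))

# Enumerate the discrete design variables that satisfy the budget, then score each candidate.
candidates = [(ns, nk) for ns in range(2, 21) for nk in range(2, 11)
              if ns * cost_subject + ns * nk * cost_sample <= budget]
best = max(candidates, key=lambda d: pseudo_utility(*d))
print("best (subjects, samples per subject):", best)
```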

Relevance: 100.00%

Abstract:

This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, and the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
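
A sketch of the mapping ingredient of indirect inference for a toy death process, not the Müller et al. (2004) sampler or the paper's full algorithm: simulate from the generative model over a grid of parameter values, fit a tractable auxiliary model (here a Gaussian) to each batch of simulations, and regress the auxiliary estimates on the generative parameter to obtain the map used to form an II posterior. The models, grid, prior, and polynomial map are illustrative assumptions.

```python
# Sketch: estimating the generative-to-auxiliary parameter map used in indirect inference.
import numpy as np

rng = np.random.default_rng(5)
N0, t_obs = 100, 1.0                     # death process: survivors ~ Binomial(N0, exp(-theta * t))

def simulate(theta, n_reps=200):
    return rng.binomial(N0, np.exp(-theta * t_obs), size=n_reps)

# 1. Simulate from the generative model over a grid of parameter values.
theta_grid = np.linspace(0.05, 2.0, 40)
aux_mean = np.array([simulate(th).mean() for th in theta_grid])   # auxiliary model: Gaussian mean
aux_sd = np.array([simulate(th).std() for th in theta_grid])      # and standard deviation

# 2. Fit a smooth map from the generative parameter to the auxiliary parameters.
mean_map = np.poly1d(np.polyfit(theta_grid, aux_mean, deg=4))
sd_map = np.poly1d(np.polyfit(theta_grid, aux_sd, deg=4))

# 3. Use the map to form an indirect-inference posterior for an 'observed' data set.
y_obs = simulate(0.7, n_reps=20)
theta_fine = np.linspace(0.05, 2.0, 400)
log_prior = -theta_fine                                           # Exponential(1) prior (assumed)
log_aux_lik = np.array([
    np.sum(-0.5 * ((y_obs - mean_map(th)) / sd_map(th)) ** 2 - np.log(sd_map(th)))
    for th in theta_fine
])
log_post = log_prior + log_aux_lik
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, theta_fine)
print("II posterior mean:", round(np.trapz(theta_fine * post, theta_fine), 3))
```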

Relevance: 100.00%

Abstract:

This thesis advances Bayesian experimental design by developing novel methodologies and extensions to existing algorithms. Through these advancements, it provides solutions to several important and complex experimental design problems, many of which have applications in biology and medicine. The thesis consists of a series of published and submitted papers. In the first paper, we provide a comprehensive literature review on Bayesian design. In the second paper, we discuss methods which may be used to solve design problems in which one is interested in finding a large number of (near) optimal design points. The third paper presents methods for finding fully Bayesian experimental designs for nonlinear mixed effects models, and the fourth paper investigates methods to rapidly approximate the posterior distribution for use in Bayesian utility functions.

Relevance: 100.00%

Abstract:

Big datasets are endemic, but they are often notoriously difficult to analyse because of their size, heterogeneity, history and quality. The purpose of this paper is to open a discourse on the use of modern experimental design methods to analyse Big Data in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has wide generality and advantageous inferential and computational properties. In particular, the principled experimental design approach is shown to provide a flexible framework for analysis that, for certain classes of objectives and utility functions, delivers answers nearly equivalent to analyses of the full dataset, under a controlled error rate. It can also provide a formalised method for iterative parameter estimation, model checking, identification of data gaps and evaluation of data quality. Finally, it has the potential to add value to other Big Data sampling algorithms, in particular divide-and-conquer strategies, by determining efficient sub-samples.
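
One way a design perspective can pick an informative sub-sample is sketched below: greedily select rows of a large regression design matrix to increase a D-optimality-style criterion (log-determinant of the information matrix), then fit on the sub-sample only. The dataset, greedy rule, and sizes are illustrative assumptions, not the paper's algorithms.

```python
# Sketch: greedy D-optimal sub-sampling of a big dataset for regression.
import numpy as np

rng = np.random.default_rng(6)
n_big, k = 100_000, 5
X = rng.normal(size=(n_big, k))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(size=n_big)

def greedy_d_optimal(X, n_sub=100, n_candidates=1000):
    """Greedily add the candidate row that most increases log det(X_s^T X_s)."""
    cand = list(rng.choice(X.shape[0], size=n_candidates, replace=False))
    chosen = [cand.pop() for _ in range(X.shape[1] + 1)]   # seed with a few rows
    M = X[chosen].T @ X[chosen]
    while len(chosen) < n_sub:
        gains = [np.linalg.slogdet(M + np.outer(X[i], X[i]))[1] for i in cand]
        best = cand.pop(int(np.argmax(gains)))
        chosen.append(best)
        M += np.outer(X[best], X[best])
    return np.array(chosen)

idx = greedy_d_optimal(X)
beta_sub, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
print("sub-sample estimate:", np.round(beta_sub, 2))
print("full-data estimate: ", np.round(beta_full, 2))
```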

Relevance: 100.00%

Abstract:

The degradation efficiencies and behaviors of caffeic acid (CaA), p-coumaric acid (pCoA) and ferulic acid (FeA) in aqueous sucrose solutions containing a mixture of these hydroxycinnamic acids (HCAs) were studied using the Fenton oxidation process. Central composite design and multi-response surface methodology were used to evaluate and optimize the interactive effects of the process parameters. Four quadratic polynomial models were developed, one for the degradation of each individual acid in the mixture and one for the total HCAs degraded. Sucrose was the most influential parameter, significantly affecting the total amount of HCA degraded. Under the conditions studied there was <0.01% loss of sucrose in all reactions. At the optimal process parameter values, degradation of a 200 mg/L HCA mixture was 77% in water (pH 4.73, 25.15 °C) and 57% in sucrose solution (13 mass%, pH 5.39, 35.98 °C). Regression analysis showed good agreement between the experimental results and the predicted values. The degradation behavior of CaA differed from those of pCoA and FeA, with further CaA degradation observed with increasing sucrose concentration and decreasing solution pH. The differences (established using UV/Vis and ATR-FTIR spectroscopy) arose because, unlike the other acids, CaA formed a complex with Fe(III), or with Fe(III) hydrogen-bonded to sucrose, and coprecipitated with lepidocrocite, an iron oxyhydroxide.
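
A sketch of the multi-response step of such an analysis: given quadratic response-surface models for several responses (here, degradation of each acid), combine them with a simple desirability function and search the coded factor space for the settings that maximise the overall desirability. The model coefficients, factor names, and desirability form are illustrative assumptions, not the study's fitted models.

```python
# Sketch: combining several fitted response surfaces with a desirability function.
import numpy as np
from scipy import optimize

# Hypothetical quadratic models in coded factors (x1 = H2O2 dose, x2 = Fe(II) dose, x3 = sucrose):
def degradation_CaA(x):  return 70 + 8*x[0] + 5*x[1] - 6*x[2] - 4*x[0]**2 - 3*x[1]**2 + 2*x[0]*x[1]
def degradation_pCoA(x): return 60 + 6*x[0] + 7*x[1] - 9*x[2] - 5*x[0]**2 - 2*x[1]**2
def degradation_FeA(x):  return 65 + 7*x[0] + 6*x[1] - 8*x[2] - 4*x[0]**2 - 3*x[1]**2

def desirability(y, lo=0.0, hi=100.0):
    """Linear 'larger is better' desirability, clipped to [0, 1]."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

def overall_desirability(x):
    d = [desirability(f(x)) for f in (degradation_CaA, degradation_pCoA, degradation_FeA)]
    return float(np.prod(d)) ** (1.0 / len(d))      # geometric mean of individual desirabilities

# Maximise overall desirability over the coded design region [-1, 1]^3.
res = optimize.minimize(lambda x: -overall_desirability(x), x0=np.zeros(3),
                        bounds=[(-1, 1)] * 3, method="L-BFGS-B")
print("optimal coded settings:", np.round(res.x, 2),
      "overall desirability:", round(overall_desirability(res.x), 3))
```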

Relevance: 100.00%

Abstract:

Purpose: Progression to the castration-resistant state is the incurable and lethal end stage of prostate cancer, and there is strong evidence that androgen receptor (AR) still plays a central role in this process. We hypothesize that knocking down AR will have a major effect on inhibiting growth of castration-resistant tumors. Experimental Design: Castration-resistant C4-2 human prostate cancer cells stably expressing a tetracycline-inducible AR-targeted short hairpin RNA (shRNA) were generated to directly test the effects of AR knockdown in C4-2 human prostate cancer cells and tumors. Results: In vitro expression of AR shRNA resulted in decreased levels of AR mRNA and protein, decreased expression of prostate-specific antigen (PSA), reduced activation of the PSA-luciferase reporter, and growth inhibition of C4-2 cells. Gene microarray analyses revealed that AR knockdown under hormone-deprived conditions resulted in activation of genes involved in apoptosis, cell cycle regulation, protein synthesis, and tumorigenesis. To ensure that tumors were truly castration-resistant in vivo, inducible AR shRNA-expressing C4-2 tumors were grown in castrated mice to an average volume of 450 mm3. In all of the animals, serum PSA decreased, and in 50% of them, there was complete tumor regression and disappearance of serum PSA. Conclusions: Whereas castration is ineffective in castration-resistant prostate tumors, knockdown of AR can decrease serum PSA, inhibit tumor growth, and frequently cause tumor regression. This study is the first direct evidence that knockdown of AR is a viable therapeutic strategy for treatment of prostate tumors that have already progressed to the castration-resistant state.