998 results for latin hypercube


Relevance: 100.00%

Abstract:

In this paper we have used simulations to make a conjecture about the coverage of a t-dimensional subspace of a d-dimensional parameter space of size n when performing k trials of Latin Hypercube sampling. This takes the form P(k,n,d,t) = 1 - e^(-k/n^(t-1)). We suggest that this coverage formula is independent of d and this allows us to make connections between building Populations of Models and Experimental Designs. We also show that Orthogonal sampling is superior to Latin Hypercube sampling in terms of allowing a more uniform coverage of the t-dimensional subspace at the sub-block size level. These ideas have particular relevance when attempting to perform uncertainty quantification and sensitivity analyses.
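
As a rough illustration of the conjecture (not the authors' code), the sketch below reads a "trial" as one Latin Hypercube design of n points, pools k independent designs, projects them onto the first t coordinates, and compares the observed cell coverage with 1 - e^(-k/n^(t-1)). It assumes scipy's qmc.LatinHypercube; all parameter values and names are illustrative.

```python
# Minimal sketch: estimate coverage of a t-dimensional projection of a
# d-dimensional parameter space discretised into n bins per axis, after
# k Latin Hypercube designs of n points each.
import numpy as np
from scipy.stats import qmc  # scipy >= 1.7

def lhs_coverage(k, n, d, t, repeats=50, seed=0):
    """Fraction of the n**t cells of the first t axes hit by k pooled
    LHS designs of n points, averaged over repeated experiments."""
    rng = np.random.default_rng(seed)
    fractions = []
    for _ in range(repeats):
        covered = set()
        for _ in range(k):
            pts = qmc.LatinHypercube(d=d, seed=rng).random(n)   # one LHS design
            cells = np.floor(pts[:, :t] * n).astype(int)        # project, then bin
            covered.update(map(tuple, cells))
        fractions.append(len(covered) / n**t)
    return float(np.mean(fractions))

k, n, d, t = 20, 10, 5, 2
print("simulated coverage  :", lhs_coverage(k, n, d, t))
print("conjectured coverage:", 1 - np.exp(-k / n**(t - 1)))
```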

Relevance: 100.00%

Abstract:

Although uncertainties in material properties have been addressed in the design of flexible pavements, most current modeling techniques assume that pavement layers are homogeneous. The paper addresses the influence of the spatial variability of the resilient moduli of pavement layers by evaluating the effect of the variance and correlation length on the pavement responses to loading. The spatially varying log-normal random field was integrated with the finite-difference method through an exponential autocorrelation function. Variation in the correlation length was found to have a marginal effect on the mean values of the critical strains and a noticeable effect on their standard deviation, which decreases as the correlation length decreases. This reduction in variance arises from spatial averaging over the softer and stiffer zones generated by the spatial variability. The increase in the mean value of the critical strains with decreasing correlation length, although minor, shows that pavement performance is adversely affected by the presence of spatially varying layers. The study also confirmed that the higher the variability in the pavement layer moduli, introduced through a higher coefficient of variation (COV), the higher the variability in the pavement response. The study concludes that ignoring spatial variability by modeling as homogeneous pavement layers that in fact have very short correlation lengths can result in underestimation of the critical strains and thus an inaccurate assessment of pavement performance. (C) 2014 American Society of Civil Engineers.
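
The random-field model used above can be illustrated with a small sketch (not the paper's code): a one-dimensional log-normal field with an exponential autocorrelation function, generated by Cholesky factorisation of the correlation matrix. The grid, mean modulus, COV and correlation length are illustrative values.

```python
# Minimal sketch: 1-D log-normal random field of a resilient modulus with an
# exponential autocorrelation function, via Cholesky factorisation.
import numpy as np

def lognormal_field(x, mean, cov, corr_len, rng):
    """Sample a log-normal field with target mean, coefficient of variation
    `cov` and exponential correlation exp(-|dx| / corr_len) on points x."""
    sigma_ln = np.sqrt(np.log(1.0 + cov**2))        # parameters of the
    mu_ln = np.log(mean) - 0.5 * sigma_ln**2        # underlying normal field
    dx = np.abs(x[:, None] - x[None, :])
    corr = np.exp(-dx / corr_len)                   # exponential autocorrelation
    L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(x)))  # jitter for stability
    z = L @ rng.standard_normal(len(x))             # correlated N(0, 1) field
    return np.exp(mu_ln + sigma_ln * z)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 200)                     # illustrative 1-D coordinate (m)
field = lognormal_field(x, mean=200.0, cov=0.3, corr_len=1.0, rng=rng)
print(field.mean(), field.std() / field.mean())     # roughly 200 and 0.3
```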

Relevance: 100.00%

Abstract:

The paper proposes Latin hypercube sampling combined with stratified sampling, a variance reduction technique, to calculate fracture probability accurately. With this compound sampling scheme the number of simulations required is relatively small and the calculation error remains satisfactory.
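
For reference, here is a minimal stand-alone Latin hypercube sampler (illustrative, not the paper's implementation): each of the d dimensions is divided into n equal strata, one point is drawn uniformly in each stratum, and the strata are randomly permuted per dimension.

```python
# Minimal sketch of Latin hypercube sampling on the unit hypercube [0, 1)^d.
import numpy as np

def latin_hypercube(n, d, rng=None):
    """n samples in d dimensions; each dimension is split into n equal strata
    and every stratum is hit exactly once."""
    rng = np.random.default_rng(rng)
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)                   # which stratum each sample uses
        samples[:, j] = (perm + rng.random(n)) / n  # uniform draw inside the stratum
    return samples

pts = latin_hypercube(20, 3, rng=0)
print(pts.min(), pts.max())                         # all points lie in [0, 1)
```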

Relevance: 60.00%

Abstract:

This paper presents a method of spatial sampling based on stratification by Local Moran's I_i calculated using auxiliary information. The sampling technique is compared with other design-based approaches, including simple random sampling, systematic sampling on a regular grid, conditional Latin Hypercube sampling and stratified sampling based on auxiliary information, and is illustrated using two different spatial data sets. Each of the samples for the two data sets is interpolated using regression kriging to produce a geostatistical map of its respective area. The proposed technique is shown to be competitive in reproducing specific areas of interest with high accuracy.
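
A minimal sketch of the stratification variable (not the paper's code): Local Moran's I_i computed for each site from a spatial weights matrix, here built from inverse distances and row-standardised. The weighting scheme and the toy covariate are illustrative choices.

```python
# Minimal sketch: Local Moran's I_i for each site, given coordinates and an
# auxiliary covariate, using row-standardised inverse-distance weights.
import numpy as np

def local_morans_i(coords, values, eps=1e-12):
    x = np.asarray(values, dtype=float)
    z = x - x.mean()
    m2 = (z**2).sum() / len(z)
    # Inverse-distance spatial weights with zero diagonal, row-standardised
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.where(d > 0, 1.0 / (d + eps), 0.0)
    w /= w.sum(axis=1, keepdims=True)
    return (z / m2) * (w @ z)        # I_i; large positive values mark local clusters

rng = np.random.default_rng(0)
coords = rng.random((50, 2))
values = coords[:, 0] + 0.1 * rng.standard_normal(50)
print(local_morans_i(coords, values)[:5])
```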

Relevance: 60.00%

Abstract:

Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating the error of prediction to the sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set so as to maximise the determinant of the variance-covariance matrix of the predictions.
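
A hedged sketch of the simplest augmentation rule described above: fit a Gaussian process to the existing runs and add the candidate point with the largest prediction variance. It uses scikit-learn's GaussianProcessRegressor and an illustrative 1-D test function; the paper's fire model and kriging details are not reproduced.

```python
# Minimal sketch: augment a computer experiment one run at a time by picking
# the candidate with the maximum Gaussian-process prediction variance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(x):                      # stand-in for an expensive computer code
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X = rng.random((5, 1))                 # initial design
y = simulator(X).ravel()
candidates = np.linspace(0, 1, 201).reshape(-1, 1)

for _ in range(10):                    # add 10 extra runs
    gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)].reshape(1, -1)   # most uncertain point
    X = np.vstack([X, x_new])
    y = np.append(y, simulator(x_new).ravel())

print(len(X), "runs after augmentation")
```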

Relevance: 60.00%

Abstract:

Deterministic computer simulations of physical experiments are now common techniques in science and engineering, since physical experiments are often too time consuming, expensive or impossible to conduct. The use of complex computer models, or codes, in place of physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, the questions of how many runs a computer experiment should contain and how it should be augmented are studied, and attention is given to the case where the response is a function over time.

Relevance: 60.00%

Abstract:

In this paper we provide estimates for the coverage of parameter space when using Latin Hypercube Sampling, which forms the basis of building so-called populations of models. The estimates are obtained using combinatorial counting arguments to determine how many trials, k, are needed in order to obtain specified parameter space coverage for a given value of the discretisation size n. In the case of two dimensions, we show that if the ratio (φ = k/n) of trials to discretisation size is greater than 1, then as n becomes moderately large the fractional coverage behaves as 1 - e^(-φ). We compare these estimates with simulation results obtained from an implementation of Latin Hypercube Sampling using MATLAB.

Relevance: 60.00%

Abstract:

Between-subject and within-subject variability is ubiquitous in biology and physiology, and understanding and dealing with it is one of the biggest challenges in medicine. At the same time, it is difficult to investigate this variability by experiments alone. A recent modelling and simulation approach, known as population of models (POM), allows this exploration to take place by building a mathematical model with multiple parameter sets calibrated against experimental data. However, finding such sets within the high-dimensional parameter space of complex electrophysiological models is computationally challenging. By placing the POM approach within a statistical framework, we develop a novel and efficient algorithm based on sequential Monte Carlo (SMC). We compare the SMC approach with Latin hypercube sampling (LHS), a method commonly adopted in the literature for obtaining the POM, in terms of efficiency and output variability in the presence of a drug block, through an in-depth investigation of the Beeler-Reuter cardiac electrophysiological model. We show improved efficiency via SMC, and that it produces responses similar to LHS when making out-of-sample predictions in the presence of a simulated drug block.
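
To make the LHS route to a population of models concrete, here is a hedged sketch (not the paper's SMC algorithm): parameter sets are drawn by Latin hypercube sampling over prior ranges and retained only if the model outputs fall inside experimental ranges. The toy "model", prior ranges and calibration ranges are purely illustrative.

```python
# Minimal sketch: build a population of models (POM) by Latin hypercube
# sampling of parameters and rejection against experimental output ranges.
import numpy as np
from scipy.stats import qmc

def model(theta):
    """Toy stand-in for an electrophysiology model: two summary outputs."""
    g1, g2, g3 = theta
    return np.array([g1 * g2, g1 + 0.5 * g3])

lower = np.array([0.5, 0.5, 0.5])         # prior parameter ranges (illustrative)
upper = np.array([2.0, 2.0, 2.0])
out_lo = np.array([0.8, 1.2])             # "experimental" output ranges (illustrative)
out_hi = np.array([2.0, 2.5])

samples = qmc.scale(qmc.LatinHypercube(d=3, seed=0).random(5000), lower, upper)
outputs = np.array([model(th) for th in samples])
accepted = samples[np.all((outputs >= out_lo) & (outputs <= out_hi), axis=1)]
print(f"population size: {len(accepted)} of {len(samples)} LHS samples")
```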

Relevance: 60.00%

Abstract:

Large-scale integration of solar photovoltaic (PV) generation in distribution networks has resulted in over-voltage problems. Several control techniques have been developed to address the over-voltage problem using Deterministic Load Flow (DLF). However, the intermittent characteristics of PV generation require Probabilistic Load Flow (PLF) to capture the variability that DLF ignores. Traditional PLF techniques are not well suited to distribution systems and suffer from several drawbacks, such as computational burden (Monte Carlo simulation, conventional convolution), accuracy that degrades with system complexity (point estimation method), the need for linearization (multi-linear simulation) and convergence problems (Gram-Charlier expansion, Cornish-Fisher expansion). In this research, Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD) is used to quantify over-voltage issues, with and without a voltage control algorithm, in distribution networks with active generation. The LHS technique is verified on a test network and on a real system from an Australian distribution network service provider. The accuracy and computational burden of the simulated results are also compared with Monte Carlo simulations.
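
A hedged sketch of the LHS-CD idea as it is commonly implemented (not the paper's code): draw Latin hypercube samples, map them to standard normals, impose a target correlation with the Cholesky factor of the correlation matrix, and map to the desired marginals. The correlation matrix and marginal choices below are illustrative.

```python
# Minimal sketch: Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD)
# to draw correlated PV-output and load samples for a probabilistic load flow.
import numpy as np
from scipy.stats import qmc, norm

n, d = 1000, 3
target_corr = np.array([[1.0, 0.8, 0.2],      # illustrative correlation between
                        [0.8, 1.0, 0.3],      # two PV sites and one load
                        [0.2, 0.3, 1.0]])

u = qmc.LatinHypercube(d=d, seed=0).random(n)    # stratified uniforms in (0, 1)
z = norm.ppf(u)                                  # independent standard normals
z_corr = z @ np.linalg.cholesky(target_corr).T   # impose target correlation
# Map to illustrative marginals via the probability integral transform
pv1 = norm.cdf(z_corr[:, 0]) * 5.0               # uniform PV output on [0, 5] kW
pv2 = norm.cdf(z_corr[:, 1]) * 5.0
load = 3.0 + 0.5 * z_corr[:, 2]                  # normally distributed load (kW)

print(np.corrcoef(z_corr, rowvar=False).round(2))
```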

Relevance: 60.00%

Abstract:

Nonlinear vibration analysis is performed using a C0 assumed-strain interpolated finite element plate model based on Reddy's third-order theory. An earlier model is modified to include the effect of transverse shear variation along the plate thickness and von Kármán nonlinear strain terms. Monte Carlo simulation with Latin Hypercube Sampling is used to obtain the variance of the linear and nonlinear natural frequencies of the plate due to randomness in its material properties. Numerical results are obtained for composite plates with different aspect ratios, stacking sequences and oscillation amplitude ratios, and are validated against the available literature. It is found that the nonlinear frequencies show an increasingly non-Gaussian probability density function with increasing amplitude of vibration and exhibit dual peaks at high amplitude ratios. This chaotic nature of the dispersion of the nonlinear eigenvalues is also reported.

Relevance: 60.00%

Abstract:

Polynomial chaos expansion (PCE) with Latin hypercube sampling (LHS) is employed to calculate the vibration frequencies of an inviscid, incompressible fluid partially filling a rectangular tank, with and without a baffle. Vibration frequencies of the coupled system are described through their projections on the PCE, which uses orthogonal basis functions. The PCE coefficients are evaluated using LHS. Convergence of the coefficient of variation is used to select the order of the orthogonal polynomial basis employed in the PCE. It is observed that the dispersion in the eigenvalues is larger in the case of a rectangular tank with a baffle. The accuracy of the PCE method is verified against standard Monte Carlo simulation (MCS) results, and the PCE method is found to be more efficient.
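
As a generic illustration of non-intrusive PCE with LHS (not the paper's fluid-structure model), the sketch below expands a scalar response of one standard normal variable in probabilists' Hermite polynomials and estimates the coefficients by least squares on LHS points. The toy response, order and sample size are illustrative.

```python
# Minimal sketch: non-intrusive polynomial chaos expansion (PCE) of a scalar
# response of one standard normal input, with coefficients fitted by least
# squares on Latin hypercube samples.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander   # probabilists' Hermite
from scipy.stats import qmc, norm

def response(xi):                     # toy stand-in for a frequency solver
    return 10.0 + 2.0 * xi + 0.5 * xi**2

order, n = 4, 200
xi = norm.ppf(qmc.LatinHypercube(d=1, seed=0).random(n)).ravel()   # LHS -> N(0, 1)
Psi = hermevander(xi, order)                      # basis matrix [He_0 ... He_order]
coeffs, *_ = np.linalg.lstsq(Psi, response(xi), rcond=None)

# Mean and variance follow from orthogonality: E[He_k(xi)^2] = k!
mean = coeffs[0]
var = sum(coeffs[k]**2 * factorial(k) for k in range(1, order + 1))
print("PCE mean/std:", mean, np.sqrt(var))
print("coefficient of variation:", np.sqrt(var) / mean)
```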

Relevance: 60.00%

Abstract:

A robust aeroelastic optimization is performed to minimize helicopter vibration with uncertainties in the design variables. Polynomial response surfaces and space-filling experimental designs are used to generate a surrogate model of the aeroelastic analysis code. Aeroelastic simulations are performed at sample inputs generated by Latin hypercube sampling. Response values which do not satisfy the frequency constraints are eliminated from the data used for model fitting. This step increased the accuracy of the response surface models in the feasible design space. It is found that the response surface models are able to capture the robust optimal regions of the design space. The optimal designs show a reduction of 10 percent in the objective function, comprising six vibratory hub loads, and a 1.5 to 80 percent reduction in the individual vibratory forces and moments. This study demonstrates that second-order response surface models with space-filling designs can be a favorable choice for computationally intensive robust aeroelastic optimization.
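
A hedged sketch of the surrogate-modelling step (not the authors' aeroelastic code): a second-order polynomial response surface is fitted to responses evaluated at Latin hypercube sample inputs, with an illustrative analytic test function standing in for the aeroelastic analysis.

```python
# Minimal sketch: fit a second-order (quadratic) response surface to outputs
# evaluated at Latin hypercube sample points.
import numpy as np
from scipy.stats import qmc
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

def analysis(x):           # illustrative stand-in for the aeroelastic analysis
    return 1.0 + x[:, 0]**2 + 0.5 * x[:, 0] * x[:, 1] - x[:, 1]

X = qmc.LatinHypercube(d=2, seed=0).random(60)       # sample inputs in [0, 1)^2
y = analysis(X)

surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(X, y)                                  # second-order response surface

X_test = qmc.LatinHypercube(d=2, seed=1).random(20)
err = surrogate.predict(X_test) - analysis(X_test)
print("max abs error of quadratic surrogate:", np.abs(err).max())
```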

Relevance: 60.00%

Abstract:

Designing and optimizing high performance microprocessors is an increasingly difficult task due to the size and complexity of the processor design space, high cost of detailed simulation and several constraints that a processor design must satisfy. In this paper, we propose the use of empirical non-linear modeling techniques to assist processor architects in making design decisions and resolving complex trade-offs. We propose a procedure for building accurate non-linear models that consists of the following steps: (i) selection of a small set of representative design points spread across processor design space using latin hypercube sampling, (ii) obtaining performance measures at the selected design points using detailed simulation, (iii) building non-linear models for performance using the function approximation capabilities of radial basis function networks, and (iv) validating the models using an independently and randomly generated set of design points. We evaluate our model building procedure by constructing non-linear performance models for programs from the SPEC CPU2000 benchmark suite with a microarchitectural design space that consists of 9 key parameters. Our results show that the models, built using a relatively small number of simulations, achieve high prediction accuracy (only 2.8% error in CPI estimates on average) across a large processor design space. Our models can potentially replace detailed simulation for common tasks such as the analysis of key microarchitectural trends or searches for optimal processor design points.
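
A hedged sketch of steps (i), (iii) and (iv) of the procedure (not the paper's simulator or benchmarks): design points are drawn by Latin hypercube sampling, a radial-basis-function model is fitted to "performance" values, and prediction error is checked on independently generated points. scipy's RBFInterpolator stands in for the radial basis function network, and the CPI function is a toy.

```python
# Minimal sketch: LHS design points + radial-basis-function model of a
# performance metric, validated on independent random points.
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator     # scipy >= 1.7

def cpi(x):
    """Toy stand-in for cycles-per-instruction from detailed simulation."""
    return 1.0 + 0.6 * x[:, 0]**2 + 0.3 * np.sin(3 * x[:, 1]) + 0.1 * x[:, 2]

d = 9                                                    # e.g. 9 design parameters
X_train = qmc.LatinHypercube(d=d, seed=0).random(200)    # (i) representative design points
y_train = cpi(X_train)                                   # (ii) "simulation" results
model = RBFInterpolator(X_train, y_train, kernel="thin_plate_spline")   # (iii) fit

rng = np.random.default_rng(1)
X_test = rng.random((100, d))                            # (iv) independent validation set
rel_err = np.abs(model(X_test) - cpi(X_test)) / cpi(X_test)
print(f"mean relative prediction error: {100 * rel_err.mean():.2f}%")
```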

Relevance: 60.00%

Abstract:

Polynomial chaos expansion with Latin hypercube sampling is used to study the effect of material uncertainty on the vibration control of a smart composite plate with piezoelectric sensors/actuators. Composite material properties and piezoelectric coefficients are treated as independent, normally distributed random variables. Numerical results show substantial variation in the structural dynamic response of the active vibration control system due to material uncertainty. This change in response can be compensated for by actively tuning the feedback control system. Numerical results also show variation in the dispersion of the dynamic characteristics and control parameters with respect to ply angle and stacking sequence.

Relevance: 60.00%

Abstract:

We propose a Physical layer Network Coding (PNC) scheme for the K-user wireless Multiple Access Relay Channel, in which K source nodes want to transmit messages to a destination node D with the help of a relay node R. The proposed scheme involves (i) Phase 1 during which the source nodes alone transmit and (ii) Phase 2 during which the source nodes and the relay node transmit. At the end of Phase 1, the relay node decodes the messages of the source nodes and during Phase 2 transmits a many-to-one function of the decoded messages. To counter the error propagation from the relay node, we propose a novel decoder which takes into account the possibility of error events at R. It is shown that if certain parameters are chosen properly and if the network coding map used at R forms a Latin Hypercube, the proposed decoder offers the maximum diversity order of two. Also, it is shown that for a proper choice of the parameters, the proposed decoder admits fast decoding, with the same decoding complexity order as that of the reference scheme based on Complex Field Network Coding (CFNC). Simulation results indicate that the proposed PNC scheme offers a large gain over the CFNC scheme.
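
To illustrate the Latin Hypercube condition on the relay map (a sketch, not the paper's construction): with symbols from Z_n, the modular-sum map f(x_1, ..., x_K) = (x_1 + ... + x_K) mod n is many-to-one and forms a Latin hypercube, i.e. fixing all arguments but one makes it a bijection in the remaining argument. The check below verifies this exhaustively for small illustrative values of n and K.

```python
# Minimal sketch: verify that a candidate relay map forms a Latin hypercube,
# i.e. it is a bijection in each argument when all other arguments are fixed.
from itertools import product

def is_latin_hypercube(f, n, K):
    for axis in range(K):
        for rest in product(range(n), repeat=K - 1):
            args = list(rest[:axis]) + [None] + list(rest[axis:])
            line = set()
            for v in range(n):
                args[axis] = v
                line.add(f(tuple(args)))
            if len(line) != n:            # the free argument must sweep all n symbols
                return False
    return True

n, K = 4, 3                               # illustrative: 4-ary symbols, 3 source nodes
mod_sum = lambda x: sum(x) % n            # candidate network coding map at the relay
print(is_latin_hypercube(mod_sum, n, K))  # True: the modular sum is a Latin hypercube
```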