871 results for nonparametric rationality tests


Relevance: 20.00%

Abstract:

This paper investigates the seismic response of geosynthetic-reinforced retaining walls through shaking table tests on models of modular block and rigid-faced reinforced retaining walls. Reduced-scale models of retaining walls reinforced with geogrid layers were constructed in a laminar box mounted on a uniaxial shaking table and subjected to various levels of sinusoidal base shaking. The models were instrumented with ultrasonic displacement sensors, earth pressure sensors and accelerometers. The effects of backfill density, number of reinforcement layers and reinforcement type on the performance of rigid-faced and modular block walls were studied through different series of model tests. The performance of the walls was assessed and compared in terms of face deformations, crest settlement and acceleration amplification at different elevations. Modular block walls performed better than the rigid-faced walls for the same level of base shaking because of the additional support derived from stacking the blocks with an offset. The type and quantity of reinforcement have a significant effect on the seismic performance of both types of walls. Displacements are more sensitive to the relative density of the backfill and decrease with increasing relative density, the effect being more pronounced for unreinforced walls than for reinforced ones. Acceleration amplifications are not affected by the wall facing or the inclusion of reinforcement. (C) 2015 Elsevier Ltd. All rights reserved.
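The acceleration amplification reported above is typically computed as the ratio of the peak acceleration recorded at a wall elevation to the peak base (table) acceleration. A minimal sketch of that calculation on hypothetical accelerometer records (the signal shapes and the 1.4 factor are illustrative, not the paper's data):

```python
import numpy as np

def amplification_factor(base_accel, wall_accel):
    # Ratio of peak acceleration at a wall elevation to peak base
    # acceleration -- a common way to report shaking table results.
    return np.max(np.abs(wall_accel)) / np.max(np.abs(base_accel))

# Hypothetical records: 3 Hz sinusoidal base shaking at 0.2 g, and a
# mid-height response amplified by 40% (illustrative numbers only).
t = np.linspace(0.0, 5.0, 2000)
base = 0.2 * 9.81 * np.sin(2 * np.pi * 3.0 * t)
mid = 1.4 * base
print(round(amplification_factor(base, mid), 2))  # -> 1.4
```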

Relevance: 20.00%

Abstract:

Using six 4.5 Hz geophones, surface wave tests were performed at four different sites by freely dropping a 65 kg mass from a height of 5 m. The receivers were kept far from the source to eliminate the arrival of body waves. Three different source-to-nearest-receiver distances (S), namely 46 m, 56 m and 66 m, were chosen. Dispersion curves were drawn for all the sites. The maximum wavelength (lambda_max), the maximum depth (d_max) up to which exploration can be made, and the frequency content of the signals depend on the site stiffness and the value of S. A stiffer site yields greater values of lambda_max and d_max. For stiffer sites, an increase in S leads to an increase in lambda_max. The predominant time durations of the signals increase from stiffer to softer sites. An inverse analysis was also performed, based on the stiffness matrix approach in conjunction with the maximum vertical flexibility coefficient of the ground surface, to establish the governing mode of excitation. For Site 2, the results from the surface wave tests were found to compare reasonably well with those determined from cross-borehole seismic tests. (C) 2015 Elsevier Ltd. All rights reserved.
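Dispersion curves like those described are built by estimating the phase velocity frequency by frequency. A minimal two-station sketch using the cross-power spectrum phase between a receiver pair (synthetic single-frequency signals; the actual analysis uses six receivers and a stiffness-matrix inversion):

```python
import numpy as np

def phase_velocity(sig_a, sig_b, dx, fs, f_target):
    # Two-station phase velocity estimate at f_target: the phase of the
    # cross-power spectrum gives the phase lag accumulated over distance dx.
    n = len(sig_a)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    cross = np.fft.rfft(sig_a) * np.conj(np.fft.rfft(sig_b))
    k = np.argmin(np.abs(freqs - f_target))
    dphi = np.angle(cross[k])            # phase lag of sig_b behind sig_a
    return 2.0 * np.pi * freqs[k] * dx / dphi

# Synthetic check: a 10 Hz wave travelling at 200 m/s past receivers 5 m apart.
fs, f0, v_true, dx = 1000.0, 10.0, 200.0, 5.0
t = np.arange(0.0, 2.0, 1.0 / fs)
a = np.sin(2 * np.pi * f0 * t)
b = np.sin(2 * np.pi * f0 * (t - dx / v_true))   # arrives dx/v_true later
print(round(phase_velocity(a, b, dx, fs, f0), 1))  # -> 200.0
```

Repeating the estimate over the excited frequency band traces out the dispersion curve, from which lambda_max and d_max follow.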

Relevance: 20.00%

Abstract:

Subtle concurrency errors in multithreaded libraries that arise because of incorrect or inadequate synchronization are often difficult to pinpoint precisely using only static techniques. On the other hand, the effectiveness of dynamic race detectors depends critically on multithreaded test suites whose execution can be used to identify and trigger races. Usually, such multithreaded tests need to invoke a specific combination of methods, with the objects involved in the invocations shared appropriately, to expose a race. Without a priori knowledge of the race, constructing such tests can be challenging. In this paper, we present a lightweight and scalable technique for synthesizing precisely these kinds of tests. Given a multithreaded library and a sequential test suite, we describe a fully automated analysis that examines sequential execution traces and produces as its output a concurrent client program that drives shared objects, via library method calls, to states conducive to triggering a race. Experimental results on a variety of well-tested Java libraries yield 101 synthesized multithreaded tests in less than four minutes. Analyzing the execution of these tests using an off-the-shelf race detector reveals 187 harmful races, including several previously unreported ones.
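The synthesized tests have a characteristic shape: one shared object plus a few threads replaying method calls seen in sequential traces. A toy Python sketch of that shape (the Registry class and its check-then-act race are invented for illustration; the paper generates such clients automatically for Java libraries):

```python
import threading

class Registry:
    # A toy "library" class with a check-then-act race in register().
    def __init__(self):
        self.items = {}
    def register(self, key, value):
        if key not in self.items:      # check ...
            self.items[key] = value    # ... then act: racy without a lock

# Shape of a synthesized concurrent client: sequential traces showed two
# register() calls, so two threads drive the same shared object.
shared = Registry()
t1 = threading.Thread(target=shared.register, args=("k", 1))
t2 = threading.Thread(target=shared.register, args=("k", 2))
t1.start(); t2.start()
t1.join(); t2.join()
print(shared.items["k"] in (1, 2))  # -> True (either write may win the race)
```

A dynamic race detector run over such a client can then flag the unsynchronized accesses to the shared dictionary.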

Relevance: 20.00%

Abstract:

In this paper, we study two multi-dimensional goodness-of-fit tests for spectrum sensing in cognitive radios. The multi-dimensional scenario refers to multiple CR nodes, each with multiple antennas, recording multiple observations from multiple primary users for spectrum sensing. These tests, viz. the Interpoint Distance (ID) based test and the h-f distance based tests, are constructed from the properties of stochastic distances. The ID test is studied in detail for the single CR node case, and a possible extension to handle multiple nodes is discussed. The h-f test, on the other hand, is applicable in a multi-node setup. A robustness feature of the KL distance based test is discussed, which has connections with Middleton's class A model. Through Monte Carlo simulations, the proposed tests are shown to outperform existing techniques such as the eigenvalue ratio based test, John's test, and the sphericity test in several scenarios.
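As a rough illustration of an interpoint-distance style goodness-of-fit detector, the sketch below uses the mean pairwise distance among observation vectors as the statistic and calibrates its null distribution by Monte Carlo under a noise-only model (the statistic, noise model and constants are simplified stand-ins, not the paper's ID or h-f tests):

```python
import numpy as np

rng = np.random.default_rng(0)

def id_statistic(X):
    # Mean pairwise (interpoint) Euclidean distance among the rows of X.
    n = len(X)
    d = [np.linalg.norm(X[i] - X[j]) for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(d))

def detect(X, n_mc=200, alpha=0.05):
    # Goodness-of-fit style detection: simulate the statistic under the
    # noise-only hypothesis (unit-variance Gaussian assumed known) and
    # flag a primary user when the observed statistic falls outside the
    # central 1 - alpha band of the null distribution.
    n, p = X.shape
    null = np.sort([id_statistic(rng.standard_normal((n, p))) for _ in range(n_mc)])
    t = id_statistic(X)
    lo, hi = null[int(alpha / 2 * n_mc)], null[int((1 - alpha / 2) * n_mc) - 1]
    return not (lo <= t <= hi)

noise_only = rng.standard_normal((20, 4))
occupied = 3.0 * rng.standard_normal((20, 4))   # PU present: higher received power
print(detect(noise_only), detect(occupied))
```

Because interpoint distances are computed within the received samples, no parametric model of the primary signal is needed, which is the appeal of the goodness-of-fit formulation.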

Relevance: 20.00%

Abstract:

We propose a distributed sequential algorithm for quick detection of spectral holes in a cognitive radio setup. Two or more local nodes make decisions and inform the fusion centre (FC) over a reporting multiple access channel (MAC), which then makes the final decision. The local nodes use energy detection and the FC uses mean detection in the presence of fading, heavy-tailed electromagnetic interference (EMI) and outliers. The statistics of the primary signal, the channel gain and the EMI are not known. Different nonparametric sequential algorithms are compared in order to choose appropriate algorithms for the local nodes and the FC. A modification of a recently developed random walk test is selected for energy detection at the local nodes as well as for mean detection at the fusion centre. We show via simulations and analysis that the nonparametric distributed algorithm developed performs well in the presence of fading, EMI and outliers. The algorithm is iterative in nature, making its computation and storage requirements minimal.
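A sign-based random walk conveys the flavour of such a nonparametric sequential test: the walk drifts downward under noise alone and upward when the mean exceeds the test point, and detection is declared at a threshold crossing (the drift and threshold values here are illustrative choices, not the paper's tuned algorithm):

```python
import numpy as np

def rw_sequential_test(samples, theta, drift=0.3, threshold=12.0):
    # Nonparametric sequential test: a random walk over the signs of
    # (x - theta) with a negative drift so the walk stays down under H0.
    # Returns (decision, stopping_time); True means "mean above theta".
    s = 0.0
    for n, x in enumerate(samples, start=1):
        s += float(np.sign(x - theta)) - drift
        if s >= threshold:
            return True, n
    return False, len(samples)

rng = np.random.default_rng(1)
# Heavy-tailed EMI modelled as Student-t noise; H1 adds a positive mean shift.
h0 = rng.standard_t(df=2, size=400)
h1 = rng.standard_t(df=2, size=400) + 1.0
print(rw_sequential_test(h0, theta=0.0), rw_sequential_test(h1, theta=0.0))
```

Because only signs enter the statistic, the test needs no knowledge of the noise distribution, which is what makes it usable under unknown fading and EMI statistics, and the single running sum keeps storage minimal.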

Relevance: 20.00%

Abstract:

We present up-to-date electroweak fits of various Randall-Sundrum (RS) models. We consider the bulk RS, deformed RS, and custodial RS models. For the bulk RS case we find the lightest Kaluza-Klein (KK) mode of the gauge boson to be around 8 TeV, while for the custodial case it is around 3 TeV. The deformed model is the least fine-tuned of all and can give a good fit for KK masses < 2 TeV, depending on the choice of model parameters. We also comment on the fine-tuning in each case.

Relevance: 20.00%

Abstract:

We present the Gaussian process density sampler (GPDS), an exchangeable generative model for use in nonparametric Bayesian density estimation. Samples drawn from the GPDS are consistent with exact, independent samples from a distribution defined by a density that is a transformation of a function drawn from a Gaussian process prior. Our formulation allows us to infer an unknown density from data using Markov chain Monte Carlo, which gives samples from the posterior distribution over density functions and from the predictive distribution on data space. We describe two such MCMC methods. Both methods also allow inference of the hyperparameters of the Gaussian process.
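The GPDS generative step can be sketched on a finite grid: draw a function from a GP prior, squash it through a sigmoid, and let it modulate a simple base distribution via rejection sampling (a grid-based caricature of the exchangeable construction; the grid, kernel and uniform base are assumptions of this sketch):

```python
import numpy as np

rng = np.random.default_rng(2)

def gp_draw(xs, length=0.5):
    # Zero-mean GP draw with a squared-exponential kernel on the grid xs
    # (a finite-grid stand-in for the function-space construction).
    K = np.exp(-0.5 * ((xs[:, None] - xs[None, :]) / length) ** 2)
    return rng.multivariate_normal(np.zeros(len(xs)), K + 1e-8 * np.eye(len(xs)))

def gpds_sample(n, xs, g):
    # Rejection sampling from the density proportional to sigmoid(g(x))
    # times a uniform base on [0, 1]: propose from the base, accept with
    # probability sigmoid(g(x)).
    out = []
    while len(out) < n:
        x = rng.uniform(0.0, 1.0)
        g_x = np.interp(x, xs, g)                        # interpolated GP value
        if rng.uniform() < 1.0 / (1.0 + np.exp(-g_x)):   # accept w.p. sigmoid
            out.append(x)
    return np.array(out)

xs = np.linspace(0.0, 1.0, 50)
g = gp_draw(xs)
samples = gpds_sample(500, xs, g)
print(samples.shape)  # -> (500,)
```

Inference in the paper runs in the opposite direction: given data, MCMC explores the posterior over the latent function g and the GP hyperparameters.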

Relevance: 20.00%

Abstract:

The inhomogeneous Poisson process is a point process that has varying intensity across its domain (usually time or space). For nonparametric Bayesian modeling, the Gaussian process is a useful way to place a prior distribution on this intensity. The combination of a Poisson process and GP is known as a Gaussian Cox process, or doubly-stochastic Poisson process. Likelihood-based inference in these models requires an intractable integral over an infinite-dimensional random function. In this paper we present the first approach to Gaussian Cox processes in which it is possible to perform inference without introducing approximations or finite-dimensional proxy distributions. We call our method the Sigmoidal Gaussian Cox Process, which uses a generative model for Poisson data to enable tractable inference via Markov chain Monte Carlo. We compare our method to competing methods on synthetic data and apply it to several real-world data sets. Copyright 2009.
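The tractability exploited here rests on thinning: a homogeneous Poisson process of rate lam_max, thinned with probability sigmoid(g(t)), is exactly an inhomogeneous Poisson process with intensity lam_max * sigmoid(g(t)). A minimal sketch with a fixed g standing in for a GP draw:

```python
import numpy as np

rng = np.random.default_rng(3)

def sgcp_sample(T, lam_max, g_func):
    # Inhomogeneous Poisson process on [0, T] with intensity
    # lam_max * sigmoid(g(t)), generated by thinning a homogeneous
    # Poisson(lam_max) candidate process.
    n = rng.poisson(lam_max * T)
    cand = rng.uniform(0.0, T, size=n)                       # candidates
    keep = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-g_func(cand)))
    return np.sort(cand[keep])

# Illustrative intensity: events concentrate where g(t) is large.
events = sgcp_sample(T=10.0, lam_max=50.0, g_func=lambda t: 3.0 * np.sin(t))
print(len(events) > 0, bool(np.all(np.diff(events) >= 0.0)))  # -> True True
```

Because the sigmoid bounds the intensity by lam_max, the thinned construction needs no integral of the random intensity, which is what makes MCMC over g tractable.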


Relevance: 20.00%

Abstract:

In situ compressive tests on specially designed small samples made from brittle metallic foams were carried out in a loading device mounted inside a scanning electron microscope (SEM). Each small sample comprises only a few cells in the effective test zone (ETZ), with one major cell in the middle. In such a system one can not only obtain sequential collapse-process images of a single cell and its cell walls at high resolution, but also correlate the detailed failure behaviour of the cell walls with the stress-strain response, thereby revealing the mechanisms of energy absorption at the mesoscopic scale. Meanwhile, the stress-strain behaviour is quite different from that of bulk foams of sufficiently large dimensions, indicating a strong size effect. From the in situ observations, four failure modes at the cell-wall level were identified, and these modes account for the mesoscopic mechanisms of energy absorption. Parallel compression tests on bulk samples were also carried out, and it was found that both the fracture of a single cell and the development of fracture bands are defect-directed or weakness-directed processes. The mechanical properties of the brittle aluminum foams obtained from the present tests agree well with the size effect model for ductile cellular solids proposed by Onck et al. (C) 2008 Elsevier Ltd. All rights reserved.
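The energy absorption these tests quantify is, per unit volume, the area under the compressive stress-strain curve, W = integral of sigma d(epsilon). A sketch on an idealised brittle-foam curve (the curve values below are illustrative, not the paper's measurements):

```python
import numpy as np

def energy_absorbed(strain, stress):
    # Energy absorbed per unit volume up to the final strain: trapezoidal
    # area under the stress-strain curve, W = integral of sigma d(epsilon).
    return float(np.sum(0.5 * (stress[1:] + stress[:-1]) * np.diff(strain)))

# Idealised brittle-foam response: linear rise to a 2 MPa peak at 2% strain,
# brittle collapse to 1 MPa, then a plateau out to 50% strain.
strain = np.array([0.0, 0.02, 0.03, 0.50])
stress = np.array([0.0, 2.00, 1.00, 1.00])   # MPa
print(round(energy_absorbed(strain, stress), 3))  # -> 0.505 (MJ/m^3)
```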

Relevance: 20.00%

Abstract:

The micro anchor is a typical structure in micro/nano-electromechanical systems (MEMS/NEMS); it can be made by an anodic bonding process with a thin film of metal or alloy as an intermediate layer. At relatively low temperature and voltage, specimens with actual-size micro anchor structures were anodically bonded using Pyrex 7740 glass and patterned crystalline silicon chips coated with an aluminum thin film between 50 nm and 230 nm thick. To evaluate the bonding quality, tensile pulling tests were performed on these specimens with newly designed flexible fixtures. The experimental results show that the bonding tensile strength increases with the bonding temperature and voltage, but decreases with increasing thickness of the Al intermediate layer. This thickness effect of the intermediate layer has not previously been reported in the literature on anodic bonding. (C) 2008 Elsevier Ltd. All rights reserved.