116 results for Reliability testing


Relevance:

20.00%

Publisher:

Abstract:

This study borrows the measures developed for the operation of water resources systems as a means of characterizing droughts in a given region. It is argued that the common approach of assessing drought using a univariate measure (severity or reliability) is inadequate, as decision makers need an assessment of the other facets considered here. It is proposed that the joint distribution of reliability, resilience, and vulnerability (referred to as RRV in a reservoir operation context), assessed using soil moisture data over the study region, be used to characterize droughts. Use is made of copulas to quantify the joint distribution between these variables. As reliability and resilience vary in a nonlinear but almost deterministic way, the joint probability distribution of only resilience and vulnerability is modeled. Recognizing the negative association between the two variables, a Plackett copula is used to formulate the joint distribution. The developed drought index, referred to as the drought management index (DMI), is able to differentiate the drought proneness of a given area from that of other areas. An assessment of the sensitivity of the DMI to the length of the data segments used in evaluation indicates that relative stability is achieved if the data segments are 5 years or longer. The proposed approach is illustrated with reference to the Malaprabha River basin in India, using four adjoining Climate Prediction Center grid cells of soil moisture data that cover an area of approximately 12,000 km². © 2013 American Society of Civil Engineers.
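
As a minimal sketch of the RRV bookkeeping underlying the index, the Python snippet below computes reliability, resilience, and vulnerability for a soil-moisture series against a fixed drought threshold. The threshold, the synthetic series, and the function name are illustrative assumptions, and the Plackett copula fitting step is not shown.

```python
import numpy as np

def rrv_metrics(soil_moisture, threshold):
    """Reliability, resilience, vulnerability of a series w.r.t. a drought threshold.

    Follows the standard reservoir-operation definitions; the exact estimators
    used in the paper may differ.
    """
    sm = np.asarray(soil_moisture, dtype=float)
    failure = sm < threshold                      # unsatisfactory (drought) states

    reliability = 1.0 - failure.mean()            # fraction of satisfactory periods

    # Resilience: probability of leaving the failure state in the next step.
    transitions = np.sum(failure[:-1] & ~failure[1:])
    n_fail = failure[:-1].sum()
    resilience = transitions / n_fail if n_fail > 0 else 1.0

    # Vulnerability: mean deficit below the threshold during failure periods.
    vulnerability = (threshold - sm[failure]).mean() if failure.any() else 0.0

    return reliability, resilience, vulnerability

# Example with a synthetic monthly series (illustrative numbers only).
rng = np.random.default_rng(0)
series = 0.25 + 0.05 * rng.standard_normal(600)   # 50 years of monthly soil moisture
print(rrv_metrics(series, threshold=0.22))
```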

Relevance:

20.00%

Publisher:

Abstract:

The uncertainty in material properties and traffic characterization in the design of flexible pavements has led to significant efforts in recent years to incorporate reliability methods and probabilistic design procedures for the design, rehabilitation, and maintenance of pavements. In the mechanistic-empirical (ME) design of pavements, despite the fact that there are multiple failure modes, the design criteria applied in the majority of analytical pavement design methods guard only against fatigue cracking and subgrade rutting, which are usually treated as independent failure events. This study carries out the reliability analysis of a flexible pavement section for these failure criteria using the first-order reliability method (FORM), the second-order reliability method (SORM), and crude Monte Carlo simulation (MCS). Through a sensitivity analysis, the surface layer thickness was identified as the most critical parameter affecting the design reliability for both the fatigue and rutting failure criteria. However, reliability analysis in pavement design is most useful if it can be efficiently and accurately applied to the components of pavement design and to the combination of these components in an overall system analysis. The study shows that, for the pavement section considered, there is a high degree of dependence between the two failure modes, and demonstrates that the probability of simultaneous occurrence of failures can be almost as high as the probability of the component failures. Thus, the need to consider system reliability in pavement analysis is highlighted, and the study indicates that improving pavement performance should be tackled by reducing this undesirable event of simultaneous failure, not merely by addressing the more critical failure mode. Furthermore, this probability of simultaneous occurrence of failures is seen to increase considerably with small increments in the mean traffic loads, which also results in wider system reliability bounds. The study also advocates the use of narrow bounds on the probability of failure, which provide a better estimate of the probability of failure, as validated against the results obtained from MCS.
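
To illustrate the system-reliability bookkeeping for two dependent failure modes, here is a hedged Monte Carlo sketch in Python. The random variables and the two limit-state functions are stand-ins chosen only to share inputs (and hence be dependent); they are not the ME fatigue and rutting models of the study.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Hypothetical random inputs (units and distributions are illustrative only):
# surface layer thickness h (mm), resilient modulus E (MPa), traffic N_t (msa).
h = rng.normal(150.0, 15.0, N)
E = rng.lognormal(mean=np.log(3000.0), sigma=0.2, size=N)
N_t = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=N)

# Stand-in limit states; g <= 0 denotes failure. They depend on shared
# variables so that the two modes are statistically dependent.
g_fatigue = 1.2 * (h / 150.0) ** 1.5 * (E / 3000.0) ** 0.4 - (N_t / 10.0)
g_rutting = 1.1 * (h / 150.0) ** 1.2 - (N_t / 10.0) ** 0.8

f1 = g_fatigue <= 0.0
f2 = g_rutting <= 0.0

p1, p2 = f1.mean(), f2.mean()            # component failure probabilities
p_both = (f1 & f2).mean()                # simultaneous failure of both modes
p_sys = (f1 | f2).mean()                 # series-system failure (either mode)

# First-order system bounds: max(p1, p2) <= p_sys <= p1 + p2.
print(f"p_fatigue={p1:.4f}  p_rutting={p2:.4f}  p_both={p_both:.4f}  p_system={p_sys:.4f}")
print(f"bounds: [{max(p1, p2):.4f}, {min(1.0, p1 + p2):.4f}]")
```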

Relevance:

20.00%

Publisher:

Abstract:

In the analysis and design of municipal solid waste (MSW) landfills, there are many uncertainties associated with the properties of MSW during and after its placement. Several studies have been performed, involving different laboratory and field tests, to understand the complex behavior and properties of MSW, and based on these studies different models have been proposed for the analysis of the time-dependent settlement response of MSW. For the analysis of MSW settlement, it is very important to account for the variability of model parameters that reflect different processes such as primary compression under loading, mechanical creep and biodegradation. In this paper, regression equations based on the response surface method (RSM) are used to represent the complex behavior of MSW described by a newly developed constitutive model. An approach to assess landfill capacities and develop landfill closure plans based on the prediction of landfill settlements is proposed. The variability associated with the model parameters relating to primary compression, mechanical creep and biodegradation is used to examine their influence on MSW settlement within a reliability analysis framework, and the influence of the various parameters on the settlement of MSW is estimated through sensitivity analysis. © 2013 Elsevier Ltd. All rights reserved.
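
The response-surface idea can be sketched as fitting a quadratic polynomial to settlements computed from the full model at a set of design points, then using the polynomial as a cheap surrogate inside reliability or sensitivity loops. In the Python sketch below the "constitutive model" is a made-up closed form and the parameter ranges are placeholders; only the fitting mechanics mirror the RSM step.

```python
import numpy as np

# Placeholder for the constitutive settlement model: settlement as a function of
# a compression ratio cc, creep coefficient b and biodegradation rate k.
# This closed form is purely illustrative, not the model from the paper.
def settlement_model(cc, b, k, t=30.0):
    return 10.0 * cc + 2.0 * b * np.log1p(t) + 5.0 * (1.0 - np.exp(-k * t))

# Design points: a small factorial design over plausible parameter ranges.
cc = np.linspace(0.1, 0.3, 5)
b = np.linspace(0.01, 0.05, 5)
k = np.linspace(0.02, 0.10, 5)
CC, B, K = np.meshgrid(cc, b, k, indexing="ij")
X = np.column_stack([CC.ravel(), B.ravel(), K.ravel()])
y = settlement_model(X[:, 0], X[:, 1], X[:, 2])

# Quadratic response surface: constant, linear, cross and square terms.
def quad_features(X):
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

coeffs, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# The fitted polynomial can now stand in for the full model inside a
# reliability or sensitivity analysis loop.
x_new = np.array([[0.2, 0.03, 0.05]])
print(quad_features(x_new) @ coeffs, settlement_model(0.2, 0.03, 0.05))
```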

Relevance:

20.00%

Publisher:

Abstract:

The problem of updating the reliability of instrumented structures based on measured response under random dynamic loading is considered. A solution strategy is developed within the framework of Monte Carlo simulation-based dynamic state estimation, with Girsanov's transformation used for variance reduction. For linear Gaussian state space models, the solution is developed based on the continuous version of the Kalman filter, while for non-linear and (or) non-Gaussian state space models, bootstrap particle filters are adopted. The controls needed to implement the Girsanov transformation are developed by solving a constrained non-linear optimization problem. Numerical illustrations include studies on a multi-degree-of-freedom linear system and on non-linear systems with geometric and (or) hereditary non-linearities and non-stationary random excitations.
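
For readers unfamiliar with the state-estimation building block, the following is a minimal bootstrap particle filter in Python for a standard scalar nonlinear benchmark model. The model, noise levels and particle count are illustrative assumptions; neither the structural models nor the Girsanov controls of the paper are reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Scalar nonlinear state-space model (illustrative, not the paper's structural model):
#   x_k = 0.9 x_{k-1} + 5 x_{k-1} / (1 + x_{k-1}^2) + w_k,   w_k ~ N(0, 1)
#   y_k = x_k^2 / 20 + v_k,                                   v_k ~ N(0, 0.5^2)
T, Np = 100, 2000
sig_w, sig_v = 1.0, 0.5

def f(x):
    return 0.9 * x + 5.0 * x / (1.0 + x ** 2)

# Simulate a "measured" trajectory.
x_true = np.zeros(T)
y = np.zeros(T)
for k in range(1, T):
    x_true[k] = f(x_true[k - 1]) + sig_w * rng.standard_normal()
    y[k] = x_true[k] ** 2 / 20.0 + sig_v * rng.standard_normal()

# Bootstrap particle filter: propagate with the process model, weight by the
# measurement likelihood, then resample.
particles = rng.standard_normal(Np)
x_hat = np.zeros(T)
for k in range(1, T):
    particles = f(particles) + sig_w * rng.standard_normal(Np)
    w = np.exp(-0.5 * ((y[k] - particles ** 2 / 20.0) / sig_v) ** 2)
    w += 1e-300                                           # guard against underflow
    w /= w.sum()
    x_hat[k] = np.sum(w * particles)                      # filtered state estimate
    particles = rng.choice(particles, size=Np, p=w)       # multinomial resampling

print("RMS state estimation error:", np.sqrt(np.mean((x_hat - x_true) ** 2)))
```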

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we consider the problem of finding a spectrum hole of a specified bandwidth in a given wide band of interest. We propose a new, simple and easily implementable sub-Nyquist sampling scheme for signal acquisition and a spectrum hole search algorithm that exploits sparsity in the primary spectral occupancy in the frequency domain by testing a group of adjacent subbands in a single test. The sampling scheme deliberately introduces aliasing during signal acquisition, resulting in a signal that is the sum of the signals from adjacent subbands. Energy-based hypothesis tests are used to provide an occupancy decision over the group of subbands, and this forms the basis of the proposed algorithm to find contiguous spectrum holes. We extend this framework to a multi-stage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including non-contiguous spectrum hole search. Further, we provide the analytical means to optimize the hypothesis tests with respect to the detection thresholds, number of samples and group size to minimize the detection delay under a given error rate constraint. Depending on the sparsity and SNR, the proposed algorithms can lead to significantly lower detection delays compared to a conventional bin-by-bin energy detection scheme; the latter is in fact a special case of the group test when the group size is set to 1. We validate our analytical results via Monte Carlo simulations.
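
The core group test can be illustrated with a toy Python simulation that mimics aliasing by summing per-subband baseband samples and applies an energy threshold to each group. The occupancy pattern, SNR, sample counts and the Gaussian threshold approximation are illustrative assumptions, not the optimized design of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

n_subbands, group_size, n_samples = 32, 4, 512
snr_db, noise_var = 5.0, 1.0
signal_var = noise_var * 10 ** (snr_db / 10.0)

# Sparse primary occupancy: a few subbands carry a signal.
occupied = np.zeros(n_subbands, dtype=bool)
occupied[rng.choice(n_subbands, size=4, replace=False)] = True

def group_energy_test(group):
    """Energy detector on the aliased sum of the subbands in `group`.

    Aliased acquisition is mimicked by summing per-subband baseband samples;
    the threshold targets a rough 5% false-alarm rate using a Gaussian
    approximation of the noise-only energy distribution.
    """
    x = np.zeros(n_samples)
    for b in group:
        var = noise_var + (signal_var if occupied[b] else 0.0)
        x += np.sqrt(var) * rng.standard_normal(n_samples)
    energy = np.sum(x ** 2)
    mu = n_samples * len(group) * noise_var
    sigma = np.sqrt(2.0 * n_samples) * len(group) * noise_var
    return energy > mu + 1.645 * sigma           # True -> group declared occupied

# Scan groups of adjacent subbands; a group declared vacant is a candidate
# contiguous spectrum hole of group_size subbands.
for start in range(0, n_subbands, group_size):
    group = range(start, start + group_size)
    verdict = "occupied" if group_energy_test(group) else "hole candidate"
    print(f"subbands {start}-{start + group_size - 1}: {verdict}")
```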

Relevance:

20.00%

Publisher:

Abstract:

Small-scale mechanical testing of materials has gained prominence in the last decade or so due to the continuous miniaturization of components and devices in everyday applications. This review describes the various micro-fabrication processes associated with the preparation of miniaturized specimens, the geometries of the test specimens, and the small-scale testing techniques used to determine the mechanical behaviour of materials at length scales of a few hundred micrometres and below. This is followed by illustrative examples in a selected class of materials, chosen for their relevance in today's world. Evaluation of the mechanical properties of thermal barrier coatings (TBCs), applied for enhanced high-temperature protection of advanced gas turbine engine components, is essential since their failure by fracture leads to the collapse of the engine system. Si-based substrates, though brittle, are indispensable for MEMS/NEMS applications. Biological specimens are studied because their response to mechanical loads is important for ascertaining their role in diseases and for mimicking their structure to attain high fracture toughness and impact resistance. Insight into the mechanisms behind the observed size effects in metallic systems can be exploited to achieve excellent strength at the nano-scale. An outlook on where the field is heading is also presented.

Relevance:

20.00%

Publisher:

Abstract:

An in situ approach involving a simple mix-and-shake method for testing the enantiopurity of primary, secondary and tertiary chiral amines and their derivatives, chiral amino alcohols, by ¹H NMR spectroscopy is developed. The protocol involves the in situ formation of a chiral ammonium borate salt from a mixture of C2-symmetric chiral BINOL, a trialkoxyborane and the chiral amine. The proposed concept was demonstrated convincingly on a large number of chiral and pro-chiral amines and amino alcohols, and it also allows precise measurement of the enantiomeric excess. The protocol can be completed in a couple of minutes directly in the NMR sample tube, without the need for any physical separation.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, an approach for target component and system reliability-based design optimisation (RBDO) to evaluate safety for the internal seismic stability of geosynthetic-reinforced soil (GRS) structures is presented. Three modes of failure are considered: tension failure of the bottom-most layer of reinforcement, pullout failure of the topmost layer of reinforcement, and total pullout failure of all reinforcement layers. The analysis is performed by treating the backfill properties and the geometric and strength properties of the reinforcement as random variables. The optimum number of reinforcement layers and the optimum pullout length needed to maintain stability against tension failure, pullout failure and total pullout failure are proposed for different coefficients of variation of the friction angle of the backfill and of the design strength of the reinforcement, and for different horizontal seismic acceleration coefficients, targeting various system reliability indices. The results provide guidelines for the total length of reinforcement required, considering the variability of the backfill as well as the seismic coefficients. One illustrative example is presented to explain the evaluation of reliability for the internal stability of reinforced soil structures using the proposed approach. In a second illustration examining the stability of five walls, the Kushiro wall subjected to the Kushiro-Oki earthquake, the Seiken wall subjected to the Chiba-ken Toho-Oki earthquake, the Ta Kung wall subjected to the Ji-Ji earthquake, and the Gould and Valencia walls subjected to the Northridge earthquake are re-examined.
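
The target-reliability design idea can be sketched as: estimate the failure probability for a trial design and add reinforcement layers until the estimate meets the target index. In the Python sketch below the per-layer pullout limit state, the distributions and all numerical values are toy assumptions rather than the paper's seismic pullout model; only the design-iteration logic is illustrated.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 200_000
pf_target = 1.35e-3        # failure probability corresponding to beta = 3.0

def pullout_pf(n_layers, length):
    """Monte Carlo failure-probability estimate for a stand-in pullout limit state."""
    phi = np.radians(rng.normal(32.0, 32.0 * 0.07, N))    # backfill friction angle
    kh = rng.uniform(0.0, 0.15, N)                        # horizontal seismic coefficient
    demand = (0.5 + 2.0 * kh) / n_layers                  # toy per-layer demand
    capacity = (0.8 * length / 6.0) * np.tan(phi)         # toy per-layer capacity
    return np.mean(capacity - demand <= 0.0)

length = 5.0               # pullout length (m), held fixed in this sketch
n_layers = 2
while pullout_pf(n_layers, length) > pf_target:
    n_layers += 1          # add reinforcement layers until the target reliability is met
print("layers required for beta >= 3.0:", n_layers)
```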

Relevance:

20.00%

Publisher:

Abstract:

An earlier version of an indigenously developed pressure wave generator (PWG) could not develop the pressure ratio necessary to operate a pulse tube cooler satisfactorily, largely due to high blow-by losses in the piston-cylinder seal gap and a few design deficiencies. The effect of different parameters, such as the seal gap, piston diameter, piston stroke, moving mass and piston back volume, on the performance is studied analytically. Modifications were made to the PWG based on this analysis, and the performance was measured experimentally. A significant improvement in PWG performance is seen as a result of the modifications. The improved PWG was tested with the same pulse tube cooler but with different inertance tube configurations. A no-load temperature of 130 K was achieved with an inertance tube configuration designed using the Sage software. The delivered PV power is estimated to be 28.4 W, which can produce a refrigeration of about 1 W at 80 K.

Relevance:

20.00%

Publisher:

Abstract:

The problem of time-variant reliability analysis of randomly parametered and randomly driven nonlinear vibrating systems is considered. The study combines two Monte Carlo variance reduction strategies into a single framework to tackle the problem. The first of these strategies is based on the application of the Girsanov transformation to account for the randomness in the dynamic excitations, and the second is fashioned after the subset simulation method to deal with randomness in the system parameters. Illustrative examples include studies of single- and multi-degree-of-freedom, linear and non-linear, inelastic, randomly parametered building frame models driven by stationary or non-stationary, white or filtered white noise support acceleration. The estimated reliability measures are demonstrated to compare well with results from direct Monte Carlo simulations. © 2014 Elsevier Ltd. All rights reserved.
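
To show the mechanics of the Girsanov-based part in isolation, the Python sketch below estimates a first-passage probability for a randomly driven first-order system by sampling the noise increments under a shifted mean and correcting with the Radon-Nikodym weight. The constant drift shift is a naive stand-in for the optimized controls of the paper, and the subset simulation layer over random parameters is not shown.

```python
import numpy as np

rng = np.random.default_rng(5)

# First-order randomly driven system dx = -a*x dt + sigma dW (an illustrative
# stand-in for the building-frame models). Failure: x exceeds barrier b in [0, T].
a, sigma, b = 1.0, 1.0, 2.5
dt, T, n_sim = 0.01, 5.0, 20_000
n_steps = int(T / dt)

def first_passage_prob(u):
    """Estimate P(max_t x(t) > b) sampling noise increments from N(u*dt, dt).

    u = 0 gives crude Monte Carlo; u > 0 is Girsanov-style importance sampling,
    with the weight exp(u^2*T/2 - u*sum(dW)) removing the bias of the shift.
    """
    x = np.zeros(n_sim)
    crossed = np.zeros(n_sim, dtype=bool)
    sum_dw = np.zeros(n_sim)
    for _ in range(n_steps):
        dw = u * dt + np.sqrt(dt) * rng.standard_normal(n_sim)
        x += -a * x * dt + sigma * dw
        sum_dw += dw
        crossed |= x > b
    weight = np.exp(0.5 * u ** 2 * T - u * sum_dw)
    return np.mean(crossed * weight)

print("crude MC estimate        :", first_passage_prob(0.0))
print("Girsanov-shifted estimate:", first_passage_prob(1.5))
```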

Relevance:

20.00%

Publisher:

Abstract:

A load and resistance factor design (LRFD) approach for reinforced soil walls is presented, aimed at producing designs with consistent and uniform levels of risk over the whole range of design applications. The evaluation of load and resistance factors for reinforced soil walls based on reliability theory is presented. A first-order reliability method (FORM) is used to determine appropriate ranges for the values of the load and resistance factors. Using a pseudo-static limit equilibrium method, an analysis is conducted to evaluate the external stability of reinforced soil walls subjected to earthquake loading. The potential failure mechanisms considered in the analysis are sliding failure, eccentricity failure of the resultant force (or overturning failure) and bearing capacity failure. The proposed procedure includes the variability associated with the reinforced backfill, retained backfill, foundation soil, horizontal seismic acceleration and surcharge load acting on the wall. Partial factors needed to maintain stability against the three modes of failure, targeting a component reliability index of 3.0, are obtained for various values of the coefficients of variation (COV) of the friction angle of the backfill and foundation soil, the distributed dead load surcharge, the cohesion of the foundation soil and the horizontal seismic acceleration. A comparative study between LRFD and allowable stress design (ASD) is also presented with a design example. © 2014 Elsevier Ltd. All rights reserved.
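
A minimal sketch of the FORM machinery, assuming independent normal variables and a toy sliding limit state (not the paper's pseudo-static models or distributions): the Hasofer-Lind/Rackwitz-Fiessler iteration locates the design point in standard normal space and returns the reliability index from which partial factors are typically back-calculated.

```python
import numpy as np

# Stand-in sliding limit state for a reinforced soil wall; g <= 0 is failure.
# Variables (mean, std), assumed independent normal:
# phi: friction angle (deg), W: resultant weight (kN/m), kh: seismic coefficient.
means = np.array([34.0, 900.0, 0.10])
stds = np.array([3.0, 45.0, 0.03])

def g(x):
    phi, W, kh = x
    return W * np.tan(np.radians(phi)) - (300.0 + 1500.0 * kh)   # toy limit state

def grad(fun, x, h=1e-6):
    """Central-difference gradient."""
    e = np.eye(len(x))
    return np.array([(fun(x + h * ei) - fun(x - h * ei)) / (2 * h) for ei in e])

# Hasofer-Lind / Rackwitz-Fiessler iteration in standard normal space u,
# with x = means + stds * u for independent normal variables.
u = np.zeros(3)
for _ in range(50):
    x = means + stds * u
    G = g(x)
    dG = grad(g, x) * stds                  # chain rule into u-space
    u_new = (dG @ u - G) / (dG @ dG) * dG   # HL-RF update
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)                    # reliability index
alpha = u / beta                            # direction cosines (sensitivities)
print(f"beta = {beta:.3f}, design point x* = {means + stds * u}")
```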

Relevance:

20.00%

Publisher:

Abstract:

MEMS resonators have potential application in the area of frequency-selective devices (e.g., gyroscopes, mass sensors, etc.). In this paper, the design of electrothermally tunable resonators is presented. The SOIMUMPs process is used to fabricate resonators with springs (beams) and a central mass. When a voltage is applied, Joule heating raises the temperature of the conducting beams. This results in an increase of the electrical resistance due to mobility degradation. As the temperature rises, the springs soften and the fundamental frequency therefore decreases. So, for a given structure, one can modify the original fundamental frequency by changing the applied voltage. Coupled thermal effects result in non-uniform heating. It is observed from measurements and simulations that some parts of the beam become very hot and therefore soften more. Consequently, at higher voltages, the structure (equivalent to a single resonator) behaves like coupled resonators and exhibits peak splitting. In this mode, the resonator can be used as a band-rejection filter. This process is reversible and repeatable. For the designed structure, it is experimentally shown that by varying the voltage from 1 to 16 V, the resonant frequency can be changed by 28%.
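
The voltage-to-frequency tuning chain (Joule heating, temperature rise, spring softening, lower f = sqrt(k/m)/2π) can be caricatured with a lumped-parameter Python toy model. Every coefficient below is an illustrative assumption rather than a SOIMUMPs device value, and the non-uniform heating and peak splitting are not modeled.

```python
import numpy as np

# Lumped-parameter sketch: Joule heating raises the beam temperature, which
# lowers the effective spring constant and hence the resonant frequency.
k0 = 50.0            # spring constant at ambient (N/m), illustrative
m = 2.0e-9           # proof-mass (kg), illustrative
R0 = 500.0           # electrical resistance at ambient (ohm), illustrative
alpha_R = 2.0e-3     # temperature coefficient of resistance (1/K)
Rth = 200.0          # lumped thermal resistance to the substrate (K/W)
alpha_k = 5.0e-3     # fractional stiffness reduction per kelvin (softening)

def resonant_frequency(V):
    # Solve the coupled pair P = V^2 / R(T), dT = Rth * P by fixed-point iteration.
    dT = 0.0
    for _ in range(100):
        R = R0 * (1.0 + alpha_R * dT)        # hotter beam -> higher resistance
        dT = Rth * V ** 2 / R                # steady-state temperature rise
    k = k0 * max(1.0 - alpha_k * dT, 0.05)   # softened spring (floored for safety)
    return np.sqrt(k / m) / (2.0 * np.pi)

for V in (1.0, 4.0, 8.0, 12.0, 16.0):
    print(f"V = {V:4.1f} V  ->  f ~ {resonant_frequency(V) / 1e3:.1f} kHz")
```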

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the use of adaptive group testing to find a spectrum hole of a specified bandwidth in a given wideband of interest. We propose a group testing-based spectrum hole search algorithm that exploits sparsity in the primary spectral occupancy by testing a group of adjacent subbands in a single test. This is enabled by a simple and easily implementable sub-Nyquist sampling scheme for signal acquisition by the cognitive radios (CRs). The sampling scheme deliberately introduces aliasing during signal acquisition, resulting in a signal that is the sum of signals from adjacent subbands. Energy-based hypothesis tests are used to provide an occupancy decision over the group of subbands, and this forms the basis of the proposed algorithm to find contiguous spectrum holes of a specified bandwidth. We extend this framework to a multistage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including noncontiguous spectrum hole search. Furthermore, we provide the analytical means to optimize the group tests with respect to the detection thresholds, number of samples, group size, and number of stages to minimize the detection delay under a given error probability constraint. Our analysis allows one to identify the sparsity and SNR regimes where group testing can lead to significantly lower detection delays compared with a conventional bin-by-bin energy detection scheme; the latter is, in fact, a special case of the group test when the group size is set to 1 bin. We validate our analytical results via Monte Carlo simulations.
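
The adaptive, multistage aspect can be caricatured as a recursive group test: declare a whole group a hole if its test comes back vacant, and split and retest groups that come back occupied. In the Python sketch below, `is_occupied` is a noiseless placeholder for the energy test on aliased samples, and the halving schedule is an assumption rather than the paper's optimized stage design.

```python
from typing import Callable, List, Tuple

def find_holes(n_subbands: int,
               min_width: int,
               is_occupied: Callable[[int, int], bool]) -> List[Tuple[int, int]]:
    """Adaptive group test: recursively split occupied groups to locate vacant ones.

    `is_occupied(start, width)` stands in for an energy test on the aliased sum
    of subbands [start, start + width); here it is assumed noiseless. Returns
    vacant intervals of width >= min_width as (start, width) pairs.
    """
    holes = []

    def recurse(start: int, width: int) -> None:
        if width < min_width:
            return                          # too narrow to be a useful hole
        if not is_occupied(start, width):
            holes.append((start, width))    # whole group vacant: report as a hole
            return
        half = width // 2                   # occupied: split and test each half
        recurse(start, half)
        recurse(start + half, width - half)

    recurse(0, n_subbands)
    return holes

# Toy usage: subbands 10-13 and 40 are occupied by primary users.
occupied_bands = {10, 11, 12, 13, 40}
test = lambda s, w: any(b in occupied_bands for b in range(s, s + w))
print(find_holes(n_subbands=64, min_width=4, is_occupied=test))
```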

Relevance:

20.00%

Publisher:

Abstract:

Given a Boolean function f : F_2^n → {0, 1}, we say a triple (x, y, x + y) is a triangle in f if f(x) = f(y) = f(x + y) = 1. A triangle-free function contains no triangle. If f differs from every triangle-free function on at least ε·2^n points, then f is said to be ε-far from triangle-free. In this work, we analyze the query complexity of testers that, with constant probability, distinguish triangle-free functions from those ε-far from triangle-free. The canonical tester for triangle-freeness denotes the algorithm that repeatedly picks x and y uniformly and independently at random from F_2^n, queries f(x), f(y) and f(x + y), and checks whether f(x) = f(y) = f(x + y) = 1. Green showed that the canonical tester rejects functions ε-far from triangle-free with constant probability if its query complexity is a tower of 2's whose height is polynomial in 1/ε. Fox later improved the height of the tower in Green's upper bound to one logarithmic in 1/ε. A trivial lower bound of Ω(1/ε) on the query complexity is immediate. In this paper, we give the first non-trivial lower bound on the number of queries needed. We show that, for every small enough ε, there exists an integer n_0(ε) such that for all n ≥ n_0 there exists a function f, depending on all n variables, which is ε-far from being triangle-free and requires a number of queries super-linear in 1/ε for the canonical tester. We also show that the query complexity of any general (possibly adaptive) one-sided tester for triangle-freeness is at least the square root of the query complexity of the corresponding canonical tester; consequently, any one-sided tester for triangle-freeness must make at least the square root of that many queries.
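
The canonical tester itself is straightforward to state in code. Below is a small Python sketch over F_2^n encoded as n-bit integers (so x + y is bitwise XOR); the example function, the value of n, and the number of trials are illustrative choices, not taken from the paper.

```python
import random

def canonical_tester(f, n, num_trials):
    """Canonical tester for triangle-freeness of f: {0,1}^n -> {0,1}.

    Repeatedly picks x, y uniformly at random from F_2^n (encoded as n-bit
    integers, so x + y is bitwise XOR) and rejects if f(x) = f(y) = f(x ^ y) = 1.
    """
    for _ in range(num_trials):
        x = random.getrandbits(n)
        y = random.getrandbits(n)
        if f(x) and f(y) and f(x ^ y):
            return "reject"          # found a triangle: f is not triangle-free
    return "accept"                  # no triangle found in num_trials samples

# Illustrative function on F_2^8: f(x) = 1 iff x is nonzero with an even number
# of 1-bits; such an f contains many triangles (e.g. 0b0011, 0b0101, 0b0110).
f = lambda x: int(x != 0 and bin(x).count("1") % 2 == 0)
print(canonical_tester(f, n=8, num_trials=10_000))
```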