980 results for Reliability testing


Relevance:

20.00%

Publisher:

Abstract:

A load and resistance factor design (LRFD) approach for reinforced soil walls is presented to produce designs with consistent and uniform levels of risk across the whole range of design applications. The evaluation of load and resistance factors for reinforced soil walls based on reliability theory is presented. A first-order reliability method (FORM) is used to determine appropriate ranges for the values of the load and resistance factors. Using the pseudo-static limit equilibrium method, an analysis is conducted to evaluate the external stability of reinforced soil walls subjected to earthquake loading. The potential failure mechanisms considered in the analysis are sliding failure, eccentricity failure of the resultant force (overturning failure) and bearing capacity failure. The proposed procedure includes the variability associated with the reinforced backfill, retained backfill, foundation soil, horizontal seismic acceleration and surcharge load acting on the wall. Partial factors needed to maintain stability against the three modes of failure, obtained by targeting a component reliability index of 3.0, are given for various values of the coefficients of variation (COV) of the friction angle of the backfill and foundation soil, the distributed dead load surcharge, the cohesion of the foundation soil and the horizontal seismic acceleration. A comparative study between LRFD and allowable stress design (ASD) is also presented with a design example. (C) 2014 Elsevier Ltd. All rights reserved.
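The FORM calculation referenced in this abstract can be sketched on a toy sliding limit state. The limit-state function, distributions and all numerical values below are illustrative assumptions, not the paper's wall model; for a limit state that is linear in normal variables, the Hasofer-Lind iteration recovers β = μ_g/σ_g exactly.

```python
import numpy as np

# Toy limit state: sliding resistance minus seismic driving force.
# X1 ~ N(30, 3): resistance term, X2 ~ N(10, 2): driving load term.
# All numbers are illustrative assumptions, not the paper's model.
mu = np.array([30.0, 10.0])
sigma = np.array([3.0, 2.0])

def g(x):
    # Failure when resistance x[0] falls below the scaled driving load.
    return x[0] - 1.5 * x[1]

def grad_g(x):
    return np.array([1.0, -1.5])

# Hasofer-Lind (HL-RF) iteration in standard normal space.
u = np.zeros(2)
for _ in range(50):
    x = mu + sigma * u                  # map back to physical space
    dgdu = grad_g(x) * sigma            # chain rule through the linear map
    norm = np.linalg.norm(dgdu)
    alpha = dgdu / norm                 # unit gradient direction
    step = (g(x) - dgdu @ u) / norm
    u_new = -alpha * step               # standard HL-RF update
    if np.linalg.norm(u_new - u) < 1e-10:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)                # reliability index
print(round(beta, 3))
```

For this linear case the iteration converges in one step; the reliability index equals (mean of g)/(std of g) = 15/√18 ≈ 3.54, close to the target index of 3.0 discussed in the abstract.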

Relevance:

20.00%

Publisher:

Abstract:

MEMS resonators have potential applications in frequency-selective devices (e.g., gyroscopes, mass sensors, etc.). In this paper, the design of electrothermally tunable resonators is presented. The SOIMUMPs process is used to fabricate resonators with springs (beams) and a central mass. When a voltage is applied, Joule heating raises the temperature of the conducting beams. This increases the electrical resistance through mobility degradation. As the temperature rises, the springs soften and the fundamental frequency therefore decreases. So, for a given structure, one can modify the original fundamental frequency by changing the applied voltage. Coupled thermal effects result in non-uniform heating. It is observed from measurements and simulations that some parts of the beam become very hot and therefore soften more. Consequently, at higher voltages, the structure (equivalent to a single resonator) behaves like coupled resonators and exhibits peak splitting. In this mode, the resonator can be used as a band-rejection filter. This process is reversible and repeatable. For the designed structure, it is experimentally shown that varying the voltage from 1 to 16 V changes the resonant frequency by 28%.
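The tuning mechanism described here can be illustrated with a lumped spring-mass model in which the stiffness softens linearly with temperature. The softening coefficient, stiffness, mass and temperature rise below are all assumed toy values, not the fabricated device's parameters (which additionally involve non-uniform heating, hence the much larger measured 28% shift).

```python
import math

# Illustrative spring-softening model (assumed parameters, not the device's):
# beam stiffness scales with Young's modulus, which drops with temperature.
c = 6e-5            # assumed linear softening coefficient [1/K]
k0 = 12.0           # assumed spring constant at room temperature [N/m]
m = 1e-9            # assumed effective mass [kg]

def resonant_freq(delta_T):
    """Fundamental frequency of a lumped spring-mass resonator,
    with stiffness softened linearly by Joule heating."""
    k = k0 * (1.0 - c * delta_T)
    return math.sqrt(k / m) / (2.0 * math.pi)

f_cold = resonant_freq(0.0)
f_hot = resonant_freq(400.0)   # assumed temperature rise at high drive voltage
shift = (f_cold - f_hot) / f_cold
print(f"relative frequency shift: {shift:.3%}")
```

Because frequency goes as the square root of stiffness, a uniform 2.4% stiffness drop yields only a ~1.2% frequency shift in this toy model; localized hot spots in the real beams soften much more, which is what enables the large tuning range and the peak splitting.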

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the use of adaptive group testing to find a spectrum hole of a specified bandwidth in a given wideband of interest. We propose a group testing-based spectrum hole search algorithm that exploits sparsity in the primary spectral occupancy by testing a group of adjacent subbands in a single test. This is enabled by a simple and easily implementable sub-Nyquist sampling scheme for signal acquisition by the cognitive radios (CRs). The sampling scheme deliberately introduces aliasing during signal acquisition, resulting in a signal that is the sum of signals from adjacent subbands. Energy-based hypothesis tests are used to provide an occupancy decision over the group of subbands, and this forms the basis of the proposed algorithm to find contiguous spectrum holes of a specified bandwidth. We extend this framework to a multistage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including noncontiguous spectrum hole search. Furthermore, we provide the analytical means to optimize the group tests with respect to the detection thresholds, number of samples, group size, and number of stages to minimize the detection delay under a given error probability constraint. Our analysis allows one to identify the sparsity and SNR regimes where group testing can lead to significantly lower detection delays compared with a conventional bin-by-bin energy detection scheme; the latter is, in fact, a special case of the group test when the group size is set to 1 bin. We validate our analytical results via Monte Carlo simulations.
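The core idea of testing a group of adjacent subbands in one shot can be sketched abstractly. Here a boolean occupancy oracle stands in for the energy-based group test over aliased subbands, and the refinement policy, function names and the toy occupancy vector are illustrative assumptions, not the paper's algorithm.

```python
def group_free(occ, lo, hi):
    # Stand-in for the energy-based group test: in the real system a single
    # sub-Nyquist measurement covers bins lo..hi-1 via deliberate aliasing.
    return not any(occ[lo:hi])

def find_hole(occ, bandwidth):
    """Find the start of `bandwidth` contiguous free bins using group tests,
    counting how many tests were spent (illustrative accounting)."""
    n = len(occ)
    tests = 0
    i = 0
    while i + bandwidth <= n:
        tests += 1
        if group_free(occ, i, i + bandwidth):
            return i, tests
        # On a failed group test, refine bin-by-bin (also counted as tests)
        # and skip past the last occupied bin in the group.
        last = max(j for j in range(i, i + bandwidth) if occ[j])
        tests += bandwidth
        i = last + 1
    return None, tests

occupancy = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0]
start, tests = find_hole(occupancy, 4)
print(start, tests)
```

With a sparse occupancy most group tests pass or fail quickly, so far fewer tests are spent than the bin-by-bin scan; the abstract's analysis makes this trade-off precise as a function of sparsity, SNR and group size.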

Relevance:

20.00%

Publisher:

Abstract:

Given a Boolean function f: F_2^n → {0, 1}, we say a triple (x, y, x + y) is a triangle in f if f(x) = f(y) = f(x + y) = 1. A triangle-free function contains no triangle. If f differs from every triangle-free function on at least ε·2^n points, then f is said to be ε-far from triangle-free. In this work, we analyze the query complexity of testers that, with constant probability, distinguish triangle-free functions from those ε-far from triangle-free. The canonical tester for triangle-freeness denotes the algorithm that repeatedly picks x and y uniformly and independently at random from F_2^n, queries f(x), f(y) and f(x + y), and checks whether f(x) = f(y) = f(x + y) = 1. Green showed that the canonical tester rejects functions ε-far from triangle-free with constant probability if its query complexity is a tower of 2's whose height is polynomial in 1/ε. Fox later improved the height of the tower in Green's upper bound to O(log(1/ε)). A trivial lower bound of Ω(1/ε) on the query complexity is immediate. In this paper, we give the first non-trivial lower bound on the number of queries needed. We show that, for every small enough ε, there exists an integer n_0 such that for all n ≥ n_0 there exists a function f, depending on all n variables, which is ε-far from being triangle-free and requires (1/ε)^4.847 queries for the canonical tester. We also show that the query complexity of any general (possibly adaptive) one-sided tester for triangle-freeness is at least the square root of the query complexity of the corresponding canonical tester. Consequently, any one-sided tester for triangle-freeness must make at least (1/ε)^2.423 queries.
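The canonical tester described above is simple enough to sketch directly. Vectors in F_2^n are represented as n-bit integers, so x + y over F_2^n is bitwise XOR; the two example functions are illustrative (a parity-based triangle-free function and the all-ones function, which has a triangle at every pair).

```python
import random

def canonical_tester(f, n, trials):
    """Canonical test for triangle-freeness of f: F_2^n -> {0, 1}:
    sample x, y uniformly and reject if f(x) = f(y) = f(x + y) = 1."""
    for _ in range(trials):
        x = random.getrandbits(n)
        y = random.getrandbits(n)
        if f(x) == f(y) == f(x ^ y) == 1:
            return False   # found a triangle: reject
    return True            # no triangle observed: accept

n = 8
# f1 is triangle-free: it is supported on odd-parity vectors only, and the
# XOR of two odd-parity vectors has even parity, so f1(x) = f1(y) = 1
# forces f1(x + y) = 0.
f1 = lambda x: bin(x).count("1") % 2
# f2 is the all-ones function: every (x, y, x + y) is a triangle.
f2 = lambda x: 1

random.seed(1)
print(canonical_tester(f1, n, 2000), canonical_tester(f2, n, 2000))
```

Note the tester is one-sided: it only rejects on hard evidence (a witnessed triangle), which is why the abstract's reduction from general one-sided testers to the canonical tester is meaningful.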

Relevance:

20.00%

Publisher:

Abstract:

India's energy demand is increasing rapidly with the intensive growth of its economy. The electricity demand in India exceeds the availability, both in terms of base load energy and peak availability. Efficient use of energy sources and their conversion and utilization are the viable alternatives available to utilities and industry. There are essentially two approaches to electrical energy management: one at the supply/utility end (Supply Side Management, or SSM) and the other at the consumer end (Demand Side Management, or DSM). This work is based on the Supply Side Management (SSM) approach and consists of the design, fabrication and testing of a control device that can automatically regulate the power flow to an individual consumer's premises. This control device can monitor the overuse of electricity (above the connected load or contracted demand) by individual consumers. The present work places special emphasis on the contract demand of every consumer and tries to curb use beyond the contract demand. The control unit design includes both software and hardware and is designed for a 0.5 kW contract demand. The device has been tested in the laboratory, and the tests reveal its potential for use in the field.
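The control logic of such a device can be sketched in a few lines. The grace period, function names and the sample power trace below are illustrative assumptions; only the 0.5 kW contract demand comes from the abstract.

```python
# Minimal sketch of the supply-side control logic described above: sample the
# consumer's instantaneous power draw and trip the supply when it stays above
# the contract demand. Thresholds, names and the trace are illustrative.
CONTRACT_DEMAND_W = 500.0      # 0.5 kW contract demand, as in the prototype
GRACE_SAMPLES = 3              # assumed tolerance before tripping

def regulate(power_samples):
    """Return the per-sample supply state: True = connected, False = tripped."""
    over = 0
    connected = True
    states = []
    for p in power_samples:
        if connected:
            over = over + 1 if p > CONTRACT_DEMAND_W else 0
            if over > GRACE_SAMPLES:
                connected = False   # sustained overuse: cut the supply
        states.append(connected)
    return states

trace = [300, 480, 520, 560, 610, 650, 400, 300]
print(regulate(trace))
```

A short grace period avoids tripping on momentary spikes (motor starts, inrush currents) while still enforcing the contracted demand against sustained overuse.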

Relevance:

20.00%

Publisher:

Abstract:

Rifampicin (Rif) is a first-line drug used for tuberculosis treatment. However, the emergence of drug-resistant strains has necessitated the synthesis and testing of newer analogs of Rif. Mycobacterium smegmatis is often used as a surrogate for M. tuberculosis. However, the presence of an ADP-ribosyltransferase (Arr) in M. smegmatis inactivates Rif, rendering it impractical for screening of Rif analogs or of other compounds used in conjunction with them (Rif/Rif analogs). Rifampicin is also used in studying the role of various DNA repair enzymes by analyzing mutations in RpoB (a subunit of RNA polymerase) that cause Rif resistance. These analyses use high concentrations of Rif when M. smegmatis is used as the model. Here, we have generated M. smegmatis strains by deleting arr (Δarr). The M. smegmatis Δarr strains show a minimum inhibitory concentration (MIC) for Rif similar to that of M. tuberculosis. The MICs for isoniazid, pyrazinamide, ethambutol, ciprofloxacin and streptomycin were essentially unaltered for M. smegmatis Δarr. The growth profiles and mutation spectra of Δarr strains, and of Δarr combined with ΔudgB (udgB encodes a DNA repair enzyme that excises uracil), were similar to their counterparts wild-type for arr. However, the mutation spectrum of the Δfpg Δarr strain differed somewhat from that of the Δfpg strain (fpg encodes a DNA repair enzyme that excises 8-oxo-G). Our studies suggest the M. smegmatis Δarr strain as an ideal model system for drug testing and for mutation spectrum determination in DNA repair studies.

Relevance:

20.00%

Publisher:

Abstract:

Given a function from Z_n to itself, one can determine its polynomial representability using the Kempner function. In this paper we present an alternative characterization of polynomial functions over Z_n by constructing a generating set for the Z_n-module of polynomial functions. This characterization yields an algorithm that is faster on average in deciding polynomial representability. We also extend the characterization to functions in several variables. (C) 2015 Elsevier B.V. All rights reserved.
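The brute-force baseline that such characterizations improve upon can be sketched for small moduli. It relies on the classical fact that every polynomial function mod n is induced by a polynomial of degree less than μ(n), where μ is the Kempner function (the smallest k with n | k!); the function names and the choice n = 4 are illustrative, and the enumeration is only feasible for tiny n.

```python
from itertools import product
from math import factorial

def kempner(n):
    """Kempner function: the smallest k with n | k!."""
    k = 1
    while factorial(k) % n:
        k += 1
    return k

def polynomial_functions(n):
    """All functions Z_n -> Z_n induced by polynomials, by brute force over
    coefficient tuples of degree < kempner(n). Feasible for small n only."""
    deg = kempner(n)
    funcs = set()
    for coeffs in product(range(n), repeat=deg):
        f = tuple(sum(c * pow(x, i, n) for i, c in enumerate(coeffs)) % n
                  for x in range(n))
        funcs.add(f)
    return funcs

funcs = polynomial_functions(4)
print(len(funcs))                 # number of distinct polynomial functions mod 4
print(tuple(range(4)) in funcs)   # the identity map is polynomial
```

For n = 4 only 64 of the 256 functions from Z_4 to itself are polynomial (for prime moduli every function is polynomial, by Lagrange interpolation); deciding representability of one given function this way costs n^μ(n) evaluations, which is what makes a direct characterization attractive.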

Relevance:

20.00%

Publisher:

Abstract:

Most cities in India have undergone rapid development in recent decades, and many rural localities are being transformed into urban hotspots. These developments are accompanied by land use/land cover (LULC) changes that affect the runoff response of catchments, often evident as increases in runoff peaks, volume and velocity in the drain network. Many existing storm water drains are in a dilapidated state owing to improper maintenance or inadequate design. The drains are conventionally designed using procedures based on some anticipated future conditions. Further, the values of parameters/variables associated with the design of the network are traditionally considered deterministic. In reality, however, these parameters/variables are uncertain due to natural and/or inherent randomness. These uncertainties need to be considered when designing a storm water drain network that can effectively convey the discharge. The present study evaluates the performance of an existing storm water drain network in Bangalore, India, through reliability analysis by the Advanced First Order Second Moment (AFOSM) method. In the reliability analysis, the roughness coefficient, slope and conduit dimensions are treated as random variables. Performance of the existing network is evaluated considering three failure modes: the first occurs when runoff exceeds the capacity of the storm water drain network; the second when the actual flow velocity in the network exceeds the maximum allowable velocity for erosion control; and the third when the minimum flow velocity is less than the minimum allowable velocity for deposition control. In the analysis, runoff generated from subcatchments of the study area and flow velocities in the storm water drains are estimated using the Storm Water Management Model (SWMM).
The reliability values are low under all three failure modes, indicating a need to redesign several of the conduits to improve their reliability. This study finds use in devising plans for expansion of the Bangalore storm water drain system. (C) 2015 The Authors. Published by Elsevier B.V.
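The first failure mode (runoff exceeding conduit capacity) can be illustrated with a crude Monte Carlo check on a single circular conduit sized by Manning's equation. The distributions, design discharge and conduit dimensions below are assumed values for illustration; the study itself uses AFOSM with SWMM-computed flows, not this sampling scheme.

```python
import math
import random

# Illustrative Monte Carlo check of the first failure mode: capacity < demand.
random.seed(7)

def manning_capacity(n, S, D):
    """Full-flow capacity of a circular conduit by Manning's equation (SI)."""
    A = math.pi * D**2 / 4.0
    R = D / 4.0                      # hydraulic radius of a full circular pipe
    return (1.0 / n) * A * R**(2.0 / 3.0) * math.sqrt(S)

Q_demand = 2.0                       # assumed design discharge [m^3/s]
N = 100_000
failures = 0
for _ in range(N):
    n = random.gauss(0.015, 0.0015)  # roughness coefficient (assumed normal)
    S = random.gauss(0.004, 0.0004)  # bed slope
    D = random.gauss(1.2, 0.02)      # conduit diameter [m]
    if manning_capacity(n, S, D) < Q_demand:
        failures += 1

p_f = failures / N
print(f"P(capacity < demand) ~ {p_f:.4f}, reliability ~ {1 - p_f:.4f}")
```

Even with a mean capacity comfortably above the demand, the spread in roughness and slope produces a substantial failure probability here, which is the kind of result that motivates treating these quantities as random variables rather than deterministic design values.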

Relevance:

20.00%

Publisher:

Abstract:

Monte Carlo simulation methods involving splitting of Markov chains have been used to evaluate multi-fold integrals in different application areas. In this paper we examine the performance of these methods in the context of evaluating reliability integrals, from the point of view of characterizing the sampling fluctuations. The methods discussed include the Au-Beck subset simulation, the Holmes-Diaconis-Ross method, and the generalized splitting algorithm. A few improvisations based on the first order reliability method are suggested to select the algorithmic parameters of the latter two methods. The bias and sampling variance of the alternative estimators are discussed. An approximation to the sampling distribution of some of these estimators is also obtained. Illustrative examples involving component and series system reliability analyses are presented with a view to bringing out the relative merits of the alternative methods. (C) 2015 Elsevier Ltd. All rights reserved.
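The sampling fluctuations that splitting methods are designed to tame are easiest to see in the crude Monte Carlo baseline. The sketch below is that baseline only (not an implementation of subset simulation or the other splitting algorithms named above), on a limit state chosen so the exact answer is known: g(U) = β − U with U standard normal, giving p_f = Φ(−β).

```python
import math
import random

# Crude Monte Carlo for a rare-event probability, with its exact answer and
# the estimator's coefficient of variation for comparison.
random.seed(42)
beta = 3.0
p_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))   # Phi(-beta)

N = 1_000_000
hits = sum(1 for _ in range(N) if random.gauss(0.0, 1.0) > beta)
p_hat = hits / N

# Coefficient of variation of the crude estimator: sqrt((1 - p) / (N * p)).
cov = math.sqrt((1.0 - p_exact) / (N * p_exact))
print(p_exact, p_hat, cov)
```

Because the coefficient of variation scales like 1/√(N·p_f), halving it requires four times the samples, and the cost explodes as p_f shrinks; splitting the rare event into a product of more probable conditional events is precisely how the methods in the abstract avoid this.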

Relevance:

20.00%

Publisher:

Abstract:

Fracture toughness measurements at the small scale have gained prominence over the years due to the continuing miniaturization of structural systems. Measurements carried out on bulk materials cannot be extrapolated to smaller length scales, either due to the complexity of the microstructure or due to size and geometry effects. Many new geometries have been proposed for fracture property measurements at small length scales, depending on the material behaviour and the type of device used in service. In situ testing provides the necessary environment to observe fracture at these length scales, so as to determine the actual failure mechanisms in these systems. In this paper, several improvements are incorporated into a previously proposed geometry of bending a doubly clamped beam for fracture toughness measurements. Both monotonic and cyclic loading conditions have been imposed on the beam to study R-curve and fatigue effects. In addition to the advantages that in situ SEM-based testing offers in such tests, FEM has been used as a simulation tool to replace cumbersome and expensive experiments in optimizing the geometry. A description of all the improvements made to this specific clamped-beam bending geometry, enabling a variety of fracture property measurements, is given in this paper.

Relevance:

20.00%

Publisher:

Abstract:

A rapid and simple chiral derivatization protocol, involving the coupling of 2-formylphenylboronic acid and optically pure [1,1'-binaphthalene]-2,2'-diamine, is introduced for the accurate determination of the enantiopurity of hydroxy acids and their derivatives possessing one or two optically active centers, using ¹H NMR spectroscopy.

Relevance:

20.00%

Publisher:

Abstract:

Granular filters are provided for the safety of water-retaining structures, protecting against piping failure. Piping is triggered when the base soil to be protected starts migrating in the direction of seepage flow under the influence of seepage forces. To protect the base soil from migration, the voids in the filter medium should be small enough, yet not so small as to block the smooth passage of seeping water. Fulfilling these two contradictory design requirements at the same time is a major concern for the successful performance of granular filter media. Since the Terzaghi era, the particle size distribution (PSD) of granular filters has conventionally been designed based on the particle size distribution characteristics of the base soil to be protected. The design approach provides a range of D15f values within which the PSD of the granular filter medium should fall, and there exist infinite possibilities. Further, safety against the two critical design requirements cannot be ensured. Although used successfully for many decades, the existing filter design guidelines are purely empirical in nature, relying on experience and good engineering judgment. In the present study, analytical solutions proposed by the authors for obtaining the factor of safety with respect to base soil particle migration and soil permeability are first discussed. The solution takes into account the basic geotechnical properties of the base soil and filter medium as well as the prevailing hydraulic conditions, and provides a comprehensive approach to granular filter design with the ability to assess stability in terms of a factor of safety. Considering that geotechnical properties are variable in nature, a probabilistic analysis is further suggested to evaluate the system reliability of the filter medium, which may help in risk assessment and risk management for decision making.
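The two contradictory requirements can be made concrete with the classical Terzaghi filter criteria, expressed as simple factor-of-safety-style ratios. This is a back-of-the-envelope check only: the paper's analytical factor-of-safety formulation is more detailed, and the grain sizes below are illustrative assumptions.

```python
# Classical Terzaghi filter criteria as simple ratio checks of the two
# competing requirements (retention vs. permeability). Numbers illustrative.
def terzaghi_filter_check(D15f, d85b, d15b):
    """Retention: D15f <= 4 * d85b (base soil must not migrate).
    Permeability: D15f >= 4 * d15b (filter must pass seepage freely).
    Returns factor-of-safety-style ratios (> 1 means satisfied)."""
    fos_retention = 4.0 * d85b / D15f
    fos_permeability = D15f / (4.0 * d15b)
    return fos_retention, fos_permeability

# Illustrative sizes in mm: filter D15 = 0.9, base soil d85 = 0.4, d15 = 0.06.
ret, perm = terzaghi_filter_check(0.9, 0.4, 0.06)
print(f"retention FoS = {ret:.2f}, permeability FoS = {perm:.2f}")
```

Coarsening the filter raises the permeability ratio but lowers the retention ratio, which is exactly the trade-off the abstract describes; treating the grain sizes as random variables then turns these ratios into the reliability measures suggested for risk assessment.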

Relevance:

20.00%

Publisher:

Abstract:

The field of micro-/nano-mechanics of materials has been driven, on the one hand by the development of ever smaller structures in devices, and, on the other, by the need to map property variations in large systems that are microstructurally graded. Observations of `smaller is stronger' have also brought in questions of accompanying fracture property changes in the materials. In the wake of scattered articles on micro-scale fracture testing of various material classes, this review attempts to provide a holistic picture of the current state of the art. In the process, various reliable micro-scale geometries are shown, challenges with respect to instrumentation to probe ever smaller length scales are discussed and examples from recent literature are put together to exhibit the expanse of unusual fracture response of materials, from ductility in Si to brittleness in Pt. Outstanding issues related to fracture mechanics of small structures are critically examined for plausible solutions.

Relevance:

20.00%

Publisher:

Abstract:

This study considers earthquake shake table testing of bending-torsion coupled structures under multi-component stationary random earthquake excitations. An experimental procedure is proposed to arrive at the optimal excitation cross-power spectral density (PSD) functions which maximize/minimize the steady-state variance of a chosen response variable. These optimal functions are shown to be derivable in terms of a set of system frequency response functions, which can be measured experimentally without postulating an idealized mathematical model for the structure under study. The relationship between these optimized cross-PSD functions and the most favourable/least favourable angle of incidence of seismic waves on the structure is noted. The optimal functions are also shown to be system dependent, mathematically the sharpest, and to correspond neither to fully correlated motions nor to independent motions. The proposed experimental procedure is demonstrated through shake table studies on two laboratory-scale building frame models.
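The steady-state variance objective underlying the procedure can be illustrated numerically in the simplest setting: a single-DOF, unit-mass oscillator under white-noise excitation, where σ² = ∫ |H(ω)|² S(ω) dω has a closed form. The natural frequency, damping ratio and PSD level below are illustrative assumptions, not the tested frame models.

```python
import numpy as np

# Response variance from a frequency response function and an input PSD:
# sigma^2 = integral of |H(w)|^2 * S(w) dw over all w (two-sided PSD, rad/s).
# Single-DOF, unit-mass oscillator: H(w) = 1 / (wn^2 - w^2 + 2j*zeta*wn*w).
wn = 2.0 * np.pi * 5.0     # assumed natural frequency [rad/s]
zeta = 0.05                # assumed damping ratio
S0 = 1e-3                  # assumed constant (white-noise) two-sided PSD

w = np.linspace(-40.0 * wn, 40.0 * wn, 2_000_001)
dw = w[1] - w[0]
H = 1.0 / (wn**2 - w**2 + 2j * zeta * wn * w)
sigma2_num = float(np.sum(np.abs(H) ** 2) * S0 * dw)

# Closed form for this case: pi * S0 / (2 * zeta * wn^3).
sigma2_exact = np.pi * S0 / (2.0 * zeta * wn**3)
print(sigma2_num, sigma2_exact)
```

For multi-component excitation the scalar S(ω) becomes the excitation cross-PSD matrix, and the same variance integral becomes the objective that the experimentally measured frequency response functions allow one to maximize or minimize over admissible cross-PSDs.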