24 results for Risk based Maintenance


Relevance: 30.00%

Publisher:

Abstract:

A load and resistance factor design (LRFD) approach for reinforced soil walls is presented to produce designs with consistent and uniform levels of risk across the whole range of design applications. The evaluation of load and resistance factors for reinforced soil walls based on reliability theory is presented. A first-order reliability method (FORM) is used to determine appropriate ranges for the values of the load and resistance factors. Using the pseudo-static limit equilibrium method, an analysis is conducted to evaluate the external stability of reinforced soil walls subjected to earthquake loading. The potential failure mechanisms considered in the analysis are sliding failure, eccentricity failure of the resultant force (or overturning failure) and bearing capacity failure. The proposed procedure includes the variability associated with the reinforced backfill, retained backfill, foundation soil, horizontal seismic acceleration and surcharge load acting on the wall. Partial factors needed to maintain stability against the three modes of failure while targeting a component reliability index of 3.0 are obtained for various values of the coefficient of variation (COV) of the friction angle of the backfill and foundation soil, the distributed dead-load surcharge, the cohesion of the foundation soil and the horizontal seismic acceleration. A comparative study between LRFD and allowable stress design (ASD) is also presented with a design example. (C) 2014 Elsevier Ltd. All rights reserved.
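
The abstract does not spell out the FORM computation; as a rough, self-contained illustration of what targeting a component reliability index means, the sketch below runs a Hasofer-Lind / Rackwitz-Fiessler iteration for a simple sliding-type limit state with two independent normal variables. The limit state, means and standard deviations are invented for illustration and are not the paper's variables or statistics.

    import numpy as np

    # HL-RF iteration for a simple sliding-type limit state g = R - S
    # with independent normal resistance R and load effect S.
    # All numbers are illustrative, not the paper's statistics.

    mu = np.array([1.5, 1.0])       # [resistance, load effect] means
    sigma = np.array([0.15, 0.20])  # standard deviations

    def g(x):
        # Failure occurs when resistance minus load effect drops below zero
        return x[0] - x[1]

    def grad_g(x):
        return np.array([1.0, -1.0])

    u = np.zeros(2)                  # start at the mean point in standard normal space
    for _ in range(20):
        x = mu + sigma * u           # map back to physical space
        grad_u = grad_g(x) * sigma   # chain rule: dg/du = dg/dx * dx/du
        u = (grad_u @ u - g(x)) / (grad_u @ grad_u) * grad_u  # HL-RF update

    beta = np.linalg.norm(u)         # reliability index = distance to the design point
    print(f"reliability index beta ~ {beta:.2f}")  # the paper targets 3.0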

Relevance: 30.00%

Publisher:

Abstract:

This article describes a new performance-based approach for evaluating the return period of seismic soil liquefaction based on standard penetration test (SPT) and cone penetration test (CPT) data. Conventional liquefaction evaluation methods consider a single acceleration level and magnitude, and thus fail to take into account the uncertainty in earthquake loading. Probabilistic seismic hazard analysis clearly shows that a particular acceleration value receives contributions from earthquakes of different magnitudes with varying probability. In the new method presented in this article, the entire range of ground shaking and the entire range of earthquake magnitude are considered, and the liquefaction return period is evaluated from the SPT and CPT data. The article explains the performance-based methodology for liquefaction analysis, starting from probabilistic seismic hazard analysis (PSHA) for the evaluation of seismic hazard and proceeding to the performance-based evaluation of the liquefaction return period. A case study has been carried out for Bangalore, India, based on SPT data and converted CPT values, and a comparison of the results obtained from the two methods is presented. In an area of 220 km² in Bangalore city, the site class was assessed based on a large number of borehole records and 58 multichannel analysis of surface waves surveys. Using the site class and the peak acceleration at rock depth from PSHA, the peak ground acceleration at the ground surface was estimated using a probabilistic approach. The liquefaction analysis was carried out using 450 borehole records obtained in the study area. The results of the CPT-based analysis match well with those obtained from the corresponding analysis of the SPT data.
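
For readers unfamiliar with the performance-based framing, the following sketch shows the core bookkeeping: summing the probability of liquefaction over every (acceleration, magnitude) bin of a deaggregated hazard rather than using a single design event. The hazard rates, the simplified cyclic stress ratio, the CRR value and the lognormal fragility are placeholder assumptions, not the Bangalore data or the specific models used in the article.

    import numpy as np
    from scipy.stats import norm

    # Performance-based bookkeeping: sum P[liquefaction | a, M] over all
    # deaggregated hazard bins instead of using one design event.

    a_bins = np.array([0.05, 0.10, 0.15, 0.20, 0.30])     # PGA bins (g)
    m_bins = np.array([5.0, 5.5, 6.0, 6.5])               # magnitude bins
    d_lambda = np.full((len(a_bins), len(m_bins)), 1e-4)  # assumed annual rate per (a, M) bin

    def p_liq_given_a_m(a, m, crr_7p5=0.18, sigma_ln=0.3):
        """Probability of liquefaction for one (PGA, magnitude) pair, using a
        generic CSR vs. magnitude-scaled CRR comparison with lognormal scatter."""
        csr = 0.65 * a                   # simplified cyclic stress ratio (depth terms folded in)
        msf = (m / 7.5) ** -2.56         # one common magnitude scaling factor form
        return norm.cdf(np.log(csr / (crr_7p5 * msf)) / sigma_ln)

    # Total annual rate of liquefaction and the corresponding return period
    rate = sum(p_liq_given_a_m(a, m) * d_lambda[i, j]
               for i, a in enumerate(a_bins)
               for j, m in enumerate(m_bins))
    print(f"annual rate ~ {rate:.2e}, return period ~ {1.0 / rate:.0f} years")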

Relevance: 30.00%

Publisher:

Abstract:

The nucleoid-associated protein HU plays an important role in maintenance of chromosomal architecture and in global regulation of DNA transactions in bacteria. Although HU is essential for growth in Mycobacterium tuberculosis (Mtb), there have been no reported attempts to perturb HU function with small molecules. Here we report the crystal structure of the N-terminal domain of HU from Mtb. We identify a core region within the HU-DNA interface that can be targeted using stilbene derivatives. These small molecules specifically inhibit HU-DNA binding, disrupt nucleoid architecture and reduce Mtb growth. The stilbene inhibitors induce gene expression changes in Mtb that resemble those induced by HU deficiency. Our results indicate that HU is a potential target for the development of therapies against tuberculosis.

Relevance: 30.00%

Publisher:

Abstract:

Cancer is a complex disease that arises from a series of genetic changes affecting cell division and growth control. Cancer remains the second leading cause of death in humans, after heart disease. As a testimony to our progress in understanding the biology of cancer and to developments in cancer diagnosis and treatment, the overall median survival time across all cancers has increased six-fold, from one year to six years, during the last four decades. However, while the median survival time has increased dramatically for some cancers, such as breast and colon, there has been little change for others, such as pancreas and brain. Further, not all patients with a given type of tumour respond to the standard treatment. This differential response is due to genetic heterogeneity, which exists not only between tumours (intertumour heterogeneity) but also within individual tumours (intratumoural heterogeneity). It therefore becomes essential to personalize cancer treatment based on the specific genetic changes in a given tumour. It is also possible to stratify cancer patients into low- and high-risk groups based on expression changes or alterations in groups of genes (gene signatures) and to choose a more suitable mode of therapy. Each tumour can now be analysed using various high-throughput methods, such as gene expression profiling and next-generation sequencing, to identify a unique fingerprint on which a personalized or tailor-made therapy can be based. Here, we review the important progress made in recent years towards personalizing cancer treatment through the use of gene signatures.

Relevance: 30.00%

Publisher:

Abstract:

We address the problem of designing an optimal pointwise shrinkage estimator in the transform domain, based on the minimum probability of error (MPE) criterion. We assume an additive model for the noise corrupting the clean signal. The proposed formulation is general in the sense that it can handle various noise distributions. We consider several noise distributions (Gaussian, Student's-t, and Laplacian) and compare the denoising performance of the proposed estimator with that of mean-squared error (MSE)-based estimators. The MSE optimization is carried out using an unbiased estimator of the MSE, namely Stein's unbiased risk estimate (SURE). Experimental results show that the MPE estimator outperforms the SURE estimator in terms of the SNR of the denoised output, for low (0-10 dB) and medium (10-20 dB) values of the input SNR.
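
The MPE criterion itself is specific to the paper and is not reproduced here; as a point of reference for the comparison, the sketch below shows the SURE side: selecting a soft-threshold shrinkage parameter by minimizing Stein's unbiased risk estimate under additive Gaussian noise. The sparse test signal, noise level and threshold grid are made-up values.

    import numpy as np

    # Soft-threshold shrinkage with the threshold chosen by minimizing SURE.

    rng = np.random.default_rng(0)
    sigma = 0.5
    clean = np.concatenate([np.zeros(900), rng.normal(0.0, 3.0, 100)])  # sparse test signal
    noisy = clean + rng.normal(0.0, sigma, clean.size)

    def sure_soft(t, x, sigma):
        """SURE for soft thresholding at threshold t (Donoho-Johnstone form)."""
        n = x.size
        return (n * sigma**2
                - 2 * sigma**2 * np.sum(np.abs(x) <= t)
                + np.sum(np.minimum(np.abs(x), t) ** 2))

    thresholds = np.linspace(0.0, 5 * sigma, 100)
    t_opt = thresholds[np.argmin([sure_soft(t, noisy, sigma) for t in thresholds])]
    denoised = np.sign(noisy) * np.maximum(np.abs(noisy) - t_opt, 0.0)

    mse = np.mean((denoised - clean) ** 2)
    print(f"SURE-optimal threshold {t_opt:.2f}, empirical MSE {mse:.3f}")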

Relevance: 30.00%

Publisher:

Abstract:

The coupling of endocytosis and exocytosis underlies fundamental biological processes ranging from fertilization to neuronal activity and cellular polarity. However, the mechanisms governing the spatial organization of endocytosis and exocytosis require clarification. Using a quantitative imaging-based screen in budding yeast, we identified 89 mutants displaying defects in the localization of either one or both pathways. High-resolution single-vesicle tracking revealed that the endocytic and exocytic mutants she4Δ and bud6Δ alter post-Golgi vesicle dynamics in opposite ways. The endocytic and exocytic pathways display strong interdependence during polarity establishment, while being more independent during polarity maintenance. Systems analysis identified the exocyst complex as a key network hub, rich in genetic interactions with endocytic and exocytic components. Exocyst mutants displayed altered endocytic and post-Golgi vesicle dynamics and interspersed endocytic and exocytic domains compared with control cells. These data are consistent with an important role for the exocyst in coordinating endocytosis and exocytosis.

Relevance: 30.00%

Publisher:

Abstract:

The effect of multiplicative noise on a signal is far more severe than that of additive noise. In this paper, we address the problem of suppressing multiplicative noise in one-dimensional signals. To deal with signals corrupted by multiplicative noise, we propose a denoising algorithm based on the minimization of an unbiased estimator (MURE) of the mean-square error (MSE). We derive an expression for an unbiased estimate of the MSE. The proposed denoising is carried out in the wavelet domain (soft thresholding) by considering the time-domain MURE. The parameters of the thresholding function are obtained by minimizing the unbiased estimator MURE. We show that the parameters obtained by minimizing MURE are very close to the optimal parameters obtained with the oracle MSE. Experiments show that the SNR improvement of the proposed denoising algorithm is competitive with that of a state-of-the-art method.
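
The paper's MURE expression is not reproduced here; the sketch below only illustrates the surrounding machinery, i.e. wavelet-domain soft thresholding of a signal with mean-one multiplicative noise, where the threshold is picked by minimizing the oracle MSE, the quantity MURE is designed to stand in for when the clean signal is unavailable. The signal, noise model and wavelet settings are illustrative assumptions.

    import numpy as np
    import pywt

    # Wavelet-domain soft thresholding of a signal with multiplicative noise;
    # oracle MSE is used here only to select the threshold for illustration.

    rng = np.random.default_rng(1)
    n = 1024
    t = np.linspace(0.0, 1.0, n)
    clean = np.sin(2 * np.pi * 5 * t) + 2.0                   # strictly positive signal
    noisy = clean * rng.gamma(shape=10.0, scale=0.1, size=n)  # mean-one multiplicative noise

    coeffs = pywt.wavedec(noisy, "db4", level=4)

    def denoise(threshold):
        thr = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(thr, "db4")[:n]

    # Oracle selection over a grid of candidate thresholds (illustration only)
    grid = np.linspace(0.0, 2.0, 80)
    best = min(grid, key=lambda th: np.mean((denoise(th) - clean) ** 2))
    print(f"oracle-optimal soft threshold ~ {best:.3f}")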

Relevance: 30.00%

Publisher:

Abstract:

We propose optimal bilateral filtering techniques for Gaussian noise suppression in images. To achieve maximum denoising performance via optimal filter parameter selection, we adopt Stein's unbiased risk estimate (SURE), an unbiased estimate of the mean-squared error (MSE). Unlike the MSE, SURE is independent of the ground truth and can be used in practical scenarios where the ground truth is unavailable. In our recent work, we derived SURE expressions in the context of the bilateral filter and proposed the SURE-optimal bilateral filter (SOBF), whose optimal parameters are selected using the SURE criterion. To further improve the denoising performance of SOBF, we propose two variants: the SURE-optimal multiresolution bilateral filter (SMBF), which performs optimal bilateral filtering in a wavelet framework, and the SURE-optimal patch-based bilateral filter (SPBF), in which the bilateral filter parameters are optimized on small image patches. Using SURE enables automated parameter selection. The multiresolution and localized denoising in SMBF and SPBF, respectively, yield superior denoising performance compared with the globally optimal SOBF. Experimental validations and comparisons show that the proposed denoisers perform on par with some state-of-the-art denoising techniques. (C) 2015 SPIE and IS&T
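
The closed-form SURE expressions for the bilateral filter are the paper's contribution and are not reproduced here; the sketch below only illustrates the general idea of SURE-guided parameter selection, using a Gaussian filter as a stand-in denoiser and a Monte-Carlo estimate of the divergence term. The synthetic image, noise level and parameter grid are placeholders.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # SURE-guided parameter selection for a stand-in denoiser, with the
    # divergence term estimated by a single random perturbation.

    rng = np.random.default_rng(2)
    sigma_n = 20.0
    clean = np.tile(np.linspace(0.0, 255.0, 128), (128, 1))   # synthetic ramp image
    noisy = clean + rng.normal(0.0, sigma_n, clean.shape)

    def sure(denoiser, y, sigma, eps=1.0):
        """SURE = ||f(y)-y||^2/N - sigma^2 + 2*sigma^2*div(f)/N, with div(f)
        estimated from one +/-1 perturbation."""
        n = y.size
        fy = denoiser(y)
        b = rng.choice([-1.0, 1.0], size=y.shape)
        div = np.sum(b * (denoiser(y + eps * b) - fy)) / eps
        return np.sum((fy - y) ** 2) / n - sigma**2 + 2 * sigma**2 * div / n

    # Pick the smoothing width that minimizes the ground-truth-free SURE score
    grid = [0.5, 1.0, 1.5, 2.0, 3.0]
    scores = [sure(lambda y, s=s: gaussian_filter(y, s), noisy, sigma_n) for s in grid]
    print(f"SURE-selected smoothing sigma ~ {grid[int(np.argmin(scores))]}")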

Relevance: 30.00%

Publisher:

Abstract:

In this paper, a methodology for reducing the operational maintenance cost of composite structures using structural health monitoring (SHM) systems is addressed. Based on real-time SHM data, in-service lifetime prognostics and remaining useful lifetime (RUL) estimation can be performed, so that the maintenance timetable can be predicted by optimizing inspection times. A probabilistic approach is combined with phenomenological fatigue damage models for composite materials to assess the maintenance cost-effectiveness of composite structures. A Monte Carlo method is used to estimate the probability of failure of composite structures and to compute the average number of composite structure components to be replaced over the component lifetime. The replacement frequency of a given structural component over the aircraft lifetime is assessed. A first application to aeronautical composite structure maintenance is considered, using two composite fatigue-life prediction models and several laminates. Our study shows that maintenance cost-effectiveness depends on the material and the applied fatigue loading.
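
As a minimal sketch of the Monte Carlo step described above, the code below draws fatigue lives from an assumed scatter distribution, estimates the probability that a component fails before its first scheduled inspection, and computes the average number of replacements over the service life. The lognormal life distribution, inspection interval and service life are illustrative assumptions, not the paper's fatigue models.

    import numpy as np

    # Monte Carlo estimate of failure probability before the first inspection
    # and of the average number of replacements over the service life.

    rng = np.random.default_rng(3)
    n_samples = 200_000
    service_life = 60_000.0         # flight hours
    inspection_interval = 6_000.0   # flight hours

    # Assumed lognormal fatigue life: median 40,000 h, log-standard deviation 0.5
    fatigue_life = rng.lognormal(mean=np.log(40_000.0), sigma=0.5, size=n_samples)

    p_fail = np.mean(fatigue_life < inspection_interval)
    # Replacements per component, assuming each replacement behaves like the original
    replacements = np.floor(service_life / fatigue_life)

    print(f"P(failure before first inspection) ~ {p_fail:.2e}")
    print(f"average replacements over the service life ~ {replacements.mean():.2f}")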