894 results for Risk based Maintenance


Relevance:

30.00%

Publisher:

Abstract:

This article describes a new performance-based approach for evaluating the return period of seismic soil liquefaction from standard penetration test (SPT) and cone penetration test (CPT) data. Conventional liquefaction evaluation methods consider a single acceleration level and magnitude, and thus fail to account for the uncertainty in earthquake loading. Probabilistic seismic hazard analysis shows that a given acceleration value receives contributions from earthquakes of different magnitudes, each with its own probability. In the new method presented here, the entire ranges of ground shaking and earthquake magnitude are considered, and the liquefaction return period is evaluated from the SPT and CPT data. The article explains the methodology step by step, from probabilistic seismic hazard analysis (PSHA) for the evaluation of seismic hazard to the performance-based evaluation of the liquefaction return period. A case study was carried out for Bangalore, India, using SPT data and converted CPT values, and the results of the two methods are compared. Over an area of 220 km² in Bangalore city, the site class was assessed from a large number of borehole records and 58 multichannel analysis of surface waves (MASW) surveys. Using the site class and the peak acceleration at rock depth from PSHA, the peak ground acceleration at the ground surface was estimated with a probabilistic approach. The liquefaction analysis was based on 450 borehole records from the study area; the CPT results match well with those of the equivalent SPT-based analysis.
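
The core of the performance-based approach is an integration of the liquefaction probability over the full hazard deaggregation rather than over a single design event. A minimal Python sketch of that integration follows; the deaggregation rates, the CRR value, and the logistic mapping from factor of safety to liquefaction probability are illustrative stand-ins, not the article's calibrated models.

```python
import numpy as np

# Hypothetical PSHA deaggregation: annual rate of shaking in each (PGA, magnitude) bin.
pga_bins = np.array([0.05, 0.10, 0.15, 0.20, 0.30])   # g
mag_bins = np.array([5.5, 6.0, 6.5, 7.0])
rate = 1e-3 * np.array([[40, 25, 12, 5],
                        [18, 12,  7, 3],
                        [ 7,  6,  4, 2],
                        [ 3,  3,  2, 1],
                        [ 1,  1,  1, 0.5]])           # illustrative numbers only

def p_liq(pga, mag, crr75=0.12):
    """P(liquefaction | a, m) for one soil layer, simplified from SPT-based
    procedures: cyclic stress ratio vs. magnitude-scaled cyclic resistance."""
    msf = (mag / 7.5) ** -2.56        # magnitude scaling factor (Youd-Idriss form)
    csr = 0.65 * pga                  # unit total/effective stress ratio assumed
    fs = crr75 * msf / csr            # factor of safety against liquefaction
    return 1.0 / (1.0 + np.exp(8.0 * (fs - 1.0)))   # smooth FS-to-probability map

# Annual rate of liquefaction = sum over all hazard bins; return period = 1 / rate.
lam = sum(rate[i, j] * p_liq(a, m)
          for i, a in enumerate(pga_bins)
          for j, m in enumerate(mag_bins))
print(f"liquefaction return period ~ {1.0 / lam:.0f} years")
```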

Relevance:

30.00%

Publisher:

Abstract:

The nucleoid-associated protein HU plays an important role in maintenance of chromosomal architecture and in global regulation of DNA transactions in bacteria. Although HU is essential for growth in Mycobacterium tuberculosis (Mtb), there have been no reported attempts to perturb HU function with small molecules. Here we report the crystal structure of the N-terminal domain of HU from Mtb. We identify a core region within the HU-DNA interface that can be targeted using stilbene derivatives. These small molecules specifically inhibit HU-DNA binding, disrupt nucleoid architecture and reduce Mtb growth. The stilbene inhibitors induce gene expression changes in Mtb that resemble those induced by HU deficiency. Our results indicate that HU is a potential target for the development of therapies against tuberculosis.

Relevance:

30.00%

Publisher:

Abstract:

Cancer is a complex disease that arises from a series of genetic changes affecting cell division and growth control. It remains the second leading cause of death in humans, after heart disease. As a testimony to our progress in understanding the biology of cancer and to developments in diagnosis and treatment, the overall median survival time across all cancers has increased six-fold, from one year to six years, during the last four decades. However, while median survival has increased dramatically for some cancers, such as breast and colon, there has been little change for others, such as pancreatic and brain cancers. Further, not all patients with a single type of tumour respond to the standard treatment. This differential response is due to genetic heterogeneity, which exists not only between tumours (intertumour heterogeneity) but also within individual tumours (intratumoural heterogeneity). It therefore becomes essential to personalize cancer treatment based on the specific genetic changes in a given tumour. It is also possible to stratify cancer patients into low- and high-risk groups based on expression changes or alterations in a group of genes (gene signatures) and to choose a more suitable mode of therapy. Each tumour can now be analysed using high-throughput methods such as gene expression profiling and next-generation sequencing to identify a unique fingerprint, based on which a personalized or tailor-made therapy can be developed. Here, we review the important progress made in recent years towards personalizing cancer treatment with the use of gene signatures.

Relevance:

30.00%

Publisher:

Abstract:

We address the problem of designing an optimal pointwise shrinkage estimator in the transform domain based on the minimum probability of error (MPE) criterion. We assume an additive model for the noise corrupting the clean signal. The formulation is general in that it can handle various noise distributions; we consider Gaussian, Student's-t, and Laplacian noise and compare the denoising performance of the resulting estimator with that of mean-squared error (MSE)-based estimators. The MSE optimization is carried out using an unbiased estimator of the MSE, namely Stein's Unbiased Risk Estimate (SURE). Experimental results show that the MPE estimator outperforms the SURE estimator in terms of the SNR of the denoised output for low (0-10 dB) and medium (10-20 dB) values of the input SNR.
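
The MPE criterion itself is not reproduced here, but the SURE baseline the comparison rests on is compact enough to sketch. The following Python snippet implements the closed-form Donoho-Johnstone SURE for pointwise soft-threshold shrinkage under additive Gaussian noise and picks the threshold by grid search; the sparse test signal and the grid are illustrative.

```python
import numpy as np

def soft(x, t):
    """Pointwise soft-threshold shrinkage."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sure_soft(x, t, sigma):
    """Stein's Unbiased Risk Estimate of the MSE of soft thresholding
    when x = theta + N(0, sigma^2) noise (Donoho-Johnstone form)."""
    return (x.size * sigma**2
            - 2.0 * sigma**2 * np.sum(np.abs(x) <= t)
            + np.sum(np.minimum(np.abs(x), t) ** 2))

rng = np.random.default_rng(0)
theta = np.concatenate([5.0 * rng.standard_normal(50), np.zeros(950)])  # sparse signal
x = theta + rng.standard_normal(theta.size)                             # sigma = 1

grid = np.linspace(0.0, 5.0, 101)
t_sure = grid[np.argmin([sure_soft(x, t, 1.0) for t in grid])]
t_orac = grid[np.argmin([np.sum((soft(x, t) - theta) ** 2) for t in grid])]
print(f"SURE-optimal threshold {t_sure:.2f} vs oracle {t_orac:.2f}")
```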

Relevance:

30.00%

Publisher:

Abstract:

The coupling of endocytosis and exocytosis underlies fundamental biological processes ranging from fertilization to neuronal activity and cellular polarity. However, the mechanisms governing the spatial organization of endocytosis and exocytosis require clarification. Using a quantitative imaging-based screen in budding yeast, we identified 89 mutants displaying defects in the localization of either one or both pathways. High-resolution single-vesicle tracking revealed that the endocytic and exocytic mutants she4 Delta and bud6 Delta alter post-Golgi vesicle dynamics in opposite ways. The endocytic and exocytic pathways display strong interdependence during polarity establishment while being more independent during polarity maintenance. Systems analysis identified the exocyst complex as a key network hub, rich in genetic interactions with endocytic and exocytic components. Exocyst mutants displayed altered endocytic and post-Golgi vesicle dynamics and interspersed endocytic and exocytic domains compared with control cells. These data are consistent with an important role for the exocyst in coordinating endocytosis and exocytosis.

Relevance:

30.00%

Publisher:

Abstract:

Multiplicative noise degrades a signal far more severely than additive noise. In this paper, we address the problem of suppressing multiplicative noise in one-dimensional signals. We propose a denoising algorithm based on minimizing an unbiased estimator (MURE) of the mean-squared error (MSE), and we derive an expression for this unbiased MSE estimate. The denoising is carried out in the wavelet domain via soft thresholding, with MURE evaluated in the time domain. The parameters of the thresholding function are obtained by minimizing MURE, and we show that they are very close to the optimal parameters obtained with the oracle MSE. Experiments show that the SNR improvement of the proposed algorithm is competitive with a state-of-the-art method.
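
The MURE expression is specific to the noise model and is not reproduced here; what can be sketched safely is the surrounding protocol the abstract describes: corrupt a signal with multiplicative noise, soft-threshold its wavelet coefficients, and scan the threshold against the oracle MSE that MURE is designed to approximate without access to the clean signal. The one-level Haar transform, test signal, and parameters below are illustrative.

```python
import numpy as np

def haar(x):
    """One-level orthonormal Haar DWT (length of x must be even)."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def ihaar(a, d):
    """Inverse of haar()."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(1)
n = 1024
clean = 1.5 + np.sin(2 * np.pi * np.arange(n) / n)        # positive test signal
noisy = clean * (1.0 + 0.3 * rng.standard_normal(n))      # multiplicative noise

a, d = haar(noisy)
grid = np.linspace(0.0, 1.0, 51)
mse = [np.mean((ihaar(a, soft(d, t)) - clean) ** 2) for t in grid]
print(f"oracle-optimal threshold: {grid[int(np.argmin(mse))]:.2f}")
```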

Relevance:

30.00%

Publisher:

Abstract:

We propose optimal bilateral filtering techniques for Gaussian noise suppression in images. To achieve maximum denoising performance via optimal filter parameter selection, we adopt Stein's unbiased risk estimate (SURE), an unbiased estimate of the mean-squared error (MSE). Unlike the MSE, SURE is independent of the ground truth and can be used in practical scenarios where the ground truth is unavailable. In our recent work, we derived SURE expressions for the bilateral filter and proposed the SURE-optimal bilateral filter (SOBF), whose optimal parameters are selected using the SURE criterion. To further improve the denoising performance of SOBF, we propose two variants: the SURE-optimal multiresolution bilateral filter (SMBF), which performs optimal bilateral filtering in a wavelet framework, and the SURE-optimal patch-based bilateral filter (SPBF), in which the bilateral filter parameters are optimized on small image patches. Using SURE guarantees automated parameter selection. The multiresolution and localized denoising in SMBF and SPBF, respectively, yield superior denoising performance compared with the globally optimal SOBF. Experimental validations and comparisons show that the proposed denoisers perform on par with some state-of-the-art denoising techniques.
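
The closed-form SURE expressions the authors derive are specific to the bilateral filter, but the general idea, tuning a denoiser's parameters by minimizing a ground-truth-free MSE estimate, can be sketched with Monte-Carlo SURE (Ramani et al.), which needs only black-box access to the denoiser. In the sketch below a Gaussian filter stands in for the bilateral filter, and the image and parameters are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mc_sure(denoise, y, sigma, eps=1e-2, rng=None):
    """Monte-Carlo SURE: unbiased MSE estimate from the noisy image alone,
    with the divergence term estimated by a random perturbation."""
    rng = rng or np.random.default_rng(0)
    b = rng.standard_normal(y.shape)
    fy = denoise(y)
    div = np.sum(b * (denoise(y + eps * b) - fy)) / eps
    return np.mean((fy - y) ** 2) - sigma**2 + 2.0 * sigma**2 * div / y.size

rng = np.random.default_rng(0)
clean = np.kron(rng.random((8, 8)), np.ones((16, 16)))    # piecewise-constant image
sigma = 0.1
y = clean + sigma * rng.standard_normal(clean.shape)

widths = np.linspace(0.5, 3.0, 11)
scores = [mc_sure(lambda z, s=s: gaussian_filter(z, s), y, sigma) for s in widths]
print(f"SURE-selected smoothing width: {widths[int(np.argmin(scores))]:.2f}")
```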

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a methodology for reducing the operational cost of composite-structure maintenance using structural health monitoring (SHM) systems is addressed. Based on real-time SHM data, in-service lifetime prognosis and remaining useful life (RUL) estimation can be performed, so the maintenance timetable can be predicted by optimizing inspection times. A probabilistic approach is combined with phenomenological fatigue damage models for composite materials to assess the cost-effectiveness of composite-structure maintenance. A Monte Carlo method is used to estimate the probability of failure of composite structures and to compute the average number of components to be replaced over the component lifetime; the replacement frequency of a given structural component over the aircraft lifetime is thereby assessed. A first application to aeronautical composite-structure maintenance is considered, using two composite fatigue-life prediction models and several laminates. Our study shows that maintenance cost-effectiveness depends on the material and the applied fatigue loading.
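
As a rough illustration of the Monte Carlo step described above, the sketch below samples fatigue lives from an assumed lognormal distribution to estimate a component's probability of failure before its design life, and the average number of replacements over that life; the distribution and all numbers are hypothetical, not the paper's calibrated fatigue models.

```python
import numpy as np

rng = np.random.default_rng(0)
DESIGN_LIFE = 60_000            # load cycles over the aircraft lifetime (hypothetical)
MED_LIFE, SIGMA_LN = 80_000, 0.35   # assumed lognormal fatigue-life parameters

def sample_life(size=None):
    return rng.lognormal(np.log(MED_LIFE), SIGMA_LN, size)

# Probability that a component fails before the design life.
p_fail = np.mean(sample_life(100_000) < DESIGN_LIFE)

# Average number of replacements: count renewals until cumulative life
# exceeds the design life (each failed component is replaced by a new one).
def n_replacements():
    t, k = sample_life(), 0
    while t < DESIGN_LIFE:
        t += sample_life()
        k += 1
    return k

avg_repl = np.mean([n_replacements() for _ in range(20_000)])
print(f"P(failure before design life) = {p_fail:.3f}, avg replacements = {avg_repl:.2f}")
```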

Relevance:

30.00%

Publisher:

Abstract:

Coal-fired power plants may enjoy a significant advantage over gas plants in terms of cheaper fuel cost. Still, this advantage may erode, or even turn into a disadvantage, depending on the CO2 emission allowance price, which will presumably rise both in the Kyoto Protocol commitment period (2008-2012) and in the first post-Kyoto years. Thus, in a carbon-constrained environment, coal plants face financial risks in their profit margins, which hinge on their so-called "clean dark spread". These risks are further reinforced when the price of the output electricity is determined by the marginal costs of natural gas-fired plants, which differ from those of coal plants. We aim to assess the risks in coal plants' margins, adopting parameter values estimated from empirical data drawn from the natural gas and electricity markets alongside the EU ETS market where emission allowances are traded. Monte Carlo simulation allows us to compute the expected value and risk profile of coal-based electricity generation. We focus on the clean dark spread in both time periods under different future scenarios in the allowance market; specifically, the bottom 5% and 10% percentiles are derived. According to our results, certain future paths of the allowance price may impose significant risks on the clean dark spread obtained by coal plants.
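
A stylized version of the simulation is sketched below: electricity and allowance prices are drawn from correlated lognormal (GBM-type) one-year-ahead distributions, the clean dark spread is computed per path, and its lower percentiles are read off. All prices, volatilities, correlations, and the emission factor are illustrative, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
p0  = np.array([55.0, 15.0])          # spot electricity (EUR/MWh), CO2 allowance (EUR/t)
mu  = np.array([0.02, 0.08])          # drifts
sig = np.array([0.25, 0.45])          # volatilities
rho = 0.4                             # electricity-allowance correlation
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

z = rng.standard_normal((n, 2)) @ L.T
prices = p0 * np.exp((mu - 0.5 * sig**2) + sig * z)   # one-year-ahead GBM draws

COAL_COST = 20.0      # fuel cost per MWh of electricity (hypothetical)
EMISSIONS = 0.95      # tCO2 emitted per MWh of electricity (hypothetical)
cds = prices[:, 0] - COAL_COST - EMISSIONS * prices[:, 1]

print(f"expected clean dark spread: {cds.mean():.2f} EUR/MWh")
print("5% / 10% percentiles:", np.round(np.percentile(cds, [5, 10]), 2))
```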

Relevance:

30.00%

Publisher:

Abstract:

Also published as: Documento de Trabajo Banco de España 0504/2005.

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this paper is to analyse the value of the information contained in prices of options on the IBEX 35 index at the Spanish Stock Exchange Market. The forward-looking information is extracted using implied risk-neutral density functions estimated with a mixture of two lognormals and three alternative risk adjustments: the classic power and exponential utility functions and a habit-based specification that allows for counter-cyclical variation in risk aversion. Our results show that, for the period between October 1996 and March 2000, we can reject the hypothesis that the risk-neutral densities provide accurate predictions of the distribution of future realisations of the IBEX 35 index at a four-week horizon. When forecasting with risk-adjusted densities, performance over this period improves statistically, and we no longer reject that hypothesis. All risk-adjusted densities generate similar forecasting statistics, so, at least at the four-week horizon, the particular risk adjustment does not seem to be the issue. By contrast, at the one-week horizon, risk-adjusted densities do not improve the forecasting ability of their risk-neutral counterparts.
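
A compact sketch of the density machinery follows: the risk-neutral density is a two-lognormal mixture, and a power-utility pricing kernel m(S) proportional to S^(-gamma) tilts it into a real-world density p(S) proportional to q(S) * S^gamma. All mixture parameters and the risk-aversion coefficient are illustrative; the paper estimates them from option prices.

```python
import numpy as np
from scipy.stats import lognorm

# Risk-neutral density: mixture of two lognormals (parameters hypothetical).
w = 0.7
m1, s1 = np.log(9000.0), 0.10
m2, s2 = np.log(8200.0), 0.20
S = np.linspace(6000.0, 12000.0, 2001)
dS = S[1] - S[0]
q = (w * lognorm.pdf(S, s1, scale=np.exp(m1))
     + (1 - w) * lognorm.pdf(S, s2, scale=np.exp(m2)))

# Power-utility risk adjustment: pricing kernel m(S) ~ S^(-gamma), so the
# real-world density is p(S) ~ q(S) * S^gamma, renormalized to integrate to 1.
gamma = 3.0                        # relative risk aversion (illustrative)
p = q * S**gamma
p /= np.sum(p) * dS                # Riemann-sum renormalization

print(f"risk-neutral mean: {np.sum(S * q) * dS:.0f}, "
      f"risk-adjusted mean: {np.sum(S * p) * dS:.0f}")
```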

Relevance:

30.00%

Publisher:

Abstract:

The requirement of setting annual catch limits to prevent overfishing has been added to the Magnuson-Stevens Fishery Conservation and Management Reauthorization Act of 2006 (MSRA). Because this requirement is new, a body of applied scientific practice for deriving annual catch limits and accompanying targets does not yet exist. This article demonstrates an approach to setting levels of catch that is intended to keep the probability of future overfishing at a preset low level. The proposed framework is based on stochastic projection with uncertainty in population dynamics. The framework extends common projection methodology by including uncertainty in the limit reference point and in management implementation, and by making explicit the risk of overfishing that managers consider acceptable. The approach is illustrated with application to gag (Mycteroperca microlepis), a grouper that inhabits the waters off the southeastern United States. Although devised to satisfy new legislation of the MSRA, the framework has potential application to any fishery where the management goal is to limit the risk of overfishing by controlling catch.
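
A toy version of the projection framework is sketched below: a surplus-production population is projected under a constant catch, with uncertainty injected into initial biomass, productivity, the limit reference point, and management implementation, and the annual catch limit is the largest catch whose simulated probability of overfishing stays at or below the managers' acceptable risk. The dynamics and all numbers are hypothetical, far simpler than the gag assessment model.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_overfishing(catch, n_sim=5_000, years=10):
    """Fraction of stochastic projections in which fishing mortality
    (catch / biomass) exceeds the (uncertain) limit reference point."""
    r, K, f_lim = 0.4, 1000.0, 0.25            # hypothetical stock parameters
    count = 0
    for _ in range(n_sim):
        b = rng.uniform(300.0, 600.0)          # uncertain initial biomass
        r_i = r * rng.lognormal(0.0, 0.2)      # uncertain productivity
        lim = f_lim * rng.lognormal(0.0, 0.1)  # uncertainty in the limit reference point
        c = catch * rng.lognormal(0.0, 0.1)    # management implementation error
        over = False
        for _ in range(years):
            if c / b > lim:
                over = True
            b = max(b + r_i * b * (1.0 - b / K) - c, 1e-3)  # Schaefer dynamics
        count += over
    return count / n_sim

RISK = 0.25                                    # acceptable probability of overfishing
catches = np.arange(10, 201, 10)
ok = [c for c in catches if p_overfishing(c) <= RISK]
print(f"annual catch limit ~ {max(ok) if ok else 'none'}")
```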

Relevance:

30.00%

Publisher:

Abstract:

CD6 has recently been identified and validated as a risk gene for multiple sclerosis (MS), based on the association of a single-nucleotide polymorphism (SNP), rs17824933, located in intron 1. CD6 is a cell-surface scavenger receptor involved in T-cell activation and proliferation, as well as in thymocyte differentiation. In this study, we performed a haplotype-tagging SNP screen of the CD6 gene locus using thirteen tagging SNPs, of which three were non-synonymous, and replicated the recently reported GWAS SNP rs650258 in a Spanish-Basque collection of 814 controls and 823 cases. Validation of the six most strongly associated SNPs was performed in an independent collection of 2265 MS patients and 2600 healthy controls. We identified association of haplotypes composed of two non-synonymous SNPs [rs11230563 (R225W) and rs2074225 (A257V)] in the 2nd SRCR domain with susceptibility to MS (max(T) permutation P = 1.6 x 10^-4). The effect of these haplotypes on CD6 surface expression and cytokine secretion was also tested. The analysis showed significantly different CD6 expression patterns in distinct cell subsets: CD4+ naïve cells, P = 0.0001; CD8+ naïve cells, P < 0.0001; CD4+ and CD8+ central memory cells, P = 0.01 and 0.05, respectively; and natural killer T (NKT) cells, P = 0.02; with the protective haplotype (RA) showing higher expression of CD6. However, no significant changes were observed in natural killer (NK) cells, effector memory cells, or terminally differentiated effector memory T cells. Our findings reveal that this new MS-associated CD6 risk haplotype significantly modifies expression of CD6 on CD4+ and CD8+ T cells.

Relevance:

30.00%

Publisher:

Abstract:

This thesis examines the collapse risk of tall steel braced frame buildings using rupture-to-rafters simulations of a suite of San Andreas earthquakes. Two key advancements in this work are the development of (i) a rational methodology for assigning scenario earthquake probabilities and (ii) an approach to broadband ground motion simulation that requires no artificial corrections. The work can be divided into the following sections: earthquake source modeling, earthquake probability calculations, ground motion simulations, building response, and performance analysis.

As a first step, kinematic source inversions of past earthquakes in the magnitude range 6-8 are used to simulate 60 scenario earthquakes on the San Andreas fault. For each scenario earthquake, a 30-year occurrence probability is calculated, and we present a rational method to redistribute the forecast earthquake probabilities from the Uniform California Earthquake Rupture Forecast (UCERF) to the simulated scenario earthquakes. We illustrate the inner workings of the method through an example involving earthquakes on the San Andreas fault in southern California.

Next, three-component broadband ground motion histories are computed at 636 sites in the greater Los Angeles metropolitan area by superposing short-period (0.2 s to 2.0 s) empirical Green's function synthetics on long-period (above 2.0 s) seismograms computed from kinematic source models using the spectral element method.

Using the ground motions at the 636 sites for the 60 scenario earthquakes, 3-D nonlinear analyses of several variants of an 18-story steel braced frame building, designed for three soil types using the 1994 and 1997 Uniform Building Code provisions, are conducted. Model performance is classified into one of five performance levels: Immediate Occupancy, Life Safety, Collapse Prevention, Red-Tagged, and Model Collapse. The results are combined with the 30-year occurrence probabilities of the San Andreas scenario earthquakes, using the PEER performance-based earthquake engineering framework, to determine the probability of exceedance of these limit states over the next 30 years.
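
The final combination step lends itself to a short sketch: per-scenario 30-year occurrence probabilities are paired with the per-scenario probability that the model exceeds a given performance level and, assuming scenarios occur independently, combined into a 30-year exceedance probability. The numbers below are synthetic placeholders, not the thesis's results.

```python
import numpy as np

rng = np.random.default_rng(0)
N_EQ = 60

# Hypothetical inputs: 30-year occurrence probability of each scenario earthquake,
# and the probability that the building response exceeds Collapse Prevention given
# that the scenario occurs (e.g., the fraction of sites at which it is exceeded).
p_occ = rng.dirichlet(np.ones(N_EQ)) * 0.5
q_cp = rng.uniform(0.0, 0.2, N_EQ)

# Assuming independent scenario occurrences, the 30-year probability that
# Collapse Prevention is exceeded by at least one of them:
p_30yr = 1.0 - np.prod(1.0 - p_occ * q_cp)
print(f"P(exceed Collapse Prevention in 30 years) = {p_30yr:.4f}")
```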

Relevance:

30.00%

Publisher:

Abstract:

Few credible source models are available from large-magnitude past earthquakes, so a stochastic source model generation algorithm becomes necessary for robust risk quantification using scenario earthquakes. We present an algorithm that combines the physics of fault ruptures, as imaged in laboratory earthquakes, with stress estimates on the fault constrained by field observations to generate stochastic source models for large-magnitude (Mw 6.0-8.0) strike-slip earthquakes. The algorithm is validated through a statistical comparison of synthetic ground motion histories from a stochastically generated source model for a magnitude 7.9 earthquake and a kinematic finite-source inversion of an equivalent-magnitude past earthquake on a geometrically similar fault. The synthetic dataset comprises three-component ground motion waveforms, computed at 636 sites in southern California, for ten hypothetical rupture scenarios (five hypocenters, each with two rupture directions) on the southern San Andreas fault. A similar validation exercise is conducted for a magnitude 6.0 earthquake, the lower magnitude limit for the algorithm. Additionally, ground motions from the Mw 7.9 earthquake simulations are compared against predictions by the Campbell-Bozorgnia NGA relation as well as the ShakeOut scenario earthquake. The algorithm is then applied to generate fifty source models for a hypothetical magnitude 7.9 earthquake originating at Parkfield, with rupture propagating from north to south (towards Wrightwood), similar to the 1857 Fort Tejon earthquake. Using the spectral element method, three-component ground motion waveforms are computed in the Los Angeles basin for each scenario earthquake, and the sensitivity of ground shaking intensity to seismic source parameters (such as the percentage of asperity area relative to the fault area, rupture speed, and rise time) is studied.

Under plausible San Andreas fault earthquakes in the next 30 years, modeled using the stochastic source algorithm, the performance of two 18-story steel moment frame buildings (UBC 1982 and 1997 designs) in southern California is quantified. The approach integrates rupture-to-rafters simulations into the PEER performance-based earthquake engineering (PBEE) framework. Using stochastic sources and computational seismic wave propagation, three-component ground motion histories at 636 sites in southern California are generated for sixty scenario earthquakes on the San Andreas fault. The ruptures, with moment magnitudes in the range 6.0-8.0, are assumed to occur at five locations on the southern section of the fault, with two unilateral rupture propagation directions considered. The 30-year probabilities of all plausible ruptures in this magnitude range and in that section of the fault, as forecast by the United States Geological Survey, are distributed among these 60 earthquakes based on proximity and moment release. The response of the two 18-story buildings, hypothetically located at each of the 636 sites, under three-component shaking from all 60 events is computed using 3-D nonlinear time-history analysis. From these results, the probability of the structural response exceeding the Immediate Occupancy (IO), Life Safety (LS), and Collapse Prevention (CP) performance levels under San Andreas fault earthquakes over the next thirty years is evaluated.

Furthermore, the conditional and marginal probability distributions of peak ground velocity (PGV) and peak ground displacement (PGD) in Los Angeles and the surrounding basins, due to earthquakes occurring primarily on the mid-section of the southern San Andreas fault, are determined using Bayesian model class identification. Simulated ground motions at sites 55-75 km from the source, from a suite of 60 earthquakes (Mw 6.0-8.0) primarily rupturing the mid-section of the San Andreas fault, provide the PGV and PGD data.
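
Bayesian model class identification weighs candidate probability models by their posterior evidence; as a rough stand-in, the sketch below fits three candidate distributions to synthetic PGV samples and ranks them by BIC, which approximates the log evidence. The data are simulated placeholders, not the thesis's simulation outputs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pgv = rng.lognormal(np.log(30.0), 0.6, size=500)   # synthetic PGV samples (cm/s)

candidates = {"lognormal": stats.lognorm,
              "gamma": stats.gamma,
              "weibull": stats.weibull_min}

for name, dist in candidates.items():
    params = dist.fit(pgv, floc=0)                 # location fixed at zero
    loglik = np.sum(dist.logpdf(pgv, *params))
    k = len(params) - 1                            # free parameters (loc was fixed)
    bic = k * np.log(pgv.size) - 2.0 * loglik      # lower is better
    print(f"{name:9s} BIC = {bic:.1f}")
```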