8 results for failure analysis strategy

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

80.00%

Publisher:

Abstract:

The study evaluated the in vitro influence of the pulse-repetition rate of the Er:YAG laser and of dentin depth on the tensile bond strength of the dentin-resin interface. Buccal or lingual dentin surfaces from human third molars were submitted to tensile testing at different depths (superficial, 1.0, and 1.5 mm) of the same dental area, using the same sample. Surface treatments were acid conditioning alone (control) and Er:YAG laser irradiation (80 mJ) followed by acid conditioning, at different pulse-repetition rates (1, 2, 3, or 4 Hz). The Single Bond/Z-250 system was used. The samples were stored in distilled water at 37 degrees C for 24 h, and the first test (superficial dentin) was then performed. The bond failures were analyzed. Next, the specimens were identified, ground down to the 1.0- and 1.5-mm depths, submitted again to the treatments, and subjected to the second and third bond tests following a similar procedure and failure analysis. ANOVA and the Tukey test demonstrated a significant difference (p < 0.001) for treatment and for the treatment x depth interaction (p < 0.05). The tested depths did not influence (p > 0.05) the bond strength of the dentin-resin interface. It may be concluded that the Er:YAG laser at 1, 2, 3, or 4 Hz combined with acid conditioning did not increase the resin tensile bond strength to dentin, regardless of dentin depth. (C) 2007 Wiley Periodicals, Inc.
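For readers who want to reproduce this kind of analysis, the following is a minimal sketch of a treatment x depth two-way ANOVA followed by a Tukey post hoc test using statsmodels. The file name, column names, and flat long-format layout are assumptions for illustration only, and the sketch ignores the repeated-measures aspect of testing the same sample at several depths:

# Hypothetical sketch of the treatment x depth analysis described above
# (file and column names are assumptions, not the authors' dataset).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Long-format table: one row per bond-strength measurement.
df = pd.read_csv("bond_strength.csv")  # columns: strength, treatment, depth

# Two-way ANOVA with treatment, depth, and their interaction.
model = ols("strength ~ C(treatment) * C(depth)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Tukey HSD post hoc comparison between treatments.
print(pairwise_tukeyhsd(df["strength"], df["treatment"]))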

Relevance:

40.00%

Publisher:

Abstract:

There are several versions of the lognormal distribution in the statistical literature; one is based on the exponential transformation of the generalized normal (GN) distribution. This paper presents a Bayesian analysis of the generalized lognormal (logGN) distribution, considering independent non-informative Jeffreys priors for the parameters, as well as the procedure for implementing the Gibbs sampler to obtain the posterior distributions of the parameters. The results are used to analyze failure-time models with right-censored and uncensored data. The proposed method is illustrated using real failure-time data for computers.
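As an illustration of how a right-censored failure-time likelihood enters a Bayesian fit, the following is a minimal sketch for a plain two-parameter lognormal with flat priors and a random-walk Metropolis sampler. It is not the paper's logGN model or its Jeffreys-prior Gibbs sampler, and the failure times shown are hypothetical:

# Hedged sketch: Bayesian fit of a plain two-parameter lognormal to
# right-censored failure times via random-walk Metropolis. This is NOT the
# paper's generalized-lognormal (logGN) Gibbs sampler; priors here are flat.
import numpy as np
from scipy import stats

t = np.array([12.0, 35.0, 48.0, 60.0, 60.0, 75.0, 90.0])   # hypothetical times
event = np.array([1, 1, 1, 0, 1, 0, 1], dtype=bool)         # False = censored

def log_post(mu, log_sigma):
    sigma = np.exp(log_sigma)
    d = stats.lognorm(s=sigma, scale=np.exp(mu))
    # Observed failures contribute the density, censored times the survival.
    ll = d.logpdf(t[event]).sum() + d.logsf(t[~event]).sum()
    return ll  # flat priors on (mu, log sigma)

rng = np.random.default_rng(0)
theta = np.array([np.log(t.mean()), 0.0])
lp = log_post(*theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.2, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain)[1000:]          # drop burn-in
print(chain.mean(axis=0))               # posterior means of (mu, log sigma)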

Relevance:

30.00%

Publisher:

Abstract:

Aim: Changes in skeletal muscle morphology and metabolism are associated with limited functional capacity in heart failure (HF), which can be attenuated by neuromuscular electrical stimulation (ES). The purpose of the present study was to analyse the effects of ES on GLUT-4 protein content, fibre structure and vessel density of skeletal muscle in a rat model of HF subsequent to myocardial infarction. Methods: Forty-four male Wistar rats were assigned to one of four groups: sham (S), sham submitted to ES (S+ES), heart failure (HF) and heart failure submitted to ES (HF+ES). The rats in the ES groups were submitted to ES of the left leg for 20 days (2.5 kHz, once a day, 30 min, 50% duty cycle: 15 s contraction/15 s rest). After this period, the left tibialis anterior muscle was collected from all the rats for analysis. Results: HF+ES rats showed less lung congestion than HF rats (P = 0.0001). Although muscle weight was lower in HF rats than in the S group, indicating hypotrophy, 20 days of ES led to its recovery (P < 0.0001). In both groups submitted to ES, there was an increase in muscle vessel density (P < 0.04). Additionally, heart failure caused a 49% reduction in GLUT-4 protein content (P < 0.03), which was recovered by ES (P < 0.01). Conclusion: In heart failure, ES improves skeletal muscle morphology and raises GLUT-4 content.

Relevance:

30.00%

Publisher:

Abstract:

The mechanisms responsible for the generation and maintenance of immunological memory to Plasmodium are poorly understood, and the reasons why protective immunity in humans is so difficult to achieve and so rapidly lost remain a matter for debate. A possible explanation for the difficulty in building up an efficient immune response against this parasite is the massive T cell apoptosis resulting from exposure to high-dose parasite Ag. To determine the immunological mechanisms required for long-term protection against P. chabaudi malaria and the consequences of high and low acute-phase parasite loads for the acquisition of protective immunity, we performed a detailed analysis of the T and B cell compartments over a period of 200 days following untreated and drug-treated infections in female C57BL/6 mice. By comparing several immunological parameters with the capacity to control a secondary parasite challenge, we concluded that the loss of full protective immunity is determined neither by the acute-phase parasite load nor by the serum levels of specific IgG2a and IgG1 Abs, but appears to be a consequence of the progressive decline in the memory T cell response to parasites, which occurs similarly in untreated and drug-treated mice with time after infection. Furthermore, by analyzing adoptive transfer experiments, we confirmed the major role of CD4(+) T cells in guaranteeing long-term full protection against P. chabaudi malaria. The Journal of Immunology, 2008, 181: 8344-8355.

Relevance:

30.00%

Publisher:

Abstract:

Clustering is a difficult task: there is no single cluster definition and the data can have more than one underlying structure. Pareto-based multi-objective genetic algorithms (e.g., MOCK, Multi-Objective Clustering with automatic K-determination, and MOCLE, Multi-Objective Clustering Ensemble) were proposed to tackle these problems. However, the output of such algorithms often contains a large number of partitions, making it difficult for an expert to manually analyze all of them. In order to deal with this problem, we present two selection strategies, based on the corrected Rand index, to choose a subset of solutions. To test them, we apply them to the sets of solutions produced by MOCK and MOCLE on several datasets. The study was also extended to select a reduced set of partitions from the initial population of MOCLE. These analyses show that both versions of the proposed selection strategy are very effective. They can significantly reduce the number of solutions and, at the same time, keep the quality and the diversity of the partitions in the original set of solutions. (C) 2010 Elsevier B.V. All rights reserved.
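As a rough illustration of the kind of selection the abstract describes, the following sketch greedily picks a small, mutually dissimilar subset of partitions using the corrected (adjusted) Rand index from scikit-learn. It is not the exact strategy proposed for MOCK and MOCLE, and the label vectors are toy data:

# Hedged sketch: greedy selection of a diverse subset of partitions using the
# corrected (adjusted) Rand index. This illustrates the general idea only; it
# is not the exact selection strategy proposed for MOCK/MOCLE.
import numpy as np
from sklearn.metrics import adjusted_rand_score

def select_diverse(partitions, k):
    """Pick k partitions that are mutually dissimilar under adjusted Rand."""
    chosen = [0]                      # start from the first partition
    while len(chosen) < k:
        best, best_score = None, np.inf
        for i, p in enumerate(partitions):
            if i in chosen:
                continue
            # Similarity to the already-chosen set: keep the candidate whose
            # maximum ARI against the chosen partitions is smallest.
            score = max(adjusted_rand_score(p, partitions[j]) for j in chosen)
            if score < best_score:
                best, best_score = i, score
        chosen.append(best)
    return chosen

# Toy usage: five label vectors over the same 6 objects.
parts = [np.array(p) for p in ([0, 0, 1, 1, 2, 2], [0, 0, 1, 1, 2, 2],
                               [0, 1, 0, 1, 0, 1], [0, 0, 0, 1, 1, 1],
                               [2, 2, 1, 1, 0, 0])]
print(select_diverse(parts, k=3))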

Relevance:

30.00%

Publisher:

Abstract:

Point placement strategies aim at mapping data points represented in higher dimensions to two-dimensional spaces and are frequently used to visualize relationships amongst data instances. They have been valuable tools for the analysis and exploration of data sets of various kinds. Many conventional techniques, however, do not behave well when the number of dimensions is high, as in the case of document collections. More recent approaches address that shortcoming, but may cause too much clutter to allow flexible exploration to take place. In this work we present a novel hierarchical point placement technique that is capable of dealing with these problems. While good grouping and separation of data with high similarity is maintained without increasing computational cost, its hierarchical structure lends itself both to exploration at various levels of detail and to handling data in subsets, improving analysis capability and also allowing manipulation of larger data sets.
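The following sketch illustrates only the general idea of hierarchical point placement (cluster the data, lay out cluster representatives in 2-D, then place each cluster's members locally around its representative), using KMeans and MDS from scikit-learn. It is not the technique proposed in the paper, and the parameters are arbitrary:

# Hedged sketch of the general idea behind hierarchical point placement;
# an illustration of the concept, not the paper's technique.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.manifold import MDS

def hierarchical_placement(X, n_clusters=5, spread=0.3, seed=0):
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(X)
    # Level 1: project cluster centroids to 2-D.
    centers_2d = MDS(n_components=2, random_state=seed).fit_transform(
        km.cluster_centers_)
    Y = np.zeros((len(X), 2))
    for c in range(n_clusters):
        idx = np.where(km.labels_ == c)[0]
        if len(idx) == 1:
            Y[idx] = centers_2d[c]
            continue
        # Level 2: local layout of the cluster's members, shrunk and centred
        # on the cluster's 2-D position.
        local = MDS(n_components=2, random_state=seed).fit_transform(X[idx])
        local = (local - local.mean(axis=0)) * spread / (np.abs(local).max() + 1e-9)
        Y[idx] = centers_2d[c] + local
    return Y

# Toy usage with random high-dimensional data.
Y = hierarchical_placement(np.random.default_rng(0).normal(size=(200, 50)))
print(Y.shape)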

Relevance:

30.00%

Publisher:

Abstract:

The use of bivariate distributions plays a fundamental role in survival and reliability studies. In this paper, we consider a location-scale model for bivariate survival times, based on the use of a copula to model the dependence of bivariate survival data. For the proposed model, we consider inferential procedures based on maximum likelihood. Gains in efficiency from bivariate models are also examined in the censored data setting. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed to assess the performance of the bivariate regression model for matched paired survival data. Sensitivity analysis methods such as local and total influence are presented and derived under three perturbation schemes. The marginal martingale and marginal deviance residual measures are used to check the adequacy of the model. Furthermore, we propose a new measure, which we call the modified deviance component residual. The methodology in the paper is illustrated on a lifetime data set for kidney patients.
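To show how a copula-based bivariate survival likelihood can be written down and maximised, the following is a simplified sketch with a Clayton copula, exponential margins, and uncensored toy data. The paper's location-scale model, censoring handling, and diagnostics are not reproduced here:

# Hedged sketch: maximum-likelihood fit of a Clayton copula with exponential
# margins to uncensored bivariate survival times. Only illustrates how a
# copula-based bivariate likelihood can be set up and maximised.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t1, t2):
    lam1, lam2, theta = np.exp(params)        # log-parametrised for positivity
    u = 1.0 - np.exp(-lam1 * t1)              # exponential CDFs (margins)
    v = 1.0 - np.exp(-lam2 * t2)
    u = np.clip(u, 1e-10, 1 - 1e-10)
    v = np.clip(v, 1e-10, 1 - 1e-10)
    # Clayton copula log-density log c(u, v).
    log_c = (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
             - (2 + 1 / theta) * np.log(u**-theta + v**-theta - 1))
    # Marginal exponential log-densities.
    log_f = (np.log(lam1) - lam1 * t1) + (np.log(lam2) - lam2 * t2)
    return -(log_c + log_f).sum()

# Toy data (hypothetical paired survival times with induced dependence).
rng = np.random.default_rng(1)
t1 = rng.exponential(scale=2.0, size=100)
t2 = 0.5 * t1 + rng.exponential(scale=1.0, size=100)

res = minimize(neg_loglik, x0=np.zeros(3), args=(t1, t2), method="Nelder-Mead")
print(np.exp(res.x))   # estimated (lambda1, lambda2, theta)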

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a study of a deterministic partially self-avoiding walk (tourist walk), which provides a novel method for texture feature extraction. The method is able to explore an image on all scales simultaneously. Experiments were conducted using different tourist-walk dynamics. A new strategy, based on histograms, to extract information from its joint probability distribution is presented. The promising results are discussed and compared to the best-known methods for texture description reported in the literature. (C) 2009 Elsevier Ltd. All rights reserved.
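A minimal sketch of a deterministic, partially self-avoiding walk on a grayscale patch is given below, building a joint histogram of transient length and attractor period as a texture feature. The neighbourhood, memory handling, and histogram binning are simplifying assumptions rather than the paper's exact formulation:

# Hedged sketch of a "tourist walk" texture descriptor: from each pixel, walk
# to the neighbour with the closest gray level, avoiding the last mu visited
# pixels, and record (transient length, attractor period) in a joint histogram.
import numpy as np

def tourist_features(img, mu=1, max_bins=(10, 10), max_steps=500):
    h, w = img.shape
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
               if (dy, dx) != (0, 0)]
    hist = np.zeros(max_bins)
    for sy in range(h):
        for sx in range(w):
            path = [(sy, sx)]
            seen = {}
            for _ in range(max_steps):
                y, x = path[-1]
                recent = set(path[-(mu + 1):])        # memory window
                # Candidate neighbours not visited in the last mu steps.
                cand = [(y + dy, x + dx) for dy, dx in offsets
                        if 0 <= y + dy < h and 0 <= x + dx < w
                        and (y + dy, x + dx) not in recent]
                if not cand:
                    break
                # Rule: move to the neighbour with the closest gray level.
                nxt = min(cand, key=lambda p: abs(int(img[p]) - int(img[y, x])))
                state = tuple(path[-(mu + 1):]) + (nxt,)
                if state in seen:                     # attractor reached
                    t = seen[state]                   # transient length
                    period = len(path) - t
                    hist[min(t, max_bins[0] - 1),
                         min(period, max_bins[1] - 1)] += 1
                    break
                seen[state] = len(path)
                path.append(nxt)
    return hist.ravel() / max(hist.sum(), 1)          # normalised feature vector

# Toy usage on a small random texture patch.
patch = np.random.default_rng(0).integers(0, 256, size=(16, 16), dtype=np.uint8)
print(tourist_features(patch, mu=1)[:10])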