916 results for Failure time analysis


Relevance:

90.00%

Publisher:

Abstract:

So far, in the bivariate setup, the analysis of lifetime (failure-time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. Design of high-reliability systems generally requires that the individual system components have extremely high reliability even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing-risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
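For reference, the bivariate vector hazard rate of Johnson and Kotz (1975) invoked above is the negative gradient of the log joint survival function:

```latex
\mathbf{h}(t_1, t_2) = -\nabla \log S(t_1, t_2)
= \left( -\frac{\partial}{\partial t_1} \log S(t_1, t_2),\;
         -\frac{\partial}{\partial t_2} \log S(t_1, t_2) \right),
\qquad S(t_1, t_2) = P(T_1 > t_1,\, T_2 > t_2).
```

Its i-th component is the instantaneous failure rate of the i-th lifetime given that both have survived to (t_1, t_2), which is what lets the two lifetimes be modelled jointly rather than treating one as independent censoring of the other.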

Relevance:

90.00%

Publisher:

Abstract:

In this thesis we have presented several useful inventory models. Of these, inventory with retrial of unsatisfied demands and inventory with postponed work are quite recently introduced concepts, the latter being introduced here for the first time. Inventory with service time is relatively new, with only a handful of research works reported. The difficulty encountered in inventory with service, unlike the queueing process, is that even the simplest case needs a two-dimensional process for its description. Only in certain specific cases can we introduce generating functions to solve for the system state distribution. However, numerical procedures can be developed for solving these problems.
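As a minimal illustration of such a numerical procedure (not the thesis's own model), the sketch below builds the generator of a small two-dimensional Markov chain for an M/M/1 queue with an attached inventory: state (n, k) records queue length (truncated at N) and on-hand stock, a service completion consumes one item, and an outstanding order replenishes the stock to S at an exponential rate. The stationary distribution is then found by solving pi Q = 0 with a normalisation constraint. All parameter values are assumed for the example.

```python
# Minimal 2-D CTMC for an M/M/1 queue with inventory (illustrative sketch).
# State (n, k): n = customers in system (truncated at N), k = items on hand.
# Service (rate mu) needs a customer AND an item; when k <= s an order is
# outstanding and arrives at rate beta, restoring the stock to S.
lam, mu, beta = 1.0, 2.0, 0.5   # arrival, service, replenishment rates (assumed)
N, S, s = 6, 3, 1               # queue truncation, max stock, reorder level

states = [(n, k) for n in range(N + 1) for k in range(S + 1)]
idx = {st: i for i, st in enumerate(states)}
m = len(states)
Q = [[0.0] * m for _ in range(m)]

for (n, k) in states:
    i = idx[(n, k)]
    if n < N:                       # customer arrival
        Q[i][idx[(n + 1, k)]] += lam
    if n > 0 and k > 0:             # service completion consumes one item
        Q[i][idx[(n - 1, k - 1)]] += mu
    if k <= s:                      # replenishment restores stock to S
        Q[i][idx[(n, S)]] += beta
for i in range(m):                  # diagonal: negative row sum
    Q[i][i] = -sum(Q[i][j] for j in range(m) if j != i)

# Solve pi Q = 0 with sum(pi) = 1: transpose Q, replace the last equation.
A = [[Q[j][i] for j in range(m)] for i in range(m)]
A[m - 1] = [1.0] * m
b = [0.0] * (m - 1) + [1.0]

# Gaussian elimination with partial pivoting (stdlib only).
for c in range(m):
    p = max(range(c, m), key=lambda r: abs(A[r][c]))
    A[c], A[p] = A[p], A[c]
    b[c], b[p] = b[p], b[c]
    for r in range(c + 1, m):
        f = A[r][c] / A[c][c]
        for cc in range(c, m):
            A[r][cc] -= f * A[c][cc]
        b[r] -= f * b[c]
pi = [0.0] * m
for r in range(m - 1, -1, -1):
    pi[r] = (b[r] - sum(A[r][c] * pi[c] for c in range(r + 1, m))) / A[r][r]

stockout = sum(p for (n, k), p in zip(states, pi) if k == 0)
print(f"P(stock-out) = {stockout:.4f}")
```

Even this toy version shows why the two-dimensional description is unavoidable: queue length and inventory level cannot be decoupled, since service completions move both coordinates at once.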

Relevance:

90.00%

Publisher:

Abstract:

Most statistical methodology for phase III clinical trials focuses on the comparison of a single experimental treatment with a control. An increasing desire to reduce the time before regulatory approval of a new drug is sought has led to the development of two-stage or sequential designs for trials that combine the definitive analysis associated with phase III with the treatment selection element of a phase II study. In this paper we consider a trial in which the most promising of a number of experimental treatments is selected at the first interim analysis. This considerably reduces the computational load associated with the construction of stopping boundaries compared to the approach proposed by Follmann, Proschan and Geller (Biometrics 1994; 50: 325-336). The computational requirement does not exceed that for the sequential comparison of a single experimental treatment with a control. Existing methods are extended in two ways. First, the use of the efficient score as a test statistic makes the analysis of binary, normal or failure-time data, as well as adjustment for covariates or stratification, straightforward. Second, the question of trial power is also considered, enabling the determination of the sample size required to give specified power. Copyright © 2003 John Wiley & Sons, Ltd.
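For binary data, the efficient score for the treatment effect reduces to a familiar observed-minus-expected statistic. The sketch below (a generic illustration, not the paper's code) computes the score statistic Z = U/sqrt(V) for one experimental arm against control, assuming a log-odds-ratio parameterisation evaluated at the null.

```python
from math import sqrt

def score_statistic(x1, n1, x0, n0):
    """Efficient score statistic for a binary endpoint (sketch, assumed
    log-odds-ratio parameterisation): U = observed minus expected successes
    on the experimental arm under H0, V = its null variance."""
    pbar = (x1 + x0) / (n1 + n0)                  # pooled success probability
    U = x1 - n1 * pbar                            # efficient score at theta = 0
    V = n1 * n0 * pbar * (1 - pbar) / (n1 + n0)   # Fisher information
    return U, V, U / sqrt(V)

U, V, Z = score_statistic(x1=30, n1=50, x0=20, n0=50)
print(f"U={U:.2f}, V={V:.2f}, Z={Z:.3f}")   # Z = 2.000 for these counts
```

Z squared here equals the Pearson chi-square statistic for the 2x2 table; the same (U, V) pair slots directly into group-sequential stopping boundaries, which is why the score parameterisation makes binary, normal and failure-time endpoints interchangeable.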

Relevance:

90.00%

Publisher:

Abstract:

This paper describes a real-time multi-camera surveillance system that can be applied to a range of application domains. This integrated system is designed to observe crowded scenes and has mechanisms to improve tracking of objects that are in close proximity. The four component modules described in this paper are (i) motion detection using a layered background model, (ii) object tracking based on local appearance, (iii) hierarchical object recognition, and (iv) fused multisensor object tracking using multiple features and geometric constraints. This integrated approach to complex scene tracking is validated against a number of representative real-world scenarios to show that robust, real-time analysis can be performed. Copyright (C) 2007 Hindawi Publishing Corporation. All rights reserved.
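A single layer of such a background model can be approximated by an exponentially weighted running average per pixel; the sketch below is a generic illustration of motion detection by background subtraction, not the system's actual layered implementation, and `alpha` and `thresh` are assumed values.

```python
# Illustrative single-layer background model: per-pixel running average
# with a fixed threshold. The paper's layered model is more elaborate.
def update_background(bg, frame, alpha=0.05):
    """Exponentially weighted running average of the scene."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(br, fr)]
            for br, fr in zip(bg, frame)]

def foreground_mask(bg, frame, thresh=25):
    """Flag pixels differing from the background by more than thresh."""
    return [[abs(f - b) > thresh for b, f in zip(br, fr)]
            for br, fr in zip(bg, frame)]

# Toy 1x5 'video': static scene, then an object appears at pixel 2.
bg = [[100.0] * 5]
for _ in range(20):                       # learn the static scene
    bg = update_background(bg, [[100] * 5])
moving = [[100, 100, 200, 100, 100]]      # object enters
mask = foreground_mask(bg, moving)
print(mask[0])                            # only pixel 2 is flagged
```

Layering extends this idea by keeping several such models per pixel, so that objects which stop moving can be absorbed into the background without corrupting it.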

Relevance:

90.00%

Publisher:

Abstract:

The Earth-directed coronal mass ejection (CME) of 8 April 2010 provided an opportunity for space weather predictions from both established and developmental techniques to be made from near–real time data received from the SOHO and STEREO spacecraft; the STEREO spacecraft provide a unique view of Earth-directed events from outside the Sun-Earth line. Although the near–real time data transmitted by the STEREO Space Weather Beacon are significantly poorer in quality than the subsequently downlinked science data, the use of these data has the advantage that near–real time analysis is possible, allowing actual forecasts to be made. The fact that such forecasts cannot be biased by any prior knowledge of the actual arrival time at Earth provides an opportunity for an unbiased comparison between several established and developmental forecasting techniques. We conclude that for forecasts based on the STEREO coronagraph data, it is important to take account of the subsequent acceleration/deceleration of each CME through interaction with the solar wind, while predictions based on measurements of CMEs made by the STEREO Heliospheric Imagers would benefit from higher temporal and spatial resolution. Space weather forecasting tools must work with near–real time data; such data, when provided by science missions, is usually highly compressed and/or reduced in temporal/spatial resolution and may also have significant gaps in coverage, making such forecasts more challenging.
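The acceleration/deceleration correction mentioned above is often handled with a drag-based model, in which a CME faster than the ambient solar wind decelerates toward the wind speed. The sketch below integrates dv/dt = -gamma |v - w| (v - w) from 20 solar radii to 1 AU; it is not the forecasting code used in the study, and all parameter values are assumed, illustrative orders of magnitude.

```python
# Drag-based CME transit-time estimate (illustrative; parameters assumed).
AU = 1.496e8              # Sun-Earth distance, km
R0 = 20 * 6.957e5         # start of integration: 20 solar radii, in km
gamma = 2e-8              # drag parameter, 1/km (typical order of magnitude)
w = 400.0                 # ambient solar wind speed, km/s
v, r, t = 1000.0, R0, 0.0 # initial CME speed (km/s), position, elapsed time
dt = 60.0                 # time step, s

while r < AU:
    dv = -gamma * abs(v - w) * (v - w)   # quadratic drag toward wind speed
    v += dv * dt
    r += v * dt
    t += dt

print(f"arrival speed {v:.0f} km/s after {t / 3600:.1f} h")
```

A fast CME launched at 1000 km/s into a 400 km/s wind arrives noticeably later, and slower, than a constant-speed extrapolation would predict, which is the effect the coronagraph-based forecasts need to account for.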

Relevance:

90.00%

Publisher:

Abstract:

An important application of Big Data analytics is the real-time analysis of streaming data. Streaming data imposes unique challenges on data mining algorithms: concept drift, the need to analyse the data on the fly because the streams are unbounded, and the need for scalable algorithms to cope with potentially high data throughput. Real-time classification algorithms that are fast and adaptive to concept drift exist; however, most approaches are not naturally parallel and are thus limited in their scalability. This paper presents work on the Micro-Cluster Nearest Neighbour (MC-NN) classifier. MC-NN is based on an adaptive statistical data summary built from micro-clusters. It is very fast and adaptive to concept drift whilst retaining the natural parallelism of the base KNN classifier. MC-NN is also competitive with existing data stream classifiers in terms of accuracy and speed.
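The core of a micro-cluster summary can be sketched in a few lines: each cluster keeps an incrementally updated centroid and a count, new examples are absorbed by the nearest cluster of their own class, and classification is nearest-centroid. This is a simplified, assumed reading of MC-NN; the published classifier additionally tracks error counts and variance and triggers cluster splits, all omitted here.

```python
# Simplified micro-cluster nearest-neighbour classifier (sketch).
# Real MC-NN also keeps error counters, variance and split/removal logic.
from math import dist

class MicroCluster:
    def __init__(self, point, label):
        self.n = 1
        self.centroid = list(point)
        self.label = label

    def absorb(self, point):
        """Incremental mean update of the centroid."""
        self.n += 1
        self.centroid = [c + (p - c) / self.n
                         for c, p in zip(self.centroid, point)]

class MCNN:
    def __init__(self):
        self.clusters = []

    def predict(self, point):
        if not self.clusters:
            return None
        return min(self.clusters,
                   key=lambda mc: dist(mc.centroid, point)).label

    def train(self, point, label):
        """Absorb into the nearest cluster of the true class,
        creating a new micro-cluster for an unseen class."""
        same = [mc for mc in self.clusters if mc.label == label]
        if same:
            min(same, key=lambda mc: dist(mc.centroid, point)).absorb(point)
        else:
            self.clusters.append(MicroCluster(point, label))

clf = MCNN()
for p, y in [((0, 0), "a"), ((1, 0), "a"), ((5, 5), "b"), ((6, 5), "b")]:
    clf.train(p, y)
print(clf.predict((0.2, 0.1)), clf.predict((5.4, 5.0)))
```

Because training touches only one micro-cluster per example, the summary can be partitioned across workers, which is the parallelism the abstract refers to.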

Relevance:

90.00%

Publisher:

Abstract:

Quantifying country risk, and political risk in particular, raises several difficulties for companies, institutions, and investors. Since economic indicators are updated far less frequently than Facebook, understanding, and more precisely measuring, what is happening on the ground in real time can be a challenge for political risk analysts. However, with the growing availability of "big data" from social tools such as Twitter, now is an opportune moment to examine the types of social media metrics that are available and the limitations of their application to country risk analysis, especially during episodes of political violence. Using the qualitative method of bibliographic research, this study identifies the current landscape of data available from Twitter, analyses current and potential methods of analysis, and discusses their possible application in the field of political risk analysis. After a complete review of the field to date, and taking into account the technological advances expected in the short and medium term, this study concludes that, despite obstacles such as the cost of data storage, the limitations of real-time analysis, and the potential for data manipulation, the potential benefits of applying social media metrics to the field of political risk analysis, particularly for structured-qualitative and quantitative models, clearly outweigh the challenges.

Relevance:

90.00%

Publisher:

Abstract:

In survival analysis, long-duration models allow estimation of the cure fraction, which represents the portion of the population immune to the event of interest. Here we address classical and Bayesian estimation based on mixture models and promotion-time models, using different distributions (exponential, Weibull and Pareto) to model failure time. The database used to illustrate the implementations is described in Kersey et al. (1987) and consists of a group of leukemia patients who underwent a certain type of transplant. The specific implementations used were numerical optimization by BFGS as implemented in R (base::optim), Laplace approximation (our own implementation) and Gibbs sampling as implemented in WinBUGS. We describe the main features of the models used, the estimation methods and the computational aspects. We also discuss how different prior information can affect the Bayesian estimates.
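In the mixture formulation with an exponential failure-time distribution, the population survival is S(t) = pi + (1 - pi) exp(-lambda t), where pi is the cure fraction, and the censored-data log-likelihood uses the density for observed events and S(t) for censored times. The sketch below evaluates that log-likelihood and maximises it by a crude grid search standing in for BFGS; the data are illustrative, not the Kersey et al. data.

```python
from math import exp, log

def surv(t, pi, lam):
    """Mixture cure survival: cured with prob pi, else exponential(lam)."""
    return pi + (1 - pi) * exp(-lam * t)

def density(t, pi, lam):
    """Improper density contributed by the susceptible fraction."""
    return (1 - pi) * lam * exp(-lam * t)

def loglik(times, events, pi, lam):
    """Censored-data log-likelihood: f(t) for events, S(t) for censored."""
    return sum(log(density(t, pi, lam)) if d else log(surv(t, pi, lam))
               for t, d in zip(times, events))

# Illustrative data: (time, event indicator); 1 = failure, 0 = censored.
times  = [0.5, 1.2, 2.0, 3.5, 4.0, 5.0, 5.0, 5.0]
events = [1,   1,   1,   1,   0,   0,   0,   0]

# Crude grid search over (pi, lambda) as a stand-in for BFGS/optim.
best = max(((p / 20, l / 10) for p in range(1, 20) for l in range(1, 40)),
           key=lambda th: loglik(times, events, *th))
print("grid MLE (pi, lambda) =", best)
```

The heavy censoring at the end of follow-up is what identifies pi: S(t) flattens at the cure fraction rather than decaying to zero as a plain exponential would.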

Relevance:

90.00%

Publisher:

Abstract:

We present residual analysis techniques to assess the fit of accelerated failure time models (AFTM) with random effects to correlated survival data. We propose an imputation procedure for censored observations and consider three types of residuals to evaluate different model characteristics. We illustrate the proposal by fitting an AFTM with random effects to a real data set involving times between failures of oil well equipment.
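One common choice among such residuals is the Cox-Snell residual r_i = H-hat(t_i), the fitted cumulative hazard at each observed time, which should behave like a unit-exponential censored sample when the model fits. The sketch below computes Cox-Snell residuals for a plain Weibull AFT with assumed (not estimated) parameters and no random effect, purely to illustrate the mechanics; all data and parameter values are hypothetical.

```python
from math import exp

# Cox-Snell residuals for a Weibull AFT model (illustrative sketch).
# Under a Weibull AFT, H(t | x) = (t / exp(beta0 + beta1*x)) ** shape.
def cox_snell(times, x, beta0, beta1, shape):
    """r_i = fitted cumulative hazard evaluated at each observed time."""
    return [(t / exp(beta0 + beta1 * xi)) ** shape
            for t, xi in zip(times, x)]

times = [2.0, 3.1, 4.5, 6.0, 8.2]       # times between failures (illustrative)
x     = [0,   0,   1,   1,   1]         # a single binary covariate
res = cox_snell(times, x, beta0=1.0, beta1=0.5, shape=1.2)
print([round(r, 3) for r in res])
```

In practice one would plot the Nelson-Aalen estimate of the residuals' cumulative hazard against the 45-degree line; systematic departure signals lack of fit.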

Relevance:

90.00%

Publisher:

Abstract:

In survival analysis, the response is usually the time until the occurrence of an event of interest, called the failure time. The main characteristic of survival data is the presence of censoring, which is a partial observation of the response. Among the models that properly fit many practical situations, the Weibull model occupies an important position. The Marshall-Olkin extended form distributions offer a basic generalization that enables greater flexibility in fitting lifetime data. This paper presents a simulation study comparing the gradient test and the likelihood ratio test using the Marshall-Olkin extended form of the Weibull distribution. The results show only a small advantage for the likelihood ratio test.
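For reference, writing the log-likelihood as l, the score as U, the unrestricted MLE as theta-hat and the restricted (null) estimate as theta-tilde, the two statistics being compared, together with the Marshall-Olkin extended Weibull survival function, are

```latex
W_{\mathrm{LR}} = 2\{\ell(\hat{\theta}) - \ell(\tilde{\theta})\},
\qquad
W_{\mathrm{G}} = U(\tilde{\theta})^{\top}(\hat{\theta} - \tilde{\theta}),
\qquad
S_{\mathrm{MO}}(t) = \frac{\alpha\, S_{W}(t)}{1 - (1-\alpha)\, S_{W}(t)},
\quad S_{W}(t) = e^{-(t/\eta)^{\beta}},\ \alpha > 0.
```

Both statistics are asymptotically chi-squared under the null; the gradient statistic requires neither the expected nor the observed information matrix, which is its computational appeal, and alpha = 1 recovers the plain Weibull.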

Relevance:

90.00%

Publisher:

Abstract:

In this work we study the accelerated failure-time generalized gamma regression models with a unified approach. The models estimate simultaneously the effects of covariates on the acceleration/deceleration of the timing of a given event and on the surviving fraction. The method is implemented in the free statistical software R. Finally, the model is applied to a real dataset on the time until disease recurrence in patients diagnosed with breast cancer.
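The accelerated failure-time structure behind such models can be summarised as follows (conventional notation, not the paper's own): covariates shift log failure time linearly, so they rescale time multiplicatively relative to the baseline survival S_0, here the generalized gamma.

```latex
\log T = \mathbf{x}^{\top}\boldsymbol{\beta} + \sigma W
\quad \Longleftrightarrow \quad
S(t \mid \mathbf{x}) = S_{0}\!\left( t\, e^{-\mathbf{x}^{\top}\boldsymbol{\beta}} \right).
```

A positive coefficient therefore stretches survival time (deceleration of the event), and the surviving fraction can be attached by letting S tend to a nonzero plateau, as in the cure-fraction models above.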

Relevance:

90.00%

Publisher:

Abstract:

The aim of this study was to assess the microhardness of 5 glass ionomer cements (GIC) - Vidrion R (V, SS White), Fuji IX (F, GC Corp.), Magic Glass ART (MG, Vigodent), Maxxion R (MR, FGM) and ChemFlex (CF, Dentsply) - in the presence or absence of a surface protection treatment, and after different storage periods. For each GIC, 36 test specimens were made, divided into 3 groups according to the surface protection treatment applied - no protection, varnish or nail varnish. The specimens were stored in distilled water for 24 h, 7 days and 30 days, and the microhardness tests were performed at these times. The data obtained were submitted to repeated-measures ANOVA and Tukey's test (α = 5%). The results revealed that the mean microhardness values of the GICs were, in decreasing order, as follows: F > CF = MR > MG > V; that surface protection was significant for MR, at 24 h, without protection (64.2 ± 3.6a), protected with GIC varnish (59.6 ± 3.4b) and protected with nail varnish (62.7 ± 2.8ab); for F, at 7 days, without protection (97.8 ± 3.7ab), protected with varnish (95.9 ± 3.2b) and protected with nail varnish (100.8 ± 3.4a); and at 30 days, for F, without protection (98.8 ± 2.6b), protected with varnish (103.3 ± 4.4a) and protected with nail varnish (101 ± 4.1ab) and, for V, without protection (46 ± 1.3b), protected with varnish (49.6 ± 1.7ab) and protected with nail varnish (51.1 ± 2.6a). The increase in storage time produced an increase in microhardness. It was concluded that the different GICs, surface protection treatments and storage times could alter the microhardness values.

Relevance:

90.00%

Publisher:

Abstract:

The aim of this study was to evaluate the effects of different light-curing units and resin cement curing types on the bond durability of a feldspathic ceramic bonded to dentin. The crowns of 40 human molars were sectioned, exposing the dentin. Forty ceramic blocks of VITA VM7 were produced according to the manufacturer's recommendations. The ceramic surface was etched with 10% hydrofluoric acid/60s and silanized. The dentin was treated with 37% phosphoric acid/15s, and the adhesive was applied. The ceramic blocks were divided and cemented to dentin according to resin cement/RC curing type (dual- and photo-cured), light-curing unit (halogen light/QTH and LED), and storage conditions (dry and storage/150 days + 12,000 cycles/thermocycling). All blocks were stored in distilled water (37°C/24h) and sectioned (n = 10): G1-QTH + RC Photo, G2-QTH + RC Dual, G3-LED + RC Photo, G4-LED + RC Dual. Groups G5, G6, G7, and G8 were obtained exactly as G1 through G4, respectively, and then stored and thermocycled. Microtensile bond strength tests were performed (EMIC), and data were statistically analyzed by ANOVA and Tukey's test (5%). The bond strength values (MPa) were: G1-12.95 (6.40)ab; G2-12.02 (4.59)ab; G3-13.09 (5.62)ab; G4-15.96 (6.32)a; G5-6.22 (5.90)c; G6-9.48 (5.99)bc; G7-12.78 (11.30)ab; and G8-8.34 (5.98)bc. The same superscript letters indicate no significant differences. Different light-curing units affected the bond strength between ceramic cemented to dentin when the photo-cured cement was used, and only after aging (LED > QTH). There was no difference between the effects of dual- and photo-cured resin-luting agents on the microtensile bond strength of the cement used in this study.

Relevance:

90.00%

Publisher:

Abstract:

Graduate Program in Applied and Computational Mathematics - FCT

Relevance:

90.00%

Publisher:

Abstract:

With the improvement in animals' quality of life, clinical care of elderly patients has become increasingly frequent; these patients present renal disorders, including chronic renal failure. Recent studies report the use of stem cells to treat renal failure, which would improve urea and creatinine levels as well as the renal ultrasound evaluation. The present work reports a case of ultrasonographic evaluation in a patient with chronic renal failure, liver disease and a splenic nodule that underwent stem cell therapy, after which there was an improvement in the sonographic evaluation of part of the liver.