915 results for Fatigue. Composites. Modular Network. S-N Curves Probability. Weibull Distribution
Abstract:
In this paper, we propose a new three-parameter long-term lifetime distribution with decreasing, increasing and unimodal hazard functions, induced by a latent complementary risk framework: the long-term complementary exponential geometric distribution. The new distribution arises from latent competing risk scenarios in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks, together with the presence of long-term survivors. The properties of the proposed distribution are discussed, including its probability density function and explicit algebraic formulas for its reliability, hazard and quantile functions and order statistics. Parameter estimation is based on the usual maximum-likelihood approach, and a simulation study assesses the performance of the estimation procedure. We compare the new distribution with its particular cases, as well as with the long-term Weibull distribution, on three real data sets, observing its potential and competitiveness in comparison with some usual long-term lifetime distributions.
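The latent complementary risk construction described above can be sketched in a few lines of code. The snippet below is an illustrative simulation only, not the authors' estimation procedure; the parameter values (lam, theta, p) and the geometric law for the number of latent risks are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_lceg(size, lam=1.0, theta=0.5, p=0.2):
    """Illustrative simulation of a long-term complementary exponential
    geometric lifetime: a fraction p is cured (infinite lifetime); the
    rest observe the maximum of M ~ Geometric(theta) latent
    Exponential(lam) risk lifetimes.  lam, theta, p are example values."""
    cured = rng.random(size) < p
    m = rng.geometric(theta, size=size)       # number of latent risks per subject
    t = np.array([rng.exponential(1.0 / lam, size=k).max() for k in m])
    t[cured] = np.inf                         # long-term survivors
    return t

times = sample_lceg(10_000)
print(round(np.isinf(times).mean(), 2))       # ≈ p, the cure fraction
```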
Abstract:
This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator's skewness parameters only control tail entropy, and that an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for the moments, generating function and quantiles. The model parameters are estimated by maximum likelihood, and the usefulness of the new class is illustrated by means of some real data sets. (c) 2011 Elsevier B.V. All rights reserved.
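The beta-generated construction underlying the GBG class admits a simple sampling sketch: if B ~ Beta(a, b) and F is a parent cdf, then X = F^(-1)(B) follows the beta-generated-F distribution. The snippet below illustrates this with a standard exponential parent; the values of a and b are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(5)

# Beta-generated sampling: with B ~ Beta(a, b) and parent cdf F,
# X = F^{-1}(B) follows the beta-generated-F distribution.
# Parent here: standard exponential, so F^{-1}(u) = -log(1 - u).
a, b = 2.0, 3.0                       # illustrative generator parameters
B = rng.beta(a, b, size=100_000)
X = -np.log1p(-B)                     # beta-generated exponential sample
print(round(X.mean(), 2))             # E[X] = psi(a+b) - psi(b) = 7/12 here
```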
Weibull and generalised exponential overdispersion models with an application to ozone air pollution
Abstract:
We consider the problem of estimating the mean and variance of the time between occurrences of an event of interest (inter-occurrence times) when some form of dependence between two consecutive time intervals is allowed. Two basic density functions are taken into account: the Weibull and the generalised exponential density functions. In order to capture the dependence between two consecutive inter-occurrence times, we assume that the shape and/or scale parameters of the two density functions are given by autoregressive models. Expressions for the mean and variance of the inter-occurrence times are presented. The models are applied to ozone data from two regions of Mexico City. The estimation of the parameters is performed from a Bayesian point of view via Markov chain Monte Carlo (MCMC) methods.
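The idea of letting a density parameter follow an autoregression can be sketched as follows. This is a hedged illustration, not the authors' exact model: here a latent AR(1) process drives the Weibull log-scale, which induces dependence between consecutive inter-occurrence times; all parameter values are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar_weibull(size, shape=1.5, phi=0.6, mu=0.0, sigma=0.2):
    """Illustrative sketch: Weibull inter-occurrence times whose log-scale
    follows a latent AR(1) process, inducing dependence between
    consecutive intervals.  All parameter values are illustrative."""
    times = np.empty(size)
    log_scale = mu
    for i in range(size):
        times[i] = np.exp(log_scale) * rng.weibull(shape)
        # AR(1) update of the log-scale
        log_scale = mu + phi * (log_scale - mu) + sigma * rng.normal()
    return times

x = simulate_ar_weibull(5_000)
print(round(x.mean(), 2))
```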
Abstract:
For the first time, we introduce a generalized form of the exponentiated generalized gamma distribution [Cordeiro et al., The exponentiated generalized gamma distribution with application to lifetime data, J. Statist. Comput. Simul. 81 (2011), pp. 827-842] that is the baseline for the log-exponentiated generalized gamma regression model. The new distribution can accommodate increasing, decreasing, bathtub- and unimodal-shaped hazard functions. A second advantage is that it includes classical distributions reported in the lifetime literature as special cases. We obtain explicit expressions for the moments of the baseline distribution of the new regression model. The proposed model can be applied to censored data, since it includes as sub-models several widely known regression models, and can therefore be used more effectively in the analysis of survival data. We obtain maximum likelihood estimates of the model parameters for censored data. We show that our extended regression model is very useful by means of two applications to real data.
Abstract:
In many applications of lifetime data analysis, it is important to perform inference about the change-point of the hazard function. The change-point can be a maximum for unimodal hazard functions or a minimum for bathtub-shaped hazard functions, and it is usually of great interest in medical or industrial applications. For lifetime distributions where this change-point can be calculated analytically, its maximum likelihood estimator is easily obtained from the invariance property of maximum likelihood estimators, and confidence intervals follow from their asymptotic normality. Considering the exponentiated Weibull distribution for the lifetime data, the hazard function can take constant, increasing, unimodal, decreasing or bathtub forms. This model gives great flexibility of fit, but there is no analytic expression for the change-point of the hazard function. We therefore consider the use of Markov chain Monte Carlo methods to obtain posterior summaries for the change-point of the hazard function under the exponentiated Weibull distribution.
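Although the change-point has no closed form, the exponentiated Weibull hazard is easy to evaluate numerically and its mode can be located on a grid. The grid search below is a sketch standing in for the MCMC machinery the paper uses, and the parameter values are illustrative (α < 1 with αθ > 1 is a known sufficient condition for a unimodal hazard):

```python
import numpy as np

def ew_hazard(t, alpha, theta, sigma=1.0):
    """Hazard of the exponentiated Weibull distribution, whose cdf is
    F(t) = (1 - exp(-(t/sigma)**alpha))**theta."""
    z = (t / sigma) ** alpha
    F = (1.0 - np.exp(-z)) ** theta
    f = theta * (alpha / sigma) * (t / sigma) ** (alpha - 1) \
        * np.exp(-z) * (1.0 - np.exp(-z)) ** (theta - 1)
    return f / (1.0 - F)

# alpha < 1 with alpha*theta > 1: unimodal hazard, so the change-point
# is the interior maximum, found here by a simple grid search.
t = np.linspace(0.01, 5.0, 2000)
h = ew_hazard(t, alpha=0.5, theta=4.0)
t_change = t[np.argmax(h)]
print(round(t_change, 2))
```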
Abstract:
In this article, for the first time, we propose the negative binomial-beta Weibull (BW) regression model to study the recurrence of prostate cancer and to predict the cure fraction for patients with clinically localized prostate cancer treated by open radical prostatectomy. The cure model considers that a fraction of the survivors are cured of the disease. The survival function for the population of patients can be modeled by a parametric cure model using the BW distribution. We derive an explicit expansion for the moments of the recurrence-time distribution of the uncured individuals. The proposed distribution can be used to model survival data when the hazard rate function is increasing, decreasing, unimodal or bathtub-shaped. Another advantage is that the proposed model includes as special sub-models some well-known cure rate models discussed in the literature. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes. We analyze a real data set on localized prostate cancer patients after open radical prostatectomy.
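The basic cure-fraction idea (a plateau in the population survival function) can be illustrated with the simpler mixture cure model. This sketch uses a plain Weibull for the uncured individuals rather than the paper's negative binomial-beta Weibull formulation, and all parameter values are made up for the example:

```python
import numpy as np

def mixture_cure_survival(t, pi_cure, shape, scale):
    """Population survival under a simple mixture cure model (a sketch,
    not the negative binomial-beta Weibull model of the paper): a cured
    fraction pi_cure never fails; the rest follow a Weibull survival."""
    s_uncured = np.exp(-(t / scale) ** shape)
    return pi_cure + (1.0 - pi_cure) * s_uncured

t = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
print(mixture_cure_survival(t, pi_cure=0.3, shape=1.5, scale=4.0))
# the curve starts at 1 and plateaus at pi_cure for large t
```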
Abstract:
In this thesis, three nitroxide-based ionic systems were used to investigate the structure and dynamics of their respective solutions in mixed solvents by means of electron paramagnetic resonance (EPR) and electron nuclear double resonance (ENDOR) spectroscopy at X- and W-band (9.5 and 94.5 GHz, respectively).
First, the solvation of the inorganic radical Fremy's salt (K2ON(SO3)2) in isotope-substituted binary solvent mixtures (methanol/water) was investigated by means of high-field (W-band) pulse ENDOR spectroscopy and molecular dynamics (MD) simulations. From the analysis of orientation-selective 1H and 2H ENDOR spectra, the principal components of the hyperfine coupling (hfc) tensor for chemically different protons (alcoholic methyl vs. exchangeable protons) were obtained. The methyl protons of the organic solvent approach at a mean distance of 3.5 Å perpendicular to the approximate plane spanned by ON(S)2 of the probe molecule. Exchangeable protons were found to be distributed isotropically, approaching Fremy's salt most closely from the hydrogen-bonded network around the sulfonate groups. The distribution of exchangeable and methyl protons found in the MD simulations is in full agreement with the ENDOR results. The solvation was found to be similar for the studied solvent ratios between 1:2.3 and 2.3:1 and dominated by an interplay of H-bond (electrostatic) interactions and steric effects, with the NO group only marginally involved in H-bonds.
Further, the conformation of spin-labeled poly(diallyldimethylammonium chloride) (PDADMAC) in aqueous alcohol (methanol, ethanol, n-propanol, ethylene glycol, glycerol) mixtures, as a function of the divalent salt sodium sulfate, was investigated with double electron-electron resonance (DEER) spectroscopy. The DEER data were analyzed using the worm-like chain model, which suggests that in organic-water solvent mixtures the polymer backbones are preferentially solvated by the organic solvent.
We found a less severe impact of salt on conformational changes than usually predicted by polyelectrolyte theory, which stresses the importance of a delicate balance of hydrophobic and electrostatic interactions, in particular in the presence of organic solvents.
Finally, the structure and dynamics of miniemulsions and polymer dispersions prepared with anionic surfactants, partially replaced by a spin-labeled fatty acid, in the presence and absence of a lanthanide beta-diketonate complex, were characterized by CW EPR spectroscopy. Such miniemulsions form multilayers with the surfactant head group bound to the lanthanide ion. Beta-diketonates were formerly used as NMR shift reagents and nowadays find application as luminescent materials in OLEDs and LCDs and as contrast agents in MRI. Embedding the complex into a polymer matrix results in an easily processable material. It was found that the structure formation takes place in the miniemulsion and is preserved during polymerization. For surfactants with a carboxyl head group, a higher order of the alkyl chains and less lateral diffusion is found than for sulfate head groups, suggesting a more uniform and stronger coordination to the metal ion. The stability of these bilayers depends on the temperature and the surfactant used, which should be considered when choosing the polymerization temperature if a maximum yield of structured regions is desired.
Abstract:
Adaptation of vascular networks to functional demands requires vessel growth, vessel regression and vascular remodelling. Biomechanical forces resulting from blood flow play a key role in these processes. It is well known that metabolic stimuli, mechanical forces and flow patterns can affect gene expression and remodelling of vascular networks in different ways. For instance, in the sprouting type of angiogenesis related to hypoxia, there is no blood flow in the rising capillary sprout. In contrast, it has been shown that an increase of wall shear stress initiates the splitting type of angiogenesis in skeletal muscle. During development, both sprouting and intussusception act in parallel in building the vascular network, although with differences in spatiotemporal distribution. Thereby, in addition to regulatory molecules, flow dynamics support the patterning and remodelling of the growing vascular tree. Here, we present an overview of angiogenic processes, with emphasis on intussusceptive angiogenesis in relation to local haemodynamics.
Abstract:
Sizes and powers of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto) and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40% and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto and Wilcoxon tests perform at nominal 5% size expectations, but the F, Score and Mantel tests exceeded the 5% size confidence limits for one third of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in the power characteristics of the tests within the classes of tests considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests depend on the scale parameter of the Weibull distribution. Wilcoxon-type tests appear more powerful than logrank tests in the case of late-crossing survival curves and less powerful for early-crossing ones. Guidelines for the appropriate selection of two-sample tests are given.
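A minimal implementation of the Mantel (logrank) statistic for randomly censored exponential samples, in the spirit of the simulation above, might look as follows; the sample size, hazard ratio and censoring rates are illustrative choices, not the paper's exact design:

```python
import numpy as np
from scipy import stats

def logrank_test(time1, event1, time2, event2):
    """Two-sample logrank (Mantel) test, a minimal sketch.
    time*: observed times; event*: 1 = failure, 0 = censored."""
    times = np.concatenate([time1, time2])
    events = np.concatenate([event1, event2]).astype(bool)
    group = np.concatenate([np.zeros(len(time1)), np.ones(len(time2))])
    o_minus_e, var = 0.0, 0.0
    for t in np.unique(times[events]):        # distinct failure times
        at_risk = times >= t
        n = at_risk.sum()
        n1 = (at_risk & (group == 0)).sum()
        fail = events & (times == t)
        d = fail.sum()
        d1 = (fail & (group == 0)).sum()
        o_minus_e += d1 - d * n1 / n          # observed minus expected in group 1
        if n > 1:                             # hypergeometric variance term
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    chi2 = o_minus_e ** 2 / var
    return chi2, stats.chi2.sf(chi2, df=1)

rng = np.random.default_rng(1)
n = 32
t1 = rng.exponential(1.0, n); t2 = rng.exponential(2.0, n)   # hazard ratio 2
c1 = rng.exponential(2.5, n); c2 = rng.exponential(2.5, n)   # random censoring
time1, event1 = np.minimum(t1, c1), (t1 <= c1).astype(int)
time2, event2 = np.minimum(t2, c2), (t2 <= c2).astype(int)
chi2, p = logrank_test(time1, event1, time2, event2)
print(round(p, 3))
```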
Abstract:
The anthropogenic impact on the biomass of coastal plankton communities caused by submerged disposal of urban sewage waters (dumping) was studied. Observations were carried out in August-September of 2002-2004 in Mamala Bay (Oahu, Hawaiian Islands) using satellite and direct sea measurements. An analysis of the variability of integral indicators of the water column, determined on the basis of on-board measurements, allowed us to divide them into two groups: elements most sensitive to pollution (heterotrophic bacteria (H-Bact), phototrophic cyanobacteria Synechococcus spp. (SYN), and chlorophyll a (CHLa)) and elements that showed an episodic positive dependence on the inflow of polluted waters (heterotrophic unicellular eukaryotes, small unicellular algae, phototrophic green bacteria Prochlorococcus spp., and total microplankton biomass). It was shown that submerged waste water disposal in the region of the diffuser of the dumping device led to an insignificant (on average 1.2-1.4-fold) local increase in the integral biomass of H-Bact and SYN and in the concentration of CHLa. A similar but sharper (on average 1.5-2.1-fold) increase in these parameters was found in the water layers with maximal biomasses. Possible pathways of the disposed waters (under the pycnocline, at its upper boundary, and in the entire mixed layer) were analyzed on the basis of the vertical displacement of the biomasses of H-Bact, SYN and prochlorophytes. The possibility of using optical anomalies identified in satellite data as markers of anthropogenic eutrophication caused by dumping was confirmed. The application of such markers depends on water transparency and on the shapes of the vertical distribution curves of autotrophic organisms.
Abstract:
We study the first passage statistics to absorbing boundaries of a Brownian motion in bounded two-dimensional domains of different shapes and configurations of the absorbing and reflecting boundaries. From extensive numerical analysis we obtain the probability distribution P(ω) of the random variable ω = τ1/(τ1+τ2), which measures how similar the first passage times τ1 and τ2 of two independent realizations of a Brownian walk starting at the same location are. We construct a chart for each domain, determining whether P(ω) has a unimodal, bell-shaped form or a bimodal, M-shaped behavior. While in the former case the mean first passage time (MFPT) is a valid characteristic of the first passage behavior, in the latter case it is an insufficient measure of the process. Strikingly, we find a distinct turnover between the two modes of P(ω), characteristic of the domain shape and the respective location of the absorbing and reflecting boundaries. Our results demonstrate that large fluctuations of the first passage times may occur frequently in two-dimensional domains, rendering quite vague the general use of the MFPT as a robust measure of the actual behavior even in bounded domains, in which all moments of the first passage distribution exist.
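The ω statistic can be reproduced with a few lines of code. The 1D interval with two absorbing endpoints used below is a simplified stand-in for the 2D domains of the paper, and the step size and sample count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(7)

def first_passage_time(x0=0.5, dt=1e-3):
    """First passage time of a 1D Brownian motion started at x0 in (0, 1)
    to either absorbing endpoint (simple Euler scheme)."""
    x, t, step = x0, 0.0, np.sqrt(dt)
    while 0.0 < x < 1.0:
        x += step * rng.normal()
        t += dt
    return t

tau = np.array([first_passage_time() for _ in range(400)])
tau1, tau2 = tau[:200], tau[200:]
omega = tau1 / (tau1 + tau2)       # similarity measure of two passage times
print(round(omega.mean(), 2))      # symmetry forces E[omega] = 1/2
```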
Abstract:
This project comprises the complete design of a local DTT (digital terrestrial television) distribution network broadcast as an SFN (Single Frequency Network). This type of broadcasting delivers television services on a single frequency over a coverage area, whether local or regional, exploiting signal echoes in interference zones and thereby avoiding the use of a different frequency for each transmission site within the coverage area. For the distribution network, an IP network with multicast distribution was chosen, since this is the prevailing technology today; analogue distribution is now obsolete, as it consumes far more resources and is therefore much more costly to implement. The document is divided into four chapters. The first chapter gives a theoretical introduction to SFN distribution networks, focusing on the calculation of delays, a fundamental point in the design of such networks, followed by basic notions of IP networks and the multicast protocol on which the transport of the signal is based. Chapter two focuses on the design of the network, from the production centres, where the programmes to be broadcast are generated, through the multiplexing centre, where the head-end that composes the multiplex is located, to the various transmission sites covering the whole required coverage area. The equipment and the design of the different centres that make up the network (production, multiplexing and transmission) are described, and the signal delay calculation required in this type of network is carried out.
Chapter three describes the configuration of the network, both at the equipment level and in the IP addressing plan of the entire network, separating the service network from the management network for greater reliability and efficiency. The document closes with a description of network management, which, by means of different tools, provides real-time monitoring of the whole system, making it possible to anticipate and prevent incidents that could degrade the service delivered to the end user.
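The delay calculation that the first chapter centres on has a simple core: signals from two SFN transmitters must reach a receiver within the OFDM guard interval to add constructively. The sketch below uses the standard DVB-T 8K-mode figures for an 8 MHz channel (Tu = 896 µs); the function name is illustrative, not from the project:

```python
# Sketch of the core SFN delay budget: the propagation-delay difference
# between two transmitters must fit inside the OFDM guard interval.
# Figures below are the standard DVB-T 8K-mode values for an 8 MHz channel.
C = 299_792_458.0                       # speed of light in m/s

def max_sfn_distance_km(useful_symbol_us, guard_fraction):
    """Maximum transmitter separation whose propagation-delay difference
    still fits inside the guard interval."""
    guard_us = useful_symbol_us * guard_fraction
    return C * guard_us * 1e-6 / 1000.0

# Tu = 896 us, guard interval 1/4 -> Tg = 224 us
print(round(max_sfn_distance_km(896.0, 1 / 4), 1))   # ≈ 67.2 km
```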
Abstract:
In this work, a new family of distributions is proposed, which allows survival data to be modeled when the hazard function has unimodal or U-shaped (bathtub) forms. Modifications of the Weibull, Fréchet, generalized half-normal, log-logistic and lognormal distributions were also considered. Using uncensored and censored data, maximum likelihood estimators were considered for the proposed model in order to verify the flexibility of the new family. In addition, a location-scale regression model was used to assess the influence of covariates on survival times. A residual analysis based on modified deviance residuals was also carried out. Simulation studies, using different parameter settings, censoring percentages and sample sizes, were conducted in order to examine the empirical distribution of the martingale-type and modified deviance residuals. To detect influential observations, local influence measures were used; these are diagnostic measures based on small perturbations of the data or of the proposed model. Situations may occur in which the assumption of independence between failure and censoring times does not hold. Thus, another aim of this work is to consider an informative censoring mechanism, based on the marginal likelihood, using the log-odd log-logistic Weibull distribution in the modeling. Finally, the described methodologies are applied to real data sets.
Abstract:
The Amazon forest plays an important environmental, social and economic role for the region, for the country and for the world. Logging techniques that aim to reduce the impacts caused to the forest are therefore essential. The objective of this thesis is to compare Reduced-Impact Logging with Conventional Logging in the Brazilian Amazon through empirical individual-tree growth and yield models. The experiment was installed at the Agrossete farm, located in Paragominas, PA. In 1993, three areas of this farm were selected for harvesting. In the first area, 105 hectares were harvested through Reduced-Impact Logging. In the second area, 75 hectares were subjected to Conventional Logging. Finally, the third area was kept as a control. Diameter at breast height was measured and species were identified within a 24.5-hectare plot, randomly installed in each area, in 1993 (before harvest), 1994 (six months after harvest), 1995, 1996, 1998, 2000, 2003, 2006 and 2009. The three areas were then compared by fitting a diameter-increment model, allowing the stochastic effect to assume four other distributions besides the normal, together with a mortality-probability model and a recruitment-probability model. The diameter-increment behaviour indicated that the logged areas behave in the same way in almost all species groups, with the exception of the intermediate species group. Trees subjected to logging show greater diameter growth than those in the unlogged area. Moreover, assuming a Weibull-distributed stochastic effect improved the fit of the models.
Regarding mortality probability, the logged areas again behave similarly to each other but differently from the unlogged area, with trees in the logged areas having a higher probability of death than those in the unlogged area. The recruitment-probability models indicated differences only between the logged areas and the control area, with the logged areas showing a higher recruitment rate than the unlogged area. Therefore, the individual behaviour of trees after harvesting is the same under Conventional Logging and Reduced-Impact Logging.
Abstract:
We consider the problem of estimating P(Y1 + ... + Yn > x) by importance sampling when the Yi are i.i.d. and heavy-tailed. The idea is to exploit the cross-entropy method as a tool for choosing good parameters in the importance sampling distribution; in doing so, we use the asymptotic description that, given Y1 + ... + Yn > x, n - 1 of the Yi have distribution F and one has the conditional distribution of Y given Y > x. We show in some specific parametric examples (Pareto and Weibull) how this leads to precise answers which, as demonstrated numerically, are close to being variance-minimal within the parametric class under consideration. Related problems for M/G/1 and GI/G/1 queues are also discussed.
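The single-big-jump asymptotic the authors exploit, P(Y1 + ... + Yn > x) ≈ n·P(Y1 > x) for heavy-tailed summands, can be checked numerically against crude Monte Carlo. The Pareto index, n and threshold x below are illustrative, and this sketch does not implement the cross-entropy tuning itself:

```python
import numpy as np

rng = np.random.default_rng(3)

n, alpha, x = 5, 1.5, 200.0          # illustrative: 5 summands, Pareto index 1.5

def pareto_tail(x, alpha):
    """P(Y > x) for a Pareto variable with minimum 1 and index alpha."""
    return x ** (-alpha)

# Heavy-tailed "one big jump" approximation for the sum's tail.
asymptotic = n * pareto_tail(x, alpha)

# Crude Monte Carlo for comparison: shift numpy's Lomax draws to support [1, inf).
samples = 1.0 + rng.pareto(alpha, size=(200_000, n))
crude = (samples.sum(axis=1) > x).mean()
print(asymptotic, crude)
```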