967 results for Generalized inverse Gaussian distribution


Relevance: 30.00%

Abstract:

A five-parameter distribution, the so-called beta modified Weibull distribution, is defined and studied. The new distribution contains, as special submodels, several important distributions discussed in the literature, such as the generalized modified Weibull, beta Weibull, exponentiated Weibull, beta exponential, modified Weibull, and Weibull distributions, among others. The new distribution can be used effectively in the analysis of survival data since it accommodates monotone, unimodal, and bathtub-shaped hazard functions. We derive the moments and examine the order statistics and their moments. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set is used to illustrate the importance and flexibility of the new distribution.
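
For reference, the display below sketches the usual beta-G construction with the modified Weibull of Lai, Xie and Murthy as baseline; the abstract does not give its parameterization, so the symbols (a, b, alpha, gamma, lambda) are illustrative rather than the paper's notation.

```latex
F(x) = I_{G(x)}(a,b), \qquad
f(x) = \frac{g(x)}{B(a,b)}\, G(x)^{a-1} \{1-G(x)\}^{b-1}, \qquad
G(x) = 1-\exp\!\bigl\{-\alpha x^{\gamma} e^{\lambda x}\bigr\}, \qquad
g(x) = \alpha x^{\gamma-1}(\gamma+\lambda x)\, e^{\lambda x}
       \exp\!\bigl\{-\alpha x^{\gamma} e^{\lambda x}\bigr\},
```

where I_z(a,b) is the regularized incomplete beta function and x > 0. In this parameterization, lambda = 0 recovers a beta Weibull, lambda = 0 with gamma = 1 a beta exponential, a = b = 1 a modified Weibull, and a = b = 1 with lambda = 0 an ordinary Weibull.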

Relevance: 30.00%

Abstract:

In a sample of censored survival times, the presence of an immune proportion of individuals who are not subject to death, failure, or relapse may be indicated by a relatively high number of individuals with large censored survival times. In this paper the generalized log-gamma model is modified to allow for the possibility that long-term survivors are present in the data. The model attempts to estimate separately the effects of covariates on the surviving fraction, that is, the proportion of the population for which the event never occurs. The logistic function is used for the regression model of the surviving fraction. Inference for the model parameters is considered via maximum likelihood. Some influence methods, such as local influence and the total local influence of an individual, are derived, analyzed, and discussed. Finally, a data set from the medical area is analyzed under the generalized log-gamma mixture model. A residual analysis is performed in order to select an appropriate model.
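
As a rough illustration of the mixture (long-term survivor) structure described above, the sketch below writes the population survival as the cured fraction plus the susceptible survival, with a logistic link for the cured fraction. A generalized gamma stands in for the generalized log-gamma latency, and all names, covariates, and parameter values are hypothetical.

```python
import numpy as np
from scipy.special import expit
from scipy.stats import gengamma

def cure_survival(t, x_cov, gamma_coef, a, c, scale):
    """Population survival of a mixture cure model:
       S_pop(t | x) = pi(x) + (1 - pi(x)) * S0(t),
    where pi(x) = expit(x' gamma) is the logistic surviving (cured) fraction and
    S0 is the survival of the susceptible group (generalized gamma here, standing
    in for the generalized log-gamma of the abstract)."""
    pi = expit(x_cov @ gamma_coef)
    s0 = gengamma.sf(t, a, c, scale=scale)
    return pi + (1.0 - pi) * s0

# Illustrative covariates: intercept plus one binary treatment indicator.
x_cov = np.array([[1.0, 0.0], [1.0, 1.0]])
gamma_coef = np.array([-1.0, 0.8])          # hypothetical logistic coefficients
t = np.array([2.0])
print(cure_survival(t, x_cov, gamma_coef, a=1.2, c=1.0, scale=3.0))
```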

Relevance: 30.00%

Abstract:

Estimation of Taylor's power law for species abundance data may be performed by linear regression of the log empirical variances on the log means, but this method suffers from a problem of bias for sparse data. We show that the bias may be reduced by using a bias-corrected Pearson estimating function. Furthermore, we investigate a more general regression model allowing for site-specific covariates. This method may be efficiently implemented using a Newton scoring algorithm, with standard errors calculated from the inverse Godambe information matrix. The method is applied to a set of biomass data for benthic macrofauna from two Danish estuaries.
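
For orientation, here is a minimal sketch of the naive log-log regression version of Taylor's power law, i.e., the estimator that the abstract notes is biased for sparse data; the bias-corrected Pearson estimating function and the Newton scoring algorithm of the paper are not reproduced. The data are simulated and all names are illustrative.

```python
import numpy as np

# Hypothetical abundance counts: rows = sites, columns = replicate samples per site.
rng = np.random.default_rng(2)
site_means = rng.gamma(2.0, 3.0, size=15)
counts = rng.poisson(site_means[:, None] * rng.gamma(5.0, 0.2, size=(15, 8)))

m = counts.mean(axis=1)                 # empirical mean per site
v = counts.var(axis=1, ddof=1)          # empirical variance per site

# Naive Taylor's power law fit: log v = log a + b * log m.
keep = (m > 0) & (v > 0)
b, log_a = np.polyfit(np.log(m[keep]), np.log(v[keep]), 1)
print("Taylor exponent b =", b, ", coefficient a =", np.exp(log_a))
```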

Relevance: 30.00%

Abstract:

We study in detail the so-called beta modified Weibull distribution, motivated by the wide use of the Weibull distribution in practice and by the fact that this generalization provides a continuous crossover towards cases with different shapes. The new distribution is important since it contains, as special sub-models, some widely known distributions, such as the generalized modified Weibull, beta Weibull, exponentiated Weibull, beta exponential, modified Weibull and Weibull distributions, among several others. It also provides more flexibility to analyse complex real data. Various mathematical properties of this distribution are derived, including its moments and moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are also derived for the chf, mean deviations, Bonferroni and Lorenz curves, reliability, and entropies. The estimation of parameters is approached by two methods: moments and maximum likelihood. We compare by simulation the performances of the estimates from these methods. We obtain the expected information matrix. Two applications are presented to illustrate the proposed distribution.
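
A minimal sketch of maximum-likelihood fitting for a beta modified Weibull, assuming the beta-G construction with a modified Weibull baseline as in the display further above; the paper's exact parameterization and its moment estimators are not reproduced, and the data, starting values, and optimizer settings below are illustrative.

```python
import numpy as np
from scipy.special import betaln
from scipy.optimize import minimize

def bmw_logpdf(x, a, b, alpha, gamma, lam):
    """Log-density of a beta modified Weibull variate, built from the beta-G
    construction with baseline cdf G(x) = 1 - exp(-alpha * x**gamma * exp(lam*x))."""
    H = alpha * x**gamma * np.exp(lam * x)             # baseline cumulative hazard
    G = -np.expm1(-H)                                  # baseline cdf, 1 - exp(-H)
    log_g = np.log(alpha) + (gamma - 1) * np.log(x) + lam * x + np.log(gamma + lam * x) - H
    return (log_g - betaln(a, b)
            + (a - 1) * np.log(G) + (b - 1) * (-H))    # log(1 - G) = -H

def neg_loglik(params, x):
    a, b, alpha, gamma, lam = np.exp(params)           # optimize on the log scale (positivity)
    return -np.sum(bmw_logpdf(x, a, b, alpha, gamma, lam))

# Illustrative positive data; any lifetime sample could be plugged in here.
rng = np.random.default_rng(1)
x = rng.weibull(1.5, size=200) * 2.0

start = np.log([1.0, 1.0, 0.5, 1.5, 0.1])
fit = minimize(neg_loglik, start, args=(x,), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
print("MLEs (a, b, alpha, gamma, lambda):", np.exp(fit.x))
```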

Relevance: 30.00%

Abstract:

Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently exhibit constant variance, will under-represent the variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models then underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The joint posterior density function was sampled using Markov chain Monte Carlo algorithms, allowing inferences on the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.
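
As a toy counterpart of the Bayesian overdispersion idea described above, the sketch below samples a logistic random-effect binomial model with a plain random-walk Metropolis step. The data, priors, and tuning constants are hypothetical and do not reproduce the paper's DGLM or its apple tissue culture application.

```python
import numpy as np
from scipy.special import expit, gammaln

rng = np.random.default_rng(0)

# Hypothetical proportion data: successes y out of n trials per experimental unit.
n = np.array([20, 20, 20, 20, 20, 20])
y = np.array([ 3, 15,  8, 12,  1, 18])   # more spread than a single binomial p allows

def log_post(theta):
    """Log posterior of an overdispersed binomial model:
       y_i ~ Binomial(n_i, p_i), logit(p_i) = beta0 + b_i, b_i ~ N(0, sigma^2)."""
    beta0, log_sigma = theta[0], theta[1]
    b = theta[2:]
    sigma = np.exp(log_sigma)
    p = expit(beta0 + b)
    loglik = np.sum(gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
                    + y * np.log(p) + (n - y) * np.log1p(-p))
    logprior = (-0.5 * (beta0 / 10.0) ** 2                      # vague N(0, 10^2) on beta0
                - 0.5 * np.sum((b / sigma) ** 2) - b.size * log_sigma
                - 0.5 * (log_sigma / 2.0) ** 2)                 # weak prior on log sigma
    return loglik + logprior

# Random-walk Metropolis over the joint parameter vector (beta0, log sigma, b_1..b_k).
theta = np.zeros(2 + y.size)
samples, lp = [], log_post(theta)
for it in range(20000):
    prop = theta + 0.1 * rng.standard_normal(theta.size)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if it >= 5000 and it % 10 == 0:          # burn-in and thinning
        samples.append(theta.copy())

samples = np.asarray(samples)
print("posterior mean of beta0:", samples[:, 0].mean())
print("posterior mean of sigma:", np.exp(samples[:, 1]).mean())
```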

Relevance: 30.00%

Abstract:

The residence time distribution and mean residence time of a 10% sodium bicarbonate solution dried in a conventional spouted bed with inert bodies were measured with the stimulus-response method. Methylene blue was used as a chemical tracer, and the effects of the paste feed mode, the size distribution of the inert bodies, and the mean particle size on the residence times and dried powder properties were investigated. The results showed that the residence time distributions could be best reproduced with the perfect mixing cell model, i.e., N = 1 in the continuous stirred tank reactors (CSTRs) in series model. The mean residence times ranged from 6.04 to 12.90 min and were significantly affected by the factors studied. Analysis of variance on the experimental data showed that the mean residence times were affected by the mean diameter of the inert bodies at a significance level of 1% and by their size distribution at a level of 5%. Moreover, altering the paste feed from dripping to pneumatic atomization affected the mean residence time at a 5% significance level. The dried powder characteristics proved to be adequate for further industrial manipulation, as demonstrated by the low moisture content, narrow range of particle sizes, and good flow properties. The results of this research are significant for the study of the drying of heat-sensitive materials because they show that, by simultaneously changing the size distribution and average size of the inert bodies, the mean residence time of a paste can be reduced by half, thus decreasing losses due to degradation.
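
A small sketch of the quantities involved: the tanks-in-series residence time distribution (the N = 1 case is the perfect mixing cell model mentioned above) and the mean residence time computed from a stimulus-response tracer curve. The tracer data below are made up.

```python
import numpy as np
from scipy.special import factorial
from scipy.integrate import trapezoid

def tanks_in_series_rtd(t, tau, N=1):
    """Residence time distribution of N equal stirred tanks in series; N = 1 is the
    perfect mixing cell model:
        E(t) = t**(N-1) / ((N-1)! * tau**N) * exp(-t / tau)."""
    return t**(N - 1) / (factorial(N - 1) * tau**N) * np.exp(-t / tau)

# Hypothetical stimulus-response data: tracer concentration C(t) at the dryer outlet.
t = np.linspace(0.0, 60.0, 121)                       # minutes
c = np.exp(-t / 9.0) * (1.0 - np.exp(-t / 1.5))       # made-up tracer curve

e = c / trapezoid(c, t)                               # experimental E(t), normalized
t_mean = trapezoid(t * e, t)                          # mean residence time
print("mean residence time (min):", round(t_mean, 2))
print("model E(t_mean) with tau = t_mean, N = 1:", tanks_in_series_rtd(t_mean, tau=t_mean))
```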

Relevance: 30.00%

Abstract:

We discuss the expectation propagation (EP) algorithm for approximate Bayesian inference using a factorizing posterior approximation. For neural network models, we use a central limit theorem argument to make EP tractable when the number of parameters is large. For two types of models, we show that EP can achieve optimal generalization performance when data are drawn from a simple distribution.
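
For context, the generic EP cycle with a factorizing approximation runs as follows (standard Minka-style updates, with Proj denoting moment matching within the chosen approximating family; the central-limit-theorem simplification for large neural networks used in the paper is not shown):

```latex
q(\theta) \propto p_0(\theta)\prod_i \tilde t_i(\theta),
\qquad
q^{\setminus i}(\theta) \propto \frac{q(\theta)}{\tilde t_i(\theta)} \;\text{(cavity)},
\qquad
q^{\mathrm{new}}(\theta) = \operatorname{Proj}\!\left[q^{\setminus i}(\theta)\, t_i(\theta)\right] \;\text{(moment matching)},
\qquad
\tilde t_i(\theta) \propto \frac{q^{\mathrm{new}}(\theta)}{q^{\setminus i}(\theta)} \;\text{(site update)},
```

with the cycle repeated over the likelihood factors until the site terms stop changing.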

Relevance: 30.00%

Abstract:

In this paper we study the possible microscopic origin of heavy-tailed probability distributions for the price variation of financial instruments. We extend the standard log-normal process to include another random component in the so-called stochastic volatility models. We study these models under an assumption, akin to the Born-Oppenheimer approximation, in which the volatility has already relaxed to its equilibrium distribution and acts as a background to the evolution of the price process. In this approximation, we show that all models of stochastic volatility should exhibit a scaling relation in the time lag of zero-drift modified log-returns. We verify that the Dow Jones Industrial Average index indeed follows this scaling. We then focus on two popular stochastic volatility models, the Heston and Hull-White models. In particular, we show that in the Hull-White model the resulting probability distribution of log-returns in this approximation corresponds to the Tsallis (Student-t) distribution. The Tsallis parameters are given in terms of the microscopic stochastic volatility model. Finally, we show that the log-returns for 30 years of Dow Jones index data are well fitted by a Tsallis distribution, and we obtain the relevant parameters.
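
As a numerical companion to the argument above, the sketch simulates a Heston-type stochastic volatility model with a simple Euler scheme and fits a Student-t (the classical form of the Tsallis distribution) to the centred log-returns. The parameters, step sizes, and the choice of model are illustrative and do not reproduce the paper's analysis of the Dow Jones data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Euler scheme for a Heston-type stochastic volatility model (parameters illustrative):
#   dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,   d(log S) = (mu - v/2) dt + sqrt(v) dW1.
n, dt = 100_000, 1.0 / 252
mu, kappa, theta, xi = 0.05, 4.0, 0.04, 0.5
v = 0.04
r = np.empty(n)                                   # one-step log-returns
for i in range(n):
    dw1, dw2 = rng.standard_normal(2) * np.sqrt(dt)
    r[i] = (mu - 0.5 * v) * dt + np.sqrt(max(v, 0.0)) * dw1
    v += kappa * (theta - v) * dt + xi * np.sqrt(max(v, 0.0)) * dw2

# Center the returns (zero drift) and fit a Student-t density to the heavy tails.
r -= r.mean()
df, loc, scale = stats.t.fit(r)
print("fitted Student-t degrees of freedom:", df)
```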

Relevance: 30.00%

Abstract:

Background: Herpesviruses may be related to the etiology of aggressive periodontitis (AgP) and chronic periodontitis (CP) by triggering periodontal destruction or by increasing the risk for bacterial infection. This case-control study evaluated the presence of herpes simplex virus type 1 (HSV-1), Epstein-Barr virus type 1 (EBV-1), human cytomegalovirus (HCMV), Aggregatibacter actinomycetemcomitans (previously Actinobacillus actinomycetemcomitans), Porphyromonas gingivalis, Prevotella intermedia, and Tannerella forsythia (previously T. forsythensis) in patients with generalized AgP (AgP group), CP (CP group), or gingivitis (G group) and in healthy individuals (C group). Methods: Subgingival plaque samples were collected with paper points from 30 patients in each group. The nested polymerase chain reaction (PCR) method was used to detect HSV-1, EBV-1, and HCMV. Bacteria were identified by 16S rRNA-based PCR. Results: HSV-1, HCMV, and EBV-1 were detected in 86.7%, 46.7%, and 33.3% of the AgP group, respectively; in 40.0%, 50.0%, and 46.7% of the CP group, respectively; in 53.3%, 40.0%, and 20.0% of the G group, respectively; and in 20.0%, 56.7%, and 0.0% of the C group, respectively. A. actinomycetemcomitans was detected significantly more often in the AgP group compared to the other groups (P<0.005). P. gingivalis and T. forsythia were identified more frequently in AgP and CP groups, and AgP, CP, and G groups had higher frequencies of P. intermedia compared to the C group. Conclusion: In Brazilian patients, HSV-1 and EBV-1, rather than HCMV, were more frequently associated with CP and AgP. J Periodontol 2008;79:2313-2321.

Relevance: 30.00%

Abstract:

We consider the statistical properties of the local density of states of a one-dimensional Dirac equation in the presence of various types of disorder with Gaussian white-noise distribution. It is shown how either the replica trick or supersymmetry can be used to calculate exactly all the moments of the local density of states. Careful attention is paid to how the results change if the local density of states is averaged over atomic length scales. For both the replica trick and supersymmetry, the problem is reduced to finding the ground state of a zero-dimensional Hamiltonian which is written solely in terms of a pair of coupled spins which are elements of u(1, 1). This ground state is explicitly found for the particular case of the Dirac equation corresponding to an infinite metallic quantum wire with a single conduction channel. The calculated moments of the local density of states agree with those found previously by Al'tshuler and Prigodin [Sov. Phys. JETP 68 (1989) 198] using a technique based on recursion relations for Feynman diagrams.

Relevance: 30.00%

Abstract:

In the past century, the debate over whether or not density-dependent factors regulate populations has generally focused on changes in mean population density, ignoring the spatial variance around the mean as unimportant noise. In an attempt to provide a different framework for understanding population dynamics based on individual fitness, this paper discusses the crucial role of spatial variability itself in the stability of insect populations. The advantages of this method are the following: (1) it is founded on evolutionary principles rather than post hoc assumptions; (2) it erects hypotheses that can be tested; and (3) it links disparate ecological schools, including spatial dynamics, behavioral ecology, preference-performance, and plant apparency, into an overall framework. At the core of this framework, habitat complexity governs insect spatial variance, which in turn determines population stability. First, the minimum risk distribution (MRD) is defined as the spatial distribution of individuals that results in the minimum number of premature deaths in a population given the distribution of mortality risk in the habitat (and therefore maximizing population growth). The greater the divergence of actual spatial patterns of individuals from the MRD, the greater the reduction of population growth and size from high, unstable levels. Then, based on extensive data from 29 populations of the processionary caterpillar, Ochrogaster lunifer, four steps are used to test the effect of habitat interference on population growth rates. (1) The costs (increasing the risk of scramble competition) and benefits (decreasing the risk of inverse density-dependent predation) of egg and larval aggregation are quantified. (2) These costs and benefits, along with the distribution of resources, are used to construct the MRD for each habitat. (3) The MRD is used as a benchmark against which the actual spatial pattern of individuals is compared. The degree of divergence of the actual spatial pattern from the MRD is quantified for each of the 29 habitats. (4) Finally, indices of habitat complexity are used to provide highly accurate predictions of spatial divergence from the MRD, showing that habitat interference reduces population growth rates from high, unstable levels. The reason for the divergence appears to be that high levels of background vegetation (vegetation other than host plants) interfere with female host-searching behavior. This leads to a spatial distribution of egg batches with high mortality risk, and therefore lower population growth. Knowledge of the MRD in other species should be a highly effective means of predicting trends in population dynamics. Species with high divergence between their actual spatial distribution and their MRD may display relatively stable dynamics at low population levels. In contrast, species with low divergence should experience high levels of intragenerational population growth, leading to frequent habitat-wide outbreaks and unstable dynamics in the long term. Six hypotheses, erected under the framework of spatial interference, are discussed, and future tests are suggested.

Relevance: 30.00%

Abstract:

Sensitivity of the output of a linear operator to its input can be quantified in various ways. In control theory, the input is usually interpreted as a disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite-power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite-power or directionally generic inputs whose anisotropy is bounded above by a >= 0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on a multidimensional integer lattice to yield the mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation-invariant operators over such fields.
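
One common way to write the anisotropy functional described above, for a finite-power random vector w in R^m with density f and differential entropy h(w), is sketched below (notation illustrative; the paper's exact conventions may differ):

```latex
\mathbf{A}(w)
= \min_{\lambda>0} D\!\left(f \,\Vert\, \mathcal N(0,\lambda I_m)\right)
= \min_{\lambda>0}\left[-h(w) + \frac{m}{2}\ln(2\pi\lambda) + \frac{\mathbf{E}\,|w|^2}{2\lambda}\right]
= \frac{m}{2}\,\ln\!\left(\frac{2\pi e}{m}\,\mathbf{E}\,|w|^2\right) - h(w),
```

the minimum being attained at lambda = E|w|^2 / m; the functional vanishes exactly on Gaussian vectors with zero mean and scalar covariance matrix, which is why it measures deviation from the LQG white-noise hypothesis.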

Relevance: 30.00%

Abstract:

This paper presents a new and efficient methodology for distribution network reconfiguration integrated with optimal power flow (OPF), based on a Benders decomposition approach. The objective is to minimize power losses and balance the load among feeders, subject to constraints on branch capacities, minimum and maximum power limits of substations or distributed generators, bus voltage deviations, and radial operation of the network. The generalized Benders decomposition algorithm is applied to solve the problem. The formulation comprises two stages. The first stage is the Master problem, formulated as a mixed-integer non-linear programming problem; it determines the radial topology of the distribution network. The second stage is the Slave problem, formulated as a non-linear programming problem; it checks the feasibility of the Master problem solution by means of an OPF and provides the information needed to formulate the linear Benders cuts that connect the two problems. The model is programmed in GAMS. The effectiveness of the proposal is demonstrated through two examples taken from the literature.
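
In generic terms, the two-stage scheme described above alternates between a slave OPF for a fixed radial topology and a master problem that collects Benders cuts built from the slave solutions; the symbols below are illustrative and the exact cut used in the paper may differ:

```latex
\text{Slave (topology } z^k \text{ fixed):}\quad
v(z^k) = \min_{x}\;\bigl\{\, f_{\mathrm{loss}}(x) : g(x, z^k) \le 0 \,\bigr\}
\;\text{ with multipliers } \lambda^k,
\qquad
\text{Master:}\quad
\min_{z \in \mathcal Z_{\mathrm{radial}},\; \eta}\; \eta
\quad\text{s.t.}\quad
\eta \ge f_{\mathrm{loss}}(x^k) + (\lambda^k)^{\top} g(x^k, z),\;\; k = 1,\dots,K,
```

iterating until the master's lower bound meets the best slave objective (the upper bound).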

Relevance: 30.00%

Abstract:

Radiotherapy (RT) is one of the most important approaches in the treatment of cancer, and its performance can be improved in three different ways: through the optimization of the dose distribution, through the use of different irradiation techniques, or through the study of radiobiological initiatives. The first is purely physical because it is related to the physical dose distribution. The others are purely radiobiological because they increase the differential effect between the tumour and the healthy tissues. Treatment planning systems (TPS) are used in RT to create dose distributions with the purpose of maximizing tumour control and minimizing complications in the healthy tissues. Inverse planning uses dose optimization techniques that satisfy the criteria specified by the user regarding the target and the organs at risk (OARs). Dose optimization is made possible through the analysis of dose-volume histograms (DVHs) and the use of computed tomography, magnetic resonance imaging, and other digital imaging techniques.
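
A small sketch of the cumulative DVH computation referred to above: for each dose level, the fraction of a structure's volume receiving at least that dose. The voxel doses below are synthetic, and the function name and bin width are illustrative.

```python
import numpy as np

def cumulative_dvh(dose, bin_width=0.5):
    """Cumulative dose-volume histogram for a single structure: for each dose level d,
    the fraction of the structure's voxels (hence volume, for uniform voxels) that
    receives at least d Gy."""
    edges = np.arange(0.0, dose.max() + bin_width, bin_width)
    vol_frac = np.array([(dose >= d).mean() for d in edges])
    return edges, vol_frac

# Hypothetical voxel doses (Gy) for one organ at risk.
rng = np.random.default_rng(4)
dose = rng.gamma(shape=8.0, scale=2.5, size=10_000)
edges, vol_frac = cumulative_dvh(dose)
print("V20Gy (volume fraction receiving >= 20 Gy):", vol_frac[np.searchsorted(edges, 20.0)])
```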

Relevance: 30.00%

Abstract:

The paper revisits the convolution operator and addresses its generalization from the perspective of fractional calculus. Two examples demonstrate the feasibility of the concept using analytical expressions and the inverse Fourier transform, for real and complex orders. Two approximate calculation schemes in the time domain are also tested.
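
As a simple numerical illustration of fractional-order operators realized through the (inverse) Fourier transform, the sketch below applies the spectral multiplier (i*omega)**alpha to a periodic signal. This is a generic textbook construction, not the paper's generalized convolution operator; the test case checks that two half-derivatives compose to a first derivative.

```python
import numpy as np

def fractional_derivative_fft(f, dx, alpha):
    """Fractional derivative of order alpha of a sampled periodic signal, computed in
    the Fourier domain via the multiplier (i*omega)**alpha."""
    n = f.size
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)      # angular frequencies
    mult = (1j * omega) ** alpha
    mult[0] = 0.0                                      # drop the zero-frequency term
    return np.real(np.fft.ifft(mult * np.fft.fft(f)))

# Check against a known case: the half-derivative applied twice gives the first derivative.
x = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
f = np.sin(3.0 * x)
half = fractional_derivative_fft(f, x[1] - x[0], 0.5)
again = fractional_derivative_fft(half, x[1] - x[0], 0.5)
print("max error vs 3*cos(3x):", np.abs(again - 3.0 * np.cos(3.0 * x)).max())
```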