75 results for 2-EPT probability density function

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

100.00%

Abstract:

OBJECTIVE: To estimate reference values and a ranking function for faculty members in Public Health (Saúde Coletiva) in Brazil by analyzing the distribution of the h-index. METHODS: From the portal of the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), 934 faculty members were identified in 2008, of whom 819 were analyzed. The h-index of each faculty member was obtained from the Web of Science using search algorithms that controlled for homonyms and name-spelling variants. For each region, and for Brazil as a whole, an exponential probability density function was fitted, parameterized by the regional mean and decay rate. Measures of position were identified and, using the complementary cumulative probability function, a ranking function among authors according to the h-index was derived for each region. RESULTS: Of the faculty members, 29.8% had no citation record at all (h = 0). The mean h for the country was 3.1, with the highest mean in the South region (4.7). The median h for the country was 2.1, again highest in the South (3.2). For a population of authors standardized to one hundred, the top-ranked author for the country should have h = 16; stratifying by region, the first position requires higher values in the Northeast, Southeast, and South, reaching h = 24 in the latter. CONCLUSIONS: As measured by Web of Science h-indices, most authors in Public Health do not exceed h = 5. There are differences between regions, with the South performing best and similar values in the Southeast and Northeast.
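A minimal sketch of the ranking construction described above, assuming the exponential density is parameterized by its mean (rate = 1/mean) and the author population is standardized to one hundred; the function name is ours, and the output is illustrative rather than the study's fitted result.

    import math

    def exponential_rank_threshold(mean_h, population=100, rank=1):
        """h-index needed to occupy a given rank among `population` authors,
        assuming h follows an exponential density with the given mean."""
        rate = 1.0 / mean_h                       # decay rate of the fitted exponential
        # survival function S(h) = exp(-rate * h); the top `rank` authors sit in
        # the upper tail of probability rank/population
        return -math.log(rank / population) / rate

    # Illustrative value from the abstract: mean h = 3.1 for Brazil as a whole
    print(exponential_rank_threshold(3.1))    # ~14.3, near the reported h = 16

The residual gap to the reported h = 16 would come from the region-specific parameters actually fitted in the study.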

Relevance:

100.00%

Abstract:

Consider N sites randomly and uniformly distributed in a d-dimensional hypercube. A walker explores this disordered medium by going to the nearest site that has not been visited in the last $\mu$ (memory) steps. The walker trajectory is composed of a transient part and a periodic part (cycle). For one-dimensional systems, walkers may or may not explore all the available space, giving rise to a crossover between localized and extended regimes at the critical memory $\mu_1 = \log_2 N$. The deterministic rule can be softened to consider more realistic situations with the inclusion of a stochastic parameter T (temperature). In this case, the walker movement is driven by a probability density function parameterized by T and a cost function. The cost function increases with the distance between two sites and favors hops to closer sites. As the temperature increases, the walker can escape from cycles that are reminiscent of the deterministic nature and extend the exploration. Here, we report an analytical model and numerical studies of the influence of the temperature and the critical memory on the exploration of one-dimensional disordered systems.
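A schematic simulation of the softened walk, under stated assumptions: sites uniform on [0, 1), cost equal to the distance between sites, hop probabilities proportional to exp(-distance/T), and a tabu list of the last $\mu$ visited sites. All parameter values are illustrative, not taken from the paper.

    import math, random

    def stochastic_tourist_walk(n_sites=100, mu=5, T=0.05, n_steps=1000, seed=1):
        rng = random.Random(seed)
        sites = sorted(rng.random() for _ in range(n_sites))   # 1-d disordered medium
        current, recent, path = 0, [], [0]
        for _ in range(n_steps):
            recent = (recent + [current])[-mu:]        # memory of the last mu sites
            candidates = [i for i in range(n_sites) if i not in recent]
            # Boltzmann-like weights: closer sites are favoured; T softens the rule
            weights = [math.exp(-abs(sites[i] - sites[current]) / T)
                       for i in candidates]
            current = rng.choices(candidates, weights=weights)[0]
            path.append(current)
        return path

    walk = stochastic_tourist_walk()
    print(len(set(walk)), "distinct sites visited out of 100")

Lowering T toward zero recovers the deterministic nearest-unvisited-site rule and traps the walker in cycles; raising T extends the exploration, as described above.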

Relevance:

100.00%

Abstract:

Context. Observations in the cosmological domain are heavily dependent on the validity of the cosmic distance-duality (DD) relation, $\eta = D_L(z)(1+z)^{-2}/D_A(z) = 1$, an exact result required by the Etherington reciprocity theorem, where $D_L(z)$ and $D_A(z)$ are, respectively, the luminosity and angular diameter distances. In the limit of very small redshifts $D_A(z) = D_L(z)$ and this ratio is trivially satisfied. Measurements of the Sunyaev-Zeldovich effect (SZE) and X-rays combined with the DD relation have been used to determine $D_A(z)$ from galaxy clusters. This combination offers the possibility of testing the validity of the DD relation, as well as determining which physical processes occur in galaxy clusters via their shapes. Aims. We use WMAP (7 years) results by fixing the conventional $\Lambda$CDM model to verify the consistency between the validity of the DD relation and different assumptions about galaxy cluster geometries usually adopted in the literature. Methods. We assume that $\eta$ is a function of the redshift, parametrized by two different relations: $\eta(z) = 1 + \eta_0 z$ and $\eta(z) = 1 + \eta_0 z/(1+z)$, where $\eta_0$ is a constant parameter quantifying the possible departure from the strict validity of the DD relation. In order to determine the probability density function (PDF) of $\eta_0$, we consider the angular diameter distances from galaxy clusters recently studied by two different groups, assuming elliptical (isothermal) and spherical (non-isothermal) beta models. The strict validity of the DD relation will occur only if the maximum value of the $\eta_0$ PDF is centered on $\eta_0 = 0$. Results. It was found that the elliptical beta model is in good agreement with the data, showing no violation of the DD relation (PDF peaked close to $\eta_0 = 0$ at $1\sigma$), while the spherical (non-isothermal) one is only marginally compatible at $3\sigma$. Conclusions. The present results, derived by combining the SZE and X-ray surface brightness data from galaxy clusters with the latest WMAP (7-year) results, favor the elliptical geometry for galaxy clusters. It is remarkable that a local property like the geometry of galaxy clusters might be constrained by a global argument provided by the cosmic DD relation.
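A sketch of how a PDF for $\eta_0$ can be built from cluster measurements, assuming Gaussian errors and the linear parametrization $\eta(z) = 1 + \eta_0 z$; the data arrays below are placeholders, not the samples analyzed in the paper.

    import numpy as np

    # Placeholder data: eta(z_i) estimated from cluster D_A measurements, with errors
    z_obs   = np.array([0.10, 0.25, 0.40, 0.55])
    eta_obs = np.array([1.02, 0.97, 0.95, 1.05])
    sigma   = np.array([0.05, 0.06, 0.07, 0.08])

    eta0_grid = np.linspace(-1.0, 1.0, 2001)
    chi2 = np.array([np.sum(((1 + e0 * z_obs - eta_obs) / sigma) ** 2)
                     for e0 in eta0_grid])
    pdf = np.exp(-0.5 * (chi2 - chi2.min()))           # unnormalized likelihood
    pdf /= pdf.sum() * (eta0_grid[1] - eta0_grid[0])   # normalize to a PDF

    best = eta0_grid[np.argmax(pdf)]
    print(f"eta0 PDF peaks at {best:.3f}; strict DD validity corresponds to 0")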

Relevance:

100.00%

Abstract:

Void fraction sensors are important instruments not only for monitoring two-phase flow, but also for furnishing an important parameter for obtaining the flow pattern map and the two-phase flow heat transfer coefficient. This work presents the experimental results obtained with the analysis of two axially spaced multiple-electrode impedance sensors tested in an upward air-water two-phase flow in a vertical tube for void fraction measurements. An electronic circuit was developed for signal generation and post-treatment of each sensor signal. By phase shifting the signals supplied to the electrodes, it was possible to establish a rotating electric field sweeping across the test section. The fundamental principle of using a multiple-electrode configuration is to reduce signal sensitivity to the problem of a non-uniform cross-section void fraction distribution. Static calibration curves were obtained for both sensors, and dynamic signal analyses for bubbly, slug, and churn-turbulent flows were carried out. Flow parameters such as Taylor bubble velocity and length were obtained by using cross-correlation techniques. As an application of the tested void fraction sensors, vertical flow pattern identification could be established by using the probability density function technique for void fractions ranging from 0% to nearly 70%.
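A brief sketch of the cross-correlation step, assuming two synchronized void fraction signals from sensors a known axial distance apart; the signals, spacing, and sampling rate below are invented for illustration.

    import numpy as np

    def transit_velocity(upstream, downstream, spacing_m, fs_hz):
        """Estimate structure velocity from the lag maximizing cross-correlation."""
        up = upstream - np.mean(upstream)
        dn = downstream - np.mean(downstream)
        xcorr = np.correlate(dn, up, mode="full")
        lag = np.argmax(xcorr) - (len(up) - 1)   # samples by which `dn` trails `up`
        return spacing_m * fs_hz / lag if lag else float("inf")

    fs = 1000.0                                  # assumed sampling rate, Hz
    t = np.arange(0, 2, 1 / fs)
    up_sig = np.exp(-((t - 0.5) / 0.05) ** 2)    # synthetic passing structure
    dn_sig = np.exp(-((t - 0.52) / 0.05) ** 2)   # same structure, 20 ms later
    print(transit_velocity(up_sig, dn_sig, spacing_m=0.1, fs_hz=fs), "m/s")  # ~5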

Relevance:

100.00%

Abstract:

In this paper we propose a new two-parameter lifetime distribution with increasing failure rate. The new distribution arises from a latent complementary risk problem. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulae for its reliability and failure rate functions, quantiles, and moments, including the mean and variance. A simple EM-type algorithm for iteratively computing maximum likelihood estimates is presented. The Fisher information matrix is derived analytically in order to obtain the asymptotic covariance matrix. The methodology is illustrated on a real data set. (C) 2010 Elsevier B.V. All rights reserved.
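The abstract does not spell out the density itself, so the sketch below illustrates only the generic final step: approximating the asymptotic covariance matrix by inverting a finite-difference Hessian of the negative log-likelihood at the parameter estimate. The Weibull log-likelihood used here is a stand-in, not the paper's model.

    import numpy as np

    def neg_loglik(theta, data):
        # Stand-in two-parameter model: Weibull, shape a and scale b
        a, b = theta
        return -np.sum(np.log(a / b) + (a - 1) * np.log(data / b) - (data / b) ** a)

    def asymptotic_cov(theta_hat, data, h=1e-4):
        """Invert a central-difference Hessian of the negative log-likelihood."""
        k = len(theta_hat)
        H = np.zeros((k, k))
        for i in range(k):
            for j in range(k):
                ei, ej = np.eye(k)[i] * h, np.eye(k)[j] * h
                H[i, j] = (neg_loglik(theta_hat + ei + ej, data)
                           - neg_loglik(theta_hat + ei - ej, data)
                           - neg_loglik(theta_hat - ei + ej, data)
                           + neg_loglik(theta_hat - ei - ej, data)) / (4 * h * h)
        return np.linalg.inv(H)

    rng = np.random.default_rng(0)
    sample = rng.weibull(1.5, size=200) * 2.0     # shape 1.5, scale 2.0
    print(asymptotic_cov(np.array([1.5, 2.0]), sample))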

Relevance:

100.00%

Abstract:

Time-domain reflectometry (TDR) is an important technique for obtaining series of soil water content measurements in the field. Diode-segmented probes represent an improvement in TDR applicability, allowing measurements of the soil water content profile with a single probe. In this paper we explore an extensive soil water content dataset obtained by tensiometry and TDR from internal drainage experiments in two consecutive years in a tropical soil in Brazil. Comparison of the variation patterns of the water content estimated by the two methods showed evidence of deterioration of the TDR system during this two-year period under field conditions. The results showed consistency in the variation pattern of the tensiometry data, whereas the TDR estimates were inconsistent, with sensitivity decreasing over time. This suggests that difficulties may arise in the long-term use of this TDR system under tropical field conditions. (c) 2008 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Abstract:

Kumaraswamy [A generalized probability density function for double-bounded random processes, J. Hydrol. 46 (1980), pp. 79-88] introduced a distribution for double-bounded random processes with hydrological applications. For the first time, based on this distribution, we describe a new family of generalized distributions (denoted with the prefix `Kw`) to extend the normal, Weibull, gamma, Gumbel, and inverse Gaussian distributions, among several other well-known distributions. Some special distributions in the new family, such as the Kw-normal, Kw-Weibull, Kw-gamma, Kw-Gumbel, and Kw-inverse Gaussian distributions, are discussed. We express the ordinary moments of any Kw generalized distribution as linear functions of probability weighted moments (PWMs) of the parent distribution. We also obtain the ordinary moments of order statistics as functions of PWMs of the baseline distribution. We use the method of maximum likelihood to fit the distributions in the new class and illustrate the potentiality of the new model with an application to real data.
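A small sketch of the Kw-G construction for the Kw-normal case, using the standard form of the family, $F(x) = 1 - \{1 - G(x)^a\}^b$ with density $f(x) = a\,b\,g(x)\,G(x)^{a-1}\{1 - G(x)^a\}^{b-1}$, where G is the baseline CDF; the parameter values below are illustrative.

    import numpy as np
    from scipy.stats import norm

    def kw_g_pdf(x, a, b, g_pdf, g_cdf):
        """Kw-G density: a*b*g(x)*G(x)**(a-1)*(1 - G(x)**a)**(b-1)."""
        G = g_cdf(x)
        return a * b * g_pdf(x) * G ** (a - 1) * (1 - G ** a) ** (b - 1)

    def kw_g_sample(n, a, b, g_ppf, rng):
        """Inverse-transform sampling: invert F(x) = 1 - (1 - G(x)**a)**b."""
        u = rng.random(n)
        return g_ppf((1 - (1 - u) ** (1 / b)) ** (1 / a))

    rng = np.random.default_rng(42)
    a, b = 2.0, 3.0
    draws = kw_g_sample(100_000, a, b, norm.ppf, rng)      # Kw-normal draws
    grid = np.linspace(-6, 6, 4001)
    dx = grid[1] - grid[0]
    mean_numeric = np.sum(grid * kw_g_pdf(grid, a, b, norm.pdf, norm.cdf)) * dx
    print(draws.mean(), mean_numeric)   # the two estimates should agree closely

Any baseline with a density, CDF, and quantile function can be plugged in the same way, which is what makes the family convenient in practice.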

Relevance:

100.00%

Abstract:

In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises from a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. The parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one for different sample sizes and censoring percentages. The methodology is illustrated on four real datasets; we also make a comparison between the two modeling approaches. (C) 2011 Elsevier B.V. All rights reserved.
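A sketch of the latent construction behind this complementary model, assuming M geometric on {1, 2, ...} and the observed lifetime X = max(Y_1, ..., Y_M) with the Y_i exponential. Summing the geometric series gives the closed-form CDF $F(x) = \theta p/\{1-(1-\theta)p\}$ with $p = 1 - e^{-\lambda x}$, which the simulation checks; the parameter values are illustrative.

    import numpy as np

    def ceg_cdf(x, lam, theta):
        """Closed-form CDF of X = max of a geometric number of Exp(lam) lifetimes."""
        p = 1 - np.exp(-lam * x)
        return theta * p / (1 - (1 - theta) * p)

    rng = np.random.default_rng(7)
    lam, theta, n = 1.5, 0.3, 50_000
    m = rng.geometric(theta, size=n)                 # latent number of risks, >= 1
    x = np.array([rng.exponential(1 / lam, size=k).max() for k in m])

    for q in (0.5, 1.0, 2.0):
        print(q, (x <= q).mean(), ceg_cdf(q, lam, theta))  # empirical vs closed form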

Relevance:

100.00%

Abstract:

In this paper a new parametric method to deal with discrepant experimental results is developed. The method is based on fitting a probability density function to the data. This paper also compares the characteristics of different methods used to deduce recommended values and uncertainties from a discrepant set of experimental data. The methods are applied to the published half-lives of $^{137}$Cs and $^{90}$Sr, and special emphasis is given to the deduced confidence intervals. The obtained results are analyzed considering two fundamental properties expected from an experimental result: the probability content of confidence intervals and the statistical consistency between different recommended values. The recommended values and uncertainties for the $^{137}$Cs and $^{90}$Sr half-lives are 10,984 (24) days and 10,523 (70) days, respectively. (C) 2009 Elsevier B.V. All rights reserved.
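The paper's own parametric fit is not reproduced here; as a common baseline for comparison, the sketch below computes the standard weighted mean and inflates its uncertainty by the Birge ratio when the set is discrepant (reduced chi-square above one). The half-life values are placeholders.

    import numpy as np

    def weighted_mean_birge(values, errors):
        values = np.asarray(values, float)
        w = 1.0 / np.asarray(errors, float) ** 2
        mean = np.sum(w * values) / np.sum(w)
        err = np.sqrt(1.0 / np.sum(w))
        chi2_red = np.sum(w * (values - mean) ** 2) / (len(values) - 1)
        if chi2_red > 1:                  # discrepant set: scale the uncertainty
            err *= np.sqrt(chi2_red)
        return mean, err, chi2_red

    # Placeholder half-life measurements (days) with quoted uncertainties
    halflives = [10970.0, 11020.0, 10940.0, 11003.0]
    sigmas    = [15.0, 25.0, 12.0, 18.0]
    print(weighted_mean_birge(halflives, sigmas))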

Relevance:

100.00%

Abstract:

We show a function that fits well the probability density of return times between two consecutive visits of a chaotic trajectory to finite size regions in phase space. It deviates from the exponential statistics by a small power-law term, a term that represents the deterministic manifestation of the dynamics. We also show how one can quickly and easily estimate the Kolmogorov-Sinai entropy and the short-term correlation function by realizing observations of high probable returns. Our analyses are performed numerically in the Henon map and experimentally in a Chua's circuit. Finally, we discuss how our approach can be used to treat the data coming from experimental complex systems and for technological applications. (C) 2009 American Institute of Physics. [doi: 10.1063/1.3263943]
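A numerical sketch of the return time statistics for the Hénon map at its standard parameters (a = 1.4, b = 0.3), collecting times between consecutive visits to a small box in phase space; the box is centered on a point of the attractor itself, and its size is an arbitrary choice.

    import numpy as np

    def henon_return_times(n_iter=500_000, box_size=0.05, transient=1000):
        a, b = 1.4, 0.3                       # standard chaotic Henon parameters
        x, y = 0.1, 0.1
        for _ in range(transient):            # discard transient, land on attractor
            x, y = 1 - a * x * x + y, b * x
        cx, cy = x, y                          # box centered on an attractor point
        returns, last_visit = [], None
        for t in range(n_iter):
            x, y = 1 - a * x * x + y, b * x    # Henon map iteration
            if abs(x - cx) < box_size and abs(y - cy) < box_size:
                if last_visit is not None:
                    returns.append(t - last_visit)
                last_visit = t
        return np.array(returns)

    rt = henon_return_times()
    print(len(rt), "returns; mean return time:", rt.mean())
    # A purely exponential law predicts P(T > t) ~ exp(-t / mean); deviations at
    # short times carry the deterministic power-law correction discussed above.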

Relevance:

100.00%

Abstract:

We investigate a conjecture on the cover times of planar graphs by means of large Monte Carlo simulations. The conjecture states that the cover time $\tau(G_N)$ of a planar graph $G_N$ of N vertices and maximal degree d is bounded from below by $\tau(G_N) \ge C_d N (\ln N)^2$ with $C_d = (d/4\pi)\tan(\pi/d)$, with equality holding for some geometries. We tested this conjecture on the regular honeycomb (d = 3), regular square (d = 4), regular elongated triangular (d = 5), and regular triangular (d = 6) lattices, as well as on the nonregular Union Jack lattice ($d_{\min} = 4$, $d_{\max} = 8$). Indeed, the Monte Carlo data suggest that the rigorous lower bound may hold as an equality for most of these lattices, with an interesting issue in the case of the Union Jack lattice. The data for the honeycomb lattice, however, violate the bound with the conjectured constant. The empirical probability distribution function of the cover time for the square lattice is also briefly presented, since very little is known about cover time probability distribution functions in general.
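A Monte Carlo sketch of the cover time measurement on the regular square lattice (d = 4) with periodic boundaries, compared against the conjectured constant $C_4 = (4/4\pi)\tan(\pi/4) = 1/\pi$; the lattice size and number of walks are kept small for illustration, far from the scales needed to probe the asymptotics.

    import math, random

    def cover_time_square(L, rng):
        """Steps for a simple random walk to visit all sites of an L x L torus."""
        x = y = 0
        visited, steps = {(0, 0)}, 0
        n_sites = L * L
        while len(visited) < n_sites:
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = (x + dx) % L, (y + dy) % L
            visited.add((x, y))
            steps += 1
        return steps

    rng = random.Random(0)
    L = 32
    N = L * L
    runs = [cover_time_square(L, rng) for _ in range(20)]
    bound = (1 / math.pi) * N * math.log(N) ** 2   # conjectured lower bound, d = 4
    print(sum(runs) / len(runs), "vs conjectured bound", bound)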

Relevance:

100.00%

Abstract:

Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently exhibit constant variance, will under-represent variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The posterior joint density function was sampled using Markov chain Monte Carlo algorithms, allowing inferences over the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.
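A compact sketch of Bayesian inference for overdispersed proportions, swapping in a beta-binomial likelihood (a simpler overdispersion device than the paper's random-effect DGLM) sampled with a random-walk Metropolis step; data, priors, and step sizes are all illustrative, not the apple tissue culture analysis.

    import numpy as np
    from scipy.special import betaln

    # Illustrative overdispersed proportion data: y successes out of m trials
    y = np.array([8, 12, 5, 14, 9, 3, 11, 13])
    m = np.full(8, 20)

    def log_post(mu, phi):
        """Beta-binomial log-posterior, flat priors (binomial constants dropped)."""
        if not (0.0 < mu < 1.0 and phi > 0.0):
            return -np.inf
        a, b = mu * phi, (1.0 - mu) * phi
        return float(np.sum(betaln(y + a, m - y + b) - betaln(a, b)))

    rng = np.random.default_rng(3)
    mu, phi = 0.5, 10.0
    lp, chain = log_post(mu, phi), []
    for _ in range(20_000):
        mu_p = mu + 0.05 * rng.normal()
        phi_p = phi * np.exp(0.2 * rng.normal())       # log-scale random walk
        lp_p = log_post(mu_p, phi_p)
        # Metropolis-Hastings acceptance, including the log-scale proposal Jacobian
        if np.log(rng.random()) < lp_p - lp + np.log(phi_p / phi):
            mu, phi, lp = mu_p, phi_p, lp_p
        chain.append(mu)
    print("posterior mean of mu:", np.mean(chain[10_000:]))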

Relevance:

100.00%

Abstract:

In type 2 diabetes (DM2) there is progressive deterioration in beta-cell function and mass. Islet function has been found to be about 50% of normal at the time of diagnosis, with a reduction in beta-cell mass of about 60% at necropsy (accelerated apoptosis). Among the interventions to preserve the beta-cells, those leading to short-term improvement of beta-cell secretion are weight loss, metformin, sulfonylureas, and insulin. Long-term improvement has been demonstrated with short-term intensive insulin therapy of newly diagnosed DM2, the use of antiapoptotic drugs such as glitazones, and the use of glucagon-like peptide-1 (GLP-1) receptor agonists (GLP-1 mimetics), which are not inactivated by the enzyme dipeptidyl peptidase 4 (DPP-4), and/or drugs that inhibit that enzyme (GLP-1 enhancers). The incretin hormones are released from the gastrointestinal tract in response to nutrient ingestion to enhance glucose-dependent insulin secretion from the pancreas and the overall maintenance of glucose homeostasis. Of the two major incretins, GLP-1 and GIP (glucose-dependent insulinotropic polypeptide), only the first one or its mimetics or enhancers can be used for treatment. The GLP-1 mimetics exenatide and liraglutide, as well as the DPP-4 inhibitors sitagliptin and vildagliptin, have been approved for the treatment of DM2.

Relevance:

100.00%

Abstract:

A semiclassical approximation for an evolving density operator, driven by a `closed` Hamiltonian operator and `open` Markovian Lindblad operators, is obtained. The theory is based on the chord function, i.e. the Fourier transform of the Wigner function. It reduces to an exact solution of the Lindblad master equation if the Hamiltonian operator is a quadratic function and the Lindblad operators are linear functions of positions and momenta. Initially, the semiclassical formulae for the case of Hermitian Lindblad operators are reinterpreted in terms of a (real) double phase space, generated by an appropriate classical double Hamiltonian. An extra `open` term is added to the double Hamiltonian by the non-Hermitian part of the Lindblad operators in the general case of dissipative Markovian evolution. The particular case of generic Hamiltonian operators, but linear dissipative Lindblad operators, is studied in more detail. A Liouville-type equivariance still holds for the corresponding classical evolution in double phase space, but the centre subspace, which supports the Wigner function, is compressed, along with expansion of its conjugate subspace, which supports the chord function. Decoherence narrows the relevant region of double phase space to the neighbourhood of a caustic for both the Wigner function and the chord function. This difficulty is avoided by a propagator in a mixed representation, so that a further `small-chord` approximation leads to a simple generalization of the quadratic theory for evolving Wigner functions.
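For reference, the two central objects above can be stated compactly in their standard textbook forms (conventions, such as the normalization of the chord transform, may differ from the paper's):

    \frac{\partial\hat\rho}{\partial t}
      = -\frac{i}{\hbar}\,[\hat H,\hat\rho]
      + \sum_k \Big( \hat L_k\,\hat\rho\,\hat L_k^\dagger
        - \tfrac{1}{2}\,\{\hat L_k^\dagger \hat L_k,\hat\rho\} \Big),
    \qquad
    \chi(\xi) = \frac{1}{(2\pi\hbar)^{d}} \int \mathrm{d}x\; W(x)\,
        \mathrm{e}^{\,(i/\hbar)\,x\wedge\xi},

where $x = (p, q)$ is a phase-space point, $\xi$ the chord variable, and $x\wedge\xi$ the symplectic product. For quadratic $\hat H$ and linear $\hat L_k$ both evolutions close in Gaussian form, which is the exactly solvable case mentioned above.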

Relevance:

100.00%

Abstract:

In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model. (C) 2011 Elsevier B.V. All rights reserved.
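As a worked statement of the competing-causes construction, under the usual assumptions (iid cause lifetimes with common survival function S(t), the observed time being the minimum over the M causes; this is the standard form, not quoted from the paper):

    S_{\mathrm{pop}}(t) = \Pr(T > t)
      = \sum_{m=0}^{\infty} \Pr(M = m)\, S(t)^{m}
      = A\big(S(t)\big),
    \qquad
    \lim_{t\to\infty} S_{\mathrm{pop}}(t) = A(0) = \Pr(M = 0),

so the probability generating function $A(s) = \mathrm{E}[s^{M}]$ determines the population survival directly, and its value at zero is the cure fraction, the proportion of subjects with no competing cause at all.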