68 results for exceedance probabilities

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

20.00%

Publisher:

Abstract:

Aims. We derive lists of proper motions and kinematic membership probabilities for 49 open clusters and possible open clusters in the zone of the Bordeaux PM2000 proper motion catalogue (+11 degrees <= delta <= +18 degrees). We test different parametrisations of the proper motion and position distribution functions and select the most successful one. In the light of those results, we analyse some objects individually. Methods. We differentiate between cluster and field member stars, and assign membership probabilities, by applying a new and fully automated method based on parametrisations of the proper motion and position distribution functions, and on genetic algorithm optimization heuristics combined with a derivative-based hill-climbing algorithm for the likelihood optimization. Results. We present a catalogue comprising kinematic parameters and associated membership probability lists for 49 open clusters and possible open clusters in the Bordeaux PM2000 catalogue region. This is the first determination of proper motions for five open clusters. We confirm the non-existence of two kinematic populations in the region of 15 previously suspected non-existent objects.
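As a toy illustration of the membership step, a star's probability of belonging to the cluster can be written as the cluster component's share of a two-population mixture density evaluated at the star's proper motion. A minimal sketch with circular Gaussian components and invented numbers (the actual method also models positions and fits the parameters by a genetic algorithm plus hill climbing):

```python
import math

def gauss2d(x, y, mx, my, sx, sy):
    """Bivariate normal density with independent components."""
    z = ((x - mx) / sx) ** 2 + ((y - my) / sy) ** 2
    return math.exp(-0.5 * z) / (2.0 * math.pi * sx * sy)

def membership_probability(pm, cluster, field, n_c):
    """P(cluster | proper motion) under a two-population mixture:
    pm = (mu_x, mu_y), cluster/field = (mean_x, mean_y, sigma_x, sigma_y),
    n_c = cluster mixing proportion.  All numbers are illustrative."""
    pc = n_c * gauss2d(pm[0], pm[1], *cluster)
    pf = (1.0 - n_c) * gauss2d(pm[0], pm[1], *field)
    return pc / (pc + pf)

# a star sitting on the centroid of a sharp cluster over a broad field
p = membership_probability((0.0, 0.0),
                           cluster=(0.0, 0.0, 1.0, 1.0),
                           field=(5.0, 5.0, 10.0, 10.0),
                           n_c=0.3)
```

In the paper the mixture parameters themselves are what the genetic algorithm and the hill-climbing step estimate by maximum likelihood; here they are simply given.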

Relevance:

20.00%

Publisher:

Abstract:

The Jensen theorem is used to derive inequalities for semiclassical tunneling probabilities for systems involving several degrees of freedom. These Jensen inequalities are used to discuss several aspects of sub-barrier heavy-ion fusion reactions. The inequality hinges on general convexity properties of the tunneling coefficient calculated with the classical action in the classically forbidden region.
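The convexity argument can be sketched in one line. Assuming the semiclassical (WKB) exponential form for the tunneling coefficient and letting the angle brackets denote an average over the relevant degrees of freedom, convexity of the exponential gives, by Jensen's inequality:

```latex
% e^{-x} is convex, so the average of the exponential dominates the
% exponential of the average of the classically forbidden action S:
\[
  \langle T \rangle \;=\; \left\langle e^{-2S/\hbar} \right\rangle
  \;\ge\; e^{-2\langle S \rangle/\hbar} .
\]
```

This is only the schematic form of the bound; the paper's inequalities involve the action computed in the classically forbidden region for the multi-dimensional fusion problem.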

Relevance:

20.00%

Publisher:

Abstract:

We consider the one-dimensional asymmetric simple exclusion process (ASEP) in which particles jump to the right at rate p ∈ (1/2, 1] and to the left at rate 1 - p, interacting by exclusion. In the initial state there is a finite region such that to the left of this region all sites are occupied and to the right of it all sites are empty. Under this initial state, the hydrodynamical limit of the process converges to the rarefaction fan of the associated Burgers equation. In particular, suppose that the initial state has first-class particles to the left of the origin, second-class particles at sites 0 and 1, and holes to the right of site 1. We show that the probability that the two second-class particles eventually collide is (1 + p)/(3p), where a collision occurs when one of the particles attempts to jump over the other. This also corresponds to the probability that two ASEP processes, started from appropriate initial states and coupled using the so-called "basic coupling," eventually reach the same state. We give various other results about the behaviour of second-class particles in the ASEP. In the totally asymmetric case (p = 1) we explain a further representation in terms of a multi-type particle system, and also use the collision result to derive the probability of coexistence of both clusters in a two-type version of the corner growth model.
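The collision probability can be checked by direct simulation. The sketch below approximates the totally asymmetric case (p = 1) on a finite lattice with a finite stopping gap, so the estimate only approaches the exact value (1 + p)/(3p) = 2/3; all parameters are illustrative:

```python
import random

def collision_probability(n_first=30, n_trials=300, max_gap=15,
                          max_steps=100000, seed=7):
    """Monte Carlo sketch for p = 1: first-class particles on ..., -2, -1,
    the two second-class particles on sites 0 and 1, holes to the right.
    A trial ends in a collision when one second-class particle attempts to
    jump onto the other; trials whose gap exceeds max_gap count as 'no
    collision', so the estimate carries some truncation bias."""
    rng = random.Random(seed)
    n = n_first + 2                       # total number of particles
    hits = 0
    for _ in range(n_trials):
        pos = list(range(-n_first, 2))    # particle positions
        cls = [1] * n_first + [2, 2]      # particle classes
        occ = {p: k for k, p in enumerate(pos)}
        for _ in range(max_steps):
            if pos[n - 1] - pos[n - 2] > max_gap:
                break                     # second-class particles separated
            k = rng.randrange(n)          # each particle jumps right at rate 1
            i = pos[k]
            kj = occ.get(i + 1)
            if kj is None:                # hole ahead: plain jump
                del occ[i]
                occ[i + 1] = k
                pos[k] = i + 1
            elif cls[k] == 2 and cls[kj] == 2:
                hits += 1                 # attempted overtake between the
                break                     # two second-class particles
            elif cls[k] == 1 and cls[kj] == 2:
                occ[i], occ[i + 1] = kj, k   # first class swaps past second
                pos[k], pos[kj] = i + 1, i
            # otherwise the jump is blocked and nothing happens
    return hits / n_trials

p_hat = collision_probability()
```

Picking a particle uniformly and letting it attempt a right jump reproduces the embedded jump chain of the continuous-time dynamics, which is all that the collision event depends on.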

Relevance:

20.00%

Publisher:

Abstract:

It is known that patients may cease participating in a longitudinal study and become lost to follow-up. The objective of this article is to present a Bayesian model to estimate the malaria transition probabilities considering individuals lost to follow-up. We consider a homogeneous population, and it is assumed that the considered period of time is small enough to avoid two or more transitions from one state of health to another. The proposed model is based on a Gibbs sampling algorithm that uses information on individuals lost to follow-up at the end of the longitudinal study. To simulate the unknown number of individuals with positive and negative malaria states at the end of the study among those lost to follow-up, two latent variables were introduced in the model. We used a real data set and a simulated data set to illustrate the application of the methodology. The proposed model showed a good fit to these data sets, and the algorithm did not show problems of convergence or lack of identifiability. We conclude that the proposed model is a good alternative for estimating probabilities of transitions from one state of health to the other in studies with low adherence to follow-up.
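The flavor of the Gibbs sampler can be conveyed with a deliberately simplified one-transition model: individuals start in the positive state, theta is the probability of moving to the negative state, and a latent count z imputes the unknown final states of those lost to follow-up. This is a sketch in the spirit of the article, not its actual model:

```python
import random

def gibbs_transition(n_obs_pos, n_obs_neg, n_lost,
                     iters=2000, burn=500, seed=3):
    """Toy Gibbs sampler: theta = P(positive -> negative) over the study
    period, n_obs_pos / n_obs_neg are the observed final states, n_lost
    individuals were lost to follow-up.  Flat Beta(1, 1) prior; the setup
    is illustrative only."""
    rng = random.Random(seed)
    theta, draws = 0.5, []
    for t in range(iters):
        # latent step: each lost individual ended negative with prob. theta
        z = sum(rng.random() < theta for _ in range(n_lost))
        # conjugate step: theta | data, z follows a Beta posterior
        theta = rng.betavariate(1 + n_obs_neg + z,
                                1 + n_obs_pos + (n_lost - z))
        if t >= burn:
            draws.append(theta)
    return sum(draws) / len(draws)

posterior_mean = gibbs_transition(n_obs_pos=30, n_obs_neg=70, n_lost=20)
```

Because the lost individuals are imputed from theta itself, they add no information here; the point of the sketch is only the alternation between the latent-data step and the conjugate update.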

Relevance:

20.00%

Publisher:

Abstract:

When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or due to non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches, and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated. (C) 2011 Elsevier B.V. All rights reserved.
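The robust (minimax) objective behind MDP-IPs can be illustrated on a tiny flat example: value iteration in which nature picks, within each probability interval, the transition probability that is worst for the agent. This sketch ignores the factored structure and the nonlinear solver that are the paper's actual contributions, and all numbers are invented:

```python
def robust_value_iteration(rewards, intervals, gamma=0.9, iters=300):
    """Pessimistic value iteration for a toy 2-state MDP-IP.
    intervals[s][a] = (lo, hi) bounds on P(next state = 0 | s, a)."""
    V = [0.0, 0.0]
    for _ in range(iters):
        V = [max(rewards[s][a] + gamma *
                 # the expectation is linear in p, so the adversarial choice
                 # of p within [lo, hi] is attained at an interval endpoint
                 min(p * V[0] + (1 - p) * V[1] for p in (lo, hi))
                 for a, (lo, hi) in enumerate(intervals[s]))
             for s in (0, 1)]
    return V

# hypothetical rewards[s][a] and interval bounds on P(next = 0 | s, a)
V = robust_value_iteration(rewards=[[1.0, 0.0], [0.0, 0.5]],
                           intervals=[[(0.4, 0.6), (0.2, 0.8)],
                                      [(0.5, 0.5), (0.1, 0.9)]])
```

For general (non-interval) credal sets the inner minimization is exactly the nonlinear optimization call whose cost the paper's approximations target.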

Relevance:

20.00%

Publisher:

Abstract:

High-level CASSCF/MRCI calculations with a quintuple-zeta quality basis set are reported by characterizing for the first time a manifold of electronic states of the CAs radical yet to be investigated experimentally. Along with the potential energy curves and the associated spectroscopic constants, the dipole moment functions for selected electronic states as well as the transition dipole moment functions for the most relevant electronic transitions are also presented. Estimates of radiative transition probabilities and lifetimes complement this investigation, which also assesses the effect of spin-orbit interaction on the A (2)Pi state. Whenever pertinent, comparisons of similarities and differences with the isovalent CN and CP radicals are made.
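The radiative lifetime of an upper level follows from the Einstein A coefficients of its downward transitions as tau = 1 / sum(A). A one-line sketch (the A values are invented for illustration, not the CAs results):

```python
def radiative_lifetime(a_coefficients):
    """Radiative lifetime (s) of an upper level as the inverse of the sum
    of the Einstein A coefficients (s^-1) over all downward transitions."""
    return 1.0 / sum(a_coefficients)

# hypothetical A coefficients for three downward transitions
tau = radiative_lifetime([2.0e6, 1.0e6, 5.0e5])
```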

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: to characterize the entry of graduates of the Speech-Language Pathology course of the Universidade Estadual Paulista (UNESP) - Marília into Brazilian stricto sensu graduate programs (PPG). METHOD: lists of graduates and the Curriculum Vitae of each graduate and of their advisors were used. RESULTS: of the 537 graduates, 16.57% had attended or were attending a PPG; of these, 98.88% at the master's level and 37.08% also at the doctoral level. By major knowledge area, 50% of the master's graduates were linked predominantly to programs in Health Sciences, 31.80% in Human Sciences, and 13.64% in Linguistics, Letters and Arts. At the doctoral level, 33.33% were in Human Sciences and 30.30% each in Health Sciences and in Linguistics, Letters and Arts. As to specific knowledge area, at the master's level 30.68% were in Speech-Language Pathology, 28.41% in Education, 13.64% in Linguistics, and 9.09% in Medicine I; at the doctoral level, 33.33% were in Education, 30.30% in Linguistics, and 9.09% in Speech-Language Pathology; 55.68% of the dissertations and 51.52% of the theses focused on language. UNESP predominated, with 39.77% at the master's and 48.48% at the doctoral level. Programs rated 4 predominated, with 52.27% of the master's and 45.45% of the doctoral graduates. Whenever the information was available (55.68%), all had received funding. The likelihood ratio test indicated no significant differences between the master's and doctoral percentages. CONCLUSION: the results exceeded those reported for the same state, showed the interdisciplinary character of Speech-Language Pathology science, and showed the predominance of language as a research topic.

Relevance:

10.00%

Publisher:

Abstract:

We present a computer program developed for estimating penetrance rates in autosomal dominant diseases by means of family kinship and phenotype information contained within the pedigrees. The program also determines the exact 95% credibility interval for the penetrance estimate. Both executable (PenCalc for Windows) and web versions (PenCalcWeb) of the software are available. The web version enables further calculations, such as heterozygosity probabilities and assessment of offspring risks for all individuals in the pedigrees. Both programs can be accessed and downloaded freely at the home page http://www.ib.usp.br/~otto/software.htm.
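With a flat prior, a penetrance rate treated as a binomial proportion has a Beta posterior, and an exact equal-tailed 95% credibility interval can be read off its CDF. The sketch below integrates the posterior density numerically; it illustrates the idea only and is not PenCalc's actual algorithm, and the pedigree counts are hypothetical:

```python
import math

def penetrance_interval(n_affected, n_carriers, level=0.95, grid=100000):
    """Equal-tailed credibility interval for a binomial proportion with a
    flat prior: posterior is Beta(n_affected + 1, n_carriers - n_affected + 1),
    integrated here with the trapezoid rule."""
    a, b = n_affected + 1, n_carriers - n_affected + 1
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    tail = (1.0 - level) / 2.0
    lo = hi = None
    cdf = prev = 0.0
    for i in range(1, grid + 1):
        x = i / grid
        pdf = norm * x ** (a - 1) * (1.0 - x) ** (b - 1)
        cdf += (prev + pdf) / (2.0 * grid)   # trapezoid rule
        prev = pdf
        if lo is None and cdf >= tail:
            lo = x
        if hi is None and cdf >= 1.0 - tail:
            hi = x
    return lo, hi

# hypothetical pedigree: 8 affected among 10 obligate carriers
lo, hi = penetrance_interval(8, 10)
```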

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Cardiac troponins are highly sensitive and specific markers of myocardial injury. These markers have been detected in heart failure (HF) and are associated with poor prognosis. OBJECTIVE: To evaluate the relationship between troponin T (cTnT) levels and prognosis in decompensated HF. METHODS: We studied 70 patients with worsening chronic HF requiring hospitalization. At admission, a Cox model was used to evaluate the variables capable of predicting the composite endpoint of death or rehospitalization due to worsening HF over one year. RESULTS: During follow-up there were 44 deaths, 36 rehospitalizations for HF, and 56 composite endpoints. In the multivariate analysis, the predictors of clinical events were: cTnT (cTnT > 0.100 ng/ml; hazard ratio (HR) 3.95, 95% confidence interval (CI): 1.64-9.49, p = 0.002), left ventricular end-diastolic diameter (LVEDD > 70 mm; HR 1.92, 95% CI: 1.06-3.47, p = 0.031), and serum sodium (Na < 135 mEq/l; HR 1.79, 95% CI: 1.02-3.15, p = 0.044). To evaluate the relationship between cTnT elevation and prognosis in decompensated HF, the patients were stratified into three groups: low cTnT (cTnT < 0.020 ng/ml, n = 22), intermediate cTnT (cTnT > 0.020 and < 0.100 ng/ml, n = 36), and high cTnT (cTnT > 0.100 ng/ml, n = 12). The probabilities of survival and of event-free survival were 54.2%, 31.5%, 16.7% (p = 0.020) and 36.4%, 11.5%, 8.3% (p = 0.005), respectively. CONCLUSION: cTnT elevation is associated with poor prognosis in decompensated HF, and the degree of this elevation may aid risk stratification.

Relevance:

10.00%

Publisher:

Abstract:

Gene clustering is a useful exploratory technique for grouping together genes with similar expression levels under distinct cell cycle phases or distinct conditions. It helps the biologist to identify potentially meaningful relationships between genes. In this study, we propose a clustering method based on multivariate normal mixture models, where the number of clusters is predicted via sequential hypothesis tests: at each step, the method considers a mixture model of m components (m = 2 in the first step) and tests whether in fact it should be m - 1. If the hypothesis is rejected, m is increased and a new test is carried out. The method continues (increasing m) until the hypothesis is accepted. The theoretical core of the method is the full Bayesian significance test, an intuitive Bayesian approach that requires neither a model-complexity penalty nor positive prior probabilities for sharp hypotheses. Numerical experiments were based on a cDNA microarray dataset consisting of expression levels of 205 genes belonging to four functional categories, for 10 distinct strains of Saccharomyces cerevisiae. To analyze the method's sensitivity to data dimension, we performed principal components analysis on the original dataset and predicted the number of classes using 2 to 10 principal components. Compared to Mclust (model-based clustering), our method shows more consistent results.
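A crude, one-dimensional sketch of the sequential procedure: fit mixtures with increasing m and stop at the first m whose penalized fit does not improve. BIC is used below as a stand-in for the full Bayesian significance test, and the EM routine is a plain textbook implementation, so this only mimics the shape of the algorithm:

```python
import math, random

def em_gmm_1d(xs, m, iters=60, seed=0):
    """Plain EM for a 1-D Gaussian mixture with m components (a stand-in
    for the paper's multivariate normal mixtures)."""
    rng = random.Random(seed)
    mu = rng.sample(xs, m)
    var = [1.0] * m
    w = [1.0 / m] * m
    for _ in range(iters):
        # E step: responsibilities of each component for each point
        resp = []
        for x in xs:
            ps = [w[k] * math.exp(-((x - mu[k]) ** 2) / (2.0 * var[k]))
                  / math.sqrt(2.0 * math.pi * var[k]) + 1e-300
                  for k in range(m)]
            s = sum(ps)
            resp.append([p / s for p in ps])
        # M step: update weights, means, variances (floored for stability)
        for k in range(m):
            nk = max(sum(r[k] for r in resp), 1e-6)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-3)
    # log-likelihood of the fitted mixture
    return sum(math.log(max(sum(
        w[k] * math.exp(-((x - mu[k]) ** 2) / (2.0 * var[k]))
        / math.sqrt(2.0 * math.pi * var[k]) for k in range(m)), 1e-300))
        for x in xs)

def choose_m(xs, max_m=5, restarts=3):
    """Sequential selection of the number of clusters: increase m while the
    penalized fit improves, stopping at the first non-improvement."""
    best_m, best_bic = 1, float("inf")
    for m in range(1, max_m + 1):
        ll = max(em_gmm_1d(xs, m, seed=s) for s in range(restarts))
        bic = -2.0 * ll + (3 * m - 1) * math.log(len(xs))
        if bic < best_bic:
            best_m, best_bic = m, bic
        else:
            break
    return best_m
```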

Relevance:

10.00%

Publisher:

Abstract:

The structural engineering community in Brazil faces new challenges with the recent occurrence of high-intensity tornados. Satellite surveillance data show that the area covering the south-east of Brazil, Uruguay and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. Reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and the corresponding limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower studied here has become very common in Brazil due to increasing mobile phone coverage. The study shows that optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations.
Failure consequences are also different for the different parties involved in the design, construction and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings) it is not too costly to over-design; this observation is in agreement with the observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads like tornados, the optimum reliability is strongly dependent on the selected design life.
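The cost trade-off described above can be sketched as a one-variable optimization: total expected cost = construction cost (growing with the extra partial safety factor) plus failure probability times failure cost (shrinking with it). The cost model and the pf(lambda) curve below are invented for illustration, not the paper's tower model:

```python
import math

def total_expected_cost(lam, c0=1.0, k=0.2, beta0=2.0, cost_of_failure=100.0):
    """Construction cost plus risk for an extra partial safety factor lam.
    An assumed reliability index grows with log(lam); pf = Phi(-beta)."""
    beta = beta0 + math.log(lam) / 0.1
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # standard normal tail
    construction = c0 + k * (lam - 1.0)
    return construction + pf * cost_of_failure

def optimal_factor(candidates):
    """Grid search for the partial safety factor minimizing total cost."""
    return min(candidates, key=total_expected_cost)

best = optimal_factor([1.0 + 0.01 * i for i in range(101)])  # lam in [1, 2]
```

Raising cost_of_failure pushes the optimum toward larger safety factors, which is exactly the location-dependence of the optimum design that the paper emphasizes.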

Relevance:

10.00%

Publisher:

Abstract:

A susceptible-infective-recovered (SIR) epidemiological model based on a probabilistic cellular automaton (PCA) is employed for simulating the temporal evolution of the registered cases of chickenpox in Arizona, USA, between 1994 and 2004. At each time step, every individual is in one of the states S, I, or R. The parameters of this model are the probabilities of each individual (each cell of the PCA lattice) passing from one state to another. Here, the values of these probabilities are identified by using a genetic algorithm. If the parameters are allowed to take nonrealistic values, the predictions agree better with the historical series than if the parameters are forced to take realistic values. A discussion of how the size of the PCA lattice affects the quality of the model predictions is presented. Copyright (C) 2009 L. H. A. Monteiro et al.
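A minimal version of such a PCA can be sketched as follows; the lattice size, the Moore neighbourhood, and the probabilities are illustrative defaults, not the values identified by the genetic algorithm in the paper:

```python
import random

def simulate_sir_pca(width=30, height=30, p_infect=0.3, p_recover=0.1,
                     p_renew=0.05, steps=50, seed=11):
    """SIR probabilistic cellular automaton on a periodic lattice.
    p_renew (R -> S, e.g. replacement of individuals) is an assumption made
    here to keep the epidemic from trivially dying out."""
    rng = random.Random(seed)
    S, I, R = 0, 1, 2
    grid = [[S] * width for _ in range(height)]
    grid[height // 2][width // 2] = I          # a single initial infective
    history = []
    for _ in range(steps):
        new = [row[:] for row in grid]
        for y in range(height):
            for x in range(width):
                state = grid[y][x]
                if state == S:
                    # infection prob. grows with the number of infective
                    # neighbours in the Moore neighbourhood
                    n_inf = sum(grid[(y + dy) % height][(x + dx) % width] == I
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                if (dy, dx) != (0, 0))
                    if rng.random() < 1 - (1 - p_infect) ** n_inf:
                        new[y][x] = I
                elif state == I and rng.random() < p_recover:
                    new[y][x] = R
                elif state == R and rng.random() < p_renew:
                    new[y][x] = S
        grid = new
        history.append(sum(row.count(I) for row in grid))
    return history

history = simulate_sir_pca()
```

In the paper, the transition probabilities playing the role of p_infect, p_recover, and the renewal term are exactly what the genetic algorithm fits against the Arizona chickenpox series.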

Relevance:

10.00%

Publisher:

Abstract:

Identification, prediction, and control of a system are engineering subjects, regardless of the nature of the system. Here, the temporal evolution of the number of individuals with dengue fever recorded weekly in the city of Rio de Janeiro, Brazil, during 2007 is used to identify SIS (susceptible-infective-susceptible) and SIR (susceptible-infective-removed) models formulated in terms of cellular automata (CA). In the identification process, a genetic algorithm (GA) is utilized to find the probabilities of the state transition S -> I capable of reproducing, in the CA lattice, the historical series of 2007. These probabilities depend on the number of infective neighbors. Time-varying and non-time-varying probabilities, three different lattice sizes, and two kinds of coupling topology among the cells are taken into consideration. These epidemiological models, built by combining CA and GA, are then employed for predicting the cases of sick persons in 2008. Such models can be useful for forecasting and controlling the spread of this infectious disease.

Relevance:

10.00%

Publisher:

Abstract:

Background: With nearly 1,100 species, the fish family Characidae represents more than half of the species of Characiformes, and is a key component of Neotropical freshwater ecosystems. The composition, phylogeny, and classification of Characidae are currently uncertain, despite significant efforts based on analysis of morphological and molecular data. No consensus about the monophyly of this group or its position within the order Characiformes has been reached, a situation complicated by the fact that many key studies to date have non-overlapping taxonomic representation and focus only on subsets of this diversity. Results: In the present study we propose a new definition of the family Characidae and a hypothesis of relationships for the Characiformes based on phylogenetic analysis of DNA sequences of two mitochondrial and three nuclear genes (4,680 base pairs). The sequences were obtained from 211 samples representing 166 genera distributed among all 18 recognized families in the order Characiformes, all 14 recognized subfamilies in the Characidae, plus 56 of the genera so far considered incertae sedis in the Characidae. The phylogeny obtained is robust, with most lineages significantly supported by posterior probabilities in the Bayesian analysis and by high bootstrap values from the maximum likelihood and parsimony analyses. Conclusion: A monophyletic assemblage strongly supported in all our phylogenetic analyses is herein defined as the Characidae and includes the characiform species lacking a supraorbital bone and with a derived position of the emergence of the hyoid artery from the anterior ceratohyal. To recognize this and several other monophyletic groups within characiforms we propose changes in the limits of several families to facilitate future studies in the Characiformes and particularly the Characidae.
This work presents a new phylogenetic framework for a speciose and morphologically diverse group of freshwater fishes of significant ecological and evolutionary importance across the Neotropics and portions of Africa.

Relevance:

10.00%

Publisher:

Abstract:

Consider a random medium consisting of N points randomly distributed so that there is no correlation among the distances separating them. This is the random link model, which is the high-dimensionality limit (mean-field approximation) of the Euclidean random point structure. In the random link model, at discrete time steps, a walker moves to the nearest point which has not been visited in the last mu steps (the memory), producing a deterministic partially self-avoiding walk (the tourist walk). We have analytically obtained the distribution of the number n of points explored by the walker with memory mu = 2, as well as the joint distribution of transient and period. This result enables us to explain the abrupt change in the exploratory behavior between the cases mu = 1 (memoryless walker, driven by extreme value statistics) and mu = 2 (walker with memory, driven by combinatorial statistics). In the mu = 1 case, the mean number of newly visited points in the thermodynamic limit (N >> 1) is just <n> = e = 2.72..., while in the mu = 2 case the mean number <n> of visited points grows proportionally to N^(1/2). This result also allows us to establish an equivalence between the random link model with mu = 2 and the random map (uncorrelated back and forth distances) with mu = 0, and to explain the abrupt change between the probabilities for null transient time and subsequent ones.
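The tourist walk itself is easy to simulate on the random link model: draw i.i.d. uniform pairwise distances, move to the nearest point outside the memory window, and detect the attractor by watching for a repeated window of the last mu visited sites. A sketch with illustrative parameters:

```python
import random

def tourist_walk(n_points, mu, seed=5):
    """Deterministic tourist walk on the random link model: pairwise
    distances are i.i.d. uniform (no correlations), and at each step the
    walker moves to the nearest point not visited in the last mu steps.
    Returns (transient length, attractor period)."""
    rng = random.Random(seed)
    # symmetric random distance matrix: the random link model
    d = [[0.0] * n_points for _ in range(n_points)]
    for i in range(n_points):
        for j in range(i + 1, n_points):
            d[i][j] = d[j][i] = rng.random()
    path = [0]
    seen = {}
    step = 0
    while True:
        # the walk is deterministic given the window of the last mu sites,
        # so a repeated window means the attractor cycle has been entered
        state = tuple(path[-mu:])
        if state in seen:
            return seen[state], step - seen[state]
        seen[state] = step
        cur = path[-1]
        taboo = set(state)
        nxt = min((q for q in range(n_points) if q not in taboo),
                  key=lambda q: d[cur][q])
        path.append(nxt)
        step += 1

transient, period = tourist_walk(200, 2)
```

Because the state space of memory windows is finite and the dynamics deterministic, the walk always falls onto a cycle; for mu = 2 the cycle cannot revisit a site within two steps, so its period is at least 3.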