84 results for Railroad safety, Bayesian methods, Accident modification factor, Countermeasure selection
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
In this paper, we introduce a Bayesian analysis for bioequivalence data assuming multivariate pharmacokinetic measures. With the introduction of correlation parameters between the pharmacokinetic measures, or between the random effects in the bioequivalence models, we observe a marked improvement in the bioequivalence results. These results are of great practical interest, since they can yield higher accuracy and reliability for the bioequivalence tests usually required by regulatory agencies. An example illustrates the proposed methodology by comparing the usual univariate bioequivalence methods with multivariate bioequivalence. We also consider some existing Bayesian discrimination methods to choose the best model to be used in bioequivalence studies.
Abstract:
Hardy-Weinberg Equilibrium (HWE) is an important genetic property that populations should exhibit whenever they are not subject to adverse conditions such as a complete lack of panmixia, excess mutation, excess selection pressure, etc. HWE has been evaluated for decades, and both frequentist and Bayesian methods are in use today. While historically the HWE formula was developed to examine the transmission of alleles in a population from one generation to the next, HWE concepts have also been applied in human disease studies to detect genotyping error and disease susceptibility (association); see Ryckman and Williams (2008). Most analyses focus on answering the question of whether a population is in HWE; they do not try to quantify how far from equilibrium the population is. In this paper, we propose the use of a simple disequilibrium coefficient for a locus with two alleles. Based on the posterior density of this disequilibrium coefficient, we show how one can conduct a Bayesian analysis to verify how far from HWE a population is. Other coefficients have been introduced in the literature; the advantage of the one introduced in this paper is that, just like standard correlation coefficients, its range is bounded and it is symmetric around zero (equilibrium) when comparing positive and negative values. To test the hypothesis of equilibrium, we use a simple Bayesian significance test, the Full Bayesian Significance Test (FBST); see Pereira, Stern and Wechsler (2008) for a complete review. The proposed disequilibrium coefficient provides an easy and efficient way to carry out the analyses, especially if one uses Bayesian statistics. An R routine (R Development Core Team, 2009) that implements the calculations is provided for the readers.
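The posterior analysis described above can be sketched with conjugate Dirichlet sampling. This is a minimal illustration, not the paper's exact coefficient: it uses the unnormalized disequilibrium D = p_AA - p_A^2, a flat Dirichlet prior, and an equal-tailed credible interval in place of the FBST; the genotype counts are hypothetical.

```python
import random

def posterior_D_samples(n_AA, n_Aa, n_aa, n_draws=5000, prior=1.0, seed=42):
    """Sample the Dirichlet posterior of the genotype probabilities and
    return draws of the disequilibrium coefficient D = p_AA - p_A**2."""
    rng = random.Random(seed)
    alphas = (n_AA + prior, n_Aa + prior, n_aa + prior)
    draws = []
    for _ in range(n_draws):
        # a Dirichlet draw is a normalized vector of independent Gamma draws
        g = [rng.gammavariate(a, 1.0) for a in alphas]
        total = sum(g)
        p_AA, p_Aa, _ = (x / total for x in g)
        p_A = p_AA + 0.5 * p_Aa
        draws.append(p_AA - p_A ** 2)
    return draws

# Hypothetical genotype counts close to HWE with allele frequency ~0.5:
d = sorted(posterior_D_samples(25, 50, 25))
lo, hi = d[int(0.025 * len(d))], d[int(0.975 * len(d))]
print(lo < 0.0 < hi)  # the 95% credible interval should straddle D = 0
```

Simulating the Dirichlet via independent Gamma draws keeps the sketch dependency-free; a full analysis would replace the interval check with the FBST evidence value.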
Abstract:
The PHENIX experiment at the Relativistic Heavy Ion Collider has performed systematic measurements of phi meson production in the K(+)K(-) decay channel at midrapidity in p + p, d + Au, Cu + Cu, and Au + Au collisions at root s(NN) = 200 GeV. Results are presented on the phi invariant yield and the nuclear modification factor R(AA) for Au + Au and Cu + Cu, and R(dA) for d + Au collisions, studied as a function of transverse momentum (1 < p(T) < 7 GeV/c) and centrality. In central and midcentral Au + Au collisions, the R(AA) of the phi exhibits a suppression relative to expectations from binary-scaled p + p results. The amount of suppression is smaller than that of the pi(0) and the eta in the intermediate p(T) range (2-5 GeV/c), whereas at higher p(T) the phi, pi(0), and eta show similar suppression. The baryon (proton and antiproton) excess observed in central Au + Au collisions at intermediate p(T) is not observed for the phi meson, despite the similar masses of the proton and the phi. This suggests that the excess is linked to the number of valence quarks in the hadron rather than its mass. The difference gradually disappears with decreasing centrality, and, for peripheral collisions, the R(AA) values for both particle species are consistent with binary scaling. Cu + Cu collisions show the same yield and suppression as Au + Au collisions for the same number of N(part). The R(dA) of the phi shows no evidence for cold nuclear effects within uncertainties.
Abstract:
In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689 [14], using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (in analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the "Automatic Dependent Surveillance-Broadcast" (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties; the FSPN formalism provides important modeling capabilities, and discrete event simulation allows the estimation of the desired safety metrics. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Sensitivity and specificity are measures that allow us to evaluate the performance of a diagnostic test. In practice, it is common to have situations where the true disease state cannot be verified for a proportion of the selected individuals, since verification may require an invasive procedure, as occurs with biopsy. This happens, as a special case, in the diagnosis of prostate cancer, and in any other situation in which verification is risky, impracticable, unethical, or too costly. In such cases, it is common to use diagnostic tests based only on the information from verified individuals, a procedure that can lead to biased results, known as workup bias. In this paper, we introduce a Bayesian approach to estimate the sensitivity and the specificity of two diagnostic tests considering both verified and unverified individuals, a result that generalizes the usual situation based on only one diagnostic test.
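As a point of contrast for the approach above, the naive verified-only estimates that suffer from workup bias are easy to sketch: with a Beta(1, 1) prior, each accuracy measure has a closed-form Beta posterior. The counts below are hypothetical, and this sketch deliberately ignores the unverified individuals that the paper's joint model accounts for.

```python
def beta_posterior_mean(successes, failures, a=1.0, b=1.0):
    """Posterior mean of a proportion under a Beta(a, b) prior."""
    return (successes + a) / (successes + failures + a + b)

# Hypothetical verified-only counts:
#   45 true positives, 5 false negatives among verified diseased patients;
#   80 true negatives, 20 false positives among verified non-diseased patients.
sensitivity = beta_posterior_mean(45, 5)   # P(test+ | disease+)
specificity = beta_posterior_mean(80, 20)  # P(test- | disease-)
print(round(sensitivity, 3), round(specificity, 3))  # → 0.885 0.794
```

Because verification is more likely for test-positive patients, these verified-only posteriors are systematically distorted, which is exactly the bias the paper's model with unverified individuals corrects.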
Abstract:
In this paper, we compare the performance of two statistical approaches for the analysis of data obtained from the social research area. In the first approach, we use normal models with joint regression modelling for the mean and for the variance heterogeneity. In the second approach, we use hierarchical models. In the first case, individual and social variables are included as explanatory variables in the regression modelling for both the mean and the variance, while in the second case, the variance at level 1 of the hierarchical model depends on the individuals (their ages), and at level 2 the variance is assumed to change according to socioeconomic stratum. Applying these methodologies, we analyze a Colombian tallness data set to find differences that can be explained by socioeconomic conditions. We also present some theoretical and empirical results concerning the two models. From this comparative study, we conclude that it is better to jointly model the mean and the variance heterogeneity in all cases. We also observe that convergence of the Gibbs sampling chain used in the Markov Chain Monte Carlo method for the joint modelling of the mean and variance heterogeneity is quickly achieved.
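The joint mean-variance model described above can be sketched as a heteroscedastic normal log-likelihood in which both the mean and the log-variance are linear in covariates. The parametrization below (mu = b0 + b1*x, log s2 = g0 + g1*x) is a common textbook form assumed here for illustration, not necessarily the exact specification in the paper.

```python
import math

def loglik(beta, gamma, xs, ys):
    """Log-likelihood of a normal model with regression structure on both
    the mean (mu = b0 + b1*x) and the log-variance (log s2 = g0 + g1*x)."""
    total = 0.0
    for x, y in zip(xs, ys):
        mu = beta[0] + beta[1] * x
        s2 = math.exp(gamma[0] + gamma[1] * x)  # log link guarantees s2 > 0
        total += -0.5 * (math.log(2 * math.pi * s2) + (y - mu) ** 2 / s2)
    return total

xs, ys = [0.0, 1.0], [1.0, 2.0]
good = loglik((1.0, 1.0), (0.0, 0.0), xs, ys)  # mean fits exactly, unit variance
bad = loglik((0.0, 0.0), (0.0, 0.0), xs, ys)   # mean misses both points
print(good > bad)  # → True
```

In a Bayesian fit, this likelihood would be combined with priors on beta and gamma and explored with the Gibbs sampling scheme the abstract mentions.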
Abstract:
In this paper, we introduce a Bayesian analysis for multivariate survival data in the presence of a covariate vector and censored observations. Different "frailties" or latent variables are considered to capture the correlation among the survival times for the same individual. We assume Weibull or generalized Gamma distributions for the right-censored lifetime data. We develop the Bayesian analysis using Markov Chain Monte Carlo (MCMC) methods.
Abstract:
We present results on strange and multistrange particle production in Au + Au collisions at root s(NN) = 62.4 GeV as measured with the STAR detector at RHIC. Midrapidity transverse momentum spectra and integrated yields of K(S)(0), Lambda, Xi, and Omega and their antiparticles are presented for different centrality classes. The particle yields and ratios follow a smooth energy dependence. Chemical freeze-out parameters, temperature, baryon chemical potential, and strangeness saturation factor obtained from the particle yields are presented. Intermediate transverse momentum (p(T)) phenomena are discussed based on the ratio of the measured baryon-to-meson spectra and the nuclear modification factor. The centrality dependence of the various measurements presented shows behavior similar to that seen in Au + Au collisions at root s(NN) = 200 GeV.
Abstract:
The contribution of B meson decays to nonphotonic electrons, which are mainly produced by the semileptonic decays of heavy-flavor mesons, in p + p collisions at root s = 200 GeV has been measured using azimuthal correlations between nonphotonic electrons and hadrons. The extracted B decay contribution is approximately 50% at a transverse momentum of p(T) >= 5 GeV/c. These measurements constrain the nuclear modification factor for electrons from B and D meson decays. The result indicates that B meson production in heavy ion collisions is also suppressed at high p(T).
Abstract:
We report on K*(0) production at midrapidity in Au + Au and Cu + Cu collisions at root s(NN) = 62.4 and 200 GeV collected by the Solenoidal Tracker at RHIC (STAR) detector. The K*(0) is reconstructed via the hadronic decays K*(0) -> K(+)pi(-) and anti-K*(0) -> K(-)pi(+). Transverse momentum, p(T), spectra are measured over a range of p(T) extending from 0.2 GeV/c up to 5 GeV/c. The center-of-mass energy and system size dependence of the rapidity density, dN/dy, and the average transverse momentum, <p(T)>, are presented. The measured N(K*(0))/N(K) and N(phi)/N(K*(0)) ratios favor the dominance of rescattering of the decay daughters of the K*(0) over hadronic regeneration for K*(0) production. In the intermediate p(T) region (2.0 < p(T) < 4.0 GeV/c), the elliptic flow parameter, v(2), and the nuclear modification factor, R(CP), agree with the expectations from the quark coalescence model of particle production.
Abstract:
Species of the genus Culex Linnaeus have been incriminated as the main vectors of lymphatic filariases and are important vectors of arboviruses, including West Nile virus. Sequences corresponding to a fragment of 478 bp of the cytochrome c oxidase subunit I gene, which includes part of the barcode region, of 37 individuals of 17 species of genus Culex were generated to establish relationships among five subgenera, Culex, Phenacomyia, Melanoconion, Microculex, and Carrollia, and one species of the genus Lutzia that occurs in Brazil. Bayesian methods were employed for the phylogenetic analyses. Results of sequence comparisons showed that individuals identified as Culex dolosus, Culex mollis, and Culex imitator possess high intraspecific divergence (3.1, 2.3, and 3.5%, respectively) when using the Kimura two parameters model. These differences were associated either with distinct morphological characteristics of the male genitalia or larval and pupal stages, suggesting that these may represent species complexes. The Bayesian topology suggested that the genus and subgenus Culex are paraphyletic relative to Lutzia and Phenacomyia, respectively. The cytochrome c oxidase subunit I sequences may be a useful tool to both estimate phylogenetic relationships and identify morphologically similar species of the genus Culex.
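The intraspecific divergences quoted above come from the Kimura two-parameter (K2P) model, which corrects observed differences for multiple substitutions while weighting transitions and transversions separately: d = -(1/2) ln[(1 - 2P - Q) sqrt(1 - 2Q)], where P and Q are the transition and transversion proportions. A minimal sketch follows; the toy sequences are hypothetical, not Culex barcodes.

```python
import math

PURINES = {"A", "G"}

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned, equal-length sequences."""
    assert len(seq1) == len(seq2)
    transitions = transversions = 0
    for a, b in zip(seq1, seq2):
        if a == b:
            continue
        # a transition stays within purines (A<->G) or pyrimidines (C<->T)
        if (a in PURINES) == (b in PURINES):
            transitions += 1
        else:
            transversions += 1
    p = transitions / len(seq1)
    q = transversions / len(seq1)
    return -0.5 * math.log((1 - 2 * p - q) * math.sqrt(1 - 2 * q))

# One A->G transition among ten aligned sites:
print(round(k2p_distance("ACGTACGTAC", "GCGTACGTAC"), 4))  # → 0.1116
```

Note that the corrected distance (0.1116) exceeds the raw proportion of differing sites (0.1), reflecting the model's adjustment for unobserved multiple hits.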
Abstract:
The toucan genus Ramphastos (Piciformes: Ramphastidae) has been a model in the formulation of Neotropical paleobiogeographic hypotheses. Weckstein (2005) reported on the phylogenetic history of this genus based on three mitochondrial genes, but some relationships were weakly supported and one of the subspecies of R. vitellinus (citreolaemus) was unsampled. This study expands on Weckstein (2005) by adding more DNA sequence data (including a nuclear marker) and more samples, including R. v. citreolaemus. Maximum parsimony, maximum likelihood, and Bayesian methods recovered similar trees, with nodes showing high support. A monophyletic R. vitellinus complex was strongly supported as the sister-group to R. brevis. The results also confirmed that the southeastern and northern populations of R. vitellinus ariel are paraphyletic. R. v. citreolaemus is sister to the Amazonian subspecies of the vitellinus complex. Using three protein-coding genes (COI, cytochrome-b and ND2) and interval-calibrated nodes under a Bayesian relaxed-clock framework, we infer that ramphastid genera originated in the middle Miocene to early Pliocene, Ramphastos species originated between the late Miocene and early Pleistocene, and intra-specific divergences took place throughout the Pleistocene. Parsimony-based reconstruction of ancestral areas indicated that the evolution of the four trans-Andean Ramphastos taxa (R. v. citreolaemus, R. a. swainsonii, R. brevis and R. sulfuratus) was associated with four independent dispersals from the cis-Andean region. The last pulse of Andean uplift may have been important for the evolution of R. sulfuratus, whereas the origin of the other trans-Andean Ramphastos taxa is consistent with vicariance due to drying events in the lowland forests north of the Andes. Estimated rates of molecular evolution were higher than the "standard" bird rate of 2% substitutions/site/million years for two of the three genes analyzed (cytochrome-b and ND2). (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
The substitution of missing values, also called imputation, is an important data preparation task for many domains. Ideally, the substitution of missing values should not insert biases into the dataset. This aspect has been usually assessed by some measures of the prediction capability of imputation methods. Such measures assume the simulation of missing entries for some attributes whose values are actually known. These artificially missing values are imputed and then compared with the original values. Although this evaluation is useful, it does not allow the influence of imputed values in the ultimate modelling task (e.g. in classification) to be inferred. We argue that imputation cannot be properly evaluated apart from the modelling task. Thus, alternative approaches are needed. This article elaborates on the influence of imputed values in classification. In particular, a practical procedure for estimating the inserted bias is described. As an additional contribution, we have used such a procedure to empirically illustrate the performance of three imputation methods (majority, naive Bayes and Bayesian networks) in three datasets. Three classifiers (decision tree, naive Bayes and nearest neighbours) have been used as modelling tools in our experiments. The achieved results illustrate a variety of situations that can take place in the data preparation practice.
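The evaluation procedure argued for above, judging imputation by its effect on the ultimate classification task, can be sketched end to end: mask some known entries, impute them, and compare a classifier's accuracy on the imputed data against the original. The dataset, the majority imputer, and the 1-nearest-neighbour classifier below are deliberately tiny stand-ins for the article's experimental setup.

```python
import random

def majority_impute(rows):
    """Fill each missing entry (None) with its column's most frequent observed value."""
    cols = list(zip(*rows))
    fills = []
    for col in cols:
        observed = [v for v in col if v is not None]
        fills.append(max(set(observed), key=observed.count))
    return [[fills[j] if v is None else v for j, v in enumerate(row)] for row in rows]

def loo_1nn_accuracy(rows, labels):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier (Hamming distance)."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    hits = 0
    for i, row in enumerate(rows):
        j = min((k for k in range(len(rows)) if k != i),
                key=lambda k: hamming(row, rows[k]))
        hits += labels[j] == labels[i]
    return hits / len(rows)

# Toy categorical dataset with known labels:
data = [["a", "x"], ["a", "x"], ["a", "y"], ["b", "y"], ["b", "y"], ["b", "x"]]
labels = [0, 0, 0, 1, 1, 1]

# Step 1: artificially delete ~30% of the (known) entries.
rng = random.Random(0)
masked = [[None if rng.random() < 0.3 else v for v in row] for row in data]
# Step 2: impute, then compare classification accuracy with and without imputation.
bias = loo_1nn_accuracy(data, labels) - loo_1nn_accuracy(majority_impute(masked), labels)
print(round(bias, 3))  # the accuracy gap estimates the bias inserted by imputation
```

The article's actual experiments use stronger imputers (naive Bayes, Bayesian networks) and classifiers (decision trees, nearest neighbours), but the estimation logic is the same accuracy comparison.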
Abstract:
Relativistic heavy ion collisions are the ideal experimental tool to explore the QCD phase diagram. Several results show that a very hot medium with a high energy density and partonic degrees of freedom is formed in these collisions, creating a new state of matter. Measurements of strange hadrons can bring important information about the bulk properties of such matter. The elliptic flow of strange hadrons such as phi, K(S)(0), Lambda, Xi and Omega shows that collectivity is developed at the partonic level and that at intermediate p(T) quark coalescence is the dominant mechanism of hadronization. The nuclear modification factor is another indicator of the presence of a very dense medium. The comparison between measurements of Au+Au and d+Au collisions, where only cold nuclear matter effects are expected, can shed more light on the bulk properties. In these proceedings, recent results from the STAR experiment on bulk matter properties are presented.
Abstract:
In the context of either Bayesian or classical sensitivity analyses of over-parametrized models for incomplete categorical data, it is well known that prior dependence of posterior inferences on nonidentifiable parameters, or the use of too-parsimonious over-parametrized models, may lead to erroneous conclusions. Nevertheless, some authors either pay no attention to which parameters are nonidentifiable or do not appropriately account for possible prior dependence. We review the literature on this topic and consider simple examples to emphasize that, in both inferential frameworks, the subjective components can influence results in nontrivial ways, irrespective of the sample size. Specifically, we show that prior distributions commonly regarded as slightly informative or noninformative may actually be too informative for nonidentifiable parameters, and that the choice of over-parametrized models may drastically impact the results, suggesting that a careful examination of their effects should be considered before drawing conclusions.