14 results for 230201 Probability Theory
at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Experiments show that for a large corpus, Zipf's law does not hold for all ranks of words: the frequencies fall below those predicted by Zipf's law for ranks greater than about 5,000 word types in English and about 30,000 word types in the inflected languages Irish and Latin. It also does not hold for syllables or words in the syllable-based languages Chinese and Vietnamese. However, when single words are combined with word n-grams in one list and put in rank order, the frequency of tokens in the combined list extends Zipf's law with a slope close to -1 on a log-log plot in all five languages. Further experiments have demonstrated the validity of this extension of Zipf's law to n-grams of letters, phonemes or binary bits in English. It is shown theoretically that probability theory alone can predict this behavior in randomly created n-grams of binary bits.
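The theoretical claim can be checked with a minimal simulation (our own sketch, not the authors' code): pool the frequencies of binary n-grams of every length drawn from a random bit string, rank them in one combined list, and fit the slope of log-frequency against log-rank.

```python
import math
import random
from collections import Counter

random.seed(0)
bits = "".join(random.choice("01") for _ in range(200_000))

# Combine n-grams of all lengths 1..8 into a single frequency list.
counts = Counter()
for n in range(1, 9):
    for i in range(len(bits) - n + 1):
        counts[bits[i:i + n]] += 1

freqs = sorted(counts.values(), reverse=True)

# Least-squares slope of log(frequency) against log(rank).
xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
ys = [math.log(f) for f in freqs]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"{len(freqs)} n-gram types, slope = {slope:.2f}")
```

Each n-gram of length n occurs with probability 2^-n while roughly 2^n types precede it in rank, so frequency scales as 1/rank and the fitted slope comes out close to -1, as the abstract predicts.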
Abstract:
The quick, easy way to master all the statistics you'll ever need. The bad news first: if you want a psychology degree you'll need to know statistics. Now for the good news: Psychology Statistics For Dummies. Featuring jargon-free explanations, step-by-step instructions and dozens of real-life examples, Psychology Statistics For Dummies makes the knotty world of statistics a lot less baffling. Rather than padding the text with concepts and procedures irrelevant to the task, the authors focus only on the statistics psychology students need to know. As an alternative to typical, lead-heavy statistics texts or supplements to assigned course reading, this is one book psychology students won't want to be without.
- Ease into statistics – start out with an introduction to how statistics are used by psychologists, including the types of variables they use and how they measure them
- Get your feet wet – quickly learn the basics of descriptive statistics, such as central tendency and measures of dispersion, along with common ways of graphically depicting information
- Meet your new best friend – learn the ins and outs of SPSS, the most popular statistics software package among psychology students, including how to input, manipulate and analyse data
- Analyse this – get up to speed on statistical analysis core concepts, such as probability and inference, hypothesis testing, distributions, Z-scores and effect sizes
- Correlate that – get the lowdown on common procedures for defining relationships between variables, including linear regressions, associations between categorical data and more
- Analyse by inference – master key methods in inferential statistics, including techniques for analysing independent groups designs and repeated-measures research designs
Open the book and find:
- Ways to describe statistical data
- How to use SPSS statistical software
- Probability theory and statistical inference
- Descriptive statistics basics
- How to test hypotheses
- Correlations and other relationships between variables
- Core concepts in statistical analysis for psychology
- Analysing research designs
Learn to:
- Use SPSS to analyse data
- Master statistical methods and procedures using psychology-based explanations and examples
- Create better reports
- Identify key concepts and pass your course
Abstract:
Local computation in join trees or acyclic hypertrees has been shown to be linked to a particular algebraic structure, called valuation algebra. There are many models of this algebraic structure, ranging from probability theory to numerical analysis, relational databases and various classical and non-classical logics. It turns out that many interesting models of valuation algebras may be derived from semiring-valued mappings. In this paper we study how valuation algebras are induced by semirings and how the structure of the valuation algebra is related to the algebraic structure of the semiring. In particular, c-semirings with idempotent multiplication induce idempotent valuation algebras and therefore permit particularly efficient architectures for local computation. Also important are semirings whose multiplicative semigroup is embedded in a union of groups. They induce valuation algebras with a partially defined division. For these valuation algebras, the well-known architectures for Bayesian networks apply. We also extend the general computational framework to allow derivation of bounds and approximations when exact computation is not feasible.
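A toy instance of a semiring-induced valuation algebra can illustrate the idea (our own illustrative sketch, not the paper's formalism): valuations are tables over finite variables, combination is pointwise semiring "multiplication", and marginalisation eliminates variables with semiring "addition". The same two functions then serve both the probability semiring (+, ×) and the tropical semiring (min, +).

```python
from itertools import product

def combine(f, vars_f, g, vars_g, domain, times):
    """Pointwise semiring product of two valuations over the union of
    their variable lists."""
    vars_h = vars_f + [v for v in vars_g if v not in vars_f]
    h = {}
    for vals in product(domain, repeat=len(vars_h)):
        env = dict(zip(vars_h, vals))
        h[vals] = times(f[tuple(env[v] for v in vars_f)],
                        g[tuple(env[v] for v in vars_g)])
    return h, vars_h

def marginalise(f, vars_f, keep, plus, zero):
    """Sum out (with semiring addition) every variable not in `keep`."""
    out = {}
    for vals, x in f.items():
        key = tuple(v for v, name in zip(vals, vars_f) if name in keep)
        out[key] = plus(out.get(key, zero), x)
    return out

# Probability semiring (+, *): P(B) from P(A) and P(B | A).
p_a = {(0,): 0.3, (1,): 0.7}
p_b_given_a = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}  # keyed (a, b)
joint, vs = combine(p_a, ["A"], p_b_given_a, ["A", "B"], (0, 1),
                    lambda x, y: x * y)
p_b = marginalise(joint, vs, ["B"], lambda x, y: x + y, 0.0)

# Tropical semiring (min, +): identical code now does min-cost elimination.
cost_a = {(0,): 1.0, (1,): 3.0}
cost_ab = {(0, 0): 2.0, (0, 1): 5.0, (1, 0): 4.0, (1, 1): 0.0}
total, vs2 = combine(cost_a, ["A"], cost_ab, ["A", "B"], (0, 1),
                     lambda x, y: x + y)
best_b = marginalise(total, vs2, ["B"], min, float("inf"))
```

Swapping the two semiring operations turns probabilistic marginalisation into shortest-path-style optimisation without touching the local-computation code, which is exactly the genericity the paper studies.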
Abstract:
Based on the Dempster-Shafer (D-S) theory of evidence and G. Yen's (1989) extension of the theory, the authors propose approaches to representing heuristic knowledge by evidential mapping and to pooling the mass distribution in a complex frame by partitioning that frame using Shafer's partition technique. The authors have generalized Yen's model from Bayesian probability theory to the D-S theory of evidence. Based on such a generalized model, an extended framework for evidential reasoning systems is briefly specified, in which a semi-graph method is used to describe the heuristic knowledge. The advantage of such a method is that it avoids the complexity of graphs without losing their explicitness. The extended framework can be widely used to build expert systems.
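The mass-pooling step at the heart of such frameworks is Dempster's rule of combination, which can be sketched in a few lines (the numbers below are hypothetical, not from the paper):

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements
    are frozensets; mass on conflicting (empty) intersections is
    renormalised away."""
    combined, conflict = {}, 0.0
    for b, p in m1.items():
        for c, q in m2.items():
            a = b & c
            if a:
                combined[a] = combined.get(a, 0.0) + p * q
            else:
                conflict += p * q
    if conflict >= 1.0:
        raise ValueError("sources are totally conflicting")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two mass functions on the frame {a, b}.
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"b"}): 0.5, frozenset({"a", "b"}): 0.5}
m = dempster_combine(m1, m2)
```

Here 0.3 of the product mass lands on the empty set and is redistributed, so the combined masses on {a}, {b} and {a, b} are 3/7, 2/7 and 2/7.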
Abstract:
Using a laboratory experiment, we investigate whether incentive compatibility affects subjective probabilities elicited via the exchangeability method (EM), an elicitation technique consisting of several chained questions. We hypothesize that subjects who are aware of the chaining behave strategically and provide invalid subjective probabilities, while subjects who are not aware of the chaining state their real beliefs and provide valid subjective probabilities. The validity of subjective probabilities is investigated using de Finetti's notion of coherence, under which probability estimates are valid if and only if they obey all axioms of probability theory.
Four experimental treatments are designed and implemented. Subjects are divided into two initial treatment groups: in the first, they are provided with real monetary incentives; in the second, they are not. Each group is further subdivided into two treatment groups: in the first, the chained structure of the experimental design is made clear to the subjects, while in the second the chained structure is hidden by randomizing the elicitation questions.
Our results suggest that subjects provided with monetary incentives and randomized questions provide valid subjective probabilities, because they are not aware of the chaining that undermines the incentive compatibility of the exchangeability method.
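The coherence criterion used to validate the elicited probabilities can be expressed as a simple check (a simplified sketch of the de Finetti condition for an exhaustive partition, not the paper's full elicitation protocol):

```python
def is_coherent(probs, tol=1e-9):
    """Coherence for probabilities elicited over an exhaustive set of
    mutually exclusive events: each probability lies in [0, 1] and the
    probabilities sum to 1."""
    if any(p < -tol or p > 1.0 + tol for p in probs):
        return False
    return abs(sum(probs) - 1.0) <= tol

# Four equally likely exchangeable intervals are coherent; a set of
# stated probabilities summing to 1.2 is not.
print(is_coherent([0.25, 0.25, 0.25, 0.25]))  # True
print(is_coherent([0.3, 0.3, 0.3, 0.3]))      # False
```

A subject gaming the chained questions typically produces exactly this kind of additivity violation, which is how invalid responses are flagged.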
Abstract:
The chain growth probability (alpha value) is one of the most significant parameters in Fischer-Tropsch (FT) synthesis. To gain insight into the chain growth probability, we systematically studied the hydrogenation and C-C coupling reactions with different chain lengths on the stepped Co(0001) surface using density functional theory calculations. Our findings elucidate the relationship between the barriers of these elementary reactions and the chain length. Moreover, we derived a general expression for the chain growth probability and investigated the behavior of the alpha value observed experimentally. The high methane yield results from the lower chain growth rate for C1 + C1 coupling compared with the other coupling reactions. Beyond C1, the deviation of the product distribution in FT synthesis from the Anderson-Schulz-Flory distribution is due to the chain length-dependent paraffin/olefin ratio.
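For reference, the ideal Anderson-Schulz-Flory distribution against which such deviations are measured follows directly from a chain-length-independent growth probability alpha (a textbook formula, not the paper's derived expression):

```python
def asf_weight_fraction(n, alpha):
    """Anderson-Schulz-Flory weight fraction of hydrocarbon chains with n
    carbon atoms, for chain growth probability alpha (0 < alpha < 1):
    W_n = n * (1 - alpha)**2 * alpha**(n - 1)."""
    return n * (1.0 - alpha) ** 2 * alpha ** (n - 1)

# The weight fractions over all chain lengths sum to 1; e.g. alpha = 0.85.
total = sum(asf_weight_fraction(n, 0.85) for n in range(1, 2001))
print(total)
```

A chain-length-dependent paraffin/olefin ratio, as found in the paper, makes the measured product weights depart from this geometric form beyond C1.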
Abstract:
Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. In contrast, combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric. They rely on a basic assumption: pieces of evidence being combined are considered to be on a par, i.e. to play the same role. When one source of evidence is less reliable than another, it is possible to discount it and then still use a symmetric combination operation. In the case of revision, the idea is to let an agent's prior knowledge be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning and a form of AGM revision.
Abstract:
We employ time-dependent R-matrix theory to study ultra-fast dynamics in the doublet 2s2p² configuration of C⁺ for a total magnetic quantum number M = 1. In contrast to the dynamics observed for M = 0, ultra-fast dynamics for M = 1 is governed by spin dynamics in which the 2s electron acts as a flag rather than a spectator electron. Under the assumption that M_S = 1/2, m_2s = 1/2 allows spin dynamics involving the two 2p electrons, whereas m_2s = -1/2 prevents spin dynamics of the two 2p electrons. For a pump-probe pulse scheme with ħω_pump = 10.9 eV and ħω_probe = 16.3 eV, and both pulses six cycles long, little sign of spin dynamics is observed in the total ionization probability. Signs of spin dynamics can be observed, however, in the ejected-electron momentum distributions. We demonstrate that the ejected-electron momentum distributions can be used for unaligned targets to separate the contributions of the initial M = 0 and M = 1 levels. This would, in principle, allow unaligned target ions to be used to obtain information on the different dynamics in the 2s2p² configuration for the M = 0 and M = 1 levels from a single experiment.
Abstract:
The results of calculations investigating the effects of autodetaching resonances on the multiphoton detachment spectra of H⁻ are presented. The R-matrix Floquet method is used, in which the coupling of the ion with the laser field is described non-perturbatively. The laser field is fixed at an intensity of 10 W cm, while frequency ranges are chosen such that the lowest autodetaching states of the ion are excited through a two- or three-photon transition from the ground state. Detachment rates are compared, where possible, to previous results obtained using perturbation theory. An illustration of how non-lowest-order processes, involving autodetaching states, can lead to light-induced continuum structures is also presented. Finally, it is demonstrated that by using a frequency connecting the 1s and 2s states, the probability of exciting the residual hydrogen atom is significantly enhanced.
Abstract:
A self-consistent relativistic two-fluid model is proposed for electron-ion plasma dynamics. A one-dimensional geometry is adopted. Electrons are treated as a relativistically degenerate fluid, governed by an appropriate equation of state. The ion fluid is also allowed to be relativistic, but is cold, nondegenerate, and subject only to an electrostatic potential. Exact stationary-profile solutions are sought, at the ionic scale, via the Sagdeev pseudopotential method. The analysis provides the pulse existence region, in terms of characteristic relativistic parameters, associated with the (ultrahigh) particle density.
Abstract:
Dry reforming is a promising reaction to utilise the greenhouse gases CO2 and CH4. Nickel-based catalysts are the most popular catalysts for the reaction, and the coke formation on the catalysts is the main obstacle to the commercialisation of dry reforming. In this study, the whole reaction network of dry reformation on both flat and stepped nickel catalysts (Ni(111) and Ni(211)) as well as nickel carbide (flat: Ni3C(001); stepped: Ni3C(111)) is investigated using density functional theory calculations. The overall reaction energy profiles in the free energy landscape are obtained, and kinetic analyses are utilised to evaluate the activity of the four surfaces. By careful examination of our results, we find the following regarding the activity: (i) flat surfaces are more active than stepped surfaces for the dry reforming and (ii) metallic nickel catalysts are more active than those of nickel carbide, and therefore, the phase transformation from nickel to nickel carbide will reduce the activity. With respect to the coke formation, the following is found: (i) the coke formation probability can be measured by the rate ratio of CH oxidation pathway to C oxidation pathway (r(CH)/r(C)) and the barrier of CO dissociation, (ii) on Ni(111), the coke is unlikely to form, and (iii) the coke formations on the stepped surfaces of both nickel and nickel carbide can readily occur. A deactivation scheme, using which experimental results can be rationalised, is proposed.
Abstract:
The combination of density functional theory (DFT) calculations and kinetic analyses is a very useful approach to study surface reactions in heterogeneous catalysis. The present paper reviews some recent work applying this approach to Fischer-Tropsch (FT) synthesis. Emphasis is placed on the following fundamental issues in FT synthesis: (i) reactive sites for both hydrogenation and C-C coupling reactions; (ii) reaction mechanisms including the carbene mechanism, CO-insertion mechanism and hydroxyl-carbene mechanism; (iii) selectivity with a focus on CH4 selectivity, alpha-olefin selectivity and chain growth probability; and (iv) activity.
Abstract:
The strong mixing of many-electron basis states in excited atoms and ions with open f shells results in very large numbers of complex, chaotic eigenstates that cannot be computed to any degree of accuracy. Describing the processes which involve such states requires the use of a statistical theory. Electron capture into these “compound resonances” leads to electron-ion recombination rates that are orders of magnitude greater than those of direct, radiative recombination and cannot be described by standard theories of dielectronic recombination. Previous statistical theories considered this as a two-electron capture process which populates a pair of single-particle orbitals, followed by “spreading” of the two-electron states into chaotically mixed eigenstates. This method is similar to a configuration-average approach because it neglects potentially important effects of spectator electrons and conservation of total angular momentum. In this work we develop a statistical theory which considers electron capture into “doorway” states with definite angular momentum obtained by the configuration interaction method. We apply this approach to electron recombination with W^20+, considering 2×10^6 doorway states. Despite strong effects from the spectator electrons, we find that the results of the earlier theories largely hold. Finally, we extract the fluorescence yield (the probability of photoemission and hence recombination) by comparison with experiment.
Abstract:
In this paper, we consider the transmission of confidential information over a κ-μ fading channel in the presence of an eavesdropper who also experiences κ-μ fading. In particular, we obtain novel analytical solutions for the probability of strictly positive secrecy capacity (SPSC) and a lower bound of secure outage probability (SOPL) for independent and non-identically distributed channel coefficients without parameter constraints. We also provide a closed-form expression for the probability of SPSC when the μ parameter is assumed to take positive integer values. Monte-Carlo simulations are performed to verify the derived results. The versatility of the κ-μ fading model means that the results presented in this paper can be used to determine the probability of SPSC and SOPL for a large number of other fading scenarios, such as Rayleigh, Rice (Nakagami-n), Nakagami-m, One-Sided Gaussian, and mixtures of these common fading models. In addition, due to the duality of the analysis of secrecy capacity and co-channel interference (CCI), the results presented here will have immediate applicability in the analysis of outage probability in wireless systems affected by CCI and background noise (BN). To demonstrate the efficacy of the novel formulations proposed here, we use the derived equations to provide a useful insight into the probability of SPSC and SOPL for a range of emerging wireless applications, such as cellular device-to-device, peer-to-peer, vehicle-to-vehicle, and body-centric communications using data obtained from real channel measurements.
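The Monte-Carlo verification step can be sketched as follows (our own sketch for integer μ using the standard cluster construction of the κ-μ envelope; the paper's closed-form expressions and measured parameters are not reproduced, and all names are ours). SPSC occurs whenever the main channel's instantaneous SNR exceeds the eavesdropper's, so P(SPSC) = P(γ_M > γ_E):

```python
import math
import random

def kappa_mu_power(kappa, mu, rng):
    """One kappa-mu fading power sample (integer mu): a sum over mu
    clusters of squared Gaussian in-phase/quadrature components whose
    means are fixed by kappa (dominant-to-scattered power ratio)."""
    sigma = 1.0
    p = sigma * math.sqrt(2.0 * kappa)   # per-cluster dominant-component mean
    return sum(rng.gauss(p, sigma) ** 2 + rng.gauss(0.0, sigma) ** 2
               for _ in range(mu))

def prob_spsc(kappa_m, mu_m, kappa_e, mu_e, snr_m=1.0, snr_e=1.0,
              trials=100_000, seed=1):
    """Monte-Carlo estimate of P(SPSC) = P(gamma_M > gamma_E)."""
    rng = random.Random(seed)
    wins = sum(snr_m * kappa_mu_power(kappa_m, mu_m, rng)
               > snr_e * kappa_mu_power(kappa_e, mu_e, rng)
               for _ in range(trials))
    return wins / trials

# Identically distributed legitimate and eavesdropper links: P(SPSC) ≈ 0.5.
p_sym = prob_spsc(1.0, 2, 1.0, 2)
```

Giving the legitimate receiver an SNR advantage (e.g. snr_m=4.0) pushes the estimate well above 0.5, which is the qualitative behaviour the closed-form results quantify exactly.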