990 results for Filtered probabilities
Abstract:
A number of studies have addressed the relationship between intra-personal uncertainty and inter-personal disagreement about the future values of economic variables such as output growth and inflation using the Survey of Professional Forecasters (SPF). By making use of the SPF respondents' probability forecasts of declines in output, we are able to construct a quarterly series of output growth uncertainty to supplement the annual series that are often used in such analyses. We also consider the relationship between disagreement and uncertainty for probability forecasts of declines in output.
Abstract:
I consider the possibility that respondents to the Survey of Professional Forecasters round their probability forecasts of the event that real output will decline in the future, as well as their reported output growth probability distributions. I make various plausible assumptions about respondents’ rounding practices, and show how these impinge upon the apparent mismatch between probability forecasts of a decline in output and the probabilities of this event implied by the annual output growth histograms. I find that rounding accounts for about a quarter of the inconsistent pairs of forecasts.
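A simple way to see how a rounding assumption works is to test whether a histogram-implied probability could have been rounded to the reported figure. The sketch below assumes a 5-percentage-point rounding grid, which is one plausible rounding practice; the function name and thresholds are illustrative, not taken from the paper.

```python
def consistent_under_rounding(reported, implied, grid=0.05):
    """True if `implied` could round, to the nearest multiple of `grid`,
    to the `reported` probability (assumed to lie on the grid already)."""
    return abs(implied - reported) <= grid / 2

# Hypothetical forecast pair: reported 10%, histogram-implied 12%.
print(consistent_under_rounding(0.10, 0.12))  # True: 12% rounds to 10%
print(consistent_under_rounding(0.10, 0.14))  # False: 14% rounds to 15%
```

Under this reading, an apparently "inconsistent" pair of forecasts is reclassified as consistent whenever the implied probability falls within half a grid step of the reported one.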
Abstract:
Solar wind/magnetosheath plasma in the magnetosphere can be identified as a component that has a higher charge state, lower density and, at least soon after its entry into the magnetosphere, lower energy than plasma from a terrestrial source. We survey here 3 years of observations of He2+ ions made by the Magnetospheric Ion Composition Sensor (MICS) of the Charge and Mass Magnetospheric Ion Composition Experiment (CAMMICE) instrument aboard POLAR. The occurrence probability of these solar wind ions is then plotted as a function of Magnetic Local Time (MLT) and invariant latitude (Λ) for various energy ranges. For all energies observed by MICS (1.8–21.4 keV) and all solar wind conditions, the occurrence probabilities peaked around the cusp region and along the dawn flank. The solar wind conditions were filtered to see if this dawnward asymmetry is controlled by the Svalgaard-Mansurov effect (and so depends on the BY component of the interplanetary magnetic field, IMF) or by Fermi acceleration of He2+ at the bow shock (and so depends on the IMF ratio BX/BY). It is shown that the asymmetry remained persistently on the dawn flank, suggesting it was not due to effects associated with direct entry into the magnetosphere. This asymmetry, with enhanced fluxes on the dawn flank, persisted for lower-energy ions (below a "cross-over" energy of about 23 keV) but reversed sense to give higher fluxes on the dusk flank at higher energies. This can be explained by the competing effects of gradient/curvature drifts and the convection electric field on ions that are convecting sunward on re-closed field lines. The lower-energy He2+ ions E × B drift dawnwards as they move earthward, whereas the higher-energy ions curvature/gradient drift towards dusk. The convection electric field in the tail is weaker for northward IMF. Ions then need less energy to drift to the dusk flank, so that the cross-over energy, at which the asymmetry changes sense, is reduced.
Abstract:
To understand the evolution of well-organized social behaviour, we must first understand the mechanism by which collective behaviour is established. In this study, the mechanisms of collective behaviour in a colony of social insects were studied in terms of the transition probability between active and inactive states, which is linked to mutual interactions. The active and inactive states of the social insects were statistically extracted from the velocity profiles. From the duration distributions of the two states, we found that 1) the durations of active and inactive states follow an exponential law, and 2) pair interactions increase the transition probability from inactive to active states. The regulation of the transition probability by paired interactions suggests that such interactions control the populations of active and inactive workers in the colony.
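The link between an exponential duration law and a transition probability can be sketched numerically: if durations in a state are exponential with rate λ, the maximum-likelihood estimate is the reciprocal of the mean duration, and the chance of switching within a short window Δt is 1 − exp(−λΔt). The data below are synthetic, generated purely to illustrate the estimator; nothing here reproduces the paper's measurements.

```python
import numpy as np

def transition_rate(durations):
    """MLE of the rate of an exponential duration distribution:
    lambda_hat = 1 / (mean duration)."""
    return 1.0 / np.mean(durations)

def transition_prob(rate, dt):
    """Probability of leaving the current state within a window dt,
    assuming exponentially distributed (memoryless) durations."""
    return 1.0 - np.exp(-rate * dt)

rng = np.random.default_rng(0)
# Synthetic inactive-state durations with true rate 1/20 per time unit.
inactive_durations = rng.exponential(scale=20.0, size=5000)
lam = transition_rate(inactive_durations)   # close to 0.05
p = transition_prob(lam, dt=1.0)            # per-unit-time switch probability
```

An interaction-driven increase in the inactive-to-active transition probability would then show up as a larger fitted λ for durations recorded just after a pair contact.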
Abstract:
It is known that patients may cease participating in a longitudinal study and become lost to follow-up. The objective of this article is to present a Bayesian model to estimate the malaria transition probabilities considering individuals lost to follow-up. We consider a homogeneous population, and it is assumed that the considered period of time is small enough to avoid two or more transitions from one state of health to another. The proposed model is based on a Gibbs sampling algorithm that uses information on loss to follow-up at the end of the longitudinal study. To simulate the unknown number of individuals with positive and negative states of malaria at the end of the study among those lost to follow-up, two latent variables were introduced in the model. We used a real data set and a simulated data set to illustrate the application of the methodology. The proposed model showed a good fit to these data sets, and the algorithm did not show problems of convergence or lack of identifiability. We conclude that the proposed model is a good alternative for estimating probabilities of transitions from one state of health to the other in studies with low adherence to follow-up.
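The data-augmentation idea behind such a sampler can be shown in miniature: treat the unknown end-of-study states of the dropouts as latent Bernoulli draws, then alternate between imputing them and sampling the transition probability from its Beta full conditional. This is a minimal two-state sketch with hypothetical counts and a flat prior, not the paper's full model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical counts: among individuals who started malaria-negative,
# n_neg completed follow-up, x_neg of them turned positive, and d_neg
# dropped out with an unobserved final state.
n_neg, x_neg, d_neg = 200, 30, 40

a, b = 1.0, 1.0          # Beta(1, 1) prior on the transition probability
p = 0.5                  # initial value
draws = []
for it in range(2000):
    # 1) impute the latent number of positives among the d_neg dropouts
    z = rng.binomial(d_neg, p)
    # 2) sample p from its Beta full conditional given observed + imputed data
    p = rng.beta(a + x_neg + z, b + (n_neg - x_neg) + (d_neg - z))
    if it >= 500:        # discard burn-in
        draws.append(p)

posterior_mean = float(np.mean(draws))   # near 31/202 for these counts
```

The same two-step scheme extends to both directions of transition (negative-to-positive and positive-to-negative), one latent variable per direction, as in the article.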
Abstract:
When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs while producing the lowest error of any approximation algorithm evaluated.
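The max-min criterion underlying MDP-IPs can be seen in a deliberately tiny example: when transition probabilities are only known to lie in intervals, "nature" picks the distribution in the credal set that minimises the agent's value at each backup. The two-state problem, rewards, and intervals below are invented for illustration; real MDP-IP solvers handle general credal sets via the optimization calls the paper seeks to avoid, whereas here the worst case sits at an interval endpoint because the objective is linear in the free probability.

```python
# Robust (pessimistic) value iteration for a 2-state toy MDP-IP with one
# action per state. P[s] is the interval for Pr(next state = 1 | s).
R = [0.0, 1.0]           # hypothetical rewards
gamma = 0.9
P = [(0.2, 0.5), (0.6, 0.9)]

V = [0.0, 0.0]
for _ in range(200):
    newV = []
    for s in range(2):
        lo, hi = P[s]
        # Nature chooses the transition probability in [lo, hi] that
        # minimises the agent's expected continuation value.
        worst = min(p1 * V[1] + (1 - p1) * V[0] for p1 in (lo, hi))
        newV.append(R[s] + gamma * worst)
    V = newV
# Fixed point for these numbers: V = [2.8125, 4.375]
```

Each backup is a min over the credal set inside a max over actions (trivial here with one action), which is exactly the step the factored approximations in the paper accelerate.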
Abstract:
We study the problem of the existence of filtered multiplicative bases of a restricted enveloping algebra u(L), where L is a finite-dimensional and p-nilpotent restricted Lie algebra over a field of positive characteristic p.
Abstract:
High-level CASSCF/MRCI calculations with a quintuple-zeta quality basis set are reported, characterizing for the first time a manifold of electronic states of the CAs radical yet to be investigated experimentally. Along with the potential energy curves and the associated spectroscopic constants, the dipole moment functions for selected electronic states as well as the transition dipole moment functions for the most relevant electronic transitions are also presented. Estimates of radiative transition probabilities and lifetimes complement this investigation, which also assesses the effect of spin-orbit interaction on the A (2)Pi state. Whenever pertinent, comparisons of similarities and differences with the isovalent CN and CP radicals are made.
Abstract:
Genetic algorithms are commonly used to solve combinatorial optimization problems. The implementation evolves using genetic operators (crossover, mutation, selection, etc.). However, genetic algorithms, like some other methods, have parameters (population size, probabilities of crossover and mutation) which need to be tuned or chosen. In this paper, our work builds on an existing hybrid genetic algorithm for the multiprocessor scheduling problem. We propose a hybrid Fuzzy-Genetic Algorithm (FLGA) approach to solve the multiprocessor scheduling problem. The algorithm consists in adding a fuzzy logic controller to control and dynamically tune different parameters (probabilities of crossover and mutation), in an attempt to improve the algorithm's performance. For this purpose, we design a fuzzy logic controller based on fuzzy rules to control the probabilities of crossover and mutation. Compared with the Standard Genetic Algorithm (SGA), the results clearly demonstrate that the FLGA method performs significantly better.
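The controller's job, stripped to its essence, is to map a state of the search (for instance, population diversity) to updated operator probabilities. The sketch below uses a single linear rule as a toy stand-in for a fuzzy rule base; the function name, bounds, and the diversity-in-[0, 1] convention are assumptions for illustration, not the FLGA's actual rules.

```python
def adapt_mutation_prob(diversity, pm_min=0.01, pm_max=0.30):
    """Toy stand-in for a fuzzy controller: map population diversity
    in [0, 1] to a mutation probability. Low diversity -> more mutation
    to escape premature convergence; high diversity -> less mutation to
    preserve good building blocks. (A real fuzzy controller would
    fuzzify `diversity`, fire rules, and defuzzify instead of this
    linear interpolation.)"""
    diversity = max(0.0, min(1.0, diversity))   # clamp out-of-range inputs
    return pm_max - (pm_max - pm_min) * diversity

print(adapt_mutation_prob(0.0))  # 0.30: fully converged population
print(adapt_mutation_prob(1.0))  # 0.01: maximally diverse population
```

The same shape of mapping, run once per generation for both the crossover and mutation probabilities, is what replaces the fixed parameter settings of the SGA.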
Abstract:
A crucial aspect of evidential reasoning in crime investigation involves comparing the support that evidence provides for alternative hypotheses. Recent work in forensic statistics has shown how Bayesian Networks (BNs) can be employed for this purpose. However, the specification of BNs requires conditional probability tables describing the uncertain processes under evaluation. When these processes are poorly understood, it is necessary to rely on subjective probabilities provided by experts. Accurate probabilities of this type are normally hard to acquire from experts. Recent work in qualitative reasoning has developed methods to perform probabilistic reasoning using coarser representations. However, the latter types of approaches are too imprecise to compare the likelihood of alternative hypotheses. This paper examines this shortcoming of the qualitative approaches when applied to the aforementioned problem, and identifies and integrates techniques to refine them.
Abstract:
An operational complexity model (OCM) is proposed to enable the complexity of both the cognitive and the computational components of a process to be determined. From the complexity of formation of a set of traces via a specified route a measure of the probability of that route can be determined. By determining the complexities of alternative routes leading to the formation of the same set of traces, the odds ratio indicating the relative plausibility of the alternative routes can be found. An illustrative application to a BitTorrent piracy case is presented, and the results obtained suggest that the OCM is capable of providing a realistic estimate of the odds ratio for two competing hypotheses. It is also demonstrated that the OCM can be straightforwardly refined to encompass a variety of circumstances.
Abstract:
We apply the concept of exchangeable random variables to the case of non-additive probability distributions exhibiting uncertainty aversion, in the class generated by a convex core (convex non-additive probabilities with a convex core). We are able to prove two versions of the law of large numbers (de Finetti's theorems). By making use of two definitions of independence we prove two versions of the strong law of large numbers. It turns out that we cannot assure the convergence of the sample averages to a constant. We then model the case where there is a "true" probability distribution behind the successive realizations of the uncertain random variable. In this case convergence occurs. This result is important because it renders true the intuition that it is possible "to learn" the "true" additive distribution behind an uncertain event if one repeatedly observes it (a sufficiently large number of times). We also provide a conjecture regarding the "learning" (or updating) process above, and prove a partial result for the case of the Dempster-Shafer updating rule and binomial trials.
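The failure of convergence to a single constant has a standard formulation in the literature on laws of large numbers for capacities, which the abstract's claim is consistent with: for a convex non-additive probability \(\nu\) with conjugate \(\bar{\nu}(A) = 1 - \nu(A^c)\), the sample averages are only pinned down between the lower and upper (Choquet) expectations. One common statement of this kind is:

```latex
\int X \, d\nu
\;\le\;
\liminf_{n\to\infty} \frac{1}{n}\sum_{i=1}^{n} X_i
\;\le\;
\limsup_{n\to\infty} \frac{1}{n}\sum_{i=1}^{n} X_i
\;\le\;
\int X \, d\bar{\nu}
```

The two bounds coincide, and hence the averages converge, precisely when \(\nu\) is additive, which is why observing a "true" additive distribution behind the realizations restores convergence.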
Abstract:
This dissertation presents two papers on how to use simple systemic risk measures to assess portfolio risk characteristics. The first paper deals with the Granger-causation of systemic risk indicators based on correlation matrices of stock returns. Special focus is devoted to the Eigenvalue Entropy, as previous literature indicated strong results, though without considering different macroeconomic scenarios; the Index Cohesion Force and the Absorption Ratio are also considered. Considering the S&P500, there is no evidence of Granger-causation from Eigenvalue Entropies and the Index Cohesion Force. The Absorption Ratio Granger-caused both the S&P500 and the VIX index, being the only simple measure that passed this test. The second paper develops this measure to capture the regimes underlying the American stock market. New indicators are built using filtering and random matrix theory. The returns of the S&P500 are modelled as a mixture of normal distributions. The activation of each normal distribution is governed by a Markov chain with the transition probabilities being a function of the indicators. The model shows that an indicator based on a Herfindahl-Hirschman Index of the normalized eigenvalues exhibits the best fit to the returns from 1998-2013.
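An eigenvalue-concentration indicator of this kind is short to compute: normalise the eigenvalues of the return correlation matrix to sum to one and take their Herfindahl-Hirschman Index. The function below is a generic sketch on synthetic data, not the dissertation's filtered, random-matrix-cleaned construction; the data-generating choices are assumptions.

```python
import numpy as np

def eigen_hhi(returns):
    """Herfindahl-Hirschman Index of the normalised eigenvalues of the
    correlation matrix of `returns` (T x N array). Lies in [1/N, 1]:
    1/N when risk is spread evenly across modes, near 1 when a single
    market mode dominates."""
    corr = np.corrcoef(returns, rowvar=False)
    w = np.linalg.eigvalsh(corr)
    w = w / w.sum()                  # trace of a correlation matrix is N
    return float(np.sum(w ** 2))

rng = np.random.default_rng(2)
# Uncorrelated assets: eigenvalues roughly equal, HHI near 1/10.
hhi_low = eigen_hhi(rng.standard_normal((2000, 10)))
# One strong common factor: a dominant eigenvalue, HHI near 1.
common = rng.standard_normal((2000, 1))
hhi_high = eigen_hhi(common + 0.1 * rng.standard_normal((2000, 10)))
```

Feeding a rolling-window version of this index into the transition probabilities of the Markov chain is the step that lets the mixture model switch regimes when eigenvalue concentration spikes.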