108 results for PROBABILISTIC TELEPORTATION
Abstract:
In this paper we report on our attempts to fit the optimal data selection (ODS) model (Oaksford & Chater, 1994; Oaksford, Chater, & Larkin, 2000) to the selection task data reported in Feeney and Handley (2000) and Handley, Feeney, and Harper (2002). Although Oaksford (2002b) reports good fits to the data described in Feeney and Handley (2000), the model does not adequately capture the data described in Handley et al. (2002). Furthermore, across all six of the experiments modelled here, the ODS model does not predict participants' behaviour at the level of selection rates for individual cards. Finally, when people's probability estimates are used in the modelling exercise, the model adequately captures only 1 out of 18 conditions described in Handley et al. We discuss the implications of these results for models of the selection task and claim that they support deductive, rather than probabilistic, accounts of the task.
Abstract:
Three experiments examined the influence of a second rule on the pattern of card selections on Wason's selection task. In Experiment 1 participants received a version of the task with a single test rule, or one of two versions of the task with the same original test rule together with a second rule. The probability of q was manipulated in the two-rules conditions by varying the size of the antecedent set in the second rule. The results showed a significant suppression of q card and not-p card selections in the alternative-rule conditions, but no difference as a function of antecedent set size. In Experiment 2 the size of the antecedent set in the two-rules conditions was manipulated using the context of a computer printing double-sided cards. The results showed a significant reduction of q card selections in the two-rules conditions, but no effect of p set size. In Experiment 3 the scenario accompanying the rule was manipulated, and it specified either a single alternative antecedent or a number of alternative antecedents. The q card selection rates were not affected by the scenario manipulation but again were suppressed by the presence of a second rule. Our results suggest that people make inferences about the unseen side of the cards when engaging with the task and that these inferences are systematically influenced by the presence of a second rule, but are not influenced by the probabilistic characteristics of this rule. These findings are discussed in the context of decision theoretic views of selection task performance (Oaksford & Chater, 1994).
Abstract:
Objectives: The Secondary Prevention of Heart disEase in geneRal practicE (SPHERE) trial has recently reported. This study examines the cost-effectiveness of the SPHERE intervention in both healthcare systems on the island of Ireland. Methods: Incremental cost-effectiveness analysis. A probabilistic model was developed to combine within-trial and beyond-trial impacts of treatment to estimate the lifetime costs and benefits of two secondary prevention strategies: Intervention - tailored practice and patient care plans; and Control - standardized usual care. Results: The intervention strategy resulted in mean cost savings per patient of 512.77 (95 percent confidence interval [CI], 1086.46-91.98) and an increase in mean quality-adjusted life-years (QALYs) per patient of 0.0051 (95 percent CI, 0.0101-0.0200), when compared with the control strategy. The probability of the intervention being cost-effective was 94 percent if decision makers are willing to pay €45,000 per additional QALY. Conclusions: Decision makers in both settings must determine whether the level of evidence presented is sufficient to justify the adoption of the SPHERE intervention in clinical practice. Copyright © Cambridge University Press 2010.
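The 94 percent figure above is the kind of quantity read off a cost-effectiveness acceptability curve: the share of probabilistic-sensitivity-analysis draws with positive net monetary benefit at the stated willingness-to-pay threshold. A minimal sketch of that calculation, using entirely hypothetical draws rather than the SPHERE trial outputs:

```python
import numpy as np

# Hypothetical probabilistic sensitivity analysis (PSA) draws; the SPHERE
# trial figures are not reproduced here.
rng = np.random.default_rng(0)
delta_cost = rng.normal(-500.0, 300.0, 10_000)   # incremental cost per patient (euro)
delta_qaly = rng.normal(0.005, 0.008, 10_000)    # incremental QALYs per patient

wtp = 45_000                                     # willingness to pay per QALY (euro)
nmb = wtp * delta_qaly - delta_cost              # net monetary benefit per draw
print(f"P(cost-effective at EUR {wtp:,}/QALY) = {np.mean(nmb > 0):.2f}")
```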
Abstract:
Flutter prediction as currently practiced is usually deterministic, with a single structural model used to represent an aircraft. By using interval analysis to take into account structural variability, recent work has demonstrated that small changes in the structure can lead to very large changes in the altitude at which flutter occurs (Marques, Badcock, et al., J. Aircraft, 2010). In this follow-up work we examine the same phenomenon using probabilistic collocation (PC), an uncertainty quantification technique which can efficiently propagate multivariate stochastic input through a simulation code, in this case an eigenvalue-based fluid-structure stability code. The resulting analysis predicts the consequences of an uncertain structure on the incidence of flutter in probabilistic terms, information that could be useful in planning flight-tests and assessing the risk of structural failure. The uncertainty in flutter altitude is confirmed to be substantial. Assuming that the structural uncertainty represents an epistemic uncertainty regarding the structure, it may be reduced with the availability of additional information, for example aeroelastic response data from a flight-test. Such data is used to update the structural uncertainty using Bayes' theorem. The consequent flutter uncertainty is significantly reduced across the entire Mach number range.
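The Bayesian updating step mentioned above can be illustrated with a deliberately simplified one-parameter example: a prior over an uncertain stiffness-like parameter is updated on a grid with a single noisy aeroelastic response measurement. The stand-in response model, prior and noise level are assumptions for illustration, not the paper's eigenvalue-based stability code.

```python
import numpy as np

# One-parameter illustration of updating structural uncertainty with flight-test
# data via Bayes' theorem; the model, prior and noise level are all hypothetical.
theta = np.linspace(0.8, 1.2, 401)                      # normalised stiffness-like parameter
prior = np.exp(-0.5 * ((theta - 1.0) / 0.05) ** 2)      # Gaussian prior belief
prior /= prior.sum()

def predicted_frequency(t):
    """Stand-in aeroelastic model: response frequency grows with stiffness."""
    return 5.0 * np.sqrt(t)                             # Hz

observed, sigma = 5.12, 0.05                            # flight-test measurement and noise (Hz)
likelihood = np.exp(-0.5 * ((observed - predicted_frequency(theta)) / sigma) ** 2)

posterior = prior * likelihood                          # Bayes' theorem on the grid
posterior /= posterior.sum()

def std(p):
    mean = (theta * p).sum()
    return np.sqrt(((theta - mean) ** 2 * p).sum())

print("prior std: %.4f  posterior std: %.4f" % (std(prior), std(posterior)))
```

The posterior standard deviation comes out smaller than the prior one, which is the sense in which flight-test data reduce the structural (and hence flutter) uncertainty.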
Abstract:
In this paper we present a new method for simultaneously determining the three-dimensional (3-D) shape and motion of a non-rigid object from uncalibrated two-dimensional (2-D) images without assuming the distribution characteristics. A non-rigid motion can be treated as a combination of a rigid rotation and a non-rigid deformation. To obtain an accurate recovery of deformable structures, we estimate the probability distribution function of the corresponding features through random sampling, incorporating an established probabilistic model. The fit between the observation and the projection of the estimated 3-D structure is evaluated using a Markov chain Monte Carlo based expectation maximisation algorithm. Applications of the proposed method to both synthetic and real image sequences are demonstrated with promising results.
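As a rough illustration of the random-sampling machinery referred to above, the sketch below runs a one-parameter Metropolis sampler for a deformation coefficient observed through a hypothetical linear projection model; it is not the paper's MCMC-based expectation maximisation formulation.

```python
import numpy as np

# One-parameter Metropolis sampler: infer a deformation coefficient c from a noisy
# 2-D observation of a 3-D point under a hypothetical orthographic projection.
rng = np.random.default_rng(7)
base  = np.array([0.0, 0.0, 1.0])                     # hypothetical mean 3-D shape (one point)
basis = np.array([0.0, 1.0, 0.5])                     # hypothetical deformation basis
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])                       # orthographic projection matrix
sigma, c_true = 0.05, 0.8

y_obs = P @ (base + c_true * basis) + rng.normal(0.0, sigma, 2)

def log_post(c):
    """Gaussian likelihood of the 2-D observation plus a standard normal prior on c."""
    resid = y_obs - P @ (base + c * basis)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 - 0.5 * c ** 2

samples, c = [], 0.0
for _ in range(5000):
    prop = c + rng.normal(0.0, 0.1)                   # symmetric random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(c):
        c = prop
    samples.append(c)

print("posterior mean of c: %.2f (true %.2f)" % (np.mean(samples[1000:]), c_true))
```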
Abstract:
Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. In contrast, combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric. They rely on a basic assumption, namely that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it and then still use a symmetric combination operation. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning and a form of AGM revision.
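For reference, Dempster's rule of combination on a small frame of discernment can be written out directly. The sketch below uses a hypothetical two-element frame and illustrative mass assignments; it shows the symmetric combination rule discussed above, not the asymmetric revision rule the paper advocates.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions over frozenset focal sets."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb              # mass assigned to the empty intersection
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Frame of discernment {x, y}; the mass values are illustrative only.
m_prior = {frozenset({"x"}): 0.6, frozenset({"x", "y"}): 0.4}
m_input = {frozenset({"y"}): 0.3, frozenset({"x", "y"}): 0.7}
print(dempster_combine(m_prior, m_input))
```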
Abstract:
The purpose of this paper is to examine IT adoption by Irish credit unions. Using probabilistic models, we explore one aspect of IT, internet banking technology, and assess the degree to which characteristics specific to the credit union and to its potential membership base influence adoption. Our analysis suggests that asset size, organisational structure (membership of the Irish League of Credit Unions) and the loan to asset ratio are all important credit-union-specific drivers of internet banking adoption. We also find that characteristics of the area from which the credit union draws its members are important. Factors such as the percentage of the population that is employed, the proportion of the population aged 35 to 44, the proportion of the population with access to broadband, and the level of familiarity with a local ATM facility are all identified as influencing the probability of adopting internet banking.
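The probabilistic models referred to above are binary adoption models. A minimal sketch of that style of analysis, with entirely synthetic data and hypothetical variable names (the paper's exact specification, e.g. logit versus probit, is not given here), is:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in for credit-union records; all variable names are hypothetical.
rng = np.random.default_rng(1)
n = 300
log_assets    = rng.normal(16.0, 1.0, n)      # log of asset size
ilcu_member   = rng.integers(0, 2, n)         # member of the Irish League of Credit Unions
loan_to_asset = rng.uniform(0.2, 0.8, n)
pct_broadband = rng.uniform(0.1, 0.9, n)      # broadband access in the catchment area

# Hypothetical data-generating process, only so that the example runs end to end.
lin = -20 + 1.1 * log_assets + 0.8 * ilcu_member + 1.5 * loan_to_asset + 2.0 * pct_broadband
adopt = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))   # 1 = offers internet banking

X = sm.add_constant(np.column_stack([log_assets, ilcu_member, loan_to_asset, pct_broadband]))
fit = sm.Logit(adopt, X).fit(disp=False)              # a probit could be used instead
print(fit.params)                                     # estimated adoption drivers
```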
Abstract:
Background: Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. Principal Findings: In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate ‘stepping stone’ populations yet to be discovered. Conclusions/Significance: We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method in generating quantitative, testable hypotheses at multiple levels to resolve the mismatch between observed and expected distributions: probabilistic predictions of the locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming deep-sea larvae, giving information on dispersal corridors and population densities in habitats that remain intrinsically difficult to assess.
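A stripped-down version of such a dispersal simulation advects passive larvae with a mean current plus a random-walk term for a fixed larval duration. The velocity, diffusivity and duration below are placeholders, not the Argo-derived values used in the study.

```python
import numpy as np

# Passive-larva advection-diffusion sketch; currents, diffusivity and larval
# duration are placeholders, not Argo-derived values.
rng = np.random.default_rng(2)
n_larvae, n_days = 1000, 30           # number of larvae, pelagic larval duration
dt = 86_400.0                         # one day in seconds
u, v = 0.03, 0.01                     # mean current components (m/s)
kappa = 100.0                         # horizontal eddy diffusivity (m^2/s)

pos = np.zeros((n_larvae, 2))         # positions relative to the source population (m)
for _ in range(n_days):
    drift = np.array([u, v]) * dt
    noise = rng.normal(0.0, np.sqrt(2.0 * kappa * dt), size=(n_larvae, 2))
    pos += drift + noise              # deterministic advection + random-walk dispersion

distances_km = np.hypot(pos[:, 0], pos[:, 1]) / 1000.0
print("median single-generation dispersal: %.1f km" % np.median(distances_km))
```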
Abstract:
This paper presents a new algorithm for learning the structure of a special type of Bayesian network. The conditional phase-type (C-Ph) distribution is a Bayesian network that models the probabilistic causal relationships between a skewed continuous variable, modelled by the Coxian phase-type distribution (a special type of Markov model), and a set of interacting discrete variables. The algorithm takes a dataset as input and produces the structure, parameters and graphical representations of the fit of the C-Ph distribution as output. The algorithm, which uses a greedy-search technique and has been implemented in MATLAB, is evaluated using a simulated data set consisting of 20,000 cases. The results show that the original C-Ph distribution is recaptured, and the fit of the network to the data is discussed.
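For context, the continuous component of the C-Ph distribution, the Coxian phase-type distribution, can be simulated directly from its phase structure. The rates below are hypothetical and unrelated to the paper's learned parameters.

```python
import numpy as np

def sample_coxian(lam, mu, size, rng):
    """Sample a Coxian phase-type distribution.

    lam[i] is the rate of moving from phase i to phase i+1;
    mu[i] is the rate of absorption (exit) from phase i.
    """
    out = np.empty(size)
    for k in range(size):
        t, phase = 0.0, 0
        while True:
            onward = lam[phase] if phase < len(lam) else 0.0
            total = onward + mu[phase]
            t += rng.exponential(1.0 / total)           # holding time in the current phase
            if phase >= len(lam) or rng.random() < mu[phase] / total:
                break                                   # absorbed from this phase
            phase += 1                                  # otherwise move to the next phase
        out[k] = t
    return out

rng = np.random.default_rng(3)
samples = sample_coxian(lam=[1.5, 0.8], mu=[0.4, 0.6, 1.2], size=20_000, rng=rng)
print("mean %.2f, median %.2f (mean > median: right-skewed)" % (samples.mean(), np.median(samples)))
```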
Abstract:
In this paper we study the influence of interventions on self-interactions in a spatial Prisoner's Dilemma on a two-dimensional grid with periodic boundary conditions and synchronous updating of the dynamics. We investigate two different types of self-interaction modification. The first type (FSIP) is deterministic, scaling each self-interaction of a player by a constant factor, whereas the second type (PSIP) performs probabilistic interventions. Both types of intervention lead to a reduction of the payoff of the players and hence represent inhibiting effects. We find that a constant but moderate reduction of self-interactions has a very beneficial effect on the evolution of cooperators in the population, whereas probabilistic interventions on self-interactions are in general counterproductive for the coexistence of the two different strategies. (C) 2011 Elsevier Inc. All rights reserved.
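The FSIP-style modification can be seen in a bare-bones version of the model: players on a periodic grid play the Prisoner's Dilemma with their four neighbours and with themselves, the self-interaction payoff is scaled by a constant factor, and everyone synchronously imitates the best-scoring neighbour. The payoff values, grid size and scaling factor below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
L, b, alpha, steps = 50, 1.6, 0.7, 50       # grid size, temptation payoff, self-interaction factor
S = rng.integers(0, 2, size=(L, L))         # 1 = cooperate, 0 = defect

def payoffs(S):
    """Payoff of each site against its four neighbours plus a scaled self-interaction."""
    gain = np.where(S == 1, 1.0, b)         # payoff earned per game against a cooperator
    P = np.zeros((L, L))
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        P += gain * np.roll(S, shift, axis=(0, 1))   # only games against cooperators pay
    return P + alpha * gain * S             # FSIP-style reduced self-interaction

for _ in range(steps):                      # synchronous imitation of the best-scoring neighbour
    P = payoffs(S)
    best_S, best_P = S.copy(), P.copy()
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        NP, NS = np.roll(P, shift, axis=(0, 1)), np.roll(S, shift, axis=(0, 1))
        take = NP > best_P
        best_S, best_P = np.where(take, NS, best_S), np.where(take, NP, best_P)
    S = best_S

print("fraction of cooperators after %d steps: %.2f" % (steps, S.mean()))
```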
Abstract:
Throughout the design development of a satellite structure, the stress engineer is usually challenged by randomness in applied loads and material properties. To overcome this problem, a risk-based design approach is applied which estimates the probability of failure of the satellite structure under static and thermal loads. Determining the probability of failure can help to update the factors of safety that were initially applied during the structure's preliminary design phase; these factors of safety are related to the satellite mission objective. Sensitivity-based analysis is implemented in the context of finite element analysis (the probabilistic finite element method, or stochastic finite element method (SFEM)) to determine the probability of failure for the satellite structure or one of its components.
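The probability-of-failure calculation underlying such a risk-based design can be sketched as a plain Monte Carlo evaluation of a limit state; in practice an SFEM code would replace the closed-form stress used here, and all distributions below are hypothetical.

```python
import numpy as np

# Monte Carlo estimate of the probability of failure for a simple limit state
# g = strength - stress; all distributions are hypothetical.
rng = np.random.default_rng(5)
n = 200_000

yield_strength = rng.normal(270e6, 15e6, n)   # Pa, material variability
applied_load   = rng.normal(22e3, 3e3, n)     # N, combined static + thermal load effect
section_area   = rng.normal(1.0e-4, 5e-6, n)  # m^2, manufacturing variability

stress = applied_load / section_area
g = yield_strength - stress                    # limit-state function: failure when g < 0
print(f"estimated probability of failure: {np.mean(g < 0):.3f}")
```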
Abstract:
In collaboration with Airbus-UK, the dimensional growth of aircraft panels while being riveted with stiffeners is investigated. Small panels are used in this investigation. The stiffeners have been fastened to the panels with rivets, and it has been observed that during this operation the panels expand in the longitudinal and transverse directions. The growth is variable, and the challenge is to control the riveting process to minimize this variability. In this investigation, the assembly of the small panels and longitudinal stiffeners has been simulated using static stress and nonlinear explicit finite element models. The models have been validated against a limited set of experimental measurements; it was found that more accurate predictions of the riveting process are achieved using explicit finite element models, yet the static stress finite element model is more time-efficient and more practical for simulating hundreds of rivets and the stochastic nature of the process. Furthermore, through a series of numerical simulations and probabilistic analyses, the manufacturing process control parameters that influence panel growth have been identified. Alternative fastening approaches were examined, and it was found that dimensional growth can be controlled by changing the design of the dies used for forming the rivets.
Abstract:
With a significant increase in the number of digital cameras used for various purposes, there is a growing demand for advanced video analysis techniques that can systematically interpret and understand the semantics of video content recorded for security surveillance, intelligent transportation, health care, and video retrieval and summarization. Understanding and interpreting human behaviour through video analysis faces substantial challenges due to non-rigid human motion, self- and mutual occlusions, and changes in lighting conditions. To address these problems, advanced image and signal processing technologies such as neural networks, fuzzy logic, probabilistic estimation theory and statistical learning have been extensively investigated.
Abstract:
This paper addresses the pose recovery problem for a particular articulated object: the human body. In this model-based approach, the 2D shape is associated with the corresponding stick figure, allowing the joint segmentation and pose recovery of the subject observed in the scene. The main disadvantage of 2D models is their dependence on the viewpoint. To cope with this limitation, local spatio-temporal 2D models corresponding to many views of the same sequences are trained, concatenated and sorted in a global framework. Temporal and spatial constraints are then considered to build the probabilistic transition matrix (PTM), which gives a frame-to-frame estimate of the most probable local models to use during the fitting procedure, thus limiting the feature space. This approach takes advantage of 3D information while avoiding the use of a complex 3D human model. The experiments carried out on both indoor and outdoor sequences have demonstrated the ability of this approach to adequately segment pedestrians and estimate their poses independently of the direction of motion during the sequence. (c) 2008 Elsevier Ltd. All rights reserved.
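At its core, a probabilistic transition matrix of this kind is a first-order transition model over local-model labels. A compact sketch of how such a matrix could be estimated from labelled training sequences and used to shortlist the most probable next models (the labels and sequences below are made up):

```python
import numpy as np

def build_ptm(sequences, n_models):
    """Estimate a probabilistic transition matrix from sequences of local-model indices."""
    counts = np.ones((n_models, n_models))     # add-one smoothing keeps unseen transitions possible
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Hypothetical training sequences of local spatio-temporal model indices.
train = [[0, 0, 1, 1, 2, 2, 3],
         [0, 1, 1, 2, 3, 3, 0],
         [2, 2, 3, 3, 0, 0, 1]]
ptm = build_ptm(train, n_models=4)

current_model = 1
top = np.argsort(ptm[current_model])[::-1][:2]   # restrict fitting to the most probable next models
print("most probable next local models:", top, ptm[current_model][top])
```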
Abstract:
We present a Spatio-temporal 2D Models Framework (STMF) for 2D-Pose tracking. Space and time are discretized, and a mixture of probabilistic "local models" is learnt associating 2D Shapes with 2D Stick Figures. These spatio-temporal models generalize well for a particular viewpoint and state of the tracked action, but some spatio-temporal discontinuities can appear along a sequence as a direct consequence of the discretization. To overcome this problem, we propose to apply a Rao-Blackwellized Particle Filter (RBPF) in the 2D-Pose eigenspace, thus interpolating unseen data between view-based clusters. The fitness of the predicted 2D-Poses to the images is evaluated by combining our STMF with spatio-temporal constraints. A robust, fast and smooth human motion tracker is obtained by tracking only the few most important dimensions of the state space and by refining deterministically with our STMF.
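As a rough illustration of particle filtering in a low-dimensional pose eigenspace, the sketch below runs a plain bootstrap particle filter with hypothetical dynamics and likelihood; the paper's RBPF additionally marginalises part of the state analytically, which is not reproduced here.

```python
import numpy as np

# Bootstrap particle filter over a low-dimensional "pose eigenspace" state; dynamics
# and likelihood are hypothetical, and no Rao-Blackwellisation is performed here.
rng = np.random.default_rng(6)
d, n_particles, T = 3, 500, 40                  # eigenspace dimension, particles, frames
obs_noise = 0.2

true_x = np.zeros(d)
particles = rng.normal(0.0, 1.0, size=(n_particles, d))

for t in range(T):
    true_x = 0.95 * true_x + rng.normal(0.0, 0.1, d)          # hypothetical pose dynamics
    z = true_x + rng.normal(0.0, obs_noise, d)                 # stand-in image measurement

    particles = 0.95 * particles + rng.normal(0.0, 0.1, size=particles.shape)   # predict
    err = particles - z
    weights = np.exp(-0.5 * np.sum(err ** 2, axis=1) / obs_noise ** 2)          # image likelihood
    weights /= weights.sum()

    idx = rng.choice(n_particles, size=n_particles, p=weights)                  # resample
    particles = particles[idx]

print("final estimate:", np.round(particles.mean(axis=0), 2), "true:", np.round(true_x, 2))
```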