951 results for Algorithmic Probability
Abstract:
An effective prognostics program will provide ample lead time for maintenance engineers to schedule a repair and to acquire replacement components before catastrophic failures occur. This paper presents a technique for accurate assessment of the remnant life of machines based on a health state probability estimation technique. For a comparative study of the proposed model with the proportional hazard model (PHM), experimental bearing failure data from an accelerated bearing test rig were used. The results show that the proposed prognostic model based on health state probability estimation provides more accurate predictions than the commonly used PHM in the bearing failure case study.
Abstract:
Drink driving incidents in the Australian community continue to be a major road safety problem, contributing to a third of all fatalities. Drink driving prevalence remains high, with the rate of Australians who self-report drink driving remaining at 11%-12.1% [1,2]. Research in the area to date has focused on recidivist offenders, who have a higher probability of reoffending, while there is comparatively limited research regarding first-time offenders. An important and understudied area relates to the characteristics of first offenders and predictors of recidivism. This study examined the findings of in-depth focussed interviews with a sample of 20 individual first-time drink driving offenders in Queensland, recruited at the time of court mention.
Abstract:
The use of expert knowledge to quantify a Bayesian Network (BN) is necessary when data are not available. This, however, raises questions regarding how opinions from multiple experts can be used in a BN. Linear pooling is a popular method for combining probability assessments from multiple experts. In particular, Prior Linear Pooling (PrLP), which pools the opinions and then places them into the BN, is a common method. This paper firstly proposes an alternative pooling method, Posterior Linear Pooling (PoLP), which constructs a BN for each expert and then pools the resulting probabilities at the nodes of interest. Secondly, it investigates the advantages and disadvantages of using these pooling methods to combine the opinions of multiple experts. Finally, the methods are applied to an existing BN, the Wayfinding Bayesian Network Model, to investigate the behaviour of different groups of people and how these different methods may be able to capture such differences. The paper focusses on six nodes (Human Factors, Environmental Factors, Wayfinding, Communication, Visual Elements of Communication, and Navigation Pathway) and three subgroups, Gender (female, male), Travel Experience (experienced, inexperienced), and Travel Purpose (business, personal), and finds that different behaviours can indeed be captured by the different methods.
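As a concrete illustration of the difference between the two schemes, here is a minimal sketch assuming a toy two-node network (Cause -> Effect), two equally weighted experts, and made-up probabilities; none of these values come from the paper. PrLP pools the experts' inputs and runs the BN once, while PoLP runs one BN per expert and pools the outputs.

```python
import numpy as np

weights = np.array([0.5, 0.5])                # one weight per expert (assumed equal)

# Each expert supplies P(Cause) and the CPT [P(Effect|Cause), P(Effect|not Cause)].
p_cause = np.array([0.30, 0.50])              # expert 1, expert 2
cpts = np.array([[0.90, 0.20],                # expert 1
                 [0.70, 0.40]])               # expert 2

def p_effect(pc, pe_c, pe_nc):
    """Marginal P(Effect) in the two-node BN."""
    return pc * pe_c + (1 - pc) * pe_nc

# Prior Linear Pooling (PrLP): pool the inputs first, then run the BN once.
pooled_pc = weights @ p_cause
pooled_cpt = weights @ cpts
prlp = p_effect(pooled_pc, pooled_cpt[0], pooled_cpt[1])

# Posterior Linear Pooling (PoLP): run one BN per expert, then pool the outputs.
polp = weights @ np.array([p_effect(pc, cpt[0], cpt[1])
                           for pc, cpt in zip(p_cause, cpts)])

print(f"PrLP P(Effect) = {prlp:.3f}, PoLP P(Effect) = {polp:.3f}")  # 0.500 vs 0.480
```

Even in this toy case the two schemes disagree, which is why the choice between them matters for the subgroup comparisons described above.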
Abstract:
An increasing number of studies analyze the relationship between natural disaster damage and income levels, but they do not consider the distinction between public and private disaster mitigation. This paper empirically distinguishes these two types of mitigation using Japanese prefectural panel data from 1975 to 2007. Our results show that public mitigation, rather than private mitigation, has contributed to reducing the total damage resulting from natural disasters. Our estimation of cost-benefit ratios for each prefecture confirms that the mitigation efforts of urban prefectures are less effective than those of rural prefectures for both large and frequent/small disasters. Hence, urban prefectures need to reassess their public mitigation measures. Furthermore, to lessen the damage resulting from extreme catastrophes, policy makers need to invest in improved mitigation infrastructure when faced with a high probability of disasters.
Abstract:
A new transdimensional Sequential Monte Carlo (SMC) algorithm called SMCVB is proposed. In an SMC approach, a weighted sample of particles is generated from a sequence of probability distributions which ‘converge’ to the target distribution of interest, in this case a Bayesian posterior distribution. The approach is based on the use of variational Bayes to propose new particles at each iteration of the SMCVB algorithm in order to target the posterior more efficiently. The variational-Bayes-generated proposals are not limited to a fixed dimension. This means that the weighted particle sets that arise can have varying dimensions, thereby also allowing an appropriate dimension for the model to be estimated. This novel algorithm is outlined within the context of finite mixture model estimation. It provides a less computationally demanding alternative to using reversible jump Markov chain Monte Carlo kernels within an SMC approach. We illustrate these ideas in a simulated data analysis and in applications.
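For readers unfamiliar with the machinery this abstract assumes, the following is a minimal tempered resample-move SMC sampler for a toy one-dimensional Gaussian model. A random-walk Metropolis-Hastings move stands in for the variational-Bayes proposal of SMCVB, and the dimension is fixed, so this sketches only the generic weight/resample/move cycle, not the transdimensional algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=50)           # synthetic observations

def log_lik(theta):                            # theta: array of particles
    return -0.5 * np.sum((data[None, :] - theta[:, None]) ** 2, axis=1)

def log_prior(theta):                          # N(0, 10) prior
    return -0.5 * theta ** 2 / 10.0

n = 1000
temps = np.linspace(0.0, 1.0, 11)              # tempering schedule: prior -> posterior
theta = rng.normal(0.0, np.sqrt(10.0), n)      # particles drawn from the prior
logw = np.zeros(n)

for g_prev, g in zip(temps[:-1], temps[1:]):
    logw += (g - g_prev) * log_lik(theta)      # incremental importance weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    theta = theta[rng.choice(n, size=n, p=w)]  # multinomial resampling
    logw = np.zeros(n)
    # Move step: one MH update targeting the tempered posterior
    # (in SMCVB this is where the variational-Bayes proposal would act).
    prop = rng.normal(theta, 0.3)
    log_acc = (g * log_lik(prop) + log_prior(prop)
               - g * log_lik(theta) - log_prior(theta))
    theta = np.where(np.log(rng.uniform(size=n)) < log_acc, prop, theta)

print("posterior mean estimate:", theta.mean())
```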
Abstract:
The rise of the peer economy poses complex new regulatory challenges for policy-makers. The peer economy, typified by services like Uber and AirBnB, promises substantial productivity gains through the more efficient use of existing resources and a marked reduction in regulatory overheads. These services are rapidly disrupting existing established markets, but the regulatory trade-offs they present are difficult to evaluate. In this paper, we examine the peer economy through the lens of ride-sharing and the ongoing struggle over regulatory legitimacy between the taxi industry and new entrants Uber and Lyft. We first sketch the outlines of ride-sharing as a complex regulatory problem, showing how questions of efficiency are necessarily bound up in questions about levels of service, controls over pricing, and different approaches to setting, upholding, and enforcing standards. We outline the need for data-driven policy to understand the way that algorithmic systems work and what effects these might have in the medium to long term on measures of service quality, safety, labour relations, and equality. Finally, we discuss how the competition for legitimacy is not primarily being fought on utilitarian grounds, but is instead carried out within the context of a heated ideological battle between different conceptions of the role of the state and private firms as regulators. We ultimately argue that the key to understanding these regulatory challenges is to develop better conceptual models of the governance of complex systems by private actors and of the methods available to the state for influencing their actions. These struggles are not, as is often thought, struggles between regulated and unregulated systems; rather, they turn on the important regulatory work carried out by powerful, centralised private firms – both the incumbents of existing markets and the disruptive network operators in the peer economy.
Abstract:
Conceptual combination performs a fundamental role in creating the broad range of compound phrases utilised in everyday language. While the systematicity and productivity of language provide a strong argument in favour of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered as a function of the semantics of the constituent concepts, or not. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. Compositionality is equated with a joint probability distribution modelling how the constituent concepts in the combination are interpreted. Marginal selectivity is emphasised as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities (referred to collectively as Bell-type). Non-compositionality is then equated with either a failure of marginal selectivity, or, in the presence of marginal selectivity, with a violation of Bell-type inequalities. In both non-compositional scenarios, the conceptual combination cannot be modelled using a joint probability distribution with variables corresponding to the interpretation of the individual concepts. The framework is demonstrated by applying it to an empirical scenario of twenty-four non-lexicalised conceptual combinations.
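For reference, the CHSH member of the Bell-type family invoked here can be stated compactly. With A_1, A_2 denoting ±1-valued interpretations of the first constituent concept under two contexts, and B_1, B_2 those of the second (our notation, not the paper's), a joint probability distribution over all four variables can exist only if

```latex
\left| E(A_1 B_1) + E(A_1 B_2) + E(A_2 B_1) - E(A_2 B_2) \right| \le 2
```

Given marginal selectivity, a violation of this inequality (or any of its sign permutations) rules out a joint distribution over the interpretation variables, which is exactly the non-compositional case described above.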
Abstract:
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
Abstract:
The provision of autonomy-supportive environments that promote physical activity engagement has become popular in contemporary youth settings. However, questions remain about whether adolescents' perceptions of their autonomy have implications for physical activity. The purpose of this investigation was to examine the association between adolescents' self-reported physical activity and their perceived autonomy. Participants (n = 384 adolescents) aged between 12 and 15 years were recruited from six secondary schools in metropolitan Brisbane, Australia. Self-reported measures of physical activity and autonomy were obtained. Logistic regression with inverse probability weights was used to examine the association between autonomy and the odds of meeting youth physical activity guidelines. Autonomy (OR 0.61, 95% CI 0.49-0.76) and gender (OR 0.62, 95% CI 0.46-0.83) were negatively associated with meeting physical activity guidelines; equivalently, for every one-unit decrease in autonomy (on an index from 1 to 5), participants were 1.64 times more likely to meet the guidelines. However, the model explained only a small amount of the variation in whether youth in this sample met physical activity guidelines (R2 = 0.023). The findings, which are at odds with several previous studies, suggest that interventions designed to facilitate youth physical activity should limit opportunities for youth to make independent decisions about their engagement. However, the small amount of variation explained by the predictors in the model is a caveat and should be considered before applying such suggestions in practical settings. Future research should examine a larger age range, use longitudinal observational or intervention designs to test assertions of causality, and measure physical activity objectively.
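A hedged sketch of the analysis strategy named above, on synthetic data: a propensity model yields inverse probability weights, which then enter a weighted logistic regression. The variable names, the dichotomised exposure, and all coefficients are illustrative assumptions, not the paper's data or model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 384

# Synthetic confounders, exposure, and outcome (all coefficients made up).
age = rng.integers(12, 16, n).astype(float)
gender = rng.integers(0, 2, n).astype(float)
p_exp = 1 / (1 + np.exp(-(-0.5 + 0.1 * (age - 13) + 0.3 * gender)))
high_autonomy = rng.binomial(1, p_exp)                 # dichotomised exposure
p_out = 1 / (1 + np.exp(-(0.2 - 0.5 * high_autonomy - 0.4 * gender)))
meets_guidelines = rng.binomial(1, p_out)

# Step 1: propensity model for the exposure, giving inverse probability weights.
X_ps = sm.add_constant(np.column_stack([age, gender]))
ps = sm.Logit(high_autonomy, X_ps).fit(disp=0).predict(X_ps)
ipw = high_autonomy / ps + (1 - high_autonomy) / (1 - ps)

# Step 2: weighted logistic regression of the outcome on the exposure.
X = sm.add_constant(high_autonomy.astype(float))
fit = sm.GLM(meets_guidelines, X, family=sm.families.Binomial(),
             freq_weights=ipw).fit()
print("odds ratio for high autonomy:", np.exp(fit.params[1]))
```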
Abstract:
We propose a new information-theoretic metric, the symmetric Kullback-Leibler divergence (sKL-divergence), to measure the difference between two water diffusivity profiles in high angular resolution diffusion imaging (HARDI). Water diffusivity profiles are modeled as probability density functions on the unit sphere, and the sKL-divergence is computed from a spherical harmonic series, which greatly reduces computational complexity. Adjustment of the orientation of diffusivity functions is essential when the image is being warped, so we propose a fast algorithm to determine the principal direction of diffusivity functions using principal component analysis (PCA). We compare sKL-divergence with other inner-product based cost functions using synthetic samples and real HARDI data, and show that the sKL-divergence is highly sensitive in detecting small differences between two diffusivity profiles and therefore shows promise for applications in the nonlinear registration and multisubject statistical analysis of HARDI data.
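The sKL-divergence named here is the standard symmetrised Kullback-Leibler divergence; for two diffusivity profiles modelled as densities p and q on the unit sphere S^2 it reads

```latex
\mathrm{sKL}(p, q) = \mathrm{KL}(p \,\|\, q) + \mathrm{KL}(q \,\|\, p)
                   = \int_{S^2} \bigl( p(u) - q(u) \bigr) \log \frac{p(u)}{q(u)} \, du
```

Unlike the raw KL-divergence, this form is symmetric in its arguments, which is what makes it usable as a registration cost; it is the same quantity called the J-divergence in the following abstract.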
Abstract:
We apply an information-theoretic cost metric, the symmetrized Kullback-Leibler (sKL) divergence, or J-divergence, to fluid registration of diffusion tensor images. The difference between diffusion tensors is quantified based on the sKL-divergence of their associated probability density functions (PDFs). Three-dimensional DTI data from 34 subjects were fluidly registered to an optimized target image. To allow large image deformations but preserve image topology, we regularized the flow with a large-deformation diffeomorphic mapping based on the kinematics of a Navier-Stokes fluid. A driving force was developed to minimize the J-divergence between the deforming source and target diffusion functions, while reorienting the flowing tensors to preserve fiber topography. In initial experiments, we showed that the sKL-divergence based on full diffusion PDFs is adaptable to higher-order diffusion models, such as high angular resolution diffusion imaging (HARDI). The sKL-divergence was sensitive to subtle differences between two diffusivity profiles, showing promise for nonlinear registration applications and multisubject statistical analysis of HARDI data.
Abstract:
In the present study we utilised functional magnetic resonance imaging (fMRI) to examine cerebral activation during performance of a classic motor task in which response suppression load was parametrically varied. Linear increases in activity were observed in a distributed network of regions across both cerebral hemispheres, although with more extensive involvement of the right prefrontal cortex. Activated regions included prefrontal, parietal and occipitotemporal cortices. Decreasing activation was similarly observed in a distributed network of regions. These response forms are discussed in terms of an increasing requirement for visual cue discrimination and suppression/selection of motor responses, and a decreasing probability of the occurrence of non-target stimuli and attenuation of a prepotent tendency to respond. The results support recent proposals for a dominant role for the right-hemisphere in performance of motor response suppression tasks that emphasise the importance of the right prefrontal cortex.
Abstract:
Diffusion weighted magnetic resonance (MR) imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of 6 directions, second-order tensors can be computed to model dominant diffusion processes. However, conventional diffusion tensor imaging (DTI) is not sufficient to resolve crossing fiber tracts. Recently, a number of high-angular resolution schemes with greater than 6 gradient directions have been employed to address this issue. In this paper, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric positive definite matrices. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF, which is estimated from the measured diffusion signal. Once this optimal TDF is determined, the diffusion orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function.
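A hedged sketch of the equations this construction implies, writing D for a tensor and 𝒟 for the space of symmetric positive definite matrices (the notation is assumed, not taken from the paper): the measured signal along gradient direction g at b-value b is a TDF-weighted mixture of single-tensor Gaussian signals, and radially integrating the resulting Gaussian displacement profiles gives the ODF along unit direction u up to normalisation.

```latex
S(g, b) = S_0 \int_{\mathcal{D}} P(D)\, e^{-b\, g^{\top} D\, g} \, dD,
\qquad
\mathrm{ODF}(u) \propto \int_{\mathcal{D}} \frac{P(D)}{\sqrt{\det D}\, \bigl( u^{\top} D^{-1} u \bigr)^{3/2}} \, dD
```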
Abstract:
High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the trade-off between longer scanning times and greater angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs), which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.