180 results for Distorted probabilities
Abstract:
Accurate reliability prediction for large-scale, long-lived engineering assets is a crucial foundation for effective asset risk management and optimal maintenance decision making. However, a lack of failure data for assets that fail infrequently, and changing operational conditions over long periods of time, make accurate reliability prediction for such assets very challenging. To address this issue, we present a Bayesian-Markov approach to reliability prediction using prior knowledge and condition monitoring data. In this approach, Bayesian theory is used to incorporate prior information about failure probabilities and current information about asset health to make statistical inferences, while Markov chains are used to update and predict the health of assets based on condition monitoring data. The prior information can be supplied by domain experts, extracted from previous comparable cases or derived from basic engineering principles. Our approach differs from existing hybrid Bayesian models, which are normally used to update the parameter estimation of a given distribution, such as the Weibull-Bayesian distribution, or the transition probabilities of a Markov chain. Instead, our new approach can be used to update predictions of failure probabilities when failure data are sparse or nonexistent, as is often the case for large-scale, long-lived engineering assets.
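A minimal sketch of the general Bayesian-Markov idea described in this abstract (not the authors' implementation): a discrete health-state Markov chain propagates asset condition over time, and Bayes' rule combines a prior belief with the likelihood of a condition-monitoring reading. The state names, transition matrix, prior and sensor likelihoods below are illustrative assumptions.

```python
import numpy as np

states = ["good", "degraded", "failed"]        # hypothetical health states
P = np.array([[0.95, 0.04, 0.01],              # assumed per-period transition probabilities
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])

def predict(belief, steps):
    """Propagate the health-state belief forward with the Markov chain."""
    for _ in range(steps):
        belief = belief @ P
    return belief

def update(belief, likelihood):
    """Bayes' rule: combine the prior belief with the likelihood of an observation."""
    posterior = belief * likelihood
    return posterior / posterior.sum()

prior = np.array([0.8, 0.15, 0.05])            # expert-elicited prior belief over states
obs_likelihood = np.array([0.2, 0.7, 0.1])     # assumed sensor likelihood given each state
belief = update(prior, obs_likelihood)         # incorporate condition-monitoring data
print("P(failed) in 12 periods:", predict(belief, 12)[2])
```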
Abstract:
Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination, but the methodology is applicable if one has a different design objective such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model, and the algorithm relies on a convenient estimator of the evidence of each model which is essentially a function of importance sampling weights. Other methods for this task, such as quadrature, often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem-specific tuning. We demonstrate the methodology on three applications, including discriminating between models for decline in motor neuron numbers in patients suffering from neurological diseases such as Motor Neuron disease.
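A simplified sketch of the evidence-from-weights idea mentioned above (not the authors' code; resampling and move steps are omitted): for each candidate model, the running product of weighted average likelihoods estimates the evidence p(y_1:t | model), and posterior model probabilities follow by normalisation. The two Poisson models and their priors are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def smc_evidence(log_lik, thetas, data):
    """Log-evidence estimate built from importance-sampling weights."""
    log_w = np.zeros(len(thetas))          # equal initial weights (particles drawn from the prior)
    log_z = 0.0
    for y in data:
        incr = log_lik(thetas, y)          # log-likelihood of the new datum for each particle
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        log_z += np.log(np.sum(w * np.exp(incr)))   # evidence increment Z_t / Z_{t-1}
        log_w += incr                      # reweight particles
    return log_z

# Two hypothetical Poisson models differing only in their rate priors
data = rng.poisson(3.0, size=20)
models = {
    "low_rate":  rng.gamma(2.0, 1.0, size=5000),
    "high_rate": rng.gamma(8.0, 1.0, size=5000),
}
log_lik = lambda lam, y: y * np.log(lam) - lam   # Poisson log-likelihood up to a shared constant
log_z = {m: smc_evidence(log_lik, th, data) for m, th in models.items()}
z = np.exp(np.array(list(log_z.values())) - max(log_z.values()))
print(dict(zip(log_z, z / z.sum())))       # approximate posterior model probabilities
```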
Abstract:
If contemporary artworks are often considered to be puzzles or riddles, then Wilkins Hill take this to a new level. Their recent exhibition Windows impersonating other windows confronts viewers with an extremely ludic configuration: a spa bath full of almonds, towel racks placed before photos of Martin Heidegger distorted in neat grids, a video of a water tower in a Hamburg park, wooden cut-out speech bubbles and monitors that continuously play interviews with the artists themselves. What does it all mean?
Abstract:
Long-term changes in the genetic composition of a population occur by the fixation of new mutations, a process known as substitution. The rate at which mutations arise in a population and the rate at which they are fixed are expected to be equal under neutral conditions (Kimura, 1968). Between the appearance of a new mutation and its eventual fate of fixation or loss, there will be a period in which it exists as a transient polymorphism in the population (Kimura and Ohta, 1971). If the majority of mutations are deleterious (and nonlethal), the fixation probabilities of these transient polymorphisms are reduced and the mutation rate will exceed the substitution rate (Kimura, 1983). Consequently, different apparent rates may be observed on different time scales of the molecular evolutionary process (Penny, 2005; Penny and Holmes, 2001). The substitution rate of the mitochondrial protein-coding genes of birds and mammals has been traditionally recognized to be about 0.01 substitutions/site/million years (Myr) (Brown et al., 1979; Ho, 2007; Irwin et al., 1991; Shields and Wilson, 1987), with the noncoding D-loop evolving several times more quickly (e.g., Pesole et al., 1992; Quinn, 1992). Over the past decade, there has been mounting evidence that instantaneous mutation rates substantially exceed substitution rates, in a range of organisms (e.g., Denver et al., 2000; Howell et al., 2003; Lambert et al., 2002; Mao et al., 2006; Mumm et al., 1997; Parsons et al., 1997; Santos et al., 2005). The immediate reaction to the first of these findings was that the polymorphisms generated by the elevated mutation rate are short-lived, perhaps extending back only a few hundred years (Gibbons, 1998; Macaulay et al., 1997). That is, purifying selection was thought to remove these polymorphisms very rapidly.
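The neutral expectation cited above follows from a short calculation; the sketch below restates it in standard notation for a diploid population of size N with per-site mutation rate mu.

```latex
% Neutral case (Kimura 1968): 2N\mu new mutations arise per site per generation,
% each fixing with probability 1/(2N), so the substitution rate k equals \mu.
k \;=\; 2N\mu \cdot \frac{1}{2N} \;=\; \mu .
% If most mutations are (non-lethally) deleterious, each fixes with probability
% u < 1/(2N), so k = 2N\mu\,u < \mu and the mutation rate exceeds the substitution rate.
```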
Abstract:
We report three developments toward resolving the challenge of the apparent basal polytomy of neoavian birds. First, we describe improved conditional down-weighting techniques to reduce noise relative to signal for deeper divergences and find increased agreement between data sets. Second, we present formulae for calculating the probabilities of finding predefined groupings in the optimal tree. Finally, we report a significant increase in data: nine new mitochondrial (mt) genomes (the dollarbird, New Zealand kingfisher, great potoo, Australian owlet-nightjar, white-tailed trogon, barn owl, a roadrunner [a ground cuckoo], New Zealand long-tailed cuckoo, and the peach-faced lovebird), which together provide data for each of the six main groups of Neoaves proposed by Cracraft (2001). We use his six main groups of modern birds as priors for evaluation of results. These include passerines, cuckoos, parrots, and three other groups termed “WoodKing” (woodpeckers/rollers/kingfishers), “SCA” (owls/potoos/owlet-nightjars/hummingbirds/swifts), and “Conglomerati.” In general, the support is highly significant, with just two exceptions: the owls move from the “SCA” group to the raptors, particularly the accipitrids (buzzards/eagles) and the osprey, and the shorebirds may be an independent group from the rest of the “Conglomerati”. Molecular dating of the mt genomes supports a major diversification of at least 12 neoavian lineages in the Late Cretaceous. Our results form a basis for further testing with both nuclear-coding sequences and rare genomic changes.
Abstract:
In Strong v Woolworths Ltd (t/as Big W) (2012) 285 ALR 420 the appellant was injured when she fell at a shopping centre outside the respondent’s premises. The appellant was disabled, having had her right leg amputated above the knee, and therefore walked with crutches. One of the crutches came into contact with a hot potato chip which was on the floor, causing the crutch to slip and the appellant to fall. The appellant sued in negligence, alleging that the respondent was in breach of its duty of care by failing to institute and maintain a cleaning system to detect spillages and foreign objects within its sidewalk sales area. The issue before the High Court was whether it could be established, on the balance of probabilities, when the hot chip had fallen onto the ground, so as to prove causation in fact...
Abstract:
The airport system is complex, and passenger dynamics within it are similarly complicated. Passenger behaviours outside the standard processes are regarded as more significant in terms of public hazard and service rate issues. In this paper, we devised an individual agent decision model to simulate stochastic passenger behaviour in an airport departure terminal. Bayesian networks are incorporated into the decision-making model to infer the probabilities that passengers choose to use any in-airport facilities. We aim to understand the dynamics of the discretionary activities of passengers.
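An illustrative sketch of the kind of inference such a decision model performs (the paper's actual network structure is not given here): a tiny discrete Bayesian network, evaluated by enumeration, yields the probability that a passenger uses a discretionary facility given the time remaining before departure. All variables and probabilities below are assumptions.

```python
from itertools import product

p_time = {"short": 0.4, "long": 0.6}                     # P(time budget)
p_hungry = {True: 0.3, False: 0.7}                       # P(hungry)
p_use = {  # P(use cafe | time budget, hungry) -- assumed conditional probability table
    ("short", True): 0.30, ("short", False): 0.05,
    ("long",  True): 0.80, ("long",  False): 0.20,
}

def p_use_cafe(time_budget=None):
    """Marginal or conditional probability of using the cafe, by enumeration."""
    num = den = 0.0
    for t, h in product(p_time, p_hungry):
        if time_budget is not None and t != time_budget:
            continue
        joint = p_time[t] * p_hungry[h]
        num += joint * p_use[(t, h)]
        den += joint
    return num / den

print("P(use cafe)             =", round(p_use_cafe(), 3))
print("P(use cafe | long wait) =", round(p_use_cafe("long"), 3))
```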
Abstract:
In the structure of the title compound, [Mg(H2O)2(C8H6FO3)2]n(0.4H2O)n, slightly distorted octahedral MgO6 complex units have crystallographic inversion symmetry, the coordination polyhedron comprising two trans-related water molecules and four carboxyl O-atom donors, two of which are bridging. Within the two-dimensional complex polymer, which is parallel to (100), the coordinating water molecules form intermolecular O---H...O hydrogen bonds with carboxylate and phenoxy O-atom acceptors, as well as with the partial-occupancy solvent water molecules.
Abstract:
The term “vagueness” describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno’s sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier’s (2011) theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and of its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
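As a hedged illustration of the general mechanism invoked here (the specific model in the paper may differ in detail), quantum probability relaxes the classical law of total probability by an interference term, which is what allows a borderline contradiction such as “x is tall and x is not tall” to receive a non-zero judged probability.

```latex
% Law of total probability with an interference term \delta (zero in classical
% probability); for incompatible questions \delta \neq 0, so borderline
% contradictions need not receive probability zero.
\Pr(C) \;=\; \Pr(A)\,\Pr(C \mid A) \;+\; \Pr(\neg A)\,\Pr(C \mid \neg A) \;+\; \delta,
\qquad \delta = 0 \ \text{(classical)}, \quad \delta \neq 0 \ \text{(quantum, incompatible questions)} .
```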
Abstract:
Reliable ambiguity resolution (AR) is essential to Real-Time Kinematic (RTK) positioning and its applications, since incorrect ambiguity fixing can lead to largely biased positioning solutions. A partial ambiguity fixing technique is developed to improve the reliability of AR, involving partial ambiguity decorrelation (PAD) and partial ambiguity resolution (PAR). The decorrelation transformation can substantially amplify biases in the phase measurements, and the purpose of PAD is to find the optimum trade-off between decorrelation and worst-case bias amplification. The concept of PAR refers to the case where only a subset of the ambiguities can be fixed correctly to their integers in the integer least-squares (ILS) estimation system at high success rates. As a result, RTK solutions can be derived from these integer-fixed phase measurements. This is meaningful provided that the number of reliably resolved phase measurements is sufficiently large for least-squares estimation of the RTK solutions as well. Considering the GPS constellation alone, partially fixed measurements are often insufficient for positioning. The AR reliability is usually characterised by the AR success rate. In this contribution, an AR validation decision matrix is first introduced to understand the impact of the success rate. Moreover, the AR risk probability is included in a more complete evaluation of AR reliability. We use 16 ambiguity variance-covariance matrices with different levels of success rate to analyse the relation between success rate and AR risk probability. Next, the paper examines how, during the PAD process, a bias in one measurement is propagated and amplified onto many others, leading to more than one wrong integer and affecting the success probability. Furthermore, the paper proposes a partial ambiguity fixing procedure with a predefined success rate criterion and a ratio-test in the ambiguity validation process. The procedure is tested with simulated Galileo constellation observations. Numerical results from our experiment clearly demonstrate that only when the computed success rate is very high can AR validation provide decisions about the correctness of AR that are close to the real situation, with both low AR risk and low false alarm probabilities. The results also indicate that the PAR procedure can automatically choose an adequate number of ambiguities to fix, at a given high success rate, from the multiple constellations instead of fixing all the ambiguities. This is a benefit that multiple GNSS constellations can offer.
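A minimal sketch of the standard bootstrapped success-rate bound commonly used to characterise AR reliability, together with a simple stand-in for a PAR-style subset selection rule; this is not the paper's full PAD/PAR procedure, and the ambiguity variance-covariance matrix below is an illustrative assumption.

```python
import numpy as np
from scipy.stats import norm

def bootstrapped_success_rate(Q):
    """P_s = prod_i [ 2*Phi(1/(2*sigma_{i|I})) - 1 ], with conditional standard
    deviations taken from the Cholesky factor of the ambiguity vc-matrix Q."""
    L = np.linalg.cholesky(Q)
    cond_std = np.diag(L)                  # sigma_{i | 1..i-1}
    return np.prod(2.0 * norm.cdf(1.0 / (2.0 * cond_std)) - 1.0)

def partial_subset(Q, target=0.999):
    """Drop the least precise ambiguities until the success-rate criterion is met."""
    idx = list(np.argsort(np.diag(Q)))     # most precise ambiguities first
    while idx and bootstrapped_success_rate(Q[np.ix_(idx, idx)]) < target:
        idx.pop()                          # remove the least precise remaining ambiguity
    return [int(i) for i in idx]

Q = np.diag([0.01, 0.02, 0.04, 0.9]) + 0.004   # assumed vc-matrix (cycles^2)
print("full-set success rate:", bootstrapped_success_rate(Q))
print("subset meeting 0.999 :", partial_subset(Q))
```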
Abstract:
Molecular dynamics simulations were carried out on single-chain models of linear low-density polyethylene in vacuum to study the effects of branch length, branch content, and branch distribution on the polymer’s crystalline structure at 300 K. The trans/gauche (t/g) ratios of the backbones of the modeled molecules were calculated and used to characterize their degree of crystallinity. The results show that the t/g ratio decreases with increasing branch content regardless of branch length and branch distribution, indicating that branch content is the key molecular parameter that controls the degree of crystallinity. Although the t/g ratios of models with the same branch content vary, these variations are of secondary importance. However, our data suggest that branch distribution (regular or random) has a significant effect on the degree of crystallinity for models containing 10 hexyl branches/1,000 backbone carbons. The fractions of branches that resided in the equilibrium crystalline structures of the models were also calculated. On average, 9.8% and 2.5% of the branches were found in the crystallites of the molecules with ethyl and hexyl branches, respectively, while 13C NMR experiments showed that the corresponding probabilities of branch inclusion for ethyl and hexyl branches are 10% and 6% [Hosoda et al., Polymer 1990, 31, 1999–2005]. However, the degree of branch inclusion appears to be insensitive to branch content and branch distribution.
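A minimal sketch of how a t/g ratio of the kind used above can be computed from a backbone conformation (not the simulation code): each consecutive backbone dihedral is classified as trans (|phi| > 120 deg) or gauche, and the ratio of the counts is reported. The coordinates below are a toy all-trans zig-zag chain.

```python
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Signed dihedral angle in degrees defined by four consecutive backbone atoms."""
    b1, b2, b3 = p1 - p0, p2 - p1, p3 - p2
    n1, n2 = np.cross(b1, b2), np.cross(b2, b3)
    m1 = np.cross(n1, b2 / np.linalg.norm(b2))
    return np.degrees(np.arctan2(np.dot(m1, n2), np.dot(n1, n2)))

def tg_ratio(coords):
    """trans/gauche ratio over all consecutive backbone dihedrals in one snapshot."""
    angles = [dihedral(*coords[i:i + 4]) for i in range(len(coords) - 3)]
    trans = sum(abs(a) > 120.0 for a in angles)
    gauche = len(angles) - trans
    return trans / max(gauche, 1)

# toy all-trans zig-zag chain: the ratio is large because no gauche states occur
coords = np.array([[i * 1.25, 0.45 * (i % 2), 0.0] for i in range(20)])
print("t/g ratio:", tg_ratio(coords))
```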
Abstract:
This paper proposes the use of Bayesian approaches with the cross likelihood ratio (CLR) as a criterion for speaker clustering within a speaker diarization system, using eigenvoice modeling techniques. The CLR has previously been shown to be an effective decision criterion for speaker clustering using Gaussian mixture models. Recently, eigenvoice modeling has become an increasingly popular technique, due to its ability to adequately represent a speaker based on sparse training data, as well as to provide an improved capture of differences in speaker characteristics. The integration of eigenvoice modeling into the CLR framework to capitalize on the advantages of both techniques has also been shown to be beneficial for the speaker clustering task. Building on that success, this paper proposes the use of Bayesian methods to compute the conditional probabilities when computing the CLR, thus effectively combining the eigenvoice-CLR framework with the advantages of a Bayesian approach to the diarization problem. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show improved clustering performance, with a 33.5% relative improvement in the overall Diarization Error Rate (DER) compared to the baseline system.
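A minimal sketch of the CLR merge decision that underlies this kind of clustering (not the eigenvoice/Bayesian system described above): Gaussian mixture models stand in for the speaker models and a universal background model (UBM), and two clusters are merged when their CLR exceeds a threshold. The data, model sizes and threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def clr(x_a, x_b, ubm):
    """CLR(a,b): how much better each cluster's data fits the other cluster's model
    than the UBM (average log-likelihood per frame)."""
    gmm_a = GaussianMixture(4, covariance_type="diag").fit(x_a)
    gmm_b = GaussianMixture(4, covariance_type="diag").fit(x_b)
    return (gmm_b.score(x_a) - ubm.score(x_a)) + (gmm_a.score(x_b) - ubm.score(x_b))

rng = np.random.default_rng(1)
features = rng.normal(size=(2000, 12))                           # stand-in acoustic features
ubm = GaussianMixture(8, covariance_type="diag").fit(features)   # universal background model

clusters = [features[i * 200:(i + 1) * 200] for i in range(4)]   # initial speech segments
score = clr(clusters[0], clusters[1], ubm)
print("CLR(0,1) =", round(score, 3), "-> merge" if score > 0.0 else "-> keep separate")
```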
Abstract:
PURPOSE: To explore the experience of couples who continued pregnancy following a diagnosis of serious or lethal fetal anomaly. STUDY DESIGN: Thirty-one male and female participants were recruited from a high-risk maternal–fetal medicine clinic in Washington State. Data were collected using in-depth interviews during pregnancy and after the birth of their baby. Transcribed interviews were thematically analyzed through the phenomenological lens of Merleau-Ponty. FINDINGS: Participants described how time became reconfigured and reconstituted as they tried to compress a lifetime of love for their future child into a limited period. Participants’ concepts of time became distorted and were related to their perceptual lived experience rather than the schedule-filled, regimented, linear clock time that governed the health professionals. CONCLUSION: Living in distorted time may be a mechanism parents use to cope with overwhelming and disorienting feelings when their unborn baby is diagnosed with a fetal anomaly.
Abstract:
The Cross-Entropy (CE) method is an efficient technique for the estimation of rare-event probabilities and for combinatorial optimization. This work presents a novel application of the CE method to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task. The only sensor used to accomplish this task was a forward-facing camera. The CE method is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was realized using the ROS-Gazebo simulation system. To evaluate the optimization, a large number of tests were carried out with a real quadcopter.
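A minimal sketch of the cross-entropy optimisation loop itself (the actual work tunes fuzzy-controller scaling factors inside a ROS-Gazebo simulation; here a toy cost function stands in for the simulated collision-avoidance performance): sample candidates from a Gaussian, keep the elite fraction, and refit the sampling distribution.

```python
import numpy as np

rng = np.random.default_rng(42)

def cost(scales):
    """Placeholder for the simulation-based performance metric (lower is better)."""
    return np.sum((scales - np.array([1.5, 0.7])) ** 2)

mu, sigma = np.array([1.0, 1.0]), np.array([1.0, 1.0])   # initial sampling distribution
n_samples, n_elite = 50, 10

for it in range(30):
    samples = rng.normal(mu, sigma, size=(n_samples, 2))          # candidate scaling factors
    elite = samples[np.argsort([cost(s) for s in samples])[:n_elite]]
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6      # refit on the elite set

print("near-optimal scaling factors:", np.round(mu, 3))
```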
Abstract:
This chapter provides an overview of the contribution of feminist criminologies to understandings of the complex intersections between sex, gender and crime. Dozens of scholars and activists have participated in these debates over the past four decades. For our contribution to this handbook, we interviewed ten distinguished scholars whose contributions are recognized internationally. Through the commentary provided by these scholars, this chapter examines some of the distinctive contributions of feminism to our knowledge about sex, gender, and crime, as well as some of the challenges it continues to face in the field of criminology. We conclude that feminist work within criminology continues to face a number of lingering challenges, most notably in relation to the struggle to maintain relevance in a world where concerns about gender inequality are marginalized and considered historical relics rather than contemporary issues; where there are ongoing tensions around the best strategies for change, as well as difficulties in challenging distorted representations of female crime and violence; and where a backlash of anti-feminist politics seeks to discredit explanations that draw a link between sex, gender, and crime. This chapter critically reviews these lingering challenges, locating feminist approaches (of which there are many) at the centre and not the periphery of advancing knowledge about gender, sex, and crime.