13 results for ignorance
Abstract:
In conditional probabilistic logic programming, given a query, the two most common forms for answering the query are either a probability interval or a precise probability obtained by using the maximum entropy principle. The former can be noninformative (e.g., the interval [0, 1]) and the reliability of the latter is questionable when the prior knowledge is imprecise. To address this problem, in this paper we propose methods to quantitatively measure whether a probability interval or a single probability is sufficient for answering a query. We first propose an approach to measuring the ignorance of a probabilistic logic program with respect to a query. The measure of ignorance (w.r.t. a query) reflects how reliable a precise probability for the query can be, and a high value of ignorance suggests that a single probability is not suitable for the query. We then propose a method to measure the probability that the exact probability of a query falls in a given interval, that is, a second-order probability. We call it the degree of satisfaction. If the degree of satisfaction is high enough w.r.t. the query, then the given interval can be accepted as the answer to the query. We also prove that our measures satisfy many desirable properties, and we use a case study to demonstrate their significance. © Springer Science+Business Media B.V. 2012
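The degree of satisfaction described in this abstract is a second-order probability: the probability that the exact query probability lies in a given interval. A minimal Monte Carlo sketch, assuming (purely for illustration, not the paper's construction) that the imprecise prior knowledge is modelled by a uniform sampler over a knowledge interval:

```python
import random

def degree_of_satisfaction(interval, prior_sample, n=100_000, seed=0):
    """Monte Carlo estimate of the second-order probability that the
    exact query probability falls inside `interval`, given a sampler
    `prior_sample` drawing candidate probabilities consistent with
    the (imprecise) prior knowledge."""
    rng = random.Random(seed)
    lo, hi = interval
    hits = sum(1 for _ in range(n) if lo <= prior_sample(rng) <= hi)
    return hits / n

# Hypothetical example: prior knowledge only constrains p to [0.2, 0.9],
# modelled here (an assumption, not the paper's method) as uniform.
sampler = lambda rng: rng.uniform(0.2, 0.9)
print(degree_of_satisfaction((0.4, 0.8), sampler))  # ≈ 0.4/0.7 ≈ 0.571
```

If the estimate is high enough, the interval [0.4, 0.8] would be accepted as the answer; a wide knowledge interval (high ignorance) drives the degree of satisfaction of any narrow interval down.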
Abstract:
The sexual health of people, particularly young people, in Northern Ireland is currently poor. Yet there has been little research conducted on sexual attitudes and lifestyles. This paper is based on data from the first ever major research project in this field in Northern Ireland. Using quantitative and qualitative methods, it targeted young people aged 14-25. A combination of a self-administered survey questionnaire, focus group discussions and one-to-one interviews was found to be most suitable for the collection of sensitive data on sexuality in a country where the social and moral climate had previously prevented studies of this nature. Information was collected on sexual attitudes and behaviour generally. This paper focuses on one crucial issue: the age of first sexual encounter. It explores the attitudes of young people to that experience and the use of contraception. Many of the findings match those of similar large-scale surveys in England and Wales, including the modal age of first sexual encounter and the influence of peer pressure on decision-making about first sex. There were significant gender differences in both behaviour and attitudes. It is hoped that the research results will influence future education and health policy, which has all too often been based on ignorance.
Abstract:
A key obstacle to the wide-scale development of renewable energy is that public acceptability of wind energy cannot be taken for granted when wind energy moves from abstract support to local implementation. Drawing on a case study of opposition to the siting of a proposed off-shore wind farm in Northern Ireland, we offer a rhetorical analysis of a series of representative documents drawn from government, media, pro- and anti-wind energy sources, which identifies and interprets a number of discourses of objection and support. The analysis indicates that the key issue in terms of the transition to a renewable energy economy has little to do with the technology itself. Understanding the different nuances of pro- and anti-wind energy discourses highlights the importance of thinking about new ways of looking at these conflicts. These include adopting a “conflict resolution” approach, “upstreaming” public involvement in the decision-making process, and avoiding the counter-productive strategy of assuming that objection is based on ignorance (which can be solved by information) or NIMBY thinking (which can be solved by moral arguments about overcoming “free riders”).
Abstract:
This paper is based on research undertaken in Ireland that sought to understand how parents communicate with their children about sexuality. Forty-three parents were interviewed and data were analysed using analytical induction. Data indicated that while parents tended to pride themselves on the culture of openness to sexuality that prevailed in their home, they often described situations where very little dialogue on the subject actually transpired. However, unlike previous research on the topic that identified parent-related factors (such as ignorance or embarrassment) as the main impediments to parent-young person communication about sex, participants in our study identified the central obstacle to be a reticence on the part of the young person to engage in such dialogue. Participants described various blocking techniques apparently used by the young people, including claims to have full prior knowledge on the issue, physically absenting themselves from the situation, becoming irritated or annoyed, or ridiculing parents' educational efforts. In our analysis, we consider our findings in light of the shifting power of children historically and the new cultural aspiration of maintaining harmonious and democratic relations with one's offspring.
Abstract:
This essay is intended as a self-reflective auto-critique of the ‘social accounting community’. The essay is directed at the academic community of accountants concerned with social accounting. This ‘community’ is predominantly concerned with English-language accounting journals and is preoccupied with the social and environmental practices of the larger private-sector organisations. The essay is motivated by a concern over our responsibilities as academics in a world in crisis and a concern that social accounting is losing its energy and revolutionary zeal. This community's social accounting endeavours have taken place in almost complete ignorance of the activities and developments in non-accounting communities and, in particular, developments in the public and third sectors. The essay reaches out to the public and third sector work and literature as an illustration of one of the ways in which ‘our’ social accounting can try to prevent itself from becoming moribund.
Abstract:
Although cartel behaviour is almost universally (and rightly) condemned, it is not clear why cartel participants deserve the full wrath of the criminal law and its associated punishment. To fill this void, I develop a normative (or principled) justification for the criminalisation of conduct characteristic of ‘hard core’ cartels. The paper opens with a brief consideration of the rhetoric commonly used to denounce cartel activity, e.g. that it ‘steals from’ or ‘robs’ consumers. To put the discussion in context, a brief definition of ‘hard core’ cartel behaviour is provided and the harms associated with this activity are identified. These are: welfare losses in the form of appropriation (from consumer to producer) of consumer surplus, the creation of deadweight loss to the economy, the creation of productive inefficiency (hindering innovation of both products and processes), and the creation of so-called X-inefficiency. As not all activities which cause harm ought to be criminalised, a theory as to why certain harms in a liberal society can be criminalised is developed. It is based on JS Mill's harm to others principle (as refined by Feinberg) and on a choice of social institutions using Rawls's ‘veil of ignorance’. The theory is centred on the value of individual choice in securing one's own well-being, with the market as an indispensable instrument for this. But as applied to the harm associated with cartel conduct, this theory shows that none of the earlier mentioned problems associated with this activity provide sufficient justification for criminalisation. However, as the harm from hard core cartel activity strikes at an important institution which permits an individual's ability to secure their own well-being in a liberal society, criminalisation of hard core cartel behaviour can have its normative justification on this basis.
Abstract:
We present TANC, a TAN classifier (tree-augmented naive) based on imprecise probabilities. TANC models prior near-ignorance via the Extreme Imprecise Dirichlet Model (EDM). A first contribution of this paper is the experimental comparison between EDM and the global Imprecise Dirichlet Model using the naive credal classifier (NCC), with the aim of showing that EDM is a sensible approximation of the global IDM. TANC is able to deal with missing data in a conservative manner by considering all possible completions (without assuming them to be missing-at-random), but avoiding an exponential increase of the computational time. By experiments on real data sets, we show that TANC is more reliable than the Bayesian TAN and that it provides better performance compared to previous TANs based on imprecise probabilities. Yet, TANC is sometimes outperformed by NCC because the learned TAN structures are too complex; this calls for novel algorithms for learning the TAN structures, better suited for an imprecise probability classifier.
Abstract:
In this paper we present TANC, i.e., a tree-augmented naive credal classifier based on imprecise probabilities; it models prior near-ignorance via the Extreme Imprecise Dirichlet Model (EDM) (Cano et al., 2007) and deals conservatively with missing data in the training set, without assuming them to be missing-at-random. The EDM is an approximation of the global Imprecise Dirichlet Model (IDM), which considerably simplifies the computation of upper and lower probabilities; yet, having been only recently introduced, the quality of the approximation it provides still needs to be verified. As a first contribution, we extensively compare the output of the naive credal classifier (one of the few cases in which the global IDM can be exactly implemented) when learned with the EDM and with the global IDM; the output of the classifier appears to be identical in the vast majority of cases, thus supporting the adoption of the EDM in real classification problems. Then, by experiments we show that TANC is more reliable than the precise TAN (learned with a uniform prior), and also that it provides better performance compared to a previous (Zaffalon, 2003) TAN model based on imprecise probabilities. TANC treats missing data by considering all possible completions of the training set, while avoiding an exponential increase of the computational time; finally, we present some preliminary results with missing data.
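The global IDM referred to in the two abstracts above has a simple closed form for the lower and upper posterior probabilities of a single category: with `count` observations of the category out of `total`, and prior strength `s` (conventionally s = 1 or s = 2), the posterior mean ranges over an interval as the Dirichlet prior varies. A minimal sketch of that standard IDM interval (not the EDM approximation itself):

```python
def idm_interval(count, total, s=2):
    """Lower/upper probability of a category under the Imprecise
    Dirichlet Model: all Dirichlet priors of total prior strength s
    are considered, so the posterior mean spans an interval."""
    return count / (total + s), (count + s) / (total + s)

# 3 successes in 10 trials vs. 30 in 100: the interval narrows
# towards the relative frequency as data accumulate.
print(idm_interval(3, 10))    # (0.25, 0.4166...)
print(idm_interval(30, 100))  # (0.2941..., 0.3137...)
```

Near-ignorance shows up as the width s/(total+s) of the interval, which is largest when little data has been seen.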
Abstract:
Security is a critical concern around the world. Since resources for security are always limited, much interest has arisen in using game theory to handle security resource allocation problems. However, most of the existing work does not adequately address how a defender chooses his optimal strategy in a game with absent, inaccurate, uncertain, and even ambiguous strategy profiles' payoffs. To address this issue, we propose a general framework of security games under ambiguities based on Dempster-Shafer theory and the ambiguity aversion principle of minimax regret. Then, we reveal some properties of this framework. Also, we present two methods to reduce the influence of complete ignorance. Our investigation shows that this new framework is better at handling security resource allocation problems under ambiguities.
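The Dempster-Shafer machinery underlying such a framework combines mass functions with Dempster's rule, renormalising away the mass assigned to contradictory (empty-intersection) focal elements. A minimal sketch, using a hypothetical two-hypothesis payoff frame (the frame and numbers are illustrative, not from the paper):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets. Conflict mass is renormalised away."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully contradict")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two assessments of whether a target is high ('h') or low ('l') value;
# mass on the whole frame {'h','l'} encodes ignorance.
m1 = {frozenset('h'): 0.6, frozenset('hl'): 0.4}
m2 = {frozenset('h'): 0.5, frozenset('l'): 0.3, frozenset('hl'): 0.2}
print(dempster_combine(m1, m2))
```

Reducing the influence of complete ignorance then amounts to shrinking the mass that ends up on the full frame before the defender's regret computation.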
Abstract:
We explore the challenges posed by the violation of Bell-like inequalities by d-dimensional systems exposed to imperfect state-preparation and measurement settings. We address, in particular, the limit of high-dimensional systems, naturally arising when exploring the quantum-to-classical transition. We show that, although suitable Bell inequalities can be violated, in principle, for any dimension of given subsystems, it is in practice increasingly challenging to detect such violations, even if the system is prepared in a maximally entangled state. We characterize the effects of random perturbations on the state or on the measurement settings, also quantifying the effort needed to certify the possible violations in the case of complete ignorance of the system state at hand.
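For d = 2 the canonical Bell-like inequality is CHSH. A minimal sketch, assuming the singlet-state correlation E(a, b) = -cos(a - b) and modelling imperfect measurement settings as random angle perturbations (the noise model is an illustrative assumption, not the paper's):

```python
import math
import random

def E(a, b):
    # Singlet-state correlation for measurement angles a, b.
    return -math.cos(a - b)

def chsh(a, ap, b, bp):
    # CHSH combination; classical bound 2, quantum bound 2*sqrt(2).
    return abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

# Optimal settings saturate the Tsirelson bound 2*sqrt(2) > 2.
opt = (0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(chsh(*opt))  # ≈ 2.828

# Random perturbation of the settings degrades the violation and can
# push the value below the classical bound 2.
rng = random.Random(1)
noisy = [x + rng.gauss(0, 0.2) for x in opt]
print(chsh(*noisy))
```

The abstract's point is visible already here: detecting the violation requires the measured value to stay above 2 despite such perturbations, and this margin shrinks with dimension.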
Abstract:
We present a robust Dirichlet process for estimating survival functions from samples with right-censored data. It adopts a prior near-ignorance approach to avoid almost any assumption about the distribution of the population lifetimes, as well as the need to elicit an infinite-dimensional parameter (in case of lack of prior information), as happens with the usual Dirichlet process prior. We show how such a model can be used to derive robust inferences from right-censored lifetime data. Robustness is due to the identification of the decisions that are prior-dependent, and can be interpreted as an analysis of sensitivity with respect to the hypothetical inclusion of fictitious new samples in the data. In particular, we derive a nonparametric estimator of the survival probability and a hypothesis test about the probability that the lifetime of an individual from one population is shorter than the lifetime of an individual from another. We evaluate these ideas on simulated data and on the Australian AIDS survival dataset. The methods are publicly available through an easy-to-use R package.
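As background for nonparametric survival estimation from right-censored data, the classical Kaplan-Meier product-limit estimator can be sketched in a few lines (this is the standard frequentist estimator, not the paper's robust Dirichlet process, which instead brackets such estimates between prior-near-ignorance bounds):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function from
    right-censored data: events[i] == 1 means the death was observed,
    0 means the lifetime was censored at times[i].
    Returns (time, S(time)) pairs at observed death times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = leaving = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            leaving += 1
            i += 1
        if deaths:  # censoring alone produces no step in the curve
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= leaving
    return curve

# Toy data: events 0 mark right-censored lifetimes.
times  = [1, 2, 2, 3, 5, 7]
events = [1, 1, 0, 1, 0, 1]
print(kaplan_meier(times, events))
```

The sensitivity analysis described in the abstract can be pictured as re-running such an estimator with fictitious extra samples placed adversarially and reporting the resulting band rather than a single curve.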
Abstract:
There has been much interest in the belief–desire–intention (BDI) agent-based model for developing scalable intelligent systems, e.g. using the AgentSpeak framework. However, reasoning from sensor information in these large-scale systems remains a significant challenge. For example, agents may be faced with information from heterogeneous sources which is uncertain and incomplete, while the sources themselves may be unreliable or conflicting. In order to derive meaningful conclusions, it is important that such information be correctly modelled and combined. In this paper, we choose to model uncertain sensor information in Dempster–Shafer (DS) theory. Unfortunately, as in other uncertainty theories, simple combination strategies in DS theory are often too restrictive (losing valuable information) or too permissive (resulting in ignorance). For this reason, we investigate how a context-dependent strategy originally defined for possibility theory can be adapted to DS theory. In particular, we use the notion of largely partially maximal consistent subsets (LPMCSes) to characterise the context for when to use Dempster’s original rule of combination and for when to resort to an alternative. To guide this process, we identify existing measures of similarity and conflict for finding LPMCSes along with quality of information heuristics to ensure that LPMCSes are formed around high-quality information. We then propose an intelligent sensor model for integrating this information into the AgentSpeak framework which is responsible for applying evidence propagation to construct compatible information, for performing context-dependent combination and for deriving beliefs for revising an agent’s belief base. Finally, we present a power grid scenario inspired by a real-world case study to demonstrate our work.
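The context-dependent combination strategy described above can be caricatured in a few lines: apply Dempster's conjunctive rule when the sources largely agree, and fall back to a cautious disjunctive rule when conflict is high. This is a simplified sketch (the fixed threshold and the disjunctive fallback are illustrative assumptions; the paper's LPMCS-based characterisation with similarity and quality-of-information heuristics is far richer):

```python
from itertools import product

def conflict(m1, m2):
    """Mass landing on empty intersections when conjunctively
    combining mass functions m1 and m2 (frozenset focal elements)."""
    return sum(wa * wb
               for (a, wa), (b, wb) in product(m1.items(), m2.items())
               if not (a & b))

def combine(m1, m2, threshold=0.5):
    """Context-dependent combination: Dempster's rule (intersection,
    renormalised) below the conflict threshold, cautious disjunctive
    rule (union, no normalisation needed) above it."""
    k = conflict(m1, m2)
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        key = (a & b) if k < threshold else (a | b)
        if key:
            out[key] = out.get(key, 0.0) + wa * wb
    if k < threshold:
        out = {s: w / (1.0 - k) for s, w in out.items()}
    return out

# Two highly conflicting sensor reports over the frame {'a', 'b'}:
m1 = {frozenset('a'): 0.9, frozenset('ab'): 0.1}
m2 = {frozenset('b'): 0.9, frozenset('ab'): 0.1}
print(conflict(m1, m2))  # 0.81 -> fall back to the disjunctive rule
print(combine(m1, m2))
```

Under high conflict the disjunctive fallback pushes mass onto the union {'a', 'b'}, i.e. it honestly reports ignorance instead of letting Dempster's renormalisation manufacture spurious certainty, which matches the "too restrictive vs. too permissive" trade-off the abstract describes.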