965 results for Uncertainty management


Relevance:

60.00%

Publisher:

Abstract:

This study examines the evolution of budgeting practices in the extremely difficult Spanish economic environment. To analyse whether companies are still maintaining their budgeting process and whether they now face greater difficulties in forecasting accurate indicators, two similar web surveys were administered at two points in time: first in 2008, at the beginning of the financial crisis, and again in 2013, after five years of a downward trend. In addition, in-depth interviews were conducted to investigate how companies brought more flexibility into their budgeting process in order to cope with environmental uncertainty. The survey indicates that 97% of respondents are still using a traditional budgeting process, a result similar to that found in 2008. However, the 2013 results show that reliance on forecasted information is increasingly being questioned. Furthermore, the study reveals that respondents are bringing more flexibility into their processes, being able to modify objectives once the budget is approved and to obtain new resources outside the budgeting process. This paper contributes by revealing the difficulties of setting reliable objectives in a turbulent environment and provides data on the evolution of budgeting practices over five years of a severe economic crisis.

Relevance:

60.00%

Publisher:

Abstract:

CCTV systems are broadly deployed in the present world. Despite this, their impact on anti-social and criminal behaviour has been minimal. Subject reacquisition is a fundamental task for ensuring in-time reaction in intelligent surveillance. However, traditional reacquisition based on face recognition is not scalable; hence, in this paper we use reasoning techniques that exploit time-of-flight information between zones of interest, such as airport security corridors, to reduce the computational effort. In addition, to improve the accuracy of reacquisition, we introduce the idea of revision as a post-processing method. We demonstrate the significance and usefulness of our framework with an experiment that shows much less computational effort and better accuracy.
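A minimal sketch of the kind of time-of-flight pruning described above; the zone names, travel-time bounds and the candidate filter are illustrative assumptions rather than the paper's actual implementation:

    from dataclasses import dataclass

    # Hypothetical travel-time bounds (seconds) between zones of interest,
    # e.g. consecutive airport security corridors; the values are invented.
    TRAVEL_TIME = {("zone_A", "zone_B"): (30, 180)}

    @dataclass
    class Sighting:
        subject_id: str
        zone: str
        timestamp: float  # seconds since the start of observation

    def candidate_subjects(sightings, prev_zone, query_zone, query_time):
        """Keep only subjects whose earlier sighting in prev_zone is consistent
        with the expected time of flight to query_zone, so that expensive face
        matching only runs on this reduced candidate set."""
        lo, hi = TRAVEL_TIME[(prev_zone, query_zone)]
        return [s for s in sightings
                if s.zone == prev_zone and lo <= query_time - s.timestamp <= hi]

    # A detection in zone_B at t=200s only needs to be matched against
    # subjects seen in zone_A between t=20s and t=170s.
    history = [Sighting("p1", "zone_A", 40.0),
               Sighting("p2", "zone_A", 190.0),   # too recent to have reached zone_B
               Sighting("p3", "zone_C", 60.0)]    # wrong zone
    print(candidate_subjects(history, "zone_A", "zone_B", 200.0))  # only p1 remains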

Relevance:

60.00%

Publisher:

Abstract:

Belief merging operators combine multiple belief bases (a profile) into a collective one. When the conjunction of the belief bases is consistent, all operators agree on the result. However, if the conjunction is inconsistent, the results vary from one operator to another, and there is no formal way to compare these results and decide which operator to select. Therefore, in this paper we propose to evaluate the results of merging operators by using three ordering relations (fairness, satisfaction and strength) over operators for a given profile. Moreover, a relation of conformity over operators is introduced in order to classify how well an operator conforms to the definition of a merging operator. Using the four proposed relations, we compare some classical merging operators and evaluate the results for some specific profiles.
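As a hedged illustration of why merging operators disagree on inconsistent profiles, the sketch below compares two classical distance-based operators (Sigma and Max aggregation of Hamming distances) on a tiny invented propositional profile; it is not taken from the paper:

    from itertools import product

    ATOMS = ["p", "q"]

    def worlds():
        return [dict(zip(ATOMS, bits)) for bits in product([False, True], repeat=len(ATOMS))]

    def models(formula):
        """Models of a base given as a Python predicate over an interpretation."""
        return [w for w in worlds() if formula(w)]

    def hamming(w1, w2):
        return sum(w1[a] != w2[a] for a in ATOMS)

    def merge(profile, aggregate):
        """Distance-based merging: keep the interpretations minimising the
        aggregated Hamming distance to the bases of the profile."""
        base_models = [models(b) for b in profile]
        def score(w):
            return aggregate(min(hamming(w, m) for m in bm) for bm in base_models)
        best = min(score(w) for w in worlds())
        return [w for w in worlds() if score(w) == best]

    # Invented inconsistent profile: K1 = p & q, K2 = p & q, K3 = ~p & ~q.
    profile = [lambda w: w["p"] and w["q"],
               lambda w: w["p"] and w["q"],
               lambda w: not w["p"] and not w["q"]]

    print("Sigma:", merge(profile, sum))  # sides with the majority: {p, q}
    print("Max:  ", merge(profile, max))  # egalitarian compromise worlds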

Relevance:

60.00%

Publisher:

Abstract:

Knowledge is an important component in many intelligent systems. Since items of knowledge in a knowledge base can be conflicting, especially if multiple sources contribute to the knowledge in the base, significant research efforts have been devoted to developing inconsistency measures for knowledge bases and to developing merging approaches. Most of these efforts start with flat knowledge bases. However, in many real-world applications, items of knowledge are not perceived as equally important; rather, weights (which can indicate importance or priority) are associated with items of knowledge. Therefore, measuring the inconsistency of a knowledge base with weighted formulae, as well as merging such bases, is an important but difficult task. In this paper, we derive a numerical characteristic function from each knowledge base with weighted formulae, based on the Dempster-Shafer theory of evidence. Using these functions, we are able to measure the inconsistency of a knowledge base in a convenient and rational way, and to merge multiple knowledge bases with weighted formulae, even if the knowledge in these bases is inconsistent. Furthermore, by examining whether multiple knowledge bases are dependent or independent, they can be combined in different ways using their characteristic functions, which cannot be handled (or at least has never been considered) in classic knowledge-base merging approaches in the literature.
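The abstract relies on Dempster-Shafer theory; as background, the sketch below implements only the standard normalised Dempster rule of combination for two mass functions over a small frame. It makes no claim about how the paper actually derives its characteristic functions from weighted formulae:

    from itertools import chain, combinations

    def powerset(frame):
        return [frozenset(s) for s in chain.from_iterable(
            combinations(frame, r) for r in range(len(frame) + 1))]

    def dempster_combine(m1, m2, frame):
        """Normalised Dempster's rule: multiply masses of intersecting focal
        elements and renormalise to exclude the mass assigned to conflict."""
        combined = {s: 0.0 for s in powerset(frame)}
        conflict = 0.0
        for a, ma in m1.items():
            for b, mb in m2.items():
                if a & b:
                    combined[a & b] += ma * mb
                else:
                    conflict += ma * mb
        if conflict >= 1.0:
            raise ValueError("totally conflicting sources; combination undefined")
        return {s: v / (1.0 - conflict) for s, v in combined.items() if v > 0}

    # Illustrative mass functions over the two-element frame {a, b}.
    frame = {"a", "b"}
    m1 = {frozenset({"a"}): 0.6, frozenset(frame): 0.4}
    m2 = {frozenset({"b"}): 0.3, frozenset(frame): 0.7}
    print(dempster_combine(m1, m2, frame))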

Relevance:

60.00%

Publisher:

Abstract:

Gender profiling is a fundamental task that helps CCTV systems provide a better service for intelligent surveillance. Since subjects detected by CCTV are not always cooperative, a few profiling algorithms have been proposed to deal with situations where subjects' faces are not available, the most common approach being to analyse the subjects' body shape information. In addition, normal profiling algorithms have some drawbacks in real applications. First, the profiling result is always uncertain. Second, for a gender profiling algorithm running over time, the result is not stable: the degree of certainty usually varies, sometimes even to the extent that a male is classified as a female, and vice versa. These facets were studied in a recent paper [16] using Dempster-Shafer theory; in particular, Denoeux's cautious rule was applied to fuse mass functions along the timeline. However, this paper points out that if a severe misclassification happens at the beginning of the timeline, the result of applying Denoeux's rule can be disastrous. To remedy this weakness, we propose two generalizations of the DS approach of [16] that incorporate a time-window and time-attenuation, respectively, when applying Denoeux's rule along the timeline, and for which the DS approach is a special case. Experiments show that these two generalizations do provide better results than their predecessor when misclassifications happen.
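A hedged illustration of the time-attenuation idea: older evidence is discounted before fusion so that an early misclassification fades away. The paper applies Denoeux's cautious rule; here classical Shafer discounting followed by Dempster's rule is used as a simpler stand-in, and the mass values are invented:

    FRAME = frozenset({"male", "female"})

    def discount(m, alpha):
        """Shafer discounting: keep a fraction alpha of each mass and move
        the remainder to the whole frame (total ignorance)."""
        out = {FRAME: m.get(FRAME, 0.0) * alpha + (1.0 - alpha)}
        for focal, value in m.items():
            if focal != FRAME:
                out[focal] = value * alpha
        return out

    def combine(m1, m2):
        """Normalised Dempster's rule over the two-class frame."""
        result, conflict = {}, 0.0
        for a, va in m1.items():
            for b, vb in m2.items():
                if a & b:
                    result[a & b] = result.get(a & b, 0.0) + va * vb
                else:
                    conflict += va * vb
        return {k: v / (1.0 - conflict) for k, v in result.items()}

    def fuse_with_attenuation(observations, decay=0.8):
        """observations: mass functions, oldest first; each one is discounted
        by decay**age so that an early misclassification loses influence."""
        fused = {FRAME: 1.0}  # start from total ignorance
        for age, m in enumerate(reversed(observations)):
            fused = combine(fused, discount(m, decay ** age))
        return fused

    male, female = frozenset({"male"}), frozenset({"female"})
    obs = [{female: 0.9, FRAME: 0.1}] + [{male: 0.7, FRAME: 0.3}] * 5  # early error, then consistent evidence
    print(fuse_with_attenuation(obs))  # belief concentrates on 'male'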

Relevance:

60.00%

Publisher:

Abstract:

Belief revision is the process that incorporates, in a consistent way, a new piece of information, called the input, into a belief base. When both belief bases and inputs are propositional formulas, a set of natural and rational properties, known as the AGM postulates, has been proposed to define genuine revision operations. This paper addresses the following important issue: how can partially pre-ordered information (representing initial beliefs) be revised by new partially pre-ordered information (representing the input) while preserving the AGM postulates? We first provide a particular representation of partial pre-orders (called units) using the concept of closed sets of units. We then restate the AGM postulates in this framework by defining counterparts of the notions of logical entailment and logical consistency. In the second part of the paper, we provide some examples of revision operations that respect our set of postulates. We also prove that our revision methods extend the well-known lexicographic revision and natural revision when the input is either a single propositional formula or a total pre-order.
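Since the paper relates its operators to the well-known lexicographic revision, the sketch below shows lexicographic revision of a total pre-order over worlds by a propositional input; the rank-function encoding and the toy worlds are illustrative assumptions:

    def lex_revise(rank, satisfies_input):
        """Lexicographic revision of a total pre-order over worlds, given as a
        rank function (lower rank = more plausible): every world satisfying the
        input becomes strictly more plausible than every world that does not,
        while the relative order inside each group is preserved."""
        shift = max(rank.values()) + 1
        return {w: r if satisfies_input(w) else r + shift for w, r in rank.items()}

    # Worlds are (p, q) truth-value pairs with an initial plausibility ranking.
    initial = {(True, True): 2, (True, False): 0, (False, True): 1, (False, False): 0}
    revised = lex_revise(initial, lambda w: w[1])        # input: q must hold
    print(sorted(revised.items(), key=lambda x: x[1]))   # q-worlds now come first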

Relevance:

60.00%

Publisher:

Abstract:

Demand for intelligent surveillance in public transport systems is growing due to the increased threats of terrorist attack, vandalism and litigation. The aim of intelligent surveillance is to react in time to information received from various monitoring devices, especially CCTV systems. However, video analytic algorithms can only provide static assertions, whereas in reality many related events happen in sequence and hence should be modeled sequentially. Moreover, analytic algorithms are error-prone, so how to correct sequential analytic results in the light of new evidence (external information or later sensing discoveries) becomes an interesting issue. In this paper, we introduce a high-level sequential observation modeling framework that supports revision and update on new evidence. This framework adapts the situation calculus to deal with the uncertainty of analytic results. The output of the framework can serve as a foundation for event composition. We demonstrate the significance and usefulness of our framework with a case study of a bus surveillance project.
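A drastically simplified, purely illustrative sketch of sequential modeling in the situation-calculus style the framework builds on: a situation is the list of observed actions so far, and a fluent is evaluated by a successor-state-style axiom. The paper's uncertainty handling and revision mechanisms are not shown, and the event names are invented:

    def holds_door_open(situation):
        """Successor-state-style axiom for the fluent door_open: it becomes
        true after 'open_door', false after 'close_door', and is otherwise
        unchanged (frame axiom)."""
        if not situation:                     # the initial situation S0
            return False
        *rest, action = situation
        if action == "open_door":
            return True
        if action == "close_door":
            return False
        return holds_door_open(rest)          # unaffected by other actions

    print(holds_door_open(["open_door", "passenger_boards"]))  # True
    print(holds_door_open(["open_door", "close_door"]))        # False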

Relevance:

60.00%

Publisher:

Abstract:

Possibilistic answer set programming (PASP) unites answer set programming (ASP) and possibilistic logic (PL) by associating certainty values with rules. The resulting framework allows non-monotonic reasoning and reasoning under uncertainty to be combined in a single setting. While PASP has been well studied for possibilistic definite and possibilistic normal programs, we argue that the current semantics of possibilistic disjunctive programs are not entirely satisfactory. The problem is twofold. First, the treatment of negation-as-failure in existing approaches follows an all-or-nothing scheme that is hard to match with the graded notion of proof underlying PASP. Second, we advocate that the notion of disjunction can be interpreted in several ways. In particular, in addition to the view of ordinary ASP, where disjunctions are used to induce a non-deterministic choice, the possibilistic setting naturally leads to a more epistemic view of disjunction. In this paper, we propose a semantics for possibilistic disjunctive programs, discussing both views on disjunction. Extending our earlier work, we interpret such programs as sets of constraints on possibility distributions, whose least specific solutions correspond to answer sets.
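As background for the graded inference PASP builds on, the sketch below computes certainty degrees for a small possibilistic definite program, using the standard semantics in which a derived atom's necessity degree is the minimum of the rule's certainty and the certainties of its body atoms; the toy program is invented and the disjunctive case studied in the paper is not covered:

    def possibilistic_consequences(rules):
        """Forward chaining for a possibilistic definite program.
        rules: list of (head, body_atoms, certainty) with certainty in (0, 1].
        Returns the best necessity degree derivable for each atom."""
        degree = {}
        changed = True
        while changed:
            changed = False
            for head, body, cert in rules:
                if all(b in degree for b in body):
                    derived = min([cert] + [degree[b] for b in body])
                    if derived > degree.get(head, 0.0):
                        degree[head] = derived
                        changed = True
        return degree

    # Invented program: certainty-annotated facts and one rule.
    program = [("alarm", [], 0.9),                          # fact with certainty 0.9
               ("sensor_ok", [], 0.6),                      # fact with certainty 0.6
               ("intrusion", ["alarm", "sensor_ok"], 0.8)]  # intrusion <- alarm, sensor_ok
    print(possibilistic_consequences(program))
    # {'alarm': 0.9, 'sensor_ok': 0.6, 'intrusion': 0.6}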

Relevance:

60.00%

Publisher:

Abstract:

Doctoral thesis, Environmental Sciences, Universidade de Lisboa, Faculdade de Ciências, Universidade Nova de Lisboa, 2015