49 results for Uncertainty Based Online Planning
Abstract:
We present a system for dynamic network resource configuration in environments with bandwidth reservation and path restoration mechanisms. Our focus is on the dynamic bandwidth management results, although the main goal of the system is the integration of the different mechanisms that manage the reserved paths (bandwidth, restoration, and spare capacity planning). The objective is to avoid conflicts between these mechanisms. The system is able to dynamically manage a logical network such as a virtual path network in ATM or a label switched path network in MPLS. The system has been designed to be modular, in the sense that it can be activated or deactivated, and it can be applied to only a sub-network. The system design and implementation are based on a multi-agent system (MAS). We also include details of its architecture and implementation.
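As a rough illustration only (the paper's MAS architecture is not reproduced here), the sketch below shows how a monitoring agent might watch a logical path and request bandwidth reallocation; the LogicalPath structure and the threshold are hypothetical.

```python
# Illustrative sketch (not the paper's implementation): a simple monitoring
# agent that watches utilization of a logical path (e.g. an ATM VP or MPLS LSP)
# and asks a coordinator agent for more bandwidth when a threshold is exceeded.
from dataclasses import dataclass

@dataclass
class LogicalPath:          # hypothetical structure
    name: str
    reserved_mbps: float
    carried_mbps: float

def needs_reallocation(path: LogicalPath, threshold: float = 0.9) -> bool:
    """True when carried traffic uses more than `threshold` of the reservation."""
    return path.carried_mbps > threshold * path.reserved_mbps

paths = [LogicalPath("VP-1", 100.0, 95.0), LogicalPath("LSP-2", 50.0, 20.0)]
for p in paths:
    if needs_reallocation(p):
        print(f"{p.name}: request bandwidth reallocation from coordinator agent")
```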
Abstract:
This paper deals with fault detection and isolation problems for nonlinear dynamic systems. Both problems are stated as constraint satisfaction problems (CSPs) and solved using consistency techniques. The main contribution is an isolation method based on consistency techniques and on refining the uncertainty space of interval parameters. The major advantage of this method is that isolation is fast even when uncertainty in parameters, measurements, and model errors is taken into account. Interval calculations remove the dependence on the monotonicity assumption made by several observer-based fault isolation approaches. An application to a well-known alcoholic fermentation process model is presented.
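As a minimal sketch of the kind of interval-based consistency test used for fault detection (the paper's CSP formulation and fermentation model are not reproduced), assuming a toy one-equation model and plain Python interval arithmetic:

```python
# Minimal sketch of interval-based consistency checking for fault detection:
# a measurement is consistent with the model if the predicted interval
# (propagating parameter uncertainty) overlaps the measured interval.
# Names and the toy model are illustrative, not taken from the paper.

def interval_mul(a, b):
    products = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(products), max(products))

def overlaps(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

# Toy static model y = k * u with uncertain gain k in [1.8, 2.2]
k = (1.8, 2.2)
u = (0.9, 1.1)          # uncertain input measurement
y_meas = (2.6, 2.8)     # uncertain output measurement

y_pred = interval_mul(k, u)            # predicted output interval
consistent = overlaps(y_pred, y_meas)  # no overlap -> a fault is indicated
print("fault detected" if not consistent else "consistent with nominal model")
```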
Abstract:
A model-based approach for fault diagnosis is proposed, where the fault detection is based on checking the consistency of the Analytical Redundancy Relations (ARRs) using an interval tool. The tool takes into account the uncertainty in the parameters and the measurements using intervals. Faults are explicitly included in the model, which allows for the exploitation of additional information. This information is obtained from partial derivatives computed from the ARRs. The signs in the residuals are used to prune the candidate space when performing the fault diagnosis task. The method is illustrated using a two-tank example, in which these aspects are shown to have an impact on the diagnosis and fault discrimination, since the proposed method goes beyond the structural methods.
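A small illustrative sketch of sign-based candidate pruning as described above; the fault signature table and observed residual signs are invented, not taken from the two-tank example:

```python
# Illustrative sketch of sign-based candidate pruning (names and the signature
# table are made up, not taken from the paper). Each fault candidate has an
# expected residual-sign pattern derived from partial derivatives of the ARRs;
# candidates whose pattern contradicts the observed signs are discarded.

# expected signs of (r1, r2) for each fault; 0 means "no constraint"
expected_signs = {
    "leak_tank1":   (-1, 0),
    "leak_tank2":   (-1, -1),
    "clogged_pipe": (+1, -1),
}

observed = (+1, -1)   # signs of the evaluated residuals

def compatible(expected, observed):
    return all(e == 0 or e == o for e, o in zip(expected, observed))

candidates = [f for f, s in expected_signs.items() if compatible(s, observed)]
print(candidates)     # -> ['clogged_pipe']
```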
Abstract:
The practical performance of analytical redundancy for fault detection and diagnosis is often degraded by uncertainties prevailing not only in the system model, but also in the measurements. In this paper, the problem of fault detection is stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem can be solved using modal interval analysis and consistency techniques. Consistency techniques are then shown to be particularly efficient for checking the consistency of the analytical redundancy relations (ARRs), dealing with uncertain measurements and parameters. Through the work presented in this paper, it can be observed that consistency techniques can be used to increase the performance of a robust fault detection tool based on interval arithmetic. The proposed method is illustrated using a nonlinear dynamic model of a hydraulic system.
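A minimal sketch of an interval ARR consistency test (a fault is indicated when the interval evaluation of the residual excludes zero); the toy relation and bounds below are assumptions, not the paper's hydraulic model:

```python
# Minimal sketch of an interval ARR consistency test: an analytical redundancy
# relation r = y - f(u, theta) is consistent with the data if the interval
# evaluation of r contains zero. The modal-interval machinery and the hydraulic
# model used in the paper are not reproduced here; names are illustrative.

def i_add(a, b): return (a[0] + b[0], a[1] + b[1])
def i_neg(a):    return (-a[1], -a[0])
def i_mul(a, b):
    p = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(p), max(p))

theta = (0.45, 0.55)          # uncertain parameter
u     = (1.9, 2.1)            # uncertain input measurement
y     = (0.9, 1.1)            # uncertain output measurement

# ARR: r = y - theta * u
r = i_add(y, i_neg(i_mul(theta, u)))
print("consistent" if r[0] <= 0.0 <= r[1] else "fault detected")
```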
Abstract:
The paper presents a competence-based instructional design system and a way to personalize navigation through the course content. The navigation aid tool builds on the competence graph and on the student model, which includes elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is most prepared to study. We use fuzzy set theory for dealing with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used for assigning values to linguistic variables. For each competence, the level of difficulty and the level of knowledge of its prerequisites are calculated from the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence regarding its level of recommendation.
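A minimal sketch of the fuzzy IF-THEN idea described above, with invented triangular membership functions and a single illustrative rule (the paper's actual rule base is not reproduced):

```python
# Illustrative sketch (not the paper's rule base): map assessment marks to
# linguistic terms with simple triangular memberships, then apply a fuzzy
# IF-THEN rule to rate how strongly a competence is recommended for study.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def linguistic(mark):                 # mark on a 0-10 scale
    return {
        "low":    tri(mark, -1, 0, 5),
        "medium": tri(mark, 2.5, 5, 7.5),
        "high":   tri(mark, 5, 10, 11),
    }

prereq_knowledge = linguistic(8.5)    # how well prerequisites are known
difficulty       = linguistic(4.0)    # estimated difficulty of the competence

# Rule: IF prerequisites are high AND difficulty is not high THEN recommend
strength = min(prereq_knowledge["high"], 1.0 - difficulty["high"])
category = "recommended" if strength > 0.5 else "not yet recommended"
print(round(strength, 2), category)   # -> 0.7 recommended
```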
Abstract:
In this paper, the core functions of an artificial intelligence (AI) for controlling a debris collector robot are designed and implemented. Using the Robot Operating System (ROS) as the basis of this work, a multi-agent system with task-planning capabilities is built.
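A minimal rospy (ROS 1) sketch of a task-publishing agent node; the topic name, task list and node structure are illustrative assumptions, not the paper's implementation:

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) sketch of a task-publishing agent node. The topic name
# and the task list are illustrative; the paper's actual agent architecture is
# not reproduced here.
import rospy
from std_msgs.msg import String

def planner_agent():
    rospy.init_node("task_planner_agent")
    pub = rospy.Publisher("/debris_collector/tasks", String, queue_size=10)
    rate = rospy.Rate(1)                       # 1 Hz
    tasks = ["scan_area", "approach_debris", "grasp", "return_to_base"]
    i = 0
    while not rospy.is_shutdown():
        pub.publish(String(data=tasks[i % len(tasks)]))
        i += 1
        rate.sleep()

if __name__ == "__main__":
    try:
        planner_agent()
    except rospy.ROSInterruptException:
        pass
```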
Abstract:
This paper empirically analyzes the volatility of consumption-based stochastic discount factors as a measure of implicit economic fears by studying its relationship with future economic and stock market cycles. Time-varying economic fears seem to be well captured by the volatility of stochastic discount factors. In particular, the volatility of the recursive utility-based stochastic discount factor with contemporaneous growth explains between 9 and 34 percent of future changes in industrial production at short and long horizons, respectively. It also explains ex-ante uncertainty and risk aversion. However, future stock market cycles are better explained by a similar stochastic discount factor with long-run consumption growth. This specification of the stochastic discount factor presents higher volatility and lower pricing errors than the specification with contemporaneous consumption growth.
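As a rough numerical illustration only (the paper uses recursive utility; here a simple power-utility SDF and simulated data are assumed), the sketch computes the SDF's rolling volatility and relates it to future industrial-production growth:

```python
# Rough illustration only: build a consumption-based SDF with power utility
# (the paper uses recursive utility), take its rolling volatility as a proxy
# for economic fears, and relate it to future industrial-production growth.
# All data here are simulated; beta and gamma values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
T, beta, gamma = 400, 0.99, 10.0

dc = 0.005 + 0.01 * rng.standard_normal(T)               # consumption growth
ip = 0.002 + 0.5 * dc + 0.01 * rng.standard_normal(T)    # IP growth (toy link)

sdf = beta * np.exp(-gamma * dc)          # m_t = beta * (C_t / C_{t-1})^(-gamma)
window = 24
vol = np.array([sdf[t - window:t].std() for t in range(window, T)])

# Predictive regression: future 12-month IP growth on current SDF volatility
h = 12
y = np.array([ip[t:t + h].sum() for t in range(window, T - h)])
x = vol[: len(y)]
slope, intercept = np.polyfit(x, y, 1)
r2 = np.corrcoef(x, y)[0, 1] ** 2
print(f"slope={slope:.2f}, R^2={r2:.3f}")
```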
Abstract:
This paper investigates the link between brand performance and cultural primes in high-risk, innovation-based sectors. In the theory section, we propose that the level of cultural uncertainty avoidance embedded in a firm determines its marketing creativity by increasing the complexity and the broadness of a brand. It also determines the rate of the firm's product innovations. Marketing creativity and product innovation, in turn, influence the firm's marketing performance. Empirically, we study trademarked promotion in the Software Security Industry (SSI). Our sample consists of 87 firms from 11 countries that were active in the SSI in the period 1993-2000. We use data from SSI-related trademarks registered by these firms, ending up with 2,911 SSI-related trademarks and a panel of 18,213 observations. We estimate a two-stage model in which we first predict the complexity and the broadness of a trademark, as measures of marketing creativity, and the rate of product innovations. Among several control variables, our variable of theoretical interest is Hofstede's uncertainty avoidance cultural index. We then estimate trademark duration with a hazard model, using the predicted complexity and broadness as well as the rate of product innovations, along with the same control variables. Our evidence confirms that cultural uncertainty avoidance affects the duration of trademarks through the firm's marketing creativity and product innovation.
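A schematic sketch of such a two-stage estimation, with invented data and column names: stage one predicts trademark complexity by least squares, stage two fits a Cox proportional-hazards model for duration using the lifelines package.

```python
# Schematic two-stage estimation sketch (invented column names and data).
# Stage 1: predict trademark complexity from uncertainty avoidance + controls.
# Stage 2: survival (duration) model using the stage-1 prediction.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "uncertainty_avoidance": rng.uniform(20, 100, n),   # Hofstede-style index
    "firm_size": rng.lognormal(3, 1, n),
})

# Stage 1: least-squares prediction of complexity (toy data-generating process)
df["complexity"] = 0.02 * df["uncertainty_avoidance"] + rng.normal(0, 1, n)
X = np.column_stack([np.ones(n), df["uncertainty_avoidance"], df["firm_size"]])
coef, *_ = np.linalg.lstsq(X, df["complexity"].to_numpy(), rcond=None)
df["complexity_hat"] = X @ coef

# Stage 2: Cox hazard model for trademark duration
df["duration"] = rng.exponential(5 + df["complexity_hat"].clip(lower=0), n)
df["observed"] = 1                        # no censoring in the toy data
cph = CoxPHFitter()
cph.fit(df[["duration", "observed", "complexity_hat", "firm_size"]],
        duration_col="duration", event_col="observed")
cph.print_summary()
```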
Abstract:
Two concentration methods for fast and routine determination of caffeine (using HPLC-UV detection) in surface water and wastewater are evaluated. Both methods are based on solid-phase extraction (SPE) concentration with octadecyl silica sorbents. A common “offline” SPE procedure shows that quantitative recovery of caffeine is obtained with 2 mL of a methanol-water elution mixture containing at least 60% methanol. The method detection limit is 0.1 μg L−1 when percolating 1 L samples through the cartridge. The development of an “online” SPE method based on a mini-SPE column, containing 100 mg of the same sorbent and directly connected to the HPLC system, allows the method detection limit to be decreased to 10 ng L−1 with a sample volume of 100 mL. The “offline” SPE method is applied to the analysis of caffeine in wastewater samples, whereas the “online” method is used for analysis in natural waters from streams receiving significant water intakes from local wastewater treatment plants.
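A back-of-the-envelope sketch of the preconcentration arithmetic behind such detection limits; the instrument-level detection limit below is an assumed value used only to show the scaling, not a figure reported in the paper:

```python
# Back-of-the-envelope sketch of SPE preconcentration arithmetic. The
# instrument detection limit below is an assumed value chosen only to show
# the scaling; it is not reported in the paper.
def method_detection_limit(instrument_dl_ug_L, sample_mL, extract_mL):
    """Detection limit referred to the original sample, given the enrichment
    factor = sample volume / final extract volume."""
    return instrument_dl_ug_L / (sample_mL / extract_mL)

# Example: a 1 L sample eluted into 2 mL gives a 500-fold enrichment, so an
# assumed extract-level limit of 50 ug/L becomes 0.1 ug/L in the sample.
print(method_detection_limit(50.0, 1000.0, 2.0))   # -> 0.1
```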
Abstract:
We describe the version of the GPT planner to be used in the planning competition. This version, called mGPT, solves MDPs specified in the PPDDL language by extracting and using different classes of lower bounds, along with various heuristic-search algorithms. The lower bounds are extracted from deterministic relaxations of the MDP, where alternative probabilistic effects of an action are mapped into different, independent, deterministic actions. The heuristic-search algorithms, on the other hand, use these lower bounds for focusing the updates and delivering a consistent value function over all states reachable from the initial state with the greedy policy.
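An illustrative sketch of the all-outcomes determinization described above, where each probabilistic effect becomes an independent deterministic action; the dictionary-based action encoding is invented, not mGPT's internals:

```python
# Illustrative sketch of the all-outcomes determinization described above:
# each probabilistic effect of an action becomes its own deterministic action.
# The dictionary-based action encoding is invented, not mGPT's internals.

probabilistic_actions = {
    "move": {
        "precondition": {"at_A"},
        "effects": [                       # (probability, add-list, delete-list)
            (0.8, {"at_B"}, {"at_A"}),
            (0.2, set(), set()),           # slip: stay in place
        ],
    },
}

def determinize(actions):
    det = {}
    for name, a in actions.items():
        for i, (_, add, delete) in enumerate(a["effects"]):
            det[f"{name}_o{i}"] = {
                "precondition": a["precondition"],
                "add": add,
                "delete": delete,
            }
    return det

for name, a in determinize(probabilistic_actions).items():
    print(name, a)
```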
Abstract:
We introduce a width parameter that bounds the complexity of classical planning problems and domains, along with a simple but effective blind-search procedure that runs in time exponential in the problem width. We show that many benchmark domains have a bounded and small width provided that goals are restricted to single atoms, and hence that such problems are provably solvable in low polynomial time. We then focus on the practical value of these ideas over the existing benchmarks, which feature conjunctive goals. We show that the blind-search procedure can be used both for serializing the goal into subgoals and for solving the resulting problems, resulting in a ‘blind’ planner that competes well with a best-first search planner guided by state-of-the-art heuristics. In addition, ideas like helpful actions and landmarks can be integrated as well, producing a planner with state-of-the-art performance.
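A compact sketch of the novelty-based pruning underlying such width-bounded blind search, in the style of IW(1) over an invented toy domain:

```python
# Compact IW(1)-style sketch: breadth-first search that prunes any state which
# does not make some atom true for the first time. The toy domain (a counter
# that must reach a target value) is invented for brevity.
from collections import deque

GOAL = 5
def atoms(state):                  # a state is just an integer counter here
    return {("value", state)}
def successors(state):
    return [state + 1, state - 1]

def iw1(start):
    seen_atoms = set(atoms(start))
    queue = deque([(start, [])])
    while queue:
        state, plan = queue.popleft()
        if state == GOAL:
            return plan
        for nxt in successors(state):
            new = atoms(nxt) - seen_atoms
            if new:                # novel state: it makes at least one new atom true
                seen_atoms |= new
                queue.append((nxt, plan + [nxt]))
    return None

print(iw1(0))        # -> [1, 2, 3, 4, 5]
```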
Abstract:
Planning with partial observability can be formulated as a non-deterministic search problem in belief space. The problem is harder than classical planning, as keeping track of beliefs is harder than keeping track of states, and searching for action policies is harder than searching for action sequences. In this work, we develop a framework for partial observability that avoids these limitations and leads to a planner that scales up to larger problems. For this, the class of problems is restricted to those in which 1) the non-unary clauses representing the uncertainty about the initial situation are invariant, and 2) variables that are hidden in the initial situation do not appear in the body of conditional effects, which are all assumed to be deterministic. We show that such problems can be translated in linear time into equivalent fully observable non-deterministic planning problems, and that a slight extension of this translation renders the problem solvable by means of classical planners. The whole approach is sound and complete provided that, in addition, the state space is connected. Experiments are also reported.
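A toy sketch of the knowledge-level compilation idea behind such translations; the encoding is illustrative only and omits the paper's handling of invariant clauses, conditional effects, and observations:

```python
# Toy sketch of the knowledge-level compilation idea behind such translations:
# each literal L is replaced by atoms K_L ("L is known") and K_not_L, hidden
# initial literals start with neither, and a deterministic effect that makes L
# true adds K_L and deletes K_not_L. The paper's full translation is not
# reproduced here; this encoding is illustrative only.

def translate_init(known, hidden):
    init = {f"K_{lit}" for lit in known}
    # hidden literals contribute neither K_lit nor K_not_lit initially
    return init

def translate_effect(lit):
    return {"add": {f"K_{lit}"}, "delete": {f"K_not_{lit}"}}

print(translate_init(known={"at_room1"}, hidden={"door_open"}))
print(translate_effect("door_open"))
```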
Abstract:
The main objective of this Master's thesis is to learn more about Girona's image as a tourism destination from the perspective of different agents, and to study differences in its promotion and in opinions about it. To meet this objective, three components of Girona's destination image are studied: the attribute-based component, the holistic component, and the affective component. A lot of research has been done on tourism destination image, but much less on the destination of Girona. Some studies have already focused on Girona as a tourist destination, but they used a different type of sample and different methodological steps. This study is new among destination studies in the sense that it is based only on textual online data and follows a text-mining methodology. Text mining is a methodology that allows relevant information to be extracted from texts. After this information is extracted, several multivariate statistical analyses are carried out with the aim of discovering more about Girona's tourism image.
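An illustrative text-mining sketch (not the thesis' actual pipeline or data): TF-IDF vectorization of online texts followed by a simple multivariate analysis (k-means clustering) using scikit-learn.

```python
# Illustrative text-mining sketch (not the thesis' actual pipeline or data):
# vectorize online texts about a destination with TF-IDF, then run a simple
# multivariate analysis (k-means clustering) to surface recurring image themes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "beautiful old town and cathedral, lovely medieval walls",
    "great restaurants, amazing food scene near the river",
    "expensive parking but charming historic quarter",
    "tasty local cuisine and friendly service",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
terms = tfidf.get_feature_names_out()
for c in range(2):
    center = km.cluster_centers_[c]
    top = [terms[i] for i in center.argsort()[::-1][:3]]
    print(f"cluster {c}: {top}")
```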
Abstract:
A discussion is presented of daytime sky imaging and techniques that may be applied to the analysis of full-color sky images to infer cloud macrophysical properties. Descriptions of two different types of sky-imaging systems developed by the authors are presented, one of which has been developed into a commercially available instrument. Retrievals of fractional sky cover from automated processing methods are compared to human retrievals, both from direct observations and visual analyses of sky images. Although some uncertainty exists in fractional sky cover retrievals from sky images, this uncertainty is no greater than that attached to human observations for the commercially available sky-imager retrievals. Thus, the application of automatic digital image processing techniques to sky images is a useful method to complement, or even replace, traditional human observations of sky cover and, potentially, cloud type. Additionally, the possibilities for inferring other cloud parameters such as cloud brokenness and solar obstruction further enhance the usefulness of sky imagers.
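A sketch of a commonly used fractional-sky-cover retrieval, not necessarily the authors' algorithm: thresholding the red-to-blue ratio of each pixel of a full-color sky image.

```python
# Illustrative sketch of a common fractional-sky-cover retrieval (not
# necessarily the authors' algorithm): pixels of a full-color sky image are
# classified as cloud or clear sky by thresholding the red-to-blue ratio,
# since clouds scatter red and blue light more evenly than the clear sky.
import numpy as np

def fractional_sky_cover(rgb, ratio_threshold=0.6):
    """rgb: HxWx3 array in [0, 1]; returns cloud fraction in [0, 1]."""
    red, blue = rgb[..., 0], rgb[..., 2]
    ratio = red / np.clip(blue, 1e-6, None)
    cloud_mask = ratio > ratio_threshold
    return cloud_mask.mean()

# Toy image: left half clear sky (blue-dominated), right half cloud (grayish)
img = np.zeros((10, 10, 3))
img[:, :5] = [0.2, 0.4, 0.9]      # clear sky
img[:, 5:] = [0.8, 0.8, 0.8]      # cloud
print(fractional_sky_cover(img))  # -> 0.5
```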
Abstract:
This article presents preliminary research, from an instructional design perspective, on the design of the case method as an integral part of pedagogy and technology. Key features and benefits of using this teaching and learning strategy in a Virtual Teaching and Learning Environment (VTLE) are identified, taking into account the requirements of the European Higher Education Area (EHEA) for competence-based curriculum design. The implications of these findings for a learning object approach exploring the possibilities of learning personalization, reusability, and interoperability through IMS LD are also analyzed.