7 results for clauses
at Universidad Politécnica de Madrid
Abstract:
We propose an analysis for detecting procedures and goals that are deterministic (i.e., that produce at most one solution at most once), or predicates whose clause tests are mutually exclusive (which implies that at most one of their clauses will succeed) even if they are not deterministic. The analysis takes advantage of the pruning operator in order to improve the detection of mutual exclusion and determinacy. It also supports arithmetic equations and disequations, as well as equations and disequations on terms, for which we give a complete satisfiability testing algorithm, w.r.t. available type information. Information about determinacy can be used for program debugging and optimization, resource consumption and granularity control, abstraction carrying code, etc. We have implemented the analysis and integrated it in the CiaoPP system, which also infers automatically the mode and type information that our analysis takes as input. Experiments performed on this implementation show that the analysis is fairly accurate and efficient.
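As a rough illustration of the mutual-exclusion idea (and not of the paper's actual algorithm, which handles full arithmetic and term (dis)equations guided by inferred type information), the following Python sketch checks whether simple one-variable interval guards can hold simultaneously; if no two of them can, the corresponding clauses are mutually exclusive and at most one of them can succeed.

```python
# Toy sketch only: pairwise mutual exclusion of one-variable interval guards.
# The analysis described in the abstract works on arbitrary arithmetic and term
# (dis)equations with respect to available types; this miniature version just
# conveys the notion of "clause tests whose conjunction is unsatisfiable".
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Guard:
    """Closed interval constraint lo <= X <= hi on a single numeric argument."""
    lo: float = -math.inf
    hi: float = math.inf

def conjunction_satisfiable(g1: Guard, g2: Guard) -> bool:
    # Two interval guards can hold at once iff the intervals overlap.
    return max(g1.lo, g2.lo) <= min(g1.hi, g2.hi)

def mutually_exclusive(guards: list[Guard]) -> bool:
    # Clause tests are mutually exclusive if no two of them are jointly satisfiable.
    return all(not conjunction_satisfiable(guards[i], guards[j])
               for i in range(len(guards)) for j in range(i + 1, len(guards)))

if __name__ == "__main__":
    # Guards of two hypothetical clauses: X =< -1  versus  X >= 1.
    print(mutually_exclusive([Guard(hi=-1.0), Guard(lo=1.0)]))   # True
    # Overlapping guards: X >= 0 versus X =< 5 -> both clauses may succeed.
    print(mutually_exclusive([Guard(lo=0.0), Guard(hi=5.0)]))    # False
```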
Abstract:
Static analyses of object-oriented programs usually rely on intermediate representations that respect the original semantics while having a more uniform and basic syntax. Most of the work involving object-oriented languages and abstract interpretation omits the description of that language or just refers to the Control Flow Graph (CFG) it represents. However, this lack of formalization, on the one hand, results in an absence of assurances regarding the correctness of the transformation and, on the other, typically couples the analysis strongly to the source language. In this work we present a framework for the analysis of object-oriented languages in which, in a first phase, we transform the input program into a representation based on Horn clauses. This allows, on the one hand, proving the transformation correct subject to a simple condition and, on the other, applying an existing analyzer for (constraint) logic programming to automatically derive a safe approximation of the semantics of the original program. The approach is flexible in the sense that the first phase decouples the analyzer from most language-dependent features, and correct because the set of Horn clauses returned by the transformation phase safely approximates the standard semantics of the input program. The resulting analysis is also reasonably scalable due to the use of mature, modular (C)LP-based analyzers. The overall approach allows us to report results for medium-sized programs.
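As a purely illustrative sketch (using a hypothetical abs method, and not reproducing the paper's actual transformation), the following Python snippet shows the flavour of such a Horn-clause representation: each control-flow path of a small object-oriented method contributes one clause relating the method's inputs to its result.

```python
# Toy illustration (not the paper's transformation): encode the branches of a
# small object-oriented method as alternative Horn clauses over its inputs and
# return value. In the framework described above, methods / basic blocks of the
# intermediate representation become predicates, and each control-flow path
# contributes a clause whose body approximates the statements along that path.
from typing import NamedTuple

class Clause(NamedTuple):
    head: str          # e.g. "abs(X, R)"
    body: list[str]    # conjunction of literals, e.g. ["X >= 0", "R = X"]

    def __str__(self) -> str:
        return f"{self.head} :- {', '.join(self.body)}." if self.body else f"{self.head}."

# Hypothetical source method:  int abs(int x) { if (x >= 0) return x; return -x; }
abs_clauses = [
    Clause("abs(X, R)", ["X >= 0", "R = X"]),    # then-branch
    Clause("abs(X, R)", ["X < 0",  "R = -X"]),   # else-branch
]

for c in abs_clauses:
    print(c)
# A (C)LP analyzer run over these clauses can derive properties (modes, types,
# determinacy, ...) that safely approximate the behaviour of the original method.
```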
Abstract:
We propose an analysis for detecting procedures and goals that are deterministic (i.e., that produce at most one solution), or predicates whose clause tests are mutually exclusive (which implies that at most one of their clauses will succeed) even if they are not deterministic (because they call other predicates that can produce more than one solution). Applications of such determinacy information include detecting programming errors, performing certain high-level program transformations for improving search efficiency, optimizing low-level code generation and parallel execution, and estimating tighter upper bounds on the computational costs of goals and data sizes, which can be used for program debugging, resource consumption and granularity control, etc. We have implemented the analysis and integrated it in the CiaoPP system, which also infers automatically the mode and type information that our analysis takes as input. Experiments performed on this implementation show that the analysis is fairly accurate and efficient.
Abstract:
We propose an analysis for detecting procedures and goals that are deterministic (i.e., that produce at most one solution at most once), or predicates whose clause tests are mutually exclusive (which implies that at most one of their clauses will succeed) even if they are not deterministic. The analysis takes advantage of the pruning operator in order to improve the detection of mutual exclusion and determinacy. It also supports arithmetic equations and disequations, as well as equations and disequations on terms, for which we give a complete satisfiability testing algorithm, w.r.t. available type information. We have implemented the analysis and integrated it in the CiaoPP system, which also infers automatically the mode and type information that our analysis takes as input. Experiments performed on this implementation show that the analysis is fairly accurate and efficient.
Abstract:
Concession contracts for highways often include clauses (for example, a minimum traffic guarantee) that allow for better management of the business risks. The value of these clauses may be important and should be added to the total value of the concession. However, in these cases, traditional valuation techniques, like the NPV (net present value) of the project, are insufficient. An alternative methodology for the valuation of highway concessions is one based on the real options approach. This methodology is generally built on the assumption that traffic volume evolves as a GBM (geometric Brownian motion), which is the hypothesis analyzed in this paper. First, a description of the methodology used to analyze the existence of unit roots (i.e., the hypothesis of non-stationarity) is provided. The Dickey-Fuller approach has been used, which is the most common test for this kind of analysis. This methodology is then applied to perform a statistical analysis of traffic series on Spanish toll highways. For this purpose, data on the AADT (annual average daily traffic) of a set of highways have been used. The period of analysis is around thirty years in most cases. The main outcome of the research is that the hypothesis that traffic volume follows a GBM process on Spanish toll highways cannot be rejected. This result is robust, and therefore it can be used as a starting point for the application of real options theory to assess toll highway concessions.
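As a rough sketch of this kind of test (using synthetic data, not the paper's AADT series), the snippet below simulates roughly thirty years of annual traffic under a GBM, runs an augmented Dickey-Fuller unit-root test on the log of the series via statsmodels, and recovers drift and volatility estimates from the log returns.

```python
# Illustrative sketch only (synthetic data, not the paper's Spanish toll-highway
# series): if traffic volume follows a geometric Brownian motion, its logarithm
# is a random walk with drift, so an (augmented) Dickey-Fuller test on
# log-traffic should fail to reject the unit-root (non-stationarity) hypothesis.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)

# Simulate ~30 years of annual traffic under GBM with drift mu and volatility sigma.
mu, sigma, years, s0 = 0.03, 0.08, 30, 20_000.0
log_returns = rng.normal(mu - 0.5 * sigma**2, sigma, size=years)
traffic = s0 * np.exp(np.concatenate(([0.0], np.cumsum(log_returns))))  # AADT-like series
log_traffic = np.log(traffic)

# Augmented Dickey-Fuller test: H0 = unit root (non-stationary).
adf_stat, p_value, *_ = adfuller(log_traffic, regression="ct")
print(f"ADF statistic = {adf_stat:.3f}, p-value = {p_value:.3f}")
# A large p-value means H0 cannot be rejected, i.e. the data are compatible
# with the GBM assumption used in real-options valuation of the concession.

# Drift and volatility estimates from the annual log-returns (time step = 1 year).
diffs = np.diff(log_traffic)
sigma_hat = diffs.std(ddof=1)
mu_hat = diffs.mean() + 0.5 * sigma_hat**2
print(f"estimated mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```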
Abstract:
Writing an effective abstract is always a difficult and important task in academic writing. What kinds of abstracts are well regarded in sport science? To answer this question, 20 abstracts from top journals of sport science were analyzed in the current research. The number of words and the rhetorical moves were studied to assess the structure of the abstracts. In addition, key clauses, citations, the use of first-person pronouns, the adoption of abbreviations and acronyms, hedging, and the main tense were included in the analysis of writing skills. The results show that: (1) almost all of the abstracts were non-structured, and the length varied a lot, but the average word count was about 210-220; (2) the use of writing skills such as key clauses, citations, and hedging differed depending on the preference of the journal where the abstract appeared, and the main tense was selected based on the context of the abstract. In most cases, abbreviations and acronyms were allowed, while the first-person pronoun was always avoided.
Abstract:
This paper describes a new technique, referred to as watched subgraphs, which improves the performance of BBMC, a leading state-of-the-art exact solver for the maximum clique problem (MCP). It is based on the watched literals employed by modern SAT solvers for Boolean constraint propagation. In efficient SAT algorithms, a list of clauses is kept for each literal (the clauses are said to watch the literal) so that only those in the list are checked for constraint propagation when a (watched) literal is assigned during search. BBMC encodes vertex sets as bit strings, with each bit block representing a subset of vertices (and the corresponding induced subgraph) of the size of the CPU register word. The paper proposes to watch two subgraphs of critical sets during MCP search in order to compute a number of basic operations efficiently. Reported results validate the approach as the size and density of problem instances rise, while achieving comparable performance in the general case.
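The following Python sketch is only an analogy to convey the "watching" idea on a bit-string vertex-set encoding; it is not BBMC's actual data structure or the paper's watched-subgraph scheme. Two watched block indices bound the region of blocks that may be non-empty, so emptiness tests and first-vertex lookups skip runs of empty blocks, in the same lazy spirit as watched literals in SAT solvers.

```python
# Minimal sketch of the underlying idea (not BBMC's implementation): a vertex
# set is stored as 64-bit blocks, and two lazily maintained "watched" block
# indices (lowest and highest possibly non-empty block) let emptiness tests and
# first-vertex lookups skip blocks that have become empty during search.
BLOCK_BITS = 64

class WatchedBitSet:
    def __init__(self, n_vertices: int):
        self.blocks = [0] * ((n_vertices + BLOCK_BITS - 1) // BLOCK_BITS)
        self.low_watch = 0                        # first block that may be non-empty
        self.high_watch = len(self.blocks) - 1    # last block that may be non-empty

    def add(self, v: int) -> None:
        b = v // BLOCK_BITS
        self.blocks[b] |= 1 << (v % BLOCK_BITS)
        self.low_watch = min(self.low_watch, b)
        self.high_watch = max(self.high_watch, b)

    def remove(self, v: int) -> None:
        self.blocks[v // BLOCK_BITS] &= ~(1 << (v % BLOCK_BITS))

    def _advance_watches(self) -> None:
        # Move the watches inward past blocks that have become empty.
        while self.low_watch <= self.high_watch and self.blocks[self.low_watch] == 0:
            self.low_watch += 1
        while self.high_watch >= self.low_watch and self.blocks[self.high_watch] == 0:
            self.high_watch -= 1

    def is_empty(self) -> bool:
        self._advance_watches()
        return self.low_watch > self.high_watch

    def first_vertex(self) -> int:
        """Smallest vertex in the set, or -1 if empty; only touches watched blocks."""
        self._advance_watches()
        if self.low_watch > self.high_watch:
            return -1
        block = self.blocks[self.low_watch]
        return self.low_watch * BLOCK_BITS + (block & -block).bit_length() - 1

if __name__ == "__main__":
    s = WatchedBitSet(256)
    for v in (70, 130, 200):
        s.add(v)
    s.remove(70)
    print(s.first_vertex())   # 130: the watch skips the now-empty low blocks
```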