929 results for Program Analysis


Relevance:

70.00%

Publisher:

Abstract:

Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally-adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system.

This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce.

The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs. Hence, the maximum leakage can be bounded. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns. It efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, both in terms of efficiency and accuracy, is shown through a number of case studies found in recent literature.
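As a minimal illustration of the counting step (not the dissertation's STP/implication-graph implementation; the constraint encoding and names below are hypothetical), two-bit patterns can be enumerated directly for small outputs, yielding the output count and hence a leakage bound:

```python
from itertools import product
from math import log2

def count_outputs(n_bits, patterns):
    """Count the bit vectors consistent with a set of two-bit patterns.

    patterns: list of (i, j, rel) with rel in {"eq", "neq"}, meaning
    output bits i and j must be equal / unequal.
    """
    count = 0
    for bits in product([0, 1], repeat=n_bits):
        if all((bits[i] == bits[j]) == (rel == "eq")
               for i, j, rel in patterns):
            count += 1
    return count

# Suppose the prover established: bit 0 == bit 1, and bit 2 != bit 3.
patterns = [(0, 1, "eq"), (2, 3, "neq")]
n_feasible = count_outputs(4, patterns)   # 4 feasible outputs out of 16
max_leakage = log2(n_feasible)            # upper bound of 2.0 bits
```

For a deterministic program, the capacity under the measures discussed here is bounded by log2 of the number of feasible outputs, which is why the output-counting reduction applies.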

Relevance:

60.00%

Publisher:

Abstract:

Program slicing is a well known family of techniques intended to identify and isolate code fragments which depend on, or are depended upon by, specific program entities. This is particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, and corresponding tools, target either the imperative or the object-oriented paradigms, where program slices are computed with respect to a variable or a program statement. Taking a complementary point of view, this paper focuses on the slicing of higher-order functional programs under a lazy evaluation strategy. A prototype of a Haskell slicer, built as proof-of-concept for these ideas, is also introduced.
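For contrast with the lazy functional setting the paper addresses, the classic imperative flavour of slicing can be sketched as a backward pass over def/use information (the statement representation below is illustrative, not the paper's technique):

```python
def backward_slice(stmts, criterion):
    """Backward slice of straight-line code with respect to a variable.

    stmts: list of (line, defs, uses) triples, in program order.
    criterion: the variable of interest at the end of the program.
    Returns the set of line numbers the criterion depends on.
    """
    relevant = {criterion}
    slice_lines = set()
    for line, defs, uses in reversed(stmts):
        if defs & relevant:               # statement defines a relevant var
            slice_lines.add(line)
            relevant = (relevant - defs) | uses
    return slice_lines

program = [
    (1, {"a"}, set()),   # a = input()
    (2, {"b"}, set()),   # b = input()
    (3, {"c"}, {"a"}),   # c = a + 1
    (4, {"d"}, {"b"}),   # d = b * 2
    (5, {"r"}, {"c"}),   # r = c
]
backward_slice(program, "r")   # -> {1, 3, 5}; lines 2 and 4 are sliced away
```

Under lazy evaluation, which expressions actually contribute to a result is demand-driven, which is precisely why the higher-order functional case needs the dedicated treatment the paper develops.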

Relevance:

60.00%

Publisher:

Abstract:

More and more current software systems rely on non-trivial coordination logic for combining autonomous services typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and, therefore, difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the actually invoked services as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework. Therefore, the scope of application of COORDINSPECTOR is quite large: potentially any piece of code developed in any of the programming languages which compile to the .Net Framework. The tool generates graphical representations of the coordination layer, identifies the underlying business process orchestrations, and renders them as Orc specifications.

Relevance:

60.00%

Publisher:

Abstract:

A large and growing amount of software systems rely on non-trivial coordination logic for making use of third party services or components. Therefore, it is of utmost importance to understand and capture rigorously this continuously growing layer of coordination, as this will make easier not only the verification of such systems with respect to their original specifications, but also maintenance, further development, testing, deployment and integration. This paper introduces a method based on several program analysis techniques (namely, dependence graphs, program slicing, and graph pattern analysis) to extract coordination logic from legacy systems' source code. This process is driven by a series of pre-defined coordination patterns and captured by a special purpose graph structure from which coordination specifications can be generated in a number of different formalisms.
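A toy version of the pattern-driven matching step might look as follows, assuming a dependence graph whose nodes are labelled with statement kinds (the labels and the pattern are hypothetical stand-ins, not the paper's catalogue of coordination patterns):

```python
# Dependence graph: node -> (statement-kind label, successor nodes).
graph = {
    "n1": ("recv", ["n2"]),       # receive a request from a service
    "n2": ("transform", ["n3"]),  # process the payload
    "n3": ("send", []),           # reply to another service
    "n4": ("log", []),            # unrelated bookkeeping
}

def match_pattern(graph, pattern):
    """Find all label paths in the graph matching a coordination
    pattern, given as a sequence of node labels."""
    def walk(node, rest):
        label, succs = graph[node]
        if label != rest[0]:
            return []
        if len(rest) == 1:
            return [[node]]
        return [[node] + tail for s in succs for tail in walk(s, rest[1:])]
    return [path for n in graph for path in walk(n, pattern)]

match_pattern(graph, ["recv", "transform", "send"])
# -> [["n1", "n2", "n3"]]; the "log" node takes no part in coordination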

Relevance:

60.00%

Publisher:

Abstract:

Current software development relies increasingly on non-trivial coordination logic for combining autonomous services often running on different platforms. As a rule, however, in typical non-trivial software systems, such a coordination layer is strongly woven within the application at source code level. Therefore, its precise identification becomes a major methodological (and technical) problem whose importance cannot be overstated in any program understanding or refactoring process. Open access to source code, as granted in OSS certification, provides an opportunity for the development of methods and technologies to extract, from source code, the relevant coordination information. This paper is a step in this direction, combining a number of program analysis techniques to automatically recover coordination information from legacy code. Such information is then expressed as a model in Orc, a general purpose orchestration language.

Relevance:

60.00%

Publisher:

Abstract:

Concurrent programming is a difficult and error-prone task because the programmer must reason about multiple threads of execution and their possible interleavings. A concurrent program must synchronize the concurrent accesses to shared memory regions, but this is not enough to prevent all anomalies that can arise in a concurrent setting. The programmer can misidentify the scope of the regions of code that need to be atomic, resulting in atomicity violations and failing to ensure the correct behavior of the program. Executing a sequence of atomic operations may lead to incorrect results when these operations are co-related. In this case, the programmer may be required to enforce the sequential execution of those operations as a whole to avoid atomicity violations. This situation is especially common when the developer makes use of services from third-party packages or modules. This thesis proposes a methodology, based on design by contract, to specify which sequences of operations must be executed atomically. We developed an analysis that statically verifies that a client of a module is respecting its contract, allowing the programmer to identify the source of possible atomicity violations.
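One way to picture such a contract is as an automaton over a module's method names, against which a client's call sequence is checked. The sketch below checks extracted call sequences rather than performing the thesis's static verification, and all names are illustrative:

```python
# Contract for a hypothetical file-like module: a client must call
# open, then any number of writes, then close, as one atomic sequence.
# Encoded as a DFA: (state, method) -> next state.
contract = {
    ("start", "open"): "opened",
    ("opened", "write"): "opened",
    ("opened", "close"): "done",
}

def respects(trace, contract, start="start", accepting=("done",)):
    """Check a client's call sequence against the contract automaton."""
    state = start
    for call in trace:
        state = contract.get((state, call))
        if state is None:
            return False   # sequence not permitted by the contract
    return state in accepting

respects(["open", "write", "write", "close"], contract)  # True
respects(["open", "close", "write"], contract)           # False
```

A static analysis would instead approximate, for every program point in the client, the set of automaton states the contract could be in, flagging any call that can leave the automaton stuck.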

Relevance:

60.00%

Publisher:

Abstract:

Integrated master's dissertation in Civil Engineering (specialization area: Structures and Geotechnics)

Relevance:

60.00%

Publisher:

Abstract:

The Tax Credits Contingent Liabilities Report was created by the Tax Research and Program Analysis Section of the Iowa Department of Revenue (IDR) for the benefit of the Revenue Estimating Conference (REC). This report is part of the Tax Credits Tracking and Analysis Program. The goal of the program is to provide a repository for information concerning the awarding, usage, and effectiveness of tax credits. This report forecasts tax credit claims assuming that all available awarded credits are issued and then, along with forecasted credits, are subsequently claimed.
