982 results for Symbolic model checking
Abstract:
Since their inception in 1962, Petri nets have been used in a wide variety of application domains. Although Petri nets are graphical and easy to understand, they have formal semantics and allow for analysis techniques ranging from model checking and structural analysis to process mining and performance analysis. Over time, Petri nets have emerged as a solid foundation for Business Process Management (BPM) research. The BPM discipline develops methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. Mainstream business process modeling notations and workflow management systems use token-based semantics borrowed from Petri nets. Moreover, state-of-the-art BPM analysis techniques use Petri nets as an internal representation. Users of BPM methods and tools are often not aware of this. This paper aims to unveil the seminal role of Petri nets in BPM.
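To make the "token-based semantics" mentioned above concrete, here is a minimal sketch (not code from the paper): a marking is a multiset of tokens over places, and a transition fires only when its input places hold enough tokens; the workflow places and the 'review' transition are invented for the example.

```python
from collections import Counter

def enabled(marking, transition):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in transition["consume"].items())

def fire(marking, transition):
    """Firing consumes tokens from input places and produces tokens on output places."""
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    new_marking = Counter(marking)
    new_marking.subtract(transition["consume"])
    new_marking.update(transition["produce"])
    return +new_marking          # drop places whose token count fell to zero

# Toy workflow net: a 'review' task moves a case token from 'submitted' to 'reviewed'.
marking = Counter({"submitted": 1})
review = {"consume": {"submitted": 1}, "produce": {"reviewed": 1}}
print(fire(marking, review))     # Counter({'reviewed': 1})
```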
Abstract:
In this thesis the use of the Bayesian approach to statistical inference in fisheries stock assessment is studied. The work was conducted in collaboration with the Finnish Game and Fisheries Research Institute, using the monitoring and prediction of the juvenile salmon population in the River Tornionjoki as an example application. The River Tornionjoki is the largest salmon river flowing into the Baltic Sea. This thesis tackles the issues of model formulation and model checking as well as computational problems related to Bayesian modelling in the context of fisheries stock assessment. Each article of the thesis provides a novel method either for extracting information from data obtained via a particular type of sampling system or for integrating the information about the fish stock from multiple sources in terms of a population dynamics model. Mark-recapture and removal sampling schemes and a random catch sampling method are covered for the estimation of the population size. In addition, a method for estimating the stock composition of a salmon catch from DNA samples is presented. For most of the articles, Markov chain Monte Carlo (MCMC) simulation has been used as a tool to approximate the posterior distribution. Problems arising from the sampling method are also briefly discussed and potential solutions are proposed. Special emphasis in the discussion is given to the philosophical foundation of the Bayesian approach in the context of fisheries stock assessment. It is argued that the role of subjective prior knowledge, needed in practically all parts of a Bayesian model, should be recognized and consequently fully utilised in the process of model formulation.
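As a hedged illustration of the kind of inference the thesis discusses (not any of its actual models), the sketch below estimates a population size from invented mark-recapture counts with a simple random-walk Metropolis sampler; the data, prior, and tuning constants are all assumptions made for the example.

```python
import math, random

# Toy mark-recapture data (made up for illustration): n1 fish marked and released,
# n2 caught in a second sample, of which m carried marks.
n1, n2, m = 200, 150, 18

def log_post(N):
    """Log-posterior of the population size N: Binomial likelihood for the number
    of marked fish in the second sample (recapture probability n1/N), under a
    flat prior on all integer N consistent with the data."""
    if N < max(n1, n2) or N < n1 + n2 - m:
        return -math.inf
    p = n1 / N
    return m * math.log(p) + (n2 - m) * math.log(1.0 - p)

# Random-walk Metropolis over the integer-valued N.
random.seed(1)
N, samples = 1500, []
for it in range(20000):
    prop = N + random.randint(-50, 50)
    if math.log(random.random()) < log_post(prop) - log_post(N):
        N = prop
    if it >= 5000:                  # discard burn-in
        samples.append(N)

print("posterior mean of N:", sum(samples) / len(samples))
```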
Abstract:
Formal specification is vital to the development of distributed real-time systems, as these systems are inherently complex and safety-critical. It is widely acknowledged that formal specification and automatic analysis of specifications can significantly increase system reliability. Although a number of specification techniques for real-time systems have been reported in the literature, most of these formalisms do not adequately address the constraints that the aspects of 'distribution' and 'real-time' impose on specifications. Further, an automatic verification tool is necessary to reduce human errors in the reasoning process. In this regard, this paper is an attempt towards the development of DL, a novel executable specification language for distributed real-time systems. First, we give a precise characterization of the syntax and semantics of DL. Subsequently, we discuss the problems of model checking, automatic verification of the satisfiability of DL specifications, and testing the conformance of event traces with DL specifications. Effective solutions to these problems are presented as extensions of the classical first-order tableau algorithm. The use of the proposed framework is illustrated by specifying a sample problem.
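The solutions are presented as extensions of the classical tableau algorithm; as a much-simplified, hedged illustration of the underlying tableau idea only (propositional rather than the paper's first-order DL), the sketch below decides satisfiability by expanding conjunctions, branching on disjunctions, and closing branches that contain a contradictory pair of literals.

```python
def tableau_sat(formulas, literals=frozenset()):
    """Propositional tableau over formulas in negation normal form: atoms are
    strings, negated atoms are ('not', atom), plus ('and', f, g) and ('or', f, g)."""
    if not formulas:
        return True                                  # open branch: satisfiable
    f, rest = formulas[0], formulas[1:]
    if isinstance(f, str) or f[0] == 'not':          # literal
        negation = ('not', f) if isinstance(f, str) else f[1]
        if negation in literals:
            return False                             # branch closes on a contradiction
        return tableau_sat(rest, literals | {f})
    if f[0] == 'and':                                # alpha rule: add both conjuncts
        return tableau_sat((f[1], f[2]) + rest, literals)
    if f[0] == 'or':                                 # beta rule: branch on the disjuncts
        return (tableau_sat((f[1],) + rest, literals) or
                tableau_sat((f[2],) + rest, literals))
    raise ValueError(f"unknown connective: {f!r}")

# (p or q) and not p is satisfiable; p and not p is not.
print(tableau_sat((('and', ('or', 'p', 'q'), ('not', 'p')),)))   # True
print(tableau_sat((('and', 'p', ('not', 'p')),)))                # False
```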
Abstract:
Denial-of-service (DoS) attacks form a very important category of security threats prevalent in MIPv6 (Mobile Internet Protocol version 6) today. Many schemes have been proposed to alleviate such threats, including one of our own [9]. However, reasoning about the correctness of such protocols is not trivial. In addition, new solutions to mitigate attacks may need to be deployed in the network frequently, as and when attacks are detected, since it is practically impossible to anticipate all attacks and provide solutions in advance. This makes it necessary to validate the solutions in a timely manner before deployment in the real network. However, the threshold schemes needed in group protocols make analysis complex, and model checking of threshold-based group protocols that employ cryptography has not been successful so far. Here, we propose a new simulation-based approach for validation using a tool called FRAMOGR that supports executable specification of group protocols that use cryptography. FRAMOGR allows one to specify attackers and track probability distributions of values or paths. We believe that infrastructure such as FRAMOGR will be required in the future for validating new group-based threshold protocols that may be needed to make MIPv6 more robust.
Abstract:
Cache analysis plays a very important role in obtaining precise Worst-Case Execution Time (WCET) estimates of programs for real-time systems. While Abstract Interpretation (AI) based approaches are almost universally used for cache analysis, they fail to exploit a unique feature of the problem: it is not necessary to find the guaranteed cache behavior that holds across all executions of a program; we only need the cache behavior along one particular program path, the path with the maximum execution time. In this work, we introduce the concept of cache miss paths, which allows us to use worst-case path information to improve the precision of AI-based cache analysis. We use Abstract Interpretation to determine the cache miss paths and then integrate them into the IPET formulation. An added advantage is that this further allows us to use infeasible-path information for cache analysis. Experimentally, our approach gives more precise WCETs than AI-based cache analysis alone, and we also provide techniques to trade off analysis time against precision to provide scalability.
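IPET casts WCET estimation as an integer linear program: maximize the sum of per-block costs weighted by execution counts, subject to control-flow conservation constraints, with cache analysis contributing the hit/miss costs. The sketch below is a hedged toy version of that formulation (not the paper's analysis) for a diamond-shaped CFG, written with the PuLP library; the block names, base costs, and the miss penalty charged on one branch are invented.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

# Toy diamond CFG: entry -> A -> (B | C) -> D -> exit, executed once.
blocks = {"A": 10, "B": 25, "C": 5, "D": 10}   # base cost per block (cycles, made up)
miss_penalty = {"C": 30}                       # cache miss cost charged only if C executes

x = {b: LpVariable(f"x_{b}", lowBound=0, cat="Integer") for b in blocks}

prob = LpProblem("toy_ipet_wcet", LpMaximize)
prob += lpSum((blocks[b] + miss_penalty.get(b, 0)) * x[b] for b in blocks)

prob += x["A"] == 1                 # the program is entered once
prob += x["B"] + x["C"] == x["A"]   # flow out of A splits over the two branches
prob += x["D"] == x["B"] + x["C"]   # flow merges back at D

prob.solve()
print("WCET bound:", value(prob.objective))    # 55 cycles via the C branch (5 + 30 > 25)
```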
Abstract:
Piezoelectric actuators are mounted on both sides of a rectangular wing model, and the possibility of improving aircraft rolling power is investigated. All parts of the experiment, including the design of the wind tunnel model, the checking of the material constants, the measurement of the natural frequencies, and the checking of the actuator effects, ensure the correctness and precision of the finite element model. The wind tunnel results show that the calculations agree with the experiments, and the feasibility of the fictitious control surface is validated.
Abstract:
In the past, many different methodologies have been devised to support software development, and different sets of methodologies have been developed to support the analysis of software artefacts. We have identified this mismatch as one of the causes of the poor reliability of embedded systems software. The issue with software development styles is that they are "analysis-agnostic": they do not try to structure the code in a way that lends itself to analysis. Analysis is usually applied post mortem, after the software has been developed, and it requires a large amount of effort. The issue with software analysis methodologies is that they do not exploit available information about the system being analyzed.
In this thesis we address the above issues by developing a new methodology, called "analysis-aware" design, that links software development styles with the capabilities of analysis tools. This methodology forms the basis of a framework for interactive software development. The framework consists of an executable specification language and a set of analysis tools based on static analysis, testing, and model checking. The language enforces an analysis-friendly code structure and offers primitives that allow users to implement their own testers and model checkers directly in the language. We introduce a new approach to static analysis that takes advantage of the capabilities of a rule-based engine. We have applied the analysis-aware methodology to the development of a smart home application.
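As a hedged sketch of the kind of model checking primitive such a framework might expose to users (not the thesis's actual language), the code below runs an explicit-state reachability check over a user-supplied transition function and returns a counterexample trace; the smart-home-style state machine is invented for the example.

```python
from collections import deque

def check_reachable(initial, successors, is_error):
    """Explicit-state reachability check: breadth-first search over the state
    space induced by `successors`, returning a counterexample trace to the
    first state satisfying `is_error`, or None if no such state is reachable."""
    frontier = deque([(initial, (initial,))])
    visited = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if is_error(state):
            return trace
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + (nxt,)))
    return None

# Invented smart-home-style example: a heater must never be 'on' while the window is 'open'.
def successors(state):
    heater, window = state
    yield ("on" if heater == "off" else "off", window)          # toggle heater
    yield (heater, "open" if window == "closed" else "closed")  # toggle window

trace = check_reachable(("off", "closed"), successors,
                        lambda s: s == ("on", "open"))
print(trace)   # a counterexample trace, since nothing forbids the bad interleaving
```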
Abstract:
The State Key Laboratory of Computer Science (SKLCS) is committed to basic research in computer science and software engineering. The research topics of the laboratory include: concurrency theory, theory and algorithms for real-time systems, formal specifications based on context-free grammars, semantics of programming languages, model checking, automated reasoning, logic programming, software testing, software process improvement, middleware technology, parallel algorithms and parallel software, computer graphics and human-computer interaction. This paper describes these topics in some detail and summarizes some results obtained in recent years.
Abstract:
A parameterized concurrent system in fact comprises a family of concurrent system instances, in which one (or more) parameters denote the size of each instance, for example the number of concurrently executing processes or the size of the data domain. The task of parameterized model checking is to verify that, for any value of the parameters, every system instance satisfies the desired property. This thesis studies methods and techniques for model checking parameterized concurrent systems. In general, the parameters of such a system may represent not only the number of processes but also the capacity of communication buffers, the size of data domains, the width of data paths, and so on. To date, however, the vast majority of research in this area has targeted the first kind of parameter and has tended to ignore data-related parameters. In this thesis we propose a compositional modelling framework for data-parameterized network protocols based on symbolic transition graphs. A system is represented as the parallel composition of sequentially structured "node processes" that communicate through "channel processes"; properties are expressed in a first-order $\mu$-calculus in the form of nested predicate equation systems. We extend accordingly the value-passing process model checking tool developed in our laboratory, obtaining an effective verification method for concurrent systems parameterized in their data, and carry out case studies on several typical parameterized systems. Unexpectedly, although the cache coherence of the GERMAN2004 protocol had already been proved correct by other researchers, we found a serious error in its data consistency and corrected it. Furthermore, applying the "data independence" technique implemented in our tool, we generalized this result and proved that the revised GERMAN protocol satisfies the data consistency requirement for data domains of arbitrary size. Environment abstraction carries the idea of counting abstraction over into predicate abstraction and provides a general verification framework for control-parameterized concurrent systems. However, the method has rather high complexity, and for many systems of practical interest the constructed abstract system is still beyond the reach of common model checking tools. To overcome this difficulty we propose a state clustering method: on the basis of local reachability analysis, all reachable local states are grouped into a small number of state clusters by analysing the system expression. The heuristic clustering algorithm adopts a split-then-merge strategy: after representative state configurations of the different classes have been identified, similar states are merged as far as possible into the same cluster. The local states are thus partitioned into equivalence classes, each represented by a single unit state, and the corresponding abstract system is then constructed directly at the level of the system description by program rewriting. Case studies show that state clustering typically reduces the size of abstract nodes by about three orders of magnitude. Parameter abstraction is another effective method for verifying safety properties of parameterized systems; its main difficulty lies in finding suitable auxiliary invariants for guard strengthening to refine the abstract model. For this problem we propose a heuristic method that generates auxiliary invariants automatically by computation in a reference model. Using this technique, we apply the idea of parameter cutoffs to the parameter-typed complex variables in environment abstraction, so that environment abstraction is further optimized after state clustering. By effectively combining parameter abstraction with environment abstraction we significantly reduce the state space of the abstract system and hence the memory cost of model checking; and since every step of the abstraction is assisted by existing model checking tools, the whole verification process can be carried out mechanically with little manual intervention. As a demonstration, for the complex, industrially deployed cache coherence protocol FLASH we verified both cache coherence and data consistency while preserving its well-known "three-hop" transactions. To the best of our knowledge, this is the first time that an industrial protocol of this complexity has been verified at this level of precision in both its control and its data aspects.
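Environment abstraction, as noted above, borrows from counting abstraction. As a hedged, much-simplified illustration of counting abstraction only (not the thesis's method), the sketch below abstracts an invented parameterized mutual-exclusion protocol by counting, with saturation at "two or more", how many processes occupy each local state, and then explores the finite abstract state space to check that at most one process is ever in the critical section.

```python
TOP = 2                      # counter value 2 stands for "two or more"

def inc(c):
    return min(c + 1, TOP)

def dec(c):
    # Decrementing a saturated counter may still leave "two or more" processes,
    # so both 1 and TOP are possible abstract results (over-approximation).
    return (1, TOP) if c == TOP else (c - 1,)

# Invented toy protocol over local states idle/waiting/critical and one lock:
# idle -> waiting (always); waiting -> critical if the lock is free, taking it;
# critical -> idle, releasing the lock.
RULES = [("idle", "waiting", None, None),
         ("waiting", "critical", "free", "taken"),
         ("critical", "idle", None, "free")]
STATES = ("idle", "waiting", "critical")

def successors(abstract):
    counts, lock = dict(zip(STATES, abstract[:3])), abstract[3]
    for src, dst, need, set_lock in RULES:
        if counts[src] == 0 or (need is not None and lock != need):
            continue
        for src_left in dec(counts[src]):
            new = dict(counts)
            new[src] = src_left
            new[dst] = inc(counts[dst])
            yield (new["idle"], new["waiting"], new["critical"],
                   lock if set_lock is None else set_lock)

# Explore the abstract state space for any number (>= 2) of processes; mutual
# exclusion fails if the 'critical' counter can ever reach TOP (two or more).
init = (TOP, 0, 0, "free")
frontier, seen = [init], {init}
while frontier:
    s = frontier.pop()
    assert s[2] < TOP, f"mutual exclusion violated in abstract state {s}"
    for t in successors(s):
        if t not in seen:
            seen.add(t)
            frontier.append(t)
print("mutual exclusion holds in the counter abstraction:", len(seen), "abstract states")
```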
Abstract:
Formal verification aims to prove or disprove, by means of precise analysis, explicit claims or properties of hardware or software systems. Formal verification methods can broadly be divided into two categories: model checking and theorem proving. Model checking consists of automatic checking mechanisms that exhaustively visit all states and transitions of a model; this is realized by direct or indirect state enumeration over a suitable abstract model, thereby proving the presence or absence of so-called "flawed" states. Theorem proving, in contrast, uses mathematical reasoning and logical deduction to prove the correctness of a system. This thesis is concerned with bounded model checking, one of the model checking methods, together with its improvements and applications. In recent years, bounded model checking based on satisfiability (SAT) solving has developed into an effective complement to BDD-based model checking. A. Biere et al. first proposed bounded model checking for linear temporal logic (LTL), and W. Penczek subsequently proposed bounded model checking for universal computation tree logic (ACTL). Since model checking has very high complexity, its efficiency has always been a major concern, and the first important contribution of this thesis lies in improving the efficiency of bounded model checking. We improve Penczek's bounded model checking method in two respects. The first improvement refines the function that computes the number of required paths, distinguishing the temporal operator EX from the other temporal operators so as to reduce the number of transition relations and variables needed in the encoding; the second improvement simplifies the encoding of formulas by using a unified path encoding. With these two improvements, the worst-case encoding complexity of formulas is effectively reduced. We have also implemented the improved method in the tool BMV. With real-time systems becoming ever more widely used, after improving the efficiency of bounded model checking for ACTL we turned to bounded model checking of real-time systems; how to apply bounded model checking efficiently to real-time systems constitutes the second important contribution of this thesis. When SAT-based bounded model checking encodes the model of a real-time system and the formulas describing its properties, the time variables and clock constraints must be encoded in Boolean form. Because time is continuous and nondeterministic, this Boolean encoding tends to be quite complex: it not only consumes a great deal of time but also hinders SAT solving, so even with the improved bounded model checking method the efficiency of the whole verification process remains low. For bounded model checking of real-time systems we therefore adopt an approach based on satisfiability modulo theories (SMT) to encode the system model and the property formulas. Since SMT can directly handle linear arithmetic expressions over integers or reals, we can represent time variables directly by integer or real variables and clock constraints by linear arithmetic expressions. Compared with SAT-based bounded model checking, SMT-based bounded model checking not only simplifies the encoding process for real-time systems but also greatly improves solving efficiency. On this basis, we apply the improvement ideas from the SAT-based bounded model checking of ACTL to the SMT-based bounded model checking of real-time systems, obtaining a considerable gain in solving efficiency over the unimproved method. Finally, we carried out a series of experimental comparisons of the SAT-based and SMT-based bounded model checking methods and their improved versions. The results show that the improved methods clearly outperform the original ones in both time and space efficiency, and that for real-time systems the method combining SMT with bounded model checking is much more efficient than methods that use neither.
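As a hedged illustration of the core of bounded model checking (not the encodings studied in the thesis), the sketch below unrolls an invented two-bit Boolean counter for K steps and poses a reachability query to a solver, here Z3 used purely over Booleans as a stand-in for a SAT solver.

```python
from z3 import Bool, Solver, And, Or, Not, Xor

# Hedged sketch: unroll a tiny Boolean transition system (a 2-bit counter,
# invented for illustration) for K steps and ask a bounded reachability query,
# leaving the search entirely to the solver.
K = 3
b1 = [Bool(f"b1_{i}") for i in range(K + 1)]   # high bit at step i
b0 = [Bool(f"b0_{i}") for i in range(K + 1)]   # low bit at step i

s = Solver()
s.add(Not(b1[0]), Not(b0[0]))                  # initial state 00
for i in range(K):
    # Transition relation of a binary counter: the low bit toggles every step,
    # the high bit toggles whenever the low bit was 1.
    s.add(b0[i + 1] == Not(b0[i]))
    s.add(b1[i + 1] == Xor(b1[i], b0[i]))

# Bounded reachability: is the state 11 reachable within K steps?
s.add(Or(*[And(b1[i], b0[i]) for i in range(K + 1)]))
print(s.check())                               # sat: 00 -> 01 -> 10 -> 11
```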
Abstract:
With the increasing degree of industrialization, verification methods for models have received more and more attention. The usual verification approaches are deductive verification and model checking; compared with deductive verification, model checking is more widely used in industry because its verification process is highly automated. Model checking has been troubled by the state explosion problem ever since the concept first took shape, and various techniques have been proposed to address it. The symbolic model checking method based on OBDDs proposed by McMillan alleviates the problem to some extent, and bounded model checking, the main topic of this thesis, has been shown by extensive practice to be a good complement to symbolic model checking. Because different problems have their own particularities, different models have been proposed to describe systems in different situations. Timed automata describe asynchronous systems well, and such systems are ubiquitous in real life, so it is necessary to study verification methods for this special class of models. Moreover, because timed automata carry real-valued clock variables, their state transition graphs are infinite. To verify them by model checking, the time variables must be preprocessed: the usual approach is to partition the clock regions or clock zones of the clocks into finitely many equivalence classes, thereby turning the timed automaton into a finite state transition graph. To avoid the Boolean encoding of variables and the preprocessing of clocks in bounded model checking of timed automata, this thesis presents a bounded model checking method for timed automata based on an SMT solver. The main advantage of this method is that neither the clocks of the timed automaton need to be preprocessed nor the variables of the model Boolean-encoded; the timed automaton is simply translated into logical formulas that an SMT tool can solve, and model checking is carried out by exploiting the efficiency of SMT solving. Experimental results show that for the verification of certain reachability properties this method has a definite efficiency advantage.
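To illustrate the point about avoiding clock preprocessing and Boolean encoding, here is a hedged sketch (not the thesis's translation) that unrolls a single-clock timed loop for K steps with Z3, representing the clock and the delays directly as real-valued SMT variables and the invariant and guard as linear constraints; the automaton and the bound are invented.

```python
from z3 import Real, Solver, And, Or, sat

# Invented model: in each step the clock x advances by a nonnegative delay while
# the location invariant x <= 5 holds; when the guard x >= 3 holds, the loop edge
# may be taken, resetting x to 0.
K = 4
x = [Real(f"x_{i}") for i in range(K + 1)]     # clock value at step i
d = [Real(f"d_{i}") for i in range(K)]         # time elapsed in step i

s = Solver()
s.add(x[0] == 0)
for i in range(K):
    s.add(d[i] >= 0)
    delayed = x[i] + d[i]
    s.add(delayed <= 5)                               # location invariant
    s.add(Or(x[i + 1] == delayed,                     # stay, only letting time pass
             And(delayed >= 3, x[i + 1] == 0)))       # take the edge and reset the clock

# Bounded reachability query: can the clock be observed at exactly 4.5?
s.add(Or(*[x[i] == 4.5 for i in range(K + 1)]))
res = s.check()
print(res)                     # sat: e.g. a delay of 4.5 in the first step
if res == sat:
    print(s.model())
```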
Abstract:
Automatic generation of diagnostic information is one of the basic features of model checking and is of great importance for analysis and debugging. This paper discusses the generation of diagnostic information in model checking of value-passing processes. Two structures for representing diagnostic information are introduced: proof graphs and examples. Two algorithms for constructing diagnostic information are proposed; the approach is to extract proof graphs and examples from the dependency information saved during the checking process, so that existing information can be reused and the amount of computation reduced. The algorithms have been implemented and tested on examples, and the experimental results show that the method is effective.
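As a hedged sketch of the idea of extracting an example from dependency information saved during checking (not the paper's algorithms), the code below records, during a reachability search, which state each new state was reached from, and afterwards reconstructs a witness trace from that saved map; the toy transition system is invented.

```python
def check_and_explain(initial, successors, is_error):
    """Reachability check that saves dependency information (a predecessor map)
    during the search and reconstructs a witness trace from it afterwards."""
    depends_on = {initial: None}                 # dependency info saved while checking
    frontier = [initial]
    error_state = None
    while frontier:
        state = frontier.pop()
        if is_error(state):
            error_state = state
            break
        for nxt in successors(state):
            if nxt not in depends_on:
                depends_on[nxt] = state          # nxt was reached because of `state`
                frontier.append(nxt)
    if error_state is None:
        return None                              # property holds; no example to show
    trace = []                                   # second pass: extract the example
    while error_state is not None:
        trace.append(error_state)
        error_state = depends_on[error_state]
    return list(reversed(trace))

# Invented toy check: starting from 1, doubling or adding 3, can we reach 11?
print(check_and_explain(1, lambda n: [n * 2, n + 3] if n < 11 else [],
                        lambda n: n == 11))      # [1, 4, 8, 11]
```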
Abstract:
Model checking is one of the most successful automatic verification techniques of the past two decades, and the development of model checking tools is the key to bringing model checking into practical use. In order to model check concurrent value-passing systems involving complex data types effectively, this paper summarizes our work on implementing a model checking tool that uses extended symbolic transition graphs with assignment as the semantic model of concurrent systems and modal graphs as the semantic model of logic formulas; in particular, complex data structures are introduced into the value-passing process definition language and into the symbolic transition graphs with assignment. The effectiveness of the tool is illustrated with concrete examples.