987 results for Event-B


Relevance:

100.00%

Publisher:

Abstract:

Event-B is a formal method for modeling and verification of discrete transition systems. An Event-B development yields proof obligations that must be discharged (i.e., proved valid) in order to keep the produced models consistent. Satisfiability Modulo Theories (SMT) solvers are automated theorem provers used to verify the satisfiability of logic formulas with respect to a background theory (or combination of theories). SMT solvers not only handle large first-order formulas, but can also generate models and proofs, as well as identify unsatisfiable subsets of hypotheses (unsat cores). Tool support for Event-B is provided by the Rodin platform, an extensible Eclipse-based IDE that combines modeling and proving features. An SMT plug-in for Rodin has been developed to integrate alternative, efficient verification techniques into the platform. We implemented a series of complements to the SMT solver plug-in for Rodin, namely improvements to the user interface for the cases in which proof obligations are reported as invalid by the plug-in. Additionally, we modified some of the plug-in's features, such as support for proof generation and unsat-core extraction, to comply with the SMT-LIB standard for SMT solvers. We ran tests on applicable proof obligations to demonstrate the new features. The contributions described can potentially improve productivity.
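
As an illustration of the solver-side capabilities listed above (satisfiability checking, model generation, unsat-core extraction), here is a minimal sketch using the z3 Python API. It is illustrative only: the Rodin plug-in communicates with solvers through the SMT-LIB text format, not through this API.

```python
# Minimal illustration of the SMT capabilities named above: satisfiability
# checking, model generation, and unsat-core extraction, via the z3 Python API.
from z3 import Ints, Solver, Bool, Implies, sat, unsat

x, y = Ints("x y")

# A satisfiable set of constraints: the solver returns a model (a witness).
s = Solver()
s.add(x > 0, y > x)
if s.check() == sat:
    print("model:", s.model())            # e.g. [x = 1, y = 2]

# An unsatisfiable set: tracking literals let us extract an unsat core,
# i.e. a subset of hypotheses that is already contradictory on its own.
s = Solver()
h1, h2, h3 = Bool("h1"), Bool("h2"), Bool("h3")
s.add(Implies(h1, x > 5))
s.add(Implies(h2, x < 3))
s.add(Implies(h3, y == 0))                # irrelevant hypothesis
if s.check(h1, h2, h3) == unsat:
    print("unsat core:", s.unsat_core())  # [h1, h2] -- h3 is not needed
```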

Relevance:

60.00%

Publisher:

Abstract:

Background and Aim: Patients with gastric carcinomas have a poor prognosis and low survival rates. The aim of the present paper was to characterize cellular and molecular properties to provide insight into aspects of tumor progression in early compared with advanced gastric cancers. Methods: One hundred and nine graded gastric carcinomas (early or advanced stage, undifferentiated or differentiated type) with paired non-cancer tissue were studied to define the correlation between apoptosis (morphology, terminal deoxyribonucleotidyl transferase-mediated dUTP-digoxigenin nick end-labeling), cell proliferation (Ki-67 expression, morphology) and the expression and localization of two proteins whose expression is frequently altered in cancers, namely p53 and c-myc. Results: Overall, apoptosis was lower in early-stage differentiated and undifferentiated gastric carcinomas than in advanced-stage cancers. Cell proliferation was comparatively high at all stages, as was p53 positivity. Only the early- and advanced-stage undifferentiated cancers that were p53 positive had a significantly higher level of apoptosis (P < 0.05). Cell proliferation was significantly greater (P < 0.05) only in the early undifferentiated cancers that had either c-myc or p53 positivity. Conclusions: The results indicate that low apoptosis and high cell proliferation combine to drive gastric cancer development. The molecular controls for the high cell proliferation of early-stage undifferentiated gastric cancers involve overexpression of both p53 and c-myc. Overexpression of p53 may also control cancer development in that its expression is associated with higher levels of apoptosis in early- and late-stage undifferentiated cancers. (C) 2002 Blackwell Publishing Asia Pty Ltd.

Relevance:

60.00%

Publisher:

Abstract:

Mouse mammary tumor virus (MMTV) infects B lymphocytes and expresses a superantigen on the cell surface after integration of its reverse-transcribed genome. Superantigen-dependent B- and T-cell activation becomes detectable 2 to 3 days after infection. We show here that before this event, B cells undergo a polyclonal activation which does not involve massive proliferation. This first phase of B-cell activation is T cell independent. Moreover, during the first phase of activation, when only a small fraction of B cells is infected by MMTV(SW), viral DNA is detected only in activated B cells. Such a B-cell activation is also seen after injection of murine leukemia virus but not after injection of vaccinia virus, despite the very similar kinetics and intensity of the immune response. Since retroviruses require activated target cells to induce efficient infection, these data suggest that the early polyclonal retrovirus-induced target cell activation might play an important role in the establishment of retroviral infections.

Relevance:

60.00%

Publisher:

Abstract:

Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest-precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used involves more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Owing to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We move from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case-study model in parallel using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
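
To make the execution model concrete, a guarded command can be read as a (guard, action) pair, with the default semantics repeatedly firing one nondeterministically chosen enabled command until none is enabled. The toy interpreter below sketches that semantics; it is not the thesis's Action Systems or Event-B notation, and the example state and commands are invented.

```python
# Toy interpreter for the guarded-command execution model described above:
# repeatedly pick, at random (modelling nondeterminism), one command whose
# guard holds in the current state and apply its action; stop when no guard
# is enabled.
import random

def run(state, commands):
    while True:
        enabled = [cmd for cmd in commands if cmd["guard"](state)]
        if not enabled:                   # no guard holds: execution terminates
            return state
        random.choice(enabled)["action"](state)

# Example: two counters converging; each command fires only while its guard holds.
state = {"x": 0, "y": 10}
commands = [
    {"guard": lambda s: s["x"] < 5, "action": lambda s: s.update(x=s["x"] + 1)},
    {"guard": lambda s: s["y"] > 5, "action": lambda s: s.update(y=s["y"] - 1)},
]
print(run(state, commands))               # {'x': 5, 'y': 5}
```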

Relevance:

60.00%

Publisher:

Abstract:

Nowadays, computer-based systems tend to become more complex and control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment. Therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, various industrial standards prescribe the use of rigorous techniques for the development and verification of such systems: the more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on it should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process for critical computer-based systems remains insufficiently defined, for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements have to be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for integrating formal verification results into safety cases. This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, the tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
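
The central Event-B proof obligation behind demonstrating satisfaction of safety requirements is invariant preservation: every event, fired from any state satisfying the invariant, must re-establish it. The sketch below illustrates that obligation by brute force on an invented miniature machine; Rodin instead discharges it by theorem proving, which also covers unbounded state spaces.

```python
# Sketch of the invariant-preservation obligation (INV): for every state
# satisfying the invariant and every enabled event, the successor state must
# satisfy the invariant too. The tank-level machine below is a made-up toy.
LIMIT = 3

def invariant(level):
    return 0 <= level <= LIMIT            # the safety requirement

events = {
    "fill":  (lambda lv: lv < LIMIT, lambda lv: lv + 1),   # (guard, action)
    "drain": (lambda lv: lv > 0,     lambda lv: lv - 1),
}

# Exhaustive check over a finite state space -- feasible only for toys.
for level in range(-1, LIMIT + 2):
    if not invariant(level):
        continue                          # the obligation assumes the invariant
    for name, (guard, action) in events.items():
        if guard(level):
            assert invariant(action(level)), f"{name} violates INV at {level}"
print("all invariant-preservation obligations hold")
```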

Relevance:

60.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of different natures, whether due to failures of system components or to varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error-detection capabilities to recognise changes, and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. The framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework needs to be integrated in nature: to ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support: the Rodin platform. Proof-based verification, together with the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, achieving resilience also requires analysing a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. This integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from the robotics, space, healthcare and cloud computing domains.
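
As a taste of the quantitative side mentioned above, here is a minimal discrete-event simulation in SimPy estimating the availability of a single component that fails and is repaired. The failure and repair parameters are invented for illustration and are unrelated to the thesis's case studies.

```python
# Minimal SimPy sketch of a quantitative resilience assessment: a component
# alternates between operation and repair, and we estimate its steady-state
# availability. MTTF/MTTR below are invented illustration parameters.
import random
import simpy

MTTF, MTTR, HORIZON = 100.0, 5.0, 100_000.0
uptime = 0.0

def component(env):
    global uptime
    while True:
        t = random.expovariate(1 / MTTF)                 # time until next failure
        yield env.timeout(t)                             # ... operating
        uptime += t
        yield env.timeout(random.expovariate(1 / MTTR))  # ... under repair

env = simpy.Environment()
env.process(component(env))
env.run(until=HORIZON)
# Expected analytically: MTTF / (MTTF + MTTR) = 100/105 ~ 0.952
print(f"estimated availability: {uptime / env.now:.3f}")
```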

Relevance:

60.00%

Publisher:

Abstract:

Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage indicates high interest in their future enhancement and enrichment. By looking at these systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software that allow us to prove its correctness together with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which the different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node is separated from the modelling of the network itself. We also show how the piece-selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content-transfer algorithms, and show that having such a language can make these algorithms easier to understand. We also show how generating Event-B code from this language can result in less complexity than creating the models from written specifications. We also consider the decoding part of a media distribution system by showing how video decoding can be done in parallel. This is based on formally defined dependencies between frames and blocks in a video sequence; we have shown that this step, too, can be performed in a way that is mathematically proven correct. Our modelling and proving in this thesis is, for the most part, tool-based. This demonstrates the maturity and increased reliability of formal methods, and thus advocates their wider use in the future.
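
Adapting rarest-first piece selection to on-demand streaming typically means restricting the choice to a window ahead of the playback position, trading swarm health against in-order delivery. The sketch below illustrates one such windowed policy; it is a hypothetical illustration, not the algorithm modelled in the thesis.

```python
# Sketch of a windowed piece-selection policy: choose the rarest missing piece
# inside a window ahead of the playback position, so rarest-first (good for
# swarm health) is reconciled with in-order delivery (needed for streaming).
def select_piece(have, availability, playback_pos, window=8):
    """have: set of piece indices we own; availability[i]: peers holding piece i."""
    candidates = [
        i for i in range(playback_pos, playback_pos + window)
        if i < len(availability) and i not in have
    ]
    if not candidates:
        return None                          # window fully downloaded
    # Rarest first within the window; nearer pieces win ties.
    return min(candidates, key=lambda i: (availability[i], i))

availability = [5, 1, 4, 2, 9, 1, 3, 7, 2, 6]     # invented per-piece peer counts
print(select_piece(have={0, 2}, availability=availability, playback_pos=0))  # -> 1
```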

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

This doctoral thesis falls under the agreement between ARPA-SIMC (the funding body), the Regional Civil Protection Agency, and the Department of Earth and Geological-Environmental Sciences of the University of Bologna. Its main objective is to determine possible rainfall thresholds for the triggering of landslides in Emilia-Romagna that can serve as a forecasting support tool in the Civil Protection operations room. In such a complex geological context, a traditional empirical approach is not sufficient to discriminate unambiguously between triggering and non-triggering meteorological events, and in general the data distribution is too scattered to draw a statistically significant threshold. It was therefore decided to apply a rigorous Bayesian statistical approach, which is innovative in that it computes the probability of a landslide given a certain rainfall event, P(A|B), considering not only the rainfalls that triggered landslides (i.e., the conditional probability of a given rainfall event given the occurrence of a landslide, P(B|A)) but also the non-triggering rainfalls (which enter through the marginal probability of a rainfall event, P(B)). The Bayesian approach was applied to the time interval from 1939 to 2009. The resulting probability isolines minimise false alarms and are easy to implement in a regional warning system, but they may have limited forecasting power for phenomena that are not represented in the historical dataset or that occur under anomalous conditions. Examples are shallow landslides evolving into debris flows, extremely rare in the last 70 years but recently increasing in frequency. We attempted to address this problem by testing the predictive variability of several physically based models specifically developed for this purpose, including X-SLIP (Montrasio et al., 1998), SHALSTAB (SHALlow STABility model, Montgomery & Dietrich, 1994), Iverson (2000), TRIGRS 1.0 (Baum et al., 2002), and TRIGRS 2.0 (Baum et al., 2008).
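
The core of the approach is Bayes' rule applied to empirical rainfall records. The sketch below shows the computation on invented data; the thesis applies it to the 70 years of regional records mentioned above.

```python
# Empirical Bayesian landslide probability, following
# P(A|B) = P(B|A) * P(A) / P(B) with A = landslide, B = rainfall class.
# The rainfall records below are hypothetical illustration data.
from collections import Counter

# Each record: (rainfall class, did a landslide occur?)
records = [
    ("low", False), ("low", False), ("low", False), ("low", True),
    ("high", True), ("high", True), ("high", False), ("high", True),
]

n = len(records)
p_a = sum(1 for _, slide in records if slide) / n             # P(A): prior of landslide
class_counts = Counter(cls for cls, _ in records)             # for P(B) per class
slide_counts = Counter(cls for cls, slide in records if slide)  # for P(B|A)

for cls in class_counts:
    p_b_given_a = slide_counts[cls] / (p_a * n)               # P(B|A)
    p_b = class_counts[cls] / n                               # P(B)
    posterior = p_b_given_a * p_a / p_b                       # Bayes' rule
    print(f"P(landslide | {cls} rainfall) = {posterior:.2f}")  # high: 0.75, low: 0.25
```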

Relevance:

60.00%

Publisher:

Abstract:

During Ocean Drilling Program (ODP) Leg 189, five sites were drilled in the Tasmanian Seaway with the objective to constrain the paleoceanographic implications of the separation of Australia from Antarctica and to elucidate the paleoceanographic developments throughout the Neogene (Shipboard Scientific Party, 2001a, doi:10.2973/odp.proc.ir.189.101.2001). Sediments ranged from Cretaceous to Quaternary in age and provided the opportunity to describe the paleoenvironments in the Tasman Seaway prior to, during, and after the separation of Australia and Antarctica. This study focuses on the post-separation distribution of calcareous nannofossils through the Miocene. Miocene sediments were recovered at all five Leg 189 sites, and four of these sites were studied in detail to determine the calcareous nannofossil biostratigraphy. Hole 1168A, located on the western Tasmanian margin, contains a fairly continuous Miocene record and could be easily zoned using the Okada and Bukry (1980, doi:10.1016/0377-8398(80)90016-X) zonation. Analysis of sediments from Hole 1169A, located on the western South Tasman Rise, was not included in this study, as the recovered sediments were highly disturbed and unsuitable for further analysis (Shipboard Scientific Party, 2001c, doi:10.2973/odp.proc.ir.189.104.2001). Holes 1170A, 1171A, and 1171C are located on the South Tasman Rise south of the modern Subtropical Front (STF). They revealed incomplete Miocene sequences intersected by early Miocene and late Miocene hiatuses and could only be roughly zoned using the Okada and Bukry zonation. Similarly, Hole 1172A, located on the East Tasman Plateau, contains a Miocene sequence with a hiatus in the early Miocene and in the late Miocene and could only be roughly zoned using the Okada and Bukry zonation. This study aims to improve calcareous nannofossil biostratigraphic resolution in this sector of the mid to high southern latitudes. This paper presents the abundance, preservation, and stratigraphic distribution of calcareous nannofossils through the Miocene and focuses mainly on biozonal assignment.

Relevance:

60.00%

Publisher:

Abstract:

A major deterioration in global climate occurred through the Eocene-Oligocene time interval, characterized by long-term cooling in both terrestrial and marine environments. During this long-term cooling trend, however, recent studies have documented several short-lived warming and cooling phases. In order to further investigate high-latitude climate during these events, we developed a high-resolution calcareous nannofossil record from ODP Site 748 Hole B for the interval spanning the late middle Eocene to the late Oligocene (~42 to 26 Ma). The primary goals of this study were to construct a detailed biostratigraphic record and to use nannofossil assemblage variations to interpret short-term changes in surface-water temperature and nutrient conditions. The principal nannofossil assemblage variations are identified using a temperate-warm-water taxa index (Twwt), from which three warming and five cooling events are identified within the middle Eocene to earliest Oligocene interval. Among these climatic trends, the cooling event at ~39 Ma (Cooling Event B) is recorded here for the first time. Variations in fine-fraction δ18O values at Site 748 are associated with changes in the Twwt index, supporting the idea that significant short-term variability in surface-water conditions occurred in the Kerguelen Plateau area during the middle and late Eocene. Furthermore, ODP Site 748 calcareous nannofossil paleoecology confirms the utility of these microfossils for biostratigraphic, paleoclimatic, and paleoceanographic reconstructions at Southern Ocean sites during the Paleogene.
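
Read as the proportion of temperate-warm-water taxa in each assemblage count (an assumption about the index's exact definition), a Twwt index can be computed as sketched below; the taxon grouping and counts are invented for illustration.

```python
# Sketch of a temperate-warm-water taxa (Twwt) index per sample, read here as
# the share of temperate/warm-water taxa in the total nannofossil count.
# The exact definition is an assumption; taxon names and counts are invented.
WARM_TAXA = {"Discoaster", "Sphenolithus"}   # hypothetical warm-water group

def twwt_index(counts):
    total = sum(counts.values())
    warm = sum(n for taxon, n in counts.items() if taxon in WARM_TAXA)
    return warm / total if total else 0.0

sample = {"Discoaster": 12, "Sphenolithus": 8,
          "Chiasmolithus": 55, "Reticulofenestra": 25}
print(f"Twwt = {twwt_index(sample):.2f}")    # 0.20 -> cooler surface waters
```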

Relevance:

60.00%

Publisher:

Abstract:

Verifying the resistance to attacks of embedded Java Card bytecode verifier implementations is a complex task. Since current methods are not sufficiently effective, only manual test generation is possible. To automate this process, we propose a method called VTG (Vulnerability Test Generation). Based on a formal representation of the functional behaviours of the system under test, a set of intrusion tests is generated. The method draws on mutation testing and model-based testing techniques. First, the model is mutated according to rules we have defined in order to represent potential attacks. Tests are then extracted from the mutant models. Two Event-B models have been proposed. The first represents the structural constraints of Java Card application files; with it, the VTG generates hundreds of abstract tests in a few seconds. The second model consists of 66 events representing 61 Java Card instructions. Mutation takes a few seconds, and test extraction generates 223 tests in 45 minutes. Each test checks one precondition, or one combination of preconditions, of an instruction. This method allowed us to test various implementation mechanisms of Java Card bytecode verifiers. Although developed for our case study, the proposed method is generic and has been applied to other case studies.
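
The mutation step can be illustrated in miniature: negate the guard (precondition) of one event so that the mutant accepts exactly the states the original rejects, then extract, as an abstract intrusion test, a state on which the two disagree. The toy "verifier rule" below is invented; the thesis mutates full Event-B models.

```python
# Toy sketch of guard-negation mutation for vulnerability test generation:
# the mutant fires exactly when the original's precondition is violated,
# and each disagreeing state becomes an abstract intrusion test.
model = {
    # event name -> guard (the instruction's precondition on the operand stack)
    "iadd": lambda st: len(st) >= 2 and all(t == "int" for t in st[-2:]),
}

def mutate_negate_guard(model, event):
    """Return a mutant model in which `event` fires exactly when it should not."""
    original = model[event]
    return {**model, event: (lambda st: not original(st))}

mutant = mutate_negate_guard(model, "iadd")
for stack in ([], ["int"], ["ref", "int"], ["int", "int"]):
    # A state accepted by the mutant but rejected by the original reveals
    # whether an implementation actually enforces the precondition.
    if mutant["iadd"](stack) and not model["iadd"](stack):
        print("intrusion test: run iadd with operand stack", stack)
```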

Relevance:

40.00%

Publisher:

Abstract:

PURPOSE: Studies of diffuse large B-cell lymphoma (DLBCL) are typically evaluated by using a time-to-event approach with relapse, re-treatment, and death commonly used as the events. We evaluated the timing and type of events in newly diagnosed DLBCL and compared patient outcome with reference population data. PATIENTS AND METHODS: Patients with newly diagnosed DLBCL treated with immunochemotherapy were prospectively enrolled onto the University of Iowa/Mayo Clinic Specialized Program of Research Excellence Molecular Epidemiology Resource (MER) and the North Central Cancer Treatment Group NCCTG-N0489 clinical trial from 2002 to 2009. Patient outcomes were evaluated at diagnosis and in the subsets of patients achieving event-free status at 12 months (EFS12) and 24 months (EFS24) from diagnosis. Overall survival was compared with age- and sex-matched population data. Results were replicated in an external validation cohort from the Groupe d'Etude des Lymphomes de l'Adulte (GELA) Lymphome Non Hodgkinien 2003 (LNH2003) program and a registry based in Lyon, France. RESULTS: In all, 767 patients with newly diagnosed DLBCL who had a median age of 63 years were enrolled onto the MER and NCCTG studies. At a median follow-up of 60 months (range, 8 to 116 months), 299 patients had an event and 210 patients had died. Patients achieving EFS24 had an overall survival equivalent to that of the age- and sex-matched general population (standardized mortality ratio [SMR], 1.18; P = .25). This result was confirmed in 820 patients from the GELA study and registry in Lyon (SMR, 1.09; P = .71). Simulation studies showed that EFS24 has comparable power to continuous EFS when evaluating clinical trials in DLBCL. CONCLUSION: Patients with DLBCL who achieve EFS24 have a subsequent overall survival equivalent to that of the age- and sex-matched general population. EFS24 will be useful in patient counseling and should be considered as an end point for future studies of newly diagnosed DLBCL.
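
The standardized mortality ratio quoted above is the number of observed deaths divided by the deaths expected if the cohort experienced age- and sex-matched general-population mortality rates. A minimal sketch, with invented strata and rates:

```python
# Minimal sketch of the standardized mortality ratio (SMR): observed deaths in
# the cohort divided by deaths expected under age- and sex-matched
# general-population rates. Strata, person-years and rates are invented.
strata = [
    # (person-years at risk, matched population death rate per person-year)
    (1200.0, 0.010),   # e.g. males 60-69
    (800.0,  0.020),   # e.g. males 70-79
    (600.0,  0.015),   # e.g. females 60-69
]
observed_deaths = 42

expected = sum(py * rate for py, rate in strata)   # 12 + 16 + 9 = 37
smr = observed_deaths / expected
print(f"SMR = {smr:.2f}")   # 1.14 -> mortality close to the matched population
```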

Relevance:

40.00%

Publisher:

Abstract:

Oxygen isotope records of stalagmites from China and Oman reveal a weak summer monsoon event, with a double-plunging structure, that started 8.21 ± 0.02 kyr B.P. An identical but antiphased pattern is also evident in two stalagmite records from eastern Brazil, indicating that the South American Summer Monsoon was intensified during the 8.2 kyr B.P. event. These records demonstrate that the event was of global extent and synchronous within dating errors of <50 years. In comparison with recent model simulations, it is plausible that the 8.2 kyr B.P. event can be tied to changes in the Atlantic Meridional Overturning Circulation triggered by a glacial-lake drainage event. This, in turn, affected North Atlantic climate and the latitudinal position of the Intertropical Convergence Zone, resulting in the observed low-latitude monsoonal precipitation patterns.