946 results for Formal Verification Methods
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and can interpolate between dose grids for comparison. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries, such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.
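The gamma evaluation named among the comparison algorithms combines a dose-difference tolerance with a distance-to-agreement tolerance into a single per-point index (pass if gamma <= 1). A minimal 1-D sketch, assuming illustrative grids and 3%/3 mm-style tolerances; the implementations described in the abstract are 3-D and interpolate between grid points:

```python
import math

def gamma_index(ref, ev, spacing, dose_tol, dist_tol):
    """1-D global gamma evaluation: for each reference point, search all
    evaluated points for the minimum combined dose/distance penalty."""
    gammas = []
    for i, d_ref in enumerate(ref):
        best = math.inf
        for j, d_ev in enumerate(ev):
            dist = (i - j) * spacing          # mm between grid points
            dose_diff = d_ev - d_ref          # absolute dose difference
            g2 = (dist / dist_tol) ** 2 + (dose_diff / dose_tol) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

# Illustrative reference (TPS) and evaluated (MC) profiles, 1 mm spacing.
ref = [0.0, 0.5, 1.0, 1.0, 0.5, 0.0]
ev  = [0.0, 0.49, 1.01, 0.99, 0.51, 0.0]
g = gamma_index(ref, ev, spacing=1.0, dose_tol=0.03, dist_tol=3.0)
pass_rate = sum(1 for x in g if x <= 1.0) / len(g)
```

A clinical implementation would additionally normalise the dose criterion (global vs. local) and interpolate the evaluated distribution rather than compare grid points only.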
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
Abstract:
Application of "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A refined plastic hinge method suitable for practical advanced analysis of steel frame structures comprising non-compact sections is presented in a companion paper. The method implicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. The accuracy and precision of the method for the analysis of steel frames comprising non-compact sections are established in this paper by comparison with a comprehensive range of analytical benchmark frame solutions. The refined plastic hinge method is shown to be more accurate and precise than the conventional individual member design methods based on elastic analysis and specification equations.
Abstract:
Background: A recent study by Dhillon et al. [12] identified both angioinvasion and mTOR as prognostic biomarkers for poor survival in early stage NSCLC. The aim of this study was to verify the above study by examining the angioinvasion and mTOR expression profile in a cohort of early stage NSCLC patients and correlating the results with patient clinico-pathological data and survival. Methods: Angioinvasion was routinely recorded by the pathologist at the initial assessment of the tumor following resection. mTOR was evaluated in 141 early stage (IA-IIB) NSCLC patients (67 squamous; 60 adenocarcinoma; 14 others) using immunohistochemistry (IHC) analysis with an immunohistochemical score (IHS) calculated (% positive cells × staining intensity). Intensity was scored as follows: 0 (negative); 1+ (weak); 2+ (moderate); 3+ (strong). The range of scores was 0-300. Based on the previous study, a cut-off score of 30 was used to define positive versus negative patients. The impact of angioinvasion and mTOR expression on prognosis was then evaluated. Results: 101 of the 141 tumors studied expressed mTOR. There was no difference in mTOR expression between squamous cell carcinoma and adenocarcinoma. Angioinvasion (p = 0.024) and mTOR staining (p = 0.048) were significant univariate predictors of poor survival. Both remained significant after multivariate analysis (p = 0.037 and p = 0.020, respectively). Conclusions: Our findings verify angioinvasion and mTOR expression as new biomarkers for poor outcome in patients with early stage NSCLC. mTOR expressing patients may benefit from novel therapies targeting the mTOR survival pathway. © 2011 Elsevier Ireland Ltd.
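The IHS rule quoted above (% positive cells × staining intensity, range 0-300, cut-off 30) can be written down directly. A small sketch; treating a score exactly at the cut-off as positive is an assumption, since the abstract does not specify the boundary:

```python
def ihs(percent_positive, intensity):
    """Immunohistochemical score: % positive cells (0-100) multiplied by
    staining intensity (0 negative, 1 weak, 2 moderate, 3 strong).
    The product ranges from 0 to 300, matching the abstract."""
    assert 0 <= percent_positive <= 100 and intensity in (0, 1, 2, 3)
    return percent_positive * intensity

def mtor_positive(score, cutoff=30):
    """Dichotomise patients at the cut-off taken from the earlier study.
    Assumption: a score equal to the cut-off counts as positive; the
    abstract does not state how the boundary is handled."""
    return score >= cutoff
```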
Abstract:
This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-length utterance i-vectors vary with speaker, session variations, and the phonetic content of the utterance. Well established methods such as linear discriminant analysis (LDA), source-normalized LDA (SN-LDA) and within-class covariance normalisation (WCCN) exist for compensating the session variation, but we have identified the variability introduced by phonetic content due to utterance variation as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short i-vector speaker verification systems using cosine similarity scoring (CSS), we have introduced a short utterance variance normalization (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations and is shown to provide improvement in performance over the traditional approach of using LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced that uses probabilistic linear discriminant analysis (PLDA) to model the SUV directly. The combination of SUVN, LDA and SN-LDA followed by SUV PLDA modelling provides an improvement over the baseline PLDA approach. We also show that for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
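Cosine similarity scoring (CSS), the verification back-end named above, scores a trial by the cosine of the angle between the enrolment and test i-vectors after channel compensation. A minimal sketch; the vectors and decision threshold below are illustrative, not values from the paper:

```python
import math

def css(target, test):
    """Cosine similarity score between two (channel-compensated) i-vectors:
    dot product divided by the product of the vector norms."""
    dot = sum(a * b for a, b in zip(target, test))
    norm = (math.sqrt(sum(a * a for a in target))
            * math.sqrt(sum(b * b for b in test)))
    return dot / norm

# Illustrative 3-dimensional i-vectors (real systems use 400-600 dims).
enrol = [0.2, -0.4, 0.9]
trial = [0.25, -0.35, 0.85]
score = css(enrol, trial)
accept = score > 0.7   # threshold is illustrative; tuned on dev data in practice
```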
Abstract:
Introduction: Informal caring networks contribute significantly to end-of-life (EOL) care in the community. However, to ensure that these networks are sustainable, and unpaid carers are not exploited, primary carers need permission and practical assistance to gather networks together and negotiate the help they need. Our aim in this study was to develop an understanding of how formal and informal carers work together when care is being provided in a dying person's home. We were particularly interested in formal providers’ perceptions and knowledge of informal networks of care and in identifying barriers to the networks working together. Methods: Qualitative methods, informed by an interpretive approach, were used. In February-July 2012, 10 focus groups were conducted in urban, regional, and rural Australia comprising 88 participants. Findings: Our findings show that formal providers are aware, and supportive, of the vital role informal networks play in the care of the dying at home. A number of barriers to formal and informal networks working together more effectively were identified. In particular, we found that the Australian policy of health-promoting palliative care is not substantially translating into practice. Conclusion: Combinations of formal and informal caring networks are essential to support people and their primary carers. Formal service providers do little to establish, support, or maintain the informal networks, although there is much goodwill and scope for them to do so. Further re-orientation towards a health-promoting palliative care and community capacity building approach is suggested.
Abstract:
This paper makes a formal security analysis of the current Australian e-passport implementation using the model checking tools CASPER/CSP/FDR. We highlight security issues in the current implementation and identify new threats when an e-passport system is integrated with an automated processing system like SmartGate. The paper also provides a security analysis of the European Union (EU) proposal for Extended Access Control (EAC), which is intended to provide improved security in protecting biometric information of the e-passport bearer. The current e-passport specification fails to provide a list of adequate security goals that could be used for security evaluation. We fill this gap: we present a collection of security goals for the evaluation of e-passport protocols. Our analysis confirms previously identified security weaknesses and shows that both the Australian e-passport implementation and the EU proposal fail to address many security and privacy aspects that are paramount in implementing a secure border control mechanism. ACM Classification C.2.2 (Communication/Networking and Information Technology – Network Protocols – Model Checking), D.2.4 (Software Engineering – Software/Program Verification – Formal Methods), D.4.6 (Operating Systems – Security and Privacy Protection – Authentication)
Abstract:
Behavioral models capture operational principles of real-world or designed systems. Formally, each behavioral model defines the state space of a system, i.e., its states and the principles of state transitions. Such a model is the basis for analysis of the system’s properties. In practice, state spaces of systems are immense, which results in huge computational complexity for their analysis. Behavioral models are typically described as executable graphs, whose execution semantics encodes a state space. The structure theory of behavioral models studies the relations between the structure of a model and the properties of its state space. In this article, we use the connectivity property of graphs to achieve an efficient and extensive discovery of the compositional structure of behavioral models; models are stepwise decomposed into components with clear structural characteristics and inter-component relations. At each decomposition step, the discovered compositional structure of a model is used for reasoning on properties of the whole state space of the system. The approach is exemplified by means of a concrete behavioral model and verification criterion. That is, we analyze workflow nets, a well-established tool for modeling behavior of distributed systems, with respect to the soundness property, a basic correctness property of workflow nets. Stepwise verification allows the detection of violations of the soundness property by inspecting small portions of a model, thereby considerably reducing the amount of work to be done to perform soundness checks. Besides formal results, we also report on findings from applying our approach to an industry model collection.
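The state-space explosion that motivates the structural approach above is easy to reproduce: explicit enumeration of reachable states grows exponentially in general, which is why the article reasons on compositional structure instead. A toy sketch of explicit exploration for a two-transition workflow net; the net, marking encoding, and names are illustrative:

```python
from collections import deque

def explore(initial, successors):
    """Breadth-first construction of the state space induced by a
    behavioural model: `successors` maps a state to its successor states."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# A toy workflow net as a Petri net: a marking is a tuple of token counts
# over four places (i, p1, p2, o); each transition has a pre- and post-set.
TRANSITIONS = [
    ({0: 1}, {1: 1, 2: 1}),   # split: consume i, produce p1 and p2
    ({1: 1, 2: 1}, {3: 1}),   # join: consume p1 and p2, produce o
]

def fire(marking):
    """Yield every marking reachable by firing one enabled transition."""
    for pre, post in TRANSITIONS:
        if all(marking[p] >= n for p, n in pre.items()):
            m = list(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] += n
            yield tuple(m)

states = explore((1, 0, 0, 0), fire)   # one token on the initial place
```

Adding concurrent branches multiplies the number of interleaved markings, which is exactly the growth the decomposition-based analysis avoids enumerating.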
Abstract:
Formal representations of business processes are used for analysis of the process behavior. Workflow nets are a widely used formalism for describing the behavior of business processes. Structure theory of processes investigates the relation between the structure of a model and its behavior. In this paper, we propose to employ the connectivity property of workflow nets as an angle for their structural analysis. In particular, we show how soundness verification can be organized using the biconnected components of a workflow net. This allows for efficient identification and localization of flaws in the behavior of workflow nets and for supporting process analysts with diagnostic information.
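Biconnected components are delimited by cut vertices (articulation points), whose removal disconnects the graph. The sketch below is the standard low-link DFS for finding them, not the authors' soundness procedure; the graph and vertex names are invented for illustration:

```python
def articulation_points(adj):
    """Cut vertices of an undirected graph via a low-link DFS.
    These vertices delimit the biconnected components, which a
    decomposition-based analysis can then inspect one at a time."""
    disc, low, cuts = {}, {}, set()
    counter = [0]

    def dfs(u, parent):
        disc[u] = low[u] = counter[0]
        counter[0] += 1
        children = 0
        for v in adj[u]:
            if v not in disc:
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                # no back edge from v's subtree climbs above u
                if parent is not None and low[v] >= disc[u]:
                    cuts.add(u)
            elif v != parent:
                low[u] = min(low[u], disc[v])
        if parent is None and children > 1:
            cuts.add(u)   # a root is a cut vertex iff it has 2+ DFS children

    for u in adj:
        if u not in disc:
            dfs(u, None)
    return cuts

# Two triangles sharing vertex "c": "c" is the only articulation point.
adj = {
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d", "e"],
    "d": ["c", "e"], "e": ["c", "d"],
}
```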
Abstract:
Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as system specification and a corresponding workflow model used as implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. Still, these notions are exponential to compute and yield only a Boolean result. In many cases, however, a quantification of behavioural deviation is needed along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. In this article, we introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems if unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
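To illustrate the kinds of relations a behavioural profile captures, the sketch below derives order, exclusiveness, and interleaving over activity pairs from example traces. This is only a log-based analogy: the article computes profiles structurally from the model, not from logs, and the trace data here is invented:

```python
from itertools import combinations

def behavioural_profile(traces):
    """Approximate profile relations over activity pairs from traces:
    strict order, interleaving (both orders observed), or exclusiveness
    (the pair never co-occurs in any trace)."""
    follows = set()      # (a, b): a occurs before b in some trace
    alphabet = set()
    for trace in traces:
        alphabet.update(trace)
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                follows.add((a, b))
    profile = {}
    for a, b in combinations(sorted(alphabet), 2):
        ab, ba = (a, b) in follows, (b, a) in follows
        if ab and ba:
            profile[(a, b)] = "interleaving"
        elif ab:
            profile[(a, b)] = "strict order"
        elif ba:
            profile[(a, b)] = "reverse strict order"
        else:
            profile[(a, b)] = "exclusive"
    return profile

# "b" and "c" never co-occur, so the profile marks them exclusive.
log = [("a", "b", "d"), ("a", "c", "d")]
profile = behavioural_profile(log)
```

Comparing two such profiles pairwise gives a quantifiable, localisable notion of deviation, in contrast to the Boolean answer of trace equivalence.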
Abstract:
Study/Objective: This program of research examines the effectiveness of legal mechanisms as motivators to maximise engagement and compliance with evacuation messages. This study is based on the understanding that the presence of legislative requirements, as well as sanctions and incentives encapsulated in law, can have a positive impact in achieving compliance. Our objective is to examine whether the current Australian legal frameworks, which incorporate evacuation during disasters, are an effective structure that is properly understood by those who enforce and those who are required to comply. Background: In Australia, most jurisdictions have enacted legislation that encapsulates the power to evacuate and the ability to enforce compliance, either by the use of force or imposition of penalty. However, citizens still choose not to evacuate. Methods: This program of research incorporates theoretical and doctrinal methodologies for reviewing literature and legislation in the Australian context. The aim of the research is to determine whether further clarity is required to create an understanding of the powers to evacuate, as well as greater public awareness of these powers. Results & Conclusion: Legislators suggest that powers of evacuation can be ineffective if they are impractical to enforce. In Australia, there may also be confusion about which legislative instrument the power to evacuate derives from, and therefore whether there is a corresponding ability to enforce compliance through the use of force or imposition of a penalty. Equally, communities may lack awareness and understanding of the powers of agencies to enforce compliance. We seek to investigate whether this is the case, and whether, even if greater awareness existed, it would act as an incentive to comply.
Abstract:
Supervisory Control and Data Acquisition (SCADA) systems are one of the key foundations of smart grids. The Distributed Network Protocol version 3 (DNP3) is a standard SCADA protocol designed to facilitate communications in substations and smart grid nodes. The protocol is embedded with a security mechanism called Secure Authentication (DNP3-SA). This mechanism ensures that end-to-end communication security is provided in substations. This paper presents a formal model for the behavioural analysis of DNP3-SA using Coloured Petri Nets (CPN). Our DNP3-SA CPN model is capable of testing and verifying various attack scenarios (modification, replay, spoofing, and combined complex attacks) together with mitigation strategies. Using the model has revealed a previously unidentified flaw in the DNP3-SA protocol that can be exploited by an attacker with access to the network interconnecting DNP3 devices. An attacker can launch a successful attack on an outstation without possessing the pre-shared keys by replaying a previously authenticated command with arbitrary parameters. We propose an update to the DNP3-SA protocol that removes the flaw and prevents such attacks. The update is validated and verified using our CPN model, demonstrating the effectiveness of the model and the importance of formal protocol analysis.
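The class of replay weakness described above can be illustrated on a generic HMAC challenge-response exchange. This is emphatically not the DNP3-SA wire format: the key, commands, and message layout below are invented for illustration. Without a freshness check, a recorded challenge/response pair verifies again verbatim, while tampered parameters do not:

```python
import hashlib
import hmac
import os

KEY = b"pre-shared-session-key"   # illustrative, not a DNP3 key format

def respond(key, challenge, command):
    """Authenticate a critical command against a challenge with an HMAC
    over (challenge || command)."""
    return hmac.new(key, challenge + command, hashlib.sha256).digest()

def verify(key, challenge, command, mac):
    """Outstation-side check: recompute and compare in constant time."""
    return hmac.compare_digest(respond(key, challenge, command), mac)

# Normal exchange: the outstation issues a fresh random challenge.
challenge = os.urandom(16)
mac = respond(KEY, challenge, b"OPERATE relay 7")
assert verify(KEY, challenge, b"OPERATE relay 7", mac)

# A recorded (challenge, command, mac) triple verifies again verbatim:
# without tracking challenge freshness, the verifier cannot distinguish
# a replay from the original exchange.
assert verify(KEY, challenge, b"OPERATE relay 7", mac)

# Tampering with the command parameters, by contrast, fails verification.
assert not verify(KEY, challenge, b"OPERATE relay 9", mac)
```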
Abstract:
This paper describes the application of vector spaces over Galois fields for obtaining a formal description of a picture in the form of a very compact, non-redundant, unique syntactic code. Two different methods of encoding are described. Both methods begin by identifying the given picture as a matrix (called the picture matrix) over a finite field. In the first method, the eigenvalues and eigenvectors of this matrix are obtained. The eigenvector expansion theorem is then used to reconstruct the original matrix. If several of the eigenvalues happen to be zero, this scheme results in considerable compression. In the second method, the picture matrix is reduced to a primitive diagonal form (Hermite canonical form) by elementary row and column transformations. These sequences of elementary transformations constitute a unique and unambiguous syntactic code, called the Hermite code, for reconstructing the picture from the primitive diagonal matrix. Good compression of the picture results if the rank of the matrix is considerably lower than its order. An important aspect of this code is that it preserves the neighbourhood relations in the picture and the primitive remains invariant under translation, rotation, reflection, enlargement and replication. It is also possible to derive the codes for these transformed pictures from the Hermite code of the original picture by simple algebraic manipulation. This code will find extensive applications in picture compression, storage, retrieval, transmission and in designing pattern recognition and artificial intelligence systems.
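A simplified analogue of the second method, restricted to GF(2) rather than a general Galois field: reduce the 0/1 picture matrix to diagonal form by recorded elementary row and column operations, with the operation list playing the role of the syntactic code and the rank governing the achievable compression. Function names and the operation encoding are illustrative:

```python
def gf2_diagonalise(matrix):
    """Reduce a 0/1 matrix to diagonal form over GF(2), recording the
    elementary row/column operations; returns (diagonal, ops, rank)."""
    m = [row[:] for row in matrix]
    rows, cols = len(m), len(m[0])
    ops, rank = [], 0
    for k in range(min(rows, cols)):
        # find a pivot in the remaining submatrix at or after (k, k)
        pivot = next(((i, j) for i in range(k, rows) for j in range(k, cols)
                      if m[i][j]), None)
        if pivot is None:
            break
        i, j = pivot
        if i != k:
            m[k], m[i] = m[i], m[k]
            ops.append(("swap_rows", k, i))
        if j != k:
            for row in m:
                row[k], row[j] = row[j], row[k]
            ops.append(("swap_cols", k, j))
        for i in range(rows):                 # clear column k (XOR = add in GF(2))
            if i != k and m[i][k]:
                m[i] = [a ^ b for a, b in zip(m[i], m[k])]
                ops.append(("add_row", k, i))
        for j in range(k + 1, cols):          # clear the rest of row k
            if m[k][j]:
                for row in m:
                    row[j] ^= row[k]
                ops.append(("add_col", k, j))
        rank += 1
    return m, ops, rank

# A rank-1 "picture": the short operation list suffices to reconstruct it.
picture = [[1, 1, 0],
           [1, 1, 0],
           [0, 0, 0]]
diag, code, rank = gf2_diagonalise(picture)
```

Replaying the recorded operations in reverse on the diagonal matrix reconstructs the original, which is the sense in which the operation sequence acts as a code.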
Abstract:
In this research we modelled computer network devices to ensure their communication behaviours meet various network standards. By modelling devices as finite-state machines and examining their properties in a range of configurations, we discovered a flaw in a common network protocol and produced a technique to improve organisations' network security against data theft.
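A device's communication behaviour modelled as a finite-state machine can be as small as a transition table plus a driver. The states and inputs below are illustrative (loosely handshake-like); they are not the protocol in which the flaw was found:

```python
# Transition function as a table: (state, input) -> next state.
TRANSITIONS = {
    ("closed", "syn"): "half_open",
    ("half_open", "ack"): "open",
    ("half_open", "rst"): "closed",
    ("open", "fin"): "closed",
}

def run(fsm, start, inputs):
    """Drive the machine over an input sequence. Unknown inputs leave the
    state unchanged -- itself a modelling choice that a real analysis
    would need to scrutinise."""
    state = start
    for symbol in inputs:
        state = fsm.get((state, symbol), state)
    return state

# A property check then amounts to exploring input sequences, e.g. that
# a completed exchange returns the device to its initial state:
assert run(TRANSITIONS, "closed", ["syn", "ack", "fin"]) == "closed"
```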
Abstract:
This thesis evaluates the security of Supervisory Control and Data Acquisition (SCADA) systems, which are one of the key foundations of many critical infrastructures. Specifically, it examines one of the standardised SCADA protocols, the Distributed Network Protocol Version 3, which provides a security mechanism intended to ensure that messages transmitted between devices are adequately secured from rogue applications. To achieve this, the thesis applies formal methods from theoretical computer science to formally analyse the correctness of the protocol.