80 results for Formal Methods. Component-Based Development. Competition. Model Checking
Abstract:
Current physiologically based pharmacokinetic (PBPK) models are inductive. We present an additional, different approach that is based on the synthetic rather than the inductive approach to modeling and simulation. It relies on object-oriented programming. A model of the referent system in its experimental context is synthesized by assembling objects that represent components such as molecules, cells, aspects of tissue architecture, catheters, etc. The single-pass perfused rat liver has been well described in evaluating hepatic drug pharmacokinetics (PK) and is the system on which we focus. In silico experiments begin with administration of objects representing actual compounds. Data are collected in a manner analogous to that in the referent PK experiments. The synthetic modeling method allows for recognition and representation of discrete event and discrete time processes, as well as heterogeneity in organization, function, and spatial effects. An application is developed for sucrose and antipyrine, administered separately and together. PBPK modeling has made extensive progress in characterizing abstracted PK properties, but this has also been its limitation. Now, other important questions and possible extensions emerge. How are these PK properties and the observed behaviors generated? The inherent heuristic limitations of traditional models have hindered getting meaningful, detailed answers to such questions. Synthetic models of the type described here are specifically intended to help answer such questions. Analogous to wet-lab experimental models, they retain their applicability even when broken apart into sub-components. Having and applying this new class of models along with traditional PK modeling methods is expected to increase the productivity of pharmaceutical research at all levels that make use of modeling and simulation.
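As a flavour of the synthetic approach, the sketch below assembles discrete objects into a toy single-pass experiment; every class name and parameter here is hypothetical, invented for illustration rather than taken from the authors' implementation:

```python
# Illustrative sketch only: classes and parameters are hypothetical, not
# the authors' actual model.
import random

class Compound:
    def __init__(self, name, extraction_prob):
        self.name = name
        self.extraction_prob = extraction_prob  # chance of uptake per segment

class SinusoidSegment:
    """One discrete component of the perfused-liver analogue."""
    def transit(self, compound):
        # Discrete event: the compound is either extracted here or passes on.
        return random.random() < compound.extraction_prob

class Liver:
    def __init__(self, n_segments):
        self.segments = [SinusoidSegment() for _ in range(n_segments)]

    def single_pass(self, compound):
        for seg in self.segments:
            if seg.transit(compound):
                return False  # extracted; never reaches the outflow
        return True  # collected in the outflow, like an effluent sample

liver = Liver(n_segments=10)
sucrose = Compound("sucrose", extraction_prob=0.0)       # extracellular marker
antipyrine = Compound("antipyrine", extraction_prob=0.02)
for drug in (sucrose, antipyrine):
    recovered = sum(liver.single_pass(drug) for _ in range(10_000))
    print(f"{drug.name}: fraction recovered = {recovered / 10_000:.3f}")
```

Because the model is built from discrete components, sub-components (a segment, a cell analogue) can be interrogated or replaced individually, which is the property the abstract contrasts with inductive PBPK equations.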
Abstract:
Behaviour Trees are a novel approach to requirements engineering. The approach advocates a graphical tree notation that is easy to use and to understand. Individual requirements are modelled as single trees, which are later integrated into a model of the system as a whole. We develop a formal semantics for a subset of Behaviour Trees using CSP. This work, on the one hand, provides tool support for Behaviour Trees; on the other hand, it builds a front-end to a subset of the CSP notation and gives CSP users a new modelling strategy well suited to the challenges of requirements engineering.
Abstract:
Despite decades of research, the uptake of formal methods for developing provably correct software in industry remains slow. One reason for this is the high cost of proof construction, an activity that, due to the complexity of the required proofs, is typically carried out using interactive theorem provers. In this paper we propose an agent-oriented architecture for interactive theorem proving, with the aim of reducing the user interactions (and thus the cost) of constructing software verification proofs. We describe a prototype implementation of our architecture and discuss its application to a small but non-trivial case study.
Abstract:
This paper investigates how government policy directions embracing deregulation and market liberalism, together with significant pre-existing tensions within the Australian medical profession, produced ground-breaking change in the funding and delivery of medical education for general practitioners. From an initially shared view, between and within the medical profession and government, about the goal of improving the standards of general practice education and training, segments of the general practice community, particularly those located in rural and remote settings, displayed increasingly vocal concerns about the approach and solutions proffered by the predominantly urban-influenced Royal Australian College of General Practitioners (RACGP). The extent of dissatisfaction culminated in the establishment of the Australian College of Rural and Remote Medicine (ACRRM) in 1997 and the development of an alternative curriculum for general practice. This paper focuses on two decades of changes in general practice training and how competition policy acted as a justificatory mechanism for putting general practice education out to competitive tender against a background of significant intra-professional conflict. The government's interest in increasing efficiency and deregulating the 'closed shop' practices of professions, as expressed through national competition policy, ultimately exposed the existing antagonisms within the profession to public view and allowed the government some influence over the sacred cow of professional training. Government policy has acted as a mechanism of resolution for long-standing grievances of rural GPs and propelled professional training towards an open competition model. The findings have implications for future research looking at the unanticipated outcomes of competition and internal markets.
Abstract:
This paper describes a formal component language, used to support automated component-based program development. The components, referred to as templates, are machine-processable, meaning that appropriate tool support, such as retrieval support, can be developed. The templates are highly adaptable, meaning that they can be applied to a wide range of problems. Some of the main features of the language are described, including: higher-order parameters; state variable declarations; specification statements and conditionals; applicability conditions and theories; meta-level placeholders; and abstract data structures.
Abstract:
Security protocols preserve essential properties, such as confidentiality and authentication, of electronically transmitted data. However, such properties cannot be directly expressed or verified in contemporary formal methods. Via a detailed example, we describe the phases needed to formalise and verify the correctness of a security protocol in the state-oriented Z formalism.
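For flavour, this is roughly what the starting point of such a formalisation can look like: a hypothetical Z state schema in zed-style LaTeX, with names and invariant invented for illustration rather than taken from the paper:

```latex
% Hypothetical fragment (zed-csp style); not the paper's actual schemas.
\begin{schema}{ProtocolState}
  sent, intercepted : \power MESSAGE
\where
  intercepted \subseteq sent   % the intruder only sees transmitted messages
\end{schema}
```

Operation schemas would then model each protocol step, and confidentiality becomes an invariant to be verified over all reachable states.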
Abstract:
This paper addresses the problem of ensuring compliance of business processes, implemented within and across organisational boundaries, with the constraints stated in related business contracts. In order to deal with the complexity of this problem, we propose two solutions that allow for systematic and increasingly automated support for addressing two specific compliance issues. One solution provides a set of guidelines for progressively transforming contract conditions into business processes that are consistent with the contract conditions, thus avoiding violation of the contract rules. The other solution compares rules in business contracts with rules in business processes to check for possible inconsistencies. Both approaches rely on a computer-interpretable representation of contract conditions that embodies contract semantics. This semantics is described in terms of a logic-based formalism allowing for the description of obligation, prohibition, permission and violation conditions in contracts. The semantics was based on an analysis of the typical building blocks of many commercial, financial and government contracts. The study showed that our contract formalism provides a good foundation for describing the key types of conditions in contracts, and also gave several insights into valuable transformation techniques and formalisms needed to establish better alignment between these two traditionally separate areas of research and endeavour. The study also revealed a number of new areas of research, some of which we intend to address in the near future.
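One plausible rendering of such a contract rule, in the deontic-logic style the abstract describes (the symbols are illustrative; the paper's actual formalism may differ):

```latex
\[
  \mathrm{O}(\mathit{pay\_by}(d)) \;\land\; \neg\,\mathit{pay\_by}(d)
    \;\Rightarrow\; \mathrm{Viol}(\mathit{pay}),
  \qquad
  \mathrm{Viol}(\mathit{pay}) \;\Rightarrow\; \mathrm{O}(\mathit{pay\_penalty})
\]
```

That is, an unfulfilled obligation raises a violation, which in turn triggers a reparation obligation; business process rules can then be checked against such formulas for inconsistencies.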
Abstract:
Instantaneous outbursts in underground coal mines have occurred in at least 16 countries, involving both methane (CH4) and carbon dioxide (CO2). The precise mechanisms of an instantaneous outburst are still unresolved but must consider the effects of stress, gas content and physico-mechanical properties of the coal. Other factors such as mining methods (e.g., development heading into the coal seam) and geological features (e.g., coal seam disruptions from faulting) can combine to exacerbate the problem. Prediction techniques continue to be unreliable and unexpected outburst incidents resulting in fatalities are a major concern for underground coal operations. Gas content thresholds of 9 m³/t for CH4 and 6 m³/t for CO2 are used in the Sydney Basin to indicate outburst-prone conditions, but are reviewed on an individual mine basis and in mixed gas situations. Data on the sorption behaviour of Bowen Basin coals from Australia have provided an explanation for the conflicting results obtained by coal face desorption indices used for outburst-proneness assessment. A key factor appears to be the different desorption rates displayed by banded coals, which is supported by both laboratory and mine-site investigations. Dull coal bands with high fusinite and semifusinite contents tend to display rapid desorption from solid coal, for a given pressure drop. The opposite is true for bright coal bands with high vitrinite contents and dull coal bands with high inertodetrinite contents. Consequently, when face samples of dull, fusinite- or semifusinite-rich coal of small particle size are taken for desorption testing, much gas has already escaped and low readings result. The converse applies for samples taken from coal bands with high vitrinite and/or inertodetrinite contents. In terms of outburst potential, it is the bright, vitrinite-rich and the dull, inertodetrinite-rich sections of a coal seam that appear to be more outburst-prone. This is due to the ability of the solid coal to retain gas, even after pressure reduction, creating a gas content gradient across the coal face sufficient to initiate an outburst. Once the particle size of the coal is reduced, rapid gas desorption can then take place. (C) 1998 Elsevier Science.
Abstract:
Background: A variety of methods for prediction of peptide binding to major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We performed a comparison of the performance of six methods applied to the prediction of two human MHC class I molecules, including binding matrices and motifs, ANNs, and HMMs. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMM become more useful predictors. ANN and HMM are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
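As a minimal illustration of the binding-matrix method covered by the comparison, the toy sketch below scores a peptide by summing per-position matrix entries; the matrix values and the 3-residue "peptides" are invented (real class I matrices cover 9-mers and all 20 amino acids):

```python
# Toy binding-matrix scorer; all values are invented for illustration.
matrix = [
    {"A": 0.5, "L": 1.2, "K": -0.8},   # position 1
    {"A": -0.1, "L": 0.3, "K": 0.9},   # position 2 (anchor)
    {"A": 0.0, "L": 1.5, "K": -1.0},   # position 3 (anchor)
]

def matrix_score(peptide):
    """Additive model: sum the per-position contribution of each residue."""
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

for pep in ("LKL", "AAK", "KAL"):
    print(pep, round(matrix_score(pep), 2))
# Peptides scoring above a calibrated threshold are predicted binders;
# HMMs and ANNs replace this additive assumption with richer models.
```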
Abstract:
Simulations provide a powerful means to help gain the understanding of crustal fault system physics required to progress towards the goal of earthquake forecasting. Cellular Automata are efficient enough to probe system dynamics, but their simplifications render interpretations questionable. In contrast, sophisticated elastodynamic models yield more convincing results but are too computationally demanding to explore phase space. To help bridge this gap, we develop a simple 2D elastodynamic model of parallel fault systems. The model is discretised onto a triangular lattice and faults are specified as split nodes along horizontal rows in the lattice. A simple numerical approach is presented for calculating the forces at medium and split nodes such that general nonlinear frictional constitutive relations can be modelled along faults. Single- and multi-fault simulation examples are presented using a nonlinear frictional relation that is slip and slip-rate dependent, in order to illustrate the model.
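The nonlinear frictional constitutive relations referred to are typically of a slip- and slip-rate-weakening form; one common family (not necessarily the paper's exact relation; $\mu_s$, $\mu_d$, $D_c$, $a$ and $v_0$ are illustrative constants) is

```latex
\[
  \mu(s,\dot{s}) \;=\; \mu_d \;+\; (\mu_s - \mu_d)\,
  \max\!\Bigl(0,\; 1 - \frac{s}{D_c}\Bigr)
  \;-\; a \ln\!\bigl(1 + \dot{s}/v_0\bigr)
\]
```

where $s$ is the accumulated slip on the split-node interface and $\dot{s}$ the slip rate, so friction weakens from its static toward its dynamic value over a characteristic slip distance $D_c$.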
Abstract:
Current design procedures for Subsurface Flow (SSF) wetlands are based on the simplifying assumptions of plug flow and first-order decay of pollutants. These design procedures do yield functional wetlands, but result in over-design and inadequate descriptions of the pollutant removal mechanisms which occur within them. Even though these deficiencies are often noted, few authors have attempted to improve modelling of either flow or pollutant removal in such systems. Consequently, the Oxley Creek Wetland, a pilot-scale SSF wetland designed to enable rigorous monitoring, has recently been constructed in Brisbane, Australia. Tracer studies have been carried out in order to determine the hydraulics of this wetland prior to commissioning it with settled sewage. The tracer studies will continue during the wetland's commissioning and operational phases. These studies will improve our understanding of the hydraulics of newly built SSF wetlands and the changes brought on by operational factors such as biological films and wetland plant root structures. Results to date indicate that the flow through the gravel beds is not uniform and cannot be adequately modelled by a single-parameter plug-flow-with-dispersion model. We have developed a multiparameter model, incorporating four plug flow reactors, which provides a better approximation of our experimental data. With further development this model will allow improvements to current SSF wetland design procedures and operational strategies, and will underpin investigations into the pollutant removal mechanisms at the Oxley Creek Wetland. (C) 1997 IAWQ. Published by Elsevier Science Ltd.
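A hedged sketch of the multiparameter idea, assuming the four plug-flow units act as parallel flow paths whose tracer responses superpose (the flow fractions, residence times and Peclet numbers below are invented, not the fitted Oxley Creek values):

```python
# Weighted sum of plug-flow-with-dispersion responses, one per flow path.
# All parameter values are illustrative only.
import numpy as np

def pfd_rtd(t, tau, Pe):
    """Axial-dispersion residence time distribution (open-vessel form)."""
    theta = np.maximum(t / tau, 1e-9)        # dimensionless time
    return (1 / tau) * np.sqrt(Pe / (4 * np.pi * theta)) \
        * np.exp(-Pe * (1 - theta) ** 2 / (4 * theta))

t = np.linspace(0.01, 10, 500)               # time, days (hypothetical)
paths = [                                    # (flow fraction, tau, Pe)
    (0.4, 1.5, 40), (0.3, 2.5, 30), (0.2, 4.0, 20), (0.1, 6.0, 10),
]
rtd = sum(w * pfd_rtd(t, tau, Pe) for w, tau, Pe in paths)
# 'rtd' shows the early breakthrough and long tail that a single-parameter
# plug-flow-with-dispersion model cannot reproduce.
```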
Abstract:
OBJECTIVES We developed a prognostic strategy for quantifying the long-term risk of coronary heart disease (CHD) events in survivors of acute coronary syndromes (ACS). BACKGROUND Strategies for quantifying long-term risk of CHD events have generally been confined to primary prevention settings. The Long-term Intervention with Pravastatin in Ischemic Disease (LIPID) study, which demonstrated that pravastatin reduces CHD events in ACS survivors with a broad range of cholesterol levels, enabled assessment of long-term prognosis in a secondary prevention setting. METHODS Based on outcomes in 8,557 patients in the LIPID study, a multivariate risk factor model was developed for prediction of CHD death or nonfatal myocardial infarction. Prognostic indexes were developed based on the model, and low-, medium-, high- and very high-risk groups were defined by categorizing the prognostic indexes. RESULTS In addition to pravastatin treatment, the independently significant risk factors included: total and high density lipoprotein cholesterol, age, gender, smoking status, qualifying ACS, prior coronary revascularization, diabetes mellitus, hypertension and prior stroke. Pravastatin reduced coronary event rates at each risk level, and the relative risk reduction did not vary significantly between risk levels. The predicted five-year coronary event rates ranged from 5% to 19% for those assigned pravastatin and from 6.4% to 23.6% for those assigned placebo. CONCLUSIONS Long-term prognosis of ACS survivors varied substantially according to conventional risk factor profile. Pravastatin reduced coronary risk within all risk levels; however, absolute risk remained high in treated patients with unfavorable profiles. Our risk stratification strategy enables identification of ACS survivors who remain at very high risk despite statin therapy. (J Am Coll Cardiol 2001;38:56-63) (C) 2001 by the American College of Cardiology.
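The usual construction behind such a prognostic index is the Cox-model linear predictor; in its standard form (the fitted LIPID coefficients $\beta_i$ and baseline survival $S_0$ are in the paper, not reproduced here):

```latex
\[
  \mathrm{PI} \;=\; \sum_i \beta_i x_i ,
  \qquad
  \mathrm{Risk}_{5\,\mathrm{yr}} \;=\; 1 - S_0(5)^{\exp(\mathrm{PI} - \overline{\mathrm{PI}})}
\]
```

Cut-points on PI then define the low-, medium-, high- and very high-risk groups.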
Abstract:
The past decade has witnessed increasing concern over the effectiveness of project-based development assistance and the promotion of sector-wide approaches (SWAps) to health as a means to increase donor collaboration, consolidate local management of resources and undertake the policy and systems reform necessary to achieve a greater impact on health issues. The concept has gained the support of both the World Bank and the World Health Organisation, as well as key bilateral donors, and dominates current initiatives in development assistance for health. This paper examines the proposal of SWAps as rhetoric, and seeks to understand how that rhetoric functions, despite the variable application of its constituent elements and the range of contexts in which it operates. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The Load-Unload Response Ratio (LURR) method is an intermediate-term earthquake prediction approach that has shown considerable promise. It involves calculating the ratio of a specified energy release measure during loading and unloading, where the loading and unloading periods are determined from the earth-tide-induced perturbations in the Coulomb Failure Stress on optimally oriented faults. In the lead-up to large earthquakes, high LURR values are frequently observed a few months or years prior to the event. These signals may have a similar origin to the accelerating seismic moment release (AMR) observed prior to many large earthquakes, or may be due to critical sensitivity of the crust when a large earthquake is imminent. As a first step towards studying the underlying physical mechanism for the LURR observations, numerical studies are conducted using the particle-based lattice solid model (LSM) to determine whether LURR observations can be reproduced. The model is initialized as a heterogeneous 2-D block made up of random-sized particles bonded by elastic-brittle links. The system is subjected to uniaxial compression from rigid driving plates on the upper and lower edges of the model. Experiments are conducted using both strain and stress control to load the plates. A sinusoidal stress perturbation is added to the gradual compressional loading to simulate loading and unloading cycles, and LURR is calculated. The results reproduce signals similar to those observed in earthquake prediction practice, with a high LURR value followed by a sudden drop prior to macroscopic failure of the sample. The results suggest that LURR provides a good predictor for catastrophic failure in elastic-brittle systems and motivate further research to study the underlying physical mechanisms and statistical properties of high LURR values. The results provide encouragement for earthquake prediction research and for the use of advanced simulation models to probe the physics of earthquakes.
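A minimal sketch of the LURR statistic itself, assuming each catalogued event carries an energy measure and a loading/unloading label derived from the tidally perturbed Coulomb Failure Stress (the data layout and exponent are illustrative):

```python
# Hedged sketch of the Load-Unload Response Ratio; names are hypothetical.
def lurr(events, m=0.5):
    """Ratio of energy release during loading vs. unloading periods.

    events: iterable of (energy, is_loading) pairs
    m: exponent on energy (m = 0.5 gives a Benioff-strain-like measure)
    """
    load = sum(e ** m for e, is_loading in events if is_loading)
    unload = sum(e ** m for e, is_loading in events if not is_loading)
    return float("inf") if unload == 0 else load / unload

catalog = [(2.0e9, True), (5.0e8, True), (1.0e8, False), (3.0e7, False)]
print(lurr(catalog))  # values well above 1 flag a critically loaded region
```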