965 results for Model Systems
Abstract:
Classical dynamics is formulated as a Hamiltonian flow in phase space, while quantum mechanics is formulated as unitary dynamics in Hilbert space. These different formulations have made it difficult to directly compare quantum and classical nonlinear dynamics. Previous solutions have focused on computing quantities associated with a statistical ensemble, such as variance or entropy. However, a more direct comparison would set classical predictions against the quantum predictions for continuous simultaneous measurement of position and momentum of a single system. In this paper we give a theory of such measurement and show that chaotic behavior in classical systems can be reproduced by continuously measured quantum systems.
Abstract:
The microwave and thermal cure processes for the epoxy-amine systems N,N,N',N'-tetraglycidyl-4,4'-diaminodiphenyl methane (TGDDM) with diaminodiphenyl sulfone (DDS) and diaminodiphenyl methane (DDM) have been investigated. The DDS system was studied at a single cure temperature of 433 K and a single stoichiometry of 27 wt%, and the DDM system was studied at two stoichiometries, 19 and 32 wt%, and a range of temperatures between 373 and 413 K. The best values of the kinetic rate parameters for the consumption of amines have been determined by a least squares curve fit to a model for epoxy-amine cure. The activation energies for the rate parameters for the MY721/DDM system were determined, as was the overall activation energy for the cure reaction, which was found to be 62 kJ mol(-1). No evidence was found for any specific effect of the microwave radiation on the rate parameters, and the systems were both found to be characterized by a negative substitution effect. Copyright (C) 2001 John Wiley & Sons, Ltd.
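As a rough illustration of this kind of fitting procedure, the sketch below fits two rate constants of a generic autocatalytic (Kamal-type) cure model to synthetic conversion data by least squares. The model form, parameter values, and data are hypothetical and simpler than the amine-consumption scheme used in the paper; they serve only to show the curve-fitting step.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

# Illustrative autocatalytic (Kamal-type) cure model -- NOT the authors'
# exact amine-consumption scheme, just a common form for epoxy-amine cure:
#   d(alpha)/dt = (k1 + k2 * alpha) * (1 - alpha)**2
def cure_rate(alpha, t, k1, k2):
    return (k1 + k2 * alpha) * (1.0 - alpha) ** 2

def conversion_curve(t, k1, k2):
    # conversion alpha(t), integrated from alpha = 0 at t = 0
    return odeint(cure_rate, 0.0, t, args=(k1, k2)).ravel()

# synthetic "measured" conversion data (hypothetical, for demonstration)
t_data = np.linspace(0.0, 60.0, 30)                 # minutes
alpha_data = conversion_curve(t_data, 0.01, 0.15)
alpha_data += np.random.default_rng(0).normal(0.0, 0.005, t_data.size)

# least-squares fit of the kinetic rate parameters to the data
popt, _ = curve_fit(conversion_curve, t_data, alpha_data, p0=(0.005, 0.1))
print(popt)  # fitted (k1, k2)
```

In practice one would fit such a model simultaneously at several temperatures and extract activation energies from the temperature dependence of the fitted rate constants.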
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
Abstract:
The paper considers the structural identifiability of a parent–metabolite pharmacokinetic model for ivabradine and one of its metabolites. The model, which is linear, is considered initially for intravenous administration of ivabradine, and then for a combined intravenous and oral administration. In both cases, the model is shown to be unidentifiable. Simplification of the model (for both forms of administration) to that proposed by Duffull et al. (1) results in a globally structurally identifiable model. The analysis could be applied to the modeling of any drug undergoing first-pass metabolism, with plasma concentrations available from drug and metabolite.
Abstract:
A model has been developed which enables the viscosities of coal ash slags to be predicted as a function of composition and temperature under reducing conditions. The model describes both completely liquid and heterogeneous, i.e. partly crystallised, slags in the Al2O3-CaO-'FeO'-SiO2 system in equilibrium with metallic iron. The Urbain formalism has been modified to describe the viscosities of the liquid slag phase over the complete range of compositions and a wide range of temperatures. The computer package F * A * C * T was used to predict the proportions of solids and the compositions of the remaining liquid phases. The Roscoe equation has been used to describe the effect of the presence of suspended solids (slurry effect) on the viscosity of partly crystallised slag systems. The model provides a good description of the experimental data for fully liquid slags and for liquid + solids mixtures, over the complete range of compositions and a wide range of temperatures. This model can now be used for viscosity predictions in industrial slag systems. Examples of the application of the new model to coal ash fluxing and blending are given in the paper. (C) 2001 Elsevier Science Ltd. All rights reserved.
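The Roscoe equation mentioned above has a simple closed form, relating the effective viscosity of a suspension to the viscosity of the carrier liquid and the solid volume fraction. A minimal sketch, with illustrative numbers rather than real slag data:

```python
def roscoe_viscosity(eta_liquid, phi_solid):
    """Roscoe relation for a suspension of solid particles:
    eta_eff = eta_liquid * (1 - phi)**(-2.5),
    where phi is the volume fraction of suspended solids."""
    if not 0.0 <= phi_solid < 1.0:
        raise ValueError("solid volume fraction must be in [0, 1)")
    return eta_liquid * (1.0 - phi_solid) ** -2.5

# e.g. 20 vol% crystallised solids raise the viscosity by roughly 1.75x
print(roscoe_viscosity(1.0, 0.20))
```

In the model described here, the solid fraction phi would come from the F * A * C * T equilibrium calculation and eta_liquid from the modified Urbain formalism.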
Abstract:
The development of cropping systems simulation capabilities world-wide, combined with easy access to powerful computing, has resulted in a plethora of agricultural models and, consequently, model applications. Nonetheless, the scientific credibility of such applications and their relevance to farming practice is still being questioned. Our objective in this paper is to highlight some of the model applications from which benefits for farmers were or could be obtained via changed agricultural practice or policy. Changed on-farm practice due to the direct contribution of modelling, while keenly sought after, may in some cases be less achievable than a contribution via agricultural policies. This paper is intended to give some guidance for future model applications. It is not a comprehensive review of model applications, nor is it intended to discuss modelling in the context of social science or extension policy. Rather, we take snapshots around the globe to 'take stock' and to demonstrate that well-defined financial and environmental benefits can be obtained on-farm from the use of models. We highlight the importance of 'relevance' and hence the importance of true partnerships between all stakeholders (farmer, scientists, advisers) for the successful development and adoption of simulation approaches. Specifically, we address some key points that are essential for successful model applications, such as: (1) issues to be addressed must be neither trivial nor obvious; (2) a modelling approach must reduce complexity rather than proliferate choices in order to aid the decision-making process; (3) the cropping systems must be sufficiently flexible to allow management interventions based on insights gained from models. The pros and cons of normative approaches (e.g. 
decision support software that can reach a wide audience quickly but is often poorly contextualized for any individual client) versus model applications within the context of an individual client's situation will also be discussed. We suggest that a tandem approach is necessary whereby the latter is used in the early stages of model application for confidence building amongst client groups. This paper focuses on five specific regions that differ fundamentally in terms of environment and socio-economic structure and hence in their requirements for successful model applications. Specifically, we will give examples from Australia and South America (high climatic variability, large areas, low input, technologically advanced); Africa (high climatic variability, small areas, low input, subsistence agriculture); India (high climatic variability, small areas, medium level inputs, technologically progressing); and Europe (relatively low climatic variability, small areas, high input, technologically advanced). The contrast between Australia and Europe will further demonstrate how successful model applications are strongly influenced by the policy framework within which producers operate. We suggest that this might eventually lead to better adoption of fully integrated systems approaches and result in the development of resilient farming systems that are in tune with current climatic conditions and are adaptable to biophysical and socioeconomic variability and change. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been occurring for many years, and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to object-oriented schemas today. Unfortunately, researchers have been able to offer practitioners little theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty in applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques that have similar meta models defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
A model is introduced for two reduced BCS systems which are coupled through the transfer of Cooper pairs between the systems. The model may thus be used in the analysis of the Josephson effect arising from pair tunneling between two strongly coupled small metallic grains. At a particular coupling strength the model is integrable and explicit results are derived for the energy spectrum, conserved operators, integrals of motion, and wave function scalar products. It is also shown that form factors can be obtained for the calculation of correlation functions. Furthermore, a connection with perturbed conformal field theory is made.
Abstract:
We introduce an integrable model for two coupled BCS systems through a solution of the Yang-Baxter equation associated with the Lie algebra su(4). By employing the algebraic Bethe ansatz, we determine the exact solution for the energy spectrum. An asymptotic analysis is conducted to determine the leading terms in the ground state energy, the gap and some one point correlation functions at zero temperature. (C) 2002 Published by Elsevier Science B.V.
Abstract:
Due to the socio-economic inhomogeneity of communities in developing countries, the selection of sanitation systems is a complex task. To assist planners and communities in assessing the suitability of alternatives, the decision support system SANEX™ was developed. SANEX™ evaluates alternatives in two steps. First, Conjunctive Elimination, based on 20 mainly technical criteria, is used to screen feasible alternatives. Subsequently, a model derived from Multiattribute Utility Technique (MAUT) uses technical, socio-cultural and institutional criteria to compare the remaining alternatives with regard to their implementability and sustainability. This paper presents the SANEX™ algorithm, examples of its application in practice, and results obtained from field testing.
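The two-step evaluation can be sketched as follows. The criteria names, thresholds, and weights below are invented for illustration; the real SANEX™ tool screens with some 20 mainly technical criteria and uses a fuller MAUT formulation. All criterion values are assumed to be normalized utilities in [0, 1] where higher is better.

```python
# Hypothetical sketch of a SANEX-style two-step evaluation of
# sanitation alternatives (criteria, thresholds and weights invented).

def conjunctive_screen(option, thresholds):
    # Step 1 (Conjunctive Elimination): an option is eliminated if it
    # fails ANY single feasibility criterion.
    return all(option[c] >= t for c, t in thresholds.items())

def maut_score(option, weights):
    # Step 2 (MAUT): weighted additive utility over surviving options.
    return sum(w * option[c] for c, w in weights.items())

options = {
    "pit_latrine": {"water_avail": 0.9, "cost": 0.8, "acceptance": 0.5},
    "sewerage":    {"water_avail": 0.2, "cost": 0.1, "acceptance": 0.9},
    "septic_tank": {"water_avail": 0.6, "cost": 0.5, "acceptance": 0.8},
}
thresholds = {"water_avail": 0.3, "cost": 0.3}
weights = {"cost": 0.4, "acceptance": 0.6}

feasible = {k: v for k, v in options.items()
            if conjunctive_screen(v, thresholds)}
best = max(feasible, key=lambda k: maut_score(feasible[k], weights))
print(best)  # highest-ranked feasible alternative
```

The non-compensatory screening step ensures that an alternative cannot buy its way past a hard technical constraint with high scores elsewhere, which is why it precedes the compensatory MAUT ranking.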
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by understanding how the factors affect an information system's stability. The research builds on a previously developed two-stage model of information system change whereby an information system is either in a stable state of evolution, in which the information system's functionality is evolving, or in a state of revolution, in which the information system is being replaced because it is not providing the functionality expected by its users. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (generation of language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions, except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
The development of the new TOGA (titration and off-gas analysis) sensor for the detailed study of biological processes in wastewater treatment systems is outlined. The main innovation of the sensor is the amalgamation of titrimetric and off-gas measurement techniques. The resulting measured signals are: hydrogen ion production rate (HPR), oxygen transfer rate (OTR), nitrogen transfer rate (NTR), and carbon dioxide transfer rate (CTR). While OTR and NTR are applicable to aerobic and anoxic conditions, respectively, HPR and CTR are useful signals under all of the conditions found in biological wastewater treatment systems, namely, aerobic, anoxic and anaerobic. The sensor is therefore a powerful tool for studying the key biological processes under all these conditions. A major benefit from the integration of the titrimetric and off-gas analysis methods is that the acid/base buffering systems, in particular the bicarbonate system, are properly accounted for. Experimental data resulting from the TOGA sensor in aerobic, anoxic, and anaerobic conditions demonstrates the strength of the new sensor. In the aerobic environment, carbon oxidation (using acetate as an example carbon source) and nitrification are studied. Both the carbon and ammonia removal rates measured by the sensor compare very well with those obtained from off-line chemical analysis. Further, the aerobic acetate removal process is examined at a fundamental level using the metabolic pathway and stoichiometry established in the literature, whereby the rate of formation of storage products is identified. Under anoxic conditions, the denitrification process is monitored and, again, the measured rate of nitrogen gas transfer (NTR) matches well with the removal of the oxidised nitrogen compounds (measured chemically). In the anaerobic environment, the enhanced biological phosphorus process was investigated. 
In this case, the measured sensor signals (HPR and CTR) resulting from acetate uptake were used to determine the ratio of the rates of carbon dioxide production by competing groups of microorganisms, which consequently is a measure of the activity of these organisms. The sensor involves the use of expensive equipment such as a mass spectrometer and requires special gases to operate, thus incurring significant capital and operational costs. This makes the sensor more an advanced laboratory tool than an on-line sensor. (C) 2003 Wiley Periodicals, Inc.
Abstract:
We are witnessing an enormous growth in biological nitrogen removal from wastewater. It presents specific challenges beyond traditional COD (carbon) removal. A possibility for optimised process design is the use of biomass-supporting media. In this paper, attached growth processes (AGP) are evaluated using dynamic simulations. The advantages of these systems, which were qualitatively described elsewhere, are validated quantitatively based on a simulation benchmark for activated sludge treatment systems. This simulation benchmark is extended with a biofilm model that allows for fast and accurate simulation of the conversion of different substrates in a biofilm. The economic feasibility of this system is evaluated using the data generated with the benchmark simulations. Capital savings due to volume reduction and reduced sludge production are weighed against increased aeration costs. In this evaluation, effluent quality is integrated as well.
Abstract:
The microwave and thermal cure processes for the epoxy-amine systems (epoxy resin diglycidyl ether of bisphenol A, DGEBA) with 4,4'-diaminodiphenyl sulphone (DDS) and 4,4'-diaminodiphenyl methane (DDM) have been investigated for 1:1 stoichiometries by using fiber-optic FT-NIR spectroscopy. The DGEBA used was in the form of Ciba-Geigy GY260 resin. The DDM system was studied at a single cure temperature of 373 K and a single stoichiometry of 20.94 wt%, and the DDS system was studied at a stoichiometry of 24.9 wt% and a range of temperatures between 393 and 443 K. The best values of the kinetic rate parameters for the consumption of amines have been determined by a least squares curve fit to a model for epoxy/amine cure. The activation energies for the polymerization of the DGEBA/DDS system were determined for both cure processes and found to be 66 and 69 kJ mol(-1) for the microwave and thermal cure processes, respectively. No evidence was found for any specific effect of the microwave radiation on the rate parameters, and the systems were both found to be characterized by a negative substitution effect. Copyright (C) 2002 John Wiley & Sons, Ltd.
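An overall activation energy of the kind reported in these cure studies can be extracted from rate constants measured at two cure temperatures via the Arrhenius relation, ln(k2/k1) = -(Ea/R)(1/T2 - 1/T1). The rate constants below are hypothetical, chosen only to illustrate the arithmetic at the two ends of the 393-443 K range studied here:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius_activation_energy(k1, T1, k2, T2):
    """Activation energy (J/mol) from rate constants k1, k2 measured
    at absolute temperatures T1, T2, assuming Arrhenius behaviour:
    k = A * exp(-Ea / (R * T))."""
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

# hypothetical rate constants at the two cure temperatures (illustrative)
Ea = arrhenius_activation_energy(1.0e-3, 393.0, 1.0e-2, 443.0)
print(Ea / 1000.0, "kJ/mol")  # roughly 67 kJ/mol for these inputs
```

A tenfold rate increase over this 50 K interval corresponds to an activation energy of about 67 kJ/mol, the same order as the 66-69 kJ/mol values reported for the DGEBA/DDS system; in the papers themselves the parameters come from least squares fits over the full temperature range rather than a two-point estimate.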