953 results for Automatic model transformation systems
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been carried out for many years, and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to today's object-oriented schemas. Unfortunately, researchers have been able to offer practitioners only limited theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty of applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, yet easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques for which similar meta models have been defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
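To make the idea of a meta model of ontological constructs concrete, here is a minimal sketch of a few BWW-style constructs (thing, property, state, transformation) expressed as classes. The class and attribute names are our own illustrative choices, not the meta language defined in the paper.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a few BWW ontological constructs expressed as a
# simple object meta model. Names are invented for the example.

@dataclass
class Property:
    name: str

@dataclass
class Thing:
    name: str
    properties: list = field(default_factory=list)

@dataclass
class State:
    """A state: the values of a thing's properties at one point in time."""
    thing: Thing
    values: dict

@dataclass
class Transformation:
    """A lawful mapping from one state of a thing to another."""
    before: State
    after: State

# Example: a 'Customer' thing whose 'status' property changes state.
customer = Thing("Customer", [Property("status")])
t = Transformation(
    before=State(customer, {"status": "prospect"}),
    after=State(customer, {"status": "active"}),
)
print(t.before.values["status"], "->", t.after.values["status"])  # prospect -> active
```

Expressing constructs this way, rather than set-theoretically, is what lets two modelling techniques with similar meta models be compared construct by construct.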
Abstract:
A model is introduced for two reduced BCS systems which are coupled through the transfer of Cooper pairs between the systems. The model may thus be used in the analysis of the Josephson effect arising from pair tunneling between two strongly coupled small metallic grains. At a particular coupling strength the model is integrable and explicit results are derived for the energy spectrum, conserved operators, integrals of motion, and wave function scalar products. It is also shown that form factors can be obtained for the calculation of correlation functions. Furthermore, a connection with perturbed conformal field theory is made.
Abstract:
We introduce an integrable model for two coupled BCS systems through a solution of the Yang-Baxter equation associated with the Lie algebra su(4). By employing the algebraic Bethe ansatz, we determine the exact solution for the energy spectrum. An asymptotic analysis is conducted to determine the leading terms in the ground state energy, the gap and some one point correlation functions at zero temperature. (C) 2002 Published by Elsevier Science B.V.
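The exact Bethe-ansatz solution is beyond a short sketch, but the standard mean-field BCS gap equation (textbook background, not the paper's exact solution) illustrates the kind of gap computation the asymptotic analysis targets: find the gap Delta satisfying 1 = g * sum_k 1/sqrt(eps_k^2 + Delta^2).

```python
import math

# Standard mean-field BCS gap equation (textbook background, NOT the exact
# Bethe-ansatz solution of the paper): solve for Delta in
#   1 = g * sum_k 1 / sqrt(eps_k**2 + Delta**2)
# over a set of equally spaced single-particle levels eps_k.

def gap_residual(delta, levels, g):
    return g * sum(1.0 / math.sqrt(e * e + delta * delta) for e in levels) - 1.0

def solve_gap(levels, g, lo=1e-9, hi=100.0):
    # The residual is strictly decreasing in delta, so bisect for the root.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if gap_residual(mid, levels, g) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

levels = [k - 7.5 for k in range(16)]   # 16 equally spaced levels around zero
delta = solve_gap(levels, g=1.0)
print(delta)
```

The integrable-model results in the paper recover corrections to precisely this kind of mean-field leading term in the ground-state energy and gap.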
Abstract:
Due to the socio-economic inhomogeneity of communities in developing countries, the selection of sanitation systems is a complex task. To assist planners and communities in assessing the suitability of alternatives, the decision support system SANEX™ was developed. SANEX™ evaluates alternatives in two steps. First, Conjunctive Elimination, based on 20 mainly technical criteria, is used to screen feasible alternatives. Subsequently, a model derived from Multiattribute Utility Technique (MAUT) uses technical, socio-cultural and institutional criteria to compare the remaining alternatives with regard to their implementability and sustainability. This paper presents the SANEX™ algorithm, examples of its application in practice, and results obtained from field testing.
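A minimal sketch of the two-step evaluation logic described above: conjunctive elimination keeps only alternatives that pass every screening threshold, then an additive multiattribute utility score ranks the survivors. The criteria names, thresholds, and weights here are invented for the example, not SANEX's actual 20 criteria.

```python
# Illustrative sketch of the two-step SANEX logic. Criteria, thresholds and
# weights are invented for the example, not the tool's actual data.

def conjunctive_elimination(alternatives, thresholds):
    """Step 1: keep only alternatives meeting EVERY screening threshold."""
    return [a for a in alternatives
            if all(a["scores"][c] >= t for c, t in thresholds.items())]

def maut_score(alternative, weights):
    """Step 2: simple additive multiattribute utility (weights sum to 1)."""
    return sum(w * alternative["scores"][c] for c, w in weights.items())

alternatives = [
    {"name": "pit latrine", "scores": {"water": 0.9, "cost": 0.8, "social": 0.4}},
    {"name": "sewerage",    "scores": {"water": 0.2, "cost": 0.3, "social": 0.9}},
    {"name": "septic tank", "scores": {"water": 0.6, "cost": 0.6, "social": 0.7}},
]
feasible = conjunctive_elimination(alternatives, {"water": 0.5, "cost": 0.5})
ranked = sorted(feasible,
                key=lambda a: maut_score(a, {"cost": 0.5, "social": 0.5}),
                reverse=True)
print([a["name"] for a in ranked])   # ['septic tank', 'pit latrine']
```

Sewerage fails the screening step outright, so its high social score never enters the utility comparison; this is exactly why a conjunctive screen precedes the compensatory MAUT ranking.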
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by understanding how the factors affect an information system's stability. The research builds on a previously developed two-stage model of information system change whereby an information system is either in a stable state of evolution, in which the information system's functionality is evolving, or in a state of revolution, in which the information system is being replaced because it is not providing the functionality expected by its users. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (generation of language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions, except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
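The abstract does not give the formula for the volatility index, so the following is only a hypothetical proxy, not the measure used in the study: change requests per year of service, which captures the basic idea of quantifying how much an installed system keeps changing.

```python
# Hypothetical proxy only: the study's volatility index is not defined in the
# abstract. Change requests per year of service conveys the same intuition.

def volatility_proxy(change_requests, years_in_service):
    """Hypothetical proxy: user change requests per year of service."""
    return len(change_requests) / years_in_service

# Two invented systems, each observed over the study's 10-year window.
stable_sys = volatility_proxy(change_requests=list(range(12)), years_in_service=10)
volatile_sys = volatility_proxy(change_requests=list(range(120)), years_in_service=10)
print(stable_sys, volatile_sys)   # 1.2 vs 12.0 requests/year
```

Under any such measure, the study's hypotheses amount to asking whether factors like language generation or system size shift this base rate.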
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
The development of the new TOGA (titration and off-gas analysis) sensor for the detailed study of biological processes in wastewater treatment systems is outlined. The main innovation of the sensor is the amalgamation of titrimetric and off-gas measurement techniques. The resulting measured signals are: hydrogen ion production rate (HPR), oxygen transfer rate (OTR), nitrogen transfer rate (NTR), and carbon dioxide transfer rate (CTR). While OTR and NTR are applicable to aerobic and anoxic conditions, respectively, HPR and CTR are useful signals under all of the conditions found in biological wastewater treatment systems, namely, aerobic, anoxic and anaerobic. The sensor is therefore a powerful tool for studying the key biological processes under all these conditions. A major benefit from the integration of the titrimetric and off-gas analysis methods is that the acid/base buffering systems, in particular the bicarbonate system, are properly accounted for. Experimental data resulting from the TOGA sensor in aerobic, anoxic, and anaerobic conditions demonstrates the strength of the new sensor. In the aerobic environment, carbon oxidation (using acetate as an example carbon source) and nitrification are studied. Both the carbon and ammonia removal rates measured by the sensor compare very well with those obtained from off-line chemical analysis. Further, the aerobic acetate removal process is examined at a fundamental level using the metabolic pathway and stoichiometry established in the literature, whereby the rate of formation of storage products is identified. Under anoxic conditions, the denitrification process is monitored and, again, the measured rate of nitrogen gas transfer (NTR) matches well with the removal of the oxidised nitrogen compounds (measured chemically). In the anaerobic environment, the enhanced biological phosphorus process was investigated. 
In this case, the measured sensor signals (HPR and CTR) resulting from acetate uptake were used to determine the ratio of the rates of carbon dioxide production by competing groups of microorganisms, which consequently is a measure of the activity of these organisms. The sensor involves the use of expensive equipment such as a mass spectrometer and requires special gases to operate, thus incurring significant capital and operational costs. This makes the sensor more an advanced laboratory tool than an on-line sensor. (C) 2003 Wiley Periodicals, Inc.
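Transfer-rate signals such as OTR come from a gas-phase mass balance over the reactor. The sketch below is a generic version of such a balance (not the TOGA sensor's own equations): the oxygen transfer rate follows from the drop in O2 mole fraction between the inlet gas and the off-gas, assuming ideal gas behaviour and constant gas flow.

```python
# Generic off-gas mass balance sketch (NOT the TOGA sensor's own equations):
# OTR from the drop in O2 mole fraction between inlet gas and off-gas,
# assuming ideal gas behaviour and a constant gas flow rate.

R = 8.314  # gas constant, J/(mol*K)

def oxygen_transfer_rate(q_gas, y_in, y_out, pressure, temp_k, v_liquid):
    """OTR in mol O2 per litre of liquid per hour.

    q_gas:    gas flow, L/h
    y_in/out: O2 mole fraction at inlet / in the off-gas
    pressure: Pa;  temp_k: K;  v_liquid: litres of liquid
    """
    c_in = y_in * pressure / (R * temp_k) / 1000.0   # mol O2 per L of gas
    c_out = y_out * pressure / (R * temp_k) / 1000.0
    return q_gas * (c_in - c_out) / v_liquid

# Illustrative numbers: air in (20.9% O2), slightly depleted off-gas.
otr = oxygen_transfer_rate(q_gas=60.0, y_in=0.209, y_out=0.195,
                           pressure=101325.0, temp_k=293.15, v_liquid=4.0)
print(otr)
```

NTR and CTR are built the same way from the N2 and CO2 balances; CTR is the signal that must additionally account for the bicarbonate buffering noted above.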
Abstract:
We are witnessing an enormous growth in biological nitrogen removal from wastewater. It presents specific challenges beyond traditional COD (carbon) removal. A possibility for optimised process design is the use of biomass-supporting media. In this paper, attached growth processes (AGP) are evaluated using dynamic simulations. The advantages of these systems, which were qualitatively described elsewhere, are validated quantitatively based on a simulation benchmark for activated sludge treatment systems. This simulation benchmark is extended with a biofilm model that allows for fast and accurate simulation of the conversion of different substrates in a biofilm. The economic feasibility of this system is evaluated using the data generated with the benchmark simulations. Capital savings due to volume reduction and reduced sludge production are weighed against increased aeration costs. In this evaluation, effluent quality is integrated as well.
Abstract:
We detail the automatic construction of R matrices corresponding to (the tensor products of) the (0_m|α) families of highest-weight representations of the quantum superalgebras Uq[gl(m|n)]. These representations are irreducible, contain a free complex parameter α, and are 2^(mn)-dimensional. Our R matrices are actually (sparse) rank 4 tensors, containing a total of 2^(4mn) components, each of which is in general an algebraic expression in the two complex variables q and α. Although the constructions are straightforward, we describe them in full here, to fill a perceived gap in the literature. As the algorithms are generally impracticable for manual calculation, we have implemented the entire process in MATHEMATICA, illustrating our results with Uq[gl(3|1)]. (C) 2002 Published by Elsevier Science B.V.
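The paper's construction is carried out in MATHEMATICA for Uq[gl(m|n)]. As a far simpler illustration of the consistency condition any such R matrix must satisfy, the sketch below verifies the Yang-Baxter equation numerically for the rational sl(2) R matrix R(u) = u*I + P (a standard textbook solution, not one of the paper's trigonometric superalgebra R matrices).

```python
# Verify the Yang-Baxter equation
#   R12(u-v) R13(u) R23(v) = R23(v) R13(u) R12(u-v)
# for the rational sl(2) R matrix R(u) = u*I + P (P = permutation operator),
# acting on (C^2)^(x3), i.e. as 8x8 matrices. This is a textbook solution,
# not one of the paper's Uq[gl(m|n)] R matrices.

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def r_matrix(u, i, j):
    """R_ij(u) = u*I + P_ij acting on three qubit slots 0,1,2 (8x8)."""
    dim = 8
    m = [[0.0] * dim for _ in range(dim)]
    for col in range(dim):
        bits = [(col >> (2 - s)) & 1 for s in range(3)]
        bits[i], bits[j] = bits[j], bits[i]          # permutation part P_ij
        row = (bits[0] << 2) | (bits[1] << 1) | bits[2]
        m[row][col] += 1.0
        m[col][col] += u                             # u * identity part
    return m

u, v = 1.3, 0.7
lhs = matmul(matmul(r_matrix(u - v, 0, 1), r_matrix(u, 0, 2)), r_matrix(v, 1, 2))
rhs = matmul(matmul(r_matrix(v, 1, 2), r_matrix(u, 0, 2)), r_matrix(u - v, 0, 1))
ok = all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(8) for j in range(8))
print(ok)   # True
```

The paper's algorithm produces (and must check) exactly this kind of identity, but for sparse rank 4 tensors of symbolic entries in q and α rather than small numeric matrices.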
Abstract:
The microwave and thermal cure processes for the epoxy-amine systems (epoxy resin diglycidyl ether of bisphenol A, DGEBA) with 4,4'-diaminodiphenyl sulphone (DDS) and 4,4'-diaminodiphenyl methane (DDM) have been investigated for 1:1 stoichiometries by using fiber-optic FT-NIR spectroscopy. The DGEBA used was in the form of Ciba-Geigy GY260 resin. The DDM system was studied at a single cure temperature of 373 K and a single stoichiometry of 20.94 wt% and the DDS system was studied at a stoichiometry of 24.9 wt% and a range of temperatures between 393 and 443 K. The best values of the kinetic rate parameters for the consumption of amines have been determined by a least squares curve fit to a model for epoxy/amine cure. The activation energies for the polymerization of the DGEBA/DDS system were determined for both cure processes and found to be 66 and 69 kJ mol(-1) for the microwave and thermal cure processes, respectively. No evidence was found for any specific effect of the microwave radiation on the rate parameters, and the systems were both found to be characterized by a negative substitution effect. Copyright (C) 2002 John Wiley & Sons, Ltd.
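The activation energies quoted above come from Arrhenius analysis of rate constants fitted at different cure temperatures. The sketch below shows that standard two-temperature calculation with invented rate constants (chosen so the answer lands near the reported 66 kJ/mol), not the paper's fitted data.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Standard two-point Arrhenius analysis (illustrative rate constants, NOT the
# paper's fitted values): with k = A*exp(-Ea/(R*T)), two temperatures give
#   Ea = R * ln(k1/k2) / (1/T2 - 1/T1)

def activation_energy(k1, t1, k2, t2):
    return R * math.log(k1 / k2) / (1.0 / t2 - 1.0 / t1)

# Hypothetical rate constants at 393 K and 443 K (the DDS temperature range),
# constructed so that Ea comes out at 66 kJ/mol.
k1, t1 = 2.0e-4, 393.0
k2, t2 = 2.0e-4 * math.exp(66000.0 / R * (1 / 393.0 - 1 / 443.0)), 443.0
ea = activation_energy(k1, t1, k2, t2)
print(round(ea / 1000.0, 1))   # 66.0 kJ/mol
```

The paper's finding that microwave and thermal cure give nearly the same Ea (66 vs 69 kJ/mol) is what rules out a specific microwave effect on the kinetics.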
Abstract:
We model the behavior of an ion trap with all ions driven simultaneously and coupled collectively to a heat bath. The equations for this system are similar to the irreversible dynamics of a collective angular momentum system known as the Dicke model. We show how the steady state of the ion trap as a dissipative many-body system driven far from equilibrium can exhibit quantum entanglement. We calculate the entanglement of this steady state for two ions in the trap and in the case of more than two ions we calculate the entanglement between two ions by tracing over all the other ions. The entanglement in the steady state is a maximum for the parameter values corresponding roughly to a bifurcation of a fixed point in the corresponding semiclassical dynamics. We conjecture that this is a general mechanism for entanglement creation in driven dissipative quantum systems.
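For two qubits (ions) in a pure state, the entanglement being maximised above can be quantified by the concurrence: for |psi> = a|00> + b|01> + c|10> + d|11>, C = 2|ad - bc|. This is the standard pure-state formula, shown here only to illustrate the quantity; the paper's steady state is mixed, which requires Wootters' mixed-state concurrence after the partial trace.

```python
import math

# Standard concurrence of a PURE two-qubit state |psi> = a|00>+b|01>+c|10>+d|11>:
#   C = 2*|a*d - b*c|;  C = 0 for product states, C = 1 for Bell states.
# (The paper's dissipative steady state is mixed, needing Wootters' formula;
# this pure-state version just illustrates the quantity being maximised.)

def concurrence(a, b, c, d):
    norm = math.sqrt(abs(a) ** 2 + abs(b) ** 2 + abs(c) ** 2 + abs(d) ** 2)
    a, b, c, d = (x / norm for x in (a, b, c, d))
    return 2.0 * abs(a * d - b * c)

print(concurrence(1, 0, 0, 0))   # product state |00>      -> 0.0
print(concurrence(1, 0, 0, 1))   # Bell state (|00>+|11>)  -> 1.0
print(round(concurrence(math.cos(0.3), 0, 0, math.sin(0.3)), 3))  # partially entangled
```

Tracking how such a measure peaks as a drive parameter is varied is how one locates the bifurcation point the abstract associates with maximal steady-state entanglement.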
Abstract:
The effect of test temperature, which controls the stability of austenite, on the impact toughness of a low carbon Fe-Ni-Mn-C austenitic steel and 304 stainless steel has been investigated. Under impact conditions, stress-induced martensitic transformation occurred, in a region near the fracture surface, at test temperatures below 80 °C for the Fe-Ni-Mn-C steel and below -25 °C for 304 stainless steel. The former shows significant transformation toughening, and the highest impact toughness was obtained at 10 °C, which corresponds to the maximum amount of martensite formed by stress-induced transformation above the Ms temperature. The stress-induced martensitic transformation contributes negatively to the impact toughness in the 304 stainless steel: increasing the amount of stress-induced transformation to martensite lowered the impact toughness. The experimental results can be well explained by the Antolovich theory through the analysis of metallography and fractography. The different effect of stress-induced transformation on the impact toughness in Fe-Ni-Mn-C steel and 304 stainless steel has been further understood by applying the crystallographic model for stress-induced martensitic transformation to these two steels. (C) 2002 Kluwer Academic Publishers.
Abstract:
Conditions have been developed for genetic transformation and insertional mutagenesis in Leifsonia xyli subsp. xyli (Lxx), the causal organism of ratoon stunting disease (RSD), one of the most damaging and intractable diseases of sugarcane internationally. Transformation frequencies ranged from 1 to 10 colony forming units (CFU)/µg of plasmid DNA using Clavibacter/Escherichia coli shuttle vectors pCG188, pDM302, and pDM306 and ranged from 50 to 500 CFU/µg using cosmid cloning vectors pLAFR3 and pLAFR5-km. The transformation/transposition frequency was 0 to 70 CFU/µg of DNA, using suicide vectors pUCD623 and pSLTP2021 containing transposable elements Tn4431 and Tn5, respectively. It was necessary to grow Lxx in media containing 0.1% glycine for electroporation and to amplify large plasmids in a dam(-)/dcm(-) E. coli strain and purify the DNA by anion exchange. To keep selection pressure at an optimum, the transformants were grown on nitrocellulose filters (0.2-µm pore size) on media containing the appropriate antibiotics. Transposon Tn4431, containing a promoterless lux operon from Vibrio fischeri and a tetracycline-resistance gene, was introduced on the suicide vector pUCD623. All but 1% of the putative transposon mutants produce light, indicating transposition into functional Lxx genes. Southern blot analysis of these transformants indicates predominantly single transposon insertions at unique sites. The cosmid cloning vector pLAFR5-km was stably maintained in Lxx. The development of a transformation and transposon mutagenesis system opens the way for molecular analysis of pathogenicity determinants in Lxx.
Abstract:
This paper addresses robust model-order reduction of a high-dimensional nonlinear partial differential equation (PDE) model of a complex biological process. Based on a nonlinear, distributed parameter model of the same process, which was validated against experimental data from an existing, pilot-scale BNR activated sludge plant, we developed a state-space model with 154 state variables in this work. A general algorithm for robustly reducing the nonlinear PDE model is presented, and based on an investigation of five state-of-the-art model-order reduction techniques, we are able to reduce the original model to a model with only 30 states without incurring pronounced modelling errors. The singular perturbation approximation balanced truncation technique is found to give the lowest modelling errors in low frequency ranges and hence is deemed most suitable for controller design and other real-time applications. (C) 2002 Elsevier Science Ltd. All rights reserved.
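A tiny two-state example shows the core idea behind singular perturbation approximation (the technique the paper found best at low frequencies), reduced here to its simplest linear form rather than the 154-state plant model: the fast state is assumed to reach quasi steady state and is eliminated, which preserves the DC (steady-state) gain, unlike plain truncation.

```python
# Minimal singular perturbation reduction of a two-state linear system
# (illustrative numbers only, not the 154-state plant model):
#   dx1/dt = a11*x1 + a12*x2 + b1*u
#   dx2/dt = a21*x1 + a22*x2 + b2*u   (x2 fast: a22 large and negative)
# Setting dx2/dt = 0 eliminates x2 and preserves the DC gain.

a11, a12 = -1.0, 0.5
a21, a22 = 2.0, -50.0
b1, b2 = 1.0, 3.0

# Residualise x2:  x2 = -(a21*x1 + b2*u) / a22
a_red = a11 - a12 * a21 / a22
b_red = b1 - a12 * b2 / a22

# Check: steady-state x1 for a constant input u matches the full model.
u = 1.0
det = a11 * a22 - a12 * a21
x1_full = (-b1 * a22 + a12 * b2) * u / det   # Cramer's rule on A x = -B u
x1_red = -b_red * u / a_red
print(x1_full, x1_red)   # identical steady states
```

This low-frequency exactness is precisely why the singular perturbation variant beats plain balanced truncation for slow-process control applications like activated sludge plants.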
Abstract:
The Agricultural Production Systems sIMulator, APSIM, is a cropping system modelling environment that simulates the dynamics of soil-plant-management interactions within a single crop or a cropping system. Adaptation of previously developed crop models has resulted in multiple crop modules in APSIM, which have low scientific transparency and code efficiency. A generic crop model template (GCROP) has been developed to capture unifying physiological principles across crops (plant types) and to provide modular and efficient code for crop modelling. It comprises a standard crop interface to the APSIM engine, a generic crop model structure, a crop process library, and well-structured crop parameter files. The process library contains the major science underpinning the crop models and incorporates generic routines based on physiological principles for growth and development processes that are common across crops. It allows APSIM to simulate different crops using the same set of computer code. The generic model structure and parameter files provide an easy way to test, modify, exchange and compare modelling approaches at the process level without necessitating changes in the code. The standard interface generalises the model inputs and outputs, and utilises a standard protocol to communicate with other APSIM modules through the APSIM engine. The crop template serves as a convenient means to test new insights and compare approaches to component modelling, while maintaining a focus on predictive capability. This paper describes and discusses the scientific basis, the design, implementation and future development of the crop template in APSIM. On this basis, we argue that the combination of good software engineering with sound crop science can enhance the rate of advance in crop modelling. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
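The design idea of "same code, different parameter files" can be sketched in a few lines: one shared process routine from a process library, with everything crop-specific supplied as data. The routine, parameter names, and numbers below are illustrative inventions, not APSIM's actual code or parameter files.

```python
# Sketch of the generic-crop-template idea only: one shared process routine,
# per-crop parameters supplied as data. Names and numbers are illustrative,
# not APSIM's actual code or parameter files.

def thermal_time(t_mean, base_temp):
    """Generic development routine shared by all crops (degree-days/day)."""
    return max(0.0, t_mean - base_temp)

# The "parameter files": the only thing that differs between crops is data.
CROP_PARAMS = {
    "wheat":   {"base_temp": 0.0,  "tt_to_flowering": 1000.0},
    "sorghum": {"base_temp": 11.0, "tt_to_flowering": 800.0},
}

def days_to_flowering(crop, daily_mean_temps):
    """Run the SAME code for any crop; accumulate thermal time to a target."""
    p = CROP_PARAMS[crop]
    tt = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        tt += thermal_time(t, p["base_temp"])
        if tt >= p["tt_to_flowering"]:
            return day
    return None

temps = [25.0] * 120                       # a constant warm season
print(days_to_flowering("wheat", temps))   # 1000 / 25 per day -> day 40
print(days_to_flowering("sorghum", temps)) # 800 / 14 per day  -> day 58
```

Swapping a modelling approach then means editing a parameter file or replacing one library routine, with no change to the crop modules themselves, which is the maintainability argument the paper makes.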