35 results for GIBBS FORMALISM
Abstract:
This paper summarizes the processes involved in designing a mathematical model of a growing pasture plant, Stylosanthes scabra Vog. cv. Fitzroy. The model is based on the mathematical formalism of Lindenmayer systems and yields realistic computer-generated images of the progressive plant geometry through time. The processes involved in obtaining growth data, deriving useful growth rules, and constructing a virtual plant model are outlined. The progressive morphological output proved useful for predicting total leaf area and allowed for easier quantification of plant canopy size in terms of biomass and total leaf area.
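As a minimal illustration of the Lindenmayer-system formalism the model is based on, the Python sketch below rewrites a string of plant modules in parallel; the axiom and production rule are hypothetical placeholders, not the published Stylosanthes rules.

```python
# Minimal sketch of a Lindenmayer system (L-system): every symbol is
# rewritten in parallel at each derivation step. The rule below is a
# hypothetical placeholder (A = apex, I = internode, L = leaf, [] = branch).

def lsystem(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

rules = {"A": "I[L]A"}        # apex produces internode, leaf, new apex
print(lsystem("A", rules, 3)) # -> I[L]I[L]I[L]A
```

An interpreter then maps the resulting string onto 3D geometry, which is where the computer-generated images and leaf-area predictions come from.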
Abstract:
We show that quantum feedback control can be used as a quantum-error-correction process for errors induced by a weak continuous measurement. In particular, when the error model is restricted to one, perfectly measured, error channel per physical qubit, quantum feedback can act to perfectly protect a stabilizer codespace. Using the stabilizer formalism we derive an explicit scheme, involving feedback and an additional constant Hamiltonian, to protect an (n-1)-qubit logical state encoded in n physical qubits. This works for both Poisson (jump) and white-noise (diffusion) measurement processes. Universal quantum computation is also possible in this scheme. As an example, we show that detected-spontaneous-emission error correction with a driving Hamiltonian can greatly reduce the amount of redundancy required to protect a state compared with what has been previously postulated [e.g., Alber et al., Phys. Rev. Lett. 86, 4402 (2001)].
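For readers new to the stabilizer formalism, the dimension count behind the (n-1)-in-n encoding is worth making explicit: a stabilizer group with k independent generators on n qubits fixes a codespace of dimension 2^(n-k), so a single generator suffices here.

```latex
% Codespace fixed by stabilizer generators S_1, ..., S_k on n qubits:
\mathcal{C} \;=\; \bigl\{\, |\psi\rangle : S_i|\psi\rangle = |\psi\rangle,\ i = 1,\dots,k \,\bigr\},
\qquad \dim \mathcal{C} \;=\; 2^{\,n-k} \;\overset{k=1}{=}\; 2^{\,n-1}.
```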
Abstract:
Recent advances in computer technology have made it possible to create virtual plants by simulating the details of structural development of individual plants. Software has been developed that processes plant models expressed in a special purpose mini-language based on the Lindenmayer system formalism. These models can be extended from their architectural basis to capture plant physiology by integrating them with crop models, which estimate biomass production as a consequence of environmental inputs. Through this process, virtual plants will gain the ability to react to broad environmental conditions, while crop models will gain a visualisation component. This integration requires the resolution of the fundamentally different time scales underlying the approaches. Architectural models are usually based on physiological time; each time step encompasses the same amount of development in the plant, without regard to the passage of real time. In contrast, physiological models are based in real time; the amount of development in a time step is dependent on environmental conditions during the period. This paper provides a background on the plant modelling language, then describes how widely-used concepts of thermal time can be implemented to resolve these time scale differences. The process is illustrated using a case study. (C) 1997 Elsevier Science Ltd.
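The thermal-time coupling the paper describes can be sketched in a few lines: real-time weather determines how many physiological (architectural) time steps the plant model takes. This is a minimal sketch assuming the standard degree-day formula; the base temperature and degree-days-per-step values are hypothetical placeholders.

```python
# Sketch of driving an architectural model in physiological time from
# real-time weather via thermal time (degree-days). Parameter values are
# hypothetical placeholders.
T_BASE = 8.0     # base temperature (deg C); no development below this
STEP_DD = 15.0   # degree-days of thermal time per L-system derivation step

def daily_thermal_time(t_min, t_max):
    """Standard degree-day formula: daily mean above the base, floored at 0."""
    return max(0.0, (t_min + t_max) / 2.0 - T_BASE)

def derivation_steps(daily_min_max):
    """Convert a run of real days into whole physiological steps, carrying the remainder."""
    accumulated, steps = 0.0, 0
    for t_min, t_max in daily_min_max:
        accumulated += daily_thermal_time(t_min, t_max)
        while accumulated >= STEP_DD:
            accumulated -= STEP_DD
            steps += 1          # advance the L-system model one derivation
    return steps

print(derivation_steps([(12, 24), (10, 20), (14, 28)]))   # -> 2
```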
Abstract:
This paper offers a defense of backwards-in-time causation models in quantum mechanics. Particular attention is given to Cramer's transactional account, which is shown to have the threefold virtue of solving the Bell problem, explaining the complex-conjugate aspect of the quantum mechanical formalism, and explaining various quantum mysteries such as Schrödinger's cat. The question is therefore asked: why has this model not received more attention from physicists and philosophers? One objection physicists have raised in assessing Cramer's theory is that it is not testable. This paper seeks to answer this concern by utilizing an argument that backwards causation models entail a fork theory of causal direction. From the backwards causation model together with the fork theory one can deduce empirical predictions. Finally, the objection that this strategy is questionable because of its appeal to philosophy is deflected.
Abstract:
We study the scattering of the quantized electromagnetic field from a linear, dispersive dielectric using the scattering formalism for quantum fields. The medium is modeled as a collection of harmonic oscillators with a number of distinct resonance frequencies. This model corresponds to the Sellmeier expansion, which is widely used to describe experimental data for real dispersive media. The integral equation for the interpolating field in terms of the in field is solved and the solution used to find the out field. The relation between the in and out creation and annihilation operators is found, which allows one to calculate the S matrix for this system. In this model, we find that there are absorption bands, but the input-output relations are completely unitary. No additional quantum-noise terms are required.
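The Sellmeier expansion mentioned above is the standard multi-resonance dispersion formula, one term per oscillator species in the model:

```latex
% Sellmeier expansion: refractive index of a medium with resonances at
% wavelengths sqrt(C_i); the B_i play the role of oscillator strengths.
n^{2}(\lambda) \;=\; 1 \;+\; \sum_{i} \frac{B_{i}\,\lambda^{2}}{\lambda^{2} - C_{i}}
```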
Abstract:
A model has been developed which enables the viscosities of coal ash slags to be predicted as a function of composition and temperature under reducing conditions. The model describes both completely liquid and heterogeneous, i.e. partly crystallised, slags in the Al2O3-CaO-'FeO'-SiO2 system in equilibrium with metallic iron. The Urbain formalism has been modified to describe the viscosities of the liquid slag phase over the complete range of compositions and a wide range of temperatures. The computer package F*A*C*T was used to predict the proportions of solids and the compositions of the remaining liquid phases. The Roscoe equation has been used to describe the effect of suspended solids (the slurry effect) on the viscosity of partly crystallised slag systems. The model provides a good description of the experimental data for fully liquid and liquid + solid mixtures over the complete range of compositions and a wide range of temperatures. This model can now be used for viscosity predictions in industrial slag systems. Examples of the application of the new model to coal ash fluxing and blending are given in the paper. (C) 2001 Elsevier Science Ltd. All rights reserved.
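A minimal numerical sketch of the two ingredients named above, assuming the common Weymann-Frenkel form of the Urbain formalism and Roscoe's suspension correction; the A and B values are placeholders, not the fitted composition-dependent parameters of the published model.

```python
# Sketch of the liquid-phase Urbain viscosity plus the Roscoe slurry
# correction. A and B below are hypothetical placeholders; in the model
# above they are fitted functions of slag composition.
import math

def urbain_viscosity(temp_k, a, b):
    """Weymann-Frenkel form used by Urbain: eta = A * T * exp(1000*B / T)."""
    return a * temp_k * math.exp(1000.0 * b / temp_k)

def roscoe_slurry(eta_liquid, solid_fraction, c=1.35, n=2.5):
    """Roscoe correction for suspended solids: eta_eff = eta * (1 - c*f)^-n."""
    return eta_liquid * (1.0 - c * solid_fraction) ** (-n)

eta_liq = urbain_viscosity(1773.0, a=1.2e-7, b=25.0)   # placeholder A, B
print(roscoe_slurry(eta_liq, solid_fraction=0.10))     # with 10 vol% crystals
```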
Abstract:
Formulations of fuzzy integral equations in terms of the Aumann integral do not reflect the behavior of corresponding crisp models. Consequently, they are ill-adapted to describe physical phenomena, even when vagueness and uncertainty are present. A similar situation for fuzzy ODEs has been obviated by interpretation in terms of families of differential inclusions. The paper extends this formalism to fuzzy integral equations and shows that the resulting solution sets and attainability sets are fuzzy and far better descriptions of uncertain models involving integral equations. The investigation is restricted to Volterra type equations with mildly restrictive conditions, but the methods are capable of extensive generalization to other types and more general assumptions. The results are illustrated by integral equations relating to control models with fuzzy uncertainties.
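In outline, the extension works level-set-wise: each alpha-cut of the fuzzy right-hand side turns the Volterra equation into an integral inclusion, and the fuzzy solution and attainability sets are rebuilt from the solution sets of this family.

```latex
% Alpha-cut reading of a fuzzy Volterra equation as a family of integral
% inclusions, one per membership level alpha:
x(t) \;\in\; x_{0} \;+\; \int_{0}^{t} F_{\alpha}\bigl(s, x(s)\bigr)\,ds,
\qquad \alpha \in [0, 1].
```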
Abstract:
The risk of cardiac events in patients undergoing major noncardiac surgery depends on their clinical characteristics and the results of stress testing. The purpose of this study was to develop a composite approach to defining levels of risk and to examine whether different approaches to prophylaxis influenced this prediction of outcome. One hundred forty-five consecutive patients (aged 68 +/- 9 years, 79 men) with >1 clinical risk variable were studied with standard dobutamine-atropine stress echo before major noncardiac surgery. Risk levels were stratified according to the presence of ischemia (new or worsening wall motion abnormality), ischemic threshold (heart rate at development of ischemia), and number of clinical risk variables. Patients were followed for perioperative events (during hospital admission) and death or infarction over the subsequent 16 +/- 10 months. Ten perioperative events occurred in 105 patients who proceeded to surgery (10%, 95% confidence interval [CI] 5% to 17%); 40 patients were cancelled because of cardiac or other risk. No ischemia was identified in 56 patients, 1 of whom (1.8%) had a perioperative infarction. Of the 49 patients with ischemia, 22 (45%) had 1 or 2 clinical risk factors; 2 (9%, 95% CI 1% to 29%) had events. Another 15 patients had a high ischemic threshold and 3 or 4 risk factors; 3 (20%, 95% CI 4% to 48%) had events. Twelve patients had a low ischemic threshold and 3 or 4 risk factors; 4 (33%, 95% CI 10% to 65%) had events. Preoperative myocardial revascularization was performed in only 3 patients, none of whom had events. Perioperative and long-term events occurred despite the use of beta blockers; 7 of 41 beta-blocker-treated patients had a perioperative event (17%, 95% CI 7% to 32%), and these treated patients were at higher anticipated risk than untreated patients (20 +/- 24% vs 10 +/- 19%, p = 0.02). The total event rate over late follow-up was 13%, and was predicted by dobutamine-atropine stress echo results and heart rate response. (C) 2002 by Excerpta Medica, Inc.
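Read as a decision rule, the stratification above amounts to a few nested tests; the tier labels in this sketch are illustrative, and the event rates in the comments are those reported in the abstract.

```python
# Sketch of the composite risk stratification described above. Tier names
# are illustrative; event rates in comments are those reported in the study.

def risk_tier(ischemia, high_ischemic_threshold, n_clinical_risk_factors):
    if not ischemia:
        return "low"            # 1.8% perioperative infarction rate
    if n_clinical_risk_factors <= 2:
        return "intermediate"   # 9% event rate
    if high_ischemic_threshold:
        return "high"           # 20% event rate (3-4 risk factors)
    return "highest"            # 33% event rate (3-4 risk factors, low threshold)

print(risk_tier(ischemia=True, high_ischemic_threshold=False,
                n_clinical_risk_factors=3))    # -> highest
```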
Abstract:
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
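The predicate-transformer setting can be stated in one line: refinement is preservation of weakest preconditions, so target code is a correct compilation of a source program exactly when it refines it.

```latex
% Refinement in weakest-precondition semantics: target T correctly
% implements source S when T establishes every postcondition S does.
S \sqsubseteq T \;\iff\; \forall Q.\; wp(S, Q) \Rightarrow wp(T, Q)
% Example transformer rules:
%   wp(x := e, Q) = Q[e/x]        wp(S_1 ; S_2, Q) = wp(S_1, wp(S_2, Q))
```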
Abstract:
In this paper we establish a foundation for understanding the instrumentation needs of complex dynamic systems if ecological interface design (EID)-based interfaces are to be robust in the face of instrumentation failures. EID-based interfaces often include configural displays which reveal the higher-order properties of complex systems. However, concerns have been expressed that such displays might be misleading when instrumentation is unreliable or unavailable. Rasmussen's abstraction hierarchy (AH) formalism can be extended to include representations of sensors near the functions or properties about which they provide information, resulting in what we call a sensor-annotated abstraction hierarchy. Sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on higher-order system information by showing how the data provided from individual sensors propagates within and across levels of abstraction in the AH. The use of sensor-annotated AHs with a configural display is illustrated with a simple water reservoir example. We argue that if EID is to be effectively employed in the design of interfaces for complex systems, then the information needs of the human operator need to be considered at the earliest stages of system development while instrumentation requirements are being formulated. In this way, Rasmussen's AH promotes a formative approach to instrumentation engineering. (C) 2002 Elsevier Science Ltd. All rights reserved.
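A minimal data-structure sketch of the sensor annotation idea, with hypothetical node and sensor names: each node in the hierarchy is either measured directly or derived from nodes below it, so the loss of a sensor can be traced upward to the higher-order properties it supports.

```python
# Sketch of a sensor-annotated abstraction hierarchy: nodes are measured by
# sensors or derived from lower-level nodes, so a failed sensor's impact on
# higher-order information can be propagated upward. Names are hypothetical.

NODES = {
    "mass_inventory": {"derived_from": ["inflow", "outflow", "level"]},
    "inflow":         {"sensors": ["FT-101"]},
    "outflow":        {"sensors": ["FT-102"]},
    "level":          {"sensors": ["LT-201", "LT-202"]},   # redundant pair
}

def supported(node, failed):
    """A node survives if any of its sensors do, or all nodes it derives from do."""
    spec = NODES[node]
    if "sensors" in spec:
        return any(s not in failed for s in spec["sensors"])
    return all(supported(p, failed) for p in spec["derived_from"])

print(supported("mass_inventory", failed={"LT-201"}))   # True: redundancy covers it
print(supported("mass_inventory", failed={"FT-101"}))   # False: inflow is blind
```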
Abstract:
Developments in computer and three-dimensional (3D) digitiser technologies have made it possible to keep track of the broad range of data required to simulate an insect moving around or over the highly heterogeneous habitat of a plant's surface. Properties of plant parts vary within a complex canopy architecture, and insect damage can induce further changes that affect an animal's movements, development and likelihood of survival. Models of plant architectural development based on Lindenmayer systems (L-systems) serve as dynamic platforms for simulation of insect movement, providing an explicit model of the developing 3D structure of a plant as well as allowing physiological processes associated with plant growth and responses to damage to be described and simulated. Simple examples of the use of the L-system formalism to model insect movement, operating at different spatial scales, from insects foraging on an individual plant to insects flying around plants in a field, are presented. Such models can be used to explore questions about the consequences of changes in environmental architecture and configuration on host finding, exploitation and its population consequences. In effect this model is a 'virtual ecosystem' laboratory to address local as well as landscape-level questions pertinent to plant-insect interactions, taking plant architecture into account. (C) 2002 Elsevier Science B.V. All rights reserved.
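As a toy illustration of the smallest scale discussed, the sketch below parses an L-system-derived string into modules and lets an insect take a biased random walk over them, preferring leaves; the plant string, adjacency rule and preference weights are all hypothetical placeholders.

```python
# Toy sketch: insect movement over modules of an L-system-derived plant
# string. Adjacency and preference weights are hypothetical placeholders.
import random

plant = list("I[L]I[L]I[L]A")   # string from an L-system derivation
modules = [i for i, ch in enumerate(plant) if ch in "ILA"]

def step(pos):
    """Move to a nearby module, twice as likely onto a leaf (L)."""
    neighbours = [m for m in modules if m != pos and abs(m - pos) <= 3]
    weights = [2 if plant[m] == "L" else 1 for m in neighbours]
    return random.choices(neighbours, weights=weights)[0]

pos = modules[0]
for _ in range(10):
    pos = step(pos)
print("insect ends on module:", plant[pos])
```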
Abstract:
We compare Bayesian methodology utilizing the freeware package BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
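As a reminder of the mechanics that BUGS automates, here is a minimal two-block Gibbs sampler for the mean and variance of normal data under standard noninformative priors; it is a generic illustration, not the twin-specific ACE model above.

```python
# Minimal Gibbs sampler for (mu, sigma^2) of normal data, alternating draws
# from each full conditional -- the mechanics BUGS automates for far richer
# models. Data and priors here are illustrative only (flat prior on mu,
# Jeffreys-style prior on sigma^2).
import math
import random

data = [1.2, 0.7, 1.9, 1.4, 0.9, 1.6]
n, ybar = len(data), sum(data) / len(data)

mu, sig2 = 0.0, 1.0
draws = []
for it in range(5000):
    # mu | sigma^2, y  ~  Normal(ybar, sigma^2 / n)
    mu = random.gauss(ybar, math.sqrt(sig2 / n))
    # sigma^2 | mu, y  ~  Inverse-Gamma(n/2, sum((y_i - mu)^2) / 2)
    rate = sum((y - mu) ** 2 for y in data) / 2.0
    sig2 = rate / random.gammavariate(n / 2.0, 1.0)
    if it >= 1000:                       # discard burn-in
        draws.append((mu, sig2))

print(sum(m for m, _ in draws) / len(draws))   # posterior mean of mu
```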
Abstract:
Despite extensive efforts to confirm a direct association between Chlamydia pneumoniae and atherosclerosis, different laboratories continue to report a large variability in detection rates. In this study, we analyzed multiple sections from atherosclerotic carotid arteries from 10 endarterectomy patients to determine the location of C. pneumoniae DNA and the number of sections of the plaque required for analysis to obtain a 95% confidence of detecting the bacterium. A sensitive nested PCR assay detected C. pneumoniae DNA in all patients at one or more locations within the plaque. On average, 42% (ranging from 5 to 91%) of the sections from any single patient had C. pneumoniae DNA present. A patchy distribution of C. pneumoniae in the atherosclerotic lesions was observed, with no area of the carotid having significantly more C. pneumoniae DNA present. If a single random 30-µm-thick section was tested, there was only a 35.6 to 41.6% (95% confidence interval) chance of detecting C. pneumoniae DNA in a patient with carotid artery disease. A minimum of 15 sections would therefore be required to obtain a 95% chance of detecting all true positives. The low concentration and patchy distribution of C. pneumoniae DNA in atherosclerotic plaque appear to be among the reasons for the inconsistency between laboratories in the results reported.
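The section-count reasoning follows the standard at-least-one-positive calculation, sketched below under the simplifying assumption of a uniform, independent per-section detection probability p; the study's own 15-section figure also absorbs the patchy patient-to-patient variability (5 to 91% of sections positive), which this uniform-p sketch ignores.

```python
# Standard calculation: with per-section detection probability p and
# independent sections, P(at least one positive in k sections) = 1-(1-p)^k.
# Assumes a uniform p, which the patchy distribution above violates.
import math

def sections_needed(p, confidence=0.95):
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

print(sections_needed(0.19))   # -> 15 sections when p is about 0.19
```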
Abstract:
Passive avoidance learning is studied to advantage in day-old chicks trained to distinguish between beads of two different colors, one of which was associated with an aversive taste at training. During the first 30 min post-training, two periods of glutamate release occur in the forebrain. One period is immediately after the aversive experience, when glutamate release is confined to the left hemisphere. A second release, 30 min later, may be bilateral, perhaps with a preponderance in the right hemisphere. The present study showed increased pool sizes of glutamate and glutamine, specifically in the left hemisphere, at the time when the first glutamate release occurs, indicating de novo synthesis of glutamate/glutamine from glucose or glycogen, which are the only possible substrates. Behavioral evidence that memory is extinguished by intracranial administration of iodoacetate, an inhibitor of glycolysis and glycogenolysis, at this time, and that the extinction of memory is counteracted by injection of glutamine, supports this concept. A decrease in forebrain glycogen of similar magnitude, coinciding with the increase in glutamate and glutamine, suggests that glycogen rather than glucose is the main source of newly synthesized glutamate/glutamine. The second activation of glutamatergic activity 30 min after training, when memory is consolidated into stable, long-term memory, is associated with a bilateral increase in the pool size of glutamate/glutamine. No glycogenolysis was observed at this time, but again there is a temporal correlation with sensitivity to inhibition by iodoacetate and rescue by glutamine, indicating the importance of de novo synthesis of glutamate/glutamine from glucose or glycogen. (C) 2003 Elsevier B.V. All rights reserved.