81 results for Framework Model
Abstract:
Climate change is one of the major challenges facing economic systems at the start of the 21st century. Reducing greenhouse gas emissions will require both restructuring the energy supply system (production) and addressing the efficiency and sufficiency of the social uses of energy (consumption). The energy production system is a complicated supply network of interlinked sectors with 'knock-on' effects throughout the economy. End use energy consumption is governed by complex sets of interdependent cultural, social, psychological and economic variables driven by shifts in consumer preference and technological development trajectories. To date, few models have been developed for exploring alternative joint energy production-consumption systems. The aim of this work is to propose one such model. This is achieved in a methodologically coherent manner through integration of qualitative input-output models of production, with Bayesian belief network models of consumption, at point of final demand. The resulting integrated framework can be applied either (relatively) quickly and qualitatively to explore alternative energy scenarios, or as a fully developed quantitative model to derive or assess specific energy policy options. The qualitative applications are explored here.
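Purely as an illustration of how such a production-consumption coupling might look, the sketch below pairs a toy three-sector Leontief input-output model of production with a two-node belief-network calculation that sets household final demand. The sector structure, technical coefficients and probabilities are hypothetical placeholders, not values from the paper.

```python
import numpy as np

# Minimal sketch (not the authors' model): a 3-sector Leontief input-output
# production model driven by a final demand vector derived from a toy
# Bayesian belief network of consumer behaviour. Sector names, technical
# coefficients and probabilities are hypothetical.

A = np.array([[0.10, 0.30, 0.05],   # technical coefficients: energy, industry, services
              [0.20, 0.15, 0.10],
              [0.05, 0.10, 0.05]])

# Toy two-node belief network: P(efficient end-use technology) -> P(high household demand)
p_efficiency = 0.6                                  # prior on efficient end-use technology
p_high_demand = {True: 0.3, False: 0.7}             # conditional probability of high demand
p_demand_high = (p_efficiency * p_high_demand[True]
                 + (1 - p_efficiency) * p_high_demand[False])

# Expected household final demand mixes a "high" and a "low" consumption profile
f_high = np.array([120.0, 80.0, 60.0])
f_low = np.array([80.0, 70.0, 55.0])
final_demand = p_demand_high * f_high + (1 - p_demand_high) * f_low

# Leontief inverse gives total sectoral output needed to satisfy final demand
x = np.linalg.solve(np.eye(3) - A, final_demand)
print("expected final demand:", final_demand)
print("required gross output by sector:", x)
```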
Abstract:
A fermentation system was designed to model the human colonic microflora in vitro. The system provided a framework of mucin beads to encourage the adhesion of bacteria, which was encased within a dialysis membrane. The void between the beads was inoculated with faeces from human donors. Water and metabolites were removed from the fermentation by osmosis using a solution of polyethylene glycol (PEG). The system was concomitantly inoculated alongside a conventional single-stage chemostat. Three fermentations were carried out using inocula from three healthy human donors. Bacterial populations from the chemostat and biofilm system were enumerated using fluorescence in situ hybridization. The culture fluid was also analysed for its short-chain fatty acid (SCFA) content. A higher cell density was achieved in the biofilm fermentation system (taking into account the contribution made by the bead-associated bacteria) as compared with the chemostat, owing to the removal of water and metabolites. Evaluation of the bacterial populations revealed that the biofilm system was able to support two distinct groups of bacteria: bacteria growing in association with the mucin beads and planktonic bacteria in the culture fluid. Furthermore, distinct differences were observed between populations in the biofilm fermenter system and the chemostat, with the former supporting higher populations of clostridia and Escherichia coli. SCFA levels were lower in the biofilm system than in the chemostat, as in the former they were removed via the osmotic effect of the PEG. These experiments demonstrated the potential usefulness of the biofilm system for investigating the complexity of the human colonic microflora and the contribution made by sessile bacterial populations.
Abstract:
Large scientific applications are usually developed, tested and used by a group of geographically dispersed scientists. The problems associated with remote development and data sharing can be tackled by using collaborative working environments. There are various tools and software packages for creating collaborative working environments, and some currently available software frameworks use them to enable remote job submission and file transfer on top of existing grid infrastructures. However, for many large scientific applications, further effort is needed to prepare a framework that offers application-centric facilities. The Unified Air Pollution Model (UNI-DEM), developed by the Danish Environmental Research Institute, is an example of a large scientific application that is under continuous development and experimentation by different institutes across Europe. This paper sets out to design a collaborative distributed computing environment for UNI-DEM in particular, although the proposed framework may also suit many other large scientific applications.
Abstract:
This paper presents a novel intelligent multiple-controller framework incorporating a fuzzy-logic-based switching and tuning supervisor along with a generalised learning model (GLM) for an autonomous cruise control application. The proposed methodology combines the benefits of a conventional proportional-integral-derivative (PID) controller and a PID structure-based (simultaneous) zero and pole placement controller. The switching decision between the two nonlinear fixed-structure controllers is made on the basis of the required performance measure using a fuzzy-logic-based supervisor operating at the highest level of the system. The supervisor is also employed to adaptively tune the parameters of the multiple controllers in order to achieve the desired closed-loop system performance. The intelligent multiple-controller framework is applied to the autonomous cruise control problem in order to maintain a desired vehicle speed by controlling the throttle plate angle in an electronic throttle control (ETC) system. Sample simulation results using a validated nonlinear vehicle model demonstrate the effectiveness of the multiple-controller framework in adaptively tracking desired vehicle speed changes and achieving the desired speed of response, whilst penalising excessive control action.
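As a rough illustration of the switching idea only, the sketch below blends a PID controller with a second fixed-structure controller according to a fuzzy membership on the size of the speed error. The gains, membership breakpoints and first-order vehicle model are hypothetical placeholders, not the paper's validated nonlinear model or its actual supervisor rules.

```python
import numpy as np

# Minimal sketch (not the paper's controller): a fuzzy supervisor blends a PID
# controller with a second fixed-structure controller according to the size of
# the speed error. All gains, membership breakpoints and the plant model are
# hypothetical placeholders.

def pid(error, state, kp=0.8, ki=0.2, kd=0.05, dt=0.1):
    state["i"] += error * dt
    d = (error - state["e_prev"]) / dt
    state["e_prev"] = error
    return kp * error + ki * state["i"] + kd * d

def pole_placement_like(error, state, k1=1.5, k2=0.4):
    # stand-in for the zero/pole placement controller
    return k1 * error - k2 * state["e_prev"]

def fuzzy_weight(error, low=0.5, high=2.0):
    # membership of "large error": 0 below `low`, 1 above `high`, linear in between
    return float(np.clip((abs(error) - low) / (high - low), 0.0, 1.0))

state = {"i": 0.0, "e_prev": 0.0}
speed, target, dt = 20.0, 25.0, 0.1
for step in range(50):
    error = target - speed
    w = fuzzy_weight(error)                       # supervisor decision
    u = w * pole_placement_like(error, state) + (1 - w) * pid(error, state, dt=dt)
    speed += dt * (0.5 * u - 0.05 * speed)        # toy first-order vehicle model
print(f"final speed ~ {speed:.2f} m/s")
```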
Abstract:
We provide a system identification framework for the analysis of THz-transient data. The subspace identification algorithm for both deterministic and stochastic systems is used to model the time-domain responses of structures under broadband excitation. Structures with additional time delays can be modelled within the state-space framework using additional state variables. We compare the numerical stability of the commonly used least-squares ARX models to that of the subspace N4SID algorithm by using examples of fourth-order and eighth-order systems under pulse and chirp excitation conditions. These models correspond to structures having two and four modes simultaneously propagating respectively. We show that chirp excitation combined with the subspace identification algorithm can provide a better identification of the underlying mode dynamics than the ARX model does as the complexity of the system increases. The use of an identified state-space model for mode demixing, upon transformation to a decoupled realization form, is illustrated. Applications of state-space models and the N4SID algorithm to THz transient spectroscopy as well as to optical systems are highlighted.
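For orientation, the sketch below fits a second-order ARX model to synthetic input-output data by ordinary least squares, i.e. the baseline against which the paper compares the subspace (N4SID-type) approach. The "true" coefficients and noise level are invented for the example and do not represent the paper's THz data.

```python
import numpy as np

# Minimal sketch (not the paper's implementation): fit a second-order ARX model
#   y[k] = -a1*y[k-1] - a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
# by least squares to synthetic input/output data. The "true" coefficients are
# hypothetical; a subspace (N4SID-type) fit would replace the regression below.

rng = np.random.default_rng(0)
N = 500
u = rng.standard_normal(N)                       # broadband excitation
a_true, b_true = [-1.5, 0.7], [0.3, 0.1]
y = np.zeros(N)
for k in range(2, N):
    y[k] = (-a_true[0] * y[k-1] - a_true[1] * y[k-2]
            + b_true[0] * u[k-1] + b_true[1] * u[k-2]
            + 0.01 * rng.standard_normal())      # small measurement noise

# Build the regression matrix and solve the least-squares problem
Phi = np.column_stack([-y[1:N-1], -y[0:N-2], u[1:N-1], u[0:N-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:N], rcond=None)
print("estimated [a1, a2, b1, b2]:", np.round(theta, 3))
```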
Abstract:
In this paper, we introduce a novel high-level visual content descriptor devised for performing semantic-based image classification and retrieval. The work can be seen as an attempt to bridge the so-called "semantic gap". The proposed image feature vector model is fundamentally underpinned by an automatic image labelling framework, called Collaterally Cued Labelling (CCL), which combines collateral knowledge extracted from the texts accompanying the images with state-of-the-art low-level visual feature extraction techniques to automatically assign textual keywords to image regions. A subset of the Corel image collection was used for evaluating the proposed method. The experimental results indicate that our semantic-level visual content descriptors outperform both conventional visual and textual image feature models.
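A minimal sketch of the underlying idea of fusing visual and textual evidence into one descriptor (not the CCL framework itself) might look as follows; the feature values, keyword vocabulary and weighting are hypothetical placeholders.

```python
import numpy as np

# Minimal sketch (not the CCL framework itself): fuse a low-level visual feature
# vector with a bag-of-keywords vector derived from collateral text, then rank
# images by cosine similarity to a query. Feature values and keywords are
# hypothetical placeholders.

vocab = ["tiger", "grass", "sky", "water"]

def keyword_vector(labels):
    return np.array([1.0 if w in labels else 0.0 for w in vocab])

def descriptor(visual, labels, w_visual=0.5):
    v = visual / (np.linalg.norm(visual) + 1e-9)
    t = keyword_vector(labels)
    t = t / (np.linalg.norm(t) + 1e-9)
    return np.concatenate([w_visual * v, (1 - w_visual) * t])

images = {
    "img1": descriptor(np.array([0.8, 0.1, 0.3]), {"tiger", "grass"}),
    "img2": descriptor(np.array([0.2, 0.9, 0.4]), {"sky", "water"}),
}
query = descriptor(np.array([0.7, 0.2, 0.3]), {"tiger"})

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

ranking = sorted(images, key=lambda k: cosine(query, images[k]), reverse=True)
print("retrieval order:", ranking)
```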
Abstract:
An idealized equilibrium model for the undisturbed partly cloudy boundary layer (BL) is used as a framework to explore the coupling of the energy, water, and carbon cycles over land in midlatitudes and to show the sensitivity to the clear-sky shortwave flux, the midtropospheric temperature, moisture, CO2, and subsidence. The changes in the surface fluxes, the BL equilibrium, and cloud cover are shown for a warmer, doubled CO2 climate. Reduced stomatal conductance in a simple vegetation model amplifies the background 2 K ocean temperature rise to an (unrealistically large) 6 K increase in near-surface temperature over land, with a corresponding drop of near-surface relative humidity of about 19%, and a rise of cloud base of about 70 hPa. Cloud changes depend strongly on changes in mean subsidence, but evaporative fraction (EF) decreases. EF is almost uniquely related to mixed layer (ML) depth, independent of background forcing climate. This suggests that it might be possible to infer EF for heterogeneous landscapes from ML depth. The asymmetry of increased evaporation over the oceans and reduced transpiration over land increases in a warmer doubled CO2 climate.
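As a pointer to the quantities involved, the sketch below computes evaporative fraction from a toy surface energy balance for a few Bowen ratios. The net radiation and Bowen-ratio values are hypothetical, and the paper's model diagnoses EF from the coupled BL equilibrium rather than prescribing it in this way.

```python
# Minimal sketch (not the paper's equilibrium model): evaporative fraction from
# a toy surface energy balance. Net radiation and Bowen ratio values are
# hypothetical; the paper relates EF to mixed-layer depth rather than
# prescribing it.

def evaporative_fraction(net_radiation, ground_flux, bowen_ratio):
    """EF = LE / (H + LE), with H + LE = Rn - G and H = bowen_ratio * LE."""
    available = net_radiation - ground_flux          # W m^-2
    latent = available / (1.0 + bowen_ratio)         # LE
    sensible = available - latent                    # H
    return latent / (sensible + latent), sensible, latent

for bowen in (0.3, 0.7, 1.5):                        # moist to dry surfaces
    ef, h, le = evaporative_fraction(500.0, 50.0, bowen)
    print(f"Bowen ratio {bowen:.1f}: H = {h:.0f} W/m2, LE = {le:.0f} W/m2, EF = {ef:.2f}")
```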
Abstract:
We present a kinetic double layer model coupling aerosol surface and bulk chemistry (K2-SUB) based on the PRA framework of gas-particle interactions (Pöschl-Rudich-Ammann, 2007). K2-SUB is applied to a popular model system of atmospheric heterogeneous chemistry: the interaction of ozone with oleic acid. We show that our modelling approach allows de-convoluting surface and bulk processes, which has been a controversial topic and remains an important challenge for the understanding and description of atmospheric aerosol transformation. In particular, we demonstrate how a detailed treatment of adsorption and reaction at the surface can be coupled to a description of bulk reaction and transport that is consistent with traditional resistor model formulations. From literature data we have derived a consistent set of kinetic parameters that characterise mass transport and chemical reaction of ozone at the surface and in the bulk of oleic acid droplets. Due to the wide range of rate coefficients reported from different experimental studies, the exact proportions between surface and bulk reaction rates remain uncertain. Nevertheless, the model results suggest an important role of chemical reaction in the bulk and an approximate upper limit of ~10^-11 cm^2 s^-1 for the surface reaction rate coefficient. Sensitivity studies show that the surface accommodation coefficient of the gas-phase reactant has a strong non-linear influence on both surface and bulk chemical reactions. We suggest that K2-SUB may be used to design, interpret and analyse future experiments for better discrimination between surface and bulk processes in the oleic acid-ozone system as well as in other heterogeneous reaction systems of atmospheric relevance.
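The following is a drastically simplified two-compartment sketch of the surface/bulk coupling idea, not the actual K2-SUB equations: ozone adsorbs to the surface, reacts there with oleic acid, exchanges with a well-mixed bulk and reacts there too. All rate constants are dimensionless placeholders rather than the literature-derived parameters discussed in the abstract.

```python
# Minimal two-compartment sketch in the spirit of a coupled surface/bulk
# treatment (not the actual K2-SUB equations). Everything is in normalized
# (dimensionless) units and all rate constants are illustrative placeholders.

dt, t_end = 1e-3, 50.0
k_ads, k_des = 5.0, 1.0     # adsorption from gas / desorption back to gas
k_sb, k_bs = 2.0, 0.5       # surface -> bulk and bulk -> surface exchange
k_surf, k_bulk = 1.0, 0.2   # surface and bulk second-order reaction constants

o3_gas = 1.0                # gas-phase ozone next to the surface (held constant)
o3_s, o3_b = 0.0, 0.0       # ozone on the surface and dissolved in the bulk
ole_s, ole_b = 1.0, 1.0     # oleic acid on the surface and in the bulk

t = 0.0
while t < t_end:
    r_s = k_surf * o3_s * ole_s               # surface reaction rate
    r_b = k_bulk * o3_b * ole_b               # bulk reaction rate
    d_o3_s = k_ads * o3_gas - k_des * o3_s - k_sb * o3_s + k_bs * o3_b - r_s
    d_o3_b = k_sb * o3_s - k_bs * o3_b - r_b
    o3_s += dt * d_o3_s
    o3_b += dt * d_o3_b
    ole_s -= dt * r_s
    ole_b -= dt * r_b
    t += dt

print(f"remaining oleic acid: surface {ole_s:.3f}, bulk {ole_b:.3f}")
```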
Abstract:
We present a novel kinetic multi-layer model that explicitly resolves mass transport and chemical reaction at the surface and in the bulk of aerosol particles (KM-SUB). The model is based on the PRA framework of gas-particle interactions (Pöschl-Rudich-Ammann, 2007), and it includes reversible adsorption, surface reactions and surface-bulk exchange as well as bulk diffusion and reaction. Unlike earlier models, KM-SUB does not require simplifying assumptions about steady-state conditions and radial mixing. The temporal evolution and concentration profiles of volatile and non-volatile species at the gas-particle interface and in the particle bulk can be modeled along with surface concentrations and gas uptake coefficients. In this study we explore and exemplify the effects of bulk diffusion on the rate of reactive gas uptake for a simple reference system, the ozonolysis of oleic acid particles, in comparison to experimental data and earlier model studies. We demonstrate how KM-SUB can be used to interpret and analyze experimental data from laboratory studies, and how the results can be extrapolated to atmospheric conditions. In particular, we show how interfacial and bulk transport, i.e., surface accommodation, bulk accommodation and bulk diffusion, influence the kinetics of the chemical reaction. Sensitivity studies suggest that in fine air particulate matter oleic acid and compounds with similar reactivity against ozone (carbon-carbon double bonds) can reach chemical lifetimes of many hours only if they are embedded in a (semi-)solid matrix with very low diffusion coefficients (< 10^-10 cm^2 s^-1). Depending on the complexity of the investigated system, unlimited numbers of volatile and non-volatile species and chemical reactions can be flexibly added and treated with KM-SUB. We propose and intend to pursue the application of KM-SUB as a basis for the development of a detailed master mechanism of aerosol chemistry as well as for the derivation of simplified but realistic parameterizations for large-scale atmospheric and climate models.
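To illustrate what "explicitly resolving the bulk" means in practice, the sketch below discretises a particle into layers and integrates diffusion plus a second-order reaction through them. It is a toy planar, dimensionless analogue of the multi-layer approach, with all parameter values chosen purely for illustration; it is not the KM-SUB model itself.

```python
import numpy as np

# Minimal sketch of a multi-layer (shell) treatment of bulk diffusion and
# reaction, drastically simplified (planar layers, dimensionless units, fixed
# ozone concentration at the outermost layer). All parameters are illustrative.

n_layers, dt, t_end = 20, 1e-3, 20.0
D = 0.05                  # dimensionless diffusion coefficient (layer^2 per time)
k = 1.0                   # second-order reaction rate constant
o3 = np.zeros(n_layers)   # ozone concentration per layer
ole = np.ones(n_layers)   # oleic acid concentration per layer
o3_surface = 1.0          # ozone supplied at the particle surface (layer 0)

t = 0.0
while t < t_end:
    o3[0] = o3_surface                       # boundary condition at the surface
    # diffusion between neighbouring layers (explicit finite differences)
    flux = D * np.diff(o3)                   # flux[i] from layer i+1 into layer i
    d_o3 = np.zeros(n_layers)
    d_o3[:-1] += flux
    d_o3[1:] -= flux
    # reaction consumes ozone and oleic acid in every layer
    r = k * o3 * ole
    o3 += dt * (d_o3 - r)
    ole += dt * (-r)
    t += dt

print("unreacted oleic acid fraction per layer (surface -> core):")
print(np.round(ole, 3))
```

With the low diffusion coefficient chosen here, only the outermost layers are depleted over the run, which is the qualitative behaviour the abstract describes for (semi-)solid matrices.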
Abstract:
As the building industry proceeds in the direction of low impact buildings, research attention is being drawn towards the reduction of carbon dioxide emissions and waste. From design and construction to operation and demolition, various building materials are used throughout the whole building lifecycle, involving significant energy consumption and waste generation. Building Information Modelling (BIM) is emerging as a tool that can support holistic design decision-making for reducing embodied carbon and waste production in the building lifecycle. This study aims to establish a framework for assessing embodied carbon and waste underpinned by BIM technology. On the basis of a review of current research, the framework is designed to include functional modules for embodied carbon computation: a module for waste estimation, a knowledge base of construction and demolition methods, a repository of building component information, and an inventory of construction materials' energy and carbon. Through both static 3D model visualisation and dynamic modelling supported by the framework, embodied energy (carbon), waste and associated costs can be analysed within the boundaries of cradle-to-gate, construction, operation, and demolition. The proposed holistic modelling framework makes it possible to analyse embodied carbon and waste from different building lifecycle perspectives, including associated costs. It brings together existing segmented embodied carbon and waste estimation into a unified model, so that interactions between various parameters through the different building lifecycle phases can be better understood. It can thus improve design decision support for optimal low impact building development. The framework is expected to be further developed and its applicability tested on industrial projects in the near future.
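A minimal sketch of the kind of computation such a framework would automate, combining component quantities taken off a building model with a material carbon inventory and assumed waste rates, is given below. The components, factors and rates are hypothetical placeholders, not data from the study.

```python
# Minimal sketch (not the proposed BIM framework): embodied carbon and waste
# estimated from component quantities extracted from a building model, a
# material inventory of embodied carbon factors, and assumed waste rates.
# All quantities and factors are hypothetical placeholders.

components = [                      # as might be taken off a BIM model
    {"name": "ground slab", "material": "concrete", "quantity_kg": 120_000},
    {"name": "frame", "material": "steel", "quantity_kg": 18_000},
    {"name": "partitions", "material": "plasterboard", "quantity_kg": 6_000},
]
carbon_factors = {"concrete": 0.15, "steel": 1.46, "plasterboard": 0.39}  # kgCO2e/kg
waste_rates = {"concrete": 0.04, "steel": 0.01, "plasterboard": 0.22}     # fraction wasted

total_carbon = total_waste = 0.0
for c in components:
    material = c["material"]
    ordered = c["quantity_kg"] * (1 + waste_rates[material])   # include over-ordering
    carbon = ordered * carbon_factors[material]
    waste = c["quantity_kg"] * waste_rates[material]
    total_carbon += carbon
    total_waste += waste
    print(f"{c['name']:<12} embodied carbon {carbon/1000:8.1f} tCO2e, waste {waste:8.0f} kg")

print(f"cradle-to-gate total: {total_carbon/1000:.1f} tCO2e, {total_waste:.0f} kg waste")
```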
Abstract:
There is growing concern about reducing greenhouse gas emissions all over the world. In the recent Post Copenhagen Report on Climate Change, the U.K. set emission reduction targets of 34% by 2020 and 80% by 2050, relative to 1990 levels. In practice, tools such as Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) have been introduced to the construction industry in order to achieve this. However, there is a clear disconnection between costs and environmental impacts over the life cycle of a built asset when using these two tools. In addition, changes in Information and Communication Technologies (ICTs) have changed the way information is represented: information is fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), yet little consideration has been given to incorporating LCC and LCA and maximising their use within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool to support sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. Finally, an application framework is proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
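To make the cost/carbon trade-off concrete, the sketch below compares two hypothetical design options on discounted life-cycle cost and lifetime carbon. All costs, energy figures, discount rate and emission factors are invented for the example and are not taken from the paper.

```python
# Minimal sketch (not the proposed tool): compare two design options on
# discounted life-cycle cost and lifetime carbon so the cost/carbon trade-off
# is explicit. All costs, energy figures and factors are hypothetical.

def life_cycle_cost(capital, annual_cost, years=60, discount_rate=0.035):
    npv = capital
    for year in range(1, years + 1):
        npv += annual_cost / (1 + discount_rate) ** year
    return npv

def life_cycle_carbon(embodied_tco2e, annual_energy_kwh, years=60, grid_factor=0.2):
    # grid_factor in kgCO2e per kWh; result in tonnes CO2e
    return embodied_tco2e + annual_energy_kwh * grid_factor * years / 1000.0

options = {
    "baseline":   {"capital": 2_000_000, "annual_cost": 45_000, "embodied": 900,  "energy": 250_000},
    "low-impact": {"capital": 2_300_000, "annual_cost": 28_000, "embodied": 1050, "energy": 140_000},
}
for name, o in options.items():
    lcc = life_cycle_cost(o["capital"], o["annual_cost"])
    lca = life_cycle_carbon(o["embodied"], o["energy"])
    print(f"{name:<10} LCC = £{lcc:,.0f}  lifetime carbon = {lca:,.0f} tCO2e")
```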
Abstract:
A simple and coherent framework for partitioning uncertainty in multi-model climate ensembles is presented. The analysis of variance (ANOVA) is used to decompose a measure of total variation additively into scenario uncertainty, model uncertainty and internal variability. This approach requires fewer assumptions than existing methods and can be easily used to quantify uncertainty related to model-scenario interaction - the contribution to model uncertainty arising from the variation across scenarios of model deviations from the ensemble mean. Uncertainty in global mean surface air temperature is quantified as a function of lead time for a subset of the Coupled Model Intercomparison Project phase 3 ensemble and results largely agree with those published by other authors: scenario uncertainty dominates beyond 2050 and internal variability remains approximately constant over the 21st century. Both elements of model uncertainty, due to scenario-independent and scenario-dependent deviations from the ensemble mean, are found to increase with time. Estimates of model deviations that arise as by-products of the framework reveal significant differences between models that could lead to a deeper understanding of the sources of uncertainty in multi-model ensembles. For example, three models are shown to have diverging patterns over the 21st century, while another model exhibits an unusually large variation among its scenario-dependent deviations.
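The sketch below shows an ANOVA-style partition of a synthetic scenario-by-model array into scenario, model and interaction components (internal variability is omitted for brevity). It illustrates the decomposition only; the array is synthetic and does not reproduce the paper's CMIP3 analysis.

```python
import numpy as np

# Minimal sketch of an ANOVA-style partition of ensemble spread into scenario,
# model and scenario-model interaction terms. The projection array is synthetic;
# it stands in for, e.g., global mean temperature anomalies at one lead time.

rng = np.random.default_rng(1)
n_scenarios, n_models = 3, 5
scenario_effect = np.array([1.0, 2.0, 3.0])[:, None]          # mean warming per scenario
model_effect = rng.normal(0.0, 0.4, size=(1, n_models))       # model deviations
interaction = rng.normal(0.0, 0.2, size=(n_scenarios, n_models))
y = scenario_effect + model_effect + interaction               # (scenario, model) array

grand_mean = y.mean()
scenario_means = y.mean(axis=1, keepdims=True)
model_means = y.mean(axis=0, keepdims=True)

ss_scenario = n_models * ((scenario_means - grand_mean) ** 2).sum()
ss_model = n_scenarios * ((model_means - grand_mean) ** 2).sum()
ss_interaction = ((y - scenario_means - model_means + grand_mean) ** 2).sum()
ss_total = ((y - grand_mean) ** 2).sum()

for name, ss in [("scenario", ss_scenario), ("model", ss_model),
                 ("interaction", ss_interaction)]:
    print(f"{name:<12} fraction of total variation: {ss / ss_total:.2f}")
```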
Abstract:
We propose a new modelling framework suitable for the description of atmospheric convective systems as a collection of distinct plumes. The literature contains many examples of models for collections of plumes in which strong simplifying assumptions are made, with a diagnostic dependence of convection on the large-scale environment and the limit of many plumes often imposed from the outset. Some recent studies have sought to remove one or the other of those assumptions. The proposed framework removes both, and is explicitly time-dependent and stochastic in its basic character. The statistical dynamics of the plume collection are defined through simple probabilistic rules applied at the level of individual plumes, and van Kampen's system size expansion is then used to construct the macroscopic limit of the microscopic model. Through suitable choices of the microscopic rules, the model is shown to encompass previous studies in the appropriate limits, and to allow their natural extensions beyond those limits.
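As a toy analogue of such microscopic rules, the sketch below simulates a birth-death process for the number of active plumes with Gillespie's algorithm and compares the late-time average with the deterministic fixed point of the corresponding macroscopic equation. The rates are illustrative placeholders, not the paper's rules.

```python
import numpy as np

# Minimal sketch (not the paper's framework): a birth-death process for the
# number of active plumes, simulated with Gillespie's algorithm and compared
# with its macroscopic (large system size) fixed point. Rates are illustrative.

rng = np.random.default_rng(2)
birth_rate = 2.0        # new plumes triggered per unit time
death_rate = 0.1        # decay rate per plume
n, t, t_end = 0, 0.0, 200.0
history = []

while t < t_end:
    total_rate = birth_rate + death_rate * n
    t += rng.exponential(1.0 / total_rate)        # time to next event
    if rng.random() < birth_rate / total_rate:
        n += 1                                    # a plume is initiated
    else:
        n -= 1                                    # a plume decays
    history.append(n)

macroscopic_mean = birth_rate / death_rate        # fixed point of dN/dt = birth - death*N
print(f"simulated late-time plume number ~ {np.mean(history[len(history)//2:]):.1f}")
print(f"macroscopic (deterministic) limit: {macroscopic_mean:.1f}")
```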
Abstract:
We investigate a simplified form of variational data assimilation in a fully nonlinear framework with the aim of extracting dynamical development information from a sequence of observations over time. Information on the vertical wind profile, w(z), and profiles of temperature, T(z, t), and total water content, qt(z, t), as functions of height, z, and time, t, is converted to brightness temperatures at a single horizontal location by defining a two-dimensional (vertical and time) variational assimilation testbed. The profiles of T and qt are updated using a vertical advection scheme. A basic cloud scheme is used to obtain the fractional cloud amount and, when combined with the temperature field, this information is converted into a brightness temperature, using a simple radiative transfer scheme. It is shown that our model exhibits realistic behaviour with regard to the prediction of cloud, but the effects of nonlinearity become non-negligible in the variational data assimilation algorithm. A careful analysis of the application of the data assimilation scheme to this nonlinear problem is presented, the salient difficulties are highlighted, and suggestions for further developments are discussed.
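A minimal sketch of a variational step of this general kind, with a toy two-component model and a nonlinear observation operator standing in for the advection and brightness-temperature schemes, is given below. The dynamics matrix, observation operator and error variances are hypothetical placeholders, not the paper's testbed.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch of a variational assimilation step (not the paper's testbed):
# estimate an initial state from a background value and a sequence of nonlinear
# observations of the evolved state. The linear "model" and quadratic
# observation operator are hypothetical stand-ins.

truth0 = np.array([1.0, 0.5])                      # true initial state (unknown to the scheme)
background = np.array([0.8, 0.7])                  # prior estimate
B_inv = np.diag([1.0 / 0.1, 1.0 / 0.1])            # inverse background error covariance
R_inv = 1.0 / 0.01                                 # inverse observation error variance

def model(x, steps):
    # toy dynamics: slow linear relaxation between the two components
    M = np.array([[0.95, 0.05], [0.05, 0.95]])
    for _ in range(steps):
        x = M @ x
    return x

def observe(x):
    return x[0] ** 2 + x[1]                        # nonlinear observation operator

obs_times = [1, 3, 5]
rng = np.random.default_rng(3)
observations = [observe(model(truth0, k)) + rng.normal(0, 0.01) for k in obs_times]

def cost(x):
    # background term plus misfit to each observation along the trajectory
    jb = 0.5 * (x - background) @ B_inv @ (x - background)
    jo = sum(0.5 * R_inv * (observe(model(x, k)) - y) ** 2
             for k, y in zip(obs_times, observations))
    return jb + jo

analysis = minimize(cost, background, method="Nelder-Mead").x
print("background:", background, "analysis:", np.round(analysis, 3), "truth:", truth0)
```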