93 results for Systems Modelling
Abstract:
Mycoplasma gallisepticum (MG) is a bacterium that causes respiratory disease in chickens, leading to reduced egg production. A dynamic simulation model was developed that can be used to assess the costs and benefits of control using antimicrobials or vaccination in caged or free-range systems. The intended users are veterinarians and egg producers. A user interface is provided for input of flock-specific parameters. The economic consequence of an MG outbreak is expressed as a reduction in expected egg output. The model predicts that either vaccination or antimicrobial treatment can approximately halve potential losses from MG in some circumstances. Sensitivity analysis is used to test assumptions about the infection rate and the timing of an outbreak. Feedback from veterinarians points to the value of the model as a discussion tool with producers.
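A minimal sketch of the kind of calculation such a model supports is shown below, assuming a simple within-flock SIR spread and a fixed laying-rate reduction for infected hens; the function name and all parameter values (beta, egg_drop, recovery) are illustrative assumptions, not the published model's.

```python
# A minimal sketch, not the published model: discrete-day SIR spread within a
# flock, with losses counted as the laying deficit of currently infected hens.
def egg_loss(beta, flock=10_000, days=300, egg_drop=0.15, recovery=1 / 40):
    s, i, r = flock - 1.0, 1.0, 0.0
    lost = 0.0
    for _ in range(days):
        new_inf = beta * s * i / flock     # new infections this day
        recovered = recovery * i
        s -= new_inf
        i += new_inf - recovered
        r += recovered
        lost += egg_drop * i               # assumed ~15% fewer eggs per infected hen-day
    return lost

baseline = egg_loss(beta=0.25)             # untreated outbreak (illustrative rate)
treated = egg_loss(beta=0.125)             # intervention assumed to halve transmission
print(f"share of baseline loss averted: {1 - treated / baseline:.0%}")
```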
Abstract:
The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models ‘prosumer’ agents (i.e., producers and/or consumers of energy) and ‘aggregator’ agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile. Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
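To make the demand-flattening result concrete, here is a minimal sketch of the idea, not the CASCADE Framework itself: hypothetical households shift a flexible share of load toward hours where the aggregator's signal shows headroom below the mean, and the signal is aggregate demand rather than a wholesale price. The profiles, the 30% flexible share and the signal rule are all invented for illustration.

```python
# A minimal sketch (not CASCADE) of aggregator-mediated demand flattening.
import random

HOURS = 24

class Household:
    def __init__(self):
        # Invented evening-peaked base profile (kW), not real demand data.
        self.base = [0.3 + (0.7 if 17 <= h <= 21 else 0.0) + random.uniform(0, 0.2)
                     for h in range(HOURS)]
        self.flexible = 0.3  # assumed share of demand that smart devices can shift

    def respond(self, signal):
        # Move the flexible share into hours with headroom below the mean signal.
        target = sum(signal) / HOURS
        headroom = [max(0.0, target - s) for s in signal]
        total = sum(headroom) or 1.0
        shiftable = sum(b * self.flexible for b in self.base)
        return [b * (1 - self.flexible) + shiftable * headroom[h] / total
                for h, b in enumerate(self.base)]

def aggregate(profiles):
    return [sum(p[h] for p in profiles) for h in range(HOURS)]

random.seed(1)
homes = [Household() for _ in range(100)]
demand = aggregate([hh.base for hh in homes])
for _ in range(3):  # a few aggregator signal/response rounds
    demand = aggregate([hh.respond(demand) for hh in homes])

peak_to_mean = lambda d: max(d) / (sum(d) / HOURS)
print(peak_to_mean(aggregate([hh.base for hh in homes])), peak_to_mean(demand))
```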
Abstract:
Many communication signal processing applications involve modelling and inverting complex-valued (CV) Hammerstein systems. We develop a new CV B-spline neural network approach for efficient identification of the CV Hammerstein system and effective inversion of the estimated CV Hammerstein model. Specifically, the CV nonlinear static function in the Hammerstein system is represented using the tensor product of two univariate B-spline neural networks. An efficient alternating least squares estimation method is adopted for identifying the CV linear dynamic model’s coefficients and the CV B-spline neural network’s weights. This yields closed-form solutions for both the linear dynamic model’s coefficients and the B-spline neural network’s weights, and the estimation process is guaranteed to converge very fast to a unique minimum solution. Furthermore, an accurate inversion of the CV Hammerstein system can readily be obtained using the estimated model. In particular, the inversion of the CV nonlinear static function in the Hammerstein system can be calculated effectively using a Gauss-Newton algorithm, which naturally incorporates the efficient De Boor algorithm with both the B-spline curve and first-order derivative recursions. The effectiveness of our approach is demonstrated by applying it to the equalisation of Hammerstein channels.
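As an illustration of the B-spline machinery this approach relies on, the sketch below implements the standard Cox-de Boor recursion for evaluating a univariate B-spline basis (real-valued here for simplicity; the knot vector and degree are illustrative). It is not the authors' code, and the complex-valued tensor-product network, alternating least squares and Gauss-Newton steps are omitted.

```python
# A minimal sketch of the Cox-de Boor recursion underlying De Boor evaluation.
import numpy as np

def bspline_basis(x, knots, degree):
    """Evaluate all B-spline basis functions of the given degree at x."""
    n = len(knots) - degree - 1
    # Degree-0 basis: indicator functions on the knot spans.
    B = np.array([knots[i] <= x < knots[i + 1] for i in range(len(knots) - 1)],
                 dtype=float)
    # Raise the degree one step at a time via the Cox-de Boor recursion.
    for d in range(1, degree + 1):
        Bn = np.zeros(len(knots) - d - 1)
        for i in range(len(Bn)):
            left = right = 0.0
            if knots[i + d] > knots[i]:
                left = (x - knots[i]) / (knots[i + d] - knots[i]) * B[i]
            if knots[i + d + 1] > knots[i + 1]:
                right = ((knots[i + d + 1] - x)
                         / (knots[i + d + 1] - knots[i + 1]) * B[i + 1])
            Bn[i] = left + right
        B = Bn
    return B[:n]

knots = np.array([0, 0, 0, 0, 0.5, 1, 1, 1, 1])  # clamped cubic knot vector
print(bspline_basis(0.3, knots, degree=3))       # basis values sum to 1
```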
Abstract:
Activating transcription factor 3 (Atf3) is rapidly and transiently upregulated in numerous systems, and is associated with various disease states. Atf3 is required for negative feedback regulation of other genes, but is itself subject to negative feedback regulation, possibly by autorepression. In cardiomyocytes, Atf3 and Egr1 mRNAs are upregulated via ERK1/2 signalling and Atf3 suppresses Egr1 expression. We previously developed a mathematical model for the Atf3-Egr1 system. Here, we adjusted and extended the model to explore mechanisms of Atf3 feedback regulation. Introducing an autorepressive loop for Atf3 tuned down its expression but abolished the inhibition of Egr1, demonstrating that negative feedback regulation of Atf3 by Atf3 itself is implausible in this context. Experimentally, signals downstream from ERK1/2 suppress Atf3 expression. Mathematical modelling indicated that this cannot occur by phosphorylation of pre-existing inhibitory transcriptional regulators because the time delay is too short. De novo synthesis of an inhibitory transcription factor (ITF) with a high affinity for the Atf3 promoter could suppress Atf3 expression, but (as with the Atf3 autorepression loop) inhibition of Egr1 was lost. Extending the model to include newly-synthesised miRNAs terminated Atf3 protein expression very efficiently and, with a 4-fold increase in the rate of degradation of mRNA from the mRNA/miRNA complex, the profiles for Atf3 mRNA, Atf3 protein and Egr1 mRNA approximated the experimental data. Combining the ITF model with that of the miRNA did not improve the profiles, suggesting that miRNAs are likely to play a dominant role in switching off Atf3 expression post-induction.
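A minimal sketch of the miRNA variant described above, with invented rate constants rather than the paper's fitted values: a decaying stimulus drives Atf3 transcription and miRNA synthesis, the miRNA sequesters Atf3 mRNA, and the mRNA/miRNA complex is degraded 4-fold faster than free mRNA.

```python
# A minimal sketch (hypothetical rates) of miRNA-mediated shut-off of Atf3.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    mrna, protein, mirna, complex_ = y
    stim = np.exp(-t / 30.0)                          # decaying ERK1/2-driven stimulus
    k_tx, k_tl, k_bind = 1.0, 0.5, 0.05               # invented rate constants
    d_m, d_p, d_mir, d_cx = 0.02, 0.05, 0.02, 0.08    # d_cx = 4 x d_m
    bind = k_bind * mrna * mirna
    return [k_tx * stim - d_m * mrna - bind,          # Atf3 mRNA
            k_tl * mrna - d_p * protein,              # Atf3 protein
            k_tx * stim - d_mir * mirna - bind,       # newly synthesised miRNA
            bind - d_cx * complex_]                   # mRNA/miRNA complex

sol = solve_ivp(rhs, (0.0, 480.0), [0.0, 0.0, 0.0, 0.0], max_step=1.0)
print(f"Atf3 protein peak {sol.y[1].max():.2f}, final {sol.y[1][-1]:.2f}")
```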
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success was achieved in relating source code to achieved performance for the K10 series of Opterons, but the method was found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
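In the spirit of the traditional techniques discussed, a minimal analytic sketch is given below: per-timestep cost modelled as a compute term (operations over an achievable rate) plus a halo-exchange communication term (latency plus volume over bandwidth). All numbers are illustrative placeholders, not measurements from HECToR.

```python
# A minimal sketch of an analytic per-timestep cost model for a 2-D grid kernel.
def predict_step_time(nx, ny, flops_per_point=65, rate_gflops=1.2,
                      latency_s=2e-6, bandwidth_gbs=1.5, halo_words=4):
    compute = nx * ny * flops_per_point / (rate_gflops * 1e9)
    halo_bytes = 8 * halo_words * 2 * (nx + ny)   # doubles exchanged on 4 edges
    comms = 4 * latency_s + halo_bytes / (bandwidth_gbs * 1e9)
    return compute + comms

print(f"{predict_step_time(1024, 1024) * 1e3:.2f} ms per step")
```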
Abstract:
There are three key components for developing a metadata system: a container structure laying out the key semantic issues of interest and their relationships; an extensible controlled vocabulary providing possible content; and tools to create and manipulate that content. While metadata systems must allow users to enter their own information, the use of a controlled vocabulary both imposes consistency of definition and ensures comparability of the objects described. Here we describe the controlled vocabulary (CV) and metadata creation tool built by the METAFOR project for use in describing the climate models, simulations and experiments of the fifth Coupled Model Intercomparison Project (CMIP5). The CV and resulting tool chain introduced here are designed for extensibility and reuse and should find applicability in many more projects.
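A minimal sketch of how a controlled vocabulary enforces consistency when content is created; the fields and terms below are invented for illustration and are not the actual METAFOR CV.

```python
# A minimal sketch: validating user-entered metadata against a controlled vocabulary.
CV = {
    "model_component": {"atmosphere", "ocean", "sea_ice", "land_surface"},
    "time_step_units": {"seconds", "minutes", "hours"},
}

def validate(record):
    errors = []
    for field, value in record.items():
        if field in CV and value not in CV[field]:
            errors.append(f"{field}={value!r} is not in the controlled vocabulary")
    return errors

simulation = {"model_component": "atmosphere", "time_step_units": "fortnights"}
print(validate(simulation))  # flags the non-CV unit, enforcing comparability
```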
Abstract:
The paper discusses how variations in the pattern of convective plasma flows should be included in self-consistent time-dependent models of the coupled ionosphere-thermosphere system. The author shows how these variations depend upon the mechanism by which the solar wind flow excites the convection. The modelling of these effects is not just of relevance to the polar ionosphere. This is because the influence of convection is not confined to high latitudes: the resultant heating and composition changes in the thermosphere are communicated to lower latitudes by the winds, which are also greatly modified by the plasma convection. These thermospheric changes alter the global distribution of plasma by modulating the rates of the chemical reactions which are responsible for the loss of plasma. Hence the modelling of these high-latitude processes is of relevance to the design and operation of HF communication, radar and navigation systems worldwide.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and in some cases floating point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work, loop-based array updates and nearest-neighbour halo exchanges, and separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
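A minimal sketch of the final step described above, assuming per-size benchmark timings have already been gathered for one deployment scenario; the timings are invented placeholders, and interpolation is done in log-log space, where grid-based kernels scale smoothly.

```python
# A minimal sketch: predict unmeasured problem sizes from benchmark timings.
import numpy as np

bench_sizes = np.array([128, 256, 512, 1024, 2048])    # grid edge length
bench_times = np.array([0.9, 3.6, 14.8, 61.0, 250.0])  # ms per step (placeholders)

def predict(n):
    # Interpolate in log-log space, where the cost curve is near-linear.
    return float(np.exp(np.interp(np.log(n), np.log(bench_sizes),
                                  np.log(bench_times))))

print(f"predicted 768^2 step time: {predict(768):.1f} ms")
```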
Abstract:
This paper presents the mathematical development of a body-centric nonlinear dynamic model of a quadrotor UAV suitable for developing biologically inspired navigation strategies. Analytical approximations are used to find an initial guess of the parameters of the nonlinear model, then parameter estimation methods are used to refine the model parameters using data obtained from onboard sensors during flight. Due to the unstable nature of the quadrotor model, the identification process is performed with the system in closed-loop control of the attitude angles. The obtained model parameters are validated using real, unseen experimental data. Based on the identified model, a Linear-Quadratic (LQ) optimal tracker is designed to stabilize the quadrotor and facilitate its translational control by tracking body accelerations. The LQ tracker is tested on an experimental quadrotor UAV, and the results obtained are a further means of validating the quality of the estimated model. The unique formulation of the control problem in the body frame makes the controller better suited to bio-inspired navigation and guidance strategies than the conventional attitude- or position-based control systems found in the existing literature.
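A minimal sketch of designing an LQ state-feedback gain of the kind used by such a tracker, using an invented linearised model rather than the identified quadrotor dynamics; the A, B, Q and R matrices below are illustrative assumptions.

```python
# A minimal sketch: LQ gain via the continuous algebraic Riccati equation.
import numpy as np
from scipy.linalg import solve_continuous_are

# Assumed linearised body-frame dynamics x' = Ax + Bu
# (states: body velocity and acceleration; input: thrust command).
A = np.array([[0.0, 1.0],
              [0.0, -0.5]])       # assumed damping on acceleration
B = np.array([[0.0],
              [10.0]])            # assumed input effectiveness
Q = np.diag([1.0, 10.0])          # weight acceleration tracking heavily
R = np.array([[0.1]])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal state-feedback gain, u = -K x
print(K)
```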
Abstract:
The notion that learning is enhanced when a teaching approach matches a learner’s learning style has been widely accepted in classroom settings, since learning style is a predictor of a student’s attitudes and preferences. As such, the traditional ‘one-size-fits-all’ approach to teaching delivery in Educational Hypermedia Systems (EHSs) has to be replaced with an approach that responds to users’ needs by exploiting their individual differences. However, establishing and implementing reliable approaches for matching teaching delivery and modalities to learning styles still represents an innovation challenge to be tackled. In this paper, seventy-six studies are objectively analysed with several goals: to reveal the value of integrating learning styles in EHSs by discussing different perspectives in this context; to identify the most effective learning style models as incorporated within AEHSs; and to investigate the effectiveness of different approaches for modelling students’ individual learning traits. The paper also highlights a number of theoretical and technical issues of LS-BAEHSs, to serve as comprehensive guidance for researchers who are interested in this area.
Abstract:
During the last few years Enterprise Architecture (EA) has received increasing attention among industry and academia. By adopting EA, organisations may gain a number of benefits such as better decision making, increased revenues, cost reduction, and alignment of business and IT. However, EA adoption has been found to be difficult. In this paper a model to explain resistance during the EA adoption process (REAP) is introduced and validated. The model reveals relationships between the strategic level of EA, the resulting organisational changes, and sources of resistance. By utilising the REAP model, organisations may anticipate and prepare for organisational change resistance during EA adoption.
Abstract:
There is little consensus on how agriculture will meet future food demands sustainably. Soils and their biota play a crucial role by mediating ecosystem services that support agricultural productivity. However, a multitude of site-specific environmental factors and management practices interact to affect the ability of soil biota to perform vital functions, confounding the interpretation of results from experimental approaches. Insights can be gained through models, which integrate the physiological, biological and ecological mechanisms underpinning soil functions. We present a powerful modelling approach for predicting how agricultural management practices (pesticide applications and tillage) affect soil functioning through earthworm populations. By combining energy budgets and individual-based simulation models, and integrating key behavioural and ecological drivers, we accurately predict population responses to pesticide applications in different climatic conditions. We use the model to analyse the ecological consequences of different weed management practices. Our results demonstrate that an important link between agricultural management (herbicide applications and zero, reduced and conventional tillage) and earthworms is the maintenance of soil organic matter (SOM). We show how zero and reduced tillage practices can increase crop yields while preserving natural ecosystem functions. This demonstrates that management practices which aim to sustain agricultural productivity should account for their effects on earthworm populations, as earthworm proliferation stimulates agricultural productivity. Synthesis and applications. Our results indicate that conventional tillage practices have longer-term effects on soil biota than pesticide control, provided the pesticide has a short dissipation time. The risk of earthworm populations becoming exposed to toxic pesticides is reduced under dry soil conditions. Similarly, an increase in soil organic matter could increase the recovery rate of earthworm populations. However, effects are not necessarily additive, and the impact of different management practices on earthworms depends on their timing and the prevailing environmental conditions. Our model can be used to determine which combinations of crop management practices and climatic conditions pose the least overall risk to earthworm populations. Linking our model mechanistically to crop yield models would aid the optimisation of crop management systems by exploring the trade-offs between different ecosystem services.
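A minimal sketch of the modelling pattern (an energy budget inside an individual-based loop), with invented parameters rather than the published ones: intake funds maintenance first, any surplus goes to growth and reproduction, and a pesticide pulse with a short dissipation time temporarily suppresses intake.

```python
# A minimal sketch: energy-budget individuals in a population loop with a
# pesticide pulse that dissipates quickly. All parameters are illustrative.
import random

def simulate(days=365, pesticide_day=120, dissipation=0.1):
    worms = [0.2] * 50                         # individual body mass (g)
    pesticide = 0.0
    for day in range(days):
        if day == pesticide_day:
            pesticide = 1.0
        pesticide *= 1 - dissipation           # short dissipation time (~7 d half-life)
        survivors = []
        for mass in worms:
            intake = 0.02 * mass ** 0.75 * (1 - 0.8 * pesticide)  # suppressed intake
            surplus = intake - 0.01 * mass     # maintenance is paid first
            if surplus < 0:
                mass += surplus                # shrink under stress
                if mass <= 0.05:
                    continue                   # starvation death
            else:
                mass += 0.5 * surplus          # growth
                if mass > 0.5 and random.random() < 0.02:
                    survivors.append(0.05)     # cocoon hatches
            survivors.append(mass)
        worms = survivors
    return len(worms), sum(worms)

random.seed(0)
print(simulate())  # (population size, total biomass) after one year
```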
Abstract:
This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. Such a review provides a basis for discussion of the need for information recalled through OLAP systems to maintain the contexts of transactions, consistent with the data captured by the respective OLTP system. The paper observes an industry trend involving the use of OLTP systems to process information into data, which are then stored in OLTP databases without the business rules that were used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture data in particular OLTP systems. The paper argues that differences between the business rules used to interpret these same data sets risk gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends that are contributing to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design has a critical dependency on the capture of facts with associated context; the encoding of facts with contexts into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules into facts with context; and the recall of facts with associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data with business rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity for the implementation and use of multi-purpose databases, and business rules stores, for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
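A minimal sketch of the co-storage idea (not the UBIRQ model itself): a transaction fact is recorded alongside the business rule that contextualised its capture, so that an OLAP-style recall can retrieve both together. The schema and rule below are invented for illustration.

```python
# A minimal sketch: storing facts with the business rules used to capture them.
import json
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE business_rule (id INTEGER PRIMARY KEY, name TEXT, definition TEXT);
CREATE TABLE fact (id INTEGER PRIMARY KEY, payload TEXT,
                   rule_id INTEGER REFERENCES business_rule(id));
""")
# The rule that contextualised the capture is stored beside the fact, not lost.
con.execute("INSERT INTO business_rule VALUES (1, 'net_price', 'gross * (1 - discount)')")
con.execute("INSERT INTO fact VALUES (1, ?, 1)",
            (json.dumps({"gross": 100.0, "discount": 0.2, "net": 80.0}),))

# An OLAP-style recall retrieves the fact together with its capture-time context.
row = con.execute("""
    SELECT f.payload, r.name, r.definition
    FROM fact f JOIN business_rule r ON r.id = f.rule_id
""").fetchone()
print(row)
```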