917 results for Model Driven Architecture (MDA)
Abstract:
A size-structured plant population model is developed to study the evolution of pathogen-induced leaf shedding under various environmental conditions. The evolutionarily stable strategy (ESS) of the leaf shedding rate is determined for two scenarios: (i) a constant leaf shedding strategy and (ii) an infection-load-driven leaf shedding strategy. The model predicts that ESS leaf shedding rates increase with nutrient availability. No effect of plant density on the ESS leaf shedding rate is found, even though disease severity increases with plant density. When auto-infection, that is, increased infection due to spores produced on the plant itself, plays a key role in further disease increase on the plant, shedding leaves removes disease that would otherwise contribute to disease increase on that plant. Consequently, leaf shedding responses to infection may evolve. When external infection, that is, infection due to immigrant spores, is the key determinant, shedding a leaf does not reduce the force of infection on the leaf-shedding plant, and in this case leaf shedding will not evolve. Under low external disease pressure, adopting an infection-driven leaf shedding strategy is more efficient than adopting a constant leaf shedding strategy, since a plant using the infection-driven strategy sheds no leaves in the absence of infection, even when its shedding rate is high, whereas a plant with a constant shedding rate sheds the same amount of leaves regardless of the presence of infection. Based on the results we develop two hypotheses that can be tested if the appropriate plant material is available.
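As a rough illustration of the machinery behind an ESS analysis of this kind, the sketch below locates an evolutionarily stable shedding rate as the resident strategy at which the invasion-fitness gradient of a rare mutant vanishes. The fitness function is a toy placeholder, not the size-structured epidemiological model of the paper; in this toy the ESS rises with nutrient availability, mirroring the qualitative prediction quoted above.

```python
import numpy as np

# Toy invasion-fitness function for a rare mutant with shedding rate s_mut in
# a resident population shedding at rate s_res. The functional form is a
# placeholder chosen only to illustrate the ESS calculation.
def invasion_fitness(s_mut, s_res, nutrient):
    benefit = nutrient * s_mut * (1.0 - s_res)   # infection removed by shedding
    cost = 0.5 * s_mut ** 2                      # photosynthetic area lost
    return benefit - cost

# An ESS is a resident strategy at which the selection gradient
# (d fitness / d s_mut evaluated at s_mut = s_res) vanishes and no nearby
# mutant does better (true here because fitness is concave in s_mut).
def ess_shedding_rate(nutrient, grid=np.linspace(0.0, 1.0, 1001), eps=1e-4):
    for s_res in grid:
        grad = (invasion_fitness(s_res + eps, s_res, nutrient)
                - invasion_fitness(s_res - eps, s_res, nutrient)) / (2.0 * eps)
        if abs(grad) < 1e-3:
            return s_res
    return None

print(ess_shedding_rate(nutrient=0.5), ess_shedding_rate(nutrient=1.0))
```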
Abstract:
We present a kinetic model for transformations between different self-assembled lipid structures. The model shows how data on the rates of phase transitions between mesophases of different geometries can be used to provide information on the mechanisms of the transformations and the transition states involved. This can be used, for example, to gain an insight into intermediate structures in cell membrane fission or fusion. In cases where the monolayer curvature changes on going from the initial to the final mesophase, we consider the phase transition to be driven primarily by the change in the relaxed curvature with pressure or temperature, which alters the relative curvature elastic energies of the two mesophase structures. Using this model, we have analyzed previously published kinetic data on the interconversion of inverse bicontinuous cubic phases in the 1-monoolein/30 wt% water system. The data are for a transition between the QII(G) and QII(D) phases, and our analysis indicates that the transition state more closely resembles the QII(D) than the QII(G) phase. Using estimated values for the monolayer mean curvatures of the QII(G) and QII(D) phases of -0.123 nm⁻¹ and -0.133 nm⁻¹, respectively, gives a monolayer mean curvature for the transition state of between -0.131 nm⁻¹ and -0.132 nm⁻¹. Furthermore, we estimate that several thousand molecules undergo the phase transition cooperatively within one "cooperative unit", equivalent to 1-2 unit cells of QII(G) or 4-10 unit cells of QII(D).
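For readers who want to see where the quoted curvature values place the transition state, the short sketch below expresses it as a fractional position between the two phases; the linear curvature coordinate is an interpretive assumption, not necessarily the authors' own analysis.

```python
# Position of the transition state along a linear curvature coordinate between
# the two mesophases, as a fraction alpha in [0, 1] (0 = QII(G)-like,
# 1 = QII(D)-like). Curvature values are those quoted in the abstract.
H_G = -0.123   # monolayer mean curvature of QII(G), nm^-1
H_D = -0.133   # monolayer mean curvature of QII(D), nm^-1

for H_ts in (-0.131, -0.132):
    alpha = (H_ts - H_G) / (H_D - H_G)
    print(f"H_ts = {H_ts} nm^-1 -> alpha = {alpha:.2f}")
# alpha = 0.80 and 0.90: the transition state lies much closer to QII(D).
```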
Abstract:
Modern buildings are designed to enhance the match between the environment, the spaces and the people carrying out work, so that the well-being and the performance of the occupants are in harmony. Building services are the systems that provide a healthy working environment within which workers' productivity can be optimised. However, the maintenance of these services is fraught with problems and may contribute up to 50% of the total life cycle cost of the building. Maintenance support is one area that is not usually designed into the system, as this is not common practice in the services industry. Other areas of shortfall for future designs are client requirements, commissioning, facilities management data and post-occupancy evaluation feedback, which need to be adequately planned for so that this information is captured and documented for use in future designs. At the University of Reading an integrated approach has been developed to assemble the multitude of aspects inherent in this field. It records required and measured achievements for the benefit of both building owners and practitioners. This integrated approach can be represented in a Through Life Business Model (TLBM) format using the concept of Integrated Logistic Support (ILS). The prototype TLBM developed utilises the tailored tools and techniques of ILS for building services. This TLBM approach will facilitate the successful development of a databank that would be invaluable in capturing essential data (e.g. the reliability of components) for enhancing future building services designs, life cycle costing and decision making by practitioners, in particular facilities managers.
Abstract:
Climate change is one of the major challenges facing economic systems at the start of the 21st century. Reducing greenhouse gas emissions will require both restructuring the energy supply system (production) and addressing the efficiency and sufficiency of the social uses of energy (consumption). The energy production system is a complicated supply network of interlinked sectors with 'knock-on' effects throughout the economy. End-use energy consumption is governed by complex sets of interdependent cultural, social, psychological and economic variables driven by shifts in consumer preference and technological development trajectories. To date, few models have been developed for exploring alternative joint energy production-consumption systems. The aim of this work is to propose one such model. This is achieved in a methodologically coherent manner through the integration of qualitative input-output models of production with Bayesian belief network models of consumption at the point of final demand. The resulting integrated framework can be applied either (relatively) quickly and qualitatively to explore alternative energy scenarios, or as a fully developed quantitative model to derive or assess specific energy policy options. The qualitative applications are explored here.
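A minimal sketch of the kind of coupling described: a hand-rolled two-node belief network (consumer preference, demand level) supplies the expected final energy demand to a toy two-sector Leontief input-output model. All matrices, probabilities and demand values are illustrative and not taken from the paper.

```python
import numpy as np

# Toy two-sector input-output model (Leontief): x = (I - A)^-1 d,
# where A holds technical coefficients and d is final demand.
A = np.array([[0.2, 0.3],    # energy-sector inputs per unit output
              [0.1, 0.4]])   # rest-of-economy inputs per unit output

# Hand-rolled discrete belief network: preference -> demand level.
p_pref = {"efficiency-minded": 0.6, "convenience-minded": 0.4}
p_demand_given_pref = {
    "efficiency-minded":  {"low": 0.7, "high": 0.3},
    "convenience-minded": {"low": 0.2, "high": 0.8},
}
demand_value = {"low": 80.0, "high": 120.0}   # final energy demand, arbitrary units

# Marginalize over preferences to get expected final energy demand.
expected_energy_demand = sum(
    p_pref[pref] * sum(p * demand_value[lvl] for lvl, p in levels.items())
    for pref, levels in p_demand_given_pref.items()
)

d = np.array([expected_energy_demand, 200.0])   # [energy, rest of economy]
x = np.linalg.solve(np.eye(2) - A, d)           # total sectoral output
print(expected_energy_demand, x)
```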
Abstract:
Several studies have highlighted the importance of the cooling period for oil absorption in deep-fat fried products. Specifically, it has been established that the largest proportion of the oil that ends up in the food is sucked into the porous crust region after the fried product is removed from the oil bath, stressing the importance of this time interval. The main objective of this paper was to develop a predictive mechanistic model that can be used to understand the principles behind post-frying cooling oil absorption kinetics and that can also help identify the key parameters affecting the final oil uptake of the fried product. The model was developed for two different geometries, an infinite slab and an infinite cylinder, and was divided into two main sub-models, one describing the immersion frying period itself and the other describing the post-frying cooling period. The immersion frying period was described by a transient moving-front model that considered the movement of the crust/core interface, whereas post-frying cooling oil absorption was treated as a pressure-driven flow mediated by capillary forces. A key element of the model was the hypothesis that oil suction only begins once a positive pressure driving force has developed. The mechanistic model was based on measurable physical and thermal properties and process parameters, with no need for empirical data fitting, and can be used to study oil absorption in any deep-fat fried product that satisfies the assumptions made.
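The hypothesis that oil uptake starts only once a positive pressure driving force develops can be illustrated with a generic Washburn-type sketch for a single idealized crust pore. The relations and every parameter value below are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Oil is drawn into the pore only once the net driving pressure becomes
# positive, i.e. once capillary suction exceeds the decaying vapour
# overpressure left over from frying. In a real product the penetration
# depth is capped by the crust thickness.
sigma, theta = 0.03, np.deg2rad(30)      # oil surface tension (N/m), contact angle
r, mu = 5e-6, 0.05                       # pore radius (m), oil viscosity (Pa s)

p_cap = 2.0 * sigma * np.cos(theta) / r  # capillary suction pressure, Pa

def vapour_overpressure(t):
    # Residual vapour overpressure inside the crust, decaying as it cools.
    return 2e4 * np.exp(-t / 10.0)       # Pa

dt, L = 0.01, 1e-6                       # time step (s), initial penetration (m)
for step in range(int(60.0 / dt)):       # 60 s of post-frying cooling
    t = step * dt
    driving = p_cap - vapour_overpressure(t)
    if driving > 0.0:                    # key hypothesis: no uptake before this point
        L += dt * r**2 * driving / (8.0 * mu * L)   # Washburn penetration rate
print(f"oil penetration depth after cooling: {L * 1e3:.1f} mm")
```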
Abstract:
Studies of ignorance-driven decision making have been employed either to analyse when ignorance should prove advantageous on theoretical grounds, or to examine whether human behaviour is consistent with an ignorance-driven inference strategy (e.g., the recognition heuristic). In the current study we examine whether, under conditions where such inferences might be expected, the advantages that theoretical analyses predict are evident in human performance data. A single experiment shows that, when asked to make relative wealth judgements, participants reliably use recognition as a basis for their judgements. Their wealth judgements under these conditions are reliably more accurate when some of the target names are unknown than when participants recognize all of the names (a "less-is-more effect"). These results are consistent across a number of variations: the number of options given to participants and the nature of the wealth judgement. A basic model of recognition-based inference predicts these effects.
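The abstract does not spell out the model, but the standard formulation of recognition-based pairwise inference used in the recognition-heuristic literature shows how a less-is-more effect arises whenever recognition validity exceeds knowledge validity. The sketch below uses that standard formulation with invented validities, so it may differ in detail from the authors' own model.

```python
from math import comb

def expected_accuracy(n, N, alpha, beta):
    """Expected proportion correct in pairwise comparisons when n of N items
    are recognized; alpha = recognition validity, beta = knowledge validity."""
    pairs = comb(N, 2)
    return (comb(N - n, 2) * 0.5          # neither item recognized: guess
            + n * (N - n) * alpha         # one recognized: use recognition
            + comb(n, 2) * beta) / pairs  # both recognized: use knowledge

# Less-is-more: with alpha > beta, recognizing fewer names can beat recognizing all.
print(expected_accuracy(15, 20, alpha=0.8, beta=0.6))  # ~0.67
print(expected_accuracy(20, 20, alpha=0.8, beta=0.6))  # 0.60
```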
Abstract:
This paper discusses the problems inherent in traditional supply chain management's forecast and inventory management processes that arise when tackling a demand-driven supply chain. A demand-driven supply chain management architecture developed by Orchestr8 Ltd., U.K. is described to demonstrate its advantages over traditional supply chain management. Within this architecture, a metrics reporting system is designed using business intelligence technology that supports users in decision making and in planning supply activities based on supply chain health.
Abstract:
The climatology of a stratosphere-resolving version of the Met Office’s climate model is studied and validated against ECMWF reanalysis data. Ensemble integrations are carried out at two different horizontal resolutions. Along with a realistic climatology and annual cycle in zonal mean zonal wind and temperature, several physical effects are noted in the model. The time of final warming of the winter polar vortex is found to descend monotonically in the Southern Hemisphere, as would be expected for purely radiative forcing. In the Northern Hemisphere, however, the time of final warming is driven largely by dynamical effects in the lower stratosphere and radiative effects in the upper stratosphere, leading to the earliest transition to westward winds being seen in the midstratosphere. A realistic annual cycle in stratospheric water vapor concentrations—the tropical “tape recorder”—is captured. Tropical variability in the zonal mean zonal wind is found to be in better agreement with the reanalysis for the model run at higher horizontal resolution because the simulated quasi-biennial oscillation has a more realistic amplitude. Unexpectedly, variability in the extratropics becomes less realistic under increased resolution because of reduced resolved wave drag and increased orographic gravity wave drag. Overall, the differences in climatology between the simulations at high and moderate horizontal resolution are found to be small.
Abstract:
Self-organizing neural networks have been implemented in a wide range of application areas such as speech processing, image processing, optimization and robotics. Recent variations to the basic model proposed by the authors enable it to order state space using a subset of the input vector and to apply a local adaptation procedure that does not rely on a predefined test duration limit. Both these variations have been incorporated into a new feature map architecture that forms an integral part of a Hybrid Learning System (HLS) based on a genetic-based classifier system. Problems are represented within HLS as objects characterized by environmental features. Objects controlled by the system have preset targets set against a subset of their features. The system's objective is to achieve these targets by evolving a behavioural repertoire that efficiently explores and exploits the problem environment. Feature maps encode two types of knowledge within HLS: long-term memory traces of useful regularities within the environment, and classifier performance data calibrated against an object's feature states and targets. Self-organization of these networks constitutes non-genetic (experience-driven) learning within HLS. This paper presents a description of the HLS architecture and an analysis of the modified feature map implementing associative memory. Initial results are presented that demonstrate the behaviour of the system on a simple control task.
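A generic reading of the two variations described (ordering on a subset of the input vector, and a local rather than globally scheduled adaptation procedure) is sketched below for a toy one-dimensional feature map; it is not the HLS implementation itself.

```python
import numpy as np

# Toy self-organizing feature map with (i) winner selection on a masked subset
# of the input vector and (ii) a per-node learning rate that decays locally,
# so training stops node by node rather than after a preset global duration.
rng = np.random.default_rng(0)
n_nodes, n_features = 10, 4
weights = rng.random((n_nodes, n_features))
mask = np.array([1.0, 1.0, 0.0, 0.0])        # order state space on features 0-1 only
local_rate = np.full(n_nodes, 0.5)           # per-node learning rate

def train_step(x):
    # Winner chosen on the masked subset of the input vector.
    dist = np.linalg.norm((weights - x) * mask, axis=1)
    win = int(np.argmin(dist))
    # Update the winner and its immediate neighbours on the 1-D map.
    for j in range(max(0, win - 1), min(n_nodes, win + 2)):
        weights[j] += local_rate[j] * (x - weights[j])
    # Local adaptation: only the winner's rate decays.
    local_rate[win] *= 0.95
    return win

for _ in range(500):
    train_step(rng.random(n_features))
```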
Abstract:
In this chapter we described how the inclusion of a model of a human arm, combined with the measurement of its neural input and a predictor, can provide robustness under time delay to a previously proposed teleoperator design. Our trials gave clear indications of the superiority of the NPT scheme over traditional architectures as well as the modified Yokokohji and Yoshikawa architecture. Its fundamental advantages are the time-lead of the slave, the more efficient and more natural-feeling manipulation it provides, and the fact that incorporating an operator arm model leads to more credible stability results. Finally, its simplicity allows local control techniques that are less likely to fail to be employed. However, a significant advantage of the enhanced Yokokohji and Yoshikawa architecture is the very fact that it is a conservative modification of current designs. Under large prediction errors, it can provide robustness by directing the master and slave states to their means and, since it relies on the passivity of the mechanical part of the system, it would not confuse the operator. An experimental implementation of the techniques will provide further evidence for the performance of the proposed architectures. The employment of neural networks and fuzzy logic, which will provide an adaptive model of the human arm and robustifying control terms, is scheduled for the near future.
Abstract:
The Perspex Machine arose from the unification of computation with geometry. We now report significant redevelopment both of a partial C compiler that generates perspex programs and of a Graphical User Interface (GUI). The compiler is constructed with standard compiler-generator tools and produces both an explicit parse tree for C and an Abstract Syntax Tree (AST) that is better suited to code generation. The GUI uses a hash table and a simpler software architecture to achieve an order-of-magnitude speed-up in processing and, consequently, an order-of-magnitude increase in the number of perspexes that can be manipulated in real time (now 6,000). Two perspex-machine simulators are provided, one using trans-floating-point arithmetic and the other using transrational arithmetic. All of the software described here is available on the World Wide Web. The compiler generates code in the neural model of the perspex. At each branch point it uses a jumper to return control to the main fibre. This has the effect of pruning out an exponentially increasing number of branching fibres, thereby greatly increasing the efficiency of perspex programs as measured by the number of neurons required to implement an algorithm. The jumpers are placed at unit distance from the main fibre and form a geometrical structure analogous to a myelin sheath in a biological neuron. Both the perspex jumper-sheath and the biological myelin sheath share the computational function of preventing cross-over of signals to neurons that lie close to an axon. This is an example of convergence driven by similar geometrical and computational constraints in perspex and biological neurons.
Abstract:
The transport and deposition of charged inhaled aerosols in a double planar bifurcation representing generations three to five of the human respiratory system have been studied under a light-activity breathing condition. Both steady and oscillatory laminar inhalation airflows are considered. Particle trajectories are calculated in a Lagrangian reference frame, with particle motion governed by the fluid force exerted by the airflow, the gravity force and electrostatic forces (both space charge and image charge forces). The particle-mesh method is used to calculate the space charge force. This numerical study investigates the deposition efficiency in the three-dimensional model for various particle sizes, charge values and inlet particle distributions. Numerical results indicate that particles carrying an adequate level of charge can improve deposition efficiency in the airway model.
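The Lagrangian force balance described can be sketched, for a single particle, as Stokes drag toward the local air velocity plus gravity and an image-charge attraction toward the nearest wall. The space-charge term (handled with a particle-mesh method in the paper) is omitted here, and every value below is illustrative rather than taken from the study.

```python
import numpy as np

# Minimal per-particle force balance of the kind used in charged-aerosol
# deposition studies. All parameter values are illustrative.
rho_p, d_p = 1000.0, 5e-6              # particle density (kg/m^3), diameter (m)
m_p = rho_p * np.pi * d_p**3 / 6.0     # particle mass, kg
mu_air = 1.8e-5                        # air dynamic viscosity, Pa s
q = 200 * 1.602e-19                    # particle charge: 200 elementary charges, C
eps0 = 8.854e-12                       # vacuum permittivity, F/m
g = np.array([0.0, 0.0, -9.81])        # gravitational acceleration, m/s^2

def acceleration(v, u_air, wall_dist, to_wall):
    """to_wall is the unit vector from the particle toward the nearest wall."""
    drag = 3.0 * np.pi * mu_air * d_p * (u_air - v) / m_p               # Stokes drag
    image = (q**2 / (16.0 * np.pi * eps0 * wall_dist**2)) / m_p * to_wall  # image charge
    return drag + g + image

# One explicit Euler step; a full tracker would interpolate the CFD velocity
# field at the particle position and integrate over the whole breathing cycle.
x, v = np.zeros(3), np.zeros(3)
u_air = np.array([1.0, 0.0, 0.0])      # local air velocity, m/s
dt = 1e-5
a = acceleration(v, u_air, wall_dist=1e-3, to_wall=np.array([0.0, 1.0, 0.0]))
v = v + a * dt
x = x + v * dt
```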
Abstract:
The consistency of ensemble forecasts from three global medium-range prediction systems with the observed transition behaviour of a three-cluster model of the North Atlantic eddy-driven jet is examined. The three clusters consist of a mid-jet cluster taken to represent an undisturbed jet, and south and north jet clusters representing southward and northward shifts of the jet. The ensemble forecasts span a period of three extended winters (October–February) from October 2007 to February 2010. The mean probabilities of transitions between the clusters calculated from the ensemble forecasts are compared with those calculated from a 23-extended-winter climatology taken from the European Centre for Medium-Range Weather Forecasts 40-Year Re-analysis (ERA40) dataset. No evidence is found of a drift with increasing lead time of the ensemble forecast transition probabilities towards values inconsistent with the 23-extended-winter climatology. The ensemble forecasts of transition probabilities are found to have positive Brier skill at 15-day lead times. For the three-extended-winter forecast set, probabilistic forecasts initialized in the north jet cluster are generally found to be less skilful than those initialized in the other clusters. This is consistent with the shorter persistence time-scale of the north jet cluster observed in the ERA40 23-extended-winter climatology. Copyright © 2011 Royal Meteorological Society
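Brier skill here means skill relative to a climatological reference. The sketch below shows the calculation for one binary event (e.g. "the jet will be in the south cluster at a 15-day lead") with invented numbers, purely to illustrate the definition used.

```python
import numpy as np

# Brier score and Brier skill score for a probabilistic forecast of a binary
# event, with the climatological frequency as the reference forecast.
# The numbers are invented to illustrate the calculation, not taken from the paper.
forecast_prob = np.array([0.7, 0.2, 0.5, 0.9, 0.1])   # ensemble probabilities
outcome       = np.array([1,   0,   1,   1,   0  ])   # observed occurrence (1/0)
climatology   = 0.4                                    # climatological frequency

bs     = np.mean((forecast_prob - outcome) ** 2)       # forecast Brier score
bs_ref = np.mean((climatology - outcome) ** 2)         # reference Brier score
bss    = 1.0 - bs / bs_ref                             # > 0: more skilful than climatology
print(bs, bs_ref, bss)
```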
Abstract:
Construction planning plays a fundamental role in construction project management, requiring teamwork among planners from a diverse range of disciplines who often work in geographically dispersed situations. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork owing to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution that enables a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In the light of the interactive definition method and its computer-supported collaborative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team under genuinely geographically dispersed conditions. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning and can yield a robust construction plan through collaborative teamwork. The realization of this solution prompts further consideration of its enhancement for wider groupware applications.