816 results for Adaptive Modelling, Entropy Evolution, Sustainable Design
Abstract:
Information provision to address changing requirements can best be supported by content management. Current information technology enables information to be stored in and provided from various distributed sources. Identifying and retrieving relevant information requires effective mechanisms for information discovery and assembly. This paper presents a method that enables the design of such mechanisms, with a set of techniques for articulating and profiling users' requirements, formulating information provision specifications, realising management of information content in repositories, and responding to users' requirements dynamically during the process of knowledge construction. These functions are represented in an ontology which integrates the capability of the mechanisms. The ontological modelling in this paper adopts semiotics principles with embedded norms to ensure a coherent course of action in these mechanisms.
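As a rough illustration of the norm idea (not the paper's actual ontology): in organisational semiotics, a behavioural norm typically takes the form "whenever <condition>, the agent is obliged/permitted/prohibited to <action>". The sketch below encodes two such rules in Python; the conditions and actions are invented placeholders.

```python
# A minimal sketch of norms as condition-deontic-action rules; the rule
# content below is invented for illustration, not taken from the paper.

norms = [
    # (condition on the current context, deontic operator, action)
    (lambda ctx: ctx.get("profile_updated"), "obliged", "regenerate_specification"),
    (lambda ctx: ctx.get("content_missing"), "obliged", "search_repositories"),
]

def applicable_actions(context):
    """Return the (operator, action) pairs whose condition holds in context."""
    return [(op, act) for cond, op, act in norms if cond(context)]

print(applicable_actions({"profile_updated": True, "content_missing": False}))
```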
Abstract:
The knowledge economy offers a broad and diverse community of information systems users the opportunity to efficiently gain information and know-how for improving qualifications and enhancing productivity in the workplace. Such demand will continue, and users will frequently require optimised and personalised information content. The advancement of information technology and the wide dissemination of information support individual users in constructing new knowledge from their experience in real-world contexts. However, designing personalised information provision is challenging because users' requirements and information provision specifications are complex to represent. Existing methods cannot effectively support this analysis process. This paper presents a mechanism which can holistically facilitate customisation of information provision based on individual users' goals, level of knowledge and cognitive style preferences. An ontology model with embedded norms represents the domain knowledge of information provision in a specific context where users' needs can be articulated and represented in a user profile. These formal requirements can then be transformed into information provision specifications, which are used to discover suitable information content from repositories and pedagogically organise the selected content to meet the users' needs. The method is adaptive, enabling an appropriate response to changes in users' requirements during the process of acquiring knowledge and skills.
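As a minimal sketch of how profile-driven content selection might look in code: the field names (goal topics, knowledge level, preferred style) and the scoring rule are illustrative assumptions, not the paper's ontology or specification format.

```python
# Illustrative profile-to-content matching; all metadata fields are invented.
from dataclasses import dataclass

@dataclass
class UserProfile:
    goal_topics: set        # topics the user wants to learn
    knowledge_level: int    # 1 (novice) .. 5 (expert)
    preferred_style: str    # e.g. "visual" or "verbal"

@dataclass
class ContentItem:
    title: str
    topic: str
    difficulty: int         # 1 .. 5
    style: str

def score(item: ContentItem, profile: UserProfile) -> float:
    """Higher scores mean a better fit to the user's requirements."""
    if item.topic not in profile.goal_topics:
        return 0.0
    # Prefer material slightly above the user's current level.
    level_fit = 1.0 / (1 + abs(item.difficulty - (profile.knowledge_level + 1)))
    style_fit = 1.0 if item.style == profile.preferred_style else 0.5
    return level_fit * style_fit

def select_content(repo, profile, k=3):
    """Rank repository items and return the top-k matches."""
    ranked = sorted(repo, key=lambda item: score(item, profile), reverse=True)
    return [item for item in ranked if score(item, profile) > 0][:k]

repo = [
    ContentItem("Intro to BIM", "bim", 2, "visual"),
    ContentItem("Advanced BIM workflows", "bim", 4, "verbal"),
    ContentItem("Basics of ontologies", "ontology", 1, "visual"),
]
user = UserProfile(goal_topics={"bim"}, knowledge_level=1, preferred_style="visual")
print([item.title for item in select_content(repo, user)])
```

The "level + 1" preference encodes the pedagogical idea of offering material slightly above the user's current level; a real system would derive such rules from the ontology's embedded norms.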
Abstract:
Purpose – While Freeman's stakeholder management approach has attracted much attention from both scholars and practitioners, little empirical work has considered the interconnectedness of organisational and stakeholder perspectives. The purpose of this paper is to respond to this gap by developing and empirically testing a bi-directional model of organisation/stakeholder relationships. Design/methodology/approach – A conceptual framework is developed that integrates how stakeholders are affected by organisations with how they affect organisations. Quantitative data relating to both sides of the relationship are obtained from 700 customers of a European service organisation and analysed using the partial least squares structural equation modelling technique. Findings – The findings provide empirical support for the notion of mutual dependency between organisations and stakeholders advocated by stakeholder theorists. The results suggest that the way stakeholders relate to organisations depends on how organisations relate to stakeholders. Originality/value – The study is original on two fronts: first, it provides a framework and process that researchers can use to model bi-directional relationships with other stakeholder groups and in different contexts; second, it presents an example application of bi-directional research by empirically linking organisational and stakeholder expectations in the case of customers of a UK service organisation.
Abstract:
The design of high-voltage equipment encompasses the study of oscillatory surges caused by transients such as those produced by switching. By obtaining a model whose response reconstructs that observed in the actual system, simulation studies and critical tests can be carried out on the model rather than on the equipment itself. In this paper, methods for the construction of simplified models are described, and it is shown that the use of a complex model does not necessarily result in superior reconstruction of the response pattern.
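As an illustrative sketch of such a simplified model (not drawn from the paper): a switching surge is often approximated by the step response of a low-order oscillatory circuit. The series RLC values below are placeholders.

```python
# Step response of a series RLC circuit as a stand-in for a switching surge,
# integrated with semi-implicit Euler. Component values are illustrative.
R, L, C = 5.0, 0.1, 1e-6     # ohms, henries, farads (illustrative)
dt, V_step = 1e-6, 1.0       # time step (s) and per-unit switching step
v, i, v_peak = 0.0, 0.0, 0.0 # capacitor voltage, inductor current, peak tracker
for _ in range(4000):        # roughly two oscillation periods
    i += (V_step - R * i - v) / L * dt
    v += (i / C) * dt
    v_peak = max(v_peak, v)
print(f"peak oscillatory overshoot: {v_peak:.2f} pu")
```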
Abstract:
A simple parameter-adaptive controller design methodology is introduced in which steady-state servo tracking properties provide the major control objective. This is achieved without cancellation of process zeros, and hence the underlying design can be applied to non-minimum-phase systems. As with other self-tuning algorithms, the design (user-specified) polynomials of the proposed algorithm define the performance capabilities of the resulting controller. However, with appropriate definition of these polynomials, the synthesis technique can be shown to admit different adaptive control strategies, e.g. self-tuning PID and self-tuning pole-placement controllers. The algorithm can therefore be thought of as an embodiment of other self-tuning design techniques. The performance of some of the resulting controllers is illustrated using simulation examples and an on-line application to an experimental apparatus.
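A minimal sketch of the self-tuning idea for a first-order plant: recursive least-squares identification feeding a pole-placement law. The plant, noise level, probing signal and desired pole are illustrative; the paper's specific design polynomials are not reproduced.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One recursive least-squares step with forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    err = float(y - phi.T @ theta)              # prediction error
    K = P @ phi / float(lam + phi.T @ P @ phi)  # update gain
    theta = theta + K * err
    P = (P - K @ phi.T @ P) / lam
    return theta, P

# Estimate a, b in y[t] = a*y[t-1] + b*u[t-1] + noise, then place the
# closed-loop pole at 0.5 with u = (0.5 - a_hat)/b_hat * y (first-order case).
rng = np.random.default_rng(0)
a_true, b_true = 0.9, 0.5
theta, P = np.zeros((2, 1)), 1000 * np.eye(2)
y_prev, u_prev = 0.0, 0.0
for t in range(200):
    y = a_true * y_prev + b_true * u_prev + 0.01 * rng.standard_normal()
    theta, P = rls_update(theta, P, np.array([y_prev, u_prev]), y)
    a_hat, b_hat = theta.flatten()
    u = (0.5 - a_hat) / b_hat * y if abs(b_hat) > 1e-3 else 0.0
    u += 0.1 * rng.standard_normal()   # small probing signal for identifiability
    y_prev, u_prev = y, u
print("estimated a, b:", theta.flatten())
```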
Abstract:
This paper considers the use of a discrete-time deadbeat control action on systems affected by noise. Variations on the standard controller form are discussed, and comparisons are made with controllers in which noise rejection is a higher-priority objective. Both load and random disturbances are considered in the system description, although the aim of the deadbeat design remains the tailoring of the response to reference input variations. Finally, the use of such a deadbeat action within a self-tuning control framework is shown to satisfy the self-tuning property under certain conditions, though generally only when an extended form of least-squares estimation is incorporated.
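A minimal sketch of the deadbeat idea on a noisy first-order plant, assuming the model y[t+1] = a·y[t] + b·u[t] + e[t]: with exact parameters and no noise the output reaches the reference in one sample, while noise is re-cancelled every step.

```python
import numpy as np

# Deadbeat control of y[t+1] = a*y[t] + b*u[t] + e[t]: the law
# u[t] = (r - a*y[t]) / b places all closed-loop poles at the origin.
a, b, r = 0.8, 0.4, 1.0          # illustrative plant and reference
rng = np.random.default_rng(1)
y = 0.0
for t in range(6):
    u = (r - a * y) / b              # deadbeat control law
    e = 0.05 * rng.standard_normal() # process/measurement noise
    y = a * y + b * u + e
    print(f"t={t}  y={y:.3f}")
# With e = 0 the output equals r for all t >= 1; with noise the controller
# re-cancels the disturbance each step, illustrating its noise sensitivity.
```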
Abstract:
As the building industry moves towards low-impact buildings, research attention is being drawn to the reduction of carbon dioxide emissions and waste. From design and construction through operation and demolition, various building materials are used throughout the whole building lifecycle, involving significant energy consumption and waste generation. Building Information Modelling (BIM) is emerging as a tool that can support holistic design decision-making for reducing embodied carbon and waste production across the building lifecycle. This study aims to establish a framework for assessing embodied carbon and waste underpinned by BIM technology. On the basis of a review of current research, the framework is considered to include a functional module for embodied carbon computation, a module for waste estimation, a knowledge base of construction and demolition methods, a repository of building component information, and an inventory of construction materials' energy and carbon. Through both static 3D model visualisation and dynamic modelling supported by the framework, embodied energy (carbon), waste and associated costs can be analysed within the boundaries of cradle-to-gate, construction, operation, and demolition. The proposed holistic modelling framework makes it possible to analyse embodied carbon and waste, including associated costs, from different building lifecycle perspectives. It brings together existing segmented embodied carbon and waste estimation into a unified model, so that interactions between various parameters across the different building lifecycle phases can be better understood. It can thereby improve design decision support for optimal low-impact building development. The applicability of this framework is expected to be developed and tested on industrial projects in the near future.
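As a numeric sketch of the cradle-to-gate part of such a framework: the material quantities and carbon factors below are placeholders, not values from the paper or any inventory (a real study would draw factors from a database such as the ICE inventory).

```python
# Cradle-to-gate embodied carbon = sum of material quantity x carbon factor.
# All numbers below are illustrative placeholders.

carbon_factors = {          # kgCO2e per kg of material (illustrative only)
    "concrete": 0.13,
    "steel": 1.46,
    "timber": 0.31,
}

bill_of_materials = {       # kg of each material taken off a BIM model
    "concrete": 250_000,
    "steel": 18_000,
    "timber": 6_500,
}

def embodied_carbon(bom, factors):
    """Cradle-to-gate embodied carbon in kgCO2e for a bill of materials."""
    return sum(qty * factors[mat] for mat, qty in bom.items())

print(f"{embodied_carbon(bill_of_materials, carbon_factors):,.0f} kgCO2e")
```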
Abstract:
1. Closed Ecological Systems (CES) are small man-made ecosystems which do not have any material exchange with the surrounding environment. Recent ecological and technological advances enable the successful establishment and maintenance of CES, making them a suitable tool for detecting and measuring subtle feedbacks and mechanisms. 2. As part of an analogue (physical) C cycle modelling experiment, we developed a non-intrusive methodology to control the internal environment and to monitor atmospheric CO2 concentration inside 16 replicated CES. Whilst maintaining an air-tight seal on all CES, this approach allowed access to the CO2 measuring equipment for periodic re-calibration and repairs. 3. To ensure reliable cross-comparison of CO2 observations between individual CES units and to minimise the cost of the system, only one CO2 sampling unit was used. An ADC BioScientific OP-2 (open-path) analyser mounted on a swinging arm passed over a set of 16 measuring cells, each connected to an individual CES with air continuously circulating between them. 4. Using this setup, we were able to continuously measure several environmental variables and the CO2 concentration within each closed system, allowing us to study minute effects of changing temperature on C fluxes within each CES. The CES and the measuring cells showed minimal air leakage during an experimental run lasting, on average, 3 months. The CO2 analyser assembly performed reliably for over 2 years; however, an early iteration of the present design proved to be sensitive to positioning errors. 5. We indicate how the methodology can be further improved and suggest possible avenues where future CES-based research could be applied.
Abstract:
There is a growing appreciation among evolutionary biologists that the rate and tempo of molecular evolution might often be altered at or near the time of speciation, i.e. that speciation is in some way a special time for genes. Molecular phylogenies frequently reveal increased rates of genetic evolution associated with speciation, and other lines of investigation suggest that various types of abrupt genomic disruption can play an important role in promoting speciation via reproductive isolation. These phenomena are in conflict with the gradual view of molecular evolution that is implicit in much of our thinking about speciation and in the tools of modern biology. This raises the prospect of studying the molecular evolutionary consequences of speciation per se and studying the footprint of speciation as an active force in promoting genetic divergence. Here we discuss the reasons to believe that speciation can play such a role and elaborate on possible mechanisms for accelerated rates of evolution following speciation. We provide an example of how it is possible to detect whether accelerated bursts of evolution occur in neutral and/or adaptive regions of genes, and discuss the implications of rapid episodes of change for conventional models of molecular evolution. Speciation might often owe more to ephemeral and essentially arbitrary events that cause reproductive isolation than to the gradual and regular tug of natural selection that draws a species into a new niche.
Abstract:
A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental-design-based criteria are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric, while in the latter stage, the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalising the model parameter variance. The use of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
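A minimal sketch of A-optimality used as a subset-selection metric, as described; the derived lower bound and the composite cost function of NeuDeC are not reproduced here.

```python
import numpy as np
from itertools import combinations

# Among candidate regressor subsets, prefer the one minimising
# trace((X^T X)^-1), the summed variance of the parameter estimates.

def a_optimality(X):
    """Trace of the inverse information matrix; smaller is better."""
    return np.trace(np.linalg.inv(X.T @ X))

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 6))    # 100 samples, 6 candidate regressors
best = min(combinations(range(6), 3),
           key=lambda cols: a_optimality(X[:, list(cols)]))
print("best 3-regressor subset by A-optimality:", best)
```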
Abstract:
Techniques for modelling urban microclimates and urban block surface temperatures are desired by urban planners and architects for strategic urban design at the early design stages. This paper introduces a simplified mathematical model for urban simulations (UMsim), covering urban surface temperatures and microclimates. The nodal network model has been developed by integrating a coupled thermal and airflow model. Direct solar radiation, diffuse radiation, reflected radiation, long-wave radiation, heat convection in air, and heat transfer in the exterior walls and ground within the complex have been taken into account. The relevant equations have been solved using the finite difference method on the Matlab platform. Comparisons have been conducted between the simulated data and data from an urban experimental study carried out in a real architectural complex on the campus of Chongqing University, China, in July 2005 and January 2006. The results show a satisfactory agreement between the two sets of data. UMsim can be used to simulate microclimates, in particular the surface temperatures of urban blocks, and can therefore be used to assess the impact of urban surface properties on urban microclimates. UMsim will be able to produce robust data and images of urban environments for sustainable urban design.
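As an illustrative sketch of the finite-difference nodal idea (conduction only, with made-up material properties; the full model's radiation and convection couplings are omitted):

```python
import numpy as np

# Explicit finite-difference update of node temperatures in a 1-D wall slice.
alpha = 7e-7        # thermal diffusivity of the wall material, m^2/s (illustrative)
dx, dt = 0.02, 60.0 # node spacing (m) and time step (s)
r = alpha * dt / dx**2
assert r <= 0.5, "explicit scheme stability limit"

T = np.full(11, 20.0)     # initial node temperatures, degC
T_out, T_in = 35.0, 22.0  # fixed exterior/interior surface temperatures
for step in range(360):   # six hours of simulated time
    T[0], T[-1] = T_out, T_in                       # Dirichlet boundaries
    T[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
print("wall temperature profile:", np.round(T, 2))
```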
Abstract:
In this chapter we described how the inclusion of a model of a human arm, combined with the measurement of its neural input and a predictor, can lend robustness under time delay to a previously proposed teleoperator design. Our trials gave clear indications of the superiority of the NPT scheme over traditional architectures as well as the modified Yokokohji and Yoshikawa architecture. Its fundamental advantages are the time-lead of the slave, the more efficient and more natural-feeling manipulation it provides, and the fact that incorporating an operator arm model leads to more credible stability results. Finally, its simplicity allows local control techniques that are less likely to fail to be employed. However, a significant advantage of the enhanced Yokokohji and Yoshikawa architecture stems from the very fact that it is a conservative modification of current designs. Under large prediction errors, it can provide robustness by directing the master and slave states to their means and, since it relies on the passivity of the mechanical part of the system, it would not confuse the operator. An experimental implementation of the techniques will provide further evidence of the performance of the proposed architectures. The employment of neural networks and fuzzy logic, which will provide an adaptive model of the human arm and robustifying control terms, is scheduled for the near future.
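A generic sketch of model-based prediction over a known delay, in the spirit of giving the slave a time lead by propagating an operator-arm model forward through the stored inputs. The second-order arm model and all parameters are illustrative assumptions, not the chapter's NPT scheme.

```python
import numpy as np

# Propagate a delayed arm state forward through the last `delay_steps`
# commanded inputs, using a mass-damper-stiffness arm model (illustrative).
m, b, k = 1.5, 8.0, 30.0     # arm inertia, damping, stiffness (illustrative)
dt, delay_steps = 0.001, 50  # 1 ms step, 50 ms round-trip delay

A = np.array([[1.0, dt], [-k / m * dt, 1.0 - b / m * dt]])  # Euler-discretised
B = np.array([0.0, dt / m])

def predict(x_delayed, u_history):
    """Propagate the delayed state forward through the stored inputs."""
    x = x_delayed.copy()
    for u in u_history:          # the last `delay_steps` inputs
        x = A @ x + B * u
    return x

x_delayed = np.array([0.01, 0.0])   # delayed position/velocity measurement
u_history = [0.5] * delay_steps     # constant neural-input estimate (illustrative)
print("predicted current state:", predict(x_delayed, u_history))
```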
Abstract:
This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use in oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times faster than previous designs.
Abstract:
This paper outlines some rehabilitation applications of manipulators and identifies that new approaches demand that the robot make intimate contact with the user. The design of new generations of manipulators with programmable compliance, along with higher-level controllers that can set the compliance appropriately for the task, are both feasible propositions. We must thus gain greater insight into the way in which a person interacts with a machine, particularly given that the interaction may be non-passive. We are primarily interested in the change in wrist and arm dynamics as the person co-contracts his/her muscles. It is observed that this leads to a change in stiffness that can push an actuated interface into a limit cycle. We use both experimental results gathered from a PHANToM haptic interface and a mathematical model to observe this effect. The results are relevant to the fields of rehabilitation and therapy robots, haptic interfaces, and telerobotics.
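An illustrative simulation of the effect described: a 1-DOF device pressed into a sampled (zero-order-hold) virtual wall, where raising the wall stiffness beyond what the device damping can dissipate per sample produces a sustained contact oscillation. All parameters are invented, not those of the PHANToM experiments.

```python
# Sampled virtual wall: stiffness high relative to damping and sample period
# destabilises the contact into a bouncing oscillation (illustrative values).

def simulate(K_wall, F_hand=0.3, T=3.0):
    m, b = 0.1, 0.2                  # device mass (kg), damping (Ns/m)
    dt_sim, dt_ctrl = 1e-5, 1e-3     # physics step vs 1 kHz control update
    x, v, F_wall = 0.0, 0.0, 0.0     # hand force F_hand presses into the wall
    lo, hi = float("inf"), float("-inf")
    per_update = int(dt_ctrl / dt_sim)
    for i in range(int(T / dt_sim)):
        if i % per_update == 0:                  # sampled, zero-order-hold force
            F_wall = -K_wall * x if x > 0 else 0.0
        v += (F_hand + F_wall - b * v) / m * dt_sim
        x += v * dt_sim
        if i > int(2.0 / dt_sim):                # measure after the transient
            lo, hi = min(lo, x), max(hi, x)
    return hi - lo                               # peak-to-peak oscillation

for K in (100, 500, 2000, 8000):                 # wall stiffness, N/m
    print(f"K = {K:5d} N/m -> peak-to-peak {simulate(K) * 1000:.3f} mm")
```

With these values the sampled wall is passive only for low stiffness (roughly K below 2b divided by the sample period), so the printed oscillation amplitude grows once K exceeds that bound, mirroring the effect of a user stiffening their arm against an actuated interface.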