14 results for Design Environments

in the Aston University Research Archive


Relevance: 30.00%

Abstract:

Distributed digital control systems provide alternatives to conventional, centralised digital control systems. Typically, a modern distributed control system will comprise a multi-processor or network of processors, a communications network, an associated set of sensors and actuators, and the systems and applications software. This thesis addresses the problem of how to design robust decentralised control systems, such as those used to control event-driven, real-time processes in time-critical environments. Emphasis is placed on studying the dynamical behaviour of a system and identifying ways of partitioning the system so that it may be controlled in a distributed manner. A structural partitioning technique is adopted which makes use of natural physical sub-processes in the system, which are then mapped onto the software processes used to control the system. However, communications are required between the processes because of the disjoint nature of the distributed (i.e. partitioned) state of the physical system. The structural partitioning technique, and recent developments in the theory of potential controllability and observability of a system, are the basis for the design of controllers. In particular, the method is used to derive a decentralised estimate of the state vector for a continuous-time system. The work is also extended to derive a distributed estimate for a discrete-time system. Emphasis is also given to the role of communications in the distributed control of processes and to the partitioning technique necessary to design distributed and decentralised systems with resilient structures. A method is presented for the systematic identification of necessary communications for distributed control. It is also shown that the structural partitions can be used directly in the design of software fault-tolerant concurrent controllers. In particular, the structural partition can be used to identify the boundary of the conversation which can be used to protect a specific part of the system. In addition, for certain classes of system, the partitions can be used to identify processes which may be dynamically reconfigured in the event of a fault. These methods should be of use in the design of robust distributed systems.
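The derivations themselves are not reproduced in this abstract; purely as an illustration of the decentralised estimation idea, the following minimal Python sketch (assumed numbers: two scalar sub-processes, hypothetical observer gains) shows each local estimator correcting with its own measurement while using its neighbour's communicated estimate for the coupling term.

import numpy as np

# Hypothetical two-block partition of a discrete-time system x[k+1] = A x[k];
# each block measures its own sub-state (C_i = 1).
A = np.array([[0.90, 0.10],
              [0.05, 0.80]])
L = np.array([0.5, 0.4])          # assumed local observer gains (A_ii - L_i stable)

x = np.array([1.0, -1.0])         # true state
xh = np.array([0.0, 0.0])         # decentralised estimates

for k in range(20):
    y = x.copy()                  # local measurements y_i = x_i
    # Each estimator uses its own dynamics, its own measurement, and the
    # *communicated* estimate of the neighbouring sub-state for the coupling term.
    xh_next = np.empty(2)
    for i in range(2):
        j = 1 - i
        xh_next[i] = A[i, i] * xh[i] + A[i, j] * xh[j] + L[i] * (y[i] - xh[i])
    x = A @ x
    xh = xh_next

print("estimation error:", x - xh)   # decays towards zero for this stable choice of gains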

Relevance: 30.00%

Abstract:

Modern distributed control systems comprise a set of processors which are interconnected using a suitable communication network. For use in real-time control environments, such systems must be deterministic and generate specified responses within critical timing constraints. Also, they should be sufficiently robust to survive predictable events such as communication or processor faults. This thesis considers the problem of coordinating and synchronizing a distributed real-time control system under normal and abnormal conditions. Distributed control systems need to periodically coordinate the actions of several autonomous sites. Often the type of coordination required is the all-or-nothing property of an atomic action. Atomic commit protocols have been used to achieve this atomicity in distributed database systems which are not subject to deadlines. This thesis addresses the problem of applying time constraints to atomic commit protocols so that decisions can be made within a deadline. A modified protocol is proposed which is suitable for real-time applications. The thesis also addresses the problem of ensuring that atomicity is provided even if processor or communication failures occur. Previous work has considered the design of atomic commit protocols for use in non-time-critical distributed database systems. However, in a distributed real-time control system a fault must not allow stringent timing constraints to be violated. This thesis proposes commit protocols using synchronous communications which can be made resilient to a single processor or communication failure and still satisfy deadlines. Previous formal models used to design commit protocols have had adequate state coverability but have omitted timing properties. They also assumed that sites communicated asynchronously and omitted the communications from the model. Timed Petri nets are used in this thesis to specify and design the proposed protocols, which are analysed for consistency and timeliness. Also, the communication system is modelled within the Petri net specifications so that communication failures can be included in the analysis. Analysis of the Timed Petri net and the associated reachability tree is used to show the proposed protocols always terminate consistently and satisfy timing constraints. Finally the applications of this work are described. Two different types of applications are considered, real-time databases and real-time control systems. It is shown that it may be advantageous to use synchronous communications in distributed database systems, especially if predictable response times are required. Emphasis is given to the application of the developed commit protocols to real-time control systems. Using the same analysis techniques as those used for the design of the protocols it can be shown that the overall system performs as expected both functionally and temporally.
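The protocols themselves are specified and analysed with Timed Petri nets in the thesis, which the abstract does not reproduce; purely as a schematic sketch of the timing rule described (a decision must be reached within a deadline, otherwise the atomic action aborts), a single coordinator voting round might look as follows, with hypothetical participants and timings.

import time

def coordinate_commit(participants, deadline_s):
    """Schematic 2PC-style voting round with a hard deadline (not the thesis's Petri-net spec).

    Each participant is a callable returning True (vote commit) or False (vote abort).
    The decision must be reached before deadline_s seconds elapse, otherwise the
    action is aborted so the timing constraint is never violated.
    """
    start = time.monotonic()
    votes = []
    for vote in participants:
        if time.monotonic() - start > deadline_s:
            return "ABORT"            # deadline expired before all votes arrived
        votes.append(vote())
    return "COMMIT" if all(votes) else "ABORT"

# Hypothetical usage: two fast sites and one slow one; the slow vote misses the deadline.
fast = lambda: True
slow = lambda: (time.sleep(0.2) or True)
print(coordinate_commit([fast, slow, fast], deadline_s=0.1))   # -> ABORT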

Relevance: 30.00%

Abstract:

This thesis has two aims. First, it sets out to develop an alternative methodology for the investigation of risk homeostasis theory (RHT). It is argued that the current methodologies of the pseudo-experimental design and post hoc analysis of road-traffic accident data both have their limitations, and that the newer 'game' type simulation exercises are also, but for different reasons, incapable of testing RHT predictions. The alternative methodology described here is based on the simulation of physical risk with intrinsic reward rather than a 'points pay-off'. The second aim of the thesis is to examine a number of predictions made by RHT through the use of this alternative methodology. Since the pseudo-experimental design and post hoc analysis of road-traffic data are both ill-suited to the investigation of that part of RHT which deals with the role of utility in determining risk-taking behaviour in response to a change in environmental risk, and since the concept of utility is critical to RHT, the methodology reported here is applied to the specific investigation of utility. Attention too is given to the question of which behavioural pathways carry the homeostasis effect, and whether those pathways are 'local' to the nature of the change in environmental risk. It is suggested that investigating RHT through this new methodology holds a number of advantages and should be developed further in an attempt to answer the RHT question. It is suggested too that the methodology allows RHT to be seen in a psychological context, rather than the statistical context that has so far characterised its investigation. The experimental findings reported here are in support of hypotheses derived from RHT and would therefore seem to argue for the importance of the individual and collective target level of risk, as opposed to the level of environmental risk, as the major determinant of accident loss.

Relevance: 30.00%

Abstract:

Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed and the results of computer simulation experiments to compare the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
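The DMRP architecture itself cannot be reconstructed from this abstract; for context only, the period-by-period netting logic that each cell-resident MRP II engine would run on its own items can be sketched roughly as below (lot-for-lot ordering and all demand figures are assumptions for illustration).

def mrp_net(gross_requirements, on_hand, scheduled_receipts, lead_time):
    """Minimal period-by-period MRP netting for one item in one cell (illustrative only).

    Returns planned order releases, offset backwards by the lead time.
    """
    planned_releases = [0] * len(gross_requirements)
    inventory = on_hand
    for t, gross in enumerate(gross_requirements):
        inventory += scheduled_receipts[t]
        net = gross - inventory
        if net > 0:
            release_period = max(t - lead_time, 0)
            planned_releases[release_period] += net   # lot-for-lot ordering assumed
            inventory = 0
        else:
            inventory -= gross
    return planned_releases

# Hypothetical item: six periods of demand, 20 units on hand, 2-period lead time.
print(mrp_net([10, 10, 30, 0, 25, 15], on_hand=20, scheduled_receipts=[0]*6, lead_time=2))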

Relevance: 30.00%

Abstract:

This work attempts to create a systemic design framework for man-machine interfaces which is self-consistent, compatible with other concepts, and applicable to real situations. This is tackled by examining the current architecture of computer applications packages. The treatment in the main is philosophical and theoretical and analyses the origins, assumptions and current practice of the design of applications packages. It proposes that the present form of packages is fundamentally contradictory to the notion of packaging itself. This is because, as an indivisible ready-to-implement solution, current package architecture displays the following major disadvantages. First, it creates problems as a result of user-package interactions, in which the designer tries to mould all potential individual users, no matter how diverse they are, into one model. This is worsened by the minute provision, if any, of important properties such as flexibility, independence and impartiality. Second, it displays a rigid structure that reduces the variety and/or multi-use of the component parts of such a package. Third, it dictates specific hardware and software configurations, which is likely to reduce the number of degrees of freedom available to its user. Fourth, it increases the dependence of its user upon its supplier through inadequate documentation and understanding of the package. Fifth, it tends to cause a degeneration of the design expertise of data processing practitioners. In view of this understanding, an alternative methodological design framework is proposed which is consistent both with the systems approach and with the role of a package in its likely context. The proposition is based upon an extension of the identified concept of the hierarchy of holons* which facilitates the examination of the complex relationships of a package with its two principal environments. First, the user's characteristics and decision-making practices and procedures, implying an examination of the user's M.I.S. network. Second, the software environment and its influence upon a package regarding support, control and operation of the package. The framework is built gradually as discussion advances around the central theme of a compatible M.I.S., software and model design. This leads to the formation of an alternative package architecture that is based upon the design of a number of independent, self-contained small parts. This is believed to constitute a nucleus around which not only can packages be more effectively designed, but which is also applicable to the design of many other man-machine systems.

Relevance: 30.00%

Abstract:

The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to an inadequate consideration shown to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development, and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement on a KBS project being carried out by the Technology Division of the Trustees Saving Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead, concentration was given to the construction of the knowledge base and prototype evaluation with the expert(s). In response to this identified problem, a set of methods was developed aimed at encouraging developers to consider user interface requirements early on in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and, within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.

Relevance: 30.00%

Abstract:

Manufacturing systems that are heavily dependent upon direct workers have an inherent complexity that the system designer is often ill-equipped to understand. This complexity is due to the interactions that cause variations in the performance of the workers. Variation in human performance can be explained by many factors; however, one important factor that is not currently considered in any detail during the design stage is the physical working environment. This paper presents the findings of ongoing research investigating human performance within manufacturing systems. It sets out to identify the form of the relationships that exist between changes in physical working environmental variables and operator performance. These relationships can provide managers with a decision basis when designing and managing manufacturing systems and their environments.

Relevance: 30.00%

Abstract:

Manufacturing system design is an ongoing activity within industry. Modelling tools based on Discrete Event Simulation are often used by practitioners during this design cycle. However, such tools do not adequately model the behaviour of 'direct' workers in manufacturing environments. There is an important need to expand the capability of modelling to include the relationships between human-centred factors (demography, attitudes, beliefs, etc.), their working environment (physical and organizational), and their subsequent performance in terms of productive routines. Therefore, this paper describes research that has formed a pilot modelling methodology, an important first step in providing such a capability.

Relevance: 30.00%

Abstract:

Agent-based technology is playing an increasingly important role in today's economy. Usually a multi-agent system is needed to model an economic system such as a market system, in which heterogeneous trading agents interact with each other autonomously. Two questions often need to be answered regarding such systems: 1) How to design an interaction mechanism that facilitates efficient resource allocation among usually self-interested trading agents? 2) How to design an effective strategy in some specific market mechanism for an agent to maximise its economic returns? For automated market systems, the auction is the most popular mechanism for solving resource allocation problems among their participants. However, auctions come in hundreds of different formats, some of which are better than others in terms not only of allocative efficiency but also of other properties, e.g. whether they generate high revenue for the auctioneer or induce stable behaviour of the bidders. In addition, different strategies result in very different performance under the same auction rules. Against this background, we investigate auction mechanism and strategy designs for agent-based economics. The international Trading Agent Competition (TAC) Ad Auction (AA) competition provides a very useful platform to develop and test agent strategies in the Generalised Second Price (GSP) auction. AstonTAC, the runner-up of TAC AA 2009, is a successful advertiser agent designed for GSP-based keyword auctions. In particular, AstonTAC generates adaptive bid prices according to the Market-based Value Per Click and selects the set of keyword queries with the highest expected profit to bid on, so as to maximise its expected profit under the limit of conversion capacity. Through evaluation experiments, we show that AstonTAC performs well and stably not only in the competition but also across a broad range of environments. The TAC CAT tournament provides an environment for investigating the optimal design of mechanisms for double auction markets. AstonCAT-Plus is the post-tournament version of the specialist developed for CAT 2010. In our experiments, AstonCAT-Plus not only outperforms most specialist agents designed by other institutions but also achieves high allocative efficiencies, transaction success rates and average trader profits. Moreover, we reveal some insights into the CAT: 1) successful markets should maintain a stable and high market share of intra-marginal traders; 2) a specialist's performance depends on the distribution of trading strategies. However, typical double auction models assume trading agents have a fixed trading direction of either buy or sell. With this limitation, they cannot directly reflect the fact that traders in financial markets (the most popular application of the double auction) decide their trading directions dynamically. To address this issue, we introduce the Bi-directional Double Auction (BDA) market, which is populated by two-way traders. Experiments are conducted under both dynamic and static settings of the continuous BDA market. We find that the allocative efficiency of a continuous BDA market comes mainly from rational selection of trading directions. Furthermore, we introduce a high-performance Kernel trading strategy in the BDA market, which uses a kernel probability density estimator built on historical transaction data to decide optimal order prices. The Kernel trading strategy outperforms some popular intelligent double auction trading strategies, including ZIP, GD and RE, in the continuous BDA market, making the highest profit in static games and obtaining the best wealth in dynamic games.
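The exact quoting rule of the Kernel strategy is not given in the abstract; one plausible reading, sketched here in Python with assumed data, is to fit a kernel density estimator to recent transaction prices and quote near the most likely transaction price, clipped by the trader's private value.

import numpy as np
from scipy.stats import gaussian_kde

def kernel_order_price(historical_prices, private_value, buying=True):
    """Sketch of a kernel-based quoting rule (not AstonCAT-Plus's exact rule).

    Fits a Gaussian KDE to historical transaction prices and quotes at the
    most likely transaction price, clipped so the order never crosses the
    trader's private value.
    """
    kde = gaussian_kde(historical_prices)
    grid = np.linspace(min(historical_prices), max(historical_prices), 200)
    likely_price = grid[np.argmax(kde(grid))]          # mode of the estimated density
    if buying:
        return min(likely_price, private_value)         # never bid above own value
    return max(likely_price, private_value)             # never ask below own value

# Hypothetical history of transaction prices centred around 100.
history = np.random.default_rng(0).normal(100, 5, size=300)
print(kernel_order_price(history, private_value=104, buying=True))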

Relevance: 30.00%

Abstract:

Intelligent environments aim at supporting the user in executing her everyday tasks, e.g. by guiding her through a maintenance or cooking procedure. This requires a machine-processable representation of the tasks, for which workflows have proven an efficient means. The increasing number of available sensors in intelligent environments can facilitate the execution of workflows. The sensors can help to recognize when a user has finished a step in the workflow and thus to automatically proceed to the next step. This can greatly reduce the amount of required user interaction. However, manually specifying the conditions for triggering the next step in a workflow is very cumbersome and almost impossible for environments which are not known at design time. In this paper, we present a novel approach for learning and adapting these conditions from observation. We show that the learned conditions can even exceed the quality of conditions manually specified by workflow experts. Thus, the presented approach is very well suited to automatically adapting workflows in intelligent environments and can thereby increase the efficiency of workflow execution. © 2011 IEEE.
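The abstract does not state which learning method is used; as a minimal sketch of the general idea, the completion condition for a workflow step can be learned from labelled sensor snapshots with any off-the-shelf classifier (the sensors, features and labels below are hypothetical).

from sklearn.tree import DecisionTreeClassifier

# Hypothetical sensor snapshots: [cupboard_open, kettle_on, water_level],
# each labelled with whether the "fill kettle" step had been completed.
X = [[1, 0, 0.0], [1, 0, 0.2], [0, 1, 0.9], [1, 1, 0.8], [0, 0, 0.1], [0, 1, 1.0]]
y = [0, 0, 1, 1, 0, 1]

# Learn the step-completion condition instead of hand-authoring it.
condition = DecisionTreeClassifier(max_depth=2).fit(X, y)

# The learned condition can then trigger the transition to the next workflow step.
print(condition.predict([[1, 1, 0.95]]))   # -> [1]: step recognised as finished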

Relevance: 30.00%

Abstract:

Integrated supplier selection and order allocation is an important decision for both designing and operating supply chains. This decision is often influenced by the concerned stakeholders, suppliers, plant operators and customers in different tiers. As firms continue to seek competitive advantage through supply chain design and operations, they aim to create optimized supply chains. This calls for, on the one hand, consideration of multiple conflicting criteria and, on the other hand, consideration of uncertainties of demand and supply. Although there are studies on supplier selection using advanced mathematical models covering a stochastic approach, multiple-criteria decision-making techniques and multiple stakeholder requirements separately, to the authors' knowledge there is no work that integrates these three aspects in a common framework. This paper proposes an integrated method for dealing with such problems using a combined Analytic Hierarchy Process-Quality Function Deployment (AHP-QFD) and chance-constrained optimization algorithm approach that selects appropriate suppliers and allocates orders optimally between them. The effectiveness of the proposed decision support system has been demonstrated through application and validation in the bioenergy industry.
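The paper's actual AHP-QFD weighting scheme and chance-constrained formulation are not reproduced in the abstract; the toy sketch below only illustrates the deterministic equivalent of one such chance constraint under an assumed normal demand, with hypothetical supplier scores, costs and capacities.

import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Hypothetical data: three suppliers with stakeholder-derived preference scores and unit costs.
scores = np.array([0.5, 0.3, 0.2])      # assumed AHP-QFD-style weights
cost = np.array([10.0, 9.0, 11.0])
capacity = np.array([60.0, 50.0, 80.0])

# Demand is uncertain, D ~ N(100, 15^2); require P(total supply >= D) >= 0.95.
mu, sigma, alpha = 100.0, 15.0, 0.95
min_supply = mu + norm.ppf(alpha) * sigma    # deterministic equivalent of the chance constraint

# One simple way to fold preference scores in: penalise low-scoring suppliers' costs.
effective_cost = cost / scores

res = linprog(c=effective_cost,
              A_ub=[-np.ones(3)], b_ub=[-min_supply],    # total allocation >= min_supply
              bounds=list(zip([0, 0, 0], capacity)))
print(res.x, res.fun)                         # order quantities per supplier, weighted cost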

Relevance: 30.00%

Abstract:

Link quality-based rate adaptation has been widely used for IEEE 802.11 networks. However, network performance is affected by both link quality and random channel access. Selecting transmit modes for optimal link throughput can cause medium access control (MAC) throughput loss. In this paper, we investigate this issue and propose a generalised cross-layer rate adaptation algorithm. It jointly considers link quality and channel access to optimise network throughput. The objective is to examine the potential benefits of cross-layer design. An efficient analytic model is proposed to evaluate rate adaptation algorithms under dynamic channel and multi-user access environments. The proposed algorithm is compared to a link-throughput-optimisation-based algorithm. It is found that rate adaptation by optimising link-layer throughput can result in large performance loss, which cannot be compensated for by optimising the MAC access mechanism alone. Results show that cross-layer design can achieve consistent and considerable performance gains of up to 20%. It deserves to be exploited in practical design for IEEE 802.11 networks.
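The analytic model itself is not given in the abstract; the toy comparison below (all rates, error probabilities and overheads are assumed) only illustrates why the rate chosen for link-layer throughput can differ from the rate that maximises goodput once a fixed per-frame MAC overhead is taken into account.

# Toy comparison (assumed numbers) of link-layer vs cross-layer rate selection.
# Each entry: PHY rate in Mbit/s and an assumed packet error rate at the current SNR.
candidates = [(54.0, 0.60), (36.0, 0.25), (24.0, 0.02)]
payload_bits = 12000.0
mac_overhead_s = 300e-6      # assumed per-frame contention + header/ACK overhead

def link_throughput(rate, per):
    return (1.0 - per) * rate                      # what a link-only adapter optimises

def mac_throughput(rate, per):
    airtime = payload_bits / (rate * 1e6) + mac_overhead_s
    return (1.0 - per) * payload_bits / airtime    # goodput seen above the MAC

best_link = max(candidates, key=lambda c: link_throughput(*c))
best_mac = max(candidates, key=lambda c: mac_throughput(*c))
print("link-layer choice:", best_link[0], "Mbit/s")   # picks the higher PHY rate
print("cross-layer choice:", best_mac[0], "Mbit/s")   # picks the more reliable rate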

Relevance: 30.00%

Abstract:

In recent decades, a number of sustainable strategies and policies have been created to protect and preserve our water environments from the impacts of growing communities. The Australian approach, Water Sensitive Urban Design (WSUD), defined as the integration of urban planning and design with urban water cycle management, has made considerable advances in design guidelines since 2000. WSUD stormwater management systems (e.g. wetlands, bioretention systems, porous pavements, etc.), also known as Best Management Practices (BMPs) or Low Impact Development (LID), are slowly gaining popularity across Australia, the USA and Europe. There have also been significant improvements in how to model the performance of WSUD technologies (e.g. the MUSIC software). However, the implementation issues of these WSUD practices are mainly related to ongoing institutional capacity. Some of the key problems are associated with the limited awareness of urban planners and designers; in general, they have very little knowledge of these systems and their benefits to urban environments. At the same time, hydrological engineers should have a better understanding of building codes and master plans. The land use regulations are equally as important as the physical site conditions for determining opportunities and constraints for implementing WSUD techniques. There is a need for procedures that can better link urban planners and WSUD engineering practices. Thus, this paper aims to present the development of a general framework for incorporating WSUD technologies into the site planning process. The study was applied at the lot scale in the Melbourne region, Australia. Results show the potential space available for fitting WSUD elements, according to building requirements and different types of housing densities. © 2011 WIT Press.

Relevance: 30.00%

Abstract:

Type IA fiber gratings have unusual physical properties compared with other grating types. We compare the performance characteristics of Type IA and Type I Bragg gratings exposed to Co60 gamma-irradiation. A Bragg peak shift of 190 pm was observed for Type IA gratings written in Fibercore PS-1250/1500 photosensitive fiber at a radiation dose of 116 kGy. This is the largest wavelength shift recorded to date under radiation exposure. The Type IA and Type I gratings show different kinetics under radiation and during post-radiation annealing; this can be exploited for the design of a grating-based dosimetry system. © 2012 SPIE.
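From the quoted figures alone, a back-of-envelope sensitivity can be read off; the short calculation below assumes a linear response purely for illustration (the paper instead exploits the differing kinetics of the two grating types, and the real response may saturate).

# Back-of-envelope sensitivity implied by the quoted figures (linearity is an
# assumption for illustration only; the actual grating response may be nonlinear).
shift_pm = 190.0          # Type IA Bragg peak shift reported
dose_kGy = 116.0          # corresponding gamma dose
sensitivity = shift_pm / dose_kGy
print(f"average sensitivity ~ {sensitivity:.2f} pm/kGy")

# A dosimeter reading would invert this: measured shift -> inferred dose.
measured_shift_pm = 80.0  # hypothetical measurement
print(f"inferred dose ~ {measured_shift_pm / sensitivity:.0f} kGy")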