12 results for Computer System Management
in Aston University Research Archive
Abstract:
One way of describing this thesis is to say that it attempts to explicate the context within which an application of Stafford Beer's Viable System Model (VSM) makes cybernetic sense. The thesis will attempt to explain how such a context is presently not clearly enunciated, and why such a lack hinders communication of the model and its effective take-up by the student or practitioner. The epistemological grounding of the VSM will be described as concerning the ontology of the individuals who apply it and give witness to its application. In describing a particular grounding for the Viable System Model, I am instantiating a methodology which I call a 'hermeneutics of distinction'. The final two chapters explicate such a methodology and consider the implications for the design of a computer system. This thesis is grounded in contemporary insights into the nervous system, and in research into the biology of language and cognition. Its conclusions emerge from a synthesis of the twin discourses of Stafford Beer and Humberto Maturana.
Abstract:
This dissertation investigates the important and current problem of modelling human expertise, an issue that arises in any computer system emulating human decision making. It is particularly prominent in Clinical Decision Support Systems (CDSS) because of the complexity of the induction process and, in most cases, the vast number of parameters. Other issues such as human error and missing or incomplete data present further challenges. In this thesis, the Galatean Risk Screening Tool (GRiST) is used as an example of modelling clinical expertise and parameter elicitation. The tool is a mental health clinical record management system with a top layer of decision support capabilities, and it is currently being deployed by several NHS mental health trusts across the UK. The aim of the research is to investigate the problem of parameter elicitation by inducing the parameters from real clinical data rather than from the human experts who provided the decision model. The induced parameters provide insight both into the data relationships and into how experts make decisions themselves. The outcomes help further the understanding of human decision making and, in particular, help GRiST provide more accurate emulations of risk judgements. Although the algorithms and methods presented in this dissertation are applied to GRiST, they can be adopted for other human knowledge engineering domains.
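As a rough illustration of what inducing parameters from clinical data can look like, the Python sketch below correlates each risk cue with the recorded clinical judgement and normalises the results into relative weights. This is a generic stand-in, not the GRiST algorithm; the cue names, the synthetic records and the correlation-based weighting are all assumptions.

import random, statistics

def pearson(xs, ys):
    # Plain Pearson correlation (avoids requiring Python 3.10's statistics.correlation).
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def induce_weights(records, cues):
    # Correlate each cue with the recorded risk judgement and normalise to
    # obtain relative-importance weights. A simplified illustration only.
    ratings = [r["risk"] for r in records]
    raw = {cue: abs(pearson([r[cue] for r in records], ratings)) for cue in cues}
    total = sum(raw.values())
    return {cue: w / total for cue, w in raw.items()}

# Synthetic records standing in for anonymised clinical data; cue names are hypothetical.
random.seed(0)
cues = ["hopelessness", "previous_attempts", "social_isolation"]
records = []
for _ in range(200):
    r = {c: random.random() for c in cues}
    r["risk"] = (0.5 * r["hopelessness"] + 0.3 * r["previous_attempts"]
                 + 0.2 * r["social_isolation"] + random.gauss(0, 0.05))
    records.append(r)
print(induce_weights(records, cues))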
Abstract:
The proliferation of visual display terminals (VDTs) in offices is an international phenomenon. Numerous studies have investigated the health implications, which can be categorised into visual problems, symptoms of musculo-skeletal discomfort, and psychosocial effects. The psychosocial effects are broader, and the evidence in this area is mixed. The inconsistent results from the studies of VDT work undertaken so far may reflect several methodological shortcomings. In an attempt to overcome these deficiencies and to broaden the scope of inquiry, a model of the inter-relationships was developed to investigate their interactions and the outputs of job satisfaction, stress and ill health. The study was a two-stage, long-term investigation with measures taken before the VDTs were introduced and the same measures taken 12 months after the 'go-live' date. The research was conducted in four offices of the Department of Social Security. The data were analysed for each individual site and, in addition, the combined data were used in a path analysis model. Significant positive relationships were found at the pre-implementation stage between musculo-skeletal discomfort, psychosomatic ailments, visual complaints and stress. Job satisfaction was negatively related to visual complaints and musculo-skeletal discomfort. Direct paths were found for age and job level with the variety found in the job, and for age with job satisfaction, together with a negative relationship with the office environment. The only job characteristic which had a direct path to stress was 'dealing with others'. Similar inter-relationships were found in the post-implementation data. In addition, however, attributes of the computer system, such as screen brightness and glare, were related positively with stress and negatively with job satisfaction. The comparison of the data at the two stages found no significant changes in the users' perceptions of their job characteristics and job satisfaction, but there was a small and significant reduction in the stress measure.
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
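A minimal sketch of the kind of simulation used to evaluate the two control indices named above, Service Level and Average Stock Value, is given below. The (s, Q) reorder rule, the uniform demand model and all parameter values are illustrative assumptions, not details taken from the study.

import random

def simulate_stock(periods=1000, reorder_point=40, order_qty=100,
                   lead_time=3, unit_cost=5.0, seed=1):
    # Toy periodic-review (s, Q) stock simulation returning the two control
    # indices named in the abstract: Service Level and Average Stock Value.
    random.seed(seed)
    stock = 80
    arrivals = {}                       # period -> quantity due to arrive
    demanded = satisfied = 0
    stock_value_sum = 0.0
    for t in range(periods):
        stock += arrivals.pop(t, 0)     # receive any order due this period
        demand = random.randint(0, 20)  # uniform demand, purely illustrative
        demanded += demand
        issued = min(stock, demand)     # unmet demand is lost
        satisfied += issued
        stock -= issued
        on_order = sum(arrivals.values())
        if stock + on_order <= reorder_point:   # simple (s, Q) reorder rule
            arrivals[t + lead_time] = arrivals.get(t + lead_time, 0) + order_qty
        stock_value_sum += stock * unit_cost
    return satisfied / demanded, stock_value_sum / periods

sl, asv = simulate_stock()
print(f"Service Level: {sl:.2%}   Average Stock Value: {asv:.2f}")

In the study itself, the interesting comparisons come from varying the model's theoretical imperfections and the Buyers' interventions and observing how each index responds; the sketch only shows the basic measurement loop.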
Abstract:
This thesis describes an investigation of the effect of elevated temperatures upon the properties of plain concrete containing a siliceous aggregate. The complete stress-strain relationship and creep behaviour are studied. Transient (non-steady state) effects are also examined in order to simulate more realistic conditions. A temperature range of 20-700°C is used, corresponding to the temperatures generally attained during an actual fire. In order to carry out the requisite tests, a stiff compression testing machine was designed and built. Overall control of the test rig is provided by a logger/computer system with purpose-written software, enabling the load to be held constant for any period of time. Before outlining the development of the testing apparatus, which includes an electric furnace and the associated instrumentation, previous work on the properties of both concrete and steel at elevated temperatures is reviewed. The test programme comprises four series of tests: stress-strain tests (with and without pre-load), transient tests (heating to failure under constant stress) and creep tests (constant stress and constant temperature). Three stress levels are examined: 0.2, 0.4 and 0.6 fc. The experimental results show that the properties of concrete are significantly affected by temperature and the magnitude of the load. The slope of the descending branch of the stress-strain curves (strain softening) is found to be temperature dependent. After normalising the data, the stress-strain curves for different temperatures can be represented by a single curve. The creep results are analysed using an approach involving the activation energy, which is found to be constant. The analysis shows that the time-dependent deformation is sensibly linear with the applied stress. The total strain concept is shown to hold for the test data within limits.
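The abstract does not give the exact creep formulation used; activation-energy analyses of this kind are commonly based on an Arrhenius-type rate equation, and a representative (assumed) form consistent with the findings above is

\dot{\varepsilon}_{cr} = A \, \sigma \, \exp\!\left(-\frac{Q}{RT}\right)

where \dot{\varepsilon}_{cr} is the creep strain rate, \sigma the applied stress, T the absolute temperature, R the gas constant, Q the activation energy and A a material constant. A constant Q across the test conditions, and a rate linear in \sigma, match the stated observations that the activation energy is constant and that the time-dependent deformation is sensibly linear with the applied stress.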
Abstract:
In 1974 Dr D M Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines, by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation, and is used to determine the required resources and construction method of the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared with site records collected from current contracts. This showed that the approach gives comparable answers, but that these are greatly affected by the site performance parameters.
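A minimal sketch of the stochastic approach described above: sample a site-efficiency factor for each simulation run and scale a standard operation time by it. The normal distribution and its parameters here are assumptions, not the measured efficiency distributions used in the research.

import random, statistics

def simulate_operation(standard_hours=120.0, runs=2000,
                       eff_mean=0.85, eff_sd=0.15, seed=3):
    # Sample a site-efficiency factor per run and scale the standard time;
    # lower efficiency means a longer actual duration.
    random.seed(seed)
    durations = []
    for _ in range(runs):
        efficiency = max(0.3, random.gauss(eff_mean, eff_sd))  # clamp the low tail
        durations.append(standard_hours / efficiency)
    return statistics.mean(durations), statistics.stdev(durations)

mean_h, sd_h = simulate_operation()
print(f"Expected duration {mean_h:.1f} h (sd {sd_h:.1f} h)")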
Abstract:
Suggests that simulation of the workflow component of a computer supported co-operative work (CSCW) system has the potential to reduce the costs of system implementation while at the same time improving the quality of the delivered system. Demonstrates the value of being able to assess the frequency and volume of workflow transactions through a case study of CSCW software developed for estate agency co-workers, in which a model based on a discrete-event simulation approach was produced and implemented on a spreadsheet platform.
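A minimal sketch of a discrete-event model of one workflow step is given below (in Python rather than on the spreadsheet platform used in the case study). The arrival rate and mean handling time are assumed values, not figures from the estate agency study.

import random

def simulate_workflow(hours=8.0, arrival_rate=12.0, mean_service=0.05, seed=7):
    # Cases arrive at random (Poisson process) and are processed
    # first-come-first-served by a single co-worker.
    random.seed(seed)
    t, arrivals = 0.0, []
    while True:
        t += random.expovariate(arrival_rate)   # exponential inter-arrival times
        if t >= hours:
            break
        arrivals.append(t)
    finish, waits = 0.0, []
    for a in arrivals:
        start = max(a, finish)                   # wait if the worker is still busy
        finish = start + random.expovariate(1.0 / mean_service)
        waits.append(start - a)
    volume = len(arrivals)
    print(f"{volume} transactions, {volume / hours:.1f}/hour, "
          f"mean wait {sum(waits) / volume:.3f} h")

simulate_workflow()

Running the model with different assumed arrival rates is one way to assess the frequency and volume of workflow transactions before committing to an implementation.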
Abstract:
This research was conducted at the Space Research and Technology Centre of the European Space Agency at Noordwijk in the Netherlands. ESA is an international organisation that brings together a range of scientists, engineers and managers from 14 European member states. The motivation for the work was to enable decision-makers, in a culturally and technologically diverse organisation, to share information for the purpose of making decisions that are well informed about the risk-related aspects of the situations they seek to address. The research examined the use of decision support system (DSS) technology to facilitate decision-making of this type. This involved identifying the technology available and its application to risk management. Decision-making is a complex activity that does not lend itself to exact measurement or precise understanding at a detailed level. In view of this, a prototype DSS was developed through which to understand the practical issues to be accommodated and to evaluate alternative approaches to supporting decision-making of this type. The problem of measuring the effect upon the quality of decisions has been approached through expert evaluation of the software developed. The practical orientation of this work was informed by a review of the relevant literature in decision-making, risk management, decision support and information technology. Communication and information technology unite the major themes of this work, allowing the interests of the research to be correlated with European public policy. The principles of communication were also considered in the topic of information visualisation: this emerging technology exploits flexible modes of human-computer interaction (HCI) to improve the cognition of complex data. Risk management is itself an area characterised by complexity, and risk visualisation is advocated for application in this field. The thesis provides recommendations for future work in the fields of decision-making, DSS technology and risk management.
Abstract:
Case studies in copper-alloy rolling mill companies showed that existing planning systems suffer from numerous shortcomings. Where computerised systems are in use, these tend to simply emulate older manual systems and still rely heavily on modification by experienced planners on the shopfloor. As the size and number of orders increase, the task of process planners, while seeking to optimise the manufacturing objectives and keep within the production constraints, becomes extremely complicated because of the number of options for mixing or splitting the orders into batches. This thesis develops a modular approach to computerisation of the production management and planning functions. The full functional specification of each module is discussed, together with practical problems associated with their phased implementation. By adapting the Distributed Bill of Material concept from Material Requirements Planning (MRP) philosophy, the production routes generated by the planning system are broken down to identify the rolling stages required. Then to optimise the use of material at each rolling stage, the system generates an optimal cutting pattern using a new algorithm that produces practical solutions to the cutting stock problem. It is shown that the proposed system can be accommodated on a micro-computer, which brings it into the reach of typical companies in the copper-alloy rolling industry, where profit margins are traditionally low and the cost of widespread use of mainframe computers would be prohibitive.
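The abstract does not describe the new cutting-pattern algorithm itself. As a point of reference, a standard baseline heuristic for the one-dimensional cutting stock problem is first-fit decreasing, sketched below in Python with illustrative order lengths; this is a textbook heuristic, not the algorithm developed in the thesis.

def first_fit_decreasing(order_lengths, coil_length):
    # Place each ordered length (largest first) into the first coil that has
    # enough remaining material; open a new coil when none fits.
    coils = []                                   # remaining length per coil
    pattern = []                                 # cuts assigned to each coil
    for length in sorted(order_lengths, reverse=True):
        for i, remaining in enumerate(coils):
            if length <= remaining:
                coils[i] -= length
                pattern[i].append(length)
                break
        else:                                    # no existing coil fits
            coils.append(coil_length - length)
            pattern.append([length])
    return pattern

# Illustrative orders (metres) cut from 10 m coils.
print(first_fit_decreasing([6, 4, 4, 3, 3, 2, 2, 2], coil_length=10))

Heuristics of this kind run comfortably on a micro-computer, which is in keeping with the abstract's point about keeping the system affordable for copper-alloy rolling companies.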
River basin surveillance using remotely sensed data: a water resources information management system
Abstract:
This thesis describes the development of an operational river basin water resources information management system. The river or drainage basin is the fundamental unit of the system, both in the modelling and prediction of hydrological processes and in the monitoring of the effect of catchment management policies. A primary concern of the study is the collection of sufficient, and sufficiently accurate, information to model hydrological processes. Remote sensing, in combination with conventional point source measurement, can be a valuable source of information, but is often overlooked by hydrologists due to the cost of acquisition and processing. This thesis describes a number of cost-effective methods of acquiring remotely sensed imagery, from airborne video survey to real-time ingestion of meteorological satellite data. Inexpensive micro-computer systems and peripherals are used throughout to process and manipulate the data. Spatial information systems provide a means of integrating these data with topographic and thematic cartographic data, and with historical records. For the system to have any real potential, the data must be stored in a readily accessible format and be easily manipulated within the database. The design of efficient man-machine interfaces and the use of software engineering methodologies are therefore included in this thesis as a major part of the design of the system. The use of low-cost technologies, from micro-computers to video cameras, enables the introduction of water resources information management systems into developing countries, where the potential benefits are greatest.
Abstract:
Recent National Student Surveys revealed that many U.K. university students are dissatisfied with the timeliness and usefulness of the feedback received from their tutors. Ensuring timeliness in marking often results in a reduction in the quality of feedback. In Computer Science, where learning relies on practising and learning from mistakes, feedback that pin-points errors and explains means of improvement is important for a good student learning experience. Though suitable use of Information and Communication Technology should alleviate this problem, existing Virtual Learning Environments and e-Assessment applications such as Blackboard/WebCT, BOSS, MarkTool and GradeMark are inadequate to support a coursework assessment process that promotes timeliness and usefulness of feedback while maintaining consistency in marking involving multiple tutors. We have developed a novel Internet application, called eCAF, for facilitating an efficient and transparent coursework assessment and feedback process. The eCAF system supports detailed editing of marking schemes and enables tutors to use such schemes to pin-point errors in students' work so as to provide helpful feedback efficiently. Tutors can also highlight areas in a submitted work and attach feedback that clearly links to the identified mistakes and the respective marking criteria. In light of the results obtained from a recent trial of eCAF, we discuss how the key features of eCAF may facilitate an effective and efficient coursework assessment and feedback process.