27 results for 080302 Computer System Architecture

in Aston University Research Archive


Relevance: 100.00%

Abstract:

The process framework comprises three phases: scope the supply chain/network; identify the options for supply system architecture; and select the supply system architecture. It provides a structured approach that analyses the contextual characteristics of the supply chain/network in order to ensure alignment with the appropriate supply system architecture. The process framework was derived from a comprehensive literature review and archival case study analysis. The review led to the classification of supply system architectures according to their orientation: integrated, partially integrated, co-ordinated or independent. This classification was combined with the characteristics that influence the selection of supply system architecture to form the conceptual framework. It builds upon existing frameworks and methodologies by focusing on a structured procedure, supporting project management, facilitating participation and clarifying the point of entry. The process framework was initially tested in three case study applications from the food, automobile and hand tool industries; a variety of industrial settings was chosen to illustrate transferability. The case study applications indicate that the process framework is a valid approach to the problem, although further testing is required. In particular, the use of group support system technologies to support the process, and the steps involving the participation of software vendors, need further testing. Nevertheless, the process framework can be followed easily owing to the clarity of its presentation. It addresses the issue of timing by including alternative decision-making techniques, dependent on the constraints. It is useful for ensuring that a sound business case is developed, with supporting documentation and analysis that identifies the strategic and functional requirements of the supply system architecture.
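
As a rough illustration of the ideas above, the sketch below encodes the framework's four architecture orientations and three phases as Python data; the selection rule shown is invented for illustration and is not the framework's actual decision procedure.

```python
from enum import Enum

# Illustrative sketch only: the framework's classification of supply
# system architectures by orientation, and its three phases, as data.
# The selection rule below is invented, not the framework's.

class Orientation(Enum):
    INTEGRATED = "integrated"
    PARTIALLY_INTEGRATED = "partially integrated"
    CO_ORDINATED = "co-ordinated"
    INDEPENDENT = "independent"

PHASES = [
    "scope the supply chain/network",
    "identify the options for supply system architecture",
    "select supply system architecture",
]

def select_architecture(characteristics: dict) -> Orientation:
    # Toy alignment rule mapping contextual characteristics to an
    # orientation (hypothetical keys, for illustration only).
    if characteristics.get("shared_ownership"):
        return Orientation.INTEGRATED
    if characteristics.get("long_term_contracts"):
        return Orientation.PARTIALLY_INTEGRATED
    if characteristics.get("information_sharing"):
        return Orientation.CO_ORDINATED
    return Orientation.INDEPENDENT

print(select_architecture({"information_sharing": True}))
```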

Relevance: 100.00%

Abstract:

Recent technological advances have paved the way for developing and offering advanced services to stakeholders in the agricultural sector. A paradigm shift is under way from proprietary, monolithic tools to Internet-based, cloud-hosted, open systems that will enable more effective collaboration between stakeholders. This new paradigm includes technological support for application developers to create specialised services that seamlessly interoperate, creating a sophisticated and customisable working environment for end users. We present the implementation of an open architecture that instantiates such an approach, based on a set of domain-independent software tools called 'generic enablers' that have been developed in the context of the FI-WARE project. The implementation is used to validate a number of innovative concepts for the agricultural sector, such as the notion of a services marketplace and the system's adaptation to network failures. During the design and implementation phase, the system was evaluated by end users, offering us valuable feedback. The results of the evaluation process confirm the acceptance of such a system and farmers' need for access to sophisticated services at affordable prices. A summary of this evaluation process is also presented in this paper.
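
To make the "generic enabler" idea concrete, here is a minimal sketch of how a farm sensor service might publish a reading to an NGSI-style context broker of the kind FI-WARE provides; the broker URL, entity identifiers and attribute layout are assumptions for illustration, not the paper's actual deployment.

```python
import requests

# Hedged sketch: publish a sensor reading to an NGSI-style context
# broker (a FI-WARE generic enabler). URL, entity id/type and the
# attribute layout are assumptions for illustration.
BROKER = "http://broker.example.org:1026/v2/entities"  # hypothetical

reading = {
    "id": "urn:farm:plot-7:soil-probe-1",   # invented identifier
    "type": "SoilProbe",
    "moisture": {"value": 23.4, "type": "Number"},
    "temperature": {"value": 17.8, "type": "Number"},
}

resp = requests.post(BROKER, json=reading, timeout=5)
resp.raise_for_status()   # 201 Created on success
```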

Relevance: 100.00%

Abstract:

In view of the increasing complexity of services logic and functional requirements, a new system architecture based on SOA was proposed for an equipment remote monitoring and diagnosis system. Following the design principles of SOA, the services logic and functional requirements of the remote monitoring and diagnosis system were divided into different levels and granularities, and a loosely coupled web services system was built. The design and implementation of the core function modules of the proposed architecture are presented. A demonstration system was used to validate the feasibility of the proposed architecture.
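
A minimal sketch of what one such loosely coupled service could look like, assuming a REST realisation in Python/Flask (the paper itself does not specify this stack); the service name, endpoint and thresholds are invented.

```python
from flask import Flask, jsonify, request

# Minimal sketch, assuming a REST realisation of one loosely coupled
# service in such an SOA: a diagnosis service that other services
# (data acquisition, alarms, reporting) would call over HTTP.
# Names and thresholds are invented for illustration.
app = Flask(__name__)

THRESHOLDS = {"vibration_mm_s": 7.1, "bearing_temp_c": 95.0}

@app.route("/diagnosis", methods=["POST"])
def diagnose():
    readings = request.get_json()
    # flag every monitored quantity that exceeds its limit
    faults = [k for k, limit in THRESHOLDS.items()
              if readings.get(k, 0) > limit]
    return jsonify({"equipment": readings.get("equipment_id"),
                    "status": "fault" if faults else "normal",
                    "faults": faults})

if __name__ == "__main__":
    app.run(port=8080)
```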

Relevance: 100.00%

Abstract:

This second issue of Knowledge Management Research & Practice (KMRP) continues the international nature of the first issue, with papers from authors based in four different continents. There are five regular papers, plus the first of what is intended to be an occasional series of 'position papers' from respected figures in the knowledge management field who have specific issues they wish to raise from a personal standpoint. The first two regular papers are both based on case studies. The first is 'Aggressively pursuing knowledge management over two years: a case study at a US government organization' by Jay Liebowitz. Liebowitz is well known to both academics and practitioners as an author on knowledge management and knowledge-based systems. Government departments in many Western countries must soon face up to the problems that will occur as the 'baby boomer' generation reaches retirement age over the next decade. This paper describes how one particular US government organization has attempted to address this situation (and others) through the introduction of a knowledge management initiative. The second case study paper is 'Knowledge creation through the synthesizing capability of networked strategic communities: case study on new product development in Japan' by Mitsuru Kodama. This paper looks at the importance in knowledge management of strategic communities - communities that have strategic relevance and support. Here, the case study organization is Nippon Telegraph and Telephone Corporation (NTT), a Japanese telecommunications firm. The third paper is 'Knowledge management and intellectual capital: an empirical examination of current practice in Australia' by Albert Zhou and Dieter Fink. This paper reports the results of a survey carried out in 2001, exploring the practices relating to knowledge management and intellectual capital in Australia and the relationship between them. The remaining two regular papers are conceptual in nature. The fourth is 'The enterprise knowledge dictionary' by Stuart Galup, Ronald Dattero and Richard Hicks. Galup, Dattero and Hicks propose the concept of an enterprise knowledge dictionary, and an associated knowledge management system architecture, as offering the appropriate form of information technology to support different types of knowledge sources while behaving as a single source from the user's viewpoint. The fifth and final regular paper is 'Community of practice and metacapabilities' by Geri Furlong and Leslie Johnson. This paper looks at the role of communities of practice in learning in organizations. Its emphasis is on metacapabilities - the properties required to learn, develop and apply skills. This discussion takes work on learning and core competences to a higher level. Finally, this issue includes a position paper, 'Innovation as an objective of knowledge management. Part I: the landscape of management' by Dave Snowden. Snowden has been highly visible in the knowledge management community thanks to his role as the Director of IBM Global Services' Canolfan Cynefin Centre. He has helped many government and private sector organizations to consider their knowledge management problems and strategies. This, the first part of a two-part paper, is inspired by the notion of complexity. In it, Snowden calls for what he sees as a 20th century emphasis on designed systems for knowledge management to be consigned to history, and replaced by a 21st century emphasis on emergence.
Letters to the editor on this, or any other topic related to knowledge management research and practice, are welcome. We trust that you will find the contributions stimulating, and again invite you to contribute your own paper(s) to future issues of KMRP.

Relevance: 100.00%

Abstract:

The proliferation of visual display terminals (VDTs) in offices is an international phenomenon. Numerous studies have investigated the health implications, which can be categorised into visual problems, symptoms of musculo-skeletal discomfort, and psychosocial effects. The psychosocial effects are broader, and the evidence in this area is mixed. The inconsistent results from the studies of VDT work so far undertaken may reflect several methodological shortcomings. In an attempt to overcome these deficiencies and to broaden the model of inter-relationships, a model was developed to investigate the interactions between these factors and the outputs of job satisfaction, stress and ill health. The study was a two-stage, long-term investigation, with measures taken before the VDTs were introduced and the same measures taken 12 months after the 'go-live' date. The research was conducted in four offices of the Department of Social Security. The data were analysed for each individual site, and in addition the combined data were used in a path analysis model. Significant positive relationships were found at the pre-implementation stage between musculo-skeletal discomfort, psychosomatic ailments, visual complaints and stress. Job satisfaction was negatively related to visual complaints and musculo-skeletal discomfort. Direct paths were found from age and job level to the variety found in the job, and from age to job satisfaction, which was negatively related to the office environment. The only job characteristic with a direct path to stress was 'dealing with others'. Similar inter-relationships were found in the post-implementation data; in addition, however, attributes of the computer system, such as screen brightness and glare, were related positively to stress and negatively to job satisfaction. Comparison of the data at the two stages found no significant changes in the users' perceptions of their job characteristics and job satisfaction, but there was a small and significant reduction in the stress measure.

Relevance: 100.00%

Abstract:

One way of describing this thesis is to state that it attempts to explicate the context within which an application of Stafford Beer's Viable System Model (VSM) makes cybernetic sense. The thesis will attempt to explain how such a context is presently not clearly enunciated, and why such a lack hinders communication of the model, together with its consequent effective take-up by the student or practitioner. The epistemological grounding of the VSM will be described as concerning the ontology of the individuals who apply it and give witness to its application. In describing a particular grounding for the Viable System Model, I am instantiating a methodology which I call a 'hermeneutics of distinction'. The final two chapters explicate such a methodology, and consider the implications for the design of a computer system. This thesis is grounded in contemporary insights into the nervous system, and in research into the biology of language and cognition. Its conclusions emerge from a synthesis of the twin discourses of Stafford Beer and Humberto Maturana.

Relevance: 100.00%

Abstract:

Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
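
As a rough sketch of the kind of system under study (not the thesis's actual model), the following simulates a single-item reorder-point policy and measures the two control indices mentioned above; all parameter values are invented.

```python
import random

# Illustrative sketch only: a single-item reorder-point / order-quantity
# policy, measuring Service Level (fraction of demand met from stock)
# and Average Stock Value. All parameters are invented.

def simulate(reorder_point=40, order_qty=100, lead_time=3,
             unit_cost=5.0, days=1000, seed=1):
    rng = random.Random(seed)
    stock = order_qty
    pipeline = []                 # (arrival_day, qty) of open orders
    met, demanded, stock_values = 0, 0, []
    for day in range(days):
        # deliveries arriving today
        stock += sum(q for d, q in pipeline if d == day)
        pipeline = [(d, q) for d, q in pipeline if d != day]
        # daily demand; unmet demand is lost (a modelling assumption)
        demand = rng.randint(0, 20)
        sold = min(stock, demand)
        stock -= sold
        met += sold
        demanded += demand
        # reorder when the inventory position falls to the reorder point
        position = stock + sum(q for _, q in pipeline)
        if position <= reorder_point:
            pipeline.append((day + lead_time, order_qty))
        stock_values.append(stock * unit_cost)
    return met / demanded, sum(stock_values) / days

print(simulate())   # (service_level, average_stock_value)
```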

Relevance: 100.00%

Abstract:

This thesis describes an investigation of the effect of elevated temperatures upon the properties of plain concrete containing a siliceous aggregate. The complete stress-strain relationship and creep behaviour are studied. Transient (non-steady-state) effects are also examined in order to simulate more realistic conditions. A temperature range of 20-700ºC is used, corresponding to the temperatures generally attained during an actual fire. In order to carry out the requisite tests, a stiff compression testing machine was designed and built. Overall control of the test rig is provided by a logger/computer system running purpose-written software, enabling the load to be held constant for any period of time. Before any details of the development of the testing apparatus, which includes an electric furnace and the associated instrumentation, are outlined, previous work on the properties of both concrete and steel at elevated temperatures is reviewed. The test programme comprises four series of tests: stress-strain tests (with and without pre-load), transient tests (heating to failure under constant stress) and creep tests (constant stress and constant temperature), with three stress levels examined: 0.2, 0.4 and 0.6 fc. The experimental results show that the properties of concrete are significantly affected by temperature and by the magnitude of the load. The slope of the descending branch of the stress-strain curves (strain softening) is found to be temperature dependent. After normalising the data, the stress-strain curves for different temperatures can be represented by a single curve. The creep results are analysed using an approach involving the activation energy, which is found to be constant. The analysis shows that the time-dependent deformation is sensibly linear with the applied stress. The total strain concept is shown to hold for the test data within limits.
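
The activation-energy analysis mentioned above is commonly expressed through an Arrhenius-type rate law; the sketch below illustrates that general approach with invented numbers and is not the thesis's actual formulation.

```python
import math

# Hedged sketch: an Arrhenius-type creep rate law, linear in stress
# (as the abstract reports), with a constant activation energy Q.
# The form and all numbers are illustrative, not the thesis's own.
R = 8.314                     # gas constant, J/(mol.K)

def creep_rate(stress_ratio, temp_c, A=1e-3, Q=80e3):
    """Rate ~ A * stress * exp(-Q / (R*T))."""
    T = temp_c + 273.15
    return A * stress_ratio * math.exp(-Q / (R * T))

# recovering the activation energy from rates at two temperatures:
r1, r2 = creep_rate(0.4, 300), creep_rate(0.4, 500)
T1, T2 = 300 + 273.15, 500 + 273.15
Q_est = R * math.log(r2 / r1) / (1 / T1 - 1 / T2)
print(Q_est)                  # recovers Q = 80e3 J/mol
```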

Relevance: 100.00%

Abstract:

In 1974 Dr D M Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work, as part of their continuing research programme into the design and construction of buried pipelines, by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesises the standard time of each drainage operation, and is used to determine the required resources and construction method for the total drainage activity. The remainder of the research addressed the general topic of construction efficiency. An on-line, command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared with site records collected from current contracts. This showed that the approach gives comparable answers, but that these are greatly affected by the site performance parameters.
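
A minimal sketch of a stochastic simulation in this spirit, drawing weekly site efficiency from a distribution with a mild downward trend over the contract; the distribution, crew size and all numbers are invented for illustration.

```python
import random
import statistics

# Illustrative sketch only: Monte Carlo simulation of contract duration
# driven by a distribution of weekly site efficiency, with the mild
# downward trend over contract duration that the study identified.
# Distribution shape and all parameters are invented.

def simulate_contract(work_content_hours=10_000, runs=2_000, seed=7):
    rng = random.Random(seed)
    durations = []
    for _ in range(runs):
        worked, week = 0.0, 0
        while worked < work_content_hours:
            week += 1
            # weekly efficiency: noisy, slowly declining with duration
            eff = max(0.1, rng.gauss(0.85 - 0.002 * week, 0.10))
            worked += 40 * 25 * eff        # 25 operatives, 40 h/week
        durations.append(week)
    return statistics.mean(durations), statistics.stdev(durations)

print(simulate_contract())   # (mean weeks, spread across runs)
```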

Relevance: 100.00%

Abstract:

This dissertation investigates the very important and current problem of modelling human expertise, an issue that arises in any computer system emulating human decision making. It is particularly prominent in Clinical Decision Support Systems (CDSS) because of the complexity of the induction process and the vast number of parameters involved in most cases. Other issues, such as human error and missing or incomplete data, present further challenges. In this thesis, the Galatean Risk Screening Tool (GRiST) is used as an example of modelling clinical expertise and parameter elicitation. The tool is a mental health clinical record management system with a top layer of decision support capabilities. It is currently being deployed by several NHS mental health trusts across the UK. The aim of the research is to investigate the problem of parameter elicitation by inducing the parameters from real clinical data, rather than obtaining them from the human experts who provided the decision model. The induced parameters provide insight both into the data relationships and into how experts make decisions themselves. The outcomes help further the understanding of human decision making and, in particular, help GRiST provide more accurate emulations of risk judgements. Although the algorithms and methods presented in this dissertation are applied to GRiST, they can be adopted in other human knowledge engineering domains.
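
As a generic illustration of inducing a parameter from data rather than eliciting it from experts (and not GRiST's actual algorithm), the sketch below derives an influence weight for a binary clinical cue as a log-odds ratio from toy labelled records.

```python
import math
from collections import Counter

# Generic illustration only - not GRiST's actual algorithm: induce a
# simple influence weight for a binary cue from labelled records, as a
# smoothed log-odds ratio, instead of eliciting it from experts.
records = [   # (cue_present, high_risk) - toy data, invented
    (True, True), (True, True), (True, False),
    (False, False), (False, False), (False, True),
]

def log_odds_weight(records, smoothing=0.5):
    c = Counter(records)
    # P(high risk | cue present) and P(high risk | cue absent)
    p1 = (c[(True, True)] + smoothing) / (
        c[(True, True)] + c[(True, False)] + 2 * smoothing)
    p0 = (c[(False, True)] + smoothing) / (
        c[(False, True)] + c[(False, False)] + 2 * smoothing)
    return math.log(p1 / (1 - p1)) - math.log(p0 / (1 - p0))

print(log_odds_weight(records))   # positive => cue raises risk
```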

Relevance: 100.00%

Abstract:

In-Motes is a mobile agent middleware that provides an intelligent framework for deploying applications in Wireless Sensor Networks (WSNs). It is based on injecting into the network mobile agents that can migrate or clone according to specific rules and perform application-specific tasks. In this way, each mote is given a certain degree of perception, cognition and control, forming the basis for its intelligence. Our middleware incorporates technologies such as Linda-like tuplespaces and a federated system architecture in order to obtain a high degree of collaboration and coordination within the agent society. A set of behavioural rules inspired by a community of bacterial strains is also generated as the means of ensuring the robustness of the WSN. In this paper, we present In-Motes and provide a detailed evaluation of its implementation for MICA2 motes.
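
Since Linda-like tuplespaces are central to the agent coordination described above, here is a minimal generic sketch of one (not In-Motes' actual API): agents communicate by writing tuples with out() and matching them with blocking rd()/in_() operations, where None acts as a wildcard.

```python
import threading

# Minimal generic sketch of a Linda-like tuplespace, of the kind
# In-Motes uses for agent coordination (not In-Motes' actual API).
class TupleSpace:
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):                    # publish a tuple
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def _match(self, pattern):             # None fields are wildcards
        for t in self._tuples:
            if len(t) == len(pattern) and all(
                    p is None or p == v for p, v in zip(pattern, t)):
                return t
        return None

    def rd(self, pattern):                 # blocking, non-destructive read
        with self._cond:
            while (t := self._match(pattern)) is None:
                self._cond.wait()
            return t

    def in_(self, pattern):                # blocking read-and-remove
        with self._cond:
            while (t := self._match(pattern)) is None:
                self._cond.wait()
            self._tuples.remove(t)
            return t

space = TupleSpace()
space.out(("temperature", "mote-12", 21.5))
print(space.rd(("temperature", None, None)))
```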

Relevance: 100.00%

Abstract:

Purpose: This paper presents the system architecture of a serious game that is to be run in parallel with Rolls-Royce training on product-service systems (PSS). Design/methodology/approach: The original game is outlined, requirements for an online version are defined, and the architecture is proposed. Findings: The games approach has proven its value in design-for-service training, but an online version is needed to improve the opportunities to deliver the game. Originality/value: Such a system presents opportunities for the acquisition and development of specific professional knowledge, skills, and competencies.

Relevance: 50.00%

Abstract:

Collaborative working with the aid of computers is increasing rapidly due to the widespread use of computer networks, the geographic mobility of people, and small powerful personal computers. For the past ten years, research has been conducted into this use of computing technology from a wide variety of perspectives and for a wide range of uses. This thesis adds to that previous work by examining the area of collaborative writing amongst groups of people. The research brings together a number of disciplines: sociology for examining group dynamics, psychology for understanding individual writing and learning processes, and computer science for database, networking, and programming theory. The project initially looks at groups and how they form, communicate, and work together, before progressing to writing and the cognitive processes it entails for both composition and retrieval. The thesis then details a set of issues which need to be addressed in a collaborative writing system. These issues are addressed by developing a model for collaborative writing, comprising an iterative process of co-ordination, writing and annotation, consolidation, and negotiation, based on a structured but extensible document model. Implementation issues for a collaborative application are then described, along with various methods of overcoming them. Finally, the design and implementation of a collaborative writing system, named Collaborwriter, is described in detail, concluding with some preliminary results from initial user trials and testing.
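
A hedged sketch of a structured but extensible document model of the kind described above, with section ownership for co-ordination and per-node annotations; all names and fields are invented for illustration, not Collaborwriter's actual design.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hedged sketch of a structured, extensible document model: a tree of
# sections, each with an owner (co-ordination) and annotations.
# Names and fields are invented, not Collaborwriter's actual design.

@dataclass
class Annotation:
    author: str
    text: str

@dataclass
class Node:
    title: str
    body: str = ""
    owner: Optional[str] = None               # section ownership
    annotations: list = field(default_factory=list)
    children: list = field(default_factory=list)

doc = Node("Report", children=[
    Node("Introduction", owner="alice"),
    Node("Method", owner="bob"),
])
# annotation phase: a co-author comments on someone else's section
doc.children[1].annotations.append(
    Annotation("alice", "Consolidate with my draft before negotiation."))
print(doc.children[1])
```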

Relevance: 50.00%

Abstract:

This research was conducted at the Space Research and Technology Centre of the European Space Agency (ESA) at Noordwijk in the Netherlands. ESA is an international organisation that brings together a range of scientists, engineers and managers from 14 European member states. The motivation for the work was to enable decision-makers, in a culturally and technologically diverse organisation, to share information for the purpose of making decisions that are well informed about the risk-related aspects of the situations they seek to address. The research examined the use of decision support system (DSS) technology to facilitate decision-making of this type. This involved identifying the technology available and its application to risk management. Decision-making is a complex activity that does not lend itself to exact measurement or precise understanding at a detailed level. In view of this, a prototype DSS was developed through which to understand the practical issues to be accommodated and to evaluate alternative approaches to supporting decision-making of this type. The problem of measuring the effect upon the quality of decisions was approached through expert evaluation of the software developed. The practical orientation of this work was informed by a review of the relevant literature in decision-making, risk management, decision support and information technology. Communication and information technology unite the major themes of this work, allowing the interests of the research to be correlated with European public policy. The principles of communication were also considered in the topic of information visualisation, an emerging technology that exploits flexible modes of human-computer interaction (HCI) to improve the cognition of complex data. Risk management is itself an area characterised by complexity, and risk visualisation is advocated for application in this field of endeavour. The thesis provides recommendations for future work in the fields of decision-making, DSS technology and risk management.

Relevance: 40.00%

Abstract:

Issues of wear and tribology are increasingly important in computer hard drives, as slider flying heights become lower and disk protective coatings thinner to minimise spacing loss and allow higher areal density. Friction, stiction and wear between the slider and disk in a hard drive were studied using Accelerated Friction Test (AFT) apparatus. Contact Start Stop (CSS) and constant-speed drag tests were performed using commercial rigid disks and two different air bearing slider types. Friction and stiction were captured during testing by a set of strain gauges. System parameters were varied to investigate their effect on tribology at the head/disk interface. The chosen parameters were disk spinning velocity, slider fly height, temperature, humidity and intercycle pause. The effect of different disk texturing methods was also studied. Models were proposed to explain the influence of these parameters on tribology. Atomic Force Microscopy (AFM) and Scanning Electron Microscopy (SEM) were used to study head and disk topography at various test stages and to provide physical parameters with which to verify the models. X-ray Photoelectron Spectroscopy (XPS) was employed to identify surface composition and determine whether any chemical changes had occurred as a result of testing. The parameters most likely to influence the interface were identified for both CSS and drag testing, and neural network modelling was used to substantiate the results. Topographical AFM scans of disk and slider were exported numerically to file and explored extensively, and techniques were developed which improved line and area analysis. A method for detecting surface contacts was also deduced; its results supported and explained the observed AFT behaviour. Finally, surfaces were computer-generated to simulate real disk scans, allowing contact analysis of many types of surface to be performed. Conclusions were drawn about which disk characteristics most affected contacts, and hence friction, stiction and wear.
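
As a rough illustration of the final step described above (and not the study's actual method), the sketch below computer-generates a random rough surface and counts the asperity contacts a slider at a given fly height would make; the roughness model and all numbers are invented.

```python
import random

# Illustrative sketch only: generate a synthetic rough surface (as the
# study did to mimic real AFM disk scans) and count the asperity
# contacts a slider at a given fly height would make.
# Roughness model and numbers are invented.

def make_surface(n=256, sigma=2.0, seed=3):
    """Heights in nm: Gaussian noise with crude neighbour smoothing
    so that peaks have some lateral extent."""
    rng = random.Random(seed)
    raw = [[rng.gauss(0, sigma) for _ in range(n)] for _ in range(n)]
    return [[(raw[i][j] + raw[i - 1][j] + raw[i][j - 1]) / 3
             for j in range(n)] for i in range(n)]

def count_contacts(surface, fly_height_nm):
    # a grid point is a "contact" if the surface reaches the slider
    return sum(h > fly_height_nm for row in surface for h in row)

surface = make_surface()
for fh in (3.0, 5.0, 7.0):       # lower fly height => more contacts
    print(fh, count_contacts(surface, fh))
```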