23 results for user-centered approach

in Aston University Research Archive


Relevance: 90.00%

Abstract:

Purpose: Development and evaluation of a prototype dialogue game for servitization is reported. Design/methodology/approach: This paper reports the design of the iServe game, from user-centered design, through implementation in the Unity game engine, to evaluation, a process which took 270 researcher hours. Findings: No relationship was found between usability and either age or gaming experience. Participants who identified themselves as non-experts in servitization recognized the potential of the game to teach servitization concepts to other novice learners. Originality/value: The potential of business games for education and executive development has been recognized, but factors including high development cost inhibit their uptake. Game engines offer a potential solution.
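
The abstract does not state how the relationship between age, gaming experience and usability was tested; as a hedged illustration only, the sketch below checks for such a relationship with a rank correlation over invented data (the variable names and values are assumptions, not the study's).

    # Hypothetical sketch: testing for a relationship between participant age,
    # gaming experience and a usability score. All data values are invented
    # for illustration; the paper's actual analysis may differ.
    from scipy.stats import spearmanr

    age = [21, 25, 34, 41, 29, 52, 38, 45]          # years (invented)
    gaming_hours = [10, 2, 0, 5, 20, 1, 3, 0]       # hours per week (invented)
    usability = [72, 80, 68, 75, 78, 70, 74, 69]    # usability score (invented)

    for name, predictor in [("age", age), ("gaming experience", gaming_hours)]:
        rho, p = spearmanr(predictor, usability)
        print(f"{name}: rho={rho:.2f}, p={p:.3f}")
        # A non-significant p-value would be consistent with the reported
        # finding of no relationship with usability.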

Relevance: 80.00%

Abstract:

The effectiveness of the strategies employed by the Urban Wildlife Group (UWG, a voluntary conservation organisation) to provide and manage three urban nature parks has been evaluated using a multiple-methods methodology. Where the level of community interest and commitment to a project is high, the utilisation of the community nature park strategy (to maximise benefits to UWG and the community) is warranted. Where the level of interest and commitment of the local community is low, a strategy designed to encourage limited involvement of the community is most effective and efficient. The campaign strategy, whereby the community and UWG take direct action to oppose a threat of undesirable development on a nature park, is assessed to be a sub-strategy rather than a strategy in its own right. Questionnaire surveys and observation studies have revealed that urban people appreciate, and indeed demand, access to nature parks in urban areas, which have similar amenity value to that provided by countryside recreation sites. Urban nature parks are valued for their natural character, natural features (trees, wild flowers), peace and quiet, wildlife and openness. People use these sites for a mixture of informal and mainly passive activities, such as walking and dog walking. They appear to be of particular value to children for physical and imaginative play. The exact input of time and resources that UWG has committed to the projects has depended on the level of input of the local authority. The evidence indicates that the technical expertise needed to produce and manage urban nature parks using a user-oriented approach is not adequately provided by local authorities. The methods used in this research are presented as an 'evaluation kit' that may be used by practitioners and researchers to evaluate the effectiveness of a wide range of different open spaces and the strategies employed to provide and manage them.

Relevance: 80.00%

Abstract:

Enterprise Resource Planning (ERP) projects are strategic and capital intensive, so failure may be costly and may even cause bankruptcy. Previous studies have proposed ways of improving implementation, but they are mostly generic and follow standardized project management practices as specified in various standards (e.g. the Project Management Institute's “project management body of knowledge”). Because ERP is interdisciplinary (involving change management, project management and information technology management), it warrants a customized approach to managing risks throughout the life cycle of implementation and operation. Through a practical case study, this paper demonstrates a qualitative, user-friendly approach to ERP project risk management. First, through a literature review, it identifies various risk factors in ERP implementation. Second, the risk management practices of a UK-based multinational consulting company in one of its clients are evaluated. The risk factors from the case study organization and the literature are then compared and discussed.
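
As an illustration of what a qualitative, user-friendly risk treatment might look like in practice, the following sketch ranks assumed ERP risk factors with a simple likelihood-impact rating; the factors and ratings are invented, not those of the case study.

    # Illustrative sketch only: a minimal qualitative risk register of the kind
    # that could support ERP implementation risk management. The risk factors
    # and ratings are examples, not findings from the paper.
    RATING = {"low": 1, "medium": 2, "high": 3}

    risks = [
        {"factor": "Lack of top management support", "likelihood": "medium", "impact": "high"},
        {"factor": "Inadequate user training", "likelihood": "high", "impact": "medium"},
        {"factor": "Poor data migration quality", "likelihood": "medium", "impact": "high"},
    ]

    for r in risks:
        # Combine the two qualitative ratings into a simple priority score.
        r["score"] = RATING[r["likelihood"]] * RATING[r["impact"]]

    # Rank risks so the highest-scoring factors are addressed first.
    for r in sorted(risks, key=lambda r: r["score"], reverse=True):
        print(f'{r["score"]:>2}  {r["factor"]} ({r["likelihood"]} likelihood, {r["impact"]} impact)')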

Relevance: 80.00%

Abstract:

Purpose – The purpose of this paper is to develop an integrated patient-focused analytical framework to improve quality of care in the accident and emergency (A&E) unit of a Maltese hospital. Design/methodology/approach – The study adopts a case study approach. First, a thorough literature review has been undertaken to study the various methods of healthcare quality management. Second, a healthcare quality management framework has been developed combining quality function deployment (QFD) and the logical framework approach (LFA). Third, the proposed framework has been applied to a Maltese hospital to demonstrate its effectiveness. The proposed framework has six steps, commencing with identifying patients' requirements and concluding with implementing improvement projects. All the steps have been undertaken with the involvement of the concerned stakeholders in the A&E unit of the hospital. Findings – The major problem faced by the hospital under study was overcrowding at A&E, with a related shortage of beds. The combined framework ensures better A&E services and patient flow. QFD identifies and analyses the issues and challenges of A&E, and LFA helps develop project plans for healthcare quality improvement. The important outcomes of implementing the proposed quality improvement programme are fewer hospital admissions, faster patient flow, expert triage and shorter waiting times at the A&E unit. Increased emergency consultant cover and a faster first significant medical encounter were required to start addressing the problems effectively. Overall, the combined QFD and LFA method is effective in addressing quality of care in the A&E unit. Practical implications – The proposed framework can be easily integrated within any healthcare unit, as well as within entire healthcare systems, due to its flexible and user-friendly approach. It could form part of Six Sigma and other quality initiatives. Originality/value – Although QFD has been extensively deployed in healthcare settings to improve quality of care, very little has been researched on combining QFD and LFA to identify issues, prioritise them, derive improvement measures and implement improvement projects. Additionally, there is no research on QFD application in A&E. This paper bridges these gaps. Moreover, very little has been written on the Maltese healthcare system, so this study also contributes a demonstration of quality of emergency care in Malta.
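
To make the QFD step concrete, the sketch below computes priority scores for candidate improvement measures from a small requirements-versus-measures relationship matrix; the requirements, measures, weights and relationship strengths are illustrative assumptions, not the hospital's actual data.

    # Hedged sketch of the QFD step only: weighting patient requirements against
    # candidate improvement measures via a relationship matrix (house-of-quality
    # style). All values are invented for illustration.
    requirements = {                 # patient requirement -> importance weight
        "Short waiting time": 5,
        "Accurate triage": 4,
        "Available beds": 5,
    }
    measures = ["Increase consultant cover", "Expert triage nurse", "Faster bed turnaround"]

    # Relationship strengths (0, 1, 3, 9 as commonly used in QFD matrices).
    matrix = {
        "Short waiting time": [9, 3, 3],
        "Accurate triage":    [3, 9, 0],
        "Available beds":     [1, 0, 9],
    }

    scores = [0] * len(measures)
    for req, weight in requirements.items():
        for j, strength in enumerate(matrix[req]):
            scores[j] += weight * strength

    # Higher scores suggest which measures to prioritise in improvement projects.
    for measure, score in sorted(zip(measures, scores), key=lambda x: -x[1]):
        print(f"{score:>3}  {measure}")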

Relevance: 80.00%

Abstract:

Digital Business Discourse offers a distinctively language- and discourse-centered approach to digitally mediated business and professional communication, providing a timely and comprehensive assessment of the current digital communication practices of today's organisations and workplaces. It is the first dedicated publication to address how computer-mediated communication technologies affect institutional discourse practices, bringing together scholarship from a range of disciplinary backgrounds, including organisational and management studies, rhetorical and communication studies, communication training and discourse analysis. Covering a wide spectrum of communication technologies, such as email, instant messaging, message boards, Twitter, corporate blogs and consumer reviews, the chapters gather research drawing on empirical data from real professional contexts. In this way, the book contributes to both academic scholarship and business communication training, enabling researchers, trainers and practitioners to deepen their understanding of the impact of new communication technologies on professional and corporate communication practices.

Relevance: 80.00%

Abstract:

Digital Business Discourse offers a distinctively language- and discourse-centered approach to digitally mediated business and professional communication, providing a timely and comprehensive assessment of the current digital communication practices of today's organisations and workplaces. It is the first dedicated publication to address how computer-mediated communication technologies affect institutional discourse practices, bringing together scholarship from a range of disciplinary backgrounds, including organisational and management studies, rhetorical and communication studies, communication training and discourse analysis. Covering a wide spectrum of communication technologies, such as email, instant messaging, message boards, Twitter, corporate blogs and consumer reviews, the chapters gather research drawing on empirical data from real professional contexts. In this way, the book contributes to both academic scholarship and business communication training, enabling researchers, trainers and practitioners to deepen their understanding of the impact of new communication technologies on professional and corporate communication practices.

Relevance: 80.00%

Abstract:

The application of pharmacokinetic modelling within the drug development field essentially allows one to develop a quantitative description of the temporal behaviour of a compound of interest at a tissue/organ level, by identifying and defining relationships between a dose of a drug and the dependent variables. In order to understand and characterise the pharmacokinetics of a drug, it is often helpful to employ pharmacokinetic modelling using empirical or mechanistic approaches. Pharmacokinetic models can be developed within mathematical and statistical commercial software such as MATLAB using traditional mathematical and computational coding, or by using the SimBiology Toolbox available within MATLAB for a graphical user interface approach to developing physiologically based pharmacokinetic (PBPK) models. For formulations dosed orally, a prerequisite for clinical activity is the entry of the drug into the systemic circulation.
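
As a minimal sketch of the kind of empirical model referred to above, the following one-compartment model with first-order oral absorption and elimination is solved in Python/SciPy rather than the MATLAB/SimBiology environment described in the abstract; the parameter values (ka, ke, V, dose) are illustrative assumptions only.

    # Minimal sketch: one-compartment pharmacokinetic model with first-order
    # oral absorption and elimination. Parameter values are illustrative.
    import numpy as np
    from scipy.integrate import solve_ivp

    ka, ke, V = 1.0, 0.2, 30.0   # absorption rate (1/h), elimination rate (1/h), volume (L)
    dose = 500.0                 # oral dose (mg)

    def model(t, y):
        gut, central = y
        return [-ka * gut,                 # drug leaving the gut compartment
                ka * gut - ke * central]   # appearing in, then cleared from, plasma

    sol = solve_ivp(model, (0, 24), [dose, 0.0], t_eval=np.linspace(0, 24, 49))
    concentration = sol.y[1] / V           # plasma concentration (mg/L)
    print(f"Cmax ~ {concentration.max():.2f} mg/L at t ~ {sol.t[concentration.argmax()]:.1f} h")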

Relevance: 30.00%

Abstract:

Automatic ontology building is a vital issue in many fields where ontologies are currently built manually. This paper presents a user-centred methodology for ontology construction based on the use of Machine Learning and Natural Language Processing. In our approach, the user selects a corpus of texts and sketches a preliminary ontology (or selects an existing one) for a domain, with a preliminary vocabulary associated with the elements in the ontology (lexicalisations). Examples of sentences involving such lexicalisations (e.g. of the ISA relation) in the corpus are automatically retrieved by the system. Retrieved examples are validated by the user and used by an adaptive Information Extraction system to generate patterns that discover other lexicalisations of the same objects in the ontology, possibly identifying new concepts or relations. New instances are added to the existing ontology or used to tune it. This process is repeated until a satisfactory ontology is obtained. The methodology largely automates the ontology construction process, and the output is an ontology with an associated trained learner to be used for further ontology modifications.
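
A hedged sketch of one step in this loop is given below: retrieving candidate sentences that lexicalise an ISA relation between two ontology terms, for the user to validate. The corpus, pattern and function names are invented for illustration and do not reproduce the project's adaptive Information Extraction system.

    # Illustrative sketch only: find corpus sentences that appear to lexicalise
    # an ISA relation between two ontology terms. Corpus and patterns are invented.
    import re

    corpus = [
        "A limousine is a kind of car used for formal occasions.",
        "The car was parked near the limousine depot.",
        "A hatchback is a type of car with a rear door.",
    ]

    ISA_PATTERNS = [r"{hypo}\s+is\s+a\s+(kind|type)\s+of\s+{hyper}"]

    def find_isa_examples(hyponym, hypernym, sentences):
        """Return sentences that appear to lexicalise `hyponym ISA hypernym`."""
        hits = []
        for pattern in ISA_PATTERNS:
            rx = re.compile(pattern.format(hypo=re.escape(hyponym),
                                           hyper=re.escape(hypernym)), re.I)
            hits.extend(s for s in sentences if rx.search(s))
        return hits

    # Candidate examples would then be shown to the user for validation and fed
    # to the pattern-learning component described in the abstract.
    print(find_isa_examples("limousine", "car", corpus))
    print(find_isa_examples("hatchback", "car", corpus))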

Relevance: 30.00%

Abstract:

The proliferation of data throughout the strategic, tactical and operational areas within many organisations has created a need for the decision maker to be presented with structured information that is appropriate for achieving allocated tasks. However, despite this abundance of data, managers at all levels in the organisation commonly encounter a condition of ‘information overload’ that results in a paucity of the correct information. Specifically, this thesis focuses upon the tactical domain within the organisation and the information needs of management who reside at this level. In doing so, it argues that the link between decision making at the tactical level in the organisation and low-level transaction processing data should be through a common object model that uses a framework based upon knowledge leveraged from co-ordination theory. In order to achieve this, the Co-ordinated Business Object Model (CBOM) was created. The CBOM is a two-tier framework: the first tier models data using four interacting object models, namely processes, activities, resources and actors; the second tier analyses the data captured by the four object models and returns information that can be used to support tactical decision making. In addition, the Co-ordinated Business Object Support System (CBOSS) is a prototype tool that has been developed both to support the CBOM implementation and to demonstrate the functionality of the CBOM as a modelling approach for supporting tactical management decision making. Through a graphical user interface, the system allows the user to create and explore alternative implementations of an identified tactical-level process. In order to validate the CBOM, three verification tests have been completed. The results provide evidence that the CBOM framework helps bridge the gap between low-level transaction data and the information that is used to support tactical-level decision making.
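
As an illustration only, the sketch below models the first tier's four object types (processes, activities, resources, actors) and a second-tier style query over them; the attributes and relationships chosen here are assumptions for illustration, not the CBOM's actual definitions.

    # Hedged sketch of four interacting object types named in the abstract,
    # plus a simple derived query of the kind a second tier might answer.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Actor:
        name: str

    @dataclass
    class Resource:
        name: str

    @dataclass
    class Activity:
        name: str
        performed_by: List[Actor] = field(default_factory=list)
        uses: List[Resource] = field(default_factory=list)

    @dataclass
    class Process:
        name: str
        activities: List[Activity] = field(default_factory=list)

        def actors_involved(self):
            # Derive tactical information (who is involved in this process?)
            # from the low-level objects.
            return {a.name for act in self.activities for a in act.performed_by}

    invoicing = Process("Invoicing", [
        Activity("Raise invoice", [Actor("Clerk")], [Resource("ERP system")]),
        Activity("Approve invoice", [Actor("Finance manager")]),
    ])
    print(invoicing.actors_involved())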

Relevance: 30.00%

Abstract:

This research investigates the general user interface problems in using networked services. Some of the problems are: users have to recall machine names and procedures to invoke networked services; interactions with some of the services are by means of menu-based interfaces which are quite cumbersome to use; and inconsistencies exist between the interfaces for different services because they were developed independently. These problems have to be removed so that users can use the services effectively. A prototype system has been developed to help users interact with networked services. This consists of software which gives the user an easy and consistent interface to the various services. The prototype is based on a graphical user interface and it includes the following applications: Bath Information & Data Services; electronic mail; and a file editor. The prototype incorporates an online help facility to assist users using the system. The prototype can be divided into two parts: the user interface part, which manages interaction with the user, and the communication part, which enables communication with networked services to take place. The implementation is carried out using an object-oriented approach where both the user interface part and the communication part are objects. The essential characteristics of object-orientation (abstraction, encapsulation, inheritance and polymorphism) can all contribute to the better design and implementation of the prototype. The Smalltalk Model-View-Controller (MVC) methodology has been the framework for the construction of the prototype user interface. The purpose of the development was to study the effectiveness of users' interaction with networked services. Having completed the prototype, test users were asked to use the system so that its effectiveness could be evaluated. The evaluation of the prototype is based on observation, i.e. observing the way users use the system, and on the opinion ratings given by the users. Recommendations to improve the prototype further are given based on the results of the evaluation.
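
A minimal sketch of the Model-View-Controller separation underlying the prototype's interface is given below, rendered in Python for illustration (the original work used Smalltalk); the class names and behaviour are assumptions, not the prototype's actual design.

    # Minimal MVC sketch: the model holds state, views observe it, and the
    # controller routes user/network input to the model. Illustrative only.
    class MailModel:
        def __init__(self):
            self.messages, self.observers = [], []

        def add_message(self, text):
            self.messages.append(text)
            for view in self.observers:          # notify dependent views
                view.update(self)

    class MailView:
        def update(self, model):
            print(f"Inbox ({len(model.messages)} messages): {model.messages}")

    class MailController:
        def __init__(self, model):
            self.model = model

        def receive(self, text):                 # user or network input arrives here
            self.model.add_message(text)

    model, view = MailModel(), MailView()
    model.observers.append(view)
    MailController(model).receive("Welcome to the networked mail service")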

Relevance: 30.00%

Abstract:

The traditional waterfall software life cycle model has several weaknesses. One problem is that a working version of a system is unavailable until a late stage in the development; any omissions and mistakes in the specification that remain undetected until that stage can be costly to correct. The operational approach, which emphasises the construction of executable specifications, can help to remedy this problem. An operational specification may be exercised to generate the behaviours of the specified system, thereby serving as a prototype to facilitate early validation of the system's functional requirements. Recent ideas have centred on using an existing operational method such as JSD in the specification phase of object-oriented development. An explicit transformation phase following specification is necessary in this approach because differences in abstraction between the two domains need to be bridged. This research explores an alternative approach: developing an operational specification method specifically for object-oriented development. By incorporating object-oriented concepts in operational specifications, the specifications have the advantage of directly facilitating implementation in an object-oriented language without requiring further significant transformations. In addition, object-oriented concepts can help the developer manage the complexity of the problem domain specification, whilst providing the user with a specification that closely reflects the real world, so that the specification and its execution can be readily understood and validated. A graphical notation has been developed for the specification method which can capture the dynamic properties of an object-oriented system. A tool has also been implemented, comprising an editor to facilitate the input of specifications and an interpreter which can execute the specifications and graphically animate the behaviours of the specified systems.
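
As a hedged illustration of an executable specification, the sketch below expresses an object's permitted life history as a small state machine that can be exercised to animate behaviours; the example and notation are invented and are not the thesis's graphical notation or interpreter.

    # Illustrative sketch only: an operational specification as an executable
    # object whose life history can be exercised to validate behaviours early.
    class LibraryBookSpec:
        # Permitted event sequences: acquire -> (borrow -> return)* -> discard
        TRANSITIONS = {
            ("on_shelf", "borrow"): "on_loan",
            ("on_loan", "return"): "on_shelf",
            ("on_shelf", "discard"): "withdrawn",
        }

        def __init__(self):
            self.state = "on_shelf"

        def fire(self, event):
            nxt = self.TRANSITIONS.get((self.state, event))
            if nxt is None:
                raise ValueError(f"'{event}' not allowed in state '{self.state}'")
            self.state = nxt
            return self.state

    spec = LibraryBookSpec()
    for event in ["borrow", "return", "discard"]:
        print(event, "->", spec.fire(event))      # animate one behaviour of the spec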

Relevance: 30.00%

Abstract:

This work attempts to create a systemic design framework for man-machine interfaces which is self-consistent, compatible with other concepts, and applicable to real situations. This is tackled by examining the current architecture of computer applications packages. The treatment in the main is philosophical and theoretical, and analyses the origins, assumptions and current practice of the design of applications packages. It proposes that the present form of packages is fundamentally contradictory to the notion of packaging itself, because as an indivisible, ready-to-implement solution, current package architecture displays the following major disadvantages. First, it creates problems as a result of user-package interactions, in which the designer tries to mould all potential individual users, no matter how diverse they are, into one model. This is worsened by the minute provision, if any, of important properties such as flexibility, independence and impartiality. Second, it displays a rigid structure that reduces the variety and/or multi-use of the component parts of such a package. Third, it dictates specific hardware and software configurations, which tends to reduce the number of degrees of freedom of its user. Fourth, it increases the dependence of its user upon its supplier through inadequate documentation and understanding of the package. Fifth, it tends to cause a degeneration of the expertise of design of the data processing practitioners. In view of this understanding, an alternative methodological design framework, consistent both with the systems approach and with the role of a package in its likely context, is proposed. The proposition is based upon an extension of the identified concept of the hierarchy of holons, which facilitates the examination of the complex relationships of a package with its two principal environments: first, the user's characteristics and decision-making practices and procedures, implying an examination of the user's M.I.S. network; second, the software environment and its influence upon a package regarding support, control and operation of the package. The framework is built gradually as the discussion advances around the central theme of a compatible M.I.S., software and model design. This leads to the formation of an alternative package architecture based upon the design of a number of independent, self-contained small parts. This is believed to constitute the nucleus around which packages can be more effectively designed, and an approach that is also applicable to the design of many other man-machine systems.

Relevance: 30.00%

Abstract:

The research was carried out in the Aviation Division of Dunlop Limited and was initiated as a search for more diverse uses for carbon/carbon composites. An assumed communication model of adoption was refined by introducing the concept of a two-way search, after making cross-industry comparisons of supplier and consumer behaviour. This research has examined methods of searching for new uses for advanced technology materials. Two broad approaches were adopted. First, a case history approach investigated materials that had been in a similar position to carbon/carbon, to see how other material-producing firms had tackled the problem. Second, a questionnaire survey among industrialists examined: the role and identity of material decision makers in different sized firms; the effectiveness of various information sources and channels; and the material adoption habits of different industries. The effectiveness of selected information channels was further studied by monitoring the response to publicity given to carbon/carbon. A flow chart has been developed from the results of this research which should help any material-producing firm that is contemplating the introduction of a new material to the world market. Further benefit to our understanding of the innovation and adoption of new materials would accrue from work in the following areas: "micro" type case histories; understanding more fully the role of product champions or promoters; investigating the phase difference between incremental and radical type innovations for materials; examining the relationship between the adoption rate of new materials and the advance of technology; studying the development of cost-per-unit-function methods for material selection; and reviewing the benefits that economy of scale studies can have on material developments. These are all suggested areas for further work.

Relevance: 30.00%

Abstract:

Wireless sensor networks have been identified as one of the key technologies for the 21st century. They consist of tiny devices with limited processing and power capabilities, called motes, that can be deployed in large numbers to provide useful sensing capabilities. Even though they are flexible and easy to deploy, there are a number of considerations concerning their fault tolerance, energy conservation and re-programmability that need to be addressed before we draw any substantial conclusions about the effectiveness of this technology. In order to overcome these limitations, we propose a middleware solution. The proposed scheme is composed of two main methods. The first method involves the creation of a flexible communication protocol based on technologies such as mobile code/agents and Linda-like tuple spaces. In this way, every node of the wireless sensor network produces and processes data based on what is best for it, but also for the group to which it belongs. The second method incorporates the above protocol in a middleware that aims to bridge the gap between the application layer and low-level constructs such as the physical layer of the wireless sensor network. A fault-tolerant platform for deploying and monitoring applications in real time offers a number of possibilities to the end user, giving them in parallel the freedom to experiment with various parameters so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through a number of trials aiming to test its merits under real-time conditions and to identify its effectiveness against other similar approaches. Finally, the parameters which determine the characteristics of the proposed scheme are also examined.
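
A hedged sketch of the Linda-like tuple-space idea is given below: nodes coordinate by writing tuples to a shared space and retrieving them by pattern matching. It is a single-process Python illustration, not the proposed middleware or its mobile-agent protocol, and all names and tuples are invented.

    # Illustrative sketch: a minimal Linda-like tuple space with publish (out),
    # non-destructive read (rd) and destructive read (inp) operations.
    class TupleSpace:
        def __init__(self):
            self.tuples = []

        def out(self, tup):                 # publish a tuple
            self.tuples.append(tup)

        def _match(self, tup, pattern):
            # None acts as a wildcard in the pattern.
            return len(tup) == len(pattern) and all(
                p is None or p == v for p, v in zip(pattern, tup))

        def rd(self, pattern):              # read without removing
            return next((t for t in self.tuples if self._match(t, pattern)), None)

        def inp(self, pattern):             # read and remove
            t = self.rd(pattern)
            if t is not None:
                self.tuples.remove(t)
            return t

    space = TupleSpace()
    space.out(("temperature", "mote-7", 21.5))          # a sensing mote publishes a reading
    print(space.rd(("temperature", None, None)))        # another node reads any temperature tuple
    print(space.inp(("temperature", "mote-7", None)))   # consume that reading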

Relevance: 30.00%

Abstract:

This paper describes the work undertaken in the Scholarly Ontologies Project. The aim of the project has been to develop a computational approach to support scholarly sensemaking, through interpretation and argumentation, enabling researchers to make claims: to describe and debate their view of a document's key contributions and relationships to the literature. The project has investigated the technicalities and practicalities of capturing conceptual relations, within and between conventional documents, in terms of abstract ontological structures. In this way, we have developed a new kind of index to distributed digital library systems. This paper reports a case study undertaken to test the sensemaking tools developed by the Scholarly Ontologies Project. The tools used were ClaiMapper, which allows the user to sketch argument maps of individual papers and their connections; ClaiMaker, a server on which such models can be stored and which provides interpretative services to assist the querying of argument maps across multiple papers; and ClaimFinder, a novice interface to the search services in ClaiMaker.
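
As an illustration only, the sketch below represents claims as typed relations between documents and queries them, in the spirit of the argument maps described above; the relation names and papers are invented and this is not the ClaiMaker data model.

    # Illustrative sketch: scholarly claims as (source, relation, target) triples
    # over documents, with a simple query over the resulting claim map.
    claims = [
        ("paper-A", "supports", "paper-B"),
        ("paper-C", "challenges", "paper-B"),
        ("paper-A", "uses-method-of", "paper-D"),
    ]

    def claims_about(doc, relation=None):
        """Find claims that target `doc`, optionally filtered by relation type."""
        return [(s, r, t) for s, r, t in claims
                if t == doc and (relation is None or r == relation)]

    print(claims_about("paper-B"))                 # everything claimed about paper-B
    print(claims_about("paper-B", "challenges"))   # only the challenges to it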