48 results for current research information systems
Abstract:
The recent global 'credit crunch' has brought sharply into focus the need for better understanding of what it takes for organisations to survive. This research seeks to help organisations maintain their 'viability' - the ability to maintain a separate existence and survive on their own. Whilst there are a multitude of factors that contribute to organisational viability, information can be viewed as the lifeblood of organisations. This research increases our understanding of how organisations can manage information effectively to help maintain their viability. The viable systems model (VSM) is an established modelling technique that enables the detailed analysis of organisational activity to examine how the structure and functions performed in an organisation contribute to its 'viability'. The VSM has been widely applied in small and large companies, industries and governments. However, whilst the VSM concentrates on the structure and functions necessary for an organisation to be viable, it pays much less attention to information deployment in organisations. Indeed, the VSM is criticised in the literature for being unable to provide much help with detailed information and communication structures, and new theories are called for to explore the way people interact and what information they need in the VSM. This research analyses qualitative data collected from four case studies to contribute to our understanding of the role that information plays in organisational viability, making three key contributions to the academic literature. In the information management literature, this research provides new insight into the roles that specific information plays in organisations. In the systems thinking literature, this research extends our understanding of the VSM and builds on its powerful diagnostic capability to provide further criteria to aid in the diagnosis of viable organisations. In the information systems literature, this research develops a framework that can be used to help organisations design more effective information systems.
Abstract:
Challenges of returnable transport equipment (RTE) management continue to heighten as the popularity of its usage grows. Logistics companies are investigating the implementation of radio-frequency identification (RFID) technology to alleviate problems such as loss prevention and stock reduction. However, the research within this field is limited and fails to explore in depth the wider network improvements that can be made to optimize the supply chain through efficient RTE management. This paper investigates the nature of RTE network management, building on current research and practices and filling a gap in the literature through the investigation of a product-centric approach in which the paradigms of "intelligent products" and "autonomous objects" are explored. A network-optimizing approach to RTE management is explored, encouraging advanced research development of the RTE paradigm to align academic research with problematic areas in industry. Further research continues with the development of an agent-based software system, ready for application to a real case-study distribution network, producing quantitative results for further analysis. This is pivotal in the endeavor to develop agile support systems, fully utilizing an information-centric environment and encouraging RTE to be viewed as critical network-optimizing tools rather than costly waste.
Abstract:
Information systems have developed to the stage that there is plenty of data available in most organisations, but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current object models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor, COSTed, which is also built on top of the Object Prototyping tool and uses the CORD model to manage its metadata.
Abstract:
Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, they cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. The method uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and using this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
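The count-weight-calibrate pattern described in this abstract can be sketched in a few lines. This is an illustrative outline only: the attribute names and weights below are hypothetical placeholders, not the published JSD-FPA counts or calibration values.

```python
def size_metric(counts, weights):
    """Combine counts of specification attributes into a single size metric.
    Both the attribute names and the weights are illustrative stand-ins."""
    return sum(counts[k] * weights[k] for k in counts)

def estimate_effort(size, past_sizes, past_efforts):
    """Convert a size metric into an effort estimate using average past
    productivity (size units delivered per unit of effort)."""
    productivity = sum(past_sizes) / sum(past_efforts)
    return size / productivity

# Hypothetical JSD specification counts and weights for one project:
counts = {"entities": 12, "actions": 30, "processes": 8}
weights = {"entities": 4, "actions": 1, "processes": 6}
size = size_metric(counts, weights)

# Calibrate against (hypothetical) past projects and predict effort:
effort = estimate_effort(size, past_sizes=[200, 300], past_efforts=[20, 30])
```

The key design point the abstract describes is the separation of sizing (counting specification attributes) from costing (dividing by historically observed productivity), which is what allows the same calibration step to serve both JSD-FPA and JSD-COCOMO style inputs.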
Abstract:
This research thesis is concerned with the human factors aspects of industrial alarm systems within human supervisory control tasks. Typically such systems are located in central control rooms, and the information may be presented via visual display units. The thesis develops a human-centred, rather than engineering-centred, approach to the assessment, measurement and analysis of the situation. A human factors methodology was employed to investigate the human requirements through interviews, questionnaires, observation and controlled experiments. Based on the analysis of current industrial alarm systems in a variety of domains (power generation, manufacturing and coronary care), it is suggested that designers often do not pay due consideration to the human requirements, and that most alarm systems have severe shortcomings in human factors terms. The interviews, questionnaires and observations led to the proposal of 'alarm initiated activities' as a framework for the research to proceed. The framework comprises six main stages: observe, accept, analyse, investigate, correct and monitor. This framework served as a basis for laboratory research into alarm media. Under consideration were speech-based alarm displays and visual alarm displays; non-speech auditory displays were the subject of a literature review. The findings suggest that care needs to be taken when selecting the alarm medium. Ideally it should be chosen to support the task requirements of the operator, rather than being arbitrarily assigned. It was also indicated that there may be some interference between the alarm initiated activities and the alarm media, i.e. information that supports one particular stage of alarm handling may interfere with another.
Abstract:
This thesis deals with the problem of information systems design for corporate management. It shows that the results of applying current approaches to Management Information Systems and Corporate Modelling fully justify a fresh look at the problem. The thesis develops an approach to design based on cybernetic principles and theories. It looks at management as an informational process and discusses the relevance of regulation theory to its practice. The work proceeds around the concept of change and its effects on the organization's stability and survival. The idea of looking at organizations as viable systems is discussed, and a design to enhance survival capacity is developed. Taking Ashby's theory of adaptation and developments on ultrastability as a theoretical framework, and considering conditions for learning and foresight, it deduces that a design should include three basic components: a dynamic model of the organization-environment relationships; a method to spot significant changes in the value of the essential variables and in a certain set of parameters; and a controller able to conceive and change the other two elements and to make choices among alternative policies. Further consideration of the conditions for rapid adaptation in organisms composed of many parts, and of the law of Requisite Variety, determines that successful adaptive behaviour requires a certain functional organization. Beer's model of viable organizations is put in relation to Ashby's theory of adaptation and regulation. The use of the ultrastable system as an abstract unit of analysis permits the development of a rigorous taxonomy of change: it starts by distinguishing between change within behaviour and change of behaviour, completing the classification with organizational change. It relates these changes to the logical categories of learning, connecting the topic of information systems design with that of organizational learning.
Abstract:
Random number generation is a central component of modern information technology, with crucial applications in ensuring communications and information security. The development of new physical mechanisms suitable for directly generating random bit sequences is thus a subject of intense current research, with particular interest in all-optical techniques suitable for the generation of data sequences with high bit rate. One such promising technique that has received much recent attention is the chaotic semiconductor laser system, which produces high-quality random output as a result of the intrinsic nonlinear dynamics of its architecture [1]. Here we propose a novel complementary concept of an all-optical technique that might dramatically increase the generation rate of random bits by simultaneously using multiple spectral channels with uncorrelated signals - somewhat similar to the use of wavelength-division multiplexing in communications. We propose to exploit the intrinsic nonlinear dynamics of extreme spectral broadening and supercontinuum (SC) generation in optical fibre, a process known to be often associated with non-deterministic fluctuations [2]. In this paper, we report proof-of-concept results indicating that the fluctuations in highly nonlinear fibre SC generation can potentially be used for random number generation.
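As a rough illustration of how non-deterministic intensity fluctuations can be turned into random bits, the sketch below thresholds measured samples at their median and then debiases the result with a von Neumann extractor. This is a generic post-processing scheme applied to assumed input samples, not the specific method used in the paper.

```python
def bits_from_samples(samples):
    """Threshold analog fluctuation samples at their median to get raw bits.
    Samples above the median map to 1, the rest to 0."""
    med = sorted(samples)[len(samples) // 2]
    return [1 if s > med else 0 for s in samples]

def von_neumann(bits):
    """Debias raw bits: for each non-overlapping pair, emit the first bit
    of a 01 or 10 pair and discard 00 and 11 pairs."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

# Hypothetical SC intensity measurements, one per spectral channel readout:
raw = bits_from_samples([0.31, 0.92, 0.11, 0.75, 0.44, 0.68])
random_bits = von_neumann(raw)
```

The von Neumann step trades throughput for uniformity, which is one reason schemes that multiply the number of parallel spectral channels, as proposed here, are attractive for raising the net bit rate.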
Abstract:
Target-specific delivery has become an integral area of research in order to increase bioavailability and reduce the toxic effects of drugs. As a drug-delivery option, trigger-release liposomes offer sophisticated targeting and greater controlled-release capabilities. These are broadly divided into two categories: those that utilise the local environment of the target site, where there may be an upregulation in certain enzymes or a change in pH, and those that are triggered by an external physical stimulus such as heat, ultrasound or light. These release mechanisms offer a greater degree of control over when and where the drug is released; furthermore, targeting of diseased tissue is enhanced by the incorporation of target-specific components such as antibodies. This review aims to show the development of such trigger-release liposome systems and the current research in this field.
Abstract:
This edited book is intended for use by students, academics and practitioners who take interest in the outsourcing and offshoring of information technology and business services and processes. The book offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for practitioners, academics and students. The range of topics covered in this book is wide and diverse, and represents both client and supplier perspectives on sourcing of global services. Various aspects related to the decision making process (e.g., asset transfer), learning mechanisms and organizational practices for managing outsourcing relationships are discussed in great depth. Contemporary sourcing models, including cloud services, are examined. Client dependency on the outsourcing provider, and social aspects, such as identity, are discussed in detail. Furthermore, resistance in outsourcing and failures are investigated to derive lessons as to how to avoid them and improve efficiency in outsourcing. Topics discussed in this book combine theoretical and practical insights regarding challenges that both clients and vendors face. Case studies from client and vendor organizations are used extensively throughout the book. Last but not least, the book examines current and future trends in outsourcing and offshoring, placing particular attention on the centrality of innovation in sourcing arrangements, and how innovation can be realized in outsourcing. The book is based on a vast empirical base brought together through years of extensive research by leading researchers in information systems, strategic management and operations.
Abstract:
This edited book is intended for use by students, academics and practitioners who take interest in outsourcing and offshoring of information technology and business processes. The book offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit to students and managers. The range of topics covered in this book is wide and diverse. Various governance and coordination mechanisms for managing outsourcing relationships are discussed in great depth and the decision-making processes and considerations regarding sourcing arrangements, including multi-sourcing and cloud services, are examined. Vendors’ capabilities for managing global software development are studied in depth. Clients’ capabilities and issues related to compliance and culture are also discussed in association with various sourcing models. Topics discussed in this book combine theoretical and practical insights regarding challenges that both clients and vendors face. Case studies from client and vendor organizations are used extensively throughout the book. Last but not least, the book examines current and future trends in outsourcing and offshoring, placing particular attention on the centrality of innovation in sourcing arrangements, and how innovation can be realized in outsourcing. The book is based on a vast empirical base brought together through years of extensive research by leading researchers in information systems, strategic management and operations.
Abstract:
Over recent years, hub-and-spoke distribution techniques have attracted widespread research attention. Despite there being a growing body of literature in this area, there is less focus on the spoke-terminal element of the hub-and-spoke system as a key component in the overall service received by the end-user. Current literature is highly geared towards discussing bulk optimization of freight units rather than the more discrete and individualistic profile characteristics of shared-user less-than-truckload (LTL) freight. In this paper, a literature review is presented to examine the role hub-and-spoke systems play in meeting multi-profile customer demands, particularly in developing sectors with more sophisticated needs, such as retail. The paper also looks at the use of simulation technology as a suitable tool for analyzing spoke-terminal operations within developing hub-and-spoke systems.
Abstract:
Clinical Decision Support Systems (CDSSs) need to disseminate expertise in formats that suit different end users and with functionality tuned to the context of assessment. This paper reports research into a method for designing and implementing knowledge structures that facilitate the required flexibility. A psychological model of expertise is represented using a series of formally specified and linked XML trees that capture increasing elements of the model, starting with hierarchical structuring, incorporating reasoning with uncertainty, and ending with delivering the final CDSS. The method was applied to the Galatean Risk and Safety Tool, GRiST, which is a web-based clinical decision support system (www.egrist.org) for assessing mental-health risks. Results of its clinical implementation demonstrate that the method can produce a system that is able to deliver expertise targeted and formatted for specific patient groups, different clinical disciplines, and alternative assessment settings. The approach may be useful for developing other real-world systems using human expertise and is currently being applied to a logistics domain. © 2013 Polish Information Processing Society.
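To make the idea of hierarchically structured XML knowledge trees concrete, here is a minimal sketch: a tree whose nodes carry weights, evaluated bottom-up as a weighted average of leaf answers. The element names, attributes, and combination rule are invented for illustration; they are not GRiST's actual schema or uncertainty model.

```python
import xml.etree.ElementTree as ET

# Illustrative knowledge-tree fragment: each node has a relative weight
# used when combining its children's values (hypothetical structure).
XML = """
<concept name="risk" weight="1.0">
  <concept name="history" weight="0.6">
    <datum name="previous-episodes" weight="1.0"/>
  </concept>
  <concept name="current-state" weight="0.4">
    <datum name="mood" weight="1.0"/>
  </concept>
</concept>
"""

def evaluate(node, answers):
    """Recursively combine child values as a weighted average; leaf nodes
    read an assessment answer in [0, 1] keyed by their name."""
    children = list(node)
    if not children:
        return answers.get(node.get("name"), 0.0)
    total = sum(float(c.get("weight")) for c in children)
    return sum(float(c.get("weight")) * evaluate(c, answers)
               for c in children) / total

root = ET.fromstring(XML)
score = evaluate(root, {"previous-episodes": 0.8, "mood": 0.5})
```

Keeping the structure, weights, and delivery formatting in declarative XML rather than in code is what lets the same underlying expertise model be re-rendered for different patient groups and assessment settings.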
Abstract:
The primary questions addressed in this paper are the following: what are the factors that affect students' adoption of an e-learning system, and what are the relationships among these factors? This paper investigates and identifies some of the major factors affecting students' adoption of an e-learning system in a university in Jordan. E-learning adoption is approached from the information systems acceptance point of view. This suggests that a prior condition for learning effectively using e-learning systems is that students must actually use them. Thus, a greater knowledge of the factors that affect IT adoption and their interrelationships is a precursor to a better understanding of student acceptance of e-learning systems. In turn, this will help and guide those who develop, implement, and deliver e-learning systems. In this study, an extended version of the Technology Acceptance Model (TAM) was developed to investigate the underlying factors that influence students' decisions to use an e-learning system. The TAM was populated using data gathered from a survey of 486 undergraduate students using the Moodle-based e-learning system at the Arab Open University. The model was estimated using Structural Equation Modelling (SEM). A path model was developed to analyze the relationships between the factors to explain students' adoption of the e-learning system. Whilst the findings support existing literature about prior experience affecting perceptions, they also point to surprising group effects, which may merit future exploration.
Abstract:
Purpose: The purpose of this paper is to investigate enterprise resource planning (ERP) systems development and emerging practices in the management of enterprises (i.e. parts of companies working with parts of other companies to deliver a complex product and/or service) and identify any apparent correlations. Suitable a priori contingency frameworks are then used and extended to explain the apparent correlations, and discussion is provided to guide researchers and practitioners in delivering better strategic, structural and operational competitive advantage through this approach, coined here as the "enterprization of operations". Design/methodology/approach: Theoretical induction uses a new empirical longitudinal case study from Zoomlion (a Chinese manufacturing company), built using an adapted form of template analysis, to produce a new contingency framework. Findings: Three main types of enterprise and three main types of ERP system are defined, and correlations between them are explained. Two relevant a priori frameworks are used to induct a new contingency model to support the enterprization of operations, known as the dynamic enterprise reference grid for ERP (DERG-ERP). Research limitations/implications: The findings are based on one longitudinal case study. Further case studies are currently being conducted in the UK and China. Practical implications: The new contingency model, the DERG-ERP, serves as a guide for ERP vendors, information systems management and operations managers hoping to grow and sustain their competitive advantage with respect to effective enterprise strategy, enterprise structure and ERP systems. Originality/value: This research explains how ERP systems and the effective management of enterprises should develop in order to sustain competitive advantage with respect to enterprise strategy, enterprise structure and ERP systems use. © Emerald Group Publishing Limited.