59 results for Management information systems -- TFG


Relevance: 100.00%

Abstract:

Methodologies for understanding business processes and their information systems (IS) are often criticized, either for being too imprecise and philosophical (a criticism often levied at softer methodologies) or too hierarchical and mechanistic (levied at harder methodologies). The process-oriented holonic modelling methodology combines aspects of softer and harder approaches to aid modellers in designing business processes and associated IS. The methodology uses holistic thinking and a construct known as the holon to build process descriptions into a set of models known as a holarchy. This paper describes the methodology through an action research case study based in a large design and manufacturing organization. The scientific contribution is a methodology for analysing business processes in environments that are characterized by high complexity, low volume and high variety where there are minimal repeated learning opportunities, such as large IS development projects. The practical deliverables from the project gave IS and business process improvements for the case study company.
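
The abstract does not prescribe a data structure for holons, so the following is only a rough sketch, in Python, of how a holarchy (a nested set of holons, each both a whole and a part of a larger whole) might be represented; all class, field and process names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Holon:
    """A process description that is both a whole and a part of a larger whole."""
    name: str
    purpose: str = ""
    sub_holons: List["Holon"] = field(default_factory=list)

    def add(self, child: "Holon") -> "Holon":
        self.sub_holons.append(child)
        return child

    def walk(self, depth: int = 0):
        """Yield (depth, holon) pairs for every level of the holarchy."""
        yield depth, self
        for child in self.sub_holons:
            yield from child.walk(depth + 1)


# Hypothetical example: a design-and-manufacture process decomposed into sub-processes.
order_fulfilment = Holon("Fulfil customer order", "Deliver a configured product")
design = order_fulfilment.add(Holon("Design product variant"))
design.add(Holon("Verify design against requirements"))
order_fulfilment.add(Holon("Manufacture and assemble"))

for depth, holon in order_fulfilment.walk():
    print("  " * depth + holon.name)
```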

Relevance: 100.00%

Abstract:

This article investigates (1) whether cross-functional integration within a firm and the use of information systems (IS) that support information sharing with external parties can enhance integration across the supply chain and wider networks, and (2) whether collaboration with customers, suppliers and other external parties leads to increased supply chain performance in terms of new product development and the introduction of new processes. Data from a high-quality survey carried out in Taiwan in 2009 were used, and appropriate econometric models were applied. Results show that the adoption of IS that enhance information sharing is vital not only for effective communication with suppliers and with wider network members, but also has a direct effect on a firm's innovative effort. Cross-functional integration appears to matter only for the introduction of an innovative process. Collaboration with customers and suppliers affected a product's design and its overall features and functionality, respectively. © 2013 Copyright Taylor and Francis Group, LLC.

Relevance: 100.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

When applying multivariate analysis techniques in information systems and social science disciplines, such as management information systems (MIS) and marketing, the assumption that the empirical data originate from a single homogeneous population is often unrealistic. When applying a causal modeling approach, such as partial least squares (PLS) path modeling, segmentation is a key issue in coping with the problem of heterogeneity in estimated cause-and-effect relationships. This chapter presents a new PLS path modeling approach which classifies units on the basis of the heterogeneity of the estimates in the inner model. If unobserved heterogeneity significantly affects the estimated path model relationships at the aggregate data level, the methodology allows homogeneous groups of observations to be created that exhibit distinctive path model estimates. The approach thus provides differentiated analytical outcomes that permit more precise interpretations of each segment formed. An application to a large data set from the American Customer Satisfaction Index (ACSI) example substantiates the methodology's effectiveness in evaluating PLS path modeling results.
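
The chapter's segmentation algorithm is not reproduced in the abstract. As a much-simplified illustration of the general idea of response-based segmentation, the sketch below repeatedly estimates a single path coefficient per segment and reassigns observations to whichever segment predicts them best; the simulated data, two-segment setup and plain least-squares estimation are assumptions, not the chapter's PLS procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a single structural path y = beta * x + noise with two hidden
# segments whose path coefficients differ (unobserved heterogeneity).
n = 400
x = rng.normal(size=n)
true_segment = rng.integers(0, 2, size=n)
beta_true = np.where(true_segment == 0, 0.2, 0.9)
y = beta_true * x + rng.normal(scale=0.3, size=n)

# Start from a random partition and iterate: estimate the path coefficient
# within each segment, then move every observation to the segment whose
# coefficient gives it the smaller squared residual.
labels = rng.integers(0, 2, size=n)
for _ in range(20):
    betas = []
    for k in (0, 1):
        mask = labels == k
        if mask.sum() < 2:          # guard against an empty segment
            betas.append(0.0)
            continue
        betas.append(float(x[mask] @ y[mask]) / float(x[mask] @ x[mask]))
    residuals = np.stack([(y - b * x) ** 2 for b in betas], axis=1)
    new_labels = residuals.argmin(axis=1)
    if np.array_equal(new_labels, labels):
        break
    labels = new_labels

print("estimated segment path coefficients:", [round(b, 2) for b in betas])
```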

Relevance: 100.00%

Abstract:

Completing projects faster than the normal duration is always a challenge for the management of any project, as it often demands many paradigm shifts. The opportunities of globalization and competition from private-sector and multinational companies force the management of public sector organizations in the Indian petroleum sector to adopt aggressive strategies to maintain their profitability. Constructing infrastructure for handling petroleum products is one such strategy. Moreover, these projects must be completed faster than normal schedules allow in order to remain competitive, obtain a faster return on investment and give a longer project life. However, using conventional project management tools and techniques, it is not possible to reduce the project duration below the normal period. This study proposes the use of concurrent engineering in managing projects to radically reduce project duration. The phases of the project are carried out concurrently instead of in series. The complexities that arise in managing such projects are tackled by restructuring the project organization, improving management commitment, strengthening project-planning activities, ensuring project quality, managing project risk objectively and integrating project activities through management information systems. These measures not only ensure fast-track completion of projects but also improve project effectiveness in terms of quality, cost effectiveness and team building, which in turn improves the overall productivity of the project organization.
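
As a rough numerical illustration of why overlapping phases shortens the schedule, the sketch below compares a purely sequential plan with a fast-track plan in which part of each phase runs in parallel with its predecessor; the phase names, durations and overlap fraction are invented for illustration only.

```python
# Hypothetical phase durations in months.
phases = [("Feasibility", 4), ("Design", 8), ("Procurement", 10), ("Construction", 14)]
overlap_fraction = 0.4  # assumed: 40% of each phase runs in parallel with its predecessor

# Sequential plan: phases follow one another with no overlap.
sequential = sum(duration for _, duration in phases)

# Fast-track plan: each successor phase overlaps its predecessor,
# so only (1 - overlap_fraction) of its duration extends the schedule.
concurrent = phases[0][1]
for _, duration in phases[1:]:
    concurrent += duration * (1 - overlap_fraction)

print(f"Sequential schedule : {sequential:.1f} months")
print(f"Fast-track schedule : {concurrent:.1f} months")
```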

Relevance: 100.00%

Abstract:

The coordination of effort within and among different expert groups is a central feature of contemporary organizations. Within the existing literature, however, a dichotomy has emerged in our understanding of the role played by codification in coordinating expert groups. One strand of literature emphasizes codification as a process that supports coordination by enabling the storage and ready transfer of knowledge. In contrast, another strand highlights the persistent differences between expert groups that create boundaries to the transfer of knowledge, seeing coordination as dependent on the quality of the reciprocal interactions between groups and individuals. Our research helps to resolve such contested understandings of the coordinative role played by codification. By focusing on the offshore-outsourcing of knowledge-intensive services, we examine the role played by codification when expertise was coordinated between client staff and onsite and offshore vendor personnel in a large-scale outsourcing contract between TATA Consultancy Services (TCS) and ABN AMRO bank. A number of theoretical contributions flow from our analysis of the case study, helping to move our understanding beyond the dichotomized views of codification outlined above. First, our study adds to previous work where codification has been seen as a static concept by demonstrating the multiple, coexisting, and complementary roles that codification may play. We examine the dynamic nature of codification and show changes in the relative importance of these different roles in coordinating distributed expertise over time. Second, we reconceptualize the commonly accepted view of codification as focusing on the replication and diffusion of knowledge by developing the notion of the codification of the “knower” as complementary to the codification of knowledge. Unlike previous studies of expertise directories, codification of the knower does not involve representing expertise in terms of occupational skills or competences but enables the reciprocal interrelating of expertise required by more unstructured tasks.

Relevance: 100.00%

Abstract:

The recent global 'credit crunch' has brought sharply into focus the need for better understanding of what it takes for organisations to survive. This research seeks to help organisations maintain their 'viability' – the ability to maintain a separate existence and survive on their own. Whilst there are a multitude of factors that contribute to organisational viability, information can be viewed as the lifeblood of organisations. This research increases our understanding of how organisations can manage information effectively to help maintain their viability. The viable systems model (VSM) is an established modelling technique that enables the detailed analysis of organisational activity to examine how the structure and functions performed in an organisation contribute to its 'viability'. The VSM has been widely applied in small and large companies, industries and governments. However, whilst the VSM concentrates on the structure and functions necessary for an organisation to be viable, it pays much less attention to information deployment in organisations. Indeed, the VSM is criticised in the literature for being unable to provide much help with detailed information and communication structures, and new theories are called for to explore the way people interact and what information they need in the VSM. This research analyses qualitative data collected from four case studies to contribute to our understanding of the role that information plays in organisational viability, making three key contributions to the academic literature. In the information management literature, this research provides new insight into the roles that specific information plays in organisations. In the systems thinking literature, this research extends our understanding of the VSM and builds on its powerful diagnostic capability to provide further criteria to aid in the diagnosis of viable organisations. In the information systems literature, this research develops a framework that can be used to help organisations design more effective information systems.

Relevance: 100.00%

Abstract:

Information systems have developed to the stage where there is plenty of data available in most organisations, but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current Object Models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor, COSTed, which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
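
The CORD metamodel itself is not detailed in the abstract, so the sketch below is only a hypothetical rendering of its four named abstractions (Collections, Objects, Roles and Domains) as Python classes, with a toy COST-style summary over a collection; all names and attributes are illustrative.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Domain:
    """A set of permissible values for an attribute (e.g. regions, currencies)."""
    name: str
    values: List[str]


@dataclass
class Role:
    """A named way in which an object participates in a relationship."""
    name: str
    domain: Domain


@dataclass
class BusinessObject:
    """A real-world thing tracked by the information system."""
    name: str
    attributes: Dict[str, str] = field(default_factory=dict)


@dataclass
class Collection:
    """A grouping of objects over which summary statistics can be taken."""
    name: str
    members: List[BusinessObject] = field(default_factory=list)

    def summarise(self, attribute: str) -> Dict[str, int]:
        """A COST-style summary: count members by the value of one attribute."""
        counts: Dict[str, int] = {}
        for obj in self.members:
            key = obj.attributes.get(attribute, "unknown")
            counts[key] = counts.get(key, 0) + 1
        return counts


region = Domain("region", ["north", "south"])
orders = Collection("orders", [
    BusinessObject("order-1", {"region": "north"}),
    BusinessObject("order-2", {"region": "south"}),
    BusinessObject("order-3", {"region": "north"}),
])
print(orders.summarise("region"))  # {'north': 2, 'south': 1}
```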

Relevance: 100.00%

Abstract:

This thesis describes the development of an operational river basin water resources information management system. The river or drainage basin is the fundamental unit of the system, both in the modelling and prediction of hydrological processes and in the monitoring of the effect of catchment management policies. A primary concern of the study is the collection of sufficient and sufficiently accurate information to model hydrological processes. Remote sensing, in combination with conventional point source measurement, can be a valuable source of information, but is often overlooked by hydrologists due to the cost of acquisition and processing. This thesis describes a number of cost-effective methods of acquiring remotely sensed imagery, from airborne video survey to real-time ingestion of meteorological satellite data. Inexpensive micro-computer systems and peripherals are used throughout to process and manipulate the data. Spatial information systems provide a means of integrating these data with topographic and thematic cartographic data, and historical records. For the system to have any real potential, the data must be stored in a readily accessible format and be easily manipulated within the database. The design of efficient man-machine interfaces and the use of software engineering methodologies are therefore included in this thesis as a major part of the design of the system. The use of low-cost technologies, from micro-computers to video cameras, enables the introduction of water resources information management systems into developing countries, where the potential benefits are greatest.
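
As a loose illustration of the kind of readily accessible storage the thesis argues for, the sketch below keeps point-source gauge readings and remotely sensed imagery metadata for a catchment in a small SQLite database and retrieves them together; the schema, table and column names are invented and are not taken from the thesis.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE gauge_readings (
    catchment   TEXT,
    station_id  TEXT,
    observed_at TEXT,     -- ISO timestamp
    stage_m     REAL      -- river stage in metres
);
CREATE TABLE imagery (
    catchment   TEXT,
    source      TEXT,     -- e.g. 'airborne video', 'meteorological satellite'
    acquired_at TEXT,
    file_path   TEXT
);
""")

conn.execute("INSERT INTO gauge_readings VALUES ('upper-basin', 'G01', '1990-06-01T06:00', 1.42)")
conn.execute("INSERT INTO imagery VALUES ('upper-basin', 'meteorological satellite', '1990-06-01T06:30', 'img/0601.raw')")

# Retrieve the readings and imagery recorded for one catchment in a single query.
rows = conn.execute("""
    SELECT g.station_id, g.observed_at, g.stage_m, i.source, i.file_path
    FROM gauge_readings AS g
    JOIN imagery AS i ON i.catchment = g.catchment
    WHERE g.catchment = 'upper-basin'
""").fetchall()
print(rows)
```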

Relevance: 100.00%

Abstract:

WiMAX has been introduced as a competitive alternative for metropolitan broadband wireless access technologies. It is connection oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and the flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging, since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM is that it can efficiently model a large-scale system in which the number of stations or connections is generally very high, while traditional simulation and analytical approaches (e.g., Markov models) cannot perform well due to their high computational complexity. We model the bandwidth management scheme as a queuing network model (QNM) that consists of interacting multiclass queues for different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
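
The paper's maximum-entropy queueing model is not reproduced here. As a far simpler illustration of how the bandwidth-allocation policy affects blocking, the sketch below compares Erlang-B blocking for a fully shared channel pool against a statically partitioned one for two hypothetical service classes; neither policy is the paper's WPSS or WCPS, and all traffic figures are invented.

```python
def erlang_b(channels: int, offered_load: float) -> float:
    """Blocking probability of an M/M/c/c loss system (Erlang B recursion)."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b


total_channels = 32
loads = {"real-time": 10.0, "best-effort": 14.0}   # offered load in Erlangs (assumed)

# Complete sharing: both classes compete for one common pool of channels.
shared_blocking = erlang_b(total_channels, sum(loads.values()))

# Static partitioning: each class is confined to its own slice of the pool.
partition = {"real-time": 12, "best-effort": 20}
partition_blocking = {cls: erlang_b(partition[cls], load) for cls, load in loads.items()}

print(f"complete sharing blocking        : {shared_blocking:.4f}")
for cls, b in partition_blocking.items():
    print(f"partitioned blocking ({cls:11}): {b:.4f}")
```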

Relevance: 100.00%

Abstract:

Introduction: There is increasing evidence that electronic prescribing (ePrescribing) or computerised provider/physician order entry (CPOE) systems can improve the quality and safety of healthcare services. However, it has also become clear that their implementation is not straightforward and may create unintended or undesired consequences once in use. In this context, qualitative approaches have been particularly useful and their interpretative synthesis could make an important and timely contribution to the field. This review will aim to identify, appraise and synthesise qualitative studies on ePrescribing/CPOE in hospital settings, with or without clinical decision support. Methods and analysis: Data sources will include the following bibliographic databases: MEDLINE, MEDLINE In Process, EMBASE, PsycINFO, Social Policy and Practice via Ovid, CINAHL via EBSCO, The Cochrane Library (CDSR, DARE and CENTRAL databases), Nursing and Allied Health Sources, Applied Social Sciences Index and Abstracts via ProQuest and SCOPUS. In addition, other sources will be searched for ongoing studies (ClinicalTrials.gov) and grey literature: Healthcare Management Information Consortium, Conference Proceedings Citation Index (Web of Science) and Sociological abstracts. Studies will be independently screened for eligibility by 2 reviewers. Qualitative studies, either standalone or in the context of mixed-methods designs, reporting the perspectives of any actors involved in the implementation, management and use of ePrescribing/CPOE systems in hospital-based care settings will be included. Data extraction will be conducted by 2 reviewers using a piloted form. Quality appraisal will be based on criteria from the Critical Appraisal Skills Programme checklist and Standards for Reporting Qualitative Research. Studies will not be excluded based on quality assessment. A postsynthesis sensitivity analysis will be undertaken. Data analysis will follow the thematic synthesis method. Ethics and dissemination: The study does not require ethical approval as primary data will not be collected. The results of the study will be published in a peer-reviewed journal and presented at relevant conferences.

Relevance: 100.00%

Abstract:

Automatic ontology building is a vital issue in many fields where ontologies are currently built manually. This paper presents a user-centred methodology for ontology construction based on the use of Machine Learning and Natural Language Processing. In our approach, the user selects a corpus of texts and sketches a preliminary ontology (or selects an existing one) for a domain, with a preliminary vocabulary associated with the elements in the ontology (lexicalisations). Examples of sentences involving such lexicalisations (e.g. for the ISA relation) in the corpus are automatically retrieved by the system. Retrieved examples are validated by the user and used by an adaptive Information Extraction system to generate patterns that discover other lexicalisations of the same objects in the ontology, possibly identifying new concepts or relations. New instances are added to the existing ontology or used to tune it. This process is repeated until a satisfactory ontology is obtained. The methodology largely automates the ontology construction process, and the output is an ontology with an associated trained learner to be used for further ontology modifications.
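
The adaptive Information Extraction component is not specified in the abstract, so the sketch below only conveys the flavour of the bootstrapping step, using a naive regular-expression pattern for ISA lexicalisations ("X is a Y") over a tiny invented corpus; in the methodology the patterns are learned and the candidates are validated by the user.

```python
import re

# Tiny stand-in corpus; in the methodology the user supplies real domain texts.
corpus = [
    "A turbine is a rotary engine that extracts energy from fluid flow.",
    "The compressor is a component of the gas turbine.",
    "A gearbox is a mechanical device that transmits torque.",
]

# Naive ISA pattern: "<concept> is a <concept>".
ISA = re.compile(r"\b(?:a|the)?\s*([a-z]+)\s+is\s+an?\s+([a-z]+(?:\s[a-z]+)?)", re.IGNORECASE)

candidate_relations = []
for sentence in corpus:
    for child, parent in ISA.findall(sentence):
        candidate_relations.append((child.lower(), parent.lower()))

# In the methodology these candidates would be shown to the user for validation
# before being added to the ontology or used to induce further patterns.
for child, parent in candidate_relations:
    print(f"candidate ISA: {child} -> {parent}")
```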

Relevance: 100.00%

Abstract:

This paper initially highlights the rapid growth of the call centre (CC) sector in developing countries like India. It then makes a case for the investigation of the human resource management (HRM) systems of call centres in India. The analysis is based on a two-phase empirical study. Phase one examines the nature and pattern of HRM systems, and phase two the emerging issue of attrition in Indian call centres. A mixed research approach comprising in-depth interviews and a questionnaire survey was adopted to conduct the investigation. Against the established norms of Indian organizations, the findings highlight the existence of formal, structured and rationalized HRM systems. Core reasons for the increasing levels of attrition are highlighted. The analysis further provides useful information for both academics and practitioners and opens avenues for future research.