861 results for Software Process


Relevance: 30.00%

Abstract:

Purpose – This paper seeks to analyse the process of packaged software selection in a small organization, focussing particularly on the role of IT consultants as intermediaries in the process. Design/methodology/approach – This is based upon a longitudinal, qualitative field study concerning the adoption of a customer relationship management package in an SME management consultancy. Findings – The authors illustrate how the process of “salesmanship”, an activity directed by the vendor/consultant and focussed on the interests of senior management, marginalises user needs and ultimately secures the procurement of the software package. Research limitations/implications – Despite the best intentions, the authors lose something of the rich detail of the lived experience of technology in presenting the case study as a linear narrative. Specifically, the authors have been unable to do justice to the complexity of the multifarious ways in which individual perceptions of the project were influenced and shaped by the opinions of others. Practical implications – Practitioners, particularly those from within SMEs, should be made aware of the ways in which external parties may have a vested interest in steering projects in a particular direction, which may not necessarily align with their own interests. Originality/value – This study highlights in detail the role of consultants and vendors in software selection processes, an area which has received minimal attention to date. Prior work in this area emphasises the necessary conditions for, and positive outcomes of, appointing external parties in an SME context, with only limited attention being paid to the potential problems such engagements may bring.

Relevance: 30.00%

Abstract:

As organisations increasingly engage in the selection, purchase, and adoption of packaged software products, how these activities are carried out in practice becomes increasingly relevant for researchers and practitioners. Our focus in this paper is to propose a framework for understanding the packaged software selection process. The functionalist literature on this area of study suggests a number of generic recommendations, which are based on rational assumptions about the process and view the decision making that takes place as producing the “best technology solution”. To explore this, we conducted a longitudinal, in-depth study of packaged software selection in a small organisation. For interpretation of the case, we draw upon the Social Construction of Technology, a theoretical framework arguing that technology is socially constituted and regarding the process of development as contradictory and uncertain. We offer a number of contributions. First, we further our understanding of packaged software selection with the critique that we offer of the functionalist literature, drawing insights from the emerging critical/constructivist literature and expanding our domain of interest to encompass the wider environment. Second, we weave this together with our experiences in the field, drawing on social constructivism for theoretical support, to develop a framework of packaged software selection that shows how various actors shape the process.

Relevance: 30.00%

Abstract:

Customer Relationship Management (CRM) packaged software has become a key contributor to attempts at aligning business and IT strategies in recent years. Throughout the 1990s there was, in many organisations' strategies, a shift away from managing transactions and towards relationship management. Where Enterprise Resource Planning packages dominated the era of transaction management, CRM packages lead with regard to relationships. At present, balanced views of CRM packages are scarce, with much of the discussion relying instead on vendor rhetoric. This paper uses case study research to analyse some of the issues associated with CRM packages. These issues include the limitations of CRM packages, the need for a relationship orientation, and the problems of a dominant management perspective of CRM. It is suggested that these issues could be more readily accommodated by organisational detachment from beliefs in IT as utopia, consideration of prior IS theory and practice, and a more informed approach to CRM package selection.

Relevance: 30.00%

Abstract:

Process modelling is an integral part of any process industry. Several sugar factory models have been developed over the years to simulate the unit operations. An enhanced and comprehensive milling process simulation model has been developed to analyse the performance of the milling train and to assess the impact of changes and advanced control options for improved operational efficiency. The developed model is incorporated into the proprietary software package ‘SysCAD’. As an example, the milling process model has been used to predict a significant loss of extraction when the cush from the juice screen is returned before #3 mill instead of before #2 mill, as is more commonly done. Further work is being undertaken to model extraction processes in a milling train more accurately, to examine extraction issues dynamically, and to integrate the model into a whole-factory model.

Relevance: 30.00%

Abstract:

Today’s information systems log vast amounts of data. These collections of data (implicitly) describe events (e.g. placing an order or taking a blood test) and, hence, provide information on the actual execution of business processes. The analysis of such data provides an excellent starting point for business process improvement. This is the realm of process mining, an area that has produced a broad repertoire of analysis techniques. Despite the impressive capabilities of existing process mining algorithms, dealing with the abundance of data recorded by contemporary systems and devices remains a challenge. Of particular importance is the capability to guide the meaningful interpretation of “oceans of data” by process analysts. To this end, insights from the field of visual analytics can be leveraged. This article proposes an approach where process states are reconstructed from event logs and visualised in succession, leading to an animated history of a process. This approach is customisable in how a process state, partially defined through a collection of activity instances, is visualised: one can select a map and specify a projection of events onto this map based on the properties of the events. This paper describes a comprehensive implementation of the proposal, realised using the open-source process mining framework ProM. Moreover, this paper also reports on an evaluation of the approach conducted with Suncorp, one of Australia’s largest insurance companies.
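
As a rough illustration of the state-reconstruction idea (the actual implementation is realised on top of ProM, not sketched here), the following Python fragment reconstructs which activity instances are active at successive points in time and replays them as animation frames. The log structure, field names, and sample data are assumptions made for this example.

```python
# Illustrative sketch only: reconstructs successive "process states" from a
# simple event log and replays them as animation frames. The log format and
# field names (case, activity, start, end) are assumptions for this example.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ActivityInstance:
    case: str
    activity: str
    start: datetime
    end: datetime

def process_state(log, t):
    """Return the activity instances that are active at time t."""
    return [e for e in log if e.start <= t < e.end]

def animate(log, step=timedelta(minutes=30)):
    """Yield (timestamp, state) pairs forming an animated history of the process."""
    t, horizon = min(e.start for e in log), max(e.end for e in log)
    while t <= horizon:
        yield t, process_state(log, t)
        t += step

log = [
    ActivityInstance("c1", "Register claim", datetime(2013, 5, 1, 9), datetime(2013, 5, 1, 10)),
    ActivityInstance("c1", "Assess claim", datetime(2013, 5, 1, 10), datetime(2013, 5, 1, 12)),
    ActivityInstance("c2", "Register claim", datetime(2013, 5, 1, 9, 30), datetime(2013, 5, 1, 11)),
]

for t, state in animate(log):
    # In the full approach each event would be projected onto a user-chosen map
    # (e.g. geographic or organisational) based on its properties.
    print(t, [(e.case, e.activity) for e in state])
```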

Relevance: 30.00%

Abstract:

Critical to the research of urban morphologists is the availability of historical records that document the urban transformation of the study area. However, thus far little work has been done towards an empirical approach to the validation of archival data in this field. Outlined in this paper, therefore, is a new methodology for validating the accuracy of archival records and mapping data, accrued through the process of urban morphological research, so as to establish a reliable platform from which analysis can proceed. The paper particularly addresses the problems of inaccuracies in existing curated historical information, as well as errors in archival research by student assistants, which together give rise to unacceptable levels of uncertainty in the documentation. The paper discusses the problems relating to the reliability of historical information, demonstrates the importance of data verification in urban morphological research, and proposes a rigorous method for objective testing of collected archival data through the use of qualitative data analysis software.

Relevance: 30.00%

Abstract:

This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing the likelihood of a process fault occurring, as well as its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action to perform which minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements like task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between risks across different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that process instances executed concurrently complete with significantly fewer faults and with lower fault severities when the recommendations provided by our recommendation system are taken into account.
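
The single-instance part of this idea can be sketched as follows, assuming a hypothetical numeric encoding of process features. The feature set, training data, and use of scikit-learn are illustrative only, and the cross-instance coordination via integer linear programming is omitted.

```python
# Minimal sketch of risk-informed recommendation for a single instance.
# The feature encoding and data are invented; the paper's technique also uses
# integer linear programming to coordinate across concurrent instances.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: each row encodes a past execution
# (next task id, resource id, task duration in hours) and whether it ended in a fault.
X_train = [[0, 1, 2.0], [0, 2, 5.5], [1, 1, 1.0], [1, 3, 4.0], [2, 2, 3.0]]
y_train = [0, 1, 0, 1, 0]  # 1 = process instance completed with a fault

risk_model = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

def recommend(candidate_actions):
    """Return the candidate action with the lowest predicted fault probability."""
    def predicted_risk(action):
        return risk_model.predict_proba([action])[0][1]
    return min(candidate_actions, key=predicted_risk)

# Candidate inputs the participant could provide next (same encoding as X_train).
print(recommend([[0, 1, 2.0], [1, 3, 4.0], [2, 2, 3.0]]))
```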

Relevance: 30.00%

Abstract:

The characterisation of facial expression through landmark-based analysis methods such as FACEM (Pilowsky & Katsikitis, 1994) has a variety of uses in psychiatric and psychological research. In these systems, important structural relationships are extracted from images of facial expressions by the analysis of a pre-defined set of feature points. These relationship measures may then be used, for instance, to assess the degree of variability and similarity between different facial expressions of emotion. FaceXpress is a multimedia software suite that provides a generalised workbench for landmark-based facial emotion analysis and stimulus manipulation. It is a flexible tool that is designed to be specialised at runtime by the user. While FaceXpress has been used to implement the FACEM process, it can also be configured to support any other similar, arbitrary system for quantifying human facial emotion. FaceXpress also implements an integrated set of image processing tools and specialised tools for facial expression stimulus production including facial morphing routines and the generation of expression-representative line drawings from photographs.
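
The general flavour of landmark-based analysis can be shown with a small sketch: derive scale-normalised structural measures from a set of feature points and compare two expressions. The landmark names and measures below are invented for the example and are not the FACEM measures.

```python
# Illustrative sketch of landmark-based comparison; landmark names and measures
# are hypothetical, not the FACEM set used by FaceXpress.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def expression_measures(landmarks):
    """Derive simple structural measures from (x, y) feature points,
    normalised by inter-ocular distance so images of different scale compare."""
    scale = dist(landmarks["left_eye"], landmarks["right_eye"])
    return {
        "mouth_width": dist(landmarks["mouth_left"], landmarks["mouth_right"]) / scale,
        "mouth_opening": dist(landmarks["upper_lip"], landmarks["lower_lip"]) / scale,
        "brow_raise": dist(landmarks["left_brow"], landmarks["left_eye"]) / scale,
    }

def similarity(a, b):
    """Euclidean distance between two expressions' measure vectors (lower = more similar)."""
    keys = sorted(a)
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in keys))
```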

Relevance: 30.00%

Abstract:

This paper presents a technique for the automated removal of noise from process execution logs. Noise is the result of data quality issues such as logging errors and manifests itself in the form of infrequent process behavior. The proposed technique generates an abstract representation of an event log as an automaton capturing the direct follows relations between event labels. This automaton is then pruned of arcs with low relative frequency and used to remove from the log those events not fitting the automaton, which are identified as outliers. The technique has been extensively evaluated on top of various automated process discovery algorithms, using both artificial logs with different levels of noise and a variety of real-life logs. The results show that the technique significantly improves the quality of the discovered process model in terms of fitness, appropriateness and simplicity, without negative effects on generalization. Further, the technique scales well to large and complex logs.
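
A much-simplified sketch of this kind of frequency-based filtering over a directly-follows abstraction is shown below. The pruning rule, threshold, and sample log are assumptions for illustration and differ from the published technique in detail.

```python
# Simplified sketch: build directly-follows counts, prune infrequent arcs,
# and drop events that break the pruned relation. Illustration only.
from collections import Counter

def directly_follows(log):
    """Count how often label b directly follows label a across all traces."""
    counts = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

def prune(counts, threshold=0.6):
    """Keep only arcs whose frequency is at least `threshold` times the
    most frequent outgoing arc of the same source label."""
    max_out = Counter()
    for (a, _), c in counts.items():
        max_out[a] = max(max_out[a], c)
    return {arc for arc, c in counts.items() if c >= threshold * max_out[arc[0]]}

def filter_log(log, arcs):
    """Drop events that do not fit the pruned directly-follows automaton."""
    cleaned = []
    for trace in log:
        kept = [trace[0]] if trace else []
        for b in trace[1:]:
            if (kept[-1], b) in arcs:
                kept.append(b)  # fits the automaton
            # otherwise b is treated as an outlier and removed
        cleaned.append(kept)
    return cleaned

log = [["a", "b", "c"], ["a", "b", "c"], ["a", "x", "b", "c"]]  # "x" is a logging error
arcs = prune(directly_follows(log))
print(filter_log(log, arcs))  # the infrequent "x" is filtered out
```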

Relevance: 30.00%

Abstract:

This study proposes that technology adoption be considered a multi-stage process comprising several distinct stages. Drawing on the Theory of Planned Behaviour (TPB) and Ettlie's adoption stages, and employing data gathered from 162 owners of Small and Medium-sized Enterprises (SMEs), our findings show that the determinants of the intention to adopt packaged software fluctuate significantly across adoption stages.

Relevance: 30.00%

Abstract:

Since their inception in 1962, Petri nets have been used in a wide variety of application domains. Although Petri nets are graphical and easy to understand, they have formal semantics and allow for analysis techniques ranging from model checking and structural analysis to process mining and performance analysis. Over time Petri nets emerged as a solid foundation for Business Process Management (BPM) research. The BPM discipline develops methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. Mainstream business process modeling notations and workflow management systems are using token-based semantics borrowed from Petri nets. Moreover, state-of-the-art BPM analysis techniques are using Petri nets as an internal representation. Users of BPM methods and tools are often not aware of this. This paper aims to unveil the seminal role of Petri nets in BPM.
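
The token-based semantics referred to here can be illustrated with a minimal sketch, not tied to any particular BPM tool: a transition is enabled when every input place holds a token, and firing it consumes those tokens and produces tokens on the output places.

```python
# Minimal sketch of a place/transition net with token-based firing semantics.
from collections import Counter

class PetriNet:
    def __init__(self, transitions, marking):
        # transitions: name -> (input places, output places)
        self.transitions = transitions
        self.marking = Counter(marking)

    def enabled(self, t):
        inputs, _ = self.transitions[t]
        return all(self.marking[p] >= 1 for p in inputs)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t!r} is not enabled")
        inputs, outputs = self.transitions[t]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] += 1

# A toy sequential process: register -> decide.
net = PetriNet(
    transitions={"register": (["start"], ["p1"]), "decide": (["p1"], ["end"])},
    marking=["start"],
)
net.fire("register")
net.fire("decide")
print(+net.marking)  # Counter({'end': 1})
```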

Relevance: 30.00%

Abstract:

This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While the developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software “bugs”, the failure history of the software system in the various phases of its lifecycle, the reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process have all been considered to varying degrees of detail. We also discuss the notion of software fault-tolerance, methods of achieving the same, and the status of other measures of software dependability such as maintainability, availability and safety.
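
As an example of the kind of reliability growth model such surveys cover (not necessarily one singled out in this paper), the following sketch fits the classical Goel-Okumoto model, in which the expected cumulative number of failures is m(t) = a * (1 - exp(-b*t)), to invented failure-count data and estimates the number of faults remaining. The data and parameter values are purely illustrative.

```python
# Sketch: fit a Goel-Okumoto reliability growth model to hypothetical
# cumulative failure counts and estimate the faults remaining.
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    # Expected cumulative failures by time t under the Goel-Okumoto model.
    return a * (1.0 - np.exp(-b * t))

# Hypothetical test data: cumulative failures observed by the end of each week.
weeks = np.arange(1, 11)
cumulative_failures = np.array([12, 21, 28, 34, 38, 42, 44, 46, 47, 48])

(a, b), _ = curve_fit(mean_value, weeks, cumulative_failures, p0=(50.0, 0.3))

print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.2f}")
print(f"estimated faults remaining after week 10: {a - cumulative_failures[-1]:.1f}")
```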

Relevance: 30.00%

Abstract:

Background: With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process, leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and to help with quality control, there is a strong need for a software system that can help with the tracking of samples and the capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols and by recording and annotating data from every step of the workflow.

Results: A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process, from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is presently used to capture high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics.

Conclusions: A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping laboratory. The application, with source code, is freely available for academic users and can be downloaded from http://www.icrisat.org/bt-software-d-lims.htm
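
As a rough illustration of what auto-binning of alleles involves, observed SSR fragment sizes can be grouped into discrete allele bins as sketched below. The gap-based rule, the tolerance, and the sample sizes are assumptions for this example, not the algorithm used by the ICRISAT LIMS.

```python
# Illustrative sketch of allele auto-binning: group observed fragment sizes
# (in base pairs) into bins; the rule and tolerance are hypothetical.
def auto_bin(fragment_sizes, tolerance=0.5):
    """Sizes closer than `tolerance` to the previous size fall into the same bin."""
    bins = []
    for size in sorted(fragment_sizes):
        if bins and size - bins[-1][-1] <= tolerance:
            bins[-1].append(size)
        else:
            bins.append([size])
    # Report each bin by its mean size (the called allele) and its members.
    return [(round(sum(b) / len(b), 1), b) for b in bins]

sizes = [150.1, 150.3, 152.0, 152.2, 152.4, 155.9]
for allele, members in auto_bin(sizes):
    print(allele, members)
```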