16 results for process mapping

in Aston University Research Archive


Relevance: 100.00%

Abstract:

A case study demonstrates the use of a process-based approach to change regarding the implementation of an information system for road traffic accident reporting in a UK police force. The supporting tools of process mapping and business process simulation are used in the change process and assist in communicating the current process design and people's roles in the overall performance of that design. The simulation model is also used to predict the performance of new designs incorporating the use of information technology. The approach is seen to have a number of advantages in the context of a public sector organisation. These include the ability for personnel to move from a traditional grouping of staff in occupational groups, with relationships defined by reporting requirements, to a view of their role in a process which delivers a performance to a customer. By running the simulation through time it is also possible to gauge how changes at an operational level can lead to the meeting of strategic targets over time. The ability of simulation to prove new designs was also seen as particularly important in a government agency where past failures of information technology investments had contributed to a more risk-averse approach to their implementation. © 2004 Elsevier Ltd. All rights reserved.
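The predictive role simulation plays in the abstract above can be illustrated with a minimal single-server queue model. All figures (arrival rate, handling times) are hypothetical, and this sketch is far simpler than the business process simulation the case study describes:

```python
import random

def simulate(service_mean, arrival_mean, n=20000, seed=0):
    """Average time a report spends in a single-server queue (minutes).
    A minimal sketch only -- the paper's model is far richer."""
    rng = random.Random(seed)
    arrival, server_free, total_flow = 0.0, 0.0, 0.0
    for _ in range(n):
        arrival += rng.expovariate(1.0 / arrival_mean)   # next report arrives
        start = max(arrival, server_free)                # wait if server busy
        server_free = start + rng.expovariate(1.0 / service_mean)
        total_flow += server_free - arrival              # time in system
    return total_flow / n

# Hypothetical figures: reports arrive every 10 min on average; an
# IT-supported redesign halves the 8-minute handling time.
as_is = simulate(service_mean=8.0, arrival_mean=10.0)
to_be = simulate(service_mean=4.0, arrival_mean=10.0)
```

Because queueing delay shrinks faster than the processing time itself, the redesigned flow time falls more than proportionally, which is exactly the kind of dynamic effect a static process map cannot show.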

Relevance: 70.00%

Abstract:

This paper argues for the use of reusable simulation templates as a tool that can help to predict the effect of e-business introduction on business processes. First, a set of requirements for e-business modelling is introduced and modelling options are described. Traditional business process mapping techniques are examined as a way of identifying potential changes. Whilst paper-based process mapping may not highlight significant differences between traditional and e-business processes, simulation does allow the real effects of e-business to be identified. Simulation has the advantage of capturing the dynamic characteristics of the process, thus reflecting more accurately the changes in behaviour. This paper shows the value of using generic process maps as a starting point for collecting the data needed to build the simulation, and proposes the use of reusable templates/components for the speedier building of e-business simulation models.
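A toy illustration of the reusable-template idea: one generic process map, parameterised once for a traditional variant and once for an e-business variant. The process steps and timings are invented for illustration, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class OrderProcessTemplate:
    """Reusable template: the same generic process map instantiated
    with different parameters (all figures hypothetical, in minutes)."""
    order_entry_min: float
    credit_check_min: float
    dispatch_min: float

    def cycle_time(self):
        return self.order_entry_min + self.credit_check_min + self.dispatch_min

traditional = OrderProcessTemplate(order_entry_min=15, credit_check_min=60, dispatch_min=30)
e_business  = OrderProcessTemplate(order_entry_min=1,  credit_check_min=5,  dispatch_min=30)
```

The same template is built once and instantiated per scenario, which is the speed advantage the paper attributes to reusable components.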

Relevance: 70.00%

Abstract:

A structured approach to process improvement is described in the context of the human resources division of a UK police force. The approach combines a number of established process improvement techniques, such as the balanced scorecard and process mapping, with a scoring system developed to prioritise processes for improvement. The methodology presents one way of ensuring that the correct processes are identified and redesigned at an operational level in such a way as to support the organisation's strategic aims. In addition, a performance measurement system is used to help ensure that the changes implemented actually achieve the desired effect over time. The case demonstrates the need to choose, and in some cases develop, in-house tools and techniques depending on the context of the process improvement effort.
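A prioritisation scoring system of the kind described can be sketched as a weighted sum over criteria. The criteria, weights and scores below are hypothetical placeholders, not those developed in the case:

```python
# Hypothetical criteria weights (summing to 1) and 1-5 scores per process.
weights = {"strategic_fit": 0.4, "improvement_potential": 0.35, "feasibility": 0.25}
processes = {
    "recruitment":      {"strategic_fit": 4, "improvement_potential": 5, "feasibility": 3},
    "training_booking": {"strategic_fit": 2, "improvement_potential": 3, "feasibility": 5},
    "payroll_queries":  {"strategic_fit": 3, "improvement_potential": 2, "feasibility": 4},
}

# Weighted total per process, then rank highest first for redesign effort.
scores = {p: sum(weights[c] * v for c, v in crit.items()) for p, crit in processes.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
```

Tying one of the weights to strategic fit is what links operational-level redesign back to the organisation's strategic aims, as the abstract emphasises.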

Relevance: 60.00%

Abstract:

The main aim of this research is to demonstrate strategic supplier performance evaluation in a UK-based manufacturing organisation using an integrated analytical framework. Developing long-term relationships with strategic suppliers is common in today's industry. However, monitoring suppliers' performance throughout the contractual period is important in order to ensure overall supply chain performance. Client organisations therefore need to measure suppliers' performance dynamically and inform them of improvement measures. Although there are many studies introducing innovative supplier performance evaluation frameworks, and empirical research identifying criteria for supplier evaluation, little has been reported on the detailed application of strategic supplier performance evaluation and its implications for the overall performance of the organisation. Additionally, the majority of prior studies emphasise lagging factors (quality, delivery schedule and value/cost) for supplier selection and evaluation. This research proposes both leading factors (organisational practices, risk management, environmental and social practices) and lagging factors for supplier evaluation, and demonstrates a systematic method for identifying those factors with the involvement of relevant stakeholders and process mapping. The contribution of this article is a real-life, case-based action research study utilising an integrated analytical model that combines quality function deployment and the analytic hierarchy process method for supplier performance evaluation. The effectiveness of the method has been demonstrated through a number of validations (e.g. focus groups, business results and statistical analysis). Additionally, the study reveals that enhanced supplier performance has a positive impact on the operational and business performance of the client organisation.
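The analytic hierarchy process half of such an integrated model can be sketched briefly: criteria are compared pairwise, priority weights are derived (here by the common geometric-mean approximation to the principal eigenvector), and a consistency ratio checks the judgements. The criteria and pairwise judgements below are hypothetical, not those elicited in the study:

```python
import numpy as np

# Pairwise comparison matrix over three hypothetical supplier criteria
# (quality, delivery, risk management), on Saaty's 1-9 scale.
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
])

def ahp_weights(A):
    """Priority weights via the geometric-mean method, plus the
    consistency ratio (CR < 0.1 is conventionally acceptable)."""
    n = A.shape[0]
    gm = np.prod(A, axis=1) ** (1.0 / n)
    w = gm / gm.sum()
    lam = (A @ w / w).mean()              # estimate of principal eigenvalue
    ci = (lam - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    return w, ci / ri

w, cr = ahp_weights(A)
```

In the paper's integrated model these AHP-style weights would feed the quality function deployment matrices; here the sketch stops at the weight derivation.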

Relevance: 40.00%

Relevance: 30.00%

Abstract:

This paper addresses the problem of novelty detection in the case where the observed data is a mixture of a known 'background' process contaminated by an unknown process which generates the outliers, or novel observations. The framework described here is quite general, employing univariate classification with incomplete information, based on knowledge of the probability density function (pdf) of the data generated by the background process. The relative proportion of this background component (the prior background probability), and the pdfs and prior probabilities of all other components, are all assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known background distribution. The method exploits the Kolmogorov-Smirnov test to estimate the proportions, after which the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the 'joker' data set. We propose this method as a reliable means of novelty detection in an emergency situation, which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm. © Springer-Verlag 2007.
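The core idea — estimating the maximum proportion of data consistent with a known background distribution using a Kolmogorov-Smirnov criterion — can be sketched as follows. This is a simplified illustration of the idea on synthetic data, not the paper's exact estimator:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
# Synthetic mixture: 90% known N(0,1) background, 10% outliers near 6.
data = np.concatenate([rng.normal(0, 1, 900), rng.normal(6, 0.5, 100)])

def max_background_proportion(data, background_cdf, step=0.01):
    """Largest proportion p for which (Fn - p*F) could still be the CDF
    contribution of an unknown outlier component: it must stay non-negative
    and non-decreasing, up to an approximate 95% KS band."""
    x = np.sort(data)
    n = x.size
    Fn = np.arange(1, n + 1) / n          # empirical CDF at sorted points
    F = background_cdf(x)                 # known background CDF
    tol = 1.36 / np.sqrt(n)               # approximate 95% KS band
    for p in np.arange(1.0, 0.0, -step):
        resid = Fn - p * F
        if resid.min() >= -tol and np.diff(resid).min() >= -tol:
            return p
    return 0.0

p_hat = max_background_proportion(data, norm.cdf)
```

With the background proportion estimated, points would then be assigned to background or outlier components by a Bayes-optimal rule; that second stage is omitted here.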

Relevance: 30.00%

Abstract:

Aim: To undertake a national study of teaching, learning and assessment in UK schools of pharmacy.
Design: Triangulation of course documentation, 24 semi-structured interviews undertaken with 29 representatives from the schools, and a survey of all final year students (n=1,847) in the 15 schools within the UK during 2003–04.
Subjects and setting: All established UK pharmacy schools and final year MPharm students.
Outcome measures: Data were combined and analysed under the topics of curriculum, teaching and learning, assessment, multi-professional teaching and learning, placement education and research projects.
Results: Professional accreditation was the main driver for curriculum design, but links to preregistration training were poor. Curricula were consistent but offered little student choice. On average, half the curriculum was science-based. Staff supported the science content but students less so. Courses were didactic, but schools were experimenting with new methods of learning. Examinations were the principal form of assessment, but the contribution of practice to the final degree varied considerably (21–63%). Most students considered the assessment load to be about right but with too much emphasis upon knowledge. Assessment of professional competence was focused upon dispensing and pharmacy law. All schools undertook placement teaching in hospitals, but there was little in community/primary care. There was little inter-professional education. Resources and logistics were the major limiters.
Conclusions: There is a need for an integrated review of the accreditation process for the MPharm and preregistration training, and a redefinition of professional competence at an undergraduate level.

Relevance: 30.00%

Abstract:

Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context where observations are collected and reported by a network of sensors, and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moment estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If this extreme data destabilises the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. 
We discuss the issue of whether to treat or ignore extreme values, making the distinction between robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
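Both the Huber function and the two-component Gaussian mentioned above bound the influence of extreme residuals. A minimal illustration of why that matters, using the Huber function for a simple location estimate rather than the paper's full REML/kriging machinery (figures are invented):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
readings = rng.normal(10.0, 1.0, 200)   # routine sensor readings
readings[:5] = 1e4                      # sporadic malfunctions report extremes

def huber(r, k=1.345):
    """Huber function: quadratic near zero, linear in the tails, so
    extreme residuals exert only bounded influence on the fit."""
    a = np.abs(r)
    return np.where(a <= k, 0.5 * r**2, k * a - 0.5 * k**2)

# Robust location estimate vs the badly biased plain mean.
robust = minimize_scalar(lambda m: huber(readings - m).sum()).x
naive = readings.mean()
```

The plain mean is dragged far from the true level by five bad readings, while the Huber-based estimate stays near it; in the kriging setting the same bounding prevents malfunctioning sensors from destabilising the covariance estimates.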

Relevance: 30.00%

Abstract:

Xerox Customer Engagement activity is informed by the "Go To Market" strategy, and "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements are about how Xerox organise and manage three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.

Relevance: 30.00%

Abstract:

This paper is about care, insider positions and mothering within feminist research. We ask how honest, ethical and caring we can really be in placing the self into the research process as mothers ourselves. Should we leave out aspects of the research that do not fit neatly, and how ethical can we claim to be if we do? Moreover, should difficult differences, secrets and silences that emerge from the research process, and research stories that might 'out' us as failures, be excluded from research outcomes so as to claim legitimate research? We consider the use of feminist methods as crucial to a reciprocal and relational understanding of personal enquiry. Mothers invest significant emotional capital in their families, and we explore the blurring of the interpersonal and the intrapersonal when sharing mothering experiences common to both participant and researcher. Indeed, participants can identify themselves within the process as 'friends' of the researcher. We both have familiarity within our respective research that has led to a mutual understanding of having insider positions. Crucially, individuals' realities are a vital component of the qualitative paradigm, and 'insider' research remains a necessary, albeit messy, vehicle in social research. We also consider a growing body of literature which marks out and endorses a feminist ethics of care, critiquing established ways of thinking about ethics, morality, security, citizenship and care. It provides alternatives in mapping private and public aspects of social life, operating at a theoretical level but, importantly for this paper, also at the level of practical application.

Relevance: 30.00%

Abstract:

Most machine-learning algorithms are designed for datasets with features of a single type, whereas very little attention has been given to datasets with mixed-type features. We recently proposed a model to handle mixed types with a probabilistic latent variable formalism. This model describes the data by type-specific distributions that are conditionally independent given the latent space, and is called generalised generative topographic mapping (GGTM). It has often been observed that visualisations of high-dimensional datasets can be poor in the presence of noisy features. In this paper we therefore propose to extend the GGTM to estimate feature saliency values (GGTMFS) as an integrated part of the parameter learning process with an expectation-maximisation (EM) algorithm. The efficacy of the proposed GGTMFS model is demonstrated for both synthetic and real datasets.
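The EM machinery referred to above can be illustrated in its simplest form, a one-dimensional two-component Gaussian mixture; GGTMFS extends this same E-step/M-step loop with type-specific distributions and per-feature saliency parameters. The data and settings below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 1-D mixture: 30% from N(-2,1), 70% from N(3,1).
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])

pi = np.array([0.5, 0.5])    # mixing weights
mu = np.array([-1.0, 1.0])   # component means
var = np.array([1.0, 1.0])   # component variances
for _ in range(100):
    # E-step: responsibility of each component for each point.
    dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and variances.
    nk = resp.sum(axis=0)
    pi = nk / x.size
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
```

In GGTMFS the M-step additionally updates a saliency value per feature, so that noisy features contribute less to the latent-space visualisation.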

Relevance: 30.00%

Abstract:

Measurement and verification of products and processes during early design is attracting increasing interest from high-value manufacturing industries. Measurement planning is seen as an effective means of facilitating the integration of the metrology activity into a wider range of production processes. However, the literature reveals that there have been very few research efforts in this field, especially regarding large-volume metrology. This paper presents a novel approach to instrument selection, the first stage of the measurement planning process, by mapping measurability characteristics between specific measurement assignments and instruments.
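The mapping of measurability characteristics to instruments can be sketched as a simple capability filter. The instrument catalogue, characteristic names and figures below are hypothetical illustrations, not the paper's actual data:

```python
# Hypothetical large-volume metrology catalogue: each instrument described
# by measurability characteristics (range, achievable uncertainty).
instruments = {
    "laser_tracker":  {"range_m": 80.0, "uncertainty_um": 15.0},
    "photogrammetry": {"range_m": 20.0, "uncertainty_um": 50.0},
    "laser_radar":    {"range_m": 50.0, "uncertainty_um": 10.0},
}

def select_instruments(task, catalogue):
    """Return instruments whose characteristics cover the assignment."""
    return [name for name, c in catalogue.items()
            if c["range_m"] >= task["range_m"]
            and c["uncertainty_um"] <= task["max_uncertainty_um"]]

# A measurement assignment: 30 m working volume, 20 um uncertainty budget.
task = {"range_m": 30.0, "max_uncertainty_um": 20.0}
candidates = select_instruments(task, instruments)
# candidates → ["laser_tracker", "laser_radar"]
```

A real measurability mapping would involve many more characteristics (line of sight, environment, target type); the filter structure, however, stays the same.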