41 results for Critical to Satisfaction
in Aston University Research Archive
Abstract:
Purpose - This "research note" sets out to fuel the debate around the practices and technologies within operations that are critical to success with servitization. It presents a study of four companies which are delivering advanced services and reports on the organisation and skill-sets of the people within these. Design/methodology/approach - This has been case-based research at four manufacturers leading in their delivery of services. Findings - It describes the desirable behaviour of people in the front line of service delivery, identifies the supporting skill-sets and how these people are organised, and explains why all these factors are so important. Originality/value - This paper contributes to the understanding of the servitization process and, in particular, the implications for the broader operations of the firm. © 2013 Emerald Group Publishing Limited. All rights reserved.
Abstract:
Purpose – Role clarity of frontline staff is critical to their perceptions of service quality in call centres. The purpose of this study is to examine the effects of role clarity and its antecedents and consequences on employee-perceived service quality. Design/methodology/approach – A conceptual model, based on the job characteristics model and cognitive theories, is proposed. Key antecedents of role clarity considered here are feedback, autonomy, participation, supervisory consideration, and team support; while key consequences are organizational commitment, job satisfaction and service quality. An internal marketing approach is adopted and all variables are measured from the frontline employee's perspective. A structural equation model is developed and tested on a sample of 342 call centre representatives of a major commercial bank in the UK. Findings – The research reveals that role clarity plays a critical role in explaining employee perceptions of service quality. Further, the research findings indicate that feedback, participation and team support significantly influence role clarity, which in turn influences job satisfaction and organizational commitment. Research limitations/implications – The research suggests that boundary personnel in service firms should strive for more clarity in perceived role for delivering better service quality. The limitations are in sample availability from in-house transaction call centres of a single bank. Originality/value – The contributions of this study are untangling the confusing research evidence on the effect of role clarity on service quality, using service quality as a performance variable as opposed to productivity estimates, adopting an internal marketing approach to understanding the phenomenon, and introducing teamwork along with job-design and supervisory factors as antecedent to role clarity.
Abstract:
Purpose – This paper aims to respond to John Rossiter's call for a “Marketing measurement revolution” in the current issue of EJM, as well as providing broader comment on Rossiter's C-OAR-SE framework, and measurement practice in marketing in general. Design/methodology/approach – The paper is purely theoretical, based on interpretation of measurement theory. Findings – The authors find that much of Rossiter's diagnosis of the problems facing measurement practice in marketing and social science is highly relevant. However, the authors find themselves opposed to the revolution advocated by Rossiter. Research limitations/implications – The paper presents a comment based on interpretation of measurement theory and observation of practices in marketing and social science. As such, the interpretation is itself open to disagreement. Practical implications – There are implications for those outside academia who wish to use measures derived from academic work as well as to derive their own measures of key marketing and other social variables. Originality/value – This paper is one of the few to explicitly respond to the C-OAR-SE framework proposed by Rossiter, and presents a number of points critical to good measurement theory and practice, which appear to remain underdeveloped in marketing and social science.
Abstract:
The concept of plagiarism is not uncommonly associated with the concept of intellectual property, for both historical and legal reasons: the approach to the ownership of ‘moral’, non-material goods has evolved to the right to individual property, and consequently a need was raised to establish a legal framework to cope with the infringement of those rights. The solution to plagiarism therefore falls most often under two categories: ethical and legal. On the ethical side, education and intercultural studies have addressed plagiarism critically, not only as a means to improve academic ethics policies (PlagiarismAdvice.org, 2008), but mainly to demonstrate that, if anything, the concept of plagiarism is far from being universal (Howard & Robillard, 2008). Even if differently, Howard (1995) and Scollon (1994, 1995) argued, and Angèlil-Carter (2000) and Pecorari (2008) later emphasised, that the concept of plagiarism cannot be studied on the grounds that one definition is clearly understandable by everyone. Scollon (1994, 1995), for example, claimed that authorship attribution is particularly a problem in non-native writing in English, and so did Pecorari (2008) in her comprehensive analysis of academic plagiarism. If among higher education students plagiarism is often a problem of literacy, with prior, conflicting social discourses that may interfere with academic discourse, as Angèlil-Carter (2000) demonstrates, we then have to aver that a distinction should be made between intentional and inadvertent plagiarism: plagiarism should be prosecuted when intentional, but if it is part of the learning process and results from the plagiarist’s unfamiliarity with the text or topic it should be considered ‘positive plagiarism’ (Howard, 1995: 796) and hence not an offence. Determining the intention behind the instances of plagiarism therefore determines the nature of the disciplinary action adopted.
Unfortunately, in order to demonstrate the intention to deceive and charge students with accusations of plagiarism, teachers necessarily have to position themselves as ‘plagiarism police’, although it has been argued otherwise (Robillard, 2008). Practice demonstrates that in their daily activities teachers will find themselves required to command investigative skills and tools that they most often lack. We thus claim that the ‘intention to deceive’ cannot always be dissociated from plagiarism as a legal issue, even if Garner (2009) asserts that plagiarism is generally immoral but not illegal, and Goldstein (2003) draws the same distinction. However, these claims, and the claim that only cases of copyright infringement tend to go to court, have recently been challenged, mainly by forensic linguists, who have been actively involved in cases of plagiarism. Turell (2008), for instance, demonstrated that plagiarism is often connoted with an illegal appropriation of ideas. Previously, she (Turell, 2004) had demonstrated, by comparison of four translations of Shakespeare’s Julius Caesar into Spanish, that linguistic evidence is able to demonstrate instances of plagiarism. This challenge is also reinforced by practice in international organisations, such as the IEEE, for which plagiarism potentially has ‘severe ethical and legal consequences’ (IEEE, 2006: 57). What plagiarism definitions used by publishers and organisations have in common – and which academia usually lacks – is their focus on its legal nature. We speculate that this is due to the relation they intentionally establish with copyright laws, whereas in education the focus tends to shift from the legal to the ethical aspects. However, the number of plagiarism cases taken to court is very small, and jurisprudence is still being developed on the topic.
In countries within the Civil Law tradition, Turell (2008) claims, (forensic) linguists are seldom called upon as expert witnesses in cases of plagiarism, either because plagiarists are rarely taken to court or because there is little tradition of accepting linguistic evidence. In spite of the investigative and evidential potential of forensic linguistics to demonstrate the plagiarist’s intention or otherwise, this potential is restricted by the ability to identify a text as being suspected of plagiarism. In an era of such massive textual production, ‘policing’ plagiarism thus becomes an extraordinarily difficult task without the assistance of plagiarism detection systems. Although plagiarism detection has attracted the attention of computer engineers and software developers for years, a lot of research is still needed. Given the investigative nature of academic plagiarism, plagiarism detection has of necessity to consider not only concepts of education and computational linguistics, but also forensic linguistics, especially if it is to counter claims of being a ‘simplistic response’ (Robillard & Howard, 2008). In this paper, we use a corpus of essays written by university students who were accused of plagiarism to demonstrate that a forensic linguistic analysis of improper paraphrasing in suspect texts has the potential to identify and provide evidence of intention. A linguistic analysis of the corpus texts shows that the plagiarist acts on the paradigmatic axis to replace relevant lexical items with a related word from the same semantic field, i.e. a synonym, a subordinate, a superordinate, etc. In other words, relevant lexical items were replaced with related, but not identical, ones. Additionally, the analysis demonstrates that the word order is often changed intentionally to disguise the borrowing. On the other hand, the linguistic analysis of linking and explanatory verbs (i.e. referencing verbs) and prepositions shows that these have the potential to discriminate instances of ‘patchwriting’ from instances of plagiarism. This research demonstrates that the referencing verbs are borrowed from the original in an attempt to construct the new text cohesively when the plagiarism is inadvertent, and that the plagiarist has made an effort to prevent the reader from identifying the text as plagiarism when it is intentional. In some of these cases, the referencing elements prove able to identify direct quotations and thus ‘betray’ and denounce plagiarism. Finally, we demonstrate that a forensic linguistic analysis of these verbs is critical to allow detection software to identify them as proper paraphrasing and not – mistakenly and simplistically – as plagiarism.
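The paradigmatic substitution pattern described in this abstract can be illustrated with a toy sketch: a tiny hand-made synonym table (standing in for a real lexical resource) and `difflib` alignment flag positions where a source word was replaced by a related, but not identical, one. The texts and the table below are invented for illustration; they are not from the study's corpus.

```python
# Toy sketch of detecting paradigmatic (same-semantic-field) word
# substitution between a source text and a suspect text.
# The synonym table and texts are illustrative, not real data.
from difflib import SequenceMatcher

SYNONYMS = {
    "big": {"large", "huge"},
    "show": {"demonstrate", "reveal"},
    "result": {"outcome", "finding"},
}

def related(a, b):
    """True if the two words are listed as members of the same semantic field."""
    a, b = a.lower(), b.lower()
    return b in SYNONYMS.get(a, set()) or a in SYNONYMS.get(b, set())

def substituted_pairs(source, suspect):
    """Return (source_word, suspect_word) pairs where a word was
    replaced by a related one rather than copied verbatim."""
    src, sus = source.split(), suspect.split()
    pairs = []
    for tag, i1, i2, j1, j2 in SequenceMatcher(None, src, sus).get_opcodes():
        # only consider one-for-one word replacements
        if tag == "replace" and (i2 - i1) == (j2 - j1):
            for a, b in zip(src[i1:i2], sus[j1:j2]):
                if related(a, b):
                    pairs.append((a, b))
    return pairs

src = "the results show a big improvement"
sus = "the results demonstrate a large improvement"
print(substituted_pairs(src, sus))  # [('show', 'demonstrate'), ('big', 'large')]
```

A real system would of course need a broad lexical resource and handling of word-order changes; the sketch only shows the alignment-plus-lexicon idea.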
Abstract:
The surface behaviour of materials is crucial to our everyday lives. Studies of the corrosive, reactive, optical and electronic properties of surfaces are thus of great importance to a wide range of industries, including the chemical and electronics sectors. The surface properties of polymers can also be tuned for use in packaging, non-stick coatings or medical applications. Methods to characterise surface composition and reactivity are thus critical to the development of next-generation materials. This report will outline the basic principles of X-ray photoelectron spectroscopy (XPS) and how it can be applied to analyse the surfaces of inorganic materials. The role of XPS in understanding the nature of the active site in heterogeneous catalysts will also be discussed.
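As background to the basic principle this report outlines, photoelectron energies in XPS obey a simple energy balance, BE = hv - KE - phi, where hv is the X-ray photon energy, KE the measured kinetic energy and phi the spectrometer work function. The sketch below assumes an Al K-alpha source (1486.6 eV) and an illustrative measurement; the kinetic energy and work function values are invented for the example.

```python
# Minimal sketch of the XPS energy balance: BE = h*nu - KE - phi.
# The kinetic energy and work function below are illustrative values.

def binding_energy(photon_ev, kinetic_ev, work_function_ev):
    """Binding energy (eV) of the emitted photoelectron."""
    return photon_ev - kinetic_ev - work_function_ev

# Al K-alpha photon energy is 1486.6 eV; measurement values are invented:
be = binding_energy(1486.6, 1197.0, 4.5)
print(be)  # ~285.1 eV, near the C 1s line commonly used for charge referencing
```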
Abstract:
Parkinson's disease is a complex heterogeneous disorder with urgent need for disease-modifying therapies. Progress in successful therapeutic approaches for PD will require an unprecedented level of collaboration. At a workshop hosted by Parkinson's UK and co-organized by Critical Path Institute's (C-Path) Coalition Against Major Diseases (CAMD) Consortiums, investigators from industry, academia, government and regulatory agencies agreed on the need for sharing of data to enable future success. Government agencies included EMA, FDA, NINDS/NIH and IMI (Innovative Medicines Initiative). Emerging discoveries in new biomarkers and genetic endophenotypes are contributing to our understanding of the underlying pathophysiology of PD. In parallel there is growing recognition that early intervention will be key for successful treatments aimed at disease modification. At present, there is a lack of a comprehensive understanding of disease progression and the many factors that contribute to disease progression heterogeneity. Novel therapeutic targets and trial designs that incorporate existing and new biomarkers to evaluate drug effects independently and in combination are required. The integration of robust clinical data sets is viewed as a powerful approach to hasten medical discovery and therapies, as is being realized across diverse disease conditions employing big data analytics for healthcare. The application of lessons learned from parallel efforts is critical to identify barriers and enable a viable path forward. A roadmap is presented for a regulatory, academic, industry and advocacy driven integrated initiative that aims to facilitate and streamline new drug trials and registrations in Parkinson's disease.
Abstract:
The delegation of public tasks to arm’s-length bodies remains a central feature of contemporary reform agendas within both developed and developing countries. The role and capacity of political and administrative principals (i.e. ministers and departments of state) to control the vast network of arm’s-length bodies for which they are formally responsible is therefore a critical issue within and beyond academe. In the run-up to the 2010 General Election in the United Kingdom, the ‘quango conundrum’ emerged as an important theme and all three major parties committed themselves to shift the balance of power back towards ministers and sponsor departments. This article presents the results of the first major research project to track and examine the subsequent reform process. It reveals a stark shift in internal control relationships from the pre-election ‘poor parenting’ model to a far tighter internal situation that is now the focus of complaints by arm’s-length bodies of micro-management. This shift in the balance of power and how it was achieved offers new insights into the interplay between different forms of governance and has significant theoretical and comparative relevance. Points for practitioners: For professionals working in the field of arm’s-length governance, the article offers three key insights. First, that a well-resourced core executive is critical to directing reform given the challenges of implementing reform in a context of austerity. Second, that those implementing reform will also need to take into account the diverse consequences of centrally imposed reform likely to result in different departments with different approaches to arm’s-length governance. Third, that reforming arm’s-length governance can affect the quality of relationships, and those working in the field will need to mitigate these less tangible challenges to ensure success.
Abstract:
John Bowlby's use of evolutionary theory as a cornerstone of his attachment theory was innovative in its day and remains useful. Del Giudice's target article extends Belsky et al.'s and Chisholm's efforts to integrate attachment theory with more current thinking about evolution, ecology, and neuroscience. His analysis would be strengthened by (1) using computer simulation to clarify and simulate the effects of early environmental stress, (2) incorporating information about non-stress related sources of individual differences, (3) considering the possibility of adaptive behavior without specific evolutionary adaptations, and (4) considering whether the attachment construct is critical to his analysis.
Abstract:
One of the most significant paradigm shifts of modern business management is that individual businesses no longer compete as solely autonomous entities, but rather as supply chains. Firms worldwide have embraced the concept of supply chain management as important, and sometimes critical, to their business. The idea of a collaborative supply chain is to gain a competitive advantage by improving overall performance through measuring the supply chain from a holistic perspective. However, contemporary performance measurement theory is somewhat fragmented and fails to support this idea. This research therefore develops and applies an integrated supply chain performance measurement framework that provides a more holistic approach to the study of supply chain performance measurement by combining both supply chain macro processes and decision making levels. The proposed framework can thus provide a balanced horizontal (cross-process) and vertical (hierarchical decision) view and measure the performance of the entire supply chain system. Firstly, the literature on performance measurement frameworks and performance measurement factors of supply chain management helps to develop a conceptual framework. Next, the proposed framework is presented. The framework is validated through in-depth interviews with three Thai manufacturing companies. The fieldwork combined varied sources in order to understand the views of manufacturers on supply chain performance in the three case study companies. The collected data were analysed, interpreted and reported using thematic analysis and the analytic hierarchy process (AHP), which was influenced by the study’s conceptual framework. This research contributes a new theory of supply chain performance measurement and knowledge on the supply chain characteristics of a developing country, Thailand.
The research also benefits organisations by preparing decision makers to make strategic-, tactical- and operational-level decisions with respect to supply chain macro processes. The results from the case studies also indicate the similarities and differences in their supply chain performance. Furthermore, the implications of the study are offered for both academic and practical use.
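The AHP step this abstract mentions can be sketched in a few lines: priority weights are approximated by the normalised column averages of a pairwise comparison matrix. The comparison values below are invented for illustration; they are not the study's data.

```python
# Minimal sketch of analytic hierarchy process (AHP) priority weights
# via the normalised-column-average approximation. The pairwise
# comparison values are illustrative, not from the study.

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    # normalise each column so it sums to 1, then average across each row
    normalised = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(normalised[i]) / n for i in range(n)]

# Hypothetical comparisons of three supply chain macro processes:
# entry [i][j] says how strongly process i is preferred over process j.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [0.2, 0.5, 1.0],
]
weights = ahp_weights(pairwise)
print(weights)  # the weights sum to 1; the largest weight marks the top priority
```

A full AHP application would also check the consistency ratio of the matrix; the sketch omits that step.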
Abstract:
This research examines the role of the information management process within a process-oriented enterprise, Xerox Ltd. The research approach is based on a post-positive paradigm and has resulted in thirty-five idiographic statements. The three major outcomes are: 1. The process-oriented holistic enterprise is an organisation that requires a long-term management commitment to its development. It depends on the careful management of people, tasks, information and technology. A complex integration of business processes is required; this can be managed through the use of consistent documentation techniques and clarity in the definition of process responsibilities, while management attention to the global metrics and the centralisation of the management of the process model are critical to its success. 2. The role of the information management process within the context of a process-oriented enterprise is to provide flexible and cost-effective application, technological and process support to the business. This is best achieved through a centralisation of the management of information management and of the process model. A business-led approach, combined with the consolidation of the application, information, process and data architectures, is central to providing effective business- and process-focused support. 3. In a process-oriented holistic enterprise, process and information management are inextricably linked. The model of process management depends heavily on information management, whilst the model of information management is totally focused on supporting and creating the process model. The two models are mutually creating - one cannot exist without the other. There is a duality concept of process and information management.
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy, and "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements are about how Xerox organise and manage three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
This thesis has two aims. First, it sets out to develop an alternative methodology for the investigation of risk homeostasis theory (RHT). It is argued that the current methodologies of the pseudo-experimental design and post hoc analysis of road-traffic accident data both have their limitations, and that the newer 'game' type simulation exercises are also, but for different reasons, incapable of testing RHT predictions. The alternative methodology described here is based on the simulation of physical risk with intrinsic reward rather than a 'points pay-off'. The second aim of the thesis is to examine a number of predictions made by RHT through the use of this alternative methodology. Since the pseudo-experimental design and post hoc analysis of road-traffic data are both ill-suited to the investigation of that part of RHT which deals with the role of utility in determining risk-taking behaviour in response to a change in environmental risk, and since the concept of utility is critical to RHT, the methodology reported here is applied to the specific investigation of utility. Attention too is given to the question of which behavioural pathways carry the homeostasis effect, and whether those pathways are 'local' to the nature of the change in environmental risk. It is suggested that investigating RHT through this new methodology holds a number of advantages and should be developed further in an attempt to answer the RHT question. It is suggested too that the methodology allows RHT to be seen in a psychological context, rather than the statistical context that has so far characterised its investigation. The experimental findings reported here are in support of hypotheses derived from RHT and would therefore seem to argue for the importance of the individual and collective target level of risk, as opposed to the level of environmental risk, as the major determinant of accident loss.
Abstract:
1. This report presents the findings of a study of the individual personalities of Naval Officers, Chief Petty Officers and Petty Officers serving in different environments within the Ministry of Defence and the Fleet. This sample was used to establish norms for the Cattell 16 PF Questionnaire, and these are compared with other occupational norms discussed in the literature. 2. The results obtained on psychometric measures were related to other data collected about the work and the formal organisation. This was in its turn related to problems facing the Navy because of changes in technology which have occurred or which are now taking place and are expected to make an impact in the future. 3. A need is recognised for a way of simulating the effects of proposed changes within the manpower field of the Royal Navy, and a simulation model is put forward and discussed. 4. The use of psychometric measures in selection for entry and for special tasks is examined; particular reference is made to problems of group formation in the context of leadership in a technical environment. 5. The control of the introduction of change is discussed in the recognition that people represent an increasingly important resource which is critical to the continuing life of the total organisation. 6. Conclusions are drawn from the various strands of the research and recommendations are made both for line management and for subsequent research programmes.
Abstract:
Several cationic initiator systems were developed and used to polymerise oxetane, with two oxonium ion initiator systems being investigated in depth. The first initiator system was generated by the elimination of a chloride group from a chloromethyl ethyl ether. Adding a carbonyl co-catalyst to a carbocationic centre generated the second initiator system. It was found that the anion used to stabilise the initiator was critical to the initial rate of polymerisation of oxetane, with hexafluoroantimonate resulting in the fastest polymerisations. Both initiator systems could be used at varying monomer to initiator concentrations to control the number average molecular weight, Mn, of the resultant polymer. Both initiator systems showed living characteristics and were used to polymerise further monomers and generate higher molecular weight material and block copolymers. Oxetane and 3,3-dimethyl oxetane can both be polymerised using either oxonium ion initiator system in a variety of DCM or DCM/1,4-dioxane solvent mixtures. The level of 1,4-dioxane does have an impact on the initial rate of polymerisation, with higher levels resulting in lower initial rates of polymerisation but tending to result in higher polydispersities. The level of oligomer formation is also reduced as the level of 1,4-dioxane is increased. 3,3-bis-bromomethyl oxetane was also polymerised, but a large amount of hyperbranching was seen at the bromide site, resulting in a polymer system that was difficult to solvate. Multifunctional initiator systems were also generated using the halide elimination reactions, with some success being achieved with the 1,3,5-tris-bromomethyl-2,4,6-tris-methyl-benzene derived initiator system. This offered some control over the number average molecular weight of the resultant polymer system.
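The control of Mn by the monomer-to-initiator ratio noted in this abstract follows from the standard relation for a living polymerisation, Mn = ([M]0/[I]0) x conversion x M(monomer), neglecting end-group mass. The feed ratio and conversion in the sketch below are illustrative, not values from the thesis.

```python
# Minimal sketch of predicting Mn in a living polymerisation:
#   Mn = ([M]0 / [I]0) * conversion * monomer molar mass
# (end-group mass neglected). Ratio and conversion are illustrative.

OXETANE_MW = 58.08  # g/mol, C3H6O

def predicted_mn(monomer_to_initiator, conversion, monomer_mw=OXETANE_MW):
    """Predicted number average molecular weight (g/mol)."""
    return monomer_to_initiator * conversion * monomer_mw

# A hypothetical 100:1 monomer:initiator feed taken to 90% conversion:
print(predicted_mn(100, 0.90))  # ~5227 g/mol
```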
Abstract:
The thesis contributes to the evolving process of moving the study of Complexity from the arena of metaphor to something real and operational. Acknowledging this phenomenon ultimately changes the underlying assumptions made about working environments and leadership; organisations are dynamic, and so should their leaders be. Dynamic leaders are behaviourally complex. Behavioural Complexity is a product of behavioural repertoire (the range of behaviours) and behavioural differentiation (whereby effective leaders apply the behaviour appropriate to the demands of the situation). Behavioural Complexity was operationalised using the Competing Values Framework (CVF). The CVF is a measure that captures the extent to which leaders demonstrate four behaviours on four quadrants: Control, Compete, Collaborate and Create, which are argued to be critical to all types of organisational leadership. The results provide evidence to suggest that Behavioural Complexity is an enabler of leadership effectiveness; that Organisational Complexity (captured using a new measure developed in the thesis) moderates the relationship between Behavioural Complexity and leadership effectiveness; and that leadership training supports Behavioural Complexity in contributing to leadership effectiveness. Most definitions of leadership come down to changing people’s behaviour. Such definitions have contributed to a popular focus in leadership research on exploring how to elicit change in others, when perhaps some of that attention should have been on eliciting change in the leaders themselves. It is hoped that this research will provoke interest in the factors that cause behavioural change in leaders, which in turn enables leadership effectiveness, and in doing so contribute to a better understanding of leadership in organisations.