17 results for Customer Relation Method (CRM)
Abstract:
Customer relationship management (CRM) implementation projects reflect a growing conceptual shift from the traditional engineering view of projects. Such projects are complex and risky because they call for both organisational and technological changes. This requires effective project management across various phases of the implementation process. However, few empirical studies have dealt with these project management issues. The aim of this research is to investigate how a “project team” manages CRM implementation projects successfully, across the different phases of the implementation process. We conducted an in-depth case study of the “Firm-Clients Branch” of a large telecommunications company in France. The findings show that, to manage CRM implementation projects successfully, an integrated and balanced approach is required. This involves appropriate system selection, effective process re-engineering and further development of organisational structures. We highlight the need for a “technochange approach” to achieve successful organisational transition and effective CRM implementation. The study reveals that the project team plays a central role throughout the implementation phases. Furthermore, the effectiveness of technochange depends on project team performance, technology efficiency and close coordination with stakeholders.
Abstract:
Research on large firms suggests that dedicated customer relationship management (CRM) software applications play a critical role in creating and sustaining customer relationships. CRM is also of strategic importance to small and medium-sized enterprises (SMEs), but most of them do not employ dedicated CRM software. Instead they use generic Internet-based technologies to manage customer relationships with electronic CRM (eCRM). There has been little research on the extent to which the use of generic Internet technologies contributes to SME performance. The present study fills the gap, building upon the literature on organizational capabilities, marketing, and SMEs to develop a research model with which to explore the relationships between generic Internet technologies, eCRM capabilities, and the resulting performance benefits in the SME context. A survey across 286 SMEs in Ireland finds strong empirical evidence in support of the hypotheses regarding these benefits. The study contributes to managerial decision making by showing how SMEs can use generic Internet technologies to advance their customer relationships and contributes to theory development by conceptualizing eCRM capabilities in an SME context.
Abstract:
This study examines the role of capabilities in core marketing-related business processes–product development management (PDM), supply chain management (SCM) and customer relationship management (CRM)–in translating a firm’s market orientation (MO) into firm performance. The study is the first to examine the interplay of all three business process capabilities simultaneously, while investigating how environmental conditions moderate their performance effects. A moderated mediation analysis of 468 product-focused firms finds that PDM and CRM process capabilities play important mediating roles, whereas SCM process capability does not mediate the relationship between MO and performance. However, the relative importance of the capabilities as mediators varies with the degree of environmental turbulence, and under certain conditions, an increase in the level of business process capability may even turn detrimental.
Abstract:
Wireless Mesh Networks (WMNs) have emerged as a key technology for the next generation of wireless networking. Rather than being just another type of ad-hoc networking, WMNs diversify the capabilities of ad-hoc networks. Many kinds of protocols work over WMNs, such as IEEE 802.11a/b/g, 802.15 and 802.16. To achieve high throughput under varying conditions, these protocols have to adapt their transmission rate. Although rate adaptation is a significant part of this, only a few algorithms, such as Auto Rate Fallback (ARF) or Receiver Based Auto Rate (RBAR), have been published. In this paper we show that MAC, packet loss and physical layer conditions play an important role in maintaining good channel conditions. We also perform rate adaptation along with multiple packet transmission for better throughput. By allowing for dynamically monitored, multiple packet transmission and adapting to changes in channel quality by adjusting the packet transmission rates according to certain optimization criteria, improvements in performance can be obtained. The proposed method detects channel congestion by measuring the fluctuation of the signal relative to its standard deviation, and detects packet loss before channel performance diminishes. We show that the use of such techniques in WMNs can significantly improve performance. The effectiveness of the proposed method is demonstrated in an experimental wireless network testbed via packet-level simulation. Our simulation results show that, regardless of the channel condition, we were able to improve throughput.
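The congestion-detection idea in this abstract (comparing a signal's recent fluctuation against its standard deviation before adjusting the transmission rate) can be sketched as a simple heuristic. The rate table, window handling, and threshold `k` below are illustrative assumptions, not the authors' algorithm.

```python
import statistics

# Candidate 802.11g-style rates in Mbit/s (illustrative selection).
RATES = [6, 9, 12, 18, 24, 36, 48, 54]

def adapt_rate(rssi_window, current_index, k=1.5):
    """Step the rate index down when the latest signal sample deviates
    from the window mean by more than k standard deviations (a crude
    congestion signal), otherwise step it up. Hypothetical heuristic
    in the spirit of the abstract, not the published method."""
    mean = statistics.fmean(rssi_window)
    std = statistics.pstdev(rssi_window)
    fluctuation = abs(rssi_window[-1] - mean)
    if std > 0 and fluctuation > k * std:
        return max(current_index - 1, 0)           # likely congestion: slow down
    return min(current_index + 1, len(RATES) - 1)  # stable channel: speed up
```

A stable window of identical RSSI samples steps the rate up, while a window ending in a sharp outlier steps it down.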
Abstract:
This thesis examines the innovative performance of 206 U.S. business service firms. Undeniably, a need exists for better comprehension of the service sector of developed economies. This research takes a unique view by applying a synthesis approach to studying innovation and attempts to build upon a proposed strategic innovation paradigm. A quantitative method is utilised via questionnaire in which all major types of innovation are under examination, including product and service, organisational, and technology-driven innovations. Essential ideas for this conceptual framework encapsulate a new mode of understanding service innovation. Basically, the structure of this analysis encompasses the likelihood of innovation and the extent of innovation, while also attempting to shed light on the factors which determine the impact of innovation on performance among service firms. What differentiates this research is its focus on customer-driven service firms in addition to other external linkages. A synopsis of the findings suggests that external linkages, particularly with customers, suppliers and strategic alliances or joint ventures, significantly affect innovation performance with regard to the introduction of new services. Service firms which incorporate formal and informal R&D experience significant increases in the extent of new-to-market and new-to-firm innovations. Additionally, the results show that customer-driven service firms experience greater productivity and growth. Furthermore, the findings suggest that external linkages assist service firm performance.
Abstract:
This thesis focused on applying theoretical models of synchronization to cortical dynamics as measured by magnetoencephalography (MEG). Dynamical systems theory was used both in identifying relevant variables for brain coordination and in devising methods for their quantification. We presented a method for studying interactions of linear and chaotic neuronal sources using MEG beamforming techniques. We showed that such sources can be accurately reconstructed in terms of their location, temporal dynamics and possible interactions. Synchronization in low-dimensional nonlinear systems was studied to explore specific correlates of functional integration and segregation. In the case of interacting dissimilar systems, relevant coordination phenomena involved generalized and phase synchronization, which were often intermittent. Spatially-extended systems were then studied. For locally-coupled dissimilar systems, as in the case of cortical columns, clustering behaviour occurred. Synchronized clusters emerged at different frequencies and their boundaries were marked by oscillation death. The macroscopic mean field revealed sharp spectral peaks at the frequencies of the clusters and broader spectral drops at their boundaries. These results question existing models of Event Related Synchronization and Desynchronization. We re-examined the concept of the steady-state evoked response following an AM stimulus. We showed that very little variability in the AM following response could be accounted for by system noise. We presented a methodology for detecting local and global nonlinear interactions from MEG data in order to account for residual variability. We found cross-hemispheric nonlinear interactions of ongoing cortical rhythms concurrent with the stimulus, and interactions of these rhythms with the following AM responses.
Finally, we hypothesized that holistic spatial stimuli would be accompanied by the emergence of clusters in primary visual cortex, resulting in frequency-specific MEG oscillations. Indeed, we found different frequency distributions in induced gamma oscillations for different spatial stimuli, which was suggestive of temporal coding of these spatial stimuli. Further, we addressed the bursting character of these oscillations, which was suggestive of intermittent nonlinear dynamics. However, we did not observe the characteristic -3/2 power-law scaling in the distribution of interburst intervals. Further, this distribution was only seldom significantly different from the one obtained in surrogate data, where nonlinear structure was destroyed. In conclusion, the work presented in this thesis suggests that advances in dynamical systems theory, in conjunction with developments in magnetoencephalography, may facilitate a mapping between levels of description in the brain. This may potentially represent a major advancement in neuroscience.
Abstract:
The Octopus Automated Perimeter was validated in a comparative study and found to offer many advantages in the assessment of the visual field. The visual evoked potential was investigated in an extensive study using a variety of stimulus parameters to simulate hemianopia and central visual field defects. The scalp topography was recorded and a technique to compute the source derivation of the scalp potential was developed. This enabled clarification of the expected scalp distribution to half field stimulation using different electrode montages. The visual evoked potential following full field stimulation was found to be asymmetrical around the midline, with a bias over the left occiput, particularly when the foveal polar projections of the occipital cortex were preferentially stimulated. The half field response reflected the distribution asymmetry. Masking of the central 3° resulted in a response which was approximately symmetrical around the midline, but there was no evidence of the PNP-complex. A method for visual field quantification was developed based on the neural representation of visual space (Drasdo and Peaston 1982) in an attempt to relate visual field deprivation with the resultant visual evoked potentials. There was no form of simple, diffuse summation between the scalp potential and the cortical generators. It was, however, possible to quantify the degree of scalp potential attenuation for M-scaled full field stimuli. The results obtained from patients exhibiting pre-chiasmal lesions suggested that the PNP-complex is not scotomatous in nature but confirmed that it is most likely to be related to specific diseases (Harding and Crews 1982). There was a strong correlation between the percentage information loss of the visual field and the diagnostic value of the visual evoked potential in patients exhibiting chiasmal lesions.
Abstract:
Background: Cardiovascular disease (CVD) is partially attributed to traditional cardiovascular risk factors, which can be identified and managed using risk stratification algorithms (Framingham Risk Score, National Cholesterol Education Program, Systematic Cardiovascular Risk Evaluation and Reynolds Risk Score). We aimed to (a) identify the proportion of at-risk patients with rheumatoid arthritis (RA) requiring statin therapy identified by conventional risk calculators, and (b) assess whether patients at risk were receiving statins. Methods: Patients at high CVD risk (excluding patients with established CVD or diabetes) were identified from a cohort of 400 well characterised patients with RA, by applying risk calculators with or without a ×1.5 multiplier in specific patient subgroups. Actual statin use versus the number eligible for statins was also calculated. Results: The percentage of patients identified as being at risk varied considerably depending on the method, from 1.6% (for 20% threshold global CVD risk) to 15.5% (for CVD and cerebrovascular morbidity and mortality), 21.8% (for 10% global CVD risk) and 25.9% (for 5% CVD mortality), with the majority of them (58.1% to 94.8%) not receiving statins. The application of a ×1.5 multiplier identified 17% to 78% more at-risk patients. Conclusions: Depending on the risk stratification method, 2% to 26% of patients with RA without CVD have sufficiently high risk to require statin therapy, yet most of them remain untreated. To address this issue, we would recommend annual systematic screening using the nationally applicable risk calculator, combined with regular audit of whether treatment targets have been achieved.
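The screening step described in the Methods (apply a risk calculator, optionally scale by the ×1.5 RA multiplier, and flag patients whose risk meets a threshold) reduces to a small check. The function name is hypothetical and the 20% default threshold is taken from one of the cut-offs above as an illustration; this is a sketch, not the study's protocol.

```python
def needs_statin(ten_year_cvd_risk, apply_ra_multiplier, threshold=0.20):
    """Flag a patient as eligible for statin therapy when the
    (optionally x1.5-adjusted) 10-year CVD risk meets the threshold.
    Hypothetical helper illustrating the abstract's screening logic."""
    risk = ten_year_cvd_risk * 1.5 if apply_ra_multiplier else ten_year_cvd_risk
    return risk >= threshold

# A 15% baseline risk crosses the 20% threshold only once the
# RA multiplier is applied (0.15 * 1.5 = 0.225), which is how the
# multiplier can identify substantially more at-risk patients.
```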
Abstract:
Purpose: Previous research has emphasized the pivotal role that salespeople play in customer satisfaction. In this regard, the relationship between salespeople's attitudes, skills, and characteristics, and customer satisfaction remains an area of interest. The paper aims to make three contributions: first, it seeks to examine the impact of salespeople's satisfaction, adaptive selling, and dominance on customer satisfaction. Second, this research aims to use dyadic data, which provides a better test of the relationships between constructs since it avoids common method variance. Finally, in contrast to previous research, it aims to test all of the customers of salespeople rather than customers selected by salespeople. Design/methodology/approach: The study employs multilevel analysis to examine the effect of salespeople's satisfaction with the firm on customer satisfaction, using a dyadic, matched business-to-business sample from a large European financial service provider that comprises 188 customers and 18 employees. Findings: The paper finds that customers' evaluations of service quality, product quality, and value influence customer satisfaction. The analysis at the selling firm's employee level shows that adaptive selling and employee satisfaction positively impact customer satisfaction, while dominance is negatively related to customer satisfaction. Practical implications: Research shows that customer focus is a key driver of success in service companies. Customer satisfaction is regarded as a prerequisite for establishing long-term, profitable relations between company and customer, and customer contact employees are key to nurturing this relationship. The role of salespeople's attitudes, skills, and characteristics in the customer satisfaction process is highlighted in this paper. Originality/value: The use of dyadic, multilevel studies to assess the nature of the relationship between employees and customers is, to date, surprisingly limited.
The paper examines the link between employee attitudes, skills, and characteristics, and customer satisfaction in a business-to-business setting in the financial service sector, differentiating between customer- and employee-level drivers of business customer satisfaction.
Abstract:
In current organizations, valuable enterprise knowledge is often buried under a rapidly expanding amount of unstructured information in the form of web pages, blogs, and other forms of human text communication. We present a novel unsupervised machine learning method called CORDER (COmmunity Relation Discovery by named Entity Recognition) to turn these unstructured data into structured information for knowledge management in these organizations. CORDER exploits named entity recognition and co-occurrence data to associate individuals in an organization with their expertise and associates. We discuss the problems associated with evaluating unsupervised learners and report our initial evaluation experiments: an expert evaluation, quantitative benchmarking, and an application of CORDER in a social networking tool called BuddyFinder.
Abstract:
Discovering who works with whom, on which projects and with which customers is a key task in knowledge management. Although most organizations keep models of organizational structures, these models do not necessarily accurately reflect the reality on the ground. In this paper we present a text mining method called CORDER which first recognizes named entities (NEs) of various types from Web pages, and then discovers relations from a target NE to other NEs which co-occur with it. We evaluated the method on our departmental Website. We used the CORDER method to first find related NEs of four types (organizations, people, projects, and research areas) from Web pages on the Website and then rank them according to their co-occurrence with each of the people in our department. Twenty representative people were selected and each of them was presented with ranked lists of each type of NE. Each person specified whether these NEs were related to him/her and changed or confirmed their rankings. Our results indicate that the method can find the NEs with which these people are closely related and provide accurate rankings.
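The ranking stage described above can be illustrated with a minimal co-occurrence counter: given documents represented as sets of already-recognized NEs, rank candidate NEs by how often they appear alongside a target person. The data and function name are invented for illustration and omit CORDER's NER and relevance-weighting details.

```python
from collections import Counter

def rank_related_entities(target, documents):
    """Rank named entities by how often they co-occur with `target`
    across documents. Each document is a set of entity strings; a toy
    stand-in for CORDER's recognition and ranking stages."""
    counts = Counter()
    for entities in documents:
        if target in entities:
            counts.update(e for e in entities if e != target)
    return [entity for entity, _ in counts.most_common()]

# Toy corpus: "Project X" co-occurs with Alice twice, the others once,
# so it ranks first in Alice's related-entity list.
docs = [
    {"Alice", "Project X", "KMi"},
    {"Alice", "Project X", "Bob"},
    {"Bob", "Project Y"},
]
```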
Abstract:
In this paper, we propose a text mining method called LRD (latent relation discovery), which extends the traditional vector space model of document representation in order to improve information retrieval (IR) on documents and document clustering. Our LRD method extracts terms and entities, such as person, organization, or project names, and discovers relationships between them by taking into account their co-occurrence in textual corpora. Given a target entity, LRD discovers other entities closely related to the target effectively and efficiently. With respect to such relatedness, a measure of relation strength between entities is defined. LRD uses relation strength to enhance the vector space model, and uses the enhanced vector space model for query-based IR on documents and for clustering documents in order to discover complex relationships among terms and entities. Our experiments on a standard dataset for query-based IR show that our LRD method performed significantly better than the traditional vector space model and five other standard statistical methods for vector expansion.
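The vector-expansion idea (adding entities related to the query's terms, weighted by relation strength) can be sketched over simple dictionary vectors. The `alpha` weighting scheme and the relation-strength values below are assumptions for illustration; LRD's actual scoring is defined in the paper.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse term->weight vectors."""
    dot = sum(u.get(term, 0.0) * weight for term, weight in v.items())
    norm_u = math.sqrt(sum(w * w for w in u.values()))
    norm_v = math.sqrt(sum(w * w for w in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def expand_query(query_vec, relation_strength, alpha=0.5):
    """Add to the query any entity related to one of its terms, with
    weight alpha * strength * original term weight. A toy version of
    vector expansion via relation strength; the scheme is assumed."""
    expanded = dict(query_vec)
    for term, weight in query_vec.items():
        for related, strength in relation_strength.get(term, {}).items():
            expanded[related] = expanded.get(related, 0.0) + alpha * strength * weight
    return expanded
```

With a hypothetical relation table linking "crm" to a related entity, the expanded query matches documents that mention only the related entity, which the unexpanded query would miss entirely.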
Abstract:
Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. 
Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission, recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), which favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even when there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals. Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and one measure of staff perceptions of organisational climate.
There was no additional effect of SPI1 on other targeted issues nor on other measures of generic organisational strengthening.
Abstract:
This paper considers the contemporary use of focus groups as a method of data collection within qualitative research settings. The authors draw upon their own experiences of using focus groups in educational and 'community' user-group environments in order to provide an overview of recent issues and debates surrounding the deployment of focus group methods and to pick out specific areas of contention in relation to both their epistemological and practical implications. Accordingly, the paper reflects on some of the realities of 'doing' focus groups whilst, at the same time, highlighting common problems and dilemmas which beginning researchers might encounter in their application. In turn, the paper raises a number of related issues around which there appears to have been a lack of academic discussion to date.