22 results for performance implications
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
The purpose of this paper is to examine website adoption and its effects on credit union performance in Ireland over the period 2002 to 2010. While there was a steady increase in web adoption over the period, a sizeable proportion (53%) of credit unions still had no web-based facility in 2010. To gauge web functionality, the researchers accessed all websites in 2010/2011; most sites were classified as informational, with limited transactional options. Panel data techniques are then used to capture the dynamic nature of website diffusion and to investigate the effect of website adoption on cost and performance. The empirical analysis reveals that credit unions with web-based functionality have a reduced spread between the loan and pay-out rate, driven primarily by lower loan rates. This reduced spread, although small, is found to both persist and increase over time.
Abstract:
The lifetime success and performance characteristics of communally reared offspring of wild native Burrishoole (native), ranched native (ranched) and non-native (non-native) Atlantic salmon Salmo salar from the adjacent Owenmore River were compared. Non-native year parr showed a substantial downstream migration, which was not shown by native and ranched parr. This appears to have been an active migration rather than competitive displacement, and may reflect an adaptation to environmental or physiographic conditions within the Owenmore River catchment, where the main nursery habitat is downstream of the spawning area. There were no differences between native and ranched fish in smolt output or adult return. Both of these measures, however, were significantly lower for the non-native group. A greater proportion of the non-native Atlantic salmon was taken in the coastal drift nets compared to the return to the Burrishoole system, probably as a result of the greater size of the non-native fish. The overall lifetime success of the non-native group, from fertilized egg to returning adult, was some 35% of that of the native and ranched groups. The ranched group showed significantly greater male parr maturity, a greater proportion of 1+ year smolts, and differences in sex ratio and in the timing of freshwater entry of returning adults compared to the native group, which may have fitness implications under specific conditions.
Abstract:
Purpose: The aim of this paper is to explore the issues involved in developing and applying performance management approaches within a large UK public sector department, using a multiple stakeholder perspective and an accompanying theoretical framework.
Design/methodology/approach: An initial short questionnaire was used to determine perceptions about the implementation and effectiveness of the new performance management system across the organisation. In total, 700 questionnaires were distributed. Running concurrently with an ethnographic approach, and informed by the questionnaire responses, was a series of semi-structured interviews and focus groups.
Findings: Staff at all levels had an understanding of the new system and perceived it as being beneficial. However, there were concerns that the approach was not continuously managed throughout the year and was in danger of becoming an annual event rather than an ongoing process. Furthermore, the change process seemed to have advanced without corresponding changes to appraisal and reward and recognition systems. Thus, the business objectives were not aligned with motivating factors within the organisation.
Research limitations/implications: Additional research to test the validity and usefulness of the theoretical model, as discussed in this paper, would be beneficial.
Practical implications: The strategic integration of the stakeholder performance measures and scorecards was found to be essential to producing an overall stakeholder-driven strategy within the case study organisation.
Originality/value: This paper discusses in detail the approach adopted and the progress made by one large UK public sector organisation as it attempts to develop better relationships with all of its stakeholders and hence improve its performance. This paper provides a concerted attempt to link theory with practice.
Abstract:
Purpose: Environmental turbulence including rapid changes in technology and markets has resulted in the need for new approaches to performance measurement and benchmarking. There is a need for studies that attempt to measure and benchmark upstream, leading or developmental aspects of organizations. Therefore, the aim of this paper is twofold. The first is to conduct an in-depth case analysis of lead performance measurement and benchmarking leading to the further development of a conceptual model derived from the extant literature and initial survey data. The second is to outline future research agendas that could further develop the framework and the subject area.
Design/methodology/approach: A multiple case analysis involving repeated in-depth interviews with managers in organisational areas of upstream influence in the case organisations.
Findings: It was found that the effect of external drivers for lead performance measurement and benchmarking was mediated by organisational context factors such as level of progression in business improvement methods. Moreover, the legitimation of the business improvement methods used for this purpose, although typical, had been extended beyond their original purpose with the development of bespoke sets of lead measures.
Practical implications: Examples of methods and lead measures are given that can be used by organizations in developing a programme of lead performance measurement and benchmarking.
Originality/value: There is a paucity of in-depth studies relating to the theory and practice of lead performance measurement and benchmarking in organisations.
Abstract:
A variety of short time delays inserted between pairs of subjects were found to affect their ability to synchronize a musical task. The subjects performed a clapping rhythm together from separate sound-isolated rooms via headphones and without visual contact. One-way time delays between pairs were manipulated electronically in the range of 3 to 78 ms. We are interested in quantifying the envelope of time delay within which two individuals produce synchronous performances. The results indicate that there are distinct regimes of mutually coupled behavior, and that 'natural time delay' (delay within the narrow range associated with travel times across spatial arrangements of groups and ensembles) supports the most stable performance. Conditions outside of this envelope, with time delays both below and above it, create characteristic interaction dynamics in the mutually coupled actions of the duo. Trials at extremely short delays (corresponding to unnaturally close proximity) had a tendency to accelerate from anticipation. Synchronization lagged at longer delays (larger than usual physical distances) and produced an increasingly severe deceleration and then deterioration of performed rhythms. The study has implications for music collaboration over the Internet and suggests that stable rhythmic performance can be achieved by 'wired ensembles' across distances of thousands of kilometers.
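The delay range studied maps directly onto physical separation via the speed of sound. A minimal sketch of that conversion, assuming a speed of sound in air of about 343 m/s (roughly 20 °C); the function name is illustrative, not from the study:

```python
# Convert a one-way audio latency to the equivalent acoustic distance,
# i.e. how far apart two performers would have to stand for sound in air
# to arrive with the same delay.
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate value for air at ~20 degrees C

def delay_to_distance_m(delay_ms: float) -> float:
    """Distance (metres) sound travels in `delay_ms` milliseconds."""
    return SPEED_OF_SOUND_M_PER_S * delay_ms / 1000.0

for delay in (3, 30, 78):
    print(f"{delay:>2} ms one-way delay ~ {delay_to_distance_m(delay):.1f} m of acoustic travel")
```

By this rough conversion, the 3-78 ms range spans separations from about one metre to tens of metres, consistent with the paper's framing of 'natural' delay as normal ensemble spacing.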
Abstract:
It has been 25 years since the publication of a comprehensive review of the full spectrum of sales performance drivers. This study takes stock of the contemporary field and synthesizes empirical evidence from the period 1982-2008. The authors revise the classification scheme for sales performance determinants devised by Walker et al. (1977) and estimate both the predictive validity of its sub-categories and the impact of a range of moderators on determinant-sales performance relationships. Based on multivariate causal model analysis, the results yield two major observations: (1) five sub-categories demonstrate significant relationships with sales performance: selling-related knowledge (β = .28), degree of adaptiveness (β = .27), role ambiguity (β = -.25), cognitive aptitude (β = .23) and work engagement (β = .23); (2) these sub-categories are moderated by measurement method, research context, and sales-type variables. The authors identify managerial implications of the results and offer suggestions for further research, including the conjecture that, as the world moves toward a knowledge-intensive economy, salespeople could increasingly function as knowledge brokers. The results support this supposition and indicate how it might inspire future research in the field of personal selling.
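The predictive-validity figures above are aggregated effect sizes pooled across studies. As an illustration of the core aggregation step, here is a minimal sample-size-weighted mean correlation in the Hunter-Schmidt style; the (n, r) pairs are invented for illustration and are not taken from the paper:

```python
# Sample-size-weighted mean correlation: the basic pooling step used in
# Hunter-Schmidt-style meta-analysis of determinant-performance relationships.
def weighted_mean_r(studies):
    """studies: iterable of (sample_size, correlation) pairs."""
    total_n = sum(n for n, _ in studies)
    return sum(n * r for n, r in studies) / total_n

# Hypothetical primary studies relating one determinant to sales performance:
hypothetical = [(120, 0.31), (85, 0.22), (210, 0.29)]
print(round(weighted_mean_r(hypothetical), 3))  # prints 0.281
```

Full psychometric meta-analysis also corrects for artifacts such as measurement unreliability; the sketch shows only the weighting step.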
Abstract:
Purpose: The purpose of this paper is to summarize the accumulated body of knowledge on the performance of new product projects and provide directions for further research.
Design/methodology/approach: Using a refined classification of antecedents of new product project performance, the research results in the literature are meta-analyzed in order to identify the strength and stability of predictor-performance relationships.
Findings: The results reveal that 22 variables have a significant relationship with new product project performance, of which only 12 have a sizable relationship. In order of importance, these factors are the degree of organizational interaction, R&D and marketing interface, general product development proficiency, product advantage, financial/business analysis, technical proficiency, management skill, marketing proficiency, market orientation, technology synergy, project manager competency and launch activities. Of the 34 variables, 16 predictors show potential for moderator effects.
Research limitations/implications: The validity of the results is constrained by publication bias and heterogeneity of performance measures; directions for the presentation of data in future empirical publications are provided.
Practical implications: This study helps new product project managers understand and manage the performance of new product development projects.
Originality/value: This paper provides unique insights into the importance of predictors of new product performance at the project level. Furthermore, it identifies which predictor-performance relations are contingent on other factors.
Abstract:
Drawing on agency and flexible capability perspectives, the authors develop a theoretical framework explaining the impact of ownership structure on organisational flexibility and store performance in retail chains. The researchers argue that franchised stores attract more entrepreneurial managers with more flexible capabilities, and that these managers have a stronger incentive to align their flexible capabilities with the demands of the business environment. A sample of 105 franchised and company-owned stores of an optical retail chain is used to test the hypotheses. The study found strong support for the hypothesis that franchised stores have higher structural flexibility than company-owned stores, but only weak support with respect to operational and strategic flexible capabilities. Furthermore, in line with the study's theoretical framework, it was found that in a highly turbulent business environment, franchised stores perform better than company-owned stores. The paper concludes with a discussion of the implications for theory development and the management of ownership structures in retail chains.
Abstract:
BACKGROUND:
Tissue microarrays (TMAs) are a valuable platform for tissue-based translational research and the discovery of tissue biomarkers. Digitised TMA slides, or TMA virtual slides, are ultra-large digital images and can contain several hundred samples. The processing of such slides is time-consuming, bottlenecking a potentially high-throughput platform.
METHODS:
A High Performance Computing (HPC) platform for the rapid analysis of TMA virtual slides is presented in this study. Using an HP high-performance cluster and a centralised dynamic load-balancing approach, the simultaneous analysis of multiple tissue cores was established. This was evaluated on non-small cell lung cancer TMAs for complex analysis of tissue pattern and immunohistochemical positivity.
RESULTS:
The automated processing of a single TMA virtual slide containing 230 patient samples can be sped up significantly, by a factor of approximately 22, bringing the analysis time down to one minute. Over 90 TMAs can also be analysed simultaneously, greatly accelerating multiplex biomarker experiments.
CONCLUSIONS:
The methodologies developed in this paper provide, for the first time, a genuine high-throughput analysis platform for TMA biomarker discovery that will significantly enhance the reliability and speed of biomarker research. This will have widespread implications in translational tissue-based research.
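The centralised dynamic load-balancing idea can be sketched in miniature: a shared pool of tissue-core tasks that workers pull from as they finish, so no worker idles while cores remain. This is a single-node Python analogue under stated assumptions, not the paper's cluster implementation; `analyse_core` and its scoring are placeholders:

```python
# Minimal sketch of centralised dynamic load balancing for TMA analysis:
# tissue cores go into a shared work pool and worker processes claim the
# next core as soon as they finish the previous one.
from multiprocessing import Pool

def analyse_core(core_id: int) -> tuple[int, float]:
    # Placeholder for per-core image analysis (tissue pattern + IHC positivity).
    score = (core_id * 37 % 100) / 100.0  # stand-in "positivity" value
    return core_id, score

if __name__ == "__main__":
    core_ids = range(230)  # one TMA virtual slide with 230 patient samples
    with Pool(processes=8) as pool:
        # imap_unordered hands tasks out dynamically: fast workers simply
        # take more cores, which is the load-balancing effect.
        results = dict(pool.imap_unordered(analyse_core, core_ids))
    print(len(results))  # 230 cores analysed
```

The dynamic (pull-based) distribution matters because tissue cores vary widely in content and hence in analysis time; a static, even split would leave some workers idle.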
Abstract:
The ancillary (non-sounding) body movements made by expert musicians during performance have been shown to indicate expressive, emotional, and structural features of the music to observers, even if the sound of the performance is absent. If such ancillary body movements are a component of skilled musical performance, then it should follow that acquiring the temporal control of such movements is a feature of musical skill acquisition. This proposition is tested using measures derived from a theory of temporal guidance of movement, “General Tau Theory” (Lee in Ecol Psychol 10:221–250, 1998; Lee et al. in Exp Brain Res 139:151–159, 2001), to compare movements made during performances of intermediate-level clarinetists before and after learning a new piece of music. Results indicate that the temporal control of ancillary body movements made by participants was stronger in performances after the music had been learned and was closer to the measures of temporal control found for an expert musician’s movements. These findings provide evidence that the temporal control of musicians’ ancillary body movements develops with musical learning. These results have implications for other skillful behaviors and nonverbal communication.
Abstract:
Dioxin contamination of the food chain typically occurs when cocktails of combustion residues or polychlorinated biphenyl (PCB) containing oils become incorporated into animal feed. These highly toxic compounds are bioaccumulative, with small amounts posing a major health risk. The ability to identify animal exposure to these compounds prior to their entry into the food chain may be an invaluable tool to safeguard public health. Dioxin-like compounds act by a common mode of action, which suggests that markers or patterns of response may facilitate identification of exposed animals. However, secondary co-contaminating compounds present in typical dioxin sources may affect responses to these compounds. This study has investigated for the first time the potential of a metabolomics platform to distinguish between animals exposed to different sources of dioxin contamination through their diet. Sprague-Dawley rats were given feed containing dioxin-like toxins from hospital incinerator soot, a common PCB oil standard and pure 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) (normalized at 0.1 µg/kg TEQ), and acquired plasma was subsequently biochemically profiled using ultra-high-performance liquid chromatography (UPLC) quadrupole time-of-flight mass spectrometry (QTof-MS). An OPLS-DA model was generated from the acquired metabolite fingerprints and validated, which allowed classification of plasma from individual animals into the four dietary exposure study groups with 97-100% accuracy. A set of 24 ions important to the prediction model, and with levels significantly altered between feeding groups, were positively identified as deriving from eight identifiable metabolites including lysophosphatidylcholine (16:0) and tyrosine. This study demonstrates the enormous potential of metabolomic-based profiling to provide a powerful and reliable tool for the detection of dioxin exposure in food-producing animals.
Abstract:
Cancer registries must provide complete and reliable incidence information with the shortest possible delay, for use in studies such as comparability, clustering, cancer in the elderly and the adequacy of cancer surveillance. Methods of varying complexity are available to registries for monitoring completeness and timeliness. We wished to know which methods are currently in use among cancer registries, and to compare our findings with those of a survey carried out in 2006.
Methods
In the framework of the EUROCOURSE project, and to prepare cancer registries for participation in the ERA-net scheme, we launched a survey on the methods used to assess completeness, and also on the timeliness and methods of dissemination of results by registries. We sent the questionnaire to all general registries (GCRs) and specialised registries (SCRs) active in Europe and within the European Network of Cancer Registries (ENCR).
Results
With a response rate of 66% among GCRs and 59% among SCRs, we obtained data for analysis from 116 registries with a population coverage of ∼280 million. The most common methods used were comparison of trends (79%) and mortality/incidence ratios (more than 60%). More complex methods were used less commonly: capture–recapture by 30%, flow method by 18% and death certificate notification (DCN) methods with the Ajiki formula by 9%.
The median latency for completion of ascertainment of incidence was 18 months. The additional time required for dissemination was of the order of 3-6 months, depending on the method (print or electronic). One-fifth (21%) of registries did not publish results for their own registry but only as a contribution to larger national or international data repositories and publications; this introduced a further delay in the availability of data.
Conclusions
Cancer registries should improve the practice of measuring their completeness regularly and should move from traditional to more quantitative methods. This could also have implications for the timeliness of data publication.
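Of the completeness methods the survey mentions, the mortality/incidence (M/I) ratio is the simplest to compute: an M/I ratio well above what case fatality for the site would predict suggests under-ascertainment of incident cases. A minimal sketch; the counts and the threshold are invented for illustration, not registry data:

```python
# Mortality/incidence (M/I) ratio as a crude completeness check for a
# cancer registry: deaths from a cancer site divided by incident cases
# registered for that site over the same period.
def mi_ratio(deaths: int, incident_cases: int) -> float:
    return deaths / incident_cases

# Hypothetical counts for one cancer site and registration period:
observed = mi_ratio(deaths=410, incident_cases=1000)
print(f"M/I = {observed:.2f}")  # prints: M/I = 0.41

# Illustrative flag: for a site where survival implies an expected M/I
# of about 0.50, a markedly higher observed ratio would point to
# incident cases being missed by the registry.
if observed > 0.55:
    print("possible incompleteness: investigate case ascertainment")
```

In practice the expected M/I level is site- and population-specific, which is why the survey distinguishes this simple check from more formal methods such as capture-recapture and the flow method.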