74 results for service performance


Relevance: 30.00%

Abstract:

The total thermoplastics pipe market in western Europe is estimated at 900,000 metric tonnes for 1977 and is projected to grow to some 1.3 million tonnes of predominantly PVC and polyolefins pipe by 1985. By that time, polyethylene for gas distribution pipe and fittings will represent some 30% of the total polyethylene pipe market. The performance characteristics of a high density polyethylene are significantly influenced by both molecular weight and type of comonomer, the major influences being on the long-term hoop stress resistance and the environmental stress cracking resistance. Minor amounts of hexene-1 are more effective than comonomers lower in the homologous series, although there is some sacrifice of density-related properties. A synergistic improvement is obtained by combining molecular weight increase with copolymerisation. The long-term design strength of polyethylene copolymers can be determined from hoop stress measurement at elevated temperatures; by means of a separation factor of approximately 22, extrapolation can be made to room-temperature performance in a water environment. A polyethylene of black composition has a sufficiently improved performance over yellow pigmented pipe to cast doubt on the validity of internationally specifying yellow-coded pipe for gas distribution service. The chemical environment (condensate formation) that can exist in natural gas distribution networks has a deleterious effect on pipe performance, the reduction amounting to at least two decades in log time. Desorption of such condensate is very slow, and the influence of the more aggressive aromatic components is to lead to premature stress cracking. For natural gas distribution purposes, the design stress rating should be 39 kg/cm² for polyethylenes in the molecular weight range of 150,000-200,000 and 55 kg/cm² for higher molecular weight materials.
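
The elevated-temperature extrapolation described above can be illustrated with a small sketch (Python here and throughout). It assumes the separation factor of roughly 22 acts as a multiplicative shift on failure time, which the abstract does not spell out; all figures are purely illustrative.

```python
def room_temp_failure_time(t_elevated_hours, separation_factor=22.0):
    """Extrapolate an elevated-temperature hoop-stress failure time to
    room temperature. Assumes the separation factor acts as a simple
    multiplicative shift on failure time -- an assumption, since the
    abstract does not state the exact form of the extrapolation."""
    return t_elevated_hours * separation_factor

# Illustrative only: a failure after 1,000 h at elevated temperature
# would be credited with ~22,000 h (about 2.5 years) at room temperature.
t_room = room_temp_failure_time(1_000)
print(f"{t_room:,.0f} h  (~{t_room / 8760:.1f} years)")
```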

Relevance: 30.00%

Abstract:

Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system and identify the reasons for the differences between theoretical expectations and operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices: Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis: theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempt to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention in an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
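
As a toy illustration of the two control indices, the sketch below simulates a simple reorder-point policy and reports Service Level (fill rate) and average stock. The policy and every parameter are invented for illustration; they are not the system studied in the thesis.

```python
import random

def simulate(reorder_point, order_qty, mean_demand=10.0, periods=5000,
             lead_time=3, seed=42):
    """Toy reorder-point simulation reporting the two control indices in
    the abstract: Service Level (fraction of demand met from stock) and
    Average Stock Value (here, average units on hand)."""
    rng = random.Random(seed)
    on_hand, on_order = float(order_qty), []      # on_order: (arrival, qty)
    met = total = stock_sum = 0.0
    for t in range(periods):
        on_hand += sum(q for a, q in on_order if a <= t)   # receive orders
        on_order = [(a, q) for a, q in on_order if a > t]
        demand = rng.expovariate(1.0 / mean_demand)
        met, total = met + min(on_hand, demand), total + demand
        on_hand = max(on_hand - demand, 0.0)
        if on_hand + sum(q for _, q in on_order) <= reorder_point:
            on_order.append((t + lead_time, order_qty))    # place an order
        stock_sum += on_hand
    return met / total, stock_sum / periods

service_level, avg_stock = simulate(reorder_point=40, order_qty=80)
print(f"Service Level: {service_level:.1%}, average stock: {avg_stock:.1f}")
```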

Relevance: 30.00%

Abstract:

Service-based systems that are dynamically composed at run time to provide complex, adaptive functionality are currently one of the main development paradigms in software engineering. However, the Quality of Service (QoS) delivered by these systems remains an important concern, and needs to be managed in an equally adaptive and predictable way. To address this need, we introduce a novel, tool-supported framework for the development of adaptive service-based systems called QoSMOS (QoS Management and Optimisation of Service-based systems). QoSMOS can be used to develop service-based systems that achieve their QoS requirements through dynamically adapting to changes in the system state, environment and workload. QoSMOS service-based systems translate high-level QoS requirements specified by their administrators into probabilistic temporal logic formulae, which are then formally and automatically analysed to identify and enforce optimal system configurations. The QoSMOS self-adaptation mechanism can handle reliability- and performance-related QoS requirements, and can be integrated into newly developed solutions or legacy systems. The effectiveness and scalability of the approach are validated using simulations and a set of experiments based on an implementation of an adaptive service-based system for remote medical assistance.
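
To make the translation step concrete, here is a hedged sketch: a reliability requirement written as a PCTL-style formula, and a selection among candidate configurations whose analysis results are invented. The concrete formulae and tooling used by QoSMOS are not given in the abstract.

```python
# A hypothetical requirement in PCTL style, loosely in the spirit of the
# QoSMOS translation step: "the probability that a request eventually
# completes successfully must be at least 0.98".
REQUIREMENT = ("P>=0.98 [ F success ]", 0.98)

# Results a probabilistic model checker might return, one analysis run
# per candidate configuration -- illustrative values only.
candidates = {
    "config_A": {"p_success": 0.995, "cost": 3.0},
    "config_B": {"p_success": 0.981, "cost": 1.5},
    "config_C": {"p_success": 0.950, "cost": 1.0},
}

# Enforce the requirement, then pick the cheapest compliant configuration.
formula, threshold = REQUIREMENT
compliant = {k: v for k, v in candidates.items() if v["p_success"] >= threshold}
best = min(compliant, key=lambda k: compliant[k]["cost"])
print(f"{formula}: choosing {best}")    # config_B
```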

Relevance: 30.00%

Abstract:

Dedicated short range communications (DSRC) has been regarded as one of the most promising technologies for providing robust communications in large-scale vehicle networks. It is designed to support both road safety and commercial applications. Road safety applications require reliable and timely wireless communications. However, as the medium access control (MAC) layer of DSRC is based on the IEEE 802.11 distributed coordination function (DCF), it is well known that this random channel access based MAC cannot provide guaranteed quality of service (QoS). It is therefore important to understand the quantitative performance of DSRC in order to make better decisions on its adoption, control, adaptation, and improvement. In this paper, we propose an analytic model to evaluate DSRC-based inter-vehicle communication. We investigate the impacts of the channel access parameters associated with the different services, including the arbitration inter-frame space (AIFS) and the contention window (CW). Based on the proposed model, we analyze the successful message delivery ratio and channel service delay for broadcast messages. The proposed analytical model provides a convenient tool to evaluate inter-vehicle safety applications and to analyze the suitability of DSRC for road safety applications.
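
The AIFS differentiation mentioned above follows the standard EDCA relation AIFS[AC] = aSIFSTime + AIFSN[AC] × aSlotTime. A minimal sketch, using timing values commonly cited for the 802.11p 10 MHz PHY; the AIFSN/CW settings per class are invented for illustration:

```python
def aifs_us(aifsn, slot_time_us=13, sifs_us=32):
    """AIFS[AC] = aSIFSTime + AIFSN[AC] * aSlotTime (IEEE 802.11 EDCA).
    Default timings are those commonly cited for 802.11p's 10 MHz PHY."""
    return sifs_us + aifsn * slot_time_us

# A smaller AIFSN and CWmin give safety messages earlier channel access
# than lower-priority service traffic; the values here are illustrative.
access_categories = {
    "safety (high priority)": {"AIFSN": 2, "CWmin": 3},
    "service (low priority)": {"AIFSN": 9, "CWmin": 15},
}
for name, p in access_categories.items():
    print(f"{name}: AIFS = {aifs_us(p['AIFSN'])} us, CWmin = {p['CWmin']}")
```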

Relevance: 30.00%

Abstract:

The tendency of managers to focus on short-term results rather than on sustained company success is of particular importance to retail marketing managers, because marketing activities involve expenditures which may only pay off in the longer term. To address the issue of myopic management, our study shows how the complexity of the service profit chain (SPC) can cause managers to make suboptimal decisions. Hence, our paper departs from past research by recognizing that understanding the temporal interplay between operational investments, employee satisfaction, customer satisfaction, and operating profit is essential to achieving sustained success. In particular, we intend to improve understanding of the functioning of the SPC with respect to time lags and feedback loops. Results of our large-scale longitudinal study set in a multi-outlet retail chain reveal time-lag effects between operational investments and employee satisfaction, as well as between customer satisfaction and performance. These findings, along with evidence of a negative interaction effect of employee satisfaction on the relationship between current performance and future investments, show the substantial risk of mismanaging the SPC. We identify specific situations in which the dynamic approach leads to superior marketing investment decisions, when compared to the conventional static view of the SPC. These insights provide valuable managerial guidance for effectively managing the SPC over time. © 2012 New York University.
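
As a minimal illustration of the time-lag idea, and not the study's actual model, the sketch below regresses current outlet performance on the previous period's customer satisfaction using synthetic data with a built-in one-period lag:

```python
import numpy as np
import pandas as pd

# Synthetic data: performance depends on *last* period's satisfaction.
rng = np.random.default_rng(0)
n = 200
sat = rng.normal(4.0, 0.5, n)                          # customer satisfaction
perf = 10 + 3 * np.roll(sat, 1) + rng.normal(0, 1, n)  # lagged effect + noise

df = pd.DataFrame({"sat_lag1": pd.Series(sat).shift(1), "perf": perf}).dropna()
slope = np.polyfit(df["sat_lag1"], df["perf"], 1)[0]
print(f"Estimated effect of last period's satisfaction: {slope:.2f}")  # ~3
```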

Relevance: 30.00%

Abstract:

The IEEE 802.11 standard has achieved huge success in the past decade and is still under development to provide higher physical data rates and better quality of service (QoS). An important problem for the development and optimization of IEEE 802.11 networks is the modeling of the MAC layer channel access protocol. Although there are already many theoretical analyses of the 802.11 MAC protocol in the literature, most of the models focus on saturated traffic and assume an infinite buffer at the MAC layer. In this paper we develop a unified analytical model for the IEEE 802.11 MAC protocol in ad hoc networks. The impacts of channel access parameters, traffic rate and buffer size at the MAC layer are modeled with the assistance of a generalized Markov chain and an M/G/1/K queue model. The throughput, packet delivery delay and dropping probability can be obtained from the model. Extensive simulations show the analytical model is highly accurate. The model shows that for practical buffer configurations (e.g., buffer size larger than one), we can maximize the total throughput and reduce the packet blocking probability (due to limited buffer size) and the average queuing delay to zero by effectively controlling the offered load. The average MAC layer service delay, as well as its standard deviation, is also much lower than in saturated conditions and has an upper bound. It is also observed that the optimal load is very close to the maximum achievable throughput regardless of the number of stations or buffer size. Moreover, the model is scalable for performance analysis of 802.11e in unsaturated conditions and of 802.11 ad hoc networks with heterogeneous traffic flows. © 2012 KSI.
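
The paper models the MAC buffer as an M/G/1/K queue. As a simpler, closed-form stand-in (a substitution of mine, not the paper's), the M/M/1/K blocking probability below illustrates how keeping the offered load below capacity drives blocking towards zero:

```python
def mm1k_blocking(rho, K):
    """Blocking probability of an M/M/1/K queue:
    P_K = (1 - rho) * rho**K / (1 - rho**(K + 1)) for rho != 1,
    and P_K = 1 / (K + 1) at rho = 1. Used here only as a closed-form
    illustration; the paper's model is M/G/1/K."""
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (K + 1)
    return (1 - rho) * rho**K / (1 - rho**(K + 1))

for rho in (0.5, 0.9, 1.0, 1.2):
    print(f"rho = {rho:.1f}: P(block, K=10) = {mm1k_blocking(rho, 10):.4f}")
```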

Relevance: 30.00%

Abstract:

WiMAX has been introduced as a competitive alternative among metropolitan broadband wireless access technologies. It is connection-oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and the flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging, since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM is that it can efficiently model a large-scale system in which the number of stations or connections is generally very high, while traditional simulation and analytical (e.g., Markov model) approaches cannot perform well due to their high computational complexity. We model the bandwidth management scheme as a queuing network model (QNM) that consists of interacting multiclass queues for the different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
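
A hedged sketch of the partial-sharing idea: each service class holds a reserved slice of bandwidth and overflows into a common shared pool. The admission rule, class names and figures below are illustrative assumptions, not the WPSS specification:

```python
def admit(cls, in_use, reserved, shared_capacity):
    """Toy admission test in the spirit of partial sharing: a class uses
    its reserved channels first, then overflows into a shared pool."""
    if in_use[cls] < reserved[cls]:
        in_use[cls] += 1
        return True
    shared_used = sum(max(0, in_use[c] - reserved[c]) for c in in_use)
    if shared_used < shared_capacity:
        in_use[cls] += 1
        return True
    return False                                   # connection blocked

reserved = {"UGS": 4, "rtPS": 3, "BE": 1}          # reserved channels/class
in_use = {c: 0 for c in reserved}
arrivals = ["UGS", "UGS", "rtPS", "BE", "BE", "BE", "BE", "rtPS"]
for c in arrivals:                                 # 4th BE should block
    print(c, "admitted" if admit(c, in_use, reserved, 2) else "blocked")
```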

Relevance: 30.00%

Abstract:

Korea has increasingly adopted design-build for public construction projects in the last few years. There is much greater awareness of the need to move to a system based on 'Value for Money', which is high on the government's agenda. A whole life performance bid evaluation model is proposed to aid decision makers in the selection of a design-builder. It is based on a framework integrating an analytic hierarchy process, as the bid awarding system is being changed from one based on lowest price to one based on best value over the life-cycle. Key criteria such as whole life cost, service life planning and design quality are important throughout the key stages of the evaluation process. The model uses a systematic and holistic approach which enables the public sector to make better decisions in design-builder selection, decisions that will deliver whole life benefits based on long-term, whole-life cost-effectiveness.
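
The core AHP step, deriving criterion weights from a pairwise comparison matrix via its principal eigenvector, can be sketched as follows. The criteria come from the abstract; the judgment values in the matrix are invented for illustration.

```python
import numpy as np

# Pairwise comparison matrix over the abstract's key criteria; entry
# A[i][j] says how much more important criterion i is than criterion j.
criteria = ["whole life cost", "service life planning", "design quality"]
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 1/2],
              [1/2, 2.0, 1.0]])

# AHP weights = normalized principal eigenvector of the matrix.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()
for c, w in zip(criteria, weights):
    print(f"{c}: {w:.3f}")
```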

Relevance: 30.00%

Abstract:

Reputation is a signalling device that serves as a proxy for the quality of a firm's products, strategies and employees relative to its competitors when communicating with clients and other stakeholders. It is especially important for professional service firms because of the complex and intangible nature of their service and because of the advantages it confers in the market for high-quality professional staff. This paper extends and refines existing research on reputation, which shows positive returns to reputation for professional service firms. We use different rankings of the top 50 law firms in the UK to measure reputation and examine its relationship with financial performance as expressed in firm revenue and profits. We find positive but diminishing returns to reputation even within this group, and we find a stronger relationship between reputation and profits than between reputation and fee income. We conclude that reputation may be an important source of competitive advantage for leading firms, but it seems to offer little leverage for others. If these results are generalizable across other professional sectors, this raises the question of how the majority of firms can differentiate themselves.
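
One conventional way to test for "positive but diminishing" returns is a quadratic specification, sketched below on synthetic data; the study's actual estimation strategy is not described in the abstract.

```python
import numpy as np

# Regress profit on reputation and reputation squared; diminishing
# returns show up as a positive linear and negative quadratic term.
rng = np.random.default_rng(1)
reputation = rng.uniform(0, 10, 50)               # e.g. a ranking score
profit = 5 + 2.0 * reputation - 0.12 * reputation**2 + rng.normal(0, 1, 50)

X = np.column_stack([np.ones(50), reputation, reputation**2])
b = np.linalg.lstsq(X, profit, rcond=None)[0]
print(f"linear: {b[1]:+.2f}, quadratic: {b[2]:+.3f} (diminishing if < 0)")
```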

Relevance: 30.00%

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment, yet performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency and how to measure it does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
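
The capture-latency definition above reduces to the difference between two timestamps, one taken at OLTP commit and one when the CDC mechanism captures the change. A minimal sketch with hypothetical names, not the thesis's actual TPC-C instrumentation:

```python
import time

def commit_change(db, row):
    """Stand-in for an OLTP commit; stamps the commit time."""
    row["committed_at"] = time.monotonic()
    db.append(row)

def capture_change(row):
    """Stand-in for the CDC capture point; returns the capture latency."""
    row["captured_at"] = time.monotonic()
    return row["captured_at"] - row["committed_at"]

db = []
commit_change(db, {"order_id": 42, "qty": 3})
time.sleep(0.05)                         # stand-in for the CDC delay
latency = capture_change(db[0])
print(f"capture latency: {latency * 1000:.1f} ms")
```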

Relevance: 30.00%

Abstract:

The main purpose of this dissertation is to assess the relation between municipal benchmarking and organisational learning, with a specific emphasis on benchlearning and performance within municipalities, and between groups of municipalities, in the building and housing sector in the Netherlands. The first and main conclusion is that this relation exists, but that the relative success of different approaches to dimensions of change and organisational learning is a key explanatory factor for differences in the success of benchlearning. Seven other important conclusions could be derived from the empirical research. First, a combination of interpretative approaches at the group level with a mixture of hierarchical and network strategies positively influences benchlearning. Second, interaction among professionals at the inter-organisational level strengthens benchlearning. Third, stimulating supporting factors can be seen as a more important strategy for strengthening benchlearning than pulling down barriers. Fourth, in order to facilitate benchlearning, intrinsic motivation and communication skills matter, and are supported by a high level of cooperation (i.e., teamwork), a flat organisational structure and interactions between individuals. Fifth, benchlearning is facilitated by a strategy that is based on a balanced use of episodic (emergent) and systemic (deliberate) forms of power. Sixth, high levels of benchlearning are facilitated by an analyser or prospector strategic stance. Prospectors and analysers reach a different learning outcome from defenders and reactors: whereas analysers and prospectors are willing to change policies when this is perceived as necessary, the strategic stances of defenders and reactors result in narrow process improvements (i.e., single-loop learning). Seventh, performance improvement is influenced by functional perceptions towards performance, and these perceptions ultimately influence the elements adopted. This research shows that efforts aimed at benchlearning, and ultimately at improved service delivery, should be directed to a multi-level and multi-dimensional approach addressing the context, content and process of dimensions of change and organisational learning.

Relevance: 30.00%

Abstract:

Designing effective direct mail pieces is considered a key success factor in direct marketing. However, related published empirical research is scarce while design recommendations are manifold and often conflicting. Compared with prior work, our study aims to provide more elaborate and empirically validated findings for the effects of direct mail design characteristics by analyzing 677 direct mail campaigns from non-profit organizations and financial service providers. We investigate the effects of (1) various envelope characteristics and observable cues on opening rates, and (2) characteristics of the envelope content on the keeping rates of direct mail campaigns. We show that visual design elements on the outer envelope – rather than sender-related details – are the predominant drivers of opening rates. Factors such as letter length, provision of sender information in the letter, and personalization positively influence the keeping rate. We also observe that opening and keeping rates are uncorrelated at the campaign level, implying that opening direct mail pieces is only a necessary condition for responding to offers, but not per se a driver of direct mail response.

Relevance: 30.00%

Abstract:

Purpose: The aim of this article is to detail the correlation between quality management, specifically its tools and critical success factors, and performance in terms of primary operational and secondary organisational performance. Design/methodology/approach: Survey data from the UK and Turkey were analysed using exploratory factor analysis, structural equation modelling and regression analysis. Findings: The results show that quality management has a significant and positive impact on both primary and secondary performance; that Turkish and UK attitudes to quality management are similar; and that quality management is widely practised in manufacturing and service industries but has more statistical emphasis in the manufacturing sector. The main challenge in making quality management practice more effective lies in an appropriately balanced use of the different sorts of tools and critical success factors. Originality/value: This study takes a novel approach by: (i) exploring the relationship between primary operational and secondary organisational performance, (ii) using service and manufacturing data and (iii) making a cross-country comparison between the UK (a developed economy) and Turkey (a developing economy). Limitations: Detailed contrast is provided between only two countries. © 2013 Copyright Taylor and Francis Group, LLC.

Relevance: 30.00%

Abstract:

Purpose: In today's competitive scenario, effective supply chain management is increasingly dependent on third-party logistics (3PL) companies' capabilities and performance. The dissemination of information technology (IT) has contributed to changing the supply chain role of 3PL companies, and IT is considered an important element influencing the performance of modern logistics companies. Therefore, the purpose of this paper is to explore the relationship between IT and 3PLs' performance, assuming that logistics capabilities play a mediating role in this relationship. Design/methodology/approach: Empirical evidence based on a questionnaire survey conducted on a sample of logistics service companies operating in the Italian market was used to test a conceptual resource-based view (RBV) framework linking IT adoption, logistics capabilities and firm performance. Factor analysis and ordinary least squares (OLS) regression analysis were used to test the hypotheses. The paper is multidisciplinary in nature; approaches from management of information systems, strategy, logistics and supply chain management have been combined in the analysis. Findings: The results indicate strong relationships among data-gathering technologies, transactional capabilities and firm performance, in terms of both efficiency and effectiveness. Moreover, a positive correlation between enterprise information technologies and 3PL financial performance has been found. Originality/value: The paper successfully uses the concept of logistics capabilities as a mediating factor between IT adoption and firm performance. Objective measures are proposed for IT adoption and logistics capabilities. Direct and indirect relationships among variables have been successfully tested. © Emerald Group Publishing Limited.
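
A minimal sketch of the mediation logic (IT adoption → logistics capabilities → performance): if the direct IT coefficient shrinks once the capability measure enters the regression, the effect is (partially) mediated. The data are synthetic, not the paper's survey responses.

```python
import numpy as np

# Synthetic data with a built-in mediated effect.
rng = np.random.default_rng(7)
n = 300
it_adoption = rng.normal(size=n)
capabilities = 0.8 * it_adoption + rng.normal(scale=0.5, size=n)
performance = 0.7 * capabilities + 0.1 * it_adoption + rng.normal(scale=0.5, size=n)

def ols(y, *xs):
    """OLS slopes (intercept dropped) via least squares."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

print("IT alone:         ", ols(performance, it_adoption))
print("IT + capabilities:", ols(performance, it_adoption, capabilities))
```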

Relevance: 30.00%

Abstract:

The Product Service Systems (PSS), servitization, and Service Science literature continues to grow as organisations seek to protect and improve their competitive position. The potential of technology applications to support service delivery systems, facilitated by the ability to make real-time decisions based upon 'in the field' performance, is also significant. Research identifies four key questions to be addressed, namely: how far along the servitization continuum should the organisation go in a single strategic step? Does the organisation have the structure and infrastructure to support this transition? What level of condition monitoring should it employ? Is the product positioned correctly in the value chain to adopt condition monitoring technology? Strategy consists of three dimensions, namely content, context, and process. The literature relating to PSS, servitization, and strategy discusses the concepts relative to content and context, but none of it offers a process for delivering an aligned strategy for a service delivery system enabled by condition-based management. This paper presents a tested, iterative strategy formulation methodology which is the result of a structured development programme.