31 results for IPv6, Denial of Service, Coloured Petri Nets, Risk Analysis, IPv6 threats
in Aston University Research Archive
Abstract:
The Internet has become a universal communication network tool. It has evolved from a platform that supports best-effort traffic to one that now carries different traffic types, including those involving continuous media with quality of service (QoS) requirements. As more services are delivered over the Internet, we face increasing risk to their availability, given that malicious attacks on those Internet services continue to increase. Several networks have witnessed denial of service (DoS) and distributed denial of service (DDoS) attacks over the past few years which have disrupted the QoS of network services, thereby violating the Service Level Agreement (SLA) between the client and the Internet Service Provider (ISP). DoS and DDoS attacks are therefore major threats to network QoS. In this paper we survey techniques and solutions that have been deployed to thwart DoS and DDoS attacks, and we evaluate them in terms of their impact on network QoS for Internet services. We also present vulnerabilities in QoS protocols that, if exploited, can degrade QoS. In addition, we highlight challenges that still need to be addressed to achieve end-to-end QoS with recently proposed DoS/DDoS solutions. © 2010 John Wiley & Sons, Ltd.
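As an aside on the mitigation techniques surveyed, a token-bucket rate limiter is one classic building block for throttling attack floods while still admitting legitimate bursts. The sketch below is purely illustrative; the class name, rate and burst size are invented for the example and are not taken from the paper.

import time

class TokenBucket:
    """Token-bucket rate limiter: admits traffic up to a sustained
    rate while absorbing short bursts; excess requests are rejected."""
    def __init__(self, rate_per_s: float, burst: float):
        self.rate = rate_per_s        # tokens replenished per second
        self.capacity = burst         # maximum burst size in tokens
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False                  # over the limit: drop or deprioritise

# Example: cap a client at 100 requests/s with bursts of up to 20.
bucket = TokenBucket(rate_per_s=100, burst=20)
admitted = sum(bucket.allow() for _ in range(50))
print(f"{admitted} of 50 back-to-back requests admitted")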
Abstract:
The aim of this research was to improve the quantitative support for project planning and control, principally through the use of more accurate forecasting, for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network-based risk analysis (PERT). The former, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis, and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful in both theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
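As an illustration of the growth-curve idea, cumulative construction cost is commonly modelled as an S-shaped cubic in the elapsed-time fraction, fitted to observed cost profiles. The sketch below uses a generic cubic constrained through the origin with hypothetical data points; it is not the exact DHSS parameterisation, nor the thesis's new technique.

import numpy as np

# Observed cumulative cost fractions y at elapsed-time fractions x
# (hypothetical points for illustration; the thesis's profiles came
# from industrial project data).
x = np.array([0.1, 0.25, 0.4, 0.55, 0.7, 0.85, 1.0])
y = np.array([0.02, 0.12, 0.32, 0.55, 0.75, 0.92, 1.0])

# Cubic S-curve through the origin: C(x) = a*x + b*x**2 + c*x**3.
# Least-squares fit of the three coefficients; with no constant term,
# C(0) = 0 holds by construction.
A = np.column_stack([x, x**2, x**3])
(a, b, c), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"C(x) = {a:.3f}x + {b:.3f}x^2 + {c:.3f}x^3")

# Forecast: expected cumulative cost fraction at 60% elapsed time.
t = 0.6
print("forecast at 60% time:", a*t + b*t**2 + c*t**3)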
Abstract:
Risk management in healthcare represents a group of various complex actions, implemented to improve the quality of healthcare services and guarantee patient safety. Risks cannot be eliminated, but they can be controlled with different risk assessment methods derived from industrial applications; among these, Failure Mode, Effects and Criticality Analysis (FMECA) is a widely used methodology. The main purpose of this work is the analysis of failure modes of the Home Care (HC) service provided by the local healthcare unit of Naples (ASL NA1), to focus attention on human and non-human factors according to the organization framework selected by the WHO. © Springer International Publishing Switzerland 2014.
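FMECA typically ranks failure modes by a risk priority number, RPN = severity × occurrence × detectability, each scored on a 1-10 scale. A minimal sketch of that ranking step follows; the failure modes and scores are invented for illustration and are not the ASL NA1 findings.

# Minimal FMECA-style ranking sketch (illustrative failure modes only).
# Each mode is scored 1-10 for severity (S), occurrence (O) and
# detectability (D); the risk priority number is RPN = S * O * D.
failure_modes = [
    ("Missed home visit",          8, 4, 3),
    ("Wrong drug dosage recorded", 9, 2, 5),
    ("Equipment not sanitised",    7, 3, 4),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={s*o*d:3d}  {name}")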
Abstract:
Advances in technology coupled with increasing labour costs have caused service firms to explore self-service delivery options. Although some studies have focused on self-service and the use of technology in service delivery, few have explored the role of service quality in consumer evaluation of technology-based self-service options. By integrating and extending the self-service quality framework, the service evaluation model and the Technology Acceptance Model, the authors address this emerging issue by empirically testing a comprehensive model that captures the antecedents and consequences of perceived service quality to predict continued customer interaction in the technology-based self-service context of Internet banking. Important service evaluation constructs such as perceived risk, perceived value and perceived satisfaction are modelled in this framework. The results show that perceived control has the strongest influence on service quality evaluations. Perceived speed of delivery, reliability and enjoyment also have a significant impact on service quality perceptions. The study also found that even though perceived service quality, perceived risk and satisfaction are important predictors of continued interaction, perceived customer value plays a pivotal role in influencing continued interaction.
Abstract:
The development of an information system in Caribbean public sector organisations is usually seen as a matter of installing hardware and software according to a directive from senior management, without much planning. This results in huge investment in procuring hardware and software without improving overall system performance. Increasingly, Caribbean organisations are looking for assurances on information system performance before making investment decisions, not only to satisfy the funding agencies, but also to be competitive in this dynamic and global business world. This study demonstrates an information system planning approach using a process-reengineering framework. Firstly, the stakeholders for the business functions are identified along with their relationships and requirements. Secondly, process reengineering is carried out to develop the system requirements, and information technology is selected through detailed system requirement analysis. Thirdly, cost-benefit analysis, identification of critical success factors and risk analysis are carried out to strengthen the selection. The entire methodology is demonstrated through an information system project in the Barbados Drug Service, a public sector organisation in the Caribbean.
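For the cost-benefit step, a discounted cash-flow (net present value) calculation is the standard quantitative core. The sketch below is a minimal illustration with hypothetical cash flows, not figures from the Barbados Drug Service project.

# Net present value sketch for the cost-benefit step (hypothetical
# cash flows for illustration only).
def npv(rate: float, cashflows: list[float]) -> float:
    """Discount a series of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: system procurement cost; years 1-5: net operating benefit.
flows = [-250_000, 60_000, 75_000, 80_000, 80_000, 80_000]
print(f"NPV at 10%: {npv(0.10, flows):,.0f}")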
A model of service performance enhancement: the role of transactional and transformational leadership
Abstract:
This paper is concerned with the ways in which transactional and transformational leadership styles can improve the service performance of front-line staff. Past literature on services marketing has indicated the importance of leadership but has largely ignored the parallel literature in which leadership styles have been conceptualized and operationalized (e.g., sales management, organizational psychology). This paper seeks to build upon existing services marketing theory by introducing the role of leadership styles in enhancing service performance. Consequently, a conceptual framework of the effect of transactional and transformational leadership styles on service performance, anchored in a cross-disciplinary literature review, is developed. Managerial implications and future research directions are also discussed.
Abstract:
Service encounter quality is an area of growing interest to researchers and managers alike, yet little is known about the effects of face-to-face service encounter quality within a business-to-business setting. In this paper, a psychometrically sound measure of such service encounter quality is proposed, and consequences of this construct are empirically assessed. Both a literature review and a dyadic in-depth interview approach were used to develop a conceptual framework and a pool of items to capture service encounter quality. A mail survey of customers was undertaken, and a response rate of 36% was obtained. Data analysis was conducted via confirmatory factor analysis and structural equation modeling. Findings reveal a four-factor structure of service encounter quality, encompassing professionalism, civility, friendliness and competence dimensions. Service encounter quality was found to be directly related to customer satisfaction and service quality perceptions, and indirectly to loyalty. The importance of these findings for practitioners and for future research on service encounter quality is discussed.
Abstract:
Requirements for systems to continue to operate satisfactorily in the presence of faults have led to the development of techniques for the construction of fault tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the 'a priori' design of conversations for this class of distributed system. Petri nets are used to represent the state and to solve state reachability problems for concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. Relating the state-change table to process attributes ensures that all necessary processes are included in the conversation. The method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed. The structure of the conversation is defined using the special features of occam. The proposed implementation gives a structure which is independent of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system which involves inter-process communications.
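To illustrate the reachability analysis underpinning the state-change table, the sketch below enumerates the reachability set of a small place/transition net by breadth-first search and records each state change. The three-transition net is a hypothetical example, not the thesis's process model.

from collections import deque

# Minimal Petri-net reachability sketch. A marking maps places to
# token counts; each transition has input and output places.
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),   # (inputs, outputs)
    "t2": ({"p2": 1}, {"p3": 1}),
    "t3": ({"p3": 1}, {"p1": 1}),
}

def enabled(marking, inputs):
    return all(marking.get(p, 0) >= n for p, n in inputs.items())

def fire(marking, inputs, outputs):
    m = dict(marking)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] = m.get(p, 0) + n
    return {p: k for p, k in m.items() if k > 0}   # drop empty places

# Breadth-first construction of the reachability set; for a bounded
# net this terminates and yields the state-change relation.
initial = {"p1": 1}
seen, queue, state_changes = set(), deque([initial]), []
while queue:
    m = queue.popleft()
    key = tuple(sorted(m.items()))
    if key in seen:
        continue
    seen.add(key)
    for name, (ins, outs) in transitions.items():
        if enabled(m, ins):
            m2 = fire(m, ins, outs)
            state_changes.append((key, name, tuple(sorted(m2.items()))))
            queue.append(m2)

for row in state_changes:   # each row: (marking, transition, marking')
    print(row)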
Abstract:
On the basis of a review of the substantive quality and service marketing literature, current knowledge regarding service quality expectations was found to be either absent or deficient. The phenomenon is of increasing importance to both marketing researchers and management and was therefore judged worthy of scholarly consideration. Because the service quality literature was insufficiently rich when embarking on the thesis, three basic research issues were considered, namely the nature, determinants, and dynamics of service quality expectations. These issues were first conceptually and then qualitatively explored. This process generated research hypotheses, mainly relating to a model, which were subsequently tested through a series of empirical investigations using questionnaire data from field studies in a single context. The results were internally consistent and strongly supported the main research hypotheses. It was found that service quality expectations can be meaningfully described in terms of generic/service-specific, intangible/tangible, and process/outcome categories. Service-specific quality expectations were also shown to be determined by generic service quality expectations, demographic variables, personal values, psychological needs, general service sophistication, service-specific sophistication, purchase motives, and service-specific information when treating service class involvement as an exogenous variable. Subjects who had previously not directly experienced a particular service were additionally found to revise their expectations of quality when exposed to the service, with change being driven by a sub-set of identified determinants.
Abstract:
Predicting future need for water resources has traditionally been, at best, a crude mixture of art and science. This has prevented the evaluation of water need from being carried out in either a consistent or comprehensive manner. This inconsistent and somewhat arbitrary approach to water resources planning led to well-publicised premature developments in the 1970s and 1980s, but privatisation of the Water Industry, including creation of the Office of Water Services and the National Rivers Authority in 1989, turned the tide of resource planning to the point where funding of schemes and their justification by the Regulators could no longer be assumed. Furthermore, considerable areas of uncertainty were beginning to enter the debate and complicate the assessment. It was also no longer appropriate to consider that contingencies would continue to lie solely on the demand side of the equation. An inability to calculate the balance between supply and demand may mean an inability to meet standards of service or, arguably worse, an excessive provision of water resources and excessive costs to customers. The United Kingdom Water Industry Research Limited (UKWIR) Headroom project in 1998 provided a simple methodology for the calculation of planning margins. This methodology, although well received, was not, however, accepted by the Regulators as a tool sufficient to promote resource development. This thesis begins by considering the history of water resource planning in the UK, moving on to discuss events following privatisation of the water industry post-1985. The mid section of the research forms the bulk of original work and provides a scoping exercise which reveals a catalogue of uncertainties prevalent within the supply-demand balance. Each of these uncertainties is considered in terms of materiality, scope, and whether it can be quantified within a risk analysis package. Many of the areas of uncertainty identified would merit further research. A workable, yet robust, methodology for evaluating the balance between water resources and water demands by using a spreadsheet-based risk analysis package is presented. The technique involves statistical sampling and simulation such that samples are taken from input distributions on both the supply and demand side of the equation and the imbalance between supply and demand is calculated in the form of an output distribution. The percentiles of the output distribution represent different standards of service to the customer. The model allows dependencies between distributions to be considered, for improved uncertainties to be assessed and for the impact of uncertain solutions to any imbalance to be calculated directly. The method is considered a significant leap forward in the field of water resource planning.
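The sampling-and-percentile technique described above can be illustrated with a short Monte Carlo sketch. The normal distributions, means and spreads below are hypothetical stand-ins for the thesis's calibrated inputs.

import numpy as np

rng = np.random.default_rng(42)
N = 100_000     # Monte Carlo samples

# Hypothetical input distributions (Ml/d); real inputs would come from
# the scoping exercise on supply-demand uncertainties.
supply = rng.normal(loc=520, scale=25, size=N)   # deployable output
demand = rng.normal(loc=495, scale=30, size=N)   # forecast demand

balance = supply - demand    # output distribution of the imbalance

# Percentiles of the output distribution map to standards of service:
# e.g. the 5th percentile is the margin exceeded in 95% of scenarios.
for p in (5, 25, 50):
    print(f"P{p:02d} headroom: {np.percentile(balance, p):6.1f} Ml/d")
print("probability of deficit:", np.mean(balance < 0))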
Abstract:
A literature review revealed that very little work has been conducted to investigate the possible benefits of coloured interventions on reading performance in low vision due to age-related macular degeneration (ARMD), under conditions that are similar to the real-world reading environment. Further studies on the use of colour as a rehabilitative intervention in low vision would therefore be useful. A series of objective, subject-based, age-similar controlled experiments were used to address the primary aims. Trends in some of the ARMD data suggested better reading performance with blue or green illuminance, but there were also some individuals who performed better with yellow, or with illuminance of reduced intensity. Statistically, better reading in general occurred with a specialised yellow photochromic lens and also with a clear lens than with a fixed lens or a neutral density filter. No reading advantage was gained from using the coloured screen facility of a video-magnifier. Some subjects with low vision were found to have co-existent binocular vision anomalies, which may have caused reading difficulties similar to those produced by ARMD. Some individuals with ARMD benefited from the use of increased local illuminance produced by either a standard tungsten or compact fluorescent lamp. No reading improvement occurred with a daylight simulation tungsten lamp. The Intuitive Colorimeter® can be used to detect and map out colour vision discrimination deficiency in ARMD, and the Humphrey 630 Visual Field Analyser can be used to analyse the binocular visual field in subjects with ARMD. Some experiments highlighted a positive effect of a blue intervention on reading with ARMD.
Abstract:
Market orientation (MO) is an organization-wide concept that helps explain sustained competitive advantage (SCA). Since networks are becoming ever more important, especially in the service sector, there is a need to expand the concept of MO to a network setting. In line with Narver and Slater (1990), the concept of Market Orientation of Networks (MONW) is developed. This study indicates how MONW relates to the resource-based view (RBV) of the firm and the industrial organization (IO) view in explaining SCA. It is argued that MONW has direct and indirect effects on SCA. More precisely, the antecedent effect of MONW on resources and industry structure is considered.
Abstract:
Using a configuration theory approach, this paper conducts a comparative study between frontline employees in phone and face-to-face service encounters for a retail bank. The study compares the top performers in service quality in relation to three components of organizational commitment and their demographics by applying a profile deviation analysis. The results show that the profile deviation for face-to-face employees is significantly negative, while for call center employees it is nonsignificant. Although the study finds no significant differences in the three components of commitment, significant differences exist in the total experience and age of the best performers. Also, affective commitment dominates the profile of high performers, while poor service providers seem to exhibit a higher level of continuance commitment. This study demonstrates the utility of profile deviation approaches in designing internal marketing strategies.
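Profile deviation analysis of this kind typically scores each employee by the Euclidean distance of their profile from a calibration profile built from top performers, then relates that misfit to performance. A minimal sketch with invented commitment scores (not the bank's data) follows.

import numpy as np

# Columns: affective, continuance, normative commitment (1-7 scales);
# all values are hypothetical.
top_performers = np.array([[6.2, 3.1, 5.0],
                           [6.5, 2.8, 5.4],
                           [6.0, 3.3, 4.9]])
employees = np.array([[5.9, 3.0, 5.1],    # close to the ideal profile
                      [3.2, 6.1, 3.0]])   # far from it

ideal = top_performers.mean(axis=0)       # calibration profile

# Misfit = Euclidean distance from the ideal profile; the working
# hypothesis is that misfit correlates negatively with performance.
misfit = np.linalg.norm(employees - ideal, axis=1)
for i, d in enumerate(misfit):
    print(f"employee {i}: profile deviation = {d:.2f}")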
Abstract:
This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is little published data on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency, and how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
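The capture latency definition above lends itself to simple instrumentation: timestamp a change when the OLTP transaction commits, and again when the CDC mechanism captures it. The sketch below illustrates the measurement idea only; it is not the thesis's TPC-C extension, and the sleep stands in for the polling interval of a pull CDC mechanism.

import time

change_log = []   # stands in for the CDC capture channel

def oltp_commit(row_id: int, value: str) -> None:
    # ... the change would be applied to the OLTP database here ...
    change_log.append({"row": row_id, "value": value,
                       "committed_at": time.monotonic()})

def cdc_capture() -> list[float]:
    """Drain the channel; capture latency = capture time - commit time."""
    now = time.monotonic()
    latencies = [now - rec["committed_at"] for rec in change_log]
    change_log.clear()
    return latencies

oltp_commit(1, "a")
time.sleep(0.05)              # simulated polling interval of a pull CDC
for lat in cdc_capture():
    print(f"capture latency: {lat*1000:.1f} ms")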