773 results for Testbeds, Denial Of Service
Abstract:
This paper aims to develop a mathematical model based on semi-group theory that makes it possible to improve quality of service (QoS), including the reduction of the carbon footprint, in a pervasive environment of a Mobile Virtual Network Operator (MVNO). The paper generalises an interrelationship Machine to Machine (M2M) mathematical model based on semi-group theory, and demonstrates that, using available technology together with a solid mathematical model, it is possible to streamline relationships between building agents and to control pervasive spaces so as to reduce the carbon footprint through the reduction of GHG emissions.
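The abstract names semi-group theory but gives no concrete formulation; the minimal sketch below (all state names, agent behaviours and the composition rule are invented for illustration, not taken from the paper) shows the sense in which M2M agent behaviours, modelled as state transformations closed under composition, form a semigroup:

```python
# Purely illustrative: M2M agent behaviours as transformations of a small
# state space (state names and behaviours are invented).  Composition of
# transformations is associative, so any set of behaviours closed under
# composition forms a semigroup; no identity element is required.
STATES = ["idle", "sensing", "reporting"]

def compose(f, g):
    # (f . g)(s) = f(g(s)): apply g first, then f.
    return {s: f[g[s]] for s in STATES}

# Two assumed agent behaviours, each a total map on STATES.
wake  = {"idle": "sensing", "sensing": "sensing", "reporting": "idle"}
flush = {"idle": "idle", "sensing": "reporting", "reporting": "idle"}

# Associativity, the defining semigroup law, holds for any transformations:
lhs = compose(compose(wake, flush), wake)
rhs = compose(wake, compose(flush, wake))
assert lhs == rhs

print(lhs)
```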
Abstract:
The European Union (EU) is embedded in a pluralistic legal context because of the EU and its Member States’ treaty memberships and domestic laws. Where EU conduct has implications for both the EU’s international trade relations and the legal position of individual traders, it potentially affects the EU’s and its Member States’ obligations under the law of the World Trade Organization (WTO law) as well as the Union’s own multi-layered constitutional legal order. The present paper analyses the way in which the European Court of Justice (ECJ) accommodates WTO and EU law in the context of international trade disputes triggered by the EU. Given the ECJ’s denial, in principle, of direct effect to WTO law, the paper focuses on the protection of rights and remedies conferred by EU law. It assesses the implications of the WTO Dispute Settlement Understanding (DSU) – which tolerates the acceptance of retaliatory measures constraining traders’ activities in sectors different from those subject to the original trade dispute (Bananas and Hormones cases) – for the protection of ‘retaliation victims’. The paper concludes that governmental discretion conferred by WTO law has not affected the applicability of EU constitutional law but possibly shapes the actual scope of EU rights and remedies where such discretion is exercised in the EU’s general interest.
Abstract:
The process of global deforestation calls for urgent attention, particularly in South America, where deforestation rates have failed to decline over the past 20 years. The main direct cause of deforestation is land conversion to agriculture. We combine data from the FAO and the World Bank for six tropical South American countries over the period 1970–2006, estimate a panel data model accounting for various determinants of agricultural land expansion, and derive elasticities to quantify the effect of the different independent variables. We investigate whether agricultural intensification, in conjunction with governance factors, has been promoting agricultural expansion, leading to a “Jevons paradox”. The paradox occurs if an increase in the productivity of one factor (here agricultural land) leads to its increased, rather than decreased, utilization. We find that for high values of our governance indicators a Jevons paradox exists even for moderate levels of agricultural productivity, leading to an overall expansion of agricultural area. Agricultural expansion is also positively related to the level of service on external debt and population growth, while its association with agricultural exports is only moderate. Finally, we find no evidence of an environmental Kuznets curve, as agricultural area is ultimately positively correlated with per-capita income levels.
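The abstract does not report the paper's exact specification; as a hedged sketch of the general approach, a log-log panel regression with country fixed effects yields coefficients that can be read directly as elasticities. All variable names and the synthetic data below are assumptions, not the study's data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic balanced panel: 6 countries x 37 years (1970-2006), invented data.
rng = np.random.default_rng(0)
n_countries, n_years = 6, 37
n = n_countries * n_years
df = pd.DataFrame({
    "country": np.repeat([f"c{i}" for i in range(n_countries)], n_years),
    "log_yield": rng.normal(size=n),          # land productivity (assumed)
    "log_debt_service": rng.normal(size=n),   # external debt service (assumed)
    "log_pop": rng.normal(size=n),            # population (assumed)
})
df["log_agland"] = (0.3 * df["log_yield"] + 0.2 * df["log_debt_service"]
                    + 0.5 * df["log_pop"] + rng.normal(scale=0.1, size=n))

# Country fixed effects via dummies.  In a log-log model each slope is an
# elasticity; a Jevons paradox would appear as a positive elasticity of
# agricultural area with respect to land productivity.
fit = smf.ols("log_agland ~ log_yield + log_debt_service + log_pop + C(country)",
              data=df).fit()
print(fit.params[["log_yield", "log_debt_service", "log_pop"]])
```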
Abstract:
Traditional resource management has had as its main objective the optimisation of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid Markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The SORMA project aims to allow resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA’s motivation is to achieve efficient resource utilisation by maximising revenue for resource providers, and minimising the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that desired Quality of Service levels meet the expectations of market participants. This paper explains the proposed use of an Economically Enhanced Resource Manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximisation across multiple Service Level Agreements.
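The abstract does not specify the EERM's internal algorithm; a minimal sketch, assuming a simple greedy admission policy over hypothetical SLAs (all names and fields are invented), illustrates the shape of revenue-aware provisioning across multiple Service Level Agreements:

```python
from dataclasses import dataclass

# Hypothetical SLA structure; the real EERM economic model is richer.
@dataclass
class SLA:
    name: str
    cpu_demand: int    # resource units the agreement requires
    revenue: float     # payment if the SLA is admitted and met

def admit(slas, capacity):
    """Greedy admission: rank SLAs by revenue per resource unit and admit
    while capacity lasts.  A stand-in for revenue maximisation, not the
    EERM's actual technique."""
    admitted, used = [], 0
    for sla in sorted(slas, key=lambda s: s.revenue / s.cpu_demand, reverse=True):
        if used + sla.cpu_demand <= capacity:
            admitted.append(sla)
            used += sla.cpu_demand
    return admitted

offers = [SLA("gold", 8, 100.0), SLA("silver", 4, 40.0), SLA("bronze", 2, 15.0)]
print([s.name for s in admit(offers, capacity=10)])  # -> ['gold', 'bronze']
```

In practice the manager would also weigh penalties, reservations and market prices; the greedy rule here is only the simplest instance of the trade-off the abstract describes.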
Abstract:
This paper aims to design a collaboration model for a Knowledge Community, SSMEnetUK. The research identifies SSMEnetUK as a socio-technical system and uses the core concepts of Service Science to explore the subject domain. The paper is positioned within the concept of Knowledge Management (KM) and the use of Web 2.0 tools for collaboration. A qualitative case study method was adopted and multiple data sources were used. In doing so, the degree of correlation between knowledge management activities and Web 2.0 collaboration tools in the scenario is weighed against the value propositions offered by both customer/user and service provider. The proposed model provides a better understanding of how Knowledge Management and Web 2.0 tools can enable effective collaboration within SSMEnetUK. This research is relevant to the wider service design and innovation community because it provides a basis for building a service-centric collaboration platform for the benefit of both customer/user and service provider.
Abstract:
Purpose – The evolution of the service marketing field was marked by the emergence of a global, vigorous and tolerant community of service marketing researchers. This paper seeks to examine the history of the service marketing community and argues that it may be an archetype for building the emergent global service research community. Design/methodology/approach – The study combines qualitative and quantitative approaches. The authors interviewed four pioneering service scholars and also collected descriptive data (e.g. authorship, affiliation, title, keywords) for all service-related articles published in 13 top peer-reviewed marketing and service journals over the last 30 years (5,432 articles; 6,450 authors). In a dynamic analysis the authors mapped global collaboration between countries over time and detected clusters of international collaboration. Findings – Findings suggest growing international collaboration for the USA and the UK, while for other countries, such as Israel, global collaboration started from a high level and has since declined. Further, the service marketing community never became polarized, and there were always contributions from researchers all over the world. Research limitations/implications – As the global service research community develops, service marketing becomes a research neighborhood within the broader service research community. Simultaneously, other research neighborhoods are emerging within this new community (e.g. service arts, service management, service engineering, service science). Originality/value – Anchored in the social evolution and biological evolution metaphors, this study explains the evolution of the service marketing field from both qualitative and quantitative perspectives. Furthermore, it explains the development of the service marketing community as an archetype for building the global service research community.
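The collaboration mapping described above can be pictured as a weighted country co-authorship graph. The brief sketch below is illustrative only: the article data are invented, and generic modularity-based community detection stands in for whatever clustering method the authors actually used:

```python
import networkx as nx
from networkx.algorithms import community

# Invented example: each article contributes the set of its authors' countries.
articles = [
    ["USA", "UK"],
    ["USA", "Israel"],
    ["UK", "Germany", "USA"],
    ["Israel", "USA"],
]

G = nx.Graph()
for countries in articles:
    # Every pair of countries on one article adds or strengthens an edge.
    for i, a in enumerate(countries):
        for b in countries[i + 1:]:
            if G.has_edge(a, b):
                G[a][b]["weight"] += 1
            else:
                G.add_edge(a, b, weight=1)

# Clusters of international collaboration via greedy modularity maximisation.
clusters = community.greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in clusters])
```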
Abstract:
Introduction: Resistance to anticoagulants in Norway rats (Rattus norvegicus) and house mice (Mus domesticus) has been studied in the UK since the early 1960s. In no other country in the world is our understanding of resistance phenomena so extensive and profound. Almost every aspect of resistance in the key rodent target species has been examined in laboratory and field trials, and results obtained by independent researchers have been published. The principal purpose of this document is to present a short synopsis of this information. More recently, the development of genetical techniques has provided a definitive means of detecting resistant genotypes among pest rodent populations, and preliminary information from a number of such surveys is also presented. Resistance in Norway rats: A total of nine different anticoagulant resistance mutations (single nucleotide polymorphisms, or SNPs) are found among Norway rats in the UK; in no other country worldwide are so many different forms of Norway rat resistance present. Among these nine SNPs, five are known to confer on rats that carry them a significant degree of resistance to anticoagulant rodenticides: L128Q, Y139S, L120Q, Y139C and Y139F. The latter three mutations confer, to varying degrees, practical resistance to bromadiolone and difenacoum, the two second-generation anticoagulants in predominant use in the UK. It is the recommendation of RRAG that bromadiolone and difenacoum should not be used against rats carrying the L120Q, Y139C and Y139F mutations, because this will promote the spread of resistance and jeopardise the long-term efficacy of anticoagulants. Brodifacoum, flocoumafen and difethialone are effective against these three genotypes but cannot presently be used because of the regulatory restriction that they may only be applied against rats that are living and feeding predominantly indoors. Our understanding of the geographical distribution of Norway rat resistance is incomplete but is rapidly increasing. In particular, the mapping of the focus of L120Q Norway rat resistance in central-southern England by DNA sequencing is well advanced. We now know that rats carrying this resistance mutation are present across a large part of the counties of Hampshire, Berkshire and Wiltshire, and that the resistance spreads into Avon, Oxfordshire and Surrey. It is also found, perhaps as outlier foci, in south-west Scotland and East Sussex. L120Q is currently the most severe form of anticoagulant resistance found in Norway rats and is prevalent over a considerable part of central-southern England. A second form of advanced Norway rat resistance is conferred by the Y139C mutation. This is noteworthy because it occurs in at least four widely dispersed foci: Dumfries and Galloway, Gloucestershire, Yorkshire and Norfolk. Once again, bromadiolone and difenacoum are not recommended for use against rats carrying this genotype, and a concern of RRAG is that continued application of resisted active substances may result in Y139C becoming more or less ubiquitous across much of the UK. Another type of advanced resistance, the Y139F mutation, is present in Kent and Sussex. This means that Norway rats carrying some degree of resistance to bromadiolone and difenacoum are now found from the south coast of Kent, west into the city of Bristol, to Yorkshire in the north-east and to the south-west of Scotland.
This difficult situation can only deteriorate further where these three genotypes exist and resisted anticoagulants are predominantly used against them. Resistance in house mice: Resistance in house mice is not so well understood, but the presence in the UK of two resistant genotypes, L128S and Y139C, is confirmed. House mice are naturally tolerant to anticoagulants, and such is the nature of this tolerance, together with the presence of genetical resistance, that house mice resistant to the first-generation anticoagulants are considered to be widespread in the UK. Consequently, baits containing warfarin, sodium warfarin, chlorophacinone and coumatetralyl are not approved for use against mice; this regulatory position is endorsed by RRAG. Baits containing brodifacoum, flocoumafen and difethialone are effective against house mice and may be applied in practice because house mouse infestations are predominantly indoors. There are some reports of resistance among mice in some areas to the second-generation anticoagulant bromadiolone, while difenacoum remains largely efficacious. Alternatives to anticoagulants: The use of habitat manipulation, that is the removal of harbourage, denial of the availability of food and the prevention of ingress to structures, is an essential component of sustainable rodent pest management. All are of importance in the management of resistant rodents and have the advantage of not selecting for resistant genotypes. These techniques may be particularly valuable in preventing the build-up of rat infestations; however, none can be used to remove any sizeable extant infestation, and for practical reasons their use against house mice is problematic. Few alternative chemical interventions are available in the European Union because of the removal from the market of zinc phosphide, calciferol and bromethalin. Our virtually complete reliance on anticoagulants for the chemical control of rodents in the UK, and more widely in the EU, calls for improved schemes for resistance management. These might involve the use of alternatives to anticoagulant rodenticides; also important is increasing knowledge of the distribution of resistance mutations in rats and mice, and the use of only fully effective anticoagulants against them.
Abstract:
Background Access to, and the use of, information and communication technology (ICT) is increasingly becoming a vital component of mainstream life. First-order (e.g. time and money) and second-order factors (e.g. beliefs of staff members) affect the use of ICT in different contexts. It is timely to investigate what these factors may be in the context of service provision for adults with intellectual disabilities given the role ICT could play in facilitating communication and access to information and opportunities as suggested in Valuing People. Method Taking a qualitative approach, nine day service sites within one organization were visited over a period of 6 months to observe ICT-related practice and seek the views of staff members working with adults with intellectual disabilities. All day services were equipped with modern ICT equipment including computers, digital cameras, Internet connections and related peripherals. Results Staff members reported time, training and budget as significant first-order factors. Organizational culture and beliefs about the suitability of technology for older or less able service users were the striking second-order factors mentioned. Despite similar levels of equipment, support and training, ICT use had developed in very different ways across sites. Conclusion The provision of ICT equipment and training is not sufficient to ensure their use; the beliefs of staff members and organizational culture of sites play a substantial role in how ICT is used with and by service users. Activity theory provides a useful framework for considering how first- and second-order factors are related. Staff members need to be given clear information about the broader purpose of activities in day services, especially in relation to the lifelong learning agenda, in order to see the relevance and usefulness of ICT resources for all service users.
Abstract:
We develop a new measurement scale to assess consumers’ brand likeability in firm-level brands. We present brand likeability as a multidimensional construct. In the context of service experience purchases, we find that increased likeability in brands results in: (1) greater amount of positive association; (2) increased interaction interest; (3) more personified quality; and (4) increased brand contentment. The four-dimensional multiple-item scale demonstrates good psychometric properties, showing strong evidence of reliability as well as convergent, discriminant and nomological validity. Our findings reveal that brand likeability is positively associated with satisfaction and positive word of mouth. The scale extends existing branding research, providing brand managers with a metric so that likeability can be managed strategically. It addresses the need for firms to act more likeably in an interaction-dominated economy. Focusing on likeability acts as a differentiator and encourages likeable brand personality traits. We present theoretical implications and future research directions on the holistic brand likeability concept.
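Reliability evidence for a multiple-item scale of this kind is typically summarised by Cronbach's alpha. The following sketch is purely illustrative: the ratings and the four-item set are invented, and the paper's own psychometric procedure is not specified here:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented example: 6 respondents rating one 4-item likeability dimension.
scores = np.array([
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
    [2, 1, 2, 2],
    [5, 5, 4, 4],
    [3, 2, 3, 3],
])
print(round(cronbach_alpha(scores), 3))  # values above ~0.7 suggest reliability
```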
Abstract:
Purpose – Recognizing the heterogeneity of services, this paper aims to clarify the characteristics of the forward and corresponding reverse supply chains of different services. Design/methodology/approach – The paper develops a two-dimensional typology matrix, representing four main clusters of services according to the degree of input standardization and the degree of output tangibility. Based on this matrix, the paper develops a typology and parsimonious conceptual models illustrating the characteristics of the forward and corresponding reverse supply chains of each cluster of services. Findings – The four main clusters of service supply chains have different characteristics. This provides the basis for the identification, presentation and explanation of the different characteristics of their corresponding reverse service supply chains. Research limitations/implications – The findings of this research can help future researchers to analyse, map and model forward and reverse service supply chains, and to identify potential research gaps in the area. Practical implications – The findings can help managers of service firms to gain better visibility of their forward and reverse supply chains, and refine their business models to help extend their reverse/closed-loop activities. Furthermore, the findings can help managers to better optimize their service operations to reduce service gaps and potentially secure new value-adding opportunities. Originality/value – This paper is the first, to the authors' knowledge, to conceptualize the basic structure of the forward and reverse service supply chains while dealing with the high level of heterogeneity of services.
Abstract:
Internet protocol TV (IPTV) is predicted to be the key technology winner in the future. Efforts to accelerate deployment have centred on a centralized IPTV model combining the VHO, encoders, controller, access network and home network. Regardless of whether the network is delivering live TV, VOD or time-shift TV, all content and network traffic resulting from subscriber requests must traverse the entire network from the super-headend all the way to each subscriber's Set-Top Box (STB). IPTV services require very stringent QoS guarantees; when IPTV traffic shares network resources with other traffic such as data and voice, how to ensure its QoS while efficiently utilizing network resources is a key and challenging issue. QoS is measured in the network-centric terms of delay jitter, packet loss and bounds on delay. The main focus of this thesis is optimized bandwidth allocation and smooth data transmission, and a traffic model is proposed for smooth delivery of video services over the IPTV network, together with an evaluation of its QoS performance. Following Maglaris et al. [5], the coding bit rate of a single video source is analysed first. Various statistical quantities are derived from bit-rate data collected with a conditional replenishment interframe coding scheme. Two correlated Markov process models (one in discrete time and one in continuous time) are shown to fit the experimental data and are used to model the input rates of several independent sources into a statistical multiplexer. A preventive control mechanism, including admission control (CAC) and traffic policing, is used for traffic control. The QoS of a common bandwidth scheduler (FIFO) has been evaluated using fluid models with a Markovian queuing method, with results analysed both by simulation and analytically, measuring packet loss, overflow and mean waiting time among the network users.
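A minimal sketch of the modelling idea, assuming a two-state (on/off) discrete-time Markov source in the spirit of Maglaris et al. [5] feeding a FIFO multiplexer with a fixed drain rate and finite buffer; all parameters below are invented, not taken from the thesis:

```python
import random

random.seed(1)

# Two-state discrete-time Markov model of a video source's bit rate.
P = {"on": {"on": 0.9, "off": 0.1}, "off": {"on": 0.3, "off": 0.7}}
RATE = {"on": 10, "off": 2}            # kb produced per slot in each state

def step(state):
    return "on" if random.random() < P[state]["on"] else "off"

N_SOURCES, SLOTS = 20, 10_000
SERVICE, BUFFER = 150, 400             # FIFO drain rate and buffer size (kb)

states = ["off"] * N_SOURCES
queue, lost, arrived = 0.0, 0.0, 0.0
for _ in range(SLOTS):
    states = [step(s) for s in states]
    arrivals = sum(RATE[s] for s in states)  # aggregate input to multiplexer
    arrived += arrivals
    queue += arrivals
    if queue > BUFFER:                 # buffer overflow -> packet loss
        lost += queue - BUFFER
        queue = BUFFER
    queue = max(0.0, queue - SERVICE)  # FIFO drains at a fixed service rate

print(f"loss ratio ~ {lost / arrived:.4f}")  # depends entirely on parameters
```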
Abstract:
The motivation for this thesis work is the need to improve the reliability of equipment and quality of service to railway passengers, as well as the requirement for cost-effective and efficient condition maintenance management in rail transportation. The thesis develops a fusion of various machine vision analysis methods to achieve high performance in the automation of wooden rail track inspection.
Condition monitoring in rail transport is traditionally done manually by a human operator, relying on inference and assumptions to draw conclusions. Condition monitoring allows maintenance to be scheduled, or other actions to be taken, to avoid the consequences of failure before it occurs. Manual or automated condition monitoring of materials in fields of public transportation such as railways, aerial navigation and traffic safety, where safety is of prime importance, needs non-destructive testing (NDT). In general, wooden railway sleeper inspection is done manually by a human operator who moves along the rail sleepers and gathers information by visual and sound analysis to examine the presence of cracks; inspectors working on lines visually inspect wooden sleepers to judge their quality. In this project a machine vision system is developed based on the manual visual analysis procedure, using digital cameras and image processing software to perform similar inspections. Manual inspection requires much effort, is at times error prone, and can be difficult even for a human operator given the frequent changes in the inspected material. The machine vision system classifies the condition of the material by examining individual pixels of images, processing them and drawing conclusions with the assistance of knowledge bases and features.
A pattern recognition approach was developed based on methodological knowledge from the manual procedure, realised through a non-destructive testing method to identify flaws in the manually performed condition monitoring of sleepers. A test vehicle was designed to capture sleeper images in a manner similar to visual inspection by a human operator, and the captured images of the wooden sleepers provide the raw data for the pattern recognition approach. The data from the NDT method were further processed and appropriate features extracted; the aim of this data collection was to achieve highly reliable classification results. A key idea is to use an unsupervised classifier, based on the extracted features, to discriminate the condition of wooden sleepers into either good or bad; a self-organising map is used as the classifier. To achieve greater integration, the data collected by the machine vision system were combined through a strategy called fusion, examined at two levels: sensor-level fusion and feature-level fusion. As the goal was to reduce the impact of human error on classifying rail sleepers as good or bad, the results obtained by feature-level fusion, compared against the actual classification, were satisfactory.
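A brief sketch of the classification step, assuming the MiniSom library for the self-organising map and invented three-dimensional image features (crack density, texture energy and edge count are hypothetical names; the thesis's actual feature set is not specified here):

```python
import numpy as np
from minisom import MiniSom  # pip install minisom

# Illustrative only: two invented clusters of 3-D feature vectors standing in
# for "good" and "bad" sleepers (features: crack density, texture energy,
# edge count; all values are synthetic assumptions).
rng = np.random.default_rng(42)
good = rng.normal(loc=[0.2, 0.8, 0.1], scale=0.05, size=(50, 3))
bad  = rng.normal(loc=[0.7, 0.3, 0.6], scale=0.05, size=(50, 3))
data = np.vstack([good, bad])

# Train an unsupervised 4x4 self-organising map on the pooled features.
som = MiniSom(4, 4, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(data, num_iteration=500)

# Each sample maps to its best-matching unit; good and bad samples should
# settle in different regions of the map, which a human can then label.
print("good ->", {som.winner(x) for x in good[:5]})
print("bad  ->", {som.winner(x) for x in bad[:5]})
```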