935 results for internet service provider liability


Relevance:

100.00%

Publisher:

Abstract:

Cloud computing realizes the long-held dream of converting computing capability into a type of utility. It has the potential to fundamentally change the landscape of the IT industry and our way of life. However, as cloud computing expands substantially in both scale and scope, ensuring its sustainable growth is a critical problem. Service providers have long suffered from high operational costs, especially the costs associated with the skyrocketing power consumption of large data centers. At the same time, while efficient power/energy utilization is indispensable for the sustainable growth of cloud computing, service providers must also satisfy a user's quality of service (QoS) requirements. This problem becomes even more challenging considering the increasingly stringent power/energy and QoS constraints, as well as other factors such as the highly dynamic, heterogeneous, and distributed nature of the computing infrastructures. In this dissertation, we study the problem of delay-sensitive cloud service scheduling for the sustainable development of cloud computing. We first focus our research on the development of scheduling methods for delay-sensitive cloud services on a single server with the goal of maximizing a service provider's profit. We then extend our study to scheduling cloud services in distributed environments. In particular, we develop a queue-based model and derive efficient request dispatching and processing decisions in a multi-electricity-market environment to improve the profits of service providers. We next study a problem of multi-tier service scheduling. By carefully assigning sub-deadlines to the service tiers, our approach can significantly improve resource usage efficiency with statistically guaranteed QoS. Finally, we study the power-conscious resource provisioning problem for service requests with different QoS requirements. By properly sharing computing resources among different requests, our method statistically guarantees all QoS requirements with a minimized number of powered-on servers and thus minimized power consumption. The significance of our research is that it is one part of the integrated effort from both industry and academia to ensure the sustainable growth of cloud computing as it continues to evolve and change our society profoundly.
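As a loose illustration of the request-dispatching idea in a multi-electricity-market setting (not the dissertation's actual queue-based model), a greedy dispatcher might route each delay-sensitive request to the cheapest data center that can still meet its deadline; all names and figures below are hypothetical.

```python
# Illustrative sketch only: a greedy dispatcher that sends each delay-sensitive
# request to the data center with the cheapest current electricity price among
# those that can still meet the request's deadline. The dissertation's actual
# queue-based model is more elaborate; names and numbers here are hypothetical.
from dataclasses import dataclass

@dataclass
class DataCenter:
    name: str
    price_per_kwh: float          # current electricity-market price
    queue_delay_s: float          # estimated queueing + processing delay
    energy_per_request_kwh: float

@dataclass
class Request:
    rid: str
    deadline_s: float             # maximum tolerable response time
    revenue: float                # income if served within the deadline

def dispatch(request, data_centers):
    """Pick the feasible data center with the lowest serving cost."""
    feasible = [dc for dc in data_centers if dc.queue_delay_s <= request.deadline_s]
    if not feasible:
        return None, 0.0          # reject: no site can meet the deadline
    best = min(feasible, key=lambda dc: dc.price_per_kwh * dc.energy_per_request_kwh)
    profit = request.revenue - best.price_per_kwh * best.energy_per_request_kwh
    return best, profit

if __name__ == "__main__":
    sites = [DataCenter("east", 0.12, 0.8, 0.05),
             DataCenter("west", 0.07, 1.5, 0.05)]
    req = Request("r1", deadline_s=1.0, revenue=0.02)
    site, profit = dispatch(req, sites)
    print(site.name if site else "rejected", round(profit, 4))  # east 0.014
```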

Relevance:

100.00%

Publisher:

Abstract:

This thesis studies, in collaboration with a Finnish logistics service company, gainsharing and the development of gainsharing models in a logistics outsourcing context. The purpose of the study is to create various gainsharing model variations for the use of a service provider and its customers in order to develop and enhance the customer's processes and operations, create savings, and improve the collaboration between the companies. The study concentrates on offering gainsharing model alternatives for companies operating in an internal logistics outsourcing context. Additionally, the prerequisites for a gainsharing arrangement are introduced. At the beginning of the study an extensive literature review is conducted, exploring three main themes: collaboration in an outsourcing context, key account management, and the gainsharing philosophy. Customer expectations and experiences are gathered by interviewing the case company's employees and its key customers. In order to design the gainsharing model prototypes, the knowledge and experience of customers and other experts are utilized. The result of this thesis is five gainsharing model variations based on the empirical and theoretical data. In addition, the instructions related to each created model are given to the case company, but are not available in this paper.
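As a minimal, hypothetical illustration of the underlying gainsharing idea (the thesis's actual model variations are not reproduced here), realized savings against an agreed cost baseline are split between the service provider and the customer at a negotiated ratio:

```python
# Minimal, hypothetical gainsharing illustration: savings against an agreed
# baseline are split between the logistics service provider and the customer
# at a negotiated ratio. Figures are invented for the example.
def gainshare(baseline_cost, actual_cost, provider_share=0.5):
    gain = max(baseline_cost - actual_cost, 0.0)   # only positive savings are shared
    return gain * provider_share, gain * (1 - provider_share)

provider_gain, customer_gain = gainshare(100_000, 88_000, provider_share=0.4)
print(provider_gain, customer_gain)  # 4800.0 7200.0
```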

Relevance:

100.00%

Publisher:

Abstract:

With the increasing popularity of utility-oriented computing, where resources are traded as services, efficient management of quality of service (QoS) has become increasingly significant to both service consumers and service providers. In the context of distributed multimedia content adaptation deployed on service-oriented computing, ensuring the stringent QoS requirements of content adaptation is a significant and immediate challenge. However, QoS guarantees in this context have not been accorded the attention they deserve. In this paper, we address this problem. We formulate service level agreement (SLA) management for distributed multimedia content adaptation deployment on service-oriented computing as an integer programming problem. We propose an SLA management framework that enables the service provider to determine deliverable QoS before settling an SLA with potential service consumers, in order to optimize QoS guarantees. We analyze the performance of the proposed strategy under various conditions in terms of the SLA success rate, the rejection rate, and the impact of resource data errors on potential violations of the agreed-upon SLA. We also compare the proposed SLA management framework with a baseline approach in which the distributed multimedia content adaptation is deployed on a service-oriented platform without SLA consideration. The results of the experiments show that the proposed SLA management framework substantially outperforms the baseline approach, confirming that SLA management is a core requirement for the deployment of distributed multimedia content adaptation on service-oriented systems.
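As a greatly simplified illustration of the integer-programming view (not the paper's actual formulation), deciding which adaptation requests to admit so that accepted value is maximized within a capacity limit is a 0/1 knapsack-style program; the sketch below solves a toy instance by enumeration.

```python
# Simplified, hypothetical illustration of an integer-programming view of SLA
# admission: choose a 0/1 vector over pending adaptation requests that maximizes
# total value while the summed resource demand stays within server capacity.
# This is a toy knapsack stand-in, not the paper's actual formulation.
from itertools import product

requests = [  # (request name, value of meeting its SLA, resource units needed)
    ("news_clip", 8, 3),
    ("lecture",   5, 2),
    ("trailer",   6, 4),
]
CAPACITY = 5

best_value, best_pick = 0, ()
for pick in product([0, 1], repeat=len(requests)):          # enumerate 0/1 decisions
    load = sum(x * r[2] for x, r in zip(pick, requests))
    value = sum(x * r[1] for x, r in zip(pick, requests))
    if load <= CAPACITY and value > best_value:
        best_value, best_pick = value, pick

accepted = [r[0] for x, r in zip(best_pick, requests) if x]
print(accepted, best_value)   # ['news_clip', 'lecture'] 13
```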

Relevance:

100.00%

Publisher:

Abstract:

By 2010, cloud computing had become established as a new model of IT provisioning for service providers. New market players and businesses emerged, threatening the business models of established players. This teaching case explores the challenges arising from the impact of the new cloud computing technology on an established, multinational IT service provider called ITSP. Should incumbent vendors adopt cloud computing offerings? And, if so, what form should those offerings take? The teaching case focuses on the strategic dimensions of technological developments and the threats and opportunities they present. It requires strategic decision-making and forecasting under high uncertainty. The critical question is whether cloud computing is a disruptive technology or simply an alternative channel for supplying computing resources over the Internet. The case challenges students to assess this new technology and plan ITSP's responses.

Relevance:

100.00%

Publisher:

Abstract:

This work presents an extensive investigative survey of cloud computing, with a main focus on the gaps that are slowing down cloud adoption, as well as a review of threat remediation challenges. It also offers some experimentally supported thoughts on novel approaches to addressing some of the widely discussed cyber-attack types using machine learning techniques. These approaches are constructed so that cloud customers can detect cyber-attacks in their VMs without much help from the cloud service provider.
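The abstract does not name the specific machine learning techniques, so the sketch below is only one plausible customer-side approach: unsupervised anomaly detection over in-VM metrics with scikit-learn's IsolationForest, run entirely inside the VM on synthetic data.

```python
# Illustrative only: the abstract does not name the machine learning techniques
# used, so this sketch shows one common customer-side option -- unsupervised
# anomaly detection over in-VM metrics (CPU, network I/O) with an Isolation
# Forest -- requiring nothing from the cloud provider. Data are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[30.0, 5.0], scale=[5.0, 1.0], size=(500, 2))   # % CPU, MB/s net
attack = rng.normal(loc=[95.0, 60.0], scale=[2.0, 5.0], size=(10, 2))   # e.g. a flooding burst
samples = np.vstack([normal, attack])

detector = IsolationForest(contamination=0.02, random_state=0).fit(normal)
labels = detector.predict(samples)          # +1 = looks normal, -1 = anomalous
print("flagged", int((labels == -1).sum()), "of", len(samples), "samples")
```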

Relevance:

100.00%

Publisher:

Abstract:

The literature on corporate identity management suggests that managing corporate identity is a strategically complex task embracing the shaping of a range of dimensions of organisational life. The performance measurement literature and its applications likewise now emphasise an organisation's ability to incorporate various dimensions, considering both financial and non-financial performance measures, when assessing success. The inclusion of these soft, non-financial measures challenges organisations to quantify intangible aspects of performance such as corporate identity, transforming unmeasurables into measurables. This paper explores the regulatory role of the balanced scorecard in shaping key dimensions of corporate identity in a public sector shared service provider in Australia. This case study employs qualitative interviews with senior managers and employees, secondary data, and participant observation. The findings suggest that the use of the balanced scorecard has the potential to support identity construction, as an organisational symbol, a tool for communicating vision, and a strategy, through creating conversations that self-regulate behaviour. The development of an integrated performance measurement system, the balanced scorecard, becomes an expression of a desired corporate identity, while the performance measures and the continuous process provide the resources for interpreting actual corporate identities. Through this process of understanding and mobilising the interaction, it may be possible to create a less obtrusive and more subtle way to control “what an organisation is”. This case study also suggests that the theoretical and practical fusion of disciplinary knowledge around corporate identities and performance measurement systems could contribute to understanding and shaping corporate identities.

Relevance:

100.00%

Publisher:

Abstract:

For many organizations, maintaining and upgrading enterprise resource planning (ERP) systems (large packaged application software) is often far more costly than the initial implementation. Systematic planning and knowledge of the fundamental maintenance processes and maintenance-related management data are required in order to administer maintenance activities effectively and efficiently. This paper reports a revelatory case study of Government Services Provider (GSP), a high-performing ERP service provider to government agencies in Australia. GSP's ERP maintenance-process and maintenance-data standards are compared with the IEEE/EIA 12207 software engineering standard for custom software, also drawing upon published research, to identify how practices in the ERP context diverge from the IEEE standard. While the results show that many best practices reflected in the IEEE standard have broad relevance to software generally, divergent practices in the ERP context necessitate a shift in management focus, additional responsibilities, and different maintenance decision criteria. The study findings may provide useful guidance to practitioners, as well as input to the IEEE and other related standards.

Relevance:

100.00%

Publisher:

Abstract:

President’s Message

Hello fellow AITPM members,

We’ve seen a lot of press lately about the Federal Government’s plan for the multibillion dollar rollout of its high speed broadband network, which at the moment is being rated at a speed of 100 Mb/s. This seems fantastic in comparison to the not atypical 250 to 500 kb/s that I receive on my metropolitan cable broadband, which incidentally my service provider rates at theoretical speeds of up to 8 Mb/s. I have no doubt that such a scheme will generate significant advantages for business and consumers. However, I also have some reservations.

Only a few years ago I marvelled at my first 256 MB USB stick, which cost my employer about $90. Last month I purchased a 16 GB stick with a free computer carry bag for $80, which on the back of my envelope has given me about 72 times the value of my first USB stick, not including the carry bag! I am pretty sure the technology industry will find a way to eventually push a lot more than 100 Mb/s down the optic fibre network, just as it has done by pushing several Mb/s of ADSL2 down antique copper wire. This makes me wonder about the general problem of inbuilt obsolescence of all things high-tech due to rapid advances in the tech industry.

As a transport professional I then think to myself that our industry has been moving forward at a somewhat slower pace. We certainly have had major milestones with significant impacts, such as the move from horse and cart to the self-propelled motor vehicle, sealing and formal geometric design of roads, development of motorways, signalisation of intersections, coordination of networks, and simulation modelling for real-time adaptive control (perhaps major change has come at a frequency of 30 years or so?). But now, with ITS truly penetrating the transport market, largely thanks to the in-car GPS navigator, smart phone, e-toll and e-ticket, I believe that to avoid our own obsolescence we’re going to need to “plan for ITS” rather than just do what we seem to have been doing up until now, that is, getting it out there. And we’ll likely need to do it at a faster pace. It will involve understanding how to data-mine enormous data sets, better understanding the human/machine interface, keeping pace with automotive technology more closely, resolving the ethical and privacy chestnuts, and, in the main, actually planning for ITS to make people’s lives easier rather than harder. And in amongst this we’ll need to keep pace with the types of technology advances similar to my USB stick example above. All the while we’ll be making a brand new set of friends in the disciplines that will morph into ITS along with us. Hopefully these will all be “good” problems for our profession to have.

I should close by reminding everyone again that AITPM’s flagship event, the 2009 AITPM National Conference, Traffic Beyond Tomorrow, is being held in Adelaide from 5 to 7 August. www.aitpm.com has all of the details about how to register, sponsor a booth or session, etc.

Best regards all,
Jon Bunker
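A quick check of the back-of-the-envelope figure above, treating the sticks as 256 MB for $90 and 16 GB for $80:

```python
# Back-of-the-envelope check of the "about 72 times the value" comparison above,
# treating the sticks as 256 MB for $90 and 16 GB (16 * 1024 MB) for $80.
old_mb_per_dollar = 256 / 90
new_mb_per_dollar = (16 * 1024) / 80
print(round(new_mb_per_dollar / old_mb_per_dollar))   # -> 72
```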

Relevance:

100.00%

Publisher:

Abstract:

Home-based palliative care services are facing increasing challenges in servicing the needs of clients who live alone and without a primary caregiver. The findings from the analysis of 721 services’ records from three Australian states, and feedback from health professionals in interviews and postal surveys, demonstrated that there were aspects of being on one’s own with a terminal illness and living at home that require a specialised approach and support. This study explored the issues of palliative care patients living alone, from a service provider perspective, and provided evidence-based information to assist with service planning. The study made recommendations to the Australian Department of Health and Ageing about services considered important in developing support structures for this growing population.

Relevance:

100.00%

Publisher:

Abstract:

Value Management (VM) has been proven to provide a structured framework, together with supporting tools and techniques, that facilitates effective decision-making in many types of projects, thus achieving ‘best value’ for clients. It is identified at the international level as a natural career progression for the construction service provider and as an opportunity to develop leading-edge skills. The services offered by contractors and consultants in the construction sector have been expanding. In an increasingly competitive and global marketplace, firms are seeking ways to differentiate their services to ever more knowledgeable and demanding clients. The traditional demarcations have given way, and the old definition of what contractors, designers, engineers and quantity surveyors can and cannot do in terms of their market offering has changed. Project management, design, and cost and safety consultancy services are being delivered by a diverse range of suppliers. Value management services have been developing in various sectors of industry, from manufacturing to the military and now construction. Given the growing evidence that VM has been successful in delivering value for money to the client, VM would appear to be gaining momentum as an essential management tool in the Malaysian construction sector. The recently issued VM Circular 3/2009 by the Economic Planning Unit Malaysia (EPU) possibly marks a new beginning in public sector client acceptance of the strength of VM in construction. This paper therefore studies the prospects for construction service providers of marketing the benefits of VM, and how it may provide an edge in an increasingly competitive Malaysian construction industry.

Relevance:

100.00%

Publisher:

Abstract:

In open railway access markets, a train service provider (TSP) negotiates with an infrastructure provider (IP) for track access rights. This negotiation has been modeled by a multi-agent system (MAS) in which the IP and TSP are represented by separate software agents. One task of the IP agent is to generate feasible (and preferably optimal) track access rights, subject to the constraints submitted by the TSP agent. This paper formulates an IP-TSP transaction and proposes a branch-and-bound algorithm for the IP agent to identify the optimal track access rights. Empirical simulation results show that the model is able to emulate rational agent behaviors. The simulation results also show good consistency between timetables obtained from the proposed method and those derived from the scheduling principles adopted in practice.
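The abstract does not detail the branch-and-bound formulation, so the following is only a toy sketch of the general idea: an IP agent selecting a conflict-free, value-maximizing subset of candidate access slots, pruning branches whose optimistic bound cannot beat the best solution found so far. All names and values are hypothetical.

```python
# Highly simplified, hypothetical sketch of the branch-and-bound idea: the IP agent
# picks a conflict-free subset of candidate track access slots that maximises the
# TSP's stated value. Real track access rights involve timing and capacity detail
# well beyond this toy model.
def branch_and_bound(slots, conflicts):
    """slots: list of (name, value); conflicts: set of frozenset name pairs."""
    best = {"value": 0.0, "pick": []}

    def ok(name, chosen):
        return all(frozenset((name, c)) not in conflicts for c in chosen)

    def recurse(i, chosen, value):
        if value > best["value"]:
            best["value"], best["pick"] = value, list(chosen)
        if i == len(slots):
            return
        remaining = sum(v for _, v in slots[i:])
        if value + remaining <= best["value"]:   # bound: prune hopeless branches
            return
        name, v = slots[i]
        if ok(name, chosen):                     # branch 1: take the slot
            recurse(i + 1, chosen + [name], value + v)
        recurse(i + 1, chosen, value)            # branch 2: skip the slot

    recurse(0, [], 0.0)
    return best["pick"], best["value"]

slots = [("dep_0800", 5.0), ("dep_0815", 4.0), ("dep_0830", 3.0)]
conflicts = {frozenset(("dep_0800", "dep_0815"))}   # too close together on the same track
print(branch_and_bound(slots, conflicts))           # (['dep_0800', 'dep_0830'], 8.0)
```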

Relevance:

100.00%

Publisher:

Abstract:

With the recent regulatory reforms in a number of countries, railway resources are no longer managed by a single party but are distributed among different stakeholders. To facilitate the operation of train services, a train service provider (SP) has to negotiate with the infrastructure provider (IP) for a train schedule and the associated track access charge. This paper models the SP and IP as software agents and the negotiation as a prioritized fuzzy constraint satisfaction (PFCS) problem. Computer simulations have been conducted to demonstrate the effects on the train schedule when the SP has different optimization criteria. The results show that by assigning different priorities to the fuzzy constraints, agents can represent SPs with different operational objectives.
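As a rough illustration of prioritized fuzzy constraint satisfaction (one common formulation, not necessarily the paper's), each constraint carries a membership function and a priority, and a candidate schedule is scored so that low-priority constraints can be violated with little penalty; the names and numbers below are hypothetical.

```python
# Illustrative sketch of one common prioritized-fuzzy-constraint formulation
# (not necessarily the paper's exact one): each constraint i has a membership
# function mu_i and a priority p_i in [0, 1], and a candidate schedule x is
# scored as min_i max(mu_i(x), 1 - p_i), so low-priority constraints may be
# violated with little penalty. Names and numbers are hypothetical.
def around(target, tolerance):
    """Triangular fuzzy 'approximately target' membership function."""
    return lambda x: max(0.0, 1.0 - abs(x - target) / tolerance)

constraints = [
    ("depart near 08:00",   around(480, 30), 1.0),   # minutes after midnight, near-hard
    ("journey near 55 min", around(55, 15),  0.6),   # softer preference
]

def satisfaction(schedule):
    depart, journey = schedule
    values = {"depart near 08:00": depart, "journey near 55 min": journey}
    return min(max(mu(values[name]), 1.0 - prio) for name, mu, prio in constraints)

print(round(satisfaction((490, 70)), 3))   # depart 08:10, 70 min journey -> 0.4
```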

Relevance:

100.00%

Publisher:

Abstract:

The Digital Economy Bill has been heavily criticized by consumer organizations, internet service providers and technology experts on the grounds that it will reduce the public’s ability to access politically sensitive information, impinge on citizens’ rights to privacy, threaten freedom of expression and have a chilling effect on digital innovation. Its passage in spite of these criticisms reflects, among other things, the power of the rhetoric that has been employed by its proponents. This paper examines economic arguments surrounding the digital economy debate in light of lessons from one of the world's fastest growing economies: China.

Relevance:

100.00%

Publisher:

Abstract:

In a digital world, users' Personally Identifiable Information (PII) is normally managed with a system called an Identity Management System (IMS). There are many types of IMSs, and there are situations when two or more IMSs need to communicate with each other (such as when a service provider needs to obtain some identity information about a user from a trusted identity provider). Interoperability issues can arise when the communicating parties use different types of IMS. To facilitate interoperability between different IMSs, an Identity Meta System (IMetS) is normally used. An IMetS can, at least theoretically, join various types of IMSs to make them interoperable and give users the illusion that they are interacting with just one IMS. However, due to the complexity of an IMS, joining various types of IMSs is a technically challenging task, let alone assessing how well an IMetS manages to integrate them. The first contribution of this thesis is the development of a generic IMS model called the Layered Identity Infrastructure Model (LIIM). Using this model, we develop a set of properties that an ideal IMetS should provide. This idealized form is then used as a benchmark to evaluate existing IMetSs. Different types of IMS provide varying levels of privacy protection support. Unfortunately, as observed by Jøsang et al. (2007), there is insufficient privacy protection in many of the existing IMSs. In this thesis, we study and extend a type of privacy-enhancing technology known as an Anonymous Credential System (ACS). In particular, we extend the ACS built on the cryptographic primitives proposed by Camenisch, Lysyanskaya, and Shoup, which we call the Camenisch, Lysyanskaya, Shoup Anonymous Credential System (CLS-ACS). The goal of CLS-ACS is to let users be as anonymous as possible. Unfortunately, CLS-ACS has problems, including (1) the concentration of power in a single entity, known as the Anonymity Revocation Manager (ARM), who, if malicious, can trivially reveal a user's PII (resulting in an illegal revocation of the user's anonymity), and (2) poor performance due to the resource-intensive cryptographic operations required. The second and third contributions of this thesis are two protocols that reduce the trust dependencies on the ARM during users' anonymity revocation. Both protocols distribute trust from the ARM to a set of n referees (n > 1), resulting in a significant reduction in the probability of an anonymity revocation being performed illegally. The first protocol, the User Centric Anonymity Revocation Protocol (UCARP), allows a user's anonymity to be revoked in a user-centric manner (that is, the user is aware that his/her anonymity is about to be revoked). The second protocol, the Anonymity Revocation Protocol with Re-encryption (ARPR), allows a user's anonymity to be revoked by a service provider in an accountable manner (that is, there is a clear mechanism to determine which entity can eventually learn, and possibly misuse, the identity of the user). The fourth contribution of this thesis is a protocol called the Private Information Escrow bound to Multiple Conditions Protocol (PIEMCP), designed to address the performance issue of CLS-ACS by applying CLS-ACS in a federated single sign-on (FSSO) environment. Our analysis shows that PIEMCP can both reduce the number of expensive modular exponentiation operations required and lower the risk of illegal revocation of users' anonymity. Finally, the protocols proposed in this thesis are complex and need to be formally evaluated to ensure that their required security properties are satisfied. In this thesis, we use Coloured Petri nets (CPNs) and their corresponding state space analysis techniques. All of the protocols proposed in this thesis have been formally modeled and verified using these techniques. Therefore, the fifth contribution of this thesis is a demonstration of the applicability of CPNs and their analysis techniques in modeling and verifying privacy-enhancing protocols. To our knowledge, this is the first time that CPNs have been comprehensively applied to model and verify privacy-enhancing protocols. From our experience, we also propose several CPN modeling approaches, including the modeling of complex cryptographic primitives (such as zero-knowledge proof protocols), attack parameterization, and others. The proposed approaches can be applied to other security protocols, not just privacy-enhancing protocols.
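The thesis's protocols distribute trust in anonymity revocation from a single ARM to n referees. As a loose illustration of that general idea only (not UCARP or ARPR themselves), a revocation secret can be additively secret-shared so that no single referee can revoke on its own:

```python
# Loose illustration of the trust-distribution idea only (not UCARP or ARPR):
# an anonymity-revocation secret is additively shared among n referees modulo a
# prime, so revocation requires all n shares and no single party -- including a
# lone malicious ARM -- can recover it alone. Parameters are toy-sized.
import secrets

P = (1 << 127) - 1          # a Mersenne prime used as the share modulus

def split(secret, n):
    """Split `secret` into n additive shares mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def combine(shares):
    return sum(shares) % P

revocation_key = secrets.randbelow(P)
shares = split(revocation_key, n=5)
assert combine(shares) == revocation_key            # all five referees together succeed
assert combine(shares[:4]) != revocation_key        # any four alone (almost surely) fail
print("shared among", len(shares), "referees")
```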

Relevance:

100.00%

Publisher:

Abstract:

With a focus on intention and motivation, this paper describes a study involving three organisational communities and their collective effort to develop and provide more inclusive housing for people with disabilities and their families. While many studies, such as that by Rocha & Miles (2009), focus on commercial organisations and on sustainability from an economic perspective, this study involves a not-for-profit organisation (the accommodation and service provider) as well as a research organisation and a design action group volunteering their services free of charge. From this pro-bono context, the paper describes a case study that explores the nature of the collective as a basis for creative practice and political activism, together with the theoretical implications and wider application in terms of emerging research in collaborative entrepreneurship and design activism.