109 results for internet service provider liability


Relevance:

30.00%

Publisher:

Abstract:

The convergence of Internet marketplaces and service-oriented architectures has spurred the growth of Web service ecosystems. This paper articulates a vision for Web service ecosystems, discusses early manifestations of this vision, and presents a unifying architecture to support the emergence of larger and more sophisticated ecosystems.

Relevance:

30.00%

Publisher:

Abstract:

A prominent research focus, especially in the context of EU public funding, has been the systematic use of the Internet for new ways of value creation in the services sector. This idea of service networks in the Internet, frequently dubbed the Internet of Services or Web service ecosystems, aims to make services tradable in digital media. In order to enable communication and trade between providers and consumers of services, the Internet of Services requires a standard that creates a "commercial envelope" around a service. This is where the Unified Service Description Language (USDL) comes into play as a normative and balanced unification of service information. The unified description established by USDL is machine-processable and covers technical and business aspects of a service, as well as functional and non-functional attributes.
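To make the idea of a machine-processable "commercial envelope" concrete, the following is a simplified, hypothetical illustration in Python (not actual USDL syntax) of the kinds of business, technical, functional, and non-functional information such a description might carry; all names, endpoints, and values are invented for illustration.

# Hypothetical, simplified stand-in for a machine-processable service
# description; this is NOT USDL syntax, only an illustration of the kinds
# of information a "commercial envelope" around a service could carry.
service_description = {
    "service": "ParcelTrackingService",                       # invented name
    "business": {
        "provider": "Example Logistics Ltd",                  # invented provider
        "pricing": {"model": "per_call", "price_eur": 0.02},
        "legal": {"sla_uptime": "99.5%", "jurisdiction": "DE"},
    },
    "technical": {
        "interface": "https://api.example.com/tracking/v1",   # placeholder endpoint
        "protocol": "REST/JSON",
    },
    "functional": {
        "capability": "track a parcel by consignment number",
        "inputs": ["consignment_no"],
        "outputs": ["status", "last_location"],
    },
    "non_functional": {
        "availability": "24/7",
        "response_time_ms": 500,
    },
}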

Relevance:

30.00%

Publisher:

Abstract:

Preparing valuations is a time-consuming process involving site inspections, research and report formulation. The ease of access to the internet has changed how and where valuations may be undertaken. No longer is it necessary to return to the office to finalise reports, or to leave your desk in order to undertake research. This enables more streamlined service delivery and is viewed as a positive development. However, it is not without negative impacts. This paper seeks to inform practitioners of the work environment changes flowing from increased access to the internet. It identifies how increased accessibility to, and use of, technology and the internet has affected, and will continue to affect, valuation service provision into the future.

Relevance:

30.00%

Publisher:

Abstract:

The rapid growth of services available on the Internet and exploited through ever-globalizing business networks poses new challenges for service interoperability. New services, ranging from consumer “apps” to enterprise suites and platform and infrastructure resources, are vying for demand with quickly evolving and overlapping capabilities, and with shorter cycles of extending service access from user interfaces to software interfaces. Services drawn from a wider global setting are subject to greater change and heterogeneity, creating new requirements for structural and behavioral interface adaptation. In this paper, we analyze service interoperability scenarios in global business networks and propose new patterns for service interactions, beyond those proposed over the last ten years through the development of Web service standards and process choreography languages. By contrast, we reduce the assumptions of design-time knowledge required to adapt services, giving way to run-time mismatch resolution; we extend the focus from bilateral to multilateral messaging interactions; and we propose declarative ways in which services and interactions take part in long-running conversations via the explicit use of state.
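As a rough illustration of the direction described above (not a reproduction of the paper's interaction patterns), the following Python sketch shows a long-running, multilateral conversation with explicit shared state and a run-time hook for resolving message mismatches; the class, message shapes, and participants are all hypothetical.

# Illustrative sketch only: a long-running, multilateral conversation with
# explicit shared state and a run-time adapter hook for message mismatches.
# Names, rules, and message formats are hypothetical.
class Conversation:
    def __init__(self, conversation_id):
        self.id = conversation_id
        self.state = {"status": "open", "participants": set()}
        self.adapters = []                       # run-time mismatch resolvers

    def join(self, participant):
        self.state["participants"].add(participant)

    def register_adapter(self, adapter):
        """Adapter: callable(message) -> normalised message, or None if it cannot help."""
        self.adapters.append(adapter)

    def post(self, message):
        """Accept a message from any participant; adapt unknown shapes at run time."""
        if "type" not in message:
            for adapt in self.adapters:
                adapted = adapt(message)
                if adapted is not None:
                    message = adapted
                    break
            else:
                raise ValueError("unresolvable message mismatch")
        self.state[message["type"]] = message["body"]   # state is explicit and shared
        return self.state

# Example: a quote request flowing between three parties in one conversation.
conv = Conversation("order-42")
for p in ("buyer", "carrier", "customs-broker"):
    conv.join(p)
conv.register_adapter(lambda m: {"type": m["msgKind"], "body": m["payload"]}
                      if "msgKind" in m else None)
conv.post({"type": "quote_request", "body": {"weight_kg": 120}})
conv.post({"msgKind": "quote", "payload": {"price": 310.0}})   # adapted at run time
print(conv.state)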

Relevance:

30.00%

Publisher:

Abstract:

Over the last two decades, the internet and e-commerce have reshaped the way we communicate, interact and transact. In the converged environment enabled by high-speed broadband, Web 2.0, social media, virtual worlds, user-generated content, cloud computing, VoIP, open source software and open content have rapidly become established features of our online experience. Business and government alike are increasingly using the internet as the preferred platform for delivery of their goods and services and for effective engagement with their clients. New ways of doing things online and challenges to existing business, government and social activities have tested current laws and often demand new policies and laws, adapted to the new realities. The focus of this book is the regulation of social, cultural and commercial activity on the World Wide Web. It considers developments in the law that have been, and continue to be, brought about by the emergence of the internet and e-commerce. It analyses how the law is applied to define rights and obligations in relation to online infrastructure, content and practices.

Relevance:

30.00%

Publisher:

Abstract:

Recent advances in the area of ‘Transformational Government’ position the citizen at the centre of focus. This paradigm shift from a department-centric to a citizen-centric focus requires governments to re-think their approach to service delivery, thereby decreasing costs and increasing citizen satisfaction. The introduction of franchises as a virtual business layer between the departments and their citizens is intended to provide a solution. Franchises are structured to address the needs of citizens independently of internal departmental structures. For delivering services online, governments pursue the development of a One-Stop Portal, which structures information and services through these franchises. Thus, each franchise can be mapped to a specific service bundle, which groups together services deemed to be of relevance to a specific citizen need. This study focuses on the development and evaluation of these service bundles. Two research questions guide the line of investigation: Research Question 1: What methods can be used by governments to identify service bundles as part of governmental One-Stop Portals? Research Question 2: How can the quality of service bundles in governmental One-Stop Portals be evaluated?
The first research question asks about the identification of suitable service bundle identification methods. A literature review was first conducted to conceptualise the service bundling task in general. As a consequence, a 4-layer model of service bundling and a morphological box were created, detailing characteristics that are of relevance when identifying service bundles. Furthermore, a literature review of Decision-Support Systems was conducted to identify approaches of relevance in different bundling scenarios. These initial findings were complemented by targeted studies of multiple leading governments in the e-government domain, as well as consultation with a local expert in the field, with the aim of identifying the current status of online service delivery and service bundling in practice. These findings led to the conceptualisation of two service bundle identification methods applicable in the context of the Queensland Government: a provider-driven approach, based on service description languages, attributes, and relationships between services; and a citizen-driven approach, based on analysing the outcomes of content identification and grouping workshops with citizens. Both methods were then applied and evaluated in practice.
The conceptualisation of the provider-driven method for service bundling required the initial specification of relevant attributes that could be used to identify similarities between services, called relationships; these relationships then formed the basis for the identification of service bundles. This study conceptualised and defined seven relationships, namely ‘Co-location’, ‘Resource’, ‘Co-occurrence’, ‘Event’, ‘Consumer’, ‘Provider’, and ‘Type’. The relationships, and the bundling method itself, were applied and refined as part of six Action Research cycles in collaboration with the Queensland Government. The findings show that attributes and relationships can be used effectively as a means for bundle identification, if distinct decision rules are in place to prescribe how services are to be identified.
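As an illustration only, the following Python sketch shows one hypothetical decision rule of the kind described above: services that agree on enough of the named relationship types (here simplified to attribute equality) are merged into a bundle. The services, attribute values, and threshold are invented and do not reproduce the thesis's rules.

# Illustrative sketch only: a hypothetical decision rule for provider-driven
# bundle identification, using simplified stand-ins for some of the seven
# relationship types named above. All data are invented.
from itertools import combinations

SERVICES = {
    "register_birth":   {"Event": "new child", "Provider": "Registry", "Type": "registration"},
    "apply_child_care": {"Event": "new child", "Provider": "Communities", "Type": "application"},
    "enrol_school":     {"Event": "school age", "Provider": "Education", "Type": "enrolment"},
}

def shared_relationships(a, b):
    """Return the relationship types on which two services agree."""
    return {rel for rel in a if rel in b and a[rel] == b[rel]}

def identify_bundles(services, min_shared=1):
    """Greedily merge services into bundles when they share enough relationships."""
    bundles = [{name} for name in services]
    for x, y in combinations(services, 2):
        if len(shared_relationships(services[x], services[y])) >= min_shared:
            bx = next(b for b in bundles if x in b)
            by = next(b for b in bundles if y in b)
            if bx is not by:
                bx |= by
                bundles.remove(by)
    return bundles

print(identify_bundles(SERVICES))  # the two "new child" services end up in one bundle
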
For the conceptualisation of the citizen-driven method, insights from the case studies led to the decision to involve citizens through card sorting activities. Based on an initial list of services relevant to a certain franchise, participating citizens grouped services as they saw fit. The card sorting activity, as well as the required analysis and aggregation of the individual card sorting results, was analysed in depth as part of this study. A framework was developed that can be used as a decision-support tool to assist in deciding which card sorting analysis method should be utilised in a given scenario. The characteristic features associated with card sorting in a government context led to the decision to utilise statistical analysis approaches, such as cluster analysis and factor analysis, to aggregate card sorting results (see the sketch at the end of this abstract).
The second research question asks how the quality of service bundles can be assessed. An extensive literature review was conducted, focussing on bundle, portal, and e-service quality. It was found that different studies use different constructs, terminology, and units of analysis, which makes comparing these models a difficult task. As a direct result, a framework was conceptualised that can be used to position past and future studies in this research domain. Complementing the literature review, interviews conducted as part of the case studies with leaders in e-government indicated that satisfaction is typically evaluated for the overall portal once the portal is online, but quality tests are not conducted during the development phase. Consequently, a research model that appropriately defines perceived service bundle quality had to be developed from scratch. Based on existing theory, such as the Theory of Reasoned Action, Expectation Confirmation Theory, and the Theory of Affordances, perceived service bundle quality was defined as an inferential belief and positioned within the nomological net of services. Based on the literature analysis on quality, and on the subsequent work of a focus group, the hypothesised antecedents (descriptive beliefs) of the construct and the associated question items were defined and the research model conceptualised. The model was then tested, refined, and finally validated during six Action Research cycles.
Results show no significant difference in quality or satisfaction among users between bundles identified with the provider-driven method and those identified with the citizen-driven method. The decision on which method to choose, it was found, should be based on contextual factors, such as objectives, resources, and the need for visibility. The constructs of the bundle quality model were also examined. While the quality of bundles identified through the citizen-centric approach could be explained through the constructs ‘Navigation’, ‘Ease of Understanding’, and ‘Organisation’, bundles identified through the provider-driven approach could be explained solely through the constructs ‘Navigation’ and ‘Ease of Understanding’. An active labelling style for bundles, as part of the provider-driven Information Architecture, had a larger impact on ‘Quality’ than the topical labelling style used in the citizen-centric Information Architecture. However, ‘Organisation’, reflecting the internal, logical structure of the Information Architecture, was a significant factor impacting on ‘Quality’ only in the citizen-driven Information Architecture.
Hence, it was concluded that active labelling can compensate for a lack of logical structure. Further studies are needed to test this conjecture; such studies may involve building alternative models and conducting additional empirical research (e.g. use of an active labelling style for the citizen-driven Information Architecture). This thesis contributes to the body of knowledge in several ways. Firstly, it presents an empirically validated model of the factors explaining and predicting a citizen’s perception of service bundle quality. Secondly, it provides two alternative methods that can be used by governments to identify service bundles when structuring the content of a One-Stop Portal. Thirdly, it provides a detailed narrative of how the recent paradigm shift in the public sector towards a citizen-centric focus can be pursued by governments; the research methodology followed by this study can serve as an exemplar for governments seeking to achieve a citizen-centric approach to service delivery.
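Returning to the citizen-driven method and the aggregation of card sorting results referenced above, the following Python sketch (using SciPy) illustrates one standard way individual card sorts can be aggregated with hierarchical cluster analysis. The services, participant sorts, and distance threshold are hypothetical and do not reproduce the thesis's analysis.

# Minimal sketch of aggregating card-sorting results with hierarchical
# clustering, one of the statistical approaches mentioned above.
# Service names and sort data are hypothetical.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

services = ["register_birth", "apply_child_care", "enrol_school", "immunisation"]

# Each participant's sort: service index -> group label chosen by that participant.
sorts = [
    {0: "family", 1: "family", 2: "education", 3: "health"},
    {0: "family", 1: "family", 2: "education", 3: "family"},
]

# Dissimilarity = fraction of participants who did NOT group a pair together.
n = len(services)
dissim = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        together = sum(s[i] == s[j] for s in sorts) / len(sorts)
        dissim[i, j] = dissim[j, i] = 1.0 - together

clusters = fcluster(linkage(squareform(dissim), method="average"),
                    t=0.5, criterion="distance")
for label in sorted(set(clusters)):
    print([services[k] for k in range(n) if clusters[k] == label])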

Relevance:

30.00%

Publisher:

Abstract:

Denial-of-service (DoS) attacks are a growing concern for networked services such as the Internet. In recent years, major Internet e-commerce and government sites have been disabled by various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to exhaust the server's resources, such as memory or computational power, rendering the server unable to service honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool for thwarting DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model for analysing them. We also revisit several key establishment protocols to analyse their DoS-resilience properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle that is secure in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed for analysing client puzzles, which leads us to introduce a better security model. We demonstrate the utility of our new security definitions by presenting two stronger hash-based client puzzles, and we show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilience properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework, and we prove that the original security claim of JFK does not hold. We then incorporate an existing technique to reduce the server cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and is secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique which significantly reduces the computation cost of the server, and we employ the technique in TLS, one of the most widely used network protocols, to analyse the security of the resultant protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie-Hellman based key exchange protocol to reduce the Diffie-Hellman exponentiation cost of a party by one multiplication and one addition.
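As general background on client puzzles (the thesis's specific constructions are not reproduced here), the following Python sketch shows a generic hash-based puzzle in the proof-of-work style: the server issues a fresh challenge, the client searches for a nonce whose hash has k leading zero bits, and the server verifies the solution with a single hash.

# Generic hash-based client puzzle (proof of work) -- an illustrative sketch,
# not the constructions proposed in the thesis.
import hashlib, os
from itertools import count

def new_challenge() -> bytes:
    return os.urandom(16)                      # server-chosen, unpredictable challenge

def leading_zero_bits(digest: bytes) -> int:
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def solve(challenge: bytes, k: int) -> int:
    """Client work: roughly 2**k hash evaluations on average."""
    for nonce in count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= k:
            return nonce

def verify(challenge: bytes, nonce: int, k: int) -> bool:
    """Server check: a single hash evaluation."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= k

challenge = new_challenge()
nonce = solve(challenge, k=12)                 # small k keeps the demo fast
assert verify(challenge, nonce, k=12)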

Relevance:

30.00%

Publisher:

Abstract:

Despite its potential to contribute to multiple sustainable policy objectives, urban transit is generally not widely used by the public, in terms of market share, compared with automobiles, particularly in affluent societies with low-density urban forms like Australia. Transit service providers need to attract more people to transit by improving transit quality of service. The key to cost-effective transit service improvements lies in accurate evaluation of policy proposals that takes into account their impacts on transit users. If transit providers knew what is more or less important to their customers, they could focus their efforts on optimising customer-oriented service. Policy interventions could also be specified to influence transit users’ travel decisions, with targets of customer satisfaction and broader community welfare. This significance motivates research into the relationship between urban transit quality of service and users’ perceptions and behaviour. This research focused on two dimensions of transit users’ travel behaviour: route choice and access arrival time choice. The study area chosen was a busy urban transit corridor linking the Brisbane central business district (CBD) and the St. Lucia campus of The University of Queensland (UQ). This multi-system corridor provided a ‘natural experiment’ for transit users between the CBD and UQ, who can choose between busway route 109 (with grade-separated exclusive right-of-way), ordinary on-street bus route 412, and the linear fast ferry CityCat on the Brisbane River. The population of interest was set as attendees at UQ who travelled from the CBD or from a suburb via the CBD. Two waves of internet-based self-completion questionnaire surveys were conducted to collect data on sampled passengers’ perceptions of transit service quality and their use of public transit in the study area. The first-wave survey collected behaviour and attitude data on respondents’ daily transit usage and their direct ratings of the importance of factors of route-level transit quality of service. A series of statistical analyses was conducted to examine the relationships between transit users’ travel and personal characteristics and their transit usage characteristics. A factor-cluster segmentation procedure was applied to respondents’ importance ratings on service quality variables regarding transit route preference, to explore users’ various perspectives on transit quality of service. Based on the perceptions of service quality collected in the second-wave survey, a series of quality criteria of the transit routes under study was quantitatively measured, particularly travel time reliability in terms of schedule adherence. It was shown that mixed traffic conditions and peak-period effects can affect transit service reliability. Multinomial logit models of transit users’ route choice were estimated using route-level service quality perceptions collected in the second-wave survey. The relative importance of service quality factors, such as access and egress times, seat availability, and the busway system, was derived from the choice models’ significant parameter estimates. The parameter estimates were interpreted, particularly the equivalent in-vehicle time of access and egress times and of busway in-vehicle time. Market segmentation by trip origin was applied to investigate the difference in magnitude between the parameter estimates of access and egress times. The significant costs of transfer in transit trips were highlighted.
These importance ratios were applied back to the quality perceptions collected as revealed preference (RP) data to compare satisfaction levels between the service attributes and to generate an action relevance matrix that prioritises attributes for quality improvement. An empirical study of the relationship between average passenger waiting time and transit service characteristics was performed using the perceived service quality data. Passenger arrivals for services with long headways (over 15 minutes) were found to be clearly coordinated with the scheduled departure times of transit vehicles in order to reduce waiting time. This drove further investigation and modelling innovation in passengers’ access arrival time choice and its relationships with transit service characteristics and average passenger waiting time. Specifically, original contributions were made in the formulation of expected waiting time, in the analysis of passengers’ risk aversion towards missing a desired service run in their choice of access arrival time, and in extensions of the utility function specification for modelling the passenger access arrival distribution, using expected utility forms and non-linear probability weighting to explicitly accommodate the risk of missing an intended service and passengers’ risk-aversion attitude. Discussion of this research’s contributions to knowledge, its limitations, and recommendations for future research is provided in the concluding section of this thesis.
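As background on the modelling machinery referred to above (the specification and coefficient names here are generic, not the thesis's estimates), the multinomial logit route-choice probability and the ratio-of-coefficients reading used to express access or egress time in equivalent in-vehicle time take the standard form:

P_n(i) = \frac{\exp(V_{in})}{\sum_{j \in C_n} \exp(V_{jn})}, \qquad
V_{in} = \beta_{\mathrm{ivt}}\, t^{\mathrm{ivt}}_{in} + \beta_{\mathrm{acc}}\, t^{\mathrm{acc}}_{in} + \beta_{\mathrm{egr}}\, t^{\mathrm{egr}}_{in} + \cdots, \qquad
\mathrm{EIVT}_{\mathrm{acc}} = \frac{\beta_{\mathrm{acc}}}{\beta_{\mathrm{ivt}}},

where C_n is the set of routes available to traveller n and the \beta coefficients are the estimated marginal (dis)utilities of in-vehicle, access, and egress times; a ratio greater than one indicates that a minute of access time is weighted more heavily than a minute of in-vehicle time.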

Relevance:

30.00%

Publisher:

Abstract:

Service-oriented architectures and Web services have matured and become more widely accepted and used by industry. This growing adoption has increased the demand for new ways of using Web service technology. Users are starting to re-combine and mediate other providers’ services in ways that were not anticipated by their original providers. Within organisations and cross-organisational communities, discoverable services are organised in repositories providing convenient access to adaptable end-to-end business processes. This idea is captured in the term Service Ecosystem. This paper addresses the question of how quality management can be performed in such service ecosystems. Service quality management is a key challenge when services are composed of a dynamic set of heterogeneous sub-services from different service providers. This paper contributes to this important area by developing a reference model of quality management in service ecosystems. We illustrate the application of the reference model in an exploratory case study, showing how it helps to derive requirements for the implementation and support of quality management in an exemplary service ecosystem in public administration.

Relevance:

30.00%

Publisher:

Abstract:

Service mismatches involve the adaptation of structural and behavioural interfaces of services, which in practice incurs long lead times through manual coding effort. We propose a framework, complementary to conventional service adaptation, to extract comprehensive and semantically normalised service interfaces, useful for interoperability in large business networks and the Internet of Services. The framework supports introspection and analysis of large and overloaded operational signatures to derive focal artefacts, namely the underlying business objects of services. A more simplified and comprehensive service interface layer is created based on these, and rendered into semantically normalised interfaces, given an ontology accrued through the framework from service analysis history. This opens up the prospect of supporting capability comparisons across services, and run-time request backtracking and adjustment, as consumers discover new features of a service's operations through corresponding features of similar services. This paper provides a first exposition of the service interface synthesis framework, describing patterns having novel requirements for unilateral service adaptation, and algorithms for interface introspection and business object alignment. A prototype implementation and analysis of web services drawn from commercial logistic systems are used to validate the algorithms and identify open challenges and future research directions.
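To give a flavour of interface introspection (this is an invented heuristic for illustration, not the framework's algorithm), the Python sketch below proposes candidate business objects by grouping the parameters of overloaded operations that share a leading name token; the operation and parameter names are hypothetical.

# Illustrative heuristic only: propose candidate business objects by grouping
# parameters that share a leading name token. Names are hypothetical.
import re
from collections import defaultdict

operations = {
    "processShipmentRequest": ["shipmentId", "shipmentWeight", "recipientAddress"],
    "updateShipment":         ["shipmentId", "shipmentStatus"],
    "createRecipient":        ["recipientName", "recipientAddress"],
}

def tokens(name):
    return [t.lower() for t in re.findall(r"[A-Z]?[a-z]+", name)]

candidates = defaultdict(set)
for op, params in operations.items():
    for p in params:
        head = tokens(p)[0]                    # leading token as an entity hint
        candidates[head].add(p)

for entity, fields in candidates.items():
    print(entity, "->", sorted(fields))
# e.g. shipment -> ['shipmentId', 'shipmentStatus', 'shipmentWeight']
#      recipient -> ['recipientAddress', 'recipientName']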

Relevance:

30.00%

Publisher:

Abstract:

The growth of APIs and Web services on the Internet, especially through larger enterprise systems increasingly being leveraged for Cloud and software-as-a-service opportunities, poses challenges for improving the efficiency of integration with these services. Compared to the fine-grained operations of contemporary interfaces, the interfaces of enterprise systems are typically larger, more complex and overloaded, with single operations carrying multiple data entities and parameter sets, supporting varying requests, and reflecting versioning across different system releases. We propose a technique to support the refactoring of service interfaces by deriving business entities and their relationships. In this paper, we focus on the behavioural aspects of service interfaces, aiming to discover the sequential dependencies of operations (otherwise known as protocol extraction) based on the entities and relationships derived. Specifically, we propose heuristics based on these relationships and, in turn, derive permissible orders in which operations are invoked. As a result, service operations can be refactored along business entity CRUD lines, with explicit behavioural protocols as part of an interface definition. This supports flexible service discovery, composition and integration. A prototypical implementation and analysis of existing Web services, including those of commercial logistics systems (FedEx), are used to validate the algorithms proposed in the paper.
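The following Python sketch illustrates the general idea of CRUD-based protocol derivation with one plausible, hypothetical heuristic: operations are classified per business entity by their name prefix, and a permissible Create -> Read -> Update -> Delete order is emitted. The operation names and prefix table are invented and are not the paper's heuristics.

# Minimal sketch of a hypothetical CRUD-ordering heuristic, not the paper's algorithm.
CRUD_PREFIXES = {"create": "C", "add": "C", "get": "R", "find": "R",
                 "update": "U", "modify": "U", "delete": "D", "cancel": "D"}
CRUD_ORDER = "CRUD"

operations = ["createShipment", "getShipment", "updateShipment",
              "cancelShipment", "createRecipient", "getRecipient"]

def classify(op):
    for prefix, kind in CRUD_PREFIXES.items():
        if op.lower().startswith(prefix):
            return kind, op[len(prefix):]      # (CRUD kind, entity name)
    return None, op

protocol = {}
for op in operations:
    kind, entity = classify(op)
    if kind:
        protocol.setdefault(entity, []).append((CRUD_ORDER.index(kind), op))

for entity, ops in protocol.items():
    ordered = [name for _, name in sorted(ops)]
    print(entity, "protocol:", " -> ".join(ordered))
# e.g. Shipment protocol: createShipment -> getShipment -> updateShipment -> cancelShipment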

Relevance:

30.00%

Publisher:

Abstract:

In the internet age, copyright owners are increasingly looking to online intermediaries to take steps to prevent copyright infringement. Sometimes these intermediaries are closely tied to the acts of infringement; sometimes – as in the case of ISPs – they are not. In 2012, the Australian High Court decided the Roadshow Films v iiNet case, in which it held that an Australian ISP was not liable under copyright’s authorization doctrine, which asks whether the intermediary has sanctioned, approved or countenanced the infringement. The Australian Copyright Act 1968 directs a court to consider, in these situations, whether the intermediary had the power to prevent the infringement and whether it took any reasonable steps to prevent or avoid the infringement. It is generally not difficult for a court to find the power to prevent infringement – power to prevent can include an unrefined technical ability to disconnect users from the copyright source, such as an ISP terminating users’ internet accounts. In the iiNet case, the High Court eschewed this broad approach in favor of focusing on a notion of control that was influenced by principles of tort law. In tort, when a plaintiff asserts that a defendant should be liable for failing to act to prevent harm caused to the plaintiff by a third party, there is a heavy burden on the plaintiff to show that the defendant had a duty to act. The duty must be clear and specific, and will often hinge on the degree of control that the defendant was able to exercise over the third party. Control in these circumstances relates directly to control over the third party’s actions in inflicting the harm. Thus, in iiNet’s case, the control would need to be directed to the third party’s infringing use of BitTorrent; control over a person’s ability to access the internet is too imprecise. Further, when considering omissions to act, tort law differentiates between the ability to control and the ability to hinder. The ability to control may establish a duty to act, and the court will then look to small measures taken to prevent the harm to determine whether these satisfy the duty. But the ability to hinder will not suffice to establish liability in the absence of control. This article argues that an inquiry grounded in control as defined in tort law would provide a more principled framework for assessing the liability of passive intermediaries in copyright. In particular, it would set a higher, more stable benchmark for determining the copyright liability of passive intermediaries, based on the degree of actual, direct control that the intermediary can exercise over the infringing actions of its users. This approach would provide greater clarity and consistency than has existed to date in this area of copyright law in Australia.

Relevance:

30.00%

Publisher:

Abstract:

Web service and business process technologies are widely adopted to facilitate business automation and collaboration. Given the complexity of business processes, it is a sought-after feature to present a business process through different views to cater for the diverse interests, authority levels, etc. of different users. Aiming to implement such flexible process views in the Web service environment, this paper presents a novel framework named FlexView to support view abstraction and concretisation of WS-BPEL processes. In the FlexView framework, a rigorous view model is proposed to specify the dependency and correlation between structural components of process views, with emphasis on the characteristics of WS-BPEL, and a set of rules is defined to guarantee the structural consistency between process views during transformations. A set of algorithms is developed to shift the abstraction and concretisation operations to the operational level. A prototype has also been implemented as a proof of concept. © 2010 Springer Science+Business Media, LLC.
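As a minimal illustration of process view abstraction in general (not the FlexView view model or its consistency rules), the Python sketch below collapses a fragment of a structured process tree into a single opaque activity, so that different users can see different views of the same process; the process, activity names, and abstraction criterion are hypothetical.

# Illustrative sketch only: abstracting a process fragment into one opaque activity.
process = ("sequence", [
    ("activity", "ReceiveOrder"),
    ("flow", [("activity", "CheckCredit"), ("activity", "CheckStock")]),
    ("activity", "ShipOrder"),
])

def abstract(node, hide, label):
    """Return a view in which any fragment whose activities are all in `hide`
    is collapsed into a single abstract activity named `label`."""
    kind, body = node
    if kind == "activity":
        return node
    children = [abstract(child, hide, label) for child in body]
    names = {n for k, n in children if k == "activity"}
    if all(k == "activity" for k, _ in children) and names <= hide:
        return ("activity", label)             # collapse the whole fragment
    return (kind, children)

view = abstract(process, hide={"CheckCredit", "CheckStock"}, label="RiskChecks")
print(view)
# ('sequence', [('activity', 'ReceiveOrder'), ('activity', 'RiskChecks'),
#               ('activity', 'ShipOrder')])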

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Health service quality is an important determinant of health service satisfaction and behavioral intentions. The purpose of this paper is to investigate the requirements of e-health services and to develop a measurement model to analyze the construct of “perceived e-health service quality.”
Design/methodology/approach: The paper adapts the C-OAR-SE procedure for scale development by Rossiter. The focal aspect is the “physician-patient relationship,” which forms the core dyad in healthcare service provision. Several in-depth interviews were conducted in Switzerland, first with six patients (as raters) and then with two experts of the healthcare system (as judges). Based on the results and an extensive literature review, the classification of the object and its attributes is developed for this model.
Findings: The construct of e-health service quality can be described as an abstract formative object and is operationalized with 13 items: accessibility, competence, information, usability/user friendliness, security, system integration, trust, individualization, empathy, ethical conduct, degree of performance, reliability, and ability to respond.
Research limitations/implications: Limitations include the number of interviews with patients and experts, as well as critical issues associated with C-OAR-SE. More empirical research is needed to confirm the quality indicators of e-health services.
Practical implications: Health care providers can utilize the results for the evaluation of their service quality. Practitioners can use the hierarchical structure to measure service quality at different levels. The model provides a diagnostic tool to identify poor and/or excellent performance with regard to e-service delivery.
Originality/value: The paper contributes to knowledge on the measurement of e-health quality and improves the understanding of how customers evaluate the quality of e-health services.
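As a simple illustration of the formative logic of the construct (with hypothetical ratings and equal weights, not the paper's measurement procedure), the 13 indicators listed above can be combined into a composite score, for example as follows.

# Hypothetical illustration of a formative composite: the 13 indicators jointly
# form a "perceived e-health service quality" score. Ratings and weights are invented.
INDICATORS = [
    "accessibility", "competence", "information", "usability", "security",
    "system_integration", "trust", "individualization", "empathy",
    "ethical_conduct", "degree_of_performance", "reliability", "ability_to_respond",
]

def ehealth_quality_score(ratings, weights=None):
    """Weighted sum of indicator ratings (equal weights by default)."""
    weights = weights or {i: 1 / len(INDICATORS) for i in INDICATORS}
    return sum(weights[i] * ratings[i] for i in INDICATORS)

ratings = {i: 5 for i in INDICATORS}   # hypothetical ratings on a 1-7 scale
ratings["security"] = 7
print(round(ehealth_quality_score(ratings), 2))   # ~5.15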