935 results for internet service provider liability


Relevance: 100.00%

Abstract:

Objective: There is a need to adapt pathways to care to promote access to mental health services for Indigenous people in Australia. This study explored Indigenous community and service provider perspectives of well-being and ways to promote access to care for Indigenous people at risk of depressive illness. Design: A participatory action research framework was used to inform the development of an agreed early intervention pathway; thematic analysis was applied to the interview data. Setting: Two remote communities in the Northern Territory. Participants: Using snowball and purposive sampling, 27 service providers and community members with knowledge of the local context and the diverse needs of those at risk of depression were interviewed; 30% of the participants were Indigenous. The proposed pathway to care was adapted in response to participant feedback. Results: The study found that Indigenous mental health and well-being is perceived as multifaceted and strongly linked to cultural identity. It also confirms that there is broad support for the promotion of a clear pathway to early intervention. Key components of this pathway were the health centre, visiting and community-based services, and local community resources including elders, cultural activities and families. Enablers of early intervention were reported. Significant barriers to the detection and treatment of those at risk of depression were identified, including insufficient resources, negative attitudes and stigma, and limited awareness of support options. Conclusions: Successful early intervention for well-being concerns requires improved understanding of Indigenous well-being perspectives and a systematic change in service delivery that promotes integration, flexibility and collaboration between services and the community, and recognises the importance of social determinants in health promotion and the healing process. Such changes require policy support, targeted training and education, and ongoing promotion.

Relevance: 100.00%

Abstract:

Large-scale integration of solar photovoltaic (PV) generation in distribution networks has resulted in over-voltage problems. Several control techniques have been developed to address the over-voltage problem using Deterministic Load Flow (DLF). However, the intermittent characteristics of PV generation require Probabilistic Load Flow (PLF) to capture the variability that DLF ignores. Traditional PLF techniques are not well suited to distribution systems and suffer from several drawbacks, such as computational burden (Monte Carlo, conventional convolution), accuracy that degrades with system complexity (point estimation method), the need for linearization (multi-linear simulation) and convergence problems (Gram–Charlier expansion, Cornish–Fisher expansion). In this research, Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD) is used to quantify over-voltage issues, with and without the voltage control algorithm, in a distribution network with active generation. The LHS technique is verified on a test network and on a real system from an Australian distribution network service provider. The accuracy and computational burden of the simulated results are also compared against Monte Carlo simulation.
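
A minimal sketch of the LHS-CD sampling step may help make the method concrete: stratified uniform samples are mapped to standard-normal scores, a Cholesky factor of a target correlation matrix induces dependence between the PV inputs, and the correlated scores are mapped back to the desired marginal distributions before each load-flow evaluation. The Beta marginals, the correlation matrix and the function names below are illustrative assumptions, not the implementation used in the research.

import numpy as np
from scipy import stats

def lhs_uniform(n_samples, n_vars, rng):
    """Stratified Latin Hypercube samples on (0, 1): one value per stratum, shuffled per variable."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])
    return u

def lhs_cholesky(n_samples, corr, marginals, rng=None):
    """Correlated LHS: impose `corr` on standard-normal scores via its Cholesky factor,
    then map the scores back to the target marginal distributions."""
    rng = rng or np.random.default_rng(0)
    z = stats.norm.ppf(lhs_uniform(n_samples, corr.shape[0], rng))  # independent normal scores
    z_corr = z @ np.linalg.cholesky(corr).T                         # induce the target correlation
    u_corr = stats.norm.cdf(z_corr)                                 # back to correlated uniforms
    return np.column_stack([m.ppf(u_corr[:, j]) for j, m in enumerate(marginals)])

# Illustrative example: three PV feeders with Beta-distributed output and mild correlation.
corr = np.array([[1.0, 0.6, 0.4],
                 [0.6, 1.0, 0.5],
                 [0.4, 0.5, 1.0]])
marginals = [stats.beta(2, 5), stats.beta(2, 4), stats.beta(3, 5)]
pv_samples = lhs_cholesky(500, corr, marginals)          # each row feeds one load-flow run
print(pv_samples.shape, np.corrcoef(pv_samples, rowvar=False).round(2))

Each row of samples would then be fed to the load-flow solver and over-voltage statistics collected across the runs, using far fewer samples than a plain Monte Carlo study would need.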

Relevance: 100.00%

Abstract:

In this study, a quality assessment method based on sampling of primary laser inventory units (microsegments) was analysed. The accuracy of a laser inventory carried out in Kuhmo was analysed as a case study. Field sample plots were measured on the sampled microsegments in the Kuhmo inventory area. Two main questions were considered: did the ALS-based inventory meet the accuracy requirements set for the provider, and how should a reliable, cost-efficient and independent quality assessment be undertaken? The agreement between the control measurements and the ALS-based inventory was analysed in four ways: 1) the root mean squared errors (RMSEs) and bias were calculated; 2) scatter plots with 95% confidence intervals were plotted and the placement of the identity line was checked; 3) Bland-Altman plots were drawn, in which the mean difference in attributes between the control method and the ALS method was calculated and plotted against the average value of the attributes; 4) tolerance limits were defined and combined with the Bland-Altman plots. The RMSE values were compared with a reference study from which the accuracy requirements set for the service provider had been derived. The accuracy requirements in Kuhmo were met; however, the comparison of RMSE values proved difficult. Field control measurements are costly and time-consuming, but they are considered robust. However, control measurements may themselves include errors that are difficult to take into account. With the Bland-Altman plots, neither of the compared methods is assumed to be completely exact, which offers a fair way to interpret the assessment results. It was suggested that tolerance limits, set in the inventory order and combined with Bland-Altman plots, be adopted in practice. In addition, the bias should be calculated for the total area. Some other approaches to quality control were briefly examined. No method was found to fulfil all the requirements of statistical reliability, cost-efficiency, time efficiency, simplicity and speed of implementation. Some benefits and shortcomings of the studied methods are discussed.
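
The agreement measures listed above are simple to compute; the following sketch shows RMSE, bias and the Bland-Altman limits of agreement for paired control and ALS estimates. The plot-level volumes are made-up illustrative numbers, not the Kuhmo data.

import numpy as np

control = np.array([210.0, 185.0, 240.0, 160.0, 205.0, 230.0])  # field control plots, e.g. volume (m3/ha)
als     = np.array([220.0, 178.0, 255.0, 150.0, 212.0, 226.0])  # ALS-based estimates for the same plots

diff = als - control
rmse = np.sqrt(np.mean(diff ** 2))     # 1) RMSE
bias = np.mean(diff)                   #    and bias

# 3) Bland-Altman: the differences are plotted against the pairwise means (x-axis below)
#    and compared with the limits of agreement, mean(diff) +/- 1.96 * sd(diff).
mean_pair = (als + control) / 2.0      # x-axis of the Bland-Altman plot
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"RMSE {rmse:.1f} m3/ha, bias {bias:.1f} m3/ha")
print(f"limits of agreement [{loa_low:.1f}, {loa_high:.1f}] m3/ha")
print("share of plots within the limits:", np.mean((diff >= loa_low) & (diff <= loa_high)))

Pre-agreed tolerance limits (point 4 above) can be checked in the same way, by replacing the estimated limits of agreement with the agreed bounds.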

Relevance: 100.00%

Abstract:

This thesis examines the posting of workers within the free movement of services in the European Union. The emphasis is on the case law of the European Court of Justice and on the role it has played in the liberalisation of the service sector in respect of the posting of workers. The case law is examined from two different viewpoints: firstly, that of employment law and, secondly, that of immigration law. The aim is to find out how active a role the Court has taken with regard to these two fields of law and what the implications of the Court’s judgments are for regulation at the national level. The first part of the thesis provides a general review of the Community law principles governing the freedom to provide services in the EU. The second part presents the Posted Workers’ Directive and the case law of the European Court of Justice before and after the enactment of the Directive from the viewpoint of employment law. Special attention is paid to a recent judgment in which the Court has taken a restrictive position with regard to a trade union’s right to take collective action against a service provider established in another Member State. The third part of the thesis concentrates, firstly, on the legal status of non-EU nationals lawfully resident in the EU. Secondly, it looks into the question of how the Court’s case law has affected the possibilities of using non-EU nationals as posted workers within the freedom to provide services. The final chapter includes a critical analysis of the Court’s case law on posted workers. The judgments of the European Court of Justice are the principal source of law for this thesis. Within the primary legislation the focus is on Articles 49 EC and 50 EC, which lay down the rules concerning the free movement of services. Within the secondary legislation, the present work principally concentrates on the Posted Workers’ Directive. It also examines proposals of the European Commission and directives that have been adopted in the field of immigration. The conclusions of the case study are twofold: while in the field of employment law the European Court of Justice has based its judgments on a very literal interpretation of the Posted Workers’ Directive, in the field of immigration its conclusions have been much more innovative. In both fields of regulation the Court’s judgments have far-reaching implications for the rules concerning the posting of workers, leaving very little discretion to the Member States’ authorities.

Relevance: 100.00%

Abstract:

Deep packet inspection is a technology which enables the examination of the content of information packets being sent over the Internet. The Internet was originally set up using “end-to-end connectivity” as part of its design, allowing nodes of the network to send packets to all other nodes of the network, without requiring intermediate network elements to maintain status information about the transmission. In this way, the Internet was created as a “dumb” network, with “intelligent” devices (such as personal computers) at the end or “last mile” of the network. The dumb network does not interfere with an application's operation, nor is it sensitive to the needs of an application, and as such it treats all information sent over it as (more or less) equal. Yet deep packet inspection allows the examination of packets at places on the network which are not endpoints. In practice, this permits entities such as Internet service providers (ISPs) or governments to observe the content of the information being sent, and perhaps even to manipulate it. Indeed, the existence and implementation of deep packet inspection may profoundly challenge the egalitarian and open character of the Internet. This paper will firstly elaborate on what deep packet inspection is and how it works from a technological perspective, before going on to examine how it is being used in practice by governments and corporations. The use of deep packet inspection has already created legal problems involving fundamental rights (especially of Internet users), such as freedom of expression and privacy, as well as more economic concerns, such as competition and copyright. These issues will be considered, and an assessment will be made of the conformity of the use of deep packet inspection with the law. The paper concentrates on the use of deep packet inspection in European and North American jurisdictions, where it has already provoked debate, particularly in the context of discussions on net neutrality. It also incorporates a more fundamental assessment of the values that it is desirable for the Internet to respect and exhibit (such as openness, equality and neutrality), before concluding with the formulation of a legal and regulatory response to the use of this technology, in accordance with these values.
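
To make the technical distinction concrete: a "dumb" network element needs only the addressing fields in the packet headers, whereas a deep packet inspection element also reads the application payload. The following self-contained sketch builds a simplified IPv4/TCP packet and contrasts the two; the addresses, ports and payload pattern are purely illustrative, and real DPI equipment is of course far more sophisticated.

import struct

def build_packet(payload: bytes) -> bytes:
    """A simplified IPv4 header (20 bytes, no options) + TCP header (20 bytes) + payload."""
    ip_header = struct.pack("!BBHHHBBH4s4s",
                            0x45, 0, 40 + len(payload), 0, 0, 64, 6, 0,
                            bytes([192, 0, 2, 1]), bytes([198, 51, 100, 7]))
    tcp_header = struct.pack("!HHLLBBHHH", 51500, 80, 0, 0, 0x50, 0x18, 8192, 0, 0)
    return ip_header + tcp_header + payload

def shallow_inspect(packet: bytes) -> dict:
    """Header-only ("shallow") inspection: everything a dumb network needs to forward the packet."""
    total_len = struct.unpack("!H", packet[2:4])[0]
    src, dst = packet[12:16], packet[16:20]
    sport, dport = struct.unpack("!HH", packet[20:24])
    return {"src": ".".join(map(str, src)), "dst": ".".join(map(str, dst)),
            "sport": sport, "dport": dport, "length": total_len}

def deep_inspect(packet: bytes, pattern: bytes) -> bool:
    """Deep packet inspection: skip the IP and TCP headers and match against the payload."""
    ip_header_len = (packet[0] & 0x0F) * 4
    tcp_header_len = (packet[ip_header_len + 12] >> 4) * 4
    return pattern in packet[ip_header_len + tcp_header_len:]

pkt = build_packet(b"GET /videos/cat.mp4 HTTP/1.1\r\nHost: example.com\r\n\r\n")
print(shallow_inspect(pkt))               # forwarding decisions need only this
print(deep_inspect(pkt, b"/videos/"))     # a DPI box can also see what is being requested

The point of the example is that nothing in the forwarding path requires the second function; once it is deployed, however, the network element can observe, prioritise, throttle or block traffic based on its content, which is what raises the legal questions discussed in the paper.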

Relevance: 100.00%

Abstract:

Action, Power and Experience in Organizational Change - A Study of Three Major Corporations. This study explores change management and resistance to change as social activities and power displays through worker experiences in three major Finnish corporations. Two important sensitizing concepts were applied. Firstly, Richard Sennett's perspective on work in the new form of capitalism and its shortcomings (the lack of commitment and freedom, accompanied by the disruption of lifelong career planning and the feeling of job insecurity) offered a fruitful starting point for a critical study. Secondly, Michel Foucault's classical concept of power, treated as anecdotal, interactive and nonmeasurable, provided tools for analyzing change-enabling and change-resisting acts. The study bridges the gap between management and social sciences. The former have usually concentrated on leadership issues, best practices and goal attainment, while the latter have covered worker experiences, power relations and political conflicts. The study was motivated by three research questions. Firstly, why do people resist or support changes in their work, work environment or organization, and on what kinds of analyses are these behavioural choices based? Secondly, what practical forms do support for, and resistance to, change take, and how do people choose between different ways of acting? Thirdly, how do the people involved experience and describe their own subject position and actions in changing environments? The examination focuses on practical interpretations and action descriptions given by the members of three major Finnish business organizations. The empirical data was collected during a two-year period in the Finnish Post Corporation, the Finnish branch of Vattenfall Group, one of the leading European energy companies, and the Mehiläinen Group, the leading private medical service provider in Finland. It includes 154 non-structured thematic interviews and 309 biographies concentrating on personal experiences of change. All positions and organizational levels were represented. The analysis was conducted using the grounded theory method introduced by Strauss and Corbin, in three sequential phases comprising open, axial and selective coding. The result is a hierarchical structure of categories, which is summarized in a process model of change behaviour patterns. Its key ingredients are past experiences and future expectations, which lead to different change relations and behavioural roles. Ultimately, these contribute to strategic and tactical choices realized as both public and hidden forms of action. The same forms of action can be used both to support and to resist change, and there are no specific dividing lines either between employer and employee roles or between different hierarchical positions. In general, however, it is possible to conclude that strategic choices lead more often to public forms of action, whereas tactical choices result in hidden forms. The primary goal of the study was to provide knowledge with practical applications in everyday business life, HR and change management. The results, therefore, are highly applicable to other organizations as well as to less change-dominated situations, whenever power relations and conflicting interests are present. A sociological thesis on classical business management issues can be of considerable value in revealing the crucial social processes behind behavioural patterns.
Keywords: change management, organizational development, organizational resistance, resistance to change, labor relations, organization, leadership

Relevance: 100.00%

Abstract:

Many residential and small business users connect to the Internet via home gateways, such as DSL and cable modems. The characteristics of these devices heavily influence the quality and performance of the Internet service that these users receive. Anecdotal evidence suggests that an extremely diverse set of behaviors exists in the deployed base, forcing application developers to design for the lowest common denominator. This paper experimentally analyzes some characteristics of a substantial number of different home gateways: binding timeouts, queuing delays, throughput, protocol support and others.

Relevance: 100.00%

Abstract:

As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of a user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on some of his family members' computers and on some of his own computers, but also at some online services which he uses. When all actors operate on one replicated copy of the data, the system automatically avoids a single point of failure. Thus the data will not disappear with one computer breaking, or one service provider going out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing anonymous web access, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and being monitored. All of the systems use cryptography to secure the names used for the content, and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports the development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we are not expecting our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
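
The cryptographically verifiable references mentioned above can be illustrated in a few lines: a data item is addressed by a hash of its content, so any replica can be checked against the reference, and it is signed by its owner, so forgery is detectable. This is a minimal sketch of the general idea rather than Peerscape's actual data format; it uses Python's standard hashlib together with an Ed25519 key from the third-party cryptography package, both of which are assumptions of the example.

import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def content_ref(data: bytes) -> str:
    """A verifiable reference: the SHA-256 digest of the item's content."""
    return "sha256:" + hashlib.sha256(data).hexdigest()

def verify_copy(data: bytes, ref: str) -> bool:
    """Any replica, wherever it is stored, can be checked against the reference."""
    return content_ref(data) == ref

# The owner signs the item; storage nodes (family members' machines, online
# services) only hold and relay copies and cannot forge new content.
owner_key = Ed25519PrivateKey.generate()
item = b"placeholder bytes standing in for one photo-album data item"
ref = content_ref(item)
signature = owner_key.sign(item)

# A peer holding a copy verifies both integrity and authorship.
assert verify_copy(item, ref)
try:
    owner_key.public_key().verify(signature, item)
    print("item", ref[:18], "... is intact and was signed by its owner")
except InvalidSignature:
    print("forged or corrupted item")

Because every replica is addressed by the same content reference, copies stored with family members or at online services can be treated as one shared item, which is the property the synchronized database relies on.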

Relevance: 100.00%

Abstract:

The growth of the information economy has been stellar in the last decade. General-purpose technologies such as the computer and the Internet have promoted productivity growth in a large number of industries. The effect on telecommunications, media and technology industries has been particularly strong. These industries include mobile telecommunications, printing and publishing, broadcasting, software, hardware and Internet services. There have been large structural changes, which have led to new questions about business strategies, regulation and policy. This thesis focuses on four such questions and answers them by extending the theoretical literature on platforms. The questions (with short answers) are: (i) Do we need to regulate how Internet service providers discriminate between content providers? (Yes.) (ii) What are the welfare effects of allowing consumers to pay to remove advertisements from advertisement-supported products? (Ambiguous, but those watching ads are worse off.) (iii) Why are some markets characterized by open platforms, extendable by third parties, and some by closed platforms, which are not extendable? (It is a trade-off between intensified competition for consumers and the benefits from third parties.) (iv) Do private platform providers allow third parties to access their platform when it is socially desirable? (No.)

Relevance: 100.00%

Abstract:

Since the emergence of service marketing, the focus of service research has evolved. Currently, the focus of research is shifting towards value co-created by the customer. Consequently, value creation is increasingly less fixed to a specific time or location controlled by the service provider. However, present service management models, although acknowledging customer participation and accessibility, have not considered the role of the empowered customer who may perform the service in various locations and time frames. The present study expands this scope and provides a framework for exploring customer perceived value from a temporal and spatial perspective. The framework is used to understand and analyse customer perceived value and to explore customer value profiles. It is proposed that customer perceived value can be conceptualised as a function of technical, functional, temporal and spatial value dimensions. These dimensions are suggested to have value-increasing and value-decreasing facets. This conceptualisation is empirically explored in an online banking context, and it is shown that time and location are more important value dimensions than the technical and functional dimensions. The findings demonstrate that time and location are important not only in terms of having the possibility to choose when and where the service is performed. Customers also value an efficient and optimised use of time and a private and customised service location. The study demonstrates that time and location are not external elements that form the service context, but service value dimensions, in addition to the technical and functional dimensions. This thesis contributes to existing service management research through its framework for understanding temporal and spatial dimensions of perceived value. Practical implications of the study are that time and location need to be considered as service design elements in order to differentiate the service from other services and create additional value for customers. Also, because of increased customer control and the importance of time and location, it is increasingly relevant for service providers to provide a facilitating arena for customers to create value, rather than trying to control the value creation process. Kristina Heinonen is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration.

Relevance: 100.00%

Abstract:

Customer loyalty has been a central topic of both marketing theory and practice for several decades. Customer disloyalty, or relationship ending, has received much less attention. Despite the close relation between customer loyalty and disloyalty, they have rarely been addressed in the same study. The thesis bridges this gap by focusing on both loyal and disloyal customers and the factors characterising them. Based on a qualitative study of loyal and disloyal bank customers in the Finnish retail banking market, both factors that are common to the groups and factors that differentiate between them are identified. A conceptual framework of factors that affect customer loyalty or disloyalty is developed and used to analyse the empirical data. According to the framework, customers’ loyalty status (behavioural and attitudinal loyalty) is influenced by positive, loyalty-supporting, and negative, loyalty-repressing factors. Loyalty-supporting factors either promote customer dedication, making the customer want to remain loyal, or act as constraints, hindering the customer from switching. Among the loyalty-repressing factors it is especially important to identify those that act as triggers of disloyal behaviour, making customers switch service providers. The framework further suggests that by identifying the sources of loyalty-supporting and -repressing factors (the environment, the provider, the customer, the provider-customer interaction, or the core service) one can determine which factors are within the control of the service provider. Attitudinal loyalty is approached through a customer’s “feeling of loyalty”, as described by customers both orally and graphically. By combining the graphs with behavioural loyalty, seven customer groups are identified: Stable Loyals, Rescued Loyals, Loyals at Risk, Positive Disloyals, Healing Disloyals, Fading Disloyals, and Abrupt Disloyals. The framework and models of the thesis can be used to analyse factors that affect customer loyalty and disloyalty in different service contexts. Since the empirical study was carried out in a retail bank setting, the thesis has managerial relevance especially for banks. Christina Nordman is associated with CERS, Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration. The doctoral thesis is part of the Göran Collert Research Project in Customer Relationships and Retail Banking and has been funded by The Göran Collert Foundation.

Relevance: 100.00%

Abstract:

This dissertation is based on the assumption that fading customer relationships are important phenomena to understand in order for companies to prevent a future relationship termination, manage a desired relationship termination, or manage the situation where the relationship strength has temporarily or permanently weakened but where the customer still stays with the same service provider. It is assumed that fading can take different forms and develop through a range of different processes. The purpose of the thesis is therefore to define and describe fading, reveal different types of fading relationship processes, and discuss the dynamics of these processes. In the services literature there is a lack of research focusing on the weakening of customer relationships. Fading therefore represents a new approach to understanding issues related to the ending of customer relationships. A fading relationship process may precede a relationship ending, but it may also represent a temporary weakening of the relationship without leading to termination. This distinguishes the concept from other concepts within ending research, which focus solely on relationships that have been terminated, and takes a larger part of the relationship into account (as a relationship may be built on constant change). A pilot study created an understanding of the difficulties related to understanding and detecting fading customer relationships, which led to a follow-up study incorporating qualitative interviews in relationship dyads characterised as fading, with both private banking customers and their respective financial advisors. The focus remained on understanding the fading process, resulting in a model for analysing different types of fading processes. Four types of fading processes were revealed: the crash landing process, the altitude drop process, the fizzle out process and the try out process. The dissertation contributes to a broadened understanding of different types of fading processes within the research area of ending relationships, emphasising the dynamic aspects of the phenomenon. Managerial implications include the management of different types of fading processes and the understanding of the financial advisor's role in influencing the development of these processes. Helena Åkerlund is associated with CERS, the Centre for Relationship Marketing and Service Management at Hanken, Swedish School of Economics and Business Administration, Helsinki.

Relevance: 100.00%

Abstract:

All companies have a portfolio of customer relationships. From a managerial standpoint, the value of these customer relationships is a key issue. The aim of the paper is to introduce a conceptual framework for customers’ energy towards a service provider. Customer energy is defined as the cognitive, affective and behavioural effort a customer puts into the purchase of an offering. It is based on two dimensions: life theme involvement and relationship commitment. Data from a survey study of 425 customers of an online gambling site were combined with data about their individual purchases and activity. The analysis showed that involvement and commitment influence both customer behaviour and attitudes. Customer involvement was found to be strongly related to overall spending within a consumption area, whereas relationship commitment is a better predictor of the amount of money spent at a particular company. Dividing the customers into four different involvement/commitment segments revealed differences in churn rates, word-of-mouth, brand attitude, switching propensity and the use of the service for socializing. The framework provides a tool for customer management by revealing differences in fundamental drivers of customer behaviour, resulting in completely new customer portfolios. Knowledge of customer energy allows companies to manage their communication and offering development better and provides insight into the risk of losing a customer.

Relevance: 100.00%

Abstract:

This paper extends current discussions about value creation and proposes a customer-dominant value perspective. A customer-dominant marketing logic positions the customer in the center, rather than the service provider/producer or the interaction or the system. The focus is shifted from the company’s service processes involving the customer, to the customer’s multi-contextual value formation, involving the company. It is argued that value is not always an active process of creation; instead, value is embedded and formed in the highly dynamic and multi-contextual reality and life of the customer. This leads to a need to look beyond the current line of visibility, where visible customer-company interactions are focused, to the invisible and mental life of the customer. From this follows a need to extend the temporal scope, from exchange and use even further to accumulated experiences in the customer’s life. The aim of this paper is to explore value formation from a customer-dominant logic perspective. This is done in three steps: first, value formation is contrasted to earlier views on the company’s role in value creation by using a broad ontologically driven framework discussing what, how, when, where and who. Next, implications of the proposed characteristics of value formation compared to earlier approaches are put forward. Finally, some tentative suggestions of how this perspective would affect marketing in service companies are presented. As value formation in a CDL perspective has a different focus and scope than earlier views on value, it leads to posing questions about the customer that reveal earlier hidden aspects of the role of a service for the customer. This insight might be used in service development and innovation.