983 results for Web-Resource


Relevance: 40.00%

Publisher:

Abstract:

The exploding demand for services like the World Wide Web reflects the potential presented by globally distributed information systems. The number of WWW servers world-wide has doubled every 3 to 5 months since 1993, outstripping even the growth of the Internet. At each of these self-managed sites, the Common Gateway Interface (CGI) and Hypertext Transfer Protocol (HTTP) already constitute a rudimentary basis for contributing local resources to remote collaborations. However, the Web has serious deficiencies that make it unsuited for use as a true medium for metacomputing --- the process of bringing hardware, software, and expertise from many geographically dispersed sources to bear on large-scale problems. These deficiencies are, paradoxically, the direct result of the very simple design principles that enabled its exponential growth. There are many symptoms of the problems exhibited by the Web: disk and network resources are consumed extravagantly; information search and discovery are difficult; protocols are aimed at data movement rather than task migration, and ignore the potential for distributing computation. However, all of these can be seen as aspects of a single problem: as a distributed system for metacomputing, the Web offers unpredictable performance and unreliable results. The goal of our project is to use the Web as a medium (within either the global Internet or an enterprise intranet) for metacomputing in a reliable way with performance guarantees. We attack this problem on four levels:

(1) Resource Management Services: Globally distributed computing allows novel approaches to the old problems of performance guarantees and reliability. Our first set of ideas involves setting up a family of real-time resource management models, organized by the Web Computing Framework, with a standard Resource Management Interface (RMI), a Resource Registry, a Task Registry, and resource management protocols that allow resource needs and availability information to be collected and disseminated, so that a family of algorithms with varying computational precision and accuracy of representation can be chosen to meet real-time and reliability constraints.

(2) Middleware Services: Complementary to techniques for allocating and scheduling available resources to serve application needs under real-time and reliability constraints, the second set of ideas aims at reducing communication latency, traffic congestion, server workload, and so on. We develop customizable middleware services that exploit application characteristics in traffic analysis to drive new server/browser design strategies (e.g., exploiting the self-similarity of Web traffic), derive document access patterns via multiserver cooperation, and use them in speculative prefetching, document caching, and aggressive replication to reduce server load and bandwidth requirements.

(3) Communication Infrastructure: To achieve any guarantee of quality of service or performance, one must get at the network layer, which provides the basic guarantees of bandwidth, latency, and reliability. The third area is therefore a set of new techniques in network service and protocol design.

(4) Object-Oriented Web Computing Framework: A useful resource management system must deal with job priority, fault tolerance, quality of service, complex resources such as ATM channels, probabilistic models, and so on, and models must be tailored to represent the best tradeoff for a particular setting. This requires a family of models, organized within an object-oriented framework, because no one-size-fits-all approach is appropriate. This presents a software engineering challenge requiring integration of solutions at all levels: algorithms, models, protocols, and profiling and monitoring tools. The framework captures the abstract class interfaces of the collection of cooperating components, but allows the concretization of each component to be driven by the requirements of a specific approach and environment.
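
The abstract specifies the framework only in outline; as a minimal sketch, and purely as an assumption about how its pieces might fit together, the Resource Management Interface, a Resource Registry, and one concrete real-time model could be expressed as Python classes (all names below are illustrative, not taken from the project):

    from abc import ABC, abstractmethod
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class ResourceOffer:
        """Availability information advertised by a site (illustrative fields)."""
        resource_id: str
        capacity: float          # e.g. CPU share or bandwidth
        reliability: float       # estimated probability of completing a task

    class ResourceManagementInterface(ABC):
        """Abstract interface; concrete models (real-time, probabilistic, ...)
        subclass this and supply their own allocation policy."""

        @abstractmethod
        def allocate(self, task_id: str, needs: Dict[str, float]) -> ResourceOffer:
            ...

    class ResourceRegistry:
        """Collects availability information disseminated by participating sites."""
        def __init__(self) -> None:
            self._offers: Dict[str, ResourceOffer] = {}

        def advertise(self, offer: ResourceOffer) -> None:
            self._offers[offer.resource_id] = offer

        def candidates(self, min_reliability: float) -> List[ResourceOffer]:
            return [o for o in self._offers.values() if o.reliability >= min_reliability]

    class GreedyRealTimeModel(ResourceManagementInterface):
        """One concrete model: pick the highest-capacity offer that meets the
        reliability constraint (a deliberately simple stand-in policy)."""
        def __init__(self, registry: ResourceRegistry) -> None:
            self.registry = registry

        def allocate(self, task_id: str, needs: Dict[str, float]) -> ResourceOffer:
            candidates = self.registry.candidates(needs.get("reliability", 0.0))
            if not candidates:
                raise RuntimeError("no resource satisfies the constraints")
            return max(candidates, key=lambda o: o.capacity)

In this reading, swapping GreedyRealTimeModel for another subclass is how the framework would tailor the tradeoff to a particular setting without changing the registries or protocols.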

Relevance: 40.00%

Publisher:

Abstract:

Background: One way to tackle health inequalities in resource-poor settings is to establish links between doctors and health professionals in those settings and specialists elsewhere using web-based telemedicine. One such system, run by the Swinfen Charitable Trust, has been in existence for 13 years, which is an unusually long time for such systems.

Objective: We wanted to gain some insights into whether and how this system might be improved.

Methods: We carried out a questionnaire survey of referrers and specialists over a six-month period.

Results: During the study period, a total of 111 cases were referred by 35 different practitioners, of whom 24% were not doctors. Survey replies were received concerning 67 cases, a response rate of 61%. Of the responding referrers, 87% found the telemedicine advice useful, and 78% were able to follow the advice provided. As a result of the advice received, the diagnosis was changed in 22% of all cases and confirmed in a further 18%. Patient management was changed in 33% of cases. There was no substantial difference between doctors and non-doctors. During the study period, the 111 cases were responded to by 148 specialists, from whom 108 replies to the questionnaire were received, a response rate of 73%. About half of the specialists (47%) felt that their advice had improved the management of the patients. There were 62 cases where it was possible to match the opinions of the referrer and the specialists about the value of a specific teleconsultation. In 34 cases (55%) the referrers and specialists agreed about the value; in the remaining 28 cases (45%) they did not, with specialists markedly underestimating the value of a consultation compared to referrers. Both referrers and specialists were extremely positive about the system, which appears to be working well. Minor changes, such as a clearer referral template and an improved web interface for specialists, may improve it further.

Relevance: 40.00%

Publisher:

Abstract:

Web Science - Group 15 created an interactive infographic that informs prospective applicants about the new Web Science undergraduate degrees offered at the University of Southampton from October 2013. The infographic also briefly outlines Web Science as a new and exciting field of research, supported by two video interviews with Dr Les Carr, a web scientist.

Relevance: 40.00%

Publisher:

Abstract:

Resource monitoring in distributed systems is required to understand the 'health' of the overall system and to help identify particular problems, such as dysfunctional hardware or faulty system or application software. Monitoring systems such as GridRM provide the ability to connect to any number of different types of monitoring agents and provide different views of the system, based on a client's particular preferences. Web 2.0 technologies, and in particular 'mashups', are emerging as a promising technique for rapidly constructing rich user interfaces that combine and present data in intuitive ways. This paper describes a Web 2.0 user interface that was created to expose resource data harvested by the GridRM resource monitoring system.
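
As an illustration only (the paper does not reproduce its interface code), a mashup of this kind reduces to pulling JSON snapshots from several monitoring endpoints and merging them into one structure for a front-end widget; the URLs and field names below are hypothetical:

    import json
    from urllib.request import urlopen

    # Hypothetical GridRM gateway endpoints; a real deployment would expose
    # its own URLs and JSON schema.
    AGENT_FEEDS = [
        "http://gridrm.example.org/site-a/resources.json",
        "http://gridrm.example.org/site-b/resources.json",
    ]

    def fetch_feed(url: str) -> list[dict]:
        """Pull one agent's resource snapshot; tolerate unreachable sites."""
        try:
            with urlopen(url, timeout=5) as response:
                return json.load(response)
        except OSError:
            return []

    def build_view(feeds: list[str]) -> dict[str, list[dict]]:
        """Combine per-site records into a single structure that a widget
        (e.g. a map or chart component) can render."""
        view: dict[str, list[dict]] = {}
        for url in feeds:
            for record in fetch_feed(url):
                view.setdefault(record.get("host", "unknown"), []).append(record)
        return view

    if __name__ == "__main__":
        print(json.dumps(build_view(AGENT_FEEDS), indent=2))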

Relevance: 40.00%

Publisher:

Abstract:

Designing a successful web project requires not only an understanding of the owner's business and technological needs and substantial management and development experience, but also a thorough knowledge of the system's application domain and of other existing systems in that domain. In order to gather such domain knowledge, it is necessary to identify the nature of the proposed web services venture with regard to other similar services offered in the domain, the business setting of enterprises that initiate such ventures, the various types of customers involved, and how these factors translate into requirements. In this paper, we present an approach to studying the domain of web-enabled Human Resource and payroll services with the aim of attaining design knowledge that would ensure customer satisfaction and could eventually pave the way to the successful implementation of web-enabled services.

Relevance: 40.00%

Publisher:

Abstract:

Background: General Practitioners and community nurses rely on easily accessible, evidence-based online information to guide practice. To date, the methods that underpin the scoping of user-identified online information needs in palliative care have remained under-explored. This paper describes the benefits and challenges of a collaborative approach involving users and experts that informed the first stage of the development of a palliative care website.

Method: The action research-inspired methodology included a panel assessment of an existing palliative care website based in Victoria, Australia; a pre-development survey (n = 197) scoping potential audiences and palliative care information needs; working parties conducting a needs analysis of the necessary information content for a redeveloped website targeting health professionals and caregivers/patients; an iterative evaluation process involving users and experts; and a final evaluation survey (n = 166).

Results: Involving users in the identification of content and links for a palliative care website is time-consuming and requires initial resources, strong networking skills and commitment. However, user participation provided crucial information that widened the scope of the website's audience and guided the development and testing of the website. The needs analysis underpinning the project suggests that palliative care peak bodies need to address three distinct audiences (clinicians, allied health professionals, and patients and their caregivers).

Conclusion: Web developers should pay close attention to the content, language, and accessibility needs of these groups. Given the substantial cost associated with maintaining authoritative health information sites, the paper proposes a more collaborative development approach in which users are engaged in defining content to ensure relevance and responsiveness, and to eliminate unnecessary detail. Access to volunteer networks forms an integral part of such an approach.

Relevance: 40.00%

Publisher:

Abstract:

A web operating system is an operating system that users can access from any hardware at any location. A peer-to-peer (P2P) grid uses P2P communication for resource management and communication between nodes in a grid, manages resources locally in each cluster, and thus provides a suitable architecture for a web operating system. The use of semantic technology in web operating systems is an emerging field that improves the management and discovery of resources and services. In this paper, we propose PGSW-OS (P2P grid semantic Web OS), a model based on a P2P grid architecture and semantic technology that improves resource management in a web operating system through resource discovery aided by semantic features. Our approach integrates distributed hash tables (DHTs) and semantic overlay networks, advertising resources in the DHT according to their semantic annotations to enable semantic-based resource matchmaking. Our model includes ontologies and virtual organizations. Our technique decreases the computational complexity of searching in a web operating system environment. We perform a simulation study using the GridSim simulator, and our experiments show that our model provides enhanced utilization of resources, better search expressiveness, scalability, and precision. © 2014 Springer Science+Business Media New York.
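
A minimal sketch of the advertise-and-match idea described above, assuming a toy in-process hash table in place of a real DHT and plain strings in place of ontology terms (all names are illustrative, not part of PGSW-OS):

    import hashlib
    from collections import defaultdict

    class ToyDHT:
        """Single-process stand-in for a DHT: keys are hashed onto a small
        ring of buckets that model peer nodes."""
        def __init__(self, nodes: int = 16) -> None:
            self.nodes = nodes
            self.store = defaultdict(dict)   # node_id -> {key: set of resource ids}

        def _node_for(self, key: str) -> int:
            digest = hashlib.sha1(key.encode()).hexdigest()
            return int(digest, 16) % self.nodes

        def put(self, key: str, resource_id: str) -> None:
            node = self._node_for(key)
            self.store[node].setdefault(key, set()).add(resource_id)

        def get(self, key: str) -> set:
            return self.store[self._node_for(key)].get(key, set())

    def advertise(dht: ToyDHT, resource_id: str, annotations: list[str]) -> None:
        """Publish one resource under every term it is annotated with."""
        for term in annotations:
            dht.put(term.lower(), resource_id)

    def match(dht: ToyDHT, required_terms: list[str]) -> set:
        """Matchmaking: resources whose annotations cover all required terms."""
        sets = [dht.get(t.lower()) for t in required_terms]
        return set.intersection(*sets) if sets else set()

    dht = ToyDHT()
    advertise(dht, "cluster-7/gpu-0", ["Accelerator", "CUDA"])
    advertise(dht, "cluster-2/cpu-3", ["Processor", "x86_64"])
    print(match(dht, ["accelerator", "cuda"]))   # {'cluster-7/gpu-0'}

A real deployment would additionally route lookups through the semantic overlay so that related ontology terms land on nearby peers, which is where the reported gains in expressiveness and precision would come from.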

Relevance: 40.00%

Publisher:

Abstract:

Arabidopsis thaliana, a small annual plant belonging to the mustard family, is the subject of study by an estimated 7000 researchers around the world. In addition to the large body of genetic, physiological and biochemical data gathered for this plant, its genome will be the first higher-plant genome to be completely sequenced, with completion expected at the end of the year 2000. The sequencing effort has been coordinated by an international collaboration, the Arabidopsis Genome Initiative (AGI). The rationale for intensive investigation of Arabidopsis is that it is an excellent model for higher plants. In order to maximize use of the knowledge gained about this plant, there is a need for a comprehensive database and information retrieval and analysis system that will provide user-friendly access to Arabidopsis information. This paper describes the initial steps we have taken toward realizing these goals in a project called The Arabidopsis Information Resource (TAIR) (www.arabidopsis.org).

Relevance: 40.00%

Publisher:

Abstract:

Disasters cause widespread harm and disrupt the normal functioning of society, and effective management requires the participation and cooperation of many actors. While advances in information and networking technology have made transmission of data easier than ever before, communication and coordination of activities between actors remain exceptionally difficult. This paper employs semantic web technology and Linked Data principles to create a network of intercommunicating and interdependent online sites for managing resources. Each site publishes its available resources openly, and a lightweight open-data protocol is used to issue and respond to requests for resources between sites in the network.
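
The abstract does not specify the protocol's vocabulary, so the following is only a hypothetical sketch of what one site's open resource description and a request between sites might look like, using JSON-LD-style documents with invented URIs:

    import json

    # A minimal, hypothetical description of one site's available resources;
    # the vocabulary URIs are illustrative, not a published standard.
    available = {
        "@context": {"res": "http://example.org/disaster-resource#"},
        "@id": "http://site-a.example.org/resources/water-2024-001",
        "res:type": "bottled-water",
        "res:quantity": 5000,
        "res:location": "Depot A",
    }

    def make_request(resource_uri: str, quantity: int, requester: str) -> str:
        """Build a lightweight request message that another site in the
        network can answer with an acceptance or a counter-offer."""
        return json.dumps({
            "@context": {"res": "http://example.org/disaster-resource#"},
            "res:requests": resource_uri,
            "res:quantity": quantity,
            "res:requester": requester,
        })

    print(json.dumps(available, indent=2))
    print(make_request(available["@id"], 200, "http://site-b.example.org/"))

Because each description carries a dereferenceable @id, any site in the network can follow the link to the publishing site and negotiate directly, which is the Linked Data principle the paper builds on.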

Relevance: 40.00%

Publisher:

Abstract:

This paper reports on an experiment in using a publisher-provided web-based resource to make a series of optional practice quizzes and other supplementary material available to all students taking a first-year introductory microeconomics module. The empirical analysis evaluates the impact these supplementary resources had on student learning. First, we investigate which students decided to make use of the resources. Then, we analyse the impact this decision had on their subsequent performance in the examination at the end of the module. The results show that, even after taking into account the possibility of self-selection bias, using the web-based resource had a significant positive effect on student learning.
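
The abstract does not name the estimator used to handle self-selection; purely as an illustration of one common correction, the sketch below applies inverse-propensity weighting on hypothetical covariates and synthetic data (it adjusts only for observed differences and is not the paper's actual method):

    import numpy as np
    from sklearn.linear_model import LogisticRegression, LinearRegression

    # Synthetic data: X = observed background covariates, used = 1 if the
    # student chose the web resource, score = end-of-module exam mark.
    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=(n, 2))
    used = (X[:, 0] + rng.normal(size=n) > 0).astype(int)   # selection depends on ability
    score = 50 + 5 * X[:, 0] + 3 * used + rng.normal(scale=5, size=n)

    # Step 1: model the selection decision (propensity to use the resource).
    propensity = LogisticRegression().fit(X, used).predict_proba(X)[:, 1]

    # Step 2: inverse-probability weights make users and non-users comparable
    # on observed covariates before estimating the effect on the exam score.
    w = np.where(used == 1, 1 / propensity, 1 / (1 - propensity))
    effect = LinearRegression().fit(used.reshape(-1, 1), score, sample_weight=w).coef_[0]
    print(f"weighted estimate of the usage effect: {effect:.2f} marks")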

Relevance: 40.00%

Publisher:

Abstract:

With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization, and the resource management of such services are becoming increasingly important for delivering the user-desired Quality of Service (QoS).

First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis.

Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10].

Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques for predicting the demand of map workloads online and optimizing resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly predicts workload demands 18.91% more accurately and allocates resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% in resource usage compared to traditional peak-load-based resource allocation.
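
As context for the Top-k Spatial Boolean Queries mentioned above, here is a deliberately tiny, assumption-laden sketch: a brute-force scan stands in for sksOpen's actual index structures, and the data points and keywords are invented:

    import heapq
    import math

    # Toy dataset: (longitude, latitude, set of keywords) per point of interest.
    places = [
        (-80.19, 25.76, {"restaurant", "cuban"}),
        (-80.13, 25.79, {"hotel", "beach"}),
        (-80.21, 25.77, {"restaurant", "seafood"}),
        (-80.30, 25.80, {"restaurant", "cuban", "bakery"}),
    ]

    def topk_spatial_boolean(query_pt, required, k):
        """Return the k places nearest to query_pt whose keyword sets satisfy
        the Boolean predicate 'contains all required keywords'. A real engine
        would answer this from a combined spatial/keyword index rather than
        scanning every record."""
        qx, qy = query_pt
        matches = (
            (math.hypot(x - qx, y - qy), keywords)
            for x, y, keywords in places
            if required <= keywords           # Boolean keyword predicate
        )
        return heapq.nsmallest(k, matches, key=lambda m: m[0])

    print(topk_spatial_boolean((-80.20, 25.77), {"restaurant", "cuban"}, k=2))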