509 results for anonimato rete privacy deep web onion routing cookie
Abstract:
As a model for knowledge description and formalization, ontologies are widely used to represent user profiles in personalized web information gathering. However, when representing user profiles, many models have used knowledge from only a global knowledge base or only a user's local information. In this paper, a personalized ontology model is proposed for knowledge representation and reasoning over user profiles. The model learns ontological user profiles from both a world knowledge base and user local instance repositories, and is evaluated by comparing it against benchmark models in web information gathering. The results show that this ontology model is successful.
Abstract:
This paper reports an empirical study measuring transit service reliability using data from a Web-based passenger survey on a major transit corridor in Brisbane, Australia. After an introduction to transit service reliability measures, the paper presents the results of the case study, including the study area, data collection, and the reliability measures obtained. This includes exploration of boarding/arrival lateness, in-vehicle time variation, waiting time variation, and headway adherence. The impacts of peak-period effects and separate operation on service reliability are examined, and relationships between transit service characteristics and passenger waiting time are discussed. A summary of key findings and an agenda for future research are offered in the conclusions.
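The headway-adherence and waiting-time measures mentioned above can be illustrated with a small computation. The sketch below uses the standard result that randomly arriving passengers wait E[W] = (E[H]/2)(1 + CV_H^2) on average, where H is the headway and CV_H its coefficient of variation; the headway values are invented for illustration, not taken from the Brisbane survey.

```python
from statistics import mean, pstdev

def headway_stats(headways_min):
    """Compute mean headway, headway adherence (coefficient of variation),
    and the expected waiting time for randomly arriving passengers,
    E[W] = (E[H] / 2) * (1 + CV_H**2). Headways are in minutes."""
    h_bar = mean(headways_min)
    cv = pstdev(headways_min) / h_bar
    expected_wait = (h_bar / 2.0) * (1.0 + cv ** 2)
    return h_bar, cv, expected_wait

# Example with hypothetical observed headways (minutes) on one corridor:
h_bar, cv, w = headway_stats([8, 12, 7, 15, 9, 11])
```

Note that irregular headways raise the expected wait above half the mean headway, which is why headway adherence matters to passengers as much as average frequency.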
Abstract:
Dr Mills is also the invited author of the Deep End Series Teacher Guides by ERA publications. This 3-volume series for teachers is used in more than 200 schools in Australia, the USA, Canada, New Zealand, Sweden, Norway, and South America.
Abstract:
In pre-Fitzgerald Queensland, the existence of corruption was widely known, but its extent and modes of operation were not fully evident. The Fitzgerald Report identified the need for reform of the structure, procedures, and efficiency of public administration in Queensland. What was most striking in the Queensland reform process was that a new model for combating corruption was developed. Rather than relying on a single law and a single institution, existing institutions were strengthened and new ones created, forming a set of mutually supporting and mutually checking institutions, agencies, and laws that jointly sought to improve governmental standards and combat corruption. Some of the reforms were unique to Queensland or very rare elsewhere. One strength of this approach was that it avoided creating a single overarching institution to fight corruption. There are many powerful opponents of reform: influential institutions and individuals resist any interference with their privileges. To cause a mass exodus from an entrenched corruption system, a seminal event or defining process is needed to alter expectations and incentives sufficiently to encourage significant numbers of individuals to desert the corruption system and assist the integrity system in exposing and destroying it. The Fitzgerald Inquiry was such an event. The article also briefly addresses methods for destroying national corruption systems where they emerge and exist.
Abstract:
The interoperable and loosely-coupled web services architecture, while beneficial, can be resource-intensive, and is thus susceptible to denial of service (DoS) attacks in which an attacker can use a relatively insignificant amount of resources to exhaust the computational resources of a web service. We investigate the effectiveness of defending web services from DoS attacks using client puzzles, a cryptographic countermeasure which provides a form of gradual authentication by requiring the client to solve some computationally difficult problems before access is granted. In particular, we describe a mechanism for integrating a hash-based puzzle into existing web services frameworks and analyze the effectiveness of the countermeasure using a variety of scenarios on a network testbed. Client puzzles are an effective defence against flooding attacks. They can also mitigate certain types of semantic-based attacks, although they may not be the optimal solution.
Abstract:
Most web service discovery systems use keyword-based search algorithms and, although partially successful, sometimes fail to satisfy users' information needs. This has given rise to several semantics-based approaches that seek to go beyond simple attribute matching and capture the semantics of services. However, the results reported in the literature vary and in many cases are worse than those obtained by keyword-based systems. We believe the accuracy of the mechanisms used to extract tokens from the non-natural-language sections of WSDL files directly affects the performance of these techniques, because some of them are more sensitive to noise. In this paper, three existing tokenization algorithms are evaluated and a new algorithm is introduced that outperforms all of those found in the literature.
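To illustrate why identifier tokenization matters, a naive baseline splitter for WSDL-style names might look like the following. This is a generic camelCase/underscore splitter offered for comparison, not the new algorithm introduced in the paper.

```python
import re

def tokenize_identifier(name: str) -> list[str]:
    """Split a WSDL-style identifier (camelCase, PascalCase, underscores,
    digits) into lowercase word tokens. A simple baseline splitter."""
    # Insert boundaries at lower->Upper transitions ("getCustomer"),
    # acronym->Word transitions ("XMLParser"), and letter->digit runs.
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1 \2", name)
    s = re.sub(r"([A-Z]+)([A-Z][a-z])", r"\1 \2", s)
    s = re.sub(r"([A-Za-z])([0-9])", r"\1 \2", s)
    return [t.lower() for t in re.split(r"[\s_\-]+", s) if t]
```

A splitter that mishandles acronyms or version suffixes (e.g. treating "XMLHttpRequest" as one token) injects noise into downstream matching, which is the sensitivity the abstract refers to.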
Abstract:
Cracking is a significant factor that can lead to rainfall-induced instability in soil slopes. The existence of cracks at the soil surface decreases the shear strength and increases the hydraulic conductivity of a soil slope. Although previous research has shown the effect of surface cracks on slope stability, the influence of deep cracks remains unknown. The limited availability of deep-crack data, due to the lack of effective investigation methods, may be one obstacle. Current electrical resistivity technology can be used to detect deep cracks in soil. This paper discusses deep cracks in unsaturated residual soil slopes in Indonesia investigated using the electrical resistivity method. Field investigations, including borehole and SPT tests, were carried out at multiple locations in the area where the electrical resistivity testing had been conducted. The borehole and SPT results were then used to verify the results of the electrical resistivity tests. This study demonstrates the benefits and limitations of electrical resistivity in detecting deep cracks in residual soil slopes.
Abstract:
Increasingly, scientists are using collections of software tools in their research. These tools are typically used in concert, often necessitating laborious and error-prone manual data reformatting and transfer. We present an intuitive workflow environment to support scientists in their research. The workflow system, GPFlow, wraps legacy tools, presenting a high-level, interactive web-based front end to scientists. The workflow backend is realized by a commercial-grade workflow engine (Windows Workflow Foundation). The workflow model is inspired by spreadsheets and is novel in its support for an intuitive method of interaction enabling experimentation, as required by many scientists, e.g. bioinformaticians. We apply GPFlow to two bioinformatics experiments and demonstrate its flexibility and simplicity.
Abstract:
In cloud computing, resource allocation and scheduling of multiple composite web services is an important and challenging problem. This is especially so in a hybrid cloud where there may be some low-cost resources available from private clouds and some high-cost resources from public clouds. Meeting this challenge involves two classical computational problems: one is assigning resources to each of the tasks in the composite web services; the other is scheduling the allocated resources when each resource may be used by multiple tasks at different points of time. In addition, Quality-of-Service (QoS) issues, such as execution time and running costs, must be considered in the resource allocation and scheduling problem. Here we present a Cooperative Coevolutionary Genetic Algorithm (CCGA) to solve the deadline-constrained resource allocation and scheduling problem for multiple composite web services. Experimental results show that our CCGA is both efficient and scalable.
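The cooperative coevolutionary idea can be sketched on a toy instance. Following the classic Potter and De Jong CCGA scheme, the assignment vector is split into subcomponents, each evolved by its own population and scored together with the best collaborator from the other population. The task times, costs, and deadline below are invented, and this sketch is far simpler than the paper's CCGA (tasks here simply queue on each resource; there is no scheduling of shared resources over time).

```python
import random

random.seed(1)

# Toy instance: 4 tasks, 2 resources. TIMES[t][r] and COSTS[t][r] are
# hypothetical execution times and running costs of task t on resource r.
TIMES = [[4, 2], [3, 5], [6, 3], [2, 4]]
COSTS = [[1, 3], [2, 1], [1, 4], [1, 2]]
DEADLINE = 8

def fitness(assign):
    """Total running cost, plus a large penalty when the makespan (tasks run
    back to back on their assigned resource) exceeds the deadline."""
    finish = [0, 0]
    for t, r in enumerate(assign):
        finish[r] += TIMES[t][r]
    cost = sum(COSTS[t][r] for t, r in enumerate(assign))
    return cost + (100 if max(finish) > DEADLINE else 0)

def ccga(generations=30, pop_size=10):
    """Cooperative coevolution: one species per half of the assignment
    vector; each individual is evaluated jointly with the current best
    collaborator from the other species. Lower fitness is better."""
    species = [[[random.randrange(2) for _ in range(2)]
                for _ in range(pop_size)] for _ in range(2)]
    best = [species[0][0], species[1][0]]
    for _ in range(generations):
        for s in range(2):
            def joint(ind):
                parts = [ind if i == s else best[i] for i in range(2)]
                return fitness(parts[0] + parts[1])
            species[s].sort(key=joint)          # best collaborator first
            best[s] = species[s][0]
            half = pop_size // 2
            for i in range(half, pop_size):     # elitist replacement:
                child = list(species[s][i - half])
                child[random.randrange(2)] ^= 1  # flip one resource choice
                species[s][i] = child
    return best[0] + best[1], fitness(best[0] + best[1])
```

The joint evaluation is what makes the scheme cooperative: neither species can improve its fitness except by producing subcomponents that combine well with the other's.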
Abstract:
In this paper, we propose a search-based approach to joining two tables in the absence of clean join attributes. Unstructured documents from the web are used to express the correlations between a given query and a reference list. A major challenge in implementing this approach is how to efficiently determine the number of times, and the locations at which, each clean reference from the reference list is approximately mentioned in the retrieved documents. We formalize this as the Approximate Membership Localization (AML) problem and propose an efficient partial-pruning algorithm to solve it. A study using real-world data sets demonstrates the effectiveness of our search-based approach and the efficiency of our AML algorithm.
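A naive stand-in for approximate membership localization can be sketched as follows: slide a window of the reference's token length over a document and flag windows whose token-set Jaccard similarity with the reference clears a threshold. The similarity measure and threshold are illustrative choices; the paper's partial-pruning algorithm solves the same localization problem far more efficiently.

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two token sets."""
    return len(a & b) / len(a | b)

def locate_mentions(doc: str, reference: str, threshold: float = 0.6):
    """Return (start_token_index, window_text) pairs where `reference` is
    approximately mentioned in `doc`. A brute-force baseline: every window
    of the reference's length is compared, with no pruning."""
    ref_tokens = reference.lower().split()
    doc_tokens = doc.lower().split()
    ref_set = set(ref_tokens)
    k = len(ref_tokens)
    hits = []
    for i in range(len(doc_tokens) - k + 1):
        window = doc_tokens[i:i + k]
        if jaccard(set(window), ref_set) >= threshold:
            hits.append((i, " ".join(window)))
    return hits
```

Because every reference is compared against every window, this baseline scales poorly with document and reference-list size, which is precisely the cost that pruning is meant to avoid.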