865 results for Libraries, Private
Abstract:
The rapid development of the World Wide Web has generated massive amounts of information, leading to the problem of information overload. Under this circumstance, personalization techniques have been introduced to help users find content that meets their personalized interests or needs amid the massively increasing volume of information. User profiling techniques play the core role in this research. Traditionally, most user profiling techniques create user representations in a static way. However, in real-world applications user interests may change over time. In this research we develop algorithms for mining user interests by integrating time decay mechanisms into topic-based user interest profiling. Time-forgetting functions are integrated into the calculation of topic interest measurements at an in-depth level. The experimental study shows that accounting for the temporal effects of user interests by integrating time-forgetting mechanisms improves recommendation performance.
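The time-forgetting idea described above can be made concrete with a small sketch. The function name, the exponential form of the decay, and the half-life parameter below are illustrative assumptions, not details taken from the abstract: each observed interaction's contribution to a topic's interest weight decays exponentially with its age, so recent interests outweigh stale ones.

```python
import math
from collections import defaultdict

def decayed_topic_profile(interactions, now, half_life=30.0):
    # Hypothetical sketch: each (topic, timestamp) interaction contributes
    # a weight that halves every `half_life` time units -- a simple
    # time-forgetting function applied to topic interest measurements.
    decay_rate = math.log(2) / half_life
    profile = defaultdict(float)
    for topic, timestamp in interactions:
        profile[topic] += math.exp(-decay_rate * (now - timestamp))
    # Normalise so the profile weights sum to 1.
    total = sum(profile.values())
    return {topic: w / total for topic, w in profile.items()}
```

With a 30-unit half-life, an interaction 30 time units old counts exactly half as much as one observed just now.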
Abstract:
Most recommender systems use collaborative filtering, content-based filtering or a hybrid approach to recommend items to new users. Collaborative filtering recommends items to new users based on their similar neighbours, while content-based filtering tries to recommend items that are similar to new users' profiles. The fundamental issues include how to profile new users and how to deal with over-specialization in content-based recommender systems. Indeed, the terms used to describe items can be organized as a concept hierarchy. Therefore, we aim to describe user profiles or information needs using concept vectors. This paper presents a new method to acquire user information needs, which allows new users to describe their preferences on a concept hierarchy rather than by rating items. It also develops a new ranking function to recommend items to new users based on their information needs. The proposed approach is evaluated on Amazon book datasets. The experimental results demonstrate that the proposed approach can largely improve the effectiveness of recommender systems.
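One plausible reading of the concept-hierarchy preference idea can be sketched as follows. The propagation rule, the damping factor, and all names here are our illustrative assumptions, not the paper's actual ranking function: a preference stated on a broad concept propagates, dampened, to its sub-concepts, and items are then ranked by the summed weight of the concepts that describe them.

```python
def expand_preferences(prefs, children, damping=0.5):
    # Propagate a stated preference down the concept hierarchy with a
    # damping factor, so a liking for a broad concept implies a weaker
    # interest in each of its sub-concepts.
    expanded = dict(prefs)
    stack = list(prefs.items())
    while stack:
        concept, weight = stack.pop()
        for child in children.get(concept, []):
            child_weight = weight * damping
            if child_weight > expanded.get(child, 0.0):
                expanded[child] = child_weight
                stack.append((child, child_weight))
    return expanded

def rank_items(prefs, children, items):
    # Rank (item, concepts) pairs by the summed propagated weight of
    # the concepts describing each item.
    weights = expand_preferences(prefs, children)
    return sorted(items,
                  key=lambda kv: sum(weights.get(c, 0.0) for c in kv[1]),
                  reverse=True)
```

A user who states a preference only for "Science" would thus still see a physics book ranked above an unrelated one, without ever rating an item.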
Abstract:
Different reputation models are used on the web to generate reputation values for products from users' review data. Most current reputation models use review ratings and neglect users' textual reviews, because text is more difficult to process. However, we argue that the overall reputation score for an item does not reflect the actual reputation of all of its features, which is why using users' textual reviews is necessary. In our work we introduce a new reputation model that defines a new aggregation method for users' opinions about product features, extracted from review text. Our model uses a feature ontology to define the general features and sub-features of a product. It also reflects the frequencies of positive and negative opinions. We provide a case study to show how our results compare with other reputation models.
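The frequency-based aggregation step can be sketched minimally. The function name and the Laplace-smoothed scoring rule below are our assumptions, not the paper's actual aggregation method: per-feature opinions extracted from review text are tallied, and each feature gets a score reflecting its positive/negative mention frequencies.

```python
from collections import defaultdict

def feature_reputation(opinions):
    # Hypothetical sketch: aggregate extracted (feature, polarity)
    # opinions into per-feature reputation scores in [0, 1].
    # Laplace smoothing (+1 / +2) is our choice for illustration.
    counts = defaultdict(lambda: [0, 0])  # feature -> [positive, negative]
    for feature, polarity in opinions:
        counts[feature][0 if polarity > 0 else 1] += 1
    return {f: (pos + 1) / (pos + neg + 2)
            for f, (pos, neg) in counts.items()}
```

A product whose "battery" draws mixed opinions and whose "screen" draws only praise then ends up with distinct per-feature scores, which a single overall rating would hide.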
Abstract:
Agile learning spaces have the potential to afford flexible and innovative pedagogic practice. However, little is known about the experiences of teachers and learners in newly designed learning spaces, and whether the potential for reimagined pedagogies is being realised. This paper uses data from a recent study into the experiences of teacher-librarians, teachers, students and leaders of seven Queensland school libraries built with Building the Education Revolution (BER) funding, to explore the question, “how does the physical environment of school libraries influence pedagogic practices?” This paper proposes that teachers explored new pedagogies within the spaces when there was opportunity for flexibility and experimentation and the spaces sufficiently supported their beliefs about student learning. The perspectives of a range of library users were gathered through an innovative research design incorporating student drawings, videoed library tours and reflections, and interviews. The research team collected qualitative data from school libraries throughout 2012. The libraries represented a variety of geographic locations, socioeconomic conditions and both primary and secondary campuses. The use of multiple data sources, together with the perspectives of the multiple researchers who visited the sites and then coded the data, enabled complementary insights and synergies to emerge. Principles of effective teacher learning that can underpin school-wide learning about the potential for agile learning spaces to enhance student learning are identified. The paper concludes that widespread innovative use of the new library spaces was significantly enhanced when school leadership fostered whole-school discussions about the type of learning the spaces might provoke. This research has the potential to inform school designers, teachers and teacher-librarians in making the most of the transformative potential of next-generation learning spaces.
Abstract:
Recent scholarship has considered the implications of the rise of voluntary private standards in food and the role of private actors in a rapidly evolving, de facto ‘mandatory’ sphere of governance. Standards are an important element of this globalising private sphere, but one that has been relatively peripheral in analyses of power in agri-food systems. Sociological thought has countered orthodox views of standards as simple tools of measurement, instead understanding their function as a governance mechanism that transforms many things, and people, during processes of standardisation. In a case study of the Australian retail supermarket duopoly and the proprietary standards required for market access, this paper foregrounds retailers as standard owners and their role in third-party auditing and certification. Interview data from primary research into Australia’s food standards captures the multifaceted role supermarkets play as standard owners, who are found to impinge on the independence of third-party certification while enforcing rigorous audit practices. We show how standard owners, in attempting to standardise the audit process, generate tensions within certification practices in a unique example of ritualism around audit. In examining standards to understand power in contemporary food governance, it is shown that retailers are drawn beyond standard-setting into certification and enforcement, which is characterised by a web of institutions and actors whose power to influence outcomes is uneven.
Abstract:
Proxy re-encryption (PRE) is a highly useful cryptographic primitive whereby Alice and Bob can endow a proxy with the capacity to change ciphertext recipients from Alice to Bob, without the proxy itself being able to decrypt, thereby providing delegation of decryption authority. Key-private PRE (KP-PRE) specifies an additional level of confidentiality, requiring pseudo-random proxy keys that leak no information on the identity of the delegators and delegatees. In this paper, we propose a CPA-secure KP-PRE scheme in the standard model (which we then transform into a CCA-secure scheme in the random oracle model). Both schemes enjoy highly desirable properties such as uni-directionality and multi-hop delegation. Unlike (the few) prior constructions of PRE and KP-PRE, which typically rely on bilinear maps under ad hoc assumptions, the security of our construction is based on the hardness of the standard Learning-With-Errors (LWE) problem, itself reducible from worst-case lattice problems that are conjectured immune to quantum cryptanalysis, or “post-quantum”. Of independent interest, we further examine the practical hardness of the LWE assumption, using Kannan’s exhaustive search algorithm coupled with pruning techniques. This leads to state-of-the-art parameters not only for our scheme, but also for a number of other LWE-based primitives published in the literature.
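For readers unfamiliar with LWE, a toy instance looks like the sketch below. The parameters are far too small to be secure and are chosen purely for illustration; the scheme in the abstract uses much larger, carefully analysed parameters. An attacker sees pairs (a, b = ⟨a, s⟩ + e mod q) and must recover the secret vector s despite the small noise e.

```python
import random

def lwe_samples(n=8, q=97, m=16, noise=1, seed=0):
    # Toy LWE instance: each sample is (a, b) with b = <a, s> + e (mod q)
    # for a small random error e in [-noise, noise].  Illustrative
    # parameters only -- real schemes use far larger dimensions and
    # carefully chosen noise distributions.
    rng = random.Random(seed)
    s = [rng.randrange(q) for _ in range(n)]  # the secret vector
    samples = []
    for _ in range(m):
        a = [rng.randrange(q) for _ in range(n)]
        e = rng.randrange(-noise, noise + 1)
        b = (sum(x * y for x, y in zip(a, s)) + e) % q
        samples.append((a, b))
    return s, samples
```

Without the error term e the secret would fall to plain Gaussian elimination; the noise is exactly what makes the problem (conjecturally) hard, even for quantum attackers.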
Abstract:
We present two unconditionally secure protocols for private set disjointness tests. To provide intuition for our protocols, we give a naive example that applies Sylvester matrices. Unfortunately, this simple construction is insecure, as it reveals information about the intersection cardinality; more specifically, it discloses its lower bound. By using Lagrange interpolation, we provide a protocol for the honest-but-curious case that reveals no additional information. Finally, we describe a protocol that is secure against malicious adversaries, in which a verification test is applied to detect misbehaving participants. Both protocols require O(1) rounds of communication. Our protocols are more efficient than previous protocols in terms of communication and computation overhead. Unlike previous protocols, whose security relies on computational assumptions, our protocols provide information-theoretic security. To our knowledge, our protocols are the first to have been designed without a generic secure function evaluation. More importantly, they are the most efficient protocols for private disjointness tests in the malicious adversary case.
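The Sylvester-matrix intuition rests on a classical fact: the resultant of two polynomials, computable as the determinant of their Sylvester matrix, is zero exactly when the polynomials share a root. Encoding each set as the polynomial whose roots are its elements therefore yields a disjointness test. The sketch below is our plaintext illustration of that underlying fact, not the protocol itself (which, as the abstract notes, must hide the resultant's value, since it leaks intersection information).

```python
from fractions import Fraction

def poly_from_set(s):
    # Coefficients (highest degree first) of prod (x - a) over a in s.
    coeffs = [1]
    for a in s:
        shifted = coeffs + [0]                    # multiply by x
        scaled = [0] + [-a * c for c in coeffs]   # multiply by -a
        coeffs = [u + v for u, v in zip(shifted, scaled)]
    return coeffs

def sylvester_matrix(f, g):
    # Sylvester matrix of f (degree m) and g (degree n): an
    # (m+n) x (m+n) matrix built from shifted coefficient rows.
    m, n = len(f) - 1, len(g) - 1
    rows = [[0] * i + f + [0] * (n - 1 - i) for i in range(n)]
    rows += [[0] * i + g + [0] * (m - 1 - i) for i in range(m)]
    return rows

def determinant(matrix):
    # Exact determinant via Gaussian elimination over the rationals.
    m = [[Fraction(x) for x in row] for row in matrix]
    n, sign = len(m), 1
    for col in range(n):
        pivot = next((r for r in range(col, n) if m[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]
            sign = -sign
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            m[r] = [a - factor * b for a, b in zip(m[r], m[col])]
    result = Fraction(sign)
    for i in range(n):
        result *= m[i][i]
    return result

def naive_disjoint(set_a, set_b):
    # The resultant (determinant of the Sylvester matrix) is zero iff
    # the two root-polynomials share a root, i.e. iff the sets
    # intersect.  Insecure as a protocol: the resultant's magnitude
    # leaks information about the intersection.
    f, g = poly_from_set(set_a), poly_from_set(set_b)
    return determinant(sylvester_matrix(f, g)) != 0
```

This makes concrete both why the construction works and why it leaks: a nonzero determinant certifies disjointness, but its actual value carries additional information about the sets.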
Abstract:
User-generated content plays a pivotal role in current social media. The main focus, however, has been on explicitly generated user content such as photos, videos and status updates on different social networking sites. In this paper, we explore the potential of implicitly generated user content, based on users’ online consumption behaviors. It is technically feasible to record users’ consumption behaviors on mobile devices and share them with relevant people. Mobile devices with such capabilities could enrich social interactions around the consumed content, but they may also threaten users’ privacy. To understand the potential of this design direction we created and evaluated a low-fidelity prototype intended for photo sharing within private groups. Our prototype incorporates two design concepts, namely FingerPrint and MoodPhotos, that leverage users’ consumption history and emotional responses. In this paper, we report user values and user acceptance of this prototype from three participatory design workshops.
Abstract:
At Eurocrypt’04, Freedman, Nissim and Pinkas introduced the fuzzy private matching problem, defined as follows: given two parties, each holding a set of vectors with T integer components, fuzzy private matching securely tests whether each vector of one set matches any vector of the other set on at least t components, where t < T. In the conclusion of their paper, they asked whether it was possible to design a fuzzy private matching protocol without incurring a communication complexity containing the binomial factor (T choose t). We answer their question in the affirmative by presenting a protocol based on homomorphic encryption, combined with the novel notion of a share-hiding error-correcting secret sharing scheme, which we show how to implement with efficient decoding using interleaved Reed-Solomon codes. This scheme may be of independent interest. Our protocol is provably secure against passive adversaries, and has better efficiency than previous protocols for certain parameter values.
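The predicate that fuzzy private matching evaluates can be stated in the clear as follows. This is only a plaintext reference for the functionality; the protocol itself computes it under homomorphic encryption so that neither party sees the other's vectors.

```python
def fuzzy_match(u, v, t):
    # Two T-component vectors "fuzzily match" when they agree on at
    # least t positions (t < T).
    assert len(u) == len(v)
    return sum(a == b for a, b in zip(u, v)) >= t

def fuzzy_matching_output(set_x, set_y, t):
    # Plaintext reference output: for each vector in X, whether it
    # fuzzily matches some vector in Y.
    return [any(fuzzy_match(x, y, t) for y in set_y) for x in set_x]
```

The difficulty the paper addresses is computing exactly this output privately without enumerating all (T choose t) component subsets.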
Abstract:
In private placement transactions, issuing firms sell a block of securities to just a small group of investors at a discounted price. Non-participating shareholders suffer from ownership dilution and lose the opportunity to receive the discount. This thesis provides the first evidence on whether and how corporate governance can protect non-participating shareholders' interests. Results from an examination of 329 private placements issued by the top 250 Australian firms between 2002 and 2009 demonstrate that firms with higher governance quality are more likely to issue a share purchase plan (SPP) along with the private placement, thus providing greater protection to non-participating shareholders' interests.
Abstract:
Recently, the botnet, a network of compromised computers, has been recognized as the biggest threat to the Internet. The bots in a botnet communicate with the botnet owner via a communication channel called the Command and Control (C & C) channel. There are three main C & C channels: Internet Relay Chat (IRC), Peer-to-Peer (P2P) and web-based protocols. By exploiting the flexibility of Web 2.0 technology, the web-based botnet has reached a new level of sophistication. In August 2009, such a botnet was found on Twitter, one of the most popular Web 2.0 services. In this paper, we describe a new type of botnet that uses a Web 2.0 service as a C & C channel and as temporary storage for its stolen information. We then propose a novel approach to thwart this type of attack. Our method applies a unique identifier of the computer, an encryption algorithm with session keys and a CAPTCHA verification.
Abstract:
Early works on Private Information Retrieval (PIR) focused on minimizing the necessary communication overhead. They seemed to achieve this goal, but at the expense of query response time. To mitigate this weakness, protocols with secure coprocessors were introduced. They achieve optimal communication complexity and better online processing complexity. Unfortunately, all secure coprocessor-based PIR protocols require heavy periodic preprocessing. In this paper, we propose a new protocol, which is free from periodic preprocessing while offering optimal communication complexity and almost optimal online processing complexity. The proposed protocol is proven to be secure.
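To make the PIR functionality concrete, here is the classic two-server XOR-based scheme of Chor, Goldreich, Kushilevitz and Sudan. Note this is a different technique from the single-server, coprocessor-based protocol of the abstract; it is included only to show what "retrieving a bit without revealing the index" means. Each server sees one selection vector, which alone is uniformly random and reveals nothing about the queried index.

```python
import secrets

def pir_query(n, index):
    # Client builds two random-looking selection vectors that differ
    # only at the queried index; each vector alone is uniformly random.
    q1 = [secrets.randbits(1) for _ in range(n)]
    q2 = q1[:]
    q2[index] ^= 1
    return q1, q2

def pir_answer(db_bits, query):
    # Server XORs together the database bits its query selects.
    answer = 0
    for bit, selected in zip(db_bits, query):
        if selected:
            answer ^= bit
    return answer

def pir_reconstruct(a1, a2):
    # XORing the two answers cancels every bit except the target one.
    return a1 ^ a2
```

The cost that motivates the line of work above is visible here: each server touches the whole database per query, which is exactly the online processing burden that coprocessor-based protocols try to reduce.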
Abstract:
We present efficient protocols for private set disjointness tests. We start from an intuitive construction that applies Sylvester matrices. Unfortunately, this simple construction is insecure, as it reveals information about the cardinality of the intersection; more specifically, it discloses its lower bound. By using Lagrange interpolation we provide a protocol for the honest-but-curious case that reveals no additional information. Finally, we describe a protocol that is secure against malicious adversaries. The protocol applies a verification test to detect misbehaving participants. Both protocols require O(1) rounds of communication. Our protocols are more efficient than previous protocols in terms of communication and computation overhead. Unlike previous protocols, whose security relies on computational assumptions, our protocols provide information-theoretic security. To our knowledge, our protocols are the first to have been designed without a generic secure function evaluation. More importantly, they are the most efficient protocols for private disjointness tests in the malicious adversary case.
Abstract:
Motivated by the need for private set operations in a distributed environment, we extend the two-party private matching problem proposed by Freedman, Nissim and Pinkas (FNP) at Eurocrypt’04 to the distributed setting. By using a secret sharing scheme, we provide a distributed solution to FNP private matching, called distributed private matching. In our distributed private matching scheme, we use a polynomial to represent one party’s dataset, as in FNP, and then distribute the polynomial to multiple servers. We extend our solution to distributed set intersection and the cardinality of the intersection, and we further show how to apply distributed private matching to compute the distributed subset relation. Our work extends the private matching and set intersection primitives of Freedman et al. Our distributed construction may be of great value when the dataset is outsourced and its privacy is the main concern. In such cases, our distributed solutions preserve the utility of these set operations without compromising dataset privacy. Compared with previous works, we achieve a more efficient solution in terms of computation. All protocols constructed in this paper are provably secure against a semi-honest adversary under the Decisional Diffie-Hellman assumption.
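The two building blocks named above, FNP's polynomial representation of a dataset and secret sharing across servers, can be sketched in the clear. The function names and the additive sharing scheme are our illustrative choices; the actual construction operates on encrypted or shared polynomial coefficients rather than on plaintext values.

```python
import random

def root_polynomial(dataset):
    # FNP represents a dataset {a_1, ..., a_k} by P(x) = prod (x - a_i):
    # an element b belongs to the set exactly when P(b) == 0.
    def P(x):
        value = 1
        for a in dataset:
            value *= (x - a)
        return value
    return P

def additive_share(value, n_servers, modulus, rng=random):
    # Split a value into n additive shares mod q: any n-1 shares look
    # uniformly random, while all n shares sum back to the value.
    shares = [rng.randrange(modulus) for _ in range(n_servers - 1)]
    shares.append((value - sum(shares)) % modulus)
    return shares
```

Distributing the polynomial's coefficients as shares in this way is what lets an outsourced dataset support matching queries while no single server learns its contents.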
Abstract:
Since Queensland Wire Industries Pty Ltd v Broken Hill Pty Co Ltd (1989) 167 CLR 177 it has been recognised that corporations with substantial market power are subject to special responsibilities and restraints that corporations without market power are not. In NT Power Generation Pty Ltd v Power and Water Authority (2004) 219 CLR 90, McHugh A-CJ, Gummow, Callinan and Heydon JJ stated in their joint reasons (at [76]) that s 46 of the Competition and Consumer Act 2010 (Cth) (CCA) can operate not only to prevent firms with substantial market power from doing prohibited things, but also to compel them positively to do things they do not want to do. Their Honours also stated (at [126]) that the proposition that a private property owner who declines to permit competitors to use the property is immune from s 46 is “intrinsically unsound”. However, the circumstances in which a firm with substantial market power must accommodate competitors, and in which private property rights give way to the public interest, are uncertain. The purpose of this Note is to consider recent developments in two areas of the CCA where the law requires private property rights to give way to the public interest. The first part of the Note considers two recent cases which clarify the circumstances in which s 46 of the CCA can be used to compel a firm with substantial market power to accommodate a competitor and allow the competitor to make use of private property rights in the public interest. Secondly, on 12 February 2014 the Minister for Small Business, the Hon Bruce Billson, released the Productivity Commission’s Final Report on the National Access Regime in Pt IIIA of the CCA (National Access Regime, Inquiry Report No 66, Canberra). Pt IIIA provides the processes by which third parties may obtain access to infrastructure owned by others in the public interest. The Report recommends that Pt IIIA be retained but makes a number of suggestions for its reform, some of which will be briefly considered.