937 results for Library Access Considerations: A User's Perspective


Relevance:

30.00%

Publisher:

Abstract:

Access control is a fundamental concern in any system that manages resources, e.g., operating systems, file systems, databases, and communications systems. The problem we address is how to specify, enforce, and implement access control in distributed environments. This problem occurs in many applications, such as management of distributed project resources, e-newspaper subscriptions, and pay-TV subscription services. Starting from an access relation between users and resources, we derive a user hierarchy, a resource hierarchy, and a unified hierarchy. The unified hierarchy is then used to specify the access relation in a way that is compact and that allows efficient queries. It is also used in cryptographic schemes that enforce the access relation. We introduce three specific cryptography-based hierarchical schemes, which can effectively enforce and implement access control and are designed for distributed environments because they do not need the presence of a central authority (except perhaps for set-up).
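To make the derivation step concrete, here is a minimal sketch (hypothetical code, not the paper's algorithm): it orders users by inclusion of the resource sets they may access, which is one natural way to obtain a user hierarchy from an access relation.

```python
# Illustrative sketch (not the paper's scheme): derive a user hierarchy
# from an access relation by ordering users by inclusion of the
# resource sets they can access. All names here are hypothetical.

def user_hierarchy(access):
    """access: dict mapping user -> set of resources it may access.
    Returns edges (u, v) meaning u's rights are strictly contained
    in v's, i.e. v sits above u in the derived hierarchy."""
    edges = []
    for u, ru in access.items():
        for v, rv in access.items():
            if u != v and ru < rv:  # strict subset => u below v
                edges.append((u, v))
    return edges

access = {
    "admin": {"db", "files", "mail"},
    "staff": {"files", "mail"},
    "guest": {"mail"},
}
print(user_hierarchy(access))
# [('staff', 'admin'), ('guest', 'admin'), ('guest', 'staff')]
```

A resource hierarchy can be derived dually (ordering resources by the sets of users allowed to access them), and the two can then be merged into the unified hierarchy the abstract describes.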

Relevance:

30.00%

Publisher:

Abstract:

This study investigated the use of the library catalogue in the Niger Delta University library. The study employed a descriptive research method, with a questionnaire as the research instrument to generate the data. The analysis revealed that 168 (51 percent) of the users were not aware of the library catalogue, and 160 (54 percent) had never used the catalogue. The study also showed that 209 (71.7 percent) encountered difficulties in using the catalogue because of a lack of proper education and that, as a result, 202 (68.7 percent) of the users resorted to browsing/reading through the shelves to locate books. The analysis also revealed that 202 (68.7 percent) of users indicated that proper user education was a means to easier catalogue use in the library. Recommendations were made to improve effective catalogue use, including user education, a regular orientation programme, and the preparation of guidelines on the use of the library catalogue.

Relevance:

30.00%

Publisher:

Abstract:

Recommendations:
• Become a beta partner with vendor
• Test load collections before going live
• Update cataloging codes to benefit your community
• Don't expect to drastically change cataloging practices

Relevance:

30.00%

Publisher:

Abstract:

Mashups are becoming increasingly popular as end users are able to easily access, manipulate, and compose data from several web sources. To support end users, communities are forming around mashup development environments that facilitate sharing code and knowledge. We have observed, however, that end user mashups tend to suffer from several deficiencies, such as inoperable components or references to invalid data sources, and that those deficiencies are often propagated through the rampant reuse in these end user communities. In this work, we identify and specify ten code smells indicative of deficiencies we observed in a sample of 8,051 pipe-like web mashups developed by thousands of end users in the popular Yahoo! Pipes environment. We show through an empirical study that end users generally prefer pipes that lack those smells, and then present eleven specialized refactorings that we designed to target and remove the smells. Our refactorings reduce the complexity of pipes, increase their abstraction, update broken sources of data and dated components, and standardize pipes to fit the community development patterns. Our assessment on the sample of mashups shows that smells are present in 81% of the pipes, and that the proposed refactorings can reduce that number to 16%, illustrating the potential of refactoring to support thousands of end users developing pipe-like mashups.
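As a hedged illustration of what detecting one such smell might look like (hypothetical code, not the authors' tooling), the sketch below flags modules of a pipe whose data source no longer responds, i.e., the "reference to an invalid data source" deficiency mentioned above.

```python
# Hypothetical smell detector: a pipe is modeled as a list of modules,
# each a dict like {"id": ..., "source_url": ...}. A module smells if
# its source URL is malformed or no longer answers.
import urllib.error
import urllib.request

def dead_source_smells(pipe):
    """Return ids of modules whose data source no longer responds."""
    smelly = []
    for module in pipe:
        url = module.get("source_url")
        if not url:
            continue  # module has no external source to check
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status >= 400:
                    smelly.append(module["id"])
        except (urllib.error.URLError, ValueError):
            # network failure, HTTP error, or malformed URL
            smelly.append(module["id"])
    return smelly
```

A corresponding refactoring, in the spirit of the paper, would replace or remove the flagged modules rather than merely report them.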

Relevance:

30.00%

Publisher:

Abstract:

A qualitative and quantitative reanalysis of the Six Cultures data on children's play, collected in the 1950s, was performed to revisit worlds of childhood during a time when the sample communities were more isolated from mass markets and media than they are today. Children aged 3 to 10 in each community sample were counted as engaging in creative-constructive play, fantasy play, role play, and games with rules. Children from Nyansongo and Khalapur scored lowest overall, those from Tarong and Juxtlahuaca scored intermediate, and those from Taira and Orchard Town scored highest. Cultural norms and opportunities determined how the kinds of play were stimulated by the physical and social environments (e.g., whether adults encouraged work versus play, whether children had freedom for exploration and motivation to practice adult roles through play, and whether the environment provided easy access to models and materials for creative and constructive play).

Relevance:

30.00%

Publisher:

Abstract:

Ubiquitous computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, using any device. In this scenario, new challenges for the practice of software development arise: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a plethora of contextual usage requirements and hardware constraints. In particular, owing to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. To overcome these obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation, the implementation of static Web interfaces, with dynamic adaptation, the alteration of static interfaces at execution time so that they adapt to different contexts of use. In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies. Along these lines, we designed and implemented UbiCon, a framework over which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces, and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.
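A minimal sketch of the hybrid idea, with hypothetical names rather than UbiCon's actual API: a statically chosen template per device class, combined with rules recomputed from the live usage context.

```python
# Hypothetical hybrid adaptation sketch (not UbiCon's API).
# Static adaptation: one pre-built template per device class,
# selected once per request.
STATIC_TEMPLATES = {
    "desktop": "layout_wide.html",
    "mobile": "layout_narrow.html",
}

def select_template(user_agent: str) -> str:
    device = "mobile" if "Mobile" in user_agent else "desktop"
    return STATIC_TEMPLATES[device]

def dynamic_rules(context: dict) -> dict:
    """Dynamic adaptation: settings recomputed from the live context
    (bandwidth, orientation, ...) and applied at execution time."""
    rules = {}
    if context.get("bandwidth_kbps", 10_000) < 500:
        rules["images"] = "low-res"
    if context.get("orientation") == "landscape":
        rules["columns"] = 2
    return rules

print(select_template("Mozilla/5.0 (iPhone; Mobile)"))  # layout_narrow.html
print(dynamic_rules({"bandwidth_kbps": 300, "orientation": "landscape"}))
```

The design point is that the static half keeps serving cheap and cacheable, while the dynamic half absorbs the context variability that cannot be known at build time.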

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To analyse the perspectives of clinical research stakeholders concerning post-trial access to study medication. Methods: Questionnaires and informed consent forms were sent by e-mail to 599 ethics committee (EC) members, 290 clinical investigators (HIV/AIDS and diabetes), and 53 sponsors in Brazil. Investigators were also asked to forward the questionnaire to their research patients. Two reminders were sent to participants. Results: The response rates were 21%, 20%, and 45% in the EC, investigator, and sponsor groups, respectively. 54 patients answered the questionnaire through their doctors. The least informative item in the consent form was how to obtain the study medication after the trial. If a benefit were demonstrated in the study, 60% of research participants and 35% of EC members answered that all patients should continue receiving the study medication after the trial; 43% of investigators believed the medication should be given to participants, and 40% to subjects who participated in and benefited from treatment. For 50% of the sponsors, the study medication should be assured to participants who had benefited from treatment. The majority of responders answered that the medication should be provided free by sponsors; investigators and sponsors believed the medication should be provided until it becomes available in the public health sector; EC members said that the patient should keep the benefit; patients answered that benefits should be assured for life. Conclusions: Owing to the study's limitations, the results cannot be generalised; however, the data can contribute to the discussion of this complex topic by presenting the views of stakeholders in clinical research in Brazil.

Relevance:

30.00%

Publisher:

Abstract:

The thermal limits of individual animals were originally proposed as a link between animal physiology and thermal ecology. Although this link is valid in theory, the evaluation of physiological tolerances involves some problems that are the focus of this study. One rationale was that heating rates should influence upper critical limits, so that ecological thermal limits need to take experimental heating rates into account. In addition, if thermal limits are not surpassed in experiments, subsequent tests of the same individual should yield similar results or produce evidence of hardening. Finally, several non-controlled variables, such as time under experimental conditions and handling procedures, may affect results. To analyze these issues, we conducted an integrative study of upper critical temperatures in a single species, the ant Atta sexdens rubropilosa, an animal model providing large numbers of individuals of diverse sizes but similar genetic makeup. Our specific aims were to test 1) the influence of heating rates on the experimental evaluation of upper critical temperature, 2) the assumptions of absence of physical damage and of reproducibility, and 3) sources of variance often overlooked in the thermal-limits literature; and 4) to introduce some experimental approaches that may help researchers separate physiological and methodological issues. The upper thermal limits were influenced by both heating rates and body mass. In the latter case, the effect was physiological rather than methodological. The critical temperature decreased during subsequent tests performed on the same individual ants, even one week after the initial test. Accordingly, upper thermal limits may have been overestimated by our (and typical) protocols. Heating rates, body mass, procedures independent of temperature, and other variables may affect the estimation of upper critical temperatures. Therefore, based on our data, we offer suggestions to enhance the quality of measurements and recommendations for authors aiming to compile and analyze databases from the literature.
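For illustration only (the numbers below are invented, not the study's data or analysis), a least-squares fit of upper critical temperature against heating rate and body mass, the two influences the abstract reports:

```python
# Toy example of the kind of model implied above: CTmax as a linear
# function of heating rate and body mass. Data are fabricated solely
# to make the fitting step runnable.
import numpy as np

heating_rate = np.array([0.25, 0.25, 1.0, 1.0, 4.0, 4.0])   # deg C / min
body_mass    = np.array([3.1, 12.4, 2.8, 11.9, 3.0, 12.1])  # mg
ct_max       = np.array([42.1, 43.0, 43.5, 44.6, 45.0, 46.2])  # deg C

# Design matrix with intercept; ordinary least squares.
X = np.column_stack([np.ones_like(ct_max), heating_rate, body_mass])
coef, *_ = np.linalg.lstsq(X, ct_max, rcond=None)
print(dict(zip(["intercept", "rate", "mass"], coef.round(3))))
```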

Relevance:

30.00%

Publisher:

Abstract:

Background: Several mathematical and statistical methods have been proposed in the last few years to analyze microarray data. Most of those methods involve complicated formulas and software implementations that require advanced computer programming skills. Researchers from other areas may experience difficulties when attempting to use those methods in their research. Here we present a user-friendly toolbox that allows large-scale gene expression analysis to be carried out by biomedical researchers with limited programming skills. Results: We introduce a user-friendly toolbox called GEDI (Gene Expression Data Interpreter), an extensible, open-source, and freely available tool that we believe will be useful to a wide range of laboratories and to researchers with no background in mathematics or computer science, allowing them to analyze their own data by applying both classical and advanced approaches developed and recently published by Fujita et al. Conclusion: GEDI is an integrated user-friendly viewer that combines the state-of-the-art SVR, DVAR, and SVAR algorithms previously developed by us. It facilitates the application of SVR, DVAR, and SVAR beyond the mathematical formulas presented in the corresponding publications, and it allows one to better understand the results by means of the available visualizations. Both running the statistical methods and visualizing the results are carried out within the graphical user interface, rendering these algorithms accessible to the broad community of researchers in molecular biology.
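As a conceptual illustration only (not GEDI's implementation, and considerably simpler than its SVR, DVAR, and SVAR methods), the sketch below fits a plain first-order vector autoregression to an expression time series by least squares, one basic way to model gene-to-gene influence over time:

```python
# Plain VAR(1) fit by least squares. x[t] is a vector of expression
# levels for all genes at time t; A captures gene-to-gene influence.
import numpy as np

def fit_var1(X):
    """X: (T, n_genes) expression time series. Returns A such that
    x[t] is approximated by A @ x[t-1] in the least-squares sense."""
    past, future = X[:-1], X[1:]
    B, *_ = np.linalg.lstsq(past, future, rcond=None)
    return B.T  # transpose so that x[t] ~ A @ x[t-1]

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))  # toy series: 50 time points, 4 genes
print(fit_var1(X).shape)          # (4, 4)
```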

Relevance:

30.00%

Publisher:

Abstract:

The discussion on the future of cataloging has received increased attention in the last ten years, mainly due to the impact of the rapid development of information and communication technologies over the same period, which has provided access to the Web anytime, anywhere. These discussions revolve around the need for a new bibliographic framework to meet the demands of this new reality in the digital environment, i.e., how can libraries process, store, deliver, share, and integrate their collections (physical, digital, or digitized) in the current post-PC era? Faced with this question, Open Access, Open Source, and Open Standards are three concepts that need to receive greater attention in the field of Library and Information Science, as they are believed to be fundamental elements for a paradigm change in descriptive representation, which is currently based conceptually on the physical item rather than the intellectual work. This paper aims to raise and discuss such issues and to urge information professionals, especially librarians, to think about, discuss, and propose initiatives for these problems, contributing and sharing ideas and possible solutions in multidisciplinary teams. Finally, the creation of multidisciplinary and inter-institutional study groups on the future of cataloging and its impact on national collections is suggested, in order to contribute to the area of descriptive representation at the national and international levels.

Relevance:

30.00%

Publisher:

Abstract:

The University of São Paulo celebrates the 30th anniversary of its Integrated Library System with an exhibition discussing the problems of retrieval, preservation, and access to knowledge arising from the exceptional changes that ICTs produce in contemporary society. It opens up discussion of the main function of the ancient institution of the library, reinforces its relevance, and reflects on the technical tools and social practices that make information, the basic raw material, accessible, generating new forms of knowledge. Regarding the library of the future, it is a call to reflect on how the brilliant minds of the past projected into the future what are, for us, the achievements of the present. The future has already started, and it expects each of us to exercise inventiveness and determination to build it in a humane and collaborative way.

Relevance:

30.00%

Publisher:

Abstract:

USP INFORMATION MANDATE – Resolution 6444 – Oct. 22, 2012
• Make public and accessible the knowledge generated by research developed at USP, encouraging the sharing, use, and generation of new content;
• Preserve institutional memory by storing the full text of the Intellectual Production (scientific, academic, artistic, and technical);
• Increase the impact of the knowledge generated in the university within the scientific community and among the general public;
• It is suggested that all members of the USP community publish the results of their research, preferably in open-access publication outlets and/or repositories, and include in their publication agreements permission to deposit their production in the BDPI system;
• Institutional Repository for Intellectual Production;
• Official source for the USP Statistical Yearbook.

Relevance:

30.00%

Publisher:

Abstract:

Matita (which means "pencil" in Italian) is a new interactive theorem prover under development at the University of Bologna. When compared with state-of-the-art proof assistants, Matita presents both traditional and innovative aspects. The underlying calculus of the system, namely the Calculus of (Co)Inductive Constructions (CIC for short), is well known and is used as the basis of another mainstream proof assistant, Coq, with which Matita is to some extent compatible. In the same spirit as several other systems, proof authoring is conducted by the user as a goal-directed proof search, using a script for storing textual commands for the system. In the tradition of LCF, the proof language of Matita is procedural and relies on tactics and tacticals to proceed toward proof completion. The interaction paradigm offered to the user is based on the script-management technique underlying the popularity of the Proof General generic interface for interactive theorem provers: while editing a script, the user can move the execution point forward to deliver commands to the system, or backward to retract (or "undo") past commands. Matita has been developed from scratch over the past 8 years by several members of the Helm research group, the author of this thesis being one of those members. Matita is now a full-fledged proof assistant with a library of about 1,000 concepts. Several innovative solutions spun off from this development effort. This thesis is about the design and implementation of some of those solutions, in particular those relevant to the topic of user interaction with theorem provers, to which this thesis's author was a major contributor. Joint work with other members of the research group is pointed out where needed. The main topics discussed in this thesis are briefly summarized below.

Disambiguation. Most activities connected with interactive proving require the user to input mathematical formulae. Since mathematical notation is ambiguous, parsing formulae typeset as mathematicians like to write them down on paper is a challenging task, a challenge neglected by several theorem provers, which usually prefer to fix an unambiguous input syntax. Exploiting features of the underlying calculus, Matita offers an efficient disambiguation engine which permits typing formulae in the familiar mathematical notation.

Step-by-step tacticals. Tacticals are higher-order constructs used in proof scripts to combine tactics together. With tacticals, scripts can be made shorter, more readable, and more resilient to changes. Unfortunately, they are de facto incompatible with state-of-the-art user interfaces based on script management. Such interfaces indeed do not permit positioning the execution point inside complex tacticals, thus introducing a trade-off between the usefulness of structured scripts and a tedious big-step execution behavior during script replay. In Matita we break this trade-off with tinycals: an alternative to a subset of LCF tacticals which can be evaluated in a more fine-grained manner.

Extensible yet meaningful notation. Proof assistant users often face the need to create new mathematical notation in order to ease the use of new concepts. The framework used in Matita for dealing with extensible notation both accounts for high-quality two-dimensional rendering of formulae (with the expressivity of MathML Presentation) and provides meaningful notation, where presentational fragments are kept synchronized with the semantic representation of terms. Using our approach, interoperability with other systems can be achieved at the content level, and direct manipulation of formulae acting on their rendered forms is possible too.

Publish/subscribe hints. Automation plays an important role in interactive proving, as users like to delegate tedious proving sub-tasks to decision procedures or external reasoners. Exploiting the Web-friendliness of Matita, we experimented with a broker and a network of web services (called tutors) which can independently try to complete open sub-goals of a proof currently being authored in Matita. The user receives hints from the tutors on how to complete sub-goals and can interactively or automatically apply them to the current proof.

Another innovative aspect of Matita, only marginally touched upon by this thesis, is the embedded content-based search engine Whelp, which is exploited to various ends, from automatic theorem proving to avoiding duplicate work for the user. We also discuss the (potential) reusability in other systems of the widgets presented in this thesis and how we envisage the evolution of user interfaces for interactive theorem provers in the Web 2.0 era.
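As a rough illustration of what a tactical buys the script author, here is a hedged sketch in Lean 4, whose combinators are analogous to, but not the same as, Matita's LCF-style tacticals and tinycals: the `<;>` tactical applies one tactic to every subgoal produced by the previous step, packaging several script steps into a single structured one, exactly the kind of compound step that tinycals make executable piecewise.

```lean
-- Hedged Lean 4 analogue of an LCF-style tactical (Matita's concrete
-- syntax differs). `constructor` splits the conjunction into two
-- subgoals; the `<;>` tactical then applies `Nat.add_comm` to both.
example (a b : Nat) : a + b = b + a ∧ b + a = a + b := by
  constructor <;> apply Nat.add_comm
```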

Relevance:

30.00%

Publisher:

Abstract:

Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective, they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate user requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving Web access to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are:
• to study the architecture of these tools, with the aim of understanding the source of their complexity;
• to exploit such knowledge to experiment with new solutions that, for backward-compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq.
Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts, in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences:
• our internship with the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson theorem. To tackle this result, giving an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four colour theorem;
• our collaboration on the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem.
The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping, like CIC.
In the second part of the manuscript, many of these issues are analysed from the point of view of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually hidden from the user), one of which is specifically designed to be user-driven.
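To make the coercive-subtyping point concrete, here is a minimal, hedged sketch in Lean 4 (not Matita's coercion machinery, which is richer and supports multiple inheritance with shared substructures): a coercion lets a value of one type be used where another type is expected, with the conversion inserted automatically by the elaborator.

```lean
-- Minimal Lean 4 sketch of coercive subtyping (illustrative only).
-- The built-in Nat → Int coercion is inserted automatically wherever
-- an Int is expected but a Nat is supplied.
def addInt (x y : Int) : Int := x + y

#eval addInt (3 : Nat) 5   -- the Nat argument is coerced; prints 8
```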