167 results for Library statistics.
Abstract:
The purpose of this paper is to consider how libraries support the development of community networks, both physically and digitally. To do this, a case-study methodology was employed, combining data about the library with qualitative interviews in which library users reflected on their experience of the library. This paper proposes that libraries act as ‘third places’ spatially connecting people; libraries also build links with online media and play a critical role in inclusively connecting non-technology users with information on the Internet and with digital technology more generally. The paper establishes the value of libraries in the digital age and recommends that libraries actively seek ways to develop links between non-technology users and activity on the Internet, addressing the need to reach these users in different ways. Further, it suggests that libraries utilise their positioning as third places to create broader community networks, supporting local communities beyond existing users and beyond the library precinct.
Abstract:
The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear algebra-centred algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure and readability, while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
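A rough sketch of the approach described above, assuming the RcppArmadillo package is available: a small linear-algebra routine (here an ordinary least-squares fit, chosen purely for illustration rather than the paper's Kalman filter benchmark, with the hypothetical name fastLm_sketch) is written against the Armadillo API and exported back to the host R session via an Rcpp attribute.

    // Minimal sketch, assuming RcppArmadillo; fastLm_sketch is an illustrative name.
    // [[Rcpp::depends(RcppArmadillo)]]
    #include <RcppArmadillo.h>

    // [[Rcpp::export]]
    Rcpp::List fastLm_sketch(const arma::mat& X, const arma::colvec& y) {
        // Solve the least-squares problem X * beta ~ y in compiled code;
        // Armadillo's expression templates fuse intermediate operations.
        arma::colvec beta  = arma::solve(X, y);
        arma::colvec resid = y - X * beta;

        // Return the results to the host R session through Rcpp's interface.
        return Rcpp::List::create(Rcpp::Named("coefficients") = beta,
                                  Rcpp::Named("residuals")    = resid);
    }

From R, such a file could be compiled and called with Rcpp::sourceCpp("fastLm_sketch.cpp") followed by fastLm_sketch(X, y), preserving the bidirectional link with the R environment that the abstract describes.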
Abstract:
University libraries worldwide are reconceptualising the ways in which they support the research agenda in their respective institutions. This paper is based on a survey completed by member libraries of the Queensland University Libraries Office of Cooperation (QULOC), the findings of which may be informative for other university libraries. After briefly examining major emerging trends in research support, the paper discusses the results of the survey, focussing specifically on support for researchers and the research agenda in their institutions. All responding libraries offer a high level of research support; however, eResearch support in general, and research data management support in particular, show the highest variance among the libraries and signal possible areas for growth. Areas for follow-up, benchmarking and development are suggested.
Abstract:
Lankes and Silverstein (2006) introduced the “participatory library” and suggested that the nature and form of the library should be explored. In the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little of this work has been based on empirical data, and such models have had a strong focus on technical aspects but less focus on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for this study, involving in-depth individual interviews with six librarians. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, socio-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study’s findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.
Abstract:
Phenomenography is a research approach devised to allow the investigation of the varying ways in which people experience aspects of their world. Whilst growing attention is being paid to interpretative research in LIS, it is not always clear how the outcomes of such research can be used in practice. This article explores the potential contribution of phenomenography in advancing the application of phenomenological and hermeneutic frameworks to LIS theory, research and practice. In phenomenography we find a research tool which, in revealing variation, uncovers everyday understandings of phenomena and provides outcomes which are readily applicable to professional practice. The outcomes may be used in human-computer interface design, enhancement, implementation and training, in the design and evaluation of services, and in education and training for both end users and information professionals. A proposed research territory for phenomenography in LIS includes investigating qualitative variation in the experienced meaning of: 1) information and its role in society; 2) LIS concepts and principles; 3) LIS processes; and 4) LIS elements.
Abstract:
The knowledge economy of the 21st century requires skills such as creativity, critical thinking, problem solving, communication and collaboration (Partnership for 21st century skills, 2011) – skills that cannot easily be learnt from books, but rather through learning-by-doing and social interaction. Big ideas and disruptive innovation often result from collaboration between individuals from diverse backgrounds and areas of expertise. Public libraries, as facilitators of education and knowledge, have been actively seeking responses to such changing needs of the general public...
Abstract:
In the field of face recognition, Sparse Representation (SR) has received considerable attention during the past few years. Most of the relevant literature focuses on holistic descriptors in closed-set identification applications. The underlying assumption in SR-based methods is that each class in the gallery has sufficient samples and that the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the more challenging face verification scenario, where an algorithm is required to determine whether two faces (where one or both have not been seen before) belong to the same person. In this paper, we first discuss why previous attempts with SR might not be applicable to verification problems. We then propose an alternative approach to face verification via SR. Specifically, we propose to use explicit SR encoding on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which are then concatenated to form an overall face descriptor. Due to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, we evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN), and an implicit probabilistic technique based on Gaussian Mixture Models. Thorough experiments on the AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the proposed local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, in both verification and closed-set identification problems. The experiments also show that l1-minimisation based encoding has a considerably higher computational cost than the other techniques, but leads to higher recognition rates.
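As a concrete illustration of the local SR pipeline summarised above (a sketch under stated assumptions, not the authors' implementation): the code below uses the C++ Armadillo library, takes an already-learned dictionary D and per-region matrices of vectorised patches as given inputs, and uses a few iterations of ISTA as a simple stand-in l1 solver; per-patch sparse codes are average-pooled within each region, and the region descriptors are concatenated and normalised.

    // Minimal sketch of local sparse-representation pooling with Armadillo.
    // The dictionary D and the per-region patch matrices are assumed inputs;
    // ISTA serves here as a simple stand-in l1 (lasso) solver.
    #include <armadillo>
    #include <vector>
    #include <cmath>

    // Soft-thresholding operator used by ISTA.
    static arma::vec soft_threshold(const arma::vec& u, double thr) {
        return arma::sign(u) % arma::clamp(arma::abs(u) - thr, 0.0, arma::datum::inf);
    }

    // A few ISTA iterations approximating argmin_z ||x - D z||^2 + lambda ||z||_1.
    static arma::vec sparse_code(const arma::mat& D, const arma::vec& x,
                                 double lambda, int iters = 50) {
        const double L = std::pow(arma::norm(D, 2), 2);   // gradient step-size constant
        arma::vec z(D.n_cols, arma::fill::zeros);
        for (int i = 0; i < iters; ++i) {
            z = soft_threshold(z + D.t() * (x - D * z) / L, lambda / L);
        }
        return z;
    }

    // Each region is a matrix whose columns are vectorised local patches.
    // Codes are average-pooled per region, then region descriptors are concatenated.
    arma::vec face_descriptor(const arma::mat& D,
                              const std::vector<arma::mat>& regions,
                              double lambda) {
        arma::vec descriptor(regions.size() * D.n_cols, arma::fill::zeros);
        for (std::size_t r = 0; r < regions.size(); ++r) {
            const arma::mat& patches = regions[r];
            arma::vec pooled(D.n_cols, arma::fill::zeros);
            for (arma::uword j = 0; j < patches.n_cols; ++j) {
                pooled += sparse_code(D, patches.col(j), lambda);
            }
            pooled /= static_cast<double>(patches.n_cols);
            descriptor.subvec(r * D.n_cols, (r + 1) * D.n_cols - 1) = pooled;
        }
        return arma::normalise(descriptor);   // overall face descriptor
    }

Verification could then reduce to comparing two such descriptors, for example with arma::dot (a cosine similarity, since the descriptors are unit-normalised) and a decision threshold.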
Abstract:
This paper presents the idea of a compendium of process technologies, i.e., a concise but comprehensive collection of techniques for process model analysis that support research on the design, execution, and evaluation of processes. The idea originated from observations on the evolution of process-related research disciplines. Based on these observations, we derive design goals for a compendium. Then, we present the jBPT library, which addresses these goals by means of an implementation of common analysis techniques in an open source codebase.
Abstract:
In Australia and increasingly worldwide, methamphetamine is one of the most commonly seized drugs analysed by forensic chemists. The current well-established GC/MS methods used to identify and quantify methamphetamine are lengthy and expensive, yet undercover police often request rapid analysis, which has led to interest in developing a faster analytical technique. Ninety-six illicit drug seizures containing methamphetamine (0.1% - 78.6%) were analysed using Fourier Transform Infrared Spectroscopy with an Attenuated Total Reflectance attachment, combined with chemometrics. Two Partial Least Squares models were developed: one using the principal infrared peaks of methamphetamine, and the other a Hierarchical Partial Least Squares model. Both models were refined by choosing the variables most closely associated with the methamphetamine percentage vector. Both models performed very well: the principal-peaks Partial Least Squares model had a Root Mean Square Error of Prediction of 3.8, an R2 of 0.9779 and a lower limit of quantification of 7% methamphetamine, while the Hierarchical Partial Least Squares model had a lower limit of quantification of 0.3% methamphetamine, a Root Mean Square Error of Prediction of 5.2 and an R2 of 0.9637. Such models offer rapid and effective methods for screening illicit drug samples to determine the percentage of methamphetamine they contain.
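To make the modelling step concrete, the following is a minimal sketch of a single-response partial least squares (PLS1) calibration in C++ with the Armadillo library; it shows a generic NIPALS-style algorithm mapping mean-centred spectra to methamphetamine percentages, and deliberately omits the variable selection and hierarchical modelling that the study applied.

    // Minimal sketch of single-response PLS (PLS1) calibration with Armadillo;
    // a generic NIPALS-style algorithm, not the refined models from the paper.
    #include <armadillo>

    // X: samples x wavenumbers absorbance matrix (mean-centred).
    // y: methamphetamine percentage per sample (mean-centred).
    // a: number of latent variables.
    // Returns regression coefficients b so that y_hat = X_new_centred * b.
    arma::vec pls1_coefficients(arma::mat X, arma::vec y, arma::uword a) {
        const arma::uword p = X.n_cols;
        arma::mat W(p, a), P(p, a);
        arma::vec q(a);

        for (arma::uword k = 0; k < a; ++k) {
            arma::vec w = arma::normalise(X.t() * y);   // weight vector
            arma::vec t = X * w;                        // scores
            const double tt = arma::dot(t, t);
            arma::vec pk = X.t() * t / tt;              // X loadings
            const double qk = arma::dot(y, t) / tt;     // y loading

            X -= t * pk.t();                            // deflate X
            y -= qk * t;                                // deflate y

            W.col(k) = w;  P.col(k) = pk;  q(k) = qk;
        }
        // Fold the weights and loadings into ordinary regression coefficients.
        return W * arma::solve(P.t() * W, q);
    }

Figures such as the Root Mean Square Error of Prediction and R2 quoted above would then be obtained by applying the coefficients to an independent set of spectra and comparing predicted with known methamphetamine percentages.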
Abstract:
The article focuses on the evidence-based information practice (EBIP) applied at the Auraria Library in Denver, Colorado during the reorganization of its technical services division. Collaboration processes were established for the technical services division through the reorganization and redefinition of workflows. Several factors formed part of the redefinition of roles, including personal interests, department needs, and library needs. A collaborative EBIP environment was created in the division by addressing issues of workplace hierarchies, distributing problem solving, and encouraging reflective dialogue.
Abstract:
According to Australian Job Search, just 14% of librarians are under the age of 35. For a Generation Y librarian, flexibility is a key factor in surviving in the Baby Boomer library and in overcoming employment, promotion and, in particular, stereotype barriers. This paper draws upon generational and library workforce research, coupled with industry experience, to provide practical advice and strategies for breaking through both personal and professional barriers for the Generation Y librarian in the Baby Boomer library world. The industry perspective draws on my personal experience of working in public, education and special libraries since graduating in 2005, and on the barriers I have faced and the methods I have used to break through them. In my previous position as Teaching and Learning Librarian at Northern Melbourne Institute of TAFE, I was the only one of 35 library staff under the age of 30; in addition, I was the youngest member of the Library Management Team by 20 years, making me a clear example of the Generation Y librarian within a Baby Boomer environment. This experience provides the platform for exploring strategies for understanding and overcoming ageist ideas, generational stereotypes, and employment barriers. The need to develop sound industry knowledge for survival within the library world is also discussed.
Abstract:
Learning is most effective when it is intrinsically motivated through personal interest and situated in a supportive socio-cultural context. This paper reports on findings from a study that explored implications for the design of interactive learning environments through 18 months of ethnographic observations of people’s interactions at “Hack The Evening” (HTE). HTE is a meetup group initiated at the State Library of Queensland in Brisbane, Australia, dedicated to providing visitors with opportunities for connected learning in relation to hacking, making and do-it-yourself technology. The results provide insights into the factors that contributed to HTE as a social, interactive and participatory environment for learning – knowledge is created and co-created through uncoordinated interactions among participants who come from a diversity of backgrounds, skills and areas of expertise. The insights also reveal challenges and barriers that the HTE group faced in regard to connected learning. Four dimensions of design opportunities are presented to overcome those challenges and barriers and to improve connected learning in library buildings and other free-choice learning environments that seek to embody a more interactive and participatory culture among their users. The insights are relevant for librarians as well as designers, managers and decision makers of other interactive and free-choice learning environments.
Abstract:
This study of English Coronial practice raises a number of questions, not only regarding state investigations of suicide, but also regarding the role of the Coroner itself. Following observations at over 20 inquests into possible suicides, and in-depth interviews with six Coroners, three main issues emerged: first, there exists considerable slippage between different Coroners over which deaths are likely to be classified as suicide; second, the high standard of proof required, and the immense pressure faced by Coroners from family members at inquest to reach any verdict other than suicide, can significantly depress recorded suicide rates; and finally, Coroners feel no professional obligation, either individually or collectively, to contribute to the production of consistent and useful social data regarding suicide—arguably rendering comparative suicide statistics relatively worthless. These issues lead, ultimately, to a more important question about the role we expect Coroners to play within social governance, and within an effective, contemporary democracy.
Abstract:
The research was a qualitative study investigating the lived experiences of teacher librarians as evidence-based practitioners in Australian school libraries. It addressed how teacher librarians understood, applied and implemented evidence-based practice, and investigated what these teacher librarians considered to constitute evidence. Two key critical findings of this research are that evidence-based practice for teacher librarians is a holistic experience, and that evidence for teacher librarians can take many forms, including professional knowledge, observations, statistics, informal feedback and personal reflections. The study is significant for teacher librarians, library and information professionals, schools and school administrators, and the research field.
Abstract:
This paper presents research findings and design strategies that illustrate how digital technology can be applied as a tool for hybrid placemaking in ways that would not be possible in purely digital or physical space. Digital technology has revolutionised the way people learn and gather new information. This trend has challenged the role of the library as a physical place, as well as the interplay of the digital and physical aspects of the library. The paper provides an overview of how the penetration of digital technology into everyday life has affected the library as a place, both as designed by placemakers and as perceived by library users. It then identifies a gap in current library research about the use of digital technology as a tool for placemaking, and reports results from a study of Gelatine – a custom-built user check-in system that displays real-time user information on a set of public screens. Gelatine and its evaluation at The Edge, at the State Library of Queensland, illustrate how combining the affordances of social, spatial and digital space can improve the connected learning experience among on-site visitors. Future design strategies involving gamifying the user experience in libraries, based on Gelatine’s infrastructure, are also described. The presented design ideas and concepts are relevant for managers and designers of libraries as well as other informal, social learning environments.