171 results for Information Science
Abstract:
Major Web search engines, such as AltaVista, are essential tools in the quest to locate online information. This article reports research that used transaction log analysis to examine the characteristics of and changes in AltaVista Web searching from 1998 to 2002. The research questions we examined are (1) What are the changes in AltaVista Web searching from 1998 to 2002? (2) What are the current characteristics of AltaVista searching, including the duration and frequency of search sessions? (3) What changes in the information needs of AltaVista users occurred between 1998 and 2002? The results of our research show (1) a move toward more interactivity, with increases in session and query length, (2) an increase in the frequency of interaction, although interactions happen very quickly, with 70% of session durations at 5 minutes or less, and (3) a broadening range of Web searchers' information needs, with the most frequent terms accounting for less than 1% of total term usage. We discuss the implications of these findings for the development of Web search engines. © 2005 Wiley Periodicals, Inc.
Abstract:
Metasearch engines are an intuitive method for improving the performance of Web search by increasing coverage, returning large numbers of results with a focus on relevance, and presenting alternative views of information needs. However, the use of metasearch engines in an operational environment is not well understood. In this study, we investigate the usage of Dogpile.com, a major Web metasearch engine, with the aim of discovering how Web searchers interact with metasearch engines. We report results examining 2,465,145 interactions from 534,507 users of Dogpile.com on May 6, 2005, and compare these results with findings from other Web searching studies. We collect data on geographical location of searchers, use of system feedback, content selection, sessions, queries, and term usage. Findings show that Dogpile.com searchers are mainly from the USA (84% of searchers), use about 3 terms per query (mean = 2.85), implement system feedback moderately (8.4% of users), and generally (56% of users) spend less than one minute interacting with the Web search engine. Overall, metasearchers seem to have higher degrees of interaction than searchers on non-metasearch engines, but their sessions are shorter. These aspects of metasearching may be what distinguish it from other forms of Web searching. We discuss the implications of our findings in relation to metasearch for Web searchers, search engines, and content providers.
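The abstract does not describe the analysis pipeline itself, but the reported metrics (mean terms per query, the share of very short sessions) are the kind that drop out of a simple pass over a transaction log. The sketch below is a minimal illustration in Python; the record layout (user_id, timestamp, query) and the treatment of each user's interactions as a single session are assumptions for illustration, not the Dogpile.com schema or the study's actual method.

```python
# Minimal sketch of transaction-log metrics of the kind reported above
# (mean terms per query, share of sessions under one minute).
# The (user_id, timestamp, query) layout is an assumed, simplified format.
from collections import defaultdict
from statistics import mean

def log_metrics(records):
    """records: iterable of (user_id, timestamp_seconds, query_string)."""
    terms_per_query = []
    per_user_times = defaultdict(list)     # user_id -> timestamps of interactions
    for user_id, ts, query in records:
        terms_per_query.append(len(query.split()))
        per_user_times[user_id].append(ts)
    # Simplification: treat each user's interactions as one session.
    durations = [max(times) - min(times) for times in per_user_times.values()]
    under_a_minute = sum(d < 60 for d in durations) / len(durations)
    return mean(terms_per_query), under_a_minute

records = [("u1", 0, "metasearch engines"), ("u1", 40, "dogpile usage study"),
           ("u2", 10, "web search behaviour analysis")]
mean_terms, short_share = log_metrics(records)
print(f"mean terms/query = {mean_terms:.2f}, sessions < 1 min = {short_share:.0%}")
```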
Abstract:
Detecting query reformulations within a session by a Web searcher is an important area of research for designing more helpful searching systems and targeting content to particular users. Methods explored by other researchers include both qualitative approaches (i.e., the use of human judges to manually analyze query patterns, usually on small samples) and nondeterministic algorithms, typically using large amounts of training data to predict query modification during sessions. In this article, we explore three alternative methods for detection of session boundaries. All three methods are computationally straightforward and therefore easily implemented for detection of session changes. We examine 2,465,145 interactions from 534,507 users of Dogpile.com on May 6, 2005. We compare session analysis using (a) Internet Protocol address and cookie; (b) Internet Protocol address, cookie, and a temporal limit on intrasession interactions; and (c) Internet Protocol address, cookie, and query reformulation patterns. Overall, our analysis shows that defining sessions by query reformulation along with Internet Protocol address and cookie provides the best measure, resulting in an 82% increase in the count of sessions. Regardless of the method used, the mean session length was fewer than three queries, and the mean session duration was less than 30 minutes. Searchers most often modified their query by changing query terms (nearly 23% of all query modifications) rather than adding or deleting terms. The implication is that, for measuring searching traffic, unique sessions may be a better indicator than the common metric of unique visitors. This research also sheds light on the more complex aspects of Web searching involving query modifications and may lead to advances in searching tools.
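As a rough illustration of how the three session definitions compared above might be implemented, the sketch below segments a time-ordered interaction log. The temporal cutoff value and the "no shared terms" reformulation test are illustrative assumptions, not the paper's exact rules.

```python
# Hedged sketch of the three session definitions: (a) IP + cookie,
# (b) IP + cookie + temporal limit, (c) IP + cookie + query reformulation.
def split_sessions(interactions, temporal_limit=None, use_reformulation=False):
    """interactions: list of (ip, cookie, timestamp_seconds, query), time-ordered."""
    sessions = []
    current = []
    for ip, cookie, ts, query in interactions:
        new_session = False
        if current:
            prev_ip, prev_cookie, prev_ts, prev_query = current[-1]
            if (ip, cookie) != (prev_ip, prev_cookie):
                new_session = True                 # (a) different searcher identity
            elif temporal_limit and ts - prev_ts > temporal_limit:
                new_session = True                 # (b) gap exceeds the temporal limit
            elif use_reformulation and not (set(query.split()) & set(prev_query.split())):
                new_session = True                 # (c) no shared terms: assume a new need
        if new_session:
            sessions.append(current)
            current = []
        current.append((ip, cookie, ts, query))
    if current:
        sessions.append(current)
    return sessions
```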
Abstract:
Internet and computer addiction has been a popular research area since the 1990s. Studies on Internet and computer addiction have usually been conducted in the US, and the investigation of computer and Internet addiction in different countries is an interesting area of research. This study investigates computer and Internet addiction among teenagers and Internet cafe visitors in Turkey. We administered a survey to 983 visitors of Internet cafes. The results show that Internet cafe visitors are usually teenagers, mostly middle and high-school students, and are usually occupied with computer and Internet applications such as chat, e-mail, browsing and games. The teenagers come to the Internet cafe to spend time with friends and the computers. In addition, about 30% of cafe visitors admit to having an Internet addiction, and about 20% specifically mention the problems they are having with the Internet. It is rather alarming to consider the types of activities the teenagers are performing in an Internet cafe, their reasons for being there, the degree of self-awareness about Internet addiction, and the lack of control over applications in the cafe.
Abstract:
Time-varying bispectra, computed using a classical sliding-window short-time Fourier approach, are analyzed for scalp EEG potentials evoked by an auditory stimulus, and new observations are presented. A single, short-duration tone is presented from the left or the right, with the direction unknown to the test subject. The subject responds by moving the eyes in the direction of the sound. EEG epochs sampled at 200 Hz for repeated trials are processed between -70 ms and +1200 ms with reference to the stimulus. It is observed that, for an ensemble of correctly recognized cases, the best-matching time-varying bispectra at (8 Hz, 8 Hz) are for the PZ-FZ channels; this is also largely the case for grand averages, but not for power spectra at 8 Hz. Out of 11 subjects, the only exception for the time-varying bispectral match was a subject with a family history of Alzheimer's disease, and the difference was in bicoherence, not biphase.
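For reference, the quantities named in the abstract can be written down from the standard definitions: the sliding-window bispectrum at a bifrequency (f1, f2) is estimated from the short-time Fourier transform X_t(f) of each windowed epoch, with the expectation taken over trials. The paper's exact window length and normalization are not stated in the abstract, so the bicoherence below uses one common normalization.

```latex
% Standard sliding-window (time-varying) bispectrum and one common
% bicoherence normalization; the paper's exact choices are assumptions here.
B_t(f_1, f_2) = \mathbb{E}\!\left[ X_t(f_1)\, X_t(f_2)\, X_t^{*}(f_1 + f_2) \right]
\qquad
b_t(f_1, f_2) = \frac{\lvert B_t(f_1, f_2) \rvert}
{\sqrt{\mathbb{E}\!\left[\lvert X_t(f_1)\, X_t(f_2)\rvert^{2}\right]\,
       \mathbb{E}\!\left[\lvert X_t(f_1 + f_2)\rvert^{2}\right]}}
```

The biphase discussed for the subject with a family history of Alzheimer's disease is the argument (phase angle) of B_t(f_1, f_2).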
Abstract:
Fusion techniques have received considerable attention for achieving performance improvement with biometrics. While a multi-sample fusion architecture reduces false rejects, it also increases false accepts. This impact on performance also depends on the nature of subsequent attempts, i.e., random or adaptive. Expressions for error rates are presented and experimentally evaluated in this work by considering the multi-sample fusion architecture for text-dependent speaker verification using HMM-based, digit-dependent speaker models. Analysis incorporating correlation modeling demonstrates that the use of adaptive samples improves overall fusion performance compared to randomly repeated samples. For a text-dependent speaker verification system using digit strings, sequential decision fusion of seven instances with three random samples is shown to reduce the overall error of the verification system by 26%, which can be reduced by a further 6% with adaptive samples. This analysis, novel in its treatment of random and adaptive multiple presentations within a sequential fused decision architecture, is also applicable to other biometric modalities such as fingerprints and handwriting samples.
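The qualitative claim that multi-sample fusion trades false rejects for false accepts can be seen from the error rates of an "accept if any of the n samples matches" rule under the simplifying assumption of independent attempts with identical per-attempt error rates; the paper's correlation modelling for random versus adaptive repeats modifies these idealized expressions.

```latex
% Accept-on-any-match rule over n samples, assuming independent attempts
% with identical per-attempt error rates (an idealization, not the paper's model).
\mathrm{FRR}_n = \prod_{i=1}^{n} \mathrm{FRR}_i = \mathrm{FRR}_1^{\,n},
\qquad
\mathrm{FAR}_n = 1 - \prod_{i=1}^{n} \bigl(1 - \mathrm{FAR}_i\bigr) = 1 - \bigl(1 - \mathrm{FAR}_1\bigr)^{n}
```

Increasing n drives the false reject rate down geometrically while the false accept rate grows, which is the trade-off the sequential fusion architecture is designed to manage.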
Abstract:
A substantial group of young people experience mental health problems which impact on their educational development and subsequent wellbeing. Of those who do suffer from mental health issues, only a minority seek appropriate professional assistance. This paucity of help-seeking behaviours among young people is a challenge for counsellors. Whereas adults who suffer mental health issues have increasingly turned to the internet for assistance, young people, whose social lives are increasingly dependent on communication technologies, are not catered for as much as adults by online counselling. One small online counselling pilot program, conducted at a Queensland secondary school over three years from 2005 to 2007 (Glasheen & Campbell, 2009), offered anonymous live-time counselling from the school counsellor (via a secure chat room) to students through the school's website. Findings indicated that boys were more likely to use the service than girls. All participants transitioned to face-to-face counselling, and all reported it was beneficial. This pilot study attested to the potential of online counselling. However, school counsellors as a professional group have been hesitant to utilise online counselling as part of their service delivery to young people in schools. This chapter concludes by identifying reasons for this reluctance and possible initiatives to increase online support for young people in schools.
Abstract:
This book chapter examines the concept of team teaching from the perspective of the various stakeholders, in order to discuss the advantages and disadvantages of team teaching for students, to consider the positive and negative dimensions of collaborative teaching for teachers, and to review the implications for educational administration. In addition, attention will be paid to the issues associated with team teaching in the context of e-learning. The chapter concludes with a case study which discusses how the implementation of collaborative teaching within the library and information science discipline at an Australian university helped develop the authors’ understanding of socially constructed knowledge.
Abstract:
Stormwater has been recognised as one of the main culprits of aquatic ecosystem pollution and as a significant threat to the goal of ecological sustainable development. Water sensitive urban design is one of the key responses to the need to better manage urban stormwater runoff, the objectives of which go beyond rapid and efficient conveyance. Underpinned by the concepts of sustainable urban development, water sensitive urban design has proven to be an efficient and environmentally-friendly approach to urban stormwater management, with the necessary technical know-how and skills already available. However, large-scale implementation of water sensitive urban design is still lacking in Australia due to significant impediments and negative perceptions. Identification of the issues, barriers and drivers that affect sustainability outcomes of urban stormwater management is one of the first steps towards encouraging the wide-scale uptake of water sensitive urban design features which integrate sustainable urban stormwater management. This chapter investigates key water sensitive urban design perceptions, drivers and barriers in order to improve sustainable urban stormwater management efforts.
Abstract:
As a result of rapid urbanisation, population growth, changes in lifestyle, pollution and the impacts of climate change, water provision has become a critical challenge for planners and policy-makers. In the wake of increasingly difficult water provision and drought, the notion that freshwater is a finite and vulnerable resource is increasingly being recognised. Many city administrations around the world are struggling to provide water security for their residents to maintain lifestyle and economic growth. This chapter reviews the global challenge of providing freshwater to sustain lifestyles and economic growth, and the contributing challenges of climate change, urbanisation, population growth and problems in rainfall distribution. The chapter proceeds to evaluate major alternatives to current water sources such as conservation, recycling and reclamation, and desalination. Integrated water resource management is briefly examined to explore its role in complementing water provision. A comparative study on alternative resources is undertaken to evaluate their strengths, weaknesses, opportunities and constraints, and the results are discussed.
Abstract:
In recent years, local government infrastructure management practices have evolved from conventional land use planning to more wide-ranging and integrated urban growth and infrastructure management approaches. The roles and responsibilities of local government are no longer simply to manage the daily operational functions of a city and provide basic infrastructure. Local governments are now required to undertake economic planning, manage urban growth, be involved in major infrastructure planning, and even engage in achieving sustainable development objectives. The Brisbane Urban Growth model has proven initially successful in ensuring timely and coordinated delivery of urban infrastructure. This model may be the first step for many local governments toward integrated, sustainable and effective infrastructure management.
Abstract:
Lankes and Silverstein (2006) introduced the "participatory library" and suggested that the nature and form of the library should be explored. In the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little research has been based on empirical data, and such models have had a strong focus on technical aspects but less focus on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for this study. Six librarians were involved in in-depth individual interviews. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, socio-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study's findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.
Abstract:
In many applications of active noise control (ANC), an online secondary path modelling method that uses white noise as a training signal is required to ensure convergence of the system. The modelling accuracy and the convergence rate increase when white noise with larger variance is used; however, a larger variance also increases the residual noise, which decreases the performance of the system. The proposed algorithm exploits the advantages of white noise with larger variance to model the secondary path, but injection is stopped at the optimum point to increase the performance of the system. In this approach, instead of continuous injection of the white noise, a sudden change in the secondary path during operation causes the algorithm to reactivate injection of the white noise and adjust the secondary path estimate. Comparative simulation results shown in this paper indicate the effectiveness of the proposed method.
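The abstract does not give the algorithm's equations, so the following is only a hedged sketch of the general idea: an LMS filter models the secondary path while white noise is injected, and injection is gated off once the modelling error is small enough. The step size, noise variance, stopping threshold, and the simplified measure_error interface are assumptions for illustration.

```python
# Sketch of LMS secondary-path modelling with gated white-noise injection;
# parameter values and the measure_error interface are illustrative assumptions.
import numpy as np

def model_secondary_path(measure_error, taps=16, mu=0.01, var=0.01,
                         stop_threshold=1e-4, n_iter=5000):
    """measure_error(v) returns the error-microphone contribution of the
    injected noise sample v after passing through the real secondary path."""
    rng = np.random.default_rng(0)
    s_hat = np.zeros(taps)                 # secondary-path estimate (FIR taps)
    v_buf = np.zeros(taps)                 # most recent injected noise samples
    e_pow = 1.0                            # smoothed modelling-error power
    for _ in range(n_iter):
        v = rng.normal(0.0, np.sqrt(var))  # larger variance -> faster convergence
        v_buf = np.roll(v_buf, 1)
        v_buf[0] = v
        d = measure_error(v)               # noise filtered by the true path
        e = d - s_hat @ v_buf              # modelling error
        s_hat = s_hat + mu * e * v_buf     # LMS update of the estimate
        e_pow = 0.99 * e_pow + 0.01 * e * e
        if e_pow < stop_threshold:         # model good enough: stop injecting,
            break                          # which removes the residual-noise penalty
    return s_hat
```

In a real ANC system the error microphone also carries residual primary noise and anti-noise, and the reactivation of injection after a detected secondary-path change, which the abstract describes, is omitted here for brevity.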
Abstract:
Purpose – The article aims to review a university course, offered to students in both Australia and Germany, to encourage them to learn about designing, implementing, marketing and evaluating information programs and services in order to build active and engaged communities. The concepts and processes of Web 2.0 technologies come together in the learning activities, with students establishing their own personal learning networks (PLNs). Design/methodology/approach – The case study examines the principles of learning and teaching that underpin the course and presents the students' own experiences of the challenges they faced as they explored the interactive, participative and collaborative dimensions of the web. Findings – The online format of the course and the philosophy of learning through play provided students with a safe and supportive environment in which to move outside of their comfort zones, to be creative, to experiment and to develop their professional personas. Reflection on learning was a key component that stressed the value of reflective practice in assisting library and information science (LIS) professionals to adapt confidently to the rapidly changing work environment. Originality/value – This study provides insights into the opportunities for LIS courses to work across geographical boundaries, to allow students to critically appraise library practice in different contexts and to become active participants in wider professional networks.
Abstract:
Iris-based identity verification is highly reliable, but it can also be subject to attacks. Pupil dilation or constriction stimulated by the application of drugs is an example of a sample presentation security attack that can lead to higher false rejection rates. Suspects on a watch list can potentially circumvent an iris-based system using such methods. This paper investigates a new approach using multiple parts of the iris (instances) and multiple iris samples in a sequential decision fusion framework that can yield robust performance. Results are presented and compared with the standard full-iris approach for a number of iris degradations. An advantage of the proposed fusion scheme is that the trade-off between detection errors can be controlled by setting parameters such as the number of instances and the number of samples used in the system. The system can then be operated to match security threat levels. It is shown that, for optimal values of these parameters, the fused system also has a lower total error rate.
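As a sketch of how the number of instances and samples can act as tuning knobs, the fragment below combines a k-of-n decision over iris parts with up to a fixed number of repeated captures. The matcher interface and the k-of-n rule are hypothetical stand-ins, not the paper's exact fusion scheme.

```python
# Hedged sketch: sequential fusion over iris parts (instances) and repeated
# captures (samples); match_instance and the k-of-n rule are assumptions.
def verify(capture_sample, match_instance, n_instances=4, k_required=3, max_samples=3):
    """capture_sample() -> one iris image; match_instance(img, i) -> bool for part i."""
    for _ in range(max_samples):            # allowing retries lowers false rejects
        image = capture_sample()
        votes = sum(match_instance(image, i) for i in range(n_instances))
        if votes >= k_required:             # enough iris parts agree: accept
            return True
    return False                            # reject after exhausting all samples
```

Raising k_required (or lowering max_samples) pushes the operating point toward fewer false accepts at the cost of more false rejects, which is how such a scheme could be matched to different security threat levels.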