892 results for information science
Abstract:
User-Web interactions have emerged as an important research area in the field of information science. In this study, we examine in depth the Web searching performed by general users. Our goal is to investigate the effects of users’ cognitive styles on their Web search behavior in relation to two broad components: Information Searching and Information Processing Approaches. We use questionnaires, a measure of cognitive style, Web session logs and think-aloud protocols as the data collection instruments. Our findings show that wholistic Web users tend to adopt a top-down approach to Web searching, in which they search for a generic topic and then reformulate their queries to find specific information; they tend to prefer reading to process information. Analytic users tend to prefer a bottom-up approach to information searching, and they process information by scanning search result pages.
Abstract:
This chapter investigates the challenges and opportunities associated with planning for a competitive city. The chapter is based on the assumption that a healthy city is a fundamental prerequisite for a competitive city. Thus, it is critical to examine the local determinants of health and factor these into any planning efforts. The main focus of the chapter is on the role of e-health planning, utilising web-based geographic decision support systems. The proposed novel decision support system would provide a powerful and effective platform for stakeholders to access essential data for decision-making purposes. The chapter also highlights the need for a comprehensive information framework to guide the process of planning for healthy cities. Additionally, it discusses the prospects and constraints of such an approach. In summary, this chapter outlines the potential insights of using an information science-based framework and suggests practical planning methods as part of a broader e-health approach for improving the health characteristics of competitive cities.
Abstract:
Australia is just one of many developed countries facing the challenge of delivering value for money in the provision of a substantial infrastructure pipeline amidst severe construction and private finance constraints. To help address this challenge, this chapter focuses on developing an understanding of the determinants of value at key procurement decision points, which range from the make-or-buy decision to buying in the context of market structures, including the exchange relationship and contractual arrangement decisions. This understanding is based on theoretical pluralism and is illustrated by research in the field of construction and maintenance, and in public-private partnerships.
Abstract:
Delivering infrastructure projects involves many stakeholders. Their responsibilities and authorities vary over the course of the project lifecycle - from establishing the project parameters and performance requirements, to operating and maintaining the completed infrastructure. To ensure the successful delivery of infrastructure projects, it is important for the project management team to identify and manage the stakeholders and their requirements. This chapter discusses the management of stakeholders in delivering infrastructure projects, from their conception to completion. It includes managing the stakeholders for project selection and involving them to improve project constructability, operability and maintainability.
Abstract:
The impact of Web 2.0 and social networking tools, such as virtual communities, on education has been much commented on. The challenge for teachers is to embrace these new social networking tools and apply them to new educational contexts. Increasingly digitally-abled student cohorts and the need for educational applications of Web 2.0 are challenges that overwhelm many educators. This chapter will make three important contributions. Firstly, it will explore the characteristics and behaviours of digitally-abled students enrolled in higher education. An innovation of this chapter will be the application of Bourdieu’s notions of capital, particularly social, cultural and digital capital, to understanding these characteristics. Secondly, it will present a possible use of a commonly used virtual community, Facebook©. Finally, it will offer some advice for educators who are interested in using popular social networking communities, similar to Facebook©, in their teaching and learning.
Abstract:
It is possible for the visual attention characteristics of a person to be exploited as a biometric for authentication or identification of individual viewers. The visual attention characteristics of a person can be easily monitored by tracking the gaze of a viewer during the presentation of a known or unknown visual scene. The positions and sequences of gaze locations during viewing may be determined by overt (conscious) or covert (sub-conscious) viewing behaviour. This paper presents a method to authenticate individuals using their covert viewing behaviour, thus yielding a unique behavioural biometric. A method to quantify the spatial and temporal patterns established by the viewer through their covert behaviour is proposed, utilising a principal component analysis technique called `eigenGaze'. Experimental results suggest that it is possible to capture the unique visual attention characteristics of a person to provide a simple behavioural biometric.
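The abstract does not spell out the `eigenGaze' computation, but a PCA-over-scanpaths scheme can be sketched in the style of eigenfaces: flatten each viewer's fixation sequence into a vector, extract principal components, and compare probe and enrolled templates in the reduced space. The function names, the Euclidean distance measure and the threshold below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def eigengaze_fit(gaze_matrix, n_components=2):
    """Fit an illustrative 'eigenGaze' basis: PCA over flattened scanpaths.

    gaze_matrix: (n_viewers, n_features) array, each row a flattened
    sequence of fixation coordinates for one viewing session.
    """
    mean = gaze_matrix.mean(axis=0)
    centered = gaze_matrix - mean
    # Principal components are the right singular vectors of the centered data.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def eigengaze_project(sample, mean, components):
    """Project one flattened scanpath onto the eigenGaze basis."""
    return components @ (sample - mean)

def authenticate(probe, enrolled, mean, components, threshold=1.0):
    """Accept the probe if it lies within `threshold` (an assumed
    Euclidean cutoff) of the enrolled template in eigenGaze space."""
    d = np.linalg.norm(eigengaze_project(probe, mean, components)
                       - eigengaze_project(enrolled, mean, components))
    return d < threshold
```

In this sketch a viewer is enrolled by storing one projected template; a genuine probe from the same viewer should project nearby, while an impostor's covert viewing pattern should not.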
Abstract:
This article examines social, cultural and technological change in the systems and economies of educational information management. Since the Sumerians first collected, organized and supervised administrative and religious records some six millennia ago, libraries have been key physical depositories and cultural signifiers in the production and mediation of social capital and power through education. To date, the textual, archival and discursive practices perpetuating libraries have remained exempt from inquiry. My aim here is to remedy this hiatus by making the library itself the terrain and object of critical analysis and investigation. The paper argues that in the three dominant communications eras—namely, oral, print and digital cultures—society’s centres of knowledge and learning have resided in the ceremony, the library and the cybrary respectively. In a broad-brush historical grid, each of these key educational institutions—the ceremony in oral culture, the library in print culture and the cybrary in digital culture—is mapped against the social, cultural and technological orders pertaining to its era. Following a description of these shifts in society’s collective cultural memory, the paper then examines what the development of global information systems and economies means for the schools and libraries of today, and for teachers and learners as knowledge consumers and producers.
Abstract:
Teachers are under increasing pressure from government and school management to incorporate technology into lessons. They need to consider which technologies can most effectively enhance subject learning, encourage higher order thinking skills and support the performance of authentic tasks. This chapter reviews the practical and theoretical tools that have been developed to aid teachers in selecting software and reviews the software assessment methodologies from the 1980s to the present day. It concludes that teachers need guidance to structure the evaluation of technology, to consider its educational affordances, its usability, its suitability for the students and the classroom environment and its fit to the teachers’ preferred pedagogies.
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time steps, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
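To make the SSA baseline concrete, here is a minimal direct-method Gillespie simulation of a toy birth-death system (∅ → A at a constant rate, A → ∅ proportionally to the copy number). The reaction network and rate values are illustrative assumptions chosen for simplicity; they are not the systems studied in the chapter.

```python
import math
import random

def gillespie_ssa(x0, k_birth, k_death, t_max, seed=0):
    """Direct-method Gillespie SSA for the birth-death system
    ∅ -> A (propensity k_birth) and A -> ∅ (propensity k_death * x).
    Returns the piecewise-constant path as (time, count) pairs."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]
    while t < t_max:
        a1 = k_birth          # birth propensity
        a2 = k_death * x      # death propensity (zero when x == 0)
        a0 = a1 + a2
        if a0 == 0.0:
            break             # no reaction can fire
        # Exponential waiting time to the next reaction.
        t += -math.log(1.0 - rng.random()) / a0
        if t > t_max:
            break
        # Pick which reaction fires, proportionally to its propensity.
        x = x + 1 if rng.random() * a0 < a1 else x - 1
        path.append((t, x))
    return path
```

Because each iteration advances by one reaction event, the step size shrinks as propensities grow, which is exactly the inefficiency that τ-leap and the SDE-based methods discussed above aim to avoid.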
Abstract:
The presence of High Speed Rail (HSR) systems influences the market shares of road and air transport, and the development of the cities and regions they serve. With the deployment of HSR infrastructure, changes in accessibility have occurred, and these changes have led researchers to investigate their effects on derived economic and spatial variables. Contention exists in managing the trade-off between efficiency and access, since access points are usually hundreds of kilometres apart. In short, it is argued that intermediate cities bypassed by HSR services suffer a decline in their accessibility and developmental opportunities. The present chapter will analyse possible impacts derived from the presence of HSR infrastructure. In particular, it will consider small and medium agglomerations in the vicinity of HSR corridors that are not always served by HSR stations. Thus, a methodology is developed to quantify accessibility benefits and their distribution. These benefits will be investigated in relation to different rail transit strategies integrating HSR infrastructure where an HSR station cannot be positioned. These strategies are selected principally by the type of service offered: (i) cadenced, (ii) express, (iii) frequent or (iv) non-stopping. Furthermore, to ground the theoretical approach linking accessibility and competitiveness, a case study in the north-eastern Italian regions will be used to apply the accessibility distributive patterns between the HSR infrastructure and the selected strategies. Results indicate that benefits derive from well-informed decisions on HSR station positioning and an appropriate blend of complementary services across the whole region to interface with the HSR infrastructure. The results are significant for all countries in Europe and worldwide, not only for investing in HSR infrastructure, but above all in terms of building territorial cohesion while seeking international recognition for developing successful new technology and systems.
Abstract:
Recent studies on automatic new topic identification in Web search engine user sessions have demonstrated that neural networks are successful at automatic new topic identification. However, most of this work applied new topic identification algorithms to data logs from a single search engine. In this study, we investigate whether the application of neural networks to automatic new topic identification is more successful on some search engines than on others. Sample data logs from the Norwegian search engine FAST (currently owned by Overture) and from Excite are used in this study. Findings of this study suggest that query logs with more topic shifts tend to provide more successful results on shift-based performance measures, whereas logs with more topic continuations tend to provide better results on continuation-based performance measures.
Abstract:
Major Web search engines, such as AltaVista, are essential tools in the quest to locate online information. This article reports research that used transaction log analysis to examine the characteristics and changes in AltaVista Web searching that occurred from 1998 to 2002. The research questions we examined are (1) What are the changes in AltaVista Web searching from 1998 to 2002? (2) What are the current characteristics of AltaVista searching, including the duration and frequency of search sessions? (3) What changes in the information needs of AltaVista users occurred between 1998 and 2002? The results of our research show (1) a move toward more interactivity, with increases in session and query length; (2) that, with 70% of session durations at 5 minutes or less, the frequency of interaction is increasing but occurs very quickly; and (3) a broadening range of Web searchers' information needs, with the most frequent terms accounting for less than 1% of total term usage. We discuss the implications of these findings for the development of Web search engines. © 2005 Wiley Periodicals, Inc.
Abstract:
Metasearch engines are an intuitive method for improving the performance of Web search by increasing coverage, returning large numbers of results with a focus on relevance, and presenting alternative views of information needs. However, the use of metasearch engines in an operational environment is not well understood. In this study, we investigate the usage of Dogpile.com, a major Web metasearch engine, with the aim of discovering how Web searchers interact with metasearch engines. We report results examining 2,465,145 interactions from 534,507 users of Dogpile.com on May 6, 2005 and compare these results with findings from other Web searching studies. We collect data on the geographical location of searchers, use of system feedback, content selection, sessions, queries, and term usage. Findings show that Dogpile.com searchers are mainly from the USA (84% of searchers), use about 3 terms per query (mean = 2.85), make moderate use of system feedback (8.4% of users), and generally (56% of users) spend less than one minute interacting with the Web search engine. Overall, metasearchers seem to have higher degrees of interaction than searchers on non-metasearch engines, but their sessions are shorter. These aspects of metasearching may be what define the differences from other forms of Web searching. We discuss the implications of our findings in relation to metasearch for Web searchers, search engines, and content providers.
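Descriptive measures of the kind reported in these transaction-log studies (mean terms per query, the share of usage taken by the most frequent terms) reduce to a few lines of counting, assuming the log has been reduced to a list of query strings. The function names and the toy log below are illustrative, not part of the studies.

```python
from collections import Counter
from statistics import mean

def query_stats(queries):
    """Terms-per-query statistics, e.g. the mean terms per query
    reported in transaction log analyses."""
    lengths = [len(q.split()) for q in queries]
    return {"queries": len(lengths), "mean_terms": mean(lengths)}

def top_term_share(queries, k=1):
    """Fraction of total term usage accounted for by the k most
    frequent terms (cf. findings that top terms account for a
    small share of total usage)."""
    counts = Counter(t for q in queries for t in q.lower().split())
    total = sum(counts.values())
    return sum(c for _, c in counts.most_common(k)) / total
```

For example, for the log `["dog", "dog breeds", "small dog breeds"]` the mean query length is 2.0 terms and the single most frequent term ("dog") accounts for half of all term occurrences.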
Abstract:
Detecting query reformulations within a session by a Web searcher is an important area of research for designing more helpful searching systems and targeting content to particular users. Methods explored by other researchers include both qualitative (i.e., the use of human judges to manually analyze query patterns on usually small samples) and nondeterministic algorithms, typically using large amounts of training data to predict query modification during sessions. In this article, we explore three alternative methods for detection of session boundaries. All three methods are computationally straightforward and therefore easily implemented for detection of session changes. We examine 2,465,145 interactions from 534,507 users of Dogpile.com on May 6, 2005. We compare session analysis using (a) Internet Protocol address and cookie; (b) Internet Protocol address, cookie, and a temporal limit on intrasession interactions; and (c) Internet Protocol address, cookie, and query reformulation patterns. Overall, our analysis shows that defining sessions by query reformulation along with Internet Protocol address and cookie provides the best measure, resulting in an 82% increase in the count of sessions. Regardless of the method used, the mean session length was fewer than three queries, and the mean session duration was less than 30 min. Searchers most often modified their query by changing query terms (nearly 23% of all query modifications) rather than adding or deleting terms. Implications are that for measuring searching traffic, unique sessions may be a better indicator than the common metric of unique visitors. This research also sheds light on the more complex aspects of Web searching involving query modifications and may lead to advances in searching tools.