136 results for Depth, reference


Relevance:

20.00%

Publisher:

Abstract:

In late 2006, the National Library of Australia implemented a trial Instant Messaging (IM) service that ran in parallel with the AskNow chat reference service for a six-month period. The trial was a resounding success, demonstrating both a demand for an IM service and the suitability of the medium for reference service provision in a collaborative environment. It also allowed the collection of a significant body of data on user expectations, librarian experience and the nature of enquiries. This article begins by introducing the concept of IM and discussing the impetus for its use as a channel for reference service provision. It presents and analyses data collected from user surveys, session transcripts, usage statistics, staff surveys and other staff feedback mechanisms, and explores the issues arising from the data analysis. The article concludes by discussing the IM system architecture that the NLA is currently developing, which will allow the Library to move forward with an ongoing IM service.

Relevance:

20.00%

Publisher:

Abstract:

The Web has become a worldwide repository of information which individuals, companies, and organizations utilize to solve or address various information problems. Many of these Web users employ automated agents to gather this information for them, an approach sometimes assumed to represent a more sophisticated method of searching. However, there is little research investigating how Web agents search for online information. In this research, we first provide a classification for information agents based on stages of information gathering, gathering approaches, and agent architecture. We then examine an implementation of one of the resulting classifications in detail, investigating how agents search for information on Web search engines, including the sessions, queries, terms, and the duration and frequency of interactions. For this temporal study, we analyzed three data sets of queries and page views from agents interacting with the Excite and AltaVista search engines from 1997 to 2002, examining approximately 900,000 queries submitted by over 3,000 agents. Findings include: (1) agent sessions are extremely interactive, sometimes with hundreds of interactions per second; (2) agent queries are comparable to those of human searchers, with little use of query operators; (3) Web agents search for a relatively limited variety of information, with only 18% of the terms used being unique; and (4) the duration of agent-Web search engine interaction typically spans several hours. We discuss the implications for Web information agents and search engines.
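The term- and session-level measurements described above can be sketched in a few lines. The log format and the records below are hypothetical stand-ins for the Excite/AltaVista transaction logs, not the study's actual data:

```python
from collections import Counter

# Hypothetical log: (agent_id, timestamp_seconds, query_string) tuples,
# a simplified stand-in for a search-engine transaction log.
log = [
    ("agent-1", 0.0, "weather forecast sydney"),
    ("agent-1", 0.2, "weather forecast melbourne"),
    ("agent-1", 0.4, "weather forecast brisbane"),
    ("agent-2", 10.0, "stock price ibm"),
]

def term_stats(log):
    """Fraction of unique terms, and per-agent session duration in seconds."""
    terms = [t for _, _, q in log for t in q.split()]
    unique_ratio = len(set(terms)) / len(terms)
    # Session duration approximated as last-seen minus first-seen timestamp.
    sessions = {}
    for agent, ts, _ in log:
        lo, hi = sessions.get(agent, (ts, ts))
        sessions[agent] = (min(lo, ts), max(hi, ts))
    durations = {a: hi - lo for a, (lo, hi) in sessions.items()}
    return unique_ratio, durations

ratio, durations = term_stats(log)
```

The same pass over a real log would also count queries per second within a session, which is how burst rates such as "hundreds of interactions per second" can be measured.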

Relevance:

20.00%

Publisher:

Abstract:

Object segmentation is one of the fundamental steps for a number of robotic applications such as manipulation, object detection, and obstacle avoidance. This paper proposes a visual method for incorporating colour and depth information from sequential multi-view stereo images to segment objects of interest from complex and cluttered environments. Rather than segmenting objects using information from a single frame in the sequence, we incorporate information from neighbouring views to increase the reliability of the information and improve the overall segmentation result. Specifically, dense depth information of a scene is computed using multiple-view stereo. Depths from neighbouring views are reprojected into the reference frame to be segmented, compensating for imperfect depth computations for individual frames. The multiple depth layers are then combined with colour information from the reference frame to create a Markov random field that models the segmentation problem. Finally, graph-cut optimisation is employed to infer the pixels belonging to the object to be segmented. The segmentation accuracy is evaluated on images from an outdoor video sequence, demonstrating the viability of automatic object segmentation for mobile robots that use monocular cameras as a primary sensor.
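The reprojection step can be illustrated with a minimal pinhole-camera sketch. The intrinsics `K` and the rigid transform `R`, `t` below are assumed inputs, and the real pipeline would additionally fuse several such depth layers and run MRF/graph-cut inference on top of them:

```python
import numpy as np

def reproject_depth(depth_nbr, K, R, t):
    """Reproject a neighbouring view's depth map into the reference frame.

    depth_nbr : (H, W) depth map of the neighbouring view
    K         : (3, 3) shared camera intrinsics (pinhole model)
    R, t      : rotation and translation taking neighbour-frame points
                into the reference frame (illustrative assumption)
    """
    H, W = depth_nbr.shape
    K_inv = np.linalg.inv(K)
    out = np.full((H, W), np.nan)  # NaN marks pixels with no reprojected depth
    us, vs = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([us.ravel(), vs.ravel(), np.ones(H * W)])  # homogeneous pixels
    # Back-project each pixel to 3-D in the neighbour frame, then transform.
    pts = (K_inv @ pix) * depth_nbr.ravel()
    pts_ref = R @ pts + t[:, None]
    proj = K @ pts_ref
    z = proj[2]
    u = np.round(proj[0] / z).astype(int)
    v = np.round(proj[1] / z).astype(int)
    ok = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    out[v[ok], u[ok]] = z[ok]  # keep reprojected depth (no z-buffering here)
    return out
```

With an identity transform the reprojection reproduces the input depth map, which makes a convenient sanity check.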

Relevance:

20.00%

Publisher:

Abstract:

We present an iterative hierarchical algorithm for multi-view stereo. The algorithm attempts to utilise as much contextual information as is available to compute highly accurate and robust depth maps. There are three novel aspects to the approach: (1) we incrementally improve the depth fidelity as the algorithm progresses through the image pyramid; (2) we show how to incorporate visual hull information (when available) to constrain depth searches; and (3) we show how to simultaneously enforce the consistency of the depth map by continual comparison with neighbouring depth maps. We show that this approach produces highly accurate depth maps and, since it is essentially a local method, is both extremely fast and simple to implement.
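The coarse-to-fine idea, refining depth within a shrinking search band as the algorithm descends the pyramid, can be sketched as below. This is a schematic illustration, not the paper's method: `cost_at` stands in for a real photo-consistency cost, and the visual-hull and neighbour-consistency constraints are omitted:

```python
import numpy as np

def coarse_to_fine_depth(cost_at, shape, levels=3, d_min=1.0, d_max=10.0, samples=8):
    """Schematic coarse-to-fine depth search over an image pyramid.

    cost_at(level, y, x, d) is a placeholder photo-consistency cost
    (lower is better). At each finer level the search band around the
    upsampled parent depth shrinks, mirroring the idea of incrementally
    improving depth fidelity down the pyramid. `shape` must be divisible
    by 2**(levels - 1).
    """
    h, w = shape
    # Initialise the coarsest level with the midpoint of the depth range.
    depth = np.full((h >> (levels - 1), w >> (levels - 1)), (d_min + d_max) / 2)
    radius = (d_max - d_min) / 2
    for level in range(levels - 1, -1, -1):
        H, W = h >> level, w >> level
        if depth.shape != (H, W):
            depth = np.kron(depth, np.ones((2, 2)))[:H, :W]  # nearest-neighbour upsample
        for y in range(H):
            for x in range(W):
                cands = np.linspace(depth[y, x] - radius, depth[y, x] + radius, samples)
                cands = np.clip(cands, d_min, d_max)
                depth[y, x] = min(cands, key=lambda d: cost_at(level, y, x, d))
        radius /= 2  # tighten the search band at each finer level
    return depth
```

Halving the band at each level is what keeps the method local and fast: the per-pixel work stays constant while the effective depth resolution doubles at every pyramid level.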

Relevance:

20.00%

Publisher:

Abstract:

A time-resolved inverse spatially offset Raman spectrometer was constructed for depth profiling of Raman-active substances in both laboratory and field environments. The system's operating principles and performance are discussed, along with its advantages relative to a traditional continuous-wave spatially offset Raman spectrometer. The developed spectrometer uses a combination of space- and time-resolved detection to obtain high-quality Raman spectra from substances hidden behind coloured opaque surface layers, such as plastics and garments, with a single measurement. The time-gated spatially offset Raman spectrometer was successfully used to detect concealed explosives and drug precursors under incandescent and fluorescent background light as well as under daylight. The average screening time was 50 s per measurement. The excitation energy requirements were relatively low (20 mW), which makes the probe safe for screening hazardous substances. The unit was designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than previous picosecond-based systems and providing a functional platform for in-line or in-field sensing of chemical substances.

Relevance:

20.00%

Publisher:

Abstract:

A favourable scaffold for bone tissue engineering should have the desired characteristic features, such as adequate mechanical strength and three-dimensional open porosity, which guarantee a suitable environment for tissue regeneration. The design of such complex structures as bone scaffolds is a challenge for investigators; one of the aims is to achieve the best possible ratio of mechanical strength to degradation rate. In this paper we use numerical modelling to evaluate material properties for designing bone tissue engineering scaffolds fabricated via the fused deposition modelling technique. For our studies the standard genetic algorithm, an efficient method of discrete optimization, was used. For the fused deposition modelling scaffold, each individual strut is scrutinized for its role in the architecture, the structural support it provides, and its contribution to the overall scaffold. The goal of the study was to create a numerical tool that could help to achieve the desired behaviour of tissue-engineered scaffolds, and our results showed that this could be done efficiently by using different materials for individual struts. To represent the great number of ways in which scaffold mechanical function loss could proceed, an exemplary set of different desirable scaffold stiffness-loss functions was chosen. © 2012 John Wiley & Sons, Ltd.
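As a rough illustration of the approach (not the paper's actual mechanical model), the sketch below runs a standard elitist genetic algorithm that assigns one of a few hypothetical materials to each strut, so that the scaffold's overall stiffness-loss curve approaches a desired target curve. The decay rates, target curve, and mean-stiffness fitness are all toy assumptions:

```python
import random

# Toy stand-in for the paper's setting: each strut gets one of three
# hypothetical materials with different stiffness-decay rates, and the GA
# searches for an assignment matching a desired stiffness-loss curve.
DECAY = [0.02, 0.05, 0.10]                            # per-step decay per material
TARGET = [(1 - 0.05) ** t for t in range(10)]         # desired stiffness over time

def stiffness_curve(genome):
    # Scaffold stiffness approximated as the mean of its struts' stiffness.
    return [sum((1 - DECAY[m]) ** t for m in genome) / len(genome) for t in range(10)]

def fitness(genome):
    # Negative squared error against the target curve; higher is better.
    return -sum((c, g) and (c - g) ** 2 for c, g in zip(stiffness_curve(genome), TARGET))

def evolve(n_struts=12, pop_size=30, gens=60, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(DECAY)) for _ in range(n_struts)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                # elitism: keep the top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_struts)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [rng.randrange(len(DECAY)) if rng.random() < p_mut else m
                     for m in child]                  # per-gene mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

In this toy model an all-material-1 genome matches the target exactly, so the GA should drive the error close to zero; in the paper's setting the fitness would come from a structural simulation of the scaffold instead.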

Relevance:

20.00%

Publisher:

Abstract:

Network RTK (Real-Time Kinematic) is a technology based on GPS (Global Positioning System), or more generally GNSS (Global Navigation Satellite System), observations to achieve centimeter-level positioning accuracy in real time. It is enabled by a network of Continuously Operating Reference Stations (CORS). CORS placement is an important problem in the design of network RTK, as it directly affects not only the installation and running costs of the network but also the Quality of Service (QoS) it provides. In our preliminary research on CORS placement, we proposed a polynomial heuristic algorithm for a so-called location-based CORS placement problem. From a computational point of view, location-based CORS placement is a large-scale combinatorial optimization problem; thus, although the heuristic algorithm is efficient in computation time, it may not be able to find an optimal or near-optimal solution. Aiming to improve the quality of solutions, this paper proposes a repairing genetic algorithm (RGA) for the location-based CORS placement problem. The RGA has been implemented and compared to the heuristic algorithm in experiments, and the experimental results show that it produces better-quality solutions than the heuristic algorithm.
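The "repairing" idea can be illustrated with a toy coverage model (the candidate sites, user locations, and distance threshold below are all hypothetical): an infeasible genome, one that leaves some user location uncovered, is repaired by switching on the candidate site nearest to that user before fitness is evaluated:

```python
# Toy sketch of a repair operator for a coverage-style placement problem.
# A genome is a 0/1 list over candidate CORS sites; a user location counts
# as covered if some selected site lies within MAX_DIST of it.
MAX_DIST = 5.0  # hypothetical coverage radius

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def repair(genome, sites, users):
    """Flip bits on until every user is covered; returns a feasible genome."""
    genome = list(genome)
    for u in users:
        covered = any(genome[i] and dist(sites[i], u) <= MAX_DIST
                      for i in range(len(sites)))
        if not covered:
            # Greedily switch on the candidate site nearest the uncovered user.
            best = min(range(len(sites)), key=lambda i: dist(sites[i], u))
            genome[best] = 1
    return genome
```

In an RGA this repair runs after crossover and mutation, so the genetic operators can explore freely while every evaluated individual remains a feasible placement; the real problem's QoS model is of course richer than a single distance threshold.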

Relevance:

20.00%

Publisher:

Abstract:

Description: ‘The second volume of the Handbook on the Knowledge Economy is a worthy companion to the highly successful original volume published in 2005, extending its theoretical depth and developing its coverage. Together the two volumes provide the single best work and reference point for knowledge economy studies. The second volume, with fifteen original essays by renowned scholars in the field, provides insightful and robust analyses of the development potential of the knowledge economy in all its aspects, forms and manifestations.’ – Michael A. Peters, University of Illinois, US

Relevance:

20.00%

Publisher:

Abstract:

This chapter examines why policy decision-makers opt for command-and-control environmental regulation despite the availability of a plethora of market-based instruments that are more efficient and cost-effective. Interestingly, Sri Lanka has adopted a wholly command-and-control system under both its pre- and post-liberalisation economic policies. The chapter first examines the merits and demerits of command-and-control and market-based approaches and then looks at Sri Lanka's extensive environmental regulatory framework. It then examines the likely reasons why the country has gone down the path of inflexible regulatory measures and become entrenched in them; the various hypotheses are discussed and empirical evidence is provided. The chapter also discusses the consequences of an environmentally slack economy and the policy implications of adopting a wholly regulatory approach, and it concludes with a discussion of the main results.