975 results for Virtual Reference Service


Relevance:

30.00%

Publisher:

Abstract:

Interpretation has been used in many tourism sectors as a technique for building harmony between resources and human needs. The objectives of this study are to identify the types of interpretive methods used in marine parks and to evaluate their effectiveness. The study reviews the design principles of effective interpretation for marine wildlife tourism and adopts Orams' (1997) five design principles as a conceptual framework. Increased enjoyment, knowledge gain, attitude and intention change, and behaviour modification were used as key indicators in assessing the interpretive effectiveness of the Vancouver Aquarium (VA) and Marineland Canada (MC). Since on-site research was unavailable, a virtual tour was created to represent the interpretive experiences at the two study sites. Self-administered questionnaires were used to measure responses. By comparing responses to the questionnaires (pre-virtual tour, post-virtual tour, and follow-up), the study found that interpretation increased enjoyment and added to respondents' knowledge. Although the changes in attitudes and intentions were not significant, the findings indicate that attitude and intention changes did occur as a result of interpretation, but only to a limited extent. Overall, the results suggest that more techniques should be added to enhance the effectiveness of interpretation in marine parks and self-guided tours, and that, with careful design, virtual tours are an innovative interpretation technique for marine parks and informal educational facilities.
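The pre-/post-tour questionnaire comparison described above amounts to a paired-samples analysis. A minimal sketch, with entirely invented scores (the abstract reports no raw data), might look like this:

```python
# Hypothetical sketch of a pre-/post-virtual-tour paired comparison.
# All scores below are invented for illustration only.
from statistics import mean, stdev
from math import sqrt

def paired_t(pre, post):
    """Paired-samples t statistic for pre/post questionnaire scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Invented knowledge scores (0-10) before and after the virtual tour.
pre = [4, 5, 3, 6, 4, 5, 4, 6]
post = [6, 7, 5, 7, 6, 6, 5, 8]
t = paired_t(pre, post)
print(round(t, 2))  # a large positive t suggests a knowledge gain
```

The same machinery applied to attitude and intention scores would show the smaller, non-significant changes the abstract reports.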

Relevance:

30.00%

Publisher:

Abstract:

As institutions of higher education struggle to stay relevant, competitive, accessible, and flexible, they are scrambling to attend to a shift in focus for new students: experiential learning. The purpose of this major research paper was to examine the existing structures, identify gaps in experiential learning programs, and devise a framework for moving forward. The specific focus was experiential learning at Brock University in the Faculty of Applied Health Sciences. The methodology was underpinned by cognitive constructivism and appreciative theory. Data collection followed the content analysis steps established by Krippendorff (2004) and Weber (1985). Data analysis drew on LaBoskey's four dimensions of reflection: purpose, context, content, and procedures. The results developed an understanding of the state of formal processes and pathways within service learning. A tool kit was generated that defines service learning and offers an overview of the types of service learning typically employed; it acts as a reference guide for those interested in implementing experiential learning courses. Importantly, the results also provided 10 key points for experiential learning courses, developed by Emily Allan. A flow chart illustrates the connections among the 10 points, and each is then described in full to establish a strategy for the way forward in experiential learning.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Biomedical scientists need to choose among hundreds of publicly available bioinformatics applications, tools, and databases. Librarian challenges include raising awareness of valuable resources, as well as providing support in finding and evaluating specific resources. Our objective is to implement an education program in bioinformatics similar to those offered in other North American academic libraries. Description: Our initial target clientele included four research departments of the Faculty of Medicine at Université de Montréal. In January 2010, I attended two departmental meetings and interviewed a few stakeholders in order to propose a basic bioinformatics service: one-to-one consultations and a workshop on NCBI databases. The response was favourable. The workshop was thus offered once a month during the Winter and Fall semesters, and participants were invited to evaluate it via an online survey. In addition, a bioinformatics subject guide was launched on the library's website in December 2010. Outcomes: One hundred and two participants attended one of the nine NCBI workshops offered in 2010; most were graduate students (74%). The survey's response rate was 54%. A majority of respondents thought that the bioinformatics resources featured in the workshop were relevant (95%) and that the difficulty level of the exercises was appropriate (84%). Respondents also thought that their future information searches would be more efficient (93%) and that the workshop should be integrated into a course (78%). Furthermore, five bioinformatics-related reference questions were answered and two one-to-one consultations with students were performed. Discussion: The success of our bioinformatics service is growing. Future directions include extending the service to other biomedical departments, integrating the workshop into an undergraduate course, promoting the subject guide to other francophone universities, and creating a bioinformatics blog that would feature specific databases, news, and library resources.

Relevance:

30.00%

Publisher:

Abstract:

Information and communication technologies (ICTs) are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, and thus factors such as the timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, the libraries and institutions that manage these resources are invaluable, and Library and Information Centres have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, libraries provide services based on different types of documents, such as manuscripts, printed materials, and digital resources. At the same time, the acquisition, access, processing, and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in assisting users to navigate and analyse tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources.

The research was conducted in the university libraries of Kerala State, India. It was identified that, even though information resources are flooding the world over and several technologies have emerged to manage the situation and provide effective services to clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum level. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the provided services. There are many good examples the world over of the application of ICTs in libraries for the maximization of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study looked into how effectively modern ICTs have been adopted in these libraries for maximizing the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this.

Data were collected from library users (students as well as faculty), library professionals, and university librarians using structured questionnaires. This was supplemented by observation of the working of the libraries; discussions and interviews with the different types of users and staff; and a review of the literature. Personal observations were made of the organizational set-up, management practices, functions, facilities, resources, and utilization of information resources and facilities by the users of the university libraries in Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation, and trend analysis were used to analyse the data.

All the libraries could exploit only very few of the possibilities of modern ICTs, and hence they could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, users as well as professionals are dissatisfied. Major areas needing special attention include: functional effectiveness in the acquisition, access and processing of information resources in various formats; development and maintenance of OPACs and WebOPACs; digital document delivery to remote users; Web-based clearing of library counter services and resources; development of full-text databases, digital libraries and institutional repositories; consortia-based operations for e-journals and databases; user education and information literacy; professional development with stress on ICTs; network administration and website maintenance; and marketing of information. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are some of the major hurdles to reaping the maximum possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering were found suitable for re-structuring and re-defining the operations and service systems of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management systems were too rigid to adopt the principles of change management; hence a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main and satellite campuses of the universities, in affiliated colleges, and at remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. was proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education, and the integrated and planned development of school, college, research and public library systems, were also justified for reaping the maximum benefits of modern ICTs.
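Among the statistical techniques the study lists, the weighted mean is the one most often applied to questionnaire data. A minimal sketch, with invented Likert weights and response counts (the abstract gives no raw figures):

```python
# Minimal sketch of a weighted mean over survey responses; the 5-point
# Likert scale and the response counts below are invented for illustration.
def weighted_mean(values, weights):
    """Weighted mean: sum(w * x) / sum(w)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical 5-point scores (1 = poor ... 5 = excellent) and the number
# of respondents who chose each option.
scores = [1, 2, 3, 4, 5]
counts = [5, 10, 30, 40, 15]
print(weighted_mean(scores, counts))  # → 3.5
```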

Relevance:

30.00%

Publisher:

Abstract:

The service quality of any sector has two major aspects: technical and functional. Technical quality is attained by maintaining the technical specifications decided by the organization. Functional quality refers to the manner in which a service is delivered to the customer, and can be assessed from customer feedback. A field survey was conducted based on the management tool SERVQUAL, with 28 constructs designed under 7 dimensions of service quality. Stratified sampling techniques were used to obtain 336 valid responses, and the gap scores between expectations and perceptions were analysed using statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data for base transceiver stations were collected. Statistical and exploratory techniques were used to model the network performance. The failure patterns were modelled as competing-risk models, and the probability distributions of service outage and restoration were parameterized. Since the availability of a network is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep up their service level agreements on availability should be aware of the variability of these elements and the effects of their interactions. The availability variations were studied by designing a discrete-time event simulation model with probabilistic input parameters. The probability distribution parameters derived from the live data analysis were used to design experiments defining the availability domain of the network under consideration. The availability domain can be used as a reference for planning and implementing maintenance activities. A new metric is proposed that incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for the reliability analysis of mobile communication systems and assumes greater significance in the wake of mobile number portability. It also makes possible a relative measure of the effectiveness of different service providers.
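The SERVQUAL gap analysis mentioned above scores each dimension as perception minus expectation, with the most negative mean gap marking the weakest dimension. A sketch under invented data (dimension names and 7-point scores are illustrative, not the study's):

```python
# Hedged sketch of SERVQUAL gap scoring: gap = perception - expectation,
# averaged per service-quality dimension. Scores below are invented.
def gap_scores(expectations, perceptions):
    """Mean gap (P - E) for each dimension; negative = falling short."""
    gaps = {}
    for dim in expectations:
        diffs = [p - e for p, e in zip(perceptions[dim], expectations[dim])]
        gaps[dim] = sum(diffs) / len(diffs)
    return gaps

# Hypothetical 7-point scores for two of the seven dimensions.
E = {"reliability": [7, 6, 7], "responsiveness": [6, 6, 5]}
P = {"reliability": [5, 5, 6], "responsiveness": [6, 5, 5]}
gaps = gap_scores(E, P)
weakest = min(gaps, key=gaps.get)  # most negative gap = weakest dimension
print(weakest, round(gaps[weakest], 2))
```

Ranking dimensions by mean gap is exactly how the weakest dimension is singled out for managerial attention.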

Relevance:

30.00%

Publisher:

Abstract:

Scientific studies of the materials management systems and practices actually followed in various organizations in India are rather limited. This is particularly true of service industries. In this context, the present study on materials management in state transport undertakings in India, with special reference to the Kerala State Road Transport Corporation (KSRTC), assumes particular significance. The study critically examines the prevailing set-up, procedures and practices of materials management in the KSRTC and compares them with the practices prevailing in other, similar state transport undertakings. It indicates several areas for improvement with respect to organization, materials planning, purchasing, store-keeping and other aspects. It also seeks to develop a comprehensive inventory control system for the KSRTC.

Relevance:

30.00%

Publisher:

Abstract:

As of 1999, the state of Kerala had 3210 offices of scheduled commercial banks (SCBs). In all, 48 commercial banks operate in Kerala, including PSBs, OPBs, NPBs, FBs, and Gramin Banks. The urban areas give a complete picture of the competition in the present-day banking scenario, with all bank groups present. As of March 1995, semi-urban areas of Kerala had 2196 bank offices and urban areas had 593. The study focuses on selected segments of urban customers in Kerala, which are capable of revealing the finer aspects of variation in customer behaviour in the purchase of banking products and services. Considering the exhaustive nature of such an exercise, not all districts in the state have been brought under the purview of the study; instead, the three districts with the largest volume of business in terms of deposits, advances, and number of offices were short-listed as representative regions for a focused study. The study focuses on the retail customer segment and their perceptions of the various products and services offered to them. Non-Resident Indians (NRIs) and the Traders and Small-Scale Industries segments have also been included, with a view to obtaining a comparative picture of perceptions of customer satisfaction, service quality dimensions, and bank choice behaviour. The research is hence confined to customer behaviour and the implications for possible segmentation strategies within the retail customer segment.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QoS) requirements is a complicated open problem. Two distinct options are available: integration and segregation. In the integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QoS requirements must be applied to all services. Link utilization is therefore decreased, because unnecessarily stringent QoS is provided to all connections. With the segregation approach, the problem is much simplified: the different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). Resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
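The convolution approach for a single traffic class can be illustrated as follows: the aggregate bandwidth distribution of N independent on/off sources is the N-fold convolution of each source's two-point distribution, and the PC is the tail mass above the link capacity. This is a generic sketch of the idea, not the paper's algorithm; all rates, counts and the capacity are invented.

```python
# Illustrative convolution approach for probability of congestion (PC):
# convolve per-source bandwidth distributions, then sum the tail mass
# above the VP capacity. Parameters below are invented for illustration.
def convolve(dist_a, dist_b):
    """Convolve two {bandwidth: probability} distributions."""
    out = {}
    for ba, pa in dist_a.items():
        for bb, pb in dist_b.items():
            out[ba + bb] = out.get(ba + bb, 0.0) + pa * pb
    return out

def congestion_probability(n_sources, peak, activity, capacity):
    """P(aggregate demand > capacity) for n identical on/off sources."""
    source = {0: 1.0 - activity, peak: activity}  # off, or on at peak rate
    agg = {0: 1.0}
    for _ in range(n_sources):
        agg = convolve(agg, source)
    return sum(p for bw, p in agg.items() if bw > capacity)

# 10 sources, 2 Mbit/s peak rate, 30% activity, on a 10 Mbit/s VP:
pc = congestion_probability(10, 2, 0.3, 10)
print(round(pc, 4))
```

Repeating the calculation for different peak rates and activity factors shows how burstiness drives the capacity a VP must reserve to meet a target PC.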

Relevance:

30.00%

Publisher:

Abstract:

The presented study examined the opinions of in-service and prospective chemistry teachers about the importance of using molecular and crystal models in secondary-school practice, and investigated some of the reasons for their (non-)usage. The majority of participants stated that the use of models plays an important role in chemistry education and that they would use them more often if circumstances were more favourable. Many teachers claimed that three-dimensional (3D) models are still not available in sufficient numbers at their schools; they also pointed to the lack of available computer facilities during chemistry lessons. The research revealed that, besides the inadequate material circumstances, less than one third of participants are able to use simple (freeware) computer programs for drawing molecular structures and presenting them in virtual space; however, both groups of teachers expressed a willingness to improve their knowledge in this area. The investigation points to several actions that could be undertaken to improve the current situation.

Relevance:

30.00%

Publisher:

Abstract:

Using an immersive virtual reality system, we measured the ability of observers to detect the rotation of an object when its movement was yoked to the observer's own translation. Most subjects had a large bias such that a static object appeared to rotate away from them as they moved. Thresholds for detecting target rotation were similar to those for an equivalent speed discrimination task carried out by static observers, suggesting that visual discrimination is the predominant limiting factor in detecting target rotation. Adding a stable visual reference frame almost eliminated the bias. Varying the viewing distance of the target had little effect, consistent with observers underestimating distance walked. However, accuracy of walking to a briefly presented visual target was high and not consistent with an underestimation of distance walked. We discuss implications for theories of a task-independent representation of visual space. © 2005 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

In a distributed environment, remote entities (usually the producers or consumers of services) need a means to publish their existence so that clients needing their services can search for and find appropriate ones with which they can then interact directly. The publication of information is via a registry service, and the interaction is via a high-level messaging service; typically, separate libraries provide these two services. Tycho is an implementation of a wide-area asynchronous messaging framework with an integrated distributed registry. This frees developers from the need to assemble their applications from a range of potentially diverse middleware offerings, which should simplify and speed application development and, more importantly, allow developers to concentrate on their own domain of expertise. In the first part of the paper we outline our motivation for producing Tycho and review a number of registry and messaging systems popular with the Grid community. In the second part we describe the architecture and implementation of Tycho. In the third part we present and discuss various performance tests undertaken to compare Tycho with similar alternative systems. Finally, we summarise, conclude, and outline future work.
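The register-then-message pattern described above can be sketched in-process: producers publish under a name, clients look the name up and send asynchronously. This toy illustration is invented and is not Tycho's API (Tycho's registry is wide-area and distributed).

```python
# Toy in-process sketch of the registry + asynchronous messaging pattern;
# class and service names are invented, not Tycho's actual interfaces.
import queue

class Registry:
    """Minimal name -> mailbox registry."""
    def __init__(self):
        self._entries = {}

    def publish(self, name):
        mailbox = queue.Queue()          # buffer for asynchronous delivery
        self._entries[name] = mailbox
        return mailbox

    def lookup(self, name):
        return self._entries.get(name)

registry = Registry()
inbox = registry.publish("temperature-service")  # producer announces itself

# A client discovers the service via the registry and sends a message
# without waiting for the producer to be ready to process it.
target = registry.lookup("temperature-service")
target.put({"op": "read", "sensor": "hypothetical-01"})

msg = inbox.get_nowait()                 # producer later drains its mailbox
print(msg["op"])                         # → read
```

Integrating both halves behind one interface is precisely the convenience the abstract claims over assembling separate registry and messaging libraries.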

Relevance:

30.00%

Publisher:

Abstract:

It is reported in the literature that distances from the observer are underestimated more in virtual environments (VEs) than under physical-world conditions. On the other hand, estimation of size in VEs is quite accurate and follows a size-constancy law when rich cues are present. This study investigates how estimation of distance in a CAVE™ environment is affected by poor and rich cue conditions, subject experience, and environmental learning when the position of objects is estimated using an experimental paradigm that exploits size constancy. A group of 18 healthy participants was asked to move a virtual sphere, controlled with the wand joystick, to the position where they thought a previously displayed virtual cube (the stimulus) had appeared. Real-size physical models of the virtual objects were also presented to the participants as a reference of real physical distance during the trials. An accurate estimation of distance implied that the participants assessed the relative sizes of sphere and cube correctly. The cube appeared at depths between 0.6 m and 3 m, measured along the depth direction of the CAVE. The task was carried out in two environments: a poor cue one with limited background cues, and a rich cue one with textured background surfaces. It was found that distances were underestimated in both poor and rich cue conditions, with greater underestimation in the poor cue environment. The analysis also indicated that factors such as subject experience and environmental learning were not influential. However, least-squares fitting of Stevens' power law indicated a high degree of accuracy during the estimation of object locations. This accuracy was higher than in other studies that were not based on a size-estimation paradigm. Thus, as an indirect result, this study appears to show that accuracy when estimating egocentric distances may be increased by using an experimental method that provides information on the relative sizes of the objects used.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a solution to the problems associated with network latency within distributed virtual environments. It begins by discussing the advantages and disadvantages of synchronous and asynchronous distributed models, in the areas of user and object representation and user-to-user interaction. By introducing a hybrid solution, which utilises the concept of a causal surface, the advantages of both synchronous and asynchronous models are combined. Object distortion is a characteristic feature of the hybrid system, and this is proposed as a solution which facilitates dynamic real-time user collaboration. The final section covers implementation details, with reference to a prototype system available from the Internet.

Relevance:

30.00%

Publisher:

Abstract:

The Virtual Lightbox for Museums and Archives (VLMA) is a tool for collecting and reusing, in a structured fashion, the online contents of museum and archive datasets. It is not restricted to datasets with visual components, although VLMA includes a lightbox service that enables comparison and manipulation of visual information. With VLMA, one can browse and search collections, construct personal collections, annotate them, export these collections to XML or Impress (OpenOffice) presentation format, and share collections with other VLMA users. VLMA was piloted as an e-learning tool as part of JISC's e-Learning focus in its first phase (2004-2005); in its second phase (2005-2006) it has incorporated new partner collections while improving and expanding interfaces and services. This paper concerns its development as a research and teaching tool, especially for teachers using museum collections, and discusses the recent development of VLMA.

Relevance:

30.00%

Publisher:

Abstract:

In Peer-to-Peer (P2P) networks, it is often desirable to assign node IDs that preserve locality relationships in the underlying topology. Node locality can be embedded into node IDs by a one-dimensional mapping, via a Hilbert space-filling curve, of a vector of network distances from each node to a subset of reference landmark nodes within the network. However, this approach is fundamentally limited: while robustness and accuracy might be expected to improve with the number of landmarks, the effectiveness of the one-dimensional Hilbert curve mapping falls foul of the curse of dimensionality. This work proposes an approach to solving this issue using Landmark Multidimensional Scaling (LMDS) to reduce a large set of landmarks to a smaller set of virtual landmarks. This smaller set has been postulated to represent the intrinsic dimensionality of the network space, and therefore a space-filling curve applied to these virtual landmarks is expected to produce a better mapping of the node ID space. The proposed approach, the Virtual Landmarks Hilbert Curve (VLHC), is particularly suitable for decentralised systems such as P2P networks. In the experimental simulations, the effectiveness of the methods is measured by means of the locality preservation derived from node IDs, in terms of latency to nearest neighbours. A variety of realistic network topologies are simulated, and this work provides strong evidence to suggest that VLHC performs better than either Hilbert curves or LMDS used independently of each other.
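The Hilbert-mapping step at the heart of the scheme can be illustrated with the classic 2-D xy-to-index algorithm: once LMDS has reduced each node to (here) two virtual-landmark coordinates quantised onto a 2^order grid, the Hilbert index becomes the locality-preserving node ID. This is a textbook sketch of that step, not the VLHC implementation.

```python
# Classic 2-D Hilbert curve mapping: cell (x, y) on a 2**order grid to a
# 1-D index. In VLHC-style schemes the index would serve as the node ID.
def hilbert_xy_to_d(order, x, y):
    """Map 2-D cell (x, y) to its distance along the Hilbert curve."""
    d = 0
    s = 2 ** (order - 1)
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate the quadrant so the recursion sees a canonical orientation.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

# Walking the four cells of a 2x2 grid in curve order yields indices 0..3:
# cells that are adjacent in space get adjacent IDs.
cells = [(0, 0), (0, 1), (1, 1), (1, 0)]
print([hilbert_xy_to_d(1, x, y) for x, y in cells])  # → [0, 1, 2, 3]
```

Because nearby grid cells tend to receive nearby indices, nodes that are close in the (virtual-landmark) coordinate space end up with numerically close IDs, which is exactly the locality preservation the evaluation measures.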