975 results for Authority Files
Abstract:
Authority files serve to uniquely identify real-world ‘things’ or entities, such as documents, persons and organisations, and their properties, such as relations and features. Already important in the classical library world, authority files are indispensable for adequate information retrieval and analysis in the computer age, because computers, even more than humans, are poor at handling ambiguity. Through authority files, people tell computers which terms, names or numbers refer to the same thing or have the same meaning by giving equivalent notions the same identifier. Authority files thus act as signposts on the internet, where these identifiers are interlinked on the basis of relevance. When executing a query, computers can navigate from identifier to identifier by following these links and collect the queried information along these so-called ‘crosswalks’. In this context, identifiers also go under the name of controlled access points. Identifiers become even more crucial now that massive data collections such as library catalogues and research datasets are releasing their previously closed data directly to the internet, a development known as Linked Open Data. The corresponding name for the internet is then the Web of Data instead of the classical Web of Documents.
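As a rough illustration of the mechanism described above, the Python sketch below uses purely hypothetical name forms, identifiers and records (none taken from the abstract) to show how giving equivalent notions one identifier, and interlinking identifiers across files, lets a program collect information along such ‘crosswalks’.

```python
# Illustrative sketch (hypothetical names, identifiers and records, not taken
# from the abstract): equivalent name forms share one identifier, identifiers in
# different files are interlinked, and a query collects records over the links.

# Authority file: variant name forms mapped to a single identifier
name_to_id = {
    "Bach, Johann Sebastian": "authA:0001",
    "J. S. Bach": "authA:0001",
    "Bach, J.S.": "authA:0001",
}

# Links stating that identifiers in other authority files denote the same entity
same_as = {
    "authA:0001": ["authB:4711", "authC:0042"],
}

# Records indexed by the identifier they use for the entity
records = {
    "authB:4711": ["Record about the composer's cantatas"],
    "authC:0042": ["Record about the composer's organ works"],
}

def collect(query_name):
    """Resolve a name to its identifier, follow the links, gather the records."""
    ident = name_to_id.get(query_name)
    if ident is None:
        return []
    gathered = []
    for linked_id in [ident] + same_as.get(ident, []):
        gathered.extend(records.get(linked_id, []))
    return gathered

# Any variant form of the name retrieves both records.
print(collect("J. S. Bach"))
```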
Abstract:
The present study is an attempt to highlight the problem of typographical errors in OPACs. Errors made while typing catalogue entries, as well as while importing bibliographic records from other libraries, go unnoticed by librarians, resulting in the non-retrieval of available records and affecting the quality of OPACs. This paper follows previous research on the topic, mainly by Jeffrey Beall and Terry Ballard. The word “management” was chosen from the list of likely-to-be-misspelled words identified by previous research. It was found that the word is wrongly entered in several forms in local, national and international OPACs, supporting Ballard’s observation that typos occur almost everywhere. Although many corrective measures have been proposed and are in use, the study asserts that human effort is needed to get rid of the problem. The paper is also an invitation to library professionals and system designers to construct a strategy for solving the issue.
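As a loose illustration of how such typo hunting could be automated, the Python sketch below flags words in hypothetical catalogue headings that are close to, but not equal to, the target word “management”. It uses a simple similarity ratio and is not the method of Beall, Ballard or this paper.

```python
# Minimal sketch (not the authors' method) of screening catalogue headings for
# near-miss spellings of a target word such as "management".
import difflib

TARGET = "management"

def suspicious(word, target=TARGET, threshold=0.8):
    """Flag words that are close to, but not equal to, the target spelling."""
    if word.lower() == target:
        return False
    ratio = difflib.SequenceMatcher(None, word.lower(), target).ratio()
    return ratio >= threshold

# Hypothetical headings containing typos
headings = [
    "Library managment handbook",
    "Principles of mangement",
    "Management of archives",
]

for heading in headings:
    for word in heading.split():
        if suspicious(word):
            print(f"possible typo in {heading!r}: {word!r}")
```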
Abstract:
The first part, presented at the meeting by A. Dipchikova, is a brief report on the role of the National Library as an institution in collecting, preserving and making accessible the national written heritage. Problems of digitisation are examined from the point of view of the existing experience in cataloguing. Special attention is paid to the history and significance of international standards, to the experience in developing and maintaining authority files at national and international level, and to markup languages. Possibilities of using MARC and XML in the library are discussed. The second part, presented here by E. Moussakova, gives an overview of the Library’s latest activities in the digitisation of old Slavic manuscripts, which are a component of the national cultural heritage. It is pointed out that the current work is largely limited to the preparation of metadata rather than focused on digital products.
Abstract:
This article reports on research carried out on 200 child welfare files from the largest welfare authority in Northern Ireland from 1950 to 1968. The literature review provides a commentary on some of the major debates surrounding child welfare and protection social work from the perspective of its historical development. The report of the research which follows offers an insight into one core and less well-known period of child welfare history in Northern Ireland between the two Children and Young Persons Acts (1950 & 1968). Using a method of discourse analysis influenced by Michel Foucault, a detailed description of the nature of practice is offered. This paper is offered as a work in progress, with further work planned to disseminate a more detailed analysis of the method and outcomes. The research asks a few core questions based on problems identified in the present with our current understandings of child welfare and protection histories. While recognising the limitations of this study and the current need for a broader analysis of the wider context surrounding child welfare practice, it is argued that some salient conclusions can be drawn about continuity and discontinuity in practice which are of interest to practitioners and students of child welfare social work.
Automatic classification of scientific records using the German Subject Heading Authority File (SWD)
Abstract:
The following paper deals with an automatic text classification method which does not require training documents. For this method the German Subject Heading Authority File (SWD), provided by the linked data service of the German National Library, is used. Recently the SWD was enriched with notations of the Dewey Decimal Classification (DDC). As a consequence, it became possible to use the subject headings as textual representations of the DDC notations. Basically, we derive the classification of a text from the classification of the words in the text given by the thesaurus. The method was tested by classifying 3826 OAI records from 7 different repositories. Mean reciprocal rank and recall were chosen as evaluation measures. A direct comparison with a machine learning method has shown that this method is definitely competitive. Thus we can conclude that the enriched version of the SWD provides high-quality information with broad coverage for the classification of German scientific articles.
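The following Python sketch illustrates the training-free idea in miniature: words are looked up in a tiny, hypothetical stand-in for the SWD whose entries carry DDC notations, the notations are aggregated into a ranking, and mean reciprocal rank is computed over the results. It is an illustration of the approach, not the authors’ implementation.

```python
# Simplified sketch of thesaurus-based classification without training data.
# The vocabulary below is a tiny hypothetical stand-in for the SWD.
from collections import Counter

heading_to_ddc = {
    "bibliothek": "020",        # library and information sciences
    "katalog": "025",
    "physik": "530",
    "quantenmechanik": "530",
}

def classify(text, top_k=3):
    """Rank DDC notations by how many of the text's words map to them."""
    votes = Counter()
    for word in text.lower().split():
        notation = heading_to_ddc.get(word)
        if notation:
            votes[notation] += 1
    return [notation for notation, _ in votes.most_common(top_k)]

def mean_reciprocal_rank(ranked_lists, gold_labels):
    """MRR over records: 1/rank of the first correct notation, 0 if absent."""
    total = 0.0
    for ranking, gold in zip(ranked_lists, gold_labels):
        for rank, notation in enumerate(ranking, start=1):
            if notation == gold:
                total += 1.0 / rank
                break
    return total / len(gold_labels)

ranking = classify("katalog der bibliothek")
print(ranking)                                        # e.g. ['025', '020']
print(mean_reciprocal_rank([ranking], ["020"]))       # 0.5 if '020' is ranked second
```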
Abstract:
“SOH see significant benefit in digitising its drawings and operation and maintenance manuals. Since SOH do not currently have digital models of the Opera House structure or other components, there is an opportunity for this national case study to promote the application of Digital Facility Modelling using standardised Building Information Models (BIM)”. The digital modelling element of this project examined the potential of building information models for facility management, focusing on the following areas:
• The re-usability of building information for FM purposes
• BIM as an integrated information model for facility management
• Extendibility of the BIM to cope with business-specific requirements
• Commercial facility management software using standardised building information models
• The ability to add (organisation-specific) intelligence to the model
• A roadmap for SOH to adopt BIM for FM
The project has established that BIM – building information modelling – is an appropriate and potentially beneficial technology for the storage of integrated building, maintenance and management data for SOH. Based on the attributes of a BIM, several advantages can be envisioned: consistency in the data, intelligence in the model, multiple representations, and a source of information for intelligent programs and intelligent queries. The IFC – open building exchange standard – specification provides comprehensive support for asset and facility management functions, and offers new management, collaboration and procurement relationships based on sharing of intelligent building data. The major advantages of using an open standard are: information can be read and manipulated by any compliant software; reduced user “lock-in” to proprietary solutions; third-party software can be the “best of breed” to suit the process and scope at hand; standardised BIM solutions consider the wider implications of information exchange outside the scope of any particular vendor; information can be archived as ASCII files for archival purposes; and data quality can be enhanced, as the now single source of users’ information has improved accuracy, correctness, currency, completeness and relevance. SOH’s current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. There have been remarkably few technical difficulties in converting the House’s existing conventions and standards to the new model-based environment. This demonstrates that the IFC model represents world practice for building data representation and management (see Sydney Opera House – FM Exemplar Project Report Number 2005-001-C-3, Open Specification for BIM: Sydney Opera House Case Study). Availability of FM applications based on BIM is in its infancy, but focussed systems are already in operation internationally and show excellent prospects for implementation at SOH. In addition to the generic benefits of standardised BIM described above, the following FM-specific advantages can be expected from this new integrated facilities management environment: faster and more effective processes, controlled whole-life costs and environmental data, better customer service, a common operational picture for current and strategic planning, visual decision-making and a total ownership cost model.
Tests with partial BIM data – provided by several of SOH’s current consultants – show that the creation of a complete SOH model is realistic, but subject to resolution of compliance and detailed functional support by participating software applications. The showcase has demonstrated successfully that IFC-based exchange is possible with several common BIM-based applications through the creation of a new partial model of the building. The data exchanged has been geometrically accurate (the SOH building structure represents some of the most complex building elements) and supports rich information describing the types of objects, with their properties and relationships.
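As a minimal illustration of how IFC data of this kind can be interrogated programmatically, the sketch below assumes the open-source ifcopenshell Python library and a placeholder file name (not a file from the project); it lists door elements and counts element types, the sort of object data a facility management system would draw on.

```python
# Minimal sketch of reading object data from an IFC file, assuming the
# open-source ifcopenshell library; "soh_partial.ifc" is a placeholder path.
from collections import Counter

import ifcopenshell

model = ifcopenshell.open("soh_partial.ifc")

# List door elements with their global identifiers and names.
for door in model.by_type("IfcDoor"):
    print(door.GlobalId, door.Name)

# Count elements by type to get an overview of the model's contents.
type_counts = Counter(element.is_a() for element in model.by_type("IfcElement"))
print(type_counts.most_common(10))
```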
Abstract:
Objective: The review addresses two distinct sets of issues: 1. specific functionality, interface and calculation problems that presumably can be fixed or improved; and 2. the more fundamental question of whether the system is close to being ready for ‘commercial prime time’ in the North American market.
Findings: Many of our comments relate to the first set of issues, especially sections B and C. Sections D and E deal with the second set. Overall, we feel that LCADesign represents a very impressive step forward in the ongoing quest to link CAD with LCA tools and, more importantly, to link the world of architectural practice and that of environmental research. From that perspective, it deserves continued financial support as a research project. However, if the decision is whether or not to continue the development program from a purely commercial perspective, we are less bullish. In terms of the North American market, there are no regulatory or other drivers to press design teams to use a tool of this nature. There is certainly interest in this area, but the tools must be very easy to use with little or no training. Understanding the results is as important in this regard as knowing how to apply the tool. Our comments are fairly negative when it comes to that aspect. Our opinion might change to some degree when the ‘fixes’ are made and the functionality improved. However, as discussed in more detail in the following sections, we feel that the multi-step process — CAD to IFC to LCADesign — could pose a serious problem in terms of market acceptance. The CAD to IFC part is impossible for us to judge with the information provided, and we can’t even begin to answer the question about the ease of using the software to import designs, but it appears cumbersome from what we do know. There does appear to be a developing North American market for 3D CAD, with a recent survey indicating that about 50% of firms use some form of 3D modeling for about 75% of their projects. However, this does not mean that full 3D CAD is always being used. Our information suggests that AutoDesk accounts for about 75 to 80% of the 3D CAD market, and they are very cautious about any links that do not serve a latent demand. Finally, other systems that link CAD to energy simulation are using XML data transfer protocols rather than IFC files, and it is our understanding that the market served by AutoDesk tends in that direction right now. This is a subject that is outside our area of expertise, so please take these comments as suggestions for more intensive market research rather than as definitive findings.
Abstract:
The indoor air quality (IAQ) in buildings is currently assessed by measurement of pollutants during building operation for comparison with air quality standards. Current practice at the design stage tries to minimise potential indoor air quality impacts of new building materials and contents by selecting low-emission materials. However low-emission materials are not always available, and even when used the aggregated pollutant concentrations from such materials are generally overlooked. This paper presents an innovative tool for estimating indoor air pollutant concentrations at the design stage, based on emissions over time from large area building materials, furniture and office equipment. The estimator considers volatile organic compounds, formaldehyde and airborne particles from indoor materials and office equipment and the contribution of outdoor urban air pollutants affected by urban location and ventilation system filtration. The estimated pollutants are for a single, fully mixed and ventilated zone in an office building with acceptable levels derived from Australian and international health-based standards. The model acquires its dimensional data for the indoor spaces from a 3D CAD model via IFC files and the emission data from a building products/contents emissions database. This paper describes the underlying approach to estimating indoor air quality and discusses the benefits of such an approach for designers and the occupants of buildings.
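A generic steady-state, single-zone mass balance of the kind such an estimator might build on is sketched below in Python; the materials, emission rates, ventilation figures and filter efficiency are placeholders, not data from the paper.

```python
# Generic steady-state single-zone mass balance (illustrative only; the material
# areas, emission rates and ventilation values below are hypothetical).

def steady_state_concentration(emission_rates, exposed_area, ventilation_rate,
                               outdoor_conc, filter_efficiency):
    """
    emission_rates: {material: area-specific emission rate in ug/(m2*h)}
    exposed_area: {material: exposed area in m2}
    ventilation_rate: outdoor air supply in m3/h
    outdoor_conc: outdoor pollutant concentration in ug/m3
    filter_efficiency: fraction of outdoor pollutant removed by filtration (0-1)
    Returns the indoor concentration in ug/m3 for a fully mixed zone, no sinks.
    """
    indoor_source = sum(rate * exposed_area[m] for m, rate in emission_rates.items())
    outdoor_source = ventilation_rate * (1.0 - filter_efficiency) * outdoor_conc
    return (indoor_source + outdoor_source) / ventilation_rate

# Hypothetical office zone: carpet and wall paint emitting TVOC.
conc = steady_state_concentration(
    emission_rates={"carpet": 50.0, "paint": 20.0},   # ug/(m2*h)
    exposed_area={"carpet": 100.0, "paint": 250.0},   # m2
    ventilation_rate=500.0,                           # m3/h
    outdoor_conc=10.0,                                # ug/m3
    filter_efficiency=0.4,
)
print(f"estimated TVOC concentration: {conc:.1f} ug/m3")
```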
Abstract:
Underlying social space are territories, lands, geographical domains, the actual geographical underpinnings of the imperial, and also the cultural contest. To think about distant places, to colonize them, to populate or depopulate them: all of this occurs on, about, or because of land. […] Imperialism and the culture associated with it affirm both the primacy of geography and an ideology about control of territory.
Abstract:
In today’s global design world, architectural and other related design firms design across time zones and geographically distant locations. High-bandwidth virtual environments have the potential to make a major impact on these global design teams. However, there is insufficient evidence about the way designers collaborate in their normal working environments using traditional and/or digital media. This paper presents a method to study the impact of communication and information technologies on collaborative design practice by comparing design tasks done in a normal working environment with design tasks done in a virtual environment. Before introducing high-bandwidth collaboration technology to the work environment, a baseline study is conducted to observe and analyze the existing collaborative process. Designers currently rely on phone, fax, email, and image files for communication and collaboration. Describing the current context is important for comparison with the following phases. We developed a coding scheme that will be used in analyzing three stages of the collaborative design activity. The results will establish the basis for measures of collaborative design activity when a new technology is introduced later to the same work environment – for example, designers using electronic whiteboards, 3D virtual worlds, webcams, and internet phone. The results of this work will form the basis of guidelines for the introduction of technology into global design offices.
Abstract:
The ability to assess a commercial building for its impact on the environment at the earliest stage of design is a goal which is achievable by integrating several approaches into a single procedure operating directly on the 3D CAD representation. Such an approach enables building design professionals to make informed decisions on the environmental impact of a building and its alternatives during the design development stage, instead of at the post-design stage where options become limited. The indicators of interest are those which relate to consumption of resources and energy, contributions to pollution of air, water and soil, and impacts on the health and wellbeing of people in the built environment as a result of constructing and operating buildings. 3D object-oriented CAD files contain a wealth of building information which can be interrogated for details required for analysis of the performance of a design. The quantities of all components in the building can be automatically obtained from the 3D CAD objects and their constituent materials identified, to calculate a complete list of the amounts of all building products such as concrete, steel, timber and plastic. When this information is combined with a life cycle inventory database, key internationally recognised environmental indicators can be estimated. Such a fully integrated tool, known as LCADesign, has been created for automated eco-efficiency assessment of commercial buildings directly from 3D CAD. This paper outlines the key features of LCADesign and its application to the environmental assessment of commercial buildings.
Abstract:
Buildings consume resources and energy, contribute to pollution of our air, water and soil, impact the health and well-being of populations, and constitute an important part of the built environment in which we live. The ability to assess their design automatically from their 3D CAD representations, with a view to reducing that impact, enables building design professionals to make informed decisions on the environmental impact of building structures. Contemporary 3D object-oriented CAD files contain a wealth of building information. LCADesign has been designed as a fully integrated approach for automated eco-efficiency assessment of commercial buildings directly from 3D CAD. LCADesign accesses the 3D CAD detail through Industry Foundation Classes (IFCs) – the international standard file format for defining architectural and constructional CAD graphic data as 3D real-world objects – to permit construction professionals to interrogate these intelligent drawing objects for analysis of the performance of a design. The automated take-off provides quantities of all building components, whose specific production processes, logistics and raw material inputs are identified where necessary to calculate a complete list of quantities for all products such as concrete, steel, timber and plastic, and combines this information with a life cycle inventory database to estimate key internationally recognised environmental indicators such as CML, EPS and Eco-indicator 99. This paper outlines the key modules of LCADesign and their role in delivering an automated eco-efficiency assessment for commercial buildings.
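The take-off-plus-inventory step described above can be pictured with the toy Python sketch below; the quantities and impact factors are placeholders, not values from LCADesign or any real life cycle inventory database.

```python
# Toy sketch of combining a quantity take-off with a life cycle inventory.
# All quantities and impact factors below are hypothetical placeholders.

# Quantities per material, e.g. as produced by an automated take-off from the 3D model
quantities = {"concrete": 350.0, "steel": 25.0, "timber": 12.0}  # tonnes

# Hypothetical inventory factors: impact per tonne of material
impact_factors = {
    "concrete": {"embodied_energy_MJ": 1100.0, "co2e_kg": 140.0},
    "steel":    {"embodied_energy_MJ": 24000.0, "co2e_kg": 1800.0},
    "timber":   {"embodied_energy_MJ": 2500.0, "co2e_kg": 90.0},
}

# Multiply each material's quantity by its factors and sum per indicator.
totals = {}
for material, amount in quantities.items():
    for indicator, factor in impact_factors[material].items():
        totals[indicator] = totals.get(indicator, 0.0) + amount * factor

for indicator, value in totals.items():
    print(f"{indicator}: {value:,.0f}")
```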
Abstract:
Our students come from diverse backgrounds. They need flexibility in their learning. First-year students tend to worry when they miss lectures or parts of lectures. Having the lecture as an online resource allows students to miss a lecture without stressing about it and to be more relaxed in the lecture, knowing that anything they may miss will be available later.
The resource: the Windows-based program from Blueberry Software (not Blackberry!) – BB Flashback – allows the simultaneous recording of the computer screen together with the audio, as well as webcam recording. Editing capabilities include adding pause buttons, graphics and text to the file before exporting it as a Flash file. Any diagrams drawn on the board or shown via visualiser can be photographed and easily incorporated. The audio can be extracted from the file if required and posted as a podcast. Export modes other than Flash are also available, allowing vodcasting if you wish.
What you will need:
- the recording software: it can be installed on the lecture hall computer just prior to the lecture if needed
- a computer: either the ones in lecture halls, especially if fitted with audio recording, or a laptop (I have used audio recording via Bluetooth for mobility).
Feedback from students has been positive and will be presented on the poster.
Abstract:
PERWAPI is a component for reading and writing .NET PE files. The name is a compound acronym for Program Executable – Reader/Writer – Application Programming Interface. The code was written by one of us (Diane Corney) with contributions from some of the early users of the tool. PERWAPI is a managed component, written entirely in safe C#. The design of the writer part of the component is loosely based on Diane Corney’s previous PEAPI component. It is open source software, and is released under a “FreeBSD-like” license. The source may be downloaded from “http://plas.fit.qut.edu.au/perwapi/”. As of the date of this document the code has facilities for reading and writing PE files compatible with the latest (beta-2) release of the “Whidbey” version of .NET, that is, the Visual Studio 2005 framework. An invocation option allows earlier versions of the framework to be targeted.