879 results for pacs: information retrieval techniques
Abstract:
Taking as a starting point the acknowledgment that the principles and methods used to build and manage documentary systems are dispersed and lack systematization, this study hypothesizes that the notion of structure, by assuming mutual relationships among its elements, promotes more organic systems and ensures better quality and consistency in the retrieval of information relevant to users' needs. Accordingly, it aims to explore the fundamentals of information records and documentary systems, starting from the notion of structure. To that end, it presents basic concepts and matters related to documentary systems and information records. It then surveys the theoretical contributions to the notion of structure made by Benveniste, Ferrater Mora, Levi-Strauss, Lopes, Penalver Simo and Saussure, as well as Ducrot, Favero and Koch. Appropriations of the notion already made in Documentation by Paul Otlet, Garcia Gutierrez and Moreiro Gonzalez come as a further topic. It concludes that the notion of structure adopted here, by making explicit a hypothesis of real systematization, yields more organic systems and provides a pedagogical reference for documentary tasks.
Abstract:
The Information Architecture project is one of the initial stages of a website project, so detecting and correcting errors at this stage is easier and less time-consuming than in later stages. To minimize errors in information architecture projects, however, a methodology is needed to organize the professional's work and guarantee the quality of the final product. The profile of professionals working with Information Architecture in Brazil was analyzed (quantitative research by means of an online questionnaire), along with the difficulties, techniques and methodologies found in their projects (qualitative research by means of in-depth interviews supported by the Sense-Making approach). It is concluded that information architecture project methodologies need to further the adoption of User-Centered Design approaches and of ways to evaluate their results.
Abstract:
SKAN: Skin Scanner - System for Skin Cancer Detection Using Adaptive Techniques - combines computer engineering concepts with areas like dermatology and oncology. Its objective is to discern images of skin cancer, specifically melanoma, from others that show only common spots or other types of skin diseases, using image recognition. This work makes use of the ABCDE visual rule, which is often used by dermatologists for melanoma identification, to define which characteristics are analyzed by the software. It then applies various algorithms and techniques, including an ellipse-fitting algorithm, to extract and measure these characteristics and decide whether the spot is a melanoma or not. The achieved results are presented with special focus on the adaptive decision-making and its effect on the diagnosis. Finally, other applications of the software and its algorithms are presented.
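A moment-based ellipse fit of the kind the abstract mentions can be sketched as follows. This is an illustrative example only, not SKAN's own algorithm: the binary lesion mask and the derived shape features (axis lengths, eccentricity, of the sort used for the A and B criteria of the ABCDE rule) are assumptions for the sketch.

```python
import numpy as np

def fit_ellipse_moments(mask):
    """Fit an ellipse to a binary lesion mask via second-order image moments.

    Returns the centroid, major/minor axis lengths and orientation --
    shape features of the kind used for asymmetry (A) and border (B)
    assessment in the ABCDE rule.
    """
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()
    cov = np.cov(np.stack([xs - cx, ys - cy]))      # 2x2 covariance of pixels
    evals, evecs = np.linalg.eigh(cov)              # ascending eigenvalues
    minor, major = 2.0 * np.sqrt(evals)             # axis lengths from variances
    angle = np.arctan2(evecs[1, 1], evecs[0, 1])    # orientation of major axis
    return (cx, cy), major, minor, angle

# A synthetic circular spot: major and minor axes nearly equal,
# so eccentricity is close to zero (a symmetric, benign-looking shape).
yy, xx = np.mgrid[0:64, 0:64]
circle = ((xx - 32) ** 2 + (yy - 32) ** 2) <= 15 ** 2
(_, _), a, b, _ = fit_ellipse_moments(circle)
eccentricity = np.sqrt(1 - (b / a) ** 2)
```

For a uniform disk of radius r, the pixel variance along each axis is about r^2/4, so the fitted axis length 2*sqrt(variance) recovers r; elongated or irregular lesions yield higher eccentricity.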
Abstract:
The etiological agent of maize white spot (MWS) disease has been a subject of controversy and discussion. Initially the disease was described as Phaeosphaeria leaf spot, caused by Phaeosphaeria maydis. Other authors have suggested the existence of different fungal species causing similar symptoms. Recently, a bacterium, Pantoea ananatis, was described as the causal agent of this disease. The purpose of this study was to offer additional information on the correct etiology of this disease by providing visual evidence of the presence of the bacterium in the interior of MWS lesions, using transmission electron microscopy (TEM) and molecular techniques. TEM allowed visualization of a large number of bacteria in the intercellular spaces of lesions collected from both artificially and naturally infected plants. Fungal structures were not visualized in young lesions. Bacterial primers for the 16S rRNA and rpoB genes were used in PCR reactions to amplify DNA extracted from water-soaked (young) and necrotic lesions. The universal fungal oligonucleotide ITS4 was also included to identify the possible presence of fungal structures inside lesions. Positive PCR products from water-soaked lesions, both from naturally and artificially inoculated plants, were produced with the bacterial primers, whereas no amplification was observed when the ITS4 oligonucleotide was used. On the other hand, DNA amplification with the ITS4 primer was observed when DNA was isolated from necrotic (old) lesions. These results reinforce the previous report of P. ananatis as the primary pathogen and the hypothesis that fungal species may colonize lesions pre-established by P. ananatis.
Abstract:
Experimental mechanical sieving methods are applied to samples of shellfish remains from three sites in southeast Queensland, Seven Mile Creek Mound, Sandstone Point and One-Tree, to test the efficacy of various recovery and quantification procedures commonly applied to shellfish assemblages in Australia. There has been considerable debate regarding the most appropriate sieve sizes and quantification methods that should be applied in the recovery of vertebrate faunal remains. Few studies, however, have addressed the impact of recovery and quantification methods on the interpretation of invertebrates, specifically shellfish remains. In this study, five shellfish taxa representing four bivalves (Anadara trapezia, Trichomya hirsutus, Saccostrea glomerata, Donax deltoides) and one gastropod (Pyrazus ebeninus) common in eastern Australian midden assemblages are sieved through 10 mm, 6.3 mm and 3.15 mm mesh. Results are quantified using MNI, NISP and weight. Analyses indicate that different structural properties and pre- and postdepositional factors affect recovery rates. Fragile taxa (T. hirsutus) or those with foliated structure (S. glomerata) tend to be overrepresented by NISP measures in smaller sieve fractions, while more robust taxa (A. trapezia and P. ebeninus) tend to be overrepresented by weight measures. Results demonstrate that for all quantification methods tested a 3 mm sieve should be used on all sites to allow for regional comparability and to effectively collect all available information about the shellfish remains.
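As a rough illustration of the three quantification units the study compares, the sketch below tallies NISP, weight and a simple MNI for a list of shell fragments. The fragment records are invented, and the MNI convention used (maximum count of a single non-repeating diagnostic element, such as a left or right valve hinge) is one common choice, not necessarily the study's.

```python
from collections import defaultdict

def quantify(fragments):
    """Tally NISP, total weight and a simple MNI per taxon.

    Each fragment is (taxon, diagnostic_element, weight_g). NISP counts
    every fragment; weight sums them; MNI here is the maximum count of
    any single non-repeating diagnostic element per taxon.
    """
    nisp = defaultdict(int)
    weight = defaultdict(float)
    elements = defaultdict(lambda: defaultdict(int))
    for taxon, element, w in fragments:
        nisp[taxon] += 1
        weight[taxon] += w
        if element is not None:          # only diagnostic pieces count toward MNI
            elements[taxon][element] += 1
    mni = {t: max(counts.values()) for t, counts in elements.items()}
    return dict(nisp), dict(weight), mni

frags = [
    ("Anadara trapezia", "left_hinge", 2.1),
    ("Anadara trapezia", "left_hinge", 1.8),
    ("Anadara trapezia", "right_hinge", 2.0),
    ("Anadara trapezia", None, 0.4),     # undiagnostic fragment: NISP/weight only
]
nisp, weight, mni = quantify(frags)
```

The divergence the study reports falls out of these definitions: fragmentation inflates NISP for fragile taxa (every splinter counts), while dense shells dominate weight, which is why the two measures can rank the same assemblage differently.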
Abstract:
While multimedia data, image data in particular, is an integral part of most websites and web documents, our quest for information so far is still restricted to text-based search. To explore the World Wide Web more effectively, especially its rich repository of truly multimedia information, we are facing a number of challenging problems. Firstly, we face the ambiguous and highly subjective nature of defining image semantics and similarity. Secondly, multimedia data could come from highly diversified sources, as a result of automatic image capturing and generation processes. Finally, multimedia information exists in decentralised sources over the Web, making it difficult to use conventional content-based image retrieval (CBIR) techniques for effective and efficient search. In this special issue, we present a collection of five papers on visual and multimedia information management and retrieval topics, addressing some aspects of these challenges. These papers have been selected from the conference proceedings (Kluwer Academic Publishers, ISBN: 1-4020-7060-8) of the Sixth IFIP 2.6 Working Conference on Visual Database Systems (VDB6), held in Brisbane, Australia, on 29–31 May 2002.
Abstract:
Allergy is a major cause of morbidity worldwide. The number of characterized allergens and the related information are increasing rapidly, creating demands for advanced information storage, retrieval and analysis. Bioinformatics provides useful tools for analysing allergens, and these are complementary to traditional laboratory techniques for the study of allergens. Specific applications include structural analysis of allergens, identification of B- and T-cell epitopes, assessment of allergenicity and cross-reactivity, and genome analysis. In this paper, the most important bioinformatic tools and methods with relevance to the study of allergy are reviewed.
Abstract:
Recent advances in the control of molecular engineering architectures have allowed unprecedented molecular recognition ability in biosensing, with a promising impact for clinical diagnosis and environmental control. The availability of large amounts of data from electrical, optical, or electrochemical measurements requires, however, sophisticated data treatment in order to optimize sensing performance. In this study, we show how an information visualization system based on projections, referred to as Projection Explorer (PEx), can be used to achieve high performance for biosensors made with nanostructured films containing immobilized antigens. As a proof of concept, various visualizations were obtained with impedance spectroscopy data from an array of sensors whose electrical response could be specific toward a given antibody (analyte) owing to molecular recognition processes. In addition to discussing the distinct methods for projection and normalization of the data, we demonstrate that an excellent distinction can be made between real samples tested positive for Chagas disease and Leishmaniasis, which could not be achieved with conventional statistical methods. Such high performance probably arose from the possibility of treating the data over the whole frequency range. Through a systematic analysis, it was inferred that Sammon's mapping with standardization to normalize the data gives the best results, distinguishing blood serum samples containing as little as 10^-7 mg/mL of the antibody. The method inherent in PEx and the procedures for analyzing the impedance data are entirely generic and can be extended to optimize any type of sensor or biosensor.
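The combination the abstract singles out, Sammon's mapping with standardization, can be sketched in a few lines. This is a minimal gradient-descent version run on synthetic clustered data; it is not PEx's implementation, and the data stand in for (not reproduce) the impedance measurements.

```python
import numpy as np

def sammon(X, n_iter=500, lr=0.3, seed=0):
    """Project standardized rows of X to 2-D by gradient descent on
    Sammon's stress, which penalizes distortion of small distances."""
    X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardization (z-score)
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    D[D == 0] = 1e-12                               # guard duplicate points
    np.fill_diagonal(D, 1.0)                        # diagonal never contributes
    iu = np.triu_indices(n, 1)
    c = D[iu].sum()
    Y = np.random.default_rng(seed).normal(scale=1e-2, size=(n, 2))
    stresses = []
    for _ in range(n_iter):
        d = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
        np.fill_diagonal(d, 1.0)
        stresses.append(((D - d)[iu] ** 2 / D[iu]).sum() / c)
        # gradient of Sammon stress w.r.t. each projected point
        g = ((d - D) / (d * D))[:, :, None] * (Y[:, None] - Y[None, :])
        Y -= lr * (2.0 / c) * g.sum(axis=1)
    return Y, stresses

# Two synthetic clusters in 5-D, e.g. "positive" vs "negative" sensor responses
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (8, 5)), rng.normal(3, 0.3, (8, 5))])
Y, stresses = sammon(X)
```

Unlike PCA, Sammon's stress weights each pairwise error by 1/D, so preserving the small within-class distances matters as much as the large between-class ones; that is one plausible reason a nonlinear projection can separate samples that conventional statistics cannot.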
Abstract:
The ground and excited state geometry of the six-coordinate copper(II) ion is examined in detail using the CuF6(4-) and Cu(H2O)6(2+) complexes as examples. A variety of spectroscopic techniques are used to illustrate the relations between the geometric and electronic properties of these complexes through the characterization of their potential energy surfaces.
Abstract:
This paper examines the effects of information request ambiguity and construct incongruence on end users' ability to develop SQL queries with an interactive relational database query language. In this experiment, ambiguity in information requests adversely affected accuracy and efficiency. Incongruities among the information request, the query syntax, and the data representation adversely affected accuracy, efficiency, and confidence. The results for ambiguity suggest that organizations might elicit better query development if end users were sensitized to the nature of ambiguities that could arise in their business contexts. End users could translate natural language queries into pseudo-SQL that could be examined for precision before the queries were developed. The results for incongruence suggest that better query development might ensue if semantic distances could be reduced by giving users data representations and database views that maximize construct congruence for the kinds of queries in typical domains. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
Regional planners, policy makers and policing agencies all recognize the importance of better understanding the dynamics of crime. Theoretical and application-oriented approaches which provide insights into why and where crimes take place are much sought after. Geographic information systems and spatial analysis techniques, in particular, are proving to be essential for studying criminal activity. However, the capabilities of these quantitative methods continue to evolve. This paper explores the use of geographic information systems and spatial analysis approaches for examining crime occurrence in Brisbane, Australia. The analysis highlights novel capabilities for the analysis of crime in urban regions.
Abstract:
The blending of coals has become popular as a way to improve the performance of coals, to meet the specifications of power plants and to reduce the cost of coals. This article reviews the results of, and provides new information on, ignition, flame stability and carbon burnout studies of blended coals. The reviewed studies were conducted in laboratory-, pilot- and full-scale facilities; the new information was obtained from pilot-scale studies. The results generally show that blending a high-volatile coal with a low-volatile coal or anthracite can improve the ignition, flame stability and burnout of the blends. This paper discusses two general methods to predict the performance of blended coals: (1) experiment; and (2) indices. Laboratory- and pilot-scale tests, at least, provide a relative ranking of the combustion performance of coals and blends in power station boilers. Several indices, volatile matter content, heating value and a maceral index, can be used to predict the relative ranking of ignitability and flame stability of coals and blends. The maceral index, fuel ratio, and vitrinite reflectance can also be used to predict the absolute carbon burnout of coals and blends within limits. (C) 2000 Elsevier Science Ltd. All rights reserved.
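The fuel ratio mentioned among the indices is simply fixed carbon divided by volatile matter. The sketch below applies it to a hypothetical 70/30 blend using mass-weighted averaging of the proximate-analysis values, a common first approximation; the coal properties are invented, and real blend combustion behaviour can be non-additive.

```python
def fuel_ratio(fixed_carbon_pct, volatile_matter_pct):
    """Fuel ratio index: fixed carbon / volatile matter (same basis).
    Higher values indicate harder-to-ignite, lower-volatile fuels."""
    return fixed_carbon_pct / volatile_matter_pct

def blend_property(values, mass_fractions):
    """Mass-weighted average of a property across blend components --
    a first approximation that ignores component interactions."""
    return sum(v * f for v, f in zip(values, mass_fractions))

# 70/30 blend of a high-volatile coal and a low-volatile coal (invented values)
vm = blend_property([35.0, 15.0], [0.7, 0.3])   # volatile matter, %
fc = blend_property([50.0, 75.0], [0.7, 0.3])   # fixed carbon, %
fr = fuel_ratio(fc, vm)
```

Because the high-volatile component dominates the blended volatile matter, the blend's fuel ratio (here about 2.0) stays much closer to the high-volatile coal's than to the low-volatile coal's, which is the arithmetic behind the review's observation that a high-volatile addition improves ignition and flame stability.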
Abstract:
The majority of the world's population now resides in urban environments, and information on the internal composition and dynamics of these environments is essential to enable preservation of certain standards of living. Remotely sensed data, especially the global coverage of moderate spatial resolution satellites such as Landsat, Indian Resource Satellite and Système Pour l'Observation de la Terre (SPOT), offer a highly useful data source for mapping the composition of these cities and examining their changes over time. The utility and range of applications for remotely sensed data in urban environments could be improved with a more appropriate conceptual model relating urban environments to the sampling resolutions of imaging sensors and processing routines. Hence, the aim of this work was to take the Vegetation-Impervious surface-Soil (VIS) model of urban composition and match it with the most appropriate image processing methodology to deliver information on VIS composition for urban environments. Several approaches were evaluated for mapping the urban composition of Brisbane city (south-east Queensland, Australia) using Landsat 5 Thematic Mapper data and 1:5000 aerial photographs. The methods evaluated were: image classification; interpretation of aerial photographs; and constrained linear mixture analysis. Over 900 reference sample points on four transects were extracted from the aerial photographs and used as a basis to check output of the classification and mixture analysis. Distinctive zonations of VIS related to urban composition were found in the per-pixel classification and aggregated air-photo interpretation; however, significant spectral confusion also resulted between classes. In contrast, the VIS fraction images produced from the mixture analysis enabled distinctive densities of commercial, industrial and residential zones within the city to be clearly defined, based on their relative amount of vegetation cover.
The soil fraction image served as an index for areas being (re)developed. The logical match of a low (L)-resolution, spectral mixture analysis approach with the moderate spatial resolution image data, ensured the processing model matched the spectrally heterogeneous nature of the urban environments at the scale of Landsat Thematic Mapper data.
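Constrained linear mixture analysis of the kind described can be sketched as follows: each pixel's spectrum is modeled as a fraction-weighted mix of vegetation, impervious-surface and soil endmember spectra, with the fractions constrained to sum to one. The four-band endmember values below are invented for illustration, not Landsat TM spectra.

```python
import numpy as np

# Endmember matrix: rows are spectral bands, columns are V, I, S
# (values are illustrative, not measured Thematic Mapper reflectances)
endmembers = np.array([
    [0.05, 0.20, 0.30],
    [0.08, 0.22, 0.35],
    [0.45, 0.25, 0.40],
    [0.30, 0.28, 0.55],
])

def unmix(pixel, E=endmembers, w=100.0):
    """Least-squares V-I-S fractions for one pixel, with the sum-to-one
    constraint enforced by a heavily weighted augmented row; small
    negative fractions are clipped and the result renormalized."""
    A = np.vstack([E, w * np.ones(E.shape[1])])   # append constraint row
    b = np.append(pixel, w)                        # fractions must sum to 1
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    f = np.clip(f, 0.0, None)
    return f / f.sum()

# A pixel that is exactly 60% vegetation and 40% soil is recovered exactly
true_f = np.array([0.6, 0.0, 0.4])
pixel = endmembers @ true_f
f = unmix(pixel)
```

Applied per pixel, the three fraction values form the V, I and S fraction images the abstract describes; the soil fraction, for instance, flags bare (re)development sites even where a hard per-pixel classifier would be forced into a single confused class.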