890 results for Content Analysis and Indexing
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
In this paper we present the development and implementation of a content analysis model for observing aspects of the social mission of the public library on Facebook pages and websites. The model, developed from the literature, is novel: it defines four categories of analysis, namely Generate social capital and social cohesion, Consolidate democracy and citizenship, Social and digital inclusion, and Fighting illiteracies. The model enabled the collection and analysis of data in a case study of 99 Portuguese public libraries with a Facebook page. With this content analysis model we observed the facets of the social mission and examined the actions with a social dimension on the Facebook pages and websites of the libraries. Finally, we discuss the results of the observation of the libraries' Facebook pages and of their websites side by side. The most immediate conclusion from reading the descriptions of social-mission actions is that the 99 public libraries rarely publish actions of a social character on Facebook or on their websites, and the results are not very satisfying. The Portuguese public libraries substantially emphasize actions in the category Generate social capital and social cohesion.
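As an illustration of how such a coding scheme can be operationalized, the following minimal Python sketch assigns posts to the model's four categories by keyword matching; the keyword lists and the sample post are hypothetical illustrations, not the instrument used in the study.

# Hypothetical keyword-based coder for the four social-mission categories.
# The category names come from the paper; the keywords are illustrative only.
CATEGORIES = {
    "Generate social capital and social cohesion": ["community", "volunteer", "meeting"],
    "Consolidate democracy and citizenship": ["citizenship", "voting", "rights"],
    "Social and digital inclusion": ["accessibility", "digital skills", "inclusion"],
    "Fighting illiteracies": ["literacy", "reading club", "writing workshop"],
}

def code_post(text):
    """Return the categories whose keywords appear in a post (case-insensitive)."""
    text = text.lower()
    return [cat for cat, words in CATEGORIES.items()
            if any(w in text for w in words)]

print(code_post("Volunteer-run reading club teaching digital skills"))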
Abstract:
With the exponential growth in the use of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, and visualization, together with the resource management of such services, are increasingly important for delivering user-desired Quality of Service (QoS). First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source online indexing and querying system for big geospatial data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology, which customizes map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques that predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly predicts workload demands 18.91% more accurately and allocates resources to meet the QoS target more efficiently than traditional peak-load-based resource allocation, improving QoS by 26.19% and saving 20.83% in resource usage.
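As an illustration of the query class sksOpen targets, the following minimal Python sketch evaluates a Top-k Spatial Boolean Query by brute force: it keeps the points whose keyword sets satisfy the Boolean (here, conjunctive) predicate and returns the k nearest to the query location. The point data are hypothetical, and a real engine such as sksOpen would use a spatial-keyword index rather than this linear scan.

import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def top_k_spatial_boolean(points, query_loc, required, k):
    """Brute-force Top-k Spatial Boolean Query: keep points containing all
    required keywords, then rank them by distance to the query location."""
    matches = [p for p in points if required.issubset(p["keywords"])]
    matches.sort(key=lambda p: haversine_km(*query_loc, p["lat"], p["lon"]))
    return matches[:k]

# Hypothetical example data (Miami area, where TerraFly originated).
points = [
    {"name": "A", "lat": 25.76, "lon": -80.19, "keywords": {"park", "wifi"}},
    {"name": "B", "lat": 25.79, "lon": -80.13, "keywords": {"park"}},
    {"name": "C", "lat": 25.70, "lon": -80.28, "keywords": {"park", "wifi"}},
]
print(top_k_spatial_boolean(points, (25.77, -80.19), {"park", "wifi"}, k=2))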
Abstract:
In this work, the main factors affecting the rheological behavior of poly(ethylene terephthalate) (PET) in the linear viscoelastic regime (water content, time delay before the test, duration of the experiment, and temperature) were assessed. Small amplitude oscillatory shear tests were performed after different time delays ranging from 300 to 5000 s for samples with water contents ranging from 0.02 to 0.45 wt %. Time sweep tests were carried out for different durations to explain the changes undergone by PET before and during the small amplitude oscillatory shear measurements. Immediately after the time sweep tests, the PET samples were removed from the rheometer and analyzed by differential scanning calorimetry, and their molar mass was obtained by viscometry analysis. It was shown that, for all the samples, the delay before the test and the residence time within the rheometer (i.e., the duration of the experiment) result in structural changes of the PET samples, such as an increase or decrease in molar mass, broadening of the molar mass distribution, and branching phenomena. (C) 2010 Wiley Periodicals, Inc. J Appl Polym Sci 116: 3525-3533, 2010
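For context, small amplitude oscillatory shear probes the standard linear viscoelastic relations (textbook definitions, not results of this paper): an imposed strain \gamma(t) = \gamma_0 \sin(\omega t), with \gamma_0 small enough to stay in the linear regime, produces the stress

\sigma(t) = \gamma_0 \left[ G'(\omega)\sin(\omega t) + G''(\omega)\cos(\omega t) \right],

where G'(\omega) is the storage modulus and G''(\omega) the loss modulus; the complex viscosity magnitude |\eta^*(\omega)| = \sqrt{G'(\omega)^2 + G''(\omega)^2}\,/\,\omega is a quantity commonly used to follow molar-mass changes of the kind reported above.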
Abstract:
Oral squamous cell carcinoma (OSCC) is associated with high morbidity and mortality, due at least in part to late detection. Precancerous and cancerous oral lesions may mimic any number of benign oral lesions, and as such may be left without investigation and treatment until they are well advanced. Over the past several years there has been renewed interest in oral cytology as an adjuvant clinical tool in the investigation of oral mucosal lesions. The purpose of the present study was to compare the usefulness of ploidy analysis of Feulgen-stained cytological thin-prep specimens with traditional incisional biopsy and routine histopathological examination for assessing the pre-malignant potential of oral mucosal lesions. The cytological specimens were analyzed with virtual microscopy, which allowed rapid and thorough analysis of the complete cytological specimen. 100 healthy individuals between 30 and 70 years of age, who were non-smokers, non-drinkers, and not taking any medication, had cytological specimens collected from both the buccal mucosa and the lateral margin of the tongue to establish normal cytology parameters within a control population. Patients with a presumptive clinical diagnosis of lichen planus, leukoplakia, or OSCC had lesional cytological samples taken prior to their diagnostic biopsy. Standardised thin preparations were prepared and each specimen was stained by both the Feulgen and Papanicolaou methods. High-speed scanning of the complete slide at 40x magnification was undertaken using the Aperio ScanScope, and the green channel of the resultant image was analysed after threshold segmentation to isolate only nuclei; the integrated optical density of each nucleus was taken as a gross measure of its DNA content (ploidy). Preliminary results reveal that ploidy assessment of oral cytology holds great promise as an adjunctive prognostic factor in the analysis of the malignant potential of oral mucosal lesions.
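The per-nucleus ploidy measure described above can be sketched in a few lines of Python: threshold the green channel to segment nuclei, then sum the optical density over each nucleus's pixels. The fixed threshold, background intensity, and synthetic test image are simplifying assumptions; the study used calibrated whole-slide scans.

import numpy as np
from scipy import ndimage

def integrated_optical_density(green, threshold=120, background=255.0):
    """Segment nuclei in a green-channel image by thresholding (stained = dark),
    then return the integrated optical density (IOD) of each labeled nucleus.
    OD per pixel is -log10(I / I0), with I0 the clear-background intensity."""
    mask = green < threshold                      # stained nuclei are darker
    labels, n = ndimage.label(mask)               # connected components = nuclei
    od = -np.log10(np.clip(green, 1, None) / background)
    # Sum OD over the pixels of each nucleus (labels 1..n).
    return ndimage.sum(od, labels, index=np.arange(1, n + 1))

# Hypothetical 8-bit green-channel tile: two dark blobs stand in for nuclei.
img = np.full((64, 64), 250, dtype=np.uint8)
img[10:20, 10:20] = 80
img[40:52, 30:44] = 60
print(integrated_optical_density(img))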
Abstract:
This paper analyzes the DNA code of several species from the perspective of information content. For that purpose, several concepts and mathematical tools are selected to establish a quantitative method that does not a priori distort the alphabet represented by the sequence of DNA bases. The synergy of associating Gray code, histogram characterization, and multidimensional scaling visualization leads to a collection of plots with a categorical representation of species and chromosomes.
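The three stages named above (Gray coding, histogram characterization, multidimensional scaling) can be sketched as follows; the 2-bit Gray-code assignment of bases, the k-mer length, and the toy sequences are illustrative assumptions, not necessarily the paper's choices.

import numpy as np
from sklearn.manifold import MDS

# Hypothetical 2-bit Gray-code assignment: adjacent codes differ in one bit.
GRAY = {"A": 0b00, "C": 0b01, "G": 0b11, "T": 0b10}

def kmer_histogram(seq, k=3):
    """Encode each base with its Gray code, pack k consecutive codes into one
    integer, and return the normalized histogram of those k-mer values."""
    codes = [GRAY[b] for b in seq if b in GRAY]
    values = [sum(c << (2 * i) for i, c in enumerate(codes[j:j + k]))
              for j in range(len(codes) - k + 1)]
    hist = np.bincount(values, minlength=4 ** k).astype(float)
    return hist / hist.sum()

# Hypothetical sequences standing in for chromosomes of different species.
seqs = {"sp1": "ATGCGTACGTTAGC" * 20, "sp2": "GGGCCCTTTAAAGT" * 20,
        "sp3": "ATATATATCGCGCG" * 20}
hists = np.array([kmer_histogram(s) for s in seqs.values()])

# Pairwise histogram distances, then 2-D MDS for the categorical plot.
d = np.linalg.norm(hists[:, None, :] - hists[None, :, :], axis=-1)
xy = MDS(n_components=2, dissimilarity="precomputed",
         random_state=0).fit_transform(d)
print(dict(zip(seqs, xy.round(3))))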
Abstract:
High-content analysis has revolutionized cancer drug discovery by identifying substances that alter the phenotype of a cell in ways that prevent tumor growth and metastasis. The high-resolution biofluorescence images from such assays allow precise quantitative measures, enabling the distinction of small molecules of a host cell from a tumor. In this work, we are particularly interested in the application of deep neural networks (DNNs), a cutting-edge machine learning method, to the classification of compounds into chemical mechanisms of action (MOAs). Compound classification has been performed using image-based profiling methods, sometimes combined with feature reduction methods such as principal component analysis or factor analysis. In this article, we map the input features of each cell to a particular MOA class without using any treatment-level profiles or feature reduction methods. To the best of our knowledge, this is the first application of DNNs in this domain to leverage single-cell information. Furthermore, we use deep transfer learning (DTL) to alleviate the computationally demanding effort of searching the huge parameter space of a DNN. Results show that with this approach we obtain a 30% speedup and a 2% accuracy improvement.
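A minimal sketch of the deep transfer learning idea on per-cell features, under stated assumptions: a small fully connected DNN is taken as already trained on a source assay, its feature layers are frozen, and only a new classification head is fitted to the target MOA classes. The architecture sizes, class counts, and data below are hypothetical, not the paper's exact setup.

import torch
from torch import nn

def make_dnn(in_dim, n_classes, hidden=(256, 128)):
    """Small fully connected DNN mapping per-cell features to MOA classes."""
    layers, d = [], in_dim
    for h in hidden:
        layers += [nn.Linear(d, h), nn.ReLU()]
        d = h
    return nn.Sequential(*layers), nn.Linear(d, n_classes)

features, head = make_dnn(in_dim=453, n_classes=12)  # hypothetical sizes
# ... assume (features, head) were trained on a large source assay here ...

# Deep transfer learning: reuse the trained feature layers, retrain the head.
for p in features.parameters():
    p.requires_grad = False          # freeze transferred layers
new_head = nn.Linear(128, 8)         # hypothetical: target assay has 8 MOA classes
opt = torch.optim.Adam(new_head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 453)             # a batch of per-cell feature vectors (dummy)
y = torch.randint(0, 8, (64,))       # dummy MOA labels
for _ in range(5):                   # brief fine-tuning loop
    opt.zero_grad()
    loss = loss_fn(new_head(features(x)), y)
    loss.backward()
    opt.step()
print(float(loss))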
Abstract:
This paper presents a tool for the analysis and regeneration of Web content, implemented with XML and Java. At present, Web content is delivered from server to clients without taking the clients' characteristics into account. Heterogeneous and diverse characteristics, such as the user's preferences, the varying capabilities of client devices, different types of access, the state of the network, and the current load on the server, directly affect the behavior of Web services. Moreover, the growing use of multimedia objects in the design of Web content ignores this diversity and heterogeneity, which further hinders appropriate content delivery. Thus, the objective of the presented tool is to process Web pages taking the aforementioned heterogeneity into account and to adapt content in order to improve performance on the Web.
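The adaptation idea can be illustrated independently of the paper's XML/Java implementation; the following Python sketch selects a variant of a multimedia object from simple client characteristics, with hypothetical profile fields and variants.

# Hypothetical content adaptation: choose an image variant per client profile.
VARIANTS = [  # (min_bandwidth_kbps, min_screen_width_px, variant)
    (2000, 1280, "photo_1920.jpg"),
    (500, 640, "photo_960.jpg"),
    (0, 0, "photo_320.jpg"),
]

def adapt(client):
    """Return the richest variant the client's bandwidth and screen allow."""
    for bw, width, variant in VARIANTS:
        if client["bandwidth_kbps"] >= bw and client["screen_px"] >= width:
            return variant
    return VARIANTS[-1][2]

print(adapt({"bandwidth_kbps": 800, "screen_px": 720}))   # -> photo_960.jpg
print(adapt({"bandwidth_kbps": 100, "screen_px": 320}))   # -> photo_320.jpg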
Analysis and evaluation of techniques for the extraction of classes in the ontology learning process
Abstract:
This paper analyzes and evaluates, in the context of ontology learning, some techniques for identifying and extracting candidate terms for the classes of a taxonomy. In addition, this work points out some inconsistencies that may occur in the preprocessing of a text corpus and proposes techniques for obtaining good candidate terms for the classes of a taxonomy.
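One widely used family of candidate-term extraction techniques ranks corpus n-grams by TF-IDF; the sketch below is an illustrative example of that kind, with a toy corpus, and is not necessarily one of the techniques evaluated in the paper.

from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical mini-corpus; real ontology learning would use a domain corpus.
corpus = [
    "The library catalog indexes books, journals and theses.",
    "Indexing assigns subject headings to books and journals.",
    "A thesaurus relates subject headings used in the catalog.",
]

# Rank unigrams and bigrams by mean TF-IDF as candidate class terms.
vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
tfidf = vec.fit_transform(corpus)
scores = tfidf.mean(axis=0).A1
terms = vec.get_feature_names_out()
candidates = sorted(zip(terms, scores), key=lambda t: -t[1])[:5]
print(candidates)  # top-ranked candidate terms for taxonomy classes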