821 results for granular computing
Abstract:
The article proposes granular computing as a theoretical, formal and methodological basis for the newly emerging research field of human–data interaction (HDI). We argue that the ability to represent and reason with information granules is a prerequisite for data legibility. As such, it allows for extending the research agenda of HDI to encompass the topic of collective intelligence amplification, which is seen as an opportunity of today’s increasingly pervasive computing environments. As an example of collective intelligence amplification in HDI, we introduce a collaborative urban planning use case in a cognitive city environment and show how an iterative process of user input and human-oriented automated data processing can support collective decision making. As a basis for automated human-oriented data processing, we use the spatial granular calculus of granular geometry.
Abstract:
Web document cluster analysis plays an important role in information retrieval by organizing large amounts of documents into a small number of meaningful clusters. Traditional web document clustering is based on the Vector Space Model (VSM), which takes into account only two levels (document and term) of knowledge granularity but ignores the bridging paragraph granularity. This two-level granularity may lead to unsatisfactory clustering results with “false correlation”. To deal with this problem, a Hierarchical Representation Model with Multi-granularity (HRMM), which consists of a five-layer representation of data and a two-phase clustering process, is proposed based on granular computing and article structure theory. To deal with the zero-valued similarity problem resulting from the sparse term-paragraph matrix, an ontology-based strategy and a tolerance-rough-set-based strategy are introduced into HRMM. By using granular computing, structural knowledge hidden in documents can be captured more efficiently and effectively in HRMM, and thus web document clusters of higher quality can be generated. Extensive experiments show that HRMM, HRMM with the tolerance-rough-set strategy, and HRMM with ontology all significantly outperform VSM and a representative non-VSM-based algorithm, WFP, in terms of F-Score.
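As a minimal illustration of the VSM baseline this abstract argues against, the toy sketch below (an assumption-laden example, not HRMM itself; `tf_vector` and `cosine` are hypothetical helpers) compares cosine similarity computed on flat document vectors with similarity computed at the finer paragraph granularity:

```python
from collections import Counter
from math import sqrt

def tf_vector(text):
    """Bag-of-words term-frequency vector (the core of the VSM)."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = sqrt(sum(c * c for c in u.values()))
    nv = sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Document-level similarity: the flat two-level (document/term) VSM view
d1 = "granular computing for web document clustering"
d2 = "web document clustering with rough sets"
doc_sim = cosine(tf_vector(d1), tf_vector(d2))

# Paragraph-level similarity: compare paragraph pairs and take the best
# match, exposing structure the flat document vectors blur together
p1 = ["granular computing basics", "web document clustering methods"]
p2 = ["web document clustering with rough sets"]
para_sim = max(cosine(tf_vector(a), tf_vector(b)) for a in p1 for b in p2)
print(round(doc_sim, 3), round(para_sim, 3))
```

Here the best-matching paragraph pair scores higher than the whole documents do, which is the kind of structural signal a multi-granularity representation can exploit.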
Abstract:
In order to address problems of information overload in digital imagery task domains we have developed an interactive approach to the capture and reuse of image context information. Our framework models different aspects of the relationship between images and domain tasks they support by monitoring the interactive manipulation and annotation of task-relevant imagery. The approach allows us to gauge a measure of a user's intentions as they complete goal-directed image tasks. As users analyze retrieved imagery their interactions are captured and an expert task context is dynamically constructed. This human expertise, proficiency, and knowledge can then be leveraged to support other users in carrying out similar domain tasks. We have applied our techniques to two multimedia retrieval applications for two different image domains, namely the geo-spatial and medical imagery domains. © Springer-Verlag Berlin Heidelberg 2007.
Abstract:
In different fields, the conception of a granule is applied both to a group of elements defined by internal properties and to an inseparable whole reflecting external properties. Granular computing may be interpreted in terms of abstraction, generalization, clustering, levels of abstraction, levels of detail, and so on. We propose multialgebraic systems as a mathematical tool for the synthesis and analysis of granules and granule structures, and prove a theorem giving necessary and sufficient conditions for the existence of multialgebraic systems.
Abstract:
Intelligent systems are now inherent to society, supporting synergistic human–machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems. The quality of software functioning may invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis for human-centered applications, and their interpretability is one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic, life-long system operation. As most software registers its inner events by means of logs, log analysis is an approach to maintaining system operation. Logs are Big-data streams of large flow that are unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods that provide maintenance solutions for anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses. LP provides deeper semantic interpretation of anomalous occurrences. The solutions evolve over time and are general-purpose, being highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), evolving Granular Neural Network (eGNN), and evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed to automate the parsing of system logs. All the methods perform recursive mechanisms to create, update, merge, and delete information granules according to the behavior of the data. For the first time in the evolving-intelligent-systems literature, the proposed method, eLP, is able to process streams of words and sentences.
In terms of AD accuracy, FBeM achieved (85.64 ± 3.69)%; eGNN reached (96.17 ± 0.78)%; eGFC obtained (92.48 ± 1.21)%; and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP in particular generates a log grammar and presents a higher level of model interpretability.
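The create/update cycle shared by evolving granular classifiers can be sketched minimally as follows (illustrative only: `IntervalGranule`, the expandability threshold `rho`, and the toy stream are assumptions, not the actual FBeM/eGNN/eGFC/eLP rules):

```python
# Minimal sketch of an evolving granular classifier over a 1-D stream:
# a granule is an interval with a class label; samples either expand an
# existing granule or create a new one. (Toy model, not FBeM/eGFC.)
class IntervalGranule:
    def __init__(self, x, label):
        self.lo = self.hi = x
        self.label = label

    def covers(self, x, rho=0.1):
        # rho is an assumed expandability threshold
        return self.lo - rho <= x <= self.hi + rho

    def update(self, x):
        self.lo, self.hi = min(self.lo, x), max(self.hi, x)

def learn(stream):
    granules = []
    for x, label in stream:
        g = next((g for g in granules
                  if g.covers(x) and g.label == label), None)
        if g:
            g.update(x)                                  # update step
        else:
            granules.append(IntervalGranule(x, label))   # create step
    return granules

granules = learn([(0.10, "normal"), (0.15, "normal"), (0.90, "anomaly")])
print(len(granules))  # prints 2: one granule per region/label
```

Real evolving systems add the merge and delete mechanisms the abstract mentions; this sketch only shows the create/update core.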
Abstract:
We show the effects of the granular structure of the initial conditions of a hydrodynamic description of high-energy nucleus-nucleus collisions on some observables, especially on the elliptic-flow parameter v_2. Such a structure enhances the production of isotropically distributed high-p_T particles, making v_2 smaller there. Also, it reduces v_2 in the forward and backward regions, where the global matter density is smaller and, therefore, where such effects become more efficacious.
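In the event-plane picture, the elliptic-flow parameter is the average v_2 = <cos 2(phi − Psi)> over particle azimuthal angles phi relative to the event plane Psi. A quick numerical check (assuming a known Psi, which is a simplification) shows that isotropic emission, as from the granular high-p_T hot spots discussed above, drives v_2 toward zero:

```python
import math
import random

def elliptic_flow(phis, psi=0.0):
    """Event-plane estimate of the elliptic-flow coefficient
    v2 = <cos 2(phi - psi)> over particle azimuthal angles."""
    return sum(math.cos(2.0 * (p - psi)) for p in phis) / len(phis)

# Isotropic azimuthal emission: v2 should vanish up to statistical noise
random.seed(0)
iso = [random.uniform(0.0, 2.0 * math.pi) for _ in range(100_000)]
v2_iso = elliptic_flow(iso)
print(abs(v2_iso) < 0.02)
```

With 10^5 particles the statistical spread of the estimator is about sqrt(0.5/N) ≈ 0.002, so the isotropic value sits well below typical flow signals.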
Abstract:
This study aimed to determine the efficiency of an anaerobic stirred sequencing-batch reactor containing granular biomass for the degradation of linear alkylbenzene sulfonate (LAS), a surfactant present in household detergent. The bioreactor was monitored for LAS concentrations in the influent, effluent and sludge, pH, chemical oxygen demand, bicarbonate alkalinity, total solids, and volatile solids. The degradation of LAS was found to be higher in the absence of co-substrates (53%) than in their presence (24-37%). Using the polymerase chain reaction and denaturing gradient gel electrophoresis (PCR/DGGE), we identified populations of microorganisms from the Bacteria and Archaea domains. Among the Bacteria, we identified uncultivated populations of Arcanobacterium spp. (94%) and Opitutus spp. (96%). Among the Archaea, we identified Methanospirillum spp. (90%), Methanosaeta spp. (98%), and Methanobacterium spp. (96%). The presence of methanogenic microorganisms shows that LAS did not inhibit anaerobic digestion. Sampling at the last stage of reactor operation recovered 61 clones belonging to the domain Bacteria. These represented a variety of phyla: 34% shared significant homology with Bacteroidetes, 18% with Proteobacteria, 11% with Verrucomicrobia, 8% with Fibrobacteres, 2% with Acidobacteria, 3% with Chlorobi and Firmicutes, and 1% with Acidobacteres and Chloroflexi. A small fraction of the clones (13%) were not related to any phylum. Published by Elsevier Ltd.
Abstract:
We investigate in detail the effects of a QND vibrational number measurement made on single ions in a recently proposed measurement scheme for the vibrational state of a register of ions in a linear rf trap [C. D'Helon and G. J. Milburn, Phys. Rev. A 54, 5141 (1996)]. The performance of the measurement shows some interesting patterns which are closely related to searching.
Abstract:
Expokit provides a set of routines aimed at computing matrix exponentials. More precisely, it computes either a small matrix exponential in full, the action of a large sparse matrix exponential on an operand vector, or the solution of a system of linear ODEs with constant inhomogeneity. The backbone of the sparse routines consists of matrix-free Krylov subspace projection methods (Arnoldi and Lanczos processes), which is why the toolkit is capable of coping with sparse matrices of large dimension. The software handles real and complex matrices and provides specific routines for symmetric and Hermitian matrices. The computation of matrix exponentials is a numerical issue of critical importance in the area of Markov chains, where, furthermore, the computed solution is subject to probabilistic constraints. In addition to addressing general matrix exponentials, particular attention is given to the computation of transient states of Markov chains.
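Expokit itself is distributed as Fortran/Matlab routines, but the same sparse "action of the exponential on a vector" computation can be sketched with SciPy's Krylov-based `expm_multiply` (an analogous routine, not Expokit; the tiny generator matrix below is an assumed example):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import expm_multiply

# Generator matrix Q of a tiny continuous-time Markov chain
# (rows sum to zero; Expokit's Markov routines target exactly this case)
Q = csr_matrix(np.array([[-1.0,  1.0,  0.0],
                         [ 0.5, -1.5,  1.0],
                         [ 0.0,  2.0, -2.0]]))

p0 = np.array([1.0, 0.0, 0.0])  # initial probability distribution
t = 0.75

# Transient distribution p(t) = p0 exp(Q t), computed as the action of
# the exponential on a vector, without ever forming exp(Q t) in full
pt = expm_multiply(Q.T * t, p0)
print(pt.sum())  # probabilistic constraint: entries still sum to 1
```

For a 3x3 matrix the dense exponential would of course be fine; the point of the action-on-a-vector formulation is that it scales to the large sparse generators arising in Markov-chain analysis.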
Abstract:
This article reports on the liquid-phase adsorption of flavour esters onto granular activated carbon. Ethyl propionate, ethyl butyrate, and ethyl isovalerate were used as adsorbates, and Filtrasorb 400 activated carbon was chosen as the adsorbent. The Sips, Toth, Unilan, and Dubinin-Radushkevich isotherm equations, which are generally used for heterogeneous adsorbents, were used to fit the data. Although satisfactory in fitting the data, inconsistencies in parameter values indicated the former models to be inadequate. The Dubinin-Radushkevich model, on the other hand, gave more consistent and meaningful parameter values and adsorption capacities. By employing the Dubinin-Radushkevich equation, the limiting volume of the adsorbed space, which equals the accessible micropore volume, was determined and found to correlate with the value from carbon dioxide adsorption.
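The Dubinin-Radushkevich equation referred to above is q = q_m exp(−B ε²), with the Polanyi potential ε = RT ln(1 + 1/C_e). The sketch below fits its linearised form, ln q = ln q_m − B ε², by ordinary least squares on synthetic data generated from assumed parameters (all numbers are illustrative, not taken from the article):

```python
import math

R, T = 8.314, 298.0  # gas constant J/(mol K), temperature K

def dr_q(Ce, qm, B):
    """Dubinin-Radushkevich isotherm: q = qm * exp(-B * eps^2),
    with Polanyi potential eps = R*T*ln(1 + 1/Ce)."""
    eps = R * T * math.log(1.0 + 1.0 / Ce)
    return qm * math.exp(-B * eps * eps)

# Synthetic equilibrium data from assumed (illustrative) parameters
qm_true, B_true = 2.5, 4e-9
Ce = [0.05, 0.1, 0.5, 1.0, 2.0]
q = [dr_q(c, qm_true, B_true) for c in Ce]

# Least-squares fit of the linearised form: ln q = ln qm - B * eps^2
x = [(R * T * math.log(1.0 + 1.0 / c)) ** 2 for c in Ce]
y = [math.log(v) for v in q]
n = len(x)
xb, yb = sum(x) / n, sum(y) / n
slope = (sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y))
         / sum((xi - xb) ** 2 for xi in x))
B_fit = -slope                       # recovered energy parameter
qm_fit = math.exp(yb - slope * xb)   # recovered limiting capacity
print(round(qm_fit, 3), B_fit)
```

Because the data are generated from the model itself, the fit recovers q_m and B exactly; with real adsorption data, the quality of this recovery is precisely the parameter-consistency check the abstract describes.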
Abstract:
We derive a general thermo-mechanical theory for particulate materials consisting of granules of arbitrary shape and size. The kinematics of the granulate is described within the framework of a polar continuum theory whose material points possess three translational and three independent rotational degrees of freedom. Additional field variables are the translational and rotational granular temperatures, the kinetic energies of the velocity and spin fluctuations respectively, and the usual thermodynamic temperature. We distinguish between averages over particle categories (averages in mass/velocity and moment of inertia/spin space, respectively) and particle phases, where the average extends over distinct subsets of particle categories (multi-phase flows). The relationship between the thermal energy in the granular system and phonon energy in a molecular system is briefly discussed in the main body of the paper and discussed in detail in Appendix A. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
We present a scheme which offers a significant reduction in the resources required to implement linear optics quantum computing. The scheme is a variation of the proposal of Knill, Laflamme and Milburn, and makes use of an incremental approach to the error encoding to boost the probability of success.