862 results for Algorithms, Properties, the KCube Graphs
Abstract:
Directed hypergraphs have been employed in problems related to propositional logic, relational databases, computational linguistics and machine learning. They have also been used as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot easily be modelled using binary relations alone. In this context, this kind of representation is known as a hyper-network. A directed hypergraph is a generalization of a directed graph that is particularly well suited to representing many-to-many relationships. While an edge in a directed graph defines a relation between two of its nodes, a hyperedge in a directed hypergraph defines a relation between two sets of its nodes. Strong connectivity is an equivalence relation that partitions the set of nodes of a directed hypergraph, and each partition defines an equivalence class known as a strongly-connected component. Studying the strongly-connected components of a directed hypergraph can lead to a better understanding of the structure of this kind of hypergraph when it is large. For directed graphs, there are very efficient algorithms for computing the strongly-connected components of large graphs. Thanks to these algorithms, it has been shown that the structure of the WWW has a "bow-tie" shape, where more than 70% of the nodes are distributed into three large sets, one of which is a strongly-connected component. This kind of structure has also been observed in complex networks in other areas, such as biology. Studies of a similar nature could not be conducted on directed hypergraphs because no algorithms existed for computing the strongly-connected components of this kind of hypergraph.
In this doctoral thesis, we have investigated how to compute the strongly-connected components of a directed hypergraph. Specifically, we have developed two algorithms for this problem and have established their correctness and computational complexity. Both algorithms have been evaluated empirically to compare their running times. For the evaluation, we produced a selection of randomly generated directed hypergraphs inspired by well-known random graph models such as Erdős-Rényi, Newman-Watts-Strogatz and Barabási-Albert. Several optimizations of both algorithms are implemented and analysed in the thesis. In particular, collapsing the strongly-connected components of the directed graph that can be built by removing certain complex hyperedges from the original directed hypergraph notably improves the running times of the algorithms on several of the hypergraphs used in the evaluation. Apart from the application examples mentioned above, directed hypergraphs have also been employed in the area of knowledge representation. Specifically, this kind of hypergraph has been used to compute the modules of an ontology. An ontology can be defined as a set of axioms that formally specifies a set of symbols and their relationships, while a module can be understood as a subset of the axioms of the ontology that captures all the knowledge the ontology stores about a specific set of symbols and their relationships. In the thesis we focus only on modules computed using the syntactic locality technique. Since ontologies can be very large, the computation of modules can facilitate their reuse and maintenance.
However, analysing all possible modules of an ontology is, in general, very costly because the number of modules grows exponentially with the number of symbols and axioms of the ontology. Fortunately, the axioms of an ontology can be divided into partitions known as atoms. Each atom represents a maximal set of axioms that always appear together in a module. The atomic decomposition of an ontology is defined as a directed graph in which each node corresponds to an atom and each edge defines a dependency between a pair of atoms. In this thesis we introduce the notion of an "axiom dependency hypergraph", which generalizes the atomic decomposition of an ontology. A module of an ontology corresponds to a connected component in this kind of hypergraph, and an atom of an ontology to a strongly-connected component. We have adapted the implementation of our algorithms so that they also work with axiom dependency hypergraphs and can thereby compute the atoms of an ontology. To demonstrate the viability of this idea, we have incorporated our algorithms into an application we developed for module extraction and atomic decomposition of ontologies. We have named this application HyS, and we have studied its running times using a selection of well-known ontologies from the biomedical domain, most of them available in the NCBO Bioportal. The results of the evaluation show that the running times of HyS are much better than those of the fastest known implementations.
ABSTRACT
Directed hypergraphs are an intuitive modelling formalism that have been used in problems related to propositional logic, relational databases, computational linguistics and machine learning.
Directed hypergraphs are also presented as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot naturally be modelled as binary relations. In this context, they are known as hyper-networks. A directed hypergraph is a generalization of a directed graph suitable for representing many-to-many relationships. While an edge in a directed graph defines a relation between two nodes of the graph, a hyperedge in a directed hypergraph defines a relation between two sets of nodes. Strong connectivity is an equivalence relation that induces a partition of the set of nodes of a directed hypergraph into strongly-connected components. These components can be collapsed into single nodes. As a result, the size of the original hypergraph can be significantly reduced if the strongly-connected components have many nodes. This approach might contribute to a better understanding of how the nodes of a hypergraph are connected, in particular when the hypergraphs are large. In the case of directed graphs, there are efficient algorithms that can be used to compute the strongly-connected components of large graphs. For instance, it has been shown that the macroscopic structure of the World Wide Web can be represented as a "bow-tie" diagram where more than 70% of the nodes are distributed into three large sets, one of which is a large strongly-connected component. This particular structure has also been observed in complex networks in other fields, such as biology. Similar studies could not be conducted on directed hypergraphs because no algorithm existed for computing the strongly-connected components of a hypergraph. In this thesis, we investigate ways to compute the strongly-connected components of directed hypergraphs. We present two new algorithms and we show their correctness and computational complexity. One of these algorithms is inspired by Tarjan's algorithm for directed graphs.
The second algorithm follows a simple approach to computing the strongly-connected components. This approach is based on the fact that two nodes of a graph that are strongly connected can reach exactly the same nodes; in other words, their reachable sets coincide. Both algorithms are empirically evaluated to compare their performance. To this end, we have produced a selection of random directed hypergraphs inspired by well-known random graph models like Erdős-Rényi and Newman-Watts-Strogatz. Besides the application examples that we mentioned earlier, directed hypergraphs have also been employed in the field of knowledge representation. In particular, they have been used to compute the modules of an ontology. An ontology is defined as a collection of axioms that provides a formal specification of a set of terms and their relationships, and a module is a subset of an ontology that completely captures the meaning of certain terms as defined in the ontology. In particular, we focus on the modules computed using the notion of syntactic locality. As ontologies can be very large, the computation of modules facilitates the reuse and maintenance of these ontologies. Analysing all modules of an ontology, however, is in general not feasible, as the number of modules grows exponentially in the number of terms and axioms of the ontology. Nevertheless, the modules can succinctly be represented using the Atomic Decomposition of an ontology. Using this representation, an ontology can be partitioned into atoms, which are maximal sets of axioms that co-occur in every module. The Atomic Decomposition is then defined as a directed graph such that each node corresponds to an atom and each edge represents a dependency relation between two atoms. In this thesis, we introduce the notion of an axiom dependency hypergraph, which is a generalization of the atomic decomposition of an ontology.
A module in the ontology corresponds to a connected component in the hypergraph, and the atoms of the ontology to the strongly-connected components. We apply our algorithms for directed hypergraphs to axiom dependency hypergraphs and, in this manner, compute the atoms of an ontology. To demonstrate the viability of this approach, we have implemented the algorithms in the application HyS, which computes the modules of ontologies and calculates their atomic decomposition. In the thesis, we provide an experimental evaluation of HyS with a selection of large and prominent biomedical ontologies, most of which are available in the NCBO Bioportal. HyS outperforms state-of-the-art implementations in the tasks of extracting modules and computing the atomic decomposition of these ontologies.
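The reachability-based idea behind the second algorithm can be illustrated, for ordinary directed graphs rather than directed hypergraphs, with a minimal Python sketch: two nodes belong to the same strongly-connected component exactly when each can reach the other.

```python
from collections import defaultdict

def reachable(graph, start):
    """All nodes reachable from `start` by iterative depth-first search."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, ()))
    return seen

def strongly_connected_components(graph):
    """Group nodes by mutual reachability: u and v share an SCC
    iff u reaches v and v reaches u."""
    nodes = set(graph) | {w for vs in graph.values() for w in vs}
    reach = {v: reachable(graph, v) for v in nodes}
    components = defaultdict(set)
    for v in nodes:
        # The SCC of v is every node that v reaches and that reaches v back.
        key = frozenset(u for u in reach[v] if v in reach[u])
        components[key].add(v)
    return list(components.values())

# A 3-cycle {a, b, c} with a tail node d: two components.
g = {"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": []}
sccs = strongly_connected_components(g)
```

This quadratic approach is far less efficient than Tarjan's linear-time algorithm, but it is the natural starting point for generalizing to hypergraphs, where reachability itself must be redefined.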
Abstract:
The purpose of this research is to characterise the mechanical properties of multicrystalline silicon for photovoltaic applications that was crystallised from silicon feedstock with a high content of several types of impurities. The mechanical strength, fracture toughness and elastic modulus were measured at different positions within a multicrystalline silicon block to quantify the effect of impurity segregation on these mechanical properties. The microstructure and fracture surfaces of the samples were exhaustively analysed with a scanning electron microscope in order to correlate the values of the mechanical properties with the material microstructure. Fracture stress values were treated statistically using Weibull statistics. The results of this research show that metals segregate to the top of the block, produce moderate microcracking and introduce high thermal stresses. Silicon oxide is produced at the bottom part of the silicon block, and its presence significantly reduces the mechanical strength and fracture toughness of multicrystalline silicon due to both the thermal and the elastic mismatch between silicon and the silicon oxide inclusions. Silicon carbide inclusions from the upper parts of the block increase the fracture toughness and elastic modulus of multicrystalline silicon. Additionally, the mechanical strength of multicrystalline silicon can increase when the radius of the silicon carbide inclusions is smaller than ~10 µm. The most damaging type of impurity inclusion for the multicrystalline silicon block studied in this work was amorphous silicon oxide. The oriented precipitation of silicon oxide at grain and twin boundaries eases the formation of radial cracks between inclusions and significantly decreases the mechanical strength of multicrystalline silicon.
The second most damaging type of impurity inclusion was metals such as aluminium and copper, which cause spontaneous microcracking in their surroundings after the crystallisation process, thereby reducing the mechanical response of multicrystalline silicon. Solar cell producers should therefore pay attention to the content of metals and oxygen in the silicon feedstock in order to produce solar cells with reliable mechanical properties.
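The Weibull treatment of fracture stresses mentioned above can be sketched as follows: a generic two-parameter Weibull fit using the standard linearisation with the median-rank probability estimator. The numerical parameters below (m = 10, sigma0 = 150 MPa) and the synthetic sample are illustrative assumptions, not the values measured in this work.

```python
import math
import random

def weibull_fit(stresses):
    """Estimate the Weibull modulus m and characteristic strength sigma0
    from fracture stress data via the linearisation
    ln(ln(1/(1-P))) = m*ln(sigma) - m*ln(sigma0),
    with the median-rank probability estimator P_i = (i + 0.5) / n."""
    s = sorted(stresses)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(math.log(1.0 / (1.0 - (i + 0.5) / n))) for i in range(n)]
    mx = sum(xs) / n
    my = sum(ys) / n
    # Ordinary least squares: the slope of the fitted line is the modulus m.
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    sigma0 = math.exp(mx - my / m)
    return m, sigma0

# Synthetic fracture stresses drawn from a known Weibull law
# (m = 10, sigma0 = 150 MPa) by inverse-transform sampling.
rng = random.Random(0)
sample = [150.0 * (-math.log(rng.random())) ** (1.0 / 10.0) for _ in range(200)]
m_est, sigma0_est = weibull_fit(sample)
```

A higher fitted modulus m indicates a narrower scatter of fracture stresses, which is why Weibull statistics are the standard tool for comparing the strength reliability of brittle materials such as multicrystalline silicon.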
Abstract:
The cytosolic 70-kDa heat shock proteins (Hsp70s), Ssa and Ssb, of Saccharomyces cerevisiae are functionally distinct. Here we report that the ATPase activities of these two classes of Hsp70s exhibit different kinetic properties. The Ssa ATPase has properties similar to those of other Hsp70s studied, such as DnaK and Hsc70. Ssb, however, has an unusually low steady-state affinity for ATP but a higher maximal velocity. In addition, like that of other Hsp70s, the ATPase activity of Ssa1 depends on the addition of K+, whereas Ssb activity does not. Surprisingly, the isolated 44-kDa ATPase domain of Ssb has a Km and Vmax for ATP hydrolysis similar to those of Ssa, rather than those of full-length Ssb. Analysis of Ssa/Ssb fusion proteins demonstrates that the Ssb peptide-binding domain fused to the Ssa ATPase domain generates an ATPase with relatively high activity and low steady-state affinity for ATP, similar to that of native Ssb. Therefore, at least some of the biochemical differences between the ATPases of these two classes of Hsp70s are not intrinsic to the ATPase domain itself. The differential influence of the peptide-binding domain on the ATPase domain may, in part, explain the functional uniqueness of these two classes of Hsp70s.
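The Km/Vmax comparison can be illustrated with the Michaelis-Menten rate law. The parameter values below are hypothetical, chosen only to show how a low-Km, low-Vmax enzyme ("Ssa-like") and a high-Km, high-Vmax enzyme ("Ssb-like") compare at low versus saturating substrate concentrations; they are not the measured values from this study.

```python
def mm_rate(s, vmax, km):
    """Michaelis-Menten rate: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

# Hypothetical parameter sets, for illustration only (not measured values):
ssa_like = {"vmax": 1.0, "km": 0.1}  # high ATP affinity, modest Vmax
ssb_like = {"vmax": 3.0, "km": 2.0}  # low steady-state affinity, higher Vmax

# At low [ATP] the high-affinity enzyme turns over faster; at saturating
# [ATP] the high-Vmax enzyme wins.
low_s = [mm_rate(0.05, **p) for p in (ssa_like, ssb_like)]
high_s = [mm_rate(50.0, **p) for p in (ssa_like, ssb_like)]
```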
Abstract:
The emotif database is a collection of more than 170 000 highly specific and sensitive protein sequence motifs representing conserved biochemical properties and biological functions. These protein motifs are derived from 7697 sequence alignments in the BLOCKS+ database (released on June 23, 2000) and all 8244 protein sequence alignments in the PRINTS database (version 27.0) using the emotif-maker algorithm developed by Nevill-Manning et al. (Nevill-Manning,C.G., Wu,T.D. and Brutlag,D.L. (1998) Proc. Natl Acad. Sci. USA, 95, 5865–5871; Nevill-Manning,C.G., Sethi,K.S., Wu,T.D. and Brutlag,D.L. (1997) ISMB-97, 5, 202–209). Since the amino acids and the groups of amino acids in these sequence motifs represent critical positions conserved in evolution, search algorithms employing the emotif patterns can identify and classify more widely divergent sequences than methods based on global sequence similarity. The emotif protein pattern database is available at http://motif.stanford.edu/emotif/.
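The kind of pattern-based search that emotif enables can be sketched by compiling a sequence motif into a regular expression. The sketch below assumes a PROSITE-like motif syntax purely for illustration; emotif's own pattern syntax differs in detail.

```python
import re

def prosite_to_regex(pattern):
    """Compile a PROSITE-style motif (e.g. 'C-x(2)-[ILV]-G') into a regex:
    'x' is any residue, '(n)' a repeat count, '[...]' an allowed set,
    '{...}' a forbidden set."""
    out = []
    for tok in pattern.split("-"):
        tok = tok.replace("x", ".")                        # wildcard position
        tok = re.sub(r"\((\d+(?:,\d+)?)\)", r"{\1}", tok)  # repeat counts
        tok = re.sub(r"\{([A-Z]+)\}", r"[^\1]", tok)       # forbidden sets
        out.append(tok)
    return "".join(out)

motif = prosite_to_regex("C-x(2)-[ILV]-G")
hit = re.search(motif, "AACKRVGAA")  # matches the substring "CKRVG"
```

Because the bracketed groups encode evolutionarily conserved alternatives rather than exact residues, such patterns can match divergent sequences that global similarity search would miss, which is the point the abstract makes about emotif.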
Abstract:
This study aimed to determine the properties of Nantes carrots during hot-air drying at three different temperatures (50, 60 and 70 ºC). The chemical properties evaluated were: moisture, protein, fibre, ash, sugars and water activity; the physical properties were: texture, color, density and porosity. The results showed that drying at 70 ºC affected the chemical properties the most. Regarding texture, similar changes were recorded for hardness, gumminess and chewiness, with the temperature of 70 ºC again affecting these properties the most. Regarding color, the variations in a* and b* along drying were in general not meaningful, although some discoloration was observed (increase in L*). The porosity increased due to the decrease in humidity. The final porosity measured for the carrots dried at 70 ºC was, however, lower than that for 50 and 60 ºC.
Abstract:
Vermicompost filtration is a new on-site waste treatment system; consequently, little is known about the filter medium properties. The aim of this preliminary study was to quantify physical and compositional properties of vermicompost filter beds that had been used to treat domestic solid organic waste and wastewater. This paper presents the trials performed on pilot-scale reactors filled with vermicompost from a full-scale vermicompost filtration system. Household solid organic waste and raw wastewater at a rate of 130 L/m²/d were applied to the reactor bed surface over a four-month period. It was found that fresh casts laid on the bed surface had a BOD of 1290 mg/g VS, while casts buried to a depth of 10 cm had a BOD of 605 mg/g VS. Below this depth there was little further biodegradation of earthworm casts despite cast ages of up to five years. Solid material in the reactor accounted for only 7-10% of the reactor volume. The total voidage comprised large free-draining macropores, which accounted for 15-20% of the reactor volume, and micropores able to hold water against gravity, which accounted for 60-70%. It was shown that water could flow through both the micropores and the macropores of the medium following a wastewater application. The wastewater flow characteristics were modeled by a two-region model based on the Richards equation, which describes flow through spatially heterogeneous porous materials.
Abstract:
Beyond the inherent technical challenges, current research into the three dimensional surface correspondence problem is hampered by a lack of uniform terminology, an abundance of application specific algorithms, and the absence of a consistent model for comparing existing approaches and developing new ones. This paper addresses these challenges by presenting a framework for analysing, comparing, developing, and implementing surface correspondence algorithms. The framework uses five distinct stages to establish correspondence between surfaces. It is general, encompassing a wide variety of existing techniques, and flexible, facilitating the synthesis of new correspondence algorithms. This paper presents a review of existing surface correspondence algorithms, and shows how they fit into the correspondence framework. It also shows how the framework can be used to analyse and compare existing algorithms and develop new algorithms using the framework's modular structure. Six algorithms, four existing and two new, are implemented using the framework. Each implemented algorithm is used to match a number of surface pairs. Results demonstrate that the correspondence framework implementations are faithful implementations of existing algorithms, and that powerful new surface correspondence algorithms can be created. (C) 2004 Elsevier Inc. All rights reserved.
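A minimal sketch of such a staged, modular framework is given below. The five stage names are placeholders invented for illustration; the paper defines five distinct stages, but this abstract does not name them.

```python
class CorrespondencePipeline:
    """A surface correspondence algorithm expressed as an ordered sequence
    of interchangeable stages. The stage names here are hypothetical."""

    STAGES = ("preprocess", "feature_extraction", "initial_matching",
              "refinement", "output")

    def __init__(self):
        self._impl = {}

    def register(self, stage, fn):
        """Plug an implementation into one stage; returns self for chaining."""
        if stage not in self.STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self._impl[stage] = fn
        return self

    def run(self, data):
        """Pass the data through all five stages; unset stages are identity."""
        for stage in self.STAGES:
            data = self._impl.get(stage, lambda x: x)(data)
        return data

# Different "algorithms" share the framework but swap stage implementations.
p = CorrespondencePipeline()
p.register("preprocess", sorted).register("refinement", lambda d: d[:2])
result = p.run([3, 1, 2])
```

The design choice mirrors the abstract's claim: fixing the stage interface is what lets existing algorithms be compared like-for-like and new ones be assembled from existing parts.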
Abstract:
The introduction of mesoporous nanosized zirconia into a methanol synthesis catalyst endows the catalyst with the combined properties of a nanosized and a mesoporous material. The catalyst has a larger surface area, larger mesopore volume, more uniform pore diameter, and more surface metal atoms and oxygen vacancies than a catalyst prepared by the conventional coprecipitation method. The modified microstructure and electronic effects change the reduced chemical state of copper and lower its reduction temperature, giving the catalyst higher activity and methanol selectivity. The results of methanol synthesis demonstrate that Cu+ is the optimum active site. Also, the interaction between the copper and zirconia shows a synergistic effect in methanol synthesis.
Abstract:
The verification of information flow properties of security devices is difficult because it involves the analysis of schematic diagrams, artwork, embedded software, etc. In addition, a typical security device has many modes, partial information flow, and needs to be fault tolerant. We propose a new approach to the verification of such devices based upon checking abstract information flow properties expressed as graphs. This approach has been implemented in software, and successfully used to find possible paths of information flow through security devices.
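In the simplest case, checking an information flow property expressed as a graph reduces to path finding: is there any route by which information can travel from a sensitive source to an observable sink? A minimal sketch with hypothetical node names:

```python
from collections import deque

def find_flow_path(edges, source, sink):
    """Return one path of possible information flow from source to sink
    in an abstract flow graph (list of directed edges), or None."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    parent = {source: None}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if node == sink:
            # Reconstruct the path by walking parents back to the source.
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in adj.get(node, ()):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    return None

# A toy device model (hypothetical node names): the sensitive 'red' side
# should only reach the 'black' side through the crypto unit.
edges = [("red", "crypto"), ("crypto", "black"),
         ("red", "bypass"), ("bypass", "black")]
path = find_flow_path(edges, "red", "black")
```

In a real analysis the interesting output is a path like `red → bypass → black` that avoids the crypto unit, since each such path is a candidate security flaw; handling device modes and fault tolerance requires checking the property over every mode's graph.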
Abstract:
Multidimensional compound optimization is a new paradigm in the drug discovery process, yielding efficiencies during early stages and reducing attrition in the later stages of drug development. The success of this strategy relies heavily on understanding this multidimensional data and extracting useful information from it. This paper demonstrates how principled visualization algorithms can be used to understand and explore a large data set created in the early stages of drug discovery. The experiments presented are performed on a real-world data set comprising biological activity data and some whole-molecule physicochemical properties. Data visualization is a popular way of presenting complex data in a simpler form. We have applied powerful principled visualization methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), to help the domain experts (screening scientists, chemists, biologists, etc.) understand the data and draw meaningful conclusions. We also benchmark these principled methods against better-known visualization approaches, principal component analysis (PCA), Sammon's mapping, and self-organizing maps (SOMs), to demonstrate their enhanced power to help the user visualize the large multidimensional data sets encountered during the early stages of the drug discovery process. The results reported clearly show that the GTM and HGTM algorithms allow the user to cluster active compounds for different targets and understand them better than the benchmarks. An interactive software tool supporting these visualization algorithms was provided to the domain experts. The tool supports the domain experts in exploring the projections obtained from the visualization algorithms, providing facilities such as parallel coordinate plots, magnification factors, directional curvatures, and integration with industry-standard software. © 2006 American Chemical Society.
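Of the benchmark methods mentioned, PCA is the simplest to illustrate. A minimal, dependency-free sketch of projecting data onto its first principal axis, using power iteration on the covariance matrix; the toy data set is an illustrative assumption:

```python
import random

def mean_centre(data):
    """Subtract the per-column mean from every row."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    return [[row[j] - means[j] for j in range(d)] for row in data]

def covariance(rows):
    """Sample covariance matrix of mean-centred rows."""
    n, d = len(rows), len(rows[0])
    cov = [[0.0] * d for _ in range(d)]
    for row in rows:
        for i in range(d):
            for j in range(d):
                cov[i][j] += row[i] * row[j] / (n - 1)
    return cov

def power_iteration(mat, steps=200):
    """Dominant eigenvector of a symmetric matrix by repeated multiplication."""
    d = len(mat)
    rng = random.Random(0)
    v = [rng.random() + 0.1 for _ in range(d)]
    for _ in range(steps):
        w = [sum(mat[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def pca_first_axis(data):
    """Coordinates of each row along the first principal axis."""
    centred = mean_centre(data)
    axis = power_iteration(covariance(centred))
    return [sum(r[j] * axis[j] for j in range(len(axis))) for r in centred]

data = [[float(t), 2.0 * t] for t in range(5)]  # collinear toy data
proj = pca_first_axis(data)
```

PCA captures only linear structure, which is exactly why the paper benchmarks it against GTM and HGTM: those methods fit a nonlinear latent manifold and can separate compound clusters that a linear projection collapses.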
Abstract:
The computer systems of today are characterised by data and program control that are distributed functionally and geographically across a network. A major concern in this environment is the operating-system activity of resource management for the different processors in the network. To ensure equity in load distribution and improved system performance, load balancing is often undertaken. The research conducted in this field so far has been primarily concerned with a small set of algorithms operating on tightly-coupled distributed systems. More recent studies have investigated the performance of such algorithms in loosely-coupled architectures, but using a small set of processors. This thesis describes a simulation model developed to study the behaviour and general performance characteristics of a range of dynamic load balancing algorithms. Further, the scalability of these algorithms is discussed and a range of regionalised load balancing algorithms developed. In particular, we examine the impact of network diameter and delay on the performance of such algorithms across a range of system workloads. The results produced suggest that the performance of simple dynamic policies is scalable, but that they lack the load stability of more complex global average algorithms.
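A minimal sketch of the kind of comparison such a simulation model makes: a simple dynamic policy (send each task to the currently least-loaded processor) against random assignment, measured by makespan. The workload model (exponentially distributed service demands) and all parameter values are illustrative assumptions, not the thesis's experimental setup.

```python
import random

def simulate(policy, n_procs=10, n_tasks=2000, seed=42):
    """Assign tasks with random (exponential) service demands to processors
    under a given policy; return the makespan (largest total load)."""
    rng = random.Random(seed)
    loads = [0.0] * n_procs
    for _ in range(n_tasks):
        demand = rng.expovariate(1.0)
        if policy == "random":
            target = rng.randrange(n_procs)
        elif policy == "least_loaded":
            # Simple dynamic policy: send work to the least-loaded processor.
            target = min(range(n_procs), key=loads.__getitem__)
        else:
            raise ValueError(f"unknown policy: {policy}")
        loads[target] += demand
    return max(loads)

balanced = simulate("least_loaded")
unbalanced = simulate("random")
```

This idealised sketch assumes instant, free knowledge of every processor's load; the thesis's point is precisely that in a real loosely-coupled network, collecting that state costs network delay that grows with diameter, which is what separates simple policies from global average algorithms.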
Abstract:
Paper-based phenolic laminates are used extensively in the electrical industry. Many small components are fabricated from these materials by the process known as punching. Recently an investigation was carried out to study the effect of processing variables on the punching properties. It was concluded that further work would be justified and that this should include a critical examination of the resin properties in a more controlled and systematic manner. In this investigation an attempt has been made to assess certain features of the resin structure in terms of thermomechanical properties. The number of crosslinks in the system was controlled using resins based on phenol and para-cresol formulations. Intramolecular hydrogen-bonding effects were examined using substituted resins and a synthetically derived phenol based on 1,3-di-(o-hydroxyphenyl)propane. A resin system was developed using the Friedel-Crafts reaction to examine intermolecular hydrogen bonding at the resin-paper interface. The punching properties of certain selected resins were assessed on a qualitative basis. In addition, flexural and dynamic mechanical properties were determined in a general study of the structure-property relationships of these materials. It has been shown that certain features of the resin structure significantly influence mechanical properties. Further, it was noted that there is a close relationship between punching properties, mechanical damping and flexural strain. This work includes a critical examination of the curing mechanism, and views are postulated in an attempt to extend knowledge in this area of the work. Finally, it is argued that future work should be based on a synthetic approach and that dynamic mechanical testing would provide a powerful tool in developing a deeper understanding of the resin fine structure.
Abstract:
The inference and optimization in sparse graphs with real variables is studied using methods of statistical mechanics. Efficient distributed algorithms for the resource allocation problem are devised. Numerical simulations show excellent performance and full agreement with the theoretical results. © Springer-Verlag Berlin Heidelberg 2006.
Abstract:
This work is directed towards optimizing the radiation pattern of smart antennas using genetic algorithms. The structure of the smart antennas based on Space Division Multiple Access (SDMA) is proposed. It is composed of adaptive antennas, each of which has adjustable weight elements for the amplitudes and phases of signals. The corresponding radiation pattern formula suitable for numerical optimization techniques is deduced. Genetic algorithms are applied to search for the best phase-amplitude weights or phase-only weights with which the optimal radiation pattern can be achieved. One highlight of this work is the proposed optimal radiation pattern concept and its implementation by genetic algorithms. The results show that genetic algorithms are effective for the true signal-to-interference ratio (SIR) design of smart antennas. This means that not only can nulls be placed in the directions of the interfering signals, but main lobes can simultaneously be formed in the directions of the desired signals. The optimal radiation pattern of a smart antenna possessing SDMA capability has been achieved. The second highlight is the weight search by genetic algorithms for the optimal radiation pattern design of antennas with more than one interfering signal. The regular criterion for determining which chromosome should be kept for the next iteration is modified so as to improve the performance of the genetic algorithm iteration. The results show that the modified criterion can speed up the iteration and guarantee its convergence. In addition, a comparison between phase-amplitude perturbations and phase-only perturbations for the radiation pattern design of smart antennas is carried out. The effects of the parameters used by the genetic algorithm on the optimal radiation pattern design are investigated, and valuable results are obtained.
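The phase-only weight search can be sketched as follows: a small genetic algorithm evolves the element phases of a hypothetical 8-element half-wavelength linear array so that the pattern keeps a main lobe toward a desired signal at broadside while suppressing a single interferer at 30°. Every parameter (array size, directions, population settings, fitness weighting) is an illustrative assumption, not the dissertation's configuration.

```python
import cmath
import math
import random

N = 8                         # elements in a uniform linear array (hypothetical)
KD = math.pi                  # k*d for half-wavelength element spacing
THETA_DES = 0.0               # desired signal at broadside
THETA_INT = math.radians(30)  # single interferer direction (illustrative)

def array_factor(phases, theta):
    """Magnitude of the array factor for phase-only weights."""
    s = math.sin(theta)
    return abs(sum(cmath.exp(1j * (n * KD * s + p))
                   for n, p in enumerate(phases)))

def fitness(phases):
    # Reward gain toward the desired user; heavily penalise the interferer.
    return array_factor(phases, THETA_DES) - 10.0 * array_factor(phases, THETA_INT)

def evolve(pop_size=60, gens=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-math.pi, math.pi) for _ in range(N)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 4]                         # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, N)
            child = a[:cut] + b[cut:]                       # one-point crossover
            child[rng.randrange(N)] += rng.gauss(0.0, 0.3)  # Gaussian mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

The fitness function is where the "true SIR" idea lives: instead of only forcing nulls, it trades the gain toward the desired direction against the gain toward the interferer, so the GA shapes the main lobe and the null simultaneously.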
Abstract:
The strong couplings between different degrees of freedom are believed to be responsible for the novel and complex phenomena discovered in transition metal oxides (TMOs). This physical complexity is directly responsible for their tunability, and creating surfaces/interfaces adds an additional 'man-made' twist to the quantum phenomena of these correlated materials. The dissertation focused on the structural and electronic properties near the surface of three prototype TMO compounds, using three complementary techniques: scanning tunneling microscopy, angle-resolved photoelectron spectroscopy and low-energy electron diffraction, with particular emphasis on the effects of broken symmetry and imperfections such as defects on the coupling between charge and lattice degrees of freedom. Ca1.5Sr0.5RuO4 is a layered ruthenate with a square lattice, lying at the boundary of the magnetic/orbital instability in Ca2-xSrxRuO4. The substitution of Sr2+ with Ca2+ causes RuO6 rotation, which narrows the dxy band width and changes the Fermi surface topology. In particular, the γ(dxy) Fermi surface sheet is hole-like in Ca1.5Sr0.5RuO4, in contrast to the electron-like sheet in Sr2RuO4, showing a strong charge-lattice coupling. Na0.75CoO2 is a layered cobaltite with a triangular lattice exhibiting extraordinary thermoelectric properties. A well-ordered CoO2-terminated surface with a random Na distribution was observed; however, the lattice constants of the surface are smaller than those in the bulk. The surface density of states (DOS) showed strong temperature dependence. In particular, an unusual shift of the minimum DOS occurs below 230 K, clearly indicating a local charging effect on the surface. Cd2Re2O7 is the first known pyrochlore oxide superconductor (Tc ∼ 1 K). It exhibits an unusual second-order phase transition at TS1 = 200 K and a controversial first-order transition at TS2 = 120 K.
While the bulk properties display large anomalies at TS1 but rather subtle and sample-dependent changes at TS2, the surface DOS near the Fermi level (EF) shows no change at TS1 but a substantial increase below TS2, a complete reversal of the signatures of the transitions. We argued that crystal imperfections, mainly defects, which were considerably enhanced at the surface, resulted in the transition at TS2.