941 results for Algorithms, Properties, the KCube Graphs


Relevance: 100.00%

Abstract:

Experimental studies on epoxies report that the microstructure consists of highly crosslinked localized regions connected by a dispersed phase of low crosslink density. This crosslink distribution may affect the various thermo-mechanical properties of epoxies, but experiments cannot determine the exact number of crosslinked covalent bonds present in the structure. Molecular dynamics is therefore used in this work to determine the influence of crosslink distribution on thermo-mechanical properties. Molecular dynamics and molecular mechanics simulations are used to establish well-equilibrated molecular models of the EPON 862-DETDA epoxy system with a range of crosslink densities and various crosslink distributions. Crosslink distributions are varied by forming differently crosslinked localized clusters and then forming different numbers of crosslinks interconnecting the clusters. Simulations are subsequently used to predict the volume shrinkage, thermal expansion coefficients, and elastic properties of each crosslinked system. The results indicate that the elastic properties increase and the thermal expansion coefficient decreases with increasing overall crosslink density, both above and below the glass transition temperature. Elastic moduli and coefficients of linear thermal expansion were found to differ between systems with the same overall crosslink density but different crosslink distributions, indicating an effect of the epoxy nanostructure on physical properties. The thermo-mechanical properties of all crosslinked systems fall within the range of values reported in the literature.
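The thermal expansion coefficients quoted from such simulations are typically extracted from the slope of the simulated volume-temperature curve on either side of the glass transition. Below is a minimal post-processing sketch of that step, assuming synthetic V(T) data and hypothetical glassy/rubbery fitting windows; it is not the authors' analysis pipeline.

```python
import numpy as np

# Hypothetical volume-temperature data from a simulated cooling run (K, nm^3).
T = np.linspace(250.0, 500.0, 26)
V = np.where(T < 400.0, 100.0 + 0.012 * (T - 250.0), 101.8 + 0.030 * (T - 400.0))

def volumetric_cte(T, V, window):
    """Volumetric CTE beta = (1/V) dV/dT from a linear fit over a temperature window."""
    mask = (T >= window[0]) & (T <= window[1])
    slope, intercept = np.polyfit(T[mask], V[mask], 1)
    v_mid = np.polyval((slope, intercept), np.mean(window))
    return slope / v_mid

beta_glassy = volumetric_cte(T, V, (250.0, 380.0))   # below the assumed Tg
beta_rubbery = volumetric_cte(T, V, (420.0, 500.0))  # above the assumed Tg
# Linear CTE is often approximated as beta/3 for an isotropic material.
print(f"glassy CTE  ~ {beta_glassy / 3:.2e} 1/K")
print(f"rubbery CTE ~ {beta_rubbery / 3:.2e} 1/K")
```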

Relevance: 100.00%

Abstract:

Staphylococcus aureus genotype B (GTB) is a contagious mastitis pathogen in cattle, occurring in up to 87% of individuals. Because treatment is generally insufficient, culling is often required, leading to large economic loss in the Swiss dairy industry. As the detection of this pathogen in bulk tank milk (BTM) would greatly facilitate its control, a novel real-time quantitative PCR-based assay for BTM has previously been developed and is now being evaluated for its diagnostic properties at the herd level. Herds were initially classified as to their Staph. aureus GTB status by a reference method. Using BTM and herd pools of single-quarter and 4-quarter milk, the herds were then grouped by the novel assay, and the resulting classifications were compared. A total of 54 dairy herds were evaluated. Using the reference method, 21 herds were found to be GTB positive, whereas 33 were found to be negative. Considering the novel assay using both herd pools, all herds were grouped correctly, resulting in maximal diagnostic sensitivities (100%) and specificities (100%). For BTM samples, diagnostic sensitivities and specificities were 90 and 100%, respectively. Two herds were false negative in BTM, because cows with clinical signs of mastitis were not milked into the tank. Besides its excellent diagnostic properties, the assay is characterized by its low detection level, high efficiency, and its suitability for automation. Using the novel knowledge and assay, eradication of Staph. aureus GTB from a dairy herd may be considered as a realistic goal.
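The sensitivities and specificities quoted above follow from a 2x2 comparison of the assay against the reference classification (21 GTB-positive and 33 GTB-negative herds, with two false-negative bulk tank milk results). The short sketch below reproduces that arithmetic; the helper function is illustrative rather than part of the published assay.

```python
def sens_spec(tp, fn, tn, fp):
    """Diagnostic sensitivity and specificity from a 2x2 comparison against a reference method."""
    return tp / (tp + fn), tn / (tn + fp)

# Herd pools of single-quarter / 4-quarter milk: all 54 herds classified correctly.
print(sens_spec(tp=21, fn=0, tn=33, fp=0))   # (1.0, 1.0)

# Bulk tank milk: two GTB-positive herds missed (false negatives).
sens, spec = sens_spec(tp=19, fn=2, tn=33, fp=0)
print(f"BTM sensitivity ~ {sens:.0%}, specificity ~ {spec:.0%}")  # ~90%, 100%
```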

Relevance: 100.00%

Abstract:

The outcome of light-based therapeutic approaches depends on light propagation in biological tissues, which is governed by their optical properties. The objective of this study was to quantify optical properties of brain tissue in vivo and postmortem and assess changes due to tissue handling postmortem. The study was carried out on eight female New Zealand white rabbits. The local fluence rate was measured in the VIS/NIR range in the brain in vivo, just postmortem, and after six weeks' storage of the head at −20 °C or in 10% formaldehyde solution. Only minimal changes in the effective attenuation coefficient μeff were observed for two methods of sacrifice, exsanguination or injection of KCl. Under all tissue conditions, μeff decreased with increasing wavelengths. After long-term storage for six weeks at −20 °C, μeff decreased, on average, by 15 to 25% at all wavelengths, while it increased by 5 to 15% at all wavelengths after storage in formaldehyde. We demonstrated that μeff was not very sensitive to the method of animal sacrifice, that tissue freezing significantly altered tissue optical properties, and that formalin fixation might affect the tissue's optical properties.
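The effective attenuation coefficient is commonly estimated by fitting the measured fluence rate versus source-detector distance to the diffusion-approximation decay Φ(r) ∝ exp(−μeff r)/r. The sketch below shows such a fit on synthetic data; the functional form and numbers are assumptions for illustration, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def fluence(r, phi0, mu_eff):
    """Diffusion-approximation fluence rate for a point source: phi0 * exp(-mu_eff*r) / r."""
    return phi0 * np.exp(-mu_eff * r) / r

# Synthetic measurements: distance from the source (cm) and relative fluence rate.
r = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
phi = fluence(r, phi0=1.0, mu_eff=3.5) * np.random.default_rng(0).normal(1.0, 0.02, r.size)

(phi0_fit, mu_eff_fit), _ = curve_fit(fluence, r, phi, p0=(1.0, 1.0))
print(f"mu_eff ~ {mu_eff_fit:.2f} 1/cm")
```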

Relevance: 100.00%

Abstract:

The statistical distributions of different software properties have been thoroughly studied in the past, including software size, complexity and the number of defects. In the case of object-oriented systems, these distributions have been found to obey a power law, a common statistical distribution also found in many other fields. However, we have found that for some statistical properties the behavior does not entirely follow a power law, but rather a mixture of a lognormal and a power-law distribution. Our study is based on the Qualitas Corpus, a large compendium of diverse Java-based software projects. We have measured the Chidamber and Kemerer metrics suite for every file of every Java project in the corpus. Our results show that the range of high values for the different metrics follows a power-law distribution, whereas the rest of the range follows a lognormal distribution. This is a pattern typical of so-called double Pareto distributions, which have also been found in empirical studies of other software properties.
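A double Pareto pattern is usually diagnosed from the complementary cumulative distribution: a lognormal-like body with a straight power-law tail on log-log axes. The sketch below illustrates that check on synthetic metric values; the tail cutoff and the data are assumptions, not the Qualitas Corpus measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "metric" sample: lognormal body plus a Pareto (power-law) tail.
values = np.concatenate([rng.lognormal(mean=1.0, sigma=0.6, size=9000),
                         (rng.pareto(a=1.8, size=1000) + 1.0) * 20.0])

def ccdf(x):
    """Empirical complementary CDF P(X >= x), sorted for log-log inspection."""
    xs = np.sort(x)
    return xs, 1.0 - np.arange(1, xs.size + 1) / xs.size

xs, p = ccdf(values)
mask = (xs >= 50.0) & (p > 0)          # assumed cutoff separating body from tail
slope, _ = np.polyfit(np.log(xs[mask]), np.log(p[mask]), 1)
print(f"estimated tail exponent (CCDF slope) ~ {slope:.2f}")
```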

Relevance: 100.00%

Abstract:

The cadmium thioindate spinel CdIn2S4 is a semiconductor with potential applications in optoelectronic devices. We present a theoretical study of the structural and optoelectronic properties of the host and of the Cr-doped ternary spinel. For the host spinel, we analyze the direct or indirect character of the energy bandgap, the change of the energy bandgap with the anion displacement parameter and with the site cation distribution, and the optical properties. The main effect of Cr doping is the creation of an intermediate band within the energy bandgap. The character and the occupation of this band are analyzed for two substitutions: Cr for In and Cr for Cd. This band opens additional channels for photon absorption. The optical properties are obtained and analyzed, and the absorption coefficients are decomposed into contributions from the different absorption channels and from the inter- and intra-atomic components.
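In first-principles studies of this kind the absorption coefficient is commonly obtained from the complex dielectric function via α(ω) = 2ωκ(ω)/c, with κ the imaginary part of √ε. The sketch below shows that conversion for an assumed ε(ω); the values are placeholders, not the computed CdIn2S4 spectra.

```python
import numpy as np

# Assumed complex dielectric function on a photon-energy grid (eV) -- placeholder values.
energy_ev = np.linspace(0.5, 4.0, 8)
eps = np.array([6.0 + 0.01j, 6.2 + 0.05j, 6.5 + 0.2j, 7.0 + 0.8j,
                7.8 + 1.6j, 8.4 + 2.5j, 8.1 + 3.4j, 7.5 + 4.0j])

HBAR_EVS = 6.582119569e-16   # hbar in eV*s
C_CM_S = 2.99792458e10       # speed of light in cm/s

def absorption_coefficient(energy_ev, eps):
    """alpha(w) = 2*w*kappa/c, with kappa = Im(sqrt(eps)); returns 1/cm."""
    omega = energy_ev / HBAR_EVS            # angular frequency, rad/s
    kappa = np.sqrt(eps).imag               # extinction coefficient
    return 2.0 * omega * kappa / C_CM_S

for e, a in zip(energy_ev, absorption_coefficient(energy_ev, eps)):
    print(f"{e:4.2f} eV  alpha ~ {a:9.3e} cm^-1")
```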

Relevance: 100.00%

Abstract:

The horizontal visibility algorithm was recently introduced as a mapping between time series and networks. The challenge lies in characterizing the structure of time series (and the processes that generated those series) using the powerful tools of graph theory. Recent works have shown that visibility graphs inherit several degrees of correlation from their associated series, and therefore such a graph-theoretical characterization is in principle possible. However, both the mathematical grounding of this promising theory and its applications are in their infancy. Following this line, here we address the question of detecting hidden periodicity in series polluted with a certain amount of noise. We first put forward some generic properties of horizontal visibility graphs which allow us to define a (graph-theoretical) noise reduction filter. Accordingly, we evaluate its performance for the task of calculating the period of noisy periodic signals, and compare our results with standard time-domain (autocorrelation) methods. Finally, potentials, limitations and applications are discussed.
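The horizontal visibility criterion itself is simple: two data points x_i and x_j (i < j) are linked whenever every intermediate value satisfies x_k < min(x_i, x_j). A minimal O(n^2) sketch of the mapping is given below; the example series is arbitrary.

```python
def horizontal_visibility_graph(series):
    """Map a time series to an undirected horizontal visibility graph.

    Nodes are sample indices; i and j are linked iff every value strictly
    between them is smaller than min(series[i], series[j]).
    """
    n = len(series)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j]) for k in range(i + 1, j)):
                edges.add((i, j))
            # Once a value >= series[i] blocks the view, no farther j is visible from i.
            if series[j] >= series[i]:
                break
    return edges

example = [0.8, 0.2, 0.9, 0.4, 0.1, 0.7, 0.3]
print(sorted(horizontal_visibility_graph(example)))
```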

Relevance: 100.00%

Abstract:

One of the most promising areas in which probabilistic graphical models have recently begun to be applied is heuristic optimization and, in particular, Estimation of Distribution Algorithms. Because of their inherent parallelism, several research lines have tried to improve Estimation of Distribution Algorithms in terms of execution time and/or accuracy. Among these proposals, we focus on the so-called distributed or island-based models. This approach defines several islands (algorithm instances) that run independently and exchange information with a given frequency. The information sent by the islands can be either a set of individuals or a probabilistic model. This paper presents a comparative study of a distributed univariate Estimation of Distribution Algorithm and a multivariate version, paying special attention to the comparison of two alternative methods for exchanging information, over a wide set of parameters and problems: the standard benchmark developed for the IEEE Workshop on Evolutionary Algorithms and other Metaheuristics for Continuous Optimization Problems of the ISDA 2009 Conference. Several analyses from different points of view have been conducted to study both the influence of the parameters and the relationships between them, including a characterization of the configurations according to their behavior on the proposed benchmark.
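A distributed univariate EDA of the kind compared here can be pictured as several UMDA-style islands that periodically exchange either individuals or their probability vectors. The sketch below shows a toy island model on OneMax that exchanges probabilistic models by blending them over a ring; the topology, migration rule and parameters are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(7)
N_BITS, POP, ISLANDS, GENERATIONS, EXCHANGE_EVERY = 40, 60, 4, 50, 10

def onemax(pop):
    return pop.sum(axis=1)

# One probability vector (univariate model) per island.
models = [np.full(N_BITS, 0.5) for _ in range(ISLANDS)]

for gen in range(1, GENERATIONS + 1):
    for i, p in enumerate(models):
        pop = (rng.random((POP, N_BITS)) < p).astype(int)       # sample from the model
        elite = pop[np.argsort(onemax(pop))[-POP // 2:]]        # truncation selection
        models[i] = 0.9 * elite.mean(axis=0) + 0.1 * p          # re-estimate (smoothed)
    if gen % EXCHANGE_EVERY == 0:
        # Ring migration of probabilistic models: blend with the left neighbour's model.
        models = [0.5 * (models[i] + models[i - 1]) for i in range(ISLANDS)]

best = max(onemax((rng.random((POP, N_BITS)) < p).astype(int)).max() for p in models)
print("best OneMax value found:", best, "of", N_BITS)
```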

Relevance: 100.00%

Abstract:

The rise of the Internet of Things (IoT) and its associated technologies has enabled applications in many domains, including the monitoring of forest ecosystems, disaster and emergency management, home automation, industrial automation, smart city services, building energy efficiency, intrusion detection, and the monitoring of body signals, among many others. The drawback of an IoT network is that, once deployed, it is left unattended: it is subject, among other things, to changing weather conditions and exposed to natural disasters, software or hardware failures, or malicious third-party attacks, so such networks can be considered failure-prone. The main requirement on the nodes that make up an IoT network is that they must be able to keep operating despite errors in the system itself. The ability of the network to recover from unexpected internal and external failures is what is currently known as network "resilience". Therefore, when designing and deploying IoT applications or services, the network is expected to be fault-tolerant and to be self-configuring, self-adaptive and self-optimizing with respect to new conditions that may arise during execution. This leads to the analysis of a fundamental problem in the study of IoT networks, the "connectivity" problem. A network is said to be connected if every pair of nodes in the network can find at least one communication path between them. However, the network can become disconnected for several reasons, such as battery depletion or the destruction of a node. It is therefore necessary to manage the resilience of the network in order to maintain connectivity between its nodes, so that each IoT node can provide continuous services to other nodes, other networks, or other services and applications. In this context, the main objective of this doctoral thesis is the study of the IoT connectivity problem, and more specifically the development of models for the analysis and management of resilience, put into practice through wireless sensor networks (WSNs), with the aim of improving the fault tolerance of the nodes that make up the network. This challenge is addressed from two distinct angles: on the one hand, unlike other kinds of conventional device networks, the nodes of an IoT network are prone to losing connectivity because they are deployed in isolated environments or under extreme conditions; on the other hand, the nodes are usually resource-constrained in terms of processing, storage and battery, among others, which requires the design of their resilience management to be lightweight, distributed and energy-efficient. In this sense, this thesis develops self-adaptive techniques that allow an IoT network, from the perspective of topology control, to be resilient to node failures. To this end, techniques based on fuzzy logic and on proportional-integral-derivative (PID) control are used, with the aim of improving network connectivity while preserving energy consumption as much as possible.
Likewise, the control algorithm must be distributed because, in general, centralized approaches are not feasible for large-scale deployments. This thesis involves several challenges concerning network connectivity, including: the creation and analysis of mathematical models that describe the network, a proposal for a self-adaptive control system that responds to node failures, the optimization of the control system parameters, validation through an implementation following a software engineering approach, and finally evaluation in a real application. Addressing these challenges, this work justifies, through mathematical analysis, the relationship between the "node degree" (defined as the number of nodes in the neighbourhood of a given node) and network connectivity, and proves the effectiveness of several types of controllers that adjust the transmission power of the network nodes in response to failures, taking energy consumption into account as part of the control objectives. The work also provides an evaluation and comparison with other representative algorithms, showing that the developed approach tolerates random node failures better and is more energy-efficient. Additionally, the use of bio-inspired algorithms has allowed the optimization of the control parameters for large dynamic networks. Regarding the implementation in a real system, the proposals of this thesis have been integrated into an OSGi ("Open Services Gateway Initiative") programming model in order to create a self-adaptive middleware that improves resilience management, especially the runtime reconfiguration of software components when a failure has occurred. In conclusion, the results of this doctoral thesis contribute to theoretical research and to the practical application of resilient topology control in large distributed networks. The designs and algorithms presented can be seen as a novel trial of several techniques for the coming era of the IoT. The main contributions of this thesis are summarized as follows: (1) Properties related to network connectivity have been analysed mathematically. For example, we study how the probability of network connectivity varies as the communication range of the nodes is modified, and what the minimum number of nodes is that must be added to a disconnected system to reconnect it. (2) Control systems based on fuzzy logic have been proposed to reach the desired node degree while maintaining full network connectivity. Different types of fuzzy-logic controllers have been evaluated through simulation, and the results have been compared with other representative algorithms. (3) The double-loop control system, a simpler and more applicable approach, has been investigated further, and its control parameters have been optimized using heuristic algorithms such as the Cross-Entropy method (CE), Particle Swarm Optimization (PSO) and Differential Evolution (DE).
(4) Most of the designs presented here have been evaluated by simulation; in addition, part of the work has been implemented and validated in a real application, combining self-adaptive software techniques such as those of a service-oriented architecture (SOA). ABSTRACT The advent of the Internet of Things (IoT) enables a tremendous number of applications, such as forest monitoring, disaster management, home automation, factory automation, smart cities, etc. However, various kinds of unexpected disturbances may cause node failures in the IoT, for example battery depletion, software/hardware malfunctions and malicious attacks. The IoT can therefore be considered prone to failure. The ability of the network to recover from unexpected internal and external failures is known as the "resilience" of the network. Resilience usually serves as an important non-functional requirement when designing the IoT, and it can further be broken down into "self-*" properties, such as self-adaptation, self-healing, self-configuration, self-optimization, etc. One of the consequences that node failure brings to the IoT is that some nodes may be disconnected from others, such that they are not capable of providing continuous services for other nodes, networks and applications. In this sense, the main objective of this dissertation is the IoT connectivity problem. A network is regarded as connected if any pair of distinct nodes can communicate with each other either directly or via a limited number of intermediate nodes. More specifically, this thesis focuses on the development of models for the analysis and management of resilience, implemented through Wireless Sensor Networks (WSNs), which is a challenging task. On the one hand, unlike the devices of other conventional networks, nodes in the IoT are more likely to become disconnected from each other because they are deployed in hostile or isolated environments. On the other hand, nodes are resource-constrained in terms of processing capability, storage and battery capacity, which requires the design of resilience management for the IoT to be lightweight, distributed and energy-efficient. In this context, the thesis presents self-adaptive techniques for the IoT, with the aim of making the IoT resilient against node failures from the network topology control point of view. Fuzzy-logic and proportional-integral-derivative (PID) control techniques are leveraged to improve the network connectivity of the IoT in response to node failures, while taking into consideration that energy consumption must be preserved as much as possible. The control algorithm itself is designed to be distributed, because centralized approaches are usually not feasible in large-scale IoT deployments. The thesis addresses various aspects of network connectivity, including: the creation and analysis of mathematical models describing the network, the proposal of self-adaptive control systems that respond to node failures, control system parameter optimization, implementation following a software engineering approach, and evaluation in a real application. This thesis also justifies the relationship between the "node degree" (the number of neighbors of a node) and network connectivity through mathematical analysis, and proves the effectiveness of various types of controllers that can adjust the transmission power of IoT nodes in response to node failures.
The controllers also take into consideration the energy consumption as part of the control goals. The evaluation is performed and comparison is made with other representative algorithms. The simulation results show that the proposals in this thesis can tolerate more random node failures and save more energy when compared with those representative algorithms. Additionally, the simulations demonstrate that the use of the bio-inspired algorithms allows optimizing the parameters of the controller. With respect to the implementation in a real system, the programming model called OSGi (Open Service Gateway Initiative) is integrated with the proposals in order to create a self-adaptive middleware, especially reconfiguring the software components at runtime when failures occur. The outcomes of this thesis contribute to theoretic research and practical applications of resilient topology control for large and distributed networks. The presented controller designs and optimization algorithms can be viewed as novel trials of the control and optimization techniques for the coming era of the IoT. The contributions of this thesis can be summarized as follows: (1) Mathematically, the fault-tolerant probability of a large-scale stochastic network is analyzed. It is studied how the probability of network connectivity depends on the communication range of the nodes, and what is the minimum number of neighbors to be added for network re-connection. (2) A fuzzy-logic control system is proposed, which obtains the desired node degree and in turn maintains the network connectivity when it is subject to node failures. There are different types of fuzzy-logic controllers evaluated by simulations, and the results demonstrate the improvement of fault-tolerant capability as compared to some other representative algorithms. (3) A simpler but more applicable approach, the two-loop control system is further investigated, and its control parameters are optimized by using some heuristic algorithms such as Cross Entropy (CE), Particle Swarm Optimization (PSO), and Differential Evolution (DE). (4) Most of the designs are evaluated by means of simulations, but part of the proposals are implemented and tested in a real-world application by combining the self-adaptive software technique and the control algorithms which are presented in this thesis.
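The topology-control idea can be illustrated with a per-node rule that nudges each node's transmission radius toward a target node degree. The sketch below uses a plain proportional rule as a stand-in for the thesis's fuzzy-logic and PID designs; the target degree, gain and radius limits are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, TARGET_DEGREE, GAIN, R_MIN, R_MAX = 60, 6, 0.05, 0.05, 0.5

pos = rng.random((N, 2))            # node positions in the unit square
radius = np.full(N, 0.15)           # per-node transmission radius (the controlled variable)
alive = np.ones(N, dtype=bool)

def degrees(pos, radius, alive):
    """Node degree, counting j as a neighbour of i when it lies within i's radius."""
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    link = (d <= radius[:, None]) & alive[None, :] & alive[:, None]
    np.fill_diagonal(link, False)
    return link.sum(axis=1)

for step in range(200):
    if step == 100:                                  # simulate random node failures
        alive[rng.choice(N, size=10, replace=False)] = False
    err = TARGET_DEGREE - degrees(pos, radius, alive)
    # Proportional control: grow the radius when under-connected, shrink when over-connected.
    radius = np.clip(radius + GAIN * err * radius, R_MIN, R_MAX)

print("mean degree of surviving nodes:", degrees(pos, radius, alive)[alive].mean())
```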

Relevance: 100.00%

Abstract:

Directed hypergraphs have been used in problems related to propositional logic, relational databases, computational linguistics and machine learning. They have also been used as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot easily be modelled using binary relations alone. In this context, this kind of representation is known as a hyper-network. A directed hypergraph is a generalization of a directed graph that is particularly suited to representing many-to-many relationships. Whereas an edge in a directed graph defines a relation between two of its nodes, a hyperedge in a directed hypergraph defines a relation between two sets of its nodes. Strong connectivity is an equivalence relation that partitions the set of nodes of a directed hypergraph, and each partition defines an equivalence class known as a strongly connected component. Studying the strongly connected components of a directed hypergraph can help us better understand the structure of this kind of hypergraph when its size is considerable. For directed graphs, there are very efficient algorithms for computing the strongly connected components of large graphs. Thanks to these algorithms, it has been shown that the structure of the WWW has a "bow-tie" shape, in which more than 70% of the nodes are distributed among three large sets, one of which is a strongly connected component. This kind of structure has also been observed in complex networks in other areas such as biology. Studies of a similar nature have not been possible for directed hypergraphs because no algorithms existed for computing the strongly connected components of this kind of hypergraph. In this doctoral thesis, we have investigated how to compute the strongly connected components of a directed hypergraph. Specifically, we have developed two algorithms for this problem and have established their correctness and computational complexity. Both algorithms have been evaluated empirically to compare their running times. For the evaluation, we produced a selection of randomly generated directed hypergraphs inspired by well-known random graph models such as Erdős-Rényi, Newman-Watts-Strogatz and Barabási-Albert. Several optimizations of both algorithms have been implemented and analysed in the thesis. In particular, collapsing the strongly connected components of the directed graph that can be built by removing certain complex hyperedges from the original directed hypergraph notably improves the running times of the algorithms for several of the hypergraphs used in the evaluation. Apart from the application examples mentioned above, directed hypergraphs have also been used in the area of knowledge representation. Specifically, they have been used to compute modules of ontologies. An ontology can be defined as a set of axioms that formally specify a set of symbols and their relations, while a module can be understood as a subset of the ontology's axioms that captures all the knowledge the ontology stores about a specific set of symbols and their relations.
In the thesis we focus only on modules computed using the syntactic locality technique. Because ontologies can be very large, computing modules can facilitate the reuse and maintenance of those ontologies. However, analysing all possible modules of an ontology is, in general, very costly because the number of modules grows exponentially with the number of symbols and axioms of the ontology. Fortunately, the axioms of an ontology can be divided into partitions known as atoms. Each atom represents a maximal set of axioms that always appear together in a module. The atomic decomposition of an ontology is defined as a directed graph in which each node corresponds to an atom and each edge defines a dependency between a pair of atoms. In this thesis we introduce the concept of an "axiom dependency hypergraph", which generalizes the atomic decomposition of an ontology. A module in the ontology corresponds to a connected component in this kind of hypergraph, and an atom of the ontology to a strongly connected component. We have adapted the implementation of our algorithms so that they also work with axiom dependency hypergraphs and can thereby compute the atoms of an ontology. To demonstrate the viability of this idea, we have incorporated our algorithms into an application we developed for module extraction and the atomic decomposition of ontologies. We have called this application HyS and have studied its running times using a selection of well-known ontologies from the biomedical domain, most of them available through the NCBO Internet portal. The evaluation results show that the running times of HyS are much better than those of the fastest known applications. ABSTRACT Directed hypergraphs are an intuitive modelling formalism that has been used in problems related to propositional logic, relational databases, computational linguistics and machine learning. Directed hypergraphs are also presented as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot naturally be modelled as binary relations. In this context, they are known as hyper-networks. A directed hypergraph is a generalization of a directed graph suitable for representing many-to-many relationships. While an edge in a directed graph defines a relation between two nodes of the graph, a hyperedge in a directed hypergraph defines a relation between two sets of nodes. Strong connectivity is an equivalence relation that induces a partition of the set of nodes of a directed hypergraph into strongly connected components. These components can be collapsed into single nodes. As a result, the size of the original hypergraph can be significantly reduced if the strongly connected components have many nodes. This approach may contribute to a better understanding of how the nodes of a hypergraph are connected, in particular when the hypergraphs are large. In the case of directed graphs, there are efficient algorithms that can be used to compute the strongly connected components of large graphs. For instance, it has been shown that the macroscopic structure of the World Wide Web can be represented as a "bow-tie" diagram where more than 70% of the nodes are distributed into three large sets, one of which is a large strongly connected component.
This particular structure has also been observed in complex networks in other fields such as biology. Similar studies could not be conducted on directed hypergraphs because no algorithm existed for computing the strongly connected components of a hypergraph. In this thesis, we investigate ways to compute the strongly connected components of directed hypergraphs. We present two new algorithms and show their correctness and computational complexity. One of these algorithms is inspired by Tarjan's algorithm for directed graphs. The second algorithm follows a simple approach to computing the strongly connected components, based on the fact that two strongly connected nodes of a graph reach the same set of nodes; in other words, the connected component of each node is the same. Both algorithms are empirically evaluated to compare their performance. To this end, we have produced a selection of random directed hypergraphs inspired by existing, well-known random graph models such as Erdős-Rényi and Newman-Watts-Strogatz. Besides the application examples mentioned earlier, directed hypergraphs have also been employed in the field of knowledge representation. In particular, they have been used to compute the modules of an ontology. An ontology is defined as a collection of axioms that provides a formal specification of a set of terms and their relationships, and a module is a subset of an ontology that completely captures the meaning of certain terms as defined in the ontology. In particular, we focus on the modules computed using the notion of syntactic locality. As ontologies can be very large, the computation of modules facilitates the reuse and maintenance of these ontologies. Analysing all modules of an ontology, however, is in general not feasible, as the number of modules grows exponentially in the number of terms and axioms of the ontology. Nevertheless, the modules can be succinctly represented using the atomic decomposition of an ontology. Using this representation, an ontology can be partitioned into atoms, which are maximal sets of axioms that co-occur in every module. The atomic decomposition is then defined as a directed graph such that each node corresponds to an atom and each edge represents a dependency relation between two atoms. In this thesis, we introduce the notion of an axiom dependency hypergraph, which is a generalization of the atomic decomposition of an ontology. A module in the ontology corresponds to a connected component in the hypergraph, and the atoms of the ontology to the strongly connected components. We apply our algorithms for directed hypergraphs to axiom dependency hypergraphs and in this manner compute the atoms of an ontology. To demonstrate the viability of this approach, we have implemented the algorithms in the application HyS, which computes the modules of ontologies and their atomic decomposition. In the thesis, we provide an experimental evaluation of HyS with a selection of large and prominent biomedical ontologies, most of which are available in the NCBO BioPortal. HyS outperforms state-of-the-art implementations in the tasks of extracting modules and computing the atomic decomposition of these ontologies.
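The second algorithm's idea, grouping nodes by the sets of nodes they reach, can be sketched for directed hypergraphs under B-connectivity semantics (a hyperedge is traversed only once its entire tail set has been reached). The code below is a naive illustration of that idea, not the thesis's optimized implementation, and the example hypergraph is made up.

```python
def b_reachable(hyperedges, source):
    """Nodes B-reachable from `source`: a hyperedge (tail, head) fires once every tail node is reached."""
    reached = {source}
    changed = True
    while changed:
        changed = False
        for tail, head in hyperedges:
            if set(tail) <= reached and not set(head) <= reached:
                reached |= set(head)
                changed = True
    return reached

def strongly_connected_components(nodes, hyperedges):
    """Group nodes by mutual B-reachability (simple but far from optimal)."""
    reach = {v: b_reachable(hyperedges, v) for v in nodes}
    components, assigned = [], set()
    for v in nodes:
        if v in assigned:
            continue
        component = {u for u in nodes if u in reach[v] and v in reach[u]}
        components.append(component)
        assigned |= component
    return components

# Toy directed hypergraph: each hyperedge is a (tail set, head set) pair.
nodes = ["a", "b", "c", "d"]
hyperedges = [({"a"}, {"b"}), ({"b"}, {"a", "c"}), ({"a", "c"}, {"d"}), ({"d"}, {"c"})]
print(strongly_connected_components(nodes, hyperedges))   # a and b share a component; c and d are singletons
```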

Relevance: 100.00%

Abstract:

The purpose of this research is to characterise the mechanical properties of multicrystalline silicon for photovoltaic applications that was crystallised from silicon feedstock with a high content of several types of impurities. The mechanical strength, fracture toughness and elastic modulus were measured at different positions within a multicrystalline silicon block to quantify the effect of impurity segregation on these mechanical properties. The microstructure and fracture surfaces of the samples were exhaustively analysed with a scanning electron microscope in order to correlate the values of the mechanical properties with the material microstructure. Fracture stress values were treated statistically via Weibull statistics. The results show that metals segregate to the top of the block, produce moderate microcracking and introduce high thermal stresses. Silicon oxide forms at the bottom of the block, and its presence significantly reduces the mechanical strength and fracture toughness of multicrystalline silicon owing to both the thermal and the elastic mismatch between silicon and the silicon oxide inclusions. Silicon carbide inclusions from the upper parts of the block increase the fracture toughness and elastic modulus of multicrystalline silicon. Additionally, the mechanical strength of multicrystalline silicon can increase when the radius of the silicon carbide inclusions is smaller than ~10 µm. The most damaging type of impurity inclusion for the multicrystalline silicon block studied in this work was amorphous silicon oxide: the oriented precipitation of silicon oxide at grain and twin boundaries eases the formation of radial cracks between inclusions and significantly decreases the mechanical strength of multicrystalline silicon. The second most damaging type of impurity inclusion was metals such as aluminium and copper, which cause spontaneous microcracking in their surroundings after the crystallisation process, thereby reducing the mechanical response of multicrystalline silicon. Solar cell producers should therefore pay attention to the metal and oxygen content of the silicon feedstock in order to produce solar cells with reliable mechanical properties.
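Treating fracture stresses with Weibull statistics typically means estimating the Weibull modulus m and characteristic strength σ0 from the linearised form ln(−ln(1−F)) = m ln σ − m ln σ0. The sketch below performs that fit on hypothetical fracture stresses; the probability estimator and the data are assumptions for illustration.

```python
import numpy as np

# Hypothetical fracture stresses (MPa) from bending tests, sorted ascending.
sigma = np.sort(np.array([92., 105., 118., 124., 131., 140., 148., 155., 163., 178.]))

# Median-rank probability estimator F_i = (i - 0.3) / (n + 0.4).
n = sigma.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Linearised Weibull fit: ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0).
x = np.log(sigma)
y = np.log(-np.log(1.0 - F))
m, intercept = np.polyfit(x, y, 1)
sigma0 = np.exp(-intercept / m)

print(f"Weibull modulus m ~ {m:.1f}, characteristic strength sigma0 ~ {sigma0:.0f} MPa")
```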

Relevance: 100.00%

Abstract:

The cytosolic 70-kDa heat shock proteins (Hsp70s), Ssa and Ssb, of Saccharomyces cerevisiae are functionally distinct. Here we report that the ATPase activities of these two classes of Hsp70s exhibit different kinetic properties. The Ssa ATPase has properties similar to those of other Hsp70s studied, such as DnaK and Hsc70. Ssb, however, has an unusually low steady-state affinity for ATP but a higher maximal velocity. In addition, the ATPase activity of Hsp70s like Ssa1 depends on the addition of K+, whereas Ssb activity does not. Surprisingly, the isolated 44-kDa ATPase domain of Ssb has a Km and Vmax for ATP hydrolysis similar to those of Ssa, rather than those of full-length Ssb. Analysis of Ssa/Ssb fusion proteins demonstrates that the Ssb peptide-binding domain fused to the Ssa ATPase domain generates an ATPase of relatively high activity and low steady-state affinity for ATP, similar to that of native Ssb. Therefore, at least some of the biochemical differences between the ATPases of these two classes of Hsp70s are not intrinsic to the ATPase domain itself. The differential influence of the peptide-binding domain on the ATPase domain may, in part, explain the functional uniqueness of these two classes of Hsp70s.
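The steady-state parameters contrasted here, a high apparent Km (low affinity) but higher Vmax for Ssb, come from fitting initial ATPase rates to the Michaelis-Menten equation v = Vmax[S]/(Km + [S]). The sketch below shows such a fit on synthetic rate data; the concentrations and rates are placeholders, not the measured values.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """Initial rate v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

# Synthetic ATP concentrations (uM) and initial hydrolysis rates (arbitrary units).
atp = np.array([1, 2, 5, 10, 25, 50, 100, 250, 500], dtype=float)
rates = michaelis_menten(atp, vmax=1.8, km=60.0)
rates *= np.random.default_rng(0).normal(1.0, 0.03, rates.size)   # add measurement noise

(vmax_fit, km_fit), _ = curve_fit(michaelis_menten, atp, rates, p0=(1.0, 10.0))
print(f"Vmax ~ {vmax_fit:.2f}, Km ~ {km_fit:.0f} uM")
```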

Relevance: 100.00%

Abstract:

The emotif database is a collection of more than 170 000 highly specific and sensitive protein sequence motifs representing conserved biochemical properties and biological functions. These protein motifs are derived from 7697 sequence alignments in the BLOCKS+ database (released on June 23, 2000) and all 8244 protein sequence alignments in the PRINTS database (version 27.0) using the emotif-maker algorithm developed by Nevill-Manning et al. (Nevill-Manning,C.G., Wu,T.D. and Brutlag,D.L. (1998) Proc. Natl Acad. Sci. USA, 95, 5865–5871; Nevill-Manning,C.G., Sethi,K.S., Wu,T.D. and Brutlag,D.L. (1997) ISMB-97, 5, 202–209). Since the amino acids and the groups of amino acids in these sequence motifs represent critical positions conserved in evolution, search algorithms employing the emotif patterns can identify and classify more widely divergent sequences than methods based on global sequence similarity. The emotif protein pattern database is available at http://motif.stanford.edu/emotif/.
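Searching sequences with motifs of this kind amounts to matching, at each position, either a single residue or a group of allowed residues. The sketch below scans a sequence using a simplified, hypothetical notation in which square brackets list allowed amino acids and '.' matches any residue, so the motif can be used directly as a regular expression; it is not the emotif pattern syntax or the emotif-maker algorithm.

```python
import re

def find_motif(sequence, motif):
    """Start positions of all (possibly overlapping) occurrences of a motif.

    The motif uses a simplified, hypothetical notation: '[...]' lists the amino
    acids allowed at a position and '.' matches any residue.
    """
    regex = re.compile(f"(?=({motif.upper()}))")
    return [m.start() for m in regex.finditer(sequence.upper())]

# Toy example: hydrophobic residue, anything, G, then S or T, then S.
print(find_motif("MKILVAGSTSSAILVQGSSPL", "[ILV].G[ST]S"))
```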

Relevance: 100.00%

Abstract:

This study aimed to determine the properties of Nantes carrots during hot-air drying at three different temperatures (50, 60 and 70 °C). The chemical properties evaluated were moisture, protein, fibre, ash, sugars and water activity, and the physical properties were texture, color, density and porosity. The results showed that drying at 70 °C affected the chemical properties the most. Regarding texture, similar changes were recorded in hardness, gumminess and chewiness, and again the temperature of 70 °C affected these properties the most. Regarding color, the variations in a* and b* along drying were in general not meaningful, although some discoloration was observed (an increase in L*). The porosity increased due to the decrease in moisture. The final porosity measured for the carrots dried at 70 °C was, however, lower than that for 50 and 60 °C.
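Color changes of this kind are often summarised by the total color difference ΔE* = sqrt(ΔL*² + Δa*² + Δb*²) between fresh and dried samples. The sketch below computes it for hypothetical CIELAB readings; the values are illustrative, not the measured ones.

```python
import math

def delta_e(lab_ref, lab_sample):
    """Total color difference (CIE76): Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((r - s) ** 2 for r, s in zip(lab_ref, lab_sample)))

fresh = (45.0, 28.0, 35.0)                     # hypothetical L*, a*, b* of fresh carrot
dried = {50: (52.0, 26.5, 33.0), 60: (55.0, 26.0, 32.0), 70: (58.0, 25.0, 30.5)}

for temp, lab in dried.items():
    print(f"{temp} °C: dE* ~ {delta_e(fresh, lab):.1f}")
```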

Relevance: 100.00%

Abstract:

Vermicompost filtration is a new on-site waste treatment system; consequently, little is known about the properties of the filter medium. The aim of this preliminary study was to quantify physical and compositional properties of vermicompost filter beds that had been used to treat domestic solid organic waste and wastewater. This paper presents the trials performed on pilot-scale reactors filled with vermicompost from a full-scale vermicompost filtration system. Household solid organic waste and raw wastewater were applied to the reactor bed surface at a rate of 130 L/m²/d over a four-month period. It was found that fresh casts laid on the bed surface had a BOD of 1290 mg/g VS, while casts buried to a depth of 10 cm had a BOD of 605 mg/g VS. Below this depth there was little further biodegradation of earthworm casts despite cast ages of up to five years. Solid material accounted for only 7-10% of the reactor volume. The total voidage comprised large free-draining pores, which accounted for 15-20% of the reactor volume, and micropores able to hold water against gravity, which accounted for 60-70%. It was shown that water could flow through both the micropores and the macropores of the medium following a wastewater application. The wastewater flow characteristics were modeled with a two-region model based on the Richards equation, which describes flow in spatially heterogeneous porous materials.
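A two-region description splits the water between free-draining macropores (mobile region) and micropores that hold water against gravity (immobile region), with a first-order exchange between them. The sketch below integrates a heavily simplified, lumped version of that idea for a single wastewater dose; it stands in for, and is not, the Richards-equation model used in the paper, and every parameter is an assumption.

```python
import numpy as np

# Simplified mobile-immobile water balance for one reactor layer (volumes per bed area, mm).
DT, T_END = 0.01, 24.0                 # time step and horizon (hours)
K_DRAIN, K_EXCHANGE = 0.8, 0.15        # drainage and inter-region exchange rates (1/h)
CAP_IMMOBILE = 60.0                    # micropore (immobile) storage capacity (mm)

theta_mobile, theta_immobile = 130.0, 40.0   # a 130 L/m^2 dose is 130 mm of water in the macropores
drained = 0.0

for _ in np.arange(0.0, T_END, DT):
    # Transfer into the micropores slows as they approach capacity.
    exchange = K_EXCHANGE * theta_mobile * (1.0 - theta_immobile / CAP_IMMOBILE)
    drainage = K_DRAIN * theta_mobile
    theta_mobile += DT * (-drainage - exchange)
    theta_immobile += DT * exchange
    drained += DT * drainage

print(f"after {T_END:.0f} h: mobile ~ {theta_mobile:.2f} mm, "
      f"immobile ~ {theta_immobile:.2f} mm, drained ~ {drained:.2f} mm")
```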

Relevance: 100.00%

Abstract:

Beyond the inherent technical challenges, current research into the three-dimensional surface correspondence problem is hampered by a lack of uniform terminology, an abundance of application-specific algorithms, and the absence of a consistent model for comparing existing approaches and developing new ones. This paper addresses these challenges by presenting a framework for analysing, comparing, developing, and implementing surface correspondence algorithms. The framework uses five distinct stages to establish correspondence between surfaces. It is general, encompassing a wide variety of existing techniques, and flexible, facilitating the synthesis of new correspondence algorithms. This paper presents a review of existing surface correspondence algorithms and shows how they fit into the correspondence framework. It also shows how the framework can be used to analyse and compare existing algorithms and to develop new algorithms using the framework's modular structure. Six algorithms, four existing and two new, are implemented using the framework. Each implemented algorithm is used to match a number of surface pairs. The results demonstrate that the framework implementations faithfully reproduce the existing algorithms, and that powerful new surface correspondence algorithms can be created. (C) 2004 Elsevier Inc. All rights reserved.
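A staged framework of this kind is naturally expressed as a pipeline of interchangeable components, where swapping one stage implementation yields a different correspondence algorithm. The skeleton below sketches that structure with five hypothetical stage names; the abstract does not list the framework's actual stage names.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict

# Hypothetical stage names; the paper's framework defines five stages, but their
# actual names are not given in this abstract.
STAGES = ["preprocess", "feature_extraction", "similarity", "matching", "refinement"]

@dataclass
class CorrespondenceFramework:
    """A correspondence algorithm assembled from five pluggable stage implementations."""
    stages: Dict[str, Callable[[Any], Any]] = field(default_factory=dict)

    def register(self, stage: str, fn: Callable[[Any], Any]) -> None:
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self.stages[stage] = fn

    def run(self, surfaces: Any) -> Any:
        data = surfaces
        for stage in STAGES:                     # apply each stage in order
            data = self.stages[stage](data)
        return data

# Assembling a (trivial) algorithm: each stage here is a placeholder function.
algo = CorrespondenceFramework()
for name in STAGES:
    algo.register(name, lambda data, name=name: data + [name])
print(algo.run([]))   # prints the list of stages the data passed through
```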