916 results for Computational topology
Abstract:
Stable networks of order r, where r is a natural number, are networks that are immune to coalitional deviations of size r or less. In this paper, we introduce stability of a finite order and examine its relation to efficient networks under anonymous and component-additive value functions and the component-wise egalitarian allocation rule. In particular, we examine the shapes of networks, or network architectures, that would resolve the conflict between stability and efficiency, in the sense that if stable networks assume those shapes they would be efficient and, conversely, if efficient networks assume those shapes they would be stable, with minimal further restrictions on the value functions.
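As a rough illustration of the component-wise egalitarian allocation rule referred to in this abstract, the Python sketch below splits each network component's value equally among its members. The network and the (anonymous, component-additive) value function used here are invented for the example and are not taken from the paper.

def components(nodes, links):
    """Return the connected components of the network (nodes, links)."""
    remaining, comps = set(nodes), []
    adj = {i: set() for i in nodes}
    for i, j in links:
        adj[i].add(j)
        adj[j].add(i)
    while remaining:
        start = remaining.pop()
        stack, comp = [start], {start}
        while stack:
            for k in adj[stack.pop()] & remaining:
                remaining.discard(k)
                comp.add(k)
                stack.append(k)
        comps.append(comp)
    return comps

def componentwise_egalitarian(nodes, links, component_value):
    """Split each component's value equally among its members
    (well defined because the value function is component additive)."""
    payoff = {}
    for comp in components(nodes, links):
        comp_links = {(i, j) for i, j in links if i in comp and j in comp}
        share = component_value(comp, comp_links) / len(comp)
        for i in comp:
            payoff[i] = share
    return payoff

# Hypothetical anonymous, component-additive value: one unit per link.
value = lambda comp, comp_links: float(len(comp_links))

nodes = range(5)
links = {(0, 1), (1, 2), (3, 4)}
print(componentwise_egalitarian(nodes, links, value))
# e.g. {0: 0.67, 1: 0.67, 2: 0.67, 3: 0.5, 4: 0.5} (keys may print in any order)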
Abstract:
The aim of the study was to use a combined computational and experimental approach to evaluate, compare and predict the ability of calcium phosphate (CaP) and poly(methyl methacrylate) (PMMA) augmentation cements to restore mechanical stability to traumatically fractured vertebrae following a vertebroplasty procedure. Traumatic fractures (n = 17) were generated in a series of porcine vertebrae using a drop-weight method. The fractured vertebrae were imaged using μCT and tested under axial compression. Twelve of the fractured vertebrae were randomly selected to undergo a vertebroplasty procedure using either a PMMA (n = 6) or a CaP cement variation (n = 6). The specimens were imaged using μCT and re-tested. Finite element models of the fractured and augmented vertebrae were generated from the μCT data and used to relate the extent of fracture void fill to the stiffness of the augmented specimens. Significant increases (p < 0.05) in failure load were found for both augmented specimen groups compared to the fractured group. The experimental and computational results indicated that neither the CaP cement nor the PMMA cement could completely restore vertebral mechanical behavior to the intact level. The effectiveness of the procedure appeared to be influenced more by the volume of fracture filled than by the mechanical properties of the cement itself.
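For readers unfamiliar with how specimen stiffness is typically reported in such axial compression tests, the sketch below fits a least-squares slope to the (assumed) linear region of a load-displacement curve. The curves and numbers are synthetic and do not stand in for any data from the study.

import numpy as np

def axial_stiffness(displacement_mm, load_n, fit_range=(0.2, 0.8)):
    """Fit load = k * displacement + c over a fraction of the curve
    (selected by peak load) and return the stiffness k in N/mm."""
    load_n = np.asarray(load_n, dtype=float)
    displacement_mm = np.asarray(displacement_mm, dtype=float)
    lo, hi = (f * load_n.max() for f in fit_range)
    mask = (load_n >= lo) & (load_n <= hi)
    k, _ = np.polyfit(displacement_mm[mask], load_n[mask], 1)
    return k

# Synthetic curves standing in for a fractured and an augmented specimen.
disp = np.linspace(0.0, 2.0, 50)
fractured = 800.0 * disp           # ~800 N/mm
augmented = 1400.0 * disp          # ~1400 N/mm

print(axial_stiffness(disp, fractured))   # ~800
print(axial_stiffness(disp, augmented))   # ~1400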
Abstract:
This is a report on the 4th international conference on 'Quantitative Biology and Bioinformatics in Modern Medicine', held in Belfast (UK), 19-20 September 2013. The aim of the conference was to bring together leading experts from a variety of areas that are key to Systems Medicine, to exchange novel findings and promote interdisciplinary ideas and collaborations.
Abstract:
Driven by the requirements of bionic joints and tracking equipment for spherical parallel manipulators (SPMs) with three rotational degrees of freedom (DoFs), this paper carries out the topology synthesis of a class of three-legged SPMs using Lie group theory. To obtain the intersection of the displacement subgroups, the subgroup characteristics and operation principles are defined. Drawing mainly on Lie group theory, a topology synthesis procedure for three-legged SPMs, comprising four stages and two functional blocks, is proposed, in which the assembly principles of the three legs are defined. By introducing a circular track, a novel class of three-legged SPMs is synthesized, providing an important complement to existing SPMs. Finally, four typical examples are given to demonstrate the finite displacements of the synthesized three-legged SPMs.
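The paper's synthesis works with finite-displacement subgroups; purely as a toy illustration of the subgroup-intersection idea, the sketch below models each leg's allowed motions as a twist subspace of R^6 (the velocity-level counterpart) and recovers the platform motion as the intersection of the three leg subspaces. The leg subspaces chosen here are hypothetical and are not taken from the paper.

import numpy as np

def null_space(A, tol=1e-10):
    """Columns spanning the null space of A (via SVD)."""
    _, s, vt = np.linalg.svd(A)
    rank = int((s > tol).sum())
    return vt[rank:].T

def intersect(U, V, tol=1e-10):
    """Basis (columns) for the intersection of span(U) and span(V)."""
    ns = null_space(np.hstack([U, -V]), tol)
    W = U @ ns[: U.shape[1], :]              # vectors expressible in both bases
    u, s, _ = np.linalg.svd(W, full_matrices=False)
    return u[:, s > tol]

# Each leg's allowed instantaneous motions as a twist subspace of R^6
# (rows 0-2: angular velocity about the common centre, rows 3-5: linear velocity).
e = np.eye(6)
R = e[:, :3]                          # the three rotations about the centre
leg1 = np.hstack([R, e[:, [3]]])      # hypothetical leg: rotations plus an x-translation
leg2 = np.hstack([R, e[:, [4]]])      # rotations plus a y-translation
leg3 = np.hstack([R, e[:, [5]]])      # rotations plus a z-translation

platform = intersect(intersect(leg1, leg2), leg3)
print(platform.shape[1])              # 3 -> purely rotational, 3-DoF spherical motion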
Abstract:
We present DEX, a fully distributed self-healing algorithm that maintains a constant-degree expander network in a dynamic setting. To the best of our knowledge, our algorithm provides the first efficient distributed construction of expanders - whose expansion properties hold deterministically - that works even under an all-powerful adaptive adversary that controls the dynamic changes to the network (the adversary has unlimited computational power and knowledge of the entire network state, can decide which nodes join and leave and at what time, and knows the past random choices made by the algorithm). Previous distributed expander constructions typically provide only probabilistic guarantees on the network expansion, which rapidly degrade in a dynamic setting; in particular, the expansion properties can degrade even more rapidly under adversarial insertions and deletions. Our algorithm provides efficient maintenance and incurs a low overhead per insertion/deletion by the adaptive adversary: only O(log n) rounds and O(log n) messages are needed with high probability (n is the number of nodes currently in the network), and only a constant number of topology changes are required. Moreover, our algorithm allows for an efficient implementation and maintenance of a distributed hash table (DHT) on top of DEX, with only a constant additional overhead. Our results are a step towards implementing efficient self-healing networks that have guaranteed properties (constant bounded degree and expansion) despite dynamic changes.
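DEX itself is not reproduced here; the sketch below only illustrates the property it maintains, namely that a constant-degree graph's expansion can be certified through its spectral gap. The Petersen graph is used purely as a stand-in example of a 3-regular graph with a large gap.

import numpy as np

def second_eigenvalue(adj):
    """Second-largest |eigenvalue| of an adjacency matrix (the largest equals
    the degree d for a connected d-regular graph)."""
    eig = np.sort(np.abs(np.linalg.eigvalsh(adj)))[::-1]
    return eig[1]

def petersen():
    """Adjacency matrix of the Petersen graph, a classic 3-regular graph whose
    eigenvalues are 3, 1 and -2, so every non-trivial eigenvalue stays well below 3."""
    adj = np.zeros((10, 10))
    def link(i, j):
        adj[i, j] = adj[j, i] = 1
    for i in range(5):
        link(i, (i + 1) % 5)              # outer 5-cycle
        link(i, i + 5)                    # spokes
        link(5 + i, 5 + (i + 2) % 5)      # inner pentagram
    return adj

A = petersen()
d = int(A.sum(axis=1)[0])                 # constant degree: 3
lam2 = second_eigenvalue(A)               # 2.0
print(d, round(lam2, 6), round(d - lam2, 6))   # a spectral gap of 1 certifies expansion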
Abstract:
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests in particular applied to the analysis of crime factors. Pairwise relationships between factors have also been studied extensively, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for greater insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic-algorithm and dynamic-reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interactions between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
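The paper's hybrid rough-set/fuzzy/particle-swarm algorithm is not reproduced here; the sketch below only illustrates the underlying rough-set notion of a reduct (a minimal attribute subset that discerns the decision as well as all attributes do), found by brute force on a small, entirely hypothetical decision table. Multiple reducts correspond to the multi-knowledge (multiple rule sets) the abstract refers to.

from itertools import combinations

def partitions_consistently(rows, attrs, decision):
    """True if rows that agree on `attrs` always share the same decision."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if seen.setdefault(key, row[decision]) != row[decision]:
            return False
    return True

def reducts(rows, condition_attrs, decision):
    """All minimal attribute subsets that preserve consistency (the reducts)."""
    found = []
    for r in range(1, len(condition_attrs) + 1):
        for subset in combinations(condition_attrs, r):
            if any(set(f) <= set(subset) for f in found):
                continue                      # superset of a known reduct: not minimal
            if partitions_consistently(rows, subset, decision):
                found.append(subset)
    return found

# A purely hypothetical decision table (attributes and values invented).
table = [
    {"age": "young", "stress": "high", "impulsivity": "high", "violent": "yes"},
    {"age": "young", "stress": "high", "impulsivity": "low",  "violent": "yes"},
    {"age": "old",   "stress": "low",  "impulsivity": "high", "violent": "no"},
    {"age": "old",   "stress": "low",  "impulsivity": "low",  "violent": "no"},
]
print(reducts(table, ["age", "stress", "impulsivity"], "violent"))
# [('age',), ('stress',)] -> two reducts, i.e. two alternative rule sets (multi-knowledge)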
Abstract:
There is extensive theoretical work on measures of inconsistency for arbitrary formulae in knowledge bases. Many of these are defined in terms of the set of minimal inconsistent subsets (MISes) of the base. However, few have been implemented or experimentally evaluated to support their viability, since computing all MISes is intractable in the worst case. Fortunately, recent work on the related problem of minimal unsatisfiable sets of clauses (MUSes) offers a viable solution in many cases. In this paper, we begin by drawing connections between MISes and MUSes through algorithms based on a MUS generalization approach and a new, optimized MUS transformation approach to finding MISes. We implement these algorithms, along with a selection of existing measures for flat and stratified knowledge bases, in a tool called mimus. We then carry out an extensive experimental evaluation of mimus using randomly generated arbitrary knowledge bases. We conclude that these measures are viable for many large and complex random instances. Moreover, they represent a practical and intuitive tool for inconsistency handling.
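As a concrete illustration of what a MIS is (the MUS-based algorithms behind mimus are far more scalable and are not shown here), the sketch below enumerates the minimal inconsistent subsets of a tiny propositional knowledge base by brute-force truth-table checking. The knowledge base is invented for the example.

from itertools import combinations, product

ATOMS = ("a", "b")

def satisfiable(formulas):
    """True if some truth assignment to ATOMS makes every formula true."""
    return any(
        all(f(**dict(zip(ATOMS, vals))) for f in formulas)
        for vals in product([False, True], repeat=len(ATOMS))
    )

def mises(kb):
    """All minimal inconsistent subsets of the knowledge base kb (by index)."""
    found = []
    for r in range(1, len(kb) + 1):
        for subset in combinations(range(len(kb)), r):
            if any(set(f) <= set(subset) for f in found):
                continue          # contains a smaller MIS: inconsistent but not minimal
            if not satisfiable([kb[i] for i in subset]):
                found.append(subset)
    return found

# Toy knowledge base {a, not a, a -> b, not b}, each formula a Boolean function.
kb = [
    lambda a, b: a,               # a
    lambda a, b: not a,           # not a
    lambda a, b: (not a) or b,    # a -> b
    lambda a, b: not b,           # not b
]
print(mises(kb))   # [(0, 1), (0, 2, 3)] -> the MISes {a, not a} and {a, a -> b, not b}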
Abstract:
Recent advances in hardware development, coupled with the rapid adoption and broad applicability of cloud computing, have introduced widespread heterogeneity in data centers, significantly complicating the management of cloud applications and data center resources. This paper presents the CACTOS approach to cloud infrastructure automation and optimization, which addresses heterogeneity by combining in-depth analysis of application behavior with insights from commercial cloud providers. The aim of the approach is threefold: to model applications and data center resources, to simulate applications and resources for planning and operation, and to optimize application deployment and resource use in an autonomic manner. The approach is based on case studies from the areas of business analytics, enterprise applications, and scientific computing.
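The CACTOS toolkit itself is not shown here; the sketch below only conveys the flavor of the optimization step, placing workloads with known resource demands onto heterogeneous hosts using first-fit decreasing. All host names, workloads and capacities are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cpu: float          # free vCPUs
    ram: float          # free RAM (GB)
    placed: list = field(default_factory=list)

    def fits(self, demand):
        return self.cpu >= demand[1] and self.ram >= demand[2]

    def place(self, demand):
        self.placed.append(demand[0])
        self.cpu -= demand[1]
        self.ram -= demand[2]

def first_fit_decreasing(workloads, hosts):
    """Place the largest workloads first; return the names of any that did not fit."""
    unplaced = []
    for wl in sorted(workloads, key=lambda w: (w[1], w[2]), reverse=True):
        for host in hosts:
            if host.fits(wl):
                host.place(wl)
                break
        else:
            unplaced.append(wl[0])
    return unplaced

# Hypothetical heterogeneous hosts and workloads: (name, vCPUs, RAM in GB).
hosts = [Host("hi-mem-node", 16, 256), Host("hi-cpu-node", 32, 64)]
workloads = [("analytics-job", 12, 48), ("erp-db", 4, 128), ("sim-batch", 20, 32)]
print(first_fit_decreasing(workloads, hosts))   # [] -> everything placed
for h in hosts:
    print(h.name, h.placed, h.cpu, h.ram)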