3 results for Pairwise constraints

at Brock University, Canada


Relevance:

20.00%

Publisher:

Abstract:

This is a qualitative study exploring the physical activity patterns of a group of women with physical disabilities through their lifespan. In-depth interviews were conducted with a group of 6 women aged 19 to 31. The data were analyzed via content and demographic strategies. Participants in this study reported that their physical activity patterns and their experiences related to their physical activity participation changed over their lives. They were most physically active in their youth (under 14 years of age); from high school age (over 14 years of age) to the present, they have become less physically active. They also reported both affordances and constraints to their physical activity participation through their lifespan. In their youth, they reported affordances such as their parents' assistance, an abundance of available physical activity opportunities, and independent unassisted mobility as important factors in their increased youth physical activity. In adulthood, the participants reported less time, fewer opportunities for physical activity, and reliance on power mobility as significant constraints to their physical activity. The participants reported fewer constraints to being physically active in their youth than in adulthood. Their reasons for participating in physical activity changed from fun and socialization in their youth to maintenance of health, weight, and function in adulthood. These affordances, constraints, and reasons for physical activity participation were supported in the literature.

Relevance:

20.00%

Publisher:

Abstract:

The design of a large and reliable DNA codeword library is a key problem in DNA-based computing. DNA codes, namely sets of fixed-length edit-metric codewords over the alphabet {A, C, G, T}, satisfy certain combinatorial constraints arising from biological and chemical restrictions on DNA strands. The primary constraints that we consider are the reverse-complement constraint and the fixed GC-content constraint, as well as the basic edit-distance constraint between codewords. We focus on exploring the theory underlying DNA codes and discuss several approaches to searching for optimal DNA codes. We use Conway's lexicode algorithm and an exhaustive search algorithm to produce provably optimal DNA codes for small parameter values. A genetic algorithm is proposed to search for sub-optimal DNA codes with relatively large parameter values, whose sizes can be taken as reasonable lower bounds on the sizes of optimal DNA codes. Furthermore, we provide tables of bounds on the sizes of DNA codes with length from 1 to 9 and minimum distance from 1 to 9.
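The three constraints named in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the thesis's implementation: the function names and the parameters `d` (minimum edit distance) and `w` (fixed GC weight) are assumptions, and the reverse-complement constraint is modeled here as requiring edit distance at least `d` between each codeword and the reverse complement of every codeword.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein (edit) distance via one-row dynamic programming."""
    dp = list(range(len(b) + 1))
    for i in range(1, len(a) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, len(b) + 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # delete from a
                                     dp[j - 1] + 1,    # insert into a
                                     prev + (a[i - 1] != b[j - 1]))  # substitute
    return dp[-1]

def reverse_complement(s: str) -> str:
    """Watson-Crick reverse complement of a DNA strand."""
    comp = {"A": "T", "T": "A", "C": "G", "G": "C"}
    return "".join(comp[c] for c in reversed(s))

def gc_content(s: str) -> int:
    """Number of G or C bases in the strand."""
    return sum(c in "GC" for c in s)

def satisfies_constraints(code: list[str], d: int, w: int) -> bool:
    """Check the fixed GC-content (= w), pairwise edit-distance (>= d),
    and reverse-complement (>= d, including a word against its own
    reverse complement) constraints for a candidate DNA code."""
    if any(gc_content(c) != w for c in code):
        return False
    for i, x in enumerate(code):
        for j, y in enumerate(code):
            if i < j and edit_distance(x, y) < d:
                return False
            if edit_distance(x, reverse_complement(y)) < d:
                return False
    return True
```

A checker like this is the inner loop of both the lexicode-style greedy search and the fitness evaluation in a genetic algorithm: candidate words are admitted only while the whole set continues to satisfy all three constraints.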

Relevance:

20.00%

Publisher:

Abstract:

Complex networks have recently attracted a significant amount of research attention due to their ability to model real-world phenomena. One important problem often encountered is limiting diffusive processes spreading over the network, for example mitigating pandemic disease or computer virus spread. A number of problem formulations have been proposed that aim to solve such problems based on desired network characteristics, such as maintaining the largest network component after node removal. The recently formulated critical node detection problem aims to remove a small subset of vertices from the network such that the residual network has minimum pairwise connectivity. Unfortunately, the problem is NP-hard, and the number of constraints is cubic in the number of vertices, making very large scale instances impossible to solve with traditional mathematical programming techniques. Even approximation strategies such as dynamic programming and evolutionary algorithms are unusable for networks that contain thousands to millions of vertices. A computationally efficient and simple approach is required in such circumstances, but none currently exists. In this thesis, such an algorithm is proposed. The methodology is based on a depth-first search traversal of the network and a specially designed ranking function that considers information local to each vertex. Due to the variety of network structures, a number of characteristics must be taken into consideration and combined into a single rank that measures the utility of removing each vertex. Since removing a vertex in sequential fashion impacts the network structure, an efficient post-processing algorithm is also proposed to quickly re-rank vertices. Experiments on a range of common complex network models with varying numbers of vertices are considered, in addition to real-world networks.
The proposed algorithm, DFSH, is shown to be highly competitive and often outperforms existing strategies such as Google PageRank for minimizing pairwise connectivity.
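The objective that the critical node detection problem minimizes, pairwise connectivity of the residual network, can be sketched directly: it is the number of connected vertex pairs, i.e. the sum of |C| choose 2 over the connected components C left after removing the chosen vertices. The code below is a hypothetical illustration of evaluating that objective with a depth-first search, not the thesis's DFSH ranking algorithm.

```python
def pairwise_connectivity(adj: dict[str, set[str]], removed: set[str]) -> int:
    """Pairwise connectivity of the residual graph after deleting
    `removed`: sum over connected components C of |C|*(|C|-1)/2.
    `adj` is an undirected adjacency map."""
    seen = set()
    total = 0
    for start in adj:
        if start in removed or start in seen:
            continue
        # Iterative DFS to measure one component of the residual graph.
        stack, size = [start], 0
        seen.add(start)
        while stack:
            v = stack.pop()
            size += 1
            for u in adj[v]:
                if u not in removed and u not in seen:
                    seen.add(u)
                    stack.append(u)
        total += size * (size - 1) // 2
    return total
```

For example, on a three-vertex path a-b-c, removing nothing leaves 3 connected pairs, while removing the middle vertex b disconnects the graph entirely and drives the objective to 0, which is why degree- or centrality-style rankings (such as the PageRank baseline mentioned above) are natural competitors for choosing which vertices to remove.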