995 results for randomized algorithms


Relevance: 20.00%

Publisher:

Abstract:

Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages, such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of CAD models is invariably presented in the form of a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms with real-world CAD data and characterize some new CAD-related simplification algorithms that have not been surveyed in previous mesh simplification reviews.
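The survey does not prescribe a single method, but the flavour of the simpler algorithms it covers can be illustrated with vertex clustering. The sketch below is a generic illustration rather than any algorithm from the paper; vertices and faces are assumed to be NumPy arrays, and the cell size is the only tuning parameter.

```python
import numpy as np

def cluster_simplify(vertices, faces, cell_size):
    """Vertex-clustering simplification: snap vertices to a uniform grid,
    merge vertices that fall into the same cell, and drop degenerate faces."""
    # Map each vertex to its grid cell.
    cells = np.floor(vertices / cell_size).astype(np.int64)
    # One representative (the mean position) per occupied cell.
    keys, inverse = np.unique(cells, axis=0, return_inverse=True)
    inverse = inverse.ravel()  # some NumPy versions return an extra axis here
    counts = np.bincount(inverse)
    new_vertices = np.zeros((len(keys), 3))
    for dim in range(3):
        new_vertices[:, dim] = np.bincount(inverse, weights=vertices[:, dim]) / counts
    # Re-index faces and remove triangles that collapsed to an edge or point.
    new_faces = inverse[faces]
    keep = (new_faces[:, 0] != new_faces[:, 1]) & \
           (new_faces[:, 1] != new_faces[:, 2]) & \
           (new_faces[:, 0] != new_faces[:, 2])
    return new_vertices, new_faces[keep]
```

Larger cells give coarser meshes; quality-driven methods such as edge collapse with error quadrics replace the fixed grid with a priority queue of cheapest collapses.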

Relevance: 20.00%

Publisher:

Abstract:

Introduction: Polyphenols contained in natural sources such as grapes have been considered pharmacological agents to combat oxidative stress and inflammation, common features in chronic kidney disease patients. Objective: To evaluate the effects of grape powder supplementation on inflammatory and antioxidant biomarkers in hemodialysis (HD) patients. Methods: This double-blind, placebo-controlled, randomized clinical trial evaluated non-diabetic HD patients who received grape powder (500 mg of polyphenols/day) (n = 16, 9 men, 53.0 ± 9.8 years of age, 111.6 ± 58.2 months on HD) or placebo (n = 16, 9 men, 52.7 ± 13.7 years of age, 110.4 ± 93.1 months on HD) for five weeks. Glutathione peroxidase (GSH-Px) activity and C-reactive protein (CRP) levels were evaluated by ELISA. Results: After the intervention period, the patients receiving grape powder showed an increase in GSH-Px activity (16.5 (41.0) to 42.0 (43.3) nmol/min/mL) (p < 0.05) and did not show the increase in CRP levels seen in the placebo group (2.6 (0.28) to 2.8 (0.23) mg/L) (p < 0.05). Conclusion: The use of grape powder as a phenolic source could play an important role as an antioxidant and anti-inflammatory agent in non-diabetic HD patients.

Relevance: 20.00%

Publisher:

Abstract:

The increasing performance of computers has made it possible to solve algorithmically problems for which manual and possibly inaccurate methods were previously used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure at different areas. In the second problem the input consists of a set of sites where water quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic cost. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted for serial operation or for a small number of CPU cores, whereas one, together with its further developments, is also suitable for parallel machines such as GPUs.
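The thesis's efficient algorithms are not reproduced in the abstract, but the geometric core of the fetch length problem is a ray/segment intersection query. A brute-force sketch for one study point and one compass bearing follows; the compass convention and tolerance are assumptions, and the point of the thesis's algorithms is presumably to avoid or parallelize this naive linear scan over millions of shoreline edges.

```python
import math

def fetch_length(point, bearing_deg, shoreline_segments, max_fetch=float("inf")):
    """Distance from a study point to the nearest shoreline intersection
    along one compass bearing (a plain ray/segment intersection scan)."""
    px, py = point
    theta = math.radians(bearing_deg)
    dx, dy = math.sin(theta), math.cos(theta)   # bearing 0 = north, 90 = east
    best = max_fetch
    for (ax, ay), (bx, by) in shoreline_segments:
        ex, ey = bx - ax, by - ay
        denom = ex * dy - ey * dx
        if abs(denom) < 1e-12:                  # ray parallel to this segment
            continue
        rx, ry = ax - px, ay - py
        t = (ex * ry - ey * rx) / denom         # distance along the ray
        u = (dx * ry - dy * rx) / denom         # position along the segment
        if t > 0 and 0.0 <= u <= 1.0:
            best = min(best, t)
    return best

# Example: a north-south shore 5 units east of the study point.
print(fetch_length((0, 0), 90, [((5, -1), (5, 1))]))   # -> 5.0
```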

Relevance: 20.00%

Publisher:

Abstract:

Objective. Smoking prevalence is highest among the young adult cohort, and post-secondary students are no exception. Although many students intend to quit smoking, no research has established what methods best promote reductions in, or complete abstinence from, smoking. This randomized controlled trial examined the effectiveness of three self-help smoking cessation interventions. Method. On six post-secondary campuses, 483 smokers who voluntarily accessed Leave The Pack Behind (a tobacco control initiative) were randomly assigned to one of three smoking cessation interventions: One Step At A Time (a two-booklet 'gold standard' program for adults); Smoke|Quit (a newly developed two-booklet program for young adult students); and usual care (a 'Quit Kit' containing a booklet on stress management, information about pharmacological quitting aids, and novelty items). All participants also received one proactive telephone support call from a peer counsellor. During the study, 85 participants withdrew. The final sample of 216 students who completed baseline questionnaires and 12-week follow-up telephone interviews was representative of the initial sample in terms of demographic characteristics and smoking- and quitting-related variables. Results. Whether participants quit smoking depended upon treatment condition, χ²(2, N = 216) = 6.34, p = .04, with Smoke|Quit producing more successful quitters (18.4%) than One Step At A Time (4.5%) or the Quit Kit (11.4%). On average, participants had quit for 53.46 days, with no significant difference across treatments. Self-efficacy also increased. Use of the intervention or other quitting aids was not associated with treatment condition. Among the 191 participants who did not quit smoking, treatment condition did not influence outcomes. Overall, 46.2% had made a quit attempt. Significant decreases in weekly tobacco consumption and increases in self-efficacy to resist smoking were observed from baseline to follow-up. Conclusion. Post-secondary institutions represent a potentially final opportunity for age-targeted interventions. Self-help resources tailored to students' social and contextual characteristics will have considerably more impact than stage-only tailored interventions. Both reduction and abstinence outcomes should be emphasized to positively support students to stop smoking.

Relevance: 20.00%

Publisher:

Abstract:

This research attempted to address the question of the role of explicit algorithms and episodic contexts in the acquisition of computational procedures for regrouping in subtraction. Three groups of students having difficulty learning to subtract with regrouping were taught procedures for doing so through either an explicit-algorithm, an episodic-context, or an examples approach. It was hypothesized that the use of an explicit algorithm represented in a flow-chart format would facilitate the acquisition and retention of specific procedural steps relative to the other two conditions. On the other hand, the use of paragraph stories to create episodic context was expected to facilitate the retrieval of algorithms, particularly in a mixed presentation format. The subjects were tested on similar, near, and far transfer questions over a four-day period. Near and far transfer algorithms were also introduced on Day Two. The results suggested that both explicit algorithms and episodic context facilitate performance on questions requiring subtraction with regrouping. However, the differential effects of these two approaches on near and far transfer questions were not as easy to identify. Explicit algorithms may facilitate the acquisition of specific procedural steps while at the same time inhibiting the application of such steps to transfer questions. Similarly, the value of episodic context in cueing the retrieval of an algorithm may be limited by the ability of a subject to identify and classify a new question as an exemplar of a particular episodically defined problem type or category. The implications of these findings in relation to the procedures employed in the teaching of mathematics to students with learning problems are discussed in detail.
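The flow-chart condition makes each procedural step explicit. Purely as an illustration of the procedure being taught (not the study's materials), the column-by-column regrouping algorithm can be written out as follows; decimal digit strings and a non-negative result are assumed.

```python
def subtract_with_regrouping(minuend: str, subtrahend: str) -> str:
    """Column-by-column subtraction with explicit regrouping (borrowing),
    mirroring the steps a flow-chart presentation would spell out.
    Assumes minuend >= subtrahend."""
    width = max(len(minuend), len(subtrahend))
    top = [int(d) for d in minuend.zfill(width)]
    bottom = [int(d) for d in subtrahend.zfill(width)]
    result = [0] * width
    for col in range(width - 1, -1, -1):        # rightmost column first
        if top[col] < bottom[col]:              # regroup: borrow 10 from the left
            top[col] += 10
            top[col - 1] -= 1
        result[col] = top[col] - bottom[col]
    return "".join(map(str, result)).lstrip("0") or "0"

print(subtract_with_regrouping("503", "78"))    # -> "425"
```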

Relevance: 20.00%

Publisher:

Abstract:

The (n, k)-star interconnection network was proposed in 1995 as an attractive alternative to the n-star topology in parallel computation. The (n, k)-star has significant advantages over the n-star, which itself was proposed as an attractive alternative to the popular hypercube. The major advantage of the (n, k)-star network is its scalability, which makes it more flexible than the n-star as an interconnection network. In this thesis, we focus on finding graph-theoretical properties of the (n, k)-star as well as developing parallel algorithms that run on this network. The basic topological properties of the (n, k)-star are first studied. These are useful since they can be used to develop efficient algorithms on this network. We then study the (n, k)-star network from an algorithmic point of view. Specifically, we investigate both fundamental and application algorithms for basic communication, prefix computation, sorting, etc. A literature review of the state of the art in relation to the (n, k)-star network as well as some open problems in this area are also provided.
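For readers unfamiliar with the topology, the (n, k)-star S_{n,k} can be generated directly from its standard definition: vertices are k-permutations of {1, ..., n}, and neighbours are obtained either by swapping the first symbol with the symbol in position i (2 ≤ i ≤ k) or by replacing the first symbol with a symbol not present in the vertex. A brute-force sketch, for illustration only and far from a scalable implementation:

```python
from itertools import permutations

def nk_star_graph(n, k):
    """Adjacency lists of the (n, k)-star S_{n,k}."""
    symbols = range(1, n + 1)
    adj = {}
    for v in permutations(symbols, k):
        nbrs = []
        # i-edges: exchange the first symbol with the i-th symbol.
        for i in range(1, k):
            w = list(v)
            w[0], w[i] = w[i], w[0]
            nbrs.append(tuple(w))
        # 1-edges: replace the first symbol with any symbol not in v.
        for s in symbols:
            if s not in v:
                nbrs.append((s,) + v[1:])
        adj[v] = nbrs
    return adj

# S_{n,k} is regular of degree n - 1; e.g. every vertex of S_{4,2} has degree 3.
g = nk_star_graph(4, 2)
assert all(len(nbrs) == 3 for nbrs in g.values())
```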

Relevance: 20.00%

Publisher:

Abstract:

Bioinformatics applies computers to problems in molecular biology. Previous research has not addressed edit metric decoders. Decoders for quaternary edit metric codes are finding use in bioinformatics problems with applications to DNA. By using side effect machines we hope to be able to provide efficient decoding algorithms for this open problem. Two ideas for decoding algorithms are presented and examined. Both decoders use Side Effect Machines (SEMs), which are generalizations of finite state automata. Single Classifier Machines (SCMs) use a single side effect machine to classify all words within a code. Locking Side Effect Machines (LSEMs) use multiple side effect machines to create a tree structure of subclassification. The goal is to examine these techniques and provide new decoders for existing codes. Ideas for best practices for the creation of these two types of new edit metric decoders are presented.
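The SEM-based decoders themselves are not reproduced here; the baseline they aim to improve on is brute-force nearest-codeword decoding under the edit (Levenshtein) metric. For a quaternary DNA-alphabet code that baseline can be sketched as follows (the example codebook is made up):

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein (edit) distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution / match
        prev = curr
    return prev[-1]

def decode(received, codebook):
    """Brute-force edit-metric decoding: return the nearest codeword."""
    return min(codebook, key=lambda c: edit_distance(received, c))

print(decode("ACGTA", ["ACGTT", "TTTTT", "CCCCC"]))      # -> "ACGTT"
```

The brute-force decoder costs one dynamic-programming table per codeword per received word, which is exactly the expense a trained classifier machine is meant to avoid.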

Relevance: 20.00%

Publisher:

Abstract:

The (n, k)-arrangement interconnection topology was first introduced in 1992. The (n, k)-arrangement graph is a class of generalized star graphs. Compared with the well-known n-star, the (n, k)-arrangement graph is more flexible in degree and diameter. However, few algorithms have been designed for the (n, k)-arrangement graph up to the present. In this thesis, we focus on finding graph-theoretical properties of the (n, k)-arrangement graph and developing parallel algorithms that run on this network. The topological properties of the arrangement graph are first studied, including its cyclic properties. We then study the problems of communication: broadcasting and routing. Embedding problems are also studied later on. These are very useful for developing efficient algorithms on this network. We then study the (n, k)-arrangement network from the algorithmic point of view. Specifically, we investigate both fundamental and application algorithms such as prefix sums computation, sorting, merging, and a basic geometry computation: finding the convex hull on the (n, k)-arrangement graph. A literature review of the state of the art in relation to the (n, k)-arrangement network is also provided, as well as some open problems in this area.
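As with the (n, k)-star above, the (n, k)-arrangement graph A_{n,k} can be generated straight from its standard definition: vertices are k-permutations of {1, ..., n}, and two vertices are adjacent exactly when they differ in a single position. A brute-force sketch for illustration:

```python
from itertools import permutations

def arrangement_graph(n, k):
    """(n, k)-arrangement graph A_{n,k}: vertices are k-permutations of {1..n};
    two vertices are adjacent iff they differ in exactly one position."""
    adj = {}
    for v in permutations(range(1, n + 1), k):
        unused = [s for s in range(1, n + 1) if s not in v]
        nbrs = []
        for pos in range(k):
            for s in unused:             # change one position to an unused symbol
                nbrs.append(v[:pos] + (s,) + v[pos + 1:])
        adj[v] = nbrs
    return adj

# A_{n,k} is regular of degree k*(n-k); e.g. A_{4,2} has degree 2*(4-2) = 4.
g = arrangement_graph(4, 2)
assert all(len(nbrs) == 4 for nbrs in g.values())
```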

Relevance: 20.00%

Publisher:

Abstract:

The hyper-star interconnection network was proposed in 2002 to overcome the drawbacks of the hypercube and its variations concerning network cost, which is defined as the product of the degree and the diameter. Some properties of the graph, such as connectivity, symmetry, and embedding properties, have been studied by other researchers, and routing and broadcasting algorithms have also been designed. This thesis studies the hyper-star graph from both the topological and the algorithmic point of view. For the topological properties, we try to establish relationships between hyper-star graphs and other known graphs. We also give a formal equation for the surface area of the graph. Another topological property we are interested in is the Hamiltonicity of this graph. For the algorithms, we design an all-port broadcasting algorithm and a single-port neighbourhood broadcasting algorithm for the regular form of the hyper-star graphs. These algorithms are both time-optimal. Furthermore, we prove that the folded hyper-star, a variation of the hyper-star, is maximally fault-tolerant.
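Under the definition commonly used in the literature (which this sketch assumes, not the thesis's own formulation), hyper-star vertices are binary strings of a fixed length and weight, and two vertices are adjacent when one is obtained from the other by exchanging the first bit with a later bit of different value. The regular form mentioned in the abstract corresponds to strings of length 2k with exactly k ones:

```python
from itertools import combinations

def hyper_star_graph(m, k):
    """Hyper-star HS(m, k): vertices are binary strings of length m with k ones;
    adjacency comes from swapping the first bit with a later bit of different value."""
    vertices = []
    for ones in combinations(range(m), k):
        bits = ["0"] * m
        for i in ones:
            bits[i] = "1"
        vertices.append("".join(bits))
    adj = {}
    for v in vertices:
        nbrs = []
        for i in range(1, m):
            if v[i] != v[0]:                 # swap front bit with a differing bit
                nbrs.append(v[i] + v[1:i] + v[0] + v[i + 1:])
        adj[v] = nbrs
    return adj

# The regular form HS(2k, k) is k-regular, e.g. HS(6, 3) has degree 3.
g = hyper_star_graph(6, 3)
assert all(len(nbrs) == 3 for nbrs in g.values())
```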

Relevance: 20.00%

Publisher:

Abstract:

The hub location problem is an NP-hard problem that frequently arises in the design of transportation and distribution systems, postal delivery networks, and airline passenger flow. This work focuses on the Single Allocation Hub Location Problem (SAHLP). Genetic Algorithms (GAs) for the capacitated and uncapacitated variants of the SAHLP, based on new chromosome representations and crossover operators, are explored. The GAs are tested on two well-known sets of real-world problems with up to 200 nodes. The obtained results are very promising: for most of the test problems the GA obtains improved or best-known solutions while the computational time remains low. The proposed GAs can easily be extended to other variants of location problems arising in network design and planning in transportation systems.
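The thesis's chromosome representations and crossover operators are not described in the abstract. Purely as an illustration of the kind of encoding a SAHLP GA can use, here is a hedged sketch in which each gene stores the hub a node is allocated to, with the hub set fixed for simplicity; the discount factor alpha = 0.75 and unit collection/distribution coefficients are arbitrary illustrative choices, not values from this work.

```python
import random

def random_chromosome(n_nodes, hubs):
    """One candidate solution: gene i is the hub that node i is allocated to
    (hub nodes are allocated to themselves)."""
    return [i if i in hubs else random.choice(hubs) for i in range(n_nodes)]

def allocation_crossover(parent_a, parent_b):
    """Single-point crossover on the allocation vector; a repair step would be
    needed if the two parents were built from different hub sets."""
    cut = random.randrange(1, len(parent_a))
    return parent_a[:cut] + parent_b[cut:]

def sahlp_cost(chrom, w, c, alpha=0.75):
    """Uncapacitated single-allocation objective: collection + discounted
    inter-hub transfer + distribution, for flow matrix w and unit-cost matrix c."""
    return sum(w[i][j] * (c[i][hi] + alpha * c[hi][hj] + c[hj][j])
               for i, hi in enumerate(chrom)
               for j, hj in enumerate(chrom))
```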

Relevance: 20.00%

Publisher:

Abstract:

The main focus of this thesis is to evaluate and compare the Hyperball learning algorithm (HBL) to other learning algorithms. In this work HBL is compared to feed-forward artificial neural networks using back-propagation learning, k-nearest neighbour, and ID3 algorithms. In order to evaluate the similarity of these algorithms, we carried out three experiments using nine benchmark data sets from the UCI machine learning repository. The first experiment compares HBL to the other algorithms when the sample size of the dataset changes. The second experiment compares HBL to the other algorithms when the dimensionality of the data changes. The last experiment compares HBL to the other algorithms according to the level of agreement with the data target values. In general, our observations showed that, considering classification accuracy as a measure, HBL performs as well as most ANN variants. Additionally, we also found that HBL's classification accuracy outperforms ID3's and k-nearest neighbour's for the selected data sets.
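HBL itself is not described in enough detail in the abstract to reproduce. As a point of reference, the k-nearest-neighbour baseline it is compared against can be sketched in a few lines; Euclidean distance and simple majority voting are assumptions, and any distance or tie-breaking rule could be substituted.

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """k-nearest-neighbour baseline: majority label among the k training
    points closest to x under squared Euclidean distance."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    nearest = sorted(range(len(train_X)), key=lambda i: dist(train_X[i], x))[:k]
    return Counter(train_y[i] for i in nearest).most_common(1)[0][0]

# Example:
print(knn_predict([(0, 0), (0, 1), (5, 5)], ["a", "a", "b"], (0.2, 0.4)))  # -> "a"
```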

Relevance: 20.00%

Publisher:

Abstract:

A double-blind, placebo-controlled, cross-over design was used to investigate whether sodium citrate dihydrate (Na-CIT) supplementation improves 200 m swimming performance. Ten well-trained male swimmers (14.9 ± 0.4 y; 63.5 ± 4 kg) performed four 200 m time trials: acute (ACU) supplementation (0.5 g/kg), acute placebo (PLC-A), chronic (CHR) supplementation (0.1 g/kg for 3 days and 0.3 g/kg on the 4th day, pre-trial), and chronic placebo (PLC-C). Na-CIT was administered 120 min pre-trial in solution with 500 mL of flavored water; the placebo was flavored water alone. Blood lactate, base excess (BE), bicarbonate, pH, and PCO2 were analyzed at baseline, 100 min post-ingestion, and 3 min post-trial via finger prick. Time, lactate, and rating of perceived exertion were not different between trials. BE and bicarbonate were significantly higher in the ACU and CHR trials compared to placebo. "Responders" improved by 1.03% (p = 0.043) and attained significantly higher post-trial lactate concentrations in the ACU versus PLC-A trials and compared to non-responders in the ACU and CHR trials.

Relevance: 20.00%

Publisher:

Abstract:

Hub location problems play vital economic roles in transportation and telecommunication networks, where goods or people must be efficiently transferred from an origin to a destination point while direct origin-destination links are impractical. This work investigates the single allocation hub location problem and proposes a genetic algorithm (GA) approach for it. The effectiveness of using a single-objective criterion measure for the problem is first explored. Next, a multi-objective GA employing various fitness evaluation strategies, such as Pareto ranking, sum of ranks, and weighted sum, is presented. The effectiveness of the multi-objective GA is shown by comparison with an Integer Programming strategy, the only other multi-objective approach found in the literature for this problem. Lastly, two new crossover operators are proposed and an empirical study is done using small to large problem instances of the Civil Aeronautics Board (CAB) and Australian Post (AP) data sets.
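Of the fitness evaluation strategies listed, Pareto ranking is the easiest to make concrete. The following is a generic sketch of Goldberg-style Pareto ranking for a minimization problem, not the thesis's exact implementation:

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_ranks(objective_vectors):
    """Rank 1 = non-dominated front; rank 2 = non-dominated after removing
    rank 1; and so on. Lower rank means better fitness."""
    remaining = set(range(len(objective_vectors)))
    ranks = {}
    rank = 1
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(objective_vectors[j], objective_vectors[i])
                            for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return [ranks[i] for i in range(len(objective_vectors))]

# Example with two objectives to minimize:
print(pareto_ranks([(1, 5), (2, 2), (3, 3), (0, 6)]))   # -> [1, 1, 2, 1]
```

Sum-of-ranks and weighted-sum strategies differ only in how the objective vector is collapsed into a scalar fitness; the GA machinery around them stays the same.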

Relevance: 20.00%

Publisher:

Abstract:

The KCube interconnection topology was first introduced in 2010. The KCube graph is a compound graph of a Kautz digraph and hypercubes. Compared with the attractive Kautz digraph and the well-known hypercube graph, the KCube graph can accommodate as many nodes as possible for a given indegree (and outdegree) and diameter of the interconnection network. However, there are few algorithms designed for the KCube graph. In this thesis, we concentrate on finding graph-theoretical properties of the KCube graph and designing parallel algorithms that run on this network. We explore several topological properties, such as bipartiteness, Hamiltonicity, and symmetry. These properties of the KCube graph are very useful for developing efficient algorithms on this network. We then study the KCube network from the algorithmic point of view and give an improved routing algorithm. In addition, we present two optimal broadcasting algorithms; they are fundamental to many applications. A literature review of state-of-the-art network designs in relation to the KCube network as well as some open problems in this field are also given.
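The KCube construction itself is not reproduced here, but its Kautz-digraph building block is standard. A sketch under the usual definition (strings over d+1 symbols with no two consecutive symbols equal, arcs given by a left shift); compounding with hypercubes would then replace each Kautz vertex by a hypercube of supernodes.

```python
from itertools import product

def kautz_digraph(d, length):
    """Kautz digraph: vertices are strings of the given length over an alphabet
    of d+1 symbols with no two consecutive symbols equal; there is an arc
    u -> v whenever v is obtained by dropping u's first symbol and appending one."""
    alphabet = range(d + 1)
    vertices = [s for s in product(alphabet, repeat=length)
                if all(a != b for a, b in zip(s, s[1:]))]
    return {v: [v[1:] + (s,) for s in alphabet if s != v[-1]] for v in vertices}

# (d+1)*d**(length-1) vertices, each with out-degree d.
g = kautz_digraph(2, 3)
assert len(g) == 3 * 2 ** 2 and all(len(out) == 2 for out in g.values())
```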

Relevance: 20.00%

Publisher:

Abstract:

Population-based metaheuristics, such as particle swarm optimization (PSO), have been employed to solve many real-world optimization problems. Although it is often sufficient to find a single solution to these problems, there exist cases where identifying multiple, diverse solutions can be beneficial or even required. Some of these problems are further complicated by a change in their objective function over time. This type of optimization is referred to as dynamic, multi-modal optimization. Algorithms which exploit multiple optima in a search space are identified as niching algorithms. Although numerous dynamic niching algorithms have been developed, their performance is often measured solely on their ability to find a single, global optimum. Furthermore, the comparisons often use synthetic benchmarks whose landscape characteristics are generally limited and unknown. This thesis provides a landscape analysis of the dynamic benchmark functions commonly developed for multi-modal optimization. The benchmark analysis results reveal that the mechanisms responsible for dynamism in the current dynamic benchmarks do not significantly affect landscape features, thus suggesting a lack of representation of problems whose landscape features vary over time. This analysis is used in a comparison of current niching algorithms to identify the effects that specific landscape features have on niching performance. Two performance metrics are proposed to measure both the scalability and accuracy of the niching algorithms. The algorithm comparison results demonstrate which algorithms are best suited to a variety of dynamic environments. This comparison also examines each of the algorithms in terms of its niching behaviour and analyzes the range of, and trade-off between, scalability and accuracy when tuning the algorithms' respective parameters. These results contribute to the understanding of current niching techniques as well as the problem features that ultimately dictate their success.
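The niching variants compared in the thesis all build on the canonical global-best PSO. For orientation only, here is a minimal sketch of that baseline; the inertia and acceleration coefficients are conventional defaults rather than values from the thesis, and niching variants would replace the single global best with per-species or per-neighbourhood bests so that several optima can be tracked at once.

```python
import random

def pso(objective, dim, bounds, swarm_size=30, iters=200,
        w=0.72, c1=1.49, c2=1.49):
    """Minimal global-best PSO for minimization."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]                     # personal best positions
    pbest_val = [objective(p) for p in pos]
    gbest = min(range(swarm_size), key=lambda i: pbest_val[i])
    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (pbest[gbest][d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < pbest_val[gbest]:
                    gbest = i
    return pbest[gbest], pbest_val[gbest]

# Example: minimize the sphere function in 2 dimensions.
best_x, best_f = pso(lambda x: sum(v * v for v in x), dim=2, bounds=(-5, 5))
```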