984 results for parallel application
Abstract:
In the past two decades, a growing portion of robotics research has focused on a particular group of machines belonging to the family of parallel manipulators: cable robots. Although these robots share several theoretical elements with the better-known parallel robots, they still present completely (or partly) unsolved issues. In particular, the study of their kinematics, already a difficult subject for conventional parallel manipulators, is further complicated by the non-linear nature of cables, which can exert only tensile forces. The work presented in this thesis therefore focuses on the study of the kinematics of these robots and on the development of numerical techniques able to address some of the related problems. Most of the work concerns the development of an interval-analysis-based procedure for the solution of the direct geometric problem of a generic cable manipulator. Besides allowing a rapid solution of the problem, this technique also guarantees the results against rounding and elimination errors and can take into account any uncertainties in the model. The developed code has been tested with the help of a small manipulator whose realization is described in this dissertation, together with the auxiliary work done during its design and simulation phases.
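As an illustration of the kind of consistency test an interval-analysis solver of the direct geometric problem relies on, here is a minimal Python sketch: naive interval arithmetic evaluates each cable-length constraint over a planar pose box and discards boxes that cannot contain a solution. The anchor coordinates and measured lengths are invented for the example; this is not the code developed in the thesis.

```python
# Minimal sketch of interval evaluation for cable-length constraints over a
# hypothetical planar set-up (anchor points and lengths are made up).  A pose
# box [x] x [y] is kept only if every cable-length interval contains the
# measured length; this illustrates the basic consistency test behind an
# interval-analysis forward-kinematics solver, not the thesis implementation.
import math

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __sub__(self, s):                 # subtract a scalar
        return Interval(self.lo - s, self.hi - s)
    def __add__(self, other):             # interval + interval
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def sqr(self):                        # [x]^2, handling intervals containing 0
        cands = (self.lo * self.lo, self.hi * self.hi)
        lo = 0.0 if self.lo <= 0.0 <= self.hi else min(cands)
        return Interval(lo, max(cands))
    def sqrt(self):                       # assumes lo >= 0
        return Interval(math.sqrt(self.lo), math.sqrt(self.hi))
    def contains(self, v):
        return self.lo <= v <= self.hi

def consistent(box, anchors, lengths):
    """True if the pose box may contain a platform point matching all cable lengths."""
    x, y = box
    for (ax, ay), L in zip(anchors, lengths):
        dist = ((x - ax).sqr() + (y - ay).sqr()).sqrt()
        if not dist.contains(L):
            return False
    return True

anchors = [(0.0, 0.0), (4.0, 0.0)]                       # hypothetical cable exit points
lengths = [math.hypot(2.0, 1.5), math.hypot(2.0, 1.5)]   # cables meeting at point (2, 1.5)
print(consistent((Interval(1.9, 2.1), Interval(1.4, 1.6)), anchors, lengths))  # True
print(consistent((Interval(3.0, 3.5), Interval(0.0, 0.5)), anchors, lengths))  # False
```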
Abstract:
Overwhelming evidence shows that the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings or extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement was further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials.
Abstract:
Two novel bicyclo-T nucleosides carrying a hydroxyl or a carboxymethyl substituent in the C(6')-α-position were prepared and incorporated into oligodeoxynucleotides. During oligonucleotide deprotection, the carboxymethyl substituent was converted into different amide substituents in a parallel way. Tm measurements showed no dramatic differences in either thermal affinity or mismatch discrimination compared to unmodified oligonucleotides. The post-synthetic modification of the carboxymethyl substituent allows in principle for the parallel preparation of a library of oligonucleotides carrying diverse substituents at C(6'). In addition, functional groups can be placed at unique positions in a DNA double helix.
Abstract:
Arachnomelia is a monogenic recessive defect of skeletal development in cattle. The causative mutation was previously mapped to an approximately 7 Mb interval on chromosome 5. Here we show that array-based sequence capture and massively parallel sequencing technology, combined with the typical family structure in livestock populations, facilitates the identification of the causative mutation. We re-sequenced the entire critical interval in a healthy partially inbred cow carrying one copy of the critical chromosome segment in its ancestral state and one copy of the same segment with the arachnomelia mutation, and we detected a single heterozygous position. The genetic makeup of several partially inbred cattle provides extremely strong support for the causality of this mutation. The mutation represents a single-base insertion leading to a premature stop codon in the coding sequence of the SUOX gene and is perfectly associated with the arachnomelia phenotype. Our findings suggest an important role for sulfite oxidase in bone development.
Abstract:
Background: The European trout (Salmo trutta species complex) occurs across a very wide altitudinal range, from lowland rivers to alpine streams. Historically, the major European river systems contained different, evolutionarily distinct trout lineages, and some of this genetic diversity has persisted in spite of extensive human-mediated translocations. We used AFLP-based genome scans to investigate the extent of potentially adaptive divergence among major drainages and along altitudinal gradients replicated in several rivers. Results: The proportion of loci showing evidence of divergent selection was larger between drainages than along altitudinal transects within drainages. This suggests that divergent selection is stronger between drainages, or that adaptive divergence is constrained by gene flow among populations within drainages, although the latter could not be confirmed at a more local scale. Still, altitudinal divergence occurred and, at approximately 2% of the markers, parallel changes of the AFLP band frequencies with altitude were observed, suggesting that altitude may well be an important source of divergent selection within rivers. Conclusions: Our results indicate that adaptive genetic divergence is common both between major European river systems and along altitudinal gradients within drainages. Alpine trout appear to be a promising model system to investigate the relative roles of divergent selection and gene flow in promoting or preventing adaptation to climate gradients.
Abstract:
An important problem in computational biology is finding the longest common subsequence (LCS) of two nucleotide sequences. This paper examines the correctness and performance of a recently proposed parallel LCS algorithm that uses successor tables and pruning rules to construct a list of sets from which an LCS can be easily reconstructed. Counterexamples are given for two pruning rules that were given with the original algorithm. Because of these errors, performance measurements originally reported cannot be validated. The work presented here shows that speedup can be reliably achieved by an implementation in Unified Parallel C that runs on an Infiniband cluster. This performance is partly facilitated by exploiting the software cache of the MuPC runtime system. In addition, this implementation achieved speedup without bulk memory copy operations and the associated programming complexity of message passing.
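For reference, a plain serial dynamic-programming LCS is sketched below as a correctness baseline: the textbook O(nm) table with a traceback. It is not the successor-table, pruning-based parallel algorithm examined in the paper, nor its UPC implementation.

```python
# Reference (serial) dynamic-programming LCS of two sequences; useful for
# validating the output of a parallel implementation on small inputs.
def lcs(a, b):
    n, m = len(a), len(b)
    # dp[i][j] = length of an LCS of a[:i] and b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # trace back one LCS
    out, i, j = [], n, m
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("GATTACA", "GCATGCU"))   # prints one LCS of length 4
```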
Abstract:
Linear programs, or LPs, are often used in optimization problems, such as improving manufacturing efficiency or maximizing the yield from limited resources. The most common method for solving LPs is the Simplex Method, which will yield a solution, if one exists, over the real numbers. From a purely numerical standpoint this solution is optimal, but quite often we desire an optimal integer solution. A linear program in which the variables are also constrained to be integers is called an integer linear program, or ILP. The focus of this report is to present a parallel algorithm for solving ILPs. We discuss a serial algorithm using a breadth-first branch-and-bound search to check the feasible solution space, and then extend it into a parallel algorithm using a client-server model. In the parallel mode, the search may not be truly breadth-first, depending on the solution time for each node in the solution tree. Our search takes advantage of pruning, often resulting in super-linear improvements in solution time. Finally, we present results from sample ILPs, describe a few modifications to enhance the algorithm and improve solution time, and offer suggestions for future work.
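A minimal serial sketch of the breadth-first branch-and-bound idea is given below, using SciPy's linprog for the LP relaxations. The tiny problem at the end is made up for illustration; this is not the report's client-server implementation.

```python
# Toy breadth-first branch-and-bound for a small ILP (maximisation, x >= 0 integer),
# using scipy.optimize.linprog for the LP relaxations.
from collections import deque
import math
from scipy.optimize import linprog

def solve_ilp_max(c, A_ub, b_ub, n_vars, tol=1e-6):
    """Maximise c.x subject to A_ub.x <= b_ub, x >= 0 and integer (sketch)."""
    best_val, best_x = -math.inf, None
    queue = deque([[(0, None)] * n_vars])           # FIFO queue of per-variable bounds
    while queue:
        bounds = queue.popleft()
        res = linprog([-ci for ci in c], A_ub=A_ub, b_ub=b_ub,
                      bounds=bounds, method="highs")
        if not res.success or -res.fun <= best_val:
            continue                                # infeasible node, or pruned by bound
        frac = [i for i, xi in enumerate(res.x) if abs(xi - round(xi)) > tol]
        if not frac:                                # integral relaxation: new incumbent
            best_val, best_x = -res.fun, [int(round(xi)) for xi in res.x]
            continue
        i = frac[0]                                 # branch on the first fractional variable
        lo, hi = bounds[i]
        for new_lo, new_hi in ((lo, math.floor(res.x[i])), (math.ceil(res.x[i]), hi)):
            if new_hi is not None and new_lo > new_hi:
                continue                            # empty branch
            child = list(bounds)
            child[i] = (new_lo, new_hi)
            queue.append(child)
    return best_val, best_x

# Maximise 5x + 4y subject to 6x + 4y <= 24 and x + 2y <= 6 -> optimum 20 at (4, 0).
print(solve_ilp_max([5, 4], [[6, 4], [1, 2]], [24, 6], n_vars=2))
```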
Abstract:
Small clusters of gallium oxide, a technologically important high-temperature ceramic, together with the interaction of nucleic acid bases with graphene and a small-diameter carbon nanotube, are the focus of the first-principles calculations in this work. A high-performance parallel computing platform was also developed at Michigan Tech to perform these calculations. The first-principles calculations are based on density functional theory, employing either the local density or a gradient-corrected approximation together with plane-wave and Gaussian basis sets. Bulk Ga2O3 is known to be a very good candidate for fabricating electronic devices that operate at high temperatures. To explore the properties of Ga2O3 at the nanoscale, we have performed a systematic theoretical study of small polyatomic gallium oxide clusters. The calculated results show that all lowest-energy isomers of GamOn clusters are dominated by Ga-O bonds over metal-metal or oxygen-oxygen bonds. Analysis of atomic charges suggests the clusters to be highly ionic, similar to the case of bulk Ga2O3. In the study of the sequential oxidation of these clusters starting from Ga2O, it is found that the most stable isomers display up to four different backbones of constituent atoms. Furthermore, the predicted configuration of the ground state of Ga2O was recently confirmed by the experimental results of Neumark's group. Guided by the results of the calculations on gallium oxide clusters, the performance-related challenge of such computational simulations, namely that of producing high-performance computing platforms, has been addressed. Several engineering aspects were thoroughly studied during the design, development, and implementation of the high-performance parallel computing platform, rama, at Michigan Tech. In an attempt to stay true to the principles of the Beowulf revolution, the rama cluster was extensively customized to make it easy to understand and use, for administrators as well as end-users. Following the results of benchmark calculations, and to keep up with the complexity of the systems under study, rama has been expanded to a total of sixty-four processors. Interest in the non-covalent interaction of DNA with carbon nanotubes has steadily increased during the past several years. This hybrid system, at the junction of the biological regime and the nanomaterials world, possesses features which make it very attractive for a wide range of applications. Using the in-house computational power available, we have studied details of the interaction of nucleic acid bases with a graphene sheet as well as with a high-curvature, small-diameter carbon nanotube. The calculated trend in the binding energies strongly suggests that the polarizability of the base molecules determines the interaction strength of the nucleic acid bases with graphene. When comparing the results obtained here for physisorption on the small-diameter nanotube with those from the study on graphene, it is observed that the interaction strength of the nucleic acid bases is smaller for the tube. Thus, these results show that the effect of introducing curvature is to reduce the binding energy. The binding energies for the two extreme cases of negligible curvature (i.e., a flat graphene sheet) and of very high curvature (i.e., a small-diameter nanotube) may be considered upper and lower bounds. This finding represents an important step towards a better understanding of the experimentally observed sequence-dependent interaction of DNA with carbon nanotubes.
Abstract:
Does there exist a Steiner Triple System on v points whose blocks can be partitioned into partial parallel classes of size m, where m ≤ ⌊v/3⌋, m | b, and b is the number of blocks of the STS(v)? We give the answer for 9 ≤ v ≤ 43. We also show that whenever 2 | b and v ≡ 3 (mod 6), we can find an STS(v) whose blocks can be partitioned into partial parallel classes of size 2, and whenever 4 | b and v ≡ 3 (mod 6), there exists an STS(v) whose blocks can be partitioned into partial parallel classes of size 4.
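As a quick illustration of the arithmetic behind the question (an STS(v) has b = v(v-1)/6 blocks), the sketch below lists, for a given v, the candidate class sizes m with m ≤ ⌊v/3⌋ and m | b; it only checks the divisibility condition and does not construct the partitions themselves.

```python
# Sketch: candidate partial-parallel-class sizes m for an STS(v), i.e.
# m <= floor(v/3) and m | b, where b = v(v-1)/6 is the block count.
def candidate_class_sizes(v):
    assert v % 6 in (1, 3), "an STS(v) exists only for v ≡ 1, 3 (mod 6)"
    b = v * (v - 1) // 6
    return b, [m for m in range(1, v // 3 + 1) if b % m == 0]

print(candidate_class_sizes(9))    # (12, [1, 2, 3])
print(candidate_class_sizes(43))   # (301, [1, 7])
```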
Abstract:
The solution-phase parallel synthesis involving reactions of Baylis-Hillman products of 3-substituted-5-isoxazolecarbaldehydes with nucleophiles, and their in vivo antithrombotic evaluations, are described along with the results of an in vitro platelet aggregation inhibition assay of a few compounds. Results of the detailed evaluation of one of the compounds as an inhibitor of platelet aggregation are also presented.
Abstract:
AIMS: The aim of this single-site, randomized, controlled, double-blind, 3-arm parallel study was to determine the effectiveness of a prophylaxis paste containing 15% calcium sodium phosphosilicate (CSPS; NovaMin®) with and without fluoride in reducing dentine hypersensitivity immediately after a single application and 28 days following dental scaling and root planing. MATERIALS & METHODS: Overall, 151 subjects were enrolled in this study. All subjects received a scaling and root planing procedure followed by a final prophylaxis step using one of three different prophylaxis pastes: Test-A (15% NovaMin® and NaF), Test-B (15% NovaMin®) and a control. Dentine hypersensitivity was assessed by tactile stimulus (Yeaple Probe®) and by air blast (Schiff scale) at baseline, immediately after and 28 days after a prophylaxis procedure. One hundred and forty-nine subjects completed the study. RESULTS: Subjects having received the test prophylaxis pastes showed statistically lower (ANOVA, p < 0.05) dentine hypersensitivity compared with the control group immediately after the prophylaxis procedure (Yeaple Probe®: Test-A = 20.9 ± 12.6, Test-B = 22.7 ± 12.9, Control = 11.2 ± 3.1; Schiff score: Test-A = 1.1 ± 0.6, Test-B = 1.1 ± 0.6, Control = 2.0 ± 0.7) and after 28 days (Yeaple Probe®: Test-A = 21.5 ± 11.9, Test-B = 20.6 ± 11.3, Control = 11.8 ± 6.0; Schiff score: Test-A = 1.0 ± 0.6, Test-B = 1.0 ± 0.6, Control = 2.0 ± 0.7). CONCLUSIONS: In conclusion, the single application of both fluoridated and non-fluoridated prophylaxis pastes containing 15% CSPS (NovaMin®) provided a significant reduction of dentine hypersensitivity up to at least 28 days.