86 results for constrained reconstruction
Abstract:
Goal modelling is a well-known, rigorous method for analysing problem rationale and developing requirements, but under the pressures typical of time-constrained projects its benefits are not accessible: creating the goal graph takes effort and time, and reading the results can be difficult owing to the effects of crosscutting concerns. Here we introduce an adaptation of KAOS to meet the needs of rapid turnaround and clarity. The main aim is to help stakeholders gain insight into the larger issues that might be overlooked if they make a premature start on implementation. The method emphasises the use of obstacles, accepts under-refined goals, and adds new techniques for managing crosscutting concerns and strategic decision making. It is expected to be of value to agile as well as traditional processes.
Abstract:
The Dog-in-a-Doublet Bridge Reconstruction Scheme integrates interdisciplinary design to meet several needs: to provide a crossing that carries the new 40-tonne loading requirement, to improve the visibility of the substandard junction, and to do so within the funding available. Managing the project involved co-ordinating different authorities, statutory undertakers and other bodies. At certain stages there were negotiations with the RSPB over restricting construction during the period from July to November; after re-assessment of the construction's environmental impact on the breeding and wintering birds, the restriction was waived. Because the bid for the assessment, strengthening and structural maintenance of bridges in the Cambridgeshire County Council Transport Policies and Programme No. 21 (1995/96) for the Dog-in-a-Doublet Bridge Reconstruction Scheme failed to attract the Transport Supplementary Grant (TSG), a series of temporary measures had to be undertaken until funding became available for the bridge's replacement.
Abstract:
A method is discussed for imposing any desired constraint on the force field obtained in a force constant refinement calculation. The application of this method to force constant refinement calculations for the methyl halide molecules is reported. All available data on the vibration frequencies, Coriolis interaction constants and centrifugal stretching constants of CH3X and CD3X molecules were used in the refinements, but despite this apparent abundance of data it was found that constraints were necessary in order to obtain a unique solution to the force field. The results of unconstrained calculations, and of three different constrained calculations, are reported in this paper. The constrained models reported are a Urey–Bradley force field, a modified valence force field, and a constraint based on orbital-following bond-hybridization arguments developed in the following paper. The results are discussed, and compared with previous results for these molecules. The third of the above models is found to reproduce the observed data better than either of the first two, and additional reasons are given for preferring this solution to the force field for the methyl halide molecules.
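The idea of imposing a constraint during refinement can be illustrated with a generic linearised least-squares step. The matrices below are toy numbers, not spectroscopic data, and the null-space elimination shown is just one standard way to enforce a linear constraint on the fitted parameters:

```python
import numpy as np

def constrained_lstsq(A, b, C, d):
    """Minimise ||A x - b|| subject to the linear constraint C x = d,
    via the null-space (elimination) method."""
    # particular solution of the constraint
    x0, *_ = np.linalg.lstsq(C, d, rcond=None)
    # basis for the null space of C
    _, s, Vt = np.linalg.svd(C)
    rank = int(np.sum(s > 1e-12))
    N = Vt[rank:].T                      # columns span null(C)
    # solve the reduced, unconstrained problem inside the null space
    z, *_ = np.linalg.lstsq(A @ N, b - A @ x0, rcond=None)
    return x0 + N @ z

# toy refinement with three "force constants"; constrain x[0] - x[1] = 0
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0, 6.0])
C = np.array([[1.0, -1.0, 0.0]])
d = np.array([0.0])
x = constrained_lstsq(A, b, C, d)
print(np.round(x, 3))   # x[0] == x[1] by construction
```

The same machinery handles any number of linear constraints; nonlinear constraints (such as the bond-hybridization condition) would first be linearised about the current parameter estimate.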
Abstract:
The proportional odds model provides a powerful tool for analysing ordered categorical data and setting sample size, although for many clinical trials its validity is questionable. The purpose of this paper is to present a new class of constrained odds models which includes the proportional odds model. The efficient score and Fisher's information are derived from the profile likelihood for the constrained odds model. These results are new even for the special case of proportional odds where the resulting statistics define the Mann-Whitney test. A strategy is described involving selecting one of these models in advance, requiring assumptions as strong as those underlying proportional odds, but allowing a choice of such models. The accuracy of the new procedure and its power are evaluated.
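Since the special case of proportional odds yields statistics defining the Mann-Whitney test, a minimal sketch of the Mann-Whitney U statistic with midranks (suitable for tied, ordered categorical data) may help. This is the generic textbook construction, not the paper's profile-likelihood score statistic:

```python
import numpy as np

def mann_whitney_u(x, y):
    """Mann-Whitney U via midranks, handling the ties that arise with
    ordered categorical data."""
    combined = np.concatenate([x, y])
    order = np.argsort(combined, kind="mergesort")
    ranks = np.empty(len(combined))
    sorted_vals = combined[order]
    i = 0
    while i < len(sorted_vals):
        j = i
        while j < len(sorted_vals) and sorted_vals[j] == sorted_vals[i]:
            j += 1
        # midrank: the average of ranks i+1 .. j for the tied group
        ranks[order[i:j]] = 0.5 * (i + j + 1)
        i = j
    r1 = ranks[:len(x)].sum()
    return r1 - len(x) * (len(x) + 1) / 2

x = np.array([1, 2, 2, 3])   # ordinal scores, group 1
y = np.array([2, 3, 3, 4])   # ordinal scores, group 2
print(mann_whitney_u(x, y))  # -> 3.0
```

A useful sanity check is that U computed from each side sums to n1 * n2.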
Abstract:
Motivation: We compare phylogenetic approaches for inferring functional gene links. The approaches detect independent instances of the correlated gain and loss of pairs of genes from species' genomes. We investigate the effect on results of basing evidence of correlations on two phylogenetic approaches, Dollo parsimony and maximum likelihood (ML). We further examine the effect of constraining the ML model by fixing the rate of gene gain at a low value, rather than estimating it from the data. Results: We detect correlated evolution among a test set of pairs of yeast (Saccharomyces cerevisiae) genes, with a case study of 21 eukaryotic genomes and test data derived from known yeast protein complexes. If the rate at which genes are gained is constrained to be low, ML achieves by far the best results at detecting known functional links. The model then has fewer parameters, yet it is more realistic because it effectively prevents genes from being gained more than once. Availability: BayesTraits by M. Pagel and A. Meade, and a script to configure and repeatedly launch it by D. Barker and M. Pagel, are available at http://www.evolution.reading.ac.uk .
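The effect of fixing the rate of gene gain at a low value can be illustrated with the closed-form transition probabilities of a two-state (absent/present) continuous-time Markov chain. This is a generic sketch of the underlying model, not BayesTraits' actual likelihood machinery:

```python
import numpy as np

def transition_matrix(gain, loss, t):
    """Transition probabilities of a 2-state (absent=0 / present=1)
    CTMC over branch length t, from the standard closed-form solution."""
    s = gain + loss
    e = np.exp(-s * t)
    return np.array([
        [loss / s + gain / s * e, gain / s * (1 - e)],
        [loss / s * (1 - e),      gain / s + loss / s * e],
    ])

# constraining the gain rate to be very low makes re-gain of a lost
# gene along a branch almost impossible, as the abstract describes
P_free = transition_matrix(gain=1.0, loss=1.0, t=0.5)
P_constrained = transition_matrix(gain=1e-4, loss=1.0, t=0.5)
print(P_constrained[0, 1])   # probability of a gain: near zero
```

In a full analysis these matrices would be multiplied down the branches of the 21-genome tree and summed over internal states to give the likelihood of each gene's presence/absence pattern.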
Abstract:
The node-density effect is an artifact of phylogeny reconstruction that can cause branch lengths to be underestimated in areas of the tree with fewer taxa. Webster, Payne, and Pagel (2003, Science 301:478) introduced a statistical procedure (the "delta" test) to detect this artifact, and here we report the results of computer simulations that examine the test's performance. In a sample of 50,000 random data sets, we find that the delta test detects the artifact in 94.4% of cases in which it is present. When the artifact is not present (n = 10,000 simulated data sets) the test showed a type I error rate of approximately 1.69%, incorrectly reporting the artifact in 169 data sets. Three measures of tree shape or "balance" failed to predict the size of the node-density effect. This may reflect the relative homogeneity of our randomly generated topologies, but emphasizes that nearly any topology can suffer from the artifact, the effect not being confined only to highly unevenly sampled or otherwise imbalanced trees. The ability to screen phylogenies for the node-density artifact is important for phylogenetic inference and for researchers using phylogenetic trees to infer evolutionary processes, including their use in molecular clock dating. [Delta test; molecular clock; molecular evolution; node-density effect; phylogenetic reconstruction; speciation; simulation.]
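The precision of the reported 169/10,000 type I error estimate can be gauged with a standard Wilson score interval. This is a back-of-envelope check on the simulation numbers quoted above, not part of the original study:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_interval(169, 10_000)
print(f"type I error rate: 1.69% (95% CI {lo:.4f}-{hi:.4f})")
```

With 10,000 null data sets the interval is narrow, so the roughly 1.7% false-positive rate is well pinned down by the simulation.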
Abstract:
Twenty-eight microsatellite primer pairs developed from Fragaria vesca ‘Rügen’ were applied to sixteen accessions representing eight diploid Fragaria species. The number of alleles generated, the power of discrimination and the percentage of accessions where no PCR product could be amplified were calculated for each locus for the thirteen non-F. vesca accessions. A phylogeny was then generated for the species accessions sampled, using the presence or absence of alleles at the polymorphic loci as character states. Despite the problems inherent in phylogeny reconstruction from microsatellite data, the phylogeny showed some congruence with a previously published phylogeny of Fragaria, based on nucleotide sequence data. However, relationships inferred from microsatellite allele data were relatively unresolved and poorly supported. The genetic basis of allelic polymorphisms at specific loci was investigated through direct sequencing of the PCR products amplified by three primer pairs. The potential utility of sequence data generated from microsatellite loci in evolutionary studies of closely related species groups is briefly explored.
Abstract:
A novel type of tweezer molecule containing electron-rich 2-pyrenyloxy arms has been designed to exploit intramolecular hydrogen bonding in stabilising a preferred conformation for supramolecular complexation to complementary sequences in aromatic copolyimides. This tweezer-conformation is demonstrated by single-crystal X-ray analyses of the tweezer molecule itself and of its complex with an aromatic diimide model-compound. In terms of its ability to bind selectively to polyimide chains, the new tweezer molecule shows very high sensitivity to sequence effects. Thus, even low concentrations of tweezer relative to diimide units (<2.5 mol%) are sufficient to produce dramatic, sequence-related splittings of the pyromellitimide proton NMR resonances. These induced resonance-shifts arise from ring-current shielding of pyromellitimide protons by the pyrenyloxy arms of the tweezer-molecule, and the magnitude of such shielding is a function of the tweezer-binding constant for any particular monomer sequence. Recognition of both short-range and long-range sequences is observed, the latter arising from cumulative ring-current shielding of diimide protons by tweezer molecules binding at multiple adjacent sites on the copolymer chain.
Application of olefin metathesis for the synthesis of constrained beta-amino esters from norbornenes
Abstract:
Synthesis of a number of novel, conformationally rigid beta-amino esters has been achieved via a tandem olefin metathesis reaction. The starting materials are readily accessible from the Diels-Alder adduct between cyclopentadiene and maleic anhydride.
Abstract:
This paper describes the SIMULINK implementation of a constrained predictive control algorithm based on quadratic programming and linear state space models, and its application to a laboratory-scale 3D crane system. The algorithm is compatible with Real-Time Windows Target and, in the case of the crane system, it can be executed with a sampling period of 0.01 s and a prediction horizon of up to 300 samples, using a linear state space model with 3 inputs, 5 outputs and 13 states.
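A receding-horizon controller of this kind can be sketched generically. The model below is a toy double integrator rather than the 13-state crane model, and SciPy's SLSQP solver stands in for the paper's quadratic-programming routine; the input bounds play the role of the constraints:

```python
import numpy as np
from scipy.optimize import minimize

# toy discrete double-integrator model, Ts = 0.01 s (a hypothetical
# stand-in for the crane's 13-state model)
A = np.array([[1.0, 0.01], [0.0, 1.0]])
B = np.array([[0.0], [0.01]])
H = 30                                    # prediction horizon (samples)

def predict(x0, u_seq):
    """Roll the linear model forward over the horizon."""
    x, xs = x0, []
    for u in u_seq:
        x = A @ x + B.flatten() * u
        xs.append(x)
    return np.array(xs)

def cost(u_seq, x0, ref):
    """Quadratic tracking cost with a small input penalty."""
    xs = predict(x0, u_seq)
    return np.sum((xs[:, 0] - ref) ** 2) + 1e-3 * np.sum(u_seq ** 2)

x0 = np.array([0.0, 0.0])
res = minimize(cost, np.zeros(H), args=(x0, 1.0),
               bounds=[(-5.0, 5.0)] * H, method="SLSQP")
u0 = res.x[0]        # receding horizon: apply only the first input
print(round(u0, 2))
```

At each sampling instant the optimisation is re-solved from the newly measured state, which is why the achievable horizon length (up to 300 samples in the paper) is limited by the QP solution time relative to the 0.01 s period.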
Abstract:
Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward constrained regression manner. The leave-one-out (LOO) test score is used for kernel selection. The jackknife parameter estimator, subject to a positivity constraint check, is used for the parameter estimation of a single parameter at each forward step. As such, the proposed approach is simple to implement and the associated computational cost is very low. An illustrative example is employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with comparable accuracy to that of the classical Parzen window estimate.
Abstract:
The 3D reconstruction of a Golgi-stained dendritic tree from a serial stack of images captured with a transmitted light bright-field microscope is investigated. Modifications to the bootstrap filter are discussed such that the tree structure may be estimated recursively as a series of connected segments. The tracking performance of the bootstrap particle filter is compared against Differential Evolution, an evolutionary global optimisation method, both in terms of robustness and accuracy. It is found that the particle filtering approach is significantly more robust and accurate for the data considered.
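A bootstrap particle filter of the textbook kind can be sketched in a few lines. The 1D random-walk model below is a stand-in, not the paper's dendritic-segment state model, but it shows the propagate/weight/resample cycle that the modified filter builds on:

```python
import numpy as np

def bootstrap_filter(obs, n_particles=500, q=0.1, r=0.5, seed=0):
    """Generic 1D bootstrap particle filter: propagate with the motion
    model, weight by the observation likelihood, resample."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for y in obs:
        # propagate: random-walk motion model with noise std q
        particles = particles + rng.normal(0.0, q, n_particles)
        # weight: Gaussian observation likelihood with noise std r
        w = np.exp(-0.5 * ((y - particles) / r) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * particles))
        # multinomial resampling to avoid weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return np.array(estimates)

# track a slowly drifting signal observed in noise
t = np.linspace(0, 1, 50)
truth = 2.0 * t
rng = np.random.default_rng(1)
obs = truth + rng.normal(0.0, 0.5, len(t))
est = bootstrap_filter(obs)
mae = float(np.abs(est - truth).mean())
print(round(mae, 2))
```

For dendrite tracing the state would instead describe the position, direction and width of the current tree segment, with branching handled by the modifications the paper discusses.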
Abstract:
This paper describes a new method for reconstructing a 3D surface from a small number, e.g. 10, of 2D photographic images. The images are taken from different viewing directions by a perspective camera with full prior knowledge of the camera configurations. The reconstructed object's surface is represented as a set of triangular facets. We empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, the reconstructed 3D points cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not undersampled or underrepresented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape, or curvature, of the surface, regardless of the size of the surface or of the object.
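The curvature-dependent clustering can be mimicked on a 2D contour: sampling an ellipse with probability proportional to discrete curvature concentrates points at the sharp ends, analogous to what the abstract describes for highly curved surface regions. This is an illustrative analogue, not the paper's reconstruction method:

```python
import numpy as np

def curvature(xs, ys):
    """Discrete curvature of a uniformly parameterised 2D contour,
    from the standard formula |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)."""
    dx, dy = np.gradient(xs), np.gradient(ys)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# ellipse: curvature is highest at the ends of the major axis
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
xs, ys = 3 * np.cos(t), np.sin(t)
k = curvature(xs, ys)

# resample with density proportional to curvature, mimicking the
# clustering of reconstructed points on highly curved regions
p = k / k.sum()
rng = np.random.default_rng(0)
idx = rng.choice(len(t), size=100, p=p)
# most sampled indices fall near t = 0 or t = pi (the sharp ends)
```

The same principle in 3D means that a complex, highly curved surface automatically receives more reconstructed points than a smooth one of equal size.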