16 results for biological reference points
Abstract:
This paper details work carried out to verify the dimensional measurement performance of the Indoor GPS (iGPS) system, a network of Rotary-Laser Automatic Theodolites (R-LATs). Initially, tests were carried out to determine the angular uncertainties of an individual R-LAT transmitter-receiver pair. A method is presented for determining the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer, with the multilateration technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. The method was found to be practical and able to establish that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. Further tests carried out on a highly optimized version of the iGPS system have shown that the coordinate uncertainty can be reduced to 0.25 mm at a 95% confidence level.
Abstract:
This paper details a method of estimating the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer, with a multilateration-like technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device was tested, consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs); this system is known commercially as indoor GPS (iGPS). The method was found to be practical and was used to estimate that the uncertainty of measurement for the basic iGPS system is approximately 1 mm at a 95% confidence level throughout a measurement volume of approximately 10 m × 10 m × 1.5 m. © 2010 IOP Publishing Ltd.
Abstract:
This paper details a method of determining the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer, with the multilateration technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device was tested, consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs); this system is known commercially as indoor GPS (iGPS). The method was found to be practical and able to establish that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
Under conditions of reduced visual stimulation, the systems of accommodation and vergence tend towards physiological resting states that are intermediate within their functional range. The terms tonic accommodation (TA) and tonic vergence (TV) are used in this study to describe these stimulus-free, intermediate adjustments and to represent the systems as being in a state of innervational tonicity. The literature relating to TA and TV and the various experiments of this thesis are reviewed. Methodology has been developed enabling the determination of TA and TV under conditions of total darkness: laser optometry for TA and vernier alignment for TV. The thesis describes a series of experiments designed to investigate various aspects of TA and TV, their role in ametropia and binocular vision, and their adaptation to sustained visual tasks. Measurements of TA were also utilised to investigate the effect of various autonomic effector drugs on the ciliary muscle. The effects of ethanol on binocular function are shown to be directly proportional to the initial level of TV, which is itself unaffected. These results support the concept of TV as the reference point for normal vergence responses. The results of the pharmacological investigations indicate the presence of a small but significant, beta-receptor-mediated inhibitory sympathetic input to the ciliary muscle, and that the wide distribution in TA is a consequence of inter-observer variations in parasympathetic, rather than sympathetic, tone. Following interaction with visual tasks of 15 min duration, the levels of TA and TV are found to be biased in the direction of, and in proportion to, the task position, except during near-task viewing where the task-to-TA stimulus distance exceeds 1.5 D (for TA) and 3.5 deg (for TV); under these conditions the expected level of bias is attenuated. Adaptive models are discussed, proposing TA and TV as the reference points of the accommodative and vergence systems.
Abstract:
Measurement and variation control of geometrical Key Characteristics (KCs), such as the flatness and gap of joint faces and the coaxiality of cabin sections, is a crucial issue in large-component assembly in the aerospace industry. Aiming to control geometrical KCs and to attain the best fit of posture, an optimization algorithm based on KCs for large-component assembly is proposed. This approach treats posture best fit, a key activity in Measurement Aided Assembly (MAA), as a two-phase optimization problem. In the first phase, the global measurement coordinate systems of the digital model and the shop floor are unified with minimum error based on singular value decomposition, and the current posture of the components being assembled is optimally solved in terms of minimum variation over all reference points. In the second phase, the best posture of the movable component is optimally determined by minimizing the variation of multiple KCs, subject to the constraint that every KC conforms to its product specification. Optimization models and process procedures for these two phases, based on Particle Swarm Optimization (PSO), are proposed. In each model, every posture to be calculated is encoded as a six-dimensional particle (three translation and three rotation parameters). Finally, an example in which two cabin sections of a satellite mainframe structure are assembled is used to verify the effectiveness of the proposed approach, models and algorithms. The experimental results show that the approach is promising and will provide a foundation for further study and application. © 2013 The Authors.
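The core idea in the abstract above, encoding each candidate posture as a six-dimensional particle and minimising reference-point deviation with PSO, can be sketched minimally as follows. This is an illustrative sketch, not the authors' implementation: the swarm parameters, the Euler-angle rotation parameterisation and the point sets are all assumptions.

```python
import numpy as np

def rot(rx, ry, rz):
    """Rotation matrix from three rotation parameters (XYZ Euler angles)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def fitness(pose, measured, nominal):
    """Sum of squared reference-point deviations for one candidate posture."""
    R, t = rot(*pose[3:]), pose[:3]
    return np.sum((measured @ R.T + t - nominal) ** 2)

def pso_best_fit(measured, nominal, n_particles=40, iters=200, seed=0):
    """Global-best PSO over 6-D particles (3 translations, 3 rotations)."""
    rng = np.random.default_rng(seed)
    dim = 6
    x = rng.uniform(-0.1, 0.1, (n_particles, dim))  # candidate postures
    v = np.zeros_like(x)
    pbest = x.copy()
    pcost = np.array([fitness(p, measured, nominal) for p in x])
    gbest = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        cost = np.array([fitness(p, measured, nominal) for p in x])
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
        gbest = pbest[pcost.argmin()].copy()
    return gbest, pcost.min()
```

In the paper's second phase the objective would additionally weight several KCs and enforce specification constraints; the sketch keeps only the unconstrained reference-point term.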
Abstract:
Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, a general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface of the small components. To reduce the uncertainty of the plane measurement, an evaluation index for the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
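The two computations named in the abstract above, linearised propagation of independent normal error sources and an inertia-moment index of the sampling point spread, can be sketched generically. This is not the paper's model: the transmission matrix `A` and the specific index definition (smallest eigenvalue of the second-moment matrix) are illustrative assumptions.

```python
import numpy as np

def propagate(A, cov_meas, cov_fix):
    """Output covariance for a linearised transmission y = A @ x,
    with independent normal measurement and fixture errors (covariances add)."""
    return A @ (cov_meas + cov_fix) @ A.T

def sampling_index(points):
    """Spread index of a sampling plan: smallest eigenvalue of the
    second-moment (inertia) matrix about the centroid; 0 for degenerate
    (e.g. collinear) plans, larger for better-spread ones."""
    q = points - points.mean(axis=0)
    return np.linalg.eigvalsh(q.T @ q).min()
```

For example, a collinear set of probe points yields an index of zero, flagging a plan that cannot constrain the plane's orientation.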
Abstract:
This report presents the results of testing of the Metris iGPS system performed by the National Physical Laboratory (NPL) and the University of Bath (UoB), with the assistance of Metris and Airbus, at Airbus, Broughton, in March 2008. The aim of the test was to determine the performance capability of the iGPS coordinate metrology system by comparison with a reference measurement system based on multilateration implemented using laser trackers. A network of reference points was created using SMR nests fixed to the ground and above ground level on various stands. The reference points were spread throughout the measurement volume of approximately 10 m × 10 m × 2 m. The coordinates of each reference point were determined by a laser tracker survey using multilateration. The expanded uncertainty (k=2) in the relative position of these reference coordinates was estimated to be of the order of 10 µm in x, y and z. A comparison between the iGPS system and the reference system showed that, for the test setup, the iGPS system was able to determine lengths up to 12 m with an uncertainty of 170 µm (k=2) and coordinates with an uncertainty of 120 µm in x and y and 190 µm in z (k=2).
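Multilateration, as used for the reference survey above, recovers a 3-D point from range measurements to stations at known positions. A minimal least-squares sketch (a generic Gauss-Newton solver, not NPL's implementation; the station layout used in testing it is illustrative):

```python
import numpy as np

def multilaterate(stations, dists, iters=20):
    """Least-squares point position from ranges to known stations,
    solved by Gauss-Newton starting from the station centroid."""
    p = stations.mean(axis=0)             # initial guess: centroid
    for _ in range(iters):
        diff = p - stations               # (n, 3) vectors station -> point
        r = np.linalg.norm(diff, axis=1)  # predicted ranges
        J = diff / r[:, None]             # Jacobian of range w.r.t. position
        res = r - dists                   # range residuals
        p = p - np.linalg.lstsq(J, res, rcond=None)[0]
    return p
```

At least four non-coplanar stations are needed for a unique 3-D solution; laser tracker multilateration uses the same geometry with interferometric ranges.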
Abstract:
Pilot-scale studies of high rate filtration were initiated to assess its potential either as a primary 'roughing' filter to alleviate the seasonal overloading of low rate filters at Hereford sewage treatment works, caused by wastes from cider production, or as a two-stage high rate process providing complete sewage treatment. Four mineral and four plastic primary filter media and two plastic secondary filter media were studied. The hydraulic loading applied to the primary plastic media (11.2 m³/m³·d) was twice that applied to the mineral media. The plastic media removed an average of around 66 percent, and the mineral media around 73 percent, of the BOD applied when the 90-percentile BOD concentration was 563 mg/l. At a hydraulic loading of 4 m³/m³·d the secondary filters removed most of the BOD from partially settled primary filter effluents, with one secondary effluent satisfying a 25 mg/l BOD and 30 mg/l SS standard. No significant degree of nitrification was achieved. Fungi dominated the biological film of the primary filters, with invertebrate grazers having little influence on film levels. Ponding did not arise, and modular media supported lower film levels than random-fill types. Secondary filter film levels were low, being dominated by bacteria. The biological loading applied to the filters was related to sludge dewaterability, with the most readily conditionable sludges produced by filters supporting heavy film. Sludges produced by random-fill media could be dewatered as readily as those produced by low rate filters treating the same sewage. Laboratory-scale studies showed a relationship between log effluent BOD and the nitrification achieved by biological filters. This relationship, and the relationship between BOD load applied and removed observed in all filter media, could be used to optimise the operating conditions required in biological filters to achieve given effluent BOD and ammoniacal nitrogen standards.
Abstract:
Magnification factors specify the extent to which the area of a small patch of the latent (or `feature') space of a topographic mapping is magnified on projection to the data space, and are of considerable interest in both neuro-biological and data analysis contexts. Previous attempts to consider magnification factors for the self-organizing map (SOM) algorithm have been hindered because the mapping is only defined at discrete points (given by the reference vectors). In this paper we consider the batch version of SOM, for which a continuous mapping can be defined, as well as the Generative Topographic Mapping (GTM) algorithm of Bishop et al. (1997) which has been introduced as a probabilistic formulation of the SOM. We show how the techniques of differential geometry can be used to determine magnification factors as continuous functions of the latent space coordinates. The results are illustrated here using a problem involving the identification of crab species from morphological data.
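The continuous magnification factor described above is the local area scaling of the latent-to-data mapping, computable from its Jacobian as sqrt(det(JᵀJ)). A numerical sketch (finite-difference Jacobian; the mapping `f` is a placeholder for a batch-SOM or GTM manifold, not the paper's implementation):

```python
import numpy as np

def magnification(f, x, eps=1e-6):
    """Areal magnification sqrt(det(J^T J)) of a latent-to-data mapping f
    at latent point x, with J estimated by central differences."""
    x = np.asarray(x, float)
    d = len(x)
    y0 = np.asarray(f(x), float)
    J = np.empty((len(y0), d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        J[:, i] = (np.asarray(f(x + e)) - np.asarray(f(x - e))) / (2 * eps)
    return np.sqrt(np.linalg.det(J.T @ J))
```

For a GTM, where the mapping y(x) = W φ(x) is available in closed form, the Jacobian (and hence the magnification) can be evaluated analytically rather than by differencing.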
Abstract:
Background: The importance of appropriate normalization controls in quantitative real-time polymerase chain reaction (qPCR) experiments has become more apparent as the number of biological studies using this methodology has increased. In developing a system to study gene expression from transiently transfected plasmids, it became clear that normalization using chromosomally encoded genes is not ideal, as it does not take into account the transfection efficiency and the significantly lower expression levels of the plasmids. We have developed and validated a normalization method for qPCR using a co-transfected plasmid. Results: The best chromosomal gene for normalization in the presence of the transcriptional activators used in this study (cadmium, dexamethasone, forskolin and phorbol 12-myristate 13-acetate) was first identified. qPCR data were analyzed using geNorm, NormFinder and BestKeeper. Each software application was found to rank the normalization controls differently, with no clear correlation. Including a co-transfected plasmid encoding the Renilla luciferase gene (Rluc) in this analysis showed that its calculated stability was not as good as that of the optimised chromosomal genes, most likely as a result of the lower expression levels and transfection variability. Finally, we validated these analyses by testing two chromosomal genes (B2M and ActB) and a co-transfected gene (Rluc) under biological conditions. When analyzing co-transfected plasmids, Rluc normalization gave the smallest errors compared to the chromosomal reference genes. Conclusions: Our data demonstrate that transfected Rluc is the most appropriate normalization reference gene for transient transfection qPCR analysis; it significantly reduces the standard deviation within biological experiments, as it takes into account transfection efficiencies and has easily controllable expression levels. This improves reproducibility and data validity and, most importantly, enables accurate interpretation of qPCR data. © 2010 Jiwaji et al; licensee BioMed Central Ltd.
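Normalization to a co-transfected reference gene such as Rluc reduces, in the simplest case, to a delta-Cq ratio. A minimal sketch of that calculation (assuming equal amplification efficiencies for target and reference; this is the standard delta-Cq formula, not the specific analysis pipeline used in the study):

```python
def rel_expression(cq_target, cq_ref, efficiency=2.0):
    """Target expression relative to a co-transfected reference gene
    (delta-Cq method): efficiency ** -(Cq_target - Cq_ref).
    efficiency=2.0 assumes perfect doubling per cycle."""
    return efficiency ** -(cq_target - cq_ref)
```

Because both target and reference are carried on (co-)transfected plasmids, well-to-well differences in transfection efficiency shift both Cq values together and largely cancel in the ratio, which is the effect the study exploits.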
Abstract:
The development of classical and lipophilic inhibitors of dihydrofolate reductase (DHFR) as antitumour agents is reviewed, and the advantages and problems associated with each class are discussed. The antitumour activity, pharmacokinetics and metabolism of m-azido-pyrimethamine (MZP), a novel lipophilic inhibitor, are considered and compared with metoprine, the prototype lipophilic antifolate. Evidence for a folate-independent target for lipophilic DHFR inhibitors is presented. Synthetic studies centred on three principal objectives. Firstly, a series of structural analogues of MZP was prepared, encompassing alkoxy, chloro and alkylamino substituents, and evaluated, as the ethanesulphonate salts, for activity against mammalian DHFR. Inhibitory constant (KI) determinations were conducted by a Zone B analysis, the corresponding 4'-azido isomer of MZP proving more potent than the parent compound. Secondly, to facilitate metabolism and stability studies on MZP, a range of possible reference compounds was synthesised and characterised. Finally, a series of diaminopyrimidine derivatives was synthesised embracing structural features incompatible with DHFR inhibitory activity, in order that such compounds might serve as biochemical probes for the unidentified folate-independent target for lipophilic diaminopyrimidines discussed previously. Inactivity against DHFR was achieved via introduction of an ionic or basic group into a normally hydrophobic region of the molecule, and compounds were screened against mammalian DHFR and thymidylate synthase to confirm the abolition of activity. Several derivatives surprisingly proved potent inhibitors of DHFR, exhibiting KI values comparable to that of methotrexate. Analogues were screened for antitumour activity in vitro and in vivo against murine leukaemia cell lines in order to identify potential lead compounds.
Several derivatives virtually inactive against DHFR exhibited a disparate cytotoxicity, and further biochemical studies are warranted. The hitherto unreported debenzylation of 2,4-diamino-5-(N-alkyl-benzylaminophenyl)pyrimidines was discovered during the course of the synthetic studies; treatment of these compounds with nitrous acid afforded the corresponding benzotriazoles.
Abstract:
This thesis presents an investigation into the application of methods of uncertain reasoning to the biological classification of river water quality. Existing biological methods for reporting river water quality are critically evaluated, and the adoption of a discrete biological classification scheme advocated. Reasoning methods for managing uncertainty are explained, in which the Bayesian and Dempster-Shafer calculi are cited as primary numerical schemes. Elicitation of qualitative knowledge on benthic invertebrates is described. The specificity of benthic response to changes in water quality leads to the adoption of a sensor model of data interpretation, in which a reference set of taxa provide probabilistic support for the biological classes. The significance of sensor states, including that of absence, is shown. Novel techniques of directly eliciting the required uncertainty measures are presented. Bayesian and Dempster-Shafer calculi were used to combine the evidence provided by the sensors. The performance of these automatic classifiers was compared with the expert's own discrete classification of sampled sites. Variations of sensor data weighting, combination order and belief representation were examined for their effect on classification performance. The behaviour of the calculi under evidential conflict and alternative combination rules was investigated. Small variations in evidential weight and the inclusion of evidence from sensors absent from a sample improved classification performance of Bayesian belief and support for singleton hypotheses. For simple support, inclusion of absent evidence decreased classification rate. The performance of Dempster-Shafer classification using consonant belief functions was comparable to Bayesian and singleton belief. 
Recommendations are made for further work in biological classification using uncertain reasoning methods, including the combination of multiple-expert opinion, the use of Bayesian networks, and the integration of classification software within a decision support system for water quality assessment.
Abstract:
The thesis investigates the value of quantitative analyses for historical studies of science through an examination of research trends in insect pest control, or economic entomology. Reviews are made of quantitative studies of science and historical studies of pest control. The methodological strengths and weaknesses of bibliometric techniques are examined in a dedicated chapter; the techniques examined include productivity studies such as paper counts, and relational techniques such as co-citation and co-word analysis. Insect pest control is described; this includes a discussion of the socio-economic basis of the concept of 'pest', and a series of classifications of pest control techniques is provided and analysed with respect to their utility for scientometric studies. The chemical and biological approaches to control are discussed as scientific and technological paradigms. Three case studies of research trends in economic entomology are provided. First, a scientometric analysis of samples of chemical control and biological control papers, providing quantitative data on the institutional, financial, national and journal structures associated with pest control research fields. Second, a content analysis of a core journal, the Journal of Economic Entomology, over the period 1910-1985; this identifies the main research innovations and trends, in particular the changing balance between chemical and biological control. Third, an analysis of historical research trends in insecticide research; this shows the rise, maturity and decline of research on many groups of compounds. These are supplemented by a collection of seven papers on scientometric studies of pest control and quantitative techniques for analysing science.
Abstract:
Whether to assess the functionality of equipment or to determine the accuracy of assays, reference standards are essential for the purposes of standardisation and validation. The ELISPOT assay, developed over thirty years ago, has emerged as a leading immunological assay in the development of novel vaccines and the assessment of their efficacy. However, with its widespread use there is a growing demand for a greater level of standardisation across different laboratories. One of the major difficulties in achieving this goal has been the lack of definitive reference standards. This is partly due to the ex vivo nature of the assay, which relies on cells being placed directly into the wells. Thus, the aim of this thesis was to produce an artificial reference standard, using liposomes, for use within the assay. Liposomes are spherical bilayer vesicles with an enclosed aqueous compartment and are therefore models for biological membranes. Initial work examined pre-design considerations in order to produce an optimal formulation that would closely mimic the behaviour of the cells ordinarily placed in the assay. Recognition of the structural differences between liposomes and cells led to the formulation of liposomes with increased density, achieved by using a synthesised cholesterol analogue. By incorporating this cholesterol analogue in liposomes, increased sedimentation rates were observed within the first few hours. The optimal liposome formulation from these studies was composed of 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC), cholesterol (Chol) and brominated cholesterol (Brchol) at a 16:4:12 µmol ratio, based on a significantly higher (p < 0.01) sedimentation (as determined by a percentage transmission of 59 ± 5.9% compared to the control formulation at 29 ± 12% after four hours). By considering a range of liposome formulations, 'proof of principle' for using liposomes as ELISPOT reference standards was shown: recombinant IFN-γ cytokine was successfully entrapped within vesicles of different lipid compositions, which were able to promote spot formation within the ELISPOT assay. Using optimised liposome formulations composed of phosphatidylcholine, with or without cholesterol (16 µmol total lipid), further development was undertaken to produce an optimised, scalable protocol for the production of liposomes as reference standards. A linear increase in spot number through the manipulation of cytokine and/or lipid concentrations was not possible, potentially due to saturation occurring at the base of the wells. Investigations into storage of the formulations demonstrated the feasibility of freezing and lyophilisation with disaccharide cryoprotectants, but also highlighted the need for further protocol optimisation to achieve a robust reference standard upon storage. Finally, the transfer of small-scale production to a medium lab-scale batch (40 mL) demonstrated that this was feasible within the laboratory using the optimised protocol.
Abstract:
This paper determines the capability of two photogrammetric systems in terms of their measurement uncertainty in an industrial context. The first system, V-STARS inca3 from Geodetic Systems Inc., is a commercially available measurement solution. The second system comprises an off-the-shelf Nikon D700 digital camera fitted with a 28 mm Nikkor lens and the research-based Vision Measurement Software (VMS). The uncertainty estimates of these two systems were determined with reference to a calibrated constellation of points measured by a Leica AT401 laser tracker. The calibrated points have an average associated standard uncertainty of 12.4 μm, spanning a maximum distance of approximately 14.5 m. V-STARS inca3 had an estimated standard uncertainty of 43.1 μm, thus outperforming its manufacturer's specification; the D700/VMS combination achieved a standard uncertainty of 187 μm.