11 results for "analysis with NMR"

in the Cambridge University Engineering Department Publications Database


Relevance: 90.00%

Abstract:

Quantitative microbeam Rutherford backscattering (RBS) analysis with a 1.5 MeV 4He+ beam has determined limits on the purity of copper deposited on glass with a novel inkjet process. A tetravinylsilane tetrakis[Cu(I) 1,1,1,5,5,5-hexafluoroacetylacetonate] (TVST[Cu]hfac) complex was heated to 70 °C and jetted onto the glass substrate through a piezoelectric ceramic print head in droplets of about 0.5 mm diameter. The substrate temperature was 150 °C. Solid, well-formed deposits resulted, with a copper content greater than about 90% by weight. The RBS spectra were analysed objectively using the DataFurnace code, under the assumption that the deposit was CuOx, with the validity of different assumed values of x being tested. The assumptions and errors of the analysis are critically evaluated. © 2002 Elsevier Science B.V. All rights reserved.
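
For orientation, the stoichiometric relation behind the purity figure can be written down directly: if the deposit is assumed to be CuOx, the copper mass fraction follows from the atomic masses alone. The short Python sketch below is purely illustrative (it is not part of the DataFurnace analysis), and the x values chosen are arbitrary.

```python
# Weight fraction of copper in an assumed CuO_x deposit, as a function of x.
# Illustrative only: this is the stoichiometric relation implied by the abstract,
# not the DataFurnace fitting procedure itself.

M_CU = 63.546   # atomic mass of Cu, g/mol
M_O = 15.999    # atomic mass of O, g/mol

def cu_weight_fraction(x: float) -> float:
    """Mass fraction of Cu in CuO_x (x oxygen atoms per copper atom)."""
    return M_CU / (M_CU + x * M_O)

if __name__ == "__main__":
    for x in (0.0, 0.1, 0.25, 0.44, 0.5, 1.0):
        print(f"x = {x:4.2f}  ->  Cu weight fraction = {cu_weight_fraction(x):.3f}")
    # x of roughly 0.44 corresponds to the ~90 wt% copper limit quoted above.
```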

Relevance: 90.00%

Abstract:

Spatial normalisation is a key element of statistical parametric mapping and related techniques for analysing cohort statistics on voxel arrays and surfaces. The normalisation process involves aligning each individual specimen to a template using some sort of registration algorithm. Any misregistration will result in data being mapped onto the template at the wrong location. At best, this will introduce spatial imprecision into the subsequent statistical analysis. At worst, when the misregistration varies systematically with a covariate of interest, it may lead to false statistical inference. Since misregistration generally depends on the specimen's shape, we investigate here the effect of allowing for shape as a confound in the statistical analysis, with shape represented by the dominant modes of variation observed in the cohort. In a series of experiments on synthetic surface data, we demonstrate how allowing for shape can reveal true effects that were previously masked by systematic misregistration, and also guard against misinterpreting systematic misregistration as a true effect. We introduce some heuristics for disentangling misregistration effects from true effects, and demonstrate the approach's practical utility in a case study of the cortical bone distribution in 268 human femurs.
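
A minimal sketch of the central idea is given below, with made-up array sizes and a hypothetical covariate of interest ("age"): the leading modes of shape variation from a cohort PCA are added as nuisance columns to a vertex-wise general linear model, so the effect of interest is estimated after adjusting for shape. This illustrates the general approach, not the authors' implementation.

```python
# Minimal sketch (assumed data shapes, not the authors' code): a vertex-wise GLM
# in which the leading shape modes from a cohort PCA are included as nuisance
# covariates alongside the covariate of interest.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_vertices, n_modes = 268, 2000, 5          # illustrative sizes only

Y = rng.normal(size=(n_subjects, n_vertices))           # mapped data (e.g. cortical thickness)
age = rng.uniform(40, 90, size=n_subjects)              # hypothetical covariate of interest
coords = rng.normal(size=(n_subjects, 3 * n_vertices))  # registered surface coordinates

# Dominant modes of shape variation via PCA on the aligned coordinates.
centred = coords - coords.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
shape_scores = centred @ vt[:n_modes].T                 # per-subject mode scores

# Design matrix: intercept, covariate of interest, shape-mode confounds.
X = np.column_stack([np.ones(n_subjects), age, shape_scores])

# Fit the GLM at every vertex at once; beta[1] is the covariate effect after
# adjusting for shape.
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
covariate_effect = beta[1]
print(covariate_effect.shape)                           # (n_vertices,)
```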

Relevance: 80.00%

Abstract:

Housing stock models can be useful tools in helping to assess the environmental and socio-economic impacts of retrofits to residential buildings; however, existing housing stock models are not able to quantify the uncertainties that arise in the modelling process from various sources, which limits the role they can play in supporting decision makers. This paper examines the different sources of uncertainty involved in housing stock models and proposes a framework for handling these uncertainties. The framework integrates probabilistic sensitivity analysis with a Bayesian calibration process in order to quantify uncertain parameters more accurately. The proposed framework is tested on a case study building, and suggestions are made on how to expand it for retrofit analysis at the urban scale. © 2011 Elsevier Ltd.
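
The flavour of such a framework can be illustrated on a deliberately simple toy model (all priors, areas and measured values below are invented): a Monte Carlo sensitivity screen over the uncertain inputs of a steady-state heat-loss model, followed by a grid-based Bayesian update of the most influential parameter against a synthetic metered energy figure. This is a sketch of the general pattern, not the paper's calibration procedure.

```python
# Toy sketch: probabilistic sensitivity screen + grid-based Bayesian calibration
# for a single dwelling. The model, priors and "measurement" are all invented.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Uncertain inputs (assumed priors, for illustration only).
u_value = rng.normal(1.6, 0.3, n)        # wall U-value, W/m2K
infiltration = rng.normal(0.7, 0.2, n)   # air changes per hour
set_point = rng.normal(19.0, 1.0, n)     # heating set point, degC

def annual_heat_demand(u, ach, t_set):
    """Toy steady-state heat-loss model (kWh/year), purely illustrative."""
    fabric = 120.0 * u                   # 120 m2 of exposed fabric
    ventilation = 0.33 * ach * 250.0     # 250 m3 heated volume
    degree_hours = (t_set - 8.0) * 5000.0
    return (fabric + ventilation) * degree_hours / 1000.0

demand = annual_heat_demand(u_value, infiltration, set_point)

# Crude sensitivity screen: rank inputs by |correlation| with the output.
for name, x in [("U-value", u_value), ("infiltration", infiltration),
                ("set point", set_point)]:
    print(name, round(abs(np.corrcoef(x, demand)[0, 1]), 2))

# Bayesian calibration of the U-value on a grid, given a synthetic metered value.
measured, sigma = 14_000.0, 1_500.0      # kWh/year and measurement std (assumed)
grid = np.linspace(0.5, 3.0, 400)
prior = np.exp(-0.5 * ((grid - 1.6) / 0.3) ** 2)
likelihood = np.exp(-0.5 * ((annual_heat_demand(grid, 0.7, 19.0) - measured) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()
print("posterior mean U-value:", round(float((grid * posterior).sum()), 2))
```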

Relevance: 80.00%

Abstract:

The uncertainty associated with a rainfall-runoff and non-point source loading (NPS) model can be attributed to both the parameterization and the model structure. An interesting implication of the areal nature of NPS models is the direct relationship between model structure (i.e. sub-watershed size) and the sample size available for the parameterization of spatial data. The approach of this research is first to find structural limitations in scale for the use of the conceptual NPS model, and then to examine the scales at which suitable stochastic depictions of key parameter sets can be generated. The overlapping regions are optimal (and possibly the only suitable regions) for conducting meaningful stochastic analysis with a given NPS model. Previous work has sought optimal scales for deterministic analysis (where, in fact, calibration can be adjusted to compensate for sub-optimal scale selection); however, the analysis of stochastic suitability and of the uncertainty associated with both the conceptual model and the parameter set, as presented here, is novel, as is the strategy of delineating a watershed based on the uncertainty distribution. The results of this paper demonstrate a narrow range of acceptable model structures for stochastic analysis in the chosen NPS model. In the case examined, the uncertainties associated with parameterization and parameter sensitivity are shown to be outweighed in significance by those resulting from structural and conceptual decisions. © 2011 Copyright IAHS Press.
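
The structural trade-off described above, between sub-watershed size and the sample size available for parameterising spatial data, can be seen in a back-of-envelope calculation; the watershed area and parameter spread below are invented for illustration.

```python
# Back-of-envelope sketch (not the paper's model): for a fixed watershed area,
# the chosen sub-watershed size fixes the number of sub-watersheds, and hence
# the sample size available for fitting a spatial parameter distribution. The
# standard error of the fitted mean grows as that sample size shrinks.
import math

WATERSHED_AREA_KM2 = 500.0
PARAM_STD = 12.0   # assumed spatial std of a curve-number-like parameter

print(f"{'sub-area (km2)':>15} {'n sub-watersheds':>17} {'std error of mean':>18}")
for sub_area in (1, 5, 10, 25, 50, 100):
    n = int(WATERSHED_AREA_KM2 // sub_area)
    se = PARAM_STD / math.sqrt(n)
    print(f"{sub_area:>15} {n:>17} {se:>18.2f}")
```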

Relevance: 80.00%

Abstract:

Estimating the fundamental matrix (F), to determine the epipolar geometry between a pair of images or video frames, is a basic step for a wide variety of vision-based functions used in construction operations, such as camera-pair calibration, automatic progress monitoring, and 3D reconstruction. Currently, robust methods (e.g., SIFT + normalized eight-point algorithm + RANSAC) are widely used in the construction community for this purpose. Although they can provide acceptable accuracy, the significant amount of computational time required impedes their adoption in real-time applications, especially the analysis of video data with many frames per second. Aiming to overcome this limitation, this paper presents and evaluates the accuracy of a solution for finding F that combines two fast and consistent methods: SURF, for the selection of a robust set of point correspondences, and the normalized eight-point algorithm. This solution is tested extensively on construction site image pairs including changes in viewpoint, scale, illumination, rotation, and moving objects. The results demonstrate that this method can be used for real-time applications (5 image pairs per second at a resolution of 640 × 480) involving scenes of the built environment.
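
A sketch of that pipeline using OpenCV is shown below. The image paths are placeholders, SURF requires an opencv-contrib build with the non-free module enabled, and OpenCV's FM_8POINT option applies Hartley-style coordinate normalisation internally. This illustrates the described combination, not the authors' code.

```python
# Sketch: SURF correspondences + eight-point estimate of the fundamental matrix.
# Placeholder image paths; SURF lives in the opencv-contrib "nonfree" module.
import cv2
import numpy as np

img1 = cv2.imread("site_view_1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("site_view_2.jpg", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp1, des1 = surf.detectAndCompute(img1, None)
kp2, des2 = surf.detectAndCompute(img2, None)

# Match descriptors and keep only distinctive correspondences (Lowe ratio test).
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.7 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Eight-point estimate of F (coordinates are normalised internally by OpenCV);
# no RANSAC stage, which is where the speed-up discussed above comes from.
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)
print("Estimated fundamental matrix:\n", F)
```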

Relevance: 80.00%

Abstract:

A boundary integral technique has been developed for the numerical simulation of the air flow for the Aaberg exhaust system. For the steady, ideal, irrotational air flow induced by a jet, the complex air velocity is an analytic function. The solution of the problem is formulated as a boundary integral equation by seeking the solution of a mixed boundary-value problem for an analytic function, based on the Riemann-Hilbert technique. The boundary integral equation is solved numerically by converting it into a system of linear algebraic equations, which is solved by Gaussian elimination. The air velocity vector at any point in the solution domain is then computed from the air velocity on the boundary of the solution domain.
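
The computational pattern described here, discretising an integral equation into a dense linear algebraic system and solving it directly, can be illustrated on a simple test problem with a known solution. The example below is a generic Fredholm equation, not the Riemann-Hilbert formulation of the paper.

```python
# Generic illustration: collocation of an integral equation into a linear system,
# followed by a dense direct solve (LU factorisation, i.e. Gaussian elimination).
# Test problem: u(x) - int_0^1 x*t*u(t) dt = (2/3)*x, whose exact solution is u(x) = x.
import numpy as np

n = 101
x = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1))        # trapezoidal quadrature weights
w[0] = w[-1] = 0.5 / (n - 1)

K = np.outer(x, x)                   # kernel K(x, t) = x * t on the grid
A = np.eye(n) - K * w                # collocation matrix (I - K W)
f = (2.0 / 3.0) * x

u = np.linalg.solve(A, f)            # dense direct solve
print("max error vs exact u(x) = x:", np.max(np.abs(u - x)))
```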

Relevance: 80.00%

Abstract:

Accurate and efficient computation of the distance function d for a given domain is important for many areas of numerical modeling. Distance function algorithms based on partial differential equations (e.g. of Hamilton-Jacobi type) have desirable computational efficiency and accuracy. In this study, as an alternative, a Poisson equation based level set (distance function) is considered and solved using the meshless boundary element method (BEM). Its application to shape topology analysis, including the medial axis for domain decomposition, geometric de-featuring and other aspects of numerical modeling, is assessed. © 2011 Elsevier Ltd. All rights reserved.
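
One common Poisson-based construction can be sketched with finite differences: solve a Poisson problem with unit source and zero boundary values, then recover an approximate distance from the solution and its gradient. The paper solves the problem with a meshless BEM; the grid, iteration count and square geometry below are illustrative only.

```python
# Finite-difference sketch of a Poisson-based approximate distance function on
# the unit square: solve -lap(phi) = 1 with phi = 0 on the boundary, then
# d = -|grad phi| + sqrt(|grad phi|^2 + 2*phi).
import numpy as np

n = 101
h = 1.0 / (n - 1)
phi = np.zeros((n, n))

# Jacobi iterations for the Poisson problem (crude but adequate for a demo).
for _ in range(20000):
    phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                              phi[1:-1, 2:] + phi[1:-1, :-2] + h * h)

gy, gx = np.gradient(phi, h)
grad_mag = np.sqrt(gx**2 + gy**2)
d = -grad_mag + np.sqrt(grad_mag**2 + 2.0 * phi)

# The exact distance from the centre of the unit square to the boundary is 0.5;
# the Poisson-based value is only an approximation away from the walls.
print("approx. distance at centre:", round(d[n // 2, n // 2], 3))
```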

Relevance: 80.00%

Abstract:

The design of wind turbine blades is a true multi-objective engineering task. The aerodynamic effectiveness of the turbine needs to be balanced against the system loads introduced by the rotor. Moreover, the problem does not depend on a single geometric property but, among other parameters, on the combination of aerofoil family and the various blade functions. The aim of this paper is therefore to present a tool which can help designers gain deeper insight into the complexity of the design space and find a blade design that is likely to have a low cost of energy. For this research we use the Computational Blade Optimisation and Load Deflation Tool (CoBOLDT) to investigate the three extreme-point designs obtained from a multi-objective optimisation of turbine thrust, annual energy production and mass for a horizontal-axis wind turbine blade. The optimisation algorithm is based on Multi-Objective Tabu Search, which constitutes the core of CoBOLDT. The methodology parametrises the spanning aerofoils with two-dimensional Free-Form Deformation and the blade functions with two tangentially connected cubic splines. After geometry generation, we use a panel code to create aerofoil polars and a stationary Blade Element Momentum code to evaluate turbine performance. Finally, the obtained loads are fed into a structural layout module that estimates the mass and stiffness of the current blade by means of a fully stressed design. For the presented test case we chose post-optimisation analysis with parallel coordinates to reveal geometrical features of the extreme-point designs and to select a compromise design from the Pareto set. The research revealed that a blade with a feasible laminate layout can be obtained which increases energy capture and lowers steady-state system loads. Reduced aerofoil camber and an increased L/D ratio were identified as the main drivers. Such a statement could not previously be made with other tools available to the research community. © 2013 Elsevier Ltd.
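
The kind of post-optimisation parallel-coordinates view mentioned above can be sketched with pandas and matplotlib. The Pareto set below is synthetic (random numbers standing in for CoBOLDT output) and the attribute names are hypothetical.

```python
# Synthetic parallel-coordinates view of a Pareto set (illustration only; the
# real designs come from CoBOLDT and are not reproduced here).
import numpy as np
import pandas as pd
from pandas.plotting import parallel_coordinates
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
n = 30                                   # illustrative number of Pareto designs

# Hypothetical attributes of each Pareto-optimal blade design.
df = pd.DataFrame({
    "AEP (GWh)": rng.uniform(14.0, 16.0, n),
    "thrust (kN)": rng.uniform(600.0, 800.0, n),
    "mass (t)": rng.uniform(15.0, 22.0, n),
    "mean camber (%)": rng.uniform(1.0, 4.0, n),
    "design": "Pareto member",
})
# Tag the three extreme-point designs for emphasis.
df.loc[df["AEP (GWh)"].idxmax(), "design"] = "max AEP"
df.loc[df["thrust (kN)"].idxmin(), "design"] = "min thrust"
df.loc[df["mass (t)"].idxmin(), "design"] = "min mass"

# Normalise each axis to [0, 1] so the objectives share one scale.
cols = [c for c in df.columns if c != "design"]
norm = (df[cols] - df[cols].min()) / (df[cols].max() - df[cols].min())
norm["design"] = df["design"]

parallel_coordinates(norm, "design", colormap="viridis", alpha=0.7)
plt.title("Pareto set in parallel coordinates (synthetic data)")
plt.tight_layout()
plt.show()
```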