24 results for Curve number method
in Aston University Research Archive
Abstract:
We have proposed a similarity matching method (SMM) to obtain the change of the Brillouin frequency shift (BFS), in which the change of BFS is determined from the frequency difference between the detected spectrum and a selected reference spectrum by comparing their similarity. We have also compared three similarity measures in simulation, which showed that the correlation coefficient determines the change of BFS most accurately. Compared with other methods of determining the change of BFS, the SMM is better suited to complex Brillouin spectrum profiles. Greater precision and much faster processing have been verified in our simulations and experiments. The experimental results show that the measurement uncertainty of the BFS is improved to 0.72 MHz by using the SMM, almost one third of that obtained with the curve fitting method, and the SMM derives the BFS change 120 times faster than the curve fitting method.
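As an illustration of the matching step, the following sketch scans the reference spectrum across the detected spectrum and takes the frequency offset that maximises the correlation coefficient as the BFS change. The uniform frequency grid, Lorentzian line shape and linewidth are illustrative assumptions; the abstract does not specify the authors' implementation.

```python
# Minimal sketch of correlation-based similarity matching (SMM).
# Synthetic data; grid, linewidth and profile are illustrative only.
import numpy as np

def bfs_shift_by_correlation(reference, detected, freqs):
    """Return the frequency offset maximising the correlation
    coefficient between the shifted reference and the detected spectrum."""
    df = freqs[1] - freqs[0]          # assumed uniform grid spacing
    n = len(freqs)
    best_shift, best_corr = 0.0, -np.inf
    for k in range(-n // 2, n // 2):
        r = np.corrcoef(np.roll(reference, k), detected)[0, 1]
        if r > best_corr:
            best_corr, best_shift = r, k * df
    return best_shift

# Two synthetic Lorentzian spectra whose peaks differ by 30 MHz
freqs = np.linspace(10_600.0, 11_000.0, 401)                 # MHz
lorentz = lambda f0: 1.0 / (1.0 + ((freqs - f0) / 15.0) ** 2)
print(bfs_shift_by_correlation(lorentz(10_800.0), lorentz(10_830.0), freqs))
```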
Abstract:
The principled statistical application of Gaussian random field models used in geostatistics has historically been limited to data sets of a small size. This limitation is imposed by the requirement to store and invert the covariance matrix of all the samples to obtain a predictive distribution at unsampled locations, or to use likelihood-based covariance estimation. Various ad hoc approaches to solve this problem have been adopted, such as selecting a neighborhood region and/or a small number of observations to use in the kriging process, but these have no sound theoretical basis and it is unclear what information is being lost. In this article, we present a Bayesian method for estimating the posterior mean and covariance structures of a Gaussian random field using a sequential estimation algorithm. By imposing sparsity in a well-defined framework, the algorithm retains a subset of “basis vectors” that best represent the “true” posterior Gaussian random field model in the relative entropy sense. This allows a principled treatment of Gaussian random field models on very large data sets. The method is particularly appropriate when the Gaussian random field model is regarded as a latent variable model, which may be nonlinearly related to the observations. We show the application of the sequential, sparse Bayesian estimation in Gaussian random field models and discuss its merits and drawbacks.
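A much-simplified sketch of the sequential, budget-limited idea follows: points are admitted to a "basis vector" set only when their kernel projection error onto the current set is large enough. The actual method projects the posterior onto the retained basis in the relative-entropy sense rather than simply subsampling, and the RBF kernel, novelty score and thresholds here are illustrative assumptions.

```python
# Illustrative sketch of sequential sparse GP estimation with a fixed
# budget of basis vectors. This greedy subsampling is a simplification
# of the relative-entropy projection described in the abstract.
import numpy as np

def rbf(a, b, ell=1.0):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def sequential_sparse_gp(x, y, budget=20, noise=0.1, tol=1e-2):
    basis = [0]                      # indices of retained basis vectors
    for i in range(1, len(x)):
        Kbb = rbf(x[basis], x[basis]) + 1e-8 * np.eye(len(basis))
        kb = rbf(x[basis], x[i:i + 1])
        # novelty: kernel projection error of the new point onto the basis
        gamma = 1.0 - float(kb.T @ np.linalg.solve(Kbb, kb))
        if gamma > tol and len(basis) < budget:
            basis.append(i)
    Kbb = rbf(x[basis], x[basis]) + noise ** 2 * np.eye(len(basis))
    alpha = np.linalg.solve(Kbb, y[basis])
    return lambda xs: rbf(xs, x[basis]) @ alpha, basis

x = np.linspace(0.0, 10.0, 500)
y = np.sin(x) + np.random.default_rng(0).normal(0.0, 0.1, x.size)
predict, basis = sequential_sparse_gp(x, y)
print(len(basis), "basis vectors retained out of", len(x))
```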
Abstract:
Advances in both computer technology and the mathematical models capable of capturing the geometry of arbitrarily shaped objects have led to the development in this thesis of a surface generation package called 'IBSCURF', aimed at providing a more economically viable solution to free-form surface manufacture. A suite of computer programs written in FORTRAN 77 has been developed to provide computer aids for every aspect of designing and machining free-form surfaces. A vector-valued parametric method was used for shape description and a lofting technique was employed for the construction of the surface. The development of the package 'IBSCURF' consists of two phases. The first deals with CAD. The design process commences with the definition of the cross-sections, which are represented by uniform B-spline curves as approximations to given polygons. The order of the curve and the position and number of the polygon vertices can be used as parameters for modification to achieve the required curves. When the definitions of the sectional curves are complete, the surface is interpolated over them by cubic cardinal splines. To use the CAD function of the package to design a mould for a plastic handle, a mathematical model was developed. To facilitate the integration of design and machining using the mathematical representation of the surface, the second phase of the package is concerned with CAM: it enables the generation of tool offset positions for ball-nosed cutters, and a general post-processor has been developed which automatically generates NC tape programs for any CNC milling machine. The two phases have been successfully implemented as a CAD/CAM package for free-form surfaces on the VAX 11/750 super-minicomputer, with graphics facilities for displaying drawings interactively on the terminal screen. The development of this package has been beneficial in all aspects of the design and machining of free-form surfaces.
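The uniform B-spline representation of the cross-sections can be illustrated with the standard Cox-de Boor recursion; the control polygon and curve order below are made-up examples (the package's FORTRAN 77 routines are not reproduced here), and the sketch is written in Python for brevity.

```python
# Sketch of evaluating a uniform B-spline cross-section curve via the
# standard Cox-de Boor recursion. Vertex data are illustrative only.
import numpy as np

def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion for the i-th B-spline basis of order k."""
    if k == 1:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k - 1] > knots[i]:
        left = ((t - knots[i]) / (knots[i + k - 1] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    if knots[i + k] > knots[i + 1]:
        right = ((knots[i + k] - t) / (knots[i + k] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

def uniform_bspline(vertices, order, samples=50):
    n = len(vertices)
    knots = np.arange(n + order)                 # uniform knot vector
    ts = np.linspace(knots[order - 1], knots[n] - 1e-9, samples)
    return np.array([sum(bspline_basis(i, order, t, knots) * np.asarray(v)
                         for i, v in enumerate(vertices)) for t in ts])

polygon = [(0, 0), (1, 2), (3, 3), (5, 1), (6, 0)]   # control polygon
curve = uniform_bspline(polygon, order=3)             # quadratic B-spline
```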
Abstract:
Distributed Brillouin sensing of strain and temperature works by making spatially resolved measurements of the position of the measurand-dependent extremum of the resonance curve associated with the scattering process in the weakly nonlinear regime. Typically, measurements of backscattered Stokes intensity (the dependent variable) are made at a number of predetermined fixed frequencies covering the design measurand range of the apparatus and combined to yield an estimate of the position of the extremum. The measurand can then be found because its relationship to the position of the extremum is assumed known. We present analytical expressions relating the relative error in the extremum position to experimental errors in the dependent variable. This is done for two cases: (i) a simple non-parametric estimate of the mean based on moments and (ii) the case in which a least squares technique is used to fit a Lorentzian to the data. The question of statistical bias in the estimates is discussed and in the second case we go further and present for the first time a general method by which the probability density function (PDF) of errors in the fitted parameters can be obtained in closed form in terms of the PDFs of the errors in the noisy data.
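For case (ii), a least-squares Lorentzian fit of the kind analysed above can be sketched as follows; the frequency grid, linewidth and noise level are illustrative values, not ones from the paper.

```python
# Sketch of the second estimator: least-squares fitting of a Lorentzian
# to noisy intensity samples taken at fixed frequencies.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, peak, f0, hwhm):
    return peak / (1.0 + ((f - f0) / hwhm) ** 2)

rng = np.random.default_rng(0)
f = np.linspace(10_700, 10_900, 41)                  # MHz, fixed grid
truth = lorentzian(f, 1.0, 10_805.0, 18.0)
data = truth + rng.normal(scale=0.02, size=f.size)   # noisy intensities

popt, pcov = curve_fit(lorentzian, f, data, p0=[1.0, 10_800.0, 20.0])
print("extremum position estimate:", popt[1], "+/-", np.sqrt(pcov[1, 1]))
```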
Abstract:
The accurate measurement of bed shear stress has been extremely difficult because of its changing values, until White propounded a theory which would give constant shear along the bed of a flume. In this investigation a flume was designed according to White's theory and proved by two separate methods to give a constant shearing force along the bed. The first method applied the hydrogen bubble technique to obtain accurate values of velocity, allowing the velocity profile to be plotted and the momentum at the various test sections to be calculated. The use of a 16 mm Beaulieu movie camera allowed the exact velocity profiles created by the hydrogen bubbles to be recorded, while an analysing projector provided the means of calculating the exact velocities at the various test sections. Simultaneously, Preston's technique of measuring skin friction using Pitot tubes was applied. Two banks of open-ended water manometers were used for recording the static and velocity head pressure drops along the flume. This type of manometer eliminated air locks in the tubes and was found to be sufficiently accurate. Readings of pressure and velocity were taken for various types and diameters of bed material, both natural sands and glass spheres, and the results tabulated. Graphs of particle Reynolds number against bed shear stress were plotted and gave a linear relationship which dropped off at high values of Reynolds number. It was found that bed movement occurred instantaneously along the bed of the flume once the critical velocity had been reached. On completion of this test a roof curve inappropriate to the bed material was used and the test repeated. The bed shearing stress was now no longer constant, and yet bed movement still started instantaneously along the bed of the flume, showing that more parameters than critical shear stress alone govern bed movement. It is concluded from the two separate methods applied that the bed shear stress is constant along the bed of the flume.
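The particle Reynolds number plotted in those graphs follows from the standard definitions Re* = u*·d/ν with shear velocity u* = sqrt(τ₀/ρ); the sketch below applies them to illustrative values, not data from the investigation.

```python
# Particle Reynolds number from bed shear stress, using the standard
# definitions Re* = u* d / nu and u* = sqrt(tau0 / rho).
import math

def particle_reynolds(tau0, grain_d, rho=1000.0, nu=1.0e-6):
    """tau0: bed shear stress [Pa]; grain_d: grain diameter [m];
    rho: water density [kg/m^3]; nu: kinematic viscosity [m^2/s]."""
    u_star = math.sqrt(tau0 / rho)   # shear velocity [m/s]
    return u_star * grain_d / nu

print(particle_reynolds(tau0=0.5, grain_d=0.5e-3))  # e.g. medium sand
```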
Abstract:
An experimental investigation into the Acoustic Emission (AE) response of sand has been undertaken, and the use of AE as a method of yield point identification has been assessed. Dense, saturated samples of sand were tested in conventional triaxial apparatus. The measurements of stresses and strains were carried out according to current research practice. The AE monitoring system was integrated with the soil mechanics equipment in such a way that sample disturbance was minimised. During monotonically loaded, constant cell pressure tests the total number of events recorded was found to increase at an increasing rate in a manner which may be approximated by a power law. The AE response of the sand was found to be both stress level and stress path dependent. Undrained constant cell pressure tests showed that, unlike drained tests, the AE event rate increased at an increasing rate; this was shown to correlate with the mean effective stress variation. The stress path dependence was most noticeable in extension tests, where the number of events recorded was an order of magnitude less than that recorded in comparable compression tests. This stress path dependence was shown to be due to the differences in the work done by the external stresses. In constant cell pressure tests containing unload/reload cycles it was found that yield could be identified from a discontinuity in the event rate/time curve which occurred during reloading. Further tests involving complex stress paths showed that AE was a useful method of yield point identification. Some tests involving large stress reversals were carried out, and AE identified the inverse yield points more distinctly than conventional methods of yield point identification.
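The power-law growth of the cumulative event count can be fitted by a log-log linear regression, as sketched below on synthetic data; the exponent and noise level are illustrative assumptions, not values from the tests.

```python
# Sketch of fitting the power-law growth of cumulative AE events,
# N(t) ~ a * t**b, via a straight-line fit in log-log coordinates.
import numpy as np

t = np.linspace(0.1, 10.0, 50)     # loading time (arbitrary units)
events = 12.0 * t ** 1.8 * np.exp(
    np.random.default_rng(1).normal(0.0, 0.05, t.size))  # synthetic counts

b, log_a = np.polyfit(np.log(t), np.log(events), 1)
print(f"fitted exponent b = {b:.2f}, prefactor a = {np.exp(log_a):.1f}")
```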
Abstract:
The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computing power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of transputers. The Rational Fraction Polynomial method is a well-known and robust frequency domain 'curve fitting' algorithm. The Ibrahim Time Domain method is an efficient algorithm that 'curve fits' in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real transputer network.
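A much-simplified, single-FRF sketch of a Rational Fraction Polynomial style fit is given below: linearised least squares for the numerator and denominator coefficients, with modal parameters read from the denominator roots. Production RFP implementations use orthogonal (Forsythe) polynomials for numerical conditioning; fitting raw powers of s = jω, as here, is an illustrative simplification.

```python
# Simplified RFP-style curve fit of one frequency response function.
# Solves b(s) - H(s) a(s) = H(s) s^n_den in least squares, a_{n_den}=1.
import numpy as np

def rfp_fit(omega, H, n_num=3, n_den=4):
    s = 1j * omega
    # unknowns: b_0..b_{n_num}, a_0..a_{n_den-1} (a_{n_den} fixed to 1)
    A = np.hstack([np.vander(s, n_num + 1, increasing=True),
                   -H[:, None] * np.vander(s, n_den, increasing=True)])
    rhs = H * s ** n_den
    A_ri = np.vstack([A.real, A.imag])          # stack real/imag parts
    rhs_ri = np.concatenate([rhs.real, rhs.imag])
    coef, *_ = np.linalg.lstsq(A_ri, rhs_ri, rcond=None)
    b = coef[:n_num + 1]
    a = np.append(coef[n_num + 1:], 1.0)        # denominator, ascending powers
    poles = np.roots(a[::-1])                   # np.roots wants descending order
    wn = np.abs(poles)                          # natural frequencies [rad/s]
    zeta = -poles.real / np.abs(poles)          # damping ratios
    return b, a, wn, zeta

# Synthetic single-mode FRF: wn = 30 rad/s, zeta = 0.02
omega = np.linspace(1.0, 100.0, 400)
H = 1.0 / (-omega ** 2 + 2j * 0.02 * 30.0 * omega + 30.0 ** 2)
_, _, wn, zeta = rfp_fit(omega, H, n_num=1, n_den=2)
print(wn, zeta)   # expect ~30 rad/s and ~0.02 for the complex pole pair
```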
Abstract:
Hierarchical knowledge structures are frequently used within clinical decision support systems as part of the model for generating intelligent advice. The nodes in the hierarchy inevitably have varying influence on the decision-making processes, which needs to be reflected by parameters. If the model has been elicited from human experts, it is not feasible to ask them to estimate the parameters because there will be so many in even moderately sized structures. This paper describes how the parameters can be obtained from data instead, using only a small number of cases. The original method [1] is applied to a particular web-based clinical decision support system called GRiST, which uses its hierarchical knowledge to quantify the risks associated with mental-health problems. The knowledge was elicited from multidisciplinary mental-health practitioners, but the tree has several thousand nodes, all requiring an estimation of their relative influence on the assessment process. The method described in the paper shows how these can be obtained from about 200 cases instead. It greatly reduces the experts' elicitation tasks and has the potential to be generalised to similar knowledge-engineering domains where relative weightings of node siblings are part of the parameter space.
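The abstract does not detail the estimation procedure of [1], so the following is only a generic stand-in for the parameters-from-data idea: relative weights of sibling nodes fitted to about 200 cases by non-negative least squares and then normalised. All names and data are hypothetical.

```python
# Generic stand-in: relative sibling-node weights from case data via
# non-negative least squares, normalised to sum to 1. Not the method
# of [1], which the abstract does not specify.
import numpy as np
from scipy.optimize import nnls

def sibling_weights(node_values, parent_scores):
    """node_values: (cases x siblings) child-node scores per case;
    parent_scores: recorded scores for the parent node per case."""
    w, _ = nnls(node_values, parent_scores)
    return w / w.sum() if w.sum() > 0 else w

cases = np.random.default_rng(2).uniform(0, 1, (200, 3))   # ~200 cases
parent = cases @ np.array([0.5, 0.3, 0.2])                 # synthetic truth
print(sibling_weights(cases, parent))                      # ~[0.5, 0.3, 0.2]
```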
Abstract:
Artifact selection decisions typically involve the selection of one from a number of possible/candidate options (decision alternatives). In order to support such decisions, it is important to identify and recognize relevant key issues of problem solving and decision making (Albers, 1996; Harris, 1998a, 1998b; Jacobs & Holten, 1995; Loch & Conger, 1996; Rumble, 1991; Sauter, 1999; Simon, 1986). Sauter classifies four problem solving/decision making styles: (1) left-brain style, (2) right-brain style, (3) accommodating, and (4) integrated (Sauter, 1999). The left-brain style employs analytical and quantitative techniques and relies on rational and logical reasoning. In an effort to achieve predictability and minimize uncertainty, problems are explicitly defined, solution methods are determined, orderly information searches are conducted, and analysis is increasingly refined. Left-brain style decision making works best when it is possible to predict/control, measure, and quantify all relevant variables, and when information is complete. In direct contrast, right-brain style decision making is based on intuitive techniques—it places more emphasis on feelings than facts. Accommodating decision makers use their non-dominant style when they realize that it will work best in a given situation. Lastly, integrated style decision makers are able to combine the left- and right-brain styles—they use analytical processes to filter information and intuition to contend with uncertainty and complexity.
Abstract:
An inverse problem is considered where the structure of multiple sound-soft planar obstacles is to be determined given the direction of the incoming acoustic field and knowledge of the corresponding total field on a curve located outside the obstacles. A local uniqueness result is given for this inverse problem suggesting that the reconstruction can be achieved by a single incident wave. A numerical procedure based on the concept of the topological derivative of an associated cost functional is used to produce images of the obstacles. No a priori assumption about the number of obstacles present is needed. Numerical results are included showing that accurate reconstructions can be obtained and that the proposed method is capable of finding both the shapes and the number of obstacles with one or a few incident waves.
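For reference, the standard definitions behind such a procedure can be written as follows; exact normalisations and the form of f(ε) for sound-soft obstacles vary by author, so this is a hedged sketch rather than the paper's own formulas. The first expression is the misfit functional over the measurement curve C; the second is the expansion defining the topological derivative, whose most negative values indicate where to place obstacles.

```latex
\[
  J(\Omega) = \frac{1}{2} \int_{C} \big| u_{\Omega} - u_{\mathrm{meas}} \big|^{2} \, \mathrm{d}s ,
\qquad
  J\big(\Omega \setminus \overline{B_{\varepsilon}(z)}\big)
  = J(\Omega) + f(\varepsilon)\, D_{T}(z) + o\big(f(\varepsilon)\big),
  \quad \varepsilon \to 0 .
\]
```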
Abstract:
We consider a Cauchy problem for the Laplace equation in a two-dimensional semi-infinite region with a bounded inclusion, i.e. the region is the intersection between a half-plane and the exterior of a bounded closed curve contained in the half-plane. The Cauchy data are given on the unbounded part of the boundary of the region and the aim is to construct the solution on the boundary of the inclusion. In 1989, Kozlov and Maz'ya [10] proposed an alternating iterative method for solving Cauchy problems for general strongly elliptic and formally self-adjoint systems in bounded domains. We extend their approach to our setting, and in each iteration step mixed boundary value problems for the Laplace equation in the semi-infinite region are solved. Well-posedness of these mixed problems is investigated and convergence of the alternating procedure is examined. For the numerical implementation an efficient boundary integral equation method is proposed, based on the indirect variant of the boundary integral equation approach. The mixed problems are reduced to integral equations over the (bounded) boundary of the inclusion. Numerical examples are included showing the feasibility of the proposed method.
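One common arrangement of the alternating iteration can be sketched as follows; solve_with_inner_neumann and solve_with_inner_dirichlet are hypothetical placeholders for the mixed boundary value problem solvers (realised in the paper by boundary integral equations), and only the data-passing pattern is shown.

```python
# Hedged sketch of a Kozlov-Maz'ya style alternating iteration. The two
# solver arguments are hypothetical placeholders for mixed BVP solvers;
# only the alternation of Dirichlet/Neumann data is illustrated.
def alternating_iteration(f, g, solve_with_inner_neumann,
                          solve_with_inner_dirichlet, eta0, n_iter=50):
    """f, g: Dirichlet and Neumann Cauchy data on the accessible
    (unbounded) part of the boundary; eta0: initial guess for the
    Neumann data on the inclusion boundary."""
    eta, phi = eta0, None
    for _ in range(n_iter):
        # Mixed problem 1: Dirichlet data f outside, Neumann guess eta
        # on the inclusion; read off the Dirichlet trace there.
        phi = solve_with_inner_neumann(outer_dirichlet=f, inner_neumann=eta)
        # Mixed problem 2: Neumann data g outside, Dirichlet trace phi
        # on the inclusion; read off an updated Neumann trace.
        eta = solve_with_inner_dirichlet(outer_neumann=g, inner_dirichlet=phi)
    return phi, eta   # approximations of the solution on the inclusion
```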
Abstract:
An increasing number of publications on the dried blood spot (DBS) sampling approach for the quantification of drugs and metabolites have been spurred on by the inherent advantages of this sampling technique. In the present research, a selective and sensitive high-performance liquid chromatography method for the concurrent determination of multiple antiepileptic drugs (AEDs) [levetiracetam (LVT), lamotrigine (LTG), phenobarbital (PHB), carbamazepine (CBZ) and its active metabolite carbamazepine-10,11-epoxide (CBZE)] in a single DBS has been developed and validated. Whole blood was spotted onto Guthrie cards and dried. Using a standard punch (6 mm diameter), a circular disc was punched from the card and extracted with methanol:acetonitrile (3:1, v/v) containing hexobarbital (internal standard) and sonicated prior to evaporation. The extract was then dissolved in water and vortex-mixed before undergoing solid phase extraction using HLB cartridges. Chromatographic separation of the AEDs was achieved using a Waters XBridge™ C18 column with a gradient system. The developed method was linear over the concentration ranges studied, with r ≥ 0.995 for all compounds. The lower limits of quantification (LLOQs) were 2, 1, 2, 0.5 and 1 μg/mL for LVT, LTG, PHB, CBZE and CBZ, respectively. Accuracy (%RE) and precision (%CV) values for within- and between-day runs were <20% at the LLOQs and <15% at all other concentrations tested. This method was successfully applied to the analysis of the AEDs in DBS samples taken from children with epilepsy for the assessment of their adherence to prescribed treatments.
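The accuracy and precision acceptance checks quoted above follow the standard definitions %RE = 100·(mean − nominal)/nominal and %CV = 100·(sd/mean); the sketch below applies them to illustrative replicate values, not data from the study.

```python
# Accuracy (%RE) and precision (%CV) checks using their standard
# definitions. Replicate values are illustrative only.
import numpy as np

def accuracy_re(measured, nominal):
    return 100.0 * (np.mean(measured) - nominal) / nominal

def precision_cv(measured):
    return 100.0 * np.std(measured, ddof=1) / np.mean(measured)

lloq_reps = np.array([2.1, 1.8, 2.3, 1.9, 2.2])   # e.g. LVT at LLOQ 2 ug/mL
print(f"%RE = {accuracy_re(lloq_reps, 2.0):.1f}, "
      f"%CV = {precision_cv(lloq_reps):.1f}")
```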
Abstract:
This paper considers the contemporary use of focus groups as a method of data collection within qualitative research settings. The authors draw upon their own experiences of using focus groups in educational and 'community' user-group environments in order to provide an overview of recent issues and debates surrounding the deployment of focus group methods and to pick out specific areas of contention in relation to both their epistemological and practical implications. Accordingly, the paper reflects on some of the realities of 'doing' focus groups whilst, at the same time, highlighting common problems and dilemmas which beginning researchers might encounter in their application. In turn, the paper raises a number of related issues around which there appears to have been a lack of academic discussion to date.