27 results for Hypercomplex geometric derivative

in the Aston University Research Archive


Relevance:

20.00%

Abstract:

Neural networks can be regarded as statistical models, and can be analysed in a Bayesian framework. Generalisation is measured by the performance on independent test data drawn from the same distribution as the training data. Such performance can be quantified by the posterior average of the information divergence between the true and the model distributions. Averaging over the Bayesian posterior guarantees internal coherence; using information divergence guarantees invariance with respect to representation. The theory generalises the least mean squares theory for linear Gaussian models to general problems of statistical estimation. The main results are: (1) the ideal optimal estimate is always given by the average over the posterior; (2) the optimal estimate within a computational model is given by the projection of the ideal estimate onto the model. This incidentally shows that some currently popular methods dealing with hyperpriors are in general unnecessary and misleading. The extension of information divergence to positive normalisable measures reveals a remarkable relation between the ±δ dual affine geometry of statistical manifolds and the geometry of the dual pair of Banach spaces L_δ and L_{1-δ}. It therefore offers a conceptual simplification of information geometry. The general conclusion on the issue of evaluating neural network learning rules and other statistical inference methods is that such evaluations are only meaningful under three assumptions: the prior P(p), describing the environment of all the problems; the divergence D_δ, specifying the requirement of the task; and the model Q, specifying the available computing resources.
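
As a worked illustration of the two main results, for the Kullback-Leibler case (notation assumed here, not taken from the paper):

\[
G(q) = \int D(p \,\|\, q)\, \pi(p \mid \mathcal{D})\, \mathrm{d}p ,
\qquad
\hat{p}_{\mathrm{ideal}} = \arg\min_{q} G(q) = \int p\, \pi(p \mid \mathcal{D})\, \mathrm{d}p ,
\]

where \pi(p \mid \mathcal{D}) is the posterior over candidate true distributions p given the data \mathcal{D}. Because G(q) differs from D(\hat{p}_{\mathrm{ideal}} \,\|\, q) only by a constant in this case, the optimum within a computational model Q, \arg\min_{q \in Q} G(q), coincides with the projection of the ideal estimate onto Q, which is result (2).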

Relevance:

20.00%

Abstract:

Purpose: The aim of this study was to compare a developmental optical coherence tomography (OCT) based contact lens inspection instrument with a widely used geometric inspection instrument (the Optimec JCF), to establish the capability of a market-focused OCT system. Methods: Measurements of 27 soft spherical contact lenses were made using the Optimec JCF and a new OCT-based instrument, the Optimec is830. Twelve of the lenses analysed were specially commissioned from a traditional hydrogel (Contamac GM Advance 49%) and twelve from a silicone hydrogel (Contamac Definitive 65), each set covering a range of back optic zone radius (BOZR) and centre thickness (CT) values. Three commercial lenses were also measured: CooperVision MyDay (Stenfilcon A) in −10D, −3D and +6D powers. Two measurements of BOZR, CT and total diameter were made for each lens in temperature-controlled saline on both instruments. Results: The results showed that the is830 and JCF measurements were comparable, but that the is830 had a better repeatability coefficient for BOZR (0.065 mm compared to 0.151 mm) and CT (0.008 mm compared to 0.027 mm). Both instruments had similar results for total diameter (0.041 mm compared to 0.044 mm). Conclusions: The OCT-based instrument assessed in this study is able to match or improve on the JCF instrument for the measurement of total diameter, back optic zone radius and centre thickness of soft contact lenses in temperature-controlled saline.
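
The repeatability coefficients above are derived from the paired repeat measurements; a minimal sketch assuming the common Bland-Altman definition, 1.96 x sqrt(2) times the within-subject standard deviation, which may differ in detail from the authors' method:

import numpy as np

def repeatability_coefficient(first, second):
    # Bland-Altman style repeatability: 1.96 * sqrt(2) * within-subject SD,
    # with the within-subject SD estimated from two measurements per lens.
    d = np.asarray(first) - np.asarray(second)
    s_w = np.sqrt(np.mean(d ** 2) / 2.0)
    return 1.96 * np.sqrt(2.0) * s_w

# Hypothetical repeat BOZR readings (mm) for four lenses:
print(repeatability_coefficient([8.42, 8.61, 8.55, 8.70],
                                [8.45, 8.59, 8.57, 8.66]))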

Relevance:

20.00%

Abstract:

Traditional approaches to calculating total factor productivity change through Malmquist indexes rely on distance functions. In this paper we show that the use of distance functions as a means to calculate total factor productivity change may introduce bias into the analysis, and we therefore propose a procedure that calculates total factor productivity change through observed values only. Our total factor productivity change is then decomposed into efficiency change, technological change, and a residual effect. This decomposition uses a non-oriented measure in order to avoid the problems associated with the traditional use of radial, oriented measures, especially when variable returns to scale technologies are to be compared.
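
For reference, the traditional distance-function form of the Malmquist index that the paper argues against (standard DEA notation, not reproduced from the paper):

\[
M(x^t, y^t, x^{t+1}, y^{t+1}) =
\left[
\frac{D^t(x^{t+1}, y^{t+1})}{D^t(x^t, y^t)} \cdot
\frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t+1}(x^t, y^t)}
\right]^{1/2},
\]

where D^s(x, y) is the distance function measured against the period-s frontier. The bias discussed above enters through these estimated frontiers; the proposed procedure computes productivity change from observed input-output values instead.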

Relevance:

20.00%

Abstract:

Traditional approaches to calculating total factor productivity (TFP) change through Malmquist indexes rely on distance functions. In this paper we show that the use of distance functions as a means to calculate TFP change may introduce bias into the analysis, and we therefore propose a procedure that calculates TFP change through observed values only. Our TFP change is then decomposed into efficiency change, technological change, and a residual effect. This decomposition uses a non-oriented measure in order to avoid the problems associated with the traditional use of radial, oriented measures, especially when variable returns to scale technologies are to be compared. The proposed approach is applied in this paper to a sample of Portuguese bank branches.
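
Schematically, the decomposition stated above takes the multiplicative form common to this literature (the exact definition of each term, and of the residual in particular, is the paper's own and is not reproduced here):

\[
\mathrm{TFP\ change} = EC \times TC \times R ,
\]

with EC the efficiency change, TC the technological change, and R the residual effect that remains once the non-oriented measure has accounted for the first two components.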

Relevance:

20.00%

Abstract:

Edges are key points of information in visual scenes. One important class of models supposes that edges correspond to the steepest parts of the luminance profile, implying that they can be found as peaks and troughs in the response of a gradient (1st-derivative) filter, or as zero-crossings (ZCs) in the 2nd derivative. We tested those ideas using a stimulus that has no local peaks of gradient and no ZCs, at any scale. The stimulus profile is analogous to the Mach ramp, but it is the luminance gradient (not the absolute luminance) that increases as a linear ramp between two plateaux; the luminance profile is a blurred triangle wave. For all image blurs tested, observers marked edges at or close to the corner points in the gradient profile, even though these were not gradient maxima. These Mach edges correspond to peaks and troughs in the 3rd derivative. Thus Mach edges are inconsistent with many standard edge-detection schemes, but are nicely predicted by a recent model that finds edge points with a 2-stage sequence of 1st- then 2nd-derivative operators, each followed by a half-wave rectifier.
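
A minimal numerical sketch of the stimulus and of the 3rd-derivative account above (sampling, period and amplitude are illustrative assumptions, not the authors' implementation):

import numpy as np

# Stimulus: the luminance *gradient* is a trapezoid wave (linear ramps
# between two plateaux), so the luminance itself is a blurred triangle
# wave with no isolated gradient peaks and no 2nd-derivative zero-crossings.
x = np.linspace(0.0, 2.0, 2001)
dx = x[1] - x[0]
tri = 2.0 * np.abs(x - 1.0) - 1.0                  # triangle wave in [-1, 1]
gradient_profile = np.clip(3.0 * tri, -1.0, 1.0)   # trapezoid: corners at x = 1/3, 2/3, 4/3, 5/3
luminance = np.cumsum(gradient_profile) * dx

# Mach edges correspond to peaks and troughs of the 3rd derivative, i.e.
# the corner points of the gradient profile, even though no gradient
# maximum and no 2nd-derivative zero-crossing exists at those points.
d3 = np.gradient(np.gradient(np.gradient(luminance, dx), dx), dx)
print("one edge near x =", x[np.argmax(d3)],
      "; opposite polarity near x =", x[np.argmin(d3)])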

Relevance:

20.00%

Abstract:

Feature detection is a crucial stage of visual processing. In previous feature-marking experiments we found that peaks in the 3rd derivative of the luminance profile can signify edges where there are no 1st-derivative peaks and no 2nd-derivative zero-crossings (Wallis and Georgeson). These 'Mach edges' (the edges of Mach bands) were nicely predicted by a new nonlinear model based on 3rd-derivative filtering. As a critical test of the model, we now use a new class of stimuli, formed by adding a linear luminance ramp to the blurred triangle waves used previously. The ramp has no effect on the second or higher derivatives, but the nonlinear model predicts a shift from seeing two edges to seeing only one edge as the added ramp gradient increases. In experiment 1, subjects judged whether one or two edges were visible on each trial. In experiment 2, subjects used a cursor to mark perceived edges and bars. The position and polarity of the marked edges were close to model predictions. Both experiments produced the predicted shift from two to one Mach edge, but the shift was less complete than predicted. We conclude that the model is a useful predictor of edge perception, but needs some modification.
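
The linearity point at stake can be verified in one line (standard calculus, not from the paper): for an added ramp ax + b,

\[
\frac{\mathrm{d}^2}{\mathrm{d}x^2}\bigl[L(x) + ax + b\bigr] = L''(x),
\]

so every purely linear 2nd- or 3rd-derivative edge detector is blind to the added ramp. Only a model with an intermediate nonlinearity, such as a half-wave rectifier after the 1st-derivative stage, can respond to it; that is what makes the predicted two-edges-to-one-edge shift a critical test.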

Relevance:

20.00%

Abstract:

Edge detection is crucial in visual processing. Previous computational and psychophysical models have often used peaks in the gradient or zero-crossings in the 2nd derivative to signal edges. We tested these approaches using a stimulus that has no such features. Its luminance profile was a triangle wave, blurred by a rectangular function. Subjects marked the position and polarity of perceived edges. For all blur widths tested, observers marked edges at or near 3rd derivative maxima, even though these were not 1st derivative maxima or 2nd derivative zero-crossings, at any scale. These results are predicted by a new nonlinear model based on 3rd derivative filtering. As a critical test, we added a ramp of variable slope to the blurred triangle-wave luminance profile. The ramp has no effect on the (linear) 2nd or higher derivatives, but the nonlinear model predicts a shift from seeing two edges to seeing one edge as the ramp gradient increases. Results of two experiments confirmed such a shift, thus supporting the new model. [Supported by the Engineering and Physical Sciences Research Council].
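
A sketch of how a stimulus of this class can be generated, including the added ramp (sizes, period and slope are illustrative assumptions):

import numpy as np

def blurred_triangle_wave(n=4000, period=1000, blur_width=120):
    # Triangle-wave luminance profile blurred by a rectangular (boxcar)
    # function. The boxcar blur turns the gradient (a square wave) into a
    # trapezoid wave: linear ramps between plateaux, with no isolated
    # gradient peaks and no 2nd-derivative zero-crossings.
    x = np.arange(n)
    tri = 2.0 * np.abs((x % period) / period - 0.5)   # triangle wave in [0, 1]
    box = np.ones(blur_width) / blur_width            # rectangular blur kernel
    return np.convolve(tri, box, mode="same")

def add_ramp(luminance, slope):
    # Adding a linear ramp shifts the 1st derivative everywhere by a
    # constant but leaves the 2nd and higher derivatives untouched.
    return luminance + slope * np.arange(luminance.size)

stimulus = add_ramp(blurred_triangle_wave(), slope=1e-4)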

Relevance:

20.00%

Abstract:

In many models of edge analysis in biological vision, the initial stage is a linear 2nd-derivative operation. Such models predict that adding a linear luminance ramp to an edge will have no effect on the edge's appearance, since the ramp has no effect on the 2nd derivative. Our experiments did not support this prediction: adding a negative-going ramp to a positive-going edge (or vice versa) greatly reduced the perceived blur and contrast of the edge. The effects on a fairly sharp edge were accurately predicted by a nonlinear multi-scale model of edge processing [Georgeson, M. A., May, K. A., Freeman, T. C. A., & Hesse, G. S. (in press). From filters to features: Scale-space analysis of edge and blur coding in human vision. Journal of Vision], in which a half-wave rectifier comes after the 1st-derivative filter. But we also found that the ramp affected perceived blur more profoundly when the edge blur was large, and this greater effect was not predicted by the existing model. The model's fit to these data was much improved when the simple half-wave rectifier was replaced by a threshold-like transducer [May, K. A. & Georgeson, M. A. (2007). Blurred edges look faint, and faint edges look sharp: The effect of a gradient threshold in a multi-scale edge coding model. Vision Research, 47, 1705-1720]. This modified model correctly predicted that the interaction between ramp gradient and edge scale would be much larger for blur perception than for contrast perception. In our model, the ramp narrows an internal representation of the gradient profile, leading to a reduction in perceived blur. This in turn reduces perceived contrast, because estimated blur plays a role in the model's estimation of contrast. Interestingly, the model predicts that analogous effects should occur when the width of the window containing the edge is made narrower. This has already been confirmed for blur perception; here, we further support the model by showing a similar effect for contrast perception.
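
The modification described above is small in code terms; a minimal sketch, with the threshold value an illustrative assumption:

import numpy as np

def half_wave(x):
    # Original nonlinearity: suppresses negative gradient values only.
    return np.maximum(x, 0.0)

def threshold_transducer(x, t=0.05):
    # Threshold-like transducer: suppresses negative values AND positive
    # values below t. At low contrast, more of the gradient profile falls
    # below t, so its effective spatial extent shrinks and the edge is
    # predicted to look sharper (May & Georgeson, 2007).
    return np.where(x > t, x - t, 0.0)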

Relevance:

20.00%

Abstract:

Edges are key points of information in visual scenes. One important class of models supposes that edges correspond to the steepest parts of the luminance profile, implying that they can be found as peaks and troughs in the response of a gradient (first-derivative) filter, or as zero-crossings (ZCs) in the second derivative. A variety of multi-scale models are based on this idea. We tested this approach by devising a stimulus that has no local peaks of gradient and no ZCs, at any scale. Our stimulus profile is analogous to the classic Mach-band stimulus, but it is the local luminance gradient (not the absolute luminance) that increases as a linear ramp between two plateaux. The luminance profile is a smoothed triangle wave and is obtained by integrating the gradient profile. Subjects used a cursor to mark the position and polarity of perceived edges. For all the ramp widths tested, observers marked edges at or close to the corner points in the gradient profile, even though these were not gradient maxima. These new Mach edges correspond to peaks and troughs in the third derivative. They are analogous to Mach bands: light and dark bars are seen where there are no luminance peaks but there are peaks in the second derivative. Here, peaks in the third derivative were seen as light-to-dark edges, and troughs as dark-to-light edges. Thus Mach edges are inconsistent with many standard edge detectors, but are nicely predicted by a new model that uses a (nonlinear) third-derivative operator to find edge points.

Relevance:

20.00%

Abstract:

We studied the visual mechanisms that encode edge blur in images. Our previous work suggested that the visual system spatially differentiates the luminance profile twice to create the 'signature' of the edge, and then evaluates the spatial scale of this signature profile by applying Gaussian-derivative templates of different sizes. The scale of the best-fitting template indicates the blur of the edge. In blur-matching experiments, a staircase procedure was used to adjust the blur of a comparison edge (40% contrast, 0.3 s duration) until it appeared to match the blur of test edges at different contrasts (5%-40%) and blurs (6-32 min of arc). Results showed that lower-contrast edges looked progressively sharper. We also added a linear luminance gradient to blurred test edges. When the added gradient was of opposite polarity to the edge gradient, it made the edge look progressively sharper. Both effects can be explained quantitatively by the action of a half-wave rectifying nonlinearity that sits between the first and second (linear) differentiating stages. This rectifier was introduced to account for a range of other effects on perceived blur (Barbieri-Hesse and Georgeson, 2002, Perception 31 Supplement, 54), but it readily predicts the influence of the negative ramp. The effect of contrast arises because the rectifier has a threshold: it suppresses not only negative values but also small positive values. At low contrasts, more of the gradient profile falls below threshold and its effective spatial scale shrinks in size, leading to perceived sharpening.
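
A minimal sketch of this template-matching scheme (filter form, scale sampling and the matching rule are simplifying assumptions; the published model operates across a multi-scale filter bank):

import math
import numpy as np

def gaussian_deriv_template(x, scale):
    # Gaussian 1st-derivative template, unit-normalised. Matched against
    # the 2nd-derivative 'signature' of an edge, whose shape it shares.
    g = -x * np.exp(-x**2 / (2.0 * scale**2))
    return g / np.linalg.norm(g)

def estimate_blur(luminance, dx, scales):
    # Differentiate the profile twice to get the edge signature, then
    # return the scale of the best-correlating template: the estimate
    # of edge blur.
    signature = np.gradient(np.gradient(luminance, dx), dx)
    signature = signature / np.linalg.norm(signature)
    x = (np.arange(luminance.size) - luminance.size // 2) * dx
    scores = [abs(np.dot(signature, gaussian_deriv_template(x, s)))
              for s in scales]
    return scales[int(np.argmax(scores))]

# A Gaussian-blurred edge with blur sigma = 8 (arbitrary units); the
# best-fitting template scale recovers the blur.
x = np.linspace(-100.0, 100.0, 1001)
edge = np.array([0.5 * (1.0 + math.erf(v / (8.0 * math.sqrt(2.0)))) for v in x])
print(estimate_blur(edge, dx=x[1] - x[0], scales=np.linspace(2.0, 30.0, 141)))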

Relevance:

20.00%

Abstract:

Marr's work offered guidelines on how to investigate vision (the theory-algorithm-implementation distinction), as well as specific proposals on how vision is done. Many of the latter have inevitably been superseded, but the approach was inspirational and remains so. Marr saw the computational study of vision as tightly linked to psychophysics and neurophysiology, but the last twenty years have seen some weakening of that integration. Because feature detection is a key stage in early human vision, we have returned to basic questions about the representation of edges at coarse and fine scales. We describe an explicit model in the spirit of the primal sketch, but tightly constrained by psychophysical data. Results from two tasks (location-marking and blur-matching) point strongly to the central role played by second-derivative operators, as proposed by Marr and Hildreth. Edge location and blur are evaluated by finding the location and scale of the Gaussian-derivative 'template' that best matches the second-derivative profile ('signature') of the edge. The system is scale-invariant, and accurately predicts blur-matching data for a wide variety of 1-D and 2-D images. By finding the best-fitting scale, it implements a form of local scale selection and circumvents the knotty problem of integrating filter outputs across scales. [Supported by BBSRC and the Wellcome Trust]

Relevance:

20.00%

Abstract:

Most object-based approaches to Geographical Information Systems (GIS) have concentrated on the representation of geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain we have a requirement not only to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. In previous studies of the application of spatial rules (often called 'business logic') in GIS, emphasis has been placed on the representation of topological constraints and data integrity checks. There is very little GIS literature that describes models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends these using a Geography Markup Language (GML) based representation. In our application we have an additional requirement: the geometric rules are often changed and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses the feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We show the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1.
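
A minimal sketch of what such a parameterised offset rule could look like (class and field names are hypothetical; the paper itself expresses the rules in a GML 3.1 based representation, not in Python):

from dataclasses import dataclass

@dataclass
class OffsetRule:
    # Parameterised geometric constraint: a subject feature must sit
    # within [min_mm, max_mm] of a reference feature. Hypothetical model
    # of the paper's rule representation.
    subject: str      # e.g. "give_way_line"
    reference: str    # e.g. "crossing_pattern_edge"
    min_mm: float
    max_mm: float

    def check(self, measured_offset_mm: float) -> bool:
        return self.min_mm <= measured_offset_mm <= self.max_mm

# UK pedestrian-crossing example from the abstract: the give-way line
# must be within 1100-3000 mm of the edge of the crossing pattern.
rule = OffsetRule("give_way_line", "crossing_pattern_edge", 1100.0, 3000.0)
print(rule.check(1500.0))   # True: offset satisfies the regulation
print(rule.check(800.0))    # False: too close to the crossing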

Relevance:

20.00%

Abstract:

A long period grating is interrogated with a fibre Bragg grating using a derivative spectroscopy technique. A quasi-linear relationship between the output of the sensing scheme and the curvature experienced by the long period grating is demonstrated, with a sensitivity of 5.05 m and an average curvature resolution of 2.9 × 10⁻² m⁻¹. In addition, the feasibility of multiplexing an in-line series of long period gratings with this interrogation scheme is demonstrated with two pairs of fibre Bragg gratings and long period gratings. With this arrangement the cross-talk error between channels was less than ±2.4 × 10⁻³ m⁻¹.

Relevance:

20.00%

Abstract:

As systems for computer-aided design and production of mechanical parts have developed, there has arisen a need for techniques for the comprehensive description of the desired part, including its 3-D shape. The creation and manipulation of shapes is generally known as geometric modelling. It is desirable that links be established between geometric modellers and machining programs. Currently, unbounded APT and some bounded-geometry systems are widely used in manufacturing industry for machining operations such as milling, drilling, boring and turning, applied mainly to engineering parts. APT systems, however, are presently only linked to wire-frame drafting systems. The combination of a geometric modeller and APT will provide a powerful manufacturing system for industry, from the initial design right through to part manufacture using NC machines. This thesis describes a recently developed interface (ROMAPT) between a bounded geometry modeller (ROMULUS) and an unbounded NC processor (APT). A new set of theoretical functions and practical algorithms for the computer-aided manufacturing of 3D solid geometric models has been investigated. This work has led to the development of a sophisticated computer program, ROMAPT, which provides a new link between CAD (in the form of the geometric modeller ROMULUS) and CAM (in the form of the APT NC system). ROMAPT has been used to machine some engineering prototypes successfully, in both soft foam material and aluminium. It has been demonstrated that the theory and algorithms developed by the author for the computer-aided manufacturing of 3D solid models are both valid and applicable. ROMAPT allows the full potential of a solid geometric modeller (ROMULUS) to be further exploited for NC applications without requiring major investment in a new NC processor. ROMAPT supports output in APT-AC, APT4 and the CAM-I SSRI NC languages.