959 results for Characteristic curves
Abstract:
Gately [1974] recently introduced the concept of an individual player's “propensity to disrupt” a payoff vector in a three-person characteristic function game. As a generalisation of this concept we propose the “disruption nucleolus” of an n-person game. The properties and computational possibilities of this concept are analogous to those of the nucleolus itself. Two numerical examples are given.
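For orientation, Gately's propensity to disrupt is commonly stated as follows for a game (N, v) and payoff vector x; this is a minimal sketch in our own notation, not quoted from the paper:

```latex
% Propensity of player i to disrupt the payoff vector x in the game (N, v):
% the ratio of what the remaining players would lose to what player i would
% lose if i abandoned the grand coalition (assuming x_i > v({i})).
\[
  d_i(x) \;=\;
  \frac{\sum_{j \neq i} x_j \;-\; v\bigl(N \setminus \{i\}\bigr)}
       {x_i \;-\; v\bigl(\{i\}\bigr)}
\]
```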
Abstract:
For a Switched Reluctance Motor (SRM), the flux linkage characteristic is the most basic magnetic characteristic, and many other quantities, including the incremental inductance, back EMF, and electromagnetic torque, can be determined indirectly from it. In this paper, two methods of measuring the flux linkage profile of an SRM from the phase winding voltage and current measurements, with and without rotor locking devices, are presented. Torque, incremental inductance, and back EMF characteristics of the SRM are then obtained from the flux linkage measurements. The torque of the SRM is also measured directly as a comparison, and the closeness of the calculated and directly measured torque curves supports the validity of the method for obtaining the SRM torque, incremental inductance, and back EMF profiles from the flux linkage measurements. © 2013 IEEE.
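The abstract does not give the paper's numerical procedure, but the standard relations it relies on (incremental inductance as ∂ψ/∂i, motional back EMF as ω·∂ψ/∂θ, and torque as the rotor-angle derivative of the magnetic coenergy ∫ψ di) can be sketched on a measured flux-linkage grid ψ(i, θ). The grid, the dummy flux-linkage map, and the speed below are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Illustrative flux-linkage map psi[k, j] on a current/angle grid
# (placeholder values; in practice these come from the measurements).
i_axis = np.linspace(0.0, 10.0, 51)           # phase current [A]
theta_axis = np.linspace(0.0, np.pi / 6, 61)  # rotor angle [rad]
I, TH = np.meshgrid(i_axis, theta_axis, indexing="ij")
psi = 0.05 * I * (1.0 + 0.5 * np.cos(6.0 * TH))  # dummy psi(i, theta) [Wb]

# Incremental inductance: L_inc(i, theta) = d(psi)/d(i) at fixed angle.
L_inc = np.gradient(psi, i_axis, axis=0)

# Motional back EMF at speed omega: e(i, theta) = omega * d(psi)/d(theta) at fixed current.
omega = 100.0                                 # rotor speed [rad/s]
back_emf = omega * np.gradient(psi, theta_axis, axis=1)

# Coenergy W'(i, theta) = integral_0^i psi(i', theta) di';
# electromagnetic torque T(i, theta) = dW'/d(theta) at fixed current.
coenergy = cumulative_trapezoid(psi, i_axis, axis=0, initial=0)
torque = np.gradient(coenergy, theta_axis, axis=1)

print(torque.shape)  # (51, 61): torque estimate over the same (i, theta) grid
```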
Abstract:
Purpose: Defocus curves are used to evaluate the subjective range of clear vision of presbyopic corrections, such as in eyes implanted with accommodating intraocular lenses (IOLs). This study determines whether letter sequences and/or lens presentation order ought to be randomised when measuring defocus curves. Methods: Defocus curves (range +2.00 DS to -2.00 DS) were measured on 18 pre-presbyopic subjects (mean age 24.1 ± 4.2 years) for six combinations of sequential or randomised positive or negative lens progression and non-randomised or randomised letter sequences. The letters were presented on a computerised logMAR chart at 6 m. Results: Overall there was a statistically significant difference between the six combinations (ANOVA, p < 0.05), attributable to the combination of non-randomised letters with non-randomised lens progression from negative to positive defocus (p < 0.01). This combination was significantly different from all other combinations when compared individually (Student's t-test, p < 0.003 on all comparisons) and was confirmed as the sole source of the overall significant difference. There was no statistically significant difference in defocus curve measurements if both lens presentation order and letter sequences were randomised compared with if only one of these variables was randomised (p > 0.05). Conclusion: Non-randomised letters and non-randomised lens progression on their own did not affect the subjective amplitude of accommodation as measured by defocus curves, although their combination should be avoided. © 2007 British Contact Lens Association.
Abstract:
Experimental data are very often realizations of a process that is fully determined by some unknown function and distorted by interference. Processing and analysis of such data are substantially easier if the data are represented by an analytical expression. This article presents an algorithm for processing experimental data and an example of its use for the spectrographic analysis of oncological blood preparations.
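The abstract does not spell out the algorithm, but the task it describes, representing noisy measurements by an analytical expression, is essentially nonlinear least-squares fitting. The model function, wavelength range, and noise level below are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical analytical model: a Gaussian peak on a linear baseline,
# a common shape for spectrographic data (an assumption, not the paper's model).
def model(x, amplitude, center, width, slope, offset):
    return amplitude * np.exp(-((x - center) ** 2) / (2.0 * width ** 2)) + slope * x + offset

rng = np.random.default_rng(0)
x = np.linspace(400.0, 700.0, 200)                   # e.g. wavelength [nm]
y_true = model(x, 1.0, 550.0, 20.0, 0.001, 0.1)
y_measured = y_true + rng.normal(0.0, 0.02, x.size)  # observations distorted by noise

# Fit the analytical expression to the distorted data.
popt, pcov = curve_fit(model, x, y_measured, p0=[1.0, 540.0, 15.0, 0.0, 0.0])
print("fitted parameters:", popt)
```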
Abstract:
Here we study the triples of integers (d, g, r) such that on a smooth projective curve of genus g there exists a stable vector bundle of rank r and degree d that is spanned by its global sections.
Abstract:
Let C = (C, g^1_4) be a tetragonal curve. We consider the scrollar invariants e_1, e_2, e_3 of g^1_4. We prove that if W^1_4(C) is a non-singular variety, then every g^1_4 ∈ W^1_4(C) has the same scrollar invariants.
Abstract:
Purpose: To evaluate the effect of reducing the number of visual acuity measurements made in a defocus curve on the quality of data quantified. Setting: Midland Eye, Solihull, United Kingdom. Design: Evaluation of a technique. Methods: Defocus curves were constructed by measuring visual acuity on a distance logMAR letter chart, randomizing the test letters between lens presentations. The lens powers evaluated ranged between +1.50 diopters (D) and -5.00 D in 0.50 D steps, which were also presented in a randomized order. Defocus curves were measured binocularly with the Tecnis diffractive, Rezoom refractive, Lentis rotationally asymmetric segmented (+3.00 D addition [add]), and Finevision trifocal multifocal intraocular lenses (IOLs) implanted bilaterally, and also for the diffractive IOL and refractive or rotationally asymmetric segmented (+3.00 D and +1.50 D adds) multifocal IOLs implanted contralaterally. Relative and absolute range of clear-focus metrics and area metrics were calculated for curves fitted using 0.50 D, 1.00 D, and 1.50 D steps and a near add-specific profile (ie, distance, half the near add, and the full near-add powers). Results: A significant difference in simulated results was found in at least 1 of the relative or absolute range of clear-focus or area metrics for each of the multifocal designs examined when the defocus-curve step size was increased (P<.05). Conclusion: Faster methods of capturing defocus curves from multifocal IOL designs appear to distort the metric results and are therefore not valid. Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned. © 2013 ASCRS and ESCRS.
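As a rough illustration of the kind of quantities the study compares, the sketch below computes an area metric and a range-of-clear-focus estimate from a sampled defocus curve (visual acuity in logMAR versus defocus in dioptres). The 0.3 logMAR criterion and the data are invented for the example; they are not the study's definitions or results:

```python
import numpy as np

# Hypothetical defocus curve: lens power [D] vs binocular acuity [logMAR].
defocus = np.arange(1.5, -5.001, -0.5)   # +1.50 D to -5.00 D in 0.50 D steps
acuity = np.array([0.30, 0.10, 0.00, 0.00, 0.05, 0.20, 0.30,
                   0.25, 0.15, 0.10, 0.30, 0.45, 0.60, 0.75])

criterion = 0.30   # assumed acuity cut-off for "clear focus" [logMAR]

# Sort by defocus so the integration axis is monotonic.
order = np.argsort(defocus)
x, y = defocus[order], acuity[order]

# Area metric: area between the criterion line and the curve where acuity
# is better (numerically smaller) than the criterion, by the trapezoid rule.
above = np.clip(criterion - y, 0.0, None)
area = float(np.sum(0.5 * (above[1:] + above[:-1]) * np.diff(x)))

# Range of clear focus: total defocus span over which acuity meets the
# criterion, approximated on the sampled grid (finer steps refine it).
step = 0.5
clear_range = step * np.count_nonzero(y <= criterion)

print(f"area metric ≈ {area:.2f} D·logMAR, range of clear focus ≈ {clear_range:.1f} D")
```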
Abstract:
Like classic Signal Detection Theory (SDT), the recent optimal Binary Signal Detection Theory (BSDT) and the Neural Network Assembly Memory Model (NNAMM) based on it can successfully reproduce Receiver Operating Characteristic (ROC) curves, although the BSDT/NNAMM parameters (cue intensity and neuron threshold) and the classic SDT parameters (perception distance and response bias) are essentially different. In the present work, the BSDT/NNAMM optimal likelihood and posterior probabilities are analysed analytically and used to generate ROCs, modified (posterior) mROCs, and the optimal overall likelihood and posterior. It is shown that, for the description of basic discrimination experiments in psychophysics within the BSDT, a ‘neural space’ can be introduced in which sensory stimuli are represented as neural codes and decision processes are defined; that the BSDT’s isobias curves can simultaneously be interpreted as universal psychometric functions satisfying the Neyman-Pearson objective; that the just noticeable difference (jnd) can be defined and interpreted as an atom of experience; and that near-neutral values of biases are observers’ natural choice. A uniformity (no-priming) hypothesis, concerning the ‘in-mind’ distribution of false-alarm probabilities during ROC or overall probability estimations, is introduced. The sensitivity, bias, ROC spaces and decision spaces of the BSDT and of classic SDT are compared.
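The BSDT/NNAMM formulas are not given in the abstract, but the classic SDT baseline it is compared against can be sketched: under the textbook equal-variance Gaussian model, sweeping the response criterion traces an ROC parameterised by the sensitivity d′. Everything below is that classic SDT model, not the BSDT:

```python
import numpy as np
from scipy.stats import norm

def classic_sdt_roc(d_prime: float, n_points: int = 200):
    """ROC for the equal-variance Gaussian SDT model with sensitivity d_prime.

    The criterion c is measured from the noise-distribution mean in standard
    deviation units; sweeping it traces the ROC from (1, 1) toward (0, 0).
    """
    c = np.linspace(-4.0, 4.0, n_points)
    false_alarm = norm.sf(c)            # P(respond "signal" | noise)  = 1 - Phi(c)
    hit = norm.sf(c - d_prime)          # P(respond "signal" | signal) = 1 - Phi(c - d')
    return false_alarm, hit

fa, hit = classic_sdt_roc(d_prime=1.5)
# The ROC lies above the chance diagonal; larger d' pushes it toward (0, 1).
print(f"at FA = {fa[100]:.2f}, hit rate = {hit[100]:.2f}")
```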
Abstract:
On the basis of the convolutional (Hamming) version of the recent Neural Network Assembly Memory Model (NNAMM) for an intact two-layer autoassociative Hopfield network, optimal receiver operating characteristics (ROCs) have been derived analytically. A method for explicitly taking into account the a priori probabilities of alternative hypotheses about the structure of the information initiating memory trace retrieval, and modified ROCs (mROCs, a posteriori probabilities of correct recall vs. false-alarm probability), are introduced. The comparison of empirical and calculated ROCs (or mROCs) demonstrates that they coincide quantitatively, and in this way the intensities of cues used in the corresponding experiments may be estimated. It has also been found that basic ROC properties, which are among the experimental findings underpinning dual-process models of recognition memory, can be explained within our one-factor NNAMM.
Abstract:
This article addresses the development of NURBS models of quadratic curves and surfaces. It considers curves and surfaces that can be represented by one general equation (one for the curves and one for the surfaces). The research examines the curves ellipse, parabola and hyperbola; the surfaces ellipsoid, paraboloid, hyperboloid, double hyperboloid, hyperbolic paraboloid and cone; and the elliptic, parabolic and hyperbolic cylinders. These geometric objects were chosen because many real objects that have to be modeled in 3D applications possess such specific features. Using the NURBS models presented here, specialized software modules (plug-ins) have been developed for a 3D graphics system, and an analysis of their implementation and of the primitives they create has been performed.
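As a small illustration of how a conic is represented exactly in NURBS form (the article's general construction is not reproduced here), a quarter circle can be written as a rational quadratic Bézier segment with weights (1, √2/2, 1). The evaluation routine below is a generic sketch, not the article's plug-in code:

```python
import numpy as np

def rational_quadratic_bezier(t, points, weights):
    """Evaluate a rational quadratic Bezier curve at parameter t in [0, 1]."""
    b = np.array([(1 - t) ** 2, 2 * t * (1 - t), t ** 2])   # Bernstein basis
    w = b * weights
    return (w[:, None] * points).sum(axis=0) / w.sum()

# Quarter of the unit circle from (1, 0) to (0, 1): an exact rational
# (NURBS-style) representation with middle weight cos(45 deg) = sqrt(2)/2.
points = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
weights = np.array([1.0, np.sqrt(2.0) / 2.0, 1.0])

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    x, y = rational_quadratic_bezier(t, points, weights)
    print(f"t = {t:.2f}: ({x:.4f}, {y:.4f}), radius = {np.hypot(x, y):.6f}")  # radius stays 1
```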
Abstract:
Recently Garashuk and Lisonek evaluated Kloosterman sums K(a) modulo 4 over the finite field F_{3^m} in the case of even K(a). They posed it as an open problem to characterize the elements a in F_{3^m} for which K(a) ≡ 1 (mod 4) and those for which K(a) ≡ 3 (mod 4). In this paper, we will give an answer to this problem. The result allows us to count the number of elements a in F_{3^m} belonging to each of these two classes.
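For reference, one commonly used normalisation of the ternary Kloosterman sum is given below; the exact convention in the cited work may differ (e.g. whether x = 0 is included). Here Tr denotes the absolute trace from F_{3^m} to F_3 and ω is a primitive cube root of unity:

```latex
% Ternary Kloosterman sum over F_{3^m} (one common normalisation):
\[
  K(a) \;=\; \sum_{x \in \mathbb{F}_{3^m}^{\ast}}
             \omega^{\operatorname{Tr}\left(x^{-1} + a x\right)},
  \qquad \omega = e^{2\pi i/3}, \qquad
  \operatorname{Tr}(y) = \sum_{j=0}^{m-1} y^{3^{j}}.
\]
```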
Abstract:
We present a new program tool for interactive 3D visualization of some fundamental algorithms for the representation and manipulation of Bézier curves. The tool includes an option demonstrating one of their most important applications: creating letter shapes in graphic design by means of cubic Bézier curves. We use a Java applet and JOGL as our main visualization technologies. This choice ensures the platform independence of the applet and contributes to realistic 3D visualization. The applet provides basic knowledge of Bézier curves and is appropriate for illustrative and educational purposes. Experimental results are included.
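One of the fundamental algorithms such a tool typically visualizes is de Casteljau's algorithm for evaluating a cubic Bézier curve. The short sketch below (in Python rather than the tool's Java/JOGL, with illustrative control points) shows the repeated linear interpolation on which that visualization is based:

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]   # one round of interpolation
    return pts[0]

# A cubic Bezier segment such as those used to outline letter shapes
# (control points chosen only for illustration).
cubic = [(0.0, 0.0), (0.3, 1.0), (0.7, 1.0), (1.0, 0.0)]

for t in (0.0, 0.5, 1.0):
    x, y = de_casteljau(cubic, t)
    print(f"B({t:.1f}) = ({x:.3f}, {y:.3f})")
```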