108 results for metrics
Abstract:
In a recent investigation, Landsat TM and ETM+ data were used to simulate different resolutions of remotely-sensed images (from 30 to 1100 m) and to analyze the effect of resolution on a range of landscape metrics associated with spatial patterns of forest fragmentation in Chapare, Bolivia since the mid-1980s. Whereas most metrics were found to be highly dependent on pixel size, several fractal metrics (DLFD, MPFD, and AWMPFD) were apparently independent of image resolution, in contradiction with a sizeable body of literature indicating that fractal dimensions of natural objects depend strongly on image characteristics. The present re-analysis of the Chapare images, using two alternative algorithms routinely used for the evaluation of fractal dimensions, shows that the values of the box-counting and information fractal dimensions are systematically larger, sometimes by as much as 85%, than the "fractal" indices DLFD, MPFD, and AWMPFD for the same images. In addition, the geometrical fractal features of the forest and non-forest patches in the Chapare region strongly depend on the resolution of the images used in the analysis. The largest dependency on resolution occurs for the box-counting fractal dimension in the case of the non-forest patches in 1993, where the difference between the 30 and 1100 m-resolution images corresponds to 24% of the full theoretical range (1.0 to 2.0) of the mass fractal dimension. The observation that the indices DLFD, MPFD, and AWMPFD, unlike the classical fractal dimensions, appear relatively unaffected by resolution in the case of the Chapare images seems due essentially to the fact that these indices are based on a heuristic, "non-geometric" approach to fractals. Because of their lack of a foundation in fractal geometry, nothing guarantees that these indices will be resolution-independent in general. (C) 2006 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier B.V. All rights reserved.
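The box-counting dimension referred to in this abstract can be estimated on a binary raster with a standard procedure: cover the image with boxes of decreasing side length and regress log N(s) on log(1/s). The sketch below is a generic, minimal estimator (box sizes and the square-mask assumption are illustrative), not the algorithm from the paper.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a square binary mask.

    For each box side s, count boxes containing at least one foreground
    pixel; the dimension is the slope of log N(s) versus log(1/s).
    """
    n = mask.shape[0]
    counts = []
    for s in sizes:
        c = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if mask[i:i + s, j:j + s].any():
                    c += 1
        counts.append(c)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

A filled patch yields a dimension near 2.0 and a one-pixel-wide line near 1.0, matching the theoretical range (1.0 to 2.0) quoted in the abstract.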
Abstract:
While Nalimov’s endgame tables for Western Chess are the most used today, their Depth-to-Mate metric is not the most efficient or effective in use. The authors have developed and used new programs to create tables to alternative metrics and recommend better strategies for endgame play.
Abstract:
While Nalimov’s endgame tables for Western Chess are the most used today, their Depth-to-Mate metric is not the only one and not the most effective in use. The authors have developed and used new programs to create tables to alternative metrics and recommend better strategies for endgame play.
Abstract:
The latest 6-man chess endgame results confirm that there are many deep forced mates beyond the 50-move rule. Players with potential wins near this limit naturally want to avoid a claim for a draw: optimal play to current metrics does not guarantee feasible wins or maximise the chances of winning against fallible opposition. A new metric and further strategies are defined which support players’ aspirations and improve their prospects of securing wins in the context of a k-move rule.
Abstract:
This article reports the combined results of several initiatives in creating and surveying complete suites of endgame tables (EGTs) to the Depth to Mate (DTM) and Depth to Conversion (DTC) metrics. Data on percentage results, maximals and mutual zugzwangs, mzugs, has been filed and made available on the web, as have the DTM EGTs.
Abstract:
Consider the statement "this project should cost X and has risk of Y". Such statements are used daily in industry as the basis for making decisions. The work reported here is part of a study aimed at providing a rational and pragmatic basis for such statements. Of particular interest are predictions made in the requirements and early phases of projects. A preliminary model has been constructed using Bayesian Belief Networks and, in support of this, a programme to collect and study data during the execution of various software development projects commenced in May 2002. The data collection programme is undertaken under the constraints of a commercial industrial regime of multiple concurrent small to medium scale software development projects. Guided by pragmatism, the work is predicated on the use of data that can be collected readily by project managers, including expert judgements, effort, elapsed times and metrics collected within each project.
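The core mechanics of a Bayesian Belief Network for this kind of cost/risk statement can be illustrated with a single parent-child pair: a prior over requirements clarity, a conditional probability table for schedule overrun, marginalisation, and a Bayes-rule update. The node names and probabilities below are invented for illustration; they are not the authors' model.

```python
# Prior: P(requirements are clear) -- illustrative value only.
P_CLEAR = 0.7

# CPT: P(schedule overrun | requirements clarity) -- illustrative values.
P_OVERRUN_GIVEN = {True: 0.2, False: 0.6}

def p_overrun():
    """Marginal probability of overrun, summing over the parent node."""
    return (P_CLEAR * P_OVERRUN_GIVEN[True]
            + (1 - P_CLEAR) * P_OVERRUN_GIVEN[False])

def p_clear_given_overrun():
    """Bayes' rule: belief that requirements were clear, after an overrun."""
    return P_CLEAR * P_OVERRUN_GIVEN[True] / p_overrun()
```

With these numbers the prior risk of overrun is 0.32, and observing an overrun lowers the belief in clear requirements from 0.70 to about 0.44; a real network would chain many such nodes.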
Abstract:
Many populations have recovered from severe bottlenecks either naturally or through intensive conservation management. In the past, however, few conservation programs have monitored the genetic health of recovering populations. We conducted a conservation genetic assessment of a small, reintroduced population of Mauritius Kestrel (Falco punctatus) to determine whether genetic deterioration has occurred since its reintroduction. We used pedigree analysis that partially accounted for individuals of unknown origin to document that (1) inbreeding occurred frequently (2.6% increase per generation; N-el = 18.9), (2) 25% of breeding pairs were composed of either closely or moderately related individuals, (3) genetic diversity has been lost from the population (1.6% loss per generation; N-ev = 32.1) less rapidly than the corresponding increase in inbreeding, and (4) ignoring the contribution of unknown individuals to a pedigree will bias the metrics derived from that pedigree, ultimately obscuring the prevailing genetic dynamics. The rates of inbreeding and loss of genetic variation in the subpopulation of Mauritius Kestrel we examined were extreme and among the highest yet documented in a wild vertebrate population. Thus, genetic deterioration may affect this population's long-term viability. Remedial conservation strategies are needed to reduce the impact of inbreeding and loss of genetic variation in this species. We suggest that schemes to monitor genetic variation after reintroduction should be an integral component of endangered species recovery programs.
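Pedigree-derived inbreeding coefficients of the kind reported here are conventionally computed with Wright's tabular method: build the numerator relationship matrix with parents entered before offspring, then read F as the diagonal minus one. The sketch below is that textbook method in minimal form, not the paper's analysis (which also handled individuals of unknown origin).

```python
def inbreeding_coefficients(pedigree):
    """Wright's tabular method for inbreeding coefficients.

    `pedigree` maps id -> (sire, dam), with (None, None) for founders;
    ids must be listed with parents before their offspring.
    Returns id -> F, where F = A[i][i] - 1 on the relationship matrix A.
    """
    ids = list(pedigree)
    idx = {a: k for k, a in enumerate(ids)}
    n = len(ids)
    A = [[0.0] * n for _ in range(n)]
    for i, a in enumerate(ids):
        s, d = pedigree[a]
        si = idx[s] if s is not None else None
        di = idx[d] if d is not None else None
        for j in range(i):
            # relationship to an earlier animal: average of its
            # relationships to this animal's parents
            r = 0.0
            if si is not None:
                r += 0.5 * A[j][si]
            if di is not None:
                r += 0.5 * A[j][di]
            A[i][j] = A[j][i] = r
        # self-relationship: 1 plus half the parents' relationship
        if si is not None and di is not None:
            A[i][i] = 1.0 + 0.5 * A[si][di]
        else:
            A[i][i] = 1.0
    return {a: A[idx[a]][idx[a]] - 1.0 for a in ids}
```

For example, the offspring of a full-sib mating has F = 0.25, while founders have F = 0.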
Abstract:
The introduction of metrics, league tables, performance targets, research assessment exercises and a range of other pressures placed by society, funding bodies and employers on scholars, teachers and students has resulted in diminished value being placed on the essential ethical criterion of truth. This reduced valuation of truth has a huge impact on the standing of science, not least horticultural science, in the eyes of the general public at a time when this should be a primary concern. This contribution discusses examples of the impact of diminished valuation of truth, its causes, its consequences and the remedies that are needed.
Abstract:
Graphical tracking is a technique for crop scheduling where the actual plant state is plotted against an ideal target curve which encapsulates all crop and environmental characteristics. Management decisions are made on the basis of the position of the actual crop against the ideal position. Due to the simplicity of the approach it is possible for graphical tracks to be developed on site without the requirement for controlled experimentation. Growth models and graphical tracks are discussed, and an implementation of the Richards curve for graphical tracking is described. In many cases, the more intuitively desirable growth models perform sub-optimally due to problems with the specification of starting conditions, environmental factors outside the scope of the original model and the introduction of new cultivars. Accurate specification of a biological model requires detailed and usually costly study, and as such is not adaptable to a changing cultivar range and changing cultivation techniques. Fitting of a new graphical track for a new cultivar can be conducted on site and improved over subsequent seasons. Graphical tracking emphasises the current position relative to the objective, and as such does not require the time-consuming or system-specific input of an environmental history, although it does require detailed crop measurement. The approach is flexible and could be applied to a variety of specification metrics, with digital imaging providing a route for added value. For decision making regarding crop manipulation from the observed current state, there is a role for simple predictive modelling to indicate the short-term consequences of crop manipulation.
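A graphical-tracking target built on the Richards curve can be sketched very compactly: evaluate the curve at the current time and classify the measured crop state as ahead of, behind, or on the track. The parameterisation below (asymptote A, rate k, timing T, shape nu, where nu = 1 recovers the logistic) and the 5% tolerance band are illustrative assumptions, not the paper's implementation.

```python
import math

def richards(t, A, k, T, nu):
    """Richards growth curve: upper asymptote A, growth rate k,
    timing parameter T, shape parameter nu (nu = 1 gives the logistic)."""
    return A * (1.0 + nu * math.exp(-k * (t - T))) ** (-1.0 / nu)

def track_status(t, measured, params, tol=0.05):
    """Compare a measured crop state against the ideal target curve."""
    target = richards(t, *params)
    if measured > target * (1 + tol):
        return "ahead"
    if measured < target * (1 - tol):
        return "behind"
    return "on track"
```

A grower would fit `params` to past seasons on site and then, each week, act only on whether the crop is ahead of or behind the track, with no environmental history required.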
Abstract:
One of the most common decisions we make is the one about where to move our eyes next. Here we examine the impact that processing the evidence supporting competing options has on saccade programming. Participants were asked to saccade to one of two possible visual targets indicated by a cloud of moving dots. We varied the evidence which supported saccade target choice by manipulating the proportion of dots moving towards one target or the other. The task was found to become easier as the evidence supporting target choice increased. This was reflected in an increase in percent correct and a decrease in saccade latency. The trajectory and landing position of saccades were found to deviate away from the non-selected target reflecting the choice of the target and the inhibition of the non-target. The extent of the deviation was found to increase with amount of sensory evidence supporting target choice. This shows that decision-making processes involved in saccade target choice have an impact on the spatial control of a saccade. This would seem to extend the notion of the processes involved in the control of saccade metrics beyond a competition between visual stimuli to one also reflecting a competition between options.
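The qualitative findings (higher accuracy and shorter saccade latency as motion evidence increases) are the signature of evidence accumulation to a bound. The toy race model below is a generic sketch of that idea under assumed parameters (drift proportional to coherence, unit Gaussian noise, symmetric bounds), not the authors' analysis.

```python
import random

def simulate_saccade_choice(coherence, threshold=20.0, noise=1.0,
                            trials=2000, seed=1):
    """Toy evidence-accumulation model of saccade target choice.

    Evidence drifts toward the correct target at a rate set by motion
    coherence; the first bound crossed determines the choice.
    Returns (proportion correct, mean latency in accumulation steps).
    """
    rng = random.Random(seed)
    correct = 0
    total_steps = 0
    for _ in range(trials):
        x, steps = 0.0, 0
        while abs(x) < threshold:
            x += coherence + rng.gauss(0.0, noise)
            steps += 1
        correct += x > 0
        total_steps += steps
    return correct / trials, total_steps / trials
```

Running the model at high versus low coherence reproduces the reported pattern: percent correct rises and latency falls as the evidence supporting the target choice increases.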
Abstract:
Inhibition is intimately involved in the ability to select a target for a goal-directed movement. The effect of distracters on the deviation of oculomotor trajectories and landing positions provides evidence of such inhibition. Individual saccade trajectories and landing positions may deviate initially either towards, or away from, a competing distracter; the direction and extent of this deviation depend upon saccade latency and the target-to-distracter separation. However, the underlying commonality of the sources of oculomotor inhibition has not been investigated. Here we report the relationship between distracter-related deviation of saccade trajectory, landing position and saccade latency. Observers saccaded to a target which could be accompanied by a distracter shown at various distances from very close (10 angular degrees) to far away (120 angular degrees). A fixation-gap paradigm was used to manipulate latency independently of the influence of competing distracters. When distracters were close to the target, saccade trajectory and landing position deviated toward the distracter position, while at greater separations landing position was always accurate but trajectories deviated away from the distracters. Different spatial patterns of deviations across latency were found. This pattern of results is consistent with the metrics of the saccade reflecting coarse pooling of the ongoing activity at the distracter location: saccade trajectory reflects activity at saccade initiation while landing position reveals activity at saccade end. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Saccadic eye movements and fixations are the behavioral means by which we visually sample text during reading. Human oculomotor control is governed by a complex neurophysiological system involving the brain stem, superior colliculus, and several cortical areas [1, 2]. A very widely held belief among researchers investigating primate vision is that the oculomotor system serves to orient the visual axes of both eyes to fixate the same target point in space. It is argued that such precise positioning of the eyes is necessary to place images on corresponding retinal locations, such that on each fixation a single, nondiplopic, visual representation is perceived [3]. Vision works actively through a continual sampling process involving saccades and fixations [4]. Here we report that during normal reading, the eyes do not always fixate the same letter within a word. We also demonstrate that saccadic targeting is yoked and based on a unified cyclopean percept of a whole word since it is unaffected if different word parts are delivered exclusively to each eye via a dichoptic presentation technique. These two findings together suggest that the visual signal from each eye is fused at a very early stage in the visual pathway, even when the fixation disparity is greater than one character (0.29 deg), and that saccade metrics for each eye are computed on the basis of that fused signal.
Abstract:
Frequency recognition is an important task in many engineering fields such as audio signal processing and telecommunications engineering, for example in applications like Dual-Tone Multi-Frequency (DTMF) detection or the recognition of the carrier frequency of a Global Positioning System (GPS) signal. This paper will present results of investigations on several common Fourier Transform-based frequency recognition algorithms implemented in real time on a Texas Instruments (TI) TMS320C6713 Digital Signal Processor (DSP) core. In addition, suitable metrics will be evaluated in order to ascertain which of these selected algorithms is appropriate for audio signal processing.
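A standard Fourier-based technique for DTMF-style detection is the Goertzel algorithm, which evaluates the power of a single DFT bin without a full FFT and is therefore well suited to fixed-point DSP cores. The sketch below is a generic Python reference of the classic algorithm, not the paper's TMS320C6713 implementation; the 205-sample block at 8 kHz in the usage example is a commonly used DTMF configuration.

```python
import math

def goertzel_power(samples, target_freq, sample_rate):
    """Goertzel algorithm: squared magnitude of the DFT bin nearest
    `target_freq`, computed with a single second-order recursion."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # power of bin k from the final two recursion states
    return s_prev * s_prev + s_prev2 * s_prev2 - coeff * s_prev * s_prev2
```

Feeding a 770 Hz tone through detectors tuned to each DTMF row frequency makes the 770 Hz bin dominate, which is the decision rule a DTMF receiver applies per block.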
Abstract:
Single point interaction haptic devices do not provide the natural grasp and manipulations found in the real world, as afforded by multi-fingered haptics. The present study investigates a two-fingered grasp manipulation involving rotation with and without force feedback. There were three visual cue conditions: monocular, binocular and projective lighting. Performance metrics of time and positional accuracy were assessed. The results indicate that adding haptics to an object manipulation task increases the positional accuracy but slightly increases the overall time taken.
Abstract:
This paper presents a novel two-pass algorithm for block-based motion compensation, constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS). On the basis of research from previous algorithms, especially an on-the-edge motion estimation algorithm called hexagonal search (HEXBS), we propose the LHMEA and the Two-Pass Algorithm (TPA). We introduce hashtables into video compression. In this paper we employ the LHMEA for the first-pass search over all the Macroblocks (MB) in the picture. Motion Vectors (MV) are then generated from the first pass and are used as predictors for the second-pass HEXBS motion estimation, which only searches a small number of MBs. The evaluation of the algorithm considers three important metrics: time, compression rate and PSNR. The performance of the algorithm is evaluated by using standard video sequences and the results are compared to current algorithms. Experimental results show that the proposed algorithm can offer the same compression rate as the Full Search. LHMEA with TPA is a significant improvement on HEXBS and shows a direction for improving other fast motion estimation algorithms, for example Diamond Search.
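The hexagon-based search half of the scheme can be sketched as a sum-of-absolute-differences (SAD) block matcher that walks a large hexagon pattern until its centre is the best candidate, then refines with a small square pattern. This is a generic HEXBS-style sketch (frame layout, block size and patterns are the usual textbook choices), not the paper's LHMEA/TPA code.

```python
import numpy as np

# Large hexagon pattern (centre plus six vertices) and small refine pattern.
HEX = [(0, 0), (-2, 0), (2, 0), (-1, 2), (1, 2), (-1, -2), (1, -2)]
SQUARE = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]

def sad(cur, ref, bx, by, dx, dy, bs=16):
    """SAD between the current block at (bx, by) and the reference block
    displaced by (dx, dy); infinite cost if the candidate leaves the frame."""
    h, w = ref.shape
    x, y = bx + dx, by + dy
    if x < 0 or y < 0 or x + bs > w or y + bs > h:
        return float("inf")
    return float(np.abs(cur[by:by + bs, bx:bx + bs] - ref[y:y + bs, x:x + bs]).sum())

def hex_search(cur, ref, bx, by, start=(0, 0), bs=16):
    """Hexagon-based search: descend with the large hexagon until the centre
    wins, then refine the motion vector with the small square pattern."""
    cx, cy = start
    while True:
        best = min(HEX, key=lambda p: sad(cur, ref, bx, by, cx + p[0], cy + p[1], bs))
        if best == (0, 0):
            break
        cx, cy = cx + best[0], cy + best[1]
    best = min(SQUARE, key=lambda p: sad(cur, ref, bx, by, cx + p[0], cy + p[1], bs))
    return cx + best[0], cy + best[1]
```

In the two-pass scheme described above, the hashtable pass would supply `start` as a predicted motion vector, so the hexagon descent only has to cover a short residual displacement.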