141 results for SPECKLE-TRACKING
Abstract:
Incorporating ecological processes and animal behaviour into Species Distribution Models (SDMs) is difficult. In species with a central resting or breeding place, there can be a conflict between the environmental requirements of the 'central place' and the foraging habitat. We apply a multi-scale SDM to examine habitat trade-offs between the central place, roost sites, and foraging habitat in Myotis nattereri. We validate these derived associations using habitat selection from behavioural observations of radio-tracked bats. A Generalised Linear Model (GLM) of roost occurrence using land cover variables at mixed spatial scales indicated that roost occurrence was positively associated with woodland on a fine scale and pasture on a broad scale. Habitat selection of radio-tracked bats mirrored the SDM, with bats selecting woodland in the immediate vicinity of individual roosts but avoiding this habitat in foraging areas, whilst pasture was significantly positively selected in foraging areas. Using habitat selection derived from radio-tracking enables a multi-scale SDM to be interpreted in a behavioural context. We suggest that the multi-scale SDM of M. nattereri describes a trade-off between the central place and foraging habitat. Multi-scale methods provide a greater understanding of the ecological processes which determine where species occur and allow integration of behavioural processes into SDMs. The findings have implications when assessing the resource use of a species at a single point in time: doing so could lead to misinterpretation of habitat requirements, as these can change within a short time period depending on specific behaviour, particularly if detectability changes with behaviour. © 2011 Gesellschaft für Ökologie.
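As a rough illustration of the kind of model described above, the sketch below fits a binomial GLM of roost occurrence against land-cover predictors measured at two spatial scales. The column names, buffer sizes and synthetic data are illustrative assumptions, not the study's actual covariates or data.

```python
# Minimal sketch of a multi-scale binomial GLM of roost occurrence.
# "woodland_250m" (fine scale) and "pasture_3km" (broad scale) are assumed
# example buffers; the data are synthetic so the example runs end-to-end.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "woodland_250m": rng.uniform(0, 100, n),   # % woodland cover, fine scale
    "pasture_3km": rng.uniform(0, 100, n),     # % pasture cover, broad scale
})
# synthetic presence/absence generated from an assumed logistic relationship
logit = -3 + 0.03 * df.woodland_250m + 0.02 * df.pasture_3km
df["roost_present"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.glm("roost_present ~ woodland_250m + pasture_3km",
                data=df, family=sm.families.Binomial()).fit()
print(model.summary())
```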
Abstract:
Reliable detection of JAK2-V617F is critical for accurate diagnosis of myeloproliferative neoplasms (MPNs); in addition, sensitive mutation-specific assays can be applied to monitor disease response. However, there has been no consistent approach to JAK2-V617F detection, with assays varying markedly in performance, affecting clinical utility. Therefore, we established a network of 12 laboratories from seven countries to systematically evaluate nine different DNA-based quantitative PCR (qPCR) assays, including those in widespread clinical use. Seven quality control rounds involving over 21,500 qPCR reactions were undertaken using centrally distributed cell line dilutions and plasmid controls. The two best-performing assays were tested on normal blood samples (n=100) to evaluate assay specificity, followed by analysis of serial samples from 28 patients transplanted for JAK2-V617F-positive disease. The most sensitive assay, which performed consistently across a range of qPCR platforms, predicted outcome following transplant, with the mutant allele detected a median of 22 weeks (range 6-85 weeks) before relapse. Four of seven patients achieved molecular remission following donor lymphocyte infusion, indicative of a graft-versus-MPN effect. This study has established a robust, reliable assay for sensitive JAK2-V617F detection, suitable for assessing response in clinical trials, predicting outcome and guiding management of patients undergoing allogeneic transplant.
Abstract:
Processor architectures have taken a turn towards many-core processors, which integrate multiple processing cores on a single chip to increase overall performance, and there are no signs that this trend will stop in the near future. Many-core processors are harder to program than multi-core and single-core processors due to the need to write parallel or concurrent programs with high degrees of parallelism. Moreover, many-core processors have to operate in a strong-scaling regime because of memory bandwidth constraints. Under strong scaling, increasingly fine-grained parallelism must be extracted in order to keep all processing cores busy.
Task dataflow programming models have a high potential to simplify parallel programming because they relieve the programmer of having to identify precisely all inter-task dependences when writing programs. Instead, the task dataflow runtime system detects and enforces inter-task dependences during execution based on a description of the memory each task accesses. The runtime constructs a task dataflow graph that captures all tasks and their dependences. Tasks are then scheduled to execute in parallel, taking into account the dependences specified in the task graph.
Several papers report significant overheads for task dataflow systems, which severely limit the scalability and usability of such systems. In this paper we study efficient schemes to manage task graphs and analyze their scalability. We assume a programming model that supports input, output and in/out annotations on task arguments, as well as commutative in/out and reductions. We analyze the structure of task graphs and identify versions and generations as key concepts for efficient management of task graphs. We then present three schemes to manage task graphs, building on graph representations, hypergraphs and lists. We also consider a fourth, edge-less scheme that synchronizes tasks using integers. Analysis using micro-benchmarks shows that the graph representation is not always scalable and that the edge-less scheme introduces the least overhead in nearly all situations.
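As a rough illustration of the edge-less idea mentioned above, the sketch below derives dependences from in/out annotations on task arguments and tracks them with per-task integer counters rather than explicit graph edges. All names (Task, Runtime, submit, finish) are assumptions for illustration, not the API of the system studied in the paper.

```python
# Minimal sketch of an "edge-less" task dataflow runtime: each task keeps a
# single integer counting its unsatisfied dependences instead of storing edges.
from collections import defaultdict

class Task:
    def __init__(self, name, ins=(), outs=()):
        self.name, self.ins, self.outs = name, list(ins), list(outs)
        self.pending = 0      # number of unsatisfied dependences (the sync state)
        self.waiters = []     # tasks to notify when this task finishes
        self.done = False

class Runtime:
    def __init__(self):
        self.last_writer = {}             # object -> task that last wrote it
        self.readers = defaultdict(list)  # object -> readers since that write
        self.ready = []                   # tasks with no pending dependences

    def _depend(self, task, producer):
        if producer is not None and producer is not task and not producer.done:
            producer.waiters.append(task)
            task.pending += 1

    def submit(self, task):
        for obj in task.ins:                          # read-after-write
            self._depend(task, self.last_writer.get(obj))
            self.readers[obj].append(task)
        for obj in task.outs:                         # write-after-write / write-after-read
            self._depend(task, self.last_writer.get(obj))
            for r in self.readers[obj]:
                self._depend(task, r)
            self.readers[obj] = []
            self.last_writer[obj] = task
        if task.pending == 0:
            self.ready.append(task)

    def finish(self, task):
        task.done = True
        if task in self.ready:
            self.ready.remove(task)
        for w in task.waiters:                        # release waiting tasks
            w.pending -= 1
            if w.pending == 0:
                self.ready.append(w)

# usage: the consumer becomes ready only after the producer finishes
rt = Runtime()
a = Task("produce", outs=["x"]); rt.submit(a)
b = Task("consume", ins=["x"]); rt.submit(b)
rt.finish(a)          # now rt.ready contains b
```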
Abstract:
This paper presents generalized Laplacian eigenmaps, a novel dimensionality reduction approach designed to address stylistic variations in time series. It generates compact and coherent continuous spaces whose geometry is data-driven. This paper also introduces a graph-based particle filter, a novel methodology conceived for efficient tracking in a low-dimensional space derived from a spectral dimensionality reduction method. Its strengths are a propagation scheme that facilitates prediction in time and style, and a noise model coherent with the manifold, which prevents divergence and increases robustness. Experiments show that a combination of both techniques achieves state-of-the-art performance for human pose tracking in underconstrained scenarios.
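The generalized variant is the paper's contribution and is not reproduced here; as background, the sketch below shows standard Laplacian eigenmaps, which embeds frames of a time series into a low-dimensional space via the graph-Laplacian eigenproblem. The neighbourhood size, kernel width and target dimension are illustrative assumptions.

```python
# Background sketch of standard Laplacian eigenmaps (not the paper's
# "generalized" variant): build a k-nearest-neighbour graph, form the graph
# Laplacian and keep the smallest non-trivial generalized eigenvectors.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, dim=3, k=10, sigma=1.0):
    """X: (n_frames, n_features) -> (n_frames, dim) low-dimensional embedding."""
    D = cdist(X, X)                              # pairwise distances
    W = np.zeros_like(D)
    idx = np.argsort(D, axis=1)[:, 1:k + 1]      # k nearest neighbours (skip self)
    for i, nbrs in enumerate(idx):
        W[i, nbrs] = np.exp(-D[i, nbrs] ** 2 / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                       # symmetrise the graph
    Dg = np.diag(W.sum(axis=1))                  # degree matrix
    L = Dg - W                                   # graph Laplacian
    # generalized eigenproblem L v = lambda Dg v; drop the trivial eigenvector
    vals, vecs = eigh(L, Dg)
    return vecs[:, 1:dim + 1]
```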
Abstract:
Proteinuria originates from the kidney and occurs as a result of injury to either the glomerulus or the renal tubule, or both. It is relatively common in the general population, with a reported point prevalence of up to 8%, but the prevalence falls to around 2% on repeated testing. Chronic glomerular injury resulting in proteinuria may be secondary to a prolonged duration of diabetes or hypertension. A tubular origin of proteinuria may be associated with inflammation of the renal tubules triggered by prescribed drugs or ingested toxins. In the absence of obvious clues to the cause of persistent proteinuria on history or clinical examination, it is worthwhile reviewing the patient's prescribed drugs to identify any potentially nephrotoxic agents, e.g. NSAIDs. NICE guidelines recommend screening for proteinuria in individuals at higher risk of chronic kidney disease (CKD). These include patients with diabetes, hypertension, cardiovascular disease, connective tissue disorders, a family history of renal disease, and those prescribed potentially nephrotoxic drugs. Patients with sudden onset of lower limb oedema and associated proteinuria should have a serum albumin level measured to exclude the nephrotic syndrome. Renal tract ultrasound will measure kidney size and detect scarring associated with chronic pyelonephritis or prior renal stone disease, which can cause proteinuria.
Abstract:
Object tracking is an active research area due to its importance in human-computer interfaces, teleconferencing and video surveillance. However, reliable tracking of objects in the presence of occlusions, pose and illumination changes is still a challenging topic. In this paper, we introduce a novel tracking approach that fuses two cues, namely colour and spatio-temporal motion energy, within a particle filter based framework. We compute a measure of coherent motion over two image frames, which reveals the spatio-temporal dynamics of the target. At the same time, the importance of the colour and motion energy cues is determined in a reliability evaluation stage, which helps maintain the performance of the tracking system under abrupt appearance changes. Experimental results demonstrate that the proposed method outperforms other state-of-the-art techniques on the test datasets used.
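A minimal sketch of how reliability-weighted fusion of two cues might enter a particle-filter update is given below. The entropy-based reliability measure and the geometric fusion rule are assumptions chosen for illustration, not the authors' exact formulation.

```python
# Sketch: fuse per-particle colour and motion-energy likelihoods with weights
# derived from how "peaked" (and therefore how reliable) each cue currently is.
import numpy as np

def fuse_and_resample(particles, colour_lik, motion_lik, rng=np.random):
    """particles: (N, d) states; colour_lik, motion_lik: (N,) positive likelihoods."""
    def reliability(lik):
        # a peaked likelihood (low normalised entropy) is trusted more
        p = lik / (lik.sum() + 1e-12)
        ent = -np.sum(p * np.log(p + 1e-12)) / np.log(len(p))
        return 1.0 - ent                       # in [0, 1]
    r_c, r_m = reliability(colour_lik), reliability(motion_lik)
    a = r_c / (r_c + r_m + 1e-12)              # normalised colour weight
    # reliability-weighted geometric fusion of the two cues
    w = (colour_lik ** a) * (motion_lik ** (1.0 - a)) + 1e-12
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], w
```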
Abstract:
High-cadence, multiwavelength observations and simulations are employed for the analysis of solar photospheric magnetic bright points (MBPs) in the quiet Sun. The observations were obtained with the Rapid Oscillations in the Solar Atmosphere (ROSA) imager and the Interferometric Bidimensional Spectrometer at the Dunn Solar Telescope. Our analysis reveals that photospheric MBPs have an average transverse velocity of approximately 1 km s⁻¹, whereas their chromospheric counterparts have a slightly higher average velocity of 1.4 km s⁻¹. Additionally, chromospheric MBPs were found to be around 63 per cent larger than the equivalent photospheric MBPs. These velocity values were compared with the output of numerical simulations generated using the MURaM code. The simulated results were similar to, but slightly higher than, the observed values. An average velocity of 1.3 km s⁻¹ was found in the simulated G-band images and an average of 1.8 km s⁻¹ was seen in the velocity domain at a height of 500 km above the continuum formation layer. Delays in the change of velocities were also analysed. Average delays of ~4 s between layers of the simulated data set were established and values of ~29 s were observed between G-band and Ca II K ROSA observations. The delays in the simulations are likely to be the result of oblique granular shock waves, whereas those found in the observations are possibly the result of a semi-rigid flux tube.
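The abstract does not specify how the inter-layer delays were measured; one generic way to estimate such a delay is to cross-correlate the velocity time series from two heights, as sketched below with an assumed cadence and lag range.

```python
# Hedged illustration only: estimate the lag between velocity signatures at
# two atmospheric heights by maximising their normalised cross-correlation.
import numpy as np

def delay_seconds(v_lower, v_upper, cadence=2.0, max_lag=30):
    """Return the lag in seconds (positive if v_upper lags v_lower)."""
    a = (v_lower - v_lower.mean()) / (v_lower.std() + 1e-12)
    b = (v_upper - v_upper.mean()) / (v_upper.std() + 1e-12)
    def corr_at(k):                      # correlate a[t] with b[t + k]
        if k >= 0:
            x, y = a[:len(a) - k], b[k:]
        else:
            x, y = a[-k:], b[:len(b) + k]
        return np.dot(x, y) / max(len(x), 1)
    best = max(range(-max_lag, max_lag + 1), key=corr_at)
    return best * cadence
```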
Abstract:
We introduce a novel dual-stage algorithm for online multi-target tracking in realistic conditions. In the first stage, the problem of data association between tracklets and detections under partial occlusion is addressed using a novel occlusion-robust appearance similarity method. This is used to robustly link tracklets with detections without requiring explicit knowledge of the occluded regions. In the second stage, tracklets are linked using a novel method of constraining the linking process that removes the need for ad hoc tracklet-linking rules. In this method, links between tracklets are permitted based on their agreement with optical flow evidence. Tests of this new tracking system have been performed using several public datasets.
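A minimal sketch of the second-stage idea, permitting a tracklet link only when it agrees with accumulated optical-flow evidence, is given below. The data layout, flow source and tolerance are illustrative assumptions, not the authors' implementation.

```python
# Sketch: propagate the end of one tracklet forward through per-frame dense
# flow fields and allow a link only if it lands near the start of the next.
import numpy as np

def flow_consistent(tracklet_a, tracklet_b, flow_fields, tol=5.0):
    """tracklet_a ends before tracklet_b starts (in time).
    tracklet_*: list of (frame, x, y); flow_fields: dict frame -> (H, W, 2) flow."""
    f_end, x, y = tracklet_a[-1]
    f_start, xb, yb = tracklet_b[0]
    # integrate flow from the end of tracklet_a up to the start of tracklet_b
    for f in range(f_end, f_start):
        flow = flow_fields[f]
        h, w = flow.shape[:2]
        xi = int(np.clip(np.rint(x), 0, w - 1))
        yi = int(np.clip(np.rint(y), 0, h - 1))
        dx, dy = flow[yi, xi]
        x, y = x + dx, y + dy
    # permit the link only if the flow-propagated position lands near tracklet_b
    return np.hypot(x - xb, y - yb) <= tol
```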
Abstract:
In this paper, we propose a novel visual tracking framework based on a decision-theoretic online learning algorithm, namely NormalHedge. To make NormalHedge more robust against noise, we propose an adaptive NormalHedge algorithm, which exploits the historical information of each expert to perform more accurate prediction than the standard NormalHedge. Technically, we use a set of weighted experts to predict the state of the target to be tracked over time. The weight of each expert is learned online by pushing the cumulative regret of the learner towards that of the expert. Our simulation experiments demonstrate the effectiveness of the proposed adaptive NormalHedge compared to the standard NormalHedge method. Furthermore, the experimental results on several challenging video sequences show that the proposed tracking method outperforms several state-of-the-art methods.
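For reference, the sketch below implements the standard NormalHedge weight update (Chaudhuri, Freund and Hsu, 2009) that the tracker builds on; the paper's adaptive variant, which additionally exploits each expert's history, is not reproduced here, and the per-frame expert losses are assumed to be supplied by the tracker.

```python
# Background sketch of standard NormalHedge: weights depend on each expert's
# (truncated) cumulative regret through a potential whose scale c solves
# mean(exp(R_+^2 / (2c))) = e.
import numpy as np

def normalhedge_weights(cum_regret):
    """cum_regret: (N,) cumulative regrets R_i; returns a distribution over experts."""
    R = np.maximum(cum_regret, 0.0)
    if not np.any(R > 0):
        return np.full(len(R), 1.0 / len(R))       # no expert ahead: stay uniform
    phi = lambda c: np.mean(np.exp(R ** 2 / (2.0 * c)))
    lo, hi = 1e-12, 1.0
    while phi(hi) > np.e:                          # grow until the potential drops below e
        hi *= 2.0
    for _ in range(100):                           # bisection for phi(c) = e
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if phi(mid) > np.e else (lo, mid)
    c = 0.5 * (lo + hi)
    w = (R / c) * np.exp(R ** 2 / (2.0 * c))
    return w / w.sum()

def update_regrets(cum_regret, expert_losses, weights):
    """One round: learner's loss is the weighted expert loss; regret accumulates."""
    learner_loss = np.dot(weights, expert_losses)
    return cum_regret + (learner_loss - expert_losses)
```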
Abstract:
In this paper a 3D human pose tracking framework is presented. A new dimensionality reduction method (Hierarchical Temporal Laplacian Eigenmaps) is introduced to represent activities in hierarchies of low-dimensional spaces. Such a hierarchy provides increasing independence between limbs, allowing higher flexibility and adaptability that result in improved accuracy. Moreover, a novel deterministic optimisation method (Hierarchical Manifold Search) is applied to efficiently estimate the positions of the corresponding body parts. Finally, evaluation on public datasets such as HumanEva demonstrates that our approach achieves a 62.5–65 mm average joint error for the walking activity and outperforms state-of-the-art methods in terms of accuracy and computational cost.
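A heavily simplified sketch of a hierarchical, coarse-to-fine search of the kind described above is given below: the pose is estimated in a parent low-dimensional space first, then refined per limb conditioned on the parent estimate. The spaces, decoders and cost function are placeholders, not the paper's Hierarchical Manifold Search.

```python
# Sketch only: evaluate candidates level by level in a hierarchy of
# low-dimensional spaces, keeping the lowest-cost candidate at each level.
import numpy as np

def hierarchical_search(spaces, cost, n_candidates=50, rng=np.random.default_rng(0)):
    """spaces: list of (dim, decode) pairs ordered parent -> children;
    decode(z, partial_pose) -> updated pose dict; cost(pose) -> scalar error."""
    pose = {}
    for dim, decode in spaces:
        # candidates in this level's low-dimensional space (placeholder sampling)
        cands = rng.normal(size=(n_candidates, dim))
        # keep the candidate whose decoded pose has the lowest image cost
        best = min(cands, key=lambda z: cost(decode(z, dict(pose))))
        pose = decode(best, pose)
    return pose
```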