965 results for TRACKING ANALYSIS
Abstract:
This thesis presents an in-depth analysis of how direct methods such as Lucas-Kanade and Inverse Compositional can be applied to RGB-D images. The capability and accuracy of these methods are also analyzed through a series of synthetic experiments. These simulate the effects produced by RGB images, depth (D) images and RGB-D images, so that the different combinations can be evaluated. Moreover, these methods are analyzed without any additional technique that modifies the original algorithm or aids it in its search for a global optimum, unlike in most of the articles found in the literature.
Our goal is to understand when and why these methods converge or diverge, so that in the future the knowledge extracted from the results presented here can effectively help a potential implementer. After reading this thesis, the implementer should be able to decide which algorithm fits a particular task best, and should also know which problems have to be addressed in each algorithm so that an appropriate correction can be implemented using additional techniques. These additional techniques are outside the scope of this thesis; however, they are reviewed from the literature.
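As a rough illustration of the kind of direct method discussed above, here is a minimal sketch of a forward-additive Lucas-Kanade iteration for a pure-translation warp on a smooth synthetic image. The thesis covers far more general settings (and the Inverse Compositional variant); the function names and the synthetic image below are illustrative assumptions, not the thesis's code.

```python
import numpy as np

def make_image(h, w, dx=0.0, dy=0.0):
    # Smooth synthetic intensity pattern translated by (dx, dy).
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    return np.sin(0.2 * (xs + dx)) + np.cos(0.15 * (ys + dy))

def lucas_kanade_translation(template, image_fn, iters=50):
    """Forward-additive Lucas-Kanade for a pure-translation warp W(x; p) = x + p.

    `image_fn(dx, dy)` resamples the moving image at offset (dx, dy); each
    iteration linearizes the residual and solves the normal equations.
    """
    p = np.zeros(2)                                     # current (dx, dy) estimate
    for _ in range(iters):
        warped = image_fn(p[0], p[1])                   # I(W(x; p))
        err = (template - warped).ravel()               # residual T - I(W(x; p))
        gy, gx = np.gradient(warped)                    # spatial image gradients
        J = np.stack([gx.ravel(), gy.ravel()], axis=1)  # dI/dp for a translation
        dp, *_ = np.linalg.lstsq(J, err, rcond=None)    # Gauss-Newton step
        p += dp
        if np.linalg.norm(dp) < 1e-9:                   # converged
            break
    return p
```

On a 64x64 synthetic pair offset by (1.3, -0.7) pixels, the loop recovers the offset to subpixel precision; with real images the basin of convergence and noise behaviour are exactly the issues the thesis's experiments probe.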
Abstract:
The structural connectivity of the brain is considered to encode species-wise and subject-wise patterns that will unlock large areas of understanding of the human brain. Currently, diffusion MRI of the living brain makes it possible to map the microstructure of tissue and to track the pathways of fiber bundles connecting the cortical regions across the brain. These bundles are summarized in a network representation called the connectome, which is analyzed using graph theory. The extraction of the connectome from diffusion MRI requires a large processing flow including image enhancement, reconstruction, segmentation, registration, diffusion tracking, etc. Although a concerted effort has been devoted to the definition of standard pipelines for connectome extraction, it is still crucial to define quality assessment protocols for these workflows. The definition of quality control protocols is hindered by the complexity of the pipelines under test and the absolute lack of gold standards for diffusion MRI data. Here we characterize the impact on structural connectivity workflows of the geometrical deformation typically shown by diffusion MRI data due to the inhomogeneity of magnetic susceptibility across the imaged object. We propose an evaluation framework, including whole-brain realistic phantoms, to compare the existing methodologies for correcting these artifacts. Additionally, we design and implement an image segmentation and registration method that avoids the correction task altogether and enables processing in the native space of the diffusion data. We release PySDCev, an evaluation framework for the quality control of connectivity pipelines, specialized in the study of susceptibility-derived distortions. In this context, we propose Diffantom, a whole-brain phantom that provides a solution to the lack of gold-standard data. The three correction methodologies under comparison performed reasonably, and it is difficult to determine which method is more advisable.
We demonstrate that susceptibility-derived correction is necessary to increase the sensitivity of connectivity pipelines, at the cost of specificity. Finally, with the registration and segmentation tool called regseg we demonstrate how the problem of susceptibility-derived distortion can be overcome allowing data to be used in their original coordinates. This is crucial to increase the sensitivity of the whole pipeline without any loss in specificity.
Abstract:
A DNA sequence has been obtained for a 35.6-kb genomic segment from Heliobacillus mobilis that contains a major cluster of photosynthesis genes. A total of 30 ORFs were identified, 20 of which encode enzymes for bacteriochlorophyll and carotenoid biosynthesis, reaction-center (RC) apoprotein, and cytochromes for cyclic electron transport. Donor side electron-transfer components to the RC include a putative RC-associated cytochrome c553 and a unique four-large-subunit cytochrome bc complex consisting of Rieske Fe-S protein (encoded by petC), cytochrome b6 (petB), subunit IV (petD), and a diheme cytochrome c (petX). Phylogenetic analysis of various photosynthesis gene products indicates a consistent grouping of oxygenic lineages that are distinct and descendent from anoxygenic lineages. In addition, H. mobilis was placed as the closest relative to cyanobacteria, which form a monophyletic origin to chloroplast-based photosynthetic lineages. The consensus of the photosynthesis gene trees also indicates that purple bacteria are the earliest emerging photosynthetic lineage. Our analysis also indicates that an ancient gene-duplication event giving rise to the paralogous bchI and bchD genes predates the divergence of all photosynthetic groups. In addition, our analysis of gene duplication of the photosystem I and photosystem II core polypeptides supports a “heterologous fusion model” for the origin and evolution of oxygenic photosynthesis.
Abstract:
Objective: The main purpose of this study was to investigate subclinical left ventricular dysfunction in patients with juvenile systemic lupus erythematosus (JSLE) using two-dimensional speckle-tracking echocardiography. A possible correlation between impaired myocardial deformation and the SLEDAI-2K (Systemic Lupus Erythematosus Disease Activity Index 2000) was also investigated, as well as the presence of cardiovascular risk factors, both traditional and disease-related. Methods: 50 cardiovascularly asymptomatic patients and 50 healthy controls (14.74 vs. 14.82 years, p=0.83) were evaluated by conventional echocardiography and two-dimensional speckle tracking. Results: Despite normal ejection fraction, patients showed a reduction in all longitudinal and radial myocardial deformation parameters compared with controls: longitudinal peak systolic strain [-20.3 (-11 to -26) vs. -22 (-17.8 to -30.4)%, p < 0.0001], longitudinal peak systolic strain rate [-1.19 ± 0.21 vs. -1.3 ± 0.25 s-1, p=0.0005], longitudinal early diastolic strain rate [1.7 (0.99 to 2.95) vs. 2 (1.08 to 3.00) s-1, p=0.0034], radial peak systolic strain [33.09 ± 8.6 vs. 44.36 ± 8.72%, p < 0.0001], radial peak systolic strain rate [1.98 ± 0.53 vs. 2.49 ± 0.68 s-1, p < 0.0001] and radial early diastolic strain rate [-2.31 ± 0.88 vs. -2.75 ± 0.97 s-1, p=0.02]. Circumferential peak systolic strain [-23.67 ± 3.46 vs. -24.6 ± 2.86%, p=0.43] and circumferential early diastolic strain rate [2 (0.88 to 3.4) vs. 1.99 (1.19 to 3.7) s-1, p=0.88] were similar in patients and controls. Only circumferential peak systolic strain rate [-1.5 ± 0.3 vs. -1.6 ± 0.3 s-1, p=0.036] was reduced in JSLE. A negative correlation was identified between longitudinal peak systolic strain and both the SLEDAI-2K (r = -0.52; p < 0.0001) and the number of cardiovascular risk factors per patient (r = -0.32, p=0.024).
Conclusions: Subclinical left ventricular systolic and diastolic dysfunction was demonstrated in JSLE using two-dimensional speckle-tracking echocardiography. Disease activity and exposure to cardiovascular risk factors probably contributed to the impairment of myocardial deformation in these patients.
Abstract:
Subpixel techniques are commonly used to increase the spatial resolution in tracking tasks. Tracking targets of known shape makes it possible to obtain information about object position and orientation in three-dimensional space. A proper selection of the target shape allows us to determine its position inside a plane and, within certain limits, its angular and azimuthal orientation. Our proposal is demonstrated both numerically and experimentally, and provides an increase in accuracy of more than one order of magnitude compared to the nominal resolution of the sensor. The experiment has been performed with a high-speed camera, which simultaneously provides high spatial and temporal resolution, so the technique may be interesting for applications where this kind of target can be attached, such as vibration monitoring and structural analysis.
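One widely used way to obtain subpixel localization of the kind reported above (not necessarily the authors' own method) is to fit a parabola through a correlation peak and its two neighbours and take the vertex as the refined position. The function below is an illustrative sketch.

```python
import numpy as np

def subpixel_peak_1d(c):
    """Refine the integer peak of a 1-D correlation curve `c` by fitting
    a parabola through the peak sample and its two neighbours."""
    i = int(np.argmax(c))
    if i == 0 or i == len(c) - 1:
        return float(i)                    # no neighbours to fit against
    cl, c0, cr = c[i - 1], c[i], c[i + 1]
    # Vertex of the parabola through (-1, cl), (0, c0), (+1, cr).
    offset = 0.5 * (cl - cr) / (cl - 2.0 * c0 + cr)
    return i + offset
```

Applied to each axis of a 2-D correlation surface, the same three-point fit yields the fractional-pixel displacements that make accuracies far beyond the sensor's nominal resolution possible.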
Abstract:
We present a targetless motion tracking method for detecting planar movements with subpixel accuracy. The method is based on computing and tracking the intersection of two nonparallel straight-line segments in the image of a moving object in a scene. It is simple and easy to implement because no complex structures have to be detected. It has been tested and validated in a lab experiment consisting of a vibrating object recorded with a high-speed camera working at 1000 fps. We managed to track displacements with an accuracy of hundredths of a pixel, or even thousandths of a pixel in the case of tracking harmonic vibrations. The method is widely applicable because it can be used to measure the amplitude and frequency of vibrations at a distance with a vision system.
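A minimal sketch of the core geometric step described above, assuming the edge pixels of each segment have already been extracted: fit each pixel set with a least-squares line, then intersect the two lines analytically, which naturally yields subpixel coordinates. The function names and the fitting choice (total least squares via SVD) are illustrative, not taken from the paper.

```python
import numpy as np

def fit_line(xs, ys):
    """Total least-squares line fit: returns (a, b, c) with a*x + b*y + c = 0."""
    x0, y0 = np.mean(xs), np.mean(ys)
    u = np.stack([np.asarray(xs, float) - x0, np.asarray(ys, float) - y0], axis=1)
    # The line normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(u)
    a, b = vt[-1]
    return a, b, -(a * x0 + b * y0)

def intersect(l1, l2):
    """Intersection of lines a1*x+b1*y+c1=0 and a2*x+b2*y+c2=0 (nonparallel)."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1          # vanishes only for parallel lines
    return ((b1 * c2 - b2 * c1) / det, (a2 * c1 - a1 * c2) / det)
```

Because the fit averages over many edge pixels, the intersection moves smoothly with the object even when individual pixels are quantized, which is what makes the hundredth-of-a-pixel accuracies quoted above plausible.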
Abstract:
Electronic books (e-books) are an interesting option compared to classic paper books. Most e-reading devices of the first generation were based on e-ink technology. With the appearance of the Apple iPad on the market, TFT-LCDs became important in the field of e-reading. Both technologies have advantages and disadvantages, but the question remains whether one or the other is better for reading. In the present study we analyzed and compared reading behavior on e-ink readers (e-ink displays) and on tablets (TFT-LCDs), as measured by eye tracking. The results suggest that reading behavior on tablets is indeed very similar to reading behavior on e-ink readers. Participants showed no difference in fixation duration. Significant differences in reading speed and in the proportion of regressive saccades suggest that tablets, under special artificial light conditions, may even provide better legibility.
Abstract:
"NASA TM X-63872."
Abstract:
Senior thesis written for Oceanography 445
Abstract:
Stirred mills are becoming increasingly used for fine and ultra-fine grinding. This technology is still poorly understood when used in the mineral processing context. This makes process optimisation of such devices problematic. 3D DEM simulations of the flow of grinding media in pilot scale tower mills and pin mills are carried out in order to investigate the relative performance of these stirred mills. Media flow patterns and energy absorption rates and distributions are analysed here. In the second part of this paper, coherent flow structures, equipment wear and mixing and transport efficiency are analysed. (C) 2006 Published by Elsevier Ltd.
Abstract:
In Australia more than 300 vertebrates, including 43 insectivorous bat species, depend on hollows in habitat trees for shelter, with many species using a network of multiple trees as roosts. We used roost-switching data on white-striped freetail bats (Tadarida australis; Microchiroptera: Molossidae) to construct a network representation of day roosts in suburban Brisbane, Australia. Bats were caught from a communal roost tree with a roosting group of several hundred individuals and released with transmitters. Each roost used by the bats represented a node in the network, and the movements of bats between roosts formed the links between nodes. Despite differences in gender and reproductive stage, the bats exhibited the same behavior throughout three radiotelemetry periods and over 500 bat-days of radio tracking: each roosted in separate roosts, switched roosts very infrequently, and associated with other bats only at the communal roost. This network resembled a scale-free network in which the distribution of the number of links from each roost followed a power law. Despite being spread over a large geographic area (>200 km²), each roost was connected to the others by fewer than three links. One roost (the hub or communal roost) defined the architecture of the network because it had the most links. That the network showed scale-free properties has profound implications for the management of the habitat trees of this roosting group. Scale-free networks provide high tolerance against stochastic events such as random roost removals but are susceptible to the selective removal of hub nodes.
Network analysis is a useful tool for understanding the structural organization of habitat tree usage; it allows an informed judgment of the relative importance of individual trees and hence the derivation of appropriate management decisions. Conservation planners and managers should emphasize the differential importance of habitat trees and think of them as being analogous to vital service centers in human societies.
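The hub-fragility argument above can be illustrated with a toy graph: in a hub-dominated network like the roost network described, deleting a random leaf barely changes connectivity, while deleting the hub shatters it. The miniature network below is made up for illustration, not the study's data.

```python
from collections import deque

def largest_component(adj):
    """Size of the largest connected component of an undirected graph
    given as {node: set(neighbours)}."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:                      # breadth-first search
            node = queue.popleft()
            size += 1
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        best = max(best, size)
    return best

def remove_node(adj, node):
    """Copy of the graph with `node` (e.g. a felled roost tree) removed."""
    return {n: {m for m in nbrs if m != node}
            for n, nbrs in adj.items() if n != node}

# Toy roost network: communal hub 'H' linked to ten satellite roosts,
# plus a single peripheral edge r0-r1, loosely mimicking a hub topology.
roosts = {'H': {f'r{i}' for i in range(10)}}
for i in range(10):
    roosts[f'r{i}'] = {'H'}
roosts['r0'].add('r1')
roosts['r1'].add('r0')
```

Removing a random satellite roost leaves one connected network of 10 roosts, whereas removing the communal hub leaves the largest surviving fragment with only 2 roosts, which is the management point the abstract makes.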
Abstract:
To participate effectively in the post-industrial information societies and knowledge/service economies of the 21st century, individuals must be better informed; have greater thinking and problem-solving abilities; be self-motivated; have a capacity for cooperative interaction; possess varied and specialised skills; and be more resourceful and adaptable than ever before. This paper reports on one outcome of a national project funded by the Ministerial Council on Education, Employment, Training and Youth Affairs, which investigated what practices, processes, strategies and structures best promote lifelong learning and the development of lifelong learners in the middle years of schooling. The investigation linked lifelong learning with middle schooling because there were indications that middle schooling reform practices also lead to the development of lifelong learning attributes, which is regarded as a desirable outcome of schooling in Australia. While the larger project provides depth around these questions, this paper specifically reports on the development of a three-phase model that can guide the sequence in which schools undertaking middle schooling reform attend to particular core component changes. The model is developed from an extensive analysis of 25 innovative schools around the nation, and provides a unique insight into the desirable sequences and time spent achieving reforms, along with typical pitfalls that lead to a regression in the reform process. Importantly, the model confirms that schooling reform takes much more time than planners typically expect or allocate, and that there are predictable and identifiable inhibitors to achieving it.
Abstract:
Operationalising and measuring the concept of globalisation is important, as the extent to which the international economy is integrated has a direct impact on industrial dynamics, national trade policies and firm strategies. Using complex systems network analysis with longitudinal trade data from 1938 to 2003, this paper presents a new way to measure globalisation. It demonstrates that some important aspects of the international trade network have been remarkably stable over this period. However, several network measures have changed substantially over the same time frame. Taken together, these analyses provide a novel measure of globalisation.
Abstract:
For optimum utilization of satellite-borne instrumentation, it is necessary to know precisely the orbital position of the spacecraft. The aim of this thesis is therefore twofold: first, to derive precise orbits, with particular emphasis placed on the altimetric satellite SEASAT; and second, to utilize the precise orbits to improve upon atmospheric density determinations for satellite drag modelling purposes. Part one of the thesis, on precise orbit determination, is particularly concerned with the tracking data - satellite laser ranging, altimetry and crossover height differences - and how these data can be used to analyse errors in the orbit, the geoid and sea-surface topography. The outcome of this analysis is the determination of a low-degree-and-order model for sea-surface topography. Part two, on the other hand, mainly concentrates on using the laser data to analyse and improve upon current atmospheric density models. In particular, the modelling of density changes associated with geomagnetic disturbances comes under scrutiny in this section. By introducing persistence modelling of a geomagnetic event and solving for certain geomagnetic parameters, a new density model is derived which performs significantly better than the state-of-the-art models over periods of severe geomagnetic storms at SEASAT heights. This is independently verified by application of the derived model to STARLETTE orbit determinations.
Abstract:
A re-examination of fundamental concepts and a formal structuring of the waveform analysis problem are presented in Part I: e.g., the nature of frequency is examined, and a novel alternative to the classical methods of detection is proposed and implemented which has the advantages of speed and independence from amplitude. Waveform analysis provides the link between Parts I and II. Part II is devoted to Human Factors and the Adaptive Task Technique. The historical, technical and intellectual development of the technique is traced in a review which examines the evidence of its advantages relative to non-adaptive, fixed-task methods of training, skill assessment and man-machine optimisation. A second review examines research evidence on the effect of vibration on manual control ability. Findings are presented in terms of percentage increment or decrement in performance relative to performance without vibration in the range 0-0.6 Rms 'g'. Primary task performance was found to vary by as much as 90% between tasks at the same Rms 'g'; differences in task difficulty accounted for this variation. Within tasks, vibration-added difficulty accounted for the effects of vibration intensity. Secondary tasks were found to be largely insensitive to vibration, except those which involved fine manual adjustment of minor controls. Three experiments are reported next in which an adaptive technique was used to measure the percentage of task difficulty added by vertical random and sinusoidal vibration to a 'Critical Compensatory Tracking' task. At vibration intensities between 0 and 0.09 Rms 'g' it was found that random vibration added (24.5 x Rms 'g')/7.4 x 100% to the difficulty of the control task. An equivalence relationship between random and sinusoidal vibration effects was established based upon added task difficulty.
The waveform analyses applied to the experimental data served to validate phase-plane analysis and uncovered the development of a control strategy and possibly a vibration-isolation strategy. The submission ends with an appraisal of the subjects mentioned in the thesis title.
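The reported relation for random vibration can be turned into a small worked example. The formula and its validity range are taken directly from the abstract; the function name and the range check are ours.

```python
def added_task_difficulty(rms_g):
    """Percentage of task difficulty added by vertical random vibration,
    using the relation reported in the abstract: (24.5 x Rms'g') / 7.4 x 100%.
    The relation was established for intensities of 0 to 0.09 Rms 'g'."""
    if not 0.0 <= rms_g <= 0.09:
        raise ValueError("relation holds only for 0-0.09 Rms 'g'")
    return 24.5 * rms_g / 7.4 * 100.0
```

At the upper end of the measured range, 0.09 Rms 'g', the relation gives roughly a 30% increase in task difficulty, which conveys the practical scale of the vibration effect.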