861 results for Stereo matching
Abstract:
Moving cameras are needed for a wide range of applications in robotics, vehicle systems, surveillance, etc. However, many foreground object segmentation methods reported in the literature are unsuitable for such settings; these methods assume that the camera is fixed and the background changes slowly, and are inadequate for segmenting objects in video if there is significant motion of the camera or background. To address this shortcoming, a new method for segmenting foreground objects is proposed that utilizes binocular video. The method is demonstrated in the application of tracking and segmenting people in video who are approximately facing the binocular camera rig. Given a stereo image pair, the system first tries to find faces. Starting at each face, the region containing the person is grown by merging regions from an over-segmented color image. The disparity map is used to guide this merging process. The system has been implemented on a consumer-grade PC, and tested on video sequences of people indoors obtained from a moving camera rig. As can be expected, the proposed method works well in situations where other foreground-background segmentation methods typically fail. We believe that this superior performance is partly due to the use of object detection to guide region merging in disparity/color foreground segmentation, and partly due to the use of disparity information available with a binocular rig, in contrast with most previous methods that assumed monocular sequences.
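The disparity-guided merging step can be sketched on a region adjacency graph. This is a minimal illustration, not the authors' implementation: it assumes the over-segmentation is given as an adjacency map plus a mean disparity per segment, and grows the person region outward from the face segment by absorbing neighbors of similar disparity.

```python
def grow_person(seed, adjacency, disparity, tol=2.0):
    """Grow a person region from a detected face segment by merging
    neighboring over-segmented regions whose mean disparity is within
    `tol` of the current region's mean (disparity-guided merging)."""
    region = {seed}
    frontier = [seed]
    while frontier:
        seg = frontier.pop()
        mean_d = sum(disparity[s] for s in region) / len(region)
        for nb in adjacency[seg]:
            if nb not in region and abs(disparity[nb] - mean_d) <= tol:
                region.add(nb)       # similar depth: merge into the person
                frontier.append(nb)  # and keep growing from here
    return region
```

A real system would also weight the merge decision by color similarity between segments, as the abstract describes; disparity alone is used here to keep the sketch short.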
Abstract:
Using data on user attributes and interactions from an online dating site, we estimate mate preferences, and use the Gale-Shapley algorithm to predict stable matches. The predicted matches are similar to the actual matches achieved by the dating site, and the actual matches are approximately efficient. Out-of-sample predictions of offline matches, i.e., marriages, exhibit assortative mating patterns similar to those observed in actual marriages. Thus, mate preferences, without resort to search frictions, can generate sorting in marriages. However, we underpredict some of the correlation patterns; search frictions may play a role in explaining the discrepancy.
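The Gale-Shapley (deferred acceptance) algorithm used here is standard; a minimal proposer-side sketch, with preference lists as plain Python dicts and all names illustrative:

```python
def gale_shapley(proposer_prefs, receiver_prefs):
    """Proposer-side deferred acceptance. Each prefs value is an ordered
    list of acceptable partners, best first. Returns receiver -> proposer."""
    # rank[r][p] = position of proposer p in receiver r's list (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    free = list(proposer_prefs)            # proposers without a tentative match
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                             # receiver -> current proposer
    while free:
        p = free.pop()
        if next_choice[p] >= len(proposer_prefs[p]):
            continue                       # p has exhausted their list
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p                   # r tentatively accepts
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])          # r trades up; old partner is free
            match[r] = p
        else:
            free.append(p)                 # r rejects p
    return match
```

The resulting matching is stable: no proposer-receiver pair would both prefer each other to their assigned partners, which is the property the paper exploits to predict matches from estimated preferences.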
Abstract:
The design of the New York City (NYC) high school match involved trade-offs among efficiency, stability, and strategy-proofness that raise new theoretical questions. We analyze a model with indifferences (ties) in school preferences. Simulations with field data and the theory favor breaking indifferences the same way at every school (single tiebreaking) in a student-proposing deferred acceptance mechanism. Any inefficiency associated with a realized tiebreaking cannot be removed without harming student incentives. Finally, we empirically document the extent of potential efficiency loss associated with strategy-proofness and stability, and direct attention to some open questions. (JEL C78, D82, I21).
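Single tiebreaking means drawing one lottery order over students and using that same order to break every school's indifference classes. A minimal sketch, with the lottery passed in explicitly (names illustrative); the resulting strict school preferences would then feed a student-proposing deferred acceptance run:

```python
def single_tiebreak(school_tiers, lottery):
    """Convert each school's ranking-with-ties into a strict ranking.
    school_tiers: school -> list of indifference classes, best tier first.
    lottery: ONE common ordering of students, applied at every school."""
    lot = {s: i for i, s in enumerate(lottery)}   # lottery number per student
    return {school: [s for tier in tiers
                     for s in sorted(tier, key=lot.__getitem__)]
            for school, tiers in school_tiers.items()}
```

Because every school uses the same lottery, two students tied at several schools are ordered consistently everywhere, which is the property the simulations favor over drawing a fresh lottery per school.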
Abstract:
Telecentric optical computed tomography (optical-CT) is a state-of-the-art method for visualizing and quantifying 3-dimensional dose distributions in radiochromic dosimeters. In this work a prototype telecentric system (DFOS-Duke Fresnel Optical-CT Scanner) is evaluated which incorporates two substantial design changes: the use of Fresnel lenses (reducing lens costs from $10-30K to $1-3K) and the use of a 'solid tank' (which reduces noise, and the volume of refractively matched fluid from 1 liter to 10 cc). The efficacy of DFOS was evaluated by direct comparison against commissioned scanners in our lab. Measured dose distributions from all systems were compared against the predicted dose distributions from a commissioned treatment planning system (TPS). Three treatment plans were investigated: a simple four-field box treatment, a multiple small field delivery, and a complex IMRT treatment. Dosimeters were imaged within 2 h post irradiation, using consistent scanning techniques (360 projections acquired at 1 degree intervals, reconstruction at 2 mm). DFOS efficacy was evaluated through inspection of dose line-profiles, and 2D and 3D dose and gamma maps. DFOS/TPS gamma pass rates with 3%/3 mm dose difference/distance-to-agreement criteria ranged from 89.3% to 92.2%, compared to 95.6% to 99.0% obtained with the commissioned system. The 3D gamma pass rate between the commissioned system and DFOS was 98.2%. Typical noise rates in DFOS reconstructions were up to 3%, compared to under 2% for the commissioned system. In conclusion, while the introduction of a solid tank proved advantageous with regards to cost and convenience, further work is required to improve the image quality and dose reconstruction accuracy of the new DFOS optical-CT system.
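The 3%/3 mm gamma comparison can be illustrated in one dimension. This is a simplified global-gamma sketch (exhaustive search over reference points, dose normalized to the reference maximum), not the evaluation code used in the study:

```python
def gamma_pass_rate(measured, reference, spacing_mm,
                    dose_tol=0.03, dta_tol_mm=3.0):
    """1-D global gamma analysis. A measured point passes when some
    reference point keeps the combined dose-difference / distance-to-
    agreement metric at or below 1. Returns the pass rate in percent."""
    dmax = max(reference)                     # global normalization dose
    passes = 0
    for i, dm in enumerate(measured):
        best = float('inf')
        for j, dr in enumerate(reference):
            dd = (dm - dr) / (dose_tol * dmax)        # dose axis
            dta = (i - j) * spacing_mm / dta_tol_mm   # distance axis
            best = min(best, dd * dd + dta * dta)
        if best <= 1.0:
            passes += 1
    return 100.0 * passes / len(measured)
```

Clinical tools compute this in 3D with sub-voxel interpolation, but the pass-rate numbers quoted in the abstract are percentages of points satisfying exactly this kind of criterion.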
Abstract:
Fractal video compression is a relatively new video compression method. Its attraction is due to its high compression ratio and simple decompression algorithm, but its computational complexity is high, so parallel algorithms on high-performance machines become one way out. In this study we partition the matching search, which occupies the majority of the work in a fractal video compression process, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm is able to achieve a high speedup in these distributed environments.
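Partitioning the matching search by range block is straightforward to sketch. Assuming blocks are flat lists of pixel values, and ignoring the contraction and intensity transforms of a real fractal coder, each task finds the best-matching domain block by mean squared error and the tasks run concurrently:

```python
from concurrent.futures import ThreadPoolExecutor

def mse(a, b):
    """Mean squared error between two equally sized pixel blocks."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def best_domain(range_block, domains):
    """The matching search for ONE range block: scan all domain blocks."""
    return min(range(len(domains)), key=lambda j: mse(range_block, domains[j]))

def fractal_match(ranges, domains, workers=4):
    """Partition the search by range block and run the tasks in a pool,
    mirroring how the paper distributes tasks across networked PCs."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda r: best_domain(r, domains), ranges))
```

In the paper's setting each task would be dispatched to a remote PC via DCOM or .NET Remoting rather than a local thread pool, but the decomposition is the same: the per-range searches are independent, so they parallelize with no shared state.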
Abstract:
This paper introduces a mechanism for representing and recognizing case history patterns with rich internal temporal aspects. A case history is characterized as a collection of elemental cases, as in conventional case-based reasoning systems, together with the corresponding temporal constraints, which can be relative and/or absolute. A graphical representation for case histories is proposed: a directed, partially weighted and labeled simple graph. In terms of this graphical representation, an eigen-decomposition graph matching algorithm is proposed for recognizing case history patterns.
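Such a case history can be held in a plain dictionary-based graph: nodes for elemental cases, directed edges for temporal constraints, with an optional weight for absolute delays. All names and fields below are illustrative, not the paper's schema:

```python
def add_case(history, name, label):
    """Add an elemental case as a labeled node."""
    history["nodes"][name] = {"label": label}

def add_constraint(history, u, v, delay_hours=None):
    """Relative constraint 'u before v'; a delay makes it absolute,
    giving the edge a weight (hence 'partially weighted')."""
    edge = {"relation": "before"}
    if delay_hours is not None:
        edge["delay_hours"] = delay_hours
    history["edges"][(u, v)] = edge

history = {"nodes": {}, "edges": {}}
add_case(history, "admission", "event")
add_case(history, "fever", "symptom")
add_constraint(history, "admission", "fever", delay_hours=6)   # absolute
```

The eigen-decomposition matcher would then compare two such graphs through the spectra of their (weighted) adjacency matrices rather than by explicit subgraph search.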
Abstract:
In this paper, we critically examine a special class of graph matching algorithms that follow the approach of node-similarity measurement. A high-level algorithmic framework, the node-similarity graph matching framework (NSGM framework), is proposed, from which many existing graph matching algorithms can be derived, including the eigen-decomposition method of Umeyama, the polynomial-transformation method of Almohamad, the hubs-and-authorities method of Kleinberg, and the Kronecker product successive projection methods of van Wyk. In addition, improved algorithms can be developed from the NSGM framework with respect to the corresponding results in graph theory. We also observe that, in general, any algorithm subsumed by the NSGM framework fails to work well for graphs with non-trivial auto-isomorphism structure.
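One concrete member of this node-similarity family is the coupled iteration S ← A S Bᵀ + Aᵀ S B with normalization, where A and B are the two adjacency matrices and S holds pairwise node-similarity scores. A small pure-Python sketch with dense matrices (illustrative, not any of the cited algorithms verbatim):

```python
def node_similarity(A, B, iters=20):
    """Iterative node-similarity between digraphs with adjacency matrices
    A (n x n) and B (m x m). S[i][j] grows when i's neighbors in A look
    like j's neighbors in B, in both edge directions."""
    n, m = len(A), len(B)
    S = [[1.0] * m for _ in range(n)]            # uniform start
    for _ in range(iters):
        T = [[0.0] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                acc = 0.0
                for k in range(n):
                    for l in range(m):
                        acc += A[i][k] * S[k][l] * B[j][l]   # A S B^T
                        acc += A[k][i] * S[k][l] * B[l][j]   # A^T S B
                T[i][j] = acc
        norm = sum(x * x for row in T for x in row) ** 0.5 or 1.0
        S = [[x / norm for x in row] for row in T]           # renormalize
    return S
```

On two identical path graphs the scores concentrate on the correct correspondence; on graphs with non-trivial automorphisms, symmetric nodes receive identical scores, which is exactly the failure mode the paper points out.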
Abstract:
This paper examines different ways of measuring similarity between software design models for Case-Based Reasoning (CBR), to facilitate reuse of software design and code. The paper considers structural and behavioural aspects of similarity between software design models. Similarity metrics for comparing static class structures are defined and discussed. A graph representation of UML class diagrams and corresponding similarity measures for UML class diagrams are defined. A full-search graph matching algorithm for measuring structural similarity between diagrams, based on identification of the Maximum Common Sub-graph (MCS), is presented. Finally, a simple evaluation of the approach is presented and discussed.
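For small class diagrams an MCS-based similarity can be computed by exhaustive search over injective node mappings, normalizing the size of the common edge set. A sketch with directed edges (classes as nodes, associations as edges); the normalization by the larger edge count is one common choice, not necessarily the paper's:

```python
from itertools import permutations

def mcs_similarity(edges1, nodes1, edges2, nodes2):
    """Similarity of two small directed graphs via exhaustive maximum
    common subgraph search: try every injective mapping of the smaller
    node set into the larger, count preserved edges, normalize."""
    e1, e2 = set(edges1), set(edges2)
    if len(nodes1) > len(nodes2):                # map the smaller graph in
        nodes1, nodes2, e1, e2 = nodes2, nodes1, e2, e1
    best = 0
    for perm in permutations(nodes2, len(nodes1)):
        m = dict(zip(nodes1, perm))              # candidate node mapping
        best = max(best, sum((m[u], m[v]) in e2 for u, v in e1))
    denom = max(len(e1), len(e2))
    return best / denom if denom else 1.0
```

The factorial search is the "full search" the abstract refers to; it is only feasible for small diagrams, which is why MCS-based metrics are usually paired with pruning heuristics in practice.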
Abstract:
In terms of a general time theory which treats time-elements as typed point-based intervals, a formal characterization of time-series and state-sequences is introduced. Within this framework, the subsequence matching problem is tackled by transforming it into a bipartite graph matching problem. A hybrid similarity model with high tolerance of inversion, crossover and noise is then proposed for matching the corresponding bipartite graphs, involving both temporal and non-temporal measurements. Experimental results on reconstructed time-series data from the UCI KDD Archive demonstrate that this approach is more effective than traditional similarity-model-based algorithms, promising robust techniques for larger time-series databases and real-life applications such as Content-based Video Retrieval (CBVR).
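Once two state-sequences are cast as the two sides of a bipartite graph with pairwise similarity scores, subsequence matching reduces to maximum-weight bipartite matching. A brute-force sketch adequate for short sequences (a production system would use e.g. the Hungarian algorithm); `sim[i][j]` stands for the hybrid temporal/non-temporal similarity, assumed precomputed:

```python
from itertools import permutations

def best_bipartite_score(sim):
    """Maximum-weight bipartite matching by brute force.
    sim is an n x m similarity matrix; assumes n <= m.
    Returns the best total score over all one-to-one assignments."""
    n, m = len(sim), len(sim[0])
    best = 0.0
    for cols in permutations(range(m), n):       # one column per row
        best = max(best, sum(sim[i][j] for i, j in zip(range(n), cols)))
    return best
```

Because the matching is over sets of states rather than a rigid alignment, a swapped (inverted or crossed-over) pair of states still contributes its full pairwise similarity, which is the source of the robustness the abstract claims.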
Abstract:
We report an experimental technique for the comparison of ionization processes in ultrafast laser pulses irrespective of pulse ellipticity. Multiple ionization of xenon by 50 fs, 790 nm, linearly and circularly polarized laser pulses is observed over the intensity range 10 TW/cm² to 10 PW/cm² using effective intensity matching (EIM), which is coupled with intensity selective scanning (ISS) to recover the geometry-independent probability of ionization. Such measurements, made possible by quantifying diffraction effects in the laser focus, are compared directly to theoretical predictions of multiphoton, tunnel and field ionization, and remarkable agreement is demonstrated. EIM-ISS allows the straightforward quantification of the probability of recollision ionization in a linearly polarized laser pulse. Furthermore, the probability of ionization is discussed in terms of the Keldysh adiabaticity parameter gamma, and the influence of the precursor ionic states present in recollision ionization is observed.
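The Keldysh parameter can be estimated with the standard formula γ = √(Ip / 2Up), using the ponderomotive energy Up[eV] ≈ 9.33×10⁻¹⁴ · I[W/cm²] · λ²[µm]. A quick sketch; the xenon Ip value and the example intensity are standard textbook numbers, not values taken from the paper:

```python
def keldysh_gamma(ip_ev, intensity_w_cm2, wavelength_um):
    """Keldysh adiabaticity parameter gamma = sqrt(Ip / (2 Up)).
    gamma >> 1 suggests multiphoton ionization; gamma << 1, tunneling."""
    up_ev = 9.33e-14 * intensity_w_cm2 * wavelength_um ** 2  # ponderomotive energy
    return (ip_ev / (2.0 * up_ev)) ** 0.5

# Xe (Ip ~ 12.13 eV) at 790 nm and 100 TW/cm^2 sits near gamma ~ 1,
# the crossover region between multiphoton and tunnel ionization.
```

This is why the abstract's intensity range, spanning 10 TW/cm² to 10 PW/cm², samples both sides of the multiphoton/tunneling transition.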
Abstract:
Recent experimental neutron diffraction data and ab initio molecular dynamics simulation of the ionic liquid dimethylimidazolium chloride ([dmim]Cl) have provided a structural description of the system at the molecular level. However, partial radial distribution functions calculated from the latter, when compared to previous classical simulation results, highlight some limitations in the structural description offered by force field-based simulations. With the availability of ab initio data it is possible to improve the classical description of [dmim]Cl by using the force matching approach, and the strategy for fitting complex force fields in their original functional form is discussed. A self-consistent optimization method for the generation of classical potentials of general functional form is presented and applied, and a force field that better reproduces the observed first-principles forces is obtained. When used in simulation, it predicts structural data that more faithfully reproduces that observed in the ab initio studies. Some possible refinements to the technique, its application, and the general suitability of common potential energy functions used within many ionic liquid force fields are discussed.
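The least-squares core of force matching, reduced to a single multiplicative parameter, looks as follows. Real force matching fits the many parameters of the full force-field functional form against all ab initio force components simultaneously, but the objective is the same sum of squared force residuals:

```python
def force_match_scale(ref_forces, model_forces):
    """One-parameter force matching: find the scale c that minimizes
    sum_i (ref_i - c * model_i)^2.  Setting the derivative to zero gives
    the closed form c = <ref, model> / <model, model>."""
    num = sum(r * m for r, m in zip(ref_forces, model_forces))
    den = sum(m * m for m in model_forces)
    return num / den
```

With a full force field the same residual is minimized over all parameters at once, typically iteratively and self-consistently with re-sampled configurations, as the abstract's "self-consistent optimization method" describes.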
Abstract:
In this paper, a parallel-matching processor architecture with early jump-out (EJO) control is proposed to carry out high-speed biometric fingerprint database retrieval. The processor performs the fingerprint retrieval by using minutia point matching. An EJO method is applied to the proposed architecture to speed up the large database retrieval. The processor is implemented on a Xilinx Virtex-E, and occupies 6,825 slices and runs at up to 65 MHz. The software/hardware co-simulation benchmark with a database of 10,000 fingerprints verifies that the matching speed can achieve the rate of up to 1.22 million fingerprints per second. EJO results in about a 22% gain in computing efficiency.
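The early jump-out idea is: while scoring a candidate fingerprint, abandon it as soon as even matching every remaining probe minutia could not reach the acceptance threshold. A software sketch using coordinate tolerance only (a real minutia matcher would also compare angles and minutia types, and the paper implements this in hardware):

```python
def match_with_ejo(probe, candidate, threshold, tol=1.0):
    """Minutia matching with early jump-out (EJO). probe and candidate
    are lists of (x, y) minutiae; returns True if at least `threshold`
    probe minutiae find a counterpart within `tol` in both coordinates."""
    matched = 0
    for i, m in enumerate(probe):
        remaining = len(probe) - i
        if matched + remaining < threshold:
            return False                     # early jump-out: cannot succeed
        if any(abs(m[0] - c[0]) <= tol and abs(m[1] - c[1]) <= tol
               for c in candidate):
            matched += 1
    return matched >= threshold
```

Over a large database most candidates are rejected after only a few minutiae, which is where the reported ~22% efficiency gain comes from.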