922 results for Pan-tilt camera calibration
Abstract:
This master's thesis addresses computer vision applied to technological art projects. Its subject is the calibration of camera and projector systems for tracking and 3D reconstruction applications in the visual and performing arts. The thesis is organized around two collaborations with the Quebec artists Daniel Danis and Nicolas Reeves. Projective geometry and classical calibration methods, such as planar calibration and calibration from epipolar geometry, are presented to introduce the techniques used in these two projects. The collaboration with Nicolas Reeves consists in calibrating a camera-projector system mounted on a robotic head in order to project video in real time onto mobile cubic screens. In addition to applying classical calibration methods, we propose a new technique for calibrating the pose of a camera mounted on a robotic head. This technique uses elliptical planes, generated by observing a single point in the world, to determine the pose of the camera with respect to the center of rotation of the robotic head. The project with stage director Daniel Danis addresses calibration techniques for multi-camera systems. For his theatre project, we developed an algorithm for calibrating a network of wiimote cameras. This technique, based on epipolar geometry, enables 3D reconstruction of a trajectory within a large volume at minimal cost. The results of the calibration techniques developed are presented, along with their use in real performance settings in front of an audience.
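For context, a minimal sketch (not the thesis's code) of the two-view triangulation step that an epipolar-geometry calibration such as the wiimote network above ultimately enables: once each camera's 3x4 projection matrix is known, a tracked point's 3D position follows from a linear (DLT) triangulation. The intrinsics, camera placement, and sample point below are illustrative assumptions.

import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one 3D point from two pixel observations and 3x4 projection matrices."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

if __name__ == "__main__":
    # Two synthetic cameras one unit apart along x, both looking down +z (assumed values).
    K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])
    X_true = np.array([0.3, -0.2, 5.0, 1.0])
    x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
    x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
    print(triangulate_dlt(P1, P2, x1, x2))   # ~ [0.3, -0.2, 5.0]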
Abstract:
Wireless Multimedia Sensor Networks (WMSNs) have become increasingly popular in recent years, driven in part by the increasing commoditization of small, low-cost CMOS sensors. As such, the challenge of automatically calibrating these camera nodes has become an important research problem, especially when a large number of such devices are deployed. This paper presents a method for automatically calibrating a wireless camera node that can rotate around one axis. The method involves capturing images as the camera is rotated and computing the homographies between the images. The camera parameters, including focal length, principal point, and the angle and axis of rotation, can then be recovered from two or more homographies. The homography computation algorithm is designed to cope with the limited resources of the wireless sensor node and to minimize energy consumption. In this paper, a modified RANdom SAmple Consensus (RANSAC) algorithm is proposed to increase the efficiency and reliability of the calibration procedure.
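As a hedged illustration of the relation this abstract relies on: for a purely rotating camera, image pairs satisfy H = K R K^{-1}, so with the intrinsics K in hand the rotation angle and axis can be read back from K^{-1} H K. The intrinsics, the 20-degree pan, and the synthetic matches below are assumptions, and OpenCV's stock RANSAC stands in for the paper's modified, resource-aware variant.

import cv2
import numpy as np

rng = np.random.default_rng(0)
K = np.array([[600.0, 0, 160], [0, 600, 120], [0, 0, 1]])              # assumed intrinsics
R, _ = cv2.Rodrigues(np.array([0.0, np.deg2rad(20.0), 0.0]).reshape(3, 1))   # 20 deg pan
H_true = K @ R @ np.linalg.inv(K)

# Synthetic correspondences: map random pixels through H_true, add noise and gross outliers.
pts1 = rng.uniform([0, 0], [320, 240], size=(200, 2))
pts1_h = np.hstack([pts1, np.ones((200, 1))])
pts2_h = (H_true @ pts1_h.T).T
pts2 = pts2_h[:, :2] / pts2_h[:, 2:3] + rng.normal(0, 0.3, (200, 2))
pts2[:20] += rng.uniform(30, 80, (20, 2))                              # 10% outliers

H, inliers = cv2.findHomography(pts1.astype(np.float32), pts2.astype(np.float32),
                                cv2.RANSAC, 2.0)
R_est = np.linalg.inv(K) @ H @ K
R_est /= np.cbrt(np.linalg.det(R_est))                                 # remove the scale of H
angle = np.degrees(np.arccos(np.clip((np.trace(R_est) - 1) / 2, -1, 1)))
print(f"recovered rotation: {angle:.2f} deg, inliers: {int(inliers.sum())}/200")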
Abstract:
A methodology for determining spacecraft attitude and autonomously calibrating a star camera, each independently of the other, is presented in this paper. Unlike most attitude determination algorithms, in which the satellite attitude depends on the camera calibration parameters (such as principal point offset and focal length), the proposed method computes the spacecraft attitude independently of the camera calibration parameters, except for lens distortion. In the proposed method, attitude estimation and star camera calibration are performed together, yet independently of each other, by directly using the star coordinates in the image plane and the corresponding star vectors in the inertial coordinate frame. The satellite attitude, camera principal point offset, focal length (in pixels), and lens distortion coefficient are found by a simple two-step method. In the first step, all parameters (except lens distortion) are estimated using a closed-form solution based on a distortion-free camera model. In the second step, the lens distortion coefficient is estimated by a linear least-squares method, using the solution of the first step in a camera model that incorporates distortion. These steps are applied iteratively to refine the estimated parameters. The whole procedure is fast enough for onboard implementation.
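A minimal sketch of the second step as described above, under the assumption of a single-coefficient radial model in normalized image coordinates (all names and values below are hypothetical): once the distortion-free projections from step one are available, the coefficient k1 enters linearly, x_d - x_u = k1 * r_u^2 * x_u (and likewise for y), so it can be recovered by ordinary linear least squares.

import numpy as np

def estimate_k1(undistorted, observed):
    """Least-squares estimate of k1 from ideal vs. measured (normalized) star positions."""
    r2 = np.sum(undistorted**2, axis=1, keepdims=True)        # r_u^2 per star
    A = (r2 * undistorted).reshape(-1, 1)                      # stack x and y equations
    b = (observed - undistorted).reshape(-1, 1)
    k1, *_ = np.linalg.lstsq(A, b, rcond=None)
    return k1.item()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ideal = rng.uniform(-0.3, 0.3, size=(60, 2))               # 60 synthetic star positions
    k1_true = -0.12                                            # assumed distortion
    r2 = np.sum(ideal**2, axis=1, keepdims=True)
    measured = ideal * (1 + k1_true * r2) + rng.normal(0, 1e-4, ideal.shape)
    print(f"estimated k1 = {estimate_k1(ideal, measured):+.4f} (true {k1_true:+.4f})")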
Abstract:
This paper presents two methods of star camera calibration that determine the camera calibration parameters (such as principal point and focal length) along with lens distortions (radial and decentering). The first method works autonomously, using star coordinates in three consecutive image frames, and is therefore independent of star identification and of biased attitude information. The parameters obtained by the autonomous self-calibration technique help to match the imaged stars with the catalogued stars. The second, least-squares-based method uses inertial star coordinates to determine the satellite attitude and the star camera parameters with radial lens distortion, each independently of the other. The camera parameters determined by the second method are more accurate than those from the first, self-calibration method. Moreover, unlike most attitude determination algorithms, in which the satellite attitude depends on the camera calibration parameters, the second method computes the spacecraft attitude independently of the camera calibration parameters, except for radial lens distortion. Finally, a Kalman-filter-based sequential estimation scheme is employed to filter out the noise of the least-squares estimation.
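A hedged sketch of the final smoothing stage mentioned here: a scalar Kalman filter that treats one calibration parameter (say, focal length in pixels) as a constant state and each frame's least-squares estimate as a noisy measurement. The noise levels and the 1500 px focal length are illustrative assumptions, not values from the paper.

import numpy as np

def kalman_constant(measurements, meas_var, x0=0.0, p0=1e6):
    """Sequentially fuse noisy measurements of a constant parameter."""
    x, p = x0, p0
    for z in measurements:
        k = p / (p + meas_var)        # Kalman gain (no process noise: state is constant)
        x = x + k * (z - x)           # update the estimate with the innovation
        p = (1 - k) * p               # shrink the estimate covariance
    return x, p

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    f_true = 1500.0                                    # px, assumed
    per_frame_ls = f_true + rng.normal(0, 8.0, 50)     # noisy per-frame LS estimates
    f_hat, var = kalman_constant(per_frame_ls, meas_var=8.0**2)
    print(f"filtered focal length: {f_hat:.2f} px (posterior std {var**0.5:.2f} px)")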
Abstract:
Oceans - San Diego, 2013
Abstract:
This paper describes a simple method for internal camera calibration for computer vision. The method is based on tracking image features through a sequence of images while the camera undergoes pure rotation. The locations of the features relative to the camera or to each other need not be known, so the method can be used both for laboratory calibration and for self-calibration of autonomous robots working in unstructured environments. A second calibration method is also presented; it uses simple geometric objects such as spheres and straight lines to determine the camera parameters. Calibration is performed using both methods and the results are compared.
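For the first method, a sketch of the classical pure-rotation self-calibration idea (assuming noise-free homographies and a synthetic ground-truth K; all names are invented here): each inter-image homography of a rotating camera satisfies H = K R K^{-1}, hence H^T w H = w where w = K^{-T} K^{-1} is the image of the absolute conic. Stacking these linear constraints over two or more rotations and solving for w yields the intrinsics via a Cholesky factorization.

import numpy as np
import cv2

def intrinsics_from_rotation_homographies(Hs):
    """Recover K from homographies of a purely rotating camera (noise-free sketch)."""
    # Basis of symmetric 3x3 matrices -> parameterize w by 6 numbers.
    idx = [(0, 0), (0, 1), (0, 2), (1, 1), (1, 2), (2, 2)]
    basis = []
    for i, j in idx:
        E = np.zeros((3, 3)); E[i, j] = E[j, i] = 1.0
        basis.append(E)
    rows = []
    for H in Hs:
        H = H / np.cbrt(np.linalg.det(H))        # scale so det(H) = 1
        # Column k of M holds the 6 entries of H^T E_k H - E_k (the map is linear in w).
        M = np.column_stack([[(H.T @ E @ H - E)[i, j] for i, j in idx] for E in basis])
        rows.append(M)
    _, _, Vt = np.linalg.svd(np.vstack(rows))
    w = Vt[-1]
    W = np.zeros((3, 3))
    for val, (i, j) in zip(w, idx):
        W[i, j] = W[j, i] = val
    if W[2, 2] < 0:                               # fix the overall sign of the null vector
        W = -W
    L = np.linalg.cholesky(W)                     # w = K^{-T} K^{-1} = L L^T with L = K^{-T}
    K = np.linalg.inv(L.T)
    return K / K[2, 2]

if __name__ == "__main__":
    K_true = np.array([[700.0, 0, 320], [0, 700, 240], [0, 0, 1]])
    Hs = []
    for rvec in ([0, 0.3, 0], [0.25, 0, 0.1]):    # two rotations about distinct axes
        R, _ = cv2.Rodrigues(np.array(rvec, dtype=float).reshape(3, 1))
        Hs.append(K_true @ R @ np.linalg.inv(K_true))
    print(np.round(intrinsics_from_rotation_homographies(Hs), 1))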
Abstract:
The paper reports an interactive tool for calibrating a camera, suitable for use in outdoor scenes. The motivation for the tool was the need to obtain an approximate calibration for images taken with no explicit calibration data. Such images are frequently presented to research laboratories, especially in surveillance applications, with a request to demonstrate algorithms. The method decomposes the calibration parameters into intuitively simple components, and relies on the operator interactively adjusting the parameter settings to achieve a visually acceptable agreement between a rectilinear calibration model and his own perception of the scene. Using the tool, we have been able to calibrate images of unknown scenes, taken with unknown cameras, in a matter of minutes. The standard of calibration has proved to be sufficient for model-based pose recovery and tracking of vehicles.
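A hedged sketch of the kind of forward model such an interactive tool needs: intuitive parameters (camera height, tilt, focal length in pixels) are turned into a projection matrix, and a rectilinear ground-plane grid is projected into the image so the operator can compare it against the scene. All parameter names and values below are illustrative assumptions, not the tool's actual interface.

import numpy as np

def project_ground_grid(f_px, cx, cy, height_m, tilt_deg, extent_m=20.0, step_m=2.0):
    """Project a Z=0 ground grid into the image of a camera at height_m, tilted down."""
    K = np.array([[f_px, 0, cx], [0, f_px, cy], [0, 0, 1.0]])
    t = np.deg2rad(tilt_deg)
    # World axes: X right, Y forward, Z up. Camera axes: x right, y down, z forward,
    # with the optical axis pitched down by tilt_deg.
    R = np.array([[1, 0, 0],
                  [0, -np.sin(t), -np.cos(t)],
                  [0, np.cos(t), -np.sin(t)]])
    C = np.array([0.0, 0.0, height_m])                 # camera center in world coordinates
    P = K @ np.hstack([R, (-R @ C).reshape(3, 1)])     # 3x4 projection matrix
    pts = []
    for x in np.arange(-extent_m, extent_m + 1e-9, step_m):
        for y in np.arange(2.0, extent_m + 1e-9, step_m):   # grid in front of the camera
            u = P @ np.array([x, y, 0.0, 1.0])
            pts.append(u[:2] / u[2])
    return np.array(pts)

if __name__ == "__main__":
    grid = project_ground_grid(f_px=900, cx=640, cy=360, height_m=6.0, tilt_deg=25.0)
    print(grid[:3])    # a few image points the operator would see overlaid on the frame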