945 results for Camera Obscura
Abstract:
The importance and use of text extraction from camera-based coloured scene images is increasing rapidly. Text within a camera-grabbed image can contain a large amount of metadata about that scene, and such metadata can be useful for identification, indexing and retrieval purposes. While the segmentation and recognition of text from document images is quite successful, detection of coloured scene text remains a new challenge for camera-based images. Common problems for text extraction from camera-based images are the lack of prior knowledge of text features such as colour, font, size and orientation, as well as of the location of probable text regions. In this paper, we document the development of a fully automatic and extremely robust text segmentation technique that can be applied to any type of camera-grabbed frame, be it a single image or video. A new algorithm is proposed which overcomes the current problems of text segmentation by exploiting text appearance in terms of colour and spatial distribution. When tested on a variety of camera-based images, the proposed technique was found to outperform existing techniques, and it also handles unconstrained complex backgrounds. The novelty of the work lies in the fact that this is the first time that colour and spatial information have been used simultaneously for text extraction.
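The abstract does not detail the algorithm, so the sketch below only illustrates, in broad strokes, what combining a colour cue with a spatial cue for text-candidate detection can look like. It is not the authors' method: the k-means colour quantisation, the scikit-learn/SciPy calls, and the area and aspect-ratio thresholds are all assumptions made for this example.

```python
# Illustrative sketch only -- NOT the algorithm proposed in the paper above.
# Colour cue: k-means quantisation into a few dominant colours.
# Spatial cue: connected-component geometry within each colour layer.
import numpy as np
from sklearn.cluster import KMeans
from scipy import ndimage

def text_candidates(rgb, n_colours=6, min_area=30, max_area=5000):
    """Return a boolean mask of candidate text pixels in an H x W x 3 uint8 image."""
    h, w, _ = rgb.shape
    # Quantise pixel colours (assumed number of clusters).
    labels = KMeans(n_clusters=n_colours, n_init=4).fit_predict(
        rgb.reshape(-1, 3).astype(float)).reshape(h, w)

    mask = np.zeros((h, w), dtype=bool)
    for c in range(n_colours):
        # Keep connected components whose area and aspect ratio are
        # plausible for characters (thresholds are assumed, not tuned).
        comps, n = ndimage.label(labels == c)
        for i, sl in enumerate(ndimage.find_objects(comps), start=1):
            region = comps[sl] == i
            area = int(region.sum())
            hh, ww = region.shape
            if min_area <= area <= max_area and 0.1 <= ww / hh <= 10:
                mask[sl] |= region
    return mask
```

In a real pipeline a stage like this would typically be followed by grouping the surviving components into words or lines before recognition.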
Abstract:
The Rapid Oscillations in the Solar Atmosphere (ROSA) instrument is a synchronized, six-camera, high-cadence solar imaging instrument developed by Queen's University Belfast. The system is available on the Dunn Solar Telescope at the National Solar Observatory in Sunspot, New Mexico, USA, as a common-user instrument. Consisting of six 1k x 1k Peltier-cooled frame-transfer CCD cameras with very low noise (0.02 – 15 e s(-1) pixel(-1)), each ROSA camera is capable of full-chip readout speeds in excess of 30 Hz, or 200 Hz when the CCD is windowed. Combining multiple cameras and fast readout rates, ROSA will accumulate approximately 12 TB of data per 8 hours of observing. Following successful commissioning during August 2008, ROSA will allow multi-wavelength studies of the solar atmosphere at high temporal resolution.
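As a quick sanity check on the quoted data volume, the back-of-the-envelope estimate below assumes 16-bit (2-byte) pixels and sustained full-chip readout at 30 Hz for all six cameras; neither assumption is stated explicitly in the abstract.

```python
# Back-of-the-envelope check of the ~12 TB per 8-hour observing figure.
# Assumes 2 bytes per pixel and a sustained 30 Hz full-chip duty cycle.
cameras = 6
frame_bytes = 1024 * 1024 * 2        # 1k x 1k chip, 16-bit pixels (assumed)
frame_rate = 30                      # full-chip frames per second
seconds = 8 * 3600                   # 8-hour observing run

total_bytes = cameras * frame_bytes * frame_rate * seconds
print(f"{total_bytes / 1e12:.1f} TB")   # prints 10.9 TB, consistent with ~12 TB
```

The remaining margin up to roughly 12 TB could plausibly come from file headers, calibration frames, or higher-rate windowed runs.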
Abstract:
Baited cameras are often used for abundance estimation wherever alternative techniques are precluded, e.g. in abyssal systems and areas such as reefs. This method has thus far used models of the arrival process that are deterministic and, therefore, permit no estimate of precision. Furthermore, errors due to multiple counting of fish and missing those not seen by the camera have restricted the technique to using only the time of first arrival, leaving a lot of data redundant. Here, we reformulate the arrival process using a stochastic model, which allows the precision of abundance estimates to be quantified. Assuming a non-gregarious, cross-current-scavenging fish, we show that prediction of abundance from first arrival time is extremely uncertain. Using example data, we show that simple regression-based prediction from the initial (rising) slope of numbers at the bait gives good precision, accepting certain assumptions. The most precise abundance estimates were obtained by including the declining phase of the time series, using a simple model of departures, and taking account of scavengers beyond the camera's view, using a hidden Markov model.
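The stochastic model itself is not reproduced in this abstract; the toy simulation below only illustrates the general idea of regressing on the initial rising slope of the count series. The Poisson arrival process, the 15-minute fitting window, and the calibration constant linking arrival rate to an abundance index are all hypothetical choices for this sketch, not values from the paper.

```python
# Toy illustration of slope-based abundance indexing from a baited-camera
# count series -- NOT the stochastic model developed in the paper above.
import numpy as np

rng = np.random.default_rng(0)

def simulate_counts(arrival_rate, minutes=60):
    """Cumulative number of fish at the bait, one value per minute.
    Arrivals are Poisson; departures are ignored during the rising phase."""
    arrivals = rng.poisson(arrival_rate, size=minutes)
    return np.cumsum(arrivals)

def rising_slope(counts, window=15):
    """Least-squares slope of the first `window` minutes of the series."""
    t = np.arange(window)
    slope, _ = np.polyfit(t, counts[:window], 1)
    return slope

counts = simulate_counts(arrival_rate=0.4)   # 0.4 fish per minute (assumed)
lam_hat = rising_slope(counts)               # estimated arrival rate
# Under a cross-current scavenging argument, abundance scales with the
# arrival rate; k is a hypothetical calibration constant that would depend
# on current speed and odour-plume geometry.
k = 25.0
print(f"arrival rate ~{lam_hat:.2f}/min, abundance index ~{k * lam_hat:.1f}")
```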
Abstract:
Utilising cameras as a means to survey the surrounding environment is becoming increasingly popular in a number of different research areas and applications. Central to using camera sensors as input to a vision system is the need to be able to manipulate and process the information captured in these images. One such application is the use of cameras to monitor the quality of airport landing lighting at aerodromes, where a camera is placed inside an aircraft and used to record images of the lighting pattern during the landing phase of a flight. The images are processed to determine a performance metric. This requires the development of custom software for the localisation and identification of luminaires within the image data. However, because of the necessity to keep airport operations functioning as efficiently as possible, it is difficult to collect enough image data to develop, test and validate such software. In this paper, we present a technique to model a virtual landing lighting pattern. A mathematical model is postulated which represents the glide path of the aircraft, including random deviations from the expected path. A morphological method has been developed to localise and track the luminaires under different operating conditions. © 2011 IEEE.
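The paper's morphological method is not described in this abstract; the sketch below shows one common morphology-based way to localise small bright luminaires against a darker background (white top-hat, threshold, connected components). The structuring-element size and intensity threshold are assumed tuning values, not parameters from the paper.

```python
# Illustrative luminaire localisation by morphology -- not the method of the
# paper above, just a standard pattern for isolating small bright spots.
import numpy as np
from scipy import ndimage

def locate_luminaires(gray, struct_size=15, thresh=40):
    """Return (row, col) centroids of bright spots in a 2-D grayscale image.
    struct_size and thresh are assumed tuning parameters."""
    footprint = np.ones((struct_size, struct_size), dtype=bool)
    # White top-hat: image minus its morphological opening, which keeps
    # bright features smaller than the structuring element.
    tophat = ndimage.white_tophat(gray.astype(float), footprint=footprint)
    spots, n = ndimage.label(tophat > thresh)
    return ndimage.center_of_mass(tophat, spots, list(range(1, n + 1)))
```

Tracking the detected luminaires from frame to frame would then be a separate association step on these centroids.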
Abstract:
ULTRACAM is a high-speed, three-colour CCD camera designed to provide imaging photometry at high temporal resolutions. The instrument is highly portable and will be used at a number of large telescopes around the world. ULTRACAM was successfully commissioned on the 4.2-m William Herschel Telescope on La Palma on 16 May 2002, over three months ahead of schedule and within budget. The instrument was funded by PPARC and designed and built by a consortium involving the Universities of Sheffield and Southampton and the UKATC, Edinburgh. We present an overview of the design and performance characteristics of ULTRACAM and highlight some of its most recent scientific results.