943 results for Digital processing
Digital signal processing and digital system design using discrete cosine transform [student course]
Abstract:
The discrete cosine transform (DCT) is an important functional block for image processing applications. The implementation of a DCT has been viewed as a specialized research task. We apply a micro-architecture based methodology to the hardware implementation of an efficient DCT algorithm in a digital design course. Several circuit optimization and design space exploration techniques at the register-transfer and logic levels are introduced in class for generating the final design. The students not only learn how the algorithm can be implemented, but also receive insights about how other signal processing algorithms can be translated into a hardware implementation. Since signal processing has very broad applications, the study and implementation of an extensively used signal processing algorithm in a digital design course significantly enhances the learning experience in both digital signal processing and digital design areas for the students.
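As an illustrative sketch (not the course's actual register-transfer design), the row-column decomposition that hardware DCT implementations exploit can be shown in a few lines: the 2-D DCT factors into two passes of a 1-D DCT, so a single 1-D unit can be reused for rows and then, via a transpose buffer, for columns.

```python
import numpy as np

def dct_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II basis matrix C, so y = C @ x is the 1-D DCT."""
    k = np.arange(n).reshape(-1, 1)      # frequency index
    i = np.arange(n).reshape(1, -1)      # sample index
    c = np.cos(np.pi * k * (2 * i + 1) / (2 * n))
    c[0, :] *= 1 / np.sqrt(2)            # DC row scaling
    return c * np.sqrt(2 / n)

def dct_2d(block: np.ndarray) -> np.ndarray:
    """2-D DCT by the separable row-column method: C @ X @ C^T.
    Hardware designs exploit this separability: one 1-D DCT unit is
    applied to the rows, then (through a transpose buffer) to the columns."""
    c = dct_matrix(block.shape[0])
    return c @ block @ c.T

# An 8x8 image block, the size typically used in image coding
block = np.arange(64, dtype=float).reshape(8, 8)
coeffs = dct_2d(block)
```

Because the basis matrix is orthonormal, the inverse transform is simply the transposed pass, which is why the same datapath serves both directions.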
Abstract:
The paper showcases the field- and lab-documentation system developed for Kinneret Regional Project, an international archaeological expedition to the northwestern shore of the Sea of Galilee (Israel) under the auspices of the University of Bern, the University of Helsinki, Leiden University and Wofford College. The core of the data management system is a fully relational, server-based database framework, which also includes time-based and static GIS services, stratigraphic analysis tools and fully indexed document/digital image archives. Data collection in the field is based on mobile, hand-held devices equipped with a custom-tailored stand-alone application. Comprehensive three-dimensional documentation of all finds and findings is achieved by means of total stations and/or high-precision GPS devices. All archaeological information retrieved in the field – including tachymetric data – is synchronized with the core system on the fly and thus immediately available for further processing in the field lab (within the local network) or for post-excavation analysis at remote institutions (via the WWW). Besides a short demonstration of the main functionalities, the paper also presents some of the key technologies used and illustrates usability aspects of the system's individual components.
Abstract:
Monument conservation depends on the interaction between the original petrological parameters of the rock and external factors at the site where the building stands, such as weather conditions, pollution, and so on. Depending on the environmental conditions and the characteristics of the materials used, different types of weathering predominate. In all cases, the appearance of surface crusts constitutes a first stage, whose origin can often be traced to the properties of the material itself. In the present study, different colours of “patinas” were distinguished by defining the threshold grey levels associated with each “pathology” in the image histogram. These data were compared with background information and other parameters, such as mineralogical composition, porosity, and so on, as well as other visual signs of deterioration. The result is a map of the pathologies associated with “cover films” on monuments, generated by relating colour characteristics to the properties or zones of interest.
Abstract:
Mining in the Iberian Pyrite Belt (IPB), the biggest VMS metallogenetic province known in the world to date, faces a deep crisis in spite of the huge reserves still known after ≈5,000 years of production. This is due to several factors, such as the difficult processing of the complex Cu-Pb-Zn-Ag-Au ores, the exhaustion of the oxidation-zone orebodies (the richest in gold, in gossan), the scarce demand for sulphuric acid on the world market, and harder environmental regulations. Of these factors, only the first and the last can be addressed by local ore geologists. A reactivation of mining can therefore only be achieved by improved and more efficient ore processing, under the constraint of strict environmental controls. Digital image analysis of the ores, coupled with reflected-light microscopy, provides a quantified and reliable mineralogical and textural characterization of the ores. The automation of the procedure for the first time furnishes process engineers with real-time information to improve the process and to preclude or control pollution; it can be applied to metallurgical tailings as well. This is shown by some examples from the IPB.
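As a sketch of the kind of quantitative output such image analysis delivers (not the IPB system itself), modal mineralogy can be computed from an already-segmented reflected-light image by counting pixels per phase. The phase names and the segmentation below are invented for illustration.

```python
import numpy as np

# Hypothetical phase labels for a segmented reflected-light image
# (the segmentation itself would come from the microscope + image system).
PHASES = {0: "pyrite", 1: "chalcopyrite", 2: "sphalerite", 3: "gangue"}

def modal_analysis(labels: np.ndarray) -> dict:
    """Area fraction of each phase -- the 'modal' mineralogy a process
    engineer needs -- obtained from per-phase pixel counts."""
    total = labels.size
    return {PHASES[v]: np.count_nonzero(labels == v) / total
            for v in PHASES}

# Toy segmented image: half chalcopyrite, a quarter gangue, rest pyrite
seg = np.zeros((100, 100), dtype=int)
seg[:50, :] = 1
seg[50:, 50:] = 3
fractions = modal_analysis(seg)
```

Run per frame, this is the real-time mineralogical feedback the abstract refers to; textural measures (grain size, liberation) would be computed from the same label image.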
Abstract:
Digital chaotic behavior in an optical processing element is reported. It is obtained as the result of processing two fixed trains of bits. The process is performed with an optically programmable logic gate, previously reported as a possible main block for optical computing. Outputs for some specific conditions of the circuit are given. Digital chaos is obtained using a feedback configuration. Period doublings in a Feigenbaum-like scenario are obtained. A new method to characterize this type of digital chaos is reported.
Abstract:
Digital chaotic behavior in an optical processing element is reported. It is obtained as the result of processing two fixed trains of bits. The process is performed with an optically programmable logic gate. Possible outputs for some specific conditions of the circuit are given. These outputs have some fractal characteristics when input variations are considered. Digital chaotic behavior is obtained by using a feedback configuration. A random-like bit generator is presented.
Abstract:
Bibliography: p. 74.
Abstract:
Thesis (M. S.)--University of Illinois at Urbana-Champaign.
Abstract:
Thesis (M. S.)--University of Illinois at Urbana-Champaign.
Abstract:
Thesis (M.S.)--University of Illinois at Urbana-Champaign.
Abstract:
Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace- and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is often difficult. In this thesis, methods for solving some of these queueing problems by use of digital signal processing techniques are presented. The z-transform of the queue-length distribution for the M^j/G^Y/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
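One standard DSP route for such inversions (a textbook baseline, not necessarily either of the thesis's two methods) is to sample the generating function on a circle inside its region of convergence and recover the coefficients with an FFT. The sketch below uses the M/M/1 queue-length distribution, whose inverse is known in closed form, as a check.

```python
import numpy as np

def invert_pgf(pgf, n_terms: int, radius: float = 1.0) -> np.ndarray:
    """Recover p_0 .. p_{N-1} from a probability generating function
    P(z) = sum_n p_n z^n by sampling on a circle of the given radius
    (inside the region of convergence) and applying the FFT.
    The aliasing error in p_n is p_{n+N} * radius**N, negligible
    for a decaying distribution."""
    N = n_terms
    z = radius * np.exp(2j * np.pi * np.arange(N) / N)
    samples = pgf(z)
    # fft(samples)[n] = sum_k samples[k] e^{-2*pi*i*k*n/N} ~= N * p_n * r^n
    return np.fft.fft(samples).real / (N * radius ** np.arange(N))

# Known test case: M/M/1 queue-length PGF P(z) = (1-rho)/(1-rho*z),
# whose exact inverse is p_n = (1-rho) * rho**n.
rho = 0.5
probs = invert_pgf(lambda z: (1 - rho) / (1 - rho * z), n_terms=64)
```

The same sampling idea carries over to Laplace transforms, where the circle is replaced by a contour in the s-plane.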
Abstract:
The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for the time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially-available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially-available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.
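The data path of the proposed arrangement can be sketched as follows: the 1-D detector array captures a whole vertical column of illuminance values at once, so the mechanical stage only steps through horizontal positions, and the columns are assembled into the 2-D array behind the iso-lux diagram. All numbers here are illustrative; the beam model merely stands in for the physical measurement.

```python
import numpy as np

def beam_illuminance(h, v):
    """Stand-in for the physical measurement: a Gaussian-ish headlamp
    beam pattern (lux) at horizontal/vertical angles (degrees)."""
    return 100.0 * np.exp(-(h ** 2) / 8.0 - (v - 1.0) ** 2 / 2.0)

# The 1-D photodetector array covers the vertical axis in one shot;
# the mechanical scan steps through horizontal positions only.
v_angles = np.linspace(-5, 5, 64)
h_angles = np.linspace(-15, 15, 128)

iso_lux = np.column_stack(
    [beam_illuminance(h, v_angles) for h in h_angles]
)   # shape (64, 128): one column per horizontal scan step

# Iso-lux contours are then drawn from this array, e.g. the 50-lux region:
above_50 = iso_lux >= 50.0
```

Replacing the two-dimensional scan with 128 single-axis steps is exactly where the quoted speed-up (minutes to seconds) comes from.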
Abstract:
Through numerical modeling, we illustrate the possibility of a new approach to digital signal processing in coherent optical communications based on the application of the so-called inverse scattering transform. Considering without loss of generality a fiber link with normal dispersion and quadrature phase shift keying signal modulation, we demonstrate how an initial information pattern can be recovered (without direct backward propagation) through the calculation of nonlinear spectral data of the received optical signal. © 2013 Optical Society of America.
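To make "calculation of nonlinear spectral data" concrete, here is a minimal transfer-matrix (layer-peeling) solver for the scattering coefficient a(ζ) of the Zakharov-Shabat problem with the defocusing sign, which corresponds to normal dispersion. This is a generic textbook sketch, not the paper's implementation; signal, grid, and ζ value are illustrative.

```python
import numpy as np

def scattering_a(q: np.ndarray, t: np.ndarray, zeta: float) -> complex:
    """Scattering coefficient a(zeta) of the Zakharov-Shabat system
    (defocusing sign, i.e. normal dispersion), computed by treating
    q(t) as piecewise constant over each sample and multiplying the
    per-sample transfer matrices (Cayley-Hamilton form of exp(M*dt))."""
    dt = t[1] - t[0]
    v = np.array([np.exp(-1j * zeta * t[0]), 0.0], dtype=complex)
    for qn in q:
        M = np.array([[-1j * zeta, qn], [np.conj(qn), 1j * zeta]])
        k = np.sqrt(abs(qn) ** 2 - zeta ** 2 + 0j)   # M @ M = k**2 * I
        if abs(k) < 1e-12:
            T = np.eye(2) + dt * M                   # small-k limit
        else:
            T = np.cosh(k * dt) * np.eye(2) + (np.sinh(k * dt) / k) * M
        v = T @ v
    # strip the free-evolution phase to read off a(zeta)
    return v[0] * np.exp(1j * zeta * (t[-1] + dt))

t = np.linspace(-10, 10, 2000)
pulse = 0.3 * np.exp(-t ** 2)          # a smooth illustrative pulse
a_val = scattering_a(pulse, t, zeta=0.5)
```

Evaluating a(ζ) (and the companion coefficient b(ζ)) across the spectral parameter gives the nonlinear spectrum from which, as the abstract states, the transmitted pattern can be recovered without backward propagation.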
Abstract:
Coherent optical orthogonal frequency division multiplexing (CO-OFDM) has been actively considered as a potential candidate for long-haul transmission and 400 Gb/s to 1 Tb/s Ethernet transport because of its high spectral efficiency, efficient implementation, flexibility and robustness against linear impairments such as chromatic dispersion and polarization mode dispersion. However, due to the long symbol duration and narrow subcarrier spacing, CO-OFDM systems are sensitive to laser phase noise and fibre-nonlinearity-induced penalties. As a result, the development of CO-OFDM transmission technology crucially relies on efficient techniques to compensate for the laser phase noise and fibre nonlinearity impairments. In this thesis, high-performance and low-complexity digital signal processing techniques for laser phase noise and fibre nonlinearity compensation in CO-OFDM transmissions are demonstrated. For laser phase noise compensation, three novel techniques, namely quasi-pilot-aided, decision-directed-free blind and multiplier-free blind, are introduced. For fibre nonlinearity compensation, two novel techniques, referred to as phase-conjugated pilots and phase-conjugated subcarrier coding, are proposed. All of the above-mentioned digital signal processing techniques offer high performance and flexibility while requiring relatively low complexity in comparison with other existing phase noise and nonlinear compensation techniques. As a result of the development of these digital signal processing techniques, CO-OFDM technology is expected to play a significant role in future ultra-high-capacity optical networks. In addition, this thesis presents a preliminary study on nonlinear Fourier transform based transmission schemes, for which OFDM is a highly suitable modulation format. The results obtained pave the way towards a truly flexible nonlinear wave-division multiplexing system that allows the current nonlinear transmission limitations to be exceeded.
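The building block that pilot-aided phase-noise schemes refine is common-phase-error correction: the laser phase noise rotates every subcarrier of an OFDM symbol by the same angle, which known pilots reveal. The sketch below is this textbook baseline, not the thesis's quasi-pilot-aided algorithm; constellation, pilot spacing, and phase value are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

N_SC = 128                           # subcarriers per OFDM symbol
PILOT_IDX = np.arange(0, N_SC, 16)   # 8 equally spaced pilot subcarriers
QPSK = np.exp(1j * np.pi / 4) * np.array([1, 1j, -1, -1j])

tx = QPSK[rng.integers(0, 4, N_SC)]      # one OFDM symbol (frequency domain)
cpe = 0.4                                # common phase error (radians)
rx = tx * np.exp(1j * cpe)               # laser phase noise rotates all subcarriers

# Estimate the common phase from the pilots the receiver knows, then derotate.
est = np.angle(np.sum(rx[PILOT_IDX] * np.conj(tx[PILOT_IDX])))
corrected = rx * np.exp(-1j * est)
```

Because the rotation is common to all subcarriers, a handful of pilots suffices; reducing or eliminating this pilot overhead is precisely what the blind and quasi-pilot-aided techniques in the thesis target.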
Abstract:
This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals with moderate to severe visual aberrations for whom conventional means of compensation, such as glasses or contact lenses, do not improve vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to this user, will counteract his/her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. The data collected from these experiments were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision on all of the systems. Although significant, the improvement was not as large as expected in the human-subject tests. Further analysis suggests that, even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing; this would require real-time monitoring of relevant variables (e.g. pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
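The core pre-compensation idea can be sketched as inverse filtering: if the eye's blur is modeled as convolution with a point-spread function derived from the wavefront data, dividing the image spectrum by the optical transfer function yields a displayed image that the aberrated eye perceives as the original. This is a generic sketch, not the dissertation's method; the Gaussian PSF stands in for a measured aberration, and the dynamic-range clipping that limits real displays is ignored.

```python
import numpy as np

def gaussian_psf(n: int, sigma: float) -> np.ndarray:
    """Stand-in PSF; a real system would derive it from the user's
    measured wavefront aberration."""
    ax = np.arange(n) - n // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

def precompensate(image: np.ndarray, psf: np.ndarray) -> np.ndarray:
    """Divide by the optical transfer function: displaying the result
    to an eye that blurs with `psf` reproduces (circularly) the original.
    Real displays clip the out-of-range pixel values this produces,
    which is one of the shortcomings discussed in the abstract."""
    otf = np.fft.fft2(np.fft.ifftshift(psf))
    return np.real(np.fft.ifft2(np.fft.fft2(image) / otf))

n = 32
psf = gaussian_psf(n, sigma=1.2)
target = np.zeros((n, n)); target[12:20, 12:20] = 1.0   # test pattern
displayed = precompensate(target, psf)

# What the aberrated eye perceives: the displayed image blurred by the PSF
perceived = np.real(np.fft.ifft2(np.fft.fft2(displayed)
                                 * np.fft.fft2(np.fft.ifftshift(psf))))
```

In practice the inverse must be regularized where the OTF is near zero, and, as the abstract notes, the PSF itself drifts with variables such as pupil diameter, motivating real-time re-estimation.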