857 results for analog optical signal processing
Abstract:
We have designed and fabricated a new type of fibre Bragg grating (FBG) with a V-shaped dispersion profile for multi-channel dispersion compensation in communication links.
Abstract:
Through numerical modeling, we illustrate the possibility of a new approach to digital signal processing in coherent optical communications based on the application of the so-called inverse scattering transform. Considering without loss of generality a fiber link with normal dispersion and quadrature phase shift keying signal modulation, we demonstrate how an initial information pattern can be recovered (without direct backward propagation) through the calculation of nonlinear spectral data of the received optical signal. © 2013 Optical Society of America.
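The "nonlinear spectral data" referred to above are the scattering coefficients of the Zakharov-Shabat problem associated with the fiber's nonlinear Schrödinger equation. The following is a minimal numerical sketch of what such a calculation involves, using a piecewise-constant transfer-matrix product; the function name, discretisation, and test signal are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import expm

def zs_scattering(q, dt, xi, sigma=+1):
    """Scattering coefficients a(xi), b(xi) of the Zakharov-Shabat
    problem for a sampled signal q, via a piecewise-constant
    transfer-matrix product.
    sigma=+1: defocusing (normal dispersion), sigma=-1: focusing."""
    n = len(q)
    t0 = -0.5 * n * dt                            # signal window centred on t = 0
    v = np.array([np.exp(-1j * xi * t0), 0.0j])   # plane-wave boundary condition
    for qn in q:
        m = np.array([[-1j * xi, qn],
                      [sigma * np.conj(qn), 1j * xi]])
        v = expm(m * dt) @ v                      # exact step for constant q
    t1 = t0 + n * dt
    a = v[0] * np.exp(1j * xi * t1)               # continuous nonlinear spectrum:
    b = v[1] * np.exp(-1j * xi * t1)              # reflection coefficient r = b/a
    return a, b

# demo: a sech-shaped signal in the defocusing (normal-dispersion) regime
t = np.linspace(-10, 10, 400, endpoint=False)
q = 0.8 / np.cosh(t)
a, b = zs_scattering(q, t[1] - t[0], xi=0.3)
unimodularity = abs(a) ** 2 - abs(b) ** 2         # = 1 for real xi (defocusing)
```

The conservation law |a|² − |b|² = 1 (for real spectral parameter in the defocusing case) is a convenient sanity check on the discretisation.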
Abstract:
Optical differentiators constitute a basic device for analog all-optical signal processing [1]. Fiber grating approaches, both fiber Bragg gratings (FBGs) and long-period gratings (LPGs), constitute an attractive solution because of their low cost, low insertion losses, and full compatibility with fiber-optic systems. A first-order differentiator based on an LPG was proposed and demonstrated in [2], but FBGs may be preferred in applications with a bandwidth of up to a few nm because of the extreme sensitivity of LPGs to environmental fluctuations [3]. Several FBG approaches have been proposed [3-6], all requiring one or more additional optical elements to create a first-order differentiator. A very simple, single-element FBG approach was proposed in [7] for first-order differentiation, applying the well-known logarithmic Hilbert transform relation between the amplitude and phase of an FBG in transmission [8]. Using this relationship in the design process, it was theoretically and numerically demonstrated that a single FBG in transmission can be designed to simultaneously approximate the amplitude and phase of a first-order differentiator spectral response, without the need for any additional elements. © 2013 IEEE.
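The target spectral response of a first-order differentiator is H(ω) = iω about the carrier. A short numerical check of that target response (illustrative only, not the FBG design procedure itself) applies the filter to a Gaussian envelope and compares against the analytic derivative:

```python
import numpy as np

# ideal first-order differentiator: multiply the pulse spectrum by i*omega
t = np.linspace(-10, 10, 1024, endpoint=False)
dt = t[1] - t[0]
pulse = np.exp(-t ** 2)                          # Gaussian input envelope

omega = 2 * np.pi * np.fft.fftfreq(t.size, d=dt)
out = np.fft.ifft(1j * omega * np.fft.fft(pulse))

exact = -2 * t * pulse                           # analytic d/dt of the Gaussian
max_err = np.max(np.abs(out - exact))            # near machine precision
```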
Abstract:
The ever-increasing demand for information transmission capacity has been met with technological advances in telecommunication systems, such as the implementation of coherent optical systems, advanced multilevel multidimensional modulation formats, fast signal processing, and research into new physical media for signal transmission (e.g. a variety of new types of optical fibers). Since the increase in signal-to-noise ratio makes fiber communication channels essentially nonlinear (due, for example, to the Kerr effect), the problem of estimating the Shannon capacity of nonlinear communication channels is not only conceptually interesting but also practically important. Here we discuss various nonlinear communication channels and review the potential of different optical signal coding, transmission and processing techniques to improve the fiber-optic Shannon capacity and to increase the system reach.
Abstract:
The development of new all-optical technologies for data processing and signal manipulation is a field of growing importance with a strong potential for numerous applications in diverse areas of modern science. Nonlinear phenomena occurring in optical fibres have many attractive features and great, but not yet fully explored, potential in signal processing. Here, we review recent progress on the use of fibre nonlinearities for the generation and shaping of optical pulses and on the applications of advanced pulse shapes in all-optical signal processing. Amongst other topics, we will discuss ultrahigh repetition rate pulse sources, the generation of parabolic shaped pulses in active and passive fibres, the generation of pulses with triangular temporal profiles, and coherent supercontinuum sources. The signal processing applications will span optical regeneration, linear distortion compensation, optical decision at the receiver in optical communication systems, spectral and temporal signal doubling, and frequency conversion. © Copyright 2012 Sonia Boscolo and Christophe Finot.
Abstract:
Signal processing techniques for mitigating intra-channel and inter-channel fiber nonlinearities are reviewed. More detailed descriptions of three specific examples highlight the diversity of the electronic and optical approaches that have been investigated.
Abstract:
Structural health monitoring (SHM) is the term applied to the procedure of monitoring a structure's performance, assessing its condition and carrying out appropriate retrofitting so that it performs reliably, safely and efficiently. Bridges form an important part of a nation's infrastructure. They deteriorate due to age and changing load patterns, and hence early detection of damage helps prolong their service lives and prevent catastrophic failures. Monitoring of bridges has traditionally been done by means of visual inspection. With recent developments in sensor technology and the availability of advanced computing resources, newer techniques have emerged for SHM. Acoustic emission (AE) is one such technology that is attracting the attention of engineers and researchers around the world. This paper discusses the use of AE technology in the health monitoring of bridge structures, with a special focus on the analysis of recorded data. AE waves are stress waves generated by mechanical deformation of a material and can be recorded by means of sensors attached to the surface of the structure. Analysis of the AE signals provides vital information regarding the nature of the source of emission. Signal processing of the AE waveform data can be carried out in several ways and is predominantly based on the time and frequency domains. The short-time Fourier transform and wavelet analysis have proved to be superior alternatives to traditional frequency-based analysis in extracting information from recorded waveforms. Some preliminary results of the application of these analysis tools to the signal processing of recorded AE data are presented in this paper.
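The advantage of the short-time Fourier transform over a plain spectrum is that it localises a burst in both time and frequency. A small synthetic sketch (the sampling rate, burst frequency, and decay constant are assumed for illustration, not taken from the paper's data):

```python
import numpy as np
from scipy.signal import stft

fs = 1_000_000                        # 1 MHz sampling rate (assumed)
t = np.arange(0, 0.01, 1 / fs)

# synthetic AE burst: a 150 kHz tone with exponential decay, arriving at 4 ms
arrival = 0.004
burst = np.where(
    t >= arrival,
    np.exp(-4000 * (t - arrival)) * np.sin(2 * np.pi * 150e3 * (t - arrival)),
    0.0,
)
x = burst + 0.05 * np.random.default_rng(0).standard_normal(t.size)

# the STFT magnitude peaks at the burst's frequency AND arrival time
f, tau, Z = stft(x, fs=fs, nperseg=256)
fi, ti = np.unravel_index(np.argmax(np.abs(Z)), Z.shape)
peak_freq, peak_time = f[fi], tau[ti]  # ~150 kHz, ~4 ms
```

A plain Fourier transform of the full record would recover the 150 kHz content but discard the arrival time, which is precisely the information needed for source localisation.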
Abstract:
The development and use of a virtual assessment tool for a signal processing unit is described. It allows students to take a test from anywhere, using a web browser to connect to the university server that hosts the test. While student responses are of the multiple-choice type, students have to work out problems to arrive at the answer to be entered. CGI programming is used to verify student identification information and record scores, as well as to provide immediate feedback after the test is complete. The tool has been used at QUT for the past three years, and student feedback is discussed. The virtual assessment tool is an efficient alternative to marking written assignment reports, which can often take a lecturer or tutor more hours than actual lecture-hall contact. It is especially attractive for the very large first- and second-year classes that are now the norm at many universities.
Abstract:
This paper describes the design and implementation of a unique undergraduate program in signal processing at the Queensland University of Technology (QUT). The criteria that influenced the choice of the subjects and the laboratories developed to support them are presented. A recently established Signal Processing Research Centre (SPRC) has played an important role in the development of the signal processing teaching program. The SPRC also provides training opportunities for postgraduate studies and research.
Abstract:
The School of Electrical and Electronic Systems Engineering at Queensland University of Technology, Brisbane, Australia (QUT), offers three bachelor degree courses in electrical and computer engineering. In all its courses there is a strong emphasis on signal processing. A newly established Signal Processing Research Centre (SPRC) has played an important role in the development of the signal processing units in these courses. This paper describes the unique design of the undergraduate program in signal processing at QUT, the laboratories developed to support it, and the criteria that influenced the design.
Abstract:
The School of Electrical and Electronic Systems Engineering of Queensland University of Technology (like many other universities around the world) has recognised the importance of complementing the teaching of signal processing with computer-based experiments. A laboratory has been developed to provide a "hands-on" approach to the teaching of signal processing techniques. The motivation for the development of this laboratory was the cliché "What I hear I remember, but what I do I understand." The laboratory has been named the "Signal Computing and Real-time DSP Laboratory" and provides practical training to approximately 150 final-year undergraduate students each year. The paper describes the novel features of the laboratory, the techniques used in laboratory-based teaching, interesting aspects of the experiments that have been developed, and student evaluation of the teaching techniques.
Abstract:
A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high-frequency condition monitoring (CM) techniques and for low-speed machine applications, since the combination of a high sampling frequency and a low rotating speed generally leads to large, unwieldy data sizes. The effectiveness of the algorithm was evaluated and tested on four sets of data in the study. One set was extracted from the condition monitoring signal of a practical industrial application. Another was acquired from a low-speed machine test rig in the laboratory. The remaining two sets were computer-simulated bearing defect signals containing either a single bearing defect or multiple defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all the data sets used in this work, even when a large down-sampling ratio is used (e.g., 500-times down-sampling). In contrast, the conventional down-sampling technique in signal processing eliminates useful and critical information, such as bearing defect frequencies, when the same down-sampling ratio is employed. Noise and artificial frequency components are also introduced by the conventional technique, thus limiting its usefulness for machine condition monitoring applications.
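The peak-hold idea described above can be sketched as block-wise peak retention: each block of `ratio` samples is replaced by its largest-magnitude sample, so short impulsive bearing-defect bursts survive the reduction that plain decimation would discard. This is a reading of the abstract, not the published algorithm:

```python
import numpy as np

def peak_hold_downsample(x, ratio):
    """Down-sample x by `ratio`, keeping from each non-overlapping block
    the sample with the largest absolute value (sign preserved)."""
    n = (len(x) // ratio) * ratio            # drop the incomplete tail block
    blocks = np.asarray(x[:n], dtype=float).reshape(-1, ratio)
    peak_idx = np.argmax(np.abs(blocks), axis=1)
    return blocks[np.arange(blocks.shape[0]), peak_idx]

# demo: sparse "defect" impulses buried in noise, 500x reduction
rng = np.random.default_rng(1)
x = 0.1 * rng.standard_normal(100_000)
x[1_234], x[50_321] = 5.0, -5.0              # impulses NOT on a 500-sample grid
phds = peak_hold_downsample(x, 500)          # 200 samples, impulses retained
naive = x[::500]                             # 200 samples, impulses lost
```

Plain decimation (`x[::500]`) keeps only every 500th sample, so any impulse falling between those samples vanishes; the peak-hold version retains both impulses, including their signs.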