961 results for Techniques: image processing


Relevance: 30.00%

Abstract:

This thesis presents experimental investigations of the use of semiconductor optical amplifiers in a nonlinear loop mirror (SOA-NOLM) and its application to all-optical processing. The techniques used are mainly experimental and are divided into three major applications. Initially, the semiconductor optical amplifier (SOA) is experimentally characterised and the optimum operating condition is identified. An interferometric switch based on a Sagnac loop with the SOA as the nonlinear element is employed to realise all-optical switching, a very attractive alternative to optoelectronic conversion because it avoids converting from the optical to the electronic domain and back again. The first major investigation involves carrier-suppressed return-to-zero (CSRZ) format conversion and transmission, covering both single-channel and four-channel WDM operation. The optical bandwidth which limits the conversion is investigated, and the improved nonlinear tolerance of CSRZ transmission is demonstrated, showing the suitability of this format for enhancing system performance. Second, a symmetrical switching window is studied in the SOA-NOLM, in which two similar control pulses are injected into the SOA from opposite directions; the switching window is symmetric when these two control pulses have the same power and arrive in the SOA at the same time. Finally, I study an all-optical circulating shift register with an inverter. The detailed behaviour of the blocks of zeros and ones has been analysed in terms of transient measurements, and good agreement with a simple model of the shift register is obtained. The transient can be reduced, but at the cost of the extinction ratio of the pulses.
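
As a purely illustrative aside (not taken from the thesis), the switching behaviour of a Sagnac-loop interferometer with a 50:50 coupler can be sketched numerically: the transmitted fraction is sin^2(dphi/2), where dphi is the differential phase the SOA imposes on the counter-propagating signals after a control pulse arrives. The instantaneous-rise, exponential-recovery phase model and every parameter value below are assumptions chosen only to show the shape of a switching window.

# Illustrative sketch, not from the thesis: switching window of a Sagnac
# interferometer with a 50:50 coupler, T(t) = sin^2(dphi(t)/2), where dphi(t)
# is the differential phase the SOA imposes on the counter-propagating
# signals after a control pulse arrives. The phase model (instantaneous rise,
# exponential carrier recovery) and all numbers are assumed for illustration.
import numpy as np

t = np.linspace(-50e-12, 200e-12, 2001)   # time axis (s)
t_arrival = 0.0                           # control-pulse arrival time (s)
phi_max = np.pi                           # assumed peak phase shift (rad)
tau_rec = 80e-12                          # assumed SOA carrier recovery time (s)

dphi = np.where(t < t_arrival, 0.0,
                phi_max * np.exp(-(t - t_arrival) / tau_rec))
transmission = np.sin(dphi / 2.0) ** 2    # NOLM switching window

above_half = t[transmission > 0.5 * transmission.max()]
print(f"peak transmission: {transmission.max():.2f}")
print(f"window width at half maximum: {(above_half[-1] - above_half[0]) * 1e12:.1f} ps")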

Relevance: 30.00%

Abstract:

Improving bit error rates in optical communication systems is a difficult and important problem. The error correction must take place at high speed and be extremely accurate. We show the feasibility of using hardware-implementable machine learning techniques, which may enable some error correction at the speed required.
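
As a toy illustration of the general idea only (this is not the system described above), the sketch below trains a small, hardware-friendly linear decision rule on a window of received samples from a crude channel model with inter-symbol interference, and compares its bit error rate against a plain threshold detector. The channel taps, noise level, window size and least-squares training are all assumptions made for this sketch.

# Toy illustration, not the authors' system: a tiny, hardware-friendly linear
# classifier that decides each bit from a window of received samples, the kind
# of structure that maps well onto high-speed logic. Channel model, window
# size and training method are assumptions made for this sketch.
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 20000)

# crude channel: inter-symbol interference plus Gaussian noise
h = np.array([0.2, 1.0, 0.3])                      # assumed channel taps
rx = np.convolve(2 * bits - 1, h, mode="same") + 0.6 * rng.normal(size=bits.size)

# build a 3-sample window around each symbol
X = np.stack([np.roll(rx, 1), rx, np.roll(rx, -1)], axis=1)

# fit a linear decision rule on the first half of the data (least squares)
n_train = bits.size // 2
w, *_ = np.linalg.lstsq(X[:n_train], 2 * bits[:n_train] - 1, rcond=None)

hard = (rx[n_train:] > 0).astype(int)              # plain threshold decision
learned = (X[n_train:] @ w > 0).astype(int)        # learned linear decision
ref = bits[n_train:]
print("hard-decision BER:     ", np.mean(hard != ref))
print("linear-classifier BER: ", np.mean(learned != ref))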

Relevance: 30.00%

Abstract:

GMA-grafted EPR samples were prepared in a Thermo-Haake internal mixer by free-radical melt grafting reactions in the absence (conventional system; EPR-g-GMA(CONV)) and presence of the reactive comonomer divinylbenzene, DVB (EPR-g-GMA(DVB)). GMA homopolymer (poly-GMA), a major side-reaction product in the conventional system, was almost completely absent in the DVB-containing system, which also gave a much higher level of GMA grafting. A comprehensive microstructure analysis of the poly-GMA formed was performed based on one-dimensional 1H and 13C NMR spectroscopy, and the complete spectral assignments were supported by two-dimensional NMR techniques based on long-range two- and three-bond carbon-proton couplings from HMBC (Heteronuclear Multiple Bond Coherence) and one-bond carbon-proton couplings from HSQC (Heteronuclear Single Quantum Coherence), as well as Distortionless Enhancement by Polarization Transfer (DEPT) NMR spectroscopy. The unambiguous analysis of the stereochemical configuration of poly-GMA was further used to help understand the microstructures of the GMA grafts obtained in the two different free-radical melt grafting reactions, the conventional and comonomer-containing systems. In the grafted GMA of the conventional system (EPR-g-GMA(CONV)), the methylene protons of the GMA were found to be sensitive to tetrad configurational sequences, and the results showed that 56% of the GMA sequence in the graft is in an atactic configuration and 42% in a syndiotactic configuration, whereas the poly-GMA was predominantly syndiotactic. The differences between the microstructures of the graft in the conventional (EPR-g-GMA(CONV)) and DVB-containing (EPR-g-GMA(DVB)) systems are also reported. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

In this chapter we present the relevant mathematical background to address two well-defined signal and image processing problems: structured noise filtering and the interpolation of missing data. The former is addressed by recourse to oblique-projection-based techniques, whilst the latter, which can be considered equivalent to impulsive noise filtering, is tackled by appropriate interpolation methods.
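
As a minimal numerical sketch of the structured-noise-filtering idea (the basis choices below are toy assumptions, not the chapter's examples), suppose the signal of interest lies in the range of a known matrix S and the structured noise in the range of a known matrix V. The oblique projector onto range(S) along range(V) then passes the signal component untouched while annihilating the structured noise.

# Minimal sketch of structured-noise filtering by oblique projection
# (illustrative; the subspace choices are toy assumptions). The projector onto
# range(S) along range(V) is
#   E = S (S^T Pv_perp S)^{-1} S^T Pv_perp,  Pv_perp = I - V (V^T V)^{-1} V^T,
# so E x keeps the component in range(S) and annihilates anything in range(V).
import numpy as np

n = 200
t = np.linspace(0, 1, n)

S = np.column_stack([np.sin(2 * np.pi * 5 * t),
                     np.cos(2 * np.pi * 5 * t)])    # assumed signal subspace
V = np.column_stack([np.ones(n), t, t ** 2])        # structured noise: slow drift

signal = S @ np.array([1.0, -0.5])
noise = V @ np.array([0.8, -2.0, 1.5])
x = signal + noise

Pv_perp = np.eye(n) - V @ np.linalg.solve(V.T @ V, V.T)
E = S @ np.linalg.solve(S.T @ Pv_perp @ S, S.T @ Pv_perp)   # oblique projector

x_filtered = E @ x
print("residual error:", np.linalg.norm(x_filtered - signal))  # ~0 up to rounding

In practice the two subspaces would come from models of the signal and of the structured interference; the point of the sketch is only that the projection is oblique rather than orthogonal, so the noise is removed without distorting the signal component.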

Relevance: 30.00%

Abstract:

All-optical technologies for data processing and signal manipulation are expected to play a major role in future optical communications. Nonlinear phenomena occurring in optical fibre have many attractive features and great, though not yet fully exploited, potential in optical signal processing. Here, we overview our recent results and advances in developing novel photonic techniques and approaches to all-optical processing based on fibre nonlinearities. Amongst other topics, we will discuss phase-preserving optical 2R regeneration, the possibility of using parabolic/flat-top pulses for optical signal processing and regeneration, and nonlinear optical pulse shaping. A method for passive nonlinear pulse shaping based on pulse pre-chirping and propagation in a normally dispersive fibre will be presented. The approach provides a simple way of generating various temporal waveforms of fundamental and practical interest. Particular emphasis will be given to the formation and characterization of pulses with a triangular intensity profile. A new technique for doubling/copying optical pulses in both the frequency and time domains using triangular-shaped pulses will also be introduced.
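
To make the pulse-shaping mechanism concrete, here is a hedged numerical sketch (not the simulations behind the results above): a basic symmetric split-step Fourier solution of the nonlinear Schrödinger equation for a pre-chirped Gaussian pulse in a normally dispersive fibre, the combination on which the passive shaping method relies. All fibre and pulse parameters are assumed values chosen only for illustration.

# Illustrative split-step Fourier sketch, not the papers' simulations: a
# pre-chirped Gaussian pulse propagating in a normally dispersive fibre under
# the nonlinear Schroedinger equation, the basic mechanism behind the passive
# pulse-shaping method described above. All parameters are assumed values.
import numpy as np

# grid
N, T_win = 2 ** 12, 100e-12                 # samples, time window (s)
t = (np.arange(N) - N // 2) * (T_win / N)
dt = t[1] - t[0]
omega = 2 * np.pi * np.fft.fftfreq(N, dt)

# assumed fibre and pulse parameters
beta2 = 20e-27            # s^2/m, normal dispersion
gamma = 2e-3              # 1/(W m), nonlinear coefficient
L, steps = 1000.0, 2000   # fibre length (m), split-step segments
dz = L / steps
T0, P0, C = 3e-12, 1.0, -2.0               # pulse width, peak power, pre-chirp

A = np.sqrt(P0) * np.exp(-(1 + 1j * C) * t ** 2 / (2 * T0 ** 2))

half_disp = np.exp(0.25j * beta2 * omega ** 2 * dz)   # half dispersion step
for _ in range(steps):                                # symmetric split-step
    A = np.fft.ifft(half_disp * np.fft.fft(A))
    A = A * np.exp(1j * gamma * np.abs(A) ** 2 * dz)  # full nonlinear step
    A = np.fft.ifft(half_disp * np.fft.fft(A))

power = np.abs(A) ** 2
print(f"output peak power (W): {power.max():.3f}")
print(f"output RMS width (ps): {np.sqrt(np.sum(t ** 2 * power) / np.sum(power)) * 1e12:.2f}")

Varying the assumed chirp parameter and propagation length in a sketch like this is how the interplay of pre-chirp, normal dispersion and self-phase modulation reshapes the intensity profile, which is the effect the passive shaping method exploits.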

Relevance: 30.00%

Abstract:

All-optical data processing is expected to play a major role in future optical communications. Nonlinear effects in optical fibers have attractive applications in optical signal processing. In this paper, we review our recent advances in developing high-speed all-optical processing techniques based on optical fiber nonlinearities.

Relevance: 30.00%

Abstract:

Procedural knowledge is the knowledge required to perform certain tasks, and forms an important part of expertise. A major source of procedural knowledge is natural language instructions. While these readable instructions have been useful learning resources for humans, they are not interpretable by machines. Automatically acquiring procedural knowledge in machine-interpretable formats from instructions has become an increasingly popular research topic because of its potential applications in process automation, yet it remains insufficiently addressed. This paper presents an approach and an implemented system to assist users in automatically acquiring procedural knowledge in structured forms from instructions. We introduce a generic semantic representation of procedures for analysing instructions, on which natural language processing techniques are applied to automatically extract structured procedures from instructions. The method is evaluated in three domains to justify the generality of the proposed semantic representation as well as the effectiveness of the implemented automatic system.
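
As a deliberately naive sketch of the general task only (it is not the paper's system and uses none of its semantic representation), the snippet below pulls simple (action, objects, modifiers) tuples out of imperative instruction sentences using spaCy's part-of-speech tags and dependency parse; the model name and the extraction rules are assumptions made for illustration.

# Toy sketch of the general idea only, not the paper's system: extract simple
# (action, objects, modifiers) tuples from imperative instruction sentences
# using spaCy's POS tags and dependency parse. The rules are deliberately
# naive assumptions for illustration.
import spacy

nlp = spacy.load("en_core_web_sm")   # assumes this model is installed

instructions = (
    "Preheat the oven to 180 degrees. "
    "Mix the flour and the sugar in a large bowl. "
    "Pour the batter into the tin and bake for 40 minutes."
)

steps = []
for sent in nlp(instructions).sents:
    for token in sent:
        # an imperative step typically starts with a main or conjoined verb
        if token.pos_ == "VERB" and token.dep_ in ("ROOT", "conj"):
            objects = [c.text for c in token.children if c.dep_ in ("dobj", "obj")]
            mods = [c.text for c in token.children if c.dep_ == "prep"]
            steps.append({"action": token.lemma_, "objects": objects,
                          "modifiers": mods})

for i, step in enumerate(steps, 1):
    print(i, step)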

Relevance: 30.00%

Abstract:

All-optical data processing is expected to play a major role in future optical communications. Nonlinear effects in optical fibres have many attractive features and great, not yet fully explored, potential in optical signal processing. Here, we overview our recent advances in developing novel techniques and approaches to all-optical processing based on optical fibre nonlinearities.

Relevance: 30.00%

Abstract:

Photonic signal processing is used to implement common-mode signal cancellation across a very wide bandwidth by phase modulating radio frequency (RF) signals onto a narrow-linewidth laser carrier. RF spectra were observed using narrow-band, tunable optical filtering with a scanning Fabry-Perot etalon. Thus, functions conventionally performed using digital signal processing techniques in the electronic domain have been replaced by analog techniques in the photonic domain. This technique demonstrated simultaneous cancellation of signals across a bandwidth of 1400 MHz, limited only by the free spectral range of the etalon. © 2013 David M. Benton.
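
The cancellation principle can be sketched with a simplified baseband model (this is not the published experimental arrangement, and all values are assumed): two RF inputs share a common-mode tone and each carries a unique tone; if they phase modulate the same optical carrier with opposite signs, the common-mode term drops out of the net optical phase and its sideband disappears from the spectrum.

# Simplified baseband sketch of the cancellation principle, not the published
# set-up: two RF inputs share a common-mode tone and each carries its own
# unique tone; phase modulating the same carrier with opposite signs removes
# the common-mode term from the net phase, so its sideband is absent from the
# spectrum. All frequencies and the modulation index are assumptions.
import numpy as np

fs, n = 10e9, 2 ** 16                     # sample rate (Hz), number of samples
t = np.arange(n) / fs
f_common, f_a, f_b = 500e6, 200e6, 800e6  # common-mode and unique tones (Hz)
m = 0.3                                   # phase-modulation index (rad)

s_a = np.sin(2 * np.pi * f_common * t) + np.sin(2 * np.pi * f_a * t)
s_b = np.sin(2 * np.pi * f_common * t) + np.sin(2 * np.pi * f_b * t)

field = np.exp(1j * m * (s_a - s_b))      # opposite-sign phase modulation
spec = np.abs(np.fft.fft(field * np.hanning(n))) ** 2
freqs = np.fft.fftfreq(n, 1 / fs)

def sideband(f):
    return spec[np.argmin(np.abs(freqs - f))]

ref = sideband(f_a)
for label, f in [("unique tone A", f_a), ("unique tone B", f_b),
                 ("common-mode tone", f_common)]:
    print(f"{label}: {10 * np.log10(sideband(f) / ref):7.1f} dB relative to tone A")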

Relevance: 30.00%

Abstract:

Improvements in imaging chips and computer processing power have brought major advances in imaging of the anterior eye. Digitally captured images can be visualised immediately and can be stored and retrieved easily. Anterior ocular imaging techniques using slit-lamp biomicroscopy, corneal topography, confocal microscopy, optical coherence tomography (OCT), ultrasonic biomicroscopy, computerised tomography (CT) and magnetic resonance imaging (MRI) are reviewed. Conventional photographic imaging can be used to quantify corneal topography, corneal thickness and transparency, anterior chamber depth and lateral angle, and crystalline lens position, curvature, thickness and transparency. Additionally, the effects of tumours, foreign bodies and trauma can be localised, the corneal layers can be examined, and the tear film thickness assessed. © 2006 The Authors.

Relevance: 30.00%

Abstract:

The influence of comonomer content in a series of metallocene-based ethylene-1-octene copolymers (m-LLDPE) on their thermo-mechanical, rheological, and thermo-oxidative behaviour during melt processing was examined using a range of characterisation techniques. The amount of branching was calculated from 13C NMR, and differential scanning calorimetry (DSC) and dynamic mechanical analysis (DMA) were employed to determine the effect of short-chain branching (SCB, comonomer content) on the thermal and mechanical characteristics of the polymer. The effect of melt processing at different temperatures on the thermo-oxidative behaviour of the polymers was investigated by examining changes in rheological properties, using both melt flow and capillary rheometry, and the evolution of oxidation products during processing using infrared spectroscopy.

The results show that comonomer content and catalyst type greatly affect the thermal, mechanical and oxidative behaviour of the polymers. For the metallocene polymer series, both DSC and DMA showed that (i) crystallinity and melting temperatures decreased linearly with comonomer content, (ii) the intensity of the β-transition increased, and (iii) the position of the tan δmax peak corresponding to the α-transition shifted to lower temperatures, with higher comonomer content. In contrast, a corresponding Ziegler polymer containing the same level of SCB as one of the m-LLDPE polymers showed different characteristics owing to its more heterogeneous nature: a higher elongational viscosity and a broader double melting peak at higher temperature in the DSC endotherm, indicating a much broader short-chain branch distribution. The thermo-oxidative behaviour of the polymers after melt processing was similarly influenced by the comonomer content. Rheological characteristics and changes in the concentrations of carbonyl and the different unsaturated groups, particularly vinyl, vinylidene and trans-vinylene, during processing of the m-LLDPE polymers showed that polymers with lower levels of SCB gave rise to predominantly crosslinking reactions at all processing temperatures. By contrast, chain scission reactions became more favoured at higher processing temperatures in the higher comonomer-containing polymers. Compared to its metallocene analogue, the Ziegler polymer showed a much higher degree of crosslinking at all temperatures because of the high levels of vinyl unsaturation initially present.
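
As an aside on how DSC-based crystallinity figures of the kind referred to above are conventionally obtained (the enthalpy values below are invented for illustration and are not data from this study), percent crystallinity is usually estimated by comparing the measured melting enthalpy with that of a hypothetically 100% crystalline polyethylene, for which a value of about 293 J/g is often quoted.

# Aside, not data from the paper: percent crystallinity is conventionally
# estimated from the DSC melting endotherm as X_c = dH_m / dH_m_100 * 100,
# with dH_m_100 ~ 293 J/g often quoted for fully crystalline polyethylene.
# The measured enthalpies below are invented purely to illustrate the trend
# of decreasing crystallinity with increasing comonomer (branch) content.
DH_M_100 = 293.0    # J/g, assumed reference value for 100% crystalline PE

samples = {          # hypothetical (comonomer wt%, measured dH_m in J/g)
    "m-LLDPE-1": (5.0, 135.0),
    "m-LLDPE-2": (12.0, 105.0),
    "m-LLDPE-3": (24.0, 70.0),
}

for name, (comonomer, dh_m) in samples.items():
    xc = 100.0 * dh_m / DH_M_100
    print(f"{name}: {comonomer:4.1f} wt% comonomer -> crystallinity {xc:4.1f} %")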

Relevance: 30.00%

Abstract:

A sizeable amount of the testing in eye care requires either the identification of targets such as letters to assess functional vision, or the subjective evaluation of imagery by an examiner. Computers can render a variety of different targets on their monitors and can be used to store and analyse ophthalmic images. However, existing computing hardware tends to be large, screen resolutions are often too low, and objective assessments of ophthalmic images are unreliable. Recent advances in mobile computing hardware and computer-vision systems can be used to enhance clinical testing in optometry. High-resolution touch screens embedded in mobile devices can render targets at a wide variety of distances and can be used to record and respond to patient responses, automating testing methods. This has opened up new opportunities in computerised near vision testing. Equally, new image processing techniques can be used to increase the validity and reliability of objective computer-vision systems.

Three novel apps for assessing reading speed, contrast sensitivity and amplitude of accommodation were created by the author to demonstrate the potential of mobile computing to enhance clinical measurement. The reading speed app could present sentences effectively, control illumination and automate the testing procedure for reading speed assessment. The contrast sensitivity app made use of a bit-stealing technique and a swept-frequency target to rapidly assess a patient's full contrast sensitivity function at both near and far distances. Finally, customised electronic hardware was created and interfaced to an app on a smartphone device to allow free-space measurement of the amplitude of accommodation. A new geometrical model of the tear film and a ray-tracing simulation of a Placido disc topographer were produced to provide insights into the effect of tear film breakdown on ophthalmic images. Furthermore, a new computer-vision system, using a novel eyelash segmentation technique, was created to demonstrate the potential of computer vision for the clinical assessment of tear stability.

Studies undertaken by the author to assess the validity and repeatability of the novel apps found that their repeatability was comparable to, or better than, existing clinical methods for reading speed and contrast sensitivity assessment. Furthermore, the apps offered reduced examination times in comparison with their paper-based equivalents. The reading speed and amplitude of accommodation apps correlated highly with existing methods of assessment, supporting their validity. There remain questions over the validity of using a swept-frequency sine-wave target to assess patients' contrast sensitivity functions, as no existing clinical test provides an equivalent range of spatial frequencies and contrasts, nor equivalent assessment at distance and near. A validation study of the new computer-vision system found that the author's tear metric correlated better with existing subjective measures of tear film stability than did those of a competing computer-vision system; however, its repeatability was poor in comparison with the subjective measures due to eyelash interference. The new mobile apps, computer-vision system, and studies outlined in this thesis provide further insight into the potential of applying mobile and image processing technology to enhance clinical testing by eye care professionals.
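
As an illustrative sketch only (this is not the app's stimulus code, and the image size, frequency range and contrast range are assumptions), the snippet below generates the kind of swept-frequency target mentioned above: a sine-wave grating whose spatial frequency sweeps logarithmically across the image and whose Michelson contrast sweeps logarithmically down it. The bit-stealing grey-level technique referred to in the abstract is not reproduced here.

# Illustrative sketch only, not the app's stimulus code: a sine-wave grating
# whose spatial frequency sweeps logarithmically across the image and whose
# contrast sweeps logarithmically down it, the kind of swept-frequency target
# used for rapid contrast-sensitivity screening. Image size, frequency range
# and contrast range are assumptions; bit stealing is not reproduced here.
import numpy as np

width, height = 1024, 512
x = np.linspace(0.0, 1.0, width)
y = np.linspace(0.0, 1.0, height)[:, None]

f_min, f_max = 2.0, 60.0            # cycles per image width
c_min, c_max = 0.005, 1.0           # Michelson contrast range

# instantaneous frequency sweeps log-linearly; integrate it to get the phase
freq = f_min * (f_max / f_min) ** x
phase = 2 * np.pi * np.cumsum(freq) / width
contrast = c_max * (c_min / c_max) ** y          # high contrast at the top

grating = 0.5 + 0.5 * contrast * np.sin(phase)   # luminance in [0, 1]
image8 = np.round(grating * 255).astype(np.uint8)
print(image8.shape, image8.min(), image8.max())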