999 results for Array processing


Relevance:

30.00%

Abstract:

The authors present a super-fast scanning (SFS) technique for phased array weather radar applications. The fast scanning feature of the SFS technique is described, its drawbacks are identified, and techniques that mitigate these drawbacks are presented. A concept design phased array radar system (CDPAR) is used as a benchmark to compare the performance of a conventional scanning phased array radar system with the SFS technique. It is shown that the SFS technique, in association with suitable waveform processing, can achieve four times the scanning speed of the conventional phased array benchmark with similar accuracy.


This paper presents a rectangular array antenna with a suitable signal-processing algorithm that is able to steer the beam in azimuth over a wide frequency band. In a previous approach reported in the literature, an inverse discrete Fourier transform technique was proposed for obtaining the signal weighting coefficients. That approach was demonstrated for large arrays in which the physical parameters of the antenna elements were not considered. In this paper, a modified signal-weighting algorithm that works for arbitrary-size arrays is described. Its validity is demonstrated in examples of moderate-size arrays with real antenna elements. It is shown that in some cases the original beam-forming algorithm fails, while the new algorithm is able to form the desired radiation pattern over a wide frequency band. The performance of the new algorithm is assessed for two cases: when the mutual coupling between array elements is neglected, and when it is taken into account.


This article presents the design of a wideband rectangular array of planar monopoles, which is able to steer its beam and nulls over a wide frequency band using real-valued weights. These weights can be realized in practice by amplifiers or attenuators, leading to low-cost development of a wideband array antenna with beam- and null-steering capability. The weights are determined by applying an inverse discrete Fourier transform to an assumed radiation pattern. This wideband beam- and null-forming concept is verified by full electromagnetic simulations which take into account mutual coupling effects between the array elements.
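The core inverse-DFT weighting idea can be illustrated with a minimal NumPy sketch for a uniform half-wavelength-spaced row of elements. The element count and desired pattern below are assumptions for illustration, not taken from the paper; in particular, a conjugate-symmetric pattern sample is chosen so that the resulting weights come out real-valued, matching the amplifier/attenuator realization mentioned above.

```python
import numpy as np

N = 16                         # elements along one row (assumed value)
m = np.arange(N)

# Desired array factor sampled at N equispaced points in u = sin(theta)
# space.  A symmetric sample set (beams at +u0 and -u0) makes the
# resulting weights purely real.
desired = np.zeros(N)
desired[2] = desired[N - 2] = 1.0

# Element weights are the inverse DFT of the sampled pattern.
w = np.fft.ifft(desired)
assert np.allclose(w.imag, 0.0)    # real weights -> amplifiers/attenuators

# Re-evaluating the array factor at the sample points recovers the
# desired pattern exactly.
af = np.fft.fft(w)
assert np.allclose(af.real, desired) and np.allclose(af.imag, 0.0)
print(np.round(w.real, 4))         # cosine-tapered real weight profile
```

Because the weights are frequency-independent real numbers, the same hardware settings steer the beam over a wide band, which is the property the abstract exploits.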


This paper describes the design of a Multiple Input Multiple Output (MIMO) testbed for assessing various MIMO transmission schemes in rich-scattering indoor environments. In the undertaken design, a Field Programmable Gate Array (FPGA) board is used for fast processing of Intermediate Frequency signals. At the present stage, the testbed's performance is assessed with a channel emulator introduced between the transmitter and receiver modules. Here, results are presented for the case in which a 2x2 Alamouti scheme is used for space-time coding and decoding at the transmitter and receiver. Various programming details of the FPGA board, along with the obtained simulation results, are reported.
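The 2x2 Alamouti scheme mentioned above is a standard space-time block code; the following NumPy sketch shows its encoding and linear combining over a noiseless flat-fading channel (the testbed's FPGA and IF processing details are not reproduced, and the symbols and channel are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)

# Two QPSK symbols to send (illustrative data).
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
s1, s2 = qpsk[0], qpsk[3]

# 2x2 flat-fading channel: H[i, j] couples tx antenna j to rx antenna i.
H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)

# Alamouti encoding over two symbol periods:
#   t1: antennas transmit (s1, s2)
#   t2: antennas transmit (-conj(s2), conj(s1))
t1 = np.array([s1, s2])
t2 = np.array([-np.conj(s2), np.conj(s1)])

r1 = H @ t1        # received vector at time 1 (noise omitted for clarity)
r2 = H @ t2        # received vector at time 2

# Alamouti combining per receive antenna, summed over both antennas.
e1 = np.sum(np.conj(H[:, 0]) * r1 + H[:, 1] * np.conj(r2))
e2 = np.sum(np.conj(H[:, 1]) * r1 - H[:, 0] * np.conj(r2))

gain = np.sum(np.abs(H) ** 2)      # diversity gain ||H||_F^2
s1_hat, s2_hat = e1 / gain, e2 / gain
# Without noise the combining recovers (s1, s2) exactly, with full
# transmit and receive diversity captured in `gain`.
```

The cross terms cancel in the combiner because of the conjugate structure of the code, which is why decoding reduces to simple linear operations well suited to FPGA implementation.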


The effect of transmitter and receiver array configurations on stray-light and diffraction-caused crosstalk in free-space optical interconnects (FSOI) was investigated. The optical system simulation software Code V was used to simulate both the stray-light and diffraction-caused crosstalk. Experimentally measured, spectrally resolved near-field images of VCSEL higher-order modes were used as extended sources in our simulation model. In addition, we included the electrical and optical noise in our analysis to give a more accurate picture of the overall performance of the FSOI system. Our results show that by changing the square lattice geometry to a hexagonal configuration, we obtain an overall signal-to-noise ratio improvement of 3 dB. Furthermore, system density is increased by up to 4 channels/mm².


The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for the time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially-available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially-available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.


This thesis examines children's consumer choice behaviour using an information processing perspective, with the fundamental goal of applying academic research to practical marketing and commercial problems. Following a preface, which describes the academic and commercial terms of reference within which this interdisciplinary study is couched, the thesis comprises four discernible parts. Initially, the rationale inherent in adopting an information processing perspective is justified and the diverse array of topics which have bearing on children's consumer processing and behaviour are aggregated. The second part uses this perspective as a springboard to appraise the little-explored role of memory, and especially memory structure, as a central cognitive component in children's consumer choice processing. The main research theme explores the ease with which 10 and 11 year olds retrieve contemporary consumer information from subjectively defined memory organisations. Adopting a sort-recall paradigm, hierarchical retrieval processing is stimulated, and it is contended that when two items known to be stored proximally in the memory organisation are not recalled adjacently, this discrepancy is indicative of retrieval processing ease. Results illustrate the marked influence of task conditions and orientation of memory structure on retrieval; these conclusions are accounted for in terms of input and integration failure. The third section develops the foregoing interpretations in the marketing context. A straightforward methodology for structuring marketing situations is postulated, a basis for segmenting children's markets using processing characteristics is adopted, and criteria for communicating brand support information to children are discussed. A taxonomy of market-induced processing conditions is developed. Finally, a case study with topical commercial significance is described.
The development, launch and marketing of a new product in the confectionery market is outlined, the aetiology of its subsequent demise identified and expounded, and prescriptive guidelines are put forward to help avert future repetition of marketing misjudgements.


We report the performance of a group of adult dyslexics and matched controls in an array-matching task where two strings of either consonants or symbols are presented side by side and have to be judged to be the same or different. The arrays may differ in either the order or the identity of two adjacent characters. This task does not require naming – which has been argued to be the cause of dyslexics' difficulty in processing visual arrays – but, instead, has a strong serial component, as demonstrated by the fact that, in both groups, reaction times (RTs) increase monotonically with position of a mismatch. The dyslexics are clearly impaired in all conditions, and performance in the identity conditions predicts performance across orthographic tasks even after age, performance IQ and phonology are partialled out. Moreover, the shapes of the serial position curves reveal the underlying impairment. In the dyslexics, RTs increase with position at the same rate as in the controls (the lines are parallel), ruling out reduced processing speed or difficulties in shifting attention. Instead, error rates show a catastrophic increase for positions which are either searched later or more subject to interference. These results are consistent with a reduction in the attentional capacity needed in a serial task to bind together identity and positional information. This capacity is best seen as a reduction in the number of spotlights into which attention can be split to process information at different locations rather than as a more generic reduction of resources which would also affect processing the details of single objects.


The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the data that needs to be processed. With expanding resolutions and evolving compression standards, there is a need for high-performance, flexible architectures that allow quick upgradability. Technology continues to advance in image display resolutions, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, but with trade-offs among processing performance (achieving specified frame rates on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovations in video and imaging. This dissertation contains dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance.
The contributions of the dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe distance factor and develop an algorithm for detecting occlusion occurrence during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is analysed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature-vector size and a gradient-threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulated neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gesture sets involved in different applications may vary, so it is essential to keep the feature vector as small as possible while maintaining the same accuracy and performance.
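The running-average background model with a global mean-derived threshold in contribution (1) can be sketched in NumPy. The dissertation's exact update and threshold rules are not given, so the learning rate `alpha`, the threshold scale `k`, and the toy frames below are all assumptions for illustration.

```python
import numpy as np

def ramt_detect(frames, alpha=0.05, k=2.0):
    """Sketch of running-average background subtraction with a single
    global threshold derived from the mean absolute difference, in the
    spirit of the RAMT idea (alpha and k are assumed parameters)."""
    bg = frames[0].astype(np.float64)      # initial background estimate
    masks = []
    for f in frames[1:]:
        f = f.astype(np.float64)
        diff = np.abs(f - bg)
        thr = k * diff.mean()              # global threshold from the mean
        masks.append(diff > thr)           # foreground where diff is large
        bg = (1 - alpha) * bg + alpha * f  # running-average update
    return masks

# Toy usage: a static dark background with a bright 4x4 moving block.
frames = np.zeros((5, 32, 32))
for t in range(1, 5):
    frames[t, 10:14, 4 * t:4 * t + 4] = 255.0
masks = ramt_detect(frames)
print([int(m.sum()) for m in masks])       # foreground pixels per frame
```

Because the threshold is recomputed from the frame-wide mean difference, it adapts automatically as scene brightness changes, which matches the indoor/outdoor adaptivity claimed above.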


We describe a low-energy glow-discharge process using a reactive ion etching system that enables non-circular device patterns, such as squares or hexagons, to be formed from a precursor array of uniform circular openings in polymethyl methacrylate (PMMA) defined by electron beam lithography. This technique is of particular interest for bit-patterned magnetic recording medium fabrication, where close-packed square magnetic bits may improve recording performance. The process and results of generating close-packed square patterns by self-limiting low-energy glow discharge are investigated. Dense magnetic arrays formed by electrochemical deposition of nickel over the self-limiting formed molds are demonstrated.


By modifying the classical retrodirective array (RDA) architecture, a directional modulation (DM) transmitter can be realized without the need for synthesis. Importantly, through analysis and exemplar simulations, it is proved that, besides the conventional DM application scenario, i.e., secure transmission to one legitimate receiver located along one spatial direction in free space, the proposed synthesis-free DM transmitter should also perform well for systems where more than one legitimate receiver is positioned along different directions in free space, and where one or more legitimate receivers exist in a multipath environment. None of these capabilities has previously been achieved using synthesis-free DM arrangements.
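A retrodirective array is classically built around phase conjugation: the array retransmits the conjugate of a received pilot, so its beam automatically points back toward the interrogator without any synthesis step. The sketch below illustrates only this underlying mechanism; the element count and angle are assumed, and the paper's DM modifications are not modeled.

```python
import numpy as np

N = 8                                    # half-wavelength-spaced elements
n = np.arange(N)

def steering(theta_deg):
    """Steering vector of a uniform linear array at angle theta."""
    u = np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * np.pi * n * u)

pilot_dir = 25.0                         # legitimate receiver direction
w = np.conj(steering(pilot_dir))         # phase-conjugate the pilot

# Array factor magnitude over all observation angles.
angles = np.linspace(-90, 90, 361)
af = np.abs([w @ steering(a) for a in angles])

peak = angles[np.argmax(af)]
print(peak)                              # beam retrodirected to ~25 degrees
```

Because the conjugation happens element by element on the received signal itself, the beam tracks the pilot source with no knowledge of its direction, which is what makes the architecture "synthesis-free".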


We consider a multipair relay channel in which multiple sources communicate with multiple destinations with the help of a full-duplex (FD) relay station (RS). All sources and destinations have a single antenna, while the RS is equipped with massive antenna arrays. We assume that the RS estimates the channels using training sequences transmitted from the sources and destinations, and then applies maximum-ratio combining/maximum-ratio transmission (MRC/MRT) to process the signals. To significantly reduce the loop interference (LI) effect, we propose two massive MIMO processing techniques: i) using a massive receive antenna array, or ii) using a massive transmit antenna array together with very low transmit power at the RS. We derive an exact achievable rate in closed form and evaluate the system spectral efficiency. We show that, by doubling the number of antennas at the RS, the transmit power of each source and of the RS can be reduced by 1.5 dB if the pilot power is equal to the signal power, and by 3 dB if the pilot power is kept fixed, while maintaining a given quality of service. Furthermore, we compare the FD and half-duplex (HD) modes and show that FD significantly improves performance when the LI level is low.
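The channel-hardening effect underpinning such MRC/MRT processing can be shown with a short NumPy sketch: as the number of RS antennas M grows, the scaled Gram matrix H^H H / M approaches the identity, so maximum-ratio processing separates the source-destination pairs and interference (including loop interference) averages out. The values of K and M below are assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

K = 4                  # single-antenna source-destination pairs (assumed)
leakage = {}
for M in (32, 512):    # RS receive-array sizes (assumed)
    # i.i.d. Rayleigh channel from the K sources to the M RS antennas.
    H = (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K))) / np.sqrt(2)
    G = H.conj().T @ H / M               # per-pair gains after MRC
    # Largest inter-pair interference term relative to unit diagonal.
    leakage[M] = np.abs(G - np.diag(np.diag(G))).max()
    print(M, round(float(leakage[M]), 3))  # leakage shrinks as M grows
```

This vanishing cross-pair leakage is the reason transmit powers can be scaled down as the arrays grow while a given quality of service is maintained.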