943 results for Digital processing
Abstract:
As airports continue to become more ‘customer-centric’, their digital customer-facing technologies are increasingly embedded within the passenger journey. This study takes a customer-centric view of airport digital technology by exploring the ways that digital technologies are being applied within airports to improve passenger perceptions of service quality during their journey. The literature review develops a framework encompassing the themes of airport service quality (function, interaction and diversion) and digital strategy. This framework has been applied to six airports exhibiting high service quality. Currently, the findings suggest that improving customer function involves the use of automated and self-service technologies that give passengers greater efficiency and effectiveness at processing points. Additionally, technology to improve the experience during wait times may either offer aesthetic qualities or provide some form of productivity to passengers. Customer interaction, in turn, is influenced by digital technology through constant passenger engagement throughout the journey. As the research nears completion, the influence of these themes on the framework will become more apparent.
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes as well. We point out practical limitations of the generic forgery attack on RMX-hash-then-sign schemes. We then show that these limitations can be overcome for RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA, and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean’s method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with ‘built-in’ randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
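To illustrate the fixed-point property the attack exploits, the sketch below builds a toy Davies-Meyer compression function from AES-128 (an assumption for demonstration only; MD5 and the SHA family use their own dedicated block ciphers, but the structure h' = E_m(h) XOR h is the same). For any message block m, h = E_m^{-1}(0) is a fixed point, which is the observation behind Dean's expandable messages:

```python
# Toy Davies-Meyer compression h' = E_m(h) XOR h, with AES-128 standing in
# for the block cipher E (illustrative; not the MD5/SHA compression function).
from Crypto.Cipher import AES   # pip install pycryptodome
import os

def davies_meyer(m: bytes, h: bytes) -> bytes:
    """One Davies-Meyer compression step: E_m(h) XOR h."""
    e = AES.new(m, AES.MODE_ECB).encrypt(h)
    return bytes(a ^ b for a, b in zip(e, h))

# For ANY message block m, h = E_m^{-1}(0) is a fixed point,
# because E_m(h) = 0 and hence E_m(h) XOR h = h.
m = os.urandom(16)                               # arbitrary message block
h = AES.new(m, AES.MODE_ECB).decrypt(bytes(16))  # h = E_m^{-1}(0)
assert davies_meyer(m, h) == h                   # h survives the compression
print("fixed point for block", m.hex()[:16], "->", h.hex()[:16])
```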
Abstract:
Acoustic recordings of the environment provide an effective means to monitor bird species diversity. To facilitate exploration of acoustic recordings, we describe a content-based birdcall retrieval algorithm. A query birdcall is a region of spectrogram bounded by frequency and time. Retrieval depends on a similarity measure derived from the orientation and distribution of spectral ridges. The spectral ridge detection method caters for a broad range of birdcall structures. In this paper, we extend previous work by incorporating a spectrogram scaling step in order to improve the detection of spectral ridges. Compared to an existing approach based on MFCC features, our feature representation achieves better retrieval performance for multiple bird species in noisy recordings.
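As a rough illustration of ridge-based spectrogram features, the sketch below scores each time-frequency bin against a small set of oriented second-derivative kernels and keeps the strongest orientation. The window size, kernel set, threshold, and the synthetic chirp standing in for a birdcall are all illustrative assumptions; the paper's actual detector and scaling step are not reproduced here:

```python
# Oriented spectral-ridge detection on a spectrogram (illustrative sketch).
import numpy as np
from scipy import signal, ndimage

fs = 22050
t = np.arange(0, 2.0, 1 / fs)
x = signal.chirp(t, f0=2000, t1=2.0, f1=4000)       # stand-in "birdcall"

f, tt, S = signal.spectrogram(x, fs, nperseg=512, noverlap=256)
S = np.log(S + 1e-10)                               # dB-like scaling

# Second-derivative ridge kernels: horizontal, vertical, two diagonals.
k_h = np.array([[-1, -1, -1], [2, 2, 2], [-1, -1, -1]], float)
k_d = np.eye(3) * 3 - 1                             # main-diagonal ridge
kernels = [k_h, k_h.T, k_d, np.rot90(k_d)]
responses = np.stack([ndimage.convolve(S, k) for k in kernels])

ridge_strength = responses.max(axis=0)              # best response per bin
orientation = responses.argmax(axis=0)              # 0=h, 1=v, 2/3=diagonals
ridges = ridge_strength > np.percentile(ridge_strength, 95)
print("ridge pixels detected:", int(ridges.sum()))
```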
Abstract:
Color displays used in image processing systems consist of a refresh memory buffer storing digital image data which are converted into analog signals to display an image by driving the primary color channels (red, green, and blue) of a color television monitor. The color cathode ray tube (CRT) of the monitor is unable to reproduce colors exactly due to phosphor limitations, the exponential luminance response of the tube to the applied signal, and limitations imposed by the digital-to-analog conversion. In this paper we describe some computer simulation studies (using the U*V*W* color space) carried out to measure these reproduction errors. Further, a procedure to correct for the color reproduction error due to the exponential luminance response (gamma) of the picture tube is proposed, using a video lookup table and a higher resolution digital-to-analog converter. It is found, on the basis of computer simulation studies, that the proposed gamma correction scheme is effective and robust to variations in the assumed value of gamma.
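A minimal sketch of this style of correction, assuming an 8-bit frame buffer, a 10-bit DAC, and a CRT gamma of 2.2 (illustrative values; the paper's error measurements in U*V*W* space are not reproduced): the lookup table pre-distorts each code value by the inverse power law so that displayed luminance becomes linear in the input code.

```python
# Gamma-correcting video lookup table feeding a higher-resolution DAC.
import numpy as np

GAMMA = 2.2                        # assumed CRT luminance exponent
IN_BITS, OUT_BITS = 8, 10          # assumed frame-buffer and DAC widths
in_max, out_max = 2**IN_BITS - 1, 2**OUT_BITS - 1

# Pre-distort by the inverse of the CRT's power-law response, so that
# luminance ~ (dac/out_max)**GAMMA ends up proportional to the input code.
lut = np.round(out_max * (np.arange(in_max + 1) / in_max) ** (1 / GAMMA)).astype(int)

pixel = 128                        # 8-bit image code
dac_code = lut[pixel]              # 10-bit value sent to the DAC
luminance = (dac_code / out_max) ** GAMMA
print(dac_code, round(luminance, 3), round(pixel / in_max, 3))  # fractions match
```

The extra DAC resolution matters because the inverse power law expands the dark end of the scale: with an 8-bit DAC, many distinct input codes would collapse onto the same output level after correction.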
Abstract:
In an earlier paper (Part I), we described the construction of the Hermite code for multiple grey-level pictures using the concepts of vector spaces over Galois fields. In this paper, a new algebra is worked out for Hermite codes to devise algorithms for various transformations such as translation, reflection, rotation, expansion and replication of the original picture. Other operations such as concatenation, complementation, superposition, Jordan-sum and selective segmentation are also considered. It is shown that the Hermite code of a picture is very powerful and serves as a mathematical signature of the picture. The Hermite code will have extensive applications in picture processing, pattern recognition and artificial intelligence.
Abstract:
Frequency response analysis is critical in understanding the steady-state and transient behavior of any electrical network. A network analyzer, or frequency response analyzer, is used to determine the frequency response of an electrical network. This paper deals with the design of an inexpensive digitally controlled network analyzer. The frequency range of the analyzer is from 10 Hz to 50 kHz (a suitable range for system studies on most power electronics apparatus). It is composed of a microcontroller (as the central processing unit) and a personal computer (as analyzer and display). Communication between the microcontroller and the personal computer is established through one of the USB ports. The analyzer is tested and evaluated with RC, RLC and multi-resonant circuits. The design steps, basis of analysis, experimental results, limitations in bandwidth and possible techniques for performance improvement are presented.
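The core measurement behind such an analyzer can be sketched as a single-frequency DFT correlation that extracts the complex gain (magnitude and phase) at each test frequency. In the sketch below, an analytically applied first-order RC low-pass stands in for the device under test (an assumption; the real instrument excites and samples hardware through the microcontroller):

```python
# Swept-sine gain/phase measurement via single-frequency DFT correlation.
import numpy as np

def measure_response(stimulus, response, f, fs):
    """Complex gain of response vs. stimulus at frequency f (Hz)."""
    n = np.arange(len(stimulus))
    ref = np.exp(-2j * np.pi * f * n / fs)        # correlating exponential
    return (response @ ref) / (stimulus @ ref)

fs, fc = 200_000, 1_000.0                         # sample rate, RC corner (Hz)
for f in (10, 100, 1_000, 10_000, 50_000):        # the paper's 10 Hz-50 kHz span
    t = np.arange(int(fs / 10)) / fs              # 0.1 s = whole cycles of f
    x = np.sin(2 * np.pi * f * t)
    h = 1 / (1 + 1j * f / fc)                     # ideal RC low-pass (simulated DUT)
    y = np.abs(h) * np.sin(2 * np.pi * f * t + np.angle(h))
    g = measure_response(x, y, f, fs)
    print(f"{f:>6} Hz: {20*np.log10(abs(g)):6.1f} dB, {np.degrees(np.angle(g)):6.1f} deg")
```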
Abstract:
Inverse filters are conventionally used for resolving overlapping signals of identical waveshape. However, the inverse filtering approach is shown here to be useful for resolving overlapping signals, identical or otherwise, of unknown waveshapes. Digital inverse filter design based on the autocorrelation formulation of linear prediction is known to perform optimum spectral flattening of the input signal for which the filter is designed. This property of the inverse filter is used to accomplish composite signal decomposition. The theory is presented assuming the constituent signals to be responses of all-pole filters; however, the approach may be used in more general situations.
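A minimal sketch of the underlying idea, assuming a synthetic composite of two overlapping all-pole responses (the pole locations, model order, and excitation times are illustrative): the autocorrelation normal equations give a prediction-error filter, and inverse filtering with it flattens the spectrum so the residual exposes the separate excitation instants.

```python
# Composite signal decomposition by linear-prediction inverse filtering.
import numpy as np
from scipy import signal, linalg

p = 4                                            # predictor order (assumed)
a = np.poly([0.95 * np.exp(0.3j), 0.95 * np.exp(-0.3j)]).real  # all-pole denom.
e = np.zeros(400)
e[50], e[170] = 1.0, 0.8                         # two overlapping excitations
x = signal.lfilter([1.0], a, e)                  # composite all-pole signal

# Autocorrelation (Yule-Walker) normal equations -> predictor coefficients.
r = np.correlate(x, x, "full")[len(x) - 1 : len(x) + p]
a_hat = linalg.solve_toeplitz(r[:p], r[1 : p + 1])

residual = signal.lfilter(np.r_[1.0, -a_hat], [1.0], x)  # inverse filter
print("largest residual peaks at samples:", np.argsort(np.abs(residual))[-2:])
```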
Abstract:
We address the problem of exact complex-wave reconstruction in digital holography. We show that, by confining the object-wave modulation to one quadrant of the frequency domain, and by maintaining a reference-wave intensity higher than that of the object, one can achieve exact complex-wave reconstruction in the absence of noise. A feature of the proposed technique is that the zero-order artifact, which is commonly encountered in hologram reconstruction, can be completely suppressed in the absence of noise. The technique is noniterative and nonlinear. We also establish a connection between the reconstruction technique and homomorphic signal processing, which enables an interpretation of the technique from the perspective of deconvolution. Another key contribution of this paper is a direct link between the reconstruction technique and the two-dimensional Hilbert transform formalism proposed by Hahn. We show that this connection leads to explicit Hilbert transform relations between the magnitude and phase of the complex wave encoded in the hologram. We also provide results on simulated as well as experimental data to validate the accuracy of the reconstruction technique. (C) 2011 Optical Society of America
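For orientation, the sketch below simulates off-axis hologram formation under the paper's stated conditions (object modulation confined near one carrier quadrant, reference intensity above the object's) and applies conventional linear Fourier filtering; it is an illustrative baseline, not the paper's nonlinear homomorphic technique, and the carrier, window, and object are assumed values:

```python
# Off-axis hologram simulation and conventional Fourier-filter reconstruction.
import numpy as np

N = 256
y, x = np.mgrid[0:N, 0:N]
obj = 0.3 * np.exp(-((x - 128)**2 + (y - 128)**2) / 400.0) * np.exp(0.5j)
ref = np.exp(-2j * np.pi * (40 * x + 40 * y) / N)   # tilted plane reference, |R|=1

I = np.abs(ref + obj) ** 2           # recorded: 1 + |O|^2 + R O* + R* O
F = np.fft.fft2(I)

mask = np.zeros((N, N))              # pass only the R* O term near bin (40, 40)
mask[10:118, 10:118] = 1.0
rec = np.fft.ifft2(F * mask) * ref   # demodulate the carrier: (R* O) R = O

# Residual stays small while |O| << |R|; a stronger object spreads the
# |O|^2 zero-order term into the passband, which the paper's method removes.
err = np.abs(rec - obj).max()
print("max reconstruction error:", float(np.round(err, 4)))
```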
Abstract:
CDS/ISIS is an advanced non-numerical information storage and retrieval software package developed by UNESCO since 1985 to satisfy the need expressed by many institutions, especially in developing countries, to streamline their information processing activities using modern (and relatively inexpensive) technologies [1]. CDS/ISIS is available for the MS-DOS, Windows and Unix operating system platforms. The formatting language of CDS/ISIS is one of its several strengths: it is used not only for formatting records for display but also for creating customized indexes. CDS/ISIS by itself does not support publishing its databases on the Internet or on CD-ROMs. However, a number of open source tools are now available that enable publishing CDS/ISIS databases on the Internet and on CD-ROMs. In this paper, we discuss the ways and means of integrating CDS/ISIS databases with GSDL, an open source digital library (DL) software package.
Abstract:
We present an irreversible watermarking approach robust to affine transform attacks in camera, biomedical and satellite images stored as monochrome bitmap images. The approach is based on image normalisation, in which both watermark embedding and extraction are carried out with respect to an image normalised to meet a set of predefined moment criteria. The normalisation procedure is invariant to affine transform attacks. The resulting scheme is suitable for public watermarking applications, where the original image is not available for watermark extraction. Here, a direct-sequence code division multiple access approach is used to embed multibit text information in the DCT and DWT transform domains. The proposed watermarking schemes are robust against various types of attacks such as Gaussian noise, shearing, scaling, rotation, flipping, affine transform, signal processing and JPEG compression. Performance analysis results are measured using image processing metrics.
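A minimal sketch of direct-sequence CDMA embedding in the 2-D DCT domain, with blind extraction by correlation. The PN codes, embedding strength, excluded low-frequency band, and stand-in image are all illustrative assumptions; the paper's affine-invariant moment normalisation precedes this step and is omitted:

```python
# DS-CDMA multibit watermark embedding/extraction in the DCT domain.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(42)                  # shared key -> same PN codes
v = np.sin(np.linspace(0, np.pi, 64))
img = 128 + 50 * np.outer(v, v)                  # smooth stand-in monochrome image

bits = np.array([1, 0, 1, 1])                    # multibit payload
codes = rng.choice([-1.0, 1.0], size=(len(bits), 64, 64))
codes[:, :8, :8] = 0.0                           # spare the low-frequency band

C = dctn(img, norm="ortho")
alpha = 2.0                                      # embedding strength
for b, pn in zip(bits, codes):
    C += alpha * (2 * b - 1) * pn                # BPSK-style spreading

marked = idctn(C, norm="ortho")

# Blind extraction: correlate the marked image's DCT plane with each PN code.
C_rx = dctn(marked, norm="ortho")
recovered = [int((C_rx * pn).sum() > 0) for pn in codes]
print(bits.tolist(), "->", recovered)
```

In practice the marked image would be quantized to 8 bits and attacked before extraction; the correlation detector tolerates this because each bit is spread across thousands of mid- and high-band coefficients.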
Abstract:
Carbon nanotubes dispersed in a polymer matrix have been aligned in the form of fibers and interconnects and cured electrically and by UV light. Conductivity and effective semiconductor tunneling under reverse-to-forward bias fields have been designed so that each fiber/channel has a differentiable current-voltage response. The current-voltage response is a function of the strain applied to the fibers along the axial direction. Biaxial and shear strains are correlated by differentiating the signals from the aligned fibers/channels. Using a small doping of magnetic nanoparticles in these composite fibers, magneto-resistance properties are realized that are strong enough to use the resulting magnetostriction as a state variable for signal processing and computing. Various basic analog signal processing tasks such as addition, convolution and filtering can be performed. This preliminary study shows promising applications of the concept in combined analog-digital computation in carbon nanotube based fibers. Various dynamic effects on the output signals, such as relaxation, electric-field-dependent nonlinearities and hysteresis, are studied using experimental data and an analytical model.
Abstract:
In this paper, we discuss the issues related to word recognition in born-digital word images. We introduce a novel method of power-law transformation on the word image for binarization. We show the improvement in image binarization and the consequent increase in the recognition performance of an OCR engine on the word image. The optimal value of gamma for a word image is automatically chosen by our algorithm with a fixed stroke width threshold. We have exhaustively evaluated our algorithm by varying the gamma and stroke width threshold values. By varying the gamma value, we found that our algorithm performed better than the results reported in the literature. On the ICDAR Robust Reading Systems Challenge-1: Word Recognition Task on the born-digital dataset, we achieved 82.9% using Omnipage OCR applied to images processed by our algorithm, compared to the recognition rate of 61.5% achieved by TH-OCR after suitable pre-processing by Yang et al. and 63.4% by ABBYY Fine Reader (used as a baseline by the competition organizers without any preprocessing).
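A minimal sketch of the power-law (gamma) transformation followed by global Otsu binarization; the paper's automatic gamma selection via the stroke width threshold is not reproduced, so gamma is simply swept over illustrative values on a synthetic stand-in word image:

```python
# Power-law transformation + Otsu binarization for a grayscale word image.
import numpy as np

def power_law(img: np.ndarray, gamma: float) -> np.ndarray:
    """Apply I_out = 255 * (I_in / 255) ** gamma to an 8-bit image."""
    return (255.0 * (img / 255.0) ** gamma).astype(np.uint8)

def otsu_threshold(img: np.ndarray) -> int:
    """Classic Otsu: maximize between-class variance over the histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    w = np.cumsum(hist)                          # class-0 pixel counts
    m = np.cumsum(hist * np.arange(256))         # class-0 intensity mass
    w1, mT = w[-1] - w, m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mT * w / w[-1] - m) ** 2 / (w * w1 / w[-1])
    return int(np.nanargmax(var_between[:-1]))

rng = np.random.default_rng(0)                   # synthetic light page, dark strokes
gray = rng.normal(190, 10, (32, 100))
gray[8:24, 10:90:12] = rng.normal(80, 10, gray[8:24, 10:90:12].shape)
gray = np.clip(gray, 0, 255).astype(np.uint8)

for gamma in (0.5, 1.0, 2.0):
    g = power_law(gray, gamma)
    t = otsu_threshold(g)
    print(f"gamma={gamma}: threshold={t}, ink fraction={(g <= t).mean():.3f}")
```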
Abstract:
Fringe tracking and fringe order assignment have become central topics of current research in digital photoelasticity. Isotropic points (IPs) appearing in low fringe order zones are often either overlooked or entirely missed in conventional as well as digital photoelasticity. We aim to highlight image processing for characterizing IPs in an isochromatic fringe field. By resorting to a global analytical solution for a circular disk, the sensitivity of IPs to small changes in the far-field loading on the disk is highlighted. A local theory supplements the global closed-form solutions for the three-, four-, and six-point loading configurations of a circular disk. The local theoretical concepts developed in this paper are demonstrated through digital image analysis of isochromatics in circular disks subjected to three- and four-point loads. (C) 2015 Society of Photo-Optical Instrumentation Engineers (SPIE)
Abstract:
We present methods for fixed-lag smoothing using Sequential Importance Sampling (SIS) on a discrete non-linear, non-Gaussian state-space system with unknown parameters. Our particular application is in the field of digital communication systems. Each input data point is taken from a finite set of symbols. We represent the transmission medium as a fixed filter with a finite impulse response (FIR), hence a discrete state-space system is formed. Conventional Markov chain Monte Carlo (MCMC) techniques such as the Gibbs sampler are unsuitable for this task because they can only process data as a batch. Data arrives sequentially, so it is sensible to process it in that way. In addition, many communication systems are interactive, so there is a maximum level of latency that can be tolerated before a symbol is decoded. We demonstrate the method by simulation and compare its performance to existing techniques.
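A minimal sketch of the setting, assuming BPSK symbols, a known three-tap FIR channel, and illustrative noise level, lag, and particle count (the paper additionally treats unknown parameters): each particle extends a hypothesized symbol trajectory, is reweighted by the likelihood of the new observation, and smoothed decisions are read off a fixed lag behind the current time.

```python
# Fixed-lag smoothing by sequential importance sampling over an FIR channel.
import numpy as np

rng = np.random.default_rng(1)
symbols = np.array([-1.0, 1.0])                  # BPSK alphabet
h = np.array([1.0, 0.5, 0.2])                    # FIR channel taps (known here)
sigma, T, P, lag = 0.3, 60, 200, 3               # noise std, length, particles, lag

s_true = rng.choice(symbols, T)
y = np.convolve(s_true, h)[:T] + sigma * rng.normal(size=T)

particles = np.zeros((P, T))                     # sampled symbol trajectories
logw = np.zeros(P)
decoded = np.zeros(T)
for t in range(T):
    particles[:, t] = rng.choice(symbols, P)     # propose from the symbol prior
    window = particles[:, max(0, t - len(h) + 1): t + 1]
    pred = window @ h[:window.shape[1]][::-1]    # predicted channel output
    logw += -0.5 * ((y[t] - pred) / sigma) ** 2  # Gaussian log-likelihood update
    w = np.exp(logw - logw.max()); w /= w.sum()
    if t >= lag:                                 # fixed-lag smoothed decision
        decoded[t - lag] = np.sign(w @ particles[:, t - lag])
    if 1.0 / (w ** 2).sum() < P / 2:             # resample on low effective size
        idx = rng.choice(P, P, p=w)
        particles, logw = particles[idx], np.zeros(P)

w = np.exp(logw - logw.max()); w /= w.sum()      # flush the last lag symbols
decoded[T - lag:] = np.sign(w @ particles[:, T - lag:])
print(f"symbol errors: {int((decoded != s_true).sum())}/{T}")
```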
Abstract:
373 p. : ill., graphs, photos, tables