887 results for Analysis and statistical methods
Abstract:
Most adaptive linearization circuits for nonlinear amplifiers have a feedback loop that returns the output signal of the amplifier to the linearizer. The loop delay of the linearizer must be controlled precisely so that convergence of the linearizer is assured. In this Letter a delay control circuit is presented. It is a delay-lock loop (DLL) with a modified early-late gate and can be easily applied to a DSP implementation. The proposed DLL circuit is applied to an adaptive linearizer that uses a polynomial predistorter, and a simulation for a 16-QAM signal is performed. The simulation results show that the proposed DLL perfectly eliminates the delay between the reference input signal and the delayed feedback signal of the linearizing circuit, so that the predistorter polynomial coefficients converge to the optimum values and a high degree of linearization is achieved.
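The Letter's DLL design is not reproduced in this listing; purely as a hedged illustration of what such a delay control circuit converges to, the NumPy sketch below estimates the loop delay by cross-correlating the reference input with the delayed feedback signal (the function name, toy signal, and integer-lag search are assumptions, not the authors' circuit).

```python
import numpy as np

def estimate_loop_delay(reference, feedback, max_lag):
    """Estimate the integer-sample delay between the reference input and the
    delayed feedback signal from the peak of their cross-correlation. A DSP
    delay-lock loop would converge to (a refined version of) this lag."""
    lags = np.arange(0, max_lag + 1)
    corr = [np.abs(np.vdot(reference[:len(reference) - lag],
                           feedback[lag:lag + len(reference) - lag]))
            for lag in lags]
    return lags[int(np.argmax(corr))]

# Toy usage: a complex baseband signal fed back through a 7-sample delay.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000) + 1j * rng.standard_normal(1000)
y = np.concatenate([np.zeros(7, dtype=complex), x])[:1000]
print(estimate_loop_delay(x, y, max_lag=20))   # -> 7
```

An adaptive implementation such as the early-late gate mentioned in the abstract would track this delay continuously rather than re-computing the full correlation.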
Abstract:
A dual-port dual-polarized compact microstrip antenna for avoiding cross coupling between the two frequency bands is proposed and analyzed. This antenna offers channel isolation better than 25 dB, and is more compact than a conventional rectangular patch. Analytical equations for calculating the resonant frequencies at both ports are also presented. The theoretical calculations are verified using experimental results.
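The paper's analytical equations are not reproduced here; only as an illustration of this kind of calculation, the sketch below evaluates the standard cavity-model estimate of the resonant frequencies of a rectangular patch (dimensions and permittivity are hypothetical, and the formula neglects fringing and the paper's compactness modifications).

```python
from math import sqrt

C0 = 299_792_458.0  # speed of light in vacuum (m/s)

def patch_resonant_frequency(m, n, length, width, eps_r):
    """Textbook cavity-model estimate (illustrative, not the paper's equations)
    of the TM_mn resonant frequency of a rectangular microstrip patch:
        f_mn = c / (2 * sqrt(eps_r)) * sqrt((m / L)**2 + (n / W)**2)
    length and width in metres; eps_r is the substrate relative permittivity."""
    return C0 / (2.0 * sqrt(eps_r)) * sqrt((m / length) ** 2 + (n / width) ** 2)

# Example: 30 mm x 40 mm patch on eps_r = 2.2; dominant TM_10 and TM_01 modes.
print(patch_resonant_frequency(1, 0, 0.030, 0.040, 2.2) / 1e9)  # ~3.37 GHz
print(patch_resonant_frequency(0, 1, 0.030, 0.040, 2.2) / 1e9)  # ~2.53 GHz
```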
Abstract:
A novel optical add-drop multiplexer (OADM) based on the Mach-Zehnder interferometer (MZI) and the fiber Bragg grating (FBG) is proposed, for the first time to the authors' knowledge. In the structure, the Mach-Zehnder interferometer acts as an optical switch. The principle of the OADM is analyzed in this paper. The OADM can add/drop one of the multiple input channels or pass the channel through directly by adjusting the path difference between the two arms of the interferometer. The channel isolation is more than 20 dB.
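For illustration only, the following sketch evaluates the idealized MZI switching behaviour implied above: the split of input power between the two output ports as the arm-length difference is tuned (lossless 3 dB couplers and the effective index are assumptions, not values from the paper).

```python
import numpy as np

def mzi_port_powers(delta_length, wavelength, n_eff=1.45):
    """Ideal lossless Mach-Zehnder interferometer with two 3 dB couplers:
    fraction of input power exiting the cross and bar ports as a function of
    the arm-length difference. n_eff is an assumed effective index."""
    phi = 2.0 * np.pi * n_eff * delta_length / wavelength   # arm phase difference
    cross = np.cos(phi / 2.0) ** 2
    bar = np.sin(phi / 2.0) ** 2
    return cross, bar

# Tuning the arm difference by half a guided wavelength switches a 1550 nm
# channel from one output port to the other (add/drop vs pass-through).
for dL in (0.0, 1550e-9 / (2 * 1.45)):
    print(mzi_port_powers(dL, 1550e-9))
```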
Abstract:
Thermal analysis, powder diffraction, and Raman scattering as a function of temperature were carried out on K2BeF4. Moreover, the crystal structure was determined at 293 K from powder diffraction. The compound shows a transition from the Pna21 to the Pnam space group at 921 K with a transition enthalpy of 5 kJ/mol. The transition is assumed to be first order because the compound shows metastability. Structurally and spectroscopically the transition is similar to that observed in (NH4)2SO4, which suggests that the low-temperature phase is ferroelectric. In order to confirm this, the spontaneous polarization has been computed using an ionic model.
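As a hedged sketch of the kind of ionic (rigid point-charge) estimate referred to in the last sentence, the snippet below computes a spontaneous polarization from nominal ionic charges and displacements; all numerical values are hypothetical and are not taken from the refined structure.

```python
import numpy as np

E_CHARGE = 1.602176634e-19  # elementary charge (C)

def point_charge_polarization(charges, displacements, cell_volume):
    """Rigid-ion (point-charge) estimate of the spontaneous polarization:
        P = (e / V) * sum_i Z_i * du_i
    charges: nominal ionic charges Z_i (e.g. +1 for K, +2 for Be, -1 for F),
    displacements: displacement of each ion (m) along the polar axis between
    the centrosymmetric reference and the polar structure,
    cell_volume: unit-cell volume in m^3. Returns P in C/m^2."""
    return E_CHARGE * np.dot(charges, displacements) / cell_volume

# Hypothetical numbers, for illustration only.
Z = np.array([1, 1, 2, -1, -1, -1, -1])                     # K, K, Be, 4x F
du = np.array([3e-12, -2e-12, 1e-12, 4e-12, 4e-12, -1e-12, -1e-12])
print(point_charge_polarization(Z, du, 4.0e-28))            # C/m^2
```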
Abstract:
Median filtering is a simple digital non-linear signal smoothing operation in which the median of the samples in a sliding window replaces the sample at the middle of the window. The resulting filtered sequence tends to follow polynomial trends in the original sample sequence. The median filter preserves signal edges while filtering out impulses. Due to this property, median filtering is finding applications in many areas of image and speech processing. Though median filtering is simple to realise digitally, its properties are not easily analysed with standard analysis techniques.
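Because the operation is simple to state, a minimal NumPy sketch suffices to illustrate it (the window length and edge handling below are assumptions, not prescribed by the abstract).

```python
import numpy as np

def median_filter_1d(x, window=5):
    """Sliding-window median filter: each output sample is the median of the
    `window` input samples centred on it (window must be odd). Edges are
    handled by replicating the first and last samples."""
    assert window % 2 == 1, "window length must be odd"
    half = window // 2
    padded = np.concatenate([np.full(half, x[0]), x, np.full(half, x[-1])])
    return np.array([np.median(padded[i:i + window]) for i in range(len(x))])

# A ramp (polynomial trend) corrupted by two impulses: the impulses are
# removed while the ramp and its edges are preserved.
signal = np.arange(10.0)
signal[3] = 40.0
signal[7] = -25.0
print(median_filter_1d(signal, window=3))
```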
Abstract:
The work is intended to study the following important aspects of document image processing and to develop new methods: (1) segmentation of document images using an adaptive interval-valued neuro-fuzzy method; (2) improving the segmentation procedure using a Simulated Annealing technique; (3) development of optimized compression algorithms using a Genetic Algorithm and a parallel Genetic Algorithm; (4) feature extraction of document images; (5) development of interval-valued (IV) fuzzy rules. This work also helps with feature extraction and with foreground and background identification. The proposed work incorporates evolutionary and hybrid methods for segmentation and compression of document images. A study of different neural networks used in image processing and of developments in the area of fuzzy logic is also carried out in this work.
Abstract:
Among many other knowledge representation formalisms, Ontologies and Formal Concept Analysis (FCA) aim at modeling ‘concepts’. We discuss how these two formalisms may complement one another from an application point of view. In particular, we will see how FCA can be used to support Ontology Engineering, and how ontologies can be exploited in FCA applications. The interplay of FCA and ontologies is studied along the life cycle of an ontology: (i) FCA can support the building of the ontology as a learning technique. (ii) The established ontology can be analyzed and navigated by using techniques of FCA. (iii) Last but not least, the ontology may be used to improve an FCA application.
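As a minimal illustration of the FCA side of this interplay, the sketch below enumerates the formal concepts of a toy binary context; the context and function names are hypothetical and unrelated to any particular ontology or to the paper's tooling.

```python
from itertools import combinations

def formal_concepts(context):
    """Enumerate the formal concepts (extent, intent) of a small binary context,
    given as a dict mapping each object to its set of attributes. A concept is
    a pair (A, B) with A = all objects having every attribute in B and
    B = all attributes shared by every object in A (a Galois connection)."""
    objects = list(context)
    attributes = sorted({a for attrs in context.values() for a in attrs})
    concepts = set()
    for r in range(len(attributes) + 1):
        for intent_seed in combinations(attributes, r):
            extent = frozenset(o for o in objects
                               if set(intent_seed) <= context[o])
            intent = frozenset(a for a in attributes
                               if all(a in context[o] for o in extent))
            concepts.add((extent, intent))
    return concepts

# Hypothetical toy context (ontology-learning style: terms x extracted features).
ctx = {"dog": {"animal", "pet"}, "cat": {"animal", "pet"},
       "wolf": {"animal", "wild"}}
for extent, intent in sorted(formal_concepts(ctx), key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```

The resulting concept lattice is what an ontology engineer would then inspect, prune, or name when using FCA as a learning aid.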
Abstract:
Semantic Web Mining aims at combining the two fast-developing research areas Semantic Web and Web Mining. This survey analyzes the convergence of trends from both areas: Growing numbers of researchers work on improving the results of Web Mining by exploiting semantic structures in the Web, and they use Web Mining techniques for building the Semantic Web. Last but not least, these techniques can be used for mining the Semantic Web itself. The second aim of this paper is to use these concepts to circumscribe what Web space is, what it represents and how it can be represented and analyzed. This is used to sketch the role that Semantic Web Mining and the software agents and human agents involved in it can play in the evolution of Web space.
Abstract:
This study describes a combined empirical/modeling approach to assess the possible impact of climate variability on rice production in the Philippines. We collated climate data of the last two decades (1985-2002) as well as yield statistics of six provinces of the Philippines, selected along a North-South gradient. Data from the climate information system of NASA were used as input parameters of the model ORYZA2000 to determine potential yields and, in the next steps, the yield gaps, defined as the difference between potential and actual yields. Both simulated and actual yields of irrigated rice varied strongly between years. However, no climate-driven trends were apparent and the variability in actual yields showed no correlation with climatic parameters. The observed variation in simulated yields was attributable to seasonal variations in climate (dry/wet season) and to climatic differences between provinces and agro-ecological zones. The actual yield variation between provinces was not related to differences in the climatic yield potential but rather to soil and management factors. The resulting yield gap was largest in remote and infrastructurally disfavored provinces (low external input use) with a high production potential (high solar radiation and day-night temperature differences). In turn, the yield gap was lowest in central provinces with good market access but with a relatively low climatic yield potential. We conclude that neither long-term trends nor the variability of the climate can explain current rice yield trends, and that agro-ecological, seasonal, and management effects are overriding any possible climatic variations. On the other hand, the lack of a climate-driven trend in the present situation may be superseded by ongoing climate change in the future.
Abstract:
This thesis addresses the problem of developing automatic grasping capabilities for robotic hands. Using a 2-jointed and a 4-jointed model of the hand, we establish the geometric conditions necessary for achieving form closure grasps of cylindrical objects. We then define and show how to construct the grasping pre-image for quasi-static (friction dominated) and zero-G (inertia dominated) motions for sensorless and sensor-driven grasps with and without arm motions. While the approach does not rely on detailed modeling, it is computationally inexpensive, reliable, and easy to implement. Example behaviors were successfully implemented on the Salisbury hand and on a planar 2-fingered, 4 degree-of-freedom hand.
Abstract:
I present a novel design methodology for the synthesis of automatic controllers, together with a computational environment---the Control Engineer's Workbench---integrating a suite of programs that automatically analyze and design controllers for high-performance, global control of nonlinear systems. This work demonstrates that difficult control synthesis tasks can be automated, using programs that actively exploit and efficiently represent knowledge of nonlinear dynamics and phase space and effectively use the representation to guide and perform the control design. The Control Engineer's Workbench combines powerful numerical and symbolic computations with artificial intelligence reasoning techniques. As a demonstration, the Workbench automatically designed a high-quality maglev controller that outperforms a previous linear design by a factor of 20.
Abstract:
Image analysis and graphics synthesis can be achieved with learning techniques that use image examples directly, without physically based 3D models. In our technique: (i) the mapping from novel images to a vector of "pose" and "expression" parameters can be learned from a small set of example images using a function approximation technique that we call an analysis network; (ii) the inverse mapping from input "pose" and "expression" parameters to output images can be synthesized from a small set of example images and used to produce new images using a similar synthesis network. The techniques described here have several applications in computer graphics, special effects, interactive multimedia and very low bandwidth teleconferencing.
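The analysis network itself is not specified in this summary; the sketch below shows one way such an image-to-parameter mapping could be approximated from a small set of examples, using generic radial-basis-function regression (the Gaussian kernel, its width, and the toy data are assumptions, not the authors' implementation).

```python
import numpy as np

def train_rbf_analysis_net(example_images, example_params, sigma=1.0):
    """Fit a radial-basis-function regressor mapping flattened example images
    to their 'pose'/'expression' parameter vectors (a generic stand-in for an
    analysis network; the Gaussian kernel and sigma are assumptions)."""
    X = np.asarray(example_images, dtype=float)
    Y = np.asarray(example_params, dtype=float)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    G = np.exp(-d2 / (2.0 * sigma ** 2))              # kernel matrix on examples
    coeffs = np.linalg.solve(G + 1e-8 * np.eye(len(X)), Y)
    return X, coeffs

def apply_analysis_net(model, image, sigma=1.0):
    X, coeffs = model
    d2 = np.sum((X - np.asarray(image, dtype=float)) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ coeffs   # estimated parameters

# Toy usage with 4 tiny 'images' labelled by a single pose parameter.
imgs = np.eye(4)
poses = np.array([[0.0], [0.3], [0.6], [0.9]])
model = train_rbf_analysis_net(imgs, poses)
print(apply_analysis_net(model, imgs[2]))              # ~[0.6]
```

The synthesis network described in the abstract would be the analogous regression in the opposite direction, from parameter vectors back to images.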
Abstract:
This paper presents a new paradigm for signal reconstruction and superresolution, Correlation Kernel Analysis (CKA), that is based on the selection of a sparse set of bases from a large dictionary of class-specific basis functions. The basis functions that we use are the correlation functions of the class of signals we are analyzing. To choose the appropriate features from this large dictionary, we use Support Vector Machine (SVM) regression and compare this to traditional Principal Component Analysis (PCA) for the tasks of signal reconstruction, superresolution, and compression. The testbed we use in this paper is a set of images of pedestrians. This paper also presents results of experiments in which we use a dictionary of multiscale basis functions and then use Basis Pursuit De-Noising to obtain a sparse, multiscale approximation of a signal. The results are analyzed and we conclude that 1) when used with a sparse representation technique, the correlation function is an effective kernel for image reconstruction and superresolution, 2) for image compression, PCA and SVM have different tradeoffs, depending on the particular metric that is used to evaluate the results, 3) in sparse representation techniques, L_1 is not a good proxy for the true measure of sparsity, L_0, and 4) the L_epsilon norm may be a better error metric for image reconstruction and compression than the L_2 norm, though the exact psychophysical metric should take into account high order structure in images.
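As a rough, self-contained sketch of the two reconstruction styles being compared, the snippet below reconstructs a synthetic signal (a stand-in for the pedestrian images) once from a low-dimensional PCA subspace and once as a sparse combination over a large dictionary, using scikit-learn's Lasso as a stand-in for the paper's SVM-regression/Basis Pursuit De-Noising machinery; the data, dictionary, and parameters are all assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 200 'images' of 64 pixels built from a few
# smooth prototypes plus noise (the paper's testbed is pedestrian images).
protos = np.stack([np.sin(np.linspace(0, k * np.pi, 64)) for k in (1, 2, 3)])
signals = rng.standard_normal((200, 3)) @ protos + 0.05 * rng.standard_normal((200, 64))
test = signals[0]

# (a) Dense reconstruction from a low-dimensional PCA subspace.
pca = PCA(n_components=3).fit(signals)
pca_rec = pca.inverse_transform(pca.transform(test[None, :]))[0]

# (b) Sparse reconstruction over a large dictionary via an L1 penalty
#     (Lasso here as a stand-in for Basis Pursuit De-Noising).
dictionary = rng.standard_normal((64, 50))          # 50 candidate basis functions
lasso = Lasso(alpha=0.01, max_iter=10000).fit(dictionary, test)
sparse_rec = dictionary @ lasso.coef_ + lasso.intercept_

print("PCA error:   ", np.linalg.norm(test - pca_rec))
print("sparse error:", np.linalg.norm(test - sparse_rec))
print("nonzero dictionary coefficients:", np.count_nonzero(lasso.coef_))
```

The number of nonzero coefficients in the sparse fit is the kind of quantity the paper's sparsity discussion (L_0 versus its L_1 proxy) is concerned with.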