84 results for "Search-based technique"


Relevance: 30.00%

Abstract:

This paper introduces a procedure for filtering electromyographic (EMG) signals. Its key element is the Empirical Mode Decomposition, a novel digital signal processing technique that can decompose any time-series into a set of functions designated as intrinsic mode functions. The procedure for EMG signal filtering is compared to a related approach based on the wavelet transform. Results obtained from the analysis of synthetic and experimental EMG signals show that our method can be successfully and easily applied in practice to attenuate background activity in EMG signals. (c) 2006 Elsevier Ltd. All rights reserved.
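The sifting operation at the heart of Empirical Mode Decomposition can be sketched as follows. This is a minimal illustration, not the paper's filtering procedure: the stopping criterion, boundary handling and any EMG-specific thresholding are omitted, and the test signal is invented for the demo.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(x):
    """One sifting pass of Empirical Mode Decomposition: subtract the mean
    of the cubic-spline upper and lower envelopes of the signal."""
    t = np.arange(len(x))
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 4 or len(minima) < 4:
        return x                      # too few extrema: treat as residue
    upper = CubicSpline(maxima, x[maxima])(t)
    lower = CubicSpline(minima, x[minima])(t)
    return x - (upper + lower) / 2.0

# A fast oscillation riding on a slow one; repeated sifting isolates the
# fast part as (approximately) the first intrinsic mode function.
t = np.linspace(0.0, 1.0, 500)
signal = np.sin(2 * np.pi * 40 * t) + np.sin(2 * np.pi * 3 * t)
imf1 = signal.copy()
for _ in range(10):
    imf1 = sift_once(imf1)
```

Filtering then amounts to discarding or attenuating the intrinsic mode functions that carry the unwanted background activity before reconstructing the signal.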

Relevance: 30.00%

Abstract:

A simple and practical technique is described for assessing the risks, that is, the potential for error and consequent loss, incurred during the requirements engineering phase of software system development. The technique uses goal-based requirements analysis as a framework to identify and rate a set of key issues in order to arrive at estimates of the feasibility and adequacy of the requirements. The technique is illustrated by showing how it was applied to a real systems development project, and how problems in that project could have been identified earlier, thereby avoiding costly additional work and unhappy users.
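The rating step can be illustrated with a toy calculation. The issue names, the 1-5 scale and the aggregation rule below are all hypothetical; the paper's actual goal-based checklist is not reproduced here.

```python
# Hypothetical issue ratings (1 = low risk, 5 = high risk); the paper's
# real checklist and scale are not reproduced here.
issues = {
    "goals clearly stated":        2,
    "stakeholders identified":     4,
    "requirements traceable":      3,
    "acceptance criteria defined": 5,
}

def feasibility_estimate(ratings):
    """Aggregate issue ratings into a single score in [0, 1];
    higher means the requirements look more adequate and feasible."""
    worst = 5 * len(ratings)
    return 1.0 - sum(ratings.values()) / worst

score = feasibility_estimate(issues)
print(f"feasibility estimate: {score:.2f}")   # 1 - 14/20 = 0.30
```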

Relevance: 30.00%

Abstract:

Using the classical Parzen window (PW) estimate as the desired response, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density (SKD) estimates. The proposed algorithm incrementally minimises a leave-one-out test score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights of the selected sparse model are finally updated using the multiplicative nonnegative quadratic programming algorithm, which ensures the nonnegative and unity constraints for the kernel weights and has the desired ability to reduce the model size further. Except for the kernel width, the proposed method has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Several examples demonstrate the ability of this simple regression-based approach to effectively construct a SKD estimate with comparable accuracy to that of the full-sample optimised PW density estimate. (c) 2007 Elsevier B.V. All rights reserved.
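The classical Parzen window estimate that serves as the desired response can be sketched as follows. The sparse construction itself (orthogonal forward regression, leave-one-out scoring, the MNQP weight update) is not reproduced, and the data and kernel width are invented for the demo.

```python
import numpy as np

def parzen_window(data, x, h):
    """Classical Parzen-window density estimate with a Gaussian kernel:
    p(x) = (1/N) sum_n N(x; data_n, h^2). Every sample contributes a
    kernel; the SKD approach replaces this with a sparse weighted subset."""
    diffs = (x[:, None] - data[None, :]) / h
    k = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return k.mean(axis=1) / h

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=400)
grid = np.linspace(-5.0, 5.0, 1001)
density = parzen_window(data, grid, h=0.3)
```

A density estimate must be nonnegative and integrate to one; the unity and nonnegativity constraints on the sparse kernel weights preserve exactly these properties.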

Relevance: 30.00%

Abstract:

An information processing paradigm in the brain is proposed, instantiated in an artificial neural network using biologically motivated temporal encoding. The network locates, within the external-world stimulus, the target memory defined by a specific pattern of micro-features. The proposed network is robust and efficient. Akin in operation to the swarm intelligence paradigm stochastic diffusion search, it finds the best fit to the memory with linear time complexity. Information multiplexing enables neurons to process knowledge as 'tokens' rather than 'types'. The network illustrates the possible emergence of cognitive processing from low-level interactions, such as memory retrieval based on partial matching. (C) 2007 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

A novel Linear Hashtable Method Predicted Hexagonal Search (LHMPHS) method for block-based motion compensation is proposed. Fast block-matching algorithms use the origin as the initial search centre, which often does not track motion very well. To improve the accuracy of fast block-matching algorithms (BMAs), we employ a predicted starting search point that reflects the motion trend of the current block. The predicted search centre is closer to the global minimum, so centre-biased BMAs can find the motion vector more efficiently. The performance of the algorithm is evaluated on standard video sequences with respect to three important metrics. The results show that the proposed algorithm enhances the accuracy of current hexagonal algorithms and compares favourably with Full Search, Logarithmic Search, etc.
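The hexagonal search pattern with a supplied starting point can be sketched as follows. This is a generic hexagonal block-matching sketch, not the LHMPHS algorithm: the linear-hashtable prediction step is omitted (the start vector is simply passed in), frame-boundary checks are left out, and the frames are synthetic.

```python
import numpy as np

def sad(ref, cur, bx, by, dx, dy, B=8):
    """Sum of absolute differences between the current block at (bx, by)
    and the block displaced by (dx, dy) in the reference frame."""
    return np.abs(cur[by:by+B, bx:bx+B]
                  - ref[by+dy:by+dy+B, bx+dx:bx+dx+B]).sum()

def hex_search(ref, cur, bx, by, start=(0, 0), B=8):
    """Hexagonal search: from a (predicted) starting vector, move to the
    best of six hexagon points until no improvement, then refine with a
    four-point small pattern."""
    hexagon = [(2, 0), (-2, 0), (1, 2), (1, -2), (-1, 2), (-1, -2)]
    small = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    best, best_cost = start, sad(ref, cur, bx, by, *start, B)
    improved = True
    while improved:
        improved = False
        cx, cy = best
        for hx, hy in hexagon:
            c = sad(ref, cur, bx, by, cx + hx, cy + hy, B)
            if c < best_cost:
                best, best_cost, improved = (cx + hx, cy + hy), c, True
    cx, cy = best
    for hx, hy in small:
        c = sad(ref, cur, bx, by, cx + hx, cy + hy, B)
        if c < best_cost:
            best, best_cost = (cx + hx, cy + hy), c
    return best

# Synthetic frames: a smooth blob shifted by a known motion of (3, 1).
y, x = np.mgrid[0:64, 0:64].astype(float)
ref = 255 * np.exp(-((x - 32)**2 + (y - 32)**2) / (2 * 10.0**2))
cur = np.zeros_like(ref)
cur[:-1, :-3] = ref[1:, 3:]          # cur[y, x] = ref[y+1, x+3]
mv = hex_search(ref, cur, bx=28, by=28, start=(0, 0))
```

Passing a predicted vector as `start` instead of `(0, 0)` is exactly where a motion-trend prediction would plug in: the closer the start is to the global minimum, the fewer hexagon steps the centre-biased search needs.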

Relevance: 30.00%

Abstract:

While search is normally modelled by economists purely in terms of decisions over making observations, this paper models it as a process in which information is gained through feedback from innovatory product launches. The information gained can then be used to decide whether to exercise real options. In the model the initial decisions involve a product design and the scale of production capacity. There are then real options to change these factors based on what is learned. The case of launching product variants in parallel is also considered. Under ‘true’ uncertainty, the model can be seen in terms of heuristic decision-making based on subjective beliefs with limited foresight. Search costs, the values of the real options, beliefs, and the cost of capital are all shown to be significant in determining the search path.

Relevance: 30.00%

Abstract:

We describe a high-level design method to synthesize multi-phase regular arrays. The method is based on deriving component designs using classical regular (or systolic) array synthesis techniques and composing these separately evolved component designs into a unified global design. Similarity transformations are applied to component designs in the composition stage in order to align data flow between the phases of the computation. Three transformations are considered: rotation, reflection and translation. The technique is aimed at the design of hardware components for high-throughput embedded systems applications, and we demonstrate this by deriving a multi-phase regular array for the 2-D DCT algorithm, which is widely used in many video communications applications.
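The three similarity transformations can be illustrated directly: each is a linear map (plus a translation) applied to the index-space points of a component design. The matrices, offsets and point sets below are hypothetical and unrelated to the paper's actual 2-D DCT derivation.

```python
import numpy as np

# Hypothetical scenario: phase 2 of a two-phase array computes on a
# transposed index space, so its points are reflected about the main
# diagonal and translated to abut phase 1's output boundary.
reflect = np.array([[0, 1],
                    [1, 0]])          # swap (i, j): reflection about i = j
rotate90 = np.array([[0, -1],
                     [1,  0]])        # 90-degree rotation of the index space
translate = np.array([4, 0])          # shift phase 2 next to phase 1

def transform_points(points, linear, offset):
    """Apply a similarity transformation p -> linear @ p + offset to each
    index-space point of a component design."""
    return points @ linear.T + offset

phase2_indices = np.array([(i, j) for i in range(4) for j in range(4)])
aligned = transform_points(phase2_indices, reflect, translate)
```

Because rotation and reflection are orthogonal maps, they preserve the distances (and hence the timing relationships) between index points, which is what makes them safe to apply when aligning dataflow between phases.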

Relevance: 30.00%

Abstract:

In the search for a versatile building block that allows the preparation of heteroditopic tpy-pincer bridging ligands, the synthon 4'-[C6H3(CH2Br)2-3,5]-2,2':6',2''-terpyridine was synthesized. Facile introduction of diphenylphosphanyl groups into this synthon gave the ligand 4'-[C6H3(CH2PPh2)2-3,5]-2,2':6',2''-terpyridine ([tpyPC(H)P]). The asymmetric mononuclear complex [Fe(tpy){tpyPC(H)P}](PF6)2, prepared by selective coordination of [Fe(tpy)Cl3] to the tpy moiety of [tpyPC(H)P], was used for the synthesis of the heterodimetallic complex [Fe(tpy)(tpyPCP)Ru(tpy)](PF6)3, following the "complex as ligand" approach. Coordination of the ruthenium centre at the PC(H)P-pincer moiety of [Fe(tpy){tpyPC(H)P}](PF6)2 was achieved by applying a transcyclometallation procedure. The ground-state electronic properties of both complexes, investigated by cyclic and square-wave voltammetry and UV/Vis spectroscopy, are discussed and compared with those of [Fe(tpy)2](PF6)2 and [Ru(PCP)(tpy)]Cl, which represent the mononuclear components of the heterodinuclear species. An in situ UV/Vis spectroelectrochemical study was performed in order to localize the oxidation and reduction steps and to gain information about the Fe(II)-Ru(II) communication in the heterodimetallic system [Fe(tpy)(tpyPCP)Ru(tpy)](PF6)3 mediated by the bridging ligand [tpyPCP]. Both the voltammetric and spectroelectrochemical results point to only very limited electronic interaction between the metal centres in the ground state.

Relevance: 30.00%

Abstract:

Deep Brain Stimulation (DBS) has been used successfully throughout the world for the treatment of Parkinson's disease symptoms. To control abnormal spontaneous electrical activity in target brain areas, DBS utilizes a continuous stimulation signal. This continuous power draw means that the implanted battery power source needs to be replaced every 18–24 months. To prolong the life span of the battery, a technique is discussed here to accurately recognize and predict the onset of Parkinson's disease tremors in human subjects and thus implement an on-demand stimulator. The approach is to use a radial basis function neural network (RBFNN) based on particle swarm optimization (PSO) and principal component analysis (PCA), with Local Field Potential (LFP) data recorded via the stimulation electrodes, to predict activity related to tremor onset. To test this approach, LFPs from the subthalamic nucleus (STN), obtained through deep brain electrodes implanted in a Parkinson's patient, are used to train the network. To validate the network's performance, electromyographic (EMG) signals from the patient's forearm are recorded in parallel with the LFPs to accurately determine occurrences of tremor, and these are compared to the output of the network. Detection accuracies of up to 89% have been achieved. Performance comparisons between a conventional RBFNN and a PSO-based RBFNN show a marginal decrease in performance but a notable reduction in computational overhead.
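The processing chain (PCA for dimensionality reduction, an RBF layer, a trained linear readout) can be sketched on synthetic data. This is not the paper's tremor detector: the data are Gaussian stand-ins for LFP feature windows, the RBF centres are chosen at random rather than by PSO, and the readout weights are fitted by plain least squares.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for LFP feature windows: class 0 = background,
# class 1 = "tremor-onset-like" activity with a shifted mean.
X = np.vstack([rng.normal(0.0, 1.0, (200, 10)),
               rng.normal(1.5, 1.0, (200, 10))])
labels = np.r_[np.zeros(200), np.ones(200)]

# PCA via SVD: project the centred data onto the top 3 components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T

# RBF layer: Gaussian activations around fixed centres (here a random
# sample of training points; the paper selects centres with PSO instead).
centres = Z[rng.choice(len(Z), 20, replace=False)]
sigma = 2.0

def rbf(z):
    d2 = ((z[:, None, :] - centres[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# Output weights by least squares, then threshold for a 0/1 decision.
Phi = rbf(Z)
w, *_ = np.linalg.lstsq(Phi, labels, rcond=None)
pred = (Phi @ w > 0.5).astype(float)
accuracy = (pred == labels).mean()
```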

Relevance: 30.00%

Abstract:

The search for Earth-like exoplanets, orbiting in the habitable zone of stars other than our Sun and showing biological activity, is one of the most exciting and challenging quests of the present time. Nulling interferometry from space, in the thermal infrared, appears to be a promising technique for directly observing extra-solar planets. It has been studied for about 10 years by ESA and NASA in the framework of the Darwin and TPF-I missions, respectively. Nevertheless, nulling interferometry in the thermal infrared remains a technological challenge at several levels. Among them, the development of the "modal filter" function is mandatory for filtering the wavefronts, in keeping with the objective of rejecting the central star flux with an efficiency of about 10^5. Modal filtering exploits the capability of single-mode waveguides to transmit a single amplitude function, eliminating virtually any perturbation of the interfering wavefronts and thus making very high rejection ratios possible. The modal filter may be based on single-mode Integrated Optics (IO) and/or Fiber Optics. In this paper, we focus on IO, and more specifically on the progress of the on-going "Integrated Optics" activity of the European Space Agency.

Relevance: 30.00%

Abstract:

In this paper we present a connectionist searching technique, the Stochastic Diffusion Search (SDS), capable of rapidly locating a specified pattern in a noisy search space. In operation, SDS finds the position of the pre-specified pattern or, if it does not exist, its best instantiation in the search space. This is achieved via parallel exploration of the whole search space by an ensemble of agents searching in a competitive-cooperative manner. We prove mathematically the convergence of the Stochastic Diffusion Search: SDS converges to a statistical equilibrium when it locates the best instantiation of the object in the search space. Experiments presented in this paper indicate the high robustness of SDS and show good scalability with problem size. The convergence characteristics of SDS make it a fully adaptive algorithm and suggest applications in dynamically changing environments.
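A minimal SDS sketch for locating a pattern in a string illustrates the test and diffusion phases described above. The agent count, iteration budget and single-character partial test are choices made for this demo, not taken from the paper.

```python
import random

def stochastic_diffusion_search(text, pattern, n_agents=100, iters=100, seed=1):
    """Minimal SDS: each agent holds a candidate position (its hypothesis).
    Test phase: an agent checks one randomly chosen pattern character at
    its position. Diffusion phase: an inactive agent copies the hypothesis
    of a randomly picked agent if that agent is active, otherwise it
    re-seeds itself with a fresh random position."""
    rng = random.Random(seed)
    positions = range(len(text) - len(pattern) + 1)
    hyps = [rng.choice(positions) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iters):
        # Test phase: cheap partial evaluation of each hypothesis.
        for i in range(n_agents):
            j = rng.randrange(len(pattern))
            active[i] = text[hyps[i] + j] == pattern[j]
        # Diffusion phase: successful hypotheses spread through the swarm.
        for i in range(n_agents):
            if not active[i]:
                k = rng.randrange(n_agents)
                hyps[i] = hyps[k] if active[k] else rng.choice(positions)
    # The largest cluster of agents marks the best-fit position.
    return max(set(hyps), key=hyps.count)

best = stochastic_diffusion_search("the quick brown fox jumps", "brown")
```

Because each test examines only one pattern component, the per-iteration cost stays low; it is the diffusion of successful hypotheses that concentrates agents on the best-matching position.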

Relevance: 30.00%

Abstract:

Stochastic Diffusion Search is an efficient probabilistic best-fit search technique, capable of transformation-invariant pattern matching. Although inherently parallel in operation, it is difficult to implement efficiently in hardware, as it requires full inter-agent connectivity. This paper describes a lattice implementation which, while qualitatively retaining the properties of the original algorithm, restricts connectivity, enabling simpler implementation on parallel hardware. Diffusion times are examined for different network topologies, ranging from ordered lattices through small-world networks to random graphs.

Relevance: 30.00%

Abstract:

Higher-order cumulant analysis is applied to the blind equalization of linear time-invariant (LTI) nonminimum-phase channels. The channel model is moving-average based. To identify the moving-average parameters of channels, a higher-order cumulant fitting approach is adopted in which a novel relay algorithm is proposed to obtain the global solution. In addition, the technique incorporates model order determination. The transmitted data are considered as independent, identically distributed random variables over some discrete finite set (e.g., the set {±1, ±3}). A transformation scheme is suggested so that third-order cumulant analysis can be applied to this type of data. Simulation examples verify the feasibility and potential of the algorithm. Performance is compared with that of the noncumulant-based Sato scheme in terms of the steady-state MSE and convergence rate.
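Estimating a third-order cumulant from data can be sketched as follows. The relay algorithm and the paper's actual transformation scheme are not reproduced; the centred `s + s**2` transformation below is a hypothetical stand-in that merely shows why symmetric {±1, ±3} data must be transformed before third-order statistics carry any information.

```python
import numpy as np

def third_order_cumulant(x, k, l):
    """Biased sample estimate of C3(k, l) = E[x(n) x(n+k) x(n+l)]
    for a zero-mean stationary sequence (k, l >= 0)."""
    m = max(k, l)
    idx = np.arange(len(x) - m)
    return np.mean(x[idx] * x[idx + k] * x[idx + l])

rng = np.random.default_rng(0)
# Symmetric i.i.d. source over {-3, -1, 1, 3}: all odd-order moments
# vanish, so every third-order cumulant is (asymptotically) zero.
s = rng.choice([-3.0, -1.0, 1.0, 3.0], size=20000)
c3_sym = third_order_cumulant(s, 1, 2)

# A hypothetical asymmetry-inducing transformation (not the paper's):
# after it, the third-order cumulant is clearly non-zero.
t = s + s**2
t = t - t.mean()
c3_asym = third_order_cumulant(t, 0, 0)
```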

Relevance: 30.00%

Abstract:

Many techniques are currently used for motion estimation. In block-based approaches, the most common procedure is block matching based on various algorithms. To refine the motion estimates resulting from the full search or any coarse search algorithm, a few applications of Kalman filtering can be found, mainly in the intraframe scheme. The applicability of the Kalman filtering technique to block-based motion estimation is rather limited due to discontinuities in the dynamic behaviour of the motion vectors. Therefore, we propose an application of the concept of filtering by approximated densities (FAD). The FAD, originally introduced to alleviate the limitations of conventional Kalman modelling, is applied to interframe block-motion estimation. This application uses a simple form of FAD involving statistical characteristics of multi-modal distributions up to second order.
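For contrast with the FAD approach, a conventional scalar Kalman filter applied to one motion-vector component looks as follows. The random-walk state model and all noise variances are assumptions made for this demo, and the FAD extension itself is not shown.

```python
import numpy as np

def kalman_smooth(measurements, q=0.5, r=4.0):
    """Scalar Kalman filter for one motion-vector component, assuming a
    random-walk state x(k) = x(k-1) + w and measurement z(k) = x(k) + v,
    with process variance q and measurement variance r."""
    x, p = measurements[0], 1.0
    out = [x]
    for z in measurements[1:]:
        p = p + q                     # predict
        g = p / (p + r)               # Kalman gain
        x = x + g * (z - x)           # correct with block-matching estimate
        p = (1 - g) * p
        out.append(x)
    return np.array(out)

# Noisy block-matching estimates of a slowly drifting motion component.
rng = np.random.default_rng(3)
true_mv = np.linspace(0.0, 4.0, 50)
measured = true_mv + rng.normal(0.0, 2.0, size=50)
smoothed = kalman_smooth(measured)
```

This single-Gaussian model is exactly what breaks down when the motion vector jumps discontinuously; modelling the multi-modal distribution of such jumps is what motivates FAD.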

Relevance: 30.00%

Abstract:

A form of three-dimensional X-ray imaging, called Object 3-D, is introduced, where the relevant subject material is represented as discrete ‘objects’. The surface of each such object is derived accurately from the projections of its outline, and of its other discontinuities, in about ten conventional X-ray views, distributed in solid angle. This technique is suitable for many applications, and permits dramatic savings in radiation exposure and in data acquisition and manipulation. It is well matched to user-friendly interactive displays.