196 results for Adaptive Search


Relevance: 20.00%

Publisher:

Abstract:

Learning automata are adaptive decision-making devices that have proved useful in a variety of machine learning and pattern recognition applications. Although most learning automata methods deal with the case of finitely many actions for the automaton, there are also models of continuous-action-set learning automata (CALA). A team of such CALA can be useful in stochastic optimization problems where one has access only to noise-corrupted values of the objective function. In this paper, we present a novel formulation for noise-tolerant learning of linear classifiers using a CALA team. We consider the general case of nonuniform noise, where the probability that the class label of an example is wrong may be a function of the feature vector of the example. The objective is to learn the underlying separating hyperplane given only such noisy examples. We present an algorithm employing a team of CALA and prove, under some conditions on the class conditional densities, that the algorithm achieves noise-tolerant learning as long as the probability of a wrong label for any example is less than 0.5. We also present empirical results illustrating the effectiveness of the algorithm.
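
The abstract does not spell out the update rule, but the flavor of a CALA team can be conveyed with a small sketch: each automaton keeps a Gaussian action distribution over one hyperplane coefficient, and the team nudges its means toward sampled actions that earn higher (noisy) reinforcement. This is a hedged illustration, not the authors' algorithm; the learning rate, perturbation scale, and 0/1 reward below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def cala_team_train(X, y, steps=20000, sigma=0.2, lr=0.01):
    """Sketch of a CALA team learning a hyperplane w.x + b = 0 from
    noisy labels y in {-1, +1}. Each automaton holds a Gaussian action
    distribution over one coefficient; the means move toward sampled
    actions that score better than the current mean action."""
    d = X.shape[1]
    mu = np.zeros(d + 1)                           # means for d weights + bias
    for _ in range(steps):
        i = rng.integers(len(X))                   # one (possibly mislabeled) example
        a = mu + sigma * rng.standard_normal(d + 1)  # sampled joint action
        xi = np.append(X[i], 1.0)
        r_a = float(np.sign(a @ xi) == y[i])       # reinforcement for sampled action
        r_mu = float(np.sign(mu @ xi) == y[i])     # reinforcement for mean action
        mu += lr * (r_a - r_mu) * (a - mu) / sigma  # stochastic-approximation step
    return mu
```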

Relevance: 20.00%

Publisher:

Abstract:

Knowledge of hydrological variables (e.g. soil moisture, evapotranspiration) is of pronounced importance in various applications, including flood control, agricultural production and effective water resources management. These applications require accurate spatial and temporal prediction of hydrological variables over a watershed or basin. Though hydrological models can simulate these variables at the desired spatial and temporal resolution, they are often validated against observations that are either sparse in resolution (e.g. soil moisture) or averaged over large regions (e.g. runoff). A combination of a distributed hydrological model (DHM) and remote sensing (RS) has the potential to improve resolution, and data assimilation schemes can optimally combine the two. Retrieving hydrological variables (e.g. soil moisture) from remote sensing and assimilating them into a hydrological model requires validation of the algorithms through field studies. Here we present a review of methodologies developed to assimilate RS data into DHMs and demonstrate the application for soil moisture in a small experimental watershed in south India.
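
The abstract reviews assimilation schemes without naming one; the simplest optimal blend of a model state with a retrieval is a scalar Kalman-filter analysis step, sketched below. The function and variable names are illustrative, not from the paper.

```python
def assimilate_soil_moisture(theta_model, P_model, theta_obs, R_obs):
    """One scalar Kalman-filter analysis step: blend a modeled
    soil-moisture state (variance P_model) with a remote-sensing
    retrieval (error variance R_obs), weighting by their uncertainties."""
    K = P_model / (P_model + R_obs)                # Kalman gain in [0, 1]
    theta_analysis = theta_model + K * (theta_obs - theta_model)
    P_analysis = (1.0 - K) * P_model               # analysis uncertainty shrinks
    return theta_analysis, P_analysis
```

A noisy retrieval (large R_obs) pulls the analysis only slightly away from the model forecast; a confident one dominates it.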

Relevance: 20.00%

Publisher:

Abstract:

One of the most important applications of adaptive systems is noise cancellation using adaptive filters. In this paper, we propose adaptive noise cancellation schemes for the enhancement of EEG signals in the presence of EOG artifacts. The effect of two reference inputs is studied on simulated as well as recorded EEG signals, and it is found that one reference input is enough to achieve sufficient minimization of EOG artifacts; this has also been verified through correlation analysis. We use signal-to-noise ratio and linear prediction spectra, along with time plots, to compare the performance of the proposed schemes in minimizing EOG artifacts from contaminated EEG signals. Results show that the proposed schemes are very effective (especially the one employing Newton's method) in minimizing EOG artifacts from contaminated EEG signals.
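
For readers unfamiliar with the setup, a minimal adaptive noise canceller with a single EOG reference channel looks like the LMS sketch below (the paper's Newton-type variant replaces the scalar step size with an inverse input-correlation matrix). The step size and filter order here are illustrative assumptions.

```python
import numpy as np

def lms_cancel(primary, reference, mu=0.01, order=8):
    """LMS adaptive noise canceller: estimate the EOG component of the
    primary (contaminated EEG) channel from the reference (EOG) channel
    and subtract it. The error signal is the cleaned EEG."""
    w = np.zeros(order)
    out = np.zeros_like(primary, dtype=float)
    for n in range(order, len(primary)):
        x = reference[n - order:n][::-1]   # most recent reference samples
        y = w @ x                          # estimated artifact at time n
        e = primary[n] - y                 # cleaned EEG = cancellation error
        w += 2 * mu * e * x                # LMS weight update
        out[n] = e
    return out
```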

Relevance: 20.00%

Publisher:

Abstract:

The accretion disk around a compact object is a nonlinear general relativistic system involving magnetohydrodynamics. Naturally, the question arises whether such a system is chaotic (deterministic) or stochastic (random), which might be related to the associated transport properties, whose origin is still not confirmed. Earlier, the black hole system GRS 1915+105 was shown to exhibit low-dimensional chaos in certain temporal classes. However, such nonlinear phenomena have so far not been studied in comparable detail for neutron stars, which are unique for their magnetospheres and kHz quasi-periodic oscillations (QPOs). On the other hand, it has been argued that the QPO is a result of nonlinear magnetohydrodynamic effects in accretion disks. If a neutron star exhibits a chaotic signature, then what is its chaotic/correlation dimension? We analyze RXTE/PCA data of the neutron stars Sco X-1 and Cyg X-2, along with the black hole Cyg X-1 and the unidentified source Cyg X-3, and show that while Sco X-1 and Cyg X-2 are low-dimensional chaotic systems, Cyg X-1 and Cyg X-3 are stochastic sources. Based on our analysis, we argue that Cyg X-3 may be a black hole.
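
The correlation dimension mentioned here is conventionally estimated in the Grassberger-Procaccia style: delay-embed the light curve, count close pairs as a function of radius, and read off the scaling slope. The sketch below shows the bare idea only; published analyses of RXTE light curves add essentials such as a Theiler window, surrogate-data tests, and convergence checks over embedding dimension.

```python
import numpy as np

def correlation_dimension(x, m=5, tau=1, n_r=10):
    """Grassberger-Procaccia style estimate: slope of log C(r) vs log r,
    where C(r) is the fraction of embedded point pairs within radius r."""
    x = (np.asarray(x, float) - np.mean(x)) / np.std(x)   # normalize scale
    N = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + N] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)  # O(N^2) memory
    iu = np.triu_indices(N, k=1)
    radii = np.logspace(-1.5, 0.5, n_r)
    C = np.array([np.mean(d[iu] < r) for r in radii])     # correlation integral
    good = C > 0
    return np.polyfit(np.log(radii[good]), np.log(C[good]), 1)[0]
```

A low, embedding-independent slope suggests low-dimensional chaos; for a stochastic source the estimate keeps growing with m.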

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we present numerical evidence that supports the notion of energy minimization in the sequence space of proteins for a target conformation. We use the conformations of real proteins in the Protein Data Bank (PDB) and present computationally efficient methods to identify the sequences with minimum energy. We use an edge-weighted connectivity graph for ranking the residue sites with a reduced amino acid alphabet, and then use continuous optimization to obtain the energy-minimizing sequences. Our methods enable the computation of a lower bound as well as a tight upper bound for the energy of a given conformation. We validate our results by using three different inter-residue energy matrices for five proteins from the PDB, and by comparing our energy-minimizing sequences with 80 million diverse sequences generated based on different considerations in each case. When we submitted some of our chosen energy-minimizing sequences to the Basic Local Alignment Search Tool (BLAST), we obtained sequences from the non-redundant protein sequence database that are similar to ours, with E-values of the order of 10^-7. In summary, we conclude that proteins show a trend towards minimizing energy in the sequence space but do not seem to adopt the global energy-minimizing sequence. The reason could be either that existing energy matrices are unable to accurately represent inter-residue interactions in the context of the protein environment, or that Nature does not push the optimization in the sequence space once the protein is able to perform its function.
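
The objective being minimized is simply the sum of pairwise contact energies for a fixed conformation. The sketch below states that objective and applies a crude coordinate-descent pass; the paper's actual method (connectivity-graph ranking plus continuous optimization) is more sophisticated, so treat this as an assumed baseline, with E standing for any 20x20 inter-residue energy matrix.

```python
import numpy as np

def sequence_energy(seq, contacts, E):
    """Energy of a sequence for a fixed conformation: sum of pairwise
    contact energies E[a, b] over residue pairs (i, j) in contact."""
    return sum(E[seq[i], seq[j]] for i, j in contacts)

def greedy_minimize(contacts, E, n_res, n_types=20, sweeps=10, seed=0):
    """Coordinate descent: repeatedly set each site to the amino acid
    type that minimizes total energy given the rest of the sequence."""
    rng = np.random.default_rng(seed)
    seq = list(rng.integers(n_types, size=n_res))
    for _ in range(sweeps):
        for i in range(n_res):
            def energy_with(a, i=i):
                return sequence_energy(seq[:i] + [a] + seq[i + 1:], contacts, E)
            seq[i] = min(range(n_types), key=energy_with)
    return seq, sequence_energy(seq, contacts, E)
```

Any local minimum found this way is an upper bound on the global sequence-space minimum for that conformation.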

Relevance: 20.00%

Publisher:

Abstract:

An adaptive drug delivery design is presented in this paper using neural networks for effective treatment of infectious diseases. The generic mathematical model used describes the coupled evolution of the concentrations of pathogens, plasma cells and antibodies, together with a numerical value indicating the relative condition of an organ damaged by the disease, under the influence of external drugs. From a system-theoretic point of view, the external drugs can be interpreted as control inputs, which can be designed based on control-theoretic concepts. In this study, assuming a set of nominal parameters in the mathematical model, a nonlinear controller (drug administration) is first designed based on the principle of dynamic inversion. This nominal drug administration plan was found to be effective in curing "nominal model patients" (patients whose immunological dynamics conform exactly to the mathematical model used for the control design). However, it was found to be ineffective in general for curing "realistic model patients" (patients whose immunological dynamics may have off-nominal parameter values and possibly unwanted inputs). Hence, to make the drug dosage design more effective for realistic model patients, a model-following adaptive control design is carried out next with the help of neural networks that are trained online. Simulation studies indicate that the adaptive controller proposed in this paper holds promise for killing the invading pathogens and healing the damaged organ even in the presence of parameter uncertainties and continued pathogen attack. The computational requirements for computing the control are minimal, and all associated computations (including the training of the neural networks) can be carried out online. However, the approach assumes that the required diagnosis process can be carried out at a sufficiently fast rate so that all the states are available for control computation.
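
Dynamic inversion, the nominal design mentioned above, is easiest to see in the scalar control-affine case: command stable error dynamics and solve algebraically for the input. The sketch below shows only this nominal step, assuming relative degree one and g(x) != 0; the paper's adaptive layer (an online-trained neural network correcting for off-nominal dynamics) is not reproduced here.

```python
def dynamic_inversion(x, x_ref, f, g, k=1.0):
    """Dynamic-inversion control for a scalar state with dynamics
    x_dot = f(x) + g(x) * u: command the error dynamics
    x_dot = -k * (x - x_ref) and solve for the input u."""
    x_dot_des = -k * (x - x_ref)       # desired closed-loop derivative
    return (x_dot_des - f(x)) / g(x)   # input that realizes it exactly
```

With nominal f and g this drives x to x_ref exponentially; model mismatch is precisely what the adaptive neural-network term is there to absorb.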

Relevance: 20.00%

Publisher:

Abstract:

Pricing is an effective tool to control congestion and achieve quality of service (QoS) provisioning for multiple differentiated levels of service. In this paper, we consider the problem of pricing for congestion control in the case of a network of nodes with multiple service classes. Our work draws upon [1] and [2] in various ways. We use the Tirupati pricing scheme in conjunction with the stochastic-approximation-based adaptive pricing methodology for queue control proposed in [1] for minimizing network congestion. However, unlike the methodology of [1], where prices for entire routes are considered directly, we consider prices for individual link-service grade tuples. Further, we adapt the methodology proposed in [2] for a single-node scenario to the case of a network of nodes, for evaluating performance in terms of price, revenue rate and disutility. We obtain considerable performance improvements using our approach over that in [1]. In particular, our approach exhibits a throughput improvement in the range of 54 to 80 percent in all cases studied (over all routes), while exhibiting a packet delay lower by 26 to 38 percent than under the scheme in [1].
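
The core of a stochastic-approximation pricing scheme for queue control can be stated in one update: move each price in the direction of the observed congestion error. The sketch below is a generic illustration of that idea, not the Tirupati scheme or the exact iterates of [1]; the target queue length and step size are assumed parameters.

```python
def price_update(price, queue_len, target, step, floor=0.0):
    """One stochastic-approximation step for congestion pricing: raise the
    price of a link-service-grade tuple when its observed queue length
    exceeds the target, lower it otherwise, never going below a floor."""
    return max(floor, price + step * (queue_len - target))
```

Run per link-service-grade tuple with a diminishing step size, the iterates settle where average queue lengths match their targets, which is the queue-control objective described above.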

Relevance: 20.00%

Publisher:

Abstract:

In this paper we analyze a deploy-and-search strategy for multi-agent systems. Mobile agents equipped with sensors carry out the search operation in the search space. The lack of information about the search space is modeled as an uncertainty density distribution over the space, assumed to be known to the agents a priori. In each step, the agents deploy themselves optimally so as to maximize the per-step reduction in the uncertainty density. We analyze the proposed strategy for convergence and spatial distributedness. The control law moving the agents has been analyzed for stability and convergence using LaSalle's invariance principle, and for spatial distributedness under a few realistic constraints on the control input, such as constant speed, a limit on maximum speed, and sensor range limits. The simulation experiments show that the strategy successfully reduces the average uncertainty density below the required level.
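
One plausible realization of such a deployment step, common in coverage-control work, is a Lloyd-style move toward the uncertainty-weighted centroid of each agent's Voronoi cell, with the speed cap mentioned above. The sketch below makes that concrete on a discretized search space; the gain and the centroid rule are assumptions for illustration, not the paper's exact control law.

```python
import numpy as np

def deploy_step(agents, pts, phi, gain=0.5, v_max=1.0):
    """One deployment step: each agent moves toward the uncertainty-
    weighted centroid of its Voronoi cell. agents: (n, 2) positions;
    pts: (m, 2) grid points; phi: (m,) uncertainty density on the grid."""
    d = np.linalg.norm(pts[:, None, :] - agents[None, :, :], axis=-1)
    owner = d.argmin(axis=1)                     # Voronoi assignment of grid points
    new = agents.copy()
    for i in range(len(agents)):
        cell, w = pts[owner == i], phi[owner == i]
        if w.sum() > 0:
            step = gain * (np.average(cell, axis=0, weights=w) - agents[i])
            n = np.linalg.norm(step)
            new[i] = agents[i] + (step if n <= v_max else step * v_max / n)  # speed cap
    return new
```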

Relevance: 20.00%

Publisher:

Abstract:

In this paper we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we design a novel auction which we call the OPT (optimal) auction. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We show that the OPT mechanism is superior to two of the most commonly used mechanisms for sponsored search, namely (1) GSP (Generalized Second Price) and (2) VCG (Vickrey-Clarke-Groves). We then show an important revenue equivalence result: the expected revenue earned by the search engine is the same for all three mechanisms, provided the advertisers are symmetric and the number of sponsored slots is strictly less than the number of advertisers.
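
The two baseline mechanisms are standard and easy to state in code: under GSP the winner of slot k pays the (k+1)-th highest bid per click, while under VCG it pays the externality imposed on lower-ranked bidders. The sketch below assumes slots ordered by decreasing click-through rate and bidders ranked by bid; the OPT mechanism itself is the paper's contribution and is not reproduced here.

```python
def gsp_payments(bids, ctr):
    """GSP per-click payments: slot k's winner pays the (k+1)-th bid.
    Returns (bidder index, per-click payment) per slot."""
    order = sorted(range(len(bids)), key=lambda i: -bids[i])
    n = min(len(ctr), len(order))
    return [(order[k], bids[order[k + 1]] if k + 1 < len(order) else 0.0)
            for k in range(n)]

def vcg_payments(bids, ctr):
    """VCG per-click payments: slot k's winner pays, per click, the
    sum over lower slots of (CTR drop) * (displaced bidder's bid)."""
    order = sorted(range(len(bids)), key=lambda i: -bids[i])
    n = min(len(ctr), len(order))
    alpha = list(ctr) + [0.0]                    # CTRs with a zero sentinel
    pays = []
    for k in range(n):
        total = sum((alpha[j] - alpha[j + 1]) *
                    (bids[order[j + 1]] if j + 1 < len(order) else 0.0)
                    for j in range(k, n))
        pays.append((order[k], total / alpha[k] if alpha[k] else 0.0))
    return pays
```

For bids [10, 8, 5] and CTRs [0.2, 0.1], GSP charges the top slot 8 per click while VCG charges 6.5, illustrating the well-known GSP >= VCG payment ordering under truthful bids.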

Relevance: 20.00%

Publisher:

Abstract:

Relatively few studies have addressed water management and adaptation measures in the face of changing water balances due to climate change. The current work studies the impact of climate change on the performance of a multipurpose reservoir and derives adaptive policies for possible future scenarios. The method developed in this work is illustrated with a case study of Hirakud reservoir on the Mahanadi river in Orissa, India, a multipurpose reservoir serving flood control, irrigation and power generation. Climate change effects on annual hydropower generation and four performance indices (reliability with respect to the three reservoir functions of hydropower, irrigation and flood control; resiliency; vulnerability; and deficit ratio with respect to hydropower) are studied. Outputs from three general circulation models (GCMs) for three scenarios each are downscaled to monsoon streamflow in the Mahanadi river for two future time slices, 2045-65 and 2075-95. Increased irrigation demands, rule curves dictated by an increased need for flood storage, and downscaled streamflow projections from the ensemble of GCMs and scenarios are used for projecting future hydrologic scenarios. It is seen that hydropower generation and the reliabilities with respect to hydropower and irrigation are likely to decrease in future in most scenarios, whereas the deficit ratio and vulnerability are likely to increase as a result of climate change if the standard operating policy (SOP) using current rule curves for flood protection is employed. An optimal monthly operating policy is then derived using stochastic dynamic programming (SDP) as an adaptive policy for mitigating the impacts of climate change on reservoir operation. The objective of this policy is to maximize the reliabilities with respect to the multiple reservoir functions of hydropower, irrigation and flood control. In variations to this adaptive policy, increasing weight is given to maximizing reliability with respect to hydropower for two extreme scenarios. It is seen that by marginally sacrificing reliability with respect to irrigation and flood control, hydropower reliability and generation can be increased for future scenarios. This suggests that reservoir rules for flood control may have to be revised in basins where climate change projects an increasing probability of droughts. However, it is also seen that power generation cannot be restored to current levels, due in part to the large projected increases in irrigation demand. This suggests that future water balance deficits may limit the success of adaptive policy options.
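
SDP for reservoir operation is typically a backward recursion over discretized (month, storage class, inflow class) states. The sketch below shows that recursion in generic form, assuming you supply a storage-transition function, a weighted-reliability reward, and inflow-class transition probabilities; the discretization and the paper's specific objective weights are not reproduced.

```python
import numpy as np

def sdp_policy(n_s, n_q, n_r, step, reward, p_inflow, T=12, sweeps=40):
    """Stochastic dynamic programming sketch for monthly reservoir
    operation. step(t, s, q, r) -> next storage class; reward(t, s, q, r)
    scores the weighted objectives (e.g. reliabilities); p_inflow[t][q]
    gives next-month inflow-class probabilities from class q."""
    V = np.zeros((T, n_s, n_q))
    policy = np.zeros((T, n_s, n_q), dtype=int)
    for _ in range(sweeps):                    # iterate the annual cycle to convergence
        for t in reversed(range(T)):
            for s in range(n_s):
                for q in range(n_q):
                    best, arg = -np.inf, 0
                    for r in range(n_r):       # candidate release decisions
                        s2 = step(t, s, q, r)
                        val = reward(t, s, q, r) + p_inflow[t][q] @ V[(t + 1) % T, s2]
                        if val > best:
                            best, arg = val, r
                    V[t, s, q], policy[t, s, q] = best, arg
    return policy
```

Reweighting reward() toward hydropower reproduces, in spirit, the paper's variations that trade irrigation and flood-control reliability for power reliability.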

Relevance: 20.00%

Publisher:

Abstract:

This paper proposes the use of empirical modeling techniques for building microarchitecture-sensitive models for compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines and radial basis function networks. We use the generated models to a) predict program performance at arbitrary compiler/microarchitecture configurations, b) quantify the significance of complex interactions between optimizations and the microarchitecture, and c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average prediction error) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
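
Of the three learning techniques named, linear regression is the simplest to illustrate the fit-then-search loop: encode each configuration as a feature vector (0/1 optimization flags plus microarchitectural parameters), fit on measured runtimes, then rank unmeasured candidates by prediction. A minimal sketch, with the encoding assumed:

```python
import numpy as np

def fit_and_search(configs, runtimes, candidates):
    """Fit a linear model mapping configuration vectors to measured
    runtime, then return the candidate predicted to run fastest.
    configs: (n, d) measured configurations; candidates: (m, d) unmeasured."""
    X = np.column_stack([np.ones(len(configs)), configs])     # add intercept
    beta, *_ = np.linalg.lstsq(X, runtimes, rcond=None)       # least-squares fit
    C = np.column_stack([np.ones(len(candidates)), candidates])
    pred = C @ beta                                           # predicted runtimes
    return candidates[np.argmin(pred)], pred
```

Capturing the flag-microarchitecture interactions the paper quantifies would require interaction terms or the splines/RBF models it also evaluates; a purely linear fit treats each flag's effect as additive.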

Relevance: 20.00%

Publisher:

Abstract:

Bandwidth allocation for multimedia applications in cases of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of the applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible and promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications that deploys static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. The technique adaptively finds an alternate patch-up route for every congested/failed link and reallocates bandwidth for the affected multimedia applications. The designed method has been tested (analytically and through simulation) with various network sizes and conditions, and the results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes in providing flexibility, adaptability, software reusability and maintainability.
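
The patch-up idea reduces, at its core, to finding an alternate route with enough residual bandwidth for the affected flow, something a mobile agent could compute hop by hop. The sketch below shows a minimal min-hop version of that search; the graph representation and demand check are assumptions for illustration, not the paper's agent protocol.

```python
import heapq

def patchup_route(graph, src, dst, demand):
    """Find a minimum-hop route from src to dst using only links whose
    residual bandwidth covers the demand. graph: {u: [(v, residual_bw)]}.
    Returns the path as a list of nodes, or None if no route qualifies."""
    pq = [(0, src, [src])]
    seen = set()
    while pq:
        hops, u, path = heapq.heappop(pq)
        if u == dst:
            return path
        if u in seen:
            continue
        seen.add(u)
        for v, bw in graph.get(u, []):
            if bw >= demand and v not in seen:   # skip congested/failed links
                heapq.heappush(pq, (hops + 1, v, path + [v]))
    return None
```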

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we present an analysis of the bit error rate (BER) performance of space-time block codes (STBCs) from generalized complex orthogonal designs for M-PSK modulation. In STBCs from complex orthogonal designs (COD), the norms of the column vectors are the same (e.g., the Alamouti code). However, in generalized COD (GCOD), the norms of the column vectors need not be the same (e.g., the rate-3/5 and rate-7/11 codes by Su and Xia in [1]). STBCs from GCOD are of interest because of the high rates they can achieve (in [2], it has been shown that the maximum achievable rate for STBCs from GCOD is bounded by 4/5). While the BER performance of STBCs from COD (e.g., the Alamouti code) can be obtained simply from existing analytical expressions for receive diversity with the same diversity order by appropriately scaling the SNR, this cannot be done for STBCs from GCOD because of the unequal norms of the column vectors. Our contribution in this paper is that we derive analytical expressions for the BER performance of any STBC from GCOD. Our BER analysis for the GCOD captures the performance of STBCs from COD as special cases. We validate our results with two STBCs from GCOD reported by Su and Xia in [1], for 5 and 6 transmit antennas (G(5) and G(6) in [1]), with rates 7/11 and 3/5, respectively.
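
The equal-column-norm special case the analysis reduces to, the Alamouti code, is easy to check by Monte Carlo, which is how analytical BER curves like the paper's are usually sanity-tested. The sketch below simulates Alamouti with 2 transmit and 1 receive antenna and BPSK; it is a generic textbook simulation (SNR bookkeeping conventions vary), not the paper's GCOD derivation.

```python
import numpy as np

def alamouti_bpsk_ber(snr_db, n_bits=200_000, seed=1):
    """Monte-Carlo BER of the Alamouti code (2 Tx, 1 Rx, BPSK) over a
    flat Rayleigh channel, unit-energy symbols per antenna."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, (n_bits // 2, 2))
    s = 2.0 * bits - 1.0                      # BPSK symbol pairs (s1, s2)
    shape = (len(s), 2)
    h = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    n = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2 * snr)
    # two symbol periods: r1 = h1 s1 + h2 s2, r2 = -h1 s2* + h2 s1*
    r1 = h[:, 0] * s[:, 0] + h[:, 1] * s[:, 1] + n[:, 0]
    r2 = -h[:, 0] * s[:, 1] + h[:, 1] * s[:, 0] + n[:, 1]   # real symbols: s* = s
    # Alamouti combining restores orthogonality: y_k = (|h1|^2 + |h2|^2) s_k + noise
    y1 = h[:, 0].conj() * r1 + h[:, 1] * r2.conj()
    y2 = h[:, 1].conj() * r1 - h[:, 0] * r2.conj()
    dec = np.column_stack([y1.real > 0, y2.real > 0]).astype(int)
    return (dec != bits).mean()
```

For GCOD codes such as G(5) and G(6), the unequal column norms break this simple combining symmetry, which is exactly why the paper needs dedicated BER expressions.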

Relevance: 20.00%

Publisher:

Abstract:

Keyword-based search techniques suffer from the problem of synonymic and polysemic queries. Current approaches address only the problem of synonymic queries, in which different queries might have the same information requirement. The problem of polysemic queries, i.e., the same query having different intentions, still remains unaddressed. In this paper, we propose the notion of intent clusters, the members of which have the same intention. We develop a clustering algorithm that uses the user session information in query logs, in addition to query-URL entries, to identify clusters of queries having the same intention. The proposed approach has been studied through case examples from actual log data from AOL, and the clustering algorithm is shown to be successful in discerning user intentions.
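
A stripped-down version of intent clustering groups queries whose clicked-URL sets overlap; the sketch below uses single-link merging on Jaccard similarity. It deliberately omits the session information the paper's algorithm also exploits, and the threshold is an assumed parameter.

```python
def intent_clusters(query_urls, threshold=0.3):
    """Group queries by clicked-URL overlap (Jaccard >= threshold).
    query_urls: {query: set of clicked URLs}. Returns a list of clusters,
    each a dict with the member queries and their pooled URLs."""
    clusters = []
    for q, urls in query_urls.items():
        for c in clusters:
            union = urls | c['urls']
            if union and len(urls & c['urls']) / len(union) >= threshold:
                c['queries'].append(q)      # single-link merge into this cluster
                c['urls'] |= urls
                break
        else:
            clusters.append({'queries': [q], 'urls': set(urls)})
    return clusters
```

A polysemic query would end up contributing clicks to URLs in several clusters, which is the signal the paper uses to separate its distinct intentions.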

Relevance: 20.00%

Publisher:

Abstract:

Search engine log files have been used to gather direct user feedback on the relevance of the documents presented in the results page. Typically, the relative position of the clicks gathered from the log files is used as a proxy for direct user feedback. In this paper we identify reasons why the relative position of clicks is incomplete for deciphering user preferences. Hence, we propose using the time spent by the user in reading through a document as indicative of user preference for that document with respect to a query. We also identify the issues involved in using the time measure and propose means to address them.
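
One of the obvious issues with raw dwell time is that longer documents naturally take longer to read; a simple mitigation is to normalize dwell time by document length before ranking. The sketch below illustrates that idea only; it is an assumed normalization, not necessarily the correction the paper proposes.

```python
def dwell_preference(clicks):
    """Rank clicked documents for a query by reading time per word rather
    than by click position. clicks: [(doc_id, dwell_seconds, doc_words)].
    Returns (doc_id, score) pairs, highest inferred preference first."""
    scored = [(doc, t / max(words, 1)) for doc, t, words in clicks]
    return sorted(scored, key=lambda x: -x[1])
```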