99 results for Search procedures


Relevance:

20.00%

Publisher:

Abstract:

The application of Gaussian quadrature (GQ) procedures to the evaluation of i–E curves in linear sweep voltammetry is advocated. It is shown that a high degree of precision is achieved with these methods and that the values obtained through GQ agree well with (and in some cases improve upon) the values reported in the literature, for example by Nicholson and Shain. A further welcome feature of GQ is that it can also be interpreted as an elegant and efficient analytic approximation scheme. A comparison of the values obtained by this approach with those from a recent series-approximation scheme proposed by Oldham shows excellent agreement.
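
As a rough illustration of the quadrature step involved (not the paper's specific voltammetric integrand), the sketch below applies Gauss-Legendre quadrature to a generic integral; the integrand, interval, and node count are placeholder choices.

```python
import numpy as np

def gauss_legendre_integral(f, a, b, n=16):
    """Approximate the integral of f over [a, b] with n-point Gauss-Legendre quadrature."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)    # map nodes from [-1, 1] to [a, b]
    return 0.5 * (b - a) * np.sum(weights * f(x))

# Placeholder integrand; the voltammetric current integral in the paper is different.
print(gauss_legendre_integral(lambda x: np.exp(-x ** 2), 0.0, 1.0, n=24))
```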

Relevance:

20.00%

Publisher:

Abstract:

The paper presents simple graphical procedures for the position synthesis of plane linkage mechanisms with sliding inputs and output for generating functions of two independent variables. The procedures are based on point-position reduction and permit the linkage to be synthesized to satisfy up to five arbitrarily selected precision positions.

Relevance:

20.00%

Publisher:

Abstract:

We explore here the acceleration of convergence of iterative methods for the solution of a class of quasilinear and linear algebraic equations. The specific systems are the finite-difference forms of the Navier-Stokes equations and the energy equation for recirculating flows. The acceleration procedures considered are the successive over-relaxation (SOR) scheme, several implicit methods, and a second-order procedure. A new implicit method, the alternating direction line iterative method, is proposed in this paper; it combines the advantages of the line successive over-relaxation and alternating direction implicit (ADI) methods. The various methods are tested for their computational economy and accuracy on a typical recirculating flow situation. The numerical experiments show that the alternating direction line iterative method is the most economical method of solving the Navier-Stokes equations for all Reynolds numbers in the laminar regime. The usual ADI method is shown to be less attractive at large Reynolds numbers because of the loss of diagonal dominance; this loss can be restored by a suitable choice of the relaxation parameter, but at the cost of accuracy. The accuracy of the new procedure is comparable to that of the well-tested successive over-relaxation method and to the available results in the literature. The second-order procedure turns out to be the most efficient method for the solution of the linear energy equation.
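
For orientation, the following is a minimal sketch of point successive over-relaxation applied to a 2D Laplace problem on a uniform grid; it is a generic relaxation loop, not the paper's Navier-Stokes solver, and the grid size, boundary values, and relaxation factor are illustrative.

```python
import numpy as np

def sor_laplace(phi, omega=1.7, tol=1e-6, max_iter=10_000):
    """Point successive over-relaxation for the 2D Laplace equation on a uniform grid.
    Boundary values of phi are held fixed; interior values are relaxed in place."""
    for it in range(max_iter):
        max_change = 0.0
        for i in range(1, phi.shape[0] - 1):
            for j in range(1, phi.shape[1] - 1):
                new = 0.25 * (phi[i + 1, j] + phi[i - 1, j] + phi[i, j + 1] + phi[i, j - 1])
                change = omega * (new - phi[i, j])
                phi[i, j] += change
                max_change = max(max_change, abs(change))
        if max_change < tol:
            return phi, it
    return phi, max_iter

grid = np.zeros((41, 41))
grid[0, :] = 1.0          # illustrative boundary condition on one edge
solution, iters = sor_laplace(grid)
print(iters)
```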

Relevance:

20.00%

Publisher:

Abstract:

A simple yet efficient method for the minimization of incompletely specified sequential machines (ISSMs) is proposed. Precise theorems are developed, as a consequence of which several compatibles can be deleted from consideration at the very first stage of the search for a minimal closed cover; the computational work is thereby significantly reduced. The initial cardinality of the minimal closed cover is further reduced by considering the maximal compatibles (MCs) only; as a result, the method converges to the solution faster than existing procedures. The "rank" of a compatible is defined, and it is shown that ordering the compatibles by rank reduces the number of comparisons to be made when searching for compatibles that can be excluded. The new method is simple, systematic, and programmable, and it does not involve any heuristics or intuitive procedures. For small and medium-sized machines it can be used for hand computation as well. For one of the illustrative examples used in this paper, 30 of the 40 compatibles can be ignored in accordance with the proposed rules, and only the remaining 10 compatibles need be considered for obtaining a minimal solution.
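
As a hedged illustration of the kind of compatibility analysis underlying such methods (not the paper's algorithm), the sketch below computes the compatible state pairs of a toy incompletely specified machine by the standard pair-chart iteration; the machine itself is invented for the example.

```python
from itertools import combinations

# Toy ISSM: state -> {input: (next_state, output)}; None marks an unspecified entry.
machine = {
    'A': {'0': ('B', '0'), '1': (None, None)},
    'B': {'0': ('A', '0'), '1': ('C', '1')},
    'C': {'0': (None, '0'), '1': ('A', '1')},
}
inputs = ['0', '1']

def compatible_pairs(machine):
    """Pair-chart computation of compatible state pairs for an ISSM."""
    pairs = {frozenset(p) for p in combinations(machine, 2)}

    def outputs_conflict(s, t):
        return any(machine[s][x][1] is not None and machine[t][x][1] is not None
                   and machine[s][x][1] != machine[t][x][1] for x in inputs)

    # Remove pairs with conflicting specified outputs.
    pairs = {p for p in pairs if not outputs_conflict(*sorted(p))}
    # Iteratively drop pairs whose specified successors are incompatible.
    changed = True
    while changed:
        changed = False
        for p in list(pairs):
            s, t = sorted(p)
            for x in inputs:
                ns, nt = machine[s][x][0], machine[t][x][0]
                if ns and nt and ns != nt and frozenset((ns, nt)) not in pairs:
                    pairs.discard(p)
                    changed = True
                    break
    return pairs

print(compatible_pairs(machine))
```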

Relevance:

20.00%

Publisher:

Abstract:

A method that yields optimal Barker codes of the smallest known lengths for a given discrimination is described.
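
For reference, discrimination is commonly taken as the ratio of the autocorrelation main lobe to the largest sidelobe magnitude; the sketch below computes it for the well-known length-13 Barker code (the code-generation method itself is not reproduced).

```python
import numpy as np

def discrimination(code):
    """Ratio of the autocorrelation main lobe to the largest off-peak sidelobe magnitude."""
    c = np.asarray(code, dtype=float)
    acf = np.correlate(c, c, mode='full')
    peak = acf[len(c) - 1]
    sidelobes = np.delete(acf, len(c) - 1)
    return peak / np.max(np.abs(sidelobes))

barker_13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
print(discrimination(barker_13))   # 13.0 for the length-13 Barker code
```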

Relevance:

20.00%

Publisher:

Abstract:

The following problem is considered: given the locations of the Central Processing Unit (CPU) and the terminals which have to communicate with it, determine the number and locations of the concentrators, and assign the terminals to the concentrators, in such a way that the total cost is minimized. There is also a fixed cost associated with each concentrator, and there is an upper limit to the number of terminals which can be connected to a concentrator. The terminals can also be connected directly to the CPU. In this paper it is assumed that the concentrators can be located anywhere in the area A containing the CPU and the terminals; this makes the problem a multimodal optimization problem. In the proposed algorithm a stochastic automaton is used as a search device to locate the minimum of the multimodal cost function. The algorithm involves the following. The area A containing the CPU and the terminals is divided into an arbitrary number of regions (say K). An approximate value for the number of concentrators (say m) is assumed; the optimum number is determined later by iteration. The m concentrators can be assigned to the K regions in m^K ways (m > K) or K^m ways (K > m); all possible assignments are feasible, i.e. a region can contain 0, 1, ..., m concentrators. Each possible assignment is taken to represent a state of the variable-structure stochastic automaton. To start with, all the states are assigned equal probabilities. At each stage of the search the automaton visits a state according to the current probability distribution, and at each visit it selects a point inside that state with uniform probability. The cost associated with that point is calculated and the average cost of that state is updated; the probabilities of all the states are then updated, being taken to be inversely proportional to the average costs of the states. After a certain number of searches the search probabilities become stationary and the automaton visits a particular state repeatedly; the automaton is then said to have converged to that state. The exact locations of the concentrators are finally determined by conducting a local gradient search within that state. This algorithm was applied to a set of test problems and the results were compared with those given by Cooper's (1964, 1967) EAC algorithm; on average, the proposed algorithm was found to perform better.
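
The following is a simplified sketch of the automaton's state-selection loop described above, with a placeholder cost function standing in for the actual concentrator/terminal connection cost; the region geometry, the local gradient search, and the iteration on m are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def automaton_search(cost, n_states, n_steps=2000):
    """Variable-structure stochastic automaton: states are visited with probabilities
    kept inversely proportional to their running average cost (unvisited states keep
    the current maximum weight so they are still explored)."""
    avg_cost = np.zeros(n_states)
    counts = np.zeros(n_states)
    probs = np.full(n_states, 1.0 / n_states)
    for _ in range(n_steps):
        s = rng.choice(n_states, p=probs)
        point = rng.uniform(0.0, 1.0)              # uniform point inside the chosen region
        c = cost(s, point)
        counts[s] += 1
        avg_cost[s] += (c - avg_cost[s]) / counts[s]
        weights = np.zeros(n_states)
        visited = counts > 0
        weights[visited] = 1.0 / np.maximum(avg_cost[visited], 1e-12)
        weights[~visited] = weights.max()
        probs = weights / weights.sum()
    return int(np.argmax(probs))

# Placeholder multimodal cost over 5 "regions"; the real cost would be the total
# connection plus fixed concentrator cost for a candidate layout.
best = automaton_search(lambda s, x: (s - 3.2) ** 2 + 0.1 * x, n_states=5)
print(best)
```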

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for the sponsored search auction which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and some special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity. Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page of results containing the links most relevant to the query and also sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page, and the advertiser pays the search engine in some appropriate manner for sending the user to its web page. Against every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots; in addition, the search engine needs to decide on a price to be charged to each advertiser. Owing to increasing demand for Internet advertising space, most search engines currently use auction mechanisms for this purpose; these are called sponsored search auctions. A significant percentage of the revenue of Internet giants such as Google, Yahoo!, and MSN comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored search auction context, and pursue the objective of designing a mechanism that is superior to both. In particular, we propose a new mechanism which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided that all other agents also bid their respective true values. Individual rationality ensures that the agents participate voluntarily in the auction, since they are assured of gaining a non-negative payoff by doing so.
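
As a hedged aid to the comparison, the sketch below computes expected GSP and VCG payments for a single keyword using the standard textbook formulas (ranking by bid only, no quality scores); it does not implement the OPT mechanism, and the bids and click-through rates are invented.

```python
def gsp_and_vcg_payments(bids, ctrs):
    """Expected total payments per slot for GSP and VCG in a single-keyword auction.
    bids: advertisers' per-click bids; ctrs: click-through rates of the slots,
    assumed decreasing. Advertisers are ranked by bid."""
    order = sorted(range(len(bids)), key=lambda i: -bids[i])
    b = [bids[i] for i in order]                   # ranked bids
    k = min(len(ctrs), len(b))
    alpha = list(ctrs[:k]) + [0.0]                 # pad so alpha[k] = 0

    # GSP: each winner pays the next-highest bid per click.
    gsp = [alpha[i] * (b[i + 1] if i + 1 < len(b) else 0.0) for i in range(k)]
    # VCG: each winner pays the externality imposed on the advertisers below.
    vcg = [sum(b[j] * (alpha[j - 1] - alpha[j])
               for j in range(i + 1, k + 1) if j < len(b))
           for i in range(k)]
    return order[:k], gsp, vcg

winners, gsp, vcg = gsp_and_vcg_payments(bids=[4.0, 2.5, 1.0], ctrs=[0.3, 0.1])
print(winners, gsp, vcg)   # GSP revenue >= VCG revenue for these truthful bids
```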

Relevance:

20.00%

Publisher:

Abstract:

Because of limited sensor and communication ranges, designing efficient mechanisms for cooperative tasks is difficult. In this article, several negotiation schemes for multiple agents performing a cooperative task are presented. The negotiation schemes provide suboptimal solutions but have the attractive features of fast decision-making and scalability to a large number of agents without increasing the complexity of the algorithm. A software agent architecture for the decision-making process is also presented. The effect of the magnitude of information flow during the negotiation process is studied by using different models of the negotiation scheme. The performance of the various negotiation schemes, using different information structures, is studied in terms of the uncertainty reduction achieved for a specified number of search steps. The negotiation schemes perform comparably to the optimal strategy in terms of uncertainty reduction while requiring very low computational time, approximately 7 per cent of that of the optimal strategy. Finally, an analysis of the computational and communication requirements of the negotiation schemes is carried out.

Relevance:

20.00%

Publisher:

Abstract:

Site-specific geotechnical data are always random and variable in space. In the present study, a procedure for quantifying the variability in geotechnical characterization and design parameters is discussed using site-specific cone tip resistance data (qc) obtained from the static cone penetration test (SCPT). The parameters for the spatial variability modeling of geotechnical parameters, i.e. (i) the existing trend function in the in situ qc data; (ii) second-moment statistics, i.e. the mean, variance, and auto-correlation structure of the soil strength and stiffness parameters; and (iii) inputs from the spatial correlation analysis, are utilized in numerical modeling procedures using the finite difference code FLAC 5.0. The influence of considering spatially variable soil parameters on reliability-based geotechnical design is studied for two cases: (a) the bearing capacity analysis of a shallow foundation resting on a clayey soil, and (b) the analysis of the stability and deformation pattern of a cohesive-frictional soil slope. The study highlights the procedure for conducting a site-specific study using field test data such as SCPT in geotechnical analysis and demonstrates that a few additional computations involving soil variability provide a better insight into the role of variability in design.
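
A minimal sketch of the first two ingredients mentioned, trend removal and autocorrelation estimation, is given below for an equally spaced, synthetic qc profile; it is not tied to the FLAC modeling described in the abstract.

```python
import numpy as np

def trend_and_autocorrelation(depth, qc, max_lag=20):
    """Remove a linear trend from cone tip resistance qc(z) and estimate the sample
    autocorrelation of the residuals at integer lags (equally spaced data assumed)."""
    coeffs = np.polyfit(depth, qc, deg=1)          # linear trend qc = a*z + b
    residual = qc - np.polyval(coeffs, depth)
    residual = residual - residual.mean()
    var = np.dot(residual, residual) / len(residual)
    acf = [np.dot(residual[:-k] if k else residual, residual[k:]) / (len(residual) * var)
           for k in range(max_lag + 1)]
    return coeffs, np.array(acf)

# Synthetic, equally spaced qc profile for illustration only.
z = np.linspace(0.0, 10.0, 200)
qc = 2.0 + 0.8 * z + np.random.default_rng(1).normal(0.0, 0.5, z.size)
trend, acf = trend_and_autocorrelation(z, qc)
print(trend, acf[:5])
```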

Relevance:

20.00%

Publisher:

Abstract:

The estimation of the frequency of a sinusoidal signal is a well-researched problem. In this work we propose an initialization scheme for the popular dichotomous search of the periodogram peak algorithm (DSPA), which is used to estimate the frequency of a sinusoid in white Gaussian noise. Our initialization has low computational cost and gives the same performance as the DSPA while reducing the number of iterations needed for the fine search stage. We show that our algorithm remains stable as we reduce the number of iterations in the fine search stage. We also compare our modification to a previous modification of the DSPA and show that our initialization technique enhances the performance of the algorithm.
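
As a rough sketch of the overall idea (a coarse periodogram peak followed by a halving fine search), the code below refines an FFT-bin estimate by comparing periodogram values on either side of the current estimate; this is a simplified bisection-style refinement, not the exact DSPA or the proposed initialization.

```python
import numpy as np

def periodogram_value(x, f):
    """|DTFT|^2 of x at normalized frequency f (cycles/sample)."""
    n = np.arange(len(x))
    return np.abs(np.sum(x * np.exp(-2j * np.pi * f * n))) ** 2

def estimate_frequency(x, n_fine=20):
    """Coarse FFT peak followed by a dichotomous-style refinement of the periodogram maximum."""
    N = len(x)
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    k = int(np.argmax(spectrum[1:]) + 1)          # coarse peak bin (skip DC)
    f = k / N
    step = 0.5 / N
    for _ in range(n_fine):                        # fine search: move toward the larger side, halve
        if periodogram_value(x, f + step) > periodogram_value(x, f - step):
            f += step
        else:
            f -= step
        step *= 0.5
    return f

rng = np.random.default_rng(0)
n = np.arange(256)
x = np.cos(2 * np.pi * 0.1234 * n) + 0.1 * rng.standard_normal(n.size)
print(estimate_frequency(x))     # close to the true normalized frequency 0.1234
```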

Relevance:

20.00%

Publisher:

Abstract:

The accretion disk around a compact object is a nonlinear general relativistic system involving magnetohydrodynamics. Naturally, the question arises whether such a system is chaotic (deterministic) or stochastic (random), which might be related to the associated transport properties whose origin is still not confirmed. Earlier, the black hole system GRS 1915+105 was shown to exhibit low-dimensional chaos in certain temporal classes. However, such nonlinear phenomena have so far not been studied nearly as well for neutron stars, which are unique for their magnetospheres and kHz quasi-periodic oscillations (QPOs). On the other hand, it has been argued that the QPO is a result of nonlinear magnetohydrodynamic effects in accretion disks. If a neutron star exhibits a chaotic signature, then what is its chaotic/correlation dimension? We analyze RXTE/PCA data of the neutron stars Sco X-1 and Cyg X-2, along with the black hole Cyg X-1 and the unknown source Cyg X-3, and show that while Sco X-1 and Cyg X-2 are low-dimensional chaotic systems, Cyg X-1 and Cyg X-3 are stochastic sources. Based on our analysis, we argue that Cyg X-3 may be a black hole.
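
For context, a common way to estimate a correlation dimension from a light curve is the Grassberger-Procaccia correlation sum over a delay embedding; the sketch below applies it to a synthetic signal, with the embedding dimension, delay, and scaling range chosen arbitrarily rather than taken from the paper.

```python
import numpy as np

def correlation_sum(series, emb_dim=5, delay=1, radii=None):
    """Grassberger-Procaccia correlation sum C(r) for a delay-embedded time series.
    The correlation dimension is the slope of log C(r) versus log r in the scaling region."""
    n = len(series) - (emb_dim - 1) * delay
    vectors = np.array([series[i:i + emb_dim * delay:delay] for i in range(n)])
    dists = np.sqrt(((vectors[:, None, :] - vectors[None, :, :]) ** 2).sum(-1))
    dists = dists[np.triu_indices(n, k=1)]
    if radii is None:
        radii = np.logspace(np.log10(dists[dists > 0].min()), np.log10(dists.max()), 20)
    counts = np.array([(dists < r).mean() for r in radii])
    return radii, counts

# Illustrative low-dimensional signal; real RXTE light curves would replace it.
t = np.arange(600)
series = np.sin(0.05 * t) + 0.01 * np.random.default_rng(2).standard_normal(t.size)
r, c = correlation_sum(series)
slope = np.polyfit(np.log(r[5:15]), np.log(c[5:15] + 1e-12), 1)[0]
print(slope)   # rough dimension estimate over the chosen scaling range
```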

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present numerical evidence that supports the notion of energy minimization in the sequence space of proteins for a target conformation. We use the conformations of real proteins in the Protein Data Bank (PDB) and present computationally efficient methods to identify the sequences with minimum energy. We use an edge-weighted connectivity graph to rank the residue sites with a reduced amino acid alphabet, and then use continuous optimization to obtain the energy-minimizing sequences. Our methods enable the computation of a lower bound as well as a tight upper bound for the energy of a given conformation. We validate our results by using three different inter-residue energy matrices for five proteins from the PDB, and by comparing our energy-minimizing sequences with 80 million diverse sequences that are generated based on different considerations in each case. When we submitted some of our chosen energy-minimizing sequences to the Basic Local Alignment Search Tool (BLAST), we obtained sequences from the non-redundant protein sequence database that are similar to ours, with E-values of the order of 10^-7. In summary, we conclude that proteins show a trend towards minimizing energy in the sequence space but do not seem to adopt the global energy-minimizing sequence. The reason for this could be either that the existing energy matrices are not able to accurately represent the inter-residue interactions in the context of the protein environment, or that Nature does not push the optimization in the sequence space once it is able to perform the function.
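
As a hedged toy illustration of bounding the sequence-space energy of a fixed contact graph (not the paper's graph-ranking or continuous-optimization method), the sketch below computes a crude relaxation lower bound and a greedy upper bound with an invented two-letter energy matrix.

```python
import numpy as np

def energy_bounds(contacts, n_sites, energy_matrix):
    """Illustrative bounds on the minimum sequence energy of a fixed contact graph:
    lower bound = sum over contacts of the smallest pair energy (ignores consistency),
    upper bound = energy of a greedy site-by-site assignment."""
    E = np.asarray(energy_matrix)
    lower = len(contacts) * E.min()

    neighbours = {i: [] for i in range(n_sites)}
    for i, j in contacts:
        neighbours[i].append(j)
        neighbours[j].append(i)
    seq = {}
    for site in range(n_sites):
        # Pick the residue type minimizing energy with already-assigned contacting sites.
        costs = [sum(E[a, seq[j]] for j in neighbours[site] if j in seq)
                 for a in range(E.shape[0])]
        seq[site] = int(np.argmin(costs))
    upper = sum(E[seq[i], seq[j]] for i, j in contacts)
    return lower, upper, [seq[i] for i in range(n_sites)]

# Toy two-letter (H/P-like) energy matrix and a 5-site contact graph, for illustration only.
E = [[-2.0, -0.5],
     [-0.5, 0.0]]
contacts = [(0, 2), (1, 3), (2, 4), (0, 4)]
print(energy_bounds(contacts, n_sites=5, energy_matrix=E))
```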

Relevance:

20.00%

Publisher:

Abstract:

In this paper we analyze a deploy-and-search strategy for multi-agent systems. Mobile agents equipped with sensors carry out a search operation in the search space. The lack of information about the search space is modeled as an uncertainty density distribution over the space, which is assumed to be known to the agents a priori. In each step, the agents deploy themselves in an optimal way so as to maximize the per-step reduction in the uncertainty density. We analyze the proposed strategy for convergence and spatial distributedness. The control law moving the agents is analyzed for stability and convergence using LaSalle's invariance principle, and for spatial distributedness under a few realistic constraints on the control input, such as constant speed, a limit on maximum speed, and sensor range limits. The simulation experiments show that the strategy successfully reduces the average uncertainty density below the required level.
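
A rough stand-in for one deployment step is sketched below as a density-weighted, Lloyd-type update with a step-length cap; the paper's actual control law and convergence analysis are not reproduced, and the density field, agent count, and speed limit are placeholders.

```python
import numpy as np

def deploy_step(agents, density, xs, ys, speed_limit=0.05):
    """One Lloyd-style deployment step: each agent moves toward the density-weighted
    centroid of its (discrete) Voronoi cell, with a cap on the step length."""
    X, Y = np.meshgrid(xs, ys, indexing='ij')
    pts = np.stack([X.ravel(), Y.ravel()], axis=1)
    w = density.ravel()
    # Assign every grid point to its nearest agent (discrete Voronoi partition).
    d2 = ((pts[:, None, :] - agents[None, :, :]) ** 2).sum(-1)
    owner = d2.argmin(axis=1)
    new_agents = agents.copy()
    for k in range(len(agents)):
        mask = owner == k
        if w[mask].sum() > 0:
            centroid = (pts[mask] * w[mask, None]).sum(0) / w[mask].sum()
            step = centroid - agents[k]
            norm = np.linalg.norm(step)
            if norm > speed_limit:                 # crude stand-in for a maximum-speed constraint
                step *= speed_limit / norm
            new_agents[k] = agents[k] + step
    return new_agents

xs = ys = np.linspace(0.0, 1.0, 50)
density = np.exp(-((xs[:, None] - 0.7) ** 2 + (ys[None, :] - 0.3) ** 2) / 0.02)
agents = np.random.default_rng(3).uniform(0.0, 1.0, size=(4, 2))
for _ in range(30):
    agents = deploy_step(agents, density, xs, ys)
print(agents)
```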

Relevance:

20.00%

Publisher:

Abstract:

An analysis of large deformations of flexible membrane structures within the tension field theory is considered. A modification of the finite element procedure of Roddeman et al. (Roddeman, D. G., Drukker, J., Oomens, C. W. J., and Janssen, J. D., 1987, ASME J. Appl. Mech., 54, pp. 884-892) is proposed to study the wrinkling behavior of a membrane element. The state of stress in the element is determined through a modified deformation gradient corresponding to a fictive non-wrinkled surface. The new model uses a continuously modified deformation gradient to capture the location and orientation of wrinkles more precisely. It is argued that the fictive non-wrinkled surface may be looked upon as an everywhere-taut surface in the limit as the minor (tensile) principal stresses over the wrinkled portions go to zero. Accordingly, the modified deformation gradient is thought of as the limit of a sequence of everywhere-differentiable tensors. Under dynamic excitations, the governing equations are weakly projected to arrive at a system of nonlinear ordinary differential equations, which is solved using different integration schemes. It is concluded that implicit integrators work much better than explicit ones in the present context.
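
As a hedged reminder of the kind of construction being modified (recalled from the cited Roddeman et al. formulation, not quoted from this paper), the fictive taut state is obtained by stretching the deformed configuration across the wrinkle direction until the stress normal to the wrinkles vanishes:

```latex
% Roddeman-type wrinkling: a fictive (taut) deformation gradient is formed by
% superposing an elongation of magnitude beta across the wrinkle direction n,
% with beta chosen so that the principal stress normal to the wrinkles vanishes.
\bar{\mathbf{F}} = \left(\mathbf{I} + \beta\,\mathbf{n}\otimes\mathbf{n}\right)\mathbf{F},
\qquad \beta \ge 0, \qquad
\mathbf{n}\cdot\boldsymbol{\sigma}\!\left(\bar{\mathbf{F}}\right)\mathbf{n} = 0 .
```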

Relevance:

20.00%

Publisher:

Abstract:

In this paper we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we design a novel auction which we call the OPT (optimal) auction. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We show that the OPT mechanism is superior to two of the most commonly used mechanisms for sponsored search, namely (1) GSP (Generalized Second Price) and (2) VCG (Vickrey-Clarke-Groves). We then show an important revenue equivalence result: the expected revenue earned by the search engine is the same for all three mechanisms, provided the advertisers are symmetric and the number of sponsored slots is strictly less than the number of advertisers.