934 results for iterated local search


Relevance:

30.00%

Publisher:

Abstract:

Contention-based multiple access is a crucial component of many wireless systems. Multiple-packet reception (MPR) schemes that use interference cancellation techniques to receive and decode multiple packets that arrive simultaneously are known to be very efficient. However, the MPR schemes proposed in the literature require complex receivers capable of performing advanced signal processing over significant amounts of soft undecodable information received over multiple contention steps. In this paper, we show that local channel knowledge and elementary received signal strength measurements, which are available to many receivers today, can actively facilitate multiple-packet reception and even simplify the interference-cancelling receiver's design. We introduce two variants of a simple algorithm called Dual Power Multiple Access (DPMA) that use local channel knowledge to limit the receive power levels to two values that facilitate successive interference cancellation. The resulting receiver structure is markedly simpler, as it needs to process only the immediate received signal without having to store and process signals received previously. Remarkably, using a set of three feedback messages, the first variant, DPMA-Lite, achieves a stable throughput of 0.6865 packets per slot. Using four possible feedback messages, the second variant, Turbo-DPMA, achieves a stable throughput of 0.793 packets per slot, which is better than all contention algorithms known to date.
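To make the idea concrete, the sketch below simulates a generic slotted contention channel in which packets arrive at one of two receive power levels and the receiver applies successive interference cancellation, decoding the strongest packet first and subtracting it before attempting the next. The power levels, SINR threshold, noise power and transmission probability are illustrative assumptions; this is not the DPMA protocol itself, whose feedback messages and stability analysis are not reproduced here.

```python
import random

# Toy slotted-contention model with two receive power levels and successive
# interference cancellation (SIC).  All numerical values are assumptions for
# illustration; this is not the DPMA protocol itself.
P_HIGH, P_LOW, NOISE, SINR_THRESHOLD = 4.0, 1.0, 0.1, 2.0

def slot_outcome(n_users, p_tx=0.3):
    """Number of packets decoded in one contention slot."""
    powers = [random.choice([P_HIGH, P_LOW])
              for _ in range(n_users) if random.random() < p_tx]
    decoded = 0
    remaining = sum(powers)
    # Decode strongest-first; cancel each decoded packet before the next try.
    for p in sorted(powers, reverse=True):
        others = remaining - p
        if p / (others + NOISE) >= SINR_THRESHOLD:
            decoded += 1
            remaining -= p
        else:
            break
    return decoded

slots = 100_000
throughput = sum(slot_outcome(10) for _ in range(slots)) / slots
print(f"decoded packets per slot (toy model): {throughput:.3f}")
```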

Relevance:

30.00%

Publisher:

Abstract:

A residual-based strategy to estimate the local truncation error in a finite volume framework for steady compressible flows is proposed. This estimator, referred to as the -parameter, is derived from the imbalance arising from the use of an exact operator on the numerical solution of the conservation laws. The behaviour of the residual estimator for linear and non-linear hyperbolic problems is systematically analysed, and the relationship of the residual to the global error is also studied. The -parameter is used to derive a target length scale and consequently to devise a suitable criterion for refinement/derefinement. This strategy, devoid of any user-defined parameters, is validated using two standard test cases involving smooth flows. A hybrid adaptive strategy, based on both error indicators and the -parameter, is also developed for flows involving shocks. Numerical studies on several compressible flow cases show that the adaptive algorithm performs very well in both two and three dimensions.
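As a rough illustration of how a truncation-error estimate can drive refinement and derefinement, the sketch below converts a per-cell residual estimate into a target length scale by assuming the error scales as h^p for a scheme of design order p, and flags each cell accordingly. The tolerance, the order p and the factor-of-two derefinement threshold are assumptions; the paper's actual estimator and criterion may differ in detail.

```python
import numpy as np

# Minimal sketch of an adaptation criterion driven by a residual-based
# truncation-error estimate.  The exponent p and the tolerance are
# illustrative assumptions.
def adaptation_flags(residual, h, tol=1e-3, p=2):
    """Return per-cell flags: +1 refine, -1 derefine, 0 keep."""
    residual = np.maximum(np.abs(residual), 1e-30)   # avoid division by zero
    h_target = h * (tol / residual) ** (1.0 / p)     # target length scale
    flags = np.zeros_like(h, dtype=int)
    flags[h_target < h] = 1          # cell too coarse for the tolerance
    flags[h_target > 2.0 * h] = -1   # cell finer than necessary
    return flags

# Example: the cell with the largest residual is flagged for refinement.
res = np.array([1e-2, 5e-4, 1e-6])
h = np.array([0.1, 0.1, 0.1])
print(adaptation_flags(res, h))      # [ 1  0 -1]
```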

Relevance:

30.00%

Publisher:

Abstract:

The following problem is considered. Given the locations of the Central Processing Unit (CPU) and the terminals which have to communicate with it, determine the number and locations of the concentrators and assign the terminals to the concentrators in such a way that the total cost is minimized. There is also a fixed cost associated with each concentrator, and there is an upper limit to the number of terminals which can be connected to a concentrator. The terminals can also be connected directly to the CPU. In this paper it is assumed that the concentrators can be located anywhere in the area A containing the CPU and the terminals, so the problem becomes one of multimodal optimization. In the proposed algorithm a stochastic automaton is used as a search device to locate the minimum of the multimodal cost function. The proposed algorithm involves the following. The area A containing the CPU and the terminals is divided into an arbitrary number of regions (say K). An approximate value for the number of concentrators is assumed (say m); the optimum number is determined later by iteration. The m concentrators can be assigned to the K regions in m^K ways (m > K) or K^m ways (K > m). (All possible assignments are feasible, i.e. a region can contain 0, 1, …, m concentrators.) Each possible assignment is taken to represent a state of the variable-structure stochastic automaton. To start with, all the states are assigned equal probabilities. At each stage of the search the automaton visits a state according to the current probability distribution. At each visit the automaton selects a 'point' inside that state with uniform probability, the cost associated with that point is calculated, and the average cost of that state is updated. Then the probabilities of all the states are updated; the probabilities are taken to be inversely proportional to the average costs of the states. After a certain number of searches the search probabilities become stationary and the automaton visits a particular state again and again; the automaton is then said to have converged to that state. The exact locations of the concentrators are then determined by conducting a local gradient search within that state. This algorithm was applied to a set of test problems and the results were compared with those given by Cooper's (1964, 1967) EAC algorithm; on average the proposed algorithm was found to perform better.
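A minimal sketch of the variable-structure stochastic automaton search described above is given below, assuming a caller-supplied cost function and one sampling routine per state (region assignment); the terminal-assignment step and the final local gradient search inside the winning state are omitted.

```python
import random

# Sketch of a variable-structure stochastic automaton search: visit states
# according to the current probabilities, sample a uniform point inside the
# visited state, update that state's average cost, and reset the
# probabilities to be inversely proportional to the average costs.
def automaton_search(states, cost_of_point, n_iterations=5000):
    """states: list of callables, each returning a random point in that state."""
    k = len(states)
    # Visit every state once so that each has a defined average cost.
    avg_cost = [cost_of_point(states[i]()) for i in range(k)]
    visits = [1] * k
    probs = [1.0 / k] * k
    for _ in range(n_iterations):
        s = random.choices(range(k), weights=probs)[0]
        c = cost_of_point(states[s]())      # cost of a uniform point in state s
        visits[s] += 1
        avg_cost[s] += (c - avg_cost[s]) / visits[s]     # running average
        inv = [1.0 / max(a, 1e-12) for a in avg_cost]    # inverse-cost weights
        total = sum(inv)
        probs = [v / total for v in inv]                 # renormalise
    return max(range(k), key=lambda i: probs[i])         # most probable state

# Example: three 1-D "regions" with a quadratic cost; the automaton should
# converge to the region containing the minimum near x = 2.
regions = [lambda a=a, b=b: random.uniform(a, b) for a, b in [(0, 1), (1, 3), (3, 5)]]
best = automaton_search(regions, cost_of_point=lambda x: (x - 2.0) ** 2)
print("converged to region", best)
```

In the full algorithm, the exact concentrator locations would then be obtained by a local gradient search inside the winning region assignment.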

Relevance:

30.00%

Publisher:

Abstract:

The modified local stability scheme is applied to several two-dimensional problems: blunt body flow, regular reflection of a shock, and a lambda shock. The resolution of the flow features obtained by the modified local stability scheme is found to be better than that achieved by other first-order schemes and almost identical to that achieved by second-order schemes incorporating artificial viscosity. The scheme is easy to code and consumes a moderate amount of computer storage and time. It can therefore be used advantageously in place of second-order schemes.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present numerical evidence that supports the notion of minimization in the sequence space of proteins for a target conformation. We use the conformations of real proteins in the Protein Data Bank (PDB) and present computationally efficient methods to identify the sequences with minimum energy. We use an edge-weighted connectivity graph for ranking the residue sites with a reduced amino acid alphabet and then use continuous optimization to obtain the energy-minimizing sequences. Our methods enable the computation of a lower bound as well as a tight upper bound for the energy of a given conformation. We validate our results by using three different inter-residue energy matrices for five proteins from the PDB, and by comparing our energy-minimizing sequences with 80 million diverse sequences that are generated based on different considerations in each case. When we submitted some of our chosen energy-minimizing sequences to the Basic Local Alignment Search Tool (BLAST), we obtained some sequences from the non-redundant protein sequence database that are similar to ours, with E-values of the order of 10^-7. In summary, we conclude that proteins show a trend towards minimizing energy in the sequence space but do not seem to adopt the global energy-minimizing sequence. The reason for this could be either that the existing energy matrices are not able to accurately represent the inter-residue interactions in the context of the protein environment, or that Nature does not push the optimization in the sequence space once the protein is able to perform its function.
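The energy model implied above can be sketched as follows: a sequence threaded onto a fixed conformation is scored by summing pairwise interaction energies over residue pairs that are in contact. The 8 Å Cα contact cut-off and the random interaction matrix below are placeholders for a real inter-residue energy matrix such as Miyazawa-Jernigan; the ranking and continuous-optimization steps of the paper are not reproduced here.

```python
import numpy as np

# Contact-energy sketch: E(sequence | conformation) = sum of pairwise
# interaction energies over contacting residue pairs.  Cut-off and matrix
# values are illustrative assumptions.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def contact_map(ca_coords, cutoff=8.0):
    d = np.linalg.norm(ca_coords[:, None, :] - ca_coords[None, :, :], axis=-1)
    contacts = d < cutoff
    return np.triu(contacts, k=2)        # ignore nearest neighbours along the chain

def sequence_energy(sequence, contacts, energy_matrix):
    idx = [AMINO_ACIDS.index(a) for a in sequence]
    pairs = np.argwhere(contacts)
    return sum(energy_matrix[idx[i], idx[j]] for i, j in pairs)

# Toy example with random coordinates and a random symmetric energy matrix.
rng = np.random.default_rng(0)
coords = rng.random((30, 3)) * 20.0
E = rng.normal(size=(20, 20)); E = (E + E.T) / 2
seq = "".join(rng.choice(list(AMINO_ACIDS), size=30))
print(sequence_energy(seq, contact_map(coords), E))
```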

Relevance:

30.00%

Publisher:

Abstract:

A common trick for designing faster quantum adiabatic algorithms is to apply the adiabaticity condition locally at every instant. However, it is often difficult to determine the instantaneous gap between the lowest two eigenvalues, which is an essential ingredient in the adiabaticity condition. In this paper we present a simple linear-algebraic technique for obtaining a lower bound on the instantaneous gap even in such a situation. As an illustration, we investigate the adiabatic unordered search of van Dam et al. [17] and Roland and Cerf [15] when the non-zero entries of the diagonal final Hamiltonian are perturbed by an amount polynomial in log N, where N is the length of the unordered list. We use our technique to derive a bound on the running time of a local adiabatic schedule in terms of the minimum gap between the lowest two eigenvalues.
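For the unperturbed case, the local adiabatic schedule of Roland and Cerf can be sketched numerically as below: the interpolation parameter s is advanced at a rate proportional to the square of the instantaneous gap, so the total running time is (1/eps) times the integral of 1/g(s)^2, which for unstructured search scales as sqrt(N) rather than N. The perturbed Hamiltonians analysed in the paper would replace g(s) by the derived lower bound on the gap; the tolerance eps and the grid size here are arbitrary assumptions.

```python
import numpy as np

# Gap of the adiabatic unstructured-search Hamiltonian (unperturbed case)
# and the running time of the corresponding local adiabatic schedule.
def gap(s, N):
    return np.sqrt(1.0 - 4.0 * (1.0 - 1.0 / N) * s * (1.0 - s))

def local_schedule_runtime(N, eps=0.1, steps=200_000):
    s = (np.arange(steps) + 0.5) / steps          # midpoints of a uniform grid
    # T = (1/eps) * integral_0^1 ds / g(s)^2, evaluated by the midpoint rule.
    return np.sum(1.0 / gap(s, N) ** 2) / (steps * eps)

for N in (2**10, 2**16, 2**20):
    T = local_schedule_runtime(N)
    # For large N this approaches pi * sqrt(N) / (2 * eps).
    print(f"N = {N:>8}:  T = {T:10.1f}   (pi*sqrt(N)/(2*eps) = {np.pi*np.sqrt(N)/0.2:10.1f})")
```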

Relevance:

30.00%

Publisher:

Abstract:

We present the theoretical foundations for the multiple rendezvous problem, involving the design of local control strategies that enable groups of visibility-limited mobile agents to split into subgroups, exhibit simultaneous taxis behavior towards, and eventually rendezvous at, multiple unknown locations of interest. The theoretical results are proved under a restricted set of assumptions. The algorithm used to solve the problem is based on a glowworm swarm optimization (GSO) technique, developed earlier, that finds multiple optima of multimodal objective functions. The significant difference between our work and most earlier approaches to agreement problems is the use of a virtual local-decision domain by the agents in order to compute their movements. The range of the virtual domain is adaptive in nature and is bounded above by the maximum sensor/visibility range of the agent. We introduce a new decision domain update rule that enhances the rate of convergence by a factor of approximately two. We use illustrative simulations to support the algorithmic correctness and theoretical findings of the paper.
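For reference, the sketch below shows the standard GSO local-decision-domain update of Krishnanand and Ghose, in which the domain radius shrinks when an agent has more neighbours than desired and grows, up to the sensor range, when it has fewer. The paper's new update rule is a modification of this idea, and the parameter values used here are assumptions.

```python
# Standard GSO local-decision-domain update (not the paper's modified rule).
# beta and n_desired are tuning parameters chosen for illustration.
def update_decision_domain(r_d, n_neighbours, r_sensor, beta=0.08, n_desired=5):
    """Shrink the domain when crowded, grow it (up to the sensor range) when sparse."""
    return min(r_sensor, max(0.0, r_d + beta * (n_desired - n_neighbours)))

# Example: an agent with few neighbours enlarges its decision domain.
print(update_decision_domain(r_d=2.0, n_neighbours=1, r_sensor=3.0))  # 2.32
```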

Relevance:

30.00%

Publisher:

Abstract:

It is important to know and quantify both dynamic and static liquid holdups at local levels, as this leads to a proper understanding of various blast furnace phenomena such as slag/metal-gas-solid reactions, gas flow behaviour and the interfacial area between the gas, solid and liquid phases. In the present study, considering the importance of local liquid holdup and the non-availability of holdup data for these systems, an attempt has been made to quantify the local holdups in the dropping zone and around the raceway zone in a cold model study using a packing that is non-wetting for the liquid. In order to quantify the liquid holdups at a microscopic level, a previously developed technique, X-ray radiography, has been used. It is observed that the liquid flows in preferred paths or channels which carry droplets/rivulets. It has also been found that the local holdup in some regions of the packed bed is much higher than the average at a particular flow rate, and this can have important consequences for the correct modelling of such systems.

Relevance:

30.00%

Publisher:

Abstract:

Low-complexity near-optimal detection of signals in MIMO systems with a large number (tens) of antennas is getting increased attention. In this paper, we first propose a variant of the Markov chain Monte Carlo (MCMC) algorithm which i) alleviates the stalling problem encountered in the conventional MCMC algorithm at high SNRs, and ii) achieves near-optimal performance for large numbers of antennas (e.g., 16×16, 32×32, 64×64 MIMO) with 4-QAM. We call this proposed algorithm the randomized MCMC (R-MCMC) algorithm. Second, we propose another algorithm based on a random selection approach to choose candidate vectors to be tested in a local neighborhood search. This algorithm, which we call the randomized search (RS) algorithm, also achieves near-optimal performance for large numbers of antennas with 4-QAM. The complexities of the proposed R-MCMC and RS algorithms are quadratic/sub-quadratic in the number of transmit antennas, which makes them attractive for detection in large-MIMO systems. We also propose message-passing-aided R-MCMC and RS algorithms, which are shown to perform well for higher-order QAM.
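A generic randomized neighbourhood search for 4-QAM detection, in the spirit of (but not identical to) the RS algorithm above, can be sketched as follows: candidate vectors that differ from the current estimate in one randomly chosen coordinate are tested and accepted whenever they reduce the maximum-likelihood cost ||y - Hx||^2. The restart strategy, stopping criteria and the R-MCMC sampling rule of the paper are not reproduced.

```python
import numpy as np

# Randomized single-coordinate neighbourhood search for MIMO detection with
# 4-QAM.  Parameters and the toy channel are illustrative assumptions.
QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def randomized_search_detect(y, H, n_iters=2000, rng=np.random.default_rng(0)):
    n_t = H.shape[1]
    x = rng.choice(QPSK, size=n_t)                   # random initial vector
    cost = np.linalg.norm(y - H @ x) ** 2
    for _ in range(n_iters):
        i = rng.integers(n_t)                        # random coordinate
        candidate = x.copy()
        candidate[i] = rng.choice(QPSK)              # random symbol candidate
        c = np.linalg.norm(y - H @ candidate) ** 2
        if c < cost:                                 # greedy accept
            x, cost = candidate, c
    return x

# Toy 8x8 example at moderate SNR.
rng = np.random.default_rng(1)
n = 8
H = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
x_true = rng.choice(QPSK, size=n)
y = H @ x_true + 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))
x_hat = randomized_search_detect(y, H)
print("symbol errors:", np.sum(x_hat != x_true))
```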

Relevance:

30.00%

Publisher:

Abstract:

We perceive objects as containing a variety of attributes: local features, relations between features, internal details, and global properties. But we know little about how they combine. Here, we report a remarkably simple additive rule that governs how these diverse object attributes combine in vision. The perceived dissimilarity between two objects was accurately explained as a sum of (a) spatially tuned local contour-matching processes modulated by part decomposition; (b) differences in internal details, such as texture; (c) differences in emergent attributes, such as symmetry; and (d) differences in global properties, such as orientation or overall configuration of parts. Our results elucidate an enduring question in object vision by showing that the whole object is not a sum of its parts but a sum of its many attributes.

Relevance:

30.00%

Publisher:

Abstract:

PowerPoint slides from a panel presentation showing the implementation, search result displays, and linking (17 slides).

Relevance:

30.00%

Publisher:

Abstract:

This work is the result of reflections on the readings I have been doing since my undergraduate studies. It starts from the understanding that schooling, contrary to some of its own premises, serves above all the capitalist social order, being one of the most efficient ways of domesticating the human being. This is not to claim that all schooling is always domesticating, but that much of its work ends up contributing to the reduction of the individual's creative capacity. I understand the imposition of the search for truth and the attempt to make all individuals equal to be moral judgements. Such judgements would represent an interpretation of life from the standpoint of a weakened being; that is, their function would be the preservation of the decadent. These moral values also end up diminishing the energy of individuals who could otherwise expand. The school may or may not be a place where the student can have experiences capable of leading to growth. We agree with Nietzsche in understanding that there is an opposition between the will to morality and the will to power. The living being wants to grow, or to preserve itself. The creation of new ways of living, of desiring a life of a different intensity, depends on the instincts. Morality imposes on the individual a constant vigilance and coercion against the expression of these instincts; the energy that could generate new ways of living is instead turned against the individual, tormenting him. The school is still a source of experiences. It remains for us to discover how to use its creative potential.

Relevance:

30.00%

Publisher:

Abstract:

The need to create high-value products for specialist applications, and the search for efficient forming routes that obviate the need for some machining steps, is driving interest in a novel class of forming processes that aim to create locally thickened features within sheet workpieces. A number of novel forming processes have been proposed to meet this need, but it is as yet unclear which processes will be most effective in creating local thickening of various geometries, and many process configurations have yet to be tried. This paper aims to provide some basic principles for designing and characterising process behaviour. A simplified generic description of sheet thickening processes is provided, with two tools operating on a sheet workpiece in plane strain and with the tool separations and motions parameterised. A comprehensive numerical study of the behaviour of this class of processes is conducted in Abaqus to predict the main characteristics of the material flow in each configuration. The results are used to classify the different basic behaviours that can be achieved by sheet-bulk thickening processes and to give guidance on future process development, capability and applicability. © 2011 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

Relevance:

30.00%

Publisher:

Abstract:

How do humans use predictive contextual information to facilitate visual search? How are consistently paired scenic objects and positions learned and used to more efficiently guide search in familiar scenes? For example, a certain combination of objects can define a context for a kitchen and trigger a more efficient search for a typical object, such as a sink, in that context. A neural model, ARTSCENE Search, is developed to illustrate the neural mechanisms of such memory-based contextual learning and guidance, and to explain challenging behavioral data on positive/negative, spatial/object, and local/distant global cueing effects during visual search. The model proposes how global scene layout at a first glance rapidly forms a hypothesis about the target location. This hypothesis is then incrementally refined by enhancing target-like objects in space as a scene is scanned with saccadic eye movements. The model clarifies the functional roles of neuroanatomical, neurophysiological, and neuroimaging data in visual search for a desired goal object. In particular, the model simulates the interactive dynamics of spatial and object contextual cueing in the cortical What and Where streams starting from early visual areas through medial temporal lobe to prefrontal cortex. After learning, model dorsolateral prefrontal cortical cells (area 46) prime possible target locations in posterior parietal cortex based on goal-modulated percepts of spatial scene gist represented in parahippocampal cortex, whereas model ventral prefrontal cortical cells (area 47/12) prime possible target object representations in inferior temporal cortex based on the history of viewed objects represented in perirhinal cortex. The model hereby predicts how the cortical What and Where streams cooperate during scene perception, learning, and memory to accumulate evidence over time to drive efficient visual search of familiar scenes.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To investigate the effect of statin use after radical prostatectomy (RP) on biochemical recurrence (BCR) in patients with prostate cancer who never received statins before RP. PATIENTS AND METHODS: We conducted a retrospective analysis of 1146 RP patients within the Shared Equal Access Regional Cancer Hospital (SEARCH) database. Multivariable Cox proportional hazards analyses were used to examine differences in risk of BCR between post-RP statin users vs nonusers. To account for varying start dates and duration of statin use during follow-up, post-RP statin use was treated as a time-dependent variable. In a secondary analysis, models were stratified by race to examine the association of post-RP statin use with BCR among black and non-black men. RESULTS: After adjusting for clinical and pathological characteristics, post-RP statin use was significantly associated with 36% reduced risk of BCR (hazard ratio [HR] 0.64, 95% confidence interval [CI] 0.47-0.87; P = 0.004). Post-RP statin use remained associated with reduced risk of BCR after adjusting for preoperative serum cholesterol levels. In secondary analysis, after stratification by race, this protective association was significant in non-black (HR 0.49, 95% CI 0.32-0.75; P = 0.001) but not black men (HR 0.82, 95% CI 0.53-1.28; P = 0.384). CONCLUSION: In this retrospective cohort of men undergoing RP, post-RP statin use was significantly associated with reduced risk of BCR. Whether the association between post-RP statin use and BCR differs by race requires further study. Given these findings, coupled with other studies suggesting that statins may reduce risk of advanced prostate cancer, randomised controlled trials are warranted to formally test the hypothesis that statins slow prostate cancer progression.
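For illustration only, and assuming the Python lifelines library, the sketch below shows how statin use can be coded as a time-dependent covariate in a Cox model: each patient's follow-up is split at the month statin use begins, so the indicator changes within subject. The simulated cohort, column names and any resulting effect sizes are assumptions and do not reproduce the SEARCH analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Simulated long-format data: one row per interval of constant covariate value.
rng = np.random.default_rng(0)
rows = []
for pid in range(200):
    follow_up = rng.integers(24, 120)        # months of follow-up after RP
    statin_start = rng.integers(6, 120)      # month statin use begins (may never occur)
    event_time = rng.exponential(150)        # latent time to biochemical recurrence
    end = min(follow_up, event_time)
    event = int(event_time <= follow_up)
    if statin_start < end:                   # covariate switches on mid-follow-up
        rows.append((pid, 0, statin_start, 0, 0))
        rows.append((pid, statin_start, end, 1, event))
    else:
        rows.append((pid, 0, end, 0, event))
df = pd.DataFrame(rows, columns=["id", "start", "stop", "statin", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio for the time-dependent statin indicator
```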