988 results for Search procedures
Abstract:
The estimation of the frequency of a sinusoidal signal is a well-researched problem. In this work we propose an initialization scheme for the popular dichotomous search of the periodogram peak algorithm (DSPA), which is used to estimate the frequency of a sinusoid in white Gaussian noise. Our initialization has low computational cost and matches the performance of the DSPA while reducing the number of iterations needed in the fine search stage. We show that our algorithm remains stable as the number of fine-search iterations is reduced. We also compare our modification against a previous modification of the DSPA and show that our initialization technique enhances the algorithm's performance.
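For readers unfamiliar with the DSPA family, the sketch below illustrates the coarse-search-plus-dichotomous-refinement structure the abstract refers to. It is a minimal illustration, not the authors' initialization scheme; the function name and the simple FFT-bin coarse stage are assumptions.

```python
import numpy as np

def estimate_frequency(x, fs, n_iter=10):
    """Coarse FFT periodogram peak followed by a dichotomous (binary)
    search refinement around that peak."""
    n = len(x)
    # Coarse stage: index of the periodogram maximum
    k = np.argmax(np.abs(np.fft.rfft(x)) ** 2)
    f = k * fs / n
    delta = fs / (2 * n)            # start at half an FFT bin

    t = np.arange(n) / fs
    def power(freq):
        # Periodogram evaluated at an arbitrary frequency
        return np.abs(np.sum(x * np.exp(-2j * np.pi * freq * t))) ** 2

    for _ in range(n_iter):         # fine stage: dichotomous search
        f = f + delta if power(f + delta) > power(f - delta) else f - delta
        delta /= 2
    return f

# Example: a 50.3 Hz sinusoid in white Gaussian noise, sampled at 1 kHz
rng = np.random.default_rng(0)
ts = np.arange(1024) / 1000.0
sig = np.cos(2 * np.pi * 50.3 * ts) + 0.5 * rng.standard_normal(1024)
print(estimate_frequency(sig, fs=1000.0))   # close to 50.3
```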
Abstract:
The accretion disk around a compact object is a nonlinear general relativistic system involving magnetohydrodynamics. Naturally, the question arises whether such a system is chaotic (deterministic) or stochastic (random), which might be related to the associated transport properties, whose origin is still not confirmed. Earlier, the black hole system GRS 1915+105 was shown to exhibit low-dimensional chaos in certain temporal classes. However, such nonlinear phenomena have so far not been studied as thoroughly for neutron stars, which are unique for their magnetospheres and kHz quasi-periodic oscillations (QPOs). On the other hand, it has been argued that the QPO is a result of nonlinear magnetohydrodynamic effects in accretion disks. If a neutron star exhibits a chaotic signature, then what is its chaotic/correlation dimension? We analyze RXTE/PCA data of the neutron stars Sco X-1 and Cyg X-2, along with the black hole Cyg X-1 and the unknown source Cyg X-3, and show that while Sco X-1 and Cyg X-2 are low-dimensional chaotic systems, Cyg X-1 and Cyg X-3 are stochastic sources. Based on our analysis, we argue that Cyg X-3 may be a black hole.
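The abstract does not spell out its estimator, but correlation dimension is conventionally obtained with a Grassberger-Procaccia style analysis of the light curve. The following is a schematic sketch of that standard technique, not the authors' pipeline; the embedding dimension, delay, and radii grid are illustrative parameters.

```python
import numpy as np

def correlation_dimension(series, dim=5, tau=1, n_radii=20):
    """Grassberger-Procaccia estimate of the correlation dimension of a
    scalar time series via time-delay embedding."""
    series = np.asarray(series, float)
    n = len(series) - (dim - 1) * tau
    # Time-delay embedding into a `dim`-dimensional phase space
    emb = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    # Pairwise distances between embedded points (O(n^2) memory: sketch only)
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]
    radii = np.logspace(np.log10(d[d > 0].min()), np.log10(d.max()), n_radii)
    # Correlation sum C(r); the dimension is the slope of log C(r) vs log r
    c = np.array([np.mean(d < r) for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(c + 1e-12), 1)
    return slope
```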
Abstract:
In this paper, we present numerical evidence that supports the notion of minimization in the sequence space of proteins for a target conformation. We use the conformations of real proteins in the Protein Data Bank (PDB) and present computationally efficient methods to identify the sequences with minimum energy. We use an edge-weighted connectivity graph for ranking the residue sites with a reduced amino acid alphabet, and then use continuous optimization to obtain the energy-minimizing sequences. Our methods enable the computation of a lower bound as well as a tight upper bound for the energy of a given conformation. We validate our results by using three different inter-residue energy matrices for five proteins from the PDB, and by comparing our energy-minimizing sequences with 80 million diverse sequences that are generated based on different considerations in each case. When we submitted some of our chosen energy-minimizing sequences to the Basic Local Alignment Search Tool (BLAST), we obtained some sequences from the non-redundant protein sequence database that are similar to ours, with E-values of the order of 10^-7. In summary, we conclude that proteins show a trend towards minimizing energy in the sequence space but do not seem to adopt the global energy-minimizing sequence. The reason could be either that the existing energy matrices are unable to accurately represent the inter-residue interactions in the context of the protein environment, or that Nature does not push the optimization in the sequence space once the protein is able to perform its function.
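To make the objective concrete: the quantity minimized over sequence space is typically a sum of pairwise contact energies for a fixed conformation. The sketch below computes such an energy; the energy matrix and contact map are placeholders (the paper uses three specific inter-residue matrices that are not reproduced here).

```python
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"   # standard 20-letter alphabet

def sequence_energy(seq, contact_map, energy_matrix):
    """Total contact energy of `seq` threaded onto a fixed conformation:
    the sum of pair energies over residue pairs in spatial contact.
    `contact_map` is a symmetric boolean (n x n) array from the structure;
    `energy_matrix` is a 20 x 20 inter-residue energy table."""
    idx = [AMINO.index(a) for a in seq]
    n = len(seq)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            if contact_map[i, j]:
                total += energy_matrix[idx[i], idx[j]]
    return total
```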
Abstract:
In this paper we analyze a deploy and search strategy for multi-agent systems. Mobile agents equipped with sensors carry out a search operation in the search space. The lack of information about the search space is modeled as an uncertainty density distribution over the space, assumed to be known to the agents a priori. In each step, the agents deploy themselves optimally so as to maximize the per-step reduction in the uncertainty density. We analyze the proposed strategy for convergence and spatial distributedness. The control law moving the agents is analyzed for stability and convergence using LaSalle's invariance principle, and for spatial distributedness under a few realistic constraints on the control input, such as constant speed, a limit on maximum speed, and sensor range limits. Simulation experiments show that the strategy successfully reduces the average uncertainty density below the required level.
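As a rough illustration of one deploy-and-search iteration, here is a centroidal-Voronoi style sketch, not the paper's exact control law: each agent moves at constant speed toward the uncertainty-weighted centroid of its region, and sensing then reduces the density within sensor range. All function names and parameters are assumptions.

```python
import numpy as np

def deploy_and_search_step(agents, grid, density, speed=0.05, sensor_range=0.2):
    """One heuristic deploy-and-search iteration. `agents` is (k, 2),
    `grid` is (m, 2) sample points, `density` is the (m,) uncertainty."""
    dists = np.linalg.norm(grid[:, None, :] - agents[None, :, :], axis=-1)
    owner = np.argmin(dists, axis=1)          # Voronoi assignment
    for k in range(len(agents)):
        w = density[owner == k]
        if w.sum() > 0:
            # Density-weighted centroid of this agent's cell
            centroid = (grid[owner == k] * w[:, None]).sum(0) / w.sum()
            step = centroid - agents[k]
            if np.linalg.norm(step) > 0:      # constant-speed control input
                agents[k] += speed * step / np.linalg.norm(step)
    # Sensing after the move: attenuate uncertainty near each agent
    dists = np.linalg.norm(grid[:, None, :] - agents[None, :, :], axis=-1)
    density[dists.min(axis=1) < sensor_range] *= 0.5
    return agents, density
```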
Abstract:
An analysis of large deformations of flexible membrane structures within the tension field theory is considered. A modification of the finite element procedure by Roddeman et al. (Roddeman, D. G., Drukker, J., Oomens, C. W. J., Janssen, J. D., 1987, ASME J. Appl. Mech. 54, pp. 884-892) is proposed to study the wrinkling behavior of a membrane element. The state of stress in the element is determined through a modified deformation gradient corresponding to a fictive non-wrinkled surface. The new model uses a continuously modified deformation gradient to capture the location and orientation of wrinkles more precisely. It is argued that the fictive non-wrinkled surface may be looked upon as an everywhere-taut surface in the limit as the minor (tensile) principal stresses over the wrinkled portions go to zero. Accordingly, the modified deformation gradient is thought of as the limit of a sequence of everywhere-differentiable tensors. Under dynamic excitations, the governing equations are weakly projected to arrive at a system of nonlinear ordinary differential equations, which is solved using different integration schemes. It is concluded that implicit integrators work much better than explicit ones in the present context.
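For orientation, the Roddeman-type construction referenced above modifies the deformation gradient along the wrinkling direction; in its commonly quoted form (the paper's own notation may differ),

\tilde{F} = (I + \beta\, n \otimes n)\, F,

where F is the ordinary deformation gradient, n is the in-plane unit vector perpendicular to the wrinkle lines, and the scalar \beta \ge 0 is chosen so that the minor principal stress computed from the fictive taut surface vanishes over the wrinkled region (\beta = 0 recovers the taut state).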
Abstract:
The legality of the operation of Google's search engine, and its liability as an Internet intermediary, has been tested in various jurisdictions on various grounds. In Australia, there was an ultimately unsuccessful case against Google under the Australian Consumer Law relating to how it presents results from its search engine. Despite this failed claim, several complex issues were not adequately addressed in the case, including whether Google sufficiently distinguishes between the different parts of its search results page so as not to mislead or deceive consumers. This article seeks to address this question of consumer confusion by drawing on empirical survey evidence of Australian consumers' understanding of Google's search results layout. This evidence, the first of its kind in Australia, indicates some level of consumer confusion. The implications for future legal proceedings against Google in Australia and in other jurisdictions are discussed.
Abstract:
In this paper we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we design a novel auction which we call the OPT (optimal) auction. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality for the advertisers. We show that the OPT mechanism is superior to the two most commonly used mechanisms for sponsored search, namely (1) GSP (Generalized Second Price) and (2) VCG (Vickrey-Clarke-Groves). We then show an important revenue equivalence result: the expected revenue earned by the search engine is the same for all three mechanisms, provided the advertisers are symmetric and the number of sponsored slots is strictly less than the number of advertisers.
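For concreteness, the two baseline payment rules the paper compares against can be sketched as follows. This is a toy per-auction computation; the OPT mechanism itself is defined in the paper and not reproduced here, and `ctrs` (slot click-through rates) is an assumed input.

```python
def gsp_payments(bids, ctrs):
    """Generalized Second Price: the bidder winning slot i pays the
    (i+1)-th highest bid per click, i.e. bid[i+1] * ctr[i] in total."""
    order = sorted(range(len(bids)), key=lambda i: -bids[i])
    slots = min(len(ctrs), len(bids) - 1)
    return {order[i]: bids[order[i + 1]] * ctrs[i] for i in range(slots)}

def vcg_payments(bids, ctrs):
    """VCG: each winner pays the total click-value lost by the bidders
    pushed down one slot by its presence (its externality)."""
    order = sorted(range(len(bids)), key=lambda i: -bids[i])
    ctr = lambda j: ctrs[j] if 0 <= j < len(ctrs) else 0.0
    return {order[i]: sum(bids[order[j]] * (ctr(j - 1) - ctr(j))
                          for j in range(i + 1, len(bids)))
            for i in range(min(len(ctrs), len(bids)))}

# Example: 4 advertisers, 2 slots with click rates 0.5 and 0.3
print(gsp_payments([10, 8, 5, 2], [0.5, 0.3]))  # {0: 4.0, 1: 1.5}
print(vcg_payments([10, 8, 5, 2], [0.5, 0.3]))  # {0: 3.1, 1: 1.5}
```

Note how the last-slot winner pays the same under both rules, while higher slots generally pay less under VCG than under GSP at the same bids.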
Abstract:
By detecting leading protons produced in the Central Exclusive Diffractive process, p + p → p + X + p, one can measure the missing mass and scan for possible new particle states such as the Higgs boson. This process augments, in a model-independent way, the standard methods for new particle searches at the Large Hadron Collider (LHC) and will allow detailed analyses of the produced central system, such as the spin-parity properties of the Higgs boson. The exclusive central diffractive process makes possible precision studies of gluons at the LHC and complements the physics scenarios foreseen at the next e+e− linear collider. This thesis first presents the conclusions of the first systematic analysis of the expected precision of the leading-proton momentum measurement and of the accuracy of the reconstructed missing mass. In this initial analysis, the scattered protons are tracked along the LHC beam line, and the uncertainties expected in beam transport and in the detection of the scattered leading protons are accounted for. The main focus of the thesis is the development of the radiation-hard precision detector technology needed to cope with the extremely demanding experimental environment of the LHC. This will be achieved by using a 3D silicon detector design, which in addition to radiation hardness up to 5×10^15 neutrons/cm^2 offers properties such as a high signal-to-noise ratio, fast signal response to radiation, and sensitivity close to the very edge of the detector. This work reports on the development of a novel semi-3D detector design that simplifies the 3D fabrication process but preserves the properties of the 3D detector design required at the LHC and in other imaging applications.
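For reference, the missing mass is reconstructed from the four-momenta of the incoming protons p_1, p_2 and the measured outgoing leading protons p_1', p_2' via the standard relation

M_X^2 = (p_1 + p_2 - p_1' - p_2')^2,

so the precision of M_X is driven directly by the leading-proton momentum measurement discussed above.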
Abstract:
This paper deals with the development of simplified semi-empirical relations for predicting the residual velocities of small-calibre projectiles impacting mild steel target plates, normally or at an angle, and the ballistic limits of such plates. It is shown, for several impact cases for which test results on the perforation of mild steel plates are available, that most of the existing semi-empirical relations, which are applicable only to normal projectile impact, do not yield satisfactory estimates of residual velocity. Furthermore, it is difficult to quantify some of the empirical parameters present in these relations for a given problem. With an eye towards simplicity and ease of use, two new regression-based relations employing standard material parameters are presented here for predicting residual velocity and ballistic limit for both normal and oblique impact. The two expressions differ in their use of quasi-static versus strain-rate-dependent average plate material strength. Residual velocities yielded by the present semi-empirical models compare well with the experimental results. Additionally, ballistic limits from these relations show close correlation with the corresponding finite element-based predictions.
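For context, semi-empirical relations of this kind are often variants of the classical Lambert-Jonas form, quoted here for orientation only (the paper's new regression relations are given in the paper itself):

v_r = a (v_0^p - v_{bl}^p)^{1/p}, for v_0 > v_{bl},

where v_0 is the impact velocity, v_{bl} the ballistic limit, and a and p fitted constants; the Recht-Ipson result corresponds to a = 1, p = 2.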
Abstract:
Keyword-based search suffers from the problems of synonymic and polysemic queries. Current approaches address only the problem of synonymic queries, in which different queries may express the same information need. The problem of polysemic queries, i.e., the same query having different intentions, remains unaddressed. In this paper, we propose the notion of intent clusters, whose members share the same intention. We develop a clustering algorithm that uses the user session information in query logs, in addition to query-URL entries, to identify clusters of queries having the same intention. The proposed approach is studied through case examples from actual AOL log data, and the clustering algorithm is shown to be successful in discerning user intentions.
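A minimal sketch of click-based intent clustering follows, under the assumption that queries sharing clicked URLs share an intent. The paper's algorithm additionally exploits session co-occurrence; the function names, data layout, and overlap threshold here are illustrative.

```python
from collections import defaultdict

def intent_clusters(log, min_overlap=2):
    """Single-link clustering of queries by clicked-URL overlap.
    `log` is a list of (query, clicked_url) pairs."""
    urls = defaultdict(set)
    for query, url in log:
        urls[query].add(url)
    queries = list(urls)
    parent = {q: q for q in queries}          # union-find forest

    def find(q):
        while parent[q] != q:
            parent[q] = parent[parent[q]]     # path halving
            q = parent[q]
        return q

    for i, a in enumerate(queries):           # merge on URL overlap
        for b in queries[i + 1:]:
            if len(urls[a] & urls[b]) >= min_overlap:
                parent[find(a)] = find(b)

    clusters = defaultdict(list)
    for q in queries:
        clusters[find(q)].append(q)
    return list(clusters.values())

log = [("apple pie recipe", "food.example/pie"),
       ("how to bake apple pie", "food.example/pie"),
       ("apple stock price", "finance.example/AAPL")]
print(intent_clusters(log, min_overlap=1))
# [['apple pie recipe', 'how to bake apple pie'], ['apple stock price']]
```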
Abstract:
The study examines the term "low threshold" from the point of view of the most marginalized drug users. Because illicit drug use is criminalised and morally judged in Finland, users face particular barriers to seeking care. Low threshold services aim to reach drug users who do not seek help themselves; "low threshold" is a metaphor describing easy access to services. The theoretical frame of reference of the study consists of treating the term analytically and critically. The research tests the rhetoric of low threshold through a qualitative multi-case study, asking whether the threshold of so-called low threshold services always appears low to the most marginalized drug users. The cases are: a mobile unit offering health counselling, a day service centre for marginalized substance abusers and a low threshold project of an outpatient clinic for drug users, all in Helsinki, and a health counselling service trial in Vyborg, Russia. The case study answers the following questions: 1) How does the low threshold approach work out in the studied cases from the point of view of the most marginalized drug users? 2) How do potential thresholds appear and how did they develop? 3) How do the most marginalized drug users get into the care system through low threshold services? The data consist of interviews with drug users, workers and other specialists carried out in the years 2001-2006, patient documents and customer registers. The dissertation comprises four articles published in 2006-2008 and a summary article. The study shows that even a low threshold is not always low enough for the most marginalized drug users. This is a highly multi-problematic and underprivileged group, whose lives and use of services are framed by deep marginalisation, homelessness, multi-substance use, mental and somatic illnesses and repeated imprisonment. Their use of services is hindered by many factors arising from the care system, from the drug users themselves and from the environment in which they act. In Finland, thresholds generally stem from the practical execution of services and from procedures that do not take into account the fear of control and of being labelled a drug user. Marginalized drug users meet the greatest difficulties when low threshold services are meant to lead on to further, rehabilitative substance abuse care. These difficulties are due to inflexible structures, procedures and divisions of labour in the established care system, and to the poor capacity of drug users to act in the way the care system expects. High expectations of care motivation and the specialisation of the care system turn multi-problematic, multi-substance users into "wrong" customers. In Russia, the thresholds are caused primarily by the rigid control policies that society directs at drug users and by the scantiness of the care system; the ideology of reducing drug-related harm is not approved of, and the care system is unwilling to commit to it. "Low threshold" turns out to be a relative term. The rhetoric of the care system is not enough to unilaterally define the lowness of a threshold; the experiences of drug users and their actual care-seeking activity determine it. Nor does the threshold appear the same to everybody: access of one customer group to a service unit may even raise the threshold for another group. The realization of a low threshold system also holds surprises: one cannot always tell in advance what kind of customers, and how many of them, will be reached.
Keywords: low threshold, marginalized drug users, harm reduction, barriers to services, outreach