33 results for Search and retrieval
Abstract:
Motivated by the need to design efficient and robust fully-distributed computation in highly dynamic networks such as Peer-to-Peer (P2P) networks, we study distributed protocols for constructing and maintaining dynamic network topologies with good expansion properties. Our goal is to maintain a sparse (bounded-degree) expander topology despite heavy churn (i.e., nodes joining and leaving the network continuously over time). We assume that the churn is controlled by an adversary that has complete knowledge and control of which nodes join and leave and at what time, and that has unlimited computational power, but is oblivious to the random choices made by the algorithm. Our main contribution is a randomized distributed protocol that guarantees, with high probability, the maintenance of a constant-degree graph with high expansion even under continuous, high adversarial churn. Our protocol can tolerate a churn rate of up to O(n/polylog(n)) per round (where n is the stable network size). Our protocol is efficient, lightweight, and scalable, and it incurs only O(polylog(n)) overhead for topology maintenance: only polylogarithmic (in n) bits need to be processed and sent by each node per round, and any node's computation cost per round is also polylogarithmic. This protocol is a fundamental ingredient in the design of efficient fully-distributed algorithms for core distributed computing problems, such as agreement, leader election, search, and storage, in highly dynamic P2P networks, and it enables fast and scalable algorithms for these problems that tolerate a large amount of churn.
Abstract:
This paper reports on a research study to identify the nature of the profession of Organisation Development (OD) in the UK and how it has evolved over four decades. The study is designed to compare academic perspectives on OD with what is happening in the world of professional practice. Three forms of data were collected for this study: a content analysis of job advertisements spanning four decades, a bibliometric search, and interviews with subject experts. The findings were analysed through the lenses of institutional theory, the dissemination of ideas, and fads and fashions in management. An emerging insight is that academics and practitioners have developed the OD profession in the UK differently. The reasons for this difference are explored in the discussion.
Abstract:
In a typical shoeprint classification and retrieval system, the first step is to segment meaningful basic shapes and patterns in a noisy shoeprint image. This step has a significant influence on shape descriptors and shoeprint indexing in the later stages. In this paper, we extend a recently developed denoising technique proposed by Buades, called non-local means filtering, into a more general model. In this model, the expected result of an operation on a pixel can be estimated by performing the same operation on all of its reference pixels in the same image. A working pixel's reference pixels are those pixels whose neighbourhoods are similar to the working pixel's neighbourhood, where similarity is based on the correlation between the local neighbourhoods of the working pixel and the reference pixel. We apply a special instance of this general model to the thresholding of very noisy shoeprint images. Visual and quantitative comparisons with two benchmark techniques, those of Otsu and Kittler, are presented in the last section, giving evidence of the effectiveness of our method for thresholding noisy shoeprint images.
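As a concrete illustration of the general model, the sketch below estimates an operation's result at each pixel as a similarity-weighted average of that operation over all reference pixels. It is a minimal brute-force version: the Gaussian-of-distance weight and the parameters patch, h and op are assumptions standing in for the paper's correlation-based similarity, not the authors' implementation.

    import numpy as np

    def reference_pixel_estimate(img, patch=3, h=0.1, op=lambda v: v):
        """Estimate op(pixel) at every pixel as a similarity-weighted
        average of op over reference pixels, i.e. pixels whose local
        neighbourhoods resemble the working pixel's neighbourhood.
        Brute-force O(N^2); practical only for small images."""
        pad = patch // 2
        padded = np.pad(img.astype(float), pad, mode="reflect")
        rows, cols = img.shape
        feats = np.empty((rows * cols, patch * patch))  # one patch per pixel
        k = 0
        for i in range(rows):
            for j in range(cols):
                feats[k] = padded[i:i + patch, j:j + patch].ravel()
                k += 1
        vals = np.asarray(op(img.astype(float)), dtype=float).ravel()
        out = np.empty(rows * cols)
        for k in range(rows * cols):
            d2 = ((feats - feats[k]) ** 2).mean(axis=1)  # patch distances
            w = np.exp(-d2 / (h * h))                    # similarity weights
            out[k] = (w * vals).sum() / w.sum()
        return out.reshape(rows, cols)

A thresholding instance can then take op to be a hard binary decision (for example img > t) and cut the resulting weighted vote at 0.5, so that each pixel's label is effectively voted on by its reference pixels.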
Abstract:
Nurse rostering is a difficult search problem with many constraints. In the literature, a number of approaches have been investigated, including penalty-function methods, to tackle these constraints within genetic algorithm frameworks. In this paper, we investigate an extension of a previously proposed stochastic ranking method, which has demonstrated superior performance to other constraint-handling techniques when tested against a set of constrained optimisation benchmark problems. An initial experiment on nurse rostering problems demonstrates that the stochastic ranking method is better at finding feasible solutions but fails to obtain good results with regard to the objective function. To improve the performance of the algorithm, we hybridise it with a recently proposed simulated annealing hyper-heuristic within a local search and genetic algorithm framework. The hybrid algorithm shows significant improvement over both the genetic algorithm with stochastic ranking and the simulated annealing hyper-heuristic alone. The hybrid algorithm also considerably outperforms the methods in the literature that held the previously best known results.
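For reference, stochastic ranking itself (as proposed by Runarsson and Yao, which the abstract builds on) is a short procedure; the sketch below is a generic version in which f is the roster's objective and phi its total constraint violation, with the commonly suggested p_f = 0.45. It is illustrative, not the paper's hybrid algorithm.

    import random

    def stochastic_rank(pop, f, phi, p_f=0.45, sweeps=None):
        """Rank pop best-first with bubble-sort-like sweeps: adjacent
        individuals are compared by objective f with probability p_f
        (or whenever both are feasible), else by constraint violation phi."""
        n = len(pop)
        sweeps = n if sweeps is None else sweeps
        for _ in range(sweeps):
            swapped = False
            for i in range(n - 1):
                a, b = pop[i], pop[i + 1]
                both_feasible = phi(a) == 0 and phi(b) == 0
                if both_feasible or random.random() < p_f:
                    in_order = f(a) <= f(b)      # compare by objective
                else:
                    in_order = phi(a) <= phi(b)  # compare by violation
                if not in_order:
                    pop[i], pop[i + 1] = b, a
                    swapped = True
            if not swapped:
                break
        return pop

The balance parameter p_f controls how often infeasible but high-quality solutions are allowed to rank ahead of feasible ones, which is exactly the trade-off the abstract describes between feasibility and objective value.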
Abstract:
In a decision feedback equalizer (DFE), the structural parameters, namely the decision delay and the feedforward filter (FFF) and feedback filter (FBF) lengths, must be carefully chosen, as they greatly influence performance. Although the FBF length can be set to the channel memory, there is no closed-form expression for the FFF length and decision delay. In this letter, we first show analytically that the two-dimensional search for the optimum FFF length and decision delay can be simplified to a one-dimensional search, and we then describe a new adaptive DFE in which the optimum structural parameters can be self-adapted.
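As a reference point for that claim, the sketch below computes the MMSE of a DFE for a given FFF length and decision delay by brute force (FBF length set to the channel memory, ideal past decisions assumed), so that the exhaustive two-dimensional sweep can be compared against any one-dimensional simplification. The channel, SNR and helper name dfe_mmse are illustrative; this is not the letter's adaptive scheme.

    import numpy as np

    def dfe_mmse(h, snr_db, n_ff, delay):
        """MMSE of a DFE: FFF length n_ff, FBF length = channel memory,
        decision delay `delay`, unit-energy i.i.d. symbols."""
        nu = len(h) - 1                    # channel memory
        n_sym = n_ff + nu                  # symbols seen by the FFF window
        sigma2 = 10 ** (-snr_db / 10)
        H = np.zeros((n_ff, n_sym))        # Toeplitz model: y = H s + noise
        for i in range(n_ff):
            H[i, i:i + nu + 1] = h[::-1]
        Ryy = H @ H.T + sigma2 * np.eye(n_ff)
        past = [j for j in range(delay - nu, delay) if j >= 0]  # fed-back symbols
        if past:                           # joint observation z = [y; past]
            Hp = H[:, past]
            R = np.block([[Ryy, Hp], [Hp.T, np.eye(len(past))]])
            c = np.concatenate([H[:, delay], np.zeros(len(past))])
        else:
            R, c = Ryy, H[:, delay]
        return 1.0 - c @ np.linalg.solve(R, c)   # MMSE = 1 - c^T R^{-1} c

    h = np.array([0.5, 1.0, 0.3]); h = h / np.linalg.norm(h)
    best = min((dfe_mmse(h, 15, nf, d), nf, d)
               for nf in range(1, 20) for d in range(nf + len(h) - 1))
    print("min MMSE %.4f at n_ff=%d, delay=%d" % best)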
Abstract:
All extra-solar planet masses that have been derived spectroscopically are lower limits, since the inclination of the orbit to our line of sight is unknown except for transiting systems. In theory, however, it is possible to determine the inclination angle, i, between the rotation axis of a star and an observer's line of sight from measurements of the projected equatorial velocity (v sin i), the stellar rotation period (P_rot) and the stellar radius (R_*). For stars which host planetary systems, this allows the removal of the sin i dependency of extra-solar planet masses derived from spectroscopic observations, under the assumption that the planetary orbits lie perpendicular to the stellar rotation axis.
We have carried out an extensive literature search and present a catalogue of v sin i, P_rot and R_* estimates for stars hosting extra-solar planets. In addition, we have used Hipparcos parallaxes and the Barnes-Evans relationship to further supplement the R_* estimates obtained from the literature. Using this catalogue, we have obtained sin i estimates using a Markov chain Monte Carlo analysis. This technique allows proper 1σ two-tailed confidence limits to be placed on the derived sin i's, along with the transit probability for each planet.
While we find that a small proportion of systems yield sin i's significantly greater than 1, most likely due to poor P_rot estimates, the large majority are acceptable. We are further encouraged by the cases where we have data on transiting systems, as the technique indicates inclinations of ~90 degrees and high transit probabilities. In total, we are able to estimate the true masses of 133 extra-solar planets. Of these, only six have revised masses that place them above the 13 M_J deuterium-burning limit; four of those six candidates were already suspected to lie above the deuterium-burning limit before correcting their masses for the sin i dependency. Our work reveals a population of high-mass extra-solar planets with low eccentricities, and we speculate that these extra-solar planets may represent the signature of different planetary formation mechanisms at work. Finally, we discuss future observations that should improve the robustness of this technique.
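The geometric relation underlying the method is simply sin i = (v sin i) / v_eq, with v_eq = 2πR_* / P_rot; the sketch below applies it and removes the sin i degeneracy from a minimum mass. The numbers in the example are illustrative, not catalogue values, and the MCMC error propagation described above is not reproduced.

    import math

    R_SUN_KM = 6.957e5   # nominal solar radius in km
    DAY_S = 86400.0      # seconds per day

    def sin_i(vsini_kms, p_rot_days, r_star_rsun):
        """sin i = (v sin i) / v_eq, with v_eq = 2*pi*R_* / P_rot."""
        v_eq = 2.0 * math.pi * r_star_rsun * R_SUN_KM / (p_rot_days * DAY_S)
        return vsini_kms / v_eq

    def true_mass(m_sini_mjup, vsini_kms, p_rot_days, r_star_rsun):
        """Remove the sin i degeneracy, assuming the planetary orbit is
        perpendicular to the stellar rotation axis."""
        return m_sini_mjup / sin_i(vsini_kms, p_rot_days, r_star_rsun)

    # Illustrative values only: a 1.2 M_J sin i planet around a slow rotator.
    print(true_mass(m_sini_mjup=1.2, vsini_kms=1.5,
                    p_rot_days=30.0, r_star_rsun=1.0))

Values of sin i greater than 1 signal an inconsistency among the three measurements, which is how the poor P_rot estimates mentioned above reveal themselves.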
Abstract:
Recent searches by unbiased, wide-field surveys have uncovered a group of extremely luminous optical transients. The initial discoveries of SN 2005ap by the Texas Supernova Search and of SCP-06F6 in a deep Hubble pencil-beam survey were followed by the Palomar Transient Factory's confirmation of host redshifts for other similar transients. The transients share the common properties of high optical luminosities (peak magnitudes of ~ -21 to -23), blue colors, and a lack of H or He spectral features. The physical mechanism that produces the luminosity is uncertain, with suggestions ranging from jet-driven explosions to pulsational pair instability. Here, we report the most detailed photometric and spectral coverage of an ultra-bright transient (SN 2010gx) detected in the Pan-STARRS 1 sky survey. In common with other transients in this family, early-time spectra show a blue continuum and prominent broad absorption lines of O II. However, about 25 days after discovery, the spectra developed type Ic supernova features, showing the characteristic broad Fe II and Si II absorption lines. Detailed post-maximum follow-up may show that all SN 2005ap and SCP-06F6 type transients are linked to type Ic supernovae. This poses problems for understanding the physics of the explosions: there is no indication from late-time photometry that the luminosity is powered by Ni-56, the broad light curves suggest very large ejected masses, and the slow spectral evolution is quite different from typical Ic timescales. The nature of the progenitor stars and the origin of the luminosity remain intriguing open questions.
Abstract:
A scalable, large-vocabulary, speaker-independent speech recognition system is being developed using Hidden Markov Models (HMMs) for acoustic modeling and a Weighted Finite State Transducer (WFST) to compile sentence, word, and phoneme models. The system comprises a software backend search and an FPGA-based Gaussian calculation, both of which are covered here. In this paper, we present an efficient pipelined design implemented both as an embedded peripheral and as a scalable, parallel hardware accelerator. Both architectures have been implemented on an Alpha Data XRC-5T1 reconfigurable computer housing a Virtex-5 SX95T FPGA. The core has been tested and is capable of calculating a full set of Gaussian results from 3825 acoustic models in 9.03 ms, which, coupled with a backend search over a 5000-word vocabulary, has provided an accuracy of over 80%. Parallel implementations have been designed with up to 32 cores and have been successfully implemented with a clock frequency of 133 MHz.
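The exact FPGA datapath is not given here, but the computation such Gaussian accelerators implement is the standard diagonal-covariance log-likelihood; the sketch below shows it with the per-model constant precomputed offline, as is common practice. Dimensions mirror the 3825-model figure quoted above; the 39-dimensional feature size is an assumption.

    import numpy as np

    def gaussian_loglik(x, means, inv_vars, gconst):
        """Log-likelihood of feature vector x under a bank of diagonal-
        covariance Gaussians: -0.5 * (sum(log(2*pi*var)) + sum((x-mu)^2/var)).
        gconst holds the precomputed per-model constant term."""
        diff2 = (x - means) ** 2
        return -0.5 * (gconst + (diff2 * inv_vars).sum(axis=1))

    rng = np.random.default_rng(0)
    means = rng.standard_normal((3825, 39))
    variances = rng.uniform(0.5, 2.0, (3825, 39))
    gconst = np.log(2 * np.pi * variances).sum(axis=1)   # cached offline
    scores = gaussian_loglik(rng.standard_normal(39), means,
                             1.0 / variances, gconst)
    print(scores.shape, scores.argmax())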
Abstract:
A full hardware implementation of a Weighted Fair Queuing (WFQ) packet scheduler is proposed. The circuit architecture presented has been implemented using Altera Stratix II FPGA technology, utilizing RLDRAM II and QDR II memory components. The circuit can provide fine-granularity Quality of Service (QoS) support at a line throughput rate of 12.8 Gb/s in its current implementation. The authors suggest that, due to the flexible and scalable modular circuit design approach used, the current circuit architecture can be targeted for a full ASIC implementation to deliver 50 Gb/s throughput. The circuit itself comprises three main components: a WFQ algorithm computation circuit, a tag/time-stamp sort and retrieval circuit, and a high-throughput shared buffer. The circuit targets the support of emerging wireline and wireless network nodes that focus on Service Level Agreements (SLAs) and Quality of Experience.
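For orientation, the tagging rule that the WFQ computation and tag-sort circuits jointly realise can be sketched in a few lines: each arriving packet receives a virtual finish tag F = max(V, F_prev[flow]) + length / weight, and the smallest tag is served first. The virtual-time update below (tracking the last dequeued tag) is a common simplification, not the circuit's exact scheme.

    import heapq

    class WFQScheduler:
        """Minimal WFQ sketch: tag on enqueue, serve smallest tag first."""
        def __init__(self, weights):
            self.w = weights                       # flow id -> weight
            self.last_tag = {f: 0.0 for f in weights}
            self.v = 0.0                           # virtual time
            self.heap = []                         # (tag, seq, flow, length)
            self.seq = 0

        def enqueue(self, flow, length):
            tag = max(self.v, self.last_tag[flow]) + length / self.w[flow]
            self.last_tag[flow] = tag
            heapq.heappush(self.heap, (tag, self.seq, flow, length))
            self.seq += 1

        def dequeue(self):
            tag, _, flow, length = heapq.heappop(self.heap)
            self.v = tag
            return flow, length

    wfq = WFQScheduler({"gold": 4, "silver": 1})
    for _ in range(3):
        wfq.enqueue("gold", 1500)
        wfq.enqueue("silver", 1500)
    print([wfq.dequeue()[0] for _ in range(6)])  # gold served first, 4:1 weight

In hardware terms, the heap above corresponds to the tag/time-stamp sort and retrieval circuit, and the tag arithmetic to the WFQ algorithm computation circuit.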
Abstract:
Geoscience methods are increasingly being utilised in criminal, environmental and humanitarian forensic investigations, and the use of such methods is supported by a growing body of experimental and theoretical research. Geoscience search techniques can complement traditional methodologies in the search for buried objects, including clandestine graves, illegal weapons, explosives, drugs, hazardous waste and vehicles. This paper details recent advances in search and detection methods, with case studies and reviews. Relevant examples are given, together with a generalised search workflow and a table of suggested detection techniques. Forensic geoscience techniques are continuing to evolve rapidly, assisting search investigators in detecting hitherto difficult-to-locate forensic targets.
Abstract:
Using the foraging movements of an insectivorous bat, Myotis mystacinus, we describe temporal switching of foraging behaviour in response to resource availability. These observations conform to predictions of optimized search under the Lévy flight paradigm. However, we suggest that this occurs as a result of preference behaviour and knowledge of resource distribution. Preferential behaviour and knowledge of a familiar area generate distinct movement patterns as resource availability changes on short temporal scales. The behavioural response of predators to changes in prey fields can elicit different functional responses, which are considered central to the development of stable predator-prey communities. Recognizing how the foraging movements of an animal relate to environmental conditions also elucidates the evolution of optimized search and the prevalence of discrete strategies in natural systems. Applying techniques that use changes in the frequency distribution of movements facilitates exploration of the processes that underpin behavioural changes.
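The standard tools for this kind of analysis are power-law step-length models and their maximum-likelihood exponent estimates; the sketch below generates Lévy-like steps by inverse-transform sampling and recovers the exponent using the continuous MLE of Clauset and Newman. It illustrates the technique class only; the bat-tracking analysis itself is not reproduced.

    import math, random

    def levy_steps(n, mu, l_min=1.0):
        """Draw n step lengths from p(l) ~ l^(-mu) for l >= l_min
        via inverse-transform sampling (requires mu > 1)."""
        return [l_min * (1.0 - random.random()) ** (-1.0 / (mu - 1.0))
                for _ in range(n)]

    def mle_exponent(steps, l_min=1.0):
        """Continuous power-law MLE: mu_hat = 1 + n / sum(ln(l / l_min))."""
        s = [l for l in steps if l >= l_min]
        return 1.0 + len(s) / sum(math.log(l / l_min) for l in s)

    random.seed(1)
    steps = levy_steps(10000, mu=2.0)   # mu ~ 2 is the classic Levy optimum
    print(round(mle_exponent(steps), 2))

A shift in the fitted exponent as resource availability changes is precisely the kind of change in the frequency distribution of movements that the abstract refers to.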
Abstract:
Scrapers have established an important position in the earthmoving field, as they are independently capable of accomplishing an earthmoving operation. Given that loading a scraper to its capacity does not yield its maximum production, optimizing the scraper's loading time is an essential prerequisite for successful operations management. The relevant literature addresses loading-time optimization through a graphical method that is founded on the invalid assumption that the hauling time is independent of the load time. To correct this, a new algorithmic optimization method that incorporates the golden section search and the bisection algorithm is proposed. Comparison of the results derived from the proposed and the existing methods demonstrates that the latter entails a systematic, needless prolongation of the loading stage, resulting in reduced hourly production and increased cost. Therefore, the proposed method achieves an improved modeling of scraper earthmoving operations and contributes toward more efficient cost management.
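The golden section search named above is a standard bracketing method for a unimodal objective; the sketch below applies it to a toy loading-time model in which payload saturates with load time and hauling time grows with payload, the coupling the paper restores. All model parameters are illustrative, and the bisection component of the proposed method is omitted.

    import math

    def golden_section_max(f, a, b, tol=1e-6):
        """Golden-section search for the maximiser of a unimodal f on [a, b]."""
        inv_phi = (math.sqrt(5.0) - 1.0) / 2.0   # ~0.618
        c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
        while b - a > tol:
            if f(c) >= f(d):
                b, d = d, c
                c = b - inv_phi * (b - a)
            else:
                a, c = c, d
                d = a + inv_phi * (b - a)
        return (a + b) / 2.0

    def hourly_production(t, cap=20.0, k=0.8, fixed=4.0, haul_per_m3=0.15):
        payload = cap * (1.0 - math.exp(-k * t))   # m^3 loaded after t minutes
        cycle = t + fixed + haul_per_m3 * payload  # load-dependent cycle time
        return payload * 60.0 / cycle              # m^3 per hour

    t_opt = golden_section_max(hourly_production, 0.1, 10.0)
    print("optimal load time: %.2f min" % t_opt)

Because the cycle time depends on the payload, the maximiser lies below the full-capacity load time, which is the systematic prolongation the comparison above identifies.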
Abstract:
Approximately 90% of the UK population spend some time in hospital in their final year of life, and more than half of the population die in hospital. This review aims to explore the experiences of general nurses when providing end-of-life care to patients in the acute hospital setting. Nine studies were identified through a literature search, and each was analysed and evaluated until themes emerged. Six themes were drawn from the literature: lack of education and knowledge, lack of time with patients, barriers arising in the culture of the health-care setting, communication barriers, symptom management, and nurses' personal issues. These themes raise concern about the quality of end-of-life care being provided in the acute care setting. The literature appears to be consistent in the view that terminally ill patients are best cared for in specialised care settings, such as palliative care units and hospices. However, increasing demands on health services will result in greater numbers of dying patients being admitted to the acute hospital setting. It is therefore paramount that general nurses' educational needs are met to ensure they develop the clinical competence to provide high-quality, holistic end-of-life care.
Abstract:
This paper studies disinflationary shocks in a non-linear New Keynesian model with search and matching frictions and moral hazard in the labor market. Our focus is on understanding the wage formation process, as well as the welfare costs of disinflations, in the presence of such labor market frictions.
The presence of imperfect information in labor markets imposes a lower bound on worker surplus that varies endogenously. Consequently, equilibrium can take two forms, depending on whether the no-shirking condition is binding. We also evaluate both regimes from a welfare perspective when the economy is subject to a perfectly credible disinflationary shock.
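As background for the two regimes, the textbook Shapiro-Stiglitz no-shirking condition conveys the flavour of the endogenous bound on worker surplus; the notation below is the standard efficiency-wage one, offered only for orientation, since the paper embeds the condition in a search and matching environment:

    w >= rU + e (r + b + q) / q

where e is the disutility of effort, q the probability that a shirker is detected, b the exogenous separation rate, r the discount rate, and rU the flow value of unemployment. When the condition binds, the wage is pinned down by the incentive constraint rather than by bargaining, which produces the regime switch described above.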
Abstract:
An unusual application of hydrological understanding to a police search is described. The lacustrine search for a missing person produced reports of bottom-water currents in the lake and contradictory indications from cadaver dogs. A hydrological model of the area was developed using pre-existing information from side-scan sonar, a desktop hydrogeological study and the deployment of water-penetrating radar (WPR). These provided a hydrological theory for the initial search, involving subaqueous groundwater flow focused on an area of bedrock, surrounded by sediment, on the lake floor. The work shows the value a hydrological explanation has to a police search operation (and equally to search and rescue). With hindsight, the desktop study should have preceded the search, allowing a better understanding of water conditions. The ultimate reason for lacustrine flow in this location is still not proven, but the hydrological model explained the problems encountered in the initial search.