928 results for uniform strong consistency


Relevance: 20.00%

Abstract:

Space carving has emerged as a powerful method for multiview scene reconstruction. Although a wide variety of methods have been proposed, the quality of the reconstruction remains highly dependent on the photometric consistency measure and the threshold used to carve away voxels. In this paper, we present a novel photo-consistency measure that is motivated by a multiset variant of the chamfer distance. The new measure is robust to high amounts of within-view color variance and also takes into account the projection angles of back-projected pixels. Another critical issue in space carving is the selection of the photo-consistency threshold used to determine which surface voxels are kept and which are carved away. In this paper, a reliable threshold selection technique is proposed that examines the photo-consistency values at contour generator points. Contour generators are points that lie on both the surface of the object and the visual hull. To determine the threshold, a percentile ranking of the photo-consistency values of these generator points is used. This improved technique is applicable to a wide variety of photo-consistency measures, including the new measure presented in this paper. Also presented is a method to choose between photo-consistency measures and voxel array resolutions prior to carving, using receiver operating characteristic (ROC) curves.
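The percentile-based threshold selection lends itself to a short illustration. Below is a minimal sketch in Python; the function names, the use of NumPy, the keep/carve convention, and the default percentile are our assumptions, since the abstract does not fix these details.

```python
import numpy as np

def select_carving_threshold(generator_scores, percentile=90.0):
    """Photo-consistency threshold from contour generator points.

    generator_scores: photo-consistency values sampled at voxels lying
    on both the object surface and the visual hull (the contour
    generators). The 90th-percentile default is illustrative; the
    paper's exact rank is not stated in the abstract.
    """
    return float(np.percentile(np.asarray(generator_scores), percentile))

def keep_mask(voxel_scores, threshold):
    """Voxels whose score is within the threshold are kept; the rest
    are carved away (assuming lower scores mean better consistency)."""
    return np.asarray(voxel_scores) <= threshold
```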

Relevance: 20.00%

Abstract:

Recent work in sensor databases has focused extensively on distributed query problems, notably distributed computation of aggregates. Existing methods for computing aggregates broadcast queries to all sensors and use in-network aggregation of responses to minimize messaging costs. In this work, we focus on uniform random sampling across nodes, which can serve both as an alternative building block for aggregation and as an integral component of many other useful randomized algorithms. Prior to our work, the best existing proposals for uniform random sampling of sensors involve contacting all nodes in the network. We propose a practical method which is only approximately uniform, but contacts a number of sensors proportional to the diameter of the network instead of its size. The approximation achieved is tunably close to exact uniform sampling, and only relies on well-known existing primitives, namely geographic routing, distributed computation of Voronoi regions and von Neumann's rejection method. Ultimately, our sampling algorithm has the same worst-case asymptotic cost as routing a point-to-point message, and thus it is asymptotically optimal among request/reply-based sampling methods. We provide experimental results demonstrating the effectiveness of our algorithm on both synthetic and real sensor topologies.
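The sampling recipe can be sketched in a few lines: draw a uniform point in the deployment area, let nearest-sensor lookup stand in for geographic routing, and apply von Neumann rejection against Voronoi cell areas. In the real protocol each node computes its own cell distributedly; the centralized simulation and all names below are assumptions of this sketch.

```python
import random

def nearest(sensors, p):
    """Index of the sensor closest to point p. This stands in for the
    geographic-routing step that delivers a request to the node whose
    Voronoi cell contains p."""
    return min(range(len(sensors)),
               key=lambda i: (sensors[i][0] - p[0]) ** 2 +
                             (sensors[i][1] - p[1]) ** 2)

def cell_areas(sensors, bbox, trials=20000, seed=0):
    """Monte Carlo estimate of each sensor's Voronoi cell area."""
    rng = random.Random(seed)
    xmin, ymin, xmax, ymax = bbox
    counts = [0] * len(sensors)
    for _ in range(trials):
        p = (rng.uniform(xmin, xmax), rng.uniform(ymin, ymax))
        counts[nearest(sensors, p)] += 1
    total = (xmax - xmin) * (ymax - ymin)
    # max(c, 1) avoids a zero-area estimate for rarely hit nodes.
    return [max(c, 1) / trials * total for c in counts]

def sample_sensor(sensors, bbox, areas, rng=None):
    """Approximately uniform node sampling via von Neumann rejection."""
    rng = rng or random.Random()
    xmin, ymin, xmax, ymax = bbox
    a_min = min(areas)
    while True:
        p = (rng.uniform(xmin, xmax), rng.uniform(ymin, ymax))
        i = nearest(sensors, p)
        # A node owning a large Voronoi cell is hit more often, so
        # accepting with probability a_min / areas[i] equalizes the
        # selection probability across nodes.
        if rng.random() < a_min / areas[i]:
            return i
```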

Relevance: 20.00%

Abstract:

With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is becoming more pressing and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing upon their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches that are already stale in light of what the client has seen. The mechanism requires no deviation from the existing client-server communication model and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
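A rough sketch of the client-side check may help make the idea concrete. We represent basis tokens as a {resource_id: version} map attached to each response; the actual BTC encoding is not given in the abstract, so this representation and the class below are illustrative assumptions, not the paper's protocol.

```python
class BTCClientCache:
    """Sketch of a client cache driven by per-resource version tokens.

    Each response is assumed to carry tokens: a {resource_id: version}
    map describing the server state it was generated from.
    """

    def __init__(self):
        self.entries = {}   # url -> (body, tokens)
        self.latest = {}    # resource_id -> highest version seen so far

    def on_response(self, url, body, tokens):
        self.entries[url] = (body, tokens)
        # Advance the per-resource high-water marks.
        for rid, ver in tokens.items():
            self.latest[rid] = max(self.latest.get(rid, ver), ver)
        # Evict every entry built from a now-obsolete resource version;
        # this also catches a response served stale by an intermediary
        # cache, since its tokens lag what the client has already seen.
        stale = [u for u, (_, t) in self.entries.items()
                 if any(v < self.latest[r] for r, v in t.items())]
        for u in stale:
            del self.entries[u]

    def get(self, url):
        entry = self.entries.get(url)
        return entry[0] if entry else None
```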

Relevance: 20.00%

Abstract:

A secure sketch (defined by Dodis et al.) is an algorithm that on an input w produces an output s such that w can be reconstructed given its noisy version w' and s. Security is defined in terms of two parameters m and m̃: if w comes from a distribution of entropy m, then a secure sketch guarantees that the distribution of w conditioned on s has entropy m̃, where λ = m − m̃ is called the entropy loss. In this note we show that the entropy loss of any secure sketch (or, more generally, any randomized algorithm) on any distribution is no more than it is on the uniform distribution.
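Written out, the claim compares the entropy loss on an input distribution W with the loss on the uniform distribution U over the same support, say {0,1}^n. Using average min-entropy below follows the usual secure-sketch definitions of Dodis et al., but is our assumption insofar as the abstract leaves the entropy measure implicit.

```latex
% lambda(W): entropy loss of the sketch SS on input distribution W.
% The note's claim: the loss on any W is bounded by the loss on the
% uniform distribution U over {0,1}^n (for which H_infty(U) = n).
\[
  \underbrace{H_\infty(W) - \widetilde{H}_\infty\bigl(W \mid SS(W)\bigr)}_{\lambda(W)}
  \;\le\;
  \underbrace{n - \widetilde{H}_\infty\bigl(U \mid SS(U)\bigr)}_{\lambda(U)}
\]
```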

Relevance: 20.00%

Abstract:

We study properties of non-uniform reductions and related completeness notions. We strengthen several results of Hitchcock and Pavan and give a trade-off between the amount of advice needed for a reduction and its honesty on NEXP. We construct an oracle relative to which this trade-off is optimal. In a more systematic study of non-uniform reductions, we show, among other things, that non-uniformity can be removed at the cost of more queries. In line with Post's program for complexity theory, we connect such 'uniformization' properties to the separation of complexity classes.

Relevance: 20.00%

Abstract:

In research areas involving mathematical rigor, there are numerous benefits to adopting a formal representation of models and arguments: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, broad accessibility has not been a priority in the design of formal verification tools that can provide these benefits. We propose a few design criteria to address these issues: a simple, familiar, and conventional concrete syntax that is independent of any environment, application, or verification strategy, and the possibility of reducing workload and entry costs by employing features selectively. We demonstrate the feasibility of satisfying such criteria by presenting our own formal representation and verification system. Our system's concrete syntax overlaps with English, LaTeX, and MediaWiki markup wherever possible, and its verifier relies on heuristic search techniques that make the formal authoring process more manageable and consistent with prevailing practices. We employ techniques and algorithms that ensure a simple, uniform, and flexible definition and design for the system, so that it is easy to augment, extend, and improve.

Relevance: 20.00%

Abstract:

A difficulty in lung image registration is accounting for changes in the size of the lungs due to inspiration. We propose two methods for computing a uniform scale parameter for use in lung image registration that account for size change. A scaled rigid-body transformation allows analysis of corresponding lung CT scans taken at different times and can serve as a good low-order transformation to initialize non-rigid registration approaches. Two different features are used to compute the scale parameter. The first method uses lung surfaces. The second uses lung volumes. Both approaches are computationally inexpensive and improve the alignment of lung images over rigid registration. The two methods produce different scale parameters and may highlight different functional information about the lungs.
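For the volume-based variant, a natural estimator is the cube root of the lung volume ratio, which maps the moving lung onto the fixed one under an isotropic scaling. This estimator and the function names below are our assumptions; the paper's exact definition may differ.

```python
import numpy as np

def scale_from_volumes(volume_fixed_mm3, volume_moving_mm3):
    """Uniform scale factor from segmented lung volumes, assumed here
    to be the cube root of the volume ratio."""
    return (volume_fixed_mm3 / volume_moving_mm3) ** (1.0 / 3.0)

def scaled_rigid_transform(R, t, s):
    """4x4 homogeneous matrix for the scaled rigid-body map
    x -> s * R @ x + t, usable to initialize non-rigid registration."""
    T = np.eye(4)
    T[:3, :3] = s * np.asarray(R, dtype=float)
    T[:3, 3] = t
    return T
```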

Relevance: 20.00%

Abstract:

BACKGROUND: Disclosure of authors' financial interests has been proposed as a strategy for protecting the integrity of the biomedical literature. We examined whether authors' financial interests were disclosed consistently in articles on coronary stents published in 2006. METHODOLOGY/PRINCIPAL FINDINGS: We searched PubMed for English-language articles published in 2006 that provided evidence or guidance regarding the use of coronary artery stents. We recorded article characteristics, including information about authors' financial disclosures. The main outcome measures were the prevalence, nature, and consistency of financial disclosures. There were 746 articles, 2985 authors, and 135 journals in the database. Eighty-three percent of the articles did not contain disclosure statements for any author (including declarations of no interests). Only 6% of authors had an article with a disclosure statement. In comparisons between articles by the same author, the types of disagreement were as follows: no disclosure statements vs declarations of no interests (64%); specific disclosures vs no disclosure statements (34%); and specific disclosures vs declarations of no interests (2%). Among the 75 authors who disclosed at least 1 relationship with an organization, there were 2 cases (3%) in which the organization was disclosed in every article the author wrote. CONCLUSIONS/SIGNIFICANCE: In the rare instances when financial interests were disclosed, they were not disclosed consistently, suggesting that there are problems with transparency in an area of the literature that has important implications for patient care. Our findings suggest that the inconsistencies we observed are due to both the policies of journals and the behavior of some authors.

Relevance: 20.00%

Abstract:

On September 12, 2001, 54 Duke students recorded their memory of first hearing about the terrorist attacks of September 11 and of a recent everyday event. They were tested again either 1, 6, or 32 weeks later. Consistency for the flashbulb and everyday memories did not differ, in both cases declining over time. However, ratings of vividness, recollection, and belief in the accuracy of memory declined only for everyday memories. Initial visceral emotion ratings correlated with later belief in accuracy, but not consistency, for flashbulb memories. Initial visceral emotion ratings predicted later posttraumatic stress disorder symptoms. Flashbulb memories are not special in their accuracy, as previously claimed, but only in their perceived accuracy.

Relevance: 20.00%

Abstract:

This research project uses field measurements to investigate the cooling of a triple-junction, photovoltaic cell under natural convection when subjected to various amounts of insolation. The team built an experimental apparatus consisting of a mirror and Fresnel lens to concentrate light onto a triple-junction photovoltaic cell, mounted vertically on a copper heat sink. Measurements were taken year-round to provide a wide range of ambient conditions. A surface was then generated, in MATLAB, using Sparrow’s model for natural convection on a vertical plate under constant heat flux. This surface can be used to find the expected operating temperature of a cell at any location, given the ambient temperature and insolation. This research is an important contribution to the industry because it utilizes field data that represents how a cell would react under normal operation. It also extends the use of a well-known model from a one-sun environment to a multi-sun one.
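The kind of prediction the surface enables can be sketched as follows. The correlation form, the constant c, and the air properties below are assumptions standing in for Sparrow's model, whose details the abstract omits; only the overall workflow (insolation and ambient temperature in, cell temperature out) is taken from the text.

```python
# Approximate properties of air near room temperature.
G = 9.81           # gravitational acceleration, m/s^2
NU = 1.6e-5        # kinematic viscosity, m^2/s
K = 0.026          # thermal conductivity, W/(m K)
PR = 0.71          # Prandtl number
BETA = 1.0 / 300   # thermal expansion coefficient, 1/K

def plate_surface_temp(q_flux, x, t_ambient, c=0.6):
    """Surface temperature of a vertical plate at height x [m]
    dissipating q_flux [W/m^2] by laminar natural convection.

    Assumes a uniform-heat-flux correlation of the generic form
    Nu_x = c * (Gr*_x * Pr)**(1/5).
    """
    gr_star = G * BETA * q_flux * x ** 4 / (K * NU ** 2)  # modified Grashof
    nu_x = c * (gr_star * PR) ** 0.2                      # local Nusselt
    h = nu_x * K / x                                      # W/(m^2 K)
    return t_ambient + q_flux / h

# Example: ~500 W/m^2 rejected at the top of a 5 cm heat-sink surface
# in 25 C air (values illustrative only).
# print(plate_surface_temp(500.0, 0.05, 298.15))
```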

Relevance: 20.00%

Abstract:

The notion of time plays a vital and ubiquitous role as a common universal reference. In knowledge-based systems, temporal information is usually represented as a collection of statements together with the corresponding temporal reference. This paper introduces a visualized consistency checker for temporal references. It allows expression of both absolute and relative temporal knowledge, and provides visual representation of temporal references in terms of directed and partially weighted graphs. Based on the temporal reference of a given scenario, the visualized checker can deliver a verdict to the user as to whether the scenario is temporally consistent and provide the corresponding analysis/diagnosis.
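The consistency verdict on such a graph can be computed with standard shortest-path machinery. The sketch below treats the temporal reference as a Simple Temporal Network, where an edge (u, v, w) encodes t_v - t_u <= w, and uses Bellman-Ford negative-cycle detection; this is a standard stand-in, since the abstract does not describe the checker's internals.

```python
def temporally_consistent(n, edges):
    """Consistency check for a temporal constraint graph.

    Nodes 0..n-1 are time points; an edge (u, v, w) encodes the
    relative constraint t_v - t_u <= w (absolute times can be tied to
    a designated reference node). The scenario is consistent iff the
    graph has no negative cycle.
    """
    dist = [0.0] * n                     # virtual source to all nodes
    for _ in range(n):
        updated = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                updated = True
        if not updated:
            return True                  # converged: no negative cycle
    return False                         # still relaxing: inconsistent

# Example: edges = [(0, 1, -3), (1, 2, 1), (2, 0, 1)] form a cycle of
# total weight -1, so temporally_consistent(3, edges) returns False.
```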

Relevance: 20.00%

Abstract:

In this paper, we provide a unified approach to solving preemptive scheduling problems with uniform parallel machines and controllable processing times. We demonstrate that a single criterion problem of minimizing total compression cost subject to the constraint that all due dates should be met can be formulated in terms of maximizing a linear function over a generalized polymatroid. This justifies applicability of the greedy approach and allows us to develop fast algorithms for solving the problem with arbitrary release and due dates as well as its special case with zero release dates and a common due date. For the bicriteria counterpart of the latter problem we develop an efficient algorithm that constructs the trade-off curve for minimizing the compression cost and the makespan.
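To make the greedy approach concrete, here is a minimal sketch for the special case with zero release dates and a common due date D. The generalized polymatroid appears as the classical feasibility condition for preemptive scheduling on uniform machines (the k largest processing requirements must fit on the k fastest machines); the binary search and all names are our own simplification, not the paper's faster algorithm.

```python
def feasible(p, speeds, D):
    """Preemptive feasibility with common due date D on uniform
    machines: for each k, the k largest requirements must fit on the
    k fastest machines, and the total must fit on all machines."""
    p_sorted = sorted(p, reverse=True)
    S = 0.0
    for k, s in enumerate(sorted(speeds, reverse=True), start=1):
        S += s
        if sum(p_sorted[:k]) > D * S + 1e-9:
            return False
    return sum(p) <= D * sum(speeds) + 1e-9

def greedy_decompress(l, u, c, speeds, D, iters=60):
    """Start fully compressed (p = l) and return slack to jobs in
    decreasing order of per-unit compression cost c, i.e. greedily
    maximize sum(c_j * p_j) over the feasible region."""
    p = list(l)
    if not feasible(p, speeds, D):
        raise ValueError("even fully compressed jobs do not fit")
    for j in sorted(range(len(l)), key=lambda j: -c[j]):
        lo, hi = p[j], u[j]
        for _ in range(iters):           # binary search max feasible p[j]
            mid = (lo + hi) / 2
            p[j] = mid
            if feasible(p, speeds, D):
                lo = mid
            else:
                hi = mid
        p[j] = lo
    cost = sum(c[j] * (u[j] - p[j]) for j in range(len(l)))
    return p, cost

# Example: speeds (2, 1), D = 10, l = [4, 4, 4], u = [12, 12, 12],
# c = [3, 2, 1]: jobs 0 and 1 are fully decompressed and job 2
# absorbs the remaining compression (p = [12, 12, 6], cost = 6).
```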