949 results for Cadastral updating


Relevance: 20.00%

Abstract:

Knowledge about the world phylogeny of human mitochondrial DNA (mtDNA) is essential not only for evaluating the pathogenic role of specific mtDNA mutations but also for performing reliable association studies between mtDNA haplogroups and complex disorders.

Relevance: 20.00%

Abstract:

Some amount of differential settlement occurs even in the most uniform soil deposit, but it is extremely difficult to estimate because of the natural heterogeneity of the soil. The compression response of the soil and its variability must be characterised in order to estimate the probability of the differential settlement exceeding a certain threshold value. The work presented in this paper introduces a probabilistic framework to address this issue in a rigorous manner, while preserving the format of a typical geotechnical settlement analysis. In order to avoid dealing with different approaches for each category of soil, a simplified unified compression model is used to characterise the nonlinear compression behaviour of soils of varying gradation through a single constitutive law. The Bayesian updating rule is used to incorporate information from three different laboratory datasets in the computation of the statistics (estimates of the means and covariance matrix) of the compression model parameters, as well as of the uncertainty inherent in the model.
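
As a rough illustration of the kind of Bayesian updating described above, the following sketch applies a conjugate Normal-Normal update to a single compression-model parameter across three laboratory datasets; the prior values, noise level, and data are invented for illustration, not taken from the paper.

```python
import numpy as np

def normal_update(mu_prior, var_prior, data, var_noise):
    """Conjugate Normal-Normal update with known noise variance."""
    n = len(data)
    var_post = 1.0 / (1.0 / var_prior + n / var_noise)
    mu_post = var_post * (mu_prior / var_prior + np.sum(data) / var_noise)
    return mu_post, var_post

# hypothetical prior belief about a single compression-model parameter
mu, var = 0.30, 0.05 ** 2

# three hypothetical laboratory datasets, incorporated sequentially
datasets = [
    [0.28, 0.31, 0.29],
    [0.33, 0.30],
    [0.27, 0.29, 0.32, 0.30],
]
for data in datasets:
    mu, var = normal_update(mu, var, np.array(data), var_noise=0.04 ** 2)

print(f"posterior mean = {mu:.3f}, posterior sd = {np.sqrt(var):.4f}")
```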

Relevance: 20.00%

Abstract:

This paper develops a reduction-based model updating technique for jacket offshore platform structures. A reduced model is used instead of the direct finite-element model of the real structure in order to circumvent difficulties, such as the huge number of degrees of freedom and incomplete experimental data, that commonly trouble civil engineers during model updating. The whole process consists of three steps: reduction of the FE model, a first model updating to minimize the reduction error, and a second model updating to minimize the modeling error between the reduced model and the real structure. According to the performance of jacket platforms, a local-rigidity assumption is employed to obtain the reduced model. The technique is applied to a downscaled model of a four-legged offshore platform, where its effectiveness is demonstrated. Furthermore, a comparison between the real structure and its numerical models in the subsequent model validation shows that the updated models approximate the real structure well. Some remaining difficulties in the field of model updating are also discussed.
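
A minimal sketch of generic frequency-based model updating, not the paper's specific reduction scheme: the stiffness parameters of a small lumped-mass model are tuned until its natural frequencies match a set of "measured" frequencies. All masses, stiffnesses, and target frequencies here are invented for illustration.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

M = np.diag([1000.0, 800.0])                   # lumped masses (kg)

def frequencies(k):
    k1, k2 = k
    K = np.array([[k1 + k2, -k2], [-k2, k2]])  # 2-DOF shear-frame stiffness
    w2 = eigh(K, M, eigvals_only=True)         # generalized eigenvalues
    return np.sqrt(np.abs(w2)) / (2 * np.pi)   # natural frequencies (Hz)

f_measured = np.array([1.1, 2.9])              # hypothetical test data (Hz)

def modeling_error(k):                         # objective to minimize
    return np.sum((frequencies(k) - f_measured) ** 2)

res = minimize(modeling_error, x0=[4e5, 2e5], method="Nelder-Mead")
print("updated stiffnesses (N/m):", res.x)
print("updated frequencies (Hz):", frequencies(res.x))
```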

Relevance: 20.00%

Abstract:

Consider a network of processors (sites) in which each site x has a finite set N(x) of neighbors. There is a transition function f that for each site x computes the next state ξ(x) from the states in N(x). But these transitions (updates) are applied in arbitrary order, one or many at a time. If the state of site x at time t is η(x, t), then let us define the sequence ζ(x, 0), ζ(x, 1), ... by taking the sequence η(x, 0), η(x, 1), ... and deleting each repetition, i.e. each element equal to the preceding one. The function f is said to have invariant histories if the sequence ζ(x, i) (for as long as it lasts, in case it is finite) depends only on the initial configuration, not on the order of updates. This paper shows that though the invariant-history property is typically undecidable, there is a useful, simple sufficient condition, called commutativity: for any configuration and any pair x, y of neighbors, if the updating would change both ξ(x) and ξ(y), then the result of updating first x and then y is the same as the result of doing this in the reverse order. This fact is derivable from known results on the confluence of term-rewriting systems, but the self-contained proof given here may be of independent interest.
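
A small sketch of the commutativity condition (my own construction, not the paper's): an asynchronous "local clock" rule, where a site may tick only if it is not ahead of any neighbor, is commutative in the above sense, so each site's deduplicated history ζ(x) comes out the same for every update order.

```python
import random

N = 6                                          # sites on a ring

def neighbors(x):
    return [(x - 1) % N, (x + 1) % N]

def try_update(state, x):
    """f: tick site x only if it is not ahead of any neighbor."""
    if all(state[x] <= state[y] for y in neighbors(x)):
        state[x] += 1

def dedup_history(seed, steps=500):
    """Apply updates in a random order; record zeta(x) (changes only)."""
    rng = random.Random(seed)
    state = [0] * N
    zeta = [[0] for _ in range(N)]
    for _ in range(steps):
        x = rng.randrange(N)
        try_update(state, x)
        if state[x] != zeta[x][-1]:
            zeta[x].append(state[x])
    return zeta

z1, z2 = dedup_history(seed=1), dedup_history(seed=2)
for x in range(N):                             # common prefixes agree
    n = min(len(z1[x]), len(z2[x]))
    assert z1[x][:n] == z2[x][:n]
print("zeta(x) agrees across update orders for every site")
```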

Relevance: 20.00%

Abstract:

As we look around a scene, we perceive it as continuous and stable even though each saccadic eye movement changes the visual input to the retinas. How the brain achieves this perceptual stabilization is unknown, but a major hypothesis is that it relies on presaccadic remapping, a process in which neurons shift their visual sensitivity to a new location in the scene just before each saccade. This hypothesis is difficult to test in vivo because complete, selective inactivation of remapping is currently intractable. We tested it in silico with a hierarchical, sheet-based neural network model of the visual and oculomotor system. The model generated saccadic commands to move a video camera abruptly. Visual input from the camera and internal copies of the saccadic movement commands, or corollary discharge, converged at a map-level simulation of the frontal eye field (FEF), a primate brain area known to receive such inputs. FEF output was combined with eye position signals to yield a suitable coordinate frame for guiding arm movements of a robot. Our operational definition of perceptual stability was "useful stability," quantified as continuously accurate pointing to a visual object despite camera saccades. During training, the emergence of useful stability was correlated tightly with the emergence of presaccadic remapping in the FEF. Remapping depended on corollary discharge but its timing was synchronized to the updating of eye position. When coupled to predictive eye position signals, remapping served to stabilize the target representation for continuously accurate pointing. Graded inactivations of pathways in the model replicated, and helped to interpret, previous in vivo experiments. The results support the hypothesis that visual stability requires presaccadic remapping, provide explanations for the function and timing of remapping, and offer testable hypotheses for in vivo studies. We conclude that remapping allows for seamless coordinate frame transformations and quick actions despite visual afferent lags. With visual remapping in place for behavior, it may be exploited for perceptual continuity.

Relevance: 20.00%

Abstract:

The most appropriate way to measure the social benefits of conserving built cultural heritage sites is to ask the beneficiaries of conservation interventions how much they would be willing to pay for them. We use contingent valuation - a survey-based approach that elicits willingness to pay (WTP) directly from individuals - to estimate the benefits of a nationwide conservation of built cultural heritage sites in Armenia. The survey was administered to Armenian nationals living in Armenia, and obtained extensive information about the respondents' perceptions of the current state of conservation of monuments in Armenia, described the current situation, presented a hypothetical conservation program, elicited WTP for it, and queried individuals about what they thought would happen to monument sites in the absence of the government conservation program. We posit that respondents combined the information about the fate of monuments provided by the questionnaire with their prior beliefs, and that WTP for the good, or program, is likely to be affected by these updated beliefs. We propose a Bayesian updating model of prior beliefs, and empirically implement it with the data from our survey. We found that uncertainty about what would happen to monuments in the absence of the program results in lower WTP amounts.
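
A stylized numerical sketch (not the paper's econometric specification): the respondent's prior about the probability that a monument is lost without the program is a Beta distribution, updated with a hypothetical signal; the WTP expression below is invented purely to illustrate the reported finding that residual uncertainty depresses stated WTP.

```python
# prior Beta(a, b): a diffuse belief about the loss probability
a, b = 2.0, 2.0

# hypothetical signal from the questionnaire: 8 of 10 monuments lost
lost, intact = 8, 2
a_post, b_post = a + lost, b + intact

# posterior mean and variance of the Beta distribution
mean = a_post / (a_post + b_post)
var = (a_post * b_post) / ((a_post + b_post) ** 2 * (a_post + b_post + 1))

# purely hypothetical WTP rule: rises with expected loss, falls with
# residual uncertainty (posterior variance)
wtp = 50.0 * mean - 200.0 * var
print(f"posterior mean = {mean:.2f}, variance = {var:.4f}, WTP ~ {wtp:.2f}")
```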

Relevance: 20.00%

Abstract:

We studied the effect of intervening saccades on the manual interception of a moving target. Previous studies suggest that stationary reach goals are coded and updated across saccades in gaze-centered coordinates, but whether this generalizes to interception is unknown. Subjects (n = 9) reached to manually intercept a moving target after it was rendered invisible. Subjects either fixated throughout the trial or made a saccade before reaching (both fixation points were in the range of -10° to 10°). Consistent with previous findings and our control experiment with stationary targets, the interception errors depended on the direction of the remembered moving goal relative to the new eye position, as if the target is coded and updated across the saccade in gaze-centered coordinates. However, our results were more variable: for more than half of our subjects, the interception errors also depended on the goal direction relative to the initial gaze direction. This suggests that the feedforward transformations for interception differ from those for stationary targets. Our analyses show that the interception errors reflect a combination of biases in the (gaze-centered) representation of target motion and in the transformation of goal information into body-centered coordinates for action.

Relevance: 20.00%

Abstract:

Mineral exploration programmes around the world use data from remote sensing, geophysics and direct sampling. On a regional scale, the combination of airborne geophysics and ground-based geochemical sampling can aid geological mapping and economic minerals exploration. The fact that airborne geophysical and traditional soil-sampling data are generated at different spatial resolutions means that they are not immediately comparable due to their different sampling density. Several geostatistical techniques, including indicator cokriging and collocated cokriging, can be used to integrate different types of data into a geostatistical model. With increasing numbers of variables the inference of the cross-covariance model required for cokriging can be demanding in terms of effort and computational time. In this paper a Gaussian-based Bayesian updating approach is applied to integrate airborne radiometric data and ground-sampled geochemical soil data to maximise information generated from the soil survey, to enable more accurate geological interpretation for the exploration and development of natural resources. The Bayesian updating technique decomposes the collocated estimate into the product of two models: prior and likelihood models. The prior model is built from primary information and the likelihood model is built from secondary information. The prior model is then updated with the likelihood model to build the final model. The approach allows multiple secondary variables to be simultaneously integrated into the mapping of the primary variable. The Bayesian updating approach is demonstrated using a case study from Northern Ireland where the history of mineral prospecting for precious and base metals dates from the 18th century. Vein-hosted, strata-bound and volcanogenic occurrences of mineralisation are found. The geostatistical technique was used to improve the resolution of soil geochemistry, collected at one sample per 2 km², by integrating more densely sampled airborne geophysical data from the GSNI Tellus Survey, measured over a footprint of 65 × 200 m. The directly measured geochemistry data were considered as primary data in the Bayesian approach and the airborne radiometric data were used as secondary data. The approach produced more detailed updated maps and in particular maximised information on mapped estimates of zinc, copper and lead. Greater delineation of an elongated northwest/southeast trending zone in the updated maps strengthened the potential to investigate strata-bound base metal deposits.
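
A minimal sketch of the Gaussian Bayesian-updating step described above, assuming all variables have been transformed to standardized Gaussian units; the node values and variances are hypothetical. At each grid node the prior (from the sparse soil geochemistry) is multiplied by the likelihood (from the dense airborne radiometrics), giving a precision-weighted updated estimate.

```python
import numpy as np

# prior model at three grid nodes: kriged soil-geochemistry estimates
m_prior = np.array([0.4, -0.2, 1.3])   # means
v_prior = np.array([0.8, 0.9, 0.7])    # kriging variances

# likelihood model: collocated airborne radiometric proxy
m_lik = np.array([1.1, 0.1, 0.9])
v_lik = np.array([0.3, 0.3, 0.3])

# product of Gaussians: precision-weighted mean, pooled variance
v_post = 1.0 / (1.0 / v_prior + 1.0 / v_lik)
m_post = v_post * (m_prior / v_prior + m_lik / v_lik)

print("updated means:    ", m_post)
print("updated variances:", v_post)
```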

Relevance: 20.00%

Abstract:

This article explores The Connoisseur's combined engagement with its most important literary precursor and the society of its day. With its satire on the fashionable leisure culture of the mid-eighteenth century, Bonnell Thornton and George Colman's periodical, published from 1754 to 1756, followed self-consciously in the footsteps of Addison. Yet adopting the Addisonian model at mid-century was no straightforward task. Not only had the cultural landscape shifted during the forty years since The Spectator, but emulating this modern classic raised thorny issues regarding the originality and value of The Connoisseur itself. In appropriating the Addisonian essay, the challenge for Colman and Thornton was thus to update Addison: to adapt their model to changing times. This article examines how Colman and Thornton sought to validate their particular contribution to the polite periodical tradition, along with the difficulties they encountered in maintaining a Spectatorial detachment from the fashionable milieu that was their primary theme.

Relevance: 20.00%

Abstract:

Credal nets are probabilistic graphical models which extend Bayesian nets to cope with sets of distributions. An algorithm for approximate credal network updating is presented. The problem in its general formulation is a multilinear optimization task, which can be linearized by an appropriate rule for fixing all the local models apart from those of a single variable. This simple idea can be iterated and quickly leads to accurate inferences. A transformation is also derived to reduce decision making in credal networks based on the maximality criterion to updating. The decision task is proved to have the same complexity as standard inference, being NP^PP-complete for general credal nets and NP-complete for polytrees. Similar results are derived for the E-admissibility criterion. Numerical experiments confirm the good performance of the method.
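
A brute-force sketch of credal updating on a tiny two-node net X -> Y (hypothetical intervals, and plain enumeration rather than the paper's iterated linearization): posterior bounds on P(X=1 | Y=1) are obtained by checking every combination of extreme points of the local credal sets.

```python
from itertools import product

# credal sets as intervals (extreme points) for each local model
p_x1   = [0.2, 0.4]        # P(X=1)
p_y1x1 = [0.7, 0.9]        # P(Y=1 | X=1)
p_y1x0 = [0.1, 0.3]        # P(Y=1 | X=0)

posteriors = []
for px, py1, py0 in product(p_x1, p_y1x1, p_y1x0):
    num = px * py1                     # P(X=1, Y=1)
    den = num + (1.0 - px) * py0       # P(Y=1)
    posteriors.append(num / den)

print(f"P(X=1 | Y=1) in [{min(posteriors):.3f}, {max(posteriors):.3f}]")
```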

Relevance: 20.00%

Abstract:

Credal networks relax the precise probability requirement of Bayesian networks, enabling a richer representation of uncertainty in the form of closed convex sets of probability measures. The increase in expressiveness comes at the expense of higher computational costs. In this paper, we present a new variable elimination algorithm for exactly computing posterior inferences in extensively specified credal networks, which is empirically shown to outperform a state-of-the-art algorithm. The algorithm is then turned into a provably good approximation scheme, that is, a procedure that for any input is guaranteed to return a solution not worse than the optimum by a given factor. Remarkably, we show that when the networks have bounded treewidth and bounded number of states per variable the approximation algorithm runs in time polynomial in the input size and in the inverse of the error factor, thus being the first known fully polynomial-time approximation scheme for inference in credal networks.
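
For reference, a minimal sum-product variable elimination on a precise chain A -> B -> C with hypothetical numbers; a credal variable elimination like the paper's performs analogous eliminations while propagating bounds over the closed convex sets of local distributions.

```python
import numpy as np

p_a = np.array([0.6, 0.4])                   # P(A)
p_b_a = np.array([[0.9, 0.1], [0.2, 0.8]])   # P(B | A), rows index A
p_c_b = np.array([[0.7, 0.3], [0.4, 0.6]])   # P(C | B), rows index B

p_b = p_a @ p_b_a    # eliminate A: sum_a P(a) P(b|a)
p_c = p_b @ p_c_b    # eliminate B: sum_b P(b) P(c|b)
print("P(C) =", p_c)  # entries sum to 1
```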