53 results for Cartographic updating
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
This article explores statistical approaches for assessing the relative accuracy of medieval mapping. It focuses on one particular map, the Gough Map of Great Britain. This is an early and remarkable example of a medieval “national” map covering Plantagenet Britain. Conventionally dated to c. 1360, the map shows the position of places in and coastal outline of Great Britain to a considerable degree of spatial accuracy. In this article, aspects of the map's content are subjected to a systematic analysis to identify geographical variations in the map's veracity, or truthfulness. It thus contributes to debates among historical geographers and cartographic historians on the nature of medieval maps and mapping and, in particular, questions of their distortion of geographic space. Based on a newly developed digital version of the Gough Map, several regression-based approaches are used here to explore the degree and nature of spatial distortion in the Gough Map. This demonstrates that not only are there marked variations in the positional accuracy of places shown on the map between regions (i.e., England, Scotland, and Wales), but there are also fine-scale geographical variations in the spatial accuracy of the map within these regions. The article concludes by suggesting that the map was constructed using a range of sources, and that the Gough Map is a composite of multiscale representations of places in Great Britain. The article details a set of approaches that could be transferred to other contexts and add value to historic maps by enhancing understanding of their contents.
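The regression-based assessment of positional accuracy described above can be sketched as follows. This is a minimal illustration, not the article's method: hypothetical control points stand in for digitised Gough Map places and their modern reference locations, an affine transform is fitted by least squares, and the residual displacements serve as a simple per-place distortion measure.

```python
import numpy as np

# Hypothetical control points: (x, y) digitised from a historic map and the
# matching (x, y) from a modern reference. All coordinate values are invented.
map_pts = np.array([[1.0, 2.0], [3.0, 1.5], [4.0, 4.0], [2.0, 3.5], [5.0, 2.5]])
true_pts = np.array([[1.1, 2.2], [3.2, 1.4], [4.1, 4.3], [2.1, 3.6], [5.2, 2.4]])

def fit_affine(src, dst):
    """Least-squares affine transform mapping src points onto dst points."""
    n = len(src)
    A = np.hstack([src, np.ones((n, 1))])        # design matrix [x, y, 1]
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coef                                   # 3 x 2 coefficient matrix

coef = fit_affine(map_pts, true_pts)
pred = np.hstack([map_pts, np.ones((len(map_pts), 1))]) @ coef
residuals = np.linalg.norm(true_pts - pred, axis=1)   # per-place displacement
print(residuals.mean())
```

Grouping the residuals by region (e.g., England, Scotland, Wales) would then expose the kind of regional and fine-scale variation in accuracy the article reports.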
Abstract:
The most appropriate way to measure the social benefits of conserving built cultural heritage sites is to ask the beneficiaries of conservation interventions how much they would be willing to pay for them. We use contingent valuation - a survey-based approach that elicits willingness to pay (WTP) directly from individuals - to estimate the benefits of a nationwide conservation of built cultural heritage sites in Armenia. The survey was administered to Armenian nationals living in Armenia, and obtained extensive information about the respondents' perceptions of the current state of conservation of monuments in Armenia, described the current situation, presented a hypothetical conservation program, elicited WTP for it, and queried individuals about what they thought would happen to monument sites in the absence of the government conservation program. We posit that respondents combined the information about the fate of monuments provided by the questionnaire with their prior beliefs, and that WTP for the good, or program, is likely to be affected by these updated beliefs. We propose a Bayesian updating model of prior beliefs, and empirically implement it with the data from our survey. We found that uncertainty about what would happen to monuments in the absence of the program results in lower WTP amounts. © 2008 Pion Ltd and its Licensors.
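The belief-updating mechanism posited above can be illustrated with a conjugate Beta-binomial update. This is a hedged sketch, not the paper's model: the prior parameters and the "signals" standing in for questionnaire information are entirely hypothetical.

```python
# Illustrative Beta-binomial updating of a respondent's belief about the
# probability that monuments deteriorate absent the program. Each signal is
# 1 (deterioration suggested) or 0 (no change suggested); all values invented.
def update_beta(alpha, beta, signals):
    """Conjugate Bernoulli update of a Beta(alpha, beta) prior."""
    for s in signals:
        alpha += s
        beta += 1 - s
    return alpha, beta

alpha0, beta0 = 2.0, 2.0                                   # diffuse prior, mean 0.5
alpha1, beta1 = update_beta(alpha0, beta0, [1, 1, 0, 1])   # questionnaire info
posterior_mean = alpha1 / (alpha1 + beta1)
posterior_var = (alpha1 * beta1) / ((alpha1 + beta1) ** 2 * (alpha1 + beta1 + 1))
print(posterior_mean, posterior_var)
```

In the paper's framing, the respondent's WTP would then depend on this updated belief; a posterior that remains diffuse (high variance) corresponds to the residual uncertainty found to lower stated WTP.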
Abstract:
We studied the effect of intervening saccades on the manual interception of a moving target. Previous studies suggest that stationary reach goals are coded and updated across saccades in gaze-centered coordinates, but whether this generalizes to interception is unknown. Subjects (n = 9) reached to manually intercept a moving target after it was rendered invisible. Subjects either fixated throughout the trial or made a saccade before reaching (both fixation points were in the range of -10° to 10°). Consistent with previous findings and our control experiment with stationary targets, the interception errors depended on the direction of the remembered moving goal relative to the new eye position, as if the target is coded and updated across the saccade in gaze-centered coordinates. However, our results were also more variable in that the interception errors for more than half of our subjects also depended on the goal direction relative to the initial gaze direction. This suggests that the feedforward transformations for interception differ from those for stationary targets. Our analyses show that the interception errors reflect a combination of biases in the (gaze-centered) representation of target motion and in the transformation of goal information into body-centered coordinates for action.
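The gaze-centered updating hypothesis tested above has a simple computational form, sketched here in one horizontal dimension with hypothetical angles (the fixation values mirror the -10° to 10° range mentioned, but are otherwise illustrative): the remembered target is stored relative to gaze and remapped by subtracting the saccade vector.

```python
# Minimal sketch of gaze-centered remapping across a saccade (1-D, degrees).
target_world = 5.0      # remembered target direction in head/world coordinates
gaze_initial = -10.0    # fixation before the saccade
gaze_final = 10.0       # fixation after the saccade

target_gaze_before = target_world - gaze_initial   # gaze-centered code
saccade = gaze_final - gaze_initial                # saccade vector
target_gaze_after = target_gaze_before - saccade   # remapped gaze-centered code

# Consistency check: remapping equals recoding relative to the new gaze.
assert target_gaze_after == target_world - gaze_final
print(target_gaze_after)
```

Errors that depend on the target's direction relative to the *new* eye position are the signature of this scheme; the residual dependence on initial gaze direction reported above is what such a pure remapping model does not capture.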
Abstract:
This paper uses the analytical potential of Geographical Information Systems (GIS) to explore processes of map production and circulation in early-seventeenth century Ireland. The paper focuses on a group of historic maps, attributed to Josias Bodley, which were commissioned in 1609 by the English Crown to assist in the Plantation of Ulster. Through GIS and digitizing map-features, and in particular by quantifying map-distortion, it is possible to examine how these maps were made, and by whom. Statistical analyses of spatial data derived from the GIS are shown to provide a methodological basis for ‘excavating’ historical geographies of Plantation map-making. These techniques, when combined with contemporary written sources, reveal further insight into the ‘cartographic encounters’ taking place between surveyors and map-makers working in Ireland in the early 1600s, opening up the ‘mapping worlds’ which linked Ireland and Britain through the networks and embodied practices of Bodley and his map-makers.
Abstract:
Mineral exploration programmes around the world use data from remote sensing, geophysics and direct sampling. On a regional scale, the combination of airborne geophysics and ground-based geochemical sampling can aid geological mapping and economic minerals exploration. The fact that airborne geophysical and traditional soil-sampling data are generated at different spatial resolutions means that they are not immediately comparable due to their different sampling density. Several geostatistical techniques, including indicator cokriging and collocated cokriging, can be used to integrate different types of data into a geostatistical model. With increasing numbers of variables the inference of the cross-covariance model required for cokriging can be demanding in terms of effort and computational time. In this paper a Gaussian-based Bayesian updating approach is applied to integrate airborne radiometric data and ground-sampled geochemical soil data to maximise information generated from the soil survey, to enable more accurate geological interpretation for the exploration and development of natural resources. The Bayesian updating technique decomposes the collocated estimate into a product of two models: prior and likelihood models. The prior model is built from primary information and the likelihood model is built from secondary information. The prior model is then updated with the likelihood model to build the final model. The approach allows multiple secondary variables to be simultaneously integrated into the mapping of the primary variable. The Bayesian updating approach is demonstrated using a case study from Northern Ireland where the history of mineral prospecting for precious and base metals dates from the 18th century. Vein-hosted, strata-bound and volcanogenic occurrences of mineralisation are found.
The geostatistical technique was used to improve the resolution of soil geochemistry, collected at one sample per 2 km², by integrating more closely measured airborne geophysical data from the GSNI Tellus Survey, measured over a footprint of 65 × 200 m. The directly measured geochemistry data were considered as primary data in the Bayesian approach and the airborne radiometric data were used as secondary data. The approach produced more detailed updated maps and in particular maximised information on mapped estimates of zinc, copper and lead. Greater delineation of an elongated northwest/southeast trending zone in the updated maps strengthened the potential to investigate strata-bound base metal deposits.
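For Gaussian models, the prior-times-likelihood decomposition described above reduces to a precision-weighted combination of the two estimates. The sketch below is illustrative only: the means and variances are hypothetical stand-ins for a sparse soil-geochemistry estimate (prior) and a dense radiometric-derived estimate (likelihood) at one location.

```python
def gaussian_update(prior_mean, prior_var, lik_mean, lik_var):
    """Multiply two Gaussian densities: precision-weighted mean, pooled variance."""
    w_prior = 1.0 / prior_var
    w_lik = 1.0 / lik_var
    post_var = 1.0 / (w_prior + w_lik)
    post_mean = post_var * (w_prior * prior_mean + w_lik * lik_mean)
    return post_mean, post_var

# Hypothetical standardised values at a single grid node.
m, v = gaussian_update(prior_mean=0.2, prior_var=0.5, lik_mean=0.8, lik_var=0.25)
print(m, v)
```

Note how the posterior variance is smaller than either input variance: integrating the densely sampled secondary variable tightens the map estimate, which is the effect exploited in the Tellus case study.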
Abstract:
This article explores The Connoisseur's combined engagement with its most important literary precursor and the society of its day. With its satire on the fashionable leisure culture of the mid-eighteenth century, Bonnell Thornton and George Colman's periodical, published from 1754 to 1756, followed self-consciously in the footsteps of Addison. Yet adopting the Addisonian model at mid-century was no straightforward task. Not only had the cultural landscape shifted during the forty years since The Spectator, but emulating this modern classic raised thorny issues regarding the originality and value of The Connoisseur itself. In appropriating the Addisonian essay, the challenge for Colman and Thornton was thus to update Addison: to adapt their model to changing times. This article examines how Colman and Thornton sought to validate their particular contribution to the polite periodical tradition, along with the difficulties they encountered in maintaining a Spectatorial detachment from the fashionable milieu that was their primary theme.
Abstract:
Credal nets are probabilistic graphical models which extend Bayesian nets to cope with sets of distributions. An algorithm for approximate credal network updating is presented. The problem in its general formulation is a multilinear optimization task, which can be linearized by an appropriate rule for fixing all the local models apart from those of a single variable. This simple idea can be iterated and quickly leads to accurate inferences. A transformation is also derived to reduce decision making in credal networks based on the maximality criterion to updating. The decision task is proved to have the same complexity as standard inference, being NP^PP-complete for general credal nets and NP-complete for polytrees. Similar results are derived for the E-admissibility criterion. Numerical experiments confirm a good performance of the method.
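The basic object being computed in credal updating is a probability *interval* rather than a point value. The toy example below is not the paper's algorithm, only an illustration of inference over a set of distributions: a binary root A whose marginal is only known to lie in an interval, precise conditionals for B, and bounds on P(B=1) obtained by enumerating the extreme points of the credal set (all numbers hypothetical).

```python
# Tiny credal inference example: P(A=1) in [0.2, 0.4], precise P(B=1 | A).
p_a_extremes = [0.2, 0.4]
p_b_given_a = {1: 0.9, 0: 0.3}   # P(B=1 | A=a)

def prob_b(p_a):
    """Marginal P(B=1) for a particular vertex P(A=1) = p_a."""
    return p_a * p_b_given_a[1] + (1 - p_a) * p_b_given_a[0]

values = [prob_b(p) for p in p_a_extremes]
lower, upper = min(values), max(values)
print(lower, upper)
```

In real credal nets the number of extreme points grows exponentially, which is why the linearization-and-iteration scheme summarized above is needed instead of brute-force enumeration.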
Abstract:
Credal networks relax the precise probability requirement of Bayesian networks, enabling a richer representation of uncertainty in the form of closed convex sets of probability measures. The increase in expressiveness comes at the expense of higher computational costs. In this paper, we present a new variable elimination algorithm for exactly computing posterior inferences in extensively specified credal networks, which is empirically shown to outperform a state-of-the-art algorithm. The algorithm is then turned into a provably good approximation scheme, that is, a procedure that for any input is guaranteed to return a solution not worse than the optimum by a given factor. Remarkably, we show that when the networks have bounded treewidth and bounded number of states per variable the approximation algorithm runs in time polynomial in the input size and in the inverse of the error factor, thus being the first known fully polynomial-time approximation scheme for inference in credal networks.
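Variable elimination, the algorithmic core above, is easiest to see in the precise (Bayesian-network) case; the credal algorithm runs analogous eliminations over sets of distributions. The sketch below eliminates variables along a chain A → B → C with wholly hypothetical probabilities.

```python
# Variable elimination on a precise chain A -> B -> C, computing P(C).
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key: (b, a)
p_c_given_b = {(0, 0): 0.5, (1, 0): 0.5, (0, 1): 0.1, (1, 1): 0.9}  # key: (c, b)

# Eliminate A: message m(b) = sum_a P(a) * P(b | a)
m = {b: sum(p_a[a] * p_b_given_a[(b, a)] for a in (0, 1)) for b in (0, 1)}
# Eliminate B: P(c) = sum_b m(b) * P(c | b)
p_c = {c: sum(m[b] * p_c_given_b[(c, b)] for b in (0, 1)) for c in (0, 1)}
print(p_c)
```

The cost of each elimination is controlled by the sizes of the intermediate factors, which is why the treewidth and state-count bounds mentioned above govern the running time of the exact and approximate credal schemes.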
Abstract:
This paper explores semi-qualitative probabilistic networks (SQPNs) that combine numeric and qualitative information. We first show that exact inferences with SQPNs are NP^PP-complete. We then show that existing qualitative relations in SQPNs (plus probabilistic logic and imprecise assessments) can be dealt with effectively through multilinear programming. We then discuss learning: we consider a maximum likelihood method that generates point estimates given an SQPN and empirical data, and we describe a Bayesian-minded method that employs the Imprecise Dirichlet Model to generate set-valued estimates.
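The set-valued estimates produced by the Imprecise Dirichlet Model have a simple closed form in the binary case: with n successes in N trials and hyperparameter s, the probability is bounded by [n/(N+s), (n+s)/(N+s)]. The sketch below uses hypothetical counts and the commonly used choice s = 2; it illustrates the IDM interval, not the paper's full learning procedure.

```python
# Imprecise Dirichlet Model interval estimate for a binary parameter.
def idm_interval(n_k, n_total, s=2.0):
    """IDM bounds on P(outcome k) given n_k of n_total observations."""
    lower = n_k / (n_total + s)
    upper = (n_k + s) / (n_total + s)
    return lower, upper

lo, hi = idm_interval(n_k=7, n_total=10)   # 7 successes in 10 observations
print(lo, hi)
```

The interval width is s/(N+s), so the estimate sharpens toward a point value as data accumulate while remaining vacuous ([0, 1]) when no data are available.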