8 results for Cartographic updating
in Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco
Abstract:
This paper provides experimental evidence on how players predict end-game effects in a linear public good game. Our regression analysis yields a measure of the relative importance of priors and signals in subjects' beliefs about contributions and allows us to conclude that, first, the weight of the signal is relatively unimportant while priors carry a large weight and, second, that priors are the same for all periods. Hence, subjects do not expect end-game effects and there is very little updating of beliefs.
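As a rough illustration of the kind of regression described in this abstract (not the paper's actual specification), the following Python sketch regresses stated beliefs on a prior and a signal and compares the two coefficients; the variable names and the synthetic data are assumptions.

```python
import numpy as np

def weight_of_prior_and_signal(prior, signal, belief):
    """OLS of stated beliefs on a prior and a signal; the relative size of
    the two coefficients is read as the relative weight of each component.
    (Illustrative only; the paper's regression specification is not given here.)"""
    X = np.column_stack([np.ones_like(prior), prior, signal])
    coef, *_ = np.linalg.lstsq(X, belief, rcond=None)
    intercept, w_prior, w_signal = coef
    return w_prior, w_signal

# Synthetic example in which priors dominate, mimicking the paper's finding.
rng = np.random.default_rng(0)
prior = rng.uniform(0, 20, size=200)     # hypothetical first-period belief
signal = rng.uniform(0, 20, size=200)    # hypothetical observed contributions
belief = 0.8 * prior + 0.1 * signal + rng.normal(0, 1, size=200)
print(weight_of_prior_and_signal(prior, signal, belief))
```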
Abstract:
[EN] The data contained in this record come from the following academic activity (from which it is possible to locate additional records related to the Monastery):
Abstract:
In this paper we introduce four scenario cluster-based Lagrangian Decomposition (CLD) procedures for obtaining strong lower bounds on the (optimal) solution value of two-stage stochastic mixed 0-1 problems. At each iteration of the Lagrangian-based procedures, the traditional aim is to obtain the solution value of the corresponding Lagrangian dual by solving scenario submodels once the nonanticipativity constraints have been dualized. Instead of considering a splitting-variable representation over the set of scenarios, we propose to decompose the model into a set of scenario clusters. We compare the computational performance of four Lagrange multiplier updating procedures, namely the Subgradient Method, the Volume Algorithm, the Progressive Hedging Algorithm and the Dynamic Constrained Cutting Plane scheme, for different numbers of scenario clusters and different dimensions of the original problem. Our computational experience shows that the CLD bound and its computational effort depend on the number of scenario clusters considered. In any case, our results show that the CLD procedures outperform the traditional LD scheme for single scenarios, both in the quality of the bounds and in computational effort. All the procedures have been implemented in an experimental C++ code. A broad computational experience is reported on a testbed of randomly generated instances, using the MIP solvers COIN-OR and CPLEX for the auxiliary mixed 0-1 cluster submodels, the latter run within the open source engine COIN-OR. We also give computational evidence of the model-tightening effect that preprocessing techniques, cut generation and appending, and parallel computing tools have in stochastic integer optimization. Finally, we have observed that the plain use of both solvers does not provide the optimal solution of the instances in the testbed within an affordable elapsed time, except for two toy instances. By contrast, the proposed procedures provide strong lower bounds (or the same solution value) in a considerably shorter elapsed time than that needed to obtain the quasi-optimal solution for the original stochastic problem by other means.
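To illustrate one of the multiplier updating procedures named above, here is a minimal Python sketch of a subgradient update of the Lagrange multipliers attached to the dualized nonanticipativity constraints linking scenario clusters. The helper solve_cluster_submodel, the toy 0-1 submodel, and the data layout are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def cld_subgradient(clusters, solve_cluster_submodel, n_first_stage,
                    iters=100, step0=1.0):
    """Sketch of a subgradient update for cluster-based Lagrangian decomposition.
    solve_cluster_submodel(cluster, mu) is assumed to return the optimal value
    and first-stage solution of that cluster's submodel for multipliers mu."""
    n_c = len(clusters)
    mu = np.zeros((n_c, n_first_stage))    # one multiplier vector per cluster
    best_bound = -np.inf
    for k in range(iters):
        values, xs = [], []
        for c, cluster in enumerate(clusters):
            v, x = solve_cluster_submodel(cluster, mu[c])
            values.append(v)
            xs.append(np.asarray(x, dtype=float))
        best_bound = max(best_bound, sum(values))   # Lagrangian dual value: a lower bound
        xs = np.array(xs)
        # Nonanticipativity violation: deviation of each cluster's copy of the
        # first-stage variables from their average across clusters.
        subgrad = xs - xs.mean(axis=0)
        if np.allclose(subgrad, 0.0):
            break                                   # clusters agree: no violation left
        mu += (step0 / (k + 1)) * subgrad           # diminishing step size
        mu -= mu.mean(axis=0)                       # keep multipliers summing to zero
    return best_bound, mu

if __name__ == "__main__":
    # Toy 0-1 cluster submodel: min over x in {0,1}^n of (cost + mu)·x.
    rng = np.random.default_rng(0)
    costs = [rng.normal(size=4) for _ in range(3)]  # one cost vector per "cluster"

    def toy_submodel(cost, mu):
        x = (cost + mu < 0).astype(float)           # pick x_i = 1 where cost+mu < 0
        return float((cost + mu) @ x), x

    print(cld_subgradient(costs, toy_submodel, n_first_stage=4))
```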
Abstract:
[EN] This academic activity has been the origin of other works that are also located in this repository. The first one is the dataset of information about the geometry of the Monastery recorded during the two years of fieldwork; then some bachelor theses and papers are listed:
Abstract:
This paper presents a model-based approach for reconstructing 3D polyhedral building models from aerial images. The proposed approach exploits geometric and photometric properties resulting from the perspective projection of planar structures. Data are provided by calibrated aerial images. The novelty of the approach lies in being featureless and in its use of direct optimization based on raw image brightness. The proposed framework avoids feature extraction and matching. The 3D polyhedral model is directly estimated by optimizing an objective function that combines an image-based dissimilarity measure and a gradient score over several aerial images. The optimization is carried out by the Differential Evolution algorithm. The proposed approach is intended to provide more accurate 3D reconstruction than feature-based approaches. Fast 3D model rectification and updating can take advantage of the proposed method. Several results and performance evaluations on real and synthetic images show the feasibility and robustness of the proposed approach.
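A minimal sketch of the optimization strategy described above, using SciPy's differential_evolution on a toy raw-brightness dissimilarity. The render function, the plane parameterisation, and the synthetic "aerial images" are stand-ins, not the paper's projection model or objective (the gradient-score term is omitted).

```python
import numpy as np
from scipy.optimize import differential_evolution

def render(params, shape=(32, 32)):
    """Toy stand-in for projecting a polyhedral model into an image:
    a plane z = a*x + b*y + c rasterised over the image grid."""
    a, b, c = params
    y, x = np.mgrid[0:shape[0], 0:shape[1]]
    return a * x + b * y + c

def objective(params, observed_images):
    """Featureless cost in the spirit of the paper: a raw-brightness
    dissimilarity between the rendered model and each image."""
    return sum(np.mean((render(params) - im) ** 2) for im in observed_images)

# Synthetic "aerial images" generated from known plane parameters.
true_params = (0.3, -0.2, 5.0)
rng = np.random.default_rng(1)
images = [render(true_params) + rng.normal(0, 0.1, (32, 32)) for _ in range(3)]

# Bounds on the model parameters; purely illustrative values.
bounds = [(-1.0, 1.0), (-1.0, 1.0), (0.0, 10.0)]
result = differential_evolution(objective, bounds, args=(images,), seed=1, tol=1e-6)
print(result.x)   # should recover values close to true_params
```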
Abstract:
[EN] This paper is based on the following project:
Abstract:
[ES] Virtual models of historic buildings and other heritage elements have a wide variety of applications, related above all to dissemination and multimedia. Although in many cases these are essentially computer products in which the geometric, and therefore topographic, component is very limited, there are applications, such as historical studies, in which the models must faithfully represent geometry and textures. In such cases, the models can be regarded as cartographic products. What follows is a recapitulation of the conclusions we have reached over the last two years, after having carried out some twenty projects in this field.
Abstract:
[ES] The main objective of this work is the design and development of a system for validating the methodologies proposed in the draft recommendation P.STMWeb [1], "Methodology for the subjective assessment of perceived quality in web browsing". As a result of the validation performed with the system, the ITU-T recommendations G.1031 [5], "QoE factors in web browsing", and P.1501 [6], "Subjective testing methodology for web browsing", have been approved. The proposed system was designed according to the specifications set out in the draft recommendation under standardisation [P.STMWeb]. These specifications served as the basis for the design and implementation of a system that allows interactive web browsing, with control and updating of web pages according to variations of network parameters, such as delay, in a controlled environment. Likewise, following the methodology established in the aforementioned draft recommendation, a survey system was designed and developed to subjectively assess the quality experienced by users (QoE) during web browsing. For the design of the survey system, the various factors considered in the ITU-T draft recommendation G.QoE-Web [2], "Relevant factors and use cases for web QoE", were analysed and used as specifications. Based on this design, a system was developed that makes it possible to analyse the quality experienced by users while browsing the web.
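As an illustration of how the subjective ratings gathered by such a survey system are commonly summarised (not the actual system described above), the following Python sketch computes a Mean Opinion Score per injected delay condition from hypothetical ratings.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey results: each entry is (network delay in ms applied
# to the page load, subjective score on a 1-5 scale).
ratings = [
    (0, 5), (0, 4), (100, 4), (100, 4),
    (500, 3), (500, 2), (1500, 2), (1500, 1),
]

# Group scores by delay condition and report the Mean Opinion Score (MOS),
# the usual summary statistic for subjective web-browsing QoE tests.
by_delay = defaultdict(list)
for delay_ms, score in ratings:
    by_delay[delay_ms].append(score)

for delay_ms in sorted(by_delay):
    print(f"{delay_ms:>5} ms -> MOS {mean(by_delay[delay_ms]):.2f}")
```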