987 results for Coastal Monitoring. Geodesy. DEM. LiDAR


Relevance:

30.00%

Publisher:

Abstract:

Debris flows are one of the most important vectors of sediment transfer in mountainous areas. Their hydro-geomorphological behaviour is conditioned by geological, geomorphological, topographical, hydrological, climatic and anthropic factors. European research on torrential systems has focused more on hydrological processes than on the geomorphological processes acting as debris-flow triggers. Nevertheless, identifying the sediment volumes that can potentially be mobilised in small torrential systems, and recognising the processes responsible for their mobilisation and transfer within the torrential system, are important for land-use planning and natural hazard management. Moreover, a correlation between rainfall and debris-flow occurrence is not always established, and a number of debris flows seem to occur when an intrinsic geomorphological threshold of the channel (its degree of sediment infilling) is reached.

A pragmatic methodology has been developed for mapping the sediment storages that may constitute source zones for bedload transport and debris flows, as a preliminary tool before quantifying the volumes transported by these phenomena. It is based on data derived directly from GIS analysis of high-resolution DEMs, field measurements and aerial photograph interpretation. It was conceived to estimate sediment transfer dynamics, taking into account the role of the different sediment stores in the torrential system by applying the concept of the "sediment cascade" from a cartographic point of view.

Sediment transfer processes were investigated in two small catchments in the Swiss Alps (Bruchi torrent, Blatten bei Naters, and Meretschibach torrent, Agarn). Detailed field geomorphological mapping was coupled with complementary measurements to estimate sediment fluxes and denudation rates, using several methods (reference painted lines, wooden markers and terrestrial LiDAR). The proposed mapping methodology is innovative in comparison with most existing legend systems, which are not well suited to mapping active and complex geomorphological systems such as debris-flow catchments. The interest of the method is that it allows the sediment-cascade concept to be implemented spatially, but only for supply-limited systems, i.e. those in which debris-flow occurrence is controlled by the degree of sediment infilling of the channel. The resulting map cannot be used directly to produce hazard maps, which focus on deposition areas, but it is useful for designing correction measures and for siting monitoring and warning systems.

The second part of this work focuses on geomorphological mapping. A sample of 146 maps (or map extracts) and legend systems, dating from the 1950s to 2009 and produced in more than 40 countries, was analysed. Although not exhaustive, this analysis shows a clear renewed worldwide interest in the discipline and highlights the diversity of applications and of the techniques (scale, colours and symbology) used in their design.
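As a rough, hypothetical illustration of the kind of GIS analysis on high-resolution DEMs mentioned above (not the method of the thesis itself), the following Python sketch derives slope from a small gridded DEM and flags steep cells as potential sediment source areas; the DEM values, cell size and 30-degree threshold are invented for the example.

import numpy as np

def slope_degrees(dem, cell_size):
    # Slope (degrees) of a gridded DEM using central differences.
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Illustrative 1 m resolution DEM tile (metres); real data would come from LiDAR.
dem = np.array([[102.0, 103.5, 105.2],
                [101.0, 102.8, 104.9],
                [100.2, 101.9, 104.1]])

slope = slope_degrees(dem, cell_size=1.0)
potential_source = slope > 30.0   # hypothetical steepness threshold for sediment sources
print(slope.round(1))
print(potential_source)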

Relevance:

30.00%

Publisher:

Abstract:

Monitoring plays an important role in evaluating therapy and guiding treatment decisions, as long as it is based on the measurement of appropriate clinical or validated surrogate markers. With regard to imatinib therapy, therapeutic drug monitoring (TDM), which uses the plasma concentration of the drug as a marker for treatment surveillance, appears to be a useful approach to monitoring CML treatment. Imatinib plasma concentrations vary considerably between patients under the same dosing regimen, owing to interindividual differences in the drug's pharmacokinetics. Plasma exposure has been shown to correlate with the clinical outcome of CML patients, both in terms of treatment response and of the side-effect profile. It is still unclear whether TDM of imatinib should be used only when clinical problems arise, or whether CML patients could already benefit from systematic, preventive "routine" monitoring for therapy individualisation - steering the plasma concentration into a therapeutic range - as has increasingly been recommended recently. To answer this question, a prospective, randomised controlled Swiss study is enrolling CML patients who have been treated with imatinib for less than 5 years, and additionally offers TDM to all patients in case of clinical problems.

Relevance:

30.00%

Publisher:

Abstract:

The Manival near Grenoble (French Prealps) is a very active debris-flow torrent equipped with a large sediment trap (25,000 m3) protecting an urbanized alluvial fan from debris flows. We began monitoring the sediment budget of the catchment controlled by the trap in spring 2009. A terrestrial laser scanner is used to monitor topographic changes in a small gully, in the main channel, and in the sediment trap. In the main channel, 39 cross-sections are surveyed after every event. Three periods of intense geomorphic activity are documented here. The first was induced by a convective storm in August 2009, which triggered a debris flow that deposited ~1,800 m3 of sediment in the trap. The debris flow originated in the upper reach of the main channel, and our observations showed that the sediment outputs were entirely supplied by channel scouring. Hillslope debris flows were initiated on talus slopes, as revealed by terrestrial LiDAR resurveys; however, they were disconnected from the main channel. The second and third periods of geomorphic activity were induced by long-duration, low-intensity rainfall events in September and October 2009, which generated small flow events with intense bedload transport. These events contributed to recharging the debris-flow channel with sediment by depositing large gravel dunes that propagated from the headwaters. The total recharge of the torrent by bedload transport events was estimated at 34% of the sediment eroded by the August debris flow.
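The erosion and deposition volumes reported above rest on differencing successive terrestrial LiDAR surveys. A minimal sketch of that principle, assuming two already co-registered DEMs on the same grid, is given below; the toy arrays, 0.5 m cell size and 0.05 m detection threshold are assumptions for illustration, not survey parameters from the Manival.

import numpy as np

def dod_volumes(dem_before, dem_after, cell_size, min_change=0.05):
    # Erosion and deposition volumes (m^3) from a DEM of difference.
    diff = dem_after - dem_before                 # elevation change per cell (m)
    diff[np.abs(diff) < min_change] = 0.0         # ignore change below the detection limit
    cell_area = cell_size ** 2
    deposition = diff[diff > 0].sum() * cell_area
    erosion = -diff[diff < 0].sum() * cell_area
    return erosion, deposition

# Toy grids standing in for pre- and post-event surveys.
before = np.array([[10.0, 10.2], [10.1, 10.3]])
after  = np.array([[ 9.8, 10.2], [10.1, 10.6]])
erosion, deposition = dod_volumes(before, after, cell_size=0.5)
print(f"erosion: {erosion:.3f} m^3, deposition: {deposition:.3f} m^3")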

Relevance:

30.00%

Publisher:

Abstract:

Major oil spills can have long-term impacts, since oil pollution not only causes acute mortality of marine organisms but also affects productivity levels and predator-prey dynamics and damages the habitats that support marine communities. However, despite the conservation implications of oil accidents, monitoring and assessing their lasting impacts remains a difficult and daunting task. Here, we used European shags to evaluate the overall, lasting effects of the Prestige oil spill (2002) on the affected marine ecosystem. Using δ15N and Hg analysis, we trace temporal changes in feeding ecology potentially related to alterations of the food web caused by the spill. Using climatic and oceanic data, we also investigate the influence of the North Atlantic Oscillation (NAO) index, sea surface temperature (SST) and chlorophyll a (Chl a) on the observed changes. Analysis of δ15N and Hg concentrations revealed that after the Prestige oil spill, shag chicks abruptly switched trophic level, from a diet based on a high percentage of demersal-benthic fish to a higher proportion of pelagic/semi-pelagic species. There was no evidence that Chl a, SST or NAO reflected any particular change or severity in environmental conditions in any year or season that might explain the sudden shift in trophic level. Thus, this study highlights an impact on the marine food web lasting at least three years. Our results provide the best evidence to date of the long-term consequences of the Prestige oil spill. They also show how, regardless of wider oceanographic variability, lasting impacts on predator-prey dynamics can be assessed using biochemical markers. This is particularly useful when larger-scale and longer-term monitoring of all trophic levels is unfeasible because of limited funding or high ecosystem complexity.

Relevance:

30.00%

Publisher:

Abstract:

In order to reduce greenhouse gas emissions from forest degradation and deforestation, the international programme REDD (Reducing Emissions from Deforestation and forest Degradation) was established in 2005 by the United Nations Framework Convention on Climate Change (UNFCCC). This programme aims to financially reward developing countries for emissions reductions. Under this programme, a project to set up such a payment system in Nepal was established; it aims to engage local communities in forest monitoring. The major objective of this thesis is to compare and verify data obtained from different sources - remotely sensed data, namely LiDAR, and field sample measurements made by two groups of researchers - using two regression models: Sparse Bayesian Regression and Bayesian Regression with Orthogonal Variables.
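The thesis's data and exact estimators are not reproduced here; as a loose, hypothetical stand-in, the sketch below compares scikit-learn's ARDRegression (a sparse Bayesian regression) and BayesianRidge on synthetic LiDAR-style predictors against synthetic field-measured biomass, which is roughly the kind of comparison described.

import numpy as np
from sklearn.linear_model import ARDRegression, BayesianRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: LiDAR height metrics per plot vs. field-measured biomass.
n_plots = 120
lidar_metrics = rng.normal(size=(n_plots, 5))   # e.g. height percentiles, return density
biomass = (40 + 12 * lidar_metrics[:, 0] + 5 * lidar_metrics[:, 2]
           + rng.normal(scale=4.0, size=n_plots))

for name, model in [("sparse Bayesian (ARD)", ARDRegression()),
                    ("Bayesian ridge", BayesianRidge())]:
    r2 = cross_val_score(model, lidar_metrics, biomass, cv=5, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {r2.mean():.3f}")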

Relevance:

30.00%

Publisher:

Abstract:

In coastal waters, physico-chemical and biological properties and constituents vary at different time scales. In the study area of this thesis, within the Archipelago Sea in the northern Baltic Sea, seasonal cycles of light and temperature set preconditions for intra-annual variations, but developments at other temporal scales occur as well. Weather-induced runoffs and currents may alter water properties over the short term, and the consequences over time of eutrophication and global changes are to a degree unpredictable. The dynamic characteristics of northern Baltic Sea waters are further diversified at the archipelago coasts. Water properties may differ in adjacent basins, which are separated by island and underwater thresholds limiting water exchange, making the area not only a mosaic of islands but also one of water masses. Long-term monitoring and in situ observations provide an essential data reserve for coastal management and research. Since the seasonal amplitudes of water properties are so high, inter-annual comparisons of water-quality variables have to be based on observations sampled at the same time each year. In this thesis I compare areas by their temporal characteristics, using both inter-annual and seasonal data. After comparing spatial differences in seasonal cycles, I conclude that spatial comparisons and temporal generalizations have to be made with caution. In classifying areas by the state of their waters, the results may be biased even if the sampling is annually simultaneous, since the dynamics of water properties may vary according to the area. The most comprehensive view of the spatiotemporal dynamics of water properties would be achieved by means of comparisons with data consisting of multiple annual samples. For practical reasons, this cannot be achieved with conventional in situ sampling. A holistic understanding of the spatiotemporal features of the water properties of the Archipelago Sea will have to be based on the application of multiple methods, complementing each other’s spatial and temporal coverage. The integration of multi-source observational data and time-series analysis may be methodologically challenging, but it will yield new information as to the spatiotemporal regime of the Archipelago Sea.

Relevance:

30.00%

Publisher:

Abstract:

The increasing performance of computers has made it possible to solve algorithmically problems for which manual and possibly inaccurate methods were previously used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem, the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure in different areas. In the second problem, the input consists of a set of sites where water-quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites such that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce costs. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted at serial operation or a small number of CPU cores, whereas one, together with its further developments, is also suitable for parallel machines such as GPUs.
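Setting aside the spatial indexing and parallelisation that the thesis actually focuses on, the geometric core of the fetch length problem is intersecting a ray from a study point with shoreline segments and keeping the nearest hit. A minimal sketch of that core, with an invented shoreline, is shown below.

import math

def fetch_length(point, direction_deg, shoreline_segments):
    # Distance from `point` to the nearest shoreline segment along a bearing;
    # returns math.inf if the ray hits no segment (open water in that direction).
    px, py = point
    dx = math.cos(math.radians(direction_deg))
    dy = math.sin(math.radians(direction_deg))
    best = math.inf
    for (x1, y1), (x2, y2) in shoreline_segments:
        sx, sy = x2 - x1, y2 - y1
        denom = dx * sy - dy * sx
        if abs(denom) < 1e-12:          # ray parallel to this segment
            continue
        # Solve point + t*(dx, dy) == (x1, y1) + u*(sx, sy) for t and u.
        t = ((x1 - px) * sy - (y1 - py) * sx) / denom
        u = ((x1 - px) * dy - (y1 - py) * dx) / denom
        if t > 0 and 0.0 <= u <= 1.0:   # hit ahead of the point, within the segment
            best = min(best, t)         # direction is unit length, so t is the distance
    return best

shore = [((100.0, -50.0), (100.0, 50.0))]    # a vertical shoreline 100 m to the east
print(fetch_length((0.0, 0.0), 0.0, shore))  # -> 100.0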

Relevance:

30.00%

Publisher:

Abstract:

An attempt is made to study the possible relationship between the process of upwelling and zooplankton biomass in the shelf waters along the south-west coast of India between Cape Comorin and Ratnagiri, based on oceanographic and zooplankton data collected by the erstwhile FAO/UNDP Pelagic Fishery Project, Cochin, between 1973 and 1978. Different factors, such as the depth from which the bottom waters are induced upwards during upwelling, the depth to which the bottom waters are drawn, the vertical velocity of upwelling and the resultant zooplankton productivity, were considered in arriving at the deductions. Except for nutrients and phytoplankton productivity, for which simultaneous data are lacking, all the major factors were taken into consideration before concluding on a positive or negative correlation.

Relevance:

30.00%

Publisher:

Abstract:

Assessing water quality in today's global scenario implies the need for a reference point against which monitoring results can be measured and weighed. Aquatic ecosystems, as part of the natural environment, are balanced both within themselves and with other environmental compartments, and this equilibrium is subject to natural variations and evolutions as well as to variations caused by human intervention. The present assessment aims to identify, and possibly quantify, anthropogenic influences over time against a "natural" baseline situation. Water pollution problems have only recently been taken seriously, and mostly in retrospect: once damage has occurred it becomes immeasurable, and control action cannot be initiated.

Relevance:

30.00%

Publisher:

Abstract:

Low-lying coastal areas are particularly vulnerable to the impacts of climate change, as they are highly prone to inundation from sea-level rise (SLR). This study presents an appraisal of the impacts of SLR on coastal natural resources and the social communities that depend on them in the low-lying Vellar-Coleroon estuarine region of the Tamil Nadu coast, India. A Digital Elevation Model (DEM) derived from SRTM 90 m (Shuttle Radar Topography Mission) data, together with GIS (Geographic Information System) techniques, is used to identify the area of inundation in the study site. The vulnerability of coastal areas in the Vellar-Coleroon estuarine region to inundation was calculated for projected SLR scenarios of 0.5 m and 1 m. The results show that about 1570 ha of the land use and land cover (LULC) of the study area would be permanently inundated under a 0.5 m SLR, and 2407 ha under a 1 m SLR, resulting in the loss of three major coastal natural resources: coastal agriculture, mangroves and aquaculture. Six hamlets of the social communities that depend on these resources were identified as being at high risk and vulnerable to a 0.5 m SLR, and 12 hamlets to a 1 m SLR. The study emphasises that mainstreaming adaptation options to SLR should be embedded within a coastal zone management and planning effort that includes all coastal natural resources (ecosystem-based adaptation) and their dependent social communities (community-based adaptation), involved through capacity building.
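The inundation estimate itself amounts to thresholding the DEM against each SLR scenario and summing the affected area; the sketch below shows that step on a toy grid. The elevation values, the 90 m cell size and the simplification that every low-lying cell counts (regardless of hydrological connection to the sea) are assumptions for illustration only.

import numpy as np

def inundated_area_ha(dem, slr_m, cell_size_m):
    # Area (hectares) of cells at or below the sea-level-rise scenario.
    flooded_cells = np.count_nonzero(dem <= slr_m)
    return flooded_cells * cell_size_m ** 2 / 10_000.0

# Toy elevation grid (metres above mean sea level) standing in for the SRTM DEM.
dem = np.array([[0.2, 0.8, 3.1],
                [0.4, 1.5, 2.7],
                [0.9, 1.1, 4.0]])

for slr in (0.5, 1.0):
    print(f"SLR {slr} m: {inundated_area_ha(dem, slr, cell_size_m=90.0):.2f} ha")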

Relevance:

30.00%

Publisher:

Abstract:

This paper provides a model for the international market for credit ratings intended to promote transparency of rating methodologies and to counter the oligopolistic market structure in which Standard & Poor's, Moody's and Fitch Ratings collectively hold approximately 85 percent of the market. For the German credit market, the paper strongly advises the establishment of at least three centralised credit rating agencies (CRAs), set up and run independently by the large banking groups - "Großbanken", "Sparkassen" and "Genossenschaftsbanken". By acting as CRAs, universal banks could not only decrease their costs but would also increase competition and transparency. These new credit rating agencies would be subject to the Basel II internal ratings-based (IRB) surveillance standards, which go far beyond the Basel II standard approach with its external ratings by the three dominant US-American CRAs. Because the new Basel Accord has already been implemented in Europe, this model could be applied throughout Europe and possibly even worldwide, assuming the US were to adopt the new capital adequacy rules. This would increase the number of CRAs and hence competition, as the barriers to entry in the rating industry would not apply to these new institutions because of their expertise in the credit market. The fact that the IRB criteria already have to be disclosed by law would make the methodologies transparent and subject to approval by national regulators such as the "Bundesanstalt für Finanzdienstleistungsaufsicht" (BaFin) in Germany. Hence, the requirement to set up a new monitoring committee in Europe would become obsolete.

Relevance:

30.00%

Publisher:

Abstract:

Call centres play an important role in today's customer communication. For call centre operators, handling customer data is a matter of course; the same often applies to the data of their employees that accrue during telecommunication. However, employees' personality rights strongly restrict the permissible handling of such data. This article discusses - particularly in light of § 32 BDSG on employee data protection, in force since 1 September 2009 - the data protection classification of various measures intended to monitor call centre employees and to assess their performance and behaviour.

Relevance:

30.00%

Publisher:

Abstract:

The ongoing depletion of the coastal aquifer in the Gaza Strip due to groundwater overexploitation has led to seawater intrusion, which is becoming an increasingly serious problem as seawater invades further along many sections of the coastal shoreline. As a first step towards getting a hold on the problem, an artificial neural network (ANN) model has been applied as a new approach and an attractive tool to study and predict groundwater levels without physically based hydrologic parameters, to improve the understanding of the complex groundwater system, and to show the effects of hydrologic, meteorological and anthropogenic impacts on groundwater conditions. Predicting the future behaviour of the seawater intrusion process in the Gaza aquifer is thus of crucial importance for safeguarding the already scarce groundwater resources of the region.

In this study, the coupled three-dimensional groundwater flow and density-dependent solute transport model SEAWAT, as implemented in Visual MODFLOW, is applied to the Gaza coastal aquifer system to simulate the location and dynamics of the saltwater-freshwater interface in the aquifer over the period 2000-2010. Very good agreement between simulated and observed TDS salinities is obtained, with correlation coefficients of 0.902 and 0.883 for the steady-state and transient calibrations, respectively.

After successful calibration of the solute transport model, future management scenarios for the Gaza aquifer were simulated in order to obtain a more comprehensive view of the effects of the artificial recharge that has been planned for the Gaza Strip for some time to forestall, or even remedy, the presently adverse aquifer conditions, namely low groundwater heads and high salinity, by the end of the target simulation period, the year 2040. To that end, numerous management schemes are examined to maintain the groundwater system and to control the salinity distribution within the target period 2011-2040. In the first, pessimistic scenario, it is assumed that pumping from the aquifer continues to increase in the near future to meet rising water demand, and that the aquifer receives no recharge beyond natural precipitation. The second, optimistic scenario assumes that treated surficial wastewater can be used as an additional source of artificial recharge to the aquifer, which, in principle, should not only increase its sustainable yield but could, in the best case, even revert some of the adverse present-day conditions, i.e. seawater intrusion. This scenario was examined in three different cases, which differ in the locations and extent of the injection fields for the treated wastewater.

The results of the first (do-nothing) scenario indicate ongoing negative impacts on the aquifer, such as a higher propensity for strong seawater intrusion. Compared with the 2010 situation of the baseline model, by the end of the simulation period (2040) the amount of seawater intrusion into the coastal aquifer will have increased by about 35%, and the salinity by 34%. In contrast, all three cases of the second (artificial recharge) scenario group can partly revert the present seawater intrusion. From a water-budget point of view, compared with the do-nothing scenario for the year 2040, the water added to the aquifer by artificial recharge reduces the amount of water entering the aquifer through seawater intrusion by 81, 77 and 72% for the three recharge cases, respectively, while the salinity in the Gaza aquifer decreases by 15, 32 and 26% for the three cases, respectively.
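The scenario comparison above reduces to percent changes relative to the do-nothing run; the snippet below only illustrates that arithmetic with invented intrusion fluxes chosen to reproduce the stated 81, 77 and 72% reductions - the actual SEAWAT water-budget terms are not given in the abstract.

# Hypothetical 2040 seawater-intrusion fluxes (m^3/day), not the model's actual budget.
baseline_intrusion = 50_000.0
recharge_cases = {"case 1": 9_500.0, "case 2": 11_500.0, "case 3": 14_000.0}

for name, intrusion in recharge_cases.items():
    reduction = (baseline_intrusion - intrusion) / baseline_intrusion * 100.0
    print(f"{name}: intrusion reduced by {reduction:.0f}% vs. the do-nothing scenario")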

Relevance:

30.00%

Publisher:

Abstract:

In psycholinguistic research, it is widely assumed that evaluating information for its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). A growing number of studies, however, call this two-step model of comprehension and validation into question, directly or indirectly. In particular, findings on Stroop-like stimulus-response compatibility effects, which occur when positive and negative responses have to be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence, or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009), suggest that readers perform a non-strategic check of the validity of information already during comprehension.

Building on these findings, the aim of this dissertation was to further test the assumption that comprehension involves a non-strategic, routine, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different focuses were conducted.

Study 1 examined whether evidence of epistemic monitoring can also be found for information that is not clearly true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al. (2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, suggesting that epistemic monitoring is also sensitive to gradual differences in the fit between information and world knowledge. Moreover, the results show that the epistemic Stroop effect is indeed based on plausibility and not on differences in the predictability of plausible and implausible information.

Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to findings by other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study showed a compatibility effect of task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, but possibly on the depth of processing.

Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts, as well as the potential modulation of these effects by epistemic markers that indicate the certainty of information (e.g., certainly or perhaps). Consistent with the assumption of a fast, non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers were found on indicators of early comprehension processes, suggesting that the communicated certainty of information is taken into account by the monitoring process.

Overall, the findings argue against conceptualising comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. The implications of these findings for current models of language comprehension, and recommendations for further research on the relationship between comprehension and validation, are discussed.

Relevance:

30.00%

Publisher:

Abstract:

LiDAR (Light Detection and Ranging) technology, based on scanning the terrain with an airborne laser rangefinder, allows Digital Surface Models (DSM) to be built by simple interpolation, as well as Digital Terrain Models (DTM) by identifying and removing the objects present on the terrain (buildings, bridges or trees). The Geomatics Laboratory of the Politecnico di Milano - Como Campus - developed a LiDAR data filtering algorithm based on interpolation with bilinear and bicubic splines with Tychonov regularisation in a least-squares approach. In many cases, however, more refined and complex models are still needed, in which distinguishing between buildings and vegetation becomes mandatory. This may be the case for some hydrological risk prevention models, where vegetation is not needed, or for three-dimensional modelling of urban centres, where vegetation is a problematic factor. (...)
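The Como algorithm itself is not reproduced here; as a simplified one-dimensional illustration of the Tychonov-regularised least-squares idea it builds on, the sketch below fits values on a coarse node grid to noisy LiDAR-like heights by minimising ||A c - z||^2 + lambda*||D c||^2, where D is a second-difference (roughness) penalty. The profile data, node spacing and lambda are invented for the example.

import numpy as np

# Noisy 1-D "LiDAR" heights sampled along a profile (invented data).
x = np.linspace(0.0, 10.0, 60)
z = 0.5 * x + np.sin(x) + np.random.default_rng(1).normal(scale=0.3, size=x.size)

n_nodes = 25
nodes = np.linspace(x.min(), x.max(), n_nodes)

# A: bilinear-style (order-1 spline) weights from grid nodes to observation points.
A = np.zeros((x.size, n_nodes))
idx = np.clip(np.searchsorted(nodes, x) - 1, 0, n_nodes - 2)
w = (x - nodes[idx]) / (nodes[idx + 1] - nodes[idx])
A[np.arange(x.size), idx] = 1.0 - w
A[np.arange(x.size), idx + 1] = w

# D: second-difference operator; lambda * ||D c||^2 is the Tychonov roughness penalty.
D = np.diff(np.eye(n_nodes), n=2, axis=0)
lam = 1.0

# Normal equations of  min ||A c - z||^2 + lam * ||D c||^2.
coeffs = np.linalg.solve(A.T @ A + lam * (D.T @ D), A.T @ z)
print(coeffs.round(2))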