854 results for Concept-based Retrieval


Relevance:

30.00%

Publisher:

Abstract:

Oil prices continue to rise, and world energy consumption is projected to expand by 50 percent from 2005 to 2030. Intensive research is therefore focused on the development of alternative energy sources. Among them are dye-sensitized nanocrystalline solar cells (DSSCs), the "third generation" solar cells. These have gained attention during the last decade and are currently the subject of intense research in the framework of renewable energies as a low-cost photovoltaic technology. At present, DSSCs with ruthenium-based dyes exhibit the highest efficiencies (ca. 11%). The objective of the present work is to fabricate, characterize and improve the performance of DSSCs based on metal-free dyes as sensitizers, especially perylene derivatives. The work begins with a general introduction to photovoltaics and dye-sensitized solar cells, covering the operating principles and characteristics of DSSCs. Chapters 2 and 3 discuss the state of the art of sensitizers used in DSSCs, present the compounds used as sensitizers in the present work, and illustrate practical issues of experimental techniques and device preparation. A comparative study of electrolyte DSSCs based on P1, P4, P7, P8, P9, and P10 is presented in chapter 4. Experimental results show that the dye structure plays a crucial role in device performance. The dye based on the spiro concept (a bipolar spiro compound) exhibited a higher efficiency than the non-spiro compounds. The presence of tert-butylpyridine as an additive in the electrolyte was found to increase the open-circuit voltage and simultaneously decrease the efficiency. The presence of lithium ions in the electrolyte increases both the output current and the efficiency. The sensitivity of the dye to cations contained in the electrolyte is investigated in chapter 5. FT-IR and UV-Vis spectroscopy were used to investigate the in-situ coordination of the cations to the adsorbed dye in working devices.
The open-circuit voltage was found to depend on the number of coordination sites in the dye. P1, with the most coordination sites, showed the lowest potential drop, in contrast to P7, which is less sensitive to cations in the working cells. A strategy to improve dye adsorption onto the TiO2 surface, and thus the light-harvesting efficiency of the photoanode, by UV treatment is presented in chapter 6. Treatment of the TiO2 film with UV light generates hydroxyl groups and renders the TiO2 surface increasingly hydrophilic. The treated TiO2 surface reacts readily with the acid anhydride group of the dye, which acts as an anchoring group, and so improves dye adsorption. The short-circuit current density and the efficiency of the electrolyte-based dye cells were considerably improved by the UV treatment of the TiO2 film. Solid-state dye-sensitized solar cells (SSDs) based on spiro-MeOTAD (used as hole-transport material) are studied in chapter 7. The efficiency of SSDs was generally found to be lower than that of electrolyte-based solar cells. This was due to poor pore filling of the dye-loaded TiO2 film by the spin-coated spiro-MeOTAD and to the significantly slower charge transport in spiro-MeOTAD compared to the electrolyte redox mediator. However, the donor moieties in P1, which are structurally similar to spiro-MeOTAD, were found to improve the wettability of the P1-loaded TiO2 film. As a consequence, the P1-based solid-state cells perform better than cells based on non-spiro compounds.
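The device figures of merit named above (open-circuit voltage, short-circuit current density, efficiency) are linked by the standard power-conversion relation eta = (Jsc * Voc * FF) / Pin. A minimal sketch, with purely illustrative parameter values rather than measurements from this work:

```python
# Power-conversion efficiency of a solar cell from its J-V characteristics.
# The example numbers below are invented for illustration only.

def efficiency(jsc_ma_cm2, voc_v, ff, pin_mw_cm2=100.0):
    """Return efficiency in percent.

    jsc_ma_cm2 : short-circuit current density in mA/cm^2
    voc_v      : open-circuit voltage in V
    ff         : fill factor (0..1)
    pin_mw_cm2 : incident power; the AM1.5 standard is 100 mW/cm^2
    """
    return 100.0 * (jsc_ma_cm2 * voc_v * ff) / pin_mw_cm2

# Hypothetical electrolyte DSSC:
print(efficiency(16.0, 0.75, 0.75))  # -> 9.0
```

This makes it easy to see why a UV treatment that raises Jsc, or an additive that raises Voc at the cost of FF, shifts the overall efficiency.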

Relevance:

30.00%

Publisher:

Abstract:

At present, a fraction of 0.1 - 0.2% of patients undergoing surgery become aware during the procedure. This situation is referred to as anesthesia awareness and is obviously very traumatic for the person experiencing it. The reason for its occurrence is mostly an insufficient dosage of the narcotic Propofol, combined with the inability of the technology monitoring the depth of the patient's anesthetic state to notice the patient becoming aware. A solution can be a highly sensitive and selective real-time monitoring device for Propofol based on optical absorption spectroscopy. Its working principle was postulated by Prof. Dr. habil. H. Hillmer and formulated in DE10 2004 037 519 B4, filed on Aug 30th, 2004. It consists of the exploitation of intra-cavity absorption effects in a two-mode laser system. In this dissertation, a two-mode external-cavity semiconductor laser, developed prior to this work, is enhanced and optimized into a functional sensor. Enhancements include the implementation of variable couplers into the system and of a collimator arrangement into which samples can be introduced. A sample holder and cells are developed and characterized with a focus on compatibility with the measurement approach. Further optimization concerns the overall performance of the system: scattering sources are reduced by re-splicing all fiber-to-fiber connections, parasitic cavities are eliminated by suppressing the Fresnel reflections of all open fiber ends by means of optical isolators, and the wavelength stability of the system is improved by adding thermal insulation to the Fiber Bragg Gratings (FBG). The final laser sensor is characterized in detail thermally and optically. Two separate modes are obtained at 1542.0 and 1542.5 nm, each tunable over a range of 1 nm. The mode full width at half maximum (FWHM) is 0.06 nm and the signal-to-noise ratio (SNR) is as high as 55 dB.
Independent of tuning, the two modes of the system can always be equalized in intensity, which is important because the delicacy of the intensity equilibrium is one of the main sensitivity-enhancing effects formulated in DE10 2004 037 519 B4. For the proof-of-concept (POC) measurements, the target substance Propofol is diluted in the solvents acetone and dichloromethane (DCM), which were investigated for compatibility with Propofol beforehand. Eight measurement series (two solvents, two cell lengths and two different mode spacings) were taken, which paint a uniform picture: the mode intensity ratio responds linearly to an increase of Propofol concentration in all cases. The slope of the linear response indicates the sensitivity of the system. The eight series split into two groups: measurements taken in long cells and measurements taken in short cells.
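Extracting the sensitivity as the slope of the linear response can be sketched with an ordinary least-squares fit. The concentration and intensity-ratio values below are invented stand-ins for the real measurement series:

```python
# Least-squares line y = a*x + b; the slope a is the sensor "sensitivity".
# Data values are invented for illustration only.

def linear_fit(xs, ys):
    """Ordinary least squares; returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [0.0, 1.0, 2.0, 3.0, 4.0]        # hypothetical concentrations
ratio = [1.00, 1.05, 1.10, 1.15, 1.20]  # hypothetical mode-intensity ratios
slope, intercept = linear_fit(conc, ratio)
print(slope)  # sensitivity, here ~0.05 per concentration unit
```

Comparing such slopes across the eight series is exactly what distinguishes the long-cell from the short-cell group.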

Relevance:

30.00%

Publisher:

Abstract:

The scope of this work is the fundamental growth, tailoring and characterization of self-organized indium arsenide quantum dots (QDs) and their exploitation as the active region of diode lasers emitting in the 1.55 µm range. This wavelength regime is especially interesting for long-haul telecommunications, as optical fibers made from silica glass have their lowest optical absorption there. Molecular beam epitaxy (MBE) is utilized as the fabrication technique for the quantum dots and laser structures. The results presented in this thesis represent the first experimental work for which this reactor was used at the University of Kassel. Most research in the field of self-organized quantum dots has been conducted in the InAs/GaAs material system. It can be seen as the model system of self-organized quantum dots, but it is not suitable for the targeted emission wavelength: light emission from this system at 1.55 µm is hard to accomplish. To stay as close as possible to existing processing technology, the In(AlGa)As/InP (100) material system is deployed. Depending on the epitaxial growth technique and growth parameters, this system has the drawback of producing a wide range of nano species besides quantum dots. Best known are the elongated quantum dashes (QDashes). Such structures are preferentially formed if InAs is deposited on InP. This is related to the low lattice mismatch of 3.2%, less than half the value in the InAs/GaAs system. The task of creating round-shaped and uniform QDs is rendered more complex by exchange effects between arsenic and phosphorus, as well as anisotropic effects on the surface, that need not be dealt with in the InAs/GaAs case. While QDash structures have been studied both fundamentally and in laser structures, they do not represent the theoretical ideal of a zero-dimensional material. Creating round-shaped quantum dots on the InP(100) substrate remains a challenging task.
Details of the self-organization process are still unknown and the formation of the QDs is not yet fully understood. In the course of the experimental work, a novel growth concept was discovered and analyzed that eases the fabrication of QDs. It is based on different crystal growth and adatom diffusion processes under supply of different modifications of the arsenic atmosphere in the MBE reactor. The reactor is equipped with special valved cracking effusion cells for arsenic and phosphorus, an all-solid-source configuration that does not rely on a toxic gas supply. The cracking effusion cells are able to create different species of arsenic and phosphorus, which constitutes the basis of the growth concept. With this method, round-shaped QD ensembles with superior optical properties and record-low photoluminescence linewidth were achieved. By systematically varying the growth parameters and carrying out a detailed analysis of the experimental data, a range of parameter values for which the formation of QDs is favored was found. A qualitative explanation of the formation characteristics, based on the surface migration of In adatoms, is developed. Such tailored QDs are finally implemented as the active region in a self-designed diode laser structure. A basic characterization of the static and temperature-dependent properties was carried out. The QD lasers exceed a reference quantum-well laser in terms of inversion conditions and temperature-dependent characteristics. Pulsed output powers of several hundred milliwatts were measured at room temperature. In particular, the lasers feature a high modal gain that even allowed cw emission at room temperature from a processed ridge-waveguide device as short as 340 µm, with output powers of 17 mW. Modulation experiments performed at the Israel Institute of Technology (Technion) showed a complex behavior of the QDs in the laser cavity.
Although the laser structure is not fully optimized for a high-speed device, data transmission at 15 Gb/s combined with low noise was achieved. To the best of the author's knowledge, this makes these lasers the fastest QD devices operating at 1.55 µm. The thesis starts with an introductory chapter that sets out the advantages of optical fiber communication in general. Chapter 2 introduces the fundamentals needed to understand the importance of the active region's dimensions for the performance of a diode laser. The novel growth concept and its experimental analysis are presented in chapter 3. Chapter 4 contains the work on diode lasers.
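The lattice-mismatch figures quoted above can be checked directly from f = (a_epi - a_sub) / a_sub with textbook room-temperature lattice constants (a sketch for verification; the constants are standard literature values, not data from this work):

```python
# Lattice mismatch for the two material systems discussed in the abstract.
# Lattice constants in Angstrom (standard room-temperature literature values).

A_INAS, A_INP, A_GAAS = 6.0583, 5.8687, 5.6533

def mismatch(a_epi, a_sub):
    """Relative lattice mismatch of an epilayer on a substrate."""
    return (a_epi - a_sub) / a_sub

print(round(100 * mismatch(A_INAS, A_INP), 1))   # InAs/InP  -> 3.2 (%)
print(round(100 * mismatch(A_INAS, A_GAAS), 1))  # InAs/GaAs -> 7.2 (%)
```

This reproduces the 3.2% figure for InAs/InP, which is indeed less than half of the roughly 7.2% of InAs/GaAs.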

Relevance:

30.00%

Publisher:

Abstract:

The ongoing growth of the World Wide Web, catalyzed by the increasing possibility of ubiquitous access via a variety of devices, continues to strengthen its role as our prevalent information and communication medium. However, although tools like search engines facilitate retrieval, the task of finally making sense of Web content is still often left to human interpretation. The vision of supporting both humans and machines in such knowledge-based activities led to the development of different systems which allow Web resources to be structured by metadata annotations. Interestingly, two major approaches which gained a considerable amount of attention address the problem from nearly opposite directions: on the one hand, the idea of the Semantic Web proposes formalizing the knowledge within a particular domain by means of the "top-down" approach of defining ontologies. On the other hand, Social Annotation Systems, as part of the so-called Web 2.0 movement, implement a "bottom-up" style of categorization using arbitrary keywords. Experience as well as research into the characteristics of both systems has shown that their strengths and weaknesses are nearly inverse: while Social Annotation suffers from problems such as ambiguity or lack of precision, ontologies were especially designed to eliminate those. Ontologies, in turn, suffer from a knowledge acquisition bottleneck, which is successfully overcome by the large user populations of Social Annotation Systems. Rather than regarding them as competing paradigms, the obvious potential synergies of a combination motivated approaches to "bridge the gap" between the two. These were fostered by the evidence of emergent semantics, i.e., the self-organized evolution of implicit conceptual structures, within Social Annotation data.
While several techniques to exploit the emergent patterns have been proposed, a systematic analysis - especially regarding paradigms from the field of ontology learning - is still largely missing. This also includes a deeper understanding of the circumstances which affect the evolution processes. This work aims to address this gap by providing an in-depth study of methods and influencing factors for capturing emergent semantics from Social Annotation Systems. We focus hereby on the acquisition of lexical semantics from the underlying networks of keywords, users and resources. Structured along different ontology learning tasks, we use a methodology of semantic grounding to characterize and evaluate the semantic relations captured by different methods. In all cases, our studies are based on datasets from several Social Annotation Systems. Specifically, we first analyze semantic relatedness among keywords and identify measures which detect different notions of relatedness. These constitute the input of concept learning algorithms, which then focus on the discovery of synonymous and ambiguous keywords; here, we assess the usefulness of various clustering techniques. As a prerequisite to inducing hierarchical relationships, our next step is to study measures which quantify the level of generality of a particular keyword. We find that comparatively simple measures can approximate the generality information encoded in reference taxonomies. These insights are used to inform the final task, namely the creation of concept hierarchies. For this purpose, generality-based algorithms exhibit advantages compared to clustering approaches. To complement the identification of suitable methods for capturing semantic structures, we next analyze several factors which influence their emergence. Empirical evidence is provided that the amount of available data plays a crucial role in determining keyword meanings.
From a different perspective, we examine pragmatic aspects by considering different annotation patterns among users. Based on a broad distinction between "categorizers" and "describers", we find that the latter produce more accurate results. This suggests a causal link between pragmatic and semantic aspects of keyword annotation. As a special kind of usage pattern, we then examine system abuse and spam. While observing a mixed picture, we suggest that an individual decision should be taken instead of disregarding spammers as a matter of principle. Finally, we discuss a set of applications which operationalize the results of our studies for enhancing both Social Annotation and semantic systems. These comprise on the one hand tools which foster the emergence of semantics, and on the other hand applications which exploit the socially induced relations to improve, e.g., searching, browsing, or user-profiling facilities. In summary, the contributions of this work highlight viable methods and crucial aspects for designing enhanced knowledge-based services of a Social Semantic Web.
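One common way to measure semantic relatedness between keywords, as in the first task above, is the cosine similarity of their tag co-occurrence vectors. A minimal sketch over an invented toy folksonomy (the posts and tags are illustrative, not data from the study):

```python
# Tag relatedness via cosine similarity of co-occurrence vectors.
# The toy folksonomy below is invented for illustration only.

from collections import Counter
from math import sqrt

posts = [  # each post: the set of tags one user assigned to one resource
    {"python", "programming", "tutorial"},
    {"python", "programming", "web"},
    {"snake", "animal", "python"},
    {"java", "programming", "tutorial"},
]

def cooc_vector(tag):
    """Co-occurrence counts of `tag` with all other tags."""
    c = Counter()
    for post in posts:
        if tag in post:
            c.update(t for t in post if t != tag)
    return c

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in set(u) | set(v))
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# "python" should be closer to "java" (programming sense) than to "snake":
print(cosine(cooc_vector("python"), cooc_vector("java")) >
      cosine(cooc_vector("python"), cooc_vector("snake")))  # -> True
```

Measures like this form the input to the clustering step that separates the two senses of an ambiguous tag such as "python".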

Relevance:

30.00%

Publisher:

Abstract:

The increasing interconnection of information and communication systems leads to ever greater complexity and thus also to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer adequate protection against intrusion attempts into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyze information from network components and computers in order to detect unusual behavior and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volumes of network data and the development of an adaptive detection model that works in real time. To meet these challenges, this dissertation provides a framework that consists of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of the normal network state (NNB), and an update model. In OptiFilter, Tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analyzed and converted into connection vectors.
To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. Different approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growth topology is increased through novel approaches for initializing the weight vectors and through strengthening of the winner neurons, and a self-adaptive procedure is introduced to keep the model continuously up to date. Furthermore, the main task of the NNB model is the further examination of the unknown connections detected by the EGHSOM and the verification of whether they are normal. However, the network traffic data change constantly because of the concept drift phenomenon, which leads to the generation of non-stationary network data in real time. This phenomenon is better controlled by the update model. The EGHSOM model can effectively detect new anomalies, and the NNB model adapts optimally to the changes in the network data. In the experimental investigations, the framework showed promising results. In the first experiment, the framework was evaluated in offline mode: OptiFilter was evaluated with offline, synthetic, and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy. In the second experiment, the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely.
The comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all the others. This can be attributed to the following key points: the processing of the collected network data, the achievement of the best performance (e.g. overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
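The classification-confidence margin idea can be reduced to its simplest form: a vector is assigned to its best-matching class only if the match is close enough, otherwise it is flagged as unknown. A minimal sketch with a nearest-centroid stand-in for the EGHSOM; all centroids, the threshold, and the example vectors are invented toy values:

```python
# Nearest-centroid classifier with a confidence-margin threshold:
# connection vectors too far from every class centroid are flagged "unknown".
# Centroids, threshold and vectors are invented toy values.

from math import dist

centroids = {               # per-class mean connection vectors (toy data)
    "normal": (0.1, 0.2),
    "scan":   (0.9, 0.8),
}
MARGIN = 0.3                # assumed confidence-margin threshold

def classify(vec):
    label, d = min(((c, dist(vec, m)) for c, m in centroids.items()),
                   key=lambda t: t[1])
    return label if d <= MARGIN else "unknown"

print(classify((0.15, 0.25)))  # close to "normal"   -> normal
print(classify((0.5, 0.5)))    # far from both       -> unknown
```

In the framework described above, the vectors flagged "unknown" are exactly the ones handed on to the NNB model for further examination.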

Relevance:

30.00%

Publisher:

Abstract:

Consumer reviews, opinions and shared experiences in the use of a product are a powerful source of information about consumer preferences that can be used in recommender systems. Despite the importance and value of such information, there is no comprehensive mechanism that formalizes the selection and retrieval of opinions and the utilization of the retrieved opinions, owing to the difficulty of extracting information from text data. In this paper, a new recommender system built on consumer product reviews is proposed, and a prioritizing mechanism is developed for the system. The proposed approach is illustrated using the case study of a recommender system for digital cameras.
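A prioritizing mechanism over review-derived opinions can be sketched, in its simplest form, as ranking products by the average opinion score of their reviews. The scores and product names below are invented stand-ins for the output of an opinion-extraction step, not the paper's actual mechanism:

```python
# Minimal review-based prioritization: rank products by mean opinion score.
# Scores and product names are invented toy data.

reviews = {  # product -> opinion scores in [-1, 1] extracted from its reviews
    "camera_a": [0.8, 0.6, 0.9],
    "camera_b": [0.2, -0.1, 0.4],
    "camera_c": [0.7, 0.7],
}

def recommend(reviews, top_n=2):
    """Return the top_n products by average opinion score."""
    ranked = sorted(reviews,
                    key=lambda p: sum(reviews[p]) / len(reviews[p]),
                    reverse=True)
    return ranked[:top_n]

print(recommend(reviews))  # -> ['camera_a', 'camera_c']
```

A real system would weight scores by review helpfulness or recency; this sketch only shows the ranking skeleton.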

Relevance:

30.00%

Publisher:

Abstract:

This paper shows the impact of the atomic capabilities concept for including control-oriented knowledge of linear control systems in the decision-making structure of physical agents. These agents operate in a real environment, managing physical objects (e.g. their physical bodies) in coordinated tasks. The approach is presented using introspective reasoning and control theory, based on the specific tasks of passing a ball and executing the offside manoeuvre between physical agents in the robotic soccer testbed. Experimental results and conclusions are presented, emphasising the advantages of the approach, which improves multi-agent performance in cooperative systems.
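An "atomic capability" wrapping control-oriented knowledge can be pictured as a small, self-contained control law an agent invokes for one task. As an illustration only (the dynamics, gains and friction value are invented, and this is not the paper's controller), a proportional law driving a ball toward a teammate:

```python
# Toy atomic capability: a discrete-time proportional controller that drives
# a point-mass "ball" toward a pass target. All parameters are invented.

def pass_ball(distance, steps=50, kp=2.0, dt=0.1, friction=0.8):
    """Kick with speed proportional to the remaining distance;
    return the final distance error."""
    pos = 0.0
    for _ in range(steps):
        v = kp * (distance - pos)   # proportional control action
        pos += v * dt * friction    # simplistic kinematics with loss
        if abs(distance - pos) < 1e-3:
            break
    return distance - pos

err = pass_ball(5.0)
print(abs(err) < 0.01)  # -> True: the ball converges on the target
```

The point of the atomic-capability framing is that the deciding layer of the agent can reason about when to invoke such a controller without re-deriving the control law itself.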

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the final results of the research project undertaken in 2010 and 2011 by the legal research group "Filius", affiliated with the Corporación Universitaria Empresarial Alexander von Humboldt of Armenia (Quindío). The project's general objective is "to establish the concept of family used by the Colombian legal system based on the judgments of the Constitutional Court granting rights to same-sex couples". To this end, a line of jurisprudence was developed from the Court's rulings that discussed the rights of same-sex couples. It concludes that, despite the great progress made in Colombia on the recognition of rights of these couples following Decision C-075/2007, in all these judgments the Court had always refused to recognize their family status; it was not until 2011, in Decision C-577, that the Court accepted that same-sex couples constitute a family, thereby dramatically changing the constitutional doctrine that had maintained heterosexuality as the defining criterion of family.

Relevance:

30.00%

Publisher:

Abstract:

Leaders have the task of developing future strategy while consciously engaged in executing current strategy, mapping the landmarks, pathways and obstacles they meet as they endeavor to traverse challenging, rapidly evolving terrain. In an era of a global leadership credibility crisis, business as usual is no longer an option in the pursuit of the longer-term survival of any organization. The leadership approach to complexity outlined here is based on learning to achieve results through experimentation, learning, and reflection. A case study is presented that illustrates the application of this approach. The reader is first introduced to a brief overview of some key definitions and debates, shifting leadership boundaries, and emerging accountabilities and opportunities. This is followed by a summary of some of the key topics and issues that face current and future leaders.

Relevance:

30.00%

Publisher:

Abstract:

Shape complexity has recently received attention from different fields, such as computer vision and psychology. In this paper, tools from integral geometry and information theory are applied to quantify shape complexity from two different perspectives: from the inside of the object, we evaluate its degree of structure, i.e. the correlation between its surfaces (inner complexity), and from the outside, we compute its degree of interaction with the circumscribing sphere (outer complexity). Our shape complexity measures are based on the following two facts: uniformly distributed global lines crossing an object define a continuous information channel, and the continuous mutual information of this channel is independent of the object discretisation and invariant to translations, rotations, and changes of scale. The measures introduced in this paper can potentially be used as shape descriptors for object recognition, image retrieval, object localisation, tumour analysis, and protein docking, among others.
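The mutual information underlying these measures is, in its discrete form, I(X;Y) = sum over x,y of p(x,y) log2( p(x,y) / (p(x) p(y)) ). A minimal sketch with a toy joint distribution standing in for the line-crossing channel described above:

```python
# Discrete mutual information of a channel given its joint distribution.
# The joint distribution below is a toy stand-in for the global-line channel.

from math import log2

def mutual_information(joint):
    """joint: 2-D list of probabilities p(x, y) summing to 1; returns bits."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * log2(p / (px[i] * py[j]))
    return mi

# A perfectly correlated 2x2 channel carries exactly one bit:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # -> 1.0
```

The continuous mutual information used in the paper is the limit of this quantity under refinement of the discretisation, which is why it is independent of how the object is discretised.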

Relevance:

30.00%

Publisher:

Abstract:

Circuit testing is a phase of the production process that becomes ever more important when a new product is developed. Test and diagnosis techniques for digital circuits have been successfully developed and automated, whereas this is not yet the case for analog circuits. Among all the methods proposed for diagnosing analog circuits, the most widely used are fault dictionaries. This thesis describes several of them, analysing their advantages and drawbacks. In recent years, Artificial Intelligence techniques have become one of the most important research fields for fault diagnosis. This thesis develops two such techniques in order to cover some of the shortcomings of fault dictionaries. The first proposal is based on building a fuzzy system as an identification tool. The results obtained are quite good, since the fault is located in a high percentage of the cases. On the other hand, the success rate is not good enough when, in addition, the deviation has to be determined. Since fault dictionaries can be seen as a simplified approximation to Case-Based Reasoning (CBR), the second proposal extends fault dictionaries towards a CBR system. The purpose is not to give a general solution to the problem but to contribute a new methodology: improving the diagnosis of fault dictionaries by adding and adapting new cases so as to become a Case-Based Reasoning system. The structure of the case base is described, as well as the retrieval, reuse, revision and retention tasks, with emphasis on the learning process.
Throughout the text, several circuits are used to illustrate the test methods described, but the biquadratic filter in particular is used to validate the proposed methodologies, since it is one of the benchmarks proposed in the analog circuit context. The faults considered are parametric, permanent, independent and simple, although the methodology can easily be extrapolated to the diagnosis of multiple and catastrophic faults. The method focuses on testing the passive components, although it could also be extended to faults in the active ones.
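The retrieval step of a case-based fault dictionary can be sketched as nearest-neighbour lookup: the new measurement signature is matched against stored cases and the closest case's diagnosis is reused. The signatures and fault labels below are invented toy values, not measurements from the biquadratic filter benchmark:

```python
# CBR-style retrieval for analog fault diagnosis: return the diagnosis of
# the stored case whose measurement signature is closest to the observation.
# Signatures and fault labels are invented toy values.

from math import dist

case_base = [  # (measurement signature, diagnosed fault)
    ((1.00, 0.20), "R1 +50%"),
    ((0.60, 0.90), "C2 -30%"),
    ((0.10, 0.10), "nominal"),
]

def retrieve(observation):
    """Nearest-case retrieval by Euclidean distance on the signature."""
    signature, fault = min(case_base, key=lambda c: dist(c[0], observation))
    return fault

print(retrieve((0.95, 0.25)))  # -> R1 +50%
```

What turns this dictionary into a full CBR system, as the thesis proposes, are the subsequent reuse, revision and retention steps: a revised diagnosis is stored back into `case_base`, so the dictionary learns.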

Relevance:

30.00%

Publisher:

Abstract:

Quality management; self-evaluation of the organisation; citizens/customers satisfaction; impact on society evaluation; key performance evaluation; good practices comparison (benchmarking); continuous improvement. In professional environments, when the quality assessment of museums is discussed, one immediately thinks of the honourableness of the directors and curators, the erudition and specialisation of knowledge, the diversity of the gathered material and the study of the collections, the conservation methods and environmental control of the collections, the regularity and notoriety of the exhibitions and artists, the building's architecture and site, the recreation of environments, and the design of the museographic equipment. We admit that the roles and attributes listed above can contribute to defining a specific museological good practice within a hierarchised functional perspective (the museum functions), and to classifying museums on a scale validated between peers, based on "installed" appreciation criteria enforced from above downwards according to the "prestige" of the products and of those who conceive them; but these say nothing about the effective satisfaction of citizens/customers or the real impact on society. There is a lack of evaluation instruments that would return to us all that the museum is and represents in contemporary society, focused on being and on the relation with the other, rather than on ostentatious possession and on doing merely in order to meet one's duties. But it is only possible to evaluate something by measurement and comparison, on the basis of well-defined criteria and a common grid, involving all of the actors in the self-evaluation, in the definition of the aims to fulfil and in the obtaining of results.

Relevance:

30.00%

Publisher:

Abstract:

The τ–ω model of microwave emission from soil and vegetation layers is widely used to estimate soil moisture content from passive microwave observations. Its application to prospective satellite-based observations aggregating several thousand square kilometres requires an understanding of the effects of scene heterogeneity. The effects of heterogeneity in soil surface roughness, soil moisture, water area and vegetation density on the retrieval of soil moisture from simulated single- and multi-angle observing systems were tested. Uncertainty in water area proved the most serious problem for both systems, causing errors of a few percent in soil moisture retrieval. Single-angle retrieval was largely unaffected by the other factors studied here. Multiple-angle retrieval errors of around one percent arose from heterogeneity in either soil roughness or soil moisture, while errors of a few percent were caused by vegetation heterogeneity. A simple extension of the model's vegetation representation was shown to reduce this error substantially for scenes containing a range of vegetation types.
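The forward model at the heart of such retrievals can be sketched in its common zeroth-order form, TB = Ts(1-r)γ + Tv(1-ω)(1-γ)(1+rγ) with γ = exp(-τ/cosθ). This is a textbook form of the τ-ω model; the parameter values below are illustrative assumptions, not values from the study:

```python
# Bare-bones tau-omega forward model for the brightness temperature of a
# vegetated soil surface. Parameter values are illustrative only.

from math import exp, cos, radians

def tau_omega_tb(ts, tv, r, tau, omega, theta_deg):
    """Brightness temperature (K) for one polarisation.

    ts, tv     : soil and vegetation temperatures (K)
    r          : rough-soil reflectivity for this polarisation
    tau, omega : vegetation optical depth and single-scattering albedo
    theta_deg  : incidence angle (degrees)
    """
    gamma = exp(-tau / cos(radians(theta_deg)))      # canopy transmissivity
    soil = ts * (1.0 - r) * gamma                    # attenuated soil emission
    veg = tv * (1.0 - omega) * (1.0 - gamma) * (1.0 + r * gamma)
    return soil + veg

tb = tau_omega_tb(ts=290, tv=295, r=0.25, tau=0.3, omega=0.05, theta_deg=40)
print(tb)  # a physically plausible TB between the bare-soil and canopy limits
```

Scene heterogeneity matters because TB is nonlinear in r, τ and moisture, so the TB of a mixed footprint is not the TB of the mean parameters.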

Relevance:

30.00%

Publisher:

Abstract:

Within this paper, modern techniques such as satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve existing techniques for mapping the spatial distribution of sediment transport processes. The processes of interest comprise mass movements such as solifluction, slope wash, dirty avalanches and rock- and boulder falls. They differ considerably in nature, and therefore different approaches are required for deriving their spatial extent. A major challenge is addressing the difference between the comparably coarse resolution of the available satellite data (Landsat TM/ETM+, 30 m x 30 m) and the actual scale of sediment transport in this environment. A three-step approach has been developed which is based on the concept of Geomorphic Process Units (GPUs): parameterization, process area delineation and combination. Parameters include land cover from satellite data and digital elevation model derivatives. Process areas are identified using a hierarchical classification scheme utilizing thresholds and a definition of topology. The approach has been developed for the Karkevagge in Sweden and could be successfully transferred to the Rabotsbekken catchment at Okstindan, Norway, using similar input data. Copyright (C) 2008 John Wiley & Sons, Ltd.
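A hierarchical, threshold-based delineation of process areas can be sketched per raster cell from land cover and a DEM derivative such as slope. The classes and threshold values below are invented for illustration; the real scheme also applies topology rules across neighbouring cells:

```python
# Toy threshold-based classification of one raster cell into a Geomorphic
# Process Unit. Thresholds and class rules are invented for illustration.

def classify_gpu(land_cover, slope_deg):
    """Return a process-unit label for one cell (hierarchical rules)."""
    if land_cover == "rock" and slope_deg > 40:
        return "rock- and boulder fall"
    if land_cover == "snow" and slope_deg > 30:
        return "dirty avalanche"
    if land_cover == "vegetated" and slope_deg > 15:
        return "solifluction"
    if slope_deg > 5:
        return "slope wash"
    return "no transport"

print(classify_gpu("rock", 55))       # -> rock- and boulder fall
print(classify_gpu("vegetated", 20))  # -> solifluction
print(classify_gpu("vegetated", 3))   # -> no transport
```

Applying such a rule to every cell of a land-cover map and a DEM-derived slope grid yields the process-area layer that is then combined into GPUs.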

Relevance:

30.00%

Publisher:

Abstract:

The potential of the τ-ω model for retrieving the volumetric moisture content of bare and vegetated soil from dual-polarisation passive microwave data acquired at single and multiple angles is tested. Measurement error and several additional sources of uncertainty affect the theoretical retrieval accuracy. These include uncertainty in the soil temperature, in the vegetation structure and consequently its microwave single-scattering albedo, and in the soil's microwave emissivity as determined by its roughness. To test the effects of these uncertainties for simple homogeneous scenes, we attempt to retrieve soil moisture from a number of simulated microwave brightness temperature datasets generated using the τ-ω model. The uncertainties for each influence are estimated and applied to curves generated for typical scenarios, and an inverse model is used to retrieve the soil moisture content, vegetation optical depth and soil temperature. The effect of each influence on the theoretical soil moisture retrieval limit is explored, the likelihood of each sensor configuration meeting user requirements is assessed, and the most effective means of improving moisture retrieval are indicated.
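The inverse-model step can be sketched as searching for the moisture value whose simulated brightness temperature best matches the observation. The linear emissivity-moisture relation and all numbers below are illustrative assumptions, not the model used in the study:

```python
# Toy soil-moisture retrieval by brute-force inversion of a forward model.
# The linear emissivity-moisture relation is an invented stand-in.

def forward_tb(moisture, ts=290.0):
    """Toy forward model: emissivity falls linearly with moisture."""
    emissivity = 0.95 - 0.5 * moisture
    return ts * emissivity

def retrieve_moisture(tb_obs, step=0.001):
    """Grid search for the moisture whose simulated TB matches tb_obs."""
    candidates = [i * step for i in range(int(0.5 / step) + 1)]
    return min(candidates, key=lambda m: abs(forward_tb(m) - tb_obs))

truth = 0.25
tb = forward_tb(truth)           # "observed" brightness temperature
print(retrieve_moisture(tb))     # recovers ~0.25
```

Adding noise to `tb` before inversion is the simplest way to see how measurement error propagates into a moisture retrieval error, which is the kind of sensitivity study the abstract describes.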