12 results for dynamic threat avoid

in Universitätsbibliothek Kassel, Universität Kassel, Germany


Relevance:

20.00%

Abstract:

Land use has become a force of global importance, considering that 34% of the Earth’s ice-free surface was covered by croplands or pastures in 2000. The expected increase in the global human population, together with imminent climate change and the associated search for energy sources other than fossil fuels, can, through land-use and land-cover changes (LUCC), increase the pressure on nature’s resources, further degrade ecosystem services, and disrupt other planetary systems of key importance to humanity. This thesis presents four modeling studies on the interplay between LUCC, increased production of biofuels and climate change in four selected world regions. In the first case study, two new crop types (sugarcane and jatropha) are parameterized in the LPJmL (LPJ for managed Lands) dynamic global vegetation model to calculate their potential productivity. Country-wide spatial variation in the yields of sugarcane and jatropha leads to substantially different land requirements to meet the biofuel production targets for 2015 in Brazil and India, depending on the location of plantations. In particular, the average land requirements for jatropha in India are considerably higher than previously estimated. These findings indicate that crop zoning is important to avoid excessive LUCC. In the second case study, the LandSHIFT model of land-use and land-cover changes is combined with life cycle assessments to investigate the occurrence and extent of biofuel-driven indirect land-use changes (ILUC) in Brazil by 2020. The results show that Brazilian biofuels can indeed cause considerable ILUC, especially by pushing the rangeland frontier into the Amazonian forests. The carbon debt caused by such ILUC would mean no net carbon savings (from using plant-based ethanol and biodiesel instead of fossil fuels) for 44 years in the case of sugarcane ethanol and 246 years for soybean biodiesel. The intensification of livestock grazing could avoid such ILUC. We argue that such an intensification should be supported by the Brazilian biofuel sector, based on the sector’s own interest in minimizing carbon emissions. The third study develops a new method for crop allocation in LandSHIFT, as influenced by the occurrence and capacity of specific infrastructure units. The method is applied, as an example, in a first assessment of the potential availability of land for biogas production in Germany. The results indicate that Germany has enough land to fulfill virtually all (90 to 98%) of its current biogas plant capacity with cultivated feedstocks alone. Biogas plants located in southern and southwestern (northern and northeastern) Germany might face more (fewer) difficulties in fulfilling their capacities with cultivated feedstocks, considering that the feedstock transport distance to plants is a crucial issue for biogas production. In the fourth study, an adapted version of LandSHIFT is used to assess the impacts of contrasting scenarios of climate change and conservation targets on land use in the Brazilian Amazon. Model results show that severe climate change in some regions by 2050 can shift the deforestation frontier to areas that would experience low levels of human intervention under mild climate change (such as the western Amazon forests or parts of the Cerrado savannas). Halting deforestation of the Amazon and of the Brazilian Cerrado would require either a reduction in the production of meat or an intensification of livestock grazing in the region. Such findings point to the need for an integrated, multidisciplinary plan for adaptation to climate change in the Amazon. The overall conclusions of this thesis are that (i) biofuels must be analyzed and planned carefully in order to effectively reduce carbon emissions; (ii) climate change can have considerable impacts on the location and extent of LUCC; and (iii) intensification of livestock grazing represents a promising avenue for minimizing the impacts of future land-use and land-cover changes in Brazil.
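
The payback times quoted above follow from a simple ratio of the up-front land-use-change carbon debt to the annual emissions avoided by substituting biofuel for fossil fuel. A minimal sketch of that arithmetic, with purely illustrative numbers rather than the thesis's data:

    def payback_years(carbon_debt_tC_per_ha: float, annual_saving_tC_per_ha: float) -> float:
        """Years until cumulative avoided fossil emissions repay the land-use-change carbon debt."""
        return carbon_debt_tC_per_ha / annual_saving_tC_per_ha

    # Illustrative, assumed values (not taken from the thesis):
    debt = 100.0    # tC per hectare released through indirect land-use change
    saving = 2.0    # tC per hectare avoided each year by displacing fossil fuel
    print(f"payback of roughly {payback_years(debt, saving):.0f} years")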

Relevance:

20.00%

Abstract:

Context awareness, dynamic reconfiguration at runtime and heterogeneity are key characteristics of future distributed systems, particularly in ubiquitous and mobile computing scenarios. The main contributions of this dissertation are theoretical as well as architectural concepts facilitating information exchange and fusion in heterogeneous and dynamic distributed environments. Our main focus is on bridging heterogeneity issues while, at the same time, considering uncertain, imprecise and unreliable sensor information in information fusion and reasoning approaches. A domain ontology is used to establish a common vocabulary for the exchanged information. We thereby explicitly support different representations for the same kind of information and provide Inter-Representation Operations that convert between them. Special account is taken of the conversion of associated meta-data that express uncertainty and imprecision. The Unscented Transformation, for example, is applied to propagate Gaussian normal distributions across highly non-linear Inter-Representation Operations. Uncertain sensor information is fused using the Dempster-Shafer Theory of Evidence, as it allows explicit modelling of partial and complete ignorance. We also show how to incorporate the Dempster-Shafer Theory of Evidence into probabilistic reasoning schemes such as Hidden Markov Models in order to consider the uncertainty of sensor information when deriving high-level information from low-level data. For all these concepts we provide architectural support as a guideline for developers of innovative information exchange and fusion infrastructures that are particularly targeted at heterogeneous dynamic environments. Two case studies serve as proof of concept. The first focuses on heterogeneous autonomous robots that have to spontaneously form a cooperative team in order to achieve a common goal. The second is concerned with an approach to user activity recognition that serves as the baseline for a context-aware adaptive application. Both case studies demonstrate the viability and strengths of the proposed solution and emphasize that the Dempster-Shafer Theory of Evidence should be preferred to pure probability theory in applications involving non-linear Inter-Representation Operations.
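
As a small illustration of the evidence-fusion step described above (a generic sketch of Dempster's rule of combination, not the dissertation's implementation), two sensor-level mass functions over a common frame of discernment could be combined like this:

    from itertools import product

    def dempster_combine(m1: dict, m2: dict) -> dict:
        """Combine two mass functions (dict: frozenset -> mass) with Dempster's rule."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        # Normalize by the non-conflicting mass (1 - K)
        return {s: w / (1.0 - conflict) for s, w in combined.items()}

    # Two sensors reporting on whether a user is 'walking' or 'sitting',
    # with explicit ignorance assigned to the whole frame {walking, sitting}.
    W, S = frozenset({"walking"}), frozenset({"sitting"})
    frame = W | S
    m_a = {W: 0.6, frame: 0.4}          # sensor A: 60% walking, 40% "don't know"
    m_b = {W: 0.3, S: 0.3, frame: 0.4}  # sensor B: undecided
    print(dempster_combine(m_a, m_b))

Assigning mass to the whole frame is what lets partial and complete ignorance be modelled explicitly, which is the property the dissertation highlights over pure probability theory.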

Relevance:

20.00%

Abstract:

Temporal changes in odor concentration are vitally important to many animals as they orient and navigate in their environment. How are such temporal changes detected? Within the scope of the present work, an accurate stimulation and analysis system was developed to examine the dynamics of physiological properties of Drosophila melanogaster olfactory receptor organs. Subsequently, a new method for delivering odor stimuli was tested and used to present the first dynamic characterization of olfactory receptors at the level of single neurons. Initially, recordings of the whole antenna were conducted while stimulating with different odors. The odor delivery system allowed the dynamic characterization of the whole fly antenna, including its sensilla and receptor neurons. Based on the obtained electroantennogram data, a new odor delivery method, called the digital sequence method, was developed. In addition, the degree of accuracy was enhanced, initially using electroantennograms and later using recordings of odorant receptor cells at the single-sensillum level. This work shows for the first time that different odors evoke different responses within one neuron, depending on the chemical structure of the odor. The present work offers new insights into the dynamic properties of olfactory transduction in Drosophila melanogaster and describes the time-dependent parameters underlying these properties.
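
The abstract does not spell out the analysis behind the dynamic characterization; one common way to characterize a receptor's temporal response is to estimate a linear response kernel relating the stimulus sequence to the recorded signal. The sketch below assumes such a regularized least-squares approach and uses synthetic data only:

    import numpy as np

    def estimate_kernel(stimulus: np.ndarray, response: np.ndarray, n_taps: int = 50,
                        ridge: float = 1e-3) -> np.ndarray:
        """Estimate a linear response kernel (filter) relating a stimulus sequence to a
        recorded response via ridge-regularized least squares."""
        # Design matrix of lagged stimulus values (one column per lag)
        X = np.column_stack([np.roll(stimulus, lag) for lag in range(n_taps)])
        X[:n_taps, :] = 0.0  # discard samples contaminated by wrap-around
        A = X.T @ X + ridge * np.eye(n_taps)
        return np.linalg.solve(A, X.T @ response)

    # Synthetic example: a binary odor-pulse sequence filtered by an exponential kernel
    rng = np.random.default_rng(0)
    stim = rng.integers(0, 2, 2000).astype(float)
    true_k = np.exp(-np.arange(50) / 10.0)
    resp = np.convolve(stim, true_k)[:2000] + 0.1 * rng.standard_normal(2000)
    k_hat = estimate_kernel(stim, resp)   # recovers an approximation of true_k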

Relevance:

20.00%

Abstract:

The rapid growth in high data rate communication systems has introduced new, highly spectrally efficient modulation techniques and standards such as LTE-A (long term evolution-advanced) for 4G (4th generation) systems. These techniques provide broader bandwidth but introduce a high peak-to-average power ratio (PAR) problem at the high power amplifier (HPA) level of the communication system base transceiver station (BTS). To avoid spectral spreading due to high PAR, a stringent linearity requirement is imposed, which forces the HPA to operate at large power back-off at the expense of power efficiency. Consequently, high power devices are fundamental in HPAs for high linearity and efficiency. Recent development in wide bandgap power devices, in particular the AlGaN/GaN HEMT, has offered higher power levels with a superior linearity-efficiency trade-off in microwave communications. For a cost-effective HPA design-to-production cycle, rigorous computer-aided design (CAD) models of AlGaN/GaN HEMTs are essential to reflect the real device response with increasing power level and channel temperature. Therefore, a large-signal electrothermal modeling procedure for large-size AlGaN/GaN HEMTs is proposed. The HEMT structure analysis, characterization, data processing, model extraction and model implementation phases are covered in this thesis, including trapping and self-heating dispersion, which account for nonlinear drain current collapse. The small-signal model is extracted using the 22-element modeling procedure developed in our department. The intrinsic large-signal model is investigated in depth in conjunction with linearity prediction. The accuracy of the nonlinear drain current has been enhanced by addressing several issues such as trapping and self-heating characterization. Also, the thermal profile of the HEMT structure has been investigated and the corresponding thermal resistance extracted through thermal simulation and chuck-controlled temperature pulsed I(V) and static DC measurements. A higher-order equivalent thermal model is extracted and implemented in the HEMT large-signal model to accurately estimate the instantaneous channel temperature. Moreover, trapping and self-heating transients have been characterized through transient measurements. The obtained time constants are represented by equivalent sub-circuits and integrated in the nonlinear drain current implementation to account for the dynamics of complex communication signals. Verification of this table-based large-size large-signal electrothermal model has shown high accuracy in terms of output power, gain, efficiency and nonlinearity prediction with respect to standard large-signal test signals.
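
The role of a higher-order equivalent thermal model can be illustrated with a small simulation of a Foster-type RC thermal network that converts dissipated power into an instantaneous channel-temperature estimate; the network topology and element values below are assumed for illustration and are not taken from the thesis:

    import numpy as np

    def channel_temperature(p_diss: np.ndarray, dt: float, r_th, c_th, t_base: float = 25.0) -> np.ndarray:
        """Instantaneous channel temperature from a Foster-type RC thermal network.
        Each cell i obeys  C_i * dT_i/dt = P - T_i / R_i; the channel temperature is
        the baseplate temperature plus the sum of all cell temperature rises."""
        r_th, c_th = np.asarray(r_th, float), np.asarray(c_th, float)
        t_cells = np.zeros_like(r_th)
        out = np.empty_like(p_diss)
        for k, p in enumerate(p_diss):
            # explicit Euler update of each thermal cell
            t_cells += dt * (p - t_cells / r_th) / c_th
            out[k] = t_base + t_cells.sum()
        return out

    # Illustrative, assumed values (not from the thesis): two-cell network, 1 W power step
    p = np.ones(20000) * 1.0      # dissipated power in W
    temps = channel_temperature(p, dt=1e-6, r_th=[8.0, 4.0], c_th=[1e-4, 1e-3])
    print(f"steady-state rise of about {temps[-1] - 25.0:.1f} K")  # roughly 12 K for R_th = 12 K/W

A higher-order network of this kind captures the multiple time constants observed in the transient measurements, which a single RC cell cannot.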

Relevance:

20.00%

Abstract:

Computer-aided model validation techniques are now widely used in the field of structural dynamics. In this process, experimental modal data are used to correct a numerical model for further analyses. Nevertheless, the validated model represents only the dynamic behavior of the tested structure. In reality, there are many factors that inevitably lead to varying results of modal tests: changing environmental conditions during a test, slightly different test setups, a test on a nominally identical but different structure (e.g. from series production), etc. Before a stochastic simulation can be carried out, a number of assumptions have to be made for the random variables used. Consequently, an inverse method is needed that makes it possible to identify a stochastic model from experimental modal data. This work describes the development of a parameter-based approach for identifying stochastic simulation models in the field of structural dynamics. The developed method is based on first-order sensitivities, with which the parameter means and covariances of the numerical model can be determined from stochastic experimental modal data.
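
The first-order idea can be sketched as follows: with a sensitivity matrix S of modal features with respect to model parameters, a linearized estimate of the parameter mean shift and covariance follows from the pseudo-inverse of S. This is a generic sketch of that linearization, with illustrative numbers, not the thesis's implementation or data:

    import numpy as np

    def update_parameter_statistics(S: np.ndarray, z_measured: np.ndarray, z_model: np.ndarray,
                                    cov_z: np.ndarray):
        """Linearized estimate of parameter mean shift and covariance from stochastic modal
        data:  dp = pinv(S) @ (mean(z_meas) - z_model),  Cov_p = pinv(S) @ Cov_z @ pinv(S).T,
        where S is the sensitivity matrix of modal features w.r.t. model parameters."""
        S_pinv = np.linalg.pinv(S)
        delta_p_mean = S_pinv @ (z_measured.mean(axis=0) - z_model)
        cov_p = S_pinv @ cov_z @ S_pinv.T
        return delta_p_mean, cov_p

    # Illustrative, assumed numbers: 3 modal features (e.g. eigenfrequencies), 2 parameters
    S = np.array([[2.0, 0.5], [0.3, 1.5], [1.0, 1.0]])   # first-order sensitivities
    z_model = np.array([10.0, 20.0, 30.0])               # nominal model predictions
    z_meas = z_model + np.random.default_rng(1).normal(0.0, 0.2, size=(50, 3)) + [0.4, -0.3, 0.1]
    dp, cov_p = update_parameter_statistics(S, z_meas, z_model, np.cov(z_meas, rowvar=False))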

Relevance:

20.00%

Abstract:

Against the background of integrating the knowledge-based management system Precision Farming into organic farming, the implementation of existing strategies as well as strategies still to be developed was evaluated and discussed. With a view to the cost-efficient yield recording that is decisive in Precision Farming, two further contributions analyzed the estimation accuracy of ultrasonic and spectral sensors, applied individually and in combination, for the legume-grass mixtures that occupy a large share of organically farmed land. The goal of Precision Farming, namely to implement management adapted to the within-field variability of sites and thereby to achieve a reduction in inputs, energy, labor and environmental effects while simultaneously increasing effectiveness and optimizing economics, coincides with essential aspirations of organic farming. It is primarily measures for recording the variability of site factors, such as terrain relief, soil sampling and apparent electrical conductivity, as well as yield recording via combine harvesters, that can be applied directly in organic farming. By contrast, dynamically adapted applications for fertilization, plant protection and weed control can be implemented in organic farming only if the application models are modified and further dynamic data are incorporated, owing to complex interactions and the rather passive character of these measures. Examples include mineralization processes in the soil and in organic fertilizer that have to be accounted for when calculating fertilizer quantities, preventive plant-protection measures that are difficult to assign site-specifically, and effects on soil-microbiological processes during hoeing or harrowing passes. The indirect regulation mechanisms of organic farming therefore limit the dynamically adapted applications of conventional Precision Farming, which have so far been designed for a rather direct effect. In addition, innovative new strategies are conceivable, of which quality-related harvesting, the use of highly sensitive sensors for the early detection of plant diseases, and targeted site-specific, conservation-oriented management are presented as examples in this work. For the legume-grass mixtures, which often cover large shares of the farmland, ultrasonic distance measurement for characterizing sward height and various spectral vegetation indices were analyzed as estimation indicators for a low-cost and flexibly deployable yield recording. The vegetation indices were calculated from hyperspectral data according to published equations and were additionally determined stepwise as a "Normalized Difference Spectral Index" (NDSI) from all possible wavelength combinations. Ultrasound and vegetation indices were analyzed both individually and in combination in order to exploit possible compensatory effects. Applied on its own, the ultrasonic sward height consistently achieved better estimation accuracy than any individual vegetation index. Among the latter, vegetation indices based on water absorption bands in particular achieved higher estimation accuracy than traditional red/infrared indices. Combining both types of sensor data showed a further increase in estimation accuracy, especially with sward-specific calibration. Here, the vegetation indices compensate for misestimations of the height measurement in the case of discontinuous changes in sward density along the height gradient, as caused by ear emergence or by individual tall-growing species. The combination of ultrasonic sward height with vegetation indices thus shows potential for the development of low-cost yield sensors for legume-grass mixtures. Further investigations with hyperspectral vegetation indices of other calculation structures, as well as the inclusion of more than two wavelengths, are necessary to achieve higher estimation accuracy. Likewise, calibrations and validations of the sensor combination need to be carried out in species-rich grassland. Yield recording in legume-grass mixtures makes an important contribution to building a yield history in the diverse crop rotations of organic farming and enables an improved assessment of production potentials and deficit areas for site-adapted management.
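
The stepwise NDSI screening mentioned above amounts to evaluating (R_i − R_j)/(R_i + R_j) for every wavelength pair and keeping the pair that best predicts yield. A minimal sketch of that search, with random stand-in data rather than the study's measurements:

    import numpy as np

    def best_ndsi_pair(reflectance: np.ndarray, wavelengths: np.ndarray, yields: np.ndarray):
        """Compute NDSI = (R_i - R_j) / (R_i + R_j) for all wavelength pairs and return
        the pair whose NDSI correlates best (|r|) with the measured yields.
        reflectance: (n_samples, n_bands) hyperspectral reflectance."""
        n_bands = reflectance.shape[1]
        best = (0.0, None)
        for i in range(n_bands):
            for j in range(i + 1, n_bands):
                ndsi = (reflectance[:, i] - reflectance[:, j]) / (reflectance[:, i] + reflectance[:, j])
                r = np.corrcoef(ndsi, yields)[0, 1]
                if abs(r) > best[0]:
                    best = (abs(r), (wavelengths[i], wavelengths[j]))
        return best

    # Illustrative random data standing in for field measurements
    rng = np.random.default_rng(2)
    R = rng.uniform(0.05, 0.6, size=(40, 100))   # 40 plots, 100 spectral bands
    wl = np.linspace(400, 2500, 100)             # wavelengths in nm
    y = rng.uniform(2.0, 8.0, size=40)           # dry-matter yield, t/ha
    print(best_ndsi_pair(R, wl, y))

The combined ultrasonic-plus-index estimation described in the abstract would then use the selected NDSI together with sward height as predictors in a calibration model.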

Relevance:

20.00%

Abstract:

Web services from different partners can be combined into applications that realize a more complex business goal. Such applications, built as Web service compositions, define how interactions between Web services take place in order to implement the business logic. Web service compositions not only have to provide the desired functionality but also have to comply with certain Quality of Service (QoS) levels. Maximizing the users' satisfaction, also reflected as Quality of Experience (QoE), is a primary goal to be achieved in a Service-Oriented Architecture (SOA). Unfortunately, in a dynamic environment like SOA, unforeseen situations may appear, such as services not being available or not responding within the desired time frame. In such situations, appropriate actions need to be triggered in order to avoid the violation of QoS and QoE constraints. In this thesis, solutions are developed to manage Web services and Web service compositions with regard to QoS and QoE requirements. The Business Process Rules Language (BPRules) was developed to manage Web service compositions when undesired QoS or QoE values are detected. BPRules provides a rich set of management actions that may be triggered for controlling the service composition and for improving its quality behavior. Regarding the quality properties, BPRules distinguishes between the QoS values as promised by the service providers, the QoE values assigned by end-users, the monitored QoS as measured by our BPR framework, and the predicted QoS and QoE values. BPRules facilitates the specification of user groups characterized by different context properties and allows a personalized, context-aware service selection to be triggered, tailored to the specified user groups. In a service market where a multitude of services with the same functionality but different quality values are available, the right services need to be selected for realizing the service composition. We developed new and efficient heuristic algorithms that are applied to choose high-quality services for the composition. BPRules offers the possibility to integrate multiple service selection algorithms. The selection algorithms are also applicable to non-linear objective functions and constraints. The BPR framework includes new approaches for context-aware service selection and quality property prediction. We consider the location information of users and services as a context dimension for the prediction of response time and throughput. The BPR framework combines all new features and contributions into a comprehensive management solution. Furthermore, it facilitates flexible monitoring of QoS properties without having to modify the description of the service composition. We show how the different modules of the BPR framework work together in order to execute the management rules. We evaluate how our selection algorithms outperform a genetic algorithm from related research. The evaluation also reveals how context data can be used for a personalized prediction of response time and throughput.
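
The thesis's own selection algorithms are not detailed in the abstract; as a toy illustration of heuristic QoS-aware selection, a greedy scheme could pick the cheapest candidate per composition task and then swap in faster services until an end-to-end response-time constraint is met. The services and QoS values below are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        response_ms: float   # QoS: response time
        cost: float          # QoS: price per invocation

    def greedy_select(tasks: dict, max_total_response_ms: float):
        """Toy greedy heuristic: start from the cheapest candidate per task, then swap in
        faster (more expensive) candidates until the end-to-end response time of the
        sequential composition satisfies the constraint."""
        choice = {t: min(cands, key=lambda c: c.cost) for t, cands in tasks.items()}
        def total_rt(): return sum(c.response_ms for c in choice.values())
        while total_rt() > max_total_response_ms:
            # find the swap that buys the most response time per extra unit of cost
            best_swap, best_ratio = None, 0.0
            for t, cands in tasks.items():
                cur = choice[t]
                for c in cands:
                    gain, extra = cur.response_ms - c.response_ms, c.cost - cur.cost
                    if gain > 0 and (extra <= 0 or gain / extra > best_ratio):
                        best_swap, best_ratio = (t, c), float("inf") if extra <= 0 else gain / extra
            if best_swap is None:
                raise ValueError("constraint not satisfiable")
            choice[best_swap[0]] = best_swap[1]
        return choice

    tasks = {
        "payment":  [Candidate("payA", 120, 1.0), Candidate("payB", 60, 2.5)],
        "shipping": [Candidate("shipA", 200, 0.5), Candidate("shipB", 90, 1.5)],
    }
    print(greedy_select(tasks, max_total_response_ms=200))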

Relevance:

20.00%

Abstract:

Almost everyone sketches. People use sketches day in and day out in many different and heterogeneous fields, for example to share their thoughts and to clarify ambiguous interpretations. The media used for sketching range from analog tools like flipcharts to digital tools like smartboards. Whereas analog tools usually suffer from insufficient editing capabilities like cut/copy/paste, digital tools support these scenarios well. Digital tools can be grouped into informal and formal tools. Informal tools can be understood as simple drawing environments, whereas formal tools offer sophisticated support to create, optimize and validate diagrams of a certain application domain. Most digital formal tools force users to stick to a concrete syntax and editing workflow, limiting the user's creativity. For that reason, many people first sketch their ideas using the flexibility of analog or digital informal tools; subsequently, the sketch is "portrayed" in an appropriate digital formal tool. This work presents Scribble, a highly configurable and extensible sketching framework which allows sketching features to be dynamically injected into existing graphical diagram editors based on Eclipse GEF. This makes it possible to combine the flexibility of informal tools with the power of formal tools without any effort: no additional code is required to augment a GEF editor with sophisticated sketching features. Scribble recognizes drawn elements as well as handwritten text and automatically generates the corresponding domain elements. A local training data library is created dynamically by incrementally learning shapes drawn by the user. Training data can be shared with others using the WebScribble web application, which has been created as part of this work.
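
Scribble's actual recognition pipeline is not described in the abstract; a generic illustration of incremental shape learning is a template library that resamples strokes and classifies new input by the nearest stored template. This is hypothetical code, not Scribble's implementation:

    import math

    def resample(points, n=32):
        """Resample a drawn stroke to n roughly equally spaced points (a common
        preprocessing step in template-based sketch recognizers)."""
        total = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
        step, acc, out, prev = total / (n - 1), 0.0, [points[0]], points[0]
        for p in points[1:]:
            d = math.dist(prev, p)
            while acc + d >= step and d > 0:
                t = (step - acc) / d
                q = (prev[0] + t * (p[0] - prev[0]), prev[1] + t * (p[1] - prev[1]))
                out.append(q)
                prev, d, acc = q, d - (step - acc), 0.0
            acc += d
            prev = p
        while len(out) < n:          # pad in case of rounding shortfall
            out.append(points[-1])
        return out

    class ShapeLibrary:
        """Minimal incremental training library: store labeled templates, classify new
        strokes by smallest summed point-to-point distance."""
        def __init__(self): self.templates = []
        def learn(self, label, stroke): self.templates.append((label, resample(stroke)))
        def classify(self, stroke):
            s = resample(stroke)
            return min(self.templates,
                       key=lambda t: sum(math.dist(a, b) for a, b in zip(s, t[1])))[0]

    lib = ShapeLibrary()
    lib.learn("line", [(0, 0), (10, 10)])
    lib.learn("corner", [(0, 0), (10, 0), (10, 10)])
    print(lib.classify([(0, 0), (5, 5), (9, 9)]))   # -> 'line'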

Relevance:

20.00%

Abstract:

With the present research, we investigated effects of existential threat on veracity judgments. According to several meta-analyses, people tend to judge potentially deceptive messages of other people as true rather than as false (the so-called truth bias). This judgmental bias has been shown to depend on how people weigh the error of judging a true message as a lie (error 1) against the error of judging a lie as a true message (error 2). The weighting of these errors has further been shown to be affected by situational variables. Given that research on terror management theory has found evidence that mortality salience (MS) increases sensitivity to compliance with cultural norms, especially when these norms are the focus of attention, we assumed that when the honesty norm is activated, MS affects the weighting of judgmental errors and, consequently, judgmental biases. Specifically, activating the norm of honesty should decrease the weight of error 1 (judging a true message as a lie) and increase the weight of error 2 (judging a lie as a true message) when mortality is salient. In a first study, we found initial evidence for this assumption. Furthermore, the change in error weighting should reduce the truth bias, automatically resulting in better detection accuracy for actual lies and worse accuracy for actual true statements. In two further studies, we manipulated MS and honesty-norm activation before participants judged several videos containing actual truths or lies. The results provided evidence for our prediction. Moreover, in Study 3, the truth bias was increased after MS when group solidarity had previously been emphasized.
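
The error-weighing account can be illustrated with a simple expected-cost decision rule (a sketch of the underlying logic, not the authors' analysis): a message is judged a lie only when the expected cost of missing a lie outweighs the expected cost of falsely accusing a truthful sender, so shifting the weights shifts the judgment threshold.

    def judge(p_lie: float, w_false_accusation: float, w_missed_lie: float) -> str:
        """Expected-cost decision rule: call the message a lie when the expected cost of
        judging it 'true' (missing a lie, error 2) exceeds the expected cost of judging
        it a lie (falsely accusing a truthful sender, error 1)."""
        cost_say_true = p_lie * w_missed_lie                 # risk of error 2
        cost_say_lie = (1.0 - p_lie) * w_false_accusation    # risk of error 1
        return "lie" if cost_say_true > cost_say_lie else "true"

    # With error 1 weighted heavily (the default state), a 40%-likely lie is still judged true;
    # shifting weight toward error 2 (as predicted under MS plus honesty-norm activation)
    # flips the judgment and thereby reduces the truth bias.
    print(judge(0.4, w_false_accusation=3.0, w_missed_lie=1.0))  # -> 'true'
    print(judge(0.4, w_false_accusation=1.0, w_missed_lie=3.0))  # -> 'lie'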