890 results for Scope of Protection
Abstract:
[EN]Detecting people is a key capability for robots that operate in populated environments. In this paper, we adopt a hierarchical approach that combines classifiers created using supervised learning in order to identify whether or not a person is within the robot's field of view. Our approach makes use of vision, depth and thermal sensors mounted on top of a mobile platform.
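The abstract gives no implementation details; as a purely illustrative sketch (all function and field names below are hypothetical, not taken from the paper), a hierarchical combination of supervised per-sensor classifiers could take the form of a cheap depth-based candidate stage followed by score-level fusion of vision and thermal classifiers:

def person_in_view(frame, depth_candidates, rgb_score, thermal_score, thr=0.5):
    """Return True if a person is judged to be within the robot's field of view.

    depth_candidates(depth_image) -> list of candidate regions (cheap first stage)
    rgb_score(rgb_image, region), thermal_score(thermal_image, region) ->
        probability of 'person' from supervised classifiers trained per sensor.
    Hypothetical sketch; not the classifiers or hierarchy used in the paper.
    """
    for region in depth_candidates(frame["depth"]):
        score = 0.5 * (rgb_score(frame["rgb"], region) +
                       thermal_score(frame["thermal"], region))
        if score > thr:  # fused evidence from the two modalities
            return True
    return False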
Abstract:
[EN]On the basis of extensive bibliographic documentation, supplemented by field research in the study area (observation of and direct participation in the activity), the present work aims to provide the basic information needed to understand the current situation of cetaceans on the south coast of Gran Canaria, Canary Islands, Spain, between July 2013 and March 2014 (the months of February and March 2014 correspond to my field research in the study area). It includes a guide to the species that make up the large-whale sector, analysing their specific characteristics, their distribution and their degree of protection
Abstract:
[EN]This Ph.D. thesis presents a simple and stable procedure for estimating the periods and damping ratios of shear buildings on pile foundations, taking soil-structure interaction into account. The coupled-system response is obtained by using a substructuring model. A boundary element-finite element coupling formulation is used to compute impedances and kinematic interaction factors of the pile group configurations under investigation. The proposed procedure is applied to perform parametric analyses to determine the influence of the main parameters of soil-structure interaction problems on the dynamic response of the superstructure. The scope of this thesis also encompasses the study of foundations including battered piles.
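For orientation only (a classical expression from the soil-structure interaction literature, not the procedure proposed in the thesis), the period of a single-storey structure of fixed-base period T, stiffness k and effective height h, founded on a compliant pile group with horizontal impedance \(K_x\) and rocking impedance \(K_\theta\), lengthens approximately as
\[ \tilde{T} = T \sqrt{1 + \frac{k}{K_x} + \frac{k\,h^{2}}{K_\theta}} , \]
where the frequency-dependent impedances are precisely the quantities computed here with the boundary element-finite element coupling formulation.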
Abstract:
The main scope of this Ph.D. thesis concerns the possible transformations of bridging ligands in diiron complexes, with the aim of exploring unconventional routes to the synthesis of new functionalized, multisite-bound organic frames. The results achieved during the Ph.D. can be summarized in the following points: 1) We have extended the coupling between small unsaturated molecules and bridging carbyne ligands in diiron complexes to other species. In particular, we have investigated the coupling between olefins and thiocarbyne, leading to the synthesis of diiron complexes with bridging thioallylidene ligands. We have then extended the study to the coupling between olefins and aminocarbyne. These results show that the coupling between activated olefins and heteroatom-substituted bridging carbynes has a general character. 2) As we have shown, the coupling of bridging alkylidyne ligands with alkynes and alkenes provides excellent routes to the synthesis of bridging C3 hydrocarbyl ligands. As a possible extension of these results, we have examined the synthesis of C4 bridging frames through the combination of bridging alkylidynes with allenes. In this case, too, the reaction has a general character. 3) Diiron complexes bearing bridging functionalized C3 organic frames contain donor atoms, such as N and S, potentially able to coordinate unsaturated metal fragments. We have therefore studied the possibility for these systems to act as ‘organometallic ligands’, in particular towards Pd and Rh. 4) The possibility of releasing the organic frame from the bridging coordination appears particularly appealing in the direction of metal-assisted organic synthesis. Within this field, we have investigated the possibility of involving the C3 bridging ligand in cycloaddition reactions with alkynes, with the aim of generating variously functionalized five-membered cycles. The [3+2] cyclization does not lead to the complete release of the organic fragment; rather, it transforms it into a cyclopentadienyl ring, which remains coordinated to one Fe atom. This result introduces a new approach to the formation of polyfunctionalised ferrocenes. 5) Furthermore, I spent a research period of about six months at the Department of Inorganic Chemistry of the University of Barcelona, under the supervision of Prof. Concepción López, with the aim of studying the chemistry of polydentate ferrocenyl ligands and their use in organometallic synthesis.
Abstract:
Several MCAO systems are under study to improve the angular resolution of the current and future generations of large ground-based telescopes (diameters in the 8-40 m range). The subject of this PhD Thesis is embedded in this context. Two MCAO systems, in different realization phases, are addressed in this Thesis: NIRVANA, the 'double' MCAO system designed for one of the interferometric instruments of LBT, is in the integration and testing phase; MAORY, the future E-ELT MCAO module, is under preliminary study. These two systems tackle the sky coverage problem in two different ways. The layer-oriented approach of NIRVANA, coupled with multi-pyramid wavefront sensors, takes advantage of the optical co-addition of the signal coming from up to 12 NGS in an annular 2' to 6' technical FoV and up to 8 in the central 2' FoV. Summing the light coming from many natural sources makes it possible to increase the limiting magnitude of the single NGS and to improve the sky coverage considerably. One of the two wavefront sensors for the mid-high altitude atmosphere analysis has been integrated and tested as a stand-alone unit in the laboratory at INAF-Osservatorio Astronomico di Bologna and afterwards delivered to the MPIA laboratories in Heidelberg, where it was integrated and aligned to the post-focal optical relay of one LINC-NIRVANA arm. A number of tests were performed in order to characterize and optimize the system functionalities and performance. A report on this work is presented in Chapter 2. In the MAORY case, to ensure correction uniformity and sky coverage, the LGS-based approach is the current baseline. However, since the Sodium layer is approximately 10 km thick, the artificial reference source looks elongated, especially when observed from the edge of a large aperture. On a 30-40 m class telescope, for instance, the maximum elongation varies between a few arcsec and 10 arcsec, depending on the actual telescope diameter, on the Sodium layer properties and on the laser launcher position. The centroiding error in a Shack-Hartmann WFS increases proportionally to the elongation (in a photon-noise dominated regime), strongly limiting the performance. A straightforward solution to compensate for this effect is to increase the laser power, i.e. to increase the number of detected photons per subaperture. The scope of Chapter 3 is twofold: an analysis of the performance of three different algorithms (Weighted Center of Gravity, Correlation and Quad-cell) for the instantaneous LGS image position measurement in the presence of elongated spots, and the determination of the number of photons required to achieve a given average wavefront error over the telescope aperture. An alternative optical solution to the spot elongation problem is proposed in Section 3.4. Starting from the considerations presented in Chapter 3, a first-order analysis of the LGS WFS for MAORY (number of subapertures, number of detected photons per subaperture, RON, focal plane sampling, subaperture FoV) is the subject of Chapter 4. An LGS WFS laboratory prototype was designed to reproduce the relevant aspects of an LGS SH WFS for the E-ELT and to evaluate the performance of different centroid algorithms in the presence of elongated spots, as investigated numerically and analytically in Chapter 3. This prototype makes it possible to simulate realistic Sodium profiles. A full testing plan for the prototype is set out in Chapter 4.
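Of the three centroid algorithms compared in Chapter 3, the Weighted Center of Gravity is the simplest to state; a minimal numpy sketch (illustrative only, not the thesis code) is:

import numpy as np

def weighted_center_of_gravity(img, weight):
    """Weighted Centre of Gravity of a (possibly elongated) LGS spot.

    img    : 2-D background-subtracted subaperture image
    weight : 2-D weighting map of the same shape (e.g. a reference Gaussian)
    Returns the spot position (xc, yc) in pixel coordinates.
    """
    w = img * weight
    y, x = np.indices(img.shape)   # pixel coordinate grids
    norm = w.sum()
    return (x * w).sum() / norm, (y * w).sum() / norm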
Abstract:
The research is part of a survey for the detection of the hydraulic and geotechnical conditions of river embankments funded by the Reno River Basin Regional Technical Service of the Region Emilia-Romagna. The hydraulic safety of the Reno River, one of the main rivers in North-Eastern Italy, is indeed of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundreds of kilometres) has generated great interest in non-destructive geophysical methods, which, compared to other methods such as drilling, allow for the faster and often less expensive acquisition of high-resolution data. The present work aims to test Ground Penetrating Radar (GPR) for the detection of local non-homogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methodologies, such as electrical resistivity tomography (ERT), Multi-channel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to support the integration of various geophysical methods in the regular maintenance and checking of embankment conditions. The first part of this thesis is dedicated to the state of the art concerning the geographic, geomorphological and geotechnical characteristics of the Reno River and its tributaries' embankments, as well as to the description of some geophysical applications carried out on embankments of European and North-American rivers, which served as the bibliographic basis for this thesis. The second part is an overview of the geophysical methods employed in this research (with particular attention to GPR), reporting their theoretical basis and examining in more depth some techniques for the analysis and representation of geophysical data when applied to river embankments. The subsequent chapters, following the main scope of this research, namely to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, show the results obtained by analyzing different cases that could lead to the formation of weakness zones and, subsequently, to embankment failure. Among the advantages, a considerable acquisition speed and a spatial resolution of the acquired data unmatched by the other methodologies were recorded. With regard to the drawbacks, some factors related to the attenuation of the propagating waves, due to the varying content of clay, silt and sand, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore compromised the embankment safety assessment. In summary, Ground Penetrating Radar can represent a suitable tool for checking river dike conditions, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment was investigated, and the information obtained relates only to changes in electrical properties, without any quantitative measurement.
Furthermore, GPR alone is ineffective for a preliminary assessment of embankment safety conditions, whereas for detailed campaigns at shallow depth, which aim to achieve immediate results with optimal precision, its use is highly recommended. The cases in which a multidisciplinary approach was tested reveal an excellent interconnection of the various geophysical methodologies employed, producing qualitative results in the preliminary phase (FDEM), ensuring a quantitative and highly reliable description of the subsoil (ERT) and, finally, providing fast and highly detailed analysis (GPR). As a recommendation for future research, the combined use of several geophysical devices to assess the safety conditions of river embankments is strongly suggested, especially in view of likely flood events, when the entire extent of the embankments themselves must be investigated.
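For context on the depth limitation noted above (these relations are standard GPR practice, not taken from the thesis), the radar wave velocity in a low-loss medium of relative permittivity \(\varepsilon_r\) and the conversion of a two-way travel time \(t\) into a reflector depth \(d\) are usually approximated as
\[ v \approx \frac{c}{\sqrt{\varepsilon_r}}, \qquad d = \frac{v\,t}{2}, \]
while higher clay and water contents raise the electrical conductivity and hence the attenuation, which is what restricts penetration in fine-grained embankment material.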
Abstract:
Background. One of the phenomena observed in human aging is the progressive increase of a systemic inflammatory state, a condition referred to as “inflammaging”, which is negatively correlated with longevity. A prominent mediator of inflammation is the transcription factor NF-kB, which acts as a key transcriptional regulator of many genes coding for pro-inflammatory cytokines. Many different signaling pathways activated by very diverse stimuli converge on NF-kB, resulting in a regulatory network characterized by high complexity. NF-kB signaling has been proposed to be responsible for inflammaging. The scope of this analysis is to provide a wider, systemic picture of this intricate signaling and interaction network: the NF-kB pathway interactome. Methods. The study has been carried out following a workflow for gathering information from the literature as well as from several pathway and protein interaction databases, and for integrating and analyzing the existing data and the reconstructed representations using the available computational tools. Substantial manual intervention was necessary to integrate data from multiple sources into mathematically analyzable networks. The reconstruction of the NF-kB interactome pursued with this approach provides a starting point for a general view of the architecture and for a deeper analysis and understanding of this complex regulatory system. Results. A “core” and a “wider” NF-kB pathway interactome, consisting of 140 and 3146 proteins respectively, were reconstructed and analyzed through a mathematical, graph-theoretical approach. Among other interesting features, the topological characterization of the interactomes shows that a relevant number of interacting proteins are in turn products of genes that are themselves controlled and regulated in their expression by NF-kB transcription factors. These “feedback loops”, not always well known, deserve deeper investigation, since they may have a role in tuning the response and the output consequent to NF-kB pathway initiation, in regulating the intensity of the response, or in maintaining its homeostasis and balance in order to make the functioning of such a critical system more robust and reliable. This integrated view sheds light on the functional structure and on some of the crucial nodes of the NF-kB transcription factor interactome. Conclusion. Framing the structure and dynamics of the NF-kB interactome within a wider, systemic picture would be a significant step toward a better understanding of how NF-kB globally regulates diverse gene programs and phenotypes. This study represents a step towards a more complete and integrated view of the NF-kB signaling system.
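As a minimal illustration of the kind of graph-theoretical queries such an interactome supports (the edge list below is a hypothetical placeholder, not the reconstructed network, and the thesis' own tooling is not reproduced here):

import networkx as nx

# Toy directed interaction graph; node and edge choices are illustrative only.
edges = [("TNF", "TNFR1"), ("TNFR1", "IKK"), ("IKK", "NFKB1"),
         ("NFKB1", "NFKBIA"), ("NFKBIA", "NFKB1")]  # IkB-alpha feedback on NF-kB
g = nx.DiGraph(edges)

# Hub-like nodes: proteins with the largest number of interaction partners.
hubs = sorted(dict(g.degree()).items(), key=lambda kv: kv[1], reverse=True)[:3]

# Feedback loops: directed cycles passing through an NF-kB subunit.
feedback_loops = [cycle for cycle in nx.simple_cycles(g) if "NFKB1" in cycle]
print(hubs, feedback_loops)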
Abstract:
This dissertation deals with the development of a project concerning a demonstration within the scope of Supply Chain 6 of the Internet of Energy (IoE) project: the Remote Monitoring Emulator, to which I contributed in several sections. IoE is a project of international relevance that aims to establish an interoperability standard for the electric power production and utilization infrastructure, using Smart Space platforms. The future perspectives of IoE concern a platform for electrical power trading, the Smart Grid, whose energy is produced by decentralized renewable sources and whose services are exploited primarily according to the Internet of Things philosophy. The main consumers of this kind of smart technology will be Smart Houses (that is to say, buildings controlled by an autonomous system for electrical energy management that is interoperable with the Smart Grid) and Electric Mobility, that is, the smart and automated management of the movement and, above all, the recharging of electric vehicles. It is precisely in the latter case study that the Remote Monitoring Emulator project takes place. It consists of the development of a simulated platform for the management of electric vehicle recharging in a city. My personal contribution to this project lies in the development and modeling of the simulation platform and of its mobile-application counterpart, and in the implementation of a city service prototype. This platform shall, ultimately, make up a demonstrator system exploiting the same device that a real user, inside his vehicle, would use. The main requirements that this platform shall satisfy are interoperability, expandability and adherence to standards, as it needs to communicate with other development groups and to respond effectively to internal changes that can affect IoE.
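As a purely hypothetical sketch of the kind of data a recharging-management emulator might exchange between an in-vehicle client and a city service (none of the names or fields below are taken from the IoE project):

from dataclasses import dataclass

@dataclass
class RechargeRequest:
    # Hypothetical message from the in-vehicle client to the city service.
    vehicle_id: str
    battery_level_kwh: float
    target_level_kwh: float
    position: tuple        # (latitude, longitude)
    latest_departure: str  # ISO 8601 timestamp

def choose_station(request, stations):
    """Naive illustration: pick the free charging station closest to the vehicle.

    Assumes at least one station has a free slot; stations are plain dicts here.
    """
    free = [s for s in stations if s["free_slots"] > 0]
    return min(free, key=lambda s: (s["lat"] - request.position[0]) ** 2
                                   + (s["lon"] - request.position[1]) ** 2)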
Abstract:
A polar stratospheric cloud submodel has been developed and incorporated in a general circulation model including atmospheric chemistry (ECHAM5/MESSy). The formation and sedimentation of polar stratospheric cloud (PSC) particles can thus be simulated as well as heterogeneous chemical reactions that take place on the PSC particles. For solid PSC particle sedimentation, the need for a tailor-made algorithm has been elucidated. A sedimentation scheme based on first order approximations of vertical mixing ratio profiles has been developed. It produces relatively little numerical diffusion and can deal well with divergent or convergent sedimentation velocity fields. For the determination of solid PSC particle sizes, an efficient algorithm has been adapted. It assumes a monodisperse radii distribution and thermodynamic equilibrium between the gas phase and the solid particle phase. This scheme, though relatively simple, is shown to produce particle number densities and radii within the observed range. The combined effects of the representations of sedimentation and solid PSC particles on vertical H2O and HNO3 redistribution are investigated in a series of tests. The formation of solid PSC particles, especially of those consisting of nitric acid trihydrate, has been discussed extensively in recent years. Three particle formation schemes in accordance with the most widely used approaches have been identified and implemented. For the evaluation of PSC occurrence a new data set with unprecedented spatial and temporal coverage was available. A quantitative method for the comparison of simulation results and observations is developed and applied. It reveals that the relative PSC sighting frequency can be reproduced well with the PSC submodel whereas the detailed modelling of PSC events is beyond the scope of coarse global scale models. In addition to the development and evaluation of new PSC submodel components, parts of existing simulation programs have been improved, e.g. a method for the assimilation of meteorological analysis data in the general circulation model, the liquid PSC particle composition scheme, and the calculation of heterogeneous reaction rate coefficients. The interplay of these model components is demonstrated in a simulation of stratospheric chemistry with the coupled general circulation model. Tests against recent satellite data show that the model successfully reproduces the Antarctic ozone hole.
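As a point of comparison only (the submodel's tailor-made scheme is not reproduced here), a generic first-order explicit upwind step for sedimenting a vertical mixing-ratio profile can be sketched as follows; the scheme described above differs in that it builds on first-order approximations of the profiles themselves to reduce numerical diffusion:

import numpy as np

def upwind_sedimentation_step(q, rho, dz, v_sed, dt):
    """One explicit first-order upwind sedimentation step.

    q     : mixing ratio per layer (index 0 = top layer)
    rho   : air density per layer [kg m-3]
    dz    : layer thickness per layer [m]
    v_sed : downward sedimentation velocity per layer [m s-1]
    dt    : time step [s]; stability requires v_sed * dt <= dz
    Generic illustration, not the tailor-made scheme described above.
    """
    flux = rho * q * v_sed                            # downward mass flux out of each layer
    dq = -dt * flux / (rho * dz)                      # loss to the layer below
    dq[1:] += dt * flux[:-1] / (rho[1:] * dz[1:])     # gain from the layer above
    return q + dq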
Abstract:
The scope of my research project is to produce and characterize new crystalline forms of organic compounds, focusing on co-crystals and then transferring these notions to APIs in order to produce co-crystals of potential interest in the pharmaceutical field. In the first part of this work, co-crystallization experiments were performed using as building blocks the family of aliphatic dicarboxylic acids HOOC-(CH2)n-COOH, with n = 2-8. This class of compounds has long been an object of study because it is characterized by an interesting alternation of melting points: the acids with an even number of carbon atoms show a melting point higher than those with an odd number. The acids were co-crystallized with four dipyridyl molecules (formed by two pyridine rings with a different number of bridging carbon atoms) through the formation of intermolecular N•••(H)O interactions. The bases used were: 4,4’-bipyridine (BPY), 1,2-bis(4-pyridyl)ethane (BPA), 1,2-(di-4-pyridyl)ethylene (BPE) and 1,2-bis(4-pyridyl)propane (BPP). The co-crystals obtained by solution synthesis were characterized by different solid-state techniques to determine their structure and to see how the melting points change in the co-crystals. In the second part of this study we tried to obtain new crystal forms of compounds of pharmaceutical interest. The APIs studied are: O-desmethylvenlafaxine, Lidocaine, Nalidixic Acid and Sulfadiazine. Each API was subjected to Polymorph Screening and Salt/Co-crystal Screening experiments to identify new crystal forms characterized by different properties. In a typical Salt/Co-crystal Screening, the sample was made to react with a co-former (solid or liquid) through different methods: crystallization from solution, grinding, kneading and solid-gas reactions. The new crystal forms obtained were characterized by different solid-state techniques (single-crystal X-ray diffraction, X-ray powder diffraction, Differential Scanning Calorimetry, Thermogravimetric Analysis, Evolved Gas Analysis, FT-IR ATR, solid-state NMR).
Abstract:
The main scope of my PhD is the reconstruction of the large-scale bivalve phylogeny on the basis of four mitochondrial genes, with samples taken from all major groups of the class. To my knowledge, it is the first attempt of such breadth in Bivalvia. I decided to focus on both ribosomal and protein-coding DNA sequences (two ribosomal genes, 12s and 16s, and two protein-coding ones, cytochrome c oxidase I and cytochrome b), since both the literature and my preliminary results confirmed the importance of combined gene signals in improving evolutionary pathways of the group. Moreover, I wanted to propose a methodological pipeline that proved useful for obtaining robust results in bivalve phylogeny. Best-performing taxon sampling and alignment strategies were tested, and several data partitioning and molecular evolution models were analyzed, thus demonstrating the importance of shaping and implementing non-trivial evolutionary models. In line with a more rigorous approach to data analysis, I also proposed a new method to assess taxon sampling, building on Clarke and Warwick statistics: taxon sampling is a major concern in phylogenetic studies, and incomplete, biased, or improper taxon assemblies can lead to misleading results in reconstructing evolutionary trees. Theoretical methods are already available to optimize taxon choice in phylogenetic analyses, but most require some knowledge of the genetic relationships of the group of interest, or even a well-established phylogeny itself; these data are not always available in general phylogenetic applications. The method I proposed measures the "phylogenetic representativeness" of a given sample or set of samples and is based entirely on the pre-existing taxonomy of the ingroup, which is commonly known to investigators. Moreover, it also accounts for instability and discordance in taxonomies. A Python-based script suite, called PhyRe, has been developed to implement all analyses.
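For reference, the Clarke and Warwick statistic that such representativeness measures build on is the average taxonomic distinctness of a sample of s species (the notation below is the standard one, not taken from the thesis), where \(\omega_{ij}\) is the path length between species i and j through the reference taxonomy:
\[ \Delta^{+} = \frac{2\sum_{i<j}\omega_{ij}}{s\,(s-1)} \]
Comparing the \(\Delta^{+}\) of a proposed taxon sample with its expected distribution under random subsampling of the master taxonomy gives an indication of how representative that sample is.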
Abstract:
One of the ways in which the legal system has responded to different sets of problems is the blurring of the traditional boundaries of criminal law, both procedural and substantive. This study aims to explore under what conditions this trend leads to an improvement in society's welfare, focusing on two distinguishing sanctions in criminal law: incarceration and social stigma. In analyzing how incarceration affects an individual's incentive to violate a legal standard, we considered the crucial role of the time constraint. This aspect has not been fully explored in the law and economics literature, especially with respect to the analysis of the relative benefits of imposing either a fine or a prison term. We observed that when individuals are heterogeneous with respect to wealth and wage income, and when the level of activity can be considered a normal good, only the middle-wage and middle-income groups can be adequately deterred by a regime of fixed fines alone. The existing literature only considers the case of the very poor, deemed judgment proof. However, since imprisonment is a socially costly way to deprive individuals of their time, other alternatives may be sought, such as the imposition of discriminatory monetary fines, partial incapacitation and other alternative sanctions. According to traditional legal theory, the reason why criminal law is obeyed is not mainly the monetary sanctions but the stigma arising from the community’s moral condemnation that accompanies conviction or mere suspicion. However, it is not sufficiently clear whether social stigma always accompanies a criminal conviction. We addressed this issue by identifying the circumstances in which a criminal conviction carries an additional social stigma. Our results show that social stigma accompanies a conviction under the following conditions: first, when the law coincides with society's social norms; and second, when the prohibited act provides information on an unobservable attribute or trait of an individual that is crucial in establishing or maintaining social relationships beyond mere economic relationships. Thus, even if the social planner does not impose the social sanction directly, the impact of social stigma can still be influenced by the probability of conviction and the level of the monetary fine imposed, as well as by the varying degree of correlation between the legal standard violated and the social traits or attributes of the individual. In this respect, criminal law serves as an institution that facilitates cognitive efficiency in the process of imposing the social sanction, to the extent that the rest of society is boundedly rational and uses judgment heuristics. Paradoxically, using criminal law in order to invoke stigma for the violation of a legal standard may also serve to undermine its strength. To sum up, the results of our analysis reveal that the scope of criminal law is narrow both for the purposes of deterrence and for cognitive efficiency. While there are certain conditions under which the enforcement of criminal law may lead to an increase in social welfare, particularly with respect to incarceration and stigma, we have also identified the channels through which they could affect behavior. Since such mechanisms can be replicated in less costly ways, society should first seek to employ these legal institutions before turning to criminal law as a last resort.
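In the standard law-and-economics notation (my notation, not the thesis'), an individual who gains b from violating a standard, holds wealth w, and faces a fine f imposed with probability p is deterred by fines alone only if
\[ b \le p\,\min\{f, w\}, \]
which is why the judgment-proof poor (with w well below f) escape deterrence; the analysis summarized above adds the time constraint and heterogeneity in wage income, under which the highest groups as well as the poorest may fail to be adequately deterred by fixed fines alone.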
Abstract:
This thesis deals with distributed control strategies for the cooperative control of multi-robot systems. Specifically, distributed coordination strategies are presented for groups of mobile robots. The formation control problem is initially solved exploiting artificial potential fields. The purpose of the presented formation control algorithm is to drive a group of mobile robots to create a completely arbitrarily shaped formation. Robots are initially controlled to create a regular polygon formation. A bijective coordinate transformation is then exploited to extend the scope of this strategy and obtain arbitrarily shaped formations. For this purpose, artificial potential fields are specifically designed, and robots are driven to follow their negative gradient. Artificial potential fields are subsequently exploited to solve the coordinated path tracking problem, thus making the robots autonomously spread along predefined paths and move along them in a coordinated way. The formation control problem is then solved exploiting a consensus-based approach. Specifically, weighted graphs are used both to define the desired formation and to implement collision avoidance. As expected for consensus-based algorithms, this control strategy is experimentally shown to be robust to the presence of communication delays. The global connectivity maintenance issue is then considered. Specifically, an estimation procedure is introduced to allow each agent to compute, in a distributed manner, its own estimate of the algebraic connectivity of the communication graph. This estimate is then exploited to develop a gradient-based control strategy that ensures that the communication graph remains connected as the system evolves. The proposed control strategy is developed initially for single-integrator kinematic agents and is then extended to Lagrangian dynamical systems.
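As a minimal sketch of the textbook displacement-based consensus protocol that such formation strategies build on (a generic illustration; the laws developed in the thesis additionally handle collision avoidance and connectivity maintenance):

import numpy as np

def formation_consensus_step(x, delta, A, eps=0.05):
    """One discrete-time step of a displacement-based formation consensus law.

    x     : (n, 2) array of current robot positions
    delta : (n, 2) array of desired positions in the formation frame
    A     : (n, n) weighted adjacency matrix of the communication graph
    eps   : step size (small enough for stability)
    Each robot moves so that its offsets to its neighbours approach the desired
    ones; the formation is reached up to a common translation.
    """
    x_new = x.copy()
    for i in range(len(x)):
        for j in range(len(x)):
            x_new[i] += eps * A[i, j] * ((x[j] - x[i]) - (delta[j] - delta[i]))
    return x_new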
Abstract:
The present research aims to study special rights other than shares in Spanish law and the protection of their holders in cross-border mergers of limited liability companies within the European Union framework. Special rights other than shares are recognised as an independent legal category within the legal systems of some EU Member States, such as Germany or Spain, through the implementation of the Third Directive 78/855/EEC concerning mergers of public limited liability companies. The above-cited Directive contains a special regime of protection for the holders of securities, other than shares, to which special rights are attached, consisting of their being given rights in the acquiring company at least equivalent to those they possessed in the company being acquired. This safeguard highlights the intimate connection between this type of rights and the company whose extinction determines their existence. Pursuant to Directive 2005/56/EC on cross-border mergers of limited liability companies, each company taking part in these operations shall comply with the safeguards for members and third parties provided for in the national law to which it is subject. In this regard, the protection of holders of special rights other than shares shall be governed by the domestic M&A regime. As far as Spanish law is concerned, holders of these special rights are granted a right to merger information, in the same terms as shareholders, as well as equal rights in the company resulting from the cross-border merger. However, these measures do not sufficiently guarantee suitable protection; thus, considering the holders of special rights as special creditors, it will sometimes be necessary to resort to the general protection regime for creditors. In Spanish law, this would involve the recognition of a right to oppose the merger, whose exercise would prevent the operation from being completed until equal rights are ensured.
Abstract:
Over the last three decades, remote sensing and GIS have become increasingly important in the geosciences as a means of improving conventional methods of data collection and map production. The present work deals with the application of remote sensing and geographic information systems (GIS) to geomorphological investigations. The combination of both techniques has made it possible, above all, to capture geomorphological forms both in overview and in detail. Topographic and geological maps, satellite images and climate data are used as the basic data for this work. The thesis consists of six chapters. The first chapter gives a general overview of the study area, describing its morphological units, its climatic conditions (in particular the aridity indices of the coastal and mountain landscapes) and its settlement pattern. Chapter 2 deals with the regional geology and stratigraphy of the study area. An attempt is made to identify the main formations using ETM satellite images; the methods applied are colour band composites, image rationing and supervised classification. Chapter 3 contains a description of the structurally controlled surface forms in order to clarify the interaction between tectonics and geomorphological processes. Various methods, for example image processing, are used to interpret the lineaments present in the mountain body reliably, and special filtering methods are applied to map the most important lineaments. Chapter 4 presents an attempt to derive the drainage network automatically from processed SRTM satellite data. It is discussed in detail to what extent the quality of small-scale SRTM data is comparable with large-scale topographic maps in these processing steps. Furthermore, hydrological parameters are derived from a qualitative and quantitative analysis of the discharge regime of individual wadis, and the origin of the drainage systems is interpreted on the basis of geomorphological and geological evidence. Chapter 5 deals with the assessment of the hazard posed by episodic wadi floods. The probability of their annual occurrence, and of the occurrence of strong floods at intervals of several years, is traced back to 1921 in a historical analysis. The significance of rain-bearing low-pressure systems that develop over the Red Sea and can generate runoff is investigated using the IDW method (Inverse Distance Weighted), and further rain-bearing weather situations are examined with the help of Meteosat infrared images. The period 1990-1997, during which heavy rainfall triggered wadi floods, is examined in more detail. Flood events and flood levels are determined from hydrographic data (gauge measurements), and land use and settlement structure in the catchment of a wadi are also taken into account. Chapter 6 deals with the different coastal forms on the western side of the Red Sea, for example erosional forms, constructional forms and submerged forms. The final part deals with the stratigraphy and chronological classification of submarine terraces on coral reefs, as well as with a comparison with other such terraces on the Egyptian Red Sea coast west and east of the Sinai Peninsula.
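For reference, the Inverse Distance Weighted interpolation mentioned in Chapter 5 estimates the value at an unsampled point \(x_0\) from n observations \(z_i\) at distances \(d_i\), with a power parameter p (commonly p = 2):
\[ \hat{z}(x_0) = \frac{\sum_{i=1}^{n} d_i^{-p}\, z_i}{\sum_{i=1}^{n} d_i^{-p}} \]
The formula itself is standard and is given here only as a reminder of the method; the parameter choices used in the thesis are not specified in the abstract.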