533 results for Chirp sonar


Relevance:

10.00%

Publisher:

Abstract:

Snow height was measured by the Snow Depth Buoy 2013S8, an autonomous platform drifting on Antarctic sea ice, deployed during POLARSTERN cruise ANT-XXIX/6 (PS81). The resulting time series describes the evolution of snow height as a function of place and time between 2013-07-09 and 2014-01-05 at a sampling interval of 1 hour. The Snow Depth Buoy performs four independent sonar measurements representing the area (approx. 10 m**2) around the buoy. The buoy was installed on first-year ice. In addition to snow height, geographic position (GPS), barometric pressure, air temperature, and ice surface temperature were measured. Negative values of snow height occur if surface ablation continues into the sea ice; the measurements thus describe the position of the sea ice surface relative to the original snow-ice interface. Differences between the individual sensors indicate small-scale variability of the snowpack around the buoy. The data set has been processed, including the removal of obvious inconsistencies (missing values). Records without any snow height may still be used for sea ice drift analyses.
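
For readers who want to work with such buoy records, a minimal sketch follows, assuming the hourly data are available as a CSV file with hypothetical column names (datetime, snow_height_1 ... snow_height_4, which may differ from the actual file headers); it averages the four sonar readings and flags intervals where ablation has continued into the ice.

```python
# Minimal sketch: combining the four sonar readings of a Snow Depth Buoy
# record into a single snow-height series. Column names are hypothetical;
# the actual data files may use different headers.
import pandas as pd

def combine_sonars(csv_path):
    df = pd.read_csv(csv_path, parse_dates=["datetime"])
    sonar_cols = ["snow_height_1", "snow_height_2", "snow_height_3", "snow_height_4"]

    # Mean of the four independent sonars; their spread reflects the
    # small-scale variability of the snowpack around the buoy (~10 m**2).
    df["snow_height_mean"] = df[sonar_cols].mean(axis=1, skipna=True)
    df["snow_height_spread"] = df[sonar_cols].max(axis=1) - df[sonar_cols].min(axis=1)

    # Negative values mean surface ablation has continued into the sea ice,
    # i.e. the surface lies below the original snow-ice interface.
    df["ablation_into_ice"] = df["snow_height_mean"] < 0
    return df
```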

Relevance:

10.00%

Publisher:

Abstract:

Snow height was measured by the Snow Depth Buoy 2013S7, an autonomous platform drifting on Antarctic sea ice, deployed during POLARSTERN cruise ANT-XXIX/6 (PS81). The resulting time series describes the evolution of snow height as a function of place and time between 2013-07-06 and 2013-09-13 at a sampling interval of 1 hour. The Snow Depth Buoy performs four independent sonar measurements representing the area (approx. 10 m**2) around the buoy. The buoy was installed on first-year ice. In addition to snow height, geographic position (GPS), barometric pressure, air temperature, and ice surface temperature were measured. Negative values of snow height occur if surface ablation continues into the sea ice; the measurements thus describe the position of the sea ice surface relative to the original snow-ice interface. Differences between the individual sensors indicate small-scale variability of the snowpack around the buoy. The data set has been processed, including the removal of obvious inconsistencies (missing values). Records without any snow height may still be used for sea ice drift analyses.

Relevance:

10.00%

Publisher:

Abstract:

Snow height was measured by the Snow Depth Buoy 2014S15, an autonomous platform drifting on Arctic sea ice, deployed during POLARSTERN cruise ARK-XXVIII/4 (PS87). The resulting time series describes the evolution of snow depth as a function of place and time between 2014-08-29 and 2014-12-31 at a sampling interval of 1 hour. The Snow Depth Buoy performs four independent sonar measurements representing the area (approx. 10 m**2) around the buoy. The measurements describe the position of the sea ice surface relative to the original snow-ice interface. Differences between the individual sensors indicate small-scale variability of the snowpack around the buoy. The data set has been processed, including the removal of obvious inconsistencies (missing values). The buoy was installed on multi-year ice. In addition to snow depth, geographic position (GPS), barometric pressure, air temperature, and ice surface temperature were measured. Records without any snow depth may still be used for sea ice drift analyses. Note: this data set contains only relative changes in snow depth, because no initial readings of absolute snow depth are available.
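
Because this data set reports only relative changes, a common first step is to re-reference each sonar series to its first valid sample; the snippet below is a minimal, hypothetical illustration of that step (generic pandas series, no assumption about the actual column names), not part of the published processing.

```python
# Minimal sketch: express a snow-depth series as change since deployment by
# subtracting its first valid value, so the series starts at zero.
import pandas as pd

def to_relative(series: pd.Series) -> pd.Series:
    first_valid = series.loc[series.first_valid_index()]
    return series - first_valid
```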

Relevance:

10.00%

Publisher:

Abstract:

Snow height was measured by the Snow Depth Buoy 2014S17, an autonomous platform drifting on Antarctic sea ice, deployed during POLARSTERN cruise ANT-XXX/2 (PS89). The resulting time series describes the evolution of snow depth as a function of place and time between 2014-12-20 and 2015-02-01 at a sampling interval of 1 hour. The Snow Depth Buoy performs four independent sonar measurements representing the area (approx. 10 m**2) around the buoy. The buoy was installed on first-year ice. In addition to snow depth, geographic position (GPS), barometric pressure, air temperature, and ice surface temperature were measured. Negative values of snow depth occur if surface ablation continues into the sea ice; the measurements thus describe the position of the sea ice surface relative to the original snow-ice interface. Differences between the individual sensors indicate small-scale variability of the snowpack around the buoy. The data set has been processed, including the removal of obvious inconsistencies (missing values). Diurnal variations occur in this data set, although the sonic readings were compensated for temperature changes. Records without any snow depth may still be used for sea ice drift analyses.

Relevance:

10.00%

Publisher:

Abstract:

Snow height was measured by the Snow Depth Buoy 2014S24, an autonomous platform installed close to Neumayer III Base, Antarctica, during the Antarctic Fast Ice Network 2014 campaign (AFIN 2014). The resulting time series describes the evolution of snow depth as a function of place and time between 2014-03-07 and 2014-05-16 at a sampling interval of 1 hour. The Snow Depth Buoy performs four independent sonar measurements representing the area (approx. 10 m**2) around the buoy. The buoy was installed on the ice shelf. In addition to snow depth, geographic position (GPS), barometric pressure, air temperature, and ice surface temperature were measured. Negative values of snow depth occur if surface ablation continues into the sea ice; the measurements thus describe the position of the sea ice surface relative to the original snow-ice interface. Differences between the individual sensors indicate small-scale variability of the snowpack around the buoy. The data set has been processed, including the removal of obvious inconsistencies (missing values). Records without any snow depth may still be used for sea ice drift analyses. Note: this data set contains only relative changes in snow depth, because no initial readings of absolute snow depth are available.

Relevance:

10.00%

Publisher:

Abstract:

Because the metro network market is very cost sensitive, directly modulated schemes appear attractive. In this paper a CWDM (Coarse Wavelength Division Multiplexing) system is studied in detail by means of an optical communication system design software package; a detailed study of the modulating current shape (exponential, sine and Gaussian) for 2.5 Gb/s CWDM Metropolitan Area Networks is performed to evaluate its tolerance to linear impairments such as signal-to-noise-ratio degradation and dispersion. Point-to-point links are investigated and optimum design parameters are obtained. Extensive sets of simulation results show that some of these pulse shapes are more tolerant to dispersion than conventional Gaussian pulses. In order to achieve a low Bit Error Rate (BER), different types of optical transmitters are considered, including strongly adiabatic and transient chirp dominated Directly Modulated Lasers (DMLs). We have used fibers with different dispersion characteristics, showing that the system performance depends strongly on the chosen DML-fiber pair.
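
As a rough, self-contained illustration of why the drive-current shape matters (not a reproduction of the paper's simulation setup), the sketch below generates exponential, sine and Gaussian pulse shapes for a single 2.5 Gb/s bit slot and compares their RMS spectral widths; the pulse widths and sampling are assumptions for illustration only.

```python
# Illustrative sketch: three drive-current pulse shapes for one 2.5 Gb/s bit
# slot and their RMS spectral widths (a narrower drive spectrum is a rough
# proxy for lower transient chirp and better dispersion tolerance).
import numpy as np

BIT_RATE = 2.5e9          # 2.5 Gb/s
T_BIT = 1.0 / BIT_RATE    # 400 ps bit slot
N = 1024
t = np.linspace(0, T_BIT, N, endpoint=False)
tc = T_BIT / 2

pulses = {
    "gaussian":    np.exp(-0.5 * ((t - tc) / (T_BIT / 8)) ** 2),
    "sine":        np.sin(np.pi * t / T_BIT) ** 2,
    "exponential": np.exp(-np.abs(t - tc) / (T_BIT / 8)),
}

def rms_spectral_width(x, dt):
    spectrum = np.abs(np.fft.fftshift(np.fft.fft(x))) ** 2
    freqs = np.fft.fftshift(np.fft.fftfreq(len(x), dt))
    mean_f = np.sum(freqs * spectrum) / np.sum(spectrum)
    return np.sqrt(np.sum((freqs - mean_f) ** 2 * spectrum) / np.sum(spectrum))

dt = t[1] - t[0]
for name, p in pulses.items():
    print(f"{name:12s} RMS spectral width: {rms_spectral_width(p, dt) / 1e9:.2f} GHz")
```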

Relevance:

10.00%

Publisher:

Abstract:

A new method for measuring the linewidth enhancement factor (α-parameter) of semiconductor lasers is proposed and discussed. The method itself provides an estimate of the measurement error, thus self-validating the entire procedure. The α-parameter is obtained from the temporal profile and the instantaneous frequency (chirp) of the pulses generated by gain switching. The time-resolved chirp is measured with a polarization-based optical differentiator. The accuracy of the obtained values of the α-parameter is estimated from the comparison between the directly measured pulse spectrum and the spectrum reconstructed from the chirp and the temporal profile of the pulse. The method is applied to a VCSEL and to a DFB laser emitting around 1550 nm at different temperatures, obtaining a measurement error lower than ±8%.
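
The spectrum-reconstruction step used for the self-validation can be sketched as follows: from the measured power profile P(t) and chirp Δν(t), rebuild the complex field E(t) = sqrt(P(t))·exp(j·2π∫Δν dt) and Fourier-transform it. The snippet below uses synthetic arrays in place of real measurements and only illustrates this step, not the full α-extraction procedure of the paper.

```python
# Sketch of the spectrum-reconstruction cross-check: rebuild the complex
# optical field from the power profile P(t) and instantaneous frequency
# shift (chirp), then take its Fourier transform.
import numpy as np

def reconstruct_spectrum(t, power, chirp_hz):
    """power: P(t) in arbitrary units; chirp_hz: instantaneous frequency shift in Hz."""
    dt = t[1] - t[0]
    # Optical phase is the time integral of the instantaneous frequency shift.
    phase = 2 * np.pi * np.cumsum(chirp_hz) * dt
    field = np.sqrt(np.clip(power, 0.0, None)) * np.exp(1j * phase)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft(field))) ** 2
    freqs = np.fft.fftshift(np.fft.fftfreq(len(t), dt))
    return freqs, spectrum / spectrum.max()

# Synthetic example: a Gaussian pulse with a linear chirp across it
# (values purely illustrative).
t = np.linspace(-0.5e-9, 0.5e-9, 4096)
power = np.exp(-0.5 * (t / 50e-12) ** 2)
chirp = 10e9 * (t / 100e-12)
freqs, spec = reconstruct_spectrum(t, power, chirp)
```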

Relevance:

10.00%

Publisher:

Abstract:

Current bias estimation algorithms for air traffic control (ATC) surveillance are focused on radar sensors, but the integration of new sensors (especially automatic dependent surveillance-broadcast and wide area multilateration) demands the extension of traditional procedures. This study describes a generic architecture for bias estimation applicable to multisensor multitarget surveillance systems. It consists of first performing bias estimation, using the measurements of each target from a subset of sensors assumed to be reliable, to form track bias estimates. All track bias estimates are combined to obtain, for each of those sensors, the corresponding sensor bias. Sensor bias terms are then corrected, and the target-specific or sensor-target-pair-specific biases are subsequently calculated. Once these target-specific biases are corrected, the process is repeated recursively for other sets of less reliable sensors, assuming that bias-corrected measurements from previous iterations are unbiased. This study describes the architecture and outlines the methodology for both the estimation and the bias estimation design processes. The approach is then validated through simulation and compared with previous methods in the literature. Finally, the study describes the application of the methodology to the design of bias estimation procedures for a modern ATC surveillance application, specifically the off-line assessment of ATC surveillance performance.
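
A toy, one-dimensional sketch of the iterative idea (additive sensor biases, plain averaging, invented data structures; not the estimator actually developed in the study) could look like this:

```python
# Toy sketch: additive 1-D measurement biases, estimated sensor by sensor
# starting from the sensors assumed reliable, then refined for the remaining
# tiers after correcting the already-estimated sensors.
import numpy as np

def estimate_sensor_biases(measurements, reference_tracks, sensor_tiers):
    """
    measurements:     dict (sensor, target) -> array of 1-D measurements
    reference_tracks: dict target -> array of fused/reference positions
    sensor_tiers:     list of lists of sensors, most reliable tier first
    Returns dict sensor -> estimated additive bias.
    """
    biases = {}
    corrected = dict(measurements)
    for tier in sensor_tiers:
        for sensor in tier:
            # Track-level bias estimates: mean residual of this sensor
            # against the reference track of each target it observes.
            track_biases = [
                np.mean(corrected[(s, tgt)] - reference_tracks[tgt])
                for (s, tgt) in corrected if s == sensor
            ]
            if not track_biases:
                continue
            # Sensor bias = combination (here a plain mean) of track biases.
            biases[sensor] = float(np.mean(track_biases))
            # Correct this sensor's measurements before the next, less
            # reliable tier is processed (treated as unbiased from now on).
            for key in list(corrected):
                if key[0] == sensor:
                    corrected[key] = corrected[key] - biases[sensor]
    return biases
```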

Relevance:

10.00%

Publisher:

Abstract:

The interpolation of point data by means of software tools is a technique of some relevance in hydrogeology in general and in the study of wetlands in particular. Our aim has been to determine the 3-D geometry of the deepest wetlands of the Rabasa Lakes. To estimate the topography of the lake bed, field data were acquired with sonar and GPS equipment; a total of 335 points were measured both on the perimeter and on the lake bed. In a second stage, this information was fed into a kriging program to obtain the bathymetry of the wetland. This methodology proves to be one of the most reliable and cost-efficient for the 3-D analysis of this type of water body. The bathymetric study of the zone allows us to characterize the mid- and long-term hydrological evolution of the lakes by means of depth-area-volume curves.
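
Once the kriged bathymetric grid is available, the depth-area-volume curves mentioned at the end can be computed directly from it; the sketch below assumes a regular grid of bed elevations with a known cell size (assumed names and units) and only illustrates that final step.

```python
# Minimal sketch: depth-area-volume curves from a gridded bathymetry.
# 'bed' is a 2-D array of bed elevations (m) on a regular grid with cell
# size 'cell' (m); cells outside the lake can be set to np.nan.
import numpy as np

def depth_area_volume(bed, cell, levels):
    """Return arrays (area_m2, volume_m3) for each water level in 'levels'."""
    cell_area = cell * cell
    areas, volumes = [], []
    for h in levels:
        depth = h - bed                      # water depth at each cell
        wet = np.nan_to_num(depth, nan=0.0)  # dry / outside cells contribute 0
        wet = np.clip(wet, 0.0, None)
        areas.append(np.count_nonzero(wet > 0) * cell_area)
        volumes.append(wet.sum() * cell_area)
    return np.asarray(areas), np.asarray(volumes)

# Example usage (bed = grid interpolated from the sonar/GPS points by kriging):
# areas, volumes = depth_area_volume(bed, cell=1.0,
#                                    levels=np.arange(np.nanmin(bed), np.nanmax(bed), 0.1))
```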

Relevance:

10.00%

Publisher:

Abstract:

In this work, educational software for an intuitive understanding of the basic dynamic processes of semiconductor lasers is presented. The proposed tool is addressed to students of optical communication courses, encouraging self-consolidation of the subjects learned in lectures. The semiconductor laser model is based on the well-known rate equations for the carrier density, the photon density and the optical phase. Direct modulation of the laser is considered, with input parameters that can be selected by the user, including the waveform, amplitude and frequency of the modulation current and the bias point. Simulation results are plotted for carrier density and output power versus time. Instantaneous frequency variations of the laser output are numerically shifted to the audible frequency range and sent to the computer loudspeakers. This results in an intuitive description of the “chirp” phenomenon due to amplitude-phase coupling, typical of directly modulated semiconductor lasers. In this way, the student can actually listen to the time-resolved spectral content of the laser output. By changing the laser parameters and/or the modulation parameters, the consequent variation of the laser output can be appreciated in an intuitive manner. The proposed educational tool had previously been implemented by the same authors as locally executable software. In the present manuscript, we extend our previous work to a web-based platform, offering improved distribution and allowing its use by the wide audience of the web.
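
For reference, a compact sketch of the underlying single-mode rate-equation model is given below, using typical textbook parameter values rather than the tool's actual settings; it integrates the carrier and photon densities under a square-wave modulation current and evaluates the chirp from the phase equation.

```python
# Compact sketch of the single-mode laser rate equations with illustrative
# textbook parameter values (not the web tool's exact parameters).
import numpy as np
from scipy.integrate import solve_ivp

q = 1.602e-19
# Illustrative laser parameters (SI units)
V, Gamma, g0 = 1e-16, 0.3, 3e-12           # active volume, confinement, gain coeff.
Nt, eps = 1e24, 1e-23                      # transparency density, gain compression
tau_n, tau_p, beta, alpha = 2e-9, 2e-12, 1e-4, 4.0

def current(t):
    # 1 GHz square-wave modulation between 20 mA and 35 mA (illustrative).
    return 0.020 + 0.015 * (np.sin(2 * np.pi * 1e9 * t) > 0)

def rate_eqs(t, y):
    N, S = y
    gain = g0 * (N - Nt) / (1 + eps * S)
    dN = current(t) / (q * V) - N / tau_n - gain * S
    dS = Gamma * gain * S - S / tau_p + Gamma * beta * N / tau_n
    return [dN, dS]

sol = solve_ivp(rate_eqs, (0, 5e-9), [Nt, 1e18], method="LSODA", max_step=1e-12)
N, S = sol.y
# Chirp (instantaneous frequency shift) from the phase equation
# d(phi)/dt = (alpha/2) * [Gamma*g0*(N - Nt) - 1/tau_p].
chirp_hz = (alpha / (4 * np.pi)) * (Gamma * g0 * (N - Nt) - 1 / tau_p)
```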

Relevance:

10.00%

Publisher:

Abstract:

Foliage Penetration (FOPEN) radar systems were introduced in 1960, and have been constantly improved by several organizations since that time. The use of Synthetic Aperture Radar (SAR) approaches for this application has important advantages, due to the need for high resolution in two dimensions. The design of this type of system, however, includes some complications that are not present in standard SAR systems. FOPEN SAR systems need to operate with a low central frequency (VHF or UHF bands) in order to penetrate the foliage. High bandwidth is also required to obtain high resolution. Due to the low central frequency, large integration angles are required during SAR image formation, and therefore the Range Migration Algorithm (RMA) is used. This thesis identifies the three main complications that arise from these requirements. First, a high fractional bandwidth makes narrowband propagation models no longer valid. Second, the VHF and UHF bands are used by many communications systems; the transmitted signal spectrum needs to be notched to avoid interfering with them. Third, those communications systems cause Radio Frequency Interference (RFI) on the received signal. The thesis carries out a thorough analysis of the three problems, their degrading effects and possible solutions to compensate them. The ultra-wideband (UWB) model is applied to the SAR signal, and the degradation induced by it is derived. The result is tested through simulation of both a single-pulse stretch processor and the complete RMA image formation. Both methods show that the degradation is negligible, and therefore the UWB propagation effect does not need compensation. A technique is derived to design a notched transmitted signal, and its effect on the SAR image formation is then evaluated analytically. It is shown that the stretch processor introduces a processing gain that reduces the degrading effects of the notches. The remaining degradation after processing gain is assessed through simulation, and an experimental graph of degradation as a function of the percentage of nulled frequencies is obtained. The RFI is characterized and its effect on the SAR processor is derived. Once again, a processing gain is found to be introduced by the receiver. As the RFI power can be much higher than that of the desired signal, an algorithm is proposed to remove the RFI from the received signal before RMA processing. This algorithm is a modification of the Chirp Least Squares Algorithm (CLSA) explained in [4], which adapts it to deramped signals. The algorithm is derived analytically and its performance is then evaluated through simulation, showing that it is effective in removing the RFI and reducing the degradation caused by both RFI and notching. Finally, conclusions are drawn as to the importance of each of the problems in SAR system design.
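
As background for the stretch-processor discussion, the following self-contained sketch shows standard deramp ("stretch") processing of an LFM pulse for a single point target; the sampling rate, bandwidth and delay are illustrative and do not correspond to any particular FOPEN system.

```python
# Sketch of deramp ("stretch") processing: the echo of a point target, mixed
# with the conjugate of the reference chirp, becomes a constant beat frequency
# proportional to its range offset (f_beat = K * tau).
import numpy as np

fs  = 200e6          # sampling rate (illustrative)
Tp  = 10e-6          # pulse length
B   = 80e6           # chirp bandwidth (baseband model, illustrative)
K   = B / Tp         # chirp rate
tau = 2e-6           # target delay within the receive window
c   = 3e8

t = np.arange(0, Tp, 1 / fs)
reference = np.exp(1j * np.pi * K * t ** 2)
echo = np.exp(1j * np.pi * K * (t - tau) ** 2)     # point-target return

beat = echo * np.conj(reference)                   # deramp
spectrum = np.abs(np.fft.fft(beat * np.hanning(len(beat))))
freqs = np.fft.fftfreq(len(beat), 1 / fs)

f_beat = abs(freqs[np.argmax(spectrum)])
print(f"beat frequency ~ {f_beat/1e6:.1f} MHz, "
      f"range offset ~ {c * f_beat / (2 * K):.0f} m")
```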

Relevance:

10.00%

Publisher:

Abstract:

This final degree project, described in detail in this report, attempts to put into practice much of the knowledge I have acquired during my studies by applying it to a real project. A platform has been developed that is capable of hosting ideas written by people all over the world who want to share them with others, so that the ideas can be commented on, rated and collectively improved. Since these ideas can belong to any field, they can be classified into the categories that best fit them. The application offers a very descriptive RESTful API in which each resource has been identified and structured so that all elements can be managed easily through the HTTP verbs, regardless of the client using it. The architecture follows the model-view-controller design pattern and uses current technologies such as Spring, Liferay, SmartGWT and MongoDB (among many others), with the aim of creating a secure, scalable and modular application, which required integrating all of these frameworks. The application data are persisted in two kinds of databases, a relational one (MySQL) and a non-relational one (MongoDB), taking full advantage of the features each of them offers. The proposed client is accessible through a web browser and is based on the Liferay portal. Several "Portlets" or "Widgets" have been developed that make up the content structure seen by the end user. Through them, the application content (ideas, comments and other social content) can be accessed in a user-friendly way, since these Portlets communicate with each other and make asynchronous requests to the RESTful API without reloading the whole page. In addition, users can register in the system to contribute more content or to obtain roles that grant them permission to perform administration actions. A Scrum methodology has been followed, dividing the project into small tasks and developing them in an agile way. Tools such as Jenkins have supported continuous integration, ensuring through the execution of the test suite that all components work. Quality has been a central aspect of the project: software methodologies and design patterns have been followed to guarantee a reusable, optimized and modular design, a goal supported by the Sonar tool. In addition, a comprehensive test suite covering all the components of the application has been implemented. In short, an innovative open-source application has been designed that establishes a well-defined foundation so that, if it is someday put into production, it will help people share thoughts and ideas and thereby improve the world we live in.
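
As an illustration of the "resources managed through HTTP verbs" idea, a hypothetical client-side sketch follows; the base URL, endpoint paths and JSON fields are invented for this example and are not the project's actual API.

```python
# Hypothetical usage sketch of a RESTful resource managed through HTTP verbs.
# Endpoint paths, fields and the base URL are invented for illustration.
import requests

BASE = "http://localhost:8080/api"   # hypothetical base URL

# POST: create a new idea (resource)
idea = requests.post(f"{BASE}/ideas",
                     json={"title": "Solar benches", "category": "urbanism"}).json()

# GET: read it back; PUT: update it; DELETE: remove it
requests.get(f"{BASE}/ideas/{idea['id']}")
requests.put(f"{BASE}/ideas/{idea['id']}", json={"title": "Solar benches v2"})
requests.delete(f"{BASE}/ideas/{idea['id']}")
```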