648 results for Broadband
Abstract:
Antenna design is an iterative process in which structures are analyzed and modified until they comply with required performance parameters. The classic approach starts by analyzing a "known" structure, obtaining the value of its performance parameter and changing the structure until the "target" value is achieved. This process relies on having an initial structure that follows patterns already familiar to the designer. The purpose of this research was to develop a method for designing UWB antennas. What is new in this proposal is that the design process is reversed: the designer starts with the target performance parameter and obtains a structure as the result of the design process. This method provided a new way to replicate and optimize existing performance parameters. The basis of the method was a Genetic Algorithm (GA) adapted to the format of the chromosome evaluated by the electromagnetic (EM) solver. For the electromagnetic study we used the XFDTD™ program, based on the Finite-Difference Time-Domain technique. The programming portion of the method was created in the MATLAB environment, which serves as the interface for converting chromosomes and file formats and for transferring data between XFDTD™ and the GA. A high level of customization had to be written into the code to work with the specific files generated by the XFDTD™ program. Two types of cost functions were evaluated: the first seeking broadband performance within the UWB band, and the second searching for curve replication of a reference geometry. The performance of the method was evaluated considering the speed provided by the computer resources used. A balance between accuracy, data file size and speed of execution was achieved by tuning parameters in the GA code as well as the internal parameters of the XFDTD™ projects.
The results showed that the GA produced geometries that were analyzed by the XFDTD™ program and changed following the search criteria until reaching the target value of the cost function. Results also showed how the parameters can change the search criteria and influence the running of the code to provide a variety of geometries.
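The reversed design loop described above (the GA proposes geometries, the EM solver scores them, and evolution continues until the target cost is reached) can be sketched as follows. The bitmap chromosome, population sizes, and the surrogate cost function standing in for the XFDTD™ evaluation are illustrative assumptions, not the thesis's actual code or file interface.

```python
import random

random.seed(42)

CHROM_LEN = 64          # bits of a pixelated patch geometry (illustrative)
POP_SIZE = 20
TARGET_COST = 0.05      # stop once the cost function reaches the target

def cost(chrom):
    # Stand-in for the EM solver call: in the thesis each chromosome is
    # converted to an XFDTD project, simulated, and scored against the
    # target performance parameter. Here a toy surrogate rewards set bits.
    return 1.0 - sum(chrom) / len(chrom)

def evolve(generations=200):
    pop = [[random.randint(0, 1) for _ in range(CHROM_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=cost)                       # lower cost = better
        if cost(pop[0]) <= TARGET_COST:
            break
        elite = pop[:POP_SIZE // 2]              # truncation selection
        children = []
        while len(elite) + len(children) < POP_SIZE:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, CHROM_LEN)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(CHROM_LEN)       # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = elite + children
    return pop[0], cost(pop[0])

best, best_cost = evolve()
```

Because the elite is carried over unchanged, the best cost never worsens, mirroring how the thesis's search keeps changing geometries only until the target value of the cost function is reached.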
Abstract:
With the increase in internet traffic, there is a greater demand for wireless mobile and ubiquitous applications. These applications need antennas that are not only broadband but can also work in different frequency bands. Even though demand for such applications is growing, it remains imperative to conserve power; thus, there is a need to design multi-broadband antennas with low power consumption. Reconfigurable antennas can work in different frequency bands as well as conserve power, but current designs work only in one band, so reconfigurable antennas that operate across multiple bands are needed. In this era of high power consumption there is also a greater demand for wireless powering. This dissertation explores designs of reconfigurable antennas that can improve performance and enable wireless powering, and presents lab results for the multi-broadband reconfigurable antenna that was created. A detailed mathematical analysis, as well as extensive simulation results, is also presented. The novel reconfigurable antenna designs can be extended to Multiple Input Multiple Output (MIMO) environments and military applications.
Abstract:
A wireless mesh network is a mesh network implemented over a wireless system such as a wireless LAN. Wireless Mesh Networks (WMNs) are promising for numerous applications such as broadband home networking, enterprise networking, transportation systems, health and medical systems, and security surveillance systems, and have therefore received considerable attention from both industrial and academic researchers. This dissertation explores schemes for resource management and optimization in WMNs by means of network routing and network coding. We propose three optimization schemes. (1) First, a triple-tier optimization scheme is proposed for the load-balancing objective. The first tier achieves long-term routing optimization; the second tier, using the optimization results obtained from the first, performs short-term adaptation to deal with the impact of dynamic channel conditions. A greedy sub-channel allocation algorithm is developed as the third tier to further reduce the congestion level in the network. We conduct thorough theoretical analysis to show the correctness of our design and establish the properties of our scheme. (2) Then, a Relay-Aided Network Coding scheme called RANC is proposed to improve the performance gain of network coding by exploiting the physical-layer multi-rate capability of WMNs. We conduct rigorous analysis to find the design principles and study the tradeoffs in the performance gain of RANC. Based on the analytical results, we provide a practical solution by decomposing the original design problem into two sub-problems, a flow partition problem and a scheduling problem. (3) Lastly, a joint optimization scheme combining routing in the network layer with network coding-aware scheduling in the MAC layer is introduced. We formulate the network optimization problem and exploit its structure via dual decomposition.
We find that the original problem decomposes into a routing problem in the network layer and a scheduling problem in the MAC layer, coupled through the link capacities. We solve the routing problem with two different adaptive routing algorithms and then provide a distributed coding-aware scheduling algorithm. Experimental results show that the proposed schemes can significantly improve network performance.
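The coupling through link capacities can be illustrated with a toy dual decomposition: a price (Lagrange multiplier) on the capacity constraint separates a one-flow routing sub-problem from a MAC scheduling sub-problem, and a subgradient update drives the price to the value at which the two agree. The log utility, rate set, and step size below are illustrative assumptions, not the dissertation's formulation.

```python
# One elastic flow with utility log(x) is routed over a link whose capacity
# is chosen by the MAC scheduler from a schedulable rate set. The price
# `lam` on the constraint x <= c decouples the two sub-problems.
RATES = [1.0, 3.0]        # schedulable link rates (assumed)
lam, step = 1.0, 0.1      # link price and subgradient step size

for _ in range(300):
    # Network-layer sub-problem: maximize log(x) - lam * x  ->  x = 1 / lam
    x = 1.0 / lam
    # MAC-layer sub-problem: maximize lam * c over the schedulable set
    c = max(RATES, key=lambda r: lam * r)
    # Subgradient price update on the coupling constraint x <= c
    lam = max(1e-6, lam + step * (x - c))
```

At convergence the routed flow fills the best schedulable rate (x = 3 and lam = 1/3 here), which is exactly the role the link capacities play in coupling the two layers.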
Abstract:
Increased broadband penetration (BP) rates around the world have encouraged web designers to include more content and additional functions on their web sites, thereby enhancing the richness and playfulness of the information. However, it is often very difficult for web surfers still using narrowband connections to access such sites. Many university web sites target international audiences; their download performance should therefore be considered, as it may directly influence the user experience. This exploratory study examined 331 university hospitality and tourism department web sites in 37 countries. The empirical results showed that entry web pages of universities in Asia, with a medium BP rate (mid-BP), have the slowest download speeds, while those in Australia and New Zealand perform the best. The adoption rate of Cascading Style Sheets (CSS) in Asia is relatively lower than in other regions.
Abstract:
Today, smart phones have pushed the wireless communication industry into an era of mobile data. To cater for the ever-increasing data traffic demand, more spectrum resources are of utmost importance, and sharing under-utilized spectrum bands is an effective way to obtain them. In particular, the 4G broadband Long Term Evolution (LTE) technology and its foreseen 5G successor will benefit immensely if their operation can be extended to the under-utilized unlicensed spectrum. In this thesis, we first analyze WiFi 802.11n and LTE coexistence performance in the unlicensed spectrum, considering multi-layer cell layouts through system-level simulations. We consider a time division duplexing (TDD)-LTE system with an FTP traffic model for performance evaluation. Simulation results show that WiFi performance is more vulnerable to LTE interference, while LTE performance is degraded only slightly. Based on these initial findings, we propose a Q-Learning based dynamic duty cycle selection technique for configuring LTE transmission gaps, so that a satisfactory throughput is maintained for both the LTE and WiFi systems. Simulation results show that the proposed approach can enhance overall capacity by 19% and WiFi capacity by 77%, hence enabling effective coexistence of LTE and WiFi systems in the unlicensed band.
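The duty-cycle selection idea can be sketched as a stateless tabular Q-learner that picks the LTE ON fraction and leaves the rest of the frame as a transmission gap for WiFi. The candidate duty cycles, the learning parameters, and the closed-form reward standing in for the system-level simulator are all illustrative assumptions, not the thesis's actual setup.

```python
import random

random.seed(0)

DUTY_CYCLES = [0.2, 0.4, 0.6, 0.8]   # candidate LTE ON fractions (assumed)
ALPHA, EPS = 0.1, 0.1                # learning rate, exploration probability
q = {d: 0.0 for d in DUTY_CYCLES}    # one Q-value per duty cycle (stateless)

def reward(duty):
    # Surrogate for the simulator: total utility rises with the LTE share
    # but WiFi throughput collapses if too few transmission gaps remain.
    return duty + (1.0 - duty) ** 0.5

for _ in range(5000):
    if random.random() < EPS:
        d = random.choice(DUTY_CYCLES)   # explore a random duty cycle
    else:
        d = max(q, key=q.get)            # exploit the best estimate so far
    q[d] += ALPHA * (reward(d) - q[d])   # tabular Q-value update

best_duty = max(q, key=q.get)
```

With this surrogate the learner settles on the duty cycle that balances the LTE share against the WiFi penalty, which is the same satisfactory-throughput trade-off the proposed technique targets.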
Abstract:
Combustion-generated carbon black nanoparticles, or soot, have both positive and negative effects depending on the application. On the positive side, soot is used as a reinforcing agent in tires, as black pigment in inks, and in surface coatings. On the negative side, it degrades the performance and durability of many combustion systems, is a major contributor to global warming, and is linked to respiratory illness and cancer. Laser-Induced Incandescence (LII) was used in this study to measure soot volume fractions in four steady and twenty-eight pulsed ethylene diffusion flames burning at atmospheric pressure. A laminar coflow diffusion burner combined with a very-high-speed solenoid valve and control circuit provided unsteady flows by forcing the fuel flow at frequencies between 10 Hz and 200 Hz. Periodic flame oscillations were captured by two-dimensional phase-locked LII images and broadband luminosity images for eight phases (0°-360°) covering each period. A comparison between the steady and pulsed flames and the effect of the pulsation frequency on soot volume fraction in the flame region and the post-flame region are presented. The most significant effect of pulsing frequency was observed at 10 Hz. At this frequency, the flame with the lowest mean flow rate showed a 1.77-fold enhancement in peak soot volume fraction and a 1.2-fold enhancement in total soot volume fraction, whereas the flame with the highest mean flow rate showed no significant change in peak soot volume fraction and a 1.4-fold reduction in total soot volume fraction. A correlation (fv·Re⁻¹ = a + b·Str) for the total soot volume fraction in the flame region of the unsteady laminar ethylene flames was obtained for pulsation frequencies between 10 Hz and 200 Hz and Reynolds numbers between 37 and 55. The soot primary particle size in steady and unsteady flames was measured using Time-Resolved Laser-Induced Incandescence (TiRe-LII) and the double-exponential fit method.
At the maximum frequency (200 Hz), the soot particles were 15% smaller than in the steady case for the flame with the highest mean flow rate.
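The reported correlation has the form fv·Re⁻¹ = a + b·Str, where Re is the flow Reynolds number and Str = f·d/U is the Strouhal number of the fuel-flow pulsation. A minimal sketch of applying it is below; the coefficients a and b and the example flow values are placeholders, not the fitted values, which the abstract does not give.

```python
# Placeholder coefficients, NOT the fit reported in the study
a, b = 0.5, 2.0

def strouhal(freq_hz, d_m, u_ms):
    # Str = f * d / U for pulsation frequency f, nozzle diameter d, mean speed U
    return freq_hz * d_m / u_ms

def total_soot_fv(freq_hz, d_m, u_ms, reynolds):
    # Invert fv * Re**-1 = a + b * Str for the total soot volume fraction fv
    return reynolds * (a + b * strouhal(freq_hz, d_m, u_ms))

# e.g. a 10 Hz pulsation of a 4 mm nozzle at 0.5 m/s mean speed, Re = 40
fv = total_soot_fv(10.0, 0.004, 0.5, 40.0)
```

The stated validity range (10-200 Hz, Re between 37 and 55) bounds where such an evaluation would be meaningful.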
Abstract:
The mantle transition zone is defined by two seismic discontinuities, nominally at 410 and 660 km depth, which result from phase transformations in the mineral olivine. The topography of these discontinuities provides information about lateral temperature changes in the transition zone. In this work, P-to-S conversions from teleseismic events recorded at 32 broadband stations in the Borborema Province were used to determine the transition zone thickness beneath this region and to investigate whether there are lateral temperature changes within this depth range. For this analysis, stacking and migration of receiver functions were performed. In the Borborema Province, geophysical studies have revealed a geoid anomaly which could reflect the presence of a thermal anomaly related to the origin of the intraplate volcanism and uplift that marked the evolution of the Province in the Cenozoic. Several models have been proposed to explain these phenomena, including those invoking a deep-seated mantle plume and those invoking shallower sources, such as small-scale convection cells. The results of this work show that no thermal anomalies are present at transition zone depths, as significant variations in transition zone thickness were not observed. However, regions of depressed topography for both discontinuities (410 and 660 km) that approximately overlap in space were identified, suggesting that lower-than-average seismic velocities above 410 km depth may exist below the Borborema Province. This is consistent with the presence of a thermally induced, low-density body independently inferred from analysis of geoid anomalies. Therefore, the magma source responsible for the Cenozoic intraplate volcanism and related uplift in the Province is likely confined above the upper mantle transition zone.
Abstract:
This work aims to understand how cloud computing enters the government IT decision agenda, in light of the multiple streams model, considering the current status of public IT policies, the dynamics of agenda setting for the area, the interfaces between the various institutions, and existing initiatives on the use of cloud computing in government. A qualitative study was therefore conducted through interviews with two groups, one of policy makers and one of IT managers. Content analysis and document analysis were used as analysis techniques, with some results presented as word clouds. The main results point to overregulation of the area, usually scattered across various agencies of the federal government, which hinders the work of managers. A lack of knowledge of standards, government programs, regulations and guidelines was identified, notably a poor understanding of the TI Maior Program, the perceived ineffectiveness of the National Broadband Plan, and the influence of the Marco Civil da Internet (Brazil's internet civil framework) as an element that can hold back advances in the use of cloud computing in the Brazilian government. Also noteworthy is the bureaucratization of the procurement of IT goods and services, which in many cases limits technological advances. Regarding the influence of actors, it was not possible to identify a policy entrepreneur, and a lack of political force was noticed. The political stream was affected only by changes within the government. Fragmentation was a major factor weakening the formation of an agenda around the theme. Information security was pointed out by respondents as the main limitation, coupled with the lack of training of public servants. In terms of benefits, resource savings stand out, followed by improved efficiency.
Finally, the discussion of cloud computing needs to advance within the public sphere; international experience is already far ahead, framing cloud computing as an element responsible for improving processes and services and for saving public resources.
Abstract:
Ambient seismic noise has traditionally been considered an unwanted perturbation in seismic data acquisition that "contaminates" the clean recording of earthquakes. Over the last decade, however, it has been demonstrated that consistent information about subsurface structure can be extracted from cross-correlation of ambient seismic noise. In this context, the roles are reversed: the ambient seismic noise becomes the desired seismic signal, while earthquakes become the unwanted perturbation that needs to be removed. At periods below 30 s, the spectrum of ambient seismic noise is dominated by microseism, which originates from distant atmospheric perturbations over the oceans. The microseism is the most continuous seismic signal and can be classified as primary (observed in the range 10-20 s) or secondary (observed in the range 5-10 s). The Green's function of the propagating medium between two receivers (seismic stations) can be reconstructed by cross-correlating seismic noise simultaneously recorded at the receivers. The reconstructed Green's function is generally proportional to the surface-wave portion of the seismic wavefield, as microseismic energy travels mostly as surface waves. In this work, 194 Green's functions obtained by stacking one month of daily cross-correlations of ambient seismic noise recorded on the vertical component of several pairs of broadband seismic stations in Northeast Brazil are presented. The daily cross-correlations were stacked using a time-frequency, phase-weighted scheme that enhances weak coherent signals by reducing incoherent noise. The cross-correlations show that, as expected, the emerging signal is dominated by Rayleigh waves, with dispersion velocities reliably measured for periods ranging between 5 and 20 s.
Both permanent stations from a monitoring seismic network and temporary stations from past passive experiments in the region are considered, resulting in a combined network of 33 stations separated by distances between approximately 60 and 1311 km. The Rayleigh-wave dispersion velocity measurements are then used to develop tomographic images of group velocity variation for the Borborema Province of Northeast Brazil. The tomographic maps satisfactorily image buried structural features in the region. At short periods (~5 s) the images reflect shallow crustal structure, clearly delineating intra-continental and marginal sedimentary basins, as well as portions of important shear zones traversing the Borborema Province. At longer periods (10-20 s) the images are sensitive to deeper structure in the upper crust, and most of the shallower anomalies fade away. Interestingly, some of them do persist. The deep anomalies correlate neither with the location of the Cenozoic volcanism and uplift that marked the evolution of the Borborema Province nor with available maps of surface heat flow, and their origin remains enigmatic.
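The core idea above, that cross-correlating diffuse noise recorded at two stations retrieves the inter-station travel time (the surface-wave part of the Green's function), can be demonstrated with a minimal numpy sketch. The single-delay "medium" and the sample values are illustrative assumptions; real processing adds normalization, whitening, and the phase-weighted stacking mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n, delay = 4096, 37        # record length; inter-station travel time in samples
field = rng.standard_normal(n + delay)   # one realization of the noise field
sta_a = field[delay:]      # station A record
sta_b = field[:n]          # station B records the same field `delay` samples later

# The cross-correlation peak sits at the travel time between the two stations
xcorr = np.correlate(sta_b, sta_a, mode="full")
lag = int(np.argmax(xcorr)) - (n - 1)    # recovered travel time: equals `delay`
```

Stacking many such daily correlations, as done for the 194 station pairs, raises the coherent peak above the incoherent background.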
Abstract:
The Borborema Province, located in northeastern Brazil, has a basement of Precambrian age and a tectonic framework structured during the Neoproterozoic (740-560 Ma). After the separation of South America and Africa during the Mesozoic, a rift system was formed, giving rise to a number of marginal and inland basins in the Province. After continental breakup, episodes of volcanism and uplift characterized the evolution of the Province. Plateau uplift was initially related to magmatic underplating of mafic material at the base of the crust, perhaps related to the generation of young continental plugs (45-7 Ma) along the Macau-Queimadas Alignment (MQA) due to small-scale convection at the continental edge. The goal of this study is to investigate the causes of intra-plate uplift and its relationship to MQA volcanism, using broadband seismology and integrating our results with independent geophysical and geological studies of the Borborema Province. The investigation of the deep structure of the Province with broadband seismic data includes receiver functions and surface-wave dispersion tomography. Both are methods that use teleseismic events and allow estimates of crustal parameters such as crustal thickness, Vp/Vs ratio, and S-velocity structure. The seismograms used for the receiver function work were obtained from 52 stations in Northeast Brazil: 16 broadband stations from the RSISNE network (Rede Sismográfica do Nordeste do Brasil), and 21 short-period and 6 broadband stations from the INCT-ET network (Instituto Nacional de Ciência e Tecnologia - Estudos Tectônicos).
These results add significantly to previous datasets collected at individual stations in the Province, which include station RCBR (GSN - Global Seismic Network), stations CAUB and AGBL (Brazilian Lithosphere Seismic Project IAG/USP), and 6 other broadband stations that were part of the Projeto Milênio - Estudos geofísicos e tectônicos na Província Borborema/CNPq. For the surface-wave tomography, seismograms recorded at 22 broadband stations were utilized: 16 broadband stations from the RSISNE network and 6 broadband stations from the Milênio project. The new constraints developed in this work include: (i) estimates of crustal thickness and bulk Vp/Vs ratio for each station using receiver functions; (ii) new measurements of surface-wave group velocity, integrated with existing measurements from a continental-scale tomography for South America; and (iii) 1D S-wave velocity models at various locations in the Borborema Province, developed through the simultaneous inversion of receiver functions and surface-wave dispersion velocities. The results display S-wave velocity structure down to the base of the crust that is consistent with the presence of a 5-7.5 km thick mafic layer. The mafic layer was observed only in the southern portion of the Plateau and is absent in its northern portion. Another important observation is that our models divide the plateau into a region of thin crust (northern Plateau) and a region of thick crust (southern Plateau), confirming results from independent refraction surveys and receiver function analyses. Existing models of plateau uplift, nonetheless, cannot explain all the new observations. It is proposed that during the Brasiliano orogeny a layer of preexisting mafic material was delaminated, in whole or in part, from the original Brasiliano crust. Partial delamination would have happened in the southern portion of the plateau, where independent studies found evidence of a more resistant rheology.
During Mesozoic rifting, thinning of the crust around the southern Plateau would have formed the marginal basins and the Sertaneja depression, which would have included the northern part of the Plateau. In the Cenozoic, uplift of the northern Plateau would have occurred, resulting in a northern Plateau without mafic material at the base of the crust and a southern Plateau with partially delaminated mafic layer.
Abstract:
This thesis presents and discusses the results of ambient seismic noise correlation for two different environments: an intraplate setting and the Mid-Atlantic Ridge. The coda wave interferometry method has also been tested on the intraplate data. Ambient noise correlation is a method that retrieves the structural response between two receivers from ambient noise records, as if one of the stations were a virtual source. It has been widely used in seismology to image the subsurface and to monitor structural changes, mostly associated with volcanic eruptions and large earthquakes. In the intraplate study, we were able to detect localized structural changes related to a small earthquake swarm, whose main event is mR 3.7, in Northeast Brazil. We also showed that 1-bit normalization and spectral whitening result in the loss of waveform details, and that the phase auto-correlation, which is amplitude unbiased, seems to be more sensitive and robust for our analysis of a small earthquake swarm. The analysis of 6 months of data using cross-correlations detects clear medium changes soon after the main event, while the auto-correlations detect changes essentially after 1 month. This could be explained by fluid pressure redistribution, initiated by hydromechanical changes, and by pathways opened to shallower depth levels by later earthquakes. In the Mid-Atlantic Ridge study, we investigate structural changes associated with an mb 4.9 earthquake in the region of the Saint Paul transform fault. The data were recorded by a single broadband seismic station located less than 200 km from the Mid-Atlantic Ridge. The results of the phase auto-correlation for a 5-month period show a strong co-seismic medium change followed by a relatively fast post-seismic recovery. This medium change is likely related to the damage caused by the earthquake's ground shaking.
The healing process (filling of the new cracks) lasted 60 days and can be decomposed into two phases: a fast recovery (70% in ~30 days) in the early post-seismic stage and a relatively slow recovery later (30% in ~30 days). In the coda wave interferometry study, we monitor temporal changes of the subsurface caused by the small intraplate earthquake swarm mentioned previously. The method was first validated with synthetic data, where we were able to detect a change of 2.5% in the source position and a 15% decrease in the number of scatterers. Then, in the real data, we observed a rapid decorrelation of the seismic coda after the mR 3.7 event, indicating a rapid earthquake-induced change of the subsurface in the fault region.
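The decorrelation measurement behind coda wave interferometry can be illustrated with a small synthetic sketch: the correlation coefficient between a reference coda and a later coda stays near 1 under instrument noise alone, but drops when the scattering medium changes. The waveforms and amplitudes below are illustrative assumptions, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(2048.0)
envelope = np.exp(-t / 512.0)                 # decaying coda envelope
ref = envelope * rng.standard_normal(t.size)  # reference (pre-event) coda
unchanged = ref + 0.05 * rng.standard_normal(t.size)          # noise only
changed = ref + 0.8 * envelope * rng.standard_normal(t.size)  # perturbed medium

def cc(x, y):
    # Zero-lag correlation coefficient between two coda windows
    return float(np.corrcoef(x, y)[0, 1])

cc_noisy = cc(ref, unchanged)    # stays close to 1
cc_changed = cc(ref, changed)    # drops when the scattering medium changes
```

Tracking such coefficients through time is what reveals the rapid post-event decorrelation and the subsequent recovery described above.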
Abstract:
The increasing demand for Internet data traffic in wireless broadband access networks requires both the development of efficient, novel wireless broadband access technologies and the allocation of new spectrum bands for that purpose. The introduction of a great number of small cells in cellular networks, allied with the complementary adoption of Wireless Local Area Network (WLAN) technologies in unlicensed spectrum, is one of the most promising concepts to meet this demand. One alternative is the aggregation of Industrial, Scientific and Medical (ISM) unlicensed spectrum to licensed bands, using wireless networks defined by the Institute of Electrical and Electronics Engineers (IEEE) and the Third Generation Partnership Project (3GPP). While IEEE 802.11 (Wi-Fi) networks are aggregated to Long Term Evolution (LTE) small cells via LTE / WLAN Aggregation (LWA), in proposals like LTE-Unlicensed (LTE-U) and Licensed-Assisted Access (LAA) the LTE air interface itself is used for transmission in the unlicensed band. Wi-Fi technology is widespread and operates in the same 5 GHz ISM spectrum bands as the LTE proposals, which may bring performance degradation due to the coexistence of both technologies in the same bands. Besides, there is a need to improve Wi-Fi operation to support scenarios with a large number of neighboring Overlapping Basic Service Set (OBSS) networks, i.e. dense deployments with many Wi-Fi nodes. It has long been known that overall Wi-Fi performance falls sharply as the number of Wi-Fi nodes sharing a channel grows, so mechanisms to increase its spectral efficiency are needed. This work is dedicated to the study of coexistence between different wireless broadband access systems operating in the same unlicensed spectrum bands, and of how to solve the coexistence problems via distributed coordination mechanisms. The problem of coexistence between different technologies (i.e.
LTE and Wi-Fi) and the problem of coexistence between different networks of the same technology (i.e. multiple Wi-Fi OBSSs) are analyzed both qualitatively and quantitatively via system-level simulations, and the main issues to be faced are identified from these results. From that, distributed coordination mechanisms are proposed and evaluated via system-level simulations, both for the inter-technology and the intra-technology coexistence problem. Results indicate that the proposed solutions provide significant gains when compared to the situation without distributed coordination.
Abstract:
Within the context of the overall ecological working programme Dynamics of Antarctic Marine Shelf Ecosystems (DynAMo) of the PS96 (ANT-XXXI/2) cruise of RV "Polarstern" to the Weddell Sea (Dec 2015 to Feb 2016), seabed imaging surveys were carried out along drift profiles by means of the Ocean Floor Observation System (OFOS) of the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) Bremerhaven. The setup and mode of deployment of the OFOS were similar to those described by Bergmann and Klages (2012, doi:10.1016/j.marpolbul.2012.09.018). OFOS is a surface-powered gear equipped with two downward-looking cameras installed side-by-side: one high-resolution, wide-angle still camera (CANON® EOS 5D Mark III; lens: Canon EF 24 f/1.4L II, f stop: 13, exposure time: 1/125 sec; in-air view angles: 74° (horizontal), 53° (vertical), 84° (diagonal); image size: 5760 x 3840 px = 21 MPix; front of pressure-resistant camera housing consisting of a plexiglass dome port) and one high-definition color video camera (SONY® FCB-H11). The system was vertically lowered over the stern of the ship on a broadband fibre-optic cable until it hovered approximately 1.5 m above the seabed. It was then towed behind the slowly sailing ship at a speed of approximately 0.5 kn (0.25 m/s). The ship's Global Acoustic Positioning System (GAPS), combining Ultra Short Base Line (USBL), Inertial Navigation System (INS) and satellite-based Global Positioning System (GPS) technologies, was used to obtain highly precise underwater position data for the OFOS. During each profile, OFOS was kept at the preferred height above the seafloor by means of the live video feed and occasional minor cable-length adjustments with the winch to compensate for small-scale bathymetric variations in seabed morphology. Information on water depth and height above the seafloor was continuously recorded by OFOS-mounted sensors (GAPS transponder, Tritech altimeter).
Three lasers, placed beside the still camera, emit parallel beams and project red light points, arranged as an equilateral triangle with a side length of 50 cm, into each photo, thus providing a scale that can be used to calculate the seabed area depicted in each image and/or to measure the size of organisms or seabed features visible in the image. In addition, the seabed area depicted was estimated using the altimeter-derived height above the seafloor and the optical characteristics of the OFOS still camera. In automatic mode, a seabed photo, depicting an area of approximately 3.45 m**2 (= 2.3 m x 1.5 m, with variations depending on the actual height above ground), was taken every 30 seconds to obtain a series of "TIMER" stills distributed at regular distances along profiles whose length varied with the duration of the cast. At a ship speed of 0.5 kn, the average distance between seabed images was approximately 5 m. Additional "HOTKEY" photos were taken of interesting objects (organisms, seabed features such as putative iceberg scours) when they appeared in the live video feed (which was also recorded, in addition to the stills, for documentation and possible later analysis). If any image from this collection is used, please cite the reference as given above.
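The quoted footprint of approximately 3.45 m**2 (2.3 m x 1.5 m) at ~1.5 m altitude follows from the camera's stated in-air view angles by simple trigonometry. The sketch below reproduces it under a flat-seabed assumption and, like the in-air angles themselves, ignores refraction through the dome port.

```python
import math

H_ANGLE, V_ANGLE = 74.0, 53.0   # in-air view angles of the still camera (deg)

def footprint(height_m):
    # Extent of the imaged seabed strip: w = 2 * h * tan(view_angle / 2)
    wide = 2.0 * height_m * math.tan(math.radians(H_ANGLE / 2.0))
    tall = 2.0 * height_m * math.tan(math.radians(V_ANGLE / 2.0))
    return wide, tall, wide * tall

width, length, area = footprint(1.5)   # OFOS hovering ~1.5 m above the seabed
# width ~ 2.26 m, length ~ 1.50 m, area ~ 3.4 m**2, close to the quoted values
```

The same relation explains why the depicted area varies with the actual height above ground, which is why the altimeter reading enters the per-image area estimate.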