938 results for Topographic maps
Abstract:
The geological overview map was compiled from 15 geological maps (1:25,000) and is based on Jacobs et al. 1996. The topographic basemaps were adapted from unpublished 1:250,000 provisional topographic maps (Institut f. Angewandte Geodäsie, Frankfurt, 1983). Some of the contour lines are from Radarsat (Liu et al. 2001).
Abstract:
"The fifth of a series of papers on topographic mapping by aerial photography."
Abstract:
"January 1985."
Abstract:
Recently, there has been considerable research activity in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, the representational capabilities and internal representations of these models are not well understood. We rigorously analyze a generalization of the Self-Organizing Map (SOM) for processing sequential data, the Recursive SOM (RecSOM [1]), as a non-autonomous dynamical system consisting of a set of fixed input maps. We show that contractive fixed input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on the parameter $\beta$ (weighting the importance of past information when processing sequences) under which contractiveness of the fixed input maps is guaranteed.
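The contractivity condition in the abstract above can be probed numerically. The sketch below uses a simplified RecSOM-style activation update for one fixed input and crudely estimates its Lipschitz constant over random pairs of previous activations; the map is contractive when that constant is below 1. All parameters (alpha, beta, the weights, the input) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy RecSOM-style setup (illustrative, not the paper's parameters).
N, D = 10, 3                  # map units, input dimension
alpha, beta = 1.0, 0.3        # input vs. context weighting
w = rng.normal(size=(N, D))   # input weights, one row per unit
c = rng.normal(size=(N, N))   # context weights, one row per unit
x = rng.normal(size=D)        # a single fixed input

def fixed_input_map(y):
    """Activation update for the fixed input x, given previous activations y."""
    return np.exp(-alpha * np.sum((x - w) ** 2, axis=1)
                  - beta * np.sum((y - c) ** 2, axis=1))

# Crude numerical Lipschitz estimate: max ratio of output distance to
# input distance over random pairs of previous activation vectors.
ratios = []
for _ in range(2000):
    y1, y2 = rng.uniform(0, 1, size=(2, N))
    num = np.linalg.norm(fixed_input_map(y1) - fixed_input_map(y2))
    den = np.linalg.norm(y1 - y2)
    ratios.append(num / den)

L = max(ratios)
print(f"estimated Lipschitz constant: {L:.3f} "
      f"({'contractive' if L < 1 else 'possibly expansive'})")
```

Larger beta strengthens the dependence on past activations, which is why the paper's bound on beta matters: above it, contractiveness (and hence the Markovian organization of receptive fields) is no longer guaranteed.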
Abstract:
Different visual stimuli may activate separate channels in the visual system and produce magnetic responses from the human brain which originate from distinct regions of the visual cortex. To test this hypothesis, we have investigated the distribution of visual evoked magnetic responses to three distinct visual stimuli over the occipital region of the scalp with a DC-SQUID second-order gradiometer in an unshielded environment. Patterned stimuli were presented full field and to the right half field, while a flash stimulus was presented full field only, in five normal subjects. Magnetic responses were recorded from 20 to 42 positions over the occipital scalp. Topographic maps were prepared of the major positive component within the first 150 ms for the three stimuli, i.e., the P100m (pattern shift), C11m (pattern onset) and P2m (flash). For the pattern shift stimulus the data suggested that the source of the P100m was close to the midline, with the current directed towards the medial surface. The data for the pattern onset C11m suggested a source at a similar depth but with the current directed away from the midline towards the lateral surface. The flash P2m appeared to originate closer to the surface of the occipital pole than both patterned stimuli. Hence the pattern shift P100m (which may represent movement) and the pattern onset C11m (representing contrast and contour) appear to originate in similar areas of the brain but to represent different aspects of cortical processing. By contrast, the flash P2m (representing luminance change) appears to originate in a distinct area of visual cortex closer to the occipital pole.
Abstract:
This thesis presents the results of an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of the methods for measuring minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which produce the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract cognitive state of the brain. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators.
One of the main objectives of this thesis is to show that a much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the Earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines. As a result, the use of MEG has, hitherto, been restricted to large institutions able to afford the high costs associated with the procurement and maintenance of these machines. In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data.
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
Abstract:
The research compares the usefulness of four remote sensing information sources: LANDSAT photographic prints, LANDSAT computer compatible tapes, Metric Camera and SIR-A photographic prints. These sources provide evaluations of the catchment characteristics of the Belize and Sibun river basins in Central America. Map evaluations at 1:250,000 scale are compared to the results obtained from the remotely sensed information sources at the same scale. The values of catchment characteristics for both maps and LANDSAT prints are used in multiple regression analysis to derive flood flow formulae, after investigations are made to provide a suitable dependent-variable discharge series for the short-term records. The use of all remotely sensed information sources in providing evaluations of catchment characteristics is discussed. LANDSAT prints and computer compatible tapes of a post-flood scene are used to estimate flood distributions and volumes. These are compared to values obtained from unit hydrograph analysis, using the dependent discharge series, to evaluate the probable losses from the Belize river to the floodplain, thereby assessing the accuracy of the LANDSAT estimates. Information relating to flood behaviour is discussed in terms of basic image presentation as well as image processing. A cost analysis of the purchase and use of all materials is provided. The conclusions of the research indicate that LANDSAT print material may provide information suitable for regression analysis at levels of accuracy as great as those of topographic maps, that the differing information sources are uniquely applicable, and that accurate estimates of flood volumes may be determined even from post-flood imagery.
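The regression step described above typically fits a power-law flood formula to catchment characteristics. A minimal sketch of that idea follows; the catchment areas, slopes and the power law used to generate the synthetic discharges are assumptions for demonstration only, not data from the Belize or Sibun basins.

```python
import numpy as np

# Synthetic catchment characteristics (illustrative values).
area  = np.array([120.0, 450.0, 800.0, 1500.0, 2600.0])   # km^2
slope = np.array([0.012, 0.008, 0.006, 0.004, 0.003])     # mainstream slope
a_true, b_true, c_true = 0.5, 0.8, -0.2                   # assumed formula
Q = a_true * area**b_true * slope**c_true                 # m^3/s (synthetic)

# A power-law flood formula Q = a * A^b * S^c is linear in log space:
#   log Q = log a + b*log A + c*log S
X = np.column_stack([np.ones_like(area), np.log(area), np.log(slope)])
coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
log_a, b, c = coef
print(f"fitted formula: Q = {np.exp(log_a):.2f} * A^{b:.2f} * S^{c:.2f}")
```

With real map- or LANDSAT-derived characteristics the fit would of course carry residual error; here the data are generated from the assumed law, so the coefficients are recovered exactly.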
Abstract:
The premise of this dissertation is to create a highly integrated platform that combines the most current recording technologies for brain research through the development of new algorithms for three-dimensional (3D) functional mapping and 3D source localization. The recording modalities that were integrated include Electroencephalography (EEG), Optical Topographic Maps (OTM), Magnetic Resonance Imaging (MRI), and Diffusion Tensor Imaging (DTI). This work can be divided into two parts. The first part involves the integration of OTM with MRI, where the topographic maps are mapped to both the skull and the cortical surface of the brain. This integration is made possible through the development of new algorithms that determine the probe locations on the MRI head model and warp the 2D topographic maps onto the 3D MRI head/brain model. Dynamic changes in brain activation can be visualized on the MRI head model through a graphical user interface. The second part of this research involves augmenting a fiber tracking system by adding the ability to integrate the source localization results generated by commercial software named Curry. This task involved registering the EEG electrodes and the dipole results to the MRI data. Such integration allows the visualization of fiber tracts, along with the source of the EEG, in a 3D transparent brain structure. The research findings of this dissertation were tested and validated through the participation of patients from Miami Children's Hospital (MCH). Such an integrated platform, presented to medical professionals in the form of a user-friendly graphical interface, is viewed as a major contribution of this dissertation.
It should be emphasized that there are two main aspects to this research endeavor: (1) if a dipole could be situated in time at its different positions, its trajectory may reveal additional information on the extent and nature of the brain malfunction; (2) situating such a dipole trajectory with respect to the fiber tracts could help ensure the preservation of these fiber tracts (axons) during surgical interventions, thereby preserving the parts of the brain that are responsible for information transmission.
Abstract:
A 3.38 m long sediment core raised from the tidal flat sediments of the 'Blauortsand' in the Wadden Sea northwest of Büsum (Schleswig-Holstein, Germany) was analysed in order to investigate long-term changes in sediment pollution with Pb, Cu, Zn and Cd. Comparison with topographic maps since 1952 and 210Pb activity allowed a general dating of the sediment succession in the core. The heavy metal concentrations, including 210Pb, of the < 20 µm grain-size fraction in thick sediment slices below 1.30 m indicated background levels. Their values increased and reached modern levels in the upper sediment layers of the core above 1 m. The increases since the second half of the 19th century were 1- to 3-fold for Pb, Cu and Zn, and up to 11-fold for Cd. More investigations are needed to quantify the geographical extent and history of the contamination shown in this pilot study.
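The enrichment figures quoted above are simple ratios of modern to background concentrations in the < 20 µm fraction. A minimal sketch of that arithmetic, with illustrative concentrations chosen only to reproduce the reported orders of enrichment (not the measured values from the core):

```python
# Concentrations in µg/g of the <20 µm fraction (illustrative numbers,
# not the study's measured values).
background = {"Pb": 25.0, "Cu": 15.0, "Zn": 80.0, "Cd": 0.2}  # below 1.30 m
modern     = {"Pb": 60.0, "Cu": 40.0, "Zn": 200.0, "Cd": 2.2}  # above 1 m

# Enrichment relative to the pre-industrial background.
for metal in background:
    factor = modern[metal] / background[metal]
    print(f"{metal}: {factor:.1f}-fold enrichment")
```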
Abstract:
Svalbard is a heavily glacier-covered archipelago in the Arctic. Dickson Land (DL), in the central part of the largest island, Spitsbergen, is relatively arid, and as a result, glaciers there are relatively small and restricted mostly to valleys and cirques. This study presents a comprehensive analysis of glacier changes in DL based on inventories compiled from topographic maps and digital elevation models for the Little Ice Age (LIA) maximum, the 1960s, 1990 and 2009/11. Total glacier area has decreased by ~38% since the LIA maximum, and front retreat has accelerated over the study period. Recently, most of the local glaciers have been consistently thinning in all elevation bands, in contrast to larger Svalbard ice masses, which remain closer to balance. The mean 1990–2009/11 geodetic mass balance of glaciers in DL is among the most negative of the Svalbard regional means known from the literature.
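A geodetic mass balance like the 1990–2009/11 estimate above is obtained by differencing digital elevation models and converting the mean elevation change to water equivalent. A toy sketch of that calculation; the two DEM grids, the time span and the density-conversion factor are illustrative assumptions, not the study's data:

```python
import numpy as np

# Two DEMs of the same glacier on a common grid (synthetic values, m a.s.l.).
dem_1990 = np.array([[420.0, 445.0],
                     [480.0, 510.0]])
dem_2010 = np.array([[405.0, 432.0],
                     [469.0, 501.0]])
years = 20.0
rho_ratio = 0.85   # commonly assumed ice/firn-to-water density conversion

dh = dem_2010 - dem_1990             # elevation change per grid cell (m)
mb = dh.mean() * rho_ratio / years   # mean mass balance, m w.e. per year
print(f"mean geodetic mass balance: {mb:.3f} m w.e. a^-1")
```

A negative result, as here, corresponds to the consistent thinning in all elevation bands reported for the Dickson Land glaciers.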
Abstract:
Relief shown by contours.
Abstract:
Confidential: "Gunji gokuhi" (military secret) printed.
Abstract:
We present an imaging technique for the 3D-form metrology of optical surfaces. It is based on optical absorption in a fluid situated between the surface and a reference. An improved setup with a bi-chromatic light source is fundamental to obtaining reliable topographic maps. The technique can measure any surface finish (rough or polished), form and slope, independently of scale. We present results for flat and spherical optical surfaces and for arrays of lenses with different surface finishes (rough and polished). We achieve form accuracies from several nanometers to sub-lambda for sag departures from tens to hundreds of microns. The technique therefore seems suitable for quality control in the production of precision aspheric and freeform lenses and other complex shapes on transparent substrates, independently of the surface finish.
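The role of the bi-chromatic source can be illustrated with a Beer-Lambert sketch: taking the ratio of intensities at two wavelengths cancels wavelength-independent terms (source level, local reflectivity, scatter from a rough finish), leaving only the absorbing fluid-layer thickness. All numbers below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Assumed absorption coefficients of the fluid at two wavelengths (1/µm).
alpha1, alpha2 = 0.002, 0.010
I0 = 1.0        # equal source intensity at both wavelengths (assumption)
R = 0.4         # unknown local reflectivity -- cancels out in the ratio
d_true = 55.0   # fluid thickness (µm) between surface and reference

# Beer-Lambert model of the detected intensities.
I1 = I0 * R * np.exp(-alpha1 * d_true)
I2 = I0 * R * np.exp(-alpha2 * d_true)

# Thickness from the intensity ratio: I0 and R drop out.
d_est = np.log(I1 / I2) / (alpha2 - alpha1)
print(f"recovered thickness: {d_est:.2f} µm")
```

Repeating this per pixel yields a topographic map of the gap, and hence of the surface form, regardless of the local reflectivity of the finish.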
Abstract:
Bonebridge™ (BB) implantation relies on optimal anchoring of the bone-conduction implant in the temporal bone. Preoperative position planning has to account for the available bone thickness while minimizing unwanted interference with underlying anatomical structures. This study describes the first clinical experience with a planning method based on topographic bone thickness maps (TBTM) for presigmoid BB implantations. The temporal bone was segmented, enabling three-dimensional surface generation. Distances between the external and internal surfaces were color-encoded and mapped to a TBTM. Suitable implant positions were planned with reference to the TBTM. Surgery was performed according to the standard procedure (n = 7). Computation of the TBTM and subsequent implant position planning took 70 min on average for a trained technician. Surgical time for implantations under passive TBTM image guidance was 60 min on average. The sigmoid sinus (n = 5) and dura mater (n = 1) were exposed, as predicted by the TBTM. The feasibility of the TBTM method was shown for standard presigmoid BB implantations. The projection of three-dimensional bone thickness information into a single topographic map provides the surgeon with an intuitive display of the anatomical situation prior to implantation. Nevertheless, TBTM generation time has to be significantly reduced to simplify integration into clinical routine.
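The core of the TBTM idea is a per-position distance between the outer and inner bone surfaces. A minimal sketch with the two surfaces modelled as height fields on a common grid (synthetic values, not segmented CT data; in the actual method the thickness map would be color-encoded for display):

```python
import numpy as np

# Outer and inner skull surfaces as heights above a reference plane (mm).
# Synthetic toy grid, not patient data.
outer = np.array([[8.0,  9.5, 7.0],
                  [9.0, 11.0, 8.5],
                  [7.5, 10.0, 7.0]])
inner = np.array([[4.0, 4.5, 4.0],
                  [4.5, 5.0, 4.5],
                  [4.0, 4.5, 4.0]])

thickness = outer - inner   # bone thickness at each grid position (mm)

# A candidate implant position: where the bone is thickest.
i, j = np.unravel_index(np.argmax(thickness), thickness.shape)
print(f"max thickness {thickness[i, j]:.1f} mm at grid position ({i}, {j})")
```

In practice the planning step would also exclude positions over structures such as the sigmoid sinus or dura, which is exactly the interference the TBTM is meant to predict.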