921 results for Point interpolation method


Relevance: 90.00%

Publisher:

Abstract:

Based on fractal theory, contractive mapping principles and fixed point theory, and by means of affine transforms, this dissertation develops a novel Explicit Fractal Interpolation Function (EFIF) that can be used to reconstruct seismic data with high fidelity and precision. Spatial trace interpolation is an important issue in seismic data processing. Under ideal circumstances, seismic data should be sampled with uniform spatial coverage. In practice, however, constraints such as complex surface conditions mean that the sampling density may be sparse, or some traces may be lost for other reasons. Wide spacing between receivers can result in sparse sampling along traverse lines and thus in spatial aliasing of short-wavelength features. Interpolation is therefore of great importance: it must recover not only the amplitude information but also the phase information, especially at points where the phase changes sharply. Many interpolation methods have been proposed; this dissertation focuses on a special class of fractal interpolation functions, referred to as explicit fractal interpolation functions, to improve the accuracy of the interpolated reconstruction and to resolve local detail. The traditional fractal interpolation method is based mainly on the random fractional Brownian motion (fBm) model; moreover, the vertical scaling factor, which plays a critical role in the implementation of fractal interpolation, is assigned the same value throughout the interpolation process, so local detail cannot be resolved. In addition, the greatest defect of the traditional fractal interpolation method is that it cannot provide the function values at the interpolation nodes, so the node error cannot be analyzed quantitatively and the feasibility of the method cannot be evaluated.
Detailed discussions of the applications of fractal interpolation in seismology have not been given by previous researchers, let alone the interpolation of single-trace seismograms. On the basis of previous work and fractal theory, this dissertation discusses fractal interpolation thoroughly, analyzes the stability of this special kind of interpolating function, and proposes an explicit expression for the vertical scaling factor, which controls the precision of the interpolation. The novel method extends the traditional fractal interpolation method and converts fractal interpolation with random algorithms into interpolation with deterministic algorithms. A binary-tree data structure is applied during the interpolation, which avoids the iteration that is inevitable in traditional fractal interpolation and improves computational efficiency. To illustrate the validity of the novel method, this dissertation develops several theoretical models, synthesizes common-shot gathers and seismograms, and reconstructs traces that were erased from the initial section using the explicit fractal interpolation method. To compare quantitatively the waveforms and amplitudes of the theoretical traces erased from the initial section with the traces reconstructed from them, each missing trace is reconstructed and the residuals are analyzed. The numerical experiments demonstrate that the novel fractal interpolation method is applicable not only to seismograms with small offsets but also to those with large offsets. The seismograms reconstructed by the explicit fractal interpolation method closely resemble the original ones: the waveforms of the missing traces are estimated very well, and the amplitudes of the interpolated traces are a good approximation of the original ones.
The high precision and computational efficiency of explicit fractal interpolation make it a useful tool for reconstructing seismic data; it can resolve local detail while preserving the overall characteristics of the object investigated. To illustrate the influence of the explicit fractal interpolation method on the accuracy of imaging structure in the Earth's interior, this dissertation applies the method to reverse-time migration. The imaging sections obtained using the fractally interpolated reflection data closely resemble the original ones. The numerical experiments demonstrate that, even with sparse sampling, highly accurate images of the structure of the Earth's interior can still be obtained by means of the explicit fractal interpolation method, so imaging results of fine quality can be obtained with a relatively small number of seismic stations. With the fractal interpolation method, the efficiency and accuracy of reverse-time migration can be improved at reduced cost. To verify the effect of the method on real data, we tested it using data provided by the Broadband Seismic Array Laboratory, IGGCAS. The results demonstrate that the accuracy of explicit fractal interpolation remains very high even for real data with large epicentral distances and large offsets. The amplitudes and phases of the reconstructed station data closely resemble those of the original data erased from the initial section. Altogether, the novel fractal interpolation function provides a new and useful tool for reconstructing seismic data with high precision and efficiency, and presents an alternative for accurately imaging the deep structure of the Earth.
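The standard affine construction behind such fractal interpolation functions can be sketched as follows. This is a generic deterministic FIF, not the dissertation's EFIF: the node data, the per-interval vertical scaling factors `d`, and the refinement depth are illustrative, and the explicit vertical-scaling expression proposed in the dissertation is not reproduced here.

```python
import numpy as np

def fif_points(x, y, d, levels=6):
    """Deterministic fractal interpolation: repeatedly apply the affine
    IFS maps w_i(t, v) = (a_i t + e_i, c_i t + d_i v + f_i) to the
    interpolation nodes; the point set converges to the graph of the FIF,
    which passes through every node (x_i, y_i)."""
    x, y, d = (np.asarray(v, float) for v in (x, y, d))
    x0, xN, y0, yN = x[0], x[-1], y[0], y[-1]
    L = xN - x0
    a = (x[1:] - x[:-1]) / L                       # horizontal contractions
    e = (xN * x[:-1] - x0 * x[1:]) / L
    c = (y[1:] - y[:-1] - d * (yN - y0)) / L       # shear fixed by endpoint matching
    f = (xN * y[:-1] - x0 * y[1:] - d * (xN * y0 - x0 * yN)) / L
    pts = np.column_stack([x, y])
    for _ in range(levels):                        # deterministic refinement
        pts = np.concatenate([
            np.column_stack([a[i] * pts[:, 0] + e[i],
                             c[i] * pts[:, 0] + d[i] * pts[:, 1] + f[i]])
            for i in range(len(a))])
    return pts[np.argsort(pts[:, 0])]
```

With all `d_i = 0` the attractor collapses to the piecewise-linear interpolant; increasing `|d_i|` (kept below 1 for contractivity) adds fractal roughness between the nodes, which is why the vertical scaling factor controls the precision of the interpolation.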

Relevance: 90.00%

Publisher:

Abstract:

The ionospheric parameter M(3000)F2 (the so-called transmission or propagation factor) is important not only in practical applications, such as frequency planning for radio communication, but also in ionospheric modeling. This parameter is strongly anti-correlated with the ionospheric F2-layer peak height hmF2, a parameter often used as a key anchor point in widely used empirical models of the ionospheric electron density profile (e.g., the IRI and NeQuick models). Since hmF2 is not easy to obtain from measurements, whereas M(3000)F2 can be routinely scaled from ionograms recorded by ionosonde/digisonde stations distributed globally and has been accumulated over a long history, hmF2 is usually calculated from M(3000)F2 using an empirical formula connecting the two. In practice, the CCIR M(3000)F2 model is widely used to obtain M(3000)F2 values. Recently, however, some authors have found that the CCIR model shows remarkable discrepancies from measured M(3000)F2, especially in low-latitude and equatorial regions. For this reason, the International Reference Ionosphere (IRI) community has proposed improving or updating the currently used CCIR M(3000)F2 model, and any effort toward improving the current model or developing a new global hmF2 model is encouraged. In this dissertation, empirical models of M(3000)F2 and hmF2 are constructed based on empirical orthogonal function (EOF) analysis combined with regression analysis. The main results are as follows: 1. A single-station model is constructed using monthly median hourly values of M(3000)F2 observed at Wuhan Ionospheric Observatory during 1957–1991 and compared with the IRI model. The results show that the EOF method can represent most of the variance of the original data set with only a few orders of EOF components; it is a powerful method for ionospheric modeling. 2. Using the values of M(3000)F2 observed by ionosondes distributed globally, data at uniformly distributed global grids were obtained with the Kriging interpolation method. The gridded data were then decomposed into EOF components in two coordinate systems: (1) geographic longitude and latitude; (2) modified dip (modip) and local time. Based on the EOF decompositions under these two coordinate systems, two types of global M(3000)F2 model are constructed. Statistical analysis shows that both constructed models agree better with observed M(3000)F2 than the model currently used by IRI and represent the global variations of M(3000)F2 better. 3. The hmF2 data used to construct the hmF2 model were converted from observed M(3000)F2 using the empirical formula connecting them. Two types of global hmF2 model were constructed with a method similar to that used for M(3000)F2. Statistical analysis shows that the predictions of our models are more accurate than those of the IRI model, demonstrating that constructing a global hmF2 model directly by EOF analysis is feasible. The results in this thesis indicate that the modeling technique based on EOF expansion combined with regression analysis is very promising for constructing global models of M(3000)F2 and hmF2; it is worth investigating further and has the potential to be applied to the global modeling of other ionospheric parameters.
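The core of the EOF-plus-regression technique can be sketched with a singular value decomposition. The function below is a generic EOF decomposition of a (time x grid) data matrix, not the dissertation's actual model; the number `k` of retained modes is illustrative, and in the full method each time-varying coefficient series would in turn be fitted by regression on drivers such as solar activity and season.

```python
import numpy as np

def eof_decompose(F, k):
    """Decompose a (time x grid) field into k EOF modes via SVD:
    F ≈ mean + sum_j pcs[:, j] (outer) eofs[j, :]."""
    mean = F.mean(axis=0)                      # climatological mean field
    U, s, Vt = np.linalg.svd(F - mean, full_matrices=False)
    pcs = U[:, :k] * s[:k]                     # time-varying coefficients
    eofs = Vt[:k]                              # orthogonal spatial patterns
    var_frac = (s[:k] ** 2).sum() / (s ** 2).sum()  # variance captured
    return mean, pcs, eofs, var_frac
```

The fraction `var_frac` quantifies the claim above that a few leading EOF components capture most of the variance of the original data set.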

Relevance: 90.00%

Publisher:

Abstract:

In this paper we propose a method for interpolation over a set of retrieved cases in the adaptation phase of the case-based reasoning cycle. The method has two advantages over traditional systems: first, it can predict "new" instances not yet present in the case base; second, it can predict solutions not present in the retrieval set. The method is a generalisation of Shepard's interpolation method, formulated as the minimisation of an error function defined in terms of distance metrics in the solution and problem spaces. We term the retrieval algorithm the Generalised Shepard Nearest Neighbour (GSNN) method. A novel aspect of GSNN is that it provides a general method for interpolation over nominal solution domains. The method is illustrated in the paper with reference to the Iris classification problem, and evaluated with reference to a simulated nominal-value test problem and to a benchmark case base from the travel domain. The algorithm is shown to outperform conventional nearest neighbour methods on these problems. Finally, GSNN is shown to improve in efficiency when used in conjunction with a diverse retrieval algorithm.
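The GSNN idea can be sketched as follows. This is a simplified reading of the published method: the problem-space metric is taken as Euclidean distance, the Shepard weight exponent `p` and the candidate-solution set are supplied by the caller, and the solution-space metric `sol_dist` is left abstract so that nominal domains are covered.

```python
import numpy as np

def gsnn_predict(query, cases, solutions, candidates, sol_dist, p=2):
    """Pick the candidate solution y minimising the Shepard-weighted
    squared solution-space distance to the retrieved cases' solutions."""
    d = np.linalg.norm(cases - query, axis=1)   # problem-space distances
    if np.any(d == 0):                          # exact match in the case base
        return solutions[int(np.argmin(d))]
    w = d ** (-p)                               # inverse-distance weights
    w /= w.sum()
    cost = [sum(wi * sol_dist(y, yi) ** 2 for wi, yi in zip(w, solutions))
            for y in candidates]
    return candidates[int(np.argmin(cost))]
```

For numeric solutions with `sol_dist(a, b) = |a - b|`, the minimiser is the Shepard weighted mean, recovering classical Shepard interpolation; for nominal solutions, any distance over the symbol set can be substituted, and the predicted symbol need not appear in the retrieval set.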

Relevance: 90.00%

Publisher:

Abstract:

A Kriging interpolation method is combined with an object-based evaluation measure to assess the ability of the UK Met Office's dispersion and weather prediction models to predict the evolution of a plume of tracer as it was transported across Europe. The object-based evaluation method, SAL, considers aspects of the Structure, Amplitude and Location of the pollutant field. The SAL method quantifies errors in the predicted size and shape of the pollutant plume through the structure component, the over- or under-prediction of the pollutant concentrations through the amplitude component, and the position of the pollutant plume through the location component. The quantitative results of the SAL evaluation are similar for both models and consistent with a subjective visual inspection of the predictions. A negative structure component for both models, throughout the entire 60-hour plume dispersion simulation, indicates that the modelled plumes are too small and/or too peaked compared with the observed plume at all times. The amplitude component for both models is strongly positive at the start of the simulation, indicating that surface concentrations are over-predicted by both models for the first 24 hours, but modelled concentrations are within a factor of 2 of the observations at later times. Finally, for both models, the location component is small for the first 48 hours after the start of the tracer release, indicating that the modelled plumes are situated close to the observed plume early in the simulation, but this plume location error grows at later times. The SAL methodology has also been used to identify differences in the transport of pollution in the dispersion and weather prediction models.
The convection scheme in the weather prediction model is found to transport more pollution vertically out of the boundary layer into the free troposphere than the dispersion model's convection scheme, resulting in lower pollutant concentrations near the surface and hence a better forecast for this case study.
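As an illustration, the amplitude component of SAL is the normalised difference of the domain-averaged fields, bounded in [-2, 2]. The sketch below assumes model and observed concentration fields given on the same grid; the structure and location components are not shown.

```python
import numpy as np

def sal_amplitude(mod, obs):
    """Amplitude component of SAL: normalised difference of domain-mean
    concentrations; A > 0 means over-prediction, A = ±2 are the extremes."""
    dm, do = np.mean(mod), np.mean(obs)
    return (dm - do) / (0.5 * (dm + do))
```

A model field exactly twice the observed one gives A = 2/3, so "strongly positive" amplitude values early in the simulation correspond to the over-prediction by more than a factor of 2 described above.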

Relevance: 90.00%

Publisher:

Abstract:

The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. It is independent of the boundary condition and is applicable to limited-aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data are collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches to sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method in the rough-surface scattering case and provide numerical simulations and examples.

Relevance: 90.00%

Publisher:

Abstract:

This paper has several original contributions. The first is to employ a superior interpolation method that enables us to estimate, nowcast and forecast monthly Brazilian GDP for 1980-2012 in an integrated way; see Bernanke, Gertler and Watson (1997, Brookings Papers on Economic Activity). Second, in the spirit of Mariano and Murasawa (2003, Journal of Applied Econometrics), we propose and test a myriad of interpolation models and interpolation auxiliary series, all coincident with GDP from a business-cycle dating point of view. Based on these results, we finally choose the most appropriate monthly indicator for Brazilian GDP. Third, this monthly GDP estimate is compared to an economic activity indicator widely used by practitioners in Brazil, the Brazilian Economic Activity Index (IBC-Br). We find that our monthly GDP tracks economic activity better than IBC-Br. This happens by construction, since our state-space approach imposes the restriction (discipline) that our monthly estimate must add up to the observed quarterly series in any given quarter, which need not hold for IBC-Br. Moreover, our method has the advantage of being easily implemented: it only requires conditioning on two observed series for estimation, whereas estimating IBC-Br requires the availability of hundreds of monthly series. Fourth, in a nowcasting and forecasting exercise, we illustrate the advantages of our integrated approach. Finally, we compare the chronology of recessions of our monthly estimate with those produced elsewhere.
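The adding-up restriction mentioned above can be illustrated in toy form: the sketch below simply rescales a hypothetical monthly indicator so that each quarter's three months sum exactly to the observed quarterly GDP. The paper itself imposes this restriction inside a state-space model estimated with the Kalman filter, which this sketch does not reproduce.

```python
import numpy as np

def benchmark_monthly(indicator, quarterly):
    """Scale a monthly indicator so each quarter's three months sum to
    the observed quarterly series (the adding-up 'discipline')."""
    m = np.asarray(indicator, float).reshape(-1, 3)   # one row per quarter
    factors = np.asarray(quarterly, float) / m.sum(axis=1)
    return (m * factors[:, None]).ravel()
```

By construction the benchmarked monthly series aggregates back to the quarterly observations, which is exactly the property the abstract says IBC-Br need not satisfy.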


Relevance: 90.00%

Publisher:

Abstract:

The first contribution of this paper is to employ a superior interpolation method that enables us to estimate, nowcast and forecast monthly Brazilian GDP for 1980-2012 in an integrated way; see Bernanke, Gertler and Watson (1997, Brookings Papers on Economic Activity). The second contribution, in the spirit of Mariano and Murasawa (2003, Journal of Applied Econometrics), is to propose and test a myriad of interpolation models and interpolation auxiliary series, all coincident with GDP from a business-cycle dating point of view. Based on these results, we finally choose the most appropriate monthly indicator for Brazilian GDP. Third, this monthly GDP estimate is compared to an economic activity indicator widely used by practitioners in Brazil, the Brazilian Economic Activity Index (IBC-Br). We find that our monthly GDP tracks economic activity better than IBC-Br. This happens by construction, since our state-space approach imposes the restriction (discipline) that our monthly estimate must add up to the observed quarterly series in any given quarter, which need not hold for IBC-Br. Moreover, our method has the advantage of being easily implemented: it only requires conditioning on two observed series for estimation, whereas estimating IBC-Br requires the availability of hundreds of monthly series. The fourth contribution is to illustrate, in a nowcasting and forecasting exercise, the advantages of our integrated approach. Finally, we compare the chronology of recessions of our monthly estimate with those produced elsewhere.

Relevance: 90.00%

Publisher:

Abstract:

We present a fast procedure for scanning electron microscopy (SEM) analysis in which hexamethyldisilazane (HMDS) solvent, instead of critical point drying, is used to remove liquids from a microbiological specimen. The results indicate that the HMDS solvent is suitable for drying samples of anaerobic cells for examination by SEM and does not cause disruption of cell structure.

Relevance: 90.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 90.00%

Publisher:

Abstract:

Due to rapid and continuous deforestation, recent bird surveys in the Atlantic Forest follow rapid assessment programs to accumulate significant amounts of data over short periods of time. In this study, two surveying methods were compared to evaluate which technique more rapidly accumulated the most species (>90% of the estimated empirical value) in lowland Atlantic Forests in the state of São Paulo, southeastern Brazil. Birds were counted during the 2008-2010 breeding seasons using 10-minute point counts and 10-species lists. Overall, point counting detected as many species as lists (79 vs. 83, respectively), and 88 points (14.7 h) detected 90% of the estimated species richness, whereas forty-one lists were insufficient to detect 90% of all species. Lists, however, accumulated species faster in a shorter time period, probably because of the nature of the point count method, in which species detected while moving between points are not counted. Rapid assessment programs in these forests will therefore detect more species more rapidly using 10-species lists. The two methods shared 63% of all forest species, but this may be due to spatial and temporal mismatches between the samplings of each method.
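The species-accumulation comparison underlying the ">90% of the estimated empirical value" criterion can be sketched generically: given survey samples (point counts or 10-species lists) represented as sets of species, average the cumulative species count over random orderings of the samples. The sample data and the number of randomisations below are illustrative.

```python
import random

def accumulation_curve(samples, reps=200, seed=1):
    """Mean species accumulation curve: average cumulative species
    richness over random orderings of the survey samples."""
    rng = random.Random(seed)
    n = len(samples)
    totals = [0.0] * n
    for _ in range(reps):
        order = samples[:]
        rng.shuffle(order)                 # remove ordering bias
        seen = set()
        for i, s in enumerate(order):
            seen.update(s)
            totals[i] += len(seen)
    return [t / reps for t in totals]
```

Comparing the curves of the two methods against 90% of an estimated total richness shows which method crosses the threshold with less survey effort.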

Relevance: 90.00%

Publisher:

Abstract:

Climate monitoring requires an operational, spatio-temporal analysis of climate variability. With the aim of producing ready-to-use maps at regular intervals, it is helpful to display at a glance the spatial variability of the climate elements and their changes over time. For current and recent years, the Deutscher Wetterdienst (German Weather Service) developed a standard procedure for producing such maps. The method of producing these maps varies for the different climate elements, depending on the data basis, the natural variability and the availability of in-situ data.
As part of the analysis of spatio-temporal variability in this dissertation, various interpolation methods are applied to the mean temperature of the five decades of the years 1951-2000 for a relatively large area, Region VI of the World Meteorological Organization (Europe and the Middle East). The region covers a climatologically rather heterogeneous study area, from Greenland in the northwest to Syria in the southeast.
The central goal of the dissertation is to develop a method for the spatial interpolation of the mean decadal temperature values for Region VI. This method should be suitable in future for operational monthly climate map production, be transferable to other climate elements, and be applicable anywhere given the appropriate software. Two central databases are used in this dissertation: so-called CLIMAT data over land and ship data over the sea.
Essentially, the transfer of the point temperature values to the area by spatial interpolation is carried out in three steps. The first step involves a multiple regression that reduces the station values to a common level using the four predictors geographic latitude, elevation above sea level, annual temperature amplitude and thermal continentality.
In the second step, the reduced temperature values, so-called residuals, are interpolated with the radial basis function interpolation method from the family of neural network models (NNM). In the last step, the interpolated temperature grids are scaled back to their original level by inverting the multiple regression of step one, again using the four predictors. For all station values, the difference between the value estimated by the interpolation and the true measured value is calculated and expressed by the geostatistical measure of the root mean square error (RMSE). The central advantages are the faithful reproduction of the measured values, the absence of generalization, and the avoidance of interpolation islands. The developed procedure is transferable to other climate elements such as precipitation, snow depth or sunshine duration.
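The three-step scheme can be sketched as below. This is a minimal illustration, not the Deutscher Wetterdienst implementation: it uses a plain Gaussian radial basis function with an arbitrary shape parameter `eps`, ordinary least squares for the multiple regression, and assumes station coordinates, the four predictor values and the target grid are supplied as NumPy arrays.

```python
import numpy as np

def detrended_rbf(coords, preds, temps, grid_coords, grid_preds, eps=1.0):
    """Step 1: regress temperatures on the predictors (latitude, elevation,
    annual amplitude, continentality). Step 2: Gaussian-RBF interpolation
    of the residuals. Step 3: add the regression part back on the grid."""
    X = np.column_stack([np.ones(len(preds)), preds])
    beta, *_ = np.linalg.lstsq(X, temps, rcond=None)   # step 1
    resid = temps - X @ beta
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.linalg.solve(np.exp(-(eps * d) ** 2), resid)  # step 2: RBF weights
    dg = np.linalg.norm(grid_coords[:, None, :] - coords[None, :, :], axis=-1)
    resid_g = np.exp(-(eps * dg) ** 2) @ w
    Xg = np.column_stack([np.ones(len(grid_preds)), grid_preds])
    return Xg @ beta + resid_g                           # step 3
```

At the station locations themselves the scheme reproduces the measured values exactly, which matches the faithful value reproduction claimed for the method above.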

Relevance: 90.00%

Publisher:

Abstract:

The present study was conducted to model the spatial distribution of maize head smut, caused by Sporisorium reilianum, in the State of Mexico during 2006, and to visualize it through the generation of density maps. Sampling was carried out in 100 georeferenced plots in each locality analyzed. Disease incidence (percentage of diseased plants) was determined by establishing five points per plot, counting 100 plants at each point. A geostatistical analysis was performed to estimate the experimental semivariogram which, once obtained, was fitted to a theoretical model (spherical, exponential or Gaussian) using the program Variowin 2.2; the fit was validated through cross-validation. Subsequently, disease aggregation maps were produced with the geostatistical interpolation method known as kriging. The results indicated that the disease occurred in 20 localities of 19 municipalities of the State of Mexico; all localities showed spatially aggregated behavior of the disease, with 16 localities fitting the spherical model, two the exponential model and two the Gaussian model. For all models, aggregation maps were produced that will make it possible to target management actions at specific points or sites.
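The first step of such a geostatistical analysis, the experimental semivariogram, can be sketched as follows; the lag bins are illustrative, and fitting the spherical/exponential/Gaussian model and the kriging itself are not shown.

```python
import numpy as np

def experimental_semivariogram(xy, z, bins):
    """Experimental semivariogram: gamma(h) = mean of 0.5*(z_i - z_j)^2
    over point pairs whose separation distance falls in each lag bin."""
    n = len(z)
    i, j = np.triu_indices(n, k=1)            # all unordered point pairs
    h = np.linalg.norm(xy[i] - xy[j], axis=1) # pair separation distances
    sq = 0.5 * (z[i] - z[j]) ** 2             # semivariance contributions
    idx = np.digitize(h, bins)
    return np.array([sq[idx == b].mean() if np.any(idx == b) else np.nan
                     for b in range(1, len(bins))])
```

The resulting gamma-versus-lag curve is what is then fitted to a theoretical model and validated by cross-validation before kriging.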

Relevance: 90.00%

Publisher:

Abstract:

In 2014, UniDive (The University of Queensland Underwater Club) conducted an ecological assessment of the Point Lookout Dive sites for comparison with similar surveys conducted in 2001. Involvement in the project was voluntary. Members of UniDive who were marine experts conducted training for other club members who had no, or limited, experience in identifying marine organisms and mapping habitats. Since the 2001 detailed baseline study, no similar seasonal survey has been conducted. The 2014 data is particularly important given that numerous changes have taken place in relation to the management of, and potential impacts on, these reef sites. In 2009, Moreton Bay Marine Park was re-zoned, and Flat Rock was converted to a marine national park zone (Green zone) with no fishing or anchoring. In 2012, four permanent moorings were installed at Flat Rock. Additionally, the entire area was exposed to the potential effects of the 2011 and 2013 Queensland floods, including flood plumes which carried large quantities of sediment into Moreton Bay and surrounding waters. The population of South East Queensland has increased from 2.49 million in 2001 to 3.18 million in 2011 (BITRE, 2013). This rapidly expanding coastal population has increased the frequency and intensity of both commercial and recreational activities around Point Lookout dive sites (EPA 2008). Methodology used for the PLEA project was based on the 2001 survey protocols, Reef Check Australia protocols and Coral Watch methods. This hybrid methodology was used to monitor substrate and benthos, invertebrates, fish, and reef health impacts. Additional analyses were conducted with georeferenced photo transects. The PLEA marine surveys were conducted over six weekends in 2014 totaling 535 dives and 376 hours underwater. Two training weekends (February and March) were attended by 44 divers, whilst biological surveys were conducted on seasonal weekends (February, May, July and October). 
Three reefs were surveyed, with two semi-permanent transects at Flat Rock, two at Shag Rock, and one at Manta Ray Bommie. Each transect was sampled once every survey weekend, with the transect tapes deployed at a depth of 10 m below chart datum. Fish populations were assessed using a visual census along 3 x 20 m transects; each transect was 5 m wide (2.5 m either side of the transect tape), 5 m high and 20 m in length. Fish families and species were chosen that are commonly targeted by recreational or commercial fishers, or targeted by aquarium collectors, and that are easily identified by their body shape. Rare or otherwise unusual species were also recorded. Target invertebrate populations were assessed using a visual census along 3 x 20 m transects; each transect was 5 m wide (2.5 m either side of the transect tape) and 20 m in length, with the diver conducting a 'U-shaped' search pattern covering 2.5 m on either side of the transect tape. Target impacts were assessed using a visual census along the same 3 x 20 m transects, surveyed via the same 'U-shaped' search pattern. Substrate surveys were conducted using the point sampling method, enabling percentage cover of substrate types and benthic organisms to be calculated. The substrate or benthos under the transect line was identified at 0.5 m intervals, with a 5 m gap between each of the three 20 m segments. Categories recorded included various growth forms of hard and soft coral, key species/growth forms of algae, other living organisms (e.g. sponges), recently killed coral, and non-living substrate types (e.g. bare rock, sand, rubble, silt/clay).
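The percentage-cover calculation from the point sampling method reduces to counting category hits along the tape; the category codes below (e.g. 'HC' for hard coral) are illustrative, not the survey's actual coding scheme.

```python
from collections import Counter

def percent_cover(records):
    """Percent cover from point-intercept sampling: the fraction of
    0.5 m sample points at which each category was recorded."""
    counts = Counter(records)
    total = sum(counts.values())
    return {cat: 100.0 * n / total for cat, n in counts.items()}
```

With three 20 m segments sampled every 0.5 m, each transect contributes about 120 points, so each point represents roughly 0.8% cover.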