956 results for Scanline sampling technique
Abstract:
A late Quaternary pollen record from northern Sakhalin Island (51.34°N, 142.14°E, 15 m a.s.l.) spanning the last 43.7 ka was used to reconstruct regional climate dynamics and vegetation distribution by using the modern analogue technique (MAT). The long-term trends of the reconstructed mean annual temperature (TANN) and precipitation (PANN), and total tree cover are generally in line with key palaeoclimate records from the North Atlantic region and the Asian monsoon domain. TANN largely follows the fluctuations in summer solar insolation at 55°N. During Marine Isotope Stage (MIS) 3, TANN and PANN were on average 0.2 °C and 700 mm, respectively, thus very similar to late Holocene/modern conditions. Full glacial climate deterioration (TANN = -3.3 °C, PANN = 550 mm) was relatively weak as suggested by the MAT-inferred average climate parameters and tree cover densities. However, error ranges of the climate reconstructions during this interval are relatively large and the last glacial environments in northern Sakhalin could be much colder and drier than suggested by the weighted average values. An anti-phase relationship between mean temperature of the coldest (MTCO) and warmest (MTWA) month is documented during the last glacial period, i.e. MIS 2 and 3, suggesting a more continental climate due to sea levels that were lower than present. The warmest and wettest climate conditions have prevailed since the end of the last glaciation with an optimum (TANN = 1.5 °C, PANN = 800 mm) in the middle Holocene interval (ca 8.7-5.2 cal. ka BP). This lags behind the solar insolation peak during the early Holocene. We propose that this is due to continuous Holocene sea level transgression and regional influence of the Tsushima Warm Current, which reached maximum intensity during the middle Holocene. Several short-term climate oscillations are suggested by our reconstruction results and correspond to Northern Hemisphere Heinrich and Dansgaard-Oeschger events, the Bølling-Allerød and the Younger Dryas. The most prominent fluctuation is registered during the Heinrich 4 event, which is marked by noticeably colder and drier conditions and the spread of herbaceous taxa.
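For readers unfamiliar with the modern analogue technique, the sketch below illustrates the basic idea under simple assumptions: a squared-chord dissimilarity between pollen spectra and an inverse-distance-weighted mean of the k closest modern analogues. The function names, the choice of k, and the toy data are illustrative, not those used in the study.

```python
import numpy as np

def squared_chord_distance(sample, analogues):
    """Dissimilarity between one fossil pollen spectrum and each modern spectrum.

    Spectra are expressed as proportions summing to 1 along the last axis.
    """
    return np.sum((np.sqrt(sample) - np.sqrt(analogues)) ** 2, axis=1)

def mat_reconstruct(fossil_spectrum, modern_spectra, modern_climate, k=5):
    """Weighted-average climate of the k closest modern analogues.

    modern_spectra : (n_sites, n_taxa) pollen proportions
    modern_climate : (n_sites,) climate parameter (e.g. TANN or PANN)
    """
    d = squared_chord_distance(fossil_spectrum, modern_spectra)
    idx = np.argsort(d)[:k]                      # k best analogues
    w = 1.0 / np.maximum(d[idx], 1e-12)          # inverse-distance weights
    return np.sum(w * modern_climate[idx]) / np.sum(w)

# Toy example: 3 pollen taxa, 6 modern calibration sites (invented values).
rng = np.random.default_rng(0)
modern = rng.dirichlet(np.ones(3), size=6)
tann = np.array([-3.0, -1.5, 0.2, 1.0, 1.5, 2.3])   # hypothetical TANN (deg C)
fossil = rng.dirichlet(np.ones(3))
print(mat_reconstruct(fossil, modern, tann, k=3))
```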
Abstract:
The aim of this work is to solve a question raised for average sampling in shift-invariant spaces by using the well-known matrix pencil theory. In many common situations in sampling theory, the available data are samples of some convolution operator acting on the function itself: this leads to the problem of average sampling, also known as generalized sampling. In this paper we deal with the existence of a sampling formula involving these samples and having reconstruction functions with compact support. Thus, low computational complexity is involved and truncation errors are avoided. In practice, it is accomplished by means of a FIR filter bank. An answer is given in the light of the generalized sampling theory by using the oversampling technique: more samples than strictly necessary are used. The original problem reduces to finding a polynomial left inverse of a polynomial matrix intimately related to the sampling problem which, for a suitable choice of the sampling period, becomes a matrix pencil. This matrix pencil approach allows us to obtain a practical method for computing the compactly supported reconstruction functions for the important case where the oversampling rate is minimum. Moreover, the optimality of the obtained solution is established.
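The following numerical sketch is not the paper's matrix-pencil construction, but it illustrates the underlying requirement: with oversampling, a tall polynomial matrix G(z) = G0 + G1 z^-1 (a pencil) generically admits a compactly supported (FIR) left inverse H(z) with H(z)G(z) = I, whose taps can be found by solving the block convolution equations. The sizes r, s and the filter length L are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

r, s = 2, 3          # r channels to reconstruct, s > r sample streams (oversampling)
G = [rng.standard_normal((s, r)) for _ in range(2)]   # G(z) = G[0] + G[1] z^-1
L = 3                # number of FIR taps sought for the left inverse H(z)

# H(z) G(z) = I  <=>  sum_k H_k G_{m-k} = delta_{m,0} I  for m = 0..L.
# Stack the unknown taps as X = [H_0 ... H_{L-1}] and write X @ T = C.
T = np.zeros((L * s, (L + 1) * r))
for k in range(L):
    for j, Gj in enumerate(G):               # Gj contributes at output lag m = k + j
        m = k + j
        T[k * s:(k + 1) * s, m * r:(m + 1) * r] = Gj
C = np.zeros((r, (L + 1) * r))
C[:, :r] = np.eye(r)                         # target: identity at lag 0, zero elsewhere

# Least-squares solve; for generic coefficients and enough oversampling
# (L * s >= (L + 1) * r) the equations are satisfied exactly.
X, *_ = np.linalg.lstsq(T.T, C.T, rcond=None)
X = X.T
H = [X[:, k * s:(k + 1) * s] for k in range(L)]   # FIR reconstruction taps H_0..H_{L-1}

print("max deviation from H(z)G(z) = I:", np.max(np.abs(X @ T - C)))
```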
Abstract:
The penalty corner is one of the most important goal plays in field hockey. The drag-flick is used less by women than men in a penalty corner. The aim of this study was to describe training-induced changes in the drag-flick technique in female field hockey players. Four female players participated in the study. The VICON optoelectronic system (Oxford Metrics, Oxford, UK) measured the kinematic parameters of the drag-flick with six cameras sampling at 250 Hz, prior to and after training. Fifteen shots were captured for each subject. A Wilcoxon test assessed the differences between pre-training and post-training parameters. Two players received specific training twice a week for 8 weeks; the other two players did not train. The proposed drills improved the position of the stick at the beginning of the shot (p<0.05), the total distance of the shot (p<0.05) and the rotation radius at ball release (p<0.01). It was noted that all players had lost speed in the preceding run-up. Further studies should include a larger sample, in order to provide more information on field hockey performance.
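As a concrete illustration of the statistical comparison described above, the snippet below applies a Wilcoxon signed-rank test to paired pre- and post-training values of one kinematic parameter; the numbers are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-shot values of one parameter (e.g. rotation radius at ball
# release, in m) for one player: 15 shots before and 15 after training.
pre = np.array([0.91, 0.88, 0.93, 0.90, 0.87, 0.92, 0.89, 0.94,
                0.90, 0.88, 0.91, 0.93, 0.89, 0.92, 0.90])
post = np.array([0.97, 0.95, 0.99, 0.96, 0.94, 0.98, 0.96, 1.00,
                 0.97, 0.95, 0.98, 0.99, 0.96, 0.98, 0.97])

stat, p = wilcoxon(pre, post)       # paired, non-parametric comparison
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```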
Abstract:
The Nakagami-m distribution is widely used for the simulation of fading channels in wireless communications. A novel, simple and extremely efficient acceptance-rejection algorithm is introduced for the generation of independent Nakagami-m random variables. The proposed method uses another Nakagami density with a half-integer value of the fading parameter, mp = n/2 ≤ m, as proposal function, from which samples can be drawn exactly and easily. This novel rejection technique is able to work with arbitrary values of m ≥ 1 and average path energy, V, and provides a higher acceptance rate than all currently available methods. SUMMARY: An extremely efficient method for generating Nakagami random variables (used to model fading in mobile communication channels), based on rejection sampling.
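The scheme described above can be sketched as follows: the half-integer proposal mp = n/2 is sampled exactly as the root of a scaled sum of n squared standard normals, and candidates are accepted with a probability that depends only on delta = m - mp. This is a reconstruction of the idea from the abstract, not the authors' reference code; omega is used here for the average path energy (V in the text).

```python
import numpy as np

def nakagami_half_integer(n, omega, size, rng):
    """Exact Nakagami(m = n/2, omega) samples via n squared standard normals."""
    z = rng.standard_normal((size, n))
    return np.sqrt(omega * np.sum(z * z, axis=1) / n)

def nakagami_rejection(m, omega, size, rng=None):
    """Nakagami(m, omega) samples for arbitrary m >= 1 by acceptance-rejection.

    Proposal: Nakagami with the largest half-integer mp = floor(2m)/2 <= m.
    The acceptance probability for a candidate x reduces to
        (x^2/omega)^delta * exp(delta * (1 - x^2/omega)),  delta = m - mp.
    """
    rng = rng or np.random.default_rng()
    n = int(np.floor(2.0 * m))        # proposal shape mp = n/2
    delta = m - n / 2.0
    out = np.empty(0)
    while out.size < size:
        x = nakagami_half_integer(n, omega, 2 * (size - out.size), rng)
        u = rng.uniform(size=x.size)
        t = x * x / omega
        accept = u <= t**delta * np.exp(delta * (1.0 - t))
        out = np.concatenate([out, x[accept]])
    return out[:size]

rng = np.random.default_rng(7)
samples = nakagami_rejection(m=2.3, omega=1.0, size=100_000, rng=rng)
print("mean power (should be close to omega):", np.mean(samples**2))
```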
Abstract:
Correlations in low-frequency atomic displacements predicted by molecular dynamics simulations on the order of 1 ns are undersampled for the time scales currently accessible by the technique. This is shown with three different representations of the fluctuations in a macromolecule: the reciprocal space of crystallography using diffuse x-ray scattering data, real three-dimensional Cartesian space using covariance matrices of the atomic displacements, and the 3N-dimensional configuration space of the protein using dimensionally reduced projections to visualize the extent to which phase space is sampled.
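To make the second and third representations concrete, the fragment below computes a covariance matrix of atomic displacements from a synthetic trajectory and projects the frames onto the two largest principal modes, the usual way such dimensionally reduced views of the 3N-dimensional configuration space are produced. The array shapes and trajectory are assumptions for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical trajectory: n_frames snapshots of N atoms in Cartesian space,
# assumed already aligned to a reference to remove rigid-body motion.
rng = np.random.default_rng(0)
n_frames, n_atoms = 2000, 50
traj = rng.standard_normal((n_frames, n_atoms, 3)).cumsum(axis=0) * 0.01

x = traj.reshape(n_frames, 3 * n_atoms)        # 3N-dimensional configuration space
disp = x - x.mean(axis=0)                      # atomic displacements about the mean
cov = disp.T @ disp / (n_frames - 1)           # (3N x 3N) covariance matrix

# Principal modes: eigenvectors of the covariance matrix; project each frame
# onto the two largest modes to visualise how much of phase space is sampled.
evals, evecs = np.linalg.eigh(cov)
proj = disp @ evecs[:, -2:]                    # (n_frames, 2) reduced projection
print("fraction of variance in top 2 modes:", evals[-2:].sum() / evals.sum())
```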
Abstract:
Through the processes of the biological pump, carbon is exported to the deep ocean in the form of dissolved and particulate organic matter. There are several ways by which downward export fluxes can be estimated. The great attraction of the 234Th technique is that its fundamental operation allows a downward flux rate to be determined from a single water column profile of thorium coupled to an estimate of the POC/234Th ratio in sinking matter. We present a database of 723 estimates of organic carbon export from the surface ocean derived from the 234Th technique. Data were collected only from tables in papers published between 1985 and 2013. We also present sampling dates, publication dates and sampling areas. Most of the open ocean Longhurst provinces are represented by several measurements. However, the Western Pacific, the Atlantic Arctic, the South Pacific and the South Indian Ocean are not well represented. Integration depths vary from the surface down to 220 m. Globally, the fluxes ranged from -22 to 125 mmol C/m²/d. We believe that this database is important for providing a new global estimate of the magnitude of the biological carbon pump.
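In its simplest single-profile, steady-state form, the 234Th method integrates the 234Th deficit relative to 238U over the export depth and multiplies the resulting thorium flux by the POC/234Th ratio of sinking particles. The sketch below uses invented profile values purely to show the arithmetic; it ignores upwelling and non-steady-state terms.

```python
import numpy as np

LAMBDA_234TH = np.log(2) / 24.1          # 234Th decay constant, 1/day

# Hypothetical water-column profile (depth in m, activities in dpm/L).
depth = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
u238  = np.array([2.4, 2.4, 2.4, 2.4, 2.4, 2.4])    # 238U activity
th234 = np.array([1.6, 1.7, 1.9, 2.1, 2.3, 2.4])    # 234Th activity (surface deficit)

# Steady-state 234Th flux at 100 m: decay constant times the depth-integrated deficit.
deficit = (u238 - th234) * 1000.0                    # dpm/L -> dpm/m^3
integral = np.sum(0.5 * (deficit[:-1] + deficit[1:]) * np.diff(depth))  # trapezoid rule
th_flux = LAMBDA_234TH * integral                    # dpm m^-2 d^-1

poc_to_th = 5.0                                      # µmol C per dpm in sinking particles (assumed)
poc_flux = th_flux * poc_to_th / 1000.0              # mmol C m^-2 d^-1
print(f"234Th flux = {th_flux:.0f} dpm/m²/d, POC export ≈ {poc_flux:.1f} mmol C/m²/d")
```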
Abstract:
Blood sampling is an essential technique in many herpetological studies. This paper describes a quick and humane technique to collect blood samples from three species of Australian chelid turtles (Order Pleurodira): Chelodina expansa, Elseya latisternum, and Emydura macquarii signata.
Abstract:
Radar target identification based on complex natural resonances is sometimes achieved by convolving a linear time-domain filter with a received target signature. The filter is constructed from measured or pre-calculated target resonances. The performance of the target identification procedure is degraded if the difference between the sampling rates of the target signature and the filter is ignored. The problem is investigated for the natural extinction pulse technique (E-pulse) for the case of identifying stick models of aircraft.
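A minimal illustration of the sampling-rate issue follows: if the E-pulse filter was synthesised at one rate and the target signature recorded at another, the filter can be resampled to the signature's rate before the discrete convolution. The rates and waveforms below are synthetic stand-ins chosen for illustration, not measured aircraft signatures or a real E-pulse.

```python
import numpy as np
from scipy.signal import resample_poly

fs_sig, fs_filt = 20e9, 8e9            # signature and E-pulse sampling rates (Hz), assumed
t_sig = np.arange(0, 20e-9, 1 / fs_sig)
t_flt = np.arange(0, 10e-9, 1 / fs_filt)

# Synthetic late-time target signature: two damped natural resonances.
signature = (np.exp(-2e8 * t_sig) * np.cos(2 * np.pi * 0.9e9 * t_sig)
             + 0.5 * np.exp(-3e8 * t_sig) * np.cos(2 * np.pi * 1.7e9 * t_sig))
# Placeholder discrimination filter built at the lower rate.
e_pulse = np.exp(-4e8 * t_flt) * np.cos(2 * np.pi * 1.3e9 * t_flt)

# Resample the filter to the signature's rate (20/8 = 5/2) before convolving;
# skipping this step applies the filter at the wrong effective time scale.
e_pulse_matched = resample_poly(e_pulse, up=5, down=2)

late_time = np.convolve(signature, e_pulse_matched, mode="full")
print("discrimination energy:", np.sum(late_time**2))
```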
Abstract:
This thesis is concerned with the study of a non-sequential identification technique, so that it may be applied to the identification of process plant mathematical models from process measurements with the greatest degree of accuracy and reliability. In order to study the accuracy of the technique under differing conditions, simple mathematical models were set up on a parallel hybrid computer and these models were identified from input/output measurements by a small on-line digital computer. Initially, the simulated models were identified on-line. However, this method of operation was found to be unsuitable for a thorough study of the technique due to equipment limitations. Further analysis was carried out in a large off-line computer using data generated by the small on-line computer. Hence identification was not strictly on-line. Results of the work have shown that the identification technique may be successfully applied in practice. An optimum sampling period is suggested, together with noise level limitations for maximum accuracy. A description of a double-effect evaporator is included in this thesis. It is proposed that the next stage in the work will be the identification of a mathematical model of this evaporator using the technique described.
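The thesis's specific non-sequential technique is not detailed in the abstract, so the sketch below shows only a generic batch (i.e. non-sequential) least-squares fit of a simple difference-equation model from sampled input/output records, with an assumed sampling period Ts; it is a stand-in illustration, not the method studied in the thesis.

```python
import numpy as np

def identify_arx(u, y, na=1, nb=1):
    """Batch least-squares fit of y[k] = sum_i a_i*y[k-i] + sum_j b_j*u[k-j]."""
    n = max(na, nb)
    phi = []
    for k in range(n, len(y)):
        phi.append(np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]]))
    theta, *_ = np.linalg.lstsq(np.array(phi), y[n:], rcond=None)
    return theta[:na], theta[na:]            # output and input coefficients

# Simulate a first-order process sampled with period Ts and identify it back.
rng = np.random.default_rng(0)
Ts, tau, gain = 1.0, 5.0, 2.0
a, b = np.exp(-Ts / tau), gain * (1 - np.exp(-Ts / tau))
u = rng.standard_normal(500)                 # input measurements
y = np.zeros(500)                            # output measurements (with noise)
for k in range(1, 500):
    y[k] = a * y[k - 1] + b * u[k - 1] + 0.01 * rng.standard_normal()

print(identify_arx(u, y))                    # expect roughly (0.82,), (0.36,)
```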
Abstract:
Significant improvements have been made in estimating gross primary production (GPP), ecosystem respiration (R), and net ecosystem production (NEP) from diel, “free-water” changes in dissolved oxygen (DO). Here we evaluate some of the assumptions and uncertainties that are still embedded in the technique and provide guidelines on how to estimate reliable metabolic rates from high-frequency sonde data. True whole-system estimates are often not obtained because measurements reflect an unknown zone of influence which varies over space and time. A minimum logging frequency of 30 min was sufficient to capture metabolism at the daily time scale. Higher sampling frequencies capture additional pattern in the DO data, primarily related to physical mixing. Causes behind the often large daily variability are discussed and evaluated for an oligotrophic and a eutrophic lake. Despite a 3-fold higher day-to-day variability in absolute GPP rates in the eutrophic lake, both lakes required at least 3 sonde days per week for GPP estimates to be within 20% of the weekly average. A sensitivity analysis evaluated uncertainties associated with DO measurements, piston velocity (k), and the assumption that daytime R equals nighttime R. In low productivity lakes, uncertainty in DO measurements and piston velocity strongly impacts R but has no effect on GPP or NEP. Lack of accounting for higher R during the day underestimates R and GPP but has no effect on NEP. We finally provide suggestions for future research to improve the technique.
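For orientation, a heavily simplified version of the free-water bookkeeping can be sketched as follows: per-interval NEP is the observed DO change minus atmospheric exchange, R is taken from night-time intervals and assumed equal during the day, and GPP is daytime NEP plus daytime R. The sonde series, piston velocity k, mixed-layer depth and saturation DO below are invented assumptions, and real applications model gas exchange and mixing far more carefully.

```python
import numpy as np

# Hypothetical 24 h of sonde data logged every 30 min (mg O2/L).
dt_h = 0.5
t = np.arange(0, 24, dt_h)
daylight = (t >= 6) & (t < 18)
rng = np.random.default_rng(2)
do = (8.0 + 0.8 * np.sin((t - 6) / 12 * np.pi) * daylight
      - 0.02 * t + 0.02 * rng.standard_normal(t.size))
do_sat = np.full_like(do, 8.5)           # saturation DO (mg/L), assumed constant
k = 0.5                                  # piston velocity (m/h), assumed
z_mix = 3.0                              # mixed-layer depth (m), assumed

# Net ecosystem production per interval: observed DO change minus atmospheric
# exchange, as a volumetric rate (mg O2 L^-1 h^-1).
flux_atm = k * (do_sat - do) / z_mix                 # gain from atmosphere when undersaturated
nep = np.diff(do) / dt_h - flux_atm[:-1]

night = ~daylight[:-1]
r_hourly = -np.mean(nep[night])                      # respiration, assumed equal day and night
gpp_daily = np.sum((nep[daylight[:-1]] + r_hourly) * dt_h)
r_daily = r_hourly * 24
nep_daily = gpp_daily - r_daily
print(f"GPP = {gpp_daily:.2f}, R = {r_daily:.2f}, NEP = {nep_daily:.2f} (mg O2 L^-1 d^-1)")
```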
Abstract:
The growing need for fast sampling of explosives in high-throughput areas has increased the demand for improved technology for the trace detection of illicit compounds. Detection of the volatiles associated with the presence of illicit compounds offers a different approach for sensitive trace detection of these compounds without increasing the false positive alarm rate. This study evaluated the performance of non-contact sampling and detection systems using statistical analysis, through the construction of Receiver Operating Characteristic (ROC) curves in real-world scenarios, for the detection of volatiles in the headspace of smokeless powder, used as the model system for generalizing explosives detection. A novel sorbent-coated disk, coined planar solid phase microextraction (PSPME), was previously used for rapid, non-contact sampling of the headspace of containers. The limit of detection for PSPME coupled to IMS detection was determined to be 0.5-24 ng for vapor sampling of volatile chemical compounds associated with illicit compounds, and the substrate demonstrated an extraction efficiency three times greater than other commercially available substrates, retaining >50% of the analyte after 30 minutes of sampling of an analyte spike, compared with a non-detect for the unmodified filters. Both static and dynamic PSPME sampling were used, coupled with two ion mobility spectrometer (IMS) detection systems, in which 10-500 mg quantities of smokeless powders were detected within 5-10 minutes of static sampling and 1 minute of dynamic sampling in 1-45 L closed systems, resulting in faster sampling and analysis times in comparison to conventional solid phase microextraction-gas chromatography-mass spectrometry (SPME-GC-MS) analysis. Similar real-world scenarios were sampled in low- and high-clutter environments with zero false positive rates. Excellent PSPME-IMS detection of the volatile analytes was evident from the ROC curves, with areas under the curve (AUC) of 0.85-1.0 and 0.81-1.0 for the portable and bench-top IMS systems, respectively. ROC curves were also constructed for SPME-GC-MS, giving AUCs of 0.95-1.0, comparable with PSPME-IMS detection. The PSPME-IMS technique produces fewer false positive results for non-contact vapor sampling, cutting cost and providing the effective sampling and detection needed in high-throughput scenarios, with performance similar to well-established techniques and the added advantage of fast detection in the field.
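As an illustration of the evaluation approach described above, the snippet below builds a ROC curve and its AUC from detector responses for blank trials versus trials with powder present, and reads off a zero-false-positive alarm threshold. The score distributions are invented and the scikit-learn functions are a generic choice, not the study's analysis code.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(3)
# Hypothetical IMS peak intensities: blank trials vs. trials with smokeless powder present.
blank_scores = rng.normal(loc=10, scale=3, size=60)
powder_scores = rng.normal(loc=18, scale=4, size=60)

y_true = np.concatenate([np.zeros(60), np.ones(60)])      # 0 = blank, 1 = powder
y_score = np.concatenate([blank_scores, powder_scores])

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"AUC = {auc(fpr, tpr):.2f}")

# A deployment threshold can then be read off the curve, e.g. the lowest alarm
# threshold that still gives zero false positives on these trials.
print("lowest zero-false-positive threshold:", thresholds[fpr == 0].min())
```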
Abstract:
As the world's synchrotrons and X-FELs endeavour to meet the need to analyse ever-smaller protein crystals, there grows a requirement for a new technique to present nano-dimensional samples to the beam for X-ray diffraction experiments. The work presented here details developmental work to reconfigure the nano tweezer technology developed by Optofluidics (PA, USA) for the trapping of nano-dimensional protein crystals for X-ray crystallography experiments. The system in its standard configuration is used to trap nanoparticles for optical microscopy. It uses silicon nitride laser waveguides that bridge a microfluidic channel. These waveguides contain 180 nm apertures, enabling the system to use biologically compatible 1.6 micron wavelength laser light to trap nano-dimensional biological samples. Using conventional laser tweezers, the wavelength required to trap such nano-dimensional samples would destroy them. The system in its optical configuration has trapped protein molecules as small as 10 nanometres.