772 results for Distance sampling


Relevance:

20.00%

Publisher:

Abstract:

Following protection measures implemented since the 1970s, large carnivores are currently increasing in number and returning to areas from which they were absent for decades or even centuries. Monitoring programmes for these species rely extensively on non-invasive sampling and genotyping. However, attempts to connect results of such studies at larger spatial or temporal scales often suffer from the incompatibility of genetic markers implemented by researchers in different laboratories. This is particularly critical for long-distance dispersers, revealing the need for harmonized monitoring schemes that would enable the understanding of gene flow and dispersal dynamics. Based on a review of genetic studies on grey wolves Canis lupus from Europe, we provide an overview of the genetic markers currently in use, and identify opportunities and hurdles for studies based on continent-scale datasets. Our results highlight an urgent need for harmonization of methods to enable transnational research based on data that have already been collected, and to allow these data to be linked to material collected in the future. We suggest timely standardization of newly developed genotyping approaches, and propose that action is directed towards the establishment of shared single nucleotide polymorphism panels, next-generation sequencing of microsatellites, a common reference sample collection and an online database for data exchange. Enhanced cooperation among genetic researchers dealing with large carnivores in consortia would facilitate streamlining of methods, their faster and wider adoption, and production of results at the large spatial scales that ultimately matter for the conservation of these charismatic species.

Relevance:

20.00%

Publisher:

Abstract:

Given their central role in mercury (Hg) excretion and their suitability as reservoirs, bird feathers are useful Hg biomonitors. Nevertheless, the interpretation of Hg concentrations is still questioned because feather physiology and the mechanisms affecting Hg deposition remain poorly understood. Given the constraints that feather availability places on ecotoxicological studies, we tested for intra-individual differences in Hg concentration according to feather type (body vs. flight feathers), position in the wing, and size (mass and length), in order to understand how these factors could affect Hg estimates. We measured the Hg concentration of 154 feathers from 28 un-moulted barn owls (Tyto alba) collected dead on roadsides. Median Hg concentration was 0.45 (0.076-4.5) mg kg⁻¹ in body feathers, 0.44 (0.040-4.9) mg kg⁻¹ in primary and 0.60 (0.042-4.7) mg kg⁻¹ in secondary feathers, and we found only a weak effect of feather type on intra-individual Hg levels. We also found a negative effect of wing-feather mass on Hg concentration, but no effect of feather length or of position in the wing. We hypothesize that differences in feather growth rate may be the main driver of between-feather differences in Hg concentration, which has implications for the interpretation of Hg concentrations in feathers. Finally, we recommend that, whenever possible, several feathers from the same individual be analysed. The five innermost primaries show the lowest mean deviations from both the between-feather and the intra-individual mean Hg concentration, and should therefore be selected under restrictive sampling scenarios.

Relevance:

20.00%

Publisher:

Abstract:

This study aimed at comparing the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail with a view to subsequent quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The sampling materials were then evaluated systematically, first by analysing blank extracts of the materials to check for potential interferences, and then by determining matrix effects. Based on these results, the best four materials, namely cotton buds, polyester swabs, a tape from 3M and PTFE, were compared in terms of collection efficiency in shooting experiments using a set of 9 mm Luger ammunition. The tape was found to recover the highest amounts of OGSR. As tape-lifting is the technique currently used routinely for inorganic GSR (IGSR), OGSR analysis might be implemented without modifying the IGSR sampling and analysis procedure.

Relevance:

20.00%

Publisher:

Abstract:

We study the relationship between stable sampling sequences for bandlimited functions in $L^p(\mathbb{R}^n)$ and the Fourier multipliers in $L^p$. When the sequence is a lattice and the spectrum is a fundamental domain for the lattice, the connection is complete; for irregular sequences there is still a partial relationship.
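
For context, here is a minimal statement of the stability condition in question, in standard notation; the Paley-Wiener shorthand $PW^p_S$ for $L^p$ functions with Fourier transform supported in the spectrum $S$ is our assumption, not necessarily the paper's:

```latex
% A sequence \Lambda \subset \mathbb{R}^n is a stable sampling sequence for
% PW^p_S = \{ f \in L^p(\mathbb{R}^n) : \operatorname{supp}\hat{f} \subseteq S \}
% if there exist constants 0 < A \le B < \infty such that
\[
  A\,\|f\|_{L^p(\mathbb{R}^n)}^p
    \;\le\; \sum_{\lambda \in \Lambda} |f(\lambda)|^p
    \;\le\; B\,\|f\|_{L^p(\mathbb{R}^n)}^p
  \qquad \text{for all } f \in PW^p_S .
\]
```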

Relevance:

20.00%

Publisher:

Abstract:

This thesis deals with distance transforms, which are a fundamental tool in image processing and computer vision. Two new distance transforms for gray level images are presented, and as a new application they are applied to gray level image compression. Both new distance transforms extend the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer values (the DTOCS) and a real-valued distance transform (the EDTOCS) on gray level images. Both the DTOCS and the EDTOCS require only two passes over the gray level image and are extremely simple to implement. Only two image buffers are needed: the original gray level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance algorithms (GRAYMAT, etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map in which the weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray level images is shown. The EDTOCS differs from the DTOCS in that it calculates these gray level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally; it is shown to be independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
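
A minimal sketch of the two-pass scheme described above, assuming the commonly cited DTOCS local distance (one chessboard step plus the absolute gray-level difference between neighbouring pixels); the function name and interface are illustrative, not taken from the thesis:

```python
import numpy as np

def dtocs(gray, roi, n_iter=3):
    """Two-pass chamfer-style DTOCS sketch.

    gray : 2-D array of gray levels.
    roi  : boolean mask; True marks pixels whose distance is computed,
           False marks the zero-distance reference set.
    """
    dist = np.where(roi, np.inf, 0.0)
    h, w = gray.shape
    # Forward-pass neighbours (upper/left half of the 3x3 kernel) and mirrors.
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]
    bwd = [(-dy, -dx) for dy, dx in fwd]
    for _ in range(n_iter):  # extra iteration rounds for complicated images
        for offsets, ys in ((fwd, range(h)), (bwd, range(h - 1, -1, -1))):
            for y in ys:
                xs = range(w) if offsets is fwd else range(w - 1, -1, -1)
                for x in xs:
                    if not roi[y, x]:
                        continue
                    for dy, dx in offsets:
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            # one chessboard step (+1) weighted by the
                            # gray-level difference to the neighbour
                            cand = dist[ny, nx] + 1 + abs(
                                float(gray[y, x]) - float(gray[ny, nx]))
                            if cand < dist[y, x]:
                                dist[y, x] = cand
    return dist
```

The forward pass propagates distances from the top-left, the backward pass from the bottom-right; repeating the pair of passes corresponds to the extra iteration rounds mentioned in the abstract.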

Relevance:

20.00%

Publisher:

Abstract:

The main purpose of the present work is to study the concentration of atmospheric particles (PM10 and PM2.5) in the Candiota (RS) region using HV PM10 and dichotomous samplers. Four sampling sites at a distance of 50 km from the emission source were selected: Aceguá, Aeroporto, 8 de Agosto and Pedras Altas. Samples were collected from December 2000 to December 2001. The values obtained with the ISCST (Industrial Source Complex Short Term) model were compared with those from the samplers on January 21st, April 5th, July 14th, August 1st, and October 13th, 2001; these dates are representative of frontal systems occurring in the study area.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present view-dependent, information-theoretic quality measures for pixel sampling and scene discretization in flatland. The measures are based on a definition of the mutual information of a line and have a purely geometrical basis. Several algorithms exploiting them are presented and compare well with an existing algorithm based on depth differences.
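
For reference, the line-based measure builds on Shannon mutual information; the standard discrete form is shown below (how the paper attaches distributions to a line is its own contribution and is not reproduced here):

```latex
% Shannon mutual information between discrete random variables X and Y:
\[
  I(X;Y) \;=\; \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} ,
\]
% which is zero iff X and Y are independent and grows with their dependence.
```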

Relevance:

20.00%

Publisher:

Abstract:

In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star-discrepancy as a measure of sampling quality and develop new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
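
As a concrete illustration of the proposed quality criterion, here is a brute-force star-discrepancy estimate for a 2-D point set in the unit square; it sketches the measure only, not the authors' line-based sampling methods, and all names are ours:

```python
import numpy as np

def star_discrepancy_2d(points):
    """Brute-force star-discrepancy estimate for points in [0,1]^2.

    Checks boxes [0,x) x [0,y) anchored at the origin whose upper corners
    run over the coordinate grid induced by the sample itself, where the
    supremum is attained up to the usual open/closed boundary subtlety.
    O(n^3); fine for illustration, not for large n.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    xs = np.unique(np.append(pts[:, 0], 1.0))
    ys = np.unique(np.append(pts[:, 1], 1.0))
    worst = 0.0
    for x in xs:
        for y in ys:
            # fraction of points inside the box minus its area
            inside = np.sum((pts[:, 0] < x) & (pts[:, 1] < y)) / n
            worst = max(worst, abs(inside - x * y))
    return worst

# An evenly spread lattice scores lower (better) than pure random points.
rng = np.random.default_rng(0)
g = (np.arange(16) + 0.5) / 16
lattice = np.array([(x, y) for x in g for y in g])
random_pts = rng.random((256, 2))
print(star_discrepancy_2d(lattice), star_discrepancy_2d(random_pts))
```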

Relevance:

20.00%

Publisher:

Abstract:

The uncertainty of any analytical determination depends on both the analysis and the sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy’s sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work Gy’s sampling theory was applied to several cases, including the analysis of chromite concentration estimated on SEM (Scanning Electron Microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy’s sampling theory can be utilized in both of the above-mentioned cases and that the uncertainties achieved are reliable. Variographic experiments, introduced in Gy’s sampling theory, can be applied beneficially to analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of these kinds of processes can be observed by variographic analysis, as well as with fast Fourier transformation and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, because it is easy to estimate how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources will be used; if it is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as Principal Component Analysis (PCA) are needed when the data are analyzed. Optimization of a sampling plan increases the reliability of the analytical process, which might in the end have beneficial effects on the economics of chemical analysis.
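
As a sketch of the variographic analysis described here, the experimental variogram of an equally spaced process series can be computed as follows; Gy's relative variogram additionally normalizes by the mean (omitted here), and all names are illustrative:

```python
import numpy as np

def experimental_variogram(x, max_lag):
    """Experimental variogram of an equally spaced series.

    V(j) = 1 / (2 (N - j)) * sum_i (x[i+j] - x[i])^2,  j = 1..max_lag.
    In variographic analysis, the uncertainty of systematic sampling at
    interval j grows with V(j), so plotting V against j shows directly
    how widening the sampling interval affects the overall uncertainty.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    lags = np.arange(1, max_lag + 1)
    v = np.array([np.sum((x[j:] - x[:-j]) ** 2) / (2.0 * (n - j))
                  for j in lags])
    return lags, v

# Illustration with a synthetic auto-correlated, periodic "process" series:
rng = np.random.default_rng(1)
series = 0.1 * np.cumsum(rng.normal(size=500)) + np.sin(np.arange(500) / 10.0)
lags, v = experimental_variogram(series, max_lag=50)
# Periodicity in the process shows up as oscillation in v; a rising v means
# that a longer sampling interval increases the sampling uncertainty.
```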