969 results for Sampling method


Relevance: 70.00%

Abstract:

In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star-discrepancy as a measure for sampling quality and introduce new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
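
The paper itself gives no code; as a rough illustration of the quality measure it proposes, the following sketch (Python, with an illustrative function name and grid resolution) estimates the star-discrepancy of a 2-D point set in the unit square by brute force over a grid of anchored boxes.

    import numpy as np

    def star_discrepancy_estimate(points, resolution=64):
        """Brute-force estimate of the star-discrepancy of points in [0, 1]^2.

        For each axis-aligned box [0, a) x [0, b) with corners on a grid, the
        fraction of points inside is compared with the box area; the largest
        absolute difference approximates the star-discrepancy. A coarse grid
        only gives a lower bound, hence "estimate".
        """
        points = np.asarray(points)
        corners = np.linspace(0.0, 1.0, resolution + 1)[1:]
        n = len(points)
        worst = 0.0
        for a in corners:
            for b in corners:
                inside = np.sum((points[:, 0] < a) & (points[:, 1] < b))
                worst = max(worst, abs(inside / n - a * b))
        return worst

    rng = np.random.default_rng(0)
    print(star_discrepancy_estimate(rng.random((1024, 2))))   # plain random points

Evenly distributed point sets give a smaller value than clustered ones, which is the sense in which the abstract uses the measure.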

Relevance: 70.00%

Abstract:

In population sampling it is vitally important to clarify and distinguish: first, the design or sampling method used to address the research problem; second, the sample size, taking into account its components (precision, reliability, variance); third, the random selection procedure; and fourth, the precision estimate (sampling errors), so as to determine whether the estimates obtained can be extrapolated to the target population. The main difficulty in applying concepts from sampling theory lies in understanding them with absolute clarity; to achieve this, didactic-pedagogical strategies arranged as conceptual “mentefactos” (simple hierarchical diagrams organized from propositions) may prove useful. This paper presents, through conceptual “mentefactos”, definitions of the most important probabilistic sampling concepts used to obtain representative samples from populations in health research.
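
As a worked example of how the components mentioned above (precision, reliability, variance) enter the sample-size calculation, the following sketch applies the textbook formula for estimating a proportion with a finite population correction; the specific numbers are illustrative and not taken from the paper.

    import math

    def sample_size_proportion(N, p=0.5, e=0.05, z=1.96):
        """Sample size for estimating a proportion in a population of size N.

        p : anticipated proportion (0.5 is the most conservative choice)
        e : absolute precision (half-width of the confidence interval)
        z : reliability expressed as a normal quantile (1.96 for 95% confidence)
        """
        n0 = (z ** 2) * p * (1 - p) / (e ** 2)     # infinite-population sample size
        return math.ceil(n0 / (1 + (n0 - 1) / N))  # finite population correction

    print(sample_size_proportion(N=10_000))        # -> 370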

Relevance: 70.00%

Abstract:

The goal of the review is to provide a state-of-the-art survey on sampling and probe methods for the solution of inverse problems. Further, a configuration approach to some of the problems will be presented. We study the concepts and analytical results for several recent sampling and probe methods. We give an introduction to the basic idea behind each method using a simple model problem and then provide a general formulation in terms of particular configurations to study the range of arguments used to set up the method. This provides a novel way to present the algorithms and the analytic arguments for their investigation in a variety of different settings. In detail, we investigate the probe method (Ikehata), the linear sampling method (Colton-Kirsch), the factorization method (Kirsch), the singular sources method (Potthast), the no response test (Luke-Potthast), the range test (Kusiak, Potthast and Sylvester) and the enclosure method (Ikehata) for the solution of inverse acoustic and electromagnetic scattering problems. The main ideas, approaches and convergence results of the methods are presented. For each method, we provide a historical survey of applications to different situations.
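
Of the methods listed above, the linear sampling method is the simplest to state algorithmically: for each sampling point z, one solves the regularized far-field equation F g_z = Phi_inf(., z) and plots 1/||g_z||, which is large inside the scatterer. The sketch below assumes a discrete far-field matrix F measured at n angles for n incident directions and a fixed Tikhonov parameter; names and conventions are illustrative, not taken from the review.

    import numpy as np

    def lsm_indicator(F, theta, grid, k, alpha=1e-6):
        """Linear sampling method indicator on a grid of test points (2-D).

        F     : (n, n) discrete far-field operator (rows: observation angles,
                columns: incidence directions)
        theta : (n,) observation angles on the unit circle
        grid  : (m, 2) sampling points z
        k     : wave number
        alpha : Tikhonov regularization parameter (in practice chosen per point,
                e.g. by a discrepancy principle; fixed here for brevity)
        """
        x_hat = np.column_stack((np.cos(theta), np.sin(theta)))
        U, s, Vh = np.linalg.svd(F)
        indicator = np.empty(len(grid))
        for i, z in enumerate(grid):
            # right-hand side: far field of a point source at z (constants dropped,
            # since only relative magnitudes of the indicator matter)
            rhs = np.exp(-1j * k * (x_hat @ z))
            # Tikhonov-regularized solution of F g = rhs via the SVD of F
            g = Vh.conj().T @ ((s / (s ** 2 + alpha)) * (U.conj().T @ rhs))
            indicator[i] = 1.0 / np.linalg.norm(g)     # large inside the object
        return indicator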

Relevance: 70.00%

Abstract:

The soil fauna is often a neglected group in many large-scale studies of farmland biodiversity due to difficulties in extracting organisms efficiently from the soil. This study assesses the relative efficiency of the simple and cheap sampling method of handsorting against Berlese-Tullgren funnel and Winkler apparatus extraction. Soil cores were taken from grassy arable field margins and wheat fields in Cambridgeshire, UK, and the efficiencies of the three methods in assessing the abundances and species densities of soil macroinvertebrates were compared. Handsorting in most cases was as efficient at extracting the majority of the soil macrofauna as the Berlese-Tullgren funnel and Winkler bag methods, although it underestimated the species densities of the woodlice and adult beetles. There were no obvious biases among the three methods for the particular vegetation types sampled and no significant differences in the size distributions of the earthworms and beetles. Proportionally fewer damaged earthworms were recorded in larger (25 x 25 cm) soil cores when compared with smaller ones (15 x 15 cm). Handsorting has many benefits, including targeted extraction, minimum disturbance to the habitat and shorter sampling periods, and may be the most appropriate method for studies of farmland biodiversity when a high number of soil cores need to be sampled.

Relevance: 70.00%

Abstract:

The goal of this paper is to study and further develop the orthogonality sampling or stationary waves algorithm for the detection of the location and shape of objects from the far field pattern of scattered waves in electromagnetics or acoustics. Orthogonality sampling can be seen as a special beam forming algorithm with some links to the point source method and to the linear sampling method. The basic idea of orthogonality sampling is to sample the space under consideration by calculating scalar products of the measured far field pattern with a test function, for all points y in a subset Q of the space R^m, m = 2, 3. The way in which this is carried out is important to extract the information which the scattered fields contain. The theoretical foundation of orthogonality sampling is only partly resolved, and the goal of this work is to initiate further research by numerical demonstration of the high potential of the approach. We implement the method for a two-dimensional setting for the Helmholtz equation, which represents electromagnetic scattering when the setup is independent of the third coordinate. We show reconstructions of the location and shape of objects from measurements of the scattered field for one or several directions of incidence and one or many frequencies or wave numbers, respectively. In particular, we visualize the indicator function both for the Dirichlet and the Neumann boundary condition and for complicated inhomogeneous media.
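
A minimal sketch of the indicator described above, for a single incident wave in two dimensions: the scalar product of the measured far field pattern with the test function exp(i k x_hat . y) is evaluated for every test point y on a grid. Names and the quadrature weight are illustrative; the paper's own notation is not reproduced.

    import numpy as np

    def orthogonality_sampling(u_inf, theta, grid, k):
        """Orthogonality sampling indicator for one incident field (2-D).

        u_inf : (n,) measured far field pattern at the observation angles theta
        theta : (n,) observation angles on the unit circle
        grid  : (m, 2) test points y
        k     : wave number
        """
        x_hat = np.column_stack((np.cos(theta), np.sin(theta)))
        # test functions exp(i k x_hat . y), one row per test point
        phases = np.exp(1j * k * (grid @ x_hat.T))
        # discretized scalar product over the observation angles
        return np.abs(phases @ u_inf.conj()) * (2 * np.pi / len(theta))

For several incident directions or frequencies, the single-wave indicators are usually summed, which is one common way to assemble the multi-frequency reconstructions mentioned in the abstract.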

Relevance: 70.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 70.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 70.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 70.00%

Abstract:

In this study a new, fully non-linear approach to Local Earthquake Tomography is presented. Local Earthquake Tomography (LET) is a non-linear inversion problem that allows the joint determination of earthquake parameters and velocity structure from arrival times of waves generated by local sources. Since the early developments of seismic tomography, several inversion methods have been developed to solve this problem in a linearized way. In the framework of Monte Carlo sampling, we developed a new code based on the Reversible Jump Markov chain Monte Carlo sampling method (RJ-MCMC). It is a trans-dimensional approach in which the number of unknowns, and thus the model parameterization, is treated as one of the unknowns. I show that our new code overcomes major limitations of linearized tomography, opening a new perspective in seismic imaging. Synthetic tests demonstrate that our algorithm is able to produce a robust and reliable tomography without the need to make subjective a-priori assumptions about starting models and parameterization. Moreover, it provides a more accurate estimate of uncertainties in the model parameters. Therefore, it is very suitable for investigating the velocity structure in regions that lack accurate a-priori information. Synthetic tests also reveal that the absence of regularization constraints allows more information to be extracted from the observed data and that the velocity structure can be recovered even in regions where the density of rays is low and standard linearized codes fail. I also present high-resolution Vp and Vp/Vs models of two widely investigated regions: the Parkfield segment of the San Andreas Fault (California, USA) and the area around the Alto Tiberina fault (Umbria-Marche, Italy). In both cases, the models obtained with our code show a substantial improvement in data fit compared with the models obtained from the same data set with linearized inversion codes.
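
A minimal sketch in the spirit of the trans-dimensional approach described above: a toy 1-D piecewise-constant model whose number of Voronoi cells is itself an unknown, sampled with reversible-jump moves. It is not the LET code; data, priors and tuning constants are invented. With a uniform prior on the number of cells and birth values drawn from the prior, the birth/death acceptance probability reduces to the likelihood ratio, which is the simplification used here.

    import numpy as np

    rng = np.random.default_rng(1)
    xd = np.linspace(0.0, 1.0, 100)                        # data locations
    truth = np.where(xd < 0.4, 3.0, np.where(xd < 0.7, 4.5, 3.8))
    data = truth + 0.1 * rng.standard_normal(xd.size)      # synthetic observations
    sigma, vmin, vmax, kmax = 0.1, 2.0, 6.0, 20            # noise level and priors

    def predict(pos, val):
        """Nearest-nucleus (1-D Voronoi) interpolation of the model at xd."""
        return val[np.abs(xd[:, None] - pos[None, :]).argmin(axis=1)]

    def log_like(pos, val):
        r = data - predict(pos, val)
        return -0.5 * np.sum((r / sigma) ** 2)

    pos, val = rng.random(3), rng.uniform(vmin, vmax, 3)   # random starting model
    ll = log_like(pos, val)
    for _ in range(20000):
        move = rng.integers(4)
        if move == 0:                                      # birth: value from the prior
            if len(pos) >= kmax:
                continue                                   # boundary: reject proposal
            p2 = np.append(pos, rng.random())
            v2 = np.append(val, rng.uniform(vmin, vmax))
        elif move == 1:                                    # death: delete a random nucleus
            if len(pos) <= 1:
                continue
            i = rng.integers(len(pos))
            p2, v2 = np.delete(pos, i), np.delete(val, i)
        elif move == 2:                                    # perturb one value
            p2, v2 = pos.copy(), val.copy()
            v2[rng.integers(len(v2))] += 0.2 * rng.standard_normal()
        else:                                              # relocate one nucleus
            p2, v2 = pos.copy(), val.copy()
            p2[rng.integers(len(p2))] = rng.random()
        if np.all((v2 >= vmin) & (v2 <= vmax)):            # uniform prior bounds
            ll2 = log_like(p2, v2)
            if np.log(rng.random()) < ll2 - ll:            # acceptance = likelihood ratio
                pos, val, ll = p2, v2, ll2
    print("cells in final model:", len(pos))

Keeping the ensemble of visited models, rather than only the final one, is what yields the parameterization-free average model and the uncertainty estimates the abstract refers to.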

Relevance: 70.00%

Abstract:

For the detection of hidden objects by low-frequency electromagnetic imaging the Linear Sampling Method works remarkably well despite the fact that the rigorous mathematical justification is still incomplete. In this work, we give an explanation for this good performance by showing that in the low-frequency limit the measurement operator fulfills the assumptions for the fully justified variant of the Linear Sampling Method, the so-called Factorization Method. We also show how the method has to be modified in the physically relevant case of electromagnetic imaging with divergence-free currents. We present numerical results to illustrate our findings, and to show that similar performance can be expected for the case of conducting objects and layered backgrounds.
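
For reference, the range test that underlies the Factorization Method can be written down in a few lines once a discrete far-field matrix is available: a test point z lies inside the object when the Picard series built from the singular system of F stays finite. The sketch below uses the (F*F)^(1/4) form of the criterion with acoustic-style point-source test functions; it is only a generic illustration, not the electromagnetic, divergence-free variant developed in the paper.

    import numpy as np

    def factorization_indicator(F, theta, grid, k):
        """Factorization method indicator from a discrete far-field matrix F (2-D).

        With (sigma_j, v_j) the singular system of F, z is inside the scatterer
        iff sum_j |<phi_z, v_j>|^2 / sigma_j is finite; the reciprocal of the
        truncated series is plotted and is large inside the object.
        """
        x_hat = np.column_stack((np.cos(theta), np.sin(theta)))
        U, s, Vh = np.linalg.svd(F)
        indicator = np.empty(len(grid))
        for i, z in enumerate(grid):
            phi_z = np.exp(-1j * k * (x_hat @ z))   # point-source far field at z
            coeff = np.abs(Vh @ phi_z) ** 2         # |<phi_z, v_j>|^2, constants dropped
            indicator[i] = 1.0 / np.sum(coeff / s)
        return indicator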

Relevance: 70.00%

Abstract:

Despite widespread use of species-area relationships (SARs), dispute remains over the most representative SAR model. Using data of small-scale SARs of Estonian dry grassland communities, we address three questions: (1) Which model describes these SARs best when known artifacts are excluded? (2) How do deviating sampling procedures (marginal instead of central position of the smaller plots in relation to the largest plot; single values instead of average values; randomly located subplots instead of nested subplots) influence the properties of the SARs? (3) Are those effects likely to bias the selection of the best model? Our general dataset consisted of 16 series of nested plots (1 cm²-100 m², any-part system), each of which comprised five series of subplots located in the four corners and the centre of the 100-m² plot. Data for the three pairs of compared sampling designs were generated from this dataset by subsampling. Five function types (power, quadratic power, logarithmic, Michaelis-Menten, Lomolino) were fitted with non-linear regression. In some of the communities, we found extremely high species densities (including bryophytes and lichens), namely up to eight species in 1 cm² and up to 140 species in 100 m², which appear to be the highest documented values on these scales. For SARs constructed from nested-plot average-value data, the regular power function generally was the best model, closely followed by the quadratic power function, while the logarithmic and Michaelis-Menten functions performed poorly throughout. The relative fit of the latter two models increased significantly relative to the respective best model when the single-value or random-sampling method was applied; however, the power function normally remained far superior. These results confirm the hypothesis that both single-value and random-sampling approaches cause artifacts by increasing stochasticity in the data, which can lead to the selection of inappropriate models.
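
The model comparison described above amounts to non-linear least-squares fits of the candidate functions followed by an information-criterion comparison. The sketch below fits the regular power function and two of the other candidates to a made-up species-area data set with scipy; the data, starting values and the use of AIC are illustrative only, and the quadratic power and Lomolino functions can be added in exactly the same way.

    import numpy as np
    from scipy.optimize import curve_fit

    area = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0, 10.0, 100.0])   # m^2, illustrative
    spec = np.array([2.0, 5.0, 11.0, 24.0, 50.0, 88.0, 118.0])    # species counts

    def power(A, c, z):                      # regular power function S = c * A^z
        return c * A ** z

    def logarithmic(A, c, z):                # logarithmic function S = c + z * ln(A)
        return c + z * np.log(A)

    def michaelis_menten(A, smax, b):        # Michaelis-Menten saturation function
        return smax * A / (b + A)

    def aic(obs, pred, n_par):               # least-squares AIC for model comparison
        n = len(obs)
        return n * np.log(np.sum((obs - pred) ** 2) / n) + 2 * n_par

    for f, p0 in [(power, [50.0, 0.3]), (logarithmic, [50.0, 10.0]),
                  (michaelis_menten, [130.0, 1.0])]:
        popt, _ = curve_fit(f, area, spec, p0=p0, maxfev=10000)
        print(f.__name__, "AIC:", round(aic(spec, f(area, *popt), len(popt)), 1))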

Relevance: 70.00%

Abstract:

OBJECTIVES To examine whether circulating levels of matrix metalloproteinase 9 (MMP-9) were associated with ultrasound-assessed intima-media thickness (IMT) and echolucent plaques in the carotid and femoral arteries. To examine preanalytical sources of variability in MMP-9 concentrations related to sampling procedures. SUBJECTS AND DESIGN Plasma and serum MMP-9 levels were compared with ultrasound-assessed measures of femoral and carotid atherosclerosis in a cross-sectional study of 61-year-old men (n = 473). Preanalytical sources of variability in MMP-9 levels were examined in 10 healthy subjects. Main outcome measures were circulating levels of MMP-9 in serum and plasma, IMT of the carotid and femoral arteries, and plaque status based on size and echolucency. SETTING Research unit at university hospital. RESULTS Plasma concentrations of total and active MMP-9 were associated with femoral artery IMT independently of traditional cardiovascular risk factors, and were higher in subjects with moderate to large femoral plaques. Plasma MMP-9 concentration was higher in men with echolucent femoral plaques (P = 0.006) compared with subjects without femoral plaques. No similar associations were found for carotid plaques. MMP-9 concentrations were higher in serum than in plasma, and higher when sampling was performed with Vacutainer than with syringe. MMP-9 levels in serum were more strongly associated with peripheral neutrophil count than were MMP-9 levels in plasma. CONCLUSIONS Plasma MMP-9 levels were associated with atherosclerosis in the femoral artery, and total MMP-9 concentration was higher in men with echolucent femoral plaques. The choice of sample material and sampling method affects the measurement of circulating MMP-9 levels.

Relevance: 70.00%

Abstract:

Light traps have been used widely to sample insect abundance and diversity, but their performance for sampling scarab beetles in tropical forests based on light source type and sampling hours throughout the night has not been evaluated. The efficiency of mercury-vapour lamps, cool white light and ultraviolet light sources in attracting Dynastinae, Melolonthinae and Rutelinae scarab beetles, and the most adequate period of the night to carry out the sampling, were tested in different forest areas of Costa Rica. Our results showed that light source wavelength and hours of sampling influenced scarab beetle catches. No significant differences in trap performance were observed between the ultraviolet light and mercury-vapour traps, whereas both of these methods yielded significantly higher species richness and abundance than cool white light traps. Species composition also varied between methods. Catches also differed markedly across the sampling period, with the first five hours of the night being more effective than the last five. Because of their high efficiency and logistic advantages, we recommend ultraviolet light traps deployed during the first hours of the night as the best sampling method for biodiversity studies of these scarab beetles in tropical forests.

Relevance: 70.00%

Abstract:

This study is concerned with labour productivity in traditional house building in Scotland. Productivity is a measure of the effective use of resources and provides vital benefits that can be combined in a number of ways. The introduction gives the background to two Scottish house building sites (Blantyre and Greenfield) that were surveyed by the Building Research Establishment (BRE) activity sampling method to provide the data for the study. The study had two main objectives: (1) summary data analysis of average manhours per house across all the houses on a site, and (2) detailed data analysis of average manhours for each house block on a site. The introduction also provides a literature review related to these objectives. The method is outlined in Chapter 2, the sites are discussed in Chapter 3, and Chapter 4 covers the application of the method on each site and a development of the method made in the study. The summary data analysis (Chapter 5) compares Blantyre and Greenfield with two previous BRE surveys in England. The main detailed data analysis took three forms (Chapters 6, 7 and 8), each applied to a set of operations. The three forms of analysis were variations in average manhours per house for each house block on the site compared with: (1) block construction order, (2) the average number of separate visits per house made by operatives to each block to complete an operation, and (3) the average number of different operatives per house employed on an operation in each block. Three miscellaneous items of detailed data analysis are discussed in Chapter 9. The conclusions to the whole study state that considerable variations in manhours for repeated operations were discovered, that the numbers of visits by operatives to complete operations were large, and that the number of different operatives employed on some operations was a factor related to productivity. A critique of the activity sampling method suggests that the data produced are reliable in summary form and can give a good context for more detailed data collection. For future work, this could take the form of selected operations, within the context of an activity sampling survey, that would be intensively surveyed by other methods.
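
As a numerical illustration of the activity sampling method referred to above, the following sketch turns random-instant observation counts into an estimated proportion of time, binomial confidence limits and manhours for one operation; the figures are invented, not taken from the Blantyre or Greenfield surveys.

    import math

    def activity_sampling_estimate(obs_on_operation, obs_total, total_manhours, z=1.96):
        """Man-hours charged to one operation from an activity sampling survey.

        obs_on_operation : observations in which operatives were engaged on the operation
        obs_total        : all random-instant observations made during the survey
        total_manhours   : total manhours recorded over the survey period
        z                : 1.96 gives 95% confidence limits
        """
        p = obs_on_operation / obs_total                 # estimated proportion of time
        se = math.sqrt(p * (1 - p) / obs_total)          # binomial standard error
        hours = p * total_manhours
        limits = ((p - z * se) * total_manhours, (p + z * se) * total_manhours)
        return hours, limits

    hours, (low, high) = activity_sampling_estimate(240, 4000, 12_500)
    print(f"estimated manhours: {hours:.0f} (95% limits {low:.0f}-{high:.0f})")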