915 results for Sampling method


Relevance:

40.00%

Publisher:

Abstract:

Background: An important challenge in conducting social research of specific relevance to harm reduction programs is locating hidden populations of consumers of substances like cannabis who typically report few adverse or unwanted consequences of their use. Much of the deviant, pathologized perception of drug users is historically derived from, and empirically supported by, a research emphasis on gaining ready access to users in drug treatment or in prison populations with a higher incidence of problems of dependence and misuse. Because they are less visible, responsible recreational users of illicit drugs have been more difficult to study. Methods: This article investigates Respondent Driven Sampling (RDS) as a method of recruiting experienced marijuana users representative of users in the general population. Based on sampling conducted in a multi-city study (Halifax, Montreal, Toronto, and Vancouver), and compared to samples gathered using other research methods, we assess the strengths and weaknesses of RDS recruitment as a means of gaining access to illicit substance users who experience few harmful consequences of their use. Demographic characteristics of the sample in Toronto are compared with those of users in a recent household survey and a pilot study of Toronto, where the latter utilized nonrandom self-selection of respondents. Results: A modified approach to RDS was necessary to attain the target sample size in all four cities (i.e., 40 'users' from each site). The final sample in Toronto was, however, largely similar to marijuana users in a random household survey carried out in the same city. Although well-educated, married, white, and female respondents were all somewhat overrepresented in the survey, the two samples were, overall, more alike than different with respect to economic status and employment. Furthermore, comparison with a self-selected sample suggests that (even modified) RDS recruitment is a cost-effective way of gathering respondents who are more representative of users in the general population than nonrandom methods of recruitment ordinarily produce. Conclusions: Research on marijuana use, and other forms of drug use hidden in the general population of adults, is important for informing and extending harm reduction beyond its current emphasis on 'at-risk' populations. Expanding harm reduction in a normalizing context, through innovative research on users often overlooked, further challenges assumptions about reducing harm through prohibition of drug use and urges consideration of alternative policies such as decriminalization and legal regulation.
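
A minimal sketch of the chain-referral idea behind RDS recruitment, run on a synthetic contact network; the network model, number of seeds, coupon limit, and target sample size are illustrative assumptions, not parameters from the study.

```python
# Sketch of respondent-driven (chain-referral) sampling on a synthetic
# contact network. All parameters below are illustrative assumptions.
import random

random.seed(1)

# A hypothetical "hidden population" of 2,000 people, each knowing a few others.
N = 2000
contacts = {i: [c for c in random.sample(range(N), 8) if c != i] for i in range(N)}

def rds_recruit(seeds, coupons_per_recruit=3, target=40):
    """Each participant passes a limited number of coupons to contacts
    who have not yet been recruited, wave after wave."""
    recruited, wave = set(seeds), list(seeds)
    while wave and len(recruited) < target:
        next_wave = []
        for person in wave:
            eligible = [c for c in contacts[person] if c not in recruited]
            chosen = random.sample(eligible, min(coupons_per_recruit, len(eligible)))
            for referral in chosen:
                if len(recruited) >= target:
                    break
                recruited.add(referral)
                next_wave.append(referral)
        wave = next_wave
    return recruited

sample = rds_recruit(seeds=random.sample(range(N), 5))
print(f"recruited {len(sample)} participants")
```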

Relevance:

40.00%

Publisher:

Abstract:

It has been generally accepted that the method of moments (MoM) variogram, which has been widely applied in soil science, requires about 100 sites at an appropriate interval apart to describe the variation adequately. This sample size is often larger than can be afforded for soil surveys of agricultural fields or contaminated sites. Furthermore, it might be a much larger sample size than is needed where the scale of variation is large. A possible alternative in such situations is the residual maximum likelihood (REML) variogram, because fewer data appear to be required. The REML method is parametric and is considered reliable where there is trend in the data because it is based on generalized increments that filter trend out, and only the covariance parameters are estimated. Previous research has suggested that fewer data are needed to compute a reliable variogram using a maximum likelihood approach such as REML; however, the results can vary according to the nature of the spatial variation. There remain issues to examine: how many fewer data can be used, how should the sampling sites be distributed over the site of interest, and how do different degrees of spatial variation affect the data requirements? The soil of four field sites of different size, physiography, parent material and soil type was sampled intensively, and MoM and REML variograms were calculated for clay content. The data were then sub-sampled to give different sample sizes and distributions of sites, and the variograms were computed again. The model parameters for the sets of variograms for each site were used for cross-validation. Predictions based on REML variograms were generally more accurate than those from MoM variograms with fewer than 100 sampling sites. A sample size of around 50 sites at an appropriate distance apart, possibly determined from variograms of ancillary data, appears adequate to compute REML variograms for kriging soil properties for precision agriculture and contaminated sites. © 2007 Elsevier B.V. All rights reserved.
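
For concreteness, the sketch below computes a method-of-moments (Matheron) empirical variogram from irregularly spaced observations; the coordinates, clay values, and lag settings are synthetic, and REML fitting, which requires likelihood-based estimation of the covariance parameters, is left to dedicated geostatistical software.

```python
# Minimal sketch of a method-of-moments (MoM) empirical variogram for a
# soil property such as clay content; all data and settings are synthetic.
import numpy as np

rng = np.random.default_rng(0)
coords = rng.uniform(0, 500, size=(100, 2))                           # site coordinates (m)
clay = 25 + 5 * np.sin(coords[:, 0] / 80) + rng.normal(0, 1.5, 100)   # clay content (%)

def mom_variogram(coords, z, lag_width=25.0, n_lags=10):
    """Matheron's estimator: gamma(h) = mean of squared differences / 2
    over all pairs whose separation distance falls in each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    lags, gamma, npairs = [], [], []
    for k in range(n_lags):
        mask = (d >= k * lag_width) & (d < (k + 1) * lag_width)
        if mask.any():
            lags.append(d[mask].mean())
            gamma.append(sq[mask].mean())
            npairs.append(int(mask.sum()))
    return np.array(lags), np.array(gamma), np.array(npairs)

lags, gamma, n = mom_variogram(coords, clay)
for h, g, m in zip(lags, gamma, n):
    print(f"lag {h:6.1f} m  gamma {g:6.2f}  pairs {m}")
```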

Relevance:

40.00%

Publisher:

Abstract:

The concentrations of the water-soluble inorganic aerosol species ammonium (NH₄⁺), nitrate (NO₃⁻), chloride (Cl⁻), and sulfate (SO₄²⁻) were measured from September to November 2002 at a pasture site in the Amazon Basin (Rondônia, Brazil) (LBA-SMOCC). Measurements were conducted using a semi-continuous technique (Wet-annular denuder/Steam-Jet Aerosol Collector: WAD/SJAC) and three integrating filter-based methods, namely (1) a denuder-filter pack (DFP: Teflon and impregnated Whatman filters), (2) a stacked-filter unit (SFU: polycarbonate filters), and (3) a High Volume dichotomous sampler (HiVol: quartz fiber filters). Measurements covered the late dry season (biomass burning), a transition period, and the onset of the wet season (clean conditions). Analyses of the particles collected on filters were performed using ion chromatography (IC) and Particle-Induced X-ray Emission spectrometry (PIXE). Season-dependent discrepancies were observed between the WAD/SJAC system and the filter-based samplers. During the dry season, when PM₂.₅ (Dₚ ≤ 2.5 µm) concentrations were ~100 µg m⁻³, aerosol NH₄⁺ and SO₄²⁻ measured by the filter-based samplers were on average two times higher than those determined by the WAD/SJAC. Concentrations of aerosol NO₃⁻ and Cl⁻ measured with the HiVol during daytime, and with the DFP during day- and nighttime, also exceeded those of the WAD/SJAC by a factor of two. In contrast, aerosol NO₃⁻ and Cl⁻ measured with the SFU during the dry season were nearly two times lower than those measured by the WAD/SJAC. These differences declined markedly during the transition period and towards the cleaner conditions at the onset of the wet season (PM₂.₅ ~5 µg m⁻³), when filter-based samplers measured on average 40–90% less than the WAD/SJAC. The differences were not due to consistent systematic biases of the analytical techniques, but were apparently a result of prevailing environmental conditions and different sampling procedures. For the transition period and wet season, the significance of our results is reduced by a low number of data points. We argue that the observed differences are mainly attributable to (a) positive and negative filter sampling artifacts, (b) the presence of organic compounds and organosulfates on the filter substrates, and (c) a SJAC sampling efficiency of less than 100%.

Relevance:

40.00%

Publisher:

Abstract:

DNA-based studies have been one of the major interests in conservation biology of endangered species and in population genetics. Because species and population genetic assessments require a source of biological material, sampling constraints can be overcome by non-destructive procedures for DNA isolation. An improved method for obtaining DNA from fish fins and scales, using an extraction buffer containing urea followed by DNA purification with phenol-chloroform, is described. The methodology combines the benefits of non-destructive DNA sampling with high efficiency. In addition, comparisons with other methodologies for isolating DNA from fish demonstrated that the present procedure is also a very attractive alternative for obtaining large amounts of high-quality DNA for use in different molecular analyses. The DNA samples, isolated from different fish species, have been successfully used in random amplified polymorphic DNA (RAPD) experiments, as well as in the amplification of specific ribosomal and mitochondrial DNA sequences. The present DNA extraction procedure represents an alternative for population approaches and genetic studies on rare or endangered taxa.

Relevance:

40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

40.00%

Publisher:

Abstract:

Recent research has shown that the performance of metaheuristics can be affected by population initialization. Opposition-based Differential Evolution (ODE), Quasi-Oppositional Differential Evolution (QODE), and Uniform-Quasi-Opposition Differential Evolution (UQODE) are three state-of-the-art methods that improve the performance of the Differential Evolution algorithm based on population initialization and different search strategies. In a different approach to achieve similar results, this paper presents a technique to discover promising regions in a continuous search space of an optimization problem. Using machine-learning techniques, the algorithm, named Smart Sampling (SS), finds regions with a high possibility of containing a global optimum. Next, a metaheuristic can be initialized inside each region to find that optimum. SS and DE were combined (originating the SSDE algorithm) to evaluate our approach, and experiments were conducted on the same set of benchmark functions used by the ODE, QODE and UQODE authors. Results have shown that the total number of function evaluations required by DE to reach the global optimum can be significantly reduced and that the success rate improves if SS is employed first. Such results are also consistent with the literature on the importance of an adequate starting population. Moreover, SS presents better efficacy in finding initial populations of superior quality when compared to the other three algorithms that employ oppositional learning. Finally, and most importantly, the performance of SS in finding promising regions is independent of the metaheuristic with which SS is combined, making SS suitable for improving the performance of a large variety of optimization techniques. © 2012 Elsevier Inc. All rights reserved.
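
The sketch below illustrates the general idea under stated assumptions: uniformly sample the search space, keep the bounding box of the best-evaluated points as a stand-in for a "promising region", and initialize a standard DE/rand/1/bin inside it. The bounding-box heuristic and the sphere benchmark are assumptions for illustration, not the machine-learning model used by Smart Sampling.

```python
# Sketch: sample the space, delimit a promising region from the best points,
# then run differential evolution (DE/rand/1/bin) inside that region.
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):                        # benchmark objective (global optimum at 0)
    return np.sum(x ** 2, axis=-1)

def promising_region(f, bounds, n_samples=200, keep_frac=0.1):
    """Bounding box of the best-evaluated uniform samples (illustrative heuristic)."""
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_samples, len(lo)))
    best = X[np.argsort(f(X))[: int(keep_frac * n_samples)]]
    return best.min(axis=0), best.max(axis=0)

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, iters=200):
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    fit = f(pop)
    for _ in range(iters):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(len(lo)) < CR
            cross[rng.integers(len(lo))] = True          # ensure at least one gene crosses
            trial = np.where(cross, mutant, pop[i])
            if (ft := f(trial)) < fit[i]:
                pop[i], fit[i] = trial, ft
    return pop[fit.argmin()], fit.min()

dim = 10
full_bounds = (np.full(dim, -100.0), np.full(dim, 100.0))
region = promising_region(sphere, full_bounds)           # "smart" initialization step
x_best, f_best = differential_evolution(sphere, region)
print(f"best objective found inside the promising region: {f_best:.3e}")
```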

Relevance:

40.00%

Publisher:

Abstract:

The aim was to investigate the inhomogeneity of radiofrequency fields at higher field strengths, which can interfere with established volumetric methods, in particular for the determination of visceral (VAT) and subcutaneous (SCAT) adipose tissue. A versatile, interactive sparse sampling (VISS) method is proposed to determine VAT, SCAT, and also total body volume (TBV).
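
The abstract does not describe the VISS algorithm itself, so the sketch below only illustrates the generic step such volumetric methods share: integrating segmented cross-sectional areas over a sparse set of slice positions (a Cavalieri-type estimate). The slice positions and areas are hypothetical.

```python
# Generic sketch (not the VISS method itself): estimating a tissue volume
# from sparsely sampled imaging slices by integrating segmented
# cross-sectional areas over slice position. All values are hypothetical.
import numpy as np

slice_positions_cm = np.array([0.0, 2.5, 5.0, 7.5, 10.0, 12.5])   # sparse slice locations
vat_area_cm2 = np.array([0.0, 45.0, 120.0, 150.0, 90.0, 10.0])    # segmented VAT area per slice

# Trapezoidal integration of area over position gives a volume estimate (cm^3).
vat_volume_cm3 = np.trapz(vat_area_cm2, slice_positions_cm)
print(f"estimated VAT volume: {vat_volume_cm3:.0f} cm^3")
```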

Relevance:

40.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

40.00%

Publisher:

Abstract:

Blood sampling is an essential technique in many herpetological studies. This paper describes a quick and humane technique to collect blood samples from three species of Australian chelid turtles (Order Pleurodira): Chelodina expansa, Elseya latisternum, and Emydura macquarii signata.

Relevance:

40.00%

Publisher:

Abstract:

The paper examines the nature of qualitative empirical studies published in the AHRD proceedings from 1999-2003 and discusses findings on method, rationale for method, data collection, sampling strategies, and integrity measures.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a method of spatial sampling based on stratification by Local Moran's Iᵢ calculated using auxiliary information. The sampling technique is compared to other design-based approaches including simple random sampling, systematic sampling on a regular grid, conditional Latin Hypercube sampling and stratified sampling based on auxiliary information, and is illustrated using two different spatial data sets. Each of the samples for the two data sets is interpolated using regression kriging to form a geostatistical map for their respective areas. The proposed technique is shown to be competitive in reproducing specific areas of interest with high accuracy.
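
As a concrete illustration of the stratification step, the sketch below computes Local Moran's Iᵢ for a gridded auxiliary variable using rook contiguity with row-standardized weights, splits the cells into quartiles of Iᵢ, and draws an equal-sized random sample from each stratum; the grid, weighting scheme, and sample sizes are illustrative assumptions.

```python
# Sketch of stratified spatial sampling based on Local Moran's I_i of an
# auxiliary grid (rook contiguity, row-standardized weights). Synthetic data.
import numpy as np

rng = np.random.default_rng(7)
nrow, ncol = 40, 40
aux = rng.normal(size=(nrow, ncol))
aux[10:25, 10:25] += 2.0              # a patch of spatially clustered high values

def local_morans_i(x):
    """I_i = (z_i / m2) * sum_j w_ij z_j with rook neighbours, row-standardized."""
    z = x - x.mean()
    m2 = (z ** 2).mean()
    lag = np.zeros_like(z)
    counts = np.zeros_like(z)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        shifted = np.roll(z, (dr, dc), axis=(0, 1))
        valid = np.ones_like(z, dtype=bool)   # mask wrap-around edges from np.roll
        if dr == 1:  valid[0, :] = False
        if dr == -1: valid[-1, :] = False
        if dc == 1:  valid[:, 0] = False
        if dc == -1: valid[:, -1] = False
        lag += np.where(valid, shifted, 0.0)
        counts += valid
    return z * (lag / counts) / m2            # row-standardized spatial lag

I = local_morans_i(aux)

# Stratify cells into quartiles of I_i and draw an equal number from each stratum.
strata = np.digitize(I, np.quantile(I, [0.25, 0.5, 0.75])).ravel()
cells = np.arange(I.size)
sample_idx = np.concatenate([
    rng.choice(cells[strata == s], size=10, replace=False) for s in range(4)
])
print(f"sampled {len(sample_idx)} cells; I_i range in sample: "
      f"{I.ravel()[sample_idx].min():.2f} to {I.ravel()[sample_idx].max():.2f}")
```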

Relevance:

30.00%

Publisher:

Abstract:

Client owners usually need an estimate or forecast of their likely building costs in advance of detailed design in order to confirm the financial feasibility of their projects. Because of their timing in the project life cycle, these early-stage forecasts are characterized by the minimal amount of information available concerning the new (target) project, to the point that often only its size and type are known. One approach is to use the mean contract sum of a sample, or base group, of previous projects of a similar type and size to the project for which the estimate is needed. Bernoulli's law of large numbers implies that this base group should be as large as possible. However, increasing the size of the base group inevitably involves including projects that are less and less similar to the target project. Deciding on the optimal number of base-group projects is known as the homogeneity, or pooling, problem. A method of solving the homogeneity problem is described that uses closed-form equations to compare three different sampling arrangements of previous projects for their simulated forecasting ability by a cross-validation method, in which a series of targets is extracted, with replacement, from the groups and compared with the mean value of the projects in the base groups. The procedure is then demonstrated with 450 Hong Kong projects of different types (residential, commercial centre, car parking, social community centre, school, office, hotel, industrial, university and hospital) clustered into base groups according to their type and size.
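
A minimal sketch of the cross-validation logic described above, under stated assumptions: each project's contract sum is forecast as the mean of the k previous projects closest to it in floor area, and the mean absolute percentage error is tracked as k grows. The synthetic costs and the size-only similarity measure are illustrative; the paper's closed-form equations and type-based clustering are not reproduced here.

```python
# Sketch of choosing a base-group size by cross-validation of mean-based
# forecasts. Project sizes and contract sums are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 450
size_m2 = rng.uniform(1_000, 50_000, n)                  # gross floor area
cost = 2_000 * size_m2 * rng.lognormal(0, 0.3, n)        # contract sum (hypothetical HK$)

def cv_error(k):
    """Mean absolute percentage error when each target project is forecast
    as the mean cost of the k previous projects closest to it in size."""
    errors = []
    for i in range(n):
        others = np.delete(np.arange(n), i)              # leave the target out
        nearest = others[np.argsort(np.abs(size_m2[others] - size_m2[i]))[:k]]
        forecast = cost[nearest].mean()
        errors.append(abs(forecast - cost[i]) / cost[i])
    return float(np.mean(errors))

for k in (5, 10, 25, 50, 100, 200):
    print(f"base group of {k:3d} projects: MAPE = {cv_error(k):.1%}")
```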

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a method for investigating ship emissions, the plume capture and analysis system (PCAS), and its application in measuring airborne pollutant emission factors (EFs) and particle size distributions. The current investigation was conducted in situ, aboard two dredgers (the Amity, a cutter suction dredger, and the Brisbane, a hopper suction dredger), but the PCAS is also capable of performing such measurements remotely at a distant point within the plume. EFs were measured relative to the fuel consumption using the fuel-combustion-derived plume CO₂. All plume measurements were corrected by subtracting background concentrations sampled regularly from upwind of the stacks. Each measurement typically took 6 minutes to complete, and 40 to 50 measurements were possible in one day. The relationship between the EFs and plume sample dilution was examined to determine the plume dilution range over which the technique could deliver consistent results when measuring EFs for particle number (PN), NOₓ, SO₂, and PM₂.₅ within a targeted dilution factor range of 50–1000 suitable for remote sampling. The EFs for NOₓ, SO₂, and PM₂.₅ were found to be independent of dilution for dilution factors within that range. The EF measurement for PN was corrected for coagulation losses by applying a time-dependent particle loss correction to the particle number concentration data. For the Amity, the EF ranges were PN: 2.2–9.6 × 10¹⁵ (kg-fuel)⁻¹; NOₓ: 35–72 g(NO₂) (kg-fuel)⁻¹; SO₂: 0.6–1.1 g(SO₂) (kg-fuel)⁻¹; and PM₂.₅: 0.7–6.1 g(PM₂.₅) (kg-fuel)⁻¹. For the Brisbane they were PN: 1.0–1.5 × 10¹⁶ (kg-fuel)⁻¹; NOₓ: 3.4–8.0 g(NO₂) (kg-fuel)⁻¹; SO₂: 1.3–1.7 g(SO₂) (kg-fuel)⁻¹; and PM₂.₅: 1.2–5.6 g(PM₂.₅) (kg-fuel)⁻¹. The results are discussed in terms of the operating conditions of the vessels' engines. Particle number emission factors as a function of size, as well as the count median diameter (CMD) and geometric standard deviation of the size distributions, are provided. The size distributions were found to be consistently unimodal in the range below 500 nm, and this mode was within the accumulation mode range for both vessels. The representative CMDs for the various activities performed by the dredgers ranged from 94 to 131 nm in the case of the Amity and from 58 to 80 nm for the Brisbane. A strong inverse relationship between CMD and EF(PN) was observed.
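
As an illustration of the fuel-based emission factor calculation described above, the sketch below background-corrects hypothetical in-plume concentrations and scales each excess concentration by the excess CO₂, using an assumed CO₂ emission factor for a typical marine distillate fuel; all concentrations and the fuel carbon figure are assumptions, not values from the paper.

```python
# Sketch of a fuel-based emission factor (EF) calculation from
# background-corrected plume concentrations. All numbers are hypothetical.

EF_CO2 = 3_170.0   # g CO2 emitted per kg fuel burned (assumed fuel carbon content)

# Hypothetical in-plume and upwind background mass concentrations (micrograms per m^3).
plume = {"CO2": 850_000.0, "NOx_as_NO2": 520.0, "SO2": 11.0, "PM2.5": 30.0}
background = {"CO2": 780_000.0, "NOx_as_NO2": 15.0, "SO2": 1.0, "PM2.5": 8.0}

delta_co2 = plume["CO2"] - background["CO2"]     # excess CO2 attributable to the plume

for species in ("NOx_as_NO2", "SO2", "PM2.5"):
    delta = plume[species] - background[species]
    # g of species per kg of fuel: ratio of excess mass concentrations,
    # scaled by the mass of CO2 produced per kg of fuel.
    ef = (delta / delta_co2) * EF_CO2
    print(f"EF({species}) = {ef:.1f} g per kg fuel")
```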