982 results for Stratified sampling


Relevance:

60.00%

Publisher:

Abstract:

As part of two research projects analysing bycatch and discards, we quantified catch composition, catch rates, bycatch and discards in two important commercial bottom trawl fisheries (crustacean and fish trawls) off the southern coast of Portugal (Algarve). Stratified sampling by onboard observers took place from February 1999 to March 2001, and data were collected from 165 tows during 52 fishing trips. Commercial target species included crustaceans: blue and red shrimp (Aristeus antennatus), deep-water rose shrimp (Parapenaeus longirostris) and Norway lobster (Nephrops norvegicus); and fishes: seabreams (Diplodus spp. and Pagellus spp.), horse mackerels (Trachurus spp.) and European hake (Merluccius merluccius). The trawl fisheries are characterised by considerable amounts of bycatch: 59.5% and 80.4% of the overall total catch for crustacean and fish trawlers respectively. A total of 255 species were identified, belonging to 15 classes of organisms (137 vertebrates, 112 invertebrates and algae). Crustacean trawlers had higher bycatch biodiversity. Bony fish (45.6% and 37.8%), followed by crustaceans (14.6% and 11.5%), were the dominant bycatch components of crustacean and fish trawlers respectively. The influence of a number of factors (e.g. depth, fishing gear, tow duration and season) on bycatch and discards is discussed.

Relevance:

60.00%

Publisher:

Abstract:

ABSTRACT Objective: To estimate the prevalence of the different ophthalmological diseases that appear in the context of an autoimmune disease (AID) in patients of a rheumatology referral centre in Colombia, according to clinical and sociodemographic characteristics, over a 15-year period between 2000 and 2015. Methods: A descriptive, observational prevalence study was carried out. Sampling was stratified random with proportional allocation, performed in the Epidat 3.4 program. Data were analysed in SPSS v22.0; univariate analysis was performed for the categorical variables, and measures of central tendency were calculated for the quantitative variables. Results: Of 1,640 clinical records reviewed, 634 patients (38.65%) had ocular involvement. Excluding patients with Sjögren's syndrome (SS), who by definition present dry eye, 222 patients (13.53%) had ophthalmological involvement. Of all patients, 83.3% were women. Rheumatoid arthritis (RA) was the autoimmune disease with the greatest ophthalmological involvement, with 138 patients (62.2%), and sarcoidosis the least, with a single affected patient. Keratoconjunctivitis sicca (KCS) was the most common manifestation across all AID diagnostic groups, with 146 patients (63.5%). Of 414 patients with SS and KCS, 8 presented additional ocular involvement; uveitis was the second most common associated ocular pathology in patients with SS and the leading cause in the spondyloarthropathies (71.4%). Patients with cataract (4.1%) had the highest prevalence of corticosteroid use (88.8%). Of 222 patients, 28 (12.6%) presented uveitis. Of all patients, 16 (7.2%) presented antimalarial maculopathy, including 6 (18.75%) of the patients with systemic lupus erythematosus (SLE). ANAs were present in 100% of the patients with retinal vascular disorder. Patients with episcleritis had the highest proportion of anti-DNA antibody positivity.
The AID that most frequently presented episcleritis was SLE, with 4 patients (12.5%). Scleritis occurred in 22% of patients with anti-RNP antibodies, and 32.1% of patients with uveitis were HLA-B27 positive. Ophthalmological manifestations preceded the systemic ones in 11.1% to 33.3% of patients. Conclusion: Ocular diseases occur frequently in Colombian patients with AID (38.65%), with RA the disease with the greatest ocular involvement (62.2%) and KCS the most prevalent ocular disease across all AIDs (63.5%). Uveitis occurred in 28 patients (12.6%). Ophthalmological manifestations may precede the systemic ones. Ophthalmological examination should be included for patients with AID, as ocular disease is a frequent comorbidity. Additionally, the ophthalmological effects of the systemic medications used in AID should be closely monitored during the course of treatment.
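The study above drew its records by stratified random sampling with proportional allocation. A minimal sketch of proportional allocation, with hypothetical stratum sizes (the study's actual strata are not reported):

```python
# Proportional allocation: each stratum contributes samples in proportion to
# its share of the population, n_h = n * N_h / N. Stratum sizes below are
# illustrative, not taken from the study.

def proportional_allocation(stratum_sizes, total_sample):
    """Return per-stratum sample sizes using largest-remainder rounding."""
    N = sum(stratum_sizes)
    raw = [total_sample * s / N for s in stratum_sizes]
    alloc = [int(r) for r in raw]
    # hand the remaining units to the strata with the largest fractional parts
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[: total_sample - sum(alloc)]:
        alloc[i] += 1
    return alloc

print(proportional_allocation([800, 500, 340], 164))  # prints [80, 50, 34]
```

The largest-remainder step guarantees the per-stratum counts sum exactly to the requested total.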

Relevance:

40.00%

Publisher:

Abstract:

A 37-m thick layer of stratified clay encountered during a site investigation at Swann's Bridge, near the sea-coast at Limavady, Northern Ireland, is one of the deepest and thickest layers of this type of material recorded in Ireland. A study of the relevant literature and stratigraphic evidence obtained from the site investigation showed that despite being close to the current shoreline, the clay was deposited in a fresh-water glacial lake formed approximately 13 000 BP. The 37-m layer of clay can be divided into two separate zones. The lower zone was deposited as a series of laminated layers of sand, silt, and clay, whereas the upper zone was deposited as a largely homogeneous mixture. A comprehensive series of tests was carried out on carefully selected samples from the full thickness of the deposit. The results obtained from these tests were complex and confusing, particularly the results of tests done on samples from the lower zone. The results of one-dimensional compression tests, unconsolidated undrained triaxial tests, and consolidated undrained triaxial compression tests showed that despite careful sampling, all of the specimens from the lower zone exhibited behaviour similar to that of reconstituted clays. It was immediately clear that the results needed explanation. This paper studies possible causes of the results from tests carried out on the lower Limavady clay. It suggests a possible mechanism based on anisotropic elasticity, yielding, and destructuring that provides an understanding of the observed behaviour. Key words: clay, laminations, disturbance, yielding, destructuring, reconstituted.

Relevance:

30.00%

Publisher:

Abstract:

Acoustic sensors provide an effective means of monitoring biodiversity at large spatial and temporal scales. They can continuously and passively record large volumes of data over extended periods; however, these data must be analysed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced users can produce accurate results; however, the time and effort required to process even small volumes of data can make manual analysis prohibitive. Our research examined the use of sampling methods to reduce the cost of analysing large volumes of acoustic sensor data, while retaining high levels of species detection accuracy. Utilising five days of manually analysed acoustic sensor data from four sites, we examined a range of sampling rates and methods including random, stratified and biologically informed. Our findings indicate that randomly selecting 120 one-minute samples from the three hours immediately following dawn provided the most effective sampling method. This method detected, on average, 62% of total species after 120 one-minute samples were analysed, compared to 34% of total species from traditional point counts. Our results demonstrate that targeted sampling methods can provide an effective means for analysing large volumes of acoustic sensor data efficiently and accurately.
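The winning design above can be sketched in a few lines: draw 120 one-minute samples at random from the 180 minutes after dawn, pooled over the five recording days. Names, the seed, and pooling over days are illustrative assumptions, not the authors' code:

```python
import random

# Sketch of the study's most effective design: 120 randomly chosen
# one-minute samples from the three hours (180 minutes) after dawn,
# pooled over five days of recordings.

def draw_dawn_samples(n_samples=120, days=5, window_minutes=180, seed=42):
    """Return (day, minutes-after-dawn) pairs, sampled without replacement."""
    rng = random.Random(seed)
    pool = [(day, minute) for day in range(days) for minute in range(window_minutes)]
    return rng.sample(pool, n_samples)

samples = draw_dawn_samples()
print(len(samples))  # 120 distinct dawn minutes to hand to a human analyst
```

Sampling without replacement ensures no minute of audio is scored twice.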

Relevance:

30.00%

Publisher:

Abstract:

Acoustic sensors can be used to estimate species richness for vocal species such as birds. They can continuously and passively record large volumes of data over extended periods. These data must subsequently be analyzed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced surveyors can produce accurate results; however, the time and effort required to process even small volumes of data can make manual analysis prohibitive. This study examined the use of sampling methods to reduce the cost of analyzing large volumes of acoustic sensor data, while retaining high levels of species detection accuracy. Utilizing five days of manually analyzed acoustic sensor data from four sites, we examined a range of sampling frequencies and methods including random, stratified, and biologically informed. We found that randomly selecting 120 one-minute samples from the three hours immediately following dawn over five days of recordings detected the highest number of species. On average, this method detected 62% of total species from 120 one-minute samples, compared to 34% of total species detected from traditional area search methods. Our results demonstrate that targeted sampling methods can provide an effective means for analyzing large volumes of acoustic sensor data efficiently and accurately. Development of automated and semi-automated techniques is required to assist in analyzing large volumes of acoustic sensor data. Read More: http://www.esajournals.org/doi/abs/10.1890/12-2088.1

Relevance:

30.00%

Publisher:

Abstract:

On-going, high-profile public debate about climate change has focussed attention on how to monitor the soil organic carbon stock (C_s) of rangelands (savannas). Unfortunately, optimal sampling of the rangelands for baseline C_s - the critical first step towards efficient monitoring - has received relatively little attention to date. Moreover, in the rangelands of tropical Australia relatively little is known about how C_s is influenced by the practice of cattle grazing. To address these issues we used linear mixed models to: (i) unravel how grazing pressure (over a 12-year period) and soil type have affected C_s and the stable carbon isotope ratio of soil organic carbon (δ13C) (a measure of the relative contributions of C3 and C4 vegetation to C_s); (ii) examine the spatial covariation of C_s and δ13C; and (iii) explore the amount of soil sampling required to adequately determine baseline C_s. Modelling was done in the context of the material coordinate system for the soil profile, therefore the depths reported, while conventional, are only nominal. Linear mixed models revealed that soil type and grazing pressure interacted to influence C_s to a depth of 0.3 m in the profile. At a depth of 0.5 m there was no effect of grazing on C_s, but the soil type effect on C_s was significant. Soil type influenced δ13C to a soil depth of 0.5 m but there was no effect of grazing at any depth examined. The linear mixed model also revealed the strong negative correlation of C_s with δ13C, particularly to a depth of 0.1 m in the soil profile. This suggested that increased C_s at the study site was associated with increased input of C from C3 trees and shrubs relative to the C4 perennial grasses; as the latter form the bulk of the cattle diet, we contend that C sequestration may be negatively correlated with forage production. 
Our baseline C_s sampling recommendation for cattle-grazing properties of the tropical rangelands of Australia is to: (i) divide the property into units of apparently uniform soil type and grazing management; (ii) use stratified simple random sampling to spread at least 25 soil sampling locations about each unit, with at least two samples collected per stratum. This will be adequate to accurately estimate baseline mean C_s to within 20% of the true mean, to a nominal depth of 0.3 m in the profile.
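The recommended design above (at least 25 locations per unit, at least two per stratum) can be sketched as a simple allocation rule. Stratum areas and the area-proportional rule for the remainder are illustrative assumptions, not part of the paper's recommendation:

```python
# Sketch of a baseline sampling allocation: a floor of two samples per
# stratum, at least 25 locations in total, with the remainder spread in
# proportion to (hypothetical) stratum area.

def stratified_allocation(stratum_areas, total=25, min_per_stratum=2):
    """Per-stratum sample counts honouring the floor and the overall minimum."""
    total = max(total, min_per_stratum * len(stratum_areas))
    alloc = [min_per_stratum] * len(stratum_areas)
    remaining = total - sum(alloc)
    area_sum = sum(stratum_areas)
    for i, a in enumerate(stratum_areas):
        alloc[i] += int(remaining * a / area_sum)
    # any leftover units (from rounding down) go to the largest strata
    order = sorted(range(len(stratum_areas)), key=lambda i: stratum_areas[i], reverse=True)
    for i in order[: total - sum(alloc)]:
        alloc[i] += 1
    return alloc

print(stratified_allocation([50, 30, 20]))  # prints [12, 8, 5]
```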

Relevance:

30.00%

Publisher:

Abstract:

Milkfish and prawn pond operation in the Philippines is often associated with lab-lab culture. Lab-lab is a biological complex of blue-green algae, diatoms, bacteria and various animals which forms a mat at the bottom of nursery ponds or floating patches along the margins of ponds. This complex is considered the most favorable food of milkfish in brackishwater ponds. Variations in the quantity and quality of lab-lab between and within areas of a 1,000 sq. m pond were determined over two culture periods (6-month duration), and the applicability and suitability of stratified random sampling as a method of sampling lab-lab were evaluated.

Relevance:

30.00%

Publisher:

Abstract:

Long-term monitoring of forest soils as part of a pan-European network to detect environmental change depends on an accurate determination of the mean of the soil properties at each monitoring event. Forest soil is known to be very variable spatially, however. A study was undertaken to explore and quantify this variability at three forest monitoring plots in Britain. Detailed soil sampling was carried out, and the data from the chemical analyses were analysed by classical statistics and geostatistics. An analysis of variance showed that there were no consistent effects from the sample sites in relation to the position of the trees. The variogram analysis showed that there was spatial dependence at each site for several variables and some varied in an apparently periodic way. An optimal sampling analysis based on the multivariate variogram for each site suggested that a bulked sample from 36 cores would reduce error to an acceptable level. Future sampling should be designed so that it neither targets nor avoids trees and disturbed ground. This can be achieved best by using a stratified random sampling design.

Relevance:

30.00%

Publisher:

Abstract:

The Representative Soil Sampling Scheme of England and Wales has recorded information on the soil of agricultural land in England and Wales since 1969. It is a valuable source of information about the soil in the context of monitoring for sustainable agricultural development. Changes in soil nutrient status and pH were examined over the period 1971-2001. Several methods of statistical analysis were applied to data from the surveys during this period. The main focus here is on the data for 1971, 1981, 1991 and 2001. The results of examining change over time in general show that levels of potassium in the soil have increased, those of magnesium have remained fairly constant, those of phosphorus have declined and pH has changed little. Future sampling needs have been assessed in the context of monitoring, to determine the mean at a given level of confidence and tolerable error and to detect change in the mean over time at these same levels over periods of 5 and 10 years. The results of a non-hierarchical multivariate classification suggest that England and Wales could be stratified to optimize future sampling and analysis. To monitor soil quality and health more generally than for agriculture, more of the country should be sampled and a wider range of properties recorded.
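The assessment of future sampling needs above rests on the standard sample-size calculation for estimating a mean to a tolerable error at a given confidence, n = (z·s/e)². A hedged sketch with illustrative numbers (not values from the survey):

```python
import math

# Sample size needed to estimate a mean within tolerable error e at a given
# confidence: n = (z * s / e)^2, where s is an assumed-known standard
# deviation and z the normal quantile (1.96 for 95% confidence).
# The numbers in the call below are illustrative only.

def sample_size_for_mean(std_dev, tolerable_error, z=1.96):
    return math.ceil((z * std_dev / tolerable_error) ** 2)

n = sample_size_for_mean(std_dev=12.0, tolerable_error=2.0)
print(n)  # prints 139
```

Halving the tolerable error quadruples the required sample size, which is why the monitoring horizon (5 versus 10 years) matters so much for survey cost.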

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses advanced Monte Carlo methods for realistic image creation. It offers a new stratified approach to solving the rendering equation. We consider the numerical solution of the rendering equation by separation of the integration domain. The hemispherical integration domain is symmetrically separated into 16 parts. The first 9 sub-domains are orthogonal spherical triangles of equal size; they are mutually symmetric and grouped with a common vertex around the normal vector to the surface. The hemispherical integration domain is completed by a further 8 sub-domains, spherical quadrangles of equal size, also mutually symmetric. All sub-domains have fixed vertices and computable parameters. The bijections of the unit square onto an orthogonal spherical triangle and onto a spherical quadrangle are derived and used to generate sampling points. The symmetric sampling scheme is then applied to generate sampling points distributed over the hemispherical integration domain. The necessary transformations are made and the stratified Monte Carlo estimator is presented. The rate of convergence is obtained, showing that the algorithm is of super-convergent type.
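The core idea, one sample per equal-size sub-domain, can be shown on a simpler domain. The sketch below stratifies the unit square rather than the hemisphere; the paper's spherical-triangle and spherical-quadrangle bijections are not reproduced here, and the integrand is an arbitrary smooth test function:

```python
import math
import random

# Generic stratified Monte Carlo on [0,1]^2 as a stand-in for the paper's
# hemispherical sub-domains: cut the domain into equal-size strata and draw
# one jittered sample in each, which typically reduces variance relative to
# crude Monte Carlo with the same number of samples.

def stratified_mc(f, strata_per_axis=4, seed=1):
    """Estimate the integral of f over the unit square, one sample per stratum."""
    rng = random.Random(seed)
    n = strata_per_axis
    total = 0.0
    for i in range(n):
        for j in range(n):
            u = (i + rng.random()) / n  # uniform point inside stratum (i, j)
            v = (j + rng.random()) / n
            total += f(u, v)
    return total / (n * n)

est = stratified_mc(lambda u, v: math.sin(math.pi * u) * math.sin(math.pi * v))
print(est)  # close to the exact integral (2/pi)^2 ≈ 0.405
```

With 16 strata the jitter inside each stratum is the only source of error, which is the mechanism behind the super-convergence the paper analyses for its spherical partition.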

Relevance:

30.00%

Publisher:

Abstract:

The application of forecast ensembles to probabilistic weather prediction has spurred considerable interest in their evaluation. Such ensembles are commonly interpreted as Monte Carlo ensembles, meaning that the ensemble members are perceived as random draws from a distribution. Under this interpretation, a reasonable property to ask for is statistical consistency, which demands that the ensemble members and the verification behave like draws from the same distribution. A widely used technique to assess statistical consistency of a historical dataset is the rank histogram, which uses as a criterion the number of times that the verification falls between pairs of members of the ordered ensemble. Ensemble evaluation is rendered more specific by stratification, which means that ensembles that satisfy a certain condition (e.g., a certain meteorological regime) are evaluated separately. Fundamental relationships between Monte Carlo ensembles, their rank histograms, and random sampling from the probability simplex according to the Dirichlet distribution are pointed out. Furthermore, the possible benefits and complications of ensemble stratification are discussed. The main conclusion is that a stratified Monte Carlo ensemble might appear inconsistent with the verification even though the original (unstratified) ensemble is consistent. The apparent inconsistency is merely a result of stratification. Stratified rank histograms are thus not necessarily flat. This result is demonstrated by perfect ensemble simulations and supplemented by mathematical arguments. Possible methods to avoid or remove artifacts that stratification induces in the rank histogram are suggested.
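A rank histogram as described above is straightforward to compute: for each case, count how many ensemble members fall below the verification. The perfect-ensemble simulation below (members and verification drawn from the same Gaussian, illustrative numbers) shows the flat histogram that consistency implies:

```python
import random

# Rank histogram sketch: with m members there are m + 1 possible rank
# positions for the verification; a statistically consistent ensemble puts
# the verification in each position with equal probability.

def rank_histogram(cases, members_per_case):
    counts = [0] * (members_per_case + 1)
    for members, verification in cases:
        rank = sum(1 for m in members if m < verification)  # position 0..m
        counts[rank] += 1
    return counts

# perfect-ensemble simulation: members and verification from one distribution
rng = random.Random(0)
cases = []
for _ in range(5000):
    draws = [rng.gauss(0.0, 1.0) for _ in range(6)]
    cases.append((draws[:5], draws[5]))  # 5 members, 1 verification

hist = rank_histogram(cases, members_per_case=5)
print(hist)  # six bins, each near 5000 / 6 ≈ 833
```

Stratifying these cases by, say, the ensemble mean and histogramming each stratum separately reproduces the paper's point: the per-stratum histograms need not be flat even though the pooled one is.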

Relevance:

30.00%

Publisher:

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat-modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. 
In step with these theoretical developments, we describe state-of-the-art software that implements the methods and makes them accessible to practicing ecologists.
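The conventional-distance-sampling engine mentioned in point 4 can be illustrated with the textbook half-normal detection function. This is a hedged sketch of the general idea, not the Distance software's implementation; the data and variable names are invented:

```python
import math

# Conventional distance sampling with a half-normal detection function
# g(x) = exp(-x^2 / (2 * sigma^2)), which assumes certain detection at zero
# distance (g(0) = 1). For untruncated perpendicular distances the
# maximum-likelihood estimate of sigma^2 is the mean squared distance, and
# the effective strip half-width is mu = sigma * sqrt(pi / 2).

def halfnormal_density(distances, transect_length):
    """Estimated objects per unit area from perpendicular detection distances."""
    sigma2 = sum(d * d for d in distances) / len(distances)
    mu = math.sqrt(sigma2 * math.pi / 2)  # effective strip half-width
    return len(distances) / (2 * mu * transect_length)

# illustrative data: 40 detections along a 10 km transect, distances in km
dists = [0.01 * (i % 8 + 1) for i in range(40)]
print(halfnormal_density(dists, 10.0))  # objects per square km
```

The multiple-covariate and mark-recapture engines generalise exactly this step: they replace the single-parameter g(x) with richer detection models.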

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: In order to optimise the cost-effectiveness of active surveillance to substantiate freedom from disease, a new approach using targeted sampling of farms was developed and applied, using the examples of infectious bovine rhinotracheitis (IBR) and enzootic bovine leucosis (EBL) in Switzerland. Relevant risk factors (RF) for the introduction of IBR and EBL into Swiss cattle farms were identified and their relative risks defined based on literature review and expert opinion. A quantitative model based on the scenario tree method was subsequently used to calculate the required sample size of a targeted sampling approach (TS) for a given sensitivity. We compared this sample size with that of a stratified random sample (sRS) with regard to efficiency. RESULTS: The required sample sizes to substantiate disease freedom were 1,241 farms for IBR and 1,750 farms for EBL to detect 0.2% herd prevalence with 99% sensitivity. Using conventional sRS, the required sample sizes were 2,259 farms for IBR and 2,243 for EBL. Considering the additional administrative expenses required for the planning of TS, the risk-based approach was still more cost-effective than an sRS (a 40% reduction in full survey costs for IBR and 8% for EBL) owing to the considerable reduction in sample size. CONCLUSIONS: As the model depends on RF selected through literature review and was parameterised with values estimated by experts, it is subject to some degree of uncertainty. Nevertheless, this approach provides the veterinary authorities with a promising tool for future cost-effective sampling designs.
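The sample sizes above can be sanity-checked with the simple binomial approximation for freedom-from-disease surveys. This is the textbook approximation, not the paper's scenario-tree model, and herd-level sensitivity is assumed perfect for simplicity:

```python
import math

# Freedom-from-disease sample size: with design prevalence p and herd-level
# sensitivity Se, the probability that n sampled herds all test negative
# despite infection at prevalence p is about (1 - p * Se)^n, so
# n >= log(1 - target_sensitivity) / log(1 - p * Se).

def freedom_sample_size(design_prevalence, target_sensitivity, herd_sensitivity=1.0):
    p_detect = design_prevalence * herd_sensitivity
    return math.ceil(math.log(1 - target_sensitivity) / math.log(1 - p_detect))

n = freedom_sample_size(0.002, 0.99)  # 0.2% prevalence, 99% survey sensitivity
print(n)  # prints 2301 — the same order as the sRS sizes reported above
```

The gap between this figure and the targeted-sampling sizes (1,241 and 1,750 farms) is exactly the efficiency gain the risk-based approach buys.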

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider estimation of the causal effect of a treatment on an outcome from observational data collected in two phases. In the first phase, a simple random sample of individuals is drawn from a population. On these individuals, information is obtained on treatment, outcome, and a few low-dimensional confounders. These individuals are then stratified according to these factors. In the second phase, a random sub-sample of individuals is drawn from each stratum, with known, stratum-specific selection probabilities. On these individuals, a rich set of confounding factors is collected. In this setting, we introduce four estimators: (1) simple inverse weighted, (2) locally efficient, (3) doubly robust and (4) enriched inverse weighted. We evaluate the finite-sample performance of these estimators in a simulation study. We also use our methodology to estimate the causal effect of trauma care on in-hospital mortality using data from the National Study of Cost and Outcomes of Trauma.
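The first estimator's basic mechanism, weighting each phase-two individual by the inverse of their stratum-specific selection probability so the sub-sample stands in for the full phase-one sample, can be sketched as follows. Data and names are illustrative, not the paper's:

```python
# Sketch of simple inverse weighting over a two-phase design: phase-two
# observations from a stratum sampled with probability q get weight 1 / q.

def inverse_weighted_mean(values, strata, selection_probs):
    """Hajek-style weighted mean of `values` over the phase-one population."""
    weights = [1.0 / selection_probs[s] for s in strata]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# stratum 'a' sampled with probability 0.5, stratum 'b' with probability 0.25
y = [1.0, 3.0, 2.0]
strata = ['a', 'a', 'b']
probs = {'a': 0.5, 'b': 0.25}
print(inverse_weighted_mean(y, strata, probs))  # prints 2.0
```

The other three estimators augment this weighting with outcome models to gain efficiency or robustness, which this sketch does not attempt.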

Relevance:

30.00%

Publisher:

Abstract:

This data set contains soil carbon measurements (organic carbon, inorganic carbon, and total carbon; all measured in dried soil samples) from the main experiment plots of a large grassland biodiversity experiment (the Jena Experiment; see further details below). In the main experiment, 82 grassland plots of 20 x 20 m were established from a pool of 60 species belonging to four functional groups (grasses, legumes, tall and small herbs). In May 2002, varying numbers of plant species from this species pool were sown into the plots to create a gradient of plant species richness (1, 2, 4, 8, 16 and 60 species) and functional richness (1, 2, 3, 4 functional groups). Plots were maintained by bi-annual weeding and mowing. Stratified soil sampling to a depth of 1 m was repeated in April 2007 (as had been done before sowing in April 2002). Three independent samples per plot were taken of all plots in block 2 using a motor-driven soil column cylinder (Cobra, Eijkelkamp, 8.3 cm in diameter). Soil samples were dried at 40°C and segmented to a depth resolution of 5 cm, giving 20 depth subsamples per core. All samples were analyzed independently. All soil samples were passed through a sieve with a mesh size of 2 mm. Because of much higher proportions of roots in the soil, the samples in 2007 were further sieved to 1 mm according to common root removal methods. No additional mineral particles were removed by this procedure. Total carbon concentration was analyzed on ball-milled subsamples (time 4 min, frequency 30 s^-1) by an elemental analyzer at 1150°C (Elementaranalysator vario Max CN; Elementar Analysensysteme GmbH, Hanau, Germany). We measured inorganic carbon concentration by elemental analysis at 1150°C after removal of organic carbon for 16 h at 450°C in a muffle furnace. Organic carbon concentration was calculated as the difference between both measurements of total and inorganic carbon.