987 results for Over sampling


Relevance:

40.00%

Publisher:

Abstract:

Quantitative use of satellite-derived rainfall products for various scientific applications often requires them to be accompanied by an error estimate. Rainfall estimates inferred from low earth orbiting satellites like the Tropical Rainfall Measuring Mission (TRMM) are subject to sampling errors of nonnegligible proportions owing to the narrow swath of satellite sensors coupled with a lack of continuous coverage due to infrequent satellite visits. The authors investigate sampling uncertainty of seasonal rainfall estimates from the active sensor of TRMM, namely, the Precipitation Radar (PR), based on 11 years of the PR 2A25 data product over the Indian subcontinent. In this paper, a statistical bootstrap technique is investigated to estimate the relative sampling errors using the PR data themselves. Results verify power-law scaling characteristics of relative sampling errors with respect to the space-time scale of measurement. Sampling uncertainty estimates for mean seasonal rainfall were found to exhibit seasonal variations. To give a practical example of the implications of the bootstrap technique, PR relative sampling errors over the subtropical Mahanadi river basin, India, are examined. Results reveal that the bootstrap technique yields relative sampling errors of < 33% (for the 2° grid), < 36% (for the 1° grid), < 45% (for the 0.5° grid), and < 57% (for the 0.25° grid). With respect to rainfall type, overall sampling uncertainty was found to be dominated by sampling uncertainty due to stratiform rainfall over the basin. The study compares the resulting error estimates to those obtained from Latin hypercube sampling. Based on this study, the authors conclude that the bootstrap approach can be successfully used for ascertaining the relative sampling errors of TRMM-like satellites over gauged or ungauged basins lacking in situ validation data. This technique has wider implications for decision making before incorporating microwave orbital data products in basin-scale hydrologic modeling.
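As a rough illustration of the bootstrap idea in this abstract, the sketch below resamples a set of simulated satellite-overpass rain rates to estimate the relative sampling error of a seasonal mean. The gamma-distributed stand-in data, the resample count, and all names are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative stand-in for one season of PR overpass rain rates
# over a single grid box (mm/h); real 2A25 data would go here.
overpass_rain = rng.gamma(shape=0.5, scale=4.0, size=200)

def bootstrap_relative_sampling_error(samples, n_boot=2000, rng=rng):
    """Relative sampling error of the seasonal mean, estimated by
    resampling the overpasses themselves (with replacement)."""
    n = samples.size
    boot_means = np.array([
        rng.choice(samples, size=n, replace=True).mean()
        for _ in range(n_boot)
    ])
    # Relative error: spread of bootstrap means over the observed mean
    return boot_means.std(ddof=1) / samples.mean()

print(f"relative sampling error: {bootstrap_relative_sampling_error(overpass_rain):.1%}")
```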

Relevance:

40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

40.00%

Publisher:

Abstract:

The general assumption under which the X̄ chart is designed is that the process mean has a constant in-control value. However, there are situations in which the process mean wanders. When it wanders according to a first-order autoregressive (AR(1)) model, a complex approach involving Markov chains and integral equation methods is used to evaluate the properties of the X̄ chart. In this paper, we propose the use of a pure Markov chain approach to study the performance of the X̄ chart. The performance of the X̄ chart with variable parameters and the X̄ chart with double sampling are compared.
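A minimal sketch of how a pure Markov chain evaluation of the X̄ chart's average run length (ARL) can work when the mean wanders as an AR(1) series: discretize the mean into states, combine the AR(1) transition probabilities with the per-state signal probability, and solve a linear system. The discretization, AR parameters, and limits below are illustrative choices, not the paper's actual design.

```python
import numpy as np
from scipy.stats import norm

phi, sigma_mu = 0.8, 0.5      # AR(1): mu_t = phi*mu_{t-1} + eps, eps ~ N(0, sigma_mu^2)
n, L = 5, 3.0                 # sample size and control-limit width
m = 41                        # number of discretized mean states
grid = np.linspace(-3, 3, m)  # discretized values of the wandering mean (sigma units)
h = grid[1] - grid[0]

# AR(1) transition probabilities between mean states (midpoint rule)
T = norm.pdf((grid[None, :] - phi * grid[:, None]) / sigma_mu) * h / sigma_mu
T /= T.sum(axis=1, keepdims=True)

# Signal probability at each mean state: X-bar ~ N(mu, 1/n) in sigma units
se = 1 / np.sqrt(n)
p = norm.sf((L * se - grid) / se) + norm.cdf((-L * se - grid) / se)

# Transient chain: move to state j and survive (no signal) there;
# ARL solves (I - Q) N = 1 for the expected number of samples to signal.
Q = T * (1 - p)[None, :]
arl = np.linalg.solve(np.eye(m) - Q, np.ones(m))
print("ARL starting from an in-control mean:", arl[m // 2])
```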

Relevance:

40.00%

Publisher:

Abstract:

Recent studies have shown that the X̄ chart with variable sampling intervals (VSI) and/or with variable sample sizes (VSS) detects process shifts faster than the traditional X̄ chart. This article extends these studies to processes that are monitored by both the X̄ and R charts. A Markov chain model is used to determine the properties of the joint X̄ and R charts with variable sample sizes and sampling intervals (VSSI). The VSSI scheme improves the performance of the joint X̄ and R control charts in terms of the speed with which shifts in the process mean and/or variance are detected.
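The switching logic behind VSSI-type schemes can be illustrated in a few lines: a statistic in the central region relaxes the design, one in the warning region tightens it. The warning limit and the two (sample size, interval) designs below are hypothetical values, not those of the article.

```python
def next_design(z, w=1.0):
    """VSSI-style switching rule (illustrative thresholds):
    if the standardized X-bar statistic falls in the central region,
    relax (small sample, long interval); if it falls in the warning
    region, tighten (large sample, short interval)."""
    small = {"n": 3, "interval_h": 2.0}   # relaxed design
    large = {"n": 8, "interval_h": 0.5}   # tightened design
    return small if abs(z) <= w else large

# A point just inside the warning region tightens the design
print(next_design(1.4))   # -> {'n': 8, 'interval_h': 0.5}
```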

Relevance:

40.00%

Publisher:

Abstract:

A standard X̄ chart for controlling the process mean takes samples of size n₀ at specified, equally spaced, fixed time points. This article proposes a modification of the standard X̄ chart that allows one to take additional samples, larger than n₀, between these fixed times. The additional samples are taken from the process when there is evidence that the process mean has moved off target. Following the notation proposed by Reynolds (1996a) and Costa (1997), we refer to the proposed chart as the VSSIFT X̄ chart, where VSSIFT stands for variable sample size and sampling intervals with fixed times. The X̄ chart with the VSSIFT feature is easier to administer than a standard VSSI X̄ chart, which is not constrained to sample at the specified fixed times. The performances of the two charts in detecting process mean shifts are comparable.
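A hedged sketch of the VSSIFT scheduling idea: routine samples of size n₀ are taken at the fixed times, and a larger extra sample is inserted between fixed times only when the last statistic suggests the mean has moved. All thresholds, sizes, and function names are invented for illustration.

```python
def schedule_next_sample(t_now, z, fixed_times, n0=5, n_extra=12,
                         w=1.0, lead_h=0.5):
    """VSSIFT-style scheduling sketch (illustrative parameters):
    routine samples of size n0 at the fixed times; a larger extra
    sample is inserted between fixed times only when the last
    standardized X-bar statistic z lands in the warning region."""
    t_fixed = min(t for t in fixed_times if t > t_now)
    if abs(z) > w and t_now + lead_h < t_fixed:
        return t_now + lead_h, n_extra     # extra sample, bigger size
    return t_fixed, n0                     # stick to the fixed schedule

print(schedule_next_sample(t_now=3.0, z=1.6, fixed_times=[2, 4, 6, 8]))
# -> (3.5, 12): an extra large sample before the next fixed time at t=4
```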

Relevance:

40.00%

Publisher:

Abstract:

The steady-state average run length is used to measure the performance of the recently proposed synthetic double sampling X̄ chart (synthetic DS chart). The overall performance of the DS X̄ chart in signaling process mean shifts of different magnitudes does not improve when it is integrated with the conforming run length chart, except when the integrated charts are designed to offer very high protection against false alarms and the use of large samples is prohibitive. Without the side-sensitive feature, the synthetic chart signals when a second point falls beyond the control limits, regardless of whether one of the two points falls above the centerline and the other below it; with the side-sensitive feature, the synthetic chart does not signal when they fall on opposite sides of the centerline. We also investigated the steady-state average run length of the side-sensitive synthetic DS X̄ chart. With the side-sensitive feature, the overall performance of the synthetic DS X̄ chart improves, but not enough to outperform the non-synthetic DS X̄ chart.
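The two signaling rules being compared can be made concrete with a small sketch; the limits below are illustrative, and the handling of the conforming run length at startup is simplified.

```python
def synthetic_signal(points, limit=3.0, crl_limit=5, side_sensitive=True):
    """Sketch of the synthetic chart's signaling rule (parameters
    illustrative). A point beyond +/-limit is nonconforming; the chart
    signals when two nonconforming points occur within crl_limit
    samples of each other. The side-sensitive variant additionally
    requires both points to fall on the same side of the centerline."""
    last_nc = None  # (index, side) of the previous nonconforming point
    for i, z in enumerate(points):
        if abs(z) > limit:
            side = 1 if z > 0 else -1
            if last_nc is not None and i - last_nc[0] <= crl_limit:
                if not side_sensitive or side == last_nc[1]:
                    return i          # signal at sample i
            last_nc = (i, side)
    return None                       # no signal

series = [0.2, 3.4, -0.5, -3.2, 0.1, -3.5]
print(synthetic_signal(series, side_sensitive=False))  # 3: opposite sides count
print(synthetic_signal(series, side_sensitive=True))   # 5: waits for same side
```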

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this study was to assess the accuracy and precision of airborne volatile organic compound (VOC) concentrations measured using passive air samplers (3M 3500 organic vapor monitors) over extended sampling durations (9 and 15 days). A total of forty-five organic vapor monitor samples were collected at a State of Texas air monitoring site during two different sampling periods (July/August and November 2008). The results of this study indicate that for most of the tested compounds, there was no significant difference between long-term (9- or 15-day) sample concentrations and the means of parallel consecutive short-term (3-day) sample concentrations. Biases of 9- or 15-day measurements vs. consecutive 3-day measurements showed considerable variability. Those compounds that had percent bias values of <10% are suggested as acceptable for long-term sampling (9 and 15 days). Of the twenty-one compounds examined, 10 were classified as acceptable for long-term sampling; these include m,p-xylene, 1,2,4-trimethylbenzene, n-hexane, ethylbenzene, benzene, toluene, o-xylene, d-limonene, dimethylpentane and methyl tert-butyl ether. The ratio of sampling procedure variability to within-day variability was approximately 1.89 for both sampling periods for the 3-day vs. 9-day comparisons and approximately 2.19 for both sampling periods for the 3-day vs. 15-day comparisons. Considerably higher concentrations of most VOCs were measured during the November sampling period compared to the July/August period. These differences may be a result of varying meteorological conditions during the two time periods, e.g., differences in wind direction and wind speed. Further studies are suggested to evaluate the accuracy and precision of 3M 3500 organic vapor monitors over extended sampling durations.
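A small sketch of the percent-bias comparison described above, assuming the common definition of bias relative to the mean of the parallel consecutive short-term samples; the concentrations are invented, not from the study.

```python
import numpy as np

def percent_bias(long_term, short_terms):
    """Percent bias of one long-duration passive-sampler concentration
    against the mean of its parallel consecutive short-duration samples
    (a common definition; the 10% acceptance cut-off follows the text)."""
    ref = np.mean(short_terms)
    return 100.0 * (long_term - ref) / ref

# Illustrative numbers (ppb), not from the study:
print(percent_bias(2.10, [2.0, 2.2, 2.1]))  # ~0%  -> acceptable
print(percent_bias(2.60, [2.0, 2.2, 2.1]))  # ~24% -> not acceptable
```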

Relevance:

30.00%

Publisher:

Abstract:

Field and laboratory measurements identified a complex relationship between odour emission rates provided by the US EPA dynamic emission chamber and the University of New South Wales wind tunnel. Using a range of model compounds in an aqueous odour source, we demonstrate that emission rates derived from the wind tunnel and flux chamber are a function of the solubility of the materials being emitted, the concentrations of the materials within the liquid, and the aerodynamic conditions within the device (velocity in the wind tunnel, or flushing rate for the flux chamber). The ratio of wind tunnel to flux chamber odour emission rates (OU m⁻² s⁻¹) ranged from about 60:1 to 112:1. The corresponding ratios for the model odorants varied from about 40:1 to over 600:1. These results may provide, for the first time, a basis for the development of a model allowing an odour emission rate derived from either device to be used for odour dispersion modelling.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: This methodological paper reports on the development and validation of a work sampling instrument and data collection processes for a national study of nurse practitioners' work patterns.

Design: Published work sampling instruments provided the basis for development and validation of a tool for use in a national study of nurse practitioner work activities across diverse contextual and clinical service models. Steps taken in the approach included design of a nurse practitioner-specific data collection tool and development of an innovative web-based program to train and establish inter-rater reliability of a team of data collectors who were geographically dispersed across metropolitan, rural and remote health care settings.

Setting: The study is part of a large funded study into nurse practitioner service. The Australian Nurse Practitioner Study is a national study phased over three years, designed to provide essential information for Australian health service planners, regulators and consumer groups on the profile, process and outcome of nurse practitioner service.

Results: The outcome of this phase of the study is empirically tested instruments, processes and training materials for use in an international context by investigators interested in conducting a national study of nurse practitioner work practices.

Conclusion: Development and preparation of a new approach to describing nurse practitioner practices using work sampling methods provides the groundwork for international collaboration in the evaluation of nurse practitioner service.

Relevance:

30.00%

Publisher:

Abstract:

Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts: variation over and above that accounted for by the Poisson density. This extra-variation, or dispersion, is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models, tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31-40] challenged the fixed dispersion parameter assumption and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, exploring additional dispersion functions, using an independent data set, and presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov Chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that the known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted-for variation in crashes across sites.
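One way to make the varying-dispersion idea concrete is a negative binomial log-likelihood in which both the mean and the dispersion parameter are log-linear in covariates. This sketches the model structure only (the study estimated such models in a Bayesian framework via MCMC rather than maximizing a likelihood), and the dataset and coefficients are invented.

```python
import numpy as np
from scipy.special import gammaln

def nb_loglik(y, X, beta, Z, gamma):
    """Log-likelihood of a negative binomial crash-count model where
    both the mean and the dispersion depend on covariates:
        mean:       mu_i    = exp(X_i . beta)
        dispersion: alpha_i = exp(Z_i . gamma)  (inverse of the NB size)
    """
    mu = np.exp(X @ beta)
    r = 1.0 / np.exp(Z @ gamma)          # NB size parameter per site
    return np.sum(gammaln(y + r) - gammaln(r) - gammaln(y + 1)
                  + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))

# Tiny illustrative dataset: 3 sites, intercept + log major-road flow
y = np.array([2, 5, 1])
X = np.column_stack([np.ones(3), np.log([8000, 15000, 4000])])
Z = X.copy()                             # dispersion uses the same covariates
print(nb_loglik(y, X, beta=np.array([-6.0, 0.7]), Z=Z,
                gamma=np.array([0.5, -0.1])))
```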

Relevance:

30.00%

Publisher:

Abstract:

Ocean processes are dynamic, complex, and occur on multiple spatial and temporal scales. To obtain a synoptic view of such processes, ocean scientists collect data over long time periods. Historically, measurements were continually provided by fixed sensors, e.g., moorings, or gathered from ships. Recently, an increase in the utilization of autonomous underwater vehicles has enabled a more dynamic data acquisition approach. However, we still do not utilize the full capabilities of these vehicles. Here we present algorithms that produce persistent monitoring missions for underwater vehicles by balancing path following accuracy and sampling resolution for a given region of interest, which addresses a pressing need among ocean scientists to efficiently and effectively collect high-value data. More specifically, this paper proposes a path planning algorithm and a speed control algorithm for underwater gliders, which together give informative trajectories for the glider to persistently monitor a patch of ocean. We optimize a cost function that blends two competing factors: maximize the information value along the path, while minimizing deviation from the planned path due to ocean currents. Speed is controlled along the planned path by adjusting the pitch angle of the underwater glider, so that higher resolution samples are collected in areas of higher information value. The resulting paths are closed circuits that can be repeatedly traversed to collect long-term ocean data in dynamic environments. The algorithms were tested during sea trials on an underwater glider operating off the coast of southern California, as well as in Monterey Bay, California. The experimental results show significant improvements in data resolution and path reliability compared to previously executed sampling paths used in the respective regions.
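A loose sketch of the blended objective described above, with hypothetical stand-ins for the information field and the current-induced deviation penalty; the weighting, field shapes, and names are assumptions for illustration only.

```python
import numpy as np

def path_cost(waypoints, info_value, current_penalty, w=0.5):
    """Blended objective: reward information value along a candidate
    closed path while penalizing expected deviation due to ocean
    currents. `info_value` and `current_penalty` are illustrative
    callables mapping a waypoint to a scalar; w trades off the terms."""
    gain = sum(info_value(p) for p in waypoints)
    risk = sum(current_penalty(p) for p in waypoints)
    return -w * gain + (1 - w) * risk    # lower cost = better path

# Toy fields: information peaks near the origin, currents grow eastward
info = lambda p: np.exp(-np.dot(p, p))
drift = lambda p: 0.1 * abs(p[0])

square = [np.array(v, float) for v in [(0, 0), (1, 0), (1, 1), (0, 1)]]
print(path_cost(square, info, drift))    # compare against other candidate loops
```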

Relevance:

30.00%

Publisher:

Abstract:

This paper reports the feasibility and methodological considerations of using the Short Message Service Experience Sampling (SMS-ES) Method, an experience sampling research method developed to assist researchers in collecting repeated measures of consumers' affective experiences. The method combines SMS with web-based technology in a simple yet effective way. It is described using a practical implementation study that collected consumers' emotions in response to using mobile phones in everyday situations. The method is further evaluated in terms of the quality of data collected in the study, as well as against the methodological considerations for experience sampling studies. These two evaluations suggest that the SMS-ES Method is both a valid and reliable approach for collecting consumers' affective experiences. Moreover, the method can be applied across a range of for-profit and not-for-profit contexts where researchers want to capture repeated measures of consumers' affective experiences occurring over a period of time. The benefits of the method are discussed to assist researchers who wish to apply the SMS-ES Method in their own research designs.

Relevance:

30.00%

Publisher:

Abstract:

Effective, statistically robust sampling and surveillance strategies form an integral component of large agricultural industries such as the grains industry. Intensive in-storage sampling is essential for pest detection, Integrated Pest Management (IPM), determining grain quality and satisfying importing nations' biosecurity concerns, while surveillance over broad geographic regions ensures that biosecurity risks can be excluded, monitored, eradicated or contained within an area. In the grains industry, a number of qualitative and quantitative methodologies for surveillance and in-storage sampling have been considered. Primarily, research has focussed on developing statistical methodologies for in-storage sampling strategies concentrating on the detection of pest insects within a grain bulk; however, the need for effective and statistically defensible surveillance strategies has also been recognised. Interestingly, although surveillance and in-storage sampling have typically been considered independently, many techniques and concepts are common to the two fields of research. This review aims to consider the development of statistically based in-storage sampling and surveillance strategies and to identify methods that may be useful for both surveillance and in-storage sampling. We discuss the utility of new quantitative and qualitative approaches, such as Bayesian statistics, fault trees and more traditional probabilistic methods, and show how these methods may be used in both surveillance and in-storage sampling systems.
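As a small worked example of the kind of quantitative design calculation used in such sampling systems, the snippet below computes the probability of detecting at least one insect under a simple Poisson assumption. This is a textbook-style illustration, not a method quoted from the review; the density and sample masses are invented.

```python
import numpy as np

def detection_probability(density_per_kg, total_sample_kg):
    """Probability of detecting at least one insect when individuals
    are assumed randomly (Poisson) distributed through the bulk:
    P(detect) = 1 - exp(-density * mass sampled)."""
    return 1.0 - np.exp(-density_per_kg * total_sample_kg)

# E.g. at 0.1 insects/kg, how much grain gives 95% detection confidence?
for kg in (10, 20, 30):
    print(kg, "kg ->", round(detection_probability(0.1, kg), 3))
# ~0.632, 0.865, 0.950 -- roughly 30 kg for 95% detection at this density
```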

Relevance:

30.00%

Publisher:

Abstract:

Acoustic sensors provide an effective means of monitoring biodiversity at large spatial and temporal scales. They can continuously and passively record large volumes of data over extended periods; however, these data must be analysed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced users can produce accurate results; however, the time and effort required to process even small volumes of data can make manual analysis prohibitive. Our research examined the use of sampling methods to reduce the cost of analysing large volumes of acoustic sensor data while retaining high levels of species detection accuracy. Utilising five days of manually analysed acoustic sensor data from four sites, we examined a range of sampling rates and methods, including random, stratified and biologically informed. Our findings indicate that randomly selecting 120 one-minute samples from the three hours immediately following dawn provided the most effective sampling method. This method detected, on average, 62% of total species after 120 one-minute samples were analysed, compared to 34% of total species from traditional point counts. Our results demonstrate that targeted sampling methods can provide an effective means of analysing large volumes of acoustic sensor data efficiently and accurately.
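A sketch of the best-performing strategy reported above: draw random one-minute samples from the three hours after dawn and accumulate the species detected. The presence data are simulated here (real data would come from manual analysis of recordings), and all names and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def dawn_random_sample(minute_species, dawn_start, n_samples=120, rng=rng):
    """Randomly select one-minute samples from the 3-hour post-dawn
    window and return the union of species detected. `minute_species`
    maps each minute of the day to the set of species audible then."""
    window = np.arange(dawn_start, dawn_start + 180)          # 3 h of minutes
    chosen = rng.choice(window, size=n_samples, replace=False)
    detected = set()
    for m in chosen:
        detected |= minute_species[m]
    return detected

# Toy data: 40 species, each audible in a random ~5% of each day's minutes
species_pool = range(40)
sim = {m: {s for s in species_pool if rng.random() < 0.05}
       for m in range(1440)}
found = dawn_random_sample(sim, dawn_start=300)
print(f"{len(found)} of 40 species detected")
```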