946 results for Thompson sampling
Abstract:
Spinosad, diatomaceous earth, and cyfluthrin were assessed on two broiler farms at Gleneagle and Gatton in southeastern Queensland, Australia, in 2004-2005 and 2007-2009, respectively, to determine their effectiveness in controlling the lesser mealworm, Alphitobius diaperinus (Panzer) (Coleoptera: Tenebrionidae). Insecticide treatments were applied mostly to earth or 'hard' cement floors of broiler houses before the placement of new bedding. Efficacy of each agent was assessed by regular sampling of litter, counting of immature stages and adult beetles, and comparison of insect counts in treated houses with counts in untreated houses. Generally, the lowest numbers of lesser mealworm were recorded in the house with hard floors, and these numbers equalled those of the most effective spinosad applications. The most effective treatment was a strategic application of spinosad under feed supply lines on a hard floor. In compacted-earth-floor houses, mean numbers of lesser mealworms for two under-feed-line spinosad treatments (a 2-m-wide application at 0.18 g of active ingredient [AI] in 100 ml of water/m², and a 1-m-wide application at 0.11 g [AI] in 33 ml of water/m²) and an entire-floor spinosad treatment (0.07 g [AI] in 86 ml of water/m²) were significantly lower (i.e., better control) than those for cyfluthrin and for the untreated controls. The 1-m-wide under-feed-line treatment was the most cost-effective dose, providing control similar to the other two most effective spinosad treatments while using less than half the active ingredient per broiler house. No efficacy was demonstrated when spinosad was applied to the surface of bedding in relatively large volumes of water. All applications of diatomaceous earth, with or without spinosad, and of cyfluthrin at the label rate of 0.02 g [AI] in 100 ml of water/m² showed no effect, with insect counts not significantly different from untreated controls. Overall, the results of this field assessment indicate that cyfluthrin (the Australian industry standard) and diatomaceous earth were ineffective on these two farms and that spinosad can be a viable alternative for broiler house use.
Abstract:
We present a Bayesian sampling algorithm called adaptive importance sampling or population Monte Carlo (PMC), whose computational workload is easily parallelizable and which thus has the potential to considerably reduce the wall-clock time required for sampling, along with providing other benefits. To assess the performance of the approach for cosmological problems, we use simulated and actual data consisting of CMB anisotropies, supernovae of type Ia, and weak cosmological lensing, and compare the results with those obtained using state-of-the-art Markov chain Monte Carlo (MCMC). For both types of data set, we find comparable parameter estimates from PMC and MCMC, with the advantage of a significantly lower wall-clock time for PMC. In the case of WMAP5 data, for example, the wall-clock time is reduced from days for MCMC to hours for PMC on a cluster of processors. Other benefits of the PMC approach, along with potential difficulties in using it, are analyzed and discussed.
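For readers unfamiliar with the method, the sketch below illustrates the generic PMC idea on a toy target; it is not the authors' cosmological pipeline, and the target density, Gaussian proposal family, and all settings are illustrative assumptions. A proposal is repeatedly sampled, importance-weighted against the target, and re-fitted to the weighted population; the sampling step is what parallelizes trivially.

```python
# Minimal population Monte Carlo (adaptive importance sampling) sketch on a
# toy posterior. All settings are illustrative assumptions.
import numpy as np

def log_post(theta):
    # Toy target: standard bivariate Gaussian posterior.
    return -0.5 * np.sum(theta**2, axis=1)

def pmc(n_iter=10, n_samples=2000, dim=2, seed=0):
    rng = np.random.default_rng(seed)
    mean, cov = np.zeros(dim), 4.0 * np.eye(dim)   # deliberately broad start
    for _ in range(n_iter):
        # 1. Draw a population from the current proposal (embarrassingly parallel).
        samples = rng.multivariate_normal(mean, cov, size=n_samples)
        # 2. Importance weights = target / proposal, computed in log space.
        diff = samples - mean
        log_q = -0.5 * np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        log_q -= 0.5 * np.log(np.linalg.det(2 * np.pi * cov))
        log_w = log_post(samples) - log_q
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # 3. Adapt the proposal to the weighted population.
        mean = w @ samples
        diff = samples - mean
        cov = (w[:, None] * diff).T @ diff + 1e-6 * np.eye(dim)
    return samples, w

samples, w = pmc()
print("weighted posterior mean:", w @ samples)
```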
Abstract:
Pathogens and pests of stored grain move through complex dynamic networks linking fields, farms, and bulk storage facilities. Human transport and other forms of dispersal link the components of this network. A network model for pathogen and pest movement through stored grain systems is a first step toward new sampling and mitigation strategies that exploit information about the network structure. An understanding of network structure can be used to identify the key network components for pathogen or pest movement through the system. For example, it may be useful to identify a network node, such as a local grain storage facility, through which grain from a large number of fields is accumulated before moving on through the network; such a node may be particularly important for sampling and mitigation. In some cases, more detailed information about network structure can identify key nodes that link two large sections of the network, such that management at those nodes greatly reduces the risk of spread between the two sections. In addition to the spread of particular species of pathogens and pests, we also evaluate the spread of problematic subpopulations, such as subpopulations with pesticide resistance. We present an analysis of stored grain pathogen and pest networks for Australia and the United States.
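As a rough illustration of how network structure can flag key components, the sketch below uses standard graph measures on a toy edge list (not the authors' model or data): betweenness centrality highlights accumulation points such as local storage facilities, and bridge detection finds links whose removal separates large sections of the network.

```python
# Toy grain-movement network; edge list and node names are purely illustrative.
import networkx as nx

# Directed movement: fields -> local storage -> bulk terminal.
edges = [("field1", "silo_A"), ("field2", "silo_A"), ("field3", "silo_A"),
         ("field4", "silo_B"), ("field5", "silo_B"),
         ("silo_A", "bulk_terminal"), ("silo_B", "bulk_terminal")]
G = nx.DiGraph(edges)

# Accumulation points: nodes through which many shortest paths pass.
betweenness = nx.betweenness_centrality(G)
print(sorted(betweenness.items(), key=lambda kv: -kv[1])[:3])

# Links between sections: bridges of the undirected skeleton.
print(list(nx.bridges(G.to_undirected())))
```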
Abstract:
Invasive and noxious weeds are a well-known and pervasive problem, imposing significant economic burdens on all areas of agriculture. Whilst there are multiple possible pathways of weed dispersal in the fodder industry, of particular interest to this discussion is the unintended dispersal of weed seeds within fodder. During periods of drought, or following natural disasters such as wildfire or flood, there arises an urgent need for 'relief' fodder to ensure the survival and recovery of livestock. In emergency situations, relief fodder may be sourced from widely dispersed geographic regions, some of which may be invaded by an extensive variety of weeds that are both exotic and detrimental to the fodder's intended destination. Pasture hay is a common source of relief fodder and typically consists of a mixture of grassy and broadleaf species that may include noxious weeds. When required urgently, pasture hay for relief fodder can be cut, baled, and transported over long distances in a short period of time, with little opportunity for pre-baling inspection. To date there has been little effort towards rapid post-baling testing of bales for the presence of noxious weeds as a measure to prevent dispersal of seeds. Published studies have relied on the destructive testing of relatively small numbers of bales to reveal seed species for identification and enumeration. The development of faster, more reliable, and non-destructive sampling methods is essential to increase the fodder industry's capacity to prevent the dispersal of noxious weeds to previously unaffected locales.
Abstract:
The rapid uptake of transcriptomic approaches in freshwater ecology has produced a wealth of data on the ways in which organisms interact with their environment at a molecular level. Typically, such studies focus either at the community level, and so do not require species identifications, or on laboratory strains of known species identity or natural populations of large, easily identifiable taxa. For chironomids, impediments still exist to applying these technologies to natural populations because they are small-bodied and often require time-consuming secondary sorting of stream material and morphological voucher preparation to confirm species diagnosis. These procedures limit the ability to maintain RNA quantity and quality, because RNA degrades rapidly and gene expression can be altered rapidly in organisms, thereby limiting the inclusion of such taxa in transcriptomic studies. Here, we demonstrate that these limitations can be overcome and outline an optimised protocol for collecting, sorting and preserving chironomid larvae that enables retention of both morphological vouchers and RNA for subsequent transcriptomic purposes. By ensuring that sorting and voucher preparation are completed less than 4 hours after collection and that samples are kept cold at all times, we successfully retained both RNA and morphological vouchers from all specimens. Although not prescriptive in specific methodology, we anticipate that this paper will assist in promoting transcriptomic investigations of the sublethal impacts of changes to aquatic environments on chironomid gene expression.
Abstract:
Quantifying fluxes of nitrous oxide (N₂O), a potent greenhouse gas, from soils is necessary to improve our knowledge of terrestrial N₂O losses. Developing universal sampling frequencies for calculating annual N₂O fluxes is difficult, as fluxes are renowned for their high temporal variability. We demonstrate that daily sampling was largely required to achieve annual N₂O fluxes within 10% of the best estimate for 28 annual datasets collected from three continents: Australia, Europe and Asia. Decreasing the regularity of measurements either under- or overestimated annual N₂O fluxes, with a maximum overestimation of 935%. Measurement frequency could be lowered using a sampling strategy based on environmental factors known to affect temporal variability, but sampling more than once a week was still required. Consequently, uncertainty in current global terrestrial N₂O budgets associated with the upscaling of field-based datasets can be decreased significantly by using adequate sampling frequencies.
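The sketch below illustrates, on synthetic data rather than the study's 28 datasets, why sparse sampling of an episodic flux record under- or overestimates the annual total: whether a rare emission pulse happens to fall on a sampling day dominates the estimate.

```python
# Synthetic daily N2O record with episodic pulses; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
days = 365
baseline = rng.lognormal(mean=0.0, sigma=0.3, size=days)       # background flux
pulses = np.zeros(days)
pulses[rng.choice(days, size=8, replace=False)] = rng.uniform(20, 60, size=8)
daily_flux = baseline + pulses                                   # "true" record

best_estimate = daily_flux.sum()                                 # daily sampling
for interval in (7, 14, 28):                                     # weekly, etc.
    sampled = daily_flux[::interval]
    annual = sampled.mean() * days    # each sample represents its whole interval
    bias = 100 * (annual - best_estimate) / best_estimate
    print(f"every {interval:2d} days: {bias:+.1f}% relative to best estimate")
```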
Abstract:
Accurately quantifying total greenhouse gas emissions (e.g. methane) from natural systems such as lakes, reservoirs and wetlands requires spatiotemporal measurement of both diffusive and ebullitive (bubbling) emissions. Traditional manual measurement techniques provide only limited, localised assessments of methane flux, often introducing significant errors when extrapolated to the whole of the system. In this paper, we directly address these sampling limitations and present a novel multiple-robotic-boat system configured to measure the spatiotemporal release of methane to atmosphere across inland waterways. The system, consisting of multiple networked Autonomous Surface Vehicles (ASVs) and capable of persistent operation, enables scientists to remotely evaluate the performance of sampling and modelling algorithms for real-world process quantification over extended periods of time. This paper provides an overview of the multi-robot sampling system, including the vehicle and gas sampling unit design. Experimental results demonstrate the system's ability to autonomously navigate and implement an exploratory sampling algorithm to measure methane emissions on two inland reservoirs.
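One simple way to realise an exploratory sampling rule, sketched below purely as an assumption (the abstract does not specify the algorithm used by the ASVs), is a greedy strategy in which the vehicle repeatedly moves to the grid cell whose flux estimate is currently most uncertain, with a penalty on travel distance.

```python
# Greedy uncertainty-driven exploratory sampling sketch; the grid, flux field
# and all parameters are illustrative assumptions, not the paper's system.
import numpy as np

rng = np.random.default_rng(0)
grid = np.array([(x, y) for x in range(10) for y in range(10)], dtype=float)
uncertainty = np.ones(len(grid))             # prior: every cell equally unknown
position = np.array([0.0, 0.0])
measurements = []

def true_flux(p):
    # Hidden methane flux field used only to generate toy measurements.
    return np.exp(-np.sum((p - np.array([7.0, 3.0]))**2) / 8.0)

for step in range(25):
    travel = np.linalg.norm(grid - position, axis=1)
    score = uncertainty - 0.05 * travel      # explore, but penalise long trips
    target = int(np.argmax(score))
    position = grid[target]
    measurements.append((tuple(position), true_flux(position)))
    # Measuring a cell reduces uncertainty in its neighbourhood.
    near = np.linalg.norm(grid - position, axis=1) < 2.0
    uncertainty[near] *= 0.3

print(f"{len(measurements)} samples taken, mean residual uncertainty "
      f"{uncertainty.mean():.2f}")
```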
Abstract:
Emissions of coal combustion fly ash through full-scale electrostatic precipitators (ESPs) were studied under different coal combustion and operating conditions. Sub-micron fly-ash aerosol emissions from a power plant boiler and the ESP were determined, and aerosol penetration was derived from electrical mobility measurements, giving an indication of the size and the maximum extent to which small particles can escape. The experiments indicate a maximum penetration of 4% to 20% of the small particles when counted on a number basis rather than the normally used mass basis, even while the ESP is operating at nearly 100% collection efficiency on a mass basis. Although the penetrating size range appears independent of the coal, the boiler, and even the emission-control device, the maximum penetration level on a number basis depends on the ESP operating parameters. The measured emissions were stable during stable boiler operation for a given fired coal, and differed between coals, indicating that the sub-micron size distribution of the fly ash could be used as a characteristic signature for recognition, for instance for authentication, provided stable operation is known. Consequently, the results suggest an optimum particle size range for environmental monitoring with respect to the probability of finding traces in samples. The current work also describes an authentication system for aerosol samples for post-inspection from any macroscopic sample piece. The system can comprise newly introduced devices, used independently or in combination, arranged to extend the sampling operation length and/or the tag selection diversity. The tag for the samples can be based on naturally occurring and/or added measures of authenticity in a suitable combination. The method has not only military applications but civil industrial ones as well. In addition to aerosol samples, the system can be applied to ink for printing banknotes or other papers of monetary value, and to the marking of fibrous filters during filter manufacturing.
Abstract:
We evaluated trained listener-based acoustic sampling as a reliable and non-invasive method for rapid assessment of ensiferan species diversity in tropical evergreen forests. This was done by assessing the reliability of identification of species and of numbers of calling individuals in psychoacoustic experiments in the laboratory, and by comparing psychoacoustic sampling in the field with ambient noise recordings made at the same time. The reliability of correct species identification by the trained listener was 100% for 16 out of 20 species tested in the laboratory. The reliability of identifying the numbers of individuals correctly was 100% for 13 out of 20 species. The human listener performed slightly better than the instrument in detecting low frequency and broadband calls in the field, whereas the recorder detected high frequency calls with greater probability. To address the problem of pseudoreplication during spot sampling in the field, we monitored the movement of calling individuals using focal animal sampling. The average distance moved by calling individuals was less than 1.5 m in half an hour for 17 out of 20 species. We suggest that trained listener-based sampling is preferable for crickets and low frequency katydids, whereas broadband recorders are preferable for katydid species with high frequency calls, for accurate estimation of ensiferan species richness and relative abundance in an area.
Abstract:
As an extension to an activity introducing Year 5 students to the practice of statistics, the software TinkerPlots made it possible to collect repeated random samples from a finite population and so to explore informally students' capacity to begin reasoning with a distribution of sample statistics. This article provides background on the sampling process and reports on the students' success in making predictions about the population from the collection of simulated samples and in explaining their strategies. The activity provided an application of the numeracy skill of using percentages, the numerical summary of the data, rather than graphing the data, in the analysis of samples to make decisions on a statistical question. About 70% of students made what were considered at least moderately good predictions of the population percentages for five yes-no questions, and the correlation between predictions and explanations was 0.78.
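The sketch below mimics the kind of repeated random sampling the TinkerPlots activity simulates; the population size, sample size, and true "yes" percentage are illustrative assumptions, not the study's data.

```python
# Repeated random samples from a finite yes/no population; students reason
# from the distribution of sample percentages back to the population.
import random

random.seed(3)
population = [1] * 60 + [0] * 40           # finite population, 60% answer "yes"
sample_percentages = []
for _ in range(50):                        # draw 50 repeated random samples
    sample = random.sample(population, 20) # sample of 20 without replacement
    sample_percentages.append(100 * sum(sample) / len(sample))

sample_percentages.sort()
print("median of sample percentages:", sample_percentages[25])
print("range:", sample_percentages[0], "-", sample_percentages[-1])
```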
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM) is a form of multi-carrier modulation in which the data stream is transmitted over a number of mutually orthogonal carriers, i.e., the carrier spacing is selected such that each carrier is located at the zeroes of all other carriers in the spectral domain. This paper proposes a novel sampling offset estimation algorithm for an OFDM system, intended to allow error-free reception of the OFDM data symbols over a noisy channel and to achieve fine timing synchronization between the transmitter and the receiver. The performance of the algorithm has been studied successfully in AWGN, ADSL and SUI channels.
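The abstract does not give the estimator itself, but sampling offset estimators of this kind typically exploit a standard property, sketched below: a timing offset of d samples rotates subcarrier k of an N-point OFDM symbol by exp(-j2πkd/N), so d can be recovered from the phase slope across known pilot subcarriers. The pilot pattern and parameters here are illustrative assumptions.

```python
# Recovering an integer sampling offset from the per-subcarrier phase slope.
import numpy as np

N = 64                                        # subcarriers (assumed)
rng = np.random.default_rng(7)
pilots = rng.choice([1, -1], size=N) + 0j     # known BPSK pilot symbol
tx = np.fft.ifft(pilots) * np.sqrt(N)         # OFDM modulation via IFFT

true_offset = 3                               # sample offset (assumed, < N/2)
rx_time = np.roll(tx, true_offset)            # channel delays the symbol
rx = np.fft.fft(rx_time) / np.sqrt(N)         # demodulation via FFT

# Phase difference between adjacent equalized subcarriers averages -2*pi*d/N.
phase_step = np.angle(np.sum(rx[1:] * np.conj(pilots[1:]) *
                             np.conj(rx[:-1] * np.conj(pilots[:-1]))))
estimated_offset = -phase_step * N / (2 * np.pi)
print(f"true offset: {true_offset}, estimated: {estimated_offset:.2f}")
```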
Abstract:
The Earth's climate is a highly dynamic and complex system in which atmospheric aerosols have been increasingly recognized to play a key role. Aerosol particles affect the climate through a multitude of processes, directly by absorbing and reflecting radiation and indirectly by changing the properties of clouds. Because of this complexity, quantification of the effects of aerosols remains highly uncertain. Better understanding of the effects of aerosols requires more information on aerosol chemistry. Before the chemical composition of aerosols can be determined by the various available analytical techniques, the particles must be reliably sampled and prepared. Indeed, sampling is one of the most challenging steps in aerosol studies, since all available sampling techniques have drawbacks. In this study, novel methodologies were developed for sampling and for determining the chemical composition of atmospheric aerosols.

In the particle-into-liquid sampler (PILS), aerosol particles grow in saturated water vapor and are then impacted into and dissolved in liquid water. Once in water, the aerosol sample can be transported and analyzed by various off-line or on-line techniques. In this study, PILS was modified and the sampling procedure was optimized to obtain less altered aerosol samples with good time resolution. A combination of denuders with different coatings was tested to adsorb gas-phase compounds before PILS. Mixtures of water with alcohols were introduced to increase the solubility of aerosols. The minimum sampling time required was determined by collecting samples off-line every hour and proceeding with liquid-liquid extraction (LLE) and analysis by gas chromatography-mass spectrometry (GC-MS).

The laboriousness of LLE followed by GC-MS analysis prompted an evaluation of solid-phase extraction (SPE) for the extraction of aldehydes and acids in aerosol samples, two compound groups thought to be key for aerosol growth. Octadecylsilica, hydrophilic-lipophilic balance (HLB), and mixed-phase anion exchange (MAX) materials were tested for extraction. MAX proved efficient for acids, but no tested material offered sufficient adsorption for aldehydes. Thus, PILS samples were extracted only with MAX to guarantee good results for organic acids determined by high-performance liquid chromatography-mass spectrometry (HPLC-MS). On-line coupling of SPE with HPLC-MS is relatively straightforward, and here on-line coupling of PILS with HPLC-MS through the SPE trap produced interesting data on relevant acids in atmospheric aerosol samples.

A completely different approach to aerosol sampling, differential mobility analyzer (DMA)-assisted filter sampling, was employed in this study to provide information about the size-dependent chemical composition of aerosols and insight into the processes driving aerosol growth from nano-sized clusters to climatically relevant particles (>40 nm). The DMA was set to sample particles with diameters of 50, 40, and 30 nm, and the aerosols were collected on Teflon or quartz fiber filters. To clarify the gas-phase contribution, zero gas-phase samples were collected by switching off the DMA for every other 15-minute period. Gas-phase compounds were adsorbed equally well on both types of filter and were found to contribute significantly to the total compound mass. Gas-phase adsorption is especially significant during the collection of nanometer-sized aerosols and always needs to be taken into account.
A further aim of this study was to determine the oxidation products of β-caryophyllene (the major sesquiterpene in boreal forests) in aerosol particles. Since reference compounds are needed to verify the accuracy of analytical measurements, three oxidation products of β-caryophyllene were synthesized: β-caryophyllene aldehyde, β-nocaryophyllene aldehyde, and β-caryophyllinic acid. All three were identified for the first time in ambient aerosol samples, at relatively high concentrations, and their contribution to the aerosol mass (and probably growth) was concluded to be significant. The methodological and instrumental developments presented in this work enable a fuller understanding of the processes behind biogenic aerosol formation and provide new tools for more precise determination of biosphere-atmosphere interactions.