924 results for sampling methods


Relevance: 70.00%

Abstract:

Killer whale (Orcinus orca Linnaeus, 1758) abundance in the North Pacific is known only for a few populations for which extensive longitudinal data are available, with little quantitative data from more remote regions. Line-transect ship surveys were conducted in July and August of 2001–2003 in coastal waters of the western Gulf of Alaska and the Aleutian Islands. Conventional and Multiple Covariate Distance Sampling methods were used to estimate the abundance of different killer whale ecotypes, which were distinguished based upon morphological and genetic data. Abundance was calculated separately for two data sets that differed in the method by which killer whale group size data were obtained. Initial group size (IGS) data corresponded to estimates of group size at the time of first sighting, and post-encounter group size (PEGS) corresponded to estimates made after closely approaching sighted groups.
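The conventional distance sampling estimator behind this abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes an untruncated half-normal detection function, and the helper names `half_normal_esw` and `density_estimate` are invented for illustration.

```python
import math

def half_normal_esw(distances):
    """Effective strip half-width (ESW) for a half-normal detection function
    g(x) = exp(-x^2 / (2 sigma^2)); sigma is the MLE from perpendicular
    sighting distances, and truncation is ignored for simplicity."""
    n = len(distances)
    sigma = math.sqrt(sum(d * d for d in distances) / n)
    return sigma * math.sqrt(math.pi / 2)  # integral of g(x) from 0 to infinity

def density_estimate(distances, total_line_length, group_sizes):
    """Line-transect density of individuals: D = n * E[group size] / (2 * L * ESW)."""
    esw = half_normal_esw(distances)
    n = len(distances)
    mean_size = sum(group_sizes) / n
    return n * mean_size / (2.0 * total_line_length * esw)
```

The two group-size data sets described in the abstract (IGS vs. PEGS) would enter such an estimator only through the `group_sizes` argument.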

Relevance: 70.00%

Abstract:

We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set. A simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation, and the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
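The thinned spatial point process likelihood described above can be sketched numerically. The grid approximation of the integral and the function names below are illustrative assumptions, not the paper's implementation:

```python
import math

def thinned_pp_loglik(points, intensity, detection, grid, cell_area):
    """Log-likelihood of a spatial Poisson process with intensity lambda(s),
    thinned by a detection probability p(s): each observed point contributes
    log(lambda(s) * p(s)), minus the integral of the thinned intensity over
    the study region, approximated here by a sum over grid cells."""
    ll = sum(math.log(intensity(s) * detection(s)) for s in points)
    integral = sum(intensity(s) * detection(s) for s in grid) * cell_area
    return ll - integral
```

With covariates, `intensity` would be a log-linear function of them, which is what lets detection and intensity parameters be estimated jointly.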

Relevance: 70.00%

Abstract:

"How large a sample is needed to survey the bird damage to corn in a county in Ohio or New Jersey or South Dakota?" Like those in the Bureau of Sport Fisheries and Wildlife and the U.S.D.A. who have been faced with a question of this sort, we found only meager information on which to base an answer, whether the problem related to a county in Ohio or to one in New Jersey, or elsewhere. Many sampling methods and rates of sampling did yield reliable estimates, but the judgment was often intuitive or based on the reasonableness of the resulting data. Later, when planning the next study or survey, little additional information was available on whether 40 samples of 5 ears each or 5 samples of 200 ears should be examined, i.e., examination of a large number of small samples or a small number of large samples. What information is needed to make a reliable decision? Those of us involved with the Agricultural Experiment Station regional project concerned with the problems of bird damage to crops, known as NE-49, thought we might supply an answer if we had a corn field in which all the damage was measured. If all the damage were known, we could then sample this field in various ways and see how the estimates from these samplings compared to the actual damage, and pinpoint the best and most accurate sampling procedure. Eventually the investigators in four states became involved in this work, and instead of one field we were able to broaden the geographical base by examining all the corn ears in 2 half-acre sections of fields in each state, 8 sections in all. When the corn had matured well past the dough stage, damage on each corn ear was assessed, without removing the ear from the stalk, by visually estimating the percent of the kernel surface which had been destroyed and rating it in one of 5 damage categories.
Measurements (by row-centimeters) of the rows of kernels pecked by birds also were made on selected ears representing all categories and all parts of each field section. These measurements provided conversion factors that, when fed into a computer, were applied to the more than 72,000 visually assessed ears. The machine now had in its memory and could supply on demand a map showing each ear, its location and the intensity of the damage.
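The "many small samples vs. few large samples" question lends itself to simulation once a fully measured field is available. The sketch below is hypothetical (a synthetic damage field, not the NE-49 data) and simply shows how one sampling design can be evaluated against the known truth:

```python
import random

def survey_estimate(field, n_samples, ears_per_sample, rng):
    """Estimate mean damage by examining n_samples randomly placed clusters
    of ears_per_sample consecutive ears each (a cluster mimics sampling one
    spot in the field). 'field' holds the known damage score of every ear."""
    total = 0.0
    for _ in range(n_samples):
        start = rng.randrange(len(field) - ears_per_sample + 1)
        total += sum(field[start:start + ears_per_sample])
    return total / (n_samples * ears_per_sample)
```

Repeating such estimates for a field with spatially clustered damage, and comparing their spread to the true field mean, is what reveals whether 40 samples of 5 ears or 5 samples of 200 ears is the better design.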

Relevance: 70.00%

Abstract:

Studies have shown similarities in the microflora between titanium implant and tooth sites when samples are taken by gingival crevicular fluid (GCF) sampling methods. The purpose of the present study was to compare the microflora from curette and GCF samples, using the checkerboard DNA-DNA hybridization method, in patients who had at least one oral osseo-integrated implant and who were otherwise dentate. Plaque samples were taken from tooth/implant surfaces and from sulcular gingival surfaces with curettes, and from gingival fluid using filter papers. A total of 28 subjects (11 females) were enrolled in the study. The mean age of the subjects was 64.1 years (SD+/-4.7). On average, the implants studied had been in function for 3.7 years (SD+/-2.9). The proportions of Streptococcus oralis (P<0.02) and Fusobacterium periodonticum (P<0.02) were significantly higher at tooth sites (curette samples). The GCF samples yielded higher proportions for 28/40 species studied (P-values varying between 0.05 and 0.001). The proportions of Tannerella forsythia (T. forsythensis) and Treponema denticola were both higher in GCF samples (P<0.02 and P<0.05, respectively) than in curette samples (implant sites). The microbial composition in gingival fluid from samples taken at implant sites differed partly from that of curette samples taken from implant surfaces or from sulcular soft tissues, providing higher counts for most bacteria studied at implant surfaces, with the exception of Porphyromonas gingivalis. A combination of GCF and curette sampling methods might be the most representative sampling method.

Relevance: 70.00%

Abstract:

Proteins are linear chain molecules made of amino acids. They become functional only when they fold to their native states. This dissertation aims to model the solvent (environment) effect and to develop and implement enhanced sampling methods that enable a reliable study of the protein folding problem in silico. We have developed an enhanced solvation model based on the solution to the Poisson-Boltzmann equation in order to describe the solvent effect. Following the quantum mechanical Polarizable Continuum Model (PCM), we decomposed the net solvation free energy into three physical terms: polarization, dispersion and cavitation. All the terms were implemented, analyzed and parametrized individually to obtain a high level of accuracy. In order to describe the thermodynamics of proteins, their conformational space needs to be sampled thoroughly. Simulations of proteins are hampered by slow relaxation due to their rugged free-energy landscape, with the barriers between minima being higher than the thermal energy at physiological temperatures. To overcome this problem a number of approaches have been proposed, of which the replica exchange method (REM) is the most popular. In this dissertation we describe a new variant of the canonical replica exchange method in the context of molecular dynamics simulation. The advantage of this new method is its easily tunable high acceptance rate for replica exchange. We call our method Microcanonical Replica Exchange Molecular Dynamics (MREMD). We describe the theoretical framework, comment on its actual implementation, and present its application to the Trp-cage mini-protein in implicit solvent. We have been able to correctly predict the folding thermodynamics of this protein using our approach.
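For context, the canonical replica exchange method that the dissertation builds on uses a simple Metropolis criterion for swapping configurations between temperature replicas. The sketch below shows only that conventional criterion, not the microcanonical MREMD variant the thesis proposes:

```python
import math
import random

def swap_accepted(beta_i, beta_j, energy_i, energy_j, rng=random):
    """Metropolis acceptance for exchanging configurations between replicas
    at inverse temperatures beta_i and beta_j in canonical replica exchange:
    P(accept) = min(1, exp((beta_i - beta_j) * (energy_i - energy_j)))."""
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0.0 or rng.random() < math.exp(delta)
```

The acceptance rate of this rule depends on the energy overlap between neighbouring replicas, which is exactly the quantity a tunable variant aims to control.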

Relevance: 70.00%

Abstract:

Methods are described for working with Nosema apis and Nosema ceranae in the field and in the laboratory. For fieldwork, different sampling methods are described to determine colony-level infections at a given point in time, and also for following the temporal infection dynamics. Suggestions are made for how to standardise field trials for evaluating treatments and disease impact. The laboratory methods described include different means of determining colony-level and individual bee infection levels, and methods for species determination, including light microscopy, electron microscopy, and molecular methods (PCR). Suggestions are made for how to standardise cage trials, and different inoculation methods for infecting bees are described, including control methods for spore viability. A cell culture system for in vitro rearing of Nosema spp. is described. Finally, how to conduct different types of experiments is described, including infectious dose, dose effects, course of infection and longevity tests.

Relevance: 70.00%

Abstract:

Conservation and monitoring of forest biodiversity requires reliable information about forest structure and composition at multiple spatial scales. However, detailed data about forest habitat characteristics across large areas are often incomplete due to difficulties associated with field sampling methods. To overcome this limitation we employed a nationally available light detection and ranging (LiDAR) remote sensing dataset to develop variables describing forest landscape structure across a large environmental gradient in Switzerland. Using a model species indicative of structurally rich mountain forests (hazel grouse Bonasa bonasia), we tested the potential of such variables to predict species occurrence and evaluated the additional benefit of LiDAR data when used in combination with traditional, sample plot-based field variables. We calibrated boosted regression tree (BRT) models for both variable sets separately and in combination, and compared the models' accuracies. While both field-based and LiDAR models performed well, combining the two data sources improved the accuracy of the species' habitat model. The variables retained from the two datasets held different types of information: field variables mostly quantified food resources and cover in the field and shrub layer, while LiDAR variables characterized heterogeneity of vegetation structure, which correlated with field variables describing the understory and ground vegetation. When combined with data on forest vegetation composition from field surveys, LiDAR provides valuable complementary information for encompassing species niches more comprehensively. Thus, LiDAR bridges the gap between precise, locally restricted field data and coarse digital land cover information by reliably identifying habitat structure and quality across large areas.
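Boosted regression trees of the kind used for these habitat models can be sketched with regression stumps fitted to residuals. This is a toy squared-loss version under invented helper names, not the authors' BRT setup (which used a classification loss and full trees):

```python
def best_stump(X, y):
    """Best single-split regression stump under squared error."""
    best = None
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            err = (sum((yi - ml) ** 2 for yi in left)
                   + sum((yi - mr) ** 2 for yi in right))
            if best is None or err < best[0]:
                best = (err, f, t, ml, mr)
    _, f, t, ml, mr = best
    return lambda row, f=f, t=t, ml=ml, mr=mr: ml if row[f] <= t else mr

def boost(X, y, rounds=200, lr=0.1):
    """Squared-loss gradient boosting: each stump fits the current residuals,
    and predictions accumulate with a small learning rate."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = best_stump(X, resid)
        stumps.append(s)
        pred = [pi + lr * s(row) for pi, row in zip(pred, X)]
    return lambda row: sum(lr * s(row) for s in stumps)
```

Combining the field and LiDAR variable sets amounts to concatenating their columns in `X`, so the boosting procedure itself is unchanged.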

Relevance: 70.00%

Abstract:

Research on open source software (OSS) projects often focuses on the SourceForge collaboration platform. We argue that a GNU/Linux distribution, such as Debian, is better suited for the sampling of projects because it avoids biases and contains unique information only available in an integrated environment. In particular, research on the reuse of components can build on dependency information inherent in the Debian GNU/Linux packaging system. This paper therefore contributes to the practice of sampling methods in OSS research and provides empirical data on reuse dependencies in Debian.
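The reuse-dependency information the paper refers to can be read directly from Debian control metadata. The `Depends:` field format is real Debian policy, but the function name and the simplified handling below (dropping version constraints, flattening `|` alternatives) are my own assumptions:

```python
import re

def parse_depends(control_text):
    """Extract package names from a Debian control 'Depends:' field,
    stripping version constraints like '(>= 2.14)' and splitting
    comma-separated clauses and '|' alternatives."""
    m = re.search(r"^Depends:\s*(.+)$", control_text, re.MULTILINE)
    if not m:
        return []
    deps = []
    for clause in m.group(1).split(","):
        for alternative in clause.split("|"):
            name = alternative.split("(")[0].strip()
            if name:
                deps.append(name)
    return deps
```

Applied across all package stanzas of a distribution, such parsing yields the reuse graph on which the paper's empirical analysis rests.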

Relevance: 70.00%

Abstract:

Since the beginning of 3D computer vision, techniques to reduce the data to a treatable size while preserving the important aspects of the scene have been necessary. Currently, with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second, this is gaining relevance. Many applications make use of these sensors and need a preprocessing step to downsample the data in order to either reduce the processing time or improve the data (e.g., reducing noise or enhancing the important features). In this paper, we present a comparison of downsampling techniques based on different principles. Concretely, five downsampling methods are included: a bilinear-based method, a normal-based method, a color-based method, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of the downsampling in a real application, a 3D non-rigid registration is performed with the sampled data. From the experimentation we can conclude that, depending on the purpose of the application, some kernels of the sampling methods can drastically improve the results. Bilinear- and GNG-based methods provide homogeneous point clouds, while color-based and normal-based methods provide datasets with a higher density of points in areas with specific features. In the non-rigid application, if a color-based sampled point cloud is used, it is possible to properly register two datasets in cases where intensity data are relevant in the model, outperforming the results obtained when only homogeneous sampling is used.
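The homogeneous behaviour attributed to the bilinear- and GNG-based methods can be approximated by a simple voxel-grid filter. This sketch is illustrative only and is not one of the paper's five methods:

```python
def voxel_downsample(points, cell):
    """Homogeneous point-cloud downsampling: partition space into cubic
    voxels of side 'cell' and keep the first point seen in each voxel,
    so output density is roughly uniform regardless of input density."""
    kept = {}
    for p in points:
        key = tuple(int(c // cell) for c in p)
        kept.setdefault(key, p)
    return list(kept.values())
```

A feature-preserving method would instead weight the selection by local normal variation or color gradient, which is what produces the denser sampling around edges and textured regions reported in the paper.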

Relevance: 70.00%

Abstract:

The use of quantitative methods has become increasingly important in the study of neurodegenerative disease. Disorders such as Alzheimer's disease (AD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This article reviews the advantages and limitations of the different methods of quantifying the abundance of pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semiquantitative scores. The major sampling methods by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are also described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analyses of variance (ANOVA) and principal components analysis (PCA), are discussed. These methods are illustrated with reference to particular problems in the pathological diagnosis of AD and dementia with Lewy bodies (DLB).

Relevance: 70.00%

Abstract:

The last decade has seen a considerable increase in the application of quantitative methods in the study of histological sections of brain tissue and especially in the study of neurodegenerative disease. These disorders are characterised by the deposition and aggregation of abnormal or misfolded proteins in the form of extracellular protein deposits such as senile plaques (SP) and intracellular inclusions such as neurofibrillary tangles (NFT). Quantification of brain lesions and studying the relationships between lesions and normal anatomical features of the brain, including neurons, glial cells, and blood vessels, has become an important method of elucidating disease pathogenesis. This review describes methods for quantifying the abundance of a histological feature such as density, frequency, and 'load' and the sampling methods by which quantitative measures can be obtained including plot/quadrat sampling, transect sampling, and the point-quarter method. In addition, methods for determining the spatial pattern of a histological feature, i.e., whether the feature is distributed at random, regularly, or is aggregated into clusters, are described. These methods include the use of the Poisson and binomial distributions, pattern analysis by regression, Fourier analysis, and methods based on mapped point patterns. Finally, the statistical methods available for studying the degree of spatial correlation between pathological lesions and neurons, glial cells, and blood vessels are described.
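The Poisson-based test of whether a histological feature is random, regular, or clustered is commonly run as a variance-to-mean (index of dispersion) test on quadrat counts. A minimal sketch, with hypothetical counts rather than neuropathology data:

```python
def dispersion_index(counts):
    """Variance-to-mean ratio of quadrat counts: approximately 1 for a
    random (Poisson) pattern, > 1 for a clustered pattern, < 1 for a
    regular pattern. (n - 1) * index is roughly chi-square with n - 1
    degrees of freedom under complete spatial randomness."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean
```

Quadrat size matters: the same lesion pattern can look clustered at one scale and random at another, which is why the review discusses several sampling geometries.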

Relevance: 70.00%

Abstract:

Direct sampling methods are increasingly being used to solve the inverse medium scattering problem of estimating the shape of the scattering object. A simple direct method using one incident wave and multiple measurements was proposed by Ito, Jin and Zou. In this report, we performed analytic and numerical studies of the direct sampling method. The method was found to be effective in general; however, a few exceptions were exposed in the investigation. Analytic solutions in different situations were studied to verify the viability of the method, while numerical tests were used to validate its effectiveness.

Relevance: 60.00%

Abstract:

The value of soil evidence in the forensic discipline is well known. However, it would be advantageous if an in-situ method was available that could record responses from tyre or shoe impressions in ground soil at the crime scene. The development of optical fibres and emerging portable NIR instruments has unveiled a potential methodology which could permit such a proposal. The NIR spectral region contains rich chemical information in the form of overtone and combination bands of the fundamental infrared absorptions and low-energy electronic transitions. This region has, in the past, been perceived as too complex for interpretation and was consequently scarcely utilized. The application of NIR in the forensic discipline is virtually non-existent, leaving this area open for research. NIR spectroscopy has great potential in the forensic discipline as it is simple, nondestructive and capable of rapidly providing information relating to chemical composition. The objective of this study is to investigate the ability of NIR spectroscopy combined with chemometrics to discriminate between individual soils. A further objective is to apply the NIR process to a simulated forensic scenario where soil transfer occurs. NIR spectra were recorded from twenty-seven soils sampled from the Logan region in South-East Queensland, Australia. A series of three high-quartz soils were mixed with three different kaolinites in varying ratios and NIR spectra collected. Spectra were also collected from six soils as the temperature of the soils was ramped from room temperature up to 600°C. Finally, a forensic scenario was simulated in which the transfer of ground soil to shoe soles was investigated. Chemometrics methods such as the commonly known principal component analysis (PCA), the less well known fuzzy clustering (FC) and ranking by means of multicriteria decision making (MCDM) methodology were employed to interpret the spectral results.
All soils were characterised using Inductively Coupled Plasma Optical Emission Spectroscopy and X-Ray Diffractometry. Results were promising, revealing that NIR combined with chemometrics is capable of discriminating between the various soils. Peak assignments were established by comparing the spectra of known minerals with the spectra collected from the soil samples. The temperature-dependent NIR analysis confirmed the assignments of the absorptions due to adsorbed and molecularly bound water. The relative intensities of the identified NIR absorptions reflected the quantitative XRD and ICP characterisation results. PCA and FC analysis of the raw soils in the initial NIR investigation revealed that the soils were primarily distinguished on the basis of their relative quartz and kaolinite contents, and to a lesser extent on the horizon from which they originated. Furthermore, PCA could distinguish between the three kaolinites used in the study, suggesting that the NIR spectral region was sensitive enough to contain information describing variation within kaolinite itself. In the forensic scenario simulation, PCA successfully discriminated between the 'Backyard Soil' and 'Melcann® Sand', as well as between the two sampling methods employed. Further PCA exploration revealed that it was possible to distinguish between the various shoes used in the simulation. In addition, it was possible to establish an association between specific sampling sites on the shoe and the corresponding sites remaining in the impression. The forensic application revealed some limitations of the process relating to moisture content and homogeneity of the soil. These limitations can both be overcome by simple sampling practices and by maintaining the original integrity of the soil. The results from the forensic scenario simulation showed that the concept holds great promise in the forensic discipline.
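The PCA used to discriminate the soils can be sketched as an SVD of the mean-centred spectra. This is generic PCA, not the thesis's chemometrics pipeline, and the synthetic two-column "spectra" in the usage test are purely illustrative:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA via SVD of the mean-centred data matrix X (rows = spectra,
    columns = wavelengths); returns the component scores that are
    plotted to look for clustering among samples."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

In practice NIR spectra are usually pretreated (e.g. derivative or scatter correction) before PCA; the scores then separate samples along directions of maximal spectral variance, here quartz vs. kaolinite content.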

Relevance: 60.00%

Abstract:

This dissertation is primarily an applied statistical modelling investigation, motivated by a case study comprising real data and real questions. Theoretical questions on modelling and computation of normalization constants arose from pursuit of these data analytic questions. The essence of the thesis can be described as follows. Consider binary data observed on a two-dimensional lattice. A common problem with such data is the ambiguity of recorded zeroes. These may represent a zero response given some threshold (presence) or that the threshold has not been triggered (absence). Suppose that the researcher wishes to estimate the effects of covariates on the binary responses, whilst taking into account underlying spatial variation, which is itself of some interest. This situation arises in many contexts, and the dingo, cypress and toad case studies described in the motivation chapter are examples of this. Two main approaches to modelling and inference are investigated in this thesis. The first is frequentist and based on generalized linear models, with spatial variation modelled by using a block structure or by smoothing the residuals spatially. The EM algorithm can be used to obtain point estimates, coupled with bootstrapping or asymptotic MLE estimates for standard errors. The second approach is Bayesian and based on a three- or four-tier hierarchical model, comprising a logistic regression with covariates for the data layer, a binary Markov random field (MRF) for the underlying spatial process, and suitable priors for parameters in these main models. The three-parameter autologistic model is a particular MRF of interest. Markov chain Monte Carlo (MCMC) methods comprising hybrid Metropolis/Gibbs samplers are suitable for computation in this situation. Model performance can be gauged by MCMC diagnostics. Model choice can be assessed by incorporating another tier in the modelling hierarchy.
This requires evaluation of a normalization constant, a notoriously difficult problem. Difficulty with estimating the normalization constant for the MRF can be overcome by using a path integral approach, although this is a highly computationally intensive method. Different methods of estimating ratios of normalization constants (NCs) are investigated, including importance sampling Monte Carlo (ISMC), dependent Monte Carlo based on MCMC simulations (MCMC), and reverse logistic regression (RLR). I develop an idea present, though not fully developed, in the literature, and propose the integrated mean canonical statistic (IMCS) method for estimating log NC ratios for binary MRFs. The IMCS method falls within the framework of the newly identified path sampling methods of Gelman & Meng (1998) and outperforms ISMC, MCMC and RLR. It also does not rely on simplifying assumptions, such as ignoring spatio-temporal dependence in the process. A thorough investigation is made of the application of IMCS to the three-parameter autologistic model. This work introduces background computations required for the full implementation of the four-tier model in Chapter 7. Two different extensions of the three-tier model to a four-tier version are investigated. The first extension incorporates temporal dependence in the underlying spatio-temporal process. The second extension allows the successes and failures in the data layer to depend on time. The MCMC computational method is extended to incorporate the extra layer. A major contribution of the thesis is the development of a fully Bayesian approach to inference for these hierarchical models for the first time. Note: The author of this thesis has agreed to make it open access but invites people downloading the thesis to send her an email via the 'Contact Author' function.
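The simplest of the ratio estimators mentioned, importance sampling Monte Carlo (ISMC), can be sketched as follows. The function names are illustrative assumptions; the estimator uses the identity Z1/Z0 = E_{p0}[q1(X)/q0(X)], where q0, q1 are unnormalized densities and p0 = q0/Z0:

```python
import math
import random

def log_nc_ratio_ismc(sample_p0, log_q1, log_q0, n, rng=random):
    """ISMC estimate of log(Z1/Z0): average the importance ratio
    q1(x)/q0(x) over n draws x from p0, then take the log."""
    total = 0.0
    for _ in range(n):
        x = sample_p0(rng)
        total += math.exp(log_q1(x) - log_q0(x))
    return math.log(total / n)
```

ISMC degrades badly when q0 and q1 have little overlap, which is precisely the situation path sampling methods such as IMCS are designed to handle by bridging between the two densities.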

Relevance: 60.00%

Abstract:

Tobacco yellow dwarf virus (TbYDV, family Geminiviridae, genus Mastrevirus) is an economically important pathogen causing summer death and yellow dwarf disease in bean (Phaseolus vulgaris L.) and tobacco (Nicotiana tabacum L.), respectively. Prior to the commencement of this project, little was known about the epidemiology of TbYDV, its vector and its host-plant range. As a result, disease control strategies have been restricted to regular, poorly timed insecticide applications which are largely ineffective, environmentally hazardous and expensive. To address this problem, this PhD project was carried out in order to better understand the epidemiology of TbYDV, to identify its host-plants and vectors, and to characterise the population dynamics and feeding physiology of the main insect vector and other possible vectors. The host-plants and possible leafhopper vectors of TbYDV were assessed over three consecutive growing seasons at seven field sites in the Ovens Valley, Northeastern Victoria, on commercial tobacco and bean growing properties. Leafhoppers and plants were collected and tested for the presence of TbYDV by PCR. Using sweep nets, twenty-three leafhopper species were identified at the seven sites, with Orosius orientalis the predominant species. Of the 23 leafhopper species screened for TbYDV, only Orosius orientalis and Anzygina zealandica tested positive. Forty-two different plant species were also identified at the seven sites and tested. Of these, TbYDV was only detected in four dicotyledonous species: Amaranthus retroflexus, Phaseolus vulgaris, Nicotiana tabacum and Raphanus raphanistrum. Using a quadrat survey, the temporal distribution and diversity of vegetation at four of the field sites was monitored in order to assess the presence of, and changes in, potential host-plants for the leafhopper vector(s) and the virus.
These surveys showed that plant composition and the climatic conditions at each site were the major influences on vector numbers, virus presence and the subsequent occurrence of tobacco yellow dwarf and bean summer death diseases. Forty-two plant species were identified from all sites, and it was found that sites with the lowest incidence of disease had the highest proportion of monocotyledonous plants, which are non-hosts for both the vector and the virus. In contrast, the sites with the highest disease incidence had more host-plant species for both vector and virus, and experienced higher temperatures and less rainfall. It is likely that these climatic conditions forced the leafhopper to move into the irrigated commercial tobacco and bean crops, resulting in disease. In an attempt to understand leafhopper species diversity and abundance in and around the field borders of commercially grown tobacco crops, leafhoppers were collected from four field sites using three different sampling techniques, namely pan trap, sticky trap and sweep net. Over 51,000 leafhopper samples were collected, comprising 57 species from 11 subfamilies and 19 tribes. Twenty-three leafhopper species were recorded for the first time in Victoria, in addition to several economically important pest species of crops other than tobacco and bean. The highest number and greatest diversity of leafhoppers were collected in yellow pan traps, followed by sticky traps and sweep nets. Orosius orientalis was found to be the most abundant leafhopper collected from all sites, with the greatest numbers of this leafhopper also caught using the yellow pan trap. Using the three sampling methods mentioned above, the seasonal distribution and population dynamics of O. orientalis were studied at four field sites over three successive growing seasons. The population dynamics of the leafhopper were characterised by trimodal peaks of activity, occurring in the spring and summer months.
Although O. orientalis was present in large numbers early in the growing season (September-October), TbYDV was only detected in these leafhoppers between late November and the end of January. The peak in the detection of TbYDV in O. orientalis correlated with the observation of disease symptoms in tobacco and bean, and was also associated with warmer temperatures and lower rainfall. To understand the feeding requirements of Orosius orientalis and to enable screening of potential control agents, a chemically defined artificial diet (designated PT-07) and feeding system was developed. This novel diet formulation allowed O. orientalis to survive for up to 46 days, including complete development from first instar through to adulthood. The effect of three selected plant-derived proteins, cowpea trypsin inhibitor (CpTi), Galanthus nivalis agglutinin (GNA) and wheat germ agglutinin (WGA), on leafhopper survival and development was assessed. Both GNA and WGA were shown to reduce leafhopper survival and development significantly when incorporated at a 0.1% (w/v) concentration. In contrast, CpTi at the same concentration did not exhibit significant antimetabolic properties. Based on these results, GNA and WGA are potentially useful antimetabolic agents for expression in genetically modified crops to improve the management of O. orientalis, TbYDV and the other pathogens it vectors. Finally, an electrical penetration graph (EPG) was used to study the feeding behaviour of O. orientalis to provide insights into TbYDV acquisition and transmission. Waveforms representing different feeding activities were acquired by EPG from adult O. orientalis feeding on two plant species, Phaseolus vulgaris and Nicotiana tabacum, and a simple sucrose-based artificial diet. Five waveforms (designated O1-O5) were observed when O. orientalis fed on P. vulgaris, while only four (O1-O4) and three (O1-O3) waveforms were observed during feeding on N. tabacum and the artificial diet, respectively.
The mean duration of each waveform and the waveform type differed markedly depending on the food source. This is the first detailed study on the tritrophic interactions between TbYDV, its leafhopper vector, O. orientalis, and host-plants. The results of this research have provided important fundamental information which can be used to develop more effective control strategies not only for O. orientalis, but also for TbYDV and other pathogens vectored by the leafhopper.