956 results for Scanline sampling technique
Abstract:
In this study, cross-sectional data were used to analyze the effect of farmers' demographic, socioeconomic and institutional setting, market access and physical attributes on the probability and intensity of tissue culture banana (TCB) adoption. The study was carried out between July 2011 and November 2011. Both descriptive statistics (mean, variance, proportions) and regression analysis were used. A double hurdle regression model was fitted to the data. Using a multistage sampling technique, four counties and eight sub-locations were randomly selected. Using a random sampling technique, three hundred and thirty farmers were selected from a list of banana households in the selected sub-locations. The adoption level of tissue culture banana (TCB) was about 32%. The results also revealed that the likelihood of TCB adoption was significantly influenced by: availability of TCB planting material, proportion of banana income to the total farm income, per capita household expenditure and the location of the farmer in Kisii County; while the factors that significantly influenced the intensity of TCB adoption were: occupation of farmers, family size, labour source, farm size, soil fertility, availability/access of TCB plantlets to farmers, distance to the banana market, use of manure in planting banana, access to agricultural extension services and an index of TCB/non-TCB banana cultivar attributes scored by farmers. Compared to West Pokot County, farmers located in Bungoma County were significantly more likely to adopt TCB technology. The results of the study therefore suggest that the probability and intensity of TCB adoption can be enhanced by taking cognizance of these variables in order to meet the priority needs of the smallholder farmers who were the target group. This would help alleviate the banana shortage in the region for enhanced food security. Accordingly, actors along the banana value chain are encouraged to target intervention strategies based on the identified farmer, farm and institutional characteristics for enhanced impact on food provision. Opening up more TCB multiplication centres in different regions will give farmers access to the TCB technology for enhanced impact on the target population.
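The double hurdle estimation can be illustrated in code. Below is a minimal sketch of Cragg's independent double-hurdle model on synthetic data: a probit for the adoption decision and a truncated-normal regression for intensity among adopters, estimated separately under the independence assumption. All covariates and coefficients are hypothetical, not the study's variables.

```python
import numpy as np
import statsmodels.api as sm
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 330                                          # sample size used in the study
X = sm.add_constant(rng.normal(size=(n, 2)))     # hypothetical covariates

# Simulate the adoption decision (hurdle 1) and intensity among adopters (hurdle 2).
adopt = (X @ np.array([-0.5, 0.8, 0.4]) + rng.normal(size=n)) > 0
intensity = np.maximum(X @ np.array([0.2, 1.0, 0.5]) + rng.normal(size=n), 1e-3)
y = np.where(adopt, intensity, 0.0)

# Hurdle 1: probit on the adoption decision, all households.
probit = sm.Probit(adopt.astype(float), X).fit(disp=0)

# Hurdle 2: truncated-normal regression on positive intensities only.
Xp, yp = X[y > 0], y[y > 0]

def negll(params):
    beta, sigma = params[:-1], np.exp(params[-1])      # log-sigma keeps sigma > 0
    mu = Xp @ beta
    # Log-density of y truncated at zero: log phi((y-mu)/s) - log s - log Phi(mu/s)
    return -np.sum(norm.logpdf((yp - mu) / sigma) - np.log(sigma)
                   - norm.logcdf(mu / sigma))

res = minimize(negll, x0=np.r_[np.zeros(X.shape[1]), 0.0], method="BFGS")
print("hurdle 1:", probit.params, "\nhurdle 2:", res.x[:-1])
```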
Abstract:
Gunshot residue (GSR) is the term used to describe the particles originating from different parts of the firearm and ammunition during discharge. A fast and practical field tool to detect the presence of GSR can assist law enforcement in the accurate identification of subjects. A novel field sampling device is presented for the first time for the fast detection and quantitation of volatile organic compounds (VOCs). The capillary microextraction of volatiles (CMV) is a headspace sampling technique that provides fast results (< 2 min sampling time) and is reported as a versatile and high-efficiency sampling tool. The CMV device can be coupled to a gas chromatography-mass spectrometry (GC-MS) instrument by installing a thermal separation probe in the injection port of the GC. An analytical method using the CMV device was developed for the detection of 17 compounds commonly found in polluted environments. The acceptability of the CMV as a field sampling method for the detection of VOCs is demonstrated by following the criteria established by the Environmental Protection Agency (EPA) compendium method TO-17. The CMV device was used, for the first time, for the detection of VOCs on swabs from the hands of shooters and non-shooters, and from spent cartridges from different types of ammunition (i.e., pistol, rifle, and shotgun). The proposed method consists of the headspace extraction of VOCs from the smokeless powders present in the propellant of ammunition. The sensitivity of this method was demonstrated with method detection limits (MDLs) of 4-26 ng for diphenylamine (DPA), nitroglycerine (NG), 2,4-dinitrotoluene (2,4-DNT), and ethyl centralite (EC). In addition, a fast method was developed for the detection of the inorganic components (i.e., Ba, Pb, and Sb) characteristic of GSR presence by laser-induced breakdown spectroscopy (LIBS). Advantages of LIBS include fast analysis (~ 12 seconds per sample) and good sensitivity, with expected MDLs in the range of 0.1-20 ng for the target elements. Statistical analysis of the results from both techniques was performed to determine any correlation between the variables analyzed. This work demonstrates that the information collected from the analysis of organic components has the potential to improve the detection of GSR.
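As a small illustration of how MDLs such as those above are typically derived, the sketch below computes an EPA-style method detection limit from low-level replicate measurements (one-tailed Student's t at 99% confidence times the replicate standard deviation). The replicate values are invented for illustration, not data from this work.

```python
import numpy as np
from scipy.stats import t

# Seven low-level replicate spikes of one analyte (ng); values are invented.
replicates = np.array([5.1, 4.8, 5.5, 4.9, 5.3, 5.0, 5.2])
n = replicates.size
# MDL = one-tailed Student's t at 99% confidence times the replicate std. dev.
mdl = t.ppf(0.99, df=n - 1) * replicates.std(ddof=1)
print(f"MDL ≈ {mdl:.2f} ng")
```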
Abstract:
Quantitative use of satellite-derived rainfall products for various scientific applications often requires them to be accompanied by an error estimate. Rainfall estimates inferred from low earth orbiting satellites like the Tropical Rainfall Measuring Mission (TRMM) are subject to sampling errors of non-negligible proportions owing to the narrow swath of satellite sensors coupled with a lack of continuous coverage due to infrequent satellite visits. The authors investigate the sampling uncertainty of seasonal rainfall estimates from the active sensor of TRMM, namely the Precipitation Radar (PR), based on 11 years of the PR 2A25 data product over the Indian subcontinent. In this paper, a statistical bootstrap technique is investigated to estimate the relative sampling errors using the PR data themselves. Results verify power-law scaling characteristics of relative sampling errors with respect to the space-time scale of measurement. Sampling uncertainty estimates for mean seasonal rainfall were found to exhibit seasonal variations. To give a practical example of the implications of the bootstrap technique, PR relative sampling errors over the subtropical Mahanadi river basin, India, are examined. Results reveal that the bootstrap technique yields relative sampling errors of < 33% (for the 2° grid), < 36% (for the 1° grid), < 45% (for the 0.5° grid), and < 57% (for the 0.25° grid). With respect to rainfall type, overall sampling uncertainty was found to be dominated by sampling uncertainty due to stratiform rainfall over the basin. The study compares the resulting error estimates to those obtained from Latin hypercube sampling. Based on this study, the authors conclude that the bootstrap approach can be successfully used for ascertaining the relative sampling errors incurred by TRMM-like satellites over gauged or ungauged basins lacking in situ validation data. This technique has wider implications for decision making before incorporating microwave orbital data products in basin-scale hydrologic modeling.
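The core of the bootstrap idea is simple to sketch. The snippet below resamples a set of synthetic overpass-level rain estimates with replacement and reports the spread of the resampled seasonal means relative to the overall mean as a relative sampling error; the gamma-distributed values merely stand in for PR 2A25 grid-box overpasses.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic overpass-level rain estimates for one grid box (mm/day).
overpasses = rng.gamma(shape=0.5, scale=8.0, size=200)

B = 2000                                     # number of bootstrap resamples
boot_means = np.array([
    rng.choice(overpasses, size=overpasses.size, replace=True).mean()
    for _ in range(B)
])
# Relative sampling error: spread of resampled means over the overall mean.
rel_err = boot_means.std(ddof=1) / overpasses.mean()
print(f"relative sampling error ≈ {100 * rel_err:.1f}%")
```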
Abstract:
Human scent and human remains detection canines are used to locate living or deceased humans under many circumstances. Human scent canines locate individual humans on the basis of their unique scent profile, while human remains detection canines locate the general scent of decomposing human remains. Scent evidence is often collected by law enforcement agencies using a Scent Transfer Unit (STU-100), a dynamic headspace concentration device. The goals of this research were to evaluate the STU-100 for the collection of human scent samples, to apply this method to the collection of living and deceased human samples, and to create canine training aids. The airflow rate and collection material used with the STU-100 were evaluated using a novel scent delivery method. Controlled Odor Mimic Permeation Systems were created containing representative standard compounds delivered at known rates, improving the reproducibility of optimization experiments. Flow rates and collection materials were compared. Higher airflow rates usually yielded significantly lower totals of volatile compounds due to compound breakthrough through the collection material. Collection from polymer and cellulose-based materials demonstrated that the molecular backbone of the material is a factor in the trapping and release of compounds. The weave of the material also affects compound collection, as materials with a tighter weave demonstrated enhanced collection efficiencies. Using the optimized method, volatiles were efficiently collected from living and deceased humans. Replicates of the living human samples showed good reproducibility; however, the odor profiles of individuals were not always distinguishable from one another. Analysis of the human remains samples revealed similarity in the type and ratio of compounds. Two types of prototype training aids were developed utilizing combinations of pure compounds as well as volatiles from actual human samples concentrated onto sorbents, which were subsequently used in field tests. The pseudo-scent aids had moderate success in field tests, and the odor pad aids had significant success. This research demonstrates that the STU-100 is a valuable tool for dog handlers and as a field instrument; however, modifications are warranted in order to improve its performance as a method for instrumental detection.
Abstract:
The deposition of biological material (biofouling) onto polymeric contact lenses is thought to be a major contributor to lens discomfort and hence discontinuation of wear. We describe a method to characterize lipid deposits directly from worn contact lenses utilizing liquid extraction surface analysis coupled to tandem mass spectrometry (LESA-MS/MS). This technique effected facile and reproducible extraction of lipids from the contact lens surfaces and identified lipid molecular species representing all major classes present in human tear film. Our data show that LESA-MS/MS is a rapid and comprehensive technique for the characterization of lipid-related biofouling on polymer surfaces.
Abstract:
Mammographic density (MD) adjusted for age and body mass index (BMI) is a strong heritable breast cancer risk factor; however, its biological basis remains elusive. Previous studies assessed MD-associated histology using random sampling approaches, despite evidence that high and low MD areas exist within a breast and are negatively correlated with one another. We have used an image-guided approach to sample high and low MD tissues from within individual breasts to examine the relationship between histology and degree of MD. Image-guided sampling was performed using two different methodologies on mastectomy tissues (n = 12): (1) sampling of high and low MD regions within a slice, guided by bright (high MD) and dark (low MD) areas on a slice X-ray film; (2) sampling of high and low MD regions within a whole breast using a stereotactically guided vacuum-assisted core biopsy technique. Pairwise analysis accounting for potential confounders (e.g., age, BMI, menopausal status) provides appropriate power for analysis despite the small sample size. High MD tissues had higher stromal (P = 0.002) and lower fat (P = 0.002) compositions, but no evidence of a difference in glandular areas (P = 0.084), compared to low MD tissues from the same breast. High MD regions had higher relative gland counts (P = 0.023), and a preponderance of Type I lobules in high MD compared to low MD regions was observed in 58% of subjects (n = 7), but this did not achieve significance. These findings clarify the histologic nature of high MD tissue and support hypotheses regarding the biophysical impact of dense connective tissue on mammary malignancy. They also provide important terms of reference for ongoing analyses of the underlying genetics of MD.
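For illustration, the within-breast pairwise comparison can be sketched as a matched-pairs test; the abstract does not specify the exact test used, and the stromal percentages below are invented for n = 12 subjects.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
# Matched high-MD and low-MD stromal percentages from the same breast (invented).
low_md_stroma = rng.uniform(20, 50, size=12)
high_md_stroma = low_md_stroma + rng.uniform(5, 25, size=12)

# Pairing within subjects removes between-subject confounders from the contrast.
stat, p = wilcoxon(high_md_stroma, low_md_stroma)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.4f}")
```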
Abstract:
In this study, a non-linear excitation controller using inverse filtering is proposed to damp inter-area oscillations. The proposed controller is based on determining the generator flux value for the next sampling time, obtained by maximising the reduction rate of the kinetic energy of the system after the fault. The desired flux for the next time interval is obtained using wide-area measurements, and the equivalent area rotor angles and velocities are predicted using a non-linear Kalman filter. A supplementary control input for the excitation system is implemented, using an inverse filtering approach, to track the desired flux. The inverse filtering approach ensures that the non-linearity introduced by saturation is well compensated. The efficacy of the proposed controller, with and without communication time delay, is evaluated on different IEEE benchmark systems, including Kundur's two-area, the Western System Coordinating Council three-area, and the 16-machine, 68-bus test systems.
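As a hedged illustration of the prediction step, the sketch below runs a plain linear Kalman filter on a two-state (rotor angle, speed) swing model; the paper uses a non-linear filter, and all matrices and measurements here are invented.

```python
import numpy as np

dt = 0.02                                    # sampling interval (s), hypothetical
F = np.array([[1.0, dt], [0.0, 1.0]])        # angle/speed transition (swing model)
H = np.array([[1.0, 0.0]])                   # the measurement is the angle only
Q, R = np.eye(2) * 1e-5, np.array([[1e-3]])  # process / measurement noise

x, P = np.zeros((2, 1)), np.eye(2)
for z in [0.10, 0.12, 0.15, 0.17]:           # invented angle measurements (rad)
    x, P = F @ x, F @ P @ F.T + Q            # predict one sampling time ahead
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x, P = x + K @ (np.array([[z]]) - H @ x), (np.eye(2) - K @ H) @ P
print(x.ravel())                             # predicted rotor angle and speed
```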
Abstract:
We present a technique for an all-digital on-chip delay measurement system to measure the skews in a clock distribution network. It uses the principle of sub-sampling. Measurements from a prototype fabricated in a 65 nm industrial process indicate the ability to measure delays with a resolution of 0.5 ps and a DNL of 1.2 ps.
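A toy simulation conveys the sub-sampling principle: sampling two copies of a clock, one skewed, with a sampling clock of slightly incommensurate period spreads the sampling instants uniformly over the clock phase, so the fraction of disagreeing samples encodes the skew. All parameters below are illustrative, not the paper's circuit.

```python
import numpy as np

T = 1.0                        # clock period (normalized)
delay = 0.013                  # skew to be measured
Ts = T * (1 + 1 / 997)         # sub-sampling period, slightly incommensurate
t = np.arange(100_000) * Ts    # sampling instants sweep the clock phase

a = (t % T) < T / 2            # samples of the reference clock
b = ((t - delay) % T) < T / 2  # samples of the skewed clock
# The two square waves disagree for a fraction 2*delay/T of each period.
print(T * np.mean(a != b) / 2)  # ≈ 0.013
```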
Abstract:
Traditional comparisons between the capture efficiencies of sampling devices have generally looked at the absolute differences between devices. We recommend that the signal-to-noise ratio be used instead when comparing the capture efficiency of benthic sampling devices. Using the signal-to-noise ratio rather than the absolute difference has several advantages: the variance is taken into account when determining how important the difference is, the hypothesis and minimum detectable difference can be made identical for all taxa, it is independent of the units used for measurement, and the sample-size calculation is independent of the variance. This new technique is illustrated by comparing the capture efficiency of a 0.05 m² van Veen grab and an airlift suction device, using samples taken from Heron and One Tree lagoons, Australia.
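A minimal sketch of the recommended statistic, assuming the pooled-standard-deviation form of the signal-to-noise ratio and invented counts for one taxon:

```python
import numpy as np

# Counts of one taxon captured by each device across replicate samples (invented).
grab = np.array([12.0, 15.0, 9.0, 14.0, 11.0, 13.0])
airlift = np.array([18.0, 22.0, 16.0, 20.0, 19.0, 24.0])

pooled_sd = np.sqrt(((grab.size - 1) * grab.var(ddof=1)
                     + (airlift.size - 1) * airlift.var(ddof=1))
                    / (grab.size + airlift.size - 2))
snr = (airlift.mean() - grab.mean()) / pooled_sd
print(f"signal-to-noise ratio = {snr:.2f}")
```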
Abstract:
This paper demonstrates the application of an inverse filtering technique for power systems. In order to implement this method, the control objective should be based on a system variable that needs to be set to a specific value at each sampling time. A control input is calculated to generate the desired output of the plant, and the relationship between the two is used to design an auto-regressive model. The auto-regressive model is converted to a moving average model to calculate the control input based on the future values of the desired output. Therefore, the future values required to construct the output are predicted in order to generate the appropriate control input for the next sampling time.
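One standard way to realize the AR-to-MA conversion described above is to take the (truncated) impulse response of the AR transfer function; the sketch below does this with an invented AR polynomial and is offered as an illustration, not the paper's exact procedure.

```python
import numpy as np
from scipy.signal import lfilter

a = np.array([1.0, -1.2, 0.5])            # hypothetical AR polynomial A(z)
impulse = np.zeros(20)
impulse[0] = 1.0
# The impulse response of 1/A(z) gives a truncated MA representation, whose
# taps weight the (predicted) future values of the desired output.
ma_coeffs = lfilter([1.0], a, impulse)
print(ma_coeffs[:6])
```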
Abstract:
A new method for decomposition of composite signals is presented. It is shown that the high-frequency portion of the composite signal spectrum possesses information on the echo structure. The proposed technique does not assume the shape of the basic wavelet and does not place any restrictions on the amplitudes and arrival times of echoes in the composite signal. In the absence of noise any desired resolution can be obtained. The effects of sampling rate and frequency window function on echo resolution are discussed. A voiced speech segment is considered as an example of a composite signal to demonstrate the application of the decomposition technique.
Abstract:
We apply the theta modulation technique to simultaneously and independently form multiple images of more than one object with a Fourier-plane sampling type of multiple imaging system. Experimental results of multiple imaging of two objects are presented.
Abstract:
Sampling-based planners have been successful in path planning for robots with many degrees of freedom, but they remain ineffective when the configuration space contains a narrow passage. We present a new technique based on a random walk strategy to generate samples in narrow regions quickly, thus improving the efficiency of probabilistic roadmap planners. The algorithm substantially reduces the number of collision checks and thereby decreases computational time. The method is effective even for cases where the structure of the narrow passage is not known, thus giving a significant improvement over other known methods.
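A minimal sketch of such a random-walk sampler, assuming a toy two-dimensional configuration space with a thin slit as the narrow passage and a placeholder collision checker:

```python
import numpy as np

rng = np.random.default_rng(7)

def in_collision(q):
    # Placeholder checker: free space is a thin slit |y| < 0.05 plus the
    # open regions |x| > 0.9 at either end of the passage.
    return not (abs(q[1]) < 0.05 or abs(q[0]) > 0.9)

def random_walk_sample(q_start, step=0.02, max_steps=1000):
    """Walk randomly from a colliding start until a free sample is found."""
    q = np.array(q_start, dtype=float)
    for _ in range(max_steps):
        q_new = q + rng.normal(scale=step, size=q.shape)
        if not in_collision(q_new):
            return q_new        # free configuration: add it to the roadmap
        q = q_new               # keep walking inside the C-obstacle
    return None

print(random_walk_sample([0.0, 0.1]))   # lands in or near the slit
```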
Abstract:
We consider the speech production mechanism and the associated linear source-filter model. For voiced speech sounds in particular, the source/glottal excitation is modeled as a stream of impulses and the filter as a cascade of second-order resonators. We show that the process of sampling speech signals can be modeled as filtering a stream of Dirac impulses (a model for the excitation) with a kernel function (the vocal tract response), and then sampling uniformly. We show that the problem of estimating the excitation is equivalent to the problem of recovering a stream of Dirac impulses from samples of a filtered version of it. We present associated algorithms based on the annihilating filter and compare them with the classical linear prediction technique, which is well known in speech analysis. Results on synthesized as well as natural speech data are presented.
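The annihilating-filter step can be sketched compactly. Assuming noiseless exponential-form values s[n] = Σ_k c_k u_k^n with u_k encoding the impulse locations (the form such sampling models reduce to), the filter is the null vector of a Toeplitz matrix and its roots recover the locations; the amplitudes and locations below are invented.

```python
import numpy as np
from scipy.linalg import toeplitz

K = 2                                     # number of Dirac impulses
t_true = np.array([0.2, 0.55])            # impulse locations in [0, 1), invented
c_true = np.array([1.0, 0.7])             # impulse amplitudes, invented
n = np.arange(2 * K + 1)
u = np.exp(-2j * np.pi * t_true)
s = (c_true * u ** n[:, None]).sum(axis=1)   # noiseless values s[0..2K]

# The annihilating filter h satisfies (h * s)[n] = 0: take the null vector of
# a Toeplitz matrix built from s, then read the locations off its roots.
T = toeplitz(s[K:], s[K::-1])
_, _, Vh = np.linalg.svd(T)
h = Vh[-1].conj()

t_est = np.sort(np.mod(-np.angle(np.roots(h)) / (2 * np.pi), 1.0))
print(t_est)                              # ≈ [0.2, 0.55]
```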
Abstract:
Multivariate neural data provide the basis for assessing interactions in brain networks. Among myriad connectivity measures, Granger causality (GC) has proven to be statistically intuitive and easy to implement, and it generates meaningful results. Although its application to functional MRI (fMRI) data is increasing, several factors have been identified that appear to hinder its neural interpretability: (a) latency differences in the hemodynamic response function (HRF) across different brain regions, (b) low sampling rates, and (c) noise. Recognizing that in basic and clinical neuroscience it is often the change of a dependent variable (e.g., GC) between experimental conditions, and between normal and pathological states, that is of interest, we address the question of whether there exist systematic relationships between GC at the fMRI level and that at the neural level. Simulated neural signals were convolved with a canonical HRF, down-sampled, and corrupted with noise to generate simulated fMRI data. As the coupling parameters in the model were varied, fMRI GC and neural GC were calculated and their relationship examined. Three main results were found: (1) GC following HRF convolution is a monotonically increasing function of neural GC; (2) this monotonicity can be reliably detected as a positive correlation when realistic fMRI temporal resolution and noise levels are used; and (3) although the detectability of monotonicity declined in the presence of HRF latency differences, substantial recovery of detectability occurred after correcting for those differences. These results suggest that Granger causality is a viable technique for analyzing fMRI data when the questions are appropriately formulated.
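The simulation pipeline lends itself to a compact sketch: generate a coupled bivariate AR "neural" system, convolve with an HRF, down-sample, add noise, and compute pairwise GC as a log variance ratio of restricted vs. full autoregressions. The gamma-difference HRF and all parameters below are stand-ins, not the study's exact settings.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(3)
n = 20000                                  # "neural" samples at 10 ms resolution
x, y = np.zeros(n), np.zeros(n)
for k in range(1, n):                      # bivariate AR(1) with x -> y coupling
    x[k] = 0.8 * x[k - 1] + rng.normal()
    y[k] = 0.8 * y[k - 1] + 0.3 * x[k - 1] + rng.normal()

t_hrf = np.arange(0, 30, 0.01)             # gamma-difference HRF, a stand-in
hrf = gamma.pdf(t_hrf, 6) - 0.35 * gamma.pdf(t_hrf, 16)

def to_fmri(sig, tr_samples=200, snr=2.0):
    """Convolve with the HRF, down-sample to TR = 2 s, and add noise."""
    b = np.convolve(sig, hrf)[:sig.size][::tr_samples]
    return b + rng.normal(scale=b.std() / snr, size=b.size)

def gc(src, dst, p=2):
    """Granger causality src -> dst: ln(restricted var / full var)."""
    Z_full = np.column_stack([np.roll(dst, k) for k in range(1, p + 1)]
                             + [np.roll(src, k) for k in range(1, p + 1)])[p:]
    Z_restr = Z_full[:, :p]                # dst's own past only
    d = dst[p:]
    r_full = d - Z_full @ np.linalg.lstsq(Z_full, d, rcond=None)[0]
    r_restr = d - Z_restr @ np.linalg.lstsq(Z_restr, d, rcond=None)[0]
    return np.log(r_restr.var() / r_full.var())

bx, by = to_fmri(x), to_fmri(y)
print(f"GC x->y: {gc(bx, by):.3f}   GC y->x: {gc(by, bx):.3f}")
```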