9 results for Application techniques
in DigitalCommons@University of Nebraska - Lincoln
Abstract:
Registration is a necessarily sophisticated evaluation process applied to vertebrate pesticide products. Although conducted to minimize any potential impacts on public health, the environment, and food production, the all-encompassing process of registration can stifle innovation. Vertebrate pesticides are rarely used to control pest animals in food crops. In contrast to agrochemicals, relatively small amounts of vertebrate pesticides are used (<0.1%), usually in solid or paste baits, and generally by discrete application methods rather than by broad-scale spray applications. We present a hierarchy, or sliding scale, of typical data requirements relative to application techniques, to help clarify an evolving science-based approach which focuses on requiring data to address key scientific questions while allowing waivers where additional data have minor value. Such an approach will facilitate the development and delivery of increasingly humane, species-targeted, low-residue pesticides in the New World, along with the phasing out of less desirable chemicals that continue to be used due to a lack of alternatives.
Abstract:
Polymerase chain reaction techniques were developed and applied to identify DNA from >40 species of prey contained in fecal (scat) soft-part matrix collected at terrestrial sites used by Steller sea lions (Eumetopias jubatus) in British Columbia and the eastern Aleutian Islands, Alaska. Sixty percent more fish and cephalopod prey were identified by morphological analyses of hard parts compared with DNA analysis of soft parts (hard parts identified higher relative proportions of Ammodytes sp., Cottidae, and certain Gadidae). DNA identified 213 prey occurrences, of which 75 (35%) were undetected by hard parts (mainly Salmonidae, Pleuronectidae, Elasmobranchii, and Cephalopoda), and thereby increased species occurrences by 22% overall and species richness in 44% of cases (when comparing 110 scats that amplified prey DNA). Prey composition was identical within only 20% of scats. Overall, diet composition derived from both identification techniques combined did not differ significantly from hard-part identification alone, suggesting that past scat-based diet studies have not missed major dietary components. However, significant differences in relative diet contributions across scats (as identified using the two techniques separately) reflect passage-rate differences between hard and soft digesta material and highlight certain hypothesized limitations of conventional morphology-based methods (e.g., differences in resistance to digestion, hard-part regurgitation, partial and secondary prey consumption), as well as potential technical issues (e.g., resolution of primer efficiency and sensitivity, and scat subsampling protocols). DNA analysis of salmon occurrence (from scat soft-part matrix and 238 archived salmon hard parts) provided species-level taxonomic resolution that could not be obtained by morphological identification and showed that Steller sea lions were primarily consuming pink (Oncorhynchus gorbuscha) and chum (Oncorhynchus keta) salmon. Notably, DNA from Atlantic salmon (Salmo salar) that likely originated from a distant fish farm was also detected in two scats from one site in the eastern Aleutian Islands. Overall, molecular techniques are valuable for identifying prey in the fecal remains of marine predators. Combining DNA and hard-part identification will effectively alleviate certain predicted biases and will ultimately enhance measures of diet richness, fisheries interactions (especially salmon-related ones), and the ecological role of pinnipeds and other marine predators, to the benefit of marine wildlife conservationists and fisheries managers.
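The occurrence arithmetic above (for example, the 35% of DNA-identified occurrences missed by hard parts) reduces to per-scat set comparisons between the two identification methods. A minimal sketch, with invented scat data and taxa, of how DNA-only occurrences and per-scat richness gains can be tallied:

```python
# Illustrative sketch (hypothetical data): comparing per-scat prey identifications
# from hard-part morphology and soft-part DNA. Taxa and counts are invented.

# Each scat maps to the set of prey taxa identified by each technique.
hard_parts = {
    "scat_01": {"Ammodytes", "Gadidae"},
    "scat_02": {"Cottidae"},
    "scat_03": {"Gadidae", "Cephalopoda"},
}
dna = {
    "scat_01": {"Ammodytes", "Salmonidae"},
    "scat_02": {"Cottidae", "Pleuronectidae"},
    "scat_03": {"Gadidae"},
}

dna_occurrences = sum(len(taxa) for taxa in dna.values())
dna_only = sum(len(dna[s] - hard_parts[s]) for s in dna)      # prey missed by hard parts
combined = {s: hard_parts[s] | dna[s] for s in hard_parts}    # both techniques pooled
richer = sum(len(combined[s]) > len(hard_parts[s]) for s in combined)

print(f"DNA-only occurrences: {dna_only}/{dna_occurrences} "
      f"({100 * dna_only / dna_occurrences:.0f}%)")
print(f"Scats with increased species richness: {richer}/{len(combined)}")
```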
Abstract:
An epidemiological survey for the monitoring of bovine tuberculosis transmission was carried out in western Liguria, a region in northern Italy. Fifteen Mycobacterium bovis strains were isolated from 63 wild boar samples (62 from mandibular lymph nodes and 1 from a liver specimen). Sixteen mediastinal lymph nodes from 16 head of cattle were collected, and 15 Mycobacterium bovis strains were subsequently cultured. All M. bovis strains isolated from cattle and wild boars were genotyped by spoligotyping and by restriction fragment length polymorphism (RFLP) analysis with the IS6110 and IS1081 probes. All M. bovis strains showed the typical spoligotype characterized by the absence of spacers 39 to 43 in comparison with M. tuberculosis. A total of nine different clusters were identified by spoligotyping. The largest cluster included 9 strains isolated from wild boars and 11 strains isolated from cattle, thus confirming the possibility of transmission between the two animal species. Fingerprinting by RFLP analysis with the IS6110 probe showed an identical single-band pattern for 29 of the 30 strains analyzed, and only 1 strain presented a five-band pattern. The use of IS1081 as a second probe was useful for differentiating M. bovis from M. bovis BCG but not for differentiating among M. bovis strains, which presented the same undifferentiated genomic profile. In relation to the epidemiological investigation, we hypothesized that feeding in pastures contaminated by cattle discharges could represent the most probable route of transmission of M. bovis between the two animal species. In conclusion, our results confirmed the higher discriminatory power of spoligotyping relative to RFLP analysis for the differentiation of M. bovis genomic profiles. Our data showed the presence of a common M. bovis genotype in both cattle and wild boars, confirming the possible interspecies transmission of M. bovis.
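The spoligotype clustering step described above amounts to grouping isolates that share an identical spacer presence/absence pattern across the 43 spacer positions. A minimal sketch with invented strain identifiers and patterns:

```python
# Illustrative sketch (hypothetical data): grouping isolates into spoligotype
# clusters, i.e. sets of strains with an identical presence/absence pattern
# across the 43 spacer positions. Strain IDs and patterns are invented.
from collections import defaultdict

full = "1" * 38 + "0" * 5             # typical M. bovis pattern lacking spacers 39-43
variant = full[:8] + "0" + full[9:]   # one additional missing spacer

isolates = {
    "cattle_01": full,
    "cattle_02": full,
    "wildboar_01": full,
    "wildboar_02": variant,
}

clusters = defaultdict(list)
for strain, pattern in isolates.items():
    clusters[pattern].append(strain)

for i, (pattern, members) in enumerate(sorted(clusters.items()), 1):
    print(f"cluster {i} ({pattern.count('0')} absent spacers): {members}")
```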
Abstract:
The reaction of living anionic polymers with 2,2,5,5-tetramethyl-1-(3-bromopropyl)-1-aza-2,5-disilacyclopentane (1) was investigated using coupled thin layer chromatography and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. Structures of byproducts as well as the major product were determined. The anionic initiator having a protected primary amine functional group, 2,2,5,5-tetramethyl-1-(3-lithiopropyl)-1-aza-2,5-disilacyclopentane (2), was synthesized using all-glass high-vacuum techniques, which allow the long-term stability of this initiator to be maintained. The use of 2 in the preparation of well-defined aliphatic primary amine α-end-functionalized polystyrene and poly(methyl methacrylate) was investigated. Primary amino α-end-functionalized poly(methyl methacrylate) can be obtained near-quantitatively by reacting 2 with 1,1-diphenylethylene in tetrahydrofuran at room temperature prior to polymerizing methyl methacrylate at -78 °C. When 2 is used to initiate styrene polymerization at room temperature in benzene, an additive such as N,N,N',N'-tetramethylethylenediamine is necessary to activate the polymerization. However, although the resulting polymers have narrow molecular weight distributions and well-controlled molecular weights, our mass spectral data suggest that the yield of primary amine α-end-functionalized polystyrene from these syntheses is very low. The majority of the products are methyl α-end-functionalized polystyrene.
Abstract:
Where the creation, understanding, and assessment of software testing and regression testing techniques are concerned, controlled experimentation is an indispensable research methodology. Obtaining the infrastructure necessary to support such experimentation, however, is difficult and expensive. As a result, progress in experimentation with testing techniques has been slow, and empirical data on the costs and effectiveness of techniques remain relatively scarce. To help address this problem, we have been designing and constructing infrastructure to support controlled experimentation with testing and regression testing techniques. This paper reports on the challenges faced by researchers experimenting with testing techniques, including those that inform the design of our infrastructure. The paper then describes the infrastructure that we are creating in response to these challenges, and that we are now making available to other researchers, and discusses the impact that this infrastructure has had and can be expected to have.
Abstract:
Regression testing is an important part of software maintenance, but it can also be very expensive. To reduce this expense, software testers may prioritize their test cases so that those that are more important are run earlier in the regression testing process. Previous work has shown that prioritization can improve a test suite’s rate of fault detection, but the assessment of prioritization techniques has been limited to hand-seeded faults, primarily due to the belief that such faults are more realistic than automatically generated (mutation) faults. A recent empirical study, however, suggests that mutation faults can be representative of real faults. We have therefore designed and performed a controlled experiment to assess the ability of prioritization techniques to improve the rate of fault detection of test suites, measured relative to mutation faults. Our results show that prioritization can be effective relative to the faults considered, and they expose ways in which that effectiveness can vary with characteristics of faults and test suites. We also compare our results to those collected earlier with respect to the relationship between hand-seeded faults and mutation faults, and the implications this has for researchers performing empirical studies of prioritization.
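The rate of fault detection assessed in such experiments is commonly quantified with the APFD metric (Average Percentage of Faults Detected) in the prioritization literature; assuming that measure, a minimal sketch with an invented test-to-fault matrix:

```python
# Minimal sketch of a rate-of-fault-detection metric. APFD is the usual measure
# in this literature; its use here is an assumption, and the fault matrix below
# is invented. Assumes every fault is detected by at least one test in the suite.

def apfd(test_order, fault_matrix):
    """test_order: list of test IDs in prioritized execution order.
    fault_matrix: dict mapping test ID -> set of faults that test detects."""
    faults = set().union(*fault_matrix.values())
    n, m = len(test_order), len(faults)
    # Position (1-based) of the first test that exposes each fault.
    first_detect = {}
    for pos, test in enumerate(test_order, start=1):
        for fault in fault_matrix.get(test, ()):
            first_detect.setdefault(fault, pos)
    return 1 - sum(first_detect[f] for f in faults) / (n * m) + 1 / (2 * n)

fault_matrix = {"t1": {"f1"}, "t2": {"f2", "f3"}, "t3": set(), "t4": {"f1", "f3"}}
print(apfd(["t2", "t4", "t1", "t3"], fault_matrix))  # prioritized order: ~0.79
print(apfd(["t3", "t1", "t2", "t4"], fault_matrix))  # weaker ordering:    ~0.46
```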
Abstract:
Observability measures how well computer systems support accurately capturing, analyzing, and presenting (collectively, observing) internal information about the systems. Observability frameworks play important roles in program understanding, troubleshooting, performance diagnosis, and optimization. However, traditional solutions are either expensive or coarse-grained, consequently compromising their utility in accommodating today’s increasingly complex software systems. New solutions are emerging for VM-based languages due to the full control language VMs have over program execution. Existing solutions of this kind, nonetheless, still lack flexibility, have high overhead, or provide limited context information for developing powerful dynamic analyses. In this thesis, we present a VM-based infrastructure, called the marker tracing framework (MTF), to address the deficiencies in existing solutions and provide better observability for VM-based languages. MTF serves as a solid foundation for implementing fine-grained, low-overhead program instrumentation. Specifically, MTF allows analysis clients to: 1) define custom events with rich semantics; 2) specify precisely the program locations where the events should trigger; and 3) adaptively enable or disable the instrumentation at runtime. In addition, MTF-based analysis clients are more powerful by having access to all information available to the VM. To demonstrate the utility and effectiveness of MTF, we present two analysis clients: 1) dynamic typestate analysis with adaptive online program analysis (AOPA); and 2) selective probabilistic calling context analysis (SPCC). In addition, we evaluate the runtime performance of MTF and the typestate client with the DaCapo benchmarks. The results show that: 1) MTF has acceptable runtime overhead when tracing moderate numbers of marker events; 2) AOPA is highly effective in reducing the event frequency for the dynamic typestate analysis; and 3) language VMs can be exploited to offer greater observability.
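As a conceptual illustration only (this is not the MTF API; every class and method name below is invented), the following sketch shows the three client-facing capabilities described above, against a stub framework: custom events, precise trigger locations, and adaptive enable/disable at runtime.

```python
# Conceptual mockup only: NOT the MTF API. All names are invented to illustrate
# the capabilities described in the abstract.

class MarkerEvent:
    """A client-defined event carrying custom semantics (capability 1)."""
    def __init__(self, name, payload_fields):
        self.name = name
        self.payload_fields = payload_fields

class TracingFramework:
    """Minimal stub standing in for the VM-side framework."""
    def __init__(self):
        self.handlers = {}
        self.disabled = set()

    def register(self, event, location, callback):
        # Capability 2: the client names the precise program locations that trigger the event.
        self.handlers[event.name] = (location, callback)

    def disable(self, event_name, receiver):
        # Capability 3: instrumentation can be switched off adaptively at runtime.
        self.disabled.add((event_name, receiver))

    def fire(self, event_name, payload):
        # In a real VM this would be driven by the instrumented program.
        _, callback = self.handlers[event_name]
        if (event_name, payload.get("receiver_id")) not in self.disabled:
            callback(payload)

class TypestateClient:
    """Hypothetical analysis client, loosely modeled on the typestate example."""
    def __init__(self, framework):
        self.fw = framework
        self.fw.register(MarkerEvent("file_op", ["receiver_id", "method"]),
                         location="call:java.io.FileInputStream.*",
                         callback=self.on_file_op)

    def on_file_op(self, payload):
        print("observed", payload)
        if payload["method"] == "close":
            # Once this object's typestate can no longer change, stop tracing it.
            self.fw.disable("file_op", payload["receiver_id"])

fw = TracingFramework()
TypestateClient(fw)
fw.fire("file_op", {"receiver_id": 7, "method": "read"})
fw.fire("file_op", {"receiver_id": 7, "method": "close"})
fw.fire("file_op", {"receiver_id": 7, "method": "read"})  # ignored: adaptively disabled
```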
Abstract:
In 1979, the Game Division Administration of the Wyoming Game and Fish Department (WGFD) appointed John Demaree and Tim Fagan to develop a handbook that would address the ever-increasing problem of wildlife depredation. Field personnel were oftentimes at a loss on how to deal with or evaluate the assorted types of damage situations they were encountering. Because Wyoming requires landowners to be reimbursed for damage done by big and trophy game and game birds to their crops and livestock, an evaluation and techniques handbook was desperately needed. The initial handbook, completed in January 1981, was 74 pages, and both John and I considered it a masterpiece. It did not take long, however, for this handbook to become somewhat lacking in information and outdated. In 1990, our administration approached us again, asking this time for an update of our ten-year-old handbook. John and I went to work, and with the assistance of Evin Oneale of the Wyoming Cooperative Fish and Wildlife Research Unit, and Bill Hepworth and John Schneidmiller of the WGFD, have just completed the second edition. This edition is over 600 pages and is titled "The Handbook of Wildlife Depredation Techniques." Neither of us cares to be around when a third edition is needed. In this handbook we have attempted to cover any type of damage situation our personnel may encounter. Although the primary function of this manual is to inform department personnel about proper and uniform damage prevention and evaluation techniques, it also provides relevant and pertinent information concerning the many aspects of wildlife depredation. Information for this handbook has been compiled from techniques developed by our personnel, from personnel in other states and provinces, and from published data on wildlife depredation. There are nine chapters, a reprint section, and an appendix section in this handbook. We will briefly summarize the contents of each chapter.
Abstract:
Bird depredations in Virginia have been estimated by the Extension Service, State Department of Agriculture, and the Division of Wildlife Services to be approximately $5,000,000 annually. As part of a continuing program to reduce this damage, these agencies have tested certain experimental techniques using the avicide 3-chloro-p-toluidine, chosen for its relative selectivity, low secondary hazard, and slow action. The situations in which the avicide was tested were feedlots, decoy crops, roost reduction, and pigeon control.