2 results for Food and feed safety
in DigitalCommons@University of Nebraska - Lincoln
Abstract:
Disease transmission between wildlife and livestock is a worldwide issue, and society needs better methods to prevent interspecies transmission and reduce disease risks. Producers have used livestock protection dogs (LPDs) successfully for thousands of years to reduce predation. We theorized that LPDs raised with and bonded to cattle could also reduce the risk of bovine tuberculosis (Mycobacterium bovis; TB) transmission between white-tailed deer (Odocoileus virginianus) and cattle by minimizing contact between the 2 species and deer use of cattle feed. We evaluated 4 LPDs over 5 months, using 2 data collection methods (direct observation and motion-activated video), on deer farms that supported higher deer densities than wild populations. Dogs were highly effective in preventing deer from using concentrated cattle feed (hay bales), likely the greatest risk factor for TB transmission on farms. Dogs also prevented deer from approaching cattle in core areas of pastures (near hay bales) and were very effective throughout pastures. Our research supports the theory that LPDs, specifically trained to remain with cattle, may be a practical tool to minimize the potential for livestock to contract TB from infected deer in small-scale cattle operations. Where disease is present in deer, it may be possible to reduce the potential for disease transmission by employing LPDs.
Abstract:
Maize demand for food, livestock feed, and biofuel is expected to increase substantially. The Western U.S. Corn Belt accounts for 23% of U.S. maize production, and irrigated maize accounts for 43 and 58% of maize land area and total production, respectively, in this region. Benchmarks for the most sensitive parameters governing performance of maize systems in the region (yield potential [YP], water-limited yield potential [YP-W], the yield gap between actual yield and YP, and resource-use efficiency) are lacking. A simulation model was used to quantify YP under irrigated and rainfed conditions based on weather data, soil properties, and crop management at 18 locations. In a separate study, 5 years of soil water data measured in central Nebraska were used to analyze soil water recharge during the non-growing season, because soil water content at sowing is a critical component of the water supply available to summer crops. On-farm data, including yield, irrigation, and nitrogen (N) rate for 777 field-years, were used to quantify the size of yield gaps and evaluate resource-use efficiency. Simulated average YP and YP-W were 14.4 and 8.3 Mg ha-1, respectively. Geospatial variation in YP was associated with solar radiation and temperature during the post-anthesis phase, while variation in YP-W was linked to longitudinal variation in seasonal rainfall and evaporative demand. Analysis of soil water recharge indicates that 80% of the variation in soil water content at sowing can be explained by precipitation during the non-growing season and residual soil water at the end of the previous growing season. A linear relationship between YP-W and water supply (slope: 19.3 kg ha-1 mm-1; x-intercept: 100 mm) can be used as a benchmark to diagnose and improve farmers' water productivity (WP; kg grain per unit of water supply). Evaluation of data from farmers' fields provides proof of concept and helps identify management constraints to high levels of productivity and resource-use efficiency. On average, actual yields of irrigated maize systems were 11% below YP. WP and N-fertilizer use efficiency (NUE) were high (14 kg grain mm-1 of water supply and 71 kg grain kg-1 of N fertilizer) despite application of large amounts of irrigation water and N fertilizer. While there is limited scope for substantial increases in actual average yields, WP and NUE can be further increased by (1) switching from surface to pivot irrigation systems, (2) using conservation instead of conventional tillage in soybean-maize rotations, (3) implementing irrigation schedules based on crop water requirements, and (4) improving N fertilizer management.
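To make the reported benchmark concrete, the sketch below shows how it could be applied to a single field. This is a minimal illustration: only the slope (19.3 kg ha-1 mm-1), the x-intercept (100 mm), and the reported average WP and NUE figures come from the abstract; the function names and the example field values are hypothetical.

```python
# Illustrative sketch of the water-supply benchmark described in the abstract.
# Only the slope, x-intercept, and the reported average efficiencies are from
# the source; function names and example inputs are hypothetical.

def benchmark_yield_w(water_supply_mm: float) -> float:
    """Benchmark water-limited yield (kg ha-1) from seasonal water supply (mm)."""
    slope = 19.3          # kg grain ha-1 per mm of water supply (from abstract)
    x_intercept = 100.0   # mm of water supply at which benchmark yield is zero
    return max(0.0, slope * (water_supply_mm - x_intercept))

def water_productivity(yield_kg_ha: float, water_supply_mm: float) -> float:
    """WP: kg grain per mm of water supply."""
    return yield_kg_ha / water_supply_mm

def n_use_efficiency(yield_kg_ha: float, n_rate_kg_ha: float) -> float:
    """NUE: kg grain per kg of N fertilizer applied."""
    return yield_kg_ha / n_rate_kg_ha

if __name__ == "__main__":
    # Hypothetical field: 800 mm total water supply, 12,800 kg ha-1 yield, 180 kg N ha-1.
    water, actual_yield, n_rate = 800.0, 12_800.0, 180.0
    yp_w = benchmark_yield_w(water)               # benchmark water-limited yield
    wp = water_productivity(actual_yield, water)  # compare with ~14 kg mm-1 reported average
    nue = n_use_efficiency(actual_yield, n_rate)  # compare with ~71 kg kg-1 reported average
    print(f"Benchmark YP-W: {yp_w:.0f} kg/ha, WP: {wp:.1f} kg/mm, NUE: {nue:.0f} kg/kg")
```

Comparing a field's observed WP and NUE against the benchmark line and the reported regional averages is one way to flag fields where water or N management, rather than yield potential, is the main constraint.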