Abstract:
Sorghum is a staple food for half a billion people and, through growth on marginal land with minimal inputs, is an important source of feed, forage and, increasingly, biofuel feedstock. Here we present information about non-cellulosic cell wall polysaccharides in a diverse set of cultivated and wild Sorghum bicolor grains. Sorghum grain contains predominantly starch (64–76%) but is relatively deficient in other polysaccharides present in wheat, oats and barley. Despite overall low quantities, sorghum germplasm exhibited a remarkable range in polysaccharide amount and structure. Total (1,3;1,4)-β-glucan ranged from 0.06% to 0.43% (w/w), whilst internal cellotriose:cellotetraose ratios ranged from 1.8 to 2.9:1. Arabinoxylan amounts fell between 1.5% and 3.6% (w/w) and the arabinose:xylose ratio, denoting arabinoxylan structure, ranged from 0.95 to 1.35. The distribution of these and other cell wall polysaccharides varied across grain tissues, as assessed by electron microscopy. When ten genotypes were tested across five environmental sites, genotype (G) was the dominant source of variation for both (1,3;1,4)-β-glucan and arabinoxylan content (69–74%), with environment (E) responsible for 5–14%. There was a small G × E effect for both polysaccharides. This study defines the amount and spatial distribution of polysaccharides and reveals a significant genetic influence on cell wall composition in sorghum grain.
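The G/E/G × E percentages above are the kind of output a variance partition from a two-way ANOVA produces. Below is a minimal illustrative sketch (not the authors' analysis); the file name and the columns `genotype`, `site` and `trait` are hypothetical.

```python
# Illustrative sketch: partition variation in a grain trait into genotype (G),
# environment (E) and G x E terms for a 10-genotype x 5-site trial.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("beta_glucan_trial.csv")  # assumed columns: genotype, site, trait

model = ols("trait ~ C(genotype) * C(site)", data=df).fit()
anova = sm.stats.anova_lm(model, typ=2)

# Express each term's sum of squares as a share of the total, analogous to
# reporting G as 69-74% and E as 5-14% of the variation.
anova["pct_of_total"] = 100 * anova["sum_sq"] / anova["sum_sq"].sum()
print(anova)
```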
Abstract:
In semi-arid sub-tropical areas, a number of studies of no-till (NT) farming systems have demonstrated economic, environmental and soil quality advantages over conventional tillage (CT). However, adoption of continuous NT has contributed to the build-up of herbicide-resistant weed populations, increased incidence of soil- and stubble-borne diseases, and stratification of nutrients and organic carbon near the soil surface. Farmers often resort to an occasional strategic tillage (ST) to manage these problems of NT systems. However, farmers who practice strict NT systems are concerned that even one-time tillage may undo the positive soil condition benefits of NT farming. We reviewed the pros and cons of occasional ST in NT farming systems. Impacts of occasional ST on agronomy, soil and environment are site-specific and depend on many interacting soil, climatic and management conditions. Most studies conducted in North America and Europe suggest that introducing occasional ST into continuous NT farming systems could improve productivity and profitability in the short term; in the long term, however, the impact is negligible or may be negative. The short-term impacts of occasional ST on soil and environment include reduced protective cover, soil loss by erosion, increased runoff, loss of C and water, and reduced microbial activity, with little or no detrimental impact in the long term. A potential negative effect immediately following ST is reduced plant-available water, which may make crop sowing unreliable in variable seasons. Rainfall between ST and sowing, or immediately after sowing, is necessary to replenish soil water lost from the seed zone. Timing of ST is therefore likely to be critical and must be balanced against optimising soil water prior to seeding. The impact of occasional ST also varies with the tillage implement used; for example, inversion tillage using a mouldboard plough has greater impacts than chisel or disc tillage. Opportunities for future research on occasional ST with the most commonly used implements, such as tine and/or disc, in Australia's northern grains-growing region are presented in the context of agronomy, soil and the environment.
Abstract:
The urban presence of flying-foxes (pteropid bats) in eastern Australia has increased in the last 20 years, putatively reflecting broader landscape change. The influx of large numbers often precipitates community angst, typically stemming from concerns about loss of social amenity, economic loss or negative health impacts from recently emerged bat-mediated zoonotic diseases such as Hendra virus and Australian bat lyssavirus. Local authorities and state wildlife agencies are increasingly asked to approve the dispersal or modification of flying-fox roosts to address expressed concerns, yet the scale of this concern within the community, and the veracity of its basis, are often unclear. We conducted an online survey to capture community attitudes and opinions on flying-foxes in the urban environment to inform management policy and decision-making. Analysis focused on awareness, concerns and management options, and primarily compared responses from communities where flying-fox management was and was not topical at the time of the survey. First, while a majority of respondents indicated a moderate to high level of knowledge of both flying-foxes and Hendra virus, a substantial minority mistakenly believed that flying-foxes pose a direct infection risk to humans, suggesting miscommunication or misinformation and the need for additional risk communication strategies. Second, a minority of community members indicated they were directly impacted by urban roosts, most plausibly those living in close proximity to a roost, suggesting that targeted management options are warranted. Third, neither dispersal nor culling was seen as an appropriate management strategy by the majority of respondents, including those from postcodes where flying-fox management was topical. These findings usefully inform community debate and policy development, and demonstrate the value of social analysis in defining the issues and options in this complex human–wildlife interaction. The mobile nature of flying-foxes underlines the need for a management strategy at a regional or larger scale, independent of state borders.
Abstract:
Castration of cattle using rubber rings is becoming increasingly popular due to the perceived ease of the procedure and greater operator safety compared with surgical castration. Few comparative studies have investigated the effects of castration method and calf age on welfare outcomes, particularly in a tropical environment. Thirty 3-month-old (liveweight 71–119 kg) and thirty 6-month-old (liveweight 141–189 kg) Belmont Red calves (a tropically adapted breed) were assigned to a 2 (age) × 3 (castration method: surgical, ring and sham) factorial study (Surg3, Surg6, Ring3, Ring6, Sham3 and Sham6; n = 10 per treatment group). Welfare outcomes were assessed post-castration using behaviour for 2 weeks, blood parameters (cortisol and haptoglobin concentrations) to 4 weeks, wound healing to 5 weeks, and liveweights to 6 weeks. More Surg calves struggled during castration than Sham and Ring calves (P < 0.05; 90 ± 7% vs. 20 ± 9% and 24 ± 10%) and performed more struggles (1.9 ± 0.2, 1.1 ± 0.3 and 1.1 ± 0.3 for Surg, Sham and Ring, respectively), suggesting that surgical castration caused the most pain during the procedure itself. A significant (P < 0.05) time × castration method × age interaction for plasma cortisol revealed that concentrations decreased most rapidly in Sham calves; the Ring6 calves, unlike the other treatment groups, failed to show reduced cortisol concentrations at 2 h post-castration. By 7 h post-castration, all treatment groups had similar concentrations. A significant (P < 0.01) time × castration method interaction showed that haptoglobin concentrations increased slightly to 0.89 and 0.84 mg/mL for Surg and Ring, respectively, over the first 3 days post-castration. Concentrations for Surg then decreased to levels similar to Sham by day 21; although concentrations for Ring decreased on day 7 to 0.76 mg/mL, they increased significantly on day 14 to 0.97 mg/mL before falling to concentrations similar to the other groups (0.66 mg/mL) by day 21. Significantly (P < 0.05) more of the wounds of the 3-month-old than the 6-month-old calves scored as 'healed' at day 7 (74% vs. 39%), while more (P = 0.062) of the Surg than Ring wounds scored as 'healed' at day 21 (60% vs. 29%). At day 14 there were significantly (P < 0.05) fewer healed wounds in Ring6 than in the other treatment groups (13% vs. 40–60%). Liveweight gain was significantly (P < 0.05) greater in 3-month-old (0.53 kg/day) than in 6-month-old calves (0.44 kg/day), and greater (P < 0.001) in Sham calves (0.54 kg/day) than in Ring (0.44 kg/day) and Surg (0.48 kg/day) calves. Overall, welfare outcomes were slightly better for Surg than Ring calves due to reduced inflammation and faster wound healing, with little difference between age groups.
Abstract:
In maize, as with most cereals, grain yield is mostly determined by the total grain number per unit area, which is closely related to the rate of crop growth during the critical period around silking. Management practices such as plant density or nitrogen fertilisation can affect crop growth during this period, and consequently final grain yield. Across the Northern Region, maize is grown over a wide range of plant populations under high year-to-year rainfall variability. Clear guidelines on how to match hybrids and management to environments and expected seasonal conditions would allow growers to increase yields and profits while managing risk. The objective of this research was to screen commercial maize hybrids differing in maturity and prolificacy (i.e. multi- or single-cobbing) types for their efficiency in allocating biomass into grain.
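As a rough numerical illustration of the grain-number relationship above (not from the study; all values are made up), yield can be decomposed as grains per unit area multiplied by individual kernel weight:

```python
# Illustrative only: grain yield as grain number per unit area x kernel weight.
# All values are hypothetical.
def grain_yield_t_ha(grains_per_m2: float, kernel_weight_mg: float) -> float:
    """t/ha = grains/m2 * kernel weight (g) * 10,000 m2/ha / 1,000,000 g/t."""
    return grains_per_m2 * (kernel_weight_mg / 1000.0) * 10_000 / 1_000_000

# 3,000 grains/m2 at 300 mg per kernel -> 9.0 t/ha
print(grain_yield_t_ha(3000, 300))
```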
Abstract:
Pratylenchus thornei is a root-lesion nematode (RLN) of economic significance in the grain-growing regions of Australia. Chickpea (Cicer arietinum) is a significant legume crop grown throughout these regions, but previous testing found most cultivars were susceptible to P. thornei. Improved resistance to P. thornei is therefore an important objective of the Australian chickpea breeding program. A glasshouse method was developed to assess the resistance of chickpea lines to P. thornei; it requires relatively low labour and resource input and hence is suited to routine adoption within a breeding program. Using this method, good differentiation of chickpea cultivars for P. thornei resistance was measured after 12 weeks. Nematode multiplication was higher for all genotypes than for the unplanted control, but of the 47 cultivars and breeding lines tested, 17 exhibited partial resistance, allowing less than two-fold multiplication. The relative differences in resistance identified using this method were highly heritable (0.69) and were validated against P. thornei data from seven field trials using a multi-environment trial analysis. Genetic correlations for cultivar resistance between the glasshouse and six of the field trials were high (>0.73). These results demonstrate that resistance to P. thornei in chickpea is highly heritable and can be effectively selected in a limited set of environments. The improved resistance found in a number of the newer chickpea cultivars tested shows that some advances have been made in the P. thornei resistance of Australian chickpea cultivars, and that further targeted breeding and selection should provide incremental improvements.
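The "less than two-fold multiplication" criterion above corresponds to a nematode reproduction factor below 2. The sketch below is a minimal illustration of that classification, assuming Rf is computed as final over initial population density (Pf/Pi); line names and densities are hypothetical.

```python
# Illustrative sketch: classify lines by nematode reproduction factor
# Rf = Pf / Pi (final / initial population density). A line with Rf < 2
# would count as partially resistant in the sense used above.
# Line names and densities are hypothetical.
def reproduction_factor(pf: float, pi: float) -> float:
    return pf / pi

lines = {"line_A": (1800.0, 1500.0), "line_B": (9000.0, 1500.0)}  # (Pf, Pi)
for name, (pf, pi) in lines.items():
    rf = reproduction_factor(pf, pi)
    label = "partially resistant (Rf < 2)" if rf < 2 else "susceptible"
    print(f"{name}: Rf = {rf:.2f} -> {label}")
```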
Abstract:
Exposure to hot environments affects milk yield (MY) and milk composition of pasture- and feed-pad-fed dairy cows in subtropical regions. This study was undertaken during summer to compare the MY and physiology of cows exposed to six heat-load management treatments. Seventy-eight Holstein-Friesian cows were blocked by season of calving, parity, milk yield, BW, and milk protein (%) and milk fat (%) measured in the 2 weeks prior to the start of the study. Within blocks, cows were randomly allocated to one of the following treatments: open-sided iron-roofed day pen adjacent to the dairy plus sprinklers (CID + SP); CID only; non-shaded pen adjacent to the dairy plus sprinklers (NSD + SP); open-sided shade-cloth-roofed day pen adjacent to the dairy (SCD); non-shaded pen with the sprinkler on for 45 min at 1100 h if mean respiration rate exceeded 80 breaths per minute (NSD + WSP); and open-sided shade-cloth-roofed structure over the feed bunk in the paddock plus a 1 km walk to and from the dairy (SCP + WLK). Sprinklers for CID + SP and NSD + SP cycled 2 min on, 12 min off when ambient temperature exceeded 26°C. Milk yields were highest in the CID + SP and CID treatments (23.9 L cow⁻¹ day⁻¹), intermediate for NSD + SP, SCD and SCP + WLK (22.4 L cow⁻¹ day⁻¹), and lowest for NSD + WSP (21.3 L cow⁻¹ day⁻¹) (P < 0.05). Feed intakes were highest (P < 0.05) in the CID + SP and CID treatments and lowest (P < 0.05) for NSD + WSP and SCP + WLK. Weather data were collected on site at 10-min intervals, and from these the temperature-humidity index (THI) was calculated. Nonlinear regression modelling of MY against THI and heat-load management treatment demonstrated that cows in CID + SP showed no decline in MY up to a THI break-point value of 83.2, whereas the pooled MY of the other treatments declined when THI exceeded 80.7. A combination of iron-roof shade plus water sprinkling throughout the day provided the most effective control of heat load.
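The abstract does not state which THI equation was used; the sketch below computes one widely cited dairy formulation (an NRC-style index combining dry-bulb temperature and relative humidity), shown purely for illustration.

```python
# Illustrative sketch of a temperature-humidity index (THI) calculation.
# This is one widely cited dairy formulation, assumed here for illustration;
# the study may have used a different THI equation.
def thi(temp_c: float, rel_humidity_pct: float) -> float:
    """THI = (1.8*T + 32) - (0.55 - 0.0055*RH) * (1.8*T - 26), T in deg C, RH in %."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rel_humidity_pct) * (1.8 * temp_c - 26)

# e.g. 32 deg C at 60% relative humidity gives THI ~ 82.6, above the pooled
# break point of 80.7 but below the CID + SP break point of 83.2 reported above.
print(round(thi(32.0, 60.0), 1))
```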