45 results for anthropogenic environment
Abstract:
Experiments were conducted over 5 years to understand the seasonal phenology of bare-rooted 'Festival' strawberry plants (Fragaria × ananassa) growing at Nambour in southeastern Queensland, Australia. Yields ranged from 661 to 966 g/plant, and average seasonal fruit fresh weight ranged from 15 to 18 g. The growth of the leaves, crowns, roots, flowers and fruit over time followed a linear or sigmoid pattern. Maximum values of leaf, crown and root dry weight towards the end of the growing season, about 190 days after planting, were 30, 15 and 7 g/plant, respectively. The rates of leaf and crown growth were lower than those achieved in California under a Mediterranean climate. There were strong relationships between the allocation of dry matter to the leaves, crowns and roots and plant dry weight. Allocation to the leaves, and especially to the crowns and roots, declined as the plants grew. The number of fruit/plant increased initially over time, with a decline later in the season. Average fruit fresh weight was generally higher early in the season and then declined as fruit production increased. There were strong relationships between the growth of the whole plant and the growth of the flowers and immature fruit, and leaf expansion, across the growing season and across the 5 different years. These results indicate that seasonal growth and potential productivity were strongly linked to the expansion of the leaves in this environment.
Abstract:
Commercial environments may receive only a fraction of the genetic gains for growth rate predicted from the selection environment. This fraction is a result of undesirable genotype-by-environment interactions (G×E) and is measured by the genetic correlation (rg) of growth between environments. Genetic correlations are notoriously difficult to estimate with precision within a single generation. A new design is proposed in which genetic correlations can be estimated by utilising artificial mating from cryopreserved semen and unfertilised eggs stripped from a single female. We compare a traditional phenotype analysis of growth to a threshold model where only the largest fish are genotyped for sire identification. The threshold model was robust to differences in family mortality of up to 30%. The design is unique in that it negates potential re-ranking of families caused by an interaction between common maternal environmental effects and the growing environment. The design is suitable for rapid assessment of G×E over one generation, with a true genetic correlation of 0.70 yielding standard errors as low as 0.07. Different design scenarios were tested for bias and accuracy across a range of heritability values, numbers of half-sib families created, numbers of progeny within each full-sib family, numbers of fish genotyped, numbers of fish stocked, differing family survival rates and various simulated genetic correlation levels.
Abstract:
Methane is a potent greenhouse gas with a global warming potential ∼28 times that of carbon dioxide. Consequently, sources and sinks that influence the concentration of methane in the atmosphere are of great interest. In Australia, agriculture is the primary source of anthropogenic methane emissions (60.4% of national emissions, or 3260 kt methane year−1, between 1990 and 2011), and cropping and grazing soils represent Australia's largest potential terrestrial methane sink. As of 2011, the expansion of agricultural soils, which are ∼70% less efficient at consuming methane than undisturbed soils, to 59% of Australia's land mass (456 Mha), together with increasing livestock densities in northern Australia, suggests negative implications for the national methane flux. Plant biomass burning does not appear to have long-term negative effects on methane flux unless soils are converted for agricultural purposes. Rice cultivation contributes marginally to national methane emissions, and this contribution fluctuates depending on water availability. The significant available research into biological, geochemical and agronomic factors has been pertinent for developing effective methane mitigation strategies. We discuss methane-flux feedback mechanisms in relation to climate change drivers such as temperature, atmospheric carbon dioxide and methane concentrations, precipitation and extreme weather events. Future research should focus on quantifying the role of Australian cropping and grazing soils as methane sinks in the national methane budget, linking biodiversity and activity of methane-cycling microbes to environmental factors, and quantifying how a combination of climate change drivers will affect total methane flux in these systems.
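The global warming potential figure in the abstract implies a simple conversion from methane mass to CO2-equivalents. As a minimal sketch (the function name and the assumption of the 100-year GWP horizon are ours, not the abstract's):

```python
# Illustrative only: convert a methane emission total to CO2-equivalents
# using the global warming potential (~28) cited in the abstract.
GWP_CH4 = 28  # assumed 100-year global warming potential of methane vs. CO2

def co2_equivalent(methane_kt: float, gwp: float = GWP_CH4) -> float:
    """Return CO2-equivalent emissions (kt CO2-e) for a methane mass in kt."""
    return methane_kt * gwp

# Applying it to the abstract's 3260 kt methane/year from agriculture:
print(co2_equivalent(3260))  # 91280 kt CO2-e per year
```

This is the standard way greenhouse gases are placed on a common scale in national inventories; the abstract itself reports only the methane mass.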
Abstract:
The prospect of climate change has revived both fears of food insecurity and its corollary, market opportunities for agricultural production. In Australia, with its long history of state-sponsored agricultural development, there is renewed interest in the agricultural development of tropical and sub-tropical northern regions. Climate projections suggest that there will be less water available to the main irrigation systems of the eastern central and southern regions of Australia, while net rainfall could be sustained or even increase in the northern areas. Hence, there could be more intensive use of northern agricultural areas, with the relocation of some production of economically important commodities such as vegetables, rice and cotton. The problem is that the expansion of cropping in northern Australia has been constrained by agronomic and economic considerations. The present paper examines the economics, at both farm and regional level, of relocating some cotton production from the east-central irrigation areas to the north, where there is an existing irrigation scheme together with some industry and individual interest in such relocation. Integrated modelling and expert knowledge are used to examine this example of prospective climate change adaptation. Farm-level simulations show that without adaptation, overall gross margins will decrease under a combination of climate change and reduction in water availability. A dynamic regional Computable General Equilibrium model is used to explore two scenarios of relocating cotton production from south-east Queensland to sugar-dominated areas in northern Queensland. Overall, an increase in real economic output and real income was realized when some cotton production was relocated to sugar cane fallow land/new land. There were, however, large negative effects on regional economies where cotton production displaced sugar cane.
It is concluded that even excluding the agronomic uncertainties, which are not examined here, there is unlikely to be significant market-driven relocation of cotton production.
Abstract:
Hendra virus (HeV), a highly pathogenic zoonotic paramyxovirus recently emerged from bats, is a major concern to the horse industry in Australia. Previous research has shown that higher temperatures led to lower virus survival rates in the laboratory. We develop a model of survival of HeV in the environment as influenced by temperature. We used 20 years of daily temperature at six locations spanning the geographic range of reported HeV incidents to simulate the temporal and spatial impacts of temperature on HeV survival. At any location, simulated virus survival was greater in winter than in summer, and in any month of the year, survival was higher at higher latitudes. At any location, year-to-year variation in virus survival 24 h post-excretion was substantial and was as large as the difference between locations. Survival was higher in microhabitats with lower than ambient temperature, and when environmental exposure was shorter. The within-year pattern of virus survival mirrored the cumulative within-year occurrence of reported HeV cases, although there were no overall differences in survival between HeV case years and non-case years. The model examines the effect of temperature in isolation; actual virus survivability will reflect the effect of additional environmental factors.
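The abstract describes a temperature-driven survival model but does not give its functional form. A minimal sketch of the general idea, assuming first-order (exponential) decay with a decay rate that rises with temperature; the rate function and all parameter values below are illustrative assumptions, not the published model:

```python
import math

# Hypothetical sketch: virus survival under temperature-dependent decay.
# The published HeV model's exact form and parameters are not given in the
# abstract, so decay_rate() below is an assumed, illustrative form only.

def decay_rate(temp_c: float, k0: float = 0.05, alpha: float = 0.08) -> float:
    """Per-hour decay rate that increases with temperature (assumed form)."""
    return k0 * math.exp(alpha * temp_c)

def survival_fraction(temp_c: float, hours: float) -> float:
    """Fraction of virus surviving after `hours` at a constant temperature."""
    return math.exp(-decay_rate(temp_c) * hours)

# Consistent with the abstract: survival 24 h post-excretion is higher at
# a winter-like 15 degC than at a summer-like 30 degC.
winter = survival_fraction(15.0, 24)
summer = survival_fraction(30.0, 24)
assert winter > summer
```

Driving such a function with 20 years of daily temperature records is what produces the seasonal and latitudinal survival patterns the abstract reports.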
Abstract:
Weather is a general stochastic influence on the life history of weeds. In contrast, anthropogenic disturbance (e.g. land use) is an important deterministic influence on weed demography. Our aim in this study was to investigate the relative contributions of land use and weather to the demography of Lantana camara (lantana), a weed of agricultural and natural habitats, based on the intensive monitoring of lantana populations under three land uses (viz. farm/pasture, and burnt and grazed forests) in subtropical Australia. Lantana populations were growing vigorously across all land uses (asymptotic population growth rate, λ > 3). Examination of historical demography using retrospective perturbation analyses showed that weather was a strong influence on lantana demography, with the transition from an El Niño (2008–09) to a La Niña (2009–10) year having a strong positive effect on population growth rate. This effect was most marked at the grazed site, and to a lesser extent at the burnt site, with seedling-to-juvenile and juvenile-to-adult transitions contributing most to these effects. This is likely the result of burning and grazing having eliminated/reduced interspecific competition at these sites. Prospective perturbation analyses revealed that λ was most sensitive to proportionate changes in growth transitions, followed by fecundity and survival transitions. Examination of context-specific patterns in elasticity revealed that growth and fecundity transitions are likely to be the more critical vital rates to reduce λ in wet years at the burnt and grazed forest sites, compared to the farm/pasture site. Management of lantana may need to limit the transition of juveniles into the adult stages, especially in sites where lantana is free from competition (e.g. in the presence of fire or grazing), and this particularly needs to be achieved in wet years.
Collectively, these results shed light on aspects of spatial and temporal variation in the demography of lantana, and offer insights on its context-specific management.
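The asymptotic growth rate λ quoted above is, in standard matrix population analysis, the dominant eigenvalue of a stage-structured projection matrix. A minimal pure-Python sketch of that calculation; the three-stage matrix entries below are hypothetical illustrations, not the study's estimated vital rates:

```python
# Sketch of obtaining an asymptotic population growth rate (lambda) from a
# stage-structured projection matrix, as used in matrix population models
# like the lantana analysis. Matrix values are hypothetical; lambda is the
# dominant eigenvalue, computed here by power iteration.

def dominant_eigenvalue(A, iters=200):
    """Dominant eigenvalue of a non-negative primitive matrix (power method)."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)          # max-norm estimate of lambda
        v = [x / lam for x in w]              # renormalise the stage vector
    return lam

# Stages: seedling, juvenile, adult (illustrative transitions/fecundities).
A = [
    [0.10, 0.00, 6.00],  # adult fecundity produces new seedlings
    [0.30, 0.40, 0.00],  # seedling -> juvenile growth; juvenile stasis
    [0.00, 0.35, 0.90],  # juvenile -> adult growth; adult survival
]

lam = dominant_eigenvalue(A)
assert lam > 1.0  # a growing population, as in the reported lambda > 3
```

Sensitivity and elasticity analyses, which the abstract mentions, then ask how λ responds to small changes in individual matrix entries such as the juvenile-to-adult transition.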
Abstract:
Sorghum is a staple food for half a billion people and, through growth on marginal land with minimal inputs, is an important source of feed, forage and, increasingly, biofuel feedstock. Here we present information about non-cellulosic cell wall polysaccharides in a diverse set of cultivated and wild Sorghum bicolor grains. Sorghum grain contains predominantly starch (64–76%) but is relatively deficient in other polysaccharides present in wheat, oats and barley. Despite overall low quantities, sorghum germplasm exhibited a remarkable range in polysaccharide amount and structure. Total (1,3;1,4)-β-glucan ranged from 0.06 to 0.43% (w/w), whilst internal cellotriose:cellotetraose ratios ranged from 1.8 to 2.9:1. Arabinoxylan amounts fell between 1.5 and 3.6% (w/w), and the arabinose:xylose ratio, denoting arabinoxylan structure, ranged from 0.95 to 1.35. The distribution of these and other cell wall polysaccharides varied across grain tissues as assessed by electron microscopy. When ten genotypes were tested across five environmental sites, genotype (G) was the dominant source of variation for both (1,3;1,4)-β-glucan and arabinoxylan content (69–74%), with environment (E) responsible for 5–14%. There was a small G × E effect for both polysaccharides. This study defines the amount and spatial distribution of polysaccharides and reveals a significant genetic influence on cell wall composition in sorghum grain.
Abstract:
In semi-arid sub-tropical areas, a number of studies concerning no-till (NT) farming systems have demonstrated advantages in economic, environmental and soil quality aspects over conventional tillage (CT). However, adoption of continuous NT has contributed to the build-up of herbicide-resistant weed populations, increased incidence of soil- and stubble-borne diseases, and stratification of nutrients and organic carbon near the soil surface. Some farmers resort to an occasional strategic tillage (ST) to manage these problems of NT systems. However, farmers who practice strict NT systems are concerned that even one-time tillage may undo the positive soil condition benefits of NT farming systems. We reviewed the pros and cons of the use of occasional ST in NT farming systems. Impacts of occasional ST on agronomy, soil and environment are site-specific and depend on many interacting soil, climatic and management conditions. Most studies conducted in North America and Europe suggest that introducing occasional ST in continuous NT farming systems could improve productivity and profitability in the short term; however, in the long term, the impact is negligible or may be negative. The short-term impacts on soil and environment immediately following occasional ST include reduced protective cover, soil loss by erosion, increased runoff, loss of C and water, and reduced microbial activity, with little or no detrimental impact in the long term. A potential negative effect immediately following ST would be reduced plant-available water, which may result in unreliability of crop sowing in variable seasons. The occurrence of rainfall between the ST and sowing, or immediately after sowing, is necessary to replenish soil water lost from the seed zone. Timing of ST is likely to be critical and must be balanced with optimising soil water prior to seeding.
The impact of occasional ST varies with the tillage implement used; for example, inversion tillage using a mouldboard plough has greater impacts than chisel or disc tillage. Opportunities for future research on occasional ST with the most commonly used implements, such as tine and/or disc, in Australia's northern grains-growing region are presented in the context of agronomy, soil and the environment.
Abstract:
The urban presence of flying-foxes (pteropid bats) in eastern Australia has increased in the last 20 years, putatively reflecting broader landscape change. The influx of large numbers often precipitates community angst, typically stemming from concerns about loss of social amenity, economic loss or negative health impacts from recently emerged bat-mediated zoonotic diseases such as Hendra virus and Australian bat lyssavirus. Local authorities and state wildlife authorities are increasingly asked to approve the dispersal or modification of flying-fox roosts to address expressed concerns, yet the scale of this concern within the community, and the veracity of the basis for concern, are often unclear. We conducted an online survey to capture community attitudes and opinions on flying-foxes in the urban environment to inform management policy and decision-making. Analysis focused on awareness, concerns, and management options, and primarily compared responses from communities where flying-fox management was and was not topical at the time of the survey. While a majority of respondents indicated a moderate to high level of knowledge of both flying-foxes and Hendra virus, a substantial minority mistakenly believed that flying-foxes pose a direct infection risk to humans, suggesting miscommunication or misinformation, and the need for additional risk communication strategies. Secondly, a minority of community members indicated they were directly impacted by urban roosts, most plausibly those living in close proximity to the roost, suggesting that targeted management options are warranted. Thirdly, neither dispersal nor culling was seen as an appropriate management strategy by the majority of respondents, including those from postcodes where flying-fox management was topical. These findings usefully inform community debate and policy development and demonstrate the value of social analysis in defining the issues and options in this complex human–wildlife interaction.
The mobile nature of flying-foxes underlines the need for a management strategy at a regional or larger scale, and independent of state borders.
Abstract:
Castration of cattle using rubber rings is becoming increasingly popular due to the perceived ease of the procedure and greater operator safety when compared with surgical castration. Few comparative studies have investigated the effects of different castration methods and calf age on welfare outcomes, particularly in a tropical environment. Thirty 3-month-old (liveweight 71–119 kg) and thirty 6-month-old (liveweight 141–189 kg) Belmont Red calves (a tropically adapted breed) were assigned to a two age × three castration method (surgical, ring and sham) factorial study (Surg3, Surg6, Ring3, Ring6, Sham3 and Sham6, n = 10 for each treatment group). Welfare outcomes were assessed post-castration using: behaviour for 2 weeks; blood parameters (cortisol and haptoglobin concentrations) to 4 weeks; wound healing to 5 weeks; and liveweights to 6 weeks. More Surg calves struggled during castration compared with Sham and Ring (P < 0.05, 90 ± 7% vs. 20 ± 9% and 24 ± 10%) and performed more struggles (1.9 ± 0.2, 1.1 ± 0.3 and 1.1 ± 0.3 for Surg, Sham and Ring, respectively), suggesting that surgical castration caused most pain during performance of the procedure. A significant (P < 0.05) time × castration method × age interaction for plasma cortisol revealed that concentrations decreased most rapidly in Sham; the Ring6 calves failed to show reduced cortisol concentrations at 2 h post-castration, unlike other treatment groups. By 7 h post-castration, all treatment groups had similar concentrations. A significant (P < 0.01) interaction between time and castration method showed that haptoglobin concentrations increased slightly to 0.89 and 0.84 mg/mL for Surg and Ring, respectively, over the first 3 days post-castration.
Concentrations for Surg then decreased to levels similar to Sham by day 21 and, although concentrations for Ring decreased on day 7 to 0.76 mg/mL, they increased significantly on day 14 to 0.97 mg/mL before reducing to concentrations similar to the other groups (0.66 mg/mL) by day 21. Significantly (P < 0.05) more of the wounds of the 3-month compared with the 6-month calves scored as ‘healed’ at day 7 (74% vs. 39%), while more (P = 0.062) of the Surg than Ring scored as ‘healed’ at day 21 (60% vs. 29%). At day 14 there were significantly (P < 0.05) fewer healed wounds in Ring6 compared with other treatment groups (13% vs. 40–60%). Liveweight gain was significantly (P < 0.05) greater in 3-month (0.53 kg/day) than in 6-month calves (0.44 kg/day) and in Sham calves (P < 0.001, 0.54 kg/day), than in Ring (0.44 kg/day) and Surg (0.48 kg/day) calves. Overall, welfare outcomes were slightly better for Surg than Ring calves due to reduced inflammation and faster wound healing, with little difference between age groups.
Abstract:
In maize, as in most cereals, grain yield is mostly determined by the total grain number per unit area, which is highly related to the rate of crop growth during the critical period around silking. Management practices such as plant density or nitrogen fertilisation can affect the growth of the crop during this period, and consequently the final grain yield. Across the Northern Region, maize is grown under a large range of plant populations and under high year-to-year rainfall variability. Clear guidelines on how to match hybrids and management to environments and expected seasonal conditions would allow growers to increase yields and profits while managing risks. The objective of this research was to screen the response of commercial maize hybrids differing in maturity and prolificacy (i.e. multi- or single-cobbing) types for their efficiency in the allocation of biomass into grain.
Abstract:
Pratylenchus thornei is a root-lesion nematode (RLN) of economic significance in the grain-growing regions of Australia. Chickpea (Cicer arietinum) is a significant legume crop grown throughout these regions, but previous testing found most cultivars were susceptible to P. thornei. Therefore, improved resistance to P. thornei is an important objective of the Australian chickpea breeding program. A glasshouse method was developed to assess resistance of chickpea lines to P. thornei, which requires relatively low labour and resource input, and hence is suited to routine adoption within a breeding program. Using this method, good differentiation of chickpea cultivars for P. thornei resistance was measured after 12 weeks. Nematode multiplication was higher for all genotypes than in the unplanted control, but of the 47 cultivars and breeding lines tested, 17 exhibited partial resistance, allowing less than two-fold multiplication. The relative differences in resistance identified using this method were highly heritable (0.69) and were validated against P. thornei data from seven field trials using a multi-environment trial analysis. Genetic correlations for cultivar resistance between the glasshouse and six of the field trials were high (>0.73). These results demonstrate that resistance to P. thornei in chickpea is highly heritable and can be effectively selected in a limited set of environments. The improved resistance found in a number of the newer chickpea cultivars tested shows that some advances have been made in the P. thornei resistance of Australian chickpea cultivars, and that further targeted breeding and selection should provide incremental improvements.
Abstract:
Exposure to hot environments affects milk yield (MY) and milk composition of pasture- and feed-pad-fed dairy cows in subtropical regions. This study was undertaken during summer to compare MY and physiology of cows exposed to six heat-load management treatments. Seventy-eight Holstein-Friesian cows were blocked by season of calving, parity, milk yield, BW, and milk protein (%) and milk fat (%) measured in the 2 weeks prior to the start of the study. Within blocks, cows were randomly allocated to one of the following treatments: open-sided iron-roofed day pen adjacent to dairy (CID) + sprinklers (SP); CID only; non-shaded pen adjacent to dairy + SP (NSD + SP); open-sided shade-cloth-roofed day pen adjacent to dairy (SCD); non-shaded pen with sprinkler on for 45 min at 1100 h if mean respiration rate >80 breaths per minute (NSD + WSP); and open-sided shade-cloth-roofed structure over feed bunk in paddock + 1 km walk to and from the dairy (SCP + WLK). Sprinklers for CID + SP and NSD + SP cycled 2 min on, 12 min off when ambient temperature >26°C. The highest milk yields were in the CID + SP and CID treatments (23.9 L cow−1 day−1), intermediate for NSD + SP, SCD and SCP + WLK (22.4 L cow−1 day−1), and lowest for NSD + WSP (21.3 L cow−1 day−1) (P < 0.05). The highest (P < 0.05) feed intakes occurred in the CID + SP and CID treatments, while intake was lowest (P < 0.05) for NSD + WSP and SCP + WLK. Weather data were collected on site at 10-min intervals, and from these the temperature–humidity index (THI) was calculated. Nonlinear regression modelling of MY against THI and heat-load management treatment demonstrated that cows in CID + SP showed no decline in MY out to a THI break point value of 83.2, whereas the pooled MY of the other treatments declined when THI >80.7. A combination of iron roof shade plus water sprinkling throughout the day provided the most effective control of heat load.
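The break-point result above corresponds to a broken-stick (segmented) regression: MY is flat up to a THI threshold and then declines linearly. A minimal sketch of that model form; the plateau and slope values used below are illustrative assumptions, with only the break points and yields echoing figures from the abstract:

```python
# Illustrative broken-stick model of milk yield (MY) vs. the
# temperature-humidity index (THI). Plateau/slope values are assumed;
# break points loosely echo the abstract (83.2 for CID + SP, 80.7 pooled).

def milk_yield(thi: float, plateau: float, breakpoint: float, slope: float) -> float:
    """Predicted MY (L/cow/day): constant below the break point, linear decline above."""
    if thi <= breakpoint:
        return plateau
    return plateau - slope * (thi - breakpoint)

# CID + SP cows show no decline out to THI 83.2:
assert milk_yield(80.0, 23.9, 83.2, 0.5) == 23.9
# Pooled other treatments decline once THI > 80.7:
assert milk_yield(83.0, 22.4, 80.7, 0.5) < 22.4
```

In practice the break point and slope are estimated by nonlinear regression, as the study describes; the sketch only shows the shape of the fitted function.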