Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY, (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia's main grain-production regions in different eras, starting in Queensland in the 1960s and ending in Victoria in the 2000s. Improved management practices adopted over that period were reflected in potential yields that rose with research era: from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) in pre-planting soil samples were used, and these provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of the wheat trials for which grain protein data were available (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and hence CV90. Subsets of data were used to explore the impact of crop management practices, such as crop rotation or fallow length, on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved in using pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance depends almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing indicates a minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential and reduces the opportunity for effective in-season applications.
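As a concrete illustration of the calibration quantities defined above (RY = (Y0/Ymax) × 100, CV90 = the soil test value at 90% RY, and CR90 = the 70% confidence interval around CV90), the sketch below fits a saturating response curve to hypothetical (nitrate-N, RY) pairs and reads off CV90 and a bootstrap CR90. The Mitscherlich-type functional form, the bootstrap interval, and the data are all assumptions for illustration; the abstract does not describe the BFDC Interrogator's actual fitting procedure.

```python
# Minimal sketch: estimate CV90 and CR90 from (soil nitrate-N, relative yield)
# calibration pairs. Functional form, data and CI method are illustrative
# assumptions, not the BFDC Interrogator's actual procedure.
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(x, a, k):
    """Relative yield (%) as a saturating function of nitrate-N (kg/ha)."""
    return a * (1.0 - np.exp(-k * x))

def cv90(params):
    """Soil test value at which the fitted curve reaches 90% relative yield."""
    a, k = params
    if a <= 90.0:                       # curve never reaches 90% RY
        return np.nan
    return -np.log(1.0 - 90.0 / a) / k

# Hypothetical calibration data: pre-plant nitrate-N (0-0.6 m, kg/ha) vs RY (%).
x = np.array([20, 35, 50, 60, 75, 90, 110, 140, 180], dtype=float)
ry = np.array([55, 68, 78, 82, 87, 91, 94, 97, 99], dtype=float)

popt, _ = curve_fit(mitscherlich, x, ry, p0=(100.0, 0.02))
print(f"CV90 = {cv90(popt):.0f} kg nitrate-N/ha")

# CR90: central 70% interval of bootstrap CV90 estimates.
rng = np.random.default_rng(1)
boot = []
for _ in range(1000):
    idx = rng.integers(0, len(x), len(x))
    try:
        p, _ = curve_fit(mitscherlich, x[idx], ry[idx], p0=popt, maxfev=2000)
        boot.append(cv90(p))
    except RuntimeError:
        continue
lo, hi = np.nanpercentile(boot, [15, 85])
print(f"CR90 approx. {lo:.0f}-{hi:.0f} kg nitrate-N/ha")
```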
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and their confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, and low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pH(CaCl2) <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from the early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosols, Chromosols, Kandosols, Sodosols, Tenosols and Vertosols. Critical Colwell-P concentrations at 90% of maximum relative yield ranged from 15 mg/kg (Grey Vertosols) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of the Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly reflecting differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley differ from those for wheat on the same soils. Significant knowledge gaps that must be filled to improve the relevance and reliability of soil P testing for winter cereals were: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, and also provide examples of interrogation pathways into the BFDC National Database for extracting locally relevant critical P concentrations to guide P fertiliser decision-making in wheat and barley.
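To make the screening step concrete, the sketch below applies the three filters named above (Ymax < 1 t/ha, severe crop stress, pH(CaCl2) < 4.3) to a toy table before any critical-concentration fitting. The column names and records are hypothetical, since the abstract does not describe the BFDC National Database schema.

```python
# Minimal sketch of the data screening described above; column names and
# records are hypothetical, not the BFDC National Database schema.
import pandas as pd

trials = pd.DataFrame({
    "series_id":      [1, 2, 3, 4, 5],
    "asc_suborder":   ["Grey Vertosol", "Grey Vertosol",
                       "Supracalcic Calcarosol", "Red Chromosol", "Red Chromosol"],
    "max_yield_t_ha": [0.8, 2.4, 3.1, 1.9, 2.7],
    "severe_stress":  [False, False, True, False, False],
    "ph_cacl2":       [5.1, 5.3, 7.9, 4.1, 5.6],
})

screened = trials[
    (trials["max_yield_t_ha"] >= 1.0)   # drop low-yielding series (<1 t/ha)
    & (~trials["severe_stress"])        # drop series with severe crop stress
    & (trials["ph_cacl2"] >= 4.3)       # drop strongly acidic sites
]

# Critical Colwell-P would then be estimated per ASC Order/Sub-order from the
# screened series (e.g. with a calibration fit like the CV90 sketch above).
for soil, group in screened.groupby("asc_suborder"):
    print(soil, "retains", len(group), "of the toy treatment series")
```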
Abstract:
A recent report to the Australian Government identified concerns about Australia's capacity to respond to a medium to large outbreak of foot-and-mouth disease (FMD). To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario involving 62 infected premises in five states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product and equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Because unlimited personnel was considered unrealistic, the course of the outbreak was also modelled under three levels of staffing, and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication within 6 months. Deploying an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication within 3 months to 68%, and within 6 months to 100%. Deploying further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication within 3 months to 74% and within 6 months to 100%. This required 25 vaccination teams commencing 12 days into the control program, increasing to 50 teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected-premises operations was equally effective in reducing the outbreak size and duration.
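The probabilities quoted above are outputs of repeated stochastic simulation: each staffing level is run many times and the share of runs eradicated within the horizon is reported. The toy model below illustrates only that statistic; it is not AusSpread, and its branching dynamics, parameters and staffing stand-ins are invented for illustration.

```python
# Toy Monte Carlo: probability of eradicating an outbreak within a time horizon
# under different response capacities. NOT AusSpread; all dynamics and numbers
# here are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)

def run_outbreak(daily_culling_capacity, beta=0.08, initial_cases=62, days=180):
    """Return the day eradication is achieved, or None if not within `days`."""
    infected = initial_cases
    for day in range(1, days + 1):
        new = rng.binomial(infected, beta)            # premises infected today
        infected = max(0, infected + new - daily_culling_capacity)
        if infected == 0:
            return day
        if infected > 10_000:                         # runaway epidemic
            return None
    return None

def p_eradication(capacity, horizon_days, runs=2000):
    hits = sum(1 for _ in range(runs)
               if (d := run_outbreak(capacity)) is not None and d <= horizon_days)
    return hits / runs

for capacity in (4, 5, 8):    # stand-ins for baseline vs augmented staffing
    print(f"capacity {capacity}/day: "
          f"P(eradicated within 90 days) = {p_eradication(capacity, 90):.2f}")
```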
Abstract:
Purpose: We investigated the effects of weed control and fertilization at early establishment on foliar stable carbon isotope (δ13C) and nitrogen isotope (δ15N) compositions, foliar N concentration, tree growth and biomass, relative weed cover, and other physiological traits in a 2-year-old F1 hybrid (Pinus elliottii var. elliottii (Engelm.) × Pinus caribaea var. hondurensis (Barr. & Golf.)) plantation grown on a yellow earth in southeast Queensland, in subtropical Australia. Materials and methods: Treatments comprised routine, luxury, intermediate, mechanical and nil weed control, plus routine and luxury fertilization, in a randomised complete block design. Initial soil nutrition and fertility parameters included hot-water-extractable organic carbon (C) and total nitrogen (N), total C and N, C/N ratio, labile N pools (nitrate (NO3−) and ammonium (NH4+)), extractable potassium (K+), and soil δ15N and δ13C. Relative weed cover, foliar N concentration, tree growth rate and physiological parameters, including photosynthesis, stomatal conductance, photosynthetic nitrogen use efficiency, foliar δ15N and foliar δ13C, were also measured at early establishment. Results and discussion: Foliar N concentration at 1.25 years differed significantly amongst the weed control treatments and was negatively correlated with relative weed cover at 1.1 years. Foliar N concentration was also positively correlated with foliar δ15N, foliar δ13C, tree height, height growth rate and tree biomass. Foliar δ15N was negatively correlated with relative weed cover at 0.8 and 1.1 years. The physiological measurements indicated that luxury fertilization and increasing weed competition on these soils decreased leaf xylem pressure potential (Ψxpp) relative to the other treatments. Conclusions: These results show how increasing N resources and weed competition affect tree N and water use at establishment in F1 hybrid plantations of southeast Queensland, Australia. They suggest the desirability of weed control in the inter-planting row in the first year, to maximise the site N and water resources available for seedling growth, and the need to avoid over-fertilisation, which interfered with the balance between available N and water on these soils.
Abstract:
Phylogenetic group D extraintestinal pathogenic Escherichia coli (ExPEC), including O15:K52:H1 and clonal group A, have spread globally and become fluoroquinolone-resistant. Here we investigated the role of canine feces as a reservoir of these (and other) human-associated ExPEC and their potential as canine pathogens. We characterized and compared fluoroquinolone-resistant E. coli isolates originally identified as phylogenetic group D from either the feces of hospitalized dogs (n = 67; 14 dogs) or canine extraintestinal infections (n = 53; 33 dogs). Isolates underwent phylogenetic grouping, random amplified polymorphic DNA (RAPD) analysis, virulence genotyping, resistance genotyping, human-associated ExPEC O-typing, and multilocus sequence typing. Five of seven human-associated sequence types (STs) exhibited ExPEC-associated O-types and appeared in separate RAPD clusters. The largest subgroup (16 fecal and 26 clinical isolates) comprised ST354 (phylogroup F) isolates. ST420 (phylogroup B2); O1-ST38, O15:K52:H1-ST393, and O15:K1-ST130 (phylogroup D); and O7-ST457 and O1-ST648 (phylogroup F) were also identified. Three ST-specific RAPD sub-clusters (ST354, ST393, and ST457) contained closely related isolates from both fecal and clinical sources. Genes encoding CTX-M and AmpC β-lactamases were identified in isolates from five STs. Major human-associated fluoroquinolone-resistant, and in some cases extended-spectrum cephalosporin-resistant, ExPEC of public health importance may thus be carried in dog feces and cause extraintestinal infections in some dogs.
Abstract:
During the past 15 years, surveys to identify virus diseases affecting cool-season food legume crops were conducted in Australia and 11 CWANA countries (Algeria, China, Egypt, Ethiopia, Lebanon, Morocco, Sudan, Syria, Tunisia, Uzbekistan and Yemen). More than 20,000 samples were collected and tested for the presence of 14 legume viruses by tissue-blot immunoassay (TBIA) using a battery of antibodies, including the following Luteovirus monoclonal antibodies (McAbs): a broad-spectrum legume Luteovirus McAb (5G4) and McAbs specific for BLRV, BWYV, SbDV and CpCSV. A total of 195 Luteovirus-positive samples were selected for further testing by RT-PCR using seven primers (one degenerate primer able to detect a wide range of Luteoviridae species, and six species-specific primers) at the Virology Laboratory, QDAF, Australia, during 2014. A total of 145 DNA fragments (representing 105 isolates) were sequenced. On the basis of the molecular analysis, the following viruses were characterized: BLRV from Lebanon, Morocco, Tunisia and Uzbekistan; SbDV from Australia, Syria and Uzbekistan; BWYV from Algeria, China, Ethiopia, Lebanon, Morocco, Sudan, Tunisia and Uzbekistan; CABYV from Algeria, Lebanon, Syria, Sudan and Uzbekistan; CpCSV from Algeria, Ethiopia, Lebanon, Morocco, Syria and Tunisia; and unknown Luteoviridae species from Algeria, Ethiopia, Morocco, Sudan, Uzbekistan and Yemen. This study clearly showed that a number of Polerovirus species in addition to BWYV (e.g. CABYV, CpCSV, and other unidentified Polerovirus species) can produce yellowing/stunting symptoms in pulses. To our knowledge, this is the first report of CABYV affecting food legumes. Moreover, there was about 95% agreement between the serological (TBIA) and molecular results for the detection of BLRV and SbDV, whereas TBIA results were not accurate when the CpCSV and BWYV McAbs were used. It appears that the McAbs for CpCSV and BWYV used in this study, and those available worldwide, are not species-specific: both antibodies reacted with other Polerovirus species (e.g. CABYV and unknown poleroviruses). This highlights the need for more accurate characterization of existing antibodies and, where necessary, the development of better, virus-specific antibodies to enable accurate diagnosis of poleroviruses.
Abstract:
With the aim of increasing peanut production in Australia, the Australian peanut industry has recently considered growing peanuts in rotation with maize at Katherine in the Northern Territory, a location with a semi-arid tropical climate and surplus irrigation capacity. We used the well-validated APSIM model to examine the potential agronomic benefits and long-term risks of this strategy under the current and warmer climates of the new region. Yield of the two crops, irrigation requirement, total soil organic carbon (SOC), nitrogen (N) losses and greenhouse gas (GHG) emissions were simulated. Sixteen climate stressors were used, generated from the global climate models ECHAM5, GFDL2.1, GFDL2.0 and MRI-CGCM2.3.2 with median sensitivity under two scenarios from the Special Report on Emissions Scenarios (SRES) over the 2030 and 2050 timeframes, plus the current climate (baseline) for Katherine. Effects were compared at three levels of irrigation and three levels of N fertiliser applied to maize, grown in rotations of wet-season peanut and dry-season maize (WPDM) and of wet-season maize and dry-season peanut (WMDP). The climate stressors projected average temperature increases of 1°C to 2.8°C in the dry (baseline 24.4°C) and wet (baseline 29.5°C) seasons for the 2030 and 2050 timeframes, respectively. Increased temperature reduced the yield of both crops in both rotations. However, the overall yield advantage of WPDM over the industry-preferred WMDP sequence increased from 41% to up to 53% under the worst climate projection. Increased temperature also increased the irrigation requirement of WPDM by up to 11%, but caused a smaller reduction in total SOC accumulation and smaller increases in N losses and GHG emissions than in WMDP. We conclude that although increased temperature will reduce productivity and total SOC accumulation, and will increase N losses and GHG emissions, in Katherine and similar northern Australian environments, the WPDM sequence should be preferred over the industry-preferred sequence because of its overall yield and sustainability advantages in warmer climates. Any constraints on irrigation water resulting from climate change could, however, limit these advantages.
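The count of sixteen climate stressors follows directly from the full factorial of models, emissions scenarios and timeframes described above (with the current-climate baseline run separately); the sketch below simply enumerates that matrix. The two SRES scenarios are not named in the abstract, so they are labelled generically here.

```python
# Enumerate the 16 climate stressors: 4 GCMs x 2 SRES scenarios x 2 timeframes.
# The SRES scenario labels are placeholders; the abstract does not name them.
from itertools import product

gcms = ["ECHAM5", "GFDL2.1", "GFDL2.0", "MRI-CGCM2.3.2"]
sres_scenarios = ["SRES scenario 1", "SRES scenario 2"]   # placeholders
timeframes = [2030, 2050]

stressors = list(product(gcms, sres_scenarios, timeframes))
print(len(stressors))          # 16, plus the current-climate baseline
```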