9 results for 90-01-PC1
in eResearch Archive - Queensland Department of Agriculture
Abstract:
This paper describes the fishery and reproductive biology of Linuparus trigonus obtained from trawl fishermen operating off Queensland’s east coast, Australia. The smallest mature female lobster measured 59.8 mm CL; however, 50% maturity was reached between 80 and 85 mm CL. Brood fecundity (BF) was size dependent and ranged between 19,287 and 100,671 eggs in 32 females from 59.8 to 104.3 mm CL. The relationship was best described by the power equation BF = 0.1107 × CL^2.9241 (r² = 0.74). Egg size ranged from 0.96 to 1.12 mm in diameter (mean = 1.02 ± 0.01 mm). Egg weight and size were independent of lobster size. Length frequencies displayed multi-modal distributions. The percentage of female to male lobsters was relatively stable for small size classes (30 to 70 mm CL; 50.0 to 63.6% females), but female proportions rose markedly between 75 and 90 mm (72.2 to 85.4%), suggesting that at the onset of sexual maturity female growth rates are reduced. In size classes greater than 95 mm, males were numerically dominant. A description of the L. trigonus fishery in Queensland is also provided.
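The fitted power equation can be applied directly to predict fecundity from carapace length; a minimal sketch (the function name is illustrative, not from the paper):

```python
def brood_fecundity(cl_mm):
    """Predicted brood fecundity (eggs) from carapace length (mm CL),
    using the fitted power equation BF = 0.1107 * CL**2.9241."""
    return 0.1107 * cl_mm ** 2.9241

# Fecundity rises steeply with size: the largest female in the study
# (104.3 mm CL) is predicted to carry roughly five times as many eggs
# as the smallest mature female (59.8 mm CL).
ratio = brood_fecundity(104.3) / brood_fecundity(59.8)
```

Because the exponent (~2.92) is close to 3, predicted fecundity scales roughly with body volume, which is consistent with brood size being limited by abdomen capacity.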
Abstract:
The sequential nature of gel-based marker systems entails low throughput and high costs per assay. Commonly used marker systems such as SSR and SNP are also dependent on sequence information. These limitations result in a high cost per data point and significantly limit the capacity of breeding programs to obtain a sufficient return on investment to justify the routine use of marker-assisted breeding for many traits, particularly quantitative traits. Diversity Arrays Technology (DArT™) is a cost-effective hybridisation-based marker technology that offers a high multiplexing level while being independent of sequence information. This technology offers sorghum breeding programs an alternative approach to whole-genome profiling. We report on the development, application, mapping and utility of DArT™ markers for sorghum germplasm. Results: A genotyping array was developed representing approximately 12,000 genomic clones using PstI+BanII complexity reduction, with a subset of clones obtained through the suppression subtractive hybridisation (SSH) method. The genotyping array was used to analyse a diverse set of sorghum genotypes and to screen a recombinant inbred line (RIL) mapping population. Over 500 markers detected variation among the 90 accessions used in a diversity analysis. Cluster analysis discriminated well between all 90 genotypes. To confirm that the sorghum DArT markers behave in a Mendelian manner, we constructed a genetic linkage map for a cross between R931945-2-2 and IS 8525, integrating DArT and other marker types. In total, 596 markers could be placed on the integrated linkage map, which spanned 1431.6 cM. The genetic linkage map had an average marker density of one marker per 2.39 cM, with an average DArT marker density of one marker per 3.9 cM. Conclusion: We have successfully developed DArT markers for Sorghum bicolor and have demonstrated that DArT provides high-quality markers that can be used for diversity analyses and to construct medium-density genetic linkage maps.
The high number of DArT markers generated in a single assay not only provides a precise estimate of genetic relationships among genotypes, but also their even distribution over the genome offers real advantages for a range of molecular breeding and genomics applications.
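The reported average marker density follows from simple arithmetic on the map length and marker count; a quick check (variable names are mine, and a small rounding difference from the reported 2.39 cM is expected):

```python
map_length_cm = 1431.6   # total span of the integrated linkage map (cM)
total_markers = 596      # markers of all types placed on the map

# Average spacing: roughly one marker every 2.4 cM, in line with the
# reported density of about one marker per 2.39 cM.
avg_spacing_cm = map_length_cm / total_markers
```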
Abstract:
The majority of Australian weeds are exotic plant species that were intentionally introduced for a variety of horticultural and agricultural purposes. A border weed risk assessment system (WRA) was implemented in 1997 in order to reduce the high economic costs and massive environmental damage associated with introducing serious weeds. We review the behaviour of this system with regard to eight years of data collected from the assessment of species proposed for importation or held within genetic resource centres in Australia. From a taxonomic perspective, species from the Chenopodiaceae and Poaceae were most likely to be rejected, and those from the Arecaceae and Flacourtiaceae were most likely to be accepted. Dendrogram analysis and classification and regression tree (TREE) models were also used to analyse the data. The latter revealed that a small subset of the 35 variables assessed was highly associated with the outcome of the original assessment. The TREE model examining all of the data contained just five variables: unintentional human dispersal, congeneric weed, weed elsewhere, tolerates or benefits from mutilation, cultivation or fire, and reproduction by vegetative propagation. It gave the same outcome as the full WRA model for 71% of species. Weed elsewhere was not the first splitting variable in this model, indicating that the WRA has a capacity for capturing species that have no history of weediness. A reduced TREE model (in which human-mediated variables had been removed) contained four variables: broad climate suitability, reproduction in ≤1 year, self-fertilisation, and tolerates or benefits from mutilation, cultivation or fire. It yielded the same outcome as the full WRA model for 65% of species. Data inconsistencies and the relative importance of questions are discussed, with some recommendations made for improving the use of the system.
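A classification and regression tree reduces the 35-question assessment to a short cascade of binary splits. Purely as an illustration of that structure (the split order and outcomes below are invented for the sketch, not the fitted model from the paper):

```python
def tree_screen(traits):
    """Illustrative CART-style cascade over the five variables retained
    in the full-data TREE model. `traits` maps variable name -> bool.
    The ordering and outcomes are hypothetical."""
    if traits["unintentional_human_dispersal"]:
        return "reject"
    if traits["congeneric_weed"] and traits["weed_elsewhere"]:
        return "reject"
    if traits["tolerates_or_benefits_from_disturbance"]:
        return "further evaluate"
    if traits["vegetative_propagation"]:
        return "further evaluate"
    return "accept"
```

Each internal node tests a single variable; a real CART fit chooses the splits and their order to maximise agreement with the full WRA outcome (71% agreement for the five-variable model reported here).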
Abstract:
An experiment using herds of approximately 20 cows (farmlets) assessed the effects of high stocking rates on production and profitability of feeding systems based on dryland and irrigated perennial ryegrass-based pastures in a Mediterranean environment in South Australia over 4 years. A target level of milk production of 7000 L/cow.year was set, based on predicted intakes of 2.7 t DM/cow.year as concentrates, pasture intakes from 1.5 to 2.7 t/cow.year and purchased fodder. In years 1 and 2, up to 1.5 t DM/cow.year of purchased fodder was used and in years 3 and 4 the amounts were increased if necessary to enable levels of milk production per cow to be maintained at target levels. Cows in dryland farmlets calved in March to May inclusive and were stocked at 2.5, 2.9, 3.3, 3.6 and 4.1 cows/ha, while those in irrigated farmlets calved in August to October inclusive and were stocked at 4.1, 5.2, 6.3 and 7.4 cows/ha. In the first 2 years, when inputs of purchased fodder were limited, milk production per cow was reduced with higher stocking rates (P < 0.01), but in years 3 and 4 there were no differences. Mean production was 7149 kg/cow.year in years 1 and 2, and 8162 kg/cow.year in years 3 and 4. Production per hectare was very closely related to stocking rate in all years (P < 0.01), increasing from 18 to 34 t milk/ha.year for dryland farmlets (1300 to 2200 kg milk solids/ha) and from 30 to 60 t milk/ha.year for irrigated farmlets (2200 to 4100 kg milk solids/ha). Almost all of these increases were attributed to the increases in grain and purchased fodder inputs associated with the increases in stocking rate. Net pasture accumulation rates and pasture harvest were generally not altered with stocking rate, though as stocking rate increased there was a change to more of the pasture being grazed and less conserved in both dryland and irrigated farmlets. Total pasture harvest averaged approximately 8 and 14 t DM/ha.year for dryland and irrigated pastures, respectively.
An exception was at the highest stocking rate under irrigation, where pugging during winter was associated with a 14% reduction in annual pasture growth. There were several indications that these high stocking rates may not be sustainable without substantial changes in management practice. There were large and positive nutrient balances and associated increases in soil mineral content (P < 0.01), especially for phosphorus and nitrate nitrogen, with both stocking rate and succeeding years. Levels under irrigation were considerably higher (up to 90 and 240 mg/kg of soil for nitrate nitrogen and phosphorus, respectively) than under dryland pastures (60 and 140 mg/kg, respectively). Soil organic carbon levels did not change with stocking rate, indicating a high level of utilisation of forage grown. Weed ingress was also high (up to 22% DM) in all treatments and especially in heavily stocked irrigated pastures during winter. It was concluded that the higher stocking rates used exceeded those that are feasible for Mediterranean pastures in this environment, and upper levels of stocking are suggested to be 2.5 cows/ha for dryland pastures and 5.2 cows/ha for irrigated pastures. To sustain these suggested stocking rates will require further development of management practices to avoid large increases in soil minerals and weed invasion of pastures.
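The per-hectare production figures are consistent with stocking rate multiplied by per-cow yield; a quick check using the years 3 and 4 mean (both values are from the abstract, though pairing them this way is my assumption):

```python
milk_per_cow_kg = 8162   # mean production per cow, years 3 and 4 (kg/cow.year)
stocking_rate = 7.4      # highest irrigated stocking rate (cows/ha)

# 7.4 cows/ha * ~8.2 t/cow.year ~= 60 t milk/ha.year, matching the
# upper end of the reported irrigated range (30 to 60 t milk/ha.year).
milk_per_ha_t = stocking_rate * milk_per_cow_kg / 1000
```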
Abstract:
Sporobolus pyramidalis, S. africanus, S. natalensis, S. fertilis and S. jacquemontii, known collectively as the weedy Sporobolus grasses, are exotic weeds causing serious economic losses in grazing areas along Australia's entire eastern coast. In one of the first attempts to provide biological control for a grass, the potential of a smut, Ustilago sporoboli-indici, as a biological control agent for all five weedy Sporobolus spp. found in Australia was evaluated in glasshouse studies. Application of basidiospores to 21-day-old Sporobolus seedlings and subsequent incubation in a moist chamber (26 °C, 90% RH, 48 h) resulted in infection of S. pyramidalis, S. africanus, S. natalensis and S. fertilis but not S. jacquemontii. Host-range trials with 13 native Australian Sporobolus spp. resulted in infection of four native species. Evaluation of damage caused by the smut on two Australian native and two weedy Sporobolus spp. showed that the total numbers of flowers infected for the four grasses were in the following order: S. creber > S. fertilis > S. elongatus > S. natalensis with percentage flower infections of 21%, 14%, 12% and 3%, respectively. Significant differences (P = 0.001) were found when the numbers of infected flowers caused by each treatment were compared. The infection of the four native Sporobolus spp. by the smut indicated that it was not sufficiently host specific for release in Australia and the organism was rejected as a potential biological control agent. The implications of these results are discussed.
Abstract:
Zeaxanthin and its isomer lutein are the major carotenoids contributing to the characteristic colour of yellow sweet-corn. From a human health perspective, these two carotenoids are also specifically accumulated in the human macula, and are thought to protect the photoreceptor cells of the eye from blue light oxidative damage and to improve visual acuity. As humans cannot synthesise these compounds, they must be accumulated from dietary components containing zeaxanthin and lutein. In comparison to most dietary sources, yellow sweet-corn (Zea mays var. rugosa) is a particularly good source of zeaxanthin, although the concentration of zeaxanthin is still fairly low in comparison to what is considered a supplementary dose to improve macular pigment concentration (2 mg/person/day). In our present project, we have increased zeaxanthin concentration in sweet-corn kernels from 0.2 to 0.3 mg/100 g FW to greater than 2.0 mg/100 g FW at sweet-corn eating-stage, substantially reducing the amount of corn required to provide the same dosage of zeaxanthin. This was achieved by altering the carotenoid synthesis pathway to more than double total carotenoid synthesis and to redirect carotenoid synthesis towards the beta-arm of the pathway where zeaxanthin is synthesised. This resulted in a proportional increase of zeaxanthin from 22% to 70% of the total carotenoid present. As kernels increase in physiological maturity, carotenoid concentration also significantly increases, mainly due to increased synthesis but also due to a decline in moisture content of the kernels. When fully mature, dried kernels can reach zeaxanthin and carotene concentrations of 8.7 mg/100 g and 2.6 mg/100 g, respectively. Although kernels continue to increase in zeaxanthin when harvested past their normal harvest maturity stage, the texture of these 'over-mature' kernels is tough, making them less appealing for fresh consumption.
An increase in zeaxanthin concentration and other orange carotenoids such as β-carotene also results in a decline in kernel hue angle of fresh sweet-corn from approximately 90 (yellow) to as low as 75 (orange-yellow). This enables high-zeaxanthin sweet-corn to be visually distinguishable from standard yellow sweet-corn, which is predominantly pigmented by lutein.
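The practical effect of the concentration increase is easy to quantify against the 2 mg/day supplementary dose cited above; the serving calculation below is mine, using the abstract's figures:

```python
target_dose_mg = 2.0   # supplementary zeaxanthin dose (mg/person/day)

# Grams of fresh kernels needed to supply the daily dose:
grams_before = 100 * target_dose_mg / 0.25  # at 0.2-0.3 mg/100 g FW (midpoint)
grams_after = 100 * target_dose_mg / 2.0    # at the improved >2.0 mg/100 g FW

# ~800 g of standard sweet-corn vs ~100 g of the high-zeaxanthin lines
```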
Abstract:
Castration of cattle using rubber rings is becoming increasingly popular due to the perceived ease of the procedure and greater operator safety compared with surgical castration. Few comparative studies have investigated the effects of different castration methods and calf age on welfare outcomes, particularly in a tropical environment. Thirty 3-month-old (liveweight 71–119 kg) and thirty 6-month-old (liveweight 141–189 kg) Belmont Red (a tropically adapted breed) calves were assigned to a two age × three castration method (surgical, ring and sham) factorial study (Surg3, Surg6, Ring3, Ring6, Sham3 and Sham6; n = 10 for each treatment group). Welfare outcomes were assessed post-castration using: behaviour for 2 weeks; blood parameters (cortisol and haptoglobin concentrations) to 4 weeks; wound healing to 5 weeks; and liveweights to 6 weeks. More Surg calves struggled during castration compared with Sham and Ring (P < 0.05; 90 ± 7% vs. 20 ± 9% and 24 ± 10%) and performed more struggles (1.9 ± 0.2, 1.1 ± 0.3 and 1.1 ± 0.3 for Surg, Sham and Ring, respectively), suggesting that surgical castration caused the most pain during performance of the procedure. A significant (P < 0.05) time × castration method × age interaction for plasma cortisol revealed that concentrations decreased most rapidly in Sham; the Ring6 calves failed to show reduced cortisol concentrations at 2 h post-castration, unlike the other treatment groups. By 7 h post-castration, all treatment groups had similar concentrations. A significant (P < 0.01) interaction between time and castration method showed that haptoglobin concentrations increased slightly to 0.89 and 0.84 mg/mL for Surg and Ring, respectively, over the first 3 days post-castration.
Concentrations for Surg then decreased to levels similar to Sham by day 21 and, although concentrations for Ring decreased on day 7 to 0.76 mg/mL, they increased significantly on day 14 to 0.97 mg/mL before reducing to concentrations similar to the other groups (0.66 mg/mL) by day 21. Significantly (P < 0.05) more of the wounds of the 3-month compared with the 6-month calves scored as ‘healed’ at day 7 (74% vs. 39%), while more (P = 0.062) of the Surg than Ring scored as ‘healed’ at day 21 (60% vs. 29%). At day 14 there were significantly (P < 0.05) fewer healed wounds in Ring6 compared with other treatment groups (13% vs. 40–60%). Liveweight gain was significantly (P < 0.05) greater in 3-month (0.53 kg/day) than in 6-month calves (0.44 kg/day) and in Sham calves (P < 0.001, 0.54 kg/day), than in Ring (0.44 kg/day) and Surg (0.48 kg/day) calves. Overall, welfare outcomes were slightly better for Surg than Ring calves due to reduced inflammation and faster wound healing, with little difference between age groups.
Abstract:
Ammonia volatilised and re-deposited to the landscape is an indirect source of N2O emissions. This study established a relationship between N2O emissions, low-magnitude NH4+ deposition (0–30 kg N ha⁻¹), and soil moisture content in two soils using in-vessel incubations. Emissions from the clay soil peaked (<0.002 μg N (g soil)⁻¹ min⁻¹) from 85 to 93% WFPS (water-filled pore space), increasing to a plateau as remaining mineral N increased. Peak N2O emissions for the sandy soil were much lower (<5 × 10⁻⁵ μg N (g soil)⁻¹ min⁻¹) and occurred at about 60% WFPS, with an indistinct relationship with increasing resident mineral N due to the low rate of nitrification in that soil. Microbial community and respiration data indicated that the clay soil was dominated by denitrifiers and was more biologically active than the sandy soil. However, the clay soil also had substantial nitrifier communities even under peak emission conditions. A process-based mathematical denitrification model was well suited to the clay soil data when all mineral N was assumed to be nitrified (R² = 90%), providing a substrate for denitrification. This function was not well suited to the sandy soil, where nitrification was much less complete. A prototype relationship representing mineral-N pool conversions (NO3⁻ and NH4⁺) was proposed based on time, pool concentrations, moisture relationships, and soil rate constants (preliminary testing only). A threshold for mineral N was observed: emission of N2O did not occur from the clay soil below 70 mg mineral N (kg soil)⁻¹, suggesting that soil N availability controls indirect N2O emissions. This laboratory process investigation challenges the IPCC approach, which predicts indirect emissions from atmospheric N deposition alone.
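WFPS relates the water actually held in a soil to its total pore space; a standard calculation follows (the example bulk density and water content are illustrative, not values from the study; a particle density of 2.65 g/cm³ is the usual assumption for mineral soils):

```python
def wfps(theta_g, bulk_density, particle_density=2.65):
    """Water-filled pore space (fraction) from gravimetric water content
    theta_g (g water per g dry soil) and bulk density (g/cm^3)."""
    porosity = 1 - bulk_density / particle_density
    volumetric_water = theta_g * bulk_density  # cm^3 water per cm^3 soil
    return volumetric_water / porosity

# e.g. a soil at bulk density 1.2 g/cm^3 holding 0.40 g water/g soil
# sits near the 85-93% WFPS band where the clay-soil emissions peaked.
```

High WFPS restricts oxygen diffusion, which is why the denitrifier-dominated clay soil peaked near saturation while the sandy soil, where nitrification dominated, peaked at a much lower WFPS.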