125 results for 670503 Treatments (e.g. chemicals, antibiotics)
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments, and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two differing environmental regimes (irrigated and natural rainfall) and sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best-fit regression analysis of the decline in seed survival over time yielded a five-parameter exponential decay model with a lower asymptote approaching −0.38 (% seed survival = [(55 − (−0.38)) · e^(k·t)] + (−0.38); R² = 88.5%; 9 df). Environmental conditions and burial significantly affected the slope parameter, or k value (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank, and this should be considered in management programs, particularly those aimed at eradication.
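To make the quoted model concrete, here is a minimal Python sketch of the decay curve. The initial value (55) and lower asymptote (−0.38) are the fitted values given in the abstract; the slope parameter k is not reported (it varied with environment and burial), so the value used below is purely a placeholder.

```python
import math

def seed_survival(t_months, k, s0=55.0, asymptote=-0.38):
    """Exponential decay model quoted in the abstract:
    % survival = (s0 - asymptote) * exp(k * t) + asymptote, with k < 0.
    s0 and asymptote are the abstract's fitted values; k is not reported,
    so any k passed in here is illustrative only."""
    return (s0 - asymptote) * math.exp(k * t_months) + asymptote

k_example = -0.05  # per month; placeholder, not a fitted value
for t in (0, 6, 12, 24, 36):
    print(f"t = {t:2d} mo: {seed_survival(t, k_example):5.1f}% survival")
```

With this form, survival starts at 55% at t = 0 and approaches the −0.38 asymptote as t grows; a more negative k (as under irrigation) exhausts the seed bank sooner, consistent with the 3-yr versus 11-yr projections quoted above.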
Abstract:
There are many reports of efficient embryo germination, and the method has been optimized to suit subtropical low-chill genotypes. However, the subsequent growth, vigor, and ability of germinated embryos to develop and survive acclimatization are rarely reported. Many germinated embryos do not survive acclimatization, develop slowly, or fail to develop normally. Methods to improve plant development from in vitro embryo cultures are needed to increase the number of plants that survive to be useful in breeding programs. This paper describes an improved method of embryo rescue that significantly increases embryo shoot and root development and thereby plant survival. Four treatments were evaluated for embryo growth and subsequent plantlet development and survival in two low-chill peach cultivars and one low-chill nectarine cultivar: Woody Plant Media (WPM) solidified with agar, vermiculite with liquid WPM, vermiculite with WPM plus agar, and conventional stratification. Highly significant improvements were found in shoot and root development of seedlings germinated in vermiculite-based media compared with embryos germinated in conventional agar-based media. Vermiculite with WPM and agar improved plantlet growth subsequent to in vitro culture and significantly increased survival of germinated embryos, resulting in more plants reaching the field.
Abstract:
Grain feeding low-bodyweight, cast-for-age (CFA) sheep from pastoral areas of eastern Australia at the end of the growing season can enable critical carcass weight grades to be achieved and thus yield better economic returns. The aim of this work was to compare growth and carcass characteristics of CFA Merino ewes consuming either simple diets based on whole sorghum grain or commercial feed pellets. The experiment also compared various sources of additional nitrogen (N) for inclusion in sorghum diets and evaluated several introductory regimes. Seventeen ewes were killed initially to provide baseline carcass data and the remaining 301 ewes were gradually introduced to the concentrate diets over 14 days before being fed concentrates and wheaten hay ad libitum for 33 or 68 days. Concentrate treatments were: (i) commercial feed pellets; (ii) sorghum mix (SM; whole sorghum grain, limestone, salt and molasses) + urea and ammonium sulfate (SMU); (iii) SMU + whole cottonseed at 286 g/kg of concentrate dry matter (DM); (iv) SM + cottonseed meal at 139 g/kg of concentrate DM; (v) SMU + virginiamycin (20 mg/kg of concentrate) for the first 21 days of feeding; and (vi) whole cottonseed gradually replaced by SMU over the first 14 days of feeding. The target carcass weight of 18 kg was achieved after only 33 days on feed for the pellets and the SM + cottonseed meal diet. All other whole-grain sorghum diets required between 33 and 68 days on feed to achieve the target carcass weight. Concentrates based on whole sorghum grain generally produced significantly (P < 0.05) lower carcass weights and fat scores than pellets, and this may have been linked to the significantly (P < 0.05) higher faecal starch concentrations of ewes consuming sorghum-based diets (270 v. 72 g/kg DM on day 51 of feeding for sorghum-based diets and pellets, respectively). The source of N in whole-grain sorghum rations and the special introductory regimes had no significant (P > 0.05) effects on carcass weight or fat score, with the exception that carcass weight for SMU + whole cottonseed was significantly lower than for SM + cottonseed meal at day 33. Ewes finished on all diets produced acceptable carcasses, although muscle pH was high in all carcasses (averaging 5.8 and 5.7 at 33 and 68 days, respectively). There were no significant (P > 0.05) differences between diets in concentrate DM intake, rumen fluid pH, meat colour score, fat colour score, eye muscle area, meat pH or meat temperature.
Abstract:
In parts of Australia, sorghum grain is a cheaper alternative to other cereal grains, but its use and nutritive value in sheep feeding systems are not well understood. The aim of this work was to compare growth and carcass characteristics of crossbred lambs consuming several simple, sorghum-based diets. The treatments were: (1) whole sorghum grain; (2) whole sorghum grain + urea and ammonium sulfate; (3) cracked sorghum grain + urea and ammonium sulfate; (4) expanded sorghum grain + urea and ammonium sulfate; (5) whole sorghum grain + cottonseed meal; and (6) whole sorghum grain + whole cottonseed. Nine lambs were slaughtered initially to provide baseline carcass data and the remaining 339 lambs were gradually introduced to the concentrate diets over 14 days before being fed concentrates and wheaten hay ad libitum for 41, 56 or 76 days. Neither cracking nor expanding whole sorghum grain with added non-protein nitrogen (N) significantly (P > 0.05) increased final liveweight, growth rates or carcass weights of lambs, or decreased days on feed to reach 18-kg carcass weight, although carcass fat depth was significantly (P < 0.05) increased compared with the whole sorghum plus non-protein N diet. However, expanding sorghum grain significantly (P < 0.05) reduced faecal starch concentrations compared with whole or cracked sorghum diets with added non-protein N (79 v. 189 g/kg DM after 59 days on feed). Lambs fed whole sorghum grain without an additional N source had significantly (P < 0.05) lower concentrate intake and required significantly (P < 0.05) more days on feed to reach a carcass weight of 18 kg than lambs on all diets containing added N. These lambs also had significantly (P < 0.05) lower carcass weight and fat depth than lambs consuming whole sorghum plus true protein diets. Substituting sources of true protein (cottonseed meal and whole cottonseed) for non-protein N (urea and ammonium sulfate) did not significantly (P > 0.05) affect concentrate intakes or carcass weights of lambs, although carcass fat depth was significantly (P < 0.05) increased and days to reach 18-kg carcass weight significantly (P < 0.05) decreased for the whole sorghum plus cottonseed meal diet. In conclusion, processing sorghum grain by cracking or expanding did not significantly improve lamb performance. While providing an additional N source with sorghum grain significantly increased lamb performance, there was no benefit in final carcass weight from substituting sources of true protein for non-protein N.
Abstract:
For essential elements such as copper (Cu) and zinc (Zn), bioavailability in biosolids is important from both a nutrient-release and a potential-contamination perspective. Most ecotoxicity studies are done using metal salts, and it has been argued that the bioavailability of metals in biosolids can differ from that of metal salts. We compared the bioavailability of Cu and Zn in biosolids with that of metal salts in the same soils across twelve Australian field trials. Three measures of bioavailability were assessed: soil solution extraction, CaCl2-extractable fractions, and plant uptake. The results showed that bioavailability of Zn was similar in biosolid and salt treatments. For Cu, the results were inconclusive due to strong Cu homeostasis in plants and interference from dissolved organic matter in the extractable measures. We therefore recommend using isotope dilution methods to assess differences in Cu availability between biosolid and salt treatments.
Abstract:
Faecal egg count reduction tests (FECRTs) for macrocyclic lactone (ML) and levamisole (LEV) drenches were conducted on two dairy farms in the subtropical, summer-rainfall region of eastern Australia to determine whether anthelmintic failure contributed to severe gastrointestinal nematode infections observed in weaner calves. Subtropical Cooperia spp. were the dominant nematodes on both farms, although significant numbers of Haemonchus placei were also present on Farm 2. On Farm 1, moxidectin pour-on (MXD) at 0.5 mg/kg liveweight (LW) reduced the overall Cooperia burden by 82% (95% confidence limits, 37-95%) at day 7 post-drench. As worm burdens increased rapidly in younger animals in the control group (n = 4), levamisole was used as a salvage drench and these calves were withdrawn from the trial on animal welfare grounds after sample collection at day 7. Levamisole dosed at 6.8 mg/kg LW reduced the worm burden in these calves by 100%, 7 days after drenching. On Farm 2, MXD given at 0.5 mg/kg LW reduced the faecal worm egg count of cooperioids at day 8 by 96% (71-99%), oral ivermectin (IVM) at 0.2 mg/kg LW by 1.6% (-224 to 70%), and oral LEV at 7.1 mg/kg LW by 100%. For H. placei the reductions were 98% (85-99.7%) for MXD, 0.7% (-226 to 70%) for IVM and 100% for LEV. This is the first report in Australia of the failure of macrocyclic lactone treatments to control subtropical Cooperia spp., and of suspected failure to control H. placei, in cattle.
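For readers unfamiliar with how reduction percentages of this kind are derived, a minimal sketch follows. The abstract does not state the exact estimator or confidence-limit method used, so the classical arithmetic-mean formulation (reduction = 100 × (1 − treated mean / control mean)) is shown purely as an illustration, with invented egg counts.

```python
def fecrt_percent_reduction(treated_epg, control_epg):
    """Percent reduction in mean faecal egg count of a treated group
    relative to untreated controls (classical arithmetic-mean FECRT).
    Inputs are per-animal egg counts in eggs per gram (epg)."""
    mean_treated = sum(treated_epg) / len(treated_epg)
    mean_control = sum(control_epg) / len(control_epg)
    return 100.0 * (1.0 - mean_treated / mean_control)

# Hypothetical post-drench counts; a treated mean exceeding the control
# mean yields a negative "reduction", as reported for ivermectin above.
print(fecrt_percent_reduction([40, 10, 25, 5], [220, 180, 260, 240]))  # ~91.1
```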
Abstract:
The genus Asparagus includes at least six invasive species in Australia. Asparagus aethiopicus and A. africanus are invasive in subtropical Australia, and a third species, A. virgatus, is naturalized and shows localized spread in south-east Queensland. To better understand how the attributes of these species contribute to their invasiveness, we compared fruit and seed traits, germination, seedling emergence, seed survival, and time-to-maturity. We further investigated the dispersal ecology of A. africanus, examining the diet of a local frugivore, the figbird (Sphecotheres viridis), and the effect of gut passage on seedling emergence. Overall, A. aethiopicus was superior in germination and emergence, with the highest mean germination (98.8%) and emergence (94.5%) under optimal conditions and the highest mean emergence (73.3%) across all treatments. In contrast, A. africanus had the lowest germination under optimal conditions (71.7%) and low mean seedling emergence (49.5%), but had fruits with the highest relative yield (ratio of dry pulp to fresh fruit weight), which were favored by a local frugivore. Figbirds consumed large numbers of A. africanus fruits (~30% of all non-Ficus fruits), and seedling emergence was not significantly affected by gut passage compared with unprocessed fruits. Asparagus virgatus germinated poorly under cool, light conditions (1.4%) despite a high optimal mean (95.0%), and had low mean performance across emergence treatments (36.3%). The species also had fruits with a low pulp return for frugivores. For all species, seed survival declined rapidly in the first 12 mo and fell to <3.2% viability at 36 mo. On the basis of the traits considered, A. virgatus is unlikely to have the invasive potential of its congeners. Uniformly short seed survival times suggest that weed managers do not have to contend with a substantial persistent soil-stored seed bank, but frugivore-mediated dispersal beyond existing infestations will present a considerable management challenge.
Abstract:
Foraging by feral pigs can strongly affect wetland vegetation assemblages and, in turn, wider ecological processes, although their effects on freshwater ecosystems have seldom been studied. We assessed the ecological effects of pig foraging in replicate fenced and unfenced ephemeral floodplain lagoons in tropical north-eastern Australia. Pig foraging in unfenced lagoons caused major changes to aquatic macrophyte communities and, as a consequence, to the proportional amounts of open water and bare ground. The destruction of macrophyte communities and upheaval of wetland sediments significantly affected wetland turbidity and caused prolonged anoxia and pH imbalances in the unfenced treatments. Whilst fencing of floodplain lagoons will protect against feral pig foraging, our repeated measures of many biological, physical and chemical parameters indicated that natural seasonal (i.e. temporal) effects had a greater influence on these variables than did pigs. Validating this observation will require measuring how these effects are influenced by the seemingly greater annual disturbance regime of variable flooding and drying in this tropical climate.
Abstract:
Interest in cashew production in Australia has been stimulated by domestic and export market opportunities and the suitability of large areas of tropical Australia. Economic models indicate that cashew production is profitable at 2.8 t/ha nut-in-shell (NIS). Balanced plant nutrition is essential to achieve economic yields in Australia, with nitrogen (N) of particular importance because of its capacity to modify growth, affect nut yield, and cause environmental degradation through soil acidification and off-site contamination. This study, on a commercial cashew plantation at Dimbulah, Australia, investigated the effect of N rate and timing on cashew growth, nut production, N leaching and soil chemical properties over five growth cycles (1995-1999). Nitrogen was applied during the main periods of vegetative (December-April) and reproductive (June-October) growth. Commercial NIS yields (up to 4.4 t/ha from individual trees) that exceeded the economic threshold of 2.8 t/ha were achieved. The yield response was mainly determined by canopy size, as mean nut weight, panicle density and nuts per panicle were largely unaffected by N treatments. Nitrogen application confined to the main period of vegetative growth (December-April) produced a seasonal growth pattern that corresponded most consistently with the highest NIS yield. This N timing also reduced late-season flowering and undesirable post-November nut drop. Higher yields were not produced at N rates greater than 17 g/m² of canopy surface area (equating to 210 kg N/ha for mature-size trees). High yields were attained when N concentrations in Mveg leaves in May-June were about 2%, but this assessment occurs at a time when it is not feasible to correct N deficiency. The Mflor leaf of the preceding November, used in conjunction with the Mveg leaf, was proposed as a diagnostic tool to guide N rate decisions. Leaching of nitrate-N and acidification of the soil profile were recorded to 0.9 m. This is an environmental and sustainability hazard, and demonstrates that improved methods of N management are required.
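As a quick consistency check (not stated in the abstract), the two quoted N rates together imply a canopy surface area per hectare for mature trees, recovered below.

```python
# Back-calculate the canopy surface area implied by equating the two N
# rates quoted in the abstract: 17 g N per m^2 of canopy surface area
# versus 210 kg N/ha for mature-size trees. The result is an inference,
# not a figure reported in the abstract.
n_rate_per_canopy_m2 = 17.0       # g N per m^2 of canopy surface area
n_rate_per_ha = 210.0 * 1000.0    # 210 kg N/ha expressed in g N/ha
implied_canopy_m2_per_ha = n_rate_per_ha / n_rate_per_canopy_m2
print(f"Implied canopy surface: ~{implied_canopy_m2_per_ha:.0f} m^2/ha")
# ~12353 m^2/ha, i.e. canopy surface area exceeding the ground area,
# which is plausible for a three-dimensional mature canopy.
```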
Abstract:
Root-lesion nematode (Pratylenchus thornei) significantly reduces wheat yields in the northern Australian grain region. Canola is thought to have 'biofumigation' potential to control nematodes; therefore, a field experiment was designed to compare canola with other winter crops or clean fallow for reducing P. thornei population densities and improving growth of P. thornei-intolerant wheat (cv. Batavia) in the following year. Immediately after harvest of the first-year crops, populations of P. thornei were lowest following the various canola cultivars or clean fallow (1,957-5,200 P. thornei/kg dry soil) and highest following susceptible wheat cultivars (31,033-41,294/kg dry soil). Unexpectedly, at planting of the second-year wheat crop, nematode populations were at more uniformly low levels (<5,000/kg dry soil), irrespective of the previous season's treatment, and remained so during the growing season, which was quite dry. Growth and grain yield of the second-year wheat crop were poorest on plots previously planted with canola or left fallow, due to poor colonisation by arbuscular mycorrhizal (AM) fungi; the exception was canola cv. Karoo, after which AM fungal colonisation was high but wheat yields were still low. There were significant regressions between growth and yield parameters of the second-year wheat and levels of AM fungi following the pre-crop treatments. Thus, canola appears to be a good crop for reducing P. thornei populations, but the AM fungal dependence of subsequent crops should be considered, particularly in the northern Australian grain region.
Abstract:
In this study, nasal swabs taken from multiparous sows at weaning or from sick pigs displaying symptoms of Glasser's disease on farms in Australia [date not given] were cultured and analysed by polymerase chain reaction (PCR). Within each genotype detected on a farm, representative isolates were serotyped by gel diffusion (GD) or indirect haemagglutination (IHA) testing. Isolates that did not react in any of the tests were regarded as non-typable and termed serovar NT. Serovars 1, 5, 12, 13 and 14 were classified as highly pathogenic; serovars 2, 4 and 15 as moderately pathogenic; serovar 8 as slightly pathogenic; and serovars 3, 6, 7, 9 and 11 as non-pathogenic. Sows were inoculated 3 and 5 weeks before farrowing with the strains of Haemophilus parasuis used for controlled challenge (serovars 4, 6 and 9 from Farms 1, 2 and 4, respectively). Before farrowing, the sows were divided into control and treatment groups. Five to seven days after birth, the piglets of the treatment group were challenged with the strain from their farm that had been used to inoculate the sows. The effectiveness of the controlled exposure was evaluated by the number of piglets displaying clinical signs possibly related to infection, the number of antibiotic treatments, and pig mortality. Nasal swabs of sick pigs were taken twice a week to relate clinical signs to infection status. A subsample of pigs was weighed after leaving the weaning sheds. The specificity of a real-time PCR amplifying the infB gene was evaluated with 68 H. parasuis isolates and 36 strains of closely related species. A total of 239 DNA samples from tissues and fluids of 16 experimentally challenged animals were also tested with the real-time PCR, and the results compared with culture and a conventional PCR. The farm experiments showed that none of the controlled-challenge pigs displayed signs of illness due to Glasser's disease, although the treatment groups required more antibiotics than the controls. A total of 556 H. parasuis isolates were genotyped and 150 isolates were serotyped. H. parasuis was detected on 19 of 20 farms, including 2 farms with an extensive history of freedom from Glasser's disease. Isolates belonging to serovars regarded as potentially pathogenic were obtained from healthy pigs at weaning on 8 of the 10 farms with a history of Glasser's disease outbreaks. Sampling 213 sick pigs yielded 115 isolates, 99 of which belonged to serovars that were either potentially pathogenic or of unknown pathogenicity; only 16 isolates from these sick pigs were of a serovar known to be non-pathogenic. Healthy pigs also carried H. parasuis, even on farms free of Glasser's disease. The real-time PCR gave positive results for all 68 H. parasuis isolates and negative results for all 36 non-target bacteria. When used on the clinical material from the experimental infections, the real-time PCR produced significantly more positive results than the conventional PCR (165 compared with 86).
Abstract:
Soft-leaf buffalo grass is increasing in popularity as an amenity turfgrass in Australia. This project was instigated to assess its adaptation and to establish management guidelines for its use across Australia's vast array of growing environments. There is an extensive selection of soft-leaf buffalo grass cultivars throughout Australia and, with the country's climates ranging from temperate in the south to tropical in the north, not all cultivars will be adapted to all regions. The project evaluated 19 buffalo grass cultivars along with other warm-season grasses, including green couch, kikuyu and sweet smother grass. The soft-leaf buffalo grasses were evaluated for growth and adaptation in a number of regions throughout Australia, including Western Australia, Victoria, the ACT, NSW and Queensland. The growth habit of the individual cultivars was examined along with their shade tolerance, water use, herbicide tolerance, wear resistance, response to nitrogen applications and growth potential in highly alkaline soils. The growth habit of the cultivars currently commercially available in Australia differs considerably, from the more robust types that spread quicker and are thicker in appearance (Sir Walter, Kings Pride, Ned Kelly and Jabiru) to the dwarf types that are shorter and thinner in appearance (AusTine and AusDwarf). The soft-leaf buffalo grass types tested do not differ in water use from old-style common buffalo grass; thus, soft-leaf buffalo grasses, like other warm-season turfgrass species, are efficient in water use. These grasses also recover after periods of low water availability, and individual cultivar differences were not discernible. In high-pH (alkaline) soils, some elements essential for plant growth (e.g. iron and manganese) may be deficient, causing turfgrass to appear pale green and visually unacceptable. When 14 soft-leaf buffalo grass genotypes were grown on a highly alkaline soil (pH 7.5-7.9), cultivars differed in leaf iron, but not leaf manganese, concentrations. Nitrogen is critical to the production of quality turf, and the methods for applying this essential element can be manipulated to minimise maintenance inputs (mowing) during the peak growing period (summer). By applying the greatest proportion of the turf's total nitrogen requirement in early spring, peak summer growth can be reduced, with a corresponding reduction in mowing requirements. Soft-leaf buffalo grass cultivars are more shade and wear tolerant than other warm-season turfgrasses used by homeowners, although there are differences between individual cultivars. The majority of types currently available would be classified as having moderate shade tolerance, and they wear reasonably well with good recovery rates. The impact of wear in a shaded environment was not tested; this needs investigation, as it is a typical growing environment for many homeowners. The use of herbicides is required to maintain quality soft-leaf buffalo grass turf, and the development of softer herbicides for other turfgrasses has seen an increase in their popularity. The buffalo grass cultivars currently available showed varying levels of susceptibility to the chemicals tested; the majority of the cultivars evaluated demonstrated low levels of phytotoxicity to the herbicides chlorsulfuron (Glean) and fluroxypyr (Starane and Comet).
In general, soft-leaf buffalo grasses are varied in their makeup and have demonstrated varying levels of tolerance, susceptibility and adaptation to the conditions under which they are grown. Consequently, there is a need to choose the cultivar most suited to the environment in which it is expected to perform and the management style to which it will be exposed. Future work is required to assess how the structure of the different cultivars affects their wear tolerance, shade tolerance, water use and herbicide tolerance. The development of a growth model may provide the solution.
Abstract:
The impacts of cropping history (sugarcane, maize and soybean), tillage practice (conventional tillage and direct drill) and fertiliser N on the plant and first-ratoon (1R) crops of sugarcane were examined in field trials at Bundaberg and Ingham. Average yields at Ingham (Q200) and Bundaberg (Q151) were quite similar in both the plant crop (83 t/ha and 80 t/ha, respectively) and the 1R crop (89 t/ha v. 94 t/ha), with only minor treatment effects on CCS at each site. Cane yield responses to tillage, break history and N fertiliser varied significantly between sites. There was a 27% yield increase in the plant crop from the soybean fallow at Ingham, with soybeans producing a yield advantage over continuous cane, but there were no clear break effects at Bundaberg, possibly due to a complex of pathogenic nematodes that responded differently to soybean and maize breaks. There was no carryover benefit of the soybean break into the 1R crop at Ingham, while at Bundaberg the maize break produced a 15% yield advantage over soybeans and continuous cane. The Ingham site recorded positive responses to N fertiliser addition in both the plant (20% yield increase) and 1R (34% yield increase) crops, but there was negligible carryover benefit from plant-crop N in the 1R crop, and no evidence of a reduced N response after a soybean rotation. By contrast, the Bundaberg site showed no N response in any history in the plant crop, and only a small (5%) yield increase with N applied in the 1R crop; again, there was no evidence of a reduced N response in the 1R crop after a soybean fallow. There were no significant effects of tillage on cane yields at either site, although there were some minor interactions between tillage, breaks and N management in the 1R crop at both sites. Crop N contents at Bundaberg were more than 3 times those recorded at Ingham in both the plant and 1R crops, with N concentrations in millable stalk at Ingham suggesting N deficiencies in all treatments. Negligible additional N was recovered in crop biomass from N fertiliser application or soybean residues at the Ingham site. Additional N was recovered in crop biomass in response to N fertiliser and soybean breaks at Bundaberg, but the effects were small and fertiliser use efficiencies poor. Loss pathways could not be quantified, but denitrification or losses in runoff were the likely causes at Ingham, while leaching predominated at Bundaberg. The results highlight the complexity involved in developing sustainable farming systems for contrasting soil types and climatic conditions. A better understanding of key sugarcane pathogens and their host ranges, as well as improved capacity to predict in-crop N mineralisation, will be key to future improvements in sugarcane farming systems.
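The note on poor fertiliser use efficiency refers to the fraction of applied N recovered in the crop. The abstract does not give the underlying numbers, so the standard apparent-recovery (difference-method) calculation is sketched below with invented values.

```python
def apparent_n_recovery(uptake_fertilised, uptake_unfertilised, n_applied):
    """Apparent fertiliser N recovery by the difference method: extra crop
    N uptake attributable to fertiliser, as a fraction of the N applied.
    All values in kg N/ha; the inputs used below are invented."""
    return (uptake_fertilised - uptake_unfertilised) / n_applied

print(f"{apparent_n_recovery(95.0, 80.0, 150.0):.0%}")  # 10%, i.e. "poor"
```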
Abstract:
Navua sedge, a member of the Cyperaceae, is an aggressive weed of pastures in Fiji, Sri Lanka, the Malay Peninsula, Vanuatu, Samoa, the Solomon Islands and Tahiti, and is now a weed of pastures and roadsides in north Queensland, Australia. Primarily restricted to areas with annual rainfall exceeding 2,500 mm, Navua sedge is capable of forming dense stands that smother many tropical pasture species. Seventeen herbicides were field tested at three sites in north Queensland, with glyphosate, halosulfuron, hexazinone, imazapic, imazapyr and MSMA proving the most effective for Navua sedge control. Environmental problems such as persistence in soil, lack of selectivity and off-site movement may occur with some herbicides at rates predicted to give 90% control (LC90). In a seasonality trial, halosulfuron (97.5 g ai/ha) gave better Navua sedge control when sprayed from March to September (84%) than at other times of year (50%). In a frequency trial, sequential glyphosate applications (2,160 g ae/ha) every two months were more effective for continued Navua sedge control (67%) than a single application (36%), though loss of ground cover would occur. In a management trial, single applications of glyphosate (2,160 to 3,570 g ae/ha) using a rope wick, ground foliar spraying or a rotary rope wick gave 59 to 73% control, while other treatments (rotary hoeing, 3%; slashing, -13%; crushing, -30%) were less effective. In a second management trial, four monthly rotary wick applications were much more effective (98%) than four monthly crushing applications (42%). An effective management plan must include regular herbicide treatments to prevent Navua sedge seed from being added to the soil seed bank. Treatments that result in seed burial, such as discing, are likely to prolong seed persistence and should be avoided. The sprouting activity of vegetative propagules and root fragmentation also need to be considered when selecting control options.
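The abstract refers to rates at the predicted LC90 (90% control) level without describing the underlying model. As a generic illustration only, here is how an LC90 might be predicted from a fitted two-parameter log-logistic dose-response curve; the dose-response data below are invented, not from the trial.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, lc50, slope):
    """Two-parameter log-logistic dose-response bounded at 0-100% control."""
    return 100.0 / (1.0 + (dose / lc50) ** (-slope))

# Hypothetical dose-response data (g ai/ha vs % control); illustrative only.
doses = np.array([12.5, 25, 50, 100, 200, 400])
control = np.array([8, 22, 47, 71, 88, 96])

(lc50, slope), _ = curve_fit(log_logistic, doses, control, p0=[50, 1.5])

# Solving 90 = 100 / (1 + (x / lc50)**-slope) gives x = lc50 * 9**(1/slope).
lc90 = lc50 * 9.0 ** (1.0 / slope)
print(f"LC50 ~ {lc50:.0f} g ai/ha, LC90 ~ {lc90:.0f} g ai/ha")
```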
Abstract:
The productivity of containerized and bare-rooted strawberry (Fragaria × ananassa) plants was investigated over 4 years in southeastern Queensland, Australia. In the first experiment, plants in small, 75-cm³ cells were compared with bare-rooted plants of 'Festival' and 'Sugarbaby'. A similar experiment was conducted in year 2 with these two cultivars, plus 'Rubygem'. In year 3, plants in large, 125-cm³ cells were compared with small and large bare-rooted plants of 'Festival' and 'Rubygem'. Treatments in each of these experiments were planted on the same date. In the final experiment, plants in large cells and bare-rooted plants of 'Festival' were planted in late March, early April, mid-April, or early May. Plants grown in small cells produced 60% to 85% of the yields of the bare-rooted plants, whereas the yield of plants in large cells was equal to that of the bare-rooted plants. Containerized plants are twice as expensive as bare-rooted plants (A$0.60 vs. A$0.32; A$ = Australian dollar) and gave only similar or lower returns than the bare-rooted plants (A$0.54 to A$3.73 vs. A$1.40 to A$4.09). It can be concluded that containerized strawberry plants are not economically viable in subtropical Queensland under the current price structure and growing system. There was a strong relationship between yield and average plant dry weight (leaves, crowns, and roots) in 'Festival' in the last three experiments, where harvesting continued to late September or early October. Productivity increased by about 18 g for each gram increase in plant dry weight, indicating the dependence of fruit production on vegetative growth in this environment.
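A minimal sketch of the reported yield versus plant dry weight relationship for 'Festival'. Only the slope (~18 g of fruit per gram of plant dry weight) is given in the abstract; the zero intercept and the example dry weights below are assumptions for illustration.

```python
SLOPE_G_FRUIT_PER_G_DRY_WEIGHT = 18.0  # reported for 'Festival'

def predicted_yield_g(plant_dry_weight_g, intercept_g=0.0):
    """Linear yield response; the intercept is not reported in the
    abstract, so zero is assumed purely for illustration."""
    return intercept_g + SLOPE_G_FRUIT_PER_G_DRY_WEIGHT * plant_dry_weight_g

for dw in (10.0, 15.0, 20.0):  # hypothetical plant dry weights (g)
    print(f"{dw:.0f} g dry weight -> ~{predicted_yield_g(dw):.0f} g fruit")
```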