Abstract:
Mixtures of single odours were used to explore the receptor response profile across individual antennae of Helicoverpa armigera (Hübner) (Lepidoptera: Noctuidae). Seven odours were tested, including floral and green-leaf volatiles: phenylacetaldehyde, benzaldehyde, β-caryophyllene, limonene, α-pinene, 1-hexanol, and (3Z)-hexenyl acetate. Electroantennograms of responses to paired mixtures of odours showed considerable variation in receptor tuning across the receptor field between individuals. Data from some moth antennae showed no additivity, which indicated a restricted receptor profile. Responses of other moth antennae to the same odour mixtures showed a range of partial additivity, indicating that a wider array of receptor types was present in these moths, with a greater percentage of the receptors tuned exclusively to each odour. Peripheral receptor fields thus show variation in their response spectrum within a moth population when exposed to high doses of plant volatiles. This may be related to the variation in host choice within moth populations reported by other authors.
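As a rough illustration of the additivity logic used above: if two odours excite largely non-overlapping receptor populations, the electroantennogram (EAG) response to their mixture should approach the sum of the single-odour responses, whereas shared receptors yield a sub-additive mixture response. The index and amplitudes in the sketch below are ours for illustration only; they are not the authors' measure or data.

```python
# Toy sketch of the mixture-additivity comparison (hypothetical values, mV).
resp_a, resp_b = 1.2, 0.9   # EAG amplitudes to each single odour
resp_mix = 1.6              # EAG amplitude to the paired mixture

# 0 ~ no additivity (mixture no larger than the stronger single odour),
# 1 ~ full additivity (mixture equals the sum of the single responses).
additivity = (resp_mix - max(resp_a, resp_b)) / min(resp_a, resp_b)
print(f"additivity index = {additivity:.2f}")
```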
Abstract:
BACKGROUND: In spite of the extensive use of phosphine fumigation around the world to control insects in stored grain, and the knowledge that grain sorbs phosphine, the effect of concentration on sorption has not been quantified. A laboratory study was undertaken, therefore, to investigate the effect of phosphine dose on sorption in wheat. Wheat was added to glass flasks to achieve filling ratios of 0.25-0.95, and the flasks were sealed and injected with phosphine at 0.1-1.5 mg/L based on flask volume. Phosphine concentration was monitored for 8 days at 25°C and 55% RH. RESULTS: When sorption occurred, phosphine concentration declined with time and was approximately first order, i.e. the data fitted an exponential decay equation. Percentage sorption per day was directly proportional to filling ratio, and was negatively correlated with dose for any given filling ratio. Based on the results, a tenfold increase in dose would result in a halving of the sorption constant and the percentage daily loss. Wheat was less sorptive if it was fumigated for a second time. CONCLUSIONS: The results have implications for the use of phosphine for control of insects in stored wheat. This study shows that dose is a factor that must be considered when trying to understand the impact of sorption on phosphine concentration, and that there appears to be a limit to the capacity of wheat to sorb phosphine.
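For readers who want to see the decay model concretely, the sketch below fits the first-order form C(t) = C0·exp(−kt) to an invented concentration series and converts the fitted sorption constant into a percentage daily loss; it is a minimal illustration, not the study's analysis code or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, c0, k):
    """First-order (exponential) decay of phosphine concentration."""
    return c0 * np.exp(-k * t)

t_days = np.arange(9, dtype=float)          # 8-day monitoring period
conc = np.array([1.00, 0.88, 0.77, 0.68, 0.60,
                 0.53, 0.46, 0.41, 0.36])   # mg/L, hypothetical readings

(c0_fit, k_fit), _ = curve_fit(decay, t_days, conc, p0=(1.0, 0.1))

# Percentage loss per day implied by the fitted sorption constant k;
# the abstract reports that a tenfold dose increase roughly halves k.
daily_loss_pct = (1 - np.exp(-k_fit)) * 100
print(f"k = {k_fit:.3f}/day, daily loss = {daily_loss_pct:.1f}%")
```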
Abstract:
Sorghum ergot, caused predominantly by Claviceps africana Frederickson, Mantle & de Milliano, is a significant threat to the sorghum industry worldwide. The objectives of this study were, firstly, to identify molecular markers linked to ergot resistance and to two pollen traits, pollen quantity (PQ) and pollen viability (PV), and secondly, to assess the relationship between the two pollen traits and ergot resistance in sorghum. A genetic linkage map of the sorghum RIL population R931945-2-2 × IS 8525 (resistance source) was constructed using 303 markers comprising 36 SSR, 117 AFLP™ and 148 DArT™ markers and two morphological trait loci. Composite interval mapping identified nine, five, and four QTL linked to molecular markers for percentage ergot infection (PCERGOT), PQ, and PV, respectively, at a LOD > 2.0. Co-location/linkage of QTL was identified on four chromosomes, while other QTL for the three traits mapped independently, indicating that both pollen-based and non-pollen-based mechanisms of ergot resistance were operating in this sorghum population. Of the nine QTL identified for PCERGOT, five were identified using the overall data set, while four were specific to the group data sets defined by temperature and humidity. QTL identified on SBI-02 and SBI-06 were further validated in additional populations. This is the first report of QTL associated with ergot resistance in sorghum. The markers reported herein could be used for marker-assisted selection for this important disease of sorghum.
Abstract:
Synthetic backcross-derived bread wheats (SBWs) from CIMMYT were grown in north-western Mexico at the Centro de Investigaciones Agrícolas del Noroeste (CIANO) and at sites across Australia during three seasons. During three consecutive years, Australia received “shipments” of different SBWs from CIMMYT for evaluation. A different set of lines was evaluated each season, as new materials became available from the CIMMYT crop enhancement program. These consisted of approximately 100 advanced lines (F7) per year. SBWs had been top- and backcrossed to CIMMYT cultivars in the first two shipments and to Australian wheat cultivars in the third. At CIANO, the SBWs were trialled under receding soil moisture conditions. We evaluated both the performance of each line across all environments and the genotype-by-environment interaction using an analysis that fits a multiplicative mixed model, adjusted for spatial field trends. Data were organised in three groups of multi-environment trials (METs) containing germplasm from shipments 1 (METShip1), 2 (METShip2), and 3 (METShip3), respectively. Large variance components for the genotype × environment interaction were found in each MET analysis, due to the diversity of environments included and the limited replication over years (only in METShip2 were lines tested over 2 years). The average percentage of genetic variance explained by the factor analytic models with two factors was 50.3% for METShip1, 46.7% for METShip2, and 48.7% for METShip3. Yield comparison focused only on lines that were present at all locations within a METShip, or “core” SBWs. A number of core SBWs, crossed to both Australian and CIMMYT backgrounds, outperformed the local benchmark checks at sites at the northern end of the Australian wheat belt, with reduced success at more southern locations. In general, lines that succeeded in the north differed from those in the south. The moderate positive genetic correlation between CIANO and locations in the northern wheat-growing region likely reflects similarities in average temperature during flowering, high evaporative demand, and a short flowering interval. We are currently studying attributes of this germplasm that may contribute to adaptation, with the aim of improving the selection process in both Mexico and Australia.
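The "percentage of genetic variance explained" by a factor analytic (FA) model of order 2 is conventionally computed from the environment loadings and specific variances; the numpy sketch below shows that calculation on invented values and makes no claim about the paper's actual estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 6                                # number of environments (illustrative)
L = rng.normal(size=(p, 2))          # hypothetical loadings for 2 factors
psi = rng.uniform(0.1, 0.5, size=p)  # hypothetical environment-specific variances

# FA model of the genetic covariance across environments: G = L L' + diag(psi).
G = L @ L.T + np.diag(psi)
pct_explained = 100 * np.trace(L @ L.T) / np.trace(G)
print(f"% genetic variance explained by two factors: {pct_explained:.1f}%")
```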
Abstract:
An adaptive conjoint analysis was used to evaluate stakeholders' opinion of welfare indicators for ship-transported sheep and cattle, both onboard and in pre-export depots. In consultations with two nominees of each identified stakeholder group (government officials, animal welfare representatives, animal scientists, stockpersons, producers/pre-export depot operators, exporters/ship owners and veterinarians), 18 potential indicators were identified. Three levels were assigned to each using industry statistics and expert opinion, representing those observed on the best and worst 5% of voyages and an intermediate value. A computer-based questionnaire was completed by 135 stakeholders (48% of those invited). All indicators were ranked by respondents in the assigned order, except fodder intake, in which case providing the amount necessary to maintain bodyweight was rated better than over- or underfeeding, and time in the pre-export assembly depot, in which case 5 days was rated better than 0 or 10 days. The respective Importance Values (a relative rating given by the respondent) for each indicator were, in order of declining importance: mortality (8.6%), clinical disease incidence (8.2%), respiration rate (6.8%), space allowance (6.2%), ammonia levels (6.1%), weight change (6.0%), wet bulb temperature (6.0%), time in assembly depot (5.4%), percentage of animals in hospital pen (5.4%), fodder intake (5.2%), stress-related metabolites (5.0%), percentage of feeding trough utilised (5.0%), injuries (4.8%), percentage of animals able to access food troughs at any one time (4.8%), percentage of animals lying down (4.7%), cortisol concentration (4.5%), noise (3.9%), and photoperiod (3.4%). The different stakeholder groups were relatively consistent in their ranking of the indicators, with all groups nominating the same top two and at least five of the top seven indicators. Some of the top indicators, in particular mortality, disease incidence and temperature, are already recorded in the Australian industry, but the study identified potential new welfare indicators for exported livestock, such as space allowance and ammonia concentration, which could be used to improve welfare standards if validated by scientific data. The top indicators would also be useful worldwide for countries engaging in long-distance sea transport of livestock.
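In conjoint analysis, an indicator's Importance Value is conventionally the range of its part-worth utilities expressed as a percentage of the summed ranges across all indicators, which is why the values above total 100%. The part-worths below are invented for illustration and cover only three of the 18 indicators.

```python
# Hypothetical part-worth utilities for the worst, intermediate and best
# level of each indicator (not estimates from the survey).
part_worths = {
    "mortality":                  [-1.9, 0.1, 1.8],
    "clinical disease incidence": [-1.7, 0.0, 1.6],
    "respiration rate":           [-1.3, 0.2, 1.2],
}

ranges = {name: max(u) - min(u) for name, u in part_worths.items()}
total = sum(ranges.values())
for name, r in ranges.items():
    print(f"{name}: Importance Value = {100 * r / total:.1f}%")
```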
Abstract:
In zucchini, the use of row covers until flowering and the insect growth regulator (IGR) pyriproxyfen are effective methods of reducing the number of insects, especially silverleaf whitefly (Bemisia tabaci (Gennadius) Biotype B), on plants. We compared floating row covers (FRCs) used up until flowering, with silverleaf whitefly (SLW) introduced (FRC + SLW) or not introduced (FRC-only), against open plots with SLW introduced (SLW-only) or with SLW introduced plus IGR (SLW + IGR). FRCs increased temperature and humidity compared with the uncovered treatments. Average fruit weight was lower (P < 0.01) for the FRC + SLW treatment than for the other treatments, and the percentage of marketable fruit was lower for FRC + SLW than for the other three treatments. This result indicates that the use of either row covers or IGR controls whiteflies, reduces fruit damage and increases the size, weight, and quality of fruit, and may also control other sap-sucking insects. However, if SLW are already present on plants, the use of FRCs may reduce predation and favour build-up of SLW. Thus, FRCs and IGR, if used judiciously, may provide an effective alternative to broad-spectrum pesticides in small-scale cucurbit production.
Abstract:
Quantitative trait loci (QTL) detection was carried out for adventitious rooting and associated propagation traits in a second-generation outbred Corymbia torelliana × Corymbia citriodora subspecies variegata hybrid family (n = 186). The parental species of this cross are divergent in their capacity to develop roots adventitiously on stem cuttings and in their propensity to form lignotubers. For each of the ten traits studied, one or two QTL were detected, with some QTL explaining large amounts of phenotypic variation (e.g. 66% for one QTL for percentage rooting), suggesting that major effects influence rooting in this cross. Collocation of QTL for many strongly genetically correlated rooting traits to a single region on linkage group 12 suggested pleiotropy. However, a three-locus model was most parsimonious for linkage group 12, as differences in QTL position and lower genetic correlations suggested separate loci for the traits of shoot production and root initiation. Species differences were thought to be the major source of phenotypic variation for some rooting rate and root quality traits because of the major QTL effects and the up to 59-fold larger homospecific deviations (attributed to species differences) relative to heterospecific deviations (attributed to standing variation within species) evident at some QTL for these traits. A large homospecific/heterospecific ratio at major QTL suggested that the gene action evident in one cross may be indicative of gene action more broadly in hybrids between these species for some traits.
Abstract:
Genetic control of vegetative propagation traits was described for a second-generation, outbred, intersectional hybrid family (N = 208) derived from two species, Corymbia torelliana (F. Muell.) K.D. Hill & L.A.S. Johnson and Corymbia variegata (F. Muell.) K.D. Hill & L.A.S. Johnson, which contrast in propagation characteristics and in their capacity to develop lignotubers. Large phenotypic variances were evident for rooting and most other propagation traits, with significant proportions attributable to differences between clones (broad-sense heritabilities 0.2-0.5). Bare root assessment of rooting rate and root quality parameters tended to have the highest heritabilities, whereas rooting percentage based on root emergence from pots and shoot production were intermediate. Root biomass and root initiation had the lowest heritabilities. Strong favourable genetic correlations were found between rooting percentage and root quality traits such as root biomass, volume, and length. Lignotuber development on a seedling was associated with low rooting and a tendency to poor root quality in cuttings, and was in accord with the persistence of species parent types due to gametic phase disequilibrium. On average, nodal cuttings rooted more frequently and with higher quality root systems, but significant cutting type × genotype interaction indicated that for some clones, higher rooting rates were obtained from tips. Low germination, survival of seedlings, and rooting rates suggested strong hybrid breakdown in this family.
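As a reminder of what the heritabilities above mean, broad-sense heritability for a clonally replicated trial is the between-clone variance as a proportion of total phenotypic variance; the sketch below estimates it from a one-way ANOVA decomposition on simulated data, chosen so H² falls in the reported 0.2-0.5 band. It is an assumed textbook calculation, not the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_clones, n_ramets = 50, 4
clone_effects = rng.normal(0.0, 1.0, n_clones)    # clonal (genetic) effects
y = clone_effects[:, None] + rng.normal(0.0, 1.5, (n_clones, n_ramets))

ms_between = n_ramets * y.mean(axis=1).var(ddof=1)  # mean square among clones
ms_within = y.var(axis=1, ddof=1).mean()            # mean square within clones
var_clone = (ms_between - ms_within) / n_ramets     # between-clone variance component

H2 = var_clone / (var_clone + ms_within)            # broad-sense heritability
print(f"H^2 = {H2:.2f}")
```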
Abstract:
The ability of blocking ELISAs and haemagglutination-inhibition (HI) tests to detect antibodies in sera from chickens challenged with either Avibacterium (Haemophilus) paragallinarum isolate Hp8 (serovar A) or H668 (serovar C) was compared. Serum samples were examined weekly over the 9 weeks following infection. The results showed that the positive rate of serovar A-specific antibody in the B-ELISA remained at 100% from the second week to the ninth week. In chickens given the serovar C challenge, the highest positive rate of serovar C-specific antibody in the B-ELISA appeared at the seventh week (60% positive) and was then followed by a rapid decrease. The B-ELISA gave significantly more positives at weeks 2, 3, 7, 8 and 9 post-infection for serovar A and at week 7 post-infection for serovar C. In qualitative terms, for both serovar A and serovar C infections, the HI tests gave a lower percentage of positive sera at all time points except at 9 weeks post-infection with serovar C. The highest positive rate for serovar A HI antibodies was 70% of sera at the fourth and fifth weeks post-infection. The highest rate of serovar C HI antibodies was 20% at the fifth and sixth weeks post-infection. The results have provided further evidence of the suitability of the serovar A and C B-ELISAs for the diagnosis of infectious coryza.
Abstract:
Aerial surveys of kangaroos (Macropus spp.) in Queensland are used to make economically important judgements on the levels of viable commercial harvest. Previous analysis methods for aerial kangaroo surveys have used both mark-recapture methodologies and conventional distance-sampling analyses. Conventional distance sampling has the disadvantage that detection is assumed to be perfect on the transect line, while mark-recapture methods are notoriously sensitive to problems with unmodelled heterogeneity in capture probabilities. We introduce three methodologies for combining mark-recapture and distance-sampling data, aimed at exploiting the strengths of both approaches and overcoming their weaknesses. Of these methods, two are based on the assumption of full independence between observers in the mark-recapture component, and this appears to introduce more bias in density estimation than it resolves through allowing uncertain trackline detection. Both of these methods give lower density estimates than conventional distance sampling, indicating a clear failure of the independence assumption. The third method, termed point independence, appears to perform very well, giving credible density estimates and good properties in terms of goodness-of-fit and percentage coefficient of variation. Estimated densities of eastern grey kangaroos range from 21 to 36 individuals/km², with estimated coefficients of variation between 11% and 14% and estimated trackline detection probabilities primarily between 0.7 and 0.9.
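For orientation, the conventional distance-sampling estimator that the point-independence method builds on is D = n / (2wL·p̄), where p̄ is the average detection probability within the truncation distance w. The sketch below evaluates it under an assumed half-normal detection function with invented survey numbers; it is not the paper's estimator or data.

```python
import numpy as np
from scipy.integrate import quad

sigma = 80.0   # half-normal scale (m), hypothetical
w = 200.0      # truncation (half-strip) width (m)
L = 1000.0     # total transect length (km), hypothetical
n = 5200       # kangaroos detected, hypothetical

# Half-normal detection function and its average over the strip.
g = lambda x: np.exp(-x**2 / (2 * sigma**2))
p_avg = quad(g, 0, w)[0] / w

area_km2 = 2 * (w / 1000.0) * L        # area of the searched strips
density = n / (area_km2 * p_avg)       # D = n / (2 w L p)
print(f"p = {p_avg:.2f}, density = {density:.1f} individuals/km^2")
```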
Abstract:
Three drafts of Bos indicus cross steers (initially 178-216 kg) grazed Leucaena-grass pasture [Leucaena leucocephala subspecies glabrata cv. Cunningham with green panic (Panicum maximum cv. trichoglume)] from late winter through to autumn during three consecutive years in the Burnett region of south-east Queensland. Measured daily weight gain (DWGActual) of the steers was generally 0.7-1.1 kg/day during the summer months. Estimated intakes of metabolisable energy and dry matter (DM) were calculated from feeding standards as the intakes required by the steers to grow at the DWGActual. Diet attributes were predicted from near infrared reflectance spectroscopy spectra of faeces (F.NIRS) using established calibration equations appropriate for northern Australian forages. Inclusion of some additional reference samples from cattle consuming Leucaena diets into F.NIRS calibrations based on grass and herbaceous legume-grass pastures improved prediction of the proportion of Leucaena in the diet. Mahalanobis distance values supported the hypothesis that the F.NIRS predictions of diet crude protein concentration and DM digestibility (DMD) were acceptable. F.NIRS indicated that the percentage of Leucaena in the diet varied widely (10-99%). Diet crude protein concentration and DMD were usually high, averaging 12.4% and 62%, respectively, and were related asymptotically to the percentage of Leucaena in the diet (R² = 0.48 and 0.33, respectively). F.NIRS calibrations for DWG were not adequate for predicting this variable from an individual faecal sample, since the s.e. of prediction was 0.33-0.40 kg/day. Cumulative steer liveweight (LW) predicted from F.NIRS DWG calibrations that had been previously developed with tropical grass and grass-herbaceous legume pastures greatly overestimated the measured steer LW; these calibrations were therefore not useful. Cumulative steer LW predicted from a modified F.NIRS DWG calibration, which included data from the present study, was strongly correlated (R² = 0.95) with measured steer LW but overestimated LW by 19-31 kg after 8 months. Additional reference data are needed to develop robust F.NIRS calibrations that encompass the diversity of Leucaena pastures of northern Australia. In conclusion, the experiment demonstrated that F.NIRS could improve understanding of diet quality and nutrient intake of cattle grazing Leucaena-grass pasture, and of the relationships between nutrient supply and cattle growth.
Abstract:
Salinity, sodicity, acidity, and phytotoxic levels of chloride (Cl) in subsoils are major constraints to crop production in many soils of north-eastern Australia because they reduce the ability of crop roots to extract water and nutrients from the soil. The complex interactions and correlations among soil properties result in multicollinearity between soil properties and crop yield that makes it difficult to determine which constraint is the major limitation. We used ridge-regression analysis to overcome collinearity and to evaluate the contribution of soil factors and water supply to the variation in the yields of 5 winter crops on soils with various levels and combinations of subsoil constraints in the region. Subsoil constraints measured were soil Cl, electrical conductivity of the saturation extract (ECse), and exchangeable sodium percentage (ESP). The ridge-regression procedure selected several of the variables used in a descriptive model, which included in-crop rainfall, plant-available soil water at sowing in the 0.90-1.10 m soil layer, and soil Cl in the 0.90-1.10 m soil layer, and accounted for 77-85% of the variation in the grain yields of the 5 winter crops. Inclusion of the ESP of the topsoil (0-0.10 m soil layer) marginally increased the descriptive capability of the models for bread wheat, barley, and durum wheat. Subsoil Cl concentration was found to be an effective substitute for subsoil water extraction. The estimated critical levels of subsoil Cl for a 10% reduction in grain yield were 492 mg Cl/kg for chickpea, 662 mg Cl/kg for durum wheat, 854 mg Cl/kg for bread wheat, 980 mg Cl/kg for canola, and 1012 mg Cl/kg for barley, suggesting that chickpea and durum wheat were more sensitive to subsoil Cl than bread wheat, barley, and canola.
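Ridge regression stabilises coefficient estimates under collinearity by shrinking them with a penalty λ, using the closed form β = (XᵀX + λI)⁻¹Xᵀy. The sketch below applies it to simulated, deliberately correlated Cl, ECse, and ESP predictors; the data and penalty are illustrative assumptions, not the study's.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60
cl = rng.uniform(100, 1200, n)               # subsoil Cl (mg/kg)
ecse = 0.004 * cl + rng.normal(0, 0.5, n)    # ECse, correlated with Cl
esp = 0.02 * cl + rng.normal(0, 3.0, n)      # ESP, also correlated with Cl

X = np.column_stack([cl, ecse, esp])
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardise predictors
y = -0.8 * X[:, 0] + rng.normal(0, 0.3, n)   # centred yield response (toy)

lam = 1.0                                    # ridge penalty
beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print("ridge coefficients (Cl, ECse, ESP):", np.round(beta, 2))
```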
Abstract:
Single or multiple factors implicated in subsoil constraints, including salinity, sodicity, and phytotoxic concentrations of chloride (Cl), are present in many Vertosols, including those occurring in Queensland, Australia. The variable distribution of, and the complex interactions between, these constraints limit the agronomic or management options available for soils with these subsoil constraints. The identification of crops and cultivars adapted to these adverse subsoil conditions and/or able to exploit subsoil water may be an option for maintaining the productivity of these soils. We evaluated the relative performance of 5 winter crop species, in terms of grain yields, nutrient concentration, and ability to extract soil water, grown on soils with various levels and combinations of subsoil constraints in 19 field experiments over 2 years. Subsoil constraints were measured by levels of soil Cl, electrical conductivity of the saturation extract (ECse), and exchangeable sodium percentage (ESP). Increasing levels of subsoil constraints significantly decreased maximum depth of water extraction, grain yield, and plant-available water capacity for all 5 crops, and more so for chickpea and durum wheat than for bread wheat, barley, or canola. Increasing soil Cl levels had a greater restricting effect on water availability than did ECse or ESP. We developed empirical relationships between soil Cl, ECse, and ESP and the crop lower limit (CLL) for estimating subsoil water extraction by the 5 winter crops. However, the presence of gypsum influenced the ability to predict CLL from levels of ECse. Stronger relationships of apparent unused plant-available water (CLL − LL15, where LL15 is the lower limit at −1.5 MPa) with soil Cl concentration than with ESP or ECse suggested that the presence of high Cl in these soils most likely inhibited subsoil water extraction by the crops. This was supported by increased sodium (Na) and Cl concentrations, with a corresponding decrease in calcium (Ca) and potassium (K), in the young mature leaf of bread wheat, durum wheat, and chickpea with increasing levels of subsoil constraints. Of the 2 ions, Na and Cl, the latter appears to be more damaging, resulting in plant dieback and reduced grain yields.
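The quantity CLL − LL15 above is simple layer-by-layer arithmetic: the difference between the volumetric water content a crop actually leaves behind (CLL) and the laboratory lower limit at −1.5 MPa (LL15), multiplied by layer thickness and summed over depth. The values below are invented to show the calculation only.

```python
# Hypothetical soil profile (not study data): layer thickness in mm,
# CLL and LL15 as volumetric water contents (v/v).
layers_mm = [100, 200, 300, 300]
cll  = [0.18, 0.22, 0.26, 0.30]
ll15 = [0.15, 0.17, 0.19, 0.21]

# Apparent unused plant-available water, in mm of water.
unused_paw = sum(t * (c - l) for t, c, l in zip(layers_mm, cll, ll15))
print(f"Apparent unused plant-available water: {unused_paw:.0f} mm")
```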
Abstract:
Surveys were conducted between 1997 and 2001 to investigate the incidence of overwintering Helicoverpa spp. pupae under summer crop residues on the Darling Downs, Queensland. Only Helicoverpa armigera was represented in collections of overwintering pupae. The results indicated that late-season crops of cotton, sorghum, maize, soybean, mungbean and sunflower were equally likely to have overwintering pupae under them. In the absence of tillage practices, these crops had the potential to produce similar numbers of moths/ha in the spring. There were expected differences between years in the densities of overwintering pupae and the number of emerged moths/ha. Irrigated crops produced 2.5 times more moths/ha than dryland crops. Overall survival from autumn-formed pupae to emerged moths averaged 44%, with a higher proportion of pupae under maize surviving to produce moths than each of the other crops. Parasitoids killed 44.1% of pupae, with Heteropelma scaposum representing 83.3% of all parasitoids reared from pupae. Percentage parasitism levels were lower in irrigated crops (27.6%) compared with dryland crops (40.5%). Recent changes to Helicoverpa spp. management in cotton/grain-farming systems in south-eastern Queensland, including widespread adoption of Bt cotton, and use of more effective and more selective insecticides, could lead to lower densities of overwintering pupae under late summer crops.
Abstract:
The first larval instar has been identified as a critical stage for population mortality in Lepidoptera, yet, owing to the small body size of these larvae, the factors that contribute to mortality under field conditions are still not clear. Dispersal behaviour has been suggested as a significant but ignored factor contributing to mortality in first-instar lepidopteran larvae. The impact that leaving the host plant has on the mortality rate of Helicoverpa armigera neonates was examined in field crops and laboratory trials. In this study the following were examined: (1) the effects of soil surface temperature, and the level of shade within the crop, on the mortality of neonates on the soil after dropping off from the host plant; (2) the percentage of neonates that dropped off from a host plant and landed on the soil; and (3) the effects of exposure to different soil surface temperatures on the development and mortality of neonates. The findings of this study showed that: (1) on the soil, surface temperatures above 43°C were lethal for neonates, and exposure to these temperatures contributed greatly to the overall mortality rate observed; however, the fate of neonates on the soil varied significantly depending on canopy closure within the crop; (2) at least 15% of neonates dropped off from the host plant and landed on the soil, meaning that the proportion of neonates exposed to these conditions is not trivial; and (3) a 30-min exposure to soil surface temperatures approaching the lethal level (>43°C) had no significant negative effects on the development and mortality of larvae through to the second instar. Overall, leaving the plant through drop-off contributes to first-instar mortality in crops with open canopies; however, survival of neonates that have lost contact with a host plant is possible, and becomes more likely later in the crop growing season.