13 results for Failure of IndyMac Bank
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Lantana camara L. (Verbenaceae) is a weed of great significance in Australia and worldwide, but little is known about connections among components of its life history. We document, over a 3-year period, the links between L. camara seed-bank dynamics and its above-ground growth, including size asymmetry, in four land-use types (a farm, a hoop pine plantation and two open eucalypt forests) invaded by the weed near Brisbane, Queensland, Australia. Seed-bank populations varied appreciably across sites and in response to rainfall and control measures, and they were higher (~1,000 seeds/m²) when annual rainfall was 15-30% below the long-term yearly average. Fire reduced seed-bank populations but not the proportion germinating (6-8%). Nearly a quarter of fresh seeds remained germinable after 3 years of soil burial. For small seedlings (<10 cm high), the expected trade-off between two life-history traits (survival and growth) did not apply; rather, the observed positive association between these two traits, coupled with a persistent seed-bank population, could contribute to the invasiveness of the plant. Relationships between absolute growth rate and initial plant size (crown volume) were positively linear, suggesting that most populations are still at varying stages of the exponential phase of sigmoid growth; this trend also suggests that, at most sites, lantana growth is inversely size-asymmetric despite increasing stand density and limiting environmental resources of light and soil moisture. From the observed changes in measures of plant size inequality, asymmetric competition appeared limited in all the infestations surveyed. © 2013 Crown Copyright as represented by: Department of Agriculture, Fisheries and Forestry, Australia.
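As an illustration of how such size-inequality measures are computed, here is a minimal Python sketch using the Gini coefficient, one common index of plant size inequality; the abstract does not name the index actually used, and the crown volumes below are hypothetical.

```python
# A minimal sketch, assuming the Gini coefficient as the inequality index
# (the abstract does not name the measure used). Values are hypothetical.
import numpy as np

def gini(sizes):
    """Gini coefficient of a sample (0 = perfectly equal sizes, -> 1 = highly unequal)."""
    x = np.sort(np.asarray(sizes, dtype=float))
    n = x.size
    total = x.sum()
    # Standard sorted-sample form: G = (2*sum(i*x_i) - (n+1)*sum(x)) / (n*sum(x))
    return (2.0 * np.sum(np.arange(1, n + 1) * x) - (n + 1) * total) / (n * total)

crown_volumes = [0.2, 0.5, 0.9, 1.1, 3.4]  # hypothetical crown volumes, m^3
print(f"Gini coefficient: {gini(crown_volumes):.3f}")
```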
Abstract:
Aims: To investigate methods for the recovery of airborne bacteria within pig sheds and to then use the appropriate methods to determine the levels of heterotrophs and Escherichia coli in the air within sheds. Methods and Results: AGI-30 impingers and a six-stage Andersen multi-stage sampler (AMS) were used for the collection of aerosols. Betaine and catalase were added to impinger collection fluid and the agar plates used in the AMS. Suitable media for enumerating E. coli with the Andersen sampler were also evaluated. The addition of betaine and catalase gave no marked increase in the recovery of heterotrophs or E. coli. No marked differences were found in the media used for enumeration of E. coli. The levels of heterotrophs and E. coli in three piggeries, during normal pig activities, were 2.2 × 10⁵ and 21 CFU m⁻³ respectively. Conclusions: The failure of the additives to improve the recovery of either heterotrophs or E. coli suggests that these organisms are not stressed in the piggery environment. The levels of heterotrophs in the air inside the three Queensland piggeries investigated are consistent with those previously reported in other studies. Flushing with ponded effluent had no marked or consistent effect on the heterotroph or E. coli levels. Significance and Impact of the Study: Our work suggests that levels of airborne heterotrophs and E. coli inside pig sheds have no strong link with effluent flushing. It would seem unlikely that any single management activity within a pig shed has a dominant influence on levels of airborne heterotrophs and E. coli.
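Airborne concentrations such as 2.2 × 10⁵ CFU m⁻³ follow from the standard impinger arithmetic: scale the plate count up to the whole collection fluid, then divide by the volume of air sampled. A minimal sketch with illustrative numbers; the AGI-30's nominal flow rate is about 12.5 L/min, but the study's actual sampling parameters are not given in the abstract.

```python
# A minimal sketch of the impinger calculation, with illustrative numbers.
# The AGI-30's nominal flow rate is ~12.5 L/min (assumed here).

def airborne_cfu_per_m3(colonies, plated_ml, fluid_ml, flow_l_min, minutes):
    """CFU per cubic metre of air from an impinger plate count."""
    cfu_in_fluid = colonies * (fluid_ml / plated_ml)  # scale plate count to whole sample
    air_m3 = (flow_l_min * minutes) / 1000.0          # litres of air sampled -> m^3
    return cfu_in_fluid / air_m3

# Example: 138 colonies from 0.1 mL plated out of 20 mL fluid, 30 min sample
print(f"{airborne_cfu_per_m3(138, 0.1, 20.0, 12.5, 30):.3g} CFU/m^3")
```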
Abstract:
Line-transect distance sampling is a widely used method for estimating animal density from aerial surveys. Analysis of line-transect distance data usually relies on a requirement that the statistical distribution of distances of animal groups from the transect line is uniform. We show that this requirement is satisfied by the survey design if all other assumptions of distance sampling hold, but it can be violated by consistent survey problems such as responsive movement of the animals towards or away from the observer. We hypothesise that problems with the uniformity requirement are unlikely to be encountered for immobile taxa, but might become substantial for species of high mobility. We test evidence for non-uniformity using double-observer distance data from two aerial surveys of five species with a spectrum of mobility capabilities and tendencies. No clear evidence against uniformity was found for crabeater seals or emperor penguins on the pack-ice in East Antarctica, while minor non-uniformity consistent with responsive movement of up to 30 m was found for Adélie penguins. Strong evidence of either non-uniformity or a failure of the capture-recapture validation method was found for eastern grey kangaroos and red kangaroos in Queensland.
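The basic idea behind testing the uniformity requirement can be sketched simply: compare observed perpendicular distances against a Uniform(0, w) distribution. The paper's actual approach uses double-observer data; the Kolmogorov-Smirnov check below, on simulated distances, is illustrative only.

```python
# A hedged sketch: KS check of perpendicular distances against Uniform(0, w).
# Simulated data stand in for survey observations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
w = 150.0                                # truncation half-width, metres (assumed)
distances = rng.uniform(0.0, w, 200)     # stand-in for observed distances

stat, p = stats.kstest(distances / w, "uniform")  # rescale to [0, 1] first
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")  # small p: evidence against uniformity
```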
Abstract:
Aerial surveys of kangaroos (Macropus spp.) in Queensland are used to make economically important judgements on the levels of viable commercial harvest. Previous analysis methods for aerial kangaroo surveys have used both mark-recapture methodologies and conventional distance-sampling analyses. Conventional distance sampling has the disadvantage that detection is assumed to be perfect on the transect line, while mark-recapture methods are notoriously sensitive to problems with unmodelled heterogeneity in capture probabilities. We introduce three methodologies for combining mark-recapture and distance-sampling data, aimed at exploiting the strengths of both approaches and overcoming their weaknesses. Of these methods, two are based on the assumption of full independence between observers in the mark-recapture component, and this appears to introduce more bias in density estimation than it resolves through allowing uncertain trackline detection. Both of these methods give lower density estimates than conventional distance sampling, indicating a clear failure of the independence assumption. The third method, termed point independence, appears to perform very well, giving credible density estimates and good properties in terms of goodness-of-fit and percentage coefficient of variation. Estimated densities of eastern grey kangaroos range from 21 to 36 individuals km⁻², with estimated coefficients of variation between 11% and 14% and estimated trackline detection probabilities primarily between 0.7 and 0.9.
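For contrast with the combined methods, a conventional distance-sampling density estimate with a half-normal detection function can be sketched in a few lines; it assumes certain detection on the transect line, which is precisely the assumption the paper's point-independence method relaxes. All numbers below are hypothetical.

```python
# A minimal sketch of conventional distance sampling with a half-normal
# detection function; assumes g(0) = 1 (certain detection on the line).
import numpy as np

def halfnormal_density_km2(perp_distances_m, line_length_km, mean_group_size=1.0):
    x = np.asarray(perp_distances_m, dtype=float)
    n = x.size
    sigma2 = np.sum(x**2) / n              # MLE of sigma^2 for half-normal distances
    esw = np.sqrt(np.pi * sigma2 / 2.0)    # effective strip half-width, metres
    groups_per_m2 = n / (2.0 * line_length_km * 1000.0 * esw)
    return groups_per_m2 * mean_group_size * 1e6  # individuals per km^2

rng = np.random.default_rng(0)
dists = np.abs(rng.normal(0.0, 60.0, 180))  # 180 hypothetical detections, sigma = 60 m
print(f"{halfnormal_density_km2(dists, 400.0, 2.5):.1f} individuals per km^2")
```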
Abstract:
Partial sequencing of the matrix (M) protein gene from seven clinical isolates of bovine parainfluenza virus type 3 (BPIV-3), and complete sequencing of a representative isolate (Q5592), were completed in this study. Nucleotide sequence analysis was initiated because of the failure of in-house BPIV-3 RT-PCR methods to yield expected products for four of the isolates. Phylogenetic reconstructions based on the nucleotide sequences for the M protein and the entire genome, using all of the available BPIV-3 nucleotide sequences, demonstrated that there were two distinct BPIV-3 genotypes (BPIV-3a and BPIV-3b). These newly identified genotypes have implications for the development of BPIV-3 molecular detection methods and may also affect BPIV-3 vaccine formulations.
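One quick, informal way to see that isolates split into distinct genotypes is pairwise nucleotide identity over an aligned region; the study itself used full phylogenetic reconstruction, so the sketch below, with toy sequences rather than real BPIV-3 data, is illustrative only.

```python
# Illustrative only: percent identity over an aligned region as a quick
# grouping heuristic. Toy sequences, not real BPIV-3 data.

def pairwise_identity(seq_a: str, seq_b: str) -> float:
    """Percent identity of two aligned, equal-length nucleotide sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    matches = sum(a == b for a, b in pairs)
    return 100.0 * matches / len(pairs)

print(f"{pairwise_identity('ATGCCGTTAGC', 'ATGCTGTTGGC'):.1f}% identity")
```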
Abstract:
The enemy release hypothesis predicts that native herbivores will either prefer or cause more damage to native than introduced plant species. We tested this using preference and performance experiments in the laboratory and surveys of leaf damage caused by the magpie moth Nyctemera amica on co-occurring native and introduced species of fireweed (Senecio) in eastern Australia. In the laboratory, ovipositing females and feeding larvae preferred the native S. pinnatifolius over the introduced S. madagascariensis. Larvae performed equally well on foliage of S. pinnatifolius and S. madagascariensis: pupal weights did not differ between insects reared on the two species, but growth rates were significantly faster on S. pinnatifolius. In the field, foliage damage was significantly greater on native S. pinnatifolius than on introduced S. madagascariensis. These results support the enemy release hypothesis, and suggest that the failure of native consumers to switch to introduced species contributes to their invasive success. Both plant species experienced reduced, rather than increased, levels of herbivory when growing in mixed populations, as opposed to pure stands in the field; thus, there was no evidence that apparent competition occurred.
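A paired-choice oviposition preference of the kind described can be summarised with a simple binomial test; the counts below are hypothetical, as the abstract does not give the study's statistical detail.

```python
# A hedged sketch: binomial test of host preference. Counts are hypothetical.
from scipy.stats import binomtest

native_choices, total_females = 34, 45   # females choosing S. pinnatifolius (hypothetical)
result = binomtest(native_choices, total_females, p=0.5, alternative="greater")
print(f"P(preference for native host arises by chance) = {result.pvalue:.4f}")
```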
Abstract:
Weed eradication programs often require 10 years or more to achieve their objective. It is important that progress is evaluated on a regular basis so that programs that are 'on track' can be distinguished from those that are unlikely to succeed. Earlier research has addressed conformity of eradication programs to the delimitation criterion. In this paper, evaluation in relation to the containment and extirpation criteria is considered. Because strong evidence of containment failure (i.e. spread from infestations targeted for eradication) is difficult to obtain, it generally will not be practicable to evaluate how effective eradication programs are at containing the target species. However, chronic failure of containment will be reflected in sustained increases in cumulative infested area and thus a failure to delimit a weed invasion. Evaluating the degree of conformity to the delimitation and extirpation criteria is therefore sufficient to give an appraisal of progress towards the eradication objective. A significant step towards eradication occurs when a weed is no longer readily detectable at an infested site, signalling entry to the monitoring phase. This transition will occur more quickly if reproduction is prevented consistently. Where an invasion consists of multiple infestations, the monitoring profile (frequency distribution of time since detection) provides a summary of the overall effectiveness of the eradication program in meeting the extirpation criterion. Eradication is generally claimed when the target species has not been detected for a period equal to or greater than its seed longevity, although there is often considerable uncertainty in estimates of the latter. Recently developed methods, which take into consideration the cost of continued monitoring vs. the potential cost of damage should a weed escape owing to premature cessation of an eradication program, can assist managers in deciding when to terminate weed eradication programs.
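The stop/continue logic cited at the end of the abstract can be sketched crudely: keep monitoring while the expected cost of a missed escape exceeds the cost of another year of surveys. The persistence curve and costs below are assumptions for illustration, not values from the review.

```python
# A crude sketch of the stop/continue logic, under stated assumptions.
# All three parameters are hypothetical, not values from the review.
import math

annual_survey_cost = 12_000.0      # $ per extra year of monitoring
escape_damage_cost = 2_000_000.0   # $ expected damage if the weed escapes
half_life_years = 2.0              # assumed halving time of seed-bank viability

def p_extant(years_undetected):
    """Crude probability the weed persists after t years without detection."""
    return math.exp(-math.log(2) * years_undetected / half_life_years)

years = 1
while p_extant(years) * escape_damage_cost > annual_survey_cost:
    years += 1
print(f"expected escape cost falls below survey cost after {years} years")
```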
Abstract:
The present review identifies various constraints relating to poor adoption of ley-pastures in south-west Queensland, and suggests changes in research, development and extension efforts for improved adoption. These constraints are biophysical, economic and social. In terms of biophysical constraints, first, shallow soil profiles with subsoil constraints (salt and sodicity), unpredictable rainfall, drier conditions with higher soil temperature and evaporative demand in summer, and frost and subzero temperatures in winter, frequently result in a failure of established, or establishing, pastures. Second, there are limited options for legumes in a ley-pasture, with the legumes currently being mostly winter-active species such as lucerne and medics. Winter-active legumes are ineffective in improving soil conditions in a region with summer-dominant rainfall. Third, most grain growers are reluctant to include grasses in their ley-pasture mix, which can be uneconomical for various reasons, including nitrogen immobilisation, carryover of cereal diseases and depressed yields of the following cereal crops. Fourth, a severe depletion of soil water following perennial ley-pastures (grass + legumes or lucerne) can reduce the yields of subsequent crops for several seasons, and the practice of longer fallows to increase soil water storage may be uneconomical and damaging to the environment. Economic assessments of integrating medium- to long-term ley-pastures into cropping regions are generally unfavourable because of reduced capital flow, increased capital investment, economic losses associated with the establishment and termination phases of ley-pastures, and lost opportunities for cropping in a favourable season. Income from livestock on ley-pastures and soil productivity gains to subsequent crops in rotation may not be comparable to cropping when grain prices are high. However, the economic benefits of ley-pastures may be underestimated because of unaccounted environmental benefits, such as enhanced water use and reduced soil erosion from summer-dominant rainfall, and this therefore requires further investigation. In terms of social constraints, the risk of poor and unreliable establishment and persistence, uncertainties in economic and environmental benefits, the complicated process of changing from crop to ley-pastures and vice versa, and the additional labour and management requirements of livestock all present growers with socially unattractive and complex decision-making processes when considering adoption of existing medium- to long-term ley-pasture technology. It is essential that research, development and extension efforts consider that new ley-pasture options, such as incorporation of a short-term summer forage legume, need to be less risky in establishment, productive in a region with prevailing biophysical constraints, economically viable, less complex and highly flexible in the change-over processes, and socially attractive to growers for adoption in south-west Queensland.
Abstract:
Faecal Egg Count Reduction Tests (FECRTs) for macrocyclic lactone (ML) and levamisole (LEV) drenches were conducted on two dairy farms in the subtropical, summer-rainfall region of eastern Australia to determine if anthelmintic failure contributed to severe gastrointestinal nematode infections observed in weaner calves. Subtropical Cooperia spp. were the dominant nematodes on both farms, although significant numbers of Haemonchus placei were also present on Farm 2. On Farm 1, moxidectin pour-on (MXD) drenched at 0.5 mg kg⁻¹ liveweight (LW) reduced the overall Cooperia burden by 82% (95% confidence limits, 37-95%) at day 7 post-drench. As worm burdens increased rapidly in younger animals in the control group (n = 4), levamisole was used as a salvage drench and these calves were withdrawn from the trial on animal welfare grounds after sample collection at day 7. Levamisole (LEV) dosed at 6.8 mg kg⁻¹ LW reduced the worm burden in these calves by 100%, 7 days after drenching. On Farm 2, MXD given at 0.5 mg kg⁻¹ LW reduced the faecal worm egg count of cooperioids at day 8 by 96% (71-99%), ivermectin oral (IVM) at 0.2 mg kg⁻¹ LW by 1.6% (-224 to 70%) and LEV oral at 7.1 mg kg⁻¹ LW by 100%. For H. placei the reductions were 98% (85-99.7%) for MXD, 0.7% (-226 to 70%) for IVM and 100% for LEV. This is the first report in Australia of the failure of macrocyclic lactone treatments to control subtropical Cooperia spp. and suspected failure to control H. placei in cattle.
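Reductions and confidence limits of the kind quoted follow the standard FECRT arithmetic: percent reduction is 100 × (1 − T/C) for treated and control mean egg counts, with approximate limits from the variance of log(T/C). A minimal sketch with hypothetical eggs-per-gram counts, not the trial's data:

```python
# A minimal sketch of the standard FECRT arithmetic, hypothetical counts.
import numpy as np
from scipy import stats

control = np.array([480, 720, 350, 610, 540, 800], dtype=float)
treated = np.array([90, 140, 60, 120, 200, 75], dtype=float)

t_mean, c_mean = treated.mean(), control.mean()
reduction = 100.0 * (1.0 - t_mean / c_mean)

# Approximate variance of ln(T/C), then t-based 95% limits
var_log = treated.var(ddof=1) / (len(treated) * t_mean**2) + \
          control.var(ddof=1) / (len(control) * c_mean**2)
tcrit = stats.t.ppf(0.975, len(treated) + len(control) - 2)
lower = 100.0 * (1.0 - (t_mean / c_mean) * np.exp(tcrit * np.sqrt(var_log)))
upper = 100.0 * (1.0 - (t_mean / c_mean) * np.exp(-tcrit * np.sqrt(var_log)))
print(f"reduction = {reduction:.1f}% (95% CI {lower:.1f}% to {upper:.1f}%)")
```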
Abstract:
Better understanding of the seed-bank dynamics of Echinochloa colona, Urochloa panicoides and Hibiscus trionum, major crop weeds in subtropical Australia, was needed to improve weed control. Emergence patterns and seed persistence were investigated, with viable seeds sown at different depths in large in-ground pots. Seedlings of all species emerged between October and March, when mean soil temperatures were 21-23 °C. However, E. colona emerged as a series of flushes predominantly in the first year, with most seedlings emerging from 0-2 cm. Urochloa panicoides emerged mostly as a single large flush in the first two years, with most seedlings emerging from 5 cm. Hibiscus trionum emerged as a series of flushes over three seasons, initially with the majority from 5 cm and then from 0-2 cm in the later seasons. Longevity of the grass seed was short, with <5% remaining after burial at 0-2 cm for 24 months. In contrast, 38% of H. trionum seeds remained viable after the same period. Persistence of all species increased significantly with burial depth. These data highlight that management strategies need to be tailored for each species, particularly relating to the need for monitoring, application times for control tactics, the impact of tillage, and the time needed to reduce the seed-bank to low numbers.
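The time needed to reduce a seed-bank to low numbers can be estimated by fitting an exponential decay to viability data; the sketch below uses illustrative values loosely echoing the <5% viability at 24 months reported for the grasses, not the study's raw data.

```python
# A hedged sketch: exponential-decay fit to seed viability, solving for the
# time to 1% viability. Values are illustrative, not the study's raw data.
import numpy as np

months = np.array([0, 6, 12, 18, 24], dtype=float)
viable_pct = np.array([100, 55, 28, 12, 4], dtype=float)

slope, _ = np.polyfit(months, np.log(viable_pct), 1)  # ln(v) = ln(v0) - k*t
k = -slope
months_to_1pct = np.log(100.0) / k                    # solve 100*exp(-k*t) = 1
print(f"decay rate k = {k:.3f}/month; ~{months_to_1pct:.0f} months to 1% viability")
```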
Abstract:
Bellyache bush (Jatropha gossypifolia L.) is an invasive shrub that adversely impacts agricultural and natural systems of northern Australia. While several techniques are available to control bellyache bush, depletion of soil seed banks is central to its management. A 10-year study determined the persistence of intact and ant-discarded bellyache bush seeds buried in shade-cloth packets at six depths (ranging from 0 to 40 cm) under both natural rainfall and rainfall-excluded conditions. A second study monitored changes in seedling emergence over time, to provide an indication of the natural rate of seed bank depletion at two sites (rocky and heavy clay) following the physical removal of all bellyache bush plants. Persistence of seed in the burial trial varied depending on seed type, rainfall conditions and burial depth. No viable seeds of bellyache bush remained after 72 months, irrespective of seed type, under natural rainfall conditions. When rainfall was excluded, seeds persisted for much longer, with a small proportion (0.4%) of ant-discarded seeds still viable after 120 months. Seed persistence was prolonged (>96 months to decline to <1% viability) at all burial depths under rainfall-excluded conditions. In contrast, under natural rainfall, surface-located seeds took twice as long (70 months) to decline to 1% viability compared with buried seeds (35 months). No seedling emergence was observed after 58 months and 36 months at the rocky and heavy clay soil sites, respectively. These results suggest that the required duration of control programs for bellyache bush may vary owing to the effect of biotic and abiotic factors on the persistence of soil seed banks.
Abstract:
During the post-rainy (rabi) season in India, around 3 million tonnes of sorghum grain is produced from 5.7 million ha of cropping. This underpins the livelihood of about 5 million households. Severe drought is common, as the crop grown in these areas relies largely on soil moisture stored during the preceding rainy season. Improvement of rabi sorghum cultivars through breeding has been slow but could be accelerated if drought scenarios in the production regions were better understood. The sorghum crop model within the APSIM (Agricultural Production Systems sIMulator) platform was used to simulate crop growth and yield and the pattern of crop water status through each season using available historical weather data. The current model reproduced credibly the observed yield variation across the production region (R² = 0.73). The simulated trajectories of drought stress through each crop season were clustered into five different drought stress patterns. A majority of trajectories indicated terminal drought (43%) with various timings of onset during the crop cycle. The most severe droughts (25% of seasons) were those in which stress began before flowering; these resulted in failure of grain production in most cases, although biomass production was not affected so severely. The frequencies of drought stress types were analysed for selected locations throughout the rabi tract and showed that different zones had different predominating stress patterns. This knowledge can help better focus the search for adaptive traits and management practices on specific stress situations and thus accelerate improvement of rabi sorghum via targeted specific adaptation. The case study presented here is applicable to other sorghum-growing environments. © 2012 Elsevier B.V.
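The clustering step described, grouping simulated seasonal stress trajectories into five patterns, can be sketched with k-means; the trajectories below are synthetic stand-ins for APSIM output.

```python
# A minimal sketch of the clustering step: k-means on seasonal water-stress
# trajectories, five clusters as in the study. Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n_seasons, n_weeks = 120, 18
onsets = rng.integers(4, 16, size=n_seasons)  # week stress begins (synthetic)
t = np.arange(n_weeks)
# Stress index per week: 0 = none, ramping to 1 = severe after onset, plus noise
trajectories = np.clip((t - onsets[:, None]) / 6.0, 0.0, 1.0) \
               + rng.normal(0.0, 0.05, (n_seasons, n_weeks))

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(trajectories)
for c in range(5):
    print(f"stress pattern {c}: {np.mean(labels == c) * 100:.0f}% of seasons")
```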
Abstract:
Recolonisation of soil by macrofauna (especially ants, termites and earthworms) in rehabilitated open-cut mine sites is inevitable and, in terms of habitat restoration and function, typically of great value. In these highly disturbed landscapes, soil invertebrates play a major role in soil development (macropore configuration, nutrient cycling, bioturbation, etc.) and can influence hydrological processes such as infiltration, seepage, runoff generation and soil erosion. Understanding and quantifying these ecosystem processes is important in rehabilitation design, establishment and subsequent management to ensure progress towards the desired end goal, especially in waste cover systems designed to prevent water reaching and transporting underlying hazardous waste materials. However, the soil macrofauna is typically overlooked during hydrological modelling, possibly because of uncertainties about the extent of their influence, which can lead to failure of waste cover systems or rehabilitation activities. We propose that scientific experiments under controlled conditions and field trials on post-mining lands are required to quantify (i) macrofauna–soil structure interactions, (ii) the functional dynamics of macrofauna taxa, and (iii) their effects on macrofauna and soil development over time. Such knowledge would provide crucial information for soil water models, which would increase confidence in mine waste cover design recommendations and eventually lead to a higher likelihood of rehabilitation success on open-cut mining land.