5 results for data reduction by factor analysis

in eResearch Archive - Queensland Department of Agriculture


Relevance:

100.00%

Publisher:

Abstract:

The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies, driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in that context and took advantage of considerable recent achievements in genomics that are shaping our comprehension of the DNA of humans and animals. The basic idea behind the project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the number of successful spawners, an important parameter in the fisheries stock assessment process. The broad objectives of the study were to:
1. critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne), by (a) conducting comprehensive computer simulations and (b) analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus);
2. lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and
3. produce software for the calculation of Ne and make it widely available.
The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some had recently been implemented with the latest statistical methods (e.g. in a Bayesian framework; Berthier, Beaumont et al. 2002), while others had lower profiles (e.g. Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as the basis for simulation experiments evaluating the performance of the methods with an individual-based model of a prawn population.
Following the guidelines suggested by the computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis at eight microsatellite loci in three successive spring spawning seasons (2001, 2002 and 2003). As predicted by the simulations, the estimates had non-infinite upper confidence limits, a major achievement for the application of the method to a naturally occurring, short-generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1000 individuals in two consecutive years, contrasting with the roughly 500,000 prawns participating in spawning. Because it is not possible to distinguish successful from unsuccessful spawners, we suggest a high level of protection for the entire spawning population. We interpret the gap between the two numbers as reflecting large variation in the number of surviving offspring per family: a large number of families have no surviving offspring, while a few have a great many. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, provided the ratio between Ne and spawning population size has previously been calculated for the species. Alternatively, it can be a surrogate for recruitment, again provided the ratio between Ne and recruitment has previously been determined. The number of species that can be analysed in this way, however, is likely to be small because of the species-specific life-history requirements that must be satisfied for accuracy. The most universal approach would be to integrate Ne with spawning stock–recruitment models, so that those models are more accurate when applied to fisheries populations. A pathway to achieve this was established in the project, and we predict it will significantly improve fisheries sustainability in the future.
Regardless of the success of integrating Ne into spawning stock–recruitment models, Ne could be used as a fisheries monitoring tool: declines in spawning stock size, or increases in natural or harvest mortality, would be reflected in a declining Ne. This would be especially valuable for data-poor fisheries and provides fishery-independent information; however, we suggest a species-by-species approach, as some species may be too numerous, or experience too much migration, for the method to work. During the project, two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with preliminary genetic data collected from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in genomics are rapid, and a cheaper, more reliable substitute for microsatellite loci is already available: digital data from single nucleotide polymorphisms (SNPs) are likely to supersede ‘analogue’ microsatellite data, making it cheaper and easier to apply the method to species with large population sizes.
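The temporal approach to estimating Ne from genetic drift can be illustrated with a short sketch. This is a generic moment-based (Nei–Tajima style) calculation of the kind bundled in software such as NeEstimator, not the project's actual code, and the allele frequencies and sample sizes below are invented:

```python
# Minimal sketch of a moment-based "temporal method" estimate of effective
# population size (Ne). Illustrative only -- not the project's implementation.

def temporal_ne(p0, pt, s0, st, generations):
    """Moment-based Ne from allele frequencies at two time points.

    p0, pt      -- allele-frequency lists for one locus at times 0 and t
    s0, st      -- number of individuals sampled at each time point
    generations -- elapsed generations between samples
    """
    # Standardised variance in allele frequency (Fc) across alleles
    fc_terms = []
    for x, y in zip(p0, pt):
        mean = (x + y) / 2.0
        denom = mean - x * y
        if denom > 0:
            fc_terms.append((x - y) ** 2 / denom)
    fc = sum(fc_terms) / len(fc_terms)
    # Correct Fc for sampling noise, then convert observed drift into Ne
    drift = fc - 1.0 / (2 * s0) - 1.0 / (2 * st)
    if drift <= 0:
        return float("inf")  # no detectable drift -> upper limit is infinite
    return generations / (2.0 * drift)

# Invented two-allele locus sampled two spawning seasons apart
ne = temporal_ne(p0=[0.6, 0.4], pt=[0.5, 0.5], s0=200, st=200, generations=1)
```

The `drift <= 0` branch shows why non-infinite upper confidence limits are hard to obtain for large populations: when sampling noise swamps the drift signal, the estimate is unbounded.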

Relevance:

100.00%

Publisher:

Abstract:

To protect terrestrial ecosystems and humans from contaminants, many countries and jurisdictions have developed soil quality guidelines (SQGs). This study proposes a new framework to derive SQGs and guidelines for amended soils, and uses a case study based on phytotoxicity data for copper (Cu) and zinc (Zn) from field studies to illustrate how the framework could be applied. The proposed framework uses normalisation relationships to account for the effects of soil properties on toxicity data, followed by a species sensitivity distribution (SSD) method to calculate a soil added contaminant limit (soil ACL) for a standard soil. The normalisation equations are then used to calculate soil ACLs for other soils. Because the toxicity and bioavailability of pure contaminants and of contaminants in amendments can differ, a soil amendment availability factor (SAAF) is then calculated and used to convert soil ACLs into ACLs for amended soils. The framework was applied to calculate soil ACLs for Cu and Zn. For soils with pH 4–8 and organic carbon (OC) content of 1–6%, the ACLs range from 8 mg/kg to 970 mg/kg added Cu. The SAAF for Cu was pH dependent, varying from 1.44 at pH 4 to 2.15 at pH 8; for the same soils, the ACLs for amended soils range from 11 mg/kg to 2080 mg/kg added Cu. For soils with pH 4–8 and a CEC of 5–60, the ACLs for Zn ranged from 21 to 1470 mg/kg added Zn. A SAAF of one was used for Zn because its concentrations in plant tissue and its soil-to-water partitioning showed no difference between biosolids and soluble Zn-salt treatments, indicating that Zn from biosolids and Zn salts is equally bioavailable to plants.
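The SSD step of the framework can be sketched as follows. This assumes a log-normal SSD fitted by moments and a 95% protection level (the HC5); the endpoint values are invented for illustration, not the study's data:

```python
# Illustrative species sensitivity distribution (SSD) step: fit a log-normal
# distribution to normalised phytotoxicity endpoints and take its 5th
# percentile (HC5) as the ACL for a standard soil. Data are invented.
import math
import statistics

def ssd_hc5(endpoints_mg_per_kg, protection=0.95):
    """HC5 from a log-normal SSD fitted by the method of moments."""
    logs = [math.log10(v) for v in endpoints_mg_per_kg]
    mu = statistics.fmean(logs)
    sigma = statistics.stdev(logs)
    dist = statistics.NormalDist(mu, sigma)
    # 5th percentile on the log scale, back-transformed to mg/kg
    return 10 ** dist.inv_cdf(1.0 - protection)

# Hypothetical normalised toxicity endpoints (mg added Cu/kg) for 8 species
acl = ssd_hc5([60, 95, 120, 180, 240, 310, 450, 700])
```

An amended-soil ACL would then be obtained by multiplying the soil ACL by the SAAF (e.g., 1.44 for Cu at pH 4, per the abstract).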

Relevance:

100.00%

Publisher:

Abstract:

The highly variable flagellin-encoding flaA gene has long been used for genotyping Campylobacter jejuni and Campylobacter coli. High-resolution melting (HRM) analysis is emerging as an efficient and robust method for discriminating DNA sequence variants. The objective of this study was to apply HRM analysis to flaA-based genotyping. The initial aim was to identify a suitable flaA fragment. The PCR primers commonly used to amplify the flaA short variable repeat (SVR) yielded a mixed PCR product unsuitable for HRM analysis. However, a primer set composed of the upstream primer used to amplify the fragment for flaA restriction fragment length polymorphism (RFLP) analysis and the downstream primer used for flaA SVR amplification generated a very pure PCR product, and this primer set was used for the remainder of the study. Eighty-seven C. jejuni and 15 C. coli isolates were analyzed by flaA HRM and also by partial flaA sequencing. There were 47 flaA sequence variants, and all were resolved by HRM analysis. The isolates had previously been genotyped using single-nucleotide polymorphisms (SNPs), binary markers, CRISPR HRM, and flaA RFLP. flaA HRM analysis provided resolving power multiplicative to that of the SNPs, binary markers and CRISPR HRM, and largely concordant with flaA RFLP. It was concluded that HRM analysis is a promising approach to genotyping based on highly variable genes.
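The idea of resolving power being "multiplicative" across typing schemes can be sketched by combining each isolate's profiles and comparing the number of distinct combined types, scored here with Simpson's index of diversity (a standard measure for comparing typing schemes). The profiles below are invented for illustration:

```python
# Sketch: combining two typing schemes multiplies resolving power.
# Invented SNP and flaA HRM profiles for six hypothetical isolates.
from collections import Counter

def simpsons_diversity(types):
    """Simpson's index of diversity: 1 - sum of pairwise-identity odds."""
    n = len(types)
    counts = Counter(types)
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

snp_types = ["A", "A", "B", "B", "C", "C"]
flaa_types = ["1", "2", "1", "1", "2", "3"]
combined = list(zip(snp_types, flaa_types))

# The combined scheme resolves more groups than either scheme alone
print(len(set(snp_types)), len(set(flaa_types)), len(set(combined)))
```

Here each scheme alone distinguishes three types, while the combination distinguishes five, which is the sense in which a new marker adds multiplicative rather than merely additive resolution.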

Relevance:

100.00%

Publisher:

Abstract:

Predicting which species are likely to cause serious impacts in the future is crucial for targeting management efforts, but the characteristics of such species remain largely unconfirmed. We use data and expert opinion on tropical and subtropical grasses naturalised in Australia since European settlement to identify naturalised and high-impact species, and subsequently to test whether high-impact species are predictable. High-impact species for the three main affected sectors (environment, pastoral and agriculture) were determined by assessing evidence against pre-defined criteria. Twenty-one of the 155 naturalised species (14%) were classified as high-impact, including four that affected more than one sector. High-impact species were more likely to have faster spread rates (regions invaded per decade) and to be semi-aquatic. Spread rate was best explained by whether species had been actively spread (as pasture) and by time since naturalisation, but may not be truly explanatory because it was tightly correlated with range size and incidence rate. Giving more weight to minimising the chance of overlooking high-impact species, a priority for biosecurity, meant that a wider range of predictors was required to identify high-impact species, and the predictive power of the models was reduced. By-sector analysis of predictors of high-impact species was limited by their relative rarity, but showed differences between sectors, including in the universal predictors (spread rate and habitat) and in life history. Furthermore, the species causing high impact to agriculture have changed in the past 10 years with changes in farming practice, highlighting the importance of context in determining impact. A rationale for invasion ecology is to improve prediction of, and response to, future threats. Although our study identifies some universal predictors, it suggests that improved prediction will require a far greater emphasis on impact rather than invasiveness, and will need to account for the individual circumstances of affected sectors and the relative rarity of high-impact species.
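The biosecurity trade-off described above, where overlooking a high-impact species is costlier than a false alarm, is commonly handled by up-weighting the positive class when fitting a classifier. A minimal sketch with invented data and an assumed positive-class weight of 5 (not the study's actual model or data):

```python
# Tiny logistic regression with class weights, fitted by gradient descent.
# Features and labels are invented: [spread rate (regions/decade),
# semi-aquatic (0/1)] -> high-impact (1) or not (0).
import math

def fit_weighted_logistic(X, y, pos_weight, lr=0.1, epochs=2000):
    """Fit logistic regression, weighting positive (high-impact) cases."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            # Heavier penalty for errors on high-impact species
            err = (pos_weight if yi == 1 else 1.0) * (p - yi)
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

X = [[0.2, 0], [0.4, 0], [0.5, 1], [1.5, 1], [2.0, 0], [2.5, 1]]
y = [0, 0, 0, 1, 1, 1]
w, b = fit_weighted_logistic(X, y, pos_weight=5.0)
```

Up-weighting shifts the decision boundary toward the negative class, so fewer high-impact species are missed at the cost of more false alarms and lower overall predictive power, which matches the trade-off the abstract reports.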

Relevance:

100.00%

Publisher:

Abstract:

Common coral trout Plectropomus leopardus is an iconic fish of the Great Barrier Reef (GBR) and the most important fish for the commercial fishery there; most of the catch is exported live to Asia. This stock assessment was undertaken in response to falls in catches and catch rates in recent years, in order to gauge the status of the stock. It is the first stock assessment of coral trout on the GBR, and it brings together a multitude of different data sources for the first time. The GBR is very large and was divided into a regional structure based on the Bioregions defined by expert committees appointed by the Great Barrier Reef Marine Park Authority (GBRMPA) as part of the 2004 rezoning of the GBR. The regional structure consists of six Regions, from the Far Northern Region in the north to the Swains and Capricorn–Bunker Regions in the south, with Region boundaries closely following Bioregion boundaries. Two of the northern Regions are split into Subregions on the basis of potential differences in fishing intensity; counting the four unsplit Regions, there are nine Subregions altogether. Bioregions are split into Subbioregions along the Subregion boundaries. Finally, each Subbioregion is split into a “blue” population, which is open to fishing, and a “green” population, which is closed to fishing. The fishery is unusual in that catch rates, as an indicator of coral trout abundance, are heavily influenced by tropical cyclones: after a major cyclone, catch rates fall for two to three years and then rebound. This effect is well correlated with the timing of cyclones, and usually begins in the same month that the cyclone strikes. However, statistical analyses correlating catch rates with cyclone wind energy attributed little of the catch-rate decline to cyclones. Alternative indicators of cyclone strength may explain more of the decline, and future work should investigate this.
Another feature of catch rates is the phenomenon of social learning in coral trout populations: when a population of coral trout is fished, individuals quickly learn not to take bait, and the catch rate falls sharply even while the population size is still high. The social learning may take place by fish directly observing their fellows being hooked, or perhaps by heeding a chemo-sensory cue emitted by hooked fish. As part of the assessment, analysis of data from replenishment closures of Boult Reef in the Capricorn–Bunker Region (closed 1983–86) and Bramble Reef in the Townsville Subregion (closed 1992–95) estimated a strong social-learning effect. A major data source for the stock assessment was the large collection of underwater visual survey (UVS) data collected by divers who counted the coral trout they sighted. This allowed estimation of the density of coral trout in the different Bioregions (expressed as a number of fish per hectare). Combined with mapping data for all of the roughly 3000 reefs making up the GBR, the UVS results provided direct estimates of the population size in each Subbioregion. A regional population dynamic model was developed to account for the intricacies of coral trout population dynamics and catch rates. Because the statistical analysis of catch rates did not attribute much of the decline to tropical cyclones (thereby implying “real” declines in biomass), while in contrast the UVS data indicate relatively stable population sizes, model outputs were unduly influenced by the unlikely hypothesis that falling catch rates reflect falling abundance. The alternative hypothesis, that the UVS data are closer to the mark and declining catch rates are an artefact of effects such as cyclone impacts, is much more probable. Judging by the population size estimates provided by the UVS data, there is no biological problem with the status of coral trout stocks.
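The social-learning effect on catch rates can be caricatured as a catchability coefficient that decays as fish learn, so that catch per unit effort (CPUE) falls while abundance stays constant. The exponential form and the rates below are assumed purely for illustration, not estimated from the closure data:

```python
# Toy model: CPUE = q(t) * N, where catchability q decays with fishing
# pressure as fish learn to avoid bait. All parameter values are invented.
import math

def cpue(population, q0, learn_rate, days_fished):
    """Catch per unit effort with catchability decaying as fish learn."""
    q = q0 * math.exp(-learn_rate * days_fished)
    return q * population

pop = 10_000  # fish on a reef; held constant over a short fishing burst
start = cpue(pop, q0=0.005, learn_rate=0.15, days_fished=0)
later = cpue(pop, q0=0.005, learn_rate=0.15, days_fished=10)
```

In such a model CPUE drops steeply even though `pop` never changes, which is the mechanism by which catch rates can mislead as an abundance index for this species.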
The estimate of the total number of Plectropomus leopardus in blue zones on the GBR in the mid-1980s (the time of the major UVS series) was 5.34 million legal-sized fish, or about 8400 t exploitable biomass, with an additional 3350 t in green zones (using the current zoning, introduced on 1 July 2004). For the offshore regions favoured by commercial fishers, the figure was about 4.90 million legal-sized fish in blue zones, or about 7700 t exploitable biomass. There is, however, an economic problem, as indicated by relatively low catch rates and anecdotal information provided by commercial fishers. The costs of fishing the GBR by hook and line (the only method compatible with the GBR’s high conservation status) are high, and commercial fishers are unable to operate profitably when catch rates are depressed (e.g., after a tropical cyclone). The economic problem is compounded by the effect of social learning in coral trout, whereby catch rates fall rapidly if fishers keep returning to the same fishing locations. In response, commercial fishers tend to spread out over the GBR, including into the Far Northern and Swains Regions, which are far from port and incur higher travel costs. The economic problem provides some logic for a reduction in the TACC. Such a reduction during good times, such as when the fishery is rebounding after a major tropical cyclone, could provide a net benefit to the fishery: it would provide a margin of stock safety and make the fishery more economically robust by supporting higher catch rates during subsequent periods of depressed catches. During hard times when catch rates are low (e.g., shortly after a major tropical cyclone), a change to the TACC would have little effect, as even a reduced TACC would not come close to being filled. Quota adjustments based on catch rates should take account of long-term trends in order to mitigate variability and cyclone effects in the data.
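The arithmetic linking UVS densities to the stock estimates above can be sketched directly: population = density x reef area, and exploitable biomass = population x mean fish weight. The density and area below are hypothetical; only the implied mean fish weight is derived from the abstract's own figures (5.34 million legal-sized fish and about 8400 t):

```python
# Back-of-envelope sketch of the UVS-to-stock-estimate calculation.
# The density and reef area are invented; the mean weight is implied by
# the abstract's figures of 5.34 million legal-sized fish and ~8400 t.

legal_fish = 5.34e6   # legal-sized fish in blue zones (from the abstract)
biomass_t = 8400.0    # exploitable biomass in tonnes (from the abstract)
mean_weight_kg = biomass_t * 1000.0 / legal_fish  # ~1.57 kg per fish

def stock_from_uvs(density_fish_per_ha, reef_area_ha, weight_kg):
    """Population and exploitable biomass (t) from UVS density and area."""
    n = density_fish_per_ha * reef_area_ha
    return n, n * weight_kg / 1000.0

# Hypothetical Subbioregion: 12 fish/ha over 40,000 ha of reef habitat
n, t = stock_from_uvs(12.0, 40_000, mean_weight_kg)
```

Summing such per-Subbioregion figures over the roughly 3000 mapped reefs is what yields the GBR-wide totals quoted in the abstract.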