991 results for 0704 Fisheries Sciences
Abstract:
We propose a new model for estimating the size of a population from successive catches taken during a removal experiment. The data from these experiments often show excessive variation, known as overdispersion, compared with that predicted by the multinomial model. The new model allows catchability to vary randomly among samplings, which accounts for overdispersion. When the catchability is assumed to have a beta distribution, the likelihood function, which is referred to as beta-multinomial, is derived, and hence the maximum likelihood estimates can be evaluated. Simulations show that in the presence of extra variation in the data, the confidence intervals given by previous models (Leslie-DeLury, Moran) are substantially underestimated and that the new model provides more reliable confidence intervals. The performance of these methods is also demonstrated using two real data sets: one with overdispersion, from smallmouth bass (Micropterus dolomieu), and the other without overdispersion, from rat (Rattus rattus).
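The mechanism the abstract describes can be sketched in a few lines of Python. This is a minimal illustration (all numbers hypothetical): catches are generated with beta-distributed, occasion-varying catchability, and the classical constant-catchability removal model is then fitted by maximum likelihood. It is not the authors' beta-multinomial likelihood itself.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

rng = np.random.default_rng(42)

# Simulate a removal experiment in which catchability varies among
# samplings (beta-distributed), producing overdispersed catches.
# All numbers here are hypothetical.
N_true, a, b, n_occasions = 1000, 5.0, 20.0, 5
remaining, catches = N_true, []
for _ in range(n_occasions):
    q = rng.beta(a, b)                 # occasion-specific catchability
    c = rng.binomial(remaining, q)
    catches.append(c)
    remaining -= c

# Fit the classical constant-catchability removal model by maximum
# likelihood, ignoring the overdispersion.
def neg_loglik(params):
    N, q = params
    if not (0.0 < q < 1.0) or N < sum(catches):
        return np.inf
    ll, left = 0.0, N
    for c in catches:
        if left < c:
            return np.inf
        ll += binom.logpmf(c, int(round(left)), q)
        left -= c
    return -ll

res = minimize(neg_loglik, x0=[2.0 * sum(catches), 0.2],
               method="Nelder-Mead")
N_hat, q_hat = res.x
```

Repeating the simulation many times and comparing the spread of `N_hat` against the model-based confidence interval is essentially how the overdispersion-induced undercoverage described above can be exposed.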
Abstract:
Although subsampling is a common method for describing the composition of large and diverse trawl catches, the accuracy of these techniques is often unknown. We determined the sampling errors generated from estimating the percentage of the total number of species recorded in catches, as well as the abundance of each species, at each increase in the proportion of the sorted catch. We completely partitioned twenty prawn trawl catches from tropical northern Australia into subsamples of about 10 kg each. All subsamples were then sorted, and species numbers recorded. Catch weights ranged from 71 to 445 kg, and the number of fish species in trawls ranged from 60 to 138, and invertebrate species from 18 to 63. Almost 70% of the species recorded in catches were "rare" in subsamples (less than one individual per 10 kg subsample or less than one in every 389 individuals). A matrix was used to show the increase in the total number of species that were recorded in each catch as the percentage of the sorted catch increased. Simulation modelling showed that sorting small subsamples (about 10% of catch weights) identified about 50% of the total number of species caught in a trawl. Larger subsamples (50% of catch weight on average) identified about 80% of the total species caught in a trawl. The accuracy of estimating the abundance of each species also increased with increasing subsample size. For the "rare" species, sampling error was around 80% after sorting 10% of catch weight and was just less than 50% after 40% of catch weight had been sorted. For the "abundant" species (five or more individuals per 10 kg subsample or five or more in every 389 individuals), sampling error was around 25% after sorting 10% of catch weight, but was reduced to around 10% after 40% of catch weight had been sorted.
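The species-accumulation behaviour described above can be sketched with a toy simulation. Everything here is hypothetical (a geometric abundance distribution standing in for a "many rare species" catch, not the paper's data); it only illustrates how the fraction of species recorded grows as more of the catch is sorted.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical catch: 100 species with geometric (many-rare) abundances.
# Illustrative numbers only, not the paper's data.
n_species = 100
abundances = rng.geometric(p=0.02, size=n_species)        # each >= 1
individuals = np.repeat(np.arange(n_species), abundances)
rng.shuffle(individuals)

# Partition the catch into 10 equal subsamples and record the
# cumulative number of species seen as more of the catch is sorted.
subsamples = np.array_split(individuals, 10)
seen, curve = set(), []
for sub in subsamples:
    seen.update(sub.tolist())
    curve.append(len(seen))           # species recorded so far
```

The shape of `curve` mirrors the matrix described in the abstract: common species appear almost immediately, while rare species keep being added until most of the catch has been sorted.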
Abstract:
Over the past three decades, literary and cultural studies have become increasingly aware of the complex nature of the relationship between science and art. Today, the study of these two cultures forms a field of its own, in which their relationship is examined above all as a dynamic interaction that reflects the language, values and ideological content of our culture. Unlike earlier views that regard science and art as more or less opposed pursuits, current research starts from the assumption that they are culturally constructed discourses which often face similar problems in modelling reality, even though their methods differ. Within this relationship, my dissertation focuses on the language used in popular science writing (including Paul Davies, James Gleick and Richard Dawkins) and on the devices employed by fiction that draws on ideas from the natural sciences (including Jeanette Winterson, Tom Stoppard and Richard Powers), based on a textual analysis of style and themes in a corpus of more than 30 works. With respect to popular science writing, my aim is to show that its language relies to a considerable degree on structures that make it possible to present arguments about reality as persuasively as possible. Many of the figures defined by classical rhetoric play an important role in this task, because they bind the content and form of what is said tightly together: the use of rhetorical figures is thus not a mere stylistic device, but often also crystallizes the philosophical assumptions underlying the arguments and helps establish the logic of the argumentation.
Because many earlier studies have concentrated solely on the role of metaphor in scientific arguments, this dissertation seeks to broaden the field by also analysing the use of other kinds of figures. I further show that the use of rhetorical figures forms a point of contact with fiction that draws on scientific ideas. Whereas popular science uses rhetoric to strengthen both its argumentative and its literary qualities, such fiction depicts science in ways that often mirror the linguistic structures of science writing. Conversely, it is also possible to see how the devices of fiction are reflected in the narrative strategies and language of popular science, testifying to the dynamic interaction of the two cultures. Comparing the rhetorical elements of contemporary popular science with the devices of fiction also shows how science and art participate in a conversation about the meaning of certain fundamental concepts of our culture, such as identity, knowledge and time. In this way it becomes possible to see that both are fundamental parts of the meaning-making process through which scientific ideas and the great questions of human life alike acquire their culturally constructed meanings.
Abstract:
We consider estimation of mortality rates and growth parameters from length-frequency data of a fish stock when there is individual variability in the von Bertalanffy growth parameter L-infinity, and investigate the possible bias in the estimates when the individual variability is ignored. Three methods are examined: (i) the regression method based on Beverton and Holt's (1956, Rapp. P.V. Reun. Cons. Int. Explor. Mer, 140: 67-83) equation; (ii) the moment method of Powell (1979, Rapp. P.V. Reun. Cons. Int. Explor. Mer, 175: 167-169); and (iii) a generalization of Powell's method that estimates the individual variability and incorporates it into the estimation. It is found that the biases in the estimates from the existing methods are, in general, substantial, even when individual variability in growth is small and recruitment is uniform; the generalized method performs better in terms of bias but is subject to larger variation. There is a need to develop robust and flexible methods to deal with individual variability in the analysis of length-frequency data.
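Method (i) rests on the Beverton-Holt length-based relationship between mean length in the catch and total mortality. A minimal sketch, with purely illustrative parameter values (and ignoring the individual variability in L-infinity that the abstract shows can bias it):

```python
# Beverton-Holt length-based estimator of total mortality Z:
#   Z = K * (L_inf - L_mean) / (L_mean - L_prime)
# where L_mean is the mean length of fish above L_prime, the smallest
# length fully represented in the samples.
def beverton_holt_z(l_mean, l_prime, l_inf, k):
    return k * (l_inf - l_mean) / (l_mean - l_prime)

# Illustrative values (not from the paper):
z = beverton_holt_z(l_mean=35.0, l_prime=25.0, l_inf=60.0, k=0.3)
```

Because L-infinity enters the numerator directly, treating a population mean as if it were a shared constant is exactly where the bias discussed above originates.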
Abstract:
In the analysis of tagging data, it has been found that the least-squares method, based on the increment function known as the Fabens method, produces biased estimates because individual variability in growth is not allowed for. This paper modifies the Fabens method to account for individual variability in the length asymptote. Significance tests using t-statistics or log-likelihood ratio statistics may be applied to show the level of individual variability. Simulation results indicate that the modified method reduces the biases in the estimates to negligible proportions. Tagging data from tiger prawns (Penaeus esculentus and Penaeus semisulcatus) and rock lobster (Panulirus ornatus) are analysed as an illustration.
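The Fabens increment function that the paper modifies can be sketched as follows. This is a hypothetical illustration (made-up parameter values), fitting the classical, unmodified Fabens model by least squares to simulated tagging data in which the asymptote varies among individuals; the modified method described above models that variability explicitly rather than leaving it in the residuals.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Fabens growth-increment model: the expected length increment for an
# animal released at length l1 and recaptured dt years later is
#   dL = (L_inf - l1) * (1 - exp(-K * dt))
def fabens(X, l_inf, k):
    l1, dt = X
    return (l_inf - l1) * (1.0 - np.exp(-k * dt))

# Simulated tagging data with individual variability in the asymptote
# (all values hypothetical).
n = 200
l_inf_i = rng.normal(50.0, 5.0, n)        # individual asymptotes
l1 = rng.uniform(20.0, 40.0, n)           # release lengths
dt = rng.uniform(0.5, 2.0, n)             # times at liberty (years)
dl = (l_inf_i - l1) * (1 - np.exp(-0.8 * dt)) + rng.normal(0, 0.5, n)

# Classical Fabens fit: least squares on the increments.
(l_inf_hat, k_hat), _ = curve_fit(fabens, (l1, dt), dl, p0=[45.0, 0.5])
```

In real tagging data, an animal's release length is informative about its own asymptote, which is what makes the naive least-squares fit biased; the simulation above only shows the extra, time-dependent scatter that individual asymptotes introduce.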
Abstract:
The impact of global positioning systems (GPS) and plotter systems on the relative fishing power of the northern prawn fishery fleet on tiger prawns (Penaeus esculentus Haswell, 1879, and P. semisulcatus de Haan, 1850) was investigated from commercial catch data. A generalized linear model was used to account for differences in fishing power between boats and changes in prawn abundance. It was found that boats that used a GPS alone had 4% greater fishing power than boats without a GPS. The addition of a plotter raised the power by 7% over boats without the equipment. For each of the first three years that a fisher has worked with a plotter, there is a further 2-3% increase. It appears that once all boats have had a GPS and plotter for at least 3 years, the fishing power of the fleet will have increased by 12%. Management controls have reduced the efficiency of each boat and lowered the number of days available to fish, but this may not have been sufficient to counteract the increases. Further limits will be needed to maintain the desired levels of mortality.
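The comparison rests on a log-linear model of catch rates, in which equipment effects are multiplicative. A minimal sketch with entirely made-up data and coefficients, using ordinary least squares on log-catch rather than the paper's full GLM with boat and abundance terms:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical fleet: log(catch) = base + b_gps*GPS + b_plot*plotter + noise.
# Coefficients 0.04 and 0.07 are illustrative, chosen to mimic the
# 4% and 7% effects quoted above.
n = 500
gps = rng.integers(0, 2, n)
plotter = gps * rng.integers(0, 2, n)     # plotters only on GPS boats
log_catch = 1.0 + 0.04 * gps + 0.07 * plotter + rng.normal(0, 0.1, n)

# Ordinary least squares on the log scale.
X = np.column_stack([np.ones(n), gps, plotter])
coef, *_ = np.linalg.lstsq(X, log_catch, rcond=None)

# On the log scale, exp(coefficient) - 1 is the proportional increase
# in fishing power attributable to the equipment.
power_gps = np.exp(coef[1]) - 1
power_plot = np.exp(coef[2]) - 1
```

The multiplicative reading is why the separate GPS, plotter and experience effects can be compounded into a fleet-wide increase of roughly 12%.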
Abstract:
A tethered remote instrument package (TRIP) has been developed for biological surveys over Queensland's continental shelf and slope. The present system, evolved from an earlier sled configuration, is suspended above the sea bed and towed at low speeds. Survey information is collected through video and film cameras, while instrument and environmental variables are handled by a minicomputer. The operator was able to "fly" the instrument package above the substrate by using an altitude echosounder, forward-looking sonar and real-time television viewing. Unwanted movements of the viewing system were stabilized through a gyro-controlled camera-head panning system. The hydrodynamic drag of the umbilical presented a major control problem which could be overcome only by a reduction in towing speed. Despite the constraints of towing a device such as this through the coral reef environment, the package performed well during a recent biological survey in which it was worked at 50% of its 350 m design depth.
Abstract:
Perceiving students, science students especially, as mere consumers of facts and information belies the need to engage them with the principles underlying those facts and works against the facilitation of knowledge and understanding. Traditional didactic lecture approaches need a re-think if student classroom engagement and active learning are to be valued over fact memorisation and fact recall. In our undergraduate biomedical science programs across Years 1, 2 and 3 in the Faculty of Health at QUT, we have developed an authentic learning model with an embedded suite of pedagogical strategies that foster classroom engagement and allow for active learning in the sub-discipline area of medical bacteriology. The suite of pedagogical tools we have developed has been designed to enable its translation, with appropriate fine-tuning, to most biomedical and allied health discipline teaching and learning contexts. Indeed, aspects of the pedagogy have been successfully translated to the nursing microbiology study stream at QUT. The aims underpinning the pedagogy are for our students to: (1) connect scientific theory with scientific practice in a more direct and authentic way, (2) construct factual knowledge and facilitate a deeper understanding, and (3) develop and refine their higher-order flexible thinking and problem-solving skills, both semi-independently and independently. The mindset and role of the teaching staff are critical to this approach, since for the strategy to be successful tertiary teachers need to abandon traditional instructional modalities based on one-way information delivery. Face-to-face classroom interactions between students and lecturer enable realisation of pedagogical aims (1), (2) and (3). The strategy we have adopted encourages teachers to view themselves as expert guides in what is very much a student-focused process of scientific exploration and learning. 
Specific pedagogical strategies embedded in the authentic learning model we have developed include: (i) interactive lecture-tutorial hybrids, or lectorials, featuring teacher role-plays as well as class-level question-and-answer sessions, (ii) inclusion of “dry” laboratory activities during lectorials to prepare students for the wet laboratory to follow, (iii) real-world problem-solving exercises conducted during both lectorials and wet laboratory sessions, and (iv) class activities and formative assessments designed to probe a student’s higher-order flexible thinking skills. Flexible thinking in this context encompasses analytical, critical, deductive, scientific and professional thinking modes. The strategic approach outlined above is designed to provide multiple opportunities for students to apply principles flexibly according to a given situation or context, to adapt methods of inquiry strategically, to go beyond mechanical application of formulaic approaches, and, as much as possible, to self-appraise their own thinking and problem solving. The pedagogical tools have been developed within both workplace (real-world) and theoretical frameworks. The philosophical core of the pedagogy is a coherent pathway of teaching and learning which we, and many of our students, believe is more conducive to student engagement and active learning in the classroom. Qualitative and quantitative data derived from online and hardcopy evaluations, solicited and unsolicited student and graduate feedback, anecdotal evidence and peer review indicate that: (i) our students are engaging with the pedagogy, (ii) a constructivist, authentic-learning approach promotes active learning, and (iii) students are better prepared for workplace transition.
Abstract:
The aim of this dissertation is to provide conceptual tools for the social scientist for clarifying, evaluating and comparing explanations of social phenomena based on formal mathematical models. The focus is on relatively simple theoretical models and simulations, not statistical models. These studies apply a theory of explanation according to which explanation is about tracing objective relations of dependence, knowledge of which enables answers to contrastive why- and how-questions. This theory is developed further by delineating criteria for evaluating competing explanations and by applying the theory to social scientific modelling practices and to the key concepts of equilibrium and mechanism. The dissertation comprises an introductory essay and six published original research articles. The main theses about model-based explanations in the social sciences argued for in the articles are the following. 1) The concept of explanatory power, often used to argue for the superiority of one explanation over another, encompasses five dimensions which are partially independent and involve some systematic trade-offs. 2) Not all equilibrium explanations causally explain how the end equilibrium state is reached from the multiple possible initial states. Instead, they often constitutively explain the macro property of the system with the micro properties of the parts (together with their organization). 3) There is an important ambivalence in the concept of mechanism used in many model-based explanations, and this difference corresponds to a difference between two alternative research heuristics. 4) Whether unrealistic assumptions in a model (such as a rational choice model) are detrimental to an explanation provided by the model depends on whether the representation of the explanatory dependency in the model is itself dependent on the particular unrealistic assumptions. 
Thus evaluating whether a literally false assumption in a model is problematic requires specifying exactly what is supposed to be explained and by what. 5) The question of whether an explanatory relationship depends on particular false assumptions can be explored with the process of derivational robustness analysis, and the importance of robustness analysis accounts for some of the puzzling features of the tradition of model-building in economics. 6) The fact that economists have been relatively reluctant to use true agent-based simulations to formulate explanations can partially be explained by the specific ideal of scientific understanding implicit in the practice of orthodox economics.
Abstract:
The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies. It is driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in this context and took advantage of considerable recent achievements in genomics that are shaping our comprehension of the DNA of humans and animals. The basic idea behind this project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the number of successful spawners, an important parameter in the fisheries stock assessment process. The broad objectives of this study were to: 1. critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne) by (a) conducting comprehensive computer simulations and (b) analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus); 2. lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and 3. produce software for the calculation of Ne and make it widely available. The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some of them had been recently implemented with the latest statistical methods (e.g. in a Bayesian framework; Berthier, Beaumont et al. 2002), while others had lower profiles (e.g. Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as a basis for simulation experiments to evaluate the performance of the methods with an individual-based model of a prawn population. 
Following the guidelines suggested by computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis with eight microsatellite loci in three successive spring spawning seasons in 2001, 2002 and 2003. As predicted by the simulations, the estimates had non-infinite upper confidence limits, which is a major achievement for the application of the method to a naturally-occurring, short generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1000 individuals in two consecutive years. This contrasts with about 500,000 prawns participating in spawning. It is not possible to distinguish successful from non-successful spawners so we suggest a high level of protection for the entire spawning population. We interpret the difference in numbers between successful and non-successful spawners as a large variation in the number of offspring per family that survive – a large number of families have no surviving offspring, while a few have a large number. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, assuming the ratio between Ne and spawning population size has been previously calculated for that species. Alternatively, it can be a surrogate for recruitment, again assuming that the ratio between Ne and recruitment has been previously determined. The number of species that can be analysed in this way, however, is likely to be small because of species-specific life history requirements that need to be satisfied for accuracy. The most universal approach would be to integrate Ne with spawning stock-recruitment models, so that these models are more accurate when applied to fisheries populations. A pathway to achieve this was established in this project, which we predict will significantly improve fisheries sustainability in the future. 
Regardless of the success of integrating Ne into spawning stock-recruitment models, Ne could be used as a fisheries monitoring tool. Declines in spawning stock size or increases in natural or harvest mortality would be reflected by a decline in Ne. This would be valuable for data-poor fisheries and provides fishery-independent information; however, we suggest a species-by-species approach, because some species may be too numerous, or experiencing too much migration, for the method to work. During the project two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with the collection of preliminary genetic data from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in the genomics world are rapid, and a cheaper, more reliable substitute for microsatellite loci in this technology is already available. Digital data from single nucleotide polymorphisms (SNPs) are likely to supersede 'analogue' microsatellite data, making it cheaper and easier to apply the method to species with large population sizes.
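The core idea of estimating Ne from the genetic drift observed between temporally spaced samples can be sketched as a simple moment estimator (a Nei-Tajima-style Fc with a Waples-style sampling correction). This is not NeEstimator's actual code, and all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(11)

# Temporal-method moment estimator: drift between two samples taken
# t generations apart inflates the standardized allele-frequency
# variance Fc; after correcting for the sampling noise of S0 and St
# diploid individuals,
#   Ne ~= t / (2 * (Fc - 1/(2*S0) - 1/(2*St)))
def temporal_ne(p0, pt, t, S0, St):
    z = (p0 + pt) / 2.0
    Fc = np.mean((p0 - pt) ** 2 / (z * (1.0 - z)))
    return t / (2.0 * (Fc - 1.0 / (2 * S0) - 1.0 / (2 * St)))

# Simulate drift at 50 loci in a population with true Ne = 100,
# sampled t = 4 generations apart (illustrative values only).
Ne_true, t, S0, St, n_loci = 100, 4, 100, 100, 50
p_start = rng.uniform(0.2, 0.8, n_loci)
p = p_start.copy()
for _ in range(t):
    p = rng.binomial(2 * Ne_true, p) / (2.0 * Ne_true)

p0_obs = rng.binomial(2 * S0, p_start) / (2.0 * S0)   # sample at time 0
pt_obs = rng.binomial(2 * St, p) / (2.0 * St)         # sample at time t
ne_hat = temporal_ne(p0_obs, pt_obs, t, S0, St)
```

When drift is weak relative to sampling noise (large Ne, few loci), the corrected Fc can approach zero and the estimate's upper confidence limit becomes infinite, which is why the non-infinite upper limits reported above for the Moreton Bay prawns are notable.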
Abstract:
Annual discard ogives were estimated using generalized additive models (GAMs) for four demersal fish species: whiting, haddock, megrim, and plaice. The analysis was based on data collected on board commercial vessels and at Irish fishing ports from 1995 to 2003. For all species the most important factors influencing annual discard ogives were fleet (combination of gear, fishing ground, and targeted species), mean length of the catch and year, and, for megrim, also minimum landing size. The length at which fish are discarded has increased since 2000 for haddock, whiting, and plaice. In contrast, discarded length has decreased for megrim, accompanying a reduction in minimum landing size in 2000.
Abstract:
Including collaboration with industry members as an integral part of research activities is a relatively new approach to fisheries research. Earlier approaches to involving fishers in research usually involved compulsory accommodations of research, such as through compulsory observer programs, in which fishers were seen as subjects of rather than participants in research. This new approach brings with it significant potential benefits but also some unique issues both for the researchers and the participating industry members. In this paper we describe a research project involving the Queensland Coral Reef Finfish Fishery that originated from industry and community concerns about changes in marketing practices in an established commercial line fishery. A key aspect of this project was industry collaboration in all stages of the research, from formulation of objectives to assistance with interpretation of results. We discuss this research as a case study of some of the issues raised by collaboration between industry and research groups in fisheries research and the potential pitfalls and benefits of such collaborations for all parties. A dedicated liaison and extension strategy was a key element in the project to develop and maintain the relationships between fishers and researchers that were fundamental to the success of the collaboration. A major research benefit of the approach was the provision of information not available from other sources: 300 days of direct and unimpeded observation of commercial fishing by researchers; detailed catch and effort records from a further 126 fishing trips; and 53 interviews completed with fishers. Fishers also provided extensive operational information about the fishery as well as ongoing support for subsequent research projects. 
The time and resources required to complete the research in this consultative framework were greater than for more traditional, researcher-centric fisheries research, but the benefits gained far outweighed the costs.