908 results for average complexity
Abstract:
We consider the problem of deciding whether the output of a Boolean circuit is determined by a partial assignment to its inputs. This problem is easily shown to be hard, i.e., co-NP-complete. However, many of the consequences of a partial input assignment may be determined in linear time, by iterating the following step: if we know the values of some inputs to a gate, we can deduce the values of some outputs of that gate. This process of iteratively deducing some of the consequences of a partial assignment is called propagation. This paper explores the parallel complexity of propagation, i.e., the complexity of determining whether the output of a given Boolean circuit is determined by propagating a given partial input assignment. We give a complete classification of the problem into those cases that are P-complete and those that are unlikely to be P-complete.
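To make the propagation step concrete, here is a minimal, hedged Python sketch (not taken from the paper) of propagation over AND/OR/NOT gates: a gate's output is deduced as soon as its known inputs force it, and the loop repeats until nothing new can be inferred. The circuit encoding and gate names are illustrative assumptions.

```python
# Minimal sketch of propagation on an AND/OR/NOT circuit (illustrative only).
# Gates map an output name to (kind, input_names); `values` is the partial assignment.

def propagate(gates, values):
    """Iteratively deduce gate outputs until no new value can be inferred."""
    changed = True
    while changed:
        changed = False
        for out, (kind, ins) in gates.items():
            if out in values:
                continue
            known = [values[i] for i in ins if i in values]
            val = None
            if kind == "NOT" and known:
                val = not known[0]
            elif kind == "AND":
                if False in known:
                    val = False          # a single 0 input forces the output
                elif len(known) == len(ins):
                    val = all(known)
            elif kind == "OR":
                if True in known:
                    val = True           # a single 1 input forces the output
                elif len(known) == len(ins):
                    val = any(known)
            if val is not None:
                values[out] = val
                changed = True
    return values

# Example: out = (x AND y) OR z with x = 0, z = 1 propagates out = 1.
circuit = {"g1": ("AND", ["x", "y"]), "out": ("OR", ["g1", "z"])}
print(propagate(circuit, {"x": False, "z": True}))
```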
Abstract:
A variety of data structures such as the inverted file, multi-lists, quad tree, k-d tree, range tree, polygon tree, quintary tree, multidimensional tries, segment tree, doubly chained tree, the grid file, d-fold tree, super B-tree, Multiple Attribute Tree (MAT), etc. have been studied for multidimensional searching and related problems. Physical database organization, which is an important application of multidimensional searching, is traditionally and mostly handled by employing the inverted file. This study proposes the MAT data structure for bibliographic file systems, by illustrating the superiority of the MAT data structure over the inverted file. Both methods are compared in terms of preprocessing, storage and query costs. Worst-case complexity analysis of both methods, for a partial match query, is carried out in two cases: (a) when the directory resides in main memory, (b) when the directory resides in secondary memory. In both cases, the MAT data structure is shown to be more efficient than the inverted file method. Arguments are given to illustrate the superiority of the MAT data structure in the average case as well. An efficient adaptation of the MAT data structure, which exploits the special features of the MAT structure and bibliographic files, is proposed for bibliographic file systems. In this adaptation, suitable techniques for fixing and ranking the attributes for the MAT data structure are proposed. Conclusions and proposals for future research are presented.
Abstract:
We address the issue of complexity for vector quantization (VQ) of wide-band speech LSF (line spectrum frequency) parameters. The recently proposed switched split VQ (SSVQ) method provides better rate-distortion (R/D) performance than the traditional split VQ (SVQ) method, even with lower computational complexity, but at the expense of much higher memory. We develop the two-stage SVQ (TsSVQ) method, with which we gain both memory and computational advantages while retaining good R/D performance. The proposed TsSVQ method uses a full-dimensional quantizer in its first stage to exploit all the higher-dimensional coding advantages and then uses an SVQ method to quantize the residual vector in the second stage so as to reduce the complexity. We also develop a transform domain residual coding method in this two-stage architecture that further reduces the computational complexity. To design an effective residual codebook in the second stage, variance normalization of Voronoi regions is carried out, which leads to the design of two new methods, referred to as normalized two-stage SVQ (NTsSVQ) and normalized two-stage transform domain SVQ (NTsTrSVQ). These two new methods have complementary strengths and hence are combined in a switched VQ mode, which leads to further improvement in R/D performance while retaining the low complexity requirement. We evaluate the performance of the new methods for wide-band speech LSF parameter quantization and show their advantages over established SVQ and SSVQ methods.
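As an illustration of the two-stage architecture described above (a hedged sketch only, using random placeholder codebooks rather than trained ones, and illustrative dimensions and codebook sizes), the encoder quantizes the full LSF vector in stage one and then split-quantizes the residual:

```python
import numpy as np

# Illustrative two-stage split VQ (TsSVQ-style) encoder/decoder, not the
# authors' implementation. Codebooks here are random placeholders.

rng = np.random.default_rng(0)
DIM, SPLITS = 16, 4                                      # 16-dim LSF vector, 4 sub-vectors

stage1_cb = rng.standard_normal((256, DIM))              # full-dimension stage-1 codebook
stage2_cbs = [rng.standard_normal((64, DIM // SPLITS))   # one codebook per residual split
              for _ in range(SPLITS)]

def nearest(codebook, x):
    """Index of the codevector closest to x (squared-error criterion)."""
    return int(np.argmin(((codebook - x) ** 2).sum(axis=1)))

def encode(x):
    i1 = nearest(stage1_cb, x)                           # stage 1: full-vector quantization
    residual = x - stage1_cb[i1]
    parts = np.split(residual, SPLITS)                   # stage 2: split VQ of the residual
    i2 = [nearest(cb, p) for cb, p in zip(stage2_cbs, parts)]
    return i1, i2

def decode(i1, i2):
    residual = np.concatenate([cb[j] for cb, j in zip(stage2_cbs, i2)])
    return stage1_cb[i1] + residual

x = rng.standard_normal(DIM)
print(np.abs(x - decode(*encode(x))).max())              # reconstruction error
```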
Abstract:
In this dissertation, I present an overall methodological framework for studying linguistic alternations, focusing specifically on lexical variation in denoting a single meaning, that is, synonymy. As the practical example, I employ the synonymous set of the four most common Finnish verbs denoting THINK, namely ajatella, miettiä, pohtia and harkita ‘think, reflect, ponder, consider’. As a continuation to previous work, I describe in considerable detail the extension of statistical methods from dichotomous linguistic settings (e.g., Gries 2003; Bresnan et al. 2007) to polytomous ones, that is, concerning more than two possible alternative outcomes. The applied statistical methods are arranged into a succession of stages with increasing complexity, proceeding from univariate via bivariate to multivariate techniques in the end. As the central multivariate method, I argue for the use of polytomous logistic regression and demonstrate its practical implementation to the studied phenomenon, thus extending the work by Bresnan et al. (2007), who applied simple (binary) logistic regression to a dichotomous structural alternation in English. The results of the various statistical analyses confirm that a wide range of contextual features across different categories are indeed associated with the use and selection of the selected think lexemes; however, a substantial part of these features are not exemplified in current Finnish lexicographical descriptions. The multivariate analysis results indicate that the semantic classifications of syntactic argument types are on the average the most distinctive feature category, followed by overall semantic characterizations of the verb chains, and then syntactic argument types alone, with morphological features pertaining to the verb chain and extra-linguistic features relegated to the last position. In terms of overall performance of the multivariate analysis and modeling, the prediction accuracy seems to reach a ceiling at a Recall rate of roughly two-thirds of the sentences in the research corpus. The analysis of these results suggests a limit to what can be explained and determined within the immediate sentential context and applying the conventional descriptive and analytical apparatus based on currently available linguistic theories and models. The results also support Bresnan’s (2007) and others’ (e.g., Bod et al. 2003) probabilistic view of the relationship between linguistic usage and the underlying linguistic system, in which only a minority of linguistic choices are categorical, given the known context – represented as a feature cluster – that can be analytically grasped and identified. Instead, most contexts exhibit degrees of variation as to their outcomes, resulting in proportionate choices over longer stretches of usage in texts or speech.
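As a schematic illustration of the polytomous (multinomial) logistic regression step, and emphatically not the dissertation's actual feature set or data, one might fit a model that predicts one of the four THINK lexemes from binary contextual features; with scikit-learn's default lbfgs solver, a multi-class target is fitted with a multinomial loss:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical, synthetic stand-in for the contextual feature matrix: each row
# is a sentence, each column a binary contextual feature; labels are the four
# Finnish THINK lexemes. Illustrative only.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(400, 12))                 # 400 sentences, 12 features
lexemes = np.array(["ajatella", "miettiä", "pohtia", "harkita"])
y = lexemes[rng.integers(0, 4, size=400)]

model = LogisticRegression(max_iter=1000).fit(X, y)    # polytomous (multinomial) fit

# Predicted probabilities over the four outcomes for one new context.
probs = model.predict_proba(X[:1])[0]
print(dict(zip(model.classes_, probs.round(3))))
```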
Abstract:
Space-time codes from complex orthogonal designs (CODs) with no zero entries offer low Peak to Average Power Ratio (PAPR) and avoid the problem of switching off antennas. However, square CODs for 2^a antennas with a+1 complex variables and no zero entries had been discovered only for a <= 3 and for a+1 = 2^k, k >= 4. In this paper, a method of obtaining no zero entry (NZE) square designs, called Complex Partial-Orthogonal Designs (CPODs), for 2^(a+1) antennas whenever a certain type of NZE code exists for 2^a antennas is presented. Then, starting from a so-constructed NZE CPOD for n = 2^(a+1) antennas, a construction procedure is given to obtain NZE CPODs for 2n antennas, successively. Compared to CODs, CPODs have slightly higher ML decoding complexity for rectangular QAM constellations and the same ML decoding complexity for other complex constellations. Using the recently constructed NZE CODs for 8 antennas, our method leads to NZE CPODs for 16 antennas. The class of CPODs does not offer full diversity for all complex constellations. For the NZE CPODs presented in the paper, conditions on the signal sets which guarantee full diversity are identified. Simulation results show that the bit error performance of our codes is the same as that of CODs under an average power constraint and superior to CODs under a peak power constraint.
Abstract:
Over the past four decades, histories of art and design have moved away from the canonical study of objects, artists/designers and styles and have turned toward more interdisciplinary research. We argue, nevertheless, that design historians must continue to push their use of approaches drawing on material culture and criticality in order to fill gaps in design history and to develop methods and approaches relevant to its study. Drawing on our experience teaching the "millennial" generation, who are inclined toward "activist design", we offer pedagogical examples that have helped our students absorb responsible, engaged and reflexive design histories and understand the complexity and criticality of design.
Abstract:
The research in software science has so far concentrated on three measures of program complexity: (a) software effort; (b) cyclomatic complexity; and (c) program knots. In this paper we propose a measure of the logical complexity of programs in terms of the variable dependency of sequences of computations, the inductive effort in writing loops and the complexity of data structures. The proposed complexity measure is described with the aid of a graph which exhibits diagrammatically the dependence of a computation at a node upon the computations at other (earlier) nodes. Complexity measures of several example programs have been computed and the related issues discussed. The paper also describes the role played by data structures in deciding program complexity.
Abstract:
The sequential nature of gel-based marker systems entails low throughput and high costs per assay. Commonly used marker systems such as SSR and SNP are also dependent on sequence information. These limitations result in a high cost per data point and significantly limit the capacity of breeding programs to obtain sufficient return on investment to justify the routine use of marker-assisted breeding for many traits, particularly quantitative traits. Diversity Arrays Technology (DArT™) is a cost-effective hybridisation-based marker technology that offers a high multiplexing level while being independent of sequence information. This technology offers sorghum breeding programs an alternative approach to whole-genome profiling. We report on the development, application, mapping and utility of DArT™ markers for sorghum germplasm. Results: A genotyping array was developed representing approximately 12,000 genomic clones using PstI+BanII complexity reduction, with a subset of clones obtained through the suppression subtractive hybridisation (SSH) method. The genotyping array was used to analyse a diverse set of sorghum genotypes and to screen a Recombinant Inbred Line (RIL) mapping population. Over 500 markers detected variation among the 90 accessions used in a diversity analysis. Cluster analysis discriminated well between all 90 genotypes. To confirm that the sorghum DArT markers behave in a Mendelian manner, we constructed a genetic linkage map for a cross between R931945-2-2 and IS 8525, integrating DArT and other marker types. In total, 596 markers could be placed on the integrated linkage map, which spanned 1431.6 cM. The genetic linkage map had an average marker density of 1/2.39 cM, with an average DArT marker density of 1/3.9 cM. Conclusion: We have successfully developed DArT markers for Sorghum bicolor and have demonstrated that DArT provides high-quality markers that can be used for diversity analyses and to construct medium-density genetic linkage maps. The high number of DArT markers generated in a single assay not only provides a precise estimate of genetic relationships among genotypes, but their even distribution over the genome also offers real advantages for a range of molecular breeding and genomics applications.
Abstract:
Following an invariant-imbedding approach, we obtain analytical expressions for the ensemble-averaged resistance (ρ) and its Sinai fluctuations for a one-dimensional disordered conductor in the presence of a finite electric field F. The mean resistance shows a crossover from exponential to power-law length dependence with increasing field strength, in agreement with known numerical results. More importantly, unlike the zero-field case, the resistance distribution saturates for large sample lengths to a Poissonian limiting form proportional to A|F| exp(-A|F|ρ), where A is a constant.
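Written out, the Poissonian limiting form quoted above is simply an exponential density in ρ with rate A|F|; the normalization and mean below are elementary consequences of that form rather than statements quoted from the abstract:

```latex
P(\rho) \;\to\; A|F|\, e^{-A|F|\rho} \quad (\text{large sample length}), \qquad
\int_0^{\infty} A|F|\, e^{-A|F|\rho}\, d\rho = 1, \qquad
\langle \rho \rangle = \frac{1}{A|F|}.
```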
Abstract:
The impacts of cropping histories (sugarcane, maize and soybean), tillage practices (conventional tillage and direct drill) and fertiliser N in the plant and 1st ratoon (1R) crops of sugarcane were examined in field trials at Bundaberg and Ingham. Average yields at Ingham (Q200) and Bundaberg (Q151) were quite similar in both the plant crop (83 t/ha and 80 t/ha, respectively) and the 1R (89 t/ha and 94 t/ha, respectively), with only minor treatment effects on CCS at each site. Cane yield responses to tillage, break history and N fertiliser varied significantly between sites. There was a 27% yield increase in the plant crop from the soybean fallow at Ingham, with soybeans producing a yield advantage over continuous cane, but there were no clear break effects at Bundaberg – possibly due to a complex of pathogenic nematodes that responded differently to soybean and maize breaks. There was no carryover benefit of the soybean break into the 1R crop at Ingham, while at Bundaberg the maize break produced a 15% yield advantage over soybeans and continuous cane. The Ingham site recorded positive responses to N fertiliser addition in both the plant (20% yield increase) and 1R (34% yield increase) crops, but there was negligible carryover benefit from plant crop N in the 1R crop, nor any evidence of a reduced N response after a soybean rotation. By contrast, the Bundaberg site showed no N response in any history in the plant crop, and only a small (5%) yield increase with N applied in the 1R crop. There was again no evidence of a reduced N response in the 1R crop after a soybean fallow. There were no significant effects of tillage on cane yields at either site, although there were some minor interactions between tillage, breaks and N management in the 1R crop at both sites. Crop N contents at Bundaberg were more than 3 times those recorded at Ingham in both the plant and 1R crops, with N concentrations in millable stalk at Ingham suggesting N deficiencies in all treatments. There was negligible additional N recovered in crop biomass from N fertiliser application or soybean residues at the Ingham site. There was additional N recovered in crop biomass in response to N fertiliser and soybean breaks at Bundaberg, but the effects were small and fertiliser use efficiencies poor. Loss pathways could not be quantified, but denitrification or losses in runoff were the likely causes at Ingham, while leaching predominated at Bundaberg. Results highlight the complexity involved in developing sustainable farming systems for contrasting soil types and climatic conditions. A better understanding of key sugarcane pathogens and their host range, as well as improved capacity to predict in-crop N mineralisation, will be key factors in future improvements to sugarcane farming systems.
Abstract:
This paper reviews some of the recent developments in complexity theory as applied to telephone switching. Some of these techniques are suitable for practical implementation in India.
Abstract:
Prescribing for older patients is challenging. The prevalence of diseases increases with advancing age and causes extensive drug use. Impairments in cognitive, sensory, social and physical functioning, multimorbidity and comorbidities, as well as age-related changes in pharmacokinetics and pharmacodynamics all add to the complexity of prescribing. This study is a cross-sectional assessment of all long-term residents aged ≥ 65 years in all nursing homes in Helsinki, Finland. The residents’ health status was assessed and data on their demographic factors, health and medications were collected from their medical records in February 2003. This study assesses some essential issues in prescribing for older people: psychotropic drugs (Paper I), laxatives (Paper II), vitamin D and calcium supplements (Paper III), potentially inappropriate drugs for older adults (PIDs) and drug-drug interactions (DDIs)(Paper IV), as well as prescribing in public and private nursing homes. A resident was classified as a medication user if his or her medication record indicated a regular sequence for its dosage. Others were classified as non-users. Mini Nutritional Assessment (MNA) was used to assess residents’ nutritional status, Beers 2003 criteria to assess the use of PIDs, and the Swedish, Finnish, INteraction X-referencing database (SFINX) to evaluate their exposure to DDIs. Of all nursing home residents in Helsinki, 82% (n=1987) participated in studies I, II, and IV and 87% (n=2114) participated in the study III. The residents’ mean age was 84 years, 81% were female, and 70% were diagnosed with dementia. The mean number of drugs was 7.9 per resident; 40% of the residents used ≥ 9 drugs per day, and were thus exposed to polypharmacy. Eighty percent of the residents received psychotropics; 43% received antipsychotics, and 45% used antidepressants. Anxiolytics were prescribed to 26%, and hypnotics to 28% of the residents. Of those residents diagnosed with dementia, 11% received antidementia drugs. Fifty five percent of the residents used laxatives regularly. In multivariate analysis, those factors associated with regular laxative use were advanced age, immobility, poor nutritional status, chewing problems, Parkinson’s disease, and a high number of drugs. Eating snacks between meals was associated with lower risk for laxative use. Of all participants, 33% received vitamin D supplementation, 28% received calcium supplementation, and 20% received both vitamin D and calcium. The dosage of vitamin D was rather low: 21% received vitamin D 400 IU (10 µg) or more, and only 4% received 800 IU (20 µg) or more. In multivariate analysis, residents who received vitamin D supplementation enjoyed better nutritional status, ate snacks between meals, suffered no constipation, and received regular weight monitoring. Those residents receiving PIDs (34% of all residents) more often used psychotropic medication and were more often exposed to polypharmacy than residents receiving no PIDs. Residents receiving PIDs were less often diagnosed with dementia than were residents receiving no PIDs. The three most prevalent PIDs were short-acting benzodiazepine in greater dosages than recommended, hydroxyzine, and nitrofurantoin. These three drugs accounted for nearly 77% of all PID use. Of all residents, less than 5% were susceptible to a clinically significant DDI. The most common DDIs were related to the use of potassium-sparing diuretics, carbamazepine, and codeine. 
Residents exposed to potential DDIs were younger, had more often suffered a previous stroke, more often used psychotropics, and were more often exposed to PIDs and polypharmacy than were residents not exposed to DDIs. Residents in private nursing homes were less often exposed to polypharmacy than were residents in public nursing homes. Long-term residents in nursing homes in Helsinki use, on average, nearly eight drugs daily. The use of psychotropic drugs in our study was notably more common than in international studies. The prevalence of laxative use was similar to that reported in earlier international studies. Despite the known benefits of, and recommendations for, vitamin D supplementation for elderly people residing mostly indoors, the proportion of nursing home residents receiving vitamin D and calcium was surprisingly low. The use of PIDs was common among nursing home residents. PIDs increased the likelihood of DDIs. However, DDIs did not appear to be a major concern among the nursing home population. Monitoring PIDs and potential drug interactions could improve the quality of prescribing.
Abstract:
We computed Higuchi's fractal dimension (FD) of resting, eyes closed EEG recorded from 30 scalp locations in 18 male neuroleptic-naive, recent-onset schizophrenia (NRS) subjects and 15 male healthy control (HC) subjects, who were group-matched for age. Schizophrenia patients showed a diffuse reduction of FD except in the bilateral temporal and occipital regions, with the reduction being most prominent bifrontally. The positive symptom (PS) schizophrenia subjects showed FD values similar to or even higher than HC in the bilateral temporo-occipital regions, along with a co-existent bifrontal FD reduction as noted in the overall sample of NRS. In contrast, this increase in FD values in the bilateral temporo-occipital region was absent in the negative symptom (NS) subgroup. The regional differences in complexity suggested by these findings may reflect the aberrant brain dynamics underlying the pathophysiology of schizophrenia and its symptom dimensions. Higuchi's method of measuring FD directly in the time domain provides an alternative for the more computationally intensive nonlinear methods of estimating EEG complexity.
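Since the abstract highlights Higuchi's time-domain method, here is a short, hedged Python sketch of the standard algorithm; kmax and the white-noise test signal are illustrative choices, not parameters reported in the study:

```python
import numpy as np

# Higuchi's fractal dimension computed directly in the time domain.
def higuchi_fd(x, kmax=10):
    x = np.asarray(x, dtype=float)
    n = len(x)
    lengths = []
    for k in range(1, kmax + 1):
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)                # subsampled series x_m^k
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()    # curve length of the subseries
            norm = (n - 1) / ((len(idx) - 1) * k)   # Higuchi normalisation factor
            lk.append(dist * norm / k)
        lengths.append(np.mean(lk))
    k_vals = np.arange(1, kmax + 1)
    # FD is the slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lengths), 1)
    return slope

rng = np.random.default_rng(0)
print(higuchi_fd(rng.standard_normal(2000)))        # white noise: FD close to 2
```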
Abstract:
Non-Technical Summary
Seafood CRC Project 2009/774: Harvest strategy evaluations and co-management for the Moreton Bay Trawl Fishery
Principal Investigator: Dr Tony Courtney, Principal Fisheries Biologist, Fisheries and Aquaculture, Agri-Science Queensland, Department of Agriculture, Fisheries and Forestry, Level B1, Ecosciences Precinct, Joe Baker St, Dutton Park, Queensland 4102. Email: tony.courtney@daff.qld.gov.au
Project objectives:
1. Review the literature and data (i.e., economic, biological and logbook) relevant to the Moreton Bay trawl fishery.
2. Identify and prioritise management objectives for the Moreton Bay trawl fishery, as identified by the trawl fishers.
3. Undertake an economic analysis of the Moreton Bay trawl fishery.
4. Quantify long-term changes to fishing power for the Moreton Bay trawl fishery.
5. Assess priority harvest strategies identified in 2 (above). Present results to, and discuss results with, the Moreton Bay Seafood Industry Association (MBSIA), fishers and Fisheries Queensland.
Note: Additional, specific objectives for 2 (above) were developed by fishers and the MBSIA after commencement of the project. These are presented in detail in section 5 (below).
The project was an initiative of the MBSIA, primarily in response to falling profitability in the Moreton Bay prawn trawl fishery. The analyses were undertaken by a consortium of DAFF, CSIRO and University of Queensland researchers. This report adopted the Australian Standard Fish Names (http://www.fishnames.com.au/).
Trends in catch and effort
The Moreton Bay otter trawl fishery is a multispecies fishery, with the majority of the catch composed of Greasyback Prawns (Metapenaeus bennettae), Brown Tiger Prawns (Penaeus esculentus), Eastern King Prawns (Melicertus plebejus), squid (Uroteuthis spp., Sepioteuthis spp.), Banana Prawns (Fenneropenaeus merguiensis), Endeavour Prawns (Metapenaeus ensis, Metapenaeus endeavouri) and Moreton Bay bugs (Thenus parindicus). Other commercially important byproduct includes blue swimmer crabs (Portunus armatus), three-spot crabs (Portunus sanguinolentus), cuttlefish (Sepia spp.) and mantis shrimp (Oratosquilla spp.). Logbook catch and effort data show that total annual reported catch of prawns from the Moreton Bay otter trawl fishery has declined to 315 t in 2008 from a maximum of 901 t in 1990. The number of active licensed vessels participating in the fishery has also declined, from 207 in 1991 to 57 in 2010. Similarly, fishing effort has fallen from a peak of 13,312 boat-days in 1999 to 3817 boat-days in 2008 – a 71% reduction. The declines in catch and effort are largely attributed to reduced profitability in the fishery due to increased operational costs and depressed prawn prices. The low prawn prices appear to be attributable to Australian aquacultured prawns and imported aquacultured vannamei prawns displacing the markets for trawl-caught prawns, especially small species such as Greasyback Prawns, which traditionally dominated landings in Moreton Bay. In recent years, the relatively high Australian dollar has resulted in reduced exports of Australian wild-caught prawns. This has increased supply on the domestic market, which has also suppressed price increases. Since 2002, Brown Tiger Prawns have dominated annual reported landings in the Moreton Bay fishery. While total catch and effort in the bay have declined to historically low levels, the annual catch and catch rates of Brown Tiger Prawns have been at record highs in recent years.
This appears to be at least partially attributable to the tiger prawn stock having recovered from excessive effort in previous decades. The total annual value of the Moreton Bay trawl fishery catch, including byproduct, is about $5 million, of which Brown Tiger Prawns account for about $2 million. Eastern King Prawns make up about 10% of the catch and are mainly caught in the bay from October to December as they migrate to offshore waters outside the bay, where they contribute to a large mono-specific trawl fishery. Some of the Eastern King Prawns harvested in Moreton Bay may be growth overfished (i.e., caught below the size required to maximise yield or value), although the optimum size-at-capture was not determined in this study. Banana Prawns typically make up about 5% of the catch, but can exceed 20%, particularly following heavy rainfall.
Economic analysis of the fishery
From the economic survey, cash profits were, on average, positive for both fleet segments in both years of the survey. However, after the opportunity cost of capital and depreciation were taken into account, the residual owner-operator income was relatively low, and substantially lower than the average share of revenue paid to employed skippers. Consequently, owner-operators were earning less than the opportunity cost of their labour, suggesting that the fleets were economically unviable in the longer term. The M2 licensed fleet were, on average, earning boat cash profits similar to the T1/M1 fleet, although after the higher capital costs were accounted for, the T1/M1 boats were earning substantially lower returns to owner-operator labour. The mean technical efficiency for the fleet as a whole was estimated to be 0.67. That is, on average, the boats were only catching 67 per cent of what was possible given their level of inputs (hours fished and hull units). Almost one-quarter of observations had efficiency scores above 0.8, suggesting that a substantial proportion of the fleet is relatively efficient, but some boats are also relatively inefficient. Both fleets had similar efficiency distributions, with median technical efficiency scores of 0.71 and 0.67 for the M2 and T1/M1 boats respectively. These scores are reasonably consistent with other studies of prawn trawl fleets in Australia, although higher average efficiency scores were found in the NSW prawn trawl fleet. From the inefficiency model, several factors were found to significantly influence vessel efficiency. These included the number of years of experience as skipper, the number of generations that the skipper's family had been fishing and the number of years of schooling. Skippers with more schooling were significantly more efficient than skippers with lower levels of schooling, consistent with other studies. Skippers who had been fishing longer were, in fact, less efficient than newer skippers. However, this was mitigated in the case of skippers whose family had been involved in fishing for several generations, consistent with other studies and suggesting that skill was passed on by families over successive generations. Both the linear and log-linear regression models of total fishing effort against the marginal profit per hour performed reasonably well, explaining between 70 and 84 per cent of the variation in fishing effort. As the models had different dependent variables (one logged and the other not logged), this is not a good basis for model choice.
A better comparator is the square root of the mean square error (SMSE) expressed as a percentage of the mean total effort. On this criterion, both models performed very similarly. The linear model suggests that each additional dollar of average profits per hour in the fishery increases total effort by around 26 hours each month. From the log-linear model, each percentage increase in profits per hour increases total fishing effort by 0.13 per cent. Both models indicate that economic performance is a key driver of fishing effort in the fishery. The effect of removing the boat-replacement policy is to increase individual vessel profitability, catch and effort, but the overall increase in catch is less than that removed by the boats that must exit the fishery. That is, the smaller fleet (in terms of boat numbers) is more profitable but the overall catch is not expected to be greater than before. This assumes, however, that active boats are removed, and that these were also taking an average level of catch. If inactive boats are removed, then the catch of the remaining group as a whole could increase by between 14 and 17 per cent, depending on the degree to which costs are reduced with the new boats. This is still substantially lower than historical levels of catch by the fleet.
Fishing power analyses
An analysis of logbook data from 1988 to 2010, and survey information on fishing gear, was performed to estimate the long-term variation in the fleet's ability to catch prawns (known as fishing power) and to derive abundance estimates of the three most commercially important prawn species (i.e., Brown Tiger, Eastern King and Greasyback Prawns). Generalised linear models were used to explain the variation in catch as a function of effort (i.e., hours fished per day), vessel and gear characteristics, onboard technologies, population abundance and environmental factors. This analysis estimated that fishing power associated with Brown Tiger and Eastern King Prawns increased over the past 20 years by 10–30% and declined by approximately 10% for Greasybacks. The density of tiger prawns was estimated to have almost tripled, from around 0.5 kg per hectare in 1988 to 1.5 kg/ha in 2010. The density of Eastern King Prawns was estimated to have fluctuated between 1 and 2 kg per hectare over this time period, without any noticeable overall trend, while Greasyback Prawn densities were estimated to have fluctuated between 2 and 6 kg per hectare, also without any distinctive trend. A model of tiger prawn catches was developed to evaluate the impact of fishing on prawn survival rates in Moreton Bay. The model was fitted to logbook data using the maximum-likelihood method to provide estimates of the natural mortality rate (0.038 and 0.062 per week) and catchability (which can be defined as the proportion of the fished population that is removed by one unit of effort, in this case estimated to be (2.5 ± 0.4) × 10^-4 per boat-day). This approach provided a method for industry and scientists to develop together a realistic model of the dynamics of the fishery. Several aspects need to be developed further to make this model acceptable to industry. Firstly, there is considerable evidence to suggest that temperature influences prawn catchability. This ecological effect should be incorporated before developing meaningful harvest strategies. Secondly, total effort has to be allocated between each species. Such allocation of effort could be included in the model by estimating several catchability coefficients.
Nevertheless, the work presented in this report is a stepping stone towards estimating essential fishery parameters and developing the representative mathematical models required to evaluate harvest strategies. Developing a method that allowed an effective discussion between industry, management and scientists took longer than anticipated. As a result, harvest strategy evaluations were preliminary and only included the most valuable species in the fishery, Brown Tiger Prawns. Additional analyses and data collection, including information on catch composition from field sampling, migration rates and recruitment, would improve the modelling.
Harvest strategy evaluations
As the harvest strategy evaluations are preliminary, the following results should not be adopted for management purposes until more thorough evaluations are performed. The effects of closing the fishery for one calendar month on the annual catch and value of Brown Tiger Prawns were investigated. Each of the 12 months (i.e., January to December) was evaluated. The results were compared against historical records to determine the magnitude of gain or loss associated with the closure. Uncertainty regarding the trawl selectivity was addressed using two selectivity curves, one with a weight at 50% selection (S50%) of 7 g, based on research data, and a second with S50% of 14 g, put forward by industry. In both cases, it was concluded that any monthly closure after February would not be beneficial to the industry. The magnitude of the benefit of closing the fishery in either January or February was sensitive to which mesh selectivity curve was assumed, with greater benefit achieved when the smaller selectivity curve (i.e., S50% = 7 g) was assumed. Using the smaller selectivity (S50% = 7 g), the expected increase in catch value was 10–20%, which equates to $200,000 to $400,000 annually, while the larger selectivity curve (S50% = 14 g) suggested catch value would be improved by 5–10%, or $100,000 to $200,000. The harvest strategy evaluations showed that greater benefits, in the order of 30–60% increases in the annual tiger prawn catch value, could have been obtained by closing the fishery early in the year when annual effort levels were high (i.e., > 10,000 boat-days). In recent years, as effort levels have declined (i.e., ~4000 boat-days annually), expected benefits from such closures are more modest. In essence, temporal closures offer greater benefit when fishing mortality rates are high. A spatial analysis of Brown Tiger Prawn catch and effort was also undertaken to obtain a better understanding of the prawn population dynamics. This indicated that, to improve the profitability of the fishery, fishers could consider closing the fishery in the period from June to October, which is already a period of low profitability. This would protect the Brown Tiger Prawn spawning stock, increase catch rates of all species in the lucrative pre-Christmas period (November–December), and provide fishers with time to do vessel maintenance, arrange markets for the next season's harvest, and, if they wish, work at other jobs. The analysis found that the instantaneous rate of total mortality (Z) for the March–June period did not vary significantly over the last two decades.
As the Brown Tiger Prawn population in Moreton Bay has clearly increased over this time period, an interesting conclusion is that the instantaneous rate of natural mortality (M) must have increased, suggesting that tiger prawn natural mortality may be density-dependent at this time of year. Mortality rates of tiger prawns for June–October were found to have decreased over the last two decades, which has probably had a positive effect on spawning stocks in the October–November spawning period.
Abiotic effects on the prawns
The influences of air temperature, rainfall, freshwater flow, the southern oscillation index (SOI) and lunar phase on the catch rates of the four main prawn species were investigated. The analyses were based on over 200,000 daily logbook catch records over 23 years (i.e., 1988–2010). Freshwater flow was more influential than rainfall and SOI, and of the various sources of flow, the Brisbane River has the greatest volume and influence on Moreton Bay prawn catches. A number of time-lags were also considered. Flow in the month preceding catch (i.e., 30 days prior, Logflow1_30) and two months prior (31–60 days prior, Logflow31_60) had strong positive effects on Banana Prawn catch rates. Average air temperature in the preceding 4–6 months (Temp121_180) also had a large positive effect on Banana Prawn catch rates. Flow in the month immediately preceding catch (Logflow1_30) had a strong positive influence on Greasyback Prawn catch rates. Air temperature in the two months preceding catch (Temp1_60) had a large positive effect on Brown Tiger Prawn catch rates. No obvious or marked effects were detected for Eastern King Prawns, although interestingly, catch rates declined with increasing air temperature 4–6 months prior to catch. As most Eastern King Prawn catches in Moreton Bay occur in October to December, the results suggest catch rates decline with increasing winter temperatures. In most cases, prawn catch rates declined with the waxing lunar phase (high luminance/full moon) and increased with the waning moon (low luminance/new moon). The SOI explained little additional variation in prawn catch rates (<2%), although its influence was higher for Banana Prawns. Extrapolation of these findings to long-term climate change effects should be done with caution. That said, the results are consistent with likely increases in abundance in the region for the two tropical species, Banana Prawns and Brown Tiger Prawns, as coastal temperatures rise. Conversely, declines in abundance could be expected for the two temperate species, Greasyback and Eastern King Prawns.
Corporate management structures
An examination of alternative governance systems was requested by the industry at one of the early meetings, particularly systems that may give fishers greater autonomy in decision making as well as help improve the marketing of their product. Consequently, a review of alternative management systems was undertaken, with a particular focus on the potential for self-management of small fisheries (small in terms of number of participants) and corporate management. The review looks at systems that have been implemented or proposed for other small fisheries internationally, with a particular focus on self-management as well as the potential benefits and challenges of corporate management. This review also highlighted particular opportunities for the Moreton Bay prawn fishery.
Corporate management differs from other co-management and even self-management arrangements in that ‘ownership’ of the fishery is devolved to a company in which fishers and government are shareholders. The company manages the fishery and coordinates marketing to ensure that the best prices are received and that the catch taken meets the demands of the market. Coordinated harvesting will also result in increased profits, which are returned to fishers in the form of dividends. Corporate management offers many of the potential benefits of an individual quota system without formally implementing such a system. A corporate management model offers an advantage over a self-management model in that it can coordinate both marketing and management to take advantage of this unique geographical advantage. For such a system to be successful, the fishery needs to be relatively small and self-contained. Small in this sense is in terms of the number of operators. The Moreton Bay prawn fishery satisfies these key conditions for a successful self-management and potentially corporate management system. The fishery is small both in terms of number of participants and geography. Unlike other fisheries that have progressed down the self-management route, the key market for the product from the Moreton Bay fishery is right at its doorstep. Corporate management also presents a number of challenges. First, it will require changes in the way fishers operate. In particular, the decision on when to fish and what to catch will be taken away from the individual and decided by the collective. Problems will develop if individuals do not join the corporation but continue to fish and market their own product separately. While this may seem an attractive option to fishers who believe they can do better independently, it is likely to be just a short-term advantage with an overall long-run cost to themselves as well as the rest of the industry. There are also a number of other areas that need further consideration, particularly in relation to the allocation of shares, including who should be allocated shares (e.g. just boat owners or also some employed skippers), and, similarly, how harvesting activity is to be allocated by the corporation to the fishers. These are largely issues that cannot be answered without substantial consultation with those likely to be affected, and these groups cannot give the issues serious consideration until the point at which they are likely to become a reality. Given the current structure and complexity of the fishery, it is unlikely that such a management structure will be feasible in the short term. However, the fishery is a prime candidate for such a model, and development of such a management structure in the future should be considered as an option for the longer term.