6 results for Mathematical and Computer Modelling
in Aquatic Commons
Abstract:
The implementation of various types of marine protected areas is one of several management tools available for conserving representative examples of the biological diversity within marine ecosystems in general and National Marine Sanctuaries in particular. However, deciding where and how many sites to establish within a given area is frequently hampered by incomplete knowledge of the distribution of organisms and of the potential tradeoffs that would allow planners to address frequently competing interests in an objective manner. Fortunately, this is beginning to change. Recent studies on the continental shelf of the northeastern United States suggest that substrate and water mass characteristics are highly correlated with the composition of benthic communities and may therefore serve as proxies for the distribution of biological diversity. A detailed geo-referenced interpretative map of major sediment types within Stellwagen Bank National Marine Sanctuary (SBNMS) has recently been developed, and computer-aided decision support tools have reached new levels of sophistication. We demonstrate the use of simulated annealing, a type of mathematical optimization, to identify suites of potential conservation sites within SBNMS that equally represent 1) all major sediment types and 2) derived habitat types based on both sediment and depth, in the smallest amount of space. The Sanctuary was divided into 3,610 sampling units of 0.5 min² each. Simulations incorporated constraints on the physical dispersion of sampling units to varying degrees, such that solutions included between one and four site clusters. Target representation goals were set at 5, 10, 15, 20, and 25 percent of each sediment type, and 10 and 20 percent of each habitat type. Each simulation consisted of 100 runs, from which we identified the best solution (i.e., smallest total area) and four near-optimal alternates. We also plotted the total number of instances in which each sampling unit occurred in the solution sets of the 100 runs as a means of gauging the variety of spatial configurations available under each scenario. Results suggested that the total combined area needed to represent each of the sediment types in equal proportions was equal to the percent representation level sought. Slightly larger areas were required to represent all habitat types at the same representation levels. Total boundary length increased in direct proportion to the number of sites at all levels of representation for simulations involving sediment and habitat classes, but increased more rapidly with the number of sites at higher representation levels. There were a large number of alternate spatial configurations at all representation levels, although generally fewer among one- and two-site than among three- and four-site solutions. These differences were less pronounced among simulations targeting habitat representation, suggesting that a similar degree of flexibility is inherent in the spatial arrangement of potential protected area systems containing one versus several sites for similar levels of habitat representation. We attribute these results to the distribution of sediment and depth zones within the Sanctuary, and to the fact that even levels of representation were sought in each scenario. (PDF contains 33 pages.)
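The site-selection procedure this abstract describes can be sketched compactly. The following is a minimal, illustrative simulated-annealing sketch, not the study's implementation: the grid size, sediment labels, representation target, and penalty weight are all invented, and the real analysis additionally constrained site clustering and boundary length.

```python
import math
import random

# Minimal simulated-annealing sketch for reserve-site selection.
# Grid size, sediment labels, target, and penalty weight are illustrative only.

random.seed(1)

N = 20                               # hypothetical N x N grid of sampling units
SEDIMENTS = ["sand", "gravel", "mud", "rock"]
sediment = [[random.choice(SEDIMENTS) for _ in range(N)] for _ in range(N)]
TARGET = 0.10                        # represent 10% of each sediment type
PENALTY = 100.0                      # weight on unmet representation targets

totals = {s: sum(row.count(s) for row in sediment) for s in SEDIMENTS}

def cost(selected):
    """Total selected area plus a penalty for each unmet representation target."""
    held = {s: 0 for s in SEDIMENTS}
    for i, j in selected:
        held[sediment[i][j]] += 1
    shortfall = sum(max(0.0, TARGET * totals[s] - held[s]) for s in SEDIMENTS)
    return len(selected) + PENALTY * shortfall

def anneal(steps=50_000, t0=10.0):
    selected = set()
    best, best_cost = set(), cost(selected)
    current_cost = best_cost
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9           # linear cooling schedule
        cell = (random.randrange(N), random.randrange(N))
        trial = set(selected)
        trial.symmetric_difference_update({cell})   # flip one unit in or out
        c = cost(trial)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if c < current_cost or random.random() < math.exp((current_cost - c) / t):
            selected, current_cost = trial, c
            if c < best_cost:
                best, best_cost = set(trial), c
    return best, best_cost

solution, value = anneal()
print(f"selected {len(solution)} of {N * N} units, objective {value:.1f}")
```

Repeating the anneal from different random seeds (the study used 100 runs per scenario) yields the near-optimal alternates and the per-unit selection frequencies discussed above.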
Abstract:
Automatic recording instruments provide the ideal means of recording the responses of rivers, lakes and reservoirs to short-term changes in the weather. As part of the project ‘Using Automatic Monitoring and Dynamic Modelling for the Active Management of Lakes and Reservoirs', a family of three automatic monitoring stations was designed by engineers at the Centre for Ecology and Hydrology in Windermere to monitor such responses. In this article, the authors describe this instrument network in some detail and present case studies that illustrate the value of high-resolution automatic monitoring in both catchment and reservoir applications.
Abstract:
This article outlines the outcome of work that set out to provide one of the specified integral contributions to the overarching objectives of the EU-sponsored LIFE98 project described in this volume. Among others, these included a requirement to marry automatic monitoring and dynamic modelling approaches in the interests of securing better management of water quality in lakes and reservoirs. The particular task given to us was to devise the elements of an active management strategy for the Queen Elizabeth II Reservoir. This is one of the larger reservoirs supplying the population of the London area: after purification and disinfection, its water goes directly to the distribution network and to the consumers. The quality of the water in the reservoir is of primary concern, for the greater the content of biogenic materials, including phytoplankton, the more prolonged the purification and the more expensive the treatment. Whatever good phytoplankton may do by way of oxygenation and oxidative purification, it is eventually relegated to an impurity that has to be removed from the final product. Indeed, it has been estimated that the cost of removing algae and microorganisms from water represents about one quarter of its price at the tap. In chemically fertile waters, such as those typifying the resources of the Thames Valley, there is thus a powerful and ongoing incentive to minimise plankton growth in storage reservoirs. Indeed, the Thames Water company and its predecessor undertakings have a long and impressive history of confronting and quantifying the fundamentals of phytoplankton growth in their reservoirs and of developing strategies for operation and design to combat them. The work described here follows in this tradition. However, the use of the model PROTECH-D to investigate present phytoplankton growth patterns in the Queen Elizabeth II Reservoir questioned the interpretation of some of the recent observations. On the other hand, it has reinforced the theories underpinning the original design of this and the Thames Valley storage reservoirs constructed subsequently. The authors recount these experiences as an example of how simulation models can hone the theoretical base and its application to the practical problems of supplying water of good quality at economic cost, before the engineering is initiated.
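PROTECH-D itself simulates the dynamics of multiple phytoplankton species from their physiology; its equations are not reproduced in the abstract, so the sketch below is only a toy, single-population analogue. It illustrates the design lever the Thames Valley reservoirs exploit: net growth is the balance between a temperature- and light-dependent growth rate and loss terms such as flushing, so shortening retention time suppresses plankton. All functional forms and parameter values here are invented assumptions, not PROTECH-D's.

```python
import math

# Toy phytoplankton model: one population in a well-mixed reservoir.
# dX/dt = (r(T, I) - flushing - settling) * X
# Forms and parameter values are illustrative assumptions only.

def growth_rate(temp_c, light, r20=1.2, q10=2.0, half_sat=80.0):
    """Specific growth rate (per day): Q10 temperature scaling times
    Monod-type light limitation (light in W m^-2)."""
    temp_factor = q10 ** ((temp_c - 20.0) / 10.0)
    light_factor = light / (half_sat + light)
    return r20 * temp_factor * light_factor

def simulate(days=60, dt=0.1, x0=1.0, flushing=0.2, settling=0.05):
    """Euler integration of biomass X (mg chl-a m^-3)."""
    x, t = x0, 0.0
    series = []
    while t < days:
        temp = 14.0 + 6.0 * math.sin(2 * math.pi * t / 365.0)  # crude seasonality
        light = 120.0                                          # constant irradiance
        dxdt = (growth_rate(temp, light) - flushing - settling) * x
        x = max(0.0, x + dt * dxdt)
        t += dt
        series.append((t, x))
    return series

final_t, final_x = simulate()[-1]
print(f"biomass after {final_t:.0f} days: {final_x:.2f} mg chl-a m^-3")
```

With these made-up numbers, raising the flushing rate above the specific growth rate drives biomass down, which is the qualitative rationale behind short-retention operation of storage reservoirs.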
Abstract:
This paper presents an evaluation of the 15-week course on Training in Fisheries Planning and Management offered at the University of Namibia since 1991. The course includes instruction in fisheries technology, fisheries biology, fisheries law and the law of the sea, fisheries economics, fisheries sociology, environmental impact assessment, planning and management, the logical framework approach to planning, and computer literacy. Participants in the course have rated its various elements in a range of 2.9 to 4.7 out of a maximum of 5 points.
Abstract:
Bycatch, or the incidental catch of nontarget organisms during fishing operations, is a major issue in U.S. shrimp trawl fisheries. Because bycatch is typically discarded at sea, total bycatch is usually estimated by extrapolating from an observed bycatch sample to the entire fleet with either mean-per-unit or ratio estimators. Using both field observations of commercial shrimp trawlers and computer simulations, I compared five methods for generating bycatch estimates that were used in past studies, namely a mean-per-unit estimator and four forms of the ratio estimator: 1) the mean fish catch per unit of effort, where unit effort was a proxy for sample size; 2) the mean of the individual fish-to-shrimp ratios; 3) the ratio of mean fish catch to mean shrimp catch; 4) the mean of the ratios of fish catch per time fished (a variable measure of effort); and 5) the ratio of mean fish catch to mean time fished. For field data, the different methods used to estimate bycatch of Atlantic croaker, spot, and weakfish yielded extremely different results, with no discernible pattern in the estimates by method, geographic region, or species. Simulated fishing fleets were used to compare bycatch estimated by the five methods with “actual” (simulated) bycatch. Simulations were conducted by using both normal and delta-lognormal distributions of fish and shrimp and employed a range of values for several parameters, including mean catches of fish and shrimp, variability in the catches of fish and shrimp, variability in fishing effort, number of observations, and correlations between fish and shrimp catches. Results indicated that only the mean-per-unit estimator provided statistically unbiased estimates, whereas all other methods overestimated bycatch. The mean of the individual fish-to-shrimp ratios, the method used in the South Atlantic Bight before the 1990s, gave the most biased estimates. Because of the statistically significant two- and three-way interactions among parameters, it is unlikely that estimates generated by one method can be converted or corrected to estimates made by another method; therefore bycatch estimates obtained with different methods should not be compared directly.
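The five estimators are simple to state in code. The sketch below computes each from per-tow observations of fish catch, shrimp catch, and time fished; the sample data are invented for illustration and the units are arbitrary.

```python
from statistics import mean

# The five bycatch estimators compared in the abstract, computed from
# per-tow observations. The sample data below are made up for illustration.

# (fish_kg, shrimp_kg, hours_fished) per observed tow
tows = [(12.0, 3.0, 2.0), (8.0, 2.5, 1.5), (20.0, 4.0, 3.0),
        (5.0, 1.0, 1.0), (15.0, 3.5, 2.5)]

fish = [f for f, s, h in tows]
shrimp = [s for f, s, h in tows]
hours = [h for f, s, h in tows]

# 1) mean fish catch per unit of effort (per-tow mean; tow = unit effort)
m1 = mean(fish)
# 2) mean of the individual fish-to-shrimp ratios
m2 = mean(f / s for f, s in zip(fish, shrimp))
# 3) ratio of mean fish catch to mean shrimp catch
m3 = mean(fish) / mean(shrimp)
# 4) mean of the per-tow ratios of fish catch to time fished
m4 = mean(f / h for f, h in zip(fish, hours))
# 5) ratio of mean fish catch to mean time fished
m5 = mean(fish) / mean(hours)

print(f"1) mean per tow:            {m1:.3f} kg")
print(f"2) mean of fish/shrimp:     {m2:.3f} kg fish per kg shrimp")
print(f"3) mean fish / mean shrimp: {m3:.3f} kg fish per kg shrimp")
print(f"4) mean of fish/hour:       {m4:.3f} kg per hour")
print(f"5) mean fish / mean hours:  {m5:.3f} kg per hour")
```

In practice each statistic is scaled up to the fleet: method 1 by the total number of tows, methods 2 and 3 by total shrimp landings, and methods 4 and 5 by total time fished. Note that method 2 averages the per-tow ratios, so tows with small shrimp catches dominate it, which helps explain why it proved the most biased estimator in the simulations.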
Abstract:
Two groups of factors influence capture fisheries: fishery-independent factors, such as current, temperature, and salinity, and fishery-dependent factors, such as the type of fishing practised (trawling, gill netting, etc.), the mesh sizes used, and the intensity of fishing, i.e., the number of units of each type of fishing. Hence the assessment of capture fishery resources remains a puzzle even today. However, attempts have been made to develop suitable mathematical and statistical models for assessing them and for offering suggestions for judicious management of the resources. This paper indicates in brief the important characteristics of capture fisheries, their assessment, and their management, with particular reference to India.
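The abstract does not name particular models, but one classical example of the kind of mathematical model used in capture-fishery assessment is the Schaefer surplus-production model, sketched below. The parameter values are invented for illustration.

```python
# Illustrative Schaefer surplus-production model, a classical tool for
# capture-fishery assessment. All parameter values below are made up.

def schaefer_step(biomass, effort, r=0.5, k=1000.0, q=0.01):
    """One annual update: dB = rB(1 - B/K) - qEB; returns (new biomass, catch)."""
    catch = q * effort * biomass
    growth = r * biomass * (1.0 - biomass / k)
    return max(0.0, biomass + growth - catch), catch

# For this model, maximum sustainable yield is MSY = rK/4, taken at B = K/2.
r, k = 0.5, 1000.0
print(f"MSY = {r * k / 4:.0f} (at biomass K/2 = {k / 2:.0f})")

biomass = k  # start at carrying capacity
for year, effort in enumerate([10, 20, 30, 40, 50], start=1):
    biomass, catch = schaefer_step(biomass, effort, r=r, k=k)
    print(f"year {year}: effort={effort:2d}  catch={catch:6.1f}  biomass={biomass:6.1f}")
```

Models of this kind relate fishery-dependent quantities (effort, catchability) to stock biomass and underlie effort-control advice of the sort the paper alludes to.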