7 results for Efficient use of land
in University of Queensland eSpace - Australia
Abstract:
The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency, and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (that adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. The present paper presents two algorithmic enhancements to the GML method that retain its strengths, but which overcome its weaknesses in the face of local optima. Using the first of these methods an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality when progress of the parameter estimation process is slowed either by numerical instability incurred through problem ill-posedness, or when a local objective function minimum is encountered. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space. 
This can provide a useful means of inquiring into the well-posedness of a parameter estimation problem, and for detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model run efficiency for the new method. (c) 2006 Elsevier B.V. All rights reserved.
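The second enhancement described above, restarting successive runs from points maximally removed from previous parameter trajectories, can be sketched as a simple max-min heuristic: sample candidate points in the feasible region and keep the one whose distance to the nearest previously visited point is largest. This is an illustrative reconstruction, not the paper's algorithm; the function name, the uniform candidate sampling, and the Euclidean metric are all assumptions.

```python
import math
import random

def pick_restart_point(visited, bounds, n_candidates=1000, seed=0):
    """Choose a new starting point whose distance to the nearest previously
    visited parameter point is maximal (a simplified max-min heuristic).

    visited: list of previously visited parameter vectors
    bounds:  list of (lo, hi) pairs, one per parameter
    """
    rng = random.Random(seed)
    best, best_d = None, -1.0
    for _ in range(n_candidates):
        cand = [rng.uniform(lo, hi) for lo, hi in bounds]
        # distance to the closest point on any previous trajectory
        d = min(math.dist(cand, v) for v in visited)
        if d > best_d:
            best, best_d = cand, d
    return best
```

With all earlier trajectories clustered in one corner of parameter space, the chosen restart lands near the opposite corner, which is the behaviour the abstract describes for escaping a previously found minimum.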
Resumo:
The feasibility of using photosynthetic sulfide-oxidizing bacteria to remove sulfide from wastewater in circumstances where axenic cultures are unrealistic has been completely reconsidered on the basis of known ecophysiological data, and the principles of photobioreactor and chemical reactor engineering. This has given rise to the development of two similar treatment concepts relying on biofilms dominated by green sulfur bacteria (GSB) that develop on the exterior of transparent surfaces suspended in the wastewater. The GSB are sustained and selected for by radiant energy in the band 720-780 nm, supplied from within the transparent surface. A model of one of these concepts was constructed and with it the reactor concept was proven. The dependence of sulfide-removal rate on bulk sulfide concentration has been ascertained. The maximum net areal sulfide removal rate was 2.23 g m⁻² day⁻¹ at a bulk sulfide concentration of 16.5 mg L⁻¹ and an incident irradiance of 1.51 W m⁻². The system has a demonstrated capacity to mitigate surges in sulfide load, and appears to use much less radiant power than comparable systems. The efficacy with which this energy was used for sulfide removal was 1.47 g day⁻¹ W⁻¹. The biofilm was dominated by GSB, and evidence gathered indicated that other types of phototrophs were not present. (C) 2004 Wiley Periodicals, Inc.
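The reported radiant-power efficacy follows directly from the two figures quoted in the abstract: dividing the maximum areal removal rate by the incident irradiance reproduces the stated value to within rounding of the published numbers. A quick check:

```python
# Values taken from the abstract above.
areal_rate = 2.23   # g m^-2 day^-1, maximum net areal sulfide removal rate
irradiance = 1.51   # W m^-2, incident irradiance

# Radiant-power efficacy: grams of sulfide removed per day per watt.
efficacy = areal_rate / irradiance   # g day^-1 W^-1, approx. 1.48
```

The small difference from the published 1.47 g day⁻¹ W⁻¹ presumably reflects the unrounded measurements used in the original calculation.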
Resumo:
Intensive animal industries create large volumes of nutrient-rich effluent, which, if untreated, has the potential for substantial environmental degradation. Aquatic plants in aerobic lagoon systems have the potential to achieve inexpensive and efficient remediation of effluent, and to recover valuable nutrients that would otherwise be lost. Members of the family Lemnaceae (duckweeds) are widely used in lagoon systems, but despite their widespread use in the cleansing of sewage, only limited research has been conducted into their growth in highly eutrophic media, and little has been done to systematically distinguish between different types of media. This study examined the growth characteristics of duckweed in abattoir effluent, and explored possible ways of ameliorating the factors that inhibit growth on this medium. A series of pot trials was conducted to test the tolerance of duckweed to abattoir effluent partially remediated by a sojourn in anaerobic fermentation ponds, both in its unmodified form and after the addition of acid to manipulate pH and of bentonite. Unmodified abattoir effluent was highly toxic to duckweed, although duckweed remained viable and grew sub-optimally in media with total ammonia nitrogen (TAN) concentrations of up to 100 mg/L. Duckweed also grew vigorously in effluent diluted 1:4 v/v (56 mg TAN/L) that was modified by adding acid to decrease the pH to 7 and by adding bentonite (0.5%).
Resumo:
Sorghum ergot, caused by Claviceps africana, has remained a major disease problem in Australia since it was first recorded in 1996, and is the focus of a range of biological and integrated management research. Artificial inoculation using conidial suspensions is an important tool in this research. Ergot infection is greatly influenced by environmental factors, so it is important to reduce controllable sources of variation such as inoculum concentration. The use of optical density was tested as a method of quantifying conidial suspensions of C. africana, as an alternative to haemocytometer counts. The method was found to be accurate and time-efficient, with possible applications in other disease systems.
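In practice, the optical-density approach amounts to a one-off calibration of spectrophotometer readings against haemocytometer counts, after which the concentration of a new suspension is read off the fitted line. A minimal sketch, assuming a linear OD-count relationship; the calibration values below are invented for illustration and are not data from the study:

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical calibration data: optical-density readings paired with
# haemocytometer counts (conidia per mL). Illustrative values only.
od     = [0.1, 0.2, 0.4, 0.8]
counts = [1.1e6, 2.0e6, 4.1e6, 7.9e6]

slope, intercept = fit_line(od, counts)

def conidia_per_ml(od_reading):
    """Estimate conidial concentration from a spectrophotometer reading."""
    return slope * od_reading + intercept
```

Once the line is fitted, quantifying a suspension takes one spectrophotometer reading instead of a haemocytometer count, which is the time saving the abstract refers to.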
Resumo:
Retrieving large amounts of information over wide area networks, including the Internet, is problematic due to issues arising from latency of response, lack of direct memory access to data-serving resources, and fault tolerance. This paper describes a design pattern for handling the results of queries that return large amounts of data. Typically these queries would be made by a client process across a wide area network (or the Internet), through one or more middle tiers, to a relational database residing on a remote server. The solution combines several data retrieval strategies: iterators that traverse data sets while presenting an appropriate level of abstraction to the client, double-buffering of data subsets, multi-threaded data retrieval, and query slicing. The design has recently been implemented and incorporated into the framework of a commercial software product developed at Oracle Corporation.
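The combination of strategies described, slicing the query into pages, prefetching upcoming slices on a worker thread (double buffering), and hiding it all behind an iterator, can be sketched as follows. This is an illustrative Python reconstruction, not the Oracle implementation the abstract refers to; `fetch_slice` is a hypothetical stand-in for the remote paged query call.

```python
import queue
import threading

def sliced_query_iterator(fetch_slice, total_rows, slice_size=1000, prefetch=2):
    """Iterate over a large result set slice by slice.

    A background thread fetches upcoming slices into a bounded queue
    (double buffering) while the caller consumes the current one, hiding
    network latency behind a plain iterator interface.
    fetch_slice(offset, limit) stands in for the remote query call.
    """
    buf = queue.Queue(maxsize=prefetch)   # bounds memory used for prefetch
    SENTINEL = object()

    def producer():
        for offset in range(0, total_rows, slice_size):
            buf.put(fetch_slice(offset, slice_size))   # blocks when buffer full
        buf.put(SENTINEL)                              # signal end of results

    threading.Thread(target=producer, daemon=True).start()
    while (chunk := buf.get()) is not SENTINEL:
        yield from chunk

# Usage with an in-memory list standing in for the remote database:
rows = list(range(10))
fetched = list(sliced_query_iterator(lambda off, n: rows[off:off + n],
                                     total_rows=len(rows), slice_size=4))
```

The bounded queue is what makes this double buffering rather than unbounded prefetch: the producer stalls once `prefetch` slices are waiting, so memory use stays constant no matter how large the result set is.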