1000 results for "environment"
Abstract:
Homestead fish culture is a recent innovation for the mass production of fish in backyards in Nigeria. Pond construction often results in soil disturbance, vegetation loss, and the creation of new aquatic environments. The paper discusses homestead ponds in Nigeria and their potential environmental impacts, which include erosion, flooding, pests and disease, accident risk, undesired fossil fuel production, vegetation destruction and effects on fish genetic conservation. Strategies for environmental management in relation to pond construction are suggested.
Abstract:
Gold Coast Water is responsible for the management of the water and wastewater assets of the City of the Gold Coast on Australia's east coast. Treated wastewater is released at the Gold Coast Seaway on an outgoing tide so that the plume is dispersed before the tide changes and re-enters the Broadwater estuary. Rapid population growth over the past decade has placed increasing demands on the receiving waters for the release of the City's effluent. The Seaway SmartRelease Project is designed to optimise the release of effluent from the City's main wastewater treatment plant (WWTP) in order to minimise the impact on estuarine water quality and maximise the cost efficiency of pumping. To do this, an optimisation study involving water quality monitoring, numerical modelling and a web-based decision support system was conducted. An intensive monitoring campaign provided information on water levels, currents, winds, waves, nutrients and bacterial levels within the Broadwater. These data were then used to calibrate and verify numerical models built with the MIKE by DHI suite of software. The decision support system collects continually measured data such as water levels, interacts with the WWTP SCADA system, runs the models in forecast mode and provides the optimal time window in which to release the required amount of effluent from the WWTP. The City's increasing population means that the length of time available for releasing the water with minimal impact may be exceeded within 5 years. Optimising the release of the treated water through monitoring, modelling and a decision support system has been an effective way of demonstrating the limited environmental impact of the expected short-term increase in effluent disposal.
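The core scheduling idea in the abstract (release only while the tide is outgoing, and fit the required volume into that window) can be sketched as follows. This is a minimal illustration, not the SmartRelease system itself: the tide series, pump rate and function names are all assumptions for the example.

```python
from datetime import datetime, timedelta

def outgoing_windows(tide_levels, times):
    """Return (start, end) spans where the tide level is falling (outgoing tide)."""
    windows, start = [], None
    for i in range(1, len(tide_levels)):
        falling = tide_levels[i] < tide_levels[i - 1]
        if falling and start is None:
            start = times[i - 1]
        elif not falling and start is not None:
            windows.append((start, times[i - 1]))
            start = None
    if start is not None:
        windows.append((start, times[-1]))
    return windows

def releasable_volume(window, pump_rate_ml_per_h):
    """Volume (ML) that can be pumped within a window at a fixed rate."""
    hours = (window[1] - window[0]).total_seconds() / 3600.0
    return hours * pump_rate_ml_per_h

# Illustrative hourly tide predictions (metres) over half a day.
t0 = datetime(2024, 1, 1)
times = [t0 + timedelta(hours=h) for h in range(12)]
levels = [0.2, 0.6, 1.0, 1.2, 1.0, 0.6, 0.2, 0.0, 0.3, 0.8, 1.1, 1.2]

wins = outgoing_windows(levels, times)
for w in wins:
    print(w[0].hour, w[1].hour, releasable_volume(w, 5.0))
```

In the real system the forecast tide would come from the calibrated hydrodynamic model rather than a fixed list, and the window would also be constrained by modelled plume dispersal, not just tide direction.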
Abstract:
In this reservoir, the parameters assessed are very important for fish culture. These are: physical parameters, which include temperature (°C) and transparency (m); chemical parameters, which include dissolved oxygen (mg/l) and pH; and biological parameters, which include phytoplankton and zooplankton. Phytoplankton and zooplankton identification and estimation were carried out in the NIFFR Limnology Laboratory (Green House), New Bussa. Each identified zooplankton and phytoplankton species was placed in its major group; e.g. zooplankton was divided into three groups: Rotifera, Cladocera and Copepoda. During the study period copepods had the highest total number of zooplankton both beside the poultry house and near the monk (stations 'A' and 'B'). Water temperature at station 'A' (beside the poultry house) ranged from 27°C to 29.5°C, as did that at station 'B' (near the monk). Dissolved oxygen at station 'A' ranged from 6.30 mg/l to 7.40 mg/l, while at station 'B' it ranged from 6.20 mg/l to 7.50 mg/l; transparency readings at station 'A' ranged from 0.19 m to 0.3 m, while at station 'B' they ranged from 0.22 m to 0.37 m. The last parameter, pH, was 8.2 at both stations, indicating that pH was constant. According to the literature, all the water parameter values obtained are suitable for fish culture.
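The abstract's closing claim (that the measured ranges suit fish culture) amounts to checking each observed range against suitability limits. A minimal sketch of that check, where the thresholds are commonly cited warm-water aquaculture values assumed for illustration, not taken from the paper:

```python
# Assumed suitability limits for warm-water fish culture (illustrative only).
SUITABLE = {
    "temperature_C": (25.0, 32.0),
    "dissolved_oxygen_mg_l": (5.0, float("inf")),
    "pH": (6.5, 9.0),
}

def range_suitable(observed, limits):
    """True if the whole observed (min, max) range lies inside the limits."""
    lo, hi = limits
    obs_lo, obs_hi = observed
    return lo <= obs_lo and obs_hi <= hi

# Station 'A' ranges as reported in the abstract.
station_a = {
    "temperature_C": (27.0, 29.5),
    "dissolved_oxygen_mg_l": (6.30, 7.40),
    "pH": (8.2, 8.2),
}

for param, obs in station_a.items():
    print(param, range_suitable(obs, SUITABLE[param]))
```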
Abstract:
Whereas some species may rely on periodic drought conditions for part of their life histories, or have life strategies suited to exploiting the habitat or changed environmental conditions that are created by drought, for other organisms it is a time of stress. Periodic drought conditions therefore generate a series of waves of colonization and extinctions. Studies on lowland wet grassland, in winterbournes and in the toiche zone of both ponds and rivers, also demonstrate that different organisms are competitively favoured with changing hydrological conditions, and that this process prevents any one species from overwhelming its competitors. Competitive impacts may be inter- and intraspecific. It is therefore apparent that the death of organisms such as adult fish during severe drought conditions, though traumatic for human onlookers and commercial interests, may be merely a regular occurrence to which the ecosystem is adapted. The variability of climatic conditions thereby provides a direct influence on the maintenance of biological diversity, and it is this very biodiversity that provides the ecosystem with the resilience to respond to environmental changes in both the short and the longer term.
Abstract:
The bulk of the European Community's water policy legislation was developed between the mid 1970s and the early 1990s. These directives addressed specific substances, sources, uses or processes, but caused problems through differing methods, definitions and aims. The Water Framework Directive (WFD) aims to resolve this piecemeal approach. The Environment Agency (EA) welcomed and supported the overall objective of establishing a coherent legislative framework. The EA has been discussing the implications of the WFD with European partners, has developed a timetable for implementation, and has a special team that will commission the necessary research.
Abstract:
Complexity in the earthquake rupture process can result from many factors. This study investigates the origin of such complexity by examining several recent, large earthquakes in detail. In each case the local tectonic environment plays an important role in understanding the source of the complexity.
Several large shallow earthquakes (Ms > 7.0) along the Middle American Trench have similarities and differences between them that may lead to a better understanding of fracture and subduction processes. They are predominantly thrust events consistent with the known subduction of the Cocos plate beneath N. America. Two events occurring along this subduction zone close to triple junctions show considerable complexity. This may be attributable to a more heterogeneous stress environment in these regions and as such has implications for other subduction zone boundaries.
An event which looks complex but is actually rather simple is the 1978 Bermuda earthquake (Ms ~ 6). It is located predominantly in the mantle. Its mechanism is one of pure thrust faulting with a strike N 20°W and dip 42°NE. Its apparent complexity is caused by local crustal structure. This is an important event in terms of understanding and estimating seismic hazard on the eastern seaboard of N. America.
A study of several large strike-slip continental earthquakes identifies characteristics which are common to them and may be useful in determining what to expect from the next great earthquake on the San Andreas fault. The events are the 1976 Guatemala earthquake on the Motagua fault and two events on the Anatolian fault in Turkey (the 1967 Mudurnu Valley and 1976 E. Turkey events). An attempt to model the complex P-waveforms of these events results in good synthetic fits for the Guatemala and Mudurnu Valley events. However, the E. Turkey event proves to be too complex, as it may have associated thrust or normal faulting. Several individual sources occurring at intervals of between 5 and 20 seconds characterize the Guatemala and Mudurnu Valley events. The maximum size of an individual source appears to be bounded at about 5 × 10^26 dyne-cm. A detailed source study including directivity is performed on the Guatemala event. The source time history of the Mudurnu Valley event illustrates its significance in modeling strong ground motion in the near field. The complex source time series of the 1967 event produces amplitudes greater by a factor of 2.5 than a uniform model scaled to the same size for a station 20 km from the fault.
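To put the quoted bound of about 5 × 10^26 dyne-cm per individual source in familiar terms, it can be converted to a moment magnitude with the standard Hanks–Kanamori relation (a later convention than this abstract, so this is an illustrative conversion, not one made by the author):

```python
import math

def moment_magnitude(m0_dyne_cm):
    """Hanks-Kanamori moment magnitude; M0 given in dyne-cm."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# The abstract's bound on an individual sub-event: ~5e26 dyne-cm.
print(round(moment_magnitude(5e26), 1))  # prints 7.1
```

So each sub-event in these multiple-source ruptures is bounded at roughly Mw 7.1, consistent with the great earthquakes discussed being built from sequences of such sources.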
Three large and important earthquakes demonstrate an important type of complexity --- multiple-fault complexity. The first, the 1976 Philippine earthquake, an oblique thrust event, represents the first seismological evidence for a northeast dipping subduction zone beneath the island of Mindanao. A large event, following the mainshock by 12 hours, occurred outside the aftershock area and apparently resulted from motion on a subsidiary fault since the event had a strike-slip mechanism.
An aftershock of the great 1960 Chilean earthquake on June 6, 1960, proved to be an interesting discovery. It appears to be a large strike-slip event at the main rupture's southern boundary. It most likely occurred on the landward extension of the Chile Rise transform fault, in the subducting plate. The results for this event suggest that a small event triggered a series of slow events; the duration of the whole sequence being longer than 1 hour. This is indeed a "slow earthquake".
Perhaps one of the most complex of events is the recent Tangshan, China event. It began as a large strike-slip event. Within several seconds of the mainshock it may have triggered thrust faulting to the south of the epicenter. There is no doubt, however, that it triggered a large oblique normal event to the northeast, 15 hours after the mainshock. This event certainly contributed to the great loss of life sustained as a result of the Tangshan earthquake sequence.
What has been learned from these studies has been applied to predict what one might expect from the next great earthquake on the San Andreas. The expectation from this study is that such an event would be a large complex event, not unlike, but perhaps larger than, the Guatemala or Mudurnu Valley events. That is to say, it will most likely consist of a series of individual events in sequence. It is also quite possible that the event could trigger associated faulting on neighboring fault systems such as those occurring in the Transverse Ranges. This has important bearing on the earthquake hazard estimation for the region.
Abstract:
Genetic engineering now makes possible the insertion of DNA from many organisms into other prokaryotic, eukaryotic and viral hosts. This technology has been used to construct a variety of such genetically engineered microorganisms (GEMs). The possibility of accidental or deliberate release of GEMs into the natural environment has recently raised much public concern. The prospect of deliberate release of these microorganisms has prompted an increased need to understand the processes of survival, expression, transfer and rearrangement of recombinant DNA molecules in microbial communities. The methodology which is being developed to investigate these processes will greatly enhance our ability to study microbial population ecology.
Abstract:
The biomass of the phytoplankton and its composition is one of the most important factors in water quality control. Determination of the phytoplankton assemblage is usually done by microscopic analysis (Utermöhl's method). Quantitative estimations of the biovolume, by cell counting and cell size measurements, are time-consuming and normally are not done in routine water quality control. Several alternatives have been tried: computer-based image analysis, spectral fluorescence signatures, flow cytometry and pigment fingerprinting aided by high performance liquid chromatography (HPLC). The latter method is based on the fact that each major algal group contains a specific carotenoid which can be used for identification and relative quantification of that group within the total assemblage. This article gives a brief comparative introduction to the different techniques available and presents some recent results obtained by HPLC-based pigment fingerprinting, applied to three lakes of different trophic status. The results show that this technique yields reliable results from different lake types and is a powerful tool for studying the distribution pattern of the phytoplankton community in relation to water depth. However, some restrictions should be taken into account in the interpretation of routine data.
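The marker-carotenoid idea can be sketched numerically: estimate each group's biomass from its diagnostic pigment concentration and an assumed pigment-to-biomass ratio, then normalise. This is a deliberately simplified illustration; the marker assignments and ratios below are assumptions, and real studies calibrate them per water body (e.g. with CHEMTAX-style matrix factorisation).

```python
# Each algal group mapped to (diagnostic carotenoid, pigment:biomass ratio).
# Both the marker choices and the ratios are illustrative assumptions.
MARKERS = {
    "diatoms": ("fucoxanthin", 0.6),
    "cyanobacteria": ("zeaxanthin", 0.3),
    "chlorophytes": ("lutein", 0.2),
}

def relative_composition(pigments_ug_l):
    """Relative group shares estimated from marker pigment concentrations."""
    biomass = {
        group: pigments_ug_l.get(pigment, 0.0) / ratio
        for group, (pigment, ratio) in MARKERS.items()
    }
    total = sum(biomass.values())
    return {g: b / total for g, b in biomass.items()} if total else biomass

# Hypothetical HPLC pigment concentrations (ug/l) for one depth sample.
sample = {"fucoxanthin": 1.2, "zeaxanthin": 0.9, "lutein": 0.4}
comp = relative_composition(sample)
print(comp)
```

Applied down a depth profile, such per-sample compositions are what allow the depth-distribution patterns mentioned in the abstract to be mapped without cell counting.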