10 results for Peat.

in Queensland University of Technology - ePrints Archive


Relevance: 10.00%

Abstract:

Background: In developing countries, infectious diseases such as diarrhoea and acute respiratory infections are the main causes of mortality and morbidity in infants aged less than one year. The importance of exclusive breastfeeding (EBF) in the prevention of infectious diseases during infancy is well known. Although breastfeeding is almost universal in Bangladesh, rates of exclusive breastfeeding remain low. This cohort study was designed to compare the prevalence of diarrhoea and acute respiratory infection (ARI) in infants according to their breastfeeding status in a prospective cohort of infants followed from birth to six months of age.
Methods: A total of 351 pregnant women were recruited in the Anowara subdistrict of Chittagong. Breastfeeding practices and the 7-day prevalence of diarrhoea and ARI were recorded at monthly home visits. Prevalences were compared using chi-squared tests and logistic regression.
Results: A total of 272 mother-infant pairs completed the study to six months. Infants who were exclusively breastfed for six months had a significantly lower 7-day prevalence of diarrhoea [AOR for lack of EBF = 2.50 (95% CI 1.10, 5.69), p = 0.03] and a significantly lower 7-day prevalence of ARI [AOR for lack of EBF = 2.31 (95% CI 1.33, 4.00), p < 0.01] than infants who were not exclusively breastfed. However, when the association between patterns of infant feeding (exclusive, predominant and partial breastfeeding) and illness was investigated in more detail, there was no significant difference in the prevalence of diarrhoea between exclusively [6.6% (95% CI 2.8, 10.4)] and predominantly breastfed infants [3.7% (95% CI 0.09, 18.3), p = 0.56]. Partially breastfed infants had a higher prevalence of diarrhoea than the others [19.2% (95% CI 10.4, 27.9), p = 0.01]. Similarly, although there was a large difference in the prevalence of acute respiratory illness between exclusively [54.2% (95% CI 46.6, 61.8)] and predominantly breastfed infants [70.4% (95% CI 53.2, 87.6)], the difference was not significant (p = 0.17).
Conclusion: The findings suggest that exclusive or predominant breastfeeding can significantly reduce rates of morbidity in this region of rural Bangladesh.
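The abstract reports odds ratios with 95% confidence intervals for illness by feeding status. As an illustration of how an unadjusted odds ratio and its Wald confidence interval are derived from a 2×2 table (the counts below are hypothetical examples, not data from the study, and an unadjusted OR is a simplification of the adjusted ORs reported):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: diarrhoea among non-EBF vs EBF infants
or_, lo, hi = odds_ratio_ci(20, 80, 10, 162)  # OR = 4.05, CI roughly (1.8, 9.1)
```

In the study itself the odds ratios were adjusted via logistic regression, which controls for covariates; the calculation above shows only the crude 2×2 case.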

Relevance: 10.00%

Abstract:

Efficient management of domestic wastewater is a primary requirement for human well-being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and to more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation, resulting in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now regarded as the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation brings together significant work undertaken locally and overseas, and focuses mainly on septic tanks in keeping with the primary objectives of the project. The report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be used widely due to their simplicity and low cost. The treatment performance of septic tanks can be highly variable owing to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality.
The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantage of multi-chamber over single-chamber septic tanks is an issue that needs to be resolved in view of conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be attributed mainly to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for the treatment of wastewater, together with disinfection of effluent prior to disposal, is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number do not perform to stipulated standards and their effluent quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. Other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems, including intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages, and, as with aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices. In recent years the use of biofilters, and particularly of peat, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies.
This is an issue that needs further investigation, and as such biofilters can still be considered to be in the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in onsite sewage treatment. Subsurface disposal is the most common option, and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present: the processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. It is therefore important that soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferred soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area, owing to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment takes place in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent.
It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This is the main issue of concern, given the unreliability of effluent quality from aerobic systems. Secondly, it has been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances, surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. Despite all this, however, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern rests. Greywater requires similar consideration. Surface irrigation of greywater is currently permitted in a number of local authority jurisdictions in Queensland. Considering that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease. This is a concern, as greywater can be considered a weak to medium sewage: it contains primary pollutants such as BOD material and nutrients, and may also include microbial contamination. Its use for surface irrigation can therefore pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subject to stringent guidelines. Under these circumstances the surface application of any wastewater requires careful consideration.
The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions, so their applicability is location specific. The design of systems based solely on evapotranspiration is also questionable; to ensure greater reliability, such systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest option under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Once the clogging mat forms, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics, and the mechanisms of its formation by various physical, chemical and biological processes. Biological clogging is the most common process and occurs when bacterial growth or its by-products reduce the soil pore diameters; it is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process.
This is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate; this is in fact the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated either to control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention, although research conclusions with regard to short rest periods are contradictory. It has been claimed that intermittent rest periods result in aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to completely decompose the clogging mat, and that the intermediate by-products formed by aerobic processes would in fact lead to even more severe clogging. It has been further recommended that rest periods should be much longer, in the range of about six months, which entails the provision of a second, alternating seepage bed. Other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that eventuates after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. Physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface have been shown to be of only short-term benefit.
Another important issue is the degree of pretreatment that should be provided to the effluent prior to subsurface application, and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat, and that the nature of the suspended solids is also important. The finer particles from extended aeration systems, compared with those from septic tanks, penetrate deeper into the soil and hence ultimately cause a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies: it has also been shown that effluent quality may be a factor in highly permeable soils but not in fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by surface ponding of effluent or seepage of contaminants into the groundwater, can be very serious, leading to environmental and public health impacts. Significant microbial contamination of surface water and groundwater has been attributed to septic tank effluent, and there are a number of documented instances of septic-tank-related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A because no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms.
The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed, and dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Based on subsurface conditions, this essentially entails a maximum allowable density of septic tanks in a given area. Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems; this is likely to happen if saturated conditions persist under the soil absorption bed or if effluent runs off the surface as a result of system failure. Soils also have a finite capacity for the removal of phosphorus; once this capacity is exceeded, phosphorus too will seep into the groundwater, and the relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is important to ensure not only that the system design is based on subsurface conditions, but also that the density of these systems in a given area is controlled. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site determines the overall capability classification for that site, which in turn dictates the type of effluent disposal method to be adopted.
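The land capability approach described above rates each site attribute and takes the most limiting factor as the overall site class. A minimal sketch of that most-limiting-factor logic (the attribute names and ratings below are hypothetical examples, not drawn from the report):

```python
# Most-limiting-factor classification: the worst-rated attribute
# sets the overall site capability class (1 = least limited,
# higher numbers = more severely limited). Attribute ratings
# below are hypothetical examples.

def capability_class(ratings: dict) -> int:
    """Overall class = rating of the most limiting attribute."""
    return max(ratings.values())

site = {
    "soil_permeability": 2,
    "depth_to_water_table": 3,
    "slope": 1,
    "flooding_risk": 2,
}
print(capability_class(site))  # -> 3 (limited by depth to water table)
```

The design choice mirrors the text: a single severe limitation (here, a shallow water table) downgrades the whole site regardless of how favourable the other attributes are.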

Relevance: 10.00%

Abstract:

This chapter introduces the principles of coring, including the objectives for taking a good core and the major factors that should be considered to ensure the collection of undisturbed, representative core samples. The chapter also provides an overview of the design, function, and operation of the main types of coring equipment that can be used to sample sediments from a variety of settings including lakes, the ocean, peat bogs, soils, and permafrost. The major advantages and disadvantages of each type of corer are also discussed.

Relevance: 10.00%

Abstract:

In the Australian sugar industry, sugar cane is smashed into a straw-like material by hammers before being squeezed between large rollers to extract the sugar juice. This material is initially called prepared cane, and then bagasse as it passes through successive roller milling units. The sugar cane materials are highly compressible, have a high moisture content, are fibrous, and resemble some peat soils in both appearance and mechanical behaviour. A promising avenue for improving the performance of milling units, to increase throughput and juice extraction and to reduce costs, is modelling of the crushing process. To achieve this, it is believed necessary that milling models be able to reproduce measured bagasse behaviour. This investigation sought to measure the mechanical (compression, shear, and volume) behaviour of prepared cane and bagasse, to identify limitations in currently used material models, and to progress towards a material model that can predict bagasse behaviour adequately. Tests were carried out using modified direct shear test equipment and procedures over most of the large range of pressures occurring in the crushing process. The investigation included an assessment, using finite element modelling, of the performance of the direct shear test for measuring bagasse behaviour. It was shown that prepared cane and bagasse exhibit critical state behaviour similar to that of soils, and the magnitudes of the material parameters were determined. The measurements were used to identify desirable features for a bagasse material model. It was shown that currently used material models have major limitations in reproducing bagasse behaviour. A model from the soil mechanics literature was modified and shown to achieve improved reproduction while using magnitudes of material parameters that better reflected the measured values. Finally, a typical three-roller mill pressure feeder configuration was modelled.
The predictions and limitations were assessed by comparison to measured data from a sugar factory.
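Critical state behaviour, which the study found bagasse shares with soils, is conventionally summarised by the critical state line: q = M·p′ in stress space and e = Γ − λ·ln p′ in compression space. A minimal numeric sketch of these standard relations (the parameter values M, Γ and λ below are illustrative, not the fitted bagasse values from the thesis):

```python
import math

# Standard critical state line relations from soil mechanics:
#   deviator stress  q = M * p'
#   void ratio       e = Gamma - lam * ln(p')
# Parameter values are illustrative only, not fitted bagasse values.
M, Gamma, lam = 1.2, 2.0, 0.15

def critical_state(p_eff):
    """Return (q, e) on the critical state line at mean effective stress p' (kPa)."""
    q = M * p_eff
    e = Gamma - lam * math.log(p_eff)
    return q, e

q, e = critical_state(100.0)  # at p' = 100 kPa: q = 120 kPa, e ≈ 1.31
```

At critical state the material shears at constant volume and constant stress ratio q/p′ = M, which is what makes the framework attractive for a compressible fibrous material like bagasse.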

Relevance: 10.00%

Abstract:

When crystallization screening is conducted, many outcomes are observed, but typically the only trial recorded in the literature is the condition that yielded the crystal(s) used for subsequent diffraction studies. The initial hit that was optimized and the results of all the other trials are lost. These missing results contain information that would be useful for an improved general understanding of crystallization. This paper reports on a crystallization data exchange (XDX) workshop, organized by several international large-scale crystallization screening laboratories, to discuss how this information may be captured and utilized. A group that administers a significant fraction of the world's crystallization screening results was convened, together with chemical and structural data informaticians and computational scientists who specialize in creating and analysing large disparate data sets. The development of a crystallization ontology for the crystallization community was proposed. This paper (by the attendees of the workshop) provides the thoughts and rationale leading to this conclusion. It is brought to the attention of the wider audience of crystallographers so that they are aware of these early efforts and can contribute to the process going forward. © 2012 International Union of Crystallography. All rights reserved.

Relevance: 10.00%

Abstract:

Coastal subsidence causes sea-level rise, shoreline erosion and wetland loss, which pose a threat to coastal populations. This is especially evident in the Mississippi Delta in the southern United States, which was devastated by Hurricane Katrina in 2005; the loss of protective wetlands is considered a critical factor in the extensive flood damage. The causes of subsidence in coastal Louisiana, attributed to factors as diverse as shallow compaction and deep crustal processes, remain controversial, and current estimates of subsidence rates vary by several orders of magnitude. Here, we use a series of radiocarbon-dated sediment cores from the Mississippi Delta to analyse late Holocene deposits and assess compaction rates. We find that millennial-scale compaction rates, primarily associated with peat, can reach 5 mm per year, values that exceed recent model predictions. Locally, and on timescales of decades to centuries, rates are likely to be 10 mm or more per year. We conclude that compaction of Holocene strata contributes significantly to the exceptionally high rates of relative sea-level rise and coastal wetland loss in the Mississippi Delta, and is likely to cause subsidence in other organic-rich and often densely populated coastal plains.
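Millennial-scale compaction rates of the kind quoted above are obtained by comparing the present depth of a radiocarbon-dated layer with the elevation at which it originally formed. A simplified sketch of that calculation (the depths and ages below are hypothetical, not the study's data):

```python
def compaction_rate_mm_per_yr(formation_elev_m, present_depth_m, age_yr):
    """Average lowering rate of a dated layer relative to the elevation
    at which it formed, in mm per year. present_depth_m is depth below
    the modern surface (positive downward)."""
    displacement_m = formation_elev_m - (-present_depth_m)
    return displacement_m * 1000.0 / age_yr

# Hypothetical example: a basal peat that formed near 0 m (sea level)
# 2000 years ago now lies 6 m below the surface.
rate = compaction_rate_mm_per_yr(0.0, 6.0, 2000)  # -> 3.0 mm/yr average
```

In practice the study's rates also account for eustatic sea-level change and deeper subsidence components; the sketch isolates only the displacement-over-age arithmetic.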

Relevance: 10.00%

Abstract:

The Great Sandy Region (incorporating Fraser Island and the Cooloola sand-mass), south-east Queensland, contains a significant area of Ramsar-listed coastal wetlands, including the globally important patterned fen complexes. These mires form an elaborate network of pools surrounded by vegetated peat ridges and are the only known subtropical, Southern Hemisphere examples, with wetlands of this type typically located in high northern latitudes. Sedimentological, palynological and charcoal analyses from the Wathumba and Moon Point complexes on Fraser Island indicate two periods of swamp formation (that may contain patterned fens), one commencing ~12 000 years ago (Moon Point) and the other ~4300 years ago (Wathumba). Wetland formation and development is thought to be related to a combination of biological and hydrological processes, with the dominant peat-forming rush, Empodisma minus, being an important component of both patterned and non-patterned mires within the region. In contrast to Northern Hemisphere paludifying systems, the patterning appears to initiate at the start of wetland development or as part of an infilling process. The wetlands dominated by E. minus are highly resilient to disturbance, particularly burning and sea-level alterations, and appear to have formed important refuge areas for amphibians, fish and birds (both non-migratory and migratory) over thousands of years.