929 results for Adsorption. Zeolite 13X. Langmuir model. Dynamic modeling. Pyrolysis of sewage sludge
Abstract:
In this paper we investigate the trade-off faced by regulators who must set a price for an intermediate good somewhere between the marginal cost and the monopoly price. We utilize a growth model with monopolistic suppliers of intermediate goods. Investment in innovation is required to produce a new intermediate good. Marginal cost pricing deters innovation, while monopoly pricing maximizes innovation and economic growth at the cost of some static inefficiency. We demonstrate the existence of a second-best price above the marginal cost but below the monopoly price, which maximizes consumer welfare. Simulation results suggest that substantial reductions in consumption, production, growth, and welfare occur where regulators focus on static efficiency issues by setting prices at or near marginal cost.
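A rough sense of the second-best result can be conveyed with a toy welfare scan. The sketch below is illustrative only: it is not the authors' growth model, and every functional form and parameter value in it is an assumption chosen to reproduce the qualitative trade-off described above (static surplus falling, and innovation-driven growth rising, in the regulated price).

```python
# Illustrative only: a stylized welfare scan over a regulated price, not the
# authors' growth model. All functional forms and numbers are assumptions.
import numpy as np

MC = 1.0          # marginal cost of the intermediate good (assumed)
P_MONOPOLY = 2.0  # unconstrained monopoly price (assumed)

def static_surplus(p):
    # Static consumer surplus falls as the regulated price rises above MC
    # (deadweight loss from markup pricing).
    return 10.0 - 4.0 * (p - MC) ** 2

def growth_benefit(p):
    # Innovation incentive (and hence growth) rises with the markup,
    # with diminishing returns.
    return 6.0 * np.sqrt(max(p - MC, 0.0))

prices = np.linspace(MC, P_MONOPOLY, 201)
welfare = [static_surplus(p) + growth_benefit(p) for p in prices]
p_star = prices[int(np.argmax(welfare))]
print(f"second-best price in this toy example: {p_star:.2f} "
      f"(strictly between MC={MC} and monopoly price={P_MONOPOLY})")
```

In this toy setup the welfare-maximizing price lands strictly inside the interval, mirroring the second-best result the abstract reports.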
Abstract:
The research was aimed at developing a technology to combine the production of useful microfungi with the treatment of wastewater from food processing. A recycle bioreactor equipped with a micro-screen was developed as a laboratory-scale wastewater treatment system to contain a Rhizopus culture and maintain its dominance under non-aseptic conditions. Competitive growth of bacteria was observed, but this was minimised by manipulation of the solids retention time (SRT) and the hydraulic retention time (HRT). Removal of about 90% of the waste organic material (as BOD) from the wastewater was achieved simultaneously. Since essentially all fungi are retained behind the 100 μm aperture screen, the SRT could be controlled by the rate of harvesting. The HRT was used to control bacterial growth, as the bacteria were washed through the screen at a short HRT. A steady-state model was developed to determine these two parameters; this model predicts the effluent quality. Experimental work is still needed to determine the growth characteristics of the selected fungal species under optimum conditions of pH and temperature.
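As a rough illustration of how SRT and HRT act as separate levers, the following chemostat-style sketch assumes Monod kinetics and invented parameter values; it is not the authors' steady-state model.

```python
# Minimal chemostat-style sketch, not the authors' steady-state model.
# Bacteria pass the screen, so they see the dilution rate set by the HRT;
# the screened fungal biomass sees the SRT set by harvesting.
# All kinetic parameters below are assumed for illustration.

MU_MAX_BACTERIA = 0.3   # 1/h, assumed maximum bacterial growth rate
MU_MAX_FUNGUS = 0.15    # 1/h, assumed maximum fungal growth rate
KS_FUNGUS = 50.0        # mg BOD/L, assumed half-saturation constant

def bacteria_washed_out(hrt_h: float) -> bool:
    """Bacteria cannot persist if their dilution rate 1/HRT exceeds mu_max."""
    return 1.0 / hrt_h > MU_MAX_BACTERIA

def effluent_bod(srt_h: float) -> float:
    """Residual substrate for a retained culture growing at rate 1/SRT
    (standard Monod/chemostat relation S = Ks*D / (mu_max - D))."""
    d = 1.0 / srt_h
    if d >= MU_MAX_FUNGUS:
        raise ValueError("SRT too short: fungal culture would wash out")
    return KS_FUNGUS * d / (MU_MAX_FUNGUS - d)

print(bacteria_washed_out(hrt_h=3.0))               # True: a 3 h HRT washes bacteria out
print(f"{effluent_bod(srt_h=48.0):.1f} mg BOD/L")   # predicted residual BOD at a 48 h SRT
```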
Abstract:
The marginalisation of the teaching and learning of legal research in the Australian law school curriculum is, in the author's experience, a condition common to many law schools. This is reflected in the reluctance of some law teachers to include legal research skills in the substantive law teaching schedule, often the result of unwillingness on the part of law school administrators to provide the resources necessary to ensure that such integration does not place a disproportionately heavy burden of assessment on those teachers who attempt it. However, this may be only one of many reasons for the marginalisation of legal research in the law school experience. Rather than analyse the reasons for this marginalisation, this article deals with what needs to be done to rectify the situation and to ensure that the teaching of legal research can be integrated into the law school curriculum in a meaningful way. This requires the use of teaching and learning theory that focuses on student-centred learning. The article outlines a model of legal research incorporating five transparent stages: analysis, contextualisation, bibliographic skills, interpretation and assessment, and application.
Abstract:
The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that the next step towards more realistic experiments will require models containing a significantly larger number of particles than current ones, and thus a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e. to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles have shown that the parallel implementation can achieve a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
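For reference, parallel efficiency is conventionally defined as speedup divided by processor count. The snippet below shows that calculation; the ~80% figure comes from the abstract, but the runtimes used here are hypothetical.

```python
# How a ~80% parallel efficiency figure is typically computed; the runtimes
# below are invented for illustration, not benchmark data from the paper.

def parallel_efficiency(t_serial: float, t_parallel: float, n_procs: int) -> float:
    """Efficiency = speedup / number of processors, where speedup = T1 / TN."""
    speedup = t_serial / t_parallel
    return speedup / n_procs

# Hypothetical runtimes for one model size:
t1 = 6400.0   # seconds on 1 processor (assumed)
t128 = 62.5   # seconds on 128 processors (assumed)
print(f"efficiency: {parallel_efficiency(t1, t128, 128):.0%}")  # -> 80%
```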
Abstract:
Tropical deforestation is the major contemporary threat to global biodiversity, because a diminishing extent of tropical forests supports the majority of the Earth's biodiversity. Forest clearing is not a random process: it is often spatially concentrated in regions where human land-use pressures, either planned or unplanned, increase the likelihood of deforestation, and it often moves in waves originating from settled areas. We investigate the spatial dynamics of land cover change in a tropical deforestation hotspot in the Colombian Amazon. We apply a forest cover zoning approach which permitted: calculation of colonization speed; comparative spatial analysis of patterns of deforestation and regeneration; analysis of spatial patterns of mature and recently regenerated forests; and the identification of local-level hotspots experiencing the fastest deforestation or regeneration. The colonization frontline moved at an average of 0.84 km yr⁻¹ from 1989 to 2002, resulting in the clearing of 3400 ha yr⁻¹ of forest beyond the 90% forest cover line. The dynamics of forest clearing varied across the colonization front according to the amount of forest in the landscape, but clearing was spatially concentrated in well-defined 'local hotspots' of deforestation and forest regeneration. Behind the deforestation front, the transformed landscape mosaic is composed of cropping and grazing lands interspersed with mature forest fragments and patches of recently regenerated forests. We discuss the implications of these patterns of forest loss and fragmentation for biodiversity conservation within a framework of dynamic conservation planning.
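As a back-of-the-envelope illustration of the frontline-speed figure, the sketch below divides the displacement of the 90% forest-cover line by the length of the study period; the front positions used are hypothetical, and only the 1989-2002 window and the ~0.84 km yr⁻¹ result come from the abstract.

```python
# Back-of-the-envelope sketch of the frontline-speed calculation; the front
# positions below are hypothetical, only the 1989-2002 window and the reported
# ~0.84 km/yr figure come from the abstract.

def front_speed_km_per_yr(pos_start_km: float, pos_end_km: float,
                          year_start: int, year_end: int) -> float:
    """Average displacement per year of the colonization front
    (e.g. the 90% forest cover line)."""
    return (pos_end_km - pos_start_km) / (year_end - year_start)

# Hypothetical front positions relative to an arbitrary baseline:
speed = front_speed_km_per_yr(0.0, 10.9, 1989, 2002)
print(f"average front speed: {speed:.2f} km/yr")  # ~0.84 km/yr
```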