Abstract:
Over the past ten years, scaled-up utilisation of a previously under-exploited zeolite, Zeolite N, has been demonstrated for selective ion exchange of ammonium and other ions in aqueous environments. As with many zeolite syntheses, the source material should contain predictable levels of aluminium and silicon; for full-scale industrial applications, kaolin and/or montmorillonite serve this purpose. Field, pilot and commercial scale trials of kaolin-derived Zeolite N have focused on applications in agriculture and water treatment, as these sectors are primary producers or users of ammonium. The format of the material – fine powders, granules or extrudates – depends on the specific application, although each format has been evaluated.
Abstract:
Despite the existence of air quality guidelines in Australia and New Zealand, concentrations of particulate matter have exceeded these guidelines on several occasions. To identify the sources of particulate matter, examine the contributions of the sources to air quality in specific areas and estimate the most likely locations of the sources, a growing number of source apportionment studies have been conducted. This paper provides an overview of the locations of the studies and salient features of the results obtained, and offers some perspectives for the improvement of future receptor modelling of air quality in these countries. The review revealed that, because of its advantages over alternative models, Positive Matrix Factorisation (PMF) was the most commonly applied model in the studies. Although there were differences in the sources identified in the studies, some general trends were observed. While biomass burning was a common problem in both countries, the characteristics of this source varied from one location to another. In New Zealand, domestic heating was the highest contributor to particle levels on days when the guidelines were exceeded. On the other hand, forest back-burning was a concern in Brisbane, while marine aerosol was a major source in most studies. Secondary sulphate, traffic emissions, industrial emissions and re-suspended soil were also identified as important sources. Unique data, such as volatile organic compounds and particle size distributions, were incorporated into some of the studies, with results that have significant ramifications for the improvement of air quality. Overall, the application of source apportionment models provided useful information that can assist the design of epidemiological studies and refine air pollution reduction strategies in Australia and New Zealand.
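The receptor-modelling idea behind PMF can be sketched in a few lines: an observed sample-by-species concentration matrix is factorised into non-negative source contributions and source profiles. The sketch below uses plain unweighted multiplicative updates, whereas PMF proper additionally weights each matrix element by its measurement uncertainty; all data, dimensions and the number of sources here are synthetic and illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 100 samples x 8 chemical species, built from 3 hidden sources.
true_G = rng.random((100, 3))   # source contributions per sample
true_F = rng.random((3, 8))     # source profiles (species signatures)
X = true_G @ true_F + 0.01 * rng.random((100, 8))

def factorise(X, k, iters=500):
    """Non-negative factorisation X ~ G @ F via multiplicative updates.

    PMF additionally divides each residual by its measurement
    uncertainty; this unweighted version illustrates the core idea only.
    """
    n, m = X.shape
    G = rng.random((n, k)) + 1e-3
    F = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)   # update profiles
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)   # update contributions
    return G, F

G, F = factorise(X, k=3)
residual = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative residual: {residual:.3f}")
```

Because both factors are constrained to be non-negative, the recovered profiles can be read directly as candidate source signatures, which is the property that makes this family of models attractive for apportionment work.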
Abstract:
In the current market, extensive software development is taking place and the software industry is thriving. Major software companies have cited source code theft as a major threat to revenue. By inserting an identity-establishing watermark in the source code, a company can prove its ownership of the code. In this paper, we propose a watermarking scheme for C/C++ source code that exploits the language's restrictions: if a function calls another function, the latter must be defined in the code before the former, unless function pre-declarations are used. We embed the watermark by imposing an ordering on mutually independent functions through the introduction of bogus dependencies. Removing these dependencies to erase the watermark requires extensive manual intervention, making the attack infeasible. The scheme is also secure against subtractive and additive attacks. Using our watermarking scheme, an n-bit watermark can be embedded in a program having n independent functions. The scheme is implemented on several sample codes and the resulting performance changes are analysed.
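The ordering idea can be illustrated with a toy sketch (written in Python for brevity): given mutually independent function names in a canonical alphabetical order, one bit is hidden in the relative order of each disjoint pair. In the actual scheme the chosen order would be enforced in the C/C++ file through bogus call dependencies; the pairing rule and the `embed`/`extract` helpers below are illustrative, not the authors' exact construction.

```python
def embed(names, bits):
    """Hide one bit in each disjoint pair of independent functions.

    names: function names in canonical (alphabetical) order.
    bit 1 -> keep the pair's canonical order; bit 0 -> swap the pair.
    """
    assert len(bits) <= len(names) // 2
    order = list(names)
    for i, bit in enumerate(bits):
        if bit == 0:
            order[2 * i], order[2 * i + 1] = order[2 * i + 1], order[2 * i]
    return order

def extract(order, n_bits):
    """Recover the bits by comparing the file order to canonical order."""
    canonical = sorted(order)
    bits = []
    for i in range(n_bits):
        a, b = canonical[2 * i], canonical[2 * i + 1]
        bits.append(1 if order.index(a) < order.index(b) else 0)
    return bits

# Hypothetical function names, already in alphabetical order.
names = ["f_a", "f_b", "f_c", "f_d", "f_e", "f_f"]
order = embed(names, [1, 0, 1])
print(order, extract(order, 3))
```

An attacker who merely reorders functions destroys the watermark only if the bogus dependencies are first removed, which is the manual-intervention barrier the abstract describes.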
Abstract:
Several significant studies have been conducted in recent decades toward understanding road traffic noise and its effects on residential balconies. These previous studies have used a variety of techniques, such as theoretical models, scale models and measurements on real balconies, and have considered road traffic noise levels within the balcony space, inside an adjacent habitable room, or both. Previous theoretical models have used, for example, simplified specular reflection calculations, boundary element methods (BEM), adaptations of CoRTN or Sabine theory. This paper presents an alternative theoretical model to predict the effects of road traffic noise spatially within the balcony space. The model includes a specular reflection component that calculates up to 10 orders of source images. To account for diffusion effects, a two-compartment radiosity component is utilised: the first compartment is the urban street, represented as a street with building facades on either side; the second is the balcony space. The model calculates predicted road traffic noise levels within the balcony space and is capable of establishing the effect of changing street and balcony geometries. Screening attenuation algorithms are included to determine the effects of solid balcony parapets and balcony ceiling shields.
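The specular component can be illustrated with a 1-D image-source sketch: a source at position s between two parallel reflecting facades a distance W apart generates mirror images at 2nW + s and 2nW - s, and the level at a receiver is the energy sum over all images up to a chosen reflection order. This is a deliberately simplified sketch of the idea (one dimension, a single frequency-independent reflection coefficient, no balcony geometry or radiosity term); the per-bounce coefficient `beta` is an assumed illustrative value, not a parameter from the paper.

```python
import math

def image_positions(src, width, order):
    """1-D mirror images of a source between facades at x=0 and x=width.

    The image at 2*n*width + src has undergone 2|n| reflections; the
    image at 2*n*width - src has undergone |2n - 1| reflections.
    """
    imgs = []
    for n in range(-order, order + 1):
        imgs.append((2 * n * width + src, 2 * abs(n)))
        imgs.append((2 * n * width - src, abs(2 * n - 1)))
    return [(x, k) for x, k in imgs if k <= order]

def spl_at(receiver, src, width, order, beta=0.9):
    """Sum squared pressures (free-field 1/r decay) over all images up
    to the requested reflection order, applying a reflection
    coefficient beta per bounce (beta is illustrative)."""
    total = 0.0
    for x, k in image_positions(src, width, order):
        r = abs(receiver - x)
        total += (beta ** k / max(r, 1e-6)) ** 2
    return 10.0 * math.log10(total)

# Source 2 m from one facade of a 20 m street, receiver 10 m away.
print(spl_at(10.0, 2.0, 20.0, order=0))   # direct sound only
print(spl_at(10.0, 2.0, 20.0, order=4))   # with reflections
```

Raising the order adds image energy, so the predicted level converges from below; a full model in the spirit of the paper would extend this to 3-D geometry and add the radiosity compartments for diffuse reflections.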
Abstract:
Declining fossil fuel reserves, a need for increased energy security and concerns over carbon emissions from fossil fuel use are the global drivers for alternative, renewable biosources of fuels and chemicals. In the present study, the identification of long-chain (C29–C33) saturated hydrocarbons from Nicotiana glauca leaves is reported. The occurrence of these hydrocarbons was detected by gas chromatography–mass spectrometry (GC–MS), and identification was confirmed by comparison of physico-chemical properties with those of the available authentic standards. A simple, robust procedure was developed to generate an extract containing a high percentage of hydrocarbons (6.3% by weight of dried leaf material), higher than previously reported in other higher plant species. Consequently, it is concluded that N. glauca could be a crop of greater importance for biofuel production than previously recognised. The plant can be grown on marginal lands, negating the need to compete with food crops or farmland, and the hydrocarbon extract can be produced in a non-invasive manner, leaving the remaining biomass intact for bioethanol production and the generation of valuable co-products.
Abstract:
The Source Monitoring Framework is a promising model of constructive memory, yet fails because it is connectionist and does not allow content tagging. The Dual-Process Signal Detection Model is an improvement because it reduces mnemic qualia to a single memory signal (or degree of belief), but still commits itself to non-discrete representation. By supposing that 'tagging' means the assignment of propositional attitudes to aggregates of mnemic characteristics, informed inductively, a discrete model becomes plausible. A Bayesian model of source monitoring accounts for the continuous variation of inputs and the assignment of prior probabilities to memory content. A modified version of the High-Threshold Dual-Process model is recommended for further source monitoring research.
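A minimal sketch of the Bayesian account, assuming a two-source case (a memory attributed to perception vs. imagination) and a single observed characteristic: the prior probability of each source is combined with the likelihood of the observed characteristic under each source, and a high-threshold decision rule attributes the memory to perception only when the posterior clears a criterion. All probabilities and the criterion value below are illustrative assumptions, not figures from the paper.

```python
def source_posterior(prior_perceived, p_feat_perceived, p_feat_imagined):
    """Bayes' rule for two-source attribution: probability the memory
    originated in perception, given one observed mnemic characteristic
    (e.g. high perceptual detail)."""
    num = prior_perceived * p_feat_perceived
    return num / (num + (1 - prior_perceived) * p_feat_imagined)

def attribute(posterior_prob, criterion=0.75):
    """High-threshold flavour: claim 'perceived' only above a criterion,
    otherwise withhold the attribution (criterion is illustrative)."""
    return "perceived" if posterior_prob >= criterion else "uncertain"

# Vivid perceptual detail, equal priors over the two sources.
p = source_posterior(0.5, 0.8, 0.2)
print(p, attribute(p))
```

The continuous posterior captures the graded memory signal of signal-detection accounts, while the criterion supplies the discrete, threshold-like attribution the paper recommends.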
Abstract:
Language is a unique aspect of human communication because it can be used to discuss itself in its own terms. For this reason, human societies potentially have superior capacities for co-ordination, reflexive self-correction, and innovation compared with other animal, physical or cybernetic systems. However, this analysis also reveals that language is interconnected with the economically and technologically mediated social sphere and hence is vulnerable to abstraction, objectification, reification, and therefore ideology – all of which are antithetical to its reflexive function, whilst paradoxically being a fundamental part of it. In particular, under capitalism, language is increasingly commodified within the social domains created and affected by ubiquitous communication technologies. The advent of the so-called 'knowledge economy' implicates exchangeable forms of thought (language) as the fundamental commodities of this emerging system. The historical point at which a 'knowledge economy' emerges, then, is the critical point at which thought itself becomes a commodified 'thing', and language becomes its "objective" means of exchange. However, the processes by which such commodification and objectification occur obscure the unique social relations within which these language commodities are produced. The latest economic phase of capitalism – the knowledge economy – and the obfuscating trajectory which accompanies it, we argue, is destroying the reflexive capacity of language, particularly through the process of commodification. This can be seen in that the language practices that have emerged in conjunction with digital technologies are increasingly non-reflexive and therefore less capable of self-critical, conscious change.
Abstract:
Titanium dioxide nanocrystals are an important commercial product used primarily in white pigments and abrasives; more recently, however, the anatase form of TiO2 has become a major component in electrochemical and photoelectrochemical devices. An important property of titanium dioxide nanocrystals for electrical applications is the degree of crystallinity. Numerous preparation methods exist for the production of highly crystalline TiO2 particles, but the majority of these processes require long reaction times, high pressures and high temperatures (450–1400 °C). Recently, hydrothermal treatment of colloidal TiO2 suspensions has been shown to produce quality crystalline products at low temperatures (<250 °C). In this paper we extend this idea by utilising a direct microwave heating source. A comparison between convection and microwave hydrothermal treatment of colloidal TiO2 is presented. The resulting highly crystalline TiO2 colloids were characterised using Raman spectroscopy, XRD, TEM and electron diffraction. The results show that microwave treatment of colloidal TiO2 gives increases in crystallinity comparable to those of conventional hydrothermal treatments while requiring significantly less time and energy than the hydrothermal convection treatment.
Abstract:
Until recently, integration of enterprise systems has been supported largely by monolithic architectures. From a technical perspective, this approach has been challenged by the suggestion of component-based enterprise systems. Lately, the nature of software as a proprietary item has been questioned through the increased use of open source software in business computing in general. This suggests the potential for altered technological and commercial constellations for the design of enterprise systems, which are presented in four scenarios. © Springer-Verlag 2004.