Abstract:
This paper presents a novel topology for the generation of high-voltage pulses that uses both slow and fast solid-state power switches. The topology includes diode-capacitor units in parallel with commutation circuits connected to a positive buck-boost converter. This enables the generation of a range of high output voltages with a given number of capacitors. The advantages of this topology are the use of slow switches and a reduced number of diodes in comparison with the conventional Marx generator. Simulations of single and repetitive pulse generation and experimental tests of a hardware prototype verify the proposed topology.
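For context, a conventional Marx generator charges its stage capacitors in parallel and discharges them in series, so its ideal output scales with the number of stages. The minimal sketch below illustrates only that baseline relation, against which the proposed topology is compared; losses, switch drops and the diode-capacitor/buck-boost details of the new circuit are omitted, and the function name and values are illustrative, not from the paper.

```python
# Minimal sketch: ideal output of a conventional n-stage Marx generator,
# the baseline the proposed topology is compared against. Component losses,
# switch voltage drops and stray capacitance are ignored; values are illustrative.

def ideal_marx_output(n_stages: int, charge_voltage: float) -> float:
    """Capacitors charged in parallel to charge_voltage, then discharged in series."""
    return n_stages * charge_voltage

if __name__ == "__main__":
    for n in (4, 8, 12):
        print(f"{n} stages x 1 kV charge -> ~{ideal_marx_output(n, 1e3) / 1e3:.0f} kV pulse")
```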
Abstract:
Automated visual surveillance of crowds is a rapidly growing area of research. In this paper we focus on motion representation for the purpose of abnormality detection in crowded scenes. We propose a novel visual representation called textures of optical flow. The proposed representation measures the uniformity of a flow field in order to detect anomalous objects such as bicycles, vehicles and skateboarders; and can be combined with spatial information to detect other forms of abnormality. We demonstrate that the proposed approach outperforms state-of-the-art anomaly detection algorithms on a large, publicly-available dataset.
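The paper's exact "textures of optical flow" measure is not given in this abstract, so the sketch below is only a hypothetical illustration of the underlying idea: score how uniform the flow field is within local windows, with low scores flagging potentially anomalous motion such as a vehicle moving through a pedestrian crowd. The use of OpenCV's Farneback flow and the dispersion-based score are assumptions made for illustration, not the paper's method.

```python
# Hypothetical sketch of a local flow-uniformity measure in the spirit of the
# abstract (not the paper's actual "textures of optical flow" definition).
# Assumes two consecutive grayscale frames as NumPy arrays of equal size.
import cv2
import numpy as np

def local_flow_uniformity(prev_gray, next_gray, win=16):
    """Return a per-window score in (0, 1]; low values suggest non-uniform motion."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    scores = np.zeros((h // win, w // win))
    for i in range(h // win):
        for j in range(w // win):
            block = flow[i * win:(i + 1) * win, j * win:(j + 1) * win].reshape(-1, 2)
            mags = np.linalg.norm(block, axis=1)
            # Uniform motion -> small dispersion of flow magnitude within the window.
            scores[i, j] = 1.0 / (1.0 + mags.std())
    return scores
```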
Abstract:
We present a hierarchical model for assessing an object-oriented program's security. Security is quantified using structural properties of the program code to identify the ways in which 'classified' data values may be transferred between objects. The model begins with a set of low-level security metrics based on traditional design characteristics of object-oriented classes, such as data encapsulation, cohesion and coupling. These metrics are then used to characterise higher-level properties concerning the overall readability and writability of classified data throughout the program. These, in turn, are mapped to well-known security design principles such as 'assigning the least privilege' and 'reducing the size of the attack surface'. Finally, the entire program's security is summarised as a single security index value. These metrics allow different versions of the same program, or different programs intended to perform the same task, to be compared for their relative security at a number of different abstraction levels. The model is validated via an experiment involving five open-source Java programs, using a static analysis tool we have developed to automatically extract the security metrics from compiled Java bytecode.
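As a rough illustration of the hierarchy the abstract describes, where low-level class metrics are rolled up into readability/writability properties and then into a single index, the sketch below uses hypothetical metric names and arbitrary weights; it is not the paper's model and the formulas are placeholders only.

```python
# Hypothetical sketch of rolling low-level class metrics up into a single
# security index, as the abstract describes. Metric names and weights are
# illustrative placeholders, not the paper's actual definitions.

def readability_score(class_metrics):
    # Example: fraction of classified attributes that are not publicly readable.
    total = max(class_metrics["classified_attributes"], 1)
    return 1.0 - class_metrics["public_classified_reads"] / total

def writability_score(class_metrics):
    total = max(class_metrics["classified_attributes"], 1)
    return 1.0 - class_metrics["public_classified_writes"] / total

def security_index(program_classes, w_read=0.5, w_write=0.5):
    """Average the per-class scores and combine them with assumed weights."""
    reads = [readability_score(c) for c in program_classes]
    writes = [writability_score(c) for c in program_classes]
    return w_read * sum(reads) / len(reads) + w_write * sum(writes) / len(writes)
```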
Abstract:
All Australian governments recognise the need to ensure that land and natural resources are used sustainably. In this context, ‘resources’ includes natural resources found on land, such as trees and other vegetation, fauna, soil and minerals, and cultural resources found on land, such as archaeological sites and artefacts. Regulators use a wide range of techniques to promote sustainability. To achieve their objectives, they may, for example, create economic incentives through bounties, grants and subsidies, encourage the development of self-regulatory codes, or enter into agreements with landowners specifying how the land is to be managed. A common way of regulating is by making administrative orders, determinations or decisions under powers given to regulators by Acts of Parliament (statutes) or by regulations (delegated legislation). Generally, the legislation provides for specified rights or duties and authorises a regulator to make an order or decision applying the legislative provisions to particular land or cases. For example, legislation might empower a regulator to make an order that requires the owner of a contaminated site to remediate it. When the regulator exercises the power by making an order in relation to particular land, the owner is placed under a statutory duty to remediate. When regulators exercise their statutory powers to manage the use of private land, or natural or cultural resources on private land, property law issues can arise. The owner of land has a private property right that the law will enforce against anybody else who, without legal authority, interferes with the enjoyment of that right. The law dealing with the enforcement of private property rights forms part of private law. This report focuses on the relationship between the law of private property and the regulation of land and resources by legislation and by administrative decisions made under powers given by legislation (statutory powers).
Abstract:
This is the first article in a series of three that examines the legal role of medical professionals in decisions to withhold or withdraw life-sustaining treatment from adults who lack capacity. This article considers the position in New South Wales. A review of the law in this State reveals that medical professionals play significant legal roles in these decisions. However, the law is problematic in a number of respects and this is likely to impede medical professionals’ legal knowledge in this area. The article examines the level of training medical professionals receive on issues such as advance directives and substitute decision-making, and the available empirical evidence as to the state of medical professionals’ knowledge of the law at the end of life. It concludes that there are gaps in legal knowledge and that law reform is needed in New South Wales.
Abstract:
This is the second article in a series of three that examines the legal role of medical professionals in decisions to withhold or withdraw life-sustaining treatment from adults who lack capacity. This article considers the position in Queensland, including the parens patriae jurisdiction of the Supreme Court. A review of the law in this State reveals that medical professionals play significant legal roles in these decisions. However, the law is problematic in a number of respects and this is likely to impede medical professionals’ legal knowledge in this area. The article examines the level of training medical professionals receive on issues such as advance health directives and substitute decision-making, and the available empirical evidence as to the state of medical professionals’ knowledge of the law at the end of life. It concludes that there are gaps in legal knowledge and that law reform is needed in Queensland.
Abstract:
The heterogeneous photocatalytic water purification process has gained wide attention due to its effectiveness in degrading and mineralizing recalcitrant organic compounds as well as the possibility of utilizing the solar UV and visible light spectrum. This paper aims to review and summarize recently published work in the field of photocatalytic oxidation of toxic organic compounds, such as phenols and dyes, predominant in wastewater effluent. In this review, the effects of various operating parameters on the photocatalytic degradation of phenols and dyes are presented. Recent findings suggest that parameters such as photocatalyst type and composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent type, oxidizing agents/electron acceptors, mode of catalyst application, and calcination temperature can play an important role in the photocatalytic degradation of organic compounds in the water environment. Extensive research has focused on the enhancement of photocatalysis by modification of TiO2 employing metal, non-metal and ion doping. Recent advances in TiO2 photocatalysis for the degradation of various phenols and dyes are also highlighted in this review.
Abstract:
In recent years, the application of the heterogeneous photocatalytic water purification process has gained wide attention due to its effectiveness in degrading and mineralizing recalcitrant organic compounds as well as the possibility of utilizing the solar UV and visible light spectrum. This paper aims to review and summarize recently published work on the titanium dioxide (TiO2) photocatalytic oxidation of pesticides and phenolic compounds, predominant in storm and wastewater effluents. The effects of various operating parameters on the photocatalytic degradation of pesticides and phenols are discussed. The results reported here suggest that the photocatalytic degradation of organic compounds in the water environment depends on the photocatalyst type and composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent type, oxidizing agents/electron acceptors, catalyst application mode, and calcination temperature. A substantial amount of research has focused on the enhancement of TiO2 photocatalysis by modification with metal, non-metal and ion doping. Recent developments in TiO2 photocatalysis for the degradation of various pesticides and phenols are also highlighted in this review. It is evident from the literature survey that photocatalysis has shown good potential for the removal of various organic pollutants; however, there is still a need to establish the practical utility of this technique at a commercial scale.
Abstract:
In recent years, there has been an enormous amount of research and development in the area of the heterogeneous photocatalytic water purification process due to its effectiveness in degrading and mineralising recalcitrant organic compounds as well as the possibility of utilising the solar UV and visible spectrum. One hundred and twenty recently published papers are reviewed and summarised here, with the focus on the photocatalytic oxidation of phenols and their derivatives, predominant in wastewater effluent. In this review, the effects of various operating parameters on the photocatalytic degradation of phenols and substituted phenols are presented. Recent findings suggest that parameters such as photocatalyst type and composition, light intensity, initial substrate concentration, amount of catalyst, pH of the reaction medium, ionic components in water, solvent types, oxidising agents/electron acceptors, mode of catalyst application, and calcination temperature can play an important role in the photocatalytic degradation of phenolic compounds in wastewater. Extensive research has focused on the enhancement of photocatalysis by modification of TiO2 employing metal, non-metal and ion doping. Recent developments in TiO2 photocatalysis for the degradation of various phenols and substituted phenols are also reviewed.
Abstract:
A computational fluid dynamics (CFD) analysis has been performed for a flat-plate photocatalytic reactor using the CFD code FLUENT. Under the simulated conditions (Reynolds number, Re, around 2650), a detailed time-accurate computation shows the different stages of flow evolution and the effect of the finite length of the reactor in creating flow instability, which is important for improving the performance of the reactor for storm and wastewater reuse. The efficiency of a photocatalytic reactor for pollutant decontamination depends on the reactor hydrodynamics and configuration. This study aims to investigate the role of different parameters in the optimization of the reactor design for improved performance. In this regard, further modelling and experimental efforts are ongoing to better understand the interplay of the parameters that influence the performance of the flat-plate photocatalytic reactor.
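Since the abstract reports simulations at Re around 2650, the following minimal sketch shows how such a channel Reynolds number might be estimated for a flat-plate reactor. The plate spacing, mean velocity and water properties are assumed values chosen only to land near that figure; they are not taken from the paper.

```python
# Minimal sketch: estimating the channel Reynolds number for a flat-plate
# reactor. Geometry and velocity below are hypothetical, chosen only to land
# near the Re ~ 2650 reported in the abstract; water at ~20 C is assumed.

NU_WATER = 1.0e-6          # kinematic viscosity of water, m^2/s (approx., 20 C)

def reynolds_number(velocity, hydraulic_diameter, nu=NU_WATER):
    return velocity * hydraulic_diameter / nu

if __name__ == "__main__":
    gap = 0.005            # m, assumed spacing between wide parallel plates
    d_h = 2 * gap          # hydraulic diameter of a thin, wide channel
    u = 0.265              # m/s, assumed mean velocity
    re = reynolds_number(u, d_h)
    print(f"Re ~ {re:.0f} (just above the ~2300 laminar-turbulent transition)")
```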
Abstract:
This is the final article in a series of three that examines the legal role of medical professionals in decisions to withhold or withdraw life-sustaining treatment from adults who lack capacity. This article considers the position in Victoria. A review of the law in this State reveals that medical professionals play significant legal roles in these decisions. However, the law is problematic in a number of respects and this is likely to impede medical professionals’ legal knowledge in this area. The article examines the level of training that medical professionals receive on issues such as refusal of treatment certificates and substitute decision-making, and the available empirical evidence as to the state of medical professionals’ knowledge of the law at the end of life. It concludes that there are gaps in legal knowledge and that law reform is needed in Victoria. The article also draws together themes from the series as a whole, including conclusions about the need for more and better medical education and about law reform generally.
Abstract:
The Reporting and Reception of Indigenous Issues in the Australian Media was a three-year project financed by the Australian government through its Australian Research Council Large Grants Scheme and run by Professor John Hartley (of Murdoch and then Edith Cowan University, Western Australia). The purpose of the research was to map the ways in which indigeneity was constructed and circulated in Australia's mediasphere. The analysis of the 'reporting' element of the project was relatively straightforward: a mixture of content analysis of a large number of items in the media, and detailed textual analysis of a smaller number of key texts. The discoveries were interesting: when analysis approaches the media as a whole, rather than focussing exclusively on news or serious drama genres, the representation of indigeneity is not nearly as homogeneous as has previously been assumed. And if researchers do not explicitly set out to uncover racism in every text, it is by no means guaranteed they will find it [1]. The question of how to approach the 'reception' of these issues - and particularly reception by indigenous Australians - proved to be a far more challenging one. In attempting to research this area, Hartley and I (working as a research assistant on the project) often found ourselves hampered by the axioms that underlie much media research. Traditionally, the 'reception' of media by indigenous people in Australia has been researched in ethnographic ways. This research repeatedly discovers that indigenous people in Australia are powerless in the face of new forms of media. Indigenous populations are represented as victims of aggressive and powerful intrusions: ‘What happens when a remote community is suddenly inundated by broadcast TV?’; ‘Overnight they will go from having no radio and television to being bombarded by three TV channels’; ‘The influence of film in an isolated, traditionally oriented Aboriginal community’ [2]. This language of ‘influence’, ‘bombarded’ and ‘inundated’ presents metaphors not just of war but of a war being lost. It tells of an unequal struggle, of a more powerful force impinging upon a weaker one. What else could be the relationship of an Aboriginal audience to something which is ‘bombarding’ them? Or by which they are ‘inundated’? This attitude might best be summed up by the title of an article by Elihu Katz: ‘Can authentic cultures survive new media?’ [3]. In such writing, there is little sense that what is being addressed might be seen as a series of discursive encounters, negotiations and acts of meaning-making in which indigenous people — communities and audiences — might be productive. Certainly, the points of concern in this type of writing are important. The question of what happens when a new communication medium is summarily introduced to a culture is certainly an important one. But the language used to describe this interaction is a misleading one. And it is noticeable that such writing is fascinated with the relationship of only traditionally oriented Aboriginal communities to the media of mass communication.
Abstract:
It is a big challenge to clearly identify the boundary between positive and negative streams for information filtering systems. Several attempts have used negative feedback to address this challenge; however, two issues arise when using negative relevance feedback to improve the effectiveness of information filtering. The first is how to select constructive negative samples in order to reduce the space of negative documents. The second is how to decide which noisy extracted features should be updated based on the selected negative samples. This paper proposes a pattern-mining-based approach to select some offenders from the negative documents, where an offender can be used to reduce the side effects of noisy features. It also classifies extracted features (i.e., terms) into three categories: positive specific terms, general terms, and negative specific terms. In this way, multiple revising strategies can be used to update the extracted features. An iterative learning algorithm is also proposed to implement this approach on the RCV1 data collection, and substantial experiments show that the proposed approach achieves encouraging performance, which is also consistent for adaptive filtering.
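As a hypothetical illustration of the three term categories named above, the sketch below splits terms by comparing their document frequencies in the positive documents and in the selected negative samples (the offenders). The frequency-margin rule is an assumption made for illustration, not the paper's pattern-mining criterion.

```python
# Hypothetical sketch of splitting extracted terms into the three categories
# named in the abstract. The frequency-based rule is an illustrative
# approximation, not the paper's actual pattern-mining criterion.
from collections import Counter

def classify_terms(positive_docs, offender_docs, margin=0.2):
    """positive_docs / offender_docs: lists of token lists (one list per document)."""
    pos_df = Counter(t for d in positive_docs for t in set(d))
    neg_df = Counter(t for d in offender_docs for t in set(d))
    categories = {"positive_specific": [], "general": [], "negative_specific": []}
    for term in set(pos_df) | set(neg_df):
        p = pos_df[term] / max(len(positive_docs), 1)
        n = neg_df[term] / max(len(offender_docs), 1)
        if p - n > margin:
            categories["positive_specific"].append(term)
        elif n - p > margin:
            categories["negative_specific"].append(term)
        else:
            categories["general"].append(term)
    return categories
```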
Abstract:
Many data mining techniques have been proposed for mining useful patterns in text documents. However, how to effectively use and update discovered patterns is still an open research issue, especially in the domain of text mining. Since most existing text mining methods adopt term-based approaches, they all suffer from the problems of polysemy and synonymy. Over the years, people have often held the hypothesis that pattern-based (or phrase-based) approaches should perform better than term-based ones, but many experiments have not supported this hypothesis. This paper presents an innovative technique, effective pattern discovery, which includes the processes of pattern deploying and pattern evolving, to improve the effectiveness of using and updating discovered patterns for finding relevant and interesting information. Substantial experiments on the RCV1 data collection and TREC topics demonstrate that the proposed solution achieves encouraging performance.
Abstract:
It is a big challenge to guarantee the quality of discovered relevance features in text documents for describing user preferences because of the large number of terms, patterns, and noise. Most existing popular text mining and classification methods have adopted term-based approaches; however, they all suffer from the problems of polysemy and synonymy. Over the years, people have often held the hypothesis that pattern-based methods should perform better than term-based ones in describing user preferences, but many experiments do not support this hypothesis. The innovative technique presented in this paper makes a breakthrough in addressing this difficulty. It discovers both positive and negative patterns in text documents as higher-level features and uses them to accurately weight low-level features (terms) based on their specificity and their distributions in the higher-level features. Substantial experiments using this technique on Reuters Corpus Volume 1 and TREC topics show that the proposed approach significantly outperforms both state-of-the-art term-based methods underpinned by Okapi BM25, Rocchio or Support Vector Machine, and pattern-based methods, on precision, recall and F-measure.
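For reference, the term-based baselines mentioned above are underpinned by Okapi BM25. The sketch below shows the standard BM25 weighting such baselines use, with the conventional default parameters k1 = 1.2 and b = 0.75 (these defaults are an assumption, not values taken from the paper).

```python
# Minimal sketch of the Okapi BM25 term weighting that underpins the
# term-based baselines mentioned in the abstract. k1 and b are conventional
# defaults, not values taken from the paper.
import math

def bm25_score(query_terms, doc_terms, doc_freqs, num_docs, avg_doc_len,
               k1=1.2, b=0.75):
    """doc_terms: token list of one document; doc_freqs: term -> document frequency."""
    score = 0.0
    dl = len(doc_terms)
    for term in query_terms:
        tf = doc_terms.count(term)
        if tf == 0:
            continue
        df = doc_freqs.get(term, 0)
        idf = math.log((num_docs - df + 0.5) / (df + 0.5) + 1.0)
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * dl / avg_doc_len))
    return score
```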