357 results for Proximal Point Algorithm
Abstract:
This paper presents an algorithm for mining unordered embedded subtrees using the balanced-optimal-search canonical form (BOCF). A tree-structure-guided enumeration scheme based on BOCF is defined for systematically enumerating only the valid subtrees. Based on this canonical form and enumeration technique, the balanced optimal search embedded subtree mining algorithm (BEST) is introduced for mining embedded subtrees from a database of labelled rooted unordered trees. Extensive experiments on both synthetic and real datasets demonstrate the efficiency of BEST over the two state-of-the-art algorithms for mining embedded unordered subtrees, SLEUTH and U3.
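To make the role of a canonical form concrete, here is a minimal sketch of the classic sorted-string encoding for rooted, labelled, unordered trees. It is not the paper's BOCF (whose definition is specific to the paper); it only illustrates why a canonical form lets a miner recognise two sibling orderings as the same subtree and enumerate each candidate once.

```python
# Minimal illustration of a canonical form for rooted, labelled, unordered
# trees. This is NOT the paper's BOCF; it is the classic sorted-string
# encoding, shown only to convey why canonical forms let a subtree miner
# enumerate each distinct subtree exactly once.

def canonical(tree):
    """tree = (label, [children]); returns a string that is identical for
    all sibling orderings of the same unordered tree."""
    label, children = tree
    # Canonicalize children first, then sort their encodings so that
    # sibling order no longer matters.
    parts = sorted(canonical(c) for c in children)
    return label + "(" + ",".join(parts) + ")"

t1 = ("A", [("B", []), ("C", [("D", [])])])
t2 = ("A", [("C", [("D", [])]), ("B", [])])   # same tree, siblings swapped
assert canonical(t1) == canonical(t2)
print(canonical(t1))  # A(B(),C(D()))
```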
Abstract:
Partial shading and rapidly changing irradiance conditions significantly impact the performance of photovoltaic (PV) systems. These impacts are particularly severe in tropical regions, where the climatic conditions produce very large and rapid changes in irradiance. In this paper, a hybrid maximum power point (MPP) tracking (MPPT) technique is proposed for PV systems operating under partially shaded conditions with rapid irradiance change. It combines a conventional MPPT and an artificial neural network (ANN)-based MPPT. A low-cost method is proposed to predict the global MPP region when irradiance sensors are unavailable or too expensive to justify. It samples the operating point on the stairs of the I–V curve and uses a combination of the current values measured at each stair to predict the global MPP region. The conventional MPPT is then used to search within the classified region for the global MPP. The effectiveness of the proposed MPPT is demonstrated using both simulations and an experimental setup. Experimental comparisons with four existing MPPTs show that the proposed technique produces more energy than the other techniques and can effectively track the global MPP with fast tracking speed under various shading patterns.
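As a sketch of the "conventional MPPT" component, here is a perturb-and-observe (P&O) step, one common choice for such a local search. The function names and the toy pv_power() curve are illustrative assumptions, not the paper's implementation; in the hybrid scheme above, a search like this would run only inside the region classified as containing the global MPP.

```python
# Hedged sketch: a perturb-and-observe (P&O) MPPT step on a toy P-V curve.
def po_step(v, v_prev, p, p_prev, dv=0.5):
    """Return the next operating voltage from current/previous samples."""
    if p > p_prev:               # power rose: keep perturbing the same way
        return v + dv if v > v_prev else v - dv
    return v - dv if v > v_prev else v + dv   # power fell: reverse direction

def pv_power(v):
    return max(0.0, -0.1 * (v - 30.0) ** 2 + 90.0)  # toy unimodal P-V curve

v_prev, v, p_prev = 20.0, 20.5, pv_power(20.0)
for _ in range(100):
    p = pv_power(v)
    v_prev, v, p_prev = v, po_step(v, v_prev, p, p_prev), p
print(v)  # oscillates near the MPP voltage (~30 V)
```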
Abstract:
We describe an investigation into how Massey University's Pollen Classifynder can accelerate the understanding of pollen and its role in nature. The Classifynder is an imaging microscopy system that can locate, image and classify slide-based pollen samples. Given the laboriousness of purely manual image acquisition and identification, it is vital to exploit assistive technologies like the Classifynder to enable acquisition and analysis of pollen samples. It is also vital that we understand the strengths and limitations of automated systems so that they can be used (and improved) to complement the strengths and weaknesses of human analysts to the greatest extent possible. This article reviews some of our experiences with the Classifynder system and our exploration of alternative classifier models to enhance both accuracy and interpretability. Our experiments in the pollen analysis problem domain have been based on samples from the Australian National University's pollen reference collection (2,890 grains, 15 species) and images bundled with the Classifynder system (400 grains, 4 species). These samples have been represented using the Classifynder image feature set. We additionally work through a real-world case study in which we assess the ability of the system to determine the pollen make-up of samples of New Zealand honey. In addition to the Classifynder's native neural network classifier, we have evaluated linear discriminant, support vector machine, decision tree and random forest classifiers on these data, with encouraging results. Our hope is that our findings will help enhance the performance of future releases of the Classifynder and other systems for accelerating the acquisition and analysis of pollen samples.
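A minimal sketch of how the classifier families named above can be compared with scikit-learn. The synthetic X, y stand in for the Classifynder feature vectors and species labels, which are not available here; the feature dimensionality and all parameters are placeholder assumptions.

```python
# Hedged sketch: cross-validated comparison of the classifier families named
# in the abstract, on synthetic placeholder data.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder for pollen image features (e.g. 400 grains, 4 species).
X, y = make_classification(n_samples=400, n_features=40, n_classes=4,
                           n_informative=10, random_state=0)
models = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(kernel="rbf"),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```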
Abstract:
Non-rigid image registration is an essential tool for overcoming the inherent local anatomical variations that exist between images acquired from different individuals or atlases. Furthermore, certain applications require this type of registration to operate across images acquired from different imaging modalities. One popular local approach for estimating this registration is a block matching procedure utilising the mutual information criterion. However, previous block matching procedures generate a sparse deformation field containing displacement estimates at uniformly spaced locations, neglecting the evidence that block matching results depend on the amount of local information content. This paper addresses this drawback by proposing a Reversible Jump Markov Chain Monte Carlo statistical procedure to optimally select grid points of interest. Three different methods are then compared for propagating the estimated sparse deformation field to the entire image: a thin-plate spline warp, Gaussian convolution, and a hybrid fluid technique. Results show that non-rigid registration can be improved by using the proposed algorithm to optimally select grid points of interest.
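For reference, the mutual-information criterion used to score a candidate block match can be computed from a joint histogram of the two blocks. This is the standard definition, not the paper's specific implementation; bin count and block size below are illustrative.

```python
# Hedged sketch: mutual information between two image blocks via a joint
# histogram, the standard matching criterion mentioned in the abstract.
import numpy as np

def mutual_information(block_a, block_b, bins=32):
    hist, _, _ = np.histogram2d(block_a.ravel(), block_b.ravel(), bins=bins)
    pxy = hist / hist.sum()                    # joint distribution
    px = pxy.sum(axis=1, keepdims=True)        # marginal of block_a
    py = pxy.sum(axis=0, keepdims=True)        # marginal of block_b
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
a = rng.random((16, 16))
print(mutual_information(a, a))                     # high: block matches itself
print(mutual_information(a, rng.random((16, 16))))  # near zero: unrelated blocks
```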
Abstract:
Energy-efficient embedded computing enables new application scenarios in mobile devices, such as software-defined radio and video processing. The hierarchical multiprocessor considered in this work may contain dozens or hundreds of resource-efficient VLIW CPUs. Programming this number of CPU cores is a complex task requiring compiler support. The stream programming paradigm provides beneficial properties that help to support automatic partitioning. This work describes a compiler for streaming applications targeting the self-built hierarchical CoreVA-MPSoC multiprocessor platform. The compiler is supported by a programming model tailored to fit the stream programming paradigm. We present a novel simulated-annealing (SA) based partitioning algorithm, called Smart SA. The overall speedup of Smart SA is 12.84 for an MPSoC with 16 CPU cores compared to a single-CPU implementation. Comparison with a state-of-the-art partitioning algorithm shows an average performance improvement of 34.07%.
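The following is a generic simulated-annealing partitioner assigning stream tasks to CPU cores, shown to illustrate the family of algorithm Smart SA belongs to. The cost model (makespan under a simple load-balance objective), move set and all parameters are illustrative assumptions; Smart SA's actual moves and cost function are defined in the paper.

```python
# Hedged sketch: generic simulated annealing for task-to-core partitioning.
import math, random

def sa_partition(task_costs, n_cores, steps=20000, t0=1.0, alpha=0.9995):
    assign = [random.randrange(n_cores) for _ in task_costs]

    def cost(a):
        loads = [0.0] * n_cores
        for task, core in enumerate(a):
            loads[core] += task_costs[task]
        return max(loads)             # makespan: the most loaded core

    best, best_cost, t = assign[:], cost(assign), t0
    for _ in range(steps):
        cand = assign[:]              # move: reassign one random task
        cand[random.randrange(len(cand))] = random.randrange(n_cores)
        d = cost(cand) - cost(assign)
        if d < 0 or random.random() < math.exp(-d / t):
            assign = cand             # accept improving or (sometimes) worse moves
            if cost(assign) < best_cost:
                best, best_cost = assign[:], cost(assign)
        t *= alpha                    # geometric cooling schedule
    return best, best_cost

random.seed(0)
tasks = [random.uniform(1, 10) for _ in range(40)]
_, makespan = sa_partition(tasks, n_cores=16)
print(round(makespan, 2))  # close to sum(tasks)/16 for a good partition
```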
Abstract:
A bushfire-responsive design and management strategy at the bioregion scale: a 248-page document containing text, original designs, photographs, masterplans and critique, created as an alternative community-based strategy for risk mitigation and management response to bushfire in the Point Henry and Bremer Bay region of Western Australia. The document was drafted as an alternative to a local-government-commissioned plan which had many shortcomings. It was presented as a PowerPoint presentation at a public meeting in Bremer Bay on 7 April 2014 and disseminated to local community members and councillors to encourage public debate and feedback to the Shire of Jerramungup, WA.
Abstract:
There is increasing interest in the use of UAVs for environmental research, including tracking bushfire plumes, volcanic plumes and pollutant sources. The aim of this paper is to describe the theory and results of a bio-inspired plume tracking algorithm. Memory-based and gradient-based approaches were developed and compared, and a method for generating sparse plumes was also developed. Results indicate the ability of the algorithms to track plumes in 2D and 3D.
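The gradient-based idea can be sketched as 2D gradient ascent on a concentration field. The Gaussian plume model and step size below are placeholder assumptions; the paper's bio-inspired memory-based variant and sparse-plume handling are not reproduced here.

```python
# Hedged sketch: gradient-based plume tracking as gradient ascent on a toy
# concentration field.
import numpy as np

def concentration(p, source=np.array([5.0, 3.0])):
    return float(np.exp(-np.sum((p - source) ** 2) / 4.0))  # toy plume

def gradient_step(p, step=0.2, eps=1e-3):
    # Finite-difference estimate of the local concentration gradient.
    g = np.array([(concentration(p + eps * e) - concentration(p - eps * e))
                  / (2 * eps) for e in np.eye(2)])
    n = np.linalg.norm(g)
    return p if n == 0 else p + step * g / n  # move up-gradient

p = np.array([0.0, 0.0])
for _ in range(50):
    p = gradient_step(p)
print(np.round(p, 1))  # converges near the source at (5, 3)
```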
Abstract:
This paper examines the impact of the chosen bottle-point method when conducting ion exchange equilibrium experiments. As an illustration, potassium ion exchange with a strong acid cation resin was investigated due to its relevance to the treatment of various industrial effluents and groundwater. The “constant mass” bottle-point method was shown to be problematic in that the equilibrium isotherm profiles differed depending upon the resin mass used. Indeed, application of common equilibrium isotherm models revealed that the optimal fit could be with either the Freundlich or Temkin equations, depending upon the conditions employed. It could be inferred that the resin surface was heterogeneous in character, but precise conclusions regarding the variation in the heat of sorption were not possible. Estimation of the maximum potassium loading was also inconsistent when employing the “constant mass” method. The “constant concentration” bottle-point method showed that the Freundlich model was a good representation of the exchange process, and the isotherms recorded were relatively consistent compared to the “constant mass” approach. Unification of all the equilibrium isotherm data acquired was achieved by use of the Langmuir–Vageler expression. The maximum loading of potassium ions was predicted to be at least 116.5 g/kg resin.
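For reference, fitting the Freundlich isotherm q = K·C^(1/n) to equilibrium data is a one-line nonlinear regression with SciPy. The data points below are synthetic placeholders, not the potassium-exchange measurements from the paper.

```python
# Hedged sketch: fitting the Freundlich isotherm to synthetic equilibrium data.
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c, k, n):
    return k * c ** (1.0 / n)

c_eq = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])      # mmol/L (synthetic)
q_eq = np.array([12.0, 17.0, 24.0, 38.0, 52.0, 74.0])  # g/kg resin (synthetic)

(k, n), _ = curve_fit(freundlich, c_eq, q_eq, p0=(10.0, 2.0))
print(f"K = {k:.2f}, 1/n = {1.0 / n:.2f}")
```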
Abstract:
This is a comprehensive study of human kidney proximal tubular epithelial cells (PTEC), which are known to respond to and mediate the pathological processes of a range of kidney diseases. It identifies various molecules expressed by PTEC and shows how these molecules participate in down-regulating the inflammatory process, thereby highlighting their clinical potential for treating various kidney diseases. In the disease state, PTEC gain the ability to regulate the immune cell responses present within the interstitium. This down-regulation is a complex interaction of contact-dependent and contact-independent mechanisms involving various immuno-regulatory molecules, including PD-L1, sHLA-G and IDO. The overall outcome of this down-regulation is suppressed DC maturation, a decreased number of antibody-producing B cells and low T cell responses. Within a clinical setting, these effects are expected to dampen ongoing inflammation and prevent damage to the kidney tissue.
Abstract:
Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (like GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8], Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest plane algorithm such that the distribution of a reduced vector modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to seal the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we produce a classic leaky NTRUSign signature and hide it with Gaussian noise using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
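The Lyubashevsky-style statistical trick, in one dimension: a secret-dependent shift is hidden behind a much wider Gaussian, and rejection sampling makes the output distribution independent of the shift. This toy is only meant to convey the idea; it is not the paper's NTRUSign construction, parameter set, or security argument.

```python
# Hedged sketch: hiding a secret-dependent value with wide Gaussian noise
# and rejection sampling (1-D toy of the Lyubashevsky technique).
import math, random

SIGMA = 100.0          # noise much wider than the secret-dependent shift
M = 3.0                # rejection constant (acceptance rate ~ 1/M)

def gauss_pdf(x, sigma=SIGMA):
    return math.exp(-x * x / (2 * sigma * sigma))   # unnormalized kernel

def hide(leaky_value):
    """Output z = y + leaky_value whose distribution (approximately) no
    longer depends on leaky_value, via rejection against N(0, SIGMA)."""
    while True:
        y = random.gauss(0.0, SIGMA)
        z = y + leaky_value
        # Accept with probability pdf_centred(z) / (M * pdf_shifted(z)).
        if random.random() < gauss_pdf(z) / (M * gauss_pdf(z - leaky_value)):
            return z

random.seed(1)
samples = [hide(leaky_value=25.0) for _ in range(2000)]
print(round(sum(samples) / len(samples), 1))  # mean near 0, not near 25
```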
Abstract:
This paper presents an improved field weakening algorithm for synchronous reluctance motor (RSM) drives. The proposed algorithm is robust to variations in the machine's d- and q-axis inductances. The transition between the maximum torque per ampere (MTPA), current-limit, voltage-limit and maximum torque per flux (MTPF) trajectories is smooth. The proposed technique is combined with the direct torque control method to attain a high-performance drive in the field weakening region. Simulation and experimental results are provided to verify the effectiveness of the proposed approach.
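A minimal sketch of the textbook current-reference logic behind MTPA and field weakening for a synchronous reluctance machine, assuming the standard torque expression T = 1.5·p·(Ld − Lq)·id·iq and neglecting stator resistance. All machine parameters and limits are illustrative; the paper's inductance-robust algorithm is not reproduced here.

```python
# Hedged sketch: textbook MTPA + voltage-limit logic for a SynRM drive.
import math

P_PAIRS, LD, LQ = 2, 0.25, 0.05     # pole pairs, d-/q-axis inductances (H)
I_MAX, V_MAX = 10.0, 600.0          # current (A) and voltage (V) limits

def current_refs(torque_ref, speed_e):
    """Return (id, iq) references for a torque request at electrical speed (rad/s)."""
    # MTPA for a SynRM: torque per ampere is maximised when id = iq.
    k = 1.5 * P_PAIRS * (LD - LQ)
    i = math.sqrt(abs(torque_ref) / k)          # from T = k * id * iq, id = iq
    i_d = i_q = min(i, I_MAX / math.sqrt(2))
    # Voltage limit: |V| ~ speed * |flux|. Entering field weakening, reduce the
    # d-axis (high-inductance, flux-producing) current until the voltage fits;
    # the delivered torque drops accordingly.
    while speed_e * math.hypot(LD * i_d, LQ * i_q) > V_MAX and i_d > 0.01:
        i_d *= 0.99
    return i_d, i_q

print(current_refs(20.0, speed_e=400.0))    # below base speed: MTPA, id = iq
print(current_refs(20.0, speed_e=1000.0))   # field weakening: id reduced
```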
Abstract:
This paper develops a meshless approach based on the Point Interpolation Method (PIM) for the numerical simulation of a space fractional diffusion equation. Two fully discrete schemes for the one-dimensional space fractional diffusion equation are obtained by using the PIM and the strong forms of the space fractional diffusion equation. Numerical examples with different nodal distributions are studied to validate and investigate the accuracy and efficiency of the newly developed meshless approach.
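To show the equation being solved, here is a reference solution for the 1-D space fractional diffusion equation u_t = K·∂^α u/∂x^α (1 < α ≤ 2) using the standard shifted Grünwald-Letnikov finite-difference scheme with explicit Euler time stepping. This is not the paper's meshless PIM; the initial condition and parameters are illustrative.

```python
# Hedged sketch: shifted Grunwald-Letnikov scheme for 1-D space fractional
# diffusion with zero Dirichlet boundaries (reference scheme, not PIM).
import numpy as np

ALPHA, K, L, N, DT, STEPS = 1.8, 0.5, 1.0, 50, 1e-4, 500
h = L / N
x = np.linspace(0.0, L, N + 1)
u = x ** 2 * (1 - x)                      # initial condition (illustrative)

# Grunwald-Letnikov weights: g_0 = 1, g_k = g_{k-1} * (k - 1 - alpha) / k
g = np.ones(N + 2)
for k in range(1, N + 2):
    g[k] = g[k - 1] * (k - 1 - ALPHA) / k

for _ in range(STEPS):                    # explicit Euler in time
    du = np.zeros_like(u)
    for i in range(1, N):                 # interior nodes; u = 0 on boundary
        # Shifted GL: D^alpha u(x_i) ~ h^-alpha * sum_k g_k * u_{i-k+1}
        du[i] = g[: i + 2] @ u[i + 1 :: -1] / h ** ALPHA
    u += DT * K * du

print(round(float(u.max()), 4))           # the initial peak diffuses and decays
```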
Abstract:
The April 2015 edition of Curriculum Perspectives has a special focus and casts light on the continuing development of the Australian Curriculum. This paper provides an introduction to a series of papers in the Point and Counterpoint section of this edition on the Review of the Australian Curriculum with reference to History. It makes clear that History is one of the most contested areas of the curriculum and that, whilst politicians and policy makers are concerned with the importance of history in relation to national identity and nation building, history serves other purposes. The paper reiterates the need to pay attention to the particularities of discipline-based knowledge for the study of history in schools and the central role of inquiry for student learning in history. In doing so, it establishes the context for the five papers which follow.
Abstract:
Smart card Automated Fare Collection (AFC) data has been extensively exploited to understand passenger behavior, passenger segments and trip purposes, and to improve transit planning through spatial travel pattern analysis. The literature has evolved from simple to more sophisticated methods, for example from aggregated to individual travel pattern analysis, and from stop-to-stop to flexible stop aggregation. However, high computational complexity has limited the practical application of these methods. This paper proposes a new algorithm, the Weighted Stop Density Based Scanning Algorithm with Noise (WS-DBSCAN), based on the classical DBSCAN (Density-Based Spatial Clustering of Applications with Noise) algorithm, to detect and update daily changes in travel patterns. WS-DBSCAN reduces the classical DBSCAN's quadratic computational complexity to sub-quadratic complexity. A numerical experiment using real AFC data from South East Queensland, Australia shows that the algorithm requires only 0.45% of the computation time of classical DBSCAN while providing the same clustering results.
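For orientation, here is the classical DBSCAN baseline run with scikit-learn, where sample_weight plays the role of per-location stop weights. The synthetic boarding locations are placeholders; the actual WS-DBSCAN weighting and incremental daily update are defined in the paper.

```python
# Hedged sketch: classical (weighted) DBSCAN baseline via scikit-learn.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Synthetic boarding locations around two stops, plus uniform noise.
stops = np.vstack([rng.normal([0, 0], 0.1, (50, 2)),
                   rng.normal([2, 2], 0.1, (50, 2)),
                   rng.uniform(-1, 3, (10, 2))])
weights = rng.integers(1, 5, len(stops)).astype(float)  # trips per location

labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(stops, sample_weight=weights)
print(sorted(set(labels)))  # typically two stop clusters, with -1 marking noise
```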