Abstract:
In many active noise control (ANC) applications, an online secondary-path modelling method that uses white noise as a training signal is required. This paper proposes a new feedback ANC system. Here we modify both the FxLMS and the VSS-LMS algorithms to raise the noise attenuation and modelling accuracy of the overall system. The proposed algorithm stops injection of the white noise at the optimum point and reactivates the injection during operation, if needed, to maintain system performance. Preventing continuous injection of the white noise significantly increases the performance of the proposed method and makes it more desirable for practical ANC systems. Computer simulation results shown in this paper indicate the effectiveness of the proposed method.
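As background, the FxLMS update at the heart of such systems can be sketched as follows. This is a minimal illustration only, assuming a short FIR secondary path and a fixed secondary-path estimate; the paper's VSS-LMS online modelling and the white-noise injection-control logic are not reproduced here.

```python
import math

def simulate_anc(x, d, s, s_hat, mu=0.05, L=8):
    """Minimal FxLMS loop: adapt control filter w so the anti-noise,
    shaped by the physical secondary path s, cancels the primary noise d.
    The reference is pre-filtered through the secondary-path estimate s_hat."""
    w = [0.0] * L                       # adaptive control filter
    xh = [0.0] * max(L, len(s_hat))     # reference history x(n), x(n-1), ...
    yh = [0.0] * len(s)                 # control-output history
    xfh = [0.0] * L                     # filtered-reference history
    errs = []
    for n in range(len(x)):
        xh.insert(0, x[n]); xh.pop()
        y = sum(w[i] * xh[i] for i in range(L))           # anti-noise source
        yh.insert(0, y); yh.pop()
        anti = sum(s[k] * yh[k] for k in range(len(s)))   # at the error mic
        e = d[n] - anti                                   # residual noise
        xf = sum(s_hat[k] * xh[k] for k in range(len(s_hat)))
        xfh.insert(0, xf); xfh.pop()
        for i in range(L):                                # FxLMS weight update
            w[i] += mu * e * xfh[i]
        errs.append(e)
    return errs
```

For a tonal reference and an accurate secondary-path estimate, the residual error decays towards zero; the quality of `s_hat` is exactly what the proposed online modelling scheme is designed to maintain.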
Abstract:
In this paper, a demand-responsive decision support system is proposed by integrating the operations of coal shipment, coal stockpiles and coal railing within a whole system. A generic and flexible scheduling optimisation methodology is developed to identify, represent, model, solve and analyse the coal transport problem in a standard and convenient way. As a result, an integrated train-stockpile-ship timetable is created and optimised to improve the overall efficiency of the coal transport system. A comprehensive sensitivity analysis based on extensive computational experiments is conducted to validate the proposed methodology. The mathematical propositions and proofs are presented as technical and insightful advice for industry practice. The proposed methodology supports better decision making on how to assign rail rolling stock and upgrade infrastructure in order to significantly improve capacity utilisation with the best resource-effectiveness ratio. The proposed decision support system, with its train-stockpile-ship scheduling optimisation techniques, is promising for application in the railway and mining industries, especially as a quantitative decision-making tool on how to make better use of current rolling stock, or whether to buy additional rolling stock, for mining transportation.
Abstract:
Prolonged intermittent-sprint exercise (i.e., team sports) induces disturbances in skeletal muscle structure and function that are associated with reduced contractile function, a cascade of inflammatory responses, perceptual soreness, and a delayed return to optimal physical performance. In this context, recovery from exercise-induced fatigue is traditionally treated from a peripheral viewpoint, with the regeneration of muscle physiology and other peripheral factors the target of recovery strategies. The direction of this research narrative on post-exercise recovery differs from the increasing emphasis on the complex interaction between both central and peripheral factors regulating exercise intensity during exercise performance. Given the role of the central nervous system (CNS) in motor-unit recruitment during exercise, it too may have an integral role in post-exercise recovery. Indeed, this hypothesis is indirectly supported by an apparent disconnect in the time-course changes in physiological and biochemical markers resulting from exercise and the ensuing recovery of exercise performance. Equally, improvements in perceptual recovery, notwithstanding the physiological state of recovery, may interact with both feed-forward and feed-back mechanisms to influence subsequent efforts. Considering the research interest afforded to recovery methodologies designed to hasten the return of homeostasis within the muscle, the limited focus on contributors to post-exercise recovery of CNS origin is somewhat surprising. In this context, the current review aims to outline the potential contributions of the brain to performance recovery after strenuous exercise.
Abstract:
A security system based on the recognition of the iris of human eyes using the wavelet transform is presented. The zero-crossings of the wavelet transform are used to extract the unique features obtained from the grey-level profiles of the iris. The recognition process is performed in two stages. The first stage consists of building a one-dimensional representation of the grey-level profiles of the iris, followed by obtaining the wavelet transform zero-crossings of the resulting representation. The second stage is the matching procedure for iris recognition. The proposed approach uses only a few selected intermediate resolution levels for matching, thus making it computationally efficient as well as less sensitive to noise and quantisation errors. A normalisation process is implemented to compensate for size variations due to possible changes in the camera-to-face distance. The technique has been tested on real images in both noise-free and noisy conditions. The technique is being investigated for real-time implementation, as a stand-alone system, for access control to high-security areas.
Abstract:
Facial expression recognition (FER) systems must ultimately work on real data in uncontrolled environments, although most research studies have been conducted on lab-based data with posed or evoked facial expressions obtained in pre-set laboratory environments. It is very difficult to obtain data in real-world situations because privacy laws prevent unauthorized capture and use of video from events such as funerals, birthday parties, marriages, etc. It is a challenge to acquire such data on a scale large enough for benchmarking algorithms. Although video obtained from TV, movies or postings on the World Wide Web may also contain ‘acted’ emotions and facial expressions, it may be more ‘realistic’ than the lab-based data currently used by most researchers. Or is it? One way of testing this is to compare feature distributions and FER performance. This paper describes a database that has been collected from television broadcasts and the World Wide Web, containing the range of environmental and facial variations expected in real conditions, and uses it to answer this question. A fully automatic system that uses a fusion-based approach for FER on such data is introduced for performance evaluation. Performance improvements arising from the fusion of point-based texture and geometry features, and the robustness to image scale variations, are experimentally evaluated on this image and video dataset. Differences in FER performance between lab-based and realistic data, between different feature sets, and between different train-test data splits are investigated.
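Feature-level fusion of the kind evaluated here can be sketched minimally as weighting and concatenating the two modality vectors before classification. The features, weights, labels and 1-NN classifier below are illustrative assumptions, not the paper's actual descriptors or classifier.

```python
def fuse(texture, geometry, w_t=1.0, w_g=1.0):
    """Feature-level fusion: weight each modality, then concatenate.
    (Weights and feature vectors here are illustrative only.)"""
    return [w_t * v for v in texture] + [w_g * v for v in geometry]

def nearest_expression(fused, prototypes):
    """1-nearest-neighbour over fused vectors; `prototypes` maps an
    expression label to a fused reference vector."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda label: sq_dist(fused, prototypes[label]))
```

The per-modality weights give a simple knob for trading off texture against geometry cues, one of the comparisons the paper's evaluation explores.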
Abstract:
With the ever-increasing penetration of wind power, its impacts on the power system are becoming more and more significant. Hence, it is necessary to systematically examine its impacts on small signal stability and transient stability in order to identify countermeasures. As such, a comprehensive study is carried out to compare the dynamic performance of a power system with three widely used wind power generators. First, dynamic models are described for the three types of wind power generators, i.e., the squirrel cage induction generator (SCIG), the doubly fed induction generator (DFIG) and the permanent magnet generator (PMG). Then, the impacts of these wind power generators on small signal stability and transient stability are compared with those of a substituted synchronous generator (SG) in the WSCC three-machine nine-bus system, using eigenvalue analysis and dynamic time-domain simulations. Simulation results show that the impacts of the different wind power generators differ under small and large disturbances.
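The eigenvalue analysis mentioned above reduces, for each linearised mode, to checking the sign of the real part and the damping ratio. A minimal sketch for a 2x2 state matrix (e.g. a single swing-type mode; the actual WSCC system matrix is far larger):

```python
import cmath

def eig2x2(a11, a12, a21, a22):
    """Eigenvalues of a 2x2 linearised state matrix, from the
    characteristic polynomial lambda^2 - tr*lambda + det = 0."""
    tr, det = a11 + a22, a11 * a22 - a12 * a21
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def damping_ratio(lam):
    """zeta = -Re(lambda) / |lambda| for an oscillatory mode;
    the mode is small-signal stable when Re(lambda) < 0."""
    return -lam.real / abs(lam)
```

For instance, a second-order model M d''+ D d' + K d = 0 gives the state matrix [[0, 1], [-K/M, -D/M]], whose eigenvalues directly expose the mode's frequency and damping.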
Abstract:
In this article, we analyse bifurcations from stationary stable spots to travelling spots in a planar three-component FitzHugh-Nagumo system that was proposed previously as a phenomenological model of gas-discharge systems. By combining formal analyses, center-manifold reductions, and detailed numerical continuation studies, we show that, in the parameter regime under consideration, the stationary spot destabilizes either through its zeroth Fourier mode in a Hopf bifurcation or through its first Fourier mode in a pitchfork or drift bifurcation, whilst the remaining Fourier modes appear to create only secondary bifurcations. Pitchfork bifurcations result in travelling spots, and we derive criteria for the criticality of these bifurcations. Our main finding is that supercritical drift bifurcations, leading to stable travelling spots, arise in this model, which does not seem possible for its two-component version.
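For reference, the planar three-component FitzHugh-Nagumo system of the gas-discharge literature is commonly written in the following form (this is the standard phenomenological model; the paper's exact notation and parameter values may differ):

$$
\begin{aligned}
u_t &= D_u \,\Delta u + \lambda u - u^3 - \kappa_3 v - \kappa_4 w + \kappa_1,\\
\tau\, v_t &= D_v \,\Delta v + u - v,\\
\theta\, w_t &= D_w \,\Delta w + u - w,
\end{aligned}
$$

with activator $u$ and two inhibitors $v, w$ whose time scales $\tau, \theta$ and diffusivities control whether the spot destabilises through the Hopf (zeroth Fourier mode) or the drift/pitchfork (first Fourier mode) route discussed above.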
Abstract:
We have developed a method to test the cytotoxicity of wound dressings, ointments, creams and gels used in our Burn Centre, by placing them on a permeable Nunc polycarbonate cell culture insert incubated with a monolayer of cells (HaCaTs and primary human keratinocytes). METHODS: We used two different methods to determine the relative toxicity to cells. (1) Photo visualisation: the dressings or compounds were positioned on the insert's membrane, which was placed onto the monolayer tissue culture plate. After 24 h the surviving adherent cells were stained with Toluidine Blue and photos of the plates were taken. The acellular area of non-adherent dead cells, which had been washed off with buffer, was measured as a percentage of the total area of the plate. (2) Cell count of surviving cells: after 24 h incubation with the test material, the remaining cells were detached with trypsin, spun down and counted in a haemocytometer with Trypan Blue, which differentiates between live and dead cells. RESULTS: Seventeen products were tested. The least cytotoxic products were Melolite, White Soft Paraffin and Chlorsig 1% Ointment. Some cytotoxicity was shown with Jelonet, Mepitel®, PolyMem®, DuoDerm® and Xeroform. The most cytotoxic products included those containing silver or chlorhexidine, and Paraffin Cream, a moisturiser which contains the preservative chlorocresol. CONCLUSION: This in vitro cell culture insert method allows testing of agents without direct cell contact. It is easy and quick to perform, and should help the clinician to determine the relative cytotoxicity of various dressings and the optimal dressing for each individual wound.
Abstract:
Background: Nontuberculous mycobacteria (NTM) are normal inhabitants of a variety of environmental reservoirs, including natural and municipal water. The aim of this study was to document the variety of species of NTM in potable water in Brisbane, QLD, with a specific interest in the main pathogens responsible for disease in this region, and to explore factors associated with the isolation of NTM. One-litre water samples were collected from 189 routine collection sites in summer and 195 sites in winter. Samples were split, with half decontaminated with 0.005% CPC, then concentrated by filtration and cultured on 7H11 plates and in MGIT tubes (winter only). Results: Mycobacteria were grown from 40.21% of sites in summer (76/189) and 82.05% of sites in winter (160/195). The winter samples yielded the greatest number and variety of mycobacteria, as there was a high degree of subculture overgrowth and contamination in summer. Of those samples that did yield mycobacteria in summer, the variety of species differed from those isolated in winter. The inclusion of liquid media increased the yield for some species of NTM. Species that have been documented to cause disease in humans residing in Brisbane and that were also found in water include M. gordonae, M. kansasii, M. abscessus, M. chelonae, M. fortuitum complex, M. intracellulare, M. avium complex, M. flavescens, M. interjectum, M. lentiflavum, M. mucogenicum, M. simiae, M. szulgai and M. terrae. M. kansasii was frequently isolated, but M. avium and M. intracellulare (the main pathogens responsible for disease in QLD) were isolated infrequently. Distance of the sampling site from the treatment plant in summer was associated with isolation of NTM. Pathogenic NTM (defined as those known to cause disease in QLD) were more likely to be identified from sites with narrower-diameter pipes, predominantly distribution sample points, and from sites with asbestos cement or modified PVC pipes.
Conclusions: NTM responsible for human disease can be found in large urban water distribution systems in Australia. Based on our findings, additional point chlorination, maintenance of more constant pressure gradients in the system, and the utilisation of particular pipe materials should be considered.
Abstract:
There is currently a wide range of research into the recent introduction of student response systems in higher education and tertiary settings (Banks 2006; Kay and Le Sange 2009; Beatty and Gerace 2009; Lantz 2010; Sprague and Dahl 2009). However, most of this pedagogical literature has generated ‘how to’ approaches regarding the use of ‘clickers’, keypads, and similar response technologies. There are currently no systematic reviews of the effectiveness of ‘GoSoapBox’ – a more recent, and increasingly popular, student response system – in enhancing critical thinking and achieving sustained learning outcomes. With rapid developments in teaching and learning technologies across all undergraduate disciplines, there is a need for comprehensive, evidence-based advice on these types of technologies, their uses, and overall efficacy. This paper addresses this gap in knowledge. Our teaching team, in an undergraduate Sociology and Public Health unit at the Queensland University of Technology (QUT), introduced GoSoapBox as a mechanism for discussing controversial topics, such as sexuality, gender, economics, religion, and politics, during lectures, and for taking opinion polls on social and cultural issues affecting human health. We also used this new teaching technology to allow students to interact with each other during class – on both social and academic topics – and to generate discussions and debates during lectures. The paper reports on a data-driven study into how this interactive online tool worked to improve engagement and the quality of academic work produced by students. This paper will, firstly, cover the recent literature reviewing student response systems in tertiary settings. Secondly, it will outline the theoretical framework used to generate this pedagogical research.
In keeping with the social and collaborative features of Web 2.0 technologies, Bandura’s Social Learning Theory (SLT) is applied here to investigate the effectiveness of GoSoapBox as an online tool for improving learning experiences and the quality of academic output by students. Bandura has emphasised the Internet as a tool for ‘self-controlled learning’ (Bandura 2001), as it provides the education sector with an opportunity to reconceptualise the relationship between learning and thinking (Glassman & Kang 2011). Thirdly, we describe the methods used to implement GoSoapBox in our lectures and tutorials, which aspects of the technology we drew on for learning purposes, and the methods for obtaining feedback from students about the effectiveness or otherwise of this tool. Fourthly, we report findings from an examination of all student/staff activity on GoSoapBox, as well as reports from students about its benefits and limitations as a learning aid. We then present a theoretical model, produced via an iterative analytical process between SLT and our data analysis, for use by academics and teachers across the undergraduate curriculum. The model has implications for all teachers considering the use of student response systems to improve the learning experiences of their students. Finally, we consider some of the negative aspects of GoSoapBox as a learning aid.
Abstract:
Global awareness of cleaner and renewable energy is transforming the electricity sector at many levels. New technologies are being increasingly integrated into the electricity grid at high, medium and low voltage levels, new taxes on carbon emissions are being introduced, and individuals can now produce electricity, mainly through rooftop photovoltaic (PV) systems. While leading to improvements, these changes also introduce challenges, and a question that often arises is ‘how can we manage this constantly evolving grid?’ The Queensland Government and Ergon Energy, one of the two Queensland distribution companies, have partnered with Australian and German universities on a project to answer this question in a holistic manner. The project investigates the impact that the integration of renewables and other new technologies has on the physical structure of the grid, and how this evolving system can be managed in a sustainable and economical manner. To aid understanding of what the future might bring, a software platform has been developed that integrates two modelling techniques: agent-based modelling (ABM), to capture the characteristics of the different system units accurately and dynamically, and particle swarm optimisation (PSO), to find the most economical mix of network extension and integration of distributed generation over long periods of time. Using data from Ergon Energy, two types of networks (three-phase, and Single Wire Earth Return or SWER) have been modelled; three-phase networks are usually used in dense networks such as urban areas, while SWER networks are widely used in rural Queensland. Simulations can be performed on these networks to identify the required upgrades, following a three-step process: a) what is already in place and how it performs under current and future loads, b) what can be done to manage it and plan the future grid, and c) how these upgrades/new installations will perform over time. The number of small-scale distributed generators, e.g.
PV and battery, is now sufficient (and expected to increase) to impact the operation of the grid, which in turn needs to be considered by the distribution network manager when planning upgrades and/or installations to stay within regulatory limits. Different scenarios can be simulated, with different levels of distributed generation, in place as well as expected, so that a large number of options can be assessed (step a). Once the location, sizing and timing of asset upgrades and/or installations are found using optimisation techniques (step b), it is possible to assess the adequacy of their daily performance using agent-based modelling (step c). One distinguishing feature of this software is that it is possible to analyse a whole area at once, while still having a tailored solution for each of the sub-areas. To illustrate this, using the impact that batteries and PV can have on the two types of networks mentioned above, three design conditions can be identified (amongst others):
- Urban conditions:
  - feeders that have a low take-up of solar generators may benefit from adding solar panels;
  - feeders that need voltage support at specific times may be assisted by installing batteries.
- Rural conditions (SWER network):
  - feeders that need voltage support as well as peak lopping may benefit from both battery and solar panel installations.
This small example demonstrates that no single solution can be applied across all three areas, and there is a need to be selective about which one is applied to each branch of the network. This is currently the function of the engineer, who can define various scenarios against a configuration, test them and iterate towards an appropriate solution. Future work will focus on increasing the level of automation in identifying areas where particular solutions are applicable.
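The PSO component of step (b) can be sketched generically. The swarm, inertia and acceleration parameters below are textbook defaults, and the toy cost function stands in for the project's actual network-cost model (asset locations, sizes and timings), which the abstract does not specify.

```python
import random

def pso(f, dim, n=20, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Generic particle swarm optimisation: minimise f over a dim-D box."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                   # personal bests
    pf = [f(x) for x in X]
    g = min(range(n), key=lambda i: pf[i])
    G, gf = P[g][:], pf[g]                  # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pf[i]:                  # update personal best
                pf[i], P[i] = fx, X[i][:]
                if fx < gf:                 # update global best
                    gf, G = fx, X[i][:]
    return G, gf
```

In the platform described above, each particle would encode a candidate plan (which assets to upgrade or install, where and when), and the agent-based simulation would supply the cost being minimised.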
Abstract:
Energy auditing is an effective but costly approach for reducing the long-term energy consumption of buildings. When well-executed, energy loss can be quickly identified in the building structure and its subsystems. This then presents opportunities for improving energy efficiency. We present a low-cost, portable technology called "HeatWave" which allows non-experts to generate detailed 3D surface temperature models for energy auditing. This handheld 3D thermography system consists of two commercially available imaging sensors and a set of software algorithms which can be run on a laptop. The 3D model can be visualized in real-time by the operator so that they can monitor their degree of coverage as the sensors are used to capture data. In addition, results can be analyzed offline using the proposed "Spectra" multispectral visualization toolbox. The presence of surface temperature data in the generated 3D model enables the operator to easily identify and measure thermal irregularities such as thermal bridges, insulation leaks, moisture build-up and HVAC faults. Moreover, 3D models generated from subsequent audits of the same environment can be automatically compared to detect temporal changes in conditions and energy use over time.
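The automatic comparison of subsequent audits can be illustrated with a minimal sketch: given two temperature grids sampled over the same (already registered) surface, flag the cells that changed by more than a threshold. This is a hypothetical illustration; HeatWave's actual pipeline works on registered 3D models rather than flat grids.

```python
def thermal_changes(before, after, threshold=2.0):
    """Return (row, col) cells whose surface temperature changed by more
    than `threshold` degrees between two audits of the same surface."""
    return [(r, c)
            for r, row in enumerate(before)
            for c, t0 in enumerate(row)
            if abs(after[r][c] - t0) > threshold]
```

Flagged regions would then be candidates for inspection as new thermal bridges, insulation leaks or HVAC faults that appeared between audits.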