187 results for Tuned filter
Abstract:
Wilmar’s Pioneer Sugar mill needs to replace some small rotary vacuum filters (RVFs) because of the aged condition of the existing plant. A vacuum belt press filter (VBPF) manufactured by Technopulp of Brazil was purchased and installed at Pioneer Mill in September/October 2012, and commissioning trials were undertaken over a five-week period commencing in early November. There are no vacuum belt press filters currently in use in Australian sugar mills for mud processing. The Technopulp filter is a relatively common and well-accepted technology, with over 600 units installed. The main attractions of the VBPF for Pioneer Mill were…
Abstract:
The electronic and optical properties of anatase titanium dioxide (TiO2), co-doped with nitrogen (N) and lithium (Li), have been investigated by density functional theory with a Hubbard correction term U (DFT+U). It is found that Li dopants can effectively balance the net charges introduced by N dopants and shift the local state to the top of the valence band. Depending on the distribution of the dopants, the absorption edges of TiO2 may be red- or blue-shifted, consistent with recent experimental observations.
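For context, one widely used (rotationally invariant, Dudarev-type) form of the DFT+U correction is given below; the abstract does not state which formulation was employed, so this is illustrative only:

E_{\mathrm{DFT}+U} \;=\; E_{\mathrm{DFT}} \;+\; \frac{U_{\mathrm{eff}}}{2}\sum_{\sigma}\mathrm{Tr}\!\left[\rho^{\sigma}-\rho^{\sigma}\rho^{\sigma}\right], \qquad U_{\mathrm{eff}} = U - J,

where \rho^{\sigma} is the on-site occupation matrix of the correlated orbitals (typically Ti 3d, and in some studies also the N/O 2p states) for spin \sigma.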
Abstract:
Social media platforms risk polarising public opinions by employing proprietary algorithms that produce filter bubbles and echo chambers. As a result, the ability of citizens and communities to engage in robust debate in the public sphere is diminished. In response, this paper highlights the capacity of urban interfaces, such as pervasive displays, to counteract this trend by exposing citizens to the socio-cultural diversity of the city. Engagement with different ideas, networks and communities is crucial to both innovation and the functioning of democracy. We discuss examples of urban interfaces designed to play a key role in fostering this engagement. Based on an analysis of works empirically grounded in field observations and design research, we call for a theoretical framework that positions pervasive displays and other urban interfaces as civic media. We argue that when designed for more than wayfinding, advertisement or television broadcasts, urban screens as civic media can rectify some of the pitfalls of social media by allowing the polarised user to break out of their filter bubble and embrace the cultural diversity and richness of the city.
Abstract:
“Hardware in the Loop” (HIL) testing is widely used in the automotive industry. The sophisticated electronic control units used for vehicle control are usually tested and evaluated using HIL simulations. HIL increases the degree of realistic testing of any system. Moreover, it helps in designing the structure and control of the system under test so that it works effectively in the situations it will encounter in practice. Due to the size and complexity of interaction within a power network, most research is based on pure simulation. To validate the performance of a physical generator or protection system, most testing is constrained to very simple power networks. This research, however, examines a method to test power system hardware within a complex virtual environment using the HIL concept. HIL testing of electronic control units and power system protection devices can easily be performed at signal level, but the performance of power system equipment, such as distributed generation systems, cannot be evaluated at signal level using HIL testing. HIL testing of power system equipment is termed here ‘Power Network in the Loop’ (PNIL). PNIL testing can only be performed at power level and requires a power amplifier that can amplify the simulation signal to the power level. A power network is divided into two parts: one part represents the Power Network Under Test (PNUT) and the other represents the rest of the complex network. The complex network is simulated in a real-time simulator (RTS) while the PNUT is connected to a Voltage Source Converter (VSC) based power amplifier. Two-way interaction between the simulator and the amplifier is performed using analog-to-digital (A/D) and digital-to-analog (D/A) converters. The power amplifier amplifies the current or voltage signal of the simulator to the power level and establishes the power-level interaction between the RTS and the PNUT. The first part of this thesis presents the design and control of a VSC based power amplifier that can amplify a broadband voltage signal, and proposes a new Hybrid Discontinuous Control method for the amplifier. This amplifier can be used for several power system applications; its use in DSTATCOM and UPS applications is also presented in the first part. The later part of the thesis reports the solution of network-in-the-loop testing with the help of this amplifier. The experimental setup for PNIL testing was built in the laboratory of Queensland University of Technology and the feasibility of PNIL testing has been evaluated in experimental studies. In the last section of the thesis a universal load with power regenerative capability is designed; this universal load is used to test a DG system using PNIL concepts. This thesis is composed of published/submitted papers that form the chapters of the dissertation. Each paper was published or submitted during the period of candidature. Chapter 1 integrates all the papers to provide a coherent view of the wide-bandwidth switching amplifier and its use in different power system applications, especially the solution of power system testing using PNIL.
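As a rough illustration of the PNIL exchange described above, the sketch below closes a signal loop between a simulated network and a stand-in for the amplified PNUT; the time step, gain and R-L load are hypothetical and do not represent the thesis hardware.

# Hypothetical sketch of the PNIL exchange loop: the rest of the network is
# simulated, and the PNUT is represented by a simple R-L branch standing in
# for the real hardware behind the power amplifier.
import numpy as np

DT = 50e-6          # simulation/interface time step (s)
GAIN = 10.0         # amplifier voltage gain (D/A signal -> power level)
R, L = 2.0, 5e-3    # stand-in PNUT parameters (ohm, henry)

def simulate_rest_of_network(t, feedback_current):
    """Real-time simulator side: returns the interface voltage at signal level."""
    v_source = 325.0 * np.sin(2 * np.pi * 50 * t)      # 230 Vrms grid equivalent
    v_interface = v_source - 0.5 * feedback_current     # crude network impedance
    return v_interface / GAIN                           # scaled to D/A range

def pnut_response(v_power, i_prev):
    """Hardware side stand-in: R-L branch driven by the amplified voltage."""
    di = (v_power - R * i_prev) / L
    return i_prev + di * DT

i_pnut = 0.0
for k in range(20000):                                  # 1 s of simulated time
    t = k * DT
    v_signal = simulate_rest_of_network(t, i_pnut)      # D/A output
    v_power = GAIN * v_signal                           # power amplifier
    i_pnut = pnut_response(v_power, i_pnut)             # measured current (A/D input)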
Abstract:
Perceptual aliasing makes topological navigation a difficult task. In this paper we present a general approach for topological SLAM (simultaneous localisation and mapping) which does not require motion or odometry information but only a sequence of noisy measurements from visited places. We propose a particle filtering technique for topological SLAM which relies on a method for disambiguating places that appear indistinguishable, using neighbourhood information extracted from the sequence of observations. The algorithm aims to induce a small topological map which is consistent with the observations and simultaneously estimate the location of the robot. The proposed approach is evaluated using a data set of sonar measurements from an indoor environment which contains several similar places. It is demonstrated that our approach is capable of dealing with severe ambiguities, and that it infers a map which is small in terms of vertices and consistent with the sequence of observations.
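A minimal sketch of the particle filtering idea is given below, assuming the topological map (place adjacency) and a sensor model are already known; the paper's contribution of inferring the map itself and disambiguating places via neighbourhood information is not reproduced here.

# Minimal sketch of particle filtering over a known topological map; the
# adjacency graph and observation likelihood are hypothetical stand-ins.
import random

adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 0]}     # hypothetical place graph

def likelihood(obs, place):                                # stand-in sensor model
    return 0.8 if obs == place % 2 else 0.2

def step(particles, obs):
    # Motion: each particle moves to a random neighbouring place.
    moved = [random.choice(adjacency[p]) for p in particles]
    # Weight by the observation likelihood, then resample.
    weights = [likelihood(obs, p) for p in moved]
    return random.choices(moved, weights=weights, k=len(moved))

particles = [random.choice(list(adjacency)) for _ in range(200)]
for obs in [1, 0, 1, 1, 0]:                                # noisy place observations
    particles = step(particles, obs)
estimate = max(set(particles), key=particles.count)        # most likely current place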
Abstract:
This report summarises a project designed to enhance commercial real estate performance within both operational and investment contexts through the development of a model aimed at supporting improved decision-making. The model is based on a risk-adjusted discounted cash flow and provides a valuable toolkit for building managers, owners and potential investors to evaluate individual building performance in terms of financial, social and environmental criteria over the complete life-cycle of the asset. The ‘triple bottom line’ approach to the evaluation of commercial property has particular significance for the administrators of public property portfolios. It also has applications more generally for the wider real estate industry, given that the advent of ‘green’ construction requires new methods for evaluating both new and existing building stocks. The research is unique in that it focuses on the accuracy of the input variables required for the model. These key variables were largely determined by market-based research and an extensive literature review, and have been fine-tuned with extensive testing. In essence, the project has considered probability-based risk analysis techniques that required market-based assessment. The projections listed in the partner engineers’ building audit reports of the four case study buildings were fed into the property evaluation model developed by the research team. The results are strongly consistent with previously existing, less robust evaluation techniques. Importantly, this model pioneers an approach for taking full account of the triple bottom line, establishing a benchmark for related research to follow. The project’s industry partners expressed a high degree of satisfaction with the project outcomes at a recent demonstration seminar. The project in its existing form has not been geared towards commercial applications, but it is anticipated that QDPW and other industry partners will benefit greatly by using this tool for the performance evaluation of property assets. The project met the objectives of the original proposal as well as all the specified milestones, and was completed within budget and on time. It achieved these objectives by establishing research foci on the model structure, the identification of the key input variables, the drivers of the relevant property markets and the determinants of the key variables (Research Engine no. 1); the examination of risk measurement and the incorporation of risk simulation exercises (Research Engine no. 2); and the importance of both environmental and social factors and, finally, the impact of the triple bottom line measures on the asset (Research Engine no. 3).
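The sketch below illustrates the general shape of a risk-adjusted discounted cash flow with Monte Carlo simulation of the key input variables; all figures, distributions and the structure are hypothetical and are not taken from the project's model.

# Illustrative Monte Carlo, risk-adjusted DCF of the general kind described
# above; every number and distribution here is a placeholder.
import random

def npv(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def simulate_building(years=20, runs=5000):
    results = []
    for _ in range(runs):
        rent = random.gauss(1.0e6, 0.1e6)        # annual net income ($)
        growth = random.gauss(0.025, 0.01)       # income growth rate
        rate = random.gauss(0.08, 0.01)          # risk-adjusted discount rate
        flows = [rent * (1 + growth) ** t for t in range(years)]
        results.append(npv(flows, rate))
    results.sort()
    # Return the median outcome and a downside (5th percentile) estimate.
    return results[len(results) // 2], results[int(0.05 * len(results))]

median_npv, downside_npv = simulate_building()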
Abstract:
The validation of Computed Tomography (CT) based 3D models is an integral part of studies involving 3D models of bones. This is of particular importance when such models are used for Finite Element studies. The validation of 3D models typically involves the generation of a reference model representing the bone's outer surface. Several different devices have been utilised for digitising a bone's outer surface, such as mechanical 3D digitising arms, mechanical 3D contact scanners, electro-magnetic tracking devices and 3D laser scanners. However, none of these devices is capable of digitising a bone's internal surfaces, such as the medullary canal of a long bone. Therefore, this study investigated the use of a 3D contact scanner, in conjunction with a microCT scanner, for generating a reference standard for validating the internal and external surfaces of a CT based 3D model of an ovine femur. One fresh ovine limb was scanned using a clinical CT scanner (Phillips, Brilliance 64) with a pixel size of 0.4 mm² and a slice spacing of 0.5 mm. The limb was then dissected to obtain the soft-tissue-free bone, while care was taken to protect the bone's surface. A desktop mechanical 3D contact scanner (Roland DG Corporation, MDX 20, Japan) was used to digitise the surface of the denuded bone at a resolution of 0.3 × 0.3 × 0.025 mm. The digitised surfaces were reconstructed into a 3D model using reverse engineering techniques in Rapidform (Inus Technology, Korea). After digitisation, the distal and proximal parts of the bone were removed so that the shaft could be scanned with a microCT scanner (µCT40, Scanco Medical, Switzerland). The shaft, with the bone marrow removed, was immersed in water and scanned with a voxel size of 0.03 mm³. The bone contours were extracted from the image data using the Canny edge filter in Matlab (The MathWorks). The extracted bone contours were reconstructed into 3D models using Amira 5.1 (Visage Imaging, Germany). The 3D models of the bone's outer surface reconstructed from CT and microCT data were compared against the 3D model generated using the contact scanner, and the 3D model of the inner canal reconstructed from the microCT data was compared against the 3D models reconstructed from the clinical CT scanner data. The disparity between the surface geometries of two models was calculated in Rapidform and recorded as an average distance with standard deviation. The comparison of the 3D model of the whole bone generated from the clinical CT data with the reference model gave a mean error of 0.19±0.16 mm, while the shaft was more accurate (0.08±0.06 mm) than the proximal (0.26±0.18 mm) and distal (0.22±0.16 mm) parts. The comparison between the outer 3D model generated from the microCT data and the contact scanner model gave a mean error of 0.10±0.03 mm, indicating that the microCT generated models are sufficiently accurate for validation of 3D models generated from other methods. The comparison of the inner models generated from microCT data with those from clinical CT data gave an error of 0.09±0.07 mm. Utilising a mechanical contact scanner in conjunction with a microCT scanner made it possible to validate both the outer surface of a CT based 3D model of an ovine femur and the surface of the model's medullary canal.
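A rough Python analogue of the contour-extraction and model-comparison steps is sketched below for orientation; the study itself used Matlab and Rapidform, and the array names and synthetic data here are placeholders.

# Rough analogue of Canny contour extraction on CT slices and a mean ± SD
# surface deviation between two reconstructed models, using placeholder data.
import numpy as np
from skimage import feature
from scipy.spatial import cKDTree

ct_stack = np.random.rand(50, 256, 256)          # placeholder for CT slice data
contours = [feature.canny(ct_stack[i], sigma=2.0) for i in range(ct_stack.shape[0])]

# Surface deviation between two reconstructed models, approximated here as
# nearest-neighbour distances between their vertex clouds.
model_a = np.random.rand(1000, 3)                # e.g. CT-based surface vertices
model_b = np.random.rand(1200, 3)                # e.g. reference (contact scanner) vertices
d, _ = cKDTree(model_b).query(model_a)
print(f"surface deviation: {d.mean():.3f} ± {d.std():.3f} (model units)")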
Abstract:
This paper proposes a novel relative entropy rate (RER) based approach for multiple HMM (MHMM) approximation of a class of discrete-time uncertain processes. Under different uncertainty assumptions, the model design problem is posed either as a min-max optimisation problem or as a stochastic minimisation problem on the RER between the joint laws describing the state and output processes (rather than the more usual RER between output processes). A suitable filter is proposed, and performance results are established that bound conditional mean estimation performance and show that estimation performance improves as the RER is reduced. These filter consistency and convergence bounds are the first results characterising multiple HMM approximation performance and suggest that joint RER concepts provide a useful model selection criterion. The proposed model design process and MHMM filter are demonstrated on an important dim-target detection problem in image processing.
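The abstract does not reproduce the definition, but a standard form of the relative entropy rate between two process laws P and Q (here the joint laws of the state and output processes) is

\bar{D}(P\,\|\,Q) \;=\; \lim_{n\to\infty}\frac{1}{n}\,D\!\left(P_{0:n}\,\|\,Q_{0:n}\right), \qquad D(P\,\|\,Q) \;=\; \mathbb{E}_{P}\!\left[\log\frac{\mathrm{d}P}{\mathrm{d}Q}\right],

so reducing the joint RER brings the approximating MHMM's state and output behaviour closer to that of the uncertain process, consistent with the estimation bounds described above.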
Abstract:
This thesis is a documented energy audit and long-term study of energy and water reduction in a ghee factory. Global production of ghee exceeds 4 million tonnes annually. The factory in this study refines dairy products by non-traditional centrifugal separation and produces 99.9% pure, canned, crystallised Anhydrous Milk Fat (Ghee). Ghee is traditionally made by batch processing methods, which are less efficient than centrifugal separation. An in-depth, systematic investigation was conducted of each item of major equipment: the ammonia refrigeration plant, steam boiler, canning equipment, pumps, heat exchangers and compressed air system were all fine-tuned. Continuous monitoring of electrical usage showed that not every initiative worked; others had payback periods of less than a year. In 1994-95 energy consumption was 6,582 GJ and in 2003-04 it was 5,552 GJ, down 16% for a similar output. A significant reduction in water usage was achieved by reducing the airflow in the refrigeration evaporative condensers to match the refrigeration load. Water usage has fallen 68%, from 18 ML in 1994-95 to 5.78 ML in 2003-04. The methods reported in this thesis could be applied to other industries with similar equipment, and to other ghee manufacturers.
Abstract:
In the conventional fabrication of ceramic separation membranes, particulate sols are applied onto porous supports. The major structural deficiencies of this approach are pin-holes and cracks, and the dramatic losses of flux when pore sizes are reduced to enhance selectivity. We have overcome these structural deficiencies by constructing a hierarchically structured separation layer on a porous substrate using larger titanate nanofibers and smaller boehmite nanofibers. This yields a radical change in membrane texture. The resulting membranes effectively filter out species larger than 60 nm at flow rates orders of magnitude greater than those of conventional membranes. This reveals a new direction in membrane fabrication.
Abstract:
Purpose: In this research we examined, by means of case studies, the mechanisms by which relationships can be managed and by which communication and cooperation can be enhanced in sustainable supply chains. The research was predicated on the contention that the development of a sustainable supply chain depends, in part, on the transfer of knowledge and capabilities from the larger players in the supply chain. Design/Methodology/Approach: The research adopted a triangulated approach in which quantitative data were collected by questionnaire, interviews were conducted to explore and enrich the quantitative data, and case studies were undertaken in order to illustrate and validate the findings. Handy’s (1985) view of organisational culture, Allen & Meyer’s (1990) concepts of organisational commitment and Van de Ven & Ferry’s (1980) measures of organisational structuring have been combined into a model to test and explain how collaborative mechanisms can affect supply chain sustainability. Findings: It has been shown that the degree of match and mismatch between organisational culture and structure has an impact on staff’s commitment level. A sustainable supply chain depends on convergence – that is, the match between organisational structuring, organisational culture and organisational commitment. Research Limitations/Implications: The study is a proof of concept and three case studies have been used to illustrate the nature of the model developed. Further testing and refinement of the model in practice should be the next step in this research. Practical Implications: The concept of relationship management needs to filter down to all levels in the supply chain if participants are to retain commitment and buy-in to the relationship. A sustainable supply chain requires proactive relationship management and the development of an appropriate organisational culture and trust. By legitimising individuals’ expectations of the type of culture which is appropriate to their company and empowering employees to address mismatches that may occur, a situation can be created whereby the collaborating organisations develop their competences symbiotically and so facilitate a sustainable supply chain. Originality/Value: The culture/commitment/structure model developed from three separate strands of management thought has proved to be a powerful tool for analysing collaboration in supply chains and explaining how and why some supply chains are sustainable and others are not.
Abstract:
This paper describes a novel framework for facial expression recognition from still images by selecting, optimizing and fusing ‘salient’ Gabor feature layers to recognize six universal facial expressions using the K-nearest-neighbor classifier. Recognition comparisons with the all-layer approach on the JAFFE and Cohn-Kanade (CK) databases confirm that using ‘salient’ Gabor feature layers with optimized sizes can achieve better recognition performance and dramatically reduce computational time. Moreover, comparisons with state-of-the-art performance demonstrate the effectiveness of our approach.
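A generic sketch of Gabor feature extraction followed by K-nearest-neighbour classification is shown below; it does not implement the paper's salient-layer selection, optimisation or fusion, and the data are placeholders.

# Generic Gabor-feature + KNN pipeline in the spirit of the abstract above;
# all images and labels here are synthetic placeholders.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def gabor_features(img, thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4), wavelengths=(4, 8, 16)):
    feats = []
    for theta in thetas:
        for lambd in wavelengths:
            kernel = cv2.getGaborKernel((31, 31), 4.0, theta, lambd, 0.5)
            response = cv2.filter2D(img.astype(np.float32), cv2.CV_32F, kernel)
            feats.extend([response.mean(), response.std()])   # per-layer summary
    return np.array(feats)

# Placeholder data: real use would load face images labelled with expressions.
X = np.array([gabor_features(np.random.rand(64, 64)) for _ in range(60)])
y = np.random.randint(0, 6, size=60)                          # six universal expressions
clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
pred = clf.predict(X[:5])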
Abstract:
Traditional ceramic separation membranes, which are fabricated by applying colloidal suspensions of metal hydroxides to porous supports, tend to suffer from pinholes and cracks that seriously affect their quality. Other intrinsic problems for these membranes include dramatic losses of flux when the pore sizes are reduced to enhance selectivity and dead-end pores that make no contribution to filtration. In this work, we propose a new strategy for addressing these problems by constructing a hierarchically structured separation layer on a porous substrate using large titanate nanofibers and smaller boehmite nanofibers. The nanofibers are able to divide large voids into smaller ones without forming dead-end pores and with the minimum reduction of the total void volume. The separation layer of nanofibers has a porosity of over 70% of its volume, whereas the separation layer in conventional ceramic membranes has a porosity below 36% and inevitably includes dead-end pores that make no contribution to the flux. This radical change in membrane texture greatly enhances membrane performance. The resulting membranes were able to filter out 95.3% of 60-nm particles from a 0.01 wt % latex while maintaining a relatively high flux of between 800 and 1000 L/m²·h, under a low driving pressure (20 kPa). Such flow rates are orders of magnitude greater than those of conventional membranes with equal selectivity. Moreover, the flux was stable at approximately 800 L/m²·h with a selectivity of more than 95%, even after six repeated runs of filtration and calcination. Use of different supports, either porous glass or porous alumina, had no substantial effect on the performance of the membranes; thus, it is possible to construct the membranes from a variety of supports without compromising functionality. The Darcy equation satisfactorily describes the correlation between the filtration flux and the structural parameters of the new membranes. The assembly of nanofiber meshes to combine high flux with excellent selectivity is an exciting new direction in membrane fabrication.
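For reference, the standard Darcy relation for flux through a porous layer, which the abstract invokes without stating explicitly, is

J \;=\; \frac{\kappa}{\mu}\,\frac{\Delta P}{L},

where J is the filtration flux, \kappa the permeability of the separation layer, \mu the viscosity of the permeate, \Delta P the driving pressure (20 kPa above) and L the layer thickness.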
Abstract:
Surveillance networks are typically monitored by a few people viewing several monitors displaying the camera feeds. It is therefore very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as a means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies: to recognise an event, people and objects must be tracked, and tracking also enhances the performance of tasks such as crowd analysis or human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or because the detection routines are unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, making it unnecessary for detection to be carried out separately except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow multiple objects to be tracked robustly and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter. A novel hybrid motion segmentation / optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Evaluation du Traitement et de l'Interpretation de Sequences vidEO - Evaluation for video understanding) database, and a significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, while the tracking system benefits from the improved performance in uncertain conditions arising from occlusion and noise provided by a particle filter. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results and demonstrates a significant improvement in performance when compared to a system using either modality individually.
Findings from the thesis contribute to improving the performance of semi-automated video processing and therefore improve security in areas under surveillance.
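The sketch below shows a generic OpenCV combination of motion segmentation (background subtraction) and dense optical flow on surveillance video; it illustrates the kind of pre-processing discussed above and is not the hybrid multi-layer algorithm proposed in the thesis. The input file name is hypothetical.

# Generic motion segmentation + dense optical flow on a video stream.
import cv2

cap = cv2.VideoCapture("surveillance.avi")        # hypothetical input file
bg = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

ok, frame = cap.read()
if not ok:
    raise SystemExit("could not read surveillance.avi")
prev_grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    foreground = bg.apply(frame)                  # binary motion mask (shadow pixels = 127)
    flow = cv2.calcOpticalFlowFarneback(prev_grey, grey, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    prev_grey = grey
    # 'foreground' and 'flow' would feed the detection and tracking stages described above.
cap.release()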
Abstract:
Intuitively, any ‘bag of words’ approach in IR should benefit from taking term dependencies into account. Unfortunately, for years the results of exploiting such dependencies have been mixed or inconclusive. To improve the situation, this paper shows how the natural language properties of the target documents can be used to transform and enrich the term dependencies into more useful statistics. This is done in three steps. First, the term co-occurrence statistics of queries and documents are each represented by a Markov chain. The paper proves that such a chain is ergodic, and therefore its asymptotic behavior is unique, stationary, and independent of the initial state. Next, the stationary distribution is taken to model queries and documents, rather than their initial distributions. Finally, ranking is achieved following the customary language modeling paradigm. The main contribution of this paper is to argue why the asymptotic behavior of the document model is a better representation than just the document's initial distribution. A secondary contribution is to investigate the practical application of this representation as queries become increasingly verbose. In the experiments (based on Lemur's search engine substrate) the default query model was replaced by the stable distribution of the query. Just modeling the query this way already resulted in significant improvements over a standard language model baseline. The results were on a par with or better than more sophisticated algorithms that use fine-tuned parameters or extensive training. Moreover, the more verbose the query, the more effective the approach seems to become.
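A toy sketch of the central step is given below: a term co-occurrence matrix is normalised into a Markov chain and its stationary distribution, rather than the initial distribution, is used as the language model; the terms, counts and iteration scheme are illustrative only.

# Toy stationary-distribution computation for a term co-occurrence Markov chain.
import numpy as np

terms = ["retrieval", "language", "model", "query"]
cooccurrence = np.array([[2, 3, 1, 1],
                         [3, 2, 4, 1],
                         [1, 4, 2, 2],
                         [1, 1, 2, 2]], dtype=float)          # toy co-occurrence counts

P = cooccurrence / cooccurrence.sum(axis=1, keepdims=True)    # row-stochastic transition matrix

pi = np.full(len(terms), 1.0 / len(terms))                    # arbitrary initial distribution
for _ in range(1000):                                         # power iteration to the fixed point
    pi = pi @ P
pi /= pi.sum()
# 'pi' is the stationary term distribution used for ranking in place of the initial
# distribution; ergodicity (as proved in the paper) guarantees it is unique.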