54 results for Built-in test
Low temperature synthesis of carbon nanotubes on indium tin oxide electrodes for organic solar cells
Abstract:
The electrical performance of indium tin oxide (ITO) coated glass was improved by adding a controlled layer of carbon nanotubes directly on top of the ITO film. Multi-wall carbon nanotubes (MWCNTs) were synthesized by chemical vapor deposition, using ultra-thin Fe layers as catalyst. The process parameters (temperature, gas flow and duration) were carefully refined to obtain the appropriate size and density of MWCNTs with minimal reduction of light harvesting in the cell. When used as anodes for organic solar cells based on poly(3-hexylthiophene) (P3HT) and phenyl-C61-butyric acid methyl ester (PCBM), the MWCNT-enhanced electrodes are found to improve charge carrier extraction from the photoactive blend, thanks to the additional percolation paths provided by the CNTs. The work function of the as-modified ITO surfaces was measured by the Kelvin probe method to be 4.95 eV, an improved match to the highest occupied molecular orbital level of P3HT. This is in turn expected to enhance hole transport and collection at the anode, contributing to the significant increases in current density and open-circuit voltage observed in test cells built with such MWCNT-enhanced electrodes.
Abstract:
This paper describes system identification, estimation and control of translational motion and heading angle for a cost-effective open-source quadcopter, the MikroKopter. The dynamics of its built-in sensors, roll and pitch attitude controller, and system latencies are determined and used to design a computationally inexpensive multi-rate velocity estimator that fuses data from the built-in inertial sensors and a low-rate onboard laser range finder. Control is performed using a nested loop structure that is also computationally inexpensive and incorporates the different sensors. Experimental results for the estimator and closed-loop positioning are presented and compared with ground truth from a motion capture system.
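As a rough illustration of the multi-rate fusion idea only, the sketch below integrates high-rate accelerometer data and corrects the velocity estimate whenever a low-rate laser-derived velocity arrives. The rates, the single gain k, and the one-axis setup are assumptions for illustration, not the identified dynamics or estimator of the paper.

```cpp
#include <cstdio>

// One-axis multi-rate velocity estimator sketch: integrate high-rate
// accelerometer data, correct with low-rate laser-derived velocity.
// The rates and the gain k are illustrative assumptions.
struct MultiRateVelocityEstimator {
    double v = 0.0;   // velocity estimate (m/s)
    double k = 0.2;   // correction gain (assumed)

    // High-rate prediction step, e.g. from 100 Hz inertial data.
    void predict(double accel, double dt) { v += accel * dt; }

    // Low-rate correction step, e.g. from 10 Hz laser scan matching.
    void correct(double v_laser) { v += k * (v_laser - v); }
};

int main() {
    MultiRateVelocityEstimator est;
    // Simulate 1 s of constant 0.5 m/s^2 acceleration; a (noiseless)
    // laser velocity measurement arrives every 10th IMU sample.
    for (int i = 0; i < 100; ++i) {
        est.predict(0.5, 0.01);
        if (i % 10 == 9) est.correct(0.5 * 0.01 * (i + 1));
    }
    std::printf("estimated velocity after 1 s: %.3f m/s\n", est.v);
    return 0;
}
```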
Abstract:
The Old Government House, a former residence of the Queen's representatives in Brisbane, Australia, symbolises the British cultural heritage of colonial Queensland. Located on the campus of the Queensland University of Technology, it is one of the oldest surviving examples of a stately residence in Queensland. Built in the 1860s, the Old Government House was originally intended as a temporary residence for the first governor of the newly independent colony of Queensland. However, it remained the vice-regal residence until 1909, serving eleven succeeding governors. Nearly seven decades later, it became the first building in Queensland to be protected under heritage legislation, establishing its importance as an excellent exemplar that demonstrates the significance of cultural heritage. The Old Government House has survived 150 years of restoration work, refurbishments, and additions. Through these years, it has served the people of Queensland in a multitude of roles. This paper aims to investigate the survival of heritage-listed buildings through their adaptive re-use. Its focus is the adaptive re-use of the Old Government House through its refurbishments and additions over a period of 150 years. Through a qualitative research process, this paper endeavours to establish the significance of the restoration work on the Old Government House; the new opportunities that have opened up as a result of the restoration work; the continued maintenance and management of the building through adaptive re-use; the economic benefits of the restoration work; and its contribution to the ongoing interest in the preservation of tangible cultural heritage.
Abstract:
The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty compared with directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear algebra-centred algorithms from R to C++ becomes straightforward. The algorithms retain their overall structure as well as readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
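To illustrate the style of Armadillo code that Rcpp lets one compile and call from R, the sketch below implements a single Kalman filter predict/update step for a toy constant-velocity model. It is a minimal sketch under stated assumptions: the matrices and measurements are invented for the example and are not the paper's benchmark, and the Rcpp glue itself is omitted.

```cpp
#include <armadillo>

// Minimal Kalman filter sketch in Armadillo (toy constant-velocity model;
// the matrices below are illustrative, not the paper's benchmark).
using arma::mat;
using arma::vec;

struct KalmanFilter {
    mat F, H, Q, R;   // state transition, observation, process/measurement noise
    vec x;            // state estimate
    mat P;            // state covariance

    void step(const vec& z) {
        // Predict
        x = F * x;
        P = F * P * F.t() + Q;
        // Update; Armadillo's expression templates pool chained operations
        mat S = H * P * H.t() + R;
        mat K = P * H.t() * arma::inv(S);
        x = x + K * (z - H * x);
        P = (arma::eye(arma::size(P)) - K * H) * P;
    }
};

int main() {
    KalmanFilter kf;
    kf.F = {{1.0, 1.0}, {0.0, 1.0}};
    kf.H = {{1.0, 0.0}};
    kf.Q = 0.01 * arma::eye(2, 2);
    kf.R = {{0.5}};
    kf.x = arma::zeros<vec>(2);
    kf.P = arma::eye(2, 2);
    for (double z : {1.0, 2.1, 2.9, 4.2}) kf.step(vec{z});
    kf.x.print("state estimate:");
    return 0;
}
```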
Abstract:
Twenty-first century learners operate in organic, immersive environments. A pedagogy of student-centred learning is not a recipe for rooms. A contemporary learning environment is like a landscape that grows, morphs, and responds to the pressures of its context and micro-culture. There is no single adaptable solution, nor a suite of off-the-shelf answers; propositions must be customisable and infinitely variable. They must be indeterminate and changeable, based on the creation of learning places, not restrictive or constraining spaces. A sustainable solution will be un-fixed, responsive to the life cycle of its components and materials, and able to be manipulated by its users; it will create and construct its own history. Learning occurs as formal education with situational knowledge structures, but also as informal learning, active learning, blended learning, social learning, incidental learning, and unintended learning. These are not spatial concepts but socio-cultural patterns of discovery. Individual learning requirements must run free and be accommodated as the learner sees fit. The spatial solution must accommodate and enable a full array of learning situations. It is a system, not an object. The system has three major components:
1. The determinate landscape: an in-situ concrete 'plate' that is permanent. It predates the other components of the system and remains as a remnant/imprint/fossil after the other components have been relocated. It is a functional learning landscape in its own right, enabling a variety of experiences and activities.
2. The indeterminate landscape: a kit of prefabricated 2-D panels assembled in a unique manner at each site to suit the client and context, manufactured to the principles of design-for-disassembly. It is a symbiotic, barnacle-like system that attaches itself to the existing infrastructure through the determinate landscape, which acts as a fast-growth rhizome: a carapace of protective panels, infinitely variable to create enclosed, semi-enclosed, and open learning places.
3. The stations: prefabricated packages of highly serviced space connected through the determinate landscape. There are four main types of stations: wet-room learning centres, dry-room learning centres, ablutions, and low-impact building services. Entirely customised at the factory and delivered to site, the stations can be retro-fitted to suit a new context during relocation.
Principles of design for disassembly. Material principles: • use recycled and recyclable materials • minimise the number of types of materials • no toxic materials • use lightweight materials • avoid secondary finishes • provide identification of material types. Component principles: • minimise/standardise the number of types of components • use mechanical, not chemical, connections • design for use of common tools and equipment • provide easy access to all components • make component sizes suit the means of handling • provide built-in means of handling • design to realistic tolerances • use a minimum number of connectors and a minimum number of types. System principles: • design for durability and repeated use • use prefabrication and mass production • provide spare components on site • sustain all assembly and material information.
Abstract:
Securing the IT infrastructures of our modern lives is a challenging task because of their increasing complexity, scale and agile nature. Monolithic approaches, such as using stand-alone firewalls and IDS devices for protecting the perimeter, cannot cope with complex malware and multistep attacks. Collaborative security emerges as a promising approach. However, research results in collaborative security are not yet mature and require continuous evaluation and testing. In this work, we present CIDE, a Collaborative Intrusion Detection Extension for the network security simulation platform (NeSSi2). Built-in functionalities include dynamic group formation based on node preferences, group-internal communication, group management and an approach for handling the infection process for malware-based attacks. The CIDE simulation environment provides functionalities for easy implementation of collaborating nodes in large-scale setups. We evaluate the group communication mechanism on the one hand, and on the other provide a case study evaluating our collaborative security evaluation platform in a signature exchange scenario.
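The dynamic group formation is described only at a high level; as an assumption about the mechanism rather than actual CIDE/NeSSi2 code, the sketch below groups detection nodes by a declared preference label, the kind of preference-based grouping that could precede group-internal communication. Node IDs and preference labels are hypothetical.

```cpp
#include <cstdio>
#include <map>
#include <string>
#include <vector>

// Hedged sketch of preference-based group formation (illustrative only):
// each detection node declares a preference label and is grouped with
// like-minded nodes for group-internal signature exchange.
struct Node {
    int id;
    std::string preference;   // hypothetical preference label
};

int main() {
    std::vector<Node> nodes = {
        {1, "worm-signatures"}, {2, "port-scan"},
        {3, "worm-signatures"}, {4, "port-scan"}};

    // Group nodes sharing the same preference.
    std::map<std::string, std::vector<int>> groups;
    for (const auto& n : nodes) groups[n.preference].push_back(n.id);

    for (const auto& [pref, members] : groups) {
        std::printf("group '%s':", pref.c_str());
        for (int id : members) std::printf(" node%d", id);
        std::printf("\n");
    }
    return 0;
}
```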
Abstract:
A technologically innovative study was undertaken across two suburbs in Brisbane, Australia, to assess socioeconomic differences in women's use of the local environment for work, recreation, and physical activity. Mothers from high and low socioeconomic suburbs were instructed to continue with their usual daily routines and to use mobile phone applications (Facebook Places, Twitter, and Foursquare) on their phones to ‘check in’ at each location and destination they reached during a one-week period. These smartphone applications track travel logistics via built-in geographical information systems (GIS), which record participants’ points of latitude and longitude at each destination they reach. Location data were downloaded to Google Earth and Excel for analysis. Women provided additional qualitative data via text regarding the reasons and social contexts of their travel. We analysed 2183 ‘check-ins’ for 54 women in this pilot study to gain quantitative, qualitative, and spatial data on human-environment interactions. Data were gathered on distances travelled, mode of transport, reason for travel, and social context of travel, and categorised in terms of physical activity type: walking, running, sports, gym, cycling, or playing in the park. We found that the women in both suburbs had similar daily routines, with the exception of physical activity. We identified 15% of ‘check-ins’ in the lower socioeconomic group as qualifying for the physical activity category, compared with 23% in the higher socioeconomic group. This was explained by more daily walking for transport (1.7 km vs. 0.2 km) and less car travel each week (28.km vs. 48.4 km) in the higher socioeconomic suburb. We ascertained insights regarding the socio-cultural influences on these differences via the additional qualitative data. We discuss the benefits and limitations of using new technologies and Google Earth, with implications for informing future physical and social aspects of urban design and health promotion in socioeconomically diverse cities.
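The study's own spatial processing was done in Google Earth and Excel; purely for illustration, the sketch below shows how a distance-travelled figure could be derived from two successive check-in coordinates using the haversine great-circle formula. The coordinates are hypothetical Brisbane locations, not participant data.

```cpp
#include <cmath>
#include <cstdio>

// Illustrative sketch (not the study's pipeline): great-circle distance
// between two 'check-in' points given as latitude/longitude in degrees.
double haversine_km(double lat1, double lon1, double lat2, double lon2) {
    const double R = 6371.0;                    // mean Earth radius (km)
    const double rad = std::acos(-1.0) / 180.0; // degrees -> radians
    double dlat = (lat2 - lat1) * rad;
    double dlon = (lon2 - lon1) * rad;
    double a = std::sin(dlat / 2) * std::sin(dlat / 2) +
               std::cos(lat1 * rad) * std::cos(lat2 * rad) *
               std::sin(dlon / 2) * std::sin(dlon / 2);
    return 2.0 * R * std::asin(std::sqrt(a));
}

int main() {
    // Two hypothetical check-ins: a home address and a nearby park.
    double d = haversine_km(-27.4679, 153.0281, -27.4748, 153.0170);
    std::printf("distance between check-ins: %.2f km\n", d);
    return 0;
}
```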
Abstract:
This paper examines the accuracy of using the built-in camera of smartphones and free software as an economical way to quantify and analyse light exposure by producing luminance maps from High Dynamic Range (HDR) images. HDR images were captured with an Apple iPhone 4S across a wide range of luminances within an indoor and an outdoor scene. The HDR images were then processed using Photosphere software (Ward, 2010) to produce luminance maps, in which individual pixel values were compared with calibrated luminance meter readings. This comparison shows an average luminance error of approximately 8% between the HDR image pixel values and the luminance meter readings, when the range of luminances in the image is limited to approximately 1,500 cd/m2.
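A hedged sketch of the comparison described above: per-point relative error between luminance-map pixel values and calibrated meter readings, and their average. The sample values are invented for illustration; Photosphere's actual pixel extraction is not reproduced here.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    // Illustrative paired readings (cd/m2): luminance-map pixel values
    // versus calibrated luminance meter readings at the same scene points.
    std::vector<double> map_cd_m2   = {120.0, 480.0, 950.0, 1450.0};
    std::vector<double> meter_cd_m2 = {112.0, 455.0, 1010.0, 1390.0};

    double sum = 0.0;
    for (std::size_t i = 0; i < map_cd_m2.size(); ++i) {
        // Relative error of the map value against the meter reading.
        double rel_err = std::fabs(map_cd_m2[i] - meter_cd_m2[i]) / meter_cd_m2[i];
        std::printf("point %zu: %.1f%% error\n", i, rel_err * 100.0);
        sum += rel_err;
    }
    std::printf("average error: %.1f%%\n", 100.0 * sum / map_cd_m2.size());
    return 0;
}
```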
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than a two-step process of providing confidentiality for a message by encrypting it and, in a separate pass, providing integrity protection by generating a Message Authentication Code (MAC). AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that analyses these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, and analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal states of the cipher to generate a MAC. The third classification is based on whether the sequence used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state. This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model, namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
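The direct-injection accumulation model can be pictured with a toy register over GF(2). The sketch below is an illustrative assumption, not SSS, NLSv2 or SOBER-128: each message bit is XORed directly into the internal state, a fixed linear state update follows, and the final register contents stand in for the accumulated tag. The register size, feedback taps and zero initial state are invented; a real design would additionally involve key/IV loading and a nonlinear filter.

```cpp
#include <bitset>
#include <cstdio>
#include <vector>

// Toy matrix-style accumulation register over GF(2) with direct message
// injection (illustrative assumption only, not any of the named ciphers).
constexpr int N = 16;               // register size (invented for the example)
using State = std::bitset<N>;

// One accumulation step: inject the message bit, then apply a fixed
// linear (LFSR-style) state update with arbitrary feedback taps.
State step(State s, bool msg_bit) {
    s[0] = s[0] ^ msg_bit;                        // direct injection
    bool fb = s[15] ^ s[13] ^ s[12] ^ s[10];      // linear feedback (arbitrary taps)
    s <<= 1;
    s[0] = fb;
    return s;
}

int main() {
    State acc;                                    // zero initial state (assumed)
    std::vector<bool> message = {1, 0, 1, 1, 0, 0, 1, 0};
    for (bool b : message) acc = step(acc, b);
    std::printf("toy accumulated tag: 0x%04lx\n", acc.to_ulong());
    return 0;
}
```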
Abstract:
Teaching introductory programming has challenged educators through the years. Although Intelligent Tutoring Systems that teach programming have been developed to try to reduce the problem, none have been developed to teach web programming. This paper describes the design and evaluation of the PHP Intelligent Tutoring System (PHP ITS), which addresses this problem. The evaluation process showed that students who used the PHP ITS achieved a significant improvement in test scores.
Abstract:
This paper introduces a parallel implementation of an agent-based model applied to electricity distribution grids. A fine-grained shared-memory parallel implementation is presented, detailing the way the agents are grouped and executed on a multi-threaded machine, as well as the way the model is built (in a composable manner), which aids the parallelisation. Current results show a moderate speedup of 2.6, but improvements are expected by incorporating newer distributed or parallel ABM schedulers into this implementation. While domain-specific, this parallel algorithm can be applied to similarly structured ABMs (directed acyclic graphs).
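Purely as an illustration of the grouping idea, and not the paper's scheduler or grid model, the sketch below partitions a shared vector of placeholder agents into contiguous groups and steps each group on its own thread, joining at the end of every tick. The agent state update is a stand-in.

```cpp
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Placeholder agent with a trivial state update (stand-in for grid agents).
struct Agent {
    double load = 1.0;
    void step() { load *= 1.01; }
};

int main() {
    std::vector<Agent> agents(10000);
    unsigned n_threads = std::thread::hardware_concurrency();
    if (n_threads == 0) n_threads = 4;   // fallback if the count is unknown

    for (int tick = 0; tick < 100; ++tick) {                 // simulation ticks
        std::vector<std::thread> pool;
        std::size_t chunk = (agents.size() + n_threads - 1) / n_threads;
        for (unsigned t = 0; t < n_threads; ++t) {
            std::size_t lo = t * chunk;
            std::size_t hi = std::min(agents.size(), lo + chunk);
            // Each thread steps one contiguous, disjoint group of agents.
            pool.emplace_back([&agents, lo, hi] {
                for (std::size_t i = lo; i < hi; ++i) agents[i].step();
            });
        }
        for (auto& th : pool) th.join();                      // barrier between ticks
    }
    std::printf("agent 0 load after 100 ticks: %.3f\n", agents[0].load);
    return 0;
}
```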
Abstract:
We introduce Kamouflage: a new architecture for building theft-resistant password managers. An attacker who steals a laptop or cell phone with a Kamouflage-based password manager is forced to carry out a considerable amount of online work before obtaining any user credentials. We implemented our proposal as a replacement for the built-in Firefox password manager, and provide performance measurements and the results from experiments with large real-world password sets to evaluate the feasibility and effectiveness of our approach. Kamouflage is well suited to become a standard architecture for password managers on mobile devices.
Abstract:
Uanda House is of historical importance to Queensland both in terms of its architectural design and its social history. Uanda is a low-set, single-storey house built in 1928, located in the inner-city Brisbane suburb of Wilston. Architecturally, the house has a number of features that distinguish it from the surrounding bungalow-influenced inter-war houses; it has been described as a Queensland-style house with neo-Georgian influences. Historically, it is associated with the entry of women into the profession of architecture in Queensland. Uanda is the only remaining intact work of architect/draftswoman Nellie McCredie and one of very few examples of works by pioneering women architects in Queensland. The house was entered into the Queensland Heritage Register in 2000, after an appeal against Brisbane City Council’s refusal of an application to demolish the house was disputed in the Queensland Planning and Environment Court in 1998/1999. In the court’s report, Judge Robin QC, DCJ, stated that, “The importance of preserving women's history and heritage, often previously marginalised or lost, is now accepted at government level, recognising that role models are vital for bringing new generations of women into the professions and public life.” While acknowledging women’s contribution to the profession of architecture is an important endeavour, it also has the potential to isolate women architects as separate from a mainstream history of architecture. As Julie Willis writes, it can imply an atypical, feminine style of architecture. What are the impacts or potential implications of recognising heritage buildings designed by women architects? The judge also highlights the absence of a recorded history of unique Brisbane houses and questions the authority of the heritage register. This research looks at these points of difference through a case study of Uanda House. The paper investigates the processes of adding the house to the heritage register, the court case, and existing research on Nellie McCredie and Uanda House.
Abstract:
This paper presents a modulation and controller design method for paralleled Z-source inverter systems applicable to alternative energy sources such as solar cells, fuel cells, or variable-speed wind turbines with front-end diode rectifiers. A modulation scheme is designed based on a simple shoot-through principle with interleaved carriers to give enhanced ripple reduction in the system. Subsequently, a control method is proposed to equalize the amount of power injected by the inverters in the grid-connected mode and also to provide a reliable supply to sensitive loads onsite in the islanding mode. The modulation and control methods are proposed to have modular independence so that redundancy, maintainability, and improved reliability of supply can be achieved. The performance of the proposed paralleled Z-source inverter configuration is validated with simulations carried out using Matlab/Simulink/Powersim. Moreover, a prototype was built in the laboratory to obtain experimental verification.
Abstract:
The present study explores reproducing the closest geometry of a high pressure ratio, single-stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multipurpose Small Power Unit. The commercial software ANSYS-Vista RTD, along with its built-in module BladeGen, is used to conduct a meanline design and create the 3D geometry of one flow passage. After carefully examining the proposed design against the geometrical and experimental data, ANSYS-TurboGrid is applied to generate the computational mesh. CFD simulations are performed with ANSYS-CFX, in which the three-dimensional Reynolds-averaged Navier-Stokes equations are solved subject to appropriate boundary conditions. Results are compared with numerical and experimental data published in the literature in order to reproduce the exact geometry of the existing turbine and validate the numerical results against the experimental ones.