324 results for Input datas
Abstract:
This tutorial is designed to help new users become familiar with using the PicoBlaze microcontroller with the Spartan-3E board. The tutorial gives a brief introduction to the PicoBlaze microcontroller, and then steps through the following:
- Writing a small PicoBlaze assembly language (.psm) file, and stepping through the process of assembling the .psm file using KCPSM3;
- Writing a top-level VHDL module to connect the PicoBlaze microcontroller (KCPSM3 component) and the program ROM, and to connect the required input and output ports;
- Connecting the top-level module inputs and outputs to the switches, buttons and LEDs on the Spartan-3E board;
- Downloading the program to the Spartan-3E board using the Project Navigator software.
Abstract:
Background/objectives This study estimates the economic outcomes of a nutrition intervention for at-risk patients compared with standard care in the prevention of pressure ulcer. Subjects/methods Statistical models were developed to predict ‘cases of pressure ulcer avoided’, ‘number of bed days gained’ and ‘change to economic costs’ in public hospitals in 2002–2003 in Queensland, Australia. Input parameters were specified and appropriate probability distributions fitted for: number of discharges per annum; incidence rate for pressure ulcer; independent effect of pressure ulcer on length of stay; cost of a bed day; change in risk of developing a pressure ulcer associated with nutrition support; and annual cost of the provision of a nutrition support intervention for at-risk patients. A total of 1000 random re-samples were made and the results expressed as output probability distributions. Results The model predicts a mean 2896 (s.d. 632) cases of pressure ulcer avoided, 12 397 (s.d. 4491) bed days released and a corresponding mean economic cost saving of euros 2 869 526 (s.d. 2 078 715) with a nutrition support intervention, compared with standard care. Conclusion Nutrition intervention is predicted to be a cost-effective approach in the prevention of pressure ulcer in at-risk patients.
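To illustrate the resampling approach described above, here is a minimal Python sketch with the same Monte Carlo structure: sample each input parameter from a probability distribution, combine the samples into the three outputs, and summarise the output distributions. The specific distributions and parameter values below are illustrative assumptions only; they are not the distributions fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1000  # the study reports 1000 random re-samples

# Illustrative input distributions -- the actual fitted distributions
# and parameter values are not given in the abstract.
discharges = rng.normal(340_000, 10_000, n_samples)       # discharges per annum
incidence = rng.beta(2, 48, n_samples)                     # pressure ulcer incidence rate
extra_los = rng.gamma(2.0, 2.0, n_samples)                 # extra bed days per case
bed_day_cost = rng.normal(600, 50, n_samples)              # cost of a bed day
risk_reduction = rng.beta(5, 15, n_samples)                # risk reduction with nutrition support
intervention_cost = rng.normal(5_000_000, 500_000, n_samples)  # annual cost of intervention

cases_avoided = discharges * incidence * risk_reduction
bed_days_gained = cases_avoided * extra_los
cost_saving = bed_days_gained * bed_day_cost - intervention_cost

# Express the results as output distributions (mean and s.d.), as in the study.
for name, values in [("cases avoided", cases_avoided),
                     ("bed days gained", bed_days_gained),
                     ("net cost saving", cost_saving)]:
    print(f"{name}: mean={values.mean():,.0f}, sd={values.std():,.0f}")
```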
Abstract:
Emergence is discussed in the context of a practice-based study of interactive art and a new taxonomy of emergence is proposed. The interactive art system ‘plus minus now’ is described and its relationship to emergence is discussed. ‘Plus minus now’ uses a novel method for instantiating emergent shapes. A preliminary investigation of this art system has been conducted and reveals the creation of temporal compositions by a participant. These temporal compositions and the emergent shapes are described using the taxonomy of emergence. Characteristics of emergent interactions and the implications of designing for them are discussed.
Abstract:
This thesis is concerned with creating and evaluating interactive art systems that facilitate emergent participant experiences. For the purposes of this research, interactive art is the computer based arts involving physical participation from the audience, while emergence is when a new form or concept appears that was not directly implied by the context from which it arose. This emergent ‘whole’ is more than a simple sum of its parts. The research aims to develop understanding of the nature of emergent experiences that might arise during participant interaction with interactive art systems. It also aims to understand the design issues surrounding the creation of these systems. The approach used is Practice-based, integrating practice, evaluation and theoretical research. Practice used methods from Reflection-in-action and Iterative design to create two interactive art systems: Glass Pond and +-now. Creation of +-now resulted in a novel method for instantiating emergent shapes. Both art works were also evaluated in exploratory studies. In addition, a main study with 30 participants was conducted on participant interaction with +-now. These sessions were video recorded and participants were interviewed about their experience. Recordings were transcribed and analysed using Grounded theory methods. Emergent participant experiences were identified and classified using a taxonomy of emergence in interactive art. This taxonomy draws on theoretical research. The outcomes of this Practice-based research are summarised as follows. Two interactive art systems, where the second work clearly facilitates emergent interaction, were created. Their creation involved the development of a novel method for instantiating emergent shapes and it informed aesthetic and design issues surrounding interactive art systems for emergence. A taxonomy of emergence in interactive art was also created. Other outcomes are the evaluation findings about participant experiences, including different types of emergence experienced and the coding schemes produced during data analysis.
Abstract:
This paper presents a model for generating a MAC tag by injecting the input message directly into the internal state of a nonlinear filter generator. This model generalises a similar model for unkeyed hash functions proposed by Nakano et al. We develop a matrix representation for the accumulation phase of our model and use it to analyse the security of the model against man-in-the-middle forgery attacks based on collisions in the final register contents. The results of this analysis show that some conclusions of Nakano et al. regarding the security of their model are incorrect. We also use our results to comment on several recent MAC proposals which can be considered as instances of our model, and to specify choices of options within the model which should prevent the type of forgery discussed here. In particular, suitable initialisation of the register and active use of a secure nonlinear filter will prevent an attacker from finding a collision in the final register contents which could result in a forged MAC.
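The general construction being analysed can be pictured with a toy sketch: the key loads the register, the message bits are injected into the register state during an accumulation phase, and a nonlinear filter over the final register contents produces the tag. Everything below (register length, feedback taps, filter function, key loading) is an arbitrary placeholder for illustration; it is neither a recommended instantiation of the model nor any of the MAC proposals discussed.

```python
# Toy sketch of a MAC built around a nonlinear filter generator: the message is
# injected directly into the internal register state during an accumulation
# phase, then filtered output bits form the tag. All parameters are placeholders.

FEEDBACK_TAPS = [0, 2, 3, 5]   # placeholder LFSR feedback taps (bit positions)
REG_LEN = 16

def step(state, feedback_taps=FEEDBACK_TAPS):
    """One LFSR step: compute the feedback bit and shift the register."""
    fb = 0
    for t in feedback_taps:
        fb ^= state[t]
    return state[1:] + [fb]

def nonlinear_filter(state):
    """Placeholder nonlinear filter over a few register stages."""
    return state[1] ^ (state[4] & state[7]) ^ (state[10] & state[13])

def mac(key_bits, message_bits, tag_len=8):
    state = list(key_bits[:REG_LEN])       # initialise the register from the key
    for m in message_bits:                 # accumulation phase:
        state = step(state)                #   clock the register and
        state[0] ^= m                      #   inject the message bit into the state
    tag = []
    for _ in range(tag_len):               # finalisation: filter the register contents
        state = step(state)
        tag.append(nonlinear_filter(state))
    return tag

key = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
print(mac(key, [1, 0, 1, 1, 0, 1]))
```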
Abstract:
This paper proposes a technique that supports process participants in making risk-informed decisions, with the aim of reducing process risks. Risk reduction involves decreasing the likelihood that a process fault will occur and the severity of its consequences. Given a process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we prompt the participant with the expected risk that a given fault will occur given that particular input. These risks are predicted by traversing decision trees generated from the logs of past process executions, taking into account process data, involved resources, task durations and contextual information such as task frequencies. The approach has been implemented in the YAWL system and its effectiveness evaluated. The results show that the process instances executed in the tests complete with substantially fewer faults and with lower fault severities when the recommendations provided by our technique are taken into account.
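As a rough illustration of the prediction step, the sketch below trains a decision tree on a toy log of past executions and returns a fault probability for a candidate participant input. The column names, log format and use of scikit-learn are illustrative assumptions; the actual approach is implemented in the YAWL system and also accounts for fault severities, resources and contextual information.

```python
# Minimal sketch: predict the likelihood of a fault from attributes of past
# process executions using a decision tree. Column names and the log format
# are illustrative, not the YAWL implementation described above.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

log = pd.DataFrame({
    "next_task":     [0, 1, 0, 1, 1, 0, 0, 1],           # encoded choice of next task
    "amount":        [100, 5000, 200, 7000, 6500, 150, 300, 8000],
    "task_duration": [5, 40, 7, 55, 50, 6, 9, 60],
    "fault":         [0, 1, 0, 1, 1, 0, 0, 1],            # did the instance end with a fault?
})

features = ["next_task", "amount", "task_duration"]
tree = DecisionTreeClassifier(max_depth=3).fit(log[features], log["fault"])

# When a participant is about to provide input, estimate the fault risk
# associated with that particular input and prompt them with it.
candidate_input = pd.DataFrame([{"next_task": 1, "amount": 7200, "task_duration": 45}])
print("predicted fault probability:", tree.predict_proba(candidate_input)[0][1])
```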
Abstract:
This paper presents a novel technique for segmenting an audio stream into homogeneous regions according to speaker identities, background noise, music, environmental and channel conditions. Audio segmentation is useful in audio diarization systems, which aim to annotate an input audio stream with information that attributes temporal regions of the audio to their specific sources. The segmentation method introduced in this paper uses the Generalized Likelihood Ratio (GLR), computed between two adjacent sliding windows over preprocessed speech. This approach is inspired by the popular segmentation method from the pioneering work of Chen and Gopalakrishnan, which uses the Bayesian Information Criterion (BIC) with an expanding search window. This paper aims to identify and address the shortcomings associated with such an approach. The results obtained by the proposed segmentation strategy are evaluated on the 2002 Rich Transcription (RT-02) Evaluation dataset, and a miss rate of 19.47% and a false alarm rate of 16.94% are achieved at the optimal threshold.
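A minimal sketch of the GLR statistic between two adjacent windows of feature vectors, each modelled by a single full-covariance Gaussian, is given below. The window sizes, feature dimensionality and synthetic data are illustrative assumptions; the paper's preprocessing and threshold selection are not reproduced.

```python
# Minimal sketch of the Generalized Likelihood Ratio (GLR) distance between two
# adjacent windows of feature vectors (e.g. MFCC frames), each modelled by a
# single full-covariance Gaussian.
import numpy as np

def glr_distance(x, y):
    """GLR change statistic between feature windows x and y (frames x dims)."""
    z = np.vstack([x, y])
    def log_det_cov(w):
        return np.linalg.slogdet(np.cov(w, rowvar=False))[1]
    # Higher values mean two separate Gaussians fit the data much better than
    # a single one, i.e. a likely change point between the windows.
    return (len(z) * log_det_cov(z)
            - len(x) * log_det_cov(x)
            - len(y) * log_det_cov(y)) / 2.0

rng = np.random.default_rng(1)
window_a = rng.normal(0.0, 1.0, size=(100, 12))   # frames from one source
window_b = rng.normal(2.0, 1.5, size=(100, 12))   # frames from a different source
print("GLR within source:  ", glr_distance(window_a[:50], window_a[50:]))
print("GLR across sources: ", glr_distance(window_a, window_b))
```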
The backfilled GEI: a cross-capture modality gait feature for frontal and side-view gait recognition
Abstract:
In this paper, we propose a novel direction for gait recognition research: a new capture-modality-independent, appearance-based feature which we call the Back-filled Gait Energy Image (BGEI). It can be constructed from frontal depth images as well as from the more commonly used side-view silhouettes, allowing the feature to be applied across these two differing capture systems using the same enrolled database. To evaluate this new feature, a frontally captured depth-based gait dataset was created containing 37 unique subjects, a subset of which also contained sequences captured from the side. The results demonstrate that the BGEI can effectively be used to identify subjects through their gait across these two differing input devices, achieving a rank-1 match rate of 100% in our experiments. We also compare the BGEI against the GEI and GEV in their respective domains, using the CASIA dataset and our depth dataset, and show that it compares favourably against them. The experiments were performed using a sparse-representation-based classifier with a locally discriminating input feature space, which shows a significant improvement in performance over other classifiers used in the gait recognition literature, achieving state-of-the-art results with the GEI on the CASIA dataset.
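For context, the conventional GEI that the BGEI builds on is simply the pixel-wise average of size-normalised, aligned binary silhouettes over a gait cycle. The sketch below shows that baseline computation only; the back-filling step that yields the BGEI from frontal depth images is specific to the paper and is not reproduced here.

```python
# Minimal sketch of the standard Gait Energy Image (GEI): the pixel-wise average
# of size-normalised, aligned binary silhouettes over a gait cycle. (The BGEI
# back-filling step from frontal depth images is not reproduced here.)
import numpy as np

def gait_energy_image(silhouettes):
    """silhouettes: array of shape (frames, height, width) with values in {0, 1}."""
    return np.mean(silhouettes, axis=0)

rng = np.random.default_rng(2)
cycle = (rng.random((30, 128, 88)) > 0.5).astype(float)  # placeholder silhouettes
gei = gait_energy_image(cycle)
print(gei.shape, gei.min(), gei.max())
```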
Abstract:
In 1991, McNabb introduced the concept of mean action time (MAT) as a finite measure of the time required for a diffusive process to effectively reach steady state. Although this concept was initially adopted by others within the Australian and New Zealand applied mathematics community, it appears to have had little use outside this region until very recently, when in 2010 Berezhkovskii and coworkers rediscovered the concept of MAT in their study of morphogen gradient formation. All previous work in this area has been limited to studying single-species differential equations, such as the linear advection-diffusion-reaction equation. Here we generalise the concept of MAT by showing how the theory can be applied to coupled linear processes. We begin by studying coupled ordinary differential equations and extend our approach to coupled partial differential equations. Our new results have broad applications including the analysis of models describing coupled chemical decay and cell differentiation processes, amongst others.
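For reference, the usual single-species definition of MAT can be sketched as follows; the notation is illustrative, and this is the standard formulation that the paper generalises to coupled systems.

```latex
% Mean action time for a single species c(x,t) relaxing from c(x,0) to c_\infty(x).
% F(x,t) is treated as a cumulative distribution function in t.
\[
F(x,t) \;=\; 1 \;-\; \frac{c(x,t) - c_\infty(x)}{c(x,0) - c_\infty(x)},
\qquad
T(x) \;=\; \int_0^\infty t\,\frac{\partial F}{\partial t}\,\mathrm{d}t
      \;=\; \int_0^\infty \bigl(1 - F(x,t)\bigr)\,\mathrm{d}t .
\]
```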
Abstract:
Flood-related scientific and community-based data are rarely systematically collected and analysed in the Philippines. Over the last few decades, the Pagsangaan River Basin, Leyte, has experienced several flood events; however, documentation describing flood characteristics such as the extent, duration or height of these floods is close to non-existent. To address this issue, computerized flood modelling was used to reproduce past events for which data were available for at least partial calibration and validation. The model was also used to provide scenario-based predictions based on A1B climate change assumptions for the area. The most important input for flood modelling is a Digital Elevation Model (DEM) of the river basin. No accurate topographic maps or Light Detection And Ranging (LIDAR)-generated data are available for the Pagsangaan River. Therefore, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Map (GDEM), Version 1, was chosen as the DEM. Although its horizontal spatial resolution of 30 m is desirable, it contains substantial vertical errors. These were identified, different correction methods were tested and the resulting DEM was used for flood modelling. The above-mentioned data were combined with cross-sections at various strategic locations of the river network, meteorological records, river water levels and current velocities to develop the 1D-2D flood model. SOBEK was used as the modelling software to create different rainfall scenarios, including historic flooding events. Due to the lack of scientific data for verifying the model quality, interviews with local stakeholders served as the gauge for judging the quality of the generated flood maps. According to the interviewees, the model reflects reality more accurately than previously available flood maps. The resulting flood maps are now used by the operations centre of a local flood early warning system for warnings and evacuation alerts. Furthermore, these maps can serve as a basis for identifying flood hazard areas for spatial land use planning purposes.
Implementation Guide for Surveillance of Staphylococcus aureus Bacteraemia -- [Consultation Edition]
Abstract:
The Implementation Guide for the Hospital Surveillance of SAB has been produced by the Healthcare Associated Infection (HAI) Technical Working Group of the Australian Commission on Safety and Quality in Health Care (ACSQHC), and endorsed by the HAI Advisory Group. The Technical Working Group is made up of representatives invited from surveillance units and the ACSQHC, who have had input into the preparation of this Guide. The Guide has been developed to ensure consistency in the reporting of SAB across public and private hospitals, to enable accurate national reporting and benchmarking. It is intended to be used by Australian hospitals and organisations to support the implementation of healthcare-associated Staphylococcus aureus bacteraemia (SAB) surveillance using the endorsed case definition in the box below and the further detail in the Data Set Specification.
Abstract:
The Implementation Guide for hospital surveillance of Clostridium difficile infection (CDI) has been produced by the Healthcare Associated Infection (HAI) Technical Working Group of the Australian Commission on Safety and Quality in Health Care (ACSQHC), and endorsed by the HAI Advisory Group. State jurisdictions and the ACSQHC have representatives on the Technical Working Group, and have had input into this document. (See acknowledgements on inside front cover)...
Abstract:
The implementation guide for the surveillance of CLABSI in intensive care units (ICU) was produced by the Healthcare Associated Infection (HAI) Technical Working Group of the Australian Commission on Safety and Quality in Health Care (ACSQHC), and endorsed by the ACSQHC HAI Advisory Committee. State surveillance units, the ACSQHC and the Australian and New Zealand Intensive Care Society (ANZICS) have representatives on the Technical Working Group, and have provided input into this document.
Abstract:
Providing help for research degree writing within a formal structure is difficult because research students come into their degree with widely varying needs and levels of experience. Providing writing assistance within a less structured learning context is an approach which has been trialled in higher education with promising results (Boud, Cohen & Sampson, 2001; Stracke, 2010; Devendish et al., 2009). While semi-structured approaches have been the subject of study, little attention has been paid to the processes of informal learning which exist within doctoral education. In this paper we explore a 'writing movement' which has started to be taken up at various locations in Australia through the auspices of social media (Twitter and Facebook). 'Shut up and Write' is a concept first used in the cafe scene in San Francisco, where writers converge at a specific time and place and write together, without showing each other the outcomes, temporarily transforming writing from a solitary practice to a social one. In this paper we compare the experience of facilitating 'Shut up and Write' sessions at two locations: RMIT University and Queensland University of Technology. The authors describe the set-up and functioning of the different groups and report on feedback from regular participants, both physical and virtual. We suggest that informal learning practices can be exploited to assist research students to orientate themselves to the university environment and share vital technical skills, with very minimal input from academic staff. This experience suggests there is untapped potential within these kinds of activities to promote learning within the research degree experience which is sustainable and builds a stronger sense of community.
Abstract:
The feasibility of real-time calculation of parameters for an internal combustion engine via reconfigurable hardware implementation is investigated as an alternative to software computation. A detailed in-hardware field programmable gate array (FPGA)-based design is developed and evaluated using input crank angle and in-cylinder pressure data from fully instrumented diesel engines in the QUT Biofuel Engine Research Facility (BERF). Results indicate the feasibility of employing a hardware-based implementation for real-time processing at speeds comparable to the data sampling rate currently used in the facility, with an acceptably low level of discrepancy between hardware- and software-based calculation of key engine parameters.
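As an indication of the kind of per-cycle parameter such a design computes from crank angle and in-cylinder pressure, the sketch below calculates indicated mean effective pressure (IMEP) in software. The choice of IMEP, the engine geometry and the synthetic pressure trace are all illustrative assumptions; the abstract does not specify which engine parameters are implemented in hardware.

```python
# Minimal sketch: compute indicated mean effective pressure (IMEP) from sampled
# crank angle and in-cylinder pressure, as one example of a "key engine
# parameter". Geometry values and the pressure trace are placeholders.
import numpy as np

BORE, STROKE, CONROD, CR = 0.100, 0.110, 0.180, 17.0  # placeholder geometry [m], compression ratio

def cylinder_volume(theta_rad):
    """Slider-crank cylinder volume as a function of crank angle (TDC at 0 rad)."""
    a = STROKE / 2.0                        # crank radius
    area = np.pi * BORE**2 / 4.0
    v_disp = area * STROKE
    v_clear = v_disp / (CR - 1.0)
    piston_pos = a * np.cos(theta_rad) + np.sqrt(CONROD**2 - (a * np.sin(theta_rad))**2)
    return v_clear + area * (CONROD + a - piston_pos)

def imep(theta_deg, pressure_pa):
    """IMEP = closed-cycle integral of p dV divided by the displaced volume."""
    volume = cylinder_volume(np.radians(theta_deg))
    v_disp = np.pi * BORE**2 / 4.0 * STROKE
    return np.trapz(pressure_pa, volume) / v_disp

# Placeholder pressure trace over one 720-degree cycle; real data would come from
# the instrumented engine's crank-angle-resolved pressure samples.
theta = np.linspace(0, 720, 1440, endpoint=False)
pressure = 2e5 + 5e6 * np.exp(-((theta - 365) / 25.0) ** 2)
print(f"IMEP ~ {imep(theta, pressure) / 1e5:.2f} bar")
```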