Abstract:
In this work we used a 3D quantitative ultrasound computed tomography (CT) imaging system to characterise polymer gel dosimeters. The system comprised two identical 5 MHz, 128-element phased-array ultrasound transducers, coaxially aligned and submerged in water as a coupling agent. Rotational and translational movements of the gel dosimeter sample between the transducers were performed using a robotic arm. Ultrasound signals were generated and received using an Olympus Omniscan unit. The dose sensitivities of the ultrasonic attenuation and time-of-flight parameters were assessed using this system.
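The abstract leaves the signal processing unspecified, but the two reported parameters can be estimated by comparing a through-sample pulse with a water-only reference pulse: time of flight from the peak of their cross-correlation, and attenuation from the ratio of their peak amplitudes. A minimal NumPy sketch, in which the sampling rate, pulse shapes and function names are all illustrative assumptions:

```python
import numpy as np

def tof_and_attenuation(ref, sig, fs):
    """Estimate the time-of-flight shift (s) and attenuation (dB) of a
    through-sample pulse `sig` relative to a water-path reference `ref`,
    both sampled at `fs` Hz. Setup and names are illustrative only."""
    # Time-of-flight difference from the peak of the cross-correlation.
    xcorr = np.correlate(sig, ref, mode="full")
    lag = np.argmax(xcorr) - (len(ref) - 1)
    tof_shift = lag / fs
    # Attenuation from the ratio of peak amplitudes, in dB.
    atten_db = -20.0 * np.log10(np.max(np.abs(sig)) / np.max(np.abs(ref)))
    return tof_shift, atten_db

# Example with synthetic pulses: the "sample" pulse is delayed and damped.
fs = 100e6                                # assumed 100 MHz sampling rate
t = np.arange(0, 4e-6, 1 / fs)
ref = np.sin(2 * np.pi * 5e6 * t) * np.exp(-((t - 2e-6) / 0.4e-6) ** 2)
sig = 0.5 * np.roll(ref, 50)              # half amplitude, 0.5 us delay
print(tof_and_attenuation(ref, sig, fs))  # ~ (5e-07 s, ~6.02 dB)
```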
Abstract:
Ultrasound has previously been examined as an alternative readout method for irradiated polymer gel dosimeters, with authors reporting a varying dose response in ultrasound transmission measurements. In the current work we extend these studies by measuring the broadband ultrasound attenuation (BUA) response of irradiated PAGAT gel dosimeters using a novel ultrasound computed tomography system.
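BUA is conventionally quoted as the slope of a linear fit to the frequency-dependent attenuation over a chosen band. A minimal sketch of that fit, assuming equal-length reference and through-sample traces; the band limits and variable names are assumptions, not values from the paper:

```python
import numpy as np

def bua(ref, sig, fs, band=(2e6, 8e6)):
    """Broadband ultrasound attenuation (dB/MHz): slope of a linear fit to
    the attenuation spectrum over `band`. Band limits and names are
    illustrative, not taken from the paper."""
    freqs = np.fft.rfftfreq(len(ref), 1 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    ref_mag = np.abs(np.fft.rfft(ref))[mask]
    sig_mag = np.abs(np.fft.rfft(sig))[mask]
    # Attenuation spectrum: log ratio of reference to through-sample magnitude.
    atten_db = 20.0 * np.log10(ref_mag / sig_mag)
    # Slope of attenuation vs frequency (in MHz) is the BUA figure.
    slope, _intercept = np.polyfit(freqs[mask] / 1e6, atten_db, 1)
    return slope
```

Restricting the fit to the transducer's usable band avoids the near-zero spectral magnitudes outside it, where the log ratio is ill-conditioned.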
Abstract:
In this study x-ray CT has been used to produce a 3D image of an irradiated PAGAT gel sample, with noise reduction achieved using the ‘zero-scan’ method. The gel was repeatedly CT scanned, and a linear fit to the varying Hounsfield unit of each pixel in the 3D volume was evaluated across the repeated scans, allowing a zero-scan extrapolation of the image to be obtained. To minimise heating of the CT scanner’s x-ray tube, this study used a large slice thickness (1 cm) to provide image slices across the irradiated region of the gel, and a relatively small number of CT scans (63) to extrapolate the zero-scan image. The resulting set of transverse images shows reduced noise compared to images from the initial CT scan of the gel, without being degraded by the additional radiation dose delivered to the gel during the repeated scanning. The full 3D image of the gel has a low spatial resolution in the longitudinal direction, due to the selected scan parameters. Nonetheless, important features of the dose distribution are apparent in the 3D x-ray CT scan of the gel. The results of this study demonstrate that the zero-scan extrapolation method can be applied to the reconstruction of multiple x-ray CT slices, to provide useful 2D and 3D images of irradiated dosimetry gels.
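The per-voxel extrapolation described above is compact to express: fit each voxel's Hounsfield value against scan number and keep the intercept, i.e. the value at 'scan zero', before any imaging dose has accumulated. A minimal NumPy sketch, assuming (this layout is not specified in the abstract) that the repeated scans are stacked along the first axis:

```python
import numpy as np

def zero_scan(scans):
    """Zero-scan extrapolation: per-voxel linear fit of Hounsfield value
    versus scan index, returning the intercept (the 'scan 0' image).
    `scans` has shape (n_scans, z, y, x); this layout is an assumption."""
    n = scans.shape[0]
    idx = np.arange(1, n + 1, dtype=float)  # scan number 1..n
    flat = scans.reshape(n, -1)             # (n_scans, n_voxels)
    # Vectorised least squares: fit HU = slope*idx + intercept per voxel.
    coeffs = np.polyfit(idx, flat, 1)       # shape (2, n_voxels)
    intercepts = coeffs[1]
    return intercepts.reshape(scans.shape[1:])
```

Because np.polyfit accepts a 2D array of dependent variables sharing the same x-coordinates, every voxel is fitted in a single vectorised call.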
Abstract:
This paper investigates engaging experienced birders, as volunteer citizen scientists, to analyze large recorded audio datasets gathered through environmental acoustic monitoring. Although audio data is straightforward to gather, automated analysis remains a challenging task; the existing expertise, local knowledge and motivation of the birder community can complement computational approaches and provide distinct benefits. We explored both the culture and practice of birders, and paradigms for interacting with recorded audio data. A variety of candidate design elements were tested with birders. This study contributes an understanding of how virtual interactions and practices can be developed to complement the existing practices of experienced birders in the physical world. In so doing, this study contributes a new approach to engagement in e-science. Whereas most citizen science projects task lay participants with discrete real-world or artificial activities, sometimes using extrinsic motivators, this approach builds on existing intrinsically satisfying practices.
Abstract:
Remote monitoring for heart failure has been evaluated in numerous systematic reviews. The aim of this meta-review was to appraise their quality and synthesise their results. We electronically searched online databases, performed a forward citation search and hand-searched bibliographies. Systematic reviews of remote monitoring interventions used for the surveillance of heart failure patients were included. Seven (41%) systematic reviews pooled results for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Five (29%) focused on telemonitoring. Four (24%) included both non-invasive and invasive technologies. According to AMSTAR criteria, ten (58%) systematic reviews were of poor methodological quality. In high quality reviews, the relative risk of mortality in patients who received remote monitoring ranged from 0.53 (95% CI 0.29–0.96) to 0.88 (95% CI 0.76–1.01). High quality reviews also reported that remote monitoring reduced the relative risk of all-cause hospitalizations (RR 0.52, 95% CI 0.28–0.96, to RR 0.96, 95% CI 0.90–1.03) and heart failure-related hospitalizations (RR 0.72, 95% CI 0.64–0.81, to RR 0.79, 95% CI 0.67–0.94) and, as a consequence, healthcare costs. As the high quality reviews reported that remote monitoring reduced hospitalizations, mortality and healthcare costs, research efforts should now be directed towards optimising these interventions in preparation for more widespread implementation.
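For context, pooled relative risks like those quoted above are typically obtained by inverse-variance meta-analysis on the log scale, with each study's standard error recovered from its 95% CI. A minimal fixed-effect sketch using hypothetical study data, not the reviews' actual inputs:

```python
import numpy as np

def pool_rr(rrs, ci_los, ci_his):
    """Fixed-effect inverse-variance pooling of relative risks.
    Each study contributes log(RR) weighted by 1/SE^2, with SE recovered
    from its 95% CI. All input values here are hypothetical."""
    log_rr = np.log(rrs)
    se = (np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# Three hypothetical trials of remote monitoring vs usual care:
print(pool_rr(np.array([0.70, 0.85, 0.92]),
              np.array([0.50, 0.65, 0.70]),
              np.array([0.98, 1.11, 1.21])))
```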
Abstract:
Background/aims: Remote monitoring for heart failure has been evaluated not only in a large number of randomised controlled trials but also in many systematic reviews and meta-analyses. The aim of this meta-review was to identify, appraise and synthesise existing systematic reviews that have evaluated the effects of remote monitoring in heart failure. Methods: Using a Cochrane methodology, we electronically searched all relevant online databases and search engines, performed a forward citation search and hand-searched bibliographies. Only fully published systematic reviews of invasive and/or non-invasive remote monitoring interventions were included. Two reviewers independently extracted data. Results: Sixty-five publications were identified from 3333 citations. Seventeen fulfilled the inclusion and exclusion criteria. Quality varied, with A Measurement Tool to Assess Systematic Reviews (AMSTAR) scores ranging from 2 to 11 (mean 5.88). Seven reviews (41%) pooled results from individual studies for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Four (24%) focused specifically on telemonitoring. Four (24%) included studies investigating both non-invasive and invasive technologies. The population characteristics of the included studies were not reported consistently. Mortality and hospitalisations were the most frequently reported outcomes (12 reviews; 70%). Only five reviews (29%) reported healthcare costs and compliance. A high degree of heterogeneity was reported in many of the meta-analyses. Conclusions: These results should be considered in the context of two negative RCTs of remote monitoring for heart failure published since the meta-analyses (TIM-HF and Tele-HF). However, high quality reviews demonstrated reduced mortality, improved quality of life, and reductions in hospitalisations and healthcare costs.
Abstract:
The boundaries between ‘the digital’ and our everyday physical world are dissolving as we develop more physical ways of interacting with computing. This forum presents some of the topics discussed in the colorful multidisciplinary field of tangible and embodied interaction.
Abstract:
“The Cube” is a unique facility that combines 48 large multi-touch screens and very large-scale projection surfaces to form one of the world’s largest interactive learning and engagement spaces. The Cube facility is part of the Queensland University of Technology’s (QUT) newly established Science and Engineering Centre, designed to showcase QUT’s teaching and research capabilities in the STEM (Science, Technology, Engineering, and Mathematics) disciplines. In this application paper we describe the Cube, its technical capabilities, design rationale, and practical day-to-day operations supporting up to 70,000 visitors per week. Essential to the Cube’s operation are five interactive applications designed and developed in tandem with the Cube’s technical infrastructure. Each of the Cube’s launch applications was designed and delivered by an independent team, while the overall vision of the Cube was shepherded by a small executive team. The diversity of design, implementation and integration approaches pursued by these five teams provides some insight into the challenges, and opportunities, presented when working with large distributed interaction technologies. We describe each of these applications in order to discuss the different challenges and user needs they address, the types of interactions they support, and how they utilise the capabilities of the Cube facility.
Abstract:
Road surface skid resistance has been shown to have a strong relationship to road crash risk; however, the current method of using investigatory levels to identify crash-prone roads is problematic, as it may fail to identify risky roads outside the norm. The proposed method uses data mining to analyse a complex and formerly impenetrable volume of road and crash data, rapidly identifying roads with an elevated crash rate, potentially due to a skid resistance deficit, for investigation. A hypothetical skid resistance/crash risk curve is developed for each road segment, driven by the model deployed in a novel regression tree extrapolation method. This approach potentially solves the problem of missing skid resistance values that arises during network-wide crash analysis, and allows risk assessment of the major proportion of roads for which no skid resistance values exist.
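The abstract does not detail the extrapolation, but its core move, training a regression tree on road segments with measured skid resistance and predicting the missing values for the rest of the network, can be sketched with scikit-learn; the features, coefficients and data below are entirely hypothetical:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Hypothetical road-segment features: AADT, speed limit, curvature.
X = rng.uniform([1e3, 40, 0.0], [5e4, 110, 0.2], size=(500, 3))
skid = 60 - 0.0004 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 2, 500)

measured = rng.random(500) < 0.6          # 60% of segments have readings
tree = DecisionTreeRegressor(max_depth=5, min_samples_leaf=20)
tree.fit(X[measured], skid[measured])

# Predict skid resistance for the unmeasured segments, enabling
# network-wide risk assessment despite the missing values.
skid_est = tree.predict(X[~measured])
print(skid_est[:5])
```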
Abstract:
Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard which provides an additional layer of abstraction, enabling end-users to easily perform inferences over Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on their cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool was developed in C++ using the Qt and SMILE libraries.
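The authors' tool is C++/Qt/SMILE, but the kind of query such a dashboard exposes, setting evidence on cause nodes and reading posteriors off effect nodes, can be illustrated in Python with the pgmpy library; the toy network below is hypothetical, and pgmpy class names vary slightly between versions:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical two-cause, one-effect network (not the paper's model).
model = BayesianNetwork([("Rain", "WetRoad"), ("Sprinkler", "WetRoad")])
model.add_cpds(
    TabularCPD("Rain", 2, [[0.8], [0.2]]),
    TabularCPD("Sprinkler", 2, [[0.9], [0.1]]),
    TabularCPD("WetRoad", 2,
               [[1.0, 0.1, 0.2, 0.01],   # P(WetRoad=0 | Rain, Sprinkler)
                [0.0, 0.9, 0.8, 0.99]],  # P(WetRoad=1 | Rain, Sprinkler)
               evidence=["Rain", "Sprinkler"], evidence_card=[2, 2]),
)
assert model.check_model()

# The dashboard-style interaction: fix a cause, query the effect.
infer = VariableElimination(model)
print(infer.query(["WetRoad"], evidence={"Rain": 1}))
```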
Abstract:
Cone-beam computed tomography (CBCT) has enormous potential to improve the accuracy of treatment delivery in image-guided radiotherapy (IGRT). To assist radiotherapists in interpreting these images, we use a Bayesian statistical model to label each voxel according to its tissue type. The rich sources of prior information in IGRT are incorporated into a hidden Markov random field model of the 3D image lattice. Tissue densities in the reference CT scan are estimated using inverse regression and then rescaled to approximate the corresponding CBCT intensity values. The treatment planning contours are combined with published studies of physiological variability to produce a spatial prior distribution for changes in the size, shape and position of the tumour volume and organs at risk. The voxel labels are estimated using iterated conditional modes. The accuracy of the method has been evaluated using 27 CBCT scans of an electron density phantom. The mean voxel-wise misclassification rate was 6.2%, with a Dice similarity coefficient of 0.73 across liver, muscle, breast and adipose tissue. By incorporating prior information, we are able to successfully segment CBCT images. This could be a viable approach for automated, online image analysis in radiotherapy.
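Iterated conditional modes itself is straightforward: each voxel takes the label minimising its Gaussian negative log-likelihood minus a Potts-style reward for agreeing with its neighbours, sweeping until the labelling stabilises. A minimal 3D NumPy sketch; the tissue means, standard deviations and smoothing weight beta are placeholders rather than the paper's fitted priors:

```python
import numpy as np

def icm(image, means, sds, beta=1.5, sweeps=10):
    """Iterated conditional modes for a hidden Potts/MRF segmentation.
    `image` is a 3D array; `means`/`sds` are per-label Gaussian parameters.
    All values here are placeholders, not the paper's fitted priors."""
    labels = np.argmin((image[..., None] - means) ** 2 / sds**2, axis=-1)
    # Per-voxel negative log-likelihood for each candidate label.
    nll = (image[..., None] - means) ** 2 / (2 * sds**2) + np.log(sds)
    for _ in range(sweeps):
        # Count neighbours agreeing with each candidate label
        # (6-connectivity; np.roll wraps at edges, acceptable for a sketch).
        agree = np.zeros(nll.shape)
        for ax in range(3):
            for shift in (-1, 1):
                rolled = np.roll(labels, shift, axis=ax)
                agree += rolled[..., None] == np.arange(len(means))
        new = np.argmin(nll - beta * agree, axis=-1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels

# Toy example: two "tissues" in a noisy 3D volume.
rng = np.random.default_rng(1)
truth = np.zeros((20, 20, 20), int)
truth[8:14, 8:14, 8:14] = 1
img = rng.normal(np.where(truth == 1, 120.0, 40.0), 25.0)
seg = icm(img, means=np.array([40.0, 120.0]), sds=np.array([25.0, 25.0]))
print((seg == truth).mean())  # voxel-wise agreement on the toy phantom
```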
Abstract:
This paper describes a method for analysing videogames based on game activities, and examines the impact of these activities on the player experience. The research approach applies heuristic checklists that deconstruct games in terms of the cognitive processes players engage in during gameplay (e.g., addressing goals, interpreting feedback). For this study we examined three puzzle games: Portal 2, I-Fluid and Braid. The Player Experience of Need Satisfaction (PENS) survey was used to measure player experience following gameplay. The cognitive activity afforded within each game is examined in light of reported player experiences to determine the extent to which these activities influence players’ feelings of competence, autonomy, intuitive control and presence. Findings indicate that positive experiences are directly influenced by the design of game activities. Our study also demonstrates the value of expert review in deconstructing gameplay activity as a means of providing direction for game design that enhances the player experience.
Abstract:
In contemporary game development circles the ‘game making jam’ has become an important rite of passage and baptism event, an exploration space, and a central indie lifestyle affirmation and community event. Game jams have recently become a focus for design researchers interested in the creative process. In this paper we tell the story of an established local game jam and our various documentation and data collection methods. We present the beginnings of the current project, which seeks to map the creative teams and their process in the space of the challenge, and which aims to enable participants to be more than the objects of the data collection. A perceived issue is that typical documentation approaches are ‘about’ the event as opposed to ‘made by’ the participants; they are thus at odds with the spirit of the jam as a phenomenon and do not really access the rich playful potential of participant experience. In the data collection and visualisation projects described here, we focus on using collected data to re-include the participants in telling stories about their experiences of the event as a place-based experience. Our goal is to find a means to encourage the production of ‘anecdata’ (data based on individual storytelling that is subjective, malleable, and resists collection via formal mechanisms) and to enable mimesis, or active narrating, on the part of the participants. We present a concept design for data as game, based on the logic of early medieval maps, and we reflect on how we could enable participation in the data collection itself.
Abstract:
Even though web security protocols are designed to make computer communication secure, it is widely known that there is potential for security breakdowns at the human-machine interface. This paper examines findings from a qualitative study investigating how security decisions are identified and made on the web. The study was designed to uncover how security is perceived in an individual user's context. Study participants were tertiary-qualified individuals, with a focus on HCI designers, security professionals and the general population. The study identifies that security frameworks for the web are inadequate from an interaction perspective: even tertiary-qualified users have a poor or partial understanding of security, of which they themselves are acutely aware. The result is that individuals feel they must protect themselves on the web. The findings contribute a significant mapping of the ways in which individuals reason and act to protect themselves on the web. We use these findings to highlight the need to design for trust at three levels, and the need to ensure that HCI design does not undermine the users' main identified protection mechanism: separation.