169 results for Software visualisation
Abstract:
Objective To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design Systematic review. Data sources The electronic databases searched included PubMed, Cinahl, Medline, Google Scholar, and Proquest. The bibliography of each relevant article was examined and associated articles were identified using a snowballing technique. Selection criteria For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalisability, source data quality, complexity of models, and integration of content and technical knowledge were discussed. Conclusions The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, involvement of computer scientists in the injury prevention field, and more comprehensive use and description of quality assurance methods in text mining approaches, knowledge of text mining in the injury field is likely to continue to grow and advance.
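The reviewed studies describe Bayesian probability methods for assigning injury categories to free-text narratives, but no single implementation is prescribed. As a minimal sketch of that style of classifier, the following uses scikit-learn's multinomial naive Bayes on bag-of-words features; the narratives, category labels and library choice are illustrative assumptions, not material from the reviewed papers.

```python
# Hedged sketch of Bayesian text classification of injury narratives.
# The narratives and category labels below are hypothetical placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

narratives = [
    "worker fell from ladder while cleaning gutters",
    "finger caught in conveyor belt at packing plant",
    "slipped on wet floor in kitchen and struck head",
]
categories = ["fall", "machinery", "fall"]

# Bag-of-words features feed a multinomial naive Bayes model, i.e. the
# posterior probability of each injury category given the words used.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(narratives, categories)

print(model.predict(["fell off scaffolding while painting"]))
```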
Abstract:
Aim: In 2013 QUT introduced the Medical Imaging Training Immersive Environment (MITIE), a virtual reality (VR) platform that allowed students to practise general radiography. The system software has since been expanded to include the C-Arm. The aim of this project was to investigate the use of this technology in the pedagogy of undergraduate medical imaging students who have limited or no experience in the clinical use of the C-Arm. Method: The MITIE application provides students with realistic and fully interactive 3D models of C-Arm equipment. As with VR initiatives in other health disciplines (1–2), the software mimics clinical practice as closely as possible and uses 3D technology to enhance 3D spatial awareness and realism. The application allows students to set up and expose a virtual patient in a 3D environment and to generate the resultant “image” for comparison with a gold standard. Automated feedback highlights ways for the student to improve their patient positioning, equipment setup or exposure factors. The students' equipment knowledge was tested using an online assessment quiz, and surveys captured the students' pre-clinical confidence ratings for comparison with post-clinical data. Ethical approval for the project was provided by the university ethics panel. Results: This study is currently under way, and this paper will present an analysis of initial student feedback relating to the perceived value of the application for confidence in a high-risk environment (i.e. the operating theatre) and related clinical skills development. Further in-depth evaluation is ongoing, with full results to be presented. Conclusion: MITIE C-Arm has a role to play in the pre-clinical skills development of Medical Radiation Science students. It will augment their theoretical understanding prior to their clinical experience. References 1. Bridge P, Appleyard R, Ward J, Phillips R, Beavis A. The development and evaluation of a virtual radiotherapy treatment machine using an immersive visualisation environment. Computers and Education 2007; 49(2): 481–494. 2. Gunn T, Berry C, Bridge P et al. 3D Virtual Radiography: Development and Initial Feedback. Paper presented at the 10th Annual Scientific Meeting of Medical Imaging and Radiation Therapy, March 2013, Hobart, Tasmania.
Abstract:
A software-based environment was developed to provide practical training in medical radiation principles and safety. The Virtual Radiation Laboratory application allowed students to conduct virtual experiments using simulated diagnostic and radiotherapy X-ray generators. The experiments were designed to teach students about the inverse square law, the half value layer and radiation protection measures, and utilised genuine clinical and experimental data. Evaluation of the application was conducted to ascertain the impact of the software on students' understanding, satisfaction and collaborative learning skills, and to identify potential further improvements to the software and guidelines for its continued use. Feedback was gathered via an anonymous online survey consisting of a mixture of Likert-style questions and short-answer open questions. Student feedback was highly positive, with 80 % of students reporting increased understanding of radiation protection principles. Furthermore, 72 % enjoyed using the software and 87 % of students felt that the project facilitated collaboration within small groups. The main themes arising in the qualitative feedback related to efficiency and effectiveness of teaching, safety of the environment, collaboration and realism. Staff and students both reported gains in efficiency and effectiveness associated with the virtual experiments. In addition, students particularly valued the visualisation of “invisible” physical principles and the increased opportunity for experimentation and collaborative problem-based learning. Similar ventures will benefit from adopting an approach that allows for individual experimentation while visualising challenging concepts.
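The two physical principles the virtual experiments target can be summarised numerically. The sketch below is an illustrative calculation only, not code from the Virtual Radiation Laboratory, and the dose rates, distances and half value layer used are arbitrary example values.

```python
# Illustrative calculation of the inverse square law and half value layer (HVL);
# the numbers are arbitrary examples, not values from the application.

def inverse_square(dose_rate_at_ref, ref_distance_m, new_distance_m):
    """Dose rate from a point source falls with the square of distance."""
    return dose_rate_at_ref * (ref_distance_m / new_distance_m) ** 2

def attenuated(dose_rate, shield_thickness_mm, hvl_mm):
    """Each half value layer of shielding halves the transmitted dose rate."""
    return dose_rate * 0.5 ** (shield_thickness_mm / hvl_mm)

rate_2m = inverse_square(dose_rate_at_ref=100.0, ref_distance_m=1.0, new_distance_m=2.0)
print(rate_2m)                                                   # 25.0: doubling the distance quarters the rate
print(attenuated(rate_2m, shield_thickness_mm=6.0, hvl_mm=3.0))  # 6.25: two HVLs halve it twice
```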
Abstract:
This project developed a visual strategy and graphic outcomes to communicate the results of a scientific collaborative project to the Mackay community. During 2013 and 2014 a team from CSIRO engaged with the community in Mackay to collaboratively develop a set of strategies to improve the management of the Great Barrier Reef. The result of this work was a 300+ page scientific report that needed to be translated and summarised for the general community. The aim of this project was to strategically synthesise the information contained in the report and to design and produce an outcome to be distributed to the participant community. Working with the CSIRO researchers, an action toolkit was developed, comprising twelve cards and a booklet. Each card represented the story behind a particular local management issue and the actions that the participants suggested should be taken to improve management of the Reef. During the design synthesis it was identified that every management issue referred to the need to develop some sort of "educational campaign" in the area. This was then translated into an underlying action supporting all other actions proposed in the toolkit.
Abstract:
As technological capabilities for capturing, aggregating, and processing large quantities of data continue to improve, the question becomes how to utilise these resources effectively. Whenever automatic methods fail, it is necessary to rely on human background knowledge, intuition, and deliberation. This creates demand for data exploration interfaces that support the analytical process, allowing users to absorb and derive knowledge from data. Such interfaces have historically been designed for experts. However, existing research has shown promise in involving a broader range of users acting as citizen scientists, which places high demands on usability. Visualisation is one of the most effective analytical tools for humans processing abstract information. Our research focuses on the development of interfaces to support collaborative, community-led inquiry into data, which we refer to as Participatory Data Analytics. The development of data exploration interfaces to support independent investigations by local communities around topics of interest to them presents a unique set of challenges, which we discuss in this paper. We present our preliminary work towards suitable high-level abstractions and interaction concepts that allow users to construct and tailor visualisations to their own needs.
Abstract:
Variability is observed at all levels of cardiac electrophysiology. Yet the underlying causes and importance of this variability are generally unknown, and difficult to investigate with current experimental techniques. The aim of the present study was to generate populations of computational ventricular action potential models that reproduce experimentally observed intercellular variability of repolarisation (represented by action potential duration) and to identify its potential causes. A systematic exploration of the effects of simultaneously varying the magnitude of six transmembrane current conductances (transient outward, rapid and slow delayed rectifier K⁺, inward rectifying K⁺, L-type Ca²⁺, and Na⁺/K⁺ pump currents) in two rabbit-specific ventricular action potential models (Shannon et al. and Mahajan et al.) at multiple cycle lengths (400, 600, 1,000 ms) was performed. This was accomplished with distributed computing software specialised for multi-dimensional parameter sweeps and grid execution. An initial population of 15,625 parameter sets was generated for both models at each cycle length. The action potential durations of these populations were compared to experimentally derived ranges for rabbit ventricular myocytes. A total of 1,352 parameter sets for the Shannon model and 779 parameter sets for the Mahajan model yielded action potential durations within the experimental range, demonstrating that a wide array of ionic conductance values can be used to simulate a physiological rabbit ventricular action potential. Furthermore, by using clutter-based dimension reordering, a technique that allows visualisation of multi-dimensional spaces in two dimensions, the interactions of the current conductances and their relative importance to the ventricular action potential at different cycle lengths were revealed. Overall, this work represents an important step towards a better understanding of the role that variability in current conductances may play in experimentally observed intercellular variability of rabbit ventricular action potential repolarisation.
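The sampling-and-filtering logic behind such a population of models can be sketched serially; the study itself used dedicated distributed computing software. In the sketch below, five scaling levels per conductance reproduce the 5⁶ = 15,625 parameter sets mentioned in the abstract, but the surrogate simulate_apd function and the accepted APD range are stand-in assumptions rather than the Shannon or Mahajan models and their experimental bounds.

```python
# Population-of-models sketch: scale six conductances over a grid and keep only
# parameter sets whose action potential duration (APD) falls inside an accepted
# range. simulate_apd() is a crude stand-in for a real rabbit ventricular model.
import itertools

CONDUCTANCES = ["g_to", "g_Kr", "g_Ks", "g_K1", "g_CaL", "g_NaK"]
LEVELS = [0.5, 0.75, 1.0, 1.25, 1.5]        # scaling factors applied to baseline values

def simulate_apd(scaling, cycle_length_ms):
    """Stand-in surrogate in which K+ currents shorten and the L-type Ca2+ current
    lengthens the APD. Replace with a call to the Shannon or Mahajan cell model."""
    base = 0.2 * cycle_length_ms
    k_total = (scaling["g_Kr"] * scaling["g_Ks"] * scaling["g_K1"]) ** (1 / 3)
    return base * scaling["g_CaL"] ** 0.3 / k_total

def build_population(cycle_length_ms, apd_range_ms=(150.0, 250.0)):
    """Sweep all 5**6 = 15,625 scaling combinations and keep the accepted ones."""
    accepted = []
    for combo in itertools.product(LEVELS, repeat=len(CONDUCTANCES)):
        scaling = dict(zip(CONDUCTANCES, combo))
        apd = simulate_apd(scaling, cycle_length_ms)
        if apd_range_ms[0] <= apd <= apd_range_ms[1]:
            accepted.append(scaling)
    return accepted

print(len(build_population(cycle_length_ms=1000)))   # size of the accepted population
```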
Abstract:
Acoustic recordings play an increasingly important role in monitoring terrestrial environments. However, due to rapid advances in technology, ecologists are accumulating more audio than they can listen to. Our approach to this big-data challenge is to visualize the content of long-duration audio recordings by calculating acoustic indices. These are statistics which describe the temporal-spectral distribution of acoustic energy and reflect content of ecological interest. We combine spectral indices to produce false-color spectrogram images. These not only reveal acoustic content but also facilitate navigation. An additional analytic challenge is to find appropriate descriptors to summarize the content of 24-hour recordings, so that it becomes possible to monitor long-term changes in the acoustic environment at a single location and to compare the acoustic environments of different locations. We describe a 24-hour ‘acoustic-fingerprint’ which shows some preliminary promise.
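The abstract does not specify which spectral indices are combined or how; a minimal sketch of the general false-colour idea, in which three per-minute spectral index matrices are normalised and stacked as the red, green and blue channels of a single image, could look like the following (the index matrices here are random placeholders).

```python
# False-colour spectrogram sketch: three spectral acoustic indices, each with one
# value per frequency bin per minute of a 24-hour recording, become R, G and B.
# The index matrices below are random placeholders standing in for real indices.
import numpy as np
import matplotlib.pyplot as plt

freq_bins, minutes = 256, 1440
rng = np.random.default_rng(0)
index_red = rng.random((freq_bins, minutes))    # e.g. an acoustic complexity index
index_green = rng.random((freq_bins, minutes))  # e.g. a temporal entropy index
index_blue = rng.random((freq_bins, minutes))   # e.g. a spectral cover index

def normalise(x):
    """Rescale an index matrix to [0, 1] so it can serve as a colour channel."""
    return (x - x.min()) / (x.max() - x.min())

false_colour = np.dstack([normalise(index_red), normalise(index_green), normalise(index_blue)])
plt.imshow(false_colour, aspect="auto", origin="lower")
plt.xlabel("minute of day")
plt.ylabel("frequency bin")
plt.savefig("false_colour_spectrogram.png")
```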
Abstract:
Rapid advances in sequencing technologies (Next Generation Sequencing, or NGS) have led to a vast increase in the quantity of bioinformatics data available, and this increasing scale presents enormous challenges to researchers seeking to identify complex interactions. This paper is concerned with the domain of transcriptional regulation and the use of visualisation to identify relationships between specific regulatory proteins (transcription factors, or TFs) and their associated target genes (TGs). We present preliminary work from an ongoing study which aims to determine the effectiveness of different visual representations and large-scale displays in supporting discovery. Following an iterative process of implementation and evaluation, representations were tested by potential users in the bioinformatics domain to determine their efficacy and to better understand the range of ad hoc practices among bioinformatics-literate users. Results from two rounds of small-scale user studies are considered, with initial findings suggesting that bioinformaticians require richly detailed views of TF data, features to quickly compare TF layouts between organisms, and ways to keep track of interesting data points.
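One common representation for this kind of data is a node-link graph of TF to TG relationships. The sketch below is illustrative only: the gene names are hypothetical, and the use of networkx and matplotlib is an assumption rather than the toolchain reported in the study.

```python
# Minimal node-link view of transcription factor (TF) to target gene (TG) pairs.
# The pairs are hypothetical placeholders, not data from the study.
import networkx as nx
import matplotlib.pyplot as plt

tf_tg_pairs = [("TF_A", "gene1"), ("TF_A", "gene2"), ("TF_B", "gene2"), ("TF_B", "gene3")]

graph = nx.DiGraph()
graph.add_edges_from(tf_tg_pairs)

# Colour TFs and TGs differently so the two roles are distinguishable at a glance.
tfs = {tf for tf, _ in tf_tg_pairs}
node_colours = ["tab:red" if node in tfs else "tab:blue" for node in graph.nodes]

nx.draw_networkx(graph, node_color=node_colours, with_labels=True, arrows=True)
plt.savefig("tf_tg_network.png")
```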
Abstract:
When designed effectively, dashboards are expected to reduce information overload and improve performance management. Hence, interest in dashboards has increased recently, which is also evident from the proliferation of dashboard solution providers in the market. Despite dashboards' popularity, little is known about the extent of their effectiveness in organizations. Dashboards draw from multiple disciplines but ultimately use visualization to communicate important information to stakeholders. Thus, a better understanding of visualization can improve the design and use of dashboards. This paper reviews the foundations and roles of dashboards in performance management and proposes a framework for future research, which can enhance dashboard design and perceived usefulness depending on the fit between the features of the dashboard and the characteristics of the users.
Abstract:
Background Corneal oedema is a common post-operative problem that delays or prevents visual recovery from ocular surgery. Honey is a supersaturated solution of sugars with an acidic pH, high osmolarity and low water content. These characteristics inhibit the growth of micro-organisms, reduce oedema and promote epithelialisation. This clinical case series describes the use of a regulatory-approved Leptospermum species honey ophthalmic product in the management of post-operative corneal oedema and bullous keratopathy. Methods A retrospective review was conducted of 18 consecutive cases (30 eyes) with corneal oedema persisting beyond one month after single or multiple ocular surgical procedures (phacoemulsification cataract surgery and additional procedures), treated with Optimel Antibacterial Manuka Eye Drops two to three times daily as an adjunctive therapy to conventional topical management with corticosteroid, aqueous suppressants, hypertonic sodium chloride five per cent, eyelid hygiene and artificial tears. Visual acuity and central corneal thickness were measured before and at the conclusion of Optimel treatment. Results A temporary reduction in corneal epithelial oedema lasting up to several hours was observed after the initial Optimel instillation and was associated with a reduction in central corneal thickness, resolution of epithelial microcysts, collapse of epithelial bullae, improved corneal clarity, improved visualisation of the intraocular structures and improved visual acuity. Additionally, with chronic use, a reduction in punctate epitheliopathy, a reduction in central corneal thickness and an improvement in visual acuity were achieved. Temporary stinging was experienced after Optimel instillation. No adverse infectious or inflammatory events occurred during treatment with Optimel. Conclusions Optimel was a safe and effective adjunctive therapeutic strategy in the management of persistent post-operative corneal oedema and warrants further investigation in clinical trials.
Abstract:
This is the fourth TAProViz workshop, run at the 13th International Conference on Business Process Management (BPM). The intention this year is to consolidate the results of the previous successful workshops by further developing this important topic and identifying the key research topics of interest to the BPM visualization community. Towards this goal, the workshop topics were extended to human-computer interaction and related domains. Submitted papers were evaluated by at least three program committee members, in a double-blind manner, on the basis of significance, originality, technical quality and exposition. Three full papers and one position paper were accepted for presentation at the workshop. In addition, we invited a keynote speaker, Jakob Pinggera, a postdoctoral researcher at the Business Process Management Research Cluster at the University of Innsbruck, Austria.
Abstract:
The effect of the material properties of an environmentally friendly, optically transparent dielectric material, polyterpenol, on the carrier transients within a pentacene-based double-layer MIM device was investigated. Polyterpenol films were RF plasma polymerised under varied process conditions, with the resultant films differing in surface chemistry and morphology. Independent of the type of polyterpenol, a time-resolved EFISHG study of IZO/polyterpenol/pentacene/Au structures showed similar transient behaviour, with carriers injected into the pentacene from the Au electrode only, confirming polyterpenol to be a suitable blocking layer for visualisation of single-species carrier transportation during charging and discharging under different bias conditions. Polyterpenol films fabricated under higher input power show better promise due to their higher chemical and thermal stability, improved uniformity, and absence of defects.
Abstract:
A time-resolved electric field-induced second harmonic generation (EFISHG) technique was used to probe the carrier transients within double-layer pentacene-based MIM devices. Polyterpenol thin films, fabricated from a non-synthetic, environmentally sustainable source, were used as a blocking layer to assist in the visualisation of single-species carrier transportation during charging and discharging under different bias conditions. The results demonstrated that the carrier transients comprised charging of the electrodes, followed by carrier injection and charging of the interface. Polyterpenol was demonstrated to be a sound blocking material and can therefore be effectively used for the probing of double-layer devices using EFISHG.
Abstract:
Disease maps are effective tools for explaining and predicting patterns of disease outcomes across geographical space, identifying areas of potentially elevated risk, and formulating and validating aetiological hypotheses for a disease. Bayesian models have become a standard approach to disease mapping in recent decades. This article aims to provide a basic understanding of the key concepts involved in Bayesian disease mapping methods for areal data. It is anticipated that this will help in the interpretation of published maps, and provide a useful starting point for anyone interested in running disease mapping methods for areal data. The article provides detailed motivation for and descriptions of disease mapping methods by explaining the concepts, defining the technical terms, and illustrating the utility of disease mapping for epidemiological research by demonstrating various ways of visualising model outputs using a case study. The target audience includes spatial scientists in health and other fields, policy or decision makers, health geographers, spatial analysts, public health professionals, and epidemiologists.
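As a hedged illustration of the kind of model such methods typically build on (not the article's own case study), a common Bayesian disease-mapping formulation for areal counts is the Besag-York-Mollié structure, sketched below with generic notation.

```latex
% A typical Bayesian disease-mapping model for areal counts (BYM-style);
% the notation is generic, not taken from the article's case study.
\begin{align*}
  y_i &\sim \operatorname{Poisson}(E_i \, \theta_i), \\
  \log \theta_i &= \beta_0 + u_i + v_i, \\
  u_i \mid u_{j \neq i} &\sim \mathcal{N}\!\left(\frac{1}{n_i}\sum_{j \in \delta_i} u_j,\; \frac{\sigma_u^2}{n_i}\right), \qquad
  v_i \sim \mathcal{N}(0, \sigma_v^2)
\end{align*}
```

Here y_i and E_i are the observed and expected counts in area i, θ_i is the relative risk that is mapped, u_i is a spatially structured (intrinsic CAR) effect over the n_i neighbours δ_i of area i, and v_i captures unstructured heterogeneity.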
Abstract:
Self-tracking, the process of recording one's own behaviours, thoughts and feelings, is a popular approach to enhancing one's self-knowledge. While dedicated self-tracking apps and devices support data collection, previous research highlights that the integration of data constitutes a barrier for users. In this study we investigated how members of the Quantified Self movement, early adopters of self-tracking tools, overcome these barriers. We conducted a qualitative analysis of 51 videos of Quantified Self presentations to explore intentions for collecting data, methods for integrating and representing data, and how intentions and methods shaped reflection. The findings highlight two different intentions, striving for self-improvement and curiosity about personal data, which shaped how these users integrated data, i.e. the effort required. Furthermore, we identified three methods for representing data (binary, structured and abstract) which influenced reflection. Binary representations supported reflection-in-action, whereas structured and abstract representations supported iterative processes of data collection, integration and reflection. For people tracking out of curiosity, this iterative engagement with personal data often became an end in itself, rather than a means to achieve a goal. We discuss how these findings contribute to our current understanding of self-tracking amongst Quantified Self members and beyond, and we conclude with directions for future work to support self-trackers with their aspirations.