980 results for Data manipulation


Relevance:

60.00%

Publisher:

Abstract:

A simple and inexpensive method is described for analysis of uranium (U) activity and mass in water by liquid scintillation counting using $\alpha$/$\beta$ discrimination. This method appears to offer a solution to the need for an inexpensive protocol for monitoring U activity and mass simultaneously, and an alternative to the potential inaccuracy involved when depending on the mass-to-activity conversion factor or activity screen. U is extracted virtually quantitatively into 20 ml of extractive scintillator from a 1 $\ell$ aliquot of water acidified to less than pH 2. After phase separation, the sample is counted for a 20-minute screening count with a minimum detection level of 0.27 pCi $\ell^{-1}$. $\alpha$-particle emissions from the extracted U are counted with close to 100% efficiency with a Beckman LS6000 LL liquid scintillation counter equipped with pulse-shape discrimination electronics. Samples with activities higher than 10 pCi $\ell^{-1}$ are recounted for 500-1000 minutes for isotopic analysis. Isotopic analysis uses events that are automatically stored in spectral files and transferred to a computer during assay. The data can be transferred to a commercially available spreadsheet and retrieved for examination or data manipulation. Values for three readily observable spectral features can be rapidly identified by data examination and substituted into a simple formula to obtain the $^{234}$U/$^{238}$U ratio for most samples. U mass is calculated by substituting the isotopic ratio value into a simple equation. The utility of this method for the proposed compliance monitoring of U in public drinking water supplies was field tested with a survey of drinking water from Texas supplies previously known to contain elevated levels of gross $\alpha$ activity. U concentrations in 32 samples from 27 drinking water supplies ranged from 0.26 to 65.5 pCi $\ell^{-1}$, with seven samples exceeding the proposed Maximum Contaminant Level of 20 $\mu$g $\ell^{-1}$. Four exceeded the proposed activity screening level of 30 pCi $\ell^{-1}$. Isotopic ratios ranged from 0.87 to 41.8, while one sample contained $^{234}$U activity of 34.6 pCi $\ell^{-1}$ in the complete absence of its parent, $^{238}$U. U mass in the samples with elevated activity ranged from 0.0 to 103 $\mu$g $\ell^{-1}$. A limited test of screening surface waters and groundwaters for contamination by U from waste sites and natural processes was also successful.
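
The abstract gives neither the ratio formula nor the mass equation explicitly. As a rough illustration of the final mass step only, the following Python sketch converts a total U $\alpha$ activity and a measured $^{234}$U/$^{238}$U activity ratio into a mass concentration, assuming the standard specific activity of $^{238}$U (about 0.333 pCi per $\mu$g) and neglecting the small $^{235}$U contribution; the function and names are illustrative, not the paper's.

```python
# Hypothetical sketch of the mass calculation described above; the paper's
# exact formula is not given in the abstract. Assumes the standard specific
# activity of U-238 (~0.333 pCi/ug) and neglects the small U-235 contribution.

SPECIFIC_ACTIVITY_U238 = 0.333  # pCi per microgram of U-238

def uranium_mass(total_activity_pci_per_l: float, ratio_234_238: float) -> float:
    """Estimate U mass (ug/L) from total alpha activity and the 234U/238U ratio.

    Total activity A = A238 + A234 = A238 * (1 + R), so A238 = A / (1 + R);
    essentially all of the mass is carried by 238U, so m ~= A238 / SA(238U).
    """
    a238 = total_activity_pci_per_l / (1.0 + ratio_234_238)
    return a238 / SPECIFIC_ACTIVITY_U238

# Example: 30 pCi/L total U activity at secular equilibrium (R = 1)
print(uranium_mass(30.0, 1.0))  # ~45 ug/L
```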

Relevance:

60.00%

Publisher:

Abstract:

This paper describes JANUS, a modular, massively parallel and reconfigurable FPGA-based computing system. Each JANUS module has a computational core and a host. The computational core is a 4x4 array of FPGA-based processing elements with nearest-neighbor data links. The processors are also directly connected to an I/O node attached to the JANUS host, a conventional PC. JANUS is tailored for, but not limited to, the requirements of a class of hard scientific applications characterized by regular code structure, unconventional data manipulation instructions and moderate database size. We discuss the architecture of this configurable machine and focus on its use for Monte Carlo simulations of statistical mechanics. On this class of applications JANUS achieves impressive performance: in some cases one JANUS processing element outperforms high-end PCs by a factor of roughly 1000. We also discuss the role of JANUS in other classes of scientific applications.
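
The target workload here, Monte Carlo simulation of spin models, has a very regular kernel. As a point of reference only, a plain-Python sketch of the single-spin Metropolis update that such machines parallelise massively (this is not JANUS code; the model and parameters are illustrative):

```python
import math
import random

def metropolis_sweep(spins, L, beta):
    """One Metropolis sweep of a 2D Ising model on an L x L lattice with
    periodic boundaries. Illustrates the regular, nearest-neighbor data
    access pattern that JANUS-style FPGA engines exploit; a plain software
    sketch, not the JANUS implementation."""
    for i in range(L):
        for j in range(L):
            s = spins[i][j]
            nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j] +
                  spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2.0 * s * nn  # energy change if spin (i, j) is flipped
            if dE <= 0 or random.random() < math.exp(-beta * dE):
                spins[i][j] = -s

L = 16
spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
for _ in range(100):
    metropolis_sweep(spins, L, beta=0.44)  # near the 2D Ising critical point
```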

Relevance:

40.00%

Publisher:

Abstract:

Buildings are key mediators between human activity and the environment around them, but details of energy usage and activity in buildings are often poorly communicated and understood. ECOS is an Eco-Visualization project that aims to contextualize the energy generation and consumption of a green building in a variety of different climates. The ECOS project is being developed for a large public interactive space installed in the new Science and Engineering Centre of the Queensland University of Technology that is dedicated to delivering interactive science education content to the public. This paper focuses on how design can develop ICT solutions from large data sets to create meaningful engagement with environmental data.

Relevance:

40.00%

Publisher:

Abstract:

This paper describes the main features of ARDBID (A Relational Database for Interactive Design). An overview of the organization of the database is presented, and the data definition and manipulation languages are described in detail. These have been implemented on a DEC 1090 system.
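
ARDBID's own data definition and manipulation languages are not reproduced in the abstract. As a generic illustration of the two roles such languages play, here is a sketch using SQLite's SQL dialect from Python; this is not ARDBID syntax, and the table and column names are invented:

```python
import sqlite3

# Generic illustration of data definition (DDL) and data manipulation (DML)
# languages, using SQLite. ARDBID's own languages (implemented on a DEC 1090)
# are not shown in the abstract and certainly differed in syntax.
con = sqlite3.connect(":memory:")

# Data definition: declare a relation and its attributes.
con.execute("CREATE TABLE component (id INTEGER PRIMARY KEY, name TEXT, mass REAL)")

# Data manipulation: insert, update and query tuples.
con.execute("INSERT INTO component (name, mass) VALUES (?, ?)", ("beam", 12.5))
con.execute("UPDATE component SET mass = mass * 1.1 WHERE name = ?", ("beam",))
for row in con.execute("SELECT id, name, mass FROM component"):
    print(row)
```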

Relevance:

40.00%

Publisher:

Abstract:

Dynamic power consumption is highly dependent on interconnect, so clever mapping of digital signal processing algorithms to parallelised realisations with good data locality is vital. This is a particular problem for fast algorithm implementations, where designers have typically sacrificed circuit structure for efficiency in software implementation. This study outlines an approach for reducing the dynamic power consumption of a class of fast algorithms by minimising the index space separation; this allows the generation of field programmable gate array (FPGA) implementations with reduced power consumption. It is shown how a 50% reduction in relative index space separation results in measured power reductions of 36% and 37% over a Cooley-Tukey Fast Fourier Transform (FFT)-based solution, for actual power measurements of a Xilinx Virtex-II FPGA implementation and circuit measurements of a Xilinx Virtex-5 implementation respectively. The authors show the generality of the approach by applying it to a number of other fast algorithms, namely the discrete cosine, discrete Hartley and Walsh-Hadamard transforms.
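
The paper's precise definition of index space separation is not given in the abstract. As one plausible reading, the sketch below totals the distance between the indices paired by each butterfly of a radix-2 decimation-in-time FFT, the kind of quantity whose minimisation drives data locality here; the metric and its name are assumptions, not the authors' formulation:

```python
def cooley_tukey_separation(n):
    """Sum of index separations |i - j| over all butterfly pairs of a radix-2
    decimation-in-time FFT of length n. A rough stand-in for the 'index space
    separation' metric discussed above; the paper's exact definition may differ."""
    assert n & (n - 1) == 0, "n must be a power of two"
    total = 0
    span = 1
    while span < n:
        # at this stage, each butterfly pairs index k with index k + span,
        # and there are n / 2 butterflies per stage
        total += (n // 2) * span
        span *= 2
    return total

for n in (8, 64, 1024):
    print(n, cooley_tukey_separation(n))
```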

Relevance:

30.00%

Publisher:

Abstract:

Oberon-2 is an object-oriented language with a class structure based on type extension. The runtime structure of Oberon-2 is described and the low-level mechanism for dynamic type checking explained. It is shown that the superior type-safety of the language, when used for programming styles based on heterogeneous, pointer-linked data structures, has an entirely negligible cost in runtime performance.
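
The abstract does not spell out the low-level mechanism, but a standard constant-time scheme for type extension, commonly used by Oberon-family compilers, stores in each type descriptor a table of its ancestors indexed by extension level, so a type test costs one bounds check, one load and one comparison. A Python sketch of that idea (illustrative only; details of the actual Oberon-2 runtime differ):

```python
class TypeDesc:
    """Sketch of a constant-time type test for type extension: each type
    descriptor holds a table of its ancestors indexed by extension level,
    so 'p IS T' is a single table lookup and pointer comparison."""
    def __init__(self, name, base=None):
        self.name = name
        self.ancestors = (base.ancestors if base else []) + [self]
        self.level = len(self.ancestors) - 1

def is_type(desc, target):
    # p IS T  ==>  one bounds check, one load, one comparison
    return (target.level < len(desc.ancestors)
            and desc.ancestors[target.level] is target)

record = TypeDesc("Record")
figure = TypeDesc("Figure", record)
circle = TypeDesc("Circle", figure)
print(is_type(circle, record))   # True:  Circle extends Record
print(is_type(figure, circle))   # False: Figure is not a Circle
```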

Relevance:

30.00%

Publisher:

Abstract:

Aims: To develop clinical protocols for acquiring PET images, performing CT-PET registration and defining tumour volumes from the PET image data for radiotherapy of lung cancer patients, and then to test these protocols for accuracy and reproducibility.

Method: A phantom-based quality assurance study of the processes associated with using registered CT and PET scans for tumour volume definition was conducted to: (1) investigate image acquisition and manipulation techniques for registering and contouring CT and PET images in a radiotherapy treatment planning system, and (2) determine technology-based errors in the registration and contouring processes. The outcomes of the phantom-based quality assurance study were used to determine clinical protocols. Protocols were developed for (1) acquiring patient PET image data for incorporation into the 3DCRT process, particularly for ensuring that the patient is positioned in their treatment position; (2) CT-PET image registration techniques; and (3) GTV definition using the PET image data. The developed clinical protocols were tested in retrospective clinical trials to assess the levels of inter-user variability attributable to the use of these protocols. A Siemens Somatom Open Sensation 20-slice CT scanner and a Philips Allegro stand-alone PET scanner were used to acquire the images for this research. The Philips Pinnacle3 treatment planning system was used to perform the image registration and contouring of the CT and PET images.

Results: Both the attenuation-corrected and transmission images obtained from standard whole-body PET staging clinical scanning protocols were acquired and imported into the treatment planning system for the phantom-based quality assurance study. Protocols were determined for manipulating the PET images in the treatment planning system, particularly for quantifying uptake in volumes of interest and for window levels giving accurate geometric visualisation. The automatic registration algorithms were found to have sub-voxel accuracy, with transmission scan-based CT-PET registration more accurate than emission scan-based registration of the phantom images. Respiration-induced image artifacts were not found to influence registration accuracy, whereas inadequate pre-registration overlap of the CT and PET images resulted in large registration errors. A threshold value based on a percentage of the maximum uptake within a volume of interest was found to accurately contour the different features of the phantom despite the lower spatial resolution of the PET images. Appropriate selection of the threshold value depends on the target-to-background ratio and the presence of respiratory motion. The results of the phantom-based study were used to design, implement and test clinical CT-PET fusion protocols. The patient PET image acquisition protocols enabled patients to be successfully identified and positioned in their radiotherapy treatment position during the acquisition of their whole-body PET staging scan. While automatic registration techniques reduced inter-user variation compared with manual techniques, there was no significant difference in registration outcomes between transmission and emission scan-based registration of the patient images under the protocol. Tumour volumes contoured on registered patient CT-PET images, using the tested threshold values and viewing windows determined from the phantom study, showed less inter-user variation for the primary tumour volume contours than those contoured using only the patient's planning CT scans.

Conclusions: The developed clinical protocols allow a patient's whole-body PET staging scan to be incorporated, manipulated and quantified in the treatment planning process to improve the accuracy of gross tumour volume localisation in 3D conformal radiotherapy for lung cancer. Image registration protocols that account for potential software-based errors, combined with adequate user training, are recommended to increase the accuracy and reproducibility of registration outcomes. A semi-automated adaptive threshold contouring technique incorporating a PET windowing protocol accurately defines the geometric edge of a tumour volume using PET image data from a stand-alone PET scanner, including 4D target volumes.
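
As an illustration of the threshold contouring idea, a minimal NumPy sketch that segments at a percentage of the maximum uptake; the study's calibrated threshold values and its 4D handling are not reproduced, and the toy data below are invented:

```python
import numpy as np

def threshold_contour(pet, fraction):
    """Segment a volume of interest by thresholding at a percentage of the
    maximum uptake, as in the adaptive threshold technique described above.
    'fraction' would be chosen from the target-to-background ratio and the
    presence of motion; the study's calibrated values are not reproduced."""
    return pet >= fraction * pet.max()

# Toy example: a bright 'tumour' region in a noisy background
rng = np.random.default_rng(0)
pet = rng.normal(1.0, 0.1, size=(32, 32, 32))
pet[12:20, 12:20, 12:20] += 5.0
mask = threshold_contour(pet, 0.5)  # 50% of maximum uptake (illustrative value)
print(mask.sum(), "voxels in the contoured volume")
```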

Relevance:

30.00%

Publisher:

Abstract:

The Australian e-Health Research Centre, in collaboration with the Queensland University of Technology's Paediatric Spine Research Group, is developing software for visualisation and manipulation of large three-dimensional (3D) medical image data sets. The software allows the extraction of anatomical data from individual patients for use in preoperative planning. State-of-the-art computer technology makes it possible to slice through the image dataset at any angle, or to manipulate 3D representations of the data instantly. Although the software was initially developed to support planning for scoliosis surgery, it can be applied to any dataset, whether obtained from computed tomography, magnetic resonance imaging or any other imaging modality.
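
Arbitrary-angle slicing of a volume amounts to resampling the dataset along a rotated plane. A minimal NumPy/SciPy sketch, assuming the volume is stored as a 3D array; this is illustrative only, not the project's implementation:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def oblique_slice(volume, origin, u, v, size=128):
    """Resample a 2D slice through a 3D volume at an arbitrary angle.
    'origin' is a point on the plane; 'u' and 'v' are orthonormal in-plane
    direction vectors, in voxel coordinates."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    r = np.arange(size) - size / 2
    grid = (np.asarray(origin, float)[:, None, None]
            + u[:, None, None] * r[None, :, None]
            + v[:, None, None] * r[None, None, :])
    return map_coordinates(volume, grid, order=1)  # trilinear interpolation

volume = np.random.rand(64, 64, 64).astype(np.float32)
sl = oblique_slice(volume, origin=(32, 32, 32),
                   u=(1, 0, 0), v=(0, 0.7071, 0.7071))  # 45-degree plane
print(sl.shape)  # (128, 128)
```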

Relevance:

30.00%

Publisher:

Abstract:

Environmental manipulation removes students from their everyday worlds to unfamiliar worlds, to facilitate learning. This article reports that this strategy was effective when applied in a university design unit, using the tactic of immersion in the Second Life online virtual environment. The objective was for teams of students each to design a series of modules for an orbiting space station using supplied data. The changed and futuristic environment led the students to an important but previously unconsidered design decision, which they were able to address in novel ways because of, rather than in spite of, the Second Life immersion.

Relevance:

30.00%

Publisher:

Abstract:

The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this project involved an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry.

Working with data provided by Hoodlum Entertainment and Foxtel Marketing, the outcome of the study was a prototype for a custom data visualisation tool allowing access, manipulation and presentation of user engagement data, both historic and predictive. The prototyped interfaces demonstrate how the visualisation tool would collect and organise data specific to multiplatform projects by aggregating data across a number of platform reporting tools. The tool is designed to encompass not only platforms developed by the transmedia producer but also sites developed by fans. It accounts for multiplatform experience projects whose top level comprises people, platforms and content. People include characters, actors, audience, distributors and creators. Platforms include television, Facebook and other relevant social networks, literature, cinema and other media that might be included in the multiplatform experience. Content refers to discrete media texts employed within the platform, such as a tweet, a YouTube video, a Facebook post, an email, a television episode, etc. Core content is produced by the creators of multiplatform experiences to advance the narrative, while complementary content generated by audience members offers further contributions to the experience. Equally important is the timing with which the components of the experience are introduced and how they interact with and impact upon each other.

By combining, filtering and sorting these elements in multiple ways, we can better understand the value of certain components of a project. This also offers insights into the relationship between the timing of the release of components and the user activity associated with them, which further highlights the efficacy (or, indeed, failure) of assets as catalysts for engagement. In collaboration with Hoodlum we have developed a number of design scenarios experimenting with the ways in which data can be visualised and manipulated to tell a more refined story about the value of user engagement with certain project components and activities. This experimentation will serve as the basis for future research.
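
A hedged sketch of the kind of aggregation the people/platform/content model implies, using pandas; the event schema, field names and figures below are illustrative assumptions, not Hoodlum's data or the prototype's actual design:

```python
import pandas as pd

# Invented engagement events spanning several platform reporting tools.
events = pd.DataFrame([
    {"platform": "twitter",  "content": "tweet:ep1-teaser", "kind": "core",
     "timestamp": "2013-03-01 09:00", "engagements": 412},
    {"platform": "facebook", "content": "post:ep1-launch",  "kind": "core",
     "timestamp": "2013-03-02 18:00", "engagements": 958},
    {"platform": "youtube",  "content": "video:fan-recap",  "kind": "complementary",
     "timestamp": "2013-03-03 11:00", "engagements": 230},
])
events["timestamp"] = pd.to_datetime(events["timestamp"])

# Combine, filter and sort: engagement per platform and per content kind,
# then in release order to relate timing to user activity.
print(events.groupby(["platform", "kind"])["engagements"].sum())
print(events.sort_values("timestamp")[["timestamp", "content", "engagements"]])
```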

Relevance:

30.00%

Publisher:

Abstract:

We explore the relationship between form and data as a design agenda and learning strategy for novice visual information designers. Our students are university seniors in digital visual design, but novices in information design, manipulation and interpretation. We describe design strategies developed to scaffold sophisticated aesthetic and conceptual engagement despite the students' limited understanding of the domain of designing with information. These revolve around an open-ended design project in which students created a physical design from data of their own choosing and research. The accompanying learning strategies concern this relationship between data and form, investigating it materially, formally and through ideation. Exemplary student works that cross media and design domains are described.

Relevance:

30.00%

Publisher:

Abstract:

This pilot project investigated the existing practices and processes of Proficient, Highly Accomplished and Lead teachers in the interpretation, analysis and implementation of National Assessment Program – Literacy and Numeracy (NAPLAN) data. A qualitative case study approach was the chosen methodology, with nine teachers across a variety of school sectors interviewed. Themes and sub-themes identified from the participants' interview responses reveal the ways in which Queensland teachers work with NAPLAN data. The data showed that individual schools and teachers generally adopted their own ways of working with data, with approaches ranging from individual/ad hoc, to hierarchical, to whole-school. Findings also revealed that the data are the responsibility of various persons within the school hierarchy; some work with the data electronically whilst others rely on manual manipulation. Manipulation of the data serves various purposes, including tracking performance, value-adding and targeting programmes for specific groups of students, for example the gifted and talented. Whilst all participants had knowledge of intervention programmes and of how practice could be modified, there were large inconsistencies in knowledge and skills across schools. Some see the use of data as a mechanism for accountability, whilst others mention data with regard to changing the school culture and identifying best practice. Overall, the findings showed inconsistencies in approach to focus area 5.4. Recommendations therefore include a more national approach to the use of educational data.

Relevance:

30.00%

Publisher:

Abstract:

Objective The aim of this study was to determine the linear acceleration, time-to-peak acceleration, and effect of hand position when 2 clinicians performed a thoracic manipulation. Methods Thirteen volunteers received a right- and a left-handed prone thoracic manipulation while accelerations were recorded by an inertial sensor. Peak thrust acceleration and time-to-peak thrust were measured. Results There were differences in thrust acceleration between the right- and left-handed techniques for one therapist. The mean peak thrust acceleration differed between therapists, with the more practiced therapist demonstrating greater peak thrust accelerations. Time-to-peak acceleration also revealed between-therapist differences, with the more practiced therapist demonstrating shorter times to peak acceleration. Cavitation data suggested that manipulations with greater accelerations were more likely to result in cavitation. Conclusion The results of this study suggest that with greater frequency of use, therapists are likely to achieve greater accelerations and shorter times to peak acceleration. Furthermore, this study showed that an inertial sensor can be used to quantify important variables during thoracic manipulation and can detect inter-therapist differences in technique.
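
A generic sketch of extracting the two reported variables, peak acceleration and time-to-peak, from a sampled thrust signal; the study's inertial sensor processing pipeline is not described in the abstract, and the onset criterion below is an assumption:

```python
import numpy as np

def thrust_metrics(accel, fs):
    """Peak linear acceleration and time-to-peak from a sampled thrust signal.

    accel: 1-D array of acceleration magnitude (m/s^2); fs: sample rate (Hz).
    Time-to-peak is measured from thrust onset, taken here as the first
    sample exceeding 10% of the peak (an assumed onset criterion)."""
    accel = np.asarray(accel, float)
    i_peak = int(np.argmax(accel))
    i_onset = int(np.argmax(accel >= 0.1 * accel[i_peak]))
    return accel[i_peak], (i_peak - i_onset) / fs

fs = 1000.0
t = np.arange(0, 0.3, 1 / fs)
accel = 20.0 * np.exp(-((t - 0.15) / 0.02) ** 2)  # synthetic thrust pulse
peak, ttp = thrust_metrics(accel, fs)
print(f"peak = {peak:.1f} m/s^2, time-to-peak = {ttp * 1000:.0f} ms")
```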

Relevance:

30.00%

Publisher:

Abstract:

Background Cervical Spinal Manipulation (CSM) is considered a high-level skill of the central nervous system because it requires bimanual, coordinated, rhythmical movements, and it therefore necessitates training to achieve proficiency. The objective of the present study was to investigate the effect of real-time feedback on the performance of CSM. Methods Six postgraduate physiotherapy students attending a training workshop on Cervical Spine Manipulation Technique (CSMT) using inertial sensor-derived real-time feedback participated in this study. The key variables were the pre-manipulative position, the angular displacement of the thrust and the angular velocity of the thrust. Differences between variables before and after training were investigated using t-tests. Results There were no significant differences after training for the pre-manipulative position (rotation p = 0.549; side bending p = 0.312) or for thrust displacement (rotation p = 0.247; side bending p = 0.314). Thrust angular velocity demonstrated a significant difference following training for rotation (pre-training mean (SD) 48.9°/s (35.1); post-training mean (SD) 96.9°/s (53.9); p = 0.027) but not for side bending (p = 0.521). Conclusion Real-time feedback using an inertial sensor may be valuable in the development of specific manipulative skill. Future studies investigating manipulation could consider a randomized controlled trial using inertial sensor real-time feedback compared with traditional training.
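
A minimal sketch of the statistical comparison reported here, a paired t-test on pre- vs post-training thrust angular velocity using SciPy; the six data points are invented for illustration, since the abstract reports only means, SDs and p-values:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post thrust angular velocities (deg/s) for n = 6 students.
pre  = np.array([20.0, 35.0, 95.0, 40.0, 60.0, 43.0])
post = np.array([55.0, 80.0, 190.0, 70.0, 110.0, 76.0])

# Paired (related-samples) t-test, matching the within-subject design.
t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```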