969 results for LINEAR CURRENT SCANNING
Abstract:
In this study x-ray CT has been used to produce a 3D image of an irradiated PAGAT gel sample, with noise-reduction achieved using the ‘zero-scan’ method. The gel was repeatedly CT scanned and a linear fit to the varying Hounsfield unit of each pixel in the 3D volume was evaluated across the repeated scans, allowing a zero-scan extrapolation of the image to be obtained. To minimise heating of the CT scanner’s x-ray tube, this study used a large slice thickness (1 cm), to provide image slices across the irradiated region of the gel, and a relatively small number of CT scans (63), to extrapolate the zero-scan image. The resulting set of transverse images shows reduced noise compared to images from the initial CT scan of the gel, without being degraded by the additional radiation dose delivered to the gel during the repeated scanning. The full, 3D image of the gel has a low spatial resolution in the longitudinal direction, due to the selected scan parameters. Nonetheless, important features of the dose distribution are apparent in the 3D x-ray CT scan of the gel. The results of this study demonstrate that the zero-scan extrapolation method can be applied to the reconstruction of multiple x-ray CT slices, to provide useful 2D and 3D images of irradiated dosimetry gels.
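The per-pixel zero-scan extrapolation described above can be sketched as follows. The scan count matches the study (63), but the image size, drift rate and noise levels are illustrative stand-ins, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

n_scans, ny, nx = 63, 32, 32                    # 63 repeated scans, as in the study
true_image = rng.normal(50.0, 5.0, (ny, nx))    # hypothetical "true" HU map
drift_per_scan = 0.3                            # assumed HU drift per scan from added dose

# Simulated stack: each scan adds dose-related drift plus acquisition noise.
scan_idx = np.arange(1, n_scans + 1)
stack = (true_image[None, :, :]
         + drift_per_scan * scan_idx[:, None, None]
         + rng.normal(0.0, 8.0, (n_scans, ny, nx)))

# Per-pixel linear fit HU(n) = slope*n + intercept, vectorised with lstsq;
# the intercept is the extrapolation to "scan zero" (no delivered imaging dose).
A = np.stack([scan_idx, np.ones_like(scan_idx, dtype=float)], axis=1)
coef, *_ = np.linalg.lstsq(A, stack.reshape(n_scans, -1), rcond=None)
zero_scan = coef[1].reshape(ny, nx)

# The intercept image is much less noisy than any individual scan.
print(np.abs(zero_scan - true_image).mean(),
      np.abs(stack[0] - true_image).mean())
```

The intercept both removes the dose-induced drift and averages down the acquisition noise, which is why the zero-scan image is cleaner than the first scan alone.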
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions.
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
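As an illustration of the pipeline described above (keyed randomization of extracted features, followed by quantization and binary encoding), the following sketch uses a linear random projection with a fixed sign quantizer. All names, dimensions and noise levels are hypothetical; it is not the dissertation's HOS method.

```python
import numpy as np

def robust_hash(feature_vec, key, n_bits=64, thresholds=None):
    """Illustrative robust-hash back end: a keyed random projection
    (the linear randomization stage), then threshold quantization to bits.
    Real systems precede this with image feature extraction."""
    rng = np.random.default_rng(key)                # secret key seeds the projection
    P = rng.standard_normal((n_bits, feature_vec.size))
    projected = P @ feature_vec                     # compressive for n_bits < feature dim
    if thresholds is None:
        thresholds = np.zeros(n_bits)               # fixed sign quantizer (no training)
    return (projected > thresholds).astype(np.uint8)

def hamming_distance(h1, h2):
    return int(np.sum(h1 != h2))

rng = np.random.default_rng(1)
features = rng.standard_normal(256)                 # stand-in for extracted image features
perturbed = features + rng.normal(0.0, 0.05, 256)   # minor change (e.g. compression noise)
other = rng.standard_normal(256)                    # features from a different image

h, h_p, h_o = (robust_hash(v, key=42) for v in (features, perturbed, other))
print(hamming_distance(h, h_p), hamming_distance(h, h_o))
```

A small input perturbation flips few bits while an unrelated input disagrees on roughly half of them, which is the robustness/discrimination trade-off the abstract describes; the zero thresholds here also hint at why learnt thresholds can leak information.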
Abstract:
This document provides data for the case study presented in our recent earthwork planning papers. Some results are also provided in a graphical format using Excel.
Abstract:
Purpose: Electronic Portal Imaging Devices (EPIDs) are available with most linear accelerators (Amonuk, 2002), the current technology being amorphous silicon flat panel imagers. EPIDs are currently used routinely in patient positioning before radiotherapy treatments. There has been increasing interest in using EPID technology for dosimetric verification of radiotherapy treatments (van Elmpt, 2008). A straightforward technique involves the EPID panel being used to measure the fluence exiting the patient during a treatment, which is then compared to a prediction of the fluence based on the treatment plan. However, a number of significant limitations exist in this method, resulting in a limited proliferation of the technique in a clinical environment. In this paper, we aim to present a technique of simulating IMRT fields using Monte Carlo to predict the dose in an EPID, which can then be compared to the measured dose in the EPID. Materials: Measurements were made using an iView GT flat panel a-Si EPID mounted on an Elekta Synergy linear accelerator. The images from the EPID were acquired using the XIS software (Heimann Imaging Systems). Monte Carlo simulations were performed using the BEAMnrc and DOSXYZnrc user codes. The IMRT fields to be delivered were taken from the treatment planning system in DICOM-RT format and converted into BEAMnrc and DOSXYZnrc input files using an in-house application (Crowe, 2009). Additionally, all image processing and analysis was performed using another in-house application written in the Interactive Data Language (IDL) (ITT Visual Information Systems). Comparison between the measured and Monte Carlo EPID images was performed using a gamma analysis (Low, 1998) incorporating dose and distance-to-agreement criteria. Results: The fluence maps recorded by the EPID were found to provide good agreement between measured and simulated data.
Figure 1 shows an example of measured and simulated IMRT dose images and profiles in the x and y directions. References: D. A. Low et al., "A technique for the quantitative evaluation of dose distributions", Med Phys, 25(5), May 1998. S. Crowe, T. Kairn, A. Fielding, "The development of a Monte Carlo system to verify radiotherapy treatment dose calculations", Radiotherapy & Oncology, 92, Supplement 1, August 2009, S71.
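A minimal one-dimensional version of the gamma analysis used for the comparison (after Low et al., 1998) can be sketched as below. The profiles and criterion values are illustrative, not the study's data.

```python
import numpy as np

def gamma_index_1d(ref, evalu, dx_mm, dose_crit=0.03, dta_mm=3.0):
    """Simplified 1-D gamma analysis: for each reference point, minimise
    the combined dose-difference / distance-to-agreement metric over all
    evaluated points.  dose_crit is a fraction of the reference maximum
    (global normalisation); a point passes when gamma <= 1."""
    positions = np.arange(ref.size) * dx_mm
    dd_norm = dose_crit * ref.max()
    gamma = np.empty(ref.size)
    for i, (xi, di) in enumerate(zip(positions, ref)):
        dist2 = ((positions - xi) / dta_mm) ** 2
        dose2 = ((evalu - di) / dd_norm) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))
    return gamma

# Hypothetical profiles: a simulation that closely matches the measurement.
x = np.linspace(-50, 50, 201)                              # 0.5 mm spacing
measured = np.exp(-x**2 / (2 * 15.0**2))                   # Gaussian-ish dose profile
simulated = 1.002 * np.exp(-(x - 0.3)**2 / (2 * 15.0**2))  # small shift and scaling

g = gamma_index_1d(measured, simulated, dx_mm=0.5)
pass_rate = (g <= 1.0).mean()
print(f"gamma pass rate: {100 * pass_rate:.1f}%")
```

Combining a dose tolerance with a spatial tolerance is what lets gamma analysis forgive small positional offsets in steep-gradient regions, where a pure dose difference would fail.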
Abstract:
Critical analysis and problem-solving skills are two graduate attributes that are important in ensuring that graduates are well equipped in working across research and practice settings within the discipline of psychology. Despite the importance of these skills, few psychology undergraduate programmes have undertaken any systematic development, implementation, and evaluation of curriculum activities to foster these graduate skills. The current study reports on the development and implementation of a tutorial programme designed to enhance the critical analysis and problem-solving skills of undergraduate psychology students. Underpinned by collaborative learning and problem-based learning, the tutorial programme was administered to 273 third year undergraduate students in psychology. Latent Growth Curve Modelling revealed that students demonstrated a significant linear increase in self-reported critical analysis and problem-solving skills across the tutorial programme. The findings suggest that the development of inquiry-based curriculum offers important opportunities for psychology undergraduates to develop critical analysis and problem-solving skills.
Abstract:
Background/aims: Remote monitoring for heart failure has been evaluated not only in a large number of randomised controlled trials, but also in many systematic reviews and meta-analyses. The aim of this meta-review was to identify, appraise and synthesise existing systematic reviews that have evaluated the effects of remote monitoring in heart failure. Methods: Using a Cochrane methodology, we electronically searched all relevant online databases and search engines, performed a forward citation search and hand-searched bibliographies. Only fully published systematic reviews of invasive and/or non-invasive remote monitoring interventions were included. Two reviewers independently extracted data. Results: Sixty-five publications from 3333 citations were identified. Seventeen fulfilled the inclusion and exclusion criteria. Quality varied, with A Measurement Tool to Assess Systematic Reviews (AMSTAR) scores ranging from 2 to 11 (mean 5.88). Seven reviews (41%) pooled results from individual studies for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Four (24%) focused specifically on telemonitoring. Four (24%) included studies investigating both non-invasive and invasive technologies. Population characteristics of the included studies were not reported consistently. Mortality and hospitalisations were the most frequently reported outcomes (12 reviews; 70%). Only five reviews (29%) reported healthcare costs and compliance. A high degree of heterogeneity was reported in many of the meta-analyses. Conclusions: These results should be considered in the context of two negative RCTs of remote monitoring for heart failure that have been published since the meta-analyses (TIM-HF and Tele-HF). However, high-quality reviews demonstrated improved mortality and quality of life, and reductions in hospitalisations and healthcare costs.
Abstract:
The application of different EMS current thresholds to a muscle activates not only the muscle but also peripheral sensory axons that send proprioceptive and pain signals to the cerebral cortex. A 32-channel time-domain fNIRS instrument was employed to map regional cortical activity under varied EMS current intensities applied to the right wrist extensor muscle. Eight healthy volunteers underwent four EMS sessions at different current thresholds based on their individual maximal tolerated intensity (MTI), i.e., 10% < 50% < 100% < over 100% MTI. Time courses of the absolute oxygenated and deoxygenated hemoglobin concentrations, primarily over the bilateral sensorimotor cortical (SMC) regions, were extracted, and cortical activation maps were determined by a general linear model using the NIRS-SPM software. The stimulation-induced wrist extension paradigm significantly increased activation of the contralateral SMC region in accordance with the EMS intensity, while the ipsilateral SMC region showed no significant changes. This could be due in part to a nociceptive response to the higher EMS current intensities, and may also result from increased sensorimotor integration in these cortical regions.
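The channel-wise general linear model analysis described above amounts to regressing each haemoglobin time course on a stimulation regressor. The following single-channel sketch uses synthetic data, an assumed block design and an assumed gamma-shaped haemodynamic kernel; it illustrates the GLM step only, not the NIRS-SPM implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-channel HbO time course: repeating stimulation blocks.
fs = 1.0                                   # sample rate in Hz (illustrative)
t = np.arange(0, 200, 1 / fs)
stim = ((t % 40) < 10).astype(float)       # boxcar: 10 s on / 30 s off

# Assumed haemodynamic response: convolve the boxcar with a gamma kernel.
tau = np.arange(0, 20, 1 / fs)
hrf = (tau / 4.0) ** 2 * np.exp(-tau / 4.0)
regressor = np.convolve(stim, hrf)[: t.size]
regressor /= regressor.max()

beta_true = 0.8                            # assumed response amplitude
hbo = beta_true * regressor + rng.normal(0.0, 0.2, t.size)

# GLM: hbo = X @ [beta, intercept] + noise, solved by ordinary least squares.
X = np.column_stack([regressor, np.ones_like(t)])
beta_hat, *_ = np.linalg.lstsq(X, hbo, rcond=None)
print(f"estimated activation beta: {beta_hat[0]:.2f}")
```

The fitted beta (and its statistic) per channel is what gets thresholded and interpolated into the cortical activation maps.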
Abstract:
Reliability analysis is crucial to reducing unexpected downtime, severe failures and the ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest, as hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rely largely on two assumptions: that the baseline failure distribution accurately describes the population concerned, and that the effects of covariates on hazards take an assumed form. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations arising from the two assumptions of statistical models. With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, due to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research thus investigates the incomplete covariates problem in reliability analysis. Typical approaches to handling incomplete covariates have been studied to investigate their performance and their effects on reliability analysis results.
Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to include the handling of incomplete covariates as an integral part. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical incomplete-covariate handling approaches. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation due to the influence of both engineering degradation and changes in environmental settings. The commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, projection of covariate states is conducted in this research. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure generates more accurate analysis results.
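A neural-network hazard model of the kind described above maps asset age and condition covariates to a non-negative hazard, from which reliability follows by integration. This sketch shows only the forward pass with untrained, randomly initialised weights; the architecture, covariate names and all numbers are illustrative, not the thesis's trained NNHM.

```python
import numpy as np

rng = np.random.default_rng(0)

def nn_hazard(t, covariates, W1, b1, W2, b2):
    """Hypothetical NNHM forward pass: inputs are asset age plus condition
    covariates; a softplus output keeps the hazard strictly positive.
    In practice the weights are trained on failure/condition history."""
    x = np.concatenate([[t], covariates])
    hidden = np.tanh(W1 @ x + b1)
    return np.log1p(np.exp(W2 @ hidden + b2))   # softplus > 0

# Untrained illustrative weights: 1 age input + 2 covariates -> 8 hidden -> 1.
W1, b1 = rng.normal(0, 0.5, (8, 3)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (1, 8)), np.zeros(1)

# Reliability follows from the hazard: R(t) = exp(-integral of h), here
# approximated by a simple Riemann sum over the time grid.
times = np.linspace(0.0, 10.0, 101)
cov = np.array([0.3, -0.1])                     # e.g. vibration level, temperature
hazard = np.array([nn_hazard(t, cov, W1, b1, W2, b2)[0] for t in times])
reliability = np.exp(-np.cumsum(hazard) * (times[1] - times[0]))

print(reliability[0], reliability[-1])
```

Because the network output, not a proportional-hazards formula, carries the covariate effects, no baseline distribution or covariate-effect form needs to be assumed, which is the point of the non-linear approach described above.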
Abstract:
Each year The Australian Centre for Philanthropy and Nonprofit Studies (ACPNS) at QUT analyses statistics on tax-deductible donations made by Australians in their individual income tax returns to Deductible Gift Recipients (DGRs). The information presented below is based on the amount and type of tax-deductible donations made by Australian taxpayers to DGRs for the period 1 July 2010 to 30 June 2011, extracted from the Australian Taxation Office's publication Taxation Statistics 2010-2011.
Abstract:
Educational and developmental psychology faces a number of current and future challenges and opportunities in Australia. In this commentary we consider the identity of educational and developmental psychology in terms of the features that distinguish it from other specialisations, and address issues related to training, specialist endorsement, supervision and rebating under the Australian government's Medicare system. The current status of training in Australia is considered through a review of the four university programs in educational and developmental psychology currently offered, and the employment destinations of their graduates. Although the need for traditional services in settings such as schools, hospitals, disability and community organisations will undoubtedly continue, the role of educational and developmental psychologists is being influenced and to some extent redefined by advances in technology, medicine, genetics, and neuroscience. We review some of these advances and conclude with recommendations for training and professional development that will enable Australian educational and developmental psychologists to meet the challenges ahead.
Abstract:
Morphology changes induced in polycrystalline silver catalysts as a result of heating in either oxygen, water or oxygen-methanol atmospheres have been investigated by environmental scanning electron microscopy (ESEM), FT-Raman spectroscopy and temperature programmed desorption (TPD). The silver catalyst of interest consisted of two distinct particle types, one of which contained a significant concentration of sub-surface hydroxy species (in addition to surface adsorbed atomic oxygen). Heating the sample to 663 K resulted in the production of 'pin-holes' in the silver structure as a consequence of near-surface explosions caused by sub-surface hydroxy recombination. Furthermore, 'pin-holes' were predominantly found in the vicinity of surface defects, such as platelets and edge structures. Reaction between methanol and oxygen also resulted in the formation of 'pin-holes' in the silver surface, which were inherently associated with the catalytic process. A reaction mechanism is suggested that involves the interaction of methanol with sub-surface oxygen species to form sub-surface hydroxy groups. The sub-surface hydroxy species subsequently erupt through the silver surface to again produce 'pin-holes'.
Abstract:
The techniques of environmental scanning electron microscopy (ESEM) and Raman microscopy have been used to respectively elucidate the morphological changes and nature of the adsorbed species on silver(I) oxide powder, during methanol oxidation conditions. Heating Ag2O in either water vapour or oxygen resulted firstly in the decomposition of silver(I) oxide to polycrystalline silver at 578 K followed by sintering of the particles at higher temperature. Raman spectroscopy revealed the presence of subsurface oxygen and hydroxyl species in addition to surface hydroxyl groups after interaction with water vapour. Similar species were identified following exposure to oxygen in an ambient atmosphere. This behaviour indicated that the polycrystalline silver formed from Ag2O decomposition was substantially more reactive than silver produced by electrochemical methods. The interaction of water at elevated temperatures subsequent to heating silver(I) oxide in oxygen resulted in a significantly enhanced concentration of subsurface hydroxyl species. The reaction of methanol with Ag2O at high temperatures was interesting in that an inhibition in silver grain growth was noted. Substantial structural modification of the silver(I) oxide material was induced by catalytic etching in a methanol/air mixture. In particular, "pin-hole" formation was observed to occur at temperatures in excess of 773 K, and it was also recorded that these "pin-holes" coalesced to form large-scale defects under typical industrial reaction conditions. Raman spectroscopy revealed that the working surface consisted mainly of subsurface oxygen and surface Ag=O species. The relative lack of subsurface hydroxyl species suggested that it was the desorption of such moieties which was the cause of the "pin-hole" formation.
Abstract:
Polycrystalline silver is used to catalytically oxidise methanol to formaldehyde. This paper reports the results of extensive investigations involving the use of environmental scanning electron microscopy (ESEM) to monitor structural changes in silver during simulated industrial reaction conditions. The interaction of oxygen, nitrogen, and water, either singly or in combination, with a silver catalyst at temperatures up to 973 K resulted in the appearance of a reconstructed silver surface. More spectacular was the effect an oxygen/methanol mixture had on the silver morphology. At a temperature of ca. 713 K, pinholes were created in the vicinity of defects as a consequence of subsurface explosions. These holes gradually increased in size and large platelet features were created. Elevation of the catalyst temperature to 843 K facilitated the wholesale, oxygen-induced restructuring of the entire silver surface. Methanol reacted with subsurface oxygen to produce subsurface hydroxyl species, which ultimately formed water in the subsurface layers of silver. The resultant hydrostatic pressure forced the silver surface to adopt a "hill and valley" conformation in order to minimise the surface free energy. Upon approaching typical industrial operating conditions, widespread explosions occurred on the catalyst and it was also apparent that the silver surface was extremely mobile under the applied conditions. The interaction of methanol alone with silver resulted in the initial formation of pinholes, primarily in the vicinity of defects, due to reaction with oxygen species incorporated in the catalyst during electrochemical synthesis. However, a dramatic reduction in the hole concentration occurred with time as all the available oxygen was consumed. A remarkable correlation between formaldehyde production and hole concentration was found.
Abstract:
We investigated critical belief-based targets for promoting the introduction of solid foods to infants at six months. First-time mothers (N = 375) completed a Theory of Planned Behaviour belief-based questionnaire and follow-up questionnaire assessing the age the infant was first introduced to solids. Normative beliefs about partner/spouse (β = 0.16) and doctor (β = 0.22), and control beliefs about commercial baby foods available for infants before six months (β = −0.20), predicted introduction of solids at six months. Intervention programs should target these critical beliefs to promote mothers’ adherence to current infant feeding guidelines to introduce solids at around six months.
Abstract:
Migraine is a complex familial condition that imparts a significant burden on society. There is evidence for a role of genetic factors in migraine, and elucidating the genetic basis of this disabling condition remains the focus of much research. In this review we discuss results of genetic studies to date, from the discovery of the role of neural ion channel gene mutations in familial hemiplegic migraine (FHM) to linkage analyses and candidate gene studies in the more common forms of migraine. The success of FHM regarding discovery of genetic defects associated with the disorder remains elusive in common migraine, and causative genes have not yet been identified. Thus we suggest additional approaches for analysing the genetic basis of this disorder. The continuing search for migraine genes may aid in a greater understanding of the mechanisms that underlie the disorder and potentially lead to significant diagnostic and therapeutic applications.