964 results for Thematic Treatment of Information
Abstract:
Kaolinite surfaces were modified by mechanochemical treatment for periods of time up to 10 h. X-ray diffraction shows a steady decrease in intensity of the d(001) spacing with mechanochemical treatment, resulting in delamination of the kaolinite and a subsequent decrease in crystallite size with grinding time. Thermogravimetric analyses show that the dehydroxylation patterns of kaolinite are significantly modified. Changes in the molecular structure of the kaolinite surface hydroxyls were followed by infrared spectroscopy. Hydroxyls were lost after 10 h of grinding, as evidenced by a decrease in intensity of the OH stretching vibrations at 3695 and 3619 cm−1 and the deformation modes at 937 and 915 cm−1. Concomitantly, an increase in the hydroxyl stretching vibrations of water is found. The water-bending mode was observed at 1650 cm−1, indicating that water coordinates to the modified kaolinite surface. Changes in the surface structure of the OSiO units were reflected in the SiO stretching and OSiO bending vibrations. The decrease in intensity of the 1056 and 1034 cm−1 bands attributed to kaolinite SiO stretching vibrations was matched by a concomitant increase in intensity of additional bands at 1113 and 520 cm−1 ascribed to the new mechanically synthesized kaolinite surface. Mechanochemical treatment of the kaolinite thus results in a new surface structure.
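The crystallite-size trend described above is conventionally estimated from X-ray peak broadening via the Scherrer equation; the sketch below is illustrative only (the peak position, widths and constants are assumed values, not figures from this study):

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, k=0.9):
    """Mean crystallite size (nm) from the Scherrer equation:
    tau = K * lambda / (beta * cos(theta)),
    with beta the peak FWHM in radians and theta the Bragg angle."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return k * wavelength_nm / (beta * math.cos(theta))

# Hypothetical kaolinite d(001) reflection (Cu K-alpha): broader peaks
# after longer grinding times translate into smaller crystallites.
size_unground = scherrer_size(fwhm_deg=0.15, two_theta_deg=12.4)
size_ground = scherrer_size(fwhm_deg=0.60, two_theta_deg=12.4)
```

As the measured FWHM grows with grinding time, the computed size falls, matching the delamination trend reported above.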
Abstract:
Thermal transformations of natural calcium oxalate dihydrate, known in mineralogy as weddellite, have been studied using a combination of Raman microscopy and infrared emission spectroscopy. The vibrational spectroscopic data were complemented by high-resolution thermogravimetric analysis combined with evolved gas mass spectrometry. TG–MS identified three mass loss steps at 114, 422 and 592 °C. In the first mass loss step only water is evolved; in the second and third steps carbon dioxide is evolved. The combination of Raman microscopy and a thermal stage clearly identifies the changes in the molecular structure with thermal treatment. Weddellite is the stable phase up to the pre-dehydration temperature of 97 °C. At this temperature the phase formed is whewellite (calcium oxalate monohydrate), and above 114 °C the phase is anhydrous calcium oxalate. Above 422 °C, calcium carbonate is formed. Infrared emission spectroscopy shows that this mineral decomposes at around 650 °C. Changes in the position and intensity of the C=O and C–C stretching vibrations in the Raman spectra indicate the temperature ranges over which these phase changes occur.
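The three mass loss steps can be cross-checked against the theoretical losses implied by the formula masses. A back-of-envelope sketch (not taken from the paper; note that in the classical decomposition scheme the second step releases CO, which is commonly oxidised to CO2 in the gas stream, consistent with CO2 being detected by MS):

```python
# Standard molar masses in g/mol
M_H2O, M_CO, M_CO2 = 18.02, 28.01, 44.01
M_weddellite = 164.13  # CaC2O4·2H2O

# Expected mass losses as fractions of the starting dihydrate mass:
loss1 = 2 * M_H2O / M_weddellite  # ~114 C: CaC2O4·2H2O -> CaC2O4 + 2 H2O
loss2 = M_CO / M_weddellite       # ~422 C: CaC2O4 -> CaCO3 + CO
loss3 = M_CO2 / M_weddellite      # ~592 C: CaCO3 -> CaO + CO2
```

This gives roughly 22%, 17% and 27% for the three steps, a useful sanity check against the experimental TG curve.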
Abstract:
The main aim of radiotherapy is to deliver a dose of radiation that is high enough to destroy the tumour cells while at the same time minimising the damage to normal healthy tissues. Clinically, this has been achieved by assigning a prescription dose to the tumour volume and a set of dose constraints on critical structures. Once an optimal treatment plan has been achieved, the dosimetry is assessed using the physical parameters of dose and volume. There has been interest in using radiobiological parameters to evaluate and predict the outcome of a treatment plan in terms of both a tumour control probability (TCP) and a normal tissue complication probability (NTCP). In this study, simple radiobiological models available in a commercial treatment planning system were used to compare three-dimensional conformal radiotherapy (3D-CRT) and intensity-modulated radiotherapy (IMRT) treatments of the prostate. Initially, both 3D-CRT and IMRT were planned at 2 Gy/fraction to a total dose of 60 Gy to the prostate. The sensitivity of the TCP and the NTCP to both conventional dose escalation and hypofractionation was investigated. The biological responses were calculated using the Källman S-model. The complication-free tumour control probability (P+) is generated from the combined NTCP and TCP response values. It has been suggested that the alpha/beta ratio for prostate carcinoma cells may be lower than that of most other tumour cell types; the effect of this on the modelled biological response for the different fractionation schedules was also investigated.
Abstract:
In 2003, the “ICT Curriculum Integration Performance Measurement Instrument” was developed from an extensive review of the contemporary international and Australian research pertaining to the definition and measurement of ICT curriculum integration in classrooms (Proctor, Watson, & Finger, 2003). The 45-item instrument that resulted was based on theories and methodologies identified by the literature review. This paper describes psychometric results from a large-scale evaluation of the instrument subsequently conducted, as recommended by Proctor, Watson, and Finger (2003). The resultant 20-item, two-factor instrument, now called “Learning with ICTs: Measuring ICT Use in the Curriculum,” is both statistically and theoretically robust. This paper should be read in association with the original paper published in Computers in the Schools (Proctor, Watson, & Finger, 2003) that described in detail the theoretical framework underpinning the development of the instrument.
Abstract:
Information graphics have become increasingly important in representing, organising and analysing information in a technological age. In classroom contexts, information graphics are typically associated with graphs, maps and number lines. However, all students need to become competent with the broad range of graphics that they will encounter in mathematical situations. This paper provides a rationale for creating a test to measure students’ knowledge of graphics. This instrument can be used in mass testing and individual (in-depth) situations. Our analysis of the utility of this instrument informs policy and practice. The results provide an appreciation of the relative difficulty of different information graphics; and provide the capacity to benchmark information about students’ knowledge of graphics. The implications for practice include the need to support the development of students’ knowledge of graphics, the existence of gender differences, the role of cross-curriculum applications in learning about graphics, and the need to explicate the links among graphics.
Abstract:
Objective: To examine the reliability of work-related activity coding for injury-related hospitalisations in Australia. Method: A random sample of 4373 injury-related hospital separations from 1 July 2002 to 30 June 2004 was obtained from a stratified random sample of 50 hospitals across four states in Australia. From this sample, cases were identified as work-related if they contained an ICD-10-AM work-related activity code (U73) allocated by either: (i) the original coder; (ii) an independent auditor, blinded to the original code; or (iii) a research assistant, blinded to both the original and auditor codes, who reviewed narrative text extracted from the medical record. The concordance of activity coding and the number of cases identified as work-related using each method were compared. Results: Of the 4373 cases sampled, 318 were identified as work-related using any of the three methods. The original coder identified 217 and the auditor identified 266 work-related cases (68.2% and 83.6% of the total cases identified, respectively). Around 10% of cases were identified only through the text description review. The original coder and auditor agreed on the assignment of work-relatedness for 68.9% of cases. Conclusions and Implications: The current best estimates of the frequency of hospital admissions for occupational injury underestimate the burden by around 32%. This is a substantial underestimate that has major implications for public policy, and highlights the need for further work on improving the quality and completeness of routine, administrative data sources for a more complete identification of work-related injuries.
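The "around 32%" underestimate quoted in the conclusions follows directly from the counts reported above:

```python
total_work_related = 318  # cases identified by any of the three methods
original_coder = 217      # cases flagged by the original coder's U73 code

capture_rate = original_coder / total_work_related  # ~68.2%
underestimate = 1 - capture_rate                    # ~31.8%, i.e. "around 32%"
```

This treats the union of all three methods as the reference standard; cases missed by all three methods would push the true underestimate higher still.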
Abstract:
The generic IS-success constructs first identified by DeLone and McLean (1992) continue to be widely employed in research. Yet, recent work by Petter et al. (2007) has cast doubt on the validity of many mainstream constructs employed in IS research over the past three decades, critiquing the almost universal conceptualization and validation of these constructs as reflective when in many studies the measures appear to have been implicitly operationalized as formative. Cited examples of proper specification of the DeLone and McLean constructs are few, particularly in light of their extensive employment in IS research. This paper introduces a four-stage formative construct development framework: Conceive > Operationalize > Respond > Validate (CORV). Employing the CORV framework in an archival analysis of research published in top outlets from 1985 to 2007, the paper explores the extent of possible problems with past IS research due to potential misspecification of the four application-related success dimensions: Individual-Impact, Organizational-Impact, System-Quality and Information-Quality. Results suggest major concerns where there is a mismatch of the Respond and Validate stages. A general dearth of attention to the Operationalize and Respond stages in methodological writings is also observed.
Abstract:
Information uncertainty, which is inherent in many real-world applications, brings additional complexity to the visualisation problem. Despite the increasing number of research papers found in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualisation of information uncertainty and their dimensions of complexity; (2) to review and assess current progress; and (3) to discuss remaining research challenges. We focus on four areas: (i) information uncertainty modelling; (ii) visualisation techniques; (iii) management of information uncertainty modelling, propagation and visualisation; and (iv) the uptake of uncertainty visualisation in application domains.