168 results for Software Acquisition
Abstract:
The introduction of functional data into the radiotherapy treatment planning process is currently the focus of significant commercial, technical, scientific and clinical development. The potential of such data from positron emission tomography (PET) was recognized at an early stage and was integrated into the radiotherapy treatment planning process through the use of image fusion software. The combination of PET and CT in a single system (PET/CT) to form an inherently fused anatomical and functional dataset has provided an imaging modality which could be used as the prime tool in the delineation of tumour volumes and the preparation of patient treatment plans, especially when integrated with virtual simulation. PET imaging, typically using 18F-fluorodeoxyglucose (18F-FDG), can provide data on metabolically active tumour volumes. These functional data have the potential to modify treatment volumes and to guide treatment delivery to cells with particular metabolic characteristics. This paper reviews the current status of the integration of PET and PET/CT data into the radiotherapy treatment process. Consideration is given to the requirements of PET/CT data acquisition with reference to patient positioning aids and the limitations imposed by the PET/CT system. It also reviews the approaches being taken to the definition of functional/tumour volumes and the mechanisms available to measure and include physiological motion in the imaging process. The use of PET data must be based upon a clear understanding of the interpretation and limitations of the functional signal. Protocols for the implementation of this development remain to be defined, and outcomes data based upon clinical trials are still awaited. © 2006 The British Institute of Radiology.
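The abstract refers to delineating metabolically active tumour volumes from PET data but does not describe a specific algorithm. One common approach in the wider literature (assumed here purely for illustration, not taken from this paper) is fixed-percentage thresholding of the maximum standardised uptake value (SUVmax). A minimal sketch:

```python
def delineate_tumour_volume(suv_values, threshold_fraction=0.40):
    """Return indices of voxels whose standardised uptake value (SUV)
    exceeds a fixed fraction of SUVmax -- a simple, widely used (though
    debated) way to define a metabolic tumour volume from PET data."""
    suv_max = max(suv_values)
    cutoff = threshold_fraction * suv_max
    return [i for i, v in enumerate(suv_values) if v >= cutoff]

# Hypothetical flattened SUV map: background ~1.0 with one "hot" lesion.
suv = [1.0] * 20 + [10.0] * 4 + [1.0] * 20
lesion_voxels = delineate_tumour_volume(suv)
```

The 40% threshold is only one convention; as the abstract notes, the functional signal must be interpreted with care, and the choice of threshold materially changes the delineated volume.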
Abstract:
The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones highlighted only, or with no help. The group using the homophone tool significantly outperformed the other two groups on assisted proofreading and outperformed the others on unassisted spelling, although not significantly. Remedial (unassisted) improvements in automaticity of word recognition, homophone proofreading, and basic reading were found over all groups. Results elucidate the differential contributions of each function of the homophone tool and suggest that with the proper training, assistive software can help not only students with diagnosed disabilities but also those with generally weak reading skills.
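The homophone tool described above flags words that have sound-alike alternatives so the student can check whether the intended word was used. A minimal sketch of that highlighting step, using a small hypothetical homophone dictionary (a real tool would ship a far larger list plus meanings and speech output for each candidate):

```python
# Hypothetical homophone dictionary -- illustrative entries only.
HOMOPHONES = {
    "their": ["there", "they're"],
    "there": ["their", "they're"],
    "to": ["too", "two"],
    "hear": ["here"],
    "here": ["hear"],
}

def highlight_homophones(text):
    """Return the words in `text` that have homophone alternatives,
    i.e. the words the tool would highlight for the student to check."""
    words = text.lower().split()
    stripped = (w.strip(".,!?") for w in words)
    return [w for w in stripped if w in HOMOPHONES]

flagged = highlight_homophones("Put it over their, next to the door.")
```

Note that the sketch only performs the highlighting condition from the study; the full tool also presented the alternatives for each flagged word, which is what distinguished it from highlighting alone.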
Abstract:
The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away, leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate, while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory ‘tail’ DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current acquisition methods are expected to select comets both objectively and at random. In this paper we examine the ‘randomness’ of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. To achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be applied either manually or in automated form. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than that of the traditional approach. The analysis of a single user with repetition experiment showed greater individual variances while not being detrimental to overall averages. This would suggest that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
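Systematic random sampling, as borrowed from stereology, picks a random start within the first sampling interval and then visits positions at a fixed step, giving every comet an equal chance of selection while keeping the traverse reproducible. A minimal sketch, with the field count and sample size as assumed parameters (the paper's actual slide traverse uses microscope verniers rather than an abstract field index):

```python
import random

def systematic_random_sample(n_fields, n_samples, rng=random):
    """Select `n_samples` microscope fields out of `n_fields` along a
    traverse: pick a random offset within the first interval, then step
    through at a fixed interval (stereological systematic random
    sampling). Every field has an equal inclusion probability, yet the
    traverse is reproducible given the starting offset."""
    interval = n_fields // n_samples
    start = rng.randrange(interval)  # random start in [0, interval)
    return [start + i * interval for i in range(n_samples)]

# e.g. score 10 of 100 fields on a slide
fields = systematic_random_sample(100, 10)
```

Contrast this with the traditional approach the paper critiques, where the operator scans the slide and chooses comets ad hoc, which admits unconscious selection bias.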
Abstract:
Purpose – The Six Sigma approach to business improvement has emerged as a phenomenon in both the practitioner and academic literature, with the potential to increase competitiveness and contribute to improved performance. However, there is a lack of critical reviews covering both theory and practice. Therefore, the purpose of this paper is to critically review the literature of Six Sigma using a consistent theoretical perspective, namely absorptive capacity.
Design/methodology/approach – The literature from peer-reviewed journals has been critically reviewed using the absorptive capacity framework and dimensions of acquisition, assimilation, transformation, and exploitation.
Findings – There is evidence of emerging theoretical underpinning in relation to Six Sigma, borrowing from an eclectic range of organisational theories. However, this theoretical development lags behind practice in the area. The development of Six Sigma in practice is expanding mainly through more rigorous studies and applications in service-based environments (for-profit and not-for-profit). The absorptive capacity framework is found to be a useful overarching framework within which to situate existing theoretical and practice studies.
Research limitations/implications – Agendas for further research from the critical review, in relation to both theory and practice, have been established in relation to each dimension of the absorptive capacity framework.
Practical implications – The paper shows that Six Sigma is both a strategic and operational issue and that focussing solely on define, measure, analyse, improve, control (DMAIC)-based projects can limit the strategic effectiveness of the approach within organisations.
Originality/value – Despite the increasing volume of Six Sigma literature and organisational applications, there is a paucity of critical reviews which cover both theory and practice and which suggest research agendas derived from such reviews.
Abstract:
Changes to software requirements occur during initial development and subsequent to delivery, posing a risk to cost and quality while at the same time providing an opportunity to add value. Provision of a generic change source taxonomy will support requirements change risk visibility, and also facilitate richer recording of both pre- and post-delivery change data. In this paper we present a collaborative study to investigate and classify sources of requirements change, drawing comparison between those pertaining to software development and maintenance. We begin by combining evolution, maintenance and software lifecycle research to derive a definition of software maintenance, which provides the foundation for empirical context and comparison. Previously published change ‘causes’ pertaining to development are elicited from the literature, consolidated using expert knowledge and classified using card sorting. A second study incorporating causes of requirements change during software maintenance results in a taxonomy which accounts for the entire evolutionary progress of applications software. We conclude that the distinction between the terms maintenance and development is imprecise, and that changes to requirements in both scenarios arise due to a combination of factors contributing to requirements uncertainty and events that trigger change. The change trigger taxonomy constructs were initially validated using a small set of requirements change data, and deemed sufficient and practical as a means to collect common requirements change statistics across multiple projects.
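The paper's change-trigger taxonomy is intended as a frame for collecting comparable requirements-change statistics across projects and across the development/maintenance divide. A minimal sketch of such a recording scheme; the trigger categories shown here are illustrative placeholders, not the paper's actual constructs:

```python
from collections import Counter
from dataclasses import dataclass

# Illustrative categories only -- the paper derives its own taxonomy
# of change triggers and requirements-uncertainty factors.
TRIGGERS = {"market shift", "defect report", "stakeholder request"}

@dataclass(frozen=True)
class RequirementChange:
    project: str
    trigger: str   # the event that prompted the change
    phase: str     # "development" or "maintenance"

def change_stats(changes):
    """Tally changes by (phase, trigger), pooled across projects, so the
    same statistics can be compared between development and
    maintenance -- the cross-project collection the paper argues a
    shared taxonomy enables."""
    return Counter((c.phase, c.trigger) for c in changes)

log = [
    RequirementChange("A", "defect report", "maintenance"),
    RequirementChange("A", "stakeholder request", "development"),
    RequirementChange("B", "defect report", "maintenance"),
]
stats = change_stats(log)
```

Because the same trigger vocabulary is applied in both phases, the tallies make the paper's point concrete: the development/maintenance boundary dissolves into a single evolutionary record of change causes.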