983 results for Investigative techniques
Abstract:
Background: In a selected group of patients, accelerated partial breast irradiation (APBI) may be applied after breast-conserving surgery to reduce the amount of irradiated healthy tissue. The role of volumetric modulated arc therapy (VMAT) and voluntary moderately deep inspiration breath-hold (vmDIBH) techniques in further reducing irradiated healthy tissue, especially the heart, is investigated.
Material and methods: For 37 partial breast planning target volumes (PTVs), three-dimensional conformal radiotherapy (3D-CRT) plans (3–5 coplanar or non-coplanar 6 and/or 10 MV beams) and VMAT plans (two partial 6 MV arcs) were made on CTs acquired in free-breathing (FB) and/or in vmDIBH. Dose-volume parameters for the PTV, heart, lungs, and breasts were compared.
Results: Better dose conformity was achieved with VMAT than with 3D-CRT (conformity index 1.24 ± 0.09 vs. 1.49 ± 0.20). The non-PTV ipsilateral breast volume receiving 50% of the prescribed dose was on average 28% smaller in VMAT plans than in 3D-CRT plans. Mean heart dose (MHD) decreased from 2.0 (0.1–5.1) Gy in 3D-CRT(FB) to 0.6 (0.1–1.6) Gy in VMAT(vmDIBH). VMAT is beneficial for MHD reduction when the MHD with 3D-CRT exceeds 0.5 Gy. The cardiac dose reduction achieved by VMAT increases with increasing initial MHD, and adding vmDIBH reduces the cardiac dose further. The mean dose to the ipsilateral lung decreased from 3.7 (0.7–8.7) Gy with 3D-CRT(FB) to 1.8 (0.5–4.0) Gy with VMAT(vmDIBH). VMAT resulted in a slight increase in the contralateral breast dose (Dmean always remaining ≤ 1.9 Gy).
Conclusions: For APBI patients, VMAT improves PTV dose conformity and delivers lower doses to the ipsilateral breast and lung compared to 3D-CRT. This comes at the cost of a slight but acceptable increase in the contralateral breast dose. VMAT reduces cardiac dose if the MHD exceeds 0.5 Gy for 3D-CRT. Adding vmDIBH results in a further reduction of heart and ipsilateral lung dose.
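The conformity index comparison reported above can be illustrated with the common ratio definition (prescription isodose volume divided by PTV volume); the abstract does not state the exact definition used, and the volumes below are hypothetical:

```python
def conformity_index(v_prescription_isodose_cc, v_ptv_cc):
    """Ratio of the volume enclosed by the prescription isodose to the PTV
    volume. Values closer to 1.0 indicate a more conformal plan (this simple
    form assumes the PTV is fully covered by the prescription dose)."""
    return v_prescription_isodose_cc / v_ptv_cc

# Hypothetical volumes chosen to reproduce the mean indices quoted above:
ci_vmat = conformity_index(124.0, 100.0)   # ~1.24 (VMAT mean)
ci_3dcrt = conformity_index(149.0, 100.0)  # ~1.49 (3D-CRT mean)
```

A lower index means less healthy tissue is swept into the high-dose region, which is the mechanism behind the ipsilateral breast sparing the abstract reports.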
Abstract:
Automatically determining and assigning shared, meaningful text labels to data extracted from an e-Commerce web page is a challenging problem. An e-Commerce web page can display a list of data records, each of which can contain a combination of data items (e.g. product name and price) and explicit labels that describe some of these data items. Recent advances in extraction techniques have made it much easier to precisely extract individual data items and labels from a web page; however, two problems remain open: (1) assigning an explicit label to a data item, and (2) determining labels for the remaining data items. Furthermore, improvements in the availability and coverage of vocabularies, especially in the context of e-Commerce web sites, mean that we now have access to a bank of relevant, meaningful and shared labels which can be assigned to extracted data items. However, a technique is needed that takes a set of extracted data items as input and automatically assigns to them the most relevant and meaningful labels from a shared vocabulary. We observe that the Information Extraction (IE) community has developed a great number of techniques that solve problems similar to our own. In this work-in-progress paper we propose to theoretically and experimentally evaluate different IE techniques to ascertain which is most suitable for this problem.
Abstract:
Critical decisions are made by decision-makers throughout the life-cycle of large-scale projects. These decisions are crucial as they have a direct impact on the outcome and success of projects. To aid decision-makers in the decision-making process, we present an evidential reasoning framework. This approach utilizes Dezert-Smarandache theory to fuse heterogeneous evidence sources that suffer from uncertainty, imprecision and conflict, providing beliefs over decision options. To analyze the impact of source reliability and priority on the decision-making process, a reliability discounting technique and a priority discounting technique are applied. A maximal consistent subset is constructed to aid in defining where discounting should be applied. Application of the evidential reasoning framework is illustrated using a case study from the aerospace domain.
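A minimal sketch of reliability discounting in its classical (Shafer) form, where each focal mass is scaled by a source reliability factor and the remainder is transferred to total ignorance; the paper's DSmT-based fusion and priority discounting are more involved, and the decision options and masses below are hypothetical:

```python
def discount(mass, alpha, frame):
    """Shafer reliability discounting: multiply every focal mass by the
    reliability alpha in [0, 1] and move the discounted weight (1 - alpha)
    onto the whole frame of discernment (total ignorance)."""
    discounted = {focal: alpha * m for focal, m in mass.items()}
    discounted[frame] = discounted.get(frame, 0.0) + (1.0 - alpha)
    return discounted

# Hypothetical source over decision options {go, no_go}, 90% reliable:
frame = frozenset({"go", "no_go"})
m1 = {frozenset({"go"}): 0.8, frame: 0.2}
m1_discounted = discount(m1, alpha=0.9, frame=frame)
# mass on {go} drops from 0.8 to 0.72; ignorance rises from 0.2 to 0.28
```

A fully reliable source (alpha = 1.0) is left unchanged, while alpha = 0.0 collapses the source to complete ignorance, which is why unreliable sources contribute less to the fused beliefs.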
Abstract:
Background: Medical Research Council (MRC) guidelines recommend applying theory within complex interventions to explain how behaviour change occurs. Guidelines endorse self-management of chronic low back pain (CLBP) and osteoarthritis (OA), but evidence for its effectiveness is weak. Objective: This literature review aimed to determine the use of behaviour change theory and techniques within randomised controlled trials of group-based self-management programmes for chronic musculoskeletal pain, specifically CLBP and OA. Methods: A two-phase search strategy of electronic databases was used to identify systematic reviews and studies relevant to this area. Articles were coded for their use of behaviour change theory, and the number of behaviour change techniques (BCTs) was identified using the 93-item BCT Taxonomy (v1). Results: 25 articles from 22 studies met the inclusion criteria, of which only three reported having based their intervention on theory, all using Social Cognitive Theory. A total of 33 BCTs were coded across all articles, the most commonly identified techniques being 'instruction on how to perform the behaviour', 'demonstration of the behaviour', 'behavioural practice', 'credible source', 'graded tasks' and 'body changes'. Conclusion: The results demonstrate that theoretically driven research within group-based self-management programmes for chronic musculoskeletal pain is lacking, or is poorly reported. Future research that follows recommended guidelines regarding the use of theory in study design and reporting is warranted.
Abstract:
Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions, were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This suggests that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate those effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
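The wet-day frequency check described above can be sketched as a simple validation statistic; the 1.0 mm threshold is a common but not universal choice, and both daily series below are hypothetical:

```python
def wet_day_frequency(precip_mm, threshold=1.0):
    """Fraction of days with precipitation at or above a wet-day threshold
    (1.0 mm/day is a widely used, but not universal, convention)."""
    wet_days = sum(1 for p in precip_mm if p >= threshold)
    return wet_days / len(precip_mm)

# Hypothetical observed vs. downscaled daily precipitation (mm):
observed  = [0.0, 2.3, 0.0, 5.1, 0.4, 12.0, 0.0, 0.0, 3.3, 0.0]
simulated = [0.0, 1.8, 0.2, 6.0, 0.0, 9.5, 0.0, 1.4, 2.7, 0.0]

# Positive bias: the downscaled series is too wet; negative: too dry.
bias = wet_day_frequency(simulated) - wet_day_frequency(observed)
```

Computing this statistic (alongside spell-length distributions and extreme quantiles) for a held-out validation period is the kind of thorough check the abstract recommends before feeding downscaled precipitation into an impact model.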
Combining multi-band and frequency-filtering techniques for speech recognition in noisy environments
Abstract:
While current speech recognisers give acceptable performance in carefully controlled environments, their performance degrades rapidly when they are applied in more realistic situations. Generally, environmental noise may be classified into two classes: wide-band noise and narrow-band noise. While the multi-band model has been shown to be capable of dealing with speech corrupted by narrow-band noise, it is ineffective against wide-band noise. In this paper, we suggest combining the frequency-filtering technique with the probabilistic union model in the multi-band approach. The new system has been tested on the TIDIGITS database corrupted by white noise, noise collected from a railway station, and narrow-band noise, respectively. The results show that this approach is capable of dealing with noise of narrow-band or wide-band characteristics, assuming no knowledge about the noisy environment.
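A minimal sketch of one common form of frequency filtering: a first-difference (slope) filter applied across the log filter-bank channels of a single frame, which decorrelates neighbouring channels while keeping the features in the frequency domain. The exact filter used in the paper is not specified here, and the channel values in the usage comment are hypothetical:

```python
def frequency_filter(log_fbank_frame):
    """Apply the slope filter h = [1, 0, -1] across frequency channels of one
    frame of log filter-bank energies: y[k] = x[k+1] - x[k-1], with
    zero-padding at the band edges."""
    x = log_fbank_frame
    n = len(x)
    y = []
    for k in range(n):
        upper = x[k + 1] if k + 1 < n else 0.0  # channel above (0 past the edge)
        lower = x[k - 1] if k - 1 >= 0 else 0.0  # channel below (0 past the edge)
        y.append(upper - lower)
    return y

# Hypothetical 4-channel frame of log energies:
filtered = frequency_filter([1.0, 2.0, 3.0, 4.0])
```

Because the filtered features stay indexed by frequency band, they combine naturally with a multi-band model, where each sub-band stream is scored separately and the streams are merged (here, via the probabilistic union model).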
Abstract:
The increasing popularity of the social networking service Twitter has made it more involved in day-to-day communications, strengthening social relationships and information dissemination. Conversations on Twitter are now being explored as indicators within early warning systems to alert of imminent natural disasters such as earthquakes, and to aid prompt emergency responses to crime. Producers are privileged to have limitless access to market perception from consumer comments on social media and microblogs. Targeted advertising can be made more effective based on user profile information such as demography, interests and location. While these applications have proven beneficial, the ability to effectively infer the location of Twitter users has even greater value. However, accurately identifying where a message originated, or the author's location, remains a challenge, and this essentially drives research in the area. In this paper, we survey a range of techniques applied to infer the location of Twitter users, from inception to the state of the art. We find significant improvements over time in granularity levels and accuracy, with results driven by refinements to algorithms and the inclusion of more spatial features.
Abstract:
The Copney Stone Circle Complex, Co. Tyrone, N. Ireland, is an important Bronze Age site forming part of the Mid-Ulster Stone Circle Complex. The Environment Service: Historic Monuments and Buildings (ESHMB) initiated a program of bog-clearance in August 1994 to excavate the stone circles. This work was completed by October 1994 and the excavated site was surveyed in August 1995. Almost immediately, the rate at which the stones forming the circles were breaking down was noted, and a program of study was initiated to make recommendations on the conservation of this important site. Digital photogrammetric techniques were applied to aerial images of the stone circles, and digital terrain models were created from the images at a range of scales. These provide base data sets for comparison with identical surveys to be completed in successive years, and will allow the rate of deterioration of the circles, and the areas most affected, to be determined. In addition, a 2D analysis provides accurate absolute 2D dimensions of the stones for rapid desktop computer analysis by researchers remote from the digital photogrammetric workstation used in the survey.
The products of this work are readily incorporated into web sites, educational packages and databases. The technique provides a rapid and user-friendly method of presenting a large body of information and measurements, and a reliable method of storing the information from Copney should it become necessary to re-cover the site.
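The year-on-year comparison of digital terrain models described above amounts to differencing co-registered elevation grids; a minimal sketch, assuming the successive surveys are aligned on the same grid (the elevation values below are hypothetical):

```python
def dtm_difference(dtm_earlier, dtm_later):
    """Cell-by-cell elevation change between two co-registered digital terrain
    models, given as lists of rows. Negative values indicate surface loss,
    i.e. candidate areas of stone deterioration."""
    return [[later - earlier for earlier, later in zip(row_e, row_l)]
            for row_e, row_l in zip(dtm_earlier, dtm_later)]

# Hypothetical 2x2 elevation grids (metres) from successive surveys:
survey_year1 = [[100.2, 100.4],
                [100.1, 100.3]]
survey_year2 = [[100.2, 100.3],
                [100.0, 100.3]]

change = dtm_difference(survey_year1, survey_year2)
# cells with negative change flag where, and by how much, material was lost
```

Repeating this differencing for each successive survey yields both the rate of deterioration and a map of the areas most affected, which is exactly what the monitoring program is set up to determine.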