996 results for adaptive sampling
Abstract:
Introduction: Clinical investigation has revealed a subgroup of head and neck cancers that are virally mediated. The relationship between nasopharyngeal cancer and Epstein-Barr virus (EBV) has long been established and, more recently, the association between oropharyngeal cancer and human papillomavirus (HPV) has been revealed1,2. These cancers often present with nodal involvement and generally respond well to radiation treatment, evidenced by tumour regression1. This results in the need for treatment plan adaptation, or re-planning, in a subset of patients. Adaptive techniques allow the target region of the radiotherapy treatment plan to be altered in accordance with treatment-induced changes, to ensure that under- or over-dosing does not occur3. They also assist in limiting potential overdosing of surrounding critical normal tissues4. We sought to identify a high-risk group based on nodal size to be evaluated in a future prospective adaptive radiotherapy trial. Method: Between 2005 and 2010, 121 patients with virally mediated, node-positive nasopharyngeal (EBV-positive) or oropharyngeal (HPV-positive) cancers receiving curative-intent radiotherapy were reviewed. Patients were analysed based on the maximum size of the dominant node at diagnosis, with a view to grouping them into risk categories to determine the need for re-planning. The frequency and timing of the re-planning scans were also evaluated. Results: Sixteen nasopharyngeal and 105 oropharyngeal tumours were reviewed. Twenty-five (21%) patients underwent a re-planning CT at a median of 22 (range, 0-29) fractions, with 1 patient requiring re-planning prior to the commencement of treatment. Based on the analysis, patients were subsequently placed into risk categories: ≤ 35 mm (Group 1), 36-45 mm (Group 2), ≥ 46 mm (Group 3). Re-planning CTs were performed in Group 1: 8/68 (11.8%), Group 2: 4/28 (14.3%), and Group 3: 13/25 (52%).
Conclusion: In this series, patients with virally mediated head and neck cancer and nodal size ≥ 46 mm appear to be a high-risk group for re-planning during a course of curative radiotherapy. This finding will now be tested in a prospective adaptive radiotherapy study. 'Real World' Implications: This research identifies predictive factors for those patients with virally mediated head and neck cancer who will benefit most from treatment adaptation. This will assist in minimising the side effects experienced by these patients, thereby improving their quality of life after treatment.
Abstract:
Purpose: Virally mediated head and neck cancers (VMHNC) often present with nodal involvement and are generally considered radioresponsive, resulting in the need for a re-planning CT during radiotherapy (RT) in a subset of patients. We sought to identify a high-risk group based on nodal size to be evaluated in a future prospective adaptive RT trial. Methodology: Between 2005 and 2010, 121 patients with virally mediated, node-positive nasopharyngeal (EBV-positive) or oropharyngeal (HPV-positive) cancers receiving curative-intent RT were reviewed. Patients were analysed based on the maximum size of the dominant node, with a view to grouping them into risk categories for the need for re-planning. The frequency and timing of the re-planning scans were also evaluated. Results: Sixteen nasopharyngeal and 105 oropharyngeal tumours were reviewed. Twenty-five (21%) patients underwent a re-planning CT at a median of 22 (range, 0-29) fractions, with 1 patient requiring re-planning prior to the commencement of treatment. Based on the analysis, patients were subsequently placed into 3 groups: ≤ 35 mm (Group 1), 36-45 mm (Group 2), ≥ 46 mm (Group 3). Re-planning CTs were performed in Group 1: 8/68 (11.8%), Group 2: 4/28 (14.3%), and Group 3: 13/25 (52%). The sample size did not allow statistical analysis to detect a significant difference, or to exclude the absence of one, between the 3 groups. Conclusion: In this series, patients with VMHNC and nodal size ≥ 46 mm appear to be a high-risk group for re-planning during a course of definitive radiotherapy. This finding will now be tested in a prospective adaptive RT study.
Abstract:
Purpose: Virally mediated head and neck cancers (VMHNC) often present with nodal involvement and are generally considered radioresponsive, resulting in the need for plan adaptation during radiotherapy in a subset of patients. We sought to identify a high-risk group based on pre-treatment nodal size to be evaluated in a future prospective adaptive radiotherapy trial. Methodology: Between 2005 and 2010, 121 patients with virally mediated, node-positive nasopharyngeal or oropharyngeal cancers receiving definitive radiotherapy were reviewed. Patients were analysed based on the maximum size of the dominant node at diagnosis, with a view to grouping them into risk categories for the need for re-planning. The frequency and timing of the re-planning scans were also evaluated. Results: Sixteen nasopharyngeal and 105 oropharyngeal tumours were reviewed. Twenty-five (21%) patients underwent a re-planning CT at a median of 22 (range, 0-29) fractions, with 1 patient requiring re-planning prior to the commencement of treatment. Based on the analysis, patients were subsequently placed into 3 groups defined by pre-treatment nodal size: ≤ 35 mm (Group 1), 36-45 mm (Group 2), ≥ 46 mm (Group 3). Applying these groups to the patient cohort, re-planning CTs were performed in Group 1: 8/68 (11.8%), Group 2: 4/28 (14.3%), and Group 3: 13/25 (52%). Conclusion: In this series, patients with VMHNC and nodal size ≥ 46 mm appear to be a high-risk group for plan adaptation during a course of definitive radiotherapy. This finding will now be tested in a prospective adaptive radiotherapy study.
Abstract:
Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation through repeated sampling of data from the model and comparison of observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments. That is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way.
If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design to accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively 'dies'. There is no cure for MND, and death often results from insufficient muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a few years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression due to the body's natural attempts at compensation.
Part III of this thesis builds upon a previous Bayesian technique, which develops a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by marginalising over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
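The likelihood-free idea at the heart of ABC, as described in Part I, can be sketched in a few lines. The following is a minimal rejection-ABC illustration, not any algorithm from the thesis: the model, prior, summary statistic and tolerance are all invented for the example (a normal model with known unit variance, sample mean as summary).

```python
import random

def abc_rejection(observed_summary, simulate, summarize, prior_sample,
                  n_samples=200, tolerance=0.3):
    """Rejection ABC: keep prior draws whose simulated summary statistic
    lands within `tolerance` of the observed summary."""
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_sample()              # draw a candidate from the prior
        s = summarize(simulate(theta))      # simulate data and summarise it
        if abs(s - observed_summary) <= tolerance:
            accepted.append(theta)          # approximate posterior draw
    return accepted

# Toy model (illustrative only): normal data with unknown mean, known sd = 1,
# using the sample mean as the summary statistic.
random.seed(1)

def simulate(theta):
    return [random.gauss(theta, 1.0) for _ in range(50)]

def summarize(data):
    return sum(data) / len(data)

obs = summarize(simulate(2.0))              # "observed" data, true mean 2
posterior = abc_rejection(obs, simulate, summarize,
                          prior_sample=lambda: random.uniform(-5, 5))
post_mean = sum(posterior) / len(posterior)
```

The inefficiency of this plain rejection scheme (most prior draws are wasted) is precisely what the SMC-based ABC algorithms of Part I aim to reduce.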
Abstract:
Ocean gliders constitute an important advance in the highly demanding ocean monitoring scenario. Their efficiency, endurance and increasing robustness make these vehicles an ideal observing platform for many long-term oceanographic applications. However, they have also proved useful in the opportunistic short-term characterization of dynamic structures. Among these, mesoscale eddies are of particular interest due to the relevance they have in many oceanographic processes.
Abstract:
Recent work on the numerical solution of stochastic differential equations (SDEs) has focused on the development of numerical methods with good stability and order properties. These implementations have used fixed stepsizes, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. In the deterministic case it has been necessary to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be done in the stochastic case. In this paper, proportional-integral (PI) control is applied to a variable-stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable-stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers from a digital filter theory point of view, via PI with derivative (PID) control, is also implemented. The implementations show the improvement in efficiency that can be attained when using these control-theory approaches compared with the regular stepsize-change strategy.
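To illustrate the PI stepsize-control principle the abstract refers to, here is a deterministic analogue only: an embedded Euler(1)/Heun(2) pair on an ODE rather than the paper's stochastic Runge-Kutta pairs, with illustrative controller gains (`k_i`, `k_p`) and clamps that are not taken from the paper.

```python
import math

def pi_stepsize(h, err, err_prev, tol, k_i=0.3, k_p=0.4,
                fac_min=0.2, fac_max=5.0):
    """PI stepsize controller: the (tol/err) term plays the integral
    role, the (err_prev/err) ratio the proportional one."""
    factor = (tol / err) ** k_i * (err_prev / err) ** k_p
    return h * min(fac_max, max(fac_min, factor))

def integrate(f, t0, y0, t_end, tol=1e-6, h=0.1):
    """Embedded Euler(1)/Heun(2) pair with PI-controlled stepsize."""
    t, y, err_prev = t0, y0, tol
    accepted = 0
    while t < t_end:
        h = min(h, t_end - t)
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y_low = y + h * k1                  # Euler, order 1
        y_high = y + 0.5 * h * (k1 + k2)    # Heun, order 2
        err = max(abs(y_high - y_low), 1e-15)
        if err <= tol:                      # accept the step
            t, y = t + h, y_high
            err_prev = err
            accepted += 1
        h = pi_stepsize(h, err, err_prev, tol)  # retune h either way
    return y, accepted

# dy/dt = -y with y(0) = 1 has the exact solution y(1) = exp(-1).
y1, nsteps = integrate(lambda t, y: -y, 0.0, 1.0, 1.0)
```

The same accept/reject loop carries over to the stochastic setting; the extra difficulty there lies in constructing embedded pairs whose error estimate remains meaningful for the driving Wiener increments.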
Abstract:
This work investigates the accuracy and efficiency tradeoffs between centralized and collective (distributed) algorithms for (i) sampling and (ii) n-way data analysis techniques in multidimensional stream data, such as Internet chatroom communications. Its contributions are threefold. First, we use the Kolmogorov-Smirnov goodness-of-fit test to show that the statistical differences between real data obtained by collective sampling in the time dimension from multiple servers and data obtained from a single server are insignificant. Second, we show using the real data that collective analysis of 3-way data arrays (users x keywords x time), known as high-order tensors, is more efficient than centralized algorithms with respect to both space and computational cost. Furthermore, we show that this gain is obtained without loss of accuracy. Third, we examine the sensitivity of collective construction and analysis of high-order data tensors to the choice of server selection and sampling window size. We construct 4-way tensors (users x keywords x time x servers) and analyze them to show the impact of server and window-size selections on the results.
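The two-sample Kolmogorov-Smirnov statistic used in the first contribution reduces to the largest gap between two empirical CDFs. A minimal stdlib sketch of that statistic (the abstract's actual significance testing and data are not reproduced here):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest absolute
    gap between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        # fraction of observations <= x
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in sorted(set(a + b)))

# Identical samples agree everywhere; disjoint samples differ maximally.
d_same = ks_statistic([1, 2, 3, 4], [1, 2, 3, 4])
d_disjoint = ks_statistic([1, 2, 3], [10, 11, 12])
```

A small statistic indicates that the single-server and multi-server samples could plausibly come from the same distribution, which is the sense in which the paper calls their differences insignificant.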
Abstract:
In July 2010, China announced the "National Plan for Medium and Long-term Education Reform and Development (2010-2020)" (PRC 2010). The Plan calls for an education system that:
• promotes an integrated development which harnesses everyone's talent;
• combines learning and thinking, and unifies knowledge and practice;
• allows teachers to teach according to individuals' needs; and
• reforms education quality and personnel evaluation systems, focusing on performance including character, knowledge, ability and other factors.
This paper discusses the design and implementation of a Professional Learning Program (PLP) undertaken by 432 primary, middle and high school teachers in China. The aim of this initiative was to develop adaptive expertise in using technology that facilitated innovative science and technology teaching and learning, as envisaged by the Chinese Ministry of Education's (2010-2020) education reforms. Key principles derived from the literature about professional learning and scaffolding of learning informed the design of the PLP. The analysis of data revealed that the participants had made substantial progress towards the development of adaptive expertise. This was manifested not only by advances in the participants' repertoires of Subject Matter Knowledge and Pedagogical Content Knowledge but also in changes to their levels of confidence and identities as teachers. It was found that through time the participants had coalesced into a professional learning community that readily engaged in the sharing, peer review, reuse and adaptation, and collaborative design of innovative science and technology learning and assessment activities.
Abstract:
This book describes mortality for all causes of death, and the trends in major causes of death since the 1970s, in Shandong Province, China.
Abstract:
This paper presents an adaptive metering algorithm for enhancing the electronic screening (e-screening) operation at truck weight stations. The algorithm uses a feedback control mechanism to regulate the number of trucks entering the weight station. The basic operation of the algorithm allows more trucks to be inspected when the weight station is underutilized, by lowering the weight threshold. Conversely, the algorithm restricts the number of trucks to be inspected when the station is overutilized, to prevent queue spillover. The proposed control concept is demonstrated and evaluated in a simulation environment. The simulation results demonstrate the considerable benefits of the proposed algorithm in improving overweight enforcement, with minimal negative impacts on non-overweight trucks. The test results also reveal that the effectiveness of the algorithm improves with higher truck participation rates in the e-screening program.
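The feedback idea described above can be sketched as a single proportional-control update. This is an invented illustration, not the paper's algorithm: the set-point, gain and threshold bounds are all hypothetical numbers chosen only to show the direction of adjustment.

```python
def update_threshold(threshold, queue_len, capacity,
                     target=0.8, gain=10_000.0,
                     t_min=15_000.0, t_max=40_000.0):
    """One feedback step: adjust the sorting weight threshold (kg) from
    station utilization.  An underutilized station lowers the threshold
    (pull in more trucks); a nearly full one raises it (avoid spillover)."""
    utilization = queue_len / capacity
    error = utilization - target        # positive when overutilized
    new_threshold = threshold + gain * error
    return min(t_max, max(t_min, new_threshold))

# Underutilized station: the threshold drops, so more trucks are inspected.
low = update_threshold(30_000.0, queue_len=2, capacity=10)
# Station near capacity: the threshold rises, restricting inspections.
high = update_threshold(30_000.0, queue_len=10, capacity=10)
```

Clamping the threshold keeps the controller from metering away genuinely overweight trucks (upper bound) or flooding the station (lower bound).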
Abstract:
The main aim of this paper is to describe an adaptive re-planning algorithm, based on an RRT and game theory, that produces an efficient, collision-free, obstacle-adaptive Mission Path Planner for Search and Rescue (SAR) missions. This will provide UAV autopilots and flight computers with the capability to autonomously avoid static obstacles and No Fly Zones (NFZs) through dynamic adaptive path re-planning. The methods and algorithms produce optimal collision-free paths and can be integrated into a decision-aid tool and UAV autopilots.
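The RRT component the planner builds on grows a tree of collision-free nodes toward random samples. The following is a generic 2D sketch, not the paper's planner: the circular NFZ, goal bias and step sizes are illustrative, and edge (as opposed to node) collision checks are omitted for brevity.

```python
import math
import random

def rrt_plan(start, goal, obstacles, x_max=100.0, y_max=100.0,
             step=5.0, goal_tol=5.0, goal_bias=0.1,
             max_iters=5000, seed=0):
    """Minimal 2D RRT: steer a fixed `step` from the nearest tree node
    toward a random sample (occasionally the goal itself), keeping only
    new nodes that lie outside the circular obstacles, until a node
    lands within `goal_tol` of `goal`."""
    rng = random.Random(seed)

    def free(p):
        return all(math.dist(p, (cx, cy)) > r for cx, cy, r in obstacles)

    nodes, parent = [start], {0: None}
    for _ in range(max_iters):
        sample = goal if rng.random() < goal_bias else \
            (rng.uniform(0, x_max), rng.uniform(0, y_max))
        near = min(range(len(nodes)),
                   key=lambda i: math.dist(nodes[i], sample))
        nx, ny = nodes[near]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        new = (nx + step * (sample[0] - nx) / d,
               ny + step * (sample[1] - ny) / d)
        if not free(new):
            continue                       # node inside a no-fly zone
        nodes.append(new)
        parent[len(nodes) - 1] = near
        if math.dist(new, goal) <= goal_tol:
            path, i = [], len(nodes) - 1   # walk back up to the root
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
    return None                            # iteration budget exhausted

path = rrt_plan(start=(5.0, 5.0), goal=(90.0, 90.0),
                obstacles=[(50.0, 50.0, 15.0)])  # one circular NFZ
```

A real planner would additionally check each edge against the obstacles and smooth or optimise the returned path, which is where the paper's game-theoretic adaptation comes in.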
Abstract:
Historically, a significant gap between male and female wages has existed in the Australian labour market. Indeed, this wage differential was institutionalised in the 1912 arbitration decision, which determined that the basic female wage would be set at between 54 and 66 per cent of the male wage. More recently, however, the 1969 and 1972 Equal Pay Cases determined that male/female wage relativities should be based upon the premise of equal pay for work of equal value. It is important to note that the mere observation that average wages differ between males and females is not in itself evidence of sex discrimination. Economists restrict the definition of wage discrimination to cases where two distinct groups receive different average remuneration for reasons unrelated to differences in productivity characteristics. This paper extends previous studies of wage discrimination in Australia (Chapman and Mulvey, 1986; Haig, 1982) by correcting the estimated male/female wage differential for the existence of non-random sampling. Previous Australian estimates of human capital-based male/female wage specifications, together with estimates of the corresponding wage differential, all suffer from a failure to address this issue. If the sample of females observed to be working is not a random sample, then estimates of the male/female wage differential will be both biased and inconsistent.
Abstract:
One remaining difficulty in the Information Technology (IT) business value evaluation domain is the direct linkage between IT value and the underlying determinants, or surrogates, of IT value. This paper proposes research that examines the interacting effects of the determinants of IT value and their influences on IT value. The overarching research question is how these determinants interact with each other and affect IT value at the organizational level. To this end, the research adopts a multilevel, complex, adaptive-systems view, in which IT value emerges from the interactions of the underlying determinants. The research is theoretically grounded in three organizational theories: multilevel theory, complex adaptive systems theory, and adaptive structuration theory. By integrating these theoretical paradigms, the research proposes a conceptual model that focuses on the process by which IT value is created from the interactions of those determinants. To answer the research question, an agent-based modeling technique is used to build a computational representation based on the conceptual model. Computational experiments will be conducted on this representation, and validation procedures will be applied to consolidate the validity of the model. Finally, hypotheses will be tested using the computational experimentation data.
Abstract:
Purpose: Virally mediated head and neck cancers (VMHNC) often present with nodal involvement and are highly radioresponsive, meaning that a subset of patients require treatment plan adaptation during radiotherapy (RT). We sought to determine potential risk profiles and a corresponding adaptive treatment strategy for these patients. Methodology: 121 patients with virally mediated, node-positive nasopharyngeal (Epstein-Barr virus positive) or oropharyngeal (human papillomavirus positive) cancers receiving curative-intent RT were reviewed. The type, frequency and timing of adaptive interventions, including source-to-skin distance (SSD) corrections, re-scanning and re-planning, were evaluated. Patients were reviewed based on the maximum size of the dominant node to assess the need for plan adaptation. Results: Forty-six patients (38%) required plan adaptation during treatment. The median fraction at which the adaptive intervention occurred was 26 for SSD corrections and 22 for re-planning CTs. A trend toward 3 risk profile groupings was discovered: 1) low risk, with minimal need (< 10%) for adaptive intervention (dominant pre-treatment nodal size ≤ 35 mm); 2) intermediate risk, with possible need (< 20%) for adaptive intervention (dominant pre-treatment nodal size 36-45 mm); and 3) high risk, with increased likelihood (> 50%) of adaptive intervention (dominant pre-treatment nodal size ≥ 46 mm). Conclusion: In this study, patients with VMHNC and a maximum dominant nodal size of ≥ 46 mm were identified as being at higher risk of requiring re-planning during a course of definitive RT. Findings will be tested in a future prospective adaptive RT study.