928 results for planning model
Abstract:
Knowledge-based radiation treatment is an emerging concept in radiotherapy. It
mainly refers to techniques that can guide or automate treatment planning in
the clinic by learning from prior knowledge. Different models have been developed to realize
it, one of which was proposed by Yuan et al. at Duke for lung IMRT planning. This
model can automatically determine both the beam configuration and the optimization
objectives, with non-coplanar beams, based on patient-specific anatomical information.
Although plans automatically generated by this model demonstrate equivalent or
better dosimetric quality compared to clinically approved plans, its validity and generality
are limited by the empirical assignment of a coefficient, called the angle spread
constraint, defined in the beam efficiency index used for beam ranking. To eliminate
these limitations, a systematic study of this coefficient is needed to acquire evidence
for its optimal value.
To achieve this purpose, eleven lung cancer patients with complex tumor shapes,
whose clinically approved plans used non-coplanar beams, were retrospectively
studied within the framework of the automatic lung IMRT treatment algorithm. The primary
and boost plans used in three patients were treated as different cases because of their
different target sizes and shapes. A total of 14 lung cases were thus re-planned using
the knowledge-based automatic lung IMRT planning algorithm, varying the angle
spread constraint from 0 to 1 in increments of 0.2. A modified beam angle efficiency
index was adopted to navigate the beam selection. Great effort was made to keep the quality of the plans associated with every angle spread constraint as good
as possible. Important dosimetric parameters for the PTV and OARs, quantitatively
reflecting plan quality, were extracted from the DVHs and analyzed as a function
of the angle spread constraint for each case. These parameters were compared between
clinical plans and model-based plans using two-sample Student's t-tests,
and a regression analysis was performed on a composite index, built from the percentage errors between
dosimetric parameters in the model-based plans and those in the clinical plans, as a
function of the angle spread constraint.
Results show that the model-based plans generally have quality equivalent to or better
than the clinically approved plans, both qualitatively and quantitatively. All dosimetric
parameters in the automatically generated plans, except those for the lungs, are statistically
better than or comparable to those in the clinical plans. On average, reductions of more than 15%
in the conformity index and homogeneity index for the PTV and in V40 and V60 for the heart
are observed, along with increases of 8% and 3% in V5 and V20 for the lungs, respectively. The
intra-plan comparison among model-based plans demonstrates that plan quality
changes little once the angle spread constraint exceeds 0.4. Further examination
of the variation curve of the composite index as a function of the angle spread constraint
shows that 0.6 is the optimal value, resulting in statistically the best achievable
plans.
Abstract:
Background: Since 2007, there has been an ongoing collaboration between Duke University and Mulago National Referral Hospital (NRH) in Kampala, Uganda to increase surgical capacity. This program is prepared to expand to other sites within Uganda to improve neurosurgery outside of Kampala as well. This study assessed the existing progress at Mulago NRH and the neurosurgical needs and assets at two potential sites for expansion. Methods: Three public hospitals were visited to assess needs and assets: Mulago NRH, Mbarara Regional Referral Hospital (RRH), and Gulu RRH. At each site, a surgical capacity tool was administered and healthcare workers were interviewed about perceived needs and assets. A total of 39 interviews were conducted across the three sites. Thematic analysis of the interviews was conducted to identify the reported needs and assets at each hospital. Results: Some improvements are needed to the Duke-Mulago Collaboration model prior to expansion; minor changes to the neurosurgery residency program, as well as to the method for supply donation and the training provided during neurosurgery camps, need to be examined. Neurosurgery can be implemented at Mbarara RRH now, but the hospital needs a biomedical equipment technician on staff immediately. Gulu RRH is not well positioned for neurosurgery until there is a CT scanner at the hospital or elsewhere in the Northern Region of Uganda. Conclusions: Neurosurgery is already present in Uganda on a small scale and needs rapid expansion to meet patient needs. This progression is possible with prudent allocation of resources to strategic equipment purchases, human resources including clinical and biomedical staff, and changes to the supply chain management system.
Abstract:
Radiotherapy is commonly used to treat lung cancer. However, radiation-induced damage to lung tissue is a major limiting factor in its use. To minimize normal-tissue lung toxicity in conformal radiotherapy treatment planning, we investigated the use of Perfluoropropane (PFP)-enhanced MR imaging to assess and guide the sparing of functioning lung. Fluorine-enhanced MRI using PFP is a dynamic, multi-breath, steady-state technique enabling quantitative and qualitative assessments of lung function (1).
Imaging data were obtained from studies previously acquired in the Duke Image Analysis Laboratory. All studies were approved by the Duke IRB. The data were de-identified for this project, which was also approved by the Duke IRB. Subjects performed several breath-holds at total lung capacity (TLC) interspersed with multiple tidal breaths (TB) of a PFP/oxygen mixture. Additive wash-in intensity images were created by summing the wash-in phase breath-holds. Additionally, model-based fitting was used to create parametric images of lung function (1).
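The additive wash-in step lends itself to a short sketch: registered wash-in phase breath-hold volumes are summed voxel-wise, and a threshold on the result could define a functioning-lung structure for planning. The array shapes, synthetic data, and threshold are illustrative assumptions.

```python
import numpy as np

# Synthetic stand-in for N registered wash-in phase breath-hold volumes
# (z, y, x); real data would come from the 19F MR acquisition.
rng = np.random.default_rng(1)
n_breath_holds = 5
volumes = [np.clip(rng.normal(loc=k, scale=0.3, size=(32, 64, 64)), 0, None)
           for k in range(1, n_breath_holds + 1)]

# Additive wash-in intensity image: voxel-wise sum over the wash-in breath-holds.
wash_in = np.sum(volumes, axis=0)

# A simple functional mask (voxels above a fraction of the maximum signal)
# could then be passed to the planning system as a "functioning lung" structure.
mask = wash_in > 0.5 * wash_in.max()
print(wash_in.shape, f"functioning-lung fraction: {mask.mean():.2%}")
```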
Varian Eclipse treatment planning software was used for putative treatment planning. For each subject, two plans were made: a standard plan, which considered no regional functional lung information beyond current standard models, and a functional plan, which used the functional information to spare functioning lung while maintaining dose to the target lesion. Plans were optimized to a prescription dose of 60 Gy to the target over 30 fractions.
A decrease in dose to functioning lung was observed with the functional plan compared to the standard plan for all five subjects. PFP-enhanced MR imaging is thus a feasible method to assess ventilatory lung function, and we have shown how it can be incorporated into treatment planning to potentially decrease the dose to normal tissue.
Abstract:
Bayesian nonparametric models, such as the Gaussian process and the Dirichlet process, have been extensively applied to target kinematics modeling in various applications, including environmental monitoring, traffic planning, endangered species tracking, dynamic scene analysis, autonomous robot navigation, and human motion modeling. As these successful applications show, Bayesian nonparametric models are able to adjust their complexity adaptively from data as necessary and are resistant to overfitting and underfitting. However, most existing works assume that the sensor measurements used to learn the Bayesian nonparametric target kinematics models are obtained a priori, or that the target kinematics can be measured by the sensor at any given time throughout the task. Little work has been done on controlling a sensor with a bounded field of view to obtain the measurements of mobile targets that are most informative for reducing the uncertainty of the Bayesian nonparametric models. To present a systematic sensor planning approach to learning Bayesian nonparametric models, the Gaussian process target kinematics model is introduced first; it is capable of describing time-invariant spatial phenomena, such as ocean currents, temperature distributions, and wind velocity fields. The Dirichlet process-Gaussian process target kinematics model is subsequently discussed for modeling mixtures of mobile targets, such as pedestrian motion patterns.
Novel information theoretic functions are developed for these Bayesian nonparametric target kinematics models to represent the expected utility of measurements as a function of sensor control inputs and random environmental variables. A Gaussian process expected Kullback-Leibler (KL) divergence is developed as the expectation of the KL divergence between the current (prior) and posterior Gaussian process target kinematics models with respect to the future measurements. This approach is then extended to develop a new information value function that can be used to estimate target kinematics described by a Dirichlet process-Gaussian process mixture model. A theorem is proposed showing that the novel information theoretic functions are bounded. Based on this theorem, efficient estimators of the new information theoretic functions are designed, which are proved to be unbiased, with the variance of the resulting approximation error decreasing linearly as the number of samples increases. The computational complexity of optimizing the novel information theoretic functions under sensor dynamics constraints is studied and proved to be NP-hard. A cumulative lower bound is then proposed to reduce the computational complexity to polynomial time.
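For orientation, on any finite set of test points the prior and posterior GP marginals are multivariate Gaussians, so the KL divergence inside the expectation has a standard closed form. The sketch below states that finite-dimensional form in one common direction, KL(posterior‖prior); the notation (posterior N(mu_1, Sigma_1), prior N(mu_0, Sigma_0), k test points) is assumed for illustration rather than taken from the dissertation.

```latex
% KL divergence between finite-dimensional GP marginals at k test points,
% posterior N(mu_1, Sigma_1) against prior N(mu_0, Sigma_0):
\[
D_{\mathrm{KL}}\!\left(\mathcal{N}(\mu_1,\Sigma_1)\,\|\,\mathcal{N}(\mu_0,\Sigma_0)\right)
= \tfrac{1}{2}\left[
\operatorname{tr}\!\left(\Sigma_0^{-1}\Sigma_1\right)
+ (\mu_0-\mu_1)^{\top}\Sigma_0^{-1}(\mu_0-\mu_1)
- k + \ln\frac{\det\Sigma_0}{\det\Sigma_1}
\right].
\]
% The GP expected KL divergence is the expectation of this quantity over the
% (not yet observed) future measurements, as a function of the control input.
```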
Three sensor planning algorithms are developed according to the assumptions on the target kinematics and the sensor dynamics. For problems where the control space of the sensor is discrete, a greedy algorithm is proposed. The efficiency of the greedy algorithm is demonstrated by a numerical experiment with ocean current data obtained from moored buoys. A sweep line algorithm is developed for applications where the sensor control space is continuous and unconstrained. Synthetic simulations as well as physical experiments with ground robots and a surveillance camera are conducted to evaluate the performance of the sweep line algorithm. Moreover, a lexicographic algorithm is designed, based on the cumulative lower bound of the novel information theoretic functions, for the scenario where the sensor dynamics are constrained. Numerical experiments with real data collected from indoor pedestrians by a commercial pan-tilt camera are performed to examine the lexicographic algorithm. Results from both the numerical simulations and the physical experiments show that the three sensor planning algorithms proposed in this dissertation, based on the novel information theoretic functions, are superior at learning the target kinematics with little or no prior knowledge.
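A minimal sketch of the greedy step for a discrete control space follows; the information_value callback stands in for an estimator of the expected-KL utility, and the candidate pan angles and toy utility are assumptions for illustration.

```python
def greedy_plan(controls, information_value, horizon):
    """Greedy sensor planning over a discrete control space.

    At each step, pick the control input that maximizes the (estimated)
    information value of the next measurement. `information_value` stands in
    for an unbiased estimator of the expected KL utility described above.
    """
    plan = []
    for _ in range(horizon):
        best = max(controls, key=lambda u: information_value(u, plan))
        plan.append(best)
    return plan

# Toy example: pan angles for a camera; the utility favors unvisited angles
# and mildly penalizes large pan movements.
controls = [-60, -30, 0, 30, 60]
def toy_utility(u, history):
    return -abs(u) * 0.01 + (0.0 if u in history else 1.0)

print(greedy_plan(controls, toy_utility, horizon=3))
```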
Abstract:
A scenario-based two-stage stochastic programming model for gas production network planning under uncertainty is usually a large-scale nonconvex mixed-integer nonlinear programme (MINLP), which can be solved efficiently to global optimality with nonconvex generalized Benders decomposition (NGBD). This paper is concerned with the parallelization of NGBD to exploit multiple available computing resources. Three parallelization strategies are proposed: naive scenario parallelization, adaptive scenario parallelization, and adaptive scenario and bounding parallelization. A case study of two industrial natural gas production network planning problems shows that, while NGBD without parallelization is already faster than a state-of-the-art global optimization solver by an order of magnitude, parallelization can improve efficiency severalfold on computers with multicore processors. The adaptive scenario and bounding parallelization achieves the best overall performance among the three proposed strategies.
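A rough sketch of the simplest strategy, naive scenario parallelization: because the scenario subproblems of the decomposition are independent, they can be farmed out to a process pool and their bound contributions summed. The solve_scenario placeholder and its arithmetic are assumptions for illustration, not the paper's subproblem.

```python
from concurrent.futures import ProcessPoolExecutor

def solve_scenario(scenario):
    """Hypothetical stand-in for solving one scenario subproblem of the
    decomposed MINLP; returns that scenario's bound contribution."""
    demand, price = scenario
    return demand * price  # placeholder arithmetic, not a real subproblem

def parallel_bound(scenarios, workers=4):
    # Naive scenario parallelization: the subproblems are independent,
    # so they can be solved concurrently on a pool of worker processes.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        contributions = list(pool.map(solve_scenario, scenarios))
    return sum(contributions)

if __name__ == "__main__":
    scenarios = [(10.0, 2.0), (12.0, 1.8), (9.5, 2.2), (11.0, 2.1)]
    print(parallel_bound(scenarios))
```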
Abstract:
In radiotherapy planning, computed tomography (CT) images are used to quantify the electron density of tissues and provide spatial anatomical information. Treatment planning systems use these data to calculate the expected spatial distribution of absorbed dose in a patient. CT imaging is complicated by the presence of metal implants, which increase image noise, produce artifacts throughout the image, and can exceed the available range of CT number values within the implant, perturbing electron density estimates in the image. Furthermore, current dose calculation algorithms do not accurately model radiation transport at metal-tissue interfaces. Combined, these issues adversely affect the accuracy of dose calculations in the vicinity of metal implants. As the number of patients with orthopedic and dental implants grows, so does the need to deliver safe and effective radiotherapy treatments in the presence of implants. The Medical Physics group at the Cancer Centre of Southeastern Ontario and Queen's University has developed a Cobalt-60 CT system that is relatively insensitive to metal artifacts owing to its high-energy, nearly monoenergetic Cobalt-60 photon beam. Kilovoltage CT (kVCT) images, including images corrected using a commercial metal artifact reduction tool, were compared to Cobalt-60 CT images throughout the treatment planning process, from initial imaging through to dose calculation. An effective metal artifact reduction algorithm was also implemented for the Cobalt-60 CT system. Electron density maps derived from the same kVCT and Cobalt-60 CT images indicated the impact of image artifacts on estimates of photon attenuation for treatment planning applications. Measurements showed that truncation of CT number data in kVCT images significantly mischaracterized the electron density of metals. Dose measurements downstream of metal inserts in a water phantom were compared to dose data calculated using CT images from the kVCT and Cobalt-60 systems, with and without artifact correction. The superior accuracy of the electron density data derived from Cobalt-60 images yielded calculated doses in far better agreement with measured results than those from kVCT images. These results indicate that dose calculation errors from metal image artifacts are primarily due to misrepresentation of electron density within metals rather than to artifacts surrounding the implants.
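To illustrate the truncation issue, the sketch below maps CT numbers to relative electron density through a piecewise-linear calibration and shows how a 12-bit kVCT scale that clamps at 3071 HU underestimates the density of a metal insert; the calibration points and HU values are illustrative assumptions, not the clinical system's data.

```python
import numpy as np

# Illustrative CT-number-to-relative-electron-density calibration points
# (a real curve would be measured with a density phantom).
hu_points = np.array([-1000.0, 0.0, 1000.0, 3071.0, 20000.0])
red_points = np.array([0.0, 1.0, 1.6, 2.8, 7.0])  # relative electron density

def electron_density(hu):
    return np.interp(hu, hu_points, red_points)

# A metal insert's "true" CT number can lie far above the representable
# range; a 12-bit kVCT scale clamps it at 3071 HU, so the planning system
# sees an artificially low electron density inside the implant.
true_hu = 8000.0
clamped_hu = min(true_hu, 3071.0)

print(f"true RED    ~ {electron_density(true_hu):.2f}")
print(f"clamped RED ~ {electron_density(clamped_hu):.2f}  (truncated kVCT)")
```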
Abstract:
Marine Protected Areas (MPAs) are widely used as tools to maintain biodiversity, protect habitats and ensure that development is sustainable. If MPAs are to maintain their role into the future it is important for managers to understand how conditions at these sites may change as a result of climate change and other drivers, and this understanding needs to extend beyond temperature to a range of key ecosystem indicators. This case study demonstrates how spatially-aggregated model results for multiple variables can provide useful projections for MPA planners and managers. Conditions in European MPAs have been projected for the 2040s using unmitigated and globally managed scenarios of climate change and river management, and hence high and low emissions of greenhouse gases and riverborne nutrients. The results highlight the vulnerability of potential refuge sites in the north-west Mediterranean and the need for careful monitoring at MPAs to the north and west of the British Isles, which may be affected by changes in Atlantic circulation patterns. The projections also support the need for more MPAs in the eastern Mediterranean and Adriatic Sea, and can inform the selection of sites.
Abstract:
Since the beginning of the 20th century, the Garden City model has been a predominant theory emerging from Ecological Urbanism. In his book, Ebenezer Howard observed the disastrous effects of rapid urbanization and, in response, proposed the Garden City. Although Howard's proposal was first published in the late 1800s, the clear imbalance that he aimed to address is still prevalent in the UK today. Each year the UK wastes nearly 15 million tons of food; despite this, an estimated 500,000 people in the UK go without sufficient access to food. While the urban population is rapidly increasing and cities are becoming hubs of economic activity, producing wealth and improving education and access to markets, it is within these cities that the imbalance is most evident, with a significant proportion of the world's population with unmet needs living in urban areas. Despite Howard's model being a response to 19th-century London, many still consider the Garden City model an effective solution for the 21st century. In his book, Howard details the metrics required for the design of a Garden City. This paper discusses how, by using this methodology and comparing it with more recent studies by Cornell University and Matthew Wheeland (Pure Energies), it is possible to test the validity of Howard's proposal and establish whether the Garden City model is a viable solution to the increasing pressures of urbanization.
This paper outlines how analysis of Howard's proposal has shown the model to be flawed: it is incapable of producing enough food to sustain the proposed population of 32,000, with the capacity to produce only 23% of the food required to meet the current average UK consumption rate. Beyond the limited productive capacity of Howard's model, the design itself does little to increase local resilience or the ecological base. This paper also discusses how a greater understanding of the land-share requirements enables the design of a new urban model, building on the foundations initially laid out by Howard and combining a number of other theories to produce a more resilient and efficient model of ecological urbanism.
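As a back-of-the-envelope check on the 23% figure, the sketch below divides Howard's agricultural allocation by the proposed population; the 5,000-acre agricultural estate is Howard's published figure, while the per-capita land requirement (about 0.68 acres per person, in the range suggested by the Cornell land-share work) is an assumed round number.

```python
# Back-of-the-envelope check of the Garden City's productive capacity.
# Howard's metrics: a 6,000-acre estate with 1,000 acres of town and
# 5,000 acres of agricultural land, for a population of 32,000.
agricultural_acres = 5_000
population = 32_000

# Assumed per-capita land requirement for the current average UK diet
# (~0.68 acres/person, within the range reported by the Cornell studies).
acres_needed_per_person = 0.68

available_per_person = agricultural_acres / population
coverage = available_per_person / acres_needed_per_person
print(f"available: {available_per_person:.3f} acres/person")
print(f"coverage of dietary requirement: {coverage:.0%}")  # roughly 23%
```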
Abstract:
Adjoint methods have proven to be an efficient way of calculating the gradient of an objective function with respect to a shape parameter for optimisation, with a computational cost nearly independent of the number of design variables [1]. The approach in this paper links the adjoint surface sensitivities (the gradient of the objective function with respect to surface movement) with the parametric design velocities (the movement of the surface due to a CAD parameter perturbation) in order to compute the gradient of the objective function with respect to the CAD variables.
For a successful implementation of shape optimisation strategies in practical industrial cases, the choice of design variables, or the parameterisation scheme used for the model to be optimised, plays a vital role. Where the goal is to base the optimisation on a CAD model, the choices are to use a NURBS geometry generated from CAD modelling software, where the positions of the NURBS control points are the optimisation variables [2], or to use the feature-based CAD model with all of its construction history, which preserves the design intent [3]. The main advantage of using the feature-based model is that the optimised model can be used directly for downstream applications, including manufacturing and process planning.
This paper presents an approach to optimisation based on the feature-based CAD model, which uses the CAD parameters defining the features in the model geometry as the design variables. To capture the movement of the CAD surface with respect to a change in a design variable, the "parametric design velocity" is calculated, defined as the movement of the CAD model boundary in the normal direction due to a change in the parameter value.
The approach presented here for calculating the design velocities represents an advance in capability and robustness over that described by Robinson et al. [3]. The process can be easily integrated into most industrial optimisation workflows and is immune to the topology and labelling issues highlighted in other CAD-based optimisation processes. It considers every continuous ("real-valued") parameter type as an optimisation variable, and it can be adapted to work with any CAD modelling software, provided the software has an API that exposes the values of the parameters controlling the model shape and allows the model geometry to be exported. To calculate the movement of the boundary, the methodology employs finite differences on the shape of the 3D CAD model before and after the parameter perturbation. The implementation involves calculating the geometric movement along the normal direction between two discrete representations, of the original and the perturbed geometry respectively. The parametric design velocities can then be linked directly with the adjoint surface sensitivities to extract the gradients used in a gradient-based optimisation algorithm.
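A minimal sketch of the gradient assembly described above, assuming adjoint surface sensitivities and design velocities sampled on the same discrete surface: the parametric design velocity is a finite-difference normal movement per unit parameter change, and the CAD-parameter gradient is its area-weighted product with the surface sensitivity. The mesh arrays and step size are illustrative.

```python
import numpy as np

def design_velocity(surface_base, surface_perturbed, normals, dp):
    """Finite-difference parametric design velocity: normal movement of the
    boundary per unit change of the CAD parameter (forward difference)."""
    displacement = surface_perturbed - surface_base           # (n_faces, 3)
    return np.einsum("ij,ij->i", displacement, normals) / dp  # (n_faces,)

def parameter_gradient(sensitivity, velocity, face_areas):
    """dJ/dp ~ surface integral of (dJ/dn) * (dn/dp): a discrete, area-weighted
    dot product of adjoint surface sensitivity and design velocity."""
    return float(np.sum(sensitivity * velocity * face_areas))

# Toy data: 4 surface face centroids with unit normals along +z; the perturbed
# geometry moves uniformly 1 mm in the normal direction for a step dp = 0.01.
base = np.zeros((4, 3))
perturbed = base + np.array([0.0, 0.0, 1e-3])
normals = np.tile([0.0, 0.0, 1.0], (4, 1))
areas = np.full(4, 0.25)
sens = np.array([0.5, 1.0, -0.2, 0.8])  # adjoint dJ/dn per face

v = design_velocity(base, perturbed, normals, dp=0.01)
print(parameter_gradient(sens, v, areas))
```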
An application to a flow optimisation problem is presented, in which the power dissipation of the flow in an automotive air duct is reduced by changing the parameters of the CAD geometry created in CATIA V5. The flow sensitivities are computed with the continuous adjoint method for laminar and turbulent flow [4] and are combined with the parametric design velocities to compute the cost-function gradients. A line-search algorithm is then used to update the design variables and continue the optimisation process.
Abstract:
Introduction: Seeking preconception care is recognized as an important health behavior for women with preexisting diabetes. Yet many women with diabetes do not seek care or advice until after they are pregnant, and many enter pregnancy with suboptimal glycemic control. This study explored the attitudes about pregnancy and preconception care seeking in a group of nonpregnant women with type 1 diabetes mellitus. Methods: In-depth semistructured interviews were completed with 14 nonpregnant women with type 1 diabetes. Results: Analysis of the interview data revealed 4 main themes: 1) the emotional complexity of childbearing decisions, 2) preferences for information related to pregnancy, 3) the importance of being known by your health professional, and 4) frustrations with the medical model of care. Discussion: These findings raise questions about how preconception care should be provided to women with diabetes and highlight the pivotal importance of supportive, familiar relationships between health professionals and women with diabetes in the provision of individualized care and advice. By improving the quality of relationships and communication between health care providers and patients, we will be better able to provide care and advice that is perceived as relevant to the individual, whatever her stage of family planning. © 2012 by the American College of Nurse-Midwives.
Abstract:
Today a number of studies are published on how organizational strategy is developed and how organizations contribute to local and regional development through the realization of these strategies. There are also many articles dealing with the success of a project, identifying the success criteria and the factors that influence them. This article introduces the project-oriented strategic planning process, which reveals how projects contribute to local and regional development, and demonstrates the relationship between this approach and the regional competitiveness model as well as the KRAFT concept. Much research focuses on sustainability in business, arguing that sustainability is critical to the future success of a business. However, the Project Excellence Model that analyses project success does not contain sustainability criteria, and the GPM P5 standard contains sustainability components related only to the organizational level. To fill this gap, a Project Sustainability Excellence Model (PSEM) was developed. The model was tested through interviews with managers of Hungarian for-profit and non-profit organizations. This paper introduces the PSEM and highlights the most important elements of the empirical analysis.
Abstract:
Production Planning and Control (PPC) systems have grown and changed because of developments in planning tools and models, as well as the use of computers and information systems in this area. Though much is available in research journals, the practice of PPC lags behind and makes little use of published research. PPC practices in SMEs lag behind for many reasons, which need to be explored. This research examines the effect on firm performance of identified variables such as the forecasting, planning, and control methods adopted, the demographics of the key person, the standardization practices followed, and the effects of training, learning, and IT usage. A model and framework have been developed based on the literature. The model was tested empirically with data collected through a questionnaire schedule administered to selected respondents from Small and Medium Enterprises (SMEs) in India; the final dataset comprised 382 responses. Hypotheses linking SME performance with the use of forecasting, planning, and controlling were formed and tested. Exploratory factor analysis was used for data reduction and for identifying the factor structure. High- and low-performing firms were classified using a logistic regression model. A confirmatory factor analysis was used to study the structural relationship between firm performance and the dependent variables.
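As a brief illustration of the high/low classification step, the sketch below fits a logistic regression on a synthetic matrix of factor scores; the factor names, the data, and the use of scikit-learn are assumptions for illustration, not the study's instrument or analysis code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Placeholder factor scores (e.g. forecasting, planning, control, IT usage)
# standing in for the EFA output; labels mark high (1) vs low (0) performers.
rng = np.random.default_rng(42)
X = rng.normal(size=(382, 4))
y = (X @ np.array([0.8, 0.5, 0.6, 0.3])
     + rng.normal(scale=0.5, size=382) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
print("factor coefficients:", clf.coef_.round(2))
```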
Abstract:
When transporting wood from the forest to the mills, many unforeseen events can occur that disrupt the planned trips (for example, because of weather conditions, forest fires, the appearance of new loads, etc.). When such events only become known during a trip, the truck making that trip must be diverted to an alternative route. Without information about such a route, the truck driver is likely to choose an alternative route that is unnecessarily long or, worse, one that is itself "closed" because of an unforeseen event. It is therefore essential to provide drivers with real-time information, in particular suggestions of alternative routes when a planned road turns out to be impassable. The recourse options in case of unforeseen events depend on the characteristics of the logistics chain under study, such as the presence of self-loading trucks and the transport management policy. We present three articles dealing with different application contexts, together with models and solution methods adapted to each context. In the first article, the truck drivers have the entire weekly plan for the current week. In this context, every effort must be made to minimize the changes made to the initial plan. Although the truck fleet is homogeneous, there is a priority order among drivers: those with higher priority receive the largest workloads, and minimizing the changes to their plans is also a priority. Since the consequences of unforeseen events on the transport plan are essentially cancellations and/or delays of certain trips, the proposed approach first handles the cancellation or delay of a single trip, and is then generalized to handle more complex events. In this approach, we try to re-plan the affected trips within the same week so that a loader is free when the truck arrives at both the forest site and the mill. In this way, the trips of the other trucks are not modified. This approach provides dispatchers with alternative plans within seconds. Better solutions could be obtained if the dispatcher were allowed to make more changes to the initial plan. In the second article, we consider a context where only one trip at a time is communicated to the drivers. The dispatcher waits until a driver finishes the current trip before revealing the next one. This context is more flexible and offers more recourse options in case of unforeseen events. Moreover, the weekly problem can be divided into daily problems, since demand is daily and the mills are open for limited periods during the day. We use a mathematical programming model based on a space-time network to react to disruptions. Although disruptions can have different effects on the initial transport plan, a key feature of the proposed model is that it remains valid for handling all unforeseen events, whatever their nature. Indeed, the impact of these events is captured in the space-time network and in the input parameters rather than in the model itself. The model is solved for the current day each time an unforeseen event is revealed.
In the last article, the truck fleet is heterogeneous, including trucks with on-board loaders. The route configuration of these trucks differs from that of regular trucks, since they do not need to be synchronized with the loaders. We use a mathematical model in which the columns can be easily and naturally interpreted as truck routes, and we solve this model using column generation. First, we relax the integrality of the decision variables and consider only a subset of the feasible routes. Routes with the potential to improve the current solution are then added to the model iteratively. A space-time network is used both to represent the impact of unforeseen events and to generate these routes. The solution obtained is generally fractional, and a branch-and-price algorithm is used to find integer solutions. Several disruption scenarios were developed to test the proposed approach on case studies from the Canadian forestry industry, and numerical results are presented for all three contexts.
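To make the column generation loop concrete, a toy sketch on a small set-covering master problem follows: candidate truck routes are priced against the duals of the trip-covering constraints and added while a route with negative reduced cost exists. The fixed route pool, the costs, and the recovery of duals from SciPy's HiGHS interface are illustrative assumptions; in the dissertation, routes are generated from a space-time network rather than enumerated up front.

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance: 4 trips, each to be covered by at least one truck route.
# Each pool entry maps a route name to (trips covered, route cost).
pool = {
    "r1": ([0], 4.0), "r2": ([1], 4.0), "r3": ([2], 4.0), "r4": ([3], 4.0),
    "r12": ([0, 1], 6.0), "r23": ([1, 2], 6.0), "r34": ([2, 3], 6.0),
    "r1234": ([0, 1, 2, 3], 13.0),
}
n_trips = 4
active = ["r1", "r2", "r3", "r4"]  # initial restricted master columns

def solve_rmp(columns):
    """LP relaxation of the restricted master (set covering): minimize cost
    subject to every trip being covered. Returns objective, primal x, duals."""
    c = np.array([pool[r][1] for r in columns])
    A = np.zeros((n_trips, len(columns)))
    for j, r in enumerate(columns):
        A[pool[r][0], j] = 1.0
    # Covering constraints A x >= 1 expressed as -A x <= -1 for linprog.
    res = linprog(c, A_ub=-A, b_ub=-np.ones(n_trips), method="highs")
    duals = -res.ineqlin.marginals  # duals of the >= 1 covering rows
    return res.fun, res.x, duals

while True:
    obj, x, duals = solve_rmp(active)
    # Pricing: look for a pooled route with negative reduced cost.
    entering = None
    for r, (trips, cost) in pool.items():
        if r not in active and cost - duals[trips].sum() < -1e-9:
            entering = r
            break
    if entering is None:
        break  # LP optimum over the full pool; branch-and-price would follow
    active.append(entering)

print(f"LP objective: {obj:.1f}")
print({r: round(v, 2) for r, v in zip(active, x) if v > 1e-9})
```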