866 results for OPTIMIZATION MODEL
Abstract:
A basic prerequisite for in vivo X-ray imaging of the lung is the exact determination of radiation dose. Achieving resolutions of the order of micrometres may become particularly challenging owing to the increased dose, which in the worst case can be lethal for the imaged animal model. A framework for linking image quality to radiation dose is presented, with the aim of optimizing experimental parameters with respect to dose reduction. The approach may find application in current and future in vivo studies to facilitate proper experiment planning and radiation risk assessment on the one hand, and to exploit imaging capabilities on the other.
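As a rough illustration of the dose-quality link such a framework must capture (a generic quantum-noise argument, assumed for illustration; not the paper's specific model):

```latex
% Generic quantum-noise scaling, an illustrative assumption:
% the signal-to-noise ratio grows with the square root of dose D,
\[
  \mathrm{SNR} \propto \sqrt{D}
  \qquad\Longrightarrow\qquad
  D_{\min} \propto \mathrm{SNR}_{\mathrm{target}}^{2}.
\]
% For tomography at a fixed SNR per voxel, dose rises steeply with
% spatial resolution (voxel size $\Delta$), roughly $D \propto \Delta^{-4}$,
% which is why micrometre resolutions can push in vivo dose towards
% levels that are lethal for the animal model.
```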
Abstract:
Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable to small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss Performance Index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
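To make the kind of extension concrete, here is a hedged sketch of the mean-absolute-deviation variant with integral units, dividends, and a budget-side transaction cost; the notation is assumed and need not match the authors' formulation:

```latex
% Hedged sketch, notation assumed: n_i in Z_{>=0} units of stock i bought
% at price p_i; mu_i mean return; r_{it} return in scenario t; delta_i
% dividend per unit; B capital; rho prescribed risk level; c(.) a
% piecewise-constant transaction cost (linearized with one binary per
% cost bracket in the full MILP).
\begin{align*}
  \max_{n,\,d}\quad & \sum_i \bigl(\mu_i\, p_i + \delta_i\bigr)\, n_i \\
  \text{s.t.}\quad  & \frac{1}{T}\sum_{t=1}^{T} d_t \;\le\; \rho\, B, \\
                    & d_t \;\ge\; \pm \sum_i (r_{it} - \mu_i)\, p_i\, n_i
                      \qquad \forall\, t, \\
                    & \sum_i p_i\, n_i + c\Bigl(\sum_i p_i\, n_i\Bigr) \;\le\; B,
                      \qquad n_i \in \mathbb{Z}_{\ge 0}.
\end{align*}
```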
Abstract:
In process industries, make-and-pack production is used to produce food and beverages, chemicals, and metal products, among others. This type of production process allows the fabrication of a wide range of products in relatively small amounts using the same equipment. In this article, we consider a real-world production process (cf. Honkomp et al. 2000. The curse of reality – why process scheduling optimization problems are difficult in practice. Computers & Chemical Engineering, 24, 323–328.) comprising sequence-dependent changeover times, multipurpose storage units with limited capacities, quarantine times, batch splitting, partial equipment connectivity, and transfer times. The planning problem consists of computing a production schedule such that a given demand for packed products is fulfilled, all technological constraints are satisfied, and the production makespan is minimised. None of the models in the literature covers all of the technological constraints that occur in such make-and-pack production processes. To close this gap, we develop an efficient mixed-integer linear programming model that is based on a continuous time domain and general-precedence variables. We propose novel types of symmetry-breaking constraints and a preprocessing procedure to improve the model performance. In an experimental analysis, we show that small- and moderate-sized instances can be solved to optimality within short CPU times.
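A hedged sketch of the general-precedence core of such a model (notation assumed: start times S_i, processing times p_i, sequence-dependent changeovers tau_ij, and a big-M constant):

```latex
% Binary X_{ij} = 1 iff operation i precedes operation j on a shared unit:
\[
  S_j \ge S_i + p_i + \tau_{ij} - M\,(1 - X_{ij}), \qquad
  S_i \ge S_j + p_j + \tau_{ji} - M\,X_{ij},
\]
\[
  C_{\max} \ge S_i + p_i \quad \forall\, i, \qquad \min\; C_{\max}.
\]
% A classic symmetry-breaking device (an example of the genre, not
% necessarily one of the novel constraints proposed here): order identical
% batches b < b' of the same product by S_b <= S_{b'}, removing
% equivalent permutations from the search tree.
```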
Abstract:
Current advanced cloud infrastructure management solutions allow scheduling actions for dynamically changing the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if user utilization patterns change. We propose using a dynamically generated scaling model for the VMs containing the services of distributed applications, which is able to react to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed in order to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing the Service Level Agreements (SLAs) that control the scaling of distributed services by combining data-analysis mechanisms with application benchmarking using multiple VM configurations. By processing the data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical SLA parameters. By combining this set of predictor metrics with a heuristic for selecting the appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used for controlling the runtime scale-in and scale-out of distributed services. We validate our architecture and models by performing scaling experiments with a distributed application representative of the enterprise class of information systems, and show how dynamically generated SLAs can be successfully used to control the scaling of distributed services.
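A minimal sketch of inferring a scaling rule from benchmark data; the metrics, model, and thresholds below are illustrative assumptions, not taken from the paper:

```python
# Hedged sketch: fit a predictor on benchmark runs, then derive an SLA
# scaling rule from it. All names and numbers are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

# Benchmark observations across multiple VM configurations.
# Columns: candidate predictor metrics (CPU %, queue length, requests/s).
X = np.array([[35, 4, 120], [55, 9, 240], [72, 15, 380], [90, 31, 520]])
y = np.array([80, 140, 260, 610])  # observed response time (ms), an SLA parameter

model = LinearRegression().fit(X, y)

def scale_out_needed(metrics, sla_limit_ms=300):
    """Scaling rule: trigger a scale-out when the fitted model predicts
    the SLA response-time limit will be violated."""
    predicted = model.predict(np.asarray(metrics, dtype=float).reshape(1, -1))[0]
    return predicted > sla_limit_ms

print(scale_out_needed([85, 25, 480]))  # True -> add a service instance
```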
Abstract:
Cost-efficient operation while satisfying performance and availability guarantees in Service Level Agreements (SLAs) is a challenge for Cloud Computing, as these are potentially conflicting objectives. We present a framework for SLA management based on multi-objective optimization. The framework features a forecasting model for determining the best virtual machine-to-host allocation given the need to minimize SLA violations, energy consumption, and resource waste. A comprehensive SLA management solution is proposed that uses event processing for monitoring and enables dynamic provisioning of virtual machines onto the physical infrastructure. We validated our implementation against several standard heuristics and showed that our approach performs significantly better.
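As a hedged sketch of how the three objectives might be folded into a single allocation cost for a heuristic search (all weights and cost terms are illustrative assumptions):

```python
# Hedged sketch: score a VM-to-host allocation against the three objectives
# named above (SLA violations, energy, resource waste). Lower is better.
def allocation_cost(allocation, hosts, vms, w_sla=0.5, w_energy=0.3, w_waste=0.2):
    """allocation: dict vm_id -> host_id; weights are illustrative."""
    sla_violations = energy = waste = 0.0
    for host_id, host in hosts.items():
        load = sum(vms[v]["cpu"] for v, h in allocation.items() if h == host_id)
        if load > host["capacity"]:          # overload -> SLA risk
            sla_violations += load - host["capacity"]
        if load > 0:                         # powered-on host
            energy += host["idle_power"] + host["per_cpu_power"] * load
            waste += max(host["capacity"] - load, 0)  # spare powered-on capacity
    return w_sla * sla_violations + w_energy * energy + w_waste * waste

hosts = {"h1": {"capacity": 8, "idle_power": 100, "per_cpu_power": 20}}
vms = {"vm1": {"cpu": 4}, "vm2": {"cpu": 2}}
print(allocation_cost({"vm1": "h1", "vm2": "h1"}, hosts, vms))
```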
Abstract:
In several regions of the world, climate change is expected to have severe impacts on agricultural systems. Changes in land management are one way to adapt to future climatic conditions, including land-use changes and local adjustments of agricultural practices. In previous studies, options for adaptation have mostly been explored by testing alternative scenarios. Systematic explorations of land management possibilities using optimization approaches were so far mainly restricted to studies of land and resource management under constant climatic conditions. In this study, we bridge this gap and exploit the benefits of multi-objective regional optimization for identifying optimum land management adaptations to climate change. We design a multi-objective optimization routine that integrates a generic crop model and considers two climate scenarios for 2050 in a meso-scale catchment on the Swiss Central Plateau with already limited water resources. The results indicate that adaptation will be necessary in the study area to cope with a decrease in productivity by 0–10 %, an increase in soil loss by 25–35 %, and an increase in N-leaching by 30–45 %. Adaptation options identified here exhibit conflicts between productivity and environmental goals, but compromises are possible. Necessary management changes include (i) adjustments of crop shares, i.e. increasing the proportion of early harvested winter cereals at the expense of irrigated spring crops, (ii) widespread use of reduced tillage, (iii) allocation of irrigated areas to soils with low water-retention capacity at lower elevations, and (iv) conversion of some pre-alpine grasslands to croplands.
Abstract:
Purpose: To evaluate normal tissue dose reduction in step-and-shoot intensity-modulated radiation therapy (IMRT) on the Varian 2100 platform by tracking the multileaf collimator (MLC) apertures with the accelerator jaws. Methods: Clinical radiation treatment plans for 10 thoracic, 3 pediatric, and 3 head-and-neck patients were converted to plans with the jaws tracking each segment’s MLC apertures. Each segment was then renormalized to account for the change in collimator scatter to obtain target coverage within 1% of that in the original plan. The new plans were compared to the original plans in a commercial radiation treatment planning system (TPS). Reduction in normal tissue dose was evaluated in the new plans using the parameters V5, V10, and V20 in the cumulative dose-volume histogram for the following structures: total lung minus GTV (gross target volume), heart, esophagus, spinal cord, liver, parotids, and brainstem. In order to validate the accuracy of our beam model, MLC transmission measurements were made and compared to those predicted by the TPS. Results: The greatest change between the original plan and the new plan occurred at lower dose levels. The reduction in V20 was never more than 6.3% and was typically less than 1% for all patients. The reduction in V5 was at most 16.7% and was typically less than 3% for all patients. The variation in normal tissue dose reduction was not predictable, and we found no clear parameters indicating which patients would benefit most from jaw tracking. Our TPS model of MLC transmission agreed with measurements, with absolute transmission differences of less than 0.1%, and thus uncertainties in the model did not contribute significantly to the uncertainty in the dose determination. Conclusion: The amount of dose reduction achieved by collimating the jaws around each MLC aperture in step-and-shoot IMRT does not appear to be clinically significant.
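For reference, the V5/V10/V20 metrics used here are dose-volume-histogram points: Vx is the percentage of a structure's volume receiving at least x Gy. A minimal sketch with illustrative voxel doses:

```python
# Vx = percentage of a structure's volume receiving at least x Gy.
import numpy as np

def v_x(structure_dose_gy, threshold_gy):
    """structure_dose_gy: per-voxel dose inside one structure
    (e.g. total lung minus GTV)."""
    dose = np.asarray(structure_dose_gy)
    return 100.0 * np.count_nonzero(dose >= threshold_gy) / dose.size

doses = np.array([2.0, 6.5, 12.0, 25.0, 3.0])  # illustrative voxel doses
print(v_x(doses, 5), v_x(doses, 20))           # V5 = 60.0, V20 = 20.0 (%)
```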
Abstract:
Computer vision-based food recognition could be used to estimate a meal's carbohydrate content for diabetic patients. This study proposes a methodology for automatic food recognition based on the Bag of Features (BoF) model. An extensive technical investigation was conducted for the identification and optimization of the best-performing components involved in the BoF architecture, as well as the estimation of the corresponding parameters. For the design and evaluation of the prototype system, a visual dataset with nearly 5,000 food images was created and organized into 11 classes. The optimized system computes dense local features using the scale-invariant feature transform on the HSV color space, builds a visual dictionary of 10,000 visual words using hierarchical k-means clustering, and finally classifies the food images with a linear support vector machine classifier. The system achieved a classification accuracy on the order of 78%, proving the feasibility of the proposed approach on a very challenging image dataset.
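A hedged sketch of the described pipeline with shrunken parameters (a small flat k-means vocabulary instead of a 10,000-word hierarchical one, and the HSV value channel instead of the full HSV treatment); the data and sizes are illustrative only:

```python
# Hedged sketch of the BoF pipeline: dense SIFT features, k-means
# vocabulary, histogram encoding, linear SVM. Parameters are shrunk.
import cv2
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import LinearSVC

sift = cv2.SIFT_create()

def dense_sift(image_bgr, step=8):
    # Simplification: SIFT on the HSV value channel only.
    v = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)[:, :, 2]
    kps = [cv2.KeyPoint(float(x), float(y), float(step))
           for y in range(0, v.shape[0], step)
           for x in range(0, v.shape[1], step)]
    _, desc = sift.compute(v, kps)
    return desc

# Illustrative data; the paper's dataset has ~5,000 images in 11 classes.
rng = np.random.default_rng(0)
train_images = [rng.integers(0, 255, (64, 64, 3), dtype=np.uint8) for _ in range(8)]
train_labels = [0, 0, 0, 0, 1, 1, 1, 1]

# Visual dictionary (paper: hierarchical k-means, 10,000 words).
vocab = MiniBatchKMeans(n_clusters=32).fit(
    np.vstack([dense_sift(i) for i in train_images]))

def bof_histogram(image_bgr):
    words = vocab.predict(dense_sift(image_bgr))
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

clf = LinearSVC().fit([bof_histogram(i) for i in train_images], train_labels)
print(clf.predict([bof_histogram(train_images[0])]))
```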
Abstract:
Latrepirdine (Dimebon) is a pro-neurogenic, antihistaminic compound that has yielded mixed results in clinical trials of mild to moderate Alzheimer's disease, with a dramatically positive outcome in a Russian clinical trial that was unconfirmed in a replication trial in the United States. We sought to determine whether latrepirdine (LAT)-stimulated amyloid precursor protein (APP) catabolism is at least partially attributable to regulation of macroautophagy, a highly conserved protein catabolism pathway that is known to be impaired in the brains of patients with Alzheimer's disease (AD). We utilized several mammalian cellular models to determine whether LAT regulates mammalian target of rapamycin (mTOR)- and Atg5-dependent autophagy. Male TgCRND8 mice were chronically administered LAT prior to behavior analysis in the cued and contextual fear conditioning paradigm, as well as immunohistological and biochemical analysis of AD-related neuropathology. Treatment of cultured mammalian cells with LAT led to enhanced mTOR- and Atg5-dependent autophagy. Latrepirdine treatment of TgCRND8 transgenic mice was associated with improved learning behavior and with a reduction in the accumulation of Aβ42 and α-synuclein. We conclude that LAT possesses pro-autophagic properties in addition to the previously reported pro-neurogenic properties, both of which are potentially relevant to the treatment and/or prevention of neurodegenerative diseases. We suggest that elucidation of the molecular mechanism(s) underlying the effects of LAT on neurogenesis, autophagy, and behavior might warrant further study of LAT as a potentially viable lead compound that could yield more consistent clinical benefit following the optimization of its pro-neurogenic, pro-autophagic, and/or pro-cognitive activities.
Abstract:
PURPOSE A beamlet-based direct aperture optimization (DAO) for modulated electron radiotherapy (MERT) using photon multileaf collimator (pMLC) shaped electron fields is developed and investigated. METHODS The Swiss Monte Carlo Plan (SMCP) allows the calculation of dose distributions for pMLC shaped electron beams. SMCP is interfaced with the Eclipse TPS (Varian Medical Systems, Palo Alto, CA), which can thus be included in the inverse treatment planning process for MERT. This process starts with the import of a CT scan into Eclipse, the contouring of the target and the organs at risk (OARs), and the choice of the initial electron beam directions. For each electron beam, the number of apertures, their energy, and their initial shape are defined. Furthermore, the DAO requires dose-volume constraints for the contoured structures. In order to carry out the DAO efficiently, the initial electron beams are divided into a grid of beamlets. For each of these, the dose distribution is precalculated using a modified electron beam model, resulting in a dose list for each beamlet and energy. Then the DAO is carried out, leading to a set of optimal apertures and corresponding weights. These optimal apertures are then converted into pMLC shaped segments and the dose calculation for each segment is performed. For these dose distributions, a weight optimization process is launched in order to minimize the differences between the dose distribution of the optimal apertures and that of the pMLC segments. Finally, a deliverable dose distribution for the MERT plan is obtained and loaded back into Eclipse for evaluation. For an idealized water phantom geometry, a MERT treatment plan is created and compared to the plan obtained using a previously developed forward planning strategy. Furthermore, MERT treatment plans for three clinical situations (breast, chest wall, and parotid metastasis of a squamous cell skin carcinoma) are created using the developed inverse planning strategy. The MERT plans are compared to clinical standard treatment plans using photon beams, and the differences between the optimal and the deliverable dose distributions are determined. RESULTS For the idealized water phantom geometry, the inversely optimized MERT plan achieves the same PTV coverage as the forwardly optimized plan, but with improved OAR sparing. For the right-sided breast case, the MERT plan reduces the lung volume receiving more than 30% of the prescribed dose, as well as the mean lung dose, compared to the standard plan. However, the standard plan leads to a better homogeneity within the CTV. The results for the left-sided chest wall case are similar; in addition, MERT reduces the dose to the heart compared to the standard treatment plan. For the parotid case, MERT leads to lower doses for almost all OARs but to a less homogeneous dose distribution for the PTV when compared to a standard plan. For all cases, the weight optimization successfully minimized the differences between the optimal and the deliverable dose distribution. CONCLUSIONS A beamlet-based DAO using multiple beam angles is implemented and successfully tested for an idealized water phantom geometry and for clinical situations.
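The final weight-optimization step admits a compact formulation: find nonnegative segment weights whose combined dose best matches the optimal-aperture dose. A minimal sketch with illustrative data (the paper's actual optimizer is not specified here; nonnegative least squares is an assumed stand-in):

```python
# Hedged sketch of the segment-weight optimization: choose weights w >= 0
# so the weighted sum of per-segment doses best matches the optimal dose.
import numpy as np
from scipy.optimize import nnls

n_voxels, n_segments = 1000, 5
rng = np.random.default_rng(0)
D = rng.random((n_voxels, n_segments))   # column j: dose of pMLC segment j
w_true = np.array([1.0, 0.5, 0.0, 2.0, 0.3])
d_optimal = D @ w_true                   # dose from the optimal apertures

w, residual = nnls(D, d_optimal)         # min ||D w - d_optimal||, w >= 0
print(np.round(w, 3), residual)
```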
Abstract:
We present a novel surrogate model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as a global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10^2 to 10^4. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
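For reference, the standard expected-improvement criterion employed within the local models (textbook form for minimization; notation assumed):

```latex
% Standard EI for minimization (textbook definition, notation assumed):
\[
  \mathrm{EI}(x) = \bigl(f_{\min} - \mu(x)\bigr)\,\Phi(z) + \sigma(x)\,\phi(z),
  \qquad z = \frac{f_{\min} - \mu(x)}{\sigma(x)},
\]
% where $\mu(x)$ and $\sigma(x)$ are the GP posterior mean and standard
% deviation, $f_{\min}$ is the best observed value, and $\Phi$, $\phi$
% denote the standard normal CDF and PDF. The next evaluation is placed
% where EI is maximal, balancing exploration against exploitation.
```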
Abstract:
The influence of respiratory motion on patient anatomy poses a challenge to accurate radiation therapy, especially in lung cancer treatment. Modern radiation therapy planning uses models of tumor respiratory motion to account for target motion in targeting. The tumor motion model can be verified on a per-treatment-session basis with four-dimensional cone-beam computed tomography (4D-CBCT), which acquires an image set of the dynamic target throughout the respiratory cycle during the therapy session. 4D-CBCT is undersampled if the scan time is too short; however, short scan times are desirable in clinical practice to reduce patient setup time. This dissertation presents the design and optimization of 4D-CBCT to reduce the impact of undersampling artifacts with short scan times. This work measures the impact of undersampling artifacts on the accuracy of target motion measurement under different sampling conditions and for various object sizes and motions. The results provide a minimum scan time such that the target tracking error is less than a specified tolerance. This work also presents new image reconstruction algorithms for reducing undersampling artifacts in undersampled datasets by taking advantage of the assumption that the relevant motion of interest is contained within a volume-of-interest (VOI). It is shown that the VOI-based reconstruction provides more accurate image intensity than standard reconstruction. The VOI-based reconstruction produced 43% lower least-squares error inside the VOI and 84% lower error throughout the image in a study designed to simulate target motion. The VOI-based reconstruction approach can reduce acquisition time and improve image quality in 4D-CBCT.
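A hedged reading of the VOI-based idea (notation assumed): solve only for the dynamic voxels inside the VOI, holding the static exterior at a prior image reconstructed from all projections:

```latex
% x: image; A: projection operator; b: measured (undersampled) projections.
\[
  \hat{x} \;=\; \arg\min_{x}\; \lVert A x - b \rVert_2^2
  \quad \text{s.t.} \quad
  x_j = x_j^{\mathrm{prior}} \;\; \forall\, j \notin \mathrm{VOI}.
\]
% Fixing the exterior to the motion-free prior concentrates the limited
% per-phase data on the dynamic voxels, reducing undersampling streaks.
```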
Abstract:
Existing models estimating oil spill costs at sea are based on data from the past and usually lack a systematic approach. This makes them passive and limits their ability to forecast how changes in the oil-combating fleet or in the location of a spill affect oil spill costs. In this paper we make an attempt towards the development of a probabilistic and systematic model for estimating the costs of clean-up operations in the Gulf of Finland. For this purpose we utilize expert knowledge along with the available data and information from the literature. The obtained information is then combined into a framework with the use of a Bayesian Belief Network. Owing to the lack of data, we validate the model by comparing its results with those of existing models, and we find good agreement. We anticipate that the presented model can contribute to cost-effective oil-combating fleet optimization for the Gulf of Finland. It can also facilitate the estimation of accident consequences in the framework of formal safety assessment (FSA).
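A minimal sketch of such a Bayesian Belief Network; the structure, states, and probabilities are illustrative, and pgmpy is an assumed library choice (the paper does not name one):

```python
# Hedged sketch of a BBN for spill clean-up cost. All nodes, states, and
# probabilities below are illustrative, not the paper's elicited values.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("SpillSize", "Cost"), ("FleetCapacity", "Cost")])
model.add_cpds(
    TabularCPD("SpillSize", 2, [[0.7], [0.3]]),       # states: 0=small, 1=large
    TabularCPD("FleetCapacity", 2, [[0.5], [0.5]]),   # states: 0=low,   1=high
    TabularCPD("Cost", 2,                              # states: 0=low,   1=high
               [[0.80, 0.95, 0.20, 0.50],              # P(Cost=low | parents)
                [0.20, 0.05, 0.80, 0.50]],             # P(Cost=high | parents)
               evidence=["SpillSize", "FleetCapacity"],
               evidence_card=[2, 2]),
)
assert model.check_model()

# Query: cost distribution given a large spill.
posterior = VariableElimination(model).query(["Cost"], evidence={"SpillSize": 1})
print(posterior)
```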
Abstract:
The Armington Assumption in the context of multi-regional CGE models is commonly interpreted as follows: commodities of the same kind but with different origins are imperfect substitutes for each other. In this paper, a static spatial CGE model that is compatible with this assumption and explicitly considers the transport sector and regional price differentials is formulated. Trade coefficients, which are derived endogenously from the optimization behaviors of firms and households, are shown to take the form of a potential function. To investigate how the elasticity of substitution affects equilibrium solutions, a simpler version of the model that incorporates three regions and two sectors (besides the transport sector) is introduced. Results indicate that (1) if commodities produced in different regions are perfect substitutes, regional economies will be either autarkic or completely symmetric, and (2) if they are imperfect substitutes, the impact of the elasticity on the price equilibrium system as well as on the trade coefficients will be nonlinear and sometimes very sensitive.
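For orientation, the standard CES form behind such Armington trade coefficients (assumed notation; the paper's exact specification may differ):

```latex
% A region's composite of a commodity aggregates varieties by origin r
% with CES weights alpha_r and substitution parameter rho:
\[
  Q = \Bigl(\sum_{r} \alpha_r\, q_r^{\rho}\Bigr)^{1/\rho},
  \qquad \sigma = \frac{1}{1-\rho}.
\]
% Cost minimization yields expenditure (trade) shares that decay in the
% delivered price p_r (inclusive of transport margins):
\[
  t_r = \frac{\alpha_r^{\sigma}\, p_r^{\,1-\sigma}}
             {\sum_{s} \alpha_s^{\sigma}\, p_s^{\,1-\sigma}}.
\]
% As sigma -> infinity, varieties become perfect substitutes and all demand
% goes to the cheapest origin; finite sigma yields the smooth, nonlinear
% price dependence noted in the results.
```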