22 results for Multi-objective functions

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Abstract:

Cost-efficient operation while satisfying performance and availability guarantees in Service Level Agreements (SLAs) is a challenge for Cloud Computing, as these are potentially conflicting objectives. We present a framework for SLA management based on multi-objective optimization. The framework features a forecasting model for determining the best virtual machine-to-host allocation given the need to minimize SLA violations, energy consumption and resource waste. A comprehensive SLA management solution is proposed that uses event processing for monitoring and enables dynamic provisioning of virtual machines onto the physical infrastructure. We validated our implementation against several standard heuristics and showed that our approach significantly outperforms them.
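
The abstract gives no implementation details, so the following sketch only illustrates the kind of trade-off the framework optimizes: it scores a candidate VM-to-host allocation on the three stated objectives (SLA violations, energy consumption, resource waste) and compares allocations by Pareto dominance. All names, the linear power model, and the normalized units are illustrative assumptions, not the paper's method.

```python
# Illustrative sketch, not the paper's method: score a VM-to-host
# allocation on the three stated objectives and compare allocations
# by Pareto dominance. Units and the linear power model are assumptions.
from dataclasses import dataclass

@dataclass
class Host:
    capacity: float      # normalized CPU capacity
    idle_power: float    # watts when powered on but idle
    peak_power: float    # watts at full load

def allocation_objectives(allocation, hosts, vm_demand):
    """Return (sla_violations, energy, waste) for one VM -> host mapping."""
    load = [0.0] * len(hosts)
    for vm, host_idx in enumerate(allocation):
        load[host_idx] += vm_demand[vm]
    sla = sum(max(0.0, l - h.capacity) for l, h in zip(load, hosts))
    energy = sum(h.idle_power + (h.peak_power - h.idle_power) * min(l / h.capacity, 1.0)
                 for l, h in zip(load, hosts) if l > 0)
    waste = sum(max(0.0, h.capacity - l) for l, h in zip(load, hosts) if l > 0)
    return sla, energy, waste

def dominates(a, b):
    """a Pareto-dominates b: no worse in every objective, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
```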

Relevance:

100.00%

Abstract:

In several regions of the world, climate change is expected to have severe impacts on agricultural systems. Changes in land management are one way to adapt to future climatic conditions, including land-use changes and local adjustments of agricultural practices. In previous studies, options for adaptation have mostly been explored by testing alternative scenarios. Systematic explorations of land management possibilities using optimization approaches have so far mainly been restricted to studies of land and resource management under constant climatic conditions. In this study, we bridge this gap and exploit the benefits of multi-objective regional optimization for identifying optimum land management adaptations to climate change. We design a multi-objective optimization routine that integrates a generic crop model and considers two climate scenarios for 2050 in a meso-scale catchment on the Swiss Central Plateau where water resources are already limited. The results indicate that adaptation will be necessary in the study area to cope with a decrease in productivity of 0–10%, an increase in soil loss of 25–35%, and an increase in N-leaching of 30–45%. The adaptation options identified here exhibit conflicts between productivity and environmental goals, but compromises are possible. Necessary management changes include (i) adjustments of crop shares, i.e. increasing the proportion of early harvested winter cereals at the expense of irrigated spring crops, (ii) widespread use of reduced tillage, (iii) allocation of irrigated areas to soils with low water-retention capacity at lower elevations, and (iv) conversion of some pre-alpine grasslands to croplands.
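
As a minimal illustration of the multi-objective reasoning behind such a routine (not the study's actual optimizer, which couples a generic crop model to the search), the sketch below filters hypothetical land-management options down to the Pareto-optimal set over the three indicators discussed: productivity (maximized) versus soil loss and N-leaching (both minimized). All numbers are made up.

```python
# Illustrative only: reduce candidate land-management options to the
# Pareto-optimal set over (productivity, soil loss, N-leaching).
# Productivity is maximized; the other two indicators are minimized.

def dominated_by(a, b):
    """True if option b is at least as good as a everywhere and better somewhere."""
    better_eq = b[0] >= a[0] and b[1] <= a[1] and b[2] <= a[2]
    strictly = b[0] > a[0] or b[1] < a[1] or b[2] < a[2]
    return better_eq and strictly

def pareto_front(options):
    return [a for a in options
            if not any(dominated_by(a, b) for b in options if b is not a)]

# (yield index, soil loss t/ha, N-leaching kg/ha) -- made-up numbers.
print(pareto_front([(0.95, 3.1, 28.0), (0.90, 2.4, 25.0), (0.88, 2.9, 31.0)]))
# -> the first two options remain; the third is dominated by the second.
```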

Relevance:

100.00%

Abstract:

This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that becomes more effective as the number of processors grows. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points at which the expensive function has previously been evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers, from which many candidate points are generated by random perturbations. Based on the surrogate approximation, the best candidate point is then selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are made tabu for a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
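
The selection step described above can be paraphrased in code. The sketch below is a simplified reading of the abstract, not the authors' SOP implementation: the surrogate model, perturbation scale, and tabu bookkeeping are stubbed out or omitted.

```python
# Simplified reading of the abstract, not the authors' SOP code.
import numpy as np

def select_centers(X, f_vals, P):
    """Non-dominated sorting on two objectives: the expensive function value
    (minimized) and the negated distance to the nearest evaluated point
    (minimized, i.e. distance maximized, to encourage exploration)."""
    n = len(X)
    dist = np.array([np.min(np.linalg.norm(np.delete(X, i, axis=0) - X[i], axis=1))
                     for i in range(n)])
    objs = np.column_stack([f_vals, -dist])
    remaining, centers = list(range(n)), []
    while remaining and len(centers) < P:   # peel off fronts until P centers found
        front = [i for i in remaining
                 if not any((objs[j] <= objs[i]).all() and (objs[j] < objs[i]).any()
                            for j in remaining if j != i)]
        centers.extend(front)
        remaining = [i for i in remaining if i not in front]
    return centers[:P]

def candidates_from_center(x, radius, n_cand, rng):
    """Candidate points: random perturbations around a selected center."""
    return x + rng.normal(scale=radius, size=(n_cand, x.size))
```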

Relevance:

90.00%

Abstract:

Multi-objective optimization algorithms aim at finding Pareto-optimal solutions. Recovering the Pareto front or Pareto set from a limited number of function evaluations is a challenging problem. A popular approach in the case of expensive-to-evaluate functions is to appeal to metamodels. Kriging has been shown to be efficient as a basis for sequential multi-objective optimization, notably through infill sampling criteria balancing exploitation and exploration, such as the Expected Hypervolume Improvement. Here we consider Kriging metamodels not only for selecting new points, but as a tool for estimating the whole Pareto front and quantifying how much uncertainty remains on it at any stage of a Kriging-based multi-objective optimization algorithm. Our approach relies on the Gaussian process interpretation of Kriging and builds on conditional simulations. Using concepts from random set theory, we propose to adapt the Vorob’ev expectation and deviation to capture the variability of the set of non-dominated points. Numerical experiments illustrate the potential of the proposed workflow, and examples show how Gaussian process simulations and the estimated Vorob’ev deviation can be used to monitor the ability of Kriging-based multi-objective optimization algorithms to accurately learn the Pareto front.
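
A rough sketch of the random-set summary described here, assuming the conditional GP simulation step has already produced a collection of simulated Pareto fronts: the coverage (attainment) function is estimated on a grid of objective vectors and thresholded at the Vorob’ev level, chosen by bisection so that the size of the level set matches the expected size of the attained set. The grid handling and names are illustrative, not the authors' code.

```python
# Sketch under assumptions: `fronts` is a list of simulated Pareto fronts
# (arrays of objective vectors) from conditional GP simulations, and `grid`
# is a discretization of the objective space. Both objectives are minimized.
import numpy as np

def attained(front, grid):
    """For each grid point y: is y dominated by (attained from) the front?"""
    return np.array([any((p <= y).all() for p in front) for y in grid])

def vorobev_expectation(fronts, grid):
    """Level set of the coverage function whose size matches the expected
    size of the attained set; the level is found by bisection."""
    cover = np.mean([attained(f, grid) for f in fronts], axis=0)
    target = cover.sum()              # expected number of attained grid cells
    lo, hi = 0.0, 1.0
    for _ in range(30):
        mid = 0.5 * (lo + hi)
        if (cover >= mid).sum() > target:
            lo = mid                  # level set still too large: raise level
        else:
            hi = mid
    return cover >= 0.5 * (lo + hi)   # boolean mask over the grid
```

The Vorob’ev deviation would then be the expected size of the symmetric difference between this expectation set and the simulated attained sets, which is what the paper proposes to monitor.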

Relevance:

90.00%

Abstract:

SOMS is a general surrogate-based multistart algorithm that is used in combination with any local optimizer to find global optima for computationally expensive functions with multiple local minima. SOMS differs from previous multistart methods in that a surrogate approximation is used by the multistart algorithm to help reduce the number of function evaluations necessary to identify the most promising points from which to start each nonlinear programming local search. SOMS is compared numerically with four well-known methods, namely Multi-Level Single Linkage (MLSL), MATLAB’s MultiStart, MATLAB’s GlobalSearch, and GLOBAL. In addition, we propose a class of wavy test functions that mimic the wavy nature of objective functions arising in many black-box simulations. Extensive comparisons of the algorithms are carried out on the wavy test functions and on earlier standard global-optimization test functions, for a total of 19 test problems. The numerical results indicate that SOMS performs favorably in comparison to the alternative methods and does especially well on wavy functions when the number of function evaluations allowed is limited.
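
A minimal sketch of the core idea, surrogate-filtered start selection, using scipy's RBFInterpolator as a stand-in surrogate and scipy.optimize.minimize as the local optimizer; this is not the authors' SOMS code, and the candidate-generation and restart logic of MLSL-style methods is omitted.

```python
# Stand-in sketch using scipy, not the authors' SOMS implementation.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def surrogate_multistart(f, X_eval, y_eval, candidates, n_starts, bounds):
    """Rank candidate start points with a cheap RBF surrogate, then run the
    expensive local searches only from the most promising ones."""
    surrogate = RBFInterpolator(X_eval, y_eval)          # fit to past evaluations
    predicted = surrogate(candidates)                    # cheap to evaluate
    starts = candidates[np.argsort(predicted)[:n_starts]]
    results = [minimize(f, x0, bounds=bounds) for x0 in starts]
    return min(results, key=lambda r: r.fun)             # best local optimum found
```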

Relevance:

90.00%

Abstract:

This work deals with parallel optimization of expensive objective functions which are modelled as sample realizations of Gaussian processes. The study is formalized as a Bayesian optimization problem, or continuous multi-armed bandit problem, where a batch of q > 0 arms is pulled in parallel at each iteration. Several algorithms have been developed for choosing batches by trading off exploitation and exploration. As of today, the maximum Expected Improvement (EI) and Upper Confidence Bound (UCB) selection rules appear to be the most prominent approaches for batch selection. Here, we build upon recent work on the multipoint Expected Improvement criterion, for which an analytic expansion relying on Tallis’ formula was recently established. Since the computational burden of this selection rule remains an issue in applications, we derive a closed-form expression for the gradient of the multipoint Expected Improvement, which facilitates its maximization using gradient-based ascent algorithms. Substantial computational savings are demonstrated in applications. In addition, our algorithms are tested numerically and compared to state-of-the-art UCB-based batch-sequential algorithms. Combining starting designs based on UCB with gradient-based EI local optimization appears to be a sound option for batch design in distributed Gaussian Process optimization.
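
The closed-form gradient derived in the paper relies on Tallis’ formula and is not reproduced here; as a hedged stand-in, the sketch below estimates the multipoint (batch) Expected Improvement by Monte Carlo from a given joint GP posterior at the q batch points. The names mu, cov, and best_f are assumed inputs, not the paper's API.

```python
# Monte Carlo stand-in for the multipoint (q-point) Expected Improvement;
# the paper instead uses an analytic expansion and its closed-form gradient.
import numpy as np

def q_ei_mc(mu, cov, best_f, n_samples=10_000, seed=0):
    """E[max(best_f - min_i Y_i, 0)] with Y ~ N(mu, cov), the joint GP
    posterior at the q batch points (minimization convention)."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mu, cov, size=n_samples)  # (n_samples, q)
    improvement = np.maximum(best_f - samples.min(axis=1), 0.0)
    return improvement.mean()
```

A finite-difference gradient of such a noisy estimator is unreliable, which is precisely why a closed-form gradient is valuable for batch design.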

Relevance:

80.00%

Abstract:

Cloud Computing enables provisioning and distribution of highly scalable services in a reliable, on-demand and sustainable manner. However, the objectives of managing enterprise distributed applications in cloud environments under Service Level Agreement (SLA) constraints lead to challenges for maintaining optimal resource control. Furthermore, conflicting objectives in the management of cloud infrastructure and distributed applications can lead to violations of SLAs and inefficient use of hardware and software resources. This dissertation focuses on how SLAs can be used as an input to the cloud management system (CMS), increasing the efficiency of resource allocation as well as of infrastructure scaling. First, we present an extended SLA semantic model for modelling complex service dependencies in distributed applications and for enabling automated cloud infrastructure management operations. Second, we describe a multi-objective VM allocation algorithm for optimized resource allocation in infrastructure clouds. Third, we describe a method for discovering relations between the performance indicators of services belonging to distributed applications, and for using these relations to build scaling rules that a CMS can apply to the automated management of VMs. Fourth, we introduce two novel VM-scaling algorithms, which optimally scale systems composed of VMs based on given SLA performance constraints. All of the presented research was implemented and tested using enterprise distributed applications.
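
As an illustration of what an SLA-driven scaling rule of the kind described might look like (the thresholds, metric names, and single-metric trigger are invented placeholders, not the dissertation's algorithms):

```python
# Invented placeholder rule, not the dissertation's algorithms.
def scale_decision(avg_response_ms, sla_limit_ms, n_vms, min_vms=1, max_vms=20):
    """Scale out when measured latency approaches the SLA bound; scale in
    when there is comfortable headroom."""
    if avg_response_ms > 0.9 * sla_limit_ms and n_vms < max_vms:
        return n_vms + 1      # add a VM before the SLA is violated
    if avg_response_ms < 0.5 * sla_limit_ms and n_vms > min_vms:
        return n_vms - 1      # release a VM under light load
    return n_vms
```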

Relevance:

80.00%

Abstract:

These guidelines were developed in the context of working block 3 of the DESIRE project. They address the facilitators at the 18 DESIRE study sites and support them in conducting stakeholder workshops aimed at selecting and deciding on the mitigation strategies to be implemented in each study site context. The decision-making process is supported by a multi-objective decision support system (MODSS) software called 'Facilitator'.

Relevance:

80.00%

Abstract:

Soil degradation is widespread in the Ethiopian Highlands. Its negative impacts on soil productivity contribute to the extreme poverty of the rural population. Soil conservation is promoted as a means of reducing soil erosion; however, it is a costly investment for small-scale farming households. The present study assesses whether selected mechanical Soil and Water Conservation (SWC) technologies are profitable from a farmer’s point of view, using a financial Cost-Benefit Analysis (CBA) supplemented by an evaluation of aspects of the economic and institutional environment. Whether soil conservation is profitable for a farmer depends on a broad range of factors from the ecological, economic, political, institutional and socio-cultural spheres, and also on the technology and the prevailing farming system. Because these factors are closely interlinked, changing or influencing just one of them is often not sufficient to make SWC profitable. Several recommendations are formulated for improving the profitability of SWC investments from a farmer’s point of view. Because the reasons for unsustainable resource use are manifold and highly interlinked, only a multi-stakeholder, multi-level and multi-objective approach is likely to offer solutions that adequately address the underlying problems.
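
The financial CBA criterion behind the study's question can be illustrated with a net-present-value calculation; the cash flows and discount rate below are made-up placeholders, not the study's data.

```python
# Made-up numbers for illustration; not the study's data.
def npv(benefits, costs, discount_rate):
    """Net present value of a conservation investment (year 0 first)."""
    return sum((b - c) / (1 + discount_rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))

# High up-front construction cost, slowly growing yield benefit, 10% discount:
print(npv(benefits=[0, 30, 50, 70, 70], costs=[300, 10, 10, 10, 10],
          discount_rate=0.10))  # about -163: not profitable for the farmer
```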

Relevance:

80.00%

Abstract:

Advancements in cloud computing have enabled the proliferation of distributed applications, which require the management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means for specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources to use to ensure that the performance requirements of all her/his applications are met. She/he is also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure provider is interested in optimally provisioning the virtual resources onto the available physical infrastructure so that her/his operational costs are minimized, while maximizing the performance of tenants’ applications. Motivated by the complexities associated with the management and scaling of distributed applications while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machines (VMs) bound to application services. We describe several algorithms for adapting the number of VMs allocated to a distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for the dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs the allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
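
A hypothetical sketch of the genetic-algorithm encoding suggested by the abstract: a chromosome assigns each VM to a host, fitness aggregates the competing objectives (here reusing the allocation_objectives helper sketched for the first result above), and mutation reassigns VMs at random. Weights and rates are illustrative, not the thesis's parameters.

```python
# Hypothetical GA encoding; weights and rates are illustrative.
import random

def fitness(chromosome, hosts, vm_demand, w=(1.0, 0.5, 0.2)):
    """Lower is better: weighted sum of the three allocation objectives
    (SLA violations, energy, waste) for a VM -> host assignment vector."""
    sla, energy, waste = allocation_objectives(chromosome, hosts, vm_demand)
    return w[0] * sla + w[1] * energy + w[2] * waste

def mutate(chromosome, n_hosts, rate=0.1):
    """Reassign each VM to a random host with small probability."""
    return [random.randrange(n_hosts) if random.random() < rate else h
            for h in chromosome]

def crossover(parent_a, parent_b):
    """One-point crossover over the VM index."""
    cut = random.randrange(1, len(parent_a))
    return parent_a[:cut] + parent_b[cut:]
```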

Relevance:

40.00%

Abstract:

Peatlands are widely exploited archives of paleoenvironmental change. We developed and compared multiple transfer functions to infer peatland depth to the water table (DWT) and pH based on testate amoeba (percentages, or presence/absence), bryophyte presence/absence, and vascular plant presence/absence data from sub-alpine peatlands in the SE Swiss Alps, in order to 1) compare the performance of single-proxy vs. multi-proxy models and 2) assess the performance of presence/absence models. Bootstrapping cross-validation showed that the best-performing single-proxy transfer functions for both DWT and pH were those based on bryophytes. The best-performing transfer functions overall were, for DWT, those based on combined testate amoeba percentages, bryophytes and vascular plants, and, for pH, those based on testate amoebae and bryophytes. The comparison of DWT and pH inferred from testate amoeba percentages and presence/absence data showed similar general patterns but differences in the magnitude and timing of some shifts. These results suggest new directions for paleoenvironmental research, 1) showing that it is possible to build well-performing transfer functions using presence/absence data, although with some loss of accuracy, and 2) supporting the idea that multi-proxy inference models may improve paleoecological reconstructions. The performance of multi-proxy and single-proxy transfer functions should be further compared on paleoecological data.
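
A weighted-averaging (WA) transfer function, the classic model family in this setting, can serve as a sketch of how such inferences work; the study's actual models and any deshrinking step are not reproduced here, and presence/absence models correspond to replacing abundances with 0/1 indicators.

```python
# Sketch of weighted averaging (WA); the study's models and any deshrinking
# step are not reproduced. `abundance` is (sites x taxa), `env` is per-site.
import numpy as np

def wa_optima(abundance, env):
    """Taxon optima: abundance-weighted mean of the environmental variable
    (e.g. depth to water table) over the training sites."""
    return (abundance.T @ env) / abundance.sum(axis=0)

def wa_infer(abundance_new, optima):
    """Inferred environment for new samples: weighted mean of taxon optima."""
    return (abundance_new @ optima) / abundance_new.sum(axis=1)

# A presence/absence model is the 0/1 special case of the same formulas:
# wa_infer((abundance_new > 0).astype(float), optima)
```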

Relevance:

40.00%

Abstract:

The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades, followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase of 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential (AGWP), given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10⁻¹⁵ yr W m⁻² per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10⁻¹⁵ yr W m⁻² per kg-CO2. Estimates of the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment Reports and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions compared to present day, and lower for smaller pulses than for larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices. Although choices of pulse size, background concentration, and model lead to uncertainties, the most important and subjective choice in determining the AGWP of CO2 and the GWP is the time horizon.
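
The AGWP definition quoted above (time-integrated CO2 response multiplied by the radiative efficiency) is easy to sketch with the usual sum-of-exponentials impulse-response fit; the coefficients below are illustrative placeholders, not the paper's fitted values.

```python
# Illustrative coefficients, not the paper's fitted values.
import numpy as np

def irf_co2(t, a=(0.22, 0.22, 0.28, 0.28), tau=(np.inf, 394.0, 36.5, 4.3)):
    """Fraction of a CO2 pulse remaining airborne t years after emission,
    as a sum of exponentials (the infinite timescale is the long tail)."""
    return sum(ai * np.exp(-t / ti) for ai, ti in zip(a, tau))

def agwp(horizon_yr, radiative_efficiency, dt=0.1):
    """AGWP: time-integrated airborne fraction times radiative efficiency
    (trapezoidal integration up to the chosen time horizon)."""
    t = np.arange(0.0, horizon_yr + dt, dt)
    y = irf_co2(t)
    return float(np.sum((y[:-1] + y[1:]) * dt / 2.0)) * radiative_efficiency
```

The abstract's closing point is visible directly in this form: the result scales with the chosen horizon_yr, which is why the time horizon is the dominant subjective choice.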

Relevance:

30.00%

Abstract:

The objective of this study was to explore whether relevant aspects of functioning and disability affected in multiple sclerosis (MS), as well as environmental factors relevant to persons with MS, can be described based on the International Classification of Functioning, Disability and Health (ICF). The specific aim was to identify the most relevant 'Body functions', 'Body structures', 'Activities and participation', and 'Environmental factors' in patients with MS using the ICF. Additionally, different MS forms were compared with respect to the identified problems. A multi-centre study was conducted using an empirical cross-sectional design. Data from 205 individuals with MS were collected in rehabilitation centres: disease-related data, socio-demographic data, single interviews based on the Extended ICF Checklist, and a patient questionnaire including ratings of general health and functioning status, the Beck Depression Inventory II (BDI-II) and the Comorbidity Questionnaire (SCQ). The 129 ICF categories identified represent a comprehensive classification of functioning in MS from the clinical perspective. Differences between MS forms were observed for several ICF categories, the EDSS, and general health and functioning status, but not for the BDI-II and SCQ. The study showed that the ICF can be used to describe the spectrum of functioning and disability affected in MS, as well as the environmental factors relevant to persons with MS.

Relevance:

30.00%

Abstract:

The objective of this study was to investigate whether it is possible to pool diffusion spectrum imaging (DSI) data from four different scanners located at three different sites. Two of the scanners had an identical configuration, whereas the other two did not. To measure the variability, we extracted three scalar maps (ADC, FA and GFA) from the DSI data and used both a region-based and a tract-based analysis. Additionally, a phantom study was performed to rule out potential factors arising from scanner performance, in case systematic bias occurred in the subject study. This work was split into three experiments: intra-scanner reproducibility, reproducibility with the twin-scanner setting, and reproducibility with other configurations. Overall, for the intra-scanner and twin-scanner experiments, the coefficient of variation (CV) was in the range of 1%-4.2% for the region-based analysis and below 3% for almost every bundle in the tract-based analysis. The uncinate fasciculus showed the worst reproducibility, especially for FA and GFA values (CV 3.7-6%). For the GFA and FA maps, an ICC value of 0.7 or above was observed in almost all regions/tracts. The last experiment showed a very high similarity between the outcomes of the two scanners with identical settings; however, this was not the case for the other two imagers. Given that the overall variation in our study is low for the imagers with identical settings, our findings support the feasibility of cross-site pooling of DSI data from identical scanners.
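
The reproducibility metric used throughout is the coefficient of variation across repeated scans; a minimal sketch with invented values:

```python
# Values invented for illustration.
import numpy as np

def coefficient_of_variation(values):
    """CV in percent: spread of a scalar map value (e.g. FA in one region)
    across repeated scan sessions, relative to its mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# e.g. FA in the uncinate fasciculus over four sessions:
print(coefficient_of_variation([0.42, 0.44, 0.41, 0.45]))  # ~4.2% CV
```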