917 results for Chance-constrained optimisation
Abstract:
Energy storage is a potential alternative to conventional network reinforcement of the low voltage (LV) distribution network to ensure the grid's infrastructure remains within its operating constraints. This paper presents a study on the control of such storage devices, owned by distribution network operators. A deterministic model predictive control (MPC) controller and a stochastic receding horizon controller (SRHC) are presented, where the objective is to achieve the greatest peak reduction in demand, for a given storage device specification, taking into account the high level of uncertainty in the prediction of LV demand. The algorithms presented in this paper are compared to a standard set-point controller and benchmarked against a control algorithm with a perfect forecast. A specific case study, using storage on the LV network, is presented, and the results of each algorithm are compared. A comprehensive analysis is then carried out simulating a large number of LV networks of varying numbers of households. The results show that the performance of each algorithm is dependent on the number of aggregated households. However, on a typical aggregation, the novel SRHC algorithm presented in this paper is shown to outperform each of the comparable storage control techniques.
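As a rough illustration of the receding-horizon idea behind the deterministic MPC controller (not the paper's algorithm, whose device specification and forecast model are not given here), the sketch below solves one horizon of a peak-shaving problem for a battery with hypothetical capacity, power limit and demand forecast, using cvxpy, and applies only the first control action.

```python
# Minimal receding-horizon peak-shaving sketch; all parameters are illustrative.
import cvxpy as cp
import numpy as np

def mpc_peak_shave(forecast_kw, soc_kwh, cap_kwh=50.0, p_max_kw=25.0, dt_h=0.5):
    """Choose a charge/discharge schedule that minimises the peak of the
    net demand (forecast minus battery discharge) over the horizon, then
    return only the first action, as in receding-horizon control."""
    H = len(forecast_kw)
    p = cp.Variable(H)                       # battery power, positive = discharge
    soc = soc_kwh - dt_h * cp.cumsum(p)      # state-of-charge trajectory
    constraints = [cp.abs(p) <= p_max_kw, soc >= 0, soc <= cap_kwh]
    cp.Problem(cp.Minimize(cp.max(forecast_kw - p)), constraints).solve()
    return p.value[0]

demand_forecast = np.array([12.0, 18.0, 30.0, 26.0, 15.0, 10.0])  # hypothetical kW
print(mpc_peak_shave(demand_forecast, soc_kwh=20.0))
```

The stochastic SRHC variant would replace the single forecast with a set of demand scenarios and optimise against them; that extension is not shown here.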
Abstract:
Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.
Abstract:
Consultation on the Reform of the Planning System in Northern Ireland commenced on 6 July 2009 with the publication of the long-awaited proposals paper: 'Reform of the Planning System in Northern Ireland: Your chance to influence change'. A 12-week consultation period followed, during which a series of consultation roadshow events was undertaken. This report is an account of that strand of the reform consultation and the discussions that took place at the roadshows during a three-week period in September 2009. The roadshow events formed the central part of a process of encouraging engagement with, and response to, the Reform Proposals before the closing date of 2 October 2009. They were organised and facilitated by a team of event managers and independent planners who, together with key Planning Service personnel, attended a mixture of day and evening events in each of the eleven new council areas to hear the views and opinions of those who came along. Aside from being publicly advertised, over 1,500 invitations (written and e-invites) were issued to a wide range of sectors, including the business community, environmentalists, councils, community and voluntary groups and other organisations, and 1,000 fliers were distributed to libraries, leisure centres, council offices and civic centres. In total, almost 500 people took up the invitation and came along to one or more of the events.
Abstract:
Incomplete understanding of three aspects of the climate system—equilibrium climate sensitivity, rate of ocean heat uptake and historical aerosol forcing—and the physical processes underlying them lead to uncertainties in our assessment of the global-mean temperature evolution in the twenty-first century1,2. Explorations of these uncertainties have so far relied on scaling approaches3,4, large ensembles of simplified climate models1,2, or small ensembles of complex coupled atmosphere–ocean general circulation models5,6, which under-represent uncertainties in key climate system properties derived from independent sources7–9. Here we present results from a multi-thousand-member perturbed-physics ensemble of transient coupled atmosphere–ocean general circulation model simulations. We find that model versions that reproduce observed surface temperature changes over the past 50 years show global-mean temperature increases of 1.4–3 K by 2050, relative to 1961–1990, under a mid-range forcing scenario. This range of warming is broadly consistent with the expert assessment provided by the Intergovernmental Panel on Climate Change Fourth Assessment Report10, but extends towards larger warming than observed in ensembles-of-opportunity5 typically used for climate impact assessments. From our simulations, we conclude that warming by the middle of the twenty-first century that is stronger than earlier estimates is consistent with recent observed temperature changes and a mid-range ‘no mitigation’ scenario for greenhouse-gas emissions.
Abstract:
In recent years several methodologies have been developed to combine and interpret ensembles of climate models with the aim of quantifying uncertainties in climate projections. Constrained climate model forecasts have been generated by combining various choices of metrics used to weight individual ensemble members, with diverse approaches to sampling the ensemble. The forecasts obtained are often significantly different, even when based on the same model output. Therefore, a climate model forecast classification system can serve two roles: to provide a way for forecast producers to self-classify their forecasts; and to provide information on the methodological assumptions underlying the forecast generation and its uncertainty when forecasts are used for impact studies. In this review we propose a possible classification system based on choices of metrics and sampling strategies. We illustrate the impact of some of the possible choices on the uncertainty quantification of large-scale projections of temperature and precipitation changes, and briefly discuss possible connections between climate forecast uncertainty quantification and decision-making approaches in the climate change context.
Abstract:
The functional advantages of probiotics combined with the interesting composition of oat were considered as an alternative to dairy products. In this study, fermentation of oat milk with Lactobacillus reuteri and Streptococcus thermophilus was analysed to develop a new probiotic product. A central composite design with response surface methodology was used to analyse the effect of different factors (glucose, fructose, inulin and starters) on the probiotic population in the product. The optimised formulation was characterised throughout storage at 4 ℃ in terms of pH, acidity, β-glucan and oligosaccharide contents, colour and rheological behaviour. All formulations studied were adequate to produce fermented foods, and the minimum dose of each factor was considered optimal. The selected formulation allowed starter survival above 10⁷ cfu/ml, the level required to be considered a functional food, and this was maintained over the 28 days monitored. β-glucans remained in the final product with a positive effect on viscosity. Therefore, a new probiotic non-dairy milk was successfully developed in which high probiotic survival was assured throughout a typical yoghurt-like shelf life.
Abstract:
We investigated the potential of soil moisture and nutrient amendments to enhance the biodegradation of oil in soils from an ecologically unique semi-arid island. This was achieved using a series of controlled laboratory incubations in which moisture or nutrient levels were experimentally manipulated. Respired CO2 increased sharply with moisture amendment, reflecting the severe moisture limitation of these porous and semi-arid soils. The greatest levels of CO2 respiration were generally obtained at a soil pore water saturation of 50–70%. Biodegradation in these nutrient-poor soils was also promoted by the moderate addition of a nitrogen fertiliser. The increase in biodegradation was greater at the lowest amendment rate (100 mg N kg−1 soil) than at the higher levels (500 or 1,000 mg N kg−1 soil), suggesting that the higher application rates may introduce N toxicity. Addition of phosphorus alone had little effect, but a combined 500 mg N and 200 mg P kg−1 soil amendment led to a synergistic increase in CO2 respiration (3.0×), suggesting P can limit the biodegradation of hydrocarbons following exogenous N amendment.
Abstract:
Periocular recognition has recently become an active topic in biometrics. Typically it uses 2D image data of the periocular region. This paper is the first description of combining 3D shape structure with 2D texture. A simple and effective technique using iterative closest point (ICP) was applied for 3D periocular region matching. It proved its strength for relatively unconstrained eye region capture, and does not require any training. Local binary patterns (LBP) were applied for 2D image based periocular matching. The two modalities were combined at the score-level. This approach was evaluated using the Bosphorus 3D face database, which contains large variations in facial expressions, head poses and occlusions. The rank-1 accuracy achieved from the 3D data (80%) was better than that for 2D (58%), and the best accuracy (83%) was achieved by fusing the two types of data. This suggests that significant improvements to periocular recognition systems could be achieved using the 3D structure information that is now available from small and inexpensive sensors.
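The abstract does not detail how the two matchers are combined at the score level; a common and simple scheme is min-max normalisation followed by a weighted sum, sketched below with purely hypothetical scores and weight.

```python
# Hedged sketch of score-level fusion (min-max normalisation + weighted sum).
import numpy as np

def minmax(scores):
    """Normalise comparison scores to the [0, 1] range."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def fuse(scores_3d, scores_2d, w=0.6):
    """Weighted-sum fusion of 3D (e.g. ICP-based) and 2D (e.g. LBP-based)
    similarity scores; the weight w is a hypothetical value that would be
    tuned on a development set."""
    return w * minmax(scores_3d) + (1 - w) * minmax(scores_2d)

icp_scores = [0.71, 0.40, 0.55, 0.90, 0.62]    # hypothetical 3D similarities
lbp_scores = [0.30, 0.25, 0.45, 0.80, 0.50]    # hypothetical 2D similarities
print(np.argmax(fuse(icp_scores, lbp_scores)))  # rank-1 candidate index
```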
Abstract:
In this paper, we develop a novel constrained recursive least squares algorithm for adaptively combining a set of given multiple models. With data available in an online fashion, the linear combination coefficients of the submodels are adapted via the proposed algorithm. We propose to minimize the mean square error with a forgetting factor, and apply a sum-to-one constraint to the combination parameters. Moreover, an l1-norm constraint on the combination parameters is also applied with the aim of achieving sparsity over the multiple models, so that only a subset of models may be selected into the final model. A weighted l2-norm is then applied as an approximation to the l1-norm term. As such, at each time step a closed-form solution for the model combination parameters is available. The contribution of this paper is to derive the proposed constrained recursive least squares algorithm, which is computationally efficient by exploiting matrix theory. The effectiveness of the approach has been demonstrated using both simulated and real time series examples.
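A minimal sketch of the basic ingredients named in the abstract (recursive least squares with a forgetting factor plus a sum-to-one constraint on the combination weights) is given below; it omits the paper's l1-norm/weighted l2-norm sparsity term, and the forgetting factor and initialisation constant are illustrative.

```python
import numpy as np

def rls_sum_to_one(X, y, lam=0.98, delta=1e3):
    """RLS with forgetting factor lam, followed at each step by a Euclidean
    projection of the combination weights onto the sum-to-one hyperplane.
    Not the paper's full algorithm (no sparsity term); lam and delta are
    illustrative choices."""
    n = X.shape[1]
    w = np.full(n, 1.0 / n)            # combination weights
    P = delta * np.eye(n)              # inverse correlation matrix
    for x_t, y_t in zip(X, y):
        k = P @ x_t / (lam + x_t @ P @ x_t)        # gain vector
        w = w + k * (y_t - x_t @ w)                # RLS update
        P = (P - np.outer(k, x_t @ P)) / lam       # covariance update
        w = w + (1.0 - w.sum()) / n                # project onto sum(w) = 1
    return w

# Example: combine three submodel predictions (columns of X) to track y
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([0.2, 0.5, 0.3]) + 0.01 * rng.normal(size=200)
print(rls_sum_to_one(X, y))
```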
Abstract:
The constrained compartmentalized knapsack problem can be seen as an extension of the constrained knapsack problem in which the items are grouped into different classes, so that the overall knapsack has to be divided into compartments and each compartment is loaded with items from the same class. Moreover, building a compartment incurs a fixed cost and a fixed loss of capacity in the original knapsack, and the compartments are lower and upper bounded. The objective is to maximize the total value of the items loaded in the overall knapsack minus the cost of the compartments. This problem has been formulated as an integer non-linear program, and in this paper we reformulate the non-linear model as an integer linear master problem with a large number of variables. Some heuristics based on the solution of the restricted master problem are investigated. A new and more compact integer linear model is also presented, which can be solved by a branch-and-bound commercial solver that found most of the optimal solutions for the constrained compartmentalized knapsack problem. On the other hand, the heuristics provide good solutions with low computational effort. (C) 2011 Elsevier B.V. All rights reserved.
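To make the problem statement concrete, the sketch below writes a tiny instance of a compartmentalized knapsack as an integer linear program with PuLP; the data, bounds and costs are invented for illustration, and the model is not the paper's master-problem reformulation.

```python
# Toy compartmentalized knapsack as an ILP; all data are hypothetical.
import pulp

capacity = 20
classes = {"A": [(6, 3), (8, 5)], "B": [(5, 4), (7, 6)]}  # (value, weight) per item
fixed_cost, capacity_loss = 2, 1          # cost / capacity lost per open compartment
comp_min, comp_max = 2, 10                # lower / upper bound on a compartment load

prob = pulp.LpProblem("compartmentalised_knapsack", pulp.LpMaximize)
open_c = {c: pulp.LpVariable(f"open_{c}", cat="Binary") for c in classes}
take = {(c, i): pulp.LpVariable(f"take_{c}_{i}", cat="Binary")
        for c in classes for i in range(len(classes[c]))}

# Objective: total item value minus the fixed cost of open compartments
prob += (pulp.lpSum(classes[c][i][0] * take[c, i] for c, i in take)
         - fixed_cost * pulp.lpSum(open_c.values()))

# Each compartment load is bounded and only allowed if the compartment is open
for c in classes:
    load = pulp.lpSum(classes[c][i][1] * take[c, i] for i in range(len(classes[c])))
    prob += load <= comp_max * open_c[c]
    prob += load >= comp_min * open_c[c]

# Overall capacity, reduced by the capacity lost to each open compartment
prob += (pulp.lpSum(classes[c][i][1] * take[c, i] for c, i in take)
         + capacity_loss * pulp.lpSum(open_c.values())) <= capacity

prob.solve()
print(pulp.value(prob.objective))
```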
Abstract:
A method for linearly constrained optimization which modifies and generalizes recent box-constraint optimization algorithms is introduced. The new algorithm is based on a relaxed form of Spectral Projected Gradient iterations. Intercalated with these projected steps, internal iterations restricted to faces of the polytope are performed, which enhance the efficiency of the algorithm. Convergence proofs are given, and numerical experiments are included and discussed. Software supporting this paper is available through the Tango Project web page: http://www.ime.usp.br/~egbirgin/tango/.
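For readers unfamiliar with the building block, the following is a bare-bones spectral projected gradient loop for box constraints (Barzilai-Borwein step plus projection), without the relaxation, face iterations or nonmonotone line search that the paper's algorithm adds; the test problem is made up.

```python
import numpy as np

def spg_box(grad, x0, lo, hi, alpha0=1.0, iters=200):
    """Minimal spectral projected gradient sketch for box constraints
    lo <= x <= hi, using the Barzilai-Borwein spectral step length."""
    proj = lambda z: np.clip(z, lo, hi)
    x = proj(np.asarray(x0, dtype=float))
    alpha, g = alpha0, grad(x)
    for _ in range(iters):
        x_new = proj(x - alpha * g)          # projected gradient step
        s, g_new = x_new - x, grad(x_new)
        y = g_new - g
        if s @ y > 1e-12:                    # spectral (BB) step update
            alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

# Example: minimise ||Ax - b||^2 over the box [0, 1]^n (hypothetical data)
rng = np.random.default_rng(0)
A, b = rng.normal(size=(8, 5)), rng.normal(size=8)
print(spg_box(lambda x: 2 * A.T @ (A @ x - b), np.zeros(5), 0.0, 1.0))
```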
Abstract:
In this work we report the synthesis and evaluation of the analgesic, anti-inflammatory, and platelet anti-aggregating properties of new 3-(arylideneamino)-2-methyl-6,7-methylenedioxy-quinazolin-4(3H)-one derivatives (3a-j), designed as conformationally constrained analogues of the analgesic 1,3-benzodioxolyl-N-acylhydrazones (1) previously developed at LASSBio. The target compounds were synthesized in very good yields exploiting the abundant Brazilian natural product safrole (2) as starting material. The pharmacological assays led us to identify compounds LASSBio-1240 (3b) and LASSBio-1272 (3d) as new analgesic prototypes, presenting an antinociceptive profile more potent and effective than dipyrone and indomethacin, used respectively as standards in the AcOH-induced abdominal constriction assay and in the formalin test. These results confirm the success of the conformational restriction strategy for the identification of novel cyclic N-acylhydrazone analogues with an optimized analgesic profile. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Global optimization seeks a minimum or maximum of a multimodal function over a discrete or continuous domain. In this paper, we propose a hybrid heuristic, based on the CGRASP and GENCAN methods, for finding approximate solutions to continuous global optimization problems subject to box constraints. Experimental results illustrate the relative effectiveness of CGRASP-GENCAN on a set of benchmark multimodal test functions.
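CGRASP-GENCAN itself is not reproduced here; as a generic illustration of the box-constrained global optimization setting it targets, the sketch below runs a plain multistart scheme (random starts plus a bounded local solver) on a standard multimodal test function.

```python
# Generic multistart sketch for box-constrained global optimisation
# (illustrative only; this is not the CGRASP-GENCAN hybrid).
import numpy as np
from scipy.optimize import minimize

def multistart_box(f, bounds, starts=20, seed=0):
    """Sample random starting points inside the box and run a bounded
    local solver from each, keeping the best local minimum found."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    best = None
    for _ in range(starts):
        x0 = rng.uniform(lo, hi)
        res = minimize(f, x0, method="L-BFGS-B", bounds=bounds)
        if best is None or res.fun < best.fun:
            best = res
    return best

# Example: the multimodal Rastrigin function on the box [-5.12, 5.12]^2
rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
print(multistart_box(rastrigin, [(-5.12, 5.12)] * 2).x)
```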
Abstract:
A Nonlinear Programming algorithm that converges to second-order stationary points is introduced in this paper. The main tool is a second-order negative-curvature method for box-constrained minimization of a certain class of functions that do not possess continuous second derivatives. This method is used to define an Augmented Lagrangian algorithm of PHR (Powell-Hestenes-Rockafellar) type. Convergence proofs under weak constraint qualifications are given. Numerical examples are exhibited showing that the new method converges to second-order stationary points in situations in which first-order methods fail.
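The sketch below shows the PHR (Powell-Hestenes-Rockafellar) augmented Lagrangian update for inequality constraints on a toy problem; it uses a first-order solver for the subproblems rather than the paper's second-order negative-curvature method, and the penalty parameter and toy problem are illustrative.

```python
# Minimal PHR augmented Lagrangian sketch for constraints g(x) <= 0.
import numpy as np
from scipy.optimize import minimize

def phr_auglag(f, g, x0, rho=10.0, outer=20):
    """Alternate between minimising the PHR augmented Lagrangian in x
    and updating the multipliers lam <- max(0, lam + rho * g(x))."""
    x = np.asarray(x0, dtype=float)
    lam = np.zeros(len(g(x)))
    for _ in range(outer):
        def aug(z):
            gz = np.asarray(g(z))
            return f(z) + (rho / 2.0) * np.sum(np.maximum(0.0, gz + lam / rho) ** 2)
        x = minimize(aug, x, method="L-BFGS-B").x      # subproblem (first-order here)
        lam = np.maximum(0.0, lam + rho * np.asarray(g(x)))  # multiplier update
    return x

# Example: minimise (x - 2)^2 subject to x <= 1 (optimum at x = 1)
sol = phr_auglag(lambda x: (x[0] - 2) ** 2,
                 lambda x: np.array([x[0] - 1.0]),
                 [0.0])
print(sol)
```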
Abstract:
Given an algorithm A for solving some mathematical problem based on the iterative solution of simpler subproblems, an outer trust-region (OTR) modification of A is the result of adding a trust-region constraint to each subproblem. The trust-region size is adaptively updated according to the behavior of crucial variables. The new subproblems should not be more complex than the original ones, and the convergence properties of the OTR algorithm should be the same as those of algorithm A. In the present work, the OTR approach is exploited in connection with the "greediness phenomenon" of nonlinear programming. Convergence results for an OTR version of an augmented Lagrangian method for nonconvex constrained optimization are proved, and numerical experiments are presented.