299 results for Parameter Optimization
Abstract:
Seismic design of reinforced soil structures involves many uncertainties arising from the backfill soil properties and the tensile strength of the reinforcement, which are not addressed in current design guidelines. This paper highlights the significance of variability in the internal stability assessment of reinforced soil structures. Reliability analysis is applied to estimate the probability of failure, and a pseudo-static approach is used to calculate the tensile strength and length of reinforcement needed to maintain internal stability against tension and pullout failures. A logarithmic spiral failure surface is considered in conjunction with the limit equilibrium method. Two modes of failure, namely tension failure and pullout failure, are considered. The influence of variations in the backfill soil friction angle, the tensile strength of the reinforcement, and the horizontal seismic acceleration on the reliability indices against tension and pullout failure of reinforced earth structures is discussed.
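The reliability computation this abstract describes can be illustrated, in greatly simplified form, by the classical first-order reliability index for a capacity-demand pair. This is a minimal sketch assuming independent, normally distributed resistance and load effect; the moments below are illustrative and not from the paper, which uses a log-spiral limit-equilibrium formulation.

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability index for independent normal resistance R
    and load effect S: beta = mean safety margin / std of the margin."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    """P_f = Phi(-beta), written via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Illustrative (made-up) moments for reinforcement tensile capacity vs. demand
beta = reliability_index(mu_r=60.0, sigma_r=6.0, mu_s=40.0, sigma_s=8.0)
print(round(beta, 3), f"{failure_probability(beta):.3e}")
```

A larger beta corresponds to a smaller failure probability, which is how the abstract's parameter variations (friction angle, reinforcement strength, seismic acceleration) would be compared.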
Abstract:
We consider the two-parameter Sturm–Liouville system $$ -y_1''+q_1y_1=(\lambda r_{11}+\mu r_{12})y_1\quad\text{on }[0,1], $$ with the boundary conditions $$ \frac{y_1'(0)}{y_1(0)}=\cot\alpha_1\quad\text{and}\quad\frac{y_1'(1)}{y_1(1)}=\frac{a_1\lambda+b_1}{c_1\lambda+d_1}, $$ and $$ -y_2''+q_2y_2=(\lambda r_{21}+\mu r_{22})y_2\quad\text{on }[0,1], $$ with the boundary conditions $$ \frac{y_2'(0)}{y_2(0)} =\cot\alpha_2\quad\text{and}\quad\frac{y_2'(1)}{y_2(1)}=\frac{a_2\mu+b_2}{c_2\mu+d_2}, $$ subject to the uniform-left-definite and uniform-ellipticity conditions; where $q_{i}$ and $r_{ij}$ are continuous real-valued functions on $[0,1]$, the angle $\alpha_{i}$ is in $[0,\pi)$, and $a_{i}$, $b_{i}$, $c_{i}$, $d_{i}$ are real numbers with $\delta_{i}=a_{i}d_{i}-b_{i}c_{i}>0$ and $c_{i}\neq0$ for $i,j=1,2$. Results are given on asymptotics, oscillation of eigenfunctions, and the location of eigenvalues.
Abstract:
We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type. The agents report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as rebates to agents. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions. The goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We then identify the number of samples needed for ``near-feasibility'' of the relaxed constraint set. Under some conditions on the valuation function, we show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extensions of the proposed mechanisms to situations in which the valuation functions are not known to the central planner are also discussed.
Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering of information privately held by strategic users, where the utilities are any concave function of the allocations, and where the resource planner is not interested in maximizing revenue, but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism, and then returning as much of the collected money as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of ``near-feasibility'' with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. However, we demonstrate via simulation that, if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
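The constraint-sampling relaxation described above replaces a continuum of linear constraints by a finite random sample, turning a semi-infinite program into an ordinary LP. The toy program below is a hedged stand-in for the rebate-design problem (the objective and constraint family are made up for illustration), solved with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Semi-infinite program: minimize c.x subject to g(theta).x >= 1 for ALL
# theta in [0,1] and x >= 0, where g(theta) = (theta, 1 - theta).
# Constraint sampling replaces "for all theta" by a finite random sample.
rng = np.random.default_rng(0)
thetas = rng.uniform(0.0, 1.0, size=200)

# linprog expects A_ub @ x <= b_ub, so flip the signs of the >= constraints.
A_ub = -np.column_stack([thetas, 1.0 - thetas])
b_ub = -np.ones_like(thetas)
c = np.array([1.0, 1.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)
```

Because only sampled constraints are enforced, the relaxed optimum can slightly undercut the true semi-infinite optimum; the abstract's sample-complexity result bounds how likely and how large such "near-feasibility" violations are.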
Abstract:
The maintenance of a chlorine residual is needed at all points in a distribution system supplied with chlorine as a disinfectant. The propagation and level of chlorine in a distribution system are affected by both bulk and pipe-wall reactions. It is well known that field determination of the wall reaction parameter is difficult. The source strength of chlorine needed to maintain a specified chlorine residual at a target node is also an important parameter. The inverse model presented in the paper determines these water quality parameters, which are associated with different reaction kinetics, either in single pipes or in groups of pipes. The weighted-least-squares method based on the Gauss-Newton minimization technique is used for the estimation of these parameters. The validation and application of the inverse model are illustrated with an example pipe distribution system under steady state. A generalized procedure to handle noisy and bad (abnormal) data is suggested, which can be used to estimate these parameters more accurately. The developed inverse model is useful for water supply agencies to calibrate their water distribution systems and to improve their operational strategies to maintain water quality.
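The weighted-least-squares Gauss-Newton step at the core of such an inverse model can be sketched generically. The snippet below fits a first-order bulk-decay model c(t) = c0·exp(−k·t) to synthetic chlorine measurements; the decay model, data, and starting point are illustrative assumptions, not the paper's full inverse model with wall reactions.

```python
import numpy as np

def gauss_newton(residual, jacobian, p0, weights, iters=20):
    """Weighted Gauss-Newton: iteratively minimize r(p)^T W r(p)."""
    p = np.asarray(p0, float)
    W = np.diag(weights)
    for _ in range(iters):
        r = residual(p)
        J = jacobian(p)
        # Normal equations: (J^T W J) dp = -J^T W r
        dp = np.linalg.solve(J.T @ W @ J, -J.T @ W @ r)
        p = p + dp
    return p

# Toy bulk-decay model c(t) = c0 * exp(-k t); estimate (c0, k) from
# synthetic noiseless measurements (values made up for illustration).
t = np.linspace(0.0, 24.0, 9)          # hours
obs = 1.2 * np.exp(-0.08 * t)          # "measured" chlorine, mg/L

def residual(p):
    return p[0] * np.exp(-p[1] * t) - obs

def jacobian(p):
    e = np.exp(-p[1] * t)
    return np.column_stack([e, -p[0] * t * e])

p_hat = gauss_newton(residual, jacobian, p0=[1.0, 0.05],
                     weights=np.ones_like(t))
print(p_hat)
```

The weights matrix W is where measurement quality enters: the abstract's procedure for noisy and abnormal data amounts to down-weighting or rejecting suspect observations before this step.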
Abstract:
This paper proposes a nonlinear voltage regulator with one tunable parameter for multimachine power systems. Based on output feedback linearization, this regulator can achieve simultaneous voltage regulation and small-signal performance objectives. Conventionally, output feedback linearization has been used for voltage regulator design by taking the infinite bus voltage as reference. Unfortunately, this controller has poor small-signal performance and cannot be applied to multimachine systems without estimation of the equivalent external reactance seen from the generator. This paper proposes a voltage regulator design that redefines the rotor angle at each generator with respect to the secondary voltage of the step-up transformer, instead of a common synchronously rotating reference frame. Using synchronizing and damping torque analysis, we show that the proposed voltage regulator achieves simultaneous voltage regulation and damping performance over a range of system and operating conditions by controlling the relative angle between the generator internal voltage angle delta and the secondary voltage of the step-up transformer. The performance of the proposed voltage regulator is evaluated on a single-machine infinite-bus system and two widely used multimachine test systems.
Abstract:
In this paper, analytical expressions for the optimal Vdd and Vth that minimize energy under a given speed constraint are derived. These expressions are based on the EKV transistor model and are valid in both the strong-inversion and subthreshold regions. The effect of gate leakage on the optimal Vdd and Vth is analyzed. A new gradient-based algorithm for controlling Vdd and Vth based on delay and power monitoring results is proposed. A Vdd-Vth controller that uses the algorithm to dynamically control the supply and threshold voltages of a representative logic block (the sum-of-absolute-differences computation of an MPEG decoder) is designed. Simulation results using 65 nm predictive technology models are given.
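A gradient-based Vdd/Vth controller of the kind described can be sketched with toy power and delay models. The models and constants below are illustrative stand-ins (a quadratic dynamic term, an exponential subthreshold leakage term, and an alpha-power delay model), not the paper's EKV-based expressions; the loop descends the power gradient while a monitored delay constraint rejects moves that violate the speed target.

```python
import math

def power(vdd, vth):
    dynamic = vdd ** 2                          # switching energy ~ C*Vdd^2
    leakage = 0.5 * vdd * math.exp(-vth / 0.1)  # subthreshold leakage
    return dynamic + leakage

def delay(vdd, vth):
    return vdd / (vdd - vth) ** 1.3             # alpha-power delay model

def tune(vdd, vth, d_max, steps=500, lr=1e-3, eps=1e-4):
    """Gradient descent on power; moves that break the monitored delay
    budget (or pinch Vdd too close to Vth) are rejected."""
    for _ in range(steps):
        # Numerical gradient of power w.r.t. (vdd, vth)
        g_vdd = (power(vdd + eps, vth) - power(vdd - eps, vth)) / (2 * eps)
        g_vth = (power(vdd, vth + eps) - power(vdd, vth - eps)) / (2 * eps)
        vdd_n, vth_n = vdd - lr * g_vdd, vth - lr * g_vth
        if vdd_n > vth_n + 0.05 and delay(vdd_n, vth_n) <= d_max:
            vdd, vth = vdd_n, vth_n             # accept speed-feasible moves
    return vdd, vth

vdd0, vth0 = 1.0, 0.3
d_max = delay(vdd0, vth0) * 1.05                # 5% slack on initial speed
vdd, vth = tune(vdd0, vth0, d_max)
print(round(power(vdd, vth), 4), round(delay(vdd, vth), 4))
```

The controller lowers Vdd (cutting dynamic power) and raises Vth (cutting leakage) until the delay monitor indicates the speed constraint is about to bind, which mirrors the monitored-feedback structure the abstract describes.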
Abstract:
A circular array of Piezoelectric Wafer Active Sensors (PWAS) has been employed to detect surface damage, such as corrosion, using Lamb waves. The array consists of a number of small PWASs of 10 mm diameter and 1 mm thickness. The advantages of a circular array are its compact arrangement and large area of coverage for monitoring with a small area of physical access. Corrosion growth is monitored in a laboratory-scale set-up using the PWAS array, and the nature of the reflected and transmitted Lamb wave patterns due to corrosion is investigated. Wavelet time-frequency maps of the sensor signals are employed, and a damage index is plotted against the damage parameters and the varying frequency of the actuation signal (a windowed sine signal). The variation of the wavelet coefficient for different stages of corrosion growth is studied. The wavelet coefficient as a function of time gives insight into the effect of corrosion on the time-frequency scale. We present a method to eliminate the time-scale effect, which helps in easily identifying the signature of damage in the measured signals. The proposed method is useful in determining the approximate location of the corrosion with respect to the locations of three neighboring sensors in the circular array. A cumulative damage index is computed for varying damage sizes, and the results appear promising.
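The damage-index idea can be illustrated with a deliberately simplified, energy-based index comparing a baseline (pristine) sensor record with a current one. This is a stand-in for the paper's wavelet-coefficient-based index; the synthetic windowed-sine echoes below are made up for illustration, with "damage" modeled as attenuation and a small delay of the wave packet.

```python
import numpy as np

def damage_index(baseline, current):
    """Relative change in signal energy between a baseline record and a
    current record (a crude proxy for a wavelet-coefficient index)."""
    e0 = np.sum(baseline ** 2)
    e1 = np.sum(current ** 2)
    return abs(e1 - e0) / e0

# Synthetic windowed-sine actuation echoes (illustrative only)
t = np.linspace(0.0, 1.0, 2000)
window = np.hanning(t.size)
pristine = window * np.sin(2 * np.pi * 50 * t)
damaged = 0.8 * window * np.sin(2 * np.pi * 50 * (t - 0.01))

print(round(damage_index(pristine, damaged), 4))
```

A growing corrosion patch would attenuate the transmitted packet further, so the index increases monotonically with damage size, which is the behavior the cumulative damage index in the abstract tracks.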
Abstract:
Bid optimization is now becoming quite popular in sponsored search auctions on the Web. Given a keyword and the maximum willingness to pay of each advertiser interested in the keyword, the bid optimizer generates a profile of bids for the advertisers with the objective of maximizing customer retention without compromising the revenue of the search engine. In this paper, we present a bid optimization algorithm based on a Nash bargaining model in which the first player is the search engine and the second player is a virtual agent representing all the bidders. We make the realistic assumption that each bidder specifies a maximum willingness-to-pay value and a discrete, finite set of bid values. We show that the Nash bargaining solution for this problem always lies on a certain edge of the convex hull such that one endpoint of the edge is the vector of maximum willingness to pay of all the bidders. We show that the other endpoint of this edge can be computed as the solution of a linear programming problem. We also show how the solution can be transformed into a bid profile of the advertisers.
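The structural result above says the Nash bargaining solution lies on one edge of the feasible set's convex hull. A toy sketch of that final step: maximize the Nash product (u1 − d1)(u2 − d2) along the segment between the two edge endpoints. The points A, B and the disagreement point d below are made-up two-player utilities, not the paper's LP-derived endpoints.

```python
import numpy as np

A = np.array([10.0, 2.0])   # one edge endpoint (e.g. revenue-favoring)
B = np.array([4.0, 9.0])    # other edge endpoint (e.g. bidder-favoring)
d = np.array([1.0, 1.0])    # disagreement point

# Parameterize the edge as A + t*(B - A) and grid-search the Nash product.
ts = np.linspace(0.0, 1.0, 100001)
points = A[None, :] + ts[:, None] * (B - A)[None, :]
nash_product = np.prod(points - d, axis=1)
best = points[np.argmax(nash_product)]
print(best)
```

On a line segment the Nash product is a concave quadratic in t, so the maximizer is unique and could equally be found in closed form by setting its derivative to zero; the grid search just keeps the sketch short.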
Abstract:
Given a parametrized n-dimensional SQL query template and a choice of query optimizer, a plan diagram is a color-coded pictorial enumeration of the execution plan choices of the optimizer over the query parameter space. These diagrams have proved to be a powerful metaphor for the analysis and redesign of modern optimizers, and are gaining currency in diverse industrial and academic institutions. However, their utility is adversely impacted by the impractically large computational overheads incurred when standard brute-force exhaustive approaches are used for producing fine-grained diagrams on high-dimensional query templates. In this paper, we investigate strategies for efficiently producing close approximations to complex plan diagrams. Our techniques are customized to the features available in the optimizer's API, ranging from the generic optimizers that provide only the optimal plan for a query, to those that also support costing of sub-optimal plans and enumerating rank-ordered lists of plans. The techniques collectively feature both random and grid sampling, as well as inference techniques based on nearest-neighbor classifiers, parametric query optimization and plan cost monotonicity. Extensive experimentation with a representative set of TPC-H and TPC-DS-based query templates on industrial-strength optimizers indicates that our techniques are capable of delivering 90% accurate diagrams while incurring less than 15% of the computational overheads of the exhaustive approach. In fact, for full-featured optimizers, we can guarantee zero error with less than 10% overheads. These approximation techniques have been implemented in the publicly available Picasso optimizer visualization tool.
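The sampling-plus-inference strategy this abstract describes can be sketched on a synthetic 2-D parameter space: query an optimizer oracle only on a coarse sub-grid, then fill the remaining cells by nearest-neighbor classification and compare against the exhaustive diagram. The `optimizer_choice` rule below is a made-up piecewise stand-in for a real optimizer API, and the plan labels are hypothetical.

```python
import numpy as np

def optimizer_choice(sel1, sel2):
    """Stand-in for an optimizer API call: returns the optimal plan label
    at a point of the (selectivity1, selectivity2) space (made-up rule)."""
    if sel1 + sel2 < 0.5:
        return 0          # e.g. a nested-loops plan
    if sel1 > sel2:
        return 1          # e.g. a hash-join plan
    return 2              # e.g. a merge-join plan

n = 100                    # full diagram resolution (n x n grid)
grid = np.linspace(0.0, 1.0, n)

# Exhaustive "brute-force" diagram, for reference only.
exact = np.array([[optimizer_choice(x, y) for y in grid] for x in grid])

# Approximate diagram: query only a coarse sub-grid, then label every
# remaining cell by its nearest sampled neighbor.
step = 5
samples = list(range(0, n, step))
labels = {(i, j): optimizer_choice(grid[i], grid[j])
          for i in samples for j in samples}
approx = np.empty_like(exact)
for i in range(n):
    si = min(samples, key=lambda r: abs(r - i))
    for j in range(n):
        sj = min(samples, key=lambda r: abs(r - j))
        approx[i, j] = labels[(si, sj)]

accuracy = np.mean(approx == exact)
cost_fraction = len(samples) ** 2 / (n * n)
print(round(accuracy, 3), round(cost_fraction, 3))
```

Errors concentrate in thin bands along plan-transition boundaries, which is why a small fraction of optimizer calls can still yield a highly accurate diagram; the paper's stronger techniques (sub-optimal plan costing, rank-ordered plan lists, plan cost monotonicity) refine exactly those boundary regions.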