974 results for Dynamic optimization
Abstract:
An in-situ power monitoring technique for Dynamic Voltage and Threshold Scaling (DVTS) systems is proposed, in which the sleep transistor acts as a power sensor to measure the total power consumed by the load circuit. Design details of the power monitor are examined using a simulation framework in the UMC 90nm CMOS process. Experimental results from a test chip fabricated in the AMS 0.35µm CMOS process are presented. The test chip has a variable activity factor between 0.05 and 0.5 and provides PMOS VTH control through the n-well contact. The maximum resolution obtained from the power monitor is 0.25mV. The overhead of the power monitor in terms of its power consumption is 0.244 mW (2.2% of the total power of the load circuit). Lastly, the power monitor is used to demonstrate a closed-loop DVTS system. The DVTS algorithm shows 46.3% power savings using the in-situ power monitor.
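A closed-loop DVTS controller of the kind demonstrated with the test chip can be pictured with a short control-loop sketch. This is a hypothetical illustration only: the function names, voltage ranges, step sizes and power budget below are assumptions, not the authors' algorithm.

    def dvts_control_loop(read_power_mW, set_vdd, set_vth, power_budget_mW,
                          vdd_step=0.05, vth_step=0.02,
                          vdd_range=(0.6, 1.2), vth_range=(0.3, 0.6)):
        # Hypothetical closed-loop DVTS sketch, not the authors' algorithm:
        # read the in-situ power monitor and nudge VDD/VTH toward a power budget.
        vdd, vth = vdd_range[1], vth_range[0]      # start at nominal VDD, low VTH
        while True:
            p = read_power_mW()                    # in-situ power monitor reading
            if p > power_budget_mW:
                # over budget: lower VDD and raise VTH (cut dynamic and leakage power)
                vdd = max(vdd_range[0], vdd - vdd_step)
                vth = min(vth_range[1], vth + vth_step)
            else:
                # under budget: restore performance headroom
                vdd = min(vdd_range[1], vdd + vdd_step)
                vth = max(vth_range[0], vth - vth_step)
            set_vdd(vdd)
            set_vth(vth)                           # e.g. via the n-well (body-bias) contact
            yield vdd, vth, p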
Abstract:
The objectives of this paper are to examine the loss of crack tip constraint in dynamically loaded fracture specimens and to assess whether it can lead to enhancement in the fracture toughness at high loading rates which has been observed in several experimental studies. To this end, 2-D plane strain finite element analyses of single edge notched (tension) specimens and three point bend specimens subjected to time-varying loads are performed. The material is assumed to obey the small strain J2 flow theory of plasticity with rate-independent behaviour. The results demonstrate that a valid J-Q field exists under dynamic loading irrespective of the crack length and specimen geometry. Further, the constraint parameter Q becomes strongly negative at high loading rates, particularly in deeply cracked specimens. The variation of the dynamic fracture toughness K_dc with the stress intensity rate K̇ for cleavage cracking is predicted using a simple critical stress criterion. It is found that inertia-driven constraint loss can substantially enhance K_dc for K̇ > 10^5 MPa√m/s.
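For orientation, the constraint parameter Q referred to above is conventionally defined, in the two-parameter J-Q characterization (this standard form is not reproduced from the paper itself), as the difference between the actual crack-tip opening stress and a reference small-scale-yielding field, normalized by the yield stress and evaluated at a fixed normalized distance ahead of the tip:

    Q = \frac{\sigma_{\theta\theta} - (\sigma_{\theta\theta})_{\mathrm{ref}}}{\sigma_0}
        \quad \text{at}\ \theta = 0,\ \ \frac{r\,\sigma_0}{J} = 2

A strongly negative Q therefore means the opening stress ahead of the crack falls well below the reference field, i.e. crack-tip constraint is lost.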
Abstract:
Experimental study and optimization of Plasma Actuators for Flow control in the subsonic regime PRADEEP MOISE, JOSEPH MATHEW, KARTIK VENKATRAMAN, JOY THOMAS, Indian Institute of Science, FLOW CONTROL TEAM | The induced jet produced by a dielectric barrier discharge (DBD) setup is capable of preventing flow separation on airfoils at high angles of attack. The effect of various parameters on the velocity of this induced jet was studied experimentally. The glow discharge was created at atmospheric conditions using a high-voltage RF power supply. Flow visualization, photographic studies of the plasma, and hot-wire measurements on the induced jet were performed. The parametric investigation of the characteristics of the plasma shows that the width of the plasma in the uniform glow discharge regime was an indication of the velocity induced. It was observed that the spanwise and streamwise overlap of the two electrodes, the dielectric thickness, and the voltage and frequency of the applied voltage are the major parameters that govern the velocity and the extent of the plasma. The effect of the optimized configuration on the performance characteristics of an airfoil was studied experimentally.
Abstract:
A trajectory optimization approach is applied to the design of a sequence of open-die forging operations in order to control the transient thermal response of a large titanium alloy billet. The amount of time the billet is soaked in the furnace prior to each successive forging operation is optimized to minimize the total process time while simultaneously satisfying constraints on the maximum and minimum values of the billet's temperature distribution to avoid microstructural defects during forging. The results indicate that a "differential" heating profile is the most effective at meeting these design goals.
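Schematically, the soak-time optimization described above can be written as the following trajectory optimization; the symbols are illustrative and the paper's exact formulation may differ. With soak times t_1, ..., t_N preceding the N forging operations,

    \min_{t_1,\dots,t_N} \ \sum_{i=1}^{N} t_i
    \quad \text{subject to} \quad
    T_{\min} \le T(\mathbf{x}, \tau;\, t_1,\dots,t_N) \le T_{\max}
    \ \ \text{for all } \mathbf{x} \text{ in the billet during each forging operation,}

where T is the transient temperature field predicted by the thermal model of the billet.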
Abstract:
GaAs/Ge heterostructures having abrupt interfaces were grown on 2°, 6°, and 9° off-cut Ge substrates and investigated by cross-sectional high-resolution transmission electron microscopy (HRTEM), scanning electron microscopy, photoluminescence (PL) spectroscopy and electrochemical capacitance voltage (ECV) profiling. The GaAs films were grown on the off-oriented Ge substrates at growth temperatures in the range of 600-700°C, growth rates of 3-12 µm/hr and a V/III ratio of 29-88. The lattice indexing of HRTEM exhibits excellent lattice line matching between GaAs and the Ge substrate. The PL spectra from the GaAs layer on the 6° off-cut Ge substrate show a higher excitonic peak compared with the 2° and 9° off-cut Ge substrates. In addition, the luminescence intensity from the GaAs solar cell grown on the 6° off-cut substrate is higher than on the 9° off-cut Ge substrate, which signifies the potential use of 6° off-cut Ge substrates in the GaAs solar cell industry. The ECV profiling shows an abrupt film/substrate interface as well as abrupt interfaces between the various layers of the solar cell structure.
Abstract:
Alopex is a correlation-based, gradient-free optimization technique useful in many learning problems. However, there are no analytical results on the asymptotic behavior of this algorithm. This article presents a new version of Alopex that can be analyzed using two-timescale stochastic approximation techniques. It is shown that the algorithm asymptotically behaves like a gradient-descent method, though it does not need (or estimate) any gradient information. It is also shown, through simulations, that the algorithm is quite effective.
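For context, the classical correlation-based Alopex update can be sketched as follows; this is the textbook form, not the modified two-timescale version analysed in the article, and the annealing schedule shown is one common choice.

    import numpy as np

    def alopex_minimize(f, w, delta=0.01, T=1.0, iters=5000, anneal_every=50, rng=None):
        # Classical correlation-based, gradient-free Alopex sketch (minimization).
        rng = rng or np.random.default_rng(0)
        w_prev, f_prev = w.copy(), f(w)
        w = w + delta * rng.choice([-1.0, 1.0], size=w.shape)   # initial random move
        corr = []
        for n in range(iters):
            f_curr = f(w)
            C = (w - w_prev) * (f_curr - f_prev)      # correlation of move and cost change
            p = 1.0 / (1.0 + np.exp(-C / T))          # prob. of stepping in the -delta direction
            step = np.where(rng.random(w.shape) < p, -delta, delta)
            w_prev, f_prev = w.copy(), f_curr
            w = w + step
            corr.append(np.abs(C).mean())
            if (n + 1) % anneal_every == 0:           # anneal T from recent correlations
                T = max(float(np.mean(corr[-anneal_every:])), 1e-8)
        return w

    # usage: minimize a simple quadratic without any gradient information
    w_star = alopex_minimize(lambda w: float(np.sum(w ** 2)), np.array([2.0, -1.5]))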
Abstract:
Seismic design of reinforced soil structures involves many uncertainties arising from the backfill soil properties and the tensile strength of the reinforcement, which are not addressed in current design guidelines. This paper highlights the significance of this variability in the internal stability assessment of reinforced soil structures. Reliability analysis is applied to estimate the probability of failure, and a pseudo-static approach is used to calculate the tensile strength and length of reinforcement needed to maintain internal stability against tension and pullout failures. A logarithmic spiral failure surface is considered in conjunction with the limit equilibrium method. Two modes of failure, namely tension failure and pullout failure, are considered. The influence of variations in the backfill soil friction angle, the tensile strength of the reinforcement, and the horizontal seismic acceleration on the reliability indices against tension failure and pullout failure of reinforced earth structures is discussed.
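For readers unfamiliar with the reliability measures used above: in a first-order formulation with resistance R (here, the available reinforcement tensile or pullout capacity) and load effect S (the pseudo-static seismic demand), the reliability index and the associated probability of failure take the standard form below for uncorrelated normal R and S; the paper's analysis may use a more general formulation.

    \beta = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^{2} + \sigma_S^{2}}},
    \qquad p_f = \Phi(-\beta)

Here μ and σ denote means and standard deviations of R and S, and Φ is the standard normal cumulative distribution function.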
Abstract:
In this paper a new parallel algorithm for nonlinear transient dynamic analysis of large structures is presented. An unconditionally stable Newmark-beta method (the constant average acceleration technique) is employed for time integration. The proposed parallel algorithm has been devised within the broad framework of domain decomposition techniques. However, unlike most of the existing parallel algorithms devised for structural dynamic applications, which are basically derived using nonoverlapped domains, the proposed algorithm uses overlapped domains. The parallel overlapped domain decomposition algorithm proposed in this paper has been formulated by splitting the mass, damping and stiffness matrices arising out of the finite element discretisation of a given structure. A predictor-corrector scheme has been formulated for iteratively improving the solution in each step. A computer program based on the proposed algorithm has been developed and implemented with the Message Passing Interface (MPI) as the software development environment. The PARAM-10000 MIMD parallel computer has been used to evaluate the performance. Numerical experiments have been conducted to validate as well as to evaluate the performance of the proposed parallel algorithm. Comparisons have been made with conventional nonoverlapped domain decomposition algorithms. Numerical studies indicate that the proposed algorithm is superior in performance to the conventional domain decomposition algorithms. (C) 2003 Elsevier Ltd. All rights reserved.
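The constant average acceleration scheme referred to above is the Newmark-beta method with γ = 1/2 and β = 1/4; its standard textbook update for displacements u, velocities u̇ and accelerations ü over a time step Δt (shown for orientation, not taken from the paper) is:

    \mathbf{u}_{n+1} = \mathbf{u}_n + \Delta t\,\dot{\mathbf{u}}_n
        + \Delta t^{2}\left[\left(\tfrac{1}{2} - \beta\right)\ddot{\mathbf{u}}_n + \beta\,\ddot{\mathbf{u}}_{n+1}\right],
    \qquad
    \dot{\mathbf{u}}_{n+1} = \dot{\mathbf{u}}_n
        + \Delta t\left[(1 - \gamma)\,\ddot{\mathbf{u}}_n + \gamma\,\ddot{\mathbf{u}}_{n+1}\right]

With γ = 1/2 and β = 1/4 this is the unconditionally stable constant average acceleration variant used for the time integration.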
Abstract:
In-situ transmission electron microscopy (TEM) has developed rapidly over the last decade. In particular, the inclusion of scanning probes in TEM holders allows both mechanical and electrical testing to be performed whilst simultaneously imaging the microstructure at high resolution. In-situ TEM nanoindentation and tensile experiments require only an axial displacement perpendicular to the test surface. Here, however, through the development of a novel in-situ TEM triboprobe with a fully programmable 3D positioning system, other surface characterisation experiments are now possible. Programmable lateral displacement control allows scratch tests to be performed at high resolution with simultaneous imaging of the changing microstructure. With the addition of repeated cyclic movements, both nanoscale fatigue and friction experiments can now also be performed. We demonstrate a range of movement profiles for a variety of applications, in particular lateral sliding wear. The developed NanoLAB TEM triboprobe also includes a new closed-loop vision control system for intuitive control during positioning and alignment, with an automated online calibration to ensure that the fine piezotube is controlled accurately throughout any type of test. Both the 3D programmability and the closed-loop vision feedback system are demonstrated here.
Abstract:
Emerging high-dimensional data mining applications need to find interesting clusters embedded in arbitrarily aligned subspaces of lower dimensionality. It is difficult to cluster high-dimensional data objects when they are sparse and skewed. Updates are quite common in dynamic databases and are usually processed in batch mode. In very large dynamic databases, it is necessary to perform incremental cluster analysis only on the updates. We present an incremental clustering algorithm for subspace clustering in very high dimensions, which handles both insertions and deletions of data points in the backend databases.
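As a minimal illustration of the incremental bookkeeping involved (this is a generic centroid-update sketch, not the subspace-clustering algorithm proposed in the paper), single-point insertions and deletions can be absorbed without reclustering the whole database:

    import numpy as np

    class IncrementalCentroids:
        # Running centroids updated for single-point insertions and deletions.
        # Illustration only; the paper's subspace-clustering method is more involved.
        def __init__(self, centroids):
            self.centroids = [np.asarray(c, dtype=float) for c in centroids]
            self.counts = [1] * len(self.centroids)

        def insert(self, x):
            x = np.asarray(x, dtype=float)
            j = int(np.argmin([np.linalg.norm(x - c) for c in self.centroids]))
            self.counts[j] += 1
            self.centroids[j] += (x - self.centroids[j]) / self.counts[j]  # running mean update
            return j

        def delete(self, x, j):
            x = np.asarray(x, dtype=float)
            if self.counts[j] > 1:
                self.counts[j] -= 1
                self.centroids[j] -= (x - self.centroids[j]) / self.counts[j]  # undo x's contribution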
Abstract:
We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type. The agents report only the type. The goal is to find allocatively efficient, strategy proof, nearly budget balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as rebates to agents. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions. The goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We then identify the number of samples needed for "near-feasibility" of the relaxed constraint set. Under some conditions on the valuation function, we show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extension of the proposed mechanisms to situations where the valuation functions are not known to the central planner is also discussed.

Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering of information privately held by strategic users, where the utilities are any concave function of the allocations, and where the resource planner is not interested in maximizing revenue, but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism, and then returning as much of the collected money as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We demonstrate via simulation, however, that if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
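The constraint-sampling relaxation mentioned above replaces a continuum of half-plane constraints (one per reported type vector) with a finite sample and solves the resulting LP. The sketch below shows that generic reduction; the objective, constraint generator, sample count and nonnegativity bounds are placeholders, not the paper's specific rebate formulation.

    import numpy as np
    from scipy.optimize import linprog

    def sampled_lp(c, constraint_fn, sample_type, n_samples, dim, rng=None):
        # Relax "a(theta)^T x <= b(theta) for all theta" by sampling finitely many
        # theta's and solving the resulting finite LP (constraint sampling).
        rng = rng or np.random.default_rng(0)
        A, b = [], []
        for _ in range(n_samples):
            theta = sample_type(rng)              # draw a reported type vector
            a_row, b_val = constraint_fn(theta)   # one half-plane constraint per sample
            A.append(a_row)
            b.append(b_val)
        res = linprog(c, A_ub=np.asarray(A), b_ub=np.asarray(b),
                      bounds=[(0, None)] * dim, method="highs")  # x >= 0 for this illustration
        return res.x, res.fun

    # illustrative usage: minimize x1 + x2 subject to sampled constraints
    # theta^T x >= 1 (written as -theta^T x <= -1) for theta drawn from [0, 1]^2
    x, val = sampled_lp(c=np.array([1.0, 1.0]),
                        constraint_fn=lambda th: (-th, -1.0),
                        sample_type=lambda rng: rng.random(2),
                        n_samples=200, dim=2)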
Abstract:
In this paper, we investigate the effect of vacuum sealing the backside cavity of a Capacitive Micromachined Ultrasonic Transducer (CMUT). The presence or absence of air inside the cavity has a marked effect upon the system parameters, such as the natural frequency, damping, and the pull-in voltage. The presence of vacuum inside the cavity of the device causes a reduction in the effective gap height which leads to a reduction in the pull-in voltage. We carry out ANSYS simulations to quantify this reduction. The presence of vacuum inside the cavity of the device causes stress stiffening of the membrane, which changes the natural frequency of the device. A prestressed modal analysis is carried out to determine the change in natural frequency due to stress stiffening. The equivalent circuit method is used to evaluate the performance of the device in the receiver mode. The lumped parameters of the device are obtained and an equivalent circuit model of the device is constructed to determine the open circuit receiving sensitivity of the device. The effect of air in the cavity is included by incorporating an equivalent compliance and an equivalent resistance in the equivalent circuit.
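The gap-dependence of the pull-in voltage discussed above can be read off the standard lumped parallel-plate estimate, shown here for orientation only; the paper quantifies the effect with ANSYS simulations of the actual membrane rather than this closed form:

    V_{PI} = \sqrt{\frac{8\,k\,g_0^{3}}{27\,\varepsilon_0 A}}

where k is the effective membrane stiffness, g_0 the effective gap height, A the electrode area and ε_0 the permittivity of free space; a smaller effective gap therefore lowers the pull-in voltage roughly as g_0^{3/2}.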
Abstract:
A dynamic model of the COREX melter gasifier is developed to study the transient behavior of the furnace. The effect of pulse disturbance and step disturbance on the process performance has been studied. This study shows that the effect of pulse disturbance decays asymptotically. The step change brings the system to a new steady state after a delay of about 5 hours. The dynamic behavior of the melter gasifier with respect to a shutdown/blow-on condition and the effect of tapping are also studied. The results show that the time response of the melter gasifier is much less than that of a blast furnace.
Abstract:
The specified range of free chlorine residual (between minimum and maximum) in water distribution systems needs to be maintained to avoid deterioration of the microbial quality of water, control taste and/or odor problems, and hinder the formation of carcinogenic disinfection by-products. Multiple water quality sources for providing chlorine input are needed to maintain the chlorine residuals within a specified range throughout the distribution system. The determination of source dosage (i.e., chlorine concentrations/chlorine mass rates) at water quality sources to satisfy the above objective under dynamic conditions is a complex process. A nonlinear optimization problem is formulated to determine the chlorine dosage at the water quality sources subject to minimum and maximum constraints on chlorine concentrations at all monitoring nodes. A genetic algorithm (GA) approach in which the decision variables (chlorine dosages) are coded as binary strings is used to solve this highly nonlinear optimization problem, with nonlinearities arising due to set-point sources and non-first-order reactions. Application of the model is illustrated using three sample water distribution systems, and it indicates that the GA is a useful tool for evaluating optimal water quality source chlorine schedules.
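A binary-coded GA of the kind described can be sketched generically as below. This is an illustrative skeleton, not the authors' implementation: the dose bounds, residual limits, penalty weight and the simulate_residuals network-simulation call are placeholder assumptions.

    import numpy as np

    def decode(bits, lo, hi):
        # Map a binary string to a real chlorine dose in [lo, hi].
        value = int("".join(map(str, bits)), 2)
        return lo + (hi - lo) * value / (2 ** len(bits) - 1)

    def ga_chlorine_schedule(simulate_residuals, n_sources, bits_per_var=8,
                             dose_range=(0.0, 4.0), c_min=0.2, c_max=4.0,
                             pop_size=40, generations=100, p_mut=0.02, rng=None):
        # Generic binary-coded GA sketch: choose source doses (mg/L) minimizing total
        # dose subject to residual bounds at monitoring nodes. 'simulate_residuals'
        # stands in for a water-quality network simulation.
        rng = rng or np.random.default_rng(0)
        n_bits = n_sources * bits_per_var
        pop = rng.integers(0, 2, size=(pop_size, n_bits))

        def fitness(chrom):
            doses = [decode(chrom[i*bits_per_var:(i+1)*bits_per_var], *dose_range)
                     for i in range(n_sources)]
            residuals = simulate_residuals(doses)          # residuals at monitoring nodes
            penalty = sum(max(0.0, c_min - r) + max(0.0, r - c_max) for r in residuals)
            return -(sum(doses) + 1e3 * penalty)           # maximize => minimize dose + penalty

        for _ in range(generations):
            scores = np.array([fitness(ind) for ind in pop])
            # binary tournament selection
            parents = pop[[max(rng.integers(0, pop_size, 2), key=lambda i: scores[i])
                           for _ in range(pop_size)]]
            # single-point crossover followed by bit-flip mutation
            children = parents.copy()
            for i in range(0, pop_size - 1, 2):
                cut = rng.integers(1, n_bits)
                children[i, cut:], children[i+1, cut:] = (parents[i+1, cut:].copy(),
                                                          parents[i, cut:].copy())
            children ^= (rng.random(children.shape) < p_mut).astype(children.dtype)
            pop = children
        best = pop[np.argmax([fitness(ind) for ind in pop])]
        return [decode(best[i*bits_per_var:(i+1)*bits_per_var], *dose_range)
                for i in range(n_sources)]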