288 results for VLSI, floorplanning, optimization, greedy algorithm, ordered tree


Relevance:

20.00%

Publisher:

Abstract:

Compliant mechanisms are elastic continua used to transmit or transform force and motion mechanically. The topology optimization methods developed for compliant mechanisms also give the shape for a chosen parameterization of the design domain with a fixed mesh. However, in these methods, the shapes of the flexible segments in the resulting optimal solutions are restricted either by the type or the resolution of the design parameterization. This limitation is overcome in this paper by focusing on optimizing the skeletal shape of the compliant segments in a given topology. It is accomplished by identifying such segments in the topology and representing them using Bezier curves. The vertices of the Bezier control polygon are used to parameterize the shape-design space. Uniform parameter steps of the Bezier curves naturally enable adaptive finite element discretization of the segments as their shapes change. Practical constraints, such as avoiding intersections with other segments, self-intersections, and restrictions on the available space and material, are incorporated into the formulation. A multi-criteria function from our prior work is used as the objective. Analytical sensitivity analysis for the objective and constraints is presented and is used in the numerical optimization. Examples are included to illustrate the shape optimization method.
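Since the shape-design space is parameterized by the vertices of a Bezier control polygon, a minimal sketch of how uniform parameter steps discretize such a segment may help. The de Casteljau evaluation below and the control points are purely illustrative, not taken from the paper.

```python
def de_casteljau(control_points, t):
    """Evaluate a 2-D Bezier curve at parameter t by repeated linear
    interpolation of the control polygon (de Casteljau's algorithm)."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [
            ((1 - t) * p[0] + t * q[0], (1 - t) * p[1] + t * q[1])
            for p, q in zip(pts, pts[1:])
        ]
    return pts[0]

# Uniform parameter steps give the nodes of an adaptive discretization
# of the segment; the cubic control polygon here is made up.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
nodes = [de_casteljau(ctrl, i / 10) for i in range(11)]
```

Because the nodes follow uniform parameter steps, they redistribute automatically whenever the control-polygon vertices (the design variables) move.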

Relevance:

20.00%

Publisher:

Abstract:

Long-term surveys of entire communities of species are needed to measure fluctuations in natural populations and elucidate the mechanisms driving population dynamics and community assembly. We analysed changes in abundance of over 4000 tree species in 12 forests across the world over periods of 6-28 years. Abundance fluctuations in all forests are large and consistent with population dynamics models in which temporal environmental variance plays a central role. At some sites we identify clear environmental drivers, such as fire and drought, that could underlie these patterns, but at other sites there is a need for further research to identify drivers. In addition, cross-site comparisons showed that abundance fluctuations were smaller at species-rich sites, consistent with the idea that stable environmental conditions promote higher diversity. Much community ecology theory emphasises demographic variance and niche stabilisation; we encourage the development of theory in which temporal environmental variance plays a central role.

Relevance:

20.00%

Publisher:

Abstract:

The present work presents the results of an experimental investigation of semi-solid rheocasting of A356 Al alloy using a cooling slope. The experiments have been carried out following the Taguchi method of parameter design (an L9 orthogonal array of experiments). Four key process variables (slope angle, pouring temperature, wall temperature, and length of travel of the melt) at three different levels have been considered for the present experimentation. Regression analysis and analysis of variance (ANOVA) have also been performed, to develop a mathematical model for the degree of sphericity of the primary alpha-Al phase and to find the significance and percentage contribution of each process variable towards the final degree of sphericity, respectively. Based on mean response and signal-to-noise ratio (SNR), the best processing condition for optimum degree of sphericity (0.83) has been identified as A3, B3, C2, D1, i.e., a slope angle of 60 degrees, pouring temperature of 650 degrees C, wall temperature of 60 degrees C, and 500 mm length of travel of the melt. ANOVA results show that the length of travel has the maximum impact on the evolution of the degree of sphericity. The sphericity predicted by the developed regression model and the values obtained experimentally are found to be in good agreement. The sphericity values obtained from a confirmation experiment, performed at the 95% confidence level, ensure that the optimum result is correct and are within permissible limits. (C) 2014 Elsevier Ltd. All rights reserved.
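The mean-response and SNR analysis rests on Taguchi's larger-the-better signal-to-noise ratio, appropriate here since higher sphericity is desired. A minimal sketch with made-up replicate values (not the paper's measurements):

```python
import math

def snr_larger_is_better(responses):
    """Taguchi larger-the-better S/N ratio in dB:
    -10 * log10(mean(1 / y_i^2)) over the replicate responses."""
    return -10.0 * math.log10(
        sum(1.0 / y ** 2 for y in responses) / len(responses)
    )

# Hypothetical sphericity replicates for one L9 trial.
snr = snr_larger_is_better([0.81, 0.83, 0.80])
```

Trials with the highest S/N ratio (least degraded by noise) identify the preferred level for each process variable.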

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we study a problem of designing a multi-hop wireless network for interconnecting sensors (hereafter called source nodes) to a Base Station (BS), by deploying a minimum number of relay nodes at a subset of given potential locations, while meeting a quality of service (QoS) objective specified as a hop count bound for paths from the sources to the BS. The hop count bound suffices to ensure a certain probability of the data being delivered to the BS within a given maximum delay under a light traffic model. We observe that the problem is NP-hard. For this problem, we propose a polynomial time approximation algorithm based on iteratively constructing shortest path trees and heuristically pruning away the relay nodes used until the hop count bound is violated. Results show that the algorithm performs efficiently in various randomly generated network scenarios; in over 90% of the tested scenarios, it gave solutions that were either optimal or were worse than optimal by just one relay. We then use random graph techniques to obtain, under a certain stochastic setting, an upper bound on the average case approximation ratio of a class of algorithms (including the proposed algorithm) for this problem as a function of the number of source nodes, and the hop count bound. To the best of our knowledge, the average case analysis is the first of its kind in the relay placement literature. Since the design is based on a light traffic model, we also provide simulation results (using models for the IEEE 802.15.4 physical layer and medium access control) to assess the traffic levels up to which the QoS objectives continue to be met. (C) 2014 Elsevier B.V. All rights reserved.
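The construct-then-prune idea can be sketched as follows; the graph helpers, node sets, and the greedy pruning order below are illustrative simplifications, not the paper's exact algorithm.

```python
from collections import deque

def bfs_hops(adj, root):
    """Hop counts from root by breadth-first search (shortest path tree)."""
    dist = {root: 0}
    q = deque([root])
    while q:
        u = q.popleft()
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def feasible(nodes, edges, bs, sources, hop_bound):
    """Check that every source reaches the BS within hop_bound
    using only the given node set."""
    adj = {u: [] for u in nodes}
    for u, v in edges:
        if u in adj and v in adj:
            adj[u].append(v)
            adj[v].append(u)
    dist = bfs_hops(adj, bs)
    return all(dist.get(s, hop_bound + 1) <= hop_bound for s in sources)

def prune_relays(edges, bs, sources, relays, hop_bound):
    """Greedily drop relays while the hop-count QoS bound still holds."""
    kept = set(relays)
    for r in list(relays):
        if feasible({bs, *sources, *(kept - {r})}, edges, bs, sources, hop_bound):
            kept.discard(r)
    return kept
```

Each candidate removal is accepted only if the hop-count bound survives, so the returned relay set is always feasible.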

Relevance:

20.00%

Publisher:

Abstract:

Smoothed functional (SF) schemes for gradient estimation are known to be efficient in stochastic optimization algorithms, especially when the objective is to improve the performance of a stochastic system. However, the performance of these methods depends on several parameters, such as the choice of a suitable smoothing kernel. Different kernels have been studied in the literature, which include Gaussian, Cauchy, and uniform distributions, among others. This article studies a new class of kernels based on the q-Gaussian distribution, which has gained popularity in statistical physics over the last decade. Though the importance of this family of distributions is attributed to its ability to generalize the Gaussian distribution, we observe that this class encompasses almost all existing smoothing kernels. This motivates us to study SF schemes for gradient estimation using the q-Gaussian distribution. Using the derived gradient estimates, we propose two-timescale algorithms for optimization of a stochastic objective function in a constrained setting with a projected gradient search approach. We prove the convergence of our algorithms to the set of stationary points of an associated ODE. We also demonstrate their performance numerically through simulations on a queuing model.
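The basic one-sided SF gradient estimate (shown here with a plain Gaussian kernel, the classical special case of the article's q-Gaussian class, and with a toy quadratic objective) can be sketched as:

```python
import random

def sf_gradient(J, theta, beta=0.1, n_samples=2000, seed=0):
    """One-sided smoothed functional gradient estimate of J at theta:
    grad_i ~ E[eta_i * (J(theta + beta*eta) - J(theta))] / beta,
    with eta drawn from a standard Gaussian smoothing kernel."""
    rng = random.Random(seed)
    d = len(theta)
    grad = [0.0] * d
    base = J(theta)
    for _ in range(n_samples):
        eta = [rng.gauss(0.0, 1.0) for _ in range(d)]
        delta = J([t + beta * e for t, e in zip(theta, eta)]) - base
        for i in range(d):
            grad[i] += eta[i] * delta / (beta * n_samples)
    return grad

# Toy quadratic objective: the true gradient at theta is 2*theta.
g = sf_gradient(lambda x: sum(t * t for t in x), [1.0, -2.0])
```

The estimate is biased by O(beta) and noisy, which is why the article pairs such estimators with two-timescale stochastic approximation rather than using them raw.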

Relevance:

20.00%

Publisher:

Abstract:

A new global stochastic search, guided mainly by derivative-free directional information computable from the sample statistical moments of the design variables within a Monte Carlo setup, is proposed. The search is aided by imparting to the directional update term additional layers of random perturbations, referred to as 'coalescence' and 'scrambling'. A selection step, constituting yet another avenue for random perturbation, completes the global search. The direction-driven nature of the search is manifest in the local extremization and coalescence components, which are posed as martingale problems that yield gain-like update terms upon discretization. As anticipated, and numerically demonstrated to a limited extent on the problem of parameter recovery from the chaotic response histories of a couple of nonlinear oscillators, the proposed method appears to offer a more rational, more accurate and faster alternative to most available evolutionary schemes, prominently particle swarm optimization. (C) 2014 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

We present the first q-Gaussian smoothed functional (SF) estimator of the Hessian and the first Newton-based stochastic optimization algorithm that estimates both the Hessian and the gradient of the objective function using q-Gaussian perturbations. Our algorithm requires only two system simulations (regardless of the parameter dimension) and estimates both the gradient and the Hessian at each update epoch from these. We also present a proof of convergence of the proposed algorithm. In a related recent work (Ghoshdastidar, Dukkipati, & Bhatnagar, 2014), we presented gradient SF algorithms based on q-Gaussian perturbations. Our work extends prior work on SF algorithms by generalizing the class of perturbation distributions, as most distributions reported in the literature for which SF algorithms are known to work turn out to be special cases of the q-Gaussian distribution. Besides studying the convergence properties of our algorithm analytically, we also show the results of numerical simulations on a model of a queuing network that illustrate the significance of the proposed method. In particular, we observe that our algorithm performs better in most cases, over a wide range of q-values, in comparison to Newton SF algorithms with Gaussian and Cauchy perturbations, as well as the gradient q-Gaussian SF algorithms. (C) 2014 Elsevier Ltd. All rights reserved.
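The q-Gaussian form that unifies these perturbation kernels is easy to state: its unnormalized density uses the q-exponential, recovering the Gaussian shape as q -> 1 and a Cauchy-like shape at q = 2. A small sketch (normalization constants omitted; not the paper's estimator itself):

```python
import math

def q_exponential(x, q):
    """q-exponential exp_q(x) = [1 + (1-q)x]_+^{1/(1-q)};
    reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gaussian_unnormalized(x, q, beta=1.0):
    """Unnormalized q-Gaussian density exp_q(-beta * x^2)."""
    return q_exponential(-beta * x * x, q)
```

Varying q sweeps continuously between light-tailed (q < 1, compact support), Gaussian (q = 1), and heavy-tailed (1 < q < 3) smoothing kernels, which is what lets one family cover the kernels previously studied case by case.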

Relevance:

20.00%

Publisher:

Abstract:

Friction stir processing (FSP) is emerging as one of the most competent severe plastic deformation (SPD) methods for producing bulk ultra-fine grained materials with improved properties. Optimizing the process parameters for a defect-free process is one of the challenging aspects of FSP for its commercial use. For a commercial aluminium alloy 2024-T3 plate of 6 mm thickness, a bottom-up approach has been attempted to optimize major independent parameters of the process such as plunge depth, tool rotation speed and traverse speed. Tensile properties of the optimally friction stir processed sample were correlated with microstructural characterization performed using a Scanning Electron Microscope (SEM) and Electron Back-Scattered Diffraction (EBSD). Optimum parameters from the bottom-up approach led to a defect-free FSP with a maximum strength of 93% of the base material strength. Micro-tensile testing of samples taken from the center of the processed zone showed an increased strength of 1.3 times the base material. The measured maximum longitudinal residual stress on the processed surface was only 30 MPa, which was attributed to the solid-state nature of FSP. Microstructural observation reveals significant grain refinement with little variation in grain size across the thickness and a large amount of grain boundary precipitation compared to the base metal. The proposed experimental bottom-up approach can be applied as an effective method for optimizing parameters during FSP of aluminium alloys, which is otherwise difficult through analytical methods due to the complex interactions between work-piece, tool and process parameters. Precipitation mechanisms during FSP were responsible for the fine grained microstructure in the nugget zone that provided better mechanical properties than the base metal. (C) 2014 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

We present a new Hessian estimator based on the simultaneous perturbation procedure that requires three system simulations regardless of the parameter dimension. We then present two Newton-based simulation optimization algorithms that incorporate this Hessian estimator. The two algorithms differ primarily in the manner in which the Hessian estimate is used. Neither algorithm computes the inverse Hessian explicitly, thereby saving computational effort. While our first algorithm directly obtains the product of the inverse Hessian with the gradient of the objective, our second algorithm makes use of the Sherman-Morrison matrix inversion lemma to recursively estimate the inverse Hessian. We provide proofs of convergence for both algorithms. Next, we consider an interesting application of our algorithms to a problem of road traffic control. Our algorithms are seen to exhibit better performance than two Newton algorithms from a recent prior work.
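The Sherman-Morrison lemma underlying the second algorithm is standard: for a rank-one update, (A + u v^T)^{-1} = A^{-1} - (A^{-1} u)(v^T A^{-1}) / (1 + v^T A^{-1} u). A minimal dense-matrix sketch of the identity (the paper's particular recursive weighting of Hessian samples is not reproduced here):

```python
def sherman_morrison_update(A_inv, u, v):
    """Return the inverse of (A + u v^T) given A_inv = A^{-1},
    via the Sherman-Morrison rank-one update formula."""
    d = len(u)
    Au = [sum(A_inv[i][j] * u[j] for j in range(d)) for i in range(d)]
    vA = [sum(v[i] * A_inv[i][j] for i in range(d)) for j in range(d)]
    denom = 1.0 + sum(v[i] * Au[i] for i in range(d))
    return [[A_inv[i][j] - Au[i] * vA[j] / denom for j in range(d)]
            for i in range(d)]
```

Applied recursively, each rank-one Hessian sample updates the running inverse in O(d^2) instead of the O(d^3) cost of re-inverting, which is the computational saving the abstract refers to.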

Relevance:

20.00%

Publisher:

Abstract:

Accuracy in tree woody growth estimates is important to global carbon budget estimation and climate-change science. Tree growth in permanent sampling plots (PSPs) is commonly estimated by measuring stem diameter changes, but this method is susceptible to bias resulting from water-induced reversible stem shrinkage. In the absence of bias correction, temporal variability in growth is likely to be overestimated and incorrectly attributed to fluctuations in resource availability, especially in forests with high seasonal and inter-annual variability in water. We propose and test a novel approach for estimating and correcting this bias at the community level. In a 50-ha PSP from a seasonally dry tropical forest in southern India, where tape measurements have been taken every four years from 1988 to 2012, for nine trees we estimated bias due to reversible stem shrinkage as the difference between woody growth measured using tree rings and that estimated from tape. We tested if the bias estimated from these trees could be used as a proxy to correct bias in tape-based growth estimates at the PSP scale. We observed significant shrinkage-related bias in the growth estimates of the nine trees in some censuses. This bias was strongly linearly related to tape-based growth estimates at the level of the PSP, and could be used as a proxy. After bias was corrected, the temporal variance in growth rates of the PSP decreased, while the effect of exceptionally dry or wet periods was retained, indicating that at least a part of the temporal variability arose from reversible shrinkage-related bias. We also suggest that the efficacy of the bias correction could be improved by measuring the proxy on trees that belong to different size classes and census timing, but not necessarily to different species. 
Our approach allows for reanalysis, and possible reinterpretation, of temporal trends in tree growth, above ground biomass change, or carbon fluxes in forests, and their relationships with resource availability in the context of climate change. (C) 2014 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The effects of two major electrodeposition process conditions, electrolyte bath temperature and current density, on the microstructure and crystallographic texture of pure tin coatings on brass and, ultimately, on the extent of whisker formation have been examined. The grain size of the deposited coatings increased with increasing electrolyte bath temperature and current density, which significantly affected the dominant texture: (211) or (420) was the dominant texture at low current densities whereas, depending on deposition temperature, (200) or (220) became the dominant texture at high current densities. After deposition, coatings were subjected to different environmental conditions, for example isothermal aging (room temperature, 50 degrees C, or 150 degrees C) for up to 90 days and thermal cycling between -25 degrees C and 85 degrees C for 100 cycles, and whisker growth was studied. The Sn coatings with low Miller index planes, for example (200) and (220), and with moderate aging temperature were more prone to whiskering than coatings with high Miller index planes, for example (420), and high aging temperature. A processing route involving the optimum combination of current density and deposition temperature is proposed for suppressing whisker growth.

Relevance:

20.00%

Publisher:

Abstract:

We revisit a problem studied by Padakandla and Sundaresan [SIAM J. Optim., August 2009] on the minimization of a separable convex function subject to linear ascending constraints. The problem arises as the core optimization in several resource allocation problems in wireless communication settings. It is also a special case of an optimization of a separable convex function over the bases of a specially structured polymatroid. We give an alternative proof of the correctness of the algorithm of Padakandla and Sundaresan. In the process we relax some of the restrictions they placed on the objective function.

Relevance:

20.00%

Publisher:

Abstract:

Compressive Sensing (CS) theory combines signal sampling and compression for sparse signals, resulting in a reduced sampling rate. In recent years, many recovery algorithms have been proposed to reconstruct the signal efficiently. Subspace Pursuit and Compressive Sampling Matching Pursuit are among the popular greedy methods. Fusion of Algorithms for Compressed Sensing is a recently proposed method in which several CS reconstruction algorithms participate and the final estimate of the underlying sparse signal is determined by fusing the estimates obtained from the participating algorithms. All these methods involve solving a least squares problem which may be ill-conditioned, especially in the low-dimensional measurement regime. In this paper, we propose a step prior to least squares to ensure the well-conditioning of the least squares problem. Using Monte Carlo simulations, we show that in the low-dimensional measurement scenario this modification improves the reconstruction capability of the algorithm in both clean and noisy measurement cases.
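The ill-conditioning concern can be illustrated with a ridge-regularized least squares solve. Note this is a generic conditioning remedy shown for illustration only; the specific pre-step the paper proposes is not described in the abstract and may differ.

```python
def ridge_ls(A, y, lam=1e-3):
    """Solve min ||A x - y||^2 + lam ||x||^2 via the regularized
    normal equations (A^T A + lam I) x = A^T y, using Gaussian
    elimination with partial pivoting. The ridge term lam keeps the
    system invertible even when A^T A is near-singular."""
    m, n = len(A), len(A[0])
    G = [[sum(A[k][i] * A[k][j] for k in range(m)) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(A[k][i] * y[k] for k in range(m)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(G[r][col]))
        G[col], G[piv] = G[piv], G[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = G[r][col] / G[col][col]
            for c in range(col, n):
                G[r][c] -= f * G[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(G[i][j] * x[j] for j in range(i + 1, n))) / G[i][i]
    return x
```

In a greedy pursuit, such a solve would replace the unregularized least squares step on the current support set when few measurements are available.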

Relevance:

20.00%

Publisher:

Abstract:

The high species richness of tropical forests has long been recognized, yet there remains substantial uncertainty regarding the actual number of tropical tree species. Using a pantropical tree inventory database from closed canopy forests, consisting of 657,630 trees belonging to 11,371 species, we use a fitted value of Fisher's alpha and an approximate pantropical stem total to estimate the minimum number of tropical forest tree species to fall between ~40,000 and ~53,000, i.e., at the high end of previous estimates. Contrary to common assumption, the Indo-Pacific region was found to be as species-rich as the Neotropics, with both regions having a minimum of ~19,000-25,000 tree species. Continental Africa is relatively depauperate with a minimum of ~4,500-6,000 tree species. Very few species are shared among the African, American, and the Indo-Pacific regions. We provide a methodological framework for estimating species richness in trees that may help refine species richness estimates of tree-dependent taxa.
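Fisher's log-series links a stem total N and a species count S through S = alpha * ln(1 + N/alpha). The sketch below fits alpha to the database totals quoted above by bisection and then extrapolates; this is a simplification for illustration, not the paper's full fitting procedure.

```python
import math

def fisher_alpha(n_stems, n_species, lo=1.0, hi=1e6):
    """Solve S = alpha * ln(1 + N/alpha) for alpha by bisection;
    the left-hand side is increasing in alpha for fixed N."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mid * math.log1p(n_stems / mid) > n_species:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def expected_species(alpha, n_stems):
    """Expected species count under Fisher's log-series."""
    return alpha * math.log1p(n_stems / alpha)

# Database totals from the abstract: 657,630 stems and 11,371 species.
alpha = fisher_alpha(657630, 11371)
```

With alpha fixed, `expected_species(alpha, pantropical_stem_total)` gives a minimum species-richness extrapolation of the kind the abstract describes.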

Relevance:

20.00%

Publisher:

Abstract:

Time Projection Chamber (TPC) based X-ray polarimeters using Gas Electron Multipliers (GEMs) are currently being developed to make sensitive measurements of polarization in the 2-10 keV energy range. The emission direction of the photoelectron ejected via the photoelectric effect carries the information on the polarization of the incident X-ray photon. The performance of a gas-based polarimeter is affected by operating drift parameters such as gas pressure, drift field, and drift gap. We present simulation studies carried out to understand the effect of these operating parameters on the modulation factor of a TPC polarimeter. Garfield models are used to study photoelectron interactions in the gas and the drift of the electron cloud towards the GEM. Our study is aimed at achieving higher modulation factors by optimizing the drift parameters. The study has shown that Ne/DME (50/50) at lower pressure and drift field can lead to the desired performance of a TPC polarimeter.
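The modulation factor such simulations optimize is conventionally defined from the photoelectron emission-angle histogram as (N_max - N_min) / (N_max + N_min). A small sketch with a synthetic cos^2 modulation curve (the amplitude and binning are made up):

```python
import math

def modulation_factor(counts):
    """Modulation factor of a binned emission-angle histogram:
    (N_max - N_min) / (N_max + N_min)."""
    return (max(counts) - min(counts)) / (max(counts) + min(counts))

# Synthetic modulation curve N(phi) = 100 + 60*cos^2(phi),
# sampled in 12 bins of 30 degrees.
counts = [100.0 + 60.0 * math.cos(math.radians(30 * i)) ** 2
          for i in range(12)]
mu = modulation_factor(counts)
```

For an unpolarized beam the histogram is flat and mu is zero, so larger mu at fixed detector settings means greater sensitivity to polarization.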