203 results for "Greedy randomized adaptive search procedure"


Relevance:

20.00%

Publisher:

Abstract:

This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm: instead of using the same set of features throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. Preliminary experiments showed improvements in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important research directions are proposed as future work.
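
A minimal sketch of the insert-or-remove-one-feature-per-iteration idea described above, assuming a generic train/evaluate interface; train_maxent and evaluate are hypothetical stand-ins for a real MaxEnt trainer and a model-quality metric (e.g. AUC), and the acceptance rule is an illustration rather than the paper's exact criterion.

    import random

    def adaptive_feature_loop(all_features, train_maxent, evaluate,
                              max_iters=50, tol=1e-4):
        # Start from a random subset of the available features.
        active = set(random.sample(list(all_features), k=max(1, len(all_features) // 2)))
        best_model = train_maxent(active)
        best_score = evaluate(best_model)
        for _ in range(max_iters):
            inactive = [f for f in all_features if f not in active]
            # Candidate move: insert one inactive feature or remove one active feature.
            if inactive and (random.random() < 0.5 or len(active) <= 1):
                candidate = active | {random.choice(inactive)}
            elif len(active) > 1:
                candidate = active - {random.choice(list(active))}
            else:
                break
            model = train_maxent(candidate)
            score = evaluate(model)
            if score > best_score + tol:          # keep the move only if it helps
                active, best_model, best_score = candidate, model, score
        return best_model, active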

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a free software tool that supports next-generation mobile communications through the automatic generation of neural-network models of electronic components and devices. The tool enables the creation, training, validation and simulation of a model directly from measurements made on the device of interest, using an interface oriented entirely toward users who are not experts in neural modeling. The resulting model can be exported automatically to a traditional circuit simulator in order to test different scenarios.
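
The measurement-to-model workflow described above can be illustrated with a small sketch; it is not the paper's tool, and the file name, column layout and network size are assumptions.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    # "device_measurements.csv" is a hypothetical file: input columns (e.g. bias
    # voltage, frequency) followed by one measured-response column.
    data = np.loadtxt("device_measurements.csv", delimiter=",", skiprows=1)
    X, y = data[:, :-1], data[:, -1]
    X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

    model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
    model.fit(X_train, y_train)                           # training
    print("validation R^2:", model.score(X_val, y_val))   # validation
    # The fitted weights could then be exported as a behavioral block
    # (e.g. Verilog-A) for use in a traditional circuit simulator.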

Relevance:

20.00%

Publisher:

Abstract:

A procedure to evaluate mine rehabilitation practices during the operational phase was developed and validated. It is based on a comparison of actually observed or documented practices with internationally recommended best practices (BP). A set of 150 BP statements was derived from international guides in order to establish the benchmark. The statements are arranged in six rehabilitation programs under three categories: (1) planning, (2) operational and (3) management, corresponding to the adoption of the plan-do-check-act management-systems model for mine rehabilitation. The procedure consists of (i) performing technical inspections guided by a series of field forms containing the BP statements; (ii) classifying the evidence into five categories; and (iii) calculating conformity indexes and levels. For testing and calibration purposes, the procedure was applied to nine limestone quarries and conformity indexes were calculated for the rehabilitation programs in each quarry. Most quarries featured poor planning practices, operational practices reached high conformity levels in 50% of the cases, and management practices scored moderate conformity. Despite all quarries being ISO 14001 certified, their management systems pay little attention to issues pertaining to land rehabilitation and biodiversity. The best results were achieved by a quarry whose expansion had recently been submitted to the environmental impact assessment process, suggesting that public scrutiny may play a positive role in enhancing rehabilitation practices. Conformity indexes and levels can be used to chart the evolution of rehabilitation practices at regular intervals, to establish corporate goals and to communicate with stakeholders. (C) 2010 Elsevier Ltd. All rights reserved.
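
A sketch of how a conformity index might be computed from classified evidence; the five category labels and their weights are assumptions chosen for illustration, not the paper's actual scoring scale.

    # Assumed evidence categories and weights (0..1), one label per BP statement.
    EVIDENCE_WEIGHT = {
        "full conformity": 1.00,
        "strong conformity": 0.75,
        "partial conformity": 0.50,
        "weak conformity": 0.25,
        "non-conformity": 0.00,
    }

    def conformity_index(classified_statements):
        """Percentage conformity for one rehabilitation program."""
        applicable = [s for s in classified_statements if s in EVIDENCE_WEIGHT]
        if not applicable:
            return None
        return 100.0 * sum(EVIDENCE_WEIGHT[s] for s in applicable) / len(applicable)

    # Example: one program with six inspected statements.
    print(conformity_index(["full conformity", "partial conformity", "non-conformity",
                            "full conformity", "weak conformity", "strong conformity"]))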

Relevance:

20.00%

Publisher:

Abstract:

This work deals with the problem of minimizing the waste of space that occurs in the rotational placement of a set of irregular two-dimensional items inside a two-dimensional container. The problem is approached with a heuristic based on simulated annealing (SA) with an adaptive neighborhood. The objective function is evaluated in a constructive approach in which the items are placed sequentially. The placement is governed by three types of parameters: the sequence of placement, the rotation angle and the translation. The rotation and translation applied to each polygon are cyclic continuous parameters, while the sequence of placement defines a combinatorial problem, so the algorithm has to control cyclic continuous and discrete parameters simultaneously. The approaches described in the literature deal with only one type of parameter (sequence of placement or translation). In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions; the sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate.
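
A minimal sketch of simulated annealing with an adaptive neighborhood for the continuous parameters only: each parameter's proposal width is adjusted from its recent acceptance rate, a simple stand-in for the sensitivity measure mentioned above; the combinatorial moves on the placement sequence are omitted, and the target acceptance rate and adjustment factors are assumptions.

    import math
    import random

    def sa_adaptive(cost, x0, step0, n_iters=20000, T0=1.0, alpha=0.9995):
        x, fx = list(x0), cost(x0)
        steps = [step0] * len(x0)            # one proposal width per continuous parameter
        accepted = [1] * len(x0)
        proposed = [2] * len(x0)
        T = T0
        for _ in range(n_iters):
            i = random.randrange(len(x))     # perturb one parameter at a time
            cand = list(x)
            cand[i] = (cand[i] + random.gauss(0.0, steps[i])) % 1.0   # cyclic in [0, 1)
            fc = cost(cand)
            proposed[i] += 1
            if fc < fx or random.random() < math.exp((fx - fc) / T):
                x, fx = cand, fc
                accepted[i] += 1
            # Adapt the neighborhood: aim for roughly 40% acceptance per parameter.
            if accepted[i] / proposed[i] > 0.4:
                steps[i] *= 1.05
            else:
                steps[i] *= 0.95
            T *= alpha
        return x, fx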

Relevance:

20.00%

Publisher:

Abstract:

Simulated annealing (SA) is an optimization technique that can process cost functions with varying degrees of nonlinearity, discontinuity and stochasticity, and it can handle arbitrary boundary conditions and constraints imposed on these cost functions. Here the SA technique is applied to the problem of robot path planning. Three situations are considered: the path represented as a polyline, as a Bezier curve, and as a spline-interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions; the sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate. (C) 2010 Elsevier Ltd. All rights reserved.
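
As an illustration of the polyline representation, the sketch below defines a cost that sums the path length and a penalty for waypoints inside circular obstacles; the obstacle model, penalty weight and coordinates are assumptions, and the resulting function can be handed to any SA minimizer such as the one sketched for the previous result.

    import math

    OBSTACLES = [((5.0, 5.0), 2.0), ((8.0, 2.0), 1.0)]   # (center, radius), assumed layout
    START, GOAL = (0.0, 0.0), (10.0, 10.0)

    def path_cost(waypoints_flat, penalty=100.0):
        """Cost of a polyline path given intermediate waypoints as [x1, y1, x2, y2, ...]."""
        pts = [START] + list(zip(waypoints_flat[0::2], waypoints_flat[1::2])) + [GOAL]
        length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
        violation = sum(max(0.0, r - math.dist(p, c))
                        for p in pts for c, r in OBSTACLES)
        return length + penalty * violation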

Relevance:

20.00%

Publisher:

Abstract:

High-density polyethylene resins have increasingly been used in the production of pipes for pressurized water and gas distribution systems and are expected to remain in service for many years, but they eventually fail prematurely by creep fracture. The standard methods usually employed to rank resins in terms of their resistance to fracture are expensive and impractical for quality-control purposes, justifying the search for alternative methods. The essential work of fracture (EWF) method provides a relatively simple procedure to characterize the fracture behavior of ductile polymers such as polyethylene resins. In the present work, six resins were analyzed using the EWF methodology. The results show that the plastic work dissipation factor, βw_p, is the most reliable parameter for evaluating performance. Attention must be given to specimen preparation, which might otherwise result in excessive scatter in the results, especially for the specific essential work of fracture, w_e.
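
The EWF method rests on the linear partition w_f = w_e + (βw_p)·L, so w_e and βw_p follow from a straight-line fit of the total specific work of fracture against ligament length; the sketch below shows that fit with illustrative placeholder numbers, not measurements from the paper.

    import numpy as np

    # Illustrative placeholder numbers, not measurements from the paper.
    L  = np.array([ 5.0,  7.5, 10.0, 12.5, 15.0])    # ligament length, mm
    wf = np.array([28.0, 35.5, 42.0, 49.5, 56.0])    # total specific work of fracture, kJ/m^2

    slope, intercept = np.polyfit(L, wf, 1)           # w_f = w_e + (beta*w_p) * L
    print(f"w_e (intercept)  ~ {intercept:.1f} kJ/m^2")
    print(f"beta*w_p (slope) ~ {slope:.2f} kJ/m^2 per mm")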

Relevance:

20.00%

Publisher:

Abstract:

The cost of a new ship design depends heavily on the principal dimensions of the ship; however, minimizing the dimensions often conflicts with minimizing oil outflow (in the event of an accidental spill). This study demonstrates a rational methodology for selecting the optimal dimensions and form coefficients of tankers via a genetic algorithm. A multi-objective optimization problem was formulated using two objective attributes in the evaluation of each design, namely total cost and mean oil outflow. In addition, a procedure that can be used to balance the designs in terms of weight and useful space is proposed. A genetic algorithm was implemented to search for optimal design parameters and to identify the nondominated Pareto frontier. Finally, three real ships are used as case studies. [DOI: 10.1115/1.4002740]
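
A sketch of the nondominated-front extraction step a multi-objective GA relies on, for two minimization objectives such as total cost and mean oil outflow; the candidate tuples are placeholders, not designs from the paper.

    def dominates(a, b):
        """a dominates b if a is no worse in every objective and better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(population):
        return [p for p in population
                if not any(dominates(q, p) for q in population if q is not p)]

    # Placeholder (total cost, mean oil outflow) pairs for five candidate designs.
    designs = [(120.0, 9.5), (115.0, 11.0), (130.0, 8.0), (122.0, 10.0), (125.0, 8.5)]
    print(pareto_front(designs))   # (122.0, 10.0) is dominated and filtered out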

Relevance:

20.00%

Publisher:

Abstract:

This work examines the effect of weld strength mismatch on fracture toughness measurements defined by the J and CTOD fracture parameters using single-edge notch bend (SE(B)) specimens. A central objective of the present study is to enlarge on previous developments of J and CTOD estimation procedures for welded bend specimens based upon plastic eta factors (η) and plastic rotational factors (r_p). Very detailed nonlinear finite element analyses of plane-strain models of standard SE(B) fracture specimens, with a notch located at the center of square-groove welds and in the heat-affected zone, provide the evolution of load with increasing crack mouth opening displacement required for the estimation procedure. One key result emerging from the analyses is that levels of weld strength mismatch within ±20% do not significantly affect the J and CTOD estimation expressions applicable to homogeneous materials, particularly for deeply cracked fracture specimens with relatively large weld grooves. The present study provides additional understanding of the effect of weld strength mismatch on J and CTOD toughness measurements while, at the same time, adding a fairly extensive body of results for determining the parameters J and CTOD for different materials using bend specimens with varying geometries and mismatch levels.
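
For reference, the sketch below shows the usual homogeneous-material, single-specimen forms of the J and CTOD estimates for an SE(B) specimen, which make explicit where the plastic eta factor and the plastic rotational factor r_p enter; they are standard-style expressions, not the mismatch-specific formulas derived in the paper.

    def j_estimate(K, A_pl, B, b0, eta_pl, E, nu=0.3):
        """J = elastic term from K plus a plastic term scaled by the eta factor."""
        return K**2 * (1 - nu**2) / E + eta_pl * A_pl / (B * b0)

    def ctod_estimate(K, V_pl, a0, W, r_p, sigma_ys, E, nu=0.3):
        """CTOD = small-scale-yielding term plus a plastic-hinge term driven by r_p."""
        elastic = K**2 * (1 - nu**2) / (2 * sigma_ys * E)
        hinge   = r_p * (W - a0) * V_pl / (r_p * (W - a0) + a0)
        return elastic + hinge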

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose an approach to the transient and steady-state analysis of the affine combination of one fast and one slow adaptive filter. The theoretical models are based on expressions for the excess mean-square error (EMSE) and cross-EMSE of the component filters, which allow their application to different combinations of algorithms, such as least mean-squares (LMS), normalized LMS (NLMS) and the constant modulus algorithm (CMA), considering white or colored inputs and stationary or nonstationary environments. Since the desired universal behavior of the combination depends on the correct estimation of the mixing parameter at every instant, its adaptation is also taken into account in the transient analysis. Furthermore, we propose normalized algorithms for the adaptation of the mixing parameter that exhibit good performance. Good agreement between analysis and simulation results is always observed.
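
A sketch of an affine combination of a fast and a slow NLMS filter with a normalized update of the mixing parameter, in the spirit of the scheme described above; the step sizes, filter length and smoothing constant are assumptions.

    import numpy as np

    def affine_combination(x, d, M=16, mu_fast=0.5, mu_slow=0.05, mu_eta=0.5, eps=1e-6):
        w1, w2 = np.zeros(M), np.zeros(M)
        eta, p = 0.5, eps                               # mixing parameter and difference power
        y_out = np.zeros(len(d))
        for n in range(M, len(d)):
            u = x[n - M:n][::-1]                        # regressor, most recent sample first
            y1, y2 = w1 @ u, w2 @ u
            y = eta * y1 + (1.0 - eta) * y2             # affine: eta is not constrained to [0, 1]
            e1, e2, e = d[n] - y1, d[n] - y2, d[n] - y
            norm = eps + u @ u
            w1 += mu_fast * e1 * u / norm               # fast NLMS component
            w2 += mu_slow * e2 * u / norm               # slow NLMS component
            diff = y1 - y2
            p = 0.9 * p + 0.1 * diff**2                 # low-pass power of the difference signal
            eta += mu_eta * e * diff / (eps + p)        # normalized mixing-parameter update
            y_out[n] = y
        return y_out, w1, w2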

Relevance:

20.00%

Publisher:

Abstract:

We study the problem of distributed estimation based on the affine projection algorithm (APA), which is developed from Newton's method for minimizing a cost function. The proposed solution is formulated to ameliorate the limited convergence properties of least-mean-square (LMS) type distributed adaptive filters with colored inputs. The analysis of the transient and steady-state performance at each individual node within the network is developed using a weighted spatial-temporal energy conservation relation and confirmed by computer simulations. The simulation results also verify that the proposed algorithm provides not only a faster convergence rate but also improved steady-state performance compared to an LMS-based scheme. In addition, the new approach attains acceptable misadjustment with lower computational and memory cost than a distributed recursive least-squares (RLS) based method, provided that the number of regressor vectors and the filter length are appropriately chosen.
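
A sketch of the affine projection update at a single node, which is the adaptation step such a distributed scheme builds on; the cooperation step across neighboring nodes is omitted, and the step size and regularization are assumptions.

    import numpy as np

    def apa_update(w, U, d_block, mu=0.5, eps=1e-6):
        """One affine projection iteration at a single node.
        w       : current weight estimate, shape (M,)
        U       : the K most recent regressors as columns, shape (M, K)
        d_block : the corresponding K desired samples, shape (K,)
        """
        e = d_block - U.T @ w                           # a-priori errors over the block
        return w + mu * U @ np.linalg.solve(U.T @ U + eps * np.eye(U.shape[1]), e)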

Relevance:

20.00%

Publisher:

Abstract:

As is well known, Hessian-based adaptive filters (such as the recursive least-squares (RLS) algorithm for supervised adaptive filtering, or the Shalvi-Weinstein algorithm (SWA) for blind equalization) converge much faster than gradient-based algorithms (such as the least-mean-squares (LMS) algorithm or the constant-modulus algorithm (CMA)). However, when the problem is tracking a time-variant filter, the issue is not so clear-cut: there are environments in which each family presents better performance. Given this, we propose the use of a convex combination of algorithms from different families to obtain an algorithm with superior tracking capability. We show the potential of this combination and provide a unified theoretical model for the steady-state excess mean-square error of convex combinations of gradient- and Hessian-based algorithms, assuming a random-walk model for the parameter variations. The proposed model is valid for algorithms of the same or different families, and for supervised (LMS and RLS) or blind (CMA and SWA) algorithms.
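
A sketch of the convex-combination mixing rule for two component filters such as an LMS and an RLS filter; only the mixing logic is shown, and the auxiliary-variable step size and clipping limit are common choices rather than the paper's values.

    import numpy as np

    def convex_mix_step(a, y1, y2, d_n, mu_a=1.0, a_max=4.0):
        """One update of the auxiliary variable `a`; y1, y2 are component outputs at time n."""
        lam = 1.0 / (1.0 + np.exp(-a))                  # lambda kept in (0, 1) by the sigmoid
        y = lam * y1 + (1.0 - lam) * y2                 # convex combination of the two filters
        e = d_n - y
        a += mu_a * e * (y1 - y2) * lam * (1.0 - lam)   # stochastic-gradient update of a
        return float(np.clip(a, -a_max, a_max)), y      # clipping keeps the adaptation alive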

Relevance:

20.00%

Publisher:

Abstract:

In this work, a broad analysis of local search multiuser detection (LS-MUD) for direct-sequence code-division multiple access (DS/CDMA) systems under multipath channels is carried out, considering the performance-complexity trade-off. The robustness of the LS-MUD to variations in loading, E_b/N_0, the near-far effect, the number of Rake receiver fingers, and errors in the channel coefficient estimates is verified. A comparative analysis of the bit error rate (BER) versus complexity trade-off is carried out among LS, the genetic algorithm (GA) and particle swarm optimization (PSO). Based on the deterministic behavior of the LS algorithm, simplifications of the cost-function calculation are also proposed, yielding more efficient algorithms (simplified and combined LS-MUD versions) and creating new perspectives for MUD implementation. The computational complexity is expressed in terms of the number of operations needed to converge. Our conclusions point out that the simplified LS (s-LS) method is always more efficient, independently of the system conditions, achieving better performance with lower complexity than the other heuristic detectors. In addition, the deterministic strategy and the absence of input parameters make the s-LS algorithm the most appropriate for the MUD problem. (C) 2008 Elsevier GmbH. All rights reserved.
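
A sketch of a 1-opt local search over the users' bit vector, the kind of deterministic sweep the LS-MUD relies on; the cost callable stands in for the detector's log-likelihood metric, whose exact form is not reproduced here.

    import numpy as np

    def local_search_mud(b0, cost, max_sweeps=10):
        b = np.array(b0, dtype=int)                  # candidate bits in {-1, +1}, one per user
        best = cost(b)
        for _ in range(max_sweeps):
            improved = False
            for k in range(len(b)):                  # deterministic sweep over the users
                b[k] = -b[k]                         # tentative flip of user k's bit
                c = cost(b)
                if c < best:
                    best, improved = c, True         # keep the improving flip
                else:
                    b[k] = -b[k]                     # revert
            if not improved:
                break                                # local optimum reached
        return b, best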

Relevance:

20.00%

Publisher:

Abstract:

SKAN (Skin Scanner - System for Skin Cancer Detection Using Adaptive Techniques) combines computer engineering concepts with areas such as dermatology and oncology. Its objective is to discern images of skin cancer, specifically melanoma, from images that show only common spots or other types of skin disease, using image recognition. This work makes use of the ABCDE visual rule, which is often used by dermatologists for melanoma identification, to define which characteristics are analyzed by the software. It then applies various algorithms and techniques, including an ellipse-fitting algorithm, to extract and measure these characteristics and to decide whether or not the spot is a melanoma. The results achieved are presented with special focus on the adaptive decision making and its effect on the diagnosis. Finally, other applications of the software and its algorithms are presented.
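
As a loose illustration of the feature-extraction step, the sketch below computes simple proxies for the asymmetry ("A") and border ("B") criteria from a binary lesion mask using OpenCV, including an ellipse fit; these measures and any thresholds are illustrative, not the features used by SKAN.

    import cv2
    import numpy as np

    def abcde_proxies(mask):
        """mask: binary uint8 image with lesion pixels set to 255."""
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        contour = max(contours, key=cv2.contourArea)
        area = cv2.contourArea(contour)
        perimeter = cv2.arcLength(contour, True)
        border_irregularity = perimeter**2 / (4.0 * np.pi * area)   # 1.0 for a perfect circle
        (cx, cy), axes, angle = cv2.fitEllipse(contour)             # needs >= 5 contour points
        elongation = max(axes) / max(min(axes), 1e-6)               # crude asymmetry hint
        return border_irregularity, elongation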

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the theoretical background, the architecture (described using the "4+1" view model), and the use of AdapLib, a library for the execution of adaptive devices. The library was created to remain faithful to adaptive device theory while allowing easy extension to the specific details of solutions that employ this kind of device. As an example, a case study is presented in which the library was used to build a proof of concept that monitors and diagnoses problems in an online news portal.
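
A generic sketch of the adaptive-device idea the library implements: a rule-driven device whose rules may carry adaptive actions that rewrite the rule set at run time; the class and method names are illustrative and do not reflect AdapLib's actual API.

    class AdaptiveDevice:
        """Rules map (state, event) to (next_state, adaptive_action or None)."""
        def __init__(self, rules, state):
            self.rules = dict(rules)
            self.state = state

        def step(self, event):
            next_state, action = self.rules[(self.state, event)]
            if action is not None:
                action(self.rules)        # the adaptive action rewrites the rule set itself
            self.state = next_state
            return self.state

    def learn_error_route(rules):
        # After the first error, route further errors to a "diagnose" state.
        rules[("monitoring", "error")] = ("diagnose", None)

    device = AdaptiveDevice(
        {("monitoring", "ok"): ("monitoring", None),
         ("monitoring", "error"): ("monitoring", learn_error_route)},
        state="monitoring")
    print(device.step("error"))   # stays in "monitoring", but the rule set has changed
    print(device.step("error"))   # now transitions to "diagnose"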

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to present an economic design of an X control chart for short-run production. The process mean starts at mu_0 (in control, State I) and at a random time shifts to mu_1 > mu_0 (out of control, State II). The monitoring procedure consists of inspecting a single item for every m items produced. If the measurement of the quality characteristic falls outside the control limits, the process is stopped and adjusted, and an additional (r - 1) items are inspected retrospectively. The probabilistic model was developed considering only shifts in the process mean. A direct-search technique is applied to find the optimum parameters that minimize the expected cost function. Numerical examples illustrate the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
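
A sketch of the search step: evaluate an expected-cost function over a grid of design parameters (sampling interval m, control-limit width k and retrospective sample size r) and keep the cheapest combination; grid search stands in for the paper's direct-search technique, and the cost function is left as a user-supplied placeholder.

    import itertools

    def optimize_chart(expected_cost, m_grid, k_grid, r_grid):
        """Return (minimum expected cost, m*, k*, r*) over the grid."""
        best = None
        for m, k, r in itertools.product(m_grid, k_grid, r_grid):
            c = expected_cost(m, k, r)
            if best is None or c < best[0]:
                best = (c, m, k, r)
        return best

    # Usage, with a user-supplied expected_cost(m, k, r):
    # best_cost, m_opt, k_opt, r_opt = optimize_chart(expected_cost,
    #                                                 m_grid=range(1, 21),
    #                                                 k_grid=[2.0, 2.5, 3.0, 3.5],
    #                                                 r_grid=range(2, 11))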