872 results for direct search optimization algorithm


Relevance:

100.00%

Publisher:

Abstract:

This work explores the design of piezoelectric transducers based on functional material gradation, here named functionally graded piezoelectric transducers (FGPTs). Depending on the application, FGPTs must achieve several goals, which are essentially related to the transducer resonance frequencies, vibration modes, and excitation strength at specific resonance frequencies. Several approaches can be used to achieve these goals; however, this work focuses on finding the optimal material gradation of FGPTs by means of topology optimization. Three objective functions are proposed: (i) to obtain the optimal material gradation of an FGPT that maximizes specified resonance frequencies; (ii) to design piezoelectric resonators, in which case the optimal material gradation is found to achieve desirable eigenvalues and eigenmodes; and (iii) to find the optimal material distribution of an FGPT that maximizes a specified excitation strength. To track the desired vibration mode, a mode-tracking method based on the modal assurance criterion (MAC) is applied. The continuous variation of piezoelectric, dielectric, and elastic properties is achieved by using the graded finite element concept. The optimization algorithm is based on sequential linear programming and the concept of continuum approximation of material distribution. To illustrate the method, 2D FGPTs are designed for each objective function. In addition, the performance of the FGPT is compared with that of a non-graded transducer.
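For reference, the modal assurance criterion used for mode tracking reduces to a normalized squared inner product between mode-shape vectors. A minimal Python sketch of MAC-based tracking follows; the function names and the random test data are illustrative, not taken from the paper.

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal assurance criterion between two real mode-shape vectors."""
    num = np.abs(phi_a @ phi_b) ** 2
    den = (phi_a @ phi_a) * (phi_b @ phi_b)
    return num / den

def track_mode(phi_target, modes):
    """Index of the column of `modes` most correlated with the target mode shape."""
    scores = [mac(phi_target, modes[:, j]) for j in range(modes.shape[1])]
    return int(np.argmax(scores))

# Illustrative use: follow the eigenmode that best matches a reference shape.
rng = np.random.default_rng(0)
modes_new = rng.standard_normal((50, 6))              # eigenvectors at current iteration
phi_ref = modes_new[:, 2] + 0.1 * rng.standard_normal(50)
print(track_mode(phi_ref, modes_new))                 # should print 2 (the tracked mode)
```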

Relevance:

100.00%

Publisher:

Abstract:

This article presents a systematic study of the topology-optimized design, microfabrication, and static/dynamic performance characterization of an electro-thermo-mechanical microgripper. The microgripper is designed using a topology optimization algorithm based on a spatial filtering technique, with different penalization coefficients applied to different material properties during the optimization cycle. The design has a symmetric monolithic 2D structure consisting of a complex combination of rigid links that integrates both the actuating and gripping mechanisms. The numerical simulation studies the effects of convective heat transfer, the thermal boundary conditions at the fixed anchors, and the microgripper performance with temperature-dependent and temperature-independent material properties. The microgripper is fabricated from a 25 µm thick nickel foil using laser microfabrication technology, and its static/dynamic performance is experimentally evaluated. The static and dynamic electro-mechanical characteristics are analyzed as step response functions with respect to tweezing/actuating displacements, applied current/power, and actual electric resistance. A microgripper prototype with overall dimensions of 1 mm (L) × 2.5 mm (W) delivers maximum tweezing and actuating displacements of 25.5 µm and 33.2 µm along the X and Y axes, respectively, under an applied power of 2.32 W. The experimental performance is compared with finite element simulation results.
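The spatial filtering mentioned above is commonly realized as a density filter, combined with power-law interpolation of material properties using property-specific penalization exponents. The sketch below illustrates that generic scheme only; the filter weights, exponents, and variable names are assumptions rather than the authors' implementation.

```python
import numpy as np

def density_filter(x, centers, rmin):
    """Replace each element density by a linearly distance-weighted average
    of the densities within radius rmin (a standard filtering scheme)."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    w = np.maximum(rmin - d, 0.0)                     # linear 'hat' weights
    return (w @ x) / w.sum(axis=1)

def interpolate(x, prop_min, prop_max, p):
    """Power-law interpolation with a property-specific penalization exponent p."""
    return prop_min + (x ** p) * (prop_max - prop_min)

# Illustrative use with assumed exponents for different material properties.
centers = np.random.rand(100, 2)                      # element centroids
x = np.random.rand(100)                               # raw design variables
x_f = density_filter(x, centers, rmin=0.1)
E = interpolate(x_f, 1e-3, 1.0, p=3.0)                # stiffness, assumed exponent 3
k = interpolate(x_f, 1e-3, 1.0, p=1.0)                # conductivity, assumed exponent 1
```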

Relevance:

100.00%

Publisher:

Abstract:

Systems of distributed artificial intelligence can be powerful tools in a wide variety of practical applications. Their most striking characteristic, emergent behavior, is also largely responsible for the difficulty of designing such systems. This work proposes a tool capable of generating individual strategies for the elements of a multi-agent system, thereby providing the group with the means to obtain the desired results while working in a coordinated and cooperative manner. As an application example, a problem was chosen in which a group of predators must catch a prey in a continuous three-dimensional environment. A strategy-synthesis system was implemented whose internal mechanism integrates simulators with the Particle Swarm Optimization (PSO) algorithm, a swarm intelligence technique. The system was tested in several simulation settings and was able to automatically synthesize successful hunting strategies, showing that the developed tool, provided it works with well-designed patterns, can deliver satisfactory solutions to complex problems that are difficult to solve with analytical approaches. (c) 2007 Elsevier Ltd. All rights reserved.
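A minimal sketch of the canonical PSO velocity and position update that underlies such strategy synthesis is given below; the inertia/acceleration coefficients and the toy objective are illustrative defaults, not the settings used in the paper.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical global-best PSO minimizing f over R^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))        # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

best, val = pso(lambda p: np.sum(p ** 2), dim=3)      # toy sphere objective
```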

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to present an economic design of an X̄ chart for short-run production. The process mean starts at μ0 (in control, State I) and, at a random time, shifts to μ1 > μ0 (out of control, State II). The monitoring procedure consists of inspecting a single item every m produced items. If the measurement of the quality characteristic falls outside the control limits, the process is stopped and adjusted, and an additional (r - 1) items are inspected retrospectively. The probabilistic model was developed considering only shifts in the process mean. A direct search technique is applied to find the optimum parameters that minimize the expected cost function. Numerical examples illustrate the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
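For illustration, a derivative-free direct search of this kind can be reproduced with a standard routine such as Nelder-Mead in SciPy; the cost surface below is a placeholder with an assumed trade-off shape, not the paper's expected-cost model, and the variables simply mirror the design parameters m and r described above.

```python
import numpy as np
from scipy.optimize import minimize

def expected_cost(params):
    """Placeholder expected-cost surface in the design variables
    (sampling interval m, retrospective sample size r); the paper's
    actual cost model is not reproduced here."""
    m, r = params
    return 5.0 / m + 0.02 * m + 0.5 / r + 0.05 * r    # assumed trade-off shape

res = minimize(expected_cost, x0=[10.0, 5.0], method="Nelder-Mead")
m_opt, r_opt = np.round(res.x).astype(int)            # round to integer designs
print(m_opt, r_opt, res.fun)
```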

Relevance:

100.00%

Publisher:

Abstract:

We present a fast method for finding optimal parameters for a low-resolution (threading) force field intended to distinguish correct from incorrect folds for a given protein sequence. In contrast to other methods, the parameterization uses information from more than 10^7 misfolded structures as well as a set of native sequence-structure pairs. In addition to testing the resulting force field's performance on the protein sequence threading problem, results are shown that characterize the number of parameters necessary for effective structure recognition.

Relevance:

100.00%

Publisher:

Abstract:

The large penetration of intermittent resources, such as solar and wind generation, calls for the use of storage systems to improve power system operation. Electric vehicles (EVs) with vehicle-to-grid (V2G) capability can operate as a means of storing energy. This paper proposes an algorithm, to be included in a SCADA (Supervisory Control and Data Acquisition) system, that performs intelligent management of three types of consumers (domestic, commercial, and industrial), including the joint management of loads and the charging/discharging of EV batteries. The proposed methodology has been implemented in a SCADA system developed by the authors of this paper, the SCADA House Intelligent Management (SHIM) system. Any event in the system, such as a Demand Response (DR) event, triggers an optimization algorithm that performs the optimal scheduling of energy resources (including loads and EVs), taking into account the priorities of each load defined by the installation users. A case study considering a specific consumer with several loads and EVs is presented in this paper.
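As a rough illustration of priority-aware load management under a DR event (not the SHIM algorithm itself; the load names, priorities, and greedy curtailment rule are assumptions), loads can be shed in increasing order of user-defined priority until the demand-response power limit is met:

```python
def curtail(loads, power_limit_kw):
    """Greedy illustration: shed lowest-priority loads first until total
    consumption fits under the DR power limit. `loads` maps name -> (kW, priority),
    where a higher priority means the load should be kept longer."""
    total = sum(kw for kw, _ in loads.values())
    shed = []
    for name, (kw, _) in sorted(loads.items(), key=lambda kv: kv[1][1]):
        if total <= power_limit_kw:
            break
        total -= kw
        shed.append(name)
    return shed, total

# Hypothetical domestic consumer with an EV charger treated as one more load.
loads = {"ev_charger": (3.0, 2), "hvac": (1.5, 3), "washer": (0.8, 1), "fridge": (0.2, 5)}
print(curtail(loads, power_limit_kw=2.0))
```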

Relevance:

100.00%

Publisher:

Abstract:

Several platforms allow users to label resources with tags and share information with other users. Accordingly, various ways of visualizing the tags associated with resources have been developed, with the aim of making it easier for users to search for those resources and to visualize the tag space. Among the concepts developed, the tag cloud stands out as the most common form of visualization. This document presents a study of its limitations and proposes an alternative form of visualization. It also suggests a new interpretation of how to search for and visualize information associated with tags, differing from the direct database search of the term that is currently most widely used. As a result of this implementation, a viable and innovative solution to several of the problems associated with the traditional tag cloud was obtained: the Molecule system.

Relevance:

100.00%

Publisher:

Abstract:

Most machining tasks require high accuracy and are carried out by dedicated machine-tools. On the other hand, traditional robots are flexible and easy to program, but they are rather inaccurate for certain tasks. Parallel kinematic robots could combine the accuracy and flexibility that are usually needed in machining operations. Achieving this goal requires proper design of the parallel robot. In this chapter, a multi-objective particle swarm optimization algorithm is used to optimize the structure of a parallel robot according to specific criteria. Afterwards, for a chosen optimal structure, the best location of the workpiece with respect to the robot, in a machining robotic cell, is analyzed based on the power consumed by the manipulator during the machining process.
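In such a multi-objective setting, candidate structures are typically compared through Pareto dominance. A small generic sketch of a non-dominated filter follows; the objective values are made up for illustration and are not the chapter's criteria.

```python
import numpy as np

def dominates(a, b):
    """a dominates b if it is no worse in every (minimized) objective
    and strictly better in at least one."""
    return np.all(a <= b) and np.any(a < b)

def pareto_front(objs):
    """Indices of the non-dominated rows of an (n_candidates x n_objectives) array."""
    return [i for i, a in enumerate(objs)
            if not any(dominates(b, a) for j, b in enumerate(objs) if j != i)]

# Hypothetical candidates scored on (workspace error, stiffness deficit, energy).
objs = np.array([[0.2, 1.0, 3.0], [0.3, 0.8, 2.5], [0.5, 1.2, 3.5]])
print(pareto_front(objs))   # the third candidate is dominated by the second
```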

Relevance:

100.00%

Publisher:

Abstract:

This paper introduces a new unsupervised hyperspectral unmixing method conceived for linear but highly mixed hyperspectral data sets, in which the simplex of minimum volume, usually estimated by purely geometrically based algorithms, is far away from the true simplex associated with the endmembers. The proposed method, an extension of our previous studies, resorts to a statistical framework. The abundance fraction prior is a mixture of Dirichlet densities, thus automatically enforcing the constraints on the abundance fractions imposed by the acquisition process, namely, nonnegativity and sum-to-one. A cyclic minimization algorithm is developed in which: 1) the number of Dirichlet modes is inferred based on the minimum description length principle; 2) a generalized expectation-maximization algorithm is derived to infer the model parameters; and 3) a sequence of augmented Lagrangian-based optimizations is used to compute the signatures of the endmembers. Experiments on simulated and real data are presented to show the effectiveness of the proposed algorithm in unmixing problems beyond the reach of the geometrically based state-of-the-art competitors.
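The mixture-of-Dirichlet prior keeps every abundance vector nonnegative and summing to one by construction. A minimal numerical sketch of evaluating such a mixture log-prior is shown below; the mode weights and concentration parameters are placeholders, not values from the paper.

```python
import numpy as np
from scipy.stats import dirichlet
from scipy.special import logsumexp

def dirichlet_mixture_logpdf(a, weights, alphas):
    """Log-density of an abundance vector `a` (nonnegative, summing to one)
    under a mixture of Dirichlet modes with the given weights/concentrations."""
    logs = [np.log(w) + dirichlet.logpdf(a, alpha) for w, alpha in zip(weights, alphas)]
    return logsumexp(logs)

# Two hypothetical Dirichlet modes over three endmember abundances.
weights = [0.6, 0.4]
alphas = [np.array([5.0, 2.0, 2.0]), np.array([1.0, 1.0, 8.0])]
a = np.array([0.5, 0.3, 0.2])            # satisfies the simplex constraints
print(dirichlet_mixture_logpdf(a, weights, alphas))
```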

Relevance:

100.00%

Publisher:

Abstract:

Clustering ensemble methods produce a consensus partition of a set of data points by combining the results of a collection of base clustering algorithms. In the evidence accumulation clustering (EAC) paradigm, the clustering ensemble is transformed into a pairwise co-association matrix, thus avoiding the label correspondence problem, which is intrinsic to other clustering ensemble schemes. In this paper, we propose a consensus clustering approach based on the EAC paradigm, which is not limited to crisp partitions and fully exploits the nature of the co-association matrix. Our solution determines probabilistic assignments of data points to clusters by minimizing a Bregman divergence between the observed co-association frequencies and the corresponding co-occurrence probabilities expressed as functions of the unknown assignments. We additionally propose an optimization algorithm to find a solution under any double-convex Bregman divergence. Experiments on both synthetic and real benchmark data show the effectiveness of the proposed approach.
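The core EAC step, accumulating a pairwise co-association matrix from a clustering ensemble, can be sketched as follows; this is a generic illustration of the paradigm, not the paper's full probabilistic formulation.

```python
import numpy as np

def coassociation(partitions):
    """Fraction of base partitions in which each pair of points is assigned
    to the same cluster; `partitions` is a list of label arrays of equal length."""
    n = len(partitions[0])
    C = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / len(partitions)

# Toy ensemble of three base clusterings of five points.
ensemble = [[0, 0, 1, 1, 1], [0, 0, 0, 1, 1], [1, 1, 0, 0, 0]]
print(coassociation(ensemble))
```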

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia, in fulfilment of the requirements for the Master's degree in Mathematics and Applications, specialization in Actuarial Sciences, Statistics and Operations Research.

Relevance:

100.00%

Publisher:

Abstract:

4th International Conference on Future Generation Communication Technologies (FGCT 2015), Luton, United Kingdom.

Relevance:

100.00%

Publisher:

Abstract:

Quality of life is a concept influenced by social, economic, psychological, spiritual, and medical factors. More specifically, the perceived quality of an individual's daily life is an assessment of their well-being or lack of it. In this context, information technologies may help in managing healthcare services for chronic patients, for example by estimating each patient's quality of life and helping the medical staff take appropriate measures to improve it. This paper describes a quality-of-life estimation system built with information technologies and the application of data mining algorithms to the clinical data of cancer patients from the Otorhinolaryngology and Head and Neck services of an oncology institution. The system was evaluated with a sample of 3013 patients. The results show that some variables may be significant predictors of a patient's quality of life: years of smoking (p value 0.049) and tumor size (p value < 0.001). For classifying quality of life from these variables, the best accuracy was obtained by applying John Platt's sequential minimal optimization (SMO) algorithm to train a support vector classifier. In conclusion, data mining techniques give physicians access to additional patient information, helping them assess quality of life and make well-informed clinical decisions.
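For context, Platt's SMO algorithm is the solver behind common SVM implementations such as the libsvm-based SVC in scikit-learn; the snippet below shows that generic training workflow on synthetic placeholder data, not the clinical dataset used in the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Placeholder feature matrix (e.g., years of smoking, tumor size) and QoL labels;
# the real study uses clinical records, which are not reproduced here.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)   # libsvm solves the dual with an SMO-type method
print("accuracy:", clf.score(X_te, y_te))
```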

Relevance:

100.00%

Publisher:

Abstract:

Dissertation to obtain the degree of Doctor of Philosophy in Biomedical Engineering

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: The Montenegro skin test (MST) has good clinical applicability and low cost for the diagnosis of American tegumentary leishmaniasis (ATL). However, no studies have validated the reference value (5 mm) typically used to discriminate positive and negative results. We investigated MST results and evaluated the test's performance using different cut-off points. METHODS: The results of laboratory tests for 4,256 patients with suspected ATL were analyzed, and 1,182 individuals were found to fulfill the established criteria. Two groups were formed. The positive cutaneous leishmaniasis (PCL) group included patients with skin lesions and a positive direct search for parasites (DS) result. The negative cutaneous leishmaniasis (NCL) group included patients with skin lesions with evolution of up to 2 months, negative DS results, and negative indirect immunofluorescence assay results, who were residents of urban areas reported as probable sites of infection at domiciles and peridomiciles. RESULTS: The PCL and NCL groups included 769 and 413 individuals, respectively. The mean ± standard deviation MST value in the PCL group was 12.62 ± 5.91 mm [95% confidence interval (CI): 12.20-13.04], and that in the NCL group was 1.43 ± 2.17 mm (95% CI: 1.23-1.63). Receiver-operating characteristic curve analysis indicated 97.4% sensitivity and 93.9% specificity for a cut-off of 5 mm, and 95.8% sensitivity and 97.1% specificity for a cut-off of 6 mm. CONCLUSIONS: Either 5 mm or 6 mm could be used as the cut-off value for diagnosing ATL, as both values had high sensitivity and specificity.
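The cut-off comparison amounts to computing sensitivity and specificity over the two groups at each candidate threshold. The sketch below uses simulated induration values that only mimic the reported means and standard deviations; it does not reproduce the study data.

```python
import numpy as np

def sens_spec(positive, negative, cutoff):
    """Sensitivity and specificity of calling a measurement positive when it
    is >= cutoff, given values from truly positive/negative groups."""
    sens = np.mean(positive >= cutoff)
    spec = np.mean(negative < cutoff)
    return sens, spec

rng = np.random.default_rng(0)
pcl = rng.normal(12.62, 5.91, 769)   # simulated MST induration, positive group (mm)
ncl = rng.normal(1.43, 2.17, 413)    # simulated MST induration, negative group (mm)
for cutoff in (5.0, 6.0):
    print(cutoff, sens_spec(pcl, ncl, cutoff))
```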