928 results for "Unconstrained and convex optimization"


Relevance: 100.00%

Abstract:

A new myrmicine ant, Tropidomyrmex elianae gen. n. & sp. n., is described from southeastern and central Brazil, based on workers, ergatoid gynes, males, and larvae. Tropidomyrmex workers are relatively small and monomorphic, characterized mainly by the feebly pigmented and extremely thin integument; subfalcate mandibles bearing a single apical tooth; palpal formula 1,2; a relatively broad and convex clypeus; reduced compound eyes; an unarmed propodeum with a strongly medially depressed declivous face; well-developed, double and bilobed subpostpetiolar processes; and peculiarities in the sting apparatus. A colony fragment of T. elianae containing workers, ergatoid gynes, males, and brood was found inside a ground termite nest (Anoplotermes pacificus, Apicotermitinae) in a montane rocky scrubland in the state of Minas Gerais, southeastern Brazil. Tropidomyrmex elianae is also known from two workers collected in leaf-litter samples processed with a Winkler extractor in the state of Tocantins, central-north Brazil. Despite its differences from the accepted solenopsidine genera, Tropidomyrmex is tentatively assigned to this tribe. Within the solenopsidine ants, the genus is apparently related to Tranopelta. Tropidomyrmex is marked by extreme reductions, perhaps reflecting adaptations to particular habits and habitats.

Relevance: 100.00%

Abstract:

Over the past few years, the field of global optimization has been very active, producing different kinds of deterministic and stochastic algorithms for optimization in the continuous domain. These days, the use of evolutionary algorithms (EAs) to solve optimization problems is common practice due to their competitive performance on complex search spaces. EAs are well known for their ability to deal with nonlinear and complex optimization problems. Differential evolution (DE) algorithms are a family of evolutionary optimization techniques that use a rather greedy and less stochastic approach to problem solving than classical evolutionary algorithms. The main idea is to construct, at each generation and for each element of the population, a mutant vector through a specific mutation operation that adds differences between randomly selected elements of the population to another element. Due to its simple implementation, minimal mathematical processing, and good optimization capability, DE has attracted attention. This paper proposes a new approach to solving electromagnetic design problems that combines the DE algorithm with a generator of chaotic sequences. The approach is tested on the design of a loudspeaker model with 17 degrees of freedom to show its applicability to electromagnetic problems. The results show that the DE algorithm with chaotic sequences presents better, or at least similar, results compared to the standard DE algorithm and other evolutionary algorithms available in the literature.
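
The hybrid described above lends itself to a compact sketch. Since the abstract does not specify the paper's chaotic generator or parameter settings, the version below assumes a logistic map driving the mutation factor F; it is a minimal illustration of DE with chaotic sequences, not the authors' implementation.

```python
import numpy as np

def chaotic_de(objective, bounds, pop_size=20, generations=200, cr=0.9, seed=0):
    """Differential evolution with a logistic-map chaotic sequence driving
    the mutation factor F (a sketch of the DE/chaos hybrid)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([objective(x) for x in pop])
    z = 0.7  # logistic-map state in (0, 1), away from its fixed points
    for _ in range(generations):
        for i in range(pop_size):
            z = 4.0 * z * (1.0 - z)          # chaotic update replaces a fixed F
            f = 0.4 + 0.5 * z                # map the chaos into a useful F range
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = np.clip(pop[a] + f * (pop[b] - pop[c]), lo, hi)
            mask = rng.random(dim) < cr      # binomial crossover
            mask[rng.integers(dim)] = True   # keep at least one mutant gene
            trial = np.where(mask, mutant, pop[i])
            f_trial = objective(trial)
            if f_trial <= fitness[i]:        # greedy one-to-one selection
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Smoke test on a smooth toy function (not the loudspeaker model).
x_best, f_best = chaotic_de(lambda x: float(np.sum(x**2)), bounds=[(-5, 5)] * 4)
```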

Relevance: 100.00%

Abstract:

This paper presents a metaheuristic algorithm inspired by evolutionary computation, swarm intelligence concepts, and the fundamentals of echolocation in micro bats. The aim is to solve the mono- and multiobjective optimization problems related to the brushless DC wheel motor, which has 5 design parameters and 6 constraints in the mono-objective formulation, and 2 objectives, 5 design parameters, and 5 constraints in the multiobjective version. Furthermore, results are compared with other optimization approaches proposed in the recent literature, showing the feasibility of this newly introduced technique for highly nonlinear problems in electromagnetics.
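
For readers unfamiliar with the bat metaphor, a minimal unconstrained sketch follows. The loudness/pulse-rate schedule, the parameter values, and the absence of constraint handling are textbook simplifications, not the settings used for the wheel-motor benchmark.

```python
import numpy as np

def bat_algorithm(objective, bounds, n_bats=25, iters=300, f_range=(0.0, 2.0),
                  alpha=0.9, gamma=0.9, seed=0):
    """Minimal bat algorithm: frequency-tuned velocities plus a
    loudness/pulse-rate controlled local search around the current best."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_bats, dim))
    v = np.zeros_like(x)
    fit = np.array([objective(p) for p in x])
    best = x[np.argmin(fit)].copy()
    loudness = np.ones(n_bats)     # A_i: probability scale for accepting a move
    pulse = np.zeros(n_bats)       # r_i: rate that triggers the local walk
    for t in range(iters):
        for i in range(n_bats):
            # Echolocation: a random frequency steers the bat relative to the best.
            freq = f_range[0] + (f_range[1] - f_range[0]) * rng.random()
            v[i] += (x[i] - best) * freq
            cand = np.clip(x[i] + v[i], lo, hi)
            if rng.random() > pulse[i]:
                # Local random walk around the current best solution.
                cand = np.clip(best + 0.01 * loudness.mean()
                               * rng.standard_normal(dim), lo, hi)
            f_cand = objective(cand)
            if f_cand <= fit[i] and rng.random() < loudness[i]:
                x[i], fit[i] = cand, f_cand
                loudness[i] *= alpha                 # bats get quieter over time
                pulse[i] = 1.0 - np.exp(-gamma * t)  # and pulse more often
            if f_cand <= fit.min():
                best = cand.copy()
    return best, objective(best)

# Toy run on an unconstrained function (not the wheel-motor problem).
x_star, f_star = bat_algorithm(lambda x: float(np.sum((x - 1.0)**2)), [(-5, 5)] * 5)
```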

Relevance: 100.00%

Abstract:

This paper describes a design methodology for piezoelectric energy harvesters that thinly encapsulate the mechanical devices and exploit resonances from higher-order vibrational modes. The direction of polarization determines the sign of the piezoelectric tensor, so as to avoid cancellations of electric fields from opposite polarizations in the same circuit. The resulting modified equations of state are solved by the finite element method (FEM). Combining this method with the solid isotropic material with penalization (SIMP) method for piezoelectric material, we have developed an optimization methodology that optimizes the piezoelectric material layout and polarization direction. Updating the density function of the SIMP method is performed based on sensitivity analysis, sequential linear programming in the early stage of the optimization, and the phase field method in the latter stage.
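
The interplay of density and polarization variables can be sketched at the level of a single element. The penalization exponents and the way the polarization variable theta scales the coupling below are illustrative assumptions; the paper's actual interpolation scheme and phase-field update are not reproduced here.

```python
import numpy as np

def simp_piezo(rho, theta, e0, p_stiff=3.0, p_piezo=3.0):
    """SIMP-style interpolation for one element of a piezoelectric layout.

    rho   : density design variable in [0, 1] (material vs. void)
    theta : polarization design variable in [-1, 1]; its sign sets the sign
            of the piezoelectric tensor so that opposite polarizations in the
            same circuit do not cancel each other's electric fields
    e0    : nominal piezoelectric coupling tensor (here a placeholder array)
    """
    stiffness_scale = rho ** p_stiff  # penalized elastic modulus factor
    e_eff = np.sign(theta) * (abs(theta) ** p_piezo) * (rho ** p_piezo) * e0
    return stiffness_scale, e_eff

# One element at half density, negatively polarized (toy tensor values):
scale, e_eff = simp_piezo(0.5, -1.0, e0=np.ones((3, 6)))
```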

Relevance: 100.00%

Abstract:

The main problem of cone beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the large amount of scattered radiation added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering and methods to reduce its effects are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). To investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for the environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 x 70 mm², and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which Empa was able to patent. The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. In addition, it has offered a basis for a new scatter correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art, well-collimated fan-beam CT, with a factor-of-10 gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
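
Independently of the patented algorithm, the basic correction idea reduces to subtracting the Monte Carlo estimates of object and environmental scatter from each projection before the log transform, and tracking the scatter-to-primary ratio (SPR) as the figure of merit. A sketch with toy numbers (not Empa's data):

```python
import numpy as np

def correct_projection(measured, scatter_object, scatter_env, eps=1e-6):
    """Subtractive scatter correction of one projection (a sketch, not the
    patented algorithm): remove the MC estimates of object and environmental
    scatter before the log transform, so the cupping they cause does not
    propagate into the reconstruction."""
    return np.clip(measured - scatter_object - scatter_env, eps, None)

def scatter_to_primary_ratio(scatter, primary):
    """SPR = S / P, the figure of merit used to assess the anti-scatter grid."""
    return scatter.sum() / primary.sum()

# Toy projection where scatter is 2.3x the primary, as reported for some objects.
primary = np.full((64, 64), 100.0)
scatter = 2.3 * primary
measured = primary + scatter
spr = scatter_to_primary_ratio(scatter, primary)                    # 2.3
recovered = correct_projection(measured, 0.8 * scatter, 0.2 * scatter)
```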

Relevance: 100.00%

Abstract:

Molecular imaging technologies such as positron emission tomography (PET) are playing a key role in drug discovery, development, and delivery, owing to the possibility of quantifying, for example, the binding potential in vivo, non-invasively and repeatedly. In this context, PET provides a significant advance in the understanding of many CNS disorders and conditions. The serotonergic receptor system is involved in a number of important physiological processes and diseases such as depression, schizophrenia, Alzheimer's disease, sleep, or sexual behaviour. The 5-HT2A and 5-HT1A receptor subtypes in particular are the focus of fundamental and clinical research, because many psychoactive drugs interact with these neuronal transmembrane receptors. This work describes the successful development and the in vitro and in vivo evaluation of 5-HT2A- and 5-HT1A-selective antagonistic PET radiotracers. The major achievements of this thesis are:

1. the development and in vitro evaluation of several 5-HT2A antagonistic compounds, namely MH.MZ (Ki = 9.0 nM), (R)-MH.MZ (Ki = 0.72 nM), and MA-1 (Ki = 3.0 nM);

2. the 18F-labeling of these compounds and its optimization, with radiochemical yields > 35% at high specific activities (> 15 GBq/µmol); the synthesis time, including secondary synthon synthesis, the radioactive labeling procedure, separation, and final formulation, was no longer than 120 min and provided the tracers in high radiochemical purity;

3. the in vivo µPET evaluation of [18F]MH.MZ and (R)-[18F]MH.MZ, yielding promising imaging agents of the 5-HT2A receptor status, of which (R)-[18F]MH.MZ appears to be the most promising ligand;

4. the determination of the influence of P-gp on the brain biodistribution of [18F]MH.MZ, showing a strong P-gp dependency but no regional alteration;

5. the four-step radiosynthesis and evaluation of [18F]MDL 100907, yielding another high-affinity tracer, which is, however, limited by its low radiochemical yield;

6. the development and evaluation of 3 novel candidate 5-HT2A imaging agents combining structural elements of altanserin, MDL 100907, and SR 46349B, demonstrating different binding modes of these compounds;

7. the development, labeling, and in vitro evaluation of the novel 5-HT1A antagonistic tracer [18F]AH1.MZ (Ki = 4.2 nM).

Relevance: 100.00%

Abstract:

A major weakness of composite materials is that low-velocity impact, introduced accidentally during manufacture, operation, or maintenance of the aircraft, may result in delaminations between the plies. Therefore, the first part of this study focuses on the mechanics of curved laminates under impact. To this aim, the effect of preloading on the impact response of curved composite laminates is considered. By applying the preload, the through-thickness stress and the curvature of the laminates increase. The results showed that all impact parameters vary significantly. To understand the respective contributions of preloading and pre-stress to these results, another test was designed. The interesting phenomenon is that preloading can decrease the damaged area when the curvature of both specimens is the same. Finally, the effect of curvature type, concave or convex, is investigated under impact loading. In the second part, a new composition of nanofibrous mats is developed to improve the efficiency of curved laminates under impact loading. First, fracture tests are conducted to assess the effect of Nylon 6,6, PCL, and their mixture on mode I and mode II fracture toughness. For this goal, nanofibers are electrospun and interleaved at the mid-plane of the composite laminate for the mode I and mode II tests. The results show that Nylon 6,6 performs better than PCL in mode II, while the effect of PCL on mode I fracture toughness is greater. Mixing these nanofibers compensates for the shortcomings of each individual type, so the Nylon 6,6/PCL nanofibers increased both mode I and mode II fracture toughness. All these nanofibers were then interleaved between all layers of the laminate to investigate their effect on the damaged area. The results showed that PCL could decrease the damaged area by about 25%, and Nylon 6,6 and the mixed nanofibers by about 50%.

Relevance: 100.00%

Abstract:

Demand for bio-fuels is expected to increase due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily driven by biomass harvesting, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility about to be built in Kinross Township of Chippewa County, Michigan, a simulation model and an optimization model have been developed to provide decision support for the facility. Both models track cost, emissions, and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model presents detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it affects the feasibility of harvesting activity and the duration of transportation restrictions, which significantly changes the availability of feedstock for the processing facility. This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions, and soil type, as well as individual decision-making processes at the county level. The spring break-up model, based on historical spring break-up data from 27 counties over the period 2002-2010, starts by specifying the probability distribution of a particular county's spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. To estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted. Using realizations (scenarios) of spring break-up generated by the statistical spring break-up model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy. For an early spring break-up, which usually indicates a longer-than-average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
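
Reduced major axis regression, one of the two regression types mentioned, has a simple closed form: the slope is the ratio of the standard deviations, signed by the correlation, so neither variable is privileged as error-free. A sketch with hypothetical day-of-year data (not the 2002-2010 county records):

```python
import numpy as np

def rma_regression(x, y):
    """Reduced major axis (geometric mean) regression: slope is the ratio of
    the sample standard deviations, signed by the correlation; used here to
    relate one county's break-up dates to a reference county's."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Hypothetical break-up start days (day of year), reference vs. neighbor county.
ref = [68, 72, 75, 80, 71, 77, 83, 79, 74]
nbr = [70, 75, 77, 84, 72, 80, 88, 82, 76]
slope, intercept = rma_regression(ref, nbr)   # nbr ≈ intercept + slope * ref
```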

Relevance: 100.00%

Abstract:

BACKGROUND: Sequencing-based mutation screening assays of genes encompassing large numbers of exons could be substantially optimized by multiplex PCR, which enables simultaneous amplification of many targets in one reaction. In the present study, a multiplex PCR protocol originally developed for fragment analysis was evaluated for sequencing-based mutation screening of the ornithine transcarbamylase (OTC) and the medium-chain acyl-CoA dehydrogenase (MCAD) genes. METHODS: Single-exon and multiplex PCR protocols were applied to generate PCR templates for subsequent DNA sequencing of all exons of the OTC and MCAD genes. For each PCR protocol, and using the same DNA samples, 66 OTC and 98 MCAD sequence reads were generated. The sequences derived from the two PCR methods were compared in terms of the individual signal-to-noise ratios of the four bases and the proportion of high-quality base signals. RESULTS: The single-exon and the multiplex PCR protocols gave qualitatively comparable results for the two genes. CONCLUSIONS: Many existing sequencing-based mutation analysis protocols may be easily optimized with the proposed method, since the multiplex PCR protocol was successfully applied to generate sequencing templates for the OTC and MCAD genes without any re-design of the PCR primers or other optimization steps.

Relevance: 100.00%

Abstract:

ABSTRACT: Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.

Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge; the instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test bed. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent KB tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is automatic technical report generation, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge-sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. These methods play an efficient role in reducing the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
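
The model-switching idea at the core of the KB tool can be caricatured in a few lines. The real tool reasons over the EAM ontology inside FiPER; the rule and thresholds below are hypothetical stand-ins that merely illustrate when an optimizer might prefer the cheap beam model over the accurate shell model.

```python
def choose_model(iteration, total_iters, constraint_margin):
    """Hypothetical fidelity-switching rule sketching the KB tool's idea:
    use the cheap beam-element FE model while exploring, and the costly,
    more accurate shell-element model near convergence or near an active
    constraint. The thesis tool derives this decision from an ontology
    of modeling knowledge rather than from fixed thresholds."""
    if constraint_margin < 0.05 or iteration > 0.8 * total_iters:
        return "shell"   # high fidelity, high computational cost
    return "beam"        # lower accuracy, much cheaper

# Toy loop: the constraint margin shrinks as the optimizer closes in.
for it in range(10):
    margin = 0.5 / (it + 1)
    print(it, choose_model(it, 10, margin))
```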

Relevance: 100.00%

Abstract:

Responses of many real-world problems can only be evaluated perturbed by noise. To make an efficient optimization of these problems possible, intelligent optimization strategies that successfully cope with noisy evaluations are required. In this article, a comprehensive review of existing kriging-based methods for the optimization of noisy functions is provided. In total, ten methods for choosing the sequential samples are described using a unified formalism. They are compared on analytical benchmark problems on which the usual assumption of homoscedastic Gaussian noise made in the underlying models is met. Different problem configurations (noise level, maximum number of observations, initial number of observations) and setups (covariance functions) are considered. It is found that the choices of the initial sample size and the covariance function are not critical. The choice of the method, however, can result in significant differences in performance. In particular, the three most intuitive criteria are found to be poor alternatives. Although no criterion is found to be consistently more efficient than the others, two specialized methods appear more robust on average.
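
Many of the compared criteria are noise-aware descendants of expected improvement (EI). As a point of reference, here is a sketch of classic EI evaluated from a kriging posterior; the noisy-optimization criteria surveyed in such reviews (e.g., quantile-based variants) modify this quantity rather than reuse it directly.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sd, best):
    """Classic EI for minimization, computed from the kriging posterior mean
    mu and standard deviation sd at candidate points:
    EI = (best - mu) * Phi(z) + sd * phi(z), with z = (best - mu) / sd."""
    sd = np.maximum(sd, 1e-12)          # guard against zero posterior variance
    z = (best - mu) / sd
    return (best - mu) * norm.cdf(z) + sd * norm.pdf(z)

# Pick the next evaluation point among candidates (made-up posterior values).
mu = np.array([0.2, 0.0, -0.1, 0.3])   # posterior means
sd = np.array([0.05, 0.30, 0.10, 0.20])  # posterior standard deviations
next_idx = np.argmax(expected_improvement(mu, sd, best=0.0))
```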

Relevance: 100.00%

Abstract:

Several strategies relying on kriging have recently been proposed for adaptively estimating contour lines and excursion sets of functions under severely limited evaluation budget. The recently released R package KrigInv 3 is presented and offers a sound implementation of various sampling criteria for those kinds of inverse problems. KrigInv is based on the DiceKriging package, and thus benefits from a number of options concerning the underlying kriging models. Six implemented sampling criteria are detailed in a tutorial and illustrated with graphical examples. Different functionalities of KrigInv are gradually explained. Additionally, two recently proposed criteria for batch-sequential inversion are presented, enabling advanced users to distribute function evaluations in parallel on clusters or clouds of machines. Finally, auxiliary problems are discussed. These include the fine tuning of numerical integration and optimization procedures used within the computation and the optimization of the considered criteria.
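
KrigInv itself is an R package, but the flavor of its pointwise criteria translates directly. Below is a Python sketch (with made-up posterior values) of a probability-of-misclassification criterion of the kind used for excursion-set estimation: sample where the kriging model is least certain whether the function lies above or below the threshold.

```python
import numpy as np
from scipy.stats import norm

def misclassification_probability(mu, sd, threshold):
    """Pointwise uncertainty criterion for excursion-set estimation: the
    probability that the kriging model misclassifies a point relative to
    the threshold T. It peaks at 0.5 where the model is most unsure, so
    sampling at its maximum shrinks the uncertain region fastest."""
    sd = np.maximum(sd, 1e-12)
    p_above = 1.0 - norm.cdf((threshold - mu) / sd)
    return np.minimum(p_above, 1.0 - p_above)

# Made-up kriging posterior at four candidate points, threshold T = 1.0.
mu = np.array([0.80, 1.02, 1.20, 0.99])
sd = np.array([0.10, 0.05, 0.20, 0.30])
next_idx = np.argmax(misclassification_probability(mu, sd, threshold=1.0))
```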

Relevance: 100.00%

Abstract:

Background: Statistical shape models are widely used in biomedical research. They are routinely implemented for automatic image segmentation or object identification in medical images. In these fields, however, the acquisition of the large training datasets required to develop these models is usually time-consuming. Even after this effort, the collections of datasets are often lost or mishandled, resulting in duplication of work. Objective: To solve these problems, the Virtual Skeleton Database (VSD) is proposed as a centralized storage system where the data necessary to build statistical shape models can be stored and shared. Methods: The VSD provides an online repository system tailored to the needs of the medical research community. The processing of the most common image file types, a statistical shape model framework, and an ontology-based search provide the generic tools to store, exchange, and retrieve digital medical datasets. The hosted data are accessible to the community, catalyzing collaborative research. Results: To illustrate the need for an online repository for medical research, three exemplary VSD projects are presented: (1) an international collaboration to improve cochlear surgery and implant optimization, (2) a population-based analysis of femoral fracture risk between genders, and (3) an online application developed for the evaluation and comparison of brain tumor segmentations. Conclusions: The VSD is a novel system for scientific collaboration in the medical image community, with a data-centric concept and a semantically driven search option for anatomical structures. The repository has proven to be a useful tool for collaborative model building, as a resource for biomechanical population studies, and for enhancing segmentation algorithms.

Relevance: 100.00%

Abstract:

We consider the problem of fitting a union of subspaces to a collection of data points drawn from one or more subspaces and corrupted by noise and/or gross errors. We pose this problem as a non-convex optimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean and self-expressive dictionary plus a matrix of noise and/or gross errors. By self-expressive we mean a dictionary whose atoms can be expressed as linear combinations of themselves with low-rank coefficients. In the case of noisy data, our key contribution is to show that this non-convex matrix decomposition problem can be solved in closed form from the SVD of the noisy data matrix. The solution involves a novel polynomial thresholding operator on the singular values of the data matrix, which requires minimal shrinkage. For one subspace, a particular case of our framework leads to classical PCA, which requires no shrinkage. For multiple subspaces, the low-rank coefficients obtained by our framework can be used to construct a data affinity matrix from which the clustering of the data according to the subspaces can be obtained by spectral clustering. In the case of data corrupted by gross errors, we solve the problem using an alternating minimization approach, which combines our polynomial thresholding operator with the more traditional shrinkage-thresholding operator. Experiments on motion segmentation and face clustering show that our framework performs on par with state-of-the-art techniques at a reduced computational cost.
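
The closed-form step can be sketched as follows. A plain keep-or-zero threshold on the singular values stands in for the paper's polynomial thresholding operator, whose exact form is not reproduced here; with the retained right singular vectors V_k, the coefficients C = V_k V_k^T are low rank and satisfy the self-expressive identity A = AC.

```python
import numpy as np

def lowrank_selfexpressive(D, tau=1.0):
    """Sketch of the closed-form step: take the SVD of the noisy data matrix
    D and threshold its singular values to obtain the clean self-expressive
    dictionary A and the low-rank coefficients C. A keep-or-zero rule is a
    placeholder for the paper's polynomial thresholding operator."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    s_thr = np.where(s > tau, s, 0.0)       # placeholder thresholding
    A = U @ np.diag(s_thr) @ Vt             # clean dictionary
    keep = s_thr > 0
    C = Vt[keep].T @ Vt[keep]               # low-rank coefficients: A = A @ C
    return A, C

# Toy data: 100 points near a 3-dimensional subspace, plus small noise.
rng = np.random.default_rng(0)
D = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 100))
D += 0.05 * rng.standard_normal(D.shape)
A, C = lowrank_selfexpressive(D, tau=2.0)
affinity = np.abs(C) + np.abs(C.T)          # input to spectral clustering
```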