892 results for ISE and ITSE optimization


Relevance:

100.00%

Publisher:

Abstract:

Neural networks consist of highly interconnected and parallel nonlinear processing elements that are shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed explicitly using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the problem considered. The problems that can be treated by the proposed approach include combinatorial optimization problems, dynamic programming problems, and nonlinear optimization problems.
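
The energy-descent behavior such networks rely on can be illustrated with a plain discrete Hopfield update (a generic sketch; the paper's modified network and valid-subspace parameterization are not reproduced here, and the tiny weight matrix below is purely illustrative):

```python
def hopfield_update(v, W, b, i):
    """Update unit i: v[i] = 1 if its net input is positive, else 0.
    Each such update never increases the network energy."""
    net = sum(W[i][j] * v[j] for j in range(len(v))) + b[i]
    v[i] = 1 if net > 0 else 0

def energy(v, W, b):
    """E(v) = -1/2 v^T W v - b^T v; equilibria are local minima of E."""
    n = len(v)
    quad = sum(W[i][j] * v[i] * v[j] for i in range(n) for j in range(n))
    return -0.5 * quad - sum(b[i] * v[i] for i in range(n))

# Tiny 2-unit example: positive coupling makes (1, 1) the equilibrium.
W = [[0.0, 1.0], [1.0, 0.0]]
b = [0.5, 0.5]
v = [0, 0]
for _ in range(3):          # sweep until the state stops changing
    for i in range(len(v)):
        hopfield_update(v, W, b, i)
print(v, energy(v, W, b))   # -> [1, 1] -2.0
```

The point of such constructions is that, once the weights encode the optimization problem, repeated updates drive the state to an energy minimum, i.e., a candidate solution.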

Relevance:

100.00%

Publisher:

Abstract:

Wireless sensor network (WSN) is a technology that can be used to monitor and actuate on environments in a non-intrusive way. The main difference between WSNs and traditional sensor networks is the low dependability of WSN nodes. WSN solutions are therefore based on a huge number of cheap, tiny nodes that can present faults in hardware, software, and wireless communication. The deployment of hundreds of nodes can overcome the low dependability of individual nodes; however, this strategy introduces many challenges regarding network management, real-time requirements, and self-optimization. In this paper we present a simulated annealing approach that self-optimizes large-scale WSNs. Simulation results indicate that our approach can achieve self-optimization characteristics in a dynamic WSN. © 2012 IEEE.
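
Simulated annealing, the core of the proposed self-optimization, can be sketched generically (the WSN cost model is replaced here by a hypothetical 1-D function; cooling schedule and parameters are illustrative assumptions):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95, steps=200):
    """Generic simulated annealing: accept worse moves with probability
    exp(-delta/T), so the search can escape local minima; T decays
    geometrically so the search turns greedy at the end."""
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy stand-in for a WSN configuration cost: a parabola with sine
# ripples (local minima); neighbor() perturbs the current state.
random.seed(1)
f = lambda x: (x - 3) ** 2 + math.sin(5 * x)
step = lambda x: x + random.uniform(-0.5, 0.5)
x, fx = simulated_annealing(f, step, x0=0.0)
print(round(x, 2), round(fx, 2))
```

In a real deployment the state would be a full network configuration and `neighbor` would change, e.g., one node's role or routing choice.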

Relevance:

100.00%

Publisher:

Abstract:

Purpose: The aim of this systematic review was to evaluate clinical and safety data for recombinant human bone morphogenetic protein-2 (rhBMP-2) in an absorbable collagen sponge (ACS) carrier when used for alveolar ridge/maxillary sinus augmentation in humans. Materials and Methods: Clinical studies/case series published 1980 through June 2012 using rhBMP-2/ACS were searched. Studies meeting the following criteria were considered eligible for inclusion: >10 subjects at baseline and maxillary sinus or alveolar ridge augmentation not concomitant with implant placement. Results: Seven of 69 publications were eligible for review. rhBMP-2/ACS yielded clinically meaningful bone formation for maxillary sinus augmentation that would allow placement of regular dental implants, without consistent differences between rhBMP-2 concentrations. Nevertheless, the statistical analysis showed that sinus augmentation following autogenous bone graft was significantly greater (mean bone height: 1.6 mm, 95% CI: 0.5–2.7 mm) than for rhBMP-2/ACS (rhBMP-2 at 1.5 mg/mL). In extraction sockets, rhBMP-2/ACS maintained alveolar ridge height while enhancing alveolar ridge width. Safety reports did not raise concerns for the proposed indications. Conclusions: rhBMP-2/ACS appears a promising alternative to autogenous bone grafts for alveolar ridge/maxillary sinus augmentation; dose and carrier optimization may expand its efficacy, use, and clinical application.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Over the past few years, the field of global optimization has been very active, producing different kinds of deterministic and stochastic algorithms for optimization in the continuous domain. These days, the use of evolutionary algorithms (EAs) to solve optimization problems is common practice due to their competitive performance on complex search spaces. EAs are well known for their ability to deal with nonlinear and complex optimization problems. Differential evolution (DE) algorithms are a family of evolutionary optimization techniques that use a rather greedy and less stochastic approach to problem solving compared to classical evolutionary algorithms. The main idea is to construct, at each generation and for each element of the population, a mutant vector through a specific mutation operation based on adding differences between randomly selected elements of the population to another element. Due to its simple implementation, minimal mathematical processing, and good optimization capability, DE has attracted attention. This paper proposes a new approach to solving electromagnetic design problems that combines the DE algorithm with a generator of chaotic sequences. This approach is tested on the design of a loudspeaker model with 17 degrees of freedom, showing its applicability to electromagnetic problems. The results show that the DE algorithm with chaotic sequences presents better, or at least similar, results compared to the standard DE algorithm and other evolutionary algorithms available in the literature.
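
The mutation operation described above, with a chaotic sequence driving the scale factor, can be sketched as follows (the logistic map and the way chaos modulates F are illustrative assumptions, not necessarily the paper's scheme; the sphere function stands in for the electromagnetic design cost):

```python
import random

def logistic_map(x):
    """Chaotic logistic map x <- 4x(1-x), a common chaos generator."""
    return 4.0 * x * (1.0 - x)

def de_step(pop, fitness, f_scale, cr, chaos):
    """One DE/rand/1/bin generation. For each target i, a mutant is
    pop[a] + F*(pop[b] - pop[c]); here the chaotic sequence modulates
    the mutation factor F. Greedy selection keeps the better of
    target and trial vector."""
    n, dim = len(pop), len(pop[0])
    new_pop = []
    for i in range(n):
        a, b, c = random.sample([k for k in range(n) if k != i], 3)
        chaos = logistic_map(chaos)
        F = f_scale * chaos  # chaotic modulation of the mutation factor
        mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
        jrand = random.randrange(dim)  # binomial crossover, one forced gene
        trial = [mutant[d] if (random.random() < cr or d == jrand) else pop[i][d]
                 for d in range(dim)]
        new_pop.append(trial if fitness(trial) <= fitness(pop[i]) else pop[i])
    return new_pop, chaos

# Minimize the sphere function as a stand-in for a design cost.
random.seed(2)
sphere = lambda x: sum(v * v for v in x)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
chaos = 0.7
for _ in range(100):
    pop, chaos = de_step(pop, sphere, f_scale=0.8, cr=0.9, chaos=chaos)
best = min(pop, key=sphere)
print(round(sphere(best), 4))
```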

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a metaheuristic algorithm inspired by evolutionary computation, swarm intelligence concepts, and the fundamentals of echolocation in micro bats. The aim is to solve mono- and multiobjective optimization problems related to the brushless DC wheel motor problem, which has 5 design parameters and 6 constraints in the mono-objective version, and 2 objectives, 5 design parameters, and 5 constraints in the multiobjective version. Furthermore, results are compared with other optimization approaches proposed in the recent literature, showing the feasibility of this newly introduced technique for highly nonlinear problems in electromagnetics.
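
A hedged sketch of the echolocation-inspired metaheuristic in a simplified mono-objective form (parameter values, the bound handling, and the local-walk variant are assumptions for illustration, not the paper's formulation; a stand-in cost replaces the motor model):

```python
import random

def bat_algorithm(cost, dim, n=15, iters=150, fmin=0.0, fmax=2.0,
                  loudness=0.9, pulse_rate=0.5, bounds=(-5.0, 5.0)):
    """Minimal bat-algorithm sketch: each bat carries a position and a
    velocity driven by a randomly tuned echolocation frequency; with
    some probability it instead performs a small local walk around the
    current best bat; improving moves are accepted with a probability
    tied to the loudness."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    best = min(pos, key=cost)
    for _ in range(iters):
        for i in range(n):
            f = fmin + (fmax - fmin) * random.random()  # frequency tuning
            vel[i] = [vel[i][d] + (pos[i][d] - best[d]) * f for d in range(dim)]
            cand = [min(hi, max(lo, pos[i][d] + vel[i][d])) for d in range(dim)]
            if random.random() > pulse_rate:  # local walk around the best bat
                cand = [min(hi, max(lo, best[d] + 0.1 * random.gauss(0, 1)))
                        for d in range(dim)]
            if cost(cand) <= cost(pos[i]) and random.random() < loudness:
                pos[i] = cand
                if cost(cand) < cost(best):
                    best = list(cand)
    return best

random.seed(3)
sphere = lambda x: sum(v * v for v in x)
b = bat_algorithm(sphere, dim=3)
print(round(sphere(b), 4))
```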

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to determine whether there were significant differences in accounting indicators when comparing sustainable enterprises to similar companies not considered sustainable. The Corporate Sustainability Index (ISE) of BM&FBOVESPA (the São Paulo Stock, Commodities and Futures Exchange) was the criterion selected to break down the samples into sustainable and non-sustainable enterprises. The accounting indicators were separated into two kinds: risk (dividend payout, percentage growth of assets, financial leverage, current liquidity, asset size, variability of earnings, and accounting beta) and return (ROA, ROE, asset turnover, and net margin). We individually analyzed the companies in the energy sector, followed by those in the banking sector, as well as the entire ISE portfolio as of 2008/2009, including all sectors. Mann-Whitney tests were performed in order to verify differences in means between the groups (ISE and non-ISE). The results, considering the method chosen and the time span covered by the study, indicate that there are no differences between sustainable companies and the others when they are assessed by the accounting indicators used here.
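
The group comparison above rests on the Mann-Whitney test; a minimal pure-Python sketch of the U statistic via midranks (the indicator values below are hypothetical, not the study's data):

```python
def mann_whitney_u(x, y):
    """Rank-sum form of the Mann-Whitney U statistic, with midranks for
    ties: U_x = R_x - n_x(n_x+1)/2, where R_x is the rank sum of x in
    the pooled sample. U ranges from 0 to n_x * n_y."""
    pooled = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1                     # run of tied values at positions i..j-1
        midrank = (i + j + 1) / 2.0    # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[pooled[k][1]] = midrank
        i = j
    r_x = sum(ranks[:len(x)])
    return r_x - len(x) * (len(x) + 1) / 2.0

# Hypothetical ROA values for an ISE group and a non-ISE group.
roa_ise = [4.1, 5.3, 3.8, 6.0]
roa_non = [3.9, 4.7, 5.1, 4.4]
u = mann_whitney_u(roa_ise, roa_non)
print(u)  # -> 9.0
```

In practice U is then compared against its null distribution (or a normal approximation for larger samples) to obtain the p-value.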

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a design methodology for piezoelectric energy harvesters that thinly encapsulate the mechanical devices and exploit resonances from higher-order vibrational modes. The direction of polarization determines the sign of the piezoelectric tensor to avoid cancellations of electric fields from opposite polarizations in the same circuit. The resultant modified equations of state are solved by the finite element method (FEM). Combining this method with the solid isotropic material with penalization (SIMP) method for piezoelectric material, we have developed an optimization methodology that optimizes the piezoelectric material layout and polarization direction. Updating the density function of the SIMP method is performed based on sensitivity analysis, sequential linear programming in the early stage of the optimization, and the phase field method in the latter stage.
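
The SIMP method mentioned above penalizes intermediate material densities; a minimal sketch of the interpolation rule (the power-law form with exponent p = 3 and the small floor E_min are common generic choices assumed here, not values taken from the paper):

```python
def simp_stiffness(rho, e0, p=3.0, e_min=1e-9):
    """SIMP interpolation: effective property E(rho) = E_min + rho^p (E0 - E_min)
    for a density variable rho in [0, 1]. The penalization exponent p > 1
    makes intermediate densities structurally inefficient, pushing the
    optimizer toward crisp 0/1 (void/material) layouts."""
    return e_min + (rho ** p) * (e0 - e_min)

# Intermediate densities are penalized: at rho = 0.5 with p = 3 the
# element delivers only 12.5% of the full stiffness while still
# "costing" half the material in the volume constraint.
print(simp_stiffness(1.0, 200e9))   # full material
print(simp_stiffness(0.5, 200e9))   # 0.125 * 200 GPa
```

For the piezoelectric case described in the paper, an analogous interpolation is applied to the material layout together with a separate design variable for the polarization direction.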

Relevance:

100.00%

Publisher:

Abstract:

The main problem connected to cone beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the high amount of scattered radiation that is added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering and methods to reduce its effects are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for the environmental scatter showed that it is the major component of the scattering for aluminum box objects of front size 70 x 70 mm2, and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which could be patented by Empa.
The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. Besides, it has offered a basis for a new scatter correction approach by which it has been possible to achieve images with the same spatial resolution as state-of-the-art, well-collimated fan-beam CT, with a factor-of-10 gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.

Relevance:

100.00%

Publisher:

Abstract:

Molecular imaging technologies such as Positron Emission Tomography (PET) are playing a key role in drug discovery, development and delivery due to the possibility of quantifying, e.g., the binding potential in vivo, non-invasively and repetitively. In this context, PET provides a significant advance in the understanding of many CNS disorders and conditions. The serotonergic receptor system is involved in a number of important physiological processes and diseases such as depression, schizophrenia, Alzheimer's disease, sleep or sexual behaviour. In particular, the 5-HT2A and 5-HT1A receptor subtypes are in the focus of fundamental and clinical research because many psychotic drugs interact with these neuronal transmembrane receptors. This work describes the successful development, as well as the in vitro and in vivo evaluation, of 5-HT2A and 5-HT1A selective antagonistic PET radiotracers. The major achievements of this thesis are:
1. the development and in vitro evaluation of several 5-HT2A antagonistic compounds, namely MH.MZ (Ki = 9.0 nM), (R)-MH.MZ (Ki = 0.72 nM) and MA-1 (Ki = 3.0 nM);
2. the 18F-labeling procedure for these compounds and its optimization, achieving radiochemical yields > 35% at high specific activities (> 15 GBq/µmol); synthesis time, including secondary synthon synthesis, the radioactive labeling procedure, separation and final formulation, took no longer than 120 min and provided the tracer in high radiochemical purity;
3. the in vivo µPET evaluation of [18F]MH.MZ and (R)-[18F]MH.MZ, yielding promising imaging agents of the 5-HT2A receptor status, of which (R)-[18F]MH.MZ seems the most promising ligand;
4. the determination of the influence of P-gp on the brain biodistribution of [18F]MH.MZ, showing a strong P-gp dependency but no regional alteration;
5. the four-step radiosynthesis and evaluation of [18F]MDL 100907, yielding another high-affinity tracer that is, however, limited by its low radiochemical yield;
6. the development and evaluation of 3 novel candidate 5-HT2A imaging agents combining structural elements of altanserin, MDL 100907 and SR 46349B, demonstrating different binding modes of these compounds;
7. the development, labeling and in vitro evaluation of the novel 5-HT1A antagonistic tracer [18F]AH1.MZ (Ki = 4.2 nM).

Relevance:

100.00%

Publisher:

Abstract:

Demand for bio-fuels is expected to increase, due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily related to biomass harvesting activity, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility about to be built in Kinross Township of Chippewa County, Michigan, a simulation model and an optimization model have been developed to provide decision support for the facility. Both models track cost, emissions and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model aims to present detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it impacts the feasibility of harvesting activity and the duration of transportation restrictions, which significantly changes the availability of feedstock for the processing facility. This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions and soil type, as well as individual decision-making processes at the county level. The spring break-up model, based on historical spring break-up data from 27 counties over the period 2002-2010, starts by specifying the probability distribution of a particular county's spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. In order to estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted.
Using realizations (scenarios) of spring break-up generated by the statistical spring break-up model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy. For an early spring break-up, which usually indicates a longer-than-average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
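
Reduced major axis regression, used above to relate counties' break-up timing, treats both variables as subject to error: the slope is sign(r) * (sd_y / sd_x), which is symmetric in x and y. A minimal sketch with hypothetical day-of-year data (not the thesis's 27-county dataset):

```python
import math

def rma_regression(x, y):
    """Reduced major axis (geometric mean) regression.
    slope = sign(cov(x, y)) * sqrt(Syy / Sxx); intercept fits the means.
    Unlike ordinary least squares, swapping x and y inverts the slope."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((v - mx) ** 2 for v in x)
    syy = sum((v - my) ** 2 for v in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = math.copysign(math.sqrt(syy / sxx), sxy)
    return slope, my - slope * mx  # (slope, intercept)

# Hypothetical break-up start days (day of year) for two counties.
county_a = [62, 70, 75, 81, 88, 93]
county_b = [65, 72, 79, 84, 90, 97]
m, c = rma_regression(county_a, county_b)
print(round(m, 3), round(c, 3))
```

The symmetry property (regressing b on a gives the reciprocal slope) is the usual argument for RMA when neither county's dates can be treated as error-free.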

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Sequencing-based mutation screening assays of genes encompassing large numbers of exons could be substantially optimized by multiplex PCR, which enables simultaneous amplification of many targets in one reaction. In the present study, a multiplex PCR protocol originally developed for fragment analysis was evaluated for sequencing-based mutation screening of the ornithine transcarbamylase (OTC) and the medium-chain acyl-CoA dehydrogenase (MCAD) genes. METHODS: Single-exon and multiplex PCR protocols were applied to generate PCR templates for subsequent DNA sequencing of all exons of the OTC and MCAD genes. For each PCR protocol, and using the same DNA samples, 66 OTC and 98 MCAD sequence reads were generated. The sequences derived from the two PCR methods were compared at the level of individual signal-to-noise ratios of the four bases and the proportion of high-quality base signals. RESULTS: The single-exon and the multiplex PCR protocols gave qualitatively comparable results for the two genes. CONCLUSIONS: Many existing sequencing-based mutation analysis protocols may be easily optimized with the proposed method, since the multiplex PCR protocol was successfully applied without any redesign of the PCR primers or other optimization steps when generating sequencing templates for the OTC and MCAD genes.

Relevance:

100.00%

Publisher:

Abstract:

ONTOLOGIES AND METHODS FOR INTEROPERABILITY OF ENGINEERING ANALYSIS MODELS (EAMS) IN AN E-DESIGN ENVIRONMENT
SEPTEMBER 2007
NEELIMA KANURI, B.S., BIRLA INSTITUTE OF TECHNOLOGY AND SCIENCES, PILANI, INDIA; M.S., UNIVERSITY OF MASSACHUSETTS AMHERST
Directed by: Professor Ian Grosse
Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside of iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a test-bed proof-of-concept application. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational costs, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-based (KB) tool was developed and implemented in FiPER.
This tool reasons about the modeling knowledge to intelligently shift between the beam and the shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is the automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second method is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. These methods play a very efficient role in reducing the large-scale inefficiencies existing in current product design and development cycles due to poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of integration of knowledge in a distributed engineering design framework.

Relevance:

100.00%

Publisher:

Abstract:

Responses of many real-world problems can only be evaluated subject to noise. In order to make an efficient optimization of these problems possible, intelligent optimization strategies that successfully cope with noisy evaluations are required. In this article, a comprehensive review of existing kriging-based methods for the optimization of noisy functions is provided. In summary, ten methods for choosing the sequential samples are described using a unified formalism. They are compared on analytical benchmark problems, whereby the usual assumption of homoscedastic Gaussian noise made in the underlying models is met. Different problem configurations (noise level, maximum number of observations, initial number of observations) and setups (covariance functions, budget, initial sample size) are considered. It is found that the choices of the initial sample size and the covariance function are not critical. The choice of the method, however, can result in significant differences in performance. In particular, the three most intuitive criteria are found to be poor alternatives. Although no criterion is found to be consistently more efficient than the others, two specialized methods appear more robust on average.
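
Kriging models handle homoscedastic Gaussian evaluation noise by adding a "nugget" term to the covariance diagonal, so the surrogate smooths the observations instead of interpolating them exactly. A minimal, dependency-free sketch of the resulting posterior mean (kernel choice, hyperparameters, and the toy data are illustrative assumptions, not taken from the review):

```python
import math

def kriging_predict(xs, ys, xq, length=1.0, noise_var=0.1):
    """Zero-mean GP/kriging posterior mean at xq with a Gaussian
    covariance k(a,b) = exp(-(a-b)^2 / (2 length^2)) and a nugget
    (noise_var) on the diagonal. Solves (K + nugget*I) alpha = y by
    Gaussian elimination with partial pivoting, then returns
    k_*^T alpha."""
    n = len(xs)
    k = lambda a, b: math.exp(-((a - b) ** 2) / (2 * length ** 2))
    A = [[k(xs[i], xs[j]) + (noise_var if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    v = list(ys)
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    alpha = [0.0] * n                         # back substitution
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * alpha[c] for c in range(r + 1, n))
        alpha[r] = (v[r] - s) / A[r][r]
    return sum(alpha[i] * k(xs[i], xq) for i in range(n))

# Hypothetical noisy evaluations of a function near y = x^2; the
# nugget keeps the surrogate from chasing the noise exactly.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [4.3, 0.8, 0.1, 1.2, 3.9]
print(round(kriging_predict(xs, ys, 0.0), 3))
```

Sequential sampling criteria of the kind the review compares then operate on this posterior (and its variance) to decide where to evaluate the noisy function next.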