974 results for Simulated annealing algorithm


Relevance: 20.00%

Abstract:

In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipeline implementations in signal processing. First, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound on the highest order of stable filters is given. Then the minimum-order stable linear predictor is obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
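The GA step can be sketched generically. Below is a minimal genetic algorithm in the spirit the abstract describes; the toy quadratic `objective` is a stand-in (assumed, not from the paper) for the real filter-design cost, which would score a candidate predictor and penalize instability. All parameter values here are illustrative.

```python
# Minimal elitist genetic algorithm; `objective` is an assumed placeholder
# for the paper's filter cost (prediction error plus stability penalty).
import random

def objective(x):
    # placeholder cost; the paper would score a candidate predictor here
    return sum(v * v for v in x)

def genetic_minimize(dim=3, pop_size=30, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        survivors = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, dim)           # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.9:                # gaussian mutation
                i = rng.randrange(dim)
                child[i] += rng.gauss(0, 0.5)
            children.append(child)
        pop = survivors + children
    return min(pop, key=objective)

best = genetic_minimize()
```

Keeping the top half of the population unchanged each generation (elitism) guarantees the best cost never worsens, which is why such a simple scheme converges reliably on smooth costs.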

Relevance: 20.00%

Abstract:

An equivalent algorithm is proposed to simulate the thermal effects of magma intrusion in geological systems composed of porous rocks. On the basis of physical and mathematical equivalence, the original magma solidification problem, with a moving boundary between the rock and the intruded magma, is transformed into a new problem without the moving boundary but with a physically equivalent heat source. This equivalent heat source is determined from the analysis of an ideal solidification model. The major advantage of the proposed equivalent algorithm is that a fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of intruded magma solidification using the conventional finite element method. The numerical results demonstrate the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
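The fixed-mesh idea can be illustrated in one dimension with finite differences (the paper itself uses 3-D FEM): the latent heat released by the solidifying magma is folded into a source term q on a fixed grid, so no moving rock/magma interface has to be tracked. All numbers and the constant source profile below are illustrative assumptions, not the paper's determined source.

```python
# 1-D explicit finite-difference sketch of the equivalent-source idea:
# a FIXED grid with a heat source q standing in for latent-heat release.
n, dx, dt, alpha = 41, 0.1, 0.004, 1.0   # cells, spacing, time step, diffusivity
# the explicit scheme is stable because dt <= dx**2 / (2 * alpha)
T = [1200.0 if 15 <= i <= 25 else 300.0 for i in range(n)]  # hot magma slab in rock
q = [50.0 if 15 <= i <= 25 else 0.0 for i in range(n)]      # equivalent heat source

for _ in range(500):
    Tn = T[:]                            # boundaries held at 300 (country rock)
    for i in range(1, n - 1):
        lap = (T[i + 1] - 2.0 * T[i] + T[i - 1]) / (dx * dx)
        Tn[i] = T[i] + dt * (alpha * lap + q[i])
    T = Tn
```

Because the mesh never changes, only the source vector q would be updated as solidification proceeds, which is exactly what makes a conventional solver with a variable time step applicable.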

Relevance: 20.00%

Abstract:

A graph clustering algorithm constructs groups of closely related parts and machines separately. After the two sets of groups are matched for the fewest intercell moves, a refining process runs on the initial cell formation to decrease the number of intercell moves further. A simple modification of this main approach can handle practical constraints, such as the common constraint bounding the maximum number of machines in a cell. Our approach substantially improves computational time and, more importantly, reduces the number of intercell moves when the computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
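The matching step can be shown on a toy example. Here the part families and machine cells are assumed already clustered, and the incidence matrix is invented for illustration; an intercell move is an operation a part needs on a machine outside its matched cell.

```python
# Match part families to machine cells so intercell moves are minimized;
# all data below are invented to illustrate the matching step only.
from itertools import permutations

incidence = [  # rows: parts, cols: machines; 1 = part needs that machine
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
]
machine_cell = [0, 0, 1, 1]   # machine -> cell, from the machine clustering
part_family = [0, 0, 1, 1]    # part -> family, from the part clustering

def intercell_moves(match):   # match[f] = cell assigned to family f
    return sum(
        1
        for p, row in enumerate(incidence)
        for m, used in enumerate(row)
        if used and machine_cell[m] != match[part_family[p]]
    )

best = min(permutations(range(2)), key=intercell_moves)
```

With two cells the exhaustive scan over permutations is trivial; for larger instances the same objective would be fed to an assignment solver rather than brute force.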

Relevance: 20.00%

Abstract:

Electrical impedance tomography is a technique to estimate the impedance distribution within a domain, based on measurements on its boundary. In other words, given the mathematical model of the domain, its geometry and boundary conditions, a nonlinear inverse problem of estimating the electric impedance distribution can be solved. Several impedance estimation algorithms have been proposed to solve this problem. In this paper, we present a three-dimensional algorithm, based on the topology optimization method, as an alternative. Using this method, a sequence of linear programming problems, which allows for constraints, is solved. In each iteration, the finite element method provides the electric potential field within the model of the domain. An electrode model is also proposed, thus increasing the accuracy of the finite element results. The algorithm is tested using both numerically simulated and experimental data, and absolute resistivity values are obtained. These results, corresponding to phantoms with two different conductive materials, exhibit relatively well-defined boundaries between the materials, and show that this is a practical and potentially useful technique for monitoring lung aeration, including the possibility of imaging a pneumothorax.

Relevance: 20.00%

Abstract:

Extended gcd computation is interesting in itself, and it also plays a fundamental role in other calculations. We present a new algorithm for solving the extended gcd problem. This algorithm has a particularly simple description and is practical. It also provides refined bounds on the size of the multipliers obtained.
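The paper's own algorithm is not reproduced in the abstract. For orientation, the classical extended Euclidean recurrence that such algorithms refine computes g = gcd(a, b) together with multipliers x, y such that a*x + b*y = g:

```python
# Classical extended Euclidean algorithm (baseline, not the paper's method).
def ext_gcd(a, b):
    """Return (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    x0, y0, x1, y1 = 1, 0, 0, 1
    while b:
        q, a, b = a // b, b, a % b     # one Euclidean division step
        x0, x1 = x1, x0 - q * x1       # update multiplier for a
        y0, y1 = y1, y0 - q * y1       # update multiplier for b
    return a, x0, y0

g, x, y = ext_gcd(240, 46)             # g == 2, with 240*x + 46*y == 2
```

The multipliers produced by this baseline can grow roughly as large as the inputs; bounding their size more tightly is precisely the refinement the abstract claims.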

Relevance: 20.00%

Abstract:

Qu-Prolog is an extension of Prolog which performs meta-level computations over object languages, such as predicate calculi and lambda-calculi, which have object-level variables, and quantifier or binding symbols creating local scopes for those variables. As in Prolog, the instantiable (meta-level) variables of Qu-Prolog range over object-level terms, and in addition other Qu-Prolog syntax denotes the various components of the object-level syntax, including object-level variables. Further, the meta-level operation of substitution into object-level terms is directly represented by appropriate Qu-Prolog syntax. Again as in Prolog, the driving mechanism in Qu-Prolog computation is a form of unification, but this is substantially more complex than for Prolog because of Qu-Prolog's greater generality, and especially because substitution operations are evaluated during unification. In this paper, the Qu-Prolog unification algorithm is specified, formalised and proved correct. Further, the analysis of the algorithm is carried out in a framework which straightforwardly allows the 'completeness' of the algorithm to be proved: though fully explicit answers to unification problems are not always provided, no information is lost in the unification process.
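For contrast with Qu-Prolog's richer algorithm, ordinary first-order (Prolog-style) unification with occurs check can be sketched as follows. The encoding is an assumption for illustration: variables are Python strings, compound terms are tuples `(functor, arg1, ...)`, constants are one-element tuples. Binders and the substitution evaluation that make Qu-Prolog unification hard are deliberately not modelled.

```python
# Baseline first-order unification with occurs check (not Qu-Prolog's).
def walk(t, s):
    # follow variable bindings in substitution s
    while isinstance(t, str) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, x, s) for x in t[1:])

def unify(a, b, s):
    """Return an extended substitution, or None on failure."""
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if isinstance(a, str):                    # a is an unbound variable
        return None if occurs(a, b, s) else {**s, a: b}
    if isinstance(b, str):                    # b is an unbound variable
        return None if occurs(b, a, s) else {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) \
            and a[0] == b[0] and len(a) == len(b):
        for x, y in zip(a[1:], b[1:]):        # unify arguments pairwise
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None
```

For example, `unify(('f', 'X', ('a',)), ('f', ('b',), 'Y'), {})` binds `X` to `b` and `Y` to `a`, while `unify('X', ('f', 'X'), {})` fails on the occurs check.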

Relevance: 20.00%

Abstract:

An algorithm for explicit integration of structural dynamics problems with multiple time steps is proposed that averages accelerations to obtain subcycle states at a nodal interface between regions integrated with different time steps. With integer time step ratios, the resulting subcycle updates at the interface sum to give the same effect as a central difference update over a major cycle. The algorithm is shown to have good accuracy, and stability properties in linear elastic analysis similar to those of constant velocity subcycling algorithms. The implementation of a generalised form of the algorithm with non-integer time step ratios is presented. (C) 1997 by John Wiley & Sons, Ltd.
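The building block being subcycled is the central difference (leapfrog) update, applied with different timesteps in different regions; with integer ratios, the interface subcycle updates sum to one such update over a major cycle. A single-region, single-degree-of-freedom sketch (k = m = 1 is an assumed example) shows the staggered form:

```python
# Central difference (leapfrog) update for an SDOF oscillator; the
# subcycling algorithm applies this per region with region-specific dt.
def central_difference(u, v_half, dt=0.1, steps=1000, k=1.0, m=1.0):
    # v_half is the velocity staggered half a step behind the displacement
    for _ in range(steps):
        a = -k / m * u          # acceleration from the current displacement
        v_half += dt * a        # half-step velocity update
        u += dt * v_half        # displacement update
    return u, v_half

u, v = central_difference(1.0, 0.0)
```

The scheme is conditionally stable (dt must stay below 2/omega for this oscillator), which is why the stability analysis of the subcycled interface matters.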

Relevance: 20.00%

Abstract:

The popular Newmark algorithm, used for implicit direct integration of structural dynamics, is extended by means of a nodal partition to permit the use of different timesteps in different regions of a structural model. The algorithm developed has as a special case an explicit-explicit subcycling algorithm previously reported by Belytschko, Yen and Mullen. That algorithm has been shown, in the absence of damping or other energy dissipation, to exhibit instability over narrow timestep ranges that become narrower as the number of degrees of freedom increases, making them unlikely to be encountered in practice. The present algorithm avoids such instabilities in the case of a one-to-two timestep ratio (two subcycles), achieving unconditional stability in an exponential sense for a linear problem. However, with three or more subcycles, the stability of the trapezoidal rule becomes conditional, falling towards that of the central difference method as the number of subcycles increases. Instabilities over narrow timestep ranges, which become narrower as the model size increases, also appear with three or more subcycles. However, by moving the partition between timesteps one row of elements into the region suitable for integration with the larger timestep, the unstable timestep ranges become extremely narrow, even in simple systems with a few degrees of freedom. Accuracy is also improved. Using a version of the Newmark algorithm that dissipates high frequencies minimises or eliminates these narrow bands of instability. Viscous damping is also shown to remove these instabilities, at the expense of having more effect on the low-frequency response.
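The constant-timestep Newmark update that the nodal partition extends can be sketched for a linear single-degree-of-freedom oscillator (k = m = 1 is an assumed example). With beta = 1/4 and gamma = 1/2 this is the trapezoidal rule discussed above, and for an undamped linear system it preserves the discrete energy, consistent with its unconditional stability:

```python
# Newmark update for a linear SDOF oscillator m*a + k*u = 0.
def newmark_sdof(u, v, dt=0.1, steps=1000, k=1.0, m=1.0, beta=0.25, gamma=0.5):
    a = -k / m * u                        # initial acceleration
    for _ in range(steps):
        # displacement update with a_{n+1} eliminated via a = -(k/m) * u
        rhs = u + dt * v + dt * dt * (0.5 - beta) * a
        u_new = rhs / (1.0 + dt * dt * beta * k / m)
        a_new = -k / m * u_new
        v = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, a = u_new, a_new
    return u, v

u, v = newmark_sdof(1.0, 0.0)
```

Choosing gamma > 1/2 introduces the high-frequency dissipation the abstract mentions as a way to suppress the narrow instability bands, at the cost of some accuracy.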

Relevance: 20.00%

Abstract:

Background: Although various techniques have been used for breast conservation surgery reconstruction, there are few studies describing a logical approach to reconstruction of these defects. The objectives of this study were to establish a classification system for partial breast defects and to develop a reconstructive algorithm. Methods: The authors reviewed a 7-year experience with 209 immediate breast conservation surgery reconstructions. Mean follow-up was 31 months. Type I defects involve tissue resection in smaller breasts (bra size A/B) and include type IA, minimal defects that do not cause distortion; type IB, moderate defects that cause moderate distortion; and type IC, large defects that cause significant deformities. Type II involves tissue resection in medium-sized breasts with or without ptosis (bra size C), and type III involves tissue resection in large breasts with ptosis (bra size D). Results: Eighteen percent of patients presented type I defects, where a lateral thoracodorsal flap and a latissimus dorsi flap were performed in 68 percent. Forty-five percent presented type II defects, where bilateral mastopexy was performed in 52 percent. Thirty-seven percent of patients presented type III defects, where bilateral reduction mammaplasty was performed in 67 percent. Thirty-five percent of patients presented complications, most of them minor. Conclusions: An algorithm based on breast size in relation to tumor location and extension of resection can be followed to determine the best approach to reconstruction. The authors' results demonstrate that the complications were similar to those in other clinical series. Success depends on patient selection, coordinated planning with the oncologic surgeon, and careful intraoperative management.
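The classification itself reduces to a small lookup on breast size and defect extent. The sketch below is purely illustrative of the abstract's taxonomy (the function name, argument encoding, and subtype labels IA/IB/IC are assumptions here) and is in no way clinical guidance:

```python
# Illustrative encoding of the partial-defect taxonomy; invented names,
# not clinical guidance.
def classify_defect(bra_size, defect="moderate"):
    if bra_size in ("A", "B"):            # type I: smaller breasts
        sub = {"minimal": "IA", "moderate": "IB", "large": "IC"}[defect]
        return "Type " + sub
    if bra_size == "C":                   # type II: medium-sized breasts
        return "Type II"
    return "Type III"                     # type III: large breasts with ptosis
```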

Relevance: 20.00%

Abstract:

Generalized Social Anxiety Disorder (SAD) is one of the most common anxiety conditions, with impairment in social life. Cannabidiol (CBD), one major non-psychotomimetic compound of the Cannabis sativa plant, has shown anxiolytic effects both in humans and in animals. This preliminary study aimed to compare the effects of a simulated public speaking test (SPST) on healthy controls (HC) and treatment-naive SAD patients who received a single dose of CBD or placebo. A total of 24 never-treated patients with SAD were allocated to receive either CBD (600 mg; n = 12) or placebo (n = 12) in a double-blind randomized design 1.5 h before the test. The same number of HC (n = 12) performed the SPST without receiving any medication. Each volunteer participated in only one experimental session in a double-blind procedure. Subjective ratings on the Visual Analogue Mood Scale (VAMS) and Negative Self-Statement scale (SSPS-N) and physiological measures (blood pressure, heart rate, and skin conductance) were measured at six different time points during the SPST. The results were submitted to a repeated-measures analysis of variance. Pretreatment with CBD significantly reduced anxiety, cognitive impairment and discomfort in speech performance, and significantly decreased alertness in anticipation of the speech. The placebo group presented higher anxiety, cognitive impairment, discomfort, and alertness levels when compared with the control group, as assessed with the VAMS. The SSPS-N scores showed significant increases during the test in the placebo group that were almost abolished in the CBD group. No significant differences were observed between CBD and HC in SSPS-N scores or in the cognitive impairment, discomfort, and alertness factors of the VAMS. The increase in anxiety induced by the SPST in subjects with SAD was reduced with the use of CBD, resulting in a response similar to that of the HC.
Neuropsychopharmacology (2011) 36, 1219-1226; doi: 10.1038/npp.2011.6; published online 9 February 2011

Relevance: 20.00%

Abstract:

The simulated public speaking (SPS) test is sensitive to drugs that interfere with serotonin-mediated neurotransmission and is thought to recruit neural systems involved in panic disorder. This study aimed to evaluate the effects of escitalopram, the most selective serotonin reuptake inhibitor available, on the SPS test. Healthy males received, in a double-blind, randomized design, placebo (n = 12), 10 mg (n = 17) or 20 mg (n = 14) of escitalopram 2 hours before the test. Behavioural, autonomic and neuroendocrine measures were assessed. Neither dose of escitalopram produced any effect before or during the speech, but both prolonged the fear induced by the SPS test. The test itself did not significantly change cortisol and prolactin levels, but under the higher dose of escitalopram, cortisol and prolactin increased immediately after the test. This fear-enhancing effect of escitalopram agrees with previously reported results for less selective serotonin reuptake inhibitors and the receptor antagonist ritanserin, indicating that serotonin inhibits the fear of speaking in public.

Relevance: 20.00%

Abstract:

Objectives: To evaluate the influence of two surface sealants (BisCover/Single Bond) and three application techniques (unsealed/conventional/co-polymerization) on the roughness of two composites (Filtek Z250/Z350) after the toothbrushing test. Methods: Seventy-two rectangular specimens (5 mm x 10 mm x 3 mm) were fabricated and assigned to 12 groups (n = 6). Each sample was subjected to three random roughness readings at baseline, after 100,000 (intermediate), and after 200,000 (final) toothbrushing strokes. Roughness (R) at each stage was obtained as the arithmetic mean of the readings of each specimen. Sealant removal was qualitatively examined (optical microscope) and classified into scores (0-3). Data were analyzed by Student's paired t-test, two-way ANOVA/Tukey's test, and by the Wilcoxon, Kruskal-Wallis and Miller's tests (alpha = 0.05). Results: Z250 groups at baseline did not differ statistically from each other. Unsealed Z350 at baseline had lower R values. All the unsealed groups presented a gradual decrease in R from baseline to final brushing. From baseline to the intermediate stage, Z250 co-polymerized groups presented a significant reduction in R (score 3). Conventionally sealed groups had no significant changes in R (scores 2-0.8). From baseline to the intermediate stage, the conventionally sealed Z350 Single Bond group had an increase in R (score 1.5). In the final stage, all the conventionally sealed groups presented a reduction in R (scores 0.7-0). Co-polymerized Single Bond groups had a significant reduction in R (scores 2.5-2.7), and co-polymerized BisCover groups an increase in R (scores 2.8-3). Conclusions: At any brushing stage, sealed composites presented superior performance compared with unsealed composites. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

This study investigated whether a sodium bicarbonate solution, applied to enamel previously exposed to a simulated intrinsic acid, can control dental erosion. Volunteers wore palatal devices containing enamel slabs, which were exposed twice daily extra-orally to hydrochloric acid (0.01 M, pH 2) for 2 min. Immediately afterwards, the palatal devices were re-inserted in the mouth and volunteers rinsed their oral cavity with a sodium bicarbonate solution or deionized water for 60 s. After the washout period, the palatal devices were refilled with a new set of specimens and participants were crossed over to receive the alternate rinse solution. The surface loss and surface microhardness (SMH) of the specimens were assessed. The surface loss of eroded enamel rinsed with the sodium bicarbonate solution was significantly lower than that of eroded enamel rinsed with deionized water. There were no differences between treatments with sodium bicarbonate and deionized water for SMH measurements. Regardless of the solution used as an oral rinse, eroded enamel showed lower SMH than uneroded specimens. Rinsing with a sodium bicarbonate solution after a simulated endogenous erosive challenge controlled enamel surface loss but did not alter the microhardness.