932 results for "efficient algorithm"


Relevance:

20.00%

Publisher:

Abstract:

Motor impairments are common after stroke, but efficacious therapies for these dysfunctions are scarce. Extending an earlier study on the effects of music-supported training (MST), behavioral indices of motor function were obtained before and after a series of training sessions to assess whether this new treatment leads to improved motor function. Furthermore, MST was contrasted with functional motor training according to the principles of constraint-induced therapy (CIT). In addition to conventional physiotherapy, 32 stroke patients with moderately impaired motor function and no previous musical experience received 15 sessions of MST over a period of three weeks, using a manualized, step-by-step approach. A control group of 15 patients received 15 sessions of CIT in addition to conventional physiotherapy. A third group of 30 patients received exclusively conventional physiotherapy and served as a control group for the other two groups. Fine as well as gross motor skills were trained using either a MIDI piano or electronic drum pads programmed to emit piano tones. Motor functions were assessed by an extensive test battery. MST yielded significant improvements in fine as well as gross motor skills with respect to speed, precision, and smoothness of movements. These improvements were greater than those after CIT or conventional physiotherapy. In conclusion, with equal treatment intensity, MST leads to more pronounced improvement of motor function after stroke than CIT.


Abstract:

In the present paper we characterize the optimal use of Poisson signals to establish incentives in the "bad" and "good" news models of Abreu et al. [1]. In the former, for small time intervals the signals' quality is high and we observe a "selective" use of information; otherwise there is a "mass" use. In the latter, for small time intervals the signals' quality is low and we observe a "fine" use of information; otherwise there is a "non-selective" use. JEL: C73, D82, D86. Keywords: Repeated Games, Frequent Monitoring, Public Monitoring, Information Characteristics.


Abstract:

This thesis concentrates on developing a practical local-approach methodology based on micromechanical models for the analysis of ductile fracture of welded joints. Two major problems involved in the local approach have been studied in detail: the dilational constitutive relation reflecting the softening behaviour of the material, and the failure criterion associated with the constitutive equation.

Firstly, considerable effort was devoted to the numerical integration and computer implementation of the non-trivial dilational Gurson-Tvergaard model. Considering the weaknesses of the widely used Euler forward integration algorithms, a family of generalized mid-point algorithms is proposed for the Gurson-Tvergaard model. Correspondingly, based on the decomposition of stresses into hydrostatic and deviatoric parts, an explicit seven-parameter expression for the consistent tangent moduli of the algorithms is presented. This explicit formula avoids any matrix inversion during numerical iteration, which greatly facilitates the computer implementation of the algorithms and increases the efficiency of the code. The accuracy of the proposed algorithms and of other conventional algorithms has been assessed in a systematic manner in order to identify the best algorithm for this study. The accurate and efficient performance of the present finite element implementation of the proposed algorithms has been demonstrated by various numerical examples. It has been found that the true mid-point algorithm (a = 0.5) is the most accurate one when the deviatoric strain increment is radial to the yield surface, and that it is very important to use the consistent tangent moduli in the Newton iteration procedure.

Secondly, the consistency of current local failure criteria for ductile fracture has been assessed: the critical void growth criterion, the constant critical void volume fraction criterion, and Thomason's plastic limit load failure criterion. Significant differences in the predictions of ductility by the three criteria were found. By assuming that the void grows spherically and using the void volume fraction from the Gurson-Tvergaard model to calculate the current void-matrix geometry, Thomason's failure criterion has been modified and a new failure criterion for the Gurson-Tvergaard model is presented. Comparison with Koplik and Needleman's finite element results shows that the new failure criterion is indeed fairly accurate. A novel feature of the new failure criterion is that a mechanism for void coalescence is incorporated into the constitutive model; hence material failure is a natural result of the development of macroscopic plastic flow and the microscopic internal necking mechanism. Under the new failure criterion, the critical void volume fraction is not a material constant; the initial void volume fraction and/or the void nucleation parameters essentially control the material failure. This feature is very desirable and makes the numerical calibration of the void nucleation parameter(s) possible and physically sound.

Thirdly, a local-approach methodology based on the above two major contributions has been built in ABAQUS via the user material subroutine UMAT and applied to welded T-joints. Using void nucleation parameters calibrated from simple smooth and notched specimens, it was found that the fracture behaviour of welded T-joints can be well predicted with the present methodology. This application has shown how the damage parameters of both the base material and the heat-affected zone (HAZ) material can be obtained in a step-by-step manner, and how useful and capable the local-approach methodology is in the analysis of fracture behaviour and crack development, as well as in the structural integrity assessment of practical problems involving non-homogeneous materials. Finally, a procedure for the possible engineering application of the present methodology is suggested and discussed.
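The generalized mid-point family can be illustrated on a scalar model problem. The sketch below is not the thesis's Gurson-Tvergaard stress update; it only shows the one-parameter family (written `alpha` here for the abstract's a), where alpha = 0 recovers the Euler forward rule, alpha = 1 the backward rule, and alpha = 0.5 the true mid-point rule, with the implicit equation solved by fixed-point iteration:

```python
import math

def generalized_midpoint_step(f, y_n, h, alpha, tol=1e-12, max_iter=100):
    """One step of the generalized mid-point family:
    y_{n+1} = y_n + h * f((1 - alpha) * y_n + alpha * y_{n+1}).
    alpha = 0 is explicit (forward) Euler, alpha = 1 backward Euler,
    and alpha = 0.5 the true mid-point rule (second-order accurate)."""
    y_next = y_n + h * f(y_n)          # explicit predictor
    for _ in range(max_iter):          # fixed-point corrector
        y_new = y_n + h * f((1 - alpha) * y_n + alpha * y_next)
        if abs(y_new - y_next) < tol:
            return y_new
        y_next = y_new
    return y_next

def integrate(alpha, steps=100):
    """Integrate y' = -y, y(0) = 1, over [0, 1]; exact value is exp(-1)."""
    y, h = 1.0, 1.0 / steps
    for _ in range(steps):
        y = generalized_midpoint_step(lambda u: -u, y, h, alpha)
    return y

for a in (0.0, 0.5, 1.0):
    print(a, abs(integrate(a) - math.exp(-1.0)))
```

On this linear test equation the mid-point member is second-order accurate while both Euler variants are first-order, which mirrors the thesis's finding that a = 0.5 is the most accurate member of the family.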


Abstract:

In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star discrepancy as a measure of sampling quality and introduce new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
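The star-discrepancy quality measure can be estimated by brute force for small 2-D point sets. The following sketch is illustrative only and is not the paper's method: it lower-bounds the star discrepancy by checking axis-aligned boxes anchored at the origin whose corners are taken from the point coordinates, and compares a random point set with a more evenly distributed grid:

```python
import random

def star_discrepancy_estimate(points):
    """Crude lower-bound estimate of the star discrepancy of a 2-D point
    set in the unit square: check boxes [0, u) x [0, v) anchored at the
    origin, with u and v taken from the point coordinates (and 1.0)."""
    n = len(points)
    xs = sorted({p[0] for p in points} | {1.0})
    ys = sorted({p[1] for p in points} | {1.0})
    worst = 0.0
    for u in xs:
        for v in ys:
            inside = sum(1 for (x, y) in points if x < u and y < v)
            worst = max(worst, abs(inside / n - u * v))
    return worst

random.seed(1)
n = 256
random_pts = [(random.random(), random.random()) for _ in range(n)]
# A regular 16 x 16 grid (points at cell centres) is far more even.
grid_pts = [((i + 0.5) / 16, (j + 0.5) / 16)
            for i in range(16) for j in range(16)]
print(star_discrepancy_estimate(random_pts),
      star_discrepancy_estimate(grid_pts))
```

Because only origin-anchored boxes with corners at point coordinates are checked, the value is a lower bound on the true star discrepancy; exact computation and higher dimensions require considerably more machinery.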


Abstract:

The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master's thesis using synthetic data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested on a couple of case studies and on the heat exchanger model. The two-step MCMC method worked well and decreased the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not seem to have a notable effect. The reuse of the posterior distribution of the parameters across different heat exchanger geometries was examined as well: it would be computationally most efficient to use the same posterior distribution for different geometries in the optimisation of heat exchanger networks. According to the results, this is possible when the frontal surface areas are the same across geometries. In the other cases the same posterior distribution can still be used for optimisation, but it yields a wider predictive distribution. For condensing-surface heat exchangers the numerical stability of the simulation model was studied, and as a result a stable algorithm was developed.
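The MCMC idea behind such identifiability studies can be sketched with a minimal single-step random-walk Metropolis sampler; this is not the thesis's two-step method, and the toy model, parameter names, and data below are all hypothetical:

```python
import math, random

def metropolis(log_post, x0, n_samples, step=0.05, seed=0):
    """Minimal random-walk Metropolis sampler: propose x' = x + N(0, step)
    and accept with probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy identifiability check on synthetic data: posterior of a hypothetical
# heat-transfer coefficient k in a linear model y_i = k * dT_i + noise.
true_k, noise = 2.0, 0.1
rng = random.Random(42)
dT = [0.5 + 0.1 * i for i in range(20)]
y = [true_k * d + rng.gauss(0.0, noise) for d in dT]

def log_post(k):  # Gaussian likelihood with a flat prior
    return -sum((yi - k * di) ** 2 for yi, di in zip(y, dT)) / (2 * noise ** 2)

chain = metropolis(log_post, x0=1.0, n_samples=5000)
post_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
print(post_mean)  # should lie close to true_k = 2.0
```

A narrow, well-centred posterior for k indicates an identifiable parameter; with synthetic data, as in the thesis, the recovered posterior can be checked directly against the generating value.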


Abstract:

Metaheuristic methods have become increasingly popular approaches for solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization, and the topic has received increasing interest especially in the evolutionary computation community; several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima places additional demands on the design of algorithms that are to be effective in both respects in the context of multimodal optimization. In this thesis, several multimodal optimization algorithms are studied with regard to how their implementation of the global and local search phases affects their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is generally not thoroughly understood, the research relies heavily on experimental studies to establish the properties of the different approaches. To obtain reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varied problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists.

As part of this thesis, such a framework for generating tunable test functions for experimentally evaluating multimodal optimization methods is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase can boost performance significantly, but a weak local phase may invalidate the advantages gained from the global phase.
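As an illustration of the global phase discussed above, plain Differential Evolution (DE/rand/1/bin) can be sketched as follows. This is generic DE, not the niching or hybrid variants proposed in the thesis, and the standard Rastrigin function stands in for the thesis's tunable test framework:

```python
import math, random

def de_rand_1_bin(f, bounds, pop_size=40, F=0.6, CR=0.9,
                  generations=300, seed=1):
    """Minimal DE/rand/1/bin minimiser (global phase only)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[lo + (hi - lo) * rng.random() for lo, hi in bounds]
           for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)   # at least one gene from the mutant
            trial = []
            for j, (lo, hi) in enumerate(bounds):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # mutation
                else:
                    v = pop[i][j]        # binomial crossover keeps old gene
                trial.append(min(max(v, lo), hi))
            ft = f(trial)
            if ft <= fit[i]:             # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Rastrigin: highly multimodal, global minimum f = 0 at the origin.
def rastrigin(x):
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0
               for xi in x)

x_best, f_best = de_rand_1_bin(rastrigin, bounds=[(-5.12, 5.12)] * 2)
print(x_best, f_best)
```

Plain DE of this kind converges to a single optimum; the niching approaches studied in the thesis modify the selection and neighbourhood structure precisely so that several optima can be retained in the population at once.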


Abstract:

This article describes the synthesis and spectroscopic data of the new compounds (2Z)-2-(4-methoxybenzylidene)-6-nitro-4H-benzo[1,4]thiazin-3-one, (2Z)-2-(4-methoxybenzylidene)-4-methyl-6-nitro-4H-benzo[1,4]thiazin-3-one, (2Z)-6-amino-2-(4-methoxybenzylidene)-4H-benzo[1,4]thiazin-3-one, (2Z)-6-butylamino-2-(4-methoxybenzylidene)-4-methyl-4H-benzo[1,4]thiazin-3-one and (2E)-N-alkyl-N-(2-hydroxy-5-nitrophenyl)-3-phenylacrylamides. The arylidenebenzothiazine compounds were prepared by Knoevenagel condensation with substituted benzaldehydes in the presence of sodium methoxide in DMF. The presence of a nitro substituent in the 4-position, water, and a slightly acidic reaction medium in this condensation caused rupture of the benzothiazine ring and subsequent formation of the phenylacrylamide compounds. Crystallographic data are presented for (2E)-3-(4-bromophenyl)-N-dodecyl-N-(2-hydroxy-5-nitrophenyl)acrylamide.


Abstract:

In the esterification of oleic acid with methanol at 25 °C, HPA displayed the highest activity. Moreover, the HPA could be reused after being transformed into its cesium salt. In the etherification of glycerol, HPA and Amberlyst 35W showed similar initial activity levels. The results on acid properties demonstrate that HPA is a strong protonic acid and that both surface and bulk protons contribute to the acidity. Because of its strong affinity for polar compounds, HPA also appears to dissolve in both oleic acid and methanol; in this case the reaction proceeds with the catalyst in the homogeneous phase.


Abstract:

In this work we report the synthesis of sulfonamide derivatives by a conventional procedure and with solid supports such as silica gel, florisil, alumina, 4Å molecular sieves, montmorillonite KSF, and montmorillonite K10, using solvent-free and microwave-assisted methods. Our results show that solid supports have catalytic activity in the formation of sulfonamide derivatives. We found that florisil, montmorillonite KSF, and montmorillonite K10 could be used as inexpensive alternative catalysts that are easily separated from the reaction media. Additionally, the solvent-free and microwave-assisted methods were more efficient, reducing reaction time and increasing yield.


Abstract:

A new, convenient method for the preparation of 2-substituted benzimidazoles and bis-benzimidazoles is presented. In this method, o-phenylenediamines were condensed with bisulfite adducts of various aldehydes and dialdehydes under neat conditions by microwave heating. The results were also compared with those of synthesis by conventional heating under reflux. The structures of the products were confirmed by infrared and ¹H- and ¹³C-NMR spectroscopy. Short reaction times, good yields, easy purification of the products, and mild reaction conditions are the main advantages of this method.


Abstract:

Materials based on tungstophosphoric acid (TPA) immobilized on NH4ZSM5 zeolite were prepared by wet impregnation of the zeolite matrix with TPA aqueous solutions, whose concentration was varied in order to obtain TPA contents of 5%, 10%, 20%, and 30% w/w in the solid. The materials were characterized by N2 adsorption-desorption isotherms, XRD, FT-IR, 31P MAS-NMR, TGA-DSC, and DRS-UV-Vis, and their acidic behavior was studied by potentiometric titration with n-butylamine. The BET surface area (SBET) decreased as the TPA content was raised, as a result of zeolite pore blocking. The X-ray diffraction patterns of the solids modified with TPA presented only the characteristic peaks of NH4ZSM5 zeolites and an additional set of peaks assigned to the presence of (NH4)3PW12O40. According to the Fourier-transform infrared and 31P magic-angle-spinning nuclear magnetic resonance spectra, the main species present in the samples was the [PW12O40]3- anion, which was partially transformed into the [P2W21O71]6- anion during the synthesis and drying steps. The thermal stability of the NH4ZSM5TPA materials was similar to that of their parent zeolites. Moreover, the samples with the highest TPA content exhibited band-gap energy values similar to those reported for TiO2. The immobilization of TPA on NH4ZSM5 zeolite thus afforded catalysts with high photocatalytic activity in the degradation of methyl orange dye (MO) in water at 25 ºC; these can be reused at least three times without any significant decrease in the degree of degradation.


Abstract:

The objective of this work is to demonstrate the efficient use of Principal Component Analysis (PCA) as a method to pre-process the original multivariate data, which are rewritten into a new matrix of principal components sorted by accumulated variance. An Artificial Neural Network (ANN) with the backpropagation algorithm is trained using this pre-processed data set, representing 90.02% of the accumulated variance of the original data, as input. The training goal is to model dissolved oxygen using information from other physical and chemical parameters. The water samples used in the experiments were gathered from the Paraíba do Sul River in São Paulo State, Brazil. The smallest Mean Square Error (MSE) is used to compare the results of the different architectures and to choose the best one. The use of this method allowed a reduction of more than 20% in the input data, which directly shortened the time and computational effort of the ANN training.
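The PCA pre-processing step can be sketched without numerical libraries in the two-dimensional case, where the covariance eigendecomposition has a closed form. This is an illustrative toy on synthetic data (not the river-water data set): the scores are sorted by explained variance, and a cumulative-variance cut-off like the paper's 90.02% would then keep the leading components:

```python
import math, random

def pca_2d(data):
    """PCA for 2-D samples via the closed-form eigendecomposition of the
    2 x 2 covariance matrix; returns scores and explained-variance ratios
    sorted in decreasing order of variance."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    cxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    cyy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    cxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    theta = 0.5 * math.atan2(2 * cxy, cxx - cyy)   # principal-axis angle
    c, s = math.cos(theta), math.sin(theta)
    pc1 = [ (x - mx) * c + (y - my) * s for x, y in data]
    pc2 = [-(x - mx) * s + (y - my) * c for x, y in data]
    var1 = sum(v * v for v in pc1) / (n - 1)
    var2 = sum(v * v for v in pc2) / (n - 1)
    if var2 > var1:                                # keep sorted by variance
        pc1, pc2, var1, var2 = pc2, pc1, var2, var1
    total = var1 + var2
    return (pc1, pc2), (var1 / total, var2 / total)

# Correlated synthetic data: the first component should carry most variance.
rng = random.Random(0)
data = [(t, 0.8 * t + rng.gauss(0.0, 0.3))
        for t in [rng.gauss(0.0, 1.0) for _ in range(500)]]
(scores, ratios) = pca_2d(data)
print(ratios)
```

Feeding only the high-variance scores to a network is exactly the kind of input reduction the paper reports; for more than two variables the same idea is usually implemented with an SVD of the centred data matrix.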


Abstract:

In the Russian Wholesale Market, electricity and capacity are traded separately. Capacity is a special good whose sale obliges suppliers to keep their generating equipment ready to produce the quantity of electricity indicated by the System Operator. Capacity trading was introduced to maintain reliable and uninterrupted delivery of electricity in the wholesale market. The price of capacity reflects the constant investments in the construction, modernization, and maintenance of power plants. The sale of capacity thus creates favorable conditions for attracting investment to the energy sector, because it guarantees investors that their investments will be recovered.


Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code, proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, the automatic derivation and discharging of verification conditions, and interactive proofs; it is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.

Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure; conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula; furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses.

Our hypothesis is that verification could be introduced early in CS education and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
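As a small executable illustration of the invariant-based idea (not the Socos diagram notation or its formal proofs), the loop invariants of an insertion sort, echoing the verified sorting algorithm mentioned above, can be written as runtime assertions that are checked at every iteration:

```python
def insertion_sort(a):
    """Insertion sort annotated with its loop invariants as runtime
    assertions -- an executable (not formally verified) illustration of
    keeping the code consistent with its invariants during construction."""
    a = list(a)
    for i in range(1, len(a)):
        # Invariant on entry: the prefix a[0:i] is sorted.
        assert all(a[k] <= a[k + 1] for k in range(i - 1))
        x, j = a[i], i
        while j > 0 and a[j - 1] > x:
            a[j] = a[j - 1]   # shift elements greater than x one slot right
            j -= 1
        a[j] = x
        # Invariant re-established: the prefix a[0:i+1] is sorted.
        assert all(a[k] <= a[k + 1] for k in range(i))
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

In invariant-based programming the same invariants would be stated first, in an invariant diagram, and each code addition would be proved (rather than merely tested) to preserve them.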