67 results for Sparse Incremental EM Algorithm

in Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"


Relevance:

30.00%

Publisher:

Abstract:

To enhance the global search ability of population-based incremental learning (PBIL) methods, it is proposed that multiple probability vectors be incorporated into existing PBIL algorithms. The strategy for updating those probability vectors and the negative-learning and mutation operators are redefined correspondingly. Moreover, to strike the best tradeoff between exploration and exploitation, an adaptive updating strategy for the learning rate is designed. Numerical examples are reported to demonstrate the pros and cons of the newly implemented algorithm.
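The abstract does not give the exact update rules, so the following is a minimal sketch of PBIL with several probability vectors, positive/negative learning, vector mutation, and a decaying learning rate standing in for the paper's adaptive strategy; all parameter names and the decay schedule are our assumptions.

```python
import random

def pbil_multi(fitness, n_bits, n_vectors=3, pop_per_vec=10,
               lr0=0.1, mut_p=0.02, mut_shift=0.05, iters=100, seed=0):
    """Sketch of PBIL with multiple probability vectors.

    Each vector is pulled toward the best sample it generated
    (positive learning) and, on disagreeing bits, pulled again away
    from the worst sample (a stand-in for negative learning).  The
    learning rate simply decays over time; the paper's adaptive
    strategy is not specified in the abstract.
    """
    rng = random.Random(seed)
    probs = [[0.5] * n_bits for _ in range(n_vectors)]
    best_x, best_f = None, float("-inf")
    for t in range(iters):
        lr = lr0 * (1.0 - t / iters)  # assumed simple decay schedule
        for p in probs:
            samples = [[1 if rng.random() < pi else 0 for pi in p]
                       for _ in range(pop_per_vec)]
            samples.sort(key=fitness)
            worst, best = samples[0], samples[-1]
            f = fitness(best)
            if f > best_f:
                best_f, best_x = f, best[:]
            for i in range(n_bits):
                p[i] += lr * (best[i] - p[i])            # positive learning
                if best[i] != worst[i]:                  # negative learning
                    p[i] += lr * 0.5 * (best[i] - p[i])
                if rng.random() < mut_p:                 # mutate the vector
                    p[i] = p[i] * (1 - mut_shift) + rng.random() * mut_shift
                p[i] = min(0.95, max(0.05, p[i]))        # keep vector usable
    return best_x, best_f

# maximize the number of ones (OneMax) as a toy fitness
x, f = pbil_multi(sum, n_bits=20)
```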

Relevance:

30.00%

Publisher:

Abstract:

To enhance the global search ability of Population-Based Incremental Learning (PBIL) methods, it is proposed that multiple probability vectors be incorporated into existing PBIL algorithms. As a result, the strategy for updating those probability vectors and the negative-learning and mutation operators are redefined as reported. Numerical examples are reported to demonstrate the pros and cons of the newly implemented algorithm. © 2006 IEEE.

Relevance:

20.00%

Publisher:

Abstract:

A study was conducted on the effects of acute administration of aminophylline on physiological variables in purebred Arabian horses submitted to an incremental exercise test. Twelve horses underwent two physical tests separated by a 10-day interval in a crossover study. These horses were divided into two groups: control (C, n = 12) and aminophylline (AM, n = 12). The drug at 10 mg/kg body weight or saline was given intravenously 30 minutes before the incremental exercise test. The treadmill exercise test consisted of an initial warmup followed by gradually increasing physical demand. Blood samples were assayed for lactic acid, glucose, and insulin. Maximal lactic acidemia was greater (P = .0238) in the AM group. Both V2 and V4 (velocities at which lactate concentrations were 2 and 4 mmol/L, respectively) were reduced in the AM group, by 15.85% (P = .0402) and 17.76% (P = .0109), respectively. At rest as well as at 4 minutes, insulinemia was greater in the AM group (P = .0417 and .0393). Glycemia was statistically lower in the AM group at 8 minutes (P = .0138) and 10 minutes (P = .0432). Use of aminophylline in horses during incremental exercise does not seem to be beneficial, because this drug tends to cause hypoglycemia and to increase dependence on anaerobic glucose metabolism.
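Velocities such as V2 and V4 are conventionally read off the lactate-velocity curve of the incremental test by interpolating between stages. A minimal sketch, with hypothetical test data (the study's actual values are not given):

```python
def lactate_velocity(velocities, lactates, target):
    """Linearly interpolate the velocity at which blood lactate reaches
    a target concentration (e.g. 2 or 4 mmol/L for V2 and V4).
    Assumes lactate rises monotonically across the stages used."""
    for (v0, l0), (v1, l1) in zip(zip(velocities, lactates),
                                  zip(velocities[1:], lactates[1:])):
        if l0 <= target <= l1:
            return v0 + (target - l0) * (v1 - v0) / (l1 - l0)
    raise ValueError("target lactate not bracketed by the test stages")

# hypothetical incremental-test data: stage velocity (m/s) vs lactate (mmol/L)
v = [4.0, 5.0, 6.0, 7.0, 8.0]
lac = [1.0, 1.5, 2.5, 4.5, 8.0]
v2 = lactate_velocity(v, lac, 2.0)   # velocity at 2 mmol/L -> 5.5
v4 = lactate_velocity(v, lac, 4.0)   # velocity at 4 mmol/L -> 6.75
```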

Relevance:

20.00%

Publisher:

Abstract:

The effect of training on blood lactate and plasma glucose concentrations during incremental-intensity treadmill exercise was determined in horses. Aerobic training was shown to decrease the maximal lactate concentration, and the lactate threshold was shown to correspond to the inflection point of the plasma glucose curve, confirming this parameter as an indicator of the aerobic capacity of horses.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a method for the automatic identification of dust devil tracks in MOC NA and HiRISE images of Mars. The method is based on Mathematical Morphology and is able to process those images successfully despite their differences in spatial resolution and scene size. A dataset of 200 images of the surface of Mars, representative of the diversity of those track features, was used for developing, testing, and evaluating the method, comparing the outputs with manually produced reference images. Analysis showed a mean accuracy of about 92%. We also give some examples of how to use the results to obtain information about dust devils, namely mean width, main direction of movement, and coverage per scene. (c) 2012 Elsevier Ltd. All rights reserved.
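The paper's exact morphological pipeline is not given in the abstract, but the core operations it builds on can be illustrated. The sketch below implements binary erosion, dilation, and opening in pure Python and shows how opening with an elongated structuring element keeps a track-like streak while removing isolated speckle; the image and structuring element are invented for illustration.

```python
def erode(img, se):
    """Binary erosion of img (rows of 0/1) by structuring element se,
    given as (dy, dx) offsets; pixels outside the image count as 0."""
    h, w = len(img), len(img[0])
    return [[int(all(0 <= y + dy < h and 0 <= x + dx < w
                     and img[y + dy][x + dx]
                     for dy, dx in se))
             for x in range(w)] for y in range(h)]

def dilate(img, se):
    """Binary dilation: a pixel is set if any se neighbor is set."""
    h, w = len(img), len(img[0])
    return [[int(any(0 <= y + dy < h and 0 <= x + dx < w
                     and img[y + dy][x + dx]
                     for dy, dx in se))
             for x in range(w)] for y in range(h)]

def opening(img, se):
    """Erosion followed by dilation: removes features smaller than se,
    separating thin track-like structures from speckle noise."""
    return dilate(erode(img, se), se)

# 1s form a horizontal streak (the "track") plus one isolated noise pixel
img = [[0] * 8 for _ in range(5)]
for x in range(1, 7):
    img[2][x] = 1
img[0][0] = 1
se = [(0, -1), (0, 0), (0, 1)]   # horizontal 1x3 structuring element
clean = opening(img, se)         # streak survives, speckle is removed
```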

Relevance:

20.00%

Publisher:

Abstract:

The rating of perceived exertion (RPE) is determined non-invasively and is used, together with the blood lactate response, as an intensity indicator during incremental tests. In the field, and especially in swimming, blood sampling is difficult, so alternative protocols are used to estimate the anaerobic threshold. The objectives of this study were therefore: to prescribe an incremental test based on RPE (Borg 6-20) in order to estimate the metabolic thresholds determined by lactate-based methods [bi-segmented fit (V LL), fixed concentration of 3.5 mM (V3.5mM), and maximal distance (V Dmax)]; to relate the RPE reported at each stage to heart rate (HR) and to mechanical swimming parameters [stroke rate (SR) and stroke length (SL)]; to analyze the use of the 6-20 scale in the regularity of the velocity increments of the test; and to correlate the metabolic thresholds with critical velocity (CV). To that end, 12 swimmers (16.4 ± 1.3 years) performed two maximal efforts (200 and 400 m), whose data were used to determine CV, the 400 m velocity (V400m), and the critical stroke rate; and an incremental test with stage intensities based on RPE values of 9, 11, 13, 15, and 17, respectively, during which HR, blood lactate, and the times of four stroke cycles and of the 20 m (central part of the pool) and 50 m distances were monitored at every stage. The stage velocities, SR, SL, V LL, V3.5mM, and V Dmax were then calculated. ANOVA and Pearson correlation were used to analyze the results. No differences were found among CV, V Dmax, and V LL, but V3.5mM was lower than the other velocities (P < 0.05). Significant correlations (P < 0.05) were observed between CV and V400m, V Dmax, and V3.5mM; between V400m and V3.5mM and V Dmax; between V Dmax and V LL; and, in the incremental test, between RPE and velocity, [Lac], HR, SR, and SL (P < 0.05).
We conclude that RPE is a reliable tool for controlling stage velocity during incremental tests in swimming.
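The critical velocity used in the study comes from the two maximal efforts: it is the slope of the distance-time line through the two performances. A minimal sketch with hypothetical effort times (the swimmers' actual times are not reported in the abstract):

```python
def critical_velocity(d1, t1, d2, t2):
    """Two-point critical velocity: slope of the distance-time line
    fitted through two maximal efforts (here 200 m and 400 m)."""
    return (d2 - d1) / (t2 - t1)

# hypothetical maximal-effort times in seconds
vc = critical_velocity(200, 140.0, 400, 300.0)  # (400-200)/(300-140) m/s
```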

Relevance:

20.00%

Publisher:

Abstract:

The present study compared blood glucose values, heart rate at rest and during exercise, and body composition between hypertensive and normotensive individuals. The sample comprised 32 young males with a mean age of 22.6 years. Blood pressure was first measured in order to divide the subjects into two groups: hypertensive and normotensive. Fasting blood glucose, bioelectrical impedance, anthropometry, and heart rate at rest, during a maximal exercise test, and during recovery were then measured. Statistical analysis comprised Student's t-test and two-way repeated-measures analysis of variance between the groups. The significance level adopted was p = 0.05. The data showed that hypertensive individuals present higher metabolic indices and hemodynamic values than normotensive individuals, these being indicators of cardiovascular risk.

Relevance:

20.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

20.00%

Publisher:

Abstract:

The scheme is based on Ami Harten's ideas (Harten, 1994), the main tools coming from wavelet theory, in the framework of multiresolution analysis for cell averages. But instead of evolving cell averages on the finest uniform level, we propose to evolve just the cell averages on the grid determined by the significant wavelet coefficients. Typically, there are few cells in each time step: big cells in smooth regions and smaller ones close to irregularities of the solution. For the numerical flux, we use a simple uniform central finite difference scheme, adapted to the size of each cell. If any of the required neighboring cell averages is not present, it is interpolated from coarser scales. But we switch to an ENO scheme in the finest part of the grids. To show the feasibility and efficiency of the method, it is applied to a system arising in polymer flooding of an oil reservoir. In terms of CPU time and memory requirements, it outperforms Harten's multiresolution algorithm.

The proposed method applies to systems of conservation laws in 1D,

$$\partial_t u(x,t) + \partial_x f(u(x,t)) = 0, \qquad u(x,t) \in \mathbb{R}^m. \quad (1)$$

In the spirit of finite volume methods, we consider the explicit scheme

$$v_\mu^{n+1} = v_\mu^n - \frac{\Delta t}{h_\mu}\left(\bar f_\mu - \bar f_{\mu^-}\right) = [D v^n]_\mu, \quad (2)$$

where $\mu$ is a point of an irregular grid $\Gamma$, $\mu^-$ is the left neighbor of $\mu$ in $\Gamma$, $v_\mu^n \approx \frac{1}{\mu - \mu^-} \int_{\mu^-}^{\mu} u(x, t_n)\,dx$ are the approximate cell averages of the solution, $\bar f_\mu = \bar f_\mu(v^n)$ are the numerical fluxes, and $D$ is the numerical evolution operator of the scheme.

Depending on the definition of $\bar f_\mu$, several schemes of this type have been proposed and successfully applied (LeVeque, 1990); Godunov, Lax-Wendroff, and ENO are some of the popular names. The Godunov scheme resolves shocks well, but its first-order accuracy is poor in smooth regions. Lax-Wendroff is of second order, but produces dangerous oscillations close to shocks. ENO schemes are good alternatives, with high order and without serious oscillations, but the price is high computational cost.

Ami Harten proposed in (Harten, 1994) a simple strategy to save expensive ENO flux calculations. The basic tools come from multiresolution analysis for cell averages on uniform grids, and the principle is that wavelet coefficients can be used to characterize local smoothness. Typically, only a few wavelet coefficients are significant. At the finest level, they indicate discontinuity points, where ENO numerical fluxes are computed exactly; elsewhere, cheaper fluxes can be safely used, or just interpolated from coarser scales. Different applications of this principle have been explored by several authors; see for example (G-Muller and Muller, 1998).

Our scheme also uses Ami Harten's ideas, but instead of evolving the cell averages on the finest uniform level, we propose to evolve the cell averages on sparse grids associated with the significant wavelet coefficients. This means that the total number of cells is small, with big cells in smooth regions and smaller ones close to irregularities. This task requires improved new tools, which are described next.
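The explicit evolution step (2) and the detail (wavelet) coefficients that flag irregularities can be sketched in miniature. This is a toy illustration, not the authors' adaptive central/ENO hybrid: it uses a uniform grid, a local Lax-Friedrichs flux as a stand-in numerical flux, and the simplest cell-average prediction (copying the parent average); all names are ours.

```python
def multires_details(cell_avgs):
    """Harten-style detail coefficients for cell averages: the error of
    predicting each fine average from the next coarser level (here the
    crudest prediction, the parent average itself)."""
    coarse = [(a + b) / 2 for a, b in zip(cell_avgs[::2], cell_avgs[1::2])]
    details = [a - c for a, c in zip(cell_avgs[::2], coarse)]
    return coarse, details

def fv_step(u, f, dt, h, alpha):
    """One explicit finite-volume step of scheme (2) on a uniform grid,
    v_mu^{n+1} = v_mu^n - (dt/h)(fbar_mu - fbar_{mu-}), with a local
    Lax-Friedrichs numerical flux (alpha = max wave speed)."""
    fbar = [0.5 * (f(u[i]) + f(u[i + 1])) - 0.5 * alpha * (u[i + 1] - u[i])
            for i in range(len(u) - 1)]
    return [u[i] - (dt / h) * (fbar[i] - fbar[i - 1])
            for i in range(1, len(u) - 1)]

# Burgers flux f(u) = u^2/2 with a shock initial condition (sketch only)
n, h = 32, 1.0 / 32
u = [1.0 if i < n // 2 else 0.0 for i in range(n)]
for _ in range(5):
    interior = fv_step(u, lambda q: 0.5 * q * q, dt=0.5 * h, h=h, alpha=1.0)
    u = [u[0]] + interior + [u[-1]]          # keep boundary cells fixed
coarse, details = multires_details(u)
# the details are large only near the moving shock, small elsewhere
```

In the full method, only cells whose details exceed a tolerance would be kept and evolved, with missing neighbors interpolated from coarser scales.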

Relevance:

20.00%

Publisher:

Abstract:

We present a new algorithm for Reverse Monte Carlo (RMC) simulations of liquids. During the simulations, we calculate energy, excess chemical potentials, bond-angle distributions and three-body correlations. This allows us to test the quality and physical meaning of RMC-generated results and its limitations. It also indicates the possibility to explore orientational correlations from simple scattering experiments. The new technique has been applied to bulk hard-sphere and Lennard-Jones systems and compared to standard Metropolis Monte Carlo results. (C) 1998 American Institute of Physics.
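The abstract does not detail the new algorithm, but the core RMC acceptance rule can be sketched: particle moves are accepted when they reduce the misfit to target structural data, and occasionally otherwise. The 1D toy below fits a pair-distance histogram rather than the authors' 3D hard-sphere/Lennard-Jones structure factors; all parameters are invented, and the full histogram is naively recomputed each step for clarity.

```python
import math
import random

def rmc_fit(target_hist, n=20, box=10.0, bins=5, steps=1500,
            sigma=0.1, seed=1):
    """Reverse Monte Carlo sketch: random single-particle moves are
    accepted when they reduce the chi^2 misfit between the simulated
    pair-distance histogram and a target histogram; uphill moves are
    accepted with probability exp(-d_chi2 / 2)."""
    rng = random.Random(seed)
    x = [rng.uniform(0, box) for _ in range(n)]

    def hist(pos):
        h = [0] * bins
        for i in range(n):
            for j in range(i + 1, n):
                d = abs(pos[i] - pos[j])
                d = min(d, box - d)          # periodic minimum image
                h[min(bins - 1, int(d / (box / 2) * bins))] += 1
        return h

    def chi2(h):
        return sum((a - b) ** 2 for a, b in zip(h, target_hist))

    c0 = chi2(hist(x))
    c, best = c0, c0
    for _ in range(steps):
        i = rng.randrange(n)
        old = x[i]
        x[i] = (old + rng.gauss(0, sigma * box)) % box
        c_new = chi2(hist(x))
        if c_new <= c or rng.random() < math.exp(-(c_new - c) / 2):
            c = c_new                        # accept the move
            best = min(best, c)
        else:
            x[i] = old                       # reject, restore position
    return c0, best

# target: a uniform pair-distance histogram (20 particles -> 190 pairs)
initial_chi2, best_chi2 = rmc_fit([38] * 5)
```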

Relevance:

20.00%

Publisher:

Abstract:

This work summarizes the HdHr group of Hermitian integration algorithms for dynamic structural analysis applications. It proposes a procedure for their use when nonlinear terms are present in the equilibrium equation. The simple pendulum problem is solved as a first example and the numerical results are discussed. Directions to be pursued in future research are also mentioned. Copyright (C) 2009 H.M. Bottura and A. C. Rigitano.
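The HdHr formulas are not given in the abstract, so the first example problem can only be sketched with a standard integrator in their place. The code below solves the nonlinear simple pendulum with classical RK4, plainly a stand-in for the Hermitian algorithms; the parameters are illustrative.

```python
import math

def pendulum_rk4(theta0, omega0, g_over_l=9.81, dt=0.001, t_end=2.0):
    """Nonlinear pendulum theta'' = -(g/l) sin(theta), integrated with
    classical RK4 as a stand-in for the HdHr Hermitian algorithms."""
    def deriv(th, om):
        return om, -g_over_l * math.sin(th)

    th, om = theta0, omega0
    for _ in range(int(t_end / dt)):
        k1 = deriv(th, om)
        k2 = deriv(th + 0.5 * dt * k1[0], om + 0.5 * dt * k1[1])
        k3 = deriv(th + 0.5 * dt * k2[0], om + 0.5 * dt * k2[1])
        k4 = deriv(th + dt * k3[0], om + dt * k3[1])
        th += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        om += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return th, om

th, om = pendulum_rk4(theta0=0.5, omega0=0.0)
# the energy E = om^2/2 - (g/l) cos(th) should be nearly conserved
```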

Relevance:

20.00%

Publisher:

Abstract:

The Capacitated Centered Clustering Problem (CCCP) consists of defining a set of p groups with minimum dissimilarity on a network with n points. Demand values are associated with each point, and each group has a demand capacity. The problem is well known to be NP-hard and has many practical applications. In this paper, the hybrid method Clustering Search (CS) is implemented to solve the CCCP. This method identifies promising regions of the search space by generating solutions with a metaheuristic, such as a Genetic Algorithm, and grouping them into clusters that are then explored further with local search heuristics. Computational results on instances available in the literature are presented to demonstrate the efficacy of CS. (C) 2010 Elsevier Ltd. All rights reserved.
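To make the problem concrete, the sketch below is a greedy construction heuristic for the CCCP, not the Clustering Search metaheuristic evaluated in the paper: points are assigned to the nearest center whose residual capacity covers their demand, and each center then moves to the centroid of its group. The instance and parameter names are invented for illustration.

```python
def cccp_greedy(points, demands, p, capacity, iters=10):
    """Greedy sketch for the Capacitated Centered Clustering Problem:
    capacity-respecting nearest-center assignment alternated with
    centroid updates (a Lloyd-style construction heuristic)."""
    centers = points[:p]                       # naive initial centers
    groups = [[] for _ in range(p)]
    for _ in range(iters):
        load = [0.0] * p
        groups = [[] for _ in range(p)]
        order = sorted(range(len(points)), key=lambda i: -demands[i])
        for i in order:                        # place big demands first
            x, y = points[i]
            ranked = sorted(range(p),
                            key=lambda k: (x - centers[k][0]) ** 2
                                        + (y - centers[k][1]) ** 2)
            for k in ranked:                   # nearest feasible center
                if load[k] + demands[i] <= capacity:
                    load[k] += demands[i]
                    groups[k].append(i)
                    break
            else:
                raise ValueError("infeasible: no center can take the point")
        centers = [(sum(points[i][0] for i in g) / len(g),
                    sum(points[i][1] for i in g) / len(g)) if g
                   else centers[k]
                   for k, g in enumerate(groups)]
    return centers, groups

# two well-separated clouds of three unit-demand points each
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers, groups = cccp_greedy(pts, demands=[1] * 6, p=2, capacity=4)
```

A metaheuristic such as CS would use solutions like this only as starting material, clustering them and applying local search to the promising ones.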