960 results for "Improved sequential algebraic algorithm"


Relevance: 20.00%

Abstract:

The Lanczos algorithm is appreciated in many situations for its speed and economy of storage. However, the advantage that the Lanczos basis vectors need not be kept is lost when the algorithm is used to compute the action of a matrix function on a vector: either the basis vectors must be kept, or the Lanczos process must be applied twice. In this study we describe an augmented Lanczos algorithm to compute a dot product relative to a function of a large sparse symmetric matrix, without keeping the basis vectors.
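The storage trade-off described above can be made concrete with a sketch of the standard (non-augmented) Lanczos evaluation of u^T f(A) v, the approach the paper improves upon: the basis Q must be kept to form the final product. Function and variable names here are illustrative, not the paper's.

```python
import numpy as np

def lanczos_quadratic_form(A, u, v, f, m=30):
    """Approximate u^T f(A) v for symmetric A via m Lanczos steps.
    Note that the basis Q is stored (shape n x m): this is exactly the
    memory cost that the paper's augmented variant avoids."""
    n = len(v)
    Q = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    q = v / np.linalg.norm(v)
    q_prev = np.zeros(n)
    b = 0.0
    for j in range(m):
        Q[:, j] = q
        w = A @ q - b * q_prev          # three-term Lanczos recurrence
        alpha[j] = q @ w
        w -= alpha[j] * q
        b = np.linalg.norm(w)
        beta[j] = b
        if b < 1e-12:                   # invariant subspace found
            m = j + 1
            break
        q_prev, q = q, w / b
    # tridiagonal projection T = Q^T A Q
    T = np.diag(alpha[:m]) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
    # f(A) v ~ ||v|| Q f(T) e1 ; forming this product needs the stored basis
    eigval, eigvec = np.linalg.eigh(T)
    fT_e1 = eigvec @ (f(eigval) * eigvec[0, :])
    return np.linalg.norm(v) * (u @ (Q[:, :m] @ fT_e1))
```

With m equal to the matrix dimension the approximation is exact up to roundoff, which makes the sketch easy to check against a direct eigendecomposition.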

Relevance: 20.00%

Abstract:

A new algebraic Bethe ansatz scheme is proposed to diagonalize classes of integrable models relevant to the description of Bose-Einstein condensation in dilute alkali gases. This is achieved by introducing the notion of Z-graded representations of the Yang-Baxter algebra. (C) 2003 American Institute of Physics.

Relevance: 20.00%

Abstract:

Most finite element packages use the Newmark algorithm for time integration of structural dynamics. Various algorithms have been proposed to better optimize the high frequency dissipation of this algorithm. Hulbert and Chung proposed both implicit and explicit forms of the generalized alpha method. The algorithms optimize high frequency dissipation effectively, and despite recent work on algorithms that possess momentum conserving/energy dissipative properties in a non-linear context, the generalized alpha method remains an efficient way to solve many problems, especially with adaptive timestep control. However, the implicit and explicit algorithms use incompatible parameter sets and cannot be used together in a spatial partition, whereas this can be done for the Newmark algorithm, as Hughes and Liu demonstrated, and for the HHT-alpha algorithm developed from it. The present paper shows that the explicit generalized alpha method can be rewritten so that it becomes compatible with the implicit form. All four algorithmic parameters can be matched between the explicit and implicit forms. An element interface between implicit and explicit partitions can then be used, analogous to that devised by Hughes and Liu to extend the Newmark method. The stability of the explicit/implicit algorithm is examined in a linear context and found to exceed that of the explicit partition. The element partition is significantly less dissipative of intermediate frequencies than one using the HHT-alpha method. The explicit algorithm can also be rewritten so that the discrete equation of motion evaluates forces from displacements and velocities found at the predicted mid-point of a cycle. Copyright (C) 2003 John Wiley & Sons, Ltd.
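Since the paper builds on the Newmark family, a minimal sketch of implicit Newmark time integration for a linear single-degree-of-freedom system may help fix ideas. The alpha-weighting of force and inertia terms that distinguishes the generalized alpha method is omitted; the function and parameter names are illustrative, not the paper's.

```python
import numpy as np

def newmark_sdof(m, c, k, f, u0, v0, dt, nsteps, beta=0.25, gamma=0.5):
    """Implicit Newmark integration of m*u'' + c*u' + k*u = f(t).
    Defaults beta=1/4, gamma=1/2 give the average-acceleration rule,
    which conserves amplitude for undamped linear vibration."""
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m       # consistent initial acceleration
    out = [u]
    for n in range(1, nsteps + 1):
        t = n * dt
        # Newmark predictors (known quantities at the new time level)
        u_pred = u + dt * v + dt**2 * (0.5 - beta) * a
        v_pred = v + dt * (1.0 - gamma) * a
        # solve the equation of motion at t_{n+1} for the new acceleration
        a_new = (f(t) - c * v_pred - k * u_pred) / (m + gamma * dt * c + beta * dt**2 * k)
        u = u_pred + beta * dt**2 * a_new
        v = v_pred + gamma * dt * a_new
        a = a_new
        out.append(u)
    return np.array(out)
```

A quick sanity check: free vibration with period 1 s should return to its starting displacement after one period, with no amplitude growth.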

Relevance: 20.00%

Abstract:

This article presents Monte Carlo techniques for estimating network reliability. For highly reliable networks, techniques based on graph evolution models provide very good performance; however, they are known to have significant simulation cost. An existing hybrid scheme (based on partitioning the time space) is available to speed up the simulations, but there are difficulties in optimizing the important parameter associated with this scheme. To overcome these difficulties, a new hybrid scheme (based on partitioning the edge set) is proposed in this article. The proposed scheme shows orders-of-magnitude performance improvement over the existing techniques for certain classes of networks. It also provides reliability bounds with little overhead.
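For context, a crude Monte Carlo estimator of two-terminal unreliability looks as follows; it is precisely this naive sampling that becomes expensive for highly reliable networks (almost every sample finds the network operational), which motivates the evolution-model and hybrid schemes discussed above. The function and its signature are illustrative, not from the article.

```python
import random

def mc_unreliability(nodes, edges, p_fail, s, t, trials=20000, seed=1):
    """Crude Monte Carlo estimate of P(no s-t path) when each edge
    fails independently with probability p_fail."""
    rng = random.Random(seed)
    fail_count = 0
    for _ in range(trials):
        # sample the surviving edges of one network state
        up = [e for e in edges if rng.random() > p_fail]
        adj = {n: [] for n in nodes}
        for a, b in up:
            adj[a].append(b)
            adj[b].append(a)
        # depth-first search from s over surviving edges
        seen, stack = {s}, [s]
        while stack:
            x = stack.pop()
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        if t not in seen:
            fail_count += 1
    return fail_count / trials
```

For a two-edge series network with edge failure probability 0.1, the exact unreliability is 1 - 0.9^2 = 0.19, which the estimator should reproduce to within sampling error.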

Relevance: 20.00%

Abstract:

In order to meet increasingly stringent European discharge standards, new applications and control strategies for the sustainable removal of ammonia from wastewater have to be implemented. In this paper we discuss a nitrogen removal system based on the processes of partial nitrification and anoxic ammonia oxidation (anammox). The anammox process offers great opportunities to remove ammonia in fully autotrophic systems with biomass retention. No organic carbon is needed in such a nitrogen removal system, since ammonia is used as electron donor for nitrite reduction. The nitrite can be produced from ammonia in oxygen-limited biofilm systems or in continuous processes without biomass retention. For successful implementation of the combined processes, accurate biosensors for measuring ammonia and nitrite concentrations, insight into the complex microbial communities involved, and new control strategies have to be developed and evaluated.

Relevance: 20.00%

Abstract:

We have recently developed a scalable Artificial Boundary Inhomogeneity (ABI) method [Chem. Phys. Lett. 366, 390–397 (2002)] based on the utilization of the Lanczos algorithm, and in this work we explore an alternative iterative implementation based on the Chebyshev algorithm. Detailed comparisons between the two iterative methods have been made in terms of efficiency as well as convergence behavior. The Lanczos subspace ABI method was also further improved by the use of a simpler three-term backward recursion algorithm to solve the subspace linear system. The two iterative methods are tested on model collinear H + H2 reactive state-to-state scattering.
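A minimal sketch of the Chebyshev side of such a comparison: approximating f(A)v for a symmetric matrix by a Chebyshev expansion needs only a three-term recurrence of matrix-vector products, with no stored Krylov basis. The spectral-bound handling and all names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def chebyshev_fAv(A, v, f, degree=50, spec=None):
    """Approximate f(A) v for symmetric A via a Chebyshev expansion."""
    if spec is None:
        # crude spectral bounds from Gershgorin discs (assumed adequate)
        r = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
        a, b = np.min(np.diag(A) - r), np.max(np.diag(A) + r)
    else:
        a, b = spec
    # Chebyshev interpolation coefficients of f on [a, b]
    k = np.arange(degree + 1)
    theta = np.pi * (k + 0.5) / (degree + 1)
    x = np.cos(theta)
    fx = f(0.5 * (b - a) * x + 0.5 * (b + a))
    c = np.array([2.0 / (degree + 1) * np.sum(fx * np.cos(j * theta))
                  for j in range(degree + 1)])
    c[0] *= 0.5
    # three-term recurrence T_{j+1} = 2*As*T_j - T_{j-1} on the scaled matrix
    alpha, beta = 2.0 / (b - a), (a + b) / (b - a)
    As = lambda w: alpha * (A @ w) - beta * w
    t_prev, t_cur = v, As(v)
    result = c[0] * t_prev + c[1] * t_cur
    for j in range(2, degree + 1):
        t_next = 2.0 * As(t_cur) - t_prev
        result += c[j] * t_next
        t_prev, t_cur = t_cur, t_next
    return result
```

For smooth functions such as the exponential, the coefficients decay faster than geometrically, so a moderate degree already reaches machine precision on a small test matrix.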

Relevance: 20.00%

Abstract:

In standard cylindrical gradient coils consisting of a single layer of wires, a limiting factor in achieving very large magnetic field gradients is the rapid increase in coil resistance with efficiency. This is a particular problem in small-bore scanners, such as those used for MR microscopy. By adopting a multi-layer design in which the coil wires are allowed to spread out into multiple layers wound at increasing radii, a more favourable scaling of resistance with efficiency is achieved, thus allowing the design of more powerful gradient coils with acceptable resistance values. Previously this approach has been applied to the design of unshielded, longitudinal, and transverse gradient coils. Here, the multi-layer approach has been extended to allow the design of actively shielded multi-layer gradient coils, and also to produce coils exhibiting enhanced cooling characteristics. An iterative approach to modelling the steady-state temperature distribution within the coil has also been developed. Results indicate that a good level of screening can be achieved in multi-layer coils, that small versions of such coils can yield higher efficiencies at fixed resistance than conventional two-layer (primary and screen) coils, and that performance improves as the number of layers increases. Simulations show that by optimising multi-layer coils for cooling it is possible to achieve significantly higher gradient strengths at a fixed maximum operating temperature. A four-layer coil of 8 mm inner diameter has been constructed and used to test the steady-state temperature model. (C) 2003 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

A combined Genetic Algorithm and Method of Moments design method is presented for the design of unusual near-field antennas for use in Magnetic Resonance Imaging systems. The method is successfully applied to the design of an asymmetric coil structure for use at 190 MHz and demonstrates excellent radiofrequency field homogeneity.
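As a hedged illustration of the genetic-algorithm half of such a method (the Method of Moments field solver it would be coupled to is beyond a sketch), a toy real-coded GA with truncation selection, blend crossover and Gaussian mutation might look like this; every name and parameter value is an assumption, not the paper's.

```python
import random

def genetic_minimize(fitness, bounds, pop_size=40, gens=60, seed=2):
    """Toy real-coded genetic algorithm: truncation selection, blend
    crossover, Gaussian mutation. In an antenna-design setting the
    fitness would call a field solver; here it is any callable."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness)
        elite = scored[: pop_size // 5]        # keep the best fifth
        children = list(elite)                 # elitism: carry them over
        while len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            child = []
            for (x1, x2), (lo, hi) in zip(zip(p1, p2), bounds):
                g = 0.5 * (x1 + x2)                     # blend crossover
                g += rng.gauss(0.0, 0.05 * (hi - lo))   # Gaussian mutation
                child.append(min(hi, max(lo, g)))       # clip to bounds
            children.append(child)
        pop = children
    return min(pop, key=fitness)
```

On a simple convex test function the sketch should drive the objective close to its minimum within a few dozen generations.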

Relevance: 20.00%

Abstract:

When asked to compare two lateralized shapes for horizontal size, neglect patients often indicate the left stimulus to be smaller. Gainotti and Tiacci (1971) hypothesized that this phenomenon might be related to a rightward bias in the patients' gaze. This study aimed to assess the relation between this size underestimation and oculomotor asymmetries. Eye movements were recorded while three neglect patients judged the horizontal extent of two rectangles. Two experimental manipulations were performed to increase the likelihood of symmetrical scanning of the stimulus display. The first manipulation entailed a sequential, rather than simultaneous presentation of the two rectangles. The second required adaptation to rightward displacing prisms, which is known to reduce many manifestations of neglect. All patients consistently underestimated the left rectangle, but the pattern of verbal responses and eye movements suggested different underlying causes. These include a distortion of space perception without ocular asymmetry, a failure to view the full leftward extent of the left stimulus, and a high-level response bias. Sequential presentation of the rectangles and prism adaptation reduced ocular asymmetries without affecting size underestimation. Overall, the results suggest that leftward size underestimation in neglect can arise for a number of different reasons. Incomplete leftward scanning may perhaps be sufficient to induce perceptual size distortion, but it is not a necessary prerequisite.

Relevance: 20.00%

Abstract:

This work investigated the problem of modelling the dispersion of odorant compounds in the presence of obstacles (cubic and complex-shaped) under neutral atmospheric stability. Numerical modelling based on the transport equations (CFD) was employed, as well as algebraic models based on the Gaussian plume (AERMOD, CALPUFF and FPM). Data from wind-tunnel and field experiments were used to validate the model results and evaluate their performance. To include the effects of atmospheric turbulence on dispersion, two different subgrid models associated with Large Eddy Simulation (LES) were investigated (dynamic Smagorinsky and WALE), and to include the effects of obstacles on dispersion in the Gaussian models, the PRIME model was employed. The use of PRIME with FPM was also proposed as an innovation. Overall, the results indicate that CFD/LES is a useful tool for investigating the dispersion and impact of odorant compounds in the presence of obstacles, and also for the development of Gaussian models. The results also indicate that the proposed FPM model, with obstacle effects included via PRIME, is a very useful tool for odour dispersion modelling owing to its simplicity and easy set-up compared with more complex models such as CFD and even the regulatory models AERMOD and CALPUFF. The great advantage of FPM is the possibility of estimating the intermittency factor and the peak-to-mean (P/M) ratio, parameters useful for assessing odour impact. The results obtained in this work indicate that the determination of the dispersion parameters for the plume segments, as well as the long-time parameters near the source and the obstacle, can be improved in the FPM model, and CFD simulations can be used as a development tool for this purpose.
Keywords: odour control, dispersion, computational fluid dynamics, mathematical modelling, fluctuating Gaussian plume modelling, large eddy simulation (LES).
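The Gaussian-plume family referred to above (AERMOD, CALPUFF, FPM) shares a common core, which can be sketched as the classic ground-reflected Gaussian plume formula. This simplification omits everything that distinguishes those models (obstacle effects as in PRIME, plume fluctuation, stability-dependent dispersion parameters); names and units are illustrative.

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (g/m^3) at
    crosswind offset y and height z (m), for emission rate Q (g/s),
    wind speed u (m/s) and effective stack height H (m). sigma_y and
    sigma_z are the dispersion parameters already evaluated at the
    downwind distance of interest."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # second exponential: image source reflecting the plume off the ground
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

Two basic properties are easy to verify: the concentration is symmetric about the plume centreline and maximal on it.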

Relevance: 20.00%

Abstract:

Many organisations need to extract useful information from huge amounts of movement data. One example is found in maritime transportation, where the automated identification of a diverse range of traffic routes is a key management issue for improving the maintenance of ports and ocean routes, and accelerating ship traffic. This paper addresses, in a first stage, the research challenge of developing an approach for the automated identification of traffic routes based on clustering motion vectors rather than reconstructed trajectories. The immediate benefit of the proposed approach is to avoid the reconstruction of trajectories in terms of the geometric shape of the path, position in space, life span, and changes of speed, direction and other attributes over time. For clustering the moving objects, an adapted version of the Shared Nearest Neighbour algorithm is used. The motion vectors, each with a position and a direction, are analysed in order to identify clusters of vectors that are moving in the same direction. These clusters represent traffic routes, and the preliminary results have shown to be promising for the automated identification of traffic routes with different shapes and densities, as well as for handling noisy data.
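A minimal sketch of the shared-nearest-neighbour similarity at the heart of the SNN algorithm, applied to generic feature vectors. For motion vectors one might encode position plus direction, e.g. (x, y, cos θ, sin θ); that encoding, like the function name, is an assumption, and the density-based cluster-extraction stage of the adapted SNN algorithm is not shown.

```python
import numpy as np

def snn_similarity(features, k=5):
    """Shared-nearest-neighbour similarity matrix: entry (i, j) counts
    how many points appear in both i's and j's k-nearest-neighbour
    lists (self excluded)."""
    # pairwise Euclidean distances; diagonal set to inf to exclude self
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    knn = [set(np.argsort(row)[:k]) for row in d]
    n = len(features)
    sim = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            sim[i, j] = sim[j, i] = len(knn[i] & knn[j])
    return sim
```

Points in the same dense group share neighbours and get positive similarity, while points from well-separated groups share none, which is what makes SNN robust to clusters of differing density.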

Relevance: 20.00%

Abstract:

Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low speed damageability) is one of the most important attributes. In order to be able to fulfill the increased requirements in the framework of shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made is the actual part thickness which, in reality, can vary locally. However, a constant thickness value is almost always defined throughout the entire part for reasons of complexity. On the other hand, for precise fracture analysis within FEM, correct thickness consideration is one key enabler. Thus, availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
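The nearest-neighbour idea can be sketched as follows: for each midplane mesh node, find the closest point in a point cloud sampled from the CAD surface and take twice that distance as the local thickness (assuming the midplane lies midway between the two surfaces). This brute-force version, with no 3D range-search acceleration and no ray tracing, is an illustration only, not the paper's algorithm.

```python
import numpy as np

def nn_thickness(mid_points, surface_points):
    """Nearest-neighbour thickness estimate.

    mid_points     : (n, 3) array of midplane mesh node positions
    surface_points : (m, 3) array sampled from the CAD outer surfaces
    Returns an (n,) array: twice the distance from each midplane node
    to its nearest surface sample.
    """
    # full pairwise distance matrix (a KD-tree or range search would
    # replace this in a scalable implementation)
    d = np.linalg.norm(mid_points[:, None, :] - surface_points[None, :, :], axis=2)
    return 2.0 * d.min(axis=1)
```

For a flat plate of unit thickness sampled on both faces, a midplane node directly opposite a surface sample recovers the exact thickness; near edges or curved regions the estimate degrades, which is the kind of geometric arrangement the paper's accuracy analysis characterises.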

Relevance: 20.00%

Abstract:

Biocompatibility is a major challenge for the successful application of many biomaterials. In this study, the ability to coat chemically and enzymatically activated poly(L-lactic acid) (PLA) membranes with heat-denatured human serum albumin to improve biocompatibility was investigated. PLA membranes hydrolyzed with NaOH or cutinase and then treated with the heterobifunctional cross-linker 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide hydrochloride (EDAC) promoted coupling between –COOH groups on the PLA membranes and –NH2 groups of heat-denatured human serum albumin. This resulted in increased hydrophilicity (lowest water contact angles of 43° and 35°) and the highest antioxidant activity (quenching of 79 μM and 115 μM tetramethylazobisquinone (TMAMQ) for NaOH- and cutinase-pretreated membranes, respectively). FTIR analysis of modified PLA membranes showed new peaks attributed to human serum albumin (amide bond, NH2 and side-chain stretching) appearing within 3600–3000 cm−1 and 1700–1500 cm−1 (Fig. 3). MTT studies also showed that the viability of osteoblast-like MC-3T3-E1 cells increased 2.4 times compared to untreated PLA membranes. The study therefore shows that this strategy of modifying the surfaces of PLA polymers could significantly improve biocompatibility.

Relevance: 20.00%

Abstract:

Quantitative analysis of cine cardiac magnetic resonance (CMR) images for the assessment of global left ventricular morphology and function remains a routine task in clinical cardiology practice. To date, this process requires user interaction and therefore prolongs the examination (i.e. cost) and introduces observer variability. In this study, we sought to validate the feasibility, accuracy, and time efficiency of a novel framework for automatic quantification of left ventricular global function in a clinical setting.

Relevance: 20.00%

Abstract:

In sport there is a great need to obtain as much information as possible about the factors which affect the dynamics of play. This study uses sequential analysis and temporal patterns (T-patterns) to examine the evolution of defence (against an equal number of attackers) as used by the Spanish handball team at the Beijing 2008 Olympic Games. The aim is to help handball coaches (during their training and gathering of professional experience) to understand the importance of the structure of defensive systems. This can be achieved through observational processes that reveal the evolution and adaptation of these defensive systems according to different variables: the match score, the response of the opposing team, and progress through the tournament.