959 results for Finite volume methods


Relevance: 30.00%

Publisher:

Abstract:

The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal regression is the posterior mean and its computation scales as O(n³), where n is the sample size. We show that the optimal m-dimensional linear model under a given prior is spanned by the first m eigenfunctions of a covariance operator, which is a trace-class operator. This is an infinite-dimensional analogue of principal component analysis. The importance of Hilbert space methods to practical statistics is also discussed.
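The O(n³) cost mentioned above comes from solving an n × n linear system in the covariance matrix. A minimal sketch of the Gaussian-process posterior mean, with an illustrative squared-exponential kernel (the kernel choice and all names are my own, not the paper's):

```python
import numpy as np

def gp_posterior_mean(X, y, Xs, kernel, noise_var=1e-2):
    """Posterior mean at test points Xs; the n x n solve is the O(n^3) step."""
    K = kernel(X[:, None], X[None, :])    # n x n covariance matrix
    Ks = kernel(Xs[:, None], X[None, :])  # cross-covariance, test vs. training
    alpha = np.linalg.solve(K + noise_var * np.eye(len(X)), y)  # O(n^3)
    return Ks @ alpha

# Illustrative squared-exponential covariance
rbf = lambda a, b: np.exp(-0.5 * (a - b) ** 2)

X = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * X)
Xs = np.array([0.5])
print(gp_posterior_mean(X, y, Xs, rbf))
```

Note that the solve is against the training targets only, so predicting at further test points reuses `alpha` at O(n) cost per point.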

Relevance: 30.00%

Publisher:

Abstract:

We investigate the performance of parity check codes using the mapping onto spin glasses proposed by Sourlas. We study codes where each parity check comprises products of K bits selected from the original digital message, with exactly C parity checks per message bit. We show, using the replica method, that these codes saturate Shannon's coding bound for K → ∞ when the code rate K/C is finite. We then examine the finite temperature case to assess the use of simulated annealing methods for decoding, study the performance of the finite K case and extend the analysis to accommodate different types of noisy channels. The analogy between statistical physics methods and decoding by belief propagation is also discussed.
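In spin language, each message bit is a spin ±1 and a parity check is a product of K such spins (equivalently, an XOR of K binary bits); with C checks per bit the code rate is K/C. A toy encoding sketch under those definitions (sizes and the random construction are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, C = 12, 3, 4            # message bits, bits per check, checks per bit
M = N * C // K                # 16 parity checks; rate N/M = K/C

xi = rng.choice([-1, 1], size=N)        # message bits as Ising spins
slots = np.repeat(np.arange(N), C)      # each bit used in exactly C checks
rng.shuffle(slots)
checks = slots.reshape(M, K)            # a check may repeat a bit (toy setup)
J = xi[checks].prod(axis=1)             # transmitted parity-check values
print(J)
```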

Relevance: 30.00%

Publisher:

Abstract:

The aim of this letter is to demonstrate that complete removal of the spectral aliasing that occurs due to the finite numerical bandwidth used in split-step Fourier simulations of nonlinear interactions of optical waves can be achieved by enlarging each dimension of the spectral domain by a factor (n+1)/2, where n is the number of interacting waves. Alternatively, when using low-pass filtering for dealiasing, this amounts to the need to filter down to a 2/(n+1) fraction of each spectral dimension.
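For the simplest case of two interacting waves (n = 2, a quadratic nonlinearity) the rule above reduces to the familiar 3/2 padding rule. A minimal 1-D sketch of the padding procedure (function and variable names are my own):

```python
import numpy as np

def dealiased_product(u, v, n_waves=2):
    """Alias-free pointwise product of two band-limited periodic fields,
    zero-padding the spectral domain by a factor (n_waves + 1) / 2."""
    N = len(u)
    M = int(np.ceil(N * (n_waves + 1) / 2))
    half = N // 2
    U, V = np.fft.fft(u), np.fft.fft(v)
    Up = np.zeros(M, complex); Vp = np.zeros(M, complex)
    Up[:half], Up[-half:] = U[:half], U[-half:]
    Vp[:half], Vp[-half:] = V[:half], V[-half:]
    # multiply on the enlarged grid, where aliases fall outside the kept band
    w = (M / N) ** 2 * np.fft.ifft(Up) * np.fft.ifft(Vp)
    W = np.fft.fft(w) * N / M
    Wt = np.zeros(N, complex)
    Wt[:half], Wt[-half:] = W[:half], W[-half:]
    return np.fft.ifft(Wt)
```

For modes low enough that no aliasing occurs anyway, the result coincides with the direct pointwise product, which makes the routine easy to check.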

Relevance: 30.00%

Publisher:

Abstract:

The modelling of mechanical structures using finite element analysis has become an indispensable stage in the design of new components and products. Once the theoretical design has been optimised, a prototype may be constructed and tested. What can the engineer do if the measured and theoretically predicted vibration characteristics of the structure are significantly different? This thesis considers the problem of changing the parameters of the finite element model to improve the correlation between a physical structure and its mathematical model. Two new methods are introduced to perform the systematic parameter updating. The first uses the measured modal model to derive the parameter values with the minimum variance. The user must provide estimates of the variance of the theoretical parameter values and of the measured data. Previous authors using similar methods have assumed that the estimated parameters and measured modal properties are statistically independent. This will generally be the case during the first iteration but will not be the case subsequently. The second method updates the parameters directly from the frequency response functions. The order of the finite element model of the structure is reduced as a function of the unknown parameters. A method related to a weighted equation error algorithm is used to update the parameters. After each iteration the weighting changes so that, on convergence, the output error is minimised. The suggested methods are extensively tested using simulated data. An H-frame is then used to demonstrate the algorithms on a physical structure.
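The first method's update step can be sketched as a standard minimum-variance (Gauss–Markov) estimator. This sketch assumes independence between the parameter estimates and the measured data, which, as noted above, holds only for the first iteration; the names and exact formulation are illustrative, not the thesis's:

```python
import numpy as np

def min_variance_update(theta, P_theta, z_meas, V_z, z_model, S):
    """One minimum-variance parameter updating step.
    theta   : current parameter estimates
    P_theta : covariance of the parameter estimates (user-supplied variances)
    z_meas  : measured modal data, with covariance V_z
    z_model : modal data predicted by the current finite element model
    S       : sensitivity matrix d(z_model)/d(theta)"""
    G = P_theta @ S.T @ np.linalg.inv(S @ P_theta @ S.T + V_z)  # gain matrix
    return theta + G @ (z_meas - z_model)

# Scalar illustration: equal confidence in model and measurement moves
# the parameter half-way towards the measured value.
theta_new = min_variance_update(np.array([1.0]), np.eye(1),
                                np.array([2.0]), np.eye(1),
                                np.array([1.0]), np.eye(1))
print(theta_new)   # -> [1.5]
```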

Relevance: 30.00%

Publisher:

Abstract:

Conventional structured methods of software engineering are often based on the use of functional decomposition coupled with the Waterfall development process model. This approach is argued to be inadequate for coping with the evolutionary nature of large software systems. Alternative development paradigms, including the operational paradigm and the transformational paradigm, have been proposed to address the inadequacies of this conventional view of software development, and these are reviewed. JSD is presented as an example of an operational approach to software engineering, and is contrasted with other well documented examples. The thesis shows how aspects of JSD can be characterised with reference to formal language theory and automata theory. In particular, it is noted that Jackson structure diagrams are equivalent to regular expressions and can be thought of as specifying corresponding finite automata. The thesis discusses the automatic transformation of structure diagrams into finite automata using an algorithm adapted from compiler theory, and then extends the technique to deal with areas of JSD which are not strictly formalisable in terms of regular languages. In particular, an elegant and novel method for dealing with so-called recognition (or parsing) difficulties is described. Various applications of the extended technique are described. They include a new method of automatically implementing the dismemberment transformation; an efficient way of implementing inversion in languages lacking a goto statement; and a new in-the-large implementation strategy.
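The correspondence noted above can be illustrated directly: sequence, selection and iteration in a Jackson structure diagram map onto concatenation, alternation and the Kleene star. A toy sketch using Python's re engine as the finite automaton (the diagram and its symbols are hypothetical, and this is not the thesis's compiler-theory algorithm):

```python
import re

# Structure-diagram constructs as regular-expression operators
diagram = {
    "sequence":  lambda *parts: "".join(parts),
    "selection": lambda *parts: "(" + "|".join(parts) + ")",
    "iteration": lambda part: "(" + part + ")*",
}

# Hypothetical diagram: file = header ; body ; trailer,
# body = record*, record = (A | B)
record = diagram["selection"]("A", "B")
body = diagram["iteration"](record)
file_re = diagram["sequence"]("H", body, "T")

print(file_re)                               # H((A|B))*T
print(bool(re.fullmatch(file_re, "HABAT")))  # True
print(bool(re.fullmatch(file_re, "HAC")))    # False
```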

Relevance: 30.00%

Publisher:

Abstract:

The present dissertation is concerned with the determination of the magnetic field distribution in magnetic electron lenses by means of the finite element method. In the differential form of this method a Poisson-type equation is solved by numerical methods over a finite boundary. Previous methods of adapting this procedure to the requirements of digital computers have restricted its use to computers of extremely large core size. It is shown that by reformulating the boundary conditions, a considerable reduction in core store can be achieved for a given accuracy of field distribution. The magnetic field distribution of a lens may also be calculated by the integral form of the finite element method. This eliminates the boundary problems mentioned but introduces other difficulties. After a careful analysis of both methods it has proved possible to combine the advantages of both in a new approach to the problem, which may be called the 'differential-integral' finite element method. The application of this method to the determination of the magnetic field distribution of some new types of magnetic lenses is described. In the course of the work considerable re-programming of standard programs was necessary in order to reduce the core store requirements to a minimum.
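The differential form described above amounts to solving a Poisson-type equation over a bounded domain. A minimal one-dimensional finite element sketch (not the dissertation's lens solver; the problem data are illustrative):

```python
import numpy as np

# -u'' = f on (0, 1) with u(0) = u(1) = 0, piecewise-linear elements
n = 50                        # interior nodes
h = 1.0 / (n + 1)
f = 1.0                       # constant source term

# Tridiagonal stiffness matrix for linear elements
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h
b = f * h * np.ones(n)        # load vector; exact for constant f
u = np.linalg.solve(A, b)

# Exact solution is u(x) = x(1 - x)/2; linear elements with an exactly
# integrated load reproduce it at the nodes to machine precision.
x = np.linspace(h, 1.0 - h, n)
print(np.max(np.abs(u - x * (1.0 - x) / 2.0)))
```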

Relevance: 30.00%

Publisher:

Abstract:

Particle impacts are of fundamental importance in many areas and there has been a renewed interest in research on particle impact problems. A comprehensive investigation of particle impact problems, using finite element (FE) methods, is presented in this thesis. The capability of FE procedures for modelling particle impacts is demonstrated by excellent agreement between FE analysis results and previous theoretical, experimental and numerical results. For normal impacts of elastic particles, it is found that the energy loss due to stress wave propagation is negligible if the wave can reflect more than three times during the impact, in which case Hertz theory provides a good prediction of impact behaviour provided that the contact deformation is sufficiently small. For normal impacts of plastic particles, the energy loss due to stress wave propagation is also generally negligible, so that the energy loss is mainly due to plastic deformation. Finite-deformation plastic impact is addressed in this thesis, so that plastic impacts can be categorised into elastic-plastic impact and finite-deformation plastic impact. Criteria for the onset of finite-deformation plastic impact are proposed in terms of impact velocity and material properties. It is found that the coefficient of restitution depends mainly upon the ratio of impact velocity to yield velocity, Vni/Vy0, for elastic-plastic impacts, but is proportional to [(Vni/Vy0)·(Y/E*)]^(−1/2), where Y/E* is the representative yield strain, for finite-deformation plastic impacts. A theoretical model for elastic-plastic impacts is also developed and compares favourably with FEA and previous experimental results. The effect of work hardening is also investigated.

Relevance: 30.00%

Publisher:

Abstract:

Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional that is minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods with a useful role to play.
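Among the solvers listed above, iterated running medians are perhaps the simplest to sketch: medians preserve jumps while suppressing noise, so repeated application drives the signal towards a piecewise constant "root" signal. A minimal sketch (the window width, iteration cap and toy signal are my own choices):

```python
import numpy as np

def iterated_running_median(x, width=7, iters=20):
    """Repeatedly apply a running-median filter of odd width until the
    signal stops changing (a root signal) or iters is reached."""
    y = np.asarray(x, float).copy()
    h = width // 2
    for _ in range(iters):
        padded = np.pad(y, h, mode="edge")
        y_new = np.array([np.median(padded[i:i + width])
                          for i in range(len(y))])
        if np.allclose(y_new, y):
            break
        y = y_new
    return y

# Noisy two-level step signal
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])
noisy = clean + 0.1 * rng.standard_normal(100)
denoised = iterated_running_median(noisy)
```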

Relevance: 30.00%

Publisher:

Abstract:

Background: There is substantial evidence that cognitive deficits and brain structural abnormalities are present in patients with Bipolar Disorder (BD) and in their first-degree relatives. Previous studies have demonstrated associations between cognition and functional outcome in BD patients but have not examined the role of brain morphological changes. Similarly, the functional impact of either cognition or brain morphology in relatives remains unknown. Therefore we focused on delineating the relationship between psychosocial functioning, cognition and brain structure, in relation to disease expression and genetic risk for BD. Methods: Clinical, cognitive and brain structural measures were obtained from 41 euthymic BD patients and 50 of their unaffected first-degree relatives. Psychosocial function was evaluated using the General Assessment of Functioning (GAF) scale. We examined the relationship between level of functioning and general intellectual ability (IQ), memory, attention, executive functioning, symptomatology, illness course and total gray matter, white matter and cerebrospinal fluid volumes. Limitations: Cross-sectional design. Results: Multiple regression analyses revealed that IQ, total white matter volume and a predominantly depressive illness course were independently associated with functional outcome in BD patients, but not in their relatives, and accounted for a substantial proportion (53%) of the variance in patients' GAF scores. There were no significant domain-specific associations between cognition and outcome after consideration of IQ. Conclusions: Our results emphasise the role of IQ and white matter integrity in relation to outcome in BD and carry significant implications for treatment interventions. © 2010 Elsevier B.V.

Relevance: 30.00%

Publisher:

Abstract:

The human accommodation system has been extensively examined for over a century, with a particular focus on trying to understand the mechanisms that lead to the loss of accommodative ability with age (Presbyopia). The accommodative process, along with the potential causes of presbyopia, are disputed; hindering efforts to develop methods of restoring accommodation in the presbyopic eye. One method that can be used to provide insight into this complex area is Finite Element Analysis (FEA). The effectiveness of FEA in modelling the accommodative process has been illustrated by a number of accommodative FEA models developed to date. However, there have been limitations to these previous models; principally due to the variation in data on the geometry of the accommodative components, combined with sparse measurements of their material properties. Despite advances in available data, continued oversimplification has occurred in the modelling of the crystalline lens structure and the zonular fibres that surround the lens. A new accommodation model was proposed by the author that aims to eliminate these limitations. A novel representation of the zonular structure was developed, combined with updated lens and capsule modelling methods. The model has been designed to be adaptable so that a range of different age accommodation systems can be modelled, allowing the age related changes that occur to be simulated. The new modelling methods were validated by comparing the changes induced within the model to available in vivo data, leading to the definition of three different age models. These were used in an extended sensitivity study on age related changes, where individual parameters were altered to investigate their effect on the accommodative process. The material properties were found to have the largest impact on the decline in accommodative ability, in particular compared to changes in ciliary body movement or zonular structure. 
Novel data on the importance of capsule stiffness and thickness were also established. The new model detailed within this thesis provides further insight into the accommodation mechanism, as well as a foundation for future, more detailed investigations into accommodation, presbyopia and accommodative restoration techniques.

Relevance: 30.00%

Publisher:

Abstract:

Respiratory-volume monitoring is an indispensable part of mechanical ventilation. Here we present a new method of respiratory-volume measurement based on a single fibre-optic long-period bending sensor and the correlation between torso curvature and lung volume. Unlike the commonly used air-flow-based measurement methods, the proposed sensor is drift-free and immune to air leaks. In the paper, we explain the working principle of the sensor and a two-step calibration-test measurement procedure, and present results that establish a linear correlation between the change in the local thorax curvature and the change in lung volume. We also discuss the advantages and limitations of these sensors with respect to the current standards. © 2013 IEEE.
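The two-step procedure and the linear curvature–volume correlation described above can be sketched as an ordinary least-squares calibration (the numbers below are illustrative, not measured data from the paper):

```python
import numpy as np

# Step 1, calibration: fit a line relating the sensor's curvature change
# to a reference lung-volume change (e.g. a spirometer reading).
curvature = np.array([0.00, 0.02, 0.04, 0.06, 0.08])  # local curvature change
volume = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # volume change, litres

slope, intercept = np.polyfit(curvature, volume, 1)

# Step 2, test: predict volume from a new curvature reading.
predict = lambda c: slope * c + intercept
print(predict(0.05))   # ~ 1.25 litres for this illustrative calibration
```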

Relevance: 30.00%

Publisher:

Abstract:

The computational mechanics approach has been applied to the orientational behavior of water molecules in a molecular-dynamics-simulated water–Na+ system. The distinctively different statistical complexity of water molecules in the bulk and in the first solvation shell of the ion is demonstrated. It is shown that the molecules undergo more complex orientational motion when surrounded by other water molecules than when constrained by the electric field of the ion. However, the spatial coordinates of the oxygen atom show the opposite complexity behavior, in that complexity is higher for the solvation shell molecules. New information about the dynamics of water molecules in the solvation shell is provided, additional to that given by traditional methods of analysis.

Relevance: 30.00%

Publisher:

Abstract:

This paper is partially supported by project ISM-4 of Department for Scientific Research, “Paisii Hilendarski” University of Plovdiv.

Relevance: 30.00%

Publisher:

Abstract:

Reproducing Kernel Hilbert Space (RKHS) and reproducing transformation methods for series summation are developed that allow alternative, finite-form representations of series to be obtained analytically.

Relevance: 30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: Primary 05B05; secondary 62K10.