917 results for Numerical Algorithms and Problems


Relevância:

100.00%

Publicador:

Resumo:

The dissertation addresses the problems of signal reconstruction and data restoration in seismic data processing, taking signal representation methods as the main thread and seismic information reconstruction (signal separation and trace interpolation) as the core. For representation on natural bases, I present the fundamentals and algorithms of ICA and its original applications to the separation of natural earthquake signals and of survey seismic signals. For representation on deterministic bases, the dissertation proposes least-squares inversion regularization methods for seismic data reconstruction, sparseness constraints, preconditioned conjugate gradient (PCG) methods, and their applications to seismic deconvolution, Radon transformation, etc. The core contribution is a de-aliased reconstruction algorithm for unevenly sampled seismic data and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with signal or information restoration: the former reconstructs original signals from mixed signals, the latter reconstructs complete data from sparse or irregular data. Both aim to provide pre-processing and post-processing methods for seismic pre-stack depth migration. ICA can separate original signals from their mixtures, or extract the basic structure of the analysed data. I survey the fundamentals, algorithms and applications of ICA and, by comparison with the KL transformation, propose the concept of an independent component transformation (ICT). On the basis of the negentropy measure of independence, I implement FastICA and improve it using the covariance matrix. After analysing the characteristics of seismic signals, I introduce ICA into seismic signal processing, to my knowledge for the first time in the geophysical community, and implement the separation of noise from seismic signal.
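A minimal sketch of the kind of ICA-based separation described above, using scikit-learn's FastICA on hypothetical synthetic mixtures (the dissertation's covariance-improved FastICA and real seismic data are not reproduced here):

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 2000)

# Two synthetic "source" signals standing in for signal and noise.
s1 = np.sin(2 * np.pi * 7 * t)              # smooth oscillation
s2 = np.sign(np.sin(2 * np.pi * 3 * t))     # square-wave "noise"
S = np.c_[s1, s2]

# Mix the sources with an arbitrary (hypothetical) mixing matrix.
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = S @ A.T

# Recover statistically independent components from the mixtures alone.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)

# Up to permutation, sign and scale, each recovered component should
# correlate strongly with one of the true sources.
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
print(corr.max(axis=1))  # each value close to 1
```

The cross-correlation check is the usual way to score blind separation, since ICA cannot recover the ordering or amplitude of the sources.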
Synthetic and real data examples show that ICA is usable for seismic signal processing, and promising initial results are achieved. ICA is applied to separating earthquake converted waves from multiples in a sedimentary area; it performs well, leading to a more reasonable interpretation of subsurface discontinuities. The results show the promise of applying ICA to geophysical signal processing. Exploiting the relationship between ICA and blind deconvolution, I survey seismic blind deconvolution and discuss two possible ways of applying ICA to it. The relationship between PCA, ICA and the wavelet transform is established, and it is proved that the reconstruction of wavelet prototype functions is a Lie group representation. In addition, an over-sampled wavelet transform is proposed to enhance seismic data resolution, and is validated by numerical examples. The key to pre-stack depth migration is the regularization of pre-stack seismic data, for which seismic interpolation and missing-data reconstruction are necessary procedures. I first review seismic imaging methods in order to argue for the critical effect of regularization, and then review seismic interpolation algorithms, concluding that de-aliased reconstruction of unevenly sampled data remains a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness-constrained least-squares inversion and a preconditioned conjugate gradient (PCG) solver are studied and implemented. Choosing a constraint term with a Cauchy distribution, I program the PCG algorithm and implement sparse seismic deconvolution and high-resolution Radon transformation with it, in preparation for seismic data reconstruction. As for seismic interpolation, existing methods handle de-aliased interpolation of evenly sampled data and reconstruction of unevenly sampled data well separately, but they cannot be combined.
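The sparse-deconvolution-by-conjugate-gradients idea can be sketched as follows; the Ricker-like wavelet, the plain damping term (standing in here for the Cauchy-distribution constraint) and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sparse reflectivity and a Ricker-like wavelet.
n = 200
m_true = np.zeros(n)
m_true[[30, 75, 140]] = [1.0, -0.7, 0.5]
tw = np.arange(-25, 26) * 0.004
f0 = 25.0
w = (1 - 2 * (np.pi * f0 * tw) ** 2) * np.exp(-(np.pi * f0 * tw) ** 2)

def G(m):    # forward operator: convolution with the wavelet
    return np.convolve(m, w, mode="same")

def Gt(d):   # its adjoint: correlation with the wavelet
    return np.convolve(d, w[::-1], mode="same")

d = G(m_true) + 0.01 * rng.standard_normal(n)

# Conjugate gradients on the damped normal equations
#   (G^T G + lam I) m = G^T d
lam = 0.1
b = Gt(d)
m = np.zeros(n)
r = b - (Gt(G(m)) + lam * m)
p = r.copy()
rs = r @ r
for _ in range(100):
    Ap = Gt(G(p)) + lam * p
    alpha = rs / (p @ Ap)
    m += alpha * p
    r -= alpha * Ap
    rs_new = r @ r
    if np.sqrt(rs_new) < 1e-8:
        break
    p = r + (rs_new / rs) * p
    rs = rs_new

print(np.argmax(np.abs(m)))  # largest recovered spike, near index 30
```

The full method reweights the damping term from a Cauchy prior between solves to sharpen the spikes further; the plain quadratic damping above is the simplest stable starting point.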
This dissertation proposes a novel Fourier-transform-based method and algorithm that can reconstruct seismic data that are both unevenly sampled and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion problem in which an adaptive DFT-weighted norm regularization term is used. The inverse problem is solved by a preconditioned conjugate gradient method, which makes the solution stable and rapidly convergent. Under the assumption that seismic data consist of a finite number of linear events, it follows from the sampling theorem that aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three applications are discussed: interpolation across even gaps in traces, filling of uneven gaps, and reconstruction of high-frequency traces from low-frequency data constrained by a few high-frequency traces. Both synthetic and real data examples show that the proposed method is valid, efficient and applicable. The research is valuable for seismic data regularization and cross-well seismics. To meet the data requirements of 3D shot-profile depth migration, schemes must be adopted to make the data evenly sampled and consistent with the velocity dataset. The methods of this dissertation are used to interpolate and extrapolate the shot gathers instead of simply embedding zero traces, so the migration aperture is enlarged and the migration result is improved. The results show the method's effectiveness and practicability.
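The core idea of posing band-limited reconstruction of irregularly sampled data as a least-squares inversion over Fourier coefficients can be sketched as follows (a plain minimum-norm solve; the adaptive DFT-weighted regularization and anti-alias weighting of the actual algorithm are omitted, and the signal is a hypothetical stand-in for a seismic trace):

```python
import numpy as np

rng = np.random.default_rng(2)

# A band-limited "trace" sampled at irregular positions.
def signal(x):
    return np.sin(2 * np.pi * 3 * x) + 0.5 * np.cos(2 * np.pi * 5 * x)

x_irr = np.sort(rng.uniform(0, 1, 40))       # irregular sample positions
d = signal(x_irr)

# Least-squares fit of low-frequency Fourier coefficients.
kmax = 8                                      # assumed bandwidth (|k| <= kmax)
k = np.arange(-kmax, kmax + 1)
F = np.exp(2j * np.pi * np.outer(x_irr, k))   # irregular-grid Fourier matrix
coef, *_ = np.linalg.lstsq(F, d.astype(complex), rcond=None)

# Evaluate the model on a regular grid: the reconstructed, regular samples.
x_reg = np.linspace(0, 1, 200, endpoint=False)
recon = (np.exp(2j * np.pi * np.outer(x_reg, k)) @ coef).real

err = np.max(np.abs(recon - signal(x_reg)))
print(err)  # small when the bandwidth assumption holds
```

In the full method this dense solve is replaced by a preconditioned conjugate gradient iteration with the DFT-weighted norm, which scales to realistic data sizes.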

Relevância:

100.00%

Publicador:

Resumo:

Binding, D., Couch, M.A., Sujatha, K.S. and Webster, M.F. (2003) 'Experimental and numerical simulation of dough kneading in filled geometries', Journal of Food Engineering, 58, pp. 111-123.

Relevância:

100.00%

Publicador:

Resumo:

Adult humans, infants, pre-school children, and non-human animals appear to share a system of approximate numerical processing for non-symbolic stimuli such as arrays of dots or sequences of tones. Behavioral studies of adult humans implicate a link between these non-symbolic numerical abilities and symbolic numerical processing (e.g., similar distance effects in accuracy and reaction time for arrays of dots and Arabic numerals). However, neuroimaging studies have remained inconclusive on the neural basis of this link. The intraparietal sulcus (IPS) is known to respond selectively to symbolic numerical stimuli such as Arabic numerals. Recent studies, however, have arrived at conflicting conclusions regarding the role of the IPS in processing non-symbolic numerosity arrays in adulthood, and very little is known about the brain basis of numerical processing early in development. Addressing the question of whether there is an early-developing neural basis for abstract numerical processing is essential for understanding the cognitive origins of our uniquely human capacity for math and science. Using functional magnetic resonance imaging (fMRI) at 4 Tesla and an event-related fMRI adaptation paradigm, we found that adults showed a greater IPS response to visual arrays that deviated from standard stimuli in their number of elements than to stimuli that deviated in local element shape. These results support previous claims that there is a neurophysiological link between non-symbolic and symbolic numerical processing in adulthood. In parallel, we tested 4-y-old children with the same fMRI adaptation paradigm as adults to determine whether the neural locus of non-symbolic numerical activity in adults shows continuity in function over development. We found that the IPS responded to numerical deviants similarly in 4-y-old children and adults.
To our knowledge, this is the first evidence that the neural locus of adult numerical cognition takes form early in development, prior to sophisticated symbolic numerical experience. More broadly, this is also, to our knowledge, the first cognitive fMRI study to test healthy children as young as 4 y, providing new insights into the neurophysiology of human cognitive development.

Relevância:

100.00%

Publicador:

Resumo:

Our media is saturated with claims of "facts" made from data. Database research has in the past focused on how to answer queries, but has not devoted much attention to discerning more subtle qualities of the resulting claims, e.g., is a claim "cherry-picking"? This paper proposes a Query Response Surface (QRS) based framework that models claims based on structured data as parameterized queries. A key insight is that we can learn a lot about a claim by perturbing its parameters and seeing how its conclusion changes. This framework lets us formulate and tackle practical fact-checking tasks, such as reverse-engineering vague claims and countering questionable claims, as computational problems. Within the QRS-based framework, we take one step further and propose a problem, along with efficient algorithms, for finding high-quality claims of a given form from data in the first place, i.e., raising good questions. This is achieved by using a limited number of high-valued claims to represent high-valued regions of the QRS. Besides the general-purpose high-quality claim finding problem, lead-finding can be tailored towards specific claim quality measures, also defined within the QRS framework. An example of uniqueness-based lead-finding is presented for "one-of-the-few" claims, yielding interpretable high-quality claims and an adjustable mechanism for ranking objects, e.g., NBA players, based on what claims can be made for them. Finally, we study the use of visualization as a powerful way of conveying the results of a large number of claims. An efficient two-stage sampling algorithm is proposed for generating the input of a 2D scatter plot with heatmap, evaluating a limited amount of data while preserving the two essential visual features, namely outliers and clusters. For all the problems, we present real-world examples and experiments that demonstrate the power of our model, the efficiency of our algorithms, and the usefulness of their results.
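The key insight of perturbing a claim's parameters can be illustrated with a toy "windowed average" claim; the data, window sizes and threshold are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
series = rng.normal(0.0, 1.0, 120)   # hypothetical monthly data
series[60:66] += 2.0                  # a short favourable stretch

def claim(a, b):
    """Parameterized query: mean of the series over window [a, b)."""
    return series[a:b].mean()

# Original claim: "the average over months [60, 66) exceeds 1.0".
a0, b0 = 60, 66

# Perturb the window endpoints and see how often the conclusion survives:
# a robust claim holds across nearby parameter settings, a cherry-picked
# one does not.
support = []
for da in range(-6, 7):
    for db in range(-6, 7):
        a, b = a0 + da, b0 + db
        if 0 <= a < b <= len(series) and b - a >= 3:
            support.append(claim(a, b) > 1.0)
robustness = np.mean(support)
print(robustness)  # a low value suggests a cherry-picked window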

Relevância:

100.00%

Publicador:

Resumo:

This paper demonstrates a modelling and design approach that couples computational mechanics techniques with numerical optimisation and statistical models for virtual prototyping and testing in different application areas concerning the reliability of electronic packages. The integrated software modules provide a design engineer in the electronics manufacturing sector with fast design and process solutions by optimising key parameters while taking into account the complexity of certain operational conditions. The integrated modelling framework is obtained by coupling the multi-physics finite element framework PHYSICA with the numerical optimisation tool VisualDOC into a fully automated design tool for solving electronic packaging problems. Response Surface Modelling methodology and Design of Experiments statistical tools, plus numerical optimisation techniques, are demonstrated as part of the modelling framework. Two different problems are discussed and solved using the integrated FEM-optimisation tool. The first example concerns the thermal management of an electronic package on a board: the location of the device is optimised to reduce the junction temperature and the stress in the die, subject to a given cooling air profile and other heat-dissipating active components. In the second example, thermo-mechanical simulations of solder creep deformations are presented to predict flip-chip reliability, and are subsequently used to optimise the lifetime of solder interconnects under thermal cycling.
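The response-surface idea, fitting a cheap surrogate to a handful of Design-of-Experiments simulation runs and optimising on the surrogate instead of the expensive model, can be sketched as follows (the "simulation" is a hypothetical stand-in for a finite element thermal analysis):

```python
import numpy as np

# Hypothetical stand-in for an expensive FE thermal simulation:
# junction temperature as a function of device location x (mm).
def simulate_temperature(x):
    return 80.0 + 0.08 * (x - 12.0) ** 2   # shape unknown to the optimiser

# Design of Experiments: a handful of sampled locations.
x_doe = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
t_doe = simulate_temperature(x_doe)

# Quadratic response surface fitted to the DoE results.
c = np.polyfit(x_doe, t_doe, 2)

# Optimise on the cheap surrogate instead of re-running the simulation.
x_opt = -c[1] / (2 * c[0])                 # vertex of the fitted parabola
print(x_opt)  # close to 12.0, the true minimiser
```

Real frameworks fit multi-dimensional surfaces and hand them to a general optimiser, but the pattern is the same: a few expensive runs, one cheap analytic model, then optimisation on the model.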

Relevância:

100.00%

Publicador:

Resumo:

The growth of computer power allows the solution of complex problems related to compressible flow, which is an important class of problems in modern-day CFD. Over the last 15 years or so, many review works on CFD have been published. This book concerns both mathematical and numerical methods for compressible flow. In particular, it provides a clear-cut introduction as well as an in-depth treatment of modern numerical methods in CFD. The book is organised in two parts. The first part consists of Chapters 1 and 2 and is mainly devoted to theoretical discussions and results. Chapter 1 concerns fundamental physical concepts and theoretical results in gas dynamics. Chapter 2 describes the basic mathematical theory of compressible flow using the inviscid Euler equations and the viscous Navier–Stokes equations; existence and uniqueness results are also included. The second part presents modern numerical methods for the Euler and Navier–Stokes equations. Chapter 3 is devoted entirely to the finite volume method for the numerical solution of the Euler equations and covers fundamental concepts such as the order of numerical schemes, stability and high-order schemes. The finite volume method is illustrated for the 1-D as well as the multidimensional Euler equations. Chapter 4 covers the theory of the finite element method and its application to compressible flow. A section is devoted to the combined finite volume–finite element method, and its background theory is also included. Throughout the book numerous examples are included to demonstrate the numerical methods. The book provides good insight into the numerical schemes, theoretical analysis, and validation of test problems. It is a very useful reference for applied mathematicians, numerical analysts, and practising engineers, and an important reference for postgraduate researchers in the field of scientific computing and CFD.
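As a minimal illustration of the finite volume concepts covered in Chapter 3 (this sketch uses the scalar linear advection equation with a Lax-Friedrichs flux, not the Euler equations themselves; all parameters are illustrative):

```python
import numpy as np

# First-order finite volume scheme for u_t + a u_x = 0 on a periodic domain.
a = 1.0
nx, L, T = 200, 1.0, 0.5
dx = L / nx
dt = 0.4 * dx / abs(a)                 # respects the CFL stability condition
x = (np.arange(nx) + 0.5) * dx
u = np.exp(-200 * (x - 0.3) ** 2)      # initial cell averages
mass0 = u.sum() * dx

t = 0.0
while t < T:
    up = np.roll(u, -1)                # right neighbour (periodic)
    # Lax-Friedrichs numerical flux at the right face of each cell.
    f_right = 0.5 * a * (u + up) - 0.5 * (dx / dt) * (up - u)
    f_left = np.roll(f_right, 1)
    # Conservative update: cell average changes by the flux difference.
    u = u - (dt / dx) * (f_right - f_left)
    t += dt

# Because the scheme is conservative, total "mass" is preserved exactly.
print(abs(u.sum() * dx - mass0))  # at the level of machine precision
```

The same update structure (cell averages advanced by face-flux differences) carries over to the Euler equations once the scalar flux is replaced by the vector of mass, momentum and energy fluxes.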

Relevância:

100.00%

Publicador:

Resumo:

Financial modelling in the area of option pricing involves understanding the correlations between assets and buy/sell movements in order to reduce investment risk. Such activities depend on financial analysis tools being available to the trader, with which rapid and systematic evaluation of buy/sell contracts can be made. In turn, analysis tools rely on fast numerical algorithms for the solution of financial mathematical models. There are many other financial activities besides the buying and selling of shares. The main aim of this chapter is to discuss a distributed algorithm for the numerical solution of a European option; both linear and non-linear cases are considered. The algorithm is based on the concept of the Laplace transform and its numerical inverse. The scalability of the algorithm is examined, and numerical tests are used to demonstrate its effectiveness for financial analysis. Time-dependent functions for volatility and interest rates are also discussed, and applications of the algorithm to the non-linear Black-Scholes equation, where the volatility and the interest rate are functions of the option value, are included. Some qualitative results on the convergence behaviour of the algorithm are examined. The chapter also examines the various computational issues of the Laplace transformation method in terms of distributed computing, and introduces the idea of using a two-level temporal mesh in order to achieve distributed computation along the temporal axis. Finally, the chapter ends with some conclusions.
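The idea of inverting a Laplace-space solution numerically can be sketched with the Gaver-Stehfest algorithm, verified here on a transform pair with a known inverse rather than on the Black-Scholes problem itself (the chapter's actual inversion scheme may differ):

```python
import math

def stehfest_invert(F, t, N=12):
    """Gaver-Stehfest numerical inverse Laplace transform (N even)."""
    ln2 = math.log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        # Stehfest weight V_k (a fixed combinatorial coefficient).
        V = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            V += (j ** (N // 2) * math.factorial(2 * j)
                  / (math.factorial(N // 2 - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V *= (-1) ** (k + N // 2)
        # The inverse is a weighted sum of F evaluated on the real axis.
        total += V * F(k * ln2 / t)
    return ln2 / t * total

# Sanity check on a pair with a known inverse: F(s) = 1/(s+1) <-> exp(-t).
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), 1.0)
print(approx, math.exp(-1.0))
```

Because each evaluation of F at a different s-value is independent, this kind of inversion parallelises naturally, which is what makes Laplace-transform methods attractive for distributed option pricing.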

Relevância:

100.00%

Publicador:

Resumo:

Mid-ocean ridges are common features of the world’s oceans, but there is a lack of understanding as to how their presence affects the overlying pelagic biota. The Mid-Atlantic Ridge (MAR) is a dominant feature of the Atlantic Ocean. Here, we examined data on euphausiid distribution and abundance arising from several international research programmes and from the continuous plankton recorder. We used a generalized additive model (GAM) framework to explore spatial patterns of variability in euphausiid distribution on, and at either side of, the MAR from 60°N to 55°S, in conjunction with variability in a suite of biological, physical and environmental parameters. Euphausiid species abundance peaked at mid-latitudes and was significantly higher on the ridge than in adjacent waters, but the ridge did not significantly influence numerical abundance. Sea surface temperature (SST) was the single most important factor influencing both euphausiid numerical abundance and species abundance. Increases in sea surface height variance, a proxy for mixing, increased the numerical abundance of euphausiids. GAM predictions of variability in species abundance as a function of SST and mixed-layer depth were consistent with present theories, which suggest that pelagic niche availability is related to the thermal structure of the near-surface water: more deeply mixed water contained higher euphausiid biodiversity. In addition to exposing present distributional patterns, the GAM framework enables responses to potential future and past environmental variability, including temperature change, to be explored.

Relevância:

100.00%

Publicador:

Resumo:

Political commentators often cast religious conflict as the result of the numerical growth and political rise of a single faith. When Islam is involved, arguments about religious fundamentalism are quick to surface and often stand as an explanation in their own right. Yet, as useful as this type of explanation may be, it usually fails to address properly, if at all, two sets of important issues. It avoids, first, the question of the rise of other religions and their contribution to tensions and conflicts. Second, it reduces the role of the State to a reactive one: the State becomes an object of contest or conquest, or it is simply ignored. Adopting a different approach, this article investigates a controversy that took place in Mozambique in 1996 around the ‘officialisation’ of two Islamic holidays. It looks at the role played by religious competition and state mediation. The article shows that the State’s abandonment of religious regulation (the establishment of a free ‘religious market’) fostered religious competition that created tensions between faiths. It suggests that strife ensued because deregulation was almost absolute: the State did not take a clear stand in religious matters, and faith organisations started to believe that the State was becoming, or could become, confessional. The conclusion discusses theoretical implications for the understanding of religious strife as well as Church and State relations. It also draws some implications for the case of Mozambique more specifically, implications which should have relevance for countries such as Malawi, Zambia and Zimbabwe where problems of a similar nature have arisen.

Relevância:

100.00%

Publicador:

Resumo:

In this paper the use of eigenvalue stability analysis of very large dimension aeroelastic numerical models arising from the exploitation of computational fluid dynamics is reviewed. A formulation based on a block reduction of the system Jacobian proves powerful, allowing various numerical algorithms to be exploited, including frequency-domain solvers, reconstruction of a term describing the fluid–structure interaction from the sparse data which incur the main computational cost, and sampling to place the expensive samples where they are most needed. The stability formulation also allows non-deterministic analysis to be carried out very efficiently through the use of an approximate Newton solver. Finally, the system eigenvectors are exploited to produce nonlinear and parameterised reduced-order models for computing limit-cycle responses. The performance of the methods is illustrated with results from a number of academic and large-dimension aircraft test cases.
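The eigenvalue stability scan at the heart of such analyses can be illustrated on a deliberately tiny, hypothetical parameterised Jacobian (real aeroelastic Jacobians are vastly larger and come from CFD, which is why the block reductions and sampling strategies above matter):

```python
import numpy as np

# Toy parameterised Jacobian A(V): its eigenvalues are -1 +- 0.5*sqrt(V),
# so the largest real part crosses zero, i.e. stability is lost, at V = 4.
def jacobian(V):
    return np.array([[-1.0, V],
                     [0.25, -1.0]])

speeds = np.linspace(0.1, 10.0, 100)
growth = [np.linalg.eigvals(jacobian(V)).real.max() for V in speeds]

# Locate the stability boundary: the first speed with a positive growth rate.
idx = next(i for i, g in enumerate(growth) if g > 0)
V_crit = speeds[idx]
print(V_crit)  # just above 4.0
```

In the aeroelastic setting the same scan traces the flutter boundary, and the expense of each eigenvalue solve is what motivates sampling the parameter space adaptively rather than on a dense grid.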

Relevância:

100.00%

Publicador:

Resumo:

In collaboration with Airbus-UK, the dimensional growth of aircraft panels while being riveted with stiffeners is investigated. Small panels are used in this investigation. The stiffeners are fastened to the panels with rivets, and it has been observed that during this operation the panels expand in the longitudinal and transverse directions. The growth is variable, and the challenge is to control the riveting process to minimize this variability. In this investigation, the assembly of the small panels and longitudinal stiffeners has been simulated using static stress and nonlinear explicit finite element models. The models have been validated against a limited set of experimental measurements; it was found that more accurate predictions of the riveting process are achieved using explicit finite element models. However, the static stress finite element model is more time-efficient and more practical for simulating hundreds of rivets and the stochastic nature of the process. Furthermore, through a series of numerical simulations and probabilistic analyses, the manufacturing process control parameters that influence panel growth have been identified. Alternative fastening approaches were examined, and it was found that dimensional growth can be controlled by changing the design of the dies used for forming the rivets.

Relevância:

100.00%

Publicador:

Resumo:

We describe some unsolved problems of current interest; these involve quantum critical points in ferroelectrics and problems which are not amenable to the usual density functional theory, nor to classical Landau free-energy approaches (they are kinetically limited), nor even to the Landau–Kittel relationship for domain size (they do not satisfy the assumption of infinite lateral diameter), because they are dominated by finite aperiodic boundary conditions.
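For reference, the Landau–Kittel relationship mentioned above is the square-root scaling of equilibrium domain width with thickness; schematically (the prefactor, which balances domain-wall energy against depolarisation energy, is left as an unspecified constant here):

```latex
% Landau--Kittel scaling: equilibrium domain width w grows as the square
% root of the film (or crystal) thickness d.
w \;=\; \kappa\,\sqrt{d}\,, \qquad \text{i.e.}\quad \frac{w^{2}}{d} \;=\; \text{const},
```

The problems described above violate the infinite-lateral-diameter assumption under which this scaling is derived.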

Relevância:

100.00%

Publicador:

Resumo:

The main object of this thesis is the study of algorithms for the automatic processing and representation of data, in particular of information obtained by sensors mounted on board vehicles (2D and 3D), with application in the context of driver-assistance systems. The work addresses some of the problems faced today both by autonomous driving (AD) systems and by advanced driver-assistance systems (ADAS). The document is composed of two parts. The first describes the design, construction and development of three robotic prototypes, including details of the sensors mounted on board the robots, the algorithms and the software architectures. These robots were used as test platforms to evaluate and validate the proposed techniques. In addition, they took part in several autonomous driving competitions, achieving very good results. The second part of the document presents several algorithms employed in the generation of intermediate representations of sensor data. These can be used to improve existing pattern recognition, detection or navigation techniques, and thereby contribute to future AD or ADAS applications. Since autonomous vehicles carry a large number of sensors of different natures, intermediate representations are particularly suitable, as they can cope with problems related to the diverse natures of the data (2D, 3D, photometric, etc.), with the asynchronous character of the data (multiple sensors sending data at different rates), or with data alignment (calibration problems, different sensors providing different measurements of the same object).
In this context, new techniques are proposed for computing a multi-camera, multi-modal inverse-perspective-mapping representation, for performing colour correction between images so as to obtain high-quality mosaics, and for generating a scene representation based on polygonal primitives that can handle large amounts of 3D and 2D data and refine the representation as new sensor data are received.
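The inverse-perspective-mapping representation mentioned above can be illustrated with a plain plane-to-plane homography estimated from four point correspondences between the image plane and the ground plane; all coordinates below are hypothetical:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src (DLT, 4+ points)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of this 2N x 9 system.
    _, _, Vt = np.linalg.svd(np.array(rows, float))
    return Vt[-1].reshape(3, 3)

# Four image-plane points on a road patch and the corresponding bird's-eye
# (ground-plane, metres) coordinates -- hypothetical values.
img = [(300.0, 400.0), (340.0, 400.0), (260.0, 480.0), (380.0, 480.0)]
gnd = [(-1.75, 20.0), (1.75, 20.0), (-1.75, 5.0), (1.75, 5.0)]

H = homography_from_points(img, gnd)

def to_ground(x, y):
    """Map an image pixel to ground-plane coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]

# Mapping an image corner should reproduce its ground-plane coordinates.
print(to_ground(300.0, 400.0))  # approximately (-1.75, 20.0)
```

A multi-camera, multi-modal version combines several such mappings (one per sensor) into a single common ground-plane representation.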

Relevância:

100.00%

Publicador:

Resumo:

Final Master's project submitted to obtain the degree of Master in Mechanical Engineering