995 results for 3D problems


Relevância:

30.00%

Publicador:

Resumo:

This paper is concerned with an overview of upwinding schemes and with further nonlinear applications of a recently introduced high-resolution upwind differencing scheme, namely ADBQUICKEST [V.G. Ferreira, F.A. Kurokawa, R.A.B. Queiroz, M.K. Kaibara, C.M. Oishi, J.A. Cuminato, A.F. Castelo, M.F. Tomé, S. McKee, Assessment of a high-order finite difference upwind scheme for the simulation of convection-diffusion problems, International Journal for Numerical Methods in Fluids 60 (2009) 1-26]. The ADBQUICKEST scheme is a new TVD version of the QUICKEST scheme [B.P. Leonard, A stable and accurate convective modeling procedure based on quadratic upstream interpolation, Computer Methods in Applied Mechanics and Engineering 19 (1979) 59-98] for solving nonlinear balance laws. The scheme is based on the normalized variable (NV) and total variation diminishing (TVD) formalisms and satisfies a convective boundedness criterion. Its accuracy is compared with that of other commonly used convective upwinding schemes (see, for example, Roe (1985) [19], Van Leer (1974) [18] and Arora & Roe (1997) [17]) for solving nonlinear conservation laws (for example, the Buckley-Leverett, shallow water and Euler equations). The ADBQUICKEST scheme is then used to solve six fluid flow problems of increasing complexity: 2D aerosol filtration by fibrous filters; axisymmetric flow in a tubular membrane; 2D two-phase flow in a fluidized bed; the 2D compressible Orszag-Tang MHD vortex; an axisymmetric jet impinging onto a flat surface at low Reynolds number; and full 3D incompressible flows involving moving free surfaces. The numerical simulations indicate that this convective upwinding scheme is a good generic alternative for solving complex fluid dynamics problems.
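The flux-limiter construction that makes such a scheme TVD can be illustrated with a short sketch. The following is not the ADBQUICKEST implementation; it is a minimal flux-limited upwind update for 1D linear advection that uses a minmod limiter as a stand-in, with grid and CFL parameters chosen purely for illustration.

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter (stand-in; the ADBQUICKEST limiter of the paper differs)."""
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def tvd_upwind_step(u, c):
    """One step of a flux-limited upwind scheme for u_t + a u_x = 0, a > 0.

    u : periodic cell averages, c : CFL number a*dt/dx with 0 < c <= 1.
    """
    du_left = u - np.roll(u, 1)             # u_i - u_{i-1}
    du_right = np.roll(u, -1) - u           # u_{i+1} - u_i
    slope = minmod(du_left, du_right)       # limited slope keeps the update TVD
    u_face = u + 0.5 * (1.0 - c) * slope    # upwind reconstruction at face i+1/2
    flux = c * u_face                       # numerical flux a*u_{i+1/2}, scaled by dt/dx
    return u - (flux - np.roll(flux, 1))    # conservative update

# advect a square wave for one period on a periodic grid
n, c = 200, 0.5
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.where((x > 0.3) & (x < 0.6), 1.0, 0.0)
for _ in range(int(n / c)):
    u = tvd_upwind_step(u, c)
print("min/max after one period:", u.min(), u.max())  # remains within [0, 1]
```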

Relevância:

30.00%

Publicador:

Resumo:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevância:

30.00%

Publicador:

Resumo:

This work presents the development of a computational algorithm for analyzing the electromagnetic scattering of isolated plasmonic nanostructures. The three-dimensional Method of Moments (MoM-3D) was used to numerically solve the electric field integral equation, and the Lorentz-Drude model was used to represent the complex permittivity of the metallic nanostructures. Based on this mathematical modeling, a computational algorithm written in the C language was developed. As an example of application and validation of the code, two classical electromagnetic scattering problems involving metallic nanoparticles were analyzed: a nanosphere and a nanorod, for which the spectral response and the near-field distribution were calculated. The results were compared with those obtained with other models, and good agreement and convergence between them were observed.
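The Lorentz-Drude permittivity model mentioned above can be sketched in a few lines. The parameters below are illustrative placeholders, not the fitted values used in the work; the sign convention assumes an exp(-iωt) time dependence.

```python
import numpy as np

def lorentz_drude_eps(omega_ev, eps_inf, wp, f0, gamma0, oscillators):
    """Complex permittivity from a Drude term plus Lorentz oscillators.

    omega_ev    : photon energies (eV)
    eps_inf     : high-frequency limit
    wp          : plasma energy (eV); f0, gamma0 : Drude strength and damping (eV)
    oscillators : list of (f_j, omega_j, gamma_j) Lorentz terms (eV)
    Convention: exp(-i*omega*t), so Im(eps) >= 0 for absorbing media.
    """
    w = omega_ev
    eps = eps_inf - f0 * wp**2 / (w * (w + 1j * gamma0))
    for f_j, w_j, g_j in oscillators:
        eps += f_j * wp**2 / (w_j**2 - w**2 - 1j * w * g_j)
    return eps

# illustrative parameters only (not a validated fit for gold)
energies = np.linspace(1.0, 4.0, 7)            # roughly 310-1240 nm
eps = lorentz_drude_eps(energies, eps_inf=1.0, wp=9.0, f0=0.8, gamma0=0.05,
                        oscillators=[(0.05, 2.7, 0.3)])
for e, v in zip(energies, eps):
    print(f"{e:4.2f} eV  eps = {v.real:7.2f} {v.imag:+7.2f}j")
```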

Relevância:

30.00%

Publicador:

Resumo:

In this paper, we present an algorithm for full-wave electromagnetic analysis of nanoplasmonic structures. We use the three-dimensional Method of Moments to solve the electric field integral equation. The computational algorithm is implemented in the C language. As examples of application of the code, the problems of scattering from a nanosphere and from a rectangular nanorod are analyzed. The calculated characteristics are the near-field distribution and the spectral response of these nanoparticles. The convergence of the method for different discretization sizes is also discussed.
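As a hedged illustration of the kind of analytical sanity check often used to validate such full-wave solvers (not the authors' MoM code), the sketch below evaluates the quasi-static dipole scattering efficiency of a small metallic sphere in vacuum, with an assumed Drude permittivity.

```python
import numpy as np

def drude_eps(w, eps_inf=1.0, wp=9.0, gamma=0.1):
    """Illustrative Drude permittivity, energies in eV (not a fitted metal model)."""
    return eps_inf - wp**2 / (w * (w + 1j * gamma))

def sphere_scattering_efficiency(energy_ev, radius_nm):
    """Quasi-static (dipole) scattering efficiency of a small sphere in vacuum.

    Q_sca = (8/3) * x^4 * |(eps - 1) / (eps + 2)|^2, with size parameter x = k*a.
    Valid only for x << 1; useful as a cross-check against full-wave results.
    """
    wavelength_nm = 1239.84 / energy_ev           # hc in eV*nm
    x = 2.0 * np.pi * radius_nm / wavelength_nm   # size parameter
    eps = drude_eps(energy_ev)
    return (8.0 / 3.0) * x**4 * np.abs((eps - 1.0) / (eps + 2.0))**2

for e in np.linspace(1.5, 4.0, 6):
    print(f"{e:4.2f} eV  Q_sca = {sphere_scattering_efficiency(e, radius_nm=20):.3e}")
```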

Relevância:

30.00%

Publicador:

Resumo:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevância:

30.00%

Publicador:

Resumo:

In recent years we have developed several methods for 3D reconstruction. We began with the problem of reconstructing a 3D scene from a stereoscopic pair of images, developing methods based on energy functionals that produce dense disparity maps while preserving discontinuities at image boundaries. We then turned to the problem of reconstructing a 3D scene from multiple views (more than two). The multiple-view reconstruction method relies on the stereoscopic one: for every pair of consecutive images we estimate a disparity map, and we then apply a robust method that searches for good correspondences through the sequence of images. Recently we have proposed several methods for 3D surface regularization. This is a postprocessing step needed to smooth the final surface, which may be affected by noise or mismatched correspondences. These regularization methods are interesting because they use information from the reconstruction process and not only from the 3D surface. We have tackled all these problems with an energy minimization approach: we derive the Euler-Lagrange equation associated with the energy functional and approach the solution of the underlying partial differential equation (PDE) using a gradient descent method.
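As a minimal illustration of this gradient-descent approach (not the discontinuity-preserving functionals used in the cited methods), the sketch below smooths a noisy disparity map by descending the Euler-Lagrange gradient of a simple quadratic energy.

```python
import numpy as np

def laplacian(u):
    """5-point Laplacian with replicated (Neumann) borders."""
    up    = np.vstack([u[:1], u[:-1]])
    down  = np.vstack([u[1:], u[-1:]])
    left  = np.hstack([u[:, :1], u[:, :-1]])
    right = np.hstack([u[:, 1:], u[:, -1:]])
    return up + down + left + right - 4.0 * u

def regularize_disparity(d_noisy, lam=1.0, tau=0.2, iters=200):
    """Gradient descent on E(u) = 0.5*||u - d||^2 + 0.5*lam*||grad u||^2.

    The Euler-Lagrange equation is (u - d) - lam * Laplacian(u) = 0; we follow
    its negative gradient flow. Quadratic smoothing only, chosen for brevity.
    """
    u = d_noisy.copy()
    for _ in range(iters):
        u -= tau * ((u - d_noisy) - lam * laplacian(u))
    return u

rng = np.random.default_rng(0)
ramp = np.tile(np.linspace(0.0, 10.0, 64), (64, 1))     # clean disparity ramp
d = ramp + rng.normal(0.0, 0.5, ramp.shape)             # noisy observation
u = regularize_disparity(d)
print("RMS error before:", round(float(np.sqrt(((d - ramp) ** 2).mean())), 3))
print("RMS error after :", round(float(np.sqrt(((u - ramp) ** 2).mean())), 3))
```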

Relevância:

30.00%

Publicador:

Resumo:

The aim of this work is to present various aspects of the numerical simulation of particle and radiation transport for industrial and environmental protection applications, enabling the analysis of complex physical processes in a fast, reliable, and efficient way. The first part deals with speeding up the numerical simulation of neutron transport for nuclear reactor core analysis. The convergence of the source iteration scheme of the Method of Characteristics applied to heterogeneous structured geometries has been enhanced by means of Boundary Projection Acceleration, enabling the study of 2D and 3D geometries with transport theory without spatial homogenization. The computational performance has been verified on the C5G7 2D and 3D benchmarks, showing a considerable reduction in iterations and CPU time. The second part is devoted to the study of temperature-dependent elastic scattering of neutrons for heavy isotopes near the thermal zone. A numerical computation of the Doppler convolution of the elastic scattering kernel based on the gas model is presented, for a general energy-dependent cross section and scattering law in the center-of-mass system. The range of integration has been optimized by employing a numerical cutoff, allowing a faster numerical evaluation of the convolution integral. Legendre moments of the transfer kernel are then obtained by direct quadrature, and a numerical analysis of the convergence is presented. The third part focuses on remote sensing applications of radiative transfer used to investigate the Earth's cryosphere. The photon transport equation is applied to simulate the reflectivity of glaciers as a function of the age of the snow or ice layer, its thickness, the presence or absence of underlying layers, and the amount of dust included in the snow, creating a framework able to decipher the spectral signals collected by orbiting detectors.
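The computation of Legendre moments by direct quadrature can be sketched briefly. The kernel below is an assumed toy angular distribution, not a Doppler-broadened scattering kernel; only the quadrature step mirrors the procedure described above.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre

def legendre_moments(kernel, order, n_quad=64):
    """Moments f_l = integral over [-1, 1] of P_l(mu) * kernel(mu) d(mu),
    evaluated by Gauss-Legendre quadrature."""
    mu, w = leggauss(n_quad)               # nodes and weights on [-1, 1]
    values = kernel(mu)
    moments = []
    for l in range(order + 1):
        p_l = Legendre.basis(l)(mu)        # P_l evaluated at the quadrature nodes
        moments.append(np.sum(w * values * p_l))
    return np.array(moments)

# toy forward-peaked angular kernel (illustrative only)
kernel = lambda mu: np.exp(3.0 * (mu - 1.0))
f = legendre_moments(kernel, order=5)
print("moments f_0..f_5:", np.round(f, 5))
print("mean cosine f_1/f_0:", round(f[1] / f[0], 4))
```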

Relevância:

30.00%

Publicador:

Resumo:

In many areas of industrial manufacturing, such as the automotive industry, digital mock-ups are used so that the development of complex machines can be supported by computer systems as effectively as possible. Motion planning algorithms play an important role here in ensuring that these digital prototypes can be assembled without collisions. In recent decades, sampling-based methods have proven particularly successful. They generate a large number of random poses for the object to be installed or removed and use a collision detection mechanism to check each pose for validity. Collision detection therefore plays an essential role in the design of efficient motion planning algorithms. A difficulty for this class of planners is posed by so-called "narrow passages", which occur wherever the freedom of movement of the objects to be planned is severely restricted. In such regions it can be hard to find a sufficient number of collision-free samples, and more sophisticated techniques may be required to achieve good performance.

This thesis is divided into two parts. In the first part we investigate parallel collision detection algorithms. Since we target an application in sampling-based motion planners, we consider a setting in which the same two objects are tested for collision in a large number of different poses. We implement and compare several methods that use bounding volume hierarchies (BVHs) and hierarchical grids as acceleration structures. All described methods were parallelized across multiple CPU cores. In addition, we compare different CUDA kernels for performing BVH-based collision tests on the GPU; besides different distributions of the work across the parallel GPU threads, we examine the effect of different memory access patterns on the performance of the resulting algorithms. We further present a set of approximate collision tests based on the described methods: when a lower accuracy of the tests is tolerable, a further performance improvement can be achieved.

In the second part of the thesis we describe a parallel, sampling-based motion planner that we designed for highly complex problems with several narrow passages. The method works in two phases. The basic idea is to conceptually allow small errors in the first planning phase in order to increase planning efficiency, and then to repair the resulting path in a second phase. The planner used in phase I is based on so-called Expansive Space Trees. In addition, we equipped the planner with a push-out operation that allows small collisions to be resolved, thereby increasing efficiency in regions with restricted freedom of movement. Optionally, our implementation allows the use of approximate collision tests; this further lowers the accuracy of the first planning phase but also leads to an additional performance gain. The motion paths resulting from phase I may then not be entirely collision-free. To repair these paths, we designed a novel planning algorithm that plans a new, collision-free motion path locally, restricted to a small neighborhood around the existing path.

We tested the described algorithm on a class of new, difficult metal puzzles, some of which contain several narrow passages. To the best of our knowledge, a collection of comparably complex benchmarks is not publicly available, and we found no description of comparably complex benchmarks in the motion planning literature.
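As a hedged sketch of the batched setting described above (the same two objects tested in many sampled poses), the code below performs only a broad-phase test with a single axis-aligned bounding box per object; a full planner would descend into BVHs for the poses that pass this filter.

```python
import numpy as np

def aabb_of_points(points):
    """Axis-aligned bounding box (min corner, max corner) of a point cloud."""
    return points.min(axis=0), points.max(axis=0)

def batched_aabb_overlap(obj_a, obj_b, translations):
    """Broad-phase test of object A against object B translated by many sampled offsets.

    This stops at the root boxes; it is a crude stand-in for the BVH traversal
    used in the work, meant only to show the batched many-pose structure.
    """
    a_min, a_max = aabb_of_points(obj_a)
    b_min, b_max = aabb_of_points(obj_b)
    lo = b_min + translations                  # (n, 3) translated boxes of B
    hi = b_max + translations
    return np.all((lo <= a_max) & (hi >= a_min), axis=1)   # True = possible collision

rng = np.random.default_rng(1)
part = rng.uniform(-1.0, 1.0, (500, 3))          # static part
tool = rng.uniform(-0.2, 0.2, (200, 3))          # moving object
samples = rng.uniform(-3.0, 3.0, (10_000, 3))    # sampled translations (poses)
hits = batched_aabb_overlap(part, tool, samples)
print(f"{hits.sum()} of {hits.size} sampled poses need a narrow-phase check")
```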

Relevância:

30.00%

Publicador:

Resumo:

Logistics involves planning, managing, and organizing the flow of goods from the point of origin to the point of destination in order to meet given requirements. Logistics and transportation aspects are very important and represent a relevant cost for producing and shipping companies, but also for public administrations and private citizens. The optimization of resources and the improvement of the organization of operations are crucial for all branches of logistics, from operations management to transportation. As this work shows, optimization techniques, models, and algorithms are important tools for solving the ever newer and more complex problems arising in different segments of logistics. Many operations management and transportation problems are related to the class of optimization problems called Vehicle Routing Problems (VRPs). In this work, we consider several real-world deterministic and stochastic problems that fall within the wide class of VRPs, and we solve them by means of exact and heuristic methods. We treat three classes of real-world routing and logistics problems. First, we deal with one of the most important tactical problems arising in the management of bike sharing systems, namely the Bike sharing Rebalancing Problem (BRP). Second, we propose models and algorithms for real-world earthwork optimization problems. Third, we describe the 3D printing (3DP) process and highlight several optimization issues in 3DP; among them, we define the problem related to tool path definition in the 3DP process, the 3D Routing Problem (3DRP), which is a generalization of the arc routing problem. We present an ILP model and several heuristic algorithms to solve the 3DRP.
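As a minimal illustration of the heuristic side of such routing problems (not one of the algorithms developed in the work), the sketch below builds routes for a capacitated VRP instance with a simple nearest-neighbor rule on assumed random data.

```python
import numpy as np

def nearest_neighbor_cvrp(coords, demand, capacity):
    """Greedy route construction for a capacitated VRP.

    coords[0] is the depot. Each route extends to the nearest unvisited customer
    that still fits in the remaining vehicle capacity, then returns to the depot.
    """
    n = len(coords)
    dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
    unvisited = set(range(1, n))
    routes = []
    while unvisited:
        route, load, current = [0], 0.0, 0
        while True:
            feasible = [j for j in unvisited if load + demand[j] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda j: dist[current, j])
            route.append(nxt)
            load += demand[nxt]
            unvisited.remove(nxt)
            current = nxt
        route.append(0)                  # back to the depot
        routes.append(route)
    cost = sum(dist[r[i], r[i + 1]] for r in routes for i in range(len(r) - 1))
    return routes, cost

rng = np.random.default_rng(7)
coords = rng.uniform(0.0, 100.0, (21, 2))        # depot + 20 customers
demand = np.r_[0.0, rng.uniform(1.0, 10.0, 20)]
routes, cost = nearest_neighbor_cvrp(coords, demand, capacity=30.0)
print(len(routes), "routes, total distance", round(cost, 1))
```

Such a constructive solution is typically only a starting point; exact ILP formulations or improvement heuristics of the kind described above would refine it further.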

Relevância:

30.00%

Publicador:

Resumo:

Stereological tools are the gold standard for accurate (i.e., unbiased) and precise quantification of any microscopic sample. The past decades have provided a broad spectrum of tools to estimate a variety of parameters such as volumes, surfaces, lengths, and numbers. Some of them require pairs of parallel sections, which can be produced by either physical or optical sectioning, optical sectioning being much more efficient where applicable. Unfortunately, transmission electron microscopy could not fully profit from these riches, mainly because of its large depth of field; hence, optical sectioning was a long-standing desire of electron microscopists. This desire was fulfilled by the development of electron tomography, which yields stacks of slices from electron microscopic sections. Now, parallel optical slices of previously unimaginably small thickness (2-5 nm axial resolution) can be produced. These optical slices minimize problems related to overprojection effects and allow for direct stereological analysis, e.g., volume estimation with the Cavalieri principle and number estimation with the optical disector method. Here, we demonstrate that the symbiosis of stereology and electron tomography is an easy and efficient way to perform quantitative analysis at the electron microscopic level. We call this approach quantitative 3D electron microscopy.
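The Cavalieri principle mentioned above reduces to a one-line estimator. The sketch below applies it to an assumed synthetic stack of slices through a sphere, with point counts approximated from the analytic slice areas rather than counted on real tomograms.

```python
import numpy as np

def cavalieri_volume(point_counts, section_spacing_nm, area_per_point_nm2):
    """Cavalieri volume estimate: V = t * (a/p) * sum(P_i).

    point_counts       : test points hitting the structure in each tomographic slice
    section_spacing_nm : distance between consecutive slices (t)
    area_per_point_nm2 : area associated with one test grid point (a/p)
    """
    return section_spacing_nm * area_per_point_nm2 * np.sum(point_counts)

# synthetic example: a sphere of radius 200 nm sampled by slices 5 nm apart
radius, t, grid = 200.0, 5.0, 10.0                  # nm, nm, grid spacing in nm
z = np.arange(-radius, radius, t) + 0.5 * t         # slice positions
slice_area = np.pi * np.maximum(radius**2 - z**2, 0.0)
point_counts = np.round(slice_area / grid**2)       # expected point hits per slice
v_hat = cavalieri_volume(point_counts, t, grid**2)
v_true = 4.0 / 3.0 * np.pi * radius**3
print(f"estimate {v_hat:.3e} nm^3 vs analytic {v_true:.3e} nm^3")
```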

Relevância:

30.00%

Publicador:

Resumo:

The similarity measure is one of the main factors affecting the accuracy of intensity-based 2D/3D registration of X-ray fluoroscopy to CT images. Information theory has been used to derive similarity measures for image registration, leading to the introduction of mutual information, an accurate similarity measure for multi-modal and mono-modal image registration tasks. However, the standard mutual information measure only takes intensity values into account, without considering spatial information, and its robustness is questionable. Previous attempts to incorporate spatial information into mutual information either require computing the entropy of higher-dimensional probability distributions or are not robust to outliers. In this paper, we show how to incorporate spatial information into mutual information without suffering from these problems. Using a variational approximation derived from the Kullback-Leibler bound, spatial information can be effectively incorporated into mutual information via energy minimization. The resulting similarity measure has a least-squares form and can be effectively minimized by a multi-resolution Levenberg-Marquardt optimizer. Experimental results are presented on datasets from two applications: (a) intra-operative patient pose estimation from a few (e.g., 2) calibrated fluoroscopic images, and (b) post-operative cup alignment estimation from a single X-ray radiograph with gonadal shielding.
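For reference, the standard intensity-only mutual information that the proposed measure extends can be computed from a joint histogram. The sketch below is this baseline only, not the spatially weighted, least-squares variant introduced in the paper, and it uses synthetic images.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Standard (intensity-only) mutual information from a joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)       # marginal of image A
    p_y = p_xy.sum(axis=0, keepdims=True)       # marginal of image B
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

rng = np.random.default_rng(3)
fixed = rng.normal(size=(128, 128))
shifted = np.roll(fixed, 5, axis=1) + 0.1 * rng.normal(size=(128, 128))
print("MI(fixed, fixed)  :", round(mutual_information(fixed, fixed), 3))
print("MI(fixed, shifted):", round(mutual_information(fixed, shifted), 3))  # lower
```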

Relevância:

30.00%

Publicador:

Resumo:

To keep pace with the ever shorter product launch times dictated by today's fierce competition, the manufacturing industry is relying more and more on 3D printing of prototypes. With this production method, technical problems can be solved as early as the initial development phase, which saves costs and speeds up the development steps. The innovative PolyJet™ technology from Objet sets new standards in 3D printing. Its distinctive feature: models built from wafer-thin layers of material. With PolyJet™ technology, true-to-detail models can thus be produced extremely quickly, simply, and cleanly, and with excellent surface quality.

Relevância:

30.00%

Publicador:

Resumo:

The acquisition of conventional X-ray radiographs remains the standard imaging procedure for the diagnosis of hip-related problems. However, recent studies have demonstrated the benefit of using three-dimensional (3D) surface models in the clinical routine. 3D surface models of the hip joint are useful for assessing the dynamic range of motion in order to identify possible pathologies such as femoroacetabular impingement. In this paper, we present an integrated system consisting of X-ray radiograph calibration and subsequent 2D/3D hip joint reconstruction for the diagnosis and planning of hip-related problems. A mobile phantom with two different sizes of fiducials, which can be robustly detected in the images, was developed for X-ray radiograph calibration. On the basis of the calibrated X-ray images, a 3D reconstruction method for the acetabulum was developed and applied together with existing techniques to reconstruct a 3D surface model of the hip joint. X-ray radiographs of dry cadaveric hip bones and of one cadaveric specimen with soft tissue were used to assess the robustness of the developed fiducial detection algorithm. Computed tomography scans of the cadaveric bones were used to validate the accuracy of the integrated system. The fiducial detection sensitivity was in the same range for both fiducial sizes: 97.96% for the large fiducials and 97.62% for the small fiducials. The acetabulum and the proximal femur were reconstructed with mean surface distance errors of 1.06 mm and 1.01 mm, respectively. The results for fiducial detection sensitivity and 3D surface reconstruction demonstrate the capability of the integrated system for 3D hip joint reconstruction from calibrated 2D X-ray radiographs.
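As a hedged illustration of the kind of validation metric reported above (not the authors' evaluation code), the sketch below computes a symmetric mean surface distance between two point-sampled surfaces using a k-d tree, here on a synthetic pair of spheres with a known offset.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_distance(reconstructed_pts, reference_pts):
    """Symmetric mean point-to-nearest-point distance between two surface samplings."""
    d_ab = cKDTree(reference_pts).query(reconstructed_pts)[0]
    d_ba = cKDTree(reconstructed_pts).query(reference_pts)[0]
    return 0.5 * (d_ab.mean() + d_ba.mean())

# synthetic check: a 50 mm sphere against a 51 mm copy (coordinates in mm)
rng = np.random.default_rng(5)
pts = rng.normal(size=(20_000, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
msd = mean_surface_distance(50.0 * pts, 51.0 * pts)
print("MSD:", round(float(msd), 3), "mm")   # roughly the 1 mm radial gap
```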