977 results for Cartesian grid method
Abstract:
Quick removal of biosolids in aquaculture facilities, and especially in recirculating aquaculture systems (RAS), is one of the most important steps in waste management. The sedimentation dynamics of biosolids in an aquaculture tank will determine their accumulation at the bottom of the tank.
Abstract:
We present an immersed interface method for the incompressible Navier–Stokes equations capable of handling rigid immersed boundaries. The immersed boundary is represented by a set of Lagrangian control points. In order to guarantee that the no-slip condition on the boundary is satisfied, singular forces are applied on the fluid at the immersed boundary. The forces are related to the jumps in pressure and the jumps in the derivatives of both pressure and velocity, and are interpolated using cubic splines. The strengths of the singular forces are determined by solving a small system of equations at each time step. The Navier–Stokes equations are discretized on a staggered Cartesian grid by a second-order accurate projection method for pressure and velocity.
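One ingredient of the method above, the cubic-spline interpolation of quantities between Lagrangian control points, can be sketched in a few lines. This is an illustrative stand-in, not the paper's implementation: the control-point parametrization and the force-strength samples below are made up, and SciPy's `CubicSpline` is used for the periodic spline along a closed boundary.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Lagrangian control points parametrized by arclength s on a closed boundary.
s = np.linspace(0.0, 2.0 * np.pi, 17)   # 16 intervals, endpoint repeated
f = np.sin(2.0 * s)                      # made-up singular-force strengths
f[-1] = f[0]                             # enforce exact periodicity for the spline

# Periodic cubic spline through the control-point values, used to
# interpolate force strengths between control points.
spline = CubicSpline(s, f, bc_type="periodic")

mid = 0.5 * (s[:-1] + s[1:])             # evaluate between control points
f_mid = spline(mid)
err = np.max(np.abs(f_mid - np.sin(2.0 * mid)))
```

The spline reproduces the samples exactly at the control points and stays close to the underlying smooth profile in between, which is the property the interpolation of jump conditions relies on.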
Abstract:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is resampled onto a Cartesian grid. An inverse 2D Fourier transform eventually yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero padding and 2D-DFT oversampling rates together with radial cubic b-spline interpolation improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
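The central-slice theorem this reconstruction rests on can be checked numerically in a few lines. This sketch uses NumPy rather than ITK and an arbitrary test image: the 1D Fourier transform of a parallel-beam projection equals the corresponding central slice of the image's 2D Fourier transform.

```python
import numpy as np

# Any test image stands in for the scanned object.
rng = np.random.default_rng(0)
img = rng.random((64, 64))

# Parallel-beam projection at angle 0: line integrals along the vertical axis.
proj = img.sum(axis=0)

# Central-slice theorem: the 1D FFT of that projection equals the
# corresponding central slice (row) of the image's 2D FFT.
slice_1d = np.fft.fft(proj)
central = np.fft.fft2(img)[0, :]
match = np.allclose(slice_1d, central)
```

Collecting such slices over many angles fills a polar Fourier space, which is then resampled onto a Cartesian grid before the inverse 2D transform, exactly as the abstract describes.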
Abstract:
The projection method and Sasaki's variational approach are two techniques for obtaining a divergence-free vector field from an arbitrary initial field. For a high-altitude wind velocity, a velocity field on a staggered grid is generated above a topography given by an analytic function. The Cartesian approach known as the Embedded Boundary Method is used to solve a Poisson equation arising from the projection on an irregular domain with mixed boundary conditions. The resulting solution corrects the initial field so as to obtain a field satisfying mass conservation while also accounting for the effects of the terrain geometry. The velocity field thus generated will be used to propagate a forest fire over the topography with the level-set method. The algorithm is described for the two- and three-dimensional cases, and convergence tests are performed.
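The projection idea can be illustrated in a deliberately simplified setting: a periodic square domain with an FFT-based Poisson solve, rather than the embedded-boundary solver with mixed boundary conditions used above. Given an arbitrary field, solving lap(phi) = div(u) and subtracting grad(phi) yields a divergence-free correction.

```python
import numpy as np

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")

# Arbitrary initial field with nonzero divergence.
u = np.cos(X) * np.sin(Y) + np.sin(X)
v = np.sin(X) * np.cos(Y)

k = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers
KX, KY = np.meshgrid(k, k, indexing="ij")

def divergence(u, v):
    return np.fft.ifft2(1j * KX * np.fft.fft2(u) + 1j * KY * np.fft.fft2(v)).real

div0 = divergence(u, v)

# Solve the Poisson equation  lap(phi) = div(u)  spectrally, then correct.
K2 = KX**2 + KY**2
K2[0, 0] = 1.0                            # avoid division by zero for the mean mode
phi_hat = np.fft.fft2(div0) / (-K2)
phi_hat[0, 0] = 0.0
u_c = u - np.fft.ifft2(1j * KX * phi_hat).real
v_c = v - np.fft.ifft2(1j * KY * phi_hat).real

div_c = divergence(u_c, v_c)
```

On a periodic domain the correction removes the divergence to machine precision; over real topography the same projection requires the irregular-domain Poisson solver described in the abstract.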
Abstract:
During the epoch when the first collapsed structures formed (6 < z < 50) our Universe went through an extended period of changes. Some of the radiation from the first stars and accreting black holes in those structures escaped and changed the state of the Intergalactic Medium (IGM). The era of this global phase change, in which the state of the IGM was transformed from cold and neutral to warm and ionized, is called the Epoch of Reionization. In this thesis we focus on numerical methods to calculate the effects of this escaping radiation. We start by considering the performance of the cosmological radiative transfer code C2-Ray. We find that although this code efficiently and accurately solves for the changes in the ionized fractions, it can yield inaccurate results for the temperature changes. We introduce two new elements to improve the code. The first element, an adaptive time step algorithm, quickly determines an optimal time step by only considering the computational cells relevant for this determination. The second element, asynchronous evolution, allows different cells to evolve with different time steps. An important constituent of methods to calculate the effects of ionizing radiation is the transport of photons through the computational domain, or "ray-tracing". We devise a novel ray tracing method called PYRAMID which uses a new geometry: the pyramidal geometry. This geometry shares properties with both the standard Cartesian and spherical geometries. This makes it on the one hand easy to use in conjunction with a Cartesian grid and on the other hand ideally suited to trace radiation from a radially emitting source. A time-dependent photoionization calculation not only requires tracing the path of photons but also solving the coupled set of photoionization and thermal equations. Several different solvers for these equations are in use in cosmological radiative transfer codes.
We conduct a detailed and quantitative comparison of four different standard solvers, evaluating how their accuracy depends on the choice of the time step. This comparison shows that their performance can be characterized by two simple parameters and that C2-Ray generally performs best.
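The adaptive time-step idea can be sketched on a toy problem. This is emphatically not C2-Ray's actual algorithm: the model below is a single cell with a constant (made-up) photoionization rate Gamma and recombination rate alpha, and the step is sized so that the neutral fraction changes by at most a chosen fraction eps per step.

```python
# Toy ionization model (NOT C2-Ray's actual scheme): for constant rates,
# the ionized fraction x obeys  dx/dt = (1 - x) * Gamma - x * alpha.
Gamma, alpha = 3.0, 1.0      # made-up rates, arbitrary units
x, t, t_end = 1e-4, 0.0, 5.0
eps = 0.05                   # allowed fractional change of the neutral fraction per step
steps = 0
while t < t_end and steps < 10_000:       # step cap is only a safety net
    rate = (1.0 - x) * Gamma - x * alpha
    # Adaptive step: choose dt so the change per step is about eps * (1 - x).
    dt = eps * max(1.0 - x, 1e-12) / (abs(rate) + 1e-30)
    dt = min(dt, t_end - t)
    x += rate * dt
    t += dt
    steps += 1

x_eq = Gamma / (Gamma + alpha)            # analytic equilibrium, here 0.75
```

The step size grows automatically as the cell approaches equilibrium, which is the qualitative behavior an adaptive time-step algorithm exploits; the real code additionally restricts attention to the cells that actually constrain the step.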
Abstract:
This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid dividing the Earth into geographic regions is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multi-dimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
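The classical Jensen–Shannon divergence used above is easy to state in code. This minimal sketch compares two hypothetical event-count histograms for two grid regions (the counts are invented for illustration); with base-2 logarithms the divergence lies between 0 and 1 and is symmetric in its arguments.

```python
import numpy as np

def jensen_shannon(p, q):
    """Jensen-Shannon divergence (base-2 logs) between two histograms."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        nz = a > 0
        return float(np.sum(a[nz] * np.log2(a[nz] / b[nz])))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical earthquake counts in four magnitude bins for two regions.
p = [10, 5, 1, 0]
q = [1, 5, 10, 2]
d = jensen_shannon(p, q)
```

A single number per pair of regions is exactly the "single parameter" property the abstract highlights; the generalized (fractional) variants replace the logarithm with a deformed one.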
Abstract:
Despite the economic and social importance of Brazilian sugarcane spirit, studies on its sensory quality are still very scarce; however, growing market demands have increased concern with the quality of this beverage. Sugarcane spirit is highly appreciated for its characteristic aroma and flavor, which can be further improved by aging in wooden casks. The complex process that occurs during aging depends, among other factors, on the type of wood used, the maturation time and, obviously, the initial quality of the distillate. Quantitative descriptive analysis, a methodology widely applied to characterize the sensory attributes of foods and beverages, was used in this work to study the sensory profile of sugarcane spirit during aging in oak casks. Samples aged for zero, 12, 24, 36 and 48 months in a 200-liter oak cask were analyzed, together with two commercial samples, one of them aged. Sixteen panelists, pre-selected through triangle tests and sequential analysis, generated the descriptive terms for the spirits using Kelly's Repertory Grid Method. After the training stage, 10 panelists were selected based on their discrimination power, repeatability and agreement with the panel in the use of scales. The samples were then presented and evaluated by the panelists in individual booths, monadically, with four replications. The results were submitted to analysis of variance, Tukey's test of means and Principal Component Analysis. The descriptive terms chosen by consensus among the members of the sensory panel were: yellow color, alcoholic aroma, wood aroma, vanilla aroma, initial sweetness, residual sweetness, initial alcoholic flavor, residual alcoholic flavor, initial wood flavor, residual wood flavor, aggressive flavor, astringent and acid.
The results revealed significant changes (p ≤ 0.05) in the sensory characteristics of the spirit over the course of aging. After 48 months in an oak cask, the beverage showed pronounced wood aroma, initial and residual sweetness, vanilla aroma, yellow color, and initial and residual wood taste, while the descriptors alcoholic aroma, aggressiveness, and initial and residual alcohol flavor were significantly lower than in the other samples.
Abstract:
Sweeteners in solution, at the same sweetness equivalence, may present sensory characteristics that make them different from one another. The present study aimed to carry out Quantitative Descriptive Analysis of solutions of aspartame (APM), stevia leaf extract (SrB) and a 2:1 cyclamate/saccharin blend (C/S) at different sweetness levels, that is, at sweetness equivalence to aqueous sucrose solutions at 3, 10, 20 and 30%. Eleven panelists, pre-selected through sequential analysis using their discrimination abilities as the criterion, were trained after the descriptive terminology was developed. After training, the panelists were selected according to their discrimination power, reproducibility and agreement with the panel in the use of scales. The descriptive terms for the sweeteners, for all sweetness levels, generated through Kelly's Repertory Grid Method, were: initial sweetness, residual sweetness, initial bitterness, residual bitterness, licorice aftertaste, body and acidity. The results for each sweetness level were analyzed through analysis of variance, Tukey's test and Principal Component Analysis. The descriptive analysis was effective in characterizing the sensory profile of the sweeteners at different concentrations, showing the changes in profile as their concentrations increased.
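The Principal Component Analysis step used in these descriptive studies can be sketched with plain NumPy. The data below are synthetic (random scores on an assumed 0–9 intensity scale for 12 hypothetical sample-by-level combinations and the 7 descriptors); only the PCA mechanics are illustrated, not the study's results.

```python
import numpy as np

# Synthetic panel means: 12 samples (e.g. 3 sweeteners x 4 sweetness levels)
# scored on 7 descriptors; the values and the 0-9 scale are made up.
rng = np.random.default_rng(1)
scores = rng.random((12, 7)) * 9.0

Xc = scores - scores.mean(axis=0)         # center each descriptor
cov = np.cov(Xc, rowvar=False)
evals, evecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

pc_scores = Xc @ evecs[:, :2]             # sample map on the first two PCs
explained = evals / evals.sum()           # variance explained per component
```

The two-dimensional `pc_scores` map is what such papers plot to show how samples separate along the dominant descriptor contrasts.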
Abstract:
Quantitative Descriptive Analysis (QDA) was employed to characterize cocoa powder samples spanning the range of a central composite rotational design for the alkalization process of cocoa nibs. The independent variables ranged from 60 to 120 °C for temperature, from 30 to 150 min for time, and from 1.22 to 4.78% for K2CO3 concentration. Eight cocoa powder samples representative of the color and acceptability variations of all experimentally obtained samples, plus two commercial brand samples, were evaluated. Aroma was analyzed directly on the alkalized cocoa powders, and the remaining attributes in the form of a chocolate drink (2% cocoa powder and 7% sugar in sterilized skim milk). Twelve panelists, selected based on their discrimination power, reproducibility and agreement with the panel, generated by consensus, through Kelly's Repertory Grid Method, three descriptive terms for aroma (alkaline, chocolate and burnt) and twelve for the chocolate drinks (solubility, brown, reddish brown, chocolate, burnt, caramel, sweet, astringent, alkaline, bitter, salty and body). The samples were evaluated monadically, with three replications, in individual booths. The data were submitted to ANOVA, Tukey's test and Principal Component Analysis. The aroma evaluations of the cocoa powders showed a direct relationship between alkaline aroma and the alkali content, temperature and time of the process. In general, products with lower K2CO3 concentrations (1.22–3.00%) showed stronger chocolate aroma and flavor. A direct relationship was found between the alkali content of the product and the alkaline, burnt and astringent flavors, and an inverse relationship with color lightness.
Thus, the samples with the highest K2CO3 concentration (4.78%) were considered by the sensory panel to have the strongest alkaline, burnt and astringent flavor and aroma, as well as the darkest brown and reddish-brown color. All processes yielded samples with high solubility and weak bitter flavor.
Abstract:
Moving-least-squares (MLS) surfaces undergoing large deformations need periodic regeneration of the point set (point-set resampling) so as to keep the point-set density quasi-uniform. Previous work by the authors dealt with algebraic MLS surfaces and proposed a resampling strategy based on defining the new points at the intersections of the MLS surface with a suitable set of rays. That strategy has very low memory requirements and is easy to parallelize. In this article, new resampling strategies with reduced CPU-time cost are explored. The basic idea is to choose as the set of rays the lines of a regular Cartesian grid and to fully exploit this grid: as a data structure for search queries, as a spatial structure for traversing the surface in a continuation-like algorithm, and also as an approximation grid for an interpolated version of the MLS surface. It is shown that this yields a very simple and compact resampling technique that cuts the resampling cost in half with affordable memory requirements.
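One of the grid's three roles above, serving as a data structure for search queries, can be sketched as a uniform bucket grid. This is an illustrative 2D stand-in (random points, made-up spacing), not the paper's code: points are binned into Cartesian cells, and a radius query only visits the 3x3 block of cells around the query point (valid as long as the radius does not exceed the cell size).

```python
import numpy as np
from collections import defaultdict

h = 0.1                                    # grid spacing (must be >= query radius)
rng = np.random.default_rng(2)
pts = rng.random((500, 2))                 # stand-in for the MLS point set

# Bucket every point index into its Cartesian grid cell.
cells = defaultdict(list)
for i, p in enumerate(pts):
    cells[tuple((p // h).astype(int))].append(i)

def radius_query(p, r):
    """Indices of points within r of p, visiting only the 3x3 cell block."""
    c = (np.asarray(p) // h).astype(int)
    cand = [j for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            for j in cells.get((c[0] + dx, c[1] + dy), [])]
    return sorted(j for j in cand if np.linalg.norm(pts[j] - p) <= r)

center = np.array([0.5, 0.5])
found = radius_query(center, 0.08)
brute = [j for j in range(len(pts)) if np.linalg.norm(pts[j] - center) <= 0.08]
```

The grid query returns exactly the brute-force result while touching only a constant number of cells, which is what makes reusing the ray grid for neighbor searches attractive.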
Abstract:
The study aimed to evaluate the radial profile and the uniformity of water distribution of a sprinkler manufactured by NaanDanJain, model 427 1/2'' M, with a nozzle of 2.8 mm internal diameter, operating at pressures of 150, 200, 300 and 400 kPa and five deflector positions (0, 20, 50, 80 and 100%). The grid method was used to determine the evaluated parameters and, with the aid of the CATCH 3D computer application, the overlapping of water depths was calculated for ten spacings. The results show that the deflector adjustment influences the wetted radius and the distribution profile, while the uniformity of water application proved to be an important mechanism, since it permits different behaviors for the sprinkler, ensuring a wide range of uses for the equipment.
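A standard way to score uniformity from overlapped catch-can depths, as produced by a grid method like the one above, is Christiansen's uniformity coefficient, CU = 100 (1 - sum|d_i - mean| / sum d_i). The abstract does not state which metric CATCH 3D reports, so this is a hedged sketch of the common choice, with made-up depth values.

```python
import numpy as np

def christiansen_cu(depths):
    """Christiansen's uniformity coefficient (%) for overlapped water depths."""
    d = np.asarray(depths, float)
    return 100.0 * (1.0 - np.abs(d - d.mean()).sum() / d.sum())

perfect = christiansen_cu([5.0, 5.0, 5.0, 5.0])   # identical depths -> 100%
mixed = christiansen_cu([4.0, 6.0, 5.0, 5.0])     # mild variation -> 90%
```

Repeating the calculation for each pressure, deflector position, and spacing is how the equipment's usable operating range is mapped out.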
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
In astrophysical regimes where the collisional excitation of hydrogen atoms is relevant, the cross-sections for the interactions of hydrogen atoms with electrons and protons are necessary for calculating line profiles and intensities. In particular, at relative velocities exceeding ∼1000 km s⁻¹, collisional excitation by protons dominates over that by electrons. Surprisingly, the H–H+ cross-sections at these velocities do not exist for atomic levels of n ≥ 4, forcing researchers to utilize extrapolation via inaccurate scaling laws. In this study, we present a faster and improved algorithm for computing cross-sections for the H–H+ collisional system, including excitation and charge transfer to the n ≥ 2 levels of the hydrogen atom. We develop a code named BDSCX which directly solves the Schrödinger equation with variable (but non-adaptive) resolution and utilizes a hybrid spatial-Fourier grid. Our novel hybrid grid reduces the number of grid points needed from ∼4000 n⁶ (for a 'brute force' Cartesian grid) to ∼2000 n⁴ and speeds up the computation by a factor of ∼50 for calculations going up to n = 4. We present (l, m)-resolved results for charge transfer and excitation final states for n = 2–4 and for projectile energies of 5–80 keV, as well as fitting functions for the cross-sections. The ability to accurately compute H–H+ cross-sections to n = 4 allows us to calculate the Balmer decrement, the ratio of Hα to Hβ line intensities. We find that the Balmer decrement starts to increase beyond its largely constant value of 2–3 below 10 keV, reaching values of 4–5 at 5 keV, thus complicating its use as a diagnostic of dust extinction when fast (∼1000 km s⁻¹) shocks are impinging upon the ambient interstellar medium.
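The grid-size saving quoted above can be checked with a line of arithmetic: at n = 4 the hybrid grid needs a factor 2n² = 32 fewer points than the plain Cartesian grid, so the quoted ∼50× speedup reflects algorithmic gains beyond the point count alone.

```python
# Grid-point counts quoted in the abstract, as functions of the
# principal quantum number n.
def brute_force_points(n):
    return 4000 * n**6        # ~4000 n^6 for a plain Cartesian grid

def hybrid_points(n):
    return 2000 * n**4        # ~2000 n^4 for the hybrid spatial-Fourier grid

ratio = brute_force_points(4) / hybrid_points(4)   # = 2 * 4**2 = 32
```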
Abstract:
Aerial photography was used to determine the land use in a test area of the Nigerian savanna in 1950 and 1972. Changes in land use were determined and correlated with accessibility, appropriate low-technology methods being used to make it easy to extend the investigation to other areas without incurring great expense. A test area of 750 sq km was chosen, located in Kaduna State of Nigeria. The geography of the area is summarised together with the local knowledge which is essential for accurate photo interpretation. A land use classification was devised and tested for use with medium-scale aerial photography of the savanna. The two sets of aerial photography at 1:25 000 scale were sampled using systematic dot grids. A dot density of 8 1/2 dots per sq km was calculated to give an acceptable estimate of land use. Problems of interpretation included gradation between categories, sample position uncertainty and personal bias. The results showed that in 22 years the amount of cultivated land in the test area had doubled while there had been a corresponding decrease in the amount of uncultivated land, particularly woodland. The intensity of land use had generally increased. The distribution of land use changes was analysed and correlated with accessibility. Highly significant correlations were found for 1972 which had not existed in 1950. Changes in land use could also be correlated with accessibility. It was concluded that in the 22-year test period there had been intensification of land use, movement of human activity towards the main road, and a decrease in natural vegetation, particularly close to the road. The classification of land use and the dot grid method of survey were shown to be applicable to a savanna test area.
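The dot grid principle used above is simple to demonstrate: the fraction of systematically placed dots falling in a category estimates that category's area fraction. This sketch uses an idealized stand-in shape (a disc of known area) rather than photo-interpreted land-use classes.

```python
import numpy as np

n = 40                                    # 40 x 40 systematic dot grid
ticks = (np.arange(n) + 0.5) / n          # cell-centered dot positions
X, Y = np.meshgrid(ticks, ticks)

# Stand-in "land-use class": a disc of radius 0.5, true area pi/4 ~ 0.785.
inside = (X - 0.5) ** 2 + (Y - 0.5) ** 2 <= 0.25

# Dot-grid estimate: area fraction ~ fraction of dots falling in the class.
area_est = inside.mean()
true_area = np.pi / 4.0
```

With 1600 dots the estimate lands within about half a percent of the true area, which is why a modest dot density (here 8 1/2 dots per sq km in the survey) can give acceptable land-use estimates.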