917 results for Numerical Algorithms and Problems


Relevance:

100.00%

Publisher:

Abstract:

A numerical study of turbulent flow in a straight duct of square cross-section is made. An order-of-magnitude analysis of the 3-D, time-averaged Navier-Stokes equations results in a parabolic form of the Navier-Stokes equations. The governing equations, expressed in terms of a new vector-potential formulation, are expanded as a multi-deck structure with each deck characterized by its dominant physical forces. The resulting equations are solved using a finite-element approach with a bicubic element representation on each cross-sectional plane. The numerical integration along the streamwise direction is carried out with finite-difference approximations until a fully-developed state is reached. The computed results agree well with other numerical studies and compare very favorably with the available experimental data. One important outcome of the current investigation is the analytical finding that the driving force of the secondary flow in a square duct comes mainly from the second-order gradient terms of the difference between the normal and transverse Reynolds stresses in the axial vorticity equation.
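For reference, the turbulence-driven source term in the Reynolds-averaged axial vorticity equation for fully developed duct flow is conventionally written as

    \frac{\partial^{2}}{\partial y\,\partial z}\left(\overline{v'^{2}} - \overline{w'^{2}}\right) + \left(\frac{\partial^{2}}{\partial z^{2}} - \frac{\partial^{2}}{\partial y^{2}}\right)\overline{v'w'},

where the first contribution, involving second-order gradients of the normal-stress difference, is the mechanism identified above as the main driver of the secondary flow. This is the standard textbook form, not necessarily the exact notation used in the paper.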

Relevance:

100.00%

Publisher:

Abstract:

The toxicity of sediments in Biscayne Bay and many adjoining tributaries was determined as part of a bioeffects assessment program managed by NOAA’s National Status and Trends Program. The objectives of the survey were to determine: (1) the incidence and degree of toxicity of sediments throughout the study area; (2) the spatial patterns (or gradients) in chemical contamination and toxicity, if any, throughout the study area; (3) the spatial extent of chemical contamination and toxicity; and (4) the statistical relationships between measures of toxicity and concentrations of chemicals in the sediments. The survey was designed to characterize sediment quality throughout the greater Biscayne Bay area. Surficial sediment samples were collected during 1995 and 1996 from 226 randomly chosen locations throughout nine major regions. Laboratory toxicity tests were performed as indicators of potential ecotoxicological effects in sediments. A battery of tests was performed to generate information from different phases (components) of the sediments. Tests were selected to represent a range in toxicological endpoints from acute to chronic sublethal responses. Toxicological tests were conducted to measure: reduced survival of adult amphipods exposed to solid-phase sediments; impaired fertilization success and abnormal morphological development in gametes and embryos, respectively, of sea urchins exposed to pore waters; reduced metabolic activity of a marine bioluminescent bacterium exposed to organic solvent extracts; induction of a cytochrome P-450 reporter gene system in exposures to solvent extracts; and reduced reproductive success in marine copepods exposed to solid-phase sediments. Contamination and toxicity were most severe in several peripheral canals and tributaries, including the lower Miami River, adjoining the main axis of the bay. In the open basins of the bay, chemical concentrations and toxicity generally were higher in areas north of the Rickenbacker Causeway than south of it. Sediments from the main basins of the bay generally were less toxic than those from the adjoining tributaries and canals. The different toxicity tests, however, indicated differences in severity, incidence, spatial patterns, and spatial extent of toxicity. The most sensitive test among those performed on all samples, a bioassay of normal morphological development of sea urchin embryos, indicated toxicity was pervasive throughout the entire study area. The least sensitive test, an acute bioassay performed with a benthic amphipod, indicated toxicity was restricted to a very small percentage of the area. Both the degree and spatial extent of chemical contamination and toxicity in this study area were similar to or less severe than those observed in many other areas in the U.S. The spatial extent of toxicity in all four tests performed throughout the bay was comparable to the “national averages” calculated by NOAA from previous surveys conducted in a similar manner. Several trace metals occurred in concentrations in excess of those expected in reference sediments. Mixtures of substances, including pesticides, petroleum constituents, trace metals, and ammonia, were associated statistically with the measures of toxicity. Substances most elevated in concentration relative to numerical guidelines and associated with toxicity included polychlorinated biphenyls, DDT pesticides, polynuclear aromatic hydrocarbons, hexachlorocyclohexanes, lead, and mercury.
These (and other) substances occurred in concentrations greater than effects-based guidelines in the samples that were most toxic in one or more of the tests. (PDF contains 180 pages)

Relevance:

100.00%

Publisher:

Abstract:

Polymer optical fibers (POFs) doped with organic dyes can be used to make efficient lasers and amplifiers because of the high gains achievable over short distances. This paper analyzes the peculiarities of light amplification in POFs through experimental data and a computational model capable of carrying out both power and spectral analyses. We investigate the emission spectral shifts and widths, as well as the optimum signal wavelength and pump power, as functions of the fiber length, the fiber numerical aperture, and the radial distribution of the dopant. Analyses are carried out for both step-index and graded-index POFs.
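As a rough illustration of the kind of power analysis such a computational model performs, the sketch below marches a toy signal/pump power-evolution model along the fiber with a simple saturable gain. The coefficients (g0, Psat, alpha_s, alpha_p) and the function name are hypothetical placeholders, not the authors' model, which also resolves the spectral dimension and the radial dopant profile.

    def amplify(length_m, n_steps=1000, Pp0=0.1, Ps0=1e-6,
                g0=5.0, alpha_s=0.1, alpha_p=0.5, Psat=0.05):
        """Toy power-evolution model for a dye-doped fiber amplifier.

        dPs/dz = ( g0*Pp/(1 + Ps/Psat) - alpha_s ) * Ps   (saturable gain)
        dPp/dz = -( alpha_p + g0*Ps ) * Pp                (pump depletion)
        All coefficients are illustrative placeholders (per metre, watts).
        """
        dz = length_m / n_steps
        Pp, Ps = Pp0, Ps0
        for _ in range(n_steps):          # simple forward-Euler march along z
            gain = g0 * Pp / (1.0 + Ps / Psat)
            Ps += (gain - alpha_s) * Ps * dz
            Pp += -(alpha_p + g0 * Ps) * Pp * dz
            Pp = max(Pp, 0.0)
        return Ps, Pp

    # Example: signal output and residual pump after 1 m of fiber
    # print(amplify(length_m=1.0))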

Relevance:

100.00%

Publisher:

Abstract:

As academic libraries are increasingly supported by a matrix of database functions, the use of data mining and visualization techniques offers significant potential for future collection development and service initiatives based on quantifiable data. While data collection techniques are still not standardized and results may be skewed because of granularity problems, faulty algorithms, and a host of other factors, useful baseline data are extractable and broad trends can be identified. The purpose of the current study is to provide an initial assessment of data associated with the science monograph collection at the Marston Science Library (MSL), University of Florida. These sciences fall within the major Library of Congress Classification schedules of Q, S, and T, excluding R, TN, TR, and TT. The overall strategy of this project is to examine the potential science audiences within the university community and analyze data related to purchasing and circulation patterns, e-book usage, and interlibrary loan statistics. While a longitudinal study from 2004 to the present would be ideal, this paper presents results for the academic year July 1, 2008 to June 30, 2009, which was chosen as the pilot period because all of the data reservoirs identified above were available.
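A minimal sketch of the subject filter implied by the scope above, assuming call numbers are available as plain Library of Congress class strings; the function name and string rules are hypothetical and derived only from the class ranges just listed (Q, S, T in scope; R, TN, TR, TT excluded).

    def in_science_scope(call_number: str) -> bool:
        """Return True if an LC call number falls in the Q, S or T schedules,
        excluding R, TN, TR and TT (the scope described above)."""
        cls = call_number.strip().upper()
        letters = ""
        for ch in cls:            # leading class letters, e.g. "TN" in "TN 870 .S55"
            if ch.isalpha():
                letters += ch
            else:
                break
        if not letters:
            return False
        if letters.startswith(("R", "TN", "TR", "TT")):
            return False
        return letters[0] in {"Q", "S", "T"}

    # Example: in_science_scope("QA 76.9 .D343") -> True
    #          in_science_scope("TR 145 .B37")   -> False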

Relevance:

100.00%

Publisher:

Abstract:

This report is a detailed description of the processing of NOAA/MLML spectroradiometry data. It introduces the MLML_DBASE programs, describes the assembly of diverse data files, and explains the general algorithms and how individual routines are used. Definitions of data structures are presented in the Appendices. [PDF contains 48 pages]

Relevance:

100.00%

Publisher:

Abstract:

We propose an integrated algorithm named low dimensional simplex evolution extension (LDSEE) for expensive global optimization, in which only a very limited number of function evaluations is allowed. The new algorithm accelerates an existing global optimization method, low dimensional simplex evolution (LDSE), by using radial basis function (RBF) interpolation and tabu search. Unlike other expensive global optimization methods, LDSEE integrates the RBF interpolation and tabu search with the LDSE algorithm rather than just calling existing global optimization algorithms as subroutines. As a result, it can keep a good balance between model approximation and global search. Meanwhile, it is self-contained: it does not rely on other global optimization algorithms and is very easy to use. Numerical results show that it is a competitive alternative for expensive global optimization.
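The sketch below illustrates the general surrogate-assisted idea (fit an RBF interpolant to the evaluated points, search the cheap surrogate, then spend one true evaluation on the most promising candidate). It is a generic illustration under simple assumptions, not the actual LDSEE algorithm, and it omits the simplex-evolution and tabu components.

    import numpy as np

    def rbf_fit(X, y, eps=1.0):
        """Fit a Gaussian radial basis function interpolant to samples (X, y)."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        A = np.exp(-(eps * d) ** 2)
        w = np.linalg.solve(A + 1e-10 * np.eye(len(X)), y)
        return lambda Z: np.exp(
            -(eps * np.linalg.norm(Z[:, None, :] - X[None, :, :], axis=-1)) ** 2) @ w

    def surrogate_minimize(f, bounds, n_init=10, n_iter=40, n_cand=2000, seed=0):
        """Minimize an expensive function f under a small evaluation budget."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        X = rng.uniform(lo, hi, size=(n_init, len(lo)))      # initial design
        y = np.array([f(x) for x in X])                      # expensive evaluations
        for _ in range(n_iter):
            model = rbf_fit(X, y)                            # cheap surrogate
            cand = rng.uniform(lo, hi, size=(n_cand, len(lo)))
            x_new = cand[np.argmin(model(cand))]             # search the surrogate
            X = np.vstack([X, x_new])
            y = np.append(y, f(x_new))                       # one true evaluation
        i_best = int(np.argmin(y))
        return X[i_best], y[i_best]

    # Example: minimize the 2-D sphere function with roughly 50 true evaluations
    # x_best, f_best = surrogate_minimize(lambda x: float(np.sum(x ** 2)),
    #                                     bounds=[(-5.0, 5.0), (-5.0, 5.0)])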

Relevance:

100.00%

Publisher:

Abstract:

A parallel strategy for solving multidimensional tridiagonal equations is investigated in this paper. We present in detail an improved version of the single parallel partition (SPP) algorithm in conjunction with message vectorization, which aggregates several communication messages into one to reduce the communication cost. We show that the resulting block SPP algorithm can achieve good speedup for a wide range of message vector lengths (MVL), especially when the number of grid points in the divided direction is large. Instead of simply using the largest possible MVL, we use numerical tests and modeling analysis to determine an optimal MVL, so that a significant further improvement in speedup can be obtained.
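As background, each 1-D line of a multidimensional problem yields an independent tridiagonal system, so many systems are solved per step. The sketch below shows a batched Thomas solve over many right-hand sides with numpy; the closing comment indicates where a distributed SPP-style solver would apply message vectorization. This is an illustrative serial sketch, not the paper's parallel implementation.

    import numpy as np

    def thomas_batched(a, b, c, d):
        """Solve K independent tridiagonal systems of size N.

        a, b, c : (N,) sub-, main- and super-diagonals (shared by all K systems)
        d       : (K, N) right-hand sides, one row per grid line
        returns : (K, N) solutions
        """
        N = b.size
        cp = np.empty(N)
        dp = np.empty_like(d, dtype=float)
        cp[0] = c[0] / b[0]
        dp[:, 0] = d[:, 0] / b[0]
        for i in range(1, N):                  # forward elimination
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m
            dp[:, i] = (d[:, i] - a[i] * dp[:, i - 1]) / m
        x = np.empty_like(dp)
        x[:, -1] = dp[:, -1]
        for i in range(N - 2, -1, -1):         # back substitution
            x[:, i] = dp[:, i] - cp[i] * x[:, i + 1]
        # In a distributed SPP-style solver the divided direction is partitioned
        # across processes; the interface unknowns of all K systems are packed
        # into one buffer and exchanged in a single communication call
        # (message vectorization) instead of K small messages.
        return x

    # Example: solve 128 systems of the same 1-D Poisson-like operator at once
    # N, K = 100, 128
    # a = np.full(N, -1.0); b = np.full(N, 2.0); c = np.full(N, -1.0)
    # x = thomas_batched(a, b, c, np.random.rand(K, N))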

Relevance:

100.00%

Publisher:

Abstract:

Bucket Foundations under Dynamic Loadings

The liquefaction deformation of the sand layer around a bucket foundation is simulated under equivalent dynamic ice-induced loadings. A simplified numerical model is presented that takes the bucket-soil interaction into consideration. The development of vertical and horizontal liquefaction deformations is computed under equivalent dynamic ice-induced loadings. First, the numerical model and its results are shown to be reliable by comparison with centrifuge test results. Second, the factors governing liquefaction deformation and its development characteristics are analyzed. Finally, the following numerical simulation results are obtained: the liquefaction deformation of the sand layer increases with increasing loading amplitude and with decreasing loading frequency and sand-skeleton strength. The maximum vertical deformation is located on the sand layer surface at a distance of one quarter of the bucket height from the bucket's side wall (the loading boundary). The maximum horizontal deformation occurs at the loading boundary. When the dynamic loading is applied for more than 5 hours, the vertical deformation on the sand layer surface reaches 3 times that at the bottom, and the horizontal deformation at a distance of 2.0 times the bucket height from the loading boundary is 3.3% of that at the loading boundary.

Relevance:

100.00%

Publisher:

Abstract:

Smoothed particle hydrodynamics (SPH) is a meshfree particle method based on a Lagrangian formulation, and it has been widely applied to different areas in engineering and science. This paper presents an overview of the SPH method and its recent developments, including (1) the need for meshfree particle methods and the advantages of SPH, (2) approximation schemes of the conventional SPH method and numerical techniques for deriving SPH formulations for partial differential equations such as the Navier-Stokes (N-S) equations, (3) the role of the smoothing kernel functions and a general approach to constructing smoothing kernel functions, (4) kernel and particle consistency for the SPH method, and approaches for restoring particle consistency, (5) several important numerical aspects, and (6) some recent applications of SPH. The paper ends with some concluding remarks.
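For reference, the two approximation steps that such an overview builds on are conventionally written (in standard SPH notation, not tied to this particular paper's symbols) as

    \langle f(\mathbf{r}) \rangle = \int_{\Omega} f(\mathbf{r}')\, W(\mathbf{r} - \mathbf{r}', h)\, d\mathbf{r}'    (kernel approximation)

    f_i \approx \sum_j \frac{m_j}{\rho_j}\, f_j\, W(\mathbf{r}_i - \mathbf{r}_j, h)    (particle approximation)

where W is the smoothing kernel, h the smoothing length, and m_j and \rho_j the mass and density of the neighboring particles.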

Relevance:

100.00%

Publisher:

Abstract:

A new high-order finite volume method based on local reconstruction is presented in this paper. The method, the so-called multi-moment constrained finite volume (MCV) method, uses point values defined at equally spaced points within a single cell as the model variables (or unknowns). The time evolution equations used to update the unknowns are derived from a set of constraint conditions imposed on multiple kinds of moments, i.e. the cell-averaged value and the point-wise values of the state variable and its derivatives. The finite volume constraint on the cell average guarantees the numerical conservativeness of the method. Most constraint conditions are imposed on the cell boundaries, where the numerical flux and its derivatives are obtained by solving general Riemann problems. A multi-moment constrained Lagrange interpolation reconstruction of the required order of accuracy is constructed over a single cell and converts the evolution equations of the moments to those of the unknowns. The presented method provides a general framework for constructing efficient schemes of high order. The basic formulations for hyperbolic conservation laws on 1D and 2D structured grids are detailed, with numerical results for widely used benchmark tests.
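As a concrete example of the conservation constraint mentioned above, in 1D the cell-averaged moment obeys the usual finite-volume relation (standard notation, not necessarily the paper's):

    \frac{d\bar{u}_i}{dt} = -\frac{1}{\Delta x}\left(\hat{f}_{i+1/2} - \hat{f}_{i-1/2}\right),  \qquad  \bar{u}_i = \frac{1}{\Delta x}\int_{x_{i-1/2}}^{x_{i+1/2}} u(x,t)\, dx,

with the numerical flux \hat{f}_{i\pm 1/2} (and, for the higher moments, its derivatives) supplied by the Riemann solutions at the cell boundaries; the remaining point-value and derivative constraints close the update for the in-cell unknowns.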

Relevance:

100.00%

Publisher:

Abstract:

This thesis explores the problem of mobile robot navigation in dense human crowds. We begin by considering a fundamental impediment to classical motion planning algorithms called the freezing robot problem: once the environment surpasses a certain level of complexity, the planner decides that all forward paths are unsafe, and the robot freezes in place (or performs unnecessary maneuvers) to avoid collisions. Since a feasible path typically exists, this behavior is suboptimal. Existing approaches have focused on reducing predictive uncertainty by employing higher fidelity individual dynamics models or heuristically limiting the individual predictive covariance to prevent overcautious navigation. We demonstrate that both the individual prediction and the individual predictive uncertainty have little to do with this undesirable navigation behavior. Additionally, we provide evidence that dynamic agents are able to navigate in dense crowds by engaging in joint collision avoidance, cooperatively making room to create feasible trajectories. We accordingly develop interacting Gaussian processes, a prediction density that captures cooperative collision avoidance, and a "multiple goal" extension that models the goal driven nature of human decision making. Navigation naturally emerges as a statistic of this distribution.
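One common way to write such a cooperative prediction density, following the interacting Gaussian process literature and not necessarily in the thesis's exact notation, is

    p\bigl(f^{(R)}, f^{(1)}, \dots, f^{(N)} \mid z\bigr) \;\propto\; \psi\bigl(f^{(R)}, f^{(1)}, \dots, f^{(N)}\bigr)\, p\bigl(f^{(R)} \mid z^{(R)}\bigr) \prod_{i=1}^{N} p\bigl(f^{(i)} \mid z^{(i)}\bigr),

where f^{(R)} is the robot trajectory, f^{(i)} the trajectory of pedestrian i, each p(. | z) is an independent Gaussian process posterior conditioned on the observations z, and \psi is an interaction potential that down-weights joint trajectories passing too close to one another. The navigation command is then taken as a statistic (for example, the mode) of this joint distribution.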

Most importantly, we empirically validate our models in the Chandler dining hall at Caltech during peak hours and, in the process, carry out the first extensive quantitative study of robot navigation in dense human crowds (collecting data on 488 runs). The multiple goal interacting Gaussian processes algorithm performs comparably with human teleoperators in crowd densities nearing 1 person/m², while a state-of-the-art noncooperative planner exhibits unsafe behavior more than 3 times as often as the multiple goal extension, and twice as often as the basic interacting Gaussian process approach. Furthermore, a reactive planner based on the widely used dynamic window approach proves insufficient for crowd densities above 0.55 people/m². For inclusive validation purposes, we also show that either our noncooperative planner or our reactive planner captures the salient characteristics of nearly any existing dynamic navigation algorithm. Based on these experimental results and theoretical observations, we conclude that a cooperation model is critical for safe and efficient robot navigation in dense human crowds.

Finally, we produce a large database of ground-truth pedestrian crowd data. We make this database publicly available for further scientific study of crowd prediction models, learning-from-demonstration algorithms, and human-robot interaction models in general.

Relevance:

100.00%

Publisher:

Abstract:

This thesis introduces fundamental equations and numerical methods for manipulating surfaces in three dimensions via conformal transformations. Conformal transformations are valuable in applications because they naturally preserve the integrity of geometric data. To date, however, there has been no clearly stated and consistent theory of conformal transformations that can be used to develop general-purpose geometry processing algorithms: previous methods for computing conformal maps have been restricted to the flat two-dimensional plane, or other spaces of constant curvature. In contrast, our formulation can be used to produce---for the first time---general surface deformations that are perfectly conformal in the limit of refinement. It is for this reason that we commandeer the title Conformal Geometry Processing.

The main contribution of this thesis is analysis and discretization of a certain time-independent Dirac equation, which plays a central role in our theory. Given an immersed surface, we wish to construct new immersions that (i) induce a conformally equivalent metric and (ii) exhibit a prescribed change in extrinsic curvature. Curvature determines the potential in the Dirac equation; the solution of this equation determines the geometry of the new surface. We derive the precise conditions under which curvature is allowed to evolve, and develop efficient numerical algorithms for solving the Dirac equation on triangulated surfaces.
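In the notation of the related spin-transformation literature (given here as background, not as the thesis's exact formulation), the differential of the new immersion \tilde{f} is obtained from the old immersion f through a quaternion-valued function \lambda on the surface,

    d\tilde{f} = \bar{\lambda}\, df\, \lambda,

and such a conformal deformation is integrable exactly when \lambda satisfies a time-independent Dirac equation of the form

    (D - \rho)\, \lambda = 0,

where D is an intrinsic Dirac operator on the surface and the potential \rho prescribes the change in curvature.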

From a practical perspective, this theory has a variety of benefits: conformal maps are desirable in geometry processing because they do not exhibit shear, and therefore preserve textures as well as the quality of the mesh itself. Our discretization yields a sparse linear system that is simple to build and can be used to efficiently edit surfaces by manipulating curvature and boundary data, as demonstrated via several mesh processing applications. We also present a formulation of Willmore flow for triangulated surfaces that permits extraordinarily large time steps and apply this algorithm to surface fairing, geometric modeling, and construction of constant mean curvature (CMC) surfaces.

Relevance:

100.00%

Publisher:

Abstract:

The interactions of N2, formic acid and acetone on the Ru(001) surface are studied using thermal desorption mass spectrometry (TDMS), electron energy loss spectroscopy (EELS), and computer modeling.

Low energy electron diffraction (LEED), EELS and TDMS were used to study chemisorption of N2 on Ru(001). Adsorption at 75 K produces two desorption states. Adsorption at 95 K fills only the higher energy desorption state and produces a (√3 x √3)R30° LEED pattern. EEL spectra indicate both desorption states are populated by N2 molecules bonded "on-top" of Ru atoms.

Monte Carlo simulation results for Ru(001) are presented, based on a kinetic lattice gas model with precursor-mediated adsorption, desorption and migration. The model gives good agreement with experimental data. The island growth rate was computed using the same model and is well fit by R(t)^m - R(t0)^m = At, with m approximately 8. The island size was determined from the width of the superlattice diffraction feature.

The techniques, algorithms and computer programs used for simulations are documented. Coordinate schemes for indexing sites on a 2-D hexagonal lattice, programs for simulation of adsorption and desorption, techniques for analysis of ordering, and computer graphics routines are discussed.
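As an illustration of one such coordinate scheme, the sketch below indexes sites of a triangular (hexagonally coordinated) lattice in axial coordinates with periodic boundaries; it is a generic example, not the thesis's actual programs.

    def hex_neighbors(i, j, L):
        """Six nearest neighbours of site (i, j) on an L x L triangular lattice
        stored in axial coordinates, with periodic boundary conditions."""
        steps = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]
        return [((i + di) % L, (j + dj) % L) for di, dj in steps]

    def site_index(i, j, L):
        """Map axial coordinates (i, j) to a flat array index."""
        return (i % L) * L + (j % L)

    # Example: the six neighbours of site (0, 0) on a 64 x 64 lattice
    # print(hex_neighbors(0, 0, 64))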

The adsorption of formic acid on Ru(001) has been studied by EELS and TDMS. Large exposures produce a molecular multilayer species. A monodentate formate, a bidentate formate, and a hydroxyl species are stable intermediates in formic acid decomposition. The monodentate formate species is converted to the bidentate species by heating. Formic acid decomposition products are CO2, CO, H2, H2O and oxygen adatoms. The ratio of desorbed CO to CO2 increases both with slower heating rates and with lower coverages.

The existence of two different forms of adsorbed acetone, side-on (bonded through the oxygen and the acyl carbon) and end-on (bonded through the oxygen), has been verified by EELS. On Pt(111), only the end-on species is observed. On clean Ru(001) and p(2 x 2)O precovered Ru(001), both forms coexist. The side-on species is dominant on clean Ru(001), while O stabilizes the end-on form. The end-on form desorbs molecularly. Bonding geometry stability is explained by surface Lewis acidity and by comparison to organometallic coordination complexes.

Relevance:

100.00%

Publisher:

Abstract:

The 0.2% experimental accuracy of the 1968 Beers and Hughes measurement of the annihilation lifetime of ortho-positronium motivates the attempt to compute the first-order quantum electrodynamic corrections to this lifetime. The theoretical problems arising in this computation are studied here in detail, up to the point of preparing the necessary computer programs and using them to carry out some of the less demanding steps -- but the computation has not yet been completed. Analytic evaluation of the contributing Feynman diagrams is superior to numerical evaluation, and for this process it can be carried out with the aid of the Reduce algebra manipulation computer program.

The relation of the positronium decay rate to the electron-positron annihilation-in-flight amplitude is derived in detail, and it is shown that at threshold annihilation-in-flight, Coulomb divergences appear while infrared divergences vanish. The threshold Coulomb divergences in the amplitude cancel against like divergences in the modulating continuum wave function.

Using the lowest order diagrams of electron-positron annihilation into three photons as a test case, various pitfalls of computer algebraic manipulation are discussed along with ways of avoiding them. The computer manipulation of artificial polynomial expressions is preferable to the direct treatment of rational expressions, even though redundant variables may have to be introduced.

Special properties of the contributing Feynman diagrams are discussed, including the need to restore gauge invariance to the sum of the virtual photon-photon scattering box diagrams by means of a finite subtraction.

A systematic approach to the Feynman-Brown method of decomposition of single-loop diagram integrals with spin-related tensor numerators is developed in detail. This approach allows the Feynman-Brown method to be straightforwardly programmed in the Reduce algebra manipulation language.

The fundamental integrals needed in the wake of the application of the Feynman-Brown decomposition are exhibited, and the methods which were used to evaluate them -- primarily dispersion techniques -- are briefly discussed.

Finally, it is pointed out that while the techniques discussed have permitted the computation of a fair number of the simpler integrals and diagrams contributing to the first order correction of the ortho-positronium annihilation rate, further progress with the more complicated diagrams and with the evaluation of traces is heavily contingent on obtaining access to adequate computer time and core capacity.