919 results for Best algebraic approximation
Abstract:
Based on a rigorous formulation of the integral equations for the propagation of light waves at a medium interface, we numerically solve for the random light field scattered from self-affine fractal surface samples. The light intensities produced by the same surface samples are also calculated in Kirchhoff's approximation, and comparison with the corresponding rigorous results directly shows the accuracy of the approximation. Kirchhoff's approximation is found to be accurate for random surfaces with small roughness w and large roughness exponent alpha. For random surfaces with larger w and smaller alpha, the approximation produces considerable errors, and detailed calculations show that the inaccuracy stems from the simplification that the transmitted light field is proportional to the incident field and from the neglect of the derivative of the light field at the interface.
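As a rough illustration of the quantities involved (a minimal sketch, not the authors' rigorous integral-equation solver), the following generates a 1D self-affine profile with rms roughness w and roughness exponent alpha by spectral synthesis and evaluates the scalar Kirchhoff (tangent-plane) far-field intensity for a Dirichlet surface; all parameter values are illustrative assumptions.

    import numpy as np

    def self_affine_surface(n, dx, w, alpha, rng):
        # Spectral synthesis: 1D self-affine profile, spectrum ~ |k|^-(1+2*alpha),
        # rescaled afterwards to the target rms roughness w.
        k = 2*np.pi*np.fft.rfftfreq(n, dx)
        amp = np.zeros(len(k))
        amp[1:] = k[1:]**(-(1 + 2*alpha)/2)
        spec = amp*np.exp(2j*np.pi*rng.random(len(k)))
        z = np.fft.irfft(spec, n)
        return z*(w/z.std())

    def kirchhoff_intensity(x, z, lam, theta_i, theta_s):
        # Scalar tangent-plane (Kirchhoff) far-field intensity, Dirichlet surface:
        # |integral exp(-i(qx*x + qz*z(x))) dx|^2, q = momentum transfer.
        kw = 2*np.pi/lam
        qx = kw*(np.sin(theta_s) - np.sin(theta_i))
        qz = -kw*(np.cos(theta_s) + np.cos(theta_i))
        amp = np.sum(np.exp(-1j*(qx*x + qz*z)))*(x[1] - x[0])
        return abs(amp)**2

    rng = np.random.default_rng(0)
    n, dx, lam = 4096, 0.05, 1.0                 # grid size, spacing, wavelength
    x = np.arange(n)*dx
    z = self_affine_surface(n, dx, w=0.1*lam, alpha=0.8, rng=rng)
    angles = np.deg2rad(np.linspace(-60, 60, 121))
    I = np.array([kirchhoff_intensity(x, z, lam, np.deg2rad(20), t) for t in angles])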
Abstract:
The Earth is highly heterogeneous, especially in the region close to its surface and in regions close to the core-mantle boundary (CMB). The lowermost mantle (the bottom 300 km of the mantle) hosts fast anomalies (S velocities 3% faster than PREM, modeled from Scd), slow anomalies (S velocities 3% slower than PREM, modeled from S and ScS), and extremely anomalous structure (the ultra-low velocity zone, 30% lower in S velocity and 10% lower in P velocity). Strong anomalies of larger dimension are also observed beneath Africa and the Pacific, originally modeled from the travel times of S, SKS, and ScS. Given the heterogeneous nature of the Earth, approaches more accurate than travel-time analysis must be applied to study the details of these anomalous structures, and matching waveforms with synthetic seismograms has proven effective in constraining velocity structures. However, it is difficult to compute synthetic seismograms in more than 1D, where no exact analytical solution is possible, and numerical methods such as finite differences or finite elements are too time-consuming for modeling body waveforms. We developed a 2D synthetic algorithm, extended from 1D generalized ray theory (GRT), to compute synthetic seismograms efficiently (each seismogram takes only minutes). This 2D algorithm is related to the WKB approximation but is based on different principles; it is therefore named WKM, for "WKB modified." WKM has been applied to study the lateral variation of the fast D" structure beneath the Caribbean Sea, to study the plume beneath Africa, and to study PKP precursors, a seismic phase important for modeling lower-mantle heterogeneity. By matching WKM synthetic seismograms with various data, we discovered and confirmed that: (a) D" beneath the Caribbean varies laterally, and the variation is best revealed by Scd+Sab beyond 88 degrees, where Scd overruns Sab. (b) The low-velocity structure beneath Africa is about 1500 km in height and at least 1000 km in width, and features a 3% reduction in S velocity; it consists of a relatively thin low-velocity layer (200 km thick or less) beneath the Atlantic that rises sharply into the mid mantle towards Africa. (c) At the edges of this huge African low-velocity structure, ULVZs are found by modeling the large separation between S and ScS beyond 100 degrees. The ULVZ at the eastern boundary was discovered with SKPdS data and later confirmed by PKP precursor data. This is the first time that a ULVZ has been verified with distinct seismic phases.
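For scale, a back-of-the-envelope check (a sketch only, not the WKM algorithm) of the differential delay such a thin low-velocity layer produces at vertical incidence, assuming a round-number PREM-like shear speed near the CMB:

    # Two-way vertical delay from a basal low-velocity layer:
    # dt = -2 * integral(dv/v^2) dz, with dv = -3% of v.
    v_s = 7.2          # km/s, assumed round-number shear speed near the CMB
    thickness = 200.0  # km, thin low-velocity layer as in the Atlantic model above
    dv = -0.03*v_s
    delay = -2*dv/v_s**2*thickness
    print(f"two-way ScS-type delay: {delay:.2f} s")   # ~1.7 s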
Abstract:
This undergraduate thesis (Trabajo de Fin de Grado) studies the use of Gross Domestic Product (GDP) as a measure of welfare. Alongside an analysis of national accounting, and of GDP in greater depth, several alternative measures are described. These have emerged from the search for indicators that include in their measurements not only economic growth but also other relevant variables that GDP does not capture (such as natural resources, pollution, quality of life, or the equality of a country's income distribution). Thus, new indicators have arisen, such as environmentally adjusted GDP, the Better Life Index, the Genuine Progress Indicator, and the Ecological Footprint, among others. This analysis concludes that the measure that best reflects welfare depends on the specific question under investigation; in other words, no single measure today can answer every question about a population's welfare. GDP, however, is not configured as (nor was it created to be) a measure of a country's welfare. It is therefore currently necessary to complement this indicator with alternative ones so as to obtain a more complete picture of a country's level of welfare.
Abstract:
The study of codes, classically motivated by the need to communicate information reliably in the presence of error, has found new life in fields as diverse as network communication and distributed storage of data, and even has connections to the design of linear measurements used in compressive sensing. But in every context, a code typically involves exploiting the algebraic or geometric structure underlying an application. In this thesis, we examine several problems in coding theory and try to gain some insight into the algebraic structure behind them.
The first is the study of the entropy region - the space of all possible vectors of joint entropies which can arise from a set of discrete random variables. Understanding this region is essentially the key to optimizing network codes for a given network. To this end, we employ a group-theoretic method of constructing random variables producing so-called "group-characterizable" entropy vectors, which are capable of approximating any point in the entropy region. We show how small groups can be used to produce entropy vectors which violate the Ingleton inequality, a fundamental bound on entropy vectors arising from the random variables involved in linear network codes. We discuss the suitability of these groups to design codes for networks which could potentially outperform linear coding.
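A minimal sketch of the group-characterizable construction, using the symmetric group S3 and four of its subgroups (chosen arbitrarily for illustration): the joint entropy of a subset S of the induced random variables is log2(|G|/|intersection of G_i, i in S|), and the Ingleton expression can be evaluated directly. Small groups such as this one satisfy Ingleton; violating examples require considerably larger groups.

    import math
    from itertools import permutations

    def compose(p, q):                     # permutation product (p o q) as index tuples
        return tuple(p[i] for i in q)

    def closure(gens, e):                  # subgroup generated by gens
        H = {e} | set(gens)
        while True:
            new = {compose(a, b) for a in H for b in H} - H
            if not new:
                return H
            H |= new

    G = set(permutations(range(3)))        # S3, |G| = 6
    e = (0, 1, 2)
    subs = {1: closure([(1, 0, 2)], e),    # <(01)>
            2: closure([(2, 1, 0)], e),    # <(02)>
            3: closure([(0, 2, 1)], e),    # <(12)>
            4: closure([(1, 2, 0)], e)}    # <(012)>

    def h(S):                              # group-characterizable joint entropy
        inter = set(G)
        for i in S:
            inter &= subs[i]
        return math.log2(len(G)/len(inter))

    # Ingleton in entropy form: slack >= 0 for linearly representable vectors.
    slack = (h({1,3}) + h({2,3}) + h({1,4}) + h({2,4}) + h({1,2})
             - h({1,2,3}) - h({1,2,4}) - h({3,4}) - h({1}) - h({2}))
    print("Ingleton slack:", slack)        # positive here: S3 does not violate Ingleton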
The second topic we discuss is the design of frames with low coherence, closely related to finding spherical codes in which the codewords are unit vectors spaced out around the unit sphere so as to minimize the magnitudes of their mutual inner products. We show how to build frames by selecting a cleverly chosen set of representations of a finite group to produce a "group code" as described by Slepian decades ago. We go on to reinterpret our method as selecting a subset of rows of a group Fourier matrix, allowing us to study and bound our frames' coherences using character theory. We discuss the usefulness of our frames in sparse signal recovery using linear measurements.
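One concrete instance of the row-selection idea (a harmonic frame over a cyclic group, not the thesis' more general group-code construction): selecting rows of the DFT matrix indexed by a difference set yields a unit-norm frame whose coherence meets the Welch lower bound.

    import numpy as np

    n, rows = 7, [1, 2, 4]                 # quadratic residues mod 7: a difference set
    F = np.exp(-2j*np.pi*np.outer(np.arange(n), np.arange(n))/n)
    V = F[rows, :]/np.sqrt(len(rows))      # columns: 7 unit-norm frame vectors in C^3
    gram = V.conj().T @ V
    coherence = np.abs(gram - np.eye(n)).max()
    welch = np.sqrt((n - len(rows))/(len(rows)*(n - 1)))
    print(coherence, welch)                # both ~0.4714: Welch bound achieved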
The final problem we investigate is that of coding with constraints, most recently motivated by the demand for ways to encode large amounts of data using error-correcting codes so that any small loss can be recovered from a small set of surviving data. Most often, this involves using a systematic linear error-correcting code in which each parity symbol is constrained to be a function of some subset of the message symbols. We derive bounds on the minimum distance of such a code based on its constraints, and characterize when these bounds can be achieved using subcodes of Reed-Solomon codes.
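A small demonstration of the Reed-Solomon ingredient, without the constraints (a brute-force sketch, not the thesis' constructions): evaluating low-degree polynomials over GF(11) and checking that the minimum distance attains the Singleton bound d = n - k + 1.

    from itertools import product

    p, n, k = 11, 10, 3                    # Reed-Solomon over GF(11), length 10, dim 3
    xs = list(range(1, n + 1))             # distinct evaluation points

    def encode(msg):                       # evaluate the message polynomial at xs
        return [sum(c*pow(x, i, p) for i, c in enumerate(msg)) % p for x in xs]

    min_wt = min(sum(s != 0 for s in encode(m))
                 for m in product(range(p), repeat=k) if any(m))
    print(min_wt, n - k + 1)               # 8 and 8: the code is MDS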
Abstract:
Part I: The mobilities of photo-generated electrons and holes in orthorhombic sulfur are determined by drift-mobility techniques. At room temperature, electron mobilities between 0.4 cm²/V-sec and 4.8 cm²/V-sec and hole mobilities of about 5.0 cm²/V-sec are reported. The temperature dependence of the electron mobility is attributed to a level of traps whose effective depth is about 0.12 eV. This value is further supported both by the voltage dependence of the space-charge-limited D.C. photocurrents and by measurements of photocurrent versus photon energy.
As the field is increased from 10 kV/cm to 30 kV/cm, a second mechanism for electron transport becomes appreciable and eventually dominates. Evidence is presented that this is due to impurity-band conduction at an appreciably lower mobility (4×10⁻⁴ cm²/V-sec). No low-mobility hole current could be detected. When fields exceeding 30 kV/cm for electron transport and 35 kV/cm for hole transport are applied, avalanche phenomena are observed. The results obtained are consistent with recent energy-gap studies in sulfur.
The theory of the transport of photo-generated carriers is modified to include the case of appreciable thermal regeneration from the traps within one transit time.
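For orientation only, a sketch of the standard shallow-trap (multiple-trapping) scaling that a 0.12 eV trap depth implies for the temperature dependence of the drift mobility; the temperatures are arbitrary and this is not the modified transport theory of the thesis.

    import numpy as np

    kB = 8.617e-5                          # Boltzmann constant, eV/K
    Et = 0.12                              # trap depth quoted above, eV
    T = np.array([250.0, 300.0, 350.0])    # illustrative temperatures, K
    # Trap-limited drift mobility ~ exp(-Et/(kB*T)), normalized to its 300 K value.
    mu = np.exp(-Et/(kB*T))/np.exp(-Et/(kB*300.0))
    print(dict(zip(T, mu.round(2))))       # roughly {250: 0.40, 300: 1.0, 350: 1.94}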
Part II: An explicit formula for the electric field E necessary to accelerate an electron to a steady-state velocity v in a polarizable crystal at arbitrary temperature is determined via two methods utilizing Feynman path integrals. No approximation is made regarding the magnitude of the velocity or the strength of the field; however, the actual electron-lattice Coulombic interaction is approximated by a distribution of harmonic-oscillator potentials. One may be able to find the “best possible” distribution of oscillators using a variational principle, but we have not been able to find the expected criterion. Our result is, however, relatively insensitive to the actual distribution of oscillators used, and our E-v relationship exhibits the physical behavior expected for the polaron. Threshold fields for ejecting the electron from the polaron state are calculated for several substances using numerical results for a simple oscillator distribution.
Abstract:
Part I
Solutions of Schrödinger’s equation for systems of two particles bound in various stationary one-dimensional potential wells and repelling each other with a Coulomb force are obtained by the method of finite differences. The general properties of such systems are worked out in detail for the case of two electrons in an infinite square well. For small well widths (1-10 a.u.) the energy levels lie above those of the noninteracting-particle model by as much as a factor of 4, although excitation energies are only half again as great. The analytical form of the solutions is obtained, and it is shown that every eigenstate is doubly degenerate due to the “pathological” nature of the one-dimensional Coulomb potential. This degeneracy is verified numerically by the finite-difference method. The properties of the square-well system are compared with those of the free-electron and hard-sphere models; perturbation and variational treatments are also carried out using the hard-sphere Hamiltonian as a zeroth-order approximation. The lowest several finite-difference eigenvalues converge from below with decreasing mesh size to energies below those of the “best” linear variational function consisting of hard-sphere eigenfunctions. The finite-difference solutions in general yield expectation values and matrix elements as accurate as those obtained using the “best” variational function.
The system of two electrons in a parabolic well is also treated by finite differences. In this system it is possible to separate the center-of-mass motion and hence to effect a considerable numerical simplification. It is shown that the pathological one-dimensional Coulomb potential gives rise to doubly degenerate eigenstates for the parabolic well in exactly the same manner as for the infinite square well.
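The single-particle kernel of such a finite-difference calculation is easy to reproduce. The sketch below (atomic units, three-point Laplacian, Dirichlet walls; grid size chosen arbitrarily) shows the eigenvalues of a particle in an infinite square well converging to the exact values from below, as noted above; the two-electron problem replaces this grid by a 2D mesh over (x1, x2) with the Coulomb term added.

    import numpy as np

    L, m = 1.0, 200                        # well width (a.u.), interior grid points
    hstep = L/(m + 1)
    # -(1/2) d2/dx2 with psi = 0 at both walls: a tridiagonal matrix.
    H = (np.diag(np.full(m, 1.0/hstep**2))
         + np.diag(np.full(m - 1, -0.5/hstep**2), 1)
         + np.diag(np.full(m - 1, -0.5/hstep**2), -1))
    E_fd = np.linalg.eigvalsh(H)[:3]
    E_exact = np.array([1, 4, 9])*np.pi**2/(2*L**2)
    print(E_fd)                            # slightly below the exact values
    print(E_exact)                         # n^2 pi^2 / (2 L^2)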
Part II
A general method of treating inelastic collisions quantum mechanically is developed and applied to several one-dimensional models. The formalism is first developed for nonreactive “vibrational” excitations of a bound system by an incident free particle. It is then extended to treat simple exchange reactions of the form A + BC → AB + C. The method consists essentially of finding a set of linearly independent solutions of the Schrödinger equation such that each solution of the set satisfies a distinct, yet arbitrary boundary condition specified in the asymptotic region. These linearly independent solutions are then combined to form a total scattering wavefunction having the correct asymptotic form. The method of finite differences is used to determine the linearly independent functions.
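A minimal 1D illustration of this matching idea (a square tunneling barrier, not one of the thesis' models): integrate Schrödinger's equation from the transmitted side, where the boundary condition is a pure outgoing wave, then expand the solution on the incident side in the two linearly independent asymptotic waves and read off the transmission probability.

    import numpy as np

    E, V0, a = 1.0, 2.0, 1.0               # energy, barrier height, width (a.u.)
    k = np.sqrt(2*E)
    V = lambda x: V0 if 0.0 <= x <= a else 0.0
    f = lambda x, y: np.array([y[1], 2*(V(x) - E)*y[0]])

    def rk4(x, y, dx):                     # one Runge-Kutta step for y = (psi, psi')
        k1 = f(x, y); k2 = f(x + dx/2, y + dx/2*k1)
        k3 = f(x + dx/2, y + dx/2*k2); k4 = f(x + dx, y + dx*k3)
        return y + dx/6*(k1 + 2*k2 + 2*k3 + k4)

    x, dx = a + 1.0, -1e-3                 # start on the transmitted (right) side
    y = np.array([np.exp(1j*k*x), 1j*k*np.exp(1j*k*x)])   # pure outgoing wave
    while x > -1.0:
        y = rk4(x, y, dx); x += dx
    # Incident side: psi = A e^{ikx} + B e^{-ikx}; solve the 2x2 system for A, B.
    M = np.array([[np.exp(1j*k*x), np.exp(-1j*k*x)],
                  [1j*k*np.exp(1j*k*x), -1j*k*np.exp(-1j*k*x)]])
    A, B = np.linalg.solve(M, y)
    T = 1/abs(A)**2                        # transmitted amplitude was normalized to 1
    kap = np.sqrt(2*(V0 - E))
    T_exact = 1/(1 + V0**2*np.sinh(kap*a)**2/(4*E*(V0 - E)))
    print(T, T_exact)                      # the two agree closely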
The theory is applied to the impulsive collision of a free particle with a particle bound in (1) an infinite square well and (2) a parabolic well. Calculated transition probabilities agree well with previously obtained values.
Several models for the exchange reaction involving three identical particles are also treated: (1) infinite-square-well potential surface, in which all three particles interact as hard spheres and each two-particle subsystem (i.e. BC and AB) is bound by an attractive infinite-square-well potential; (2) truncated parabolic potential surface, in which the two-particle subsystems are bound by a harmonic oscillator potential which becomes infinite for interparticle separations greater than a certain value; (3) parabolic (untruncated) surface. Although there are no published values with which to compare our reaction probabilities, several independent checks on internal consistency indicate that the results are reliable.
Performance preserving frequency weighted controller approximation: a coprime factorization approach
Abstract:
An approximate analytical description of the fundamental-mode fields of graded-index fibers is explicitly presented by use of the power-series expansion method, the maximum-value condition at the fiber axis, the decay properties of fundamental-mode fields at large distance from the fiber axis, and the approximate modal parameter U obtained from the Gaussian approximation. This analytical description is much more accurate than the Gaussian approximation while retaining the latter's simplicity. As two special examples, we present approximate analytical formulas for the fundamental-mode fields of a step-profile fiber and a Gaussian-profile fiber, and by comparing them with the corresponding exact solutions we find that both are highly accurate in the single-mode range.
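For comparison, the Gaussian approximation that this description improves upon is compactly summarized by Marcuse's fit for the fundamental-mode spot size of a step-index fiber; the sketch below is that standard fit, not the paper's improved formulas.

    import numpy as np

    def spot_size_over_a(V):
        # Marcuse's fit for the Gaussian-approximation spot size, step-index fiber.
        return 0.65 + 1.619*V**-1.5 + 2.879*V**-6

    V = 2.2                                # inside the single-mode range (V < 2.405)
    w = spot_size_over_a(V)                # spot size in units of the core radius a
    r = np.linspace(0.0, 3.0, 301)
    psi = np.exp(-(r/w)**2)                # Gaussian trial field for the LP01 mode
    print(w)                               # ~1.17 core radii at V = 2.2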
Abstract:
A relatively simple transform from an arbitrary solution of the paraxial wave equation to the corresponding exact solution of the Helmholtz wave equation is derived under the condition that evanescent waves are ignored, and it is used to study the corrections to the paraxial approximation of an arbitrary freely propagating beam. Specifically, the general lowest-order correction field is given in a very simple form and is proved to be exactly consistent with the perturbation method developed by Lax et al. [Phys. Rev. A 11, 1365 (1975)]. Some special examples, such as the lowest-order correction to the paraxial approximation of a fundamental Gaussian beam whose waist plane is shifted parallel to the z = 0 plane, are presented. (C) 1998 Optical Society of America.
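For reference, the structure of the Lax et al. hierarchy with which the transform is shown to be consistent can be sketched as follows for the slowly varying amplitude psi of E = psi exp(ikz) (unscaled coordinates; a summary of the standard expansion, not the paper's transform itself):

    \psi = \psi^{(0)} + \psi^{(2)} + \cdots, \qquad
    \left(\nabla_\perp^2 + 2ik\,\partial_z\right)\psi^{(0)} = 0, \qquad
    \left(\nabla_\perp^2 + 2ik\,\partial_z\right)\psi^{(2)} = -\,\partial_z^2\,\psi^{(0)},

with each successive term smaller by a factor of order (k w_0)^{-2} for a beam of waist w_0; the first equation is the paraxial equation and the second gives the lowest-order correction.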
Abstract:
The coupled differential recurrence equations for the corrections to the paraxial approximation solutions in transversely nonuniform refractive-index media are established in terms of the perturbation method. All the corrections (including the longitudinal field corrections) to the paraxial approximation solutions are presented in the weak-guidance approximation. As a concrete application, the first-order longitudinal field correction and the second-order transverse field correction to the paraxial approximation of a Gaussian beam propagating in a transversely quadratic refractive index medium are analytically investigated. (C) 1999 Optical Society of America [S0740-3232(99)00310-5].
Abstract:
The 9th International Test Commission (ITC) Conference took place at the Miramar Palace in San Sebastian, Spain, from the 2nd to the 5th of July, 2014. The Conference was titled “Global and Local Challenges for Best Practices in Assessment.” The International Test Commission, ITC (www.intestcom.org), is an association of national psychological associations, test commissions, publishers, and other organizations, as well as individuals, committed to the promotion of effective testing and assessment policies and to the proper development, evaluation, and use of educational and psychological instruments. The ITC facilitates the exchange of information among members and stimulates their cooperation on problems related to the construction, distribution, and use of psychological and educational tests and other psychodiagnostic tools. This volume contains the abstracts of the contributions presented at the 9th International Test Commission Conference. The four themes of the Conference were closely linked to the goals of the ITC: - Challenges and Opportunities in International Assessment. - Application of New Technologies and New Psychometric Models in Testing. - Standards and Guidelines for Best Testing Practices. - Testing in Multilingual and Multicultural Contexts.