866 results for correlation distance


Relevance:

20.00%

Publisher:

Abstract:

We present the first relativistic many-electron SCF correlation diagram for a superheavy quasimolecule: Pb - Pb. The discussion shows a large number of quantitative as well as qualitative differences as compared with the known one-electron correlation diagram.

A realistic self-consistent-charge correlation-diagram calculation of the Kr²⁺–Kr²⁺ system has been performed. We obtain excellent agreement between the 4(3/2)_u level and an experimentally observed MO level at large distances. Possible reasons for discrepancies between experiment and theory at small distances are discussed.

The ground state (J = 0) electronic correlation energy of the 4-electron Be-sequence is calculated in the Multi-Configuration Dirac-Fock approximation for Z = 4-20. The 4 electrons were distributed over the configurations arising from the 1s, 2s, 2p, 3s, 3p and 3d orbitals. Theoretical values obtained here are in good agreement with experimental correlation energies.

It is found that the electric dipole polarizabilities of neutral atoms correlate very strongly with their first ionization potentials within groups of elements having the same angular momentum of the outermost electrons. As ionization potentials are known very accurately, this allows various atomic polarizabilities to be predicted to within 30%.
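The predictive use of such a correlation can be sketched with a fit in log-log space. A power law is only one plausible functional form, and every number below is a hypothetical stand-in, not the paper's data:

```python
import numpy as np

# Hedged sketch: within a group of elements sharing the same outer-electron
# angular momentum, fit polarizability against first ionization potential and
# use the fit for prediction. A power law is assumed; all values are hypothetical.
ip = np.array([5.4, 5.1, 4.3, 4.2, 3.9])               # ionization potentials (eV), hypothetical
alpha = np.array([164.0, 163.0, 290.0, 320.0, 400.0])  # polarizabilities (a.u.), hypothetical

# Linear fit in log-log space, i.e. alpha ≈ exp(intercept) * ip**slope
slope, intercept = np.polyfit(np.log(ip), np.log(alpha), 1)
predict = lambda ip_new: np.exp(intercept) * ip_new ** slope

alpha_est = predict(4.0)   # predicted polarizability for a hypothetical IP of 4.0 eV
```

The negative slope reflects the trend in the abstract: polarizability grows as the ionization potential falls.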

Correlation energies for all isoelectronic sequences of 2 to 20 electrons and Z = 2 to 25 are obtained by taking differences between theoretical total energies from Dirac-Fock calculations and experimental total energies. These are pure relativistic correlation energies, because relativistic and QED effects have already been accounted for. Both the theoretical and the experimental values are analysed critically in order to obtain values as accurate as possible. The correlation energies obtained show an essentially consistent behaviour from Z = 2 to 17. For Z > 17 inconsistencies occur, indicating errors in the experimental values which become very large for Z > 25.
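The extraction procedure is a simple difference of total energies; a minimal sketch, with hypothetical energy values rather than the paper's data:

```python
# Sketch of the definition used above: the correlation energy is the difference
# between the experimental total energy and the Dirac-Fock total energy, since
# relativistic and QED effects are already contained in both. Values hypothetical.
def correlation_energy(e_total_experimental, e_dirac_fock):
    return e_total_experimental - e_dirac_fock

e_corr = correlation_energy(-14.6684, -14.5752)   # hypothetical energies in hartree
```

The result is negative, as expected: the experimental total energy lies below the Dirac-Fock value.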

The interaction of short intense laser pulses with atoms and molecules produces a multitude of highly nonlinear processes requiring a non-perturbative treatment. Detailed study of these processes by numerically solving the time-dependent Schrödinger equation becomes a daunting task when the number of degrees of freedom is large, and the coupling between the electronic and nuclear degrees of freedom further aggravates the computational problem. In the present work we show that the time-dependent Hartree (TDH) approximation, which neglects correlation effects, gives an unreliable description of the system dynamics both in the absence and in the presence of an external field. A theoretical framework is required that treats the electrons and nuclei on an equal footing and fully quantum mechanically. To address this issue we discuss two approaches, namely multicomponent density functional theory (MCDFT) and the multiconfiguration time-dependent Hartree (MCTDH) method, that go beyond the TDH approximation and describe the correlated electron-nuclear dynamics accurately. In the MCDFT framework, where the time-dependent electronic and nuclear densities are the basic variables, we discuss an algorithm to calculate the exact Kohn-Sham (KS) potentials for small model systems. By simulating the photodissociation process in a model hydrogen molecular ion, we show that the exact KS potentials contain all the many-body effects and give insight into the system dynamics. In the MCTDH approach, the wave function is expanded as a sum of products of single-particle functions (SPFs). The MCTDH method is able to describe electron-nuclear correlation effects, as the SPFs and the expansion coefficients evolve in time and give an accurate description of the system dynamics.
We show that the MCTDH method is suitable to study a variety of processes such as the fragmentation of molecules, high-order harmonic generation, the two-center interference effect, and the lochfrass effect. We discuss these phenomena in a model hydrogen molecular ion and a model hydrogen molecule. Inclusion of absorbing boundaries in the mean-field approximation and its consequences are discussed using the model hydrogen molecular ion. To this end, two types of calculations are considered: (i) a variational approach with a complex absorbing potential included in the full many-particle Hamiltonian and (ii) an approach in the spirit of time-dependent density functional theory (TDDFT), including complex absorbing potentials in the single-particle equations. It is elucidated that for small grids the TDDFT approach is superior to the variational approach.
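The sum-of-products ansatz can be illustrated (this is an illustration of the ansatz, not of the MCTDH propagation itself) on a discretised two-particle amplitude: its SVD is exactly a sum of products of single-particle functions, where keeping a single product corresponds to the Hartree (TDH) level and adding terms recovers the correlation:

```python
import numpy as np

# Illustration of the sum-of-products (SPF) ansatz on a discretised two-particle
# amplitude psi(x1, x2). The SVD writes psi as a sum of products of one-particle
# vectors; one term is a Hartree product, more terms capture correlation.
# The amplitude here is a random stand-in, not a physical wave function.
rng = np.random.default_rng(2)
psi = rng.standard_normal((32, 32))
psi /= np.linalg.norm(psi)            # normalise the discretised amplitude
U, s, Vt = np.linalg.svd(psi)

def truncated(n_terms):
    """Sum of the first n_terms single-particle-function products."""
    return (U[:, :n_terms] * s[:n_terms]) @ Vt[:n_terms]

hartree_error = np.linalg.norm(psi - truncated(1))   # TDH: one product, large error
multi_error = np.linalg.norm(psi - truncated(16))    # more SPF products: smaller error
```

Truncating the SVD is optimal in the least-squares sense, so the error can only decrease as products are added, mirroring how MCTDH improves on TDH.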

This paper presents a new paradigm for signal reconstruction and superresolution, Correlation Kernel Analysis (CKA), that is based on the selection of a sparse set of bases from a large dictionary of class-specific basis functions. The basis functions that we use are the correlation functions of the class of signals we are analyzing. To choose the appropriate features from this large dictionary, we use Support Vector Machine (SVM) regression and compare this to traditional Principal Component Analysis (PCA) for the tasks of signal reconstruction, superresolution, and compression. The testbed we use in this paper is a set of images of pedestrians. This paper also presents results of experiments in which we use a dictionary of multiscale basis functions and then use Basis Pursuit De-Noising to obtain a sparse, multiscale approximation of a signal. The results are analyzed and we conclude that 1) when used with a sparse representation technique, the correlation function is an effective kernel for image reconstruction and superresolution, 2) for image compression, PCA and SVM have different tradeoffs, depending on the particular metric that is used to evaluate the results, 3) in sparse representation techniques, L_1 is not a good proxy for the true measure of sparsity, L_0, and 4) the L_epsilon norm may be a better error metric for image reconstruction and compression than the L_2 norm, though the exact psychophysical metric should take into account high order structure in images.
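As a point of reference for the PCA comparison, a rank-k PCA reconstruction can be sketched in a few lines (with random stand-in data, not the pedestrian-image testbed used in the paper):

```python
import numpy as np

# Sketch of the PCA baseline the paper compares against: project signals onto
# the top-k principal components and reconstruct. Data are random stand-ins.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 64))    # 100 hypothetical signals of length 64
mu = X.mean(axis=0)
Xc = X - mu                           # centre the data before the SVD
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)

def rank_k_error(k):
    """Relative reconstruction error using the top-k principal components."""
    X_hat = Xc @ Vt[:k].T @ Vt[:k] + mu
    return np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

More components always lower this error; the comparison in the paper concerns how such dense PCA reconstructions trade off against sparse correlation-kernel expansions under different metrics.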

We study four measures of problem instance behavior that might account for the observed differences in interior-point method (IPM) iterations when these methods are used to solve semidefinite programming (SDP) problem instances: (i) an aggregate geometry measure related to the primal and dual feasible regions (aspect ratios) and norms of the optimal solutions, (ii) the (Renegar-) condition measure C(d) of the data instance, (iii) a measure of the near-absence of strict complementarity of the optimal solution, and (iv) the level of degeneracy of the optimal solution. We compute these measures for the SDPLIB suite problem instances and measure the correlation between these measures and IPM iteration counts (solved using the software SDPT3) when the measures have finite values. Our conclusions are roughly as follows: the aggregate geometry measure is highly correlated with IPM iterations (CORR = 0.896), and is a very good predictor of IPM iterations, particularly for problem instances with solutions of small norm and aspect ratio. The condition measure C(d) is also correlated with IPM iterations, but less so than the aggregate geometry measure (CORR = 0.630). The near-absence of strict complementarity is weakly correlated with IPM iterations (CORR = 0.423). The level of degeneracy of the optimal solution is essentially uncorrelated with IPM iterations.
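The correlation computation described above can be sketched as follows, restricted (as in the paper) to instances where the measure takes a finite value; the numbers here are illustrative, not the SDPLIB/SDPT3 data:

```python
import numpy as np

# Sketch of correlating a per-instance behaviour measure with IPM iteration
# counts, keeping only entries where both values are finite. Values hypothetical.
def pearson_corr(x, y):
    """Pearson correlation over the entries where both values are finite."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mask = np.isfinite(x) & np.isfinite(y)
    return np.corrcoef(x[mask], y[mask])[0, 1]

geometry = np.array([1.0, 2.0, 4.0, 8.0, np.inf])   # hypothetical aggregate geometry measure
iterations = np.array([12, 15, 21, 30, 45])         # hypothetical IPM iteration counts
r = pearson_corr(np.log(geometry), iterations)      # infinite measure value is skipped
```

A CORR value near 1, as reported for the aggregate geometry measure, would indicate an almost linear relationship between the (transformed) measure and the iteration count.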

The quantitative estimation of Sea Surface Temperatures from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern core-top samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as the distance measure. Modern core-top datasets are characterised by a large number of zeros; zero replacement was carried out with a Bayesian approach based on a posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by a multiple approach considering the proxies correlation matrix, the Standardized Residual Sum of Squares and the Mean Squared Distance. The new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea. Key words: modern analogues, Aitchison distance, proxies correlation matrix, Standardized Residual Sum of Squares
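The Aitchison distance adopted as the distance measure has a compact definition: the Euclidean distance between centred log-ratio (clr) transforms of the two compositions. A minimal sketch, assuming zeros have already been replaced as in the paper's Bayesian step (the sample values are hypothetical):

```python
import numpy as np

# Sketch of the Aitchison distance between two compositions (e.g. a fossil
# assemblage and a modern core-top sample). Zeros must be replaced beforehand,
# since the clr transform takes logarithms. Sample values are hypothetical.
def aitchison_distance(x, y):
    """Euclidean distance between the centred log-ratio (clr) transforms."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    clr = lambda v: np.log(v) - np.log(v).mean()   # clr transform (scale-invariant)
    return np.linalg.norm(clr(x) - clr(y))

fossil = [0.5, 0.3, 0.2]     # hypothetical fossil assemblage proportions
coretop = [0.4, 0.4, 0.2]    # hypothetical modern core-top proportions
d = aitchison_distance(fossil, coretop)
```

Because the clr transform is scale-invariant, the distance is unaffected by whether the assemblages are expressed as counts or as proportions.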

Hypermedia systems based on the Web for open distance education are becoming increasingly popular as tools for user-driven access to learning information. Adaptive hypermedia is a new research direction within the area of user-adaptive systems, aiming to increase their functionality by making them personalized [Eklu 96]. This paper sketches a general agent architecture to provide navigational adaptability and user-friendly processes that guide and accompany the student during his/her learning on the PLAN-G hypermedia system (New Generation Telematics Platform to Support Open and Distance Learning), with the aid of computer networks and specifically WWW technology [Marz 98-1] [Marz 98-2]. The current PLAN-G prototype is successfully used with some informatics courses (the current version has no agents yet). The proposed multi-agent system contains two different types of adaptive autonomous software agents: Personal Digital Agents (Interface), which interact directly with the student when necessary, and Information Agents (Intermediaries), which filter and discover information to learn from and adapt the navigation space to a specific student.

Introduction: Slipped Capital Femoral Epiphysis (SCFE) is the most common hip disorder in adolescents between 9 and 16 years of age. It is idiopathic, more frequent in males, and is classified into 4 stages according to clinical and radiological criteria. We sought to evaluate the outcome of moderate and severe slips treated with one of the two proposed techniques. Methods: A descriptive study was carried out with patients who underwent in situ fixation or controlled surgical dislocation between 2008 and 2011. Results: 26 patients were included, of whom 65.4% underwent controlled surgical dislocation and 34.6% in situ fixation. 70.6% of patients had unstable SCFE and 70.5% had severe displacement. Evaluation with the WOMAC scale for pain, stiffness, and functional capacity found statistically significant (p < 0.05) advantages for the in situ fixation group, not only in pain, stiffness, and functional capacity but also in a lower frequency of complications. The most frequent complications in the controlled surgical dislocation group were one case of infection, 7 cases (41.2%) of avascular necrosis of the femoral head, 5 cases (29.4%) of chondrolysis, and 2 cases (11.8%) of pseudoarthrosis; in the in situ fixation group, only 1 patient (11.1%) presented surgical site infection and 1 (11.1%) chondrolysis. Differences were significant only for avascular necrosis. Discussion: Patients with moderate and severe slips managed with in situ fixation had better outcomes, with a lower proportion of complications.

In this paper we review models of volatility for a group of five Latin American countries, mainly motivated by the recent periods of financial turbulence. Our results, based on high-frequency data, suggest that dynamic multivariate models are more powerful for studying the volatility of asset returns than Constant Conditional Correlation models. For the group of countries included, we find that domestic asset-market volatilities have been increasing, but the co-volatility of the region is still moderate.
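The contrast between a constant and a time-varying conditional correlation can be illustrated with a simple rolling-window estimate (this is a sketch in the spirit of the comparison, not the GARCH-type models estimated in the paper; the return series are simulated stand-ins, not market data):

```python
import numpy as np

# Illustrative contrast: a single constant correlation estimate (CCC-style)
# versus a rolling-window, time-varying conditional correlation.
# Returns are simulated stand-ins, not Latin American market data.
rng = np.random.default_rng(1)
n, w = 500, 60                                # sample size and rolling-window length
common = rng.standard_normal(n)               # shared factor driving co-movement
r1 = common + 0.5 * rng.standard_normal(n)    # two hypothetical return series
r2 = common + 0.5 * rng.standard_normal(n)

constant_corr = np.corrcoef(r1, r2)[0, 1]     # one number for the whole sample
rolling = np.array([np.corrcoef(r1[t - w:t], r2[t - w:t])[0, 1]
                    for t in range(w, n)])    # conditional estimate over time
```

When the true correlation moves over time, the single constant estimate averages those movements away, which is the kind of limitation that motivates dynamic multivariate models.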