943 results for Analysis theory
Abstract:
This paper is a literature review that seeks to contribute to the understanding of the psychological processes underlying Roberts's theory of Lovemarks, which, within the field of marketing, has sought to replace the prevailing idea of what a brand is. The first part provides an introduction to the evolution of brands from a psychological and marketing perspective. The second part explains the Lovemarks theory with emphasis on its components: the love/respect axis and the characteristics of mystery, sensuality, and intimacy. In addition, the theory is supported with complementary literature and successful cases of application. The third part identifies and analyzes the psychological processes and aspects that explain the formation of a Lovemark: perception, memory, individual and social motivation, and emotion. The fourth and final part contains the conclusions and their implications for the formation of the relationship between the consumer and a brand.
Abstract:
An analysis of alternative forms of compensation in international investment disputes is relevant because a pecuniary award is not always the appropriate remedy for disputes arising between investors and States, and States may be increasingly interested in opting for a different type of compensation. Furthermore, it is still not clear whether arbitral tribunals have recognised alternative ways of awarding damages in international investment disputes. This analysis comprises two principal components: the first is to identify whether or not tribunals may render an award that not only demands the payment of a sum of money but also considers some other means of compensation; the second centres on how compliance with these non-pecuniary awards may be demanded. Our approach to these two components will always revolve around the idea of respecting the sovereignty of the State, bearing in mind that the execution of an arbitral award which obliges the State to refrain from or to perform an act in its territory relies precisely on the sovereignty of the State to execute it.
Abstract:
This thesis theoretically studies the relationship between the informal sector (in both the labor and the housing markets) and city structure.
Abstract:
Abstract based on that of the publication.
Abstract:
The energy decomposition scheme proposed in a recent paper has been realized by performing numerical integrations. The sample calculations carried out for some simple molecules show excellent agreement with the chemical picture of molecules, indicating that such an energy decomposition analysis can be useful for connecting quantum mechanics with genuine chemical concepts.
Abstract:
The basis set superposition error-free second-order Møller-Plesset perturbation theory of intermolecular interactions was studied. The difficulties of the counterpoise (CP) correction in open-shell systems were also discussed. The calculations were performed by a program which was used for testing the new variants of the theory. It was shown that the CP correction for the diabatic surfaces should be preferred to the adiabatic ones.
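For reference, the counterpoise correction referred to above is, in its standard Boys–Bernardi form (a textbook formula, not restated in the abstract), the interaction energy evaluated with every fragment in the full dimer basis:

\[
\Delta E_{\mathrm{int}}^{\mathrm{CP}} = E_{AB}^{\alpha\beta}(AB) - E_{A}^{\alpha\beta}(A) - E_{B}^{\alpha\beta}(B),
\]

where the superscript \(\alpha\beta\) denotes the combined basis set of monomers A and B, so that the basis set superposition error cancels between the terms.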
Abstract:
Quantum molecular similarity (QMS) techniques are used to assess the response of the electron density of various small molecules to the application of a static, uniform electric field. Likewise, QMS is used to analyze the changes in electron density generated by the process of floating a basis set. The results obtained show an interrelation between the floating process, the optimum geometry, and the presence of an external field. Cases involving the Le Chatelier principle are discussed, and an analysis of the changes in bond critical point properties, self-similarity values, and density differences is presented.
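As background, the standard overlap-type quantum molecular similarity measure and the Carbó index built from it (textbook definitions, not spelled out in the abstract) are

\[
Z_{AB} = \int \rho_A(\mathbf{r})\,\rho_B(\mathbf{r})\,d\mathbf{r},
\qquad
C_{AB} = \frac{Z_{AB}}{\sqrt{Z_{AA}\,Z_{BB}}},
\]

where \(\rho_A\) and \(\rho_B\) are the electron densities being compared; the self-similarity values mentioned above are the diagonal terms \(Z_{AA}\).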
Abstract:
A procedure based on quantum molecular similarity measures (QMSM) has been used to compare electron densities obtained from conventional ab initio and density functional methodologies at their respective optimized geometries. This method has been applied to a series of small molecules with experimentally known properties and molecular bonds of diverse degrees of ionicity and covalency. Results show that in most cases the electron densities obtained from density functional methodologies are of similar quality to post-Hartree-Fock generalized densities. For molecules where the Hartree-Fock methodology yields erroneous results, the density functional methodology is shown to usually yield more accurate densities than second-order Møller-Plesset perturbation theory.
Abstract:
A comparative systematic study of the CrO2F2 compound has been performed using different conventional ab initio methodologies and density functional procedures. Two points have been analyzed: first, the accuracy of the results yielded by each method under study, and second, the computational cost required to reach such results. Weighing up both aspects, density functional theory has been found to be more appropriate than the Hartree-Fock (HF) and the analyzed post-HF methods. Hence, the structural characterization and spectroscopic elucidation of the full CrO2X2 series (X = F, Cl, Br, I) has been carried out at this level of theory. Emphasis has been given to the unknown CrO2I2 species, and especially to the UV/visible spectra of all four compounds. Furthermore, a topological analysis in terms of charge density distributions has revealed why the valence shell electron pair repulsion model fails in predicting the molecular shape of such CrO2X2 complexes.
Abstract:
It is shown that, thanks to an extension of the definition of Topological Molecular Indices, one arrives at the formulation of indices related to Quantum Molecular Similarity theory. The connection between the two methodologies is made explicit: a theoretical framework solidly grounded in the theory of Quantum Mechanics can be connected with one of the oldest techniques used in QSPR studies. Results are shown for two example cases in which both methodologies are applied.
Abstract:
In the quantum mechanics literature it is common to find descriptors based on the pair density or the electron density, with varying success depending on the applications they address. For a descriptor to make chemical sense, it must provide the definition of an atom in a molecule, or be able to identify regions of molecular space associated with some chemical concept (such as a lone pair or a bonding region, among others). Along these lines, several partition schemes have been proposed: the theory of atoms in molecules (AIM), the electron localization function (ELF), Voronoi cells, Hirshfeld atoms, fuzzy atoms, etc. The aim of this thesis is to explore density descriptors based on partitions of molecular space of the AIM, ELF, or fuzzy-atom type, to analyze the existing descriptors at different levels of theory, to propose new aromaticity descriptors, and to study the ability of all these tools to discriminate between different reaction mechanisms.
Abstract:
A new method of clear-air turbulence (CAT) forecasting based on the Lighthill–Ford theory of spontaneous imbalance and emission of inertia–gravity waves has been derived and applied on episodic and seasonal time scales. A scale analysis of this shallow-water theory for midlatitude synoptic-scale flows identifies advection of relative vorticity as the leading-order source term. Examination of the leading- and second-order terms elucidates previous, more empirically inspired CAT forecast diagnostics. Application of the Lighthill–Ford theory to the Upper Mississippi and Ohio Valleys CAT outbreak of 9 March 2006 results in good agreement with pilot reports of turbulence. Application of Lighthill–Ford theory to CAT forecasting for the 3 November 2005–26 March 2006 period, using 1-h forecasts from the 1500 UTC runs of the Rapid Update Cycle 2 (RUC-2) model, leads to superior forecasts compared to the current operational version of the Graphical Turbulence Guidance (GTG1) algorithm, the most skillful operational CAT forecasting method in existence. The results suggest that major improvements in CAT forecasting could result if the methods presented herein become operational.
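In symbols (our notation; the abstract does not reproduce the equation), the leading-order source term identified by the scale analysis is the horizontal advection of relative vorticity,

\[
\mathbf{V} \cdot \nabla \zeta,
\qquad
\zeta = \frac{\partial v}{\partial x} - \frac{\partial u}{\partial y},
\]

so regions where the wind \(\mathbf{V}\) strongly advects \(\zeta\) are flagged as likely sources of spontaneously emitted inertia–gravity waves, and hence of CAT.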
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
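As a minimal sketch of the accumulation idea (hypothetical data and a balanced two-stage design for brevity; the paper itself supplies Fortran and REML code, neither of which is reproduced here), the variance component of the finest stage estimates the semivariance at the shortest lag, and coarser-stage components are added cumulatively for longer lags:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical balanced two-stage nested design: 10 main stations,
    # 4 substations each; true variance components chosen for illustration.
    n_main, n_sub = 10, 4
    c1_true, c2_true = 3.0, 1.0   # stage 1 (coarse lag), stage 2 (fine lag)

    y = (5.0
         + rng.normal(0.0, np.sqrt(c1_true), (n_main, 1))
         + rng.normal(0.0, np.sqrt(c2_true), (n_main, n_sub)))

    # Balanced hierarchical ANOVA mean squares.
    grand = y.mean()
    main_means = y.mean(axis=1)
    ms_between = n_sub * ((main_means - grand) ** 2).sum() / (n_main - 1)
    ms_within = ((y - main_means[:, None]) ** 2).sum() / (n_main * (n_sub - 1))

    # Variance components from the expected mean squares:
    # E[ms_within] = c2,  E[ms_between] = c2 + n_sub * c1.
    c2 = ms_within
    c1 = (ms_between - ms_within) / n_sub

    # Rough variogram: accumulate components starting from the shortest lag.
    lags = (10.0, 100.0)          # hypothetical separating distances (m)
    for lag, gamma in zip(lags, np.cumsum([c2, c1])):
        print(f"lag {lag:6.1f} m : semivariance ~ {gamma:.2f}")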