919 results for one-boson-exchange models


Relevance:

30.00%

Publisher:

Abstract:

Throughout life, every person establishes numerous relationships of many kinds, and through them exchanges and builds experiences, knowledge, and ways of feeling and seeing life, as well as sharing needs, interests and affections (Pozo, Alonso, Hernández & Martos, 2005). This postulate is one of the main arguments motivating this theoretical review; accordingly, the present document describes the results of a theoretical review of social relationships in the organizational sphere and of the importance they have in that particular context. The objectives guiding the review were: to establish the essential attributes of the concept of social relationships; to identify the psychological approaches that have contributed to the understanding of this phenomenon; to determine the research methods those approaches have used in the study of social relationships; to describe some of the findings of psychological research on the impact of social relationships in the workplace; and to identify the current trends through which social relationships are studied in organizational contexts. To meet these objectives, the document explores the concept and characteristics of social relationships, the existing theoretical perspectives and models for their study, the methodologies that have been employed and, finally, the role of social relationships in organizational contexts.

Dependence between financial series is a fundamental parameter in the estimation of risk models. Value at Risk (VaR) is one of the most important measures used in financial risk management. Different estimation methods currently exist, such as historical simulation, which assumes no distribution for the returns of the risk factors or assets, and parametric methods, which assume normally distributed returns. This paper introduces copula theory as a measure of dependence between series, and estimates an ARMA-GARCH-copula model to compute the Value at Risk of a portfolio composed of two financial series, the Dollar-Peso and Euro-Peso exchange rates. The results show that VaR estimation via copulas is more accurate than the traditional methods.
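As a sketch of the two estimation approaches the abstract contrasts, the snippet below computes a one-day 99% VaR for a two-asset portfolio by historical simulation (no distributional assumption) and by the variance-covariance method (normality assumed). The return series are synthetic stand-ins for the Dollar-Peso and Euro-Peso rates, not the data used in the paper, and the ARMA-GARCH-copula stage is omitted.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
# Synthetic daily log-returns standing in for the Dollar-Peso and Euro-Peso rates
usd = rng.normal(0.0, 0.008, 1000)
eur = 0.6 * usd + rng.normal(0.0, 0.006, 1000)   # correlated, as the real series would be

weights = np.array([0.5, 0.5])
portfolio = np.column_stack([usd, eur]) @ weights

# Historical simulation: empirical 1% quantile of returns, no distributional assumption
var_hist = -np.quantile(portfolio, 0.01)

# Variance-covariance method: assumes jointly normal returns
sigma = np.sqrt(weights @ np.cov(usd, eur) @ weights)
var_param = -NormalDist().inv_cdf(0.01) * sigma
```

On normal synthetic data the two estimates nearly coincide; the paper's point is that with real, fat-tailed, copula-dependent series they diverge.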

This paper provides a complete generalization of background risk models. First, it relaxes the independence assumption; second, it adopts a general functional form; third, it adopts a general type of risk. Furthermore, it introduces a new general form of background risk.

The following study describes a theoretical approach to capital budgeting models. Its fundamental objective is to understand their focus and their importance when a company's directors make investment decisions and anticipate the future effects of those decisions. Given that capital budgeting models are tools for analysing a company's potential capital expenditures, this research project defines the different models in theoretical and methodological terms, explaining the concepts related to the topic. It also explains some of the financial indicators companies use to measure and estimate the firm's 'financial health', and details their impact on the durability of these entities, which gives a more general view of the importance of financial indicators and of their positive impact on the evolution and growth of the organization. In addition, the study addresses capital budgeting as applied specifically to the management of enterprises, whether private or public (state and governmental). In this regard, it draws on concepts developed by various scholars that outline possible improvements to budgeting in the sectors to which particular entities belong. Finally, the conclusions that emerged during the construction of the research document are presented explicitly, in order to fulfil the general objective of the work, which constitutes an answer to the research question stated in the body of the document.
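To make the budgeting models concrete, here is a minimal sketch of the two workhorse criteria of capital budgeting, net present value and internal rate of return; the cash flows and the 10% hurdle rate are invented for illustration, not drawn from the study.

```python
# Hypothetical project: initial outlay at t=0 followed by yearly cash inflows
cash_flows = [-1000.0, 300.0, 400.0, 500.0, 200.0]

def npv(rate, flows):
    """Net present value: discount each cash flow back to t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection: the rate at which NPV = 0."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

project_npv = npv(0.10, cash_flows)   # accept if positive at the hurdle rate
project_irr = irr(cash_flows)         # accept if above the cost of capital
```

For these numbers the project clears a 10% hurdle (positive NPV) and the IRR sits a few points above it, which is exactly the accept/reject comparison the models formalize.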

The present work provides a generalization of Mayer's energy decomposition for the density-functional theory (DFT) case. It is shown that one- and two-atom Hartree-Fock energy components in Mayer's approach can be represented as the action of a one-atom potential V_A on a one-atom density ρ_A or ρ_B. To treat the exchange-correlation term in the DFT energy expression in a similar way, the exchange-correlation energy density per electron is expanded into a linear combination of basis functions. Calculations carried out for a number of density functionals demonstrate that the DFT and Hartree-Fock two-atom energies agree to a reasonable extent with each other. The two-atom energies for strong covalent bonds are within the range of typical bond dissociation energies and are therefore a convenient computational tool for assessing individual bond strengths in polyatomic molecules. For nonspecific nonbonding interactions, the two-atom energies are low. They can be either repulsive or slightly attractive, but the DFT results more frequently yield small attractive values than the Hartree-Fock case. The hydrogen bond in the water dimer is calculated to lie between the strong covalent and nonbonding interactions on the energy scale.

A conceptually new approach is introduced for the decomposition of the molecular energy calculated at the density functional level of theory into a sum of one- and two-atomic energy components, and is realized in the "fuzzy atoms" framework. (Fuzzy atoms mean that the three-dimensional physical space is divided into atomic regions having no sharp boundaries but exhibiting a continuous transition from one to another.) The new scheme uses the new concept of "bond order density" to calculate the diatomic exchange energy components, and gives values for them unexpectedly close to those calculated with the exact (Hartree-Fock) exchange for the same Kohn-Sham orbitals.

In recent years, Artificial Intelligence has helped solve problems encountered in the tasks performed by computing units, whether the computers are distributed so as to interact with one another or operate in any other environment (Distributed Artificial Intelligence). Information Technologies enable novel solutions to specific problems through the application of findings from diverse research areas. Our work addresses the creation of user models through a multidisciplinary approach, applying principles from psychology, distributed artificial intelligence and machine learning to build user models in open environments; one such environment is Ambient Intelligence based on user models with incremental, distributed learning functions (known as Smart User Models). Building on these user models, this research addresses the acquisition of the user characteristics that matter most and that determine the user's dominant scale of values in the subjects of greatest interest to him or her, developing a methodology to obtain the user's Scale of Human Values with respect to objective, subjective and emotional characteristics (particularly in Recommender Systems). The inclusion of the scale of human values in information systems is an area that has received little research attention: recommender systems, user models and information systems take into account only the user's preferences and emotions [Velásquez, 1996, 1997; Goldspink, 2000; Conte and Paolucci, 2001; Urban and Schmidt, 2001; Dal Forno and Merlone, 2001, 2002; Berkovsky et al., 2007c]. The main focus of our research is therefore a methodology for generating a scale of human values for the user from the user model.
We present results from a case study using objective, subjective and emotional characteristics in the banking and restaurant service domains, where the proposed methodology was put to the test. The main contributions of this thesis are: the development of a methodology that, given a user model with objective, subjective and emotional attributes, obtains the user's Scale of Human Values. The proposed methodology builds on existing applications, in which all the connections between users, agents and domains are already characterized by these particularities and attributes; no extra effort is therefore required from the user.

A new technique is described for the analysis of cloud-resolving model simulations, which allows one to investigate the statistics of the lifecycles of cumulus clouds. Clouds are tracked from timestep to timestep within the model run. This allows for a very simple method of tracking, but one which is both comprehensive and robust. An approach for handling cloud splits and mergers is described which allows clouds with simple and complicated time histories to be compared within a single framework. This is found to be important for the analysis of an idealized simulation of radiative-convective equilibrium, in which the moist, buoyant updrafts (i.e., the convective cores) were tracked. Around half of all such cores were subject to splits and mergers during their lifecycles. For cores without any such events, the average lifetime is 30 min, but events can lengthen the typical lifetime considerably.
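A toy version of the timestep-to-timestep tracking idea can be sketched as follows: clouds are reduced to sets of grid-cell indices, and a link exists wherever a cloud shares cells with one in the next timestep. The labelling scheme and data are invented for illustration, not the paper's implementation.

```python
# Clouds at two consecutive timesteps, each reduced to a set of grid-cell indices
t0 = {1: {(0, 0), (0, 1), (1, 0)}, 2: {(5, 5), (5, 6)}}
t1 = {1: {(0, 1), (1, 1)},          # shares a cell with cloud 1 at t0
      2: {(5, 6), (6, 6)},          # shares a cell with cloud 2 at t0
      3: {(1, 0), (2, 0)}}          # also shares a cell with cloud 1: a split

def links(prev, curr):
    """Link clouds across timesteps wherever their cell sets overlap."""
    return [(a, b) for a, pa in prev.items()
                   for b, pb in curr.items() if pa & pb]

edges = links(t0, t1)
# A split: one cloud linked to more than one successor (a merger is the converse,
# one cloud with more than one predecessor)
splits = {a for a, _ in edges
          if sum(1 for a2, _ in edges if a2 == a) > 1}
```

Chaining such links over all timesteps yields the lifecycle graph from which statistics like the 30 min average lifetime can be read off.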

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, preventing it from accumulating on the remote system and allowing the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and lets users interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that constrains model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run;
(5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.

A traditional method of validating the performance of a flood model when remotely sensed data of the flood extent are available is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LIDAR) has made more straightforward the synoptic measurement of water surface elevations along flood waterlines, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1 in 5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. A result of this was that there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. 
The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows for the decomposition of the reach into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may lead to an increased onus being placed on the model developer in the production of a valid model.
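The two performance measures being compared can be illustrated on synthetic data: an areal fit statistic over wet/dry masks (one common form is F = |A∩B| / |A∪B|) and the r.m.s. difference of corresponding waterline elevations. The arrays below are invented toy values, not SAR observations or LISFLOOD-FP output.

```python
import numpy as np

# Toy wet/dry masks for observed and modelled flood extents (True = wet pixel)
observed = np.array([[0, 1, 1, 1],
                     [0, 1, 1, 0],
                     [0, 0, 1, 0]], dtype=bool)
modelled = np.array([[0, 0, 1, 1],
                     [0, 1, 1, 1],
                     [0, 0, 1, 0]], dtype=bool)

# Areal measure: overlap of the wet extents
F = (observed & modelled).sum() / (observed | modelled).sum()

# Height-based measure: r.m.s. difference of corresponding waterline elevations (m)
obs_wl = np.array([10.2, 10.1, 10.3, 10.2])   # observed waterline heights
mod_wl = np.array([10.4, 10.0, 10.5, 10.1])   # modelled heights at the same points
rmse = np.sqrt(np.mean((obs_wl - mod_wl) ** 2))
```

In a GLUE-style analysis each model run gets scored by one of these measures; the paper's finding is that the height-based score discriminates between friction parameters more sharply than F does.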

Diffuse reflectance spectroscopy (DRS) is increasingly being used to predict numerous soil physical, chemical and biochemical properties. However, soil properties and processes vary at different scales and, as a result, relationships between soil properties often depend on scale. In this paper we report on how the relationship between one such property, cation exchange capacity (CEC), and the DRS of the soil depends on spatial scale. We show this by means of a nested analysis of covariance of soils sampled on a balanced nested design in a 16 km × 16 km area in eastern England. We used principal components analysis on the DRS to obtain a reduced number of variables while retaining key variation. The first principal component accounted for 99.8% of the total variance, the second for 0.14%. Nested analysis of the variation in the CEC and the two principal components showed that the substantial variance components are at the > 2000-m scale. This is probably the result of differences in soil composition due to parent material. We then developed a model to predict CEC from the DRS, using partial least squares (PLS) regression to do so. Leave-one-out cross-validation results suggested a reasonable predictive capability (R2 = 0.71 and RMSE = 0.048 molc kg−1). However, the results from the independent validation were not as good, with R2 = 0.27, RMSE = 0.056 molc kg−1 and an overall correlation of 0.52. This would indicate that DRS may not be useful for predictions of CEC. When we applied the analysis of covariance between predicted and observed values we found significant scale-dependent correlations at scales of 50 and 500 m (0.82 and 0.73 respectively). DRS measurements can therefore be useful for predicting CEC if predictions are required, for example, at the field scale (50 m). This study illustrates that the relationship between DRS and soil properties is scale-dependent and that this scale dependency has important consequences for the prediction of soil properties from DRS data.
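The analysis chain described (dimension reduction of the spectra, then a regression validated by leave-one-out) can be sketched on synthetic data. The spectra below are built from a single latent factor so that, as in the paper, the first principal component dominates the variance; PCA plus ordinary least squares stands in for the PLS regression, and every number is invented.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic spectra: one latent soil factor with a fixed spectral signature plus noise
latent = rng.normal(size=60)                       # hidden property driving the spectra
loading = rng.normal(size=50)                      # its signature across 50 wavelengths
spectra = np.outer(latent, loading) + rng.normal(0, 0.1, size=(60, 50))
cec = 2.0 * latent + rng.normal(0, 0.1, size=60)   # invented CEC values

# PCA by SVD of the centred spectra
X = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()                # variance share per component
scores = X @ Vt[:3].T                              # first three PC scores

# Leave-one-out cross-validation of a linear model on the PC scores
preds = np.empty(60)
for i in range(60):
    keep = np.arange(60) != i
    A = np.column_stack([np.ones(59), scores[keep]])
    beta, *_ = np.linalg.lstsq(A, cec[keep], rcond=None)
    preds[i] = beta @ np.concatenate([[1.0], scores[i]])
r2 = 1 - np.sum((cec - preds) ** 2) / np.sum((cec - cec.mean()) ** 2)
```

With this construction PC1 carries almost all the spectral variance and the cross-validated R2 is high; the paper's caution is that such within-sample success need not survive independent, scale-mismatched validation.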

Land use and land cover changes in the Brazilian Amazon have major implications for regional and global carbon (C) cycling. Cattle pasture represents the largest single use (about 70%) of this once-forested land in most of the region. The main objective of this study was to evaluate the accuracy of the RothC and Century models at estimating soil organic C (SOC) changes under forest-to-pasture conditions in the Brazilian Amazon. We used data from 11 site-specific 'forest to pasture' chronosequences with the Century Ecosystem Model (Century 4.0) and the Rothamsted C Model (RothC 26.3). The models predicted that forest clearance and conversion to well managed pasture would cause an initial decline in soil C stocks (0-20 cm depth), followed in the majority of cases by a slow rise to levels exceeding those under native forest. One exception to this pattern was a chronosequence in Suia-Missu, which is under degraded pasture. In three other chronosequences the recovery of soil C under pasture appeared to reach only about the same level as under the previous forest. Statistical tests were applied to determine levels of agreement between simulated SOC stocks and observed stocks for all the sites within the 11 chronosequences. The models also provided reasonable estimates (coefficient of correlation = 0.8) of the microbial biomass C in the 0-10 cm soil layer for three chronosequences, when compared with available measured data. The Century model adequately predicted the magnitude and the overall trend in delta C-13 for the six chronosequences where measured delta C-13 data were available. This study gave independent tests of model performance, as no adjustments were made to the models to generate outputs. Our results suggest that modelling techniques can be successfully used for monitoring soil C stocks and changes, allowing both the identification of current patterns in the soil and the projection of future conditions.
Results were used and discussed not only to evaluate soil C dynamics but also to indicate soil C sequestration opportunities for the Brazilian Amazon region. Moreover, modelling studies in these 'forest to pasture' systems have important applications, for example the calculation of CO2 emissions from land use change in national greenhouse gas inventories. (C) 2007 Elsevier B.V. All rights reserved.
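Neither RothC nor Century is reproduced here, but the forest-to-pasture trajectory they predict can be caricatured with a single first-order carbon pool; the real models use several pools with climate and management modifiers, and all pool sizes, inputs and the decay constant below are invented.

```python
# Minimal single-pool soil-carbon sketch in the spirit of RothC/Century
def simulate_soc(c0, inputs, k, years):
    """First-order dynamics dC/dt = inputs - k*C, stepped annually."""
    c = c0
    trajectory = [c]
    for _ in range(years):
        c = c + inputs - k * c
        trajectory.append(c)
    return trajectory

# Forest at equilibrium (inputs/k = 40 Mg C/ha), then conversion to a
# well-managed pasture with slightly higher litter inputs (hypothetical numbers)
forest = simulate_soc(c0=40.0, inputs=2.0, k=0.05, years=50)
pasture = simulate_soc(c0=forest[-1], inputs=2.4, k=0.05, years=50)
```

Because the pasture equilibrium (inputs/k = 48) exceeds the forest one, the pool rises slowly toward a level above the native-forest stock, mirroring the qualitative pattern the two models predicted for most chronosequences.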

In this paper, a review is undertaken of the major models currently in use for describing water quality in freshwater river systems. The number of existing models is large because the various studies of water quality in rivers around the world have often resulted in the construction of new 'bespoke' models designed for the particular situation of that study. However, it is worth considering models that are already available, since an existing model, suitable for the purposes of the study, will save a great deal of work and may already have been established within regulatory and legal frameworks. The models chosen here are SIMCAT, TOMCAT, QUAL2E, QUASAR, MIKE-11 and ISIS, and the potential for each model is examined in relation to the issue of simulating dissolved oxygen (DO) in lowland rivers. These models have been developed for particular purposes and this review shows that no one model can provide all of the functionality required. Furthermore, all of the models contain assumptions and limitations that need to be understood if meaningful interpretations of the model simulations are to be made. The work is concluded with the view that it is unfair to set one model against another in terms of broad applicability, but that a model of intermediate complexity, such as QUASAR, is generally well suited to simulate DO in river systems. (C) 2003 Elsevier Science B.V. All rights reserved.
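At the core of river DO models such as those reviewed sits some descendant of the classical Streeter-Phelps oxygen-sag equation; a minimal sketch follows, with invented parameter values (the reviewed models add many further processes such as photosynthesis, respiration and sediment demand).

```python
import math

def do_deficit(t, L0, D0, kd, ka):
    """Streeter-Phelps oxygen deficit D(t), in mg/L, downstream of a BOD load."""
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)

# Invented parameters: ultimate BOD, initial deficit, deoxygenation and reaeration rates
L0, D0, kd, ka = 10.0, 1.0, 0.3, 0.6        # mg/L, mg/L, 1/day, 1/day
deficits = [do_deficit(t / 10, L0, D0, kd, ka) for t in range(101)]
t_crit = max(range(101), key=lambda i: deficits[i]) / 10   # time of the DO sag minimum
```

The deficit first grows as deoxygenation outpaces reaeration, peaks at the critical point (the DO sag), and then decays; it is this minimum that lowland-river DO simulations aim to locate and bound.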

The Rio Tinto river in SW Spain is a classic example of acid mine drainage and the focus of an increasing amount of research including environmental geochemistry, extremophile microbiology and Mars-analogue studies. Its 5000-year mining legacy has resulted in a wide range of point inputs including spoil heaps and tunnels draining underground workings. The variety of inputs and importance of the river as a research site make it an ideal location for investigating sulphide oxidation mechanisms at the field scale. Mass balance calculations showed that pyrite oxidation accounts for over 93% of the dissolved sulphate derived from sulphide oxidation in the Rio Tinto point inputs. Oxygen isotopes in water and sulphate were analysed from a variety of drainage sources and displayed delta O-18((SO4-H2O)) values from 3.9 to 13.6 parts per thousand, indicating that different oxidation pathways occurred at different sites within the catchment. The most commonly used approach to interpreting field oxygen isotope data applies water and oxygen fractionation factors derived from laboratory experiments. We demonstrate that this approach cannot explain high delta O-18((SO4-H2O)) values in a manner that is consistent with recent models of pyrite and sulphoxyanion oxidation. In the Rio Tinto, high delta O-18((SO4-H2O)) values (11.2-13.6 parts per thousand) occur in concentrated (Fe = 172-829 mM), low pH (0.88-1.4), ferrous iron (68-91% of total Fe) waters and are most simply explained by a mechanism involving a dissolved sulphite intermediate, sulphite-water oxygen equilibrium exchange and finally sulphite oxidation to sulphate with O-2. In contrast, drainage from large waste blocks of acid volcanic tuff with pyritiferous veins also had low pH (1.7), but had a low delta O-18((SO4-H2O)) value of 4.0 parts per thousand and high concentrations of ferric iron (Fe(III) = 185 mM, total Fe = 186 mM), suggesting a pathway where ferric iron is the primary oxidant, water is the primary source of oxygen in the sulphate and sulphate is released directly from the pyrite surface. However, problems remain with the sulphite-water oxygen exchange model, and recommendations are therefore made for future experiments to refine our understanding of oxygen isotopes in pyrite oxidation. (C) 2009 Elsevier B.V. All rights reserved.

The Covered Catchment Experiment at Gårdsjön is a large-scale forest ecosystem manipulation in which acid precipitation was intercepted by a 7000 m² plastic roof and replaced by 'clean precipitation' sprinkled below the roof for ten years between 1991 and 2001. The treatment produced a strong positive response in runoff quality: runoff sulphate, inorganic aluminium and base cations decreased, while runoff ANC increased strongly and pH increased moderately. The runoff continued to improve over the whole duration of the experiment, although after ten years its quality was still considerably worse than the estimated pre-industrial runoff at the site. Stable isotopes of sulphur were analysed to study soil sulphur cycling. In the initial years of the experiment, desorption of SO4 from the mineral soil appeared to control the runoff SO4 concentration; as the experiment proceeded, however, there was growing evidence that net mineralisation of soil organic sulphur in the humus layer was an additional source of SO4 in runoff. This might pose a challenge to current acidification models. The experiment convincingly demonstrated, at the catchment scale, that a reduction in acid deposition causes an immediate improvement of surface water quality even at heavily acidified sites. The improvement in runoff appeared to be largely a result of cation exchange processes in the soil due to decreasing soil-solution concentrations, while any potential change in soil base saturation seemed less important for runoff chemistry over the short period of one decade. These findings should be considered when interpreting and extrapolating regional trends in surface water chemistry to the terrestrial parts of ecosystems.