911 results for CIDOC Conceptual Reference Model


Relevance: 100.00%

Abstract:

This research focuses on deriving a framework for the value sought from the Customer Relationship Management (CRM) system adopted by an enterprise operating in the financial services industry. It analyzes existing academic work to derive a conceptual value model, while applying secondary, industry-specific case studies provided by CRM vendors to check the validity and commonality of these value drivers. Furthermore, this work identifies the variances and correlations between the value sought from a CRM system, the scope of enterprise operations, and the size of the enterprise.

Relevance: 100.00%

Abstract:

The interaction mean free path between neutrons and TRISO particles is simulated using scripts written in MATLAB, in order to address the error that grows with increasing packing factor in the reactor physics code Serpent. Neutron movement is tracked both in unbounded and in bounded space. Depending on the program, the track is calculated in one of three ways: directly, using the position vectors of the neutrons and the surface equations of all the fuel particles; by dividing the space into multiple subspaces, each containing a fraction of the total number of particles, and selecting the particles from the subspaces the neutron passes through; or by selecting the particles that lie within an infinite cylinder formed around the neutron's axis of movement. The estimate of the mean free path from the current analytical model used by Serpent, which is based on an exponential distribution, serves as the reference result. The results from the implicit model in Serpent suggest that the mean free path is overestimated at high packing factors. The simulations support this observation: with a packing factor of 17%, the mean free path was approximately 2.46% shorter than in the reference model. This is further supported by the packing factor experienced by the neutron, whose simulation yielded 17.29%. It was also observed that neutrons leaving the surfaces of the fuel particles, in contrast to those starting inside the moderator, do not follow the exponential distribution. The current model, as it is, is therefore not valid for determining the free path lengths of the neutrons.
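
Since the reference result is an exponential free-path distribution, the comparison can be illustrated with a small Monte Carlo sketch. The following is a minimal, hypothetical Python example (the thesis itself used MATLAB scripts and real, non-overlapping TRISO arrangements): it samples free paths by ray tracing against randomly (Poisson) placed spherical particles, using the "infinite cylinder" idea mentioned above, and compares the sampled mean to the analytical estimate lambda = 4R / (3 * pf). All numerical values are assumptions.

```python
# Minimal, hypothetical sketch (not the thesis MATLAB scripts): sample neutron
# free paths to the first fuel-particle surface by ray tracing against
# randomly (Poisson) placed spherical particles, keeping only the particles
# whose centres lie within one radius of the neutron's movement axis -- the
# "infinite cylinder" idea described above. For Poisson-placed centres the
# free path from a moderator point is exactly exponential with mean
# 1 / (n * pi * R^2) = 4R / (3 * pf); R, pf and the history count are made up.
import numpy as np

rng = np.random.default_rng(42)

R = 5.0e-4                                     # particle radius [m] (assumed)
PF = 0.17                                      # packing factor pf = n * (4/3) pi R^3
N_DENS = PF / (4.0 / 3.0 * np.pi * R**3)       # particle number density [1/m^3]
LAMBDA_REF = 1.0 / (N_DENS * np.pi * R**2)     # analytical mean free path
L_MAX = 20.0 * LAMBDA_REF                      # cutoff; P(no hit) ~ exp(-20)

def sample_free_path():
    """One neutron history: distance from a moderator point to the first hit."""
    while True:
        # Neutron at the origin, moving along +z; only spheres whose centres
        # lie in the cylinder of radius R around the z axis can be hit.
        volume = np.pi * R**2 * (L_MAX + 2.0 * R)
        n = rng.poisson(N_DENS * volume)
        z = rng.uniform(-R, L_MAX + R, n)               # axial centre positions
        rho = R * np.sqrt(rng.uniform(0.0, 1.0, n))     # radial offsets from the axis
        if np.any(rho**2 + z**2 < R**2):
            continue          # origin lies inside a particle -> not a moderator start
        t = z - np.sqrt(R**2 - rho**2)                  # first surface crossing per sphere
        t = t[t > 0.0]
        return t.min() if t.size else np.inf            # inf = censored beyond L_MAX

paths = np.array([sample_free_path() for _ in range(20000)])
paths = paths[np.isfinite(paths)]
print(f"analytical mean free path: {LAMBDA_REF:.4e} m")
print(f"simulated mean free path : {paths.mean():.4e} m")
```

Under the Poisson-placement assumption used here the exponential reference model is exact; the thesis's point is that for real, non-overlapping TRISO arrangements at high packing factors the simulated free paths fall below that reference.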

Relevance: 100.00%

Abstract:

Gasification of biomass is an efficient process for producing liquid fuels, heat and electricity. It is especially interesting for the Nordic countries, where raw material for the process is readily available. The thermal reactions of light hydrocarbons are a major challenge for industrial applications: at elevated temperatures, light hydrocarbons react spontaneously to form compounds of higher molecular weight. In this thesis, this phenomenon was studied through a literature survey, experimental work and modelling. The literature survey revealed that the change in tar composition is likely caused by kinetic entropy, and the surface material is deemed to be an important factor in the reactivity of the system. The experimental results were in accordance with previous publications on the subject; the novelty of the experimental work lies in the time interval used for the measurements combined with an industrially relevant temperature range. The modelling covered the screening of possible numerical approaches, the testing of optimisation methods and kinetic modelling. No significant numerical issues were observed, so the calculation routines used are adequate for the task. Evolutionary algorithms performed better and gave a better fit than conventional iterative methods such as the Simplex and Levenberg-Marquardt methods. Three models were fitted to the experimental data. The LLNL model was used as a reference model against which the two other models were compared. A compact model that included all the observed species was developed; the parameter estimation performed on this model gave a slightly poorer fit to the experimental data than the LLNL model, but the difference was barely significant. The third model concentrated on the decomposition of hydrocarbons and included a theoretical description of the formation of a carbon layer on the reactor walls; its fit to the experimental data was extremely good. Based on the simulation results and the literature findings, it is likely that the surface coverage of carbonaceous deposits is a major factor in the thermal reactions.
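
The comparison of optimisation methods can be illustrated with a small, hypothetical parameter-estimation sketch in Python (not one of the thesis models): a first-order Arrhenius rate law is fitted to synthetic data with SciPy's differential evolution (an evolutionary algorithm) and with the Levenberg-Marquardt method; all rate parameters and "data" are invented.

```python
# Minimal, hypothetical sketch (not one of the thesis models): fit a first-order
# Arrhenius decomposition rate law to synthetic concentration data with
# (a) an evolutionary algorithm and (b) the Levenberg-Marquardt method,
# mirroring the comparison of optimisation methods described above.
import numpy as np
from scipy.optimize import differential_evolution, least_squares

R_GAS = 8.314                                    # J/(mol K)
T_GRID = np.array([973.0, 1073.0, 1173.0])       # temperatures [K] (assumed)
t_grid = np.linspace(0.0, 2.0, 20)               # residence times [s] (assumed)

def model(params, T, t):
    """Remaining fraction c/c0 = exp(-k t), with Arrhenius k = A exp(-Ea / RT)."""
    log_A, Ea = params
    k = np.exp(log_A - Ea / (R_GAS * T))
    return np.exp(-np.outer(k, t))               # shape (n_T, n_t)

rng = np.random.default_rng(1)
true_params = (np.log(2.0e5), 1.5e5)             # log A [1/s], Ea [J/mol] (assumed)
data = model(true_params, T_GRID, t_grid) + rng.normal(0.0, 0.01, (T_GRID.size, t_grid.size))

def residuals(params):
    return (model(params, T_GRID, t_grid) - data).ravel()

# (a) Evolutionary algorithm: global search over generous parameter bounds.
de = differential_evolution(lambda p: np.sum(residuals(p) ** 2),
                            bounds=[(0.0, 30.0), (5.0e4, 4.0e5)], seed=1)

# (b) Levenberg-Marquardt: local least squares from a rough initial guess.
lm = least_squares(residuals, x0=[10.0, 1.0e5], method="lm")

print("differential evolution:", de.x, "SSE =", de.fun)
print("Levenberg-Marquardt   :", lm.x, "SSE =", np.sum(lm.fun ** 2))
```

Because the evolutionary algorithm searches globally within the bounds, it is less sensitive to the initial guess than local methods, which is consistent with the better performance reported for it above.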

Relevance: 100.00%

Abstract:

The costs of health care are rising in many countries. In order to provide affordable and effective health care solutions, new technologies and approaches are constantly being developed. In this research, video games are presented as a possible solution to the problem. Video games are fun, and nowadays most people like to spend time on them. In addition, recent studies have pointed out that video games can have notable health benefits. Health games have already been developed, used in practice, and researched. However, the bulk of health game studies have been concerned with the design or the effectiveness of the games; no actual business studies have been conducted on the subject, even though health games often lack commercial success despite their health benefits. This thesis seeks to fill this gap. The specific aim of this thesis is to develop a conceptual business model framework and empirically use it in exploratory medical game business model research. In the first stage of this research, a literature review was conducted and the existing literature analyzed and synthesized into a conceptual business model framework consisting of six dimensions. The motivation behind the synthesis is the ongoing ambiguity around the business model concept. In the second stage, 22 semi-structured interviews were conducted with different professionals within the value network for medical games. The business model framework was present in all stages of the empirical research: first, in the data collection stage, the framework acted as a guiding instrument, focusing the interview process. The interviews were then coded and analyzed using the framework as a structure, and the results were reported following the structure of the framework. In the results, the interviewees highlighted several important considerations and issues for medical games concerning the six dimensions of the business model framework. Based on the key findings of this research, several key components of business models for medical games were identified and illustrated in a single figure. Furthermore, five notable challenges for business models for medical games were presented, and possible solutions for the challenges were postulated. Theoretically, these findings provide pioneering information on the untouched subject of business models for medical games. Moreover, the conceptual business model framework and its use in the novel context of medical games provide a contribution to the business model literature. Regarding practice, this thesis further emphasizes that medical games can offer notable benefits to several stakeholder groups and offers advice to companies seeking to commercialize these games.

Relevance: 100.00%

Abstract:

This paper develops a general stochastic framework and an equilibrium asset pricing model that make clear how attitudes towards intertemporal substitution and risk matter for option pricing. In particular, we show under which statistical conditions option pricing formulas are not preference-free, in other words, when preferences are not hidden in the stock and bond prices as they are in the standard Black and Scholes (BS) or Hull and White (HW) pricing formulas. The dependence of option prices on preference parameters comes from several instantaneous causality effects such as the so-called leverage effect. We also emphasize that the most standard asset pricing models (CAPM for the stock and BS or HW preference-free option pricing) are valid under the same stochastic setting (typically the absence of leverage effect), regardless of preference parameter values. Even though we propose a general non-preference-free option pricing formula, we always keep in mind that the BS formula is dominant both as a theoretical reference model and as a tool for practitioners. Another contribution of the paper is to characterize why the BS formula is such a benchmark. We show that, as soon as we are ready to accept a basic property of option prices, namely their homogeneity of degree one with respect to the pair formed by the underlying stock price and the strike price, the necessary statistical hypotheses for homogeneity provide BS-shaped option prices in equilibrium. This BS-shaped option-pricing formula allows us to derive interesting characterizations of the volatility smile, that is, the pattern of BS implicit volatilities as a function of the option moneyness. First, the asymmetry of the smile is shown to be equivalent to a particular form of asymmetry of the equivalent martingale measure. Second, this asymmetry appears precisely when there is either a premium on an instantaneous interest rate risk or on a generalized leverage effect or both, in other words, whenever the option pricing formula is not preference-free. Therefore, the main conclusion of our analysis for practitioners should be that an asymmetric smile is indicative of the relevance of preference parameters to price options.
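
The homogeneity property that singles out BS-shaped prices can be checked numerically. Below is a minimal Python sketch (illustrative only; all parameter values are arbitrary) showing that the Black-Scholes call price satisfies C(lambda*S, lambda*K) = lambda * C(S, K), so that suitably scaled prices depend on the pair (S, K) only through the moneyness.

```python
# Minimal illustrative sketch: the Black-Scholes call price and a numerical
# check of the homogeneity property discussed above,
# C(lambda*S, lambda*K) = lambda * C(S, K). Parameter values are arbitrary.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S, K, T, r, sigma, lam = 100.0, 95.0, 0.5, 0.02, 0.25, 3.0
print(bs_call(lam * S, lam * K, T, r, sigma))   # equals lam * C(S, K, ...)
print(lam * bs_call(S, K, T, r, sigma))
```

Under the BS formula itself the implied volatility is flat across moneyness, so an asymmetric smile in observed prices points to exactly the departures from preference-free pricing described above.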

Relevance: 100.00%

Abstract:

The shoulder is an articular complex formed by the thorax, the clavicle, the scapula and the humerus. While the orientations and positions of these segments make the shoulder difficult to study, a thorough understanding of their interrelation remains clinically important. A new model of the upper limb is therefore developed and presented. The joint kinematics of 15 healthy subjects were collected and reconstructed with the model, and prove to be generally less variable and more easily interpretable than with the reference model. In parallel, the use of simplifications inherited from 2D analysis in the calculation of 3D range of motion is criticized. However, exceptional cases in which these simplifications do apply are identified and proven, making them a possible avenue for further improvement of the models without compromising their validity.
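
One reason 2D simplifications fail in general is that finite 3D rotations do not commute, so "planar" angles cannot simply be read off independently. The hypothetical Python sketch below makes this concrete; it is not the thesis model, and the axes, angles and Y-X-Y output sequence are arbitrary assumptions loosely inspired by ISB-style shoulder angle decompositions.

```python
# Hypothetical sketch (not the thesis model): finite 3D rotations do not
# commute, which is one reason 2D-style simplifications cannot in general be
# carried over to 3D range-of-motion calculations.
from scipy.spatial.transform import Rotation as R

elevation = R.from_euler("x", 60, degrees=True)     # stand-in "elevation"
axial = R.from_euler("y", 40, degrees=True)         # stand-in "axial rotation"

print((elevation * axial).as_euler("YXY", degrees=True))
print((axial * elevation).as_euler("YXY", degrees=True))   # different: order matters
```

The exceptional cases identified in the thesis are, plausibly, motions for which this kind of coupling plays no role, and only there can the 2D-derived simplifications remain exact.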

Relevance: 100.00%

Abstract:

The Dirichlet family owes its privileged status among simplex distributions to its ease of interpretation and good mathematical properties. In particular, we recall properties fundamental to the analysis of compositional data, such as closure under amalgamation and subcomposition. From a probabilistic point of view, it is characterised (uniquely) by a variety of independence relationships, which makes it indisputably the reference model for expressing the non-trivial idea of substantial independence for compositions. Indeed, its well-known inadequacy as a general model for compositional data stems from this independence structure together with the poverty of its parametrisation. In this paper a new class of distributions (called the Flexible Dirichlet), capable of handling various dependence structures and containing the Dirichlet as a special case, is presented. The new model exhibits a considerably richer parametrisation which, for example, allows the means and (part of) the variance-covariance matrix to be modelled separately. Moreover, the model preserves some of the good mathematical properties of the Dirichlet, i.e. closure under amalgamation and subcomposition, with the new parameters simply related to the parent composition parameters. Furthermore, the joint and conditional distributions of subcompositions and relative totals can be expressed as simple mixtures of two Flexible Dirichlet distributions. The basis generating the Flexible Dirichlet, though retaining compositional invariance, shows a dependence structure which allows various forms of partitional dependence to be accommodated by the model (e.g. non-neutrality, subcompositional dependence and subcompositional non-invariance), with the independence cases identified by suitable parameter configurations. In particular, within this model substantial independence among subsets of components of the composition naturally occurs when the subsets have a Dirichlet distribution.
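
As a hedged illustration of how such a model can be simulated, the Python sketch below assumes the finite-mixture representation commonly associated with the Flexible Dirichlet, FD(alpha, p, tau) = sum_i p_i * Dir(alpha + tau * e_i), which collapses to the ordinary Dirichlet when tau = 0. This form is an assumption of the example, not a quotation from the paper, and all parameter values are made up.

```python
# Hedged illustration (assumed mixture form, not taken from the paper itself):
# sample from FD(alpha, p, tau) = sum_i p_i * Dir(alpha + tau * e_i).
import numpy as np

def sample_flexible_dirichlet(alpha, p, tau, size, rng):
    """Draw `size` compositions from the assumed mixture representation."""
    alpha = np.asarray(alpha, dtype=float)
    comp = rng.choice(len(alpha), size=size, p=p)   # which component gets the extra mass
    boost = tau * np.eye(len(alpha))[comp]          # tau added to one alpha at a time
    return np.vstack([rng.dirichlet(alpha + b) for b in boost])

rng = np.random.default_rng(0)
x = sample_flexible_dirichlet(alpha=[2.0, 3.0, 5.0], p=[0.2, 0.3, 0.5],
                              tau=4.0, size=5000, rng=rng)
print(x.mean(axis=0))   # component means under the mixture parametrisation
```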

Relevance: 100.00%

Abstract:

This work presents a study of two Colombian companies aimed at identifying the processes in which the organizations perform well with respect to the management of their supply chains. To that end, data from the balance sheet and income statement of the selected companies were analysed using the SCOR® V.10 model and selected metrics from it. Based on these results, a benchmarking exercise is carried out to establish which processes are competitive and which are weak in each company, so that the companies can determine which processes to improve and thereby increase their productivity. This degree project seeks to serve as a point of reference for the Colombian business community and as a useful tool for evaluating how organizations are developing, which processes should be improved, and how such reforms (in pursuit of superior company performance) can affect business growth.

Relevance: 100.00%

Abstract:

Abstract taken from the publication.

Relevance: 100.00%

Abstract:

Monographic issue entitled 'Estado actual de los sistemas e-learning' (Current state of e-learning systems). Abstract based on the one in the publication.

Relevance: 100.00%

Abstract:

The growing importance of GIS applications in public administrations, both Spanish and European, has given rise to several free-software development projects, each aimed at a particular sector of users. Each of these projects defines a conceptual data model for storing information, services or modules for accessing that information, and the functionality offered to the user. Most of the time these projects are developed independently, even though there are clear interrelations between them, such as sharing parts of the data model, a common interest in supporting municipal management applications, or the fact that they are built on the same components. These reasons favour seeking convergence between the projects in order to avoid duplicated development and to promote their integration and interoperability. For this reason, the signergias network was established in January 2009 to keep the architecture and development leads of these projects in contact, with the aim of analysing the possibilities for convergence and reaching agreements that allow data models to be shared, services and functionality to be defined jointly, or software components to be exchanged. This article describes the motivation for creating the network, its objectives, how it operates, and the results achieved.

Relevance: 100.00%

Abstract:

While the standard models of concentration addition and independent action predict the overall toxicity of multicomponent mixtures reasonably well, interactions may limit their predictive capability when a few compounds dominate a mixture. This study was conducted to test whether statistically significant systematic deviations from concentration addition (i.e. synergism/antagonism, dose ratio- or dose level-dependence) occur when two taxonomically unrelated species, the earthworm Eisenia fetida and the nematode Caenorhabditis elegans, were exposed to a full range of mixtures of the similarly acting neonicotinoid pesticides imidacloprid and thiacloprid. The effect of the mixtures on C. elegans was described significantly better (p<0.01) by a dose level-dependent deviation from the concentration addition model than by the reference model alone, while the reference model description of the effects on E. fetida could not be significantly improved. These results highlight that deviations from concentration addition are possible even with similarly acting compounds, but that the nature of such deviations is species-dependent. For improving the ecological risk assessment of simple mixtures, this implies that the concentration addition model may need to be used in a probabilistic context rather than in its traditional deterministic manner.
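
For readers unfamiliar with the two reference models, the short Python sketch below works through both for a hypothetical binary mixture; every number (EC50s, mixture ratio, dose-response slope) is invented for illustration.

```python
# Minimal illustrative sketch of the two reference models named above.
# Concentration addition (CA): at effect level x the mixture satisfies
#   sum_i c_i / ECx_i = 1, so for mixture fractions pi_i the mixture ECx is
#   1 / sum_i (pi_i / ECx_i).
# Independent action (IA): E_mix = 1 - prod_i (1 - E_i(c_i)).
import numpy as np

ec50 = np.array([0.8, 2.5])     # single-compound EC50s [mg/kg] (assumed)
ratio = np.array([0.5, 0.5])    # fraction of each compound in the mixture

ec50_mix_ca = 1.0 / np.sum(ratio / ec50)
print(f"CA-predicted mixture EC50: {ec50_mix_ca:.3f} mg/kg")

def logistic_effect(c, ec50, slope=2.0):
    """Assumed log-logistic concentration-response curve (0 = no effect, 1 = full)."""
    return 1.0 / (1.0 + (ec50 / c) ** slope)

c_total = ec50_mix_ca           # dose the mixture at the CA-predicted EC50
e_ia = 1.0 - np.prod(1.0 - logistic_effect(ratio * c_total, ec50))
print(f"IA-predicted effect at that dose: {e_ia:.2f}")
```

Deviation patterns such as synergism/antagonism or dose level-dependence are typically fitted as nested extensions of the reference prediction and compared by likelihood-based tests, which is the kind of comparison reported above.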

Relevance: 100.00%

Abstract:

Rapid urbanisation in China has resulted in great demands for energy, resources and pressure on the environment. The progress in China's development is considered in the context of energy efficiency in the built environment, including policy, technology and implementation. The key research challenges and opportunities are identified for delivering a low carbon built environment. The barriers include the existing traditional sequential design process, the lack of integrated approaches, and insufficient socio-technical knowledge. A proposed conceptual systemic model of an integrated approach identifies research opportunities. The organisation of research activities should be initiated, operated, and managed in a collaborative way among policy makers, professionals, researchers and stakeholders. More emphasis is needed on integrating social, economic and environmental impacts in the short, medium and long terms. An ideal opportunity exists for China to develop its own expertise, not merely in a technical sense but in terms of vision and intellectual leadership in order to flourish in global collaborations.

Relevance: 100.00%

Abstract:

Remotely sensed, multiannual data sets of shortwave radiative surface fluxes are now available for assimilation into the land surface schemes (LSSs) of climate and/or numerical weather prediction models. The RAMI4PILPS suite of virtual experiments assesses the accuracy and consistency of the radiative transfer formulations that provide the magnitudes of absorbed, reflected, and transmitted shortwave radiative fluxes in LSSs. RAMI4PILPS evaluates models under perfectly controlled experimental conditions in order to eliminate uncertainties arising from incomplete or erroneous knowledge of the structural, spectral and illumination-related canopy characteristics that are typical of model comparisons with in situ observations. More specifically, the shortwave radiation is separated into visible and near-infrared spectral regions, and the quality of the simulated radiative fluxes is evaluated by direct comparison with a 3-D Monte Carlo reference model identified during the third phase of the Radiation transfer Model Intercomparison (RAMI) exercise. The RAMI4PILPS setup thus makes it possible to focus in particular on the numerical accuracy of shortwave radiative transfer formulations and to pinpoint areas where future model improvements should concentrate. The impact of increasing degrees of structural and spectral subgrid variability on the simulated fluxes is documented, and the relevance of any emerging biases for gross primary production estimates and for shortwave radiative forcings due to snow and fire events is investigated.
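
As a hedged illustration of the kind of flux comparison involved, the Python sketch below checks a land surface scheme's absorbed/reflected/transmitted shortwave fractions against a Monte Carlo reference, per spectral band; the numbers are invented placeholders, not RAMI4PILPS results.

```python
# Hypothetical sketch: compare a land surface scheme's absorbed (A),
# reflected (R) and transmitted (T) shortwave flux fractions against a 3-D
# Monte Carlo reference, separately for the visible and near-infrared bands,
# and verify energy conservation A + R + T = 1. All numbers are placeholders.
import numpy as np

scheme = {"VIS": np.array([0.88, 0.04, 0.08]),      # (A, R, T) from the tested LSS
          "NIR": np.array([0.55, 0.28, 0.17])}
reference = {"VIS": np.array([0.90, 0.03, 0.07]),   # (A, R, T) from the MC reference
             "NIR": np.array([0.52, 0.30, 0.18])}

for band in ("VIS", "NIR"):
    assert abs(scheme[band].sum() - 1.0) < 1e-6, "scheme does not conserve energy"
    bias = scheme[band] - reference[band]
    print(f"{band}: bias in (A, R, T) = {np.round(bias, 3)}")
```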