36 results for Data Modeling


Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a revisited classification of term variation in the light of the Linked Data initiative. Linked Data refers to a set of best practices for publishing and connecting structured data on the Web with the idea of transforming it into a global graph. One of the crucial steps of this initiative is the linking step, in which datasets in one or more languages need to be linked or connected with one another. We claim that the linking process would be facilitated if datasets were enriched with lexical and terminological information. With that final aim in mind, we propose a classification of lexical, terminological and semantic variants that will become part of a model of linguistic descriptions currently being proposed within the framework of the W3C Ontology-Lexica Community Group to enrich ontologies and Linked Data vocabularies. Examples of modeling solutions for the different types of variants are also provided.
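
As an illustration of the kind of modeling solution the abstract refers to, the following is a minimal sketch (not taken from the paper) of how an orthographic variant might be attached to an ontology concept with the OntoLex-Lemon vocabulary using Python's rdflib; the dataset namespace, entry and concept URIs are hypothetical.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    ONTOLEX = Namespace("http://www.w3.org/ns/lemon/ontolex#")
    EX = Namespace("http://example.org/")  # hypothetical dataset namespace

    g = Graph()
    g.bind("ontolex", ONTOLEX)

    entry = EX["lexicalEntry/colour"]  # hypothetical lexical entry
    g.add((entry, RDF.type, ONTOLEX.LexicalEntry))

    # A canonical form plus an "other form" capture an orthographic variant.
    canonical, variant = EX["form/colour"], EX["form/color"]
    g.add((entry, ONTOLEX.canonicalForm, canonical))
    g.add((canonical, ONTOLEX.writtenRep, Literal("colour", lang="en-GB")))
    g.add((entry, ONTOLEX.otherForm, variant))
    g.add((variant, ONTOLEX.writtenRep, Literal("color", lang="en-US")))

    # Link the lexical entry to the ontology concept it lexicalises.
    g.add((entry, ONTOLEX.denotes, EX["Concept/Colour"]))

    print(g.serialize(format="turtle"))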

Relevance:

30.00%

Publisher:

Abstract:

Modeling is an essential tool for the development of atmospheric emission abatement measures and air quality plans. Most often these plans relate to urban environments with high emission density and population exposure. However, air quality modeling in urban areas is a rather challenging task. As environmental standards become more stringent (e.g. European Directive 2008/50/EC), more reliable and sophisticated modeling tools are needed to simulate measures and plans that may effectively tackle air quality exceedances, common in large urban areas across Europe, particularly for NO2. This also implies that emission inventories must satisfy a number of conditions, such as consistency across the spatial scales involved in the analysis, consistency with the emission inventories used for regulatory purposes, and versatility to match the requirements of different air quality and emission projection models. This study reports the modeling activities carried out in Madrid (Spain), highlighting the development and preparation of the atmospheric emission inventory as an illustrative example of the combination of models and data needed to develop a consistent air quality plan at the urban level. These activities included a series of source apportionment studies to quantify the contributions of international, national, regional and local sources, in order to understand to what extent local authorities can enforce meaningful abatement measures. Source apportionment studies were also conducted to quantify the contributions of different sectors and to understand the maximum feasible air quality improvement that can be achieved by reducing emissions from those sectors, thus targeting emission reduction policies at the most relevant activities. Finally, an emission scenario reflecting the effect of such policies was developed and the associated air quality was modeled.

Relevance:

30.00%

Publisher:

Abstract:

The wake effect is one of the most important aspects to be analyzed during the engineering phase of every wind farm, since it entails a significant power deficit and an increase in turbulence levels, with a consequent decrease in turbine lifetime. It depends on the wind farm design, the wind turbine type and the atmospheric conditions prevailing at the site. Traditionally, industry has used analytical models, which are quick and robust and allow wind farm engineering to be carried out flexibly at the preliminary stages. However, new models based on Computational Fluid Dynamics (CFD) are needed; these models must increase the accuracy of the output variables while avoiding an increase in computational time. Among them, elliptic models based on the actuator disk technique have come into widespread use in recent years. These models present three important problems when used out of the box for large wind farms: the estimation of the reference wind speed upstream of each rotor disk, turbulence modeling, and computational time. In order to minimize the consequences of these problems, this PhD thesis proposes solutions implemented in the open-source CFD solver OpenFOAM and adapted to each type of site: a correction of the reference wind speed for the general elliptic models, a semi-parabolic model for large offshore wind farms, and a hybrid model for wind farms in complex terrain. All the models are validated in terms of power ratios against experimental data from real operating wind farms.
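
For context, the "quick and robust" analytical models traditionally used by industry can be illustrated with the classic Jensen (Park) wake model, sketched below in Python; this is a generic textbook formulation, not the thesis's CFD approach, and the turbine parameters are hypothetical.

    import numpy as np

    def jensen_wake_speed(x, u0, ct, r0, k=0.05):
        """Wind speed inside the wake a distance x downstream of the rotor.

        x  : downstream distance [m]
        u0 : free-stream wind speed [m/s]
        ct : thrust coefficient of the upstream turbine [-]
        r0 : rotor radius [m]
        k  : wake decay constant (site dependent, hypothetical value)
        """
        # Linearly expanding wake with a top-hat velocity deficit.
        deficit = (1.0 - np.sqrt(1.0 - ct)) / (1.0 + k * x / r0) ** 2
        return u0 * (1.0 - deficit)

    # Hypothetical example: 8 m/s free stream, Ct = 0.8, 40 m rotor radius,
    # downstream turbine located 7 rotor diameters away.
    print(jensen_wake_speed(x=7 * 2 * 40.0, u0=8.0, ct=0.8, r0=40.0))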

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this research is to develop a methodological approach for assessing the potential economic and transportation impacts of transport policies. Transportation departments and other related government bodies are interested in such analyses, which are often misrepresented because of insufficient data or the lack of suitable methodologies. This research aims to fill that gap through a comprehensive review of the available techniques that match this purpose, identifying the differences that arise when they are applied to the valuation of user benefits or to other impacts such as social effects. As a result, this research presents an integrated approach that combines a random utility-based multiregional Input-Output model (RUBMRIO) with a road transport network model. The approach accounts for freight transport in greater detail and with more realism because its commodity-based structure traces the linkages of inter-industry purchases and sales that use freight services within a given country; for this reason, the integrated model is applicable to a variety of transport policies. In fact, the approach is applied to study the regional macroeconomic effects of implementing two different policies in the Spanish freight transport system: a distance-based charge per vehicle-kilometer (€/km) for Heavy Goods Vehicles (HGVs), and the introduction of Longer and Heavier Vehicles (LHVs) in the Spanish road network. The methodological approach is evaluated case by case on a selected network of highways linking the capitals of the Spanish regions. The economic dimension is captured through a Multiregional Input-Output Table (MRIO), and the existing traffic count database is used for model validation. The integrated approach reproduces the observed conditions of trade among regions using the road freight transport system and, by comparison with the policy scenarios, determines the contributions to distributional and generative changes. The analysis thus estimates the economic impacts in any given region through changes in Gross Domestic Product (GDP) and employment, and identifies the changes in the transportation system across all paths of the network through measures of effectiveness (MOEs). The results presented in this research provide substantive evidence that the assessment of transport policies requires a link between the economic structure of the regions and the transportation services. The analysis shows that for most regions of the country the changes in GDP and employment are noticeable as trade is encouraged or inhibited. The approach shows how traffic is diverted under both policies and also provides details of the pollutant emissions in the two scenarios. Furthermore, pricing or regulation policies for road freight transport systems directed at producers and consumers in the regions will promote regional transformations that affect the whole country and lead to different conclusions. This integrated approach could also be useful for assessing other policies and other countries worldwide.
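
The multiregional input-output machinery that underlies RUBMRIO is not spelled out in the abstract; for orientation, MRIO tables are built on the standard Leontief relation

    x = (I - A)^{-1} f,

where x is the vector of sectoral outputs by region, A the matrix of inter-regional technical and trade coefficients, and f the final demand. In random utility-based formulations such as RUBMRIO, the trade coefficients inside A are themselves obtained from a discrete choice (logit) model whose utilities depend on transport costs, which is how changes in the road freight system feed back into regional output; this is a generic description, not necessarily the thesis's exact specification.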

Relevance:

30.00%

Publisher:

Abstract:

In the last decades, neuropsychological theories have tended to consider cognitive functions as the result of the brain working as a whole rather than of individual local areas of the cortex. Studies based on neuroimaging techniques have increased in recent years, promoting an exponential growth of the body of knowledge about the relations between cognitive functions and brain structures [1]. However, such rapid evolution makes it difficult to integrate these findings into verifiable theories and, even more so, to translate them into cognitive rehabilitation. The aim of this research work is to develop a cognitive process-modeling tool. The purpose of this system is, first, to represent multidimensional data from structural and functional connectivity, neuroimaging, lesion studies and clinical interventions [2][3]. This will allow consolidated knowledge, hypotheses, experimental designs, new data from ongoing studies and emerging results from clinical interventions to be identified. Second, we pursue the use of Artificial Intelligence to assist decision making, allowing us to advance towards evidence-based and personalized treatments in cognitive rehabilitation. This work presents the design of the knowledge base of the knowledge representation tool. It is composed of two taxonomies (structure and function) and a set of tags linking both taxonomies at different levels of structural and functional organization. The remainder of the abstract is organized as follows: Section 2 presents the web application used for gathering the information needed to generate the knowledge base, Section 3 describes the structure of the knowledge base, and Section 4 presents the conclusions reached.
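
As a rough illustration of the knowledge base organization described above (two taxonomies plus linking tags), the following Python sketch uses entirely hypothetical entries; it is not the tool's actual schema.

    # Hypothetical, minimal representation: a structural taxonomy, a functional
    # taxonomy, and tags linking nodes of both at a given level of organization.
    structure = {
        "brain": {"frontal_lobe": {"dorsolateral_prefrontal_cortex": {}}},
    }
    function = {
        "cognition": {"executive_functions": {"working_memory": {}}},
    }
    tags = [
        {
            "structure_node": "dorsolateral_prefrontal_cortex",
            "function_node": "working_memory",
            "evidence": "neuroimaging",   # e.g. lesion study, clinical intervention
            "reference": "[hypothetical]",
        },
    ]

    def functions_linked_to(structure_node):
        """Return the functional nodes tagged to a structural node."""
        return [t["function_node"] for t in tags if t["structure_node"] == structure_node]

    print(functions_linked_to("dorsolateral_prefrontal_cortex"))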

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a revisited classification of term variation in the light of the Linked Data initiative. Linked Data refers to a set of best practices for publishing and connecting structured data on the Web with the idea of transforming it into a global graph. One of the crucial steps of this initiative is the linking step, in which datasets in one or more languages need to be linked or connected with one another. We claim that the linking process would be facilitated if datasets were enriched with lexical and terminological information. With that final aim in mind, we propose a classification of lexical, terminological and semantic variants that will become part of a model of linguistic descriptions currently being proposed within the framework of the W3C Ontology-Lexica Community Group to enrich ontologies and Linked Data vocabularies. Examples of modeling solutions for the different types of variants are also provided.

Relevance:

30.00%

Publisher:

Abstract:

Leakage power consumption is a component of the total power consumption in data centers that is not traditionally considered when choosing the set-point temperature of the room. However, this power component, which increases with temperature, can determine the savings achievable through careful management of the cooling system, as well as the reliability of the system. The work presented in this paper identifies the need to address leakage power in order to achieve substantial savings in the energy consumption of servers. In particular, our work shows that, by carefully detecting and managing two working regions (low and high impact of thermally dependent leakage), the energy consumption of the data center can be optimized through a reduction of the cooling budget.
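
A minimal sketch of the two-region idea, assuming the commonly used exponential dependence of leakage power on temperature; the coefficients below are hypothetical, not the paper's fitted values.

    import numpy as np

    def leakage_power(temp_c, p_leak_ref=20.0, t_ref_c=50.0, beta=0.02):
        """Leakage power [W] at CPU temperature temp_c [degC].

        Assumes the common model P_leak = P_ref * exp(beta * (T - T_ref));
        all coefficients are hypothetical.
        """
        return p_leak_ref * np.exp(beta * (temp_c - t_ref_c))

    # Below a knee temperature, thermally dependent leakage is small and raising
    # the room set-point saves cooling energy; above it, leakage growth erodes the saving.
    for temp in (40, 50, 60, 70, 80):
        print(temp, round(float(leakage_power(temp)), 1))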

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a depth-color scene modeling strategy for indoor 3D content generation. It combines depth and visual information provided by a low-cost active depth camera to improve the accuracy of the acquired depth maps, taking into account the different dynamic nature of the scene elements. Accurate depth and color models of the scene background are built iteratively and used to detect moving elements in the scene. The acquired depth data is continuously processed with an innovative joint-bilateral filter that efficiently combines depth and visual information thanks to the analysis of an edge-uncertainty map and the detected foreground regions. The main advantages of the proposed approach are: removal of spatial noise and temporal random fluctuations from the depth maps; refinement of depth data at object boundaries; and the iterative generation of a robust depth and color background model and an accurate moving-object silhouette.
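
For readers unfamiliar with joint-bilateral filtering, the sketch below shows the generic formulation (a depth map smoothed with range weights taken from the color image, so depth edges follow color edges); it is a textbook baseline, not the paper's edge-uncertainty variant, and the window size and sigma values are hypothetical.

    import numpy as np

    def joint_bilateral(depth, guide, radius=3, sigma_s=2.0, sigma_r=10.0):
        """Filter `depth` with spatial weights and range weights computed on `guide`.

        depth, guide: 2-D float arrays of the same shape (guide is e.g. gray-level color).
        """
        h, w = depth.shape
        out = np.zeros_like(depth)
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
        d = np.pad(depth, radius, mode="edge")
        g = np.pad(guide, radius, mode="edge")
        for i in range(h):
            for j in range(w):
                dw = d[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                gw = g[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                # Range weights come from the color guide, not from the noisy depth.
                rng = np.exp(-((gw - guide[i, j]) ** 2) / (2 * sigma_r ** 2))
                wgt = spatial * rng
                out[i, j] = np.sum(wgt * dw) / np.sum(wgt)
        return out

    # Tiny synthetic example.
    depth = np.random.rand(16, 16)
    guide = np.random.rand(16, 16) * 255.0
    print(joint_bilateral(depth, guide).shape)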

Relevance:

30.00%

Publisher:

Abstract:

Knowledge modeling tools are software tools that follow a modeling approach to help developers build a knowledge-based system. The purpose of this article is to show the advantages of using this type of tool in the development of complex knowledge-based decision support systems. To this end, the article describes the development of a system called SAIDA in the domain of hydrology with the help of the KSM modeling tool. SAIDA operates in real time on data recorded by sensors (rainfall, water levels, flows, etc.). It follows a multi-agent architecture to interpret the data, predict future behavior and recommend control actions. The system includes an advanced knowledge-based architecture with multiple symbolic representations. KSM was especially useful for designing and implementing this complex knowledge-based architecture in an efficient way.
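
A purely schematic Python sketch of the interpret / predict / recommend pipeline described above; the class and method names, thresholds and the toy extrapolation are hypothetical and are not KSM's or SAIDA's actual components.

    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        rainfall_mm: float
        water_level_m: float
        flow_m3s: float

    class InterpretationAgent:
        def assess(self, r: SensorReading) -> str:
            return "alert" if r.water_level_m > 4.0 else "normal"  # hypothetical threshold

    class PredictionAgent:
        def forecast_level(self, r: SensorReading, horizon_h: float) -> float:
            # Toy extrapolation standing in for the system's hydrological knowledge.
            return r.water_level_m + 0.1 * r.rainfall_mm * horizon_h / 24.0

    class ControlAgent:
        def recommend(self, predicted_level: float) -> str:
            return "open spillway gates" if predicted_level > 5.0 else "no action"

    reading = SensorReading(rainfall_mm=60.0, water_level_m=4.2, flow_m3s=300.0)
    level = PredictionAgent().forecast_level(reading, horizon_h=12)
    print(InterpretationAgent().assess(reading), round(level, 2), ControlAgent().recommend(level))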

Relevance:

30.00%

Publisher:

Abstract:

The 1-diode/2-resistor electric circuit equivalent to a photovoltaic system is analyzed. The equations at particular points of the I–V curve are studied considering the maximum number of terms, and the maximum power point as a boundary condition is given special attention. A new analytical method is developed based on a reduced amount of information, consisting of the standard manufacturer data. Results indicate that this new method is faster than numerical methods and has accuracy similar to (or better than) that of other existing methods, whether numerical or analytical.
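
For reference, the implicit I–V equation of the 1-diode/2-resistor equivalent circuit is, in standard notation,

    I = I_{ph} - I_0 [ \exp( (V + I R_s) / (n N_s V_t) ) - 1 ] - (V + I R_s) / R_{sh},

where I_{ph} is the photocurrent, I_0 the diode saturation current, n the ideality factor, N_s the number of cells in series, V_t = kT/q the thermal voltage, and R_s and R_{sh} the series and shunt resistances. Methods of this kind typically evaluate the equation at short circuit, open circuit and the maximum power point (together with the zero-derivative condition dP/dV = 0 at that point) to extract the five parameters from datasheet values; the exact set of conditions used in the paper may differ.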

Relevance:

30.00%

Publisher:

Abstract:

The efficiency of a power plant is affected by the distribution of the pulverized coal within the furnace. The coal, which is pulverized in the mills, is transported and distributed by the primary gas through the mill-ducts to the interior of the furnace. This serves a double function: to dry the coal and to feed it into the furnace at different levels, optimizing combustion so that complete combustion occurs with homogeneous heat fluxes to the walls. The mill-duct systems of a real power plant are very complex and not yet well understood; in particular, the mass flows of coal to the different levels are very difficult to measure experimentally. CFD modeling can help to determine them. An Eulerian/Lagrangian approach is used due to the low solid–gas volume ratio.
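
In the Eulerian/Lagrangian framework mentioned above, the gas phase is solved as a continuum while each coal particle (or parcel) is tracked by integrating an equation of motion of the generic form

    m_p dv_p/dt = (m_p / \tau_p) (u_g - v_p) + m_p g,        \tau_p = \rho_p d_p^2 / (18 \mu_g f(Re_p)),

where u_g is the local gas velocity, v_p the particle velocity, \rho_p and d_p the particle density and diameter, \mu_g the gas viscosity and f(Re_p) a drag correction factor. This textbook form is given only to fix ideas; it is not necessarily the exact drag law used in the study, and the low solid–gas volume ratio is what makes a Lagrangian treatment of the dispersed phase appropriate.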

Relevance:

30.00%

Publisher:

Abstract:

The monkey anterior intraparietal area (AIP) encodes visual information about three-dimensional object shape that is used to shape the hand for grasping. We modeled shape tuning in visual AIP neurons and its relationship with curvature and gradient information from the caudal intraparietal area (CIP). The main goal was to gain insight into the kinds of shape parameterizations that can account for AIP tuning and that are consistent with both the inputs to AIP and the role of AIP in grasping. We first experimented with superquadric shape parameters. We considered superquadrics because they occupy a role in robotics similar to that of AIP, in that superquadric fits are derived from visual input and used for grasp planning. We also experimented with an alternative shape parameterization based on an Isomap dimension reduction of spatial derivatives of depth (i.e., distance from the observer to the object surface). We considered an Isomap-based model because its parameters lacked discontinuities between similar shapes. When we matched the dimension of the Isomap to the number of superquadric parameters, the superquadric model fit the AIP data somewhat more closely. However, higher-dimensional Isomaps provided excellent fits. Also, we found that the Isomap parameters could be approximated much more accurately than superquadric parameters by feedforward neural networks with CIP-like inputs. We conclude that Isomaps, or perhaps alternative dimension reductions of the visual inputs to AIP, provide a promising model of AIP electrophysiology data. Further work is needed to test whether such shape parameterizations actually provide an effective basis for grasp control.
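
For reference, the superquadric parameterization referred to above is usually written through Barr's implicit inside–outside function

    F(x, y, z) = [ (x/a_1)^{2/\epsilon_2} + (y/a_2)^{2/\epsilon_2} ]^{\epsilon_2/\epsilon_1} + (z/a_3)^{2/\epsilon_1},

where a_1, a_2, a_3 set the extents along the principal axes and \epsilon_1, \epsilon_2 control the roundness or squareness of the cross-sections; F = 1 on the surface, F < 1 inside and F > 1 outside. Together with a pose, this gives the small, fixed-size parameter vector that a superquadric fit assigns to each stimulus.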

Relevance:

30.00%

Publisher:

Abstract:

Background: In recent years, Spain has implemented a number of air quality control measures that are expected to lead to a future reduction in fine particle concentrations and an ensuing positive impact on public health. Objectives: We aimed to assess the impact on mortality attributable to a reduction in fine particle levels in Spain in 2014 relative to the estimated level for 2007. Methods: To estimate exposure, we constructed fine particle distribution models for Spain for 2007 (reference scenario) and 2014 (projected scenario) with a spatial resolution of 16 × 16 km². In a second step, we used the concentration-response functions proposed by cohort studies carried out in Europe (European Study of Cohorts for Air Pollution Effects and the Rome longitudinal cohort) and North America (American Cancer Society cohort, Harvard Six Cities study and Canadian national cohort) to calculate the number of attributable annual deaths corresponding to all causes, all non-accidental causes, ischemic heart disease and lung cancer among persons aged over 25 years (2005-2007 mortality rate data). We examined the effect of the Spanish demographic shift in our analysis using 2007 and 2012 population figures. Results: Our model suggested a mean overall reduction in fine particle levels of 1 µg/m³ by 2014. Taking into account 2007 population data, between 8 and 15 all-cause deaths per 100,000 population could be postponed annually by the expected reduction in fine particle levels. For specific subgroups, estimates varied from 10 to 30 deaths for all non-accidental causes, from 1 to 5 for lung cancer, and from 2 to 6 for ischemic heart disease. The expected burden of preventable mortality would be even higher in the future owing to Spanish population growth: taking into account the population older than 30 years in 2012, the absolute mortality impact estimate would increase by approximately 18%. Conclusions: Effective implementation of air quality measures in Spain, under a short-term projection scenario, would produce an appreciable decline in fine particle concentrations, which in turn would lead to notable health-related benefits. Recent European cohort studies strengthen the evidence of an association between long-term exposure to fine particles and health effects, and could enhance health impact quantification in Europe. Air quality models can contribute to improved assessment of air pollution health impact estimates, particularly in study areas without air pollution monitoring data.
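
The attributable-mortality calculation outlined in the Methods follows the standard health impact assessment chain; a minimal Python sketch is given below, with an entirely hypothetical concentration-response coefficient, baseline rate and population (not the paper's inputs).

    import math

    def attributable_deaths(delta_c, beta, baseline_rate, population):
        """Annual deaths attributable to an exposure difference delta_c [ug/m3].

        beta          : concentration-response coefficient (per ug/m3), from a cohort study
        baseline_rate : baseline mortality rate (deaths per person-year)
        population    : exposed population (e.g. persons aged over 25 years)
        """
        rr = math.exp(beta * delta_c)   # relative risk for the change in exposure
        af = (rr - 1.0) / rr            # attributable fraction
        return af * baseline_rate * population

    # Hypothetical illustration: beta ~ log(1.07)/10 per ug/m3 for all-cause mortality,
    # baseline 900 deaths per 100,000 person-years, 30 million exposed adults.
    print(round(attributable_deaths(delta_c=1.0, beta=math.log(1.07) / 10.0,
                                    baseline_rate=900e-5, population=30_000_000)))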

Relevance:

30.00%

Publisher:

Abstract:

As advanced Cloud services become mainstream, the contribution of data centers to the overall power consumption of modern cities is growing dramatically; the average consumption of a single data center is equivalent to the energy consumption of 25,000 households. Modeling the power consumption of these infrastructures is crucial to anticipate the effects of aggressive optimization policies, but accurate and fast power modeling is a complex challenge for high-end servers that is not yet satisfied by analytical approaches. This work proposes an automatic method, based on Multi-Objective Particle Swarm Optimization, for the identification of power models of enterprise servers in Cloud data centers. Our approach, as opposed to previous procedures, does not only consider workload consolidation for deriving the power model, but also incorporates other non-traditional factors such as the static power consumption and its dependence on temperature. Our experimental results show that we obtain slightly better models than classical approaches while simultaneously simplifying the structure of the power model, and thus the number of sensors needed, which is very promising for short-term energy prediction. This work, validated with real Cloud applications, broadens the possibilities for deriving efficient energy-saving techniques for Cloud facilities.
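
A minimal sketch of the kind of server power model the abstract describes, with a temperature-dependent static (leakage) term plus a dynamic utilization term; the coefficients below are hypothetical and, in the paper's setting, would be identified by the multi-objective optimization against measured traces.

    import numpy as np

    def server_power(util, temp_c, p_idle=120.0, k_dyn=95.0, k_leak=0.4, t_ref_c=40.0):
        """Estimated server power [W] from CPU utilization (0-1) and temperature [degC].

        p_idle : idle (static) power at the reference temperature
        k_dyn  : dynamic power span between idle and full load
        k_leak : growth of static power with temperature (leakage), W per degC
        All coefficients are hypothetical placeholders.
        """
        static = p_idle + k_leak * (temp_c - t_ref_c)
        dynamic = k_dyn * util
        return static + dynamic

    # Hypothetical measured trace: utilization and CPU temperature samples.
    util = np.array([0.1, 0.5, 0.9])
    temp = np.array([45.0, 55.0, 70.0])
    print(server_power(util, temp))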

Relevance:

30.00%

Publisher:

Abstract:

Tungsten and tungsten alloys are being considered as leading candidates for structural and functional materials in future fusion energy devices. In a fusion plant, plasma-facing and first-wall materials experience particularly hostile conditions, being exposed to high particle and neutron fluxes and large thermal loads. As a consequence of these varied and complex working conditions, the study, development and design of these materials is one of the most important challenges to have emerged in recent years for the scientific community working on materials for energy applications. The most attractive properties of tungsten for the design of magnetic and inertial fusion energy reactors are its low erosion rate, high resistance to sputtering, high thermal conductivity, very high melting point, low tritium retention and low long-term radioactive disposal footprint. However, tungsten also presents a very low fracture toughness, mostly associated with inter-granular failure and limited bulk plasticity, which restricts its applications; its lifetime is further controlled by its thermo-mechanical response at the surface, possible melting and failure through helium accumulation. The lifetime limited by the mechanical response of tungsten (W), and in particular by its brittleness, is therefore a key aspect to be investigated. The plastic behavior of body-centered cubic (bcc) refractory metals like tungsten is governed by the kink-pair mediated, thermally activated motion of ½⟨111⟩ screw dislocations at the atomistic scale, and by ensembles and interactions of dislocations at larger scales. Modeling this complex behavior requires methods capable of rigorously resolving each relevant scale. The work presented in this thesis proposes a multiscale modeling approach that gives engineering-level answers to the technical requirements for the use of tungsten in fusion energy reactors, supported by the rigorous physics underlying extensive atomistic simulations. First, the static and dynamic properties of screw dislocations in five interatomic potentials for tungsten are compared, determining which of them ensure the greatest physical fidelity and computational efficiency. The large strain rates associated with molecular dynamics techniques mean that the dislocation mobility functions obtained in this way cannot be used in the subsequent steps of the multiscale model, so mobility laws must be obtained by other means. In this work, two alternative methods are proposed for obtaining the dislocation mobility functions: a kinetic Monte Carlo model and analytical expressions. The set of parameters needed to formulate the kinetic Monte Carlo model and the analytical mobility law is calculated atomistically. These parameters include, but are not limited to, the enthalpy and energy barriers of kink-pair formation as a function of stress, the width of the kink-pairs, and the non-Schmid effects characteristic of bcc metals (both twinning-antitwinning asymmetry and non-glide stresses). The function relating dislocation velocity to applied stress and temperature is then used as the main source of constitutive information within a dislocation-based crystal plasticity framework. The dependence of the yield strength on temperature predicted by the model is validated against experimental data from uniaxial tensile tests on single-crystal tungsten, with excellent agreement between the simulations and the measured data. The model is then extended to a set of crystallographic orientations uniformly distributed in the standard triangle, studying the effects of temperature and strain rate on the yield strength. Finally, with the aim of predicting a more ductile response of tungsten under a variety of loading states, biaxial tensile tests are performed and the yield surface is obtained as a function of temperature for some of the crystallographic orientations explored in the uniaxial tests.
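
Two textbook relations anchor the multiscale chain described above: a thermally activated (kink-pair) mobility law for the screw dislocations, and the Orowan equation that converts dislocation motion into plastic strain rate inside the crystal plasticity model. In generic form (the fitted expressions used in the thesis may differ),

    v(\tau, T) \propto \exp( -\Delta H_{kp}(\tau) / (k_B T) ),        \dot{\gamma} = \rho_m b v(\tau, T),

where \Delta H_{kp}(\tau) is the stress-dependent kink-pair activation enthalpy, k_B the Boltzmann constant, \rho_m the mobile dislocation density and b the Burgers vector.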