998 results for Metric system


Relevance: 30.00%

Abstract:

Hojas Kilométricas (Kilometric Sheets). Specifically, the study focuses on those sheets referring to the city centre and surrounding area of the Royal Site of Aranjuez, a town in the south of the Province of Madrid. The aim of this study is to restore the actual size and measurements of scanned images of the Hojas Kilométricas. This would allow us, among other things, to reestablish both the format and scale of the original plans. To achieve this goal, it is necessary to rectify and then georeference these images, i.e., to assign them a geographic reference system. This procedure is essential for overlaying and comparing the Hojas Kilométricas of the Royal Site with other historical cartography, as well as with other sources related to the same area from different time periods. Subsequent research would allow us, for example, to reconstruct the evolution of the urban area over time, to spot new construction, and to pinpoint the locations of any altered or missing buildings or architectural features. In addition, this would allow us to develop and integrate databases for GIS models applicable to the management of our cultural heritage.
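The rectification and georeferencing step described here can be prototyped with standard GIS tooling. Below is a minimal, hypothetical sketch using GDAL's Python bindings; the file names, ground control points, and target reference system (EPSG:25830, UTM zone 30N, which covers the Madrid area) are illustrative placeholders, not values from the study.

```python
# Minimal sketch: georeference a scanned sheet with ground control points
# (GCPs) using GDAL. All file names and coordinates below are hypothetical.
from osgeo import gdal

src = gdal.Open("hoja_kilometrica_scan.tif", gdal.GA_ReadOnly)

# Each GCP maps a pixel/line position on the scan to map coordinates.
gcps = [
    gdal.GCP(447312.0, 4434871.0, 0, 120.5, 230.0),   # x, y, z, pixel, line
    gdal.GCP(448312.0, 4434871.0, 0, 4020.5, 228.0),
    gdal.GCP(447312.0, 4433871.0, 0, 118.0, 4131.5),
    gdal.GCP(448312.0, 4433871.0, 0, 4018.0, 4130.0),
]

# Attach the GCPs, then warp (rectify) the image into the target
# geographic reference system.
translated = gdal.Translate("with_gcps.tif", src, GCPs=gcps,
                            outputSRS="EPSG:25830")
gdal.Warp("hoja_georeferenced.tif", translated,
          dstSRS="EPSG:25830", resampleAlg="bilinear")
```

In practice, each scanned sheet would need its own set of control points taken from identifiable features whose map coordinates are known, after which the rectified sheets can be overlaid on other georeferenced cartography.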

Relevance: 30.00%

Abstract:

High-definition video quality metric built from full-reference ratios. Video quality measurement, known in English as Visual Quality Assessment (VQA), is one of the major unsolved challenges in the multimedia field. Video quality has an enormous impact on how the end user (the consumer) perceives services built on the delivery of multimedia content, and it is therefore a key factor in assessing the new paradigm known as Quality of Experience (QoE). Video quality measurement models can be grouped into several branches according to the technical basis of the measurement system. The most prominent are those that employ psychovisual models aimed at reproducing the characteristics of the Human Visual System (HVS), and those that instead take an engineering approach in which the quality computation is based on extracting and comparing intrinsic image parameters. Despite the advances made in this field in recent years, research on video quality metrics, whether the full reference is available (full-reference models), only part of it (reduced-reference models), or none at all (no-reference models), still has ample room for improvement and goals left to reach. Among them, the measurement of high-definition signals, especially the very high quality signals used in the early stages of the value chain, is of particular interest because of its influence on the final quality of the service, and no reliable measurement models exist today. This doctoral thesis presents a full-reference quality measurement model that we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), based on the weighted combination of four quality ratios computed from intrinsic image features:

- The Fidelity Ratio, computed via the morphological gradient (Beucher gradient).
- The Visual Similarity Ratio, computed from the visually significant points of the image, obtained through local contrast filtering.
- The Sharpness Ratio, derived from the Haralick contrast texture statistic.
- The Complexity Ratio, obtained from the homogeneity measure of the Haralick texture statistics set.

PARMENIA is novel in its use of mathematical morphology and Haralick statistics as the basis of a quality metric, since these techniques have traditionally been tied to remote sensing and object segmentation. The formulation of the metric as a weighted set of ratios is likewise novel, as it draws both on structural similarity models and on more classical ones based on the perceptibility of the error introduced by the signal degradation associated with compression. PARMENIA shows very high correlation with the MOS scores from the subjective user tests carried out to validate it. The working corpus comes from internationally validated sets of sequences, so that the reported results are of the highest possible quality and rigor.

The methodology consisted of generating a set of test sequences of different qualities by encoding them at different quantization steps, obtaining subjective ratings for these sequences through subjective quality tests (based on Recommendation ITU-R BT.500 of the International Telecommunication Union), and validating the metric by computing the correlation of PARMENIA with these subjective scores, quantified through the Pearson correlation coefficient. Once the ratios had been validated, their influence on the final measure optimized, and their high correlation with perception confirmed, a second evaluation was carried out on sequences from HDTV Test Dataset 1 of the Video Quality Experts Group (VQEG), with the results showing the metric's clear advantages.

Abstract: Visual Quality Assessment has so far been one of the most intriguing challenges in the media environment. The progressive evolution towards higher resolutions and higher required quality (e.g. high definition and better image quality) calls for quality measurement models to be redefined. Given the growing interest in multimedia service delivery, perceptual quality measurement has become a very active area of research. This work first introduces a classification of objective video quality metrics based on their underlying methodologies and approaches to measuring video quality, summing up the state of the art. The thesis then describes an enhanced solution for full-reference objective quality measurement based on mathematical morphology, texture features, and visual similarity information, which provides a normalized metric that we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), whose scores are highly correlated with MOS. The PARMENIA metric is based on the pooling of different quality ratios obtained from three different approaches: Beucher's gradient, local contrast filtering, and Haralick's contrast and homogeneity texture features. The metric's performance is excellent, and it improves on the current state of the art by providing a wide dynamic range that makes it easier to discriminate between coded sequences of very similar quality, especially at the very high bit rates whose quality is currently transparent to existing metrics. PARMENIA introduces a degree of novelty with respect to other working metrics: on the one hand, it exploits structural information variation to build the metric's kernel, while complementing the measure with texture information and a ratio of visually meaningful points that is closer to typical error-sensitivity-based approaches. We would like to point out that PARMENIA is the only metric built upon full-reference ratios that uses mathematical morphology and texture features (typically used in segmentation) for quality assessment. On the other hand, it produces results with a wide dynamic range that allows measuring the quality of high-definition sequences from bit rates of hundreds of megabits per second (Mbps) down to typical distribution rates (5-6 Mbps) and even streaming rates (1-2 Mbps). A direct correlation between PARMENIA and MOS scores can thus easily be constructed. PARMENIA may further expand the available choices in objective quality measurement, especially for very high quality HD material.

All these results come from a validation carried out on internationally validated datasets, on which subjective tests based on the ITU-R BT.500 methodology were performed. The Pearson correlation coefficient was calculated to verify the accuracy and reliability of PARMENIA.
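Since PARMENIA's building blocks (the Beucher gradient, local contrast filtering, and Haralick's texture statistics) are all standard image-processing operations, the feature-extraction side can be sketched with common Python libraries. The sketch below is illustrative only: the ratio form, the threshold, and the pooling weights are placeholder assumptions, not the formulation published in the thesis.

```python
# Illustrative sketch of PARMENIA-style feature extraction and pooling.
# Inputs are 8-bit grayscale frames (reference and degraded).
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.stats import pearsonr
from skimage.morphology import dilation, erosion, square
from skimage.feature import graycomatrix, graycoprops

def beucher_gradient_energy(img):
    # Morphological (Beucher) gradient: dilation minus erosion.
    se = square(3)
    grad = dilation(img, se).astype(float) - erosion(img, se)
    return grad.mean()

def significant_points(img, thresh=10.0):
    # Visually significant points via a simple local-contrast filter
    # (local standard deviation above a threshold) -- a stand-in for the
    # thesis' local contrast filtering.
    mean = uniform_filter(img.astype(float), size=7)
    sq_mean = uniform_filter(img.astype(float) ** 2, size=7)
    local_std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    return int((local_std > thresh).sum())

def haralick_features(img):
    # Haralick contrast and homogeneity from a gray-level co-occurrence matrix.
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return (graycoprops(glcm, "contrast")[0, 0],
            graycoprops(glcm, "homogeneity")[0, 0])

def ratio(f_ref, f_deg):
    # Generic full-reference ratio: closeness of a degraded-frame feature
    # to the reference-frame feature (placeholder form, in [0, 1]).
    hi = max(f_ref, f_deg)
    return 1.0 if hi == 0 else min(f_ref, f_deg) / hi

def parmenia_like_score(ref, deg, weights=(0.25, 0.25, 0.25, 0.25)):
    # Weighted pooling of the four ratios; equal weights are a placeholder.
    c_ref, h_ref = haralick_features(ref)
    c_deg, h_deg = haralick_features(deg)
    ratios = np.array([
        ratio(beucher_gradient_energy(ref), beucher_gradient_energy(deg)),
        ratio(significant_points(ref), significant_points(deg)),
        ratio(c_ref, c_deg),
        ratio(h_ref, h_deg),
    ])
    return float(np.dot(weights, ratios))

# Validation as described above: per-sequence scores against MOS.
# scores, mos = ..., ...          # metric outputs and subjective ratings
# r, _ = pearsonr(scores, mos)    # Pearson correlation coefficient
```

Validation would then proceed as the thesis describes: compute the score for each coded sequence and report the Pearson correlation against the MOS values obtained from the BT.500 subjective tests (the commented lines at the end).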

Relevance: 30.00%

Abstract:

To shift to a low-carbon economy, the EU has been encouraging the deployment of variable renewable energy sources (VRE). However, the lack of competitiveness of VRE and their technical specificities have substantially raised the cost of the transition. Economic evaluations show that the life-cycle costs of electricity generation from VRE are still higher today than those of conventional thermal power plants. Member States have consequently adopted dedicated policies to support them. In addition, Ueckerdt et al. (2013) show that, when integrated into the power system, VRE induce supplementary costs that are not accounted for. This paper first sets out the rationale of the EU's renewables goals, the EU targets, and current deployment. It then explains why the LCOE metric is not appropriate for computing VRE costs, describing integration costs, their magnitude, and their implications. Finally, it analyses the consequences for the power system and the policy options. The paper shows that the EU has greatly underestimated the direct and indirect costs of VRE and that policymakers have failed to take into account the burden caused by renewable energy and the return of State support policies. Indeed, the induced market distortions have been shattering the whole power system and have undermined competition in the Internal Energy Market. EU policymakers can nonetheless take full account of this negative trend and reverse it by relying on competition rules, setting up a framework to collect robust EU-wide data, redesigning the architecture of the electricity system, and relying on EU regulators.
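The distinction between plant-level LCOE and full system cost can be made concrete with a small numerical sketch. Following the "system LCOE" idea developed by Ueckerdt et al. (2013), per-MWh integration costs (profile, balancing, and grid costs) are added on top of the plant-level figure. All numbers below are illustrative placeholders, not EU data.

```python
# Sketch: plant-level LCOE vs. "system LCOE" including integration costs.

def lcoe(capex, annual_opex, annual_mwh, discount_rate, lifetime_years):
    """Plant-level levelized cost: discounted costs over discounted output."""
    costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, lifetime_years + 1))
    output = sum(annual_mwh / (1 + discount_rate) ** t
                 for t in range(1, lifetime_years + 1))
    return costs / output  # EUR per MWh

def system_lcoe(plant_lcoe, profile_cost, balancing_cost, grid_cost):
    """System-level cost per MWh: LCOE plus per-MWh integration costs."""
    return plant_lcoe + profile_cost + balancing_cost + grid_cost

# Hypothetical 1 MW onshore wind plant (~30% capacity factor).
wind = lcoe(capex=1.5e6, annual_opex=40_000, annual_mwh=2_600,
            discount_rate=0.05, lifetime_years=25)
print(f"plant LCOE:  {wind:.1f} EUR/MWh")
print(f"system LCOE: {system_lcoe(wind, 15.0, 5.0, 10.0):.1f} EUR/MWh")
```

The gap between the two figures is exactly the cost component the paper argues has been underestimated in EU policy evaluations.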

Relevance: 30.00%

Abstract:

The semantic web vision is one in which rich, ontology-based semantic markup will become widely available. The availability of semantic markup on the web opens the way to novel, sophisticated forms of question answering. AquaLog is a portable question-answering system which takes queries expressed in natural language and an ontology as input, and returns answers drawn from one or more knowledge bases (KBs). We say that AquaLog is portable because the configuration time required to customize the system for a particular ontology is negligible. AquaLog presents an elegant solution in which different strategies are combined in a novel way. It makes use of the GATE NLP platform, string metric algorithms, WordNet, and a novel ontology-based relation similarity service to make sense of user queries with respect to the target KB. Moreover, it includes a learning component, which ensures that the performance of the system improves over time, in response to the particular community jargon used by end users.
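As one illustration of the string-metric component, the sketch below shows how a phrase from a user query might be matched against candidate relation names from the target ontology. The candidate names, threshold, and use of Python's difflib similarity ratio are assumptions for illustration; AquaLog itself combines several string metrics with WordNet and its ontology-based relation similarity service.

```python
# Hypothetical sketch: map a query phrase onto the closest ontology relation.
from difflib import SequenceMatcher

def best_relation_match(phrase, candidate_relations, threshold=0.6):
    """Return the relation name most similar to the phrase, or None if no
    candidate clears the similarity threshold."""
    def score(a, b):
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()
    best = max(candidate_relations, key=lambda rel: score(phrase, rel))
    return best if score(phrase, best) >= threshold else None

# e.g. mapping the verb phrase "works for" onto hypothetical KB relations
print(best_relation_match("works for", ["worksIn", "employedBy", "hasProject"]))
```

When string similarity alone is inconclusive, a system of this kind falls back on lexical resources such as WordNet and on the structure of the ontology itself, which is where AquaLog's relation similarity service comes in.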

Relevance: 30.00%

Abstract:

Metrics estimate the quality of different aspects of software. In particular, cohesion indicates how well the parts of a system hold together. A metric to evaluate class cohesion is important in object-oriented programming because it gives an indication of good class design. Several class cohesion metrics have been proposed, but they suffer from various problems (for instance, low discrimination). In this paper, a new metric to evaluate class cohesion, called SCOM, is proposed; it has several relevant features. It has an intuitive and analytical formulation, which is necessary for applying it to large software systems. It is normalized to produce values in the range [0..1], thus yielding meaningful values. It is also more sensitive than the metrics previously reported in the literature. The attributes and methods used to evaluate SCOM are unambiguously stated. SCOM has an analytical threshold, which is a very useful but rare feature in software metrics. We assess the metric with several sample cases, showing that it gives more sensitive values than other well-known cohesion metrics.
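A pairwise computation in the spirit of the metric can be sketched as follows: represent each method by the set of class attributes it uses, give each pair of methods a connection intensity weighted by how much of the attribute set the pair covers, and average over all pairs. This is offered as an illustrative reading of the metric's description, normalized to [0..1] as the abstract states; the authoritative formulation and its analytical threshold are in the paper.

```python
# Sketch of a SCOM-style pairwise class-cohesion computation.
from itertools import combinations

def scom_like(methods, num_attributes):
    """methods: one set of attribute names per method of the class."""
    m = len(methods)
    if m < 2 or num_attributes == 0:
        return 0.0
    total = 0.0
    for a, b in combinations(methods, 2):
        if a and b:
            # Connection intensity: shared attributes relative to the
            # smaller method; weight: how much of the class the pair covers.
            intensity = len(a & b) / min(len(a), len(b))
            weight = len(a | b) / num_attributes
            total += intensity * weight
    # Average over the m*(m-1)/2 pairs, so the result stays in [0, 1].
    return 2.0 * total / (m * (m - 1))

# Example: a class with attributes {x, y, z} and three methods.
print(scom_like([{"x", "y"}, {"y", "z"}, {"x"}], num_attributes=3))
```

Because each pair contributes at most 1, the normalization by the number of pairs keeps the value in [0..1], matching the property claimed in the abstract.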

Relevance: 30.00%

Abstract:

AMS Subject Classification (MSC2010): 42C10, 43A50, 43A75

Relevance: 30.00%

Abstract:

The purpose of this research is to develop design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. The design considerations focus on improving three key areas: (1) sampling methodology; (2) context awareness; and (3) sensor placement. These design considerations for environmental monitoring platforms using wireless sensor networks (WSN) are applied to the detection of methylmercury (MeHg) and the environmental parameters affecting its formation (methylation) and degradation (demethylation). The sampling methodology investigates a proof of concept for the monitoring of MeHg using three primary components: (1) chemical derivatization; (2) preconcentration using the purge-and-trap (P&T) method; and (3) sensing using Quartz Crystal Microbalance (QCM) sensors. This study focuses on the measurement of inorganic mercury (Hg) (e.g., Hg2+) and applies lessons learned to the detection of organic Hg (e.g., MeHg). Context awareness of a WSN and its sampling strategies is enhanced by using spatial analysis techniques, namely geostatistical analysis (i.e., classical variography and ordinary point kriging), to help predict the phenomenon of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions on control of the WSN (e.g., communications strategy, power management, resource allocation, sampling rate and strategy, etc.). This methodology improves the precision of control by adding potentially significant information about unmonitored locations. Two types of sensors are investigated in this study for near-optimal placement in a WSN: (1) environmental sensors (e.g., humidity, moisture, temperature, etc.) and (2) visual sensors (e.g., cameras). The near-optimal placement of environmental sensors is found using a strategy that minimizes the variance of the spatial analysis over randomly chosen points representing the sensor locations; the spatial analysis employs geostatistics and the optimization is carried out with Monte Carlo analysis. Visual sensor placement is accomplished for omnidirectional cameras operating in a WSN using an optimal placement metric (OPM), calculated for each grid point from the line-of-sight (LOS) in a defined number of directions, with known obstacles taken into consideration. Optimal camera placement areas are those generating the largest OPMs. The statistics are examined through Monte Carlo analysis with a varying number of obstacles and cameras in a defined space.
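The LOS-based placement metric can be sketched on a small occupancy grid: for each candidate cell, cast rays in a fixed number of directions and count how many reach a given range without hitting an obstacle. The grid size, ray count, range, and obstacle layout below are illustrative assumptions, as is treating the grid boundary as an obstruction.

```python
# Hypothetical sketch of a LOS-based optimal placement metric (OPM) for
# omnidirectional cameras on a grid with known obstacles.
import math
import numpy as np

def opm(grid, x, y, num_dirs=16, max_range=10):
    """Count unobstructed LOS rays from cell (x, y); grid: 1 = obstacle."""
    rows, cols = grid.shape
    clear = 0
    for k in range(num_dirs):
        theta = 2 * math.pi * k / num_dirs
        for r in range(1, max_range + 1):
            i = int(round(y + r * math.sin(theta)))
            j = int(round(x + r * math.cos(theta)))
            if not (0 <= i < rows and 0 <= j < cols) or grid[i, j]:
                break  # ray blocked by an obstacle or the boundary
        else:
            clear += 1  # ray reached max_range unobstructed
    return clear

grid = np.zeros((20, 20), dtype=int)
grid[8:12, 10] = 1  # a wall-like obstacle
scores = np.array([[opm(grid, x, y) for x in range(20)] for y in range(20)])
best = np.unravel_index(scores.argmax(), scores.shape)
print("best camera cell (row, col):", best)
```

A Monte Carlo study of the kind the abstract describes would repeat this scoring over randomized obstacle and camera configurations and examine the distribution of the resulting OPMs.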

Relevance: 30.00%

Abstract:

Resilience is widely accepted as a desirable system property for cyber-physical systems. However, there are no metrics that measure the resilience of cyber-physical systems (CPS) while taking into account the multi-dimensional nature of performance in these systems. In this work, we present first results towards a resilience metric framework. The key contributions of this framework are threefold. First, it allows resilience to be evaluated with respect to the different performance indicators of interest. Second, complexities that are relevant to the performance indicators of interest can be intentionally abstracted. Third and finally, it supports the identification of the reasons for good or bad resilience, in order to improve system design.
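As a hypothetical illustration of evaluating resilience against a single performance indicator, the sketch below scores a disruption window by the area under the normalized indicator curve, where 1.0 means no performance loss. The indicator trace and normalization are placeholders; the framework in the paper generalizes this to multiple indicators and adjustable levels of abstraction.

```python
# Sketch: resilience as area under a normalized performance-indicator curve.
import numpy as np

def resilience(indicator, nominal):
    """indicator: time samples of a performance indicator during a
    disruption window; nominal: the indicator's target level."""
    normalized = np.clip(np.asarray(indicator, dtype=float) / nominal, 0.0, 1.0)
    return normalized.mean()  # area under curve / window length

# e.g. throughput of a CPS before, during, and after a disturbance
trace = [100, 100, 40, 55, 70, 85, 100, 100]
print(resilience(trace, nominal=100))  # -> 0.8125
```

Scoring the same window with different indicators (throughput, latency, safety margin) is one way to expose the multi-dimensional nature of performance that the framework targets.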

Relevance: 30.00%

Abstract:

Electric vehicle (EV) batteries tend to degrade at an accelerated rate due to high peak power and harsh charging/discharging cycles during acceleration and deceleration, particularly in urban driving conditions. An oversized energy storage system (ESS) can meet the high power demands, but it suffers from increased size, volume, and cost. In order to reduce the overall ESS size and extend battery cycle life, a battery-ultracapacitor (UC) hybrid energy storage system (HESS) has been considered as an alternative solution. In this work, we investigate the optimized configuration, design, and energy management of a battery-UC HESS. One of the major challenges in a HESS is to design an energy management controller for real-time implementation that yields good power split performance. We present methodologies and solutions to this problem for a battery-UC HESS with a DC-DC converter interfacing the UC and the battery. In particular, a multi-objective optimization problem is formulated to optimize the power split so as to prolong the battery lifetime and reduce the HESS power losses. This optimization problem is solved numerically for standard drive cycle datasets using Dynamic Programming (DP). Trained on the DP optimal results, an effective real-time implementation of the optimal power split is realized with a Neural Network (NN). This online energy management controller is applied to a midsize EV model with a 360V/34kWh battery pack and a 270V/203Wh UC pack. The proposed controller splits the load demand effectively, with high power efficiency, and also reduces the battery peak current. More importantly, a HESS hardware prototype with a 38V-385Wh battery and a 16V-2.06Wh UC, together with a real-time experiment platform, has been developed. The real-time experiment results validate the feasibility and effectiveness of the real-time controller design for the battery-UC HESS. A battery State-of-Health (SoH) estimation model is developed as a performance metric to evaluate the extension of battery cycle life. It is estimated that the proposed online energy management controller can extend the battery cycle life by over 60%.
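A much-simplified sketch of the DP side of such a formulation is given below: the state is the UC state of charge, the decision at each step is the UC power, and the stage cost penalizes battery power squared (a crude aging proxy) plus converter losses. Grids, efficiency, energy capacity, and weights are illustrative assumptions, not the paper's values; a real-time controller would then be trained (e.g. a neural network) on the resulting policy table.

```python
# Sketch: backward DP over a drive cycle for a battery/UC power split.
import numpy as np

DT = 1.0                                 # step length [s]
SOC_GRID = np.linspace(0.25, 1.0, 31)    # UC state-of-charge grid
PUC_GRID = np.linspace(-20e3, 20e3, 41)  # candidate UC powers [W]
E_UC = 2.0e5                             # usable UC energy [J] (placeholder)
ETA = 0.95                               # DC-DC converter efficiency
W_AGING, W_LOSS = 1.0, 0.2               # cost weights (placeholders)

def dp_power_split(demand):
    """demand: per-step load power [W]; returns UC-power policy table."""
    n = len(demand)
    cost = np.zeros(len(SOC_GRID))        # terminal cost
    policy = np.zeros((n, len(SOC_GRID)))
    for t in range(n - 1, -1, -1):
        new_cost = np.full(len(SOC_GRID), np.inf)
        for s, soc in enumerate(SOC_GRID):
            for p_uc in PUC_GRID:
                soc_next = soc - p_uc * DT / E_UC
                if not SOC_GRID[0] <= soc_next <= SOC_GRID[-1]:
                    continue  # would over/under-charge the UC
                p_batt = demand[t] - ETA * p_uc
                # Stage cost: battery aging proxy + converter losses.
                stage = W_AGING * p_batt**2 + W_LOSS * (1 - ETA) * abs(p_uc)
                s_next = np.abs(SOC_GRID - soc_next).argmin()
                total = stage + cost[s_next]
                if total < new_cost[s]:
                    new_cost[s], policy[t, s] = total, p_uc
        cost = new_cost
    return policy  # optimal UC power per (time step, UC SoC)

# demand = np.array([...])   # drive-cycle power trace [W]
# policy = dp_power_split(demand)
```

The offline policy table computed this way is what a trained NN approximates for real-time use, since full DP is too slow to run online.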

Relevance: 30.00%

Abstract:

Failure of application operations is one of the main causes of system-wide outages in cloud environments. This particularly applies to DevOps operations, such as backup, redeployment, upgrade, customized scaling, and migration, that are exposed to frequent interference from other concurrent operations, configuration changes, and resource failures. However, current practices fail to provide reliable assurance of the correct execution of these kinds of operations. In this paper, we present an approach to address this problem that adopts a regression-based analysis technique to find the correlation between an operation's activity logs and the operation activity's effect on cloud resources. The correlation model is then used to derive assertion specifications, which can be used for runtime verification of running operations and their impact on resources. We evaluated our proposed approach on Amazon EC2 with 22 rounds of rolling upgrade operations while other types of operations were running and random faults were injected. Our experiment shows that our approach successfully raised alarms for 115 randomly injected faults, with a precision of 92.3%.
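A hypothetical sketch of the core idea follows: regress an operation's observed effect on a cloud resource against features counted from its activity logs, then turn the model's residual spread into a runtime assertion band. The feature names, training data, and three-sigma band are illustrative assumptions; the paper derives its assertion specifications from a comparable correlation model.

```python
# Sketch: derive a runtime assertion from a log-to-resource regression model.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data from past rolling upgrades: log-derived
# features per step (e.g. instances detached, instances launched) vs. the
# healthy-instance count observed afterwards.
X = np.array([[2, 2], [4, 4], [2, 2], [4, 4], [6, 6]], dtype=float)
y = np.array([10, 10, 9, 10, 10], dtype=float)

model = LinearRegression().fit(X, y)
residual_std = np.std(y - model.predict(X))

def assert_expected_effect(features, observed, k=3.0):
    """Runtime assertion: flag the operation if the observed resource state
    deviates from the model's prediction by more than k residual std devs."""
    x = np.asarray(features, dtype=float).reshape(1, -1)
    predicted = model.predict(x)[0]
    if abs(observed - predicted) > k * max(residual_std, 1e-9):
        raise AssertionError(
            f"effect {observed} outside expected band around {predicted:.1f}")

assert_expected_effect([4, 4], observed=10)  # passes; a fault would raise
```

At runtime, each monitored step of an operation would be checked this way, raising an alarm whenever the observed effect falls outside the band the correlation model predicts.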