963 results for quality metrics


Relevance:

30.00%

Abstract:

What is the best luminance contrast weighting-function for image quality optimization? Traditionally measured contrast sensitivity functions (CSFs) have often been used as weighting-functions in image quality and difference metrics, and such weightings have been shown to increase the sharpness and perceived quality of test images. We suggest that contextual CSFs (cCSFs) and contextual discrimination functions (cVPFs) should provide a basis for further improvement, since these are measured directly from pictorial scenes, modeling threshold and suprathreshold sensitivities within the context of complex masking information. Image quality assessment is understood to require the detection and discrimination of masked signals, making contextual sensitivity and discrimination functions directly relevant. In this investigation, test images are weighted with a traditional CSF, a cCSF, a cVPF and a constant function. Controlled mutations of these functions are also applied as weighting-functions, seeking the spatial frequency band weighting that is optimal for quality. Image quality, sharpness and naturalness are then assessed in two-alternative forced-choice psychophysical tests. We show that maximal quality for our test images results from cCSFs and cVPFs mutated to boost contrast in the higher visible frequencies.
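
As a rough illustration of this kind of frequency weighting, the sketch below scales an image's spatial-frequency bands by a CSF-like function. It assumes a grayscale NumPy image and uses the classic Mannos-Sakrison CSF approximation with an invented viewing geometry; it is not the weighting pipeline of the study itself.

```python
import numpy as np

def csf_mannos_sakrison(f):
    """Approximate CSF value at spatial frequency f (cycles/degree)."""
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

def weight_image(img, pix_per_deg=60.0):
    """Scale each spatial-frequency band of a grayscale image by the CSF."""
    h, w = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))
    fy = np.fft.fftshift(np.fft.fftfreq(h))      # cycles/pixel
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    FY, FX = np.meshgrid(fy, fx, indexing="ij")
    f_cpd = np.hypot(FY, FX) * pix_per_deg       # cycles/degree (assumed viewing geometry)
    W = csf_mannos_sakrison(f_cpd)
    W[h // 2, w // 2] = W.max()                  # preserve the mean (DC) luminance
    W /= W.max()                                 # leave the peak-sensitivity band unchanged
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * W)))

img = np.random.rand(256, 256)                   # stand-in for a pictorial test image
weighted = weight_image(img)
```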

Relevance:

30.00%

Abstract:

Assessing the subjective quality of processed images through an objective quality metric is a key issue in multimedia processing and transmission. In some scenarios it is also important to evaluate the quality of the received images with minimal reference to the transmitted ones. For instance, for closed-loop optimisation of image and video transmission, the quality measure can be evaluated at the receiver and provided as feedback to the system controller. The original images - prior to compression and transmission - are not usually available at the receiver side, so the receiver must rely on an objective quality metric that requires no reference, or only minimal reference, to the original images. The observation that the human eye is very sensitive to the edge and contour information of an image underpins our proposed reduced-reference (RR) quality metric, which compares edge information between the distorted and the original image. Results show that the metric correlates well with subjective observations, also in comparison with commonly used full-reference metrics and with a state-of-the-art reduced-reference metric. © 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering.
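
A minimal sketch of the reduced-reference idea, under assumptions of our own: Sobel gradients stand in for the edge extractor, and a small edge-strength histogram is the only side information sent to the receiver. The paper's actual edge comparison may differ.

```python
import numpy as np
from scipy import ndimage

def edge_signature(img, bins=16):
    """Compact edge-strength histogram - the reduced reference to transmit."""
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    mag = np.hypot(gx, gy)
    mag /= mag.max() + 1e-9                     # normalise so signatures are comparable
    hist, _ = np.histogram(mag, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

def rr_quality(sig_ref, sig_dist):
    """1.0 = identical edge statistics; lower = more edge degradation."""
    return 1.0 - 0.5 * np.abs(sig_ref - sig_dist).sum()

ref = np.random.rand(128, 128)                  # original image (sender side)
dist = ref + 0.1 * np.random.randn(128, 128)    # stand-in for the received image
print(rr_quality(edge_signature(ref), edge_signature(dist)))
```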

Relevance:

30.00%

Abstract:

Wireless sensor networks (WSNs) are emerging as underlying infrastructures for new classes of large-scale networked embedded systems. However, WSN system designers must fulfill the quality-of-service (QoS) requirements imposed by the applications (and users). Very harsh and dynamic physical environments and extremely limited energy/computing/memory/communication node resources are major obstacles to satisfying QoS metrics such as reliability, timeliness, and system lifetime. The limited communication range of WSN nodes, link asymmetry, and the characteristics of the physical environment lead to a major source of QoS degradation in WSNs: the “hidden-node problem.” In wireless contention-based medium access control (MAC) protocols, when two nodes that are not visible to each other transmit to a third node that is visible to both, a collision occurs - a so-called hidden-node or blind collision. This problem greatly impacts network throughput, energy efficiency and message transfer delays, and it grows dramatically with the number of nodes. This paper proposes H-NAMe, a very simple yet extremely efficient hidden-node avoidance mechanism for WSNs. H-NAMe relies on a grouping strategy that splits each cluster of a WSN into disjoint groups of non-hidden nodes, and it scales to multiple clusters via a cluster-grouping strategy that guarantees no interference between overlapping clusters. Importantly, H-NAMe is instantiated in IEEE 802.15.4/ZigBee, currently the most widespread communication technologies for WSNs, with only minor add-ons and full backward compatibility with the protocol standards. H-NAMe was implemented and exhaustively tested on an experimental test-bed based on off-the-shelf technology, showing that it increases network throughput and transmission success probability up to twice the values obtained without it. H-NAMe's effectiveness was also demonstrated in a target-tracking application with mobile robots over a WSN deployment.
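
The grouping idea can be illustrated with a greedy partition of a cluster's nodes into groups whose members all hear one another, so no intra-group transmission can suffer a blind collision. This sketch only captures the partitioning logic; H-NAMe itself negotiates groups through protocol message exchanges.

```python
def group_non_hidden(nodes, hears):
    """nodes: iterable of node ids; hears(a, b) -> True if a and b hear each other."""
    groups = []
    for n in nodes:
        for g in groups:
            if all(hears(n, m) for m in g):   # n is hidden from no one in g
                g.append(n)
                break
        else:
            groups.append([n])                # start a new group for n
    return groups

# Toy visibility relation: nodes 1-3 all hear each other, node 4 only hears 3.
links = {(1, 2), (1, 3), (2, 3), (3, 4)}
hears = lambda a, b: (a, b) in links or (b, a) in links
print(group_non_hidden([1, 2, 3, 4], hears))   # -> [[1, 2, 3], [4]]
```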

Relevance:

30.00%

Abstract:

Over the last ten years, the cost of maintaining object-oriented systems has grown to account for more than 70% of their total cost. This situation is due to several factors, the most important of which are imprecise user specifications, rapidly changing execution environments, and the poor internal quality of the systems. Of all these factors, the only one over which we have real control is the internal quality of the systems. Many quality models have been proposed in the literature to help control quality. However, most of these models use class metrics (e.g., the number of methods in a class) or metrics of relations between classes (e.g., the coupling between two classes) to measure the internal attributes of systems. Yet the quality of object-oriented systems depends not only on the structure of their classes, which is what these metrics measure, but also on the way the classes are organized, that is, on their design, which generally manifests itself through design patterns and anti-patterns. In this thesis we propose the DEQUALITE method, which systematically builds quality models that take into account not only the internal attributes of systems (through metrics) but also their design (through design patterns and anti-patterns). The method uses a machine-learning approach based on Bayesian networks and draws on the results of a series of experiments on the impact of design patterns and anti-patterns on system quality. These experiments, carried out on 9 large open-source object-oriented systems, lead us to the following conclusions: • Contrary to intuition, design patterns do not always improve system quality; for example, highly coupled implementations of design patterns affect class structure and have a negative impact on change-proneness and fault-proneness. • Classes participating in anti-patterns are much more likely to change and to be involved in fault fixes than the other classes of a system. • A non-negligible percentage of classes participate simultaneously in design patterns and anti-patterns; in these cases the design patterns have a positive effect, in the sense that they mitigate the anti-patterns. We apply and validate our method on three open-source object-oriented systems to demonstrate the contribution that system design makes to quality assessment.
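
As a loose, hypothetical illustration of the underlying idea - learning a quality indicator such as fault-proneness from both metric values and pattern participation - the sketch below uses a naive Bayes classifier in place of DEQUALITE's Bayesian networks; all feature names and data are invented.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# columns: [high_coupling, many_methods, in_design_pattern, in_antipattern]
X = np.array([[1, 1, 0, 1],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 1, 1, 0],
              [1, 1, 1, 1],
              [0, 0, 0, 0]])
y = np.array([1, 0, 1, 0, 1, 0])      # 1 = class was involved in a fault fix

model = BernoulliNB().fit(X, y)
# Probability that a new class (uncoupled, large, in both a pattern and an
# anti-pattern) is fault-prone, given the toy training data above.
print(model.predict_proba([[0, 1, 1, 1]]))
```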

Relevance:

30.00%

Abstract:

The estimation of prediction quality is important because, without quality measures, it is difficult to determine the usefulness of a prediction. Currently, methods for ligand binding site residue prediction are assessed in the function prediction category of the biennial Critical Assessment of Techniques for Protein Structure Prediction (CASP) experiment, using the Matthews Correlation Coefficient (MCC) and Binding-site Distance Test (BDT) metrics. However, assessing ligand binding site predictions with such metrics requires the availability of solved structures with bound ligands. We have therefore developed a ligand binding site quality assessment tool, FunFOLDQA, which uses protein feature analysis to predict ligand binding site quality prior to the experimental solution of the protein structures and their ligand interactions. The FunFOLDQA feature scores were combined using simple linear combinations, multiple linear regression and a neural network. The neural network produced significantly better correlations with both the MCC and BDT scores, according to Kendall's τ, Spearman's ρ and Pearson's r correlation coefficients, when tested on both the CASP8 and CASP9 datasets. The neural network also produced the largest Area Under the Curve (AUC) when Receiver Operating Characteristic (ROC) analysis was undertaken for the CASP8 dataset. Furthermore, the FunFOLDQA algorithm incorporating the neural network is shown to add value to FunFOLD when the two methods are employed in combination: this yields a statistically significant improvement over all of the best server methods, the FunFOLD method (6.43%), and one of the top manual groups (FN293) tested on the CASP8 dataset. The FunFOLDQA method was also competitive with the top server methods when tested on the CASP9 dataset. To the best of our knowledge, FunFOLDQA is the first method for assessing ligand binding site prediction quality in the absence of experimental data.
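
A minimal sketch of the combination step, under assumptions of our own: invented feature scores are combined with a small neural network, and the predictions are checked against a stand-in quality score using the three correlation coefficients named above.

```python
import numpy as np
from scipy.stats import kendalltau, spearmanr, pearsonr
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 4))                       # four feature scores per prediction (invented)
mcc = X @ np.array([0.5, 0.3, 0.15, 0.05]) + 0.05 * rng.standard_normal(200)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X[:150], mcc[:150])                    # train on 150 predictions, test on 50
pred = net.predict(X[150:])

for name, fn in [("Kendall tau", kendalltau),
                 ("Spearman rho", spearmanr),
                 ("Pearson r", pearsonr)]:
    print(name, fn(pred, mcc[150:])[0])
```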

Relevance:

30.00%

Abstract:

This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted types of sources such as flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India, in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. These measures together defined a SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50 and 80 %, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, elimination of high-emitting vehicles and wick lamps, as well as reducing emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines were most important. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU, the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios, to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050. 
For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, in almost exact agreement with the response calculated from the emission metrics (reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~22 % of this response and CH4 about 78 %. This could not be fully confirmed by the transient simulations, which attributed about 90 % of the temperature response to CH4 reductions. Attribution of the simulated temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing and by co-emitted species in the chosen emission basket. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, in which unforced variability and sea-ice responses cause relatively strong temperature fluctuations that may counteract (and thus mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans, and largest over the Arctic, with a warming reduction of 0.44 (0.39–0.49) K. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates increased by about 15 (6–21) mm yr−1 (more than 4 % of total precipitation) from spring to autumn. The mitigation could thus help to alleviate expected future drought and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
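
The metric-based estimate reduces to simple arithmetic: the avoided warming is approximated as the sum, over species, of the emission reduction multiplied by its GTP20-derived coefficient. The sketch below shows the bookkeeping with placeholder numbers, not ECLIPSE's values.

```python
# All numbers below are invented placeholders for illustration only.
reductions = {"CH4": 180.0, "BC": 4.0, "CO": 120.0}     # Tg/yr avoided (invented)
agtp20 = {"CH4": 0.0005, "BC": 0.004, "CO": 0.00002}    # K per (Tg/yr) at 20 years (invented)

dT = sum(reductions[s] * agtp20[s] for s in reductions)
for s in reductions:
    share = reductions[s] * agtp20[s] / dT
    print(f"{s}: {share:.0%} of the avoided warming")
print(f"total avoided warming ~ {dT:.2f} K")
```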

Relevance:

30.00%

Abstract:

Information to guide decision making is especially urgent in human-dominated landscapes in the tropics, where urban and agricultural frontiers are still expanding in an unplanned manner. Nevertheless, most studies investigating the influence of landscape structure on species distribution have not considered the heterogeneity of the altered habitats of the matrix, which is usually high in human-dominated landscapes. Using the distribution of small mammals in forest remnants and in the four main altered habitats in an Atlantic forest landscape, we investigated (1) how the explanatory power of models describing species distribution in forest remnants varies between landscape structure variables that do or do not incorporate matrix quality, and (2) the importance of spatial scale when analyzing the influence of landscape structure. We used standardized sampling in remnants and altered habitats to generate two indices of habitat quality, corresponding to the abundance and to the occurrence of small mammals. For each remnant, we calculated habitat quantity and connectivity at different spatial scales, either considering or ignoring the quality of surrounding habitats. Incorporating matrix quality increased model explanatory power across all spatial scales for half of the species that occurred in the matrix, but only when the distance between habitat patches (connectivity) was taken into account. These connectivity models were also less affected by spatial scale than the habitat quantity models. The few consistent responses to variation in spatial scale indicate that, despite their small size, small mammals perceive landscape features at large spatial scales. The matrix quality index based on species occurrence performed as well as or better than the index based on species abundance. The results indicate the importance of the matrix for the dynamics of fragmented landscapes, and suggest that relatively simple indices can improve our understanding of species distribution and could be applied in modeling, monitoring and managing complex tropical landscapes.
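
A minimal sketch of a quality-weighted connectivity index in the spirit described here, with invented patch data: each remnant's connectivity sums the surrounding habitat, discounted by distance and weighted by the quality of that habitat.

```python
import numpy as np

xy = np.array([[0.0, 0.0], [1.2, 0.4], [2.5, 1.0], [0.5, 2.0]])  # patch centroids (km, invented)
area = np.array([3.0, 1.5, 2.0, 0.8])                            # patch areas (ha, invented)
quality = np.array([1.0, 0.6, 0.9, 0.3])                         # habitat-quality index (invented)
alpha = 1.0                                                      # 1 / mean dispersal distance

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)     # pairwise distances
kernel = np.exp(-alpha * d)
np.fill_diagonal(kernel, 0.0)                                    # exclude the focal patch itself
connectivity = kernel @ (quality * area)                         # S_i = sum_j exp(-a*d_ij) q_j A_j
print(connectivity)
```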

Relevance:

30.00%

Abstract:

Establishing metrics to assess machine translation (MT) systems automatically is now crucial owing to the widespread use of MT over the web. In this study we show that such evaluation can be done by modeling texts as complex networks. Specifically, we extend our previous work by employing additional complex-network metrics, whose results were used as input for machine learning methods and allowed MT texts of distinct qualities to be distinguished. We also show that the node-to-node mapping between source and target texts (English–Portuguese and Spanish–Portuguese pairs) can be improved by adding further hierarchical levels for the metrics out-degree, in-degree, hierarchical common degree, clustering coefficient, inter-ring degree, intra-ring degree and convergence ratio. The results presented here amount to a proof of principle that capturing wider context through the hierarchical levels, combined with machine learning methods, can yield an approach for assessing the quality of MT systems. (C) 2010 Elsevier B.V. All rights reserved.
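
A minimal sketch of the modeling step, with a toy sentence: a text becomes a word-adjacency network, and node metrics of the kind listed above are extracted as candidate machine-learning features. The hierarchical extensions used in the paper are not reproduced here.

```python
import networkx as nx

text = "the quality of the translation depends on the quality of the model"
tokens = text.split()

G = nx.DiGraph()
for a, b in zip(tokens, tokens[1:]):       # one edge per adjacent word pair
    G.add_edge(a, b)

features = {
    n: (G.out_degree(n), G.in_degree(n), nx.clustering(G.to_undirected(), n))
    for n in G
}
for word, (dout, din, cc) in features.items():
    print(f"{word}: out-degree={dout}, in-degree={din}, clustering={cc:.2f}")
```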

Relevance:

30.00%

Abstract:

The quality and power of human activities affect the external environment in different ways that can be measured and evaluated by means of several approaches and indicators. While the scientific community has published several proposals for sustainable development indicators, there is still no consensus regarding the best approach to the use of these indicators or their reliability in measuring sustainability. It is important, therefore, to question the effectiveness of sustainable development indicators in the continuing search for sustainability. This paper compares the results obtained with emergy accounting against five global Sustainability Metrics (SMs) proposed in the literature, to verify whether the metrics communicate coherent and similar information to guide decision makers towards sustainable development. The results obtained using emergy indices are discussed with the aid of emergy ternary diagrams. The metrics are compared with the emergy results, and the degree of variability among them is analyzed using a correlation matrix built for the Mercosur nations. The contrast of results clearly shows that the metrics arrive at different interpretations of the sustainability of the nations studied, but also that some metrics may be grouped and used more prudently. Mercosur is presented as a case study to highlight and explain the discrepancies and similarities among the Sustainability Metrics, and to illustrate the scope of emergy accounting. (C) 2010 Elsevier Ltd. All rights reserved.
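
A minimal sketch of the cross-metric comparison, with invented scores: each nation is scored under several metrics, and a rank-correlation matrix shows how well the metrics agree.

```python
import pandas as pd

# All scores are invented placeholders, not the paper's data.
scores = pd.DataFrame(
    {"emergy_index": [0.62, 0.48, 0.71, 0.55],
     "metric_A":     [0.58, 0.52, 0.69, 0.41],
     "metric_B":     [0.30, 0.75, 0.44, 0.60]},
    index=["Argentina", "Brazil", "Paraguay", "Uruguay"],
)
# Rank correlation highlights which metrics tell a similar story.
print(scores.corr(method="spearman"))
```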

Relevance:

30.00%

Abstract:

The proliferation of multimedia content and the demand for new audio and video services have fostered a new era based on multimedia information, enabling the evolution of Wireless Multimedia Sensor Networks (WMSNs) and Flying Ad-Hoc Networks (FANETs). Live multimedia services require real-time video transmission with a low frame loss rate, tolerable end-to-end delay, and low jitter to support video dissemination with Quality of Experience (QoE) support. Hence, a key principle in a QoE-aware approach is to protect high-priority frames, transmitting them with a minimal packet loss ratio and minimal network overhead. Moreover, multimedia content must be transmitted from a given source to the destination via intermediate nodes with high reliability in large-scale scenarios. The routing service must cope with dynamic topologies caused by node failure or mobility, as well as with wireless channel changes, in order to keep operating during multimedia transmission despite these dynamics. Finally, understanding user satisfaction in watching a video sequence is becoming a key requirement for the delivery of multimedia content with QoE support; with this goal in mind, solutions for multimedia transmission must take the video characteristics into account to improve the quality of the delivered video. The main research contributions of this thesis are driven by the question of how to provide multimedia distribution with high energy-efficiency, reliability, robustness, scalability, and QoE support over wireless ad hoc networks. The thesis addresses several problem domains, with contributions at different layers of the communication stack. At the application layer, we introduce a QoE-aware packet redundancy mechanism that reduces the impact of the unreliable and lossy nature of wireless environments on the dissemination of live multimedia content. At the network layer, we introduce two routing protocols: a video-aware multi-hop and multi-path hierarchical routing protocol for efficient video transmission in static WMSN scenarios (MEVI), and a cross-layer link-quality- and geographical-aware beaconless opportunistic routing protocol for multimedia FANET scenarios (XLinGO). Both protocols enable multimedia dissemination with energy efficiency, reliability and QoE support, achieved by combining multiple cross-layer metrics in the routing decision to establish reliable routes.

Relevance:

30.00%

Abstract:

There is a wide range of video services over complex transmission networks, and in some cases end users fail to receive an acceptable quality level. In this paper, the different factors that degrade users' quality of experience (QoE) in video streaming services that use TCP as the transmission protocol are studied. In this specific service, the impairment factors are the number of pauses, their duration, and their temporal location. To measure the effect that each temporal segment has on the overall video quality, subjective tests were conducted. Because current subjective test methodologies are not adequate for assessing video streaming over TCP, some recommendations are provided here. At the application layer, a customized player is used to evaluate the behavior of the player buffer and, consequently, the end-user QoE. The video subjective test results demonstrate that there is a close correlation between application parameters and subjective scores. Based on this fact, a new metric named VsQM is defined, which considers the importance of the temporal location of pauses in assessing the user QoE of video streaming services. A useful application scenario is also presented, in which the proposed metric is used to improve video services.
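
A minimal sketch of a pause-location-aware degradation score in the spirit of VsQM, with invented segment weights and an invented linear form; the actual VsQM definition is the one given in the paper.

```python
def pause_degradation(pauses, video_len, seg_weights=(0.8, 1.0, 1.3)):
    """pauses: list of (start_time_s, duration_s); higher score = worse QoE."""
    n_seg = len(seg_weights)
    score = 0.0
    for start, dur in pauses:
        seg = min(int(start / video_len * n_seg), n_seg - 1)
        score += seg_weights[seg] * dur      # late pauses weighted more heavily
    return score

# Two 2-second pauses: one early, one near the end of a 60 s clip.
print(pause_degradation([(5.0, 2.0), (55.0, 2.0)], video_len=60.0))
```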

Relevance:

30.00%

Abstract:

Skype is one of the well-known applications that has guided the evolution of real-time video streaming and has become one of the most used pieces of software in everyday life. It provides VoIP audio/video calls as well as messaging chat and file transfer. Versions are available for all the principal operating systems - Windows, Macintosh and Linux - as well as for mobile systems. Voice quality has underpinned Skype's success since its birth in 2003, and its peer-to-peer architecture has allowed worldwide diffusion. After the introduction of video calls in 2006, Skype became a complete solution for communication between two or more people. As a primarily video-conferencing application, Skype assumes certain characteristics of the delivered video in order to optimize its perceived quality. In recent years, however, and with the release of SkypeKit, many new Skype video-enabled devices have come out, especially in the mobile world. This has forced a change to the traditional recording, streaming and receiving settings, allowing for a wide range of network and content dynamics. Video calls are no longer based on static 'chatting': mobile devices have opened new possibilities and can be used in several scenarios. For instance, lecture streaming or one-to-one mobile video conferences exhibit more dynamics, as both caller and callee might be on the move, and most of these cases differ from head-and-shoulders-only content. Therefore, Skype needs to optimize its video streaming engine to cover more video types. Heterogeneous connections require different behaviors and solutions, and Skype must cope with this variety to maintain a certain quality independently of the connection used. Part of the present work focuses on analyzing Skype's behavior depending on video content. Since the Skype protocol is proprietary, most studies so far have tried to characterize its traffic and to reverse-engineer its protocol; questions related to the behavior of Skype, especially the quality perceived by users, remain unanswered. We study Skype's video codec capabilities and video quality assessment. Another motivation of our work is the design of a mechanism that estimates the cost of network conditions on Skype video delivery: we try to assess, in an objective way, the impact of network impairments on the perceived quality of a Skype video call. Traditional video streaming schemes lack the flexibility and adaptivity that Skype tries to achieve at the edge of a network. Our contribution lies in a testbed and the consequent objective video quality analysis carried out on input videos: we stream raw video files with Skype via an impaired channel, record the result at the receiver side, and analyze it with objective quality-of-experience metrics.
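
A minimal sketch of the receiver-side objective analysis, assuming aligned grayscale frames: each recorded frame is compared with the corresponding transmitted frame using PSNR, one of the standard objective measures. A real testbed would first decode and temporally align the two videos.

```python
import numpy as np

def psnr(ref, rec, peak=255.0):
    """Peak signal-to-noise ratio between a reference and a received frame."""
    mse = np.mean((ref.astype(float) - rec.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

ref_frame = np.random.randint(0, 256, (144, 176), dtype=np.uint8)   # stand-in sent frame
rec_frame = ref_frame.copy()
rec_frame[::8] = 0                                                  # simulated channel impairment
print(f"PSNR: {psnr(ref_frame, rec_frame):.1f} dB")
```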

Relevance:

30.00%

Abstract:

Enterprise applications are complex software systems that manipulate large amounts of persistent data and interact with the user through vast and complex user interfaces. In particular, applications written for the Java 2 Platform, Enterprise Edition (J2EE) are composed using various technologies such as Enterprise Java Beans (EJB) or Java Server Pages (JSP), which in turn rely on languages other than Java, such as XML or SQL. In this heterogeneous context, applying the existing reverse engineering and quality assurance techniques developed for object-oriented systems is not enough. Because those techniques were created to measure quality or provide information about a single aspect of J2EE applications, they cannot properly measure the quality of the entire system. We intend to devise techniques and metrics that measure quality in J2EE applications across all their aspects, and to aid their evolution. Using software visualization, we also intend to inspect the structure of J2EE applications and every other aspect that can be investigated through this technique. To do so, we also need to create a unified meta-model including all the elements that compose a J2EE application.

Relevance:

30.00%

Abstract:

Opportunistic routing (OR) takes advantage of the broadcast nature and spatial diversity of wireless transmission to improve the performance of wireless ad-hoc networks. Instead of using a predetermined path to send packets, OR postpones the choice of the next hop to the receiver side, and lets the multiple receivers of a packet coordinate and decide which one will be the forwarder. Existing OR protocols choose the next-hop forwarder based on a predefined candidate list calculated from a single network metric. In this paper we propose TLG, a Topology- and Link-quality-aware Geographical opportunistic routing protocol. TLG uses multiple network metrics - network topology, link quality, and geographic location - to implement the coordination mechanism of OR. We compare TLG with well-known existing solutions, and simulation results show that TLG outperforms them in terms of both QoS and QoE metrics.
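
A minimal sketch of multi-metric forwarder selection of the kind TLG performs, with an invented score form and weights: each candidate combines link quality, geographic progress toward the destination, and a crude topology term, and the best-scored candidate forwards. This is not TLG's actual coordination mechanism.

```python
import math

def progress(node, sender, dest):
    """Fraction of the remaining distance to the destination that node covers."""
    d = lambda a, b: math.dist(a, b)
    return max(0.0, d(sender, dest) - d(node, dest)) / d(sender, dest)

def score(cand, sender, dest, w=(0.4, 0.4, 0.2)):
    topo = min(cand["degree"], 10) / 10.0     # crude topology term: neighbour count
    return (w[0] * cand["lq"]
            + w[1] * progress(cand["pos"], sender, dest)
            + w[2] * topo)

sender, dest = (0.0, 0.0), (100.0, 0.0)
candidates = [
    {"id": "A", "lq": 0.9, "pos": (30.0, 10.0), "degree": 4},
    {"id": "B", "lq": 0.6, "pos": (60.0, -5.0), "degree": 7},
]
best = max(candidates, key=lambda c: score(c, sender, dest))
print("forwarder:", best["id"])
```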

Relevance:

30.00%

Abstract:

Wireless mobile sensor networks are enlarging the Internet of Things (IoT) portfolio with a huge number of multimedia services for smart cities. Safety and environmental monitoring multimedia applications will be part of Smart IoT systems, which aim to reduce emergency response time while also predicting hazardous events. In these mobile and dynamic (possibly disaster) scenarios, a predefined end-to-end path is not a reliable solution; opportunistic routing instead allows routing decisions to be made in a completely distributed manner, using a hop-by-hop route decision based on protocol-specific characteristics. This enables the transmission of video flows of a monitored area/object with Quality of Experience (QoE) support to users, headquarters or IoT platforms. However, existing approaches rely on a single metric for the candidate selection rule, such as link quality or geographic information, which causes a high packet loss rate and degrades the video quality perceived by humans. This article proposes LinGO, a cross-layer Link-quality- and Geographical-aware Opportunistic routing protocol designed for video dissemination in mobile multimedia IoT environments. LinGO improves routing decisions using multiple metrics, including link quality, geographic location, and energy. Simulation results show the benefits of LinGO compared with well-known routing solutions for video transmission with QoE support in mobile scenarios.