893 results for E-Metrics


Relevance:

10.00%

Publisher:

Abstract:

Doctoral thesis, Biomedical Sciences (Neurosciences), Universidade de Lisboa, Faculdade de Medicina, 2014

Relevance:

10.00%

Publisher:

Abstract:

Doctoral thesis, Biomedical Engineering and Biophysics, Universidade de Lisboa, Faculdade de Ciências, 2015

Relevance:

10.00%

Publisher:

Abstract:

Estimates of airline delay costs as a function of delay magnitude are combined with fuel and (future) emissions charges to make cost-benefit trade-offs in the pre-departure and airborne phases. Hypothetical scenarios for the distribution of flow management slots are explored in terms of their cost and target-setting implications. The general superiority of passenger-centric metrics is significant for delay measurement, although flight delays are still the only commonly reported type of metric in both the US and Europe. There is a particular need for further research into reactionary (network) effects, especially with regard to passenger metrics and flow management delay.
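
The pre-departure versus airborne trade-off described above can be sketched as a simple comparison between the delay cost avoided and the extra fuel and emissions charges incurred. The cost figures and the piecewise delay-cost curve below are illustrative assumptions only, not values from the study.

```python
# Minimal sketch of the pre-departure vs. airborne cost trade-off.
# All cost figures and the piecewise delay-cost curve are illustrative assumptions.

def delay_cost(delay_min: float) -> float:
    """Airline delay cost (EUR) as a convex function of delay magnitude."""
    if delay_min <= 15:
        return 30.0 * delay_min                      # short delays: mostly crew/handling cost
    return 30.0 * 15 + 90.0 * (delay_min - 15)       # longer delays: passenger costs dominate

def airborne_recovery_cost(minutes_recovered: float,
                           fuel_eur_per_min: float = 40.0,
                           emissions_eur_per_min: float = 12.0) -> float:
    """Extra fuel burn plus (future) emissions charges for flying faster."""
    return minutes_recovered * (fuel_eur_per_min + emissions_eur_per_min)

def worth_recovering(delay_min: float, minutes_recovered: float) -> bool:
    """Recover delay in the air only if the avoided delay cost exceeds the extra burn."""
    saved = delay_cost(delay_min) - delay_cost(delay_min - minutes_recovered)
    return saved > airborne_recovery_cost(minutes_recovered)

if __name__ == "__main__":
    print(worth_recovering(delay_min=40, minutes_recovered=10))  # True with these numbers
    print(worth_recovering(delay_min=10, minutes_recovered=5))   # False: recovery not worth it
```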

Relevance:

10.00%

Publisher:

Abstract:

Reactionary delays constitute nearly half of all delay minutes in Europe. A capped, multi-component model is presented for estimating reactionary delay costs, as a non-linear function of primary delay duration. Maximum Take-Off Weights, historically established as a charging mechanism, may be used to model delay costs. Current industry reporting on delay is flight-centric. Passenger-centric metrics are needed to better understand delay propagation. In ATM, it is important to take account of contrasting flight- and passenger-centric effects, caused by cancellations, for example. Costs to airlines and passenger disutility will both continue to be driven by delay relative to the original schedule.
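
A minimal sketch of what a capped, non-linear reactionary-delay cost model of this kind might look like is given below. The exponent, cap and MTOW scaling factor are placeholder assumptions, not the coefficients of the model summarised above.

```python
# Illustrative capped, non-linear reactionary-delay cost model.
# Exponent, cap and MTOW scaling are assumed placeholders, not the cited model's values.

def reactionary_cost(primary_delay_min: float,
                     mtow_tonnes: float,
                     base_rate: float = 1.5,
                     exponent: float = 1.6,
                     cap_eur: float = 20_000.0) -> float:
    """Reactionary (network) cost grows faster than linearly with primary delay,
    scales with aircraft size via MTOW, and is capped at a maximum value."""
    size_factor = (mtow_tonnes / 50.0) ** 0.5        # heavier aircraft -> higher cost
    cost = base_rate * size_factor * primary_delay_min ** exponent
    return min(cost, cap_eur)

if __name__ == "__main__":
    for d in (10, 30, 60, 120):
        print(d, round(reactionary_cost(d, mtow_tonnes=70)))
```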

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2015-12

Relevance:

10.00%

Publisher:

Abstract:

What is the best luminance contrast weighting function for image quality optimization? Traditionally measured contrast sensitivity functions (CSFs) have often been used as weighting functions in image quality and difference metrics. Such weightings have been shown to increase the sharpness and perceived quality of test images. We suggest that contextual CSFs (cCSFs) and contextual discrimination functions (cVPFs) should provide bases for further improvement, since these are measured directly from pictorial scenes, modeling threshold and suprathreshold sensitivities within the context of complex masking information. Image quality assessment is understood to require detection and discrimination of masked signals, making contextual sensitivity and discrimination functions directly relevant. In this investigation, test images are weighted with a traditional CSF, a cCSF, a cVPF and a constant function. Controlled mutations of these functions are also applied as weighting functions, seeking the optimal spatial-frequency band weighting for quality optimization. Image quality, sharpness and naturalness are then assessed in two-alternative forced-choice psychophysical tests. We show that maximal quality for our test images results from cCSFs and cVPFs mutated to boost contrast in the higher visible frequencies.
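
The band-weighting idea can be illustrated by filtering an image in the frequency domain with a CSF-like weighting function. The log-Gaussian weighting below is only a stand-in for a measured CSF or cCSF, and its parameters are assumed for the example.

```python
# Sketch: weight the spatial-frequency content of a grayscale image with a CSF-like
# function. The log-Gaussian shape and its parameters are illustrative assumptions.

import numpy as np

def weight_spectrum(image: np.ndarray, peak_cpd: float = 4.0, spread: float = 1.5) -> np.ndarray:
    """Apply a radial frequency weighting to a grayscale image and return the filtered image."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radial = np.sqrt(fx**2 + fy**2)                  # normalised spatial frequency (cycles/pixel)
    # Log-Gaussian weighting peaking at mid frequencies, roughly CSF-shaped.
    peak = peak_cpd / max(h, w)
    weight = np.exp(-((np.log2(radial + 1e-6) - np.log2(peak)) ** 2) / (2 * spread**2))
    spectrum = np.fft.fft2(image) * weight
    return np.real(np.fft.ifft2(spectrum))

if __name__ == "__main__":
    img = np.random.rand(128, 128)
    print(weight_spectrum(img).shape)
```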

Relevance:

10.00%

Publisher:

Abstract:

Assessing the subjective quality of processed images through an objective quality metric is a key issue in multimedia processing and transmission. In some scenarios, it is also important to evaluate the quality of the received images with minimal reference to the transmitted ones. For instance, for closed-loop optimisation of image and video transmission, the quality measure can be evaluated at the receiver and provided as feedback information to the system controller. The original images - prior to compression and transmission - are not usually available at the receiver side, so the receiver must rely on an objective quality metric that needs no reference, or only minimal reference, to the original images. The observation that the human eye is very sensitive to edge and contour information underpins the proposal of our reduced-reference (RR) quality metric, which compares edge information between the distorted and the original image. Results highlight that the metric correlates well with subjective observations, also in comparison with commonly used full-reference metrics and with a state-of-the-art reduced-reference metric. © 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering.
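
A hedged sketch of a reduced-reference, edge-based measure in this spirit is shown below: only a compact edge descriptor of the original image is transmitted, and the receiver compares it against the same descriptor computed from the received image. The descriptor (a normalised histogram of gradient magnitudes) and the distance are illustrative choices, not the published metric.

```python
# Sketch of a reduced-reference, edge-based quality measure. The descriptor and the
# L1 distance are illustrative assumptions, not the metric proposed in the paper.

import numpy as np

def edge_descriptor(image: np.ndarray, bins: int = 32) -> np.ndarray:
    """Compact edge descriptor: normalised histogram of gradient magnitudes."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    hist, _ = np.histogram(magnitude, bins=bins, range=(0.0, 1.0), density=True)
    return hist / (hist.sum() + 1e-12)

def rr_quality_score(descriptor_original: np.ndarray, received: np.ndarray) -> float:
    """Lower score = edge structure better preserved (smaller descriptor distance)."""
    return float(np.abs(descriptor_original - edge_descriptor(received)).sum())

if __name__ == "__main__":
    original = np.random.rand(256, 256)
    degraded = original + 0.05 * np.random.randn(256, 256)
    side_info = edge_descriptor(original)            # the only reference data transmitted
    print(rr_quality_score(side_info, degraded))
```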

Relevance:

10.00%

Publisher:

Abstract:

Region merging algorithms commonly produce results that fall well short of the current state-of-the-art image segmentation techniques. The main challenge is selecting an appropriate and computationally efficient method to control resolution and region homogeneity. In this paper we present a region merging algorithm that includes a semi-greedy criterion and an adaptive threshold to control segmentation resolution. In addition, we present a new relative performance indicator that compares algorithm performance across many metrics against the results of human segmentation. Qualitative (visual) comparison demonstrates that our method produces results that outperform existing leading techniques.
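
The following sketch illustrates threshold-controlled region merging with semi-greedy candidate selection and a size-adaptive threshold. It is a simplified stand-in for the algorithm above; the candidate window k, the threshold schedule and the use of fixed region means are assumptions made for the example.

```python
# Simplified region merging with a semi-greedy criterion and an adaptive threshold.
# The merge criterion, threshold schedule and fixed region means are assumptions.

import heapq

def merge_regions(region_means, adjacency, k=3, base_threshold=0.1):
    """region_means: {region_id: mean intensity}; adjacency: set of (id_a, id_b) pairs."""
    sizes = {r: 1 for r in region_means}
    parent = {r: r for r in region_means}

    def find(r):
        while parent[r] != r:
            parent[r] = parent[parent[r]]
            r = parent[r]
        return r

    heap = [(abs(region_means[a] - region_means[b]), a, b) for a, b in adjacency]
    heapq.heapify(heap)
    while heap:
        # Semi-greedy: inspect up to k cheapest candidate merges, accept the first valid one.
        candidates = [heapq.heappop(heap) for _ in range(min(k, len(heap)))]
        accepted = None
        for cost, a, b in candidates:
            ra, rb = find(a), find(b)
            threshold = base_threshold / (1 + 0.1 * min(sizes[ra], sizes[rb]))  # adaptive
            if ra != rb and cost < threshold:
                accepted = (cost, a, b)
                parent[rb] = ra
                sizes[ra] += sizes[rb]
                break
        for c in candidates:
            if c is not accepted:
                heapq.heappush(heap, c)          # return unused candidates to the queue
        if accepted is None:
            break
    return {r: find(r) for r in region_means}

if __name__ == "__main__":
    means = {0: 0.10, 1: 0.12, 2: 0.80, 3: 0.82}
    adjacency = {(0, 1), (1, 2), (2, 3)}
    print(merge_regions(means, adjacency))       # regions 0/1 and 2/3 end up merged
```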

Relevance:

10.00%

Publisher:

Abstract:

Scientific dissertation submitted for the degree of Master in Computer Networks and Multimedia Engineering

Relevance:

10.00%

Publisher:

Abstract:

Project work submitted for the degree of Master in Informatics and Computer Engineering

Relevance:

10.00%

Publisher:

Abstract:

Localization is a fundamental task in Cyber-Physical Systems (CPS), where data is tightly coupled with the environment and the location where it is generated. The research literature on localization has reached a critical mass, and several surveys have also emerged. This review paper contributes to the state of the art by proposing a new and holistic taxonomy of the fundamental concepts of localization in CPS, based on a comprehensive analysis of previous research works and surveys. The main objective is to pave the way towards a deep understanding of the main localization techniques and to unify their descriptions. Furthermore, this review paper provides a complete overview of the most relevant localization and geolocation techniques. We also present the most important metrics for measuring the accuracy of localization approaches, defined as the gap between the real location and its estimate. Finally, we present open issues and research challenges pertaining to localization. We believe that this review paper will serve as an important and complete reference on localization techniques in CPS for researchers and practitioners, providing added value compared to previous surveys.
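
The accuracy notion used above, the gap between the real location and its estimate, can be made concrete with the usual error summaries, as in this small sketch (2-D coordinates in metres are assumed for the example).

```python
# Localization error: distance between true and estimated positions, summarised as
# mean error and RMSE. Coordinates and units are assumptions for the example.

import math

def localization_errors(true_points, estimated_points):
    errors = [math.dist(t, e) for t, e in zip(true_points, estimated_points)]
    mean_error = sum(errors) / len(errors)
    rmse = math.sqrt(sum(err**2 for err in errors) / len(errors))
    return mean_error, rmse

if __name__ == "__main__":
    ground_truth = [(0.0, 0.0), (5.0, 5.0), (10.0, 0.0)]
    estimates    = [(0.4, 0.3), (5.5, 4.6), (9.1, 0.8)]
    print(localization_errors(ground_truth, estimates))
```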

Relevance:

10.00%

Publisher:

Abstract:

Fractional calculus generalizes integer-order derivatives and integrals. Over the last half century, considerable progress has taken place in this scientific area. This paper addresses that evolution and establishes an assertive measure of the research development.
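
For context, one standard way of defining a fractional derivative is the Riemann-Liouville form below; the abstract does not state which definition the paper adopts.

```latex
% Riemann-Liouville fractional derivative of order alpha, with n - 1 < alpha < n.
\[
  {}_{a}D_{t}^{\alpha} f(t)
  = \frac{1}{\Gamma(n-\alpha)} \frac{d^{n}}{dt^{n}}
    \int_{a}^{t} \frac{f(\tau)}{(t-\tau)^{\alpha-n+1}} \, d\tau ,
  \qquad n-1 < \alpha < n .
\]
```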

Relevance:

10.00%

Publisher:

Abstract:

The Darwinian Particle Swarm Optimization (DPSO) is an evolutionary algorithm that extends Particle Swarm Optimization with natural selection to enhance the ability to escape from sub-optimal solutions. An extension of the DPSO to multi-robot applications, denoted Robotic Darwinian PSO (RDPSO), has recently been proposed; it benefits from dynamically partitioning the whole population of robots, thereby decreasing the amount of information exchange required among robots. This paper further extends the previously proposed algorithm by adapting the behavior of robots based on a set of context-based evaluation metrics. Those metrics are then used as inputs of a fuzzy system so as to systematically adjust the RDPSO parameters (i.e., the outputs of the fuzzy system), thus improving its convergence rate and its robustness to obstacles and communication constraints. The adapted RDPSO is evaluated in groups of physical robots and is further explored using larger populations of simulated mobile robots within a larger scenario.
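
A rough sketch of context-based parameter adaptation of this kind is given below: two context metrics feed a tiny fuzzy-style rule base whose output nudges swarm parameters. The membership shapes, rules and parameter names are illustrative placeholders, not the published RDPSO fuzzy system.

```python
# Fuzzy-style adaptation of swarm parameters from context metrics.
# Memberships, rules and parameter names are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adapt_parameters(obstacle_density: float, link_quality: float):
    """Both inputs normalised to [0, 1]; returns (inertia_weight, social_weight)."""
    dense = tri(obstacle_density, 0.4, 1.0, 1.6)     # "many obstacles"
    sparse = tri(obstacle_density, -0.6, 0.0, 0.6)   # "few obstacles"
    good_link = tri(link_quality, 0.4, 1.0, 1.6)     # "good communication"

    # Rule base (weighted-average defuzzification):
    #   many obstacles -> lower inertia (more reactive motion)
    #   few obstacles  -> higher inertia (more exploration)
    #   good links     -> rely more on shared (social) information
    inertia = (dense * 0.4 + sparse * 0.9) / max(dense + sparse, 1e-9)
    social = 1.0 + 1.0 * good_link
    return inertia, social

if __name__ == "__main__":
    print(adapt_parameters(obstacle_density=0.8, link_quality=0.3))
```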

Relevance:

10.00%

Publisher:

Abstract:

Wireless sensor networks (WSNs) are emerging as underlying infrastructures for new classes of large-scale networked embedded systems. However, WSN system designers must fulfill the quality-of-service (QoS) requirements imposed by the applications (and users). Very harsh and dynamic physical environments and extremely limited energy, computing, memory and communication resources at each node are major obstacles to satisfying QoS metrics such as reliability, timeliness, and system lifetime. The limited communication range of WSN nodes, link asymmetry, and the characteristics of the physical environment lead to a major source of QoS degradation in WSNs: the "hidden node problem." In wireless contention-based medium access control (MAC) protocols, when two nodes that are not visible to each other transmit to a third node that is visible to both, there will be a collision, called a hidden-node or blind collision. This problem greatly impacts network throughput, energy efficiency and message transfer delays, and it grows dramatically with the number of nodes. This paper proposes H-NAMe, a very simple yet extremely efficient hidden-node avoidance mechanism for WSNs. H-NAMe relies on a grouping strategy that splits each cluster of a WSN into disjoint groups of non-hidden nodes and scales to multiple clusters via a cluster grouping strategy that guarantees no interference between overlapping clusters. Importantly, H-NAMe is instantiated in IEEE 802.15.4/ZigBee, currently the most widespread communication technologies for WSNs, with only minor add-ons and while ensuring backward compatibility with their protocol standards. H-NAMe was implemented and exhaustively tested using an experimental test-bed based on off-the-shelf technology, showing that it increases network throughput and transmission success probability up to twice the values obtained without H-NAMe. H-NAMe's effectiveness was also demonstrated in a target tracking application with mobile robots over a WSN deployment.
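
The grouping idea can be sketched as follows: split a cluster's nodes into disjoint groups in which every pair of nodes can hear each other, so that intra-group transmissions cannot suffer hidden-node (blind) collisions. The greedy first-fit assignment and the distance-based visibility model below are simplified assumptions, not the protocol's actual mechanism.

```python
# Sketch of grouping a cluster into disjoint groups of mutually visible (non-hidden) nodes.
# The first-fit assignment and the distance-based visibility model are assumptions.

def group_non_hidden(nodes, hears):
    """nodes: iterable of node ids; hears(a, b) -> True if a and b are in radio range."""
    groups = []                                      # each group holds mutually visible nodes
    for node in nodes:
        for group in groups:
            if all(hears(node, member) for member in group):
                group.append(node)                   # join the first fully visible group
                break
        else:
            groups.append([node])                    # otherwise start a new group
    return groups

if __name__ == "__main__":
    # Hypothetical visibility: nodes within distance 10 of each other can hear one another.
    positions = {"A": (0, 0), "B": (4, 0), "C": (20, 0), "D": (23, 0), "E": (8, 2)}
    def in_range(a, b):
        (xa, ya), (xb, yb) = positions[a], positions[b]
        return ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 <= 10
    print(group_non_hidden(positions, in_range))     # e.g. [['A', 'B', 'E'], ['C', 'D']]
```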

Relevance:

10.00%

Publisher:

Abstract:

The IEEE 802.15.4 protocol proposes a flexible communication solution for Low-Rate Wireless Personal Area Networks, including sensor networks. It has the advantage of fitting the different requirements of potential applications by adequately setting its parameters. When its beacon mode is enabled, the protocol can provide real-time guarantees through its Guaranteed Time Slot (GTS) mechanism. This paper analyzes the performance of the GTS allocation mechanism in IEEE 802.15.4. The analysis gives a full understanding of the behavior of the GTS mechanism with regard to delay and throughput metrics. First, we propose two accurate models of service curves for a GTS allocation as a function of the IEEE 802.15.4 parameters. We then evaluate the delay bounds guaranteed by a GTS allocation using the Network Calculus formalism. Finally, based on the analytical results, we analyze the impact of the IEEE 802.15.4 parameters on the throughput and delay bound guaranteed by a GTS allocation. The results of this work pave the way for efficient dimensioning of an IEEE 802.15.4 cluster.
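
The kind of bound evaluated with Network Calculus can be illustrated with the standard result for a rate-latency service curve: if a GTS allocation offers beta(t) = R * max(t - T, 0) and the traffic is constrained by a token bucket alpha(t) = b + r*t with r <= R, the delay is bounded by D <= T + b/R. The numeric values below are placeholders, not the paper's expressions of R and T in terms of the IEEE 802.15.4 parameters.

```python
# Standard Network Calculus delay bound for a rate-latency server and token-bucket arrivals.
# The numeric figures in the example are assumptions, not values derived in the paper.

def gts_delay_bound(burst_bits: float, arrival_rate_bps: float,
                    service_rate_bps: float, latency_s: float) -> float:
    """Delay bound D = T + b/R for alpha(t) = b + r*t served by beta(t) = R*(t - T)+ with r <= R."""
    if arrival_rate_bps > service_rate_bps:
        raise ValueError("arrival rate exceeds the guaranteed rate: the delay is unbounded")
    return latency_s + burst_bits / service_rate_bps

if __name__ == "__main__":
    # Assumed figures: a GTS guaranteeing ~2 kbit/s after overheads, 0.25 s latency,
    # an 800-bit burst and a 500 bit/s average arrival rate.
    print(gts_delay_bound(burst_bits=800, arrival_rate_bps=500,
                          service_rate_bps=2000, latency_s=0.25))  # -> 0.65 s
```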