977 results for Kahler metrics


Relevance:

10.00%

Publisher:

Abstract:

The increased capabilities (e.g., processing, storage) of portable devices, along with users' constant need to retrieve and send information, have introduced a new form of communication. Users can seamlessly exchange data by means of opportunistic contacts among them, and this is what characterizes opportunistic networks (OppNets). OppNets allow users to communicate even when an end-to-end path may not exist between them. Since 2007, there has been a trend to improve the exchange of data by considering social similarity metrics. Social relationships, shared interests, and popularity are examples of such metrics that have been employed successfully: as users interact based on relationships and interests, this information can be used to decide on the best next forwarders of information. This Thesis combines the features of today's devices found in the regular urban environment with the current social-awareness trend in the context of opportunistic routing. To achieve this goal, the work was divided into different tasks that map to a set of specific objectives, leading to the following contributions: i) an up-to-date opportunistic routing taxonomy; ii) a universal evaluation framework that aids in devising and testing new routing proposals; iii) three social-aware utility functions that consider dynamic user behavior and can be easily incorporated into other routing proposals; iv) two opportunistic routing proposals based on users' daily routines and on the content traversing the network and users' interest in such content; and v) a structural analysis of the social-based network formed based on the approaches devised in this work.
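As an illustration of the kind of social-aware utility function the abstract describes, here is a minimal sketch that scores candidate next-hop forwarders from social metrics. The weights, metric names, and neighbour data are invented for the example; they are not the thesis's actual functions.

```python
def forwarding_utility(tie_strength, shared_interests, popularity,
                       w_tie=0.5, w_int=0.3, w_pop=0.2):
    """Score a candidate forwarder from social metrics, each in [0, 1].

    The linear combination and weights are illustrative only.
    """
    return w_tie * tie_strength + w_int * shared_interests + w_pop * popularity

def best_forwarder(candidates):
    """candidates: dict mapping node id -> (tie, interests, popularity)."""
    return max(candidates, key=lambda n: forwarding_utility(*candidates[n]))

neighbours = {
    "A": (0.9, 0.2, 0.1),   # strong social tie, little interest overlap
    "B": (0.4, 0.8, 0.6),   # weaker tie, but shares interests and is popular
}
print(best_forwarder(neighbours))
```

Here node "B" wins because interest overlap and popularity outweigh "A"'s stronger tie; a real protocol would recompute such scores at every opportunistic contact.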

Relevance:

10.00%

Publisher:

Abstract:

Future emerging market trends head towards positioning-based services, placing a new perspective on the way we obtain and exploit positioning information. On the one hand, innovations in information technology and wireless communication systems have enabled the development of numerous location-based applications such as vehicle navigation and tracking, sensor network applications, home automation, asset management, security, and context-aware location services. On the other hand, wireless networks themselves may benefit from localization information to improve the performance of different network layers. Location-based routing, synchronization, and interference cancellation are prime examples of applications where location information can be useful. Typical positioning solutions rely on measurements and exploitation of distance-dependent signal metrics, such as the received signal strength, time of arrival, or angle of arrival. They are cheaper and easier to implement than dedicated positioning systems based on fingerprinting, but at the cost of accuracy. Therefore, intelligent localization algorithms and signal processing techniques have to be applied to mitigate the lack of accuracy in distance estimates. Cooperation between nodes is used in cases where conventional positioning techniques do not perform well, due to a lack of existing infrastructure or an obstructed indoor environment. The objective is to concentrate on a hybrid architecture where some nodes have points of attachment to an infrastructure and are simultaneously interconnected via short-range ad hoc links. The availability of more capable handsets enables more innovative scenarios that take advantage of multiple radio access networks as well as peer-to-peer links for positioning. Link selection is used to optimize the trade-off between the power consumption of participating nodes and the quality of target localization.
The Geometric Dilution of Precision and the Cramér-Rao Lower Bound can be used as criteria for choosing the appropriate set of anchor nodes and corresponding measurements before attempting location estimation itself. This work analyzes the existing solutions for node selection in order to improve localization performance, and proposes a novel method based on utility functions. The proposed method is then extended to mobile and heterogeneous environments. Simulations have been carried out, as well as evaluation with real measurement data. In addition, some specific cases have been considered, such as localization in ill-conditioned scenarios and the use of negative information. The proposed approaches have been shown to enhance estimation accuracy while significantly reducing complexity, power consumption, and signalling overhead.
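A minimal sketch of GDOP-driven anchor selection, one of the criteria named above. The coordinates and the brute-force subset search are invented for illustration; the thesis's utility-function method is not reproduced here.

```python
import itertools
import math

def gdop(target, anchors):
    """Geometric Dilution of Precision for 2-D range-based positioning.

    Rows of the geometry matrix H are unit line-of-sight vectors from the
    target to each anchor; GDOP = sqrt(trace((H^T H)^{-1})). Assumes the
    anchors are not all collinear with the target.
    """
    rows = []
    for ax, ay in anchors:
        d = math.hypot(ax - target[0], ay - target[1])
        rows.append(((ax - target[0]) / d, (ay - target[1]) / d))
    a = sum(hx * hx for hx, _ in rows)   # H^T H is 2x2: [[a, b], [b, c]]
    b = sum(hx * hy for hx, hy in rows)
    c = sum(hy * hy for _, hy in rows)
    det = a * c - b * b
    return math.sqrt((a + c) / det)      # trace of the 2x2 inverse = (a+c)/det

def select_anchors(target, anchors, k):
    """Exhaustively pick the k-anchor subset with the lowest GDOP."""
    return min(itertools.combinations(anchors, k),
               key=lambda subset: gdop(target, subset))

anchors = [(0, 0), (10, 0), (5, 9), (1, 0)]
print(select_anchors((5, 3), anchors, 3))
```

The well-spread subset is preferred over one containing the nearly redundant anchor at (1, 0); exhaustive search is only feasible for small anchor sets, which is precisely why smarter selection methods matter.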

Relevance:

10.00%

Publisher:

Abstract:

Dependence clusters are (maximal) collections of mutually dependent source code entities according to some dependence relation. Their presence in software complicates many maintenance activities, including testing, refactoring, and feature extraction. Despite several studies finding them common in production code, their formation, identification, and overall structure are not well understood, partly because of challenges in approximating true dependences between program entities. Previous research has considered two approximate dependence relations: a fine-grained statement-level relation using control and data dependences from a program's System Dependence Graph, and a coarser relation based on function-level control-flow reachability. In principle, the first is more expensive and more precise than the second. Using a collection of twenty programs, we present an empirical investigation of the clusters identified by these two approaches. In support of the analysis, we consider a hybrid cluster type that works at the coarser function level but is based on the higher-precision statement-level dependences. The three types of clusters are compared based on their slice sets using two clustering metrics. We also perform extensive analysis of the programs to identify linchpin functions – functions primarily responsible for holding a cluster together. Results include evidence that the less expensive, coarser approaches can often be used as effective proxies for the more expensive, finer-grained approaches. Finally, the linchpin analysis shows that linchpin functions can be effectively and automatically identified.
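A minimal sketch of the coarser, function-level relation: treating functions as mutually dependent when each can reach the other through the call/control-flow graph. The graph and function names are hypothetical; real analyses operate on System Dependence Graphs or control-flow graphs rather than toy call maps.

```python
def reachable(graph, start):
    """All nodes reachable from start via the call/control-flow graph."""
    seen, stack = set(), [start]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(graph.get(n, ()))
    return seen

def dependence_clusters(graph):
    """Maximal sets of mutually reachable functions, of size > 1.

    Two functions are 'mutually dependent' here when each can reach the
    other, i.e. they lie in the same strongly connected component.
    """
    nodes = list(graph)
    reach = {n: reachable(graph, n) for n in nodes}
    clusters, assigned = [], set()
    for n in nodes:
        if n in assigned:
            continue
        comp = {m for m in nodes if m in reach[n] and n in reach[m]}
        if len(comp) > 1:
            clusters.append(comp)
            assigned |= comp
    return clusters

calls = {"main": ["parse"], "parse": ["eval"], "eval": ["parse", "log"], "log": []}
print(dependence_clusters(calls))
```

In this toy graph, `parse` and `eval` form one cluster because they call each other (directly or indirectly), while `main` and `log` lie outside it; a linchpin analysis would then ask which function's removal dissolves the cluster.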

Relevance:

10.00%

Publisher:

Abstract:

This work summarises the Intercalibration Exercise (IE) required for the Common Implementation Strategy of the Water Framework Directive (WFD; 2000/60/EC) that was carried out in Portugal and applied to a coastal region. The WFD aims to achieve good ecological status for all waters in the European Community by 2015. The ecological status of a water body is determined using a range of hydromorphological and physico-chemical quality elements, as well as Biological Quality Elements (BQE). In coastal waters, the biological elements include phytoplankton, other aquatic flora and benthic invertebrate fauna. Good cooperation with the other Member States allowed the IE to proceed without a complete data set, and Portugal was able to intercalibrate and harmonise methods within the North East Atlantic Geographical Intercalibration Group for most of the BQE. The appropriate metrics and corresponding methods were agreed under the framework of the RECITAL (Reference Conditions and Intercalibration) project, funded by the Portuguese Water Institute, INAG. Some preliminary sampling was undertaken, but not sufficient to establish the reference conditions. The study area was a coastal lagoon in the southern part of Portugal. The focus was on the phytoplankton quality element, but other BQE were also taken into account. Two sampling stations in the Ria Formosa coastal lagoon were considered in this exercise: Ramalhete and Ponte. The metrics adopted by the Intercalibration Exercise groups were applied, enabling the classification of the two stations as Good/High status for the majority of the BQE parameters.

Relevance:

10.00%

Publisher:

Abstract:

We write to comment on the recently published paper “Defining phytoplankton class boundaries in Portuguese transitional waters: an evaluation of the ecological quality status according to the Water Framework Directive” (Brito et al., 2012). This paper presents an integrated methodology for analysing the ecological quality status of several Portuguese transitional waters using phytoplankton-related metrics. One of the systems analysed, the Guadiana estuary in southern Portugal, is considered the most problematic estuary, with its upstream water bodies classified as Poor in terms of ecological status. We strongly disagree with this conclusion, and we would like to raise awareness of some methodological constraints that, in our opinion, are the basis of such misleading conclusions and should therefore not be neglected when using phytoplankton to assess the ecological status of natural waters.

Relevance:

10.00%

Publisher:

Abstract:

Vehicular Ad-hoc NETworks (VANETs) have existed since the 1980s, but in recent years they have been increasingly deployed in cities around the world. They add information to road networks by establishing communication among their components: mainly vehicles, but also certain roadside infrastructure directly related to motorists (traffic lights, parking meters, infrastructure dedicated to VANETs, and others). Adding infrastructure provides fixed support for disseminating information through the network. The main objective of this type of network is to improve road safety and traffic conditions, and to offer drivers and passengers some advertising or entertainment applications. To this end, it is important to circulate information among the vehicles as efficiently as possible. The use of infrastructure when simulating these networks is very often neglected. Indeed, a large portion of the protocols presented in the literature simulate an ad hoc network with nodes that move faster and follow a defined map, but they do not take into account the specific characteristics of a mobile vehicular network. Information routing in vehicular networks currently uses infrastructure opportunistically, but in time infrastructure will be widely present in cities and on highways. This is why, in this thesis, we focus on studying how the routing metrics vary when infrastructure is added along a highway, using the AODV routing protocol. In addition, we modified the AODV protocol to force messages to take the path through the infrastructure whenever it is available.
The results presented are encouraging, and they show that it is important to simulate VANETs comprehensively, taking infrastructure into account.
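AODV itself is a distributed, reactive protocol; the sketch below only illustrates the effect of biasing route selection toward infrastructure, by discounting hops through roadside units (RSUs) in a centralised shortest-path computation. The topology and cost values are invented for the example.

```python
import heapq

def shortest_route(links, src, dst, infrastructure, rsu_bonus=0.5):
    """Dijkstra over VANET links, discounting hops into roadside units.

    rsu_bonus < 1 makes infrastructure hops cheaper, so a route through
    RSUs is preferred whenever one exists. Assumes dst is reachable.
    """
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry
        for nxt in links.get(node, ()):
            cost = rsu_bonus if nxt in infrastructure else 1.0
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

links = {
    "car1": ["car2", "rsu1"], "car2": ["car3"], "car3": ["car4"],
    "rsu1": ["rsu2"], "rsu2": ["car4"], "car4": [],
}
print(shortest_route(links, "car1", "car4", {"rsu1", "rsu2"}))
```

With the RSUs present the route goes car1 → rsu1 → rsu2 → car4; with an empty infrastructure set, the same call falls back to the pure vehicle-to-vehicle chain.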

Relevance:

10.00%

Publisher:

Abstract:

Doctoral thesis, Biomedical Sciences (Neurosciences), Universidade de Lisboa, Faculdade de Medicina, 2014

Relevance:

10.00%

Publisher:

Abstract:

Doctoral thesis, Biomedical Engineering and Biophysics, Universidade de Lisboa, Faculdade de Ciências, 2015

Relevance:

10.00%

Publisher:

Abstract:

Concert program for University Symphony and Student Soloists in a Concerto Concert, January 13, 1963

Relevance:

10.00%

Publisher:

Abstract:

Estimates of airline delay costs as a function of delay magnitude are combined with fuel and (future) emissions charges to make cost-benefit trade-offs in the pre-departure and airborne phases. Hypothetical scenarios for the distribution of flow management slots are explored in terms of their cost and target-setting implications. The general superiority of passenger-centric metrics is of significance for delay measurement, although flight delays are still the only commonly reported type of metric in both the US and Europe. There is a particular need for further research into reactionary (network) effects, especially with regard to passenger metrics and flow management delay.

Relevance:

10.00%

Publisher:

Abstract:

Reactionary delays constitute nearly half of all delay minutes in Europe. A capped, multi-component model is presented for estimating reactionary delay costs, as a non-linear function of primary delay duration. Maximum Take-Off Weights, historically established as a charging mechanism, may be used to model delay costs. Current industry reporting on delay is flight-centric. Passenger-centric metrics are needed to better understand delay propagation. In ATM, it is important to take account of contrasting flight- and passenger-centric effects, caused by cancellations, for example. Costs to airlines and passenger disutility will both continue to be driven by delay relative to the original schedule.
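A toy illustration of a capped, non-linear cost model of the shape described, with Maximum Take-Off Weight as the scaling factor. The coefficients, exponent, and cap below are invented for the sketch and are not the paper's calibrated values.

```python
def reactionary_delay_cost(primary_delay_min, mtow_tonnes,
                           base_rate=1.2, exponent=1.5, cap=50000.0):
    """Capped, non-linear reactionary cost as a function of primary delay.

    Cost grows super-linearly with the primary delay duration and scales
    with Maximum Take-Off Weight; all parameters here are illustrative.
    """
    cost = base_rate * mtow_tonnes * primary_delay_min ** exponent
    return min(cost, cap)

# Short delays cost disproportionately less than long ones, until the cap.
for delay in (15, 60, 180):
    print(delay, round(reactionary_delay_cost(delay, mtow_tonnes=70), 1))
```

The super-linear term captures propagation through later rotations, while the cap reflects recovery actions (cancellations, aircraft swaps) that bound worst-case network cost.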

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2015-12

Relevance:

10.00%

Publisher:

Abstract:

What is the best luminance contrast weighting-function for image quality optimization? Traditionally measured contrast sensitivity functions (CSFs) have often been used as weighting-functions in image quality and difference metrics. Such weightings have been shown to result in increased sharpness and perceived quality of test images. We suggest contextual CSFs (cCSFs) and contextual discrimination functions (cVPFs) should provide bases for further improvement, since these are directly measured from pictorial scenes, modeling threshold and suprathreshold sensitivities within the context of complex masking information. Image quality assessment is understood to require detection and discrimination of masked signals, making contextual sensitivity and discrimination functions directly relevant. In this investigation, test images are weighted with a traditional CSF, a cCSF, a cVPF and a constant function. Controlled mutations of these functions are also applied as weighting-functions, seeking the optimal spatial frequency band weighting for quality optimization. Image quality, sharpness and naturalness are then assessed in two-alternative forced-choice psychophysical tests. We show that maximal quality for our test images results from cCSFs and cVPFs mutated to boost contrast in the higher visible frequencies.
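A minimal sketch of how a contrast weighting-function can be applied to an image in the frequency domain. The gain curve used below is invented purely to illustrate boosting higher frequencies; it is not a measured CSF, cCSF, or cVPF.

```python
import numpy as np

def weight_by_csf(image, csf):
    """Apply a contrast weighting-function to an image's spectrum.

    csf maps normalised radial frequency (cycles/pixel) to a gain; the
    image is transformed, weighted, and inverse-transformed.
    """
    spectrum = np.fft.fft2(image)
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    radial = np.hypot(fx, fy)             # radial frequency of each bin
    return np.real(np.fft.ifft2(spectrum * csf(radial)))

rng = np.random.default_rng(0)
img = rng.random((64, 64))                # stand-in for a pictorial scene
boost_high = lambda f: 1.0 + 2.0 * f      # illustrative gain rising with frequency
out = weight_by_csf(img, boost_high)
print(out.shape)
```

The "controlled mutations" in the study correspond to systematically reshaping the gain curve passed in as `csf` and assessing the resulting images psychophysically.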

Relevance:

10.00%

Publisher:

Abstract:

Assessing the subjective quality of processed images through an objective quality metric is a key issue in multimedia processing and transmission. In some scenarios, it is also important to evaluate the quality of the received images with minimal reference to the transmitted ones. For instance, for closed-loop optimisation of image and video transmission, the quality measure can be evaluated at the receiver and provided as feedback information to the system controller. The original images - prior to compression and transmission - are not usually available at the receiver side, so the receiver must rely on an objective quality metric that needs no reference, or only minimal reference, to the original images. The observation that the human eye is very sensitive to the edge and contour information of an image underpins the proposal of our reduced-reference (RR) quality metric, which compares edge information between the distorted and the original image. Results highlight that the metric correlates well with subjective observations, also in comparison with commonly used full-reference metrics and with a state-of-the-art reduced-reference metric.
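A toy reduced-reference scheme along these lines, in which only a single scalar edge descriptor of the original travels with the image. The descriptor and score below are simplifications for illustration, not the paper's actual metric.

```python
import numpy as np

def edge_strength(image):
    """Mean gradient magnitude: a tiny scalar edge descriptor."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.hypot(gx, gy).mean())

def rr_quality(reference_descriptor, received_image):
    """Reduced-reference score in (0, 1]; 1 means edge content is preserved.

    Only the scalar descriptor of the original is needed at the receiver,
    not the original image itself.
    """
    d = edge_strength(received_image)
    return min(d, reference_descriptor) / max(d, reference_descriptor)

rng = np.random.default_rng(1)
original = rng.random((32, 32))
blurred = (original + np.roll(original, 1, axis=0) +
           np.roll(original, 1, axis=1)) / 3.0   # crude blur distortion
desc = edge_strength(original)                   # sent as side information
print(rr_quality(desc, original), rr_quality(desc, blurred))
```

The blurred image loses edge energy and scores below 1, while the undistorted image scores exactly 1; a practical metric would compare richer edge statistics than a single mean.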

Relevance:

10.00%

Publisher:

Abstract:

Region merging algorithms commonly produce results that fall well short of the currently accepted state-of-the-art image segmentation techniques. The main challenge is the selection of an appropriate and computationally efficient method to control resolution and region homogeneity. In this paper we present a region merging algorithm that includes a semi-greedy criterion and an adaptive threshold to control segmentation resolution. In addition, we present a new relative performance indicator that compares algorithm performance across many metrics against the results of human segmentation. Qualitative (visual) comparison demonstrates that our method produces results that outperform existing leading techniques.
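A toy 1-D sketch of the two ingredients named (a semi-greedy merge criterion and an adaptive threshold), applied to a row of pixel intensities. All parameters and the adaptation rule are invented for illustration; the paper's criterion operates on 2-D regions.

```python
import random

def region_merge(values, base_threshold=5.0, top_k=2, seed=0):
    """Semi-greedy 1-D region merging with an adaptive threshold.

    Rather than always merging the single most similar adjacent pair (pure
    greedy), one of the top_k mergeable pairs is picked at random, which
    can avoid locally optimal merge orders. The threshold tightens as
    regions grow, so larger regions must be more homogeneous to merge.
    """
    rng = random.Random(seed)
    regions = [[v] for v in values]          # every pixel starts as a region
    mean = lambda r: sum(r) / len(r)

    def mergeable_pairs():
        out = []
        for i in range(len(regions) - 1):
            a, b = regions[i], regions[i + 1]
            diff = abs(mean(a) - mean(b))
            # adaptive threshold: shrinks with the smaller region's size
            if diff <= base_threshold / (1 + 0.1 * min(len(a), len(b))):
                out.append((diff, i))
        return sorted(out)

    while len(regions) > 1:
        candidates = mergeable_pairs()[:top_k]
        if not candidates:
            break                            # no pair similar enough to merge
        _, i = rng.choice(candidates)        # semi-greedy: not always the best
        regions[i:i + 2] = [regions[i] + regions[i + 1]]
    return [round(mean(r), 1) for r in regions]

print(region_merge([10, 11, 12, 50, 51, 90]))
```

On this input the algorithm recovers three regions (means 11.0, 50.5 and 90.0) regardless of the random choices, because the large inter-region jumps always exceed the adaptive threshold.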