945 results for Ad-hoc
Abstract:
The growing availability and popularity of opinion-rich resources on the web, such as review sites and personal blogs, has made it convenient to learn about the opinions and experiences of ordinary people. At the same time, this eruption of data has made it difficult to reach a conclusion. In this thesis, I develop a novel recommendation system, Recomendr, that can help users digest all the reviews about an entity and compare candidate entities along ad-hoc dimensions specified by keywords. It takes keyword-specified ad-hoc dimensions/features as input from the user and, based on those features, compares the selected range of entities using the reviews available in related User Generated Content (UGC), e.g. online reviews. It then rates the textual stream of data using a scoring function and returns a decision based on the aggregate opinion to the user. Evaluation of Recomendr on a data set in the laptop domain shows that it can effectively recommend the best laptop according to user-specified dimensions such as price. Recomendr is a general system that can potentially work for any entity on which online reviews or other opinionated text are available.
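To make the idea of keyword-driven scoring concrete, here is a minimal, self-contained sketch of that kind of pipeline. The lexicon, function names, and sample reviews are illustrative assumptions, not the thesis's actual implementation.

```python
# Hypothetical sketch: score entities on ad-hoc keyword dimensions by
# aggregating a crude lexicon-based opinion score over the review
# sentences that mention each keyword.

POSITIVE = {"great", "good", "excellent", "fast", "cheap", "sturdy"}
NEGATIVE = {"bad", "poor", "slow", "expensive", "fragile", "noisy"}

def sentence_score(sentence: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = sentence.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def dimension_score(reviews: list[str], keyword: str) -> float:
    """Aggregate opinion on one ad-hoc dimension (e.g. 'price') by
    scoring only the sentences that mention the keyword."""
    relevant = [s for r in reviews for s in r.split(".") if keyword in s.lower()]
    if not relevant:
        return 0.0
    return sum(sentence_score(s) for s in relevant) / len(relevant)

def recommend(entities: dict[str, list[str]], dimensions: list[str]) -> str:
    """Return the entity whose reviews score best, summed over dimensions."""
    totals = {name: sum(dimension_score(revs, d) for d in dimensions)
              for name, revs in entities.items()}
    return max(totals, key=totals.get)

laptops = {
    "laptop_a": ["Great battery and the price is good.", "A bit slow."],
    "laptop_b": ["Fast, but the price is bad.", "Poor battery."],
}
print(recommend(laptops, ["price", "battery"]))  # -> laptop_a
```

A real system would replace the toy lexicon with a trained sentiment scorer, but the aggregate-then-rank structure is the point of the sketch.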
Abstract:
Observational studies in the field of sport are complicated by the added difficulty of having to analyse multiple, complex events or behaviours that may last just a fraction of a second. In this study, we analyse three aspects related to the reliability of data collected in such a study. The first aim was to analyse and compare the reliability of data sets assessed quantitatively (calculation of kappa statistic) and qualitatively (consensus agreement method). The second aim was to describe how, by ensuring the alignment of events, we calculated the kappa statistic for the order parameter using SDIS-GSEQ software (version 5.1) for data sets containing different numbers of sequences. The third objective was to describe a new consultative procedure designed to remove the confusion generated by discordant data sets and improve the reliability of the data. The procedure is called "consultative" because it involves the participation of a new observer who is responsible for consulting the existing observations and deciding on the definitive result.
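The quantitative arm of this comparison rests on Cohen's kappa; the following minimal sketch shows the computation for two observers coding the same event sequence. The event codes are invented for illustration, and this reproduces only the statistic, not the SDIS-GSEQ alignment procedure.

```python
# Cohen's kappa for two observers coding the same sequence of events.
from collections import Counter

def cohen_kappa(obs1: list[str], obs2: list[str]) -> float:
    """kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from marginal frequencies."""
    assert len(obs1) == len(obs2)
    n = len(obs1)
    p_o = sum(a == b for a, b in zip(obs1, obs2)) / n
    c1, c2 = Counter(obs1), Counter(obs2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two observers coding the same ten events (codes are hypothetical):
a = ["pass", "shot", "pass", "dribble", "pass", "shot", "pass", "pass", "dribble", "shot"]
b = ["pass", "shot", "pass", "pass",    "pass", "shot", "pass", "shot", "dribble", "shot"]
print(f"kappa = {cohen_kappa(a, b):.2f}")  # chance-corrected agreement
```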
Abstract:
With wireless vehicular communications, Vehicular Ad Hoc Networks (VANETs) enable numerous applications that enhance traffic safety, traffic efficiency, and the driving experience. However, VANETs also pose severe security and privacy challenges which need to be thoroughly investigated. In this dissertation, we enhance the security, privacy, and applications of VANETs by 1) designing application-driven security and privacy solutions for VANETs, and 2) designing appealing VANET applications with proper security and privacy assurance. First, the security and privacy challenges of VANETs with the most application significance are identified and thoroughly investigated. Combining theoretical novelty with realistic considerations, the resulting security and privacy schemes are especially well suited to VANETs. Specifically, multi-hop communications in VANETs suffer from packet dropping, packet tampering, and communication failures, which have not been satisfactorily addressed in the literature. Thus, a lightweight reliable and faithful data packet relaying framework (LEAPER) is proposed to ensure reliable and trustworthy multi-hop communications by enhancing the cooperation of neighboring nodes. Message verification, including both content and signature verification, is generally computation-intensive and imposes severe scalability burdens on each node. The resource-aware message verification (RAMV) scheme is proposed to ensure resource-aware, secure, and application-friendly message verification in VANETs. On the other hand, to make VANETs acceptable to privacy-sensitive users, the identity and location privacy of each node should be properly protected. To this end, a joint privacy and reputation assurance (JPRA) scheme is proposed to synergistically support privacy protection and reputation management by reconciling their inherently conflicting requirements. In addition, the privacy implications of short-time certificates are thoroughly investigated in a short-time certificates-based privacy protection (STCP2) scheme, to make privacy protection in VANETs feasible with short-time certificates. Second, three novel solutions, namely VANET-based ambient ad dissemination (VAAD), general-purpose automatic survey (GPAS), and VehicleView, are proposed to support appealing value-added applications of VANETs. These solutions all follow practical application models, and an incentive-centered architecture is proposed for each solution to balance the conflicting requirements of the involved entities. The critical security and privacy challenges of these applications are also investigated and addressed with novel solutions. With proper security and privacy assurance, these solutions thus show great application significance and economic potential for VANETs. In sum, by enhancing the security, privacy, and applications of VANETs, this dissertation fills the gap between existing theoretical research and realistic implementation, facilitating the practical deployment of VANETs.
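The abstract names RAMV without describing its mechanics, so the following is only a generic, hypothetical sketch of the underlying idea of resource-aware verification: under a per-interval CPU budget, verify the most urgent pending messages first and defer the rest. Priorities, costs, and the verify() stub are all assumptions for illustration.

```python
# Generic sketch of resource-aware message verification (not the RAMV scheme).
import heapq

VERIFY_COST_MS = 5.0          # assumed cost of one signature verification

def verify(msg) -> bool:      # stub standing in for real cryptographic checks
    return True

def resource_aware_verify(pending, budget_ms: float):
    """pending: iterable of (priority, msg); lower number = more urgent
    (e.g. safety beacons before infotainment). Returns (verified, deferred)."""
    heap = list(pending)
    heapq.heapify(heap)
    verified, spent = [], 0.0
    while heap and spent + VERIFY_COST_MS <= budget_ms:
        prio, msg = heapq.heappop(heap)
        if verify(msg):
            verified.append(msg)
        spent += VERIFY_COST_MS
    return verified, [m for _, m in heap]  # deferred messages

done, deferred = resource_aware_verify(
    [(0, "emergency-brake"), (2, "ad"), (1, "traffic-update")], budget_ms=12.0)
print(done, deferred)  # two messages verified within budget, one deferred
```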
Abstract:
In a recent paper [1], Reis showed that both principles of extremum entropy production rate, which are often used in the study of complex systems, are corollaries of the Constructal Law. In fact, both follow from the maximization of overall system conductivities under appropriate constraints: the maximum rate of entropy production (MEP) occurs when all the forces in the system are kept constant, whereas the minimum rate of entropy production (mEP) occurs when all the currents that cross the system are kept constant. In this paper it is shown how the so-called principle of "minimum energy expenditure", which is often used as the basis for explaining many morphologic features of biologic systems, and also of inanimate systems, is likewise a corollary of Bejan's Constructal Law [2]. Following the general proof, some cases, namely the scaling laws of human vascular systems and of river basins, are discussed as illustrations from the animate and inanimate worlds, respectively.
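The link between conductivity maximization and the two extremum principles can be made explicit for a single flow. The following is a minimal single-flow illustration assuming a linear flux-force relation, not Reis's full proof.

```latex
% Linear flux-force relation with conductivity L:
\[
  J = L\,X, \qquad \sigma = J\,X \quad \text{(entropy production rate)}.
\]
% Force held constant: \sigma = L X^2 grows with L, so maximizing
% conductivity maximizes entropy production (MEP):
\[
  \sigma\big|_{X=\text{const}} = L\,X^{2} \;\nearrow\; \text{as } L \nearrow .
\]
% Current held constant: \sigma = J^2/L decreases with L, so maximizing
% conductivity minimizes entropy production (mEP):
\[
  \sigma\big|_{J=\text{const}} = \frac{J^{2}}{L} \;\searrow\; \text{as } L \nearrow .
\]
```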
Abstract:
Popper's explications of 'ad hoc' in relation to hypotheses and explanations turn out to be either trivial, confused, or mistaken. One such explication I discuss at length is circularity; another is reduction in empirical content. I argue that non-circularity is preferable to non-ad-hocness as a condition on an acceptable explanation or explanans, and I isolate some persistent errors in his analysis. Second, Popper is barking up the wrong tree in proscribing reductions in empirical content in novel hypotheses; such reductions may constitute scientific progress. He fails to show that ad hoc hypotheses are the threat to science he claims.
Abstract:
Data caching is an attractive solution for reducing bandwidth demands and network latency in mobile ad hoc networks. Deploying caches in mobile nodes can reduce overall traffic considerably, since cache hits eliminate the need to contact the data source frequently and thereby avoid additional network overhead. In this paper we propose a data discovery and cache management policy for cooperative caching, which reduces power usage, caching overhead, and delay by reducing the number of control messages flooded into the network. For this purpose, a cache discovery process based on the position coordinates of neighboring nodes is developed. The simulation results are promising with respect to the metrics studied.
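As a toy illustration of position-based cache discovery: instead of flooding the network, a node queries only the nearest neighbor believed to hold the item. Coordinates, node names, and the cache layout below are invented, not the paper's protocol.

```python
# Hypothetical sketch: pick the geometrically nearest caching neighbor.
import math

def nearest_cache(my_pos, neighbors, item):
    """neighbors: {node_id: (x, y, cached_items)}. Returns the closest
    neighbor holding `item`, or None (fall back to the data source)."""
    best, best_dist = None, math.inf
    for node, (x, y, cached) in neighbors.items():
        if item in cached:
            d = math.dist(my_pos, (x, y))
            if d < best_dist:
                best, best_dist = node, d
    return best

neighbors = {
    "n1": (10.0, 4.0, {"map_tile_7"}),
    "n2": (3.0, 2.0, {"map_tile_7", "weather"}),
    "n3": (1.0, 9.0, set()),
}
print(nearest_cache((0.0, 0.0), neighbors, "map_tile_7"))  # -> n2
```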
Abstract:
The goal of this thesis is to develop distributed systems composed of mobile devices that exchange information through opportunistic peer-to-peer wireless communications. We first analyze the main wireless communication technologies suitable for this purpose, focusing on Wi-Fi ad hoc networks, whose performance in large-scale systems is studied with the ns-3 network simulator. We then present the development of software components, based on Akka Stream, for building computational fields through opportunistic communications between Android devices over Wi-Fi ad hoc networks.
Abstract:
In recent decades, worrying mortality events have been recorded in the clam Chamelea gallina, particularly along the Emilia-Romagna coast, and their causes have not yet been clarified. The present study characterized the microbial community associated with the clam's digestive gland, using sequencing of the hypervariable V3-V4 region of the 16S rRNA gene, in order to detect dysbiosis in areas of high mortality. We therefore explored seasonal variation (from July to November) in the structure of the clam microbiota and in the microbial ecosystem of the surrounding seawater at four sites chosen ad hoc, along a gradient of historical mortality incidence running North to South between the Ravenna and Rimini areas. The health status of the clams and of their associated microbiota was assessed through the condition index and NGS profiling of the gut microbial ecosystem, respectively. Our data, although preliminary, show opposite seasonal trends in the internal (alpha) diversity of the clam microbiota between the northern and southern areas: diversity decreases from summer to autumn in the northern areas (Ravenna and Lido di Savio), while it increases over the same period in the southern areas (Rimini and Cesenatico). Consistent with the alpha diversity data, PCoA analysis of the variation of the clam microbiota across the four study sites, stratified by season, shows profound differences between the two north-south extremes. In particular, the integrated analysis of historical productivity data, the condition index, and the dynamics of the digestive gland microbiota allowed us to single out five microbial families as potential Growth Promoting Bacteria, as they were associated with a peak in condition index recorded in the low-mortality areas in September.
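The seasonal alpha-diversity trends reported above rest on indices such as Shannon's H; the following minimal sketch computes it from a family-level abundance vector. The counts are invented for illustration, not the study's data.

```python
# Shannon alpha-diversity index from taxon abundance counts.
import numpy as np

def shannon_index(counts) -> float:
    """H = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

summer = [120, 80, 40, 10, 5]   # e.g. reads per microbial family, July
autumn = [60, 55, 50, 45, 40]   # a more even community, November
print(f"H(summer) = {shannon_index(summer):.2f}")
print(f"H(autumn) = {shannon_index(autumn):.2f}")  # higher: more even/diverse
```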
Abstract:
Diabetic Retinopathy (DR) is a complication of diabetes that can lead to blindness if not detected early. Automated screening algorithms have the potential to improve identification of patients who need further medical attention. However, the identification of lesions must be accurate to be useful for clinical application. The bag-of-visual-words (BoVW) algorithm employs a maximum-margin classifier in a flexible framework that is able to detect the most common DR-related lesions, such as microaneurysms, cotton-wool spots, and hard exudates. BoVW makes it possible to bypass pre- and post-processing of the retinographic images, as well as the need for specific ad hoc techniques to identify each type of lesion. We performed an extensive evaluation of the BoVW model using three large retinographic datasets (DR1, DR2 and Messidor) with different resolutions, collected by different healthcare personnel. The results demonstrate that the BoVW classification approach can identify different lesions within an image without requiring a different algorithm for each lesion, reducing processing time and providing a more flexible diagnostic system. Our BoVW scheme is based on sparse low-level feature detection with a Speeded-Up Robust Features (SURF) local descriptor, and on mid-level features based on semi-soft coding with max pooling. The best BoVW representation for retinal image classification achieved an area under the receiver operating characteristic curve (AUC-ROC) of 97.8% (exudates) and 93.5% (red lesions) under a cross-dataset validation protocol. In assessing the accuracy of detecting cases that require referral within one year, the sparse extraction technique combined with semi-soft coding and max pooling obtained an AUC of 94.2 ± 2.0%, outperforming current methods. These results indicate that, for retinal image classification tasks in clinical practice, BoVW equals and in some instances surpasses the results obtained with dense detection (widely believed to be the best choice in many vision problems) for the low-level descriptors.
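The distinctive mid-level step here is semi-soft coding with max pooling; the sketch below shows that step in schematic form. Random vectors stand in for SURF descriptors, and the codebook size, neighborhood, and kernel width are illustrative assumptions, not the paper's settings.

```python
# Schematic BoVW mid-level pipeline: semi-soft coding + max pooling.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 64))   # stand-ins for 64-D SURF descriptors

k = 32                                     # codebook size (illustrative)
codebook = KMeans(n_clusters=k, n_init=10, random_state=0).fit(descriptors)
words = codebook.cluster_centers_          # the visual words

def semi_soft_code(d, words, n_nearest=5, sigma=1.0):
    """Assign descriptor d to its n nearest visual words with Gaussian
    weights; all other coefficients stay zero (the 'semi-soft' part)."""
    dists = np.linalg.norm(words - d, axis=1)
    nearest = np.argsort(dists)[:n_nearest]
    d2 = dists[nearest] ** 2
    w = np.exp(-(d2 - d2.min()) / (2 * sigma ** 2))  # shift avoids underflow
    code = np.zeros(len(words))
    code[nearest] = w / w.sum()
    return code

def bovw_vector(image_descriptors, words):
    """Max pooling: keep, per visual word, the strongest response
    over all of the image's local descriptors."""
    codes = np.array([semi_soft_code(d, words) for d in image_descriptors])
    return codes.max(axis=0)

x = bovw_vector(descriptors[:80], words)   # one 'image' of 80 descriptors
print(x.shape)                             # (32,) -> input to a max-margin SVM
```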
Abstract:
Aims. A model-independent reconstruction of the cosmic expansion rate is essential to a robust analysis of cosmological observations. Our goal is to demonstrate that current data are able to provide reasonable constraints on the behavior of the Hubble parameter with redshift, independently of any cosmological model or underlying gravity theory. Methods. Using type Ia supernova data, we show that it is possible to analytically calculate the Fisher matrix components in a Hubble parameter analysis without assumptions about the energy content of the Universe. We used a principal component analysis to reconstruct the Hubble parameter as a linear combination of the Fisher matrix eigenvectors (principal components). To suppress the bias introduced by the high-redshift behavior of the components, we treated the value of the Hubble parameter at high redshift as a free parameter. We first tested our procedure on a mock sample of type Ia supernova observations and then applied it to the real data compiled by the Sloan Digital Sky Survey (SDSS) group. Results. In the mock sample analysis, we demonstrate that the bias introduced by the high-redshift behavior of the principal components can be drastically suppressed. Applying the procedure to the real data, we show that it determines the behavior of the Hubble parameter with reasonable uncertainty, without introducing any ad-hoc parameterizations. Moreover, our reconstruction agrees with completely independent measurements of the Hubble parameter obtained from red-envelope galaxies.
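As a sketch of why the Fisher matrix is analytic here, consider the standard PCA-of-H(z) construction; the binning and notation below are illustrative and may differ from the paper's exact choices.

```latex
% Expand the Hubble parameter in basis functions over redshift bins:
\[
  H(z) = \sum_{i=1}^{N} x_i\, h_i(z).
\]
% The SN Ia distance modulus depends on H(z) only through an integral,
\[
  \mu(z) = 5 \log_{10}\!\frac{d_L(z)}{10\,\mathrm{pc}}, \qquad
  d_L(z) = (1+z)\, c \int_0^z \frac{dz'}{H(z')},
\]
% so the Fisher matrix over the bin amplitudes,
\[
  F_{ij} = \sum_{n} \frac{1}{\sigma_n^2}
           \frac{\partial \mu(z_n)}{\partial x_i}
           \frac{\partial \mu(z_n)}{\partial x_j},
\]
% has derivatives computable in closed form. Diagonalizing F yields
% eigenvectors e_i(z) (the principal components), and H(z) is then
% reconstructed from the best-constrained ones:
\[
  H(z) \simeq \sum_{i=1}^{M \le N} \alpha_i\, e_i(z).
\]
```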
Abstract:
A thermodynamic approach is presented to model devices manufactured with cellular polymers. These are heterogeneous, nonpolar space-charge electrets that exhibit much higher piezoelectricity than the well-known ferroelectric polymers. Their pyroelectric and piezoelectric properties are characterized by adequate coefficients which quantify the performance of devices manufactured with these materials. The method presented in this contribution for calculating those coefficients is exact and consistent, avoiding ad hoc simplifications introduced in other approaches. The results obtained by this method allow conclusions to be drawn regarding device optimization.
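For orientation, these are the standard textbook definitions of the coefficients in question; the paper's exact thermodynamic derivation may differ in detail.

```latex
% Piezoelectric coefficient: dielectric displacement per unit stress,
\[
  d_{33} = \left(\frac{\partial D_3}{\partial T_3}\right)_{E,\,\Theta},
\]
% Pyroelectric coefficient: dielectric displacement per unit temperature,
\[
  p_3 = \left(\frac{\partial D_3}{\partial \Theta}\right)_{E,\,T},
\]
% with D the dielectric displacement, T_3 the mechanical stress,
% E the electric field, and \Theta the temperature.
```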
Abstract:
Wireless Sensor Networks (WSNs) have a vast field of applications, including deployment in hostile environments. The adoption of security mechanisms is therefore fundamental. However, the extremely constrained nature of sensors and the potentially dynamic behavior of WSNs hinder the use of the key management mechanisms commonly applied in modern networks. For this reason, many lightweight key management solutions have been proposed to overcome these constraints. In this paper, we review the state of the art of these solutions and evaluate them based on metrics adequate for WSNs. We focus on pre-distribution schemes well adapted to homogeneous networks (since this is the more general network organization), and identify generic features that can improve some of these metrics. We also discuss open challenges in the area and future research directions.
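As a concrete illustration of the pre-distribution schemes surveyed here, consider the classic Eschenauer-Gligor random scheme, used below only as a baseline example: each node draws m keys from a pool of P, and two neighbors can establish a secure link iff their key rings intersect.

```python
# Probability that two random key rings share at least one key.
from math import comb

def share_probability(P: int, m: int) -> float:
    """P(two random m-key rings from a P-key pool share at least one key)."""
    return 1.0 - comb(P - m, m) / comb(P, m)

# Trade-off: bigger rings give better connectivity but weaker resilience,
# since each captured node exposes more of the pool.
for m in (50, 75, 100):
    print(f"pool=10000, ring={m}: p_share = {share_probability(10_000, m):.3f}")
```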
Abstract:
This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the uncertainties involved in predicting crop insurance premium rates, as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yields was analyzed for 290 counties in the State of Paraná (Brazil) for the period 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates, improvements that are especially relevant where data are limited.
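A stripped-down sketch of the two ingredients described above, a hierarchical yield model and a premium rate taken directly from the posterior predictive, is given below using PyMC. The county effects, priors, coverage level, and simulated data are illustrative assumptions, not the article's specification (spatial autocorrelation is omitted for brevity).

```python
# Hypothetical hierarchical county-yield model + posterior-predictive premium.
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_counties, n_years = 8, 13
county = np.repeat(np.arange(n_counties), n_years)
year = np.tile(np.arange(n_years), n_counties)
yields = rng.normal(2.5 + 0.02 * year + rng.normal(0, 0.3, n_counties)[county], 0.4)

with pm.Model():
    mu0 = pm.Normal("mu0", 2.5, 1.0)                 # overall mean yield
    trend = pm.Normal("trend", 0.0, 0.1)             # shared temporal trend
    tau = pm.HalfNormal("tau", 0.5)                  # between-county spread
    a = pm.Normal("a", 0.0, tau, shape=n_counties)   # pooled county effects
    sigma = pm.HalfNormal("sigma", 0.5)
    pm.Normal("y", mu0 + trend * year + a[county], sigma, observed=yields)
    idata = pm.sample(1000, tune=1000, progressbar=False)
    ppc = pm.sample_posterior_predictive(idata, progressbar=False)

# Premium rate for a coverage guarantee: expected shortfall / guarantee,
# estimated in one step from posterior-predictive draws (no two-stage plug-in).
draws = ppc.posterior_predictive["y"].values.reshape(-1, len(yields))
guarantee = 0.75 * yields.mean()
rate = np.maximum(guarantee - draws, 0).mean() / guarantee
print(f"premium rate ~ {rate:.3%}")
```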
Abstract:
Knowledge of residual perturbations in the orbit of Uranus in the early 1840s did not lead to the refutation of Newton's law of gravitation but instead to the discovery of Neptune in 1846. Karl Popper asserts that this case is atypical of science and that the law of gravitation was at least prima facie falsified by these perturbations. I argue that these assertions are the product of a false, a priori methodological position I call 'Weak Popperian Falsificationism' (WPF). Further, on the evidence the law was not prima facie false and was not generally considered so by astronomers at the time. Many of Popper's commentators (Kuhn, Lakatos, Feyerabend and others) presuppose WPF, and their views on this case and its implications for scientific rationality and method suffer from this same defect.