968 results for anonimato rete privacy deep web onion routing cookie
Abstract:
In recent years web documents have attracted a great deal of attention, since they are seen as a new medium that carries an individual's experiences and opinions from one side of the world to the other, reaching people who will never meet. With the proliferation of Web 2.0, attention has focused on user-generated content: users have a variety of platforms on which to share their thoughts and opinions, or to look up those of others, perhaps to evaluate the purchase of one smartphone rather than another, or to consider switching telephone operator, weighing the advantages and disadvantages of changing their current situation. This wealth of information is very valuable to individuals and organizations, which however face the considerable difficulty of locating the sources of such opinions, extracting them, and expressing them in a standard format. These operations would be almost impossible to carry out by hand, hence the need to automate them; Sentiment Analysis is the answer to this need. Sentiment Analysis (sometimes called Opinion Mining) is one of the many computational fields of study dealing with natural language processing aimed at extracting opinions. In recent years it has proved to be one of the new trending fields in the social media sector, with applications in the economic, political and social domains. This thesis aims to provide an overview of the state of this field of study, presenting methods and techniques and their application in several studies carried out in recent years.
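Since the abstract stays at survey level, a minimal sketch of the lexicon-based technique such surveys typically cover may help; the word list, weights and negation handling below are toy assumptions for illustration, not resources or methods described in the thesis.

```python
# Minimal lexicon-based sentiment scorer (illustrative sketch; the lexicon
# and negation handling are toy assumptions, not the thesis's methods).
POLARITY = {"good": 1, "great": 2, "love": 2, "bad": -1, "poor": -2, "hate": -2}
NEGATORS = {"not", "never", "no"}

def sentiment(text: str) -> int:
    """Sum word polarities; a negator flips the sign of the next polar word."""
    score, negated = 0, False
    for token in text.lower().split():
        word = token.strip(".,!?")
        if word in NEGATORS:
            negated = True
        elif word in POLARITY:
            score += -POLARITY[word] if negated else POLARITY[word]
            negated = False
    return score

print(sentiment("I love this phone"))    # 2  -> positive opinion
print(sentiment("not a good operator"))  # -1 -> negative opinion
```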
Abstract:
This article provides a holistic legal analysis of the use of cookies in Online Behavioural Advertising. The current EU legislative framework is outlined in detail, and the legal obligations are examined. Consent and the debates surrounding its implementation form a large portion of the analysis. The article outlines the current difficulties associated with the reliance on this requirement as a condition for the placing and accessing of cookies. Alternatives to this approach are explored, and solutions based on the application of the Privacy by Design and Privacy by Default concepts are presented. This discussion involves an analysis of the use of code and, therefore, product architecture to ensure adequate protections.
Abstract:
Patients with ilio-femoral deep-vein thrombosis (DVT) are at high risk of developing the post-thrombotic syndrome (PTS). In comparison to anticoagulation therapy alone, extended venography-guided catheter-directed thrombolysis without routine stenting of venous stenosis in patients with ilio-femoral DVT is associated with an increased risk of bleeding and a moderate reduction of PTS. We performed a prospective single-centre study to investigate safety, patency and incidence of PTS in patients with acute ilio-femoral DVT treated with fixed-dose ultrasound-assisted catheter-directed thrombolysis (USAT; 20 mg rt-PA over 15 hours) followed by routine stenting of venous stenosis, defined as residual luminal narrowing >50%, absent antegrade flow, or presence of collateral flow at the site of suspected stenosis. A total of 87 patients (age 46 ± 21 years, 60% women) were included. At 15 hours, thrombolysis success ≥50% was achieved in 67 (77%) patients. Venous stenting (mean 1.9 ± 1.3 stents) was performed in 70 (80%) patients, with the common iliac vein as the most frequent stenting site (83%). One major (1%; 95% CI, 0-6%) and 6 minor bleedings (7%; 95% CI, 3-14%) occurred. Primary and secondary patency rates at 1 year were 87% (95% CI, 74-94%) and 96% (95% CI, 88-99%), respectively. At three months, 88% (95% CI, 78-94%) of patients were free from PTS according to the Villalta scale, with a similar rate at one year (94%, 95% CI, 81-99%). In conclusion, a fixed-dose USAT regimen followed by routine stenting of underlying venous stenosis in patients with ilio-femoral DVT was associated with a low bleeding rate, high patency rates, and a low incidence of PTS.
Abstract:
Over the years, a drastic increase in online information disclosure has spurred a wave of concerns from multiple stakeholders. Among others, users resent the “behind closed doors” processing of their personal data by companies. Privacy policies are supposed to inform users how their personal information is handled by a website. However, several studies have shown that users rarely read privacy policies, not least because the policies' limited readability makes them difficult to understand. Based on our online survey with over 440 responses, we examine the objective and subjective readability of privacy policies and investigate their impact on users' trust in five big Internet services. Our findings show that the stronger a user believes in having understood the privacy policy, the more he or she trusts the website, across all companies we studied. Our results call for making privacy policies more readable for the average user.
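As an illustration of what an objective readability measure looks like (the abstract does not specify the metric used, so the classic Flesch Reading Ease formula below is an assumption):

```python
import re

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word).
    Higher scores mean easier text; syllables are approximated by vowel groups."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n_words = max(1, len(words))
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

# Dense legalese scores lower than a plain-language rewrite of the same clause.
print(flesch_reading_ease("We may disclose your information to affiliated entities."))
print(flesch_reading_ease("We may share your data with our partner companies."))
```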
Abstract:
Cold-water corals (CWC) are frequently reported from deep sites with locally accelerated currents that enhance seabed food particle supply. Moreover, zooplankton likely account for ecologically important prey items, but their contribution to CWC diet remains unquantified. We investigated the benthic food web structure of the recently discovered Santa Maria di Leuca (SML) CWC province (300 to 1100 m depth) located in the oligotrophic northern Ionian Sea. We analyzed stable isotopes (δ13C and δ15N) of the main consumers (including ubiquitous CWC species) exhibiting different feeding strategies, zooplankton, suspended particulate organic matter (POM) and sedimented organic matter (SOM). Zooplankton and POM were collected 3 m above the coral colonies in order to assess their relative contributions to CWC diet. The δ15N of the scleractinians Desmophyllum dianthus, Madrepora oculata and Lophelia pertusa and the gorgonian Paramuricea cf. macrospina were consistent with a diet mainly composed of zooplankton. The antipatharian Leiopathes glaberrima was more 15N-depleted than the other cnidarians, suggesting a lower contribution of zooplankton to its diet. Our δ13C data clearly indicate that the benthic food web of SML is exclusively fuelled by carbon of phytoplanktonic origin. Nevertheless, consumers feeding at the water-sediment interface were more 13C-enriched than consumers feeding above the bottom (i.e. living corals and their epifauna). This pattern suggests that carbon is assimilated via 2 trophic pathways: relatively fresh phytoplanktonic production for 13C-depleted consumers and more decayed organic matter for 13C-enriched consumers. When the δ13C values of consumers were corrected for the influence of lipids (which are significantly 13C-depleted relative to other tissue components), our conclusions remained unchanged, except in the case of L. glaberrima, which could assimilate a mixture of zooplankton and resuspended decayed organic matter.
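The zooplankton-diet inference rests on standard stable-isotope arithmetic; as a hedged sketch (the trophic enrichment factor of roughly 3.4 per mil is a conventional literature value, not a number reported in this study):

```latex
% Nitrogen: consumers are enriched by roughly one trophic-level step
% (\Delta \approx 3.4 per mil is a conventional literature value, assumed here).
\[
\delta^{15}\mathrm{N}_{\text{consumer}} \approx \delta^{15}\mathrm{N}_{\text{diet}} + \Delta
\]
% Carbon: a two-source mixing model gives the fraction f of fresh,
% phytoplankton-derived carbon (POM) versus decayed sedimented matter (SOM).
\[
f = \frac{\delta^{13}\mathrm{C}_{\text{consumer}} - \delta^{13}\mathrm{C}_{\text{SOM}}}
         {\delta^{13}\mathrm{C}_{\text{POM}} - \delta^{13}\mathrm{C}_{\text{SOM}}}
\]
```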
Abstract:
A mobile ad hoc network (MANET) is a collection of wireless mobile nodes that can dynamically configure a network without a fixed infrastructure or centralized administration. This makes it ideal for emergency and rescue scenarios, where information sharing is essential and should occur as soon as possible. This article discusses which of the routing strategies for mobile ad hoc networks (proactive, reactive and hierarchical) performs better in such scenarios. Using a real urban area as the setting for the emergency and rescue scenario, we calculate the node density and the mobility model needed for validation. The NS2 simulator has been used in our study. We also show that hierarchical routing strategies are better suited for this type of scenario.
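As a sketch of the node-density step (the area, node count and radio range below are placeholder assumptions, not values from the article): under a uniform random deployment, the expected number of neighbours of a node is the node density multiplied by the coverage area of its radio range.

```python
import math

# Node density and expected neighbour count for a uniform random deployment
# (placeholder figures; the article's actual scenario values are not given here).
area_km2 = 4.0         # assumed urban simulation area
nodes = 200            # assumed number of rescue-team nodes
radio_range_m = 250.0  # assumed radio transmission range

density = nodes / (area_km2 * 1e6)                    # nodes per square metre
expected_neighbours = density * math.pi * radio_range_m ** 2

print(f"{nodes / area_km2:.0f} nodes/km^2, "
      f"~{expected_neighbours:.1f} expected neighbours per node")
```

A density that keeps the expected neighbour count comfortably above 1 is what makes the network connected enough for the three routing strategies to be compared fairly.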
Abstract:
The Privacy by Design approach to systems engineering introduces privacy requirements in the early stages of development, instead of patching up a built system afterwards. However, the privacy principles that must guide the development process are nowadays often described as 'vague', 'disconnected from technology', or 'aspirational'. Although privacy has become a first-class citizen in the realm of non-functional requirements and some methodological frameworks help developers by providing design guidance, software engineers often miss a solid reference detailing the specific, technical requirements they must abide by, and a systematic methodology to follow. In this position paper, we look into a domain that has already successfully tackled these problems (web accessibility) and propose translating its findings into the realm of privacy requirements engineering, analyzing as well the gaps not yet covered by current privacy initiatives.
Abstract:
The revelation of the top-secret US intelligence-led PRISM programme has triggered wide-ranging debates across Europe. Press reports have shed new light on the electronic surveillance ‘fishing expeditions’ of the US National Security Agency and the FBI into the world’s largest electronic communications companies. This Policy Brief by a team of legal specialists and political scientists addresses the main controversies raised by the PRISM affair and the policy challenges it poses for the EU. Two main arguments are presented: first, the leaks over the PRISM programme have undermined the trust that EU citizens have in their governments and the European institutions to safeguard and protect their privacy; and second, the PRISM affair raises questions about the capacity of EU institutions to draw lessons from the past and to protect the data of EU citizens and residents in the context of transatlantic relations. The Policy Brief puts forward a set of policy recommendations for the EU to follow in implementing a robust data protection strategy in response to the affair.
Abstract:
In this document I analysed the landscape of the past and present Internet, from the classic HTTP protocol to the experimental QUIC protocol, the subject of this thesis. First I examined the protocols currently used in the network and investigated the reasons that led to the creation of new ones; I then carried out a theoretical analysis of the protocol based on the documents provided by the IETF; in a separate chapter I described the encrypted handshake typical of this protocol; and finally, in the last chapter, I showed graphically and practically how the protocol works in a real implementation. After completing this thesis, I came to appreciate how necessary a shift towards faster, more secure and more reliable network protocols is. The classic protocols are no longer sufficient to satisfy the thousands of connection requests and, as will be seen, have shortcomings that must be remedied. A large part of the world's population now has access to the web; it has become a tool within everyone's reach rather than the privilege of a few, and one hopes, for the good of the Internet, that this protocol or similar ones will soon take hold, for a better browsing experience worldwide. It will probably take many years, but the fact that such a future is already being planned bodes well for what lies ahead. Reading this thesis will show how these claims can become reality.
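To make the latency argument concrete, here is a back-of-the-envelope sketch comparing round trips before application data can flow; the 60 ms RTT is an arbitrary assumption, while the round-trip counts are the commonly cited figures for each handshake.

```python
# Handshake latency comparison (sketch; 60 ms RTT is an arbitrary assumption,
# round-trip counts are the commonly cited figures for each protocol stack).
RTT_MS = 60

handshakes = {
    "TCP + TLS 1.2 (first visit)": 3,  # 1 RTT for TCP, 2 for TLS
    "TCP + TLS 1.3 (first visit)": 2,  # 1 RTT for TCP, 1 for TLS
    "QUIC (first visit)": 1,           # transport and crypto handshake combined
    "QUIC (resumption, 0-RTT)": 0,     # data rides along with the first packet
}

for name, rtts in handshakes.items():
    print(f"{name}: {rtts * RTT_MS} ms before application data")
```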
Abstract:
While we browse, are we really certain that our data and our privacy are safe? Browsers and the technologies they use can reveal a myriad of information. As the amount of obtainable information grows, it begins to exceed a critical mass that can allow identification. Device fingerprinting is precisely the collection of this kind of data. HTML5 and the new APIs it makes available enormously increase the ways in which fingerprinting can be performed. During the development of this thesis a very powerful framework was built, which will be presented in detail. Just as, after an air disaster, aeronautical engineering sets to work to uncover the weak points in order to make the next generation of aircraft more robust, with this thesis we want to contribute to improving the web of the future, so that our privacy is truly in our own hands and we can shape our own tomorrow.
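As a sketch of the underlying idea (the attribute values below are illustrative, not output of the thesis framework): a fingerprint is simply a stable hash over whatever attributes the browser exposes, so every new HTML5 API enlarges the attribute vector and, with it, the identifying entropy.

```python
import hashlib
import json

# Device-fingerprinting sketch: hash a vector of browser-exposed attributes
# (illustrative values; not produced by the framework described in the thesis).
attributes = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "1920x1080x24",
    "timezone_offset_min": -60,
    "language": "it-IT",
    "canvas_hash": "d41d8cd98f00b204",  # hash of a canvas rendering, a classic HTML5 vector
    "webgl_renderer": "Mesa Intel(R) UHD Graphics",
}

# A canonical serialization makes the same attribute vector hash identically
# across visits, which is exactly what turns it into an identifier.
fingerprint = hashlib.sha256(
    json.dumps(attributes, sort_keys=True).encode()
).hexdigest()

print(fingerprint[:16])
```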
Abstract:
"September 20, 2005."
Abstract:
With the advent of GPS-enabled smartphones, an increasing number of users are actively sharing their location through a variety of applications and services. Along with the continuing growth of Location-Based Social Networks (LBSNs), security experts have increasingly warned the public of the dangers of exposing sensitive information such as personal location data. Most importantly, in addition to the geographical coordinates of the user’s location, LBSNs allow easy access to an additional set of characteristics of that location, such as the venue type or popularity. In this paper, we investigate the role of location semantics in the identification of LBSN users. We simulate a scenario in which the attacker’s goal is to reveal the identity of a set of LBSN users by observing their check-in activity. We then propose to answer the following question: what are the types of venues that a malicious user has to monitor to maximize the probability of success? Conversely, how should a user decide whether or not to make a check-in to a location public? We perform our study on more than 1 million check-ins distributed over 17 urban regions of the United States. Our analysis shows that different types of venues display different discriminative power in terms of user identity, with most of the venues in the “Residence” category providing the highest re-identification success across the urban regions. Interestingly, we also find that users with a high entropy of their check-in distribution are not necessarily the hardest to identify, suggesting that it is the collective behaviour of the user population that determines the complexity of the identification task, rather than individual behaviour.
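The entropy finding can be made concrete with a small sketch (the check-in data below is invented for illustration): the Shannon entropy of a user's check-in distribution measures how spread out their visits are across venues.

```python
import math
from collections import Counter

def checkin_entropy(venues: list[str]) -> float:
    """Shannon entropy (in bits) of a user's check-in distribution over venues."""
    counts = Counter(venues)
    total = len(venues)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Invented example data: a home-centric user versus a roamer.
homebody = ["home"] * 8 + ["gym", "cafe"]
roamer = ["home", "gym", "cafe", "bar", "office", "park", "museum", "station"]

print(checkin_entropy(homebody))  # ~0.92 bits: concentrated check-ins
print(checkin_entropy(roamer))    # 3.0 bits: spread-out check-ins
```

Consistent with the paper's finding, a low-entropy user is not automatically the easiest target: what matters is also how many other people share the monitored venue, which is why “Residence” venues are so discriminative.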
Abstract:
This paper looks at the issue of privacy and anonymity through the prism of Scott's concept of legibility, i.e. the desire of the state to obtain an ever more accurate mapping of its domain and the actors in it. We argue that privacy was absent from village life in the past and has arisen as a temporary phenomenon, the product of a lack of appropriate technology to make all life in the city legible. Cities have been the loci of creativity for the major part of human civilisation, and there is something specific about the illegibility of cities that facilitates creativity and innovation. Because they provide the technology to catalogue and classify all objects and ideas around us, we consider semantic web technologies, Linked Data and the Internet of Things as unwittingly furthering this ever greater legibility. There is a danger that the over-description of a domain will lead to a loss of creativity and innovation. We conclude by arguing that our prime concern must be to preserve illegibility, because the survival of some form, any form, of civilisation depends upon it.
Abstract:
In recent years, there has been an enormous growth of location-aware devices, such as GPS-embedded cell phones, mobile sensors and radio-frequency identification tags. The age of combining sensing, processing and communication in one device gives rise to a vast number of applications, leading to endless possibilities and a realization of mobile Wireless Sensor Network (mWSN) applications. As computing, sensing and communication become more ubiquitous, trajectory privacy becomes a critical piece of information and an important factor for commercial success. While on the move, sensor nodes continuously transmit data streams of sensed values and spatiotemporal information, known as "trajectory information". If adversaries can intercept this information, they can monitor the trajectory path and capture the location of the source node. This research stems from the recognition that the wide applicability of mWSNs will remain elusive unless a trajectory privacy preservation mechanism is developed. The outcome seeks to lay a firm foundation in the field of trajectory privacy preservation in mWSNs against external and internal trajectory privacy attacks. First, to prevent external attacks, we investigated a context-based trajectory privacy-aware routing protocol to prevent eavesdropping attacks. Traditional shortest-path oriented routing algorithms give adversaries the possibility of locating the target node within a certain area. We designed a novel privacy-aware routing phase and utilized the trajectory dissimilarity between mobile nodes to mislead adversaries about the location where the message started its journey. Second, to detect internal attacks, we developed a software-based attestation solution to detect compromised nodes. We created a dynamic attestation node chain among neighboring nodes to examine the memory checksum of suspicious nodes, improving the computation time for memory traversal compared to previous work. Finally, we revisited the trust issue in trajectory privacy preservation mechanism designs, using Bayesian game theory to model and analyze the behavior of cooperative, selfish and malicious nodes in trajectory privacy preservation activities.
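As an illustrative sketch of one ingredient, trajectory dissimilarity (the mean pointwise Euclidean metric below is an assumption; the dissertation does not define its measure here):

```python
import math

# Trajectory dissimilarity sketch: mean pointwise Euclidean distance between
# two trajectories sampled at the same instants (assumed metric; the
# dissertation's own dissimilarity measure may differ).
Point = tuple[float, float]

def dissimilarity(a: list[Point], b: list[Point]) -> float:
    assert len(a) == len(b), "trajectories must share the same sampling times"
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

source = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)]
neighbour = [(0.0, 1.0), (1.0, 1.5), (2.0, 2.0)]

# A privacy-aware relay choice could prefer the neighbour whose trajectory is
# most dissimilar to the source's, misleading an eavesdropper about the origin.
print(dissimilarity(source, neighbour))  # 1.0
```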
Abstract:
The setting of this thesis is Delay- and Disruption-Tolerant Networking (DTN), a telecommunications network architecture aimed at communication between nodes of so-called "challenged" networks, which must cope with problems such as long propagation delays, high error rates and periods of lost connectivity. The Bundle layer, a new layer inserted between transport and application in the ISO/OSI architecture, and its associated protocol, the Bundle Protocol (BP), were designed to make communication in these networks possible. A long time may sometimes elapse between reception and forwarding, because the next link is unavailable; during this period the bundle remains stored in a local database. Several implementations of the DTN architecture exist, such as DTN2, the reference implementation, and ION (Interplanetary Overlay Network), developed by NASA JPL for use in space applications; in these, contacts between nodes are deterministic, unlike terrestrial networks where contacts are generally opportunistic (not known in advance). For this reason ION includes a routing algorithm, CGR (Contact Graph Routing), designed to operate in environments with deterministic connectivity. An algorithm for non-deterministic environments, OCGR (Opportunistic Contact Graph Routing), which extends CGR, is under research. The aim of this thesis is to provide a detailed description of how OCGR works, necessarily starting from CGR on which it is based, to run the preliminary tests requested by NASA JPL, and to analyse their results in order to assess whether the algorithm can be used and improved. The DTN environment and the main routing algorithms for opportunistic environments will also be described. The final part presents the DTN simulator "The ONE" and the integration of CGR and OCGR into it.
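A minimal sketch of the idea behind CGR may help (the contact plan below is invented, and real CGR in ION also models link rates, one-way light times and queued volume): given a deterministic contact plan, route search is a Dijkstra-like scan for the earliest possible arrival time at the destination.

```python
import heapq

# Contact Graph Routing sketch: earliest-arrival search over a deterministic
# contact plan (invented data; real CGR in ION accounts for much more).
# A contact is (from_node, to_node, start_time, end_time).
contacts = [
    ("A", "B", 0, 10),
    ("B", "C", 15, 25),
    ("A", "C", 40, 50),
]

def earliest_arrival(source: str, dest: str, t0: float = 0.0) -> float:
    best = {source: t0}
    queue = [(t0, source)]
    while queue:
        t, node = heapq.heappop(queue)
        if node == dest:
            return t
        for frm, to, start, end in contacts:
            if frm != node or end < t:
                continue             # contact is already over
            arrival = max(t, start)  # bundle waits in storage until the contact opens
            if arrival < best.get(to, float("inf")):
                best[to] = arrival
                heapq.heappush(queue, (arrival, to))
    return float("inf")

print(earliest_arrival("A", "C"))  # 15.0 via B, beating the direct contact at 40.0
```

The waiting step is where the store-and-forward nature of the Bundle layer shows up: the bundle sits in local storage until the next contact along the route opens.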