884 results for Libyan Data Protection Authority
Abstract:
Speaker: Dr Kieron O'Hara. Time: 04/02/2015 11:00-11:45. Location: B32/3077. Abstract: In order to reap the potential societal benefits of big and broad data, it is essential to share and link personal data. However, privacy and data protection considerations mean that, to be shared, personal data must be anonymised, so that the data subject cannot be identified from the data. Anonymisation is therefore a vital tool for data sharing, but deanonymisation, or reidentification, is always possible given sufficient auxiliary information (and as the amount of data grows, both in terms of creation and in terms of availability in the public domain, the probability of finding such auxiliary information grows). This creates issues for the management of anonymisation, which are exacerbated not only by uncertainties about the future, but also by misunderstandings about the process(es) of anonymisation. This talk discusses these issues in relation to privacy, risk management and security, reports on recent theoretical tools created by the UKAN network of statistics professionals (of which the author is one of the leads), and asks how long anonymisation can remain a useful tool, and what might replace it.
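The reidentification risk described in this abstract can be pictured with a small, hypothetical example: an "anonymised" release that retains quasi-identifiers can be joined against auxiliary data. The sketch below is illustrative only; all records, field names and the `reidentify` helper are invented for this purpose and are not drawn from the talk or the UKAN materials.

```python
# Minimal, hypothetical linkage (re-identification) attack. An "anonymised"
# release still carries quasi-identifiers (age, postcode) that can be joined
# against auxiliary data containing names. All records below are invented.
anonymised_release = [
    {"age": 34, "postcode": "SO17", "diagnosis": "asthma"},
    {"age": 51, "postcode": "SO16", "diagnosis": "diabetes"},
]
auxiliary_data = [  # e.g. scraped from a public register
    {"name": "A. Example", "age": 34, "postcode": "SO17"},
    {"name": "B. Example", "age": 51, "postcode": "SO16"},
]

def reidentify(release, auxiliary, quasi_identifiers=("age", "postcode")):
    """Join the released records to the auxiliary data on shared quasi-identifiers."""
    matches = []
    for record in release:
        key = tuple(record[q] for q in quasi_identifiers)
        candidates = [a for a in auxiliary
                      if tuple(a[q] for q in quasi_identifiers) == key]
        if len(candidates) == 1:  # a unique match re-identifies the data subject
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(reidentify(anonymised_release, auxiliary_data))
# [('A. Example', 'asthma'), ('B. Example', 'diabetes')]
```

As the abstract notes, the more auxiliary data becomes publicly available, the more often such unique matches arise, which is what makes the long-term management of anonymisation difficult.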
Abstract:
The ongoing development of this topic means that the information available on it is somewhat limited; not much literature has been produced so far, especially in countries where Habeas Data has a shorter history. That is why our research proves to be a tool
Abstract:
This article analyses the results of an empirical study on the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence on unlawful processing of personal data. It comprises a survey on the methods used to seek and obtain consent to process personal data for direct marketing and advertisement, and a test on the frequency of unsolicited commercial emails (UCE) received by customers as a consequence of their registration and submission of personal information to a website. Part One of the article presents a conceptual and normative account of data protection, with a discussion of the ethical values on which EU data protection law is grounded and an outline of the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which unveils a significant divergence between EU legal theory and practice in data protection. Although a wide majority of the websites in the sample (69%) have in place a system to ask for separate consent for engaging in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The test with UCE shows that only one out of three websites (30.5%) respects the will of the data subject not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law. In this respect, it suggests that the standard of implementation, information and supervision by the UK authorities is inadequate, especially in light of the clarifications provided at EU level.
Abstract:
Consecrated in 1297 as the church of St. Catherine's monastery, founded four years earlier, the Gothic Church of St. Catherine was largely destroyed in a devastating bombing raid on 2 January 1945. To counteract the process of disintegration, the geo-information department and the lower monument protection authority of the City of Nuremberg decided to commission a three-dimensional building model of the Church of St. Catherine. A heterogeneous set of data was used to prepare a parametric architectural model. In effect, the modelling of historic buildings can profit from the so-called BIM method (Building Information Modeling), as the necessary structuring of the basic data turns it into highly sustainable information. The resulting model is well suited to give present-day observers a vivid impression of the interior and exterior of this former mendicant order's church.
Abstract:
The revelation of the top-secret US intelligence-led PRISM Programme has triggered wide-ranging debates across Europe. Press reports have shed new light on the electronic surveillance ‘fishing expeditions’ of the US National Security Agency and the FBI into the world’s largest electronic communications companies. This Policy Brief by a team of legal specialists and political scientists addresses the main controversies raised by the PRISM affair and the policy challenges that it poses for the EU. Two main arguments are presented: First, the leaks over the PRISM programme have undermined the trust that EU citizens have in their governments and the European institutions to safeguard and protect their privacy; and second, the PRISM affair raises questions regarding the capacity of EU institutions to draw lessons from the past and to protect the data of its citizens and residents in the context of transatlantic relations. The Policy Brief puts forward a set of policy recommendations for the EU to follow and implement a robust data protection strategy in response to the affair.
Abstract:
The continuous evolution of learning needs toward greater efficiency and personalisation has fostered the emergence of new tools and dimensions whose objective is to make learning accessible to everyone and adapted to technological and social contexts. This evolution has given rise to what is called online social learning, which emphasises interaction between learners. Taking interaction into account has brought many advantages for the learner, namely establishing connections, exchanging personal experiences and benefiting from assistance that improves their learning. However, the amount of personal information that learners sometimes disclose during these interactions leads to consequences that are often disastrous for privacy, such as cyberbullying, identity theft, etc. Despite the concerns raised, privacy as an individual right represents an ideal situation that is difficult to recognise in today's social context. Indeed, we have moved from a conceptualisation of privacy as a core of sensitive data to be protected from outside intrusion to a new vision centred on negotiating the disclosure of those data. The challenge for social learning environments is therefore to guarantee a maximum level of interaction for learners while preserving their privacy. To the best of our knowledge, most innovations in these environments have focused on developing interaction techniques, with no consideration for privacy, an element that is nevertheless necessary to create an environment conducive to learning. In this work, we propose a privacy framework that we call a "privacy manager". More precisely, this manager is responsible for protecting the learner's personal data and privacy during interactions with co-learners. Building on the idea that interaction provides access to online help, we analyse interaction as a cognitive activity involving contextual factors, other learners and socio-emotional aspects. The main objective of this thesis is therefore to rethink the processes of mutual assistance between learners by implementing the tools needed to find a compromise between interaction and privacy protection. This was done at three levels: the first considers contextual and social aspects of the interaction, such as trust between learners and the emotions that triggered the need to interact. The second level of protection consists of estimating the risks of disclosure and facilitating the privacy-protection decision. The third level of protection consists of detecting any disclosure of personal data using machine learning and semantic analysis techniques.
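For the third protection level mentioned above (detecting disclosures of personal data), a minimal rule-based sketch is given below. The thesis relies on machine learning and semantic analysis; the regular-expression patterns, categories and the `detect_disclosures` function here are stand-in assumptions used only to illustrate the idea of flagging a message before it is shared.

```python
import re

# Hypothetical rule-based detector standing in for the machine-learning and
# semantic-analysis step described in the thesis: flag likely disclosures of
# personal data in a learner's message before it is shared with co-learners.
DISCLOSURE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "street_address": re.compile(r"\b\d{1,4}\s+\w+\s+(rue|avenue|street|road)\b", re.IGNORECASE),
}

def detect_disclosures(message: str) -> list[str]:
    """Return the categories of personal data apparently present in the message."""
    return [category for category, pattern in DISCLOSURE_PATTERNS.items()
            if pattern.search(message)]

if __name__ == "__main__":
    print(detect_disclosures("Write to learner@example.org or call +33 6 12 34 56 78"))
    # ['email', 'phone']
```

In a privacy manager such as the one proposed, the detected categories would then feed the risk-estimation and decision step (the second protection level), rather than simply blocking the message.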
Abstract:
In the wake of the disclosures surrounding PRISM and other US surveillance programmes, this paper assesses the large-scale surveillance practices of a selection of EU member states: the UK, Sweden, France, Germany and the Netherlands. Given the large-scale nature of these practices, which represent a reconfiguration of traditional intelligence gathering, the paper contends that an analysis of European surveillance programmes cannot be reduced to a question of the balance between data protection and national security, but has to be framed in terms of collective freedoms and democracy. It finds that four of the five EU member states selected for in-depth examination are engaging in some form of large-scale interception and surveillance of communication data, and identifies parallels and discrepancies between these programmes and the NSA-run operations. The paper argues that these programmes do not stand outside the realm of EU intervention but can be analysed from an EU law perspective via: i) an understanding of national security in a democratic rule-of-law framework where fundamental human rights and judicial oversight constitute key norms; ii) the risks posed to the internal security of the Union as a whole as well as to the privacy of EU citizens as data owners; and iii) the potential spillover into the activities and responsibilities of EU agencies. The paper then presents a set of policy recommendations to the European Parliament.
Abstract:
This paper examines the challenges facing the EU regarding data retention, particularly in the aftermath of the Digital Rights Ireland judgment of the Court of Justice of the European Union (CJEU) of April 2014, which found the Data Retention Directive (2006/24/EC) to be invalid. It first offers a brief historical account of the Data Retention Directive and then moves to a detailed assessment of what the judgment means for determining the lawfulness of data retention from the perspective of the EU Charter of Fundamental Rights: what is wrong with the Data Retention Directive and how would it need to be changed to comply with the right to respect for privacy? The paper also looks at the responses to the judgment from the European institutions and elsewhere, and presents a set of policy suggestions to the European institutions on the way forward. It is argued here that one of the main issues underlying the Digital Rights Ireland judgment has been the role of fundamental rights in the EU legal order, and in particular the extent to which the retention of metadata for law enforcement purposes is consistent with EU citizens' right to respect for privacy and to data protection. The paper offers three main recommendations to EU policy-makers: first, to give priority to a full and independent evaluation of the value of the Data Retention Directive; second, to assess the judgment's implications for other large EU information systems and proposals that provide for the mass collection of metadata from innocent persons in the EU; and third, to adopt without delay the proposal for Directive COM(2012)10 dealing with data protection in the fields of police and judicial cooperation in criminal matters.
Abstract:
In its recent Schrems judgment the Luxembourg Court annulled Commission Decision 2000/520 according to which US data protection rules are sufficient to satisfy EU privacy rules regarding EU-US transfers of personal data, otherwise known as the ‘Safe Harbour’ framework. What does this judgment mean and what are its main implications for EU-US data transfers? In this paper the authors find that this landmark judgment sends a strong message to EU and US policy-makers about the need to ensure clear rules governing data transfers, so that people whose personal data is transferred to third countries have sufficient legal guarantees. Without such rules there is legal uncertainty and mistrust. Any future arrangement for the transatlantic transfer of data will therefore need to be firmly anchored in a framework of protection commensurate to the EU Charter of Fundamental Rights and the EU data protection architecture.
Abstract:
Much has been written about Big Data from a technical, economic, juridical and ethical perspective. Still, very little empirical and comparative data is available on how Big Data is approached and regulated in Europe and beyond. This contribution makes a first effort to fill that gap by presenting the responses to a survey on Big Data from the Data Protection Authorities of fourteen European countries, together with comparative legal research covering eleven countries. It presents those results and addresses ten challenges for the regulation of Big Data.
Abstract:
Healthcare systems have assimilated information and communication technologies in order to improve the quality of healthcare and patients' experience at reduced costs. The increasing digitalization of people's health information, however, raises new threats regarding information security and privacy. Accidental or deliberate breaches of health data may lead to societal pressure, embarrassment and discrimination. Information security and privacy are paramount to achieving high-quality healthcare services and, further, to not harming individuals when providing care. With that in mind, we give special attention to the category of Mobile Health (mHealth) systems, that is, the use of mobile devices (e.g., mobile phones, sensors, PDAs) to support medical and public health practice. Such systems have been particularly successful in developing countries, taking advantage of the flourishing mobile market and the need to expand the coverage of primary healthcare programs. Many mHealth initiatives, however, fail to address security and privacy issues. This, coupled with the lack of specific legislation for privacy and data protection in these countries, increases the risk of harm to individuals. The overall objective of this thesis is to enhance knowledge regarding the design of security and privacy technologies for mHealth systems. In particular, we deal with mHealth Data Collection Systems (MDCSs), which consist of mobile devices for collecting and reporting health-related data, replacing paper-based approaches for health surveys and surveillance. This thesis consists of publications contributing to mHealth security and privacy in various ways: with a comprehensive literature review about mHealth in Brazil; with the design of a security framework for MDCSs (SecourHealth); with the design of an MDCS (GeoHealth); with the design of a Privacy Impact Assessment template for MDCSs; and with the study of ontology-based obfuscation and anonymisation functions for health data.
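The ontology-based obfuscation functions mentioned at the end of the abstract can be pictured with a minimal sketch: a specific clinical term is replaced by a more general ancestor in a concept hierarchy, trading detail for privacy. The tiny hierarchy, the `generalise` helper and the chosen generalisation level below are invented assumptions; the thesis itself works with real medical ontologies.

```python
# Hypothetical sketch of ontology-based obfuscation: a specific health value is
# replaced by a more general term from a concept hierarchy. The hierarchy and
# the generalisation level are invented for illustration.
DIAGNOSIS_HIERARCHY = {
    "pulmonary tuberculosis": "tuberculosis",
    "tuberculosis": "infectious disease",
    "infectious disease": "disease",
}

def generalise(term: str, levels: int) -> str:
    """Climb `levels` steps up the concept hierarchy, stopping at the root."""
    for _ in range(levels):
        if term not in DIAGNOSIS_HIERARCHY:
            break
        term = DIAGNOSIS_HIERARCHY[term]
    return term

record = {"patient_id": "p-001", "diagnosis": "pulmonary tuberculosis"}
record["diagnosis"] = generalise(record["diagnosis"], levels=2)
print(record)  # {'patient_id': 'p-001', 'diagnosis': 'infectious disease'}
```

The design question such functions raise is how many levels of generalisation are enough: too few leave the record reidentifiable, too many destroy its usefulness for health surveillance.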
Abstract:
Australian privacy law regulates how government agencies and private sector organisations collect, store and use personal information. A coherent conceptual basis of personal information is an integral requirement of information privacy law as it determines what information is regulated. A 2004 report conducted on behalf of the UK’s Information Commissioner (the 'Booth Report') concluded that there was no coherent definition of personal information currently in operation because different data protection authorities throughout the world conceived the concept of personal information in different ways. The authors adopt the models developed by the Booth Report to examine the conceptual basis of statutory definitions of personal information in Australian privacy laws. Research findings indicate that the definition of personal information is not construed uniformly in Australian privacy laws and that different definitions rely upon different classifications of personal information. A similar situation is evident in a review of relevant case law. Despite this, the authors conclude the article by asserting that a greater jurisprudential discourse is required based on a coherent conceptual framework to ensure the consistent development of Australian privacy law.