893 results for Data security principle
Abstract:
The use of satellite positioning across different application areas and in civilian use has grown significantly in the 2000s, after the United States Department of Defense ended the deliberate degradation of the GPS system. The spread of wireless data connections and their growing speeds have opened up new possibilities for using and exploiting location data in real time. With rising costs, efficient transport has become a very important part of companies' daily operations. Fleet management is one way of making logistics more efficient and reducing the costs it causes. By monitoring vehicle movements in real time, savings can be pursued by optimising schedules and routes and by redirecting vehicles according to their locations, thereby reducing the distance and time driven. The goal of this Master's thesis is to study how satellite positioning, location data and wireless data connections can be used to implement real-time tracking software. The thesis first presents positioning technologies and how they work. It then examines how data transfer can be implemented in the system and considers the data security aspects that must be taken into account during development. Based on this research, real-time tracking software was designed and implemented for the vehicle positioning needs of a home care service company. The system makes it possible to monitor and track vehicle locations on a map in real time and to locate the vehicles closest to a given target, so that when an alarm occurs the nearest employee can be dispatched to the customer site as quickly as possible. Users can also follow the trips they have driven and keep an automatic driving log. Finally, the thesis evaluates the operation of the implemented system based on measurement results obtained during testing.
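As an illustration of the nearest-vehicle lookup the abstract mentions, the following minimal Python sketch ranks vehicles by great-circle (haversine) distance from an alarm location; the function names, coordinates and fleet data are invented for the example and are not taken from the thesis.

```python
import math

# Hypothetical illustration of a nearest-vehicle lookup: given the latest GPS fix of each
# vehicle and an alarm location, rank the vehicles by great-circle (haversine) distance.
# All names and data below are invented for the sketch.

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 coordinates, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def nearest_vehicles(vehicles, target_lat, target_lon, n=3):
    """Return the n vehicles closest to the target, as (distance_km, vehicle_id) pairs."""
    ranked = sorted(
        (haversine_km(lat, lon, target_lat, target_lon), vid)
        for vid, (lat, lon) in vehicles.items()
    )
    return ranked[:n]

# Example: latest known positions (vehicle id -> (lat, lon)) and an alarm site near Helsinki.
positions = {"van-1": (60.1699, 24.9384), "van-2": (60.2055, 24.6559), "van-3": (60.2934, 25.0378)}
print(nearest_vehicles(positions, 60.1841, 24.8301, n=2))
```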
Abstract:
With a new version of an operating system, old programs do not necessarily work in the new environment. On Windows, application compatibility has traditionally been preserved fairly well, but the newest versions, Windows Vista and Windows 7, introduce many security changes, and because of them the compatibility of old programs has been reduced. This thesis describes the migration of an automation system's software components to the new Windows environment. The goal is to produce guidelines for the other developers of the automation system. Windows security features are also reviewed, with a particular focus on access control.
Abstract:
Analysing firewall and antivirus log files without any kind of log analysis tool can be very difficult for an ordinary computer user. In log files every event is organised by time, but reading and understanding them without an analysis tool requires expert knowledge. In this Bachelor's thesis I put together a software package that lets an ordinary private computer user analyse log files in a Windows environment without any additional effort. Most private computer users do not have much experience with computers or data security, so this thesis can also be used as a manual for the analysis tool used in the work.
Abstract:
This report describes web archiving at the National Library of Finland. The National Library of Finland has been archiving the Finnish web on a regular basis since 2006. Web archiving is an important part of the Library's endeavours to collect and preserve Finnish published cultural heritage. In 2010, the amount of harvested data was 200 million files, or 25 terabytes. The report takes the reader through the relevant legislation; internal plans and policies; funding and its allocation; the practices of web archiving; arrangements for the use of the archive; and issues arising from data security, sensitive materials, etc.
Abstract:
The purpose of this work is to find out what cloud services offering ready-made software promise about their data security. The work reviews the terms of service of several cloud services and focuses on analysing each service's security and privacy protection as defined by those terms. The result is an overview of cloud service security, from which it can also be inferred how well the services address the security threats characteristic of the cloud.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
"Thesis presented to the Faculté des études supérieures in view of obtaining the degree of Maîtrise en LL.M. Droit - Recherche, option Droit, Biotechnologies et Sociétés"
Abstract:
The repository contains an animation related to privacy, along with the poster for the resource in both JPEG and PDF formats.
Abstract:
This Policy Contribution assesses the broad obstacles hampering ICT-led growth in Europe and identifies the main areas in which policy could unlock the greatest value. We review estimates of the value that could be generated through take-up of various technologies and carry out a broad matching with policy areas. According to the literature survey and the collected estimates, the areas in which the right policies could unlock the greatest ICT-led growth are product and labour market regulations and the European Single Market. These areas should be reformed to make European markets more flexible and competitive. This would promote wider adoption of modern data-driven organisational and management practices thereby helping to close the productivity gap between the United States and the European Union. Gains could also be made in the areas of privacy, data security, intellectual property and liability pertaining to the digital economy, especially cloud computing, and next generation network infrastructure investment. Standardisation and spectrum allocation issues are found to be important, though to a lesser degree. Strong complementarities between the analysed technologies suggest, however, that policymakers need to deal with all of the identified obstacles in order to fully realise the potential of ICT to spur long-term growth beyond the partial gains that we report.
Abstract:
Recent developments in the fields of veterinary epidemiology and economics are critically reviewed and assessed. The impacts of recent technological developments in diagnosis, genetic characterisation, data processing and statistical analysis are evaluated. It is concluded that the acquisition and availability of data remains the principal constraint to the application of available techniques in veterinary epidemiology and economics, especially at population level. As more commercial producers use computerised management systems, the availability of data for analysis within herds is improving. However, consistency of recording and diagnosis remains problematic. Recent trends to the development of national livestock databases intended to provide reassurance to consumers of the safety and traceability of livestock products are potentially valuable sources of data that could lead to much more effective application of veterinary epidemiology and economics. These opportunities will be greatly enhanced if data from different sources, such as movement recording, official animal health programmes, quality assurance schemes, production recording and breed societies can be integrated. However, in order to realise such integrated databases, it will be necessary to provide absolute control of user access to guarantee data security and confidentiality. The potential applications of integrated livestock databases in analysis, modelling, decision-support, and providing management information for veterinary services and livestock producers are discussed. (c) 2004 Elsevier B.V. All rights reserved.
Abstract:
Alison Macrina is the founder and director of the Library Freedom Project, an initiative that aims to make real the promise of intellectual freedom in libraries. The Library Freedom Project trains librarians on the state of global surveillance, privacy rights, and privacy-protecting technology, so that librarians may in turn teach their communities about safeguarding privacy. In 2015, Alison was named one of Library Journal's Movers and Shakers. Read more about the Library Freedom Project at libraryfreedomproject.org.
Abstract:
The development of strategies for structural health monitoring (SHM) has become increasingly important because of the necessity of preventing undesirable damage. This paper describes an approach to this problem using vibration data. It involves a three-stage process: reduction of the time-series data using principal component analysis (PCA), the development of a data-based auto-regressive moving average (ARMA) model using data from an undamaged structure, and the classification of whether or not the structure is damaged using a fuzzy clustering approach. The approach is applied to data from a benchmark structure from Los Alamos National Laboratory, USA. Two fuzzy clustering algorithms are compared: fuzzy c-means (FCM) and Gustafson-Kessel (GK) algorithms. It is shown that while both fuzzy clustering algorithms are effective, the GK algorithm marginally outperforms the FCM algorithm. (C) 2008 Elsevier Ltd. All rights reserved.
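For illustration, here is a minimal Python sketch of the three-stage pipeline described above (PCA reduction, ARMA modelling of the reduced signal, fuzzy clustering of a damage-sensitive feature), run on synthetic data; the residual-variance feature, the ARMA order and the bare-bones fuzzy c-means step are assumptions made for the example rather than the exact choices of the paper, and the Gustafson-Kessel variant is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA            # stage 1: dimensionality reduction
from statsmodels.tsa.arima.model import ARIMA    # ARMA = ARIMA with d = 0

# Illustrative sketch of the three-stage SHM pipeline on synthetic vibration records.
# The feature used for clustering (ARMA residual variance) and all parameters are
# assumptions for the example, not the choices of the cited paper.

rng = np.random.default_rng(0)

def acceleration_record(damaged, n=500, channels=8):
    """Fake multi-channel vibration record; 'damage' just changes the dynamics slightly."""
    ar = 0.7 if not damaged else 0.4
    x = np.zeros((n, channels))
    for t in range(1, n):
        x[t] = ar * x[t - 1] + rng.normal(scale=1.0, size=channels)
    return x

def damage_feature(record, order=(4, 0, 2)):
    """Stage 1: PCA to one principal component; stage 2: ARMA fit, return residual variance."""
    pc = PCA(n_components=1).fit_transform(record)[:, 0]
    fit = ARIMA(pc, order=order).fit()
    return np.var(fit.resid)

def fuzzy_c_means(x, c=2, m=2.0, iters=50):
    """Stage 3: bare-bones fuzzy c-means on a 1-D feature (stand-in for FCM/GK)."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    u = rng.dirichlet(np.ones(c), size=len(x))           # membership matrix, rows sum to 1
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]   # weighted cluster centres
        d = np.abs(x - centers.T) + 1e-12                 # distances to each centre
        u = 1.0 / (d ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)                 # normalise memberships
    return u, centers

features = [damage_feature(acceleration_record(damaged=i >= 5)) for i in range(10)]
memberships, centers = fuzzy_c_means(features)
print(np.round(memberships, 2))   # rows: records; columns: the two clusters (order is arbitrary)
```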
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
In this thesis I describe the timing attack on the RSA cryptosystem: how it works, the theory it is based on, its strengths and its weaknesses. This particular kind of attack was first presented by Paul C. Kocher in 1996 at the RSA Data Security and CRYPTO conferences. In his paper "Timing Attacks on Implementations of Diffie-Hellman, RSA, DSS, and Other Systems" the author reveals a new possible flaw in RSA that does not depend on purely mathematical weaknesses of the cryptosystem, but on an aspect no one had focused on before: the execution time of the cryptographic operations. The idea is as simple as it is ingenious: every operation in a computer takes a certain amount of time, and the variations in that time necessarily depend on the algorithm, and therefore on the private keys and on the particular input supplied. By measuring these time variations and using nothing but statistical tools, Kocher shows that it is possible to obtain information about the implementation of the cryptosystem and thus to break RSA and other security systems, without ever touching the mathematical side of the algorithm. Statistics therefore becomes central to this theory, because many variables can influence the computation time in the decryption phase: the design of the cryptosystem; how long the CPU takes to execute the process; the algorithm used and the kind of implementation; the precision of the measurements; and so on. To have a better chance of successfully attacking the system, one must run repeated trials using the same key and different inputs, and perform statistical correlation analysis of the timing information, up to the point of fully recovering the private key. As Kocher states: "Against a vulnerable system, the attack is computationally inexpensive and often requires only known ciphertext." That is, against vulnerable systems the attack is computationally cheap and often only requires known ciphertexts and the times needed to decrypt them.
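To make the statistical idea concrete, the following self-contained Python toy simulates Kocher's variance-reduction test on square-and-multiply exponentiation with an invented data-dependent timing model; it illustrates the principle only, is not an attack on any real RSA implementation, and with too few traces or too much noise individual bit guesses can fail.

```python
import random

# Toy simulation of the timing-attack idea: the "device" computes a square-and-multiply
# exponentiation whose running time depends on the data, and the attacker recovers the
# secret exponent bit by bit by comparing measured times with predicted partial times.
# The modulus, secret, cost model and noise level are all invented for illustration.

random.seed(1)
N = 2**16 + 1                      # toy modulus (not a realistic RSA modulus)
SECRET = 0b101101                  # toy secret exponent held by the "device"
BITS = SECRET.bit_length()

def op_cost(a, b):
    """Data-dependent cost of one modular multiplication (purely a modelling assumption)."""
    return bin((a * b) % N).count("1")

def measure(c):
    """Total 'decryption time' for ciphertext c: square-and-multiply plus Gaussian noise."""
    x, t = 1, 0.0
    for i in reversed(range(BITS)):
        t += op_cost(x, x); x = (x * x) % N
        if (SECRET >> i) & 1:
            t += op_cost(x, c); x = (x * c) % N
    return t + random.gauss(0, 2.0)

def predicted(c, known_bits, guess):
    """Time the attacker can predict for the operations up to and including the guessed bit."""
    x, t = 1, 0.0
    for bit in known_bits + [guess]:
        t += op_cost(x, x); x = (x * x) % N
        if bit:
            t += op_cost(x, c); x = (x * c) % N
    return t

def variance(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

ciphertexts = [random.randrange(2, N) for _ in range(4000)]
timings = [measure(c) for c in ciphertexts]

recovered = [1]                    # the most significant bit of the exponent is 1 by definition
for _ in range(BITS - 1):
    # Variance-reduction test: the correct guess explains more of the measured time,
    # so the variance of (measured - predicted) is smaller for it.
    residual_var = []
    for guess in (0, 1):
        residuals = [t - predicted(c, recovered, guess) for c, t in zip(ciphertexts, timings)]
        residual_var.append(variance(residuals))
    recovered.append(0 if residual_var[0] < residual_var[1] else 1)

print("secret   :", bin(SECRET))
print("recovered:", "0b" + "".join(map(str, recovered)))
```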