34 results for Music Composition, Interface, Electronic Music, Computer, Performance
Abstract:
Sound is potentially an effective medium for analysing data: listeners can interpret several layers of sound simultaneously and identify changes within them. Multiple attempts to use sound with scientific data have been made, with varying levels of success, but on many occasions without including the end user during development. In this study a sonified model of the eight planets of our solar system was built and tested using an end-user approach. The sonification was created for the Esplora Planetarium, which is currently being constructed in Malta. The data requirements were gathered from a member of the planetarium staff, and the sonification was then tested by 12 end users as well as the planetarium representative. The results suggest that listeners were able to discern various planetary characteristics without requiring any additional information. Three of the eight sound design parameters did not represent their characteristics successfully; these issues have been identified, and further development will be conducted to improve the model.
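The parameter-mapping idea behind such a sonification can be sketched in a few lines. The mapping choices below (planetary radius to pitch, orbital period to pulse rate, and the specific ranges) are illustrative assumptions, not the planetarium model's actual design:

```python
# A minimal parameter-mapping sonification sketch (illustrative only; the
# planetarium model's actual mappings are not specified in the abstract).
# Hypothetical mapping: planet radius -> pitch, orbital period -> pulse rate.

PLANETS = {
    # name: (radius in Earth radii, orbital period in Earth years)
    "Mercury": (0.38, 0.24),
    "Earth":   (1.00, 1.00),
    "Jupiter": (11.21, 11.86),
    "Neptune": (3.88, 164.8),
}

def map_range(value, lo, hi, out_lo, out_hi):
    """Linearly rescale value from [lo, hi] to [out_lo, out_hi]."""
    t = (value - lo) / (hi - lo)
    return out_lo + t * (out_hi - out_lo)

def sonify(radius, period):
    # Larger planets -> lower pitch (880 Hz down to 110 Hz, an assumed range).
    pitch_hz = map_range(radius, 0.38, 11.21, 880.0, 110.0)
    # Longer orbits -> slower pulse (pulses per second, also an assumed range).
    pulse_hz = map_range(period, 0.24, 164.8, 4.0, 0.25)
    return round(pitch_hz, 1), round(pulse_hz, 2)

for name, (r, p) in PLANETS.items():
    pitch, pulse = sonify(r, p)
    print(f"{name:8s} pitch={pitch} Hz  pulse={pulse}/s")
```

The end-user testing described in the abstract would then determine which of these mappings listeners can actually discern.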
Distributed and compressed MIKEY mode to secure end-to-end communications in the Internet of things.
Abstract:
The Multimedia Internet KEYing protocol (MIKEY) aims at establishing secure credentials between two communicating entities. However, existing MIKEY modes fail to meet the requirements of low-power and low-processing devices. To address this issue, we combine two previously proposed approaches to introduce a new distributed and compressed MIKEY mode for the Internet of Things. Relying on a cooperative approach, a set of third parties is used to offload the heavy computational operations from the constrained nodes. In this way, the pre-shared key mode is used in the constrained part of the network, while the public key mode is used in the unconstrained part. Furthermore, to mitigate the communication cost, we introduce a new header compression scheme that reduces the size of MIKEY's header from 12 bytes to 3 bytes in the best case. Preliminary results show that our proposed mode is energy-preserving while its security properties remain intact.
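The general idea of context-based header compression can be illustrated with a toy sketch. The field layout and elision choices below are assumptions for illustration only, not the actual MIKEY header or the paper's scheme: fields whose values are fixed for a session are stored once in a shared context and omitted from each packet, shrinking a 12-byte header to 3 bytes.

```python
# Toy context-based header compression (hypothetical field layout, not the
# real MIKEY header): session-constant fields live in a shared context and
# are elided from every packet.
import struct

# Full 12-byte header: version, data_type, next_payload, flags (1 byte each),
# plus a 4-byte session identifier and a 4-byte counter.
def full_header(version, dtype, next_payload, flags, session_id, counter):
    return struct.pack("!BBBBII", version, dtype, next_payload, flags,
                       session_id, counter)

# Shared per-session context negotiated once (an assumption for this sketch).
CONTEXT = {"version": 1, "dtype": 0, "flags": 0, "session_id": 0xCAFEBABE}

def compress(header):
    v, d, np_, f, sid, ctr = struct.unpack("!BBBBII", header)
    assert (v, d, f, sid) == (CONTEXT["version"], CONTEXT["dtype"],
                              CONTEXT["flags"], CONTEXT["session_id"])
    # Keep only what varies: next_payload plus the low 16 bits of the counter.
    return struct.pack("!BH", np_, ctr & 0xFFFF)

def decompress(packed, ctr_high):
    np_, ctr_low = struct.unpack("!BH", packed)
    return full_header(CONTEXT["version"], CONTEXT["dtype"], np_,
                       CONTEXT["flags"], CONTEXT["session_id"],
                       (ctr_high << 16) | ctr_low)

h = full_header(1, 0, 5, 0, 0xCAFEBABE, 0x00012345)
c = compress(h)
print(len(h), "->", len(c), "bytes")  # 12 -> 3 bytes
assert decompress(c, 0x0001) == h
```

The "best compression case" in the abstract corresponds to the situation where every elidable field matches the negotiated context, as in this sketch.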
Abstract:
Imagine being told that your wage was going to be cut in half. Well, that’s what’s soon going to happen to those who make money from Bitcoin mining, the process of earning the online currency Bitcoin. The current expected date for this change is 11 July 2016. Many see this as the day when Bitcoin prices will rocket and when Bitcoin owners could make a great deal of money. Others see it as the start of a Bitcoin crash. At present no one quite knows which way it will go. Bitcoin was created in 2009 by someone known as Satoshi Nakamoto, drawing on a long line of earlier research. It is a cryptocurrency, meaning it uses digital encryption techniques to create bitcoins and secure financial transactions. It doesn’t need a central government or organisation to regulate it, nor a broker to manage payments. Conventional currencies usually have a central bank that creates money and controls its supply. Bitcoin is instead created when individuals “mine” for it by using their computers to perform complex calculations through special software. The algorithm behind Bitcoin is designed to limit the number of bitcoins that can ever be created. All Bitcoin transactions are recorded on a public database known as a blockchain. Each successful round of mining is recorded in a new block that is transmitted to every Bitcoin node across the network, like a bank updating its online records.
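The "complex calculations" behind mining can be sketched as a toy proof-of-work loop, vastly simplified: real Bitcoin mining double-hashes an 80-byte block header with SHA-256 against a far harder difficulty target.

```python
# A toy proof-of-work loop in the spirit of Bitcoin mining (simplified:
# real mining uses double SHA-256 over a binary block header and a much
# harder target than a few leading zeros).
import hashlib

def mine(block_data: str, difficulty: int):
    """Find a nonce so the block hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block 1: Alice pays Bob 0.5 BTC", difficulty=4)
print(f"nonce={nonce} hash={digest}")
```

Raising `difficulty` by one hex digit multiplies the expected work by 16, which is how the network keeps block production slow no matter how much computing power joins.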
Abstract:
This report covers a workshop on digital engagement for Community Councils supported by the School of Computing's public engagement fund. It was held in Glasgow on 22 March 2016. The workshop combined presentations by subject experts with attendee-led round-table discussions. It was well received and felt by delegates to be of immediate benefit. There is clear demand for follow-up events, potentially more focussed on training.
Abstract:
This model-driven approach to service evolution in clouds focuses mainly on the advantage of reusable evolution patterns for solving evolution problems. During the process, evolution patterns are driven by MDA models into pattern aspects. Weaving these aspects into the service-based process at runtime, using an aspect-oriented extended BPEL engine, provides the dynamic feature of the evolution.
Abstract:
Understanding the evolution of sociality in humans and other species requires understanding how selection on social behaviour varies with group size. However, the effects of group size are frequently obscured in the theoretical literature, which often makes assumptions that are at odds with empirical findings. In particular, mechanisms are suggested as supporting large-scale cooperation when they would in fact rapidly become ineffective with increasing group size. Here we review the literature on the evolution of helping behaviours (cooperation and altruism), and frame it using a simple synthetic model that allows us to delineate how the three main components of the selection pressure on helping must vary with increasing group size. The first component is the marginal benefit of helping to group members, which determines both direct fitness benefits to the actor and indirect fitness benefits to recipients. While this is often assumed to be independent of group size, marginal benefits are in practice likely to be maximal at intermediate group sizes for many types of collective action problems, and will eventually become very small in large groups due to the law of diminishing returns. The second component is the response of social partners to the past play of an actor, which underlies conditional behaviour under repeated social interactions. We argue that under realistic conditions on the transmission of information in a population, this response to past play decreases rapidly with increasing group size, so that reciprocity alone (whether direct, indirect, or generalised) cannot sustain cooperation in very large groups. The final component is the relatedness between actor and recipient, which, according to the rules of inheritance, again decreases rapidly with increasing group size.
These results explain why helping behaviours in very large social groups are limited to cases where the number of reproducing individuals is small, as in social insects, or where there are social institutions that can promote (possibly through sanctioning) large-scale cooperation, as in human societies. Finally, we discuss how individually devised institutions can foster the transition from small-scale to large-scale cooperative groups in human evolution.
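The claim that marginal benefits peak at intermediate group sizes can be illustrated numerically with a toy benefit function. The functional form below is an assumption for illustration, not the paper's synthetic model: synergy makes helping more valuable as small groups grow, while diminishing returns erode it in large ones.

```python
# Illustrative (not the paper's) marginal-benefit curve: synergy at small N,
# diminishing returns at large N, via b(N) = a * N * exp(-N / k).
import math

def marginal_benefit(n, a=1.0, k=10.0):
    """Marginal benefit of one act of helping in a group of size n."""
    return a * n * math.exp(-n / k)

for n in [2, 5, 10, 20, 50, 200]:
    print(f"N={n:4d}  b(N)={marginal_benefit(n):.3f}")
# b(N) rises, peaks near N = k, then decays toward zero in very large groups.
```

Any function with this rise-then-decay shape reproduces the qualitative point: selection for helping through marginal benefits alone weakens drastically as groups become very large.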
Abstract:
Nowadays almost no crime is committed without a trace of digital evidence, and since the advanced functionality of today's mobile devices can be exploited to assist in crime, mobile forensics has become imperative. Many of the mobile applications available today, including internet browsers, will request the user’s permission to access their current location when in use. This geolocation data is subsequently stored and managed by that application's underlying database files. If recovered from a device during a forensic investigation, such GPS evidence and track points could hold major evidentiary value for a case. The aim of this paper is to examine and compare to what extent geolocation data is available from the iOS and Android operating systems. We focus particularly on geolocation data recovered from internet browsing applications, comparing the native Safari and Browser apps with Google Chrome, downloaded onto both platforms. All browsers were used over a period of several days at various locations to generate comparable test data for analysis. Results show considerable differences not only in the storage locations and formats, but also in the amount of geolocation data stored by different browsers and on different operating systems.
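Browsers typically keep such records in SQLite database files, which an examiner can query directly once recovered. The sketch below uses a fabricated in-memory database and hypothetical table and column names; real browser schemas differ per application and OS version, which is exactly the variation the paper examines.

```python
# Sketch of querying a recovered browser database for geolocation records.
# The schema ("geolocation" table, its columns) and the rows are fabricated
# stand-ins; real browser databases vary by app and OS version.
import sqlite3

def extract_geolocation(conn):
    """Pull timestamped GPS fixes from a recovered database connection."""
    return conn.execute(
        "SELECT timestamp, latitude, longitude FROM geolocation "
        "ORDER BY timestamp"
    ).fetchall()

# Demo: an in-memory database standing in for a recovered file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE geolocation "
             "(timestamp INTEGER, latitude REAL, longitude REAL)")
conn.executemany("INSERT INTO geolocation VALUES (?, ?, ?)",
                 [(1700000000, 35.8989, 14.5146),   # fabricated fix (Malta)
                  (1700003600, 35.8722, 14.5042)])
for ts, lat, lon in extract_geolocation(conn):
    print(ts, lat, lon)
```

In practice the examiner would open the recovered `.db`/`.sqlite` file read-only and first enumerate its tables, since the names are not known in advance.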
Abstract:
Data leakage is a serious issue and can result in the loss of sensitive data, compromising user accounts and details, potentially affecting millions of internet users. This paper contributes to research in online security and reducing personal footprint by evaluating the levels of privacy provided by the Firefox browser. The aim of identifying conditions that would minimize data leakage and maximize data privacy is addressed by assessing and comparing data leakage in the four possible browsing modes: normal and private modes using a browser installed on the host PC or using a portable browser from a connected USB device respectively. To provide a firm foundation for analysis, a series of carefully designed, pre-planned browsing sessions were repeated in each of the various modes of Firefox. This included low RAM environments to determine any effects low RAM may have on browser data leakage. The results show that considerable data leakage may occur within Firefox. In normal mode, all of the browsing information is stored within the Mozilla profile folder in Firefox-specific SQLite databases and sessionstore.js. While passwords were not stored as plain text, other confidential information such as credit card numbers could be recovered from the Form history under certain conditions. There is no difference when using a portable browser in normal mode, except that the Mozilla profile folder is located on the USB device rather than the host's hard disk. By comparison, private browsing reduces data leakage. Our findings confirm that no information is written to the Firefox-related locations on the hard disk or USB device during private browsing, implying that no deletion would be necessary and no remnants of data would be forensically recoverable from unallocated space. However, two aspects of data leakage occurred equally in all four browsing modes. 
Firstly, all of the browsing history was stored in the live RAM and was therefore accessible while the browser remained open. Secondly, in low RAM situations, the operating system caches out RAM to pagefile.sys on the host's hard disk. Irrespective of the browsing mode used, this may include Firefox history elements which can then remain forensically recoverable for considerable time.
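Recovering such history elements from a RAM image or pagefile.sys amounts to carving printable URL-like strings out of raw bytes. A minimal sketch (real forensic tools handle character encodings, fragmentation, and context reconstruction far more carefully):

```python
# Minimal string carving of URL-like remnants from a raw memory/pagefile
# image. Real tools (e.g. full forensic suites) do considerably more.
import re

# Printable ASCII (no spaces/controls), bounded length, http or https.
URL_PATTERN = re.compile(rb"https?://[\x21-\x7e]{4,200}")

def carve_urls(image_bytes: bytes):
    """Return unique printable URL-like strings found in a raw dump."""
    found = []
    for match in URL_PATTERN.finditer(image_bytes):
        url = match.group().decode("ascii", errors="replace")
        if url not in found:
            found.append(url)
    return found

# Demo on a fabricated dump fragment.
dump = (b"\x00\x03garbagehttps://example.org/search?q=test\x00"
        b"morehttp://example.com\x00")
print(carve_urls(dump))
```

Because the pagefile persists across reboots, hits carved this way can remain recoverable long after the private browsing session that produced them has ended.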
Abstract:
SQL injection is a common attack method used to leverage information out of a database or to compromise a company’s network. This paper investigates four injection attacks that can be conducted against the PL/SQL engine of Oracle databases, comparing two recent releases (10g, 11g) of Oracle. The results of the experiments showed that both releases of Oracle were vulnerable to injection, but that the packages in which injection could be conducted often differed between releases.
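The general attack class is easy to demonstrate. The sketch below uses SQLite for brevity rather than the Oracle PL/SQL packages the paper targets; the principle (attacker input concatenated into SQL text rewrites the query) is the same.

```python
# Generic SQL injection illustration (SQLite stand-in; the paper's attacks
# target Oracle PL/SQL packages, whose specifics differ).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('root', 1)")

def lookup_vulnerable(name):
    # String concatenation lets attacker-controlled input rewrite the query.
    return conn.execute(
        "SELECT name FROM users WHERE name = '" + name + "'").fetchall()

def lookup_safe(name):
    # A bind parameter keeps the input as data, never as SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR is_admin = 1 --"
print(lookup_vulnerable(payload))  # leaks the admin row: [('root',)]
print(lookup_safe(payload))        # returns nothing: []
```

The release-to-release differences the paper reports concern which built-in packages accept such unsanitised input, not the injection mechanism itself.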
Abstract:
This paper discusses the large-scale group project undertaken by BSc Hons Digital Forensics students at Abertay University in their penultimate year. The philosophy of the project is to expose students to the full digital crime "life cycle", from commission through investigation, preparation of formal court report and finally, to prosecution in court. In addition, the project is novel in two aspects: the "crimes" are committed by students, and the moot court proceedings, where students appear as expert witnesses for the prosecution, are led by law students acting as counsels for the prosecution and defence. To support students, assessments are staged across both semesters with staff feedback provided at critical points. Feedback from students is very positive, highlighting particularly the experience of engaging with the law students and culminating in the realistic moot court, including a challenging cross-examination. Students also commented on the usefulness of the final debrief, where the whole process and the student experience are discussed in an informal plenary meeting between DF students and staff, providing an opportunity for the perpetrators and investigators to discuss details of the "crimes", and enabling all groups to learn from all crimes and investigations. We conclude with a reflection on the challenges encountered and a discussion of planned changes.
Abstract:
The importance of e-government models lies in their offering a basis to measure and guide e-government. There is still no agreement on how to assess a government online. Most of the e-government models are not based on research, nor are they validated. In most countries, e-government has not reached higher stages of growth. Several scholars have painted a confusing picture of e-government. What is lacking is an in-depth analysis of e-government models. Responding to the need for such an analysis, this study identifies the strengths and weaknesses of major national and local e-government evaluation models. The common limitations of most models are focusing on the government and not the citizen, missing qualitative measures, constructing the e-equivalent of a bureaucratic administration, and defining general criteria without sufficient validations. In addition, this study has found that the metrics defined for national e-government are not suitable for municipalities, and most of the existing studies have focused on national e-governments even though local ones are closer to citizens. There is a need for developing a good theoretical model for both national and local municipal e-government.
Abstract:
There is still a lack of an engineering approach for building Web systems, and the field of measuring the Web is not yet mature. In particular, there is uncertainty in the selection of evaluation methods, and there are risks of standardizing inadequate evaluation practices. It is important to know whether we are evaluating the Web as a whole or specific website(s). We need a new categorization system, a different focus on evaluation methods, and an in-depth analysis that reveals the strengths and weaknesses of each method. As a contribution to the field of Web evaluation, this study proposes a novel approach to viewing and selecting evaluation methods based on the purpose and platform of the evaluation. It is shown that the choice of the appropriate evaluation method(s) depends greatly on the purpose of the evaluation.
Abstract:
Recent years have seen an astronomical rise in SQL Injection Attacks (SQLIAs) used to compromise the confidentiality, authentication and integrity of organisations’ databases. Intruders are becoming smarter at obfuscating web requests to evade detection; combined with increasing volumes of web traffic from the Internet of Things (IoT), cloud-hosted and on-premise business applications, this makes it evident that existing, mostly static signature-based approaches cannot cope with novel attack signatures. A SQLIA detection and prevention solution can be achieved by exploring an alternative bio-inspired supervised learning approach that takes as input a labelled dataset of numerical attributes for classifying true positives and negatives. We present in this paper Numerical Encoding to Tame SQLIA (NETSQLIA), a proof of concept for scalable numerical encoding of features into dataset attributes with a labelled class, obtained from deep web traffic analysis. For the numerical attribute encoding, the model leverages a proxy for the interception and decryption of web traffic. The intercepted web requests are then assembled for front-end SQL parsing and pattern matching by applying a traditional Non-Deterministic Finite Automaton (NFA). This paper presents a technique for extracting numerical attributes of any size, primed as an input dataset to an Artificial Neural Network (ANN) and statistical Machine Learning (ML) algorithms implemented using Two-Class Averaged Perceptron (TCAP) and Two-Class Logistic Regression (TCLR) respectively. This methodology then forms the subject of an empirical evaluation of the suitability of the model for the accurate classification of both legitimate web requests and SQLIA payloads.
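The final stage of such a pipeline, numerical attribute encoding feeding a perceptron classifier, can be sketched as follows. The token features, training data, and plain (non-averaged) perceptron below are toy stand-ins, not the paper's NETSQLIA encoding or its TCAP implementation:

```python
# Toy sketch of numerical encoding + perceptron classification of requests.
# Features and data are illustrative stand-ins, not the NETSQLIA scheme.

SUSPICIOUS = ["'", "--", " or ", "union", "select", ";", "="]

def encode(request: str):
    """Map a request string to numerical attributes: per-token counts."""
    low = request.lower()
    return [low.count(tok) for tok in SUSPICIOUS]

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Classic perceptron learning rule on the encoded attributes."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

requests = ["id=42", "page=home", "id=1' OR '1'='1' --",
            "q=1 UNION SELECT password FROM users"]
labels = [0, 0, 1, 1]  # 1 = SQLIA payload
X = [encode(r) for r in requests]
w, b = train_perceptron(X, labels)
print([classify(w, b, x) for x in X])  # -> [0, 0, 1, 1]
```

The paper's contribution sits upstream of this step: intercepting and decrypting traffic via a proxy, NFA-based parsing, and a scalable encoding, so that classifiers like the averaged perceptron or logistic regression receive well-formed numerical attributes at any volume.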