876 results for bigdata, data stream processing, dsp, apache storm, cyber security
Abstract:
In 2002, 2003 and 2004, we took macroinvertebrate samples on a total of 36 occasions in Badacsony Bay of Lake Balaton. Our sampling site was characterised by areas of open water (in 2003 and 2004 full of reed-grass) as well as by areas covered by common reed (Phragmites australis) and narrowleaf cattail (Typha angustifolia). Samples were taken from both the water body and the benthic ooze using a stiff hand net. Our data were obtained by processing 208 individual samples. We sampled frequently from early spring until late autumn to gain a deeper understanding of the processes of seasonal dynamics. The main seasonal patterns and temporal changes of diversity were described. With a view to the further use of our data in climate change research, we constructed a weather-dependent simulation model of these seasonal dynamics. We described the total number of individuals, the biovolume and the diversity of all macroinvertebrate species with a single index and used the temporal trends of this index for simulation modelling. Our discrete deterministic model includes only the impact of temperature; other interactions can appear only implicitly. By running the model under different climate change scenarios, we were able to estimate conditions for the 2070-2100 period. The results, however, should be treated with caution, not only because our model is very simple but also because the scenarios themselves are the outputs of different models.
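The abstract does not give the model equations. As a minimal sketch of a discrete, deterministic, temperature-only driver of a single community index, one might write something like the following; the bell-shaped response, the rate constants and the temperature curve are all hypothetical, not the authors' model.

    import math

    # Hypothetical bell-shaped temperature response: growth is fastest
    # near an optimal temperature and falls off on either side.
    def temperature_response(temp_c, t_opt=20.0, width=8.0):
        return max(0.0, 1.0 - ((temp_c - t_opt) / width) ** 2)

    # One deterministic step per day; temperature is the only driver,
    # so all other interactions are hidden inside the two rate constants.
    def simulate(index0, daily_temps_c, growth=0.05, decay=0.02):
        index, trajectory = index0, [index0]
        for temp in daily_temps_c:
            index += (growth * temperature_response(temp) - decay) * index
            trajectory.append(index)
        return trajectory

    # A crude sinusoidal approximation of one year of daily temperatures.
    temps = [10 + 12 * math.sin(2 * math.pi * (d - 90) / 365) for d in range(365)]
    print(simulate(1.0, temps)[-1])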
Abstract:
Advanced engineering tools are in great demand in biology, biochemistry and medicine, yet many of the existing instruments and tools are expensive and require special facilities. With the advent of nanotechnology in the past decade, new approaches to developing devices and tools have been generated by academia and industry. One such technology, NMR spectroscopy, has been used by biochemists for more than two decades to study the molecular structure of chemical compounds. However, NMR spectrometers are very expensive and require special laboratory rooms for their proper operation. High magnetic fields with strengths on the order of several tesla make these instruments unaffordable to most research groups. This doctoral research proposes a new technology for NMR spectrometers that can operate at field strengths of less than 0.5 tesla using an inexpensive permanent magnet and spin-dependent nanoscale magnetic devices. This portable NMR system is intended to analyze samples as small as a few nanoliters. The main problem to resolve when downscaling is to obtain an NMR signal with a high signal-to-noise ratio (SNR). A special Tunneling Magneto-Resistive (TMR) sensor design was developed to achieve this goal. The minimum specifications for each component of the proposed NMR system were established, and a complete NMR system was designed based on these minimum requirements. The goal was always to find cost-effective, realistic components. The novel design of the NMR system uses technologies such as Direct Digital Synthesis (DDS), Digital Signal Processing (DSP) and a special backpropagation neural network that finds the best match for the NMR spectrum. The system was designed, calculated and simulated with excellent results. In addition, a general method to design TMR sensors was developed; the technique was automated, and a computer program was written to help the designer perform this task interactively.
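As a concrete illustration of the low-field regime targeted here: the proton resonance (Larmor) frequency scales linearly with field strength, f = (γ/2π)·B, with γ/2π ≈ 42.577 MHz/T for ¹H, which is one reason SNR falls steeply at low field. A short sketch (the values are standard physics, not taken from the thesis):

    # Proton Larmor frequency f = (gamma / 2pi) * B; gamma/2pi for 1H
    # is approximately 42.577 MHz per tesla.
    GAMMA_OVER_2PI_MHZ_PER_T = 42.577

    def larmor_mhz(field_tesla):
        """NMR resonance frequency of protons at the given field strength."""
        return GAMMA_OVER_2PI_MHZ_PER_T * field_tesla

    # Sub-0.5 T permanent magnets versus a conventional 9.4 T (400 MHz) magnet.
    for b in (0.1, 0.25, 0.5, 9.4):
        print(f"B = {b:>4} T  ->  f = {larmor_mhz(b):7.2f} MHz")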
Abstract:
Modern geographical databases, which are at the core of geographic information systems (GIS), store a rich set of aspatial attributes in addition to geographic data. Typically, aspatial information comes in textual and numeric format. Retrieving information constrained on both spatial and aspatial data from geodatabases gives GIS users the ability to perform more interesting spatial analyses, and applications the ability to support composite location-aware searches; for example, in a real estate database: “Find the homes for sale nearest to my current location that have a backyard and whose prices are between $50,000 and $80,000”. Efficient processing of such queries requires combined indexing strategies over multiple types of data. Existing spatial query engines commonly apply a two-filter approach (a spatial filter followed by a nonspatial filter, or vice versa), which can incur large performance overheads. At the same time, the amount of geolocation data in databases has grown rapidly, due in part to advances in geolocation technologies (e.g., GPS-enabled smartphones) that allow users to associate location data with objects or events. This growth poses potential large-volume data ingestion challenges for practical GIS databases. In this dissertation, we first show how indexing spatial data with R-trees (a typical data pre-processing task) can be scaled in MapReduce, a widely adopted parallel programming model for data-intensive problems. The evaluation of our algorithms in a Hadoop cluster showed close to linear scalability in building R-tree indexes. Subsequently, we develop efficient algorithms for processing spatial queries with aspatial conditions; novel techniques for simultaneously indexing spatial data together with textual and numeric data are developed to that end. Experimental evaluations with real-world, large spatial datasets measured query response times within the sub-second range for most cases, and up to a few seconds for a small number of cases, which is reasonable for interactive applications. Overall, these results show that the MapReduce parallel model is suitable for indexing tasks in spatial databases, and that an adequate combination of spatial and aspatial attribute indexes can attain acceptable response times for interactive spatial queries with constraints on aspatial data.
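The dissertation's combined indexes are not detailed in this abstract; for contrast, the two-filter baseline it improves on can be sketched in a few lines: a spatial filter (a plain distance check standing in for an R-tree probe) followed by an aspatial attribute filter. The table, fields and data are illustrative.

    # Sketch of the two-filter baseline: spatial filter first, then an
    # aspatial (attribute) filter. A real engine would probe an R-tree
    # rather than scan; records and field names are illustrative.

    homes = [
        {"id": 1, "x": 3.0, "y": 4.0, "price": 65000, "backyard": True},
        {"id": 2, "x": 0.5, "y": 0.5, "price": 95000, "backyard": True},
        {"id": 3, "x": 1.0, "y": 2.0, "price": 55000, "backyard": False},
    ]

    def spatial_filter(rows, x, y, radius):
        # Stage 1: keep rows within `radius` of the query location.
        return [r for r in rows
                if (r["x"] - x) ** 2 + (r["y"] - y) ** 2 <= radius ** 2]

    def aspatial_filter(rows, lo, hi):
        # Stage 2: keep rows with a backyard and a price in [lo, hi].
        return [r for r in rows if r["backyard"] and lo <= r["price"] <= hi]

    print(aspatial_filter(spatial_filter(homes, 0.0, 0.0, 6.0), 50000, 80000))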
Abstract:
Resilience is widely accepted as a desirable property of cyber-physical systems (CPS). However, there are no metrics for measuring the resilience of CPS that take into account the multi-dimensional nature of performance in these systems. In this work, we present first results towards a resilience metric framework. The key contributions of this framework are threefold. First, it allows resilience to be evaluated with respect to the different performance indicators that are of interest. Second, complexities that are relevant to the performance indicators of interest can be intentionally abstracted. Third and finally, it supports the identification of the reasons for good or bad resilience, in order to improve system design.
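The framework's metric is not defined in this abstract. Purely as an illustration of evaluating resilience against one performance indicator, a common textbook-style formulation compares delivered performance to target performance over a disruption window; the sketch below uses that ratio with invented numbers and is not the paper's metric.

    # Illustrative resilience ratio for a single performance indicator:
    # area under the delivered-performance curve divided by the area under
    # the target curve over the same window. Not the framework's metric.

    def resilience_ratio(delivered, target):
        # Both arguments are equally spaced samples of the indicator.
        assert len(delivered) == len(target) and target
        return sum(delivered) / sum(target)

    target = [100] * 10                                       # nominal performance
    delivered = [100, 100, 40, 30, 50, 70, 85, 95, 100, 100]  # dip after a fault
    print(f"resilience = {resilience_ratio(delivered, target):.2f}")  # 0.77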
Abstract:
Nowadays many interesting, useful and imaginative applications are springing up in the Android software market, and some of them bring great convenience to guitar fans: a tuner app saves people from carrying a physical tuner all the time, some apps simulate a real guitar, and some provide simple lessons for learning the basics. But the apps that teach cannot really “monitor” the player; they just give instructions and hope the user follows them. My project is therefore to design an app that can detect, in real time, whether the user is playing correctly. Chords are usually the first thing new guitar players learn: a chord is a set of notes combined in a regulated way, drawn from centuries of developing music theory, and 'pitch' is the property that distinguishes a note from other notes or from noise. The problem here is thus to manage multi-pitch analysis in real time. It is also necessary to know some basics of digital signal processing (DSP), because digital signals are much more convenient for computers to analyse than analog signals. I therefore chose TarsosDSP, an audio processing Java library, and applied it to my Android project.
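TarsosDSP's own API is not shown in this abstract, so as a language-neutral illustration of the underlying idea, here is a minimal single-pitch detector based on autocorrelation; detecting the several simultaneous pitches of a chord in real time, as the project requires, is considerably harder. The frame length, frequency range and test tone are illustrative.

    # Minimal autocorrelation pitch detector (single pitch, offline).
    # Illustrates the principle behind pitch trackers such as those in
    # TarsosDSP; chord detection needs multi-pitch analysis on top.

    import numpy as np

    def detect_pitch(frame, sample_rate, fmin=70.0, fmax=1000.0):
        """Return the dominant fundamental frequency of a mono frame, in Hz."""
        frame = frame - frame.mean()
        corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        lo = int(sample_rate / fmax)      # smallest lag worth checking
        hi = int(sample_rate / fmin)      # largest lag worth checking
        lag = lo + int(np.argmax(corr[lo:hi]))
        return sample_rate / lag

    sr = 44100
    t = np.arange(sr // 10) / sr                   # a 100 ms test frame
    frame = np.sin(2 * np.pi * 110.0 * t)          # open A string, 110 Hz
    print(f"{detect_pitch(frame, sr):.1f} Hz")     # ~110.0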
Abstract:
This work explores the development of MemTri, a memory forensics triage tool that can assess the likelihood of criminal activity in a memory image, based on evidence data artefacts generated by several applications. Fictitious illegal suspect activity scenarios were performed on virtual machines to generate 60 test memory images as input for MemTri. Four categories of applications (i.e. internet browsers, instant messengers, FTP clients and document processors) are examined for data artefacts, which are located through the use of regular expressions. The identified data artefacts are then analysed using a Bayesian network to assess the likelihood that a seized memory image contains evidence of illegal activity. MemTri is currently under development, and this paper introduces only the basic concept as well as the components that the application is built on. A complete description of MemTri, coupled with extensive experimental results, is expected to be published in the first half of 2017.
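MemTri's actual patterns and network structure are not given here. The sketch below only illustrates the shape of the pipeline: scan a raw image for artefacts with regular expressions, then fold the hits into a likelihood. The patterns and weights are invented, and a naive Bayes odds update stands in for the full Bayesian network.

    # Illustrative MemTri-shaped pipeline: regex scan of a raw memory
    # image, then a probabilistic roll-up of the hits. The patterns,
    # likelihood ratios and naive-Bayes shortcut are all invented.

    import re

    ARTEFACT_PATTERNS = {
        "ftp_login":  rb"USER\s+\w+\r\nPASS\s+\S+",
        "im_message": rb"\"msg_text\"\s*:\s*\"[^\"]{1,200}\"",
    }
    # Invented P(artefact | illegal) / P(artefact | legal) ratios.
    LIKELIHOOD_RATIOS = {"ftp_login": 4.0, "im_message": 2.5}

    def scan(image_bytes):
        return {name: bool(re.search(pattern, image_bytes))
                for name, pattern in ARTEFACT_PATTERNS.items()}

    def posterior_odds(hits, prior_odds=1.0):
        odds = prior_odds
        for name, found in hits.items():
            if found:
                odds *= LIKELIHOOD_RATIOS[name]
        return odds

    image = b"...USER alice\r\nPASS hunter2..."    # stand-in memory image
    odds = posterior_odds(scan(image))
    print(f"P(illegal activity) = {odds / (1 + odds):.2f}")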
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
It is now generally accepted that cyber crime represents a serious threat to organisations, and that they need to take appropriate action to protect their valuable information assets. However, current research shows that, although small businesses understand that they are potentially vulnerable, many are still not taking sufficient action to counteract the threat. Last year, the authors sought, through a more generalised but categorised attitudinal study, to explore the reasons why smaller SMEs in particular were reluctant to engage with accepted principles for protecting their data. The results showed that SMEs understood many of the issues. They were prepared to spend more, but were particularly suspicious about spending on information assurance. The authors' current research again focuses on SME attitudes, but this time the survey asks only questions directly relating to information assurance and the standards available, in an attempt to understand exactly what causes SMEs to shy away from obtaining the badge or certificate that would demonstrate to customers and business partners that they take cyber security seriously. As with last year's study, the results and analysis provide useful pointers towards the broader business environment changes that might make SMEs more interested in working towards an appropriate cyber security standard.
Abstract:
The human factor is often recognised as a major aspect of cyber-security research. Risk and situational perception are identified as key factors in the decision-making process, often playing a lead role in the adoption of security mechanisms. However, risk awareness and perception have been poorly investigated in the field of eHealth wearables. Whilst end-users often have a limited understanding of the privacy and security of wearables, assessing the perceived risks and consequences will help shape the usability of future security mechanisms. This paper presents a survey of the risks and situational awareness in eHealth services. An analysis of the lack of security and privacy measures in connected health devices is presented, with recommendations to circumvent critical situations.
Abstract:
This paper provides an overview of IDS types and how they work, as well as configuration considerations and issues that affect them. Advanced methods of increasing the performance of an IDS are explored, such as specification-based IDS for protecting Supervisory Control And Data Acquisition (SCADA) and cloud networks. By reviewing varied studies, ranging from configuration issues and specific problems to custom techniques and cutting-edge work, the paper also provides a reference for others interested in learning about and developing IDS solutions. Intrusion detection is an area where much further study is required in order to provide solutions that satisfy evolving services and the networks and systems that support them. This paper aims to be a reference on IDS technologies for researchers and developers interested in the field of intrusion detection.
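As one concrete illustration of the specification-based approach mentioned above for SCADA, a checker can whitelist exactly the protocol behaviour the specification allows and alert on everything else. The sketch below uses an invented whitelist of Modbus-style function codes and register ranges; it is illustrative, not a real policy.

    # Sketch of a specification-based IDS check: alert on any traffic
    # outside an explicitly whitelisted protocol behaviour. The whitelist
    # of Modbus-style function codes below is invented for illustration.

    ALLOWED = {
        # (source, function_code): permitted register range
        ("hmi-1", 3): range(0, 100),   # read holding registers 0-99
        ("hmi-1", 6): range(0, 10),    # write single register 0-9
    }

    def check(source, function_code, register):
        key = (source, function_code)
        if key not in ALLOWED or register not in ALLOWED[key]:
            return f"ALERT: {source} fc={function_code} reg={register} violates spec"
        return "ok"

    print(check("hmi-1", 3, 42))   # ok: permitted read
    print(check("hmi-1", 6, 50))   # alert: write outside the permitted range
    print(check("eng-2", 3, 5))    # alert: unknown source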
Abstract:
Security Onion is a Network Security Monitoring (NSM) platform that provides multiple Intrusion Detection Systems (IDS), including a Host IDS (HIDS) and a Network IDS (NIDS). Many types of data can be acquired using Security Onion for analysis, including host, network, session, asset, alert and protocol data. Security Onion can be implemented as a standalone deployment, with server and sensor included, or with a master server and multiple sensors, allowing the system to be scaled as required. Many interfaces and tools are available for managing the system and analysing data, such as Sguil, Snorby, Squert and Enterprise Log Search and Archive (ELSA). These interfaces can be used for analysis of alerts and captured events, which can then be exported for analysis in Network Forensic Analysis Tools (NFAT) such as NetworkMiner, CapME or Xplico. The Security Onion platform also provides various methods of management, such as Secure Shell (SSH) access to the server and sensors and Web client remote access. All of this, together with the ability to replay and analyse example malicious traffic, makes Security Onion a suitable low-cost alternative for network security management. In this paper, we review the features and functionality of Security Onion in terms of types of data, configuration, interfaces, tools and system management.
Abstract:
In this paper, we demonstrate a digital signal processing (DSP) algorithm for improving the spatial resolution of images captured by CMOS cameras. The basic approach is to reconstruct a high-resolution (HR) image from a shift-related low-resolution (LR) image sequence. The aliasing relationship between the Fourier transforms of discrete and continuous images in the frequency domain is used to map the LR images to an HR image. The method of projection onto convex sets (POCS) is applied to trace the best estimate of pixel matching from the LR images to the reconstructed HR image. Computer simulations and preliminary experimental results have shown that the algorithm works effectively for post-capture image processing in CMOS cameras. It can also be applied to HR digital image reconstruction wherever the shift information of the LR image sequence is known.
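The paper's exact formulation is not reproduced in this abstract. The following 1-D sketch shows the projection idea POCS builds on: repeatedly correct the HR estimate so that, after each known shift and downsampling, it reproduces the corresponding LR observation. The averaging operator, shifts and sizes are illustrative.

    # 1-D sketch of POCS-style super-resolution: project the HR estimate
    # onto the constraint set of each shifted LR observation in turn.
    # The operators and sizes are illustrative, not the paper's algorithm.

    import numpy as np

    def downsample(hr, shift, factor=2):
        # LR model: shift, then average each block of `factor` samples.
        return np.roll(hr, -shift).reshape(-1, factor).mean(axis=1)

    def pocs(lr_frames, shifts, hr_len, iters=50, factor=2):
        hr = np.zeros(hr_len)
        for _ in range(iters):
            for lr, shift in zip(lr_frames, shifts):
                err = lr - downsample(hr, shift, factor)
                # Exact projection onto {x : downsample(x, shift) == lr}
                # for this block-averaging operator.
                hr += np.roll(np.repeat(err, factor), shift)
        return hr

    truth = np.sin(np.linspace(0, 2 * np.pi, 16))    # unknown HR signal
    shifts = [0, 1]
    frames = [downsample(truth, s) for s in shifts]  # observed LR frames
    print(np.round(pocs(frames, shifts, hr_len=16) - truth, 2))  # residual error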
Abstract:
Secure transmission of bulk data is of interest to many content providers. A commercially viable distribution of content requires technology to prevent unauthorised access. Encryption tools are powerful, but have a performance cost. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Two technical solutions make it possible to perform bulk transmissions while retaining security without too high a performance overhead. These are:

a) Hierarchical encryption - the stronger the encryption, the harder it is to break, but also the more computationally expensive it is. A hierarchical approach to key exchange means that simple and relatively weak encryption and keys are used to encrypt small chunks of data, for example 10 seconds of video. Each chunk has its own key. New keys for this bottom-level encryption are exchanged using a slightly stronger encryption; for example, a whole-video key could govern the exchange of the 10-second chunk keys. At a higher level again, there could be daily or weekly keys securing the exchange of whole-video keys, and at a yet higher level, a subscriber key could govern the exchange of weekly keys. At higher levels the encryption becomes stronger but is used less frequently, so that the overall computational cost is minimal. The main observation is that the value of each encrypted item determines the strength of the key used to secure it.

b) Non-symbolic fragmentation with signal diversity - communications are usually assumed to be sent over a single communications medium, and the data to have been encrypted and/or partitioned in whole-symbol packets. Network and path diversity break up a file or data stream into fragments which are then sent over many different channels, either in the same network or in different networks. For example, a message could be transmitted partly over the phone network and partly via satellite. While TCP/IP does a similar thing in sending different packets over different paths, this is done for load-balancing purposes and is invisible to the end application. Network and path diversity deliberately introduce the same principle as a secure communications mechanism: an eavesdropper would need to intercept not just one transmission path but all paths used. Non-symbolic fragmentation of data is also introduced to further confuse any intercepted stream of data. This involves breaking up data into bit strings which are subsequently disordered prior to transmission. Even if all transmissions were intercepted, the cryptanalyst would still need to determine the fragment boundaries and correctly order them.

These two solutions depart from the usual idea of data encryption. Hierarchical encryption is an extension of the combined encryption of systems such as PGP, but with the distinction that the strength of encryption at each level is determined by the "value" of the data being transmitted. Non-symbolic fragmentation suppresses or destroys bit patterns in the transmitted data in what is essentially a bit-level transposition cipher, but with unpredictable, irregularly-sized fragments. Both technologies have applications outside the commercial sphere and can be used in conjunction with other forms of encryption, being functionally orthogonal.
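A minimal sketch of the key hierarchy in (a), assuming the Python `cryptography` package's Fernet recipe at every level. Fernet has a single strength, so the sketch shows only the structural point of keys wrapping keys; in the scheme described, higher levels would use progressively stronger ciphers.

    # Key hierarchy sketch: per-chunk keys protect small pieces of content,
    # a video key wraps the chunk keys, and a subscriber key wraps the
    # video key. Fernet stands in for every level; in the described scheme
    # the higher, less frequently used levels would use stronger ciphers.

    from cryptography.fernet import Fernet

    subscriber_key = Fernet.generate_key()        # top level, used rarely
    video_key = Fernet.generate_key()             # wraps the chunk keys
    chunks = [b"10s of video #1", b"10s of video #2"]

    encrypted_chunks, wrapped_chunk_keys = [], []
    for chunk in chunks:
        chunk_key = Fernet.generate_key()         # bottom level, one per chunk
        encrypted_chunks.append(Fernet(chunk_key).encrypt(chunk))
        wrapped_chunk_keys.append(Fernet(video_key).encrypt(chunk_key))
    wrapped_video_key = Fernet(subscriber_key).encrypt(video_key)

    # Receiver: unwrap downwards through the hierarchy.
    vk = Fernet(subscriber_key).decrypt(wrapped_video_key)
    for token, wrapped in zip(encrypted_chunks, wrapped_chunk_keys):
        print(Fernet(Fernet(vk).decrypt(wrapped)).decrypt(token))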
Abstract:
As introduced by Bentley et al. (2005), artificial immune systems (AIS) lack tissue, which is present in one form or another in all living multi-cellular organisms. Some have argued that this concept, in the context of AIS, brings little novelty to the already saturated field of immune-inspired computational research. This article aims to show that such a component of an AIS has the potential to give a data processing algorithm an advantage in terms of data pre-processing, clustering and the extraction of features desired by the immune-inspired system. The proposed tissue algorithm is based on self-organizing networks, such as the self-organizing maps (SOM) developed by Kohonen (1996), and on an analogy with the so-called Toll-like receptors (TLR) affecting the activation function of the clusters developed by the SOM.
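The tissue algorithm itself is not specified in this abstract. As background, its SOM building block can be sketched in a few lines: each input pulls the best-matching unit and its grid neighbours toward itself. The grid size, rates and data below are illustrative, and the TLR-inspired activation is omitted.

    # Minimal self-organizing map (SOM) training loop, the building block
    # the tissue algorithm is based on. The grid size, learning rate and
    # neighbourhood width are illustrative; the TLR-style activation from
    # the article is omitted.

    import numpy as np

    rng = np.random.default_rng(0)
    grid = rng.random((5, 5, 2))                  # 5x5 map of 2-D weight vectors
    coords = np.stack(np.mgrid[0:5, 0:5], axis=-1)

    def train_step(x, lr=0.5, sigma=1.5):
        dists = np.linalg.norm(grid - x, axis=-1)
        bmu = np.unravel_index(dists.argmin(), dists.shape)  # best-matching unit
        # Gaussian neighbourhood: units near the BMU move more than distant ones.
        g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma ** 2))
        grid[:] = grid + lr * g[..., None] * (x - grid)

    for x in rng.random((200, 2)):                # self-organize on random 2-D data
        train_step(x)
    print(grid[0, 0], grid[4, 4])                 # opposite corners, distinct regions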