47 results for DATA STORAGE


Relevance:

60.00%

Publisher:

Abstract:

Global Positioning System (GPS) devices in the Australian Football League (AFL) are the big-ticket item through which clubs try to gain any competitive advantage they can over their opposition. This paper explores whether the current application of GPS by clubs is worthwhile or a waste of time from three core perspectives: technical, organisational and personal. Issues include poor data storage and analysis, inaccurate units, a lack of appropriate business processes, and resistance to use. Although many of these issues can be addressed through improved technology, resolving the organisational and personal issues will require a change in mindset if the use of GPS in the AFL is to be a worthwhile endeavour. The paper concludes that the current use of GPS devices in the AFL is a waste of time.

Relevance:

60.00%

Publisher:

Abstract:

Industries in developed countries are moving quickly to ensure the rapid adoption of cloud computing. At this stage, several outstanding issues exist, particularly related to Service Level Agreements (SLAs), security and privacy. Consumers and businesses are willing to use cloud computing only if they can trust that their data will remain private and secure. Our review of the research literature indicates that the level of control a user has over their data is directly correlated with the level of data privacy provided by the cloud service. We considered several privacy factors from the industry perspective, namely data loss, the data storage location being unknown to the client, vendor lock-in, unauthorized secondary use of users' data for advertising or targeting, secured backup and easy restoration. The level of user control in database models was identified according to the extent to which these privacy factors are present. Finally, we focused on a novel logical model that might help raise the level of user control over privacy in cloud databases.

Relevance:

60.00%

Publisher:

Abstract:

Biopolymers can be produced through a variety of mechanisms. They can be derived from microbial systems, extracted from higher organisms such as plants, or synthesized chemically from basic biological building blocks. A wide range of emerging applications rely on all three of these production techniques. In recent years, considerable attention has been given to biopolymers produced by microbes, as it is at the microbial level that the tools of genetic engineering can be most readily applied. A number of novel materials are now being developed or introduced into the market. Biopolymers are being developed for use as medical materials, packaging, cosmetics, food additives, clothing fabrics, water treatment chemicals, industrial plastics, absorbents, biosensors, and even data storage elements. This review identifies the possible commercial applications and describes the various methods of production of microbial biopolymers.

Relevance:

60.00%

Publisher:

Abstract:

Various air-breathing marine vertebrates such as seals, turtles and seabirds show distinct patterns of diving behaviour. For fish, the distinction between different vertical behaviours is often less clear-cut, as there are no surface intervals to differentiate between dives. Using data from acoustic tags (n = 23) and archival depth recorders attached to cod Gadus morhua (n = 92) in the southern North Sea, we developed a quantitative method of classifying vertical movements in order to facilitate an objective comparison of the behaviour of different individuals. This method expands the utilisation of data from data storage tags, with the potential for a better understanding of fish behaviour and for enhanced individual-based models to improve ecosystem modelling. We found that cod were closely associated with the seabed for 90% of the time, although they showed distinct seasonal and spatial patterns in behaviour. For example, cod tagged in the southern North Sea exhibited high rates of vertical movement in spring and autumn that were probably associated with migration, while the vertical movements of resident cod in other areas were much less extensive and were probably related to foraging or spawning behaviours. The full reasons underlying this spatial and temporal behavioural plasticity of cod in the North Sea warrant further investigation.
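The classification idea above can be sketched in a few lines. This is a hypothetical illustration, not the paper's method: the 5 m height-off-bottom threshold, the function names and the example depths are all assumptions.

```python
# Hypothetical sketch: label each depth record as near-seabed ("demersal")
# or off-bottom ("pelagic") by its height above the charted seabed.
# The 5 m threshold and all example values are invented for illustration.

def classify_records(depths, seabed_depths, threshold_m=5.0):
    """Label each tag depth record by its height off the seabed."""
    labels = []
    for depth, seabed in zip(depths, seabed_depths):
        height_off_bottom = seabed - depth
        labels.append("demersal" if height_off_bottom <= threshold_m else "pelagic")
    return labels

def demersal_fraction(labels):
    """Fraction of records spent close to the seabed."""
    return labels.count("demersal") / len(labels)

depths = [48.0, 47.5, 30.0, 46.0]   # tag depth in metres (invented)
seabed = [50.0, 50.0, 50.0, 50.0]   # charted seabed depth in metres (invented)
labels = classify_records(depths, seabed)
print(labels, demersal_fraction(labels))
```

Aggregating such labels per individual and per season would support the kind of objective between-individual comparison the abstract describes.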

Relevance:

60.00%

Publisher:

Abstract:

Multidimensional WSNs are deployed in complex environments to sense and collect data relating to multiple attributes (multidimensional data). Such networks present unique challenges to data dissemination, data storage and in-network query processing (information discovery). In this paper, we investigate efficient strategies for information discovery in large-scale multidimensional WSNs and propose the Adaptive MultiDimensional Multi-Resolution Architecture (A-MDMRA), which efficiently combines “push” and “pull” strategies for information discovery and adapts to variations in the frequencies of events and queries in the network to construct optimal routing structures. We present simulation results showing that the optimal routing structure depends on the frequencies of events and queries in the network. A-MDMRA also balances push and pull operations in large-scale networks, enabling significant QoS improvements and energy savings.
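The push-pull trade-off that A-MDMRA adapts to can be illustrated with a deliberately simplified cost model; the unit costs and rates below are assumptions, not values from the paper:

```python
# Simplified sketch of the push-pull trade-off in information discovery:
# if events on an attribute occur far more often than queries, proactively
# pushing every event is wasteful, so the attribute is better served by pull
# (resolving queries on demand), and vice versa. The unit costs are assumed.

def choose_strategy(event_rate, query_rate, push_cost=1.0, pull_cost=1.0):
    """Pick the cheaper dissemination mode for one attribute."""
    push_total = event_rate * push_cost   # cost of advertising every event
    pull_total = query_rate * pull_cost   # cost of resolving every query
    return "push" if push_total <= pull_total else "pull"

print(choose_strategy(event_rate=2, query_rate=50))   # rare events: push wins
print(choose_strategy(event_rate=80, query_rate=3))   # rare queries: pull wins
```

An adaptive architecture would re-evaluate this choice as the observed event and query frequencies drift, which is the behaviour the abstract attributes to A-MDMRA.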

Relevance:

60.00%

Publisher:

Abstract:

Multidimensional WSNs are deployed in complex environments to sense and collect data relating to multiple attributes (multi-dimensional data). Such networks present unique challenges to data dissemination, data storage and in-network query processing (information discovery). Recent algorithms proposed for such WSNs aim to achieve better energy efficiency and minimize latency, but they tend to overuse the nodes that lie on the shortest paths to the base station or data aggregation points, creating hotspot nodes and eventually partitioning the network. In this paper, we propose a time-based multi-dimensional, multi-resolution storage approach for range queries that balances energy consumption by spreading the traffic load as uniformly as possible, thus maximising network lifetime. We present simulation results showing that the proposed approach to information discovery offers significant improvements in information discovery latency compared with current approaches. In addition, the results show that the Quality of Service (QoS) improvements reduce hotspots, resulting in significant network-wide energy savings and an increased network lifetime.

Relevance:

60.00%

Publisher:

Abstract:

Autonomous wireless sensor networks (WSNs) consist of sensors that are usually deployed randomly to monitor one or more phenomena. They are attractive for information discovery in large-scale, data-rich environments and can add value to mission-critical applications such as battlefield surveillance and emergency response systems. However, in order to fully exploit these networks for such applications, energy-efficient, load-balanced and scalable solutions for information discovery are essential. Multi-dimensional autonomous WSNs are deployed in complex environments to sense and collect data relating to multiple attributes (multi-dimensional data). Such networks present unique challenges to data dissemination, data storage and in-network information discovery. In this paper, we propose a novel method of information discovery for multi-dimensional autonomous WSNs in which sensors are deployed randomly; it can significantly increase network lifetime and minimize query processing latency, resulting in quality of service (QoS) improvements of immense benefit to mission-critical applications. We present simulation results showing that the proposed approach to information discovery offers significant improvements in query resolution latency compared with current approaches.

Relevance:

60.00%

Publisher:

Abstract:

Distributed caching-empowered wireless networks can greatly improve the efficiency of data storage and transmission and thereby the users' quality of experience (QoE). However, how this technology can alleviate network access pressure while ensuring consistent content delivery is still an open question, especially when users are in fast motion. Therefore, in this paper, we investigate the caching issue emerging from a forthcoming scenario in which vehicular video streaming is performed under cellular networks. Specifically, a QoE-centric distributed caching approach is proposed to fulfil as many users' requests as possible, given the limited caching space of base stations and a basic guarantee of user experience. First, a QoE evaluation model is established using verified empirical data, and the mathematical relationship between the streaming bit rate and the actual storage space required is derived. Then, distributed caching management for vehicular video streaming is formulated as a constrained optimization problem and solved with the generalized reduced gradient method. Simulation results indicate that our approach can improve the users' satisfaction ratio by up to 40%.
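As a rough illustration of the cache-placement decision at a base station, the sketch below uses a greedy value-per-byte heuristic in place of the paper's generalized reduced gradient solution; the video names, sizes and request counts are invented:

```python
# Toy sketch of base-station cache placement: with limited cache space, pick
# video variants so that as many expected requests as possible are served
# locally. A greedy requests-per-GB heuristic stands in for the constrained
# optimisation described in the paper; all numbers are illustrative.

def greedy_cache(videos, capacity_gb):
    """videos: list of (name, size_gb, expected_requests). Return cached names."""
    chosen, used = [], 0.0
    # Favour content that serves the most requests per gigabyte of cache.
    for name, size, reqs in sorted(videos, key=lambda v: v[2] / v[1], reverse=True):
        if used + size <= capacity_gb:
            chosen.append(name)
            used += size
    return chosen

videos = [("clip_a", 2.0, 120), ("clip_b", 4.0, 100), ("clip_c", 1.0, 90)]
print(greedy_cache(videos, capacity_gb=5.0))
```

A greedy heuristic like this ignores the bit-rate/QoE coupling the paper models, which is why a proper constrained-optimisation solver is needed there.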

Relevance:

60.00%

Publisher:

Abstract:

The proliferation of cloud computing allows users to flexibly store, re-compute or transfer large generated datasets with multiple cloud service providers. However, under the pay-as-you-go model, the total cost of using cloud services depends on the consumption of storage, computation and bandwidth resources, the three key cost factors for IaaS-based cloud resources. To reduce the total cost of keeping data, given cloud service providers with different pricing models for their resources, users can flexibly choose a cloud service on which to store a generated dataset, or delete it and choose a cloud service on which to regenerate it whenever it is reused. However, finding the minimum cost is a complicated and hitherto unsolved problem. In this paper, we propose a novel algorithm that calculates the minimum cost of storing and regenerating datasets in clouds, i.e. whether datasets should be stored or deleted, and furthermore where to store or regenerate them whenever they are reused. This minimum cost also achieves the best trade-off among computation, storage and bandwidth costs in multiple clouds. Comprehensive analysis and rigorous theorems guarantee the theoretical soundness of the paper, and general (random) simulations conducted with popular cloud service providers' pricing models demonstrate the excellent performance of our approach.
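The store-versus-regenerate decision for a single dataset can be sketched under a simplified cost model. The provider names, prices and usage figures below are hypothetical, and the paper's actual algorithm also accounts for dependencies between datasets, which this sketch ignores:

```python
# Simplified sketch of the store-vs-regenerate trade-off for ONE dataset:
# for each provider, compare the monthly storage cost against the expected
# monthly cost of regenerating the dataset on demand (compute + transfer).
# Provider names and prices are invented for illustration.

def best_option(providers, size_gb, compute_hours, reuses_per_month, transfer_gb):
    """Return (provider, mode, monthly_cost) minimising cost for one dataset."""
    best = None
    for name, p in providers.items():
        store = size_gb * p["storage_gb_month"]
        regen = reuses_per_month * (compute_hours * p["compute_hour"]
                                    + transfer_gb * p["bandwidth_gb"])
        for mode, cost in (("store", store), ("regenerate", regen)):
            if best is None or cost < best[2]:
                best = (name, mode, cost)
    return best

providers = {  # hypothetical pricing models
    "cloud_a": {"storage_gb_month": 0.023, "compute_hour": 0.10, "bandwidth_gb": 0.09},
    "cloud_b": {"storage_gb_month": 0.010, "compute_hour": 0.25, "bandwidth_gb": 0.05},
}
# A rarely reused, cheap-to-recompute dataset favours deletion and regeneration.
print(best_option(providers, size_gb=500, compute_hours=4,
                  reuses_per_month=1, transfer_gb=0))
```

With many interdependent datasets, regenerating one may require regenerating its predecessors first, which is what makes the minimum-cost problem the paper tackles genuinely hard.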

Relevance:

60.00%

Publisher:

Abstract:

Cloud computing systems and services have become major targets for cyberattackers. To provide strong protection of cloud platforms, infrastructure, hosted applications, and data stored in the cloud, we need to address the security issue from a range of perspectives: from secure data and application outsourcing, to anonymous communication, to secure multiparty computation. This special issue on cloud security aims to address the importance of protecting and securing cloud platforms, infrastructures, hosted applications, and data storage.

Relevance:

40.00%

Publisher:

Abstract:

The main problem in data grids is how to provide good and timely access to huge volumes of data, given the limited number and size of storage devices and the high latency of the interconnection network. One approach to this problem is to cache files locally so that remote access overheads are avoided. Caching requires a cache-replacement algorithm, which is the focus of this paper. Specifically, we propose a new replacement policy and compare it with an existing policy using simulations. The simulation results show that the proposed policy performs better than the baseline policy.
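Since the proposed policy is not detailed in the abstract, the sketch below simulates the classic LRU replacement policy on a small file-request trace and reports the hit ratio, the kind of measurement such a simulation-based comparison relies on; the trace and cache size are invented:

```python
# Stand-in sketch: simulate the classic LRU cache-replacement policy on a
# file-request trace and measure the hit ratio. This is NOT the paper's
# proposed policy, just the style of baseline such studies compare against.

from collections import OrderedDict

def simulate_lru(trace, capacity):
    """Return the hit ratio of an LRU cache holding `capacity` files."""
    cache, hits = OrderedDict(), 0
    for f in trace:
        if f in cache:
            hits += 1
            cache.move_to_end(f)           # mark as most recently used
        else:
            cache[f] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(trace)

trace = ["a", "b", "a", "c", "a", "b", "d", "a"]  # invented request trace
print(simulate_lru(trace, capacity=2))
```

Running a candidate policy and a baseline over the same traces and comparing hit ratios (or byte hit ratios, when file sizes differ) is the standard evaluation method the abstract alludes to.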

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we propose a two-factor data security protection mechanism with factor revocability for cloud storage systems. Our system allows a sender to send an encrypted message to a receiver through a cloud storage server. The sender only needs to know the identity of the receiver, and no other information (such as the receiver's public key or certificate). To decrypt the ciphertext, the receiver must possess two things: a secret key stored on the computer, and a unique personal security device connected to the computer. It is impossible to decrypt the ciphertext without both. More importantly, once the security device is stolen or lost, it can be revoked so that it can no longer be used to decrypt any ciphertext: the cloud server immediately executes algorithms that transform the existing ciphertext to be un-decryptable by that device. This process is completely transparent to the sender. Furthermore, the cloud server cannot decrypt any ciphertext at any time. Security and efficiency analyses show that our system is not only secure but also practical.
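The "need both pieces" property can be illustrated with a toy XOR secret-sharing split. The paper's actual construction is identity-based and supports the transparent revocation described above, which this sketch does not capture:

```python
# Toy illustration of the two-factor property: the decryption key is split so
# that the secret stored on the computer and the share on the personal
# security device are BOTH required; either share alone reveals nothing.
# This XOR split is a stand-in, not the paper's identity-based construction.

import os

def split_key(key: bytes):
    """Split `key` into a computer share and a device share."""
    device_share = os.urandom(len(key))  # uniformly random mask
    computer_share = bytes(a ^ b for a, b in zip(key, device_share))
    return computer_share, device_share

def combine(computer_share: bytes, device_share: bytes) -> bytes:
    """Recombine the two shares into the original key."""
    return bytes(a ^ b for a, b in zip(computer_share, device_share))

key = os.urandom(16)
comp, dev = split_key(key)
print(combine(comp, dev) == key)  # only both shares together recover the key
```

Because each share is uniformly random on its own, losing the device leaks nothing by itself; the paper goes further by letting the server re-encrypt stored ciphertexts so a stolen device becomes useless even alongside the computer's secret.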