47 results for Packet switching (Data transmission)

in Deakin Research Online - Australia


Relevance: 100.00%

Abstract:

People with special medical monitoring needs can, these days, be sent home and remotely monitored through the use of data logging medical sensors and a transmission base-station. While this can improve quality of life by allowing the patient to spend most of their time at home, most current technologies rely on hardwired landline technology or expensive mobile data transmissions to transmit data to a medical facility. The aim of this paper is to investigate and develop an approach to increase the freedom of a monitored patient and decrease costs by utilising mobile technologies and SMS messaging to transmit data from patient to medico. To this end, we evaluated the capabilities of SMS and propose a generic communications protocol which can work within the constraints of the SMS format, but provide the necessary redundancy and robustness to be used for the transmission of non-critical medical telemetry from data logging medical sensors.
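As a rough illustration of the kind of protocol sketched above, the snippet below packs a handful of sensor readings, a device identifier, and a sequence number into a single 160-character SMS and protects it with a CRC32 checksum so that corrupted messages can be detected and retransmitted. The field layout, identifiers, and checksum choice are illustrative assumptions, not the protocol proposed in the paper.

```python
# Minimal sketch of packing non-critical telemetry into a 160-character SMS payload.
# The field layout (device ID, sequence number, readings, CRC) is illustrative only.
import binascii

MAX_SMS_CHARS = 160

def pack_sms(device_id: str, seq: int, readings: list[tuple[str, float]]) -> str:
    """Encode readings as 'HR=72.0;TEMP=36.8' style pairs with a CRC32 suffix."""
    body = f"{device_id},{seq:04d}," + ";".join(f"{k}={v:.1f}" for k, v in readings)
    crc = binascii.crc32(body.encode("ascii")) & 0xFFFFFFFF
    msg = f"{body},{crc:08X}"
    if len(msg) > MAX_SMS_CHARS:
        raise ValueError("payload exceeds a single SMS")
    return msg

def unpack_sms(msg: str) -> tuple[str, int, dict[str, float]]:
    """Verify the CRC and recover the readings; raise if the message is corrupted."""
    body, crc_hex = msg.rsplit(",", 1)
    if binascii.crc32(body.encode("ascii")) & 0xFFFFFFFF != int(crc_hex, 16):
        raise ValueError("checksum mismatch - request retransmission")
    device_id, seq, payload = body.split(",", 2)
    readings = {k: float(v) for k, v in (p.split("=") for p in payload.split(";"))}
    return device_id, int(seq), readings

msg = pack_sms("PT042", 17, [("HR", 72.0), ("TEMP", 36.8)])
print(msg, unpack_sms(msg))
```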

Relevance: 100.00%

Abstract:

Background: Remote telemonitoring holds great potential to augment management of patients with coronary heart disease (CHD) and atrial fibrillation (AF) by enabling regular physiological monitoring during physical activity. Remote physiological monitoring may improve home and community exercise-based cardiac rehabilitation (exCR) programs and could improve assessment of the impact and management of pharmacological interventions for heart rate control in individuals with AF.

Objective: Our aim was to evaluate the measurement validity and data transmission reliability of a remote telemonitoring system comprising a wireless multi-parameter physiological sensor, custom mobile app, and middleware platform, among individuals in sinus rhythm and AF.

Methods: Participants in sinus rhythm and with AF undertook simulated daily activities and low, moderate, and/or high intensity exercise. Heart rate and respiratory rate from the remote monitoring system were compared to reference measures (12-lead ECG and indirect calorimeter). Wireless data transmission loss was calculated between the sensor, the mobile app, and the remote Internet server.

Results: Median heart rate (-0.30 to 1.10 b∙min⁻¹) and respiratory rate (-1.25 to 0.39 br∙min⁻¹) measurement biases were small, yet statistically significant (all P≤.003) due to the large number of observations. Measurement reliability was generally excellent (rho=.87-.97, all P<.001; intraclass correlation coefficient [ICC]=.94-.98, all P<.001; coefficient of variation [CV]=2.24-7.94%), although respiratory rate measurement reliability was poor among AF participants (rho=.43, P<.001; ICC=.55, P<.001; CV=16.61%). Data loss was minimal (<5%) when all system components were active; however, instability of the network hosting the remote data capture server resulted in data loss at the remote Internet server during some trials.

Conclusions: System validity was sufficient for remote monitoring of heart and respiratory rates across a range of exercise intensities. Remote exercise monitoring has potential to augment current exCR and heart rate control management approaches by enabling the provision of individually tailored care to individuals outside traditional clinical environments.
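The following snippet illustrates, with made-up readings, how agreement statistics and a data-loss figure of the kind quoted above can be computed; the coefficient-of-variation definition used here (SD of the differences over the reference mean) is one common choice and may differ from the study's exact formulation.

```python
# Illustrative arithmetic for agreement and data-loss figures; all values are hypothetical.
import numpy as np

sensor_hr = np.array([78.0, 92.5, 110.2, 135.8, 150.1])     # remote system (hypothetical)
reference_hr = np.array([78.4, 91.9, 110.6, 135.2, 149.5])  # 12-lead ECG (hypothetical)

diff = sensor_hr - reference_hr
bias = np.median(diff)                                       # median measurement bias
cv = 100 * diff.std(ddof=1) / reference_hr.mean()            # coefficient of variation (%)

packets_sent, packets_received = 1200, 1187                  # hypothetical transmission counts
loss_pct = 100 * (packets_sent - packets_received) / packets_sent

print(f"bias={bias:+.2f} b/min, CV={cv:.2f}%, data loss={loss_pct:.2f}%")
```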

Relevance: 100.00%

Abstract:

When wearable and personal health devices and sensors capture data such as heart rate and body temperature for fitness tracking and health services, they simply transfer the data without filtering or optimising. This can overload the sensors and rapidly drain their batteries when they interact with Internet of Things (IoT) networks, which are expected to grow and demand more health data from device wearers. To solve this problem, this paper proposes to infer sensed data in order to reduce the data volume, which in turn reduces the bandwidth and battery power consumption that are essential requirements of sensor devices. This is achieved by applying beacon data points after inferencing during data processing using variance rates, which compare each sensed value with the adjacent data before and after it. Experiments verify that this novel approach can reduce data volume by up to 99.5% while maintaining 98.62% accuracy. Whereas most existing work focuses on sensor network improvements such as routing, operation, and data-reading algorithms, we efficiently reduce data volume to lower bandwidth and battery power consumption while maintaining accuracy by implementing intelligence and optimisation in the sensor devices themselves.
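A minimal sketch of the idea described above: periodic beacon data points are always transmitted, while in-between readings are dropped whenever their variance rate relative to the adjacent samples falls below a threshold. The beacon interval, threshold, and example stream are hypothetical, not the paper's exact algorithm.

```python
# Keep periodic beacons; drop samples whose deviation from their neighbours is small.

def reduce_stream(samples, beacon_every=10, variance_rate_threshold=0.02):
    """Return the indices of samples that would actually be transmitted."""
    kept = []
    for i, value in enumerate(samples):
        if i % beacon_every == 0 or i == len(samples) - 1:
            kept.append(i)                      # beacon points are always kept
            continue
        prev_val, next_val = samples[i - 1], samples[i + 1]
        # variance rate: relative deviation of this sample from its neighbours
        local_mean = (prev_val + next_val) / 2.0
        rate = abs(value - local_mean) / max(abs(local_mean), 1e-9)
        if rate > variance_rate_threshold:
            kept.append(i)                      # keep only samples that deviate enough
    return kept

heart_rate = [72, 72, 73, 72, 72, 90, 91, 72, 72, 72, 72, 73]
kept = reduce_stream(heart_rate, beacon_every=5)
print(f"kept {len(kept)}/{len(heart_rate)} samples at indices {kept}")
```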

Relevance: 100.00%

Abstract:

In many network applications, traffic is bursty in nature. The transient response of a network to such traffic is often the result of a series of interdependent events whose occurrence is not trivial to predict. Previous efforts for IEEE 802.15.4 networks typically followed top-down approaches to model those sequences of events: by building top-level views of the whole network, they tried to track the network's transient response to burst packet arrivals. The problem with such approaches is that they cannot give station-level views of the network response and are usually complex. In this paper, we propose a non-stationary analytical model for the IEEE 802.15.4 slotted CSMA/CA medium access control (MAC) protocol under a burst traffic arrival assumption and without the optional acknowledgements. We develop a station-level stochastic time-domain method from which network-level metrics are extracted. Our bottom-up approach makes it possible to obtain station-level details such as delay, collision, and failure distributions. Moreover, network-level metrics such as the average packet loss or transmission success rate can be extracted from the model. Compared to previous models, our model is proven to have lower memory and computational complexity and also supports contention window sizes greater than one. We have carried out extensive comparative simulations to show the high accuracy of our model.
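The analytical model itself is not reproduced here; the sketch below is only an illustrative station-level, slot-by-slot simulation of slotted CSMA/CA behaviour under a burst arrival (a single simplified clear-channel assessment, binary exponential backoff, no acknowledgements, and no retry after collision), of the kind from which per-station delay, collision, and failure outcomes can be read off. The backoff constants and frame length are placeholders.

```python
import random

MAC_MIN_BE, MAC_MAX_BE, MAX_BACKOFFS = 3, 5, 4

def contend(num_stations, rng):
    """Simulate one burst arrival: every station has one packet queued at slot 0.
    Returns per-station outcome: ('success' | 'collision' | 'failure', slot)."""
    state = [{"be": MAC_MIN_BE, "nb": 0,
              "backoff": rng.randrange(2 ** MAC_MIN_BE)} for _ in range(num_stations)]
    outcome, slot, busy_until = [None] * num_stations, 0, -1
    while any(o is None for o in outcome):
        # Stations whose backoff has expired sense the channel; if it is idle they transmit.
        ready = [i for i, s in enumerate(state)
                 if outcome[i] is None and s["backoff"] == 0 and slot > busy_until]
        if len(ready) == 1:                            # sole transmitter: success
            outcome[ready[0]] = ("success", slot)
            busy_until = slot + 6                      # hypothetical frame length in slots
        elif len(ready) > 1:                           # simultaneous access: collision
            for i in ready:
                outcome[i] = ("collision", slot)
            busy_until = slot + 6
        for i, s in enumerate(state):                  # everyone else counts down or defers
            if outcome[i] is not None:
                continue
            if s["backoff"] > 0:
                s["backoff"] -= 1
            else:                                      # backoff expired but channel was busy
                s["nb"] += 1
                if s["nb"] > MAX_BACKOFFS:
                    outcome[i] = ("failure", slot)     # channel-access failure
                else:
                    s["be"] = min(s["be"] + 1, MAC_MAX_BE)
                    s["backoff"] = rng.randrange(2 ** s["be"])
        slot += 1
    return outcome

print(contend(num_stations=5, rng=random.Random(1)))
```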

Relevance: 100.00%

Abstract:

Data replication is one of the key components in data grid architecture as it enhances data access and reliability and minimises the cost of data transmission. In this paper, we address the problem of reducing the overheads of the replication mechanisms that drive the data management components of a data grid. We propose an approach that extends the resource broker with policies that factor in user quality of service as well as service costs when replicating and transferring data. A realistic model of the data grid was created to simulate and explore the performance of the proposed policy. The policy proved to be an effective means of improving grid network traffic performance, as indicated by improvements in the speed and cost of transfers made by brokers.
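As a toy example of a broker policy that weighs user quality of service against service cost, the sketch below scores candidate replica sites on estimated transfer time and monetary cost under a user-chosen weighting; the site figures, units, and weighting scheme are placeholders rather than the policy evaluated in the paper.

```python
from dataclasses import dataclass

@dataclass
class ReplicaSite:
    name: str
    bandwidth_mbps: float   # available bandwidth to the requesting site
    cost_per_gb: float      # service cost charged for the transfer

def choose_source(sites, file_size_gb, qos_weight=0.7):
    """qos_weight near 1.0 favours fast transfers; near 0.0 favours cheap ones."""
    def score(site):
        transfer_time_s = (file_size_gb * 8000) / site.bandwidth_mbps  # GB -> megabits
        cost = file_size_gb * site.cost_per_gb
        # Crude scalarisation of time and cost; a real policy would normalise units.
        return qos_weight * transfer_time_s + (1 - qos_weight) * cost
    return min(sites, key=score)

sites = [ReplicaSite("site-A", bandwidth_mbps=100, cost_per_gb=0.09),
         ReplicaSite("site-B", bandwidth_mbps=400, cost_per_gb=0.23),
         ReplicaSite("site-C", bandwidth_mbps=50,  cost_per_gb=0.02)]
print(choose_source(sites, file_size_gb=20, qos_weight=0.7).name)
```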

Relevance: 100.00%

Abstract:

Just-Noticeable-Differences (JND), used as a dead-band in perceptual analysis, has been widely applied for more than a decade. The technique has been employed by several researchers for data reduction in haptic data transmission systems. In practice, researchers use two different JND coefficients, JND_V and JND_F, for velocity and force data respectively. For position data, they usually rely on the resolution of the haptic display device to omit data that are imperceptible to humans. In this paper, pruning undesirable position data produced by vibration of the device or the subject and/or by noise in the transmission line is addressed. It is shown that using an inverse JND_V for position data can prune such undesirable position data. The results of the proposed method are compared with several well-known filters and with methods proposed by other researchers. It is shown that the combination of JND_V provides lower error with desirable curve smoothness and with as little computational effort and complexity as possible. It is also shown that this method reduces considerably more data than using a forward JND_V.
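A minimal sketch of JND dead-band reduction on a haptic stream is given below: a sample is transmitted only when it differs from the last transmitted value by more than a perceptual threshold. The 10% threshold and the sample values are illustrative, and the paper's inverse-JND_V rule for position data is not reproduced here.

```python
# Dead-band filtering: forward only perceptually significant changes.

def deadband_filter(samples, jnd=0.10):
    """Keep a sample only if it changed by more than jnd (relative) since the last kept one."""
    kept, last = [], None
    for i, value in enumerate(samples):
        if last is None or abs(value - last) > jnd * max(abs(last), 1e-9):
            kept.append((i, value))
            last = value
    return kept

velocity = [0.50, 0.51, 0.52, 0.70, 0.71, 0.40, 0.41, 0.42]
print(deadband_filter(velocity, jnd=0.10))   # only changes above the JND survive
```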

Relevance: 100.00%

Abstract:

This paper studies a new type of network model formed by a dynamic autonomy area, structured source servers, and proxy servers. The model accommodates the dynamics within the autonomy area: each node undertakes different tasks according to its abilities, ensuring that every node carries a load suited to its capacity. Because nodes do not need to exchange information via central servers, data transmission and routing searches can be carried out efficiently. To handle the highly dynamic nature of the autonomy area, we established dynamic tree-structure proliferation routing and resource-search algorithms and simulated them. Test results show that the performance of the proposed network model and algorithms is very stable.

Relevance: 100.00%

Abstract:

Multicore network processors play an increasingly important role in computational processes that emphasise scalability and parallelism in distributed environments, especially in Internet-based delay-sensitive applications. However, efficiently scheduling tasks on multicore, multithreaded network processors to improve system throughput as much as possible remains an important but unsolved issue. Profiling can gather runtime environment information and guide the compiler to optimise programs by scheduling tasks based on the runtime context. This paper proposes a profiling-based task scheduling approach that targets improving the throughput of multicore network processor (Intel IXP) systems in a balanced pipeline fashion. In this work, we investigate a profiling-based task scheduling framework, a task scheduling algorithm, and a set of performance models. Our task allocation scheme maps tasks onto the pipeline architecture and the multiple threads of network processors in parallel, incorporating the profiling context and global thread refinement. We evaluate our task scheduling algorithm by implementing representative network applications on the Intel IXP network processor. Experimental results demonstrate that our algorithm is able to schedule tasks in a balanced pipeline fashion and achieve high throughput and data transmission rates. Copyright © 2012 John Wiley & Sons, Ltd.
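As an illustration of balanced pipeline mapping, the sketch below assigns tasks with profiled per-packet costs to pipeline stages so that the most heavily loaded stage, which bounds pipeline throughput, stays as light as possible. The greedy longest-processing-time heuristic and the cycle counts are placeholders, not the paper's scheduling algorithm.

```python
import heapq

def balanced_pipeline(profiled_tasks, num_stages):
    """profiled_tasks: {task_name: cost_in_cycles}. Returns per-stage assignments."""
    # Min-heap of (current_load, stage_index, assigned_tasks); heaviest tasks placed first.
    stages = [(0, i, []) for i in range(num_stages)]
    heapq.heapify(stages)
    for task, cost in sorted(profiled_tasks.items(), key=lambda kv: -kv[1]):
        load, idx, assigned = heapq.heappop(stages)
        assigned.append(task)
        heapq.heappush(stages, (load + cost, idx, assigned))
    stages.sort(key=lambda s: s[1])
    bottleneck = max(load for load, _, _ in stages)
    return stages, bottleneck

profile = {"rx": 120, "classify": 340, "lookup": 280, "meter": 90, "encap": 210, "tx": 150}
stages, bottleneck = balanced_pipeline(profile, num_stages=3)
for load, idx, tasks in stages:
    print(f"stage {idx}: {tasks} ({load} cycles)")
print("throughput is bounded by the heaviest stage:", bottleneck, "cycles/packet")
```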

Relevance: 100.00%

Abstract:

Seismic data gathered from a Hydrocarbon Exploration and Discovery Operation is essential to identify possible hydrocarbon existence in a geologically surveyed area. However, the discovery operation takes a long time to complete and computational processing of the acquired data is often delayed. Hydrocarbon exploration may end up needlessly covering an area without any hydrocarbon traces due to the lack of immediate feedback from geophysical experts. This feedback can only be given once the acquired seismic data has been computationally processed, analysed, and interpreted. In response, we propose a comprehensive model to facilitate Hydrocarbon Exploration and Discovery Operations using encryption, decryption, satellite transmission, and clouds. The model details the logical design of Seismic Data Processing (SDP) that exploits clouds and the ability of geophysical experts to provide on-line decisions on how to progress the hydrocarbon exploration operation at a remote location. An initial feasibility assessment was carried out to support our model. The SDP, data encryption, and decryption for the assessment were carried out on a private cloud. The assessment shows that the overall process of hydrocarbon exploration, from data acquisition and satellite data transmission through to SDP, could be executed in a short time and at low cost.
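A minimal sketch of the encrypt-before-transmit step in such a model is shown below: a block of acquired seismic data is encrypted with AES-GCM prior to satellite transmission and decrypted at the cloud before SDP. The choice of the Python cryptography package, the key handling, and the placeholder data block are assumptions for illustration only.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)     # shared between field station and cloud
aesgcm = AESGCM(key)

seismic_block = os.urandom(4096)              # placeholder for a shot-gather data block
nonce = os.urandom(12)                        # unique per transmitted block
ciphertext = aesgcm.encrypt(nonce, seismic_block, b"survey-line-07")  # AAD binds metadata

# ... satellite transmission of (nonce, ciphertext, metadata) ...

recovered = aesgcm.decrypt(nonce, ciphertext, b"survey-line-07")
assert recovered == seismic_block             # cloud-side SDP works on the recovered block
```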

Relevance: 100.00%

Abstract:

In this paper, we address the problem of blind separation of spatially correlated signals, which is encountered in some emerging applications, e.g., distributed wireless sensor networks and wireless surveillance systems. We preprocess the source signals in transmitters prior to transmission. Specifically, the source signals are first filtered by a set of properly designed precoders and then the coded signals are transmitted. On the receiving side, the Z-domain features of the precoders are exploited to separate the coded signals, from which the source signals are recovered. Based on the proposed precoders, a closed-form algorithm is derived to estimate the coded signals and the source signals. Unlike traditional blind source separation approaches, the proposed method does not require the source signals to be uncorrelated, sparse, or nonnegative. Compared with the existing precoder-based approach, the new method uses precoders with much lower order, which reduces the delay in data transmission and is easier to implement in practice.

Relevance: 100.00%

Abstract:

Wireless ad hoc networks, especially in hostile environments, are vulnerable to traffic analysis, which allows an adversary to trace routing messages and sensitive data packets. Anonymity mechanisms in ad hoc networks are a critical security measure employed to mitigate these problems. In this paper, we propose a novel secure and anonymous source routing protocol, called SADSR, based on Dynamic Source Routing (DSR) for wireless ad hoc networks. In the proposed scheme, we use pseudonyms, pseudonym-based cryptography, and Bloom filters to establish secure and anonymous routing in wireless ad hoc networks. Compared to other anonymous routing protocols, SADSR is not only anonymous but also secure in both the route discovery and the data transmission processes.
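The sketch below shows a small Bloom filter of the kind such protocols can use to test compactly, and without revealing a node list, whether a pseudonym has already been inserted into a route request. The sizes, hash construction, and pseudonyms are illustrative and are not the SADSR construction itself.

```python
import hashlib

class BloomFilter:
    def __init__(self, size_bits=1024, num_hashes=4):
        self.size, self.k, self.bits = size_bits, num_hashes, 0

    def _positions(self, item: str):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item: str):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def __contains__(self, item: str):       # may yield false positives, never false negatives
        return all(self.bits >> pos & 1 for pos in self._positions(item))

route = BloomFilter()
for pseudonym in ("n3f8a1", "77bc02", "d41e9c"):   # pseudonyms carried in a route request
    route.add(pseudonym)
print("d41e9c" in route, "aaaaaa" in route)        # True, (almost certainly) False
```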

Relevance: 100.00%

Abstract:

This paper deals with blind separation of spatially correlated signals mixed by an instantaneous system. Taking advantage of the fact that the source signals are accessible in some man-made systems, such as wireless communication systems, we preprocess the source signals in the transmitters with a set of properly designed first-order precoders and then transmit the coded signals. At the receiving side, information about the precoders is utilized to perform signal separation. Compared with existing precoder-based methods, the new method employs only the simplest first-order precoders, which reduces the delay in data transmission and is easier to implement in practical applications.
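An illustrative sketch of the transmitter-side step is given below: each source is passed through its own first-order FIR precoder, y_i[n] = x_i[n] + a_i·x_i[n-1], before being mixed by an unknown instantaneous channel. The coefficients and mixing matrix are made up, and the receiver-side separation algorithm from the paper is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
common = rng.standard_normal(n)
x1 = common + 0.3 * rng.standard_normal(n)      # two spatially correlated sources
x2 = common + 0.3 * rng.standard_normal(n)

a1, a2 = 0.9, -0.6                              # distinct first-order precoder coefficients
y1 = np.convolve(x1, [1.0, a1])[:n]             # y1[n] = x1[n] + a1 * x1[n-1]
y2 = np.convolve(x2, [1.0, a2])[:n]

A = np.array([[1.0, 0.7], [0.4, 1.0]])          # unknown instantaneous mixing channel
mixtures = A @ np.vstack([y1, y2])              # what the receiver actually observes

print("source correlation:", np.corrcoef(x1, x2)[0, 1].round(2))
print("mixtures shape:", mixtures.shape)
```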

Relevance: 100.00%

Abstract:

Cognitive radio improves spectrum efficiency and mitigates spectrum scarcity by allowing cognitive users to opportunistically access idle chunks of the spectrum owned by licensed users. In long-term spectrum leasing markets, secondary network operators must decide how much spectrum is optimal to fulfill their users' data transmission requirements. We study this optimization problem in multiple-channel scenarios. Under constraints on the expected user admission rate and quality of service, we model the secondary network as a dynamic data transportation system in which the spectrum access of both primary and secondary users follows stochastic processes. The main quality-of-service metrics we are concerned with include user admission rate, average transmission delay, and stability of the delay. To quantify the relationship between spectrum provisioning and quality of service, we propose an approximate analytical model and use it to estimate the lower and upper bounds of the optimal amount of spectrum; the distance between the bounds is relatively narrow. In addition, we design a simple algorithm to compute the optimum from the bounds. We conduct numerical simulations on a slotted multiple-channel dynamic spectrum access network model. Simulation results demonstrate the accuracy of the proposed model. Our work sheds light on the design of game- and auction-based dynamic spectrum sharing mechanisms in cognitive radio networks.
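As a toy version of the provisioning decision described above, the sketch below searches for the smallest number of leased channels that keeps the secondary users' average queueing delay under a target, using a crude slotted simulation. All traffic parameters are placeholders, and the paper's approximate analytical model and bound-based algorithm are not reproduced.

```python
import random

def avg_delay(num_channels, arrival_prob=0.6, service_slots=4,
              primary_busy_prob=0.2, slots=20000, seed=1):
    """Average queueing delay (in slots) of secondary packets for a given channel count."""
    rng, queue, busy, total_delay, served = random.Random(seed), [], [], 0, 0
    for t in range(slots):
        if rng.random() < arrival_prob:
            queue.append(t)                               # secondary packet arrives
        busy = [end for end in busy if end > t]           # release finished transmissions
        free = sum(1 for _ in range(num_channels)
                   if rng.random() >= primary_busy_prob) - len(busy)
        while queue and free > 0:                         # admit waiting packets
            total_delay += t - queue.pop(0)
            busy.append(t + service_slots)
            served, free = served + 1, free - 1
    return total_delay / max(served, 1)

target_delay = 3.0                                        # QoS target, in slots
channels = 1
while avg_delay(channels) > target_delay:                 # smallest lease meeting the target
    channels += 1
print("leased channels needed under these assumptions:", channels)
```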

Relevance: 100.00%

Abstract:

With the development of cyber-physical systems (CPS), the security analysis of the data therein becomes more and more important. Recently, owing to its advantage of jointly encrypting and compressing data for transmission in CPS, the emerging compressed sensing (CS)-based cryptosystem has attracted much attention, and its security is of extreme importance. Existing methods only analyze the security of the plaintext under the assumption that the key is absolutely safe. However, for sparse plaintext, prior knowledge of the plaintext's sparsity could be exploited to partly retrieve the key, and then the plaintext, from the ciphertext; the existing methods therefore do not provide a satisfactory security analysis. In this paper, the analysis is conducted within an information-theoretic framework involving the plaintext sparsity feature and the mutual information between the ciphertext, the key, and the plaintext. In addition, the perfect secrecy criteria (Shannon-sense and Wyner-sense) are extended to measure the security. Alongside the achievable security level, the risk of illegal access is also discussed. It is shown that the CS-based cryptosystem achieves the extended Wyner-sense perfect secrecy, but when the key is used repeatedly, both the plaintext and the key could be conditionally accessed.
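A minimal sketch of a CS-based cryptosystem of the kind analysed above: the random sensing matrix Phi acts as the secret key, the ciphertext is y = Phi x, and a receiver holding Phi recovers the sparse plaintext (here with a small orthogonal matching pursuit). The dimensions, sparsity level, and use of OMP are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n, m, k = 256, 96, 8                             # plaintext length, measurements, sparsity

x = np.zeros(n)                                  # sparse plaintext
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # secret key: random sensing matrix
y = Phi @ x                                      # ciphertext transmitted over the network

def omp(Phi, y, k):
    """Recover a k-sparse signal from y = Phi x by orthogonal matching pursuit."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("recovery error for the key holder:", float(np.linalg.norm(x_hat - x)))
```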

Relevance: 100.00%

Abstract:

Smart grid constrained optimal control is a complex issue due to the constant growth of grid complexity and the large volume of data available as input to smart device control. In this context, traditional centralized control paradigms may suffer in terms of the timeliness of optimization results due to the volume of data to be processed and the delayed asynchronous nature of the data transmission. To address these limits of centralized control, this paper presents a coordinated, distributed algorithm based on distributed, local controllers and a central coordinator for exchanging summarized global state information. The proposed model for exchanging global state information is resistant to fluctuations caused by the inherent interdependence between local controllers, and is robust to delays in information exchange. In addition, the algorithm features iterative refinement of local state estimations that is able to improve local controller ability to operate within network constraints. Application of the proposed coordinated, distributed algorithm through simulation shows its effectiveness in optimizing a global goal within a complex distribution system operating under constraints, while ensuring network operation stability under varying levels of information exchange delay, and with a range of network sizes.
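An illustrative sketch of the coordinator and local-controller pattern described above: each local controller optimises its own device against a shared price signal, while the central coordinator exchanges only a summarised global quantity (total output) rather than raw measurements. The quadratic costs and dual-ascent price update are illustrative, not the paper's algorithm.

```python
import numpy as np

cost_a = np.array([0.10, 0.08, 0.12])          # per-controller quadratic cost coefficients
limits = np.array([40.0, 60.0, 50.0])          # local operating constraints (max output)
demand = 100.0                                  # global goal: total output must meet demand

price, step = 0.0, 0.002
for _ in range(500):
    # Each local controller solves min_x a*x^2 - price*x subject to 0 <= x <= limit.
    outputs = np.clip(price / (2 * cost_a), 0.0, limits)
    # The coordinator sees only the summarised state (total output) and updates the price.
    price += step * (demand - outputs.sum())

print("outputs:", outputs.round(2), "total:", round(float(outputs.sum()), 2))
```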