66 results for receiver-based geomulticast protocol


Relevance: 30.00%

Abstract:

The ActiGraph accelerometer is commonly used to measure physical activity in children. Count cut-off points are needed when using accelerometer data to determine the time a person spends in moderate or vigorous physical activity. For the GT3X accelerometer, no cut-off points for young children have been published yet. The aim of the current study was thus to develop and validate count cut-off points for young children. Thirty-two children aged 5 to 9 years performed four locomotor and four play activities. Activity classification into the light-, moderate- or vigorous-intensity category was based on energy expenditure measurements with indirect calorimetry. Vertical axis as well as vector magnitude cut-off points were determined through receiver operating characteristic (ROC) curve analyses with the data of two thirds of the study group and validated with the data of the remaining third. The vertical axis cut-off points were 133 counts per 5 sec for moderate to vigorous physical activity (MVPA), 193 counts for vigorous physical activity (VPA) corresponding to a metabolic threshold of 5 MET, and 233 counts for VPA corresponding to 6 MET. The vector magnitude cut-off points were 246 counts per 5 sec for MVPA, 316 counts for VPA at 5 MET and 381 counts for VPA at 6 MET. When validated, the current cut-off points generally showed high recognition rates for each category, high sensitivity and specificity values, and moderate agreement in terms of the Kappa statistic. These results were similar for vertical axis and vector magnitude cut-off points. The current cut-off points adequately reflect MVPA and VPA in young children. Cut-off points based on vector magnitude counts did not appear to reflect the intensity categories better than cut-off points based on vertical axis counts alone.
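
Illustrative sketch: the cut-off derivation described above boils down to an ROC analysis on labelled activity epochs. The Python sketch below shows one common way to pick such a count threshold (via Youden's J statistic); the data, the package choice and the threshold rule are assumptions for illustration, not the authors' exact procedure.

```python
# Hypothetical sketch of an ROC-based count cut-off, assuming labelled 5 s
# epochs (1 = MVPA per indirect calorimetry). Data are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
counts = np.concatenate([rng.poisson(80, 500), rng.poisson(260, 500)])
is_mvpa = np.concatenate([np.zeros(500), np.ones(500)])

fpr, tpr, thresholds = roc_curve(is_mvpa, counts)
youden_j = tpr - fpr                      # sensitivity + specificity - 1
cutoff = thresholds[np.argmax(youden_j)]  # counts per 5 s maximizing J
print(f"MVPA cut-off: {cutoff:.0f} counts per 5 s")
```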

Relevance: 30.00%

Abstract:

BACKGROUND Driving a car is a complex instrumental activity of daily living, and driving performance is very sensitive to cognitive impairment. The assessment of driving-relevant cognition in older drivers is challenging and requires reliable and valid tests with good sensitivity and specificity to predict safe driving. Driving simulators can be used to test fitness to drive. Several studies have found strong correlations between driving simulator performance and on-the-road driving. However, access to driving simulators is restricted to specialists, and simulators are too expensive, large, and complex to allow easy access for older drivers or the physicians advising them. An easily accessible, Web-based cognitive screening test could offer a solution to this problem. The World Wide Web allows easy dissemination of the test software and implementation of the scoring algorithm on a central server, allowing generation of a dynamically growing database of normative values, and ensures that all users have access to the same up-to-date normative values. OBJECTIVE In this pilot study, we present the novel Web-based Bern Cognitive Screening Test (wBCST) and investigate whether it can predict poor simulated driving performance in healthy and cognitively impaired participants. METHODS The wBCST performance and simulated driving performance were analyzed in 26 healthy younger and 44 healthy older participants as well as in 10 older participants with cognitive impairment. Correlations between the two tests were calculated. Also, simulated driving performance was used to group the participants into good performers (n=70) and poor performers (n=10). A receiver operating characteristic analysis was calculated to determine the sensitivity and specificity of the wBCST in predicting simulated driving performance. RESULTS The mean wBCST score of the participants with poor simulated driving performance was reduced by 52% compared to participants with good simulated driving performance (P<.001). The area under the receiver operating characteristic curve was 0.80, with a 95% confidence interval of 0.68-0.92. CONCLUSIONS When selecting a 75% test score as the cutoff, the novel test has 83% sensitivity, 70% specificity, and 81% efficiency, which are good values for a screening test. Overall, in this pilot study, the novel Web-based computer test appears to be a promising tool for supporting clinicians in fitness-to-drive assessments of older drivers. The Web-based distribution and scoring on a central computer will facilitate further evaluation of the novel test setup. We expect that in the near future, Web-based computer tests will become a valid and reliable tool for clinicians, for example, when assessing fitness to drive in older drivers.
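
A minimal sketch of the three reported screening metrics, computed from a score cutoff against a binary driving-performance label; the data and the "lower score = positive screen" convention are assumptions for illustration.

```python
# Sensitivity, specificity, and efficiency (overall accuracy) of a cutoff.
import numpy as np

def screening_metrics(scores, poor_driver, cutoff):
    predicted_poor = scores < cutoff          # low test score = positive screen
    tp = np.sum(predicted_poor & poor_driver)
    tn = np.sum(~predicted_poor & ~poor_driver)
    fp = np.sum(predicted_poor & ~poor_driver)
    fn = np.sum(~predicted_poor & poor_driver)
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(scores)

scores = np.array([55, 95, 60, 80, 40, 90])               # synthetic test scores
poor = np.array([True, False, True, False, True, False])  # synthetic labels
print(screening_metrics(scores, poor, cutoff=75))         # (sens, spec, efficiency)
```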

Relevance: 30.00%

Abstract:

Internet of Things based systems are anticipated to gain widespread use in industrial applications. Standardization efforts like 6LoWPAN and the Constrained Application Protocol (CoAP) have made the integration of wireless sensor nodes possible using Internet technology and web-like access to data (RESTful service access). While there are still some open issues, the interoperability problem in the lower layers can now be considered solved from an enterprise software vendor's point of view. One possible next step towards integrating real-world objects into enterprise systems, and solving the corresponding interoperability problems at higher levels, is to use semantic web technologies. We introduce an abstraction of real-world objects, called Semantic Physical Business Entities (SPBE), using Linked Data principles. We show that this abstraction fits nicely into enterprise systems, as SPBEs allow a business-object-centric view on real-world objects instead of a pure device-centric view. The interdependencies between how services in an enterprise system are currently used and how this can be done in a semantic real-world-aware enterprise system are outlined, arguing for the need for semantic services and semantic knowledge repositories. We introduce a lightweight query language, which we use to perform a quantitative analysis of our approach to demonstrate its feasibility.
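
To make the SPBE idea concrete, here is a minimal Linked Data sketch of a real-world object modelled as a business entity rather than a device; the namespace, the properties and the rdflib usage are invented for illustration and are not the paper's actual vocabulary or query language.

```python
# A pallet as a Semantic Physical Business Entity linking to its sensor.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/spbe/")   # hypothetical vocabulary
g = Graph()
pallet = EX["pallet-42"]
g.add((pallet, RDF.type, EX.SemanticPhysicalBusinessEntity))
g.add((pallet, EX.observedBy, EX["sensor-node-7"]))   # link to the device view
g.add((pallet, EX.lastTemperature, Literal(4.2)))     # observed state
print(g.serialize(format="turtle"))
```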

Relevance: 30.00%

Abstract:

Opportunistic routing (OR) takes advantage of the broadcast nature and spatial diversity of wireless transmission to improve the performance of wireless ad-hoc networks. Instead of using a predetermined path to send packets, OR postpones the choice of the next hop to the receiver side and lets the multiple receivers of a packet coordinate and decide which one will be the forwarder. Existing OR protocols choose the next-hop forwarder based on a predefined candidate list, which is calculated using a single network metric. In this paper, we propose TLG, a Topology- and Link-quality-aware Geographical opportunistic routing protocol. TLG uses multiple network metrics, such as network topology, link quality, and geographic location, to implement the coordination mechanism of OR. We compare TLG with well-known existing solutions, and simulation results show that TLG outperforms them in terms of both QoS and QoE metrics.
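
The kind of multi-metric candidate ranking that TLG's coordination relies on can be sketched as a per-node score; the weights, the normalizations and the metric names below are illustrative assumptions, not the formula from the paper.

```python
# Each receiver of the overheard packet scores itself; the highest-scoring
# candidate elects itself as forwarder.
import math

def forwarder_score(link_quality, node_pos, dst_pos, prev_dist, hops_to_dst,
                    w_lq=0.4, w_geo=0.4, w_topo=0.2):
    dist = math.dist(node_pos, dst_pos)
    geo_progress = max(0.0, (prev_dist - dist) / prev_dist)  # advance toward dst
    topo = 1.0 / (1 + hops_to_dst)                           # fewer hops is better
    return w_lq * link_quality + w_geo * geo_progress + w_topo * topo

candidates = {"A": forwarder_score(0.9, (3, 4), (10, 0), 12.0, 3),
              "B": forwarder_score(0.6, (6, 1), (10, 0), 12.0, 2)}
print(max(candidates, key=candidates.get))   # "B": better progress and topology
```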

Relevance: 30.00%

Abstract:

Various applications for the purposes of event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such a heterogeneous wireless sensor network during its lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes have to be applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Using multicast communication in wireless sensor networks is an efficient way to handle such traffic patterns. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close the gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receiver nodes. In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. In SNOMC, three different caching strategies are integrated for efficient handling of the necessary retransmissions, namely caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to pro-actively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks.
A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and offloads functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node. Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the performance efficiency of the management tasks. To our knowledge, our approach is the first one which offers a combination of a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
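
The NACK-based reliability mechanism at the heart of SNOMC can be sketched as a receiver that tracks the fragments of a bulk transfer and requests only what is missing; the fragment IDs and message shapes here are illustrative, not SNOMC's wire format.

```python
# Receiver-side bookkeeping for NACK-based end-to-end reliability.
class NackReceiver:
    def __init__(self, total_fragments):
        self.expected = set(range(total_fragments))
        self.received = set()

    def on_fragment(self, frag_id):
        self.received.add(frag_id)

    def missing(self):
        """Fragment IDs to place in the next NACK message."""
        return sorted(self.expected - self.received)

    def complete(self):
        # Once complete, the receiver sends a data acknowledgement.
        return not self.missing()

rx = NackReceiver(total_fragments=8)
for frag in (0, 1, 2, 4, 5, 7):   # fragments 3 and 6 were lost
    rx.on_fragment(frag)
print(rx.missing())               # [3, 6] -> retransmitted from a caching node
```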

Relevance: 30.00%

Abstract:

We study state-based video communication where a client simultaneously informs the server about the presence status of various packets in its buffer. In sender-driven transmission, the client periodically sends to the server a single acknowledgement packet that provides information about all packets that have arrived at the client by the time the acknowledgement is sent. In receiver-driven streaming, the client periodically sends to the server a single request packet that comprises a transmission schedule for sending missing data to the client over a horizon of time. We develop a comprehensive optimization framework that enables computing packet transmission decisions that maximize the end-to-end video quality for the given bandwidth resources in both scenarios. The core step of the optimization comprises computing the probability that a single packet will be communicated in error as a function of the expected transmission redundancy (or cost) used to communicate the packet. Through comprehensive simulation experiments, we carefully examine the performance advances that our framework enables relative to state-of-the-art scheduling systems that employ regular acknowledgement or request packets. Consistent gains in video quality of up to 2 dB are demonstrated across a variety of content types. We show that there is a direct analogy between the error-cost efficiency of streaming a single packet and the overall rate-distortion performance of streaming the whole content. In the case of sender-driven transmission, we develop an effective modeling approach that accurately characterizes the end-to-end performance as a function of the packet loss rate on the backward channel and the source encoding characteristics.
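
The core error-versus-cost step can be illustrated under a strongly simplifying assumption: if each of r transmissions of a packet is lost independently with probability p, the residual error probability is p**r. This is only a toy stand-in for the framework's expected-redundancy computation.

```python
# Residual packet error as a function of transmission redundancy, assuming
# independent losses with probability p per transmission (a simplification).
def residual_error(p_loss: float, redundancy: int) -> float:
    return p_loss ** redundancy

for r in range(1, 5):
    print(f"redundancy {r}: error {residual_error(0.1, r):.4f}")
```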

Relevance: 30.00%

Abstract:

Background Heart valve diseases are common, with an estimated prevalence of 2.5% in the Western world. The number is rising due to an ageing population. Once symptomatic, heart valve diseases are potentially lethal and heavily influence daily living and quality of life. Surgical treatment, either valve replacement or repair, remains the treatment of choice. However, after surgery, the transition to daily living may become a physical, mental and social challenge. We hypothesise that a comprehensive cardiac rehabilitation programme can improve physical capacity and self-assessed mental health and reduce hospitalisation and healthcare costs after heart valve surgery. Methods A randomised clinical trial, CopenHeartVR, aims to investigate whether cardiac rehabilitation in addition to usual care is superior to treatment as usual after heart valve surgery. The trial will randomly allocate 210 patients 1:1 to the intervention or control group, using central randomisation, and blinded outcome assessment and statistical analyses. The intervention consists of 12 weeks of physical exercise and a psycho-educational intervention comprising five consultations. The primary outcome is peak oxygen uptake (VO2 peak) measured by cardiopulmonary exercise testing with ventilatory gas analysis. The secondary outcome is self-assessed mental health measured by the standardised questionnaire Short Form 36. Also, long-term healthcare utilisation and mortality as well as biochemistry, echocardiography and cost-benefit will be assessed. A mixed-methods design is used to evaluate qualitative and quantitative findings, encompassing a survey-based study before the trial and a qualitative pre- and post-intervention study. Discussion The study is approved by the local regional Research Ethics Committee (H-1-2011-157) and the Danish Data Protection Agency (j.nr. 2007-58-0015).

Relevance: 30.00%

Abstract:

Dendrogeomorphology uses information sources recorded in the roots, trunks and branches of trees and bushes located in the fluvial system to complement (or sometimes even replace) systematic and palaeohydrological records of past floods. The application of dendrogeomorphic data sources and methods to palaeoflood analysis over nearly 40 years has allowed improvements to be made in frequency and magnitude estimations of past floods. Nevertheless, research carried out so far has shown that the dendrogeomorphic indicators traditionally used (mainly scar evidence), and their use to infer frequency and magnitude, have been restricted to a small, limited set of applications. New possibilities with enormous potential remain unexplored. Future research on palaeoflood frequency and magnitude using dendrogeomorphic data sources should: (1) test the application of isotopic indicators (16O/18O ratio) to discover the meteorological origin of past floods; (2) use different dendrogeomorphic indicators to estimate peak flows with 2D (and 3D) hydraulic models and study how they relate to other palaeostage indicators; (3) investigate improved calibration of 2D hydraulic model parameters (roughness); and (4) apply statistics-based cost–benefit analysis to select optimal mitigation measures. This paper presents an overview of these innovative methodologies, with a focus on their capabilities and limitations in the reconstruction of recent floods and palaeofloods.

Relevance: 30.00%

Abstract:

BACKGROUND We describe the long-term outcome after clinical introduction and dose escalation of somatostatin receptor targeted therapy with [90Y-DOTA]-TOC in patients with metastasized neuroendocrine tumors. METHODS In a clinical phase I dose escalation study we treated patients with increasing [90Y-DOTA]-TOC activities. Multivariable Cox regression and competing risk regression were used to compare the efficacy and toxicities of the different dosage protocols. RESULTS Overall, 359 patients were recruited; 60 patients were enrolled for low-dose (median: 2.4 GBq/cycle, range: 0.9-7.8 GBq/cycle), 77 patients for intermediate-dose (median: 3.3 GBq/cycle, range: 2.0-7.4 GBq/cycle) and 222 patients for high-dose (median: 6.7 GBq/cycle, range: 3.7-8.1 GBq/cycle) [90Y-DOTA]-TOC treatment. The incidences of grade 1-4 hematotoxicities were 65.0%, 64.9% and 74.8%; the incidences of grade 4/5 kidney toxicities were 8.4%, 6.5% and 14.0%; and the median survival was 39 (range: 1-158) months, 34 (range: 1-118) months and 29 (range: 1-113) months, respectively. The high-dose protocol was associated with an increased risk of kidney toxicity (hazard ratio: 3.12 (1.13-8.59) vs. intermediate dose, p = 0.03) and a shorter overall survival (hazard ratio: 2.50 (1.08-5.79) vs. low dose, p = 0.03). CONCLUSIONS Increasing [90Y-DOTA]-TOC activities may be associated with increasing hematological toxicities. The dose-related hematotoxicity profile of [90Y-DOTA]-TOC could facilitate tailoring of [90Y-DOTA]-TOC treatment in patients with preexisting hematotoxicities. The long-term outcome results suggest that fractionated [90Y-DOTA]-TOC treatment might allow renal toxicity to be reduced and overall survival to be improved. (ClinicalTrials.gov number NCT00978211).
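
A sketch of the multivariable Cox regression used to compare the dosage protocols; the data frame is synthetic, and lifelines is merely one common Python implementation (the study does not specify its software).

```python
# Hazard ratio for a high-dose indicator from synthetic survival data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":    [39, 12, 34, 29, 58, 7, 101, 22],   # survival time
    "died":      [1, 1, 0, 1, 0, 1, 0, 1],           # event indicator
    "high_dose": [0, 0, 0, 1, 0, 1, 0, 1],           # 1 = high-dose protocol
})
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()   # reports the hazard ratio for high_dose
```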

Relevance: 30.00%

Abstract:

Middle-atmospheric water vapour can be used as a tracer for dynamical processes. It is mainly measured by satellite instruments and ground-based microwave radiometers. Ground-based instruments capable of measuring middle-atmospheric water vapour are sparse but valuable, as they complement satellite measurements, are relatively easy to maintain and have a long lifetime. MIAWARA-C is a ground-based microwave radiometer for middle-atmospheric water vapour designed for use on measurement campaigns for both atmospheric case studies and instrument intercomparisons. MIAWARA-C's retrieval version 1.1 (v1.1) is set up in such a way as to provide a consistent data set even if the instrument is operated from different locations on a campaign basis. The sensitive altitude range for v1.1 extends from 4 hPa (37 km) to 0.017 hPa (75 km). For v1.1 the estimated systematic error is approximately 10% for all altitudes. At lower altitudes it is dominated by uncertainties in the calibration; with increasing altitude the influence of spectroscopic and temperature uncertainties increases. The estimated random error increases with altitude from 5% to 25%. MIAWARA-C measures two polarisations of the incident radiation in separate receiver channels and can therefore provide two measurements of the same air mass with independent instrumental noise. The standard deviation of the difference between the profiles obtained from the two polarisations is in excellent agreement with the estimated random measurement error of v1.1. In this paper, the quality of v1.1 data is assessed for measurements obtained at two different locations: (1) a total of 25 months of measurements in the Arctic (Sodankylä, 67.37° N, 26.63° E) and (2) nine months of measurements at mid-latitudes (Zimmerwald, 46.88° N, 7.46° E). For both locations MIAWARA-C's profiles are compared to measurements from the satellite experiments Aura MLS and MIPAS. In addition, comparisons to ACE-FTS and SOFIE are presented for the Arctic, and to the ground-based radiometer MIAWARA for the mid-latitude campaigns. In general, all intercomparisons show high correlation coefficients, confirming the ability of MIAWARA-C to monitor temporal variations on the order of days. The biases are generally below 13% and within the estimated systematic uncertainty of MIAWARA-C. No consistent wet or dry bias is identified for MIAWARA-C. In addition, comparisons to the reference instruments indicate that the estimated random error of v1.1 is a realistic measure of the random variation of the retrieved profile between 45 and 70 km.

Relevance: 30.00%

Abstract:

RESTful services have gained a lot of attention recently, even in the enterprise world, which is traditionally more web-service centric. Data-centric RESTful services, previously known mainly from web environments, have established themselves as a second paradigm complementing functional WSDL-based SOA. In the Internet of Things, and in particular when talking about sensor motes, the Constrained Application Protocol (CoAP) is currently in the focus of both research and industry. In the enterprise world, a protocol called OData (Open Data Protocol) is becoming the future RESTful data access standard. To integrate sensor motes seamlessly into enterprise networks, an embedded OData implementation on top of CoAP is desirable, as it does not require an intermediary gateway device. In this paper we introduce and evaluate an embedded OData implementation. We evaluate the OData protocol in terms of performance and energy consumption, considering different data encodings, and compare it to a pure CoAP implementation. We were able to demonstrate that the additional resources needed for an OData/JSON implementation are reasonable when aiming for enterprise interoperability, where OData is suggested to solve both the semantic and technical interoperability problems we have today when connecting systems.
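
Client-side, the web-like access pattern looks roughly as follows; the URI, resource path and JSON payload are hypothetical, and aiocoap is just one Python CoAP client, used here to illustrate a GET against an embedded OData-style resource.

```python
# Fetch a JSON-encoded sensor reading over CoAP (hypothetical endpoint).
import asyncio
import json
from aiocoap import Context, Message, GET

async def read_sensor():
    ctx = await Context.create_client_context()
    request = Message(code=GET, uri="coap://[2001:db8::1]/odata/Temperature")
    response = await ctx.request(request).response
    print(json.loads(response.payload))   # e.g. {"value": 21.5, "unit": "C"}

asyncio.run(read_sensor())
```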

Relevance: 30.00%

Abstract:

Disc degeneration, usually associated with low back pain and changes of intervertebral stiffness, represents a major health issue. As the intervertebral disc (IVD) morphology influences its stiffness, the link between mechanical properties and degenerative grade is partially lost without an efficient normalization of the stiffness with respect to the morphology. Moreover, although the behavior of soft tissues is highly nonlinear, only linear normalization protocols have been defined so far for disc stiffness. Thus, the aim of this work is to propose a nonlinear normalization based on finite element (FE) simulations and evaluate its impact on the stiffness of human anatomical specimens of lumbar IVDs. First, a parameter study involving simulations of biomechanical tests (compression, flexion/extension, bilateral torsion and bending) on 20 FE models of IVDs with various dimensions was carried out to evaluate the effect of the disc's geometry on its compliance and to establish the stiffness/morphology relations necessary for the nonlinear normalization. The computed stiffness was then normalized by height (H), cross-sectional area (CSA), polar moment of inertia (J) or moments of inertia (Ixx, Iyy) to quantify the effect of both the linear and nonlinear normalizations. In the second part of the study, T1-weighted MRI images were acquired to determine H, CSA, J, Ixx and Iyy of 14 human lumbar IVDs. Based on the measured morphology and the pre-established relation with stiffness, linear and nonlinear normalization routines were then applied to the compliance of the specimens for each quasi-static biomechanical test. The variability of the stiffness prior to and after normalization was assessed via the coefficient of variation (CV). The FE study confirmed that larger and thinner IVDs were stiffer, while the normalization strongly attenuated the effect of the disc geometry on its stiffness. Yet, notwithstanding the results of the FE study, the experimental stiffness showed consistently higher CV after normalization. Assuming that geometry and material properties affect the mechanical response, they can also compensate for one another. Therefore, the larger CV after normalization can be interpreted as a strong variability of the material properties, previously hidden by the geometry's own influence. In conclusion, a new normalization protocol for intervertebral disc stiffness in compression, flexion, extension, bilateral torsion and bending was proposed, with the possible use of MRI and FE to acquire the discs' anatomy and determine the nonlinear relations between stiffness and morphology. Such a protocol may be useful to relate the disc's mechanical properties to its degree of degeneration.
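
The variability measure behind the comparison is straightforward; the sketch below computes the coefficient of variation of specimen stiffness before and after dividing out a morphology factor, with made-up numbers.

```python
# CV = std/mean of stiffness, raw vs. morphology-normalized (synthetic data).
import numpy as np

stiffness = np.array([1.8, 2.4, 3.1, 2.0, 2.9])      # raw stiffness values
morph_factor = np.array([0.8, 1.1, 1.4, 0.9, 1.3])   # e.g. a CSA-based factor

def cv(x):
    return np.std(x, ddof=1) / np.mean(x)

print(f"CV raw:        {cv(stiffness):.3f}")
print(f"CV normalized: {cv(stiffness / morph_factor):.3f}")
```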

Relevance: 30.00%

Abstract:

Information Centric Networking (ICN) as an emerging paradigm for the Future Internet has initially focused rather on bandwidth savings in wired networks, but there might also be significant potential to support communication in mobile wireless networks as well as in opportunistic network scenarios, where end systems have spontaneous but time-limited contact to exchange data. This chapter addresses the reasoning why ICN has an important role in mobile and opportunistic networks by identifying several challenges in mobile and opportunistic Information-Centric Networks and discussing appropriate solutions for them. In particular, it discusses the issues of receiver and source mobility. Source mobility needs special attention. Solutions based on routing protocol extensions, indirection, and separation of name resolution and data transfer are discussed. Moreover, the chapter presents solutions for problems in opportunistic Information-Centric Networks. Among those are mechanisms for efficient content discovery in neighbour nodes, resume mechanisms to recover from intermittent connectivity disruptions, a novel agent delegation mechanism to offload content discovery and delivery to mobile agent nodes, and the exploitation of overhearing to populate the routing tables of mobile nodes. Some preliminary performance evaluation results of these mechanisms are provided.

Relevance: 30.00%

Abstract:

We present the design of a submillimeter-wave mixer based on electromagnetic band gap (EBG) technology and using subharmonic local oscillator (LO) injection. The device converts an incoming submillimeter-wavelength signal into a 1-5 GHz intermediate frequency (IF) signal by mixing it with a subharmonic LO signal. The mixer consists of a dual-band receiver and two coplanar stripline (CPS) filters, collocated on top of a three-dimensional (3-D) EBG structure. A four-element array of the proposed receivers was designed, fabricated and tested. The configuration demonstrated reasonable performance: conversion loss below 8 dB and noise temperature below 3000 K. The presented concept can be used at higher frequencies, provided sufficiently powerful LO sources are available.

Relevance: 30.00%

Abstract:

Life expectancy continuously increases, but our society faces age-related conditions. Among musculoskeletal diseases, osteoporosis, associated with the risk of vertebral fracture, and intervertebral disc (IVD) degeneration are painful pathologies responsible for tremendous healthcare costs. Hence, reliable diagnostic tools are necessary to plan a treatment or follow up on its efficacy. Yet radiographic and MRI techniques, respectively the clinical standards for evaluation of bone strength and IVD degeneration, are unspecific and not objective. Increasingly used in biomedical engineering, CT-based finite element (FE) models constitute the state of the art for vertebral strength prediction. However, as non-invasive biomechanical evaluation and personalised FE models of the IVD are not available, rigid boundary conditions (BCs) are applied to the FE models to avoid uncertainties of disc degeneration that might bias the predictions. Moreover, considering the impact of low back pain, the biomechanical status of the IVD is needed as a criterion for early disc degeneration. Thus, the first FE study focuses on two rigid BCs applied to the vertebral bodies during compression tests of cadaver vertebral bodies: vertebral sections and PMMA embedding. The second FE study highlights the large influence of the intervertebral disc's compliance on the vertebral strength, the damage distribution and its initiation. The third study introduces a new protocol for normalisation of the IVD stiffness in compression, torsion and bending using MRI-based data to account for its morphology. In the last study, a new criterion (Otsu threshold) for disc degeneration based on quantitative MRI data (axial T2 map) is proposed. The results show that the vertebral strength and damage distribution computed with the two rigid BCs are identical. Yet large discrepancies in strength and damage localisation were observed when the vertebral bodies were loaded via IVDs. The normalisation protocol attenuated the effect of geometry on the IVD stiffnesses without suppressing it completely. Finally, the Otsu threshold computed in the posterior part of the annulus fibrosus was related to the disc biomechanics and meets the objectivity and simplicity required for a clinical application. In conclusion, the stiffness normalisation protocol necessary for consistent IVD comparisons and the relation found between degeneration, the mechanical response of the IVD and the Otsu threshold lead the way for non-invasive evaluation of the biomechanical status of the IVD. As the FE prediction of vertebral strength is largely influenced by the IVD conditions, these data could also improve future FE models of the osteoporotic vertebra.
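
The proposed criterion reduces to a standard Otsu threshold on quantitative T2 values; in the sketch below a synthetic bimodal "T2 map" stands in for real axial MRI data, with scikit-image providing the Otsu implementation.

```python
# Otsu threshold separating short-T2 (degenerated) from long-T2 voxels.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(1)
t2_map = np.concatenate([rng.normal(45, 8, 400),     # degenerated tissue (ms)
                         rng.normal(110, 15, 400)])  # hydrated tissue (ms)

t2_cut = threshold_otsu(t2_map)
print(f"Otsu threshold: {t2_cut:.1f} ms")
```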