911 results for multicast protocols
Abstract:
The aim of this study was to investigate adolescents' potential reactivity and tampering while wearing pedometers by comparing different monitoring protocols to accelerometer output. The sample included adolescents (N=123, age range=14-15 years) from three secondary schools in New South Wales, Australia. Schools were randomised to one of three pedometer monitoring protocols: (i) daily sealed (DS) pedometer group, (ii) unsealed (US) pedometer group or (iii) weekly sealed (WS) pedometer group. Participants wore pedometers (Yamax Digi-Walker CW700, Yamax Corporation, Kumamoto City, Japan) and accelerometers (Actigraph GT3X+, Pensacola, USA) simultaneously for seven days. Repeated measures analysis of variance was used to examine potential reactivity. Bivariate correlations between step counts and accelerometer output were calculated to explore potential tampering. The correlation between accelerometer output and pedometer steps/day was strongest among participants in the WS group (r=0.82, P <= 0.001), compared to the US (r=0.63, P <= 0.001) and DS (r=0.16, P=0.324) groups. The DS (P <= 0.001) and US (P=0.003), but not the WS (P=0.891), groups showed evidence of reactivity. The results suggest that reactivity and tampering do occur in adolescents and that, contrary to existing research, pedometer monitoring protocols may influence participant behaviour.
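To make the tampering check concrete, the following is a minimal sketch of the per-group bivariate correlation between pedometer steps/day and accelerometer output. The DataFrame `df` and its column names (`group`, `steps_per_day`, `accel_counts`) are hypothetical, not taken from the study.

```python
# A minimal sketch of the tampering check: per-group Pearson correlations
# between pedometer steps/day and accelerometer output. Column names and
# the DataFrame `df` are assumptions for illustration, not the study's data.
import pandas as pd
from scipy.stats import pearsonr

def group_correlations(df: pd.DataFrame) -> dict:
    """Return {group: (r, p_value)} for steps vs. accelerometer counts."""
    results = {}
    for group, sub in df.groupby("group"):   # e.g. 'DS', 'US', 'WS'
        r, p = pearsonr(sub["steps_per_day"], sub["accel_counts"])
        results[group] = (r, p)
    return results

# A weak correlation in one group (as reported for the daily sealed group)
# would be consistent with pedometer tampering rather than genuine activity.
```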
Abstract:
Indigenous commentators have long critiqued the way in which government agencies and members of academic institutions carry out research in their social context. Recently, these commentators have turned their critical gaze upon the activities of Research Ethics Boards (REBs). Informed by reflections on research processes from Indigenous Canadian and New Zealand research participants, as well as by the extant literature, this paper critiques the processes employed by New Zealand REBs to assess Indigenous-focused or Indigenous-led research in the criminological realm.
Abstract:
We first survey the state of the art on the stream authentication problem in the multicast environment and group existing solutions into Signing and MAC approaches. A new approach for authenticating digital streams using Threshold Techniques is introduced. The main advantages of the new approach are its tolerance of packet loss, up to a threshold number of packets, and its minimal space overhead. It is most suitable for multicast applications running over lossy, unreliable communication channels while, at the same time, preserving the security requirements. We use linear equations based on Lagrange polynomial interpolation and Combinatorial Design methods.
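To illustrate the threshold idea named above, here is a minimal sketch of recovering a value by Lagrange interpolation from any k of n shares. The small prime field and Shamir-style share layout are assumptions for the sketch, not the authors' exact construction.

```python
# A minimal sketch of the threshold idea: a value (e.g. authentication data)
# is spread over n packets so that any k surviving packets suffice to recover
# it by Lagrange interpolation. Field size and share layout are assumptions.
import random

P = 2**31 - 1  # small prime field, for illustration only

def make_shares(secret: int, k: int, n: int, seed: int = 0):
    """Evaluate a random degree-(k-1) polynomial with constant term `secret`."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def lagrange_recover(shares):
    """Reconstruct f(0) from any k shares via Lagrange interpolation mod P."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, k=3, n=5)
surviving = shares[:3]                      # any 3 of the 5 packets survive
assert lagrange_recover(surviving) == 123456789
```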
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In this context, bandwidth limitation and fast authentication are the core concerns. Therefore, any authentication scheme should reduce as much as possible the packet overhead and the time spent at the receiver to check the authenticity of collected elements. Recently, Tartary and Wang developed a provably secure protocol with small packet overhead and a reduced number of signature verifications to be performed at the receiver. In this paper, we propose a hybrid scheme based on Tartary and Wang’s approach and Merkle hash trees. Our construction exhibits a smaller overhead and much faster processing at the receiver, making it even more suitable for multicast than the earlier approach. Like Tartary and Wang’s protocol, our construction is provably secure and allows total recovery of the data stream despite erasures and injections occurring during transmission.
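For reference, a minimal sketch of the Merkle hash tree component mentioned above: one signature is amortised over a block of packets by signing only the tree root. Padding and the signature step are simplified assumptions, not the paper's exact layout.

```python
# A minimal sketch of a Merkle hash tree over a block of packets: the sender
# signs only the root, and each packet can later be verified against it.
# Padding and signature handling are simplified assumptions.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(packets: list) -> bytes:
    """Compute the root of a binary Merkle tree over the packet hashes."""
    level = [h(p) for p in packets]
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate the last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

block = [f"packet-{i}".encode() for i in range(8)]
root = merkle_root(block)                 # the sender signs only this root
```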
Abstract:
Multi-party key agreement protocols implicitly assume that each principal contributes equally to the final form of the key. In this paper we consider three malleability attacks on multi-party key agreement protocols. The first attack, called strong key control, allows a dishonest principal (or a group of principals) to fix the key to a pre-set value. The second attack is weak key control, in which the key is still random, but the set from which the key is drawn is much smaller than expected. The third attack is selective key control, in which a dishonest principal (or a group of dishonest principals) is able to remove the contribution of honest principals to the group key. The paper discusses these three attacks on several key agreement protocols, including DH (Diffie-Hellman), BD (Burmester-Desmedt) and JV (Just-Vaudenay). We show that dishonest principals in all three protocols can weakly control the key, and that the only protocol which does not allow strong key control is the DH protocol. The BD and JV protocols permit any pair of neighboring principals to modify the group key, and this modification remains undetected by the honest principals.
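For context, here is a minimal sketch of an honest run of the Burmester-Desmedt (BD) group key computation. The toy modulus, generator and random seed are assumptions for illustration and are not secure parameters.

```python
# A minimal sketch of the Burmester-Desmedt group key agreement (honest run).
# Toy parameters, for illustration only; not a secure instantiation.
import random

p = 2**127 - 1          # assumed toy prime modulus
g = 3                   # assumed generator for the sketch

def bd_group_key(n, seed=0):
    rng = random.Random(seed)
    r = [rng.randrange(2, p - 1) for _ in range(n)]   # private exponents
    z = [pow(g, ri, p) for ri in r]                   # round 1: z_i = g^{r_i}
    # Round 2: X_i = (z_{i+1} / z_{i-1})^{r_i} mod p
    X = [pow(z[(i + 1) % n] * pow(z[(i - 1) % n], -1, p) % p, r[i], p)
         for i in range(n)]
    # Each member i computes:
    # K_i = z_{i-1}^{n*r_i} * X_i^{n-1} * X_{i+1}^{n-2} * ... * X_{i+n-2}^{1}
    keys = []
    for i in range(n):
        K = pow(z[(i - 1) % n], n * r[i], p)
        for j in range(1, n):
            K = K * pow(X[(i + j - 1) % n], n - j, p) % p
        keys.append(K)
    return keys

keys = bd_group_key(5)
assert len(set(keys)) == 1    # all honest members derive the same group key
```

In the honest run every principal derives the same key; the attacks discussed in the abstract exploit the fact that the key depends on the broadcast values X_i, which a pair of neighboring principals can choose adaptively.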
Abstract:
To prevent unauthorized access to protected trusted platform module (TPM) objects, authorization protocols, such as the object-specific authorization protocol (OSAP), have been introduced by the trusted computing group (TCG). By using OSAP, processes trying to gain access to the protected TPM objects need to prove their knowledge of relevant authorization data before access to the objects can be granted. Chen and Ryan’s 2009 analysis has demonstrated OSAP’s authentication vulnerability in sessions with shared authorization data. They also proposed the Session Key Authorization Protocol (SKAP) with fewer stages as an alternative to OSAP. Chen and Ryan’s analysis of SKAP using ProVerif proves the authentication property. The purpose of this paper was to examine the usefulness of Colored Petri Nets (CPN) and CPN Tools for security analysis. Using OSAP and SKAP as case studies, we construct intruder and authentication property models in CPN. CPN Tools is used to verify the authentication property using a Dolev–Yao-based model. Verification of the authentication property in both models using the state space tool produces results consistent with those of Chen and Ryan.
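To make the "prove knowledge of authorization data" idea concrete, here is a schematic sketch of an OSAP-style check in which the caller HMACs the session nonces with a secret derived from the object's authorization data. This is an illustration of the general concept only; the message format, field names and key derivation are assumptions, not the TCG specification.

```python
# A schematic sketch of the OSAP-style idea: proving knowledge of an object's
# authorization data via an HMAC over session nonces. Not the TCG-specified
# message format; derivation details here are assumptions for illustration.
import hmac, hashlib, os

def derive_session_secret(auth_data: bytes, nonce_even_osap: bytes,
                          nonce_odd_osap: bytes) -> bytes:
    # Assumption for the sketch: session secret = HMAC(authData, nonces).
    return hmac.new(auth_data, nonce_even_osap + nonce_odd_osap,
                    hashlib.sha1).digest()

def authorize(session_secret: bytes, command: bytes,
              nonce_even: bytes, nonce_odd: bytes) -> bytes:
    # The caller returns an HMAC over the command and the rolling nonces.
    return hmac.new(session_secret, command + nonce_even + nonce_odd,
                    hashlib.sha1).digest()

# Both sides derive the same session secret from the shared authorization
# data, so the verifier can recompute and compare the authorization HMAC.
auth_data = hashlib.sha1(b"object password").digest()
ne_osap, no_osap = os.urandom(20), os.urandom(20)
secret = derive_session_secret(auth_data, ne_osap, no_osap)
ne, no = os.urandom(20), os.urandom(20)
tag = authorize(secret, b"TPM_Command", ne, no)
assert hmac.compare_digest(tag, authorize(secret, b"TPM_Command", ne, no))
```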
Abstract:
We propose a reliable and ubiquitous group key distribution scheme that is suitable for ad hoc networks. The scheme has self-initialisation and self-securing features. The former allows an arbitrary number of nodes to cooperate in initialising the system and also allows node admission to be performed in a decentralised fashion. The latter allows a group member to determine the group key remotely while maintaining system security. We also consider a decentralised solution for establishing secure point-to-point communication. The solution allows a new node to establish a secure channel with every existing node, provided it has pre-existing secure channels with a threshold number of the existing nodes.
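One plausible way to picture the threshold-based channel establishment (this is an illustration only, not the paper's actual construction): the new node splits a fresh pairwise key into sub-shares, sends one sub-share over each of its pre-existing secure channels, and the target recombines them, so an adversary would have to compromise all relaying channels.

```python
# A minimal sketch (not the paper's actual construction): a pairwise key is
# delivered as XOR-shares over t pre-existing secure channels and recombined
# at the target node.
import os
from functools import reduce

def split_key(key: bytes, t: int):
    """Split `key` into t XOR-shares; any t-1 of them reveal nothing."""
    shares = [os.urandom(len(key)) for _ in range(t - 1)]
    last = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), shares, key)
    return shares + [last]

def combine_shares(shares):
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), shares)

pairwise_key = os.urandom(16)                  # key proposed by the new node
shares = split_key(pairwise_key, t=3)          # one share per secure channel
assert combine_shares(shares) == pairwise_key  # target recovers the key
```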
Abstract:
The finite signal-to-noise ratio (SNR) diversity-multiplexing trade-off (DMT) of cooperative diversity protocols is investigated in vehicular networks based on cascaded Rayleigh fading. Lower bounds on the DMT at finite SNR are derived for orthogonal and non-orthogonal protocols. The results provide a first look into the trade-off achievable by cooperative diversity in volatile vehicular environments. It is shown that the diversity gains are significantly suboptimal at realistic SNRs.
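For context, one common finite-SNR formulation of the multiplexing and diversity gains is given below; the notation is assumed here for reference and is not taken from the abstract itself.

```latex
% One common finite-SNR DMT formulation (assumed notation, for context only):
% r is the multiplexing gain at rate R, and d_f is the negative slope of the
% outage probability versus SNR on a log-log scale.
r = \frac{R}{\log_2(1 + \mathrm{SNR})}, \qquad
d_f(r,\mathrm{SNR}) = -\,\frac{\partial \ln P_{\mathrm{out}}(r,\mathrm{SNR})}{\partial \ln \mathrm{SNR}}
```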
Abstract:
Citizen science projects have demonstrated the advantages of people with limited relevant prior knowledge participating in research. However, there is a difference between engaging the general public in a scientific project and entering an established expert community to conduct research. This paper describes our ongoing acoustic biodiversity monitoring collaborations with the bird watching community. We report on findings gathered over six years from participation in bird walks, observation of conservation efforts, and records of the personal activities of experienced birders. We offer an empirical study into extending existing protocols through in-context collaborative design involving scientists and domain experts.
Abstract:
Firstly, we would like to thank Ms. Alison Brough and her colleagues for their positive commentary on our published work [1] and their appraisal of the utility of our “off-set plane” protocol for anthropometric analysis. The standardized protocols described in our manuscript have wide applications, ranging from forensic anthropology and paleodemographic research to clinical settings such as paediatric practice and orthopaedic surgical design. We affirm that the use of geometrically based reference tools commonly found in computer-aided design (CAD) programs such as Geomagic Design X® is imperative for more automated and precise measurement protocols for quantitative skeletal analysis. Therefore, we stand by our recommendation of software such as Amira and Geomagic Design X® in the contexts described in our manuscript...
Abstract:
As of now, there is little evidence on the questions of whether pro bono services are effective, whether lawyer charity is a cheaper way to provide them (because it does cost money to do pro bono), or whether it would in fact be more efficient and effective if firms and attorneys stopped giving their time and instead donated money to the organizations already specializing in these clients and causes. Pro bono may or may not be an efficient way of doing socially important work; at this point, we simply do not know.
Abstract:
In this paper, the security of two recent RFID mutual authentication protocols is investigated. The first protocol is a scheme proposed by Huang et al. [7] and the second one by Huang, Lin and Li [6]. We show that these two protocols have several weaknesses. In Huang et al.’s scheme, an adversary can determine the 32-bit secret password with a probability of 2^-2, and in the Huang-Lin-Li scheme, a passive adversary can recognize a target tag with a success probability of 1 - 2^-4 and an active adversary can determine all 32 bits of the Access password with a success probability of 2^-4. The computational complexity of these attacks is negligible.
Abstract:
Concerns over the security and privacy of patient information are among the biggest hindrances to sharing health information and to the wide adoption of eHealth systems. At present, the requirements of healthcare consumers (i.e. patients) compete with those of healthcare professionals (HCPs): while consumers want control over their information, healthcare professionals want access to as much information as required in order to make well-informed decisions and provide quality care. In order to balance these requirements, the use of an Information Accountability Framework devised for eHealth systems has been proposed. In this paper, we take a step closer to the adoption of the Information Accountability protocols and demonstrate their functionality through an implementation in FluxMED, a customisable EHR system.
Abstract:
This paper investigates communication protocols for relaying sensor data from animal tracking applications back to base stations. While Delay Tolerant Networks (DTNs) are well suited to such challenging environments, most existing protocols do not consider the available energy, which is particularly important when tracking devices can harvest energy. This limits both the network lifetime and the delivery probability in energy-constrained applications, to the point where routing performance becomes worse than using no routing at all. Our work shows that substantial improvements in data yield can be achieved through simple yet efficient energy-aware strategies. Conceptually, there is a need to balance the energy spent on sensing, data muling, and direct delivery of packets to the destination. We use empirical traces collected in a flying fox (fruit bat) tracking project and show that simple threshold-based energy-aware strategies yield up to 20% higher delivery rates. Furthermore, these results generalize well for a wide range of operating conditions.
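A minimal sketch of a threshold-based energy-aware forwarding rule of the kind described above follows; the threshold values, battery model and packet categories are assumptions for illustration, not the paper's parameters.

```python
# A minimal sketch of a threshold-based energy-aware strategy: a node takes
# on data-mule duties only when its energy budget is above a threshold, and
# always keeps a reserve for sensing and delivering its own packets.
# Thresholds and the energy model are assumptions, not the paper's values.
from dataclasses import dataclass

@dataclass
class NodeState:
    battery_level: float        # fraction of capacity, 0.0 .. 1.0
    expected_harvest: float     # expected fraction harvested before next contact

MULE_THRESHOLD = 0.6            # assumed: carry others' packets above this
SENSING_RESERVE = 0.2           # assumed: never dip below this for sensing

def should_carry_foreign_packet(state: NodeState) -> bool:
    """Accept data-mule duty only when the energy budget can afford it."""
    budget = state.battery_level + state.expected_harvest
    return budget >= MULE_THRESHOLD

def should_transmit_own_packet(state: NodeState) -> bool:
    """Keep a reserve for sensing; deliver own data only above it."""
    return state.battery_level > SENSING_RESERVE

print(should_carry_foreign_packet(NodeState(0.35, 0.20)))  # False
print(should_transmit_own_packet(NodeState(0.35, 0.20)))   # True
```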
Abstract:
Here we describe a protocol for advanced CUBIC (Clear, Unobstructed Brain/Body Imaging Cocktails and Computational analysis). The CUBIC protocol enables simple and efficient organ clearing, rapid imaging by light-sheet microscopy and quantitative imaging analysis of multiple samples. The organ or body is cleared by immersion for 1–14 d, with the exact time required dependent on the sample type and the experimental purposes. A single imaging set can be completed in 30–60 min. Image processing and analysis can take <1 d, but it is dependent on the number of samples in the data set. The CUBIC clearing protocol can process multiple samples simultaneously. We previously used CUBIC to image whole-brain neural activities at single-cell resolution using Arc-dVenus transgenic (Tg) mice. CUBIC informatics calculated the Venus signal subtraction, comparing different brains at a whole-organ scale. These protocols provide a platform for organism-level systems biology by comprehensively detecting cells in a whole organ or body.