927 results for cryptographic pairing computation, elliptic curve cryptography


Relevance: 20.00%

Abstract:

BACKGROUND: A previous investigation showed that the volume-time curve technique could be an alternative for endotracheal tube (ETT) cuff management. However, the clinical impact of applying the volume-time curve has not been documented. The purpose of this study was to compare the occurrence and intensity of sore throat, cough, and thoracic pain, as well as pulmonary function, between 2 techniques for ETT cuff management after coronary artery bypass grafting: the volume-time curve technique versus the minimal occlusive volume (MOV) technique. METHODS: A total of 450 subjects were randomized into 2 groups for cuff management after intubation: the MOV group (n = 222) and the volume-time curve group (n = 228). We measured cuff pressure before extubation and performed spirometry 24 h before and after surgery. We graded sore throat and cough on a 4-point scale at 1, 24, 72, and 120 h after extubation, and assessed thoracic pain at 24 h after extubation, quantifying the level of pain on a 10-point scale. RESULTS: The volume-time curve group presented significantly lower cuff pressure (30.9 ± 2.8 vs 37.7 ± 3.4 cm H2O), lower incidence and intensity of sore throat (1 h, 23.7 vs 51.4%; 24 h, 18.9 vs 40.5%, P < .001), cough (1 h, 19.3 vs 48.6%; 24 h, 18.4 vs 42.3%, P < .001), and thoracic pain (5.2 ± 1.8 vs 7.1 ± 1.7), and better preservation of FVC (49.5 ± 9.9 vs 41.8 ± 12.9%, P = .005) and FEV1 (46.6 ± 1.8 vs 38.6 ± 1.4%, P = .005) compared with the MOV group. CONCLUSIONS: Subjects who received the volume-time curve technique for ETT cuff management presented a significantly lower incidence and severity of sore throat and cough, less thoracic pain, and less impairment of pulmonary function than subjects who received the MOV technique during the first 24 h after coronary artery bypass grafting.

Relevance: 20.00%

Abstract:

Rowland, J.J. (2003) Model Selection Methodology in Supervised Learning with Evolutionary Computation. BioSystems 72, 1-2, pp 187-196, Nov

Relevance: 20.00%

Abstract:

Rowland, J. J. (2003) Generalisation and Model Selection in Supervised Learning with Evolutionary Computation. European Workshop on Evolutionary Computation in Bioinformatics: EvoBio 2003. Lecture Notes in Computer Science (Springer), Vol 2611, pp 119-130

Relevance: 20.00%

Abstract:

Rowland, J.J. (2002) Interpreting Analytical Spectra with Evolutionary Computation. In: Fogel, G.B. and Corne, D.W. (eds), Evolutionary Computation in Bioinformatics. Morgan Kaufmann, San Francisco, pp 341-365, ISBN 1-55860-797-8

Relevance: 20.00%

Abstract:

J. Keppens, Q. Shen and B. Schafer. Probabilistic abductive computation of evidence collection strategies in crime investigation. Proceedings of the 10th International Conference on Artificial Intelligence and Law, pages 215-225.

Relevance: 20.00%

Abstract:

M. Galea and Q. Shen. Fuzzy rules from ant-inspired computation. Proceedings of the 13th International Conference on Fuzzy Systems, pages 1691-1696, 2004.

Relevance: 20.00%

Abstract:

B.M. Brown, M. Marletta, S. Naboko, I. Wood: Boundary triplets and M-functions for non-selfadjoint operators, with applications to elliptic PDEs and block operator matrices, J. London Math. Soc., June 2008; 77: 700-718. The full text of this article will be made available in this repository in June 2009. Sponsorship: EPSRC, INTAS.

Relevance: 20.00%

Abstract:

UPNa. Instituto de Agrobiotecnología. Laboratorio de Biofilms Microbianos.

Relevance: 20.00%

Abstract:

We propose a new notion of cryptographic tamper evidence. A tamper-evident signature scheme provides an additional procedure Div which detects tampering: given two signatures, Div can determine whether one of them was generated by the forger. Surprisingly, this is possible even after the adversary has inconspicuously learned (exposed) some, or even all, of the secrets in the system. In this case, it might be impossible to tell which signature was generated by the legitimate signer and which by the forger. But at least the fact of the tampering will be made evident. We define several variants of tamper evidence, differing in their power to detect tampering. In all of these, we assume an equally powerful adversary: she adaptively controls all the inputs to the legitimate signer (i.e., all messages to be signed and their timing) and observes all his outputs; she can also adaptively expose all the secrets at arbitrary times. We provide tamper-evident schemes for all the variants and prove their optimality. Achieving the strongest tamper evidence turns out to be provably expensive. However, we define a somewhat weaker, but still practical, variant, α-synchronous tamper evidence (α-te), and provide α-te schemes with logarithmic cost. Our α-te schemes use a combinatorial construction of α-separating sets, which might be of independent interest. We stress that our mechanisms are purely cryptographic: the tamper-detection algorithm Div is stateless and takes no inputs except the two signatures (in particular, it keeps no logs); we use no infrastructure (or other ways to conceal additional secrets); and we use no hardware properties (except those implied by the standard cryptographic assumptions, such as random number generators). Our constructions are based on arbitrary ordinary signature schemes and do not require random oracles.
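
As a rough illustration of the interface the abstract describes, here is a minimal Python sketch; the class and method names are hypothetical, and a real instantiation would be built on top of an ordinary signature scheme, as the abstract indicates.

```python
# Minimal sketch of a tamper-evident signature interface, assuming the
# abstract's setting: an ordinary sign/verify pair plus a stateless Div
# procedure that, given two signatures, reports whether one must be a forgery.

from abc import ABC, abstractmethod

class TamperEvidentSignatureScheme(ABC):
    @abstractmethod
    def keygen(self) -> tuple[bytes, bytes]:
        """Return (secret_key, public_key)."""

    @abstractmethod
    def sign(self, secret_key: bytes, message: bytes) -> bytes:
        """Produce a signature; only the legitimate signer calls this."""

    @abstractmethod
    def verify(self, public_key: bytes, message: bytes, signature: bytes) -> bool:
        """Ordinary signature verification."""

    @abstractmethod
    def div(self, sig_a: bytes, sig_b: bytes) -> bool:
        """Stateless tamper detection: given two signatures and nothing else
        (no logs, no extra secrets), report whether one of them must have
        been produced by a forger."""
```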

Relevance: 20.00%

Abstract:

Formal tools like finite-state model checkers have proven useful in verifying the correctness of systems of bounded size and for hardening single system components against arbitrary inputs. However, conventional applications of these techniques are not well suited to characterizing emergent behaviors of large compositions of processes. In this paper, we present a methodology by which arbitrarily large compositions of components can, if sufficient conditions are proven concerning properties of small compositions, be modeled and completely verified by performing formal verifications upon only a finite set of compositions. The sufficient conditions take the form of reductions, which are claims that particular sequences of components will be causally indistinguishable from other, shorter sequences of components. We show how this methodology can be applied to a variety of network protocol applications, including two features of the HTTP protocol, a simple active networking applet, and a proposed web cache consistency algorithm. We also discuss its applicability to framing protocol design goals and to representing systems that employ non-model-checking verification methodologies. Finally, we briefly discuss how we hope to broaden this methodology to more general topological compositions of network applications.
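
As a loose illustration of the reduction idea (not the paper's tooling), the sketch below rewrites compositions using rules that replace a sequence of components with a shorter, causally indistinguishable one, so that only the remaining canonical compositions would need to be model checked; the component names and the example rule are invented.

```python
# Illustrative sketch: apply "reduction" rules that replace a sequence of
# components with a shorter, indistinguishable one, so arbitrarily long
# compositions collapse to a finite set of canonical forms to verify.

def normalize(composition, reductions, max_passes=100):
    """composition: tuple of component names, e.g. ("client", "cache", "server").
    reductions: dict mapping a longer tuple to a shorter equivalent tuple."""
    comp = tuple(composition)
    for _ in range(max_passes):
        changed = False
        for longer, shorter in reductions.items():
            n = len(longer)
            for i in range(len(comp) - n + 1):
                if comp[i:i + n] == longer:
                    comp = comp[:i] + tuple(shorter) + comp[i + n:]
                    changed = True
                    break
            if changed:
                break
        if not changed:
            break
    return comp

# Hypothetical rule: two caches in a row behave like one, so arbitrarily long
# cache chains reduce to a single canonical composition verified once.
reductions = {("cache", "cache"): ("cache",)}
print(normalize(("client", "cache", "cache", "cache", "server"), reductions))
# -> ('client', 'cache', 'server')
```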

Relevance: 20.00%

Abstract:

Attributing a dollar value to a keyword is an essential part of running any profitable search engine advertising campaign. When an advertiser has complete control over the interaction with and monetization of each user arriving on a given keyword, the value of that term can be accurately tracked. However, in many instances, the advertiser may monetize arrivals indirectly through one or more third parties. In such cases, it is typical for the third party to provide only coarse-grained reporting: rather than report each monetization event, users are aggregated into larger channels and the third party reports aggregate information such as total daily revenue for each channel. Examples of third parties that use channels include Amazon and Google AdSense. In such scenarios, the number of channels is generally much smaller than the number of keywords whose value per click (VPC) we wish to learn. However, the advertiser has flexibility as to how to assign keywords to channels over time. We introduce the channelization problem: how do we adaptively assign keywords to channels over the course of multiple days to quickly obtain accurate VPC estimates of all keywords? We relate this problem to classical results in weighing design, devise new adaptive algorithms for this problem, and quantify the performance of these algorithms experimentally. Our results demonstrate that adaptive weighing designs that exploit statistics of term frequency, variability in VPCs across keywords, and flexible channel assignments over time provide the best estimators of keyword VPCs.
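
As a toy illustration of the estimation problem (not the paper's adaptive weighing designs), the numpy sketch below assumes per-keyword click counts are known and channel revenue is roughly linear in them; stacking several days of aggregate channel reports then yields a least-squares estimate of the per-keyword VPC vector. How keywords are assigned to channels each day, which the paper optimizes, is left random here, and all parameters are invented.

```python
# Toy sketch: recover per-keyword value-per-click (VPC) from aggregate
# per-channel revenue via least squares, assuming known daily click counts
# and (noisy) linear channel revenue.

import numpy as np

rng = np.random.default_rng(0)
n_keywords, n_channels, n_days = 6, 2, 8
true_vpc = rng.uniform(0.1, 2.0, size=n_keywords)        # unknown in practice

rows, revenues = [], []
for _ in range(n_days):
    clicks = rng.poisson(50, size=n_keywords)              # clicks per keyword
    assignment = rng.integers(0, n_channels, size=n_keywords)  # keyword -> channel
    for ch in range(n_channels):
        row = np.where(assignment == ch, clicks, 0)        # clicks routed to channel ch
        rows.append(row)
        revenues.append(row @ true_vpc + rng.normal(0, 1.0))   # reported aggregate revenue

A = np.array(rows, dtype=float)
r = np.array(revenues)
vpc_hat, *_ = np.linalg.lstsq(A, r, rcond=None)
print("true VPC:     ", np.round(true_vpc, 2))
print("estimated VPC:", np.round(vpc_hat, 2))
```

With random assignments the design matrix is usually well conditioned; the paper's contribution is choosing the assignments adaptively so the estimates converge faster than this naive scheme.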

Relevance: 20.00%

Abstract:

It is shown that determining whether a quantum computation has a non-zero probability of accepting is at least as hard as the polynomial time hierarchy. This hardness result also applies to determining in general whether a given quantum basis state appears with nonzero amplitude in a superposition, or whether a given quantum bit has positive expectation value at the end of a quantum computation.

Relevance: 20.00%

Abstract:

It is shown that determining whether a quantum computation has a non-zero probability of accepting is at least as hard as the polynomial time hierarchy. This hardness result also applies to determining in general whether a given quantum basis state appears with nonzero amplitude in a superposition, or whether a given quantum bit has positive expectation value at the end of a quantum computation. This result is achieved by showing that the complexity class NQP of Adleman, DeMarrais, and Huang, a quantum analog of NP, is equal to the counting class coC=P.
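
Restated compactly in standard complexity-class notation (this only re-expresses the abstract's claim):

```latex
% NQP: quantum analogue of NP (Adleman, DeMarrais, Huang).
% coC_=P: complement of the exact-counting class C_=P.
\[
  \mathrm{NQP} \;=\; \mathrm{co}\mathrm{C}_{=}\mathrm{P},
\]
% so, per the abstract, deciding whether a quantum computation accepts with
% non-zero probability is at least as hard as the polynomial-time hierarchy.
```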

Relevance: 20.00%

Abstract:

Traditionally, attacks on cryptographic algorithms looked for mathematical weaknesses in the underlying structure of a cipher. Side-channel attacks, however, extract secret-key information from the leakage of the device on which the cipher is implemented, be it a smart card, microprocessor, dedicated hardware, or personal computer. Attacks based on power consumption, electromagnetic emanations, and execution time have all been practically demonstrated on a range of devices to reveal partial secret-key information from which the full key can be reconstructed. The focus of this thesis is power analysis, more specifically a class of attacks known as profiling attacks. These attacks assume a potential attacker has access to, or can control, a device identical to the one under attack, which allows him to profile the power consumption of operations or data flow during encryption. This assumes a stronger adversary than traditional non-profiling attacks such as differential or correlation power analysis; however, the ability to model a device allows templates to be used after profiling to extract key information from many different target devices using the power consumption of very few encryptions. This allows an adversary to overcome protocols that try to prevent secret-key recovery by restricting the number of available traces. This thesis presents a detailed investigation of template attacks, of how the selection of various attack parameters practically affects the efficiency of secret-key recovery, and of the underlying assumption of profiling attacks: that the power consumption of one device can be used to extract secret keys from another. Trace-only attacks, where the corresponding plaintext or ciphertext data is unavailable, are then investigated against both symmetric and asymmetric algorithms with the goal of key recovery from a single trace, which allows an adversary to bypass many of the currently proposed countermeasures, particularly in the asymmetric domain. Machine-learning methods for side-channel analysis are also investigated as an alternative to template or stochastic methods, with support vector machines, logistic regression, and neural networks each examined from a side-channel viewpoint. Both binary and multi-class classification attack scenarios are examined in order to explore the relative strengths of each algorithm. Finally, these machine-learning-based alternatives are empirically compared with template attacks, and their respective merits are examined with regard to attack efficiency.
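
As a simplified illustration of the profiling idea behind template attacks (not the thesis's experiments), the sketch below fits one Gaussian template, a mean and covariance, per key-dependent class from labeled traces, then classifies a single unlabeled trace by maximum likelihood; the synthetic trace model and all parameters are invented for the example, and real attacks would first select points of interest from much longer traces.

```python
# Minimal template-attack sketch: profile a controlled device by fitting a
# Gaussian (mean, covariance) per key-dependent class, then classify a trace
# from the target device by maximum likelihood. Synthetic data only.

import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
n_classes, n_profiling, trace_len = 4, 200, 10

# Synthetic leakage model: each class leaks around its own mean trace.
class_means = rng.normal(0, 1, size=(n_classes, trace_len))
def measure(cls, n):
    return class_means[cls] + rng.normal(0, 0.5, size=(n, trace_len))

# Profiling phase: build one template per class from labeled traces.
templates = []
for cls in range(n_classes):
    traces = measure(cls, n_profiling)
    templates.append((traces.mean(axis=0), np.cov(traces, rowvar=False)))

# Attack phase: a single trace from the "target" device, class unknown.
secret_class = 2
target_trace = measure(secret_class, 1)[0]
log_likelihoods = [
    multivariate_normal(mean=m, cov=c, allow_singular=True).logpdf(target_trace)
    for m, c in templates
]
print("guessed class:", int(np.argmax(log_likelihoods)), "true class:", secret_class)
```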

Relevance: 20.00%

Abstract:

Knowing one's HIV status is particularly important in the setting of recent tuberculosis (TB) exposure. Blood tests for assessment of tuberculosis infection, such as the QuantiFERON Gold in-tube test (QFT; Cellestis Limited, Carnegie, Victoria, Australia), offer the possibility of simultaneous screening for TB and HIV with a single blood draw. We performed a cross-sectional analysis of all contacts to a highly infectious TB case in a large meatpacking factory. Twenty-two percent were foreign-born and 73% were black. Contacts were tested with both tuberculin skin testing (TST) and QFT. HIV testing was offered on an opt-out basis. Persons with TST ≥10 mm, positive QFT, and/or positive HIV test were offered latent TB treatment. Three hundred twenty-six contacts were screened: TST results were available for 266 people and an additional 24 reported a prior positive TST, for a total of 290 persons with any TST result (89.0%). Adequate QFT specimens were obtained for 312 persons (95.7%). Thirty-two persons had QFT results but did not return for TST reading. Twenty-two percent met the criteria for latent TB infection. Eighty-eight percent accepted HIV testing. Two (0.7%) were HIV seropositive; both individuals were already aware of their HIV status, but one had stopped care a year previously. None of the HIV-seropositive persons had latent TB, but all were offered latent TB treatment per standard guidelines. This demonstrates that opt-out HIV testing combined with QFT in a large TB contact investigation was feasible and useful. HIV testing was also widely accepted. Pairing QFT with opt-out HIV testing should be strongly considered when possible.