935 results for proxy signature
Abstract:
Standard signature schemes are usually designed only to achieve weak unforgeability – i.e. to prevent forgery of signatures on new messages not previously signed. However, most signature schemes are randomised and allow many possible signatures for a single message, in which case it may be possible to produce a new signature on a previously signed message. Some applications require that this type of forgery also be prevented – a requirement called strong unforgeability. At PKC 2006, Boneh, Shen and Waters presented an efficient transform, based on any randomised trapdoor hash function, which converts a weakly unforgeable signature into a strongly unforgeable one, and applied it to construct a strongly unforgeable signature based on the CDH problem. However, the transform of Boneh et al. applies only to a class of so-called partitioned signatures. Although many schemes fall into this class, some do not, for example the DSA signature. Hence it is natural to ask whether one can obtain a truly generic efficient transform, based on any randomised trapdoor hash function, which converts any weakly unforgeable signature into a strongly unforgeable one. We answer this question in the affirmative by presenting a simple modification of the Boneh-Shen-Waters transform. Our modified transform uses two randomised trapdoor hash functions.
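The randomised trapdoor hash functions this transform builds on can be illustrated with the classic discrete-log chameleon hash. The following is a toy sketch with deliberately tiny, insecure parameters, not the construction from the paper itself:

```python
# Toy discrete-log chameleon hash: CH(m, r) = g^m * y^r mod p, where
# y = g^x and the trapdoor x lets its holder find collisions.
# Illustrative (insecure) parameters: p = 2q + 1 with tiny primes.
p, q, g = 23, 11, 4          # g generates the order-q subgroup mod 23
x = 5                        # trapdoor key
y = pow(g, x, p)             # public hash key

def ch(m, r):
    """Randomised trapdoor hash of message m with randomness r."""
    return (pow(g, m, p) * pow(y, r, p)) % p

def collide(m, r, m_new):
    # With the trapdoor: since CH(m, r) = g^(m + x*r), solve
    # m_new + x*r_new = m + x*r (mod q) for r_new.
    return (r + (m - m_new) * pow(x, -1, q)) % q
```

Without the trapdoor x, finding such collisions is as hard as computing discrete logarithms; with it, any message can be mapped to any previously published hash value, which is the property the transform exploits.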
Abstract:
Handover performance is critical to supporting real-time traffic applications in wireless network communications. The longer the handover delay, the longer a Mobile Node (MN) is prevented from sending or receiving data packets. In real-time network applications, such as VoIP and video conferencing, a long handover delay is often unacceptable. To achieve better handover performance, Fast Proxy Mobile IPv6 (FPMIPv6) has been standardised by the Internet Engineering Task Force (IETF) as an improvement on the original Proxy Mobile IPv6 (PMIPv6). FPMIPv6 adopts a link layer triggering mechanism to perform two modes of operation: predictive and reactive. With link layer triggering, the handover performance of FPMIPv6 can be improved in the predictive mode. However, an unsuccessful predictive handover leads to activation of a reactive handover, in which MNs still experience long handover delays and a large amount of packet loss, significantly degrading the handover performance of FPMIPv6. Addressing this problem, this thesis presents an Enhanced Triggering Mechanism (ETM) for FPMIPv6 to form an enhanced FPMIPv6 (eFPMIPv6). The ETM reduces the most time-consuming processes in the reactive handover: the failed Handover Initiate (HO-Initiate) delay and the bidirectional tunnel establishment delay. Consequently, the overall handover performance of FPMIPv6 is enhanced in eFPMIPv6. To show the advantages of the proposed eFPMIPv6, a theoretical analysis is carried out to mathematically model the performance of PMIPv6, FPMIPv6 and eFPMIPv6. Extensive case studies are conducted to validate the effectiveness of the presented eFPMIPv6 mechanism, under various scenarios with changes in network link delay, traffic load, number of hops and MN moving velocity.
The case studies show that the proposed ETM reduces the reactive handover delay, and that the presented eFPMIPv6 outperforms PMIPv6 and FPMIPv6 in terms of overall handover performance.
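The flavour of such a delay decomposition can be sketched as follows; the message counts, link delays and tunnel-setup costs here are hypothetical placeholders, not the thesis's actual analytical model:

```python
def handover_delay(link_delay, hops, signaling_msgs, tunnel_setup=0.0):
    """Toy handover-delay model: each signaling message crosses `hops`
    links of `link_delay` seconds each, plus a fixed tunnel-setup cost.
    All parameter values below are invented for illustration."""
    return signaling_msgs * hops * link_delay + tunnel_setup

# Hypothetical comparison: fewer reactive-mode messages and an earlier
# tunnel establishment shrink the overall handover delay.
reactive = handover_delay(0.010, hops=2, signaling_msgs=6, tunnel_setup=0.040)
enhanced = handover_delay(0.010, hops=2, signaling_msgs=4, tunnel_setup=0.020)
```

Under this toy decomposition the enhanced handover is cheaper simply because it removes signaling round trips and part of the tunnel establishment from the critical path, which mirrors the qualitative argument of the abstract.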
Abstract:
In this paper we present an original approach for finding approximate nearest neighbours in collections of locality-sensitive hashes. The paper demonstrates that this approach makes high-performance nearest-neighbour searching feasible on Web-scale collections and commodity hardware with minimal degradation in search quality.
Abstract:
The proliferation of the web presents an unsolved problem: automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters. It does this on a single mid-range machine using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages respectively and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not been demonstrated previously. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing cluster quality where categorical labeling is unavailable or infeasible.
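The abstract does not detail the EM-tree algorithm itself; as a toy stand-in for expectation-maximisation-style iteration over compressed (bit-vector) document representations, Lloyd-style clustering under Hamming distance looks like this:

```python
def hamming(a, b):
    return bin(a ^ b).count("1")

def majority(members, nbits):
    # Bitwise-majority "mean" of a set of binary signatures.
    out = 0
    for b in range(nbits):
        if 2 * sum((m >> b) & 1 for m in members) > len(members):
            out |= 1 << b
    return out

def cluster(signatures, centroids, nbits, iters=5):
    """Toy EM-style loop over binary signatures (not the EM-tree itself,
    which organises centroids in a tree for scalability)."""
    assign = []
    for _ in range(iters):
        # E-step: assign each signature to the nearest centroid.
        assign = [min(range(len(centroids)),
                      key=lambda c: hamming(s, centroids[c]))
                  for s in signatures]
        # M-step: recompute each centroid from its members.
        for c in range(len(centroids)):
            members = [s for s, a in zip(signatures, assign) if a == c]
            if members:
                centroids[c] = majority(members, nbits)
    return assign, centroids
```

The tree-structured variant replaces the flat centroid scan with a descent through levels of centroids, which is what makes hundreds of thousands of clusters tractable on one machine.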
Abstract:
Email is rapidly replacing other forms of communication as the preferred means of communication between contracting parties. The recent decision of Stellard Pty Ltd v North Queensland Fuel Pty Ltd [2015] QSC 119 reinforces the judicial acceptance of email as an effective means of creating a binding agreement and the willingness to adopt a liberal concept of ‘signing’ in an electronic environment.
Abstract:
In this paper both documentary and natural proxy data have been used to improve the accuracy of palaeoclimatic knowledge in Finland since the 18th century. Early meteorological observations from Turku (1748-1800) were analyzed first as a potential source of climate variability. The reliability of the calculated mean temperatures was evaluated by comparing them with contemporary temperature records from Stockholm, St. Petersburg and Uppsala. The resulting monthly, seasonal and yearly mean temperatures from 1748 to 1800 were compared with the present-day mean values (1961-1990): the comparison suggests that the winters of the period 1749-1800 were 0.8 °C colder than today, while the summers were 0.4 °C warmer. Over the same period, springs were 0.9 °C and autumns 0.1 °C colder than today. Despite their uncertainties when compared with modern meteorological data, early temperature measurements offer direct and daily information about the weather for all months of the year, in contrast with other proxies. Secondly, early meteorological observations from Tornio (1737-1749) and Ylitornio (1792-1838) were used to study the temporal behaviour of the climate-tree growth relationship during the past three centuries in northern Finland. Analyses showed that the correlations between ring widths and mid-summer (July) temperatures did not vary significantly as a function of time. Early (June) and late summer (August) mean temperatures were secondary to mid-summer temperatures in controlling the radial growth. According to the dataset used, there was no clear signature of temporally reduced sensitivity of Scots pine ring widths to mid-summer temperatures over the periods of early and modern meteorological observations. Thirdly, plant phenological data together with tree-rings from south-west Finland since 1750 were examined as a palaeoclimate indicator.
The information from the fragmentary, partly overlapping, partly nonsystematically biased plant phenological records of 14 different phenomena was combined into one continuous time series of phenological indices. The indices were found to be reliable indicators of February to June temperature variations. In contrast, there was no correlation between the phenological indices and the precipitation data. Moreover, the correlations between the studied tree-rings and spring temperatures varied as a function of time; hence, their use in palaeoclimate reconstruction is questionable. The use of present tree-ring datasets for palaeoclimate purposes may become possible after the application of more sophisticated calibration methods. Climate variability since the 18th century is perhaps best seen in the fourth study, a multiproxy spring temperature reconstruction for south-west Finland. With the help of transfer functions, an attempt has been made to utilize both documentary and natural proxies. The reconstruction was verified with statistics showing a high degree of validity between the reconstructed and observed temperatures. According to the proxies and modern meteorological observations from Turku, springs have become warmer, featuring a warming trend since around the 1850s. Over the period from 1750 to around 1850, springs featured larger multidecadal low-frequency variability, as well as a smaller range of annual temperature variations. The coldest springtimes occurred around the 1840s and 1850s and in the first decade of the 19th century. Particularly warm periods occurred in the 1760s, 1790s, 1820s, 1930s, 1970s and from 1987 onwards, although cold springs also occurred in this period, such as those of 1994 and 1996. On the basis of the available material, long-term temperature changes have been related to changes in the atmospheric circulation, such as the North Atlantic Oscillation (February-June).
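The ring-width/temperature correlations discussed above are ordinary Pearson correlations computed over a calibration window; a minimal sketch (the two series below are invented for illustration):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Invented example: ring widths (mm) vs July mean temperatures (deg C).
ring_widths = [1.1, 0.8, 1.4, 0.9, 1.3]
july_temps = [14.2, 12.9, 15.1, 13.0, 14.8]
r = pearson(ring_widths, july_temps)
```

Testing whether such a correlation "varies as a function of time", as the thesis does, amounts to recomputing r over moving sub-windows of the overlap period and checking whether the values drift.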
Abstract:
Background: Multiple sclerosis (MS) is thought to be a T cell-mediated autoimmune disorder. MS pathogenesis is likely due to a genetic predisposition triggered by a variety of environmental factors. Epigenetics, particularly DNA methylation, provides a logical interface for environmental factors to influence the genome. In this study we aim to identify DNA methylation changes associated with MS in CD8+ T cells in 30 relapsing-remitting MS patients and 28 healthy blood donors using Illumina 450K methylation arrays. Findings: Seventy-nine differentially methylated CpGs were associated with MS. The methylation profile of CD8+ T cells was distinct from our previously published data on CD4+ T cells in the same cohort. Most notably, there was no major CpG effect at the MS risk gene HLA-DRB1 locus in the CD8+ T cells. Conclusion: CD8+ T cells and CD4+ T cells have distinct DNA methylation profiles. This case–control study highlights the importance of examining distinct cell subtypes when investigating epigenetic changes in MS and other complex diseases.
Abstract:
We present results of a signature-based search for new physics using a dijet plus missing transverse energy data sample collected in 2 fb-1 of p-pbar collisions at sqrt(s) = 1.96 TeV with the CDF II detector at the Fermilab Tevatron. We observe no significant event excess with respect to the standard model prediction and extract a 95% C.L. upper limit on the cross section times acceptance for a potential contribution from a non-standard model process. Based on this limit the mass of a first or second generation scalar leptoquark is constrained to be above 187 GeV/c^2.
Abstract:
In a search for new phenomena in a signature suppressed in the standard model of elementary particles (SM), we compare the inclusive production of events containing a lepton, a photon, significant transverse momentum imbalance (MET), and a jet identified as containing a b-quark, to SM predictions. The search uses data produced in proton-antiproton collisions at 1.96 TeV corresponding to 1.9 fb-1 of integrated luminosity taken with the CDF detector at the Fermilab Tevatron. We find 28 lepton+photon+MET+b events versus an expectation of 31.0+4.1/-3.5 events. If we further require events to contain at least three jets and large total transverse energy, simulations predict that the largest SM source is top-quark pair production with an additional radiated photon, ttbar+photon. In the data we observe 16 ttbar+photon candidate events versus an expectation from SM sources of 11.2+2.3/-2.1. Assuming the difference between the observed number and the predicted non-top-quark total is due to SM top quark production, we estimate the ttbar+photon cross section to be 0.15 +- 0.08 pb.
Abstract:
Amino acid sequences of proteinaceous proteinase inhibitors have been extensively analysed to derive information about the molecular evolution and functional relationships of these proteins. These sequences have been grouped into several well-defined families. It was found that the phylogeny constructed from the sequences of the exposed loop responsible for inhibition has several branches that resemble those obtained from comparisons using the entire sequence. The major branches of the unrooted tree corresponded to the families to which the inhibitors belonged. Further branching is related to the enzyme specificity of the inhibitor. Examination of the active site loop sequences of trypsin inhibitors revealed strong preferences for specific amino acids at different positions of the loop. These preferences are inhibitor class specific. Inhibitors active against more than one enzyme occur within a class and conform to the class-specific sequence in their loops. Hence, only a few positions in the loop seem to determine the specificity. The ability of inhibitors belonging to different classes to inhibit the same enzyme appears to be a result of convergent evolution.
Abstract:
Glioblastoma (GBM) is the most common and aggressive primary brain tumor, with very poor patient median survival. To identify a microRNA (miRNA) expression signature that can predict GBM patient survival, we analyzed the miRNA expression data of GBM patients (n = 222) derived from The Cancer Genome Atlas (TCGA) dataset. We divided the patients randomly into training and testing sets with an equal number in each group. We identified 10 significant miRNAs using Cox regression analysis on the training set and formulated a risk score based on the expression signature of these miRNAs that segregated the patients into high and low risk groups with significantly different survival times (hazard ratio [HR] = 2.4; 95% CI = 1.4-3.8; p < 0.0001). Of these 10 miRNAs, 7 were found to be risky miRNAs and 3 were found to be protective. This signature was independently validated in the testing set (HR = 1.7; 95% CI = 1.1-2.8; p = 0.002). GBM patients with high risk scores had overall poor survival compared to the patients with low risk scores. Overall survival among the entire patient set was 35.0% at 2 years, 21.5% at 3 years, 18.5% at 4 years and 11.8% at 5 years in the low risk group, versus 11.0%, 5.5%, 0.0% and 0.0% respectively in the high risk group (HR = 2.0; 95% CI = 1.4-2.8; p < 0.0001). Cox multivariate analysis with patient age as a covariate on the entire patient set identified the risk score based on the 10-miRNA expression signature as an independent predictor of patient survival (HR = 1.120; 95% CI = 1.04-1.20; p = 0.003). Thus we have identified a miRNA expression signature that can predict GBM patient survival. These findings may have implications for the understanding of gliomagenesis, the development of targeted therapy and the selection of high risk cancer patients for adjuvant therapy.
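A risk score of this kind is typically a linear combination of expression values weighted by Cox regression coefficients, with patients dichotomised at the median score. The sketch below uses made-up coefficients and three placeholder miRNA names (the paper's signature has 10 miRNAs and coefficients fitted to TCGA data):

```python
# Hypothetical coefficients: positive = risky miRNA, negative = protective.
COEFS = {"miR-A": 0.8, "miR-B": 0.5, "miR-C": -0.6}

def risk_score(expression):
    # Linear predictor: sum of coefficient * expression over the signature.
    return sum(COEFS[m] * expression[m] for m in COEFS)

def split_at_median(patients):
    """Map patient id -> 'high' or 'low' risk by the median risk score."""
    scores = {pid: risk_score(expr) for pid, expr in patients.items()}
    ordered = sorted(scores.values())
    median = ordered[len(ordered) // 2]
    return {pid: ("high" if s >= median else "low")
            for pid, s in scores.items()}
```

Survival in the resulting high and low groups is then compared with Kaplan-Meier curves and a log-rank or Cox test, which is where the reported hazard ratios come from.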
Abstract:
Damage detection by measuring and analyzing vibration signals in a machine component is an established procedure in mechanical and aerospace engineering. This paper presents vibration signature analysis of steel bridge structures in a nonconventional way using artificial neural networks (ANNs). Multilayer perceptrons have been adopted, using the back-propagation algorithm for network training. The training patterns, in terms of vibration signatures, are generated analytically for a moving load traveling on a trussed bridge structure at a constant speed, simulating the inspection vehicle. Using the finite-element technique, the moving forces are converted into stationary time-dependent force functions to generate vibration signals in the structure, and these signals are used to train the network. The performance of the trained networks is examined for their capability to detect damage from unknown signatures taken independently at one, three, and five nodes. It has been observed that prediction using the trained network with single-node signature measurement at a suitably chosen location is even better than that with three-node and five-node measurement data.
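A back-propagation multilayer perceptron of the kind described can be sketched minimally as follows; the two-feature "signatures" and single damage output below are illustrative stand-ins for the paper's node-level vibration signatures, not its actual network or data:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class TinyMLP:
    """Minimal one-hidden-layer perceptron trained with back-propagation."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.o = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.o

    def train_step(self, x, target, lr=0.3):
        o = self.forward(x)
        d_o = (o - target) * o * (1.0 - o)   # squared-error + sigmoid delta
        for j, h in enumerate(self.h):
            # Hidden delta uses the pre-update output weight.
            d_h = d_o * self.w2[j] * h * (1.0 - h)
            self.w2[j] -= lr * d_o * h
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * d_h * xi
            self.b1[j] -= lr * d_h
        self.b2 -= lr * d_o
```

In the paper's setting, the input vector would be a vibration signature sampled at one or more nodes and the output a damage indicator; training pairs come from finite-element simulations of the moving load.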
Abstract:
Optical UBVRI photometry and medium-resolution spectroscopy of the Type Ib supernova SN 2009jf, during the period from ~ −15 to +250 d with respect to the B maximum, are reported. The light curves are broad, with an extremely slow decline. The early post-maximum decline rate in the V band is similar to that of SN 2008D; however, the late-phase decline rate is slower than in other Type Ib supernovae studied. With an absolute magnitude of M_V = −17.96 ± 0.19 at peak, SN 2009jf is a normally bright supernova. The peak bolometric luminosity and the energy deposition rate via the ^56Ni → ^56Co chain indicate that ~ 0.17 +0.03/−0.03 M⊙ of ^56Ni was ejected during the explosion. The He I 5876 Å line is clearly identified in the first spectrum, at day ~ −15, at a velocity of ~ 16,000 km s^−1. The [O I] 6300-6364 Å line seen in the nebular spectrum has a multipeaked and asymmetric emission profile, with the blue peak being stronger. The estimated flux in this line implies that ≳ 1.34 M⊙ of oxygen was ejected. The slow evolution of the light curves of SN 2009jf indicates the presence of a massive ejecta. The high expansion velocity in the early phase and the broader emission lines during the nebular phase suggest an explosion with a large kinetic energy. A simple qualitative estimate leads to an ejecta mass of M_ej = 4-9 M⊙ and a kinetic energy of E_K = 3-8 × 10^51 erg. The ejected mass estimate is indicative of an initial main-sequence mass of ≳ 20-25 M⊙.
Abstract:
This paper is concerned with off-line signature verification. Four different types of pattern representation schemes have been implemented, viz., geometric features, moment-based representations, envelope characteristics and tree-structured wavelet features. The individual feature components in a representation are weighted by their pattern characterization capability using Genetic Algorithms. The conclusions of the four subsystems (each depending on a representation scheme) are combined to form a final decision on the validity of the signature. Threshold-based classifiers (including the traditional confidence-interval classifier), neighbourhood classifiers and their combinations were studied. The benefits of using forged signatures for training purposes have been assessed. Experimental results show that combination of the feature-based classifiers increases verification accuracy.
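The traditional confidence-interval classifier mentioned above accepts a questioned signature only when every feature falls inside a per-feature interval learned from genuine training signatures; a minimal sketch with invented two-feature signatures (the paper's features and GA-derived weights are not reproduced here):

```python
def fit_intervals(genuine, k=2.0):
    """Per-feature [mean - k*std, mean + k*std] intervals from genuine
    training signatures (each signature is a tuple of feature values)."""
    n = len(genuine)
    dims = len(genuine[0])
    intervals = []
    for d in range(dims):
        vals = [g[d] for g in genuine]
        mean = sum(vals) / n
        std = (sum((v - mean) ** 2 for v in vals) / n) ** 0.5
        intervals.append((mean - k * std, mean + k * std))
    return intervals

def accept(signature, intervals):
    # Accept only if every feature lies inside its confidence interval.
    return all(lo <= v <= hi for v, (lo, hi) in zip(signature, intervals))
```

The multiplier k trades false rejections of genuine signatures against false acceptances of forgeries, which is the threshold the paper tunes (and combines with other classifiers) to raise verification accuracy.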