Abstract:
This paper provides new results on efficient arithmetic on Jacobi quartic form elliptic curves, y^2 = dx^4 + 2ax^2 + 1. With recent bandwidth-efficient proposals, the arithmetic on Jacobi quartic curves became considerably faster than that of Weierstrass curves. These proposals use up to 7 coordinates to represent a single point. However, fast scalar multiplication algorithms based on windowing techniques precompute and store several points, which requires more space than it would with 3 coordinates. Also note that some of these proposals require d = 1 for full speed. Unfortunately, elliptic curves whose number of points is twice a prime cannot be written in Jacobi quartic form with d = 1. Even worse, the contemporary formulae may fail to output correct coordinates for some inputs. This paper provides improved speeds using fewer coordinates without causing the above-mentioned problems. For instance, our proposed point doubling algorithm takes only 2 multiplications, 5 squarings, and no multiplication with curve constants when d is arbitrary and a = ±1/2.
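As context for the curve shape above, here is a minimal sketch of the classic affine group law on y^2 = dx^4 + 2ax^2 + 1 (Billet-Joye style), not the paper's optimized projective 2M + 5S doubling; the toy field prime and curve constants are our own illustrative choices.

```python
# Sketch: affine group law on the Jacobi quartic y^2 = D*x^4 + 2*A*x^2 + 1
# over a prime field. This is the classic Billet-Joye-style affine addition,
# NOT the paper's optimized projective formulas; it only illustrates the
# curve shape and group structure. P, A, D below are arbitrary toy values.

P = 13            # toy field prime (illustrative only)
A, D = 1, 2       # curve: y^2 = D*x^4 + 2*A*x^2 + 1 (mod P)

def on_curve(pt):
    x, y = pt
    return (y * y - (D * x**4 + 2 * A * x * x + 1)) % P == 0

def add(p1, p2):
    """Affine addition; the identity is (0, 1) and -(x, y) = (-x, y)."""
    x1, y1 = p1
    x2, y2 = p2
    t = (x1 * x1 * x2 * x2) % P                # x1^2 * x2^2
    den = pow((1 - D * t) % P, -1, P)          # 1 / (1 - d*x1^2*x2^2)
    x3 = (x1 * y2 + y1 * x2) * den % P
    y3 = ((y1 * y2 + 2 * A * x1 * x2) * (1 + D * t)
          + 2 * D * x1 * x2 * (x1 * x1 + x2 * x2)) * den * den % P
    return (x3, y3)

def double(p):
    return add(p, p)

pt = (3, 5)                                    # lies on the toy curve mod 13
assert on_curve(pt)
assert on_curve(double(pt))                    # doubling stays on the curve
assert add(pt, (-pt[0] % P, pt[1])) == (0, 1)  # P + (-P) = identity
```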
Abstract:
This paper improves implementation techniques for Elliptic Curve Cryptography. We introduce new formulae and algorithms for the group law on Jacobi quartic, Jacobi intersection, Edwards, and Hessian curves. The proposed formulae and algorithms can save time in suitable point representations. To support our claims, a cost comparison is made with classic scalar multiplication algorithms using previous and current operation counts. Most notably, the best speeds are obtained from Jacobi quartic curves, which provide the fastest timings for most scalar multiplication strategies, benefiting from the proposed 2M + 5S + 1D point doubling and 7M + 3S + 1D point addition algorithms. Furthermore, the new addition algorithm provides an efficient way to protect against side channel attacks based on simple power analysis (SPA). Keywords: efficient elliptic curve arithmetic, unified addition, side channel attack.
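To make the operation counts concrete, here is a back-of-the-envelope cost model for a full scalar multiplication; the S ≈ 0.8M and D ≈ 0 weightings and the width-4 NAF recoding are our assumptions, not figures from the paper.

```python
# Back-of-the-envelope scalar-multiplication cost model (not from the paper).
# Assumptions: a k-bit scalar costs ~k doublings plus ~k/(w+1) additions with
# width-w NAF recoding, and 1S ~ 0.8M, 1D ~ 0M; these conventions are common
# in the literature, but the concrete weighting is ours, not the authors'.

M, S, D = 1.0, 0.8, 0.0   # relative field-operation costs (assumed)

def cost(ops):
    m, s, d = ops
    return m * M + s * S + d * D

DBL = (2, 5, 1)           # 2M + 5S + 1D Jacobi quartic doubling (from paper)
ADD = (7, 3, 1)           # 7M + 3S + 1D Jacobi quartic addition (from paper)

k, w = 256, 4             # 256-bit scalar, width-4 NAF
total = k * cost(DBL) + (k / (w + 1)) * cost(ADD)
print(f"~{total:.0f}M for one {k}-bit scalar multiplication")
```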
Abstract:
The role that heparanase plays during metastasis and angiogenesis in tumors makes it an attractive target for cancer therapeutics. Despite this enzyme's significance, most of the assays developed to measure its activity are complex. Moreover, they usually rely on labeling variable preparations of the natural substrate heparan sulfate, making comparisons across studies precarious. To overcome these problems, we have developed a convenient assay based on the cleavage of the synthetic heparin oligosaccharide fondaparinux. The assay measures the appearance of the disaccharide product of heparanase-catalyzed fondaparinux cleavage colorimetrically using the tetrazolium salt WST-1. Because this assay has a homogeneous substrate with a single point of cleavage, the kinetics of the enzyme can be reliably characterized, giving a Km of 46 μM and a kcat of 3.5 s^-1 with fondaparinux as substrate. The inhibition of heparanase by the published inhibitor PI-88 was also studied, and a Ki of 7.9 nM was determined. The simplicity and robustness of this method should not only greatly assist routine assay of heparanase activity but also allow adaptation for high-throughput screening of compound libraries, with the data generated being directly comparable across studies.
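For reference, the reported constants sit in the standard Michaelis-Menten framework (textbook enzymology, not a formula from the paper); the specificity constant below is simply derived from the quoted Km and kcat.

```latex
% Michaelis-Menten rate law; K_m and k_cat as reported in the abstract
v = \frac{k_{\mathrm{cat}}\,[E]_0\,[S]}{K_m + [S]},
\qquad
\frac{k_{\mathrm{cat}}}{K_m}
  = \frac{3.5\ \mathrm{s}^{-1}}{46\ \mu\mathrm{M}}
  \approx 7.6 \times 10^{4}\ \mathrm{M}^{-1}\mathrm{s}^{-1}
```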
Abstract:
Heparan sulfate mimetics, which we have called the PG500 series, have been developed to target the inhibition of both angiogenesis and heparanase activity. This series extends the technology underpinning PI-88, a mixture of highly sulfated oligosaccharides that reached Phase III clinical development for hepatocellular carcinoma. Advances in the chemistry of the PG500 series provide numerous advantages over PI-88. These new compounds are fully sulfated, single-entity oligosaccharides attached to a lipophilic moiety, which have been optimized for drug development. The rational design of these compounds has led to vast improvements in potency compared to PI-88, based on in vitro angiogenesis assays and in vivo tumor models. Based on these and other data, PG545 has been selected as the lead clinical candidate for oncology and is currently undergoing formal preclinical development as a novel treatment for advanced cancer.
Abstract:
The United Arab Emirates (UAE) is part of the geographic region known as the Middle East. With a land mass of 82,000 square kilometres, predominantly desert and mountains, it is bordered by Oman, Saudi Arabia, and the Arabian Gulf. The UAE is strategically located due to its proximity to other oil-rich Middle Eastern countries such as Kuwait, Iraq, Iran, and Saudi Arabia. The UAE was formed from a federation of seven emirates (Abu Dhabi, Dubai, Sharjah, Ras Al Khaimah, Ajman, Fujairah, and Umm Al Quwain) in December 1971 (Ras Al Khaimah did not join the federation until 1972) (Heard-Bey, 2004, 370). Abu Dhabi is the political capital and the richest emirate, while Dubai is the commercial centre. The majority of the population of the various Emirates lives along the coastline, as sources of fresh water often heavily influenced the siting of settlements. Unlike some near neighbours (Iran and Iraq), the UAE has not undergone any significant political instability since it was formed in 1971. Owing to early British influence, the UAE has had very strong political and economic ties, first with Britain and, more recently, with the United States of America (Rugh, 2007). Until the commercial production of oil in the early 1960s, the different Emirates survived on a mixture of primary industry (dates), farming (goats and camels), pearling, and subsidies from Britain (Davidson, 2005, 3; Hvidt, 2007, 565). Along with near neighbours Kuwait, Bahrain, Oman, Qatar, and Saudi Arabia, the UAE is part of the Gulf Cooperation Council (GCC), a trading bloc (Hellyer, 2001, 166-168).
Abstract:
The paper analyses the expected value of OD volumes from probes under three measurement-error models: fixed error, error proportional to zone size, and error inversely proportional to zone size. To add realism to the analysis, real trip ODs in the Tokyo Metropolitan Region are synthesised. The results show that for small zone coding with an average radius of 1.1 km and a fixed measurement error of 100 m, an accuracy of 70% can be expected. The equivalent accuracy for medium zone coding with an average radius of 5 km would translate into a fixed error of approximately 300 m. As expected, small zone coding is more sensitive than medium zone coding, as the chances of the probe error envelope falling into adjacent zones are higher. For the same error radii, error proportional to zone size would deliver a higher level of accuracy. As over half (54.8%) of trip ends start or end at zones with an equivalent radius of ≤1.2 km and only 13% of trip ends occur at zones with an equivalent radius of ≥2.5 km, measurement error that is proportional to zone size, such as that of a mobile phone, would deliver a higher level of accuracy. The synthesis of real ODs with different probe error characteristics has shown that an expected value of >85% is difficult to achieve for small zone coding with an average radius of 1.1 km. For most transport applications, an OD matrix at medium zone coding is sufficient for transport management. From this study it can be concluded that GPS, with an error range between 2 and 5 m, at medium zone coding (average radius of 5 km) would provide OD estimates greater than 90% of the expected value. However, for a typical mobile phone operating error range at medium zone coding, the expected value would be lower than 85%. This paper assumes transmission of one origin and one destination position from the probe. However, if multiple positions within the origin and destination zones are transmitted, map matching to the transport network could be performed, which would greatly improve the accuracy of the probe data.
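The zone-hit trade-off described above can be illustrated with a simple Monte Carlo experiment; this is our own disc-zone idealization, not the paper's synthesis of real Tokyo ODs, so the numbers will not reproduce the reported accuracies.

```python
# Illustrative Monte Carlo sketch (not the paper's method): estimate how
# often a probe position with a given error radius still falls inside its
# true zone, approximating zones as discs. All numbers are assumptions.
import math
import random

def hit_rate(zone_radius_m, error_m, trials=100_000):
    """P(reported position stays in the true zone), disc-zone approximation.

    True positions are uniform over the disc; the probe error is a uniform
    displacement within a disc of radius error_m in a random direction.
    """
    hits = 0
    for _ in range(trials):
        r = zone_radius_m * math.sqrt(random.random())   # uniform in disc
        theta = random.uniform(0, 2 * math.pi)
        e = error_m * math.sqrt(random.random())
        phi = random.uniform(0, 2 * math.pi)
        x = r * math.cos(theta) + e * math.cos(phi)
        y = r * math.sin(theta) + e * math.sin(phi)
        hits += (x * x + y * y) <= zone_radius_m ** 2
    return hits / trials

# Small zones (avg radius ~1.1 km) with 100 m fixed error, and medium
# zones (~5 km) with 300 m error, echoing the scenarios in the abstract.
print(hit_rate(1100, 100))   # small zone coding
print(hit_rate(5000, 300))   # medium zone coding
```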
Abstract:
This paper presents a model to estimate travel time using cumulative plots. Three cases are considered: i) case-Det, with detector data only; ii) case-DetSig, with detector data and signal controller data; and iii) case-DetSigSFR, with detector data, signal controller data, and saturation flow rate. The performance of the model for different detection intervals is evaluated. It is observed that the detection interval is not critical if signal timings are available. Comparable accuracy can be obtained from a larger detection interval with signal timings or from a shorter detection interval without signal timings. The performance for case-DetSig and case-DetSigSFR is consistent, with accuracy generally above 95%, whereas case-Det is highly sensitive to the signal phases in the detection interval and its performance is uncertain if the detection interval is an integral multiple of the signal cycle.
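As a reminder of the underlying idea (generic cumulative-plot reasoning, not the paper's estimation model), travel times fall out of cumulative counts at an upstream and a downstream detector; the counts below are invented toy data.

```python
# Generic cumulative-plots illustration (not the paper's model): under FIFO,
# the n-th vehicle's travel time is the horizontal gap between the upstream
# cumulative count A(t) and the downstream cumulative count D(t) at height n.
# Timestamps below are made-up toy data (seconds).

upstream = [0, 4, 9, 15, 22, 30]       # time when A(t) reaches 1, 2, ...
downstream = [35, 41, 50, 58, 66, 75]  # time when D(t) reaches 1, 2, ...

travel_times = [d - u for u, d in zip(upstream, downstream)]
avg = sum(travel_times) / len(travel_times)
print(travel_times, f"average {avg:.1f} s")
```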
Abstract:
Vehicle detectors are installed approximately every 300 meters on each lane of the Tokyo Metropolitan Expressway. Various traffic data, such as traffic volume, average speed, and time occupancy, are collected by these detectors. The traffic characteristics of each point can be understood by comparing data collected at consecutive points. In this study, we focused on average speed, analyzed road potential in terms of operating speed during free-flow conditions, and identified latent bottlenecks. Furthermore, we analyzed the effects of rainfall level and day of the week on road potential. This method of analysis is expected to be useful for the deployment of ITS services such as driving assistance, for the estimation of traffic simulation parameters, and as feedback to road design for congestion countermeasures.
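A minimal sketch of how consecutive-detector comparisons can flag latent bottlenecks; the speed-drop threshold and all detector readings are invented for illustration and are not the authors' method.

```python
# Illustrative sketch (not the authors' exact method): flag a latent
# bottleneck where the free-flow operating speed drops sharply from one
# detector to the next. Detector spacing ~300 m as in the abstract;
# speeds below are invented toy data (km/h).
free_flow_speed = {  # detector position (km) -> free-flow speed
    0.0: 82, 0.3: 81, 0.6: 79, 0.9: 68, 1.2: 66, 1.5: 80,
}

DROP_THRESHOLD = 8  # km/h; assumed threshold, tune to the data

positions = sorted(free_flow_speed)
for up, down in zip(positions, positions[1:]):
    drop = free_flow_speed[up] - free_flow_speed[down]
    if drop >= DROP_THRESHOLD:
        print(f"possible latent bottleneck between {up} km and {down} km "
              f"(speed drop {drop} km/h)")
```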
Abstract:
Tzeng et al. proposed a new threshold multi-proxy multi-signature scheme with threshold verification. In their scheme, a subset of original signers authenticates a designated proxy group to sign on behalf of the original group. A message m has to be signed by a subset of proxy signers who can represent the proxy group. Then, the proxy signature is sent to the verifier group. A subset of verifiers in the verifier group can likewise represent that group to authenticate the proxy signature. Subsequently, two improved schemes were proposed to eliminate the security leak in Tzeng et al.'s scheme. In this paper, we point out the security leakage of all three schemes and further propose a novel threshold multi-proxy multi-signature scheme with threshold verification.
Abstract:
A strong designated verifier signature scheme makes it possible for a signer to convince a designated verifier that she has signed a message, in such a way that the designated verifier cannot transfer the signature to a third party, and no third party can even verify the validity of a designated verifier signature. We show that anyone who intercepts one signature can verify subsequent signatures in the Zhang-Mao ID-based designated verifier signature scheme and the Lal-Verma ID-based designated verifier proxy signature scheme. We propose a new and efficient ID-based designated verifier signature scheme that is strong and unforgeable. As a direct corollary, we also obtain a new efficient ID-based designated verifier proxy signature scheme.
Abstract:
This fascinating handbook defines how knowledge contributes to social and economic life, and vice versa. It considers the five areas critical to acquiring a comprehensive understanding of the knowledge economy: the nature of the knowledge economy; social, cooperative, cultural, creative, ethical and intellectual capital; knowledge and innovation systems; policy analysis for knowledge-based economies; and knowledge management. In presenting the outcomes of an important body of research, the handbook enables knowledge policy and management practitioners to be more systematically guided in their thinking and actions. The contributors cover a wide disciplinary spectrum in an accessible way, presenting concise, to-the-point discussions of critical concepts and practices that will enable practitioners to make effective research, managerial and policy decisions. They also highlight important new areas of concern to knowledge economies, such as wisdom, ethics, language and creative economies, which are largely overlooked.
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection for messages. AE is potentially more efficient than the two-step process of encrypting a message for confidentiality and, in a separate pass, generating a Message Authentication Code (MAC) tag for integrity protection. This paper presents results of the analysis of three AE stream ciphers submitted to the recently completed eSTREAM competition. We classify the ciphers based on the methods they use to provide authenticated encryption and discuss possible methods for mounting attacks on these ciphers.
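To make the one-pass vs. two-pass distinction concrete, here is a hedged sketch contrasting a standard one-pass AE mode with the generic Encrypt-then-MAC composition; it uses AES-GCM and AES-CTR + HMAC as stand-ins, not the eSTREAM stream ciphers analysed in the paper, and assumes the third-party cryptography package.

```python
# Contrast sketch (generic constructions, not the ciphers analysed in the
# paper): one-pass authenticated encryption vs. the two-step Encrypt-then-MAC
# composition. Requires: pip install cryptography
import hashlib
import hmac
import os

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

msg = b"attack at dawn"

# One-pass AE: a single key and a single pass yield ciphertext plus tag.
ae_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ct_and_tag = AESGCM(ae_key).encrypt(nonce, msg, None)

# Two-step composition: encrypt with AES-CTR, then MAC the ciphertext in a
# separate pass under an independent key (HMAC-SHA256).
enc_key, mac_key = os.urandom(16), os.urandom(32)
ctr_nonce = os.urandom(16)
enc = Cipher(algorithms.AES(enc_key), modes.CTR(ctr_nonce)).encryptor()
ct = enc.update(msg) + enc.finalize()
tag = hmac.new(mac_key, ct, hashlib.sha256).digest()

print(len(ct_and_tag), len(ct) + len(tag))  # both deliver ciphertext + tag
```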
Abstract:
In this paper, we discuss our participation in the INEX 2008 Link-the-Wiki track. We utilized a sliding-window-based algorithm to extract frequent terms and phrases. Using the extracted phrases and terms as descriptive vectors, the anchors and relevant links (both incoming and outgoing) are recognized efficiently.
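A minimal sketch of the sliding-window idea as we read it from the abstract; the window sizes, threshold, and helper names are our own, since the submission's exact algorithm is not given here.

```python
# Sketch of sliding-window phrase extraction (our reading of the abstract,
# not the track submission's exact algorithm). A window slides over the
# token stream, every contained n-gram is counted, and frequent n-grams
# become candidate anchor phrases.
from collections import Counter

def frequent_phrases(tokens, max_n=3, min_count=2):
    counts = Counter()
    for n in range(1, max_n + 1):                 # window sizes 1..max_n
        for i in range(len(tokens) - n + 1):      # slide the window
            counts[tuple(tokens[i:i + n])] += 1
    return [(" ".join(p), c) for p, c in counts.most_common() if c >= min_count]

text = ("the link the wiki track evaluates link discovery "
        "the wiki track uses wikipedia articles").split()
print(frequent_phrases(text))
```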