358 results for KAM curve


Relevance:

10.00%

Publisher:

Abstract:

This paper introduces fast algorithms for performing group operations on twisted Edwards curves, pushing the recent speed limits of Elliptic Curve Cryptography (ECC) forward in a wide range of applications. Notably, for suitably selected curve constants, the new addition algorithm requires fewer field operations than the fastest point addition algorithms for (twisted) Edwards curves stated in the literature. It is also shown that the new addition algorithm can be implemented with four processors, an effective speed increase by the full factor of 4 over the sequential case. Our results allow faster implementation of elliptic curve scalar multiplication. In addition, the new point addition algorithm can be used to provide natural protection from side channel attacks based on simple power analysis (SPA).
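The SPA protection mentioned above comes from the addition law being unified: one formula handles both P+Q and P+P. The sketch below shows the standard affine twisted Edwards addition law over a toy prime field; the field size and curve constants are assumptions for illustration, not the paper's optimized formulas.

```python
# Affine unified addition on a twisted Edwards curve a*x^2 + y^2 = 1 + d*x^2*y^2
# over F_p. Toy parameters (assumed, not the paper's curve); the point is that
# the same formula serves addition and doubling, which is what makes SPA
# countermeasures natural.
p, a, d = 1009, 1009 - 1, 2           # small prime field, a = -1 mod p

def inv(x):                           # modular inverse via Fermat's little theorem
    return pow(x, p - 2, p)

def on_curve(P):
    x, y = P
    return (a * x * x + y * y) % p == (1 + d * x * x * y * y) % p

def add(P, Q):                        # unified addition law (also works when P == Q)
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + y1 * x2) * inv(1 + t) % p
    y3 = (y1 * y2 - a * x1 * x2) * inv(1 - t) % p
    return (x3, y3)

O = (0, 1)                            # neutral element
P = next((x, y) for x in range(1, p) for y in range(p)
         if on_curve((x, y)))         # brute-force any affine point with x != 0
assert add(P, O) == P                 # identity behaves correctly
assert on_curve(add(P, P))            # doubling uses the very same formula
```

Because doubling and addition execute the same instruction sequence, a power trace does not reveal which of the two occurred, which is the basis of the SPA resistance.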


This paper provides new results about efficient arithmetic on Jacobi quartic form elliptic curves, y^2 = dx^4 + 2ax^2 + 1. With recent bandwidth-efficient proposals, the arithmetic on Jacobi quartic curves became significantly faster than that of Weierstrass curves. These proposals use up to 7 coordinates to represent a single point. However, fast scalar multiplication algorithms based on windowing techniques precompute and store several points, which requires more space than a 3-coordinate representation. Also note that some of these proposals require d = 1 for full speed. Unfortunately, elliptic curves having 2-times-a-prime number of points cannot be written in Jacobi quartic form if d = 1. Even worse, the contemporary formulae may fail to output correct coordinates for some inputs. This paper provides improved speeds using fewer coordinates without causing the above-mentioned problems. For instance, our proposed point doubling algorithm takes only 2 multiplications, 5 squarings, and no multiplication by curve constants when d is arbitrary and a = ±1/2.


This paper improves implementation techniques of Elliptic Curve Cryptography. We introduce new formulae and algorithms for the group law on Jacobi quartic, Jacobi intersection, Edwards, and Hessian curves. The proposed formulae and algorithms can save time in suitable point representations. To support our claims, a cost comparison is made with classic scalar multiplication algorithms using previous and current operation counts. Most notably, the best speeds are obtained from Jacobi quartic curves, which provide the fastest timings for most scalar multiplication strategies, benefiting from the proposed 12M + 5S + 1D point doubling and 7M + 3S + 1D point addition algorithms. Furthermore, the new addition algorithm provides an efficient way to protect against side channel attacks which are based on simple power analysis (SPA). Keywords: efficient elliptic curve arithmetic, unified addition, side channel attack.
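Operation counts like those above translate into scalar multiplication cost through the double-and-add structure. The sketch below makes that explicit: integers under addition stand in for curve points so the result is checkable, and the S ≈ 0.8M weighting is a common rule of thumb assumed here, not a figure from the paper.

```python
# Left-to-right double-and-add scalar multiplication over a generic group,
# plus a field-operation cost estimate using the 12M+5S+1D doubling and
# 7M+3S+1D addition counts quoted in the abstract (S weighted as 0.8M and
# D ignored; both weightings are assumptions for illustration).
def double_and_add(k, P, add, dbl):
    bits = bin(k)[2:]
    R = P                                  # most significant bit is always 1
    for b in bits[1:]:
        R = dbl(R)                         # one doubling per remaining bit
        if b == '1':
            R = add(R, P)                  # one addition per set bit
    return R

def field_cost(k, M=1.0, S=0.8, D=0.0):
    """Estimated cost of k*P in field multiplications."""
    bits = bin(k)[2:]
    return ((len(bits) - 1) * (12 * M + 5 * S + 1 * D)
            + (bits.count('1') - 1) * (7 * M + 3 * S + 1 * D))

# Stand-in group: integers under addition, so k*P is literally k*P.
k = 0b1011001                              # k = 89
assert double_and_add(k, 5, lambda a, b: a + b, lambda a: 2 * a) == k * 5
cost = field_cost(k)                       # rough cost in multiplications
```

The count shows why doubling cost dominates: every bit of the scalar incurs a doubling, while only set bits incur an addition.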


This paper presents efficient formulas for computing cryptographic pairings on the curve y^2 = cx^3 + 1 over fields of large characteristic. We provide examples of pairing-friendly elliptic curves of this form which are of interest for efficient pairing implementations.


Building Information Modelling (BIM) is an IT-enabled technology that allows storage, management, sharing, access, update and use of all the data relevant to a project throughout the project life-cycle in the form of a data repository. BIM enables improved inter-disciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. While the technology itself may not be new, and similar approaches have been in use in other sectors such as the aircraft and automobile industries for well over a decade, the AEC/FM (Architecture, Engineering and Construction/Facilities Management) industry has yet to catch up in its ability to exploit the benefits of the IT revolution. Though the potential benefits of the technology in terms of knowledge sharing, project management, project co-ordination and collaboration are all but obvious, the adoption rate has been rather lethargic, in spite of some well-directed efforts and the availability of supporting commercial tools. Since the technology itself has been well tested over the years in other domains, the plausible causes must be rooted well beyond the explanation of the ‘Bell Curve of innovation adoption’. This paper discusses the preliminary findings of an ongoing research project funded by the Cooperative Research Centre for Construction Innovation (CRC-CI), which aims to identify these gaps and produce specifications and guidelines to enable greater adoption of the BIM approach in practice. A detailed literature review is conducted that looks at similar research reported in recent years.
A desktop audit of some of the existing commercial tools that support BIM application was conducted to identify the technological issues and concerns, and a workshop was organized with industry partners and various players in the AEC industry for needs analysis, expectations and feedback on the possible deterrents and inhibitions surrounding BIM adoption.


Two-stroke outboard boat engines using total-loss lubrication deposit a significant proportion of their lubricant and fuel directly into the water. The purpose of this work is to document the velocity and concentration field characteristics of a submerged swirling water jet emanating from a propeller in order to provide information on its fundamental characteristics. Measurements of the velocity and concentration fields were performed in a turbulent jet generated by a model boat propeller (0.02 m diameter) operating at 1500 rpm and 3000 rpm. The measurements were carried out in the Zone of Established Flow, up to 50 propeller diameters downstream of the propeller. Both the mean axial velocity profile and the mean concentration profile showed self-similarity. Further, the standard deviation growth curve was linear. The effects of propeller speed and dye release location were also investigated.
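The linear growth of the profile standard deviation can be checked numerically: compute the second moment of each radial concentration profile and fit a straight line against downstream distance. The data below are synthetic self-similar Gaussian profiles with an assumed spreading rate, for illustration only.

```python
import numpy as np

def profile_sigma(r, C):
    """Standard deviation of a radial concentration profile (second moment)."""
    mean = np.average(r, weights=C)
    return np.sqrt(np.average((r - mean) ** 2, weights=C))

z = np.linspace(10, 50, 9)       # downstream distance in propeller diameters
r = np.linspace(-30, 30, 3001)   # radial coordinate
# Self-similar Gaussian profiles with an assumed linear spreading sigma = 0.11*z
sigmas = [profile_sigma(r, np.exp(-r**2 / (2 * (0.11 * zi) ** 2))) for zi in z]
rate, intercept = np.polyfit(z, sigmas, 1)   # least-squares line sigma = rate*z + b
```

A linear fit with near-zero intercept recovering the input rate is exactly the "standard deviation growth curve was linear" observation in miniature.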


This review explores the question of whether chemometrics methods enhance the performance of electroanalytical methods. Electroanalysis has long benefited from well-established techniques such as potentiometric titrations, polarography and voltammetry, and from more novel ones such as electronic tongues and noses, which have enlarged the scope of applications. Electroanalytical methods have been improved by the application of chemometrics for the simultaneous quantitative prediction of analytes or the qualitative resolution of complex overlapping responses. Typical methods include partial least squares (PLS), artificial neural networks (ANNs), and multivariate curve resolution methods (MCR-ALS, N-PLS and PARAFAC). This review aims to provide the practising analyst with a broad guide to electroanalytical applications supported by chemometrics. In this context, after a general consideration of the use of a number of electroanalytical techniques with the aid of chemometrics methods, several overviews follow, each focusing on an important field of application such as food, pharmaceuticals, pesticides and the environment. The growth of chemometrics in conjunction with electronic tongue and nose sensors is highlighted, followed by an overview of the use of chemometrics for the resolution of complicated profiles for qualitative identification of analytes, especially with the MCR-ALS methodology. Finally, the performance of electroanalytical methods is compared with that of some spectrophotometric procedures on the basis of figures of merit. This showed that electroanalytical methods can perform as well as the spectrophotometric ones. PLS-1 appears to be the method of practical choice if a relative prediction error of approximately ±10% is acceptable.
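For readers unfamiliar with PLS-1, the core of the NIPALS algorithm fits in a few lines of numpy. The sketch below uses synthetic "sensor responses" X and a single analyte response y (assumed data); with all components retained and noise-free y, PLS-1 reproduces the least-squares fit, which makes the example self-checking.

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS-1 regression via the NIPALS deflation scheme (numpy-only sketch)."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight = X/y covariance direction
        t = Xc @ w                        # scores
        p = Xc.T @ t / (t @ t)            # X loadings
        W.append(w); P.append(p); q.append(yc @ t / (t @ t))
        Xc = Xc - np.outer(t, p)          # deflate X
        yc = yc - q[-1] * t               # deflate y
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # regression vector in original X space
    return B, X.mean(0), y.mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))                              # synthetic responses
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0, 0.0]) + 3.0   # noise-free analyte signal
B, xm, ym = pls1_fit(X, y, n_comp=6)
pred = (X - xm) @ B + ym
```

In practice one would retain far fewer components than variables and choose the number by cross-validation; retaining all of them here simply makes the fit exact and verifiable.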


Interactions of small molecules with biopolymers, e.g. bovine serum albumin (BSA), are important, and significant information is recorded in the UV–vis and fluorescence spectra of their reaction mixtures. The extraction of this information is difficult by conventional means, principally because the spectra of the three analytes in the mixture overlap significantly. The interaction of berberine chloride (BC) and the BSA protein provides an interesting example of such complex systems. UV–vis and fluorescence spectra of BC and BSA mixtures were investigated in pH 7.4 Tris–HCl buffer at 37 °C. Two sample series were measured by each technique: (1) [BSA] was kept constant and [BC] was varied, and (2) [BC] was kept constant and [BSA] was varied. This produced four spectral data matrices, which were combined into one expanded spectral matrix and processed by the multivariate curve resolution–alternating least squares (MCR-ALS) method. The results produced: (1) the pure BC, BSA and BC–BSA complex spectra, extracted from the heavily overlapping measured responses; (2) the concentration profiles of BC, BSA and the BC–BSA complex, which are difficult to obtain by conventional means; and (3) estimates of the number of binding sites of BC.
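MCR-ALS itself is conceptually simple: factor the data matrix D into non-negative concentration profiles C and pure spectra S by alternating least-squares steps. The sketch below resolves a synthetic two-component titration (the peak positions and profiles are assumptions standing in for the BC/BSA system, not the measured spectra).

```python
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(0, 1, 120)                            # "wavelength" axis
S_true = np.vstack([np.exp(-(wl - 0.3) ** 2 / 0.01),   # pure spectrum 1
                    np.exp(-(wl - 0.7) ** 2 / 0.02)])  # pure spectrum 2
C_true = np.column_stack([np.linspace(1.0, 0.1, 25),   # concentration profiles
                          np.linspace(0.1, 1.0, 25)])
D = C_true @ S_true                                    # measured data matrix

# MCR-ALS: alternate least-squares solves for C and S, clipping negatives
# to enforce the non-negativity constraint on spectra and concentrations.
S = np.clip(S_true + 0.3 * rng.normal(size=S_true.shape), 0, None)  # rough init
for _ in range(200):
    C = np.clip(D @ np.linalg.pinv(S), 0, None)        # LS step for C, then clip
    S = np.clip(np.linalg.pinv(C) @ D, 0, None)        # LS step for S, then clip
resid = np.linalg.norm(D - C @ S) / np.linalg.norm(D)  # relative lack of fit
```

Real applications add further constraints (closure, unimodality) and, as in the study above, gain resolving power by augmenting several data matrices into one expanded matrix before the same alternating steps.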


The interaction of quercetin, a bioflavonoid, with bovine serum albumin (BSA) was investigated under pseudo-physiological conditions by UV–vis spectrometry, spectrofluorimetry and cyclic voltammetry (CV). These studies indicated a cooperative interaction between the quercetin–BSA complex and warfarin, which produced a ternary complex, quercetin–BSA–warfarin. It was found that both quercetin and warfarin were located in site I. However, the spectra of these three components overlapped, and the chemometrics method multivariate curve resolution–alternating least squares (MCR-ALS) was applied to resolve them. The resolved spectra of quercetin–BSA and warfarin agreed well with their measured spectra, and importantly, the spectrum of the quercetin–BSA–warfarin complex was extracted. These results allowed the rationalization of the behaviour of the overlapping spectra. At lower concentrations ([warfarin] < 1 × 10⁻⁵ mol L⁻¹), most of the site marker reacted with the quercetin–BSA complex, but free warfarin was present at higher concentrations. Interestingly, the ratio between quercetin–BSA and warfarin was found to be 1:2, suggesting a quercetin–BSA–(warfarin)₂ complex, with an estimated equilibrium constant of 1.4 × 10¹¹ M⁻². The results suggest that at low concentrations warfarin binds at the high-affinity sites (HAS), while the low-affinity binding sites (LAS) are occupied at higher concentrations.
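The reported constant and stoichiometry can be sanity-checked with a back-of-envelope mass-action calculation. Treating [warfarin] as the free concentration is a simplifying assumption made here for illustration.

```python
# For QB + 2W <=> QB(W)2 with the abstract's K = 1.4e11 M^-2, the fraction of
# quercetin-BSA bound as the ternary complex is K[W]^2 / (1 + K[W]^2),
# assuming [W] approximates the free warfarin concentration.
K = 1.4e11                        # M^-2, equilibrium constant from the abstract

def bound_fraction(warfarin):
    """Fraction of quercetin-BSA present as the (warfarin)2 ternary complex."""
    r = K * warfarin ** 2         # ratio [QB(W)2] / [QB]
    return r / (1 + r)

assert bound_fraction(1e-5) > 0.9     # near saturation above ~1e-5 mol/L
assert bound_fraction(1e-6) < 0.2     # mostly uncomplexed below it
```

The steep switch around 10⁻⁵ mol L⁻¹ is consistent with the concentration threshold the study reports for the appearance of free warfarin.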


Cognitive-energetical theories of information processing were used to generate predictions regarding the relationship between workload and fatigue within and across consecutive days of work. Repeated measures were taken on board a naval vessel during a non-routine and a routine patrol. Data were analyzed using growth curve modeling. Fatigue demonstrated a non-monotonic relationship with time of day in both patrols: fatigue was high at midnight, decreased until noon, and then increased again. Fatigue increased across days towards the end of the non-routine patrol, but remained stable across days in the routine patrol. The relationship between workload and fatigue changed over consecutive days in the non-routine patrol: at the beginning of the patrol, low workload was associated with fatigue; at the end, high workload was associated with fatigue. This relationship could not be tested in the routine patrol; however, that patrol demonstrated a non-monotonic relationship between workload and fatigue, with both low and high workloads associated with the highest fatigue. These results suggest that the optimal level of workload can change over time, and thus have implications for the management of fatigue.
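The within-day non-monotonic pattern is the kind of shape a quadratic time term captures in a growth curve model. The sketch below fits a quadratic to synthetic fatigue scores (assumed, not the study's data) and recovers the time of minimum fatigue.

```python
import numpy as np

# Assumed U-shaped within-day fatigue: high at midnight, minimum near noon,
# rising again in the evening.
hours = np.arange(0.0, 24.0)
fatigue = 5.0 + 0.02 * (hours - 12.0) ** 2

c2, c1, c0 = np.polyfit(hours, fatigue, 2)   # quadratic growth-curve term
turning_point = -c1 / (2.0 * c2)             # estimated time of minimum fatigue
```

A positive quadratic coefficient with a turning point near noon is the numerical signature of the "high at midnight, decreasing until noon, then increasing" finding.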


It is important to detect and treat malnutrition in hospital patients so as to improve clinical outcomes and reduce hospital stay. The aim of this study was to develop and validate a nutrition screening tool with a simple and quick scoring system for acute hospital patients in Singapore. In this study, 818 newly admitted patients aged over 18 years were screened using five parameters that contribute to the risk of malnutrition. A dietitian blinded to the nutrition screening score assessed the same patients within 48 hours using the reference standard, Subjective Global Assessment (SGA). Sensitivity and specificity were established using the Receiver Operating Characteristic (ROC) curve, and the best cutoff scores were determined. The parameter combination with the largest Area Under the ROC Curve (AUC) was chosen as the final screening tool, named 3-Minute Nutrition Screening (3-MinNS). The combination of weight loss, intake and muscle wastage (3-MinNS) gave the largest AUC when compared with SGA. Using 3-MinNS, the best cutoff point to identify malnourished patients is three (sensitivity 86%, specificity 83%), and the cutoff score to identify subjects at risk of severe malnutrition is five (sensitivity 93%, specificity 86%). 3-Minute Nutrition Screening is a valid, simple and rapid tool to identify patients at risk of malnutrition in Singapore acute hospitals, and is able to differentiate patients at risk of moderate and severe malnutrition for prioritization and management purposes.
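Choosing a cutoff from an ROC curve can be illustrated in a few lines. The scores and labels below are toy data (assumed, not the study's 818 patients), and the cutoff is picked by maximizing Youden's index (sensitivity + specificity − 1), one common criterion for the "best" ROC point.

```python
import numpy as np

def roc_points(scores, labels):
    """Sensitivity and specificity for every cutoff 'score >= c flags risk'."""
    cuts = np.unique(scores)
    pos, neg = labels == 1, labels == 0
    sens = np.array([(scores >= c)[pos].mean() for c in cuts])
    spec = np.array([(scores < c)[neg].mean() for c in cuts])
    return cuts, sens, spec

# Toy screening scores on a 0-9 scale; label 1 = malnourished by the
# reference standard (illustrative values only).
scores = np.array([0, 1, 2, 2, 3, 3, 4, 5, 5, 6, 7, 8])
labels = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

cuts, sens, spec = roc_points(scores, labels)
best = cuts[np.argmax(sens + spec - 1)]   # Youden's index selects the cutoff
```

Sweeping all cutoffs and reporting sensitivity/specificity at the chosen one mirrors how the 3-MinNS cutoffs of three and five were validated against SGA.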


Distracted is a luminous, interactive, computational media installation of sound, light and translucent sculptural materials. The work is inspired by scientific ice core samples taken in Antarctica. The sculpture is capable of displaying data taken from these ice core samples, and responding to the proximity of an audience. Rather than simply using the interface as a didactic display device, we have chosen a more poetic approach of generating visual effects from the data that are evocative of the ice, fluids and the notion of change. The data has also been used in the composition of an evolving soundscape. As well as data from ice core samples, such as the Vostok ice core, we have incorporated data from the Keeling Curve that shows the annual rise and fall of atmospheric carbon dioxide, following the pattern of the Northern Hemisphere winter. These effects combine with changes caused directly by audience members as they come within close proximity to the work.


The wide range of contributing factors and circumstances surrounding crashes on road curves suggests that no single intervention can prevent these crashes. This paper presents a novel methodology, based on data mining techniques, to identify contributing factors, the relationships between them, and how they influence the risk of a crash. Incident records, described using free text, from a large insurance company were analysed with rough set theory. Rough set theory was used to discover dependencies among the data and to reason with the vague, uncertain and imprecise information that characterised the insurance dataset. The results show that male drivers between 50 and 59 years old, driving during evening peak hours and involved in a collision, had the lowest crash risk. Drivers between 25 and 29 years old, driving from around midnight to 6 am in a new car, had the highest risk. The analysis of the most significant contributing factors on curves suggests that drivers with 25 to 42 years of driving experience who are driving a new vehicle have the highest crash-cost risk, characterised by the vehicle running off the road and hitting a tree. This research complements existing statistically based approaches to analysing road crashes. Our data mining approach is supported by proven theory and will allow road safety practitioners to understand the dependencies between contributing factors and crash types, with a view to designing tailored countermeasures.
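The rough set machinery behind this analysis reduces to two set operations: the lower approximation (records certainly in a concept) and the upper approximation (records possibly in it), taken over blocks of records that are indiscernible on the chosen attributes. The records below are illustrative stand-ins, not the insurance data.

```python
from collections import defaultdict

def approximations(records, attrs, target):
    """Rough-set lower/upper approximation of 'target' (a set of record
    indices) under the indiscernibility relation induced by 'attrs'."""
    blocks = defaultdict(set)
    for i, rec in enumerate(records):
        blocks[tuple(rec[a] for a in attrs)].add(i)   # indiscernibility classes
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:
            lower |= block        # block lies certainly inside the concept
        if block & target:
            upper |= block        # block possibly belongs to the concept
    return lower, upper

# Toy crash records (attribute values assumed for illustration).
records = [
    {"age": "50-59", "time": "evening", "vehicle": "old", "severe": False},
    {"age": "25-29", "time": "night",   "vehicle": "new", "severe": True},
    {"age": "25-29", "time": "night",   "vehicle": "new", "severe": False},
    {"age": "25-29", "time": "day",     "vehicle": "new", "severe": False},
]
severe = {i for i, r in enumerate(records) if r["severe"]}
lower, upper = approximations(records, ["age", "time", "vehicle"], severe)
```

Here records 1 and 2 are indiscernible yet have different outcomes, so the lower approximation is empty while the upper contains both: exactly the vagueness rough sets are designed to expose in crash data.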


The ability to forecast machinery failure is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models for forecasting machinery health based on condition data. Although these models have aided the advancement of the discipline, they have made only a limited contribution to developing an effective machinery health prognostic system. The literature review indicates that there is not yet a prognostic model that directly models and fully utilises suspended condition histories (which are very common in practice, since organisations rarely allow their assets to run to failure); that effectively integrates population characteristics into prognostics for longer-range prediction in a probabilistic sense; which deduces the non-linear relationship between measured condition data and actual asset health; and which involves minimal assumptions and requirements. This work presents a novel approach to addressing the above-mentioned challenges. The proposed model consists of a feed-forward neural network, the training targets of which are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density estimator. The adapted Kaplan-Meier estimator is able to model the actual survival status of individual failed units and estimate the survival probability of individual suspended units. The degradation-based failure probability density estimator, on the other hand, extracts population characteristics and computes conditional reliability from available condition histories instead of from reliability data. The estimated survival probability and the relevant condition histories are respectively presented as “training target” and “training input” to the neural network. The trained network is capable of estimating the future survival curve of a unit when a series of condition indices is input.
Although the concept proposed may be applied to the prognosis of various machine components, rolling element bearings were chosen as the research object because rolling element bearing failure is one of the foremost causes of machinery breakdowns. Computer-simulated and industry case study data were used to compare the prognostic performance of the proposed model and four control models, namely: two feed-forward neural networks with the same training function and structure as the proposed model but neglecting suspended histories; a time series prediction recurrent neural network; and a traditional Weibull distribution model. The results support the assertion that the proposed model performs better than the other four models and that it produces adaptive prediction outputs with useful representation of survival probabilities. This work presents a compelling concept for non-parametric data-driven prognosis, and for utilising available asset condition information more fully and accurately. It demonstrates that machinery health can indeed be forecasted. The proposed prognostic technique, together with ongoing advances in sensors and data-fusion techniques, and increasingly comprehensive databases of asset condition data, holds the promise of increased asset availability, maintenance cost effectiveness, operational safety and, ultimately, organisation competitiveness.
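The classical Kaplan-Meier step that the adapted estimator builds on is short enough to show in full: suspended (censored) units leave the risk set without forcing a drop in the survival curve, which is exactly how run-to-suspension histories contribute information. The bearing lifetimes below are toy values (assumed), and distinct event times are assumed for simplicity.

```python
def kaplan_meier(times, failed):
    """Kaplan-Meier survival curve: list of (time, survival probability).
    'failed[i]' is False for a suspended (censored) unit."""
    events = sorted(zip(times, failed))
    at_risk, surv, curve = len(events), 1.0, []
    for t, did_fail in events:
        if did_fail:
            surv *= (at_risk - 1) / at_risk   # survival drops only at failures
        curve.append((t, surv))
        at_risk -= 1                          # failures and suspensions both leave
    return curve

# Toy bearing histories in operating hours; False marks a suspension.
times  = [100, 150, 200, 250, 300]
failed = [True, False, True, True, False]
curve = kaplan_meier(times, failed)           # e.g. S(100) = 4/5
```

Dropping the two suspensions instead of censoring them would understate survival at every later time, which is the bias the proposed model is designed to avoid.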


Modern machines are complex and often required to operate long hours to achieve production targets. The ability to detect symptoms of failure, and hence to forecast the remaining useful life of the machine, is vital to preventing catastrophic failures. This is essential to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognosis models that attempt to forecast machinery health based on either condition data or reliability data. In practice, failure condition trending data are seldom kept by industry, and data that ended with a suspension are sometimes treated as failure data. This paper presents a novel approach to incorporating historical failure data and suspended condition trending data in the prognostic model. The proposed model consists of a feed-forward neural network (FFNN) whose training targets are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density function (PDF) estimator. The output survival probabilities collectively form an estimated survival curve. The viability of the model was tested using a set of industry vibration data.