386 results for Algebraic Curve


Relevance: 10.00%

Abstract:

Reinforced concrete structures are susceptible to a variety of deterioration mechanisms, including creep and shrinkage, alkali-silica reaction (ASR), carbonation, and corrosion of the reinforcement. These deterioration problems can affect the integrity and load-carrying capacity of the structure. Substantial research has been dedicated to these mechanisms, aiming to identify their causes, reactions, accelerants, retardants and consequences, and this work has improved our understanding of the long-term behaviour of reinforced concrete structures. To date, however, strengthening reinforced concrete structures for durability has mainly been undertaken after expert assessment of field data, followed by the development of a scheme to both terminate continuing degradation, by separating the structure from the environment, and strengthen the structure. The process does not include any significant consideration of the structure's residual load-bearing capacity or of the highly variable nature of estimates of that remaining capacity. Performance curves for deteriorating bridge structures have not previously been developed because of the difficulty of building a model when the input parameters are so variable. This paper presents a framework for an asset management system which assesses residual capacity and identifies the most appropriate rehabilitation method for a given reinforced concrete structure exposed to aggressive environments. In developing the framework, several industry consultation sessions were conducted to identify the required input data, the research methodology and the output knowledge base. Capturing expert opinion in a usable knowledge base requires a rule-based formulation, which can subsequently be used to model the reliability of the performance curve of a reinforced concrete structure exposed to a given environment.
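
The final sentence above refers to a rule-based formulation for capturing expert opinion. Purely as a hypothetical illustration (not taken from the paper), the sketch below maps an assumed exposure class and observed defect to a suggested rehabilitation method and a residual capacity factor; every rule, label and number in it is invented.

```python
# Hypothetical rule-based knowledge-base entry for the kind of framework
# described above; exposure classes, defects, actions and capacity factors
# are invented for illustration only.

RULES = [
    # (exposure, observed defect)          -> (suggested action, residual capacity factor)
    (("marine splash zone", "corrosion"),     ("cathodic protection + patch repair", 0.70)),
    (("marine splash zone", "ASR cracking"),  ("encapsulation / moisture control",   0.80)),
    (("de-icing salts",     "corrosion"),     ("membrane + concrete overlay",        0.75)),
    (("benign",             "carbonation"),   ("anti-carbonation coating",           0.90)),
]

def assess(exposure: str, defect: str):
    """Return (rehabilitation method, residual capacity factor) if a rule fires,
    otherwise a default 'expert review required' outcome."""
    for (exp, dft), outcome in RULES:
        if exp == exposure and dft == defect:
            return outcome
    return ("expert review required", None)

print(assess("marine splash zone", "corrosion"))
```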

Relevance: 10.00%

Abstract:

Adolescent Idiopathic Scoliosis (AIS) is the most common deformity of the spine, affecting 2-4% of the population. Previous studies have shown that the vertebrae in scoliotic spines undergo abnormal shape changes; however, there has been little exploration of how scoliosis affects bone density distribution within the vertebrae. In this study, existing CT scans of 53 female idiopathic scoliosis patients with right-sided main thoracic curves were used to measure the lateral (right to left) bone density profile at mid-height through each vertebral body. Five key bone density profile measures were identified from each normalised bone density distribution, and multiple regression analysis was performed to explore the relationship between bone density distribution and patient demographics (age, height, weight, body mass index (BMI), skeletal maturity, time since menarche, vertebral level, and scoliosis curve severity). Results showed a marked convex/concave asymmetry in bone density for vertebral levels at or near the apex of the scoliotic curve. At the apical vertebra, mean bone density at the left (concave) cortical shell was 23.5% higher than at the right (convex) cortical shell, and cancellous bone density along the central 60% of the lateral path from convex to concave increased by 13.8%. The centre of mass of the bone density profile at the thoracic curve apex was located 53.8% of the distance along the lateral path, indicating a shift of nearly 4% toward the concavity of the deformity. These lateral bone density gradients tapered off with distance from the apical vertebra. Multiple linear regressions showed that the peak bone density of the right cortical shell is significantly correlated with skeletal maturity, with each Risser increment corresponding to a 4-5% increase in mineral equivalent bone density. There were also statistically significant relationships between patient height, weight and BMI and the gradient of cancellous bone density along the central 60% of the lateral path: the bone density gradient is positively correlated with weight and negatively correlated with height and BMI, such that at the apical vertebra a unit decrease in BMI corresponds to an almost 100% increase in bone density gradient.
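
The abstract above describes multiple linear regression of bone density profile measures against patient demographics. A minimal sketch of that kind of analysis is given below, using synthetic data and hypothetical variable names (height, weight, BMI, Risser grade, density gradient); it is not the study's code or data.

```python
# Hedged sketch of a multiple linear regression of a bone density measure
# against patient demographics, on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 53                                             # study cohort size
height = rng.normal(160, 7, n)                     # cm
weight = rng.normal(52, 8, n)                      # kg
bmi = weight / (height / 100) ** 2
risser = rng.integers(0, 6, n)                     # skeletal maturity grade 0-5
density_gradient = (0.5 * weight - 0.3 * height - 2.0 * bmi
                    + rng.normal(0, 5, n))         # hypothetical response variable

# Design matrix with an intercept column, fitted by ordinary least squares.
X = np.column_stack([np.ones(n), height, weight, bmi, risser])
coef, *_ = np.linalg.lstsq(X, density_gradient, rcond=None)
for name, b in zip(["intercept", "height", "weight", "bmi", "risser"], coef):
    print(f"{name:>9}: {b:+.3f}")
```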

Relevance: 10.00%

Abstract:

Surgical treatment of scoliosis is quantitatively assessed in the clinic using radiographic measures of deformity correction, as well as the rib hump, but it is important to understand the extent to which these quantitative measures correlate with self-reported improvements in patients' quality of life following surgery. The purpose of this prospective study was to evaluate the relationship between clinical outcomes of thoracoscopic anterior scoliosis surgery and deformity correction using the Scoliosis Research Society questionnaire (SRS-24). Patients undergoing thoracoscopic anterior scoliosis correction report good SRS scores, comparable to those reported in previous studies for both open and thoracoscopic scoliosis correction procedures. Major Cobb correction is a significant predictor of patient satisfaction when comparing the subgroups of patients with the highest and lowest major curve corrections.

Relevance: 10.00%

Abstract:

This paper introduces fast algorithms for performing group operations on twisted Edwards curves, pushing the recent speed limits of Elliptic Curve Cryptography (ECC) forward in a wide range of applications. Notably, the new addition algorithm uses 8M for suitably selected curve constants, where M denotes a field multiplication. In comparison, the fastest point addition algorithms for (twisted) Edwards curves stated in the literature use 9M + 1S, where S denotes a field squaring. It is also shown that the new addition algorithm can be implemented with four processors, dropping the effective cost to 2M. This implies an effective speed increase by the full factor of 4 over the sequential case. Our results allow faster implementation of elliptic curve scalar multiplication. In addition, the new point addition algorithm can be used to provide a natural protection from side channel attacks based on simple power analysis (SPA).
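
For context on what such an addition algorithm looks like, here is a minimal sketch of unified addition on a twisted Edwards curve a*x^2 + y^2 = 1 + d*x^2*y^2 in extended coordinates (X : Y : Z : T) with T = X*Y/Z, using the standard published formulas for this representation. The tiny field and curve constants below are chosen only to make the sanity checks cheap; this is not presented as the paper's own algorithm or code.

```python
# Unified twisted Edwards addition in extended coordinates, toy-field sketch.
p, a, d = 13, -1, 2    # toy parameters: a is a square and d a non-square mod p

def to_ext(x, y):
    """Affine (x, y) -> extended coordinates (X, Y, Z, T) with T = X*Y/Z."""
    return (x % p, y % p, 1, x * y % p)

def to_affine(P):
    X, Y, Z, _ = P
    zinv = pow(Z, -1, p)
    return (X * zinv % p, Y * zinv % p)

def on_curve(x, y):
    return (a * x * x + y * y - 1 - d * x * x * y * y) % p == 0

def add(P, Q):
    """Unified addition: 9 field multiplications plus one mult each by a and d."""
    X1, Y1, Z1, T1 = P
    X2, Y2, Z2, T2 = Q
    A = X1 * X2 % p
    B = Y1 * Y2 % p
    C = d * T1 * T2 % p
    D = Z1 * Z2 % p
    E = ((X1 + Y1) * (X2 + Y2) - A - B) % p
    F = (D - C) % p
    G = (D + C) % p
    H = (B - a * A) % p
    return (E * F % p, G * H % p, F * G % p, E * H % p)   # (X3, Y3, Z3, T3)

# Sanity checks on a brute-forced point of the toy curve.
P = next(to_ext(x, y) for x in range(1, p) for y in range(p) if on_curve(x, y))
O = to_ext(0, 1)                                      # neutral element
x0, y0 = to_affine(P)
assert to_affine(add(P, O)) == (x0, y0)               # P + O = P
assert to_affine(add(P, to_ext(-x0, y0))) == (0, 1)   # P + (-P) = O
assert on_curve(*to_affine(add(P, P)))                # doubling via add stays on the curve
print("twisted Edwards unified addition: sanity checks passed")
```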

Relevance: 10.00%

Abstract:

This paper provides new results about efficient arithmetic on Jacobi quartic form elliptic curves, y² = dx⁴ + 2ax² + 1. With recent bandwidth-efficient proposals, the arithmetic on Jacobi quartic curves has become solidly faster than that on Weierstrass curves. These proposals use up to 7 coordinates to represent a single point. However, fast scalar multiplication algorithms based on windowing techniques precompute and store several points, which requires more space than it takes with 3 coordinates. Also note that some of these proposals require d = 1 for full speed. Unfortunately, elliptic curves having 2-times-a-prime number of points cannot be written in Jacobi quartic form if d = 1. Even worse, the contemporary formulae may fail to output correct coordinates for some inputs. This paper provides improved speeds using fewer coordinates without causing the above-mentioned problems. For instance, our proposed point doubling algorithm takes only 2 multiplications, 5 squarings, and no multiplication by curve constants when d is arbitrary and a = ±1/2.

Relevance: 10.00%

Abstract:

This paper improves implementation techniques for Elliptic Curve Cryptography. We introduce new formulae and algorithms for the group law on Jacobi quartic, Jacobi intersection, Edwards, and Hessian curves. The proposed formulae and algorithms can save time in suitable point representations. To support our claims, a cost comparison is made with classic scalar multiplication algorithms using previous and current operation counts. Most notably, the best speeds are obtained from Jacobi quartic curves, which provide the fastest timings for most scalar multiplication strategies, benefiting from the proposed 2M + 5S + 1D point doubling and 7M + 3S + 1D point addition algorithms. Furthermore, the new addition algorithm provides an efficient way to protect against side channel attacks based on simple power analysis (SPA). Keywords: efficient elliptic curve arithmetic, unified addition, side channel attack.
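
To make operation counts like those quoted above concrete, the following rough cost model (not from the paper) estimates fixed-window scalar multiplication cost from per-doubling and per-addition counts, under the common assumption that a squaring S costs about 0.8 of a multiplication M and a multiplication by a curve constant D costs much less than M; the weights, window sizes and the cost formula itself are illustrative assumptions.

```python
# Back-of-the-envelope scalar multiplication cost model (illustrative only).
M, S, D = 1.0, 0.8, 0.2          # assumed relative field-operation costs

DBL = 2 * M + 5 * S + 1 * D      # point doubling count quoted above
ADD = 7 * M + 3 * S + 1 * D      # point addition count quoted above

def scalar_mult_cost(bits=256, window=4):
    """Rough estimate for a fixed-window scalar multiply: about `bits`
    doublings plus roughly bits/window additions (window precomputation and
    average-case refinements are ignored)."""
    return bits * DBL + (bits / window) * ADD

for w in (2, 3, 4, 5):
    print(f"window {w}: ~{scalar_mult_cost(window=w):.0f}M for a 256-bit scalar")
```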

Relevance: 10.00%

Abstract:

This paper presents efficient formulas for computing cryptographic pairings on the curve y² = cx³ + 1 over fields of large characteristic. We provide examples of pairing-friendly elliptic curves of this form which are of interest for efficient pairing implementations.

Relevance: 10.00%

Abstract:

In this paper we discuss our current efforts to develop and implement an exploratory, discovery-mode assessment item into the total learning and assessment profile for a target group of about 100 second-level engineering mathematics students. The assessment item under development is composed of two parts: a set of "pre-lab" homework problems (which focus on relevant prior mathematical knowledge, concepts and skills), and complementary computing laboratory exercises which are undertaken within a fixed (1-hour) time frame. In particular, the computing exercises exploit the algebraic manipulation and visualisation capabilities of the symbolic algebra package MAPLE, with the aim of promoting understanding of certain mathematical concepts and skills via visual and intuitive reasoning, rather than a formal or rigorous approach. The assessment task we are developing is aimed at providing students with a significant learning experience, in addition to providing feedback on their individual knowledge and skills. To this end, a noteworthy feature of the scheme is that marks awarded for the laboratory work are primarily based on the extent to which reflective, critical thinking is demonstrated, rather than the number of CBE-style tasks completed by the student within the allowed time. With regard to student learning outcomes, a novel and potentially critical feature of our scheme is that the assessment task is designed to be intimately linked to the overall course content, in that it aims to introduce important concepts and skills (via individual student exploration) which will be revisited somewhat later in the pedagogically more restrictive formal lecture component of the course (typically a large-group plenary format). Furthermore, the time delay involved, or "incubation period", is also a deliberate design feature: it is intended to allow students the opportunity to undergo potentially important internal re-adjustments in their understanding before being exposed to lectures on related course content, which are invariably delivered in a more condensed, formal and mathematically rigorous manner. In our presentation, we will discuss in more detail our motivation and rationale for trialling such a scheme for the targeted student group. Some of the advantages and disadvantages of our approach (as we perceived them at the initial stages) will also be enumerated. In a companion paper, the theoretical framework for our approach will be more fully elaborated, and measures of student learning outcomes (as obtained from, e.g., student-provided feedback) will be discussed.

Relevance: 10.00%

Abstract:

Generalising arithmetic structures is seen as a key to developing algebraic understanding. Many adolescent students begin secondary school with a poor understanding of the structure of arithmetic. This paper presents a theory for a teaching/learning trajectory designed to build mathematical understanding and abstraction in the elementary school context. The particular focus is on the use of models and representations to construct an understanding of equivalence. The results of a longitudinal intervention study with five elementary schools, following 220 students as they progressed from Year 2 to Year 6, informed the development of this theory. Data were gathered from multiple sources including interviews, videos of classroom teaching, and pre- and post-tests. Data reduction resulted in the development of nine conjectures representing a growth in integration of models and representations. These conjectures formed the basis of the theory.

Relevance: 10.00%

Abstract:

Building Information Modelling (BIM) is an IT-enabled technology that allows storage, management, sharing, access, update and use of all the data relevant to a project throughout the project life-cycle in the form of a data repository. BIM enables improved inter-disciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. While the technology itself may not be new, and similar approaches have been in use in other sectors such as the aircraft and automobile industries for well over a decade, the AEC/FM (Architecture, Engineering and Construction / Facilities Management) industry is yet to catch up in its ability to exploit the benefits of the IT revolution. Though the potential benefits of the technology in terms of knowledge sharing, project management, project co-ordination and collaboration are all but obvious, the adoption rate has been rather lethargic, in spite of some well-directed efforts and the availability of supporting commercial tools. Since the technology itself has been well tested over the years in other domains, the plausible causes must be rooted well beyond the explanation offered by the 'bell curve' of innovation adoption. This paper discusses the preliminary findings of an ongoing research project funded by the Cooperative Research Centre for Construction Innovation (CRC-CI) which aims to identify these gaps and to produce specifications and guidelines to enable greater adoption of the BIM approach in practice. A detailed literature review was conducted that looks at similar research reported in recent years. A desktop audit of some of the existing commercial tools that support BIM was conducted to identify the technological issues and concerns, and a workshop was organized with industry partners and various players in the AEC industry for needs analysis, expectations and feedback on the possible deterrents and inhibitions surrounding BIM adoption.

Relevance: 10.00%

Abstract:

Two-stroke outboard boat engines using total-loss lubrication deposit a significant proportion of their lubricant and fuel directly into the water. The purpose of this work is to document the velocity and concentration field characteristics of a submerged swirling water jet emanating from a propeller, in order to provide information on its fundamental characteristics. Measurements of the velocity and concentration fields were performed in a turbulent jet generated by a model boat propeller (0.02 m diameter) operating at 1500 rpm and 3000 rpm. The measurements were carried out in the zone of established flow, up to 50 propeller diameters downstream of the propeller. Both the mean axial velocity profile and the mean concentration profile showed self-similarity. Further, the standard deviation growth curve was linear. The effects of propeller speed and dye release location were also investigated.
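
A minimal sketch of the two jet analyses mentioned above (self-similarity of the mean profile and linear growth of the profile standard deviation), using synthetic Gaussian profiles rather than the measured data; the centreline decay and spreading constants are assumed, illustrative values only.

```python
# Hedged sketch: recover the profile standard deviation at several downstream
# stations from synthetic Gaussian mean-velocity profiles, fit its linear
# growth with distance, and check collapse onto a single similarity profile.
import numpy as np

D = 0.02                                        # propeller diameter (m), as in the study
x = np.array([20, 30, 40, 50]) * D              # downstream stations (m)
r = np.linspace(-0.06, 0.06, 241)               # radial coordinate (m)

u_c = 0.05 / x                                  # assumed 1/x centreline decay
sigma_true = 0.012 * x                          # assumed linear spreading
U = np.array([uc * np.exp(-0.5 * (r / s) ** 2) for uc, s in zip(u_c, sigma_true)])

# Profile standard deviation at each station via a discrete second moment.
sigma_est = np.sqrt((U * r**2).sum(axis=1) / U.sum(axis=1))

# Linear growth curve sigma(x) = k*x + c, fitted by least squares.
k, c = np.polyfit(x, sigma_est, 1)
print(f"fitted spreading rate: sigma ~ {k:.3f}*x + {c * 1000:.2f} mm")

# Self-similarity: u/u_c plotted against r/sigma collapses onto exp(-0.5*eta^2).
eta = r / sigma_est[0]
collapse_error = np.max(np.abs(U[0] / U[0].max() - np.exp(-0.5 * eta**2)))
print(f"deviation from the similarity profile at x/D = 20: {collapse_error:.1e}")
```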

Relevance: 10.00%

Abstract:

This review explores the question of whether chemometrics methods enhance the performance of electroanalytical methods. Electroanalysis has long benefited from well-established techniques such as potentiometric titrations, polarography and voltammetry, and from newer ones such as electronic tongues and noses, which have enlarged the scope of applications. Electroanalytical methods have been improved by the application of chemometrics for the simultaneous quantitative prediction of analytes or the qualitative resolution of complex overlapping responses. Typical methods include partial least squares (PLS), artificial neural networks (ANNs), and multivariate curve resolution methods (MCR-ALS, N-PLS and PARAFAC). This review aims to provide the practising analyst with a broad guide to electroanalytical applications supported by chemometrics. In this context, after a general consideration of the use of a number of electroanalytical techniques with the aid of chemometrics methods, several overviews follow, each focusing on an important field of application such as food, pharmaceuticals, pesticides and the environment. The growth of chemometrics in conjunction with electronic tongue and nose sensors is highlighted, and this is followed by an overview of the use of chemometrics for the resolution of complicated profiles for the qualitative identification of analytes, especially with the use of the MCR-ALS methodology. Finally, the performance of electroanalytical methods is compared with that of some spectrophotometric procedures on the basis of figures of merit. This comparison showed that electroanalytical methods can perform as well as spectrophotometric ones. PLS-1 appears to be the method of practical choice if a relative prediction error of approximately ±10% is acceptable.
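
As an illustration of the PLS-1 calibration discussed above, the following hedged sketch builds a single-analyte PLS model on simulated, overlapping voltammetric-style responses and reports one common definition of the percentage relative prediction error; scikit-learn is assumed to be available, and the data, component count and settings are illustrative only.

```python
# Hedged PLS-1 calibration sketch on simulated overlapping responses.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_potentials = 60, 150
potential = np.linspace(-0.2, 0.8, n_potentials)

# Two overlapping peak-shaped responses; only analyte 1 is calibrated for,
# since PLS-1 models a single analyte at a time.
c1, c2 = rng.random(n_samples), rng.random(n_samples)
peak = lambda centre: np.exp(-((potential - centre) / 0.08) ** 2)
X = np.outer(c1, peak(0.30)) + np.outer(c2, peak(0.38))
X += 0.01 * rng.standard_normal(X.shape)

X_train, X_test, y_train, y_test = train_test_split(X, c1, random_state=0)
pls = PLSRegression(n_components=2).fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()

# One common definition of the percentage relative prediction error.
rpe = 100 * np.linalg.norm(y_pred - y_test) / np.linalg.norm(y_test)
print(f"PLS-1 relative prediction error: {rpe:.1f}%")
```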

Relevance: 10.00%

Abstract:

Interactions of small molecules with biopolymers, e.g. the bovine serum albumin (BSA) protein, are important, and significant information is recorded in the UV–vis and fluorescence spectra of their reaction mixtures. Conventionally, the extraction of this information is difficult, principally because the spectra of the three analytes in the mixture overlap significantly. The interaction of berberine chloride (BC) with the BSA protein provides an interesting example of such a complex system. UV–vis and fluorescence spectra of BC and BSA mixtures were investigated in pH 7.4 Tris–HCl buffer at 37 °C. Two sample series were measured by each technique: (1) [BSA] was kept constant and [BC] was varied, and (2) [BC] was kept constant and [BSA] was varied. This produced four spectral data matrices, which were combined into one expanded spectral matrix and processed by the multivariate curve resolution–alternating least squares (MCR–ALS) method. The results provided: (1) the pure BC, BSA and BC–BSA complex spectra extracted from the measured, heavily overlapping composite responses, (2) the concentration profiles of BC, BSA and the BC–BSA complex, which are difficult to obtain by conventional means, and (3) estimates of the number of binding sites of BC.
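
A bare-bones sketch of the MCR-ALS idea described above, written directly in NumPy on synthetic data; the real study used measured spectra and an augmented experimental matrix, and the alternating least squares loop below, with non-negativity imposed by simple clipping and the true spectra reused as the initial guess, is a deliberately simplified stand-in for a full MCR-ALS implementation.

```python
# Simplified MCR-ALS: factor D (samples x wavelengths) as C @ S.T on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_wavelengths, k = 40, 200, 3      # k components, e.g. BC, BSA, complex

# Synthetic ground truth, just to have something to resolve.
wl = np.linspace(0, 1, n_wavelengths)
S_true = np.stack([np.exp(-((wl - c) / 0.08) ** 2) for c in (0.3, 0.5, 0.7)]).T
C_true = rng.random((n_samples, k))
D = C_true @ S_true.T + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

# Alternating least squares with non-negativity via clipping; the initial
# spectral guess is built from the true spectra only to keep the demo short.
S = S_true + 0.1 * rng.random(S_true.shape)
for _ in range(200):
    C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0, None)    # solve for concentrations
    S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0, None)  # solve for spectra
    S /= np.linalg.norm(S, axis=0)                           # fix intensity ambiguity

lack_of_fit = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"lack of fit: {100 * lack_of_fit:.2f}%")
```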

Relevance: 10.00%

Abstract:

The interaction of quercetin, a bioflavonoid, with bovine serum albumin (BSA) was investigated under pseudo-physiological conditions by the application of UV–vis spectrometry, spectrofluorimetry and cyclic voltammetry (CV). These studies indicated a cooperative interaction between the quercetin–BSA complex and warfarin, which produced a ternary complex, quercetin–BSA–warfarin. It was found that both quercetin and warfarin were located in site I. However, the spectra of these three components overlapped, and the chemometrics method of multivariate curve resolution-alternating least squares (MCR-ALS) was applied to resolve them. The resolved spectra of quercetin–BSA and warfarin agreed well with their measured spectra and, importantly, the spectrum of the quercetin–BSA–warfarin complex was extracted. These results allowed the behaviour of the overlapping spectra to be rationalized. At lower concentrations ([warfarin] < 1 × 10⁻⁵ mol L⁻¹), most of the site marker reacted with the quercetin–BSA, but free warfarin was present at higher concentrations. Interestingly, the ratio between quercetin–BSA and warfarin was found to be 1:2, suggesting a quercetin–BSA–(warfarin)₂ complex, and the estimated equilibrium constant was 1.4 × 10¹¹ M⁻². The results suggest that at low concentrations warfarin binds at the high-affinity sites (HAS), while the low-affinity binding sites (LAS) are occupied at higher concentrations.

Relevance: 10.00%

Abstract:

Cognitive-energetical theories of information processing were used to generate predictions regarding the relationship between workload and fatigue within and across consecutive days of work. Repeated measures were taken on board a naval vessel during a non-routine and a routine patrol, and the data were analyzed using growth curve modeling. Fatigue showed a non-monotonic relationship with time of day in both patrols: it was high at midnight, decreased until around noon, and then increased again. Fatigue increased across days towards the end of the non-routine patrol, but remained stable across days in the routine patrol. The relationship between workload and fatigue changed over consecutive days in the non-routine patrol: at the beginning of the patrol, low workload was associated with fatigue, whereas at the end of the patrol, high workload was associated with fatigue. This relationship could not be tested in the routine patrol; however, that patrol showed a non-monotonic relationship between workload and fatigue, with both low and high workloads associated with the highest fatigue. These results suggest that the optimal level of workload can change over time, and thus have implications for the management of fatigue.
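
A hedged sketch of the kind of growth curve model described above, fitted as a mixed-effects model in statsmodels on simulated repeated-measures data; the variable names (fatigue, hour, day, workload, crew_id) and effect sizes are illustrative, not the study's actual measures or results.

```python
# Growth curve model sketch: quadratic within-day time trend, linear day trend,
# workload effect, random intercept per crew member, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_crew, n_days, hours = 30, 7, [0, 6, 12, 18]           # repeated measures design
rows = []
for crew in range(n_crew):
    base = rng.normal(4, 0.5)                           # person-level intercept
    for day in range(n_days):
        for hour in hours:
            workload = rng.uniform(1, 7)
            # Simulated U-shaped within-day pattern plus a drift across days.
            fatigue = (base + 0.02 * (hour - 12) ** 2 + 0.1 * day
                       + 0.15 * workload + rng.normal(0, 0.4))
            rows.append((crew, day, hour, workload, fatigue))
df = pd.DataFrame(rows, columns=["crew_id", "day", "hour", "workload", "fatigue"])

model = smf.mixedlm("fatigue ~ hour + I(hour**2) + day + workload",
                    df, groups=df["crew_id"])
print(model.fit().summary())
```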