55 results for Check digits
Abstract:
Scientific computation has unavoidable approximations built into its very fabric. One important source of error that is difficult to detect and control is round-off error propagation, which originates from the use of finite precision arithmetic. We propose that there is a need to perform regular numerical 'health checks' on scientific codes in order to detect the cancerous effect of round-off error propagation. This is particularly important in scientific codes that are built on legacy software. We advocate the use of the CADNA library as a suitable numerical screening tool. We present a case study to illustrate the practical use of CADNA in scientific codes that are of interest to the Computer Physics Communications readership. In doing so we hope to stimulate a greater awareness of round-off error propagation and present a practical means by which it can be analyzed and managed.
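The kind of silent drift described above can be demonstrated in a few lines. A minimal Python sketch (our illustration only; CADNA itself instruments Fortran/C codes with stochastic arithmetic, which is not reproduced here): summing many small terms naively lets round-off accumulate, whereas compensated (Kahan) summation keeps the error near machine precision.

```python
def naive_sum(values):
    """Accumulate terms left to right; round-off error builds up silently."""
    total = 0.0
    for v in values:
        total += v
    return total

def kahan_sum(values):
    """Compensated summation: recover the low-order bits lost at each add."""
    total = 0.0
    c = 0.0  # running compensation for lost low-order bits
    for v in values:
        y = v - c
        t = total + y
        c = (t - total) - y
        total = t
    return total

terms = [1e-8] * 10**7        # exact sum is 0.1
naive_err = abs(naive_sum(terms) - 0.1)
kahan_err = abs(kahan_sum(terms) - 0.1)
```

Comparing `naive_err` and `kahan_err` is the spirit of the proposed "health check": the same computation done two ways exposes how many digits of the naive result are actually significant.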
Abstract:
Dynamical effects of non-conservative forces in long, defect-free atomic wires are investigated. Current flow through these wires is simulated, and we find that during the initial transient the kinetic energies of the ions are contained in a small number of phonon modes, closely clustered in frequency. These phonon modes correspond to the waterwheel modes determined from preliminary static calculations. The static calculations allow one to predict the appearance of non-conservative effects in advance of the more expensive real-time simulations. The ion kinetic energy redistributes across the band as non-conservative forces reach a steady state with electronic frictional forces. The typical ion kinetic energy is found to decrease with system length and increase with atomic mass, and its dependence on bias, mass and length is supported with a pen-and-paper model. This paper highlights the importance of non-conservative forces in current-carrying devices and provides criteria for the design of stable atomic wires.
Abstract:
Purpose – The aim of this paper is to analyse how critical incidents or organisational crises can be used to check and legitimise quality management change efforts in relation to the fundamental principles of quality.
Design/methodology/approach – Multiple case studies analyse critical incidents that demonstrate the importance of legitimisation, normative evaluation and conflict constructs in this process. A theoretical framework composed of these constructs is used to guide the analysis.
Findings – The cases show that the critical incidents leading to the legitimisation of continuous improvement (CI) were diverse. However, all resulted in the need for significant ongoing cost reduction to achieve or retain competitiveness. In addition, attempts at legitimising CI were coupled with attempts at destabilising the existing normative practice. This destabilisation process in some cases advocated supplementing the existing approaches and in others replacing them. In all cases, significant conflict arose in these legitimising and normative evaluation processes.
Research limitations/implications – It is suggested that further research could involve a critical analysis of existing quality models, tools and techniques in relation to how they incorporate, and are built upon, fundamental quality management principles. Furthermore, such studies could probe the dangers of the quality curriculum becoming divorced from business and market reality and thus creating a parallel existence.
Practical implications – As demonstrated by the case studies, models, tools and techniques are valued not for their intrinsic worth but for what they contribute to addressing business needs. Thus, in addition to being an opportunity for quality management, critical incidents present a challenge to the field. Quality management must be shown to make a contribution in these circumstances.
Originality/value – This paper is of value to both academics and practitioners.
Abstract:
We present results of a study into the performance of a variety of image transform-based feature types for speaker-independent visual speech recognition of isolated digits. This includes the first reported use of features extracted using a discrete curvelet transform. The study compares methods for selecting features of each feature type and shows the relative benefits of both static and dynamic visual features. The performance of the features is tested on clean video data and on video data corrupted in a variety of ways, to assess each feature type's robustness to potential real-world conditions. One of the test conditions involves a novel form of video corruption we call jitter, which simulates camera and/or head movement during recording.
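The jitter corruption described above, simulated camera or head movement, can be modelled as an independent random translation of each video frame. A minimal Python sketch (our illustration; the paper's exact corruption procedure is not specified here), with frames as 2D lists of pixel intensities and zero padding at the exposed edges:

```python
import random

def shift_frame(frame, dx, dy, fill=0):
    """Translate a 2D frame by (dx, dy) pixels, padding exposed edges with `fill`."""
    h, w = len(frame), len(frame[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y - dy, x - dx   # source pixel for this output position
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = frame[sy][sx]
    return out

def jitter(frames, max_shift, seed=None):
    """Apply an independent random translation to each frame of a clip."""
    rng = random.Random(seed)
    return [shift_frame(f,
                        rng.randint(-max_shift, max_shift),
                        rng.randint(-max_shift, max_shift))
            for f in frames]
```

Because each frame is shifted independently, successive frames disagree about where the mouth region sits, which is what stresses the visual features.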
Abstract:
This book examines credit in working class communities since 1880, focusing on forms of borrowing that were dependent on personal relationships and social networks. It provides an extended historical discussion of credit unions, legal and illegal moneylenders (loan sharks), and looks at the concept of ‘financial exclusion’. Initially, the book focuses on the history of tallymen, check traders, and their eventual movement into moneylending following the loss of their more affluent customers, due to increased spending power and an increasingly liberalized credit market. They also faced growing competition from mail order companies operating through networks of female agents, whose success owed much to the reciprocal cultural and economic conventions that lay at the heart of traditional working class credit relationships. Discussion of these forms of credit is related to theoretical debates about cultural aspects of credit exchange that ensured the continuing success of such forms of lending, despite persistent controversies about their use. The book contrasts commercial forms of credit with formal and informal co-operative alternatives, such as the mutuality clubs operated by co-operative retailers and credit unions. It charts the impact of post-war immigration upon credit patterns, particularly in relation to the migrant (Irish and Caribbean) origins of many credit unions and explains the relative lack of success of the credit union movement. The book contributes to anti-debt debates by exploring the historical difficulties of developing legislation in relation to the millions of borrowers who have patronized what has come to be termed the sub-prime sector.
Abstract:
Context: The masses previously obtained for the X-ray binary 2S 0921-630 implied a compact object that was either a high-mass neutron star or a low-mass black hole, but relied on a previously published value for the rotational broadening (v sin i) with large uncertainties. Aims: We aim to determine an accurate mass for the compact object through an improved measurement of the secondary star's projected equatorial rotational velocity. Methods: We have used UVES echelle spectroscopy to determine the v sin i of the secondary star (V395 Car) in the low-mass X-ray binary 2S 0921-630 by comparison to an artificially broadened spectral-type template star. In addition, we have also measured v sin i from a single high signal-to-noise ratio absorption line profile calculated using the method of Least-Squares Deconvolution (LSD). Results: We determine v sin i to lie between 31.3±0.5 km s^-1 and 34.7±0.5 km s^-1 (assuming zero and continuum limb darkening, respectively), in disagreement with previous results based on intermediate-resolution spectroscopy obtained with the 3.6 m NTT. Using our revised v sin i value in combination with the secondary star's radial velocity gives a binary mass ratio of 0.281±0.034. Furthermore, assuming a binary inclination angle of 75° gives a compact object mass of 1.37±0.13 M_⊙. Conclusions: We find that using relatively low-resolution spectroscopy can result in systematic uncertainties in the measured v sin i values obtained using standard methods. We suggest the use of LSD as a secondary, reliable check of the results, as LSD allows one to directly discern the shape of the absorption line profile. In the light of the new v sin i measurement, we have revised the compact object's mass downwards, such that it is now compatible with a canonical neutron star mass.
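The step from a measured v sin i to a binary mass ratio is commonly made through the relation for a tidally locked, Roche-lobe-filling secondary, v_rot sin i ≈ 0.462 K_2 q^{1/3} (1+q)^{2/3} (Wade & Horne 1988). A minimal Python sketch inverting it numerically; whether this exact relation was used in the paper is our assumption, and the K_2 value below is purely illustrative, not taken from the abstract:

```python
def q_from_vsini(vsini, k2, lo=1e-4, hi=2.0, tol=1e-10):
    """Solve v sin i = 0.462 * K2 * q**(1/3) * (1+q)**(2/3) for q by bisection.

    The right-hand side increases monotonically with q, so bisection on
    [lo, hi] converges to the unique root.
    """
    target = vsini / k2
    f = lambda q: 0.462 * q ** (1 / 3) * (1 + q) ** (2 / 3) - target
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Illustrative numbers only: v sin i = 33 km/s with a hypothetical K2 = 92.4 km/s
q = q_from_vsini(33.0, 92.4)
```

With these illustrative inputs the solver lands near the q ≈ 0.28 regime quoted in the abstract, which shows why the revised v sin i propagates directly into the revised compact object mass.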
Abstract:
We present nine newly observed transits of TrES-3, taken as part of a transit timing program using the RISE instrument on the Liverpool Telescope. A Markov-chain Monte Carlo analysis was used to determine the planet-star radius ratio and inclination of the system, which were found to be R_p/R_star = 0.1664 (+0.0011, -0.0018) and i = 81.73° (+0.13, -0.04), respectively, consistent with previous results. The central transit times and uncertainties were also calculated, using a residual-permutation algorithm as an independent check on the errors. A re-analysis of eight previously published TrES-3 light curves was conducted to determine the transit times and uncertainties using consistent techniques. Whilst the transit times were not found to be in agreement with a linear ephemeris, giving χ² = 35.07 for 15 degrees of freedom, we interpret this to be the result of systematics in the light curves rather than a real transit timing variation. This is because the light curves that show the largest deviation from a constant period either have relatively little out-of-transit coverage or have clear systematics. A new ephemeris was calculated using the transit times and was found to be T_c(0) = 2454632.62610 ± 0.00006 HJD and P = 1.3061864 ± 0.0000005 days. The transit times were then used to place upper mass limits as a function of the period ratio of a potential perturbing planet, showing that our data are sufficiently sensitive to have probed sub-Earth-mass planets in both interior and exterior 2:1 resonances, assuming that the additional planet is in an initially circular orbit.
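The linear-ephemeris test described above reduces to predicting T_c(E) = T_c(0) + E·P at each integer epoch E and forming a χ² against the observed mid-transit times. A minimal Python sketch using the ephemeris quoted in the abstract; the example epochs and uncertainty below are placeholders of ours, not the paper's data:

```python
T0 = 2454632.62610   # HJD, ephemeris zero-point quoted in the abstract
P = 1.3061864        # orbital period in days, quoted in the abstract

def predicted_tc(epoch):
    """Predicted central transit time at integer epoch E: T_c(E) = T0 + E * P."""
    return T0 + epoch * P

def chi2_linear(epochs, observed, sigma):
    """Chi-square of observed mid-transit times against the linear ephemeris."""
    return sum(((obs - predicted_tc(e)) / s) ** 2
               for e, obs, s in zip(epochs, observed, sigma))

# Placeholder example: three epochs observed exactly on the ephemeris
epochs = [0, 10, 100]
observed = [predicted_tc(e) for e in epochs]
chi2 = chi2_linear(epochs, observed, [0.0006] * 3)
```

A transit timing variation would appear as a χ² well above the number of degrees of freedom; the abstract's point is that here the excess (35.07 for 15 d.o.f.) traces light-curve systematics rather than a perturber.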