64 results for Algorithm Calibration
Abstract:
New radiocarbon calibration curves, IntCal04 and Marine04, have been constructed and internationally ratified to replace the terrestrial and marine components of IntCal98. The new calibration data sets extend an additional 2000 yr, from 0–26 cal kyr BP (Before Present, 0 cal BP = AD 1950), and provide much higher resolution, greater precision, and more detailed structure than IntCal98. For the Marine04 curve, dendrochronologically-dated tree-ring samples, converted with a box diffusion model to marine mixed-layer ages, cover the period from 0–10.5 cal kyr BP. Beyond 10.5 cal kyr BP, high-resolution marine data become available from foraminifera in varved sediments and U/Th-dated corals. The marine records are corrected with site-specific 14C reservoir age information to provide a single global marine mixed-layer calibration from 10.5–26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a random walk model, which takes into account the uncertainty in both the calendar age and the 14C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed here. The tree-ring data sets, sources of uncertainty, and regional offsets are presented in detail in a companion paper by Reimer et al. (this issue).
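For orientation only, a minimal sketch of the calibration step itself, assuming a curve supplied as columns of calendar age, 14C age and 1σ, plus a site-specific reservoir offset ΔR. All values, function names and sign conventions below are illustrative assumptions, not taken from Marine04:

```python
# Hedged sketch (not the IntCal/Marine04 software): calibrating a marine 14C age
# against a curve given as (cal BP, 14C age BP, 1-sigma), after applying a local
# reservoir correction Delta-R. Curve values below are placeholders.
import numpy as np

def calibrate(c14_age, c14_err, curve_cal, curve_c14, curve_sig, delta_r=0.0, delta_r_err=0.0):
    """Return a normalized probability over the curve's calendar grid."""
    mu = c14_age - delta_r                               # reservoir-corrected sample age
    sig2 = c14_err**2 + delta_r_err**2 + curve_sig**2    # combine uncertainties in quadrature
    p = np.exp(-0.5 * (mu - curve_c14)**2 / sig2) / np.sqrt(sig2)
    return p / np.trapz(p, curve_cal)

# Placeholder curve segment (cal BP, 14C BP, sigma), purely for illustration:
cal = np.arange(10500, 11001, 10.0)
c14 = 9500 + 0.8 * (cal - 10500)
sig = np.full_like(cal, 40.0)
post = calibrate(9700, 50, cal, c14, sig, delta_r=100, delta_r_err=50)
print("posterior mode at cal BP:", cal[np.argmax(post)])
```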
Abstract:
The definitive paper by Stuiver and Polach (1977) established the conventions for reporting of 14C data for chronological and geophysical studies based on the radioactive decay of 14C in the sample since the year of sample death or formation. Several ways of reporting 14C activity levels relative to a standard were also established, but no specific instructions were given for reporting nuclear weapons testing (post-bomb) 14C levels in samples. Because the use of post-bomb 14C is becoming more prevalent in forensics, biology, and geosciences, a convention needs to be adopted. We advocate the use of fraction modern with a new symbol F14C to prevent confusion with the previously used Fm, which may or may not have been fractionation corrected. We also discuss the calibration of post-bomb 14C samples and the available datasets and compilations, but do not give a recommendation for a particular dataset.
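For context, the advocated quantity reduces to the activity ratio of the Stuiver and Polach (1977) conventions, and relates to the conventional radiocarbon age through the Libby mean life of 8033 yr:

```latex
% A_SN: sample activity normalized to delta13C = -25 permil
% A_ON: 0.95 x NBS oxalic acid activity normalized to delta13C = -19 permil
F^{14}\!C = \frac{A_{SN}}{A_{ON}},
\qquad
t_{\mathrm{conventional}} = -8033\,\ln\!\left(F^{14}\!C\right)\ \text{yr BP}
```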
Abstract:
We have conducted a series of radiocarbon measurements on decadal samples of dendrochronologically dated wood from both hemispheres, spanning 1000 years (McCormac et al. 1998; Hogg et al. this issue). Using the data presented in Hogg et al., we show that during the period AD 950-1850 the 14C offset between the hemispheres is not constant, but varies periodically (~130 yr periodicity) with amplitudes varying between 1 and 10‰ (i.e. 8-80 yr), with a consequent effect on the 14C calibration of material from the Southern Hemisphere. A large increase in the offset occurs between AD 1245 and 1355. In this paper, we present a Southern Hemisphere high-precision calibration data set (SHCal02) that comprises measurements from New Zealand, Chile, and South Africa. These data, together with a new value of 41 ± 14 yr for correcting the IntCal98 data for the period outside the range given here, are proposed for use in calibrating Southern Hemisphere 14C dates.
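A minimal sketch of how the quoted 41 ± 14 yr value could be applied to an IntCal98 curve point, with the offset uncertainty combined in quadrature. The sign convention and numbers here are illustrative assumptions, not a prescription from the paper:

```python
import math

def sh_curve_point(intcal98_c14, intcal98_err, offset=41.0, offset_err=14.0):
    # Shift a Northern Hemisphere (IntCal98) curve value for a Southern Hemisphere
    # sample and combine the offset uncertainty in quadrature (illustrative sign).
    return intcal98_c14 + offset, math.hypot(intcal98_err, offset_err)

c14, err = sh_curve_point(1950.0, 12.0)
print(f"adjusted curve point: {c14:.0f} +/- {err:.0f} 14C yr BP")
```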
Abstract:
Radiocarbon dating has rarely been used for chronological problems relating to the Anglo-Saxon period. The "flatness" of the calibration curve and the resultant wide range in calendrical dates provide little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology have, however, created the possibility of refining and checking the established chronologies, based on typology of artifacts, against 14C dates. The calibration process, within such a confined age range, however, relies heavily on the structural accuracy of the calibration curve. We have therefore re-measured, at decadal intervals, a section of the Irish oak chronology for the period AD 495–725. These measurements have been included in IntCal04.
Abstract:
Recent measurements on dendrochronologically-dated wood from the Southern Hemisphere have shown that there are differences between the structural form of the radiocarbon calibration curves from each hemisphere. Thus, it is desirable, when possible, to use calibration data obtained from secure dendrochronologically-dated wood from the corresponding hemisphere. In this paper, we outline the recent work and point the reader to the internationally recommended data set that should be used for future calibration of Southern Hemisphere 14C dates.
Abstract:
BACKGROUND: Male fertility potential cannot be measured by conventional parameters for assisted reproduction by intracytoplasmic sperm injection (ICSI). This study determines the relationship between testicular and ejaculated sperm mitochondrial (mt) DNA deletions, nuclear (n) DNA fragmentation, and fertilisation and pregnancy rates in ICSI. METHODS: Ejaculated sperm were obtained from 77 men and testicular sperm from 28 men with obstructive azoospermia undergoing ICSI. Testicular sperm were retrieved using a Trucut needle. MtDNA was analysed using a long polymerase chain reaction, and the alkaline Comet assay was used to determine nDNA fragmentation. RESULTS: Of subjects who achieved a pregnancy (50%) using testicular sperm, only 26% had partners' sperm with wild-type (WT) mtDNA. Of pregnant subjects (38%) using ejaculated sperm, only 8% had partner sperm with WT mtDNA. In each case, the successful group had fewer mtDNA deletions and less nDNA fragmentation. There were inverse relationships between pregnancy and mtDNA deletion number, deletion size and nDNA fragmentation for both testicular and ejaculated sperm. No relationships were observed with fertilisation rates. An algorithm for the prediction of pregnancy is presented based on the quality of sperm nDNA and mtDNA. CONCLUSION: In both testicular and ejaculated sperm, mtDNA deletions and nDNA fragmentation are closely associated with pregnancy in ICSI.
Abstract:
A hardware performance analysis of the SHACAL-2 encryption algorithm is presented in this paper. SHACAL-2 was one of four symmetric key algorithms chosen in the New European Schemes for Signatures, Integrity and Encryption (NESSIE) initiative in 2003. The paper describes a fully pipelined encryption SHACAL-2 architecture implemented on a Xilinx Field Programmable Gate Array (FPGA) device that achieves a throughput of over 25 Gbps. This is the fastest private key encryption algorithm architecture currently available. The SHACAL-2 decryption algorithm is also defined in the paper as it was not provided in the NESSIE submission.
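Since SHACAL-2 applies the SHA-256 round function to a 256-bit state, the decryption definition rests on that round being exactly invertible. Below is a hedged sketch of one round and its inverse; the key schedule and the real round constants are omitted, and arbitrary values are used only to demonstrate the inversion. This is not the paper's FPGA architecture:

```python
# Minimal sketch of a single SHACAL-2 round (the SHA-256 round step) and its inverse.
MASK = 0xFFFFFFFF

def rotr(x, n):
    return ((x >> n) | (x << (32 - n))) & MASK

def round_forward(state, k, w):
    a, b, c, d, e, f, g, h = state
    s1 = rotr(e, 6) ^ rotr(e, 11) ^ rotr(e, 25)
    ch = (e & f) ^ (~e & g)
    t1 = (h + s1 + ch + k + w) & MASK
    s0 = rotr(a, 2) ^ rotr(a, 13) ^ rotr(a, 22)
    maj = (a & b) ^ (a & c) ^ (b & c)
    t2 = (s0 + maj) & MASK
    return ((t1 + t2) & MASK, a, b, c, (d + t1) & MASK, e, f, g)

def round_inverse(state, k, w):
    # Undo one round: six words are simple shifts, the other two are recomputed.
    a2, b2, c2, d2, e2, f2, g2, h2 = state
    a, b, c, e, f, g = b2, c2, d2, f2, g2, h2
    s0 = rotr(a, 2) ^ rotr(a, 13) ^ rotr(a, 22)
    maj = (a & b) ^ (a & c) ^ (b & c)
    t1 = (a2 - ((s0 + maj) & MASK)) & MASK
    d = (e2 - t1) & MASK
    s1 = rotr(e, 6) ^ rotr(e, 11) ^ rotr(e, 25)
    ch = (e & f) ^ (~e & g)
    h = (t1 - s1 - ch - k - w) & MASK
    return (a, b, c, d, e, f, g, h)

state = tuple(range(1, 9))             # arbitrary 8-word state
k, w = 0x428A2F98, 0x12345678          # placeholder round constant and key word
assert round_inverse(round_forward(state, k, w), k, w) == state
```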
Abstract:
This paper investigates two-stage stepwise identification for a class of nonlinear dynamic systems that can be described by linear-in-the-parameters models, where the model has to be built from a very large pool of basis functions or model terms. The main objective is to improve the compactness of the model obtained by forward stepwise methods, while retaining the computational efficiency. The proposed algorithm first generates an initial model using a forward stepwise procedure. The significance of each selected term is then reviewed at the second stage and all insignificant ones are replaced, resulting in an optimised compact model with significantly improved performance. The main contribution of this paper is that these two stages are performed within a well-defined regression context, leading to significantly reduced computational complexity. The efficiency of the algorithm is confirmed by the computational complexity analysis, and its effectiveness is demonstrated by the simulation results.
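A minimal sketch of the two-stage idea, assuming plain least squares over a candidate term pool: a greedy forward pass followed by a review pass that replaces any selected term a pool term can beat. The paper's regression-context machinery for reducing the cost is not reproduced here; all names and data are illustrative:

```python
import numpy as np

def rss(X, y, idx):
    # Residual sum of squares after fitting y on the selected columns.
    theta, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
    r = y - X[:, idx] @ theta
    return float(r @ r)

def two_stage_select(X, y, n_terms):
    pool = list(range(X.shape[1]))
    model = []
    # Stage 1: greedy forward selection of n_terms terms.
    for _ in range(n_terms):
        best = min((c for c in pool if c not in model),
                   key=lambda c: rss(X, y, model + [c]))
        model.append(best)
    # Stage 2: review each selected term and replace it if a pool term does better.
    improved = True
    while improved:
        improved = False
        for i, term in enumerate(model):
            others = model[:i] + model[i + 1:]
            best = min((c for c in pool if c not in others),
                       key=lambda c: rss(X, y, others + [c]))
            if best != term and rss(X, y, others + [best]) < rss(X, y, model):
                model[i] = best
                improved = True
    return model

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))
y = X[:, 3] - 2.0 * X[:, 17] + 0.05 * rng.standard_normal(200)
print(two_stage_select(X, y, 2))   # expected to recover columns 3 and 17
```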
Abstract:
This paper proposes a novel hybrid forward algorithm (HFA) for the construction of radial basis function (RBF) neural networks with tunable nodes. The main objective is to efficiently and effectively produce a parsimonious RBF neural network that generalizes well. In this study, this is achieved through simultaneous network structure determination and parameter optimization on the continuous parameter space. This is a hard mixed-integer problem, and the proposed HFA tackles it using an integrated analytic framework, leading to significantly improved network performance and reduced memory usage for the network construction. The computational complexity analysis confirms the efficiency of the proposed algorithm, and the simulation results demonstrate its effectiveness.
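A hedged sketch of the "tunable node" idea under simple assumptions: each new Gaussian node's centre and width are optimised on the continuous parameter space with a generic optimiser, while the output weights come from linear least squares. This illustrates the concept only and is not the HFA itself:

```python
import numpy as np
from scipy.optimize import minimize

def design(X, centres, widths):
    # Gaussian basis matrix, one column per node.
    d2 = ((X[:, None, :] - np.asarray(centres)[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * np.asarray(widths) ** 2))

def fit_error(X, y, centres, widths):
    Phi = design(X, centres, widths)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    r = y - Phi @ w
    return float(r @ r)

def add_node(X, y, centres, widths):
    # Tune one new node's centre and width by continuous optimisation.
    dim = X.shape[1]
    def cost(p):
        return fit_error(X, y, centres + [p[:dim]], widths + [abs(p[dim]) + 1e-6])
    x0 = np.append(X[np.random.randint(len(X))], 1.0)   # seed at a random data point
    res = minimize(cost, x0, method="Nelder-Mead")
    return centres + [res.x[:dim]], widths + [abs(res.x[dim]) + 1e-6]

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0])
centres, widths = [], []
for _ in range(4):                       # grow the network one tuned node at a time
    centres, widths = add_node(X, y, centres, widths)
print("final training SSE:", fit_error(X, y, centres, widths))
```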
Abstract:
We present a fast and efficient hybrid algorithm for selecting exoplanetary candidates from wide-field transit surveys. Our method is based on the widely used SysRem and Box Least-Squares (BLS) algorithms. Patterns of systematic error that are common to all stars on the frame are mapped and eliminated using the SysRem algorithm. The remaining systematic errors, caused by spatially localized flat-fielding and other effects, are quantified using a boxcar-smoothing method. We show that the dimensions of the search-parameter space can be reduced greatly by carrying out an initial BLS search on a coarse grid of reduced dimensions, followed by Newton-Raphson refinement of the transit parameters in the vicinity of the most significant solutions. We illustrate the method's operation by applying it to data from one field of the SuperWASP survey, comprising 2300 observations of 7840 stars brighter than V = 13.0. We identify 11 likely transit candidates. We reject stars that exhibit significant ellipsoidal variations indicative of a stellar-mass companion. We use colours and proper motions from the Two Micron All Sky Survey and USNO-B1.0 surveys to estimate the stellar parameters and the companion radius. We find that two stars showing unambiguous transit signals pass all these tests, and so qualify for detailed high-resolution spectroscopic follow-up.
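A minimal sketch of the coarse-grid stage only: a simplified box least-squares scan over a small period grid, folding the light curve into phase bins and scoring contiguous runs of bins as candidate transits. SysRem detrending and the Newton-Raphson refinement are not shown, and all grid choices are arbitrary illustrations:

```python
import numpy as np

def bls_power(time, flux, period, n_bins=50, max_dur_bins=5):
    # Fold at the trial period, bin in phase, and score every contiguous run of bins.
    phase = (time % period) / period
    idx = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    f = flux - flux.mean()
    s = np.bincount(idx, weights=f, minlength=n_bins)        # summed flux per bin
    n = np.bincount(idx, minlength=n_bins).astype(float)     # points per bin
    best = 0.0
    for start in range(n_bins):
        for dur in range(1, max_dur_bins + 1):
            bins = np.arange(start, start + dur) % n_bins     # wrap around phase 1.0
            si, ni = s[bins].sum(), n[bins].sum()
            if 0 < ni < len(flux):
                best = max(best, si * si / (ni * (len(flux) - ni)))  # BLS-like statistic
    return best

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 30, 2300))
flux = 1.0 + 5e-3 * rng.standard_normal(t.size)
flux[((t % 3.7) / 3.7) < 0.02] -= 0.01                        # injected 1% box transit
periods = np.linspace(1.0, 10.0, 1500)                        # coarse grid
p = np.array([bls_power(t, flux, P) for P in periods])
# Should land near the injected 3.7 d period if the grid is fine enough.
print("best coarse-grid period:", periods[np.argmax(p)])
```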
Abstract:
Course scheduling consists of assigning lecture events to a limited set of specific timeslots and rooms. The objective is to satisfy as many soft constraints as possible while maintaining a feasible solution timetable. The most successful techniques to date require a compute-intensive examination of the solution neighbourhood to direct searches to an optimum solution. Although they may require fewer neighbourhood moves than more exhaustive techniques to gain comparable results, they can take considerably longer to achieve success. This paper introduces an extended version of the Great Deluge Algorithm for the course timetabling problem which, while avoiding the problem of becoming trapped in local optima, uses simple neighbourhood search heuristics to obtain solutions in a relatively short amount of time. The paper presents results based on a standard set of benchmark datasets, beating over half of the currently published best results, in some cases by up to 60%.
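For illustration, a hedged sketch of the basic Great Deluge acceptance rule the paper extends: a neighbour is accepted if it is no worse than the current solution or no worse than a falling "water level". The extension (re-raising the level when the search stalls) is indicated only schematically, and all parameters are invented for the example:

```python
import random

def great_deluge(initial, cost, neighbour, iters=10000, decay=None, stall_limit=500):
    current, best = initial, initial
    level = cost(initial)                       # water level starts at the initial cost
    decay = decay or level / iters              # linear decay rate
    stalled = 0
    for _ in range(iters):
        cand = neighbour(current)
        if cost(cand) <= cost(current) or cost(cand) <= level:
            current = cand                      # accept: better, or under the water level
        if cost(current) < cost(best):
            best, stalled = current, 0
        else:
            stalled += 1
        level -= decay
        if stalled >= stall_limit:              # schematic "extended" step: relax the level
            level = cost(current) * 1.1
            stalled = 0
    return best

# Toy usage: minimise a 1-D cost with a random-walk neighbourhood.
print(great_deluge(10.0, lambda x: (x - 3) ** 2,
                   lambda x: x + random.uniform(-0.5, 0.5)))
```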