968 results for High-precision Radiocarbon Dating


Relevance:

100.00%

Publisher:

Abstract:

Integer ambiguity resolution is an indispensable procedure for all high-precision GNSS applications. The correctness of the estimated integer ambiguities is the key to achieving highly reliable positioning, but the solution cannot be validated with classical hypothesis testing methods. The integer aperture estimation theory unifies all existing ambiguity validation tests and provides a new perspective from which to review existing methods, enabling a better understanding of the ambiguity validation problem. This contribution analyses two simple but efficient ambiguity validation tests, the ratio test and the difference test, from three aspects: acceptance region, probability basis and numerical results. The major contributions of this paper can be summarized as follows: (1) The ratio test acceptance region is an overlap of ellipsoids, while the difference test acceptance region is an overlap of half-spaces. (2) The probability basis of these two popular tests is analyzed for the first time. The difference test is an approximation to the optimal integer aperture estimator, while the ratio test follows an exponential relationship in probability. (3) The limitations of the two tests are identified for the first time. Both tests may under-evaluate the failure risk if the model is not strong enough or the float ambiguities fall in particular regions. (4) Extensive numerical results are used to compare the performance of the two tests. The simulation results show that the ratio test outperforms the difference test in some models, while the difference test performs better in others. In the medium-baseline kinematic model in particular, the difference test outperforms the ratio test; this superiority is independent of the number of frequencies, the observation noise and the satellite geometry, but depends on the success rate and the failure rate tolerance. A smaller failure rate tolerance leads to a larger performance discrepancy.
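Both tests compare the weighted squared norms of the best and second-best integer candidates. A minimal sketch of the two decision rules, with purely illustrative threshold values (the integer aperture framework determines thresholds from failure-rate requirements):

```python
def ratio_test(q_best, q_second, c=3.0):
    """Accept the fixed solution when the second-best candidate's squared
    norm is at least c times the best one's. c = 3.0 is illustrative only."""
    return q_second / q_best >= c

def difference_test(q_best, q_second, d=15.0):
    """Accept when the squared-norm gap exceeds an absolute margin d.
    d = 15.0 is illustrative only."""
    return q_second - q_best >= d
```

The geometric picture in the abstract follows directly: a fixed ratio carves the acceptance region out of overlapping ellipsoids, while a fixed difference carves it out of overlapping half-spaces.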

Relevance:

100.00%

Publisher:

Abstract:

This article introduces a deterministic approach to using low-temperature, thermally non-equilibrium plasmas to synthesize delicate low-dimensional nanostructures of a small number of atoms on plasma-exposed surfaces. This approach is based on a set of plasma-related strategies to control elementary surface processes, an area traditionally covered by surface science. Major issues related to the balanced delivery and consumption of building units, the appropriate choice of process conditions, and the treatment of plasma-related electric fields, electric charges and polarization effects are identified and discussed in the quantum dot nanoarray context. Examples of a suitable plasma-aided nanofabrication facility and specific effects of a plasma-based environment on the self-organized growth of size- and position-uniform nanodot arrays are shown. These results suggest a very positive outlook for using low-temperature plasma-based nanotools in high-precision nanofabrication of self-assembled nanostructures and elements of nanodevices, an area of continuously rising demand from academia and industry.

Relevance:

100.00%

Publisher:

Abstract:

Database watermarking has received significant research attention in the current decade, yet almost all watermarking models have been either irreversible (the original relation cannot be restored from the watermarked relation) and/or non-blind (requiring the original relation to detect the watermark in the watermarked relation). Such models have several disadvantages compared with reversible and blind watermarking (which requires only the watermarked relation and a secret key, from which the watermark is detected and the original relation restored): the inability to identify the rightful owner after successful secondary watermarking, the inability to revert the relation to the original data set (required in high-precision industries), and the need to store the unmarked relation in secure secondary storage. To overcome these problems, we propose a watermarking scheme that is both reversible and blind. We utilize difference expansion on integers to achieve reversibility. The major advantages of our scheme are reversibility to a high-quality original data set, rightful owner identification, resistance against secondary watermarking attacks, and no need to store the original database in secure secondary storage.
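Difference expansion (in the style of Tian's reversible embedding) hides one bit in a pair of integers so that both the bit and the original pair can be exactly recovered. A minimal sketch, assuming one watermark bit per integer pair and ignoring overflow handling:

```python
def de_embed(x, y, bit):
    # Embed one bit reversibly into the integer pair (x, y).
    l = (x + y) // 2      # integer average, preserved by the transform
    h = x - y             # difference
    h2 = 2 * h + bit      # expanded difference carries the bit in its LSB
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(x2, y2):
    # Recover the original pair and the embedded bit from the marked pair.
    l = (x2 + y2) // 2
    h2 = x2 - y2
    bit, h = h2 & 1, h2 >> 1
    return l + (h + 1) // 2, l - h // 2, bit
```

Because the average is invariant and the expanded difference is invertible, the transform is lossless, which is exactly what "reversibility to a high-quality original data set" requires (a full scheme must additionally guard against value-range overflow).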

Relevance:

100.00%

Publisher:

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, the threshold determination method is still not well understood. Currently, the threshold is determined with either an empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key question in threshold determination is how to determine the threshold efficiently yet in a theoretically sound way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling error and approximation error are analysed with simulated data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method greatly simplifies the FF-approach without introducing significant modeling error, making the fixed failure rate threshold determination method feasible for real-time applications.
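The IB success rate used in the approximation procedure is easy to calculate because it has the standard closed form P = prod_i [2*Phi(1/(2*sigma_i)) - 1], where the sigma_i are the conditional standard deviations of the (decorrelated) float ambiguities and Phi is the standard normal CDF. A minimal sketch:

```python
from math import erf, sqrt

def ib_success_rate(cond_sigmas):
    """Integer bootstrapping success rate from the conditional standard
    deviations of the decorrelated float ambiguities."""
    Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))  # standard normal CDF
    p = 1.0
    for s in cond_sigmas:
        p *= 2.0 * Phi(1.0 / (2.0 * s)) - 1.0
    return p
```

Small conditional standard deviations (a strong model) drive the product toward 1; a single weak ambiguity drags the whole product down, which is why the success rate is a natural input for a threshold function.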

Relevance:

100.00%

Publisher:

Abstract:

The Archean Hollandaire deposit is a felsic–siliciclastic volcanogenic massive sulfide (VMS) deposit located in the Murchison Domain of the Youanmi Terrane, Yilgarn Craton, Western Australia. It is hosted in a succession of turbidites, mudstones and coherent rhyodacite sills, has been metamorphosed to upper greenschist/lower amphibolite facies, and carries a pervasive S1 deformational fabric. The coherent rhyodacitic sills are interpreted as syn-depositional based on geochemical similarities with well-known VMS-associated felsic rocks and foliations similar to those of the metasediments. We offer several explanations for the absence of textural evidence (e.g. breccias) for a syn-depositional origin: 1) the subaqueous sediments were dehydrated by long-lived magmatism such that no pore water remained to drive quench fragmentation; 2) pore space was occluded by burial; and/or 3) alteration overprinted and obscured primary breccias at contact margins. Mineralisation occurs by sub-seafloor replacement of the original host rocks in two ore bodies, Hollandaire Main (~125 x >500 m and ~8 m thick) and Hollandaire West (~100 x 470 m and ~5 m thick), and takes three main textural styles: massive sulfides, which are exclusively hosted in turbidites and mudstones; and stringer and disseminated sulfides, which are also hosted in coherent rhyodacite. Most sulfides have textures consistent with remobilisation and recrystallisation. Hydrothermal metamorphism has altered the hangingwall and footwall to similar degrees, with significant gains in Mg, Mn and K and losses in Na, Ca and Sr. Garnet and staurolite porphyroblasts also exhibit a footprint around mineralisation, extending up to 30 m both above and below the ore zone.
High-precision thermal ionisation mass spectrometry of zircons extracted from the coherent rhyodacite yields an age of 2759.5 ± 0.9 Ma, which, along with geochemical comparisons, places the succession within the 2760–2735 Ma Greensleeves Formation of the Polelle Group of the Murchison Supergroup. Geochemical and geochronological evidence links the coherent rhyodacite sills to the Peter Well Granodiorite pluton ~2 km to the west, which acted as the heat engine driving hydrothermal circulation during VMS mineralisation. This study highlights the importance of both detailed physical volcanological studies, from which an accurate assessment of timing relationships (particularly the possibility of intrusions dismembering ore horizons) can be made, and the identification of synvolcanic plutons and similar suites, for targeting VMS exploration in the Youanmi Terrane and worldwide.

Relevance:

100.00%

Publisher:

Abstract:

Objective: To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design: Systematic review. Data sources: The electronic databases searched included PubMed, CINAHL, Medline, Google Scholar and ProQuest. The bibliographies of all relevant articles were examined and associated articles were identified using a snowballing technique. Selection criteria: For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods: The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results: Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods, used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalizability, source data quality, complexity of models, and the integration of content and technical knowledge were discussed. Conclusions: The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years.
With advances in data mining techniques, increased capacity for analysis of large databases, involvement of computer scientists in the injury prevention field, and more comprehensive use and description of quality assurance methods in text mining approaches, we are likely to see continued growth and advancement in knowledge of text mining in the injury field.
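As a toy illustration of the Bayesian category-prediction methods the review found most common, here is a self-contained multinomial naive Bayes text classifier with add-one smoothing; the injury narratives and category labels are invented for the example and are not from any reviewed study:

```python
import math
from collections import Counter

def train_nb(docs):
    """docs: list of (text, label) pairs. Returns class priors,
    per-class word counts and the vocabulary."""
    priors, counts, vocab = Counter(), {}, set()
    for text, label in docs:
        words = text.lower().split()
        priors[label] += 1
        counts.setdefault(label, Counter()).update(words)
        vocab.update(words)
    return priors, counts, vocab

def predict_nb(model, text):
    """Most probable label under naive Bayes with add-one smoothing."""
    priors, counts, vocab = model
    total = sum(priors.values())
    best, best_lp = None, -math.inf
    for label in priors:
        lp = math.log(priors[label] / total)
        n = sum(counts[label].values())
        for w in text.lower().split():
            lp += math.log((counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

With only a handful of categories the smoothed word likelihoods separate classes sharply, which is consistent with the review's finding that precision is high when the number of predicted categories is small.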

Relevance:

100.00%

Publisher:

Abstract:

Natural nanopatterned surfaces (nNPS) present on insect wings have demonstrated bactericidal activity [1, 2]. Fabricated nanopatterned surfaces (fNPS), derived by characterization of these wings, have also shown superior bactericidal activity [2]. However, bactericidal NPS topologies vary in both the geometry and the chemical characteristics of the individual features across different insects and fabricated surfaces, making it difficult to ascertain the optimum geometrical parameters underlying bactericidal activity. This situation calls for the adoption of new and emerging techniques capable of fabricating and characterising structures comparable to nNPS from biocompatible materials. In this research, a CAD-drawn nNPS representing an area of 10 μm × 10 μm was fabricated on fused silica glass with a Nanoscribe Photonic Professional GT 3D laser lithography system using two-photon polymerization lithography. The glass was cleaned with acetone and isopropyl alcohol three times, and a drop of IP-DIP photoresist from Nanoscribe GmbH was cast onto the glass slide prior to patterning. The photosensitive IP-DIP resist was polymerized with high precision to form the surface nanopatterns using a 780 nm wavelength laser. Both moving-beam fixed-sample (MBFS) and fixed-beam moving-sample (FBMS) fabrication approaches were tested to determine the better approach for precise fabrication of the required nanotopological pattern. Laser power was also optimized, being varied from 3 mW to 10 mW to determine the optimum power for polymerizing the photoresist when fabricating the fNPS...

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a new high-precision focused word sense disambiguation (WSD) approach is proposed, which not only attempts to identify the proper sense for a word but also provides a probabilistic evaluation of the identification confidence at the same time. A novel Instance Knowledge Network (IKN) is built to generate and maintain semantic knowledge at the word, type synonym set and instance levels. Algorithms based on graph matching are developed to train the IKN with probabilistic knowledge and to use the IKN for probabilistic word sense disambiguation. Based on the Senseval-3 all-words task, we run extensive experiments to show the performance enhancements in different precision ranges and the rationality of the probability-based automatic confidence evaluation of disambiguation. We combine our WSD algorithm with each of the five best WSD algorithms from the Senseval-3 all-words task. The results show that every combined algorithm outperforms the corresponding original algorithm.
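The idea of pairing a sense decision with a confidence value can be sketched generically; this toy function (not the IKN graph-matching algorithm, which the abstract does not detail) just picks the highest-scoring candidate sense and normalizes its score into a confidence:

```python
def disambiguate(sense_scores):
    """Pick the highest-scoring sense and report a normalized confidence.
    sense_scores: dict mapping candidate sense -> non-negative score."""
    total = sum(sense_scores.values())
    best = max(sense_scores, key=sense_scores.get)
    return best, sense_scores[best] / total
```

A downstream combiner can then defer to a second WSD system whenever the reported confidence falls below a threshold, which is one way the described algorithm combination could operate.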

Relevance:

100.00%

Publisher:

Abstract:

This paper presents the programming of an FPGA (Field Programmable Gate Array) to emulate the dynamics of DC machines. The FPGA allows high-speed real-time simulation with high precision. The described design includes a block diagram representation of the DC machine containing all arithmetic and logical operations. The real-time simulation of the machine in the FPGA is controlled through user interfaces: a keypad, an on-line LCD display and a digital-to-analog converter. This approach provides emulation of different electrical machines simply by changing the parameters. A separately excited DC machine was implemented, and experimental results are presented.
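The block-diagram arithmetic amounts to integrating the standard separately excited DC machine equations, V = Ri + L di/dt + Ke*w and J dw/dt = Kt*i - B*w - T_load. A forward-Euler sketch with illustrative parameter values (not taken from the paper):

```python
def dc_machine_step(i, w, V, T_load, dt,
                    R=1.0, L=0.01, J=0.05, B=0.001, Ke=0.1, Kt=0.1):
    """One forward-Euler step of armature current i (A) and rotor speed
    w (rad/s). All machine parameters here are illustrative assumptions."""
    di = (V - R * i - Ke * w) / L          # armature circuit dynamics
    dw = (Kt * i - B * w - T_load) / J     # mechanical dynamics
    return i + dt * di, w + dt * dw
```

On an FPGA the same update becomes fixed-point multiply-accumulate blocks evaluated once per time step, which is what makes high-speed real-time emulation possible.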

Relevance:

100.00%

Publisher:

Abstract:

Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to performing Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution but suffers from poor mixing if obtaining high-precision likelihood estimates is too computationally intensive. The MCWM algorithm has better mixing properties but less theoretical support. In this paper we propose to use Gaussian processes (GPs) to accelerate the GIMH method, using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
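The defining feature of GIMH is that the noisy likelihood estimate attached to the current state is recycled until a proposal is accepted (MCWM, by contrast, re-estimates it every iteration). A minimal random-walk sketch on a toy model, with all target and proposal choices purely illustrative:

```python
import math
import random

def gimh(log_lik_hat, log_prior, theta0, step, n_iters, seed=0):
    """Grouped independence MH with a Gaussian random-walk proposal.
    log_lik_hat(theta, rng) returns a noisy log-likelihood estimate."""
    rng = random.Random(seed)
    theta = theta0
    ll = log_lik_hat(theta, rng)           # estimate is stored, not refreshed
    chain = [theta]
    for _ in range(n_iters):
        prop = theta + rng.gauss(0.0, step)
        ll_prop = log_lik_hat(prop, rng)   # fresh estimate for the proposal only
        log_alpha = (ll_prop + log_prior(prop)) - (ll + log_prior(theta))
        if math.log(rng.random()) < log_alpha:
            theta, ll = prop, ll_prop      # accepted: estimate travels with state
        chain.append(theta)
    return chain
```

Recycling the estimate is what gives GIMH the exact posterior as its limiting distribution, but a lucky high estimate can "stick" and stall the chain, which is the poor-mixing behaviour the abstract describes when the estimates are noisy.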

Relevance:

100.00%

Publisher:

Abstract:

Rarely is it possible to obtain absolute population counts for free-ranging species, and although various direct and indirect methods are used to estimate abundance, few are validated against populations of known size. In this paper, we apply grounding, calibration and verification methods, used to validate mathematical models, to methods of estimating relative abundance. To illustrate how this might be done, we consider and evaluate the widely applied passive tracking index (PTI) methodology. Using published data, we examine the rationale of the PTI methodology, how animal activity and abundance are conceptually related, and how alternative methods are subject to similar biases or produce similar abundance estimates and trends. We then calibrate the method against populations representing the range of densities likely to be encountered in the field. Finally, we compare PTI trends against the prediction that adjacent populations of the same species will have similar abundance values and trends in activity. We show that while PTI abundance estimates are subject to environmental and behavioural stochasticity peculiar to each species, the PTI method and its associated variance estimate showed a high probability of detection, high precision of abundance values and, generally, low variability between surveys. We suggest that the PTI method, applied using this procedure and for these species, provides a sensitive and credible index of abundance. The same or a similar validation approach can and should be applied to alternative relative abundance methods to demonstrate their credibility and justify their use.

Relevance:

100.00%

Publisher:

Abstract:

We apply our technique of using a Rb-stabilized ring-cavity resonator to measure the frequencies of the various spectral components in the 555.8 nm 1S0 --> 3P1 line of Yb. We determine the isotope shifts with 60 kHz precision, an order-of-magnitude improvement over the best previous measurement on this line. Two transitions overlap, 171Yb(1/2 --> 3/2) and 173Yb(5/2 --> 3/2), which we resolve by applying a magnetic field. We thus obtain the hyperfine constants of the 3P1 state of the odd isotopes with significantly improved precision. Knowledge of the isotope shifts and hyperfine structure should prove useful for the high-precision calculations in Yb needed to interpret ongoing experiments testing parity and time-reversal symmetry violation in the laws of physics.

Relevance:

100.00%

Publisher:

Abstract:

A key challenge of wide-area kinematic positioning is to overcome the effects of the varying hardware biases in the code signals of the BeiDou system. Based on three geometry-free/ionosphere-free combinations, the elevation-dependent code biases are modelled for all BeiDou satellites. Results from 30-day data sets for five baselines of 533 to 2545 km demonstrate that wide-lane (WL) integer-fixing success rates of 98% to 100% can be achieved within 25 min. Under the condition of an HDOP of less than 2, the overall RMS statistics show that ionosphere-free WL single-epoch solutions achieve 24 to 50 cm accuracy in the horizontal direction. Smoothing over a moving window of 20 min reduces the RMS values by a factor of about 2. Given the distance-independent nature of these combinations, the results show the potential for reliable, high-precision positioning services to be provided over a wide area from a sparsely distributed ground network.
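A classic geometry-free, ionosphere-free combination used for wide-lane ambiguity fixing is the Melbourne-Wübbena combination; the abstract does not name its three combinations, so treat this as a generic example. A sketch, with the BeiDou B1I/B2I carrier frequencies assumed as defaults:

```python
def melbourne_wubbena(phi1, phi2, p1, p2,
                      f1=1561.098e6, f2=1207.140e6):
    """Melbourne-Wubbena combination (metres): cancels geometry, clock and
    ionosphere terms, leaving the wide-lane ambiguity plus code biases and
    noise. Defaults assume BeiDou B1I/B2I carrier frequencies in Hz; carrier
    phases phi1, phi2 and code ranges p1, p2 are all in metres."""
    wl_phase = (f1 * phi1 - f2 * phi2) / (f1 - f2)  # wide-lane carrier phase
    nl_code = (f1 * p1 + f2 * p2) / (f1 + f2)       # narrow-lane code
    return wl_phase - nl_code
```

Because everything geometric cancels, what remains is dominated by the wide-lane ambiguity and the very hardware code biases the paper sets out to model.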

Relevance:

100.00%

Publisher:

Abstract:

Frequency multiplication (FM) can be used to design low-power frequency synthesizers. Power is saved by running the VCO at a much reduced frequency while employing a power-efficient frequency multiplier, thereby also eliminating the first few dividers. Quadrature signals can be generated by frequency-multiplying low-frequency I/Q signals; however, this also multiplies the quadrature error of those signals. Another approach is to generate additional edges from the low-frequency oscillator (LFO) and build a quadrature frequency multiplier, but this makes the I/Q precision heavily dependent on process mismatches in the ring oscillator. In this paper we examine the use of fewer edges from the LFO and a single-stage polyphase filter to generate approximate quadrature signals, followed by an injection-locked quadrature VCO to generate high-precision I/Q signals. Simulation comparisons with the existing approach show that the proposed method offers a very good phase accuracy of 0.5° with only a modest increase in power dissipation, for the 2.4 GHz IEEE 802.15.4 standard using UMC 0.13 μm RFCMOS technology.

Relevance:

100.00%

Publisher:

Abstract:

In uplink OFDMA, carrier frequency offsets (CFOs) and/or timing offsets (TOs) of other users with respect to a desired user can cause multiuser interference (MUI). In practical uplink OFDMA systems (e.g., the IEEE 802.16e standard), the effect of this MUI is made acceptably small by requiring that frequency/timing alignment be achieved at the receiver with high precision (e.g., in IEEE 802.16e the CFO must be within 1% of the subcarrier spacing and the TO within 1/8th of the cyclic prefix duration), which is realized using complex closed-loop frequency/timing correction between the transmitter and the receiver. An alternative open-loop approach to handling the MUI induced by large CFOs and TOs is to employ interference cancellation techniques at the receiver. In this paper, we first analytically characterize the degradation in the average output signal-to-interference ratio (SIR) due to the combined effect of large CFOs and TOs in uplink OFDMA. We then propose a parallel interference canceller (PIC) for the mitigation of interference due to CFOs and TOs in this system. We show that the proposed PIC effectively mitigates the performance loss due to CFO/TO-induced interference in uplink OFDMA.
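The structure of one PIC pass can be sketched on a generic linear model y = Hx + n (a simplification of the OFDMA CFO/TO setting, not the paper's exact receiver): every user's interference is regenerated from the current estimates and subtracted simultaneously before re-estimating that user.

```python
import numpy as np

def pic_iteration(y, H, x_hat):
    """One parallel interference cancellation pass. For each user k, the
    interference regenerated from all other users' current estimates is
    subtracted from y, then user k is re-estimated by matched filtering."""
    K = H.shape[1]
    x_new = np.empty_like(x_hat)
    for k in range(K):
        others = H @ x_hat - H[:, k] * x_hat[k]      # regenerated interference
        residual = y - others                        # cancel it in parallel
        hk = H[:, k]
        x_new[k] = (hk.conj() @ residual) / (hk.conj() @ hk)
    return x_new
```

In a real OFDMA receiver the columns of H would embody each user's CFO/TO-distorted subcarrier signature, so a PIC pass directly removes the MUI terms that the closed-loop alignment would otherwise have to prevent.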