40 results for Real exchange rate


Relevance: 30.00%

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which delivers centimetre-level positioning results when all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can introduce positioning offsets of up to several metres without warning. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk: depending on the strength of the underlying model, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user's requirements. Undetected incorrect integers lead to hazardous results and should be strictly controlled; in ambiguity resolution, the rate of undetected incorrect fixes is known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory and is therefore theoretically rigorous. A criteria table for the ratio test is computed from extensive data simulations, so that real-time users can determine the ratio test criterion by table lookup. This method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not yet been discussed. In this paper, a general ambiguity validation model is derived from hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the ratio test threshold under the fixed failure rate approach are discussed on the basis of extensive data simulation. The results show that, given a proper stochastic model, the fixed failure rate approach is a more reasonable ambiguity validation method.
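
To make the ratio test concrete, here is a minimal Python sketch of validating a fixed integer candidate against a criteria-table threshold. The quadratic-form computation follows the standard integer least-squares formulation; the function names and the threshold argument are illustrative assumptions, and the criteria-table values themselves are not reproduced here.

```python
import numpy as np

def quadratic_form(a_float, a_int, Q):
    """Squared distance ||a_float - a_int||^2 in the metric of the
    ambiguity covariance matrix Q."""
    d = np.asarray(a_float, dtype=float) - np.asarray(a_int, dtype=float)
    return float(d @ np.linalg.solve(Q, d))

def ratio_test(a_float, a_best, a_second, Q, mu):
    """Accept the best integer candidate if q_best / q_second <= mu, where
    mu would be looked up in a fixed-failure-rate criteria table for the
    given model strength and failure rate tolerance."""
    q_best = quadratic_form(a_float, a_best, Q)
    q_second = quadratic_form(a_float, a_second, Q)
    return q_best / q_second <= mu
```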

Relevance: 30.00%

Abstract:

Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To use these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing a scalar ambiguity, a rounding method with controllable error probability is proposed. The established ionosphere-constrained model can be solved efficiently with a sequential Kalman filter; it can be either reduced to special cases simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the US CORS network. The results show that the new widelane AR scheme achieves a 99.4 % success rate with a 0.6 % failure rate, while the new rounding method for narrowlane AR achieves a fix rate of 89 % with a failure rate of 0.8 %. In summary, AR reliability can be efficiently improved with rigorously controllable probability of incorrectly fixed ambiguities.
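
A plausible reading of the scalar rounding step is sketched below: a narrowlane ambiguity is rounded only when the analytical rounding failure probability, computed from the float estimate's standard deviation under a Gaussian assumption, stays within a tolerance. The function names and the default tolerance are assumptions for illustration, not the paper's exact procedure.

```python
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def round_if_safe(a_float: float, sigma: float, tol: float = 0.008):
    """Round a scalar ambiguity to the nearest integer only if the
    probability of rounding to a wrong integer,
    1 - (2*Phi(1/(2*sigma)) - 1), does not exceed the tolerance;
    otherwise keep the float solution (return None). The default tol
    mirrors the 0.8 % narrowlane failure rate reported above, purely
    as a placeholder."""
    p_fail = 1.0 - (2.0 * norm_cdf(1.0 / (2.0 * sigma)) - 1.0)
    return round(a_float) if p_fail <= tol else None
```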

Relevance: 30.00%

Abstract:

This paper reports a study of ion exchange (IX) as an alternative treatment for coal seam gas (CSG) water to the widely used reverse osmosis (RO) desalination process. An IX pilot plant was constructed and operated using both synthetic and real CSG water samples. Application of appropriate synthetic resin technology demonstrated the effectiveness of the IX process.

Relevance: 30.00%

Abstract:

Most existing motorway traffic safety studies using disaggregate traffic flow data aim to develop models for identifying real-time traffic risks by comparing pre-crash and non-crash conditions. A serious shortcoming of those studies is that non-crash conditions are arbitrarily selected and hence not representative: the selected non-crash data might not be comparable with the pre-crash data, and the non-crash/pre-crash ratio is arbitrarily chosen, neglecting the abundance of non-crash over pre-crash conditions. Here, we present a methodology for developing a real-time MotorwaY Traffic Risk Identification Model (MyTRIM) using individual vehicle data, meteorological data, and crash data. Non-crash data are clustered into groups called traffic regimes, and pre-crash data are then classified into regimes to match them with the relevant non-crash data. Of the eight traffic regimes obtained, four highly risky regimes were identified, and three regime-based Risk Identification Models (RIM) with sufficient pre-crash data were developed. MyTRIM memorizes the latest risk evolution identified by RIM to predict near-future risks. Traffic practitioners can choose MyTRIM's memory size based on the trade-off between detection and false alarm rates: decreasing the memory size from 5 to 1 increases the detection rate from 65.0% to 100.0% and the false alarm rate from 0.21% to 3.68%. Moreover, the critical factors that differentiate pre-crash and non-crash conditions are identified and can be used for developing preventive measures. MyTRIM can be used by practitioners in real time, either as an independent tool for online decision making or integrated with existing traffic management systems.
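
The memory mechanism can be illustrated with a short sketch. The sliding-window class below and its all-flags alarm rule are assumptions chosen only to reproduce the reported trade-off (a shorter memory triggers more easily), not MyTRIM's published logic.

```python
from collections import deque

class RiskMemory:
    """Keep the latest RIM risk flags; raise an alarm when the whole
    window agrees that conditions are risky (assumed rule)."""

    def __init__(self, memory_size: int = 5):
        self.window = deque(maxlen=memory_size)

    def update(self, rim_flags_risk: bool) -> bool:
        self.window.append(rim_flags_risk)
        # A shorter memory needs fewer consecutive risky flags to alarm,
        # raising both detection and false alarm rates, consistent with
        # the figures reported above.
        return len(self.window) == self.window.maxlen and all(self.window)
```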

Relevance: 30.00%

Abstract:

Ascidians are marine invertebrates that have been a source of numerous cytotoxic compounds. Of the first six marine-derived drugs to enter anticancer clinical trials, three originated from ascidian specimens. In order to identify new anti-neoplastic compounds, an ascidian extract library (143 samples) was generated and screened in MDA-MB-231 breast cancer cells using a real-time cell analyzer (RTCA). This resulted in 143 time-dependent cell response profiles (TCRP), which are read-outs of changes to the growth rate, morphology, and adhesive characteristics of the cell culture. Twenty-one extracts affected the TCRP of MDA-MB-231 cells and were further investigated regarding toxicity and specificity, as well as their effects on cell morphology and cell cycle. The results of these studies were used to prioritize extracts for bioassay-guided fractionation, which led to the isolation of the previously identified marine natural product, eusynstyelamide B (1). This bis-indole alkaloid was shown to display an IC50 of 5 μM in MDA-MB-231 cells. Moreover, 1 caused a strong cell cycle arrest in G2/M and induced apoptosis after 72 h of treatment, making this molecule an attractive candidate for further mechanism-of-action studies.

Relevance: 30.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, how to determine their thresholds is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and reasonably. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach by a modeling procedure and an approximation procedure. The modeling procedure uses a rational function to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling error and approximation error are analysed with simulated data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method greatly simplifies the FF-approach without introducing significant modeling error, making fixed failure rate threshold determination feasible for real-time applications.
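
The two procedures can be sketched together: the approximation step computes the IB success rate from the Cholesky factor of the ambiguity covariance matrix (a standard closed-form result), and the modeling step maps it to a threshold through a rational function. The rational-function coefficients below are placeholders; the paper fits them from simulation for a given failure rate tolerance.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def ib_success_rate(Q) -> float:
    """IB success rate P_s = prod_i (2*Phi(1/(2*sigma_i)) - 1), where the
    sequential conditional std devs sigma_i are the diagonal entries of
    the lower Cholesky factor of the ambiguity covariance matrix Q."""
    sigmas = np.diag(np.linalg.cholesky(np.asarray(Q, dtype=float)))
    return float(np.prod([2.0 * norm_cdf(1.0 / (2.0 * s)) - 1.0
                          for s in sigmas]))

def difference_test_threshold(p_s: float, a=0.0, b=12.0, c=3.0) -> float:
    """Hypothetical rational threshold function
    mu(P_s) = (a + b*(1 - P_s)) / (1 + c*(1 - P_s));
    the coefficients are illustrative placeholders, not fitted values."""
    x = 1.0 - p_s
    return (a + b * x) / (1.0 + c * x)
```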

Relevance: 30.00%

Abstract:

Ambiguity validation, an important step of integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in the positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis; the fixed failure rate approach has a rigorous basis in probability theory but employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function for the FF-difference test, is proposed. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data; the results show that, with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
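
For illustration, the FF-difference test itself reduces to a single comparison once the two smallest quadratic forms of the integer candidates are available. A minimal sketch, assuming q1 <= q2 and a threshold obtained from a threshold function such as the one sketched earlier:

```python
def ff_difference_test(q1: float, q2: float, threshold: float) -> bool:
    """Accept the best integer candidate when the second-best candidate
    is worse by at least `threshold` in the metric of the ambiguity
    covariance matrix (difference test statistic d = q2 - q1)."""
    return (q2 - q1) >= threshold
```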

Relevance: 30.00%

Abstract:

This study reports an investigation of the ion exchange treatment of sodium chloride solutions in relation to the use of resin technology for applications such as desalination of brackish water. In particular, a strong acid cation (SAC) resin (DOW Marathon C) was studied to determine its capacity for sodium uptake and to evaluate the fundamentals of the ion exchange process involved. Key questions included: the impact of resin identity; the best models to simulate the kinetics and equilibrium exchange behaviour of sodium ions; the difference between using linear least squares (LLS) and non-linear least squares (NLLS) methods for data interpretation; and the effect of changing the type of anion accompanying the sodium species in solution. Kinetic studies suggested that the exchange process was best described by a pseudo-first-order rate expression, based on non-linear least squares analysis of the test data. Application of the Langmuir-Vageler isotherm model was recommended, as it allowed confirmation that the experimental conditions were sufficient for maximum loading of sodium ions to occur. The Freundlich expression best fitted the equilibrium data when analysed by the NLLS approach; in contrast, LLS methods suggested that the Langmuir model was optimal for describing the equilibrium process. The Competitive Langmuir model, which accounts for the stoichiometric nature of the ion exchange process, estimated the maximum loading of sodium ions to be 64.7 g Na/kg resin, a value comparable to previously published sodium ion capacities for SAC resins. The inherent discrepancies involved in using linearized versions of kinetic and isotherm equations were illustrated, and despite their widespread use, the value of the linearized approach was questionable. The equilibrium behaviour of sodium ions from sodium fluoride solution revealed that sodium ions were more preferred by the resin than in the sodium chloride case; the solution chemistry of hydrofluoric acid was suggested as promoting the affinity of the sodium ions for the resin.
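
The LLS-versus-NLLS point can be made concrete with a short sketch fitting the pseudo-first-order model q(t) = q_e * (1 - exp(-k*t)) both ways. The data points, starting values, and the assumed q_e for the linearized form are made-up placeholders, not measurements from this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder uptake data: time (min) vs loading (g Na/kg resin).
t = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0, 60.0])
q = np.array([12.0, 21.0, 38.0, 52.0, 60.0, 64.0, 64.5])

def pfo(t, qe, k):
    """Pseudo-first-order kinetic model."""
    return qe * (1.0 - np.exp(-k * t))

# Non-linear least squares: fits qe and k directly to the raw data.
(qe_nlls, k_nlls), _ = curve_fit(pfo, t, q, p0=(65.0, 0.1))

# Linearized form ln(qe - q) = ln(qe) - k*t: requires assuming qe up
# front (here, slightly above the largest observation) and distorts
# the error structure, which is the source of the discrepancies noted.
qe_guess = 1.02 * q.max()
slope, intercept = np.polyfit(t, np.log(qe_guess - q), 1)
k_lls = -slope

print(f"NLLS: qe = {qe_nlls:.1f}, k = {k_nlls:.3f}; "
      f"LLS (qe assumed {qe_guess:.1f}): k = {k_lls:.3f}")
```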

Relevance: 30.00%

Abstract:

Reverse osmosis is the dominant technology for desalination of the saline water produced during the extraction of coal seam gas. Ion exchange is of interest as an alternative due to potential cost advantages; however, there is limited information regarding the column performance of strong acid cation resin for the removal of sodium ions from both model and actual coal seam water samples, in particular the impact of bed depth, flow rate, and regeneration. Consequently, this study applied Bed Depth Service Time (BDST) models, which revealed that increasing sodium ion concentration and flow rates reduced the time to breakthrough. The loading of sodium ions on fresh resin was calculated to be ca. 71.1 g Na/kg resin. Regeneration of the resin with hydrochloric acid solutions proved difficult, with only 86% recovery of exchange sites observed. The maximum concentration of sodium ions in the regenerant brine was 47,400 mg/L under the conditions employed, and the volume of regenerant waste formed was 6.2% of the total volume of water treated. A coal seam water sample loaded the resin with only 53.5 g Na/kg resin, consistent not only with the co-presence of more favoured ions such as calcium, magnesium, barium, and strontium, but also with inefficient regeneration of the resin prior to the coal seam water test.
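
The standard BDST relationship, t = (N0 * Z) / (C0 * v) - ln(C0/Cb - 1) / (k * C0), links service time to bed depth and flow conditions, which is how the breakthrough trends above are modelled. The sketch below evaluates it with placeholder parameter values chosen only for illustration, not the study's fitted constants.

```python
from math import log

def bdst_service_time(Z, N0, C0, v, k, Cb):
    """BDST service time until the effluent reaches the breakthrough
    concentration Cb.

    Z:  bed depth (m)                N0: volumetric capacity (g/m^3)
    C0: feed concentration (g/m^3)   v:  linear velocity (m/h)
    k:  rate constant (m^3/(g*h))    Cb: breakthrough conc. (g/m^3)
    """
    return (N0 * Z) / (C0 * v) - log(C0 / Cb - 1.0) / (k * C0)

# Deeper beds lengthen the service time; higher flow rates (larger v)
# shorten it, matching the breakthrough behaviour reported above.
for Z in (0.5, 1.0, 1.5):
    t = bdst_service_time(Z, N0=70_000.0, C0=1_000.0, v=5.0,
                          k=0.002, Cb=100.0)
    print(f"Z = {Z} m -> t = {t:.1f} h")
```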

Relevance: 30.00%

Abstract:

Objective: We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Background: Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. Method: We examined threat detection in baggage screeners as a function of event rate (i.e., the number of bags per minute) and time on task across 4 months. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Results: Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of these busy screening shifts. Conclusion: Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Application: Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety.