804 results for Real Exchange rate


Relevance:

30.00%

Publisher:

Abstract:

This paper presents an efficient face detection method suitable for real-time surveillance applications. Improved efficiency is achieved by constraining the search window of an AdaBoost face detector to pre-selected regions. Firstly, the proposed method takes a sparse grid of sample pixels from the image to reduce whole-image scan time. A fusion of foreground segmentation and skin colour segmentation is then used to select candidate face regions. Finally, a classifier-based face detector is applied only to the selected regions to verify the presence of a face (the Viola-Jones detector is used in this paper). The proposed system is evaluated using 640 × 480 pixel test images and compared with other relevant methods. Experimental results show that the proposed method reduces the detection time to 42 ms, whereas the Viola-Jones detector alone requires 565 ms (on a desktop processor). This improvement makes the face detector suitable for real-time applications. Furthermore, the proposed method requires 50% of the computation time of the best competing method, while reducing the false positive rate by 3.2% and maintaining the same hit rate.
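
A minimal sketch of this region-constrained detection idea, assuming OpenCV's MOG2 background subtractor, a YCrCb skin-colour mask and the stock frontal-face Haar cascade; the mask fusion rule, colour bounds and minimum region size are illustrative choices, not the authors' exact pipeline.

    import cv2

    # Standard frontal-face Haar cascade shipped with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    bg_subtractor = cv2.createBackgroundSubtractorMOG2()

    def detect_faces(frame):
        # Foreground mask from background subtraction (shadow pixels removed).
        fg_mask = bg_subtractor.apply(frame)
        _, fg_mask = cv2.threshold(fg_mask, 200, 255, cv2.THRESH_BINARY)
        # Skin-colour mask in YCrCb space (bounds are illustrative).
        ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
        skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
        # Fuse the two masks; only the fused regions are passed to the cascade.
        fused = cv2.bitwise_and(fg_mask, skin_mask)
        contours, _ = cv2.findContours(fused, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        faces = []
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            if w < 20 or h < 20:  # skip regions too small to contain a face
                continue
            roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
            for (fx, fy, fw, fh) in cascade.detectMultiScale(roi, 1.1, 4):
                faces.append((x + fx, y + fy, fw, fh))
        return faces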

Relevance:

30.00%

Publisher:

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which yields centimetre-level positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in positioning offsets of up to several metres without warning. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Undetected incorrect integers lead to hazardous results and should be strictly controlled; in ambiguity resolution this missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, the criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up the criteria table. This method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not yet been discussed. In this paper, a general ambiguity validation model is derived from hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed on the basis of extensive data simulations. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
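
The decision rule described here, a ratio test whose criterion is looked up from a simulation-derived criteria table, might be sketched as follows; the table values and the interpolation over the ILS success rate are placeholders for illustration, not the paper's computed criteria.

    import numpy as np

    # Hypothetical criteria table: for a tolerated failure rate, thresholds mu
    # tabulated against the ILS success rate (values are placeholders only).
    CRITERIA_TABLE = {
        0.01: {"success_rate": np.array([0.90, 0.95, 0.99, 0.999]),
               "mu":           np.array([0.70, 0.60, 0.45, 0.30])},
    }

    def ff_ratio_test(q1, q2, ils_success_rate, failure_rate=0.01):
        """q1 <= q2 are the squared norms of the best and second-best integer
        candidates; accept the fixed ambiguities if q1/q2 <= mu."""
        table = CRITERIA_TABLE[failure_rate]
        mu = np.interp(ils_success_rate, table["success_rate"], table["mu"])
        return (q1 / q2) <= mu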

Relevance:

30.00%

Publisher:

Abstract:

Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To make use of these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, and the narrowlane ambiguity integers are then determined with the ionosphere-free model, into which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected in order of successively increasing elevation. For fixing the scalar ambiguity, an error-probability-controllable rounding method is proposed. The established ionosphere-constrained model can be efficiently solved with a sequential Kalman filter; it can either be reduced to special models simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the USA CORS network. The results show that the new widelane AR scheme obtains a 99.4% successful fixing rate with a 0.6% failure rate, while the new rounding method for narrowlane AR obtains a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with a rigorously controllable probability of incorrectly fixed ambiguities.
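
The scalar rounding step can be illustrated with a short sketch that fixes a narrowlane ambiguity only when the textbook zero-bias rounding success probability meets a failure-rate tolerance; this is the standard formula, not necessarily the paper's exact acceptance rule.

    from math import erf, sqrt

    def norm_cdf(x):
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def round_with_controlled_risk(a_float, sigma, max_failure_rate=0.01):
        # Probability that simple rounding of an unbiased float ambiguity with
        # standard deviation sigma (cycles) yields the correct integer.
        p_success = 2.0 * norm_cdf(1.0 / (2.0 * sigma)) - 1.0
        if (1.0 - p_success) <= max_failure_rate:
            return round(a_float), True   # fix to the nearest integer
        return None, False                # leave the ambiguity float

    # Example: sigma = 0.12 cycles gives a success probability of about 0.99997.
    print(round_with_controlled_risk(3.94, 0.12))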

Relevance:

30.00%

Publisher:

Abstract:

This paper reports a study of ion exchange (IX) as an alternative coal seam gas (CSG) water treatment to the widely used reverse osmosis (RO) desalination process. An IX pilot plant facility has been constructed and operated using both synthetic and real CSG water samples. Application of appropriate synthetic resin technology has demonstrated the effectiveness of IX processes.

Relevance:

30.00%

Publisher:

Abstract:

Most existing motorway traffic safety studies using disaggregate traffic flow data aim at developing models for identifying real-time traffic risks by comparing pre-crash and non-crash conditions. One serious shortcoming of those studies is that non-crash conditions are arbitrarily selected and hence not representative: the selected non-crash data might not be comparable with the pre-crash data, and the non-crash/pre-crash ratio is arbitrarily decided and neglects the abundance of non-crash over pre-crash conditions. Here, we present a methodology for developing a real-time MotorwaY Traffic Risk Identification Model (MyTRIM) using individual vehicle data, meteorological data, and crash data. Non-crash data are clustered into groups called traffic regimes. Thereafter, pre-crash data are classified into regimes to match with the relevant non-crash data. Of the eight traffic regimes obtained, four highly risky regimes were identified, and three regime-based Risk Identification Models (RIM) with sufficient pre-crash data were developed. MyTRIM memorizes the latest risk evolution identified by RIM to predict near-future risks. Traffic practitioners can decide MyTRIM’s memory size based on the trade-off between detection and false alarm rates: decreasing the memory size from 5 to 1 increases the detection rate from 65.0% to 100.0% and the false alarm rate from 0.21% to 3.68%. Moreover, critical factors in differentiating pre-crash and non-crash conditions are recognized and can be used for developing preventive measures. MyTRIM can be used by practitioners in real time as an independent tool to make online decisions or integrated with existing traffic management systems.
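
The memory mechanism might be sketched as below; the rule that an alarm requires every interval in the memory window to be flagged risky is an assumption used only to illustrate why a smaller memory raises both the detection and false alarm rates, and the class and method names are hypothetical.

    from collections import deque

    class RiskMemory:
        """Keep the last `memory_size` regime-based RIM outputs."""

        def __init__(self, memory_size=5):
            self.history = deque(maxlen=memory_size)

        def update(self, rim_flags_risky):
            # rim_flags_risky: True if the regime-based RIM classified the
            # current traffic interval as risky.
            self.history.append(bool(rim_flags_risky))
            # Raise an alarm only when every interval in the memory window is
            # risky: a shorter memory therefore alarms more readily, raising
            # both the detection rate and the false alarm rate.
            return len(self.history) == self.history.maxlen and all(self.history)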

Relevance:

30.00%

Publisher:

Abstract:

Ascidians are marine invertebrates that have been a source of numerous cytotoxic compounds. Of the first six marine-derived drugs that reached anticancer clinical trials, three originated from ascidian specimens. In order to identify new anti-neoplastic compounds, an ascidian extract library (143 samples) was generated and screened in MDA-MB-231 breast cancer cells using a real-time cell analyzer (RTCA). This resulted in 143 time-dependent cell response profiles (TCRP), which are read-outs of changes to the growth rate, morphology, and adhesive characteristics of the cell culture. Twenty-one extracts affected the TCRP of MDA-MB-231 cells and were further investigated regarding toxicity and specificity, as well as their effects on cell morphology and cell cycle. The results of these studies were used to prioritize extracts for bioassay-guided fractionation, which led to the isolation of the previously identified marine natural product eusynstyelamide B (1). This bis-indole alkaloid was shown to display an IC50 of 5 μM in MDA-MB-231 cells. Moreover, 1 caused a strong cell cycle arrest in G2/M and induced apoptosis after 72 h of treatment, making this molecule an attractive candidate for further mechanism-of-action studies.

Relevance:

30.00%

Publisher:

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, the method for determining their threshold is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modelling procedure and an approximation procedure. The modelling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modelling and approximation errors are analysed with simulation data so as to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modelling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
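
The two procedures can be sketched as follows: the IB success rate is computed from the conditional standard deviations of the decorrelated ambiguities via the standard bootstrapping formula, and a rational function then maps it to an FF-difference test threshold; the rational-function coefficients are placeholders, since in the paper they are fitted from simulation.

    import numpy as np
    from scipy.stats import norm

    def ib_success_rate(cond_std):
        # Standard integer bootstrapping success rate computed from the
        # conditional standard deviations of the decorrelated ambiguities.
        s = np.asarray(cond_std, dtype=float)
        return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * s)) - 1.0))

    def difference_test_threshold(p_ib, a=(30.0, -29.0), b=(1.0, 0.5)):
        # Rational function mu(P) = (a0 + a1*P) / (b0 + b1*P); the coefficients
        # are hypothetical stand-ins for the values fitted from simulation.
        return (a[0] + a[1] * p_ib) / (b[0] + b[1] * p_ib)

    # Example: threshold for a model with three decorrelated ambiguities.
    print(difference_test_threshold(ib_success_rate([0.10, 0.12, 0.15])))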

Relevance:

30.00%

Publisher:

Abstract:

Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer phase ambiguities before they are used in the positioning computation. Most existing investigations on ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous basis in probability theory, but it involves a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate for a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function, is proposed for the FF-difference test. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method and it makes the FF-approach applicable to real-time GNSS positioning applications.
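
A self-contained sketch of the FF-difference test decision; the linear form of the success-rate-dependent threshold and its coefficients are placeholders standing in for the fitted threshold function.

    def difference_threshold(ils_success_rate, c0=25.0, c1=-24.0):
        # Hypothetical monotone map: stronger models (higher success rate)
        # tolerate a smaller separation between the two best candidates.
        return max(c0 + c1 * ils_success_rate, 0.0)

    def ff_difference_test(q1, q2, ils_success_rate):
        """q1 <= q2: squared norms of the best and second-best integer
        candidates. Accept the fixed ambiguities if their separation exceeds
        the success-rate-dependent threshold."""
        return (q2 - q1) >= difference_threshold(ils_success_rate)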

Relevance:

30.00%

Publisher:

Abstract:

Regenerable 'gel-coated' cationic resins with fast sorption kinetics and high sorption capacity have application potential for removal of trace metal ions even in large-scale operations. Poly(acrylic acid) has been gel-coated on high-surface-area silica (pre-coated with ethylene-vinyl acetate copolymer providing a thin barrier layer) and insolubilized by crosslinking with a low-molecular-weight diepoxide (epoxy equivalent 180 g) in the presence of benzyl dimethylamine catalyst at 70 °C. In experiments performed for Ca2+ sorption from dilute aqueous solutions of Ca(NO3)2, the gel-coated acrylic resin is found to have nearly 40% higher sorption capacity than the bead-form commercial methacrylic resin Amberlite IRC-50 and also a several times higher rate of sorption. The sorption on the gel-coated sorbent under vigorous agitation has the characteristics of particle diffusion control with homogeneous (gel) diffusion in the resin phase. A new mathematical model is proposed for such sorption on gel-coated ion-exchange resin in a finite bath and solved by applying operator-theoretic methods. The analytical solution so obtained shows good agreement with experimental sorption kinetics at relatively low levels (< 70%) of resin conversion.
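
For comparison, a minimal sketch of the classical homogeneous particle-diffusion (Boyd) uptake series for a sphere in an effectively infinite bath; the paper's finite-bath, operator-theoretic solution is not reproduced here, and the diffusivity and radius are illustrative inputs.

    import numpy as np

    def fractional_uptake(t, D, r, n_terms=50):
        """Boyd series F(t) = 1 - (6/pi^2) * sum_n exp(-n^2 pi^2 D t / r^2) / n^2
        for a sphere of radius r (cm) with intraparticle diffusivity D (cm^2/s)."""
        n = np.arange(1, n_terms + 1)
        tau = D * t / r ** 2
        return 1.0 - (6.0 / np.pi ** 2) * np.sum(np.exp(-(n ** 2) * np.pi ** 2 * tau) / n ** 2)

    # Example: fractional attainment of equilibrium after 10 minutes.
    print(fractional_uptake(t=600.0, D=1e-7, r=0.05))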

Relevance:

30.00%

Publisher:

Abstract:

Odour emission rates were measured for seven different anaerobic ponds treating piggery wastes, at six to nine discrete locations across the surface of each pond on each sampling occasion, over a thirteen-month period. Significant variability in emission rates was observed for each pond. Measurement of a number of water quality variables in pond liquor samples, collected at the same time and from the same locations as the odour samples, indicated that the composition of the pond liquor was also variable. The results indicated that spatial variability was a real phenomenon and could have a significant impact on odour assessment practices. Considerably more odour samples would be required to characterise pond emissions than are currently recommended by most practitioners or regulatory agencies.

Relevance:

30.00%

Publisher:

Abstract:

Trichinella nematodes are the causative agent of trichinellosis, a meat-borne zoonosis acquired by consuming undercooked, infected meat. Although most human infections are sourced from the domestic environment, the majority of Trichinella parasites circulate in the natural environment in carnivorous and scavenging wildlife. Surveillance using reliable and accurate diagnostic tools to detect Trichinella parasites in wildlife hosts is necessary to evaluate the prevalence and the risk of transmission from wildlife to humans. Real-time PCR assays have previously been developed for the detection of European Trichinella species in commercial pork and wild fox muscle samples. We have expanded the use of real-time PCR in Trichinella detection by developing an improved extraction method and a SYBR green assay that detects all known Trichinella species in muscle samples from a greater variety of wildlife. We simulated low-level Trichinella infections in wild pig, fox, saltwater crocodile, wild cat and a native Australian marsupial using Trichinella pseudospiralis or Trichinella papuae ethanol-fixed larvae. Trichinella-specific primers targeted a conserved region of the small subunit of the ribosomal RNA and were tested for specificity against host and other parasite genomic DNAs. The analytical sensitivity of the assay was at least 100 fg using pure genomic T. pseudospiralis DNA serially diluted in water. The diagnostic sensitivity of the assay was evaluated by spiking samples of each host muscle with T. pseudospiralis or T. papuae larvae at representative infections of 1.0, 0.5 and 0.1 larvae per gram, and the assay was shown to detect larvae at the lowest infection rate. A field sample evaluation on naturally infected muscle samples of wild pigs and Tasmanian devils showed complete agreement with the EU reference artificial digestion method (kappa value = 1.00). Positive amplification of mouse tissue experimentally infected with T. spiralis indicated that the assay could also be used on encapsulated species in situ. This real-time PCR assay offers an alternative, highly specific and sensitive diagnostic method for use in Trichinella wildlife surveillance and could be adapted to wildlife hosts of any region.

Relevance:

30.00%

Publisher:

Abstract:

Real-time scheduling algorithms, such as Rate Monotonic and Earliest Deadline First, guarantee that calculations are performed within a pre-defined time. As many real-time systems operate on limited battery power, these algorithms have been enhanced with power-aware properties. In this thesis, 13 power-aware real-time scheduling algorithms for processor, device and system-level use are explored.
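
As background for the rate monotonic algorithm mentioned above, a minimal sketch of the classical Liu and Layland sufficient schedulability test; this is standard textbook material rather than a result of the thesis.

    def rm_schedulable(tasks):
        """Sufficient Liu-Layland test for Rate Monotonic scheduling: the task
        set (list of (computation_time, period) pairs) is schedulable if the
        total utilisation does not exceed n * (2**(1/n) - 1)."""
        n = len(tasks)
        utilisation = sum(c / t for c, t in tasks)
        return utilisation <= n * (2.0 ** (1.0 / n) - 1.0)

    # Example: utilisation 0.65 is below the n = 3 bound of about 0.780.
    print(rm_schedulable([(1, 4), (1, 5), (2, 10)]))  # True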

Relevance:

30.00%

Publisher:

Abstract:

This study reports an investigation of the ion exchange treatment of sodium chloride solutions in relation to the use of resin technology for applications such as desalination of brackish water. In particular, a strong acid cation (SAC) resin (DOW Marathon C) was studied to determine its capacity for sodium uptake and to evaluate the fundamentals of the ion exchange process involved. Key questions to answer included: the impact of resin identity; the best models to simulate the kinetics and equilibrium exchange behaviour of sodium ions; the difference between using linear least squares (LLS) and non-linear least squares (NLLS) methods for data interpretation; and the effect of changing the type of anion in solution which accompanied the sodium species. Kinetic studies suggested that the exchange process was best described by a pseudo first order rate expression based upon non-linear least squares analysis of the test data. Application of the Langmuir Vageler isotherm model was recommended, as it allowed confirmation that the experimental conditions were sufficient for maximum loading of sodium ions to occur. The Freundlich expression best fitted the equilibrium data when the information was analysed by the NLLS approach; in contrast, LLS methods suggested that the Langmuir model was optimal for describing the equilibrium process. The Competitive Langmuir model, which considers the stoichiometric nature of the ion exchange process, estimated the maximum loading of sodium ions to be 64.7 g Na/kg resin. This latter value was comparable to sodium ion capacities for SAC resin published previously. The inherent discrepancies involved in using linearised versions of the kinetic and isotherm equations were illustrated, and despite their widespread use, the value of this latter approach was questionable. The equilibrium behaviour of sodium ions from sodium fluoride solution revealed that the sodium ions were now more preferred by the resin compared with the situation for sodium chloride. The solution chemistry of hydrofluoric acid was suggested as promoting the affinity of the sodium ions to the resin.
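
The LLS-versus-NLLS point can be illustrated by fitting a pseudo first order uptake curve, q(t) = qe(1 - exp(-kt)), both directly and through its linearised form; the uptake data below are invented for illustration and are not the study's measurements.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical uptake data: contact time (min) vs sodium loading (g Na/kg).
    t = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0, 60.0])
    q = np.array([12.0, 21.0, 38.0, 52.0, 60.0, 63.5, 64.5])

    def pfo(t, qe, k):
        # Pseudo first order uptake: q(t) = qe * (1 - exp(-k * t)).
        return qe * (1.0 - np.exp(-k * t))

    # Non-linear least squares: fit qe and k directly on the original data.
    (qe_nlls, k_nlls), _ = curve_fit(pfo, t, q, p0=(65.0, 0.1))

    # Linearised least squares: ln(qe - q) = ln(qe) - k*t requires assuming qe
    # in advance (here the largest observed uptake), which is one source of the
    # discrepancies between the two approaches noted in the study.
    qe_assumed = q.max() * 1.01
    slope, intercept = np.polyfit(t, np.log(qe_assumed - q), 1)
    print(qe_nlls, k_nlls, -slope)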

Relevance:

30.00%

Publisher:

Abstract:

Reverse osmosis is the dominant technology for desalination of the saline water produced during the extraction of coal seam gas. Alternatively, ion exchange is of interest due to potential cost advantages. However, there is limited information regarding the column performance of strong acid cation resin for removal of sodium ions from both model and actual coal seam water samples; in particular, the impact of bed depth, flow rate, and regeneration was not clear. Consequently, this study applied Bed Depth Service Time (BDST) models, which revealed that increasing the sodium ion concentration and flow rate diminished the time required for breakthrough to occur. The loading of sodium ions on fresh resin was calculated to be ca. 71.1 g Na/kg resin. Difficulties in regenerating the resin with hydrochloric acid solutions were encountered, with 86% recovery of exchange sites observed. The maximum concentration of sodium ions in the regenerant brine was found to be 47,400 mg/L under the conditions employed. The volume of regenerant waste formed was 6.2% of the total volume of water treated. A coal seam water sample was found to load the resin with only 53.5 g Na/kg resin, which was consistent with not only the co-presence of more favoured ions such as calcium, magnesium, barium and strontium, but also inefficient regeneration of the resin prior to the coal seam water test.
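
A minimal sketch of the classical BDST relation applied in such column studies; all parameter values are illustrative placeholders, not the study's fitted constants.

    from math import log

    def bdst_service_time(Z, u, C0, Cb, N0, Ka):
        """Service time t = (N0 / (C0 * u)) * Z - (1 / (Ka * C0)) * ln(C0/Cb - 1)
        for bed depth Z (cm), linear flow velocity u (cm/h), influent and
        breakthrough concentrations C0 and Cb (mg/L), bed sorption capacity
        N0 (mg per litre of bed) and rate constant Ka (L/(mg h))."""
        return (N0 / (C0 * u)) * Z - (1.0 / (Ka * C0)) * log(C0 / Cb - 1.0)

    # Example: deeper beds and slower flows lengthen the service time.
    print(bdst_service_time(Z=30.0, u=120.0, C0=1000.0, Cb=100.0,
                            N0=50000.0, Ka=0.0005))  # about 8.1 h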

Relevance:

30.00%

Publisher:

Abstract:

Objective: We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Background: Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. Method: We examined threat detection in baggage screeners as a function of event rate (i.e., number of bags per minute) and time on task across 4 months. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Results: Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of these busy screening shifts only. Conclusion: Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Application: Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety.