911 results for Global navigation satellite systems
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Undetected incorrect integers lead to hazardous results and should be strictly controlled; in ambiguity resolution, this missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, a criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up the table. This method has been applied in medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the ratio test threshold in the fixed failure rate approach are discussed on the basis of extensive data simulations. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
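The fixed failure rate idea above chooses the ratio-test threshold from the underlying model and a target failure rate rather than from experience. The Python sketch below illustrates the idea with a small Monte Carlo simulation; it is not the paper's implementation: the 2-D covariance matrix is hypothetical, a simple neighbourhood enumeration stands in for a full ILS/LAMBDA search, and a real criteria table would be tabulated over many models and failure-rate levels.

```python
# Sketch only: ratio-test threshold for a fixed failure rate via simulation.
import itertools
import numpy as np

def quadratic_form(a_float, z, Q_inv):
    d = a_float - z
    return float(d @ Q_inv @ d)

def best_two_candidates(a_float, Q_inv):
    """Approximate the best and second-best integer candidates by scoring the
    +/-1 neighbourhood of the rounded float solution (a proper ILS/LAMBDA
    search would be used in practice)."""
    base = np.round(a_float).astype(int)
    cands = {tuple(base + np.array(off))
             for off in itertools.product((-1, 0, 1), repeat=len(a_float))}
    scored = sorted((quadratic_form(a_float, np.array(z), Q_inv), z) for z in cands)
    return scored[0], scored[1]

def threshold_for_fixed_failure_rate(Q_hat, target_failure_rate=0.001,
                                     n_trials=20000, seed=1):
    """Pick the smallest ratio-test threshold whose simulated failure rate
    (fix accepted AND incorrect) stays below the target."""
    rng = np.random.default_rng(seed)
    Q_inv = np.linalg.inv(Q_hat)
    L = np.linalg.cholesky(Q_hat)
    ratios, correct = [], []
    for _ in range(n_trials):
        a_float = L @ rng.standard_normal(Q_hat.shape[0])  # truth = zero vector
        (q1, z1), (q2, _) = best_two_candidates(a_float, Q_inv)
        ratios.append(q2 / q1 if q1 > 0 else np.inf)
        correct.append(all(v == 0 for v in z1))
    ratios, correct = np.array(ratios), np.array(correct)
    for c in np.arange(1.0, 10.0, 0.1):       # accept the fix when q2/q1 >= c
        if np.mean((ratios >= c) & ~correct) <= target_failure_rate:
            return round(float(c), 1)
    return np.inf

# Hypothetical 2-D ambiguity covariance, purely for illustration.
Q = np.array([[0.064, 0.030],
              [0.030, 0.045]])
print(threshold_for_fixed_failure_rate(Q))
```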
Abstract:
Global Navigation Satellite Systems (GNSS)-based observation systems can provide high-precision positioning and navigation solutions in real time, at the sub-centimetre level, if carrier phase measurements are used in differential mode and all bias and noise terms are handled well. However, these carrier phase measurements are ambiguous by an unknown integer number of cycles. One key challenge in the differential carrier phase mode is to fix the integer ambiguities correctly. Moreover, in safety-of-life or liability-critical applications, such as vehicle safety positioning and aviation, not only is high accuracy required, but reliability is equally important. This PhD research studies how to achieve high reliability for ambiguity resolution (AR) in a multi-GNSS environment. GNSS ambiguity estimation and validation problems are the focus of the research effort. In particular, we study the case of multiple constellations, covering the initial to full operations of the foreseeable Galileo, GLONASS, Compass and QZSS navigation systems over the next few years to the end of the decade. Since real observation data are only available from the GPS and GLONASS systems, a simulation method named Virtual Galileo Constellation (VGC) is applied to generate observational data from another constellation for the data analysis. In addition, both full ambiguity resolution (FAR) and partial ambiguity resolution (PAR) algorithms are used in processing single- and dual-constellation data. Firstly, a brief overview of related work on AR methods and reliability theory is given. Next, a modified inverse integer Cholesky decorrelation method and its AR performance are presented. Subsequently, a new measure of decorrelation performance, called the orthogonality defect, is introduced and compared with other measures. Furthermore, a new AR scheme that considers the ambiguity validation requirement in the control of the search space size is proposed to improve search efficiency. With respect to the reliability of AR, we also discuss the computation of the ambiguity success rate (ASR) and confirm that the success rate computed with the integer bootstrapping method is a sharp approximation to the actual integer least-squares (ILS) success rate. The advantages of multi-GNSS constellations are examined in terms of the PAR technique with a predefined ASR. Finally, a novel satellite selection algorithm for reliable ambiguity resolution, called SARA, is developed. In summary, the study demonstrates that when the ASR is close to one, the reliability of AR can be guaranteed and the ambiguity validation is effective. The work then focuses on new strategies to improve the ASR, including a partial ambiguity resolution procedure with a predefined success rate and a novel satellite selection strategy with a high success rate. The proposed strategies bring significant benefits of multi-GNSS signals to real-time, high-precision and high-reliability positioning services.
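As a small illustration of two of the quantities discussed above, the sketch below computes the integer bootstrapping success rate from the conditional standard deviations of a hypothetical ambiguity covariance matrix, and shows a simple partial-ambiguity-resolution loop that shrinks the ambiguity subset until a predefined success rate is met. It is only a sketch: the covariance values, the Cholesky-based conditioning order and the drop-the-worst heuristic are assumptions, not the thesis algorithms.

```python
# Sketch only: bootstrapped ambiguity success rate and a toy PAR subset loop.
import numpy as np
from scipy.stats import norm

def bootstrapping_success_rate(Q_hat):
    """P_s = prod_i ( 2 * Phi(1 / (2 * sigma_i)) - 1 ), where sigma_i are the
    sequential conditional standard deviations, taken here from the Cholesky
    factor of Q_hat (i.e. one particular conditioning order)."""
    G = np.linalg.cholesky(np.asarray(Q_hat, dtype=float))  # G @ G.T = Q_hat
    cond_std = np.diag(G)
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * cond_std)) - 1.0))

def partial_subset_for_target(Q_hat, target_asr=0.999):
    """Toy PAR with a predefined success rate: drop the ambiguity with the
    largest conditional standard deviation (a simple heuristic) until the
    bootstrapped success rate of the remaining subset meets the target."""
    idx = list(range(Q_hat.shape[0]))
    while idx and bootstrapping_success_rate(Q_hat[np.ix_(idx, idx)]) < target_asr:
        G = np.linalg.cholesky(Q_hat[np.ix_(idx, idx)])
        idx.pop(int(np.argmax(np.diag(G))))
    return idx

# Toy 3x3 ambiguity covariance, purely illustrative numbers.
Q = np.array([[0.090, 0.020, 0.010],
              [0.020, 0.070, 0.015],
              [0.010, 0.015, 0.050]])
print(bootstrapping_success_rate(Q))
print(partial_subset_for_target(Q, target_asr=0.9))
```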
Abstract:
Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To use these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to observation complexity, leading to improved reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing a scalar ambiguity, a rounding method with a controllable error probability is proposed. The established ionosphere-constrained model can be solved efficiently with a sequential Kalman filter; it can be either reduced to special models simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the USA CORS network. The results show that the new widelane AR scheme achieves a successful fixing rate of 99.4% with a failure rate of 0.6%, while the new rounding method for narrowlane AR achieves a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with a rigorously controllable probability of incorrectly fixed ambiguities.
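For the scalar (narrowlane) rounding step, the error probability of simple rounding can be bounded from the ambiguity's standard deviation, so the decision to fix can be made only when that probability is below a tolerance. The sketch below shows this idea under the assumption of an unbiased float ambiguity; the numbers and the 0.8% default tolerance are illustrative, and the authors' actual rounding method may differ in detail.

```python
# Sketch only: error-probability-controlled rounding of a scalar ambiguity.
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def round_with_controlled_risk(a_float, sigma, max_failure_rate=0.008):
    """Round a scalar float ambiguity to the nearest integer only if the
    probability of rounding to a wrong integer stays below the tolerance.
    For an unbiased estimate, P(correct) = 2 * Phi(0.5 / sigma) - 1."""
    p_fail = 1.0 - (2.0 * phi(0.5 / sigma) - 1.0)
    if p_fail <= max_failure_rate:
        return round(a_float), p_fail   # fixed, together with its risk
    return None, p_fail                 # too risky: keep the float value

# Hypothetical narrowlane ambiguity of 3.12 cycles with sigma = 0.12 cycles.
print(round_with_controlled_risk(3.12, 0.12))
```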
Abstract:
This research investigates how to obtain accurate and reliable positioning results with global navigation satellite systems (GNSS). The work provides a theoretical framework for reliability control in GNSS carrier phase ambiguity resolution, the key technique for precise GNSS positioning at the centimetre level. The proposed approach includes identification and exclusion procedures for unreliable solutions and hypothesis tests, allowing the reliability of solutions to be controlled in terms of the mathematical models, integer estimation and ambiguity acceptance tests. Extensive experimental results with both simulated and observed data sets effectively demonstrate the reliability performance characteristics of the proposed theoretical framework and procedures.
Abstract:
At the 2014 G20 held in Brisbane, Australia took the position that climate change is not an economic issue. Most others thought it was - especially the Turkish Prime Minister, who is hosting the 2015 G20. It is certainly an economic issue. But it is not just an economic issue, either in its source or its solution.
Abstract:
This paper shows that by using only symbolic language phrases, a mobile robot can purposefully navigate to specified rooms in previously unexplored environments. The robot intelligently organises a symbolic language description of the unseen environment and “imagines” a representative map, called the abstract map. The abstract map is an internal representation of the topological structure and spatial layout of symbolically defined locations. To perform goal-directed exploration, the abstract map creates a high-level semantic plan to reason about spaces beyond the robot’s known world. While completing the plan, the robot uses the metric guidance provided by a spatial layout, and grounded observations of door labels, to efficiently guide its navigation. The system is shown to complete exploration in unexplored spaces by travelling only 13.3% further than the optimal path.
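To make the idea of planning over symbolically defined, as-yet-unseen places concrete, here is a toy sketch: a dictionary-based "abstract map" of place names and a breadth-first search that yields a high-level plan. The place names, graph structure and planner are illustrative assumptions only; the paper's abstract map additionally encodes a metric spatial layout and is refined by grounded door-label observations during exploration.

```python
# Sketch only: planning over a symbolic, imagined map of unexplored space.
from collections import deque

# Hypothetical abstract map: connectivity imagined from symbolic phrases such
# as "the west corridor leads off the lobby to room A and room B".
abstract_map = {
    "entrance": ["lobby"],
    "lobby": ["corridor_west", "corridor_east"],
    "corridor_west": ["room_a", "room_b"],
    "corridor_east": ["room_c"],
}

def semantic_plan(start, goal):
    """Breadth-first search over the abstract map to produce a high-level plan
    through places the robot has not yet observed."""
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in abstract_map.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

print(semantic_plan("entrance", "room_b"))
# ['entrance', 'lobby', 'corridor_west', 'room_b']
```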
Abstract:
This book investigates the ethical values that inform the global carbon integrity system, and reflects on alternative norms that could or should do so. The global carbon integrity system comprises the emerging international architecture being built to respond to climate change. This architecture can be understood as an 'integrity system': an inter-related set of institutions, governance arrangements, regulations and practices that work to ensure the system performs its role faithfully and effectively. This volume investigates the ways ethical values impact on where and how the integrity system works, where it fails, and how it can be improved. With a wide array of perspectives across many disciplines, including ethicists, philosophers, lawyers, governance experts and political theorists, the chapters seek to explore the positive values driving the global climate change processes, to offer an understanding of the motivations justifying the creation of the regime and the way that social norms impact upon the operation of the integrity system. The collection focuses on the nexus between ideal ethics and real-world implementation through institutions and laws. The book will be of interest to policy makers, climate change experts, carbon taxation regulators, academics, legal practitioners and researchers.
Abstract:
Networked digital technologies and Open Access (OA) are transforming the processes and institutions of research, knowledge creation and dissemination globally: enabling new forms of collaboration, allowing researchers to be seen and heard in new ways and reshaping relationships between stakeholders across the global academic publishing system. This article draws on Joseph Nye’s concept of ‘Soft Power’ to explore the role that OA is playing in helping to reshape academic publishing in China. It focusses on two important areas of OA development: OA journals and national-level repositories. OA is being supported at the highest levels, and there is potential for it to play an important role in increasing the status and impact of Chinese scholarship. Investments in OA also have the potential to help China to re-position itself within international copyright discourses: moving beyond criticism for failure to enforce the rights of foreign copyright owners and progressing an agenda that places greater emphasis on equality of access to the resources needed to foster innovation. However, the potential for OA to help China to build and project its soft power is being limited by the legacies of the print era, as well as the challenges of efficiently governing the national research and innovation systems.
Abstract:
One of the effects of the Internet is that the dissemination of scientific publications has, within a few years, migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.
Abstract:
The performance of postdetection integration (PDI) techniques for the detection of Global Navigation Satellite Systems (GNSS) signals in the presence of uncertainties in frequency offsets, noise variance, and unknown data-bits is studied. It is shown that the conventional PDI techniques are generally not robust to uncertainty in the data-bits and/or the noise variance. Two new modified PDI techniques are proposed, and they are shown to be robust to these uncertainties. The receiver operating characteristics (ROC) and sample complexity performance of the PDI techniques in the presence of model uncertainties are analytically derived. It is shown that the proposed methods significantly outperform existing methods, and hence they could become increasingly important as the GNSS receivers attempt to push the envelope on the minimum signal-to-noise ratio (SNR) for reliable detection.
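For context, the sketch below implements the conventional noncoherent form of post-detection integration: the squared magnitudes of K coherent correlator outputs are summed and compared against a threshold. The simulated amplitude, noise variance and threshold factor are illustrative assumptions; the modified, uncertainty-robust PDI techniques proposed in the paper are not reproduced here.

```python
# Sketch only: conventional noncoherent PDI detection for one search cell.
import numpy as np

def noncoherent_pdi_statistic(correlator_outputs):
    """Noncoherent PDI: sum of squared magnitudes over K coherent blocks."""
    y = np.asarray(correlator_outputs, dtype=complex)
    return float(np.sum(np.abs(y) ** 2))

def detect(correlator_outputs, noise_variance, threshold_factor):
    """Declare the signal present when the normalised statistic exceeds the
    threshold; in practice the factor would be set from the desired
    false-alarm probability via the (chi-square) distribution of the statistic."""
    return noncoherent_pdi_statistic(correlator_outputs) / noise_variance > threshold_factor

# Simulated example: K = 20 blocks of a weak signal with unknown phase
# (e.g. due to data-bits and residual Doppler) in unit-variance complex noise.
rng = np.random.default_rng(0)
K, amplitude = 20, 1.5
signal = amplitude * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, K))
noise = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2.0)
print(detect(signal + noise, noise_variance=1.0, threshold_factor=2.0 * K))
```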
Abstract:
The state of PICES science - 1999
The status of the Bering Sea: January - July, 1999
The state of the western North Pacific in the second half of 1998
The state of the eastern North Pacific since February 1999
MEQ/WG 8 Practical Workshop
Michael M. Mullin - A biography
Highlights of Eighth Annual Meeting
Mechanism causing the variability of the Japanese sardine population: Achievements of the Bio-Cosmos Project in Japan
Climate change, global warming, and the PICES mandate - The need for improved monitoring
The new age of China-GLOBEC study
GLOBEC activities in Korean waters
Aspects of the Global Ocean Observing System (GOOS)
Abstract:
The dissertation is based on the Theory of Uneven Geographical Development, a product of the theoretical reflections arising from the introduction of a Marxist perspective into Critical or Radical Geography, which reveals the production of spatial differentiation within urban space. Building on this theoretical basis, the aim is to reveal the production of unequal space in the neighbourhood of Campo Grande, Rio de Janeiro, with a focus on Residential Zones 3 and 4, through a reading of state urban policies, which fragment and restrict access to the different spaces of the neighbourhood and which are established and influenced by neoliberalism in the current moment of integration and accumulation of the world political-economic system known as Globalisation. The dissertation also examines the influence of private agents on the formulation of such policies and on the appropriation and production of space.
Abstract:
This paper presents a laser-based global localization system that we developed for autonomous mobile robots. It focuses on the system's hardware architecture and working principles, and introduces and analyses the localization algorithm. Finally, the paper describes experiments conducted with the localization system under laboratory conditions. The experimental results show that the system offers high localization accuracy and strong resistance to interference, making it an ideal localization tool for autonomous mobile robots.
Abstract:
The paper focuses on the strategies adopted, and the problems encountered, in solving the key difficulties during system implementation, including task decomposition and the issuing of locomotion commands, timing allocation and synchronisation, landmark-based localization and correction of locomotion errors, sensing, measuring and responding to dynamic obstacles, and emergency handling of special situations.
Abstract:
This paper examines long term changes in the plankton of the North Atlantic and northwest European shelf seas and discusses the forcing mechanisms behind some observed interannual, decadal and spatial patterns of variability with a focus on climate change. Evidence from the Continuous Plankton Records suggests that the plankton integrates hydrometeorological signals and may be used as a possible index of climate change. Changes evident in the plankton are likely to have important effects on the carrying capacity of fisheries and are of relevance to eutrophication issues and to the assessment of biodiversity. The scale of the changes seen over the past five decades emphasises the importance of maintaining existing, and establishing new, long term and wide scale monitoring programmes of the world's oceans in initiatives such as the Global Ocean Observing System (GOOS).