158 results for baselines


Relevance:

20.00%

Abstract:

Real-time kinematic (RTK) GPS techniques have been extensively developed for applications including surveying, structural monitoring, and machine automation. Limitations of the existing RTK techniques that hinder their application for geodynamics purposes are twofold: (1) the achievable RTK accuracy is on the level of a few centimeters, and the uncertainty of the vertical component is 1.5 to 2 times worse than those of the horizontal components, and (2) the RTK position uncertainty grows in proportion to the base-to-rover distance. The key limiting factor behind these problems is the significant effect of residual tropospheric errors on the positioning solutions, especially on the highly correlated height component. This paper develops a geometry-specified troposphere decorrelation strategy to achieve subcentimeter kinematic positioning accuracy in all three components. The key is to set up a relative zenith tropospheric delay (RZTD) parameter to absorb the residual tropospheric effects and to solve the resulting model as an ill-posed problem using the regularization method. To compute a reasonable regularization parameter and thus obtain an optimal regularized solution, the covariance matrix of the positional parameters estimated without the RZTD parameter, which is characterized by the observation geometry, is used in place of the quadratic matrix of their "true" values. As a result, the regularization parameter is computed adaptively as the observation geometry varies. The experimental results show that the new method can efficiently alleviate the model's ill-conditioning and stabilize the solution from a single data epoch. Compared with the results from the conventional least squares method, the new method improves the long-range RTK solution precision from several centimeters to the subcentimeter level in all components. More significantly, the precision of the height component is even higher. Several geoscience applications that require subcentimeter real-time solutions can benefit greatly from the proposed approach, such as real-time monitoring of earthquakes and large dams, high-precision GPS leveling, and refinement of the vertical datum. In addition, the high-resolution RZTD solutions can contribute to effective recovery of tropospheric slant path delays in order to establish 4-D troposphere tomography.
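The core numerical step, solving the ill-posed normal equations with a geometry-adaptive Tikhonov regularization parameter, can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the heuristic form of the regularization parameter, and the a priori sigma are assumptions for illustration.

```python
import numpy as np

def regularized_rtk_solution(A, y, P, Q_x0, sigma0=1.0):
    """One-epoch Tikhonov solution of the ill-posed RTK + RZTD model.

    A      : linearized design matrix; columns include the three
             position components and the RZTD parameter
    y      : observed-minus-computed carrier-phase vector
    P      : observation weight matrix
    Q_x0   : covariance of the positional parameters estimated WITHOUT
             the RZTD parameter (geometry-dependent); used here in
             place of the unknown quadratic matrix of the "true" values
    sigma0 : a priori unit-weight standard deviation (assumed)
    """
    N = A.T @ P @ A                      # normal matrix, near-singular
    # Geometry-adaptive regularization parameter (one heuristic form):
    # noise variance over a signal-strength proxy taken from Q_x0.
    alpha = sigma0**2 / np.trace(Q_x0)
    x = np.linalg.solve(N + alpha * np.eye(N.shape[0]), A.T @ P @ y)
    return x, alpha
```

Because Q_x0 is recomputed from the current satellite geometry, alpha tightens the regularization exactly when the geometry would otherwise leave the height and RZTD parameters nearly inseparable.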

Relevance:

20.00%

Abstract:

The BeiDou system is the first global navigation satellite system in which all satellites transmit triple-frequency signals, which can provide positioning, navigation, and timing independently. A benefit of triple-frequency signals is that more useful combinations can be formed, including some extrawide-lane combinations whose ambiguities can generally be fixed instantaneously and without distance restriction, although narrow-lane ambiguity resolution (NL AR) still depends on the interreceiver distance or requires a long time to achieve. In this paper, we synthetically study decimeter and centimeter kinematic positioning using BeiDou triple-frequency signals. It starts with AR of two extrawide-lane signals based on the ionosphere-free or ionosphere-reduced geometry-free model. For decimeter positioning, one can immediately use the two ambiguity-fixed extrawide-lane observations without pursuing NL AR. To achieve higher accuracy, NL AR is the necessary next step. Although long-baseline NL AR remains challenging, some NL ambiguities can indeed be fixed with high reliability. Partial AR for NL signals is acceptable because, as long as some NL ambiguities are fixed, positioning accuracy will certainly be improved. With the accumulation of observations, more and more NL ambiguities are fixed and the positioning accuracy continues to improve. An efficient Kalman-filtering system is established to implement the whole process. The formulated system is flexible, since additional constraints can easily be applied to enhance the model's strength. Numerical results from a set of real triple-frequency BeiDou data on a 50 km baseline show that decimeter positioning is achievable instantaneously. With only five data epochs, 84% of NL ambiguities can be fixed, so that the real-time kinematic accuracies are 4.5, 2.5, and 16 cm for the north, east, and height components, respectively, while with 10 data epochs more than 90% of NL ambiguities are fixed and the real-time kinematic solutions improve to centimeter level for all three coordinate components.
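For the instantaneous extra-wide-lane step, a geometry-free (Melbourne-Wuebbena style) combination is a standard way to fix the ambiguity by simple rounding; the sketch below assumes the BeiDou B2/B3 pair and scalar double-differenced observables, and is not the paper's full Kalman-filter system.

```python
C = 299_792_458.0                     # speed of light, m/s
F2, F3 = 1207.140e6, 1268.520e6       # BeiDou B2 / B3 frequencies (Hz)

def ewl_ambiguity(phi2, phi3, p2, p3):
    """Instantaneous extra-wide-lane (B3 - B2) ambiguity fixing.

    phi2, phi3 : double-differenced carrier phases (cycles)
    p2, p3     : double-differenced code observations (metres)
    """
    lam_ewl = C / (F3 - F2)           # ~4.88 m wavelength
    # Frequency-weighted (narrow-lane) code combination: geometry and
    # first-order ionosphere cancel when differenced against the
    # extra-wide-lane phase, leaving the float ambiguity plus noise.
    p_nl = (F2 * p2 + F3 * p3) / (F2 + F3)
    n_float = (phi3 - phi2) - p_nl / lam_ewl
    return int(round(n_float)), n_float
```

With a wavelength of roughly 4.88 m, even decimetre-level code noise leaves the float ambiguity well within half a cycle of the correct integer, which is why this step succeeds instantaneously and without distance restriction.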

Relevance:

20.00%

Abstract:

Carrier phase ambiguity resolution over long baselines is challenging in BDS data processing, partially due to the variations of the hardware biases in BDS code signals and their dependence on elevation angle. We present an assessment of satellite-induced code bias variations in BDS triple-frequency signals and ambiguity resolution procedures involving both geometry-free and geometry-based models. First, since the elevation of a GEO satellite remains essentially unchanged, we propose to model the single-differenced fractional cycle bias using widely distributed ground stations. Second, the effects of code bias variations induced by GEO, IGSO and MEO satellites on ambiguity resolution of extra-wide-lane, wide-lane and narrow-lane combinations are analyzed. Third, together with the IGSO and MEO code bias variation models, the effects of code bias variations on ambiguity resolution are examined using 30 days of data collected in 2014 over baselines ranging from 500 to 2600 km. The results suggest that although the effect of code bias variations on the extra-wide-lane integer solution is almost negligible thanks to its long wavelength, the wide-lane integer solutions are rather sensitive to the code bias variations; wide-lane ambiguity resolution success rates improve markedly when code bias variations are corrected. However, the improvement for narrow-lane ambiguity resolution is modest, since it relies on the geometry-based model and the code bias variations have only an indirect impact on the narrow-lane ambiguity solutions.
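A common way to apply such corrections in practice is a piecewise-linear model in elevation for IGSO/MEO satellites (for GEO satellites the elevation is essentially constant, so a single offset per station-satellite pair suffices). The sketch below is illustrative only: the node values are invented placeholders, not the corrections estimated in this study.

```python
import numpy as np

# Hypothetical piecewise-linear nodes for one IGSO/MEO satellite and
# frequency: elevation (degrees) -> code bias correction (metres).
ELEV_NODES = np.array([0, 10, 20, 30, 40, 50, 60, 70, 80, 90])
BIAS_NODES = np.array([-0.55, -0.40, -0.34, -0.23, -0.15,
                       -0.04,  0.09,  0.19,  0.27,  0.35])  # placeholders

def corrected_code(p_raw, elev_deg):
    """Remove the elevation-dependent, satellite-induced code bias
    before forming the wide-lane (Melbourne-Wuebbena) combination."""
    return p_raw - np.interp(elev_deg, ELEV_NODES, BIAS_NODES)
```

Since the wide-lane float ambiguity is driven directly by the code observations, even decimetre-level bias drift over a satellite pass can push it across the rounding boundary, which is why correcting the code first visibly raises the wide-lane success rate.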

Relevance:

20.00%

Abstract:

Studies concerning the physiological significance of Ca2+ sparks often depend on the detection and measurement of large populations of events in noisy microscopy images. Automated detection methods have been developed to quickly and objectively distinguish potential sparks from noise artifacts. However, previously described algorithms are not suited to the reliable detection of sparks in images where the local baseline fluorescence and noise properties can vary significantly, and they risk introducing additional bias when applied to such data sets. Here, we describe a new, conceptually straightforward approach to spark detection in linescans that addresses this issue by combining variance stabilization with local baseline subtraction. We also show that, in addition to greatly increasing the range of images in which sparks can be automatically detected, the use of a more accurate noise model enables our algorithm to achieve similar detection sensitivity with fewer false positives than previous approaches when applied to both synthetic and experimental data sets. We propose, therefore, that it might be a useful tool for improving the reliability and objectivity of spark analysis in general, and we describe how it might be further optimized for specific applications.
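A minimal sketch of the two ingredients, variance stabilization followed by local baseline subtraction, is given below. It assumes photon-limited (approximately Poisson) noise, for which the Anscombe transform is a standard stabilizer; the window sizes and threshold are illustrative choices, not the parameters used in this work.

```python
import numpy as np
from scipy.ndimage import percentile_filter, gaussian_filter

def spark_candidates(linescan, win=(1, 201), n_sigma=3.8):
    """Flag candidate spark pixels in a linescan (space x time) image.

    1. Anscombe transform: maps Poisson-like noise to roughly unit
       variance, so one global threshold is meaningful even when the
       baseline brightness varies across the scan.
    2. Running low-percentile filter along time: tracks slow baseline
       drift without being pulled upward by the sparks themselves.
    """
    stab = 2.0 * np.sqrt(np.asarray(linescan, float) + 3.0 / 8.0)
    baseline = percentile_filter(stab, percentile=10, size=win)
    residual = gaussian_filter(stab - baseline, sigma=1.0)
    # Robust noise estimate (median absolute deviation -> sigma).
    sigma = 1.4826 * np.median(np.abs(residual - np.median(residual)))
    return residual > n_sigma * sigma   # boolean candidate mask
```

Candidate pixels would then be grouped into events and measured; the point of stabilizing the variance first is that the same n_sigma threshold keeps a comparable false-positive rate in dim and bright regions alike.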

Relevance:

20.00%

Abstract:

This thesis revealed the most important factors shaping the distribution, abundance and genetic diversity of four marine foundation species. Environmental conditions, particularly sea temperatures, nutrient availability and ocean waves, played a primary role in shaping the spatial distribution and abundance of populations, acting on scales varying from tens of meters to hundreds of kilometres. Furthermore, the use of Species Distribution Models (SDMs) with biological records of occurrence and high-resolution oceanographic data allowed species distributions to be predicted across time. This approach highlighted the role of climate change, particularly when extreme temperatures prevailed during glacial and interglacial periods. These results, when combined with mtDNA and microsatellite genetic variation of populations, allowed the influence of past range dynamics on the genetic diversity and structure of populations to be inferred. For instance, the Last Glacial Maximum produced important shifts in species ranges, leaving obvious signatures of higher genetic diversity in regions where populations persisted (i.e., refugia). However, it was found that a species' genetic pool is shaped by regions of persistence adjacent to others experiencing expansions and contractions. Contradicting expectations, refugia seem to play a minor role in the (re)colonization of previously eroded populations. In addition, the habitat area available to expanding populations and the inherent mechanisms of species dispersal in occupying available habitats were also found to be fundamental in shaping the distribution of genetic diversity. However, the results suggest that high levels of genetic diversity in some populations do not rule out strong genetic erosion in the past, a process here named shifting genetic baselines. Furthermore, this thesis predicted an ongoing retraction at the rear edges and the extinction of unique genetic lineages, which will impoverish the global gene pool, strongly shifting the genetic baselines in the future.

Relevance:

20.00%

Abstract:

What constitutes a baseline level of success for protein fold recognition methods? As fold recognition benchmarks are often presented without any thought to the results that might be expected from a purely random set of predictions, an analysis of fold recognition baselines is long overdue. Given varying amounts of basic information about a protein, ranging from the length of the sequence to knowledge of its secondary structure, to what extent can the fold be determined by intelligent guesswork? Can simple methods that make use of secondary structure information assign folds more accurately than purely random methods, and could these methods be used to construct viable hierarchical classifications?
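One way to make the "purely random" floor concrete: if fold i contains a fraction p_i of the database, a predictor that guesses folds in proportion to those frequencies is expected to be correct with probability equal to the sum of the p_i squared, while uniform guessing over F folds gives 1/F. A short sketch (the toy counts are invented):

```python
import numpy as np

def random_baselines(fold_counts):
    """Expected accuracy of purely random fold assignment.

    Frequency-matched guessing is right with probability sum(p_i^2);
    uniform guessing over F folds is right with probability 1/F.
    Any real fold-recognition method should beat both floors.
    """
    p = np.asarray(fold_counts, float)
    p /= p.sum()
    return float((p ** 2).sum()), 1.0 / p.size

# Toy database: four folds covering 40/30/20/10% of entries.
print(random_baselines([40, 30, 20, 10]))   # (0.3, 0.25)
```

Baselines that also exploit sequence length or predicted secondary structure can only raise this floor, which is exactly why benchmarks should report it.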

Relevance:

20.00%

Abstract:

This study is the first assessment of mollusk fossil assemblages relative to the compositional fidelity of modern mollusk living and death assemblages. It also shows that the sedimentary record can provide information on the original, non-human-impacted freshwater malacofauna biodiversity, based on Late Pleistocene shells. The fossil mollusk assemblage from the Touro Passo Formation (Pleistocene-Holocene) was compared to living and death assemblages of the Touro Passo River, southern Brazil, revealing little resemblance between fossil and live-dead species composition. Although the living and death assemblages agree closely in richness, species composition, and species relative abundances (both proportional and rank), the fossil assemblage differs significantly from both modern assemblages in most of these measures. The fossil assemblage is dominated by the native endemic corbiculid bivalve Cyanocyclas limosa and the gastropod Heleobia aff. bertoniana, both absent from the living assemblage, whereas both the living and death assemblages are dominated by the alien Asiatic corbiculid C. fluminea, which is absent from the fossil assemblage. The fossil assemblage also contains, overall, a higher proportional abundance of relatively thick-shelled species, suggesting a genuine bias against thinner- and smaller-shelled species. Our results suggest that contemporary environmental changes, such as the introduction of alien freshwater mollusk species, together with post-burial taphonomic processes, are the main factors behind the poor fidelity of the fossil assemblage studied. Hence, the taxonomic composition of the Late Pleistocene mollusks from the Touro Passo Formation would probably show greater similarity to present-day assemblages wherever mollusk biodiversity has not been disturbed by human activities.

Relevance:

20.00%

Abstract:

"January 1993."

Relevance:

20.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

20.00%

Abstract:

Shifting baselines describes the phenomenon whereby long-term changes to an environment go unrecognized because what is perceived as natural shifts with succeeding generations of scientists and other observers. This is a particular problem for the oceans because we are rarely able to observe the consequences of human activities directly. In the absence of data to track these consequences, a common assumption has been that the communities we observe today using SCUBA or other technology are similar to the communities that existed 10, 100, or even 1000 years ago. Research is increasingly demonstrating that this is not the case. Instead, marine ecosystems may have been vastly different in the past, and we have succumbed to the shifting baseline syndrome. This has significant implications for scientific study, for management, and for human communities more broadly. We discuss these implications, and how we might address the shifting baseline syndrome in the oceans to confront its repercussions. In a world where environmental degradation is accelerating, doing so is critical to avoid further ratcheting down of our expectations of ecosystem health and productivity, and to ensure that we have the information necessary to set appropriate recovery and management goals.

Relevance:

10.00%

Abstract:

This paper presents preliminary results in establishing a strategy for predicting Zenith Tropospheric Delay (ZTD) and relative ZTD (rZTD) between Continuously Operating Reference Stations (CORS) in near real-time. It is anticipated that the predicted ZTD or rZTD can assist network-based Real-Time Kinematic (RTK) performance over long inter-station distances, ultimately enabling a cost-effective method of delivering precise positioning services to sparsely populated regional areas, such as Queensland. This research first investigates two ZTD solutions: 1) the post-processed IGS ZTD solution, and 2) the near real-time (NRT) ZTD solution. The NRT solution is obtained through the GNSS processing software package (Bernese) deployed for this project. The predictability of the NRT Bernese solution is analyzed and compared with the post-processed IGS solution, which acts as the benchmark. The predictability analyses were conducted with prediction times of 15, 30, 45, and 60 minutes to determine how the error grows with latency. The predictability of ZTD and rZTD is characterized by using the previously estimated ZTD as the predicted ZTD for the current epoch. This research has shown that both the ZTD and rZTD prediction errors are random in nature; the standard deviation grows from a few millimeters to the sub-centimeter level as the prediction interval increases from 15 to 60 minutes. Additionally, the rZTD predictability shows very little dependence on the length of the tested baselines, up to 1000 kilometers. Finally, the comparison of the NRT Bernese solution with the IGS solution has shown a slight degradation in prediction accuracy: the less accurate NRT solution has an STD error of 1 cm within a prediction delay of 50 minutes, although some larger errors of up to 10 cm are observed.
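The persistence predictor described here (carry the last estimate forward and score the difference) is simple enough to sketch directly; the epoch spacing and names below are assumptions, not the project's actual processing setup.

```python
import numpy as np

def persistence_prediction_std(ztd, horizon_epochs):
    """STD of the persistence-prediction error for a ZTD series.

    ztd            : array of estimated ZTD values (metres), one per epoch
    horizon_epochs : prediction horizon in epochs (e.g. with 5-minute
                     epochs, a 30-minute horizon is 6 epochs)
    The previous estimate serves as the prediction for the current
    epoch, so the error is simply the lagged difference of the series.
    """
    ztd = np.asarray(ztd, float)
    err = ztd[horizon_epochs:] - ztd[:-horizon_epochs]
    return err.std(ddof=1)
```

Running this for horizons equivalent to 15, 30, 45, and 60 minutes yields the kind of error-versus-latency curve reported above; applying it to differences of ZTD between station pairs gives the corresponding rZTD figures.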

Relevance:

10.00%

Abstract:

Purpose: In 1970, Enright observed a distortion of perceived driving speed induced by monocular application of a neutral density (ND) filter. If a driver looks out of the right side of a vehicle with a filter over the right eye, the driver perceives a reduction in the vehicle's apparent velocity, while applying an ND filter over the left eye increases the vehicle's apparent velocity. The purpose of the current study was to provide the first empirical measurements of the Enright phenomenon. Methods: Ten experienced drivers were tested, each driving an automatic sedan on a closed road circuit. Filters (0.9 ND) were placed over the left, right or both eyes during a driving run, in addition to a control condition with no filters in place. Subjects were asked to look out of the right side of the car and adjust their driving speed to either 40 km/h or 60 km/h. Results: Without a filter, or with both eyes filtered, subjects estimated speed well when asked to travel at 60 km/h but travelled a mean of 12 to 14 km/h faster than the requested 40 km/h. Subjects travelled faster than these baselines by a mean of 7 to 9 km/h (p < 0.001) with the filter over their right eye, and 3 to 5 km/h slower with the filter over their left eye (p < 0.05). Conclusions: The Enright phenomenon causes significant and measurable distortions of perceived driving speed under real-world driving conditions.