964 results for High Precision Positioning
Abstract:
The in-medium influence on π0 photoproduction from spin-zero nuclei is carefully studied in the GeV range using a straightforward Monte Carlo analysis. The calculation takes into account the relativistic nuclear recoil for coherent mechanisms (electromagnetic and nuclear amplitudes) plus a time-dependent multi-collisional intranuclear cascade approach (MCMC) to describe the transport properties of mesons produced in the surroundings of the nucleon. A detailed analysis of the meson energy spectra for photoproduction on 12C at 5.5 GeV indicates that both the Coulomb and nuclear coherent events are associated with a small energy transfer to the nucleus (≲ 5 MeV), while the contribution of the nuclear incoherent mechanism is vanishingly small within this kinematical range. The angular distributions are dominated by the Primakoff peak at extreme forward angles, with the nuclear incoherent process being the most important contribution for θπ0 ≳ 20°. Such a consistent Monte Carlo approach provides a suitable method to clean up nuclear backgrounds in recent high-precision experiments, such as the PrimEx experiment at the Jefferson Laboratory Facility.
Abstract:
Integer carrier-phase ambiguity resolution is the key to rapid and high-precision global navigation satellite system (GNSS) positioning and navigation. Just as important as the integer ambiguity estimation is the validation of the solution because, even when one uses an optimal, or close to optimal, integer ambiguity estimator, an unacceptable integer solution can still be obtained. This can happen, for example, when the data are degraded by multipath effects, which corrupt the real-valued float ambiguity solution, leading to an incorrect integer (fixed) ambiguity solution. It is therefore important to use a statistical test with a sound theoretical and probabilistic basis, which has become possible with the Ratio Test Integer Aperture (RTIA) estimator. The properties and underlying concept of this statistical test are briefly described. An experiment was performed using data with and without multipath: reflecting objects were placed around the receiver antenna to induce multipath. A method based on multiresolution analysis by the wavelet transform is used to reduce the multipath in the GPS double-difference (DD) observations. The objective of this paper is thus to compare ambiguity resolution and validation in two situations: data with multipath and data with multipath reduced by wavelets. Additionally, the accuracy of the estimated coordinates is assessed by comparison with ground-truth coordinates, which were estimated using data free of multipath effects. The success and failure probabilities of the RTIA were, in general, coherent and showed the efficiency and reliability of this statistical test. After multipath mitigation, ambiguity resolution becomes more reliable and the coordinates more precise. © Springer-Verlag Berlin Heidelberg 2007.
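The validation step described in this abstract can be illustrated with a minimal sketch of a fixed-threshold ratio test. This is an illustration only: the actual RTIA estimator derives its critical value from a user-controlled failure rate rather than using a fixed threshold, and the function and variable names below are hypothetical.

```python
import numpy as np

def ratio_test(a_float, Q_a, candidates, threshold=2.0):
    """Accept the best integer candidate only if it fits the float
    solution sufficiently better than the runner-up.

    a_float    -- real-valued (float) ambiguity vector
    Q_a        -- covariance matrix of the float ambiguities
    candidates -- integer candidate vectors (best and second-best)
    threshold  -- fixed critical value; the RTIA estimator instead
                  derives it from a chosen failure-rate tolerance
    """
    Q_inv = np.linalg.inv(Q_a)

    def sq_norm(z):
        # covariance-weighted squared distance of candidate z to the float solution
        d = a_float - z
        return float(d @ Q_inv @ d)

    norms = sorted(sq_norm(z) for z in candidates)
    ratio = norms[1] / norms[0]  # second-best over best
    return ratio >= threshold, ratio
```

Accepting only when the ratio clears the critical value trades availability for reliability; the integer-aperture framework makes this trade-off explicit by tying the acceptance region to a fixed failure probability.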
Abstract:
Low-frequency multipath remains one of the major challenges for high-precision GPS relative positioning. In kinematic applications in particular, the constantly changing geometry makes low-frequency multipath difficult to remove or model. Spectral analysis offers a powerful technique for analyzing this kind of non-stationary signal: the wavelet transform. However, several processing steps must work together in order to detect and efficiently mitigate low-frequency multipath. In this paper, these steps are discussed. Experiments were carried out in kinematic mode with a controlled and known vehicle movement. The data were collected in the presence of a reflector surface placed close to the vehicle to cause, mainly, low-frequency multipath. The analyses performed, with results in terms of double-difference residuals and statistical tests, showed that the proposed methodology is very efficient at detecting and mitigating low-frequency multipath effects. © 2008 IEEE.
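The mitigation idea, extracting the low-frequency part of the double-difference residuals with a multiresolution decomposition and subtracting it, can be sketched with a hand-rolled Haar wavelet transform. This is an illustrative stand-in: the paper does not specify the mother wavelet or the decomposition depth used, and all names below are hypothetical.

```python
import numpy as np

def haar_decompose(x, levels):
    """Multilevel Haar analysis (len(x) must be divisible by 2**levels)."""
    approx = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))  # high-frequency detail
        approx = (even + odd) / np.sqrt(2.0)         # low-frequency approximation
    return approx, details

def haar_reconstruct(approx, details):
    """Invert haar_decompose exactly."""
    x = approx
    for d in reversed(details):
        even = (x + d) / np.sqrt(2.0)
        odd = (x - d) / np.sqrt(2.0)
        out = np.empty(even.size + odd.size)
        out[0::2], out[1::2] = even, odd
        x = out
    return x

def remove_low_frequency(dd_residuals, levels=4):
    """Subtract the coarse approximation, used here as a proxy for
    low-frequency multipath, from the double-difference residuals."""
    approx, details = haar_decompose(dd_residuals, levels)
    trend = haar_reconstruct(approx, [np.zeros_like(d) for d in details])
    return dd_residuals - trend
```

The decomposition depth controls the cutoff between "trend" (multipath proxy) and retained signal; in a real kinematic data set it would be chosen against the known receiver dynamics.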
Abstract:
The upcoming solar maximum, expected to peak around May 2013, occurs at a time when our reliance on high-precision GNSS has reached unprecedented proportions. The perturbations of the ionosphere caused by increased solar activity pose a major threat to these applications. This is particularly true in equatorial regions, where high exposure to solar-induced disturbances is coupled with explosive growth of precise GNSS applications. Among the various types of solar-induced ionospheric disturbances, strong scintillations are amongst the most challenging, causing effects ranging from phase measurement errors up to full loss of lock on several satellites. Brazil, which relies heavily on high-precision GNSS, is one of the most affected regions, due notably to its proximity to the southern crest of the ionospheric equatorial anomaly and to the South Atlantic Magnetic Anomaly. In the framework of the CIGALA project, we developed the PolaRxS™, a GNSS receiver dedicated to the monitoring of ionospheric scintillation indices not only in the GPS L1 band but in all operational and upcoming constellations and frequency bands. A network of these receivers was deployed across the whole Brazilian territory, first to investigate and then to mitigate the impact of scintillation on the different signals, ensuring high-precision GNSS availability and integrity in the area. This paper reports on the validation of the PolaRxS™ receiver as an ionospheric scintillation monitor and on the first results of the analysis of the data collected with the CIGALA network.
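The scintillation index monitored by such receivers has a standard definition that is easy to state in code: the amplitude scintillation index S4 is the normalized standard deviation of signal intensity over an averaging interval (dedicated monitors typically use 60 s of high-rate, detrended samples; this minimal sketch omits the detrending step):

```python
import numpy as np

def s4_index(intensity):
    """S4 = sqrt((<I^2> - <I>^2) / <I>^2), i.e. the standard deviation
    of signal intensity normalized by its mean over the interval."""
    I = np.asarray(intensity, dtype=float)
    return I.std() / I.mean()
```

A steady signal gives S4 = 0, while strong equatorial scintillation can drive S4 toward (and above) 1.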
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Plasmons are the collective resonant excitation of conduction electrons. Plasmons excited by light in subwavelength nanoparticles are called particle plasmons and are promising candidates for future microsensors because of the strong dependence of the resonance on externally controllable parameters, such as the optical properties of the surrounding medium and the electric charge of the nanoparticle. The extremely high scattering efficiency of particle plasmons allows single nanoparticles to be observed easily in a microscope. The need to rapidly collect a statistically relevant number of data points, together with the growing importance of plasmonic (above all gold) nanoparticles for medical applications, has pushed for the development of automated microscopes that can measure in the spectral window of biological tissue (the "biological window", 650 to 900 nm), which until now had been only partially covered. In this work I present the Plasmoscope, designed precisely with these requirements in mind: (1) an adjustable slit is placed at the entrance aperture of the spectrometer, which coincides with the image plane of the microscope, and (2) a piezo scanning stage makes it possible to raster the sample through this narrow slit. This realization avoids optical elements that absorb in the near infrared. With the Plasmoscope I investigate the plasmonic sensitivity of gold and silver nanorods, i.e., the shift of the plasmon resonance in response to a change of the surrounding medium. The sensitivity measures how well the nanoparticles can detect material changes in their environment, so it is immensely important to know which parameters influence it.
I show here that silver nanorods possess a higher sensitivity than gold nanorods within the biological window and, moreover, that the sensitivity grows with the thickness of the rods. I present a theoretical discussion of the sensitivity, identify the material parameters that influence it, and derive the corresponding formulas. In a further step I present experimental data supporting the theoretical finding that, for sensing schemes that also take the linewidth into account, gold nanorods with an aspect ratio of 3 to 4 yield the best result. Reliable sensors must show robust repeatability, which I examine with gold and silver nanorods. The plasmon resonance wavelength depends on the following intrinsic material parameters: electron density, background polarizability and relaxation time. Based on my experimental results I show that nanorods made of a copper-gold alloy have a red-shifted resonance compared with similarly shaped gold nanorods, and how the linewidth varies with the stoichiometric composition of the alloyed nanoparticles. The dependence of the linewidth on the material composition is also examined using silver-coated and uncoated gold nanorods. Semiconductor nanoparticles are candidates for efficient photovoltaic devices. The energy conversion requires charge separation, which is measured experimentally with the Plasmoscope by following the light-induced growth dynamics of gold spheres on semiconductor nanorods in a gold-ion solution through the scattered intensity.
Abstract:
An extensive study of the morphology and dynamics of the equatorial ionosphere over South America is presented here. A multi-parametric approach is used to describe the physical characteristics of the ionosphere in the regions where the combination of the thermospheric electric field and the horizontal geomagnetic field creates the so-called Equatorial Ionization Anomalies. Ground-based measurements from GNSS receivers are used to link the Total Electron Content (TEC), its spatial gradients and the phenomenon known as scintillation, which can lead to GNSS signal degradation or even to a GNSS signal 'loss of lock'. A new algorithm to highlight the features characterizing the TEC distribution is developed in the framework of this thesis, and the results obtained are validated and used to improve the performance of a GNSS positioning technique (long-baseline RTK). In addition, the correlation between scintillation and the dynamics of the ionospheric irregularities is investigated. Using software implemented for this thesis, the velocity of the ionospheric irregularities is evaluated from high-sampling-rate GNSS measurements. The results highlight the parallel behaviour of the amplitude scintillation index (S4) occurrence and the zonal velocity of the ionospheric irregularities, at least under severe scintillation conditions (post-sunset hours). This suggests that scintillations are driven by TEC gradients as well as by the dynamics of the ionospheric plasma. Finally, given the importance of such studies for technological applications (e.g. GNSS high-precision applications), a validation of the NeQuick model (the model used in the new GALILEO satellites for TEC modelling) is performed. The NeQuick performance improves dramatically when data from HF radar sounding (ionograms) are ingested. A custom-designed algorithm, based on image recognition techniques, is developed to properly select the ingested data, leading to further improvement of the NeQuick performance.
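A common way to estimate the zonal drift velocity of ionospheric irregularities from closely spaced receivers, which may differ from the exact method implemented in the thesis software, is to cross-correlate the amplitude records of two antennas and divide the baseline by the lag of the correlation peak. A minimal sketch, with hypothetical names:

```python
import numpy as np

def zonal_drift_velocity(sig_east, sig_west, baseline_m, fs_hz):
    """Estimate irregularity drift from two spaced amplitude records.

    The lag that maximizes the cross-correlation gives the transit time
    of the diffraction pattern over the baseline; velocity = baseline / lag.
    """
    a = sig_east - np.mean(sig_east)
    b = sig_west - np.mean(sig_west)
    corr = np.correlate(a, b, mode="full")
    lag_samples = np.argmax(corr) - (len(b) - 1)  # positive: east lags west
    lag_s = lag_samples / fs_hz
    if lag_s == 0:
        return np.inf  # no measurable transit within one sample
    return baseline_m / lag_s
```

In practice the records would be band-filtered and the correlation peak interpolated for sub-sample lag resolution; this sketch keeps only the core idea.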
Abstract:
The thesis is divided into two parts. The first is dedicated to the determination of the Deflection of the Vertical (DoV) at Medicina (BO). Three methods for determining the components of the DoV are presented. The first uses geometric levelling together with the GNSS system; the second, carried out by Dr. Serantoni, uses the QDaedalus system developed at ETH Zurich; the third approach uses the ConvER program, made available by the Emilia-Romagna region. The second part presents a method for determining the Atmospheric Refraction Coefficient (ARC). The computation procedure is iterative and uses the measured distances in addition to the zenith angles. The method was tested in two study areas: the first in the city of Limassol (Cyprus), in an urban environment, in autumn 2013; the second in the Venice lagoon during summer 2014.
Abstract:
Future experiments in nuclear and particle physics are moving towards the high-luminosity regime in order to access rare processes. In this framework, particle detectors require high rate capability together with excellent timing resolution for precise event reconstruction. To achieve this, the development of dedicated Front-End Electronics (FEE) for detectors has become increasingly challenging and expensive. Thus, a current trend in R&D is towards flexible FEE that can be easily adapted to a great variety of detectors without impairing the required high performance. This thesis reports on a novel FEE for two different detector types: imaging Cherenkov counters and plastic scintillator arrays. The former requires high sensitivity and precision for the detection of single-photon signals, while the latter is characterized by the slower and larger signals typical of scintillation processes. The FEE design was developed using high-bandwidth preamplifiers and fast discriminators which provide Time-over-Threshold (ToT). The use of discriminators allowed for low power consumption, minimal dead time and self-triggering capability, all fundamental aspects for high-rate applications. The output signals of the FEE are read out by a high-precision FPGA-based TDC system. A full characterization of the analogue signals under realistic conditions proved that the ToT information can be used in a novel way for charge measurements or walk corrections, thus improving the obtainable timing resolution. Detailed laboratory investigations proved the feasibility of the ToT method. The full readout chain was investigated in test experiments at the Mainz Microtron: high counting rates of several MHz per channel were achieved, and a timing resolution of better than 100 ps after a ToT-based walk correction was obtained. Ongoing applications to fast Time-of-Flight counters and future developments of the FEE have also been investigated recently.
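The ToT-based walk correction mentioned above can be sketched as follows: larger pulses cross a fixed discriminator threshold earlier, so the amplitude-dependent timing shift can be removed with a calibration curve parameterized by ToT. This is an illustrative polynomial calibration, not the actual procedure of the thesis; all names are hypothetical.

```python
import numpy as np

def fit_walk_calibration(tot_samples, walk_samples, degree=2):
    """Fit the walk-vs-ToT calibration polynomial from reference data
    (e.g. hits with a known reference time)."""
    return np.polyfit(tot_samples, walk_samples, degree)

def walk_correction(t_measured, tot, calib):
    """Remove the amplitude-dependent time walk: since ToT is a proxy
    for pulse charge, t_corrected = t_measured - f(ToT)."""
    return t_measured - np.polyval(calib, tot)
```

The same ToT value can also feed a charge measurement, which is what makes the discriminator-only front end attractive compared with a full waveform digitizer.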
Abstract:
OBJECTIVES Optical scanners combined with computer-aided design and computer-aided manufacturing (CAD/CAM) technology provide high accuracy in the fabrication of titanium (TIT) and zirconium dioxide (ZrO) bars. The aim of this study was to compare the precision of fit of CAD/CAM TIT bars produced with a photogrammetric and a laser scanner. METHODS Twenty rigid CAD/CAM bars were fabricated on one single edentulous master cast with 6 implants in the positions of the second premolars, canines and central incisors. A photogrammetric scanner (P) provided digitized data for TIT-P (n=5), while a laser scanner (L) was used for TIT-L (n=5). The control groups consisted of soldered gold bars (gold, n=5) and ZrO-P bars of similar design. The median vertical distance between implant and bar platforms at the non-tightened implants (one-screw test) was calculated from mesial, buccal and distal scanning electron microscope measurements. RESULTS Vertical microgaps were not significantly different between TIT-P (median 16 μm; 95% CI 10-27 μm) and TIT-L (25 μm; 13-32 μm). Gold (49 μm; 12-69 μm) had higher values than TIT-P (p=0.001) and TIT-L (p=0.008), while ZrO-P (35 μm; 17-55 μm) exhibited higher values than TIT-P (p=0.023). Misfit values increased in all groups from implant position 23 (3 units) to 15 (10 units), while in gold and TIT-P values decreased from implant 11 toward the most distal implant 15. SIGNIFICANCE CAD/CAM titanium bars showed high precision of fit using photogrammetric and laser scanners. In comparison, the misfit of ZrO bars (CAD/CAM, photogrammetric scanner) and soldered gold bars was statistically higher, but the values were clinically acceptable.
Abstract:
Objective In order to benefit from the obvious advantages of minimally invasive liver surgery, there is a need to develop high-precision tools for intraoperative anatomical orientation, navigation and safety control. In a pilot study we adapted a newly developed system for computer-assisted liver surgery (CALS), in terms of accuracy and technical feasibility, to the specific requirements of laparoscopy. Here, we present practical aspects of laparoscopic computer-assisted liver surgery (LCALS). Methods Our video relates to a patient presenting with 3 colorectal liver metastases in Seg. II, III and IVa who was selected in an appropriate oncological setting for LCALS using the CAScination system combined with 3D MEVIS reconstruction. After minimal laparoscopic mobilization of the liver, a 4-landmark registration method was applied to enable navigation. Placement of the microwave needles was performed using the targeting module of the navigation system, and correct needle positioning was confirmed by intraoperative sonography. Ablation of each lesion was carried out by applying microwave energy at 100 Watts for 1 minute. Results To acquire an accurate (less than 0.5 cm) registration, 4 registration cycles were necessary; in total, seven minutes were required to accomplish precise registration. Successful ablation with complete response in all treated areas was assessed by intraoperative sonography and confirmed by postoperative CT scan. Conclusions This teaching video demonstrates the theoretical and practical key points of LCALS, with special emphasis on preoperative planning, intraoperative registration and accuracy testing by laparoscopic methodology. In contrast to mere ultrasound-guided ablation of liver lesions, LCALS offers three-dimensional targeting and higher safety control. It is currently also in routine use to treat vanishing lesions and other difficult-to-target focal lesions within the liver.