987 results for Atmospheric Effects
Modeling of atmospheric effects on InSAR measurements by incorporating terrain elevation information
Abstract:
We propose an elevation-dependent calibration method to correct for the water vapour-induced delays over Mt. Etna that affect interferometric synthetic aperture radar (InSAR) results. Water vapour delay fields are modelled from individual zenith delay estimates on a network of continuous GPS receivers. These are interpolated using simple kriging with varying local means over two domains, above and below 2 km in altitude. Test results with data from a meteorological station and 14 continuous GPS stations over Mt. Etna show that a reduction of the mean phase delay field of about 27% is achieved after the model is applied to a 35-day interferogram. (C) 2006 Elsevier Ltd. All rights reserved.
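The interpolation step above lends itself to a brief illustration. Below is a minimal Python sketch of simple kriging with varying local means, assuming an elevation-linear local mean and an exponential residual covariance; the abstract specifies neither, and the sill, range and nugget values are placeholders. The two altitude domains would be handled by applying the function separately above and below 2 km.

```python
import numpy as np

def sk_varying_local_means(xy, z, elev, xy_new, elev_new,
                           sill=1.0, rng=10e3, nugget=0.05):
    """Simple kriging with varying local means (SKlm): the local mean is
    an elevation-dependent trend; the residuals are kriged with an
    assumed exponential covariance (all parameters are placeholders)."""
    # 1. Elevation-dependent local mean: z ~ a + b * elev
    A = np.column_stack([np.ones_like(elev), elev])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    resid = z - A @ coef

    # 2. Exponential covariance of the residuals
    def cov(h):
        return (sill - nugget) * np.exp(-h / rng)

    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    C = cov(d) + nugget * np.eye(len(z))          # obs-obs covariance
    d0 = np.linalg.norm(xy[:, None, :] - xy_new[None, :, :], axis=-1)
    w = np.linalg.solve(C, cov(d0))               # simple-kriging weights

    # 3. Kriged residuals are added back onto the local trend
    trend_new = np.column_stack([np.ones_like(elev_new), elev_new]) @ coef
    return trend_new + w.T @ resid
```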
Abstract:
Atmospheric parameters, such as pressure (P), temperature (T) and density (ρ ∝ P/T), affect the development of extensive air showers initiated by energetic cosmic rays. We have studied the impact of atmospheric variations on extensive air showers by means of the surface detector of the Pierre Auger Observatory. The rate of events shows a ~10% seasonal modulation and a ~2% diurnal one. We find that the observed behaviour is explained by a model including the effects associated with the variations of P and ρ. The former affects the longitudinal development of air showers, while the latter influences the Molière radius and hence the lateral distribution of the shower particles. The model is validated with full simulations of extensive air showers using atmospheric profiles measured at the site of the Pierre Auger Observatory. (C) 2009 Elsevier B.V. All rights reserved.
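The pressure and density dependence described here can be written as a first-order model of the event rate. The sketch below is illustrative only: the coefficients a_P and a_rho and the reference values are placeholders, not the fit results of the paper.

```python
# Placeholder coefficients; the abstract does not quote numerical values.
a_P   = -0.003   # fractional rate change per hPa of pressure deviation
a_rho = -2.0     # fractional rate change per kg/m^3 of density deviation

def expected_rate(R0, P, rho, P_ref=860.0, rho_ref=1.06):
    """First-order dependence of the event rate on ground pressure P (hPa)
    and air density rho (kg/m^3). Since rho ∝ P/T, both the seasonal (T)
    and diurnal (P, T) cycles enter through these two terms."""
    return R0 * (1.0 + a_P * (P - P_ref) + a_rho * (rho - rho_ref))
```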
Abstract:
Atmospheric scattering plays a crucial role in degrading the performance of electro-optical imaging systems operating in the visible and infra-red spectral bands, and hence limits the quality of the acquired images, either through reduction of contrast or increase of image blur. The exact nature of light scattering by atmospheric media is highly complex and depends on the types, orientations, sizes and distributions of particles constituting these media, as well as the wavelengths, polarization states and directions of the propagating radiation. Here we follow the common approach for solving imaging and propagation problems by treating the light propagating through atmospheric media as composed of two main components: a direct (unscattered) component and a scattered component. In this work we developed a detailed model of the effects of absorption and scattering by haze and fog atmospheric aerosols on the optical radiation propagating from the object plane to an imaging system, based on the classical theory of EM scattering. This detailed model is then used to compute the average point spread function (PSF) of an imaging system which properly accounts for the effects of diffraction, scattering, and the appropriate optical power levels of both the direct and the scattered radiation arriving at the pupil of the imaging system. The calculated PSF, properly weighted for the energy contributions of the direct and scattered components, is used in combination with a radiometric model to estimate the average numbers of direct and scattered photons detected at the sensor plane, which are then used to calculate the image-spectrum signal-to-noise ratio (SNR) in the visible, near infra-red (NIR) and mid infra-red (MIR) spectral wavelength bands. Reconstruction of images degraded by atmospheric scattering and measurement noise is then performed, up to the limit imposed by the noise-effective cutoff spatial frequency of the image-spectrum SNR. Key results of this research are as follows: a mathematical model, based on Mie scattering theory, of how scattering from aerosols affects the overall point spread function (PSF) of an imaging system was developed, coded in MATLAB, and demonstrated. This model, along with radiometric theory, was used to predict the limiting resolution of an imaging system as a function of the optics, scattering environment, and measurement noise. Finally, image reconstruction algorithms were developed and demonstrated which mitigate the effects of scattering-induced blurring to within the limits imposed by noise.
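The reconstruction step described above can be sketched as a Wiener-type inverse filter truncated at the noise-effective cutoff. The snippet below is a schematic Python version, not the MATLAB code of the work; the weighted-PSF construction and the SNR-based cutoff follow the abstract, while all array conventions are assumptions.

```python
import numpy as np

def total_psf(psf_dir, psf_scat, w_dir, w_scat):
    """Energy-weighted sum of the direct (diffraction) and scattered PSF
    components; the weights come from the radiometric model."""
    psf = w_dir * psf_dir + w_scat * psf_scat
    return psf / psf.sum()

def reconstruct(image, psf, snr_spectrum):
    """Wiener-type deconvolution, zeroed beyond the noise-effective
    cutoff where the image-spectrum SNR falls below 1."""
    H = np.fft.fft2(np.fft.ifftshift(psf))        # centered PSF assumed
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / np.maximum(snr_spectrum, 1e-12))
    W *= snr_spectrum >= 1.0                      # noise-dominated -> 0
    return np.real(np.fft.ifft2(np.fft.fft2(image) * W))
```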
Abstract:
The impact of topography and mixed pixels on L-band radiometric observations over land needs to be quantified to improve the accuracy of soil moisture retrievals. For this purpose, a series of simulations has been performed with an improved version of the soil moisture and ocean salinity (SMOS) end-to-end performance simulator (SEPS). The brightness temperature generator of SEPS has been modified to include a 100-m-resolution land cover map and a 30-m-resolution digital elevation map of Catalonia (northeast Spain). This high-resolution generator allows the assessment of the errors in soil moisture retrieval algorithms due to limited spatial resolution and provides a basis for the development of pixel disaggregation techniques. Variations of the local incidence angle, shadowing, and atmospheric effects (up- and downwelling radiation) due to surface topography have been analyzed. Results are compared to brightness temperatures computed under the assumption of an ellipsoidal Earth.
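A small sketch of the topographic part of such a simulator: the per-pixel local incidence angle follows from the DEM facet normals and the look direction. This is a generic construction assumed from the abstract, not SEPS code.

```python
import numpy as np

def local_incidence_angle(dem, dx, look_vec):
    """Per-pixel local incidence angle (degrees) from a DEM.

    dem: 2-D elevation array (m); dx: grid spacing (m);
    look_vec: unit vector from the surface toward the radiometer.
    """
    dzdy, dzdx = np.gradient(dem, dx)                # terrain slopes
    n = np.stack([-dzdx, -dzdy, np.ones_like(dem)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)   # facet normals
    cos_theta = np.clip(n @ look_vec, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))
```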
Abstract:
Transmission of Cherenkov light through the atmosphere is strongly influenced by the optical clarity of the atmosphere and the prevailing weather conditions. The performance of telescopes measuring this light is therefore dependent on atmospheric effects. This thesis presents software and hardware developed to implement a prototype sky monitoring system for use on the proposed next-generation gamma-ray telescope array, VERITAS. The system, consisting of a CCD camera and a far-infrared pyrometer, was successfully installed and tested on the ten metre atmospheric Cherenkov imaging telescope operated by the VERITAS Collaboration at the F.L. Whipple Observatory in Arizona. The thesis also presents the results of observations of the BL Lacertae object 1ES1959+650 made with the Whipple ten metre telescope. The observations provide evidence for TeV gamma-ray emission from this object at a level of more than 15 standard deviations above background. This represents the first unequivocal detection of this object at TeV energies, making it only the third extragalactic source seen at such levels of significance in this energy range. The flux variability of the source on a number of timescales is also investigated.
Abstract:
This paper presents a model of the Stokes emission vector from the ocean surface. The ocean surface is described as an ensemble of facets with Cox and Munk's (1954) Gram-Charlier slope distribution. The study discusses the impact of different up-wind and cross-wind rms slopes, skewness, peakedness, foam cover models and atmospheric effects on the azimuthal variation of the Stokes vector, as well as the limitations of the model. Simulation results compare favorably, both in mean value and azimuthal dependence, with SSM/I data at 53° incidence angle and with JPL's WINDRAD measurements at incidence angles from 30° to 65°, and at wind speeds from 2.5 to 11 m/s.
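The facet statistics at the core of this model are the Cox and Munk (1954) Gram-Charlier slope distribution, sketched below in the form it is commonly written; coefficient conventions vary between papers, and the inputs here (rms slopes, skewness and peakedness coefficients) come from whatever wind-speed parameterization the user supplies.

```python
import numpy as np

def cox_munk_gram_charlier(sc_slope, su_slope, sc, su,
                           c21, c03, c40, c22, c04):
    """Gram-Charlier sea-surface slope PDF (Cox & Munk, 1954).

    sc_slope, su_slope: cross-wind and up-wind slopes;
    sc, su: cross-/up-wind rms slopes; c21, c03: skewness;
    c40, c22, c04: peakedness coefficients.
    """
    xi, eta = sc_slope / sc, su_slope / su     # normalized slopes
    gauss = np.exp(-0.5 * (xi**2 + eta**2)) / (2 * np.pi * sc * su)
    corr = (1.0
            - 0.5 * c21 * (xi**2 - 1) * eta
            - (c03 / 6.0) * (eta**3 - 3 * eta)
            + (c40 / 24.0) * (xi**4 - 6 * xi**2 + 3)
            + (c22 / 4.0) * (xi**2 - 1) * (eta**2 - 1)
            + (c04 / 24.0) * (eta**4 - 6 * eta**2 + 3))
    return gauss * corr
```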
Abstract:
A simple physical model of the atmospheric effects of large explosive volcanic eruptions is developed. Using only one input parameter - the initial amount of sulphur dioxide injected into the stratosphere - the global-average stratospheric optical-depth perturbation and surface temperature response are modelled. The simplicity of this model avoids issues of incomplete data (applicable to more comprehensive models), making it a powerful and useful tool for atmospheric diagnostics of this climate forcing mechanism. It may also provide a computationally inexpensive and accurate way of introducing volcanic activity into larger climate models. The modelled surface temperature response to an initial sulphur-dioxide injection, coupled with emission-history statistics, is used to demonstrate that the most climatically significant volcanic eruptions are those of sufficient explosivity to just reach into the stratosphere (and achieve longevity). This study also highlights the fact that this measure of significance is highly sensitive to the representation of the climatic response and the frequency data used, and that we are far from producing a definitive history of explosive volcanism for even the past 1000 years. Given this high degree of uncertainty, these results suggest that eruptions releasing around 0.1 Mt of SO2 or more into the stratosphere have the maximum climatic impact.
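The structure of such a one-input model can be illustrated with a one-box energy balance: the SO2 mass sets a decaying optical-depth perturbation, which drives a surface temperature response. Every numerical value below is an illustrative assumption, not a parameter of the paper.

```python
import numpy as np

# Illustrative constants (assumed, not the paper's fitted values):
TAU_PER_MT = 0.1    # optical depth per Mt of stratospheric SO2
DECAY_YR   = 1.0    # e-folding time of the aerosol layer (years)
FORCING    = -20.0  # radiative forcing per unit optical depth (W m^-2)
LAMBDA     = 1.0    # climate feedback parameter (W m^-2 K^-1)
C_MIX      = 10.0   # effective heat capacity (W yr m^-2 K^-1)

def surface_temperature_response(m_so2_mt, years=10.0, dt=0.01):
    """Global-average temperature response to an SO2 injection,
    integrated with a simple Euler step."""
    n = int(years / dt)
    t = np.arange(n) * dt
    tau = TAU_PER_MT * m_so2_mt * np.exp(-t / DECAY_YR)  # optical depth
    T = np.zeros(n)
    for i in range(n - 1):
        T[i + 1] = T[i] + dt * (FORCING * tau[i] - LAMBDA * T[i]) / C_MIX
    return t, T
```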
Abstract:
The Arctic is a region particularly susceptible to rapid climate change. General circulation models (GCMs) suggest a polar amplification of any global warming signal by a factor of about 1.5 due, in part, to sea ice feedbacks. The dramatic recent decline in multi-year sea ice cover lies outside the standard deviation of the CMIP3 ensemble GCM predictions. Sea ice acts as a barrier between cold air and warmer oceans during winter, as well as inhibiting evaporation from the ocean surface during the summer. An ice-free Arctic would likely have an altered hydrological cycle, with more evaporation from the ocean surface leading to changes in precipitation distribution and amount. Using the U.K. Met Office Regional Climate Model (RCM) HadRM3, the atmospheric effects of the observed and projected reduction in Arctic sea ice are investigated. The RCM is driven by the atmospheric GCM HadAM3. Both models are forced with sea surface temperature and sea ice for the period 2061-2090 from the CMIP3 HadGEM1 experiments. Here we use an RCM at 50 km resolution over the Arctic and 25 km over Svalbard, which captures well the present-day pattern of precipitation and provides a detailed picture of the projected changes in the behaviour of ocean-atmosphere moisture fluxes and how they affect precipitation. These experiments show that the projected 21st-century sea ice decline alone causes large impacts on the surface mass balance (SMB) of Svalbard. However, Greenland's SMB is not significantly affected by sea ice decline alone, but responds with a strongly negative shift in SMB when changes to SST are incorporated into the experiments. This is the first study to characterise the impact of future sea ice changes on Arctic terrestrial cryosphere mass balance.
Abstract:
Since data-taking began in January 2004, the Pierre Auger Observatory has been recording the count rates of low-energy secondary cosmic ray particles for the self-calibration of the ground detectors of its surface detector array. After correcting for atmospheric effects, modulations of galactic cosmic rays due to solar activity and transient events are observed. Temporal variations related to the activity of the heliosphere can be determined with high accuracy due to the high total count rates. In this study, the available data are presented together with an analysis focused on the observation of Forbush decreases, where a strong correlation with neutron monitor data is found.
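The atmospheric correction mentioned here is, in essence, the standard barometric correction applied to particle count rates; residual variations then trace heliospheric modulation. A minimal sketch, with an assumed barometric coefficient and reference pressure:

```python
import numpy as np

BETA = 0.007   # assumed barometric coefficient per hPa (illustrative)

def pressure_corrected_rate(counts, pressure, p_ref=860.0):
    """Correct count rates for the barometric effect so that residual
    variations reflect solar modulation (e.g. Forbush decreases)."""
    return counts * np.exp(BETA * (pressure - p_ref))
```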
Abstract:
The air fluorescence detector of the Pierre Auger Observatory is designed to perform calorimetric measurements of extensive air showers created by cosmic rays of energies above 10^18 eV. To correct these measurements for the effects introduced by atmospheric fluctuations, the Observatory contains a group of monitoring instruments to record atmospheric conditions across the detector site, an area exceeding 3000 km^2. The atmospheric data are used extensively in the reconstruction of air showers, and are particularly important for the correct determination of shower energies and the depths of shower maxima. This paper contains a summary of the molecular and aerosol conditions measured at the Pierre Auger Observatory since the start of regular operations in 2004, and includes a discussion of the impact of these measurements on air shower reconstructions. Between 10^18 and 10^20 eV, the systematic uncertainties due to all atmospheric effects increase from 4% to 8% in measurements of shower energy, and from 4 g cm^-2 to 8 g cm^-2 in measurements of the shower maximum. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Maps derived from digitally processed orbital remote sensing images have become fundamental for optimizing conservation and monitoring actions for coral reefs. However, the accuracy reached in the mapping of submerged areas is limited by variation of the water column, which degrades the signal received by the orbital sensor and introduces errors into the final result of the classification. The limited capacity of traditional methods based on conventional statistical techniques to resolve inter-class confusion motivated the search for alternative strategies in the field of Computational Intelligence. In this work, an ensemble of classifiers was built from the combination of Support Vector Machines and a Minimum Distance Classifier, with the objective of classifying remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which progressive refinement of the classification takes place. Patterns that received an ambiguous classification in a given stage of the process were re-evaluated in the subsequent stage, and an unambiguous prediction for all data was reached through the reduction or elimination of false positives. The images were classified into five bottom types: deep water, underwater corals, inter-tidal corals, algal bottom and sandy bottom. The highest overall accuracy (89%) was obtained from an SVM with a polynomial kernel. The accuracy of the classified image was compared, through the use of an error matrix, with the results obtained by other classification methods based on a single classifier (a neural network and the k-means algorithm). Finally, the comparison of results demonstrated the potential of ensemble classifiers as a tool for classifying images of submerged areas subject to the noise caused by atmospheric effects and the water column.
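The staged refinement can be sketched with scikit-learn: confident SVM predictions are kept, and ambiguous patterns are passed to the minimum-distance classifier. This compresses the three stages of the paper into two and uses an assumed confidence threshold.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import NearestCentroid  # minimum-distance classifier

def staged_ensemble(X_train, y_train, X, threshold=0.8):
    """Keep high-confidence SVM labels; re-evaluate ambiguous patterns
    with a minimum-distance (nearest-centroid) classifier."""
    svm = SVC(kernel="poly", degree=3, probability=True).fit(X_train, y_train)
    mdc = NearestCentroid().fit(X_train, y_train)

    proba = svm.predict_proba(X)
    labels = svm.classes_[np.argmax(proba, axis=1)]
    ambiguous = proba.max(axis=1) < threshold     # low-confidence patterns
    if ambiguous.any():
        labels[ambiguous] = mdc.predict(X[ambiguous])
    return labels
```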
Abstract:
In the past few years, interest in achieving high-accuracy positioning has been increasing. One of the methods applied by the scientific community is network-based positioning. By using data from multiple reference stations, it is possible to obtain centimetric positioning over a larger coverage area, in addition to gains in the reliability, availability and integrity of the service. Moreover, using this concept, it is possible to model the atmospheric effects (tropospheric refraction and the ionospheric effect). Another important question concerning this topic is the transmission of the network corrections to the users. There are several possibilities for this, and an efficient one is the Virtual Reference Station (VRS) concept. In the VRS concept, a reference station is generated near the rover receiver (user). This provides a short baseline, and the user has the possibility of using a single-frequency receiver to accomplish relative positioning. In order to test this kind of positioning method, software has been developed at São Paulo State University. In this paper, the methodology applied to generate the VRS data is described, and the VRS quality is analyzed by using the Precise Point Positioning (PPP) method.
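In its simplest form, generating a VRS observable amounts to translating a master station's observable to the virtual position: a geometric range displacement plus the network-interpolated atmospheric correction. The sketch below is a schematic of that step only, with all names assumed; it is not the São Paulo State University software.

```python
def vrs_observation(obs_master, rho_master, rho_vrs, corr_interp):
    """Displace a master-station observable (code or phase, in metres)
    to the VRS position.

    rho_master, rho_vrs: geometric satellite ranges to the master and
        virtual stations (m);
    corr_interp: network-interpolated atmospheric (tropo + iono)
        correction between the two positions (m).
    """
    return obs_master + (rho_vrs - rho_master) + corr_interp
```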
Abstract:
Among the positioning systems that compose GNSS (Global Navigation Satellite System), GPS has the capability of providing low-, medium- and high-precision positioning data. However, GPS observables may be subject to many different types of errors. These systematic errors can degrade the accuracy of the positioning provided by GPS; they are mainly related to GPS satellite orbits, multipath, and atmospheric effects. In order to mitigate these errors, a semiparametric model and the penalized least squares technique were employed in this study. This is similar to changing the stochastic model, in which error functions are incorporated, and the results are similar to those obtained when the functional model is changed instead. Using this method, it was shown that the ambiguities and the estimated station coordinates were more reliable and accurate than when employing a conventional least squares methodology.
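A minimal sketch of the penalized least squares idea, assuming the standard formulation: the observations are modelled as a parametric part (coordinates, ambiguities) plus a nonparametric error function g that varies smoothly in time, with a second-difference penalty on g. The abstract does not give the exact penalty or parameterization, so both are assumptions here.

```python
import numpy as np

def penalized_ls(A, y, lam=1.0):
    """Solve  min ||y - A x - g||^2 + lam * ||D g||^2  for the parametric
    vector x and the smooth per-epoch error function g, where D is the
    second-difference operator enforcing smoothness in time."""
    n, m = len(y), A.shape[1]
    D = np.diff(np.eye(n), n=2, axis=0)       # second differences
    # Block normal equations of the stacked system:
    N = np.block([[A.T @ A, A.T],
                  [A,       np.eye(n) + lam * D.T @ D]])
    rhs = np.concatenate([A.T @ y, y])
    sol = np.linalg.solve(N, rhs)
    return sol[:m], sol[m:]                   # x (parameters), g (errors)
```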
Abstract:
The GPS observables are subject to several errors. Among them, the systematic ones have great impact, because they degrade the accuracy of the accomplished positioning. These errors are mainly related to GPS satellite orbits, multipath and atmospheric effects. Recently, a method has been suggested to mitigate these errors: the semiparametric model with the penalized least squares technique (PLS). In this method, the errors are modelled as functions varying smoothly in time. This is akin to changing the stochastic model, into which the error functions are incorporated; the results obtained are similar to those in which the functional model is changed instead. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method (CLS). In general, the solution requires a shorter data interval, minimizing costs. The method's performance was analyzed in two experiments using data from single-frequency receivers. The first was carried out on a short baseline, where the main error was multipath. In the second experiment, a baseline of 102 km was used; in this case, the predominant errors were due to ionospheric and tropospheric refraction. In the first experiment, using 5 minutes of data collection, the largest coordinate discrepancies with respect to the ground truth reached 1.6 cm and 3.3 cm in the h coordinate for PLS and CLS, respectively. In the second, also using 5 minutes of data, the discrepancies were 27 cm in h for PLS and 175 cm in h for CLS. In these tests, it was also possible to verify a considerable improvement in ambiguity resolution using PLS relative to CLS, with a reduced data collection time interval. © Springer-Verlag Berlin Heidelberg 2007.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)