39 results for measurement data

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

The concentrations of sulfate, black carbon (BC) and other aerosols in the Arctic are characterized by high values in late winter and spring (so-called Arctic Haze) and low values in summer. Models have long been struggling to capture this seasonality and especially the high concentrations associated with Arctic Haze. In this study, we evaluate sulfate and BC concentrations from eleven different models driven with the same emission inventory against a comprehensive pan-Arctic measurement data set over a time period of 2 years (2008–2009). The set of models consisted of one Lagrangian particle dispersion model, four chemistry transport models (CTMs), one atmospheric chemistry-weather forecast model and five chemistry climate models (CCMs), of which two were nudged to meteorological analyses and three were running freely. The measurement data set consisted of surface measurements of equivalent BC (eBC) from five stations (Alert, Barrow, Pallas, Tiksi and Zeppelin), elemental carbon (EC) from Station Nord and Alert and aircraft measurements of refractory BC (rBC) from six different campaigns. We find that the models generally captured the measured eBC or rBC and sulfate concentrations quite well, compared to previous comparisons. However, the aerosol seasonality at the surface is still too weak in most models. Concentrations of eBC and sulfate averaged over three surface sites are underestimated in winter/spring in all but one model (model means for January–March underestimated by 59 and 37 % for BC and sulfate, respectively), whereas concentrations in summer are overestimated in the model mean (by 88 and 44 % for July–September), but with overestimates as well as underestimates present in individual models. The most pronounced eBC underestimates, not included in the above multi-site average, are found for the station Tiksi in Siberia where the measured annual mean eBC concentration is 3 times higher than the average annual mean for all other stations. This suggests an underestimate of BC sources in Russia in the emission inventory used. Based on the campaign data, biomass burning was identified as another cause of the modeling problems. For sulfate, very large differences were found in the model ensemble, with an apparent anti-correlation between modeled surface concentrations and total atmospheric columns. There is a strong correlation between observed sulfate and eBC concentrations with consistent sulfate/eBC slopes found for all Arctic stations, indicating that the sources contributing to sulfate and BC are similar throughout the Arctic and that the aerosols are internally mixed and undergo similar removal. However, only three models reproduced this finding, whereas sulfate and BC are weakly correlated in the other models. Overall, no class of models (e.g., CTMs, CCMs) performed better than the others and differences are independent of model resolution.
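
The seasonal biases quoted above reduce to a comparison of model and observed means over fixed month windows. A minimal sketch of that calculation, with invented numbers standing in for the station and multi-model means (not the study's data):

```python
import numpy as np

def seasonal_percent_bias(model, obs, months):
    """Percent bias of the model mean relative to the observed mean over
    the given calendar months (1-12), from arrays of monthly means."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    idx = [m - 1 for m in months]
    return 100.0 * (model[idx].mean() - obs[idx].mean()) / obs[idx].mean()

# Invented monthly mean eBC concentrations (ng m-3), averaged over three
# stations: observations vs. a multi-model mean.
obs_ebc   = [95, 90, 80, 55, 35, 25, 20, 22, 28, 45, 65, 85]
model_ebc = [40, 38, 35, 30, 28, 30, 35, 38, 40, 42, 44, 45]

print(seasonal_percent_bias(model_ebc, obs_ebc, months=[1, 2, 3]))  # negative: winter/spring underestimate
print(seasonal_percent_bias(model_ebc, obs_ebc, months=[7, 8, 9]))  # positive: summer overestimate
```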

Relevance:

70.00%

Publisher:

Abstract:

Data assimilation algorithms are a crucial part of operational systems in numerical weather prediction, hydrology and climate science, but are also important for dynamical reconstruction in medical applications and quality control for manufacturing processes. Usually, a variety of diverse measurement data are employed to determine the state of the atmosphere or of a wider system including land and oceans. Modern data assimilation systems use more and more remote sensing data, in particular radiances measured by satellites, radar data and integrated water vapor measurements via GPS/GNSS signals. The inversion of some of these measurements is ill-posed in the classical sense, i.e. the inverse of the operator H which maps the state onto the data is unbounded. In this case, the use of such data can lead to significant instabilities of data assimilation algorithms. The goal of this work is to provide a rigorous mathematical analysis of the instability of well-known data assimilation methods. Here, we restrict our attention to particular linear systems in which the instability can be explicitly analyzed. We investigate three-dimensional variational assimilation (3D-Var) and four-dimensional variational assimilation (4D-Var). A theory for the instability is developed using the classical theory of ill-posed problems in a Banach space framework. Further, we demonstrate by numerical examples that instabilities can and will occur, including an example from dynamic magnetic tomography.
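
The variational methods discussed here minimise a quadratic cost function, and the instability appears when the observation operator H smooths strongly (is ill-posed) and the assigned observation errors are small. A hedged, self-contained sketch of the linear 3D-Var analysis step (all matrices and numbers invented for illustration):

```python
import numpy as np

# Minimal 3D-Var sketch: minimise
#   J(x) = (x - xb)^T B^{-1} (x - xb) + (y - H x)^T R^{-1} (y - H x),
# whose analytic minimiser is
#   xa = xb + B H^T (H B H^T + R)^{-1} (y - H xb).
# When H is a strongly smoothing (near rank-deficient) operator and R is
# small, tiny data noise is amplified in the analysis, illustrating the
# instability analysed in the paper. Everything below is made up.

def analysis_3dvar(xb, y, H, B, R):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return xb + K @ (y - H @ xb)

n = 50
s = np.linspace(0.0, 1.0, n)
x_true = np.sin(np.pi * s)
xb = np.zeros(n)                                          # background state
H = np.exp(-((s[:, None] - s[None, :]) ** 2) / 0.02) / n  # smoothing observation operator
B = np.eye(n)                                             # background error covariance
R = 1e-6 * np.eye(n)                                      # (over)confident observation errors
y = H @ x_true + 1e-4 * np.random.default_rng(0).standard_normal(n)

xa = analysis_3dvar(xb, y, H, B, R)
print("analysis error norm:", np.linalg.norm(xa - x_true))
```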

Relevance:

60.00%

Publisher:

Abstract:

The North Atlantic Marine Boundary Layer Experiment (NAMBLEX), involving over 50 scientists from 12 institutions, took place at Mace Head, Ireland (53.32° N, 9.90° W), between 23 July and 4 September 2002. A wide range of state-of-the-art instrumentation enabled detailed measurements of the boundary layer structure and atmospheric composition in the gas and aerosol phase to be made, providing one of the most comprehensive in situ studies of the marine boundary layer to date. This overview paper describes the aims of the NAMBLEX project in the context of previous field campaigns in the Marine Boundary Layer (MBL), the overall layout of the site, a summary of the instrumentation deployed, the temporal coverage of the measurement data, and the numerical models used to interpret the field data. Measurements of some trace species were made for the first time during the campaign, which was characterised by predominantly clean air of marine origin, but more polluted air with higher levels of NOx originating from continental regions was also experienced. This paper provides a summary of the meteorological measurements and Planetary Boundary Layer (PBL) structure measurements, presents time series of some of the longer-lived trace species (O3, CO, H2, DMS, CH4, NMHC, NOx, NOy, PAN) and summarises measurements of other species that are described in more detail in other papers within this special issue, namely oxygenated VOCs, HCHO, peroxides, organo-halogenated species, a range of shorter-lived halogen species (I2, OIO, IO, BrO), NO3 radicals, photolysis frequencies, the free radicals OH, HO2 and (HO2 + Σ RO2), as well as a summary of the aerosol measurements. NAMBLEX was supported by measurements made in the vicinity of Mace Head using the NERC Dornier-228 aircraft. Using ECMWF wind fields, the air-mass trajectories arriving at Mace Head during NAMBLEX were calculated and analysed together with both meteorological and trace-gas measurements. In this paper a chemical climatology for the duration of the campaign is presented to interpret the distribution of air-mass origins and emission sources, and to provide a convenient framework of air-mass classification that is used by other papers in this issue for the interpretation of observed variability in levels of trace gases and aerosols.

Relevance:

60.00%

Publisher:

Abstract:

The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. The method is independent of the boundary condition and is applicable for limited aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data are collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches on sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis of the method for the rough surface scattering case and provide numerical simulations and examples.

Relevance:

40.00%

Publisher:

Abstract:

Chongqing is the largest central-government-controlled municipality in China and is now undergoing rapid urbanization. The question remains open: what are the consequences of such rapid urbanization in Chongqing in terms of urban microclimates? An integrated study comprising three different research approaches is adopted in the present paper. By analyzing the observed annual climate data, an average warming trend of 0.10 °C/decade was found for the annual mean temperature from 1951 to 2010 in Chongqing, indicating a comparatively high degree of urban warming. In addition, two complementary types of field measurements were conducted: fixed weather stations and mobile traverse measurements. Numerical simulations using an in-house program are able to predict the urban air temperature in Chongqing. The urban heat island intensity in Chongqing is stronger in summer than in autumn and winter. The maximum urban heat island intensity occurs at around midnight and can be as high as 2.5 °C. In the daytime, an urban cool island exists. Local greenery has a great impact on the local thermal environment. Urban green spaces can reduce urban air temperature and therefore mitigate the urban heat island. The cooling effect of an urban river is limited in Chongqing, as both sides of the river are the most developed areas, but the relative humidity is much higher near the river than in places farther from it.
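
The decadal trend quoted above is, in essence, an ordinary least-squares fit of annual mean temperature against year, scaled to degrees per decade. A small sketch with an invented series (not the Chongqing record):

```python
import numpy as np

# Synthetic 1951-2010 annual mean temperature series (degC) for illustration.
years = np.arange(1951, 2011)
rng = np.random.default_rng(1)
t_annual = 18.0 + 0.010 * (years - 1951) + 0.3 * rng.standard_normal(years.size)

slope_per_year = np.polyfit(years, t_annual, 1)[0]   # OLS slope in degC per year
print(f"trend: {10 * slope_per_year:.2f} degC/decade")
```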

Relevance:

30.00%

Publisher:

Abstract:

An instrument is described which carries three orthogonal geomagnetic field sensors on a standard meteorological balloon package, to sense rapid motion and position changes during ascent through the atmosphere. Because of the finite data bandwidth available over the UHF radio link, a burst sampling strategy is adopted. Bursts of 9 s of measurements at 3.6 Hz are interleaved with periods of slow data telemetry lasting 25 s. Calculation of the variability in each channel is used to determine position changes, a method robust to periods of poor radio signals. During three balloon ascents, variability was found repeatedly at similar altitudes, simultaneously in each of the three orthogonal sensors carried. This variability is attributed to atmospheric motions. It is found that the vertical sensor is least prone to stray motions, and that the use of two horizontal sensors provides no additional information over a single horizontal sensor.
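
As a rough illustration of the burst-sampling scheme described above, the sketch below forms one 9 s burst sampled at 3.6 Hz on three orthogonal channels and computes the per-channel standard deviation used as the motion indicator. The data are synthetic and the layout is an assumption, not the instrument's telemetry format:

```python
import numpy as np

fs = 3.6                   # sampling rate within a burst (Hz)
burst_len = int(9 * fs)    # roughly 32 samples per 9 s burst

rng = np.random.default_rng(2)
burst = rng.standard_normal((3, burst_len))   # 3 channels x samples (arbitrary units)

variability = burst.std(axis=1)               # one variability value per channel per burst
print("per-channel variability:", variability)
```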

Relevance:

30.00%

Publisher:

Abstract:

Ecological risk assessments must increasingly consider the effects of chemical mixtures on the environment as anthropogenic pollution continues to grow in complexity. Yet testing every possible mixture combination is impractical and unfeasible; thus, there is an urgent need for models that can accurately predict mixture toxicity from single-compound data. Currently, two models are frequently used to predict mixture toxicity from single-compound data: concentration addition and independent action (IA). The accuracy of the predictions generated by these models is currently debated and needs to be resolved before their use in risk assessments can be fully justified. The present study addresses this issue by determining whether the IA model adequately described the toxicity of binary mixtures of five pesticides and other environmental contaminants (cadmium, chlorpyrifos, diuron, nickel, and prochloraz), each with dissimilar modes of action, on the reproduction of the nematode Caenorhabditis elegans. In three out of 10 cases, the IA model failed to describe mixture toxicity adequately, with significant synergism or antagonism being observed. In a further three cases, there was an indication of synergy, antagonism, and effect-level-dependent deviations, respectively, but these were not statistically significant. The extent of the significant deviations varied, but all were such that the predicted percentage effect on reproductive output would have been wrong by 18 to 35% (for example, the effect concentration expected to cause a 50% effect led to an 85% effect). The presence of such a high number and variety of deviations has important implications for the use of existing mixture toxicity models for risk assessments, especially where all or part of the deviation is synergistic.
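
The independent action (IA) prediction for a binary mixture combines the single-compound fractional effects multiplicatively; a minimal sketch with illustrative values only:

```python
# Independent action (IA) for a binary mixture: the combined fractional effect is
#   E_mix = 1 - (1 - E_A) * (1 - E_B),
# where E_A and E_B are the effects of each compound applied singly.
def independent_action(e_a: float, e_b: float) -> float:
    return 1.0 - (1.0 - e_a) * (1.0 - e_b)

# e.g. two compounds each causing a 50% reduction in reproduction:
print(independent_action(0.5, 0.5))  # 0.75 under IA; a larger observed effect would indicate synergism
```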

Relevance:

30.00%

Publisher:

Abstract:

Airborne laser altimetry has the potential to make frequent detailed observations that are important for many aspects of studying land surface processes. However, the uncertainties inherent in airborne laser altimetry data have rarely been well measured. Uncertainty is often specified only generally, as 20 cm in elevation and 40 cm planimetric. To better constrain these uncertainties, we present an analysis of several datasets acquired specifically to study the temporal consistency of laser altimetry data, and thus assess its operational value. The error budget has three main components, each with its own time regime. For measurements acquired less than 50 ms apart, elevations have a local standard deviation in height of 3.5 cm, enabling the local measurement of surface roughness of the order of 5 cm. Points acquired seconds apart incur an additional random error due to Differential Global Positioning System (DGPS) fluctuation. Measurements made up to an hour apart show an elevation drift of 7 cm over half an hour. Over months, this drift gives rise to a random elevation offset between swathes, with an average of 6.4 cm. The RMS planimetric error in point location was derived as 37.4 cm. We conclude by considering the consequences of these uncertainties for the principal application of laser altimetry in the UK, intertidal zone monitoring.
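
As a rough illustration only, the elevation error components quoted above could be combined in quadrature if they are treated as independent; that independence is an assumption of this sketch, not a claim from the abstract:

```python
import math

# Elevation error components (m) from the abstract, combined in quadrature
# under the assumption that they are independent.
local_noise  = 0.035   # point-to-point std within 50 ms
dgps_drift   = 0.07    # drift over about half an hour
swath_offset = 0.064   # mean random offset between swathes acquired months apart

total = math.sqrt(local_noise**2 + dgps_drift**2 + swath_offset**2)
print(f"combined elevation uncertainty ~ {total:.3f} m")
```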

Relevance:

30.00%

Publisher:

Abstract:

Northern Hemisphere snow water equivalent (SWE) distributions from remote sensing (SSM/I), the ERA40 reanalysis product and the HadCM3 general circulation model are compared. Large differences are seen in the February climatologies, particularly over Siberia. The SSM/I retrieval algorithm may be overestimating SWE in this region, while comparison with independent runoff estimates suggests that HadCM3 is underestimating SWE. Treatment of snow grain size and vegetation parameterizations are concerns with the remotely sensed data. For this reason, ERA40 is used as 'truth' for the following experiments. Despite the climatology differences, HadCM3 is able to reproduce the distribution of ERA40 SWE anomalies when assimilating ERA40 anomaly fields of temperature, sea level pressure, atmospheric winds and ocean temperature and salinity. However, when forecasts are released from these assimilated initial states, the SWE anomaly distribution diverges rapidly from that of ERA40. No predictability is seen from one season to another. Strong links between European SWE distribution and the North Atlantic Oscillation (NAO) are seen, but forecasts of this index by the assimilation scheme are poor. Longer-term relationships between SWE and the NAO, and between SWE and the El Niño-Southern Oscillation (ENSO), are also investigated in a multi-century run of HadCM3. SWE is impacted by ENSO in the Himalayas and North America, while the NAO affects SWE in North America and Europe. While significant connections with the NAO index were only present in DJF (and to an extent SON), the link between ENSO and February SWE distribution was seen to exist from the previous JJA ENSO index onwards. This represents a long lead time for SWE prediction for hydrological applications such as flood and wildfire forecasting. Further work is required to develop reliable large-scale observation-based SWE datasets with which to test these model-derived connections.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we report a new method based on supercritical carbon dioxide (scCO2) to fill and distribute the porous magnetic nanoparticles with n-octanol in a homogeneous manner. The high solubility of n-octanol in scCO2 and the high diffusivity and permeability of the fluid allow efficient delivery of n-octanol into the porous magnetic nanoparticles. Thus, the n-octanol-loaded magnetic nanoparticles can be readily dispersed into aqueous buffer (pH 7.40) to form a homogeneous suspension consisting of nano-sized n-octanol droplets. We refer to this suspension as the n-octanol stock solution. The n-octanol stock solution is then mixed with a bulk aqueous phase (pH 7.40) containing an organic compound prior to magnetic separation. The small size of the particles and the efficient mixing enable rapid establishment of the partition equilibrium of the organic compound between the solid-supported n-octanol nano-droplets and the bulk aqueous phase. UV-vis spectrophotometry is then applied to determine the concentration of the organic compound in the aqueous phase both before and after partitioning (after magnetic separation). As a result, log D values of organic compounds of pharmaceutical interest determined by this modified method are found to be in excellent agreement with literature data.
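
A hedged sketch of the partitioning arithmetic implied by this approach: since absorbance is proportional to concentration (Beer-Lambert law), the aqueous-phase absorbance before and after equilibration, together with the two phase volumes, yields the distribution coefficient. The function name and all numerical values below are illustrative, not the paper's:

```python
import math

# D = ((A_before - A_after) / A_after) * (V_aq / V_oct);  log D = log10(D)
def log_d(a_before: float, a_after: float, v_aq: float, v_oct: float) -> float:
    d = ((a_before - a_after) / a_after) * (v_aq / v_oct)
    return math.log10(d)

# Illustrative absorbances and phase volumes (litres):
print(log_d(a_before=0.80, a_after=0.20, v_aq=2.0e-3, v_oct=1.0e-5))  # ~2.78
```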

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the extent to which clients were able to influence performance measurement appraisals during the downturn in commercial property markets that began in the UK during the second half of 2007. The sharp change in market sentiment produced speculation that different client categories were attempting to influence their appraisers in different ways. In particular, it was recognised that the requirement for open‐ended funds to meet redemptions gave them strong incentives to ensure that their asset values were marked down to market. Using data supplied by Investment Property Databank, we demonstrate that, indeed, unlisted open‐ended funds experienced sharper drops in capital values than other fund types in the last quarter of 2007, after the market turning point and at the time when redemptions were at their highest. These differences are statistically significant and cannot simply be explained by differences in portfolio composition. Client influence on appraisal forms one possible explanation of the results observed: the different pressures on fund managers resulting in different appraisal outcomes.

Relevance:

30.00%

Publisher:

Abstract:

Normally wind measurements from Doppler radars rely on the presence of rain. During fine weather, insects become a potential radar target for wind measurement. However, it is difficult to separate ground clutter and insect echoes when spectral or polarimetric methods are not available. Archived reflectivity and velocity data from repeated scans provide alternative methods. The probability of detection (POD) method, which maps areas with a persistent signal as ground clutter, is ineffective when most scans also contain persistent insect echoes. We developed a clutter detection method which maps the standard deviation of velocity (SDV) over a large number of scans, and can differentiate insects and ground clutter close to the radar. Beyond the range of persistent insect echoes, the POD method more thoroughly removes ground clutter. A new, pseudo-probability clutter map was created by combining the POD and SDV maps. The new map optimised ground clutter detection without removing insect echoes.
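
A hedged sketch of the two indicators described above, computed from a stack of repeated scans on a common range-azimuth grid. The detection threshold, the exponential blending of the two maps and the array shapes are assumptions for illustration, not the paper's exact recipe:

```python
import numpy as np

# Ground clutter tends to be persistent (high POD) with near-zero, stable
# velocity (low SDV); insects are also persistent but have variable velocity.
def clutter_maps(reflectivity, velocity, detect_threshold=0.0):
    detected = ~np.isnan(reflectivity) & (reflectivity > detect_threshold)
    pod = detected.mean(axis=0)                                    # probability of detection per gate
    sdv = np.nanstd(np.where(detected, velocity, np.nan), axis=0)  # velocity std over scans
    return pod, sdv

def pseudo_probability(pod, sdv, sdv_scale=1.0):
    # High POD combined with low SDV maps to a high clutter pseudo-probability.
    return pod * np.exp(-sdv / sdv_scale)

rng = np.random.default_rng(3)
refl = rng.normal(10.0, 5.0, size=(100, 60, 360))   # synthetic scans: scan x range x azimuth
vel = rng.normal(0.0, 2.0, size=(100, 60, 360))
pod, sdv = clutter_maps(refl, vel)
print(pseudo_probability(pod, sdv).shape)
```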

Relevance:

30.00%

Publisher:

Abstract:

The use of data reconciliation techniques can considerably reduce the inaccuracy of process data due to measurement errors. This in turn results in improved control system performance and process knowledge. Dynamic data reconciliation techniques are applied to a model-based predictive control scheme. It is shown through simulations on a chemical reactor system that the overall performance of the model-based predictive controller is enhanced considerably when data reconciliation is applied. The dynamic data reconciliation techniques used include a combined strategy for the simultaneous identification of outliers and systematic bias.
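
As a simplified, steady-state illustration of data reconciliation (the scheme described above is dynamic and also identifies outliers and systematic bias, which this sketch omits), measurements can be adjusted by weighted least squares so that they satisfy linear balance constraints. The single-splitter flowsheet and the numbers are invented:

```python
import numpy as np

# Reconcile measurements y to satisfy A x = 0 while staying close to y
# in the sense of the measurement error covariance V:
#   x_hat = y - V A^T (A V A^T)^{-1} A y
def reconcile(y, A, V):
    correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ y)
    return y - correction

A = np.array([[1.0, -1.0, -1.0]])       # mass balance: F1 - F2 - F3 = 0
y = np.array([100.0, 64.0, 38.0])       # raw flow measurements (imbalance of -2)
V = np.diag([1.0, 0.5, 0.5])            # measurement error variances

x_hat = reconcile(y, A, V)
print(x_hat, "balance residual:", A @ x_hat)   # reconciled flows close the balance exactly
```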

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the extent to which clients were able to influence performance measurement appraisals during the downturn in commercial property markets that began in the UK during the second half of 2007. The sharp change in market sentiment produced speculation that different client categories were attempting to influence their appraisers in different ways. In particular, it was recognised that the requirement for open-ended funds to meet redemptions gave them strong incentives to ensure that their asset values were marked down to market. Using data supplied by Investment Property Databank, we demonstrate that, indeed, unlisted open-ended funds experienced sharper drops in capital values than other fund types in the second half of 2007, after the market turning point. These differences are statistically significant and cannot simply be explained by differences in portfolio composition. Client influence on appraisal forms one possible explanation of the results observed: the different pressures on fund managers resulting in different appraisal outcomes.

Relevance:

30.00%

Publisher:

Abstract:

This paper uses data provided by three major real estate advisory firms to investigate the level and pattern of variation in the measurement of historic real estate rental values for the main European office centres. The paper assesses the extent to which the data-providing organizations agree on historic market performance in terms of returns, risk and timing, and examines the relationship between market maturity and agreement. The analysis suggests that, at the aggregate level and for many markets, there is substantial agreement on the direction, quantity and timing of market change. However, there is substantial variability in the level of agreement among cities. The paper also assesses whether the different data sets produce different explanatory models and market forecasts. It is concluded that, although disagreement on the direction of market change is high for many markets, the different data sets often produce similar explanatory models and predict similar relative performance.