42 results for Surface conditioning methods
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but the most successful method depends on the region considered, the GCM data used and the prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different from the regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far North Atlantic, suggesting that the more northern latitudes are optimal locations for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, this depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
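As a rough illustration of the Constructed Analogue idea referred to above, the following minimal Python sketch expresses the current SST anomaly pattern as a least-squares combination of historical library states and applies the same weights to those states' later evolution. The function name, array layout and synthetic data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def constructed_analogue_forecast(library, current_anomaly, lead):
    """Illustrative constructed-analogue prediction.

    library         : (n_years, n_gridpoints) historical SST anomaly fields
    current_anomaly : (n_gridpoints,) anomaly pattern to be projected forward
    lead            : forecast lead in years
    """
    n_years = library.shape[0]
    predictors = library[: n_years - lead]      # states whose future is known
    successors = library[lead:]                 # the same states `lead` years later
    # weights alpha minimise || predictors.T @ alpha - current_anomaly ||
    alpha, *_ = np.linalg.lstsq(predictors.T, current_anomaly, rcond=None)
    return successors.T @ alpha                 # forecast anomaly field

# toy usage with synthetic data (100 "years", 500 grid boxes)
rng = np.random.default_rng(0)
lib = rng.standard_normal((100, 500))
forecast = constructed_analogue_forecast(lib, lib[-1], lead=5)
```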
Abstract:
The applicability of the BET model for calculating the surface area of activated carbons is checked using molecular simulations. By calculating geometric surface areas for a simple model carbon slit-like pore of increasing width, and comparing the obtained values with those for the same systems from the VEGA ZZ package (adsorbate-accessible molecular surface), it is shown that the latter method provides correct values. For systems where a monolayer is created inside a pore, the ASA approach (GCMC, Ar, T = 87 K) underestimates the surface area of micropores (especially where only one layer is observed and/or two layers of adsorbed Ar are formed). Therefore, we propose a modification of this method based on the relationship between the pore diameter and the number of layers in a pore. Finally, BET; original and modified ASA; and A-, B- and C-point surface areas are calculated for a series of virtual porous carbons using simulated Ar adsorption isotherms (GCMC, T = 87 K). The comparison of results shows that the BET method underestimates, and not, as usually postulated, overestimates, the surface areas of microporous carbons.
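For readers unfamiliar with the BET analysis discussed above, the sketch below shows the standard linearised BET fit applied to an Ar isotherm. The fitting range, the assumed Ar cross-sectional area and the variable names are illustrative choices, not values taken from the paper.

```python
import numpy as np

N_A = 6.022e23         # Avogadro constant [1/mol]
V_STP = 22414.0        # molar gas volume at STP [cm^3/mol]
SIGMA_AR = 0.142e-18   # assumed cross-sectional area of adsorbed Ar [m^2]

def bet_surface_area(p_rel, v_ads, fit_range=(0.05, 0.30)):
    """Classical BET surface area from an adsorption isotherm.

    p_rel : relative pressures p/p0
    v_ads : adsorbed amounts [cm^3 STP per gram of carbon]
    The transformed isotherm p/(v*(p0-p)) = 1/(v_m*c) + ((c-1)/(v_m*c)) * p/p0
    is fitted linearly; the monolayer capacity v_m gives the area.
    """
    mask = (p_rel >= fit_range[0]) & (p_rel <= fit_range[1])
    x = p_rel[mask]
    y = x / (v_ads[mask] * (1.0 - x))        # BET transform
    slope, intercept = np.polyfit(x, y, 1)
    v_m = 1.0 / (slope + intercept)          # monolayer capacity [cm^3 STP/g]
    return v_m * N_A * SIGMA_AR / V_STP      # BET surface area [m^2/g]
```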
Abstract:
Background: Thiol isomerases are a family of endoplasmic reticulum enzymes which orchestrate redox-based modifications of protein disulphide bonds. Previous studies have identified important roles for the thiol isomerases PDI and ERp5 in the regulation of normal platelet function. Objectives: Recently, we demonstrated the presence of a further five thiol isomerases at the platelet surface. Here we report the role of one of these enzymes, ERp57, in the regulation of platelet function. Methods/Results: Using antibodies that block its enzyme activity, we demonstrate a role for ERp57 in platelet aggregation, dense granule secretion, fibrinogen binding, calcium mobilisation and thrombus formation under arterial conditions. In addition to the effects of ERp57 on isolated platelets, we observe the presence of ERp57 in the developing thrombus in vivo. Furthermore, the inhibition of ERp57 function was found to reduce laser injury-induced arterial thrombus formation in a murine model of thrombosis. Conclusions: These data suggest that ERp57 is important for normal platelet function and open up the possibility that the regulation of platelet function by a range of cell-surface thiol isomerases may represent a broad paradigm for the regulation of haemostasis and thrombosis.
Abstract:
A specific traditional plate count method and real-time PCR systems based on SYBR Green I and TaqMan technologies, using a specific primer pair and probe for amplification of the iap gene, were used for quantitative assay of Listeria monocytogenes in seven decimal serial dilution series of nutrient broth and milk samples containing 1.58 to 1.58×10⁷ cfu/ml, and the real-time PCR methods were compared with the plate count method with respect to accuracy and sensitivity. In this study, the plate count method was performed by surface-plating 0.1 ml of each sample on Palcam Agar. The lowest detectable level for this method was 1.58×10 cfu/ml for both nutrient broth and milk samples. Using purified DNA as a template for generation of standard curves, as few as four copies of the iap gene could be detected per reaction with both real-time PCR assays, indicating that they were highly sensitive. When these real-time PCR assays were applied to quantification of L. monocytogenes in decimal serial dilution series of nutrient broth and milk samples, 3.16×10 to 3.16×10⁵ copies per reaction (equal to 1.58×10³ to 1.58×10⁷ cfu/ml of L. monocytogenes) were detectable. Expressed as logarithmic cycles, the quantitative results of the detectable steps for the plate count and both molecular assays were similar to the inoculation levels.
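To make the standard-curve quantification step concrete, here is a small sketch of how Ct values from a dilution series are typically converted to copy numbers per reaction. The Ct values, copy numbers and function names are hypothetical illustrations, not data from this study.

```python
import numpy as np

def fit_standard_curve(copies, ct):
    """Fit Ct = slope*log10(copies) + intercept from a dilution series."""
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0     # PCR amplification efficiency
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate copies per reaction."""
    return 10 ** ((ct - intercept) / slope)

# hypothetical dilution series of purified DNA (copies per reaction)
std_copies = np.array([4e0, 4e1, 4e2, 4e3, 4e4, 4e5, 4e6])
std_ct = np.array([36.1, 32.8, 29.4, 26.1, 22.7, 19.3, 16.0])  # hypothetical Ct values
slope, intercept, eff = fit_standard_curve(std_copies, std_ct)
unknown = copies_from_ct(27.5, slope, intercept)                # copies for a sample Ct
```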
Abstract:
Accurate decadal climate predictions could be used to inform adaptation actions to a changing climate. The skill of such predictions from initialised dynamical global climate models (GCMs) may be assessed by comparing with predictions from statistical models which are based solely on historical observations. This paper presents two benchmark statistical models for predicting both the radiatively forced trend and the internal variability of annual mean sea surface temperatures (SSTs) on a decadal timescale, based on the gridded observational data set HadISST. For both statistical models, the trend related to radiative forcing is modelled using a linear regression of the SST time series at each grid box on the time series of equivalent global mean atmospheric CO2 concentration. The residual internal variability is then modelled by (1) a first-order autoregressive model (AR1) and (2) a constructed analogue model (CA). From the verification of 46 retrospective forecasts with start years from 1960 to 2005, the correlation coefficient for anomaly forecasts using trend with AR1 is greater than 0.7 over parts of the extra-tropical North Atlantic, the Indian Ocean and the western Pacific. This is primarily related to the prediction of the forced trend. More importantly, both CA and AR1 give skillful predictions of the internal variability of SSTs in the subpolar gyre region over the far North Atlantic for lead times of 2 to 5 years, with correlation coefficients greater than 0.5. For the subpolar gyre and parts of the South Atlantic, CA is superior to AR1 for lead times of 6 to 9 years. These statistical forecasts are also compared with ensemble mean retrospective forecasts by DePreSys, an initialised GCM. DePreSys is found to outperform the statistical models over large parts of the North Atlantic for lead times of 2 to 5 years and 6 to 9 years; however, trend with AR1 is generally superior to DePreSys in the North Atlantic Current region, while trend with CA is superior to DePreSys in parts of the South Atlantic for lead times of 6 to 9 years. These findings encourage further development of benchmark statistical decadal prediction models, and of methods to combine different predictions.
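The "trend with AR1" benchmark described above can be illustrated with a minimal sketch for a single grid box: a linear regression of SST on equivalent CO2 gives the forced trend, and the residual is propagated with a first-order autoregressive model. The function and inputs below are illustrative assumptions rather than the paper's code.

```python
import numpy as np

def trend_plus_ar1_forecast(sst, co2, co2_future, lead):
    """Benchmark 'trend with AR1' forecast for one grid-box annual SST series.

    sst        : historical annual-mean SSTs
    co2        : equivalent global-mean CO2 concentrations for the same years
    co2_future : CO2 value assumed for the target year
    lead       : lead time in years from the last observed year
    """
    # forced trend: linear regression of SST on CO2
    slope, intercept = np.polyfit(co2, sst, 1)
    residual = sst - (intercept + slope * co2)
    # internal variability: lag-1 autocorrelation of the residual
    rho = np.corrcoef(residual[:-1], residual[1:])[0, 1]
    # the AR1 anomaly decays towards zero with lead time
    return intercept + slope * co2_future + (rho ** lead) * residual[-1]
```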
Abstract:
The aim of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. It is independent of the boundary condition and is applicable to limited-aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data are collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches to sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method in the rough surface scattering case and provide numerical simulations and examples.
Abstract:
We present a new iterative approach called Line Adaptation for the Singular Sources Objective (LASSO) to object or shape reconstruction, based on the singular sources method (or probe method) for the reconstruction of scatterers from the far-field pattern of scattered acoustic or electromagnetic waves. The scheme is based on the construction of an indicator function, given by the scattered field for incident point sources evaluated at the source point, from the given far-field patterns for plane waves. The indicator function is then used to drive the contraction of a surface which surrounds the unknown scatterers. A stopping criterion is formulated for those parts of the surface that touch the unknown scatterers, and a splitting approach for the contracting surfaces is formulated such that scatterers consisting of several separate components can be reconstructed. Convergence of the scheme is shown, and its feasibility is demonstrated in a numerical study with several examples.
Conditioning model output statistics of regional climate model precipitation on circulation patterns
Abstract:
Dynamical downscaling of Global Climate Models (GCMs) through regional climate models (RCMs) potentially improves the usability of the output for hydrological impact studies. However, a further downscaling or interpolation of precipitation from RCMs is often needed to match the precipitation characteristics at the local scale. This study analysed three Model Output Statistics (MOS) techniques to adjust RCM precipitation: (1) a simple direct method (DM), (2) quantile-quantile mapping (QM) and (3) a distribution-based scaling (DBS) approach. The modelled precipitation consisted of daily means from 16 RCMs driven by ERA40 reanalysis data over the period 1961–2000, provided by the ENSEMBLES (ENSEMBLE-based Predictions of Climate Changes and their Impacts) project, for a small catchment located in the Midlands, UK. All methods were conditioned on the entire time series, on separate months, and on an objective classification of Lamb's weather types. The performance of the MOS techniques was assessed with regard to temporal and spatial characteristics of the precipitation fields, as well as modelled runoff using the HBV rainfall-runoff model. The results indicate that DBS conditioned on classification patterns performed better than the other methods; however, an ensemble approach, in terms of both climate models and downscaling methods, is recommended to account for uncertainties in the MOS methods.
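As a minimal sketch of the quantile-quantile mapping (QM) step mentioned above: each RCM precipitation value is mapped onto the observed value at the same empirical quantile of the calibration period. In the study this conditioning is also done per month or per weather type; the function below is an illustrative assumption, not the study's implementation.

```python
import numpy as np

def quantile_mapping(rcm_calib, obs_calib, rcm_values):
    """Empirical quantile-quantile mapping of RCM daily precipitation.

    rcm_calib  : RCM precipitation over the calibration period
    obs_calib  : observed precipitation over the same period
    rcm_values : RCM values to be corrected
    """
    q = np.linspace(0.0, 1.0, 101)
    rcm_q = np.quantile(rcm_calib, q)   # RCM calibration quantiles
    obs_q = np.quantile(obs_calib, q)   # observed calibration quantiles
    # place each value on the RCM distribution, read off the observed one
    return np.interp(rcm_values, rcm_q, obs_q)
```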
Abstract:
We investigate Fréchet differentiability of the scattered field with respect to variation in the boundary in the case of time-harmonic acoustic scattering by an unbounded, sound-soft, one-dimensional rough surface. We rigorously prove the differentiability of the scattered field and derive a characterization of the Fréchet derivative as the solution to a Dirichlet boundary value problem. As an application of these results we give rigorous error estimates for first-order perturbation theory, justifying small perturbation methods that have a long history in the engineering literature. As an application of our rigorous estimates we show that a plane acoustic wave incident on a sound-soft rough surface can produce an unbounded scattered field.
Abstract:
We consider the Dirichlet and Robin boundary value problems for the Helmholtz equation in a non-locally perturbed half-plane, modelling time-harmonic acoustic scattering of an incident field by, respectively, sound-soft and impedance infinite rough surfaces. Recently proposed novel boundary integral equation formulations of these problems are discussed. It is usual in practical computations to truncate the infinite rough surface, solving a boundary integral equation on a finite section of the boundary, of length 2A, say. In the case of surfaces of small amplitude and slope we prove the stability and convergence as A→∞ of this approximation procedure. For surfaces of arbitrarily large amplitude and/or surface slope we prove stability and convergence of a modified finite section procedure in which the truncated boundary is ‘flattened’ in finite neighbourhoods of its two endpoints.
Abstract:
Three methods for intercalibrating humidity sounding channels are compared to assess their merits and demerits. The methods use the following: (1) natural targets (Antarctica and tropical oceans), (2) zonally averaged brightness temperatures, and (3) simultaneous nadir overpasses (SNOs). Advanced Microwave Sounding Unit-B instruments onboard the polar-orbiting NOAA 15 and NOAA 16 satellites are used as examples. Antarctica is shown to be useful for identifying some of the instrument problems but less promising for intercalibrating humidity sounders due to the large diurnal variations there. Owing to smaller diurnal cycles over tropical oceans, these are found to be a good target for estimating intersatellite biases. Estimated biases are more resistant to diurnal differences when data from ascending and descending passes are combined. Biases estimated from zonally averaged brightness temperatures show large seasonal and latitudinal dependence, which could have resulted from diurnal cycle aliasing and scene-radiance dependence of the biases. This method may not be the best for channels with significant surface contributions. We have also tested the impact of clouds on the estimated biases and found that it is not significant, at least for tropical ocean estimates. Biases estimated from SNOs are the least influenced by diurnal cycle aliasing and cloud impacts. However, SNOs cover only a relatively small part of the dynamic range of observed brightness temperatures.
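A minimal sketch of the SNO-based bias estimate discussed above: brightness temperatures from the two instruments are paired where their nadir views coincide closely in time, and the bias is the mean difference of the paired observations. The time threshold and names below are illustrative assumptions.

```python
import numpy as np

def sno_bias(bt_a, bt_b, time_a, time_b, max_dt=300.0):
    """Intersatellite bias from simultaneous nadir overpasses (SNOs).

    bt_a, bt_b     : collocated nadir brightness temperatures from the two sounders [K]
    time_a, time_b : observation times of each collocation [s]
    max_dt         : maximum allowed time separation for a valid SNO pair [s]
    """
    valid = np.abs(time_a - time_b) <= max_dt
    diff = bt_a[valid] - bt_b[valid]
    return diff.mean(), diff.std()   # bias and its scatter
```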
Abstract:
Optimal estimation (OE) and probabilistic cloud screening were developed to provide lake surface water temperature (LSWT) estimates from the series of (advanced) along-track scanning radiometers (ATSRs). Variations in physical properties such as elevation, salinity and atmospheric conditions are accounted for through forward modelling of the observed radiances. Therefore, the OE retrieval scheme developed is generic (i.e., applicable to all lakes). LSWTs were obtained for 258 of Earth's largest lakes from ATSR-2 and AATSR imagery from 1995 to 2009. Comparison with in situ observations from several lakes yields satellite minus in situ differences of −0.2 ± 0.7 K for daytime and −0.1 ± 0.5 K for nighttime observations (mean ± standard deviation). This compares with −0.05 ± 0.8 K for daytime and −0.1 ± 0.9 K for nighttime observations for previous methods based on operational sea surface temperature algorithms. The new approach also increases coverage (reducing misclassification of clear sky as cloud) and exhibits greater consistency between retrievals using different channel-view combinations. Empirical orthogonal function (EOF) techniques were applied to the LSWT retrievals (which contain gaps due to cloud cover) to reconstruct spatially and temporally complete time series of LSWT. The new LSWT observations and the EOF-based reconstructions offer benefits to numerical weather prediction and lake model validation, and improve our knowledge of the climatology of lakes globally. Both observations and reconstructions are publicly available from http://hdl.handle.net/10283/88.
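As an illustration of the EOF-based gap filling mentioned above, the sketch below iteratively fills cloud gaps in a (time × space) anomaly matrix using a truncated EOF (SVD) reconstruction, in the spirit of DINEOF-type methods. The number of modes, the iteration count and the names are assumptions; the study's actual reconstruction scheme may differ.

```python
import numpy as np

def eof_reconstruct(anomalies, n_modes=5, n_iter=50):
    """Fill gaps (NaNs) in a (time, space) LSWT anomaly matrix.

    Missing values are initialised to zero (the anomaly mean) and then
    repeatedly replaced by a reconstruction from the leading EOF modes.
    """
    data = anomalies.copy()
    missing = np.isnan(data)
    data[missing] = 0.0
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(data, full_matrices=False)
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        data[missing] = recon[missing]      # update only the gaps
    return data
```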
Abstract:
We describe the approach to be adopted for a major new initiative to derive a homogeneous record of sea surface temperature for 1991–2007 from the observations of the series of three along-track scanning radiometers (ATSRs). This initiative is called (A)RC: (Advanced) ATSR Re-analysis for Climate. The main objectives are to reduce regional biases in retrieved sea surface temperature (SST) to less than 0.1 K for all global oceans, while creating a very homogeneous record that is stable in time to within 0.05 K decade⁻¹, with maximum independence of the record from existing analyses of SST used in climate change research. If these stringent targets are achieved, this record will enable significantly improved estimates of surface temperature trends and variability, of sufficient quality to advance questions of climate change attribution, climate sensitivity and historical reconstruction of surface temperature changes. The approach includes development of new, consistent estimators for SST for each of the ATSRs, and detailed analysis of overlap periods. Novel aspects of the approach include the generation of multiple versions of the record using alternative channel sets and cloud detection techniques, to assess for the first time the effect of such choices. There will be extensive effort in quality control, validation and analysis of the impact on climate SST data sets. Evidence for the plausibility of the 0.1 K target for systematic error is reviewed, as is the need for alternative cloud screening methods in this context.
Abstract:
This paper details an investigation into sensory substitution by means of direct electrical stimulation of the tongue for the purpose of information input to the human brain. In particular, a device has been constructed and a series of trials has been performed in order to demonstrate the efficacy and performance of an electro-tactile array mounted on the tongue surface for the purpose of sensory augmentation. Tests have shown that, using a low-resolution array, a computer-to-human feedback loop can be successfully used to complete tasks such as object tracking, surface shape identification and shape recognition with no training or prior experience with the device. Comparisons of this technique have been made with visual alternatives, and these show that the tongue-based tactile array can match such methods in convenience and accuracy when performing simple tasks.
Abstract:
We present five new cloud detection algorithms over land based on dynamic-threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8–9% over SADIST (maximum score of 77.93% compared to 69.27%). We present an assessment of the impact of imperfect cloud masking, relative to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3×3 pixel domain. We find an increase of 5–7% in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02% compared to 85.69%). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, which is applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
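For reference, the true skill score used above for comparing a cloud mask against the manually classified reference is the hit rate minus the false-alarm rate. A minimal sketch follows; the contingency-table counts in the usage line are hypothetical, not results from the paper.

```python
def true_skill_score(hits, misses, false_alarms, correct_negatives):
    """True skill statistic for a binary cloud mask versus a reference mask.

    TSS = hit rate - false-alarm rate
        = hits/(hits + misses) - false_alarms/(false_alarms + correct_negatives)
    """
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# hypothetical contingency-table counts for one scene
tss = true_skill_score(hits=820, misses=120, false_alarms=60, correct_negatives=900)
```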