74 results for 2-EPT probability density function
Abstract:
This paper builds upon previous research on currency bands and provides a model for the Colombian peso. Stochastic differential equations are combined with information on the Colombian currency band to estimate competing models of the behaviour of the Colombian peso within the limits of the band. The resulting moments of the density function for the simulated returns adequately describe most of the characteristics of the sample returns data. The factor included to account for the intra-marginal intervention performed to drive the rate towards the Central Parity accounts for only 6.5% of the daily change, which supports the argument that intervention, if performed by the Central Bank, is not directed at pushing the currency towards the limits. Moreover, the credibility of the Colombian Central Bank's (Banco de la República) ability to defend the band appears low.
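The abstract does not give the model equations, so the following is only a hedged sketch of the kind of simulation it describes: a mean-reverting daily rate inside a band, with a hypothetical coefficient `kappa` standing in for the intra-marginal intervention factor. All parameter values, and the truncation at the band limits, are assumptions for illustration, not the paper's calibrated model.

```python
# Illustrative sketch only: Euler-Maruyama simulation of log-deviations of an exchange
# rate from its central parity, kept inside a currency band. `kappa` plays the role of
# an intra-marginal intervention factor; every number below is made up.
import numpy as np

def simulate_band_rate(x0=0.0, parity=0.0, half_width=0.07, kappa=0.065,
                       sigma=0.01, n_steps=250, dt=1.0, seed=0):
    """Simulate daily log-deviations from central parity, truncated at the band limits."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        drift = -kappa * (x[t] - parity)                    # intervention pulls toward parity
        shock = sigma * np.sqrt(dt) * rng.standard_normal() # daily stochastic shock
        x[t + 1] = np.clip(x[t] + drift * dt + shock,
                           parity - half_width, parity + half_width)  # band limits
    return x

returns = np.diff(simulate_band_rate())
print(f"mean = {returns.mean():.5f}, std = {returns.std():.5f}")
```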
Abstract:
In the present paper we characterize the statistical properties of non-precipitating tropical ice clouds (deep ice anvils resulting from deep convection and cirrus clouds) over Niamey, Niger, West Africa, and Darwin, northern Australia, using ground-based radar–lidar observations from the Atmospheric Radiation Measurement (ARM) programme. The ice cloud properties analysed in this paper are the frequency of ice cloud occurrence, cloud fraction, the morphological properties (cloud-top height, base height, and thickness), the microphysical and radiative properties (ice water content, visible extinction, effective radius, terminal fall speed, and concentration), and the internal cloud dynamics (in-cloud vertical air velocity). The main highlight of the paper is that it characterizes for the first time the probability density functions of the tropical ice cloud properties, their vertical variability and their diurnal variability at the same time. This is particularly important over West Africa, since the ARM deployment in Niamey provides the first vertically resolved observations of non-precipitating ice clouds in this crucial area in terms of redistribution of water and energy in the troposphere. The comparison between the two sites also provides an additional observational basis for the evaluation of the parametrization of clouds in large-scale models, which should be able to reproduce both the statistical properties at each site and the differences between the two sites. The frequency of ice cloud occurrence is found to be much larger over Darwin when compared to Niamey, and with a much larger diurnal variability, which is well correlated with the diurnal cycle of deep convective activity. The diurnal cycle of the ice cloud occurrence over Niamey is also much less correlated with that of deep convective activity than over Darwin, probably owing to the fact that Niamey is further away from the deep convective sources of the region. The frequency distributions of cloud fraction are strongly bimodal and broadly similar over the two sites, with a predominance of clouds characterized either by a very small cloud fraction (less than 0.3) or a very large cloud fraction (larger than 0.9). The ice clouds over Darwin are also much thicker (by 1 km or more statistically) and are characterized by a much larger diurnal variability than ice clouds over Niamey. Ice clouds over Niamey are also characterized by smaller particle sizes and fall speeds but in much larger concentrations, thereby carrying more ice water and producing more visible extinction than the ice clouds over Darwin. It is also found that there is a much larger occurrence of downward in-cloud air motions less than 1 m s−1 over Darwin, which together with the larger fall speeds retrieved over Darwin indicates that the life cycle of ice clouds is probably shorter over Darwin than over Niamey.
Abstract:
An experimental method is described which enables the inelastically scattered X-ray component to be removed from diffractometer data prior to radial density function analysis. At each scattering angle an energy spectrum is generated from a Si(Li) detector combined with a multi-channel analyser, from which the coherently scattered component is separated. The data obtained from organic polymers have an improved signal/noise ratio at high values of scattering angle, and a commensurate enhancement of resolution of the RDF at low r is demonstrated for the case of PMMA (ICI 'Perspex'). The method obviates the need for the complicated correction for multiple scattering.
Abstract:
We compare the characteristics of synthetic European droughts generated by the HiGEM coupled climate model run with present-day atmospheric composition with observed drought events extracted from the CRU TS3 data set. (HiGEM is based on the latest climate configuration of the Met Office Hadley Centre Unified Model (HadGEM1), with the horizontal resolution increased to 1.25° × 0.83° in longitude and latitude in the atmosphere and 1/3° × 1/3° in the ocean.) The results demonstrate consistency in both the rate of drought occurrence and the spatiotemporal structure of the events. Estimates of the probability density functions for event area, duration and severity are shown to be similar with confidence > 90%. Encouragingly, HiGEM is shown to replicate the extreme tails of the observed distributions and thus the most damaging European drought events. The soil moisture state is shown to play an important role in drought development. Once a large-scale drought has been initiated, it is found to be 50% more likely to continue if the local soil moisture is below the 40th percentile. In response to increased concentrations of atmospheric CO2, the modelled droughts are found to increase in duration, area and severity. The drought response can be largely attributed to temperature-driven changes in relative humidity.
Abstract:
Airborne lidar provides accurate height information of objects on the earth and has been recognized as a reliable and accurate surveying tool in many applications. In particular, lidar data offer vital and significant features for urban land-cover classification, which is an important task in urban land-use studies. In this article, we present an effective approach in which lidar data fused with its co-registered images (i.e. aerial colour images containing red, green and blue (RGB) bands and near-infrared (NIR) images) and other derived features are used effectively for accurate urban land-cover classification. The proposed approach begins with an initial classification performed by the Dempster–Shafer theory of evidence with a specifically designed basic probability assignment function. It outputs two results, i.e. the initial classification and pseudo-training samples, which are selected automatically according to the combined probability masses. Second, a support vector machine (SVM)-based probability estimator is adopted to compute the class conditional probability (CCP) for each pixel from the pseudo-training samples. Finally, a Markov random field (MRF) model is established to combine spatial contextual information into the classification. In this stage, the initial classification result and the CCP are exploited. An efficient belief propagation (EBP) algorithm is developed to search for the global minimum-energy solution for the maximum a posteriori (MAP)-MRF framework in which three techniques are developed to speed up the standard belief propagation (BP) algorithm. Lidar and its co-registered data acquired by Toposys Falcon II are used in performance tests. The experimental results prove that fusing the height data and optical images is particularly suited for urban land-cover classification. There is no training sample needed in the proposed approach, and the computational cost is relatively low. An average classification accuracy of 93.63% is achieved.
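As a hedged illustration of the evidence-combination step behind the initial classification (not the paper's specifically designed basic probability assignment function), the sketch below applies Dempster's rule to two hand-made mass functions over a toy set of land-cover classes; the classes, masses and sensor interpretations are hypothetical.

```python
# Toy Dempster-Shafer combination of two basic probability assignments (BPAs).
# Each BPA maps a frozenset of candidate classes to a mass; masses sum to 1.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BPAs given as {frozenset_of_classes: mass} via Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                      # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence: lidar height favours 'building', NIR favours vegetation.
m_height = {frozenset({"building"}): 0.6, frozenset({"building", "tree"}): 0.3,
            frozenset({"building", "tree", "road"}): 0.1}
m_nir    = {frozenset({"tree"}): 0.5, frozenset({"building", "road"}): 0.3,
            frozenset({"building", "tree", "road"}): 0.2}
print(dempster_combine(m_height, m_nir))
```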
Abstract:
Our knowledge of stratospheric O3-N2O correlations is extended, and their potential for model-measurement comparison assessed, using data from the Atmospheric Chemistry Experiment (ACE) satellite and the Canadian Middle Atmosphere Model (CMAM). ACE provides the first comprehensive data set for the investigation of interhemispheric, interseasonal, and height-resolved differences of the O_3-N_2O correlation structure. By subsampling the CMAM data, the representativeness of the ACE data is evaluated. In the middle stratosphere, where the correlations are not compact and therefore mainly reflect the data sampling, joint probability density functions provide a detailed picture of key aspects of transport and mixing, but also trace polar ozone loss. CMAM captures these important features, but exhibits a displacement of the tropical pipe into the Southern Hemisphere (SH). Below about 21 km, the ACE data generally confirm the compactness of the correlations, although chemical ozone loss tends to destroy the compactness during late winter/spring, especially in the SH. This allows a quantitative comparison of the correlation slopes in the lower and lowermost stratosphere (LMS), which exhibit distinct seasonal cycles that reveal the different balances between diabatic descent and horizontal mixing in these two regions in the Northern Hemisphere (NH), reconciling differences found in aircraft measurements, and the strong role of chemical ozone loss in the SH. The seasonal cycles are qualitatively well reproduced by CMAM, although their amplitude is too weak in the NH LMS. The correlation slopes allow a "chemical" definition of the LMS, which is found to vary substantially in vertical extent with season.
Abstract:
Airborne high resolution in situ measurements of a large set of trace gases including ozone (O3) and total water (H2O) in the upper troposphere and the lowermost stratosphere (UT/LMS) have been performed above Europe within the SPURT project. SPURT provides an extensive data coverage of the UT/LMS in each season within the time period between November 2001 and July 2003. In the LMS a distinct spring maximum and autumn minimum is observed in O3, whereas its annual cycle in the UT is shifted by 2–3 months later towards the end of the year. The more variable H2O measurements reveal a maximum during summer and a minimum during autumn/winter with no phase shift between the two atmospheric compartments. For a comprehensive insight into trace gas composition and variability in the UT/LMS several statistical methods are applied using chemical, thermal and dynamical vertical coordinates. In particular, 2-dimensional probability distribution functions serve as a tool to transform localised aircraft data to a more comprehensive view of the probed atmospheric region. It appears that both trace gases, O3 and H2O, reveal the most compact arrangement and are best correlated in the view of potential vorticity (PV) and distance to the local tropopause, indicating an advanced mixing state on these surfaces. Thus, strong gradients of PV seem to act as a transport barrier both in the vertical and the horizontal direction. The alignment of trace gas isopleths reflects the existence of a year-round extra-tropical tropopause transition layer. The SPURT measurements reveal that this layer is mainly affected by stratospheric air during winter/spring and by tropospheric air during autumn/summer. Normalised mixing entropy values for O3 and H2O in the LMS appear to be maximal during spring and summer, respectively, indicating highest variability of these trace gases during the respective seasons.
Abstract:
We propose and demonstrate a fully probabilistic (Bayesian) approach to the detection of cloudy pixels in thermal infrared (TIR) imagery observed from satellite over oceans. Using this approach, we show how to exploit the prior information and the fast forward modelling capability that are typically available in the operational context to obtain improved cloud detection. The probability of clear sky for each pixel is estimated by applying Bayes' theorem, and we describe how to apply Bayes' theorem to this problem in general terms. Joint probability density functions (PDFs) of the observations in the TIR channels are needed; the PDFs for clear conditions are calculable from forward modelling and those for cloudy conditions have been obtained empirically. Using analysis fields from numerical weather prediction as prior information, we apply the approach to imagery representative of imagers on polar-orbiting platforms. In comparison with the established cloud-screening scheme, the new technique decreases both the rate of failure to detect cloud contamination and the false-alarm rate by one quarter. The rate of occurrence of cloud-screening-related errors of >1 K in area-averaged SSTs is reduced by 83%. Copyright © 2005 Royal Meteorological Society.
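A minimal sketch of the per-pixel Bayes step described above, with made-up numbers: a narrow Gaussian stands in for the forward-modelled clear-sky PDF of a brightness temperature and a broader Gaussian for the empirically derived cloudy PDF. The channel, prior and all parameter values are assumptions, not the operational scheme.

```python
# Posterior probability of clear sky for one pixel from Bayes' theorem.
from math import exp, pi, sqrt

def gaussian_pdf(y, mu, sigma):
    return exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * sqrt(2.0 * pi))

def prob_clear(y, prior_clear, clear_mu, clear_sigma, cloudy_mu, cloudy_sigma):
    """P(clear | y) = p(y|clear) P(clear) / [p(y|clear) P(clear) + p(y|cloudy) P(cloudy)]."""
    like_clear = gaussian_pdf(y, clear_mu, clear_sigma)     # forward-modelled clear PDF (assumed Gaussian)
    like_cloudy = gaussian_pdf(y, cloudy_mu, cloudy_sigma)  # empirical cloudy PDF (assumed Gaussian)
    evidence = like_clear * prior_clear + like_cloudy * (1.0 - prior_clear)
    return like_clear * prior_clear / evidence

# Hypothetical 11-micron brightness temperature of 288 K, NWP-based prior of 0.7 for clear sky.
print(f"P(clear | y) = {prob_clear(288.0, 0.7, 290.0, 1.0, 280.0, 6.0):.3f}")
```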
Abstract:
Atmospheric aerosols cause scattering and absorption of incoming solar radiation. Additional anthropogenic aerosols released into the atmosphere thus exert a direct radiative forcing on the climate system [1]. The degree of present-day aerosol forcing is estimated from global models that incorporate a representation of the aerosol cycles [1-3]. Although the models are compared and validated against observations, these estimates remain uncertain. Previous satellite measurements of the direct effect of aerosols contained limited information about aerosol type, and were confined to oceans only [4,5]. Here we use state-of-the-art satellite-based measurements of aerosols [6-8] and surface wind speed [9] to estimate the clear-sky direct radiative forcing for 2002, incorporating measurements over land and ocean. We use a Monte Carlo approach to account for uncertainties in aerosol measurements and in the algorithm used. Probability density functions obtained for the direct radiative forcing at the top of the atmosphere give a clear-sky, global, annual average of −1.9 W m⁻² with a standard deviation of ±0.3 W m⁻². These results suggest that present-day direct radiative forcing is stronger than present model estimates, implying future atmospheric warming greater than is presently predicted, as aerosol emissions continue to decline [10].
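A hedged sketch of the Monte Carlo propagation described above: perturbed samples of a retrieved aerosol quantity and of an algorithm parameter are combined into a forcing PDF. The linear forcing-efficiency formulation and every number below are invented for illustration and are not taken from the paper or from the satellite products it uses.

```python
# Propagate measurement and algorithm uncertainty into a PDF of clear-sky direct forcing.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

aod = rng.normal(loc=0.19, scale=0.02, size=n_samples)         # perturbed aerosol optical depth (hypothetical)
efficiency = rng.normal(loc=-10.0, scale=1.0, size=n_samples)  # W m^-2 per unit AOD (hypothetical)
forcing = aod * efficiency                                      # sampled clear-sky TOA direct forcing

print(f"mean = {forcing.mean():.2f} W m^-2, std = {forcing.std():.2f} W m^-2")
```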
Abstract:
We apply a new parameterisation of the Greenland ice sheet (GrIS) feedback between surface mass balance (SMB: the sum of surface accumulation and surface ablation) and surface elevation in the MAR regional climate model (Edwards et al., 2014) to projections of future climate change using five ice sheet models (ISMs). The MAR (Modèle Atmosphérique Régional: Fettweis, 2007) climate projections are for 2000–2199, forced by the ECHAM5 and HadCM3 global climate models (GCMs) under the SRES A1B emissions scenario. The additional sea level contribution due to the SMB– elevation feedback averaged over five ISM projections for ECHAM5 and three for HadCM3 is 4.3% (best estimate; 95% credibility interval 1.8–6.9 %) at 2100, and 9.6% (best estimate; 95% credibility interval 3.6–16.0 %) at 2200. In all results the elevation feedback is significantly positive, amplifying the GrIS sea level contribution relative to the MAR projections in which the ice sheet topography is fixed: the lower bounds of our 95% credibility intervals (CIs) for sea level contributions are larger than the “no feedback” case for all ISMs and GCMs. Our method is novel in sea level projections because we propagate three types of modelling uncertainty – GCM and ISM structural uncertainties, and elevation feedback parameterisation uncertainty – along the causal chain, from SRES scenario to sea level, within a coherent experimental design and statistical framework. The relative contributions to uncertainty depend on the timescale of interest. At 2100, the GCM uncertainty is largest, but by 2200 both the ISM and parameterisation uncertainties are larger. We also perform a perturbed parameter ensemble with one ISM to estimate the shape of the projected sea level probability distribution; our results indicate that the probability density is slightly skewed towards higher sea level contributions.
Abstract:
This article shows how one can formulate the representation problem starting from Bayes' theorem. The purpose of this article is to raise awareness of the formal solutions, so that approximations can be placed in a proper context. The representation errors appear in the likelihood, and the different possibilities for the representation of reality in model and observations are discussed, including nonlinear representation probability density functions. Specifically, the assumptions needed in the usual procedure to add a representation error covariance to the error covariance of the observations are discussed, and it is shown that, when several sub-grid observations are present, their mean still has a representation error; so-called 'superobbing' does not resolve the issue. Connection is made to the off-line or on-line retrieval problem, providing a new simple proof of the equivalence of assimilating linear retrievals and original observations. Furthermore, it is shown how nonlinear retrievals can be assimilated without loss of information. Finally, we discuss how errors in the observation operator model can be treated consistently in the Bayesian framework, connecting to previous work in this area.
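A minimal sketch, in assumed notation rather than the paper's, of how the representation problem can be written down starting from Bayes' theorem: the likelihood is a marginalisation over the real-world state z that the observations actually sample, so the representation uncertainty p(z | x) sits inside the likelihood rather than being bolted on as an extra covariance.

```latex
% Assumed notation: x is the model state, y the observations, z the real-world
% (sub-grid) state that the observation operator actually acts on.
\begin{align}
  p(x \mid y) &= \frac{p(y \mid x)\, p(x)}{p(y)}, \\
  p(y \mid x) &= \int p(y \mid z)\, p(z \mid x)\, \mathrm{d}z .
\end{align}
```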
Abstract:
The present work describes a new tool that helps bidders improve their competitive bidding strategies. The new tool is an easy-to-use graphical aid that makes more complex decision analysis tools usable in the field of competitive bidding. The graphical tool described here moves away from previous bidding models, which attempt to describe the result of an auction or a tender process by studying each possible bidder with probability density functions. As an illustration, the tool is applied to three practical cases. Theoretical and practical conclusions on the broad potential range of application of the tool are also presented.
Abstract:
The co-polar correlation coefficient (ρhv) has many applications, including hydrometeor classification, ground clutter and melting layer identification, interpretation of ice microphysics and the retrieval of rain drop size distributions (DSDs). However, we currently lack the quantitative error estimates that are necessary if these applications are to be fully exploited. Previous error estimates of ρhv rely on knowledge of the unknown "true" ρhv and implicitly assume a Gaussian probability distribution function of ρhv samples. We show that frequency distributions of ρhv estimates are in fact highly negatively skewed. A new variable: L = -log10(1 - ρhv) is defined, which does have Gaussian error statistics, and a standard deviation depending only on the number of independent radar pulses. This is verified using observations of spherical drizzle drops, allowing, for the first time, the construction of rigorous confidence intervals in estimates of ρhv. In addition, we demonstrate how the imperfect co-location of the horizontal and vertical polarisation sample volumes may be accounted for. The possibility of using L to estimate the dispersion parameter (µ) in the gamma drop size distribution is investigated. We find that including drop oscillations is essential for this application, otherwise there could be biases in retrieved µ of up to ~8. Preliminary results in rainfall are presented. In a convective rain case study, our estimates show µ to be substantially larger than 0 (an exponential DSD). In this particular rain event, rain rate would be overestimated by up to 50% if a simple exponential DSD is assumed.
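A hedged sketch of the transform described above: an estimate of ρhv is mapped to L = −log10(1 − ρhv), a symmetric Gaussian interval is built on L, and the endpoints are mapped back to ρhv. The numerical value used here for the standard deviation of L is a placeholder; the paper expresses it in terms of the number of independent radar pulses.

```python
# Confidence interval for rho_hv built on the Gaussian-distributed variable L.
from math import log10

def to_L(rho_hv):
    return -log10(1.0 - rho_hv)

def from_L(L):
    return 1.0 - 10.0 ** (-L)

def rho_confidence_interval(rho_hat, sigma_L, z=1.96):
    """Approximate 95% confidence interval for rho_hv via the Gaussian statistics of L."""
    L_hat = to_L(rho_hat)
    return from_L(L_hat - z * sigma_L), from_L(L_hat + z * sigma_L)

# sigma_L = 0.05 is a placeholder, not the pulse-number-dependent value from the paper.
lo, hi = rho_confidence_interval(rho_hat=0.995, sigma_L=0.05)
print(f"rho_hv = 0.995, 95% CI approx [{lo:.4f}, {hi:.4f}]")
```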
Abstract:
LDL oxidation may be important in atherosclerosis. Extensive oxidation of LDL by copper induces increased uptake by macrophages, but results in decomposition of hydroperoxides, making it more difficult to investigate the effects of hydroperoxides in oxidised LDL on cell function. We describe here a simple method of oxidising LDL by dialysis against copper ions at 4 °C, which inhibits the decomposition of hydroperoxides, and allows the production of LDL rich in hydroperoxides (626 ± 98 nmol/mg LDL protein) but low in oxysterols (3 ± 1 nmol 7-ketocholesterol/mg LDL protein), whilst allowing sufficient modification (2.6 ± 0.5 relative electrophoretic mobility) for rapid uptake by macrophages (5.49 ± 0.75 μg 125I-labelled hydroperoxide-rich LDL vs. 0.46 ± 0.04 μg protein/mg cell protein in 18 h for native LDL). By dialysing under the same conditions, but at 37 °C, the hydroperoxides are decomposed extensively and the LDL becomes rich in oxysterols. This novel method of oxidising LDL with high yield to either a hydroperoxide- or oxysterol-rich form by simply altering the temperature of dialysis may provide a useful tool for determining the effects of these different oxidation products on cell function. (C) 2007 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward constrained regression manner. The leave-one-out (LOO) test score is used for kernel selection. The jackknife parameter estimator subject to positivity constraint check is used for the parameter estimation of a single parameter at each forward step. As such the proposed approach is simple to implement and the associated computational cost is very low. An illustrative example is employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with comparable accuracy to that of the classical Parzen window estimate.
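A hedged, deliberately simplified sketch of the idea described above: the full Parzen window (PW) estimate serves as the target, and kernel centres are added greedily until a small mixture approximates it. The weight update and stopping rule below are crude placeholders, not the forward constrained regression, LOO test score, or jackknife steps of the paper.

```python
# Greedy construction of a sparse kernel density estimate against a Parzen window target.
import numpy as np

def gauss(x, c, h):
    """Normalised Gaussian kernel of width h centred at c."""
    return np.exp(-0.5 * ((x - c) / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=200)
h = 0.3                                            # kernel width (assumed)
grid = np.linspace(-4.0, 4.0, 400)

pw_target = gauss(grid[:, None], data[None, :], h).mean(axis=1)   # classical PW estimate

centres, sparse = [], np.zeros_like(grid)
for _ in range(8):                                 # greedily add up to 8 kernels
    residual = pw_target - sparse
    scores = [np.dot(residual, gauss(grid, c, h)) for c in data]
    centres.append(data[int(np.argmax(scores))])   # centre most correlated with the residual
    # Refit equal positive weights over the chosen centres (crude placeholder update).
    sparse = gauss(grid[:, None], np.array(centres)[None, :], h).mean(axis=1)

print(f"{len(centres)} kernels, max |PW - sparse| = {np.abs(pw_target - sparse).max():.4f}")
```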