185 results for Mean-variance.
Abstract:
We present a summary of the principal physical and optical properties of aerosol particles measured using the FAAM BAE-146 instrumented aircraft during ADRIEX between 27 August and 6 September 2004, augmented by sunphotometer, lidar and satellite retrievals. Observations of anthropogenic aerosol, principally from industrial sources, were concentrated over the northern Adriatic Sea and over the Po Valley close to the aerosol sources. An additional flight was carried out over the Black Sea to compare east and west European pollution. Measurements show the single-scattering albedo of dry aerosol particles to vary considerably between 0.89 and 0.97 at a wavelength of 0.55 μm, with a campaign mean within the polluted lower free troposphere of 0.92. Although aerosol concentrations varied significantly from day to day and during individual days, the shape of the aerosol size distribution was relatively consistent throughout the experiment, with no detectable difference observed between land and sea. There is evidence to suggest that the pollution aerosol within the marine boundary layer was younger than that in the elevated layer. Trends in the aerosol volume distribution are consistent with multiple-site AERONET radiometric observations. The aerosol optical depths derived from aircraft measurements show a consistent bias to lower values than both the AERONET and lidar ground-based radiometric observations, differences that can be explained by local variations in the aerosol column loading and by some aircraft instrumental artefacts. Retrievals of the aerosol optical depth and the fine-mode (<0.5 μm radius) fraction contribution to the optical depth using MODIS data from the Terra and Aqua satellites show a reasonable level of agreement with the AERONET and aircraft measurements.
Abstract:
In this article, we illustrate experimentally an important consequence of the stochastic component in choice behaviour which has not been acknowledged so far: namely, its potential to produce ‘regression to the mean’ (RTM) effects. We employ a novel approach to individual choice under risk, based on repeated multiple-lottery choices (i.e. choices among many lotteries), to show how the high degree of stochastic variability present in individual decisions can crucially distort certain results through RTM effects. We demonstrate the point in the context of a social comparison experiment.
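A minimal Python sketch of the RTM mechanism described above, using synthetic data and illustrative parameters rather than the authors' experimental design: subjects selected for extreme first-round scores appear less extreme in a second noisy measurement, purely because of stochastic variability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each hypothetical subject has a stable underlying score, but every
# observed choice score carries independent noise in each round.
n = 10_000
true_score = rng.normal(0.0, 1.0, n)
round1 = true_score + rng.normal(0.0, 1.0, n)
round2 = true_score + rng.normal(0.0, 1.0, n)

# Select "extreme" subjects on the basis of round 1 alone.
top = round1 > np.quantile(round1, 0.9)

# The same group regresses toward the grand mean in round 2, even though
# nothing about the subjects changed: a pure RTM artefact of the noise.
print(f"round-1 mean of top decile: {round1[top].mean():.2f}")
print(f"round-2 mean of same group: {round2[top].mean():.2f}")
```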
Abstract:
We compare a number of models of post-war US output growth in terms of the degree and pattern of non-linearity they impart to the conditional mean, where we condition on either the previous period's growth rate or the previous two periods' growth rates. The conditional means are estimated non-parametrically using a nearest-neighbour technique on data simulated from the models. In this way, we condense the complex dynamic responses that may be present into graphical displays of the implied conditional mean.
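A brief sketch of the nearest-neighbour estimate of a conditional mean, conditioning on the previous period's value. The AR(1) simulator and all parameters are stand-ins, not the paper's output-growth models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for a fitted growth-rate model: a simple AR(1).
def simulate(n, phi=0.4, sigma=1.0):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + sigma * rng.normal()
    return y

y = simulate(20_000)
x_lag, x_now = y[:-1], y[1:]

def knn_conditional_mean(grid, x, y, k=500):
    """E[y | x = g] estimated as the mean of y over the k nearest x's."""
    out = np.empty_like(grid)
    for i, g in enumerate(grid):
        idx = np.argsort(np.abs(x - g))[:k]
        out[i] = y[idx].mean()
    return out

grid = np.linspace(-3, 3, 61)
cm = knn_conditional_mean(grid, x_lag, x_now)
# For an AR(1) the estimate should track the line phi * grid; curvature in
# such a display is the kind of non-linearity the paper investigates.
print(np.c_[grid[::10], cm[::10]])
```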
Abstract:
Under particular large-scale atmospheric conditions, several windstorms may affect Europe within a short time period. The occurrence of such cyclone families leads to large socioeconomic impacts and cumulative losses. The serial clustering of windstorms is analyzed for the North Atlantic/western Europe. Clustering is quantified as the dispersion (the variance-to-mean ratio) of cyclone passages over a certain area. Dispersion statistics are derived for three reanalysis data sets and a 20-run European Centre Hamburg Model Version 5/Max Planck Institute Ocean Model Version 1 (ECHAM5/MPI-OM1) global climate model (GCM) ensemble. The dependence of serial clustering on cyclone intensity is analyzed. Confirming previous studies, serial clustering is identified in reanalysis data sets primarily on both flanks and downstream regions of the North Atlantic storm track. This pattern is a robust feature across the reanalysis data sets. For the whole area, extreme cyclones cluster more than nonextreme cyclones. The ECHAM5/MPI-OM1 GCM is generally able to reproduce the spatial patterns of clustering under recent climate conditions, but some biases are identified. Under future climate conditions (A1B scenario), the GCM ensemble indicates that serial clustering may decrease over the North Atlantic storm track area and parts of western Europe. This decrease is associated with an extension of the polar jet toward Europe, which implies a tendency toward a more regular occurrence of cyclones over parts of the North Atlantic Basin poleward of 50°N and over western Europe. An increase of clustering of cyclones is projected south of Newfoundland. The detected shifts imply a change in the risk of occurrence of cumulative events over Europe under future climate conditions.
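The dispersion statistic itself is simple to compute. The sketch below uses synthetic counts with assumed parameters, not the study's data, to contrast serially independent and clustered cyclone-passage counts:

```python
import numpy as np

rng = np.random.default_rng(2)

def dispersion(counts):
    """Variance-to-mean ratio of cyclone passage counts.

    Close to 1 for a Poisson (serially independent) process, above 1 for
    serial clustering, below 1 for more regular-than-random occurrence.
    """
    return counts.var(ddof=1) / counts.mean()

# Illustrative winter cyclone-passage counts for one grid cell:
independent = rng.poisson(lam=6.0, size=60)                # Poisson storms
clustered = rng.poisson(lam=rng.gamma(2.0, 3.0, size=60))  # mixed rate, overdispersed

print(f"independent: {dispersion(independent):.2f}")
print(f"clustered:   {dispersion(clustered):.2f}")
```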
Abstract:
The role of eddy fluxes in the general circulation is often approached by treating eddies as (macro)turbulence. In this approach, the eddies act to diffuse certain quasi-conservative quantities, such as potential vorticity (PV), along isentropic surfaces in the free atmosphere. The eddy fluxes are determined primarily by the eddy diffusivities and are necessarily down-gradient of the basic state PV field. Support for the (macro)turbulence approach stems from the fact that the eddy fluxes of PV in the free atmosphere are generally down-gradient in the long-term mean. Here we call attention to a pronounced and significant region of up-gradient eddy PV fluxes on the poleward flank of the jet core in both hemispheres. The region of up-gradient (i.e., notionally “antidiffusive”) eddy PV fluxes is most pronounced during the winter and spring seasons and partially contradicts the turbulence approach described above. Analyses of the PV variance (potential enstrophy) budget suggest that the up-gradient PV fluxes represent local wave decay and are maintained by poleward fluxes of PV variance. Finite-amplitude effects thus represent leading-order contributions to the PV variance budget, whereas dissipation is only of secondary importance locally. The appearance of up-gradient PV fluxes in the long-term mean is associated with the poleward shift of the jet—and thus the region of wave decay relative to wave growth—following wave-breaking events.
Abstract:
Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both provide better descriptions of the data than models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
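A minimal simulation of the two-state Markov switching idea, with regime-dependent mean and variance. The transition matrix and regime parameters are illustrative, not the fitted values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two regimes with different mean/variance, switching via a Markov chain.
mu = np.array([0.8, -1.5])        # per-period mean in each state
sigma = np.array([1.0, 3.0])      # state-dependent volatility
P = np.array([[0.95, 0.05],       # P[i, j] = Pr(next state j | current state i)
              [0.10, 0.90]])

n = 1_000
state = np.zeros(n, dtype=int)
for t in range(1, n):
    state[t] = rng.choice(2, p=P[state[t - 1]])

returns = mu[state] + sigma[state] * rng.normal(size=n)
# A single-state model forces one (mu, sigma) on data like this and misses
# the fat tails and volatility clustering the switching structure creates.
print(f"overall mean {returns.mean():.2f}, overall std {returns.std():.2f}")
```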
Abstract:
This paper combines and generalizes a number of recent time series models of daily exchange rate series by using a SETAR model which also allows the variance equation of a GARCH specification for the error terms to be drawn from more than one regime. An application of the model to the French Franc/Deutschmark exchange rate demonstrates that out-of-sample forecasts of exchange rate volatility are also improved when the restriction that the data are drawn from a single regime is removed. This result highlights the importance of considering both types of regime shift (i.e. thresholds in variance as well as in mean) when analysing financial time series.
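The model class can be sketched by simulation: a threshold on the lagged observation selects the regime, and both the AR mean equation and the GARCH(1,1) variance equation switch with it. All parameter values below are assumptions for illustration, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(4)

phi = {0: 0.2, 1: -0.3}        # AR(1) coefficient per regime
omega = {0: 0.05, 1: 0.20}     # GARCH constant per regime
alpha = {0: 0.05, 1: 0.15}     # ARCH coefficient per regime
beta = {0: 0.90, 1: 0.70}      # GARCH coefficient per regime
threshold = 0.0

n = 2_000
y, eps = np.zeros(n), np.zeros(n)
h = np.full(n, 0.1)            # conditional variance of the errors
for t in range(1, n):
    r = 0 if y[t - 1] <= threshold else 1            # regime from lagged level
    h[t] = omega[r] + alpha[r] * eps[t - 1] ** 2 + beta[r] * h[t - 1]
    eps[t] = np.sqrt(h[t]) * rng.normal()
    y[t] = phi[r] * y[t - 1] + eps[t]

print(f"mean conditional variance: {h.mean():.3f}")
```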
Abstract:
Leading patterns of observed monthly extreme rainfall variability in Australia are examined using an Empirical Orthogonal Teleconnection (EOT) method. Extreme rainfall variability is more closely related to mean rainfall variability during austral summer than in winter. The leading EOT patterns of extreme rainfall explain less variance in Australia-wide extreme rainfall than is the case for mean rainfall EOTs. We illustrate that, as with mean rainfall, the El Niño-Southern Oscillation (ENSO) has the strongest association with warm-season extreme rainfall variability, while in the cool season the primary drivers are atmospheric blocking and the subtropical ridge. The Indian Ocean Dipole and Southern Annular Mode also have significant relationships with patterns of variability during austral winter and spring. Leading patterns of summer extreme rainfall variability have predictability several months ahead from Pacific sea surface temperatures (SSTs) and as much as a year in advance from Indian Ocean SSTs. Predictability from the Pacific is greater for wetter-than-average summer months than for months that are drier than average, whereas for the Indian Ocean the relationship is more linear. Several cool-season EOTs are associated with mid-latitude synoptic-scale patterns along the south and east coasts. These patterns have common atmospheric signatures denoting moist onshore flow and strong cyclonic anomalies, often to the north of a blocking anti-cyclone. Tropical cyclone activity is observed to have significant relationships with some warm-season EOTs. This analysis shows that extreme rainfall variability in Australia can be related to remote drivers and local synoptic-scale patterns throughout the year.
Abstract:
Recent gravity missions have produced a dramatic improvement in our ability to measure the ocean’s mean dynamic topography (MDT) from space. To fully exploit this oceanic observation, however, we must quantify its error. To establish a baseline, we first assess the error budget for an MDT calculated using a third-generation GOCE geoid and the CLS01 mean sea surface (MSS). With these products, we can resolve MDT spatial scales down to 250 km with an accuracy of 1.7 cm, with the MSS and geoid making similar contributions to the total error. For spatial scales within the range 133–250 km the error is 3.0 cm, with the geoid making the greatest contribution. For the smallest resolvable spatial scales (80–133 km) the total error is 16.4 cm, with geoid error accounting for almost all of this. Relative to this baseline, the most recent versions of the geoid and MSS fields reduce the long- and short-wavelength errors by 0.9 and 3.2 cm, respectively, but they have little impact in the medium-wavelength band. The newer MSS is responsible for most of the long-wavelength improvement, while for the short-wavelength component it is the geoid. We find that while the formal geoid errors have reasonable global mean values, they fail to capture the regional variations in error magnitude, which depend on the steepness of the sea floor topography.
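Assuming the MSS and geoid error sources are independent, the quoted totals behave as a sum in quadrature, which the following small check illustrates:

```python
import math

def quadrature_total(e_mss_cm, e_geoid_cm):
    """Total MDT error if the MSS and geoid error sources are independent."""
    return math.hypot(e_mss_cm, e_geoid_cm)

# "Similar contributions" to the 1.7 cm total at scales down to 250 km
# implies roughly 1.7 / sqrt(2), about 1.2 cm, from each source:
print(f"{quadrature_total(1.2, 1.2):.1f} cm")   # ~1.7 cm
```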
Abstract:
Purpose – Progress in retrofitting the UK's commercial properties continues to be slow and fragmented. New research from the UK and USA suggests that radical changes are needed to drive large-scale retrofitting, and that new and innovative models of financing can create new opportunities. The purpose of this paper is to offer insights into the terminology of retrofit and the changes in UK policy and practice that are needed to scale up activity in the sector. Design/methodology/approach – The paper reviews and synthesises key published research into commercial property retrofitting in the UK and USA and also draws on policy and practice from the EU and Australia. Findings – The paper provides a definition of “retrofit”, and compares and contrasts this with “refurbishment” and “renovation” in an international context. The paper summarises key findings from recent research and suggests that there are a number of policy and practice measures which need to be implemented in the UK for commercial retrofitting to succeed at scale. These include improved funding vehicles for retrofit; better transparency in actual energy performance; and consistency in measurement, verification and assessment standards. Practical implications – Policy and practice in the UK need to change if large-scale commercial property retrofit is to be rolled out successfully. This requires mandatory legislation underpinned by incentives and penalties for non-compliance. Originality/value – This paper synthesises recent research to provide a set of policy and practice recommendations which draw on international experience and can assist with implementation in the UK.
Abstract:
A parameterization of mesoscale eddies in coarse-resolution ocean general circulation models (GCMs) is formulated and implemented using a residual-mean formalism. In that framework, mean buoyancy is advected by the residual velocity (the sum of the Eulerian and eddy-induced velocities) and modified by a residual flux which accounts for the diabatic effects of mesoscale eddies. The residual velocity is obtained by stepping forward a residual-mean momentum equation in which eddy stresses appear as forcing terms. Study of the spatial distribution of eddy stresses, derived by using them as control parameters to “fit” the residual-mean model to observations, supports the idea that eddy stresses can be likened to a vertical down-gradient flux of momentum with a coefficient which is constant in the vertical. The residual eddy flux is set to zero in the ocean interior, where mesoscale eddies are assumed to be quasi-adiabatic, but is parameterized by a horizontal down-gradient diffusivity near the surface, where eddies develop a diabatic component as they stir properties horizontally across steep isopycnals. The residual-mean model is implemented and tested in the MIT general circulation model. It is shown that the resulting model (1) has a climatology that is superior to that obtained using the Gent and McWilliams parameterization scheme with a spatially uniform diffusivity and (2) allows one to significantly reduce the (spurious) horizontal viscosity used in coarse-resolution GCMs.
Abstract:
Numerical experiments are described that pertain to the climate of a coupled atmosphere–ocean–ice system in the absence of land, driven by modern-day orbital and CO2 forcing. Millennial time-scale simulations yield a mean state in which ice caps reach down to 55° of latitude and both the atmosphere and ocean comprise eastward- and westward-flowing zonal jets, whose structure is set by their respective baroclinic instabilities. Despite the zonality of the ocean, it is remarkably efficient at transporting heat meridionally through the agency of Ekman transport and eddy-driven subduction. Indeed the partition of heat transport between the atmosphere and ocean is much the same as the present climate, with the ocean dominating in the Tropics and the atmosphere in the mid–high latitudes. Variability of the system is dominated by the coupling of annular modes in the atmosphere and ocean. Stochastic variability inherent to the atmospheric jets drives variability in the ocean. Zonal flows in the ocean exhibit decadal variability, which, remarkably, feeds back to the atmosphere, coloring the spectrum of annular variability. A simple stochastic model can capture the essence of the process. Finally, it is briefly reviewed how the aquaplanet can provide information about the processes that set the partition of heat transport and the climate of Earth.
Abstract:
A dataset of manufacturers' measurements of acrylamide levels in 40,455 samples of fresh sliced potato crisps from 20 European countries for the years 2002 to 2011 was compiled. This dataset is by far the largest ever compiled relating to acrylamide levels in potato crisps. Analysis of variance was applied to the data and showed a clear, significant downward trend for mean levels of acrylamide, from 763 ± 91.1 ng g⁻¹ (parts per billion) in 2002 to 358 ± 2.5 ng g⁻¹ in 2011; this was a decrease of 53% ± 13.5%. The yearly 95th quantile values were also subject to a clear downward trend. The effect of seasonality arising from the influence of potato storage on acrylamide levels was evident, with acrylamide in the first 6 months of the year being significantly higher than in the second 6 months. The proportion of samples containing acrylamide at a level above the indicative value of 1000 ng g⁻¹ for potato crisps introduced by the European Commission in 2011 fell from 23.8% in 2002 to 3.2% in 2011. Nevertheless, even in 2011, a small proportion of samples still contained high levels of acrylamide, with 0.2% exceeding 2000 ng g⁻¹.
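A quick arithmetic check of the headline figure, plus a sketch of the exceedance-share calculation; the individual sample values below are hypothetical stand-ins for the manufacturers' data:

```python
# Reported yearly mean acrylamide levels (ng/g):
mean_2002, mean_2011 = 763.0, 358.0
decrease = (mean_2002 - mean_2011) / mean_2002
print(f"decrease: {decrease:.0%}")     # ~53%, matching the reported figure

# Share of samples above the EC indicative value of 1000 ng/g:
indicative = 1000.0
samples = [210.0, 480.0, 1250.0, 330.0]   # hypothetical sample values
share = sum(s > indicative for s in samples) / len(samples)
print(f"share above indicative value: {share:.1%}")
```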
Abstract:
Confidence in projections of global-mean sea level rise (GMSLR) depends on an ability to account for GMSLR during the twentieth century. There are contributions from ocean thermal expansion, mass loss from glaciers and ice sheets, groundwater extraction, and reservoir impoundment. Progress has been made toward solving the “enigma” of twentieth-century GMSLR, which is that the observed GMSLR has previously been found to exceed the sum of estimated contributions, especially for the earlier decades. The authors propose the following: thermal expansion simulated by climate models may previously have been underestimated because of their not including volcanic forcing in their control state; the rate of glacier mass loss was larger than previously estimated and was not smaller in the first half than in the second half of the century; the Greenland ice sheet could have made a positive contribution throughout the century; and groundwater depletion and reservoir impoundment, which are of opposite sign, may have been approximately equal in magnitude. It is possible to reconstruct the time series of GMSLR from the quantified contributions, apart from a constant residual term, which is small enough to be explained as a long-term contribution from the Antarctic ice sheet. The reconstructions account for the observation that the rate of GMSLR was not much larger during the last 50 years than during the twentieth century as a whole, despite the increasing anthropogenic forcing. Semiempirical methods for projecting GMSLR depend on the existence of a relationship between global climate change and the rate of GMSLR, but the implication of the authors' closure of the budget is that such a relationship is weak or absent during the twentieth century.
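The budget-closure logic can be sketched numerically: sum the quantified contributions and absorb the remaining offset into a single constant residual. All numbers below are placeholders, not the authors' estimates:

```python
import numpy as np

# Decadal-mean GMSLR rates (mm/yr), placeholder values only.
thermal = np.array([0.3, 0.4, 0.6, 0.8])      # ocean thermal expansion
glaciers = np.array([0.6, 0.6, 0.7, 0.7])     # glacier mass loss
greenland = np.array([0.2, 0.1, 0.1, 0.2])    # Greenland ice sheet
land_water = np.array([0.0, -0.1, 0.0, 0.1])  # groundwater minus impoundment
observed = np.array([1.3, 1.2, 1.6, 2.0])     # observed GMSLR rate

summed = thermal + glaciers + greenland + land_water
residual = (observed - summed).mean()          # constant term, e.g. Antarctica
print(f"constant residual: {residual:.2f} mm/yr")
print("misfit after closure:", np.round(observed - summed - residual, 2))
```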
Abstract:
Purpose – Investors are now able to analyse more noise-free news to inform their trading decisions than ever before. Their expectation that more information means better performance is not supported by previous psychological experiments, which argue that too much information actually impairs performance. The purpose of this paper is to examine whether the degree of information explicitness improves stock market performance. Design/methodology/approach – An experiment is conducted in a computer laboratory to examine a trading simulation based on a real market shock. Participants’ performance efficiency and effectiveness are measured separately. Findings – The results indicate that the explicitness of information neither improves nor impairs participants’ performance effectiveness from the perspectives of returns, share and cash positions, and trading volumes. However, participants’ performance efficiency is significantly affected by information explicitness. Originality/value – The novel approach and findings of this research add to the knowledge of the impact of information explicitness on the quality of decision making in a financial market environment.