84 results for Early Data Release

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

The effects of data uncertainty on real-time decision-making can be reduced by predicting early revisions to US GDP growth. We show that survey forecasts efficiently anticipate the first-revised estimate of GDP, but that forecasting models incorporating monthly economic indicators and daily equity returns provide superior forecasts of the second-revised estimate. We consider the implications of these findings for analyses of the impact of surprises in GDP revision announcements on equity markets, and for analyses of the impact of anticipated future revisions on announcement-day returns.
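The forecasting setup the abstract describes can be sketched as a regression of the second-revised GDP estimate on the survey forecast augmented with monthly indicators and equity returns. Everything below (data, variable names, coefficients) is synthetic and illustrative, not the paper's actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Entirely synthetic quarterly data (not the paper's): the second-revised
# GDP growth estimate is generated to depend on the survey forecast, a
# monthly indicator, and quarterly-aggregated equity returns.
n = 200
survey = rng.normal(2.0, 0.5, n)       # survey forecast of growth (%)
monthly_ind = rng.normal(0.0, 1.0, n)  # e.g. an industrial-production signal
equity_ret = rng.normal(0.0, 1.0, n)   # equity return over the quarter
second_rev = (0.8 * survey + 0.3 * monthly_ind + 0.2 * equity_ret
              + rng.normal(0.0, 0.2, n))

# OLS of the second-revised estimate on the survey forecast augmented
# with the monthly indicator and equity returns.
X = np.column_stack([np.ones(n), survey, monthly_ind, equity_ret])
beta, *_ = np.linalg.lstsq(X, second_rev, rcond=None)
fitted = X @ beta

rmse_model = float(np.sqrt(np.mean((second_rev - fitted) ** 2)))
rmse_survey = float(np.sqrt(np.mean((second_rev - survey) ** 2)))
```

On data generated this way the augmented regression fits better than the survey forecast alone, mirroring the qualitative finding for the second-revised estimate.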

Relevance: 90.00%

Abstract:

We present a new composite of geomagnetic activity which is designed to be as homogeneous in its construction as possible. This is done by combining only data that, by virtue of the locations of the source observatories used, have similar responses to solar wind and IMF (interplanetary magnetic field) variations. This will enable us (in Part 2, Lockwood et al., 2013a) to use the new index to reconstruct the interplanetary magnetic field, B, back to 1846 with a full analysis of errors. Allowance is made for the effects of secular change in the geomagnetic field. The composite uses interdiurnal variation data from Helsinki for 1845–1890 (inclusive) and 1893–1896 and from Eskdalemuir from 1911 to the present. The gaps are filled using data from the Potsdam (1891–1892 and 1897–1907) and the nearby Seddin observatories (1908–1910), with intercalibration achieved using the Potsdam–Seddin sequence. The new index is termed IDV(1d) because it employs many of the principles of the IDV index derived by Svalgaard and Cliver (2010), inspired by the u index of Bartels (1932); however, we revert to using one-day (1d) means, as employed by Bartels, because the use of near-midnight values in IDV introduces contamination by the substorm current wedge auroral electrojet, giving noise and a dependence on solar wind speed that varies with latitude. The composite is compared with independent, early data from European-sector stations (Greenwich, St Petersburg, Parc St Maur, and Ekaterinburg), as well as with the composite u index, compiled from 2–6 stations by Bartels, and the IDV index of Svalgaard and Cliver. Agreement is found to be extremely good in all cases except two.
Firstly, the Greenwich data are shown to have gradually degraded in quality until new instrumentation was installed in 1915. Secondly, we infer that the Bartels u index is increasingly unreliable before about 1886, overestimating the solar cycle amplitude between 1872 and 1883; this error is amplified in the proxy data used before 1872. The same therefore applies to the IDV index, which makes direct use of the u index values.
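The interdiurnal-variation principle behind indices of this family can be sketched in a few lines: take one-day means of a horizontal field component and average the absolute day-to-day differences. This is only the core idea, on synthetic data; the published IDV(1d) construction also handles secular change, station intercalibration, and gap filling:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily means of the horizontal geomagnetic field (nT) for
# one year at a single observatory; real values come from observatory data.
daily_mean_H = 20000.0 + np.cumsum(rng.normal(0.0, 5.0, 365))

# Interdiurnal variation: absolute difference between the one-day means
# on successive days (the quantity behind Bartels' u and IDV(1d)).
idv_daily = np.abs(np.diff(daily_mean_H))

# A yearly index value is the mean of the daily interdiurnal variations.
idv_1d = float(idv_daily.mean())
```

Using full one-day means, rather than near-midnight values, is what avoids the auroral-electrojet contamination discussed above.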

Relevance: 80.00%

Abstract:

This paper reports on a new satellite sensor, the Geostationary Earth Radiation Budget (GERB) experiment. GERB is designed to make the first measurements of the Earth's radiation budget from geostationary orbit. Measurements at high absolute accuracy of the sunlight reflected from the Earth and the thermal radiation emitted by the Earth are made every 15 min, with a spatial resolution at the subsatellite point of 44.6 km (north–south) by 39.3 km (east–west). With knowledge of the incoming solar constant, this gives the primary forcing and response components of the top-of-atmosphere radiation. The first GERB instrument is an instrument of opportunity on Meteosat-8, a new spin-stabilized spacecraft platform, currently positioned over the equator at 3.5°W, that also carries the Spinning Enhanced Visible and Infrared Imager (SEVIRI). This overview of the project includes a description of the instrument design and its preflight and in-flight calibration. An evaluation of the instrument performance after its first year in orbit, including comparisons with data from the Clouds and the Earth's Radiant Energy System (CERES) satellite sensors and with output from numerical models, is also presented. After a brief summary of the data processing system and data products, some of the scientific studies being undertaken with these early data are described. This marks the beginning of a decade or more of observations from GERB, as subsequent models will fly on each of the four Meteosat Second Generation satellites.

Relevance: 80.00%

Abstract:

Not long after Franklin’s iconic studies, an atmospheric electric field was discovered in “fair weather” regions, well away from thunderstorms. The origin of the fair weather field was sought by Lord Kelvin, through development of electrostatic instrumentation and early data logging techniques, but was ultimately explained through the global circuit model of C.T.R. Wilson. In Wilson’s model, charge exchanged by disturbed weather electrifies the ionosphere, and returns via a small vertical current density in fair weather regions. New insights into the relevance of fair weather atmospheric electricity to terrestrial and planetary atmospheres are now emerging. For example, there is a possible role of the global circuit current density in atmospheric processes, such as cloud formation. Beyond natural atmospheric processes, a novel practical application is the use of early atmospheric electrostatic investigations to provide quantitative information on past urban air pollution.

Relevance: 80.00%

Abstract:

The currently available model-based global data sets of atmospheric circulation are a by-product of the daily requirement of producing initial conditions for numerical weather prediction (NWP) models. These data sets have been quite useful for studying fundamental dynamical and physical processes, and for describing the nature of the general circulation of the atmosphere. However, due to limitations in the early data assimilation systems and inconsistencies caused by numerous model changes, the available model-based global data sets may not be suitable for studying global climate change. A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the earth's climate system. The concept is equally applicable for producing data sets for the atmosphere, the oceans, and the biosphere, and such data sets will be quite useful for studying global climate change.

Relevance: 40.00%

Abstract:

An investigation using the Stepping Out model of early hominin dispersal out of Africa is presented here. The late arrival of early hominins in Europe, as deduced from the fossil record, is shown to be consistent with a poor ability of these hominins to survive in the Eurasian landscape. The present study also extends the understanding of modelling results from the original study by Mithen and Reed (2002. Stepping out: a computer simulation of hominid dispersal from Africa. J. Hum. Evol. 43, 433–462). The representation of climate and vegetation patterns has been improved through the use of climate model output. This study demonstrates that interpretative confidence may be strengthened, and new insights gained, when climate models and hominin dispersal models are integrated. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance: 40.00%

Abstract:

The present study was carried out to determine whether cephalic stimulation, associated with eating a meal, was a sufficient stimulus to provoke the release of stored triacylglycerol (TAG) from a previous high-fat meal. Ten subjects were studied on three separate occasions. Following a 12 h overnight fast, subjects were given a standard mixed test meal which contained 56 g fat. Blood samples were taken before the meal and for 5 h after the meal, when the subjects were randomly allocated to receive either water (control) or were modified sham fed a low-fat (6 g fat) or moderate-fat (38 g fat) meal. Blood samples were collected for a further 3 h. Compared with the control, modified sham feeding a low- or moderate-fat meal did not provoke an early entry of TAG, analysed in either plasma or the TAG-rich lipoprotein (TRL) fraction (density < 1.006 kg/l). The TRL-retinyl ester data showed similar findings. A cephalic phase secretion of pancreatic polypeptide, without a significant increase in cholecystokinin levels, was observed on modified sham feeding. Although these data indicate that modified sham feeding was carried out successfully, analysis of the fat content of the expectorant showed that our subjects may have accidentally ingested a small amount of fat (0.7 g for the low-fat meal and 2.4 g for the moderate-fat meal). Nevertheless, an early TAG peak following modified sham feeding was not demonstrated in the present study, suggesting that significant ingestion of food, and not just orosensory stimulation, is necessary to provoke the release of any TAG stored from a previous meal.

Relevance: 40.00%

Abstract:

Seamless phase II/III clinical trials are conducted in two stages with treatment selection at the first stage. In the first stage, patients are randomized to a control or one of k > 1 experimental treatments. At the end of this stage, interim data are analysed, and a decision is made concerning which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, at the interim analysis data may be available on some early outcome on a larger number of patients than those for whom the primary endpoint is available. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, the power has been shown to be greater for one or other method depending on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches and uses data available at the interim analysis to estimate these parameters and then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods for a wide range of true parameter values. In most cases, the performance of the new method is either similar to or, in some cases, better than either of the two previously proposed methods.
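The interim-selection setting can be sketched as follows, with hypothetical arm counts, effects, and correlation; the proposal described above then estimates the treatment effects and the early/primary correlation from such interim data and applies whichever candidate selection rule has the higher estimated probability of correctly selecting the best arm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setting: k = 3 experimental arms; the primary endpoint is
# observed on n_prim patients per arm at the interim analysis, while a
# correlated early endpoint is available on n_early > n_prim patients.
k, n_early, n_prim, rho = 3, 120, 40, 0.6
true_eff = np.array([0.1, 0.3, 0.5])  # hypothetical true effects

early = np.empty((k, n_early))
primary = np.empty((k, n_prim))
for i in range(k):
    z1 = rng.normal(size=n_early)
    # primary-endpoint noise correlated (rho) with the early endpoint
    z2 = rho * z1[:n_prim] + np.sqrt(1 - rho**2) * rng.normal(size=n_prim)
    early[i] = true_eff[i] + z1
    primary[i] = true_eff[i] + z2

# Candidate rule A: select the arm with the best mean early response.
select_early = int(np.argmax(early.mean(axis=1)))
# Candidate rule B: select on the smaller primary-endpoint sample.
select_primary = int(np.argmax(primary.mean(axis=1)))
```

Which rule wins depends on the true effects and the correlation, which is exactly why estimating those parameters at the interim analysis is useful.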

Relevance: 40.00%

Abstract:

Parkinson's disease is characterized by the progressive and selective loss of dopaminergic neurons in the substantia nigra. It has been postulated that endogenously formed CysDA (5-S-cysteinyldopamine) and its metabolites may be, in part, responsible for this selective neuronal loss, although the mechanisms by which they contribute to such neurotoxicity are not understood. Exposure of neurons in culture to CysDA caused cell injury, apparent 12–48 h post-exposure. A portion of the neuronal death induced by CysDA was preceded by a rapid uptake and intracellular oxidation of CysDA, leading to an acute and transient activation of ERK2 (extracellular-signal-regulated kinase 2) and caspase 8. The oxidation of CysDA also induced the activation of apoptosis signal-regulating kinase 1 via its dephosphorylation at Ser967, the phosphorylation of JNK (c-Jun N-terminal kinase) and c-Jun (Ser73), as well as the activation of p38, caspase 3, caspase 8, caspase 7 and caspase 9. Concurrently, inhibition of complex I by the dihydrobenzothiazine DHBT-1 [7-(2-aminoethyl)-3,4-dihydro-5-hydroxy-2H-1,4-benzothiazine-3-carboxylic acid], formed from the intracellular oxidation of CysDA, induces the subsequent release of cytochrome c, which further potentiates pro-apoptotic mechanisms. Our data suggest a novel comprehensive mechanism for CysDA toxicity that may hold relevance for the selective neuronal loss observed in Parkinson's disease.

Relevance: 30.00%

Abstract:

Flood modelling of urban areas is still at an early stage, partly because until recently topographic data of sufficiently high resolution and accuracy have been lacking in urban areas. However, Digital Surface Models (DSMs) generated from airborne scanning laser altimetry (LiDAR) having sub-metre spatial resolution have now become available, and these are able to represent the complexities of urban topography. The paper describes the development of a LiDAR post-processor for urban flood modelling based on the fusion of LiDAR and digital map data. The map data are used in conjunction with LiDAR data to identify different object types in urban areas, though pattern recognition techniques are also employed. Post-processing produces a Digital Terrain Model (DTM) for use as model bathymetry, and also a friction parameter map for use in estimating spatially-distributed friction coefficients. In vegetated areas, friction is estimated from LiDAR-derived vegetation height, and (unlike most vegetation removal software) the method copes with short vegetation less than ~1m high, which may occupy a substantial fraction of even an urban floodplain. The DTM and friction parameter map may also be used to help to generate an unstructured mesh of a vegetated urban floodplain for use by a 2D finite element model. The mesh is decomposed to reflect floodplain features having different frictional properties to their surroundings, including urban features such as buildings and roads as well as taller vegetation features such as trees and hedges. This allows a more accurate estimation of local friction. The method produces a substantial node density due to the small dimensions of many urban features.
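The friction parameter map can be sketched as a per-cell lookup that combines object classes from the fused map data with LiDAR-derived vegetation height. The class codes and Manning's n values below are hypothetical placeholders, not the calibrated values used in the post-processor:

```python
import numpy as np

# Hypothetical fused inputs for four cells: LiDAR vegetation height (m)
# and an object-class map from digital map data (0 = road, 1 = building,
# 2 = vegetated). Class codes and friction values are placeholders.
veg_height = np.array([[0.0, 0.3],
                       [1.5, 0.8]])
surf_class = np.array([[0, 1],
                       [2, 2]])

n_road, n_building = 0.02, 0.03  # illustrative Manning's n values

# Vegetated cells get a height-dependent friction so that even short
# (< 1 m) vegetation contributes, as the post-processor intends.
n_map = np.where(surf_class == 0, n_road,
                 np.where(surf_class == 1, n_building,
                          0.05 + 0.05 * np.minimum(veg_height, 2.0)))
```

A per-cell map like this is what lets the 2D model assign spatially distributed friction rather than a single floodplain-wide coefficient.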

Relevance: 30.00%

Abstract:

In the event of a release of toxic gas in the center of London, the emergency services would need to determine quickly the extent of the area contaminated. The transport of pollutants by turbulent flow within the complex street and building architecture of cities is not straightforward, and we might wonder whether it is at all possible to make a scientifically-reasoned decision. Here we describe recent progress from a major UK project, ‘Dispersion of Air Pollution and its Penetration into the Local Environment’ (DAPPLE, www.dapple.org.uk). In DAPPLE, we focus on the movement of airborne pollutants in cities by developing a greater understanding of atmospheric flow and dispersion within urban street networks. In particular, we carried out full-scale dispersion experiments in central London (UK) during 2003, 2004, 2007, and 2008 to address the extent of the dispersion of tracers following their release at street level. These measurements complemented previous studies because (i) our focus was on dispersion within the first kilometer from the source, when most of the material was expected to remain within the street network rather than being mixed into the boundary layer aloft, (ii) measurements were made under a wide variety of meteorological conditions, and (iii) central London represents a European, rather than North American, city geometry. Interpretation of the results from the full-scale experiments was supported by extensive numerical and wind tunnel modeling, which allowed more detailed analysis under idealized and controlled conditions. In this article, we review the full-scale DAPPLE methodologies and show early results from the analysis of the 2007 field campaign data.

Relevance: 30.00%

Abstract:

We describe a new methodology for comparing satellite radiation budget data with a numerical weather prediction (NWP) model. This is applied to data from the Geostationary Earth Radiation Budget (GERB) instrument on Meteosat-8. The methodology brings together, in near-real time, GERB broadband shortwave and longwave fluxes with simulations based on analyses produced by the Met Office global NWP model. Results for the period May 2003 to February 2005 illustrate the progressive improvements in the data products as various initial problems were resolved. In most areas the comparisons reveal systematic errors in the model's representation of surface properties and clouds, which are discussed elsewhere. However, for clear-sky regions over the oceans the model simulations are believed to be sufficiently accurate to allow the quality of the GERB fluxes themselves to be assessed and any changes in time of the performance of the instrument to be identified. Using model and radiosonde profiles of temperature and humidity as input to a single-column version of the model's radiation code, we conduct sensitivity experiments which provide estimates of the expected model errors over the ocean of about ±5–10 W m−2 in clear-sky outgoing longwave radiation (OLR) and ±0.01 in clear-sky albedo. For the more recent data the differences between the observed and modeled OLR and albedo are well within these error estimates. The close agreement between the observed and modeled values, particularly for the most recent period, illustrates the value of the methodology. It also contributes to the validation of the GERB products and increases confidence in the quality of the data, prior to their release.
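The comparison logic can be sketched as a per-scene difference of observed and modelled clear-sky fluxes, checked against the estimated model-error envelope. The flux values below are invented for illustration:

```python
import numpy as np

# Invented clear-sky ocean scenes: observed GERB OLR vs the NWP-model
# simulation (W m^-2); the real comparison runs scene by scene in
# near-real time.
olr_gerb = np.array([288.0, 291.5, 285.2, 290.1])
olr_model = np.array([286.5, 290.0, 287.0, 289.0])

diff = olr_gerb - olr_model
bias = float(diff.mean())

# The single-column sensitivity experiments put the expected model error
# at roughly +/-5-10 W m^-2 for clear-sky OLR, so only differences
# outside that envelope implicate the observations themselves.
within_envelope = bool(np.all(np.abs(diff) <= 10.0))
```

Tracking this bias in time is what allows drifts in instrument performance to be separated from model error.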

Relevance: 30.00%

Abstract:

This study examines the efficacy of published δ18O data from the calcite of Late Miocene surface-dwelling planktonic foraminifer shells for estimating pre-Quaternary sea surface temperatures (SSTs). The data are from 33 Late Miocene (Messinian) marine sites spanning a modern latitudinal gradient of 64°N to 48°S. They give estimates of SSTs in the tropics/subtropics (to 30°N and S) that are mostly cooler than present. Possible causes of this temperature discrepancy are ecological factors (e.g. calcification of shells at levels below the ocean mixed layer), taphonomic effects (e.g. diagenesis or dissolution), inaccurate estimation of Late Miocene seawater oxygen isotope composition, or a real Late Miocene cool climate. The scale of apparent cooling in the tropics suggests that the SST signal of the foraminifer calcite has been reset, at least in part, by early diagenetic calcite with higher δ18O, formed in the foraminifer shells in cool sea-bottom pore waters, probably coupled with the effects of calcite formed below the mixed layer during the life of the foraminifera. This hypothesis is supported by the markedly cooler SST estimates from low latitudes—in some cases more than 9 °C cooler than present—where the gradients of temperature and the δ18O composition of seawater between sea surface and sea bottom are most marked, and where ocean surface stratification is high. At higher latitudes, particularly N and S of 30°, the temperature signal is still cooler, though maximum temperature estimates overlap with modern SSTs N and S of 40°. Comparison of Late Miocene SST estimates from alkenone unsaturation analysis in the eastern tropical Atlantic at Ocean Drilling Program (ODP) Site 958, which suggest a sea surface warmer by 2–4 °C, with oxygen-isotope estimates at Deep Sea Drilling Project (DSDP) Site 366 and ODP Site 959, which indicate SSTs cooler than present, also suggests a significant impact on the δ18O signal.
Nevertheless, much of the original SST variation is clearly preserved in the primary calcite formed in the mixed layer, and records secular and temporal oceanographic changes at the sea surface, such as movement of the Antarctic Polar Front in the Southern Ocean. Cooler SSTs in the tropics and sub-tropics are also consistent with the Late Miocene latitude reduction in the coral reef belt and with interrupted reef growth on the Queensland Plateau of eastern Australia, though it is not possible to quantify absolute SSTs with the existing oxygen isotope data. Reconstruction of an accurate global SST dataset for Neogene time-slices from the existing published DSDP/ODP isotope data, for use in general circulation models, may require a detailed re-assessment of taphonomy at many sites.
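The link between calcite δ18O and calcification temperature that underlies these SST estimates can be sketched with one commonly used quadratic paleotemperature calibration; the abstract does not state which calibration the study used, and published calibrations differ slightly in their coefficients:

```python
def sst_from_d18o(d18o_calcite, d18o_seawater):
    """Calcification temperature (deg C) from foraminiferal calcite d18O
    (per mil) using one common quadratic paleotemperature calibration;
    other published calibrations differ slightly in the coefficients."""
    d = d18o_calcite - d18o_seawater
    return 16.9 - 4.38 * d + 0.10 * d * d

# Diagenetic calcite has higher d18O, so partial resetting biases the
# apparent SST cold (values here are illustrative, in per mil):
mixed_layer_sst = sst_from_d18o(-2.0, 0.0)  # primary mixed-layer calcite
reset_sst = sst_from_d18o(-1.5, 0.0)        # partially reset shell
```

A shift of only +0.5‰ in shell δ18O lowers the apparent temperature by roughly 2 °C, which is why even partial diagenetic resetting matters for tropical SST reconstructions.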

Relevance: 30.00%

Abstract:

Our understanding of the ancient ocean–atmosphere system has focused on oceanic proxies. However, the study of terrestrial proxies is equally necessary to constrain our understanding of ancient climates and linkages between the terrestrial and oceanic carbon reservoirs. We have analyzed carbon-isotope ratios from fossil plant material through the Valanginian and Lower Hauterivian of a shallow-marine, ammonite-constrained succession in the Crimean Peninsula of the southern Ukraine, in order to determine whether the Upper Valanginian positive carbon-isotope excursion is expressed in the atmosphere. δ13Cplant values fluctuate around −23‰ to −22‰ for the Valanginian–Hauterivian, except during the Upper Valanginian, where δ13Cplant values record a positive excursion to approximately −18‰. Based upon ammonite biostratigraphy from Crimea, and in conjunction with a composite Tethyan marine δ13Ccarb curve, several conclusions can be drawn: (1) the δ13Cplant record indicates that the atmospheric carbon reservoir was affected; (2) the defined ammonite correlations between Europe and Crimea are synchronous; and (3) a change in photosynthetic carbon-isotope fractionation, caused by a decrease in atmospheric pCO2, occurred during the Upper Valanginian positive δ13C excursion. Our new data, combined with other paleoenvironmental and paleoclimatic information, indicate that the Upper Valanginian was a cool period (icehouse) and highlight that the Cretaceous period was interrupted by episodes of cooling, rather than having the equable climate previously thought. (C) 2005 Elsevier B.V. All rights reserved.