888 results for Dataset


Relevance: 10.00%

Abstract:

It is formally proved that the general smoother for nonlinear dynamics can be formulated as a sequential method, that is, observations can be assimilated sequentially during a forward integration. The general filter can be derived from the smoother and it is shown that the general smoother and filter solutions at the final time become identical, as is expected from linear theory. Then, a new smoother algorithm based on ensemble statistics is presented and examined in an example with the Lorenz equations. The new smoother can be computed as a sequential algorithm using only forward-in-time model integrations. It bears a strong resemblance to the ensemble Kalman filter. The difference is that every time a new dataset is available during the forward integration, an analysis is computed for all previous times up to this time. Thus, the first guess for the smoother is the ensemble Kalman filter solution, and the smoother estimate provides an improvement of this, as one would expect a smoother to do. The method is demonstrated in this paper in an intercomparison with the ensemble Kalman filter and the ensemble smoother introduced by van Leeuwen and Evensen, and it is shown to be superior in an application with the Lorenz equations. Finally, a discussion is given regarding the properties of the analysis schemes when strongly non-Gaussian distributions are used. It is shown that in these cases more sophisticated analysis schemes based on Bayesian statistics must be used.
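As background for the filter that serves as the smoother's first guess, the stochastic ensemble Kalman filter analysis step can be sketched in a few lines (a generic NumPy illustration, not the paper's implementation; the toy state dimension, observation operator `H`, and error covariance `R` below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(X, y, H, R, rng):
    """One stochastic EnKF analysis step.

    X : (n, N) ensemble of model states
    y : (m,)   observation vector
    H : (m, n) linear observation operator
    R : (m, m) observation-error covariance
    """
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    C = A @ A.T / (N - 1)                          # sample covariance
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)   # Kalman gain
    # perturbed observations, one per ensemble member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)

# toy example: 3-variable state, only the first variable observed
X = rng.normal(size=(3, 50)) + np.array([[1.0], [0.0], [0.0]])
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
Xa = enkf_analysis(X, np.array([2.0]), H, R, rng)
# the analysed mean of the observed variable is pulled towards y = 2.0
```

The smoother described in the abstract repeats an analysis of this kind for all earlier times whenever a new dataset arrives during the forward integration.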

Relevance: 10.00%

Abstract:

The thermohaline exchange between the Atlantic and the Southern Ocean is analyzed, using a dataset based on WOCE hydrographic data. It is shown that the salt and heat transports brought about by the South Atlantic subtropical gyre play an essential role in the Atlantic heat and salt budgets. It is found that on average the exported North Atlantic Deep Water (NADW) is fresher than the return flows (basically composed of thermocline and intermediate water), indicating that the overturning circulation (OC) exports freshwater from the Atlantic. The sensitivity of the OC to interbasin fluxes of heat and salt is studied in a 2D model, representing the Atlantic between 60°N and 30°S. The model is forced by mixed boundary conditions at the surface, and by realistic fluxes of heat and salt at its 30°S boundary. The model circulation turns out to be very sensitive to net buoyancy fluxes through the surface. Both net surface cooling and net surface saltening are sources of potential energy and impact positively on the circulation strength. The vertical distributions of the lateral fluxes tend to stabilize the stratification, and, as they extract potential energy from the system, tend to weaken the flow. These results imply that a change in the composition of the NADW return transports, whether by a change in the ratio of thermocline to intermediate water, or by a change in their thermohaline characteristics, might influence the Atlantic OC considerably. It is also shown that the circulation is much more sensitive to changes in the shape of the lateral buoyancy flux than to changes in the shape of the surface buoyancy flux, as the latter does not explicitly impact on the potential energy of the system. It is concluded that interocean fluxes of heat and salt are important for the strength and operation of the Atlantic thermohaline circulation, and should be correctly represented in models that are used for climate sensitivity studies.

Relevance: 10.00%

Abstract:

The ring-shedding process in the Agulhas Current is studied using the ensemble Kalman filter to assimilate Geosat altimeter data into a two-layer quasigeostrophic ocean model. The properties of the ensemble Kalman filter are further explored with focus on the analysis scheme and the use of gridded data. The Geosat data consist of 10 fields of gridded sea-surface height anomalies separated 10 days apart that are added to a climatic mean field. This corresponds to a huge number of data values, and a data reduction scheme must be applied to increase the efficiency of the analysis procedure. Further, it is illustrated how one can resolve the rank problem that occurs when too large a dataset or a small ensemble is used.
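The rank problem arises because the matrix to be inverted in the analysis becomes singular when the effective number of data exceeds the ensemble size; the usual remedy is a truncated-SVD pseudo-inverse. A minimal sketch (generic Python; the retained-variance threshold is an assumption, not the paper's setting):

```python
import numpy as np

def truncated_pinv(M, keep=0.99):
    """Pseudo-inverse of a (possibly rank-deficient) symmetric matrix,
    retaining the leading singular values that account for a fraction
    `keep` of the total."""
    U, s, Vt = np.linalg.svd(M)
    frac = np.cumsum(s) / s.sum()
    r = int(np.searchsorted(frac, keep)) + 1   # number of modes kept
    return Vt[:r].T @ np.diag(1.0 / s[:r]) @ U[:, :r].T

# example: a 5x5 matrix of rank 2, as when 5 observations meet an
# ensemble too small to span observation space
rng = np.random.default_rng(1)
B = rng.normal(size=(5, 2))
M = B @ B.T
Mp = truncated_pinv(M, keep=0.999)
```

Truncating the near-zero singular values yields a stable inverse on the subspace the ensemble actually spans.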

Relevance: 10.00%

Abstract:

Research into Bid Tender Forecasting Models (BTFM) has been in progress since the 1950s. None of the models developed were easy-to-use tools for bidding practitioners, because of the advanced mathematical apparatus and massive data inputs they required. This scenario began to change in 2012 with the development of the Smartbid BTFM, a comparatively simple model that presents a series of graphs enabling any project manager to study competitors using a relatively short historical tender dataset. However, despite the advantages of this new model, it is still necessary to study all the auction participants as an indivisible group; that is, the original BTFM was not devised for analyzing the behavior of a single bidding competitor or a subgroup of them. The present paper addresses that limitation and presents a stand-alone methodology for estimating future competitors' bidding behaviors separately.

Relevance: 10.00%

Abstract:

Datasets containing information to locate and identify water bodies have been generated from data locating static water bodies with a resolution of about 300 m (1/360 deg), recently released by the Land Cover Climate Change Initiative (LC CCI) of the European Space Agency. The LC CCI water-bodies dataset has been obtained from multi-temporal metrics based on time series of the backscattered intensity recorded by ASAR on Envisat between 2005 and 2010. The newly derived datasets coherently provide: distance to land, distance to water, water-body identifiers and lake-centre locations. The water-body identifier dataset locates the water bodies by assigning the identifiers of the Global Lakes and Wetlands Database (GLWD), and lake centres are defined for inland waters for which GLWD IDs were determined. The new datasets therefore link recent lake/reservoir/wetland extents to the GLWD, together with a set of coordinates which unambiguously locates the water bodies in the database. Information on the distance to land for each water cell and the distance to water for each land cell has many potential applications in remote sensing, where the applicability of geophysical retrieval algorithms may be affected by the presence of water or land within a satellite field of view (image pixel). During the generation and validation of the datasets, some limitations of the GLWD database and of the LC CCI water-bodies mask were found. Some examples of the inaccuracies/limitations are presented and discussed. Temporal change in water-body extent is common. Future versions of the LC CCI dataset are planned to represent temporal variation, and this will permit these derived datasets to be updated.

Relevance: 10.00%

Abstract:

Quantifying the effect of seawater density changes on sea level variability is of crucial importance for climate change studies, as the cumulative rise in sea level can be regarded as both an important climate change indicator and a possible danger for human activities in coastal areas. In this work, as part of the Ocean Reanalysis Intercomparison Project, the global and regional steric sea level changes are estimated and compared from an ensemble of 16 ocean reanalyses and 4 objective analyses. These estimates are initially compared with a satellite-derived (altimetry minus gravimetry) dataset for a short period (2003–2010). The ensemble mean exhibits a significant, high correlation at both global and regional scales, and the ensemble of ocean reanalyses outperforms that of the objective analyses, in particular in the Southern Ocean. The reanalysis ensemble mean thus represents a valuable tool for further analyses, although large uncertainties remain for the inter-annual trends. Within the extended intercomparison period that spans the altimetry era (1993–2010), we find that the ensembles of reanalyses and objective analyses are in good agreement, and both detect a trend in global steric sea level of 1.0 and 1.1 ± 0.05 mm/year, respectively. However, the spread among the products in the trend of the halosteric component exceeds the mean trend itself, calling into question the reliability of its estimate. This is related to the scarcity of salinity observations before the Argo era. Furthermore, the impact of the deep ocean layers on steric sea level variability is non-negligible (22 and 12% for the layers below 700 and 1500 m depth, respectively), although the small deep-ocean trends are not significant with respect to the product spread.
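The steric signal being compared follows from vertically integrating density anomalies, Δη = −(1/ρ₀) ∫ Δρ dz. A toy illustration (the exponential anomaly profile and reference density below are assumptions, chosen only to show the arithmetic):

```python
import numpy as np

rho0 = 1025.0                        # reference density, kg m^-3
z = np.linspace(0.0, 2000.0, 201)    # depth grid, m
# hypothetical warming-induced density anomaly, kg m^-3
d_rho = -0.05 * np.exp(-z / 500.0)

# trapezoidal integration of the anomaly over depth
dz = z[1] - z[0]
integral = np.sum(0.5 * (d_rho[:-1] + d_rho[1:]) * dz)
d_eta = -integral / rho0             # steric sea level change, m
# a negative (lighter) density anomaly raises sea level
```

Extending the lower limit of the integral is exactly how the deep layers below 700 and 1500 m enter the estimates discussed above.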

Relevance: 10.00%

Abstract:

Background: Autism spectrum conditions (ASC) are a group of neurodevelopmental conditions characterized by difficulties in social interaction and communication alongside repetitive and stereotyped behaviours. ASC are heritable, and common genetic variants contribute substantially to phenotypic variability. More than 600 genes have been implicated in ASC to date. However, a comprehensive investigation of candidate gene association studies in ASC is lacking. Methods: In this study, we systematically reviewed the literature for association studies for 552 genes associated with ASC. We identified 58 common genetic variants in 27 genes that have been investigated in three or more independent cohorts and conducted a meta-analysis for 55 of these variants. We investigated publication bias and sensitivity and performed stratified analyses for a subset of these variants. Results: We identified 15 variants nominally significant for the mean effect size, 8 of which had P values below a threshold of significance of 0.01. Of these 15 variants, 11 were re-investigated for effect sizes and significance in the larger Psychiatric Genomics Consortium dataset, and none of them were significant. Effect directions for 8 of the 11 variants were concordant between the two datasets, although the correlation between the effect sizes from the two datasets was poor and non-significant. Conclusions: This is the first study to comprehensively examine common variants in candidate genes for ASC through meta-analysis. While for the majority of the variants the total sample size was above 500 cases and 500 controls, it was not large enough to accurately identify common variants that contribute to the aetiology of ASC.
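A generic inverse-variance (fixed-effect) pooling step, the core of such a meta-analysis, can be sketched as follows (the three cohort effect sizes and standard errors are invented for illustration and are not taken from the study):

```python
import numpy as np

def fixed_effect_meta(effects, ses):
    """Inverse-variance pooled estimate and its standard error."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # inverse-variance weights
    est = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, se

# three hypothetical cohort log odds ratios with standard errors
est, se = fixed_effect_meta([0.20, 0.10, 0.35], [0.10, 0.15, 0.20])
z = est / se   # z-score for the pooled effect
```

More precise cohorts get larger weights, which is why pooling across three or more independent cohorts, as in the review above, can reach significance that no single cohort does.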

Relevance: 10.00%

Abstract:

The first large-scale archaeobotanical study in Britain, conducted from 1899 to 1909 by Clement Reid and Arthur Lyell at Silchester, provided the first evidence for the introduction of Roman plant foods to Britain, yet the findings have thus far remained unverified. This paper presents a reassessment of these archaeobotanical remains, now stored as part of the Silchester Collection in Reading Museum. The documentary evidence for the Silchester study is summarised, before the results are presented for over 1,000 plant remains, including an assessment of preservation, identification and modern contamination. The dataset includes evidence for the presence of nationally rare plant foods, such as medlar, as well as several archaeophytes. The methodologies and original interpretations of Reid and Lyell's study are reassessed in light of current archaeobotanical knowledge. Spatial and contextual patterns in the distribution of plant foods and ornamental taxa are also explored. Finally, the legacy of the study for the development of archaeobotany in the 20th century is evaluated.

Relevance: 10.00%

Abstract:

Extratropical cyclones produce the majority of precipitation in many regions of the extratropics. This study evaluates the ability of a climate model, HiGEM, to reproduce the precipitation associated with extratropical cyclones. The model is evaluated using the ERA-Interim reanalysis and the GPCP dataset. The analysis employs a cyclone-centred compositing technique, evaluates composites across a range of geographical areas and cyclone intensities, and also investigates the ability of the model to reproduce the climatological distribution of cyclone-associated precipitation across the Northern Hemisphere. Using this phenomenon-centred approach makes it possible to identify the processes responsible for climatological biases in the model. Composite precipitation intensities are found to be comparable when all cyclones across the Northern Hemisphere are included. When the cyclones are filtered by region or intensity, differences are found; in particular, HiGEM produces too much precipitation in its most intense cyclones relative to ERA-Interim and GPCP. Biases in the climatological distribution of cyclone-associated precipitation are also found, with biases around the storm-track regions associated with both the number of cyclones in HiGEM and their average precipitation intensity. These results have implications for the reliability of future projections of extratropical precipitation from the model.

Relevance: 10.00%

Abstract:

We test the ability of a two-dimensional flux model to simulate polynya events with narrow open-water zones by comparing model results to ice-thickness and ice-production estimates derived from thermal infrared Moderate Resolution Imaging Spectroradiometer (MODIS) observations in conjunction with an atmospheric dataset. Given a polynya boundary and an atmospheric dataset, the model correctly reproduces the shape of an 11 day long event, using only a few simple conservation laws. Ice production is slightly overestimated by the model, owing to an underestimated ice thickness. We achieved the best model results with the consolidation thickness parameterization developed by Biggs and others (2000). Observed regional discrepancies between model and satellite estimates might be a consequence of the missing representation of the dynamics of thin-ice thickening (e.g. rafting). We conclude that this simplified polynya model is a valuable tool for studying polynya dynamics and estimating the associated fluxes of single polynya events.
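One of the simple conservation laws such a flux model rests on is the thermodynamic growth relation dh/dt = Q / (ρᵢ Lf): heat lost to the atmosphere over thin ice freezes new ice. A back-of-envelope sketch (round-number inputs assumed for illustration, not the study's values):

```python
# thermodynamic thin-ice growth in a polynya
rho_ice = 917.0        # density of sea ice, kg m^-3
L_f = 3.34e5           # latent heat of fusion, J kg^-1
Q = 400.0              # net ocean-to-atmosphere heat flux, W m^-2
seconds_per_day = 86400.0

# ice growth rate implied by the heat flux, in metres of ice per day
growth = Q / (rho_ice * L_f) * seconds_per_day
```

An underestimated ice thickness feeds back on such a budget: thinner ice conducts more heat, so the model produces more ice, consistent with the overestimate noted above.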

Relevance: 10.00%

Abstract:

The field campaign LOFZY 2005 (LOFoten ZYklonen; English: cyclones) was carried out within the framework of Collaborative Research Centre 512, which deals with low-pressure systems (cyclones) and the climate system of the North Atlantic. Cyclones are of special interest due to their influence on the interaction between atmosphere and ocean. Cyclone activity in the northern part of the Atlantic Ocean is notably high and is of particular importance for the entire Atlantic Ocean. An area of maximum precipitation exists in front of the Norwegian Lofoten islands. One aim of the LOFZY field campaign was to clarify the role cyclones play in the interaction of ocean and atmosphere. In order to obtain a comprehensive dataset of cyclone activity and ocean-atmosphere interaction, a field experiment was carried out in the Lofoten region during March and April 2005. The platforms employed were the Irish research vessel RV Celtic Explorer, which conducted a meteorological (radiosondes, standard parameters, observations) and an oceanographic (CTD) program, and the German research aircraft Falcon, which accomplished eight flight missions (between 4 and 21 March) to observe synoptic conditions with high spatial and temporal resolution. In addition, 23 autonomous marine buoys were deployed in the observed area in advance of the campaign to measure drift, air temperature, air pressure and water temperature. Beyond the published datasets, several other measurements were performed during the experiment. Corresponding datasets will be published in the near future and are available on request. Details about all platforms, sensors and measurements are listed in the field report. The following datasets are available on request: ground data at RV Celtic Explorer

Relevance: 10.00%

Abstract:

Approximate Bayesian computation (ABC) is a popular family of algorithms which perform approximate parameter inference when numerical evaluation of the likelihood function is not possible but data can be simulated from the model. They return a sample of parameter values which produce simulations close to the observed dataset. A standard approach is to reduce the simulated and observed datasets to vectors of summary statistics and accept when the difference between these is below a specified threshold. ABC can also be adapted to perform model choice. In this article, we present a new software package for R, abctools, which provides methods for tuning ABC algorithms. This includes recent dimension-reduction algorithms to tune the choice of summary statistics, and coverage methods to tune the choice of threshold. We provide several illustrations of these routines on applications taken from the ABC literature.
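The accept/reject step described above reduces to plain rejection ABC; a Python sketch under an assumed toy model (abctools itself is an R package, and its tuning routines go well beyond this):

```python
import numpy as np

rng = np.random.default_rng(2)

def abc_rejection(observed_stat, simulate, prior_sample, n=20000, q=0.01):
    """Rejection ABC: draw parameters from the prior, simulate a summary
    statistic for each, and keep the fraction q of draws whose statistic
    is closest to the observed one."""
    theta = prior_sample(n)
    stats = np.array([simulate(t) for t in theta])
    dist = np.abs(stats - observed_stat)
    keep = dist <= np.quantile(dist, q)       # threshold = q-quantile
    return theta[keep]

# toy model: summary statistic is a sample mean ~ Normal(theta, 1/sqrt(m))
m = 100
posterior = abc_rejection(
    observed_stat=3.0,
    simulate=lambda t: rng.normal(t, 1.0 / np.sqrt(m)),
    prior_sample=lambda n: rng.uniform(0.0, 6.0, size=n),
)
# accepted draws concentrate near theta = 3.0
```

The two tuning choices the package addresses are visible here: which summary statistic to compare (`observed_stat` vs. `stats`) and where to set the acceptance threshold (`q`).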

Relevance: 10.00%

Abstract:

A recent field campaign in southwest England used numerical modeling integrated with aircraft and radar observations to investigate the dynamic and microphysical interactions that can result in heavy convective precipitation. The COnvective Precipitation Experiment (COPE) was a joint UK-US field campaign held during the summer of 2013 in the southwest peninsula of England, designed to study convective clouds that produce heavy rain leading to flash floods. The clouds form along convergence lines that develop regularly due to the topography. Major flash floods have occurred in the past, most famously at Boscastle in 2004. It has been suggested that much of the rain was produced by warm rain processes, similar to some flash floods that have occurred in the US. The overarching goal of COPE is to improve quantitative convective precipitation forecasting by understanding the interactions of the cloud microphysics and dynamics and thereby to improve NWP model skill for forecasts of flash floods. Two research aircraft, the University of Wyoming King Air and the UK BAe 146, obtained detailed in situ and remote sensing measurements in, around, and below storms on several days. A new fast-scanning X-band dual-polarization Doppler radar made 360-deg volume scans over 10 elevation angles approximately every 5 minutes, and was augmented by two UK Met Office C-band radars and the Chilbolton S-band radar. Detailed aerosol measurements were made on the aircraft and on the ground. This paper: (i) provides an overview of the COPE field campaign and the resulting dataset; (ii) presents examples of heavy convective rainfall in clouds containing ice and also in relatively shallow clouds through the warm rain process alone; and (iii) explains how COPE data will be used to improve high-resolution NWP models for operational use.

Relevance: 10.00%

Abstract:

The collective representation within global models of aerosol, cloud, precipitation, and their radiative properties remains unsatisfactory. They constitute the largest source of uncertainty in predictions of climatic change and hamper the ability of numerical weather prediction models to forecast high-impact weather events. The joint European Space Agency (ESA)–Japan Aerospace Exploration Agency (JAXA) Earth Clouds, Aerosol and Radiation Explorer (EarthCARE) satellite mission, scheduled for launch in 2018, will help to resolve these weaknesses by providing global profiles of cloud, aerosol, precipitation, and associated radiative properties inferred from a combination of measurements made by its collocated active and passive sensors. EarthCARE will improve our understanding of cloud and aerosol processes by extending the invaluable dataset acquired by the A-Train satellites CloudSat, Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), and Aqua. Specifically, EarthCARE's cloud profiling radar, with 7 dB more sensitivity than CloudSat, will detect more thin clouds, and its Doppler capability will provide novel information on convection and on precipitating ice particle and raindrop fall speeds. EarthCARE's 355-nm high-spectral-resolution lidar will directly and accurately measure cloud and aerosol extinction and optical depth. Combining this with backscatter and polarization information should lead to an unprecedented ability to identify aerosol type. The multispectral imager will provide a context for, and the ability to construct, the cloud and aerosol distribution in 3D domains around the narrow 2D retrieved cross section. The consistency of the retrievals will be assessed to within a target of ±10 W m⁻² on the (10 km)² scale by comparing the multiview broadband radiometer observations to the top-of-atmosphere fluxes estimated by 3D radiative transfer models acting on the retrieved 3D domains.

Relevance: 10.00%

Abstract:

A major gap in our understanding of the medieval economy concerns interest rates, especially relating to commercial credit. Although direct evidence about interest rates is scattered and anecdotal, there is much more surviving information about exchange rates. Since both contemporaries and historians have suggested that exchange and rechange transactions could be used to disguise the charging of interest in order to circumvent the usury prohibition, it should be possible to back out the interest rates from exchange rates. The following analysis is based on a new dataset of medieval exchange rates collected from commercial correspondence in the archive of Francesco di Marco Datini of Prato, c.1383-1411. It demonstrates that the time value of money was consistently incorporated into market exchange rates. Moreover, these implicit interest rates are broadly comparable to those received from other types of commercial loan and investment. Although on average profitable, the return on any individual exchange and rechange transaction did involve a degree of uncertainty that may have justified their non-usurious nature. However, there were also practical reasons why medieval merchants may have used foreign exchange transactions as a means of extending credit.
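The "backing out" is simple arithmetic on the paired rates: the spread between the outbound exchange rate and the rechange rate, taken over the round-trip usance, is the implicit interest. A hypothetical worked example (the rates and the 90-day round trip are assumptions for illustration, not Datini figures):

```python
# a merchant delivers money in London against a bill on Venice, then
# rechanges the proceeds back to London at the end of the usance
exchange_rate = 44.0     # pence sterling per ducat, outbound
rechange_rate = 46.0     # pence sterling per ducat, on the return bill
round_trip_days = 90     # assumed usance for the round trip

gross_return = rechange_rate / exchange_rate            # per round trip
annual_rate = (gross_return - 1.0) * 365 / round_trip_days
# roughly an 18% simple annual rate hidden in the rate spread
```

Because the realised rechange rate was not known in advance, the return on any one transaction carried the uncertainty that, as the abstract notes, may have justified its non-usurious character.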