48 results for Semi-complete Data Synchronization
Abstract:
This paper describes a method that employs Earth Observation (EO) data to calculate spatiotemporal estimates of soil heat flux, G, using a physically based method (the Analytical Method). The method involves a harmonic analysis of land surface temperature (LST) data. It also requires an estimate of near-surface soil thermal inertia; this property depends on soil textural composition and varies as a function of soil moisture content. The EO data needed to drive the model equations, and the ground-based data required to provide verification of the method, were obtained over the Fakara domain within the African Monsoon Multidisciplinary Analysis (AMMA) program. LST estimates (3 km × 3 km, one image per 15 min) were derived from MSG-SEVIRI data. Soil moisture estimates were obtained from ENVISAT-ASAR data, while estimates of leaf area index (LAI), used to calculate the effect of the canopy on G, largely due to radiation extinction, were obtained from SPOT-HRV images. The variation of these variables over the Fakara domain, and implications for values of G derived from them, are discussed. Results showed that this method provides reliable large-scale spatiotemporal estimates of G. Variations in G could largely be explained by the variability in the model input variables. Furthermore, it was shown that this method is relatively insensitive to model parameters related to the vegetation or soil texture. However, the strong sensitivity of thermal inertia to soil moisture content at low values of relative saturation (<0.2) means that in arid or semi-arid climates accurate estimates of surface soil moisture content are of utmost importance if reliable estimates of G are to be obtained. This method has the potential to improve large-scale evaporation estimates, to aid land surface model prediction, and to advance research that aims to explain the failure of energy balance closure in meteorological field studies.
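The core of the harmonic-analysis step can be sketched as follows: fit a truncated Fourier series to the diurnal LST cycle, then scale each harmonic by the thermal inertia and by the square root of its angular frequency, with a quarter-cycle phase lead. The snippet below is a minimal illustration under stated assumptions; the harmonic count, the exponential canopy extinction term with coefficient k_ext, and all function and parameter names are assumptions made here, not taken from the paper.

```python
import numpy as np

def soil_heat_flux(lst, t, gamma, omega=2*np.pi/86400, n_harmonics=3, k_ext=0.5, lai=0.0):
    """Sketch of an analytical soil-heat-flux estimate from a harmonic fit of LST.

    lst   : array of land surface temperatures (K)
    t     : observation times (s)
    gamma : near-surface soil thermal inertia (J m-2 K-1 s-1/2)
    """
    # Least-squares harmonic analysis of the diurnal LST cycle:
    # LST(t) ~ a0 + sum_n [a_n cos(n w t) + b_n sin(n w t)]
    cols = [np.ones_like(t, dtype=float)]
    for n in range(1, n_harmonics + 1):
        cols += [np.cos(n * omega * t), np.sin(n * omega * t)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), lst, rcond=None)

    # Each harmonic contributes gamma * A_n * sqrt(n w) * sin(n w t + phi_n + pi/4) to G
    g = np.zeros_like(t, dtype=float)
    for n in range(1, n_harmonics + 1):
        a, b = coef[2*n - 1], coef[2*n]
        amp = np.hypot(a, b)
        phase = np.arctan2(a, b)   # a*cos + b*sin = amp*sin(n w t + phase)
        g += gamma * amp * np.sqrt(n * omega) * np.sin(n * omega * t + phase + np.pi/4)

    # Assumed canopy attenuation via radiation extinction, exponential in LAI
    return g * np.exp(-k_ext * lai)
```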
Abstract:
Climate-G is a large-scale distributed testbed devoted to climate change research. It is an unfunded effort started in 2008 and involving a wide community in both Europe and the US. The testbed is an interdisciplinary effort involving partners from several institutions and joining expertise in the fields of climate change and computational science. Its main goal is to allow scientists to carry out geographical and cross-institutional data discovery, access, analysis, visualization and sharing of climate data. It represents an attempt to address, in a real environment, challenging data and metadata management issues. This paper presents a complete overview of the Climate-G testbed, highlighting the most important results achieved since the beginning of the project.
Abstract:
Background: A whole-genome genotyping array has previously been developed for Malus using SNP data from 28 Malus genotypes. This array offers the prospect of high-throughput genotyping and linkage map development for any given Malus progeny. To test the applicability of the array for mapping in diverse Malus genotypes, we applied the array to the construction of a SNP-based linkage map of an apple rootstock progeny. Results: Of the 7,867 Malus SNP markers on the array, 1,823 (23.2 %) were heterozygous in one of the two parents of the progeny, 1,007 (12.8 %) were heterozygous in both parental genotypes, whilst just 2.8 % of the 921 Pyrus SNPs were heterozygous. A linkage map spanning 1,282.2 cM was produced, comprising 2,272 SNP markers, 306 SSR markers and the S-locus. The length of the M432 linkage map was increased by 52.7 cM with the addition of the SNP markers, whilst marker density increased from 3.8 cM/marker to 0.5 cM/marker. Just three regions in excess of 10 cM remain where no markers were mapped. We compared the positions of the mapped SNP markers on the M432 map with their predicted positions on the ‘Golden Delicious’ genome sequence. A total of 311 markers (13.7 % of all mapped markers) mapped to positions that conflicted with their predicted positions on the ‘Golden Delicious’ pseudo-chromosomes, indicating the presence of paralogous genomic regions or misassignments of genome sequence contigs during the assembly and anchoring of the genome sequence. Conclusions: We incorporated data for the 2,272 SNP markers onto the map of the M432 progeny and have presented the most complete and saturated map of the full 17 linkage groups of M. pumila to date. The data were generated rapidly in a high-throughput semi-automated pipeline, permitting significant savings in time and cost over linkage map construction using microsatellites. The application of the array will permit linkage maps to be developed for QTL analyses in a cost-effective manner, and the identification of SNPs that have been assigned erroneous positions on the ‘Golden Delicious’ reference sequence will assist in the continued improvement of the genome sequence assembly for that variety.
Abstract:
Data assimilation refers to the problem of finding trajectories of a prescribed dynamical model in such a way that the output of the model (usually some function of the model states) follows a given time series of observations. Typically, though, these two requirements cannot both be met at the same time: tracking the observations is not possible without the trajectory deviating from the proposed model equations, while adherence to the model requires deviations from the observations. Thus, data assimilation faces a trade-off. In this contribution, the sensitivity of the data assimilation with respect to perturbations in the observations is identified as the parameter which controls the trade-off. A relation between the sensitivity and the out-of-sample error is established, which allows the latter to be calculated under operational conditions. A minimum out-of-sample error is proposed as a criterion to set an appropriate sensitivity and to settle the discussed trade-off. Two approaches to data assimilation are considered, namely variational data assimilation and Newtonian nudging, also known as synchronization. Numerical examples demonstrate the feasibility of the approach.
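As a concrete illustration of the Newtonian nudging (synchronization) approach and its trade-off parameter, the sketch below relaxes a model trajectory toward observations of one state component. The choice of the Lorenz-63 system, the observation operator h(x) = x[0], and the explicit-Euler stepping are all assumptions made here for illustration, not the paper's setup.

```python
import numpy as np

def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0/3.0):
    # Standard Lorenz-63 equations, used here as the prescribed model
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def nudge(x0, obs, dt, k):
    """Newtonian nudging: relax the model state toward observations.

    obs : observed x-component at each time step (h(x) = x[0], an assumption)
    k   : nudging gain; larger k tracks the observations more closely but
          forces larger deviations from the model equations (the trade-off).
    """
    x, traj = x0.copy(), [x0.copy()]
    for y in obs:
        # Model tendency plus a relaxation term on the observed component
        x = x + dt * lorenz63(x) + dt * k * (y - x[0]) * np.array([1.0, 0.0, 0.0])
        traj.append(x.copy())
    return np.array(traj)
```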
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
Abstract:
As wind generation increases, system impact studies rely on predictions of future generation and effective representation of wind variability. A well-established approach to investigate the impact of wind variability is to simulate generation using observations from 10 m meteorological masts. However, there are problems with relying purely on historical wind-speed records or generation histories: mast data are often incomplete, not sited at relevant wind generation sites, and recorded at the wrong altitude above ground (usually 10 m), each of which may distort the generation profile. A possible complementary approach is to use reanalysis data, where data assimilation techniques are combined with state-of-the-art weather forecast models to produce complete gridded wind time-series over an area. Previous investigations of reanalysis datasets have placed an emphasis on comparing reanalysis to meteorological site records, whereas this paper compares wind generation simulated using reanalysis data directly against historic wind generation records. Importantly, this comparison is conducted using raw reanalysis data (typical resolution ∼50 km), without relying on a computationally expensive “dynamical downscaling” for a particular target region. Although the raw reanalysis data cannot, by nature of its construction, represent the site-specific effects of sub-gridscale topography, it is nevertheless shown to be comparable to or better than the mast-based simulation in the region considered, and it is therefore argued that raw reanalysis data may offer a number of significant advantages as a data source.
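A typical wind-to-generation simulation of this kind extrapolates the gridded wind speed to hub height and passes it through a turbine power curve. The sketch below shows one minimal version; the power-law shear exponent of 1/7, the idealized power curve (cut-in 3.5 m/s, rated 13 m/s, cut-out 25 m/s), and all names are generic assumptions, not the paper's specific choices.

```python
import numpy as np

def simulated_generation(u_ref, z_ref, hub_height, capacity_mw, alpha=1.0/7.0):
    """Sketch: reanalysis wind speeds -> generation via a generic power curve.

    u_ref : array of wind speeds at reference height z_ref (m/s)
    alpha : power-law shear exponent (1/7 is a common open-terrain assumption)
    """
    # Extrapolate from the reference height to hub height with a power-law profile
    u_hub = u_ref * (hub_height / z_ref) ** alpha

    # Idealized power curve: cubic ramp between cut-in and rated speed
    p = np.clip((u_hub**3 - 3.5**3) / (13.0**3 - 3.5**3), 0.0, 1.0)
    p[u_hub >= 25.0] = 0.0   # shut down above cut-out speed
    return capacity_mw * p
```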
Abstract:
The interannual variability of the stratospheric polar vortex during winter in both hemispheres is observed to correlate strongly with the phase of the quasi-biennial oscillation (QBO) in tropical stratospheric winds. It follows that the lack of a spontaneously generated QBO in most atmospheric general circulation models (AGCMs) adversely affects the nature of polar variability in such models. This study examines QBO–vortex coupling in an AGCM in which a QBO is spontaneously induced by resolved and parameterized waves. The QBO–vortex coupling in the AGCM compares favorably to that seen in reanalysis data [from the 40-yr ECMWF Re-Analysis (ERA-40)], provided that careful attention is given to the definition of QBO phase. A phase angle representation of the QBO is employed that is based on the two leading empirical orthogonal functions of equatorial zonal wind vertical profiles. This yields a QBO phase that serves as a proxy for the vertical structure of equatorial winds over the whole depth of the stratosphere and thus provides a means of subsampling the data to select QBO phases with similar vertical profiles of equatorial zonal wind. Using this subsampling, it is found that the QBO phase that induces the strongest polar vortex response in early winter differs from that which induces the strongest late-winter vortex response. This is true in both hemispheres and for both the AGCM and ERA-40. It follows that the strength and timing of QBO influence on the vortex may be affected by the partial seasonal synchronization of QBO phase transitions that occurs both in observations and in the model. This provides a mechanism by which changes in the strength of QBO–vortex correlations may exhibit variability on decadal time scales. In the model, such behavior occurs in the absence of external forcings or interannual variations in sea surface temperatures.
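The phase-angle representation described above can be sketched directly: compute the two leading empirical orthogonal functions (EOFs) of the equatorial zonal-wind profiles and take the angle in the plane spanned by their principal components. Details such as the anomaly definition and any level weighting are assumptions made in this illustration.

```python
import numpy as np

def qbo_phase(u):
    """Phase-angle representation of the QBO from the two leading EOFs.

    u : array (n_months, n_levels) of equatorial zonal-mean zonal wind profiles
    Returns a phase angle (radians) per month and the two leading PC series.
    """
    anom = u - u.mean(axis=0)                  # remove the time-mean profile
    # EOF analysis via SVD: rows of vt are EOFs, u_svd * s are principal components
    u_svd, s, vt = np.linalg.svd(anom, full_matrices=False)
    pcs = u_svd[:, :2] * s[:2]                 # two leading principal components
    phase = np.arctan2(pcs[:, 1], pcs[:, 0])   # angle in the PC1-PC2 plane
    return phase, pcs
```

Subsampling winters by this continuous phase, rather than by a binary easterly/westerly classification, is what allows QBO phases with similar vertical wind profiles to be grouped together.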
Abstract:
The ability to create accurate geometric models of neuronal morphology is important for understanding the role of shape in information processing. Despite a significant amount of research on automating neuron reconstructions from image stacks obtained via microscopy, in practice most data are still collected manually. This paper describes Neuromantic, an open source system for three-dimensional digital tracing of neurites. Neuromantic reconstructions are comparable in quality to those of existing commercial and freeware systems while balancing the speed and accuracy of manual reconstruction. The combination of semi-automatic tracing, intuitive editing, and the ability to visualize large image stacks on standard computing platforms provides a versatile tool that can help address the bottleneck in the availability of reconstructions. Practical considerations for reducing the computational time and space requirements of the extended algorithm are also discussed.
Abstract:
We present evidence that large-scale spatial coherence of 40 Hz oscillations can emerge dynamically in a cortical mean field theory. The simulated synchronization time scale is about 150 ms, which compares well with experimental data on large-scale integration during cognitive tasks. The same model has previously provided consistent descriptions of the human EEG at rest, with tranquilizers, under anesthesia, and during anesthetic-induced epileptic seizures. The emergence of coherent gamma band activity is brought about by changing just one physiological parameter until the cortex becomes marginally unstable for a small range of wavelengths. This suggests, as a topic for future study, a model of dynamic computation at the edge of cortical stability.
Abstract:
Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
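The two-stage construction described above can be sketched as follows: a pilot run regresses the parameters on simulated data, so that the fitted values approximate the posterior means, which then serve as summary statistics in a standard rejection-ABC step. The helper names, the plain linear regression, and the Euclidean distance are illustrative assumptions in this sketch.

```python
import numpy as np

def semi_auto_abc(simulate, prior_sample, y_obs, n_pilot=5000, n_abc=50000, q=0.01):
    """Sketch of semi-automatic ABC with regression-based summary statistics.

    simulate     : function mapping a 1-D parameter vector to a data vector
    prior_sample : function mapping n to an (n, n_params) array of prior draws
    """
    # Stage 1: pilot simulations. Regress parameters on simulated data; the
    # fitted values estimate posterior means, the (theoretically) optimal summaries.
    theta_p = prior_sample(n_pilot)
    x_p = np.array([simulate(t) for t in theta_p])
    design = lambda x: np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(design(x_p), theta_p, rcond=None)

    # Stage 2: rejection ABC using the learned summaries s(x) = design(x) @ beta
    s_obs = design(np.atleast_2d(y_obs)) @ beta
    theta = prior_sample(n_abc)
    s_sim = design(np.array([simulate(t) for t in theta])) @ beta
    dist = np.linalg.norm(s_sim - s_obs, axis=1)
    return theta[dist <= np.quantile(dist, q)]   # accepted posterior sample
```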
Abstract:
Traditionally, the formal scientific output in most fields of natural science has been limited to peer-reviewed academic journal publications, with less attention paid to the chain of intermediate data results and their associated metadata, including provenance. In effect, this has constrained the representation and verification of data provenance to the confines of the related publications. Detailed knowledge of a dataset’s provenance is essential to establish the pedigree of the data for its effective re-use, and to avoid redundant re-enactment of the experiment or computation involved. Determining the authenticity and quality of open-access data is increasingly important, especially considering the growing volumes of datasets appearing in the public domain. To address these issues, we present an approach that combines the Digital Object Identifier (DOI) – a widely adopted citation technique – with existing, widely adopted climate science data standards to formally publish detailed provenance of a climate research dataset as an associated scientific workflow. This is integrated with linked-data-compliant data re-use standards (e.g. OAI-ORE) to enable a seamless link between a publication and the complete trail of lineage of the corresponding dataset, including the dataset itself.
Abstract:
Semi-open street roofs protect pedestrians from intense sunshine and rain. Their effects on the natural ventilation of urban canopy layers (UCL) are less well understood. This paper investigates two idealized urban models consisting of 4 (2 × 2) or 16 (4 × 4) buildings under a neutral atmospheric condition with parallel (0°) or non-parallel (15°, 30°, 45°) approaching wind. The aspect ratio (building height H / street width W) is 1 and the building width is B = 3H. Computational fluid dynamics (CFD) simulations were first validated against experimental data, confirming that the standard k-ε model predicted airflow velocity better than the RNG k-ε, realizable k-ε and Reynolds stress models. Three ventilation indices were numerically analyzed for ventilation assessment: flow rates across street roofs and openings, to show the mechanisms of air exchange; age of air, to display how long external air takes to reach a place after entering the UCL; and purging flow rate, to quantify the net UCL ventilation capacity induced by mean flows and turbulence. Five semi-open roof types are studied: walls hung above street roofs (coverage ratio λa = 100%) at z = 1.5H, 1.2H and 1.1H ('Hung1.5H', 'Hung1.2H', 'Hung1.1H' types); walls partly covering street roofs (λa = 80%) at z = H ('Partly-covered' type); and walls fully covering street roofs (λa = 100%) at z = H ('Fully-covered' type). These basically obtain worse UCL ventilation than the open street roof type due to the decreased roof ventilation. The 'Hung1.1H', 'Hung1.2H' and 'Hung1.5H' types are better designs than the 'Fully-covered' and 'Partly-covered' types. A greater urban size contains a larger UCL volume and requires a longer time to ventilate. The methodologies and ventilation indices are confirmed to be effective in quantifying UCL ventilation.
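Two of the three indices can be obtained from the steady concentration field of a tracer released uniformly throughout the UCL volume. The sketch below assumes this homogeneous-emission method, which is a common post-processing choice for CFD ventilation studies; the paper's exact procedure may differ, and all names here are illustrative.

```python
import numpy as np

def ventilation_indices(conc, cell_vol, source_density):
    """Sketch: age of air and purging flow rate via homogeneous emission.

    conc           : steady-state tracer concentration per CFD cell (kg m-3)
    cell_vol       : cell volumes within the UCL (m3)
    source_density : uniform tracer release rate S (kg m-3 s-1)
    """
    age = conc / source_density                 # local mean age of air (s)
    V = cell_vol.sum()                          # total UCL volume (m3)
    mean_age = (age * cell_vol).sum() / V       # volume-averaged age of air (s)
    pfr = V / mean_age                          # purging flow rate (m3 s-1)
    return age, pfr
```

Under this convention, a roof design that lowers the purging flow rate (or raises the mean age of air) has a reduced net ventilation capacity, which is the comparison made across the five roof types above.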
Abstract:
This study of landscape evolution presents both new modern and palaeo process-landform data, and analyses the behaviour of the Antarctic Peninsula Ice Sheet through the Last Glacial Maximum (LGM), the Holocene and to the present day. Six sediment-landform assemblages are described and interpreted for Ulu Peninsula, James Ross Island, NE Antarctic Peninsula: (1) the Glacier Ice and Snow Assemblage; (2) the Glacigenic Assemblage, which relates to LGM sediments and comprises both erratic-poor and erratic-rich drift, deposited by cold-based and wet-based ice and ice streams respectively; (3) the Boulder Train Assemblage, deposited during a Mid-Holocene glacier readvance; (4) the Ice-cored Moraine Assemblage, found in front of small cirque glaciers; (5) the Paraglacial Assemblage including scree, pebble-boulder lags, and littoral and fluvial processes; and (6) the Periglacial Assemblage including rock glaciers, protalus ramparts, blockfields, solifluction lobes and extensive patterned ground. The interplay between glacial, paraglacial and periglacial processes in this semi-arid polar environment is important in understanding polygenetic landforms. Crucially, cold-based ice was capable of sediment and landform genesis and modification. This landsystem model can aid the interpretation of past environments, but also provides new data to aid the reconstruction of the last ice sheet to overrun James Ross Island.
Abstract:
Pollen data from China for 6000 and 18,000 14C yr BP were compiled and used to reconstruct palaeovegetation patterns, using complete taxon lists where possible and a biomization procedure that entailed the assignment of 645 pollen taxa to plant functional types. A set of 658 modern pollen samples spanning all biomes and regions provided a comprehensive test for this procedure and showed convincing agreement between reconstructed biomes and present natural vegetation types, both geographically and in terms of the elevation gradients in mountain regions of north-eastern and south-western China. The 6000 14C yr BP map confirms earlier studies in showing that the forest biomes in eastern China were systematically shifted northwards and extended westwards during the mid-Holocene. Tropical rain forest occurred on mainland China at sites characterized today by either tropical seasonal or broadleaved evergreen/warm mixed forest. Broadleaved evergreen/warm mixed forest occurred further north than today, and at higher-elevation sites within the modern latitudinal range of this biome. The northern limit of temperate deciduous forest was shifted c. 800 km north relative to today. The 18,000 14C yr BP map shows that steppe and even desert vegetation extended to the modern coast of eastern China at the last glacial maximum, replacing today’s temperate deciduous forest. Tropical forests were excluded from China and broadleaved evergreen/warm mixed forest had retreated to tropical latitudes, while taiga extended southwards to c. 43°N.
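A biomization step of this kind typically scores each candidate biome from the square-root-transformed percentages of the taxa assigned to it (via plant functional types) and assigns the highest-scoring biome. The sketch below assumes a common 0.5% threshold and simplified taxon-to-PFT mappings; the paper's actual assignment matrices are not reproduced here.

```python
import numpy as np

def biomize(pollen_pct, taxon_to_pfts, biome_to_pfts, threshold=0.5):
    """Sketch of a biomization affinity-score calculation for one sample.

    pollen_pct    : {taxon: pollen percentage} for the sample
    taxon_to_pfts : {taxon: set of plant functional types}
    biome_to_pfts : {biome: set of PFTs characteristic of that biome}
    """
    scores = {}
    for biome, pfts in biome_to_pfts.items():
        # A taxon contributes if any of its PFTs belongs to this biome
        taxa = [t for t, p in taxon_to_pfts.items() if p & pfts]
        # Square-root transform damps dominant taxa; threshold screens noise
        scores[biome] = sum(np.sqrt(max(pollen_pct.get(t, 0.0) - threshold, 0.0))
                            for t in taxa)
    # Assign the biome with the highest affinity score
    return max(scores, key=scores.get)
```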
Abstract:
New compilations of African pollen and lake data are compared with climate (CCM1, NCAR, Boulder) and vegetation (BIOME 1.2, GSG, Lund) simulations for the last glacial maximum (LGM) and the early to mid-Holocene (EMH). The simulated LGM climate was ca. 4°C colder and drier than present, with the maximum reduction in precipitation in semi-arid regions. Biome simulations show a lowering of montane vegetation belts and an expansion of southern xerophytic associations, but no change in the distribution of deserts and tropical rain forests. The lakes show LGM conditions similar to or drier than present throughout northern and tropical Africa. Pollen data indicate a lowering of montane vegetation belts, the stability of the Sahara, and a reduction of rain forest. The paleoenvironmental data are consistent with the simulated changes in temperature and moisture budgets, although they suggest the climate model underestimates equatorial aridity. EMH simulations show temperatures slightly lower than present and increased monsoonal precipitation in the eastern Sahara and East Africa. Biome simulations show an upward shift of montane vegetation belts, a fragmentation of xerophytic vegetation in southern Africa, and a major northward shift of the southern margin of the eastern Sahara. The lakes indicate conditions wetter than present across northern Africa. Pollen data show an upward shift of the montane forests, a northward shift of the southern margin of the Sahara, and a major extension of tropical rain forest. The lake and pollen data confirm monsoon expansion in eastern Africa, but the climate model fails to simulate the wet conditions in western Africa.