140 results for High-Order Accurate Scheme


Relevance:

30.00%

Publisher:

Abstract:

The ability of six scanning cloud radar scan strategies to reconstruct cumulus cloud fields for radiation study is assessed. Utilizing snapshots of clean and polluted cloud fields from large eddy simulations, an analysis is undertaken of error in both the liquid water path and monochromatic downwelling surface irradiance at 870 nm of the reconstructed cloud fields. Error introduced by radar sensitivity, choice of radar scan strategy, retrieval of liquid water content (LWC), and reconstruction scheme is explored. Given an infinitely sensitive radar and perfect LWC retrieval, domain average surface irradiance biases are typically less than 3 W m−2 μm−1, corresponding to 5–10% of the cloud radiative effect (CRE). However, when using a realistic radar sensitivity of −37.5 dBZ at 1 km, optically thin areas and edges of clouds are difficult to detect due to their low radar reflectivity; in clean conditions, overestimates are of order 10 W m−2 μm−1 (~20% of the CRE), but in polluted conditions, where the droplets are smaller, this increases to 10–26 W m−2 μm−1 (~40–100% of the CRE). Drizzle drops are also problematic; if treated as cloud droplets, reconstructions are poor, leading to large underestimates of 20–46 W m−2 μm−1 in domain average surface irradiance (~40–80% of the CRE). Nevertheless, a synergistic retrieval approach combining the detailed cloud structure obtained from scanning radar with the droplet-size information and location of cloud base gained from other instruments would potentially make accurate solar radiative transfer calculations in broken cloud possible for the first time.
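
A minimal sketch of the kind of post-processing such an assessment implies: given a "true" cloud field and a reconstruction, compute the domain-average irradiance bias and express it as a fraction of the cloud radiative effect. The array names and numbers below are hypothetical, not taken from the study.

```python
# Illustrative sketch (not the study's code): domain-average bias of a
# reconstructed surface irradiance field, expressed as a fraction of the
# cloud radiative effect (CRE). All values are hypothetical.
import numpy as np

def irradiance_bias_stats(irr_true, irr_recon, irr_clear):
    """Inputs: 2-D fields of monochromatic downwelling surface irradiance
    (W m-2 um-1) on the same grid."""
    bias = np.mean(irr_recon - irr_true)     # domain-average bias
    cre = np.mean(irr_clear - irr_true)      # magnitude of the cloud radiative effect
    return bias, bias / cre                  # bias as a fraction of the CRE

rng = np.random.default_rng(0)
irr_clear = np.full((100, 100), 60.0)                   # clear-sky irradiance
irr_true = irr_clear - rng.uniform(0, 30, (100, 100))   # cloudy "truth"
irr_recon = np.minimum(irr_true + 5.0, irr_clear)       # reconstruction missing thin cloud
bias, frac = irradiance_bias_stats(irr_true, irr_recon, irr_clear)
print(f"bias = {bias:.1f} W m-2 um-1 ({100 * frac:.0f}% of the CRE)")
```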

Relevance:

30.00%

Publisher:

Abstract:

Background: Targeting Induced Local Lesions IN Genomes (TILLING) is increasingly being used to generate and identify mutations in target genes of crop genomes. TILLING populations of several thousand lines have been generated in a number of crop species including Brassica rapa. Genetic analysis of mutants identified by TILLING requires an efficient, high-throughput and cost-effective genotyping method to track the mutations through numerous generations. High-resolution melt (HRM) analysis has been used in a number of systems to identify single nucleotide polymorphisms (SNPs) and insertions/deletions (InDels), enabling the genotyping of different types of samples. HRM is ideally suited to high-throughput genotyping of multiple TILLING mutants in complex crop genomes. To date it has been used to identify mutants and genotype single mutations. The aim of this study was to determine whether HRM can facilitate downstream analysis of multiple mutant lines identified by TILLING in order to characterise allelic series of EMS-induced mutations in target genes across a number of generations in complex crop genomes. Results: We demonstrate that HRM can be used to genotype allelic series of mutations in two genes, BraA.CAX1.a and BraA.MET1.a, in Brassica rapa. We analysed 12 mutations in BraA.CAX1.a and five in BraA.MET1.a over two generations, including a back-cross to the wild type. Using a commercially available HRM kit and the LightScanner™ system, we were able to detect mutations in heterozygous and homozygous states for both genes. Conclusions: Using HRM genotyping on TILLING-derived mutants, it is possible to rapidly generate an allelic series of mutations within multiple target genes. Lines suitable for phenotypic analysis can be isolated approximately 8–9 months (three generations) from receiving M3 seed of Brassica rapa from the RevGenUK TILLING service.

Relevance:

30.00%

Publisher:

Abstract:

The observation-error covariance matrix used in data assimilation contains contributions from instrument errors, representativity errors and errors introduced by the approximated observation operator. Forward model errors arise when the observation operator does not correctly model the observations or when observations can resolve spatial scales that the model cannot. Previous work to estimate the observation-error covariance matrix for particular observing instruments has shown that it contains significant correlations. In particular, correlations for humidity data are more significant than those for temperature. However, it is not known what proportion of these correlations can be attributed to representativity errors. In this article we apply an existing method for calculating representativity error, previously applied to an idealised system, to NWP data. We calculate horizontal errors of representativity for temperature and humidity using data from the Met Office high-resolution UK variable-resolution model. Our results show that errors of representativity are correlated and are more significant for specific humidity than for temperature. We also find that representativity error varies with height. This suggests that the assimilation may be improved if these errors are explicitly included in the data assimilation scheme. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
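
One illustrative way to obtain such representativity-error statistics (a generic sketch under stated assumptions, not the method used in the article): treat high-resolution fields sampled at observation locations as "truth", subtract the coarse-resolution model equivalent, and form the sample covariance and correlation matrices.

```python
# Illustrative sketch: sample covariance/correlation of representativity error,
# defined here as high-resolution "observations" minus the coarse-resolution
# model equivalent at the same locations. Data are hypothetical.
import numpy as np

def representativity_covariance(high_res_obs, coarse_equivalent):
    """Both arguments: (n_samples, n_obs) arrays (e.g. temperature or
    specific humidity at a set of observation locations)."""
    err = high_res_obs - coarse_equivalent
    err = err - err.mean(axis=0)                 # remove any mean difference
    cov = err.T @ err / (err.shape[0] - 1)       # sample covariance matrix
    std = np.sqrt(np.diag(cov))
    corr = cov / np.outer(std, std)              # correlation matrix
    return cov, corr

rng = np.random.default_rng(1)
truth = rng.standard_normal((500, 10))           # 500 samples at 10 locations
coarse = truth + 0.3 * rng.standard_normal((500, 10))
cov, corr = representativity_covariance(truth, coarse)
```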

Relevance:

30.00%

Publisher:

Abstract:

The parameterization of surface heat-flux variability in urban areas relies on adequate representation of surface characteristics. Given the horizontal resolutions (e.g. ≈ 0.1–1 km) currently used in numerical weather prediction (NWP) models, properties of the urban surface (e.g. vegetated/built surfaces, street-canyon geometries) often have large spatial variability. Here, a new approach based on Urban Zones to characterize Energy partitioning (UZE) is tested within an NWP model (Weather Research and Forecasting model; WRF v3.2.1) for Greater London. The urban land-surface scheme is the Noah/Single-Layer Urban Canopy Model (SLUCM). Detailed surface information (horizontal resolution 1 km) in central London shows that the UZE offers better characterization of surface properties and their variability than the default WRF-SLUCM input parameters. In situ observations of the surface energy fluxes and near-surface meteorological variables are used to select the radiation and turbulence parameterization schemes and to evaluate the land-surface scheme.
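
The essence of the UZE approach is a lookup from an urban-zone class to the surface parameters the canopy model needs. The sketch below illustrates only that idea; the class names and parameter values are placeholders, not the UZE or SLUCM tables.

```python
# Illustrative sketch only: map an urban-zone class to canopy-model parameters
# and pick the dominant class for a grid cell. Names and values are placeholders.
ZONE_PARAMS = {
    "dense_city_centre":   {"urban_fraction": 0.90, "building_height_m": 20.0, "canyon_aspect": 2.0},
    "compact_residential": {"urban_fraction": 0.65, "building_height_m": 10.0, "canyon_aspect": 1.0},
    "open_suburban":       {"urban_fraction": 0.40, "building_height_m": 6.0,  "canyon_aspect": 0.5},
}

def parameters_for_cell(zone_fractions):
    """zone_fractions: {zone_class: areal fraction} for one 1 km grid cell.
    Here the dominant class simply supplies the parameters."""
    dominant = max(zone_fractions, key=zone_fractions.get)
    return dominant, ZONE_PARAMS[dominant]

zone, params = parameters_for_cell(
    {"dense_city_centre": 0.2, "compact_residential": 0.7, "open_suburban": 0.1})
```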

Relevance:

30.00%

Publisher:

Abstract:

Future climate change projections are often derived from ensembles of simulations from multiple global circulation models using heuristic weighting schemes. This study provides a more rigorous justification for this by introducing a nested family of three simple analysis of variance frameworks. Statistical frameworks are essential in order to quantify the uncertainty associated with the estimate of the mean climate change response. The most general framework yields the “one model, one vote” weighting scheme often used in climate projection. However, a simpler additive framework is found to be preferable when the climate change response is not strongly model dependent. In such situations, the weighted multimodel mean may be interpreted as an estimate of the actual climate response, even in the presence of shared model biases. Statistical significance tests are derived to choose the most appropriate framework for specific multimodel ensemble data. The framework assumptions are explicit and can be checked using simple tests and graphical techniques. The frameworks can be used to test for evidence of nonzero climate response and to construct confidence intervals for the size of the response. The methodology is illustrated by application to North Atlantic storm track data from the Coupled Model Intercomparison Project phase 5 (CMIP5) multimodel ensemble. Despite large variations in the historical storm tracks, the cyclone frequency climate change response is not found to be model dependent over most of the region. This gives high confidence in the response estimates. Statistically significant decreases in cyclone frequency are found on the flanks of the North Atlantic storm track and in the Mediterranean basin.
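
A minimal sketch of the two ingredients described above: the "one model, one vote" multimodel mean and a one-way analysis of variance to test whether the response is model dependent. This is a generic illustration, not the paper's nested framework; the numbers are hypothetical.

```python
# Illustrative sketch: "one model, one vote" mean response and a one-way ANOVA
# test of whether the climate change response depends on the model.
import numpy as np
from scipy import stats

# Hypothetical responses (e.g. change in cyclone frequency); one array per model,
# entries are ensemble members of that model.
responses = [
    np.array([-0.8, -1.1, -0.9]),          # model A
    np.array([-1.0, -0.7]),                # model B
    np.array([-0.9, -1.2, -1.0, -0.8]),    # model C
]

one_vote_mean = np.mean([r.mean() for r in responses])  # each model weighted equally
f_stat, p_value = stats.f_oneway(*responses)            # model-dependence test
print(f"multimodel mean = {one_vote_mean:.2f}, ANOVA p-value = {p_value:.2f}")
```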

Relevance:

30.00%

Publisher:

Abstract:

Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been specifically developed as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes and a slab ocean, to model the effects of climate change mechanisms on the atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM run at T-21 resolution with 22 levels is compared to European Centre for Medium-Range Weather Forecasts reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing applied only at low latitudes. It is found that, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30% for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
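
The radiative forcing concept being tested is the standard linear relation between global-mean surface temperature change and forcing; stated minimally (notation generic, not taken from the paper):

```latex
% Linear forcing--response relation underlying the radiative forcing concept.
%   \Delta T_s : equilibrium change in global-mean surface temperature (K)
%   F          : radiative forcing (W m^{-2})
%   \lambda    : climate sensitivity parameter (K (W m^{-2})^{-1})
\[
  \Delta T_s \;=\; \lambda\, F .
\]
% The abstract's finding that the CO2 experiments are ~17% more sensitive than
% the solar experiments corresponds to
% \lambda_{\mathrm{CO_2}} \approx 1.17\,\lambda_{\mathrm{solar}}.
```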

Relevance:

30.00%

Publisher:

Abstract:

Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so the observational data of flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas. However, the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two post-event surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated, showing positive results. The proposed smoothing algorithm can be applied to improve the assessment of flood inundation models and the determination of risk zones on the floodplain.
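
As a generic illustration of identifying and reducing localized inconsistencies in such point data (an assumed neighbour-median filter, not the algorithm proposed in the paper):

```python
# Illustrative sketch: flag maximum water-level marks that disagree with the
# median of nearby marks and replace them by that median. The radius and
# tolerance are hypothetical.
import numpy as np

def smooth_water_marks(xy, level, radius=100.0, tol=0.25):
    """xy: (n, 2) coordinates in metres; level: (n,) water-surface elevations (m)."""
    smoothed = level.copy()
    for i in range(len(level)):
        dist = np.hypot(*(xy - xy[i]).T)
        neighbours = level[(dist < radius) & (dist > 0)]
        if neighbours.size and abs(level[i] - np.median(neighbours)) > tol:
            smoothed[i] = np.median(neighbours)   # reduce the local inconsistency
    return smoothed

rng = np.random.default_rng(2)
xy = rng.uniform(0, 1000, (200, 2))               # hypothetical survey points
level = 10.0 + 0.001 * xy[:, 0] + rng.normal(0, 0.05, 200)
level[17] += 1.0                                  # one inconsistent wrack mark
clean = smooth_water_marks(xy, level)
```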

Relevance:

30.00%

Publisher:

Abstract:

Particle filters are fully non-linear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high-dimensional geophysical applications, the number of particles required by the sequential importance resampling (SIR) particle filter to capture the high-probability region of the posterior is too large for the filter to be usable. However, particle filters can be formulated using proposal densities, which gives greater freedom in how particles are sampled and allows for a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high-probability region of the posterior probability density function. This opens the possibility of non-linear data assimilation in high-dimensional systems. The particle filter formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that when observations are available every time step, both of those schemes become degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model.
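
A minimal sketch of a particle filter step that uses a proposal density (here a generic "relaxation towards the observation" proposal for a scalar state; this is not the scheme introduced in the paper):

```python
# Illustrative sketch: one particle-filter step with a proposal density.
# Weights follow w ∝ p(y|x) p(x|x_prev) / q(x|x_prev, y).
import numpy as np

rng = np.random.default_rng(3)

def pf_step(particles, y, model_std=1.0, obs_std=1.0, relax=0.5):
    """Scalar state with identity dynamics; 'relax' pulls the proposal mean
    towards the observation y (an assumed, generic proposal)."""
    prior_mean = particles                                   # forecast of each particle
    proposal_mean = (1.0 - relax) * prior_mean + relax * y
    x_new = proposal_mean + model_std * rng.standard_normal(particles.size)

    def log_gauss(x, mu, sd):
        return -0.5 * ((x - mu) / sd) ** 2

    logw = (log_gauss(y, x_new, obs_std)                     # likelihood  p(y|x)
            + log_gauss(x_new, prior_mean, model_std)        # transition  p(x|x_prev)
            - log_gauss(x_new, proposal_mean, model_std))    # proposal    q(x|x_prev, y)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(particles.size, size=particles.size, p=w)
    return x_new[idx]                                        # resampled particles

particles = rng.standard_normal(100)
particles = pf_step(particles, y=1.5)
```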

Relevance:

30.00%

Publisher:

Abstract:

We perform simulations of several convective events over the southern UK with the Met Office Unified Model (UM) at horizontal grid lengths ranging from 1.5 km to 200 m. Comparing the simulated storms on these days with the Met Office rainfall radar network allows us to apply a statistical approach to evaluate the properties and evolution of the simulated storms over a range of conditions. Here we present results comparing storm morphology in the model and in reality, which show that the simulated storms become smaller as the grid length decreases and that the grid length that best fits the observations varies with the size of the observed cells. We investigate the sensitivity of storm morphology in the model to the mixing length used in the subgrid turbulence scheme. As the subgrid mixing length is decreased, the number of small storms with high area-averaged rain rates increases. We show that by changing the mixing length we can produce a lower-resolution simulation that yields morphologies similar to those of a higher-resolution simulation.
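
A sketch of the kind of morphology statistic such a comparison relies on: label connected rainy regions above a threshold and record their areas and area-averaged rain rates. The threshold and synthetic field are hypothetical; this is not the study's diagnostic code.

```python
# Illustrative sketch: storm-cell sizes and area-averaged rain rates from a
# 2-D rain-rate field, via connected-component labelling.
import numpy as np
from scipy import ndimage

def cell_statistics(rain_rate, dx_km=1.5, threshold=1.0):
    """rain_rate: 2-D field (mm h-1); returns cell areas (km^2) and mean rates."""
    labels, n_cells = ndimage.label(rain_rate > threshold)
    areas, means = [], []
    for cell in range(1, n_cells + 1):
        mask = labels == cell
        areas.append(mask.sum() * dx_km ** 2)
        means.append(rain_rate[mask].mean())
    return np.array(areas), np.array(means)

rng = np.random.default_rng(4)
field = rng.gamma(shape=0.5, scale=2.0, size=(200, 200))   # hypothetical rain field
areas, mean_rates = cell_statistics(field)
```

Applying the same statistic to the radar-derived field would give the observed cell-size distribution against which the simulated cells are compared.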

Relevance:

30.00%

Publisher:

Abstract:

Low glycaemic index (GI) foods consumed at breakfast can enhance memory in comparison to high-GI foods; however, the impact of evening meal GI manipulations on cognition the following morning remains unexplored. Fourteen healthy males consumed a high-GI evening meal or a low-GI evening meal in a counterbalanced order on two separate evenings. Memory and attention were assessed before and after a high-GI breakfast the following morning. The high-GI evening meal elicited significantly higher evening glycaemic responses than the low-GI evening meal. Verbal recall was better the morning following the high-GI evening meal compared to after the low-GI evening meal. In summary, the GI of the evening meal was associated with memory performance the next day, suggesting a second meal cognitive effect. The present findings imply that an overnight fast may not be sufficient to control for previous nutritional consumption.

Relevance:

30.00%

Publisher:

Abstract:

Low-power medium access control (MAC) protocols used for communication between energy-constrained wireless embedded devices do not cope well with situations where transmission channels are highly erroneous. Existing MAC protocols discard corrupted messages, which leads to costly retransmissions. Transmission performance can be improved by an error-correction scheme, which adds redundant information to transmitted packets so that data can be recovered from corrupted packets, and by transmit/receive diversity, which uses multiple antennas to improve the error resiliency of transmissions. Both schemes may be used in conjunction to improve performance further. In this study, the authors show how an error-correction scheme and transmit/receive diversity can be integrated in low-power MAC protocols. Furthermore, the authors investigate the achievable performance gains of both methods. This is important as both methods have associated costs (processing requirements; additional antennas and power) and, for a given communication situation, it must be decided which methods should be employed. The authors’ results show that, in many practical situations, error-control coding outperforms transmission diversity; however, if very high reliability is required, it is useful to employ both schemes together.
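
To make the redundancy idea concrete, the sketch below encodes 4 data bits into a 7-bit Hamming codeword and corrects a single bit error on decode. This is a generic textbook code, not the scheme evaluated by the authors.

```python
# Illustrative sketch: Hamming(7,4) encoding/decoding, the kind of redundancy
# an error-correction scheme adds to packet payloads.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],     # generator matrix [I4 | P]
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],     # parity-check matrix [P^T | I3]
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(nibble):                       # 4 data bits -> 7-bit codeword
    return np.array(nibble) @ G % 2

def decode(word):                         # corrects any single bit error
    word = np.array(word).copy()
    syndrome = H @ word % 2
    if syndrome.any():
        flipped = np.where((H.T == syndrome).all(axis=1))[0][0]
        word[flipped] ^= 1                # flip the erroneous bit back
    return word[:4]                       # return the data bits

codeword = encode([1, 0, 1, 1])
corrupted = codeword.copy()
corrupted[2] ^= 1                         # single bit error in the channel
assert (decode(corrupted) == [1, 0, 1, 1]).all()
```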

Relevance:

30.00%

Publisher:

Abstract:

This study assesses the influence of the El Niño–Southern Oscillation (ENSO) on global tropical cyclone activity using a 150-yr-long integration with a high-resolution coupled atmosphere–ocean general circulation model [High-Resolution Global Environmental Model (HiGEM); with N144 resolution: ~90 km in the atmosphere and ~40 km in the ocean]. Tropical cyclone activity is compared to an atmosphere-only simulation using the atmospheric component of HiGEM (HiGAM). Observations of tropical cyclones in the International Best Track Archive for Climate Stewardship (IBTrACS) and tropical cyclones identified in the Interim ECMWF Re-Analysis (ERA-Interim) are used to validate the models. Composite anomalies of tropical cyclone activity in El Niño and La Niña years are used. HiGEM is able to capture the shift in tropical cyclone locations in response to ENSO in the Pacific and Indian Oceans. However, HiGEM does not capture the expected ENSO–tropical cyclone teleconnection in the North Atlantic. HiGAM shows more skill in simulating the global ENSO–tropical cyclone teleconnection; however, variability in the Pacific is overly pronounced. HiGAM is able to capture the ENSO–tropical cyclone teleconnection in the North Atlantic more accurately than HiGEM. An investigation into the large-scale environmental conditions known to influence tropical cyclone activity is used to further understand the response of tropical cyclone activity to ENSO in the North Atlantic and western North Pacific. The vertical wind shear response over the Caribbean is not captured in HiGEM compared to HiGAM and ERA-Interim. Biases in the mean ascent at 500 hPa in HiGEM remain in HiGAM over the western North Pacific; however, a more realistic low-level vorticity in HiGAM results in a more accurate ENSO–tropical cyclone teleconnection.
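
A minimal sketch of the compositing step described above: average tropical cyclone counts over El Niño and La Niña years (classified here by a simple index threshold) and subtract the climatology. The index, threshold and counts below are hypothetical.

```python
# Illustrative sketch: composite tropical cyclone count anomalies for
# El Nino and La Nina years from an annual ENSO index.
import numpy as np

def enso_composites(tc_counts, enso_index, threshold=0.5):
    """tc_counts, enso_index: (n_years,) arrays; returns anomalies relative
    to the all-year climatology."""
    clim = tc_counts.mean()
    el_nino = tc_counts[enso_index >= threshold].mean() - clim
    la_nina = tc_counts[enso_index <= -threshold].mean() - clim
    return el_nino, la_nina

rng = np.random.default_rng(5)
enso_index = rng.normal(0.0, 1.0, 150)                   # hypothetical 150-yr index
tc_counts = 25 - 3 * enso_index + rng.normal(0, 2, 150)  # e.g. fewer storms in El Nino years
print(enso_composites(tc_counts, enso_index))
```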

Relevance:

30.00%

Publisher:

Abstract:

The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project, using PRACE (Partnership for Advanced Computing in Europe) resources, constructed and ran an ensemble of atmosphere-only global climate model simulations using the Met Office Unified Model GA3 configuration. Each simulation is 27 years in length for both the present climate and an end-of-century future climate, at resolutions of N96 (130 km), N216 (60 km) and N512 (25 km), in order to study the impact of model resolution on high-impact climate features such as tropical cyclones. Increased model resolution is found to improve the simulated frequency of explicitly tracked tropical cyclones, and correlations of interannual variability in the North Atlantic and North West Pacific lie between 0.6 and 0.75. Improvements in the deficit of genesis in the eastern North Atlantic as resolution increases appear to be related to the representation of African Easterly Waves and the African Easterly Jet. However, the intensity of the modelled tropical cyclones as measured by 10 m wind speed remains weak, and there is no indication of convergence over this range of resolutions. In the future climate ensemble, there is a reduction of 50% in the frequency of Southern Hemisphere tropical cyclones, while in the Northern Hemisphere there is a reduction in the North Atlantic and a shift in the Pacific, with peak intensities becoming more common in the Central Pacific. There is also a change in tropical cyclone intensities, with the future climate having fewer weak storms and proportionally more strong storms.

Relevance:

30.00%

Publisher:

Abstract:

The global characteristics of tropical cyclones (TCs) simulated by several climate models are analyzed and compared with observations. The global climate models were forced by the same sea surface temperature (SST) fields in two types of experiments, using climatological SST and interannually varying SST. TC tracks and intensities are derived from each model's output fields by the group who ran that model, using their own preferred tracking scheme; the study considers the combination of model and tracking scheme as a single modeling system and compares the properties derived from the different systems. Overall, the observed geographic distribution of global TC frequency was reasonably well reproduced. As expected, with the exception of one model, the intensities of the simulated TCs were lower than observed, to a degree that varies considerably across models.

Relevance:

30.00%

Publisher:

Abstract:

The asymmetries in the convective flows, current systems, and particle precipitation in the high-latitude dayside ionosphere that are related to the equatorial-plane components of the interplanetary magnetic field (IMF) are discussed in relation to the results of several recent observational studies. It is argued that all of the effects reported to date which are ascribed to the y component of the IMF can be understood, at least qualitatively, in terms of a simple theoretical picture in which the effects result from the stresses exerted on the magnetosphere consequent on the interconnection of terrestrial and interplanetary fields. In particular, relaxation under the action of these stresses allows, in effect, a partial penetration of the IMF into the magnetospheric cavity, such that the sense of the expected asymmetry effects on closed field lines can be understood, to zeroth order, in terms of the “dipole plus uniform field” model. Specifically, in response to IMF By, the dayside cusp should be displaced in longitude about noon in the same sense as By in the northern hemisphere, and in the opposite sense to By in the southern hemisphere, while simultaneously the auroral oval as a whole should be shifted in the dawn-dusk direction in the opposite sense with respect to By. These expected displacements are found to be consistent with recently published observations. Similar considerations lead to the suggestion that the auroral oval may also undergo displacements in the noon-midnight direction which are associated with the x component of the IMF. We show that a previously published study of the position of the auroral oval contains strong initial evidence for the existence of this effect. However, recent results on variations in the latitude of the cusp are more ambiguous. This topic therefore requires further study before definitive conclusions can be drawn.
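
The "dipole plus uniform field" picture invoked above can be stated compactly; a minimal formulation (the penetration fraction k is an assumed zeroth-order parameter, not a value from the paper):

```latex
% Field inside the magnetospheric cavity modelled as the terrestrial dipole
% plus a uniform field representing the partially penetrating IMF.
%   k : assumed zeroth-order penetration fraction, 0 < k < 1
\[
  \mathbf{B}(\mathbf{r}) \;=\; \mathbf{B}_{\mathrm{dipole}}(\mathbf{r})
    \;+\; k\,\mathbf{B}_{\mathrm{IMF}} ,
\]
% so a nonzero IMF B_y adds a dawn-dusk component that displaces the dayside
% cusp in opposite senses in the two hemispheres, as described above.
```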