48 results for Observation-driven Models


Relevance: 30.00%

Abstract:

We present an efficient method of combining wide-angle neutron scattering data with detailed atomistic models, allowing us to perform a quantitative and qualitative mapping of the organisation of the chain conformation in both the glass and liquid phases. The structural refinement method presented in this work is based on the exploitation of the intrachain features of the diffraction pattern and its intimate linkage with atomistic models through the use of internal coordinates for bond lengths, valence angles and torsion rotations. Atomic connectivity is defined through these coordinates, which are in turn assigned from pre-defined probability distributions, thus allowing the models in question to be built stochastically. Incremental variation of these coordinates allows for the construction of models that minimise the differences between the observed and calculated structure factors. We present a series of neutron scattering data for 1,2-polybutadiene in the region 120-400 K. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 Å and 1.35 Å respectively. Valence angles of the backbone were found to be 112° and the torsion distributions are characterised by five rotational states, a three-fold trans-skew± for the backbone and gauche± for the vinyl group. Rotational states of the vinyl group were found to be equally populated, indicating a largely atactic chain. The two backbone torsion angles exhibit different behaviour of their trans populations with respect to temperature, with one of them adopting an almost all-trans sequence. Consequently, the resulting configuration leads to a rather persistent chain, as indicated by the value of the characteristic ratio extrapolated from the model. We compare our results with theoretical predictions, computer simulations, RIS models and previously reported experimental results.
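
A minimal illustration of the kind of refinement loop described above (not the authors' code): a single backbone is built in Cartesian space from an assumed bond length, valence angle and torsions drawn from a small set of rotational states, the intra-chain structure factor is computed with the Debye formula, and torsions are varied incrementally so that the calculated S(Q) approaches a target. All numerical values and the single-chain geometry are illustrative assumptions.

```python
# Illustrative-only sketch of stochastic internal-coordinate refinement:
# build a backbone from bond length, valence angle and discrete torsional
# states, compute an intra-chain Debye structure factor, and accept
# incremental torsion moves that reduce the misfit to a target S(Q).
import numpy as np

rng = np.random.default_rng(0)

BOND = 1.54                                # backbone C-C bond length / A
ANGLE = np.radians(112.0)                  # backbone valence angle
STATES = np.radians([180.0, 60.0, -60.0])  # trans / skew+- states (illustrative)

def build_chain(torsions, bond=BOND, angle=ANGLE):
    """Convert internal coordinates to Cartesian positions (NeRF-style)."""
    n = len(torsions) + 3
    xyz = np.zeros((n, 3))
    xyz[1] = [bond, 0.0, 0.0]
    xyz[2] = xyz[1] + bond * np.array([np.cos(np.pi - angle), np.sin(np.pi - angle), 0.0])
    for i, phi in enumerate(torsions, start=3):
        b0 = xyz[i - 2] - xyz[i - 3]
        b1 = xyz[i - 1] - xyz[i - 2]
        n1 = np.cross(b0, b1); n1 /= np.linalg.norm(n1)
        b1u = b1 / np.linalg.norm(b1)
        m = np.cross(n1, b1u)
        d = bond * (np.cos(np.pi - angle) * b1u
                    + np.sin(np.pi - angle) * (np.cos(phi) * m + np.sin(phi) * n1))
        xyz[i] = xyz[i - 1] + d
    return xyz

def debye_sq(xyz, q):
    """Intra-chain Debye structure factor S(Q) for identical scatterers."""
    r = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
    r = r[np.triu_indices(len(xyz), k=1)]
    return 1.0 + 2.0 * np.sum(np.sinc(np.outer(q, r) / np.pi), axis=1) / len(xyz)

q = np.linspace(0.5, 10.0, 100)                              # Q grid / A^-1
target = debye_sq(build_chain(np.full(30, STATES[0])), q)    # synthetic "data"

# Stochastic refinement: random single-torsion moves, kept when they reduce
# the difference between the calculated and target structure factors.
torsions = rng.choice(STATES, size=30)
cost = np.sum((debye_sq(build_chain(torsions), q) - target) ** 2)
for _ in range(500):
    trial = torsions.copy()
    trial[rng.integers(len(trial))] = rng.choice(STATES)
    c = np.sum((debye_sq(build_chain(trial), q) - target) ** 2)
    if c < cost:
        torsions, cost = trial, c
print("final misfit:", cost)
```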

Relevance: 30.00%

Abstract:

When the sensory consequences of an action are systematically altered, our brain can recalibrate the mappings between sensory cues and properties of our environment. This recalibration can be driven by both cue conflicts and altered sensory statistics, but neither mechanism offers a way for cues to be calibrated so that they provide accurate information about the world, as sensory cues carry no information as to their own accuracy. Here, we explored whether sensory predictions based on internal physical models could be used to accurately calibrate visual cues to 3D surface slant. Human observers played a 3D kinematic game in which they adjusted the slant of a surface so that a moving ball would bounce off the surface and through a target hoop. In one group, the ball's bounce was manipulated so that the surface behaved as if it had a different slant to that signaled by visual cues. With experience of this altered bounce, observers recalibrated their perception of slant so that it was more consistent with the assumed laws of kinematics and the physical behavior of the surface. In another group, making the ball spin in a way that could physically explain its altered bounce eliminated this pattern of recalibration. Importantly, both groups adjusted their behavior in the kinematic game in the same way, experienced the same set of slants, and were not presented with low-level cue conflicts that could drive the recalibration. We conclude that observers use predictive kinematic models to accurately calibrate visual cues to 3D properties of the world.

Relevance: 30.00%

Abstract:

A series of inquiries and reports suggest considerable failings in the care provided to some patients in the NHS. Although the Bristol Inquiry report of 2001 led to the creation of many new regulatory bodies to supervise the NHS, they have never enjoyed consistent support from government, and the Mid Staffordshire Inquiry in 2013 suggests they made little difference. Why do some parts of the NHS disregard patients' interests, and how should we respond to the challenge? The following discusses the evolution of approaches to NHS governance through the Hippocratic, Managerial and Commercial models, and assesses their risks and benefits. Apart from the ethical imperative, the need for effective governance is driven both by the growth in information available to the public and by the resources wasted by ineffective systems of care. Appropriate solutions depend on an understanding of the perverse incentives inherent in each model and the need for greater sensitivity to the voices of patients and the public.

Relevance: 30.00%

Abstract:

For certain observation types, such as those that are remotely sensed, the observation errors are correlated and these correlations are state- and time-dependent. In this work, we develop a method for diagnosing and incorporating spatially correlated and time-dependent observation error in an ensemble data assimilation system. The method combines an ensemble transform Kalman filter with a diagnostic that uses statistical averages of background and analysis innovations to provide an estimate of the observation error covariance matrix. To evaluate the performance of the method, we perform identical-twin experiments using the Lorenz '96 and Kuramoto-Sivashinsky models. Using our approach, a good approximation to the true observation error covariance can be recovered in cases where the initial estimate of the error covariance is incorrect. Spatial observation error covariances whose true length scale changes slowly in time can also be captured. We find that using the estimated correlated observation error in the assimilation improves the analysis.
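
The abstract does not name the diagnostic, but a standard choice of this type is the Desroziers-style estimate, in which the observation error covariance is approximated by the time-averaged product of analysis and background innovations. The sketch below is a hypothetical stand-alone example rather than the paper's ETKF implementation: it recovers an assumed correlated R from synthetic cycles in which the analysis is an optimal combination formed directly in observation space.

```python
# Sketch of a Desroziers-type diagnostic (assumed to be the kind of innovation
# statistic meant above; the paper's ETKF is not reproduced here): the
# observation error covariance is estimated from time-averaged products of
# analysis and background innovations, d_b = y - H(x_b), d_a = y - H(x_a).
import numpy as np

def estimate_R(obs, Hxb, Hxa):
    """Sample estimate R_hat ~ < d_a d_b^T > over assimilation cycles.

    obs, Hxb, Hxa: arrays of shape (n_cycles, n_obs) holding observations and
    the background/analysis mapped into observation space at each cycle."""
    d_b = obs - Hxb                        # background innovations
    d_a = obs - Hxa                        # analysis residuals
    R_hat = (d_a.T @ d_b) / obs.shape[0]
    return 0.5 * (R_hat + R_hat.T)         # symmetrise the sample estimate

# Synthetic illustration with a correlated "true" R (exponential correlations,
# an assumption) and an optimal analysis computed directly in observation space.
rng = np.random.default_rng(1)
n_obs, n_cycles = 20, 5000
dist = np.abs(np.subtract.outer(np.arange(n_obs), np.arange(n_obs)))
R_true = 0.5 * np.exp(-dist / 3.0)
B = 0.3 ** 2 * np.eye(n_obs)                # background error covariance (obs space)

truth = rng.standard_normal((n_cycles, n_obs))
obs = truth + rng.multivariate_normal(np.zeros(n_obs), R_true, size=n_cycles)
Hxb = truth + rng.multivariate_normal(np.zeros(n_obs), B, size=n_cycles)
K = B @ np.linalg.inv(B + R_true)           # optimal gain (uses true covariances)
Hxa = Hxb + (obs - Hxb) @ K.T

print(np.round(estimate_R(obs, Hxb, Hxa)[:4, :4], 2))   # close to R_true
print(np.round(R_true[:4, :4], 2))
```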

Relevance: 30.00%

Abstract:

Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented, using the shallow water model as an example.
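
As a toy illustration of the "traditional" source-code-based prediction mentioned above (an assumption about the general approach, not the paper's model), the computation time of a regular grid update can be estimated from an operation count and an assumed sustained flop rate:

```python
# Minimal sketch (assumed, not from the paper) of a static compute-time
# prediction: count the floating-point operations in the stencil update and
# divide by an assumed sustained flop rate for the core.
def predict_loop_time(nx, ny, flops_per_point, sustained_gflops, timesteps=1):
    """Predicted wall time (seconds) for a regular grid update."""
    total_flops = nx * ny * flops_per_point * timesteps
    return total_flops / (sustained_gflops * 1e9)

# e.g. a 1024x1024 shallow-water-like update with ~30 flops per grid point on
# a core sustaining 2 GFLOP/s (both numbers are illustrative assumptions):
print(predict_loop_time(1024, 1024, 30, 2.0, timesteps=100))
```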

Relevance: 30.00%

Abstract:

Observations of atmospheric conditions and processes in cities are fundamental to understanding the interactions between the urban surface and weather/climate, improving the performance of urban weather, air quality and climate models, and providing key information for city end-users (e.g. decision-makers, stakeholders, the public). In this paper, Shanghai's urban integrated meteorological observation network (SUIMON) and some examples of intended applications are introduced. Its characteristics include being: multi-purpose (e.g. forecast, research, service), multi-function (high impact weather, city climate, special end-users), multi-scale (e.g. macro/meso-, urban-, neighborhood-, street canyon-scale), multi-variable (e.g. thermal, dynamic, chemical, bio-meteorological, ecological), and multi-platform (e.g. radar, wind profiler, ground-based, satellite-based, in-situ observation/sampling). Underlying SUIMON is a data management system to facilitate the exchange of data and information. The overall aim of the network is to improve coordination strategies and instruments; to identify data gaps based on science- and user-driven requirements; and to intelligently combine observations from a variety of platforms by using a data assimilation system that is tuned to produce the best estimate of the current state of the urban atmosphere.

Relevance: 30.00%

Abstract:

The incorporation of cobalt in mixed metal carbonates is a possible route to the immobilization of this toxic element in the environment. However, the thermodynamics of (Ca,Co)CO3 solid solutions are still unclear due to conflicting data from experiment and from the observation of natural occurrences. We report here the results of a computer simulation study of the mixing of calcite (CaCO3) and spherocobaltite (CoCO3), using density functional theory calculations. Our simulations suggest that previously proposed thermodynamic models, based only on the range of observed compositions, significantly overestimate the mutual solubility of the two solids and therefore underestimate the extent of the miscibility gap under ambient conditions. The enthalpy of mixing of the disordered solid solution is strongly positive and moderately asymmetric: calcium incorporation in spherocobaltite is more endothermic than cobalt incorporation in calcite. Ordering of the impurities in (0001) layers is energetically favourable with respect to the disordered solid solution at low temperatures and intermediate compositions, but the ordered phase is still unstable with respect to demixing. We calculate the solvus and spinodal lines in the phase diagram using a sub-regular solution model, and conclude that many Ca1-xCoxCO3 mineral solid solutions (with observed compositions of up to x=0.027, and above x=0.93) are metastable with respect to phase separation. We also calculate solid/aqueous distribution coefficients to evaluate the effect of the strong non-ideality of mixing on the equilibrium with aqueous solution, showing that the thermodynamically driven incorporation of cobalt in calcite (and of calcium in spherocobaltite) is always very low, regardless of the Co/Ca ratio of the aqueous environment.
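
For readers unfamiliar with the sub-regular (Margules) solution model mentioned above, the sketch below locates the spinodal from the condition d2G_mix/dx2 = 0; the interaction parameters are illustrative assumptions, not the values fitted in this work.

```python
# Sub-regular (Margules) solution model sketch with assumed interaction
# parameters (NOT the values fitted in the paper):
#   G_mix(x) = RT[x ln x + (1-x) ln(1-x)] + x(1-x)[W_A(1-x) + W_B x]
# Setting d2G/dx2 = 0 gives the spinodal temperature
#   T_sp(x) = x(1-x)[W_A(4-6x) + W_B(6x-2)] / R.
import numpy as np

R = 8.314                 # gas constant, J mol^-1 K^-1
W_A, W_B = 18e3, 24e3     # asymmetric Margules parameters, J mol^-1 (assumed)

def spinodal_T(x):
    """Spinodal temperature of the sub-regular model at composition x."""
    return x * (1.0 - x) * (W_A * (4.0 - 6.0 * x) + W_B * (6.0 * x - 2.0)) / R

x = np.linspace(0.01, 0.99, 99)
T_sp = spinodal_T(x)
print("spinodal maximum ~ %.0f K at x ~ %.2f" % (T_sp.max(), x[T_sp.argmax()]))
# Compositions whose spinodal temperature lies above ambient are unstable
# (and a wider range is metastable) with respect to phase separation at 298 K.
print("unstable range at 298 K: x in [%.2f, %.2f]"
      % (x[T_sp > 298].min(), x[T_sp > 298].max()))
```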

Relevance: 30.00%

Abstract:

Experiments in which CO2 is instantaneously quadrupled and then held constant are used to show that the relationship between the global-mean net heat input to the climate system and the global-mean surface air temperature change is nonlinear in Coupled Model Intercomparison Project phase 5 (CMIP5) Atmosphere-Ocean General Circulation Models (AOGCMs). The nonlinearity is shown to arise from a change in strength of climate feedbacks driven by an evolving pattern of surface warming. In 23 out of the 27 AOGCMs examined, the climate feedback parameter becomes significantly (95% confidence) less negative, i.e. the effective climate sensitivity increases, as time passes. Cloud feedback parameters show the largest changes. In the AOGCM mean, approximately 60% of the change in feedback parameter comes from the tropics (30°N-30°S). An important region involved is the tropical Pacific, where the surface warming intensifies in the east after a few decades. The dependence of climate feedbacks on an evolving pattern of surface warming is confirmed using the HadGEM2 and HadCM3 atmosphere GCMs (AGCMs). With monthly evolving sea surface temperatures and sea ice prescribed from its AOGCM counterpart, each AGCM reproduces the time-varying feedbacks, but when a fixed pattern of warming is prescribed the radiative response is linear with global temperature change, or nearly so. We also demonstrate that the regression and fixed-SST methods for evaluating effective radiative forcing are in principle different, because rapid SST adjustment when CO2 is changed can produce a pattern of surface temperature change with zero global mean but a non-zero change in net radiation at the top of the atmosphere (~ -0.5 W m-2 in HadCM3).
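
The diagnostics described above rest on the standard Gregory-style regression of global-mean net heat input N against temperature change dT. The sketch below applies it to a synthetic abrupt-4xCO2 series (the series and all numbers are assumptions, not model output); splitting the regression into early and late periods is how the time variation of the feedback parameter is detected.

```python
# Gregory-style regression sketch: fit N = F + lam * dT to annual means.
# The "data" below are a synthetic stand-in for an abrupt4xCO2 AOGCM run;
# all numbers are assumptions for illustration only.
import numpy as np

def gregory_regression(dT, N):
    """Return (forcing F, feedback parameter lam) from a fit of N = F + lam*dT."""
    lam, F = np.polyfit(dT, N, 1)       # slope, intercept
    return F, lam

rng = np.random.default_rng(2)
years = np.arange(150)
dT = 5.0 * (1.0 - np.exp(-years / 30.0)) + 0.1 * rng.standard_normal(150)
N = 7.0 - 1.2 * dT + 0.3 * rng.standard_normal(150)   # linear by construction

F, lam = gregory_regression(dT, N)
ecs_eff = -F / (2.0 * lam)              # effective sensitivity per CO2 doubling
print("F = %.2f W m-2, lam = %.2f W m-2 K-1, ECS_eff = %.2f K" % (F, lam, ecs_eff))

# The time variation of feedbacks is diagnosed by regressing early and late
# periods separately; this synthetic series is linear, so the two slopes
# agree, whereas in most CMIP5 AOGCMs the late-period slope is less negative.
F1, lam1 = gregory_regression(dT[:20], N[:20])
F2, lam2 = gregory_regression(dT[20:], N[20:])
print("lam (years 1-20) = %.2f, lam (years 21-150) = %.2f" % (lam1, lam2))
```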

Relevance: 30.00%

Abstract:

We utilize energy budget diagnostics from the Coupled Model Intercomparison Project phase 5 (CMIP5) to evaluate the models' climate forcing since preindustrial times, employing an established regression technique. The climate forcing evaluated this way, termed the adjusted forcing (AF), includes a rapid adjustment term associated with cloud changes and other tropospheric and land-surface changes. We estimate a 2010 total anthropogenic and natural AF from CMIP5 models of 1.9 ± 0.9 W m−2 (5-95% range). The projected AFs of the Representative Concentration Pathway simulations are lower than their expected radiative forcing (RF) in 2095 but agree well with efficacy-weighted forcings from integrated assessment models. The smaller AF, compared to RF, is likely due to cloud adjustment. Multimodel time series of temperature change and AF from 1850 to 2100 show large intermodel spreads throughout the period. The intermodel spread of temperature change is principally driven by forcing differences in the present day and by climate feedback differences in 2095, although forcing differences are still important for model spread in 2095. We find no significant relationship between the equilibrium climate sensitivity (ECS) of a model and its 2003 AF, in contrast to what was found in older models, where higher-ECS models generally had less forcing. Given the large present-day model spread, there is no indication of any tendency by modelling groups to adjust their aerosol forcing in order to produce observed trends. Instead, some CMIP5 models have a relatively large positive forcing and overestimate the observed temperature change.

Relevance: 30.00%

Abstract:

Observational analyses of running 5-year ocean heat content trends (Ht) and net downward top-of-atmosphere radiation (N) are significantly correlated (r~0.6) from 1960 to 1999, but a spike in Ht in the early 2000s is likely spurious, since it is inconsistent with estimates of N from both satellite observations and climate model simulations. Variations in N between 1960 and 2000 were dominated by volcanic eruptions, and are well simulated by the ensemble mean of coupled models from the Coupled Model Intercomparison Project phase 5 (CMIP5). We find an observation-based reduction in N of -0.31 ± 0.21 W m-2 between 1999 and 2005 that potentially contributed to the recent warming slowdown, but the relative roles of external forcing and internal variability remain unclear. While present-day anomalies of N in the CMIP5 ensemble mean and observations agree, this may be due to a cancellation of errors in outgoing longwave and absorbed solar radiation.
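
A minimal sketch of the kind of comparison described above (assumed processing, synthetic numbers): running 5-year trends of ocean heat content are converted to a planetary heating rate and correlated with 5-year means of N.

```python
# Sketch (assumed processing, synthetic series) of correlating running 5-year
# ocean-heat-content trends with 5-year means of net TOA radiation N.
import numpy as np

def running_trend(y, window=5):
    """Least-squares trend of y over each running window (units of y per year)."""
    t = np.arange(window)
    return np.array([np.polyfit(t, y[i:i + window], 1)[0]
                     for i in range(len(y) - window + 1)])

def running_mean(y, window=5):
    return np.convolve(y, np.ones(window) / window, mode="valid")

rng = np.random.default_rng(3)
years = np.arange(1960, 2000)
N = 0.3 + 0.2 * rng.standard_normal(len(years))        # hypothetical annual N, W m-2
EARTH_AREA = 5.1e14                                    # m^2
SECONDS_PER_YEAR = 3.15e7
ohc = np.cumsum(N * EARTH_AREA * SECONDS_PER_YEAR)     # heat content, J
ohc = ohc + 2e21 * rng.standard_normal(len(years))     # add observation-like noise

Ht = running_trend(ohc) / (EARTH_AREA * SECONDS_PER_YEAR)   # back to W m-2
r = np.corrcoef(Ht, running_mean(N))[0, 1]
print("correlation of 5-yr OHC trends with 5-yr mean N: %.2f" % r)
```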

Relevance: 30.00%

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, with interpolation between results as necessary.
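
A minimal sketch of the benchmark-driven idea (assumed structure and made-up timings, not the authors' measurements): benchmark times for the two work types are tabulated against local problem size for a given deployment scenario, and predictions for unmeasured sizes are obtained by interpolation and summation.

```python
# Minimal sketch (assumed, not the authors' model) of the benchmark-driven
# approach: times measured for a few problem sizes under one deployment
# scenario are interpolated to predict the cost of an unmeasured size, with
# the compute and halo-exchange parts modelled separately and summed.
import numpy as np

# Hypothetical benchmark measurements: seconds per timestep vs local grid size,
# for one deployment scenario (e.g. fully populated node, default affinity).
sizes     = np.array([64, 128, 256, 512, 1024])          # local domain edge
compute_t = np.array([2.1e-4, 8.5e-4, 3.4e-3, 1.4e-2, 5.6e-2])
halo_t    = np.array([5.0e-5, 9.0e-5, 1.7e-4, 3.3e-4, 6.5e-4])

def predict_step_time(local_size):
    """Interpolate the benchmark results for a given local domain size."""
    comp = np.interp(local_size, sizes, compute_t)
    halo = np.interp(local_size, sizes, halo_t)
    return comp + halo

# Predicted cost of a 768x768 local domain under this scenario:
print(predict_step_time(768))
```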

Relevance: 30.00%

Abstract:

The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient, as it involves considerable unpredictable uncertainty and can lead to a high number of false or missed warnings. Weather forecasts that use multiple NWPs from various weather centres, applied to catchment hydrology, can provide significantly improved early flood warning. The availability of global ensemble weather prediction systems through the ‘THORPEX Interactive Grand Global Ensemble’ (TIGGE) offers a new opportunity for the development of state-of-the-art early flood forecasting systems. This paper presents a case study using the TIGGE database for flood warning on a meso-scale catchment (4062 km2) located in the Midlands region of England. For the first time, a research attempt is made to set up a coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE database. The study shows that precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial precipitation variability on such a comparatively small catchment, which indicates a need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. The spread of discharge forecasts varies from centre to centre, but it is generally large and implies a significant level of uncertainty. Nevertheless, the results show that the TIGGE database is a promising tool for forecasting flood inundation, comparable with forecasts driven by rain gauge observations.
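
A minimal sketch of how a probabilistic warning is typically derived from such an ensemble cascade (an assumption about the general approach, with hypothetical numbers, not the paper's system): each member's discharge forecast is compared with a flood threshold and the warning probability is the fraction of members exceeding it.

```python
# Sketch (assumed, not the paper's system): each ensemble member's NWP rainfall
# is run through the hydrologic model, and the warning probability at each lead
# time is the fraction of members whose discharge exceeds a threshold.
import numpy as np

def exceedance_probability(ensemble_discharge, threshold):
    """ensemble_discharge: array (n_members, n_leadtimes) of forecast flows."""
    return np.mean(ensemble_discharge > threshold, axis=0)

# Hypothetical 50-member discharge forecasts (m3/s) over 10 daily lead times.
rng = np.random.default_rng(4)
q = 80.0 + 40.0 * rng.gamma(2.0, 1.0, size=(50, 10))
print(exceedance_probability(q, threshold=200.0))
```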

Relevance: 30.00%

Abstract:

Substantial biases in shortwave cloud forcing (SWCF) of up to ±30 W m−2 are found in the midlatitudes of the Southern Hemisphere in the historical simulations of 34 CMIP5 coupled general circulation models. The SWCF biases are shown to induce surface temperature anomalies localized in the midlatitudes, and are significantly correlated with the mean latitude of the eddy-driven jet, with a negative SWCF bias corresponding to an equatorward jet latitude bias. Aquaplanet model experiments are performed to demonstrate that the jet latitude biases are primarily induced by the midlatitude SWCF anomalies, such that the jet moves toward (away from) regions of enhanced (reduced) temperature gradients. The results underline the necessity of accurately representing cloud radiative forcings in state-of-the-art coupled models.

Relevance: 30.00%

Abstract:

A strong relationship is found between changes in the meridional gradient of absorbed shortwave radiation (ASR) and Southern Hemispheric jet shifts in 21st century climate simulations of CMIP5 (Coupled Model Intercomparison Project phase 5) coupled models. The relationship is such that models with increases in the meridional ASR gradient around the southern midlatitudes, and therefore increases in midlatitude baroclinicity, tend to produce a larger poleward jet shift. The ASR changes are shown to be dominated by changes in cloud properties, with sea ice declines playing a secondary role. We demonstrate that the ASR changes are the cause, and not the result, of the intermodel differences in jet response by comparing coupled simulations with experiments in which sea surface temperature increases are prescribed. Our results highlight the importance of reducing the uncertainty in cloud feedbacks in order to constrain future circulation changes.

Relevance: 30.00%

Abstract:

Recent research suggests that Eurasian snow-covered area (SCA) influences the Arctic Oscillation (AO) via the polar vortex. This could be important for Northern Hemisphere winter season forecasting. A fairly strong negative correlation between October SCA and the AO, based on both monthly and daily observational data, has been noted in the literature. While we reproduce these previously reported links when using the same data, we find no further evidence of the link when using an independent satellite data source or a climate model.