53 results for reliability of delivery

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed probabilities verified over a large number of cases. As climate change trends are now emerging from natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend lies outside the range of modelled trends in many more regions than would be expected from the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.
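The reliability concept described above can be checked over the spatial dimension with a rank histogram: in each region, rank the observed trend within the ensemble's trends and test whether the ranks are uniform. A minimal sketch follows, with synthetic data standing in for the CMIP5 trends; all names and numbers are illustrative assumptions, not the paper's data or code.

```python
# Rank-histogram reliability check over the spatial dimension.
# `ens_trends` (members x cells) and `obs_trend` (cells,) are synthetic
# stand-ins for modelled and observed 1950-2011 trends.
import numpy as np

rng = np.random.default_rng(0)
n_members, n_cells = 40, 500
ens_trends = rng.normal(0.8, 0.3, size=(n_members, n_cells))
obs_trend = rng.normal(0.8, 0.5, size=n_cells)  # wider spread than the ensemble

# Rank of the observation within each cell's ensemble (0..n_members).
ranks = (ens_trends < obs_trend).sum(axis=0)

# A reliable ensemble yields a roughly flat histogram; a U-shape (the
# observation often outside the ensemble range) signals overconfidence.
hist, _ = np.histogram(ranks, bins=n_members + 1, range=(-0.5, n_members + 0.5))
print(hist / n_cells)
```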

Relevance:

100.00%

Publisher:

Abstract:

Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, namely that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles for initialized and uninitialized predictions. At lead times of less than 2 years, the initialized ensembles tend to be under-dispersed, and hence produce overconfident and unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
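The necessary condition tested here, ensemble spread matching forecast error, reduces to a simple ratio. A hedged sketch follows, with assumed array shapes and no connection to the actual DePreSys data handling:

```python
# Spread-error ratio at one lead time: RMS ensemble spread divided by
# the RMS error of the ensemble mean. A value near 1 suggests good
# dispersion; <1 under-dispersed (overconfident), >1 over-dispersed.
import numpy as np

def spread_error_ratio(forecasts, obs):
    """forecasts: (n_members, n_starts) ensemble forecasts;
    obs: (n_starts,) verifying observations."""
    ens_mean = forecasts.mean(axis=0)
    spread = np.sqrt(forecasts.var(axis=0, ddof=1).mean())
    error = np.sqrt(((ens_mean - obs) ** 2).mean())
    return spread / error
```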

Relevance:

100.00%

Publisher:

Abstract:

Embedded computer systems equipped with wireless communication transceivers are nowadays used in a vast number of application scenarios. Energy consumption is important in many of these scenarios, as systems are battery operated and long maintenance-free operation is required. To achieve this goal, embedded systems employ low-power communication transceivers and protocols. However, currently used protocols cannot operate efficiently when communication channels are highly erroneous. In this study, we show how average diversity combining (ADC) can be used in state-of-the-art low-power communication protocols. This novel approach improves transmission reliability and, in consequence, reduces energy consumption and transmission latency in the presence of erroneous channels. Using a testbed, we show that highly erroneous channels are indeed a common occurrence in situations where low-power systems are used, and we demonstrate that ADC improves low-power communication dramatically.
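The core idea of diversity combining can be illustrated in a few lines: rather than discarding corrupted retransmissions, the receiver averages the soft values of each copy before making a bit decision, raising the effective SNR. This is a generic sketch of the technique, not the paper's implementation; all names and noise figures are assumptions.

```python
# Average diversity combining over repeated receptions of one frame.
import numpy as np

def adc_decode(soft_copies):
    """soft_copies: (n_copies, n_bits) soft values (e.g. normalised
    correlator outputs) from repeated receptions of the same frame.
    Averaging the copies suppresses noise before the hard decision."""
    combined = np.mean(soft_copies, axis=0)
    return (combined > 0).astype(np.uint8)

# Example: three heavily corrupted copies of an all-ones frame.
rng = np.random.default_rng(1)
copies = 1.0 + rng.normal(0, 1.2, size=(3, 32))
print(adc_decode(copies))  # far fewer bit errors than any single copy
```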

Relevance:

100.00%

Publisher:

Abstract:

An ability to quantify the reliability of probabilistic flood inundation predictions is a requirement not only for guiding model development but also for their successful application. Probabilistic flood inundation predictions are usually produced by choosing a method of weighting the model parameter space, but previous work suggests that this choice leads to clear differences in inundation probabilities. This study addresses the evaluation of the reliability of these probabilistic predictions. However, the lack of an adequate number of flood inundation observations for a catchment limits the application of conventional methods of evaluating predictive reliability. Consequently, attempts have been made to assess the reliability of probabilistic predictions using multiple observations from a single flood event. Here, a LISFLOOD-FP hydraulic model of an extreme (>1 in 1000 years) flood event in Cockermouth, UK, is constructed and calibrated using multiple performance measures from both peak flood wrack mark data and aerial photography captured post-peak. These measures are used in weighting the parameter space to produce multiple probabilistic predictions for the event. Two methods of assessing the reliability of these probabilistic predictions using limited observations are utilized: an existing method assessing the binary pattern of flooding, and a method developed in this paper to assess predictions of water surface elevation. This study finds that the water surface elevation method has both a better diagnostic and discriminatory ability, but this result is likely to be sensitive to the unknown uncertainties in the upstream boundary condition.
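The weighting of the parameter space described above can be pictured with a short sketch: each model run gets a likelihood weight from its performance measure, and the weighted ensemble yields a probability of inundation per cell. The inputs below are hypothetical stand-ins, not the study's LISFLOOD-FP runs or measures.

```python
# Weighted parameter ensemble -> probabilistic inundation map.
import numpy as np

n_runs, n_cells = 200, 10_000
rng = np.random.default_rng(2)
flood_maps = rng.random((n_runs, n_cells)) > 0.6  # binary wet/dry per run
scores = rng.random(n_runs)                       # e.g. fit to wrack marks

weights = scores / scores.sum()                   # normalised likelihoods
# Weighted probability of inundation for every cell.
p_flood = (weights[:, None] * flood_maps).sum(axis=0)
print(p_flood[:5])
```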

Relevance:

100.00%

Publisher:

Abstract:

Scoring rules are an important tool for evaluating the performance of probabilistic forecasting schemes. A scoring rule is called strictly proper if its expectation is optimal if and only if the forecast probability represents the true distribution of the target. In the binary case, strictly proper scoring rules allow for a decomposition into terms related to the resolution and the reliability of a forecast. This fact is particularly well known for the Brier score. In this article, this result is extended to forecasts for finite-valued targets. Both resolution and reliability are shown to have a positive effect on the score. It is demonstrated that resolution and reliability are directly related to forecast attributes that are desirable on grounds independent of the notion of scores. This finding can be considered an epistemological justification of measuring forecast quality by proper scoring rules. A link is provided to the original work of DeGroot and Fienberg, extending their concepts of sufficiency and refinement. The relation to the conjectured sharpness principle of Gneiting et al. is elucidated.
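For the binary case, the standard Murphy decomposition of the Brier score makes the reliability and resolution terms explicit; it is shown here for orientation (a textbook result, which the paper extends to finite-valued targets):

```latex
% Murphy decomposition of the Brier score over K forecast-probability
% bins: n_k cases in bin k with forecast f_k, observed frequency
% \bar{o}_k, climatological base rate \bar{o}, and N total cases.
\mathrm{BS}
  = \underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k \, (f_k - \bar{o}_k)^2}_{\text{reliability}}
  \;-\; \underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k \, (\bar{o}_k - \bar{o})^2}_{\text{resolution}}
  \;+\; \underbrace{\bar{o}\,(1 - \bar{o})}_{\text{uncertainty}}
```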

Relevance:

100.00%

Publisher:

Abstract:

The oral administration of probiotic bacteria has shown potential in clinical trials for the alleviation of specific disorders of the gastrointestinal tract. However, cells must be alive in order to exert these benefits. The low pH of the stomach can greatly reduce the number of viable microorganisms that reach the intestine, thereby reducing the efficacy of the administration. Herein, a model probiotic, Bifidobacterium breve, has been encapsulated into an alginate matrix before coating in multilayers of alternating alginate and chitosan. The intention of this formulation was to improve the survival of B. breve during exposure to low pH and to target the delivery of the cells to the intestine. The material properties were first characterized before in vitro testing. Biacore™ experiments allowed the polymer interactions to be confirmed; additionally, the stability of these multilayers in buffers simulating the pH of the gastrointestinal tract was demonstrated. Texture analysis was used to monitor changes in gel strength during preparation, showing a weakening of the matrices during coating as a result of calcium ion sequestration. The build-up of multilayers was confirmed by confocal laser-scanning microscopy, which also showed the increase in coat thickness over time. During exposure to in vitro gastric conditions, viability increased from <3 log(CFU) per mL for free cells up to a maximum of 8.84 ± 0.17 log(CFU) per mL for a 3-layer coated matrix. Multilayer-coated alginate matrices also targeted delivery to the intestine, gradually releasing their loads over 240 min.

Relevance:

100.00%

Publisher:

Abstract:

Personalised nutrition (PN) has the potential to reduce disease risk and optimise health and performance. Although previous research has shown good acceptance of the concept of PN in the UK, preferences regarding the delivery of a PN service (e.g. online v. face-to-face) are not fully understood. It is anticipated that the presence in the UK of a healthcare system that is free at the point of delivery, the National Health Service (NHS), may have an impact on end-user preferences for delivery. To determine this, supplementary analysis of qualitative data obtained from focus group discussions on PN service delivery, collected as part of the Food4Me project in the UK and Ireland, was undertaken. The Irish data provided a comparative analysis of a healthcare system that is not provided free of charge at the point of delivery to the entire population. Analyses were conducted using the 'framework approach' described by Rabiee (Focus-group interview and data analysis. Proc Nutr Soc 63, 655-660). There was a preference for services to be led by the government and delivered face-to-face, which was perceived to increase trust and transparency, and to add value. Both countries associated paying for nutritional advice with increased commitment and motivation to follow guidelines. In contrast to Ireland, however, and despite the perceived benefit of paying, UK discussants still expected PN services to be delivered free of charge by the NHS. Consideration of this unique challenge of free healthcare, which is embedded in NHS culture, will be crucial when introducing PN to the UK.

Relevance:

100.00%

Publisher:

Abstract:

A manufactured aeration and nanofiltration MBR greywater system was tested during continuous operation at the University of Reading to demonstrate reliability in the delivery of high-quality treated greywater. Its treatment performance was evaluated against British Standard criteria [BSI (Greywater Systems—Part 1 Code of Practice: BS 8525-1:2010. BS Press, 2010); (Greywater Systems—Part 2 Domestic Greywater Treatment, Requirements and Methods: BS 8525-2:2011. BS Press, 2011)]. The low-carbon greywater recycling technology produced excellent analytical results as well as consistency in performance. User acceptance of such reliably treated greywater was then evaluated through user perception studies. The results inform the potential supply of treated greywater to student accommodation. Of 135 questionnaire replies, 95% demonstrated, in one or more attributes, a lack of aversion to using treated, recycled greywater.

Relevance:

100.00%

Publisher:

Abstract:

Wireless Sensor Networks (WSNs) have been an exciting topic in recent years. The services offered by a WSN can be classified into three major categories: monitoring, alerting, and information on demand. WSNs have been used for a variety of applications related to the environment (agriculture, water, and forest fire detection), the military, buildings, health (elderly people and home monitoring), disaster relief, and area or industrial monitoring. In most WSNs, tasks like processing the sensed data, making decisions and generating emergency messages are carried out by a remote server, hence the need for efficient means of transferring data across the network. Because of the range of applications and types of WSN, there is a need for different kinds of MAC and routing protocols in order to guarantee delivery of data from the source nodes to the server (or sink).

In order to minimize energy consumption and increase performance in areas such as reliability of data delivery, extensive research has been conducted and documented in the literature on designing energy-efficient protocols for each individual layer. The most common way to conserve energy in WSNs involves using the MAC layer to put the transceiver and the processor of the sensor node into a low-power sleep state when they are not being used, reducing the energy wasted on collisions, overhearing and idle listening. As a result of this strategy for saving energy, the routing protocols need new solutions that take into account the sleep state of some nodes, and which also enable the lifetime of the entire network to be increased by distributing energy usage between nodes over time. This means that a combined MAC and routing protocol could significantly improve WSNs, because the interaction between the MAC and network layers lets nodes be active at the same time in order to deal with data transmission.

In the research presented in this thesis, a cross-layer protocol based on MAC and routing protocols was designed in order to improve the capability of WSNs for a range of different applications. Simulation results, based on a range of realistic scenarios, show that these new protocols improve WSNs by reducing their energy consumption as well as enabling them to support mobile nodes where necessary. A number of conference and journal papers have been published to disseminate these results for a range of applications.
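The scale of the energy saving from the sleep-state strategy described above can be made concrete with a back-of-the-envelope sketch; the duty cycle and power figures below are illustrative assumptions, not values from the thesis.

```python
# Energy comparison: duty-cycled MAC vs an always-listening radio.
WAKE_MS, PERIOD_MS = 10, 1000          # 1% duty cycle (assumed)
P_LISTEN_MW, P_SLEEP_MW = 60.0, 0.003  # typical transceiver figures (assumed)

def energy_per_hour_mj(wake_ms=WAKE_MS, period_ms=PERIOD_MS):
    cycles = 3_600_000 / period_ms                            # cycles per hour
    awake = cycles * wake_ms * P_LISTEN_MW / 1000.0           # mJ listening
    asleep = cycles * (period_ms - wake_ms) * P_SLEEP_MW / 1000.0  # mJ sleeping
    return awake + asleep

print(f"duty-cycled: {energy_per_hour_mj():.0f} mJ/h")        # ~2170 mJ/h
print(f"always-on:   {3_600_000 * P_LISTEN_MW / 1000.0:.0f} mJ/h")  # 216000 mJ/h
```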

Relevance:

90.00%

Publisher:

Abstract:

The reliability of the global reanalyses in the polar regions is investigated. The overview stems from an April 2006 Scientific Committee on Antarctic Research (SCAR) workshop on the performance of global reanalyses in high latitudes held at the British Antarctic Survey. Overall, the skill is much higher in the Arctic than the Antarctic, where the reanalyses are only reliable in the summer months prior to the modern satellite era. In the Antarctic, large circulation differences between the reanalyses are found primarily before 1979, when vast quantities of satellite sounding data started to be assimilated. Specifically for ERA-40, this data discontinuity creates a marked jump in Antarctic snow accumulation, especially at high elevations. In the Arctic, the largest differences are related to the reanalyses' depiction of clouds and their associated radiation impacts; ERA-40 captures the cloud variability much better than NCEP1 and JRA-25, but the ERA-40 and JRA-25 clouds are too optically thin for shortwave radiation. To further contrast the reanalyses' skill, cyclone tracking results are presented. In the Southern Hemisphere, cyclonic activity is markedly different between the reanalyses, with few matched cyclones prior to 1979. In comparison, only some of the weaker cyclones are unmatched in the Northern Hemisphere from 1958-2001, again indicating the superior skill in this hemisphere. Although this manuscript focuses on deficiencies in the reanalyses, it is important to note that they are a powerful tool for climate studies in both polar regions when used with a recognition of their limitations.

Relevance:

90.00%

Publisher:

Abstract:

The mathematical difficulties which can arise in the force constant refinement procedure for calculating force constants and normal co-ordinates are described and discussed. The method has been applied to the methyl fluoride molecule, using an electronic computer. The best values of the twelve force constants in the most general harmonic potential field were obtained to fit twenty-two independently observed experimental data, these being the six vibration frequencies, three Coriolis zeta constants and the two centrifugal stretching constants D_J and D_JK, for both CH3F and CD3F. The calculations have been repeated both with and without anharmonicity corrections to the vibration frequencies. All the experimental data were weighted according to the reliability of the observations, and the corresponding standard errors and correlation coefficients of the force constants have been deduced. The final force constants are discussed briefly and compared with previous treatments, particularly with a recent Urey-Bradley treatment of this molecule.
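The weighting and error analysis described above follow the general pattern of weighted least-squares refinement: iteratively adjust the force constants so the computed observables match the measured ones, weighting each observation by its reliability. A minimal, generic sketch of one such step (hypothetical inputs, not the original computation):

```python
# One Gauss-Newton step of a weighted least-squares refinement.
import numpy as np

def refine_step(J, residuals, sigma):
    """J: (n_obs x n_params) Jacobian of computed observables w.r.t.
    the force constants; residuals: observed minus computed values;
    sigma: observational standard errors used as reliability weights.
    Returns the parameter update and the standard errors of the fit."""
    W = np.diag(1.0 / sigma**2)                   # weight by reliability
    JTWJ = J.T @ W @ J
    delta = np.linalg.solve(JTWJ, J.T @ W @ residuals)
    cov = np.linalg.inv(JTWJ)                     # covariance of parameters
    return delta, np.sqrt(np.diag(cov))           # update, standard errors
```

The covariance matrix returned here is also what yields the correlation coefficients of the force constants reported in the abstract.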