866 results for Statistic validation
Abstract:
Urban flood inundation models require considerable data for their parameterisation, calibration and validation. TerraSAR-X should be suitable for urban flood detection because of its high resolution in stripmap/spotlight modes. The paper describes ongoing work on a project to assess how well TerraSAR-X can detect flooded regions in urban areas, and how well these can constrain the parameters of an urban flood model. The study uses a TerraSAR-X image of a 1-in-150 year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SETES SAR simulator was used in conjunction with LiDAR data to estimate regions of the image in which water would not be visible due to shadow or layover caused by buildings and vegetation. An algorithm for the delineation of flood water in urban areas is described, together with its validation using the aerial photographs.
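To make the delineation step concrete, the following Python sketch (not the authors' algorithm) masks out shadow/layover regions predicted by a SAR simulator, classifies the remaining low-backscatter pixels as flood water, and scores the result against a reference map such as one derived from aerial photography. The array names and the -14 dB threshold are illustrative assumptions.

import numpy as np

def delineate_flood(backscatter_db, shadow_layover_mask, water_threshold_db=-14.0):
    # Water appears dark in SAR; classify only pixels not hidden by shadow/layover.
    visible = ~shadow_layover_mask
    return (backscatter_db < water_threshold_db) & visible

def contingency_scores(predicted, reference, valid):
    # Hit rate and false-alarm ratio over pixels where the comparison is meaningful.
    hits = np.sum(predicted & reference & valid)
    misses = np.sum(~predicted & reference & valid)
    false_alarms = np.sum(predicted & ~reference & valid)
    hit_rate = hits / max(hits + misses, 1)
    false_alarm_ratio = false_alarms / max(hits + false_alarms, 1)
    return hit_rate, false_alarm_ratio

# Toy data standing in for a TerraSAR-X scene and an aerial-photo flood map.
rng = np.random.default_rng(0)
sar_db = rng.normal(-10.0, 3.0, size=(100, 100))      # backscatter (dB)
mask = rng.random((100, 100)) < 0.15                  # simulated shadow/layover
reference = sar_db < -13.0                            # stand-in validation map
flood = delineate_flood(sar_db, mask)
print(contingency_scores(flood, reference, ~mask))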
Abstract:
An operational dust forecasting model is developed by including the Met Office Hadley Centre climate model dust parameterization scheme within a Met Office regional numerical weather prediction (NWP) model. The model includes parameterizations for dust uplift, dust transport, and dust deposition in six discrete size bins and provides diagnostics such as the aerosol optical depth. The results are compared against surface and satellite remote sensing measurements and against in situ measurements from the Facility for Airborne Atmospheric Measurements for a case study when a strong dust event was forecast. Comparisons are also performed against satellite and surface instrumentation for the entire month of August. The case study shows that this Saharan dust NWP model can provide very good guidance on dust events, as much as 42 h ahead. The analysis of monthly data suggests that the mean and variability in the dust model are also well represented.
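As an illustration of the kind of forecast-versus-observation comparison described (not the operational verification system itself), the sketch below summarises the bias and RMSE of modelled aerosol optical depth (AOD) against co-located retrievals; the arrays are synthetic placeholders.

import numpy as np

def verify_aod(model_aod, observed_aod):
    # Mean bias and root-mean-square error of the forecast AOD.
    diff = model_aod - observed_aod
    return float(diff.mean()), float(np.sqrt((diff ** 2).mean()))

rng = np.random.default_rng(1)
observed = rng.gamma(shape=2.0, scale=0.3, size=200)              # stand-in AOD retrievals
forecast = observed + rng.normal(0.02, 0.1, size=observed.size)   # stand-in 42 h forecast
bias, rmse = verify_aod(forecast, observed)
print(f"bias = {bias:.3f}, rmse = {rmse:.3f}")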
Abstract:
Results are presented from a new web application called OceanDIVA - Ocean Data Intercomparison and Visualization Application. This tool reads hydrographic profiles and ocean model output and presents the data on either depth levels or isotherms for viewing in Google Earth, or as probability density functions (PDFs) of regional model-data misfits. As part of the CLIVAR Global Synthesis and Observations Panel, an intercomparison of water mass properties of various ocean syntheses has been undertaken using OceanDIVA. Analysis of model-data misfits reveals significant differences between the water mass properties of the syntheses, such as the ability to capture mode water properties.
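A minimal sketch of the misfit-PDF idea, assuming the model output has already been co-located with the hydrographic profiles; the arrays are invented and the binning choices are arbitrary.

import numpy as np

def misfit_pdf(model_temp, observed_temp, bins=40, value_range=(-3.0, 3.0)):
    # Probability density function of model-minus-observation temperature misfits.
    misfit = model_temp - observed_temp
    density, edges = np.histogram(misfit, bins=bins, range=value_range, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])
    return centres, density

rng = np.random.default_rng(2)
observed = rng.normal(18.0, 1.5, size=500)                    # hydrographic profiles (degC)
model = observed + rng.normal(0.4, 0.6, size=observed.size)   # synthesis with a warm bias
centres, density = misfit_pdf(model, observed)
print(centres[np.argmax(density)])                            # most probable misfit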
Abstract:
We describe a new methodology for comparing satellite radiation budget data with a numerical weather prediction (NWP) model. This is applied to data from the Geostationary Earth Radiation Budget (GERB) instrument on Meteosat-8. The methodology brings together, in near-real time, GERB broadband shortwave and longwave fluxes with simulations based on analyses produced by the Met Office global NWP model. Results for the period May 2003 to February 2005 illustrate the progressive improvements in the data products as various initial problems were resolved. In most areas the comparisons reveal systematic errors in the model's representation of surface properties and clouds, which are discussed elsewhere. However, for clear-sky regions over the oceans the model simulations are believed to be sufficiently accurate to allow the quality of the GERB fluxes themselves to be assessed and any changes in time of the performance of the instrument to be identified. Using model and radiosonde profiles of temperature and humidity as input to a single-column version of the model's radiation code, we conduct sensitivity experiments which provide estimates of the expected model errors over the ocean of about ±5–10 W m⁻² in clear-sky outgoing longwave radiation (OLR) and ±0.01 in clear-sky albedo. For the more recent data the differences between the observed and modeled OLR and albedo are well within these error estimates. The close agreement between the observed and modeled values, particularly for the most recent period, illustrates the value of the methodology. It also contributes to the validation of the GERB products and increases confidence in the quality of the data, prior to their release.
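The comparison arithmetic implied above can be sketched as follows: difference the observed and modelled clear-sky quantities over ocean and check the mean against the quoted error estimates (about ±5–10 W m⁻² for OLR and ±0.01 for albedo). The data below are synthetic placeholders, not GERB or Met Office output.

import numpy as np

def mean_difference_within(observed, modelled, tolerance):
    # Mean observed-minus-modelled difference and whether it lies within tolerance.
    mean_diff = float(np.mean(observed - modelled))
    return mean_diff, abs(mean_diff) <= tolerance

rng = np.random.default_rng(3)
olr_model = rng.normal(285.0, 8.0, size=1000)                    # clear-sky OLR (W m^-2)
olr_obs = olr_model + rng.normal(2.0, 4.0, size=1000)
albedo_model = rng.normal(0.07, 0.01, size=1000)                 # clear-sky ocean albedo
albedo_obs = albedo_model + rng.normal(0.003, 0.004, size=1000)
print(mean_difference_within(olr_obs, olr_model, tolerance=10.0))
print(mean_difference_within(albedo_obs, albedo_model, tolerance=0.01))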
Abstract:
Bone density (BSG) increases at the osteon scale as bone ages during an individual's lifetime. In addition, post-mortem diagenetic change due to microbial attack produces denser bioapatite. Thus, fractionation of finely powdered bone on the basis of density should not only enable younger and older populations of osteons to be separated but also make it possible to separate out a less diagenetically altered component. We show that the density fractionation method can be used as a tool to investigate the isotopic history within an individual's lifetime, both in recent and archaeological contexts, and we use the atmospheric bomb C-14 pulse to validate the method.
Abstract:
This article describes an empirical, user-centred approach to explanation design. It reports three studies that investigate what patients want to know when they have been prescribed medication. The question is asked in the context of the development of a drug prescription system called OPADE. The system is aimed primarily at improving the prescribing behaviour of physicians, but will also produce written explanations for indirect users such as patients. In the first study, a large number of people were presented with a scenario about a visit to the doctor, and were asked to list the questions that they would like to ask the doctor about the prescription. On the basis of the results of the study, a categorization of question types was developed in terms of how frequently particular questions were asked. In the second and third studies a number of different explanations were generated in accordance with this categorization, and a new sample of people were presented with another scenario and were asked to rate the explanations on a number of dimensions. The results showed significant differences between the different explanations. People preferred explanations that included items corresponding to frequently asked questions in study 1. For an explanation to be considered useful, it had to include information about side effects, what the medication does, and any lifestyle changes involved. The implications of the results of the three studies are discussed in terms of the development of OPADE's explanation facility.
Abstract:
Real-time rainfall monitoring in Africa is of great practical importance for operational applications in hydrology and agriculture. Satellite data have been used in this context for many years because of the lack of surface observations. This paper describes an improved artificial neural network algorithm for operational applications. The algorithm combines numerical weather model information with the satellite data. Using this algorithm, daily rainfall estimates were derived for 4 yr of the Ethiopian and Zambian main rainy seasons and were compared with two other algorithms: a multiple linear regression making use of the same information as the neural network, and a satellite-only method. All algorithms were validated against rain gauge data. Overall, the neural network performs best, but the extent to which it does so depends on the calibration/validation protocol. The advantages of the neural network are most evident when calibration data are numerous and close in space and time to the validation data. This result emphasizes the importance of a real-time calibration system.
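For illustration only (this is not the paper's operational algorithm), the sketch below compares a small neural network with a multiple linear regression trained on the same two predictors and validated against gauge totals; the predictor names and the synthetic data are assumptions.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
n = 2000
cold_cloud = rng.uniform(0, 24, n)            # hypothetical satellite predictor (h)
nwp_moisture = rng.uniform(0.2, 1.0, n)       # hypothetical NWP predictor
gauge = np.maximum(0.0, 2.5 * cold_cloud * nwp_moisture + rng.normal(0, 5, n))

X = np.column_stack([cold_cloud, nwp_moisture])
X_cal, X_val, y_cal, y_val = train_test_split(X, gauge, random_state=0)

for name, model in [("linear regression", LinearRegression()),
                    ("neural network", MLPRegressor(hidden_layer_sizes=(8,),
                                                    max_iter=2000, random_state=0))]:
    model.fit(X_cal, y_cal)
    rmse = float(np.sqrt(np.mean((model.predict(X_val) - y_val) ** 2)))
    print(f"{name}: validation RMSE = {rmse:.2f} mm/day")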
Abstract:
Heat waves are expected to increase in frequency and magnitude with climate change. The first part of a study to produce projections of the effect of future climate change on heat-related mortality is presented. Separate city-specific empirical statistical models that quantify significant relationships between summer daily maximum temperature (Tmax) and daily heat-related deaths are constructed from historical data for six cities: Boston, Budapest, Dallas, Lisbon, London, and Sydney. ‘Threshold temperatures’ above which heat-related deaths begin to occur are identified. The results demonstrate significantly lower thresholds in ‘cooler’ cities exhibiting lower mean summer temperatures than in ‘warmer’ cities exhibiting higher mean summer temperatures. Analysis of individual ‘heat waves’ illustrates that a greater proportion of mortality is due to mortality displacement in cities with less sensitive temperature–mortality relationships than in those with more sensitive relationships, and that mortality displacement is no longer a feature more than 12 days after the end of the heat wave. Validation techniques through residual and correlation analyses of modelled and observed values and comparisons with other studies indicate that the observed temperature–mortality relationships are represented well by each of the models. The models can therefore be used with confidence to examine future heat-related deaths under various climate change scenarios for the respective cities (presented in Part 2).
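A minimal sketch of such a threshold model, using invented data: heat-related deaths increase linearly with daily maximum temperature only above a threshold, and the threshold is chosen by scanning candidate values for the best least-squares fit. The threshold, slope and noise levels below are illustrative, not values from the study.

import numpy as np

def fit_threshold_model(tmax, deaths, candidate_thresholds):
    # For each candidate, fit deaths = slope * max(Tmax - threshold, 0) + intercept
    # and keep the threshold with the smallest residual sum of squares.
    best = None
    for thr in candidate_thresholds:
        excess = np.maximum(tmax - thr, 0.0)
        design = np.column_stack([excess, np.ones_like(excess)])
        coeffs, rss, *_ = np.linalg.lstsq(design, deaths, rcond=None)
        score = rss[0] if rss.size else float(np.sum((design @ coeffs - deaths) ** 2))
        if best is None or score < best[0]:
            best = (score, thr, coeffs[0], coeffs[1])
    return best[1], best[2], best[3]      # threshold, slope, intercept

rng = np.random.default_rng(5)
tmax = rng.uniform(18.0, 40.0, 1000)                          # daily summer Tmax (degC)
deaths = 3.0 + 1.8 * np.maximum(tmax - 27.0, 0.0) + rng.normal(0.0, 1.0, tmax.size)
print(fit_threshold_model(tmax, deaths, np.arange(20.0, 35.0, 0.5)))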
Abstract:
Global hydrological models (GHMs) model the land surface hydrologic dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature. This paper serves to provide a detailed description of the latest version of the model. The main revisions include the following: (1) the ability to run the model for n repetitions, which provides more robust estimates of extreme hydrological behaviour, (2) the ability of the model to use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation, and (3) the ability to force the model with daily input climate data as well as monthly input climate data. We demonstrate the effects that each of these three revisions has on simulated runoff relative to before the revisions were applied. Importantly, we show that when Mac-PDM.09 is forced with monthly input data, it results in a negative runoff bias relative to when daily forcings are applied, for regions of the globe where the day-to-day variability in relative humidity is high. The runoff bias can be up to −80% for a small selection of catchments, but the absolute magnitude of the bias may be small. As such, we recommend that future applications of Mac-PDM.09 that use monthly climate forcings acknowledge the bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis that demonstrates that simulated runoff is considerably more sensitive to the method of PE calculation than to perturbations in soil moisture and field capacity parameters.
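The stochastic disaggregation in revision (2), combined with the n repetitions of revision (1), can be illustrated as follows; this is a conceptual sketch using a gamma distribution parameterised by the CV, not the Mac-PDM.09 source code, and the monthly total and CV values are invented.

import numpy as np

def disaggregate_month(monthly_total_mm, cv, days=30, n_repetitions=20, seed=0):
    # Draw daily amounts from a gamma distribution whose CV matches the gridded field,
    # then rescale each repetition so the days sum to the monthly total.
    rng = np.random.default_rng(seed)
    shape = 1.0 / cv ** 2                       # gamma distribution has CV = 1/sqrt(shape)
    daily = rng.gamma(shape, 1.0, size=(n_repetitions, days))
    return daily * (monthly_total_mm / daily.sum(axis=1, keepdims=True))

daily = disaggregate_month(monthly_total_mm=120.0, cv=1.4)
print(daily.sum(axis=1)[:3])     # each repetition preserves the monthly total
print(daily.max())               # spread of extreme daily rainfall across repetitions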
Abstract:
Estimation of whole-grain (WG) food intake in epidemiological and nutritional studies is normally based on general diet FFQ, which are not designed to specifically capture WG intake. To estimate WG cereal intake, we developed a forty-three-item FFQ focused on cereal product intake over the past month. We validated this questionnaire against a 3-d weighed food record (3DWFR) in thirty-one subjects living in the French-speaking part of Switzerland (nineteen female and twelve male). Subjects completed the FFQ on day 1 (FFQ1), the 3DWFR between days 2 and 13, and the FFQ again on day 14 (FFQ2). The subjects provided a fasting blood sample within 1 week of FFQ2. Total cereal intake, total WG intake, intake of individual cereals, intake of different groups of cereal products and alkylresorcinol (AR) intake were calculated from both FFQ and the 3DWFR. Plasma AR, possible biomarkers for WG wheat and rye intake, were also analysed. The total WG intake for the 3DWFR, FFQ1 and FFQ2 was 26 (sd 22), 28 (sd 25) and 21 (sd 16) g/d, respectively. Mean plasma AR concentration was 55.8 (sd 26.8) nmol/l. FFQ1, FFQ2 and plasma AR were correlated with the 3DWFR (r 0.72, 0.81 and 0.57, respectively). Adjustment for age, sex, BMI and total energy intake did not affect the results. This FFQ appears to give a rapid and adequate estimate of WG cereal intake in free-living subjects.
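The validation statistics can be sketched as below with simulated subjects: a crude Pearson correlation between FFQ and weighed-record intake, and the same correlation after adjusting both measures for age, sex, BMI and energy intake by taking residuals from a linear regression on those covariates. All numbers are invented, not the study data.

import numpy as np
from scipy.stats import pearsonr

def residualise(values, covariates):
    # Remove the linear effect of the covariates and return the residuals.
    X = np.column_stack([np.ones(len(values)), covariates])
    beta, *_ = np.linalg.lstsq(X, values, rcond=None)
    return values - X @ beta

rng = np.random.default_rng(6)
n = 31
true_wg = rng.gamma(2.0, 13.0, n)                          # 'true' WG intake (g/d)
ffq = true_wg + rng.normal(0.0, 8.0, n)                    # hypothetical FFQ estimate
record = true_wg + rng.normal(0.0, 6.0, n)                 # hypothetical 3DWFR estimate
covariates = np.column_stack([rng.uniform(20, 60, n),      # age
                              rng.integers(0, 2, n),       # sex
                              rng.normal(24, 3, n),         # BMI
                              rng.normal(9000, 1500, n)])   # energy intake (kJ/d)

print("crude r:", pearsonr(ffq, record)[0])
print("adjusted r:", pearsonr(residualise(ffq, covariates),
                              residualise(record, covariates))[0])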
Abstract:
An improved method for the detection of pressed hazelnut oil in admixtures with virgin olive oil by analysis of polar components is described. The method, which is based on SPE isolation of the polar fraction followed by RP-HPLC analysis with UV detection, is able to detect virgin olive oil adulterated with pressed hazelnut oil at levels as low as 5%, with good accuracy (90.0 ± 4.2% recovery of internal standard), reproducibility (4.7% RSD) and linearity (R² = 0.9982 over the 5-40% adulteration range). An international ring-test of the developed method highlighted its capability, as 80% of the samples were, on average, correctly identified despite the fact that no training samples were provided to the participating laboratories. However, the large variability in marker components among the pressed hazelnut oils examined prevents the use of the method for quantification of the level of adulteration.
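Purely as an illustration of the linearity figure quoted above (with invented numbers, not the published data), the sketch below regresses a hypothetical marker component's peak area, normalised to the internal standard, against the percentage of added hazelnut oil and reports R² over the 5-40% range.

import numpy as np
from scipy.stats import linregress

# Calibration mixtures: % pressed hazelnut oil added to virgin olive oil.
adulteration_pct = np.array([5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
rng = np.random.default_rng(7)
# Hypothetical marker peak area relative to the internal standard.
relative_peak_area = 0.021 * adulteration_pct + 0.05 + rng.normal(0.0, 0.01, adulteration_pct.size)

fit = linregress(adulteration_pct, relative_peak_area)
print(f"R^2 = {fit.rvalue ** 2:.4f}")                        # linearity over 5-40%
unknown_area = 0.30                                          # hypothetical sample
print("estimated adulteration %:", (unknown_area - fit.intercept) / fit.slope)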