128 results for Operational validation


Relevance: 20.00%

Publisher:

Abstract:

The difference between cirrus emissivities at 8 and 11 μm is sensitive to the mean effective ice crystal size of the cirrus cloud, De. By using single scattering properties of ice crystals shaped as planar polycrystals, diameters of up to about 70 μm can be retrieved, instead of up to 45 μm assuming spheres or hexagonal columns. The method described in this article is used for a global determination of mean effective ice crystal sizes of cirrus clouds from TOVS satellite observations. A sensitivity study of the De retrieval to uncertainties in hypotheses on ice crystal shape, size distributions, and temperature profiles, as well as in vertical and horizontal cloud heterogeneities, shows that uncertainties can be as large as 30%. However, the TOVS data set is one of the few that provide global and long-term coverage. Analysis of the years 1987–1991 shows that the measured effective ice crystal diameters De are stable from year to year. For 1990 a global median De of 53.5 μm was determined. Averages distinguishing ocean/land, season, and latitude lie between 23 μm in winter over Northern Hemisphere midlatitude land and 64 μm in the tropics. In general, larger values of De are found in regions with higher atmospheric water vapor and for cirrus with a smaller effective emissivity.
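
The retrieval described above ultimately maps an emissivity difference to a crystal size. As a loose illustration of that inversion step (not the actual TOVS algorithm, and with hypothetical, made-up table values), a monotonic lookup table can be inverted by linear interpolation:

```python
# Illustrative sketch, not the actual TOVS retrieval: infer an effective
# ice crystal diameter De from the 8-minus-11-micron emissivity
# difference by inverting a monotonic lookup table.  The table values
# below are hypothetical placeholders, not retrieved coefficients.

def invert_lookup(delta_eps, table):
    """Linearly interpolate De from (delta_eps, De) pairs; De falls
    as the emissivity difference grows (smaller crystals)."""
    pts = sorted(table)                 # ascending in delta_eps
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    if delta_eps <= xs[0]:
        return ys[0]
    if delta_eps >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if delta_eps <= xs[i]:
            t = (delta_eps - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])

# Hypothetical table: emissivity difference -> De (micrometres)
TABLE = [(0.02, 70.0), (0.05, 50.0), (0.10, 30.0), (0.20, 15.0)]
de = invert_lookup(0.075, TABLE)
```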

Relevance: 20.00%

Abstract:

Heat waves are expected to increase in frequency and magnitude with climate change. The first part of a study to produce projections of the effect of future climate change on heat-related mortality is presented. Separate city-specific empirical statistical models that quantify significant relationships between summer daily maximum temperature (T max) and daily heat-related deaths are constructed from historical data for six cities: Boston, Budapest, Dallas, Lisbon, London, and Sydney. ‘Threshold temperatures’ above which heat-related deaths begin to occur are identified. The results demonstrate significantly lower thresholds in ‘cooler’ cities exhibiting lower mean summer temperatures than in ‘warmer’ cities exhibiting higher mean summer temperatures. Analysis of individual ‘heat waves’ illustrates that a greater proportion of mortality is due to mortality displacement in cities with less sensitive temperature–mortality relationships than in those with more sensitive relationships, and that mortality displacement is no longer a feature more than 12 days after the end of the heat wave. Validation techniques through residual and correlation analyses of modelled and observed values and comparisons with other studies indicate that the observed temperature–mortality relationships are represented well by each of the models. The models can therefore be used with confidence to examine future heat-related deaths under various climate change scenarios for the respective cities (presented in Part 2).
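
The 'threshold temperature' construction can be sketched as a hockey-stick fit: modelled heat deaths are zero below a candidate threshold and rise linearly above it, with the threshold chosen by grid search over candidates. The sketch below uses synthetic data and is not one of the paper's city-specific models.

```python
# Minimal sketch of a 'hockey-stick' temperature-mortality model: heat
# deaths are zero below a threshold and rise linearly above it.  The
# threshold is chosen by grid search to minimise squared error.  Data
# below are synthetic, for illustration only.

def fit_threshold(tmax, deaths, candidates):
    best = None
    for th in candidates:
        x = [max(0.0, t - th) for t in tmax]     # excess over threshold
        sxx = sum(v * v for v in x)
        if sxx == 0.0:
            continue                             # no days above threshold
        slope = sum(v * d for v, d in zip(x, deaths)) / sxx
        sse = sum((d - slope * v) ** 2 for v, d in zip(x, deaths))
        if best is None or sse < best[0]:
            best = (sse, th, slope)
    return best[1], best[2]

# Synthetic data generated from a true threshold of 25 C and slope 2
tmax = [20, 22, 24, 26, 28, 30, 32]
deaths = [0, 0, 0, 2, 6, 10, 14]
th, slope = fit_threshold(tmax, deaths, [c / 2 for c in range(40, 61)])
```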

Relevance: 20.00%

Abstract:

Global hydrological models (GHMs) simulate the land surface hydrologic dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature, and this paper provides a detailed description of the latest version. The main revisions include the following: (1) the model can be run for n repetitions, which provides more robust estimates of extreme hydrological behaviour; (2) the model can use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation; and (3) the model can now be forced with daily as well as monthly input climate data. We demonstrate the effect of each of these three revisions on simulated runoff relative to the model before the revisions were applied. Importantly, we show that when Mac-PDM.09 is forced with monthly input data, a negative runoff bias results relative to daily forcings for regions of the globe where the day-to-day variability in relative humidity is high. The runoff bias can be as large as −80% for a small selection of catchments, although the absolute magnitude of the bias may be small. As such, we recommend that future applications of Mac-PDM.09 using monthly climate forcings acknowledge the bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis demonstrating that simulated runoff is considerably more sensitive to the method of PE calculation than to perturbations in the soil moisture and field capacity parameters.
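
Revision (2), the stochastic disaggregation of a monthly total into daily amounts with a prescribed CV, can be illustrated with a gamma distribution, for which CV = 1/sqrt(shape). This is a simplified sketch, not Mac-PDM.09's actual disaggregation scheme.

```python
import random

# Simplified illustration (not Mac-PDM.09's actual scheme) of stochastic
# disaggregation of a monthly precipitation total into daily amounts with
# a prescribed coefficient of variation (CV), using a gamma distribution:
# for a gamma distribution, CV = 1/sqrt(shape).

def disaggregate(monthly_total, cv, n_days=30, seed=42):
    rng = random.Random(seed)
    shape = 1.0 / cv ** 2                # gamma shape parameter from CV
    draws = [rng.gammavariate(shape, 1.0) for _ in range(n_days)]
    scale = monthly_total / sum(draws)   # rescale so the days sum to the total
    return [d * scale for d in draws]

daily = disaggregate(90.0, cv=1.2)
```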

Relevance: 20.00%

Abstract:

The skill of numerical Lagrangian drifter trajectories in three numerical models is assessed by comparing these numerically obtained paths to the trajectories of drifting buoys in the real ocean. The skill assessment is performed using the two-sample Kolmogorov–Smirnov statistical test. To demonstrate the assessment procedure, it is applied to three different models of the Agulhas region. The test can either be performed using crossing positions of one-dimensional sections in order to test model performance in specific locations, or using the total two-dimensional data set of trajectories. The test yields four quantities: a binary decision of model skill, a confidence level which can be used as a measure of goodness-of-fit of the model, a test statistic which can be used to determine the sensitivity of the confidence level, and cumulative distribution functions that aid in the qualitative analysis. The ordering of models by their confidence levels is the same as the ordering based on the qualitative analysis, which suggests that the method is suited for model validation. Only one of the three models, a 1/10° two-way nested regional ocean model, might have skill in the Agulhas region. The other two models, a 1/2° global model and a 1/8° assimilative model, might have skill only on some sections in the region.
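
The two-sample Kolmogorov–Smirnov statistic at the heart of the assessment is the maximum distance between the two empirical cumulative distribution functions; here is a minimal pure-Python sketch, computing only the statistic D and not the confidence level:

```python
# Minimal sketch of the two-sample Kolmogorov-Smirnov statistic: the
# maximum distance between the two empirical cumulative distribution
# functions (ECDFs).  A full test would also convert D into a p-value.

def ks_two_sample(a, b):
    a, b = sorted(a), sorted(b)
    points = sorted(set(a) | set(b))
    d = 0.0
    for x in points:
        fa = sum(1 for v in a if v <= x) / len(a)  # ECDF of sample a at x
        fb = sum(1 for v in b if v <= x) / len(b)  # ECDF of sample b at x
        d = max(d, abs(fa - fb))
    return d

# Identical samples give D = 0; fully disjoint samples give D = 1.
d_same = ks_two_sample([1, 2, 3, 4], [1, 2, 3, 4])
d_disjoint = ks_two_sample([1, 2, 3], [10, 11, 12])
```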

Relevance: 20.00%

Abstract:

During the past 15 years, a number of initiatives have been undertaken at national level to develop ocean forecasting systems operating at regional and/or global scales. The co-ordination between these efforts has been organized internationally through the Global Ocean Data Assimilation Experiment (GODAE). The French MERCATOR project is one of the leading participants in GODAE. The MERCATOR systems routinely assimilate a variety of observations such as multi-satellite altimeter data, sea-surface temperature and in situ temperature and salinity profiles, focusing on high-resolution scales of the ocean dynamics. The assimilation strategy in MERCATOR is based on a hierarchy of methods of increasing sophistication including optimal interpolation, Kalman filtering and variational methods, which are progressively deployed through the Système d'Assimilation MERCATOR (SAM) series. SAM-1 is based on a reduced-order optimal interpolation which can be operated using ‘altimetry-only’ or ‘multi-data’ set-ups; it relies on the concept of separability, assuming that the correlations can be separated into a product of horizontal and vertical contributions. The second release, SAM-2, is being developed to include new features from the singular evolutive extended Kalman (SEEK) filter, such as three-dimensional, multivariate error modes and adaptivity schemes. The third one, SAM-3, considers variational methods such as the incremental four-dimensional variational algorithm. Most operational forecasting systems evaluated during GODAE are based on least-squares statistical estimation assuming Gaussian errors. In the framework of the EU MERSEA (Marine EnviRonment and Security for the European Area) project, research is being conducted to prepare the next-generation operational ocean monitoring and forecasting systems. The research effort will explore nonlinear assimilation formulations to overcome limitations of the current systems.
This paper provides an overview of the developments conducted in MERSEA with the SEEK filter, the Ensemble Kalman filter and the sequential importance re-sampling filter.
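
As a loose illustration of the ensemble methods surveyed here, the sketch below performs a minimal scalar Ensemble Kalman filter analysis step with perturbed observations. It is a generic textbook formulation, not the MERCATOR/SAM or MERSEA implementation.

```python
import random

# Minimal scalar Ensemble Kalman filter analysis step (illustrative only,
# not the MERCATOR/SAM implementation).  The state is directly observed
# (H = 1), and each member is updated with a perturbed observation.

def enkf_update(ensemble, obs, obs_var, seed=0):
    rng = random.Random(seed)
    n = len(ensemble)
    mean = sum(ensemble) / n
    pf = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # forecast variance
    k = pf / (pf + obs_var)                                # Kalman gain
    return [x + k * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

prior = [0.8, 1.1, 0.9, 1.2, 1.0]          # forecast ensemble, mean 1.0
posterior = enkf_update(prior, obs=2.0, obs_var=0.1)
```

With an observation well above the prior mean, the analysis ensemble is pulled toward the observation by the gain k.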

Relevance: 20.00%

Abstract:

Estimation of whole-grain (WG) food intake in epidemiological and nutritional studies is normally based on general diet FFQ, which are not designed to specifically capture WG intake. To estimate WG cereal intake, we developed a forty-three-item FFQ focused on cereal product intake over the past month. We validated this questionnaire against a 3-d weighed food record (3DWFR) in thirty-one subjects living in the French-speaking part of Switzerland (nineteen female and twelve male). Subjects completed the FFQ on day 1 (FFQ1), the 3DWFR between days 2 and 13 and the FFQ again on day 14 (FFQ2). The subjects provided a fasting blood sample within 1 week of FFQ2. Total cereal intake, total WG intake, intake of individual cereals, intake of different groups of cereal products and alkylresorcinol (AR) intake were calculated from both FFQ and the 3DWFR. Plasma AR, possible biomarkers for WG wheat and rye intake, were also analysed. The total WG intake for the 3DWFR, FFQ1 and FFQ2 was 26 (sd 22), 28 (sd 25) and 21 (sd 16) g/d, respectively. Mean plasma AR concentration was 55.8 (sd 26.8) nmol/l. FFQ1, FFQ2 and plasma AR were correlated with the 3DWFR (r 0.72, 0.81 and 0.57, respectively). Adjustment for age, sex, BMI and total energy intake did not affect the results. This FFQ appears to give a rapid and adequate estimate of WG cereal intake in free-living subjects.
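
The validation rests on Pearson correlations between the FFQ and the weighed food record; a minimal sketch of that computation, with toy data rather than the study's:

```python
# Pure-Python Pearson correlation coefficient, as used to compare FFQ
# estimates against the weighed food record.  Data below are toy values.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])   # perfectly linear -> r = 1
```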

Relevance: 20.00%

Abstract:

Building energy consumption (BEC) accounting and assessment is fundamental to building energy efficiency (BEE) development. The existing Chinese statistical yearbook has no specific item for BEC; the relevant data are scattered and mixed with the consumption of other industries, so only approximate BEC data can be derived from it. For BEC assessment, the caloric values of different energy carriers are conventionally adopted, and this methodology has yielded many useful conclusions for energy efficiency development. However, the traditional methodology considers only the quantity of energy and ignores its quality. An exergy methodology is therefore put forward to assess BEC, in which both the quantity and the quality of energy are taken into account. To illustrate BEC accounting and exergy assessment, the case of Chongqing in 2004 is presented. Based on the exergy analysis, BEC in Chongqing in 2004 accounted for 17.3% of total energy consumption, a result quite close to that of the traditional methodology. For energy supply efficiency, however, the difference is marked: 0.417 by the exergy methodology versus 0.645 by the traditional one.
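
The gap between the two efficiency figures comes from weighting heat by its quality: exergy discounts heat by the Carnot factor 1 − T0/T, so low-temperature heat supplied from high-grade energy scores poorly. The numbers below are illustrative only, not the Chongqing data.

```python
# Sketch of the exergy idea behind the comparison above: energy
# accounting counts heat at face value, while exergy weights it by the
# Carnot factor (1 - T0/T).  Values are illustrative, not from the study.

def exergy_of_heat(q, t_source, t0=298.15):
    """Exergy content of heat q delivered at temperature t_source (K),
    relative to an ambient reference temperature t0 (K)."""
    return q * (1.0 - t0 / t_source)

# 100 units of electricity (pure exergy) used for space heating at 313 K:
heat = 100.0                        # energy delivered; energy efficiency 1.0
ex = exergy_of_heat(heat, 313.15)   # exergy delivered is far smaller
```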

Relevance: 20.00%

Abstract:

In the reliability literature, maintenance time is usually ignored when maintenance policies are optimized. In some scenarios, however, the cost of a system failure varies with time, and ignoring maintenance time leads to unrealistic results. This paper develops maintenance policies for such situations, in which the system under study alternates between two states: up and down. The cost of a system failure in the up state consists of both business losses and maintenance costs, whereas in the down state it includes only maintenance costs. We consider three models. Model A performs corrective maintenance (CM) only. Model B performs sequential imperfect preventive maintenance (PM) together with CM. Model C performs periodic PM together with CM, where the PM restores the system to the state just after the latest CM. The CM in this paper is imperfect repair. Finally, the impact of these maintenance policies is illustrated through numerical examples.
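
As a loose illustration of optimizing a PM interval (a textbook periodic-PM-with-minimal-repair model, not the paper's exact Models A–C), assume Weibull failures so the expected number of failures in (0, T] is (T/η)^β, and pick the interval that minimizes the long-run cost rate:

```python
# Illustrative cost-rate calculation for a textbook periodic-PM policy
# with minimal repair (a simplified stand-in, not the paper's exact
# formulation): with a Weibull failure intensity, the expected number of
# failures in (0, T] is (T/eta)**beta, and the long-run cost rate is
# (c_pm + c_cm * expected_failures) / T.

def cost_rate(t, c_pm, c_cm, beta, eta):
    expected_failures = (t / eta) ** beta
    return (c_pm + c_cm * expected_failures) / t

# Scan candidate PM intervals for the cheapest one.
candidates = [t / 10 for t in range(5, 101)]
best_t = min(candidates, key=lambda t: cost_rate(t, c_pm=1.0, c_cm=5.0,
                                                 beta=2.0, eta=10.0))
```

For these parameters the cost rate is 1/T + 0.05T, minimized near T = sqrt(20) ≈ 4.47, so the grid search lands on 4.5.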

Relevance: 20.00%

Abstract:

An improved method for the detection of pressed hazelnut oil in admixtures with virgin olive oil by analysis of polar components is described. The method, which is based on SPE isolation of the polar fraction followed by RP-HPLC analysis with UV detection, is able to detect virgin olive oil adulterated with pressed hazelnut oil at levels as low as 5%, with good accuracy (90.0 ± 4.2% recovery of internal standard), reproducibility (4.7% RSD) and linearity (R² = 0.9982 over the 5–40% adulteration range). An international ring-test of the developed method highlighted its capability: 80% of the samples were, on average, correctly identified even though no training samples were provided to the participating laboratories. However, the large variability in marker components among the pressed hazelnut oils examined prevents the use of the method for quantification of the level of adulteration.
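
The repeatability figure quoted (4.7% RSD) is a relative standard deviation over replicate determinations; a minimal sketch with made-up replicate values:

```python
# Relative standard deviation (RSD) of replicate measurements, expressed
# as a percentage of the mean, as quoted for the method's repeatability.
# The replicate values below are made up for illustration.

def rsd_percent(values):
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5  # sample sd
    return 100.0 * sd / mean

replicates = [5.1, 4.8, 5.0, 5.3, 4.9]
rsd = rsd_percent(replicates)
```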

Relevance: 20.00%

Abstract:

An unstructured mathematical model is proposed to describe the fermentation kinetics of growth, lactic acid production, pH and sugar consumption by Lactobacillus plantarum as a function of the buffering capacity and initial glucose concentration of the culture media. Initially, the experimental data of L. plantarum fermentations in synthetic media with different buffering capacities and glucose concentrations were fitted to a set of primary models. The parameters obtained from these models were then used to establish mathematical relationships with the independent variables tested. The models were validated with six fermentations of L. plantarum in different cereal-based media. In most cases the proposed models adequately describe the biochemical changes taking place during fermentation and are a promising approach for the formulation of cereal-based probiotic foods.
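
A common choice of 'primary model' for growth data of this kind is the logistic curve; the sketch below is generic, not the paper's fitted model, and the parameter values are arbitrary.

```python
import math

# A generic logistic 'primary model' for microbial growth (illustrative,
# not the paper's fitted model): N(t) grows from n0 toward the carrying
# capacity nmax at maximum specific growth rate mu.

def logistic(t, n0, nmax, mu):
    return nmax / (1.0 + (nmax / n0 - 1.0) * math.exp(-mu * t))

# Arbitrary parameters: start at 1e6 CFU/ml, plateau at 1e9 CFU/ml.
n_start = logistic(0.0, n0=1e6, nmax=1e9, mu=0.5)
n_late = logistic(48.0, n0=1e6, nmax=1e9, mu=0.5)
```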

Relevance: 20.00%

Abstract:

Objective: To examine the properties of the Social Communication Questionnaire (SCQ) in a population cohort of children with autism spectrum disorders (ASDs) and in the general population. Method: SCQ data were collected from three samples: the Special Needs and Autism Project (SNAP) cohort of 9- to 10-year-old children with special educational needs with and without ASD, and two similar but separate age groups of children from the general population (n = 411 and n = 247). Diagnostic assessments were completed on a stratified subsample (n = 255) of the special educational needs group. A sample-weighting procedure enabled us to estimate characteristics of the SCQ in the total ASD population. The diagnostic status of cases in the general population samples was extracted from child health records. Results: The SCQ showed strong discrimination between ASD and non-ASD cases (sensitivity 0.88, specificity 0.72) and between autism and nonautism cases (sensitivity 0.90, specificity 0.86). Findings were not affected by child IQ or parental education. In the general population samples between 4% and 5% of children scored above the ASD cutoff, including 1.5% who scored above the autism cutoff. Although many of these high-scoring children had an ASD diagnosis, almost all (~90%) of them had a diagnosed neurodevelopmental disorder. Conclusions: This study confirms the utility of the SCQ as a first-level screen for ASD in at-risk samples of school-age children.
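
The sensitivity and specificity figures quoted derive from a 2×2 table of screen result against diagnostic status; a minimal sketch follows (the counts below are made up, chosen only to reproduce the quoted ASD rates):

```python
# Sensitivity and specificity from a 2x2 screening table: tp/fn are cases
# above/below the cutoff, tn/fp are non-cases below/above it.  The counts
# are made up, chosen only to reproduce the quoted ASD figures.

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # proportion of true cases screened positive
    specificity = tn / (tn + fp)   # proportion of non-cases screened negative
    return sensitivity, specificity

sens, spec = sens_spec(tp=88, fn=12, tn=72, fp=28)
```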

Relevance: 20.00%

Abstract:

Typically, algorithms for generating stereo disparity maps have been developed to minimise the energy equation of a single image. This paper proposes a method for implementing cross validation in a belief propagation optimisation. When tested using the Middlebury online stereo evaluation, the cross validation improves upon the results of standard belief propagation. Furthermore, it has been shown that regions of homogeneous colour within the images can be used for enforcing the so-called "Segment Constraint". Developing from this, Segment Support is introduced to boost belief between pixels of the same image region and improve propagation into textureless regions.
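
A common concrete form of cross validation in stereo matching is the left-right consistency check: a left-image disparity is kept only if the right image's disparity at the matched column agrees within a tolerance. The sketch below illustrates that idea and is not the paper's exact formulation.

```python
# Left-right consistency check on 1-D disparity rows (illustrative, not
# the paper's exact cross-validation scheme): a left disparity d at
# column x is valid only if the right disparity at column x - d agrees
# within a tolerance.

def cross_check(disp_left, disp_right, tol=1):
    valid = []
    for x, d in enumerate(disp_left):
        xr = x - d                        # matched column in the right image
        ok = 0 <= xr < len(disp_right) and abs(disp_right[xr] - d) <= tol
        valid.append(ok)
    return valid

left = [0, 1, 2, 2, 2]
right = [0, 0, 1, 2, 2]
mask = cross_check(left, right)
```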

Relevance: 20.00%

Abstract:

The ability of four operational weather forecast models [ECMWF, Action de Recherche Petite Echelle Grande Echelle model (ARPEGE), Regional Atmospheric Climate Model (RACMO), and Met Office] to generate a cloud at the right location and time (the cloud frequency of occurrence) is assessed in the present paper using a two-year time series of observations collected by profiling ground-based active remote sensors (cloud radar and lidar) located at three different sites in western Europe (Cabauw, Netherlands; Chilbolton, United Kingdom; and Palaiseau, France). Particular attention is given to potential biases that may arise from instrumentation differences (especially sensitivity) from one site to another and intermittent sampling. In a second step the statistical properties of the cloud variables involved in most advanced cloud schemes of numerical weather forecast models (ice water content and cloud fraction) are characterized and compared with their counterparts in the models. The two years of observations are first considered as a whole in order to evaluate the accuracy of the statistical representation of the cloud variables in each model. It is shown that all models tend to produce too many high-level clouds, with too-high cloud fraction and ice water content. The midlevel and low-level cloud occurrence is also generally overestimated, with too-low cloud fraction but a correct ice water content. The dataset is then divided into seasons to evaluate the potential of the models to generate different cloud situations in response to different large-scale forcings. Strong variations in cloud occurrence are found in the observations from one season to the same season the following year as well as in the seasonal cycle. Overall, the model biases observed using the whole dataset are still found at seasonal scale, but the models generally manage to well reproduce the observed seasonal variations in cloud occurrence.
Overall, models do not generate the same cloud fraction distributions and these distributions do not agree with the observations. Another general conclusion is that the use of continuous ground-based radar and lidar observations is definitely a powerful tool for evaluating model cloud schemes and for a responsive assessment of the benefit achieved by changing or tuning a model cloud scheme.
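
The basic evaluation statistic used throughout, cloud frequency of occurrence, is simply the fraction of profiles in which cloud is detected; a minimal sketch with made-up boolean series:

```python
# Cloud frequency of occurrence: the fraction of profiles in which a
# cloud is detected at a given level, compared between a model and
# observations.  The boolean series below are made up for illustration.

def frequency_of_occurrence(cloud_flags):
    return sum(cloud_flags) / len(cloud_flags)

obs =   [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]   # radar/lidar cloud detections
model = [1, 1, 0, 1, 1, 0, 1, 0, 1, 0]   # model cloud at matched times
bias = frequency_of_occurrence(model) - frequency_of_occurrence(obs)
```

A positive bias, as here, corresponds to the overestimated cloud occurrence reported above.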