18 results for VALIDATION STUDIES

in CentAUR: Central Archive University of Reading - UK


Relevance:

60.00%

Publisher:

Abstract:

ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions. The observing system changed considerably over this re-analysis period, with assimilable data provided by a succession of satellite-borne instruments from the 1970s onwards, supplemented by increasing numbers of observations from aircraft, ocean buoys and other surface platforms, but with a declining number of radiosonde ascents since the late 1980s. The observations used in ERA-40 were accumulated from many sources. The first part of this paper describes the data acquisition and the principal changes in data type and coverage over the period. It also describes the data assimilation system used for ERA-40. This benefited from many of the changes introduced into operational forecasting since the mid-1990s, when the systems used for the 15-year ECMWF re-analysis (ERA-15) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) re-analysis were implemented. Several of the improvements are discussed. General aspects of the production of the analyses are also summarized. A number of results indicative of the overall performance of the data assimilation system, and implicitly of the observing system, are presented and discussed. The comparison of background (short-range) forecasts and analyses with observations, the consistency of the global mass budget, the magnitude of differences between analysis and background fields and the accuracy of medium-range forecasts run from the ERA-40 analyses are illustrated. Several results demonstrate the marked improvement that was made to the observing system for the southern hemisphere in the 1970s, particularly towards the end of the decade. In contrast, the synoptic quality of the analysis for the northern hemisphere is sufficient to provide forecasts that remain skilful well into the medium range for all years. Two particular problems are also examined: excessive precipitation over tropical oceans and a too strong Brewer-Dobson circulation, both of which are pronounced in later years. Several other aspects of the quality of the re-analyses revealed by monitoring and validation studies are summarized. Expectations that the second-generation ERA-40 re-analysis would provide products that are better than those from the first-generation ERA-15 and NCEP/NCAR re-analyses are found to have been met in most cases. © Royal Meteorological Society, 2005. The contributions of N. A. Rayner and R. W. Saunders are Crown copyright.

Relevance:

60.00%

Publisher:

Abstract:

In order to validate the reported precision of space‐based atmospheric composition measurements, validation studies often focus on measurements in the tropical stratosphere, where natural variability is weak. The scatter in tropical measurements can then be used as an upper limit on single‐profile measurement precision. Here we introduce a method of quantifying the scatter of tropical measurements which aims to minimize the effects of short‐term atmospheric variability while maintaining large enough sample sizes that the results can be taken as representative of the full data set. We apply this technique to measurements of O3, HNO3, CO, H2O, NO, NO2, N2O, CH4, CCl2F2, and CCl3F produced by the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE‐FTS). Tropical scatter in the ACE‐FTS retrievals is found to be consistent with the reported random errors (RREs) for H2O and CO at altitudes above 20 km, validating the RREs for these measurements. Tropical scatter in measurements of NO, NO2, CCl2F2, and CCl3F is roughly consistent with the RREs as long as the effect of outliers in the data set is reduced through the use of robust statistics. The scatter in measurements of O3, HNO3, CH4, and N2O in the stratosphere, while larger than the RREs, is shown to be consistent with the variability simulated in the Canadian Middle Atmosphere Model. This result implies that, for these species, stratospheric measurement scatter is dominated by natural variability, not random error, which provides added confidence in the scientific value of single‐profile measurements.
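
To make the robust-statistics idea concrete, here is a minimal Python sketch of how the scatter of tropical measurements can be compared against a reported random error. The MAD-based estimator, the 1.4826 scaling and all numbers are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def robust_scatter(values):
    """Outlier-resistant scatter estimate: scaled median absolute deviation.

    For Gaussian data, 1.4826 * MAD converges to the standard deviation,
    so the result is directly comparable to a reported 1-sigma random error.
    """
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    return 1.4826 * np.median(np.abs(values - med))

# Hypothetical sample: 200 tropical retrievals at one altitude level,
# drawn at random purely for demonstration.
rng = np.random.default_rng(0)
retrievals_ppmv = rng.normal(loc=7.0, scale=0.25, size=200)
retrievals_ppmv[:5] += 3.0            # a few outliers, as in real data sets

reported_random_error = 0.25          # assumed 1-sigma RRE (ppmv)

scatter = robust_scatter(retrievals_ppmv)
print(f"robust scatter = {scatter:.3f} ppmv vs RRE = {reported_random_error} ppmv")
# Scatter well above the RRE points to natural variability (or an
# underestimated error); scatter near the RRE is consistent with the RRE.
```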

Relevance:

60.00%

Publisher:

Abstract:

Bioaccessibility studies have been widely used as a research tool to determine the potential human exposure to ingested contaminants. More recently they have been applied in practice to soil-borne toxic elements. This paper reviews the application of bioaccessibility tests across a range of organic pollutants and contaminated matrices. Important factors are reported to be: the physiological relevance of the test, the components in the gut media, the size fraction chosen for the test and whether it contains a sorptive sink. The bioaccessibility is also a function of the composition of the matrix (e.g. the organic carbon content of soils) and the physico-chemical characteristics of the pollutant under test. Despite the widespread use of these tests, a large number of formats are in use and there are very few validation studies with animal models. We propose a unified format for a bioaccessibility test for organic pollutants. The robustness of this test should first be confirmed through inter-laboratory comparison, then tested in vivo.

Relevance:

40.00%

Publisher:

Abstract:

Changes in the depth of Lake Viljandi between 1940 and 1990 were simulated using a lake water and energy-balance model driven by standard monthly weather data. Catchment runoff was simulated using a one-dimensional hydrological model, with a two-layer soil, a single-layer snowpack, a simple representation of vegetation cover and similarly modest input requirements. Outflow was modelled as a function of lake level. The simulated record of lake level and outflow matched observations of lake-level variations (r = 0.78) and streamflow (r = 0.87) well. The ability of the model to capture both intra- and inter-annual variations in the behaviour of a specific lake, despite the relatively simple input requirements, makes it extremely suitable for investigations of the impacts of climate change on lake water balance.
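
A minimal sketch of the kind of monthly water-balance bookkeeping such a model performs is given below. The variable names, numbers and the linear level-outflow law are illustrative assumptions; the actual model also solves an energy balance to obtain evaporation, which is simply prescribed here.

```python
# Minimal monthly water-balance step of the kind the lake model performs.
# All names, numbers and the linear level-outflow law are illustrative
# assumptions; evaporation is prescribed rather than computed from an
# energy balance.

def step_lake_level(level_m, precip_m, evap_m, runoff_m3, lake_area_m2,
                    outflow_coeff=0.1, sill_m=0.0):
    """Advance lake level by one month. Water terms are metres of depth
    over the lake surface, except catchment runoff, which is a volume."""
    inflow_m = runoff_m3 / lake_area_m2              # runoff volume -> depth
    # Outflow modelled as a function of lake level above the outlet sill.
    outflow_m = outflow_coeff * max(level_m - sill_m, 0.0)
    return level_m + precip_m + inflow_m - evap_m - outflow_m

level = 2.0                                  # initial level above sill (m)
for month in range(12):
    level = step_lake_level(level, precip_m=0.05, evap_m=0.04,
                            runoff_m3=3.0e5, lake_area_m2=1.6e6)
print(f"simulated level after 12 months: {level:.2f} m")
```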

Relevance:

30.00%

Publisher:

Abstract:

This article describes an empirical, user-centred approach to explanation design. It reports three studies that investigate what patients want to know when they have been prescribed medication. The question is asked in the context of the development of a drug prescription system called OPADE. The system is aimed primarily at improving the prescribing behaviour of physicians, but will also produce written explanations for indirect users such as patients. In the first study, a large number of people were presented with a scenario about a visit to the doctor, and were asked to list the questions that they would like to ask the doctor about the prescription. On the basis of the results of the study, a categorization of question types was developed in terms of how frequently particular questions were asked. In the second and third studies a number of different explanations were generated in accordance with this categorization, and a new sample of people were presented with another scenario and were asked to rate the explanations on a number of dimensions. The results showed significant differences between the different explanations. People preferred explanations that included items corresponding to frequently asked questions in study 1. For an explanation to be considered useful, it had to include information about side effects, what the medication does, and any lifestyle changes involved. The implications of the results of the three studies are discussed in terms of the development of OPADE's explanation facility.

Relevance:

30.00%

Publisher:

Abstract:

Heat waves are expected to increase in frequency and magnitude with climate change. The first part of a study to produce projections of the effect of future climate change on heat-related mortality is presented. Separate city-specific empirical statistical models that quantify significant relationships between summer daily maximum temperature (Tmax) and daily heat-related deaths are constructed from historical data for six cities: Boston, Budapest, Dallas, Lisbon, London, and Sydney. ‘Threshold temperatures’ above which heat-related deaths begin to occur are identified. The results demonstrate significantly lower thresholds in ‘cooler’ cities exhibiting lower mean summer temperatures than in ‘warmer’ cities exhibiting higher mean summer temperatures. Analysis of individual ‘heat waves’ illustrates that a greater proportion of mortality is due to mortality displacement in cities with less sensitive temperature–mortality relationships than in those with more sensitive relationships, and that mortality displacement is no longer a feature more than 12 days after the end of the heat wave. Validation techniques through residual and correlation analyses of modelled and observed values and comparisons with other studies indicate that the observed temperature–mortality relationships are represented well by each of the models. The models can therefore be used with confidence to examine future heat-related deaths under various climate change scenarios for the respective cities (presented in Part 2).
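
As an illustration of the threshold behaviour described above, the following sketch fits a simple "hockey-stick" model in which heat-related deaths rise linearly above a threshold temperature. The grid-search estimator and the synthetic data are assumptions, not the authors' estimation procedure.

```python
import numpy as np

def fit_threshold_model(tmax, deaths, candidates):
    """Return (threshold, slope) minimising squared error of
    deaths ~ slope * max(tmax - threshold, 0)."""
    best = (None, None, np.inf)
    for t0 in candidates:
        x = np.maximum(tmax - t0, 0.0)
        denom = np.dot(x, x)
        if denom == 0.0:
            continue                         # no days above this threshold
        slope = np.dot(x, deaths) / denom    # least-squares, no intercept
        sse = np.sum((deaths - slope * x) ** 2)
        if sse < best[2]:
            best = (t0, slope, sse)
    return best[0], best[1]

# Synthetic daily data for one city, for demonstration only.
rng = np.random.default_rng(1)
tmax = rng.uniform(15, 40, size=500)
deaths = np.maximum(tmax - 27.0, 0.0) * 1.8 + rng.normal(0, 1.0, size=500)

t0, slope = fit_threshold_model(tmax, deaths, candidates=np.arange(20, 35, 0.5))
print(f"estimated threshold = {t0:.1f} C, slope = {slope:.2f} deaths/deg-day")
```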

Relevance:

30.00%

Publisher:

Abstract:

Estimation of whole-grain (WG) food intake in epidemiological and nutritional studies is normally based on general-diet FFQ, which are not designed to specifically capture WG intake. To estimate WG cereal intake, we developed a forty-three-item FFQ focused on cereal product intake over the past month. We validated this questionnaire against a 3-day weighed food record (3DWFR) in thirty-one subjects living in the French-speaking part of Switzerland (nineteen female and twelve male). Subjects completed the FFQ on day 1 (FFQ1), the 3DWFR between days 2 and 13 and the FFQ again on day 14 (FFQ2). The subjects provided a fasting blood sample within 1 week of FFQ2. Total cereal intake, total WG intake, intake of individual cereals, intake of different groups of cereal products and alkylresorcinol (AR) intake were calculated from both FFQ and the 3DWFR. Plasma AR, possible biomarkers for WG wheat and rye intake, were also analysed. The total WG intake for the 3DWFR, FFQ1 and FFQ2 was 26 (sd 22), 28 (sd 25) and 21 (sd 16) g/d, respectively. Mean plasma AR concentration was 55.8 (sd 26.8) nmol/l. FFQ1, FFQ2 and plasma AR were correlated with the 3DWFR (r = 0.72, 0.81 and 0.57, respectively). Adjustment for age, sex, BMI and total energy intake did not affect the results. This FFQ appears to give a rapid and adequate estimate of WG cereal intake in free-living subjects.
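
The validity analysis can be pictured with the short sketch below: correlate FFQ-derived intake with the weighed record, then repeat after regressing out covariates (a standard partial-correlation device; the paper's exact adjustment method is not stated). All data are synthetic.

```python
import numpy as np

def residualise(y, covariates):
    """Residuals of y after ordinary least-squares on the covariates."""
    X = np.column_stack([np.ones(len(y))] + list(covariates))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

rng = np.random.default_rng(2)
n = 31                                    # sample size reported above
wfr = rng.gamma(2.0, 13.0, n)             # 3DWFR whole-grain intake, g/d
ffq = wfr + rng.normal(0, 10, n)          # FFQ estimate with reporting error
age = rng.uniform(20, 60, n)              # hypothetical covariates
energy = rng.normal(9.0, 1.5, n)          # total energy intake, MJ/d

r_crude = np.corrcoef(ffq, wfr)[0, 1]
r_adj = np.corrcoef(residualise(ffq, [age, energy]),
                    residualise(wfr, [age, energy]))[0, 1]
print(f"crude r = {r_crude:.2f}, adjusted r = {r_adj:.2f}")
```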

Relevance:

30.00%

Publisher:

Abstract:

We introduce the notion that the energy of individuals can manifest as a higher-level, collective construct. To this end, we conducted four independent studies to investigate the viability and importance of the collective energy construct as assessed by a new survey instrument, the productive energy measure (PEM). Study 1 (n = 2208) included exploratory and confirmatory factor analyses to explore the underlying factor structure of the PEM. Study 2 (n = 660) cross-validated the same factor structure in an independent sample. In study 3, we administered the PEM to more than 5000 employees from 145 departments located in five countries. Results from measurement invariance, statistical aggregation, convergent, and discriminant-validity assessments offered additional support for the construct validity of the PEM. In terms of predictive and incremental validity, the PEM was positively associated with three collective attitudes: units' commitment to goals, commitment to the organization, and overall satisfaction. In study 4, we explored the relationship between the productive energy of firms and their overall performance. Using data from 92 firms (n = 5939 employees), we found a positive relationship between the PEM (aggregated to the firm level) and the performance of those firms. Copyright © 2011 John Wiley & Sons, Ltd.
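
One common way to justify the "statistical aggregation" step mentioned above is ICC(1) from a one-way ANOVA, sketched below. The group structure and data are synthetic, and ICC(1) is an assumed choice of aggregation index, not necessarily the one used in the paper.

```python
import numpy as np

def icc1(scores, groups):
    """ICC(1) from a one-way random-effects ANOVA."""
    scores, groups = np.asarray(scores, float), np.asarray(groups)
    labels = np.unique(groups)
    k = len(labels)
    n_i = np.array([np.sum(groups == g) for g in labels])
    grand = scores.mean()
    group_means = np.array([scores[groups == g].mean() for g in labels])
    ss_between = np.sum(n_i * (group_means - grand) ** 2)
    ss_within = sum(np.sum((scores[groups == g] - m) ** 2)
                    for g, m in zip(labels, group_means))
    ms_b = ss_between / (k - 1)
    ms_w = ss_within / (len(scores) - k)
    n_bar = n_i.mean()                   # simplification for unequal groups
    return (ms_b - ms_w) / (ms_b + (n_bar - 1) * ms_w)

rng = np.random.default_rng(3)
groups = np.repeat(np.arange(20), 30)        # 20 departments x 30 employees
unit_effect = rng.normal(0, 0.5, 20)[groups] # shared departmental energy
scores = 3.5 + unit_effect + rng.normal(0, 1.0, groups.size)
print(f"ICC(1) = {icc1(scores, groups):.2f}")
# A clearly positive ICC(1) supports aggregating individual scores to units.
```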

Relevance:

30.00%

Publisher:

Abstract:

Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programmes available for interpretation of bacterial transcriptomics data and of CGH microarray data for examining genetic stability in oncogenes, there are none specifically for understanding the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from test strains against three reference strains simultaneously. Each stage of the process is described, and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, showing that a simple dynamic approach using a kernel density estimator performed better than both established methods and a more sophisticated mixture modelling technique. We have also shown that current methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
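
The kernel-density cut-off idea can be sketched as follows: estimate the density of log-ratios and place the presence/absence cut-off at the density minimum between the "present" and "absent" modes. The bandwidth, search window and data are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def gaussian_kde_1d(samples, grid, bandwidth):
    """Plain Gaussian kernel density estimate evaluated on a grid."""
    z = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (
        len(samples) * bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(4)
# Synthetic log2(test/reference) ratios: most genes present (mode near 0),
# a minority absent or divergent (mode near -2).
log_ratios = np.concatenate([rng.normal(0.0, 0.3, 900),
                             rng.normal(-2.0, 0.4, 100)])

grid = np.linspace(-3.5, 1.5, 500)
density = gaussian_kde_1d(log_ratios, grid, bandwidth=0.15)

# Restrict the search to the region between the two modes.
window = (grid > -2.0) & (grid < 0.0)
cutoff = grid[window][np.argmin(density[window])]
print(f"presence/absence cut-off at log2 ratio = {cutoff:.2f}")
# Genes with ratios below the cut-off would be called absent/divergent.
```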

Relevance:

30.00%

Publisher:

Abstract:

There has been continued and expanding recognition of probiotic approaches for treating gastrointestinal and systemic disease, as well as increased acceptance of probiotic therapies by both the public and the medical community. A parallel development has been the increasing recognition of the diverse roles that the normal gut microbiota plays in the normal biology of the host. This advance has in turn been fed by the implementation of novel investigative technologies and conceptual paradigms focused on understanding the fundamental role of the microbiota, and indeed of all commensal bacteria, in known and previously unsuspected aspects of host physiology in health and disease. This review discusses current advances in the study of the host-microbiota interaction, especially as it relates to potential mechanisms of probiotics. It is hoped that these new approaches will allow more rational selection and validation of probiotic usage in a variety of clinical conditions.

Relevance:

30.00%

Publisher:

Abstract:

Objective: Thought–shape fusion (TSF) is a cognitive distortion that has been linked to eating pathology. Two studies were conducted to further explore this phenomenon and to establish the psychometric properties of a French short version of the TSF scale. Method: In Study 1, students (n = 284) completed questionnaires assessing TSF and related psychopathology. In Study 2, the responses of women with eating disorders (n = 22) and women with no history of an eating disorder (n = 23) were compared. Results: The French short version of the TSF scale has a unifactorial structure, with convergent validity with measures of eating pathology, and good internal consistency. Depression, eating pathology, body dissatisfaction, and thought-action fusion emerged as predictors of TSF. Individuals with eating disorders have higher TSF, and more clinically relevant food-related thoughts, than do women with no history of an eating disorder. Discussion: This research suggests that the shortened TSF scale can suitably measure this construct, and it provides support for the notion that TSF is associated with eating pathology.
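
Internal consistency of a scale such as this is typically summarised by Cronbach's alpha; a minimal computation is sketched below on synthetic item responses. The item count and data are assumptions, and the actual scale items are not reproduced here.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)     # per-item variance
    total_variance = items.sum(axis=1).var(ddof=1) # variance of total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(5)
n, k = 284, 10                   # study 1 sample size; item count assumed
trait = rng.normal(0, 1, n)      # latent TSF level per respondent
responses = trait[:, None] + rng.normal(0, 0.8, (n, k))  # correlated items
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```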

Relevance:

30.00%

Publisher:

Abstract:

Introduction. Feature usage is a pre-requisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. The core features identified from the literature are then further refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development prior to statistical confirmation with a larger sample.

Relevance:

30.00%

Publisher:

Abstract:

Using a within-subject experiment, we compare hypothetical and real willingness to pay (WTP) for an improvement in the recyclability of a product. Subjects are faced with a real payment scenario after they have responded to a hypothetical question. Contrary to most of the results obtained in similar studies, at a population level there are no significant median differences between actual and hypothetical stated values of WTP. However, within-subject comparisons between hypothetical and actual values indicate that subjects stating a low (high) hypothetical WTP tend to underestimate (overestimate) the value of their actual contributions.
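
The within-subject design can be sketched as below: a paired test of the median difference at population level, then the subject-level pattern by hypothetical-WTP group. The Wilcoxon signed-rank test is an assumed choice of paired test, and all data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 100
hyp = rng.gamma(2.0, 5.0, n)                 # hypothetical WTP statements
# Real WTP regresses toward the middle of the hypothetical distribution:
real = 0.5 * hyp + 0.5 * np.median(hyp) + rng.normal(0, 2, n)

# Population level: paired test of the median difference.
stat, p = stats.wilcoxon(hyp, real)
print(f"Wilcoxon p = {p:.3f} (a large p means no significant median difference)")

# Within-subject pattern by hypothetical-WTP group (lowest vs highest third).
order = np.argsort(hyp)
for name, idx in [("low", order[:n // 3]), ("high", order[-n // 3:])]:
    print(f"{name} hypothetical bidders: mean(real - hyp) = "
          f"{np.mean(real[idx] - hyp[idx]):+.2f}")
```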

Relevance:

30.00%

Publisher:

Abstract:

Refractivity changes (ΔN) derived from radar ground clutter returns serve as a proxy for near-surface humidity changes (1 N unit ≡ 1% relative humidity at 20 °C). Previous studies have indicated that better humidity observations should improve forecasts of convection initiation. A preliminary assessment of the potential of refractivity retrievals from an operational magnetron-based C-band radar is presented. The increased phase noise at shorter wavelengths, exacerbated by the unknown position of the target within the 300 m gate, makes it difficult to obtain absolute refractivity values, so we consider the information in 1 h changes. These have been derived to a range of 30 km with a spatial resolution of ∼4 km; the consistency of the individual estimates (within each 4 km × 4 km area) indicates that ΔN errors are about 1 N unit, in agreement with in situ observations. Measurements from an instrumented tower on summer days show that the 1 h refractivity changes up to a height of 100 m remain well correlated with near-surface values. The analysis of refractivity as represented in the operational Met Office Unified Model at 1.5, 4 and 12 km grid lengths demonstrates that, as model resolution increases, the spatial scales of the refractivity structures improve. It is shown that the magnitude of refractivity changes is progressively underestimated at larger grid lengths during summer. However, the daily time series of 1 h refractivity changes reveal that, whereas the radar-derived values are very well correlated with the in situ observations, the high-resolution model runs have little skill in getting the right values of ΔN in the right place at the right time. This suggests that the assimilation of these radar refractivity observations could benefit forecasts of the initiation of convection.
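
For orientation, the basic phase-to-refractivity relation behind such retrievals (the ground-clutter technique after Fabry and co-workers) can be sketched as follows. Phase wrapping and the target-position uncertainty discussed above are ignored, and all numbers are illustrative.

```python
import numpy as np

# The two-way phase to a fixed ground target at range r changes with
# refractivity as  d(phi) = (4*pi / lambda) * 1e-6 * dN * r,  so
# differencing the phase change between a near and a far target on the
# same ray removes path contributions common to both.

C_BAND_WAVELENGTH = 0.054     # m; typical for a C-band magnetron radar

def delta_n(phase_change_near, phase_change_far, r_near, r_far,
            wavelength=C_BAND_WAVELENGTH):
    """Refractivity change (N units) between two times, from the 1 h change
    in two-way phase (radians) at a near and a far ground target."""
    dphi = phase_change_far - phase_change_near
    return dphi * wavelength / (4 * np.pi * 1e-6 * (r_far - r_near))

# Hypothetical 1 h phase changes at targets 10 km and 14 km down one ray:
dn = delta_n(phase_change_near=2.1, phase_change_far=4.0,
             r_near=10_000.0, r_far=14_000.0)
print(f"1 h refractivity change = {dn:.1f} N units "
      f"(~{dn:.1f}% RH at 20 C, per the rule of thumb above)")
```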