100 results for Quantitative verification


Relevance:

20.00%

Publisher:

Abstract:

The applicability of the BET model for calculating the surface area of activated carbons is checked using molecular simulations. Geometric surface areas are calculated for a simple model carbon slit-like pore of increasing width and compared with the values obtained for the same systems from the VEGA ZZ package (adsorbate-accessible molecular surface); this comparison shows that the latter method provides correct values. For systems where a monolayer is created inside a pore, the ASA approach (GCMC, Ar, T = 87 K) underestimates the surface area of micropores (especially where only one layer is observed and/or two layers of adsorbed Ar are formed). We therefore propose a modification of this method based on the relationship between the pore diameter and the number of layers in a pore. Finally, the BET, original and modified ASA, and A-, B- and C-point surface areas are calculated for a series of virtual porous carbons using simulated Ar adsorption isotherms (GCMC, T = 87 K). The comparison of the results shows that the BET method underestimates, rather than overestimates (as was usually postulated), the surface areas of microporous carbons.
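A minimal sketch of the kind of BET calculation referred to above, assuming a linearised BET fit over a fixed relative-pressure window and a commonly quoted literature value for the Ar cross-sectional area; the isotherm values are synthetic placeholders, not data from the study.

```python
# Minimal BET surface-area sketch from an adsorption isotherm (synthetic data).
import numpy as np

def bet_surface_area(p_rel, n_ads, sigma=0.142e-18, limits=(0.05, 0.30)):
    """Fit the linearized BET equation and return the apparent surface area.

    p_rel : relative pressures p/p0
    n_ads : adsorbed amount (mol per gram of adsorbent)
    sigma : cross-sectional area of one adsorbate molecule (m^2), ~0.142 nm^2 for Ar
    limits: relative-pressure window used for the linear fit
    """
    p_rel, n_ads = np.asarray(p_rel), np.asarray(n_ads)
    mask = (p_rel >= limits[0]) & (p_rel <= limits[1])
    x = p_rel[mask]
    y = x / (n_ads[mask] * (1.0 - x))          # BET transform: 1/(n((p0/p)-1))
    slope, intercept = np.polyfit(x, y, 1)     # y = (C-1)/(n_m C) x + 1/(n_m C)
    n_m = 1.0 / (slope + intercept)            # monolayer capacity (mol/g)
    c_const = slope / intercept + 1.0
    area = n_m * 6.022e23 * sigma              # m^2 per gram
    return area, c_const

# Synthetic BET-shaped isotherm with n_m = 0.01 mol/g (illustrative only)
p = np.linspace(0.01, 0.35, 20)
n = 0.01 * (5.0 * p) / ((1 - p) * (1 + (5.0 - 1) * p))
print(bet_surface_area(p, n))   # ~855 m^2/g with sigma = 0.142 nm^2
```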

Relevance:

20.00%

Publisher:

Abstract:

Developed in response to the new challenges of the social Web, this study investigates how involvement with brand-related user-generated content (UGC) affects consumers' perceptions of brands. The authors develop a model that provides new insights into the links between the drivers of UGC creation, involvement, and consumer-based brand equity. Expert opinions were sought on a hypothesized model, which was then tested with data from an online survey of 202 consumers. The results provide guidance for managerial initiatives involving UGC campaigns for brand building. The findings indicate that consumer perceptions of co-creation, community, and self-concept have a positive impact on UGC involvement, which in turn positively affects consumer-based brand equity. These empirical results have significant implications for avoiding problems and building deeper relationships between consumers and brands in the age of social media.

Relevance:

20.00%

Publisher:

Abstract:

The organization of non-crystalline polymeric materials at a local level, namely on a spatial scale between a few and 100 Å, is still unclear in many respects. Determining the local structure in terms of the configuration and conformation of the polymer chain, and of the packing characteristics of the chain in the bulk material, represents a challenging problem. Data from wide-angle diffraction experiments are very difficult to interpret because of the very large amount of information they carry, that is, the large number of correlations present in the diffraction patterns. We describe new approaches that permit a detailed analysis of the complex neutron diffraction patterns characterizing polymer melts and glasses. Coupling different computer modelling strategies with neutron scattering data over a wide Q range allows detailed quantitative information on the structural arrangements of the materials of interest to be extracted. Proceeding from modelling routes as diverse as force-field calculations, single-chain modelling and reverse Monte Carlo, we show the successes and pitfalls of each approach in describing model systems, which illustrate the need to attack the data-analysis problem simultaneously from several fronts.
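The reverse Monte Carlo route mentioned above can be illustrated with a toy sketch: particle positions are perturbed at random and moves are accepted when they bring a simulated pair-distance histogram closer to a target histogram. Everything here (box, particle count, target data) is an invented placeholder, not the authors' actual workflow.

```python
# Toy reverse Monte Carlo (RMC) loop on a small periodic particle configuration.
import numpy as np

rng = np.random.default_rng(0)
L = 20.0                                   # periodic box length (arbitrary units)
N = 100                                    # number of particles
bins = np.linspace(0.5, 8.0, 40)           # pair-distance histogram bins

def pair_histogram(pos):
    """Normalised histogram of minimum-image pair distances."""
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(len(pos), 1)]
    h, _ = np.histogram(r, bins=bins)
    return h / h.sum()

target = pair_histogram(rng.uniform(0, L, (N, 3)))   # stand-in for "experimental" data
pos = rng.uniform(0, L, (N, 3))                      # starting configuration

def chi2(h):
    return ((h - target) ** 2).sum()

cost = chi2(pair_histogram(pos))
tol = 1e-4                                 # tolerance controlling acceptance of worse moves
for step in range(2000):
    i = rng.integers(N)
    trial = pos.copy()
    trial[i] = (trial[i] + rng.normal(0, 0.3, 3)) % L   # random single-particle move
    new_cost = chi2(pair_histogram(trial))
    # accept improvements outright, occasionally accept worse fits (Metropolis-like)
    if new_cost < cost or rng.random() < np.exp((cost - new_cost) / tol):
        pos, cost = trial, new_cost
print("final chi^2:", cost)
```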

Relevance:

20.00%

Publisher:

Abstract:

Mannitol is a polymorphic excipient that is usually used in pharmaceutical products as the beta form, although other polymorphs (alpha and delta) are common contaminants. Binary mixtures containing beta and delta mannitol were prepared to quantify the concentration of the beta form using FT-Raman spectroscopy. Spectral regions characteristic of each form were selected, and peak intensity ratios of beta peaks to delta peaks were calculated. Using these ratios, a correlation curve was established, which was then validated by analysing further samples of known composition. The results indicate that levels down to 2% beta could be quantified using this novel, non-destructive approach. Potential sources of error associated with quantitative studies using FT-Raman spectroscopy were also investigated. The principal source of variability arose from inhomogeneity in the mixing of the samples; a significant reduction of these errors was achieved by reducing and controlling the particle size range. The results show that FT-Raman spectroscopy can be used to rapidly and accurately quantify polymorphic mixtures.
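A hedged sketch of the peak-intensity-ratio calibration idea described above, assuming a simple linear correlation between a normalised beta/delta band ratio and the beta content; all intensities and compositions are illustrative placeholders, not values from the paper.

```python
# Calibration-curve sketch for quantifying the beta polymorph from two band intensities.
import numpy as np

pct_beta = np.array([2.0, 10.0, 25.0, 50.0, 75.0, 90.0, 98.0])   # % beta in standards
i_beta   = np.array([0.021, 0.11, 0.26, 0.52, 0.74, 0.91, 0.97]) # beta-band intensity
i_delta  = np.array([0.98, 0.90, 0.74, 0.49, 0.27, 0.10, 0.03])  # delta-band intensity

ratio = i_beta / (i_beta + i_delta)                 # normalised intensity ratio
slope, intercept = np.polyfit(ratio, pct_beta, 1)   # linear calibration curve

def predict_pct_beta(i_b, i_d):
    """Estimate % beta form from the two band intensities of an unknown sample."""
    r = i_b / (i_b + i_d)
    return slope * r + intercept

print(predict_pct_beta(0.33, 0.68))   # unknown sample; ~32% beta with these placeholder values
```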

Relevance:

20.00%

Publisher:

Abstract:

Mannitol is a polymorphic pharmaceutical excipient which commonly exists in three forms: alpha, beta and delta. Each polymorph has a needle-like morphology, which can give preferred-orientation effects when analysed by X-ray powder diffractometry (XRPD), thus causing difficulties for quantitative XRPD assessments. The occurrence of preferred orientation may be demonstrated by sample rotation, and the consequent effects on X-ray data can be minimised by reducing the particle size. Using two particle size ranges (less than 125 microns and 125–500 microns), binary mixtures of beta and delta mannitol were prepared and the delta component was quantified. Samples were assayed in either a static or a rotating sampling accessory. Rotation and reducing the particle size range to less than 125 microns halved the limits of detection and quantitation to 1% and 3.6%, respectively. Numerous potential sources of assay error were investigated; sample packing and mixing errors contributed the greatest source of variation. However, the rotation of samples for both particle size ranges reduced the majority of the assay errors examined. This study shows that coupling sample rotation with a particle size reduction minimises the effect of preferred orientation on assay accuracy, allowing discrimination of two very similar polymorphs at around the 1% level.
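One common way to express limits of detection and quantitation from a calibration line is the ICH-style LOD = 3.3 s/m and LOQ = 10 s/m, with s the residual standard deviation and m the slope. The sketch below uses that convention with placeholder peak areas; the study's own procedure may differ.

```python
# ICH-style LOD/LOQ estimate from a linear XRPD calibration (placeholder data).
import numpy as np

pct_delta = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0])   # % delta in standards
peak_area = np.array([0.9, 2.1, 5.2, 9.8, 20.5, 39.6])    # integrated delta peak area

slope, intercept = np.polyfit(pct_delta, peak_area, 1)
residuals = peak_area - (slope * pct_delta + intercept)
s = residuals.std(ddof=2)                                  # residual standard deviation

lod = 3.3 * s / slope
loq = 10.0 * s / slope
print(f"LOD ~ {lod:.2f}%  LOQ ~ {loq:.2f}%")
```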

Relevance:

20.00%

Publisher:

Abstract:

Measuring the retention, or residence time, of dosage forms on biological tissue is commonly a qualitative measurement in which no numerical values describing the retention can be recorded. The result is an assessment that depends upon the user's interpretation of visual observations. This paper outlines the development of a methodology to quantitatively measure, both by image analysis and by spectrophotometric techniques, the retention of material on biological tissues, using the retention of polymer solutions on ocular tissue as an example. Both methods have been shown to be repeatable, with the spectrophotometric measurement generating data reliably and quickly for further analysis.
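A minimal sketch of a spectrophotometric retention measurement of the kind described above, assuming a linear (Beer-Lambert) calibration between polymer concentration and absorbance; the concentrations, absorbances and rinse volume are illustrative placeholders, not data from the paper.

```python
# Absorbance calibration and conversion of a tissue-rinse measurement to retained mass.
import numpy as np

conc_std = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8])        # mg/ml polymer standards
abs_std  = np.array([0.00, 0.06, 0.12, 0.25, 0.49, 0.98])   # absorbance at the dye peak

slope, intercept = np.polyfit(conc_std, abs_std, 1)         # linear (Beer-Lambert) region

def retained_mass(absorbance, rinse_volume_ml):
    """Mass of polymer (mg) recovered from the tissue rinse."""
    conc = (absorbance - intercept) / slope
    return conc * rinse_volume_ml

print(retained_mass(0.31, rinse_volume_ml=5.0))   # ~1.3 mg retained in this example
```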

Relevance:

20.00%

Publisher:

Abstract:

We have developed a new Bayesian approach to retrieve oceanic rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI), with an emphasis on typhoon cases in the West Pacific. Retrieved rain rates are validated with measurements from rain gauges located on Japanese islands. To demonstrate the improvement, retrievals are also compared with those from the TRMM Precipitation Radar (PR), the Goddard Profiling Algorithm (GPROF), and a multi-channel linear regression statistical method (MLRS). We find that, qualitatively, all methods retrieve similar horizontal distributions in terms of the locations of the eyes and rain bands of typhoons. Quantitatively, our new Bayesian retrievals have the best linearity and the smallest root-mean-square (RMS) error against rain gauge data for 16 typhoon overpasses in 2004. The correlation coefficient and RMS error of our retrievals are 0.95 and ~2 mm hr⁻¹, respectively. In particular, at heavy rain rates, our Bayesian retrievals outperform those from GPROF and MLRS. Overall, the new Bayesian approach accurately retrieves surface rain rate for typhoon cases. Accurate rain rate estimates from this method can be assimilated into models to improve forecasts and prevent potential damage in Taiwan during typhoon seasons.
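The validation statistics quoted above (linear correlation and RMS error against gauges) can be computed as in the sketch below; the gauge and retrieval values shown are placeholders, not the 2004 typhoon data.

```python
# Correlation and RMSE between retrieved rain rates and collocated gauge measurements.
import numpy as np

gauge     = np.array([0.5, 2.0, 5.5, 10.0, 18.0, 30.0])   # gauge rain rate (mm/hr)
retrieved = np.array([0.8, 1.7, 6.0,  9.1, 20.5, 27.4])   # retrieved rain rate (mm/hr)

corr = np.corrcoef(gauge, retrieved)[0, 1]
rmse = np.sqrt(np.mean((retrieved - gauge) ** 2))
print(f"correlation = {corr:.2f}, RMSE = {rmse:.2f} mm/hr")
```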

Relevance:

20.00%

Publisher:

Abstract:

Many studies warn that climate change may undermine global food security. Much work on this topic focuses on modelling crop-weather interactions, but these models generally do not account for the ways in which socio-economic factors influence how harvests are affected by weather. To address this gap, this paper uses a quantitative harvest vulnerability index, based on annual soil moisture and grain production data, as the dependent variable in a linear mixed-effects model with national-scale socio-economic data as independent variables for the period 1990–2005. Results show that rice, wheat and maize production in middle-income countries was especially vulnerable to droughts. By contrast, harvests in countries with higher investments in agriculture (e.g. higher fertilizer use) were less vulnerable to drought. In terms of differences between the world's major grain crops, the factors that made rice and wheat crops vulnerable to drought were quite consistent, whilst those for maize crops varied considerably depending on the type of region. This is likely because maize is produced under very different conditions worldwide. One recommendation for reducing drought vulnerability risks is coordinated development and adaptation policies, including institutional support that enables farmers to take proactive action.
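A hedged sketch of a linear mixed-effects model in the spirit of the analysis described above, with a random intercept per country; the variable names and synthetic data are hypothetical stand-ins for the paper's vulnerability index and socio-economic covariates, and the statsmodels mixedlm call is just one possible implementation.

```python
# Illustrative mixed-effects fit: vulnerability index vs socio-economic covariates,
# with a random intercept for each country (all data synthetic).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "country": rng.integers(0, 20, n).astype(str),   # 20 hypothetical countries
    "fertilizer_use": rng.uniform(10, 300, n),        # kg/ha, placeholder values
    "year": rng.integers(1990, 2006, n),              # study period 1990-2005
})
# synthetic vulnerability index: higher fertilizer use -> lower vulnerability, plus noise
df["vulnerability"] = 5.0 - 0.01 * df["fertilizer_use"] + rng.normal(0, 1, n)

model = smf.mixedlm("vulnerability ~ fertilizer_use + year",
                    data=df, groups=df["country"])
print(model.fit().summary())
```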

Relevance:

20.00%

Publisher:

Abstract:

A specific traditional plate count method and real-time PCR systems based on SYBR Green I and TaqMan technologies, using a specific primer pair and probe for amplification of the iap gene, were used for the quantitative assay of Listeria monocytogenes in seven decimal serial dilution series of nutrient broth and milk samples containing 1.58 to 1.58×10⁷ cfu/ml, and the real-time PCR methods were compared with the plate count method with respect to accuracy and sensitivity. In this study, the plate count method was performed by surface-plating 0.1 ml of each sample on Palcam Agar. The lowest detectable level for this method was 1.58×10 cfu/ml for both nutrient broth and milk samples. Using purified DNA as a template for generation of standard curves, as few as four copies of the iap gene could be detected per reaction with both real-time PCR assays, indicating that they were highly sensitive. When these real-time PCR assays were applied to the quantification of L. monocytogenes in the decimal serial dilution series of nutrient broth and milk samples, 3.16×10 to 3.16×10⁵ copies per reaction (equivalent to 1.58×10³ to 1.58×10⁷ cfu/ml of L. monocytogenes) were detectable. Expressed as logarithmic cycles, the quantitative results of the detectable steps for the plate count and both molecular assays were similar to the inoculation levels.
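A hedged sketch of standard-curve quantification for a real-time PCR assay: the threshold cycle (Ct) is regressed on log10 copy number for the purified-DNA standards, amplification efficiency is derived from the slope, and an unknown Ct is converted to a copy number. The Ct values are synthetic placeholders, not the study's measurements.

```python
# qPCR standard curve: Ct vs log10(copies), efficiency, and quantification of an unknown.
import numpy as np

copies = np.array([4e0, 4e1, 4e2, 4e3, 4e4, 4e5])        # iap-gene copies per reaction
ct     = np.array([36.1, 32.8, 29.4, 26.1, 22.7, 19.3])  # threshold cycles (synthetic)

slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                   # ~1.0 means 100% efficiency

def copies_from_ct(ct_unknown):
    """Estimate copy number per reaction from an unknown sample's Ct."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(f"efficiency ~ {efficiency:.2f}")
print(f"unknown at Ct 28.0 ~ {copies_from_ct(28.0):.0f} copies/reaction")
```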

Relevance:

20.00%

Publisher:

Abstract:

Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill. Nor is there any agreed protocol for estimating their skill. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts. These questions are: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to un-initialized climate change projections? and (2) Is the prediction model’s ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts and comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales that can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale, or downscale it to even higher resolution. An overall statement on skill of CMIP5 decadal hindcasts is not the aim of this paper. The results presented are only illustrative of the framework, which would enable such studies. However, broad conclusions that are beginning to emerge from the CMIP5 results include (1) Most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) though moderate, additional skill is added by the initial conditions over what is imparted by external forcing alone; however, the impact of initialization may result in overall worse predictions in some regions than provided by uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common to seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted to serve as a starting point to compare prediction quality across prediction systems. The framework can provide a baseline against which future improvements can be quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including adjustment of mean and conditional biases, and consideration of how to best approach forecast uncertainty.
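Two of the metric types distinguished above can be illustrated with a small sketch: a deterministic mean-squared-error skill score of the initialized hindcasts against the uninitialized projections, and a simple spread/error ratio as a first check on whether ensemble spread represents forecast uncertainty on average. The data are synthetic placeholders, not CMIP5 output.

```python
# Synthetic-data sketch of an MSE skill score (MSESS) and a spread/error ratio.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_members = 30, 10
obs = rng.normal(0.0, 1.0, n_years)                                     # observed anomalies
initialized = obs[:, None] + rng.normal(0.0, 0.6, (n_years, n_members)) # informed by obs
uninitialized = rng.normal(0.0, 1.0, (n_years, n_members))              # no initial-state info

def msess(forecast, reference, obs):
    """MSE skill score of the forecast ensemble mean relative to a reference (1 = perfect)."""
    mse_f = np.mean((forecast.mean(axis=1) - obs) ** 2)
    mse_r = np.mean((reference.mean(axis=1) - obs) ** 2)
    return 1.0 - mse_f / mse_r

def spread_error_ratio(forecast, obs):
    """Mean ensemble spread divided by the RMSE of the ensemble mean (~1 if well calibrated)."""
    spread = forecast.std(axis=1, ddof=1).mean()
    rmse = np.sqrt(np.mean((forecast.mean(axis=1) - obs) ** 2))
    return spread / rmse

print("MSESS (initialized vs uninitialized):", msess(initialized, uninitialized, obs))
print("spread/error ratio:", spread_error_ratio(initialized, obs))
```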

Relevance:

20.00%

Publisher:

Abstract:

The development of NWP models with grid spacing down to 1 km should produce more realistic forecasts of convective storms. However, greater realism does not necessarily mean more accurate precipitation forecasts. The rapid growth of errors on small scales in conjunction with preexisting errors on larger scales may limit the usefulness of such models. The purpose of this paper is to examine whether improved model resolution alone is able to produce more skillful precipitation forecasts on useful scales, and how the skill varies with spatial scale. A verification method will be described in which skill is determined from a comparison of rainfall forecasts with radar using fractional coverage over different sized areas. The Met Office Unified Model was run with grid spacings of 12, 4, and 1 km for 10 days in which convection occurred during the summers of 2003 and 2004. All forecasts were run from 12-km initial states for a clean comparison. The results show that the 1-km model was the most skillful over all but the smallest scales (approximately <10–15 km). A measure of acceptable skill was defined; this was attained by the 1-km model at scales around 40–70 km, some 10–20 km less than that of the 12-km model. The biggest improvement occurred for heavier, more localized rain, despite it being more difficult to predict. The 4-km model did not improve much on the 12-km model because of the difficulties of representing convection at that resolution, which was accentuated by the spinup from 12-km fields.
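The fraction-based comparison used above can be sketched as follows: forecast and radar fields are converted to binary exceedance fields, the fractional coverage is computed in square neighbourhoods of increasing size, and agreement is scored with a fractions skill score. The rain fields below are synthetic placeholders, and the scoring details are a simplified reading of the method.

```python
# Scale-selective verification via neighbourhood fractions (fractions skill score).
import numpy as np
from scipy.ndimage import uniform_filter

def fractions_skill_score(forecast, observed, threshold, neighbourhood):
    """FSS for one rain-rate threshold and one square neighbourhood width (grid points)."""
    f = uniform_filter((forecast >= threshold).astype(float), size=neighbourhood)
    o = uniform_filter((observed >= threshold).astype(float), size=neighbourhood)
    mse = np.mean((f - o) ** 2)
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)        # no-overlap (worst case) reference
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(2)
radar = rng.gamma(0.5, 2.0, (200, 200))                                 # pseudo radar field (mm/hr)
forecast = np.roll(radar, 8, axis=1) + rng.normal(0, 0.2, (200, 200))   # displaced "forecast"

for width in (1, 5, 15, 45):                                            # neighbourhood widths
    print(width, round(fractions_skill_score(forecast, radar, threshold=1.0, neighbourhood=width), 3))
```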

Relevance:

20.00%

Publisher:

Abstract:

It is becoming increasingly important to be able to verify the spatial accuracy of precipitation forecasts, especially with the advent of high-resolution numerical weather prediction (NWP) models. In this article, the fractions skill score (FSS) approach has been used to perform a scale-selective evaluation of precipitation forecasts during 2003 from the Met Office mesoscale model (12 km grid length). The investigation shows how skill varies with spatial scale, the scales over which the data assimilation (DA) adds most skill, and how the loss of that skill is dependent on both the spatial scale and the rainfall coverage being examined. Although these results come from a specific model, they demonstrate how this verification approach can provide a quantitative assessment of the spatial behaviour of new finer-resolution models and DA techniques.

Relevance:

20.00%

Publisher:

Abstract:

Although the potential to adapt to warmer climate is constrained by genetic trade-offs, our understanding of how selection and mutation shape genetic (co)variances in thermal reaction norms is poor. Using 71 isofemale lines of the fly Sepsis punctum, originating from northern, central, and southern European climates, we tested for divergence in juvenile development rate across latitude at five experimental temperatures. To investigate effects of evolutionary history in different climates on standing genetic variation in reaction norms, we further compared genetic (co)variances between regions. Flies were reared on either high or low food resources to explore the role of energy acquisition in determining genetic trade-offs between different temperatures. Although the latter had only weak effects on the strength and sign of genetic correlations, genetic architecture differed significantly between climatic regions, implying that evolution of reaction norms proceeds via different trajectories at high latitude versus low latitude in this system. Accordingly, regional genetic architecture was correlated to region-specific differentiation. Moreover, hot development temperatures were associated with low genetic variance and stronger genetic correlations compared to cooler temperatures. We discuss the evolutionary potential of thermal reaction norms in light of their underlying genetic architectures, evolutionary histories, and the materialization of trade-offs in natural environments.
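As a rough illustration of the quantities discussed above, the sketch below treats isofemale-line mean development rates at each test temperature as traits and computes their among-line covariance and correlation matrices, a crude broad-sense proxy for genetic architecture; the temperatures and data are invented, and a real analysis would fit mixed models to individual-level data.

```python
# Among-line (co)variance and correlation matrices across temperatures (synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
temps = [15, 19, 23, 27, 31]                        # hypothetical test temperatures (deg C)
lines = [f"line_{i}" for i in range(71)]            # 71 isofemale lines, as in the study

base = rng.normal(0, 1, len(lines))                 # shared line effect -> correlated traits
line_means = pd.DataFrame(
    {t: 0.5 + 0.02 * t + 0.1 * base + rng.normal(0, 0.05, len(lines)) for t in temps},
    index=lines,
)

g_cov = line_means.cov()     # among-line (co)variances of development rate across temperatures
g_corr = line_means.corr()   # among-line ("genetic") correlations between temperatures
print(g_corr.round(2))
```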

Relevance:

20.00%

Publisher:

Abstract:

Quantitative estimates of temperature and precipitation change during the late Pleistocene and Holocene have been difficult to obtain for much of the lowland Neotropics. Using two published lacustrine pollen records and a climate-vegetation model based on the modern abundance distributions of 154 Neotropical plant families, we demonstrate how family-level counts of fossil pollen can be used to quantitatively reconstruct tropical paleoclimate and provide needed information on historical patterns of climatic change. With this family-level analysis, we show that one area of the lowland tropics, northeastern Bolivia, experienced cooling (1–3 °C) and drying (400 mm/yr), relative to present, during the late Pleistocene (50,000–12,000 calendar years before present [cal. yr B.P.]). Immediately prior to the Last Glacial Maximum (LGM, ca. 21,000 cal. yr B.P.), we observe a distinct transition from cooler temperatures and variable precipitation to a period of warmer temperatures and relative dryness that extends to the middle Holocene (5000–3000 cal. yr B.P.). This prolonged reduction in precipitation occurs against the backdrop of increasing atmospheric CO2 concentrations, indicating that the presence of mixed savanna and dry-forest communities in northeastern Bolivia during the LGM was not solely the result of low CO2 levels, as suggested previously, but also of lower precipitation. The results of our analysis demonstrate the potential for using the distribution and abundance structure of modern Neotropical plant families to infer paleoclimate from the fossil pollen record.
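A hedged sketch of a family-level transfer-function idea loosely in the spirit of the study: palaeotemperature is estimated as the pollen-abundance-weighted average of each family's modern temperature optimum. Family names, optima and counts are invented placeholders; the authors' climate-vegetation model is more sophisticated than this.

```python
# Placeholder example of a family-level, abundance-weighted temperature reconstruction.
import numpy as np

# hypothetical modern mean-annual-temperature optima (deg C) for four plant families
family_optimum = {"FamilyA": 26.0, "FamilyB": 22.5, "FamilyC": 18.0, "FamilyD": 24.5}

def weighted_average_temperature(counts):
    """Pollen-count-weighted mean of the family temperature optima for one fossil sample."""
    families = [f for f in counts if f in family_optimum]
    n = np.array([counts[f] for f in families], dtype=float)
    optima = np.array([family_optimum[f] for f in families])
    return (n * optima).sum() / n.sum()

fossil_sample = {"FamilyA": 40, "FamilyB": 120, "FamilyC": 85, "FamilyD": 30}
print(f"reconstructed mean annual temperature: {weighted_average_temperature(fossil_sample):.1f} C")
```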