80 results for REFINEMENT


Relevance: 10.00%

Abstract:

Plane wave discontinuous Galerkin (PWDG) methods are a class of Trefftz-type methods for the spatial discretization of boundary value problems for the Helmholtz operator $-\Delta-\omega^2$, $\omega>0$. They include the so-called ultra weak variational formulation from [O. Cessenat and B. Després, SIAM J. Numer. Anal., 35 (1998), pp. 255–299]. This paper is concerned with the a priori convergence analysis of PWDG in the case of $p$-refinement, that is, the study of the asymptotic behavior of relevant error norms as the number of plane wave directions in the local trial spaces is increased. For convex domains in two space dimensions, we derive convergence rates, employing mesh skeleton-based norms, duality techniques from [P. Monk and D. Wang, Comput. Methods Appl. Mech. Engrg., 175 (1999), pp. 121–136], and plane wave approximation theory.
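For readers unfamiliar with Trefftz-type discretizations, the following is a minimal sketch (not taken from the paper) of what a local plane wave trial space looks like in two dimensions, assuming the common choice of p equispaced unit propagation directions on an element K; increasing p is the p-refinement studied in the abstract.

```latex
% Illustrative local plane wave trial space on an element K
% (equispaced directions are one common choice, not prescribed by the paper).
\[
  V_p(K) \;=\; \operatorname{span}\Bigl\{\, x \mapsto e^{\,\mathrm{i}\,\omega\, d_j \cdot x}
  \;:\; d_j = \bigl(\cos\tfrac{2\pi j}{p},\, \sin\tfrac{2\pi j}{p}\bigr),\ j = 0,\dots,p-1 \,\Bigr\}.
\]
% Since |d_j| = 1, each basis function satisfies the homogeneous Helmholtz
% equation -\Delta u - \omega^2 u = 0 exactly in K (the Trefftz property).
```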

Relevance: 10.00%

Abstract:

Scoring rules are an important tool for evaluating the performance of probabilistic forecasting schemes. A scoring rule is called strictly proper if its expectation is optimal if and only if the forecast probability represents the true distribution of the target. In the binary case, strictly proper scoring rules allow for a decomposition into terms related to the resolution and the reliability of a forecast. This fact is particularly well known for the Brier Score. In this article, this result is extended to forecasts for finite-valued targets. Both resolution and reliability are shown to have a positive effect on the score. It is demonstrated that resolution and reliability are directly related to forecast attributes that are desirable on grounds independent of the notion of scores. This finding can be considered an epistemological justification of measuring forecast quality by proper scoring rules. A link is provided to the original work of DeGroot and Fienberg, extending their concepts of sufficiency and refinement. The relation to the conjectured sharpness principle of Gneiting et al. is elucidated.
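As a hedged illustration of the binary case mentioned above (not code from the article), here is a minimal Python sketch of the classical reliability/resolution/uncertainty decomposition of the Brier score; the binning scheme and the synthetic example are arbitrary choices.

```python
import numpy as np

def brier_decomposition(forecasts, outcomes, n_bins=10):
    """Decompose the Brier score of binary probability forecasts into
    reliability, resolution and uncertainty (Murphy decomposition).
    With binned forecasts the identity BS = REL - RES + UNC is exact;
    for continuous forecasts it holds approximately."""
    forecasts = np.asarray(forecasts, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    base_rate = outcomes.mean()

    bins = np.clip((forecasts * n_bins).astype(int), 0, n_bins - 1)
    reliability = resolution = 0.0
    for b in range(n_bins):
        mask = bins == b
        if not mask.any():
            continue
        weight = mask.mean()                 # fraction of forecasts in this bin
        p_bar = forecasts[mask].mean()       # mean forecast probability in bin
        o_bar = outcomes[mask].mean()        # observed relative frequency in bin
        reliability += weight * (p_bar - o_bar) ** 2      # penalty: miscalibration
        resolution += weight * (o_bar - base_rate) ** 2   # reward: discrimination

    uncertainty = base_rate * (1.0 - base_rate)
    brier = np.mean((forecasts - outcomes) ** 2)
    return brier, reliability, resolution, uncertainty

# Synthetic example: calibrated but only moderately resolving forecasts.
rng = np.random.default_rng(0)
p = rng.uniform(0.3, 0.7, size=10_000)
y = (rng.uniform(size=10_000) < p).astype(float)
print(brier_decomposition(p, y))
```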

Relevance: 10.00%

Abstract:

Time-resolved studies of chlorosilylene, ClSiH, generated by the 193 nm laser flash photolysis of 1-chloro-1-silacyclopent-3-ene, have been carried out to obtain rate constants for its bimolecular reaction with trimethylsilane-1-d, Me3SiD, in the gas phase. The reaction was studied at total pressures up to 100 Torr (with and without added SF6) over the temperature range of 295−407 K. The rate constants were found to be pressure independent and gave the following Arrhenius equation: log[k/(cm3 molecule−1 s−1)] = (−13.22 ± 0.15) + [(13.20 ± 1.00) kJ mol−1]/(RT ln 10). When compared with previously published kinetic data for the reaction of ClSiH with Me3SiH, kinetic isotope effects, kD/kH, in the range from 7.4 (297 K) to 6.4 (407 K) were obtained. These far exceed the values of 0.4−0.5 estimated for a single-step insertion process. Quantum chemical calculations (G3MP2B3 level) confirm not only the involvement of an intermediate complex, but also the existence of a low-energy internal isomerization pathway which can scramble the D and H atom labels. By means of Rice-Ramsperger-Kassel-Marcus modeling and a necessary (but small) refinement of the energy surface, we have shown that this mechanism can reproduce closely the experimental isotope effects. These findings provide the first experimental evidence for the isomerization pathway and thereby offer the most concrete evidence to date for the existence of intermediate complexes in the insertion reactions of silylenes.
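The quoted Arrhenius fit can be evaluated directly. The small sketch below uses only the parameters given above (the kH data needed for the isotope effect are not reproduced here) and shows the negative temperature dependence of the rate constant.

```python
import numpy as np

R = 8.314462618e-3  # gas constant in kJ mol^-1 K^-1

def k_ClSiH_Me3SiD(T):
    """Rate constant in cm^3 molecule^-1 s^-1 from the Arrhenius fit quoted in
    the abstract: log10 k = -13.22 + 13.20 kJ mol^-1 / (R T ln 10).
    The positive second term corresponds to a negative activation energy,
    i.e. the reaction slows down as the temperature rises."""
    return 10.0 ** (-13.22 + 13.20 / (R * T * np.log(10.0)))

for T in (295.0, 407.0):
    print(f"T = {T:5.1f} K   k = {k_ClSiH_Me3SiD(T):.2e} cm^3 molecule^-1 s^-1")
```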

Relevance: 10.00%

Abstract:

In the European Union, first-tier assessment of the long-term risk to birds and mammals from pesticides is based on calculation of a deterministic long-term toxicity/exposure ratio (TERlt). The ratio is developed from generic herbivores and insectivores and applied to all species. This paper describes two case studies that implement proposed improvements to the way long-term risk is assessed. These refined methods require calculation of a TER for each of five identified phases of reproduction (phase-specific TERs) and use of adjusted No Observed Effect Levels (NOELs) to incorporate variation in species sensitivity to pesticides. They also involve progressive refinement of the exposure estimate so that it applies to particular species, rather than generic indicators, and relates spraying date to onset of reproduction. The effect of using these new methods on the assessment of risk is described. Individual refinements did not alter the calculated TER values in a way that was predictable or consistent across the two case studies. However, use of adjusted NOELs always reduced TERs, and relating spraying date to onset of reproduction increased most phase-specific TERs. The case studies suggested that the current first-tier TERlt assessment may underestimate risk in some circumstances and that phase-specific assessments can help identify appropriate risk-reduction measures. The way in which deterministic phase-specific assessments can currently be implemented to enhance first-tier assessment is outlined.
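As a hedged illustration of what a phase-specific TER calculation looks like, the minimal sketch below computes TER = toxicity endpoint / estimated exposure for each reproductive phase with a species-adjusted NOEL; the phase names, NOEL, adjustment factor, exposure values and trigger value are all invented for illustration and are not taken from the case studies.

```python
# Hypothetical phase-specific TER sketch (all numbers invented).
PHASES = ["pair formation", "egg laying", "incubation", "chick rearing", "fledging"]

noel = 12.0                # hypothetical NOEL (mg a.s./kg bw/day)
species_adjustment = 0.5   # hypothetical adjustment for species sensitivity
adjusted_noel = noel * species_adjustment

# Hypothetical phase-specific daily exposure estimates (mg a.s./kg bw/day),
# already shifted so that they reflect the spraying date relative to onset of reproduction.
exposure = {"pair formation": 1.1, "egg laying": 2.4, "incubation": 1.8,
            "chick rearing": 3.0, "fledging": 0.9}

TRIGGER = 5.0  # illustrative long-term TER trigger value

for phase in PHASES:
    ter = adjusted_noel / exposure[phase]
    verdict = "acceptable" if ter >= TRIGGER else "refine further / mitigate"
    print(f"{phase:15s} TER = {ter:5.2f}  ({verdict})")
```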

Relevance: 10.00%

Abstract:

The internal variability and coupling between the stratosphere and troposphere in CCMVal‐2 chemistry‐climate models are evaluated through analysis of the annular mode patterns of variability. Computation of the annular modes in long data sets with secular trends requires refinement of the standard definition of the annular mode, and a more robust procedure that allows for slowly varying trends is established and verified. The spatial and temporal structure of the models’ annular modes is then compared with that of reanalyses. As a whole, the models capture the key features of observed intraseasonal variability, including the sharp vertical gradients in structure between stratosphere and troposphere, the asymmetries in the seasonal cycle between the Northern and Southern hemispheres, and the coupling between the polar stratospheric vortices and tropospheric midlatitude jets. It is also found that the annular mode variability changes little in time throughout simulations of the 21st century. There are, however, both common biases and significant differences in performance in the models. In the troposphere, the annular mode in models is generally too persistent, particularly in the Southern Hemisphere summer, a bias similar to that found in CMIP3 coupled climate models. In the stratosphere, the periods of peak variance and coupling with the troposphere are delayed by about a month in both hemispheres. The relationship between increased variability of the stratosphere and increased persistence in the troposphere suggests that some tropospheric biases may be related to stratospheric biases and that a well‐simulated stratosphere can improve simulation of tropospheric intraseasonal variability.
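As a hedged illustration only (not the procedure established in the paper), here is a numpy sketch of an annular-mode calculation in which anomalies are taken about a slowly varying running-mean trend before the leading EOF is extracted; the latitude band, area weighting and five-year window are illustrative assumptions.

```python
import numpy as np

def annular_mode(field, window=1825):
    """Sketch: annular mode as the leading EOF of a zonal-mean field (shape
    (time, lat), e.g. geopotential height poleward of 20 degrees), with
    anomalies defined about a slowly varying running-mean trend rather than
    a fixed climatology, so that secular trends do not contaminate the mode."""
    kernel = np.ones(window) / window
    trend = np.apply_along_axis(lambda x: np.convolve(x, kernel, mode="same"), 0, field)
    anom = field - trend

    lats = np.linspace(20.0, 90.0, field.shape[1])
    weights = np.sqrt(np.cos(np.deg2rad(lats)))      # area weighting
    u, s, vt = np.linalg.svd(anom * weights, full_matrices=False)
    pattern = vt[0] / weights                         # spatial structure of the mode
    index = u[:, 0] * s[0]                            # annular-mode index (PC time series)
    return pattern, index / index.std()
```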

Relevance: 10.00%

Abstract:

Full-waveform laser scanning data acquired with a Riegl LMS-Q560 instrument were used to classify an orange orchard into orange trees, grass and ground using waveform parameters alone. Gaussian decomposition was performed on data captured during the National Airborne Field Experiment in November 2006, using a custom peak-detection procedure and a trust-region-reflective algorithm for fitting Gaussian functions. Calibration was carried out using waveforms returned from a road surface, and the backscattering coefficient c was derived for every waveform peak. The processed data were then analysed according to the number of returns detected within each waveform and classified into three classes based on pulse width and c. For single-peak waveforms the scatterplot of c versus pulse width was used to distinguish between ground, grass and orange trees. In the case of multiple returns, the relationship between first (or first plus middle) and last return c values was used to separate ground from other targets. Refinement of this classification, and further sub-classification into grass and orange trees, was performed using the c versus pulse width scatterplots of last returns. In all cases the separation was carried out using a decision tree with empirical relationships between the waveform parameters. Ground points were successfully separated from orange tree points. The most difficult class to separate and verify was grass, but those points in general corresponded well with the grass areas identified in the aerial photography. The overall accuracy reached 91%, using photography and relative elevation as ground truth. The overall accuracy for two classes, orange tree and a combined class of grass and ground, reached 95%. Finally, the backscattering coefficient c of single-peak waveforms was also used to derive reflectance values of the three classes. The reflectance of the orange tree class (0.31) and ground class (0.60) are consistent with published values at the wavelength of the Riegl scanner (1550 nm). The grass class reflectance (0.46) falls between the other two, as might be expected, since this class mixes the contributions of vegetation and ground reflectance properties.
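A hypothetical sketch of the Gaussian-decomposition step (not the authors' processing chain): a sum of Gaussian pulses is fitted to each recorded waveform with a trust-region-reflective least-squares solver, here SciPy's method='trf'; the peak-detection step is replaced by user-supplied initial guesses, and the synthetic waveform is invented.

```python
import numpy as np
from scipy.optimize import least_squares

def sum_of_gaussians(t, params):
    """Sum of Gaussian pulses; params is a flat array of (amplitude, centre, width) triples."""
    y = np.zeros_like(t)
    for a, mu, sigma in params.reshape(-1, 3):
        y += a * np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    return y

def decompose_waveform(t, w, peak_guesses, width_guess=2.0):
    """Fit one Gaussian per detected peak using a trust-region-reflective solver."""
    p0 = np.array([[w[np.searchsorted(t, mu)], mu, width_guess]
                   for mu in peak_guesses]).ravel()
    res = least_squares(lambda p: sum_of_gaussians(t, p) - w, p0, method="trf")
    return res.x.reshape(-1, 3)      # fitted (amplitude, centre, pulse width) per return

# Synthetic two-return waveform (e.g. canopy + ground) for illustration.
t = np.arange(0.0, 60.0, 1.0)
w = sum_of_gaussians(t, np.array([0.8, 20.0, 2.5, 1.0, 42.0, 2.0]))
w += 0.02 * np.random.default_rng(1).standard_normal(t.size)
print(decompose_waveform(t, w, peak_guesses=[19.0, 43.0]))
```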

Relevance: 10.00%

Abstract:

Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. The core features identified from the literature are then further refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development prior to statistical confirmation with a larger sample.

Relevance: 10.00%

Abstract:

The question of what explains variation in expenditure on Active Labour Market Programs (ALMPs) has attracted significant scholarship in recent years. Significant insights have been gained with respect to the role of employers, unions and dual labour markets, openness, and partisanship. However, there remain significant disagreements about key explanatory variables such as the role of unions or the impact of partisanship. Qualitative studies have provided both good conceptual reasons and historical evidence that different ALMPs are driven by different dynamics. There is little reason to believe that vastly different programs, such as training and employment subsidies, are driven by similar structural, interest-group or indeed partisan dynamics. The question is therefore whether different ALMPs correlate in the same way with the key explanatory variables identified in the literature. Using regression analysis, this paper shows that these explanatory variables relate differently to distinct ALMPs. This refinement adds significant analytical value and shows that the disagreements are at least partly due to a dependent-variable problem of 'over-aggregation'.

Relevance: 10.00%

Abstract:

An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) that the parameter values identified should be adopted as default values in WRF.

Relevance: 10.00%

Abstract:

We present an efficient method of combining wide-angle neutron scattering data with detailed atomistic models, allowing us to perform a quantitative and qualitative mapping of the organisation of the chain conformation in both glass and liquid phases. The structural refinement method presented in this work is based on the exploitation of the intrachain features of the diffraction pattern and its intimate linkage with atomistic models through the use of internal coordinates for bond lengths, valence angles and torsion rotations. Atomic connectivity is defined through these coordinates, which are in turn assigned by pre-defined probability distributions, allowing the models in question to be built stochastically. Incremental variation of these coordinates allows for the construction of models that minimise the differences between the observed and calculated structure factors. We present a series of neutron scattering data for 1,2-polybutadiene over the temperature range 120-400 K. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 Å and 1.35 Å, respectively. Valence angles of the backbone were found to be 112° and the torsion distributions are characterised by five rotational states, a three-fold trans-skew± for the backbone and gauche± for the vinyl group. Rotational states of the vinyl group were found to be equally populated, indicating a largely atactic chain. The two backbone torsion angles exhibit different temperature dependences of their trans populations, with one of them adopting an almost all-trans sequence. Consequently, the resulting configuration leads to a rather persistent chain, as indicated by the characteristic ratio extrapolated from the model. We compare our results with theoretical predictions, computer simulations, RIS models and previously reported experimental results.
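As an illustrative sketch only (not the refinement code described above), the following builds a backbone chain stochastically from internal coordinates, using the reported backbone bond length (1.54 Å) and valence angle (112°) and a simple three-state trans/skew± torsion distribution; the torsion populations are invented, and the incremental fitting against the measured structure factor is omitted.

```python
import numpy as np

def place_atom(a, b, c, bond, angle, torsion):
    """Place the next atom from the three previous ones and the internal
    coordinates: bond length to c, valence angle at c, torsion about b-c
    (angles in radians). Standard internal-to-Cartesian construction."""
    bc = c - b
    bc /= np.linalg.norm(bc)
    n = np.cross(b - a, bc)
    n /= np.linalg.norm(n)
    m = np.cross(n, bc)
    d = bond * np.array([-np.cos(angle),
                          np.sin(angle) * np.cos(torsion),
                          np.sin(angle) * np.sin(torsion)])
    return c + d[0] * bc + d[1] * m + d[2] * n

def build_chain(n_atoms, rng, bond=1.54, angle_deg=112.0,
                trans_fraction=0.7, skew_deg=120.0):
    """Stochastic chain construction: fixed bond length and valence angle,
    torsions drawn from a three-state trans/skew+/skew- distribution
    (populations here are arbitrary illustrative values)."""
    theta = np.deg2rad(angle_deg)
    coords = [np.array([0.0, 0.0, 0.0]),
              np.array([bond, 0.0, 0.0]),
              np.array([bond * (1.0 - np.cos(theta)), bond * np.sin(theta), 0.0])]
    states = np.array([180.0, skew_deg, -skew_deg])
    probs = np.array([trans_fraction, (1 - trans_fraction) / 2, (1 - trans_fraction) / 2])
    for _ in range(n_atoms - 3):
        phi = np.deg2rad(rng.choice(states, p=probs))
        coords.append(place_atom(coords[-3], coords[-2], coords[-1], bond, theta, phi))
    return np.array(coords)

chain = build_chain(200, np.random.default_rng(0))
print(chain.shape, np.linalg.norm(chain[-1] - chain[0]))   # end-to-end distance
```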

Relevance: 10.00%

Abstract:

Tremendous progress in plant proteomics, driven by mass spectrometry (MS) techniques, has been made since 2000, when few proteomics reports had been published and plant proteomics was in its infancy. These achievements include the refinement of existing techniques and the search for new techniques to address food security, safety, and health issues. It is projected that by 2050 the world's population will reach 9–12 billion people, demanding a 34–70% increase in food production (FAO, 2009). Providing food for such a demand in a sustainable and environmentally responsible manner, without threatening natural resources, requires that agricultural production increase significantly and that postharvest handling and food manufacturing systems become more efficient, with lower energy expenditure, fewer postharvest losses, less waste generation, and food with a longer shelf life. There is also a need for alternative, plant-based protein sources to complement animal-based ones and meet the increase in protein demand by 2050. Plant biology therefore has a critical role to play as a science capable of addressing such challenges. In this review, we discuss proteomics, especially MS, as a platform that has been used in plant biology research over the past 10 years and has the potential to expedite our understanding of plant biology for human benefit. The increasing application of proteomics technologies to food security, analysis, and safety is emphasized. We are aware, however, that no single approach or technology can address global food issues on its own. Proteomics-generated information and resources must be integrated and correlated with other omics-based approaches, information, and conventional programs to ensure sufficient food and resources for human development now and in the future.

Relevance: 10.00%

Abstract:

Purpose – The purpose of this paper is to demonstrate analytically how entrepreneurial action as learning relating to diversifying into technical clothing – i.e. a high-value manufacturing sector – can take place. This is particularly relevant to recent discussion and debate in academic and policy-making circles concerning the survival of the clothing manufacture industry in developed industrialised countries. Design/methodology/approach – Using situated learning theory (SLT) as the major analytical lens, this case study examines an episode of entrepreneurial action relating to diversification into a high-value manufacturing sector. It is considered on instrumentality grounds, revealing wider tendencies in the management of knowledge and capabilities requisite for effective entrepreneurial action of this kind. Findings – Boundary events, brokers, boundary objects, membership structures and inclusive participation that addresses power asymmetries are found to be crucial organisational design elements, enabling the development of inter- and intracommunal capacities. These together constitute a dynamic learning capability, which underpins entrepreneurial action, such as diversification into high-value manufacturing sectors. Originality/value – Through a refinement of SLT in the context of entrepreneurial action, the paper contributes to an advancement of a substantive theory of managing technological knowledge and capabilities for effective diversification into high-value manufacturing sectors.

Relevance: 10.00%

Abstract:

We present a refined parametric model for forecasting electricity demand which performed particularly well in the recent Global Energy Forecasting Competition (GEFCom 2012). We begin by motivating and presenting a simple parametric model, treating the electricity demand as a function of the temperature and the date. We then set out a series of refinements of the model, explaining the rationale for each and using the competition scores to demonstrate that each successive refinement step increases the accuracy of the model's predictions. These refinements include combining models from multiple weather stations, removing outliers from the historical data, and special treatment of public holidays.
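A hedged sketch (not the competition model) of what a simple parametric demand model of this kind might look like: a polynomial temperature response plus additive day-of-week effects, fitted by ordinary least squares; the functional form and the synthetic data are assumptions.

```python
import numpy as np

def design_matrix(temp, day_of_week):
    """Cubic polynomial in temperature plus one-hot day-of-week indicators."""
    poly = np.vander(np.asarray(temp, dtype=float), 4)   # columns T^3, T^2, T, 1
    dow = np.eye(7)[np.asarray(day_of_week)]             # one-hot day-of-week
    return np.hstack([poly, dow[:, 1:]])                 # drop one column to avoid collinearity

def fit(temp, day_of_week, demand):
    X = design_matrix(temp, day_of_week)
    coef, *_ = np.linalg.lstsq(X, demand, rcond=None)
    return coef

def predict(coef, temp, day_of_week):
    return design_matrix(temp, day_of_week) @ coef

# Synthetic example: demand rises on cold days and falls at the weekend.
rng = np.random.default_rng(0)
temp = rng.uniform(-5.0, 30.0, 730)
dow = np.arange(730) % 7
demand = 50.0 - 1.2 * temp + 0.03 * temp**2 + 5.0 * (dow < 5) + rng.normal(0.0, 1.0, 730)
coef = fit(temp, dow, demand)
print("in-sample MAE:", np.abs(predict(coef, temp, dow) - demand).mean())
```

Refinements of the kind described above would then enter as, for example, per-station models whose forecasts are combined, an outlier filter applied to the historical demand before fitting, and separate indicator terms for public holidays.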

Relevance: 10.00%

Abstract:

Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
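As a hedged illustration of the record-reconstruction idea (not SDSM-DC itself), here is a minimal sketch that calibrates a linear predictor-predictand relationship on observed days and uses it, together with resampled residuals, to in-fill the missing days; the linear form, residual resampling and synthetic data are assumptions.

```python
import numpy as np

def infill_missing(predictors, predictand, seed=42):
    """Calibrate a linear predictor-predictand relationship on days where the
    station record exists (non-NaN), then reconstitute missing days from the
    fitted relationship plus resampled residual noise."""
    y = np.asarray(predictand, dtype=float)
    X = np.column_stack([np.ones(len(y)), np.asarray(predictors, dtype=float)])
    observed = ~np.isnan(y)

    coef, *_ = np.linalg.lstsq(X[observed], y[observed], rcond=None)
    residuals = y[observed] - X[observed] @ coef

    rng = np.random.default_rng(seed)
    filled = y.copy()
    missing = ~observed
    filled[missing] = X[missing] @ coef + rng.choice(residuals, size=missing.sum())
    return filled

# Synthetic example: a daily temperature series with a 10% gap, driven by two predictors.
rng = np.random.default_rng(0)
predictors = rng.normal(size=(3650, 2))
truth = 12.0 + predictors @ np.array([3.0, -1.5]) + rng.normal(0.0, 1.0, 3650)
degraded = truth.copy()
degraded[rng.uniform(size=3650) < 0.1] = np.nan

missing = np.isnan(degraded)
filled = infill_missing(predictors, degraded)
print("MAE on in-filled days:", np.abs(filled[missing] - truth[missing]).mean())
```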

Relevance: 10.00%

Abstract:

This paper addresses the economics of Enhanced Landfill Mining (ELFM) both from a private point of view and from a society perspective. The private potential is assessed using a case study for which an investment model is developed to identify the impact of a broad range of parameters on the profitability of ELFM. We found that, in particular, variations in Waste-to-Energy (WtE) parameters (efficiency, electricity price, CO2 price, investment and operational costs) and in ELFM support explain the variation in economic profitability, measured by the Internal Rate of Return. To move beyond site-specific parameters, we also evaluated the regional ELFM potential for the densely populated and industrial region of Flanders (north of Belgium). The total number of potential ELFM sites was estimated using a 5-step procedure, and a simulation tool was developed to trade off private costs and benefits. The analysis shows that there is substantial economic potential for ELFM projects at the wider regional level. Furthermore, this paper also reviews the costs and benefits from a broader perspective. The carbon footprint of the case study was mapped in order to assess the project's net impact in terms of greenhouse gas emissions. The impacts of nature restoration, soil remediation, resource scarcity and reduced import dependence were also valued so that they can be used in future social cost-benefit analysis. Given the complex trade-off between the economic, social and environmental issues of ELFM projects, we conclude that further refinement of the methodological framework and the development of integrated decision tools supporting private and public actors are necessary.
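As a hedged illustration of the profitability measure used in the private assessment (the cash-flow numbers are invented and not taken from the case study), a minimal Internal Rate of Return calculation:

```python
import numpy as np

def npv(rate, cashflows):
    """Net present value of a yearly cash-flow series (year 0 first)."""
    years = np.arange(len(cashflows))
    return float(np.sum(np.asarray(cashflows) / (1.0 + rate) ** years))

def irr(cashflows, lo=-0.9, hi=1.0, tol=1e-8):
    """Internal Rate of Return by bisection: the discount rate at which NPV = 0."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0.0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical ELFM-style cash flows (million EUR): an up-front WtE investment,
# then annual net revenue from electricity sales, material recovery and support
# schemes minus operating costs, over a 20-year project horizon.
cashflows = [-150.0] + [18.0] * 20
print(f"IRR = {irr(cashflows):.1%}")
```

Varying the inputs behind such a cash-flow series (WtE efficiency, electricity and CO2 prices, investment and operating costs, support levels) and recomputing the IRR is, in spirit, the sensitivity analysis described above.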