196 results for Query Refinement
Abstract:
In the European Union, first-tier assessment of the long-term risk to birds and mammals from pesticides is based on calculation of a deterministic long-term toxicity/exposure ratio (TERlt). The ratio is developed from generic herbivores and insectivores and applied to all species. This paper describes two case studies that implement proposed improvements to the way long-term risk is assessed. These refined methods require calculation of a TER for each of five identified phases of reproduction (phase-specific TERs) and use of adjusted No Observed Effect Levels (NOELs) to incorporate variation in species sensitivity to pesticides. They also involve progressive refinement of the exposure estimate so that it applies to particular species, rather than generic indicators, and relates spraying date to onset of reproduction. The effect of using these new methods on the assessment of risk is described. Individual refinements did not alter the calculated TER values in a way that was predictable or consistent across the two case studies. However, use of adjusted NOELs always reduced TERs, and relating spraying date to onset of reproduction increased most phase-specific TERs. The case studies suggested that the current first-tier TERlt assessment may underestimate risk in some circumstances and that phase-specific assessments can help identify appropriate risk-reduction measures. The way in which deterministic phase-specific assessments can currently be implemented to enhance first-tier assessment is outlined.
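As a rough illustration of the arithmetic involved, the sketch below computes phase-specific TERs from an adjusted NOEL and per-phase exposure estimates. All phase names, the adjustment factor and the numbers are hypothetical placeholders, not values from the case studies.

```python
# Hypothetical sketch: phase-specific TER = adjusted NOEL / estimated exposure.
# The adjustment factor, phase names and exposure values are illustrative only.

def phase_specific_ters(noel, adjustment_factor, phase_exposures):
    """Return a TER for each reproductive phase (all doses in mg/kg bw/day)."""
    adjusted_noel = noel / adjustment_factor  # adjustment lowers the NOEL, so TERs fall
    return {phase: adjusted_noel / exposure
            for phase, exposure in phase_exposures.items()}

# Illustrative exposure estimates for five phases of reproduction
exposures = {"pair formation": 2.1, "egg laying": 3.4, "incubation": 2.8,
             "chick rearing": 3.9, "fledging": 1.7}
print(phase_specific_ters(noel=30.0, adjustment_factor=3.0, phase_exposures=exposures))
```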
Abstract:
The internal variability and coupling between the stratosphere and troposphere in CCMVal‐2 chemistry‐climate models are evaluated through analysis of the annular mode patterns of variability. Computation of the annular modes in long data sets with secular trends requires refinement of the standard definition of the annular mode, and a more robust procedure that allows for slowly varying trends is established and verified. The spatial and temporal structure of the models’ annular modes is then compared with that of reanalyses. As a whole, the models capture the key features of observed intraseasonal variability, including the sharp vertical gradients in structure between stratosphere and troposphere, the asymmetries in the seasonal cycle between the Northern and Southern hemispheres, and the coupling between the polar stratospheric vortices and tropospheric midlatitude jets. It is also found that the annular mode variability changes little in time throughout simulations of the 21st century. There are, however, both common biases and significant differences in performance in the models. In the troposphere, the annular mode in models is generally too persistent, particularly in the Southern Hemisphere summer, a bias similar to that found in CMIP3 coupled climate models. In the stratosphere, the periods of peak variance and coupling with the troposphere are delayed by about a month in both hemispheres. The relationship between increased variability of the stratosphere and increased persistence in the troposphere suggests that some tropospheric biases may be related to stratospheric biases and that a well‐simulated stratosphere can improve simulation of tropospheric intraseasonal variability.
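As a minimal sketch of the kind of computation described, the following derives an annular-mode pattern and index as the leading EOF/principal component of zonal-mean geopotential height anomalies, removing a slowly varying polynomial trend at each latitude rather than a fixed climatology. The data layout, latitude range and detrending choice are assumptions for illustration, not the paper's exact procedure.

```python
# Sketch: annular mode as the leading EOF of trend-removed zonal-mean anomalies.
import numpy as np

def annular_mode(z, order=4):
    """z: array of shape (time, lat), zonal-mean geopotential height at one level."""
    t = np.arange(z.shape[0], dtype=float)
    # Remove a slowly varying polynomial trend at each latitude
    coeffs = np.polynomial.polynomial.polyfit(t, z, order)      # (order+1, nlat)
    anom = z - np.polynomial.polynomial.polyval(t, coeffs).T    # (time, nlat)
    # Leading EOF via SVD of the area-weighted anomaly matrix
    lat = np.linspace(20.0, 90.0, z.shape[1])                   # assumed latitude grid
    w = np.sqrt(np.cos(np.deg2rad(lat)))
    u, s, vt = np.linalg.svd(anom * w, full_matrices=False)
    pattern = vt[0] / w                       # spatial structure of the annular mode
    index = u[:, 0] * s[0]                    # principal-component time series
    return pattern, (index - index.mean()) / index.std()
```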
Abstract:
Full-waveform laser scanning data acquired with a Riegl LMS-Q560 instrument were used to classify an orange orchard into orange trees, grass and ground using waveform parameters alone. Gaussian decomposition was performed on these data, captured during the National Airborne Field Experiment in November 2006, using a custom peak-detection procedure and a trust-region-reflective algorithm for fitting Gaussian functions. Calibration was carried out using waveforms returned from a road surface, and the backscattering coefficient c was derived for every waveform peak. The processed data were then analysed according to the number of returns detected within each waveform and classified into three classes based on pulse width and c. For single-peak waveforms the scatterplot of c versus pulse width was used to distinguish between ground, grass and orange trees. In the case of multiple returns, the relationship between first (or first plus middle) and last return c values was used to separate ground from other targets. Refinement of this classification, and further sub-classification into grass and orange trees, was performed using the c versus pulse width scatterplots of last returns. In all cases the separation was carried out using a decision tree with empirical relationships between the waveform parameters. Ground points were successfully separated from orange tree points. The most difficult class to separate and verify was grass, but those points in general corresponded well with the grass areas identified in the aerial photography. The overall accuracy reached 91%, using photography and relative elevation as ground truth. The overall accuracy for two classes, orange tree and a combined class of grass and ground, reached 95%. Finally, the backscattering coefficient c of single-peak waveforms was also used to derive reflectance values for the three classes. The reflectance of the orange tree class (0.31) and ground class (0.60) are consistent with published values at the wavelength of the Riegl scanner (1550 nm). The grass class reflectance (0.46) falls in between the other two classes, as might be expected, since this class mixes the reflectance properties of vegetation and ground.
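The decision tree itself amounts to a set of empirical thresholds in the c-versus-pulse-width plane; a schematic sketch is given below. The threshold values are invented placeholders, not the empirical relationships fitted in the study.

```python
# Schematic rule-based classification of waveform returns; the numeric
# thresholds are hypothetical stand-ins for the paper's empirical relations.

def classify_return(pulse_width_ns, c, n_returns, is_last_return):
    if n_returns == 1:
        # Single-peak waveforms: separate ground/grass/tree in the c vs width plane
        if c > 0.5 and pulse_width_ns < 4.5:
            return "ground"
        return "grass" if pulse_width_ns < 6.0 else "orange tree"
    # Multiple returns: ground shows up as a strong, narrow last return
    if is_last_return and c > 0.5 and pulse_width_ns < 4.5:
        return "ground"
    return "grass" if pulse_width_ns < 6.0 else "orange tree"

print(classify_return(pulse_width_ns=4.0, c=0.62, n_returns=1, is_last_return=True))
```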
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Core features identified from the literature are then refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.
Abstract:
The question of what explains variation in expenditures on Active Labour Market Programs (ALMPs) has attracted significant scholarship in recent years. Significant insights have been gained with respect to the role of employers, unions and dual labour markets, openness, and partisanship. However, there remain significant disagreements with respect to key explanatory variables such as the role of unions or the impact of partisanship. Qualitative studies have shown that there are both good conceptual reasons and historical evidence that different ALMPs are driven by different dynamics. There is little reason to believe that vastly different programs, such as training and employment subsidies, are driven by similar structural, interest-group or indeed partisan dynamics. The question is therefore whether distinct ALMPs correlate in the same way with the key explanatory variables identified in the literature. Using regression analysis, this paper shows that the explanatory variables identified by the literature relate differently to distinct ALMPs. This refinement adds significant analytical value and shows that the disagreements are at least partly due to a dependent-variable problem of 'over-aggregation'.
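To make the over-aggregation point concrete, a disaggregated analysis of this kind can be sketched as separate regressions of total ALMP spending and of each individual program on the same covariates, comparing how the coefficients differ. The dataset, column names and covariates below are hypothetical; the paper's actual specification may differ.

```python
# Sketch: same covariates, one regression per (dis)aggregated outcome.
# All file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("almp_panel.csv")  # hypothetical country-year panel
covariates = "union_density + left_cabinet_share + trade_openness + unemployment"

for outcome in ["almp_total", "training", "employment_subsidies", "job_creation"]:
    model = smf.ols(f"{outcome} ~ {covariates}", data=df).fit()
    # If the coefficients diverge across programs, aggregation masks the dynamics
    print(outcome, model.params.round(3).to_dict())
```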
Abstract:
An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on the representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here, the Urban Zones to characterize Energy partitioning (UZE) are used as the basis for assigning default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling applications using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) the parameter values identified should be adopted as default values in WRF.
Abstract:
In the context of national and global trends of producing Beckett’s work, this essay will investigate recent productions of Beckett’s drama which originate in Ireland and tour internationally, examining how these relate to the concept of national identity and its marketability, as well as the conceptual and material spaces provided by large-scale festival events. In the last few months, Pan Pan has toured its production of All that Fall from Dublin to the Beckett festival in Enniskillen to New York’s BAM. The Gate Theatre, always a powerhouse of Beckett productions, continues its revival of Barry McGovern’s adaptation of Watt; after the Edinburgh festival, the show will play London’s Barbican in March 2013. While originating in Ireland, these productions – those of the Gate in particular – have an international, as well as domestic, appeal. Examining these and forthcoming Gate productions, I query to what extent a theatre company’s cultural origins and international profile may create a perceived sense of authenticity or definitiveness among critical discourses at ‘home’ and abroad, and how such markers of identity are utilized by the marketing strategies which surround these productions. This article will interrogate the potential convergence of the globalized branding of both Beckett’s work and Irish identity, drawing on the writings of Bourdieu to elucidate how identity may be converted into economic and cultural capital, as well as examining the role that the festival event plays in this process.
Abstract:
We present an efficient method of combining wide-angle neutron scattering data with detailed atomistic models, allowing us to perform a quantitative and qualitative mapping of the organisation of the chain conformation in both glass and liquid phases. The structural refinement method presented in this work is based on the exploitation of the intrachain features of the diffraction pattern and its intimate linkage with atomistic models through the use of internal coordinates for bond lengths, valence angles and torsion rotations. Atomic connectivity is defined through these coordinates, which are in turn assigned by pre-defined probability distributions, thus allowing the models in question to be built stochastically. Incremental variation of these coordinates allows for the construction of models that minimise the differences between the observed and calculated structure factors. We present a series of neutron scattering data of 1,2-polybutadiene over the temperature range 120-400 K. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 Å and 1.35 Å respectively. Valence angles of the backbone were found to be 112°, and the torsion distributions are characterised by five rotational states: a three-fold trans-skew± for the backbone and gauche± for the vinyl group. Rotational states of the vinyl group were found to be equally populated, indicating a largely atactic chain. The two backbone torsion angles exhibit different behaviour of their trans populations with respect to temperature, with one of them adopting an almost all-trans sequence. Consequently, the resulting configuration leads to a rather persistent chain, as indicated by the value of the characteristic ratio extrapolated from the model. We compare our results with theoretical predictions, computer simulations, RIS models and previously reported experimental results.
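The refinement loop can be sketched as follows: torsion angles (and, analogously, bond lengths and valence angles) are varied incrementally, and a change is retained only if it reduces the misfit between calculated and observed structure factors. The structure-factor routine and all parameters here are placeholders; this illustrates the scheme, not the authors' code.

```python
# Sketch of greedy structural refinement over internal coordinates.
import numpy as np

rng = np.random.default_rng(0)

def misfit(model_sq, observed_sq):
    """Sum-of-squares difference between calculated and observed S(Q)."""
    return float(np.sum((model_sq - observed_sq) ** 2))

def refine(torsions, observed_sq, structure_factor, n_steps=10_000, step_deg=5.0):
    """Incrementally vary torsion angles, keeping changes that lower the misfit.

    `structure_factor` is a user-supplied (here hypothetical) function mapping
    internal coordinates to a calculated structure factor on the observed grid.
    """
    best = misfit(structure_factor(torsions), observed_sq)
    for _ in range(n_steps):
        trial = torsions.copy()
        i = rng.integers(len(trial))
        trial[i] += rng.normal(scale=step_deg)   # small incremental variation
        cost = misfit(structure_factor(trial), observed_sq)
        if cost < best:                          # greedy acceptance criterion
            torsions, best = trial, cost
    return torsions, best
```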
Abstract:
Tremendous progress in plant proteomics driven by mass spectrometry (MS) techniques has been made since 2000, when few proteomics reports were published and plant proteomics was in its infancy. These achievements include the refinement of existing techniques and the search for new techniques to address food security, safety, and health issues. It is projected that in 2050 the world’s population will reach 9–12 billion people, demanding a food production increase of 34–70% over today’s levels (FAO, 2009). Providing food for such demand in a sustainable and environmentally committed manner, without threatening natural resources, requires that agricultural production increase significantly and that postharvest handling and food manufacturing systems become more efficient, with lower energy expenditure, a decrease in postharvest losses, less waste generation and food with a longer shelf life. There is also a need to look for alternatives to animal-based protein sources (i.e., plant-based sources) to fulfill the increase in protein demand by 2050. Thus, plant biology has a critical role to play as a science capable of addressing such challenges. In this review, we discuss proteomics, especially MS, as a platform that has been utilized in plant biology research for the past 10 years and that has the potential to expedite the process of understanding plant biology for human benefit. The increasing application of proteomics technologies to food security, analysis, and safety is emphasized in this review. However, we are aware that no single approach or technology is capable of addressing the global food issues. Proteomics-generated information and resources must be integrated and correlated with other omics-based approaches, information, and conventional programs to ensure sufficient food and resources for human development now and in the future.
Abstract:
Purpose – The purpose of this paper is to demonstrate analytically how entrepreneurial action as learning, in relation to diversifying into technical clothing (i.e. a high-value manufacturing sector), can take place. This is particularly relevant to recent discussion and debate in academic and policy-making circles concerning the survival of clothing manufacture in developed industrialised countries. Design/methodology/approach – Using situated learning theory (SLT) as the major analytical lens, this case study examines an episode of entrepreneurial action relating to diversification into a high-value manufacturing sector. It is considered on instrumentality grounds, revealing wider tendencies in the management of knowledge and capabilities requisite for effective entrepreneurial action of this kind. Findings – Boundary events, brokers, boundary objects, membership structures and inclusive participation that addresses power asymmetries are found to be crucial organisational design elements, enabling the development of inter- and intra-communal capacities. Together these constitute a dynamic learning capability, which underpins entrepreneurial action such as diversification into high-value manufacturing sectors. Originality/value – Through a refinement of SLT in the context of entrepreneurial action, the paper contributes to the advancement of a substantive theory of managing technological knowledge and capabilities for effective diversification into high-value manufacturing sectors.
Abstract:
We present a refined parametric model for forecasting electricity demand which performed particularly well in the recent Global Energy Forecasting Competition (GEFCom 2012). We begin by motivating and presenting a simple parametric model, treating the electricity demand as a function of the temperature and the date. We then set out a series of refinements of the model, explaining the rationale for each, and using the competition scores to demonstrate that each successive refinement step increases the accuracy of the model’s predictions. These refinements include combining models from multiple weather stations, removing outliers from the historical data, and special treatments of public holidays.
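A minimal version of such a parametric starting model might look like the sketch below: a linear least-squares fit of demand on temperature terms and simple calendar features. The specific features are illustrative assumptions, not the competition model.

```python
# Sketch: demand as a parametric function of temperature and calendar features.
import numpy as np
import pandas as pd

def design_matrix(df):
    """df has columns 'temp' (deg C) and 'demand', with a DatetimeIndex."""
    doy = df.index.dayofyear.values
    return np.column_stack([
        np.ones(len(df)),
        df["temp"].values,
        df["temp"].values ** 2,                   # heating/cooling curvature
        np.sin(2 * np.pi * doy / 365.25),         # annual cycle
        np.cos(2 * np.pi * doy / 365.25),
        (df.index.dayofweek >= 5).astype(float),  # weekend indicator
    ])

def fit(df):
    beta, *_ = np.linalg.lstsq(design_matrix(df), df["demand"].values, rcond=None)
    return beta

def predict(df, beta):
    return design_matrix(df) @ beta
```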
Abstract:
Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
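The in-filling idea in the first example can be illustrated with a generic calibrated regression: fit a predictor-predictand relationship on days where the station record exists, then reconstruct the missing days from the large-scale predictors. This sketch uses ordinary linear regression and hypothetical column names; it is not SDSM-DC's actual algorithm.

```python
# Sketch: reconstruct missing predictand values from calibrated predictors.
import numpy as np
import pandas as pd

def infill(df, predictand="tmax",
           predictors=("msl_pressure", "geopotential_500", "humidity")):
    """Fill gaps in df[predictand] via linear regression on df[predictors]."""
    X = np.column_stack([np.ones(len(df))] + [df[p].values for p in predictors])
    observed = df[predictand].notna().values
    # Calibrate on days with data
    beta, *_ = np.linalg.lstsq(X[observed], df.loc[observed, predictand].values,
                               rcond=None)
    filled = df[predictand].copy()
    filled[~observed] = X[~observed] @ beta   # reconstruct only the missing days
    return filled
```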
Abstract:
This paper addresses the economics of Enhanced Landfill Mining (ELFM) from both a private and a societal perspective. The private potential is assessed using a case study for which an investment model is developed to identify the impact of a broad range of parameters on the profitability of ELFM. We found that variations in Waste-to-Energy (WtE) parameters (efficiency, electricity price, CO2 price, and WtE investment and operational costs) and in ELFM support in particular explain the variation in economic profitability, measured by the Internal Rate of Return (IRR). To overcome the site-specific nature of these parameters, we also evaluated the regional ELFM potential for the densely populated and industrial region of Flanders (north of Belgium). The total number of potential ELFM sites was estimated using a 5-step procedure, and a simulation tool was developed to trade off private costs and benefits. The analysis shows that there is a substantial economic potential for ELFM projects at the wider regional level. Furthermore, this paper also reviews the costs and benefits from a broader perspective. The carbon footprint of the case study was mapped in order to assess the project’s net impact in terms of greenhouse gas emissions. The impacts of nature restoration, soil remediation, resource scarcity and reduced import dependence were also valued so that they can be used in future social cost-benefit analysis. Given the complex trade-offs between the economic, social and environmental issues of ELFM projects, we conclude that further refinement of the methodological framework and the development of integrated decision tools supporting private and public actors are necessary.
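The profitability measure used, the Internal Rate of Return, is the discount rate at which the net present value of the project's cash flows is zero; a minimal sketch with invented cash-flow numbers is shown below.

```python
# Sketch: IRR of an ELFM-style cash-flow profile (up-front investment, then
# yearly net revenues). The numbers are placeholders; a simple bisection
# solver finds the rate where NPV = 0.

def npv(rate, cashflows):
    """Net present value of cash flows indexed by year (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-8):
    """Bisection on NPV(rate) = 0; assumes one sign change over [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

cashflows = [-120.0] + [14.5] * 20   # M EUR: investment, then 20 years of net benefits
print(f"IRR = {irr(cashflows):.2%}")
```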
Abstract:
The purity and structural stability of the high thermoelectric performance Cu12Sb4S13 and Cu10.4Ni1.6Sb4S13 tetrahedrite phases, synthesized by solid–liquid–vapor reaction and Spark Plasma Sintering, were studied at high temperature by Rietveld refinement using high-resolution X-ray powder diffraction data, DSC/TG measurements and high-resolution transmission electron microscopy. In a complementary study, the crystal structure of Cu10.5Ni1.5Sb4S13 as a function of temperature was investigated by powder neutron diffraction. The temperature dependence of the structural stability of ternary Cu12Sb4S13 is markedly different from that of the nickel-substituted phases, providing clear evidence for the significant and beneficial role of nickel substitution on both sample purity and stability of the tetrahedrite phase. Moreover, kinetic effects on the phase stability/decomposition have been identified and discussed in order to determine the maximum operating temperature for thermoelectric applications. The thermoelectric properties of these compounds have been determined for high-density samples (>98%) prepared by Spark Plasma Sintering and can therefore be used as reference values for tetrahedrite samples. A maximum ZT of 0.8 was found for Cu10.4Ni1.6Sb4S13 at 700 K.
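For reference, the dimensionless figure of merit ZT quoted here has its standard definition (general to the thermoelectrics literature, not specific to this paper) in terms of the Seebeck coefficient S, electrical conductivity σ (or resistivity ρ), total thermal conductivity κ and absolute temperature T:

```latex
ZT = \frac{S^{2}\sigma T}{\kappa} = \frac{S^{2} T}{\rho\,\kappa}
```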
Abstract:
Globalization, either directly or indirectly (e.g. through structural adjustment reforms), has called for profound changes in the previously existing institutional order. Some changes adversely impacted the production and market environment of many coffee producers in developing countries, resulting in more risky and less remunerative coffee transactions. This paper focuses on the customization of a tropical commodity, fair-trade coffee, as an approach to mitigating the effects of worsened market conditions for small-scale coffee producers in less developed countries. Fair-trade labeling is viewed as a form of “de-commodification” of coffee through product differentiation on ethical grounds. This is significant not only as a solution to the market failure caused by pervasive information asymmetries along the supply chain, but also as a means of revitalizing the agricultural-commodity-based trade of less developed countries (LDCs) that has been languishing under globalization. More specifically, fair-trade is an example of how the same strategy adopted by producers/processors in developed countries (i.e. the sequence of product differentiation, institutional certification and advertisement) can be used by LDC producers to increase the reputation content of their outputs by transforming them from mere commodities into “de-commodified” (i.e. customized and more reputed) goods. The resulting segmentation of the world coffee market makes it possible to meet the demand of consumers with a preference for this “(ethically) customized” coffee and to transfer a share of the accruing economic rents backward to fair-trade coffee producers in LDCs. It should, however, be stressed that this outcome cannot be taken for granted, since investments are needed to promote the required institutional innovations. In Italy, fair-trade coffee (FTC) is a niche market with very few private brands selling this product. However, an increase in the FTC market share could be a big commercial opportunity for farmers in LDCs and other economic agents involved along the international coffee chain. Hence, this research explores consumers’ knowledge of labels promoting quality products, coffee consumption habits, brand loyalty, willingness to pay, and market segmentation according to the heterogeneity of preferences for coffee products. The latter was assessed by developing a D-efficient design whose stimuli were refined and tested in two focus groups.