96 results for competence-based approach
Abstract:
A recent article in this journal challenged claims that a human rights framework should be applied to drug control. This article questions the author’s assertions and reframes them in the context of socio-legal drug scholarship, aiming to build on the discourse concerning human rights and drug use. It is submitted that a rights-based approach is a necessary, indeed obligatory, ethical and legal framework through which to address drug use and that international human rights law provides the proper scope for determining where interferences with individual human rights might be justified on certain, limited grounds.
Abstract:
For more than half a century, emotion researchers have attempted to establish the dimensional space that most economically accounts for similarities and differences in emotional experience. Today, many researchers focus exclusively on two-dimensional models involving valence and arousal. Adopting a theoretically based approach, we show for three languages that four dimensions are needed to satisfactorily represent similarities and differences in the meaning of emotion words. In order of importance, these dimensions are evaluation-pleasantness, potency-control, activation-arousal, and unpredictability. They were identified on the basis of the applicability of 144 features representing the six components of emotions: (a) appraisals of events, (b) psychophysiological changes, (c) motor expressions, (d) action tendencies, (e) subjective experiences, and (f) emotion regulation.
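A minimal sketch of how such a dimensional space might be recovered, assuming nothing about the authors' actual pipeline: principal component analysis (via the SVD) of a hypothetical words-by-features applicability matrix, with the variance carried by the first four components as the quantity of interest.

```python
# Illustrative only: random stand-in ratings, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical applicability ratings: 24 emotion words x 144 features.
ratings = rng.normal(size=(24, 144))

# Centre each feature column, then decompose.
X = ratings - ratings.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

var_explained = s**2 / np.sum(s**2)
print("variance in first 4 components:", var_explained[:4].sum())

# Word coordinates on the first four dimensions (cf. evaluation, potency,
# arousal and unpredictability in the abstract).
coords = U[:, :4] * s[:4]
```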
Abstract:
The article explores how fair trade and associated private agri-food standards are incorporated into public procurement in Europe. Procurement law is underpinned by principles of equity, non-discrimination and transparency; one consequence is that legal obstacles exist to fair trade being privileged within procurement practice. These obstacles have pragmatic dimensions, concerning whether and how procurement can be used to fulfil wider social policy objectives or to incorporate private standards; they also bring to the fore underlying issues of value. Taking an agency-based approach and incorporating the concept of governability, empirical evidence demonstrates the role played by different actors in negotiating fair trade’s passage into procurement through pre-empting and managing legal risk. This process exposes contestations that arise when contrasting values come together within sustainable procurement. This examination of fair trade in public procurement helps reveal how practices and knowledge on ethical consumption enter into a new governance arena within the global agri-food system.
Abstract:
In this article, I study the impacts of a specific incentives-based approach to safety regulation, namely the control of quality through sampling and threatening penalties when quality fails to meet some minimum standard. The welfare-improving impacts of this type of scheme seem high and are cogently illustrated in a recent contribution by Segerson, which stimulated many of the ideas in this paper. For this reason, the reader is referred to Segerson for a background on some of the motivation, and throughout, I make an effort to indicate differences between the two approaches. There are three major differences. First, I dispense with the calculus as much as possible, seeking readily interpreted, closed-form solutions to illustrate the main ideas. Second, (strategically optimal, symmetric) Nash equilibria are the mainstay of each of the current models. Third, in the uncertain-quality-provision equilibria, each of the Nash suppliers chooses the level of the lower bound for quality as a control and offers a draw from its (private) distribution in a contribution to the (public) pool of quality.
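An illustrative sketch, deliberately simpler than the paper's symmetric Nash model: a single supplier chooses the lower bound L of a uniform quality distribution, is sampled with probability p, and pays a fine F when sampled quality falls below the standard s. The parameter values, cost function, and revenue rule are all hypothetical.

```python
import numpy as np

p, F, s = 0.3, 5.0, 0.6            # sampling rate, fine, standard (assumed)
cost = lambda L: 2.0 * L**2        # hypothetical cost of raising the bound

def expected_profit(L):
    # Quality q ~ Uniform[L, 1]; probability a sampled draw fails standard s.
    pr_fail = max(0.0, (s - L) / (1.0 - L))
    mean_q = (L + 1.0) / 2.0       # expected quality, taken here as revenue
    return mean_q - cost(L) - p * F * pr_fail

grid = np.linspace(0.0, 0.99, 1000)
best = grid[int(np.argmax([expected_profit(L) for L in grid]))]
print(f"profit-maximising lower bound on quality: {best:.2f}")
```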
Abstract:
As a major mode of intraseasonal variability, which interacts with weather and climate systems on a near-global scale, the Madden–Julian Oscillation (MJO) is a crucial source of predictability for numerical weather prediction (NWP) models. Despite its global significance and comprehensive investigation, improvements in the representation of the MJO in an NWP context remain elusive. However, recent modifications to the model physics in the ECMWF model led to advances in the representation of atmospheric variability and the unprecedented propagation of the MJO signal through the entire integration period. In light of these recent advances, a set of hindcast experiments has been designed to assess the sensitivity of MJO simulation to the formulation of convection. Through the application of established MJO diagnostics, it is shown that the improvements in the representation of the MJO can be directly attributed to the modified convective parametrization. Furthermore, the improvements are attributed to the move from a moisture-convergent to a relative-humidity-dependent formulation for organized deep entrainment. It is concluded that, in order to understand the physical mechanisms through which a relative-humidity-dependent formulation for entrainment led to an improved simulation of the MJO, a more process-based approach should be taken. The application of process-based diagnostics to the hindcast experiments presented here will be the focus of Part II of this study.
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method's authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters News Corpus. The findings show that authors of Reuters news articles provide good keyphrases but that more often than not they do not provide any keyphrases.
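As a rough illustration of two of the five baselines, the sketch below scores terms in a toy corpus by term frequency weighted by inverse document frequency; the C-Value, NC-Value, and Synonym methods compared in the paper are not reproduced.

```python
import math

corpus = [
    "keyphrase extraction compares term frequency methods",
    "term frequency counts how often a term occurs",
    "inverse document frequency rewards rare terms",
]
docs = [doc.split() for doc in corpus]

def tf(term, doc):
    # Relative frequency of the term within one document.
    return doc.count(term) / len(doc)

def idf(term):
    # Log-scaled rarity of the term across the corpus.
    n = sum(1 for doc in docs if term in doc)
    return math.log(len(docs) / n) if n else 0.0

doc = docs[0]
scores = {t: tf(t, doc) * idf(t) for t in set(doc)}
for term, score in sorted(scores.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{term}: {score:.3f}")
```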
Abstract:
In this paper, various types of fault detection methods for fuel cells are compared, for example those that use a model-based approach, a data-driven approach, or a combination of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. Specifically, the application of classification methods to vectors of currents reconstructed by magnetic tomography, or directly to vectors of magnetic field measurements, is explored. Bases are simulated using the finite integration technique (FIT) and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem is a part of the classification problem on magnetic field measurements as well. This is independent of the particular working mode of the cell but influenced by the type of faulty behavior that is studied. The numerical results demonstrate the ill-posedness through the exponential decay of the singular values for three examples of fault classes.
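A hedged sketch of the discriminant step only: Fisher's linear discriminant separating simulated measurement vectors from two hypothetical fault classes. The FIT simulation and tomographic reconstruction are not reproduced; the small ridge term merely gestures at the regularisation the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical magnetic field measurement vectors: 50 samples per class,
# 20 sensors each; the faulty class is shifted in mean.
healthy = rng.normal(0.0, 1.0, size=(50, 20))
faulty = rng.normal(0.5, 1.0, size=(50, 20))

m0, m1 = healthy.mean(axis=0), faulty.mean(axis=0)
# Within-class scatter plus a ridge term to keep the solve well conditioned.
Sw = np.cov(healthy, rowvar=False) + np.cov(faulty, rowvar=False)
w = np.linalg.solve(Sw + 1e-3 * np.eye(20), m1 - m0)

# Project onto w and threshold at the midpoint between class means.
threshold = w @ (m0 + m1) / 2.0
pred = (np.vstack([healthy, faulty]) @ w) > threshold
labels = np.r_[np.zeros(50, dtype=bool), np.ones(50, dtype=bool)]
print("training accuracy:", (pred == labels).mean())
```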
Abstract:
We discuss several methods of calculating the DIS structure functions F2(x,Q2) based on BFKL-type small-x resummations. Taking into account new HERA data ranging down to small x and low Q2, the pure leading-order BFKL-based approach is excluded. Other methods based on high-energy factorization are closer to conventional renormalization group equations. Despite several difficulties and ambiguities in combining the renormalization group equations with small-x resummed terms, we find that a fit to the current data is hardly feasible, since the data in the low Q2 region are not as steep as the BFKL formalism predicts. Thus we conclude that deviations from the (successful) renormalization group approach towards summing up logarithms in 1/x are disfavoured by experiment.
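For orientation, the steep small-x rise implied by leading-order BFKL, which the abstract reports is steeper than the low-Q2 HERA data allow, takes the standard textbook form (not quoted from this paper):

```latex
\[
  F_2(x,Q^2) \;\sim\; x^{-\lambda},
  \qquad
  \lambda = \frac{12\ln 2}{\pi}\,\alpha_s \approx 0.5
  \quad\text{for }\alpha_s \approx 0.2 .
\]
```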
Abstract:
Spatial memory is important for locating objects in hierarchical data structures, such as desktop folders. There are, however, some contradictions in the literature concerning the effectiveness of 3D user interfaces when compared to their 2D counterparts. This paper uses a task-based approach in order to investigate the effectiveness of adding a third dimension to specific user tasks, i.e. the impact of depth on navigation in a 3D file manager. Results highlight issues and benefits of using 3D interfaces for visual and verbal tasks, and introduce the possible existence of a correlation between aptitude scores achieved on the Guilford-Zimmerman Orientation Survey and Electroencephalography-measured brainwave activity as participants search for targets of variable perceptual salience in 2D and 3D environments.
Abstract:
Using a water balance modelling framework, this paper analyses the effects of urban design on the water balance, with a focus on evapotranspiration and storm water. First, two quite different urban water balance models are compared: Aquacycle which has been calibrated for a suburban catchment in Canberra, Australia, and the single-source urban evapotranspiration-interception scheme (SUES), an energy-based approach with a biophysically advanced representation of interception and evapotranspiration. A fair agreement between the two modelled estimates of evapotranspiration was significantly improved by allowing the vegetation cover (leaf area index, LAI) to vary seasonally, demonstrating the potential of SUES to quantify the links between water sensitive urban design and microclimates and the advantage of comparing the two modelling approaches. The comparison also revealed where improvements to SUES are needed, chiefly through improved estimates of vegetation cover dynamics as input to SUES, and more rigorous parameterization of the surface resistance equations using local-scale suburban flux measurements. Second, Aquacycle is used to identify the impact of an array of water sensitive urban design features on the water balance terms. This analysis confirms the potential to passively control urban microclimate by suburban design features that maximize evapotranspiration, such as vegetated roofs. The subsequent effects on daily maximum air temperatures are estimated using an atmospheric boundary layer budget. Potential energy savings of about 2% in summer cooling are estimated from this analysis. This is a clear ‘return on investment’ of using water to maintain urban greenspace, whether as parks distributed throughout an urban area or individual gardens or vegetated roofs.
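An illustrative sketch, standing in for neither Aquacycle nor SUES: the bookkeeping identity P = ET + R + dS that any such water balance model must close, stepped over a few months of hypothetical fluxes.

```python
monthly = [
    # (precipitation, evapotranspiration, runoff) in mm; hypothetical values.
    (60.0, 35.0, 18.0),
    (45.0, 40.0, 10.0),
    (80.0, 30.0, 35.0),
]
storage = 100.0  # assumed initial soil and surface store, in mm

for p, et, r in monthly:
    storage += p - et - r  # closure of the balance: dS = P - ET - R
    print(f"P={p:5.1f}  ET={et:5.1f}  R={r:5.1f}  storage={storage:6.1f} mm")
```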
Abstract:
In any wide-area distributed system there is a need to communicate and interact with a range of networked devices and services, ranging from computer-based ones (CPU, memory and disk), to network components (hubs, routers, gateways) and specialised data sources (embedded devices, sensors, data-feeds). In order for the ensemble of underlying technologies to provide an environment suitable for virtual organisations to flourish, the resources that comprise the fabric of the Grid must be monitored in a seamless manner that abstracts away from the underlying complexity. Furthermore, as various competing Grid middleware offerings are released and evolve, an independent overarching monitoring service should act as a cornerstone that ties these systems together. GridRM is a standards-based approach that is independent of any given middleware and that can utilise legacy and emerging resource-monitoring technologies. The main objective of the project is to produce a standardised and extensible architecture that provides seamless mechanisms to interact with native monitoring agents across heterogeneous resources.
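A hypothetical sketch of the adapter pattern such a middleware-independent layer implies: one uniform interface in front of each native monitoring agent. All names below (ResourceMonitor, SnmpAgent, the metric keys) are illustrative and are not GridRM's actual API.

```python
from abc import ABC, abstractmethod

class ResourceMonitor(ABC):
    """Uniform view over heterogeneous native monitoring agents."""
    @abstractmethod
    def query(self, metric: str) -> float: ...

class SnmpAgent(ResourceMonitor):
    def query(self, metric: str) -> float:
        # Placeholder: a real driver would issue an SNMP GET here.
        return {"cpu.load": 0.42, "net.in": 1.2e6}.get(metric, float("nan"))

monitors: dict[str, ResourceMonitor] = {"router-1": SnmpAgent()}
print(monitors["router-1"].query("cpu.load"))
```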
Abstract:
We consider evaluating the UK Monetary Policy Committee's inflation density forecasts using probability integral transform goodness-of-fit tests. These tests evaluate the whole forecast density. We also consider whether the probabilities assigned to inflation being in certain ranges are well calibrated, where the ranges are chosen to be those of particular relevance to the MPC, given its remit of maintaining inflation rates in a band around its target rate per annum. Finally, we discuss the decision-based approach to forecast evaluation in relation to the MPC forecasts.
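A minimal sketch of the probability integral transform (PIT) check, run on simulated rather than MPC data: if each forecast density is correct, the PIT values z_t = F_t(y_t) are i.i.d. Uniform(0,1), which a Kolmogorov-Smirnov test can assess.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
T = 200
means = rng.normal(2.0, 0.3, size=T)   # hypothetical forecast means
outcomes = rng.normal(means, 0.8)      # simulated realised outcomes

# Each forecast density is N(mean, 0.8^2); apply its CDF to the outcome.
pit = stats.norm.cdf(outcomes, loc=means, scale=0.8)

# Under correct calibration the PITs are Uniform(0,1).
ks = stats.kstest(pit, "uniform")
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```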
Abstract:
The 3D shape of an object and its 3D location have traditionally been thought of as very separate entities, although both can be described within a single 3D coordinate frame. Here, 3D shape and location are considered as two aspects of a view-based approach to representing depth, avoiding the use of 3D coordinate frames.
Abstract:
The evidence for anthropogenic climate change continues to strengthen, and concerns about severe weather events are increasing. As a result, scientific interest is rapidly shifting from detection and attribution of global climate change to prediction of its impacts at the regional scale. However, nearly everything we have any confidence in when it comes to climate change is related to global patterns of surface temperature, which are primarily controlled by thermodynamics. In contrast, we have much less confidence in atmospheric circulation aspects of climate change, which are primarily controlled by dynamics and exert a strong control on regional climate. Model projections of circulation-related fields, including precipitation, show a wide range of possible outcomes, even on centennial timescales. Sources of uncertainty include low-frequency chaotic variability and the sensitivity to model error of the circulation response to climate forcing. As the circulation response to external forcing appears to project strongly onto existing patterns of variability, knowledge of errors in the dynamics of variability may provide some constraints on model projections. Nevertheless, higher scientific confidence in circulation-related aspects of climate change will be difficult to obtain. For effective decision-making, it is necessary to move to a more explicitly probabilistic, risk-based approach.
Abstract:
This paper examines organizational foresight from a relational perspective. In doing this, we present relational incumbency as a transient conceptual framework to explore how the organizing social relationships and interactions of lower participants may influence organizational foresightfulness. The research employed an exploratory case-based approach, with three software organisations and their four new product innovation projects serving as the empirical research sites. Drawing on the case evidence, we provide an account of how normative organizing structures, rights and authority relationships constitutively influence the creative emergence of organizational foresight in practice. We conclude the paper with a discussion of the managerial implications and some directions for future research.