911 results for Deterministic walkers


Relevance: 10.00%

Abstract:

This study presents quantitative evidence from a number of simulation experiments on the accuracy of the productivity-growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional-form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including the stochastic ones) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
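In its index-number form, growth accounting reduces to arithmetic on growth rates and cost shares. A minimal sketch with invented two-period data (the figures, shares and variable names are illustrative only):

```python
import numpy as np

# Hypothetical two-period data: one output, two inputs (labour, capital).
y = np.array([100.0, 106.0])           # output in periods 0 and 1
x = np.array([[50.0, 52.0],            # labour input
              [30.0, 30.9]])           # capital input
s = np.array([[0.7, 0.7],              # labour cost share in each period
              [0.3, 0.3]])             # capital cost share in each period

# Growth-accounting (Tornqvist) TFP growth: output growth minus the
# average-cost-share-weighted growth of each input.
dy = np.log(y[1] / y[0])
dx = np.log(x[:, 1] / x[:, 0])
w = s.mean(axis=1)                     # shares averaged over the two periods
tfp_growth = dy - w @ dx
print(round(tfp_growth, 4))
```

Here output grows by about 5.8% while share-weighted inputs grow by about 3.6%, leaving a residual TFP growth of roughly 2.2%.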

Relevance: 10.00%

Abstract:

Existing assignment problems for assigning n jobs to n individuals are limited to costs or profits measured as crisp values. In many real applications, however, costs are not deterministic numbers. This paper develops a procedure based on the Data Envelopment Analysis (DEA) method to solve assignment problems with fuzzy costs or fuzzy profits for each possible assignment. It aims to obtain the points with maximum membership values for the fuzzy parameters while maximizing the profit or minimizing the assignment cost. In this method, a discrete approach is first presented to rank the fuzzy numbers; then, corresponding to each fuzzy number, a crisp number is introduced using the efficiency concept. A numerical example illustrates the usefulness of the new method. © 2012 Operational Research Society Ltd. All rights reserved.
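For contrast with the fuzzy procedure described above, the crisp baseline can be solved directly. The sketch below defuzzifies hypothetical triangular fuzzy costs by a simple centroid (a stand-in for the paper's DEA-based ranking, not its actual method) and then applies the Hungarian-style solver from SciPy:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical triangular fuzzy costs (low, mode, high) for a 3x3 assignment.
fuzzy_cost = np.array([
    [(3, 4, 6), (7, 8, 9), (2, 3, 5)],
    [(5, 6, 7), (1, 2, 4), (6, 7, 8)],
    [(4, 5, 6), (3, 4, 5), (8, 9, 10)],
], dtype=float)

crisp = fuzzy_cost.mean(axis=2)             # centroid of a triangular number
rows, cols = linear_sum_assignment(crisp)   # optimal one-to-one matching
total = crisp[rows, cols].sum()
print(cols, total)
```

With these numbers the minimum-cost matching sends job 0 to individual 2, job 1 to individual 1 and job 2 to individual 0.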

Relevance: 10.00%

Abstract:

Productivity at the macro level is a complex concept but also arguably the most appropriate measure of economic welfare. There is currently limited research on the various approaches that can be used to measure it, and especially on their relative accuracy. This thesis has two main objectives: first, to detail some of the most common productivity measurement approaches and assess their accuracy under a number of conditions; second, to present an up-to-date application of productivity measurement and provide some guidance on selecting between sometimes conflicting productivity estimates. With regard to the first objective, the thesis discusses the issues specific to macro-level productivity measurement and the strengths and weaknesses of the three main types of approach available, namely index-number approaches (represented by growth accounting), non-parametric distance functions (DEA-based Malmquist indices) and parametric production functions (COLS- and SFA-based Malmquist indices). The accuracy of these approaches is assessed through simulation analysis, which yielded several findings. The most important were that deterministic approaches are quite accurate even when the data are moderately noisy; that no approach was accurate when noise was more extensive; that functional-form misspecification has a severe negative effect on the accuracy of the parametric approaches; and that increased volatility in inputs and prices from one period to the next adversely affects all approaches examined. The application was based on the EU KLEMS (2008) dataset and revealed that the different approaches do in fact produce different productivity change estimates, at least for some of the countries assessed.
To assist researchers in selecting between conflicting estimates, a new three-step selection framework is proposed, based on the findings of the simulation analyses and on established diagnostics and indicators. An application of this framework, also based on the EU KLEMS dataset, is provided.

Relevance: 10.00%

Abstract:

Conventional DEA models assume deterministic, precise and non-negative input and output observations. Real applications, however, may involve observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that addressed the two problems (interval data and negative data) separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the intervals may have upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes: strictly efficient, weakly efficient and inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.
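As a point of reference for the interval extension, the underlying crisp CCR efficiency model (multiplier form) is a small linear programme; the paper's interval/negative-data treatment would solve such a model at the interval bounds after decomposition. A sketch with invented single-input, single-output data:

```python
import numpy as np
from scipy.optimize import linprog

# Crisp CCR DEA (multiplier form) for three DMUs; data are illustrative.
X = np.array([[2.0], [4.0], [3.0]])   # one input column per DMU row
Y = np.array([[2.0], [3.0], [3.0]])   # one output column per DMU row

def ccr_efficiency(o):
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    # Decision variables: [v (input weights, m), u (output weights, s)].
    c = np.concatenate([np.zeros(m), -Y[o]])           # maximise u'y_o
    A_eq = np.concatenate([X[o], np.zeros(s)])[None]   # v'x_o = 1
    A_ub = np.hstack([-X, Y])                          # u'y_j - v'x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + s), method="highs")
    return -res.fun

effs = [round(ccr_efficiency(o), 3) for o in range(3)]
print(effs)
```

With one input and one output this reduces to the output/input ratio relative to the best ratio, so DMUs 0 and 2 are efficient and DMU 1 scores 0.75.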

Relevance: 10.00%

Abstract:

The deployment of bioenergy technologies is a key part of UK and European renewable energy policy. A major barrier to their deployment is the management of biomass supply chains, including the evaluation of suppliers and the contracting of biomass. In the undeveloped biomass-for-energy market, buyers of biomass face three major challenges during the development of new bioenergy projects: what characteristics a given supply of biomass will have; how to evaluate biomass suppliers; and which suppliers to contract with in order to build a portfolio that best satisfies the needs of the project and its stakeholder group whilst also satisfying crisp and non-crisp technological constraints. The problem description is taken from the situation faced by the industrial partner in this research, Express Energy Ltd. This research tackles the three areas separately and then combines them into a decision framework, BioSS, to assist biomass buyers with the strategic sourcing of biomass. The BioSS framework consists of three modes which mirror the development stages of bioenergy projects: BioSS.2 for early-stage development, BioSS.3 for the financial close stage and BioSS.Op for the operational phase of the project. BioSS is formed of a fuels library, a supplier evaluation module and an order allocation module; a Monte Carlo analysis module is also included to evaluate the accuracy of the recommended portfolios. In each mode, BioSS can recommend which suppliers should be contracted with and how much material should be purchased from each. The recommended blend should have chemical characteristics within the technological constraints of the conversion technology and should also best satisfy the stakeholder group. The fuels library is compiled from a wide variety of sources and contains around 100 unique descriptions of potential biomass sources that a developer may encounter.
The library takes a wide data-collection approach, with the aim of allowing biomass characteristics to be estimated without expensive and time-consuming testing. The supplier evaluation part of BioSS uses a QFD-AHP method to assign importance weightings to 27 evaluating criteria. The criteria were compiled from interviews with stakeholders and from policy and position documents, and the weightings were assigned using a mixture of workshops and expert interviews. The weighted importance scores allow potential suppliers to better tailor their business offering and provide a robust framework for decision makers to understand the requirements of the bioenergy project's stakeholder groups. The order allocation part of BioSS uses a chance-constrained programming approach to allocate orders of material between potential suppliers based on the chemical characteristics and the preference scores of those suppliers. The optimisation program finds the portfolio of orders that performs best in the eyes of the stakeholder group whilst also complying with the technological constraints. If the decision maker requires, a technological constraint can be allowed to be breached by setting it as a chance constraint. This allows a wider range of biomass sources to be procured and a greater overall performance to be realised than with crisp constraints or deterministic programming approaches. BioSS is demonstrated against two scenarios faced by UK bioenergy developers: a large-scale combustion power project and a small-scale gasification project. BioSS is applied in each mode for both scenarios and is shown to adapt the solution to the stakeholder group's priorities and the constraints of the different conversion technologies whilst finding a globally optimal portfolio for stakeholder satisfaction.
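The chance-constraint idea in the order-allocation module has a standard deterministic equivalent when an uncertain fuel property is modelled as Normal: the blend's mean plus a quantile-scaled standard deviation must respect the limit. A sketch with invented chlorine figures (not Express Energy data):

```python
import numpy as np
from scipy.stats import norm

# Two-supplier biomass blend; each supplier's chlorine content Cl_i is
# modelled as independent Normal(mu_i, sigma_i). The blend shares x must
# satisfy P(sum x_i * Cl_i <= limit) >= alpha. All numbers hypothetical.
mu = np.array([0.020, 0.035])      # mean chlorine fraction per supplier
sigma = np.array([0.004, 0.006])   # standard deviation per supplier
limit, alpha = 0.030, 0.95

def chance_ok(x):
    # Deterministic equivalent: blend mean + z_alpha * blend std <= limit.
    mean = x @ mu
    std = np.sqrt((x**2 * sigma**2).sum())
    return mean + norm.ppf(alpha) * std <= limit

print(chance_ok(np.array([0.8, 0.2])), chance_ok(np.array([0.2, 0.8])))
```

Leaning the blend towards the low-chlorine supplier satisfies the 95% chance constraint; the reverse blend violates it already in expectation.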

Relevance: 10.00%

Abstract:

This paper explains some drawbacks of previous approaches for detecting influential observations in deterministic non-parametric data envelopment analysis models, as developed by Yang et al. (Annals of Operations Research 173:89-103, 2010). For example, the efficiency scores and relative entropies obtained in that model are uninformative for outlier detection, and the empirical distribution of all estimated relative entropies is not a Monte Carlo approximation. We develop a new method to detect whether a specific DMU is truly influential, and a statistical test is applied to determine the significance level. An application to measuring the efficiency of hospitals shows the superiority of this method, which leads to significant advancements in outlier detection. © 2014 Springer Science+Business Media New York.

Relevance: 10.00%

Abstract:

Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights which can be helpful in developing powerful models, but they need a modelling framework that helps them use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of the input space, often work better, since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective, and they do not involve the domain experts in guiding a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The resulting segmentation is then used to develop effective local regression models.
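The benefit of segmentation can be illustrated without the visualisation front-end: on data whose behaviour differs by region, two local linear models beat one global model. The split point below is chosen by hand, standing in for the interactive, expert-guided segmentation the paper proposes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D dataset with region-dependent behaviour: rising for x < 1,
# falling for x >= 1, plus a little observation noise.
x = rng.uniform(0, 2, 400)
y = np.where(x < 1, 2 * x, 4 - 2 * x) + rng.normal(0, 0.05, 400)

def fit_linear(xs, ys):
    A = np.column_stack([xs, np.ones_like(xs)])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef

def rmse(xs, ys, coef):
    pred = coef[0] * xs + coef[1]
    return float(np.sqrt(np.mean((ys - pred) ** 2)))

# Global model vs. a hand-chosen segmentation at x = 1.
g = fit_linear(x, y)
left, right = x < 1, x >= 1
local = [fit_linear(x[m], y[m]) for m in (left, right)]
err_global = rmse(x, y, g)
err_local = np.mean([rmse(x[m], y[m], c) for m, c in zip((left, right), local)])
print(err_global, err_local)
```

The global fit is stuck near the data's mean, while each local model recovers its region's line almost exactly, so the local error is many times smaller.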

Relevance: 10.00%

Abstract:

We present exact analytical results for the statistics of nonlinear coupled oscillators under the influence of additive white noise. We suggest a perturbative approach for analysing the statistics of such systems under the action of a deterministic perturbation, based on the exact expressions for probability density functions for noise-driven oscillators. Using our perturbation technique we show that our results can be applied to studying the optical signal propagation in noisy fibres at (nearly) zero dispersion as well as to weakly nonlinear lattice models with additive noise. The approach proposed can account for a wide spectrum of physically meaningful perturbations and is applicable to the case of large noise strength. © 2005 Elsevier B.V. All rights reserved.
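In the zero-dispersion fibre application, the underlying model is, schematically, the nonlinear Schrödinger equation with the dispersive term dropped and an additive white-noise term (generic notation, not necessarily the paper's):

```latex
\frac{\partial A}{\partial z} = i\gamma\,\lvert A\rvert^{2}A + \eta(z,t),
\qquad
\langle \eta(z,t)\,\eta^{*}(z',t')\rangle = 2D\,\delta(z-z')\,\delta(t-t')
```

Here $A$ is the field envelope, $\gamma$ the Kerr nonlinearity, and $D$ the noise strength; the perturbative scheme described above treats a deterministic correction on top of the exactly solvable noise-driven problem.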

Relevance: 10.00%

Abstract:

This article reviews a particular aspect of the critique of the increasing focus on the brain and neuroscience: what some have termed 'neuromania'. It engages with the growing literature produced in response to the 'first three years' movement: an alliance of child welfare advocates and politicians that draws on the authority of neuroscience to argue that social problems such as inequality, poverty, educational underachievement, violence and mental illness are best addressed through 'early intervention' programmes to protect or enhance emotional and cognitive aspects of children's brain development. The movement began in the United States in the early 1990s and has become increasingly vocal and influential since then, achieving international legitimacy in the United States, Canada, New Zealand, Australia, the UK and elsewhere. The movement, and the brain-based culture of expert-led parent training that has grown with it, has been criticised for claiming scientific authority whilst taking a cavalier approach to scientific method and evidence; for being overly deterministic about the early years of life; for focusing attention on individual parental failings rather than societal or structural problems; for adding to the expanding anxieties of parents and strengthening the intensification of parenting; and, ultimately, for redefining the parent-child relationship in biologised, instrumental and dehumanised terms. © 2014 John Wiley & Sons Ltd.

Relevance: 10.00%

Abstract:

Networked Learning, e-Learning and Technology Enhanced Learning have each been defined in different ways as people's understanding of technology in education has developed. Yet each could also be considered a terminology competing for a contested conceptual space. Theoretically this can be a 'fertile trans-disciplinary ground for represented disciplines to affect and potentially be re-orientated by others' (Parchoma and Keefer, 2012), as differing perspectives on terminology and subject disciplines yield new understandings. Yet when used in government policy texts to describe connections between humans, learning and technology, terms tend to become fixed in linguistically less fertile positions. A deceptively spacious policy discourse that suggests people are free to make choices conceals an economically based assumption that implementing new technologies in itself determines learning. In practice it narrows the choices open to people, as one route is repeatedly foregrounded and humans are not visibly involved in it. The impression that the effective use of technology for endless improvement is inevitable cuts off the critical social interactions and new knowledge needed for multiple understandings of technology in people's lives. This paper explores findings from a corpus-based Critical Discourse Analysis of UK policy for educational technology during the last 15 years, to help illuminate the choices made. This matters when, through political economy, a hierarchical or dominant neoliberal logic promotes a single 'universal model' of technology in education, without reference to a wider social context (Rustin, 2013). Discourse matters, because it can 'mould identities' (Massey, 2013) in narrow, objective, economically based terms which 'colonise discourses of democracy and student-centredness' (Greener and Perriton, 2005: 67).
When humans are omitted, this undermines the subjective social, political, material and relational (Jones, 2012: 3) contexts of those learning. Critically confronting these structures is not a negative activity. Whilst deterministic discourse for educational technology may leave people unconsciously restricted, I argue that, through close analysis, it offers a deceptively spacious theoretical tool for debate about the wider social and economic context of educational technology. Methodologically, it provides insights into the ways technology, language and learning intersect across disciplinary borders (Giroux, 1992) as powerful, mutually constitutive elements, ever-present in networked learning situations. In sharing a replicable approach to the linguistic analysis of policy discourse, I hope to contribute to the visions others have of a broader theoretical underpinning for educational technology as a developing field of networked knowledge and research (Conole and Oliver, 2002; Andrews, 2011).
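A corpus-based analysis of this kind builds on mechanical steps such as collocation counting around a node term. A toy sketch (the mini-corpus is invented; real analyses use dedicated corpus tools):

```python
import re
from collections import Counter

# Count the words occurring within a +/-3-token window of a node term,
# a basic ingredient of corpus-based Critical Discourse Analysis.
corpus = ("technology will transform learning . effective technology "
          "enhances learning outcomes . learning with technology is inevitable")
tokens = re.findall(r"[a-z]+", corpus)
node, window = "technology", 3
colloc = Counter()
for i, tok in enumerate(tokens):
    if tok == node:
        span = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        colloc.update(span)
print(colloc.most_common(2))
```

In this tiny example "learning" dominates the collocates of "technology", the kind of pattern a discourse analyst would then interpret against the wider policy context.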

Relevance: 10.00%

Abstract:

We summarize the results of our recent demonstration of the first multi-channel regenerator for phase encoded signals. By developing a novel inline phase sensitive amplification scheme simultaneous suppression of deterministic phase distortion on two independent 42.66 Gbit/s DPSK modulated signal wavelengths was achieved. © 2012 SEE.

Relevance: 10.00%

Abstract:

This work introduces a complexity measure which addresses some conflicting issues among existing measures by using a new principle: measuring the average amount of symmetry broken by an object. It attributes low (although different) complexity to both deterministic and random homogeneous densities, and higher complexity to the intermediate cases. The new measure is easily computable, breaks the coarse-graining paradigm and can be straightforwardly generalised, including to continuous cases and general networks. By applying the measure to a series of objects, it is shown that it can be used consistently for both small-scale structures with exact symmetry breaking and large-scale patterns, for which, unlike similar measures, it consistently discriminates between repetitive patterns, random configurations and self-similar structures.
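One ingredient of such a measure, surviving translational symmetry, is easy to compute for strings; the sketch below is a toy illustration of counting unbroken symmetries, not the paper's exact definition:

```python
# Count how many cyclic shifts leave a binary string invariant, i.e.
# how much translational symmetry survives in the object.
def surviving_shifts(s):
    return sum(1 for k in range(len(s)) if s == s[k:] + s[:k])

periodic = "01" * 8                 # every even shift preserves the string
aperiodic = "0110100110010110"      # Thue-Morse prefix: only the trivial shift
print(surviving_shifts(periodic), surviving_shifts(aperiodic))
```

The repetitive pattern keeps 8 of its 16 shift symmetries, while the aperiodic string keeps only the trivial one; a symmetry-based measure works with averages over such counts.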

Relevance: 10.00%

Abstract:

In this paper we analyse rigidities in the behaviour of the mark-up on regular, mid-grade and premium varieties of petrol in the New York area, using weekly data and a methodology that analyses the pricing process with both deterministic and stochastic techniques. The results are consistent across methodologies and indicate that the speeds of adjustment to the long-run equilibrium mark-up differ across varieties of petrol, with margins on the premium variety falling faster than they rise, contrary to the popular claim of welfare-decreasing asymmetries in price transmission. © 2012 The Authors. The Manchester School © 2012 Blackwell Publishing Ltd and The University of Manchester.
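Asymmetric adjustment speeds of the kind reported here can be recovered by regressing mark-up changes on the positive and negative parts of the deviation from the long-run level. The sketch below simulates such a process with hypothetical parameters and recovers the two speeds by OLS:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated mark-up that reverts to its long-run level faster from above
# (speed 0.5) than from below (speed 0.1); all parameters hypothetical.
mu, a_fall, a_rise, n = 1.0, 0.5, 0.1, 3000
m = np.empty(n)
m[0] = mu
for t in range(1, n):
    dev = m[t - 1] - mu
    speed = a_fall if dev > 0 else a_rise
    m[t] = m[t - 1] - speed * dev + rng.normal(0, 0.05)

# OLS of the mark-up change on the signed parts of the lagged deviation.
dev = m[:-1] - mu
dm = np.diff(m)
A = np.column_stack([np.maximum(dev, 0), np.minimum(dev, 0)])
(b_pos, b_neg), *_ = np.linalg.lstsq(A, dm, rcond=None)
print(round(b_pos, 2), round(b_neg, 2))
```

The coefficient on positive deviations comes out close to -0.5 and the one on negative deviations close to -0.1, so the estimated downward adjustment is faster, mirroring the asymmetry discussed in the abstract.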

Relevance: 10.00%

Abstract:

OpenMI is a widely used standard allowing exchange of data between integrated models, which has mostly been applied to dynamic, deterministic models. Within the FP7 UncertWeb project we are developing mechanisms and tools to support the management of uncertainty in environmental models. In this paper we explore the integration of the UncertWeb framework with OpenMI, to assess the issues that arise when propagating uncertainty in OpenMI model compositions, and the degree of integration possible with UncertWeb tools. In particular we develop an uncertainty-enabled model for a simple Lotka-Volterra system with an interface conforming to the OpenMI standard, exploring uncertainty in the initial predator and prey levels, and the parameters of the model equations. We use the Elicitator tool developed within UncertWeb to identify the initial condition uncertainties, and show how these can be integrated, using UncertML, with simple Monte Carlo propagation mechanisms. The mediators we develop for OpenMI models are generic and produce standard Web services that expose the OpenMI models to a Web based framework. We discuss what further work is needed to allow a more complete system to be developed and show how this might be used practically.
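Stripped of the OpenMI and Web-service plumbing, the Monte Carlo propagation step looks like the following sketch: sample uncertain initial conditions, run the coupled model for each draw and summarise the spread of outcomes (all distributions and parameter values here are illustrative, not the elicited ones):

```python
import numpy as np

rng = np.random.default_rng(42)

# Forward-Euler Lotka-Volterra step, run once per Monte Carlo draw.
def simulate(prey, pred, a=1.1, b=0.4, c=0.1, d=0.4, dt=0.01, steps=1000):
    for _ in range(steps):
        dprey = (a * prey - b * prey * pred) * dt
        dpred = (c * prey * pred - d * pred) * dt
        prey, pred = prey + dprey, pred + dpred
    return prey, pred

n = 200
prey0 = rng.normal(10.0, 1.0, n)   # uncertain initial prey level
pred0 = rng.normal(5.0, 0.5, n)    # uncertain initial predator level
finals = np.array([simulate(p, q) for p, q in zip(prey0, pred0)])

# Summarise propagated uncertainty as the mean and spread of final states.
print(finals.mean(axis=0), finals.std(axis=0))
```

In the UncertWeb setting the sampling distributions would come from the Elicitator tool and be described in UncertML, with each `simulate` call replaced by an exchange through the OpenMI interface.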

Relevance: 10.00%

Abstract:

In recent years, English welfare and health policy has begun to include pregnancy within the foundation stage of child development, and the foetus is increasingly designated as 'at risk' from pregnant women. In this article, we draw on an analysis of a purposive sample of English social and welfare policies and closely related advocacy documents to trace the emergence of neuroscientific claims-making in relation to the family. We show that a specific deterministic understanding of the developing brain, one that has only a loose relationship with current scientific evidence, is an important component in these changes. We examine the ways in which pregnancy is situated in these debates: maternal stress is identified as a risk to the foetus, yet the selective concern with women living in disadvantage undermines the biological claims. The policy claim of neurological 'critical windows' also seems to be influenced by social concerns. Hence, these emerging concerns over the foetus's developing brain appear to be situated within the gendered history of policing women's pregnant bodies rather than arising from new scientific discoveries. By situating these developments within the broader framework of risk consciousness, we can link these changes to wider understandings of the 'at risk' child and the intensified surveillance of family life.