21 results for spatially explicit individual-based model

in Aston University Research Archive


Abstract:

Measuring variations in efficiency and its extension, eco-efficiency, during a restructuring period in different industries has always been a point of interest for regulators and policy makers. This paper assesses the impacts of restructuring of procurement in the Iranian power industry on the performance of power plants. We introduce a new slacks-based model for Malmquist-Luenberger (ML) index measurement and apply it to the power plants to calculate the efficiency, eco-efficiency, and technological changes over the 8-year period (2003-2010) of restructuring in the power industry. The results reveal that although the restructuring had different effects on the individual power plants, the overall growth in the eco-efficiency of the sector was mainly due to advances in pure technology. We also assess the correlation between the efficiency and eco-efficiency of the power plants, which indicates a close relationship between the two measures, thus lending support to the incorporation of environmental factors in efficiency analysis.
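
For reference, the conventional ML index between periods $t$ and $t+1$ is built from directional distance functions $\vec{D}$ evaluated on inputs $x$, desirable outputs $y$ and undesirable outputs $b$; this is the standard formulation from the literature, and the paper's slacks-based model concerns how the distance functions inside it are computed:

\[
ML_t^{t+1} = \left[ \frac{1+\vec{D}^{\,t}(x^t, y^t, b^t)}{1+\vec{D}^{\,t}(x^{t+1}, y^{t+1}, b^{t+1})} \times \frac{1+\vec{D}^{\,t+1}(x^t, y^t, b^t)}{1+\vec{D}^{\,t+1}(x^{t+1}, y^{t+1}, b^{t+1})} \right]^{1/2}
\]

A value above 1 indicates productivity growth, and the index factors into an efficiency-change and a technical-change component, which is what allows growth in the sector's eco-efficiency to be attributed to advances in pure technology.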

Abstract:

The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of each risk factor. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
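
To make the AHP step concrete, here is a minimal sketch in Python: a Saaty-style pairwise comparison matrix for a few hypothetical risk factors is reduced to a priority vector, checked for consistency, and combined with hypothetical consequence costs. All factor names and numbers are illustrative assumptions, not values from the paper.

```python
# Illustrative AHP sketch: derive risk-factor weights from a pairwise
# comparison matrix, then combine them with consequence costs.
import numpy as np

# Saaty-style pairwise comparison of three hypothetical risk factors
# (corrosion, third-party damage, operational error) for one segment.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector = principal eigenvector, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()

# Consistency check (random index RI = 0.58 for n = 3 in Saaty's tables).
n = A.shape[0]
ci = (np.max(np.real(eigvals)) - n) / (n - 1)
assert ci / 0.58 < 0.1, "pairwise judgements too inconsistent"

# Expected failure cost for the segment: weight (read as the relative
# probability contribution) times a hypothetical consequence cost.
consequence_cost = np.array([2.0e6, 1.5e6, 0.8e6])  # currency units
print(w.round(3), f"expected cost = {w @ consequence_cost:,.0f}")
```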

Abstract:

Adapting to blurred images makes in-focus images look too sharp, and vice versa (Webster et al., 2002 Nature Neuroscience 5 839-840). We asked how such blur adaptation is related to contrast adaptation. Georgeson (1985 Spatial Vision 1 103-112) found that grating contrast adaptation followed a subtractive rule: the perceived (matched) contrast of a grating was fairly well predicted by subtracting some fraction k (~0.3) of the adapting contrast from the test contrast. Here we apply that rule to the responses of a set of spatial filters at different scales and orientations. Blur is encoded by the pattern of filter response magnitudes over scale. We tested two versions - the 'norm model' and the 'fatigue model' - against blur-matching data obtained after adaptation to sharpened, in-focus or blurred images. In the fatigue model, filter responses are simply reduced by exposure to the adapter. In the norm model, (a) the visual system is pre-adapted to a focused world and (b) discrepancy between observed and expected responses to the experimental adapter leads to additional reduction (or enhancement) of filter responses during experimental adaptation. The two models are closely related, but only the norm model gave a satisfactory account of results across the four experiments analysed, with one free parameter k. This model implies that the visual system is pre-adapted to focused images, that adapting to in-focus or blank images produces no change in adaptation, and that adapting to sharpened or blurred images changes the state of adaptation, leading to changes in perceived blur or sharpness.
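
In symbols, the subtractive rule applied here says that after adapting to contrast $C_{adapt}$, a test of contrast $C_{test}$ matches

\[ C_{match} = \max\!\big(0,\; C_{test} - k\,C_{adapt}\big), \qquad k \approx 0.3, \]

where the rectification at zero is an assumption of this sketch rather than something stated in the abstract. In the filter-based models the same subtraction is applied to each spatial filter's response, so adaptation reshapes the profile of response magnitude across scale, which is exactly the code for blur described above.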

Abstract:

The starting point of this research was the belief that manufacturing and similar industries need help with the concept of e-business, especially in assessing the relevance of possible e-business initiatives. The research hypothesis was that it should be possible to produce a systematic model that defines, at a useful level of detail, the probable e-business requirements of an organisation based on objective criteria, with an accuracy of 85%-90%. This thesis describes the development and validation of such a model. A preliminary model was developed from a variety of sources, including a survey of current and planned e-business activity and representative examples of e-business material produced by e-business solution providers. The model was subject to a process of testing and refinement based on recursive case studies, with controls over the improving accuracy and stability of the model. Useful conclusions were also possible as to the relevance of e-business functions to the case study participants themselves. Techniques were evolved to synthesise the e-business requirements of an organisation and present them at a management-summary level of detail. The results of applying these techniques to all the case studies used in this research are discussed. The conclusion of the research was that the case study methodology employed was successful, and that a model suitable for practical application in a manufacturing organisation requiring help with a requirements definition process was achieved.

Abstract:

We investigate knowledge exchange among commercial organizations, the rationale behind it, and its effects on the market. Knowledge exchange is known to be beneficial for industry, but in order to explain it, authors have used high-level concepts like network effects, reputation, and trust. We attempt to formalize a plausible and elegant explanation of how and why companies adopt information exchange and why it benefits the market as a whole when this happens. This explanation is based on a multiagent model that simulates a market of software providers. Even though the model does not include any high-level concepts, information exchange naturally emerges during simulations as a successful, profitable behavior. The conclusions reached by this agent-based analysis are twofold: 1) a straightforward set of assumptions is enough to give rise to exchange in a software market, and 2) knowledge exchange is shown to increase the efficiency of the market.
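
A minimal sketch of this kind of simulation, under my own illustrative assumptions (the strategies, payoffs and parameters below are hypothetical, not the authors' model): providers either share knowledge or hoard it, shared knowledge raises quality, and customers favour higher-quality providers.

```python
# Hypothetical agent-based market sketch: does knowledge sharing pay off?
import random

class Provider:
    def __init__(self, shares):
        self.shares = shares        # strategy: exchange knowledge or not
        self.knowledge = 1.0        # proxy for product quality
        self.profit = 0.0

def step(providers, customers=100, price=1.0, sharing_cost=0.05):
    sharers = [p for p in providers if p.shares]
    if sharers:
        # Sharers converge on the pooled knowledge, at a small cost,
        # and gain a little from exposure to others' ideas.
        pool = sum(p.knowledge for p in sharers) / len(sharers)
        for p in sharers:
            p.knowledge = max(p.knowledge, pool) * 1.02
            p.profit -= sharing_cost
    # Customers choose providers with probability proportional to knowledge.
    total = sum(p.knowledge for p in providers)
    for _ in range(customers):
        r, acc = random.uniform(0.0, total), 0.0
        for p in providers:
            acc += p.knowledge
            if r <= acc:
                p.profit += price
                break

random.seed(1)
providers = [Provider(shares=(i % 2 == 0)) for i in range(20)]
for _ in range(50):
    step(providers)

mean = lambda xs: sum(xs) / len(xs)
print("sharers :", round(mean([p.profit for p in providers if p.shares]), 1))
print("hoarders:", round(mean([p.profit for p in providers if not p.shares]), 1))
```

Even in this crude sketch, sharing emerges as the more profitable strategy once its quality benefit outweighs the sharing cost, which is the qualitative effect the paper reports.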

Abstract:

The existing method of pipeline monitoring, which requires an entire pipeline to be inspected periodically, wastes time and is expensive. A risk-based model that reduces the amount of time spent on inspection has been developed. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction method and logical insurance plans. The risk-based model uses the analytic hierarchy process, a multiple-attribute decision-making technique, to identify factors that influence failure on specific segments and analyze their effects by determining the probabilities of risk factors. The severity of failure is determined through consequence analysis, which establishes the cost of a failure caused by each risk factor; the cumulative effect of failure is then determined through probability analysis.

Abstract:

In this work we propose an NLSE-based model of the power and spectral properties of the random distributed feedback (DFB) fiber laser. The model is based on a coupled set of nonlinear Schrödinger equations for the pump and Stokes waves, with distributed feedback due to Rayleigh scattering. The model treats the random backscattering via its average strength, i.e. we assume that the feedback is incoherent. In addition, this allows us to speed up simulations significantly (by up to several orders of magnitude). We found that the incoherent-feedback model predicts a smooth generation spectrum that is narrow compared with the gain spectral profile. The model allows one to optimize the width of the random laser generation spectrum by varying the dispersion and nonlinearity values: we found that high dispersion and low nonlinearity result in a narrower spectrum, which suggests that four-wave mixing between different spectral components in the quasi-mode-less spectrum of the random laser plays an important role in spectrum formation. Note that the physical mechanism of spectrum formation and broadening in the random DFB fiber laser has not yet been identified. We also investigate the temporal and statistical properties of the random DFB fiber laser dynamics. Interestingly, we found that the intensity statistics are not Gaussian, and the intensity autocorrelation function reveals that correlations do exist. The possibility of optimizing the system parameters to enhance the observed intrinsic spectral correlations, and thus potentially achieve pulsed (mode-locked) operation of the mode-less random distributed feedback fiber laser, is discussed.
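
As a sketch of the type of equations involved (generic coefficients; the exact coupling terms are my assumptions, not the paper's model), the forward- and backward-propagating Stokes envelopes $A^{\pm}(z,t)$ obey NLSEs with Raman gain from the pump power $P_p$, fibre loss $\alpha$, and a distributed Rayleigh backscattering term of average strength $\varepsilon$ coupling the two directions:

\[
\pm\frac{\partial A^{\pm}}{\partial z} + \frac{i\beta_2}{2}\frac{\partial^2 A^{\pm}}{\partial t^2}
= i\gamma\,|A^{\pm}|^2 A^{\pm} + \frac{g_R P_p - \alpha}{2}\,A^{\pm} + \sqrt{\varepsilon}\,A^{\mp},
\]

with a corresponding pair of equations describing pump depletion. Modelling the feedback through its average strength $\varepsilon$ alone, rather than through a particular random realisation of scatterers, is what makes the feedback incoherent and the simulation fast.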

Abstract:

There has been increasing interest in the use of agent-based simulation, and some discussion of the relative merits of this approach as compared to discrete-event simulation. There are differing views on whether agent-based simulation offers capabilities that discrete-event simulation cannot provide, or whether all agent-based applications can, at least in theory, be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two versions of the discrete-event model use a traditional process-flow approach normally adopted in discrete-event simulation software, and an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
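
The core idea, that agent behaviour can be driven by a discrete-event engine, can be sketched in a few lines: each agent is activated by events on a global future-event list, exactly as a discrete-event simulator schedules process steps. The random-walk behaviour and parameters below are illustrative assumptions, not the paper's NetLogo or ARENA models.

```python
# Agent-based behaviour on a discrete-event engine (future-event list).
import heapq
import random

class Agent:
    def __init__(self, name):
        self.name = name
        self.x, self.y = 0.0, 0.0

    def act(self, now):
        # Agent behaviour: take a random step, then report when to wake next.
        self.x += random.uniform(-1, 1)
        self.y += random.uniform(-1, 1)
        return now + random.expovariate(1.0)   # next activation time

events = []                                    # future-event list (min-heap)
agents = [Agent(f"a{i}") for i in range(3)]
for i, a in enumerate(agents):
    heapq.heappush(events, (0.0, i, a))        # i breaks ties between agents

t_end = 10.0
while events:
    now, i, agent = heapq.heappop(events)
    if now > t_end:
        break
    heapq.heappush(events, (agent.act(now), i, agent))

for a in agents:
    print(f"{a.name}: ({a.x:+.2f}, {a.y:+.2f})")
```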

Abstract:

Softeam has over 20 years of experience providing UML-based modelling solutions, such as its Modelio modelling tool, and its Constellation enterprise model management and collaboration environment. Due to the increasing number and size of the models used by Softeam’s clients, Softeam joined the MONDO FP7 EU research project, which worked on solutions for these scalability challenges and produced the Hawk model indexer among other results. This paper presents the technical details and several case studies on the integration of Hawk into Softeam’s toolset. The first case study measured the performance of Hawk’s Modelio support using varying amounts of memory for the Neo4j backend. In another case study, Hawk was integrated into Constellation to provide scalable global querying of model repositories. Finally, the combination of Hawk and the Epsilon Generation Language was compared against Modelio for document generation: for the largest model, Hawk was two orders of magnitude faster.

Abstract:

Purpose – The purpose of this research is to develop a holistic approach that maximizes the customer service level while minimizing the logistics cost, using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach which considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments. Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates the fuzzy modification of the analytic hierarchy process (FAHP) developed here and solves the multi-criteria transshipment problem. Findings – This paper provides several novel insights about how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness from two aspects: optimizing the cost and providing the best service simultaneously. Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Considering the generalization of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further enhance the validity of the research output. Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, and for the construction and management of an optimal transshipment network. Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and that the viewpoints of both service deliverers and customers are taken into account. It is therefore believed to be useful and applicable for transshipment service network design.
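
To ground the linear programming core, here is a minimal transshipment sketch: two suppliers ship through one hub to two customers, minimizing shipping cost subject to supply limits, demand satisfaction and hub flow conservation. The network and costs are hypothetical; one could imagine the paper's qualitative FAHP scores entering as weights that adjust these arc costs.

```python
# Minimal transshipment LP: suppliers -> hub -> customers.
from scipy.optimize import linprog

# Decision variables: flows on arcs [s1->h, s2->h, h->c1, h->c2]
cost = [4.0, 3.0, 2.0, 5.0]            # unit shipping costs (hypothetical)

# Supply limits (inequalities): s1 <= 30, s2 <= 40
A_ub = [[1, 0, 0, 0],
        [0, 1, 0, 0]]
b_ub = [30, 40]

# Demands and hub flow conservation (equalities):
# h->c1 = 35, h->c2 = 25, inflow - outflow at hub = 0
A_eq = [[0, 0, 1, 0],
        [0, 0, 0, 1],
        [1, 1, -1, -1]]
b_eq = [35, 25, 0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
print(res.x, res.fun)   # optimal flows and total cost
```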

Abstract:

Advocates of ‘local food’ claim it serves to reduce food miles and greenhouse gas emissions, improve food safety and quality, strengthen local economies and enhance social capital. We critically review the philosophical and scientific rationale for this assertion, and consider whether conventional scientific approaches can help resolve the debate. We conclude that food miles are a poor indicator of the environmental and ethical impacts of food production. Only through combining spatially explicit life cycle assessment with analysis of social issues can the benefits of local food be assessed. This type of analysis is currently lacking for nearly all food chains.

Abstract:

In recent years, there has been an increasing interest in learning distributed representations of word senses. Traditional context-clustering-based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are then used by a context-clustering-based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on half of the metrics in the word similarity task and on 6 out of 13 subtasks in the analogical reasoning task, and give the best overall accuracy in the word sense effect classification task, which shows the effectiveness of our proposed distributed representation learning model.
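
A hypothetical sketch of the gloss-encoding step described above: a small convolutional encoder turns a WordNet gloss (a token-id sequence) into a fixed-size vector that could initialise a word-sense embedding. Architecture details and sizes are my assumptions, not the paper's network.

```python
# Convolutional gloss encoder with max-over-time pooling (PyTorch).
import torch
import torch.nn as nn

class GlossEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, n_filters=100, kernel=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=kernel,
                              padding=kernel // 2)

    def forward(self, token_ids):               # (batch, seq_len)
        x = self.embed(token_ids)               # (batch, seq_len, emb_dim)
        x = x.transpose(1, 2)                   # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))            # (batch, n_filters, seq_len)
        return x.max(dim=2).values              # max-over-time pooling

encoder = GlossEncoder(vocab_size=10_000)
gloss = torch.randint(0, 10_000, (1, 12))       # one 12-token gloss
sense_init = encoder(gloss)                     # (1, 100) initial embedding
print(sense_init.shape)
```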

Abstract:

In this paper we propose an alternative method for measuring the efficiency of Decision Making Units (DMUs) which allows the presence of variables with both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with two recently developed methods: the Modified Slacks-Based Model suggested by Sharp et al (2007) and the Range Directional Measure developed by Silva Portela et al (2004). A further example explores the advantages of using the new model.
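
For context, the Range Directional Measure used as a benchmark handles negative data by measuring efficiency along direction vectors built from each unit's ranges; the standard formulation from the cited literature (reproduced here as a sketch) is, for unit $o$ with inputs $x_{io}$ and outputs $y_{ro}$:

\[
\begin{aligned}
\max_{\beta,\,\lambda}\ & \beta \\
\text{s.t. }\ & \textstyle\sum_j \lambda_j x_{ij} \le x_{io} - \beta R_{io}^-, \quad i = 1,\dots,m,\\
& \textstyle\sum_j \lambda_j y_{rj} \ge y_{ro} + \beta R_{ro}^+, \quad r = 1,\dots,s,\\
& \textstyle\sum_j \lambda_j = 1,\quad \lambda_j \ge 0,
\end{aligned}
\]

with ranges $R_{io}^- = x_{io} - \min_j x_{ij}$ and $R_{ro}^+ = \max_j y_{rj} - y_{ro}$, which remain non-negative even when the underlying data are negative.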

Abstract:

This paper contributes to the literature on the intra-firm diffusion of innovations by investigating the factors that affect a firm’s decision to adopt and use sets of complementary innovations. We define as complementary those innovations whose joint use generates super-additive gains, i.e. the gain from joint adoption is higher than the sum of the gains derived from adopting each innovation in isolation. From a theoretical perspective, we present a simple decision model whereby the firm decides ‘whether’ and ‘how much’ to invest in each of the innovations under investigation, based upon the expected profit gain from each possible combination of adoption and use. The model shows how the extent of complementarity among the innovations can affect the firm’s profit gains and therefore the likelihood that the firm will adopt these innovations jointly rather than individually. From an empirical perspective, we focus on four sets of management practices, namely operating (OMP), monitoring (MMP), targets (TMP) and incentives (IMP) management practices. We show that these sets of practices, although to different extents, are complementary to each other. We then construct a synthetic indicator of the depth of their use; the resulting intra-firm index reflects not only the number of practices adopted but also the depth of their individual use and the extent of their complementarity. The empirical testing of the decision model is carried out using evidence on the adoption behaviour of a sample of 1,238 UK establishments in the 2004 Workplace Employment Relations Survey (WERS). Our empirical results show that the intra-firm profitability-based model explains more of the variability of joint adoption than models based upon the variability of adoption and use of individual practices. We also investigate whether a number of firm-specific and market characteristics, by affecting the size of the gains which joint adoption can generate, drive the intensity of use of the four innovations. We find that establishment size, foreign ownership, exposure to an international market and the degree of homogeneity of the final product are important determinants of the intensity of joint adoption of the four innovations. Most importantly, our results show that the factors which the economics of innovation literature has found to affect the intensity of use of a technological innovation also affect the intensity of use of sets of innovative management practices; however, they explain only a small part of the diversity in firms' joint adoption and use.
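
For two innovations with adoption indicators in $\{0,1\}$ and profit gain function $\pi$, the super-additivity condition defined above can be written compactly (the notation is mine; the condition is the one stated in the abstract):

\[
\pi(1,1) - \pi(0,0) \;>\; \big[\pi(1,0) - \pi(0,0)\big] + \big[\pi(0,1) - \pi(0,0)\big]
\;\Longleftrightarrow\;
\pi(1,1) + \pi(0,0) \;>\; \pi(1,0) + \pi(0,1),
\]

i.e. the profit function is strictly supermodular in the adoption decisions, which is what makes joint adoption more likely than separate adoption.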

Abstract:

We present a stochastic agent-based model for the distribution of personal incomes in a developing economy. We start with the assumption that incomes are determined both by individual labour and by stochastic effects of trading and investment. The income from personal effort alone is distributed about a mean, while the income from trade, which may be positive or negative, is proportional to the trader's income. These assumptions lead to a Langevin model with multiplicative noise, from which we derive a Fokker-Planck (FP) equation for the income probability density function (IPDF) and its variation in time. We find that high earners have a power-law income distribution, while the low-income groups have a Lévy IPDF. Comparing our analysis with Indian survey data (obtained from the World Bank website: http://go.worldbank.org/SWGZB45DN0) taken over many years, we obtain a near-perfect data collapse onto our model's equilibrium IPDF. Using survey data to relate the IPDF to actual food consumption, we define a poverty index (Sen A. K., Econometrica, 44 (1976) 219; Kakwani N. C., Econometrica, 48 (1980) 437) which is consistent with traditional indices, but independent of an arbitrarily chosen "poverty line" and therefore less susceptible to manipulation.
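
A generic form consistent with this description (the precise drift and noise terms are my assumptions, not taken from the paper): income $m(t)$ obeys a Langevin equation with an additive labour term and multiplicative trading noise,

\[ \frac{dm}{dt} = \mu - \lambda m + m\,\xi(t), \]

where $\xi(t)$ is zero-mean Gaussian white noise with $\langle \xi(t)\xi(t')\rangle = 2D\,\delta(t-t')$. The corresponding (Itô) Fokker-Planck equation for the IPDF $P(m,t)$ is

\[ \frac{\partial P}{\partial t} = -\frac{\partial}{\partial m}\Big[(\mu - \lambda m)\,P\Big] + D\,\frac{\partial^2}{\partial m^2}\Big[m^2 P\Big], \]

whose stationary solution carries a power-law tail at large $m$, consistent with the high-earner behaviour reported above.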