928 results for BENCHMARK
Abstract:
The demands of sustainable development have heightened awareness of efficient resource use and environmental protection, and municipal wastewater treatment is an important component of protecting the water environment. Addressing the high energy consumption, unstable effluent quality, and frequent intrusion of toxic industrial wastewater that characterize municipal wastewater treatment systems in China, this thesis takes the A/O-process biochemical treatment of municipal wastewater as its research object and studies theories and methods for improving the operating performance of wastewater treatment systems and reducing their operating energy consumption through modeling, control, and optimization. The work combines the benchmark process published by the International Water Association (IWA) with the actual operating environment of wastewater treatment plants, and mainly covers simplification of the activated sludge model, process improvement and simulation based on the simplified model, construction of a simulation platform, design of key control loops, development of a diagnosis and response system for toxic industrial wastewater, and research on optimal control methods and their application. The main contributions are: (1) Steady-state and dynamic simulations of the biochemical wastewater treatment process were implemented on the basis of the international evaluation benchmark, and an improvement of the A2/O process was studied and analyzed through simulation; the results show that, while keeping the effluent quality stably within discharge standards, the improved process eliminates the internal recycle and thus greatly reduces the energy consumption of the recycle pumps. (2) Starting from the IWA activated sludge model ASM1, taking into account the characteristics of wastewater treatment in China and the severe shortage of on-site measurements at Chinese plants, the components and reaction processes of ASM1 were simplified to build a reduced activated sludge model; methods were studied for converting hard-to-measure model component concentrations into easily measured routine water quality indices, and for correcting model parameters for the effect of temperature on reaction rates. On this basis, a modeling and simulation platform for the A/O biochemical treatment process was developed, and the actual operation of a wastewater treatment plant in Liaoning was simulated with good results. (3) A complete design framework for the municipal wastewater treatment control system was established. The influent organic load and the specific oxygen uptake rate were introduced to characterize, respectively, the amount of "food" (organic matter) available to the microorganisms (activated sludge) and the microbial activity, and the dynamic behavior of the system was analyzed. On this basis, three control loops were designed: cascade control of influent flow, feedforward-feedback control of sludge concentration, and feedforward-cascade control of dissolved oxygen. A human-simulated intelligent control method was adopted for dissolved oxygen control and achieved good results in practice. (4) To address the intrusion of toxic industrial wastewater frequently encountered by municipal plants, a diagnosis and response system based on expert experience was developed, with a fairly complete set of expert rules; the system raises an alarm promptly when toxic wastewater intrudes and takes corresponding countermeasures to minimize the plant's losses. It has been applied with good results at a wastewater treatment plant in Liaoning. (5) For optimal control, an optimization strategy for the A/O municipal wastewater treatment process was designed to address its excessive energy consumption. A simplified activated sludge model considering only two components, substrate and biomass, was built; with the dissolved oxygen concentration and the sludge wastage rate as decision variables, the plant's daily operating cost as the performance index, and mass balances and effluent discharge standards as constraints, a complete optimal control model was formulated and optimal control of the treatment process was realized. An improved gradient method with dynamic search for the optimal step-size parameter was used to find the best operating point. Applied to a municipal wastewater treatment plant in Liaoning, the optimization yielded optimal values for the dissolved oxygen set-point and the sludge wastage rate; while keeping the effluent quality stably within discharge standards, the plant's daily operating cost was significantly reduced, providing guidance for actual plant operation.
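As an illustration of the optimization in item (5), here is a minimal Python sketch of a gradient method whose step size is re-optimized each iteration by a line search, applied to a hypothetical two-variable daily-cost surrogate. The thesis model, constraints, and coefficients are not reproduced here; every function, variable range, and number below is illustrative only.

```python
import numpy as np

def golden_section(f, lo, hi, tol=1e-6):
    """Minimize a 1-D function f on [lo, hi] by golden-section search."""
    phi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2.0

def optimize(cost, x0, n_iter=100, h=1e-6):
    """Gradient descent; the step size is re-optimized by line search each step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        # Finite-difference gradient (the thesis model is not public; generic here).
        g = np.array([(cost(x + h * e) - cost(x - h * e)) / (2 * h)
                      for e in np.eye(len(x))])
        if np.linalg.norm(g) < 1e-8:
            break
        step = golden_section(lambda t: cost(x - t * g), 0.0, 1.0)
        x = x - step * g
    return x

# Hypothetical daily-cost surrogate: aeration cost grows with the DO set-point,
# sludge-handling cost grows with wastage, and barrier terms stand in for the
# effluent-quality constraints (all coefficients are illustrative only).
def daily_cost(u):
    do, waste = u
    return 2.0 * do**2 + 1.5 * waste**2 + 4.0 / (0.1 + do) + 2.0 / (0.1 + waste)

best = optimize(daily_cost, x0=[2.0, 1.0])
print("DO set-point, sludge wastage:", best)
```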
Abstract:
A major impetus for studying rough surfaces and complex structures in near-surface models is that the accuracy of seismic observation and geophysical prospecting can thereby be improved. Wave-theoretical study of fluid-saturated porous media is important for several scientific problems, such as the exploration of underground resources, the study of the Earth's internal structure, and the response of multi-phase porous soils under dynamic and seismic loading. Numerical modeling of seismic waves is one of the effective methods for understanding how seismic waves propagate in complex media. As a numerical simulation method, the boundary element method has been widely used in seismic wave-field studies. This paper mainly studies scattering from randomly rough surfaces using several approximate solutions based on the boundary element method; in addition, a boundary element solution for fluid-saturated porous media is developed. Boundary element methods based on the integral expression of the wave equation are used to study free rough-surface scattering with the Kirchhoff approximation, the perturbation approximation, the Rytov approximation, and the Born series approximation. The Gaussian-spectrum model of randomly rough surfaces is chosen as the benchmark model. The results of the approximate methods are compared with exact results obtained by the boundary element method to determine for which rough surfaces each approximation is applicable; this is found to depend on kσ and kl (here k is the wavenumber of the incident field, σ is the RMS height, and l is the surface correlation length). In general, the Kirchhoff approximation, which ignores multiple scattering between any two surface points, is considered valid for the large-scale roughness components. Perturbation theory, based on a Taylor series expansion, is valid for the small-scale roughness components, i.e. when kσ and kl are small. Tests with Gaussian topographies show that the Rytov approximation improves on the Kirchhoff approximation in both amplitude and phase, but at the cost of an extra transformation of the wave fields. A realistic method for multiscale surfaces is the Born series approximation, and the second-order Born series approximation may be sufficient to guarantee accuracy for randomly rough surfaces. An appropriate strategy is therefore to divide a complex rough surface into large-, medium-, and small-scale roughness components and study their scattering with the Kirchhoff or Rytov phase approximations, the Born series approximation, and perturbation theory, respectively. For this purpose, it is important to select the parameters separating these scale components so that the divided surfaces satisfy the physical assumptions of the corresponding approximations. In addition, the boundary element method is used to solve for poroelastic wave propagation and to carry out numerical simulations. Based on the fluid-saturated porous model, the dynamic equations of elastic wave propagation and the boundary integral equation formulation for fluid-saturated porous media are presented in the frequency domain. The fundamental solutions of the elastic wave equations are obtained by exploiting the similarity between thermoelasticity and poroelasticity. Finally, elastic wave propagation in two-phase isotropic media is simulated using the boundary element method.
The results show that a slow quasi-P wave can be seen in the synthetic seismograms of both the solid and fluid wave fields, and that the boundary element method is effective and feasible.
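As an illustration of the benchmark surface model named above, the following sketch generates one realization of a 1-D randomly rough surface with (approximately) Gaussian autocorrelation by smoothing white noise with a Gaussian filter; sigma and corr_len correspond to the σ and l discussed in the abstract. All numerical values are illustrative only.

```python
import numpy as np

def gaussian_rough_surface(n, length, sigma, corr_len, seed=0):
    """One realization of a 1-D random rough surface with (approximately)
    Gaussian autocorrelation C(x) = sigma^2 * exp(-x^2 / corr_len^2):
    white Gaussian noise is convolved with a Gaussian kernel, then rescaled
    so the sample RMS height equals sigma (per realization, rather than in
    the ensemble average)."""
    rng = np.random.default_rng(seed)
    dx = length / n
    x = (np.arange(n) - n // 2) * dx
    # Kernel of width corr_len/sqrt(2); convolving it with itself yields the
    # desired exp(-x^2/corr_len^2) correlation shape.
    kernel = np.exp(-x**2 / (corr_len**2 / 2.0))
    h = np.fft.ifft(np.fft.fft(rng.normal(size=n))
                    * np.fft.fft(np.fft.ifftshift(kernel))).real
    return h * sigma / h.std()

# The roughness regimes discussed above are set by k*sigma and k*l,
# with k the incident wavenumber, e.g.:
wavelength = 1.0
k = 2.0 * np.pi / wavelength
h = gaussian_rough_surface(n=2048, length=100.0, sigma=0.1, corr_len=2.0)
print("k*sigma =", k * 0.1, " k*l =", k * 2.0)
```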
Abstract:
There has been growing concern about the use of fossil fuels and its adverse effects on the atmospheric greenhouse and the ecological environment. Reducing the rate at which CO2 is released into the atmosphere poses a major challenge for the land ecology of China. One of the most promising ways to achieve CO2 reduction is to dispose of CO2 in deep saline aquifers, which offer large potential for geological CO2 sequestration in terms of both volume and duration. Through numerical simulation of multiphase flow in porous media, the transformation and motion of CO2 in saline aquifers has been computed under various temperature and hydrostatic pressure conditions, which plays an important role in assessing the reliability and safety of geological CO2 storage; the calculated results can provide meaningful scientific information for management purposes. The key problems in the numerical simulation of multiphase flow in porous media are to accurately capture the mass interface and to deal with geological heterogeneity. In this study, an updated CE/SE (space-time conservation element and solution element) method is proposed, and the Hybrid Particle Level Set (HPLS) method is extended to multiphase flows in porous media so that the evolution of the mass interface can be traced accurately. Benchmark problems are used to evaluate and validate the proposed method. The reliability of CO2 storage in the saline aquifers of the Daqingzi oil field in the Sunlong basin is then discussed. The simulation code developed in this study accounts for the state of CO2 from the triple-point temperature and pressure up to the supercritical region. Geological heterogeneity is implemented using the well-known geostatistical model (GSLIB) on the basis of the hard data. 2D and 3D models are set up to simulate CO2 multiphase flow in the porous saline aquifer using the CE/SE and HPLS methods. The main contents and results are summarized as follows. (1) The 2D CE/SE method with first- and second-order accuracy is extended to simulate multiphase flow in porous media, taking into account the contribution of sources and sinks in the momentum equation; the 3D CE/SE method with first-order accuracy is also derived. The accuracy and efficiency of the proposed CE/SE method are investigated using benchmark problems. (2) The hybrid particle level set method is adapted and extended to capture the mass interface of multiphase flows in porous media, and the numerical method for computing the level set function is formulated. (3) Closed equations for multiphase flow in porous media are developed that apply to both Darcy and non-Darcy flow, overcoming the Reynolds-number limitation on the calculation. The Darcy number is found to have a decisive influence on both pressure and velocity. (4) A new Euler scheme for the numerical simulation of multiphase flows in porous media is proposed, which is efficient and accurately captures the mass interface. The artificial compressibility method is used to couple the velocities and pressure. The Darcy number is found to have a determining effect on numerical convergence and stability, and suitable values of the artificial compressibility coefficient and the time step are obtained for different Darcy numbers.
(5) The time scale of the onset of instability for critical CO2 in the saline aquifer is found to be comparable with that of a saline aquifer in which the CO2 is completely dissolved. (6) A conceptual model for CO2 multiphase flow in the saline aquifer is configured based on the temperature, pressure, porosity, and permeability of the field site. Numerical simulation of CO2 hydrodynamic trapping in saline aquifers is performed using the proposed CE/SE method; the state of CO2 is represented so as to capture realistic reservoir conditions for geological CO2 sequestration, and geological heterogeneity is fully treated using the geostatistical model. (7) The Rayleigh-Taylor instability, associated with the penetration of saline fluid into the CO2 fluid in the direction of gravity, is observed in CO2 multiphase flow in the saline aquifer. The development of a mushroom-type spike strongly indicates the formation of a Kelvin-Helmholtz instability due to short-wavelength perturbations that develop along the interface, parallel to the bulk flow. Additional key findings: geological heterogeneity can distort the flow convection, and the ascent of CO2 can induce persistent flow-cycling effects. The results show that the boundary conditions of the field site have a determining effect on the transformation and motion of CO2 in saline aquifers. It is confirmed that the proposed method and numerical model can reliably simulate hydrodynamic trapping, the controlling mechanism for the initial period of CO2 storage on a time scale of 100 years.
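As a sketch of the interface-capturing idea behind the level set method mentioned in item (2), the following Python fragment advects a level set function with a first-order upwind scheme in a prescribed velocity field. It omits the particle correction of HPLS, the CE/SE discretization, and the porous-media coupling; the grid, rotation field, and periodic boundaries (via np.roll) are illustrative simplifications.

```python
import numpy as np

def advect_level_set(phi, u, v, dx, dt, steps):
    """First-order upwind advection of a level set function phi by a
    prescribed velocity field (u, v); the zero contour of phi tracks the
    interface. Periodic boundaries via np.roll, for brevity."""
    for _ in range(steps):
        # One-sided differences chosen by the sign of the local velocity.
        dpx_m = (phi - np.roll(phi, 1, axis=1)) / dx   # backward in x
        dpx_p = (np.roll(phi, -1, axis=1) - phi) / dx  # forward in x
        dpy_m = (phi - np.roll(phi, 1, axis=0)) / dx
        dpy_p = (np.roll(phi, -1, axis=0) - phi) / dx
        phi = phi - dt * (np.where(u > 0, u * dpx_m, u * dpx_p)
                          + np.where(v > 0, v * dpy_m, v * dpy_p))
    return phi

# Circular interface in a rigid-body rotation field (illustrative setup).
n = 128
dx = 1.0 / n
y, x = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx, indexing="ij")
phi0 = np.sqrt((x - 0.5)**2 + (y - 0.75)**2) - 0.15  # signed distance to a circle
u, v = -(y - 0.5), (x - 0.5)                          # rotation about the center
phi = advect_level_set(phi0, u, v, dx, dt=0.5 * dx, steps=200)
print("cells near the interface:", int(np.sum(np.abs(phi) < dx)))
```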
Abstract:
We develop a mean field theory for sigmoid belief networks based on ideas from statistical mechanics. Our mean field theory provides a tractable approximation to the true probability distribution in these networks; it also yields a lower bound on the likelihood of evidence. We demonstrate the utility of this framework on a benchmark problem in statistical pattern recognition -- the classification of handwritten digits.
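As a minimal illustration of the lower bound mentioned above, the sketch below estimates the mean-field bound E_q[log p(v,h) - log q(h)] <= log p(v) for a small two-layer sigmoid belief network by sampling the hidden units from the factorized distribution q. The paper derives a fully tractable analytic bound using additional variational parameters; the Monte Carlo estimate here is a simpler stand-in, and the network sizes and weights are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_bernoulli(x, p, eps=1e-12):
    return x * np.log(p + eps) + (1 - x) * np.log(1 - p + eps)

def mean_field_elbo(v, W, b, c, mu, n_samples=5000, seed=0):
    """Monte Carlo estimate of the mean-field lower bound on log p(v) for a
    two-layer sigmoid belief network: binary hidden units h with prior
    P(h_i=1) = sigmoid(b_i), visibles v with P(v_j=1|h) = sigmoid(W[j] @ h + c_j),
    and factorized posterior q(h_i=1) = mu_i."""
    rng = np.random.default_rng(seed)
    h = (rng.random((n_samples, len(mu))) < mu).astype(float)   # h ~ q
    log_prior = log_bernoulli(h, sigmoid(b)).sum(axis=1)
    log_lik = log_bernoulli(v, sigmoid(h @ W.T + c)).sum(axis=1)
    log_q = log_bernoulli(h, mu).sum(axis=1)
    return np.mean(log_prior + log_lik - log_q)   # E_q[log p(v,h) - log q(h)]

rng = np.random.default_rng(1)
W, b, c = rng.normal(size=(8, 4)), rng.normal(size=4), rng.normal(size=8)
v = (rng.random(8) < 0.5).astype(float)
print("ELBO estimate:", mean_field_elbo(v, W, b, c, mu=np.full(4, 0.5)))
```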
Abstract:
Integrating connectivity patterns into marine ecosystem management is a fundamental step, especially for stocks subject to the combined impacts of human activities (overfishing, habitat degradation, etc.) and climate change. Management of marine resources must therefore incorporate the spatial scales over which populations are connected. Nevertheless, studying these dynamics remains a crucial and difficult task, and predicting the temporal and spatial patterns of these mechanisms is still particularly challenging. This thesis investigates the population connectivity of the red mullet Mullus barbatus in the Western Mediterranean Sea through a multidisciplinary approach that combines otolith sclerochronology, larval dispersal modelling, and genetic techniques. In particular, the project focuses on the early life-history stages of red mullet and their role in characterizing connectivity dynamics. The results show that M. barbatus larval dispersal distances can reach 200 km. The differences in early life traits (i.e., PLD, spawning, and settlement dates) observed between various areas of the Western Mediterranean Sea suggest a certain level of larval patchiness, likely due to different spawning pulses during the reproductive period. The dispersal of individuals across distant areas, even if demographically insignificant, accounts for the maintenance of gene flow among different demes. Fluctuations in the level of exchange among areas, due to the variability of source-sink dynamics, could have major implications for population connectivity patterns. These findings highlight the value of combining several approaches and represent a benchmark for defining proper resource management, with considerable implications for effectively ensuring the beneficial effects of existing and future conservation strategies.
Abstract:
Within the UK, there is growing awareness of the need to better understand what online educational technologies can offer for learning and teaching, and how social technologies are changing communication and collaboration outwith formal education. The concept of the 'digital university' is being widely debated within the UK Higher Education sector (McCluskey and Winter, 2012), becoming embedded in educational policy, and beginning to be explored within many institutions. This session will report on one such institutional initiative, undertaken at Edinburgh Napier University in Scotland. A Digital Futures Working Group was established to: benchmark best practice in key areas, including digitally enhanced education and digital literacies development; identify areas for short-term action; and produce a robust 'digital agenda' to inform the future direction of the university. Pivotal to this was the recognized need to evolve staff digital pedagogical practices, harness emerging digital opportunities, meet learner expectations, and meet wider expectations for capable contemporary citizens. This session will be delivered in two parts. First, we will provide an insight into the focus of the project and the rich-picture methodology used to consult with staff and students. Second, we will describe the outcomes produced and provide a case study of how the Faculty of Health, Life and Social Sciences engaged with the process and progressed its digitally enabled educational practices.
Abstract:
As a management tool, simulation software deserves greater analysis from both academic and industrial viewpoints. A comparative study of three packages was carried out from a 'first-time use' perspective, allowing the ease of use and features of each package to be assessed on a simple theoretical benchmark manufacturing process. To complement the use of these packages, an objective survey on simulation use and package features was carried out within the manufacturing industry. This identified the use of simulation software, its applicability, and users' perceptions of their requirements, thereby proposing an ideal package.
Abstract:
We describe a new hyper-heuristic method, NELLI-GP, for solving job-shop scheduling problems (JSSP) that evolves an ensemble of heuristics. The ensemble adopts a divide-and-conquer approach in which each heuristic solves a unique subset of the instance set considered. NELLI-GP extends an existing ensemble method called NELLI by introducing a novel heuristic generator that evolves heuristics composed of linear sequences of dispatching rules: each rule is represented using a tree structure and is itself evolved. Following a training period, the ensemble is shown to outperform both existing dispatching rules and a standard genetic programming algorithm on a large set of new test instances. In addition, it obtains superior results on a set of 210 benchmark problems from the literature when compared to two state-of-the-art hyper-heuristic approaches. Further analysis of the relationship between heuristics in the evolved ensemble and the instances each solves provides new insights into features that might describe similar instances.
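For illustration, the sketch below builds a schedule for a toy JSSP instance by repeatedly applying a dispatching rule to the jobs whose next operation is ready (a simple greedy list schedule). In NELLI-GP the rule itself is an evolved tree over job and machine attributes; here a fixed shortest-processing-time rule stands in, and the instance data are made up.

```python
# jobs: one list of operations per job; each operation is (machine, time).
def schedule(jobs, rule):
    """Greedy list scheduling: repeatedly apply a dispatching rule to the
    jobs whose next operation is ready. NELLI-GP evolves the rule itself
    (as a tree over job/machine attributes); here it is a plain function."""
    next_op = [0] * len(jobs)        # index of each job's next operation
    job_ready = [0] * len(jobs)      # earliest start time of each job's next op
    machine_ready = {}               # time at which each machine becomes free
    makespan = 0
    while any(next_op[j] < len(ops) for j, ops in enumerate(jobs)):
        ready = [j for j, ops in enumerate(jobs) if next_op[j] < len(ops)]
        j = min(ready, key=lambda j: rule(jobs[j][next_op[j]]))
        machine, dur = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = machine_ready[machine] = start + dur
        makespan = max(makespan, start + dur)
        next_op[j] += 1
    return makespan

spt = lambda op: op[1]               # shortest processing time first
jobs = [[(0, 3), (1, 2)], [(1, 4), (0, 1)], [(0, 2), (1, 3)]]
print("SPT makespan:", schedule(jobs, spt))
```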
Abstract:
Karwath, A. and King, R. Homology induction: the use of machine learning to improve sequence similarity searches. BMC Bioinformatics, 3:11, 23 April 2002. Additional file: describes the title organism species declaration in one string [http://www.biomedcentral.com/content/supplementary/1471-2105-3-11-S1.doc]. Sponsorship: Andreas Karwath and Ross D. King were supported by EPSRC grant GR/L62849.
Abstract:
K. Rasmani and Q. Shen. Subsethood-based fuzzy modelling and classification. Proceedings of the 2004 UK Workshop on Computational Intelligence, pages 181-188.
Abstract:
Suganami, Hidemi, 'Wendt, IR and Philosophy: A Critique', in Constructivism and International Relations: Alexander Wendt and His Critics (New York: Routledge), pp. 57-72, 2006.
Abstract:
The role of renewable energy in power systems is becoming more significant due to the increasing cost of fossil fuels and climate change concerns. However, the inclusion of Renewable Energy Generators (REG), such as wind power, has created additional problems for power system operators because of the variability and lower predictability of output of most REGs, with the Economic Dispatch (ED) problem being particularly difficult to resolve. In previous papers we reported on the inclusion of wind power in ED calculations: the simulation was performed using a system model with wind power as an intermittent source, and the results were compared with those of the Direct Search Method (DSM) for similar cases. In this paper we report on our continuing investigation of Genetic Algorithms (GA) for ED in an independent power system with a significant amount of wind energy in its generator portfolio. The results demonstrate, in line with previous reports in the literature, the effectiveness of GA when measured against a benchmark technique such as DSM.
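A minimal sketch of the approach, assuming a hypothetical three-unit system with quadratic fuel costs and wind treated as negative load; all coefficients, limits, and GA settings below are illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-unit system: fuel cost a + b*P + c*P^2 and output limits.
COST = np.array([[100, 20.0, 0.050],
                 [120, 18.0, 0.060],
                 [ 80, 22.0, 0.045]])
PMIN, PMAX = np.array([50.0, 40.0, 30.0]), np.array([300.0, 250.0, 200.0])

def fitness(P, demand, wind):
    fuel = (COST[:, 0] + COST[:, 1] * P + COST[:, 2] * P**2).sum()
    imbalance = abs(P.sum() + wind - demand)      # wind offsets the load
    return fuel + 1e4 * imbalance                  # penalty for power mismatch

def ga_dispatch(demand, wind, pop=60, gens=300, mut=0.1):
    """Plain real-coded GA (tournament selection, blend crossover, Gaussian
    mutation) for economic dispatch with wind treated as negative load."""
    X = rng.uniform(PMIN, PMAX, size=(pop, 3))
    for _ in range(gens):
        f = np.array([fitness(x, demand, wind) for x in X])
        # Tournament selection: the better of two random individuals survives.
        i, j = rng.integers(pop, size=(2, pop))
        parents = X[np.where(f[i] < f[j], i, j)]
        # Blend crossover between pairs of parents.
        alpha = rng.random((pop, 1))
        X = alpha * parents + (1 - alpha) * parents[::-1]
        # Gaussian mutation applied to ~20% of the genes, then clip to limits.
        X += rng.normal(0, mut * (PMAX - PMIN), X.shape) * (rng.random(X.shape) < 0.2)
        X = np.clip(X, PMIN, PMAX)
    f = np.array([fitness(x, demand, wind) for x in X])
    return X[f.argmin()], f.min()

P, cost = ga_dispatch(demand=500.0, wind=60.0)
print("dispatch:", P.round(1), "cost:", round(cost, 1))
```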
Abstract:
This paper investigates the power of genetic algorithms at solving the MAX-CLIQUE problem. We measure the performance of a standard genetic algorithm on an elementary set of problem instances consisting of embedded cliques in random graphs. We indicate the need for improvement, and introduce a new genetic algorithm, the multi-phase annealed GA, which exhibits superior performance on the same problem set. As we scale up the problem size and test on "hard" benchmark instances, we notice a degraded performance in the algorithm caused by premature convergence to local minima. To alleviate this problem, a sequence of modifications is implemented, ranging from changes in input representation to systematic local search. The most recent version, called the union GA, incorporates the features of union crossover, greedy replacement, and diversity enhancement. It shows a marked speed-up in the number of iterations required to find a given solution, as well as some improvement in the clique size found. We discuss issues related to the SIMD implementation of the genetic algorithms on a Thinking Machines CM-5, which was necessitated by the intrinsically high time complexity (O(n^3)) of the serial algorithm for computing one iteration. Our preliminary conclusions are: (1) a genetic algorithm needs to be heavily customized to work "well" for the clique problem; (2) a GA is computationally very expensive, and its use is only recommended if it is known to find larger cliques than other algorithms; (3) although our customization effort is bringing forth continued improvements, there is no clear evidence, at this time, that a GA will have better success in circumventing local minima.
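The following sketch shows the flavor of a GA for MAX-CLIQUE with greedy repair of infeasible bit strings, in the spirit of (though much simpler than) the customizations described above; graph sizes and GA settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_graph(n, p, clique=8):
    """Random graph with an embedded clique -- the elementary test set
    described in the abstract."""
    A = rng.random((n, n)) < p
    A = np.triu(A, 1)
    A = A | A.T
    hidden = rng.choice(n, clique, replace=False)
    A[np.ix_(hidden, hidden)] = True
    np.fill_diagonal(A, False)
    return A

def repair(A, genes):
    """Greedily turn a vertex subset (bit vector) into a clique, then extend it."""
    clique = []
    for v in np.flatnonzero(genes):
        if all(A[v, u] for u in clique):
            clique.append(v)
    for v in rng.permutation(len(A)):          # greedy extension
        if v not in clique and all(A[v, u] for u in clique):
            clique.append(v)
    return clique

def ga_clique(A, pop=60, gens=100):
    """Bit-string GA with tournament selection, uniform crossover,
    bit-flip mutation, and greedy repair."""
    n = len(A)
    X = rng.random((pop, n)) < 0.1
    best = []
    for _ in range(gens):
        cliques = [repair(A, x) for x in X]
        sizes = np.array([len(c) for c in cliques])
        if sizes.max() > len(best):
            best = cliques[sizes.argmax()]
        i, j = rng.integers(pop, size=(2, pop))
        parents = X[np.where(sizes[i] >= sizes[j], i, j)]   # tournament
        mask = rng.random((pop, n)) < 0.5                   # uniform crossover
        X = np.where(mask, parents, parents[::-1])
        X ^= rng.random((pop, n)) < 1.0 / n                 # bit-flip mutation
    return best

A = random_graph(n=120, p=0.3)
print("clique found:", len(ga_clique(A)))
```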
Abstract:
The performance of a randomized version of the subgraph-exclusion algorithm (called Ramsey) for CLIQUE by Boppana and Halldorsson is studied on very large graphs. We compare the performance of this algorithm with the performance of two common heuristic algorithms, the greedy heuristic and a version of simulated annealing. These algorithms are tested on graphs with up to 10,000 vertices on a workstation and graphs as large as 70,000 vertices on a Connection Machine. Our implementations establish the ability to run clique approximation algorithms on very large graphs. We test our implementations on a variety of different graphs. Our conclusions indicate that on randomly generated graphs minor changes to the distribution can cause dramatic changes in the performance of the heuristic algorithms. The Ramsey algorithm, while not as good as the others for the most common distributions, seems more robust and provides a more even overall performance. In general, and especially on deterministically generated graphs, a combination of simulated annealing with either the Ramsey algorithm or the greedy heuristic seems to perform best. This combined algorithm works particularly well on large Keller and Hamming graphs and has a competitive overall performance on the DIMACS benchmark graphs.
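For reference, the core recursion of the subgraph-exclusion (Ramsey) heuristic of Boppana and Halldorsson is short enough to sketch: each vertex is tried both as a clique member (recursing into its neighborhood) and as an independent-set member (recursing into its non-neighborhood), and the larger structures are kept. The plain Python rendering below uses a small random graph; sizes are illustrative, and the randomized pivot choices and annealing combinations studied in the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def ramsey(vertices, adj):
    """Core step of the Boppana-Halldorsson subgraph-exclusion heuristic:
    returns (clique, independent_set) found in the subgraph induced by
    `vertices`."""
    if not vertices:
        return [], []
    v, rest = vertices[0], vertices[1:]
    neighbors = [u for u in rest if adj[v][u]]
    others = [u for u in rest if not adj[v][u]]
    c1, i1 = ramsey(neighbors, adj)   # v extends a clique among its neighbors
    c2, i2 = ramsey(others, adj)      # v extends an independent set elsewhere
    clique = max([v] + c1, c2, key=len)
    indep = max(i1, [v] + i2, key=len)
    return clique, indep

n, p = 60, 0.5
adj = rng.random((n, n)) < p
adj = np.triu(adj, 1)
adj = (adj | adj.T).tolist()
clique, indep = ramsey(list(range(n)), adj)
print("clique:", len(clique), "independent set:", len(indep))
```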
Abstract:
The proliferation of inexpensive workstations and networks has created a new era in distributed computing. At the same time, non-traditional applications such as computer-aided design (CAD), computer-aided software engineering (CASE), geographic-information systems (GIS), and office-information systems (OIS) have placed increased demands for high-performance transaction processing on database systems. The combination of these factors gives rise to significant challenges in the design of modern database systems. In this thesis, we propose novel techniques whose aim is to improve the performance and scalability of these new database systems. These techniques exploit client resources through client-based transaction management. Client-based transaction management is realized by providing logging facilities locally even when data is shared in a global environment. This thesis presents several recovery algorithms which utilize client disks for storing recovery related information (i.e., log records). Our algorithms work with both coarse and fine-granularity locking and they do not require the merging of client logs at any time. Moreover, our algorithms support fine-granularity locking with multiple clients permitted to concurrently update different portions of the same database page. The database state is recovered correctly when there is a complex crash as well as when the updates performed by different clients on a page are not present on the disk version of the page, even though some of the updating transactions have committed. This thesis also presents the implementation of the proposed algorithms in a memory-mapped storage manager as well as a detailed performance study of these algorithms using the OO1 database benchmark. The performance results show that client-based logging is superior to traditional server-based logging. This is because client-based logging is an effective way to reduce dependencies on server CPU and disk resources and, thus, prevents the server from becoming a performance bottleneck as quickly when the number of clients accessing the database increases.
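A toy sketch of the client-based logging idea, assuming a simplified page model keyed by (page, offset) with before/after images. The thesis algorithms additionally cover fine-granularity locking, multiple clients updating the same page, and complex crashes, none of which this fragment attempts; all names and record formats below are hypothetical.

```python
import json
import os

class ClientLog:
    """Toy client-side write-ahead log: each client records before/after
    images of its own updates locally, so client logs never need merging
    at recovery time (an illustration of the idea only)."""
    def __init__(self, path):
        self.path = path
        self.f = open(path, "a")

    def log_update(self, tid, page, offset, before, after):
        # The log record is forced to disk before the page may be written
        # back to the server (write-ahead rule).
        rec = {"tid": tid, "page": page, "off": offset,
               "before": before, "after": after}
        self.f.write(json.dumps(rec) + "\n")
        self.f.flush()
        os.fsync(self.f.fileno())

    def log_commit(self, tid):
        self.f.write(json.dumps({"tid": tid, "commit": True}) + "\n")
        self.f.flush()
        os.fsync(self.f.fileno())

def recover(path, pages):
    """Redo committed updates and undo the rest (before-images), in log order."""
    records = [json.loads(line) for line in open(path)]
    committed = {r["tid"] for r in records if r.get("commit")}
    for r in records:
        if "commit" in r:
            continue
        value = r["after"] if r["tid"] in committed else r["before"]
        pages.setdefault(r["page"], {})[r["off"]] = value
    return pages

log = ClientLog("/tmp/client1.log")
log.log_update(tid=1, page="P7", offset=0, before="a", after="b")
log.log_commit(tid=1)
print(recover("/tmp/client1.log", pages={}))
```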