41 results for Engineering, Industrial|Engineering, System Science|Operations Research
Abstract:
Pessimistic Malthusian verdicts on the capacity of pre-industrial European economies to sustain a degree of real economic growth under conditions of population growth are challenged using current reconstructions of urbanisation ratios, the real wage rates of building and agricultural labourers, and GDP per capita estimated by a range of methods. Economic growth is shown to have outpaced population growth and raised GDP per capita above $1,500 (1990 international dollars at PPP) in Italy during its twelfth- and thirteenth-century commercial revolution, in Holland during its fifteenth- and sixteenth-century golden age, and in England during the seventeenth- and eighteenth-century run-up to its industrial revolution. During each of these Smithian growth episodes, expanding trade and commerce sustained significant output and employment growth in the manufacturing and service sectors. These positive developments were not necessarily reflected in trends in real wage rates, because the latter were powerfully influenced by associated changes in relative factor prices and in the per capita supply of labour, as workers varied the length of the working year in order to consume either more leisure or more goods. The scale of the divergence between trends in real wage rates and GDP per capita nevertheless varied a great deal between countries, for reasons which have yet to be adequately explained.
Abstract:
The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameters models with universal approximation capabilities has been intensively studied and widely used, owing to the availability of many linear learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameters models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best generalisation performance from observational data alone. The important concepts for achieving good model generalisation used in various non-linear system-identification algorithms are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex-optimisation-based model construction algorithms, including support vector regression, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
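As a concrete illustration of the model-selection idea reviewed here, the sketch below fits a linear-in-the-parameters (polynomial) model by least squares and chooses its size by leave-one-out cross-validation. It is a toy example in plain Python, not one of the surveyed algorithms; all function names and the synthetic data are ours.

```python
# Toy sketch: cross-validation-based model selection for a
# linear-in-the-parameters (polynomial) model. Illustrative only.
import random

def design_matrix(xs, degree):
    """Polynomial basis: each row is [1, x, x^2, ..., x^degree]."""
    return [[x ** d for d in range(degree + 1)] for x in xs]

def fit_least_squares(X, y):
    """Solve the normal equations (X^T X) w = X^T y by Gaussian elimination."""
    n = len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(len(X))) for c in range(n)]
         for r in range(n)]
    b = [sum(X[i][r] * y[i] for i in range(len(X))) for r in range(n)]
    for col in range(n):                      # forward elimination, partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n                             # back substitution
    for r in range(n - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return w

def loo_cv_error(xs, ys, degree):
    """Leave-one-out cross-validation: an estimate of generalisation error."""
    err = 0.0
    for i in range(len(xs)):
        tx, ty = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        w = fit_least_squares(design_matrix(tx, degree), ty)
        pred = sum(wd * xs[i] ** d for d, wd in enumerate(w))
        err += (pred - ys[i]) ** 2
    return err / len(xs)

random.seed(0)
xs = [i / 10 for i in range(-10, 11)]
ys = [x ** 2 - 0.5 * x + random.gauss(0, 0.05) for x in xs]  # quadratic truth
scores = {d: loo_cv_error(xs, ys, d) for d in range(1, 6)}
best = min(scores, key=scores.get)   # smallest CV error wins
```

The point of the sketch is only the selection criterion: the under-sized linear model scores markedly worse than models that can represent the quadratic truth.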
Abstract:
Permeable reactive barriers are a technology roughly one decade old, with most full-scale applications based on abiotic mechanisms. Although there is extensive literature on engineered bioreactors, natural biodegradation potential, and in situ remediation, only recently has engineered passive bioreactive barrier technology been considered at commercial scale to manage contaminated soil and groundwater risks. Recent full-scale studies are building scientific confidence in our understanding of the coupled microbial (and genetic), hydrogeologic, and geochemical processes in this approach and have highlighted the need to further integrate engineering and science tools.
Abstract:
This paper deals with Takagi-Sugeno (TS) fuzzy model identification of nonlinear systems using fuzzy clustering. In particular, an extended fuzzy Gustafson-Kessel (EGK) clustering algorithm, using robust competitive agglomeration (RCA), is developed for automatically constructing a TS fuzzy model from system input-output data. The EGK algorithm can automatically determine the 'optimal' number of clusters from the training data set. It is shown that the EGK approach is relatively insensitive to initialization and less susceptible to local minima, a benefit derived from its agglomerative property. This issue is often overlooked in the current literature on nonlinear identification using conventional fuzzy clustering. Furthermore, the robust statistical concepts underlying the EGK algorithm help to alleviate the difficulty of cluster identification when constructing a TS fuzzy model from noisy training data. A new hybrid identification strategy is then formulated, which combines the EGK algorithm with a locally weighted least-squares method for the estimation of local sub-model parameters. The efficacy of this new approach is demonstrated through function approximation examples and by application to the identification of an automatic voltage regulation (AVR) loop for a simulated 3 kVA laboratory micro-machine system.
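The final estimation step can be illustrated in isolation: given membership degrees (here produced by two fixed Gaussian functions rather than by the EGK algorithm itself), each TS sub-model is fitted by locally weighted least squares and the rule outputs are blended by membership. A hedged toy sketch, with all names and data invented:

```python
# Sketch of TS sub-model estimation by locally weighted least squares.
# Memberships are fixed Gaussians, NOT the paper's EGK clustering.
import math

def membership(x, centre, width):
    return math.exp(-((x - centre) / width) ** 2)

def weighted_linear_fit(xs, ys, ws):
    """Weighted least squares for a local linear model y ~ a*x + b."""
    S = sum(ws)
    Sx = sum(w * x for w, x in zip(ws, xs))
    Sy = sum(w * y for w, y in zip(ws, ys))
    Sxx = sum(w * x * x for w, x in zip(ws, xs))
    Sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    a = (S * Sxy - Sx * Sy) / (S * Sxx - Sx * Sx)
    b = (Sy - a * Sx) / S
    return a, b

# Piecewise behaviour: slope ~0 on the left, slope ~2 on the right.
xs = [i / 10 for i in range(-20, 21)]
ys = [0.0 if x < 0 else 2 * x for x in xs]

centres = [-1.0, 1.0]
rules = []                       # one local linear model per fuzzy rule
for c in centres:
    ws = [membership(x, c, 0.7) for x in xs]
    rules.append(weighted_linear_fit(xs, ys, ws))

def ts_output(x):
    """TS inference: membership-weighted average of the local models."""
    ms = [membership(x, c, 0.7) for c in centres]
    outs = [a * x + b for a, b in rules]
    return sum(m * o for m, o in zip(ms, outs)) / sum(ms)
```

Each rule's weighted fit recovers the local slope near its cluster centre, and the blended output follows the piecewise-linear target away from the switch point.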
Abstract:
Exam timetabling is one of the most important administrative activities that take place in academic institutions. In this paper we present a critical discussion of the research on exam timetabling in the last decade or so. These last ten years have seen an increased level of attention on this important topic, with a range of significant contributions to the scientific literature in terms of both theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that have been generated by this body of work.
We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss some issues concerning decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community over the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a re-naming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.
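The early graph-heuristic techniques mentioned above can be illustrated with a minimal largest-degree graph-colouring sketch: exams are nodes, an edge joins two exams that share a student, and colours are timeslots. The conflict data below is a toy example of ours, not drawn from the benchmark datasets:

```python
# Largest-degree graph-colouring heuristic for exam timetabling.
# Exams with many conflicts are scheduled first; each takes the
# lowest timeslot unused by its already-scheduled neighbours.
def largest_degree_timetable(conflicts):
    """conflicts: dict mapping exam -> set of exams sharing a student."""
    order = sorted(conflicts, key=lambda e: len(conflicts[e]), reverse=True)
    slot = {}
    for exam in order:
        used = {slot[n] for n in conflicts[exam] if n in slot}
        s = 0
        while s in used:          # first-fit colour choice
            s += 1
        slot[exam] = s
    return slot

conflicts = {
    "maths":     {"physics", "biology"},
    "physics":   {"maths", "chemistry"},
    "biology":   {"maths"},
    "chemistry": {"physics"},
}
timetable = largest_degree_timetable(conflicts)
# feasibility: no two conflicting exams share a timeslot
assert all(timetable[a] != timetable[b]
           for a in conflicts for b in conflicts[a])
```

On this toy instance the heuristic packs the four exams into two timeslots; real instances layer many soft constraints (spread, room capacity) on top of this core feasibility problem.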
Abstract:
To exploit the advantages of existing and emerging Internet techniques and to meet the demands of a new generation of collaborative working environments (CWEs), a framework with an upperware–middleware architecture is proposed, consisting of four layers: a resource layer, a middleware layer, an upperware layer and an application layer. The upperware contains intelligent agents and plug/play facilities; the former coordinate and control multiple middleware techniques such as Grid computing, Web services and mobile agents, while the latter allow applications, such as semantic CAD, to plug in and loosely couple with the system. A method for migrating legacy software using an automatic wrapper-generation technique is also presented. A prototype mobile environment for collaborative product design illustrates the use of the CWE framework in collaborative design and manufacture.
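The legacy-migration idea reads as a classic adapter: a generated wrapper exposes an old component behind the interface the new environment expects. The class and method names below are invented for illustration; the paper's actual wrapper generator is not reproduced here.

```python
# Invented sketch of wrapping a legacy component for a new framework.
class LegacyCADExporter:
    """Stand-in for an old component with an awkward interface."""
    def dump(self, model_id):
        return "legacy-geometry:%s" % model_id

class ServiceWrapper:
    """Adapter generated around the legacy component, exposing the
    call shape the collaborative environment expects."""
    def __init__(self, legacy):
        self._legacy = legacy

    def export_model(self, model_id):
        return {"model": model_id, "data": self._legacy.dump(model_id)}

svc = ServiceWrapper(LegacyCADExporter())
result = svc.export_model("m42")
```

The design choice is that the legacy code is untouched; only the thin wrapper needs regenerating when the framework's interface changes.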
Abstract:
In responding to the demand for change and improvement, local government has applied a plethora of operations management-based methods, tools and techniques. This article explores how these methods, specifically in the form of performance management models, are used to improve alignment between central government policy and local government practice, an area which has thus far been neglected in the literature. Using multiple case studies from Environmental Waste Management Services, this research reports that models derived in the private sector are often directly ‘implanted’ into the public sector. This has challenged the efficacy of all performance management models. However, those organisations which used models most effectively did so by embedding (contextualisation) and extending (reconceptualisation) them beyond their original scope. Moreover, success with these models created a cumulative effect whereby other operations management approaches were probed, adapted and used.
Abstract:
This article presents cost modeling results from the application of the Genetic-Causal cost modeling principle. Industrial results from redesign are also presented to verify the opportunity for early concept cost optimization by using Genetic-Causal cost drivers to guide the conceptual design process for structural assemblies. Acquisition cost is considered through the modeling of the recurring unit cost and the non-recurring design cost. Operational cost is modeled relative to acquisition cost and fuel burn for predominantly metal or composite designs. The main contribution of this study is the application of the Genetic-Causal principle to the modeling of cost, helping to explain how conceptual design parameters impact cost and linking that to customer requirements and life cycle cost.
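As a purely illustrative reading of the cost structure described (acquisition as recurring unit cost plus amortised non-recurring design cost, with an operating term driven by fuel burn), one might compose a life-cycle figure as below. Every number and name is a made-up placeholder, not data or a formula from the study.

```python
# Invented life-cycle cost composition; placeholder values throughout.
def life_cycle_cost(unit_cost, design_cost, fleet_size,
                    fuel_burn_per_year, fuel_price, years):
    # recurring unit cost + non-recurring design cost amortised over the fleet
    acquisition = unit_cost + design_cost / fleet_size
    # operating term driven by fuel burn over the service life
    operating = fuel_burn_per_year * fuel_price * years
    return acquisition + operating

lcc = life_cycle_cost(unit_cost=2.0e6, design_cost=5.0e8, fleet_size=500,
                      fuel_burn_per_year=1.0e5, fuel_price=0.8, years=20)
```

Even this toy decomposition shows the trade the abstract points at: a design choice that raises unit cost can still lower life-cycle cost if it cuts fuel burn enough.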