917 results for Distributed Parameter Approach
Abstract:
This paper presents a review of the modelling and control of biological nutrient removal (BNR) activated sludge processes for wastewater treatment using distributed parameter models described by partial differential equations (PDEs). Numerical methods for solving the BNR-activated sludge process dynamics are reviewed, including the method of lines, global orthogonal collocation, and orthogonal collocation on finite elements. Fundamental techniques and conceptual advances of the distributed parameter approach to the dynamics and control of activated sludge processes are briefly described. A critical analysis of the advantages of the distributed parameter approach over the conventional modelling strategy shows that the activated sludge process is more adequately described by the former, and the method is recommended for application in the wastewater industry. (c) 2006 Elsevier Ltd. All rights reserved.
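To illustrate the method of lines mentioned above, the sketch below (not taken from the paper; the simple advection-diffusion-reaction PDE and all parameter values are assumptions for illustration) discretizes the spatial derivatives of a one-dimensional plug-flow-like balance with finite differences and integrates the resulting ODE system in time.

```python
# Minimal method-of-lines sketch (illustrative only, not from the paper):
# the PDE dc/dt = -v dc/dx + D d2c/dx2 - k*c is discretized in space with
# finite differences and the resulting ODE system is integrated in time.
# All parameter values are arbitrary placeholders.
import numpy as np
from scipy.integrate import solve_ivp

nx, L = 50, 10.0           # grid points, reactor length [m]
dx = L / (nx - 1)
v, D, k = 0.1, 0.01, 0.05  # velocity, dispersion, first-order rate (assumed)
c_in = 1.0                 # inlet concentration (assumed)

def rhs(t, c):
    dcdt = np.zeros_like(c)
    # upwind advection + central diffusion on interior nodes
    for i in range(1, nx - 1):
        adv = -v * (c[i] - c[i - 1]) / dx
        dif = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2
        dcdt[i] = adv + dif - k * c[i]
    dcdt[0] = -v * (c[0] - c_in) / dx - k * c[0]        # inlet boundary
    dcdt[-1] = -v * (c[-1] - c[-2]) / dx - k * c[-1]    # outflow boundary
    return dcdt

sol = solve_ivp(rhs, (0.0, 100.0), np.zeros(nx), method="BDF")
print(sol.y[-1, -1])  # outlet concentration at the final time
```

Orthogonal collocation follows the same idea but replaces the finite-difference stencil with derivatives of interpolating polynomials evaluated at collocation points.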
Abstract:
Fractional Calculus (FC) goes back to the beginnings of the theory of differential calculus. Nevertheless, applications of FC have emerged only in the last two decades, due to progress in the area of chaos, which revealed subtle relationships with FC concepts. In the field of dynamical systems theory some work has been carried out, but the proposed models and algorithms are still at a preliminary stage. With these ideas in mind, the paper discusses an FC perspective on the dynamics and control of some distributed parameter systems.
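As a concrete example of how fractional-order operators are handled numerically (a minimal sketch under the Grünwald-Letnikov definition; the abstract does not specify this particular scheme), the following approximates a fractional derivative of order alpha on a uniform grid.

```python
# Minimal Grünwald-Letnikov sketch of a fractional derivative of order alpha
# (illustrative only; not a scheme taken from the paper).
import numpy as np

def gl_fractional_derivative(f, t, alpha):
    """Approximate D^alpha f on the uniform grid t using the GL definition."""
    h = t[1] - t[0]
    n = len(t)
    # recurrence for the GL weights w_j = (-1)^j * binom(alpha, j)
    w = np.ones(n)
    for j in range(1, n):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    d = np.zeros(n)
    for k in range(n):
        d[k] = np.dot(w[: k + 1], f[k::-1]) / h**alpha
    return d

t = np.linspace(0.0, 2.0, 201)
print(gl_fractional_derivative(t**2, t, 0.5)[-1])  # half-derivative of t^2 at t = 2
```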
Abstract:
Recent work on argument structure has shown that there must be a synchronic relation between nouns and derived verbs that can be treated in structural terms. However, simple phonological/morphological identity or diachronic derivation between a verb and a noun cannot guarantee that there is a denominal structure in a synchronic approach. In this paper we examine the phenomenon of Denominal Verbs in Brazilian Portuguese and argue for a distinction between etymological and synchronic morphological derivation. The objectives of this paper are 1) to identify synchronic and formal criteria to define which diachronic Denominal Verbs can also be considered denominal under a synchronic analysis; and 2) to detect in which cases the label "denominal" can be justifiably abandoned. Based on the results of argument structure tests submitted to the judgement of native speakers, it was possible to classify the supposedly homogeneous class of Denominal Verbs into three major groups: Real Denominal Verbs, Root-derived Verbs, and Ambiguous Verbs. Within a Distributed Morphology approach, the distinction between these groups can be explained on the basis of the idea of phases within words and the locality restriction on the interpretation of roots.
Abstract:
One of the obstacles to improved security of the Internet is the ad hoc development of technologies with different design goals and different security goals. This paper proposes reconceptualizing the Internet as a secure distributed system, focusing specifically on the application layer. The notion is to redesign specific functionality based on principles discovered in research on distributed systems in the decades since the initial development of the Internet. Because of the problems of retrofitting new technology across millions of clients and servers, any option with prospects of success must support backward compatibility. This paper outlines a possible new architecture for Internet-based mail which would replace existing protocols with a more secure framework. To maintain backward compatibility, an initial implementation could offer a web browser-based front end, but the longer-term approach would be to implement the system using appropriate models of replication. (C) 2005 Elsevier Ltd. All rights reserved.
Abstract:
Are persistent marketing effects most likely to appear right after the introduction of a product? The authors give an affirmative answer to this question by developing a model that explicitly captures how persistent and transient marketing effects evolve over time. The proposed model provides managers with a valuable tool to evaluate their allocation of marketing expenditures over time. An application of the model to many pharmaceutical products, estimated through (exact initial) Kalman filtering, indicates that both persistent and transient effects occur predominantly immediately after a brand's introduction; subsequently, the size of the effects declines. The authors theoretically and empirically compare their methodology with approaches based on unit root testing and demonstrate that the need for unit root tests creates difficulties in applying conventional persistence modeling. The authors recommend that marketing models either accommodate persistent effects that change over time or be applied only to mature brands or limited time windows.
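A minimal sketch of the kind of time-varying-coefficient state-space model that Kalman filtering estimates is given below; it is an illustration only, not the authors' exact specification, and it approximates the exact-initial (diffuse) treatment with a large finite initial covariance.

```python
# Minimal Kalman-filter sketch for a state-space model with a time-varying
# marketing effect (illustrative only; not the authors' exact specification).
import numpy as np

def kalman_filter(y, x, q_level=0.1, q_beta=0.01, r=1.0):
    """State: [level_t, beta_t]; observation: y_t = level_t + beta_t * x_t + noise."""
    n = len(y)
    a = np.zeros(2)                 # state estimate
    P = np.eye(2) * 1e6             # large covariance approximating diffuse initialization
    Q = np.diag([q_level, q_beta])  # state noise variances (assumed values)
    betas = np.zeros(n)
    for t in range(n):
        P = P + Q                   # prediction (random-walk transition)
        Z = np.array([1.0, x[t]])   # observation vector
        f = Z @ P @ Z + r           # innovation variance
        v = y[t] - Z @ a            # innovation
        K = P @ Z / f               # Kalman gain
        a = a + K * v
        P = P - np.outer(K, Z @ P)
        betas[t] = a[1]
    return betas

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)          # e.g. detailing/advertising effort (synthetic)
y = 2.0 + 1.5 * x + rng.normal(0, 0.5, 100)
print(kalman_filter(y, x)[-1])      # filtered marketing effect at the end of the sample
```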
Abstract:
Many granulation plants operate well below design capacity, suffering from high recycle rates and even periodic instabilities. This behaviour cannot be fully predicted using present models. The main objective of the paper is to provide an overview of the current status of model development for granulation processes and to suggest future directions for research and development. The end use of the models is the optimal design and control of granulation plants based on improved predictions of process dynamics. The development of novel models involving mechanistically based structural switching is proposed in the paper, and a number of guidelines are given for the selection of control-relevant model structures. (C) 2002 Published by Elsevier Science B.V.
Abstract:
In this paper, we introduce a Distributed Artificial Intelligence (DAI) approach, hereinafter called Fuzzy Distributed Artificial Intelligence (FDAI). Through the use of fuzzy logic, we have been able to develop mechanisms that we feel may effectively improve current DAI systems, giving them much more flexibility and providing the grounding that a formal theory can bring. The appropriateness of the FDAI approach is explored in an important application, a fuzzy distributed traffic-light control system, in which we have been able to aggregate and study several issues concerning fuzzy and distributed artificial intelligence. We also present a number of current research directions necessary to develop the FDAI approach more fully.
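As an illustration of how fuzzy rules can drive a signal-timing decision (a minimal sketch, not the controller described in the paper; the membership ranges and rule outputs are assumed), consider a Sugeno-style green-time extension based on queue length and arrival rate.

```python
# Minimal fuzzy-inference sketch for traffic-signal green-time extension
# (illustrative only; not the FDAI controller described in the paper).
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def green_extension(queue, arrival_rate):
    """Sugeno-style rules with singleton outputs, defuzzified by a weighted average."""
    # input memberships (ranges are assumed for illustration)
    q_short, q_long = tri(queue, -1, 0, 10), tri(queue, 5, 15, 25)
    a_low, a_high = tri(arrival_rate, -1, 0, 0.5), tri(arrival_rate, 0.3, 1.0, 1.5)
    # rule base: (firing strength, crisp extension in seconds)
    rules = [
        (min(q_short, a_low), 0.0),    # short queue, few arrivals -> no extension
        (min(q_short, a_high), 5.0),   # short queue, many arrivals -> small extension
        (min(q_long, a_low), 10.0),    # long queue, few arrivals -> medium extension
        (min(q_long, a_high), 20.0),   # long queue, many arrivals -> large extension
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total > 0 else 0.0

print(green_extension(queue=12, arrival_rate=0.8))  # extension in seconds
```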
Abstract:
A bilevel programming approach for the optimal contract pricing of distributed generation (DG) in distribution networks is presented. The outer optimization problem corresponds to the owner of the DG, who must decide the contract price that maximizes his profits. The inner optimization problem corresponds to the distribution company (DisCo), which seeks to minimize the payments incurred in meeting the expected demand while satisfying network constraints. To meet the expected demand, the DisCo can purchase energy either from the transmission network through the substations or from the DG units within its network. The inner optimization problem is replaced by its Karush-Kuhn-Tucker optimality conditions, turning the bilevel programming problem into an equivalent single-level nonlinear programming problem, which is solved using commercially available software. © 2010 IEEE.
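In general terms, the reformulation step can be sketched as follows (generic notation, not the paper's exact formulation): write the DisCo's inner dispatch problem as min_x f(x; p) subject to g(x) <= 0 and h(x) = 0, where p is the DG contract price fixed at the outer level, and replace it by its Karush-Kuhn-Tucker conditions.

```latex
% Illustrative KKT-based single-level reformulation (generic notation)
\begin{align*}
\max_{p,\;x,\;\lambda,\;\mu}\quad & \Pi_{\mathrm{DG}}(p,x)
  && \text{DG owner's profit}\\
\text{s.t.}\quad
  & \nabla_{x} f(x;p) + \nabla_{x} g(x)^{\top}\lambda + \nabla_{x} h(x)^{\top}\mu = 0
  && \text{stationarity}\\
  & g(x) \le 0,\qquad h(x) = 0
  && \text{inner feasibility}\\
  & \lambda \ge 0,\qquad \lambda_{i}\,g_{i}(x) = 0 \;\;\forall i
  && \text{complementarity}
\end{align*}
```

The complementarity products make the resulting single-level problem nonconvex, which is why a nonlinear programming solver is required.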
Abstract:
This thesis describes modelling tools and methods suited to complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information, a hierarchy of models. The port-Hamiltonian framework is a good candidate for solving this kind of problem, as it natively supports the most important model operations. The thesis in particular addresses the problem of integrating distributed parameter systems into a model hierarchy and shows two possible mechanisms for doing so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
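For reference, a minimal sketch of the finite-dimensional port-Hamiltonian form that such discretizations target (generic notation, not taken from the thesis): with state x, Hamiltonian H, skew-symmetric interconnection matrix J, positive semidefinite dissipation matrix R, and port matrix B,

```latex
\begin{align*}
\dot{x} &= \bigl(J - R\bigr)\,\nabla H(x) + B\,u, \qquad y = B^{\top}\,\nabla H(x), \\
\frac{\mathrm{d}}{\mathrm{d}t}\,H\bigl(x(t)\bigr) &= -\,\nabla H(x)^{\top} R\,\nabla H(x) + y^{\top} u \;\le\; y^{\top} u .
\end{align*}
```

Preserving this power balance under spatial discretization and model order reduction is what "structure-preserving" refers to, and it is what keeps interconnections of the reduced models in a model hierarchy well behaved.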
Abstract:
Virtual Reality (VR) has grown to become state-of-the-art technology in many business- and consumer-oriented E-Commerce applications. One of the major design challenges of VR environments is the placement of the rendering process, which converts the abstract description of a scene, as contained in an object database, into an image. This process is usually carried out on the client side, as in VRML [1], a technology that relies on the client's computational power for smooth rendering. The vision of VR is also strongly connected to the issue of Quality of Service (QoS), as the perceived realism depends on an interactive frame rate ranging from 10 to 30 frames per second (fps), real-time feedback mechanisms, and realistic image quality. These requirements push traditional home computers, and even sophisticated graphical workstations, beyond their limits. Our work therefore introduces an approach for a distributed rendering architecture that gracefully balances the workload between the client and a cluster-based server. We believe that the distributed rendering approach described in this paper has three major benefits: it reduces the client's workload, it decreases network traffic, and it allows already rendered scenes to be re-used.
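A minimal sketch of the kind of placement decision such an architecture has to make (illustrative only; the cost model, parameter names, and numbers are assumptions, not the paper's design):

```python
# Minimal sketch of a client/server rendering-placement decision
# (illustrative only; not the architecture proposed in the paper).
def choose_renderer(scene_cost_mops, client_mops, server_mops,
                    frame_bytes, bandwidth_bps, target_fps=20):
    """Render where the estimated frame time is lower, and check the QoS budget."""
    client_time = scene_cost_mops / client_mops                  # local rendering
    server_time = scene_cost_mops / server_mops + frame_bytes * 8 / bandwidth_bps
    budget = 1.0 / target_fps                                    # QoS frame budget [s]
    placement = "client" if client_time <= server_time else "server"
    return placement, min(client_time, server_time) <= budget

# Example: heavy scene, modest client, fast cluster, 10 Mbit/s link (assumed numbers)
print(choose_renderer(scene_cost_mops=500, client_mops=2000, server_mops=20000,
                      frame_bytes=300_000, bandwidth_bps=10_000_000))
```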
Abstract:
The growing heterogeneity of networks, devices and consumption conditions calls for flexible and adaptive video coding solutions. The compression power of the HEVC standard and the benefits of the distributed video coding paradigm allow the design of novel scalable coding solutions with improved error robustness and low encoding complexity while still achieving competitive compression efficiency. In this context, this paper proposes a novel scalable video coding scheme using an HEVC Intra compliant base layer and a distributed coding approach in the enhancement layers (EL). This design inherits the HEVC compression efficiency while providing low encoding complexity at the enhancement layers. The temporal correlation is exploited at the decoder to create the EL side information (SI) residue, an estimation of the original residue. The EL encoder sends only the data that cannot be inferred at the decoder, thus exploiting the correlation between the original and SI residues; however, this correlation must be characterized with an accurate correlation model to obtain coding efficiency improvements. Therefore, this paper proposes a correlation modeling solution to be used at both encoder and decoder, without requiring a feedback channel. Experimental results confirm that the proposed scalable coding scheme has lower encoding complexity and provides BD-Rate savings of up to 3.43% in comparison with the HEVC Intra scalable extension under development. © 2014 IEEE.
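A minimal sketch of a Laplacian correlation-noise model, a common choice in distributed video coding (illustrative; not necessarily the model proposed in the paper), showing how the scale parameter could be fitted and used to weight decoder soft inputs:

```python
# Minimal sketch of a Laplacian correlation-noise model between the original
# and side-information residues (illustrative; not necessarily the paper's model).
import numpy as np

def laplacian_alpha(correlation_noise):
    """Fit the Laplacian scale parameter alpha = sqrt(2 / variance)."""
    var = np.var(correlation_noise)
    return np.sqrt(2.0 / var) if var > 0 else np.inf

def soft_input(si_value, candidate_value, alpha):
    """Likelihood of a candidate original value given its side-information value."""
    return 0.5 * alpha * np.exp(-alpha * abs(candidate_value - si_value))

rng = np.random.default_rng(0)
original = rng.normal(0, 10, 10_000)               # original EL residue (synthetic)
side_info = original + rng.laplace(0, 3, 10_000)   # decoder-side SI residue estimate
alpha = laplacian_alpha(side_info - original)
print(alpha, soft_input(si_value=4.0, candidate_value=0.0, alpha=alpha))
```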
Abstract:
One important step in the design of air stripping operations for the removal of VOCs is the choice of operating conditions, which are based on the phase ratio. This parameter directly sets the stripping factor and the efficiency of the operation. Its value has an upper limit determined by the flooding regime, which is predicted using empirical correlations, namely the one developed by Eckert. This type of approach is not suitable for the development of algorithms. Using a pilot-scale column and a convenient solution, the pressure drop was determined under different operating conditions and the experimental values were compared with the estimates. This particular research will be incorporated into a global model for simulating the dynamics of air stripping using a multivariable distributed parameter system.
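For context, a minimal sketch of the stripping-factor and transfer-unit relations that underlie such design calculations (standard dilute-solution forms with assumed example values; not the model developed in this work):

```python
# Minimal sketch of the stripping-factor / transfer-unit relation used in
# packed-column air stripping design (illustrative; not the paper's model).
import math

def stripping_factor(henry_dimensionless, gas_flow, liquid_flow):
    """S = H * (G / L), with H as the dimensionless Henry's constant."""
    return henry_dimensionless * gas_flow / liquid_flow

def ntu_required(c_in, c_out, S):
    """Number of transfer units for a dilute VOC and clean stripping air (Colburn form)."""
    if S <= 1.0:
        raise ValueError("stripping factor must exceed 1 for this form")
    return S / (S - 1.0) * math.log(((c_in / c_out) * (S - 1.0) + 1.0) / S)

# Example with assumed values: H = 0.2, air-to-water ratio G/L = 30, 99% removal
S = stripping_factor(0.2, gas_flow=30.0, liquid_flow=1.0)
print(S, ntu_required(c_in=100.0, c_out=1.0, S=S))
```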
Abstract:
In this work, mathematical programming models for the structural and operational optimisation of energy systems are developed and applied to a selection of energy technology problems. The studied cases are taken from industrial processes and from large regional energy distribution systems. The models are based on Mixed Integer Linear Programming (MILP), Mixed Integer Non-Linear Programming (MINLP) and on a hybrid approach combining Non-Linear Programming (NLP) and Genetic Algorithms (GA). The optimisation of the structure and operation of energy systems in urban regions is treated in the work. Firstly, distributed energy systems (DES) with different energy conversion units and annual variations of consumer heating and electricity demands are considered. Secondly, district cooling systems (DCS) with cooling demands for a large number of consumers are studied from a long-term planning perspective, based on given predictions of how consumer cooling demand will develop in the region. The work also comprises the development of applications for heat recovery systems (HRS), where the HRS of a paper machine dryer section is taken as an illustrative example. The heat sources in these systems are moist air streams. Models are developed for different types of equipment price functions. The approach is based on partitioning the overall temperature range of the system into a number of temperature intervals in order to take into account the strong nonlinearities due to condensation in the heat recovery exchangers. The influence of parameter variations on the solutions for heat recovery systems is analysed, firstly by varying cost factors and secondly by varying process parameters. Point-optimal solutions obtained with a fixed-parameter approach are compared with robust solutions obtained for given parameter variation ranges. Enhanced utilisation of excess heat in heat recovery systems with impingement drying, electricity generation from low-grade excess heat, and the use of absorption heat transformers to raise a stream temperature above the excess heat temperature are also studied.
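A minimal MILP sketch of the structural-optimisation idea, choosing which energy conversion units to build and how to run them over demand periods (illustrative only; the unit data, costs, and period structure are assumptions, not the thesis's models):

```python
# Minimal MILP sketch of distributed-energy-system structural optimisation
# (illustrative only; all data are assumed placeholders).
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

units = {            # capacity [MW], investment cost, fuel cost per MWh (assumed)
    "CHP":      dict(cap=5.0, invest=400.0, fuel=20.0),
    "HeatPump": dict(cap=3.0, invest=150.0, fuel=35.0),
    "Boiler":   dict(cap=8.0, invest=100.0, fuel=50.0),
}
periods = {"winter": dict(hours=4000, demand=6.0),   # heat demand [MW]
           "summer": dict(hours=4000, demand=1.5)}

prob = LpProblem("DES_design", LpMinimize)
build = {u: LpVariable(f"build_{u}", cat="Binary") for u in units}
q = {(u, p): LpVariable(f"q_{u}_{p}", lowBound=0) for u in units for p in periods}

# objective: investment cost plus operating (fuel) cost over all periods
prob += lpSum(units[u]["invest"] * build[u] for u in units) + \
        lpSum(units[u]["fuel"] * periods[p]["hours"] * q[u, p] / 1000.0
              for u in units for p in periods)
for p in periods:    # meet the heat demand in every period
    prob += lpSum(q[u, p] for u in units) >= periods[p]["demand"]
for u in units:      # output only from built units, within capacity
    for p in periods:
        prob += q[u, p] <= units[u]["cap"] * build[u]

prob.solve()
print({u: build[u].value() for u in units})
```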
Abstract:
The task of controlling urban traffic requires flexibility, adaptability, and the handling of uncertain information spread throughout the intersection network. The use of fuzzy set concepts conveys these characteristics and improves system performance. This paper reviews a distributed traffic control system built upon a fuzzy distributed architecture previously developed by the authors. The emphasis of the paper is on the application of the system to control part of the Campinas downtown area. Simulation experiments considering several traffic scenarios were performed to verify the capabilities of the system in controlling a set of coupled intersections. The performance of the proposed system is compared with conventional traffic control strategies under the same scenarios. The results obtained show that the distributed traffic control system outperforms conventional systems as far as average queue, average delay, and maximum delay measures are concerned.
Abstract:
Regulatory authorities in many countries, in order to maintain an acceptable balance between appropriate customer service quality and cost, are introducing performance-based regulation. These regulations impose penalties, and in some cases rewards, which introduce a component of financial risk to an electric power utility due to the uncertainty associated with preserving a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the Maximum Continuous Interruption Duration per customer (MCID). This paper describes a chronological Monte Carlo simulation approach to evaluate probability distributions of reliability indices, including the MCID, and the corresponding penalties. In order to achieve the desired efficiency, modern computational techniques are used for modelling (UML - Unified Modeling Language) as well as for programming (Object-Oriented Programming). Case studies on a simple distribution network and on real Brazilian distribution systems are presented and discussed. © Copyright KTH 2006.
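A minimal sketch of a chronological Monte Carlo estimate of the MCID distribution at a single load point (illustrative only; the failure and repair parameters and the penalty threshold are assumptions, not the paper's case data):

```python
# Minimal chronological Monte Carlo sketch for the distribution of the maximum
# continuous interruption duration at one load point (illustrative only;
# failure/repair parameters and the regulatory limit are assumed).
import numpy as np

rng = np.random.default_rng(1)
LAMBDA = 4.0            # failures per year (assumed)
MTTR_H = 3.0            # mean repair time in hours (assumed)
LIMIT_H = 8.0           # regulatory MCID limit in hours (assumed)
HOURS_PER_YEAR = 8760.0

def simulate_year():
    """Chronologically alternate up and down states; return the longest interruption."""
    t, longest = 0.0, 0.0
    while t < HOURS_PER_YEAR:
        t += rng.exponential(HOURS_PER_YEAR / LAMBDA)   # time to next failure
        if t >= HOURS_PER_YEAR:
            break
        repair = rng.exponential(MTTR_H)                # interruption duration
        longest = max(longest, repair)
        t += repair
    return longest

samples = np.array([simulate_year() for _ in range(20_000)])
print("mean MCID [h]:", samples.mean())
print("P(MCID > limit):", (samples > LIMIT_H).mean())   # probability of incurring a penalty
```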