914 results for optimum design theory


Relevance:

80.00%

Publisher:

Abstract:

One objective of artificial intelligence is to model the behavior of an intelligent agent interacting with its environment. The environment's transformations can be modeled as a Markov chain, whose state is partially observable to the agent and affected by its actions; such processes are known as partially observable Markov decision processes (POMDPs). While the environment's dynamics are assumed to obey certain rules, the agent does not know them and must learn. In this dissertation we focus on the agent's adaptation as captured by the reinforcement learning framework. This means learning a policy---a mapping of observations into actions---based on feedback from the environment. The learning can be viewed as browsing a set of policies while evaluating them by trial through interaction with the environment. The set of policies is constrained by the architecture of the agent's controller. POMDPs require a controller to have a memory. We investigate controllers with memory, including controllers with external memory, finite state controllers and distributed controllers for multi-agent systems. For these various controllers we work out the details of the algorithms which learn by ascending the gradient of expected cumulative reinforcement. Building on statistical learning theory and experiment design theory, a policy evaluation algorithm is developed for the case of experience re-use. We address the question of sufficient experience for uniform convergence of policy evaluation and obtain sample complexity bounds for various estimators. Finally, we demonstrate the performance of the proposed algorithms on several domains, the most complex of which is simulated adaptive packet routing in a telecommunication network.
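To make the abstract's central mechanism concrete, the sketch below runs REINFORCE-style gradient ascent of expected cumulative reward for a softmax reactive controller in a toy two-state POMDP. The toy environment, the memoryless controller and all constants are illustrative assumptions, not the dissertation's algorithms or domains.

```python
# Hedged sketch: likelihood-ratio (REINFORCE) gradient ascent of expected
# cumulative reward for a softmax reactive controller in a toy POMDP.
import numpy as np

rng = np.random.default_rng(0)
N_OBS, N_ACT, HORIZON = 2, 2, 10

def step(state, action):
    """Toy POMDP: reward 1 when the action matches the hidden state; the
    hidden state flips with prob. 0.1; the observation is correct with prob. 0.8."""
    reward = 1.0 if action == state else 0.0
    state = state if rng.random() > 0.1 else 1 - state
    obs = state if rng.random() < 0.8 else 1 - state
    return state, obs, reward

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

theta = np.zeros((N_OBS, N_ACT))              # controller parameters
for episode in range(2000):
    grad, ret = np.zeros_like(theta), 0.0
    state = int(rng.integers(2))
    obs = state
    for t in range(HORIZON):
        probs = softmax(theta[obs])
        action = int(rng.choice(N_ACT, p=probs))
        grad[obs] -= probs                    # d log pi(a|o) / d theta ...
        grad[obs, action] += 1.0              # ... accumulated over the episode
        state, obs, reward = step(state, action)
        ret += reward
    theta += 0.01 * ret * grad                # ascend the estimated gradient

print("learned action probabilities per observation:",
      np.round([softmax(row) for row in theta], 2))
```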

Relevance:

80.00%

Publisher:

Abstract:

We study a particular restitution problem in which there is an indivisible good (land or property) over which two agents have rights: the dispossessed agent and the owner. A third party, possibly the government, seeks to resolve the situation by assigning the rights to one of them and compensating the other. There is also a maximum amount of money available for the compensation. We characterize a family of asymmetrically fair rules that are immune to strategic behavior, guarantee minimal welfare levels for the agents, and satisfy the budget constraint.

Relevance:

80.00%

Publisher:

Abstract:

In the design process, decisions are made that can affect the manufacturability of the product. When the designer is an expert, he or she takes manufacturing limitations, properties and cost into account during the embodiment or detail design phase. The problem arises when the designer is not an expert, or when not enough manufacturing information and knowledge is available. Taking Axiomatic Design theory and DFM techniques as references, a methodology is proposed to identify, define and formalize the manufacturing information that should be available during design in order to design for manufacturing (DFM). A prototype information model is also proposed as the basis for a future software tool that would facilitate the application of this methodology and guide the designer during the design process. The methodology has been applied to the connecting rod of a reciprocating internal combustion engine and to the processes currently used to manufacture it: closed-die forging and powder-metal forging.
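As a purely illustrative sketch of what such an information model might look like (the thesis itself does not publish code), the snippet below encodes two manufacturing processes and checks a design feature against their capability limits; every field name, limit and value is a hypothetical example.

```python
# Hypothetical sketch of manufacturing information exposed to the designer.
from dataclasses import dataclass, field

@dataclass
class ManufacturingProcess:
    name: str                        # e.g. "closed-die forging"
    min_wall_thickness_mm: float     # process capability limit
    draft_angle_deg: float
    typical_tolerance_mm: float
    relative_cost_index: float       # coarse cost indicator

@dataclass
class DesignFeature:
    name: str
    wall_thickness_mm: float
    compatible: list = field(default_factory=list)

    def check(self, process: ManufacturingProcess) -> bool:
        # record which processes can produce this feature as designed
        ok = self.wall_thickness_mm >= process.min_wall_thickness_mm
        if ok:
            self.compatible.append(process.name)
        return ok

forging = ManufacturingProcess("closed-die forging", 3.0, 5.0, 0.5, 1.0)
pm_forging = ManufacturingProcess("powder-metal forging", 2.0, 0.0, 0.2, 1.3)
web = DesignFeature("connecting-rod web", wall_thickness_mm=2.5)
print([p.name for p in (forging, pm_forging) if web.check(p)])
```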

Relevance:

80.00%

Publisher:

Abstract:

Details about the parameters of kinetic systems are crucial for progress in both medical and industrial research, including drug development, clinical diagnosis and biotechnology applications. Such details must be collected through a series of kinetic experiments and investigations. Correct experimental design is essential for collecting data suitable for analysis, modelling and deriving the correct information. We have developed a systematic and iterative Bayesian method, together with sets of rules, for the design of enzyme kinetic experiments. Our method selects the optimum design to collect data suitable for accurate modelling and analysis, and minimises the error in the estimated parameters. The rules select features of the design such as the substrate range and the number of measurements. We show here that this method can be directly applied to the study of other important kinetic systems, including drug transport, receptor binding, microbial culture and cell transport kinetics. It is possible to reduce the errors in the estimated parameters and, most importantly, to increase efficiency and cost-effectiveness by reducing the number of experiments and data points that must be measured. (C) 2003 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
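A minimal sketch of the underlying idea rather than the paper's Bayesian rule set: given prior point estimates of the Michaelis-Menten parameters, pick the substrate concentrations that maximise the determinant of the Fisher information (a D-optimal design), which shrinks the confidence region of the estimated parameters. A fully iterative Bayesian scheme would update the prior after each round of measurements; the prior values, noise level and candidate grid below are assumptions.

```python
# Hedged sketch: D-optimal choice of 4 substrate concentrations for
# estimating (Vmax, Km) in the Michaelis-Menten model v = Vmax*S/(Km+S).
import itertools
import numpy as np

Vmax, Km, sigma = 1.0, 0.5, 0.05      # prior point estimates / noise level

def fisher_information(substrates):
    """Fisher information for (Vmax, Km) under constant Gaussian noise."""
    S = np.asarray(substrates, dtype=float)
    dv_dVmax = S / (Km + S)
    dv_dKm = -Vmax * S / (Km + S) ** 2
    J = np.column_stack([dv_dVmax, dv_dKm])   # sensitivity matrix
    return J.T @ J / sigma**2

candidates = np.linspace(0.05, 5.0, 25)       # feasible substrate range
best = max(itertools.combinations(candidates, 4),
           key=lambda d: np.linalg.det(fisher_information(d)))
print("D-optimal 4-point design:", np.round(best, 2))
```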

Relevance:

80.00%

Publisher:

Abstract:

This work addresses the relevant issues in the analytical assessment of finished structures, presenting the fundamental aspects of their inspection and safety verification. First, the assessment methods are presented, emphasizing the advantages and limitations of each method. Regarding inspection, the main items that should be checked in finished structures are presented and discussed. Particular emphasis is given to the inspection of concrete strength, discussing aspects of its variation within structures and presenting the test methods employed. A methodology for concrete inspection is proposed, rationalizing the use of non-destructive methods. Finally, the theoretical aspects of safety verification for finished structures are highlighted, along with the main differences from the way safety is introduced in the design of new structures.

Relevance:

80.00%

Publisher:

Abstract:

This thesis describes and demonstrates a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood or solved without quantitative information. The concept intends to combine the analytical power of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation and CFD. Design dilemmas are formulated and framed according to the architect's process of reflection about performance aspects. Throughout the thesis, the problem is investigated in three fields: the professional, technical and theoretical fields. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) review preceding research on the use of simulation tools, and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands, all houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved. The practical application, despite its success in the research context, revealed some limitations on the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of development of simulation tools.

Relevance:

80.00%

Publisher:

Abstract:

This work proposes a computational methodology for solving optimization problems in structural design. The application develops, implements and integrates methods for structural analysis, geometric modeling, design sensitivity analysis and optimization. The optimum design problem is particularized to the plane stress case, with the objective of minimizing the structural mass subject to a stress criterion. Note that these constraints must be evaluated at a series of discrete points, whose distribution should be dense enough to minimize the chance of any significant constraint violation between the specified points. Therefore, the local stress constraints are transformed into a global stress measure, reducing the computational cost of deriving the optimal shape design. The problem is approximated by the Finite Element Method using six-node Lagrangian triangular elements, with automatic mesh generation governed by a geometric element-quality criterion. In the geometric modeling, the contour is defined by parametric B-spline curves, whose characteristics are well suited to the shape optimization method, which uses the curves' key points as design variables in the minimization problem. A reliable design sensitivity analysis is a prerequisite for interactive structural design, synthesis and optimization. General expressions for design sensitivity analysis are derived with respect to the key points of the B-splines, using the adjoint approach and the analytical method. The optimization problem is formulated with the Augmented Lagrangian Method, which converts the constrained optimization problem into an unconstrained one; minimizing the Augmented Lagrangian function requires the sensitivity analysis. The optimization problem therefore reduces to the solution of a sequence of problems with bound constraints, which is solved by the Memoryless Quasi-Newton Method. Several examples demonstrate that this new approach of analytical design sensitivity analysis, integrated into shape design optimization with a global stress criterion, is computationally efficient.
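A compact sketch of two of the ingredients named above, aggregation of local stress constraints into a single global measure and the Augmented Lagrangian Method, applied to a deliberately tiny two-bar sizing problem. The structure, loads, allowable stress and the generic unconstrained solver are assumptions; the thesis itself works with B-spline shape variables, finite elements and analytical adjoint sensitivities.

```python
# Hedged sketch: mass minimisation with local stress constraints aggregated
# into a global p-norm measure and enforced by an augmented Lagrangian,
# i.e. a sequence of unconstrained subproblems.
import numpy as np
from scipy.optimize import minimize

L = np.array([1.0, 1.4])            # bar lengths
F = np.array([10.0, 16.0])          # axial forces
SIGMA_ALLOW, P = 100.0, 8           # allowable stress, aggregation exponent

def mass(A):
    return float(L @ A)

def global_stress(A):
    """p-norm aggregation of normalised local stresses; g <= 0 is feasible."""
    ratios = (F / np.maximum(A, 1e-6)) / SIGMA_ALLOW
    return float((ratios**P).sum() ** (1.0 / P) - 1.0)

A, lam, r = np.array([1.0, 1.0]), 0.0, 10.0
for outer in range(10):
    def L_aug(A, lam=lam, r=r):      # augmented Lagrangian (inequality form)
        g = global_stress(A)
        return mass(A) + (max(0.0, lam + r * g) ** 2 - lam**2) / (2 * r)
    A = minimize(L_aug, A, method="Nelder-Mead").x   # unconstrained subproblem
    lam = max(0.0, lam + r * global_stress(A))       # multiplier update
    r *= 2.0                                          # tighten the penalty

print("optimised areas:", np.round(A, 3), " mass:", round(mass(A), 3))
```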

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

80.00%

Publisher:

Abstract:

This work addresses the conceptions and practices of the participants in the full teaching degree in Rural Education (Licenciatura Plena em Educação do Campo, LPEC) offered by the Instituto Federal de Educação/IFPA at the Castanhal campus (Pará), whose political-pedagogical basis is the Rural Education Program (PROCAMPO). Based on this object of study, the objectives were to identify the theoretical and methodological conception of PROCAMPO in the training of rural educators, to analyse the dynamics of the LPEC course at the Castanhal campus, and to identify the course's contributions to the practice of rural educators. To reach these objectives, the investigation combined bibliographic, documentary and field research. The bibliographic research reviewed authors who provide theoretical support for education as a humanizing practice, notably Brandão (2007), Saviani (1994), Mészáros (2008), Freire (2011), Caldart (2004), Fernandes (2004) and Arroyo (1999), among others. The documentary research covered the pedagogical project of the LPEC course; the original MEC/Secad draft; Public Call No. 02 of 23/04/2008 (the public call for the selection of projects from public higher education institutions for PROCAMPO); the call of 16/04/2009 and Call No. 026/2010 (both concerning the LPEC selection process at IFPA); and the list of students enrolled in PROCAMPO IFPA/Castanhal in 2010, among other documents. The qualitative field research used a case study approach and semi-structured interviews, with the time frame limited to 2008-2012, since PROCAMPO was implemented in the first year of that period. The results indicate that, despite pedagogical and financial difficulties, the implementation of the course in Castanhal has been having positive effects on the practice of rural educators.

Relevance:

80.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

80.00%

Publisher:

Abstract:

The objective of the present work was to study the control of the dynamics of heteronuclear diatomic molecules interacting with electric fields created by lasers. Specifically, the phenomenon of molecular photoassociation is analyzed. In this phenomenon, the relative motion of the atoms is described by a particle moving in a Morse potential well under the influence of an external time-dependent force related to the external field. Based on optimal control theory (OCT), laser pulses are presented that drive a given initial molecular state to a desired final state; this objective is expressed as the minimization of a cost functional that indicates how close the final state is to the target. To this end, a computational system known as a Genetic Algorithm (GA) was developed, an efficient technique capable of scanning the solution space and finding results close to the optimum.
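A minimal sketch of the genetic-algorithm machinery described above. Propagating the driven Morse oscillator is beyond a short example, so a cheap surrogate cost functional stands in for the true one; the chromosome layout (pulse amplitude, carrier frequency, chirp), the surrogate optimum and every numeric setting are assumptions, not the thesis implementation.

```python
# Hedged GA sketch: evolve laser-pulse parameters to minimise a cost
# functional. In the real problem the cost would come from propagating the
# driven Morse oscillator and measuring the distance to the target state.
import numpy as np

rng = np.random.default_rng(1)
BOUNDS = np.array([[0.0, 1.0],      # pulse amplitude (arb. units)
                   [0.5, 2.0],      # carrier frequency
                   [-0.5, 0.5]])    # linear chirp rate

def cost(params):
    # Surrogate: pretend the optimum pulse is (0.7, 1.3, 0.1).
    return float(np.sum((params - np.array([0.7, 1.3, 0.1])) ** 2))

def random_individual():
    return BOUNDS[:, 0] + rng.random(3) * (BOUNDS[:, 1] - BOUNDS[:, 0])

pop = np.array([random_individual() for _ in range(40)])
for gen in range(100):
    fitness = np.array([cost(p) for p in pop])
    # tournament selection: keep the better of two random individuals
    parents = pop[[min(rng.integers(0, 40, 2), key=lambda i: fitness[i])
                   for _ in range(40)]]
    # uniform crossover between consecutive parents
    mask = rng.random((40, 3)) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Gaussian mutation, clipped to the search bounds
    children += rng.normal(0.0, 0.02, children.shape)
    pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])

best = min(pop, key=cost)
print("best pulse parameters:", np.round(best, 3))
```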

Relevance:

80.00%

Publisher:

Abstract:

In this paper, the effects of uncertainty and of expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results are dependent on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
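The trade-off described above can be illustrated with a toy expected-cost calculation: the risk-optimization objective is construction cost plus failure probability times failure cost, and the cost-optimal safety level shifts with the cost of failure. The cost model, the reliability-index expression and all numbers below are illustrative assumptions.

```python
# Hedged sketch of the risk-optimization (RO) idea: minimise
# total expected cost = construction cost + Pf * failure cost.
import numpy as np
from scipy.stats import norm

safety_factor = np.linspace(1.0, 3.0, 201)
construction_cost = 100.0 * safety_factor          # grows with material used
beta = 2.0 * np.log(safety_factor) / 0.3           # toy reliability index
pf = norm.cdf(-beta)                                # failure probability

for failure_cost in (1e3, 1e5):                     # cheap vs. expensive failure
    total = construction_cost + pf * failure_cost
    k = int(np.argmin(total))
    print(f"failure cost {failure_cost:8.0f}: optimum SF = "
          f"{safety_factor[k]:.2f}, Pf = {pf[k]:.1e}")
```

The higher the monetary consequence of failure, the further the cost-optimal design moves toward larger safety factors and smaller failure probabilities, which is the point the abstract makes about safety factors and failure-probability constraints not controlling the optimum by themselves.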

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes the incorporation of engineering knowledge through both (a) advanced state-of-the-art preference-handling decision-making tools integrated in multiobjective evolutionary algorithms and (b) engineering-knowledge-based variance-reduction simulation, as enhancing tools for the robust optimum design of structural frames taking uncertainties in the design variables into consideration. The simultaneous minimization of the constrained weight (the sum of the structural weight and the average distribution of constraint violations) on the one hand, and of the standard deviation of the distribution of constraint violations on the other, is handled with multiobjective-optimization-based evolutionary computation in two different multiobjective algorithms. The optimum design values of the deterministic structural problem in question are proposed as a reference point (the aspiration level) in reference-point-based evolutionary multiobjective algorithms (here g-dominance is used). Results including
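For reference, a minimal sketch of reference-point comparison as g-dominance is commonly defined (a point is preferred if it meets all aspiration levels or none of them, with ties broken by ordinary Pareto dominance); the paper's implementation may differ, and the objective vectors below are made-up points in the (constrained weight, standard deviation of constraint violation) space.

```python
# Hedged sketch of g-dominance (reference-point-based comparison), minimisation.
import numpy as np

def pareto_dominates(a, b):
    return bool(np.all(a <= b) and np.any(a < b))

def g_flag(f, g):
    # 1 if f meets every aspiration level of g, or misses every one of them
    return int(np.all(f <= g) or np.all(f >= g))

def g_dominates(a, b, g):
    fa, fb = g_flag(a, g), g_flag(b, g)
    return fa > fb or (fa == fb and pareto_dominates(a, b))

# objectives: (constrained weight, std. dev. of constraint violation)
g = np.array([120.0, 5.0])               # aspiration level (reference point)
a = np.array([115.0, 4.0])               # meets both aspiration levels
b = np.array([110.0, 6.0])               # meets only one of them
print(g_dominates(a, b, g), g_dominates(b, a, g))   # True False
```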

Relevance:

80.00%

Publisher:

Abstract:

Chapter 1 introduces the basic tools and machinery used within this thesis. Most of the definitions used in the thesis are given, and we provide a basic survey of topics in graph theory and design theory pertinent to the topics studied in this thesis. In Chapter 2, we are concerned with the study of fixed block configuration group divisible designs, GDD(n, m, k; λ1, λ2). We study those GDDs in which each block has configuration (s, t), that is, GDDs in which each block has exactly s points from one of the two groups and t points from the other. Chapter 2 begins with an overview of previous results and constructions for small group sizes and block sizes 3, 4 and 5. Chapter 2 is largely devoted to presenting constructions and results about GDDs with two groups and block size 6. We show the necessary conditions are sufficient for the existence of GDD(n, 2, 6; λ1, λ2) with fixed block configuration (3, 3). For configuration (1, 5), we give minimal or near-minimal index constructions for all group sizes n ≥ 5 except n = 10, 15, 160 or 190. For configuration (2, 4), we provide constructions for several families of GDD(n, 2, 6; λ1, λ2)s. Chapter 3 addresses the characterization of (3, r)-regular graphs. We begin with previous results on the well-studied class of (2, r)-regular graphs and some results on the structure of large (t, r)-regular graphs. In Chapter 3, we completely characterize all (3, 1)-regular and (3, 2)-regular graphs, and sharpen existing bounds on the order of large (3, r)-regular graphs of a certain form for r ≥ 3. Finally, the appendix gives computational data resulting from Sage and C programs used to generate (3, 3)-regular graphs on fewer than 10 vertices.

Relevance:

80.00%

Publisher:

Abstract:

This dissertation concerns the intersection of three areas of discrete mathematics: finite geometries, design theory, and coding theory. The central theme is the power of finite geometry designs, which are constructed from the points and t-dimensional subspaces of a projective or affine geometry. We use these designs to construct and analyze combinatorial objects which inherit their best properties from these geometric structures. A central question in the study of finite geometry designs is Hamada’s conjecture, which proposes that finite geometry designs are the unique designs with minimum p-rank among all designs with the same parameters. In this dissertation, we will examine several questions related to Hamada’s conjecture, including the existence of counterexamples. We will also study the applicability of certain decoding methods to known counterexamples. We begin by constructing an infinite family of counterexamples to Hamada’s conjecture. These designs are the first infinite class of counterexamples for the affine case of Hamada’s conjecture. We further demonstrate how these designs, along with the projective polarity designs of Jungnickel and Tonchev, admit majority-logic decoding schemes. The codes obtained from these polarity designs attain error-correcting performance which is, in certain cases, equal to that of the finite geometry designs from which they are derived. This further demonstrates the highly geometric structure maintained by these designs. Finite geometries also help us construct several types of quantum error-correcting codes. We use relatives of finite geometry designs to construct infinite families of q-ary quantum stabilizer codes. We also construct entanglement-assisted quantum error-correcting codes (EAQECCs) which admit a particularly efficient and effective error-correcting scheme, while also providing the first general method for constructing these quantum codes with known parameters and desirable properties. Finite geometry designs are used to give exceptional examples of these codes.
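The p-rank at the heart of Hamada's conjecture is simply the rank of the design's point-block incidence matrix over GF(p). The self-contained sketch below computes it for the Fano plane, the point-line design of PG(2,2), whose 2-rank is known to be 4; the elimination routine is generic and not taken from the dissertation.

```python
# Hedged sketch: rank of a design's incidence matrix over GF(p), applied
# to the Fano plane (points and lines of PG(2,2)).
import numpy as np

def rank_mod_p(M, p):
    """Rank over GF(p) by Gaussian elimination on an integer matrix."""
    A = np.array(M, dtype=int) % p
    rows, cols = A.shape
    rank = 0
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if A[r, c]), None)
        if pivot is None:
            continue
        A[[rank, pivot]] = A[[pivot, rank]]                    # swap rows
        A[rank] = (A[rank] * pow(int(A[rank, c]), -1, p)) % p  # scale pivot to 1
        for r in range(rows):
            if r != rank and A[r, c]:
                A[r] = (A[r] - A[r, c] * A[rank]) % p          # clear the column
        rank += 1
    return rank

fano_lines = [{0, 1, 2}, {0, 3, 4}, {0, 5, 6}, {1, 3, 5},
              {1, 4, 6}, {2, 3, 6}, {2, 4, 5}]
incidence = [[int(pt in line) for line in fano_lines] for pt in range(7)]
print("2-rank of the Fano plane design:", rank_mod_p(incidence, 2))  # expected 4
```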