953 results for bilevel programming
Abstract:
Incremental parsing has long been recognized as a technique of great utility in the construction of language-based editors, and correspondingly, the area currently enjoys a mature theory. Unfortunately, many practical considerations have been largely overlooked in previously published algorithms. Many user requirements for an editing system necessarily impact on the design of its incremental parser, but most approaches focus only on one: response time. This paper details an incremental parser based on LR parsing techniques and designed for use in a modeless syntax recognition editor. The nature of this editor places significant demands on the structure and quality of the document representation it uses, and hence, on the parser. The strategy presented here is novel in that both the parser and the representation it constructs are tolerant of the inevitable and frequent syntax errors that arise during editing. This is achieved by a method that differs from conventional error repair techniques, and that is more appropriate for use in an interactive context. Furthermore, the parser aims to minimize disturbance to this representation, not only to ensure other system components can operate incrementally, but also to avoid unfortunate consequences for certain user-oriented services. The algorithm is augmented with a limited form of predictive tree-building, and a technique is presented for the determination of valid symbols for menu-based insertion. Copyright (C) 2001 John Wiley & Sons, Ltd.
Abstract:
In population pharmacokinetic studies, the precision of parameter estimates depends on the population design. Methods based on the Fisher information matrix have been developed and extended to population studies to evaluate and optimize designs. In this paper we propose simple programming tools to evaluate population pharmacokinetic designs. This involved the development of an expression for the Fisher information matrix for nonlinear mixed-effects models, including estimation of the variance of the residual error. We implemented this expression as a generic function for two software applications: S-PLUS and MATLAB. The evaluation of population designs based on two pharmacokinetic examples from the literature is shown to illustrate the efficiency and the simplicity of this theoretical approach. Although no optimization method of the design is provided, these functions can be used to select and compare population designs among a large set of possible designs, avoiding extensive simulation.
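To give a sense of the computation the abstract describes, here is a minimal sketch of a Fisher information matrix for an individual design with additive residual error, F = (1/sigma^2) J^T J, evaluated by finite differences. The one-compartment model, parameter values, and sampling times are hypothetical, and the full population (mixed-effects) expression with residual-variance estimation is not reproduced:

```python
import math

def jacobian(model, theta, times, eps=1e-6):
    """Finite-difference Jacobian of model predictions wrt parameters.
    Returns cols[i][k] = d f(t_k) / d theta_i."""
    base = [model(theta, t) for t in times]
    cols = []
    for i in range(len(theta)):
        th = list(theta)
        th[i] += eps
        cols.append([(model(th, t) - b) / eps for b, t in zip(base, times)])
    return cols

def fisher_information(model, theta, times, sigma2):
    """F = (1/sigma^2) J^T J: a simplified individual-design stand-in
    for the population Fisher information matrix of the paper."""
    J = jacobian(model, theta, times)
    p, n = len(theta), len(times)
    return [[sum(J[i][k] * J[j][k] for k in range(n)) / sigma2
             for j in range(p)] for i in range(p)]

# Hypothetical one-compartment model after an IV bolus: C(t) = (D/V) exp(-k t)
def conc(theta, t, dose=100.0):
    V, k = theta
    return dose / V * math.exp(-k * t)

# theta = (V, k); four hypothetical sampling times
F = fisher_information(conc, [10.0, 0.2], [0.5, 2.0, 6.0, 12.0], sigma2=1.0)
```

The determinant of F (D-optimality) or its inverse (expected parameter covariance) can then be used to rank candidate sampling schedules, which is the kind of design comparison the abstract has in mind.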
Abstract:
This paper presents the multi-threading and internet message communication capabilities of Qu-Prolog. Message addresses are symbolic, and the communications package provides high-level support that completely hides details of IP addresses and port numbers as well as the underlying TCP/IP transport layer. The combination of multi-threading and high-level inter-thread message communication provides simple, powerful support for implementing internet-distributed intelligent applications.
Abstract:
The problem of designing spatially cohesive nature reserve systems that meet biodiversity objectives is formulated as a nonlinear integer programming problem. The multiobjective function minimises a combination of boundary length, area and failed representation of the biological attributes we are trying to conserve. The task is to reserve a subset of sites that best meet this objective. We use data on the distribution of habitats in the Northern Territory, Australia, to show how simulated annealing and a greedy heuristic algorithm can be used to generate good solutions to such large reserve design problems, and to compare the effectiveness of these methods.
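The simulated annealing and greedy heuristics the abstract compares can be sketched on a toy instance. The sites, habitat sets, and penalty weight below are hypothetical, and the objective is simplified to reserved area plus a penalty for unrepresented habitats (the boundary-length term is omitted):

```python
import math
import random

# Hypothetical toy instance: 6 candidate sites, each with an area and habitats.
AREAS    = [4.0, 2.0, 3.0, 1.0, 5.0, 2.5]
HABITATS = [{"a", "b"}, {"b"}, {"c"}, {"a"}, {"b", "c", "d"}, {"d"}]
TARGETS  = {"a", "b", "c", "d"}   # every habitat must be represented
PENALTY  = 100.0                  # cost per unrepresented habitat

def cost(selected):
    """Objective: total reserved area plus a penalty per missing habitat."""
    covered = set().union(*(HABITATS[i] for i in selected)) if selected else set()
    return sum(AREAS[i] for i in selected) + PENALTY * len(TARGETS - covered)

def greedy():
    """Repeatedly add the site with the best cost improvement until none helps."""
    selected = set()
    while True:
        best, best_c = None, cost(selected)
        for i in set(range(len(AREAS))) - selected:
            c = cost(selected | {i})
            if c < best_c:
                best, best_c = i, c
        if best is None:
            return selected
        selected.add(best)

def anneal(steps=5000, t0=50.0, seed=0):
    """Toggle random sites, accepting worse moves with Boltzmann probability;
    starts from the greedy solution and tracks the best state seen."""
    rng = random.Random(seed)
    sel = greedy()
    best, best_c = set(sel), cost(sel)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9          # linear cooling schedule
        cand = sel ^ {rng.randrange(len(AREAS))}  # toggle one site in or out
        d = cost(cand) - cost(sel)
        if d <= 0 or rng.random() < math.exp(-d / t):
            sel = cand
            if cost(sel) < best_c:
                best, best_c = set(sel), cost(sel)
    return best
```

On this instance the greedy heuristic already reaches the optimum (sites 3 and 4 cover all habitats with area 6.0); on large real instances such as the Northern Territory data, annealing typically escapes the local optima where greedy stalls.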
Abstract:
The principal aim of this paper is to measure the amount by which the profit of a multi-input, multi-output firm deviates from maximum short-run profit, and then to decompose this profit gap into components that are of practical use to managers. In particular, our interest is in the measurement of the contribution of unused capacity, along with measures of technical inefficiency, and allocative inefficiency, in this profit gap. We survey existing definitions of capacity and, after discussing their shortcomings, we propose a new ray economic capacity measure that involves short-run profit maximisation, with the output mix held constant. We go on to describe how the gap between observed profit and maximum profit can be calculated and decomposed using linear programming methods. The paper concludes with an empirical illustration, involving data on 28 international airline companies. The empirical results indicate that these airline companies achieve profit levels which are on average US$815m below potential levels, and that 70% of the gap may be attributed to unused capacity. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
Applying programming techniques to detailed data for 406 rice farms in 21 villages, for 1997, produces inefficiency measures which differ substantially from the results of simple yield and unit cost measures. For the Boro (dry) season, mean technical efficiency was 69.4 per cent, allocative efficiency was 81.3 per cent, cost efficiency was 56.2 per cent and scale efficiency 94.9 per cent. The Aman (wet) season results are similar, but a few points lower. Allocative inefficiency is due to overuse of labour, suggesting population pressure, and of fertiliser, where recommended rates may warrant revision. Second-stage regressions show that large families are more inefficient, whereas farmers with better access to input markets, and those who do less off-farm work, tend to be more efficient. The information on the sources of inter-farm performance differentials could be used by extension agents to help inefficient farmers. There is little excuse for such sub-optimal use of survey data, which are often collected at substantial cost.
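These figures follow the standard Farrell decomposition, in which cost efficiency is the product of technical and allocative efficiency. A quick check of the reported Boro-season numbers (the small discrepancy is rounding in the published percentages):

```python
def cost_efficiency(technical_pct: float, allocative_pct: float) -> float:
    """Farrell decomposition: cost efficiency is the product of
    technical and allocative efficiency (arguments in per cent)."""
    return technical_pct * allocative_pct / 100.0

# Boro season: technical 69.4%, allocative 81.3% -> cost efficiency ~56.4%,
# consistent with the reported 56.2% up to rounding of the inputs.
boro_cost = cost_efficiency(69.4, 81.3)
```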
Abstract:
Existing refinement calculi provide frameworks for the stepwise development of imperative programs from specifications. This paper presents a refinement calculus for deriving logic programs. The calculus contains a wide-spectrum logic programming language, including executable constructs such as sequential conjunction, disjunction, and existential quantification, as well as specification constructs such as general predicates, assumptions and universal quantification. A declarative semantics is defined for this wide-spectrum language based on executions. Executions are partial functions from states to states, where a state is represented as a set of bindings. The semantics is used to define the meaning of programs and specifications, including parameters and recursion. To complete the calculus, a notion of correctness-preserving refinement over programs in the wide-spectrum language is defined and refinement laws for developing programs are introduced. The refinement calculus is illustrated using example derivations and prototype tool support is discussed.
Abstract:
In this paper we describe a distributed object-oriented logic programming language in which an object is a collection of threads deductively accessing and updating a shared logic program. The key features of the language, such as static and dynamic object methods and multiple inheritance, are illustrated through a series of small examples. We show how we can implement object servers, allowing remote spawning of objects, which we can use as staging posts for mobile agents. We give as an example an information-gathering mobile agent that can be queried about the information it has so far gathered whilst it is gathering new information. Finally, we define a class of co-operative reasoning agents that can do resource-bounded inference for full first-order predicate logic, handling multiple queries and information updates concurrently. We believe that the combination of the concurrent OO and LP programming paradigms produces a powerful tool for quickly implementing rational multi-agent applications on the internet.
Abstract:
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations, in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
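The core back-to-back idea can be sketched minimally: run two independently written implementations on identical inputs and inspect the residuals between their outputs. The toy first-order system below is hypothetical, and the error-dependent observer structure of the paper is not reproduced:

```python
def simulate_ref(x0: float, n: int) -> list:
    """Reference implementation of a toy discretised system x' = -0.5 x
    (explicit Euler with step dt = 0.1)."""
    xs, x, dt = [], x0, 0.1
    for _ in range(n):
        x = x + dt * (-0.5 * x)
        xs.append(x)
    return xs

def simulate_alt(x0: float, n: int) -> list:
    """Independent re-implementation of the same update, written differently;
    a planted coding error here would show up as structured residuals."""
    xs, x = [], x0
    for _ in range(n):
        x *= (1.0 - 0.05)   # algebraically the same update as the reference
        xs.append(x)
    return xs

def residuals(a: list, b: list) -> list:
    """Back-to-back residuals: element-wise difference of the two outputs.
    A persistent nonzero pattern points at a coding error in one version."""
    return [ai - bi for ai, bi in zip(a, b)]
```

With both implementations correct, the residuals stay at floating-point noise level; introducing a sign or coefficient error in either version produces a systematic, growing residual that localises the fault.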
Abstract:
The article aims to show the difficulties of making Brazilian film-making sustainable. Based on an analysis of the production chain, the tax structure, and the flaws in tax-incentive legislation, it argues that a vicious cycle of dependence on tax incentives exists. Given the difficulty of access to cinemas, due among other factors to the population's low purchasing power and the geographic concentration of cinemas in large cities, an effective partnership between cinema and television, facilitated by digital technology, is proposed. This will require changes to current legislation, as well as the creation of regulatory rules, as a way both to broaden the access of national films to television programming schedules and to make this economic activity viable in industrial terms.
Abstract:
This study uses the DEA (Data Envelopment Analysis) methodology to evaluate the efficiency of the 22 Social Security Agencies of the Fortaleza Executive Management unit (APS-GEXFOR). DEA uses linear programming whose core analytical structures derive from the original CCR model (Charnes, Cooper, and Rhodes). Applied to DMUs (Decision Making Units), it defines an efficiency frontier that identifies efficient and inefficient units. The DEA-CCR model as implemented in the DEA Solver software was used. The Social Security administration (INSS) maintains performance indicators; some of the variables used in the implemented model derive from these indicators, while other information was provided by the institution's information systems. The DEA efficiency evaluation of the APS-GEXFOR made it possible to identify best practices, measure the contribution of each variable involved in a unit's evaluation, and project the inefficient units onto the efficiency frontier, identifying the targets they must reach to become efficient within the observed set.
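As a small illustration of the CCR idea: in the special case of a single input and a single output, the CCR linear program reduces to comparing each unit's productivity ratio with the best observed ratio. The agency data below are hypothetical, and a real evaluation like the one described would solve the multi-input, multi-output envelopment LP instead:

```python
def ccr_efficiency(inputs: list, outputs: list) -> list:
    """CCR efficiency with one input and one output: each DMU's
    productivity ratio (output/input) relative to the best ratio
    in the observed set. The frontier unit scores exactly 1.0."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical data for five agencies:
# input = staff employed, output = claims processed per month.
staff  = [10, 20, 15, 8, 12]
claims = [100, 150, 180, 72, 90]
scores = ccr_efficiency(staff, claims)
```

Inefficient units are then projected onto the frontier by scaling their input down by their score, which is the single-input analogue of the target-setting step the abstract describes.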
Abstract:
This article analyses the question of programming quality on Brazilian television in light of the proposed new regulatory framework for the electronic social-communication sector. Among other provisions, this new law will regulate Article 221 of the Federal Constitution, which sets out the principles that television content must follow. Quality is thus defined with two aspects in mind: diversity, and the qualifications to freedom of expression, both provided for in the Federal Constitution. From this conceptualisation, the article proposes instruments for social control over television content and guarantees of the means for programming diversity. Regarding the first aspect, it recommends transparent action by a future regulatory agency and the implementation of a mechanism for individual control of programming. As for diversity, it stresses the importance of strengthening public television and of government measures to stimulate the multiprogramming made possible by the advent of digital technology.
Abstract:
The research analyses the historical constitution of the History of Education discipline taught at the Faculdade de Filosofia Ciências e Letras do Estado do Espírito Santo, later incorporated into the Universidade Federal do Espírito Santo, between 1951 and 2000. It investigates the historical constitution of the discipline and the programmatic, legal, and institutional transformations affecting it, as well as its historiographical approaches, periodisations, and concepts of time, history, and education. The theoretical and methodological grounding draws dialogically on the conceptual and methodological constructions of Carlo Ginzburg and Mikhail Bakhtin. Using the concepts of polyphony and dialogism, common to both authors, the study investigated the voices and dialogues imprinted in the narratives of the History of Education discipline and its teaching, whether in more superficial or deeper layers, as found in the documentary corpus consulted and analysed: teaching programmes, transparencies, laws, curricular structures, departmental documents, reviews and reading notes, required and supplementary bibliographies, assessments, and interviews. Within this corpus, apparently negligible data (clues, traces, and signs) were sought in order to reconstruct a complex historical reality that cannot be experienced directly. In historically tracing the trajectory of the History of Education discipline and its teaching through legal, programmatic, and institutional parameters, it became clear that the deepest changes in the discipline did not originate in legislation or curricular restructuring, but in the sites where historical knowledge is produced and socialised.
During the period analysed, the two spheres of historiographical production that most influenced the approaches, periodisations, and concepts of time, history, and education in the History of Education discipline of the pedagogy course studied were the publisher responsible for publishing and disseminating the History of Education manuals of the Atualidades Pedagógicas collection (1951-1979) and the graduate programmes in Education and History (1980-2000). Between 1951 and the late 1970s, the manuals shaped the organisation and planning of the teaching of History of Education, with a philosophical approach centred on the history of pedagogical ideas and on analyses of what philosophers and educators thought about education, together with their place in European philosophical doctrines. From 1980 onwards, economic, political, and ideological approaches to educational historical contexts came to predominate in the teaching programmes of History of Education I and II, and remained in force until the mid-1990s. In History of Education I, the approach is marked by analyses of the context of production and the organisation of social classes; History of Education II, until about 1995, deals with Brazilian education from an approach grounded in Dependency Theory. After 1995, the documents consulted begin to show other marks suggesting an approach oriented to the political and social dimension, addressing the History of Brazilian Education through social movements and their respective educational projects.