930 results for Memory Management (Computer science)
Abstract:
In the late seventies, Megiddo proposed a way to use an algorithm for the problem of minimizing a linear function a(0) + a(1)x(1) + ... + a(n)x(n) subject to certain constraints to solve the problem of minimizing a rational function of the form (a(0) + a(1)x(1) + ... + a(n)x(n))/(b(0) + b(1)x(1) + ... + b(n)x(n)) subject to the same set of constraints, assuming that the denominator is always positive. Using a rather strong assumption, Hashizume et al. extended Megiddo's result to include approximation algorithms. Their assumption essentially asks for the existence of good approximation algorithms for optimization problems with possibly negative coefficients in the (linear) objective function, which is rather unusual for most combinatorial problems. In this paper, we present an alternative extension of Megiddo's result for approximations that avoids this issue and applies to a large class of optimization problems. Specifically, we show that, if there is an alpha-approximation for the problem of minimizing a nonnegative linear function subject to constraints satisfying a certain increasing property, then there is an alpha-approximation (1/alpha-approximation) for the problem of minimizing (maximizing) a nonnegative rational function subject to the same constraints. Our framework applies to covering problems and network design problems, among others.
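As a rough illustration of the classical parameterization the abstract refers to (not the paper's own construction), the sketch below repeatedly minimizes the parameterized linear objective (a0 - lam*b0) + sum_i (a_i - lam*b_i) x_i for the current ratio value lam; note that these parameterized coefficients may become negative, which is exactly the difficulty the paper's alternative framework avoids. The oracle name `solve_linear`, the toy feasible set and the tolerance are assumptions for illustration.

```python
# Sketch of a Megiddo/Dinkelbach-style reduction: minimize the ratio
# (a0 + a.x)/(b0 + b.x), denominator positive, by calling a solver for
# linear objectives over the same feasible set.

def minimize_ratio(a0, a, b0, b, solve_linear, iters=30):
    """solve_linear(c0, c) is assumed to return a feasible x minimizing c0 + c.x."""
    x = solve_linear(a0, a)                      # any feasible starting solution
    for _ in range(iters):
        num = a0 + sum(ai * xi for ai, xi in zip(a, x))
        den = b0 + sum(bi * xi for bi, xi in zip(b, x))
        lam = num / den                          # current ratio value
        c0 = a0 - lam * b0
        c = [ai - lam * bi for ai, bi in zip(a, b)]
        y = solve_linear(c0, c)                  # minimize the parameterized objective
        if c0 + sum(ci * yi for ci, yi in zip(c, y)) >= -1e-12:
            return x                             # lam cannot be improved further
        x = y
    return x

# Toy usage: feasible set {(1,0), (0,1)}, so the "oracle" just picks the cheaper point.
def toy_oracle(c0, c):
    candidates = [(1.0, 0.0), (0.0, 1.0)]
    return min(candidates, key=lambda x: c0 + sum(ci * xi for ci, xi in zip(c, x)))

print(minimize_ratio(1.0, [3.0, 5.0], 2.0, [4.0, 1.0], toy_oracle))   # expected (1.0, 0.0)
```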
Abstract:
We present approximation algorithms for the three-dimensional strip packing problem and the three-dimensional bin packing problem. We consider orthogonal packings where 90-degree rotations are allowed. The algorithms we show for these problems have asymptotic performance bounds of 2.64 and 4.89, respectively. These algorithms are for the more general case in which the bounded dimensions of the bin given in the input are not necessarily equal (that is, we consider bins for which the length, the width and the height are not necessarily equal). Moreover, we show that these problems, in the general version, are as hard to approximate as the corresponding oriented version. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
The notion of knowledge artifact has rapidly gained popularity in the field of general knowledge management and, more recently, in knowledge-based systems. The main goal of this paper is to propose and discuss a methodology for the design and implementation of knowledge-based systems founded on knowledge artifacts. We advocate that systems built according to this methodology can be effective in conveying the flow of knowledge between different communities of practice. Our methodology has been developed from the ground up, i.e. we have built some concrete systems based on the abstract notion of knowledge artifact and synthesized our methodology based on reflections upon our experiences building these systems. In this paper, we also describe the most relevant systems we have built and how they have guided us to the synthesis of our proposed methodology. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
One of the key issues in e-learning environments is the possibility of creating and evaluating exercises. However, the lack of tools supporting the authoring and automatic checking of exercises for specific topics (e.g., geometry) drastically reduces the advantages of using e-learning environments on a larger scale, as usually happens in Brazil. This paper describes an algorithm, and a tool based on it, designed for the authoring and automatic checking of geometry exercises. The algorithm dynamically compares the distances between the geometric objects of the student's solution and the template's solution, provided by the author of the exercise. Each solution is a geometric construction which is considered a function receiving geometric objects (input) and returning other geometric objects (output). Thus, for a given problem, if we know one function (construction) that solves the problem, we can compare it to any other function to check whether they are equivalent or not. Two functions are equivalent if, and only if, they have the same output when the same input is applied. If the student's solution is equivalent to the template's solution, then we consider the student's solution a correct solution. Our software utility provides both authoring and checking tools that work directly on the Internet, together with learning management systems. These tools are implemented using the dynamic geometry software iGeom, which has been used in a geometry course since 2004 and has a successful track record in the classroom. Empowered with these new features, iGeom simplifies teachers' tasks, solves non-trivial problems in the checking of student solutions and helps to increase student motivation by providing feedback in real time. (c) 2008 Elsevier Ltd. All rights reserved.
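As a minimal sketch of the idea of treating constructions as functions and testing equivalence by comparing outputs on the same inputs, the example below checks two midpoint constructions on randomly sampled input points. The function names, tolerance and sampling strategy are illustrative assumptions, not iGeom's actual algorithm or API.

```python
import math
import random

def template_midpoint(a, b):
    """Author's construction: midpoint of segment AB."""
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

def student_midpoint(a, b):
    """Student's construction built differently (e.g. via circle intersections);
    simulated here by an equivalent formula."""
    return ((b[0] + a[0]) / 2.0, (b[1] + a[1]) / 2.0)

def equivalent(construction1, construction2, trials=20, tol=1e-9):
    """Treat each construction as a function and compare outputs on sampled inputs."""
    for _ in range(trials):
        a = (random.uniform(-10, 10), random.uniform(-10, 10))
        b = (random.uniform(-10, 10), random.uniform(-10, 10))
        p, q = construction1(a, b), construction2(a, b)
        if math.dist(p, q) > tol:      # outputs differ for this input: not equivalent
            return False
    return True                         # agreed on all sampled inputs

print(equivalent(template_midpoint, student_midpoint))   # expected: True
```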
Abstract:
Given a fixed set of identical or different-sized circular items, the problem we deal with consists of finding the smallest object within which the items can be packed. Circular, triangular, square, rectangular and also strip objects are considered. Moreover, 2D and 3D problems are treated. Twice-differentiable models for all these problems are presented. A strategy to reduce the complexity of evaluating the models is employed and, as a consequence, instances with a large number of items can be considered. Numerical experiments show the flexibility and reliability of the new unified approach. (C) 2007 Elsevier Ltd. All rights reserved.
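A minimal sketch, assuming scipy is available, of one member of this family of smooth models: packing circles of given radii into the smallest enclosing circle, with squared distances so the constraints stay twice differentiable. The variable layout, solver choice, radii and starting point are illustrative; the paper's actual models and complexity-reduction strategy are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

r = np.array([1.0, 1.0, 0.5])        # illustrative item radii
n = len(r)

def unpack(z):
    return z[:-1].reshape(n, 2), z[-1]          # circle centers, container radius R

cons = []
for i in range(n):
    # containment: ||c_i||^2 <= (R - r_i)^2   (R >= max r enforced by the bounds below)
    cons.append({'type': 'ineq',
                 'fun': lambda z, i=i: (unpack(z)[1] - r[i]) ** 2 - np.sum(unpack(z)[0][i] ** 2)})
    for j in range(i + 1, n):
        # non-overlap: ||c_i - c_j||^2 >= (r_i + r_j)^2
        cons.append({'type': 'ineq',
                     'fun': lambda z, i=i, j=j:
                         np.sum((unpack(z)[0][i] - unpack(z)[0][j]) ** 2) - (r[i] + r[j]) ** 2})

z0 = np.concatenate([np.random.uniform(-1.0, 1.0, 2 * n), [5.0]])     # random start, large R
bounds = [(None, None)] * (2 * n) + [(float(r.max()), None)]
res = minimize(lambda z: z[-1], z0, method='SLSQP', bounds=bounds, constraints=cons)
centers, R = unpack(res.x)
print("container radius:", R)
```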
Abstract:
In this note we discuss the convergence of Newton's method for minimization. We present examples in which the Newton iterates satisfy the Wolfe conditions and the Hessian is positive definite at each step, and yet the iterates converge to a non-stationary point. These examples answer a question posed by Fletcher in his 1987 book Practical Methods of Optimization.
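To make concrete what it means for Newton iterates to satisfy the Wolfe conditions, here is a minimal numpy sketch of a Newton iteration in which each accepted step length satisfies the standard Wolfe conditions. The test function, the constants c1 and c2, and the backtracking scheme are illustrative assumptions; this is not one of the counterexamples constructed in the note (on this convex quadratic the iteration does converge to the minimizer).

```python
import numpy as np

H = np.diag([1.0, 10.0])                        # illustrative convex quadratic
def f(x):    return 0.5 * x @ H @ x
def grad(x): return H @ x
def hess(x): return H

def wolfe(x, p, alpha, c1=1e-4, c2=0.9):
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * grad(x) @ p
    curvature = grad(x + alpha * p) @ p >= c2 * grad(x) @ p
    return armijo and curvature

x = np.array([2.0, 1.0])
for _ in range(20):
    p = np.linalg.solve(hess(x), -grad(x))      # Newton direction (Hessian is SPD here)
    alpha = 1.0
    while not wolfe(x, p, alpha):               # crude step-length selection
        alpha *= 0.5
    x = x + alpha * p
    if np.linalg.norm(grad(x)) < 1e-10:
        break
print(x)                                        # converges to the stationary point (0, 0)
```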
Abstract:
Managing software maintenance is rarely a precise task due to uncertainties concerning resource and service descriptions. Even when a well-established maintenance process is followed, the risk of delaying tasks remains if the new services are not precisely described or if resources change during process execution. Also, the delay of a task at an early process stage may represent a different delay at the end of the process, depending on complexity or service reliability requirements. This paper presents a knowledge-based representation (Bayesian networks) of maintenance project delays, based on specialists' experience, and a corresponding tool to help manage software maintenance projects. (c) 2006 Elsevier Ltd. All rights reserved.
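As a toy illustration of the kind of Bayesian-network reasoning involved (the actual network structure, variables and probabilities are elicited from specialists and are not reproduced here), the sketch below computes the probability of a project delay from two hypothetical parent variables using made-up conditional probability tables.

```python
# Toy model: P(ProjectDelay | evidence) with two hypothetical parents.
# All variable names and numbers are illustrative assumptions.

p_imprecise = {True: 0.3, False: 0.7}            # P(ImpreciseSpec)
p_reschange = {True: 0.2, False: 0.8}            # P(ResourceChange)
p_delay = {                                      # P(ProjectDelay = True | parents)
    (True, True): 0.90, (True, False): 0.60,
    (False, True): 0.50, (False, False): 0.10,
}

def prob_delay(evidence=None):
    """Enumerate the parents, marginalizing those not fixed by the evidence."""
    evidence = evidence or {}
    num = den = 0.0
    for imp in (True, False):
        if 'ImpreciseSpec' in evidence and evidence['ImpreciseSpec'] != imp:
            continue
        for res in (True, False):
            if 'ResourceChange' in evidence and evidence['ResourceChange'] != res:
                continue
            w = p_imprecise[imp] * p_reschange[res]
            num += w * p_delay[(imp, res)]
            den += w
    return num / den

print(prob_delay())                                # prior probability of a delay
print(prob_delay({'ImpreciseSpec': True}))         # updated after observing imprecise specs
```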
Abstract:
Optimization methods that employ the classical Powell-Hestenes-Rockafellar augmented Lagrangian are useful tools for solving nonlinear programming problems. Their reputation decreased in the last 10 years due to the comparative success of interior-point Newtonian algorithms, which are asymptotically faster. In this research, a combination of both approaches is evaluated. The idea is to produce a competitive method that is more robust and efficient than its 'pure' counterparts for critical problems. Moreover, an additional hybrid algorithm is defined, in which the interior-point method is replaced by the Newtonian resolution of a Karush-Kuhn-Tucker (KKT) system identified by the augmented Lagrangian algorithm. The software used in this work is freely available through the Tango Project web page: http://www.ime.usp.br/~egbirgin/tango/.
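For readers unfamiliar with it, here is a minimal sketch of the classical Powell-Hestenes-Rockafellar augmented Lagrangian loop for inequality constraints g(x) <= 0, with scipy used for the inner subproblems. The test problem, parameter values, update rules and stopping tests are illustrative assumptions; the hybrid algorithms and the Tango implementation are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

#   L(x, lam, rho) = f(x) + (1/(2 rho)) * sum( max(0, lam + rho*g(x))^2 - lam^2 )
def f(x):  return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
def g(x):  return np.array([x[0] + x[1] - 2.0])          # constraint: x0 + x1 <= 2

def aug_lag(x, lam, rho):
    s = np.maximum(0.0, lam + rho * g(x))
    return f(x) + (np.sum(s ** 2) - np.sum(lam ** 2)) / (2.0 * rho)

x, lam, rho = np.zeros(2), np.zeros(1), 10.0
prev_infeas = np.inf
for _ in range(30):
    x = minimize(aug_lag, x, args=(lam, rho), method='BFGS').x    # inner subproblem
    infeas = np.maximum(0.0, g(x)).max()                          # constraint violation
    lam = np.maximum(0.0, lam + rho * g(x))                       # multiplier update
    if infeas > 0.5 * prev_infeas:                                # insufficient progress:
        rho *= 10.0                                               # increase the penalty
    prev_infeas = infeas
    if infeas < 1e-8 and np.linalg.norm(lam * g(x)) < 1e-8:
        break
print(x, lam)    # expected near x = (1.5, 0.5), the projection of (2, 1) onto x0 + x1 <= 2
```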
Abstract:
Two Augmented Lagrangian algorithms for solving KKT systems are introduced. The algorithms differ in the way in which penalty parameters are updated. Possibly infeasible accumulation points are characterized. It is proved that feasible limit points that satisfy the Constant Positive Linear Dependence constraint qualification are KKT solutions. Boundedness of the penalty parameters is proved under suitable assumptions. Numerical experiments are presented.
Abstract:
In this paper, we propose a new method for solving large-scale p-median problem instances based on real data. We compare different approaches in terms of runtime, memory footprint and quality of the solutions obtained. In order to test the different methods on real data, we introduce a new benchmark for the p-median problem based on real Swedish data. Because of the size of the problem addressed, up to 1938 candidate nodes, a number of algorithms, both exact and heuristic, are considered. We also propose an improved hybrid version of a genetic algorithm called impGA. Experiments show that impGA behaves as well as other methods for the standard set of medium-size problems taken from Beasley's benchmark, and produces comparatively good results in terms of quality, runtime and memory footprint on our specific benchmark based on real Swedish data.
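To fix the objective being solved, here is a minimal sketch of the p-median cost function and a simple interchange (vertex-substitution) improvement heuristic on a tiny illustrative distance matrix. The data, the value of p and the heuristic are assumptions for illustration; this is neither impGA nor any of the methods compared in the paper.

```python
import random

def pmedian_cost(dist, medians):
    """Sum, over all demand nodes, of the distance to the closest chosen median."""
    return sum(min(dist[i][m] for m in medians) for i in range(len(dist)))

def interchange(dist, p, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    medians = set(rng.sample(range(n), p))            # random initial set of medians
    improved = True
    while improved:
        improved = False
        for out in list(medians):                     # try swapping one median out...
            for inn in range(n):                      # ...for any non-median node
                if inn in medians:
                    continue
                cand = (medians - {out}) | {inn}
                if pmedian_cost(dist, cand) < pmedian_cost(dist, medians):
                    medians, improved = cand, True
    return medians, pmedian_cost(dist, medians)

# Tiny symmetric instance with 5 nodes and p = 2.
D = [[0, 2, 9, 4, 7],
     [2, 0, 6, 3, 8],
     [9, 6, 0, 5, 1],
     [4, 3, 5, 0, 6],
     [7, 8, 1, 6, 0]]
print(interchange(D, p=2))
```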
Abstract:
In e-Science experiments, it is vital to record the experimental process for later use, such as in interpreting results, verifying that the correct process took place or tracing where data came from. The process that led to some data is called the provenance of that data, and a provenance architecture is the software architecture for a system that will provide the necessary functionality to record, store and use process documentation. However, there has been little principled analysis of what is actually required of a provenance architecture, so it is impossible to determine the functionality such an architecture would ideally support. In this paper, we present use cases for a provenance architecture from current experiments in biology, chemistry, physics and computer science, and analyse the use cases to determine the technical requirements of a generic, technology- and application-independent architecture. We propose an architecture that meets these requirements and evaluate a preliminary implementation by attempting to realise two of the use cases.
Abstract:
The behaviours of autonomous agents may deviate from those deemed to be for the good of the societal systems of which they are a part. Norms have therefore been proposed as a means to regulate agent behaviours in open and dynamic systems, and may be encoded in electronic contracts in order to specify the obliged, permitted and prohibited behaviours of agents that are signatories to such contracts. Enactment and management of electronic contracts thus enable the use of regulatory mechanisms to ensure that agent behaviours comply with the encoded norms. Facilitating such mechanisms requires monitoring in order to detect and explain violations of norms. In this paper we propose a framework for monitoring that is to be implemented and integrated into a suite of contract enactment and management tools. The framework adopts a non-intrusive approach to monitoring, whereby the states of a contract with respect to its contained norms can be inferred on the basis of the messages exchanged. Specifically, the framework deploys agents that observe messages sent between contract signatories, where these messages correspond to agent behaviours and therefore indicate whether norms are being, or are in danger of being, violated.
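A toy sketch of the non-intrusive monitoring idea: an observer updates the state of an obligation purely from the messages it sees exchanged between signatories. The message format, norm encoding, state names and thresholds are assumptions for illustration, not the framework's actual design.

```python
from dataclasses import dataclass

@dataclass
class Obligation:
    debtor: str              # agent obliged to act
    action: str              # expected message content, e.g. "deliver_goods"
    deadline: int            # time step by which the action must be observed
    state: str = "active"    # active -> fulfilled | at_risk | violated

class Monitor:
    def __init__(self, obligations):
        self.obligations = obligations

    def observe(self, time, sender, content):
        """Called for every message exchanged between contract signatories."""
        for ob in self.obligations:
            if ob.state in ("fulfilled", "violated"):
                continue
            if sender == ob.debtor and content == ob.action:
                ob.state = "fulfilled"
            elif time > ob.deadline:
                ob.state = "violated"
            elif ob.deadline - time <= 1:
                ob.state = "at_risk"             # in danger of being violated

obs = [Obligation("supplier", "deliver_goods", deadline=5)]
mon = Monitor(obs)
mon.observe(time=4, sender="buyer", content="send_payment")
print(obs[0].state)   # "at_risk": deadline is close and no delivery observed yet
mon.observe(time=5, sender="supplier", content="deliver_goods")
print(obs[0].state)   # "fulfilled"
```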
Abstract:
This work describes the process of producing educational technologies and Information and Communication Technologies (ICTs), relating them to their use in teaching Geotechnical Engineering. An information system was developed based on a model that integrates educational resources produced in different electronic formats with a Web-based software application for managing these resources and the information of the Foundations course. The software, named ENGEO, helps structure the knowledge involved in the application domain, providing the teacher, or the management team, with remote tools for administering the content and the educational resources. For students, the ENGEO application makes information and content available in a flexible environment. This application was developed for the Geotechnical Engineering course, tailored to Foundation Engineering, with the goals of introducing and encouraging the use of Web-based systems for sharing course content, implementing Information and Communication Technologies in applications for educational purposes, and fostering discussion about the identity and profile of the engineer and the role of ICTs in the education of the engineering professional.
Abstract:
Who was the cowboy in Washington? What is the land of sushi? Most people would have answers to these questions readily available, yet modern search engines, arguably the epitome of technology in finding answers to most questions, are completely unable to do so. It seems that people capture only a few information items to rapidly converge to a seemingly 'obvious' solution. We will study approaches to this problem, with two additional hard demands that constrain the space of possible theories: the sought model must be both psychologically and neuroscientifically plausible. Building on top of the mathematical model of memory called Sparse Distributed Memory, we will see how some well-known methods in cryptography can point toward a promising, comprehensive solution that preserves four crucial properties of human psychology.
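For readers unfamiliar with Sparse Distributed Memory, here is a minimal, self-contained sketch of its basic read/write mechanics: random binary hard locations, activation of all locations within a Hamming radius of the address, counter accumulation on write and a majority vote on read. The word length, number of locations, radius and example data are illustrative assumptions, and the cryptographic extension discussed in the text is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, RADIUS = 256, 2000, 112                   # word length, hard locations, activation radius

addresses = rng.integers(0, 2, size=(M, N))     # fixed random hard-location addresses
counters = np.zeros((M, N), dtype=int)          # one counter per location per bit

def activated(addr):
    """All hard locations within Hamming distance RADIUS of the address."""
    return np.count_nonzero(addresses != addr, axis=1) <= RADIUS

def write(addr, data):
    counters[activated(addr)] += 2 * data - 1   # +1 for a 1 bit, -1 for a 0 bit

def read(addr):
    return (counters[activated(addr)].sum(axis=0) > 0).astype(int)   # majority vote

word = rng.integers(0, 2, N)
write(word, word)                               # store the word autoassociatively
noisy = word.copy()
flip = rng.choice(N, 20, replace=False)
noisy[flip] ^= 1                                # corrupt 20 of the 256 bits
print(np.count_nonzero(read(noisy) != word))    # typically 0: the stored word is recovered
```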