918 results for Software testing. Problem-oriented programming. Teaching methodology
Abstract:
An important production planning and scheduling problem in the paper industry couples multiple-machine scheduling with cutting stock. On the machine scheduling side, the question is how to determine the quantities of large rolls of paper of different types to produce. These rolls are then cut to meet the demand for smaller items. A schedule that minimizes setup and production costs may yield rolls that increase waste in the cutting process; conversely, the number of rolls that is best from the point of view of minimizing waste may lead to high setup costs. In this paper, a coupled model and heuristic methods are proposed, and computational experiments are presented.
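As a rough, hedged sketch of the trade-off described in this abstract (not the authors' actual formulation), the coupled problem can be written with lot-sizing variables feeding cutting-stock variables:

\begin{align*}
\min\ & \sum_{t}\sum_{m}\bigl(s_{m}\,y_{mt} + c_{m}\,x_{mt}\bigr) \;+\; \sum_{p} c^{\mathrm{waste}}_{p}\,z_{p} \\
\text{s.t. } & \sum_{t} x_{mt} \;\ge\; \sum_{p} a_{pm}\,z_{p} \quad\text{for every roll type } m, \\
 & \sum_{p} b_{pi}\,z_{p} \;\ge\; d_{i} \quad\text{for every item } i, \\
 & x_{mt} \le M\,y_{mt},\qquad x_{mt},\,z_{p}\ge 0,\qquad y_{mt}\in\{0,1\},
\end{align*}

where $x_{mt}$ rolls of type $m$ are produced in period $t$ at unit cost $c_m$ with setup cost $s_m$ (indicated by $y_{mt}$), pattern $p$ is used $z_p$ times, consumes $a_{pm}$ rolls of type $m$, yields $b_{pi}$ copies of item $i$ and has trim-loss cost $c^{\mathrm{waste}}_p$, and $M$ is a sufficiently large constant. Minimizing only the first sum can force wasteful cutting patterns, while minimizing only the second can force many setups.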
Abstract:
Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, because security services have to cooperate and their configurations have to be consistent with each other, so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports a convenient definition of an abstract high-level security policy and provides an automated derivation of the desired configuration files. It is an extension of policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system, and the model, into units which more closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool MoBaSeC (Model-Based-Service-Configuration) supports interactive graphical modelling, automated model analysis and policy refinement with the derivation of configuration files. We describe the MBM and AS approaches, outline the tool functions and exemplify their application and the results obtained. Copyright (C) 2010 John Wiley & Sons, Ltd.
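A deliberately simplified illustration of the policy-refinement idea described above (an abstract rule over abstract subsystems pushed down onto concrete components); the class names, fields and output format are invented for illustration and are not the MoBaSeC data model or API:

from dataclasses import dataclass, field

@dataclass
class AbstractSubsystem:
    name: str
    components: list = field(default_factory=list)   # e.g. firewalls, hosts, services

@dataclass
class Policy:
    source: str          # abstract subsystem allowed to access...
    target: str          # ...this abstract subsystem
    service: str         # e.g. "smtp"

def refine(policy, subsystems):
    """Derive one configuration stanza per component of the target subsystem."""
    target = next(s for s in subsystems if s.name == policy.target)
    return [
        {"component": c, "rule": f"permit {policy.service} from {policy.source}"}
        for c in target.components
    ]

subsystems = [
    AbstractSubsystem("marketing", ["host-mkt-01"]),
    AbstractSubsystem("mail", ["fw-dmz", "mta-01"]),
]
for stanza in refine(Policy("marketing", "mail", "smtp"), subsystems):
    print(stanza)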
Abstract:
The aim of task scheduling is to minimize the makespan of applications, exploiting shared resources in the best possible way. Applications have requirements that call for customized environments for their execution. One way to provide such environments is to use virtualization on demand. This paper presents two schedulers based on integer linear programming which schedule virtual machines (VMs) on grid resources and tasks on these VMs. The schedulers differ from previous work by jointly scheduling tasks and VMs and by considering the impact of the available bandwidth on the quality of the schedule. Experiments show the efficacy of the schedulers in scenarios with different network configurations.
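A minimal, illustrative sketch of the kind of joint task-and-VM integer linear program described above, written with the open-source PuLP library; the variable layout, the makespan formulation and the bandwidth-dependent transfer delay are assumptions made for the example, not the authors' model:

import pulp

tasks, vms, hosts = ["t1", "t2", "t3"], ["vm1", "vm2"], ["h1", "h2"]
exec_time = {"t1": 4.0, "t2": 6.0, "t3": 3.0}        # runtimes (assumed identical on all VMs)
input_mbit = {"t1": 200.0, "t2": 50.0, "t3": 100.0}  # input data to transfer per task
bandwidth = {"h1": 100.0, "h2": 10.0}                # Mbit/s towards each host (illustrative)

prob = pulp.LpProblem("joint_task_and_vm_scheduling", pulp.LpMinimize)
x = pulp.LpVariable.dicts("task_on_vm", (tasks, vms), cat="Binary")
y = pulp.LpVariable.dicts("vm_on_host", (vms, hosts), cat="Binary")
z = pulp.LpVariable.dicts("task_on_vm_on_host", (tasks, vms, hosts), cat="Binary")
makespan = pulp.LpVariable("makespan", lowBound=0)
prob += makespan                                     # objective: finish time of the busiest VM

for t in tasks:
    prob += pulp.lpSum(x[t][v] for v in vms) == 1    # each task on exactly one VM
for v in vms:
    prob += pulp.lpSum(y[v][h] for h in hosts) == 1  # each VM on exactly one host

# z[t][v][h] = 1 exactly when task t runs on VM v and VM v sits on host h.
for t in tasks:
    for v in vms:
        for h in hosts:
            prob += z[t][v][h] <= x[t][v]
            prob += z[t][v][h] <= y[v][h]
            prob += z[t][v][h] >= x[t][v] + y[v][h] - 1

# VM load = execution time plus bandwidth-dependent transfer delay of its tasks.
for v in vms:
    prob += pulp.lpSum(
        z[t][v][h] * (exec_time[t] + input_mbit[t] / bandwidth[h])
        for t in tasks for h in hosts
    ) <= makespan

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan:", makespan.value())
for t in tasks:
    for v in vms:
        if x[t][v].value() > 0.5:
            print(t, "->", v)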
Abstract:
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration $k$ the method requires the $\varepsilon_k$-global minimization of the Augmented Lagrangian with simple constraints, where $\varepsilon_k \to \varepsilon$. Global convergence to an $\varepsilon$-global minimizer of the original problem is proved. The subproblems are solved using the $\alpha$BB method. Numerical experiments are presented.
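For orientation, the standard (PHR-type) Augmented Lagrangian that such methods minimize at each outer iteration can be written as follows; the exact form used in the paper may differ in details:

$$
L_{\rho}(x,\lambda,\mu) \;=\; f(x) \;+\; \frac{\rho}{2}\left[\,\sum_{i}\Bigl(h_i(x)+\tfrac{\lambda_i}{\rho}\Bigr)^{2} \;+\; \sum_{j}\max\Bigl\{0,\;g_j(x)+\tfrac{\mu_j}{\rho}\Bigr\}^{2}\right],
$$

where $f$ is the objective and $h_i(x)=0$, $g_j(x)\le 0$ are the relaxed constraints. At outer iteration $k$ one seeks $x^k$ in the simple (e.g. box) constraint set $\Omega$ with $L_{\rho_k}(x^k,\lambda^k,\mu^k) \le L_{\rho_k}(x,\lambda^k,\mu^k) + \varepsilon_k$ for all $x\in\Omega$, i.e. an $\varepsilon_k$-global minimizer of the subproblem, here computed with the $\alpha$BB method.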
Abstract:
In the late seventies, Megiddo proposed a way to use an algorithm for the problem of minimizing a linear function $a_0 + a_1 x_1 + \cdots + a_n x_n$ subject to certain constraints to solve the problem of minimizing a rational function of the form $(a_0 + a_1 x_1 + \cdots + a_n x_n)/(b_0 + b_1 x_1 + \cdots + b_n x_n)$ subject to the same set of constraints, assuming that the denominator is always positive. Using a rather strong assumption, Hashizume et al. extended Megiddo's result to include approximation algorithms. Their assumption essentially asks for the existence of good approximation algorithms for optimization problems with possibly negative coefficients in the (linear) objective function, which is rather unusual for most combinatorial problems. In this paper, we present an alternative extension of Megiddo's result for approximations that avoids this issue and applies to a large class of optimization problems. Specifically, we show that, if there is an $\alpha$-approximation for the problem of minimizing a nonnegative linear function subject to constraints satisfying a certain increasing property, then there is an $\alpha$-approximation ($(1 - 1/\alpha)$-approximation) for the problem of minimizing (maximizing) a nonnegative rational function subject to the same constraints. Our framework applies to covering problems and network design problems, among others.
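The parametric idea behind Megiddo's technique (stated here in a standard form, not quoted from the paper) reduces the rational objective to linear subproblems:

$$\min_{x\in\mathcal{F}}\ \frac{a_0+\sum_i a_i x_i}{b_0+\sum_i b_i x_i}\ \le\ \lambda \quad\Longleftrightarrow\quad \min_{x\in\mathcal{F}}\ \Bigl[(a_0-\lambda b_0)+\sum_i (a_i-\lambda b_i)\,x_i\Bigr]\ \le\ 0,$$

since the denominator is positive; an algorithm for the linear problem can therefore decide on which side of the optimum a candidate value $\lambda$ lies. Note that the parametrized coefficients $a_i-\lambda b_i$ may be negative even when all $a_i, b_i$ are nonnegative, which is the difficulty behind the assumption of Hashizume et al. that this paper's alternative extension avoids.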
Abstract:
We investigate several two-dimensional guillotine cutting stock problems and their variants in which orthogonal rotations are allowed. We first present two dynamic programming based algorithms for the Rectangular Knapsack (RK) problem and its variants in which the patterns must be staged. The first algorithm solves the recurrence formula proposed by Beasley; the second algorithm, for staged patterns, also uses a recurrence formula. We show that if the items are not too small compared to the dimensions of the bin, then these algorithms require polynomial time. Using these algorithms we solved all instances of the RK problem found in the OR-LIBRARY, including one for which no optimal solution was previously known. We also consider the Two-dimensional Cutting Stock problem. We present a column generation based algorithm for this problem that uses the first algorithm mentioned above to generate the columns. We propose two strategies to tackle the residual instances. We also investigate a variant of this problem where the bins have different sizes. Finally, we study the Two-dimensional Strip Packing problem. We also present a column generation based algorithm for this problem that uses the second algorithm mentioned above, in which staged patterns are imposed. In this case we solve instances for two-, three- and four-staged patterns. We report on computational experiments with the various algorithms we propose in this paper. The results indicate that these algorithms are suitable for solving real-world instances. We give a detailed description (a pseudo-code) of all the algorithms presented here, so that the reader may easily implement them. (c) 2007 Elsevier B.V. All rights reserved.
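A compact sketch of the classical dynamic programming recurrence for the unstaged, unbounded two-dimensional guillotine knapsack, in the spirit of the Beasley-style recurrence mentioned above; rotations, staged patterns and the discretization refinements of the paper are omitted, and the item data are made up:

from functools import lru_cache

# Items: (width, height, value); the bin is W x H. Guillotine cuts only,
# unbounded copies of each item, no rotation. Illustrative data.
items = [(2, 3, 4), (3, 3, 7), (1, 4, 2)]
W, H = 7, 5

@lru_cache(maxsize=None)
def best(w, h):
    # Best value obtainable from a w x h rectangle: either place the single
    # most valuable item that fits, or split the rectangle with a guillotine cut.
    value = max([v for (iw, ih, v) in items if iw <= w and ih <= h], default=0)
    for cut in range(1, w // 2 + 1):            # vertical guillotine cuts
        value = max(value, best(cut, h) + best(w - cut, h))
    for cut in range(1, h // 2 + 1):            # horizontal guillotine cuts
        value = max(value, best(w, cut) + best(w, h - cut))
    return value

print(best(W, H))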
Abstract:
The synthesis and self-assembly of tetragonal-phase-containing L1₀-Fe₅₅Pt₄₅ nanorods with a high coercive field is described. The experimental procedure resulted in a tetragonal/cubic phase ratio close to 1:1 for the as-synthesized nanoparticles. Using different surfactant/solvent proportions in the process allowed control of particle morphology from nanospheres to nanowires. Monodisperse nanorods with lengths of 60 ± 5 nm and diameters of 2-3 nm were self-assembled in a perpendicularly oriented array onto a substrate surface using hexadecylamine as an organic spacer. The magnetic alignment and properties, assigned respectively to the shape anisotropy and the tetragonal phase, suggest that the self-assembled materials are a strong candidate to solve the problem of random magnetic alignment observed in FePt nanospheres, leading to applications in ultrahigh magnetic recording (UHMR) systems capable of achieving a performance of the order of terabits/in².
Abstract:
Today many system development projects exceed both their budgets and their time plans. This is often due to defects in the information systems that could have been prevented. The cost of testing can in some cases be as high as 50% of a project's total cost, and testing is at the same time an important part of development. Testing has shifted its focus from the software itself and its faults to a wider perspective on whole information system infrastructures, where assuring good quality is important. Sogeti in the Netherlands has developed a test method called TMap (Test Management approach) that can be used for structured testing of information systems. TMap has not been used as much as desired at the office in Borlänge. Because Microsoft is releasing a new version of its platform Visual Studio Team System (VSTS 2010), some colleagues at Sogeti in the Netherlands are developing a template that can support the use of TMap in VSTS 2010. At the time of writing, the template is still under development. The goal for Sogeti was to find out the differences between the test functionality in VSTS 2008 and 2010. The purpose of this thesis, to analyze the test process in VSTS 2008 with TMap against the test process in VSTS 2010 together with the template, helped us achieve that goal. The analysis was done from four different aspects: the TPI and TMMi models, problem and strength analyses, and a few research questions. The TPI and TMMi models were used to analyze and evaluate the test process. The analysis showed that there were differences between the two test processes. VSTS 2010 together with the template gave better support for using TMap and performing tests. In VSTS 2010 the test tool Camano is connected to TFS, and the tool is also intended to make the execution and logging of tests easier. This leads to a test process that is easier to handle and has better support for TMap.
Abstract:
In this project, two broad facets of the design of a methodology for performance optimization of indexable carbide inserts were examined: physical destructive testing and software simulation. For the physical testing, statistical research techniques were used in the design of the methodology. A five-step method, beginning with problem definition and continuing through system identification, statistical model formation, data collection, and statistical analysis and results, was elaborated upon in depth. The set-up and execution of an experiment with a compression machine, together with roadblocks to quality data collection and possible solutions to curb them, were examined. The 2^k factorial design was illustrated and recommended for process improvement. Instances of first-order and second-order response surface analyses were encountered. In the case of curvature, a test for curvature significance with center-point analysis was recommended. Process optimization with the method of steepest ascent and central composite designs, or process robustness studies based on response surface analyses, were also recommended. For the simulation test, the AdvantEdge program was identified as the most widely used software for tool development. Challenges to the efficient application of this software were identified and possible solutions proposed. In conclusion, both software simulation and physical testing were recommended to meet the objective of the project.
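A small sketch of how a 2^k full factorial design matrix can be generated and a first-order (main effects) model fitted by least squares, as recommended above; the factor names and response values are invented for illustration and are not project data:

import itertools
import numpy as np

# 2^k full factorial design in coded -1 / +1 units (here k = 3 illustrative factors).
factors = ["feed", "speed", "depth"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# One made-up response measurement per run of the 2^3 = 8 runs.
response = np.array([54.0, 61.0, 52.0, 60.0, 57.0, 66.0, 55.0, 65.0])

# Fit y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares (first-order model).
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

print("intercept:", round(coef[0], 2))
for name, b in zip(factors, coef[1:]):
    # In coded units the main effect equals twice the regression coefficient.
    print(f"main effect of {name}: {round(2 * b, 2)}")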
Abstract:
This thesis project is part of the overall automation of the production of the concentrating solar PV/T system Absolicon X10. ABSOLICON Solar Concentrator AB has invented and started production of the promising concentrating solar system Absolicon X10. The aims of this thesis project are designing, assembling, calibrating and putting into operation an automatic measurement system intended to evaluate the shape of the concentrating parabolic reflectors. On the basis of the requirements of the company administration and the needs of the real production process, the operating conditions for the Laser-testing rig were formulated, and the basic concept of using laser radiation was defined. In the first step, the overall design of the whole system was made and its division into parts was defined. After preliminary simulations, the function and operating conditions of all the parts were formulated. In the next steps, the detailed design of all the parts was carried out. Most components were ordered from the respective companies, and some of the mechanical components were made in the workshop of the company. All parts of the Laser-testing rig were assembled and tested. The software that controls the Laser-testing rig was created in LabVIEW. To tune and test the software, a special simulator was designed and assembled. When all parts had been assembled into the complete system, the Laser-testing rig was tested, calibrated and tuned. In the workshop of Absolicon AB, trial measurements were conducted, and the Laser-testing rig was installed in the production line at the plant in Soleftea.
Abstract:
The main objective of this thesis work is to develop a communication link between Runrev Revolution (IDE) and JADE (Multi-Agent System) through socket programming over TCP/IP. These two independent platforms are connected using a socket programming technique. Socket programming is a newly emerging technique for connecting these two platforms, and the work done in this thesis is considered a prototype. A graphical simulation model was developed by salixphere (a company in Hedemora) to simulate logistic problems using Runrev Revolution (IDE). The simulation software/program is called “BIOSIM”. The logistic problems are complex, and conventional optimization techniques are unlikely to be very successful. “BIOSIM” can demonstrate a graphical representation of logistic problems depending upon the problem domain. As this simulation model is developed in the Revolution programming language (Transcript), which is a dynamically typed and English-like language, it is quite slow compared to other high-level programming languages. The objective of this thesis work is to add intelligent behaviour to graphical objects and to develop the communication link between Runrev Revolution (IDE) and JADE (Multi-Agent System) over TCP/IP. The tests show the intelligent behaviour of the graphical objects and successful communication between Runrev Revolution (IDE) and JADE (Multi-Agent System).
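The thesis connects Transcript and Java/JADE; the following is only a language-neutral illustration of the TCP socket pattern involved, written in Python, with a made-up address, port and message format rather than the thesis's actual protocol:

import socket
import threading
import time

HOST, PORT = "127.0.0.1", 5555      # illustrative address and port, not the thesis's setup

def agent_side():
    """Stands in for the JADE agent platform: accept one connection and reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode("utf-8")
            conn.sendall(("agent received: " + request).encode("utf-8"))

def simulation_side():
    """Stands in for the BIOSIM/Revolution side: send a message, print the answer."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"move truck 7 to gate 2")
        print(cli.recv(1024).decode("utf-8"))

server = threading.Thread(target=agent_side)
server.start()
time.sleep(0.2)                     # crude wait so the listener is up before connecting
simulation_side()
server.join()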
Abstract:
Bergkvist insjön AB is a sawmill yard capable of producing 350,000 cubic meters of timber every year, which requires a lot of internal resources. Sawmill operations can be classified as unloading, sorting, storage and production of timber. Trucks arrive at the company at random; they have to be unloaded and sent back as early as possible to avoid trucks queuing up, which creates a problem for the truck owners. The sawmill yard operates with two log stackers that perform several tasks, including transporting the logs from the trucks to the measurement station, where the logs are sorted into classes and dropped into pockets; from the pockets to the sorted timber yard, where they are stored; and finally from there to the sawmill for final processing. The main issue to be addressed here is the queue of trucks waiting to be unloaded, which creates a problem for both the sawmill and the truck owners; given the huge production volume, handling of resources is a top priority. A key challenge in the handling of resources is the unloading of trucks and finding a way to optimize internal resources. To address this problem, I experimented with different ways of using the internal resources and designed different cases. In case 1, both log stackers work on the sawmill and the measurement station; the main objective of this case is to keep the sawmill and the measurement station working all the time. In case 2, the work is divided between the two log stackers: one log stacker works on the sawmill and pocket_control, and the second works on the measurement station and the trucks. In case 3, only one log stacker works on all the agents; this case was designed to reduce the cost of production. As the experiment cannot be done in real time due to operational cost, simulation is used. A preliminary investigation of the simulation results suggested that case 2 is the best option, as it reduced the waiting time of trucks considerably compared with the other cases and showed a 50% improvement in the use of internal resources.
Abstract:
Many projects fail, and one of the reasons is poor project management in general and within the IT industry in particular. Based on criticism of the traditional methods in recent years, several lightweight methods known as Agile methods have emerged. Scrum is the best-known Agile method in use today. The method promises good results, but an article in the magazine Computer Sweden (February 2009) states that "figures show that nine out of ten Scrum projects fail". The article triggered our interest in finding out which Scrum-specific problems many have criticized, and we therefore chose to direct our study towards this. The thesis aims to investigate whether the local IT companies in Borlänge, Headlight and Sogeti, and the state-owned network capacity provider Trafikverket ICT suffer from the general problems that other Scrum users experience in connection with the use of the method. This thesis focuses on four problem areas: inadequate documentation, reduced efficiency in the work process, reduced efficiency in the work process in large projects, and insufficient support for evaluation. For our study, literature studies and interviews were conducted. Interview series were carried out with eleven people at our case companies. The target group for our interviews was Product Owners (PO), ScrumMasters (SM) and developers. After completing the study, we can conclude that the general problems experienced by other Scrum users could also be identified at our case companies. The results have been confirmed by the collected data and our theoretical framework. In the discussion we present recommendations for avoiding the related problems with Scrum.
Abstract:
Dynamic system test methods for heating systems were developed and applied by the institutes SERC and SP from Sweden, INES from France and SPF from Switzerland even before the MacSheep project started. These test methods followed the same principle: a complete heating system, including heat generators, storage, control, etc., is installed on the test rig; the test rig software and hardware simulates and emulates the heat load for space heating and domestic hot water of a single-family house, while the unit under test has to act autonomously to cover the heat demand during a representative test cycle. Within work package 2 of the MacSheep project these similar, but different, test methods were harmonized and improved. The work undertaken includes:
• Harmonization of the physical boundaries of the unit under test.
• Harmonization of the boundary conditions of climate and load.
• Definition of an approach to reach an identical space heat load in combination with an autonomous control of the space heat distribution by the unit under test.
• Derivation and validation of new six-day and twelve-day test profiles for direct extrapolation of test results.
The new harmonized test method combines the advantages of the different methods that existed before the MacSheep project. The new method is a benchmark test, which means that the load for space heating and domestic hot water preparation is identical for all tested systems, and that the result is representative of the performance of the system over a whole year. Thus, no modelling and simulation of the tested system is needed in order to obtain the benchmark results for a yearly cycle. The method is therefore also applicable to products for which simulation models are not yet available. Some of the advantages of the new whole-system test method and performance rating compared to the testing and energy rating of single components are:
• Interactions between the different components of a heating system, e.g. storage, solar collector circuit, heat pump and control, are included and evaluated in this test.
• Dynamic effects are included and influence the result just as they influence the annual performance in the field.
• Heat losses influence the results in a more realistic way, since they are evaluated under "real installed" and representative part-load conditions rather than under single-component steady-state conditions.
The described method is also suited for the development process of new systems, where it replaces time-consuming and costly field testing, with the advantage of a higher accuracy of the measured data (compared to the measurement equipment typically used in field tests) and identical, thus comparable, boundary conditions. The method can therefore be used for system optimization on the test bench under realistic operating conditions, i.e. under a relevant operating environment in the lab. This report describes the physical boundaries of the tested systems, as well as the test procedures and the requirements for both the unit under test and the test facility. The new six-day and twelve-day test profiles are also described, as are the validation results.
Abstract:
Regarding the location of a facility, the presumption in the widely used p-median model is that the customer opts for the shortest route to the nearest facility. However, this assumption is problematic on free markets, since the customer is presumed to gravitate to a facility according to both the distance to it and its attractiveness. The recently introduced gravity p-median model offers an extension of the p-median model that accounts for this. The model is therefore potentially interesting, although it had not yet been implemented and tested empirically. In this paper, we have implemented the model in an empirical problem of locating vehicle inspections, locksmiths, and retail stores of vehicle spare parts, for the purpose of investigating whether it is superior to the p-median model. We found, however, the gravity p-median model to be of limited use for the problem of locating facilities, as it either gives solutions similar to those of the p-median model or gives unstable solutions due to a non-concave objective function.
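For orientation, the gravity p-median objective replaces the nearest-facility assignment of the classical p-median with a Huff-type probabilistic allocation; a common way to write it (notation ours, and the exact decay function may differ from the paper's) is

$$\min_{S:\,|S|=p}\ \sum_{i} w_i \sum_{j\in S} d_{ij}\, P_{ij}, \qquad P_{ij} \;=\; \frac{A_j\, e^{-\lambda d_{ij}}}{\sum_{k\in S} A_k\, e^{-\lambda d_{ik}}},$$

where $w_i$ is the demand at point $i$, $d_{ij}$ the distance to candidate facility $j$, $A_j$ its attractiveness and $\lambda$ a distance-decay parameter; the classical p-median is recovered when each customer deterministically patronizes the nearest open facility.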