938 results for batch processing


Relevance:

100.00%

Publisher:

Abstract:

This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of the PCBs in the batch does not exceed the chamber capacity. PCBs from different production lines arrive dynamically to a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and constitute a bottleneck; consequently, the makespan has to be minimized.

A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, the number of linear constraints, and run time. A procedure to compute a lower bound is also proposed. For sparse problems (i.e., when job ready times are widely dispersed), the lower bounds are close to the optimum.

The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, especially to solve problem instances that require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches in terms of solution quality and run time.

The decomposition approach improved the lower bounds (i.e., the linear relaxation solution) of the mixed-integer formulation. At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report solutions close to the optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement in practice, as their run times are very short.
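The batching rules stated above (a batch's processing time is the longest PCB processing time, its ready time is the latest PCB arrival, and the total PCB size is limited by the chamber capacity) can be illustrated with a short sketch. The first-fit batching order and the PCB fields below are assumptions for illustration only; this is not one of the five heuristics proposed in the work.

```python
from dataclasses import dataclass

@dataclass
class PCB:
    size: float
    proc_time: float
    ready_time: float

def form_batches(pcbs, capacity):
    """Greedy first-fit batching in order of ready time (illustrative only)."""
    batches = []  # each batch is a list of PCBs
    for pcb in sorted(pcbs, key=lambda p: p.ready_time):
        for batch in batches:
            if sum(p.size for p in batch) + pcb.size <= capacity:
                batch.append(pcb)
                break
        else:
            batches.append([pcb])
    # Batch attributes follow directly from the definitions in the abstract.
    return [
        {
            "jobs": batch,
            "proc_time": max(p.proc_time for p in batch),    # longest PCB processing time
            "ready_time": max(p.ready_time for p in batch),  # last PCB to arrive
        }
        for batch in batches
    ]

# Example: the two small PCBs share a chamber of capacity 10; the large one gets its own batch.
example = [PCB(4, 3.0, 0.0), PCB(5, 7.0, 2.0), PCB(8, 5.0, 1.0)]
print(form_batches(example, capacity=10))
```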

Relevance:

100.00%

Publisher:

Abstract:

A job shop with one batch processing machine and several discrete machines is analyzed. Given a set of jobs, their process routes, processing requirements, and sizes, the objective is to schedule the jobs such that the makespan is minimized. The batch processing machine can process a batch of jobs as long as the machine capacity is not violated. The batch processing time is equal to the longest processing time among the jobs in the batch. The problem under study can be represented as Jm:batch:Cmax. If no batches were formed, it would reduce to the classical job shop scheduling problem (i.e., Jm::Cmax), which is known to be NP-hard. This research extends the scheduling literature by combining Jm::Cmax with batch processing. The primary contributions are the mathematical formulation, a new network representation, and several solution approaches. The problem under study is observed widely in metal working and other industries, but has received limited attention due to its complexity. A novel network representation of the problem using disjunctive and conjunctive arcs, and a mathematical formulation, are proposed to minimize the makespan. In addition, several algorithms, including batch-forming heuristics, dispatching rules, a Modified Shifting Bottleneck heuristic, Tabu Search (TS), and Simulated Annealing (SA), were developed and implemented. An experimental study was conducted to evaluate the proposed heuristics, and the results were compared to those from a commercial solver (CPLEX). TS and SA, using MWKR-FF to construct the initial solution, gave the best solutions among all the proposed heuristics. Their results were close to those of CPLEX, and for larger instances with more than 225 total operations they were competitive in terms of solution quality and run time. For some larger problem instances, CPLEX was unable to report a feasible solution even after running for several hours. Between TS and SA, the experimental study indicated that SA produced a better average Cmax across all instances. The proposed solution approaches will help practitioners schedule a job shop with both discrete and batch processing machines more efficiently; they are easy to implement and require short run times to solve large problem instances.
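As a rough illustration of the disjunctive-graph view underlying such formulations: once every machine ordering (and batch composition) is fixed, the makespan is the length of the longest path through the resulting precedence network. The sketch below computes that longest path over a small directed acyclic graph; the encoding is an assumption for illustration and is not the paper's exact network representation.

```python
from collections import defaultdict, deque

def makespan_longest_path(n_ops, arcs, durations):
    """arcs: (u, v) precedence pairs over operations 0..n_ops-1 (conjunctive route arcs
    plus a fixed disjunctive machine/batch ordering). durations[u] is the processing
    time of operation u; for a batched operation it would be the longest processing
    time in its batch."""
    succ = defaultdict(list)
    indeg = [0] * n_ops
    for u, v in arcs:
        succ[u].append(v)
        indeg[v] += 1
    earliest = [0.0] * n_ops                      # earliest start times
    queue = deque(i for i in range(n_ops) if indeg[i] == 0)
    while queue:
        u = queue.popleft()
        for v in succ[u]:
            earliest[v] = max(earliest[v], earliest[u] + durations[u])
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return max(earliest[u] + durations[u] for u in range(n_ops))

# Example: two jobs of two operations each; ops 1 and 2 share a machine (1 scheduled first).
print(makespan_longest_path(4, arcs=[(0, 1), (2, 3), (1, 2)], durations=[2, 4, 3, 1]))  # 10
```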

Relevance:

80.00%

Publisher:

Abstract:

This research studies the hybrid flow shop problem with parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted FF: batch1,sj: Cmax. It is formulated as a mixed-integer linear program, and the commercial solver AMPL/CPLEX is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution even for small problems; on average, a 6-job instance requires 2 hours. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computation-time problem encountered with the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic. It decomposes the entire problem into three sub-problems and schedules the sub-problems one by one. The proposed BFD heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. A designed experiment is conducted to evaluate the effectiveness of the proposed BFD heuristic, which is further compared against a set of common heuristics, including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all of these algorithms. To evaluate the quality of the heuristic solution, a procedure is developed to calculate a lower bound on the makespan for the problem under study. The lower bound obtained is tighter than other bounds developed for related problems in the literature. A meta-search approach based on the Genetic Algorithm concept is developed to evaluate the significance of further improving the solution obtained from the proposed BFD heuristic. The experiments indicate that it reduces the makespan by 1.93% on average, within negligible time, when the problem size is less than 50 jobs.
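A minimal skeleton of the bottleneck-first idea described above, assuming a workload-based bottleneck criterion and caller-supplied sub-problem solvers (both are assumptions for illustration; the actual BFD steps are formulating, prioritizing, solving, and re-scheduling the sub-problems):

```python
def bfd_schedule(stages, jobs, workload, solve_subproblem):
    """stages: sub-problem identifiers (e.g. upstream discrete stages, the parallel
    batch stage, downstream discrete stages). workload(stage) estimates how heavily
    loaded a stage is; solve_subproblem(stage, jobs, partial_schedule) schedules one
    stage while respecting arrival/delivery times implied by already scheduled stages."""
    # Steps 1-2: formulate and prioritize sub-problems, bottleneck (heaviest load) first.
    order = sorted(stages, key=workload, reverse=True)
    schedule = {}
    for stage in order:
        # Step 3: solve the sub-problems one by one.
        schedule[stage] = solve_subproblem(stage, jobs, schedule)
    # Step 4: a final re-scheduling pass (omitted here) would repair any
    # inconsistencies between the independently scheduled stages.
    return schedule
```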

Relevance:

70.00%

Publisher:

Abstract:

This presentation was made at the Connecticut State Library Service Center, Willimantic, CT, April 14, 2009. It focused on digital capture workflows for both archival and derivative image creation using accepted current standards. The tools used were inexpensive by choice and geared toward the needs of small to mid-sized cultural heritage institutions that wish to begin digital capture in their own facilities.

Relevance:

70.00%

Publisher:

Abstract:

One of the main challenges facing next generation Cloud platform services is the need to simultaneously achieve ease of programming, consistency, and high scalability. Big Data applications have so far focused on batch processing. The next step for Big Data is to move to the online world. This shift will raise the requirements for transactional guarantees. CumuloNimbo is a new EC-funded project led by Universidad Politécnica de Madrid (UPM) that addresses these issues via a highly scalable multi-tier transactional platform as a service (PaaS) that bridges the gap between OLTP and Big Data applications.

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

This work seeks to map out the possibilities for merging the bibliographic databases and patron registers of the Voyager library system in-house. The starting point has been the assumption that there are neither the access rights nor, for contractual reasons, the possibility to write data directly into the target system's database using a query language. By drawing on the system documentation and on the user community, I have sought to map out the possibilities for transferring all the data required by the full functionality. By exploiting the system's interfaces, cost savings and flexibility in scheduling the work can be achieved. For transferring bibliographic data in the Voyager library system, a program run as a batch job on the server can be used; this batch run can transfer both bibliographic records and holdings records. To write item records into the target database, a Visual Studio application that makes use of the cataloguing interface is used. For transferring patron data, a server-side batch job can be used, taking a fixed-length input file as its input. Loan data linked to the patron records can be transferred to the target database using the client application's offline circulation function.
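As an illustration of the fixed-length patron input file mentioned above, the sketch below pads each field to a fixed width before writing the batch-load file. The field names and widths are hypothetical placeholders, not the actual Voyager patron record layout, which must be taken from the system documentation.

```python
# Hypothetical field layout for illustration only -- NOT the real Voyager patron
# record format; consult the system documentation for the actual field widths.
HYPOTHETICAL_LAYOUT = [      # (field name, width in characters)
    ("patron_id", 10),
    ("last_name", 30),
    ("first_name", 20),
    ("barcode", 14),
    ("expiry_date", 8),      # e.g. YYYYMMDD
]

def to_fixed_width(record):
    """Pad or truncate every field to its fixed width and concatenate."""
    return "".join(str(record.get(name, "")).ljust(width)[:width]
                   for name, width in HYPOTHETICAL_LAYOUT)

def write_patron_file(records, path):
    """Write one fixed-length line per patron for the server-side batch load."""
    with open(path, "w", encoding="utf-8") as f:
        for rec in records:
            f.write(to_fixed_width(rec) + "\n")
```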

Relevance:

60.00%

Publisher:

Abstract:

This paper presents methods for moving object detection in airborne video surveillance. Motion segmentation in this scenario is usually difficult because of the small size of the objects, the motion of the camera, and inconsistency in the detected object shapes. Here we present a motion segmentation system for moving-camera video based on background subtraction. Adaptive background building is used so that the background model is derived from the most recent frames. The proposed system offers a CPU-efficient alternative to conventional batch-processing-based background subtraction systems. We further refine the segmented motion by mean-shift-based mode association.
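A minimal sketch of the adaptive background building and subtraction steps described above, assuming a simple exponential running average and a fixed difference threshold (both illustrative choices, not necessarily those of the paper):

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Exponential running average: the most recent frames dominate the model."""
    return (1.0 - alpha) * background + alpha * frame.astype(np.float32)

def motion_mask(background, frame, threshold=25.0):
    """Foreground (moving) pixels differ from the background by more than the threshold."""
    return np.abs(frame.astype(np.float32) - background) > threshold

# Per-frame usage (frames assumed already registered to compensate for camera motion):
# background = frames[0].astype(np.float32)
# for frame in frames[1:]:
#     mask = motion_mask(background, frame)
#     background = update_background(background, frame)
```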

Relevance:

60.00%

Publisher:

Abstract:

A recent area for investigation into the development of adaptable robot control is the use of living neuronal networks to control a mobile robot. The so-called Animat paradigm comprises a neuronal network (the ‘brain’) connected to an external embodiment (in this case a mobile robot), facilitating potentially robust, adaptable robot control and increased understanding of neural processes. Sensory input from the robot is provided to the neuronal network via stimulation on a number of electrodes embedded in a specialist Petri dish (Multi Electrode Array (MEA)); accurate control of this stimulation is vital. We present software tools allowing precise, near real-time control of electrical stimulation on MEAs, with fast switching between electrodes and the application of custom stimulus waveforms. These Linux-based tools are compatible with the widely used MEABench data acquisition system. Benefits include rapid stimulus modulation in response to neuronal activity (closed loop) and batch processing of stimulation protocols.

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we give an overview of our studies, by static and time-resolved X-ray diffraction, of inverse cubic phases and phase transitions in lipids. In Section 1, we briefly discuss the lyotropic phase behaviour of lipids, focusing attention on non-lamellar structures and their geometric/topological relationship to fusion processes in lipid membranes. Possible pathways for transitions between different cubic phases are also outlined. In Section 2, we discuss the effects of hydrostatic pressure on lipid membranes and lipid phase transitions, and describe how the parameters required to predict the pressure dependence of lipid phase transition temperatures can be conveniently measured. We review some earlier results on inverse bicontinuous cubic phases from our laboratory, showing effects such as pressure-induced formation and swelling. In Section 3, we describe the technique of pressure-jump synchrotron X-ray diffraction. We present results obtained from the lipid system 1:2 dilauroylphosphatidylcholine/lauric acid for cubic-inverse hexagonal, cubic-cubic and lamellar-cubic transitions. The rate of transition was found to increase with the amplitude of the pressure-jump and with increasing temperature. Evidence for intermediate structures occurring transiently during the transitions was also obtained. In Section 4, we describe an IDL-based 'AXCESS' software package being developed in our laboratory to permit batch processing and analysis of the large X-ray datasets produced by pressure-jump synchrotron experiments. In Section 5, we present some recent results on the fluid lamellar-Pn3m cubic phase transition of the single-chain lipid 1-monoelaidin, which we have studied both by pressure-jump and temperature-jump X-ray diffraction. Finally, in Section 6, we give a few indicators of future directions of this research. We anticipate that the most useful technical advance will be the development of pressure-jump apparatus on the microsecond time-scale, which will involve the use of a stack of piezoelectric pressure actuators. The pressure-jump technique is not restricted to lipid phase transitions, but can be used to study a wide range of soft matter transitions, ranging from protein unfolding and DNA unwinding transitions, to phase transitions in thermotropic liquid crystals, surfactants and block copolymers.
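The abstract does not state which relation is used to predict the pressure dependence of the transition temperature; for a first-order transition it is commonly obtained from the Clausius-Clapeyron relation, added here as a gloss:

```latex
% Clausius-Clapeyron slope of a first-order transition line (gloss, not stated in the abstract)
\frac{\mathrm{d}T_{t}}{\mathrm{d}p} = \frac{T_{t}\,\Delta V}{\Delta H}
```

where ΔV and ΔH are the volume and enthalpy changes at the transition, so measuring these two quantities (together with the transition temperature at ambient pressure) fixes the predicted slope of the transition line with pressure.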

Relevance:

60.00%

Publisher:

Abstract:

The acceleration of industrial growth in recent decades on all continents has led companies to address the impacts produced on the environment, spurred primarily by major disasters in the petroleum industry. In this context, produced water accounts for the largest volume of effluent from the production and extraction of oil and natural gas. This effluent contains critical components such as inorganic salts, heavy metals (Fe, Cu, Zn, Pb, Cd), oil, and chemicals added in the various production processes. In response, research has been carried out into alternative adsorbent materials for the treatment of water and produced water, aimed at removing oils, acids and heavy metals. Much of the research on diatomaceous earth (diatomite) in Brazil involves studies of its physico-chemical properties, mineral deposits, extraction, processing and applications. Officially estimated reserves are around 2.5 million tonnes, the main deposits being located in the states of Bahia (44%) and Rio Grande do Norte (37.4%). Moreover, these two states are large offshore producers, which gives them a prominent role in research on adsorbents such as diatomite for produced water treatment. The main applications of diatomite are as a filtration agent, an adsorbent for oils and greases, an industrial filler, and a thermal insulator. The objective of this work was the processing and characterization of diatomite obtained from the municipality of Macaíba-RN (known locally as tabatinga) as a low-cost, regenerable adsorbent for the removal of heavy metals in produced water treatment. A batch processing methodology, as practiced by small businesses located in the producing regions of Brazil, was adopted. Characterization was carried out by X-ray diffraction (XRD), scanning electron microscopy (SEM) and specific surface area analysis (BET). The results showed that the beneficiation process used was effective for small-volume production of concentrated diatomite. The diatomite obtained was treated by calcination at 900 °C for 2 hours, with and without Na2CO3 flux (4%), following optimal conditions reported in the literature. Column adsorption experiments were conducted by percolation through the natural, calcined, and flux-calcined diatomites. The feed was a saline solution containing Cu, Zn, Na, Ca and Mg ions, simulating the composition of produced waters in the state of Rio Grande do Norte, Brazil. The breakthrough curves for the simultaneous removal of copper and zinc ions gave removals of 84.3% for the calcined diatomite and 97.3% for the flux-calcined diatomite. The flux-calcined diatomite was more efficient both in permeability through the bed and in removal of copper and zinc ions. The natural diatomite showed poor permeability through the bed under the conditions tested, compared with the other diatomites. The results are promising for application in the petroleum industry.
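As an illustration of how overall removals such as the 84.3% and 97.3% figures can be derived from column breakthrough data, the sketch below integrates the effluent concentration over time (trapezoidal rule) and compares it with the mass fed; this is an assumed post-processing step, not necessarily the procedure used in the thesis.

```python
import numpy as np

def overall_removal(times, c_out, c_in):
    """times: sampling times; c_out: effluent concentrations at those times;
    c_in: constant feed concentration. Returns the percentage of the fed metal
    retained by the bed over the experiment (constant flow rate assumed)."""
    times = np.asarray(times, dtype=float)
    c_out = np.asarray(c_out, dtype=float)
    fed = c_in * (times[-1] - times[0])                                # proportional to mass fed
    escaped = np.sum(0.5 * (c_out[1:] + c_out[:-1]) * np.diff(times))  # trapezoidal rule
    return 100.0 * (1.0 - escaped / fed)

# Example: the later the effluent breaks through, the higher the overall removal.
print(overall_removal([0, 1, 2, 3, 4], [0.0, 0.0, 0.1, 0.5, 1.0], c_in=1.0))
```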

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

A mathematical model that describes the operation of a sequential leach bed process for anaerobic digestion of the organic fraction of municipal solid waste (MSW) is developed and validated. The model assumes that ultimate mineralisation of the organic component of the waste occurs in three steps, namely solubilisation of particulate matter, fermentation to volatile organic acids (modelled as acetic acid) along with liberation of carbon dioxide and hydrogen, and methanogenesis from acetate and hydrogen. The model incorporates the ionic equilibrium equations arising from dissolution of carbon dioxide, generation of alkalinity from the breakdown of solids, and dissociation of acetic acid. Rather than a charge balance, a mass balance on the hydronium and hydroxide ions is used to calculate pH. The flow of liquid through the bed is modelled as occurring through two zones: a permeable zone with high flushing rates, and a more stagnant zone. Some of the kinetic parameters for the biological processes were obtained from batch MSW digestion experiments. The parameters for the flow model were obtained from residence time distribution studies conducted using tritium as a tracer. The model was validated using data from leach bed digestion experiments in which a leachate volume equal to 10% of the fresh waste bed volume was sequenced. The model was then tested, without altering any kinetic or flow parameters, by varying the volume of leachate sequenced between the beds. Simulations for sequencing/recirculating 5 and 30% of the bed volume are presented and compared with experimental results.
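A minimal sketch of the three-step reaction structure described above (solubilisation of particulates, fermentation to acetate, methanogenesis), written with simple first-order and Monod rate expressions. The rate laws and parameter values are illustrative assumptions; the published model additionally tracks CO2/H2, ionic equilibria, pH, and the two-zone flow, which are omitted here.

```python
def reaction_rates(particulate, soluble, acetate,
                   k_hyd=0.1, k_ferm=0.5, r_max=0.3, K_s=0.15):
    """Returns d(particulate)/dt, d(soluble)/dt, d(acetate)/dt for the three steps."""
    r_hyd = k_hyd * particulate                    # solubilisation of particulate matter
    r_ferm = k_ferm * soluble                      # fermentation to acetic acid (+ CO2, H2)
    r_meth = r_max * acetate / (K_s + acetate)     # methanogenesis from acetate (Monod form)
    return -r_hyd, r_hyd - r_ferm, r_ferm - r_meth

# Simple explicit-Euler integration over one day in hourly steps (illustrative only).
P, S, A = 50.0, 0.0, 0.0
dt = 1.0 / 24.0
for _ in range(24):
    dP, dS, dA = reaction_rates(P, S, A)
    P, S, A = P + dt * dP, S + dt * dS, A + dt * dA
print(round(P, 2), round(S, 2), round(A, 2))
```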

Relevance:

60.00%

Publisher:

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes a greater dependence on the potential pollution sources, in contrast to the traditional approach where assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the same distribution as historically occurring events or an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and was automated using a DOS batch processing file. GIS software was employed in producing the input files and presenting the results. The functionalities of the method, as well as its sensitivities to the model grid sizes, contaminant loading rates, length of stress periods, and historical frequencies of occurrence of pollution events, were evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as input to the transport model. The results of applying the method are presented in the form of tables, graphs and spatial maps. Varying the model grid size has no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer and the associated volumetric numerical error appear to increase with increasing grid size. Also, the contaminant plume migrates faster with coarse grids than with finer grids. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases nonlinearly with the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and quantitatively presented as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a significant advantage over contemporary risk and vulnerability methods.
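The Monte Carlo logic described above can be sketched as follows: in each realisation a source term is placed at every active cell whose random draw falls below its probability of a pollution event, the transport model is run, and the risk is the fraction of realisations in which the user-defined concentration is exceeded at the monitoring point. The transport model is a caller-supplied function in this sketch; in the thesis it is the MODFLOW-2000/MT3DMS pair driven by a DOS batch file.

```python
import random

def monte_carlo_risk(cells, p_event, source_strength, run_transport,
                     threshold, n_realisations=1000):
    """cells: active model cells; p_event[cell]: probability that a pollution event
    occurs at that cell; source_strength(cell): synthetic source term magnitude;
    run_transport(sources): simulated concentration at the monitoring point;
    threshold: user-defined concentration magnitude."""
    exceedances = 0
    for _ in range(n_realisations):
        sources = {cell: source_strength(cell)
                   for cell in cells
                   if random.random() < p_event[cell]}
        if run_transport(sources) > threshold:
            exceedances += 1
    return exceedances / n_realisations   # fraction of realisations exceeding the threshold
```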