954 results for batch equilibrium
Abstract:
János Kornai's DRSE theory (Kornai, 2014) follows the ex post model philosophy, which radically rejects the ex ante set of conditions laid down by the dominant neoclassical school and the stringent limits of equilibrium, and defines its own premises for the functioning of the capitalist economy. In other words, the DRSE theory represents a markedly novel trend among the various schools of economics. The theory is still only a verbal model, with the following supporting pillars as the immanent features of the capitalist system: dynamism, rivalry and the surplus economy (the English name of the theory uses the initial letters of Dynamism, Rivalry and Surplus Economy). The dominance of the surplus economy, that is, oversupply, is accompanied by monopolistic competition, uncertainty over the volume of demand, Schumpeterian innovation, dynamism, technological progress, creative destruction and increasing returns to scale, with rivalry between producers and service providers for markets. This paper aims to examine whether the DRSE theory can be formulated as a formal mathematical model. We have chosen a special route to do this: first we explore the unrealistic ex ante assumptions of general equilibrium theory (Walras, 1874; Neumann, 1945), and then we establish some of the possible connections between the premises of DRSE, which include the crucial condition that, just as in biological evolution, there is no fixed steady state in the evolutionary processes of the market economy, not even as a point of reference. General equilibrium theory and DRSE theory are compared through the lens of Schumpeterian evolutionary economics.
Abstract:
A new axiomatization of the Nash equilibrium correspondence for n-person games based on independence of irrelevant strategies is given. Using a flexible general model, it is proved that the Nash equilibrium correspondence is the only solution to satisfy the axioms of non-emptiness, weak one-person rationality, independence of irrelevant strategies and converse independence of irrelevant strategies on the class of subgames of a fixed finite n-person game which admit at least one Nash equilibrium. It is also shown that these axioms are logically independent.
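The defining property underlying these axioms — that no player can gain by a unilateral deviation — can be checked mechanically for finite games. Below is a minimal Python sketch of that check; the payoff tables and the function name are illustrative, not from the paper:

```python
def is_nash(payoffs, profile):
    """Check whether a pure-strategy profile is a Nash equilibrium.

    payoffs[i] is a dict mapping each full strategy profile (a tuple)
    to player i's payoff; profile is the tuple of chosen strategies.
    """
    n = len(profile)
    for i in range(n):
        strategies = {p[i] for p in payoffs[i]}  # player i's strategy set
        current = payoffs[i][profile]
        for s in strategies:
            deviation = profile[:i] + (s,) + profile[i + 1:]
            if payoffs[i][deviation] > current:
                return False  # player i profits from deviating
    return True

# Prisoner's dilemma: strategies C (cooperate), D (defect).
pd = [
    {('C', 'C'): -1, ('C', 'D'): -3, ('D', 'C'):  0, ('D', 'D'): -2},  # row
    {('C', 'C'): -1, ('C', 'D'):  0, ('D', 'C'): -3, ('D', 'D'): -2},  # column
]
print(is_nash(pd, ('D', 'D')))  # mutual defection is the unique NE → True
print(is_nash(pd, ('C', 'C')))  # → False
```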
Abstract:
Prediction of arsenic transport and transformation in the soil environment requires understanding the transport mechanisms and properly estimating arsenic partitioning among all three phases in soil/aquifer systems: mobile colloids, mobile soil solution, and immobile soil solids. The primary purpose of this research is to study natural dissolved organic matter (DOM)/colloid-facilitated transport of arsenic and to understand the role of soil-derived carriers in the transport and transformation of both inorganic and organoarsenicals in soils.

DOM/colloid-facilitated arsenic transport and transformation in porous soil media were investigated using a set of experimental approaches: batch experiments, equilibrium membrane dialysis experiments and column experiments. Soil batch experiments were used to investigate arsenic adsorption on a variety of soils with different characteristics; equilibrium membrane dialysis was employed to determine the 'free' and 'colloid-bound/complexed' arsenic in water extracts of chosen soils; and column experiments were set up in the laboratory to simulate arsenic transport and transformation through golf course soils in the presence and absence of soil-derived dissolved substances.

The experimental results revealed that organic matter amendments effectively reduced soil arsenic adsorption. The majority of arsenic present in the soil extracts was associated with small substances of molecular weight (MW) between 500 and 3,500 Da; only a small fraction was associated with higher-MW substances (MW > 3,500 Da), operationally defined as the colloidal fraction in this study. The association of arsenic and DOM in the soil extracts strongly affected arsenic bioavailability, transport and transformation in soils.
The results of the column experiments revealed complicated arsenic behavior, with various processes occurring in the soils studied, including soil arsenic adsorption, arsenic transport facilitated by dissolved substances present in soil extracts, and microbially mediated transformation of arsenic species.

Soil organic matter amendments effectively reduce soil arsenic adsorption capability, either by scavenging soil arsenic adsorption sites or through interactions between arsenic species and dissolved organic chemicals in the soil solution. Close attention must be paid to arsenic transport facilitated by dissolved substances present in the soil solution, and to microbially mediated transformation of arsenic species, in arsenic-contaminated soils.
Abstract:
This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all the PCBs in the batch does not exceed the chamber capacity. PCBs from different production lines arrive dynamically to a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and constitute a bottleneck; consequently, the makespan has to be minimized.

A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, linear constraints and run time. A procedure to compute a lower bound is proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to the optimum.

The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, especially to solve problem instances that require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches based on solution quality and run time.

The decomposition approach improved the lower bounds (i.e. the linear relaxation solution) of the mixed-integer formulation.
At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report a solution close to the optimum. GRASP outperforms SA, at a higher computational cost. The proposed approaches are viable to implement, as the run time is very short.
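The batching rules described above (a batch's processing time is its longest job, its ready time is its last arrival, and total job size is capped by chamber capacity) can be illustrated with a toy first-fit heuristic. This is a hedged sketch in the spirit of the simpler heuristics, not a reconstruction of any of the paper's algorithms:

```python
def batch_makespan(jobs, capacity, machines):
    """Greedy sketch: first-fit batching in arrival order, then assign
    each batch to the machine on which it can start earliest.

    jobs: list of (ready_time, proc_time, size) tuples.
    """
    batches = []  # each batch is [total_size, ready_time, proc_time]
    for r, p, s in sorted(jobs):            # consider jobs in arrival order
        for b in batches:                   # first fit by remaining capacity
            if b[0] + s <= capacity:
                b[0] += s
                b[1] = max(b[1], r)         # batch ready = last arrival
                b[2] = max(b[2], p)         # batch proc = longest job
                break
        else:
            batches.append([s, r, p])
    free = [0.0] * machines                 # machine available times
    makespan = 0.0
    for size, ready, proc in batches:
        m = min(range(machines), key=lambda i: max(free[i], ready))
        free[m] = max(free[m], ready) + proc
        makespan = max(makespan, free[m])
    return makespan

# Three PCBs (ready, proc, size), chamber capacity 4, one chamber:
jobs = [(0, 4, 2), (0, 3, 2), (1, 5, 3)]
cmax = batch_makespan(jobs, capacity=4, machines=1)  # → 9.0
```

The first two jobs fit in one batch (processing time 4), the third forms its own batch (processing time 5), giving a makespan of 9.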
Abstract:
A job shop with one batch processing machine and several discrete machines is analyzed. Given a set of jobs, their process routes, processing requirements, and sizes, the objective is to schedule the jobs such that the makespan is minimized. The batch processing machine can process a batch of jobs as long as the machine capacity is not violated. The batch processing time is equal to the longest processing time among the jobs in the batch. The problem under study can be represented as Jm:batch:Cmax. If no batches were formed, the scheduling problem reduces to the classical job shop scheduling problem (i.e. Jm::Cmax), which is known to be NP-hard. This research extends the scheduling literature by combining Jm::Cmax with batch processing. The primary contributions are the mathematical formulation, a new network representation and several solution approaches. The problem under study is widely observed in metal working and other industries, but has received limited attention due to its complexity. A novel network representation of the problem using disjunctive and conjunctive arcs, and a mathematical formulation, are proposed to minimize the makespan. In addition, several algorithms, such as batch forming heuristics, dispatching rules, a Modified Shifting Bottleneck procedure, Tabu Search (TS) and Simulated Annealing (SA), were developed and implemented. An experimental study was conducted to evaluate the proposed heuristics, and the results were compared to those from a commercial solver (CPLEX). TS and SA, with MWKR-FF as the initial solution, gave the best solutions among all the heuristics proposed. Their results were close to CPLEX, and for some larger instances, with total operations greater than 225, they were competitive in terms of solution quality and run time. For some larger problem instances, CPLEX was unable to report a feasible solution even after running for several hours.
Between SA and TS, the experimental study indicated that SA produced a better average Cmax across all instances. The solution approaches proposed will help practitioners schedule a job shop (with both discrete and batch processing machines) more efficiently. The proposed solution approaches are easy to implement and require short run times to solve large problem instances.
Abstract:
This research studies the hybrid flow shop problem with parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted as FF:batch1,sj:Cmax and is formulated as a mixed-integer linear program. The commercial solver AMPL/CPLEX is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution even for a small problem; a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computational time problem encountered when using the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic. It decomposes the entire problem into three sub-problems and schedules them one by one. The heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. An experimental design is used to evaluate the effectiveness of the proposed BFD heuristic, which is further compared against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all these algorithms. To evaluate the quality of the heuristic solutions, a procedure is developed to calculate a lower bound on the makespan for the problem under study. The lower bound obtained is tighter than other bounds developed for related problems in the literature.
A meta-search approach based on the genetic algorithm concept is developed to evaluate the significance of further improving the solution obtained from the proposed BFD heuristic. The experiments indicate that it reduces the makespan by 1.93% on average, within negligible time, when the problem size is fewer than 50 jobs.
Abstract:
The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment, with an adjustment of travel time to reflect delays encountered in the associated iteration. The iterative link time adjustment is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes while the input capacities are given in hourly volumes, it is necessary to convert the hourly capacities to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS. The ratio is computed as the highest hourly volume of a day divided by the corresponding total daily volume.

While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion level associated with each roadway and is believed to be one of the culprits of traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method.

The assignment results based on constant and variable CONFACs were then compared against the ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different, and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident.
It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
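The CONFAC conversion and the BPR volume-delay computation described above can be sketched as follows. The coefficients 0.15 and 4 are the standard BPR defaults, and all example numbers are illustrative, not taken from the study:

```python
def bpr_travel_time(free_flow_time, daily_volume, hourly_capacity, confac,
                    alpha=0.15, beta=4.0):
    """Congested link travel time via the BPR volume-delay function.

    Daily volumes are compared against a daily-equivalent capacity,
    obtained by dividing hourly capacity by CONFAC (the peak-hour to
    daily volume ratio). alpha and beta are the standard BPR defaults.
    """
    daily_capacity = hourly_capacity / confac
    vc_ratio = daily_volume / daily_capacity
    return free_flow_time * (1.0 + alpha * vc_ratio ** beta)

# Illustrative link: 10-minute free-flow time, 18,000 veh/day,
# 1,800 veh/h capacity, CONFAC = 0.10 (peak hour carries 10% of daily traffic).
t = bpr_travel_time(10.0, 18000, 1800, 0.10)  # v/c = 1.0, t ≈ 11.5 minutes
```

A smaller CONFAC yields a larger daily-equivalent capacity and thus a lower v/c ratio, which is why tying CONFAC to congestion level changes the assigned link times.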
Abstract:
Diazotrophic (N2-fixing) cyanobacteria provide the biological source of new nitrogen for large parts of the ocean. However, little is known about their sensitivity to global change. Here we show that the single most important nitrogen fixer in today's ocean, Trichodesmium, is strongly affected by changes in CO2 concentrations. Cell division rate doubled with rising CO2 (glacial to projected year 2100 levels) prompting lower carbon, nitrogen and phosphorus cellular contents, and reduced cell dimensions. N2 fixation rates per unit of phosphorus utilization as well as C:P and N:P ratios more than doubled at high CO2, with no change in C:N ratios. This could enhance the productivity of N-limited oligotrophic oceans, drive some of these areas into P limitation, and increase biological carbon sequestration in the ocean. The observed CO2 sensitivity of Trichodesmium could thereby provide a strong negative feedback to atmospheric CO2 increase.
Abstract:
The need to preserve the environment has led to the search for new materials for the efficient disposal of chemical compounds that alter the stability of our natural resources. First among these resources stands water, a precious and scarce commodity that must be properly used and reused. Accordingly, the World Health Organization has established maximum permissible values in drinking water of 50 mg/L, 0.1 mg/L and 0.5 mg/L for NO3-, NO2- and NH4+, respectively. For these reasons, this work assesses new materials and water treatment processes aimed at the removal of these compounds, such as alumina, either in powder form or as a support for a catalytic system using inorganic membranes capable of withstanding more severe temperature and pressure conditions, opening new possibilities for membrane reactor applications; and also electrochemical treatments with boron-doped diamond (BDD) electrodes as anode and copper as cathode. For this purpose, nitrate adsorption was studied at different times to determine the time required to reach equilibrium, employing three commercial aluminas (acidic, basic and neutral), with subsequent treatment of the acidic alumina only, impregnating it with metals (Pd-Cu/Al2O3) for the catalytic reaction. The materials were previously characterized by XRD, SEM and BET surface area analysis. The aluminas presented considerable nitrate adsorptive capacity in the first thirty minutes, equivalent to 50% removal, reaching equilibrium in that time. After treatment, using the alumina as catalyst in a batch reactor (Pd-Cu/Al2O3), the results were more favourable, totalling a 64% reduction of the NO3- ion at the end of three hours. On the other hand, the results for the catalytic reaction using the Pd-Cu/TiO2 catalytic support in the membrane reactor proved to be poor; the conditions of the catalytic system must therefore be improved to optimize the process.
As for the electrochemical tests, using BDD electrodes as anode and Cu as cathode, there was a fairly significant nitrate reduction, approximately 80% ion removal over three hours, at a viable application cost.
Abstract:
The growing interest in and applications of biotechnology products have increased the development of new processes for the recovery and purification of proteins. Expanded bed adsorption (EBA) has emerged as a promising technique for this purpose. It combines into one operation the steps of clarification, concentration and purification of the target molecule, thereby reducing the time and cost of operation. In this context, the aim of this thesis was to evaluate the recovery and purification of the 503 antigen of Leishmania i. chagasi expressed in E. coli M15, and endotoxin removal, by EBA. In the first step of this study, batch experiments were carried out using two experimental designs to define the optimal adsorption and elution conditions of the 503 antigen on Streamline chelating resin. For the expanded-bed adsorption assays, a column of 2.6 cm in diameter by 30.0 cm in height, coupled to a peristaltic pump, was used. In the second step, the removal of endotoxin during the antigen recovery process was evaluated by employing the non-ionic surfactant Triton X-114 in the EBA washing step. In the third step, we sought to develop a mathematical model able to predict the 503 antigen breakthrough curves in expanded mode. The experimental design results for adsorption showed pH 8.0 and a NaCl concentration of 2.4 M as the optimum adsorption condition. In the second design, the only significant factor for elution was the concentration of imidazole, which was set at 600 mM. The adsorption isotherm of the 503 antigen showed a good fit to the Langmuir model (R = 0.98), and the estimated values of qmax (maximum adsorption capacity) and Kd (equilibrium constant) were 1.95 mg/g and 0.34 mg/mL, respectively. Purification tests directly from unclarified feedstock showed a recovery of 59.2% of the target protein and a purification factor of 6.0.
The addition of the non-ionic surfactant Triton X-114 to the EBA washing step led to high levels (> 99%) of removal of the LPS initially present in the samples, for all conditions tested. The mathematical model obtained to describe the 503 antigen breakthrough curves on Streamline chelating resin in expanded mode showed a good fit in both the parameter estimation and validation steps. The validated model was used to optimize efficiencies, achieving maximum process and column efficiencies of 89.2% and 75.9%, respectively. Therefore, EBA is an efficient alternative for the recovery of the target protein and the removal of endotoxin from an E. coli unclarified feedstock in just one step.
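The fitted Langmuir isotherm can be evaluated directly from the reported parameters. The sketch below uses the qmax and Kd values quoted above; the function name is illustrative:

```python
def langmuir(c, q_max=1.95, k_d=0.34):
    """Langmuir isotherm: adsorbed amount q (mg/g resin) at free
    concentration c (mg/mL).

    Defaults are the values fitted for the 503 antigen on Streamline
    chelating resin reported above (qmax = 1.95 mg/g, Kd = 0.34 mg/mL).
    """
    return q_max * c / (k_d + c)

# At c = Kd the resin is exactly half saturated:
half = langmuir(0.34)  # → 0.975 mg/g, i.e. qmax / 2
```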
Abstract:
Vodyanitskii mud volcano is located at a depth of about 2070 m in the Sorokin Trough, Black Sea. It is a 500-m-wide and 20-m-high cone surrounded by a depression, which is typical of many mud volcanoes in the Black Sea. 75 kHz sidescan sonar data show different generations of mud flows that include mud breccia, authigenic carbonates, and gas hydrates, which were sampled by gravity coring. The fluids that flow through or erupt with the mud are enriched in chloride (up to 650 mmol L**-1 at 150 cm sediment depth), suggesting a deep source similar to the fluids of the nearby Dvurechenskii mud volcano. Direct observation with the remotely operated vehicle Quest revealed gas bubbles emanating at two distinct sites at the crest of the mud volcano, confirming earlier observations of bubble-induced hydroacoustic anomalies in echosounder records. The sediments at the main bubble emission site show a thermal anomaly, with temperatures at 60 cm sediment depth 0.9 °C warmer than the bottom water. Chemical and isotopic analyses of the emanated gas revealed that it consisted primarily of methane (99.8%) and was of microbial origin (δD-CH4 = -170.8 per mil (SMOW), δ13C-CH4 = -61.0 per mil (V-PDB), δ13C-C2H6 = -44.0 per mil (V-PDB)). The gas flux was estimated using the video observations of the ROV. Assuming that the flux is constant with time, about 0.9 ± 0.5 x 10**6 mol of methane is released every year. This value is of the same order of magnitude as reported fluxes of dissolved methane released with pore water at other mud volcanoes. This suggests that bubble emanation is a significant pathway transporting methane from the sediments into the water column.
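The order of magnitude of such a flux estimate can be reproduced with a back-of-the-envelope calculation: moles of gas per bubble from the ideal gas law at in-situ hydrostatic pressure, times the observed bubble rate, times seconds per year. All numbers below are illustrative assumptions, not the study's measured values:

```python
import math

def annual_methane_mol(bubbles_per_s, radius_m, depth_m,
                       temp_k=282.0, rho=1030.0, g=9.81):
    """Order-of-magnitude methane flux estimate from bubble observations.

    Moles per bubble follow from the ideal gas law n = PV/RT at
    hydrostatic pressure; seawater density, temperature and bubble size
    here are assumed, illustrative values.
    """
    pressure = 101325.0 + rho * g * depth_m          # in-situ pressure, Pa
    volume = 4.0 / 3.0 * math.pi * radius_m ** 3     # bubble volume, m^3
    r_gas = 8.314                                    # J mol^-1 K^-1
    mol_per_bubble = pressure * volume / (r_gas * temp_k)
    return bubbles_per_s * mol_per_bubble * 365.25 * 24 * 3600

# e.g. ~50 bubbles/s of 2.5 mm radius at 2070 m water depth
estimate = annual_methane_mol(50, 2.5e-3, 2070)  # on the order of 10**6 mol/yr
```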
Abstract:
We study the fluctuation-dissipation relations for a three-dimensional Ising spin glass in a magnetic field, both in the high-temperature phase and in the low-temperature one. In the region of times simulated, we find that our results support a picture of the low-temperature phase with broken replica symmetry, although a droplet behavior cannot be completely excluded.
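The model class studied here — a 3D Edwards-Anderson Ising spin glass in an external field — is typically simulated with Metropolis dynamics, on top of which fluctuation-dissipation measurements are built. A minimal sketch with illustrative lattice size and parameters (not the authors' code):

```python
import math
import random

def metropolis_sweep(spins, bonds, h, beta, L, rng):
    """One Metropolis sweep of a 3D +/-J Ising spin glass in field h.

    spins maps lattice sites (x, y, z) to ±1; bonds maps unordered
    nearest-neighbor pairs to random couplings J = ±1.
    """
    for site in spins:
        x, y, z = site
        local = h  # field plus six nearest-neighbor couplings (periodic)
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nb = ((x + dx) % L, (y + dy) % L, (z + dz) % L)
            local += bonds[frozenset((site, nb))] * spins[nb]
        d_e = 2.0 * spins[site] * local  # energy cost of flipping this spin
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            spins[site] = -spins[site]

rng = random.Random(0)
L = 4
sites = [(x, y, z) for x in range(L) for y in range(L) for z in range(L)]
spins = {s: rng.choice((-1, 1)) for s in sites}
bonds = {}
for (x, y, z) in sites:  # one random ±1 coupling per lattice edge
    for dx, dy, dz in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
        nb = ((x + dx) % L, (y + dy) % L, (z + dz) % L)
        bonds[frozenset(((x, y, z), nb))] = rng.choice((-1, 1))
for _ in range(10):
    metropolis_sweep(spins, bonds, h=0.5, beta=1.0, L=L, rng=rng)
```

Correlation and response functions measured along such trajectories are what enter the fluctuation-dissipation relations discussed in the abstract.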
Abstract:
The impact of alkyl chain length on the esterification of C2–C16 organic acids with C1–C4 alcohols has been systematically investigated over bulk and SBA-15 supported sulfated zirconias (SZs). Rates of catalytic esterification of methanol with acetic acid are directly proportional to the sulfur content for both SZ and SZ/SBA-15, with the high dispersion of SZ achievable in conformal coatings over mesoporous SBA-15 conferring significant rate enhancements. Esterification over the most active 0.24 mmol gcat−1 bulk SZ and 0.29 mmol gcat−1 SZ/SBA-15 materials was inversely proportional to the alkyl chain length of the alcohol and acid reactants, being most sensitive to changes from methanol to ethanol and from acetic to hexanoic acid, respectively. Kinetic analyses reveal that these alkyl chain dependencies are in excellent accord with the Taft relationship for polar and steric effects in aliphatic systems and with the enthalpy of alcohol adsorption, implicating a Langmuir–Hinshelwood mechanism. The first continuous production of methyl propionate over a SZ fixed-bed is also demonstrated.
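The Taft relationship invoked above separates polar (σ*) and steric (Es) substituent effects on the rate. A small sketch with hypothetical parameter values, not fitted to the data in this study:

```python
def taft_log_rate_ratio(sigma_star, es, rho_star, delta):
    """Taft relation: log10(k/k0) = rho* · sigma* + delta · Es.

    sigma_star captures the polar effect and es the steric effect of a
    substituent; rho_star and delta are reaction-specific sensitivities.
    The example values below are hypothetical, for illustration only.
    """
    return rho_star * sigma_star + delta * es

# The reference substituent (methyl: sigma* = 0, Es = 0) gives log(k/k0) = 0
# by construction; bulkier substituents (negative Es) slow the reaction
# when delta > 0, consistent with the chain-length trend described above.
print(taft_log_rate_ratio(0.0, 0.0, rho_star=1.8, delta=0.7))  # → 0.0
```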