942 results for State Extension Problem
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic simply by incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
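As an illustration of the kind of parameter-light ILS loop described above, the sketch below combines a permutation-flowshop makespan evaluation, a biased-randomized constructive start, a small random-reinsertion perturbation, and a non-worsening acceptance rule. It is a minimal stand-in rather than the authors' ILS-ESP: the exponential-style bias, the perturbation size, and the acceptance rule are assumptions made only for this example.

```python
import random

def makespan(sequence, proc_times):
    """Completion time of the last job on the last machine (permutation flowshop)."""
    m = len(proc_times[0])
    completion = [0.0] * m
    for job in sequence:
        completion[0] += proc_times[job][0]
        for k in range(1, m):
            completion[k] = max(completion[k], completion[k - 1]) + proc_times[job][k]
    return completion[-1]

def biased_randomized_start(jobs, proc_times):
    """Constructive start: pick jobs with a bias toward the largest total processing
    time (a stand-in for biasing a classical PFSP heuristic; the exponential choice
    of index is an assumption, not the paper's rule)."""
    ranked = sorted(jobs, key=lambda j: -sum(proc_times[j]))
    sequence = []
    while ranked:
        idx = min(int(random.expovariate(0.25)), len(ranked) - 1)
        sequence.append(ranked.pop(idx))
    return sequence

def perturb(sequence):
    """Perturbation: remove two random jobs and reinsert them at random positions."""
    s = sequence[:]
    for _ in range(2):
        job = s.pop(random.randrange(len(s)))
        s.insert(random.randrange(len(s) + 1), job)
    return s

def ils(proc_times, iterations=1000):
    jobs = list(range(len(proc_times)))
    best = biased_randomized_start(jobs, proc_times)
    best_cost = makespan(best, proc_times)
    current, current_cost = best[:], best_cost
    for _ in range(iterations):
        candidate = perturb(current)
        cost = makespan(candidate, proc_times)
        # Simple acceptance rule: keep only non-worsening moves.
        if cost <= current_cost:
            current, current_cost = candidate, cost
            if cost < best_cost:
                best, best_cost = candidate[:], cost
    return best, best_cost
```

Running ils on a matrix proc_times[job][machine] returns a sequence and its makespan; re-seeding the biased start produces the alternative initial solutions of similar quality that the parallel runs mentioned above would exploit.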
Abstract:
Hemorrhage comprises a group of causes that affect women during pregnancy and the puerperal period and that, when care is inadequate, result in death. The authors aimed to analyze maternal deaths related to hemorrhage that occurred in the state of Santa Catarina, Brazil. The data were obtained from the Mortality Information System and the Live Births Information System of the Brazilian Ministry of Health. This was a descriptive study in which 491 maternal deaths that occurred in the period 1997-2010 were analyzed. Of these, 61 were related to hemorrhage, corresponding to 12.42%; postpartum hemorrhage was the most prevalent cause, with 26 deaths, followed by placental abruption with 15, together representing 67.21% of the cases. Maternal mortality from hemorrhage is a public health problem in the state of Santa Catarina, given its high prevalence and the fact that its underlying causes are preventable.
Abstract:
The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine that can handle only one job at a time, minimizing the maximum lateness. Each job becomes available for processing at its release date, requires a known processing time, and, after processing is finished, is delivered after a certain time. There may also be precedence constraints between pairs of jobs, requiring that the first job be completed before the second job can start. An extension of this problem consists in assigning a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
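For reference, the preemptive Longest Tail rule mentioned above is easy to state: at every instant, run the released, unfinished job with the largest delivery time (tail), preempting whenever a newly released job has a larger tail. The sketch below implements that rule for the single-machine problem with release dates and tails; the chain time-lag extension and the polynomial sequence of calls described in the abstract are not reproduced.

```python
import heapq

def preemptive_longest_tail(jobs):
    """Jobs given as (release, processing, tail). At every moment run the
    available job with the largest tail, preempting when a newly released job
    has a larger one. Returns max over jobs of (completion time + tail)."""
    events = sorted(jobs, key=lambda j: j[0])      # sort by release date
    ready = []                                     # max-heap on tail: (-tail, remaining, id)
    t, i, objective = 0, 0, 0
    n = len(events)
    while i < n or ready:
        if not ready:
            t = max(t, events[i][0])               # jump to the next release if idle
        while i < n and events[i][0] <= t:
            r, p, q = events[i]
            heapq.heappush(ready, (-q, p, i))
            i += 1
        neg_q, remaining, jid = heapq.heappop(ready)
        # Run until completion or until the next release, whichever comes first.
        run = remaining if i >= n else min(remaining, events[i][0] - t)
        t += run
        remaining -= run
        if remaining > 0:
            heapq.heappush(ready, (neg_q, remaining, jid))
        else:
            objective = max(objective, t + (-neg_q))
    return objective
```

For example, preemptive_longest_tail([(0, 3, 7), (2, 2, 10), (4, 1, 1)]) returns 14, which coincides with the lower bound r + p + q = 14 given by the second job, illustrating why the rule is useful for bounding purposes.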
Abstract:
The problems arising in commercial distribution are complex and involve several players and decision levels. One important decision is related to the design of the routes used to distribute the products in an efficient and inexpensive way. This article deals with a complex vehicle routing problem that can be seen as a new extension of the basic vehicle routing problem. The proposed model is a multi-objective combinatorial optimization problem that considers three objectives and multiple periods, which models real distribution problems more closely. The first objective is cost minimization, the second is balancing work levels, and the third is a marketing objective. An application of the model to a small example, with 5 clients and 3 days, is presented. The results of the model show the complexity of solving multi-objective combinatorial optimization problems and the conflict between the several distribution management objectives.
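To make the three objectives concrete, the sketch below scores a candidate multi-period routing plan on total cost and workload balance, which are standard formulations; the "marketing" term used here (the spread of visit days per client) is only a hypothetical placeholder, since the abstract does not define that objective.

```python
def evaluate(solution, dist):
    """Score a multi-period routing plan on three objectives.
    `solution` maps each day to a list of routes (each route a list of client
    ids; depot 0 is implied at both ends). `dist` is a distance matrix.
    The marketing term is an illustrative placeholder, not the paper's definition."""
    def route_length(route):
        stops = [0] + route + [0]
        return sum(dist[a][b] for a, b in zip(stops, stops[1:]))

    lengths = [route_length(r) for routes in solution.values() for r in routes]
    total_cost = sum(lengths)
    balance = max(lengths) - min(lengths)          # workload imbalance across routes
    visit_days = {}
    for day, routes in solution.items():
        for r in routes:
            for c in r:
                visit_days.setdefault(c, []).append(day)
    marketing = sum(max(d) - min(d) for d in visit_days.values())  # placeholder objective
    return total_cost, balance, marketing
```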
Abstract:
To recover a version of Barro's (1979) 'random walk' tax-smoothing outcome, we modify Lucas and Stokey's (1983) economy to permit only risk-free debt. This imparts near-unit-root behavior to government debt, independently of the government expenditure process, a realistic outcome in the spirit of Barro's. We show how the risk-free-debt-only economy confronts the Ramsey planner with additional constraints on equilibrium allocations that take the form of a sequence of measurability conditions. We solve the Ramsey problem by formulating it in terms of a Lagrangian and applying a Parameterized Expectations Algorithm to the associated first-order conditions. The first-order conditions and numerical impulse response functions partially affirm Barro's random walk outcome. Though the behaviors of tax rates, government surpluses, and government debts differ, allocations are very close for computed Ramsey policies across incomplete- and complete-markets economies.
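Since the solution method named above is the Parameterized Expectations Algorithm, a bare-bones version of it is sketched below on a textbook stochastic growth model rather than on the paper's Ramsey economy with risk-free debt. The conditional expectation in the Euler equation is parameterized as a log-linear function of the states, the economy is simulated, the realized values are regressed on the states, and the coefficients are updated with damping. All parameter values, the functional form, and the initial coefficient guess are illustrative, and convergence of this naive bootstrap is not guaranteed.

```python
import numpy as np

def pea_growth(T=3000, iters=100, damp=0.3, seed=0):
    """Minimal Parameterized Expectations Algorithm on a textbook stochastic
    growth model (illustration of the method only, not the paper's model)."""
    alpha, beta, delta, gamma, rho, sigma = 0.36, 0.95, 0.1, 2.0, 0.9, 0.02
    rng = np.random.default_rng(seed)
    logz = np.zeros(T)
    for t in range(1, T):
        logz[t] = rho * logz[t - 1] + sigma * rng.standard_normal()
    z = np.exp(logz)
    kss = ((1 / beta - 1 + delta) / alpha) ** (1 / (alpha - 1))
    css = kss ** alpha - delta * kss
    # E_t[u'(c')(alpha z' k'^(alpha-1) + 1 - delta)] approximated by
    # exp(b0 + b1*log k + b2*log z); rough initial guess for the coefficients.
    b = np.array([np.log(css ** (-gamma) / beta), -1.0, -0.5])
    for _ in range(iters):
        k = np.empty(T + 1)
        c = np.empty(T)
        k[0] = kss
        for t in range(T):
            psi = np.exp(b[0] + b[1] * np.log(k[t]) + b[2] * logz[t])
            c[t] = (beta * psi) ** (-1 / gamma)        # Euler equation: u'(c) = beta*psi
            k[t + 1] = max(z[t] * k[t] ** alpha + (1 - delta) * k[t] - c[t], 1e-6)
        # Realized value of the term inside the conditional expectation.
        m = c[1:] ** (-gamma) * (alpha * z[1:] * k[1:T] ** (alpha - 1) + 1 - delta)
        X = np.column_stack([np.ones(T - 1), np.log(k[:T - 1]), logz[:T - 1]])
        b_new, *_ = np.linalg.lstsq(X, np.log(m), rcond=None)
        b = (1 - damp) * b + damp * b_new              # damped fixed-point update
    return b
```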
Abstract:
Mining in the state of Minas Gerais, Brazil, is one of the activities with the strongest impact on the environment, in spite of its economic importance. Amongst mining activities, acid drainage poses a serious environmental problem due to its widespread occurrence in gold-extracting areas. It originates from metal-sulfide oxidation, which causes water acidification, increasing the risk of toxic element mobilization and water resource pollution. This research aimed to evaluate the acid drainage problem in Minas Gerais state. The study began with a bibliographic survey at FEAM (Environment Foundation of Minas Gerais State) to identify mining sites where sulfides occur. Substrate samples were collected from these sites to determine AP (acidity potential) and NP (neutralization potential). The AP was evaluated by the total sulfide content procedure and by hydrogen peroxide oxidation followed by acidity titration. The NP was evaluated by the calcium carbonate equivalent. Petrographic thin sections were also mounted and described, with special attention to sulfides and carbonates. Based on the chemical analyses, the acid-base accounting (ABA) was determined as the difference between AP and NP, and the acid drainage potential was obtained from the ABA value and the total volume of material at each site. The results allowed the identification of substrates with potential to generate acid drainage in Minas Gerais state. Altogether, these activities have the potential to produce between 3.1 and 10.4 billion m³ of water at pH 2, or 31.4 to 103.7 billion m³ of water at pH 3. This, in turn, would imply costs of US$7.8 to 25.9 million to neutralize the acidity with commercial limestone. These figures are probably underestimated because some mines were not surveyed and, in other cases, surface samples may not represent reality. A more reliable state-wide evaluation of the acid drainage potential would require further studies, including a larger number of samples. Such investigations should consider other mining operations beyond the scope of this study, as well as the kinetics of acid generation by simulated weathering procedures.
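The acid-base accounting referred to above can be illustrated with the conventional static-test arithmetic: complete oxidation of sulfide sulfur implies an acidity potential of about 31.25 kg of CaCO3 equivalent per tonne for each percent of sulfide S, and the net value is the difference between the neutralization potential and that acidity potential. The snippet below uses this Sobek-type convention as an assumption; the study's exact laboratory protocol may differ.

```python
def acid_base_accounting(sulfide_s_percent, np_kg_caco3_per_t):
    """Static acid-base accounting (Sobek-type convention, used here only as an
    illustration). AP is the acidity potential implied by complete oxidation of
    sulfide sulfur, in kg of CaCO3 equivalent per tonne of material."""
    ap = 31.25 * sulfide_s_percent        # kg CaCO3 eq./t per 1% sulfide S
    nnp = np_kg_caco3_per_t - ap          # net neutralization potential (NP - AP)
    return ap, nnp

# Example: 2.4% sulfide S and NP of 20 kg CaCO3/t gives AP = 75 and NNP = -55,
# i.e. a net acid-generating substrate under this convention.
print(acid_base_accounting(2.4, 20.0))
```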
Abstract:
Each year, traffic crashes in the United States result in nearly 300,000 deaths and serious injuries. In 2008 alone, traffic crashes cost the nation an estimated $230 billion, nine times more than the estimated cost of all crime. Highway crashes represent the leading cause of death and disabling injury for persons under age 35. In recognition of this problem, the Congress of the United States enacted national highway safety legislation in 1966, which led to the establishment of the National Highway Traffic Safety Administration (NHTSA). The legislation also provided for federal highway safety monies to be made available to the states with the goal of reducing death and injury on the nation's roads. Iowa has been very active in the federal-state-local highway safety partnership since the mid-1960s.
Abstract:
Report on a special investigation of the Eastern Iowa Center for Problem Gambling for the period May 1, 2007 through April 30, 2009
Abstract:
We deal with the hysteretic behavior of partial cycles in the two-phase region associated with the martensitic transformation of shape-memory alloys. We consider the problem from a thermodynamic point of view and adopt a local equilibrium formalism, based on the idea of thermoelastic balance, from which a state equation for the material follows formally in terms of its temperature T, the externally applied stress σ, and the transformed volume fraction x. To describe the striking memory properties exhibited by partial transformation cycles, the state variables (x, σ, T) corresponding to the current state of the system have to be supplemented with the values of (x, σ, T) at the points where the transformation control parameter (σ and/or T) reached a maximum or a minimum in the previous thermodynamic history of the system. We restrict our study to simple partial cycles resulting from a single maximum or minimum of the control parameter. Several common features displayed by such partial cycles, repeatedly observed in experiments, lead to a set of analytic restrictions, listed explicitly in the paper, to be verified by the dissipative term of the state equation, which is responsible for hysteresis. Finally, using calorimetric data of thermally induced partial cycles through the martensitic transformation in a Cu-Zn-Al alloy, we have fitted a functional form of the dissipative term consistent with the analytic restrictions mentioned above.
Abstract:
We present a very simple but fairly unknown method to obtain exact lower bounds to the ground-state energy of any Hamiltonian that can be partitioned into a sum of sub-Hamiltonians. The technique is applied, in particular, to the two-dimensional spin-1/2 antiferromagnetic Heisenberg model. Reasonably good results are easily obtained and the extension of the method to other systems is straightforward.
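The bound behind the method is worth stating explicitly. If the Hamiltonian is partitioned as a sum of sub-Hamiltonians, each term's expectation value in the full ground state is bounded from below by the smallest eigenvalue of that sub-Hamiltonian, which gives the lower bound (standard form of the argument, not necessarily the authors' notation):

```latex
\[
  H = \sum_i h_i
  \;\Longrightarrow\;
  E_0(H) \;=\; \langle \psi_0 | H | \psi_0 \rangle
  \;=\; \sum_i \langle \psi_0 | h_i | \psi_0 \rangle
  \;\ge\; \sum_i E_0(h_i),
\]
```

where |ψ0⟩ is the ground state of H and E_0(h_i) the lowest eigenvalue of the i-th sub-Hamiltonian.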
Abstract:
This paper applies probability and decision theory in the graphical interface of an influence diagram to study the formal requirements of rationality which justify the individualization of a person found through a database search. The decision-theoretic part of the analysis studies the parameters that a rational decision maker would use to individualize the selected person. The modeling part (in the form of an influence diagram) clarifies the relationships between this decision and the ingredients that make up the database search problem, i.e., the results of the database search and the different pairs of propositions describing whether an individual is at the source of the crime stain. These analyses evaluate the desirability associated with the decision of 'individualizing' (and 'not individualizing'). They point out that this decision is a function of (i) the probability that the individual in question is, in fact, at the source of the crime stain (i.e., the state of nature), and (ii) the decision maker's preferences among the possible consequences of the decision (i.e., the decision maker's loss function). We discuss the relevance and argumentative implications of these insights with respect to recent comments in specialized literature, which suggest points of view that are opposed to the results of our study.
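The decision rule that follows from the two ingredients identified above can be written compactly. With p the probability that the selected person is the source of the crime stain, l1 the loss of individualizing the wrong person, and l2 the loss of failing to individualize the true source (correct decisions carrying zero loss), the expected losses and the resulting threshold are (symbols are illustrative, not the paper's notation):

```latex
\[
  \mathrm{EL}(\text{individualize}) = (1-p)\,l_1,
  \qquad
  \mathrm{EL}(\text{not individualize}) = p\,l_2 ,
\]
\[
  (1-p)\,l_1 < p\,l_2
  \;\Longleftrightarrow\;
  p > \frac{l_1}{l_1 + l_2},
\]
```

so the rational choice to individualize depends jointly on the posterior probability of the source proposition and on the decision maker's relative losses.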
Abstract:
Brain fluctuations at rest are not random but are structured in spatial patterns of correlated activity across different brain areas. The question of how resting-state functional connectivity (FC) emerges from the brain's anatomical connections has motivated several experimental and computational studies to understand structure-function relationships. However, the mechanistic origin of resting state is obscured by large-scale models' complexity, and a close structure-function relation is still an open problem. Thus, a realistic but simple enough description of relevant brain dynamics is needed. Here, we derived a dynamic mean field model that consistently summarizes the realistic dynamics of a detailed spiking and conductance-based synaptic large-scale network, in which connectivity is constrained by diffusion imaging data from human subjects. The dynamic mean field approximates the ensemble dynamics, whose temporal evolution is dominated by the longest time scale of the system. With this reduction, we demonstrated that FC emerges as structured linear fluctuations around a stable low firing activity state close to destabilization. Moreover, the model can be further and crucially simplified into a set of motion equations for statistical moments, providing a direct analytical link between anatomical structure, neural network dynamics, and FC. Our study suggests that FC arises from noise propagation and dynamical slowing down of fluctuations in an anatomically constrained dynamical system. Altogether, the reduction from spiking models to statistical moments presented here provides a new framework to explicitly understand the building up of FC through neuronal dynamics underpinned by anatomical connections and to drive hypotheses in task-evoked studies and for clinical applications.
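A generic way to see how "structured linear fluctuations around a stable state" yield FC is the linear-noise calculation sketched below: given a stable, connectivity-constrained Jacobian and a noise covariance, the stationary covariance solves a Lyapunov equation, and FC is the corresponding correlation matrix. This is only a schematic illustration of the moment-equation idea, not the paper's dynamic mean-field model; the leak, coupling, and noise parameters are assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def linear_fc(structural_conn, coupling=0.5, leak=1.0, noise=0.01):
    """Stationary covariance of dx = (-leak*I + coupling*C) x dt + sqrt(noise) dW,
    a generic linearization around a stable fixed point; FC is the correlation
    matrix derived from that covariance (illustrative sketch only)."""
    n = structural_conn.shape[0]
    A = -leak * np.eye(n) + coupling * structural_conn
    assert np.all(np.real(np.linalg.eigvals(A)) < 0), "fixed point must be stable"
    Q = noise * np.eye(n)
    # Solve A P + P A^T = -Q for the stationary covariance P.
    P = solve_continuous_lyapunov(A, -Q)
    d = np.sqrt(np.diag(P))
    return P / np.outer(d, d)          # functional connectivity (correlations)
```

Here structural_conn would be a diffusion-imaging-derived matrix scaled so that the resulting Jacobian remains stable.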
Abstract:
The Iowa Department of Transportation (DOT) is continually improving its pavement management program and striving to reduce maintenance needs. Through a 1979 pavement management study, the Iowa DOT became a participant in a five-state Federal Highway Administration (FHWA) study of "Transverse Cracking of Asphalt Pavements". There were numerous conclusions and recommendations but no agreement as to the major factors contributing to transverse cracking or methods of preventing or reducing its occurrence. The project did focus attention on the problem and generated ideas for research. This project is one of two state-funded research projects that were a direct result of the FHWA project. Iowa DOT personnel had been monitoring the temperature susceptibility of asphalt cements using the Norman McLeod Modified Penetration Index. Even though there are many variables from one asphalt mix to another, the trend seemed to indicate that the frequency of transverse cracking was highly dependent on temperature susceptibility. Research project HR-217, "Reducing the Adverse Effects of Transverse Cracking", was initiated to verify the concept. A final report has been published after a four-year evaluation. The crack frequency with the asphalt cement of high temperature susceptibility was substantially greater than with the asphalt cement of low temperature susceptibility. An increased asphalt cement content in the asphalt-treated base also reduced the crack frequency. This research on the prevention of transverse cracking with fabric supports the following conclusions: 1. Engineering fabric does not prevent transverse cracking of asphalt cement concrete. 2. Engineering fabric may retard the occurrence of transverse cracking. 3. Engineering fabric does not contribute significantly to the structural capability of an asphalt concrete pavement.
Abstract:
Planning with partial observability can be formulated as a non-deterministic search problem in belief space. The problem is harder than classical planning, as keeping track of beliefs is harder than keeping track of states, and searching for action policies is harder than searching for action sequences. In this work, we develop a framework for partial observability that avoids these limitations and leads to a planner that scales up to larger problems. For this, the class of problems is restricted to those in which 1) the non-unary clauses representing the uncertainty about the initial situation are invariant, and 2) variables that are hidden in the initial situation do not appear in the body of conditional effects, which are all assumed to be deterministic. We show that such problems can be translated in linear time into equivalent fully observable non-deterministic planning problems, and that a slight extension of this translation renders the problem solvable by means of classical planners. The whole approach is sound and complete provided that, in addition, the state space is connected. Experiments are also reported.
Abstract:
Statistics about people and their families interest community planners, social scientists, Extension educators, and others because the family is the fundamental social institution in our society. The purpose of this publication is to bring together in one reference many statistics about people in Iowa counties that have been published separately elsewhere. Most of the data presented are limited to only one year. This cross-sectional view is similar to a photograph that shows only one point in time. At an earlier or later time it might appear differently. Although the statistics reported in the various tables and figures represent different years, the data presented were the most recent available at the time this publication was prepared.