887 results for cutting stock problem with setups


Relevance: 100.00%

Publisher:

Abstract:

In this work, we are interested in the dynamic behavior of a parabolic problem with nonlinear boundary conditions and delay in the boundary. We construct a reaction-diffusion problem with delay in the interior, where the reaction term is concentrated in a neighborhood of the boundary and this neighborhood shrinks to the boundary as a parameter epsilon goes to zero. We analyze the limit of the solutions of this concentrated problem and prove that they converge, in certain continuous function spaces, to the unique solution of the parabolic problem with delay in the boundary. This convergence result allows us to approximate the solution of equations with delay acting on the boundary by solutions of equations with delay acting in the interior, and it may help in analyzing the dynamic behavior of delay equations when the delay is at the boundary. (C) 2012 Elsevier Inc. All rights reserved.
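Schematically, and in our own notation rather than the paper's, the boundary-delay problem and its concentrated interior approximation can be written as

    \begin{aligned}
    \text{boundary delay:}\quad & u_t = \Delta u \ \ \text{in } \Omega,
      & \partial_n u &= f\bigl(u(t),\,u(t-\tau)\bigr) \ \ \text{on } \partial\Omega,\\
    \text{concentrated:}\quad & u^\varepsilon_t = \Delta u^\varepsilon
      + \tfrac{1}{\varepsilon}\,\chi_{\omega_\varepsilon}\,
        f\bigl(u^\varepsilon(t),\,u^\varepsilon(t-\tau)\bigr) \ \ \text{in } \Omega,
      & \partial_n u^\varepsilon &= 0 \ \ \text{on } \partial\Omega,
    \end{aligned}

where \omega_\varepsilon \subset \Omega is a neighborhood of \partial\Omega that shrinks to the boundary as \varepsilon \to 0, \chi_{\omega_\varepsilon} is its characteristic function, and \tau > 0 is the delay; the stated result is the convergence u^\varepsilon \to u in suitable continuous function spaces.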

Relevance: 100.00%

Publisher:

Abstract:

Despite significant advances in the care of critically ill patients, acute lung injury remains a complex problem with high mortality. The present study was designed to characterize early lipopolysaccharide (LPS)-induced pulmonary injury and to evaluate small interfering RNA targeting focal adhesion kinase (FAK) as a possible therapeutic tool in the septic lung remodeling process. Male Wistar rats were assigned to an endotoxemic group or a control group. Total collagen deposition was measured 8, 16, and 24 h after LPS injection. Focal adhesion kinase expression, interstitial and vascular collagen deposition, and pulmonary mechanics were analyzed at 24 h. Intravenous injection of small interfering RNA targeting FAK was used to silence expression of the kinase in pulmonary tissue. Focal adhesion kinase, total collagen deposition, and pulmonary mechanics were all increased in the LPS group. Types I, III, and V collagen increased in the pulmonary parenchyma, but only type V increased in vessels 24 h after LPS injection. Focal adhesion kinase silencing prevented lung remodeling in the pulmonary parenchyma at 24 h. In conclusion, LPS induced early and substantial lung remodeling: a fibrotic response in the lung characterized by increased amounts of total and specific collagen types. These data may explain the frequent clinical presentation during sepsis of reduced lung compliance, impaired oxygen diffusion, and pulmonary hypertension. The fact that FAK silencing was protective against lung collagen deposition underscores the therapeutic potential of FAK targeting by small interfering RNA.

Relevance: 100.00%

Publisher:

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem: with RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation, and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations that respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established: optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
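In schematic form (our notation, not the paper's), with d the design variables, C_man the construction/manufacturing cost, C_om the operation and maintenance costs, P_f failure probabilities and C_f the monetary consequences of failure, the three formulations can be contrasted as

    \begin{aligned}
    \text{DDO:}\quad & \min_d\; C_{\mathrm{man}}(d)
      && \text{s.t. } g_i(d,\lambda_i\bar{x}) \ge 0 \quad \text{(safety factors } \lambda_i\text{)},\\
    \text{RBDO:}\quad & \min_d\; C_{\mathrm{man}}(d)
      && \text{s.t. } P_f(d) \le p_f^{\max},\\
    \text{RO:}\quad & \min_d\; C_{\mathrm{man}}(d) + C_{\mathrm{om}}(d)
      + \textstyle\sum_i P_{f,i}(d)\,C_{f,i}. &&
    \end{aligned}

Only RO trades manufacturing cost directly against the expected cost of failure, which is the balance point the paper reports DDO and RBDO cannot reproduce in general.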

Relevance: 100.00%

Publisher:

Abstract:

[EN] Brine shrimp nauplii (Artemia sp.) are used in aquaculture as the major food source for many cultured marine larvae, and also, in the adult phase, for many juvenile and adult fish. One species, Artemia franciscana, is most commonly preferred because of the availability of its cysts and its ease of hatching and biomass production. The problem with A. franciscana is that it is relatively poor in essential fatty acids, so it is common practice to enrich it with emulsions such as SELCO and ORIGO. This “bioencapsulation” enrichment method permits the incorporation of different kinds of products into the artemia; the brine shrimp's non-selective particle-feeding habits make it particularly suitable for this process. Bioencapsulation is done just prior to feeding the artemia to a predator organism. This allows the delivery of different substances, not only for nutrient enrichment, but also for changing pigmentation and administering medicine, which is especially useful in culturing ornamental seahorses and tropical fish in marine aquaria. In this study, the objectives were to determine the relative nutrient value of ORIGO and SELCO, as well as the optimal exposure to these supplements prior to their use as food organisms.

Relevance: 100.00%

Publisher:

Abstract:

The thesis is organized in three chapters. The first gives an account of the debate that arose around the problem of placing supplementary pension schemes within the constitutional framework of Article 38 of the Italian Constitution, a debate that divided legal scholars between those who traced the phenomenon back to the principle of freedom of private social security under Article 38, paragraph 5, and those who instead placed it under paragraph 2 of the same provision, on the basis of a presumed identity of function between public and supplementary pension schemes. The latter reconstruction, particularly after the so-called "Amato" reform, culminated in the case law of the Constitutional Court, which ruled on the question in a series of decisions concerning the so-called "contribution on the contribution" and the subordination of the requirements for access to supplementary pension benefits to the attainment of the requirements set by the compulsory system. The following chapter verifies the current validity and coherence of the Constitutional Court's approach in light of the evolution of pension fund regulation. Finally, the third chapter addresses some open questions concerning the so-called "pre-existing" pension funds, which may raise concerns about the need to safeguard the expectations and rights of enrolled members.

Relevance: 100.00%

Publisher:

Abstract:

In this thesis we study three combinatorial optimization problems, belonging to the classes of Network Design and Vehicle Routing problems, that are strongly linked in the context of the design and management of transportation networks: the Non-Bifurcated Capacitated Network Design Problem (NBP), the Period Vehicle Routing Problem (PVRP) and the Pickup and Delivery Problem with Time Windows (PDPTW). These problems are NP-hard and contain as special cases some well-known difficult problems, such as the Traveling Salesman Problem and the Steiner Tree Problem. Moreover, they model the core structure of many practical problems arising in logistics and telecommunications. The NBP is the problem of designing the optimum network to satisfy a given set of traffic demands. Given a set of nodes, a set of potential links and a set of point-to-point demands called commodities, the objective is to select the links to install and to dimension their capacities so that all the demands can be routed between their respective endpoints, and the sum of link fixed costs and commodity routing costs is minimized. The problem is called non-bifurcated because the solution network must route each demand along a single path, i.e., the flow of each demand cannot be split (a schematic formulation is sketched after this abstract). Although this is the case in many real applications, the NBP has received significantly less attention in the literature than other capacitated network design problems that allow bifurcation. We describe an exact algorithm for the NBP, based on solving, with an integer programming solver, a formulation of the problem strengthened by simple valid inequalities, and four new heuristic algorithms. One of these heuristics is an adaptive memory metaheuristic, based on partial enumeration, that could be applied to a wider class of structured combinatorial optimization problems. In the PVRP, a fleet of vehicles of identical capacity must be used to service a set of customers over a planning period of several days. Each customer specifies a service frequency, a set of allowable day-combinations and a quantity of product that must be delivered at every visit. For example, a customer may require to be visited twice during a 5-day period, imposing that these visits take place on Monday-Thursday, Monday-Friday or Tuesday-Friday. The problem consists in simultaneously assigning a day-combination to each customer and designing the vehicle routes for each day so that each customer is visited the required number of times, the number of routes on each day does not exceed the number of vehicles available, and the total cost of the routes over the period is minimized. We also consider a tactical variant of this problem, called the Tactical Planning Vehicle Routing Problem, where customers require to be visited on a specific day of the period but a penalty cost, called service cost, can be paid to postpone the visit to a later day. To our knowledge, all the algorithms proposed in the literature for the PVRP are heuristics. In this thesis we present, for the first time, an exact algorithm for the PVRP, based on different relaxations of a set-partitioning-like formulation. The effectiveness of the proposed algorithm is tested on a set of instances from the literature and on a new set of instances. Finally, the PDPTW consists in servicing a set of transportation requests using a fleet of identical vehicles of limited capacity located at a central depot.
Each request specifies a pickup location and a delivery location, and requires that a given quantity of load be transported from the former to the latter. Moreover, each location can be visited only within an associated time window. Each vehicle can perform at most one route, and the problem is to satisfy all the requests using the available vehicles so that each request is serviced by a single vehicle, the load on each vehicle does not exceed its capacity, and all locations are visited within their time windows. We formulate the PDPTW as a set-partitioning-like problem with additional cuts, and we propose an exact algorithm based on different relaxations of the mathematical formulation as well as a branch-and-cut-and-price algorithm. The new algorithm is tested on two classes of problems from the literature and compared with a recent branch-and-cut-and-price algorithm from the literature.
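A schematic arc-flow model for the NBP described above (notation ours, not the thesis's formulation): let binary y_e select link e, with fixed cost F_e and capacity u_e, and let binary x^k_e route commodity k, with demand d^k, over link e at cost c^k_e. Then

    \begin{aligned}
    \min\ & \sum_{e\in E} F_e\,y_e + \sum_{k\in K}\sum_{e\in E} c^k_e\,x^k_e\\
    \text{s.t. } & \sum_{e\in\delta^+(i)} x^k_e - \sum_{e\in\delta^-(i)} x^k_e = b^k_i
      \qquad \forall\, i\in V,\ k\in K,\\
    & \sum_{k\in K} d^k\,x^k_e \le u_e\,y_e \qquad \forall\, e\in E,\\
    & x^k_e,\ y_e \in \{0,1\},
    \end{aligned}

where b^k_i is +1 at the source of commodity k, -1 at its sink, and 0 elsewhere. Keeping the flow variables binary is what enforces the single-path (non-bifurcated) routing of each commodity.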

Relevance: 100.00%

Publisher:

Abstract:

In this thesis we present some combinatorial optimization problems and suggest models and algorithms for their effective solution. For each problem, we give its description, followed by a short literature review, provide methods to solve it and, finally, present computational results and comparisons with previous works to show the effectiveness of the proposed approaches. The considered problems are: the Generalized Traveling Salesman Problem (GTSP), the Bin Packing Problem with Conflicts (BPPC) and the Fair Layout Problem (FLOP).

Relevance: 100.00%

Publisher:

Abstract:

This work presents hybrid Constraint Programming (CP) and metaheuristic methods for the solution of large-scale optimization problems; it aims at integrating concepts and mechanisms from metaheuristic methods into a CP-based tree search environment in order to exploit the advantages of both approaches. The modeling and solution of large-scale combinatorial optimization problems is a topic which has attracted the interest of many researchers in the Operations Research field; combinatorial optimization problems are widespread in everyday life, and the need to solve difficult problems is more and more urgent. Metaheuristic techniques have been developed over the last decades to effectively handle the approximate solution of combinatorial optimization problems; we examine metaheuristics in detail, focusing on the aspects common to different techniques. Each metaheuristic approach possesses its own peculiarities in designing and guiding the solution process; our work aims at recognizing components which can be extracted from metaheuristic methods and re-used in different contexts. In particular, we focus on the possibility of porting metaheuristic elements to constraint programming based environments, as constraint programming is able to deal with the feasibility issues of optimization problems in a very effective manner. Moreover, CP offers a general paradigm which allows one to easily model any type of problem and solve it with a problem-independent framework, unlike local search and metaheuristic methods, which are highly problem-specific. In this work we describe the implementation of the Local Branching framework, originally developed for Mixed Integer Programming, in a CP-based environment. Constraint programming specific features are used to ease the search process, while maintaining an absolute generality of the approach. We also propose a search strategy called Sliced Neighborhood Search (SNS) that iteratively explores slices of large neighborhoods of an incumbent solution by performing CP-based tree search, and encloses concepts from metaheuristic techniques. SNS can be used as a stand-alone search strategy, but it can alternatively be embedded in existing strategies as an intensification and diversification mechanism; in particular, we show its integration within CP-based local branching. We provide an extensive experimental evaluation of the proposed approaches on instances of the Asymmetric Traveling Salesman Problem and of the Asymmetric Traveling Salesman Problem with Time Windows. The proposed approaches achieve good results on problems of practical size, thus demonstrating the benefit of integrating metaheuristic concepts into CP-based frameworks.
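As a minimal illustration of the slice-exploration idea behind SNS, here is a Python sketch on a toy binary problem. The thesis explores each slice with CP-based tree search, which we replace here with plain enumeration; all names and parameters are ours, not the thesis's.

    import itertools
    import random

    def sliced_neighborhood_search(objective, n, slice_width=2, max_radius=6, seed=0):
        # Toy SNS: explore Hamming-distance "slices" around the incumbent.
        rng = random.Random(seed)
        incumbent = tuple(rng.randint(0, 1) for _ in range(n))
        best_val = objective(incumbent)
        radius = 1
        while radius <= max_radius:
            improved = False
            # Current slice: candidates at distance radius .. radius+slice_width-1.
            for dist in range(radius, min(radius + slice_width, n + 1)):
                for flips in itertools.combinations(range(n), dist):
                    cand = list(incumbent)
                    for i in flips:
                        cand[i] = 1 - cand[i]
                    val = objective(tuple(cand))
                    if val < best_val:
                        incumbent, best_val = tuple(cand), val
                        improved = True
            if improved:
                radius = 1               # intensification: restart near the new incumbent
            else:
                radius += slice_width    # diversification: move to a farther slice
        return incumbent, best_val

    # Toy usage: minimize Hamming distance to a hidden target vector.
    target = (1, 0, 1, 1, 0, 1, 0, 0)
    print(sliced_neighborhood_search(lambda x: sum(a != b for a, b in zip(x, target)),
                                     len(target)))

The reset-on-improvement versus advance-on-failure step mirrors the intensification/diversification role SNS plays when embedded in other strategies.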

Relevance: 100.00%

Publisher:

Abstract:

The aim of this proposal is to explain the paradigm of American foreign policy during the Johnson Administration, especially toward Europe, within the NATO framework, and toward the USSR, in the context of the détente that emerged during the 1960s. After the death of J. F. Kennedy, President L. B. Johnson inherited a complex and very ambitious world policy, which sought to open a new phase in transatlantic relations and to share the burden of the Cold War with a reluctant Europe. Known as the grand design, it was a policy that needed the support of the allies and a clear purpose that appealed to the Europeans. At first, President Johnson saw in the problem of nuclear sharing the right bargain to strike with the NATO allies. At the same time, he understood that the United States needed to reassert its leadership within the new stage of relations with the Soviet Union. Soon, the "transatlantic bargain" proved not so easy to deal with. The Federal Republic of Germany wanted a say in nuclear affairs and, why not, a finger on the trigger of the Atlantic nuclear weapons. The USSR, on the other hand, wanted to keep Germany down. The other allies did not want to share the burden of the defense of Europe, at most the responsibility for the use of the weapons and, at least, participation in the decision-making process. France, which wanted to detach herself from the policy of the United States and regain a world role, added difficulties to the management of this course of action. Through the years of Johnson's office, the divergent policies advanced by his advisers put American foreign policy in deep water. The withdrawal of France from the organization, but not from the Alliance, gave Washington a chance to carry out its goal. The development of a clear-cut disarmament policy led the Johnson administration to the core of the matter. The Non-Proliferation Treaty, signed in 1968, solved the problem with the allies in a business-like fashion. The question of nuclear sharing faded away with the allies' acceptance of a deeper consultative role in nuclear affairs, the burden of the defense of Europe became more bearable through the offset agreement with the FRG, and a new doctrine, flexible response, put an end, at least formally, to the taboo of the nuclear age. Johnson's grand design proved to be different from Kennedy's, but, all things considered, it was more workable. The unexpected result was a real détente with the Soviet Union, which, we can say, was a merit of President Johnson.

Relevance: 100.00%

Publisher:

Abstract:

In this thesis we take the first steps towards the systematic application of a methodology for automatically building formal models of complex biological systems. Such a methodology could also be useful for designing artificial systems possessing desirable properties such as robustness and evolvability. The approach we follow is to manipulate formal models by means of adaptive search methods called metaheuristics. In the first part of the thesis we develop state-of-the-art hybrid metaheuristic algorithms to tackle two important problems in genomics, namely Haplotype Inference by parsimony and the Founder Sequence Reconstruction Problem. We compare our algorithms with other effective techniques in the literature, we show the strengths and limitations of our approaches on various problem formulations and, finally, we propose further enhancements that could possibly improve the performance of our algorithms and widen their applicability. In the second part, we concentrate on Boolean network (BN) models of gene regulatory networks (GRNs); a minimal sketch of the BN model is given after this abstract. We detail our automatic design methodology and apply it to four use cases which correspond to different design criteria and address some limitations of GRN modeling by BNs. Finally, we tackle the Density Classification Problem with the aim of showing the learning capabilities of BNs. Experimental evaluation of this methodology shows its efficacy in producing networks that meet our design criteria. Our results, consistently with what has been found in other works, also suggest that networks manipulated by a search process exhibit a mixture of characteristics typical of different dynamical regimes.
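For concreteness, a minimal Python sketch of the Boolean network model itself (synchronous update of random Boolean functions); the thesis's contribution, the metaheuristic design of such networks, is not shown, and all names here are ours.

    import random

    def random_boolean_network(n, k=2, seed=42):
        # Each of the n nodes reads k randomly chosen inputs and applies
        # a random Boolean function stored as a truth table.
        rng = random.Random(seed)
        inputs = [rng.sample(range(n), k) for _ in range(n)]
        tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
        return inputs, tables

    def synchronous_step(state, inputs, tables):
        # All nodes update simultaneously from the current state.
        nxt = []
        for ins, table in zip(inputs, tables):
            idx = 0
            for i in ins:
                idx = (idx << 1) | state[i]
            nxt.append(table[idx])
        return tuple(nxt)

    inputs, tables = random_boolean_network(8)
    state = (1, 0, 0, 1, 0, 1, 1, 0)
    for t in range(5):    # iterate the dynamics for a few steps
        print(t, state)
        state = synchronous_step(state, inputs, tables)

A search process of the kind used in the thesis would mutate the wiring (inputs) and the truth tables while scoring the resulting dynamics against a design criterion.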

Relevance: 100.00%

Publisher:

Abstract:

In this work we study localized electric potentials that have an arbitrarily high energy on some given subset of a domain and low energy on another. We show that such potentials exist for general L∞ conductivities (with positive infima) in almost arbitrarily shaped subregions of a domain, as long as these regions are connected to the boundary and a unique continuation principle is satisfied. From this we deduce a simple, but new, theoretical identifiability result for the famous Calderon problem with partial data. We also show how to construct such potentials numerically and use a connection with the factorization method to derive a new non-iterative algorithm for the detection of inclusions in electrical impedance tomography.
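In symbols (standard notation for this line of work, not quoted from the paper): for solutions of the conductivity equation \nabla \cdot (\sigma \nabla u) = 0 in \Omega, a localized potential for subsets D_1 and D_2 is a sequence of solutions (u_m) whose energies satisfy

    \int_{D_1} \sigma\,|\nabla u_m|^2\,\mathrm{d}x \longrightarrow \infty,
    \qquad
    \int_{D_2} \sigma\,|\nabla u_m|^2\,\mathrm{d}x \longrightarrow 0,

i.e., arbitrarily high energy on one subset and vanishing energy on the other.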

Relevance: 100.00%

Publisher:

Abstract:

The main goal of this thesis is to facilitate the development of industrial automated systems by applying formal methods to ensure their reliability. A new formulation of the distributed diagnosability problem, in terms of Discrete Event Systems theory and the automata framework, is presented, which is then used to enforce the desired property of the system rather than just verify it. This approach tackles the state explosion problem with modeling patterns and new algorithms aimed at the verification of the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.
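To fix ideas, here is a toy bounded-horizon Python sketch of the (centralized) diagnosability property: whether the occurrence of a fault can always be deduced from the observable events. The thesis's distributed formulation and property enforcement are not captured here; the model and all names are ours.

    # TRANSITIONS[state][event] = next state. 'f' is the (unobservable)
    # fault event; 'u' is another unobservable event.
    TRANSITIONS = {
        0: {"a": 1, "f": 2},
        1: {"b": 1},
        2: {"u": 3},
        3: {"b": 3},
    }
    OBSERVABLE = {"a", "b"}

    def full_runs(state, depth, trace=(), faulty=False):
        # Enumerate runs of length 'depth' (or ending in a deadlock),
        # yielding (observed projection of the trace, fault occurred?).
        succ = TRANSITIONS.get(state, {})
        if depth == 0 or not succ:
            yield tuple(e for e in trace if e in OBSERVABLE), faulty
            return
        for event, nxt in succ.items():
            yield from full_runs(nxt, depth - 1, trace + (event,),
                                 faulty or event == "f")

    def diagnosable_within(horizon):
        # Diagnosable within the horizon if no observation is produced by
        # both a faulty and a fault-free run of that length.
        faulty_obs, normal_obs = set(), set()
        for obs, faulty in full_runs(0, horizon):
            (faulty_obs if faulty else normal_obs).add(obs)
        return faulty_obs.isdisjoint(normal_obs)

    print(diagnosable_within(4))  # True: observations starting with 'b' only follow the fault

Practical verifiers avoid this exhaustive enumeration (the state explosion the thesis addresses), typically via twin-plant style constructions.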

Relevance: 100.00%

Publisher:

Abstract:

In this work we study a model for breast image reconstruction in Digital Tomosynthesis, a non-invasive and non-destructive method for the three-dimensional visualization of the inner structures of an object, in which data acquisition consists of measuring a limited number of low-dose two-dimensional projections of the object by moving a detector and an X-ray tube around it within a limited angular range. Reconstructing 3D images from the projections provided by Digital Tomosynthesis is an ill-posed inverse problem, which leads to a minimization problem whose objective function contains a data-fitting term and a regularization term. The contribution of this thesis is to apply techniques from compressed sensing, in particular replacing the standard least-squares data-fitting problem with the minimization of the 1-norm of the residuals, and using Total Variation (TV) as the regularization term. We tested two different algorithms: a new alternating minimization algorithm (ADM), and a version of the more standard scaled projected gradient algorithm (SGP) adapted to the 1-norm. We performed experiments and analyzed the performance of the two methods, comparing relative errors, iteration counts, run times, and the quality of the reconstructed images. In conclusion, we found that the 1-norm and Total Variation are valid tools in the formulation of the minimization problem for image reconstruction in Digital Tomosynthesis, and that the new ADM algorithm reached a relative error comparable to that of the classic SGP variant while proving faster and quicker to reveal the structures representing the masses.
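In formula form (our notation): with A the projection operator of the acquisition geometry, b the measured projections and \lambda > 0 a regularization parameter, the reconstruction problem described above reads

    \min_{x}\; \| A x - b \|_1 \;+\; \lambda\, \mathrm{TV}(x),

where the 1-norm data-fitting term replaces the standard least-squares term \| A x - b \|_2^2 and \mathrm{TV}(x) is the Total Variation regularizer.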

Relevance: 100.00%

Publisher:

Abstract:

Concerns over rising healthcare costs and the ever-increasing desire to improve surgical outcomes have motivated the development of a new robot-assisted surgical procedure for the implantation of artificial hearing devices (AHDs). This paper describes our efforts to enable minimally invasive, cost-effective surgery for the implantation of AHDs. We approach this problem with the fundamental goal of reducing errors in every component of the surgical workflow, from imaging and trajectory planning to patient tracking and robot development. These efforts succeeded in reducing overall system error to a previously unattained level.

Relevance: 100.00%

Publisher:

Abstract:

While there are many articles in the popular press and practitioner journals concerning the Millennials (i.e., who they are and what we need to do about them), the academic literature on the subject is more limited. This chapter (1) extensively reviews this literature as published in practitioner, popular press, and academic journals across disciplines including psychology, sociology, management, human resources, and accounting education, and (2) surveys the generational-study literature to determine what, if any, rigorous empirical studies exist to support (or refute) the existence of a distinct Millennial generational cohort. While the popular press is voluminous when it comes to avowed generational differences between Millennials and their predecessors, there is a paucity of peer-reviewed, academic, empirical work in the area, and most of the latter suffers in some way from the overarching problem with generational research: the linear relationship between age, period, and generation results in these variables being inherently entwined. However, even absent strong empirical evidence of a unique generational cohort, the literature offers extensive suggestions about what to do about the Millennials in our classrooms and workplaces. This chapter better informs accounting faculty about which traits of the current generation of accounting students are supported by empirical research versus claims made in the popular press. It argues for a more reasoned "continuous improvement" approach to Millennials while offering some classroom suggestions for accounting faculty members.