903 results for Mathematical problem with complementarity constraints



Abstract:

In this work, we are interested in the dynamic behavior of a parabolic problem with nonlinear boundary conditions and delay on the boundary. We construct a reaction-diffusion problem with delay in the interior, where the reaction term is concentrated in a neighborhood of the boundary and this neighborhood shrinks to the boundary as a parameter epsilon goes to zero. We analyze the limit of the solutions of this concentrated problem and prove that they converge, in certain spaces of continuous functions, to the unique solution of the parabolic problem with delay on the boundary. This convergence result allows us to approximate the solution of equations with delay acting on the boundary by solutions of equations with delay acting in the interior, and it may contribute to the analysis of the dynamic behavior of delay equations when the delay is at the boundary. (C) 2012 Elsevier Inc. All rights reserved.
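
The construction can be sketched as follows; this is our own schematic notation, not taken from the paper, with Ω a smooth bounded domain, ω_ε the strip of width ε inside Ω adjacent to ∂Ω, τ > 0 the delay, and f the nonlinear reaction:

```latex
% Concentrated problem (reaction and delay supported on the strip \omega_\varepsilon):
\[
\partial_t u_\varepsilon - \Delta u_\varepsilon
   = \tfrac{1}{\varepsilon}\,\chi_{\omega_\varepsilon}\, f\bigl(u_\varepsilon(t-\tau)\bigr)
   \ \text{in } \Omega,
\qquad
\partial_\nu u_\varepsilon = 0 \ \text{on } \partial\Omega .
\]
% Limit problem as \varepsilon \to 0 (delay acting on the boundary):
\[
\partial_t u - \Delta u = 0 \ \text{in } \Omega,
\qquad
\partial_\nu u = f\bigl(u(t-\tau)\bigr) \ \text{on } \partial\Omega .
\]
```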


Abstract:

Despite significant advances in the care of critically ill patients, acute lung injury continues to be a complex problem with high mortality. The present study was designed to characterize early lipopolysaccharide (LPS)-induced pulmonary injury and small interfering RNA targeting focal adhesion kinase (FAK) as a possible therapeutic tool in the septic lung remodeling process. Male Wistar rats were assigned to an endotoxemic group and a control group. Total collagen deposition was assessed 8, 16, and 24 h after LPS injection. Focal adhesion kinase expression, interstitial and vascular collagen deposition, and pulmonary mechanics were analyzed at 24 h. Intravenous injection of small interfering RNA targeting FAK was used to silence expression of the kinase in pulmonary tissue. Focal adhesion kinase, total collagen deposition, and pulmonary mechanics were increased in the LPS group. Type I, III, and V collagen increased in the pulmonary parenchyma, but only type V increased in the vessels 24 h after LPS injection. Focal adhesion kinase silencing prevented lung remodeling in the pulmonary parenchyma at 24 h. In conclusion, LPS induced early and substantial lung remodeling. There was a fibrotic response in the lung characterized by an increased amount of total and specific collagen types. These data may explain the clinical presentation frequently seen during sepsis of reduced lung compliance, oxygen diffusion, and pulmonary hypertension. The fact that FAK silencing was protective against lung collagen deposition underscores the therapeutic potential of FAK targeting by small interfering RNA.


Abstract:

Brine shrimp nauplii (Artemia sp.) are used in aquaculture as the major food source for many cultured marine larvae, and adult Artemia are also used as food for many juvenile and adult fish. One species, Artemia franciscana, is most commonly preferred, owing to the availability of its cysts and to its ease of hatching and biomass production. The problem with A. franciscana is that its nutritional quality is relatively poor in essential fatty acids, so it is common practice to enrich it with emulsions such as SELCO and ORIGO. This "bioencapsulation" enrichment method permits the incorporation of different kinds of products into the Artemia. The brine shrimp's non-selective particle-feeding habit makes it particularly suitable for this enrichment process. The bioencapsulation is done just prior to feeding the Artemia to a predator organism. This allows the delivery of different substances, not only for nutrient enrichment, but also for changing pigmentation and administering medicine, which is especially useful in culturing ornamental seahorses and tropical fish in marine aquaria. In this study, the objectives were to determine the relative nutritional value of ORIGO and SELCO, as well as the optimal exposure to these supplements prior to their use as food organisms.


Abstract:

Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the Computer Science Department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving access through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are:
• to study the architecture of these tools, with the aim of understanding the source of their complexity;
• to exploit such knowledge to experiment with new solutions that, for backward-compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq.
Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with those produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts, in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences:
• our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, concerning the classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the Four Colour Theorem;
• our collaboration with the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem.
The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping such as CIC.
In the second part of the manuscript many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve them: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user-driven.


Abstract:

The thesis is organized in three chapters. The first gives an account of the debate that arose around the problem of placing supplementary pension provision within the constitutional framework of Article 38 of the Italian Constitution, a debate that divided scholars between those who traced the phenomenon back to the principle of freedom of private social security set out in Article 38, paragraph 5, and those who instead placed it under paragraph 2 of the same provision, on the basis of a presumed identity of functions between public social security and supplementary pension provision. The latter reconstruction, in particular after the so-called "Amato" reform, culminated in the case law of the Constitutional Court, which ruled on the issue in a series of decisions concerning the so-called "contribution on the contribution" and the subordination of the requirements for access to supplementary pension benefits to the attainment of the requirements laid down by the mandatory system. The following chapter verifies the topicality and coherence of the Constitutional Court's approach in light of the evolution of the regulation of pension funds. Finally, the third chapter addresses some open questions concerning the so-called "pre-existing" pension funds, which may raise concerns about the need to safeguard the expectations and rights of their members.


Abstract:

In this thesis we present some combinatorial optimization problems and suggest models and algorithms for their effective solution. For each problem, we give its description, followed by a short literature review, provide methods to solve it and, finally, present computational results and comparisons with previous works to show the effectiveness of the proposed approaches. The considered problems are: the Generalized Traveling Salesman Problem (GTSP), the Bin Packing Problem with Conflicts (BPPC) and the Fair Layout Problem (FLOP).


Abstract:

Constraints are widely present in flight control problems: actuator saturations and flight envelope limitations are only some examples. The ability of Model Predictive Control (MPC) to deal with constraints, combined with the increased computational power of modern computers, makes this approach attractive also for fast-dynamics systems such as agile air vehicles. This PhD thesis presents the results, achieved at the Aerospace Engineering Department of the University of Bologna in collaboration with the Dutch National Aerospace Laboratories (NLR), concerning the development of a model predictive control system for small-scale rotorcraft UAS. Several different predictive architectures have been evaluated and tested by means of simulation; as a result of this analysis, the most promising one has been used to implement three different control systems: a Stability and Control Augmentation System, a trajectory tracking system, and a path following system. The systems have been compared with a corresponding baseline controller and showed several advantages in terms of performance, stability and robustness.
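
As a minimal illustration of how MPC handles actuator saturation, the sketch below sets up and solves one receding-horizon step for a toy linear model with a hard input bound. It is a generic example under our own assumptions (double-integrator dynamics, cvxpy as the QP solver, hypothetical weights), not the controller developed in the thesis.

```python
import numpy as np
import cvxpy as cp

# Toy double-integrator model standing in for the vehicle dynamics.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
N, u_max = 20, 0.5                      # prediction horizon and actuator saturation
x0 = np.array([1.0, 0.0])               # current state estimate

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
cost, constraints = 0, [x[:, 0] == x0]
for k in range(N):
    cost += cp.sum_squares(x[:, k + 1]) + 0.1 * cp.sum_squares(u[:, k])
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],   # dynamics
                    cp.abs(u[:, k]) <= u_max]                    # saturation constraint
cp.Problem(cp.Minimize(cost), constraints).solve()

# Receding horizon: apply only the first input, then re-measure and re-solve.
u_apply = u.value[:, 0]
print(u_apply)
```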


Abstract:

This work presents hybrid Constraint Programming (CP) and metaheuristic methods for the solution of Large Scale Optimization Problems; it aims at integrating concepts and mechanisms from metaheuristic methods into a CP-based tree-search environment in order to exploit the advantages of both approaches. The modeling and solution of large-scale combinatorial optimization problems is a topic which has attracted the interest of many researchers in the Operations Research field; combinatorial optimization problems are widespread in everyday life and the need to solve difficult problems is more and more urgent. Metaheuristic techniques have been developed in the last decades to effectively handle the approximate solution of combinatorial optimization problems; we examine metaheuristics in detail, focusing on the common aspects of different techniques. Each metaheuristic approach possesses its own peculiarities in designing and guiding the solution process; our work aims at recognizing components which can be extracted from metaheuristic methods and re-used in different contexts. In particular we focus on the possibility of porting metaheuristic elements to constraint programming based environments, as constraint programming is able to deal with the feasibility issues of optimization problems in a very effective manner. Moreover, CP offers a general paradigm which makes it easy to model any type of problem and solve it with a problem-independent framework, unlike local search and metaheuristic methods, which are highly problem-specific. In this work we describe the implementation of the Local Branching framework, originally developed for Mixed Integer Programming, in a CP-based environment. Constraint programming specific features are used to ease the search process, while maintaining the full generality of the approach. We also propose a search strategy called Sliced Neighborhood Search (SNS) that iteratively explores slices of large neighborhoods of an incumbent solution by performing CP-based tree search and incorporates concepts from metaheuristic techniques. SNS can be used as a stand-alone search strategy, but it can alternatively be embedded in existing strategies as an intensification and diversification mechanism; in particular we show its integration within CP-based local branching. We provide an extensive experimental evaluation of the proposed approaches on instances of the Asymmetric Traveling Salesman Problem and of the Asymmetric Traveling Salesman Problem with Time Windows. The proposed approaches achieve good results on practical-size problems, thus demonstrating the benefit of integrating metaheuristic concepts in CP-based frameworks.
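
As a sketch of the local-branching idea in a CP setting, the snippet below restricts the search to the Hamming-distance-k neighborhood of an incumbent 0/1 solution by posting a single constraint. It uses Google OR-Tools CP-SAT as a stand-in solver and hypothetical names; it is not the Local Branching implementation described in the thesis.

```python
from ortools.sat.python import cp_model

def add_local_branching(model, bool_vars, incumbent, k):
    """Allow at most k variables to flip with respect to the incumbent assignment."""
    flips = [1 - v if val else v for v, val in zip(bool_vars, incumbent)]
    model.Add(sum(flips) <= k)

# Toy usage: a small 0/1 model with a previously found incumbent solution.
model = cp_model.CpModel()
x = [model.NewBoolVar(f"x{i}") for i in range(6)]
model.Add(sum(x) >= 2)                          # some problem constraint
incumbent = [1, 1, 0, 0, 0, 0]                  # incumbent solution
add_local_branching(model, x, incumbent, k=2)   # explore its k-neighborhood only
model.Maximize(sum((i + 1) * x[i] for i in range(6)))

solver = cp_model.CpSolver()
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print([solver.Value(v) for v in x])
```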


Abstract:

The aim of this work is to explain the paradigm of American foreign policy during the Johnson Administration, especially toward Europe, within the NATO framework, and toward the USSR, in the context of the détente that emerged during the 1960s. During that period, after the passing of J. F. Kennedy, President L. B. Johnson inherited a complex and highly ambitious world policy, which aimed to get a new phase of transatlantic relations off the ground and to share the burden of the Cold War with a reluctant Europe. Known as the grand design, it was a policy that needed the support of the allies and a clear purpose which appealed to the Europeans. At first, President Johnson saw in the problem of nuclear sharing the bargain to strike with the NATO allies. At the same time, he understood that the United States needed to reassert its leadership within the new stage of relations with the Soviet Union. Soon, the "transatlantic bargain" became something not so easy to deal with. Federal Germany wanted a say in nuclear affairs and, why not, a finger on the trigger of the Atlantic nuclear weapons. The USSR, on the other hand, wanted to keep Germany down. The other allies did not want to share the burden of the defense of Europe; at most they wanted responsibility for the use of the weapons and, at least, a part in the decision-making process. France, which wanted to detach herself from the policy of the United States and regain a world role, added difficulties to the management of this course of action. Throughout Johnson's years in office, the diverging policies put forward by his advisers to attain this goal left American foreign policy in deep water. The withdrawal of France from the organization, but not from the Alliance, gave Washington a chance to carry out its goal. The development of a clear-cut disarmament policy led the Johnson administration to the core of the matter. The Non-Proliferation Treaty, signed in 1968, solved the problem with the allies in a business-like fashion. The question of nuclear sharing faded away with the allies' acceptance of a deeper consultative role in nuclear affairs, the burden of the defense of Europe became more bearable through the offset agreement with the FRG, and a new doctrine, flexible response, put an end, at least formally, to the taboo of the nuclear age. Johnson's grand design proved to be different from Kennedy's, but, all things considered, it was more workable. The unexpected result was a real détente with the Soviet Union, which, we can say, was to President Johnson's credit.


Abstract:

In this thesis we take the first steps towards the systematic application of a methodology for automatically building formal models of complex biological systems. Such a methodology could also be useful for designing artificial systems possessing desirable properties such as robustness and evolvability. The approach we follow in this thesis is to manipulate formal models by means of adaptive search methods called metaheuristics. In the first part of the thesis we develop state-of-the-art hybrid metaheuristic algorithms to tackle two important problems in genomics, namely Haplotype Inference by parsimony and the Founder Sequence Reconstruction Problem. We compare our algorithms with other effective techniques in the literature, show the strengths and limitations of our approaches on various problem formulations and, finally, propose further enhancements that could improve the performance of our algorithms and widen their applicability. In the second part, we concentrate on Boolean network (BN) models of gene regulatory networks (GRNs). We detail our automatic design methodology and apply it to four use cases which correspond to different design criteria and address some limitations of GRN modeling by BNs. Finally, we tackle the Density Classification Problem with the aim of showing the learning capabilities of BNs. Experimental evaluation of this methodology shows its efficacy in producing networks that meet our design criteria. Our results, consistent with what has been found in other works, also suggest that networks manipulated by a search process exhibit a mixture of characteristics typical of different dynamical regimes.
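
For illustration, a synchronous Boolean network update can be sketched as below; the function and the toy three-node network are our own hypothetical example, not the thesis code or the GRN models it studies.

```python
import numpy as np

def bn_step(state, regulators, truth_tables):
    """One synchronous update of a Boolean network.

    state:        1-D array of 0/1 node values
    regulators:   regulators[i] is the tuple of input nodes of node i
    truth_tables: truth_tables[i] maps the regulators' joint value
                  (read as a binary number) to the next value of node i
    """
    new_state = np.empty_like(state)
    for i, (regs, table) in enumerate(zip(regulators, truth_tables)):
        idx = 0
        for r in regs:
            idx = (idx << 1) | int(state[r])
        new_state[i] = table[idx]
    return new_state

# Toy 3-node network: node 0 copies node 2, node 1 is AND(node 0, node 2),
# node 2 negates node 1.
regulators = [(2,), (0, 2), (1,)]
truth_tables = [[0, 1], [0, 0, 0, 1], [1, 0]]
state = np.array([1, 0, 1])
for _ in range(4):
    state = bn_step(state, regulators, truth_tables)
    print(state)
```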


Abstract:

In this work we study localized electric potentials that have an arbitrarily high energy on some given subset of a domain and low energy on another. We show that such potentials exist for general L-infinity-conductivities (with positive infima) in almost arbitrarily shaped subregions of a domain, as long as these regions are connected to the boundary and a unique continuation principle is satisfied. From this we deduce a simple, but new, theoretical identifiability result for the famous Calderon problem with partial data. We also show how to construct such potentials numerically and use a connection with the factorization method to derive a new non-iterative algorithm for the detection of inclusions in electrical impedance tomography.
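
Schematically, and in our own notation (σ the conductivity, Ω the domain, B and D the two subsets), the localized potentials property asserts the existence of solutions u_n of the conductivity equation with

```latex
\[
\nabla\!\cdot\!\bigl(\sigma \nabla u_n\bigr) = 0 \ \text{in } \Omega,
\qquad
\int_{B} \sigma\,\lvert\nabla u_n\rvert^{2}\,dx \;\to\; \infty,
\qquad
\int_{D} \sigma\,\lvert\nabla u_n\rvert^{2}\,dx \;\to\; 0 .
\]
```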


Abstract:

The main goal of this thesis is to facilitate the process of industrial automated systems development by applying formal methods to ensure the reliability of systems. A new formulation of the distributed diagnosability problem in terms of Discrete Event Systems theory and the automata framework is presented, which is then used to enforce the desired property of the system, rather than just verifying it. This approach tackles the state explosion problem with modeling patterns and new algorithms aimed at the verification of the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.


Abstract:

Radio relics are diffuse synchrotron sources generally located in the peripheries of galaxy clusters in a merging state. According to the current leading scenario, relics trace gigantic cosmological shock waves that cross the intra-cluster medium, where particle acceleration occurs. The relic/shock connection is supported by several observational facts, including the spatial coincidence between relics and shocks found in the X-rays. Under the assumptions that particles are accelerated at the shock front and are subsequently deposited and then age downstream of the shock, Markevitch et al. (2005) proposed a method to constrain the magnetic field strength in radio relics. Measuring the thickness of radio relics at different frequencies makes it possible to derive combined constraints on the velocity of the downstream flow and on the magnetic field, which in turn determines particle aging. We elaborate on this idea to infer the first constraints on magnetic fields in cluster outskirts. We consider three models of particle aging and develop a geometric model to take into account the contribution to the relic transverse size due to the projection of the shock surface on the plane of the sky. We selected three well-studied radio relics in the clusters A 521, CIZA J2242.8+5301 and 1RXS J0603.3+4214. These relics have been chosen primarily because they are seen almost edge-on and because the Mach number of the shock associated with these relics is measured by X-ray observations, thus allowing us to break the degeneracy between magnetic field and downstream velocity in the method. For the first two clusters, our results are consistent with a pure radiative aging model, allowing us to derive constraints on the relics' magnetic field strength. In the case of 1RXS J0603.3+4214 we find that particle lifetimes are consistent with a pure radiative aging model under some conditions; however, we also collect evidence for downstream particle re-acceleration in the relic W-region and for a magnetic field decaying downstream in its E-region. Our estimates of the magnetic field strength in the relics in A 521 and CIZA J2242.8+5301 provide unique information on the field properties in cluster outskirts. The constraints derived for these relics, together with the lower limits on the magnetic field that we derived from the lack of inverse Compton X-ray emission from the sources, have been combined with the constraints from Faraday rotation studies of the Coma cluster. Overall, the results suggest that the spatial profile of the magnetic field energy density is broader than that of the thermal gas, implying that the ε_th/ε_B ratio decreases with cluster radius. Alternatively, radio relics could trace dynamically active regions where the magnetic field strength is biased high with respect to the average value in the cluster volume.
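
Schematically, and neglecting the projection and aging-model details treated in the thesis, the method rests on the relation between the downstream extent l_ν of the relic at observing frequency ν, the downstream speed v_d, and the radiative lifetime of the emitting electrons; the scalings below are the standard synchrotron plus inverse-Compton aging relations, quoted here as an illustration rather than the exact expressions used in the thesis:

```latex
\[
l_{\nu} \;\simeq\; v_{\mathrm{d}}\; t_{\mathrm{age}}(\nu, B),
\qquad
t_{\mathrm{age}} \;\propto\; \frac{B^{1/2}}{B^{2} + B_{\mathrm{CMB}}^{2}}\,
\bigl[\nu\,(1+z)\bigr]^{-1/2},
\qquad
B_{\mathrm{CMB}} \simeq 3.25\,(1+z)^{2}\ \mu\mathrm{G}.
\]
```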


Abstract:

In this work we study a model for breast image reconstruction in Digital Tomosynthesis, a non-invasive and non-destructive method for the three-dimensional visualization of the inner structures of an object, in which the data acquisition consists of measuring a limited number of low-dose two-dimensional projections of an object by moving a detector and an X-ray tube around the object within a limited angular range. The problem of reconstructing 3D images from the projections provided by Digital Tomosynthesis is an ill-posed inverse problem, which leads to a minimization problem whose objective function contains a data-fitting term and a regularization term. The contribution of this thesis is to use techniques from compressed sensing, in particular replacing the standard least-squares data-fitting problem with the minimization of the 1-norm of the residuals, and using Total Variation (TV) as the regularization term. We tested two different algorithms: a new alternating minimization algorithm (ADM), and a version of the more standard scaled projected gradient algorithm (SGP) that involves the 1-norm. We performed some experiments and analysed the performance of the two methods, comparing relative errors, numbers of iterations, running times and the quality of the reconstructed images. In conclusion, we found that the 1-norm and Total Variation are valid tools in the formulation of the minimization problem for image reconstruction in Digital Tomosynthesis, and that the new ADM algorithm reached a relative error comparable to that of a version of the classic SGP algorithm while proving superior in speed and in the early appearance of the structures representing the masses.
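
In symbols, with A the tomosynthesis projection operator, b the measured projection data, and λ > 0 a regularization parameter (our schematic notation), the model replaces the usual least-squares fit with a 1-norm fit:

```latex
\[
\min_{x}\ \lVert A x - b \rVert_{1} + \lambda\,\mathrm{TV}(x)
\qquad\text{instead of}\qquad
\min_{x}\ \tfrac{1}{2}\,\lVert A x - b \rVert_{2}^{2} + \lambda\,\mathrm{TV}(x).
\]
```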


Abstract:

Concerns over rising healthcare costs and the ever-increasing desire to improve surgical outcomes have motivated the development of a new robotic-assisted surgical procedure for the implantation of artificial hearing devices (AHDs). This paper describes our efforts to enable minimally invasive, cost-effective surgery for the implantation of AHDs. We approach this problem with the fundamental goal of reducing errors in every component of the surgical workflow, from imaging and trajectory planning to patient tracking and robot development. These efforts were successful in reducing overall system error to a previously unattained level.