Abstract:
Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one hand, many practical optimization problems arising from real-world applications (e.g., scheduling, project planning, transportation, telecommunications, economics and finance, timetabling) can be easily and effectively formulated as Mixed Integer linear Programs (MIPs). On the other hand, more than 50 years of intensive research has dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open and not fully understood, and the mixed integer programming community remains very active in trying to answer them. As a consequence, a huge number of papers are continuously published and new intriguing questions arise every year. When dealing with MIPs, we have to distinguish between two different scenarios. The first occurs when we are asked to handle a general MIP and cannot assume any special structure for the given problem. In this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are "forced" to use general purpose techniques. The second occurs when mixed integer programming is used to address a somehow structured problem. In this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special purpose techniques. This thesis tries to give some insights into both of the above mentioned situations. The first part of the work is focused on general purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers.
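As a concrete illustration of what a general purpose cutting plane looks like, the following minimal sketch derives the classical Chvátal-Gomory fractional cut from a simplex tableau row (a simpler relative of the mixed integer cuts studied in the thesis; the function names are ours, not the thesis's code):

```python
import math

def gomory_fractional_cut(row_coeffs, rhs):
    """Given a simplex tableau row  x_B + sum_j a_j * x_j = b  in which all
    variables are integer and the nonbasic x_j are >= 0, return the valid
    inequality  sum_j frac(a_j) * x_j >= frac(b),  which cuts off the
    current fractional vertex whenever frac(b) > 0."""
    frac = lambda v: v - math.floor(v)
    return [frac(a) for a in row_coeffs], frac(rhs)
```

For example, the row x1 + 0.5 x2 + 1.25 x3 - 0.3 x4 = 3.7 yields the cut 0.5 x2 + 0.25 x3 + 0.7 x4 >= 0.7.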
Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature on disjunctive cuts and their connections with Gomory mixed integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density and cut strength. We give a theoretical characterization of weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extremal solutions. Further, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas on multiple-row cuts. Very recently, a series of papers has drawn attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling important results discovered more than 40 years ago. However, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress and simply presents a possible way of generating two-row cuts from lattice-free triangles in the simplex tableau, together with some preliminary computational results. The second part of the thesis is instead focused on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems arising in routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs).
The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new improved solution) in which a class of exponential neighborhoods is iteratively explored by heuristically solving an integer programming formulation through a general purpose MIP solver. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well-known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the one that has proven extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (in particular, the use of general purpose cutting planes) can improve on the branch-and-cut methods proposed in the literature.
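The destroy-and-repair paradigm described above can be sketched in a few lines. The following toy version works on a 0-1 knapsack instead of a VRP, with a greedy refill standing in for the MIP-based repair step; it only illustrates the overall loop, and every name in it is ours, not the thesis's code:

```python
import random

def knapsack_value(x, values):
    # Total value of the items currently selected in the 0-1 vector x.
    return sum(v for v, xi in zip(values, x) if xi)

def repair(x, free_idx, values, weights, capacity):
    # Stand-in for the sub-MIP solve: greedily refill the freed positions
    # by value/weight ratio while respecting the capacity.
    used = sum(w for w, xi in zip(weights, x) if xi)
    for i in sorted(free_idx, key=lambda i: values[i] / weights[i], reverse=True):
        if used + weights[i] <= capacity:
            x[i] = 1
            used += weights[i]
    return x

def destroy_and_repair(values, weights, capacity, iters=200, k=3, seed=0):
    rng = random.Random(seed)
    n = len(values)
    best = [0] * n                    # start from the empty (feasible) solution
    for _ in range(iters):
        x = best[:]
        free_idx = rng.sample(range(n), k)
        for i in free_idx:            # destroy: free k randomly chosen positions
            x[i] = 0
        x = repair(x, free_idx, values, weights, capacity)
        if knapsack_value(x, values) > knapsack_value(best, values):
            best = x[:]               # keep only improving solutions
    return best
```

In the actual scheme the repair step would hand the partially destroyed solution to a general purpose MIP solver, so the neighborhood explored at each iteration is exponentially large rather than greedy.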
Abstract:
The aim of the thesis is to formulate a suitable Item Response Theory (IRT) based model to measure HRQoL (as a latent variable) using a mixed-response questionnaire and relaxing the hypothesis of a normally distributed latent variable. The new model is a combination of two models already presented in the literature, namely a latent trait model for mixed responses and an IRT model for a skew-normal latent variable. It is developed in a Bayesian framework; a Markov chain Monte Carlo procedure is used to generate samples from the posterior distribution of the parameters of interest. The proposed model is tested on a questionnaire composed of five discrete items and one continuous item measuring HRQoL in children, the EQ-5D-Y questionnaire, using a large sample of children collected in schools. Compared with a model for discrete responses only and a model for mixed responses with a normal latent variable, the new model performs better in terms of deviance information criterion (DIC), chain convergence times and precision of the estimates.
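A minimal sketch of the kind of MCMC update such a Bayesian IRT model relies on is a random-walk Metropolis sampler. The version below samples a single latent trait under a simple Rasch likelihood with a skew-normal prior; it is a drastically simplified stand-in for the thesis's actual model, and all function names and parameter values are illustrative:

```python
import math
import random

def skew_normal_logpdf(x, loc=0.0, scale=1.0, alpha=0.0):
    # log of the skew-normal density 2/scale * phi(z) * Phi(alpha * z).
    z = (x - loc) / scale
    log_phi = -0.5 * z * z - 0.5 * math.log(2 * math.pi)
    Phi = 0.5 * (1 + math.erf(alpha * z / math.sqrt(2)))
    return math.log(2) + log_phi - math.log(scale) + math.log(max(Phi, 1e-300))

def rasch_loglik(theta, responses, difficulties):
    # Rasch (1PL) log-likelihood for binary responses y_i with difficulties b_i.
    ll = 0.0
    for y, b in zip(responses, difficulties):
        p = 1 / (1 + math.exp(-(theta - b)))
        ll += math.log(p if y else 1 - p)
    return ll

def metropolis_theta(responses, difficulties, n_iter=2000, step=0.5, alpha=2.0, seed=1):
    # Random-walk Metropolis targeting p(theta | responses) with a
    # skew-normal prior (skewness parameter alpha).
    rng = random.Random(seed)
    theta = 0.0
    samples = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0, step)
        log_ratio = (rasch_loglik(prop, responses, difficulties)
                     + skew_normal_logpdf(prop, alpha=alpha)
                     - rasch_loglik(theta, responses, difficulties)
                     - skew_normal_logpdf(theta, alpha=alpha))
        if rng.random() < math.exp(min(0.0, log_ratio)):
            theta = prop
        samples.append(theta)
    return samples
```

The thesis's model handles mixed discrete/continuous items and many respondents jointly, but each parameter is updated with essentially this accept/reject mechanism.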
Abstract:
In this thesis we provide a basic MIP model and several variants of it, built in order to understand its behaviour and possibly improve its efficiency. The different variants were constructed by acting in particular on the definition of some constraints, on the bounds of the variables, or by forcing the solver to focus on certain decisions or specific variables. Some of the typical problems from the literature were tested, and the various results were appropriately evaluated and compared. Among the references for this comparison, we also considered the results obtainable with a Constraint Programming model, which is well known to produce remarkable results in scheduling. A further aim of the thesis is, in fact, to compare the Mathematical Programming and Constraint Programming approaches, identifying their strengths and weaknesses and testing their transferability to the compared model.
Abstract:
Piezoelectrics present an interactive electromechanical behaviour that, especially in recent years, has generated much interest, since it makes these materials suitable for use in a variety of electronic and industrial applications such as sensors, actuators, transducers and smart structures. Both mechanical and electric loads are generally applied to these devices and can cause high concentrations of stress, particularly in proximity to defects or inhomogeneities such as flaws, cavities or included particles. A thorough understanding of their fracture behaviour is crucial in order to improve their performance and avoid unexpected failures; therefore, a considerable number of research works have addressed this topic in recent decades. Most theoretical studies on the subject find their analytical background in the complex variable formulation of plane anisotropic elasticity. This approach has its origins in the pioneering works of Muskhelishvili and Lekhnitskii, who obtained the solution of the elastic problem in terms of independent analytic functions of complex variables. In the present work, the expressions of stresses and elastic and electric displacements are obtained as functions of complex potentials through an analytical formulation that applies to the piezoelectric static case an approach introduced for orthotropic materials to solve elastodynamic problems. This method can be considered an alternative to other formalisms currently in use, such as Stroh's formalism. The equilibrium equations are reduced to a first order system involving a six-dimensional vector field. A similarity transformation is then applied to obtain three independent Cauchy-Riemann systems, justifying the introduction of the complex variable notation. Closed form expressions of near-tip stress and displacement fields are thereby obtained.
In the theoretical study of cracked piezoelectric bodies, the issue of assigning consistent electric boundary conditions on the crack faces is of central importance and has been addressed by many researchers. Three different boundary conditions are commonly accepted in the literature: the permeable, the impermeable and the semipermeable ("exact") crack model. This thesis takes all three models into consideration, comparing the results obtained and analysing the effects of the choice of boundary condition on the solution. The influence of load biaxiality and of the application of a remote electric field has been studied, pointing out that both can affect, to varying extents, the stress fields and the angle of initial crack extension, especially when non-singular terms are retained in the expressions of the electro-elastic solution. Furthermore, two different fracture criteria are applied to the piezoelectric case, and their outcomes are compared and discussed. The work is organized as follows: Chapter 1 briefly introduces the fundamental concepts of Fracture Mechanics. Chapter 2 describes plane elasticity formalisms for an anisotropic continuum (Eshelby-Read-Shockley and Stroh) and introduces, for the simplified orthotropic case, the alternative formalism we propose. Chapter 3 outlines the Linear Theory of Piezoelectricity, its basic relations and electro-elastic equations. Chapter 4 introduces the proposed method for obtaining the expressions of stresses and elastic and electric displacements, given as functions of complex potentials. The solution is obtained in closed form, and non-singular terms are retained as well. Chapter 5 presents several numerical applications aimed at estimating the effects of load biaxiality, the electric field, and the assumed permittivity of the crack.
Through the application of fracture criteria, the influence of the conditions listed above on the response of the system, and in particular on the direction of crack branching, is thoroughly discussed.
Abstract:
The future hydrogen demand is expected to increase, both in existing industries (including the upgrading of fossil fuels and ammonia production) and in new technologies such as fuel cells. Nowadays, hydrogen is obtained predominantly by steam reforming of methane, but it is well known that hydrocarbon-based routes cause environmental problems, and the market also depends on the availability of this finite resource, which is undergoing rapid depletion. Therefore, alternative processes using renewable sources such as wind, solar energy and biomass are now being considered for the production of hydrogen. One such alternative is the so-called "steam-iron process", which consists in the reduction of a metal oxide by a hydrogen-containing feedstock, such as ethanol, after which the reduced material is reoxidized with water to produce "clean" hydrogen (water splitting). This kind of thermochemical cycle has been studied before, but some recent developments, such as more active catalysts, the flexibility of the feedstock (including renewable bio-alcohols) and the fact that hydrogen purification could be avoided, have significantly increased interest in this research topic. To increase our understanding of the reactions that govern the steam-iron route to hydrogen, it is necessary to go down to the molecular level. Spectroscopic methods are an important tool for extracting information that could help in the development of more efficient materials and processes. In this research, ethanol was chosen as the reducing fuel, and the main goal was to study its interaction with different catalysts of similar structure (spinels), in order to correlate their composition with the mechanism of the anaerobic oxidation of ethanol, the first step of the steam-iron cycle.
To accomplish this, diffuse reflectance infrared spectroscopy (DRIFTS) was used to study the surface composition of the catalysts during the adsorption of ethanol and its transformation during the temperature program, while mass spectrometry was used to monitor the desorbed products. The set of studied materials includes Cu, Co and Ni ferrites, which were also characterized by means of X-ray diffraction, surface area measurements, Raman spectroscopy, and temperature programmed reduction.
Abstract:
The present work, then, is concerned with the forgotten elements of the Lebanese economy: agriculture and rural development. It investigates the main problems arising from these forgotten components, in particular the structure of the agricultural sector, production technology, income distribution, poverty, food security, territorial development and local livelihood strategies. It does so using quantitative Computable General Equilibrium (CGE) modeling and a qualitative phenomenological case study analysis, both embedded in a critical review of the historical development of the political economy of Lebanon and a structural analysis of its economy. The research shows that under-development in Lebanese rural areas is not due to a lack of resources but is rather the consequence of political choices. It further suggests that agriculture, in both its mainstream conventional and its innovative locally initiated forms of production, still represents important potential for inducing economic growth and development. To realize this potential, Lebanon has to take full advantage of its human and territorial capital by developing a rural development strategy based on two parallel sets of actions: one directed toward the support of local rural development initiatives, and the other toward intensive forms of production. In addition to its economic returns, such a strategy would promote social and political stability.
Abstract:
In this work we develop and analyze an adaptive numerical scheme for simulating a class of macroscopic semiconductor models. First, the numerical modelling of semiconductors is reviewed in order to classify the Energy-Transport models that are later simulated in 2D. In this class of models, the flow of charged particles (negatively charged electrons and so-called holes, quasi-particles of positive charge) as well as their energy distributions are described by a coupled system of nonlinear partial differential equations. A considerable difficulty in simulating these convection-dominated equations is posed by the nonlinear coupling, as well as by the fact that local phenomena such as "hot electron effects" are only partially assessable through the given data. The primary variables used in the simulations are the particle density and the particle energy density. The user of these simulations is mostly interested in the current flow through parts of the domain boundary, the contacts. The numerical method considered here uses mixed finite elements as trial functions for the discrete solution. From the user's perspective, the continuity of the discrete normal fluxes is the most important property of this discretization. It is proven that, under certain assumptions on the triangulation, the particle density remains positive in the iterative solution algorithm. Connected to this result, an a priori error estimate for the discrete solution of linear convection-diffusion equations is derived. The local charge transport phenomena are resolved by an adaptive algorithm based on a posteriori error estimators, and a comparison of different estimators is performed. Additionally, a method to effectively estimate the error in local quantities derived from the solution, so-called "functional outputs", is developed by transferring the dual weighted residual method to mixed finite elements.
For a model problem we show how this method can deliver promising results even when standard error estimators completely fail to reduce the error in an iterative mesh refinement process.
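The adaptive cycle (estimate a local error, mark the worst elements, refine, repeat) can be sketched in one dimension. The toy loop below uses the midpoint deviation from the linear interpolant as a stand-in for an a posteriori error estimator; it is not the mixed finite-element estimator of the thesis, and all names are ours:

```python
def adaptive_refine(f, a, b, tol=1e-3, max_iter=30):
    # 1D mesh adaptation: on each element, estimate the interpolation
    # error by how far f at the midpoint deviates from the linear
    # interpolant, then bisect the elements carrying the largest errors.
    mesh = [a, b]
    for _ in range(max_iter):
        errs = []
        for x0, x1 in zip(mesh, mesh[1:]):
            xm = 0.5 * (x0 + x1)
            errs.append(abs(f(xm) - 0.5 * (f(x0) + f(x1))))
        if max(errs) < tol:
            break                       # estimated error everywhere below tol
        thresh = 0.5 * max(errs)        # mark: refine elements near the max
        new_mesh = [mesh[0]]
        for (x0, x1), e in zip(zip(mesh, mesh[1:]), errs):
            if e >= thresh:
                new_mesh.append(0.5 * (x0 + x1))
            new_mesh.append(x1)
        mesh = new_mesh
    return mesh
```

Run on a function with a sharp local feature, the loop concentrates mesh points around that feature, exactly the behaviour needed to resolve local phenomena such as hot electron effects without refining the whole domain.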
Abstract:
Mixed tethered bilayer lipid membranes (tBLMs) are described, based on the self-assembly on template-stripped gold of a monolayer of an archaea-analogue thiolipid, 2,3-di-O-phytanyl-sn-glycerol-1-tetraethylene glycol-D,L-α-lipoic acid ester lipid (DPTL), and a newly designed dilution molecule, tetraethylene glycol-D,L-α-lipoic acid ester (TEGL). The rationale for using a spacer and adding dilution molecules between the substrate and the bilayer is that this architecture provides an ionic reservoir underneath the membrane, avoiding direct contact of the embedded membrane proteins with the gold electrode and increasing the lateral diffusion of the bilayer, thus allowing for the incorporation of complex channel proteins that fail to incorporate in non-diluted systems. The tBLM is completed by fusion of liposomes made from a mixture of 1,2-diphytanoyl-sn-glycero-3-phosphocholine (DPhyPC), cholesterol, and 1,2-diphytanoyl-sn-glycero-3-phosphate (DPhyPG) in a molar ratio of 6:3:1. By varying the mixing ratio, the optimum was found at a DPTL:TEGL ratio of 90%:10%. Only under these conditions did the mixed tBLM show electrical properties, as measured by EIS, comparable to those of a BLM; with higher dilution factors, a defect-free lipid bilayer was not formed. The formation of the bilayers was characterized by different techniques, such as surface plasmon resonance (SPR), electrochemical impedance spectroscopy (EIS), atomic force microscopy (AFM), and quartz crystal microbalance (QCM). Different proteins such as hemolysin, melittin, gramicidin, M2, Maxi-K, nAChR and bacteriorhodopsin were incorporated into these tBLMs, as shown by SPR and EIS studies. Ionic conductivities at 0 V vs. Ag|AgCl, 3 M KCl were measured by EIS. Our results indicate that these proteins have been successfully incorporated into a very stable tBLM environment in a functionally active form.
Therefore, we conclude that the mixed tBLMs have been successfully designed as a general platform for biosensing and screening purposes of membrane proteins.
Abstract:
In various imaging problems the task is to use the Cauchy data of the solutions to an elliptic boundary value problem to reconstruct the coefficients of the corresponding partial differential equation. Often the examined object has known background properties but is contaminated by inhomogeneities that cause perturbations of the coefficient functions. The factorization method of Kirsch provides a tool for locating such inclusions. In this paper, the factorization technique is studied in the framework of coercive elliptic partial differential equations of the divergence type. Earlier, it has been demonstrated that the factorization algorithm can reconstruct the support of a strictly positive (or negative) definite perturbation of the leading order coefficient or, if that remains unperturbed, the support of a strictly positive (or negative) perturbation of the zeroth order coefficient. In this work we show that these two types of inhomogeneities can, in fact, be located simultaneously. Unlike in earlier articles on the factorization method, our inclusions may have disconnected complements, and we also weaken some other a priori assumptions of the method. Our theoretical findings are complemented by two-dimensional numerical experiments presented in the framework of the diffusion approximation of optical tomography.
Abstract:
Due to the high price of natural oil and the harmful effects of its use, such as the increase in greenhouse gas emissions, industry has focused on the search for sustainable raw materials for the production of chemicals. Ethanol, produced by the fermentation of sugars, is one of the most interesting renewable materials for chemical manufacturing, and there are numerous applications for its conversion into commodity chemicals. In particular, the production of 1,3-butadiene from ethanol using multifunctional catalysts is attractive: with 25% of world rubber manufacturers utilizing 1,3-butadiene, there is an exigent need for its sustainable production. In this research, the one-step conversion of ethanol to 1,3-butadiene was studied. According to the literature, the mechanisms proposed to explain how ethanol transforms into butadiene require both acid and basic sites, but there is still much debate on this topic. Thus, the aim of this work is a better understanding of the reaction pathways, with all the possible intermediates and products that lead to the formation of butadiene from ethanol. Of particular interest are catalysts based on different Mg/Si ratios, compared with bare magnesia and silica oxides, with the goal of identifying a good combination of acid/basic sites for the adsorption and conversion of ethanol. Spectroscopic techniques are important for extracting information that could help in understanding these processes at the molecular level. Diffuse reflectance infrared spectroscopy coupled to mass spectrometry (DRIFT-MS) was used to study the surface composition of the catalysts during the adsorption of ethanol and its transformation during the temperature program, while mass spectrometry was used to monitor the desorbed products.
The set of studied materials includes MgO, Mg/Si=0.1, Mg/Si=2, Mg/Si=3, Mg/Si=9 and SiO2, which were also characterized by means of surface area measurements.
Abstract:
This PhD thesis is embedded in the Arctic Study of Tropospheric Aerosol, Clouds and Radiation (ASTAR) and investigates the radiative transfer through Arctic boundary-layer mixed-phase (ABM) clouds. For this purpose, airborne spectral solar radiation measurements and simulations of the solar and thermal infrared radiative transfer have been performed. This work reports on measurements with the Spectral Modular Airborne Radiation measurement sysTem (SMART-Albedometer) conducted in the framework of ASTAR in April 2007 close to Svalbard. For ASTAR, the SMART-Albedometer was extended to measure spectral radiance; the development and calibration of the radiance measurements are described in this work. ABM clouds were sampled in combination with in situ measurements of cloud particle properties provided by the Laboratoire de Météorologie Physique (LaMP) and simultaneous airborne lidar measurements by the Alfred Wegener Institute for Polar and Marine Research (AWI). The SMART-Albedometer measurements were used to retrieve the cloud thermodynamic phase by three different approaches, and a comparison of these results with the in situ and lidar measurements is presented in two case studies. Besides the dominant mixed-phase clouds, pure ice clouds were found in cloud gaps and at the edge of a large cloud field. Furthermore, the vertical distribution of ice crystals within ABM clouds was investigated; it was found that ice crystals at cloud top are necessary to explain the observed SMART-Albedometer measurements. The impact of ice crystals on the radiative forcing of ABM clouds is investigated by extensive radiative transfer simulations. The solar and net radiative forcing was found to depend on the ice crystal size and shape and on the mixing ratio of ice crystals and liquid water droplets.
Abstract:
The cone penetration test (CPT), together with its recent variant, the piezocone test (CPTU), has become the most widely used in-situ testing technique for soil profiling and geotechnical characterization. The knowledge gained over the last decades on interpretation procedures in sands and clays is certainly wide, whilst very few contributions can be found regarding the analysis of CPT(u) data in intermediate soils. Indeed, it is widely accepted that at the standard rate of penetration (v = 20 mm/s), drained penetration occurs in sands while undrained penetration occurs in clays. However, a problem arises when the available interpretation approaches are applied to cone measurements in silts, sandy silts, and silty or clayey sands, since such intermediate geomaterials are often characterized by permeability values within the range in which partial drainage is very likely to occur. Hence, applying the well-established interpretation procedures developed for 'standard' clays and sands may result in invalid estimates of soil parameters. This study aims at providing a better understanding of the interpretation of CPTU data in natural sand and silt mixtures, by taking into account two main aspects: 1) investigating the effect of penetration rate on piezocone measurements, with the aim of identifying drainage conditions when cone penetration is performed at the standard rate; this part of the thesis refers to a specific CPTU database recently collected in a liquefaction-prone area (Emilia-Romagna Region, Italy); 2) providing better insight into the interpretation of piezocone tests in the widely studied silty sediments of the Venetian lagoon (Italy), focusing on the calibration and verification of site-specific correlations, with special reference to the estimate of compressibility parameters for the assessment of long-term settlements of the Venetian coastal defences.
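The standard normalized piezocone parameters underlying such interpretation procedures, together with the dimensionless penetration velocity commonly used to judge drainage conditions, can be computed as in the following sketch (units must simply be consistent; the function and variable names are ours):

```python
def normalized_cone_resistance(qt, sigma_v0, sigma_v0_eff):
    """Q_t = (q_t - sigma_v0) / sigma'_v0, with q_t the corrected cone
    resistance and sigma_v0, sigma'_v0 the total and effective vertical
    overburden stresses."""
    return (qt - sigma_v0) / sigma_v0_eff

def pore_pressure_ratio(u2, u0, qt, sigma_v0):
    """B_q = (u2 - u0) / (q_t - sigma_v0), with u2 the pore pressure
    measured at the cone shoulder and u0 the in-situ (hydrostatic)
    pore pressure."""
    return (u2 - u0) / (qt - sigma_v0)

def normalized_velocity(v, d, cv):
    """V = v * d / c_v, with v the penetration rate, d the cone diameter
    and c_v the coefficient of consolidation. Large V indicates undrained
    penetration, small V drained penetration, and intermediate values the
    partial-drainage regime discussed above."""
    return v * d / cv
```

For example, a standard 35.7 mm cone pushed at 20 mm/s in a silt with c_v = 1e-5 m²/s gives V = 0.02 * 0.0357 / 1e-5 ≈ 71, i.e. an intermediate-to-undrained regime, which is exactly why rate effects matter in these soils.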
Abstract:
The project of this Ph.D. thesis is based on a co-supervised collaboration between Università di Bologna, ALMA MATER STUDIORUM (Italy) and the Instituto de Tecnología Química, Universitat Politècnica de València ITQ-UPV (Spain). The thesis concerns the synthesis, characterization and catalytic testing of complex mixed-oxide catalysts, mainly belonging to the family of Hexagonal Tungsten Bronzes (HTBs). These materials have been little explored as catalysts, although they have great potential as multifunctional materials: their peculiar acid properties can be coupled to other functionalities (e.g. redox sites) by isomorphous substitution of tungsten atoms with other transition metals such as vanadium, niobium and molybdenum. In this thesis, it is demonstrated how substituted HTBs can be prepared by hydrothermal synthesis; these mixed oxides were fully characterized by a number of physicochemical techniques such as XPS, HR-TEM and XAS. They were also used as catalysts for the one-pot glycerol oxidehydration to acrylic acid, a reaction that might represent a viable chemical route to address the important issue of the co-production of glycerin along the biodiesel production chain. Acrylic acid yields as high as 51% were obtained, and important structure-reactivity correlations were shown to govern the catalytic performance; fine tuning of the acid and redox properties, as well as the in-framework presence of vanadium, is fundamental to achieving noteworthy yields of the acid monomer. The overall results reported herein might represent an important contribution for future applications of HTBs in catalysis, as well as a general guideline for a multifaceted approach to their physicochemical characterization.
Abstract:
This thesis concerns the architectural design of a mixed-use skyscraper in the heart of Dubai. Dubai was chosen as the site precisely because it is a flourishing city in great and continuous expansion. In such a heterogeneous skyline, characterized by imposing skyscrapers, it was possible to design a building of considerable volume and particular shape. Starting from a reference model in the biological field, the Saguaro cactus, inspiration was drawn to create an environment that, despite its imposing size, could convey from within a sense of fluid and continuous space to its users. To achieve this, a fluid, continuous, grooved surface was conceived that wraps the whole structure, creating recesses, overhangs and openings, treating
Abstract:
The level of improvement in the audiological results of Baha(®) users depends mainly on the patient's preoperative hearing thresholds and the type of Baha sound processor used. This investigation shows correlations between the preoperative hearing threshold, the postoperative aided thresholds and the audiological results in speech understanding in quiet of 84 Baha users with unilateral conductive hearing loss, bilateral conductive hearing loss or bilateral mixed hearing loss. Secondly, speech understanding in noise of 26 Baha users with different Baha sound processors (Compact, Divino, and BP100) is investigated. Linear regression between aided sound field thresholds and bone conduction (BC) thresholds of the better ear shows the highest correlation coefficients and the steepest slope. Differences between the better BC thresholds and the aided sound field thresholds are smallest for the mid frequencies (1 and 2 kHz) and become larger at 0.5 and 4 kHz. For Baha users, the gain in speech recognition in quiet can be expected to lie in the order of magnitude of the gain in their hearing threshold. Compared to its predecessor sound processors, the Baha(®) Compact and Baha(®) Divino, the Baha(®) BP100 improves speech understanding in noise significantly, by +0.9 to +4.6 dB signal-to-noise ratio, depending on the setting and the use of the directional microphone. For Baha users with unilateral or bilateral conductive hearing loss and bilateral mixed hearing loss, audiological results in aided sound field thresholds can be estimated from the better BC hearing threshold. The benefit in speech understanding in quiet can be expected to be similar to the gain in the sound field hearing threshold. The most recent Baha sound processor improves speech understanding in noise by an amount that is well perceived by users and can be very useful in everyday life.