25 results for Two-Level Optimization
Abstract:
Dissertation to obtain the degree of Doctor of Philosophy in Biomedical Engineering
Abstract:
Dissertation submitted to obtain the degree of Master in Environmental Engineering, Environmental Systems Engineering profile
Abstract:
Optimization is concerned with finding the best possible value of an objective function. Continuous optimization is optimization over real intervals. There are many global and local search techniques. Global search techniques try to find the global optimum of the optimization problem, whereas local search techniques, which are more widely used, try to find a locally optimal solution within a region of the search space. In Continuous Constraint Satisfaction Problems (CCSPs), constraints are viewed as relations between variables, and the computations are supported by interval analysis. The continuous constraint programming framework provides branch-and-prune algorithms for covering the solution sets of the constraints with sets of interval boxes, which are Cartesian products of intervals. These algorithms begin with an initial crude cover of the feasible space (the Cartesian product of the initial variable domains), which is recursively refined by interleaving pruning and branching steps until a stopping criterion is satisfied. In this work, we seek a convenient way to combine the advantages of CCSP branch-and-prune with the local search techniques of global optimization, applied locally over each pruned branch of the CCSP. We apply local search techniques of continuous optimization over the pruned boxes output by the CCSP techniques. We mainly use the steepest descent technique with different characteristics, such as penalty calculation and step length. We implement two main local search algorithms. We use "Procure", a constraint reasoning and global optimization framework, to implement our techniques, and we present our results over a set of benchmarks.
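The interplay described above can be sketched in a few lines: a branch-and-prune loop that covers the feasible space with interval boxes, followed by a steepest descent search clipped to each pruned box. This is an illustration only, not the Procure framework's API; the `prune` and `small_enough` callbacks and the fixed step-length policy are hypothetical assumptions.

```python
# Minimal sketch of branch-and-prune over interval boxes, plus a projected
# steepest-descent local search inside a box. Boxes are lists of (lo, hi)
# tuples, one interval per variable.

def branch(box):
    """Split the box along its widest interval (bisection)."""
    i = max(range(len(box)), key=lambda k: box[k][1] - box[k][0])
    lo, hi = box[i]
    mid = (lo + hi) / 2.0
    left, right = list(box), list(box)
    left[i], right[i] = (lo, mid), (mid, hi)
    return left, right

def branch_and_prune(box, prune, small_enough, boxes=None):
    """Cover the feasible space with boxes, refining until each is small."""
    if boxes is None:
        boxes = []
    box = prune(box)                  # contract the box using the constraints
    if box is None:                   # pruned away: no solutions inside
        return boxes
    if small_enough(box):             # stopping criterion for this branch
        boxes.append(box)
        return boxes
    for child in branch(box):
        branch_and_prune(child, prune, small_enough, boxes)
    return boxes

def steepest_descent(f, grad, box, x, step=0.1, iters=200):
    """Gradient descent with each step clipped to the box bounds."""
    for _ in range(iters):
        g = grad(x)
        x = [min(max(xi - step * gi, lo), hi)
             for xi, gi, (lo, hi) in zip(x, g, box)]
    return x
```

With a trivial `prune` that leaves the box unchanged, `branch_and_prune` simply bisects until every box is small; in a real CCSP solver the pruning step contracts or discards boxes using interval arithmetic over the constraints.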
Abstract:
Based on the report for the unit “Foresight Methods Analysis” of the PhD programme on Technology Assessment at the Universidade Nova de Lisboa, under the supervision of Prof. Dr. António B. Moniz
Abstract:
Following the Introduction, which surveys the existing literature on technology advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, characterized by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures, and find new solutions at the level of economic regulation that promote social welfare. In Chapter 1 we analyze a regulatory issue on access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach: the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare compared to fixed access pricing or a regulatory holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency. In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks. Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., the ad quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare by regulating the volume of advertising on TV might need to also regulate advertising quality or, if regulating quality proves impractical, take the effect of advertising quality into consideration. In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs). In EPNs, the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. In this chapter, we analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchants' transactional benefits of accepting cards to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN will be worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in improving retail price efficiency for cardholders is also highlighted.
Abstract:
Phosphorus (P) is becoming a scarce element due to the decreasing availability of primary sources. Therefore, recovering P from secondary sources, e.g. waste streams, has become extremely important. Sewage sludge ash (SSA) is a reliable secondary source of P. The use of SSAs as a direct fertilizer is heavily restricted by legislation due to the presence of inorganic contaminants. Furthermore, the P present in SSAs is not in a plant-available form. The electrodialytic (ED) process is one of the methods under development to recover P and simultaneously remove heavy metals. The present work aimed to optimize P recovery using a two-compartment electrodialytic cell. The research was divided into three independent phases. In the first phase, ED experiments were carried out for two SSAs from different seasons, varying the duration of the ED process (2, 4, 6 and 9 days). During the ED treatment the SSA was suspended in distilled water in the anolyte, which was separated from the catholyte by a cation exchange membrane. From both ashes, 90% of the P was successfully extracted after 6 days of treatment. Regarding heavy metal removal, one of the SSAs performed better than the other. It was therefore possible to conclude that SSAs from different seasons can be submitted to the ED process under the same parameters. In the second phase, the two SSAs were exposed to humidity and air prior to ED in order to carbonate them. Although the carbonation was not successful, ED experiments were carried out varying the duration of the treatment (2 and 6 days) and the period of air exposure to which the SSAs were submitted (7, 14 and 30 days). After 6 days of treatment and 30 days of air exposure, 90% of the phosphorus was successfully extracted from both ashes. No differences were identified between carbonated and non-carbonated SSAs. Thus, SSAs that were exposed to air and humidity, e.g. SSAs stored for 30 days in an open deposit, can be treated under the same parameters as SSAs collected directly from the incineration process. In the third phase, ED experiments were carried out for 6 days varying the stirring time (0, 1, 2 and 4 h/day) in order to investigate whether energy can be saved in the stirring process. After 6 days of treatment with 4 h/day of stirring, 80% and 90% of the P was successfully extracted from SSA-A and SSA-B, respectively. These values are very similar to those obtained for 6 days of treatment with 24 h/day of stirring.
Abstract:
The present PhD thesis develops the cell functional enviromics (CFE) method to investigate the relationship between environment and cellular physiology. CFE may be defined as the envirome-wide reconstruction of cellular function through the collection and systems-level analysis of dynamic envirome data. Throughout the thesis, CFE is illustrated by two main applications to cultures of a constitutive P. pastoris X33 strain expressing an scFv antibody fragment. The first application addresses the challenge of culture media development. A dataset was built from 26 shake flask experiments, with variations in trace element concentrations and basal medium dilution based on the standard BSM+PTM1. Protein yield showed high sensitivity to culture medium variations, while biomass was essentially determined by BSM dilution. High scFv yield was associated with high overall metabolic fluxes through central carbon pathways, concomitantly with a relative shift of carbon flux from biosynthetic towards energy-generating pathways. CFE identified three cellular functions (growth, energy generation and by-product formation) that together described 98.8% of the variance in the observed fluxes. Analysis of how medium factors relate to the identified cellular functions showed that iron and manganese at concentrations close to PTM1 inhibit overall metabolic activity. The second application addresses bioreactor operation. Pilot 50 L fed-batch cultivations, followed by 1H-NMR exometabolite profiling, allowed the acquisition of data for 21 environmental factors over time. CFE identified five major metabolic pathway groups that are frequently activated by the environment. The resulting functional enviromics map may serve as a template for future optimization of media composition and feeding strategies for Pichia pastoris. The present PhD thesis is a step towards establishing the foundations of CFE, which is still in its infancy. The methods developed herein are a contribution to changing the culture media and process development paradigm towards a holistic and systematic discipline.
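The idea of a few cellular functions explaining most of the flux variance (98.8% by three functions, in the first application) is a variance decomposition. As a hedged illustration, not the thesis's actual method, the fraction of variance captured by the top components of a flux matrix can be computed with plain PCA via the singular value decomposition:

```python
# Illustrative variance decomposition of a fluxes-by-experiments dataset.
# This is ordinary PCA via SVD; the decomposition used in the thesis to
# extract cellular functions may differ.
import numpy as np

def explained_variance(flux_matrix, n_components):
    """Fraction of total variance captured by the top n principal components.

    flux_matrix: experiments x fluxes (rows = culture conditions).
    """
    X = flux_matrix - flux_matrix.mean(axis=0)  # center each flux
    s = np.linalg.svd(X, compute_uv=False)      # singular values
    var = s ** 2                                # variance per component
    return var[:n_components].sum() / var.sum()
```

On a dataset whose fluxes are driven by a small number of underlying functions, a few components suffice to explain nearly all of the variance, which is the kind of result the abstract reports.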
Abstract:
Despite the extensive literature on finding new models to replace the Markowitz model, or on increasing the accuracy of its input estimations, there are fewer studies on how the choice of optimization algorithm affects the results. This paper aims to add to this field by comparing the performance of two optimization algorithms in drawing the Markowitz Efficient Frontier and in real-world investment strategies. Second-order cone programming is a faster algorithm and appears to be more efficient, but it is impossible to assert which algorithm is better: quadratic programming often shows superior performance in real investment strategies.
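The underlying problem both algorithms solve is the Markowitz quadratic program: minimize portfolio variance w'Σw subject to a target return w'μ = r and full investment w'1 = 1. A minimal sketch, assuming short sales are allowed so the equality-constrained QP reduces to a linear KKT system (the paper's solvers would handle richer constraint sets):

```python
# Equality-constrained Markowitz QP solved via its KKT system.
import numpy as np

def min_variance_weights(mu, cov, target_return):
    """Weights of the minimum-variance portfolio with expected return r."""
    n = len(mu)
    ones = np.ones(n)
    # KKT system: [2*Sigma  A'; A  0] [w; lambda] = [0; b]
    kkt = np.zeros((n + 2, n + 2))
    kkt[:n, :n] = 2.0 * cov
    kkt[:n, n] = mu
    kkt[n, :n] = mu
    kkt[:n, n + 1] = ones
    kkt[n + 1, :n] = ones
    rhs = np.concatenate([np.zeros(n), [target_return, 1.0]])
    return np.linalg.solve(kkt, rhs)[:n]

def efficient_frontier(mu, cov, target_returns):
    """(risk, return) pairs tracing the frontier over the given targets."""
    pts = []
    for r in target_returns:
        w = min_variance_weights(mu, cov, r)
        pts.append((np.sqrt(w @ cov @ w), r))
    return pts
```

Sweeping `target_returns` and plotting risk against return traces the efficient frontier; with inequality constraints (e.g. no short sales) a QP or SOCP solver replaces the direct linear solve.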
Abstract:
Previous research demonstrated that the sequence of informational cues and the level of distraction have an impact on the judgment of a product's quality. This study investigates the influence of the force behind the processing of these cues: working memory (WM). The results indicate that without distraction, consumers with low and high WM capacity (WMC) base their product evaluation equally on the first sequential cue. In the presence of a distractor, however, low-WMC individuals are no longer able to recall the initial cue and thus derive their product judgment from the final cue. Moreover, evidence of intercultural differences in the perception of product-related cues, and in their aptitude for signaling favorable product quality, is provided.
Abstract:
Interest in using information to improve the quality of living in large urban areas and the efficiency of their governance has been around for decades. Nevertheless, the improvements in Information and Communications Technology have sparked a new dynamic in academic research, usually under the umbrella term of Smart Cities. This concept of Smart City can probably be translated, in a simplified version, into cities that are lived in, managed and developed in an information-saturated environment. While it makes perfect sense and we can easily foresee the benefits of such a concept, there are still several significant challenges to tackle before this vision can be materialized. In this work we aim to provide a small contribution in this direction, one that maximizes the relevance of the available information resources. One of the most detailed and geographically relevant information resources available for the study of cities is the census, more specifically the data available at block level (Subsecção Estatística). In this work, we use Self-Organizing Maps (SOM) and the Geo-SOM variant to explore the block-level data from the Portuguese census of Lisbon for the years 2001 and 2011. We focus on gauging change, proposing ways to compare the two time periods, which have two different underlying geographical bases. We proceed with the analysis of the data using different SOM variants, aiming at a two-fold portrait: one of the evolution of Lisbon during the first decade of the 21st century, and another of how the census dataset and SOMs can be used to produce an informational framework for the study of cities.
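The core SOM procedure used in analyses like the one above can be sketched briefly: each data row (e.g. a census block's attribute vector) is matched to its best-matching unit on a grid, and that unit and its grid neighbours are nudged towards the data point. This is a plain SOM, not the Geo-SOM variant; the grid size, learning rate and neighbourhood radius below are illustrative assumptions.

```python
# Minimal Self-Organizing Map training loop (batchless, Gaussian
# neighbourhood, linearly decaying learning rate and radius).
import numpy as np

def train_som(data, grid_w=5, grid_h=5, epochs=20, lr0=0.5, radius0=2.0,
              seed=0):
    """Train a SOM on the rows of `data`; returns the codebook
    (grid_h * grid_w, n_features)."""
    rng = np.random.default_rng(seed)
    codebook = rng.normal(size=(grid_h * grid_w, data.shape[1]))
    # Grid coordinates of each unit, used by the neighbourhood function.
    coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)])
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1.0 - t / n_steps)            # decaying learning rate
            radius = radius0 * (1.0 - t / n_steps) + 0.5
            bmu = np.argmin(((codebook - x) ** 2).sum(axis=1))  # best unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2.0 * radius ** 2))     # Gaussian neighbourhood
            codebook += lr * h[:, None] * (x - codebook)
            t += 1
    return codebook
```

After training, mapping each block to its best-matching unit yields the cluster structure; comparing the units hit by the 2001 and 2011 records is one way to gauge change between the two censuses. The Geo-SOM variant additionally constrains the best-matching-unit search using the records' geographic coordinates.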