973 results for Optimization techniques


Relevance:

30.00%

Publisher:

Abstract:

Geographic information systems (GIS) and artificial intelligence (AI) techniques were used to develop an intelligent snow removal asset management system (SRAMS). The system was evaluated through a case study of snow removal from the roads in Black Hawk County, Iowa, for which the Iowa Department of Transportation (Iowa DOT) is responsible. The SRAMS comprises an expert system, which encodes the logical rules and expertise of the Iowa DOT's snow removal experts in Black Hawk County, and a geographic information system to access and manage road data. The system is implemented on a mid-range PC by integrating MapObjects 2.1 (a GIS package), Visual Rule Studio 2.2 (an AI shell), and Visual Basic 6.0 (a programming tool). It can be used efficiently to generate prioritized snowplowing routes in visual format, to optimize the allocation of assets for plowing, and to track materials (e.g., salt and sand). A test of the system reveals an improvement in snowplowing time of 1.9 percent for moderate snowfall and 9.7 percent for snowstorm conditions over the current manual system.
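
The rule-based prioritization described above can be pictured with a minimal sketch; the road classes, weights, and snowfall threshold below are invented for illustration and are not the Iowa DOT's actual rules.

```python
# Hypothetical sketch of rule-based snow-route prioritization, loosely in the
# spirit of the SRAMS expert system described above. Road classes, weights,
# and thresholds are invented for illustration only.

def route_priority(road_class, traffic_aadt, snowfall_cm):
    """Score a road segment; higher scores are plowed first."""
    class_weight = {"interstate": 3.0, "arterial": 2.0, "local": 1.0}[road_class]
    storm_factor = 2.0 if snowfall_cm > 15 else 1.0  # escalate in snowstorms
    return class_weight * storm_factor * (1 + traffic_aadt / 10_000)

segments = [
    ("US-20", "interstate", 24_000),
    ("Main St", "arterial", 8_000),
    ("Oak Ave", "local", 900),
]
snowfall = 20  # cm, snowstorm conditions
plan = sorted(segments, key=lambda s: route_priority(s[1], s[2], snowfall),
              reverse=True)
for name, *_ in plan:
    print(name)
```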

Relevance:

30.00%

Publisher:

Abstract:

As a result of forensic investigations of problems across Iowa, a research study was developed to provide solutions through better management and optimization of the available pavement geotechnical materials and through ground improvement, soil reinforcement, and other soil treatment techniques. The work was carried out through simple laboratory experiments, such as particle size analysis, plasticity tests, compaction tests, permeability tests, and strength tests. A review of the problems suggested three areas of study: pavement cracking due to improper management of pavement geotechnical materials, permeability of mixed subgrade soils, and settlement of soil above pipes due to improper compaction of the backfill. These were addressed in three corresponding studies: (1) the optimization and management of earthwork materials through general soil mixing of various select and unsuitable soils, with a specific example of optimizing materials in earthwork construction by soil mixing; (2) an investigation of the saturated permeability of compacted glacial till, for validation and prediction with the Enhanced Integrated Climatic Model (EICM); and (3) a field investigation and numerical modeling of culvert settlement. For each area of study, a literature review was conducted, research data were collected and analyzed, and findings and conclusions were drawn. It was found that optimum mixtures of select and unsuitable soils can be defined that allow the use of unsuitable materials in embankment and subgrade locations. An improved model of saturated hydraulic conductivity was proposed for use with glacial soils from Iowa. Proper trench backfill compaction, or the use of flowable mortar, reduces the potential for a bump developing above culverts.
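
A minimal sketch of the mixture-optimization idea, assuming soil properties blend linearly by mass; the property values and specification limits are hypothetical, and a real mix design would rest on the laboratory tests listed above.

```python
# Hypothetical linear-blending sketch: find the largest fraction of an
# unsuitable soil that still keeps the blend's fines content and plasticity
# index within an (invented) embankment specification, assuming properties
# blend linearly by mass.

select = {"fines_pct": 20.0, "pi": 8.0}       # select soil properties
unsuitable = {"fines_pct": 60.0, "pi": 25.0}  # unsuitable soil properties
spec = {"fines_pct": 35.0, "pi": 15.0}        # spec upper limits (invented)

def blend(prop, x):
    """Property of a blend with mass fraction x of unsuitable soil."""
    return (1 - x) * select[prop] + x * unsuitable[prop]

# Largest admissible x for each property under the linear-mixing assumption:
x_max = min((spec[p] - select[p]) / (unsuitable[p] - select[p])
            for p in ("fines_pct", "pi"))
print(f"Up to {100 * x_max:.0f}% unsuitable soil can be used in the blend.")
```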

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, the cleaning of ceramic filter media was studied. The literature part examines mechanisms of fouling and dissolution of iron compounds, as well as methods for cleaning ceramic membranes fouled by iron deposits; cleaning agents and different methods were examined more closely in the experimental part. Pyrite, found in geologic strata, is oxidized to form ferrous ions Fe(II) and ferric ions Fe(III); Fe(III) then hydrolyzes to form ferric hydroxide. Hematite and goethite, for instance, are naturally occurring iron oxides and hydroxides. In contact with filter media, they can cause severe fouling that common cleaning techniques are not able to remove. Mechanisms for the dissolution of iron oxides include the ligand-promoted pathway and the proton-promoted pathway; the dissolution can also be reductive or non-reductive. The most efficient mechanism is the ligand-promoted reductive mechanism, which comprises two stages: an induction period and autocatalytic dissolution. Reducing agents (such as hydroquinone and hydroxylamine hydrochloride), chelating agents (such as EDTA), and organic acids are used for the removal of iron compounds. Oxalic acid is the most effective known cleaning agent for iron deposits. Since formulations are often more effective than organic acids, reducing agents, or chelating agents alone, the citrate-bicarbonate-dithionite system, among others, is well covered in the literature. Cleaning can also be enhanced with ultrasound and backpulsing. In the experimental part, oxalic acid and nitric acid were studied alone and in combination; citric acid and ascorbic acid, among other chemicals, were also tested. Soaking experiments, experiments with ultrasound, and experiments on alternative methods of applying the cleaning solution to the filter samples were carried out. Permeability and ISO Brightness measurements were performed to examine the influence of the cleaning methods on the samples, and inductively coupled plasma optical emission spectroscopy (ICP-OES) analysis of the solutions was carried out to determine the dissolved metals.
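
The two-stage behavior (induction period followed by autocatalytic dissolution) is conventionally captured by an autocatalytic rate law; the generic form below is a standard textbook expression, shown for orientation rather than taken from the thesis.

```latex
% Generic autocatalytic rate law for the dissolved fraction \alpha(t):
% the small seed term \alpha_0 produces the slow induction period, after
% which dissolution accelerates as the Fe(II) product catalyzes it.
\frac{d\alpha}{dt} = k\,(\alpha + \alpha_0)\,(1 - \alpha), \qquad \alpha(0) = 0
```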

Relevance:

30.00%

Publisher:

Abstract:

The study focuses on international diversification from the perspective of a Finnish investor. Its second objective is to examine whether new covariance matrix estimators improve the optimization of the minimum-variance portfolio. In addition to the ordinary sample covariance matrix, two shrinkage estimators and a flexible multivariate GARCH(1,1) model are used in the optimization. The data consist of Dow Jones industry indices and the OMX-H portfolio index. The international diversification strategy is implemented using an industry approach, and the portfolio is optimized over twelve components. The data cover the years 1996-2005, i.e., 120 monthly observations. The performance of the constructed portfolios is measured with the Sharpe index. According to the results, there is no statistically significant difference between the risk-adjusted returns of the internationally diversified investments and the domestic portfolio. Nor does the use of the new covariance matrix estimators add statistically significant value compared with portfolio optimization based on the sample covariance matrix.
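
A minimal sketch of the minimum-variance optimization evaluated in the thesis, with the Ledoit-Wolf shrinkage estimator standing in for the covariance estimators studied; simulated monthly returns replace the Dow Jones and OMX-H series.

```python
# Minimum-variance portfolio from a shrinkage covariance estimate.
# Sketch only: random returns stand in for the thesis's 12 index series.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(0)
returns = rng.normal(0.005, 0.04, size=(120, 12))  # 120 months, 12 components

sigma = LedoitWolf().fit(returns).covariance_      # shrinkage estimator
ones = np.ones(len(sigma))
w = np.linalg.solve(sigma, ones)                   # closed-form min-variance
w /= w.sum()                                       # weights sum to one

excess = returns @ w - 0.002                       # minus a monthly risk-free rate
sharpe = excess.mean() / excess.std(ddof=1)
print(f"weights sum: {w.sum():.2f}, Sharpe: {sharpe:.2f}")
```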

Relevance:

30.00%

Publisher:

Abstract:

Mapping the microstructural properties of local brain tissue is crucial for understanding pathological conditions from a biological perspective. Most existing techniques for estimating white matter microstructure assume a single axon orientation, whereas numerous regions of the brain actually present fiber-crossing configurations. The purpose of the present study is to extend a recent convex optimization framework to recover microstructure parameters in regions with multiple fibers.
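
One common convex formulation of this problem fits the measured signal as a non-negative combination of dictionary atoms, one per candidate fiber population; the sketch below uses a random stand-in dictionary rather than real diffusion-MRI response functions, so it illustrates the optimization only.

```python
# Sketch of convex microstructure fitting with multiple fiber populations:
# solve min ||A x - y||^2 subject to x >= 0, where each column of A is a
# simulated signal for one fiber orientation/microstructure combination.
# The dictionary here is random; real atoms come from a diffusion MRI model.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_meas, n_atoms = 90, 40           # diffusion measurements x dictionary atoms
A = rng.random((n_meas, n_atoms))  # stand-in response dictionary

x_true = np.zeros(n_atoms)
x_true[[3, 17]] = [0.6, 0.4]       # two crossing fiber populations
y = A @ x_true + rng.normal(0, 0.01, n_meas)

x_hat, _ = nnls(A, y)              # convex, non-negative least squares
print("recovered atoms:", np.flatnonzero(x_hat > 0.05))
```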

Relevance:

30.00%

Publisher:

Abstract:

The threats posed by global warming motivate different stakeholders to confront and control them. This Master's thesis focuses on analyzing carbon trade permits in an optimization framework. The studied model determines the optimal emission and uncertainty levels that minimize the total cost. Research questions are formulated and answered using different optimization tools. The model is developed and calibrated using available consistent data on carbon emission technology and control; data and some basic modeling assumptions were extracted from reports and the existing literature, and data collected from the countries in the Kyoto treaty are used to estimate the cost functions. The theory and methods of constrained optimization are briefly presented. A two-level optimization problem (within individual parties and between the parties) is analyzed using several optimization methods: the combined cost optimization between the parties leads to a multivariate model and calls for advanced techniques, so Lagrangian methods, Sequential Quadratic Programming, and the Differential Evolution (DE) algorithm are considered. The role of inherent measurement uncertainty in the monitoring of emissions is discussed, and we briefly investigate an approach in which emission uncertainty is described in a stochastic framework. MATLAB software is used to provide visualizations, including the relationship between decision variables and objective function values, and interpretations in the context of carbon trading are briefly presented. Suggestions for future work are given in stochastic modeling, emission trading, and the coupled analysis of energy prices and carbon permits.
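
The cost-minimization step can be sketched with the solver families named above; the quadratic abatement-cost coefficients and the joint reduction cap below are invented stand-ins for the calibrated cost functions.

```python
# Sketch: allocate emission reductions across parties to minimize total
# abatement cost under a joint cap. Quadratic cost coefficients are invented.
import numpy as np
from scipy.optimize import minimize, differential_evolution

a = np.array([2.0, 5.0, 3.5])  # marginal-cost slopes per party (invented)
cap = 10.0                     # total required reduction

total_cost = lambda r: float(np.sum(0.5 * a * r**2))

# SQP-style solution with an equality constraint on total reduction:
res = minimize(total_cost, x0=np.full(3, cap / 3), method="SLSQP",
               bounds=[(0, cap)] * 3,
               constraints=[{"type": "eq", "fun": lambda r: r.sum() - cap}])

# Differential Evolution on a penalized version of the same problem:
pen = lambda r: total_cost(r) + 1e3 * (r.sum() - cap) ** 2
res_de = differential_evolution(pen, bounds=[(0, cap)] * 3, seed=0)

print(res.x.round(2), res_de.x.round(2))  # both should roughly agree
```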

Relevance:

30.00%

Publisher:

Abstract:

Search engine optimization and marketing (SEO/SEM) is a set of processes widely used on websites to improve search engine rankings, which generates quality web traffic and increases ROI. Content is the most important part of any website, and CMS-based web development has become essential for most organizations and online businesses building their online systems and websites. Every online business using a CMS wants to attract users (customers) in order to generate profit and ROI. This thesis comprises a brief study of existing SEO methods, tools, and techniques and of how they can be implemented to optimize a content-based website. As a result, the study provides recommendations on how to use SEO methods, tools, and techniques to optimize CMS-based websites for the major search engines. It also compares the SEO features of popular CMSs such as Drupal, WordPress, and Joomla, and examines how SEO implementation can be improved on these systems. Knowledge of how search engines index and rank pages is essential for a successful SEO campaign. This work is a complete guideline for web developers and SEO experts who want to optimize a CMS-based website for all major search engines.
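
Some of the on-page checks such a guideline covers are easy to automate; the sketch below uses only the Python standard library, and the length thresholds are common rules of thumb rather than values prescribed by the thesis.

```python
# Sketch: check two basic on-page SEO signals (title and meta description)
# in an HTML document. Length thresholds are common rules of thumb, not
# values prescribed by the thesis.
from html.parser import HTMLParser

class SEOCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name") == "description":
                self.meta_description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = ('<html><head><title>CMS SEO Guide</title>'
        '<meta name="description" content="How to optimize a CMS site.">'
        '</head></html>')
check = SEOCheck()
check.feed(page)
print("title ok:", 10 <= len(check.title) <= 60)
print("description ok:", 50 <= len(check.meta_description) <= 160)
```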

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is to develop and study the Differential Evolution algorithm for multi-objective optimization with constraints. Differential Evolution is an evolutionary algorithm that has gained popularity because of its simplicity and good observed performance. Multi-objective evolutionary algorithms have become popular because they can produce a set of compromise solutions during the search process to approximate the Pareto-optimal front. The starting point for this thesis was the idea that Differential Evolution, with simple changes, could be extended to optimization with multiple constraints and objectives. This approach is implemented, experimentally studied, and further developed in the work, concentrating on the multi-objective optimization aspect. The main outcomes are versions of a method called Generalized Differential Evolution, which aim to improve the performance of the method in multi-objective optimization. A diversity preservation technique that is effective and efficient compared to previous techniques is developed. The thesis also studies the influence of the control parameters of Differential Evolution in multi-objective optimization and gives proposals for selecting initial control parameter values. Overall, the work contributes to the diversity preservation of solutions in multi-objective optimization.
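
The core idea can be sketched by pairing the standard DE/rand/1/bin operators with a dominance-based replacement rule; this is a simplified flavor of Generalized Differential Evolution, applied to an arbitrary bi-objective toy problem.

```python
# Sketch of Differential Evolution extended to two objectives via a
# dominance-based replacement rule, a simplified flavor of Generalized
# Differential Evolution. Toy problem: minimize f1(x)=x^2, f2(x)=(x-2)^2.
import numpy as np

rng = np.random.default_rng(0)
F, CR, NP, D = 0.5, 0.9, 20, 1

def objectives(x):
    return np.array([float(x[0] ** 2), float((x[0] - 2.0) ** 2)])

def dominates(a, b):  # a weakly dominates b, strictly in one objective
    return np.all(a <= b) and np.any(a < b)

pop = rng.uniform(-5, 5, size=(NP, D))
vals = np.array([objectives(x) for x in pop])

for _ in range(200):
    for i in range(NP):
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3,
                                replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])   # DE/rand/1 mutation
        cross = rng.random(D) < CR
        cross[rng.integers(D)] = True                # binomial crossover
        trial = np.where(cross, mutant, pop[i])
        tv = objectives(trial)
        if dominates(tv, vals[i]):                   # dominance-based selection
            pop[i], vals[i] = trial, tv

print("approximate Pareto set spans:", pop.min(), "to", pop.max())
```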

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the optimum design of 3R manipulators is formulated and solved using an algebraic formulation of the workspace boundary. Manipulator design can be approached as an optimization problem in which the objective functions are the size of the manipulator and the workspace volume, and the constraints can be given as a prescribed workspace volume. The numerical solution of the optimization problem is investigated using two different numerical techniques, namely sequential quadratic programming and simulated annealing. Numerical examples illustrate the design procedure and show the efficiency of the proposed algorithms.
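
Both numerical techniques compared in the paper are available in SciPy; the sketch below applies them to a simplified stand-in design problem (minimize a size proxy subject to a prescribed reach) rather than the paper's algebraic workspace-boundary formulation.

```python
# Sketch: size a planar 3R arm by minimizing a size proxy subject to a
# prescribed reachable radius, solved with both SQP and simulated annealing.
# The objective and constraint are simplified stand-ins for the paper's
# algebraic workspace-volume formulation.
import numpy as np
from scipy.optimize import minimize, dual_annealing

R_REQ = 1.5  # prescribed reach radius (invented)

size = lambda L: float(np.sum(np.square(L)))  # proxy for manipulator size
reach = lambda L: float(np.sum(L)) - R_REQ    # >= 0: arm can reach R_REQ

# Sequential quadratic programming:
sqp = minimize(size, x0=[1.0, 1.0, 1.0], method="SLSQP",
               bounds=[(0.1, 2.0)] * 3,
               constraints=[{"type": "ineq", "fun": reach}])

# Simulated annealing on a penalized objective:
pen = lambda L: size(L) + 1e3 * max(0.0, -reach(L)) ** 2
sa = dual_annealing(pen, bounds=[(0.1, 2.0)] * 3, seed=0)

print(sqp.x.round(3), sa.x.round(3))  # both should approach equal links of 0.5
```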

Relevance:

30.00%

Publisher:

Abstract:

The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, can produce massive amounts of biomedical data in a single experiment. As the amount of data rapidly grows, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) missing data entries with estimated values. Missing value imputation is commonly used to make incomplete data complete and thus easier to analyze with statistical and computational methods; our novel approach was to use curated external biological information as a guide for the imputation. Secondly, we studied the effect of missing value imputation on downstream analysis methods such as clustering, comparing multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data, and the research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are an example of the outcome of multiple biological experiments, such as gene microarray studies. Such networks are typically very large and highly connected, so fast algorithms are needed to produce visually pleasing layouts. A computationally efficient way to lay out large biological interaction networks was developed; the algorithm uses multilevel optimization within the regular force-directed graph layout algorithm.
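
The baseline k-NN imputation found adequate on most data sets is available off the shelf; a minimal sketch on a simulated expression matrix follows (a more advanced method such as BPCA would occupy the same slot).

```python
# Sketch: k-NN imputation of missing entries in a toy gene expression matrix.
# In the thesis, the same slot is filled by more advanced methods (e.g. BPCA)
# when simple k-NN is not accurate enough.
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))         # genes x arrays (simulated)
mask = rng.random(X.shape) < 0.05      # 5% missing values
X_missing = np.where(mask, np.nan, X)

X_imputed = KNNImputer(n_neighbors=5).fit_transform(X_missing)
rmse = np.sqrt(np.mean((X_imputed[mask] - X[mask]) ** 2))
print(f"imputation RMSE on held-out entries: {rmse:.3f}")
```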

Relevance:

30.00%

Publisher:

Abstract:

This study aimed to analyze the agreement between measurements of unloaded oxygen uptake and peak oxygen uptake based on equations proposed by Wasserman and real measurements obtained directly with an ergospirometry system. We performed an incremental cardiopulmonary exercise test (CPET) in two groups of sedentary male subjects: one apparently healthy group (HG, n=12) and one with stable coronary artery disease (CG, n=16). The mean age was 47±4 years in the HG and 57±8 years in the CG. Both groups performed CPET on a cycle ergometer with a ramp-type protocol at an intensity calculated according to the Wasserman equation. In the HG, there was no significant difference between the values predicted by the formula and the real measurements obtained in CPET in the unloaded condition; at peak effort, however, a significant difference was observed between VO2peak(predicted) and VO2peak(real) (nonparametric Wilcoxon test). In the CG, there was a significant difference of 116.26 mL/min between the values predicted by the formula and the real values obtained in the unloaded condition, and a significant difference was also found at peak effort, where VO2peak(real) was 40% lower than VO2peak(predicted) (nonparametric Wilcoxon test). There was no agreement between the real and predicted measurements as analyzed by Lin's coefficient or the Bland and Altman model. The Wasserman formula therefore does not appear to be appropriate for predicting the functional capacity of these volunteers and cannot precisely predict the increase in power in incremental CPET on a cycle ergometer.
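
The Bland and Altman agreement analysis used in the study is straightforward to reproduce; the sketch below runs it on simulated values, not the study's measurements.

```python
# Sketch: Bland-Altman limits of agreement between predicted and measured
# VO2 values. The data here are simulated, not the study's CPET measurements.
import numpy as np

rng = np.random.default_rng(0)
vo2_real = rng.normal(1800, 300, size=16)           # mL/min, simulated
vo2_pred = vo2_real + rng.normal(116, 80, size=16)  # simulated overprediction

diff = vo2_pred - vo2_real
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                       # 95% limits of agreement
print(f"bias: {bias:.0f} mL/min, "
      f"limits: [{bias - loa:.0f}, {bias + loa:.0f}]")
```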

Relevance:

30.00%

Publisher:

Abstract:

Predicting a protein's conformation helps to explain its exhibited functions, allows for modeling, and allows for the possible synthesis of the studied protein. Our research focuses on a sub-problem of protein folding known as side-chain packing, whose computational complexity has been proven to be NP-hard. The motivation behind our study is to offer the scientific community a means to obtain conformation approximations for small to large proteins faster than currently available methods allow; as the size of proteins increases, current techniques become unusable due to the exponential nature of the problem. We investigated the capability of a hybrid genetic algorithm / simulated annealing technique to predict the low-energy conformational states of proteins of various sizes and to generate statistical distributions of the studied proteins' molecular ensembles for pKa predictions. Our algorithm produced errors within acceptable margins of experimental results and offered considerable speed-up, depending on the protein and on the resolution of the rotameric states used.
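
The hybrid can be pictured as a genetic algorithm whose replacement step uses a simulated-annealing acceptance test; the sketch below searches over discrete "rotamer" indices with an arbitrary toy energy, not a molecular force field.

```python
# Sketch of a hybrid GA / simulated annealing search over discrete states,
# in the spirit of side-chain packing: each gene is a rotamer index and the
# energy function is an arbitrary toy, not a molecular force field.
import numpy as np

rng = np.random.default_rng(0)
N_RES, N_ROT, POP = 30, 8, 40
pair = rng.normal(size=(N_RES, N_ROT, N_ROT))  # toy pairwise energies

def energy(conf):
    return sum(pair[i, conf[i], conf[i + 1]] for i in range(N_RES - 1))

pop = rng.integers(N_ROT, size=(POP, N_RES))
temp = 2.0
for gen in range(300):
    i, j = rng.integers(POP, size=2)
    a, b = pop[i], pop[j]
    cut = rng.integers(1, N_RES)                     # one-point crossover
    child = np.concatenate([a[:cut], b[cut:]])
    child[rng.integers(N_RES)] = rng.integers(N_ROT)  # point mutation
    k = rng.integers(POP)                            # candidate to replace
    dE = energy(child) - energy(pop[k])
    if dE < 0 or rng.random() < np.exp(-dE / temp):  # SA acceptance test
        pop[k] = child
    temp *= 0.995                                    # geometric cooling

best = min(pop, key=energy)
print("best toy energy:", round(energy(best), 2))
```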

Relevance:

30.00%

Publisher:

Abstract:

Population-based metaheuristics, such as particle swarm optimization (PSO), have been employed to solve many real-world optimization problems. Although it is often sufficient to find a single solution to these problems, there are cases where identifying multiple, diverse solutions can be beneficial or even required. Some of these problems are further complicated by a change in their objective function over time; this type of optimization is referred to as dynamic, multi-modal optimization. Algorithms that exploit multiple optima in a search space are known as niching algorithms. Although numerous dynamic niching algorithms have been developed, their performance is often measured solely by their ability to find a single, global optimum. Furthermore, the comparisons often use synthetic benchmarks whose landscape characteristics are generally limited and unknown. This thesis provides a landscape analysis of the dynamic benchmark functions commonly developed for multi-modal optimization. The benchmark analysis reveals that the mechanisms responsible for dynamism in the current dynamic benchmarks do not significantly affect landscape features, suggesting a lack of representation of problems whose landscape features vary over time. This analysis is used in a comparison of current niching algorithms to identify the effects that specific landscape features have on niching performance. Two performance metrics are proposed to measure both the scalability and the accuracy of niching algorithms. The algorithm comparison demonstrates which algorithms are best suited to a variety of dynamic environments, examines each algorithm's niching behaviour, and analyzes the range of, and trade-off between, scalability and accuracy when tuning each algorithm's parameters. These results contribute to the understanding of current niching techniques as well as the problem features that ultimately dictate their success.
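
A basic (gbest) PSO step and a simple peak-coverage score give a feel for why single-optimum PSO needs niching extensions; the two-peak landscape and the coverage definition below are illustrative, not the thesis's benchmarks or metrics.

```python
# Sketch: one standard (gbest) PSO and a simple multi-optima coverage score
# of the kind used to judge niching performance. The two-peak landscape and
# the coverage definition are illustrative, not the thesis's benchmarks.
import numpy as np

rng = np.random.default_rng(0)
peaks = np.array([-2.0, 3.0])              # known optima locations

def f(x):                                  # maximize: two equal peaks
    return np.max(np.exp(-(x - peaks) ** 2), axis=-1)

N, W, C1, C2 = 30, 0.7, 1.5, 1.5
x = rng.uniform(-5, 5, N)
v = np.zeros(N)
pbest, pval = x.copy(), f(x[:, None])

for _ in range(100):
    g = pbest[np.argmax(pval)]             # single global-best attractor
    v = (W * v + C1 * rng.random(N) * (pbest - x)
         + C2 * rng.random(N) * (g - x))
    x = x + v
    val = f(x[:, None])
    better = val > pval
    pbest[better], pval[better] = x[better], val[better]

# Coverage: fraction of known peaks with some personal best within 0.1.
found = [np.any(np.abs(pbest - p) < 0.1) for p in peaks]
print(f"peaks found: {sum(found)}/{len(peaks)}")  # gbest PSO often covers one
```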

Relevance:

30.00%

Publisher:

Abstract:

Network survivability is a very interesting area of technical study as well as a critical concern in network design. Given that more and more data are carried over communication networks, a single failure can interrupt millions of users and cause millions of dollars in lost revenue. Network protection techniques consist of providing spare capacity in a network and automatically rerouting flows around a failure using that available capacity. This thesis deals with the design of survivable optical networks using protection schemes based on p-cycles. More precisely, path-protecting p-cycles are exploited in the context of link failures. Our study focuses on the placement of p-cycle protection structures, assuming that the working paths for the set of demands are defined a priori. Most existing work uses heuristics or solution methods that have difficulty solving large instances. The objective of this thesis is twofold. On the one hand, we propose models and solution methods capable of addressing larger problems than those already presented in the literature. On the other hand, thanks to the new algorithms, we are able to produce optimal or near-optimal solutions. To do so, we rely on column generation, a technique well suited to solving large-scale linear programming problems. In this project, column generation is used as an intelligent way to implicitly enumerate promising cycles. We first propose formulations for the master problem and the pricing problem, as well as a first column generation algorithm for the design of networks protected by path-protecting p-cycles. The algorithm obtains better solutions, within a reasonable time, than those obtained by existing methods. Subsequently, a more compact formulation is proposed for the pricing problem. In addition, we present a new hierarchical decomposition method that greatly improves the overall efficiency of the algorithm. As for integer solutions, we propose two heuristic methods that manage to find good solutions. We also undertake a systematic comparison between p-cycles and classical shared-protection schemes, carrying out a precise comparison with unified, column generation based formulations to obtain high-quality results. We then empirically evaluate the directed and undirected versions of p-cycles for link protection as well as for path protection under asymmetric traffic scenarios, and show the additional protection cost incurred when bidirectional systems are used in such scenarios. Finally, we study a column generation formulation for the design of p-cycle networks in the presence of availability requirements and obtain first lower bounds for this problem.
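
In outline, the column generation scheme alternates between a restricted master problem over a pool of candidate p-cycles and a pricing problem that generates new promising cycles; the simplified set-covering view below (with cycle cost κ_c, protection requirement d_ℓ on link ℓ, and duals π_ℓ) stands in for the thesis's path-protecting formulations.

```latex
% Restricted master problem over the current cycle pool C':
\min_{x \ge 0} \sum_{c \in C'} \kappa_c\, x_c
\quad \text{s.t.} \quad
\sum_{c \in C' : \, \ell \in c} x_c \;\ge\; d_\ell \quad \forall \ell

% Pricing: with duals \pi_\ell of the covering constraints, add to C' any
% cycle c whose reduced cost is negative:
\bar{\kappa}_c \;=\; \kappa_c - \sum_{\ell \in c} \pi_\ell \;<\; 0
```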

Relevance:

30.00%

Publisher:

Abstract:

Among the methods for estimating the parameters of probability distributions in statistics, maximum likelihood is one of the most popular techniques, since, under mild conditions, the resulting estimators are consistent and asymptotically efficient. Maximum likelihood problems can be treated as nonlinear, possibly nonconvex, programming problems, for which two major classes of solution methods are trust-region techniques and line-search methods. Moreover, it is possible to exploit the structure of these problems to try to accelerate the convergence of these methods, under certain assumptions. In this work, we revisit some classical or recently developed approaches in nonlinear optimization, in the particular context of maximum likelihood estimation. We also develop new algorithms to solve this problem, reconsidering various Hessian approximation techniques, and propose new step-size computation methods, in particular within line-search algorithms. These include algorithms that allow us to switch between Hessian approximations and to adapt the step length along a fixed search direction. Finally, we assess the numerical efficiency of the proposed methods in the context of discrete choice model estimation, in particular mixed logit models.
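
A minimal example of the setting: maximum likelihood written as a nonlinear program and solved with a line-search quasi-Newton method (BFGS Hessian approximation); simulated normal data stand in for the mixed logit models estimated in the thesis.

```python
# Sketch: maximum likelihood estimation as nonlinear optimization with a
# quasi-Newton (BFGS) line-search method. Simulated normal data stand in
# for the discrete choice models estimated in the thesis.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)

def neg_log_lik(theta):
    mu, log_sigma = theta              # log-parametrize to keep sigma > 0
    sigma = np.exp(log_sigma)
    z = (data - mu) / sigma
    return float(np.sum(np.log(sigma) + 0.5 * z ** 2))

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="BFGS")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"mu: {mu_hat:.2f}, sigma: {sigma_hat:.2f}")  # near 2.0 and 1.5
```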