995 results for Optimization software
Abstract:
An algorithm that optimizes and generates pairings for airline crews, subsequently implemented in Java.
Abstract:
In this paper, we propose a methodology for determining the most efficient and least costly approach to crew pairing optimization. The methodology is based on algorithmic optimization, developed in the Eclipse open-source IDE using the Java programming language, to solve crew scheduling problems.
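The abstract does not specify the algorithm used. As one common formulation, crew pairing can be cast as a set-covering problem over flight legs: choose a low-cost subset of candidate pairings that covers every leg. The Java sketch below uses a simple greedy heuristic with invented names and costs; it illustrates the problem shape, not the authors' method.

```java
import java.util.*;

// Greedy set-covering sketch for crew pairing: repeatedly pick the pairing with
// the best cost-per-newly-covered-leg ratio until every flight leg is covered.
public class CrewPairingGreedy {
    record Pairing(String id, Set<String> legs, double cost) {}

    static List<Pairing> selectPairings(Set<String> allLegs, List<Pairing> candidates) {
        Set<String> uncovered = new HashSet<>(allLegs);
        List<Pairing> chosen = new ArrayList<>();
        while (!uncovered.isEmpty()) {
            Pairing best = null;
            double bestRatio = Double.POSITIVE_INFINITY;
            for (Pairing p : candidates) {
                long newlyCovered = p.legs().stream().filter(uncovered::contains).count();
                if (newlyCovered == 0) continue;
                double ratio = p.cost() / newlyCovered;   // cost per newly covered leg
                if (ratio < bestRatio) { bestRatio = ratio; best = p; }
            }
            if (best == null) throw new IllegalStateException("some legs cannot be covered");
            chosen.add(best);
            uncovered.removeAll(best.legs());
        }
        return chosen;
    }

    public static void main(String[] args) {
        Set<String> legs = Set.of("LEG1", "LEG2", "LEG3");   // hypothetical flight legs
        List<Pairing> candidates = List.of(
            new Pairing("P1", Set.of("LEG1", "LEG2"), 900.0),
            new Pairing("P2", Set.of("LEG3"), 400.0),
            new Pairing("P3", Set.of("LEG1", "LEG2", "LEG3"), 1500.0));
        selectPairings(legs, candidates).forEach(p -> System.out.println(p.id()));
    }
}
```

Production crew pairing systems typically solve this with column generation and integer programming rather than a greedy pass, but the covering structure is the same.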
Abstract:
We present a georeferenced photomosaic of the Lucky Strike hydrothermal vent field (Mid-Atlantic Ridge, 37°18’N). The photomosaic was generated from digital photographs acquired using the ARGO II seafloor imaging system during the 1996 LUSTRE cruise, which surveyed a ~1 km² zone and provided coverage of ~20% of the seafloor. The photomosaic has a pixel resolution of 15 mm and encloses the areas with known active hydrothermal venting. The final mosaic is generated after an optimization that includes the automatic detection of the same benthic features across different images (feature matching), followed by a global alignment of images based on the vehicle navigation. We also provide software to construct mosaics from large sets of images for which georeferencing information exists (location, attitude, and altitude per image), to visualize them, and to extract data. Georeferencing information can be provided by the raw navigation data (collected during the survey) or result from the optimization obtained from image matching. Mosaics based solely on navigation can be readily generated by any user, but the optimization and global alignment of the mosaic requires a case-by-case approach for which no universal software is available. The Lucky Strike photomosaics (optimized and navigated-only) are publicly available through the Marine Geoscience Data System (MGDS, http://www.marine-geo.org). The mosaic-generating and viewing software is available through the Computer Vision and Robotics Group Web page at the University of Girona (http://eia.udg.es/_rafa/mosaicviewer.html).
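As a rough illustration of the feature-matching step described above, the sketch below estimates the relative offset of two overlapping images as the mean displacement of matched feature points. A real mosaicking pipeline such as the Girona software also handles rotation and scale and solves a global alignment over all images at once; this Java fragment, with invented coordinates, shows only the core pairwise idea.

```java
import java.util.*;

// Pairwise step in feature-based mosaicking: the translation between two
// overlapping images is estimated as the mean displacement of matched features.
public class MosaicAlign {
    record Point(double x, double y) {}

    // Mean displacement of matched feature pairs (inA.get(i) matches inB.get(i)).
    static Point meanOffset(List<Point> inA, List<Point> inB) {
        double dx = 0, dy = 0;
        for (int i = 0; i < inA.size(); i++) {
            dx += inB.get(i).x() - inA.get(i).x();
            dy += inB.get(i).y() - inA.get(i).y();
        }
        return new Point(dx / inA.size(), dy / inA.size());
    }

    public static void main(String[] args) {
        // Hypothetical feature matches between image 0 and image 1.
        List<Point> a = List.of(new Point(100, 40), new Point(220, 310), new Point(60, 150));
        List<Point> b = List.of(new Point(88, 55), new Point(209, 324), new Point(47, 166));
        Point d = meanOffset(a, b);
        System.out.printf("estimated offset of image 1 relative to image 0: (%.1f, %.1f)%n",
                d.x(), d.y());
    }
}
```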
Abstract:
Ski resorts are deploying more and more artificial snow production systems. These systems are necessary to sustain an economic activity that is important for high alpine valleys. However, artificial snow raises significant environmental issues, which can be reduced by optimizing its production. This paper presents a software prototype based on artificial intelligence to help ski resorts better manage their snowpack. It combines, on the one hand, a General Neural Network for snow cover analysis and spatial prediction with, on the other hand, a multiagent simulation of skiers for analyzing the spatial impact of ski practice. The prototype has been tested at the ski resort of Verbier (Switzerland).
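The paper gives no implementation detail, so the following is only a toy Java sketch of the multiagent component: skier agents repeatedly descend a slope divided into segments, each pass wearing down the snowpack, so heavily used segments emerge as candidates for targeted snow production. All numbers are illustrative, not from the paper.

```java
import java.util.*;

// Toy multiagent simulation: agents descend a 1-D slope of segments, each pass
// reducing the snow depth of the segments it crosses.
public class SkierAgents {
    public static void main(String[] args) {
        int segments = 10, skiers = 500;
        double[] snowDepth = new double[segments];
        Arrays.fill(snowDepth, 100.0);          // initial cm of snow per segment
        Random rng = new Random(42);
        for (int s = 0; s < skiers; s++) {
            int pos = 0;                         // every run starts at the top
            while (pos < segments) {
                snowDepth[pos] -= 0.05;          // wear caused by one pass
                pos += 1 + rng.nextInt(2);       // skiers sometimes skip a segment
            }
        }
        for (int i = 0; i < segments; i++)       // low-depth segments need snow production
            System.out.printf("segment %d: %.1f cm%n", i, snowDepth[i]);
    }
}
```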
Abstract:
In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, unlike in other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
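As a schematic, hedged sketch of the search strategy described here (a hybrid evolutionary algorithm with two fitness functions, diversity management, and an RMSD criterion), the Java fragment below uses placeholder scoring functions rather than CHARMM energies; conformations are flat x,y,z coordinate arrays and all parameters are invented.

```java
import java.util.*;

// Schematic evolutionary docking loop: alternate two fitness functions and
// enforce a minimum RMSD between survivors to preserve diversity.
public class DockingEA {
    static final Random RNG = new Random(1);

    // RMSD between two conformations given as flat x,y,z coordinate arrays.
    static double rmsd(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s / (a.length / 3.0));
    }

    static double cheapFitness(double[] x) {   // placeholder "fast" score
        double s = 0; for (double v : x) s += v * v; return s;
    }
    static double fullFitness(double[] x) {    // placeholder "accurate" score
        return cheapFitness(x) + 0.1 * Math.sin(x[0]);
    }

    public static void main(String[] args) {
        int popSize = 20, dim = 9, generations = 100;
        double minDiversity = 0.5;              // survivors must differ by this RMSD
        List<double[]> pop = new ArrayList<>();
        for (int i = 0; i < popSize; i++) {
            double[] x = new double[dim];
            for (int j = 0; j < dim; j++) x[j] = RNG.nextGaussian() * 5;
            pop.add(x);
        }
        for (int g = 0; g < generations; g++) {
            boolean useFull = g % 2 == 1;        // alternate the two fitness functions
            List<double[]> offspring = new ArrayList<>(pop);
            for (double[] parent : pop) {        // Gaussian mutation of one coordinate
                double[] child = parent.clone();
                child[RNG.nextInt(dim)] += RNG.nextGaussian();
                offspring.add(child);
            }
            offspring.sort(Comparator.comparingDouble(
                (double[] x) -> useFull ? fullFitness(x) : cheapFitness(x)));
            List<double[]> next = new ArrayList<>();
            for (double[] cand : offspring) {    // keep best while enforcing diversity
                boolean distinct = true;
                for (double[] kept : next)
                    if (rmsd(kept, cand) <= minDiversity) { distinct = false; break; }
                if (distinct) next.add(cand);
                if (next.size() == popSize) break;
            }
            for (int i = 0; next.size() < popSize; i++)  // backfill if filter was too strict
                if (!next.contains(offspring.get(i))) next.add(offspring.get(i));
            pop = next;
        }
        System.out.printf("best placeholder score: %.4f%n", fullFitness(pop.get(0)));
    }
}
```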
Abstract:
Blowing and drifting snow is a major concern for transportation efficiency and road safety in regions where these phenomena are common. One common way to mitigate snow drift on roadways is to install plastic snow fences. Correct design of snow fences is critical for road safety and for keeping roads open during winter in the US Midwest and other regions affected by large snow events, as well as for keeping the costs of snow accumulation and road repair to a minimum. Of critical importance for road safety is protection against snow drifting in regions with narrow rights of way, where standard fences cannot be deployed at the recommended distance from the road. Designing snow fences requires sound engineering judgment and a thorough evaluation of the potential for snow blowing and drifting at the construction site. The evaluation includes site-specific design parameters typically obtained with semi-empirical relations characterizing the local transport conditions. Among the critical parameters involved in fence design, and in the assessment of post-construction efficiency, is the quantification of snow accumulation at fence sites. The present study proposes a joint experimental and numerical approach to monitor snow deposits around snow fences, quantitatively estimate snow deposits in the field, assess the efficiency of snow fences, and improve their design. Snow deposit profiles were mapped using GPS-based real-time kinematic (RTK) surveys conducted at the monitored field site during and after snowstorms. The monitored site allowed testing of different snow fence designs under close-to-identical conditions over four winter seasons. The study also discusses the detailed monitoring system and the analysis of weather forecasts and meteorological conditions at the monitored sites. A main goal of the present study was to assess the performance of lightweight plastic snow fences with a porosity lower than the 50% typically used in standard designs. The field data collected during the first winter were used to identify the best design for snow fences with a porosity of 50%. Flow fields obtained from numerical simulations showed that the fence design that worked best during the first winter induced the formation of an elongated area of small velocity magnitude close to the ground. This information was used to identify other candidates for the optimum design of fences with a lower porosity. Two designs with a fence porosity of 30% that were found to perform well in numerical simulations were tested in the field during the second winter, along with the best-performing design for fences with a porosity of 50%. Field data showed that the length of the snow deposit away from the fence was reduced by about 30% for the two proposed lower-porosity (30%) fence designs compared with the best design identified for fences with a porosity of 50%. Moreover, one of the lower-porosity designs tested in the field showed no significant snow deposition within the bottom-gap region beneath the fence. Thus, a major outcome of this study is the recommendation to use plastic snow fences with a porosity of 30%. It is expected that this lower-porosity design will continue to work well for even more severe snow events, or for successive snow events occurring during the same winter. The approach advocated in the present study allowed general recommendations to be made for optimizing the design of lower-porosity plastic snow fences.
This approach can be extended to improve the design of other types of snow fences; some preliminary work on living snow fences is also discussed. Another major contribution of this study is to propose, develop protocols for, and test a novel technique based on close-range photogrammetry (CRP) to quantify the snow deposits trapped by snow fences. As image data can be acquired continuously, the time evolution of the volume of snow retained by a snow fence during a storm, or over a whole winter season, can in principle be obtained. Moreover, CRP is a non-intrusive method that eliminates the need to perform manual measurements during storms, which are difficult and sometimes dangerous. Presently, there is much empiricism in the design of snow fences, owing to the lack of data on fence storage capacity, on how snow deposits change with fence design and snowstorm characteristics, and on the main parameters used by state DOTs to design snow fences at a given site. The availability of such information from CRP measurements should provide critical data for evaluating the performance of a given snow fence design tested by the IDOT. As part of the present study, the novel CRP method was tested at several sites. The present study also discusses some attempts and preliminary work to determine the snow relocation coefficient, one of the main variables that IDOT engineers must estimate when using the standard snow fence design software (Snow Drift Profiler, Tabler, 2006). Our analysis showed that standard empirical formulas did not produce reasonable values when applied at the Iowa test sites monitored as part of the present study, and that simple methods of estimating this variable are not reliable. The present study makes recommendations for the development of a new methodology based on Large-Scale Particle Image Velocimetry that can directly measure snow drift fluxes and the amount of snow relocated by the fence.
Abstract:
MOTIVATION: The detection of positive selection is widely used to study gene and genome evolution, but its application remains limited by the high computational cost of existing implementations. We present a series of computational optimizations for more efficient estimation of the likelihood function on large-scale phylogenetic problems. We illustrate our approach using the branch-site model of codon evolution. RESULTS: We introduce novel optimization techniques that substantially outperform both CodeML from the PAML package and our previously optimized sequential version, SlimCodeML. These techniques can also be applied to other likelihood-based phylogeny software. Our implementation scales well for large numbers of codons and/or species and can therefore analyse substantially larger datasets than CodeML. We evaluated FastCodeML on different platforms and measured average sequential speedups of FastCodeML (single-threaded) versus CodeML of up to 5.8, average speedups of FastCodeML (multi-threaded) versus CodeML on a single node (shared memory) of up to 36.9 for 12 CPU cores, and average speedups of the distributed FastCodeML versus CodeML of up to 170.9 on eight nodes (96 CPU cores in total). AVAILABILITY AND IMPLEMENTATION: ftp://ftp.vital-it.ch/tools/FastCodeML/. CONTACT: selectome@unil.ch or nicolas.salamin@unil.ch.
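For readers unfamiliar with the computation such tools optimize, the sketch below shows Felsenstein's pruning algorithm, the core of likelihood evaluation on a phylogeny, for a single site. For brevity it uses the 4-state Jukes-Cantor nucleotide model rather than the 61-state codon model of the branch-site analysis; the recursion is analogous, and the tree, branch lengths, and site pattern are invented.

```java
// Felsenstein's pruning algorithm for one site on a tiny rooted tree under JC69.
public class PruningLikelihood {
    // JC69 transition probability over branch length d (expected substitutions/site).
    static double pTransition(int from, int to, double d) {
        double e = Math.exp(-4.0 * d / 3.0);
        return (from == to) ? 0.25 + 0.75 * e : 0.25 - 0.25 * e;
    }

    // Conditional likelihoods at a parent node given two child likelihood vectors.
    static double[] combine(double[] left, double dLeft, double[] right, double dRight) {
        double[] out = new double[4];
        for (int s = 0; s < 4; s++) {
            double l = 0, r = 0;
            for (int c = 0; c < 4; c++) {
                l += pTransition(s, c, dLeft) * left[c];
                r += pTransition(s, c, dRight) * right[c];
            }
            out[s] = l * r;
        }
        return out;
    }

    static double[] leaf(int observedState) {   // indicator vector for an observed base
        double[] v = new double[4];
        v[observedState] = 1.0;
        return v;
    }

    public static void main(String[] args) {
        // Tree ((A:0.1, B:0.2):0.05, C:0.3); site pattern A=G, B=G, C=T (A,C,G,T = 0..3).
        double[] ab = combine(leaf(2), 0.1, leaf(2), 0.2);
        double[] root = combine(ab, 0.05, leaf(3), 0.3);
        double lik = 0;
        for (int s = 0; s < 4; s++) lik += 0.25 * root[s];   // uniform root frequencies
        System.out.printf("site likelihood: %.6e%n", lik);
    }
}
```

The performance work in FastCodeML targets exactly this kind of inner loop: transition-matrix products over many sites, branches, and candidate parameter values.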
Abstract:
The goal of this Master's thesis was to develop the third-generation physical protocol layer for a mobile phone software architecture. Third-generation mobile phone systems are more complex than earlier systems. Due to the size and complexity of the software and the tightness of schedules, a need has arisen to adopt formal methods in software development. Formal description languages enable the creation of a precise, unambiguous, and simulatable system description. The physical protocol layer provides data transfer services to the upper protocol layers. Managing this data transfer requires message passing between the protocol layers. Using formal description languages, the implementation of message passing can be automated and the logic it requires can be visualized. In this work, the part of the physical protocol layer that communicates with the upper protocol layers was designed, implemented, and tested. The result was an implementation, within the software architecture, of the communication and the state machine required by the cell selection functionality. The early phases of software development were found to play a significant role in the performance of the physical layer, because that is when message-passing optimization is easiest. As such, formal description languages are not fully suitable for developing parts of a precisely defined software architecture.
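The thesis used formal description languages rather than a general-purpose language, but as a hypothetical illustration, message-driven cell selection logic of the kind it describes could look like the following Java state machine. The states and messages are invented, not taken from the thesis or the 3G specifications.

```java
// Hypothetical inter-layer message-driven state machine for cell selection.
public class CellSelectionFsm {
    enum State { IDLE, MEASURING, EVALUATING, CAMPED }
    enum Msg { START_SELECTION, MEASUREMENT_DONE, CELL_SUITABLE, CELL_UNSUITABLE }

    private State state = State.IDLE;

    // Each message from an upper layer either advances the state machine or is ignored.
    State onMessage(Msg msg) {
        state = switch (state) {
            case IDLE       -> msg == Msg.START_SELECTION  ? State.MEASURING  : state;
            case MEASURING  -> msg == Msg.MEASUREMENT_DONE ? State.EVALUATING : state;
            case EVALUATING -> switch (msg) {
                case CELL_SUITABLE   -> State.CAMPED;
                case CELL_UNSUITABLE -> State.MEASURING;   // measure the next cell
                default              -> state;
            };
            case CAMPED     -> state;                      // stay camped on the cell
        };
        return state;
    }

    public static void main(String[] args) {
        CellSelectionFsm fsm = new CellSelectionFsm();
        for (Msg m : new Msg[]{Msg.START_SELECTION, Msg.MEASUREMENT_DONE, Msg.CELL_SUITABLE})
            System.out.println(m + " -> " + fsm.onMessage(m));
    }
}
```

One appeal of formal description languages, as the thesis notes, is that this kind of transition table can be specified once, simulated, and then generated rather than hand-coded.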
Abstract:
The threats posed by global warming motivate different stakeholders to address and control them. This Master's thesis focuses on analyzing carbon trade permits in an optimization framework. The studied model determines the optimal emission and uncertainty levels that minimize the total cost. Research questions are formulated and answered using different optimization tools. The model is developed and calibrated using available consistent data in the area of carbon emission technology and control. Data and some basic modeling assumptions were extracted from reports and the existing literature. Data collected from the countries in the Kyoto treaty are used to estimate the cost functions. The theory and methods of constrained optimization are briefly presented. A two-level optimization problem (individual and between the parties) is analyzed using several optimization methods. The combined cost optimization between the parties leads to a multivariate model and calls for advanced techniques. The Lagrangian method, Sequential Quadratic Programming, and the Differential Evolution (DE) algorithm are referred to. The role of inherent measurement uncertainty in the monitoring of emissions is discussed. We briefly investigate an approach in which emission uncertainty is described in a stochastic framework. MATLAB software has been used to provide visualizations, including the relationship between decision variables and objective function values. Interpretations in the context of carbon trading are briefly presented. Suggestions for future work are given in stochastic modeling, emission trading, and the coupled analysis of energy prices and carbon permits.
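Of the methods referred to, Differential Evolution is the easiest to illustrate compactly. The Java sketch below implements the standard DE/rand/1/bin scheme on a placeholder convex cost function; it does not reproduce the thesis's calibrated emission-cost model, and all parameter values are conventional defaults.

```java
import java.util.Random;

// Minimal Differential Evolution (DE/rand/1/bin) on a stand-in cost function.
public class DifferentialEvolution {
    static final Random RNG = new Random(7);

    static double cost(double[] x) {             // placeholder: minimum 0 at x = (1,...,1)
        double s = 0; for (double v : x) s += (v - 1.0) * (v - 1.0); return s;
    }

    public static void main(String[] args) {
        int np = 30, dim = 5, maxGen = 300;
        double f = 0.8, cr = 0.9;                 // mutation factor and crossover rate
        double[][] pop = new double[np][dim];
        for (double[] ind : pop)                  // random initialization in [-5, 5]
            for (int j = 0; j < dim; j++) ind[j] = RNG.nextDouble() * 10 - 5;

        for (int g = 0; g < maxGen; g++) {
            for (int i = 0; i < np; i++) {
                int a, b, c;                      // three distinct partners, none equal to i
                do { a = RNG.nextInt(np); } while (a == i);
                do { b = RNG.nextInt(np); } while (b == i || b == a);
                do { c = RNG.nextInt(np); } while (c == i || c == a || c == b);
                double[] trial = pop[i].clone();
                int jRand = RNG.nextInt(dim);     // guarantee at least one mutated gene
                for (int j = 0; j < dim; j++)
                    if (j == jRand || RNG.nextDouble() < cr)
                        trial[j] = pop[a][j] + f * (pop[b][j] - pop[c][j]);
                if (cost(trial) <= cost(pop[i])) pop[i] = trial;   // greedy selection
            }
        }
        double best = Double.POSITIVE_INFINITY;
        for (double[] ind : pop) best = Math.min(best, cost(ind));
        System.out.printf("best cost: %.6f (optimum 0 at x = 1)%n", best);
    }
}
```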
Abstract:
The goal of this Master's thesis is to develop and analyze an optimization method for finding the geometric shape of classical horizontal-axis wind turbine blades based on a set of criteria. The thesis develops a technique that allows the designer to determine the weight of factors such as the power coefficient, the sound pressure level, and the cost function in the overall process of blade shape optimization. The optimization technique applies the desirability function, which has not previously been used for this kind of technical problem; in this sense, the work can claim originality. To make the analysis and optimization processes more convenient, a software application was developed.
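As a hedged sketch of the desirability approach, the Java fragment below combines individual desirabilities for the three criteria named above (power coefficient, sound pressure level, cost) into a weighted geometric mean, the usual composite in desirability-based optimization. All bounds, weights, and candidate values are invented for illustration, not taken from the thesis.

```java
// Weighted desirability composite for multi-criteria blade evaluation.
public class Desirability {
    // Larger-is-better desirability, linear between lo and hi, clamped to [0, 1].
    static double dMax(double y, double lo, double hi) {
        return Math.max(0, Math.min(1, (y - lo) / (hi - lo)));
    }
    // Smaller-is-better desirability, linear between lo and hi, clamped to [0, 1].
    static double dMin(double y, double lo, double hi) {
        return Math.max(0, Math.min(1, (hi - y) / (hi - lo)));
    }
    // Weighted geometric mean: D = (prod d_i^w_i)^(1 / sum w_i).
    static double overall(double[] d, double[] w) {
        double logSum = 0, wSum = 0;
        for (int i = 0; i < d.length; i++) { logSum += w[i] * Math.log(d[i]); wSum += w[i]; }
        return Math.exp(logSum / wSum);
    }

    public static void main(String[] args) {
        double cp = 0.44, noiseDb = 52.0, cost = 1800.0;   // one candidate blade design
        double[] d = {
            dMax(cp, 0.30, 0.50),        // power coefficient: maximize
            dMin(noiseDb, 40.0, 60.0),   // sound pressure level: minimize
            dMin(cost, 1000.0, 3000.0)   // cost function: minimize
        };
        double[] w = {2.0, 1.0, 1.0};    // designer-chosen weights, as in the thesis idea
        System.out.printf("overall desirability D = %.3f%n", overall(d, w));
    }
}
```

Because the composite is a geometric mean, any criterion with zero desirability drives the overall score to zero, which is what makes the weights a meaningful design lever.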
Abstract:
The purpose of this study was to simulate and optimize an integrated gasification combined cycle (IGCC) for power generation and hydrogen (H2) production using low-grade Thar lignite coal and cotton stalk. Lignite coal is high in moisture and ash content; the idea behind adding cotton stalk is to increase the mass of combustible material per unit mass of feed, to reduce coal consumption, and to use cotton stalk efficiently in the IGCC process. Aspen Plus software is used to simulate the process with different mass ratios of coal to cotton stalk; for the optimization, process efficiencies, net power generation, and H2 production are considered, while environmentally hazardous emissions are kept to acceptable levels. With the addition of cotton stalk to the feed, process efficiencies began to decline, along with net power production. For H2 production, the effect was positive at first, but beyond 40% cotton stalk addition H2 production also began to decline. Cotton stalk addition also had a negative effect on hazardous emissions, with the mass of emissions per unit of net power production increasing linearly with the cotton stalk fraction in the feed mixture. In summary, the overall effect of adding cotton stalk appears negative, and it becomes markedly more negative beyond 40% addition. It is therefore concluded that, to obtain maximum process efficiencies and high production, a small amount of cotton stalk in the feed is preferable, with the maximum level of addition estimated at 40%. The gasification temperature should be kept low, around 1140 °C, and the preferred technique for the studied feed in an IGCC process is a fluidized bed (ash in dry form) rather than an ash-slagging gasifier.
Abstract:
This master's thesis was done for a small company, Vipetec Oy, which offers specialized technological services, mainly to companies in the forest industry. The study was initiated partly because the company wants to expand its customer base into a new industry. There were two interconnected goals. The first was to find out how much, and what kind of, value current customers have realized from the ATA Process Event Library, one of the products the company offers. The second was to determine the best way to present this value, and its implications for future value potential, to both current and potential customers. ATA helps customers make grade and product changes, start up after machine downtime, and recover from production breaks faster; all three events occur in production lines from time to time, and faster operation results in savings of time and material. In addition to ATA, Vipetec offers other services related to automation development and control optimization. The theoretical part concentrates on the concept of value, how it can be delivered to customers, and the kinds of risk customers face in industrial purchasing. The function of reference marketing towards customers is also discussed. In the empirical part, the value realized by existing customers is evaluated based on both numerical data and interviews, including a brief case study of one customer. After that, value-based reference marketing for a target industry is examined through interviews with potential customers. Finally, answers to the research questions are stated and compared with the theoretical knowledge on the subject. The results show that customers' machines using the full ATA service concept are usually able to save more time and material than machines using only some features of the product. Interviews indicated that sales arguments focusing on improved competitive status are not as effective as the current arguments focusing on numerical improvements. For potential customers in the new industry, the current sales arguments are likely to work best for those whose irregular production situations are caused mainly by faults. When the actions of Vipetec were compared with ten key elements of creating customer references, it was seen that the company has either already included many of them in its strategy or has good chances of including them with the help of the results of this study.
Abstract:
Docosahexaenoic acid is an essential polyunsaturated fatty acid with important metabolic activities. Its conjugated double bonds make it susceptible to decomposition. Its stability may be improved by entrapping the fatty acid with a spray-drying technique; however, the many parameters involved in this technique must be considered to avoid affecting the final product quality. Therefore, this study aimed to evaluate the entrapment conditions and yields of fish oil enriched with docosahexaenoic acid ethyl ester. Microcapsules were obtained from Acacia gum using a spray-drying technique. The experimental samples were analyzed by chromatography and modeled with Statistica software, which identified the following optimum entrapment conditions: an inlet temperature of 188 °C; 30% core material; an N2 flow rate of 55 mm; and a pump flow rate of 12.5 mL/minute. These conditions provided a 66% yield of docosahexaenoic acid ethyl ester in the oil, corresponding to 19.8% of entrapped docosahexaenoic acid ethyl ester (w/w). This result was considered significant, since 30% corresponded to wall material.
Abstract:
Goat placental immunomodulatory peptides were produced by fermentation with Aspergillus niger. The objective of the present study was to investigate the effects of fermentation parameters (carbon source content, pH, and time) on spleen lymphocyte proliferation, in order to maximize the immune activity of the fermentation broth, using response surface methodology (RSM). According to data analysis with the Design-Expert® software, the maximum stimulation index value (23.51%) was obtained under the following conditions: a carbon source content of 1.97 g·L-1, an initial pH of 5.0, and a fermentation time of 74.43 h. Under the optimized fermentation conditions, within a certain concentration range, the fermentation broth had a significant effect on the proliferation of mouse spleen lymphocytes. An ultrafiltration technique was used to separate the fermentation broth into fractions of different molecular weight (MW). Peptides below 10 kDa were found to be the main bioactive fractions responsible for the immunomodulatory and antioxidant activities.
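As an illustration of the RSM step only: once a second-order model has been fit to the measured responses, the optimum is located on the fitted surface. The Java sketch below grid-searches a hypothetical quadratic response; its coefficients are invented placeholders centered near the reported optimum, not the paper's fitted model.

```java
// Locating the maximum of a fitted second-order response surface by grid search.
public class RsmOptimum {
    // Hypothetical fitted response: stimulation index as a function of
    // carbon source content c (g/L), pH p, and fermentation time t (h).
    static double response(double c, double p, double t) {
        return 23.5 - 3.0 * (c - 2.0) * (c - 2.0)
                    - 1.5 * (p - 5.0) * (p - 5.0)
                    - 0.002 * (t - 74.0) * (t - 74.0);
    }

    public static void main(String[] args) {
        double best = Double.NEGATIVE_INFINITY, bc = 0, bp = 0, bt = 0;
        for (double c = 1.0; c <= 3.0; c += 0.05)        // search the design region
            for (double p = 4.0; p <= 6.0; p += 0.05)
                for (double t = 60; t <= 90; t += 0.5) {
                    double r = response(c, p, t);
                    if (r > best) { best = r; bc = c; bp = p; bt = t; }
                }
        System.out.printf("max SI %.2f%% at c=%.2f g/L, pH %.2f, t=%.1f h%n",
                best, bc, bp, bt);
    }
}
```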
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code; such tools can help meet all of these goals and significantly improve software quality, and this remains a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making it more effective at the early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thus improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs.
Incorrect sequences of machine-code patterns are identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy are identified based on the stipulated rules for the target processor. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of any particular compiler or assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved model checking. Though the technique described is general, the implementation is architecture-oriented; hence, the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features when developing embedded systems.
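As a minimal, hedged illustration of redundant bank-switch detection, the Java sketch below tracks the active memory bank along a straight-line instruction sequence and drops bank-select instructions that reselect the already-active bank. The dissertation's analysis works on a control flow graph with a relation matrix and state transition diagram; the mnemonics here are simplified stand-ins for PIC16F87X bank-select bit writes, not real PIC assembly.

```java
import java.util.*;

// Track the active memory bank through a straight-line instruction sequence and
// drop bank-select instructions that reselect the bank already active.
public class BankSwitchElimination {
    record Insn(String text, int selectsBank) {}     // selectsBank < 0: not a bank switch

    static List<Insn> dropRedundant(List<Insn> program) {
        List<Insn> out = new ArrayList<>();
        int activeBank = -1;                          // bank unknown at entry
        for (Insn insn : program) {
            if (insn.selectsBank() >= 0) {
                if (insn.selectsBank() == activeBank) continue;   // redundant: drop it
                activeBank = insn.selectsBank();
            }
            out.add(insn);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Insn> prog = List.of(
            new Insn("SELECT_BANK 1", 1),
            new Insn("MOVWF 0xA0", -1),
            new Insn("SELECT_BANK 1", 1),             // redundant, will be dropped
            new Insn("MOVWF 0xA1", -1),
            new Insn("SELECT_BANK 0", 0),
            new Insn("MOVF 0x20, W", -1));
        dropRedundant(prog).forEach(i -> System.out.println(i.text()));
    }
}
```

Extending this to whole programs requires merging the active-bank state at control-flow joins (and resetting it to unknown when predecessors disagree), which is where the dissertation's state-transition formulation comes in.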