925 results for Fix and optimize
Abstract:
Many practical problems arising in logistics can be modeled as vehicle routing problems. In general, this family of problems involves designing routes, starting and ending at a depot, that are used to distribute goods to a set of geographically dispersed customers while minimizing the costs associated with the routes. Depending on the problem type, one or several depots may be present. Vehicle routing problems are among the most difficult combinatorial problems to solve. In this thesis, we study a combinatorial optimization problem, belonging to the class of vehicle routing problems, that arises in the context of transportation networks. We introduce a new problem mainly inspired by the activities of collecting milk from production farms and redistributing the collected product to processing plants in the province of Quebec. Two variants of this problem are considered. The first aims to design a tactical routing plan for the milk collection-redistribution problem over a given horizon, assuming that the production level over the horizon is fixed. The second variant aims to provide a more precise plan by accounting for potential variations in production level that may occur over the horizon considered. In the first part of this thesis, we describe an exact algorithm for the first variant of the problem, which is characterized by time windows, multiple depots, and a heterogeneous vehicle fleet, and whose objective is to minimize routing cost. To this end, the problem is modeled as a multi-attribute vehicle routing problem.
The exact algorithm is based on column generation involving an elementary shortest path algorithm with resource constraints. In the second part, we design an exact algorithm for the second variant of the problem. To this end, the problem is modeled as a multi-period vehicle routing problem that explicitly accounts for potential variations in production level over a given horizon. New strategies are proposed for solving the elementary shortest path problem with resource constraints, which in this case has a particular structure owing to the multi-period nature of the overall problem. Solving realistically sized instances within reasonable computing times requires a heuristic approach. The third part proposes an adaptive large neighborhood search algorithm in which several new exploration and exploitation strategies are introduced to improve the performance of the algorithm in terms of both solution quality and computing time.
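The destroy/repair loop and adaptive operator weights at the heart of adaptive large neighborhood search can be sketched as follows. This is a minimal illustration on a toy single-vehicle instance with made-up coordinates, not the thesis's algorithm; all names and parameters are assumptions for demonstration.

```python
import math
import random

# Toy routing instance: depot at index 0, customers 1..5, Euclidean costs.
COORDS = [(0, 0), (2, 3), (5, 1), (6, 4), (1, 5), (4, 6)]

def dist(i, j):
    (x1, y1), (x2, y2) = COORDS[i], COORDS[j]
    return math.hypot(x1 - x2, y1 - y2)

def route_cost(route):
    # route is a list of customer indices; the vehicle starts/ends at depot 0
    tour = [0] + route + [0]
    return sum(dist(a, b) for a, b in zip(tour, tour[1:]))

def destroy_random(route, rng, k):
    removed = rng.sample(route, k)
    return [c for c in route if c not in removed], removed

def repair_greedy(route, removed):
    # cheapest-insertion repair: place each removed customer at its best position
    for c in removed:
        best_pos, best_cost = 0, float("inf")
        for pos in range(len(route) + 1):
            cost = route_cost(route[:pos] + [c] + route[pos:])
            if cost < best_cost:
                best_pos, best_cost = pos, cost
        route = route[:best_pos] + [c] + route[best_pos:]
    return route

def alns(iters=200, seed=1):
    rng = random.Random(seed)
    current = list(range(1, len(COORDS)))
    best = current[:]
    destroyers = [lambda r: destroy_random(r, rng, 1),
                  lambda r: destroy_random(r, rng, 2)]
    weights = [1.0] * len(destroyers)
    for _ in range(iters):
        # adaptive selection: roulette wheel on operator weights
        i = rng.choices(range(len(destroyers)), weights=weights)[0]
        partial, removed = destroyers[i](current[:])
        candidate = repair_greedy(partial, removed)
        if route_cost(candidate) < route_cost(current):
            current = candidate
            weights[i] += 0.5          # reward the successful operator
        if route_cost(current) < route_cost(best):
            best = current[:]
        weights = [0.99 * w for w in weights]  # decay toward uniformity
    return best, route_cost(best)
```

A full implementation would add an acceptance criterion for worse solutions and many more destroy/repair operators; the exploration and exploitation strategies of the thesis go well beyond this skeleton.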
Abstract:
The label "one-man band" is applied to a wide variety of musicians who distinguish themselves by performing alone a piece that would normally be played by several people. The diversity this form has taken over time is not reflected in popular culture, which presents a relatively constant image of the figure, as seen in Walt Disney's Mary Poppins (1964) and Pixar's One-man Band (2005): a single performer dressed in a colorful costume, with a bass drum on his back, cymbals between his legs, a guitar or another string instrument in his hands, and a small wind instrument fixed close enough to his mouth to let him alternate between singing and playing. This thesis proposes an analysis of the one-man band that goes beyond mere musical production, situating the phenomenon as a spectacular genre that transmits symbolic content through a tripartite relationship between entertaining performance, spectator, and image. The symbolic content is tied to ideas characteristic of the Enlightenment, such as liberty, the individual, and a relationship with technology. It is embodied simultaneously by the performers themselves and by the representation of the one-man band in the collective imagination. At the same time, each performance reaffirms the image of the one-man band, an image that through repetition has become a cultural commonplace, existing beyond any single performer or performance. The visual aspect of the one-man band plays an important role in this process, through an unexpected use of the body, a causal relationship between body, technology, and musical production, and the use of colorful clothing and non-musical props such as puppets, fireworks, or live animals. These spectacular elements entertain spectators, which translates, among other things, into financial gain for the performer. The entertainment has a phatic function that facilitates the communication of the symbolic content.
Abstract:
In a recent study, the serotype 3 Dearing strain of mammalian orthoreovirus was adapted to Vero cells, cells that exhibit a limited ability to support the early steps of reovirus uncoating and are unable to produce interferon as an antiviral response upon infection. The Vero cell-adapted virus (VeroAV) exhibits amino acid substitutions in both the σ1 and μ1 outer capsid proteins but no changes in the σ3 protein; accordingly, the virus was shown not to behave as a classical uncoating mutant. In the present study, an increased ability of the virus to bind to the Vero cell surface was observed and is likely associated with an increased ability to bind to cell-surface sialic acid residues. In addition, the kinetics of μ1 disassembly from the virions appear to be altered. The plasmid-based reverse genetics approach confirmed the importance of the σ1 amino acid substitutions in VeroAV's ability to efficiently infect Vero cells, although μ1 co-adaptation appears necessary to optimize viral infection. This approach of combining in vitro selection of reoviruses with reverse genetics to identify pertinent amino acid substitutions appears promising in the context of eventual reovirus modification to increase its potential as an oncolytic virus.
Selective N-monomethylation of aniline using Zn1-xCoxFe2O4 (x = 0, 0.2, 0.5, 0.8 and 1.0) type systems
Abstract:
A series of ferrites having the general formula Zn1-xCoxFe2O4 (x = 0, 0.2, 0.5, 0.8 and 1.0) was prepared by a soft chemical route. The materials were characterized by various physico-chemical methods. The reaction of aniline with methanol was studied in a fixed-bed reactor system as a potential source for the production of various methylanilines. It was observed that systems possessing low 'x' values are highly selective and active for the N-monoalkylation of aniline, leading to N-methylaniline. Reaction parameters were varied to optimize the reaction conditions for obtaining N-methylaniline selectively and in better yield. Among the systems, Zn0.8Co0.2Fe2O4 is remarkable for its very high activity and excellent stability: under the optimized conditions, N-methylaniline selectivity exceeded 98%. Even at a methanol-to-aniline molar ratio of 2, the yield of N-methylaniline was nearly 50%, while it exceeded 71% at a molar ratio of 5. ZnFe2O4, though it showed better conversion than Zn0.8Co0.2Fe2O4 in the initial period of the run, deactivates quickly as the reaction proceeds. The Lewis acidity of the catalysts is mainly responsible for the good performance. Cation distribution in the spinel lattice influences the acido-basic properties, and these factors have therefore been considered helpful parameters in evaluating the activity of the systems.
Abstract:
To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal-cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal-cutting conditions; however, attaining optimum values every time is difficult even for a skilled operator. The nonlinear nature of the machining process has compelled engineers to search for more effective methods of optimization. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes. In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes have been identified. The selection of optimal cutting parameters, such as depth of cut, feed, and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirlosker turn master 35 lathe. Analyses using the S/N ratio and ANOVA were performed to find the optimum level and the percentage contribution of each parameter; the S/N analysis yields the optimum machining parameters from the experiments. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively evaluate new design solutions within the relevant search spaces in order to reach the true optimum.
A mathematical model for surface roughness has been developed using response surface analysis, and the model was validated against published results from the literature. Optimization methodologies such as simulated annealing (SA), particle swarm optimization (PSO), a conventional genetic algorithm (CGA), and an improved genetic algorithm (IGA) are applied to optimize machining parameters for dry turning of SS420 material. All of the above algorithms were tested for efficiency, robustness, and accuracy, and for how often they outperform conventional optimization methods on difficult real-world problems. The SA, PSO, CGA, and IGA codes were developed in MATLAB. For each evolutionary algorithmic method, optimum cutting conditions are provided to achieve a better surface finish. The computational results using SA clearly demonstrate that the proposed solution procedure is quite capable of solving such complicated problems effectively and efficiently. Particle swarm optimization is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations; the results show that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm, incorporating a stochastic crossover technique and an artificial initial population scheme, is developed to provide a faster search mechanism. Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420 material, arriving at optimum machining parameters of feed, cutting speed, depth of cut, and tool nose radius with minimum surface roughness as the criterion.
To summarize, the research fills conspicuous gaps between research prototypes and industry requirements by simulating the evolutionary procedures that nature uses to optimize its own systems.
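The simulated annealing loop used for such parameter optimization can be illustrated with a short sketch. The response-surface coefficients and parameter bounds below are invented for demonstration (the thesis fits its own roughness model from Taguchi-designed experiments, in MATLAB); this Python version only shows the perturb/accept loop and the geometric cooling schedule.

```python
import math
import random

# Illustrative response-surface model for surface roughness Ra as a function of
# cutting speed v (m/min), feed f (mm/rev) and depth of cut d (mm).
# Coefficients are hypothetical, chosen only to give a smooth test function.
def roughness(v, f, d):
    return 2.0 - 0.004 * v + 12.0 * f + 0.3 * d + 20.0 * f * f

BOUNDS = {"v": (100, 300), "f": (0.05, 0.3), "d": (0.5, 2.0)}

def simulated_annealing(iters=5000, t0=1.0, alpha=0.999, seed=0):
    rng = random.Random(seed)
    x = {k: rng.uniform(*b) for k, b in BOUNDS.items()}
    best = dict(x)
    t = t0
    for _ in range(iters):
        # perturb one parameter, clipped to its bounds
        k = rng.choice(list(BOUNDS))
        lo, hi = BOUNDS[k]
        cand = dict(x)
        cand[k] = min(hi, max(lo, x[k] + rng.gauss(0, 0.05 * (hi - lo))))
        delta = roughness(**cand) - roughness(**x)
        # accept improvements always, worse moves with Boltzmann probability
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if roughness(**x) < roughness(**best):
            best = dict(x)
        t *= alpha  # geometric cooling schedule
    return best, roughness(**best)
```

With this toy model the minimum lies at high speed, low feed, and shallow cut; the same loop applies unchanged once the real fitted model is substituted for `roughness`.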
Abstract:
The overall focus of the thesis is the synthesis and characterization of CdSe QDs overcoated with shell materials for various biological and chemical sensing applications. The second chapter deals with the synthesis and characterization of CdSe and CdSe/ZnS core-shell QDs; the primary aim of this work is to develop a simple method based on photoinduced charge transfer to optimize the shell thickness. The synthesis of water-soluble CdSe QDs, their cytotoxicity analysis, and the investigation of their nonlinear optical properties form the subject of the third chapter. The final chapter deals with the development of QD-based sensor systems for the selective detection of biologically and environmentally important analytes in aqueous media.
Abstract:
Identification and control of nonlinear dynamical systems are challenging problems for control engineers. The topic is equally relevant in communication, weather prediction, biomedical systems, and even social systems, where nonlinearity is an integral part of system behavior. Most real-world systems are nonlinear in nature, and nonlinear system identification/modeling has wide applications. The basic approach to analyzing nonlinear systems is to build a model from known behavior manifest in the form of system output. The modeling problem boils down to computing a suitably parameterized model representing the process; the parameters of the model are adjusted to optimize a performance function based on the error between the given process output and the identified process/model output. While linear system identification is well established, with many classical approaches, most of those methods cannot be directly applied to nonlinear system identification. The problem becomes more complex if the system is completely unknown and only the output time series is available; the blind recognition problem is the direct consequence of such a situation, and the thesis concentrates on such problems. The capability of artificial neural networks to approximate many nonlinear input-output maps makes them particularly suitable for building models for the identification of nonlinear systems where only the time series is available. The literature is rich with a variety of algorithms to train the neural network model, but a comprehensive study of the computation of the model parameters using the different algorithms, together with a comparison among them to choose the best technique, is still a pressing need of practical system designers and is not available in a concise form in the literature. The thesis is thus an attempt to develop and evaluate some of the well-known algorithms and to propose some new techniques in the context of blind recognition of nonlinear systems. It also attempts to establish the relative merits and demerits of the different approaches; comprehensiveness is achieved by utilizing well-known evaluation techniques from statistics. The study concludes by presenting the results of implementing the currently available, modified, and newly introduced techniques for nonlinear blind system modeling, followed by a comparison of their performance. It is expected that such a comprehensive study and comparison will be of great relevance in many fields, including chemical, electrical, biological, financial, and weather data analysis. Further, the results reported should be of immense help to practical system designers and analysts in selecting the most appropriate method based on the goodness of the model for the particular context.
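The parameter-adjustment loop described above (tune model parameters to minimize the error between measured and predicted output) can be sketched as follows. For brevity, a linear-in-parameters model stands in for the neural network models the thesis studies, and the "true" system below is invented for the demonstration; the gradient step is the classic LMS update.

```python
import random

# Minimal system-identification loop: adjust model parameters to minimize the
# error between the process output and the model output.
# Hypothetical "true" process: y[k] = 0.5*y[k-1] + 0.3*u[k-1] + 0.2*u[k-1]^2
def true_system(y_prev, u_prev):
    return 0.5 * y_prev + 0.3 * u_prev + 0.2 * u_prev ** 2

def identify(n_samples=5000, lr=0.05, seed=0):
    rng = random.Random(seed)
    theta = [0.0, 0.0, 0.0]      # parameters for regressors [y[k-1], u[k-1], u[k-1]^2]
    y = 0.0
    for _ in range(n_samples):
        u = rng.uniform(-1, 1)   # persistently exciting random input
        y_next = true_system(y, u)
        phi = [y, u, u * u]      # regressor vector
        y_hat = sum(t * p for t, p in zip(theta, phi))
        err = y_next - y_hat     # output prediction error
        # LMS gradient step on the squared prediction error
        theta = [t + lr * err * p for t, p in zip(theta, phi)]
        y = y_next
    return theta
```

In the noise-free case the estimated parameters converge to the true values (0.5, 0.3, 0.2); replacing the linear regressor with a neural network changes only the model and the gradient computation, not the overall identification loop.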
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. Such tools could significantly improve software quality, and this remains a challenging field. This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thus improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. Incorrect sequences of machine-code patterns are identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler or assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces the state space, contributing to improved model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices toward correct use of difficult microcontroller features in developing embedded systems.
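The core idea behind redundant bank-switch elimination can be illustrated on a single basic block. The instruction mnemonics below are hypothetical (PIC-style BANKSEL pseudo-instructions); the dissertation's method additionally works over the control flow graph with a relation matrix and state transition diagram, which this sketch omits.

```python
# Sketch: walk a straight-line sequence of (hypothetical) instructions,
# track the active memory bank, and drop bank switches that re-select
# the bank that is already active.
def eliminate_redundant_bank_switches(instructions):
    optimized = []
    active_bank = None                  # bank state unknown at block entry
    for instr in instructions:
        if instr.startswith("BANKSEL"):
            bank = instr.split()[1]
            if bank == active_bank:
                continue                # redundant: this bank is already active
            active_bank = bank          # state transition to the new bank
        optimized.append(instr)
    return optimized

prog = ["BANKSEL 1", "MOVF x", "BANKSEL 1", "ADDWF y", "BANKSEL 0", "MOVWF z"]
# the second "BANKSEL 1" is dropped; the switch to bank 0 is kept
```

Handling branches requires merging the possible active-bank states at join points, which is where the state transition diagram of the dissertation comes in.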
Abstract:
This thesis, entitled "Development of nitrifying and photosynthetic sulfur bacteria based bioaugmentation systems for the bioremediation of ammonia and hydrogen sulphide in shrimp culture," proposes a sustainable, low-cost option for the mitigation of toxic ammonia and hydrogen sulphide in shrimp culture systems. The use of 'bioaugmentors' as pond additives is an emerging field in aquaculture, and understanding the role of the organisms involved in a 'bioaugmentor' helps to optimize conditions for their activity. The thesis describes the immobilization of nitrifying consortia on wood powder. Shrimp grow-out systems are specialized and highly dynamic aquaculture production units which, when operated in zero-exchange mode, require bioremediation of ammonia, nitrite nitrogen, and hydrogen sulphide to protect the crop. The research conducted here develops an economically viable and user-friendly technology for addressing this problem. The nitrifying bacterial consortia (NBC) generated earlier (Achuthan et al., 2006) were used for developing the technology. The study clearly demonstrates the better quality of the immobilized nitrifiers generated here for field application.
Abstract:
Mangroves are considered to play a significant role in global carbon cycling. Mangrove forests fix CO2 by photosynthesis into mangrove lumber and thus decrease the possibility of a catastrophic series of events: global warming by atmospheric CO2, melting of the polar ice caps, and inundation of the great coastal cities of the world. Leaf litter and roots are the main contributors to mangrove sediments, though algal production and allochthonous detritus can also be trapped (Kristensen et al., 2008). Owing to their high organic matter content and reducing nature, mangrove sediments are excellent metal retainers. Environmental pollution due to metals is of major concern because metals are not biodegradable or perishable the way most organic pollutants are. While most organic toxicants can be destroyed by combustion and converted into compounds such as CO, CO2, SOx, and NOx, metals cannot be destroyed; at most, the valence and physical form of the metals may change. The concentration of metals present naturally in air, water, and soil is very low. Metals released into the environment through anthropogenic activities such as the burning of fossil fuels, discharge of industrial effluents, mining, and dumping of sewage lead to higher-than-tolerable or toxic levels of metals in the environment, i.e., metal pollution. A number of heavy metals such as Fe, Mn, Cu, Ni, Zn, Co, Cr, Mo, and V are essential to plants and animals, and deficiency of these metals may lead to disease, but higher levels lead to metal toxicity. Almost all industrial processes and urban activities release at least trace quantities of half a dozen metals in different forms. Heavy metal pollution in the environment can remain dormant for a long time and then surface with a vengeance; once an area becomes toxified with metals, it is almost impossible to detoxify it.
The symptoms of metal toxicity are often quite similar to those of other common diseases such as respiratory problems, digestive disorders, skin diseases, hypertension, diabetes, and jaundice, making it all the more difficult to diagnose metal poisoning. For example, Minamata disease, caused by mercury pollution, in addition to affecting the nervous system, can disturb liver function and cause diabetes and hypertension. The damage caused by heavy metals does not end with the affected person; the harmful effects can be transferred to the person's progeny. Ironically, heavy metal pollution is a direct offshoot of our increasing ability to mass-produce metals and use them in all spheres of existence. Along with conventional physico-chemical methods, biosystem approaches are also being used to combat metal pollution.
Abstract:
The cyanobacteria comprise a large number of organisms characterised by a low state of cellular organization. Their cells lack a well-defined nucleus. Cell division occurs by division of the protoplast through an ingrowth of the septum. These organisms are generally characterised by a blue-green colouration of the cell, the chief pigments being chlorophyll-a, carotenes, xanthophylls, C-phycocyanin, and C-phycoerythrin. The product of photosynthesis is glycogen. These organisms lack flagellate reproductive bodies, and there is a total absence of sexual reproduction. They are also unique because of the presence of murein in place of cellulose in the cell wall and the absence of chloroplasts, mitochondria, and endoplasmic reticulum. Like bacteria, some of them possess plasmids and can fix atmospheric nitrogen. In the present study, the growth kinetics, heavy metal tolerance, tolerance mechanisms, heavy metal uptake, and antibacterial activity of Synechocystis salina Wislouch, a nanoplanktonic, euryhaline cyanobacterium present in the Cochin backwaters, were investigated with a view to the potential biotechnological application of this organism. S. salina occurs as small spherical cells of about 3 µm diameter (sometimes in pairs) with a bluish-green colour. The species is characterised by a jerky movement of the cells and is structurally similar to other cyanobacteria.
Abstract:
The use of short fibers as reinforcing fillers in rubber composites is on an increasing trend. They are popular due to the possibility of obtaining anisotropic properties, ease of processing, and economy. In the preparation of these composites, short fibers are incorporated on two-roll mixing mills or in internal mixers, a highly energy-intensive and time-consuming process. This calls for developing less energy-intensive and less time-consuming processes for the incorporation and distribution of short fibers in the rubber matrix. One such method is to incorporate the fibers at the latex stage. The present study primarily aims to optimize the preparation of short fiber–natural rubber composites by latex-stage compounding and to evaluate the resulting composites in terms of mechanical, dynamic mechanical, and thermal properties. A synthetic fiber (nylon) and a natural fiber (coir) are used to evaluate the advantages of processing through the latex stage. To extract the full reinforcing potential of the coir fibers, the macro fibers are converted to micro fibers through chemical and mechanical means. The thesis is presented in 7 chapters.
Abstract:
Light, in its physical and philosophical sense, has captured the human imagination from the dawn of civilization. The invention of lasers in the 1960s caused a renaissance in the field of optics. This intense, monochromatic, highly directional radiation created new frontiers in science and technology. The strong oscillating electric field of laser radiation creates a polarisation response that is nonlinear in character in the medium through which it passes, and the medium acts as a new source of an optical field with altered properties. It was in this context that the field of optoelectronics, which encompasses the generation, modulation, and transmission of optical radiation, gained tremendous importance. Organic molecules and polymeric systems have emerged as a class of promising materials for optoelectronics because they offer the flexibility, both at the molecular and bulk levels, to optimize the nonlinearity and other properties suitable for device applications. Organic nonlinear optical media, which yield large third-order nonlinearities, have been widely studied to develop optical devices such as high-speed switches and optical limiters. Transparent polymeric materials have found one of their most promising applications in lasers, in which they can be used as active elements when doped with suitable laser dyes. Solid-matrix dye lasers combine the advantages of solid-state lasers with the possibility of tuning the radiation over a broad spectral range. Polymeric matrices impregnated with organic dyes have not yet been widely used because of the low resistance of the matrices to laser damage, low dye photostability, and low dye stability over longer periods of operation and storage. In this thesis we investigate the nonlinear and radiative properties of certain organic materials and doped polymeric matrices and their possible role in device development.
Abstract:
In today's complicated computing environment, managing data has become a primary concern for all industries. Information security is a great challenge, and it has become essential to secure enterprise system resources such as databases and operating systems from attacks by unknown outsiders. Our approach plays a major role in detecting and managing vulnerabilities in complex computing systems. It allows enterprises to assess two primary tiers through a single interface as a vulnerability scanner tool, providing a secure system that is also compatible with the security compliance requirements of the industry. It provides an overall view of the vulnerabilities in the database by automatically scanning them with minimum overhead, and gives a detailed view of the risks involved and their corresponding ratings. Based on these priorities, an appropriate mitigation process can be implemented to ensure a secure system. The results show that our approach can effectively reduce the time and cost involved compared with existing systems.
Abstract:
This work deals with the optical properties of supported noble metal nanoparticles, which are dominated by the so-called Mie resonance and depend strongly on the particles' morphology. For this reason, characterization and control of the dimensions of these systems are desirable in order to optimize their applications. Gold and silver nanoparticles have been produced on dielectric supports such as quartz glass, sapphire, and rutile by vapor deposition under ultra-high-vacuum conditions. During preparation, coalescence is observed as an important mechanism of cluster growth. The particles have been studied in situ by optical transmission spectroscopy and ex situ by atomic force microscopy. It is shown that the morphology of the aggregates can be regarded as oblate spheroids. A theoretical treatment of their optical properties, based on the quasistatic approximation, combined with results obtained by atomic force microscopy, gives a detailed characterization of the nanoparticles. This method has been compared with transmission electron microscopy, and the results are in excellent agreement. Tailoring of the clusters' dimensions by irradiation with nanosecond-pulsed laser light has been investigated: selected particles within the ensemble are heated by excitation of the Mie resonance under irradiation with a tunable laser source. Laser-induced coalescence strongly limits tailoring of the particle size; nevertheless, control of the particle shape is possible. Laser-tailored ensembles have been tested as substrates for surface-enhanced Raman spectroscopy (SERS), leading to an improvement of the results. Moreover, they constitute reproducible, robust, and tunable SERS substrates with a high potential for specific applications, in the present case focused on environmental protection, and are thus ideally suited for routine measurements.
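The quasistatic treatment of oblate spheroidal particles mentioned above can be sketched numerically. The expressions below are the standard quasistatic (Rayleigh) formulas for the depolarization factors and dipole polarizability of a spheroid; the specific dielectric values and axis lengths used are illustrative, not fitted to the samples studied in this work.

```python
import math

def depolarization_factors(a, c):
    """Oblate spheroid with semi-axes a = b > c (c along the symmetry axis).

    Returns (L_a, L_b, L_c); the three factors always sum to 1, and the
    sphere limit a = c gives L = 1/3 on every axis.
    """
    if abs(a - c) < 1e-12:
        return (1 / 3, 1 / 3, 1 / 3)      # sphere limit
    e2 = 1 - (c / a) ** 2                  # squared eccentricity
    g = math.sqrt((1 - e2) / e2)
    # factor along the equatorial (long) axes
    La = g / (2 * e2) * (math.pi / 2 - math.atan(g)) - g * g / 2
    Lc = 1 - 2 * La                        # factor along the short symmetry axis
    return (La, La, Lc)

def polarizability(eps, eps_m, a, c, axis=0):
    """Quasistatic dipole polarizability along one principal axis,
    for particle permittivity eps embedded in a medium eps_m."""
    L = depolarization_factors(a, c)[axis]
    V = 4 / 3 * math.pi * a * a * c        # spheroid volume
    return V * (eps - eps_m) / (eps_m + L * (eps - eps_m))

def resonance_epsilon(eps_m, a, c, axis=0):
    """Re(eps) at which the quasistatic (Mie) resonance occurs:
    the denominator of the polarizability vanishes."""
    L = depolarization_factors(a, c)[axis]
    return eps_m * (1 - 1 / L)
```

For a sphere this reproduces the Fröhlich condition Re(eps) = -2 eps_m; flattening the particle lowers the equatorial depolarization factor below 1/3 and shifts the corresponding resonance to more negative Re(eps), i.e. to longer wavelengths, which is how shape control tunes the Mie resonance.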