988 results for DISTRIBUTED OPTIMIZATION


Relevance: 20.00%

Abstract:

The thesis deals with the preparation and dielectric characterization of polyaniline and its analogues in the ISM band of 2-4 GHz, which lies within the microwave region (300 MHz to 300 GHz) of the electromagnetic spectrum, together with an initial dielectric study at high frequencies (0.05 MHz to 13 MHz). Polyaniline has been synthesized by an in situ doping reaction at different temperatures in the presence of inorganic dopants such as HCl, H2SO4, HNO3 and HClO4, and organic dopants such as camphorsulphonic acid (CSA), toluenesulphonic acid (TSA) and naphthalenesulphonic acid (NSA). The variation in dielectric properties with reaction temperature, dopant and frequency has been studied. The effect of codopants and microemulsions on the dielectric properties has also been studied in the ISM band. The ISM band of frequencies (2-4 GHz) is of great utility in Industrial, Scientific and Medical (ISM) applications. Microwave heating is a very efficient method of heating dielectric materials and is extensively used in industrial as well as household heating applications.
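
As background for the measurements summarised above, dielectric studies of this kind normally report the complex permittivity, the loss tangent and the volumetric heating power; the relations below are standard textbook expressions, not results taken from the thesis.

```latex
\varepsilon^{*} = \varepsilon' - j\varepsilon'', \qquad
\tan\delta = \frac{\varepsilon''}{\varepsilon'}, \qquad
P_{v} = 2\pi f\,\varepsilon_{0}\,\varepsilon''\,E_{\mathrm{rms}}^{2}
```

Here P_v is the power dissipated per unit volume at frequency f in a field of rms amplitude E_rms, which is why a large ε'' makes a material an efficient microwave absorber.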

Relevance: 20.00%

Abstract:

Sharing information with those who need it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that well-organized schemes for its retrieval and discovery are imperative. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron.

The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal, so that a proper mechanism for information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the required insights. This is manifested in the Election Counting and Reporting Software (ECRS) system, a distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports.

Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them prone to collapse when minor faults occur. This is resolved with the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture. The results of the experiments conducted and their analysis show that such an architecture helps keep the different components of the software insulated from internal and external faults. The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated the introduction of the novel concept of the infotron. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary.

Another empirical study examined which of the two prominent markup languages, HTML and XML, is better suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken, and the result favoured XML. The concepts of the infotron and the infotron dictionary were then applied to implement an Information Discovery System (IDS). IDS starts with the infotron(s) supplied as clue(s) and distils the information required to satisfy the need of the information discoverer from the documents available to it (its information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly; many subsystems interact with multiple infotron dictionaries maintained in the system.

To demonstrate the working of IDS and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper around the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, leading to an information discovery service. IDLIS demonstrates IDS in action and shows that any legacy system can be augmented with IDS to provide this additional information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
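
The abstract does not give an implementation of the infotron dictionary; the sketch below is a purely illustrative reading of the idea of a dictionary of clue terms guiding discovery over an information space. Every name and data structure here is an assumption, not the thesis design.

```python
# Purely illustrative sketch: an "infotron dictionary" maps a clue term to
# related terms, and discovery returns the documents mentioning any of them.
# Names, structure and data are invented for illustration only.

INFOTRON_DICTIONARY = {
    "candidate": ["candidate", "nominee", "contestant"],
    "constituency": ["constituency", "district", "ward"],
    "votes": ["votes", "tally", "count"],
}

def discover(infotrons, documents):
    """Return, per infotron, the documents whose text mentions any related term."""
    hits = {}
    for infotron in infotrons:
        terms = INFOTRON_DICTIONARY.get(infotron, [infotron])
        hits[infotron] = [
            doc_id for doc_id, text in documents.items()
            if any(term in text.lower() for term in terms)
        ]
    return hits

if __name__ == "__main__":
    corpus = {
        "doc1": "Final tally of votes per ward released by the returning officer.",
        "doc2": "List of nominees filed before the deadline.",
    }
    print(discover(["votes", "candidate"], corpus))
```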

Relevance: 20.00%

Abstract:

The main source of protein for human and animal consumption is the agricultural sector, where production is vulnerable to diseases, fluctuations in climatic conditions and deteriorating hydrological conditions due to water pollution. Single Cell Protein (SCP) production has therefore evolved as an excellent alternative. Among all sources of microbial protein, yeast has attained global acceptability and has been preferred for SCP production. The screening and evaluation of nutritional and other culture variables of microorganisms are very important in the development of a bioprocess for SCP production. The application of statistical experimental design in bioprocess development can result in improved product yields, reduced process variability, closer conformance of the output response to target requirements, and reduced development time and overall cost.

The present work was undertaken to develop a bioprocess technology for the mass production of a marine yeast, Candida sp. S27. Yeasts isolated from the offshore waters of the southwest coast of India and maintained in the Microbiology Laboratory were subjected to various tests for the selection of a potent strain for biomass production. The selected marine yeast was identified based on ITS sequencing, and a biochemical/nutritional characterization of Candida sp. S27 was carried out. Using Response Surface Methodology (RSM), the process parameters (pH, temperature and salinity) were optimized. For mass production of yeast biomass, a chemically defined medium (Barnett and Ingram, 1955) and a crude medium (molasses-yeast extract) were optimized using RSM. Scale-up of biomass production was done in a bench-top fermenter using these two optimized media. The comparative efficacy of the defined and crude media was estimated, and the biomass produced with the two optimized media was evaluated nutritionally.
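
The abstract does not report the fitted model, but RSM optimization of three factors (pH, temperature, salinity) with a central composite design typically means fitting a second-order polynomial and locating its stationary point. A minimal sketch of that step, using invented design data rather than the thesis measurements:

```python
# Second-order (quadratic) response-surface fit for three coded factors,
# as typically done after a central composite design. Design points and
# responses below are invented for illustration; they are not thesis data.
import numpy as np

# Coded levels (x1 = pH, x2 = temperature, x3 = salinity) and response y.
X = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
    [-1.68, 0, 0], [1.68, 0, 0], [0, -1.68, 0], [0, 1.68, 0],
    [0, 0, -1.68], [0, 0, 1.68], [0, 0, 0], [0, 0, 0],
], dtype=float)
y = np.array([2.1, 2.4, 2.6, 2.9, 2.0, 2.3, 2.5, 2.8,
              2.2, 2.7, 2.1, 2.9, 2.3, 2.4, 3.1, 3.0])  # e.g. biomass, g/L

def design_matrix(X):
    """Columns: 1, x1, x2, x3, x1^2, x2^2, x3^2, x1*x2, x1*x3, x2*x3."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Stationary point of the fitted quadratic: solve  B*xs = -b.
b = beta[1:4]
B = np.array([[2*beta[4],   beta[7],   beta[8]],
              [  beta[7], 2*beta[5],   beta[9]],
              [  beta[8],   beta[9], 2*beta[6]]])
x_stationary = np.linalg.solve(B, -b)
print("fitted coefficients:", np.round(beta, 3))
print("stationary point (coded units):", np.round(x_stationary, 3))
```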

Relevance: 20.00%

Abstract:

A two-stage process, consisting of precursor preparation by thermal evaporation followed by chalcogenisation in the required atmosphere, is found to be a feasible technique for PV materials such as n-type β-In2S3, p-CuInSe2, p-CuInS2 and p-CuIn(Se1-xSx)2. Growth parameters such as the chalcogenisation temperature and duration have been optimised in the present study. Single-phase β-In2S3 thin films can be obtained by sulfurising indium films above 300°C for 45 minutes; lower sulfurisation temperatures required prolonged annealing after sulfurisation to obtain single-phase β-In2S3, which resulted in high material loss. The maximum band gap of 2.58 eV was obtained for the nearly stoichiometric β-In2S3 film sulfurised at 350°C. This wider-band-gap, n-type β-In2S3 can be used as an alternative to toxic CdS as the window layer in photovoltaics.

A systematic study of the structural, optical and electrical properties of CuInSe2 films, varying process parameters such as the selenization duration and temperature, led to the conclusion that for the growth of single-phase CuInSe2 the optimum selenization temperature is 350°C and the optimum duration is 3 hours. The presence of some binary phases in films prepared with shorter selenization periods and lower selenization temperatures may be due to incomplete reaction and indium loss. An optical band gap energy of 1.05 eV was obtained for films prepared under the optimum conditions.

To obtain a closer match to the solar spectrum, it is desirable to increase the band gap of CuInSe2 by a few meV. Further work was therefore carried out to produce graded-band-gap CuIn(Se,S)2 absorber films by incorporating sulfur into CuInSe2. It was observed that when CuInSe2 prepared by the two-stage process was post-annealed in a sulfur atmosphere, the sulfur may occupy interstitial positions or form a CuInS2 phase along with the CuInSe2 phase. Sulfur treatment during the selenization of Cu11In9 precursors resulted in CuIn(Se,S)2 thin films, for which a band gap of 1.38 eV was obtained. The optimised thin films, n-type β-In2S3, p-CuInSe2 and p-CuIn(Se1-xSx)2, can be used for the fabrication of polycrystalline solar cells.
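
The abstract does not state how the band gap values were extracted; for direct-gap absorbers such as β-In2S3 and CuInSe2 this is commonly done from optical absorption data with the Tauc relation, given here only as general background:

```latex
(\alpha h\nu)^{2} = A\,(h\nu - E_{g})
```

Extrapolating the linear portion of a plot of (αhν)² against photon energy hν to (αhν)² = 0 gives the optical band gap E_g.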

Relevance: 20.00%

Abstract:

To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal-cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal-cutting conditions, but attaining optimum values every time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective optimization methods. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes.

In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turnmaster 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and percentage contribution of each parameter; the S/N analysis yields the optimum machining parameters from the experiments.

Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively evaluate new design solutions within the search space in order to reach the true optimum. A mathematical model for surface roughness was developed using response surface analysis, and the model was validated against published results from the literature. Optimization methodologies, namely Simulated Annealing (SA), Particle Swarm Optimization (PSO), a Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA), were applied to optimize the machining parameters for dry turning of SS420. All of these algorithms were tested for efficiency, robustness and accuracy, and it was observed how they often outperform conventional optimization methods applied to difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB, and for each evolutionary method the optimum cutting conditions for better surface finish are provided.

The computational results using SA clearly demonstrate that the proposed solution procedure is capable of solving such complicated problems effectively and efficiently. PSO is a relatively recent heuristic search method whose mechanics are inspired by the swarming, collaborative behaviour of biological populations; the results show that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA; the improved genetic algorithm incorporates a stochastic crossover technique and an artificial initial population scheme to provide a faster search mechanism.
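
The abstract does not state which S/N criterion was applied; for surface roughness, which is to be minimized, the smaller-the-better form of the Taguchi signal-to-noise ratio is the usual choice:

```latex
S/N = -10\,\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n} y_i^{2}\right)
```

where the y_i are the n replicate roughness measurements at a given parameter level; the level with the highest S/N is taken as optimum.
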
Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420, arriving at the optimum feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion. To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating the evolutionary procedures that nature uses to optimize its own systems.
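
The thesis implements these algorithms in MATLAB; the sketch below is not that code but a minimal Python illustration of the PSO step, minimizing an invented quadratic surface-roughness model Ra(speed, feed, depth) within assumed parameter bounds.

```python
# Minimal particle swarm optimization sketch for a turning problem.
# The roughness model and parameter bounds are invented for illustration;
# they do not come from the thesis.
import numpy as np

rng = np.random.default_rng(0)

def roughness(x):
    """Hypothetical surface-roughness model Ra(v, f, d) in micrometres."""
    v, f, d = x[..., 0], x[..., 1], x[..., 2]   # cutting speed, feed, depth of cut
    return 2.0 - 0.004 * v + 60.0 * f + 0.8 * d + 0.00001 * v**2 + 400.0 * f**2

lo = np.array([100.0, 0.05, 0.5])   # v (m/min), f (mm/rev), d (mm)
hi = np.array([250.0, 0.25, 2.0])

n_particles, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
x = rng.uniform(lo, hi, size=(n_particles, 3))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), roughness(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
    x = np.clip(x + v, lo, hi)                                  # keep within bounds
    val = roughness(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best parameters [v, f, d]:", np.round(gbest, 3))
print("predicted Ra:", round(float(roughness(gbest)), 3))
```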

Relevance: 20.00%

Abstract:

Controlling inorganic nitrogen by manipulating the carbon/nitrogen ratio is a method gaining importance in aquaculture systems. Nitrogen control is induced by feeding bacteria with carbohydrates and through the subsequent uptake of nitrogen from the water for the synthesis of microbial protein. The relationship between the addition of carbohydrates, the reduction of ammonium and the production of microbial protein depends on the microbial conversion coefficient, and the carbon/nitrogen ratio in the microbial biomass is related to the carbon content of the added material. The addition of carbonaceous substrate was found to reduce inorganic nitrogen in shrimp culture ponds, and the resultant microbial protein is taken up by the shrimp; part of the feed protein is thus replaced and feeding costs are reduced in culture systems.

The use of various locally available substrates for periphyton-based aquaculture increases production and profitability. However, these techniques have not so far been evaluated for extensive shrimp farming, nor has an artificial-substrate plus carbohydrate-source based farming system been evaluated for reducing inorganic nitrogen in culture systems; the associated variations in water and soil quality, periphyton production and shrimp production of the whole system have also not been determined. This thesis starts with a general introduction and a brief review of the most relevant literature, presents the results of the various experiments, and concludes with a summary (Chapter 9). The chapters are organised according to the objectives of the present study, the major ones being to improve the sustainability of shrimp farming through carbohydrate addition and periphyton-substrate-based shrimp production, and to improve nutrient utilisation in aquaculture systems.
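
The dependence on the microbial conversion coefficient mentioned above is often illustrated in the C/N-manipulation literature with a back-of-the-envelope calculation; the coefficient values below are typical literature assumptions, not figures from this thesis.

```python
# Rough estimate of the carbohydrate addition needed to immobilise a given
# ammonium-nitrogen load as microbial protein. Coefficients are typical
# literature assumptions, not thesis values.
carbon_fraction = 0.50        # g C per g of carbohydrate added
conversion_efficiency = 0.40  # fraction of added C assimilated into microbial biomass
microbial_c_to_n = 4.0        # C/N ratio of the microbial biomass

# N immobilised per g of carbohydrate:
n_per_g_carb = carbon_fraction * conversion_efficiency / microbial_c_to_n  # = 0.05 g N

ammonium_n_load = 1.0         # g N to be removed (e.g. per m^3 per day)
carb_needed = ammonium_n_load / n_per_g_carb
print(f"carbohydrate needed: {carb_needed:.1f} g per g of ammonium-N")  # ~20 g
```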

Relevance: 20.00%

Abstract:

Faculty of Marine Sciences, Cochin University of Science and Technology

Relevance: 20.00%

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code; such tools can help meet all of these goals and could significantly improve software quality, yet this remains a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of software bugs that are otherwise hard to detect more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thereby improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated-peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually along all possible execution paths of the application program. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and in deciding the optimum data allocation to banked memory, so that the number of bank-switching instructions in the embedded software is minimised. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code; instances of code redundancy are identified based on the rules stipulated for the target processor.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of the compiler or assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
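
The dissertation formulates redundancy detection with a relation matrix and a state transition diagram; the sketch below is a much-simplified illustration of the underlying idea of tracking the active bank along a straight-line instruction sequence and flagging bank-select instructions that do not change it. The instruction encoding is assumed, not PIC-specific.

```python
# Simplified illustration of redundant bank-switching detection.
# Instructions are modelled as (mnemonic, operand) tuples; "BANKSEL n" selects
# memory bank n. A BANKSEL is redundant if bank n is already active.
# This is a didactic sketch, not the dissertation's relation-matrix algorithm.

def find_redundant_bank_switches(instructions, initial_bank=0):
    """Return indices of BANKSEL instructions that leave the active bank unchanged."""
    redundant = []
    active_bank = initial_bank
    for index, (mnemonic, operand) in enumerate(instructions):
        if mnemonic == "BANKSEL":
            if operand == active_bank:
                redundant.append(index)   # bank already active: switch is redundant
            active_bank = operand
    return redundant

if __name__ == "__main__":
    program = [
        ("BANKSEL", 1), ("MOVWF", "TRISB"),
        ("BANKSEL", 1), ("MOVWF", "TRISC"),   # redundant: bank 1 already active
        ("BANKSEL", 0), ("MOVWF", "PORTB"),
    ]
    print("redundant BANKSEL at indices:", find_redundant_bank_switches(program))
```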

Relevance: 20.00%

Abstract:

In the early 19th century, the industrial revolution was fuelled mainly by the development of machine-based manufacturing and the increased use of coal. Later, the focus shifted to oil, thanks to mass-production technology, ease of transport and storage, and fewer environmental issues in comparison with coal. By the dawn of the 21st century, the depletion of oil reserves and the pollution resulting from heavy oil usage had put the demand for clean energy on the rise. This ever-growing demand has propelled research on photovoltaics, which has emerged successfully and is currently being looked to as the solution for meeting our present-day energy requirements. The proven PV technology on a commercial scale is based on silicon, but the recent boom in demand for photovoltaic modules has created a shortage in the supply of silicon, and the technology is still not accessible to the common man. This has set off research and development work on moderately efficient, eco-friendly and low-cost photovoltaic devices (solar cells). Thin-film photovoltaic modules have made a breakthrough entry into the PV market on these grounds; thin films have the potential to revolutionize the present cost structure of solar cells by eliminating the expensive silicon wafers that alone account for above 50% of the total module manufacturing cost.

Well-developed thin-film photovoltaic technologies are based on amorphous silicon, CdTe and CuInSe2. However, cell fabrication using amorphous silicon requires the handling of very toxic gases (such as phosphine, silane and borane) and costly fabrication technologies. The other materials also present difficulties, such as maintaining stoichiometry (especially in large-area films), alleged environmental hazards and the high cost of indium. Hence there is an urgent need for the development of materials that are easy to prepare, eco-friendly and abundantly available. The work presented in this thesis is an attempt towards the development of a cost-effective, eco-friendly material for thin-film solar cells using a simple, economically viable technique: Sn-based window and absorber layers deposited using the Chemical Spray Pyrolysis (CSP) technique have been chosen for this purpose.

Relevance: 20.00%

Abstract:

This thesis investigates the problems associated with schemes for information retrieval and discovery in distributed environments and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal, so that a proper mechanism for information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the required insights. This is manifested in the Election Counting and Reporting Software (ECRS) system, a distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports.

Relevance: 20.00%

Abstract:

In Wireless Sensor Networks (WSNs), neglecting the effects of varying channel quality can lead to unnecessary wastage of precious battery resources, resulting in the rapid depletion of sensor energy and the partitioning of the network. Fairness is a critical issue when accessing a shared wireless channel, and fair scheduling must be employed to provide a proper flow of information in a WSN. In this paper, we develop a channel-adaptive MAC protocol with a traffic-aware dynamic power management algorithm for efficient packet scheduling and queuing in a sensor network, taking the time-varying characteristics of the wireless channel into consideration. The proposed protocol calculates a combined weight value based on the channel state and link quality. Transmission is then allowed only for those nodes whose weight exceeds a minimum quality threshold; nodes attempting to access the wireless medium with a low weight may transmit only when their weight becomes high. This can leave many poor-quality nodes deprived of transmission for a considerable time, so to avoid buffer overflow and to achieve fairness for these nodes we design a load prediction algorithm. We also design a traffic-aware dynamic power management scheme that minimizes energy consumption by turning off the radio interface of nodes that are not included in the routing path. Simulation results show that the proposed protocol achieves higher throughput and fairness while reducing delay.
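
The abstract does not give the exact weighting function; the sketch below assumes a simple convex combination of normalized channel state and link quality, purely to illustrate the threshold-based admission decision described above.

```python
# Illustrative sketch of weight-based transmission admission in a channel-adaptive MAC.
# The weighting function (a convex combination of normalized channel gain and link
# quality) and the threshold value are assumptions, not the protocol's actual formula.

ALPHA = 0.6          # relative weight of channel state vs. link quality
MIN_THRESHOLD = 0.5  # minimum combined weight required to transmit

def combined_weight(channel_gain, link_quality):
    """Both inputs are assumed to be normalized to [0, 1]."""
    return ALPHA * channel_gain + (1.0 - ALPHA) * link_quality

def may_transmit(channel_gain, link_quality):
    return combined_weight(channel_gain, link_quality) >= MIN_THRESHOLD

if __name__ == "__main__":
    nodes = {"n1": (0.9, 0.8), "n2": (0.3, 0.4), "n3": (0.6, 0.7)}
    for node, (gain, quality) in nodes.items():
        print(node, "transmit" if may_transmit(gain, quality) else "defer",
              round(combined_weight(gain, quality), 2))
```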

Relevance: 20.00%

Abstract:

Aim: To develop a new medium for enhanced production of the biomass of the aquaculture probiotic Pseudomonas MCCB 103 and of its antagonistic phenazine compound, pyocyanin. Methods and Results: Carbon and nitrogen sources and growth factors, such as amino acids and vitamins, were screened initially in a mineral medium for the biomass and antagonistic compound of Pseudomonas MCCB 103. The selected ingredients were then optimized using a full-factorial central composite design of response surface methodology. The medium optimized by the model for biomass contained mannitol (20 g/l), glycerol (20 g/l), sodium chloride (5 g/l), urea (3.3 g/l) and mineral salts solution (20 ml/l), and the one optimized for the antagonistic compound contained mannitol (2 g/l), glycerol (20 g/l), sodium chloride (5.1 g/l), urea (3.6 g/l) and mineral salts solution (20 ml/l). The model was subsequently validated experimentally, with a 19% increase in biomass and a fivefold increase in the antagonistic compound. Conclusion: A significant increase in biomass and antagonistic compound production could be obtained in the new media. Significance and Impact of the Study: Media formulation and optimization are the primary steps in bioprocess technology, an attempt not made so far in the production of aquaculture probiotics.

Relevance: 20.00%

Abstract:

A marine isolate of Micrococcus MCCB 104 has been identified as an aquaculture probiotic antagonistic to Vibrio. In the present study, different carbon and nitrogen sources and growth factors in a mineral base medium were optimized for enhanced biomass production and antagonistic activity against the target pathogen, Vibrio harveyi, following response surface methodology (RSM). The minimum and maximum limits of the selected variables were determined, and a set of fifty experiments was programmed, employing the central composite design (CCD) of RSM, for the final optimization. The response surface plots of biomass showed a pattern similar to that of antagonistic activity, indicating a strong correlation between biomass and antagonism. The optimum concentrations of the carbon sources, nitrogen sources and growth factors for both biomass and antagonistic activity were glucose (17.4 g/L), lactose (17 g/L), sodium chloride (16.9 g/L), ammonium chloride (3.3 g/L) and mineral salts solution (18.3 mL/L). © KSBB

Relevance: 20.00%

Abstract:

Diagnosis of Hridroga (cardiac disorders) in Ayurveda requires the combination of many different types of data, including personal details, patient symptoms, patient histories, general examination results and Ashtavidha pareeksha results. Computer-assisted decision support systems must be able to combine these data types into a seamless system. Intelligent agents, an approach that has been used chiefly in business applications, are used here for medical diagnosis. This paper describes a multi-agent system named Distributed Ayurvedic Diagnosis and Therapy System for Hridroga using Agents (DADTSHUA) and its architecture. The system uses mobile agents and ontologies for passing data through the network, so transport delay can be minimized. It will be very helpful to beginning physicians in resolving ambiguity in diagnosis and therapy. The system is implemented using the Java Agent DEvelopment framework (JADE), a Java-compliant mobile agent platform from TILab.

Relevance: 20.00%

Abstract:

In this paper, we evolve a generic software architecture for a domain-specific distributed embedded system. The system under consideration belongs to the Command, Control and Communication systems domain. Systems in this domain have very long operational lifetimes, and their quality attributes are as important as their functional requirements. The main guiding principle followed in this paper for evolving the software architecture is the functional independence of the modules. The quality attributes considered most important for the system are maintainability and modifiability. Architectural styles best suited to the functionally independent modules are proposed with a focus on these quality attributes, and the software architecture for the system is envisioned as a collection of the architectural styles of the functionally independent modules identified.