849 results for adaptive blind source separation method
Abstract:
Separation of metallic from semiconducting single-walled carbon nanotubes has been a major challenge for some time, and previous efforts have resulted in only partial success. We have accomplished the separation effectively by employing fluorous chemistry, wherein the diazonium salt of 4-heptadecafluorooctylaniline selectively reacts with the metallic nanotubes present in the mixture. The resulting fluoro derivative was extracted into perfluorohexane, leaving the semiconducting nanotubes in the aqueous layer. The products have been characterized by both Raman and electronic absorption spectroscopy. The method avoids the cumbersome centrifugation step required by some other procedures.
Abstract:
Data mining involves the nontrivial process of extracting knowledge or patterns from large databases. Genetic algorithms are efficient and robust search and optimization methods that are used in data mining. In this paper we propose a Self-Adaptive Migration Model GA (SAMGA), in which the population size, the number of crossover points and the mutation rate for each population are fixed adaptively. Further, the migration of individuals between populations is decided dynamically. The paper gives a mathematical schema analysis of the method, showing that the algorithm exploits previously discovered knowledge for a more focused and concentrated search of heuristically high-yielding regions while simultaneously performing a highly explorative search on the other regions of the search space. The effective performance of the algorithm is then shown using standard testbed functions and a set of real classification data mining problems. A Michigan-style classifier was used to build the classifier, and the system was tested with machine learning databases including the Pima Indian Diabetes database, the Wisconsin Breast Cancer database and a few others. The performance of our algorithm is better than that of the other methods tested.
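To make the adaptive-parameter idea concrete, the following is a minimal Python sketch of a multi-population GA in which each population's mutation rate and number of crossover points are adjusted from its own progress, and a stagnating population imports a migrant from the currently best population. This is an illustrative toy, not the SAMGA algorithm itself: the sphere test function, the specific adaptation rule and the migration trigger are assumptions made for the example.

import random

def sphere(x):
    # Test-bed function to minimize (illustrative stand-in for the paper's benchmarks).
    return sum(v * v for v in x)

def evolve_population(pop, mutation_rate, n_crossover_points, fitness):
    # One generation of tournament selection, multi-point crossover and mutation.
    def tournament():
        a, b = random.sample(pop, 2)
        return a if fitness(a) < fitness(b) else b

    children = []
    while len(children) < len(pop):
        p1, p2 = tournament(), tournament()
        # Multi-point crossover: swap segments at randomly chosen cut points.
        cuts = sorted(random.sample(range(1, len(p1)), n_crossover_points))
        child, take_p1, prev = [], True, 0
        for c in cuts + [len(p1)]:
            child.extend((p1 if take_p1 else p2)[prev:c])
            take_p1, prev = not take_p1, c
        # Mutation: perturb each gene with probability mutation_rate.
        child = [g + random.gauss(0, 0.1) if random.random() < mutation_rate else g
                 for g in child]
        children.append(child)
    return children

def adaptive_migration_ga(n_pops=4, dim=10, pop_size=30, generations=100):
    pops = [[[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
            for _ in range(n_pops)]
    mutation = [0.1] * n_pops
    cross_pts = [2] * n_pops
    best_so_far = [float("inf")] * n_pops

    for _ in range(generations):
        for i, pop in enumerate(pops):
            pops[i] = evolve_population(pop, mutation[i], cross_pts[i], sphere)
            best = min(sphere(ind) for ind in pops[i])
            if best < best_so_far[i]:
                # Population is improving: exploit (smaller, more local moves).
                mutation[i] = max(0.01, mutation[i] * 0.9)
                cross_pts[i] = max(1, cross_pts[i] - 1)
            else:
                # Population has stagnated: explore, and import a good migrant.
                mutation[i] = min(0.5, mutation[i] * 1.1)
                cross_pts[i] = min(dim - 1, cross_pts[i] + 1)
                donor = min(range(n_pops), key=lambda j: best_so_far[j])
                pops[i][0] = min(pops[donor], key=sphere)[:]
            best_so_far[i] = min(best_so_far[i], best)
    return min(best_so_far)

if __name__ == "__main__":
    print("best fitness found:", adaptive_migration_ga())

The per-population adaptation mirrors the exploit/explore trade-off described in the abstract: improving populations narrow their search, stagnating ones widen it and receive migrants.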
Abstract:
We consider the problem of transmission of several discrete sources over a multiple access channel (MAC) with side information at the sources and the decoder. Source-channel separation does not hold for this channel. Sufficient conditions are provided for transmission of sources with a given distortion. The channel could have continuous alphabets (Gaussian MAC is a special case). Various previous results are obtained as special cases.
Abstract:
A low-temperature polyol process, based on glycolaldehyde-mediated partial reduction of FeCl3·6H2O at 120 °C in the presence of sodium acetate as an alkali source and 2,2'-(ethylenedioxy)-bis-(ethylamine) as an electrostatic stabilizer, has been used for the gram-scale preparation of biocompatible, water-dispersible, amine-functionalized magnetite nanoparticles (MNPs) with an average diameter of 6 ± 0.75 nm. With a reasonably high magnetization (37.8 e.m.u.) and amine groups on the outer surface of the nanoparticles, we demonstrated the use of these ultrasmall particles for magnetic separation and concentration in immunoassays. MRI studies indicated that these nanoparticles had the desired relaxivity for T2 contrast enhancement in vivo. In vitro biocompatibility, cell uptake and MR imaging studies established that these nanoparticles were safe at clinical dosages and, by virtue of their ultrasmall sizes and positively charged surfaces, could be easily internalized by cancer cells. All these positive attributes make these functional nanoparticles a promising platform for further in vitro and in vivo evaluations.
Abstract:
The problem of identifying the parameters of nonlinear vibrating systems using spatially incomplete, noisy, time-domain measurements is considered. The problem is formulated within the framework of dynamic state estimation formalisms that employ particle filters. The parameters of the system to be identified are treated as a set of random variables with a finite number of discrete states. The study develops a procedure that combines a bank of self-learning particle filters with a global iteration strategy to estimate the probability distribution of the system parameters to be identified. The individual particle filters are based on the sequential importance sampling filter algorithm that is readily available in the existing literature. The paper develops the requisite recursive formulations for evaluating the evolution of the weights associated with the system parameter states. The correctness of the formulations is demonstrated first by applying the proposed procedure to a few linear vibrating systems for which an alternative solution using the adaptive Kalman filter method is possible. Subsequently, illustrative examples on three nonlinear vibrating systems, using synthetic vibration data, are presented to demonstrate the correct functioning of the method. (c) 2007 Elsevier Ltd. All rights reserved.
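As a rough illustration of treating an unknown parameter as a finite set of discrete states whose weights are updated recursively from noisy measurements, here is a small Python sketch. It is not the paper's formulation: the single-degree-of-freedom linear oscillator, the Gaussian measurement model and the candidate grid are assumptions chosen to keep the example short.

import numpy as np

# Toy setup (assumption, not the paper's system): a damped linear oscillator
#   x'' + 2*zeta*w*x' + w^2*x = 0, with noisy displacement measurements.
# The natural frequency w is the unknown parameter, restricted to a finite set
# of discrete candidate states; weights over those states are updated
# recursively from the measurements, in the spirit of the filter bank above.

def simulate(w, zeta=0.02, x0=1.0, v0=0.0, dt=0.01, n=500):
    # Integrate the oscillator with a simple semi-implicit Euler scheme.
    x, v, out = x0, v0, []
    for _ in range(n):
        a = -2 * zeta * w * v - w ** 2 * x
        v += a * dt
        x += v * dt
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
true_w = 6.3
sigma = 0.05                                   # measurement noise std (assumed)
measurements = simulate(true_w) + rng.normal(0.0, sigma, 500)

candidates = np.linspace(4.0, 9.0, 26)         # finite set of parameter states
weights = np.full(len(candidates), 1.0 / len(candidates))
predictions = np.array([simulate(w) for w in candidates])

# Recursive weight update: at each time step, multiply each state's weight by
# the Gaussian likelihood of the new measurement and renormalize.
for t, y in enumerate(measurements):
    lik = np.exp(-0.5 * ((y - predictions[:, t]) / sigma) ** 2)
    weights *= lik
    weights /= weights.sum()

print("posterior mode of w:", candidates[np.argmax(weights)])
print("posterior mean of w:", float(np.dot(weights, candidates)))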
Abstract:
A new deterministic three-dimensional neutral and charged particle transport code, MultiTrans, has been developed. In this novel approach, the adaptive tree multigrid technique is used in conjunction with the simplified spherical harmonics approximation of the Boltzmann transport equation. Development of the new radiation transport code started in the framework of the Finnish boron neutron capture therapy (BNCT) project. Following its application to BNCT dose-planning problems, testing and development of MultiTrans has continued in conventional radiotherapy and reactor physics applications. In this thesis, an overview of different numerical radiation transport methods is first given. Special features of the simplified spherical harmonics method and the adaptive tree multigrid technique are then reviewed. The usefulness of the new MultiTrans code has been demonstrated by verifying and validating its performance on different types of neutral and charged particle transport problems, as reported in separate publications.
Abstract:
Bandwidth allocation for multimedia applications in cases of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of the applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible and promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications that deploys static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. The technique adaptively finds an alternate patch-up route for every congested/failed link and reallocates the bandwidth for the affected multimedia applications. The designed method has been tested (analytically and by simulation) with various network sizes and conditions, and the results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes in providing flexibility, adaptability, software reusability and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
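As a loose illustration of the patch-up idea (not the agent implementation described above), the Python sketch below finds the widest alternate route around a failed link and reallocates each affected flow's bandwidth along it; the network model, data structures and capacity bookkeeping are assumptions made for the example.

import heapq

def widest_path(residual, src, dst):
    # Return the path from src to dst that maximizes the bottleneck residual
    # bandwidth (max-min route), using a Dijkstra-like search.
    best = {src: float("inf")}
    prev = {}
    heap = [(-float("inf"), src)]
    while heap:
        neg_bw, u = heapq.heappop(heap)
        if u == dst:
            break
        for v, cap in residual.get(u, {}).items():
            bw = min(-neg_bw, cap)
            if bw > best.get(v, 0):
                best[v], prev[v] = bw, u
                heapq.heappush(heap, (-bw, v))
    if dst not in best:
        return None, 0
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], best[dst]

def patch_up(residual, failed_link, flows):
    # Reroute every flow that used the failed link onto the widest alternate
    # route, reallocating as much of its bandwidth as the detour can carry.
    a, b = failed_link
    for flow in flows:
        if failed_link in flow["links"] or (b, a) in flow["links"]:
            path, bottleneck = widest_path(residual, a, b)
            granted = min(flow["bandwidth"], bottleneck) if path else 0
            flow["patch_route"], flow["granted"] = path, granted
            if path:
                for u, v in zip(path, path[1:]):   # consume residual capacity
                    residual[u][v] -= granted
                    residual[v][u] -= granted
    return flows

# Example: 4-node network, link (A, B) fails, one affected flow.
residual = {
    "A": {"B": 0, "C": 8},
    "B": {"A": 0, "D": 8},
    "C": {"A": 8, "D": 6},
    "D": {"C": 6, "B": 8},
}
flows = [{"id": 1, "links": {("A", "B")}, "bandwidth": 5}]
print(patch_up(residual, ("A", "B"), flows))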
Abstract:
In an effort to find a simple and common single-source precursor route for the group 13 metal nitride semiconductor nanostructures, the complexes formed by the trichlorides of Al, Ga and In with urea have been investigated. The complexes, characterized by X-ray crystallography and other techniques, yield the nitrides on thermal decomposition. Single crystalline nanowires of AlN, GaN and InN have been deposited on Si substrates covered with Au islands by using the complexes as precursors. The urea complexes yield single crystalline nanocrystals under solvothermal conditions. The successful synthesis of the nanowires and nanocrystals of these three important nitrides by a simple single-precursor route is noteworthy and the method may indeed be useful in practice.
Abstract:
The aim of this study was to estimate the development of fertility in North-Central Namibia, former Ovamboland, from 1960 to 2001. Special attention was given to the onset of fertility decline and to the impact of the HIV epidemic on fertility. An additional aim was to introduce parish registers as a source of data for fertility research in Africa. Data used consisted of parish registers from Evangelical Lutheran congregations, the 1991 and 2001 Population and Housing Censuses, the 1992 and 2000 Namibia Demographic and Health Surveys, and the HIV sentinel surveillance surveys of 1992-2004. Both period and cohort fertility were analysed. The P/F ratio method was used when analysing census data. The impact of HIV infection on fertility was estimated indirectly by comparing the fertility histories of women who died at an age of less than 50 years with the fertility of other women. The impact of the HIV epidemic on fertility was assessed both among infected women and in the general population. Fertility in the study population began to decline in 1980. The decline was rapid during the 1980s, levelled off in the early 1990s at the end of the war of independence and then continued until the end of the study period. According to parish registers, total fertility was 6.4 in the 1960s and 6.5 in the 1970s, and declined to 5.1 in the 1980s and 4.2 in the 1990s. Adjustment of these total fertility rates to correspond to levels of fertility based on data from the 1991 and 2001 censuses resulted in total fertility declining from 7.6 in 1960-79 to 6.0 in 1980-89, and to 4.9 in 1990-99. The decline was associated with increased age at first marriage, declining marital fertility and increasing premarital fertility. Fertility among adolescents increased, whereas the fertility of women in all other age groups declined. During the 1980s, the war of independence contributed to declining fertility through spousal separation and delayed marriages. Contraception has been employed in the study region since the 1980s, but in the early 1990s, use of contraceptives was still so limited that fertility was higher in North-Central Namibia than in other regions of the country. In the 1990s, fertility decline was largely a result of the increased prevalence of contraception. HIV prevalence among pregnant women increased from 4% in 1992 to 25% in 2001. In 2001, total fertility among HIV-infected women (3.7) was lower than that among other women (4.8), resulting in total fertility of 4.4 in the general population. The HIV epidemic explained more than a quarter of the decline in total fertility at the population level during most of the 1990s. The HIV epidemic also reduced the number of children born by reducing the number of potential mothers. In the future, HIV will have an extensive influence on both the size and age structure of the Namibian population. Although HIV influences demographic development through both fertility and mortality, the effect through changes in fertility will be smaller than the effect through mortality. In the study region, as in some other regions of southern Africa, a new type of demographic transition is under way, one in which population growth stagnates or even reverses because of the combined effects of declining fertility and increasing mortality, both of which are consequences of the HIV pandemic.
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is to separate fully inundated areas from 'wet' areas where trees and houses are partly covered by water. This can be regarded as a typical instance of the mixed-pixel problem. A number of automatic information extraction and image classification algorithms have been developed over the years for flood mapping using optical remote sensing images. Most classification algorithms generally assign each pixel to the class label with the greatest likelihood. However, these hard classification methods often fail to generate a reliable flood inundation map because of the presence of mixed pixels in the images. To address the mixed-pixel problem, advanced image-processing techniques are adopted; linear spectral unmixing is one of the most popular soft classification techniques used for mixed-pixel analysis. The performance of linear spectral unmixing depends on two important issues: the method of selecting endmembers and the method of modelling the endmembers for unmixing. This paper presents an improvement in the adaptive selection of the endmember subset for each pixel in the spectral unmixing method for reliable flood mapping. Using a fixed set of endmembers to unmix all pixels in an entire image may overestimate the endmember spectra residing in a mixed pixel and hence reduce the performance of spectral unmixing. In contrast, applying an adaptively estimated subset of endmembers to each pixel can decrease the residual error in the unmixing results and provide a reliable output. This paper also shows that the proposed method improves the accuracy of conventional linear unmixing methods and is easy to apply. Three different linear spectral unmixing methods were applied to test the improvement in unmixing results. Experiments were conducted on three different sets of Landsat-5 TM images of three different flood events in Australia to examine the method under different flooding conditions, and satisfactory outcomes in flood mapping were achieved.
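The following Python sketch illustrates the general idea of per-pixel adaptive endmember selection for linear spectral unmixing, not the paper's exact algorithm: each pixel is unmixed against every small subset of an endmember library with non-negative least squares, and the subset giving the lowest residual is retained. The toy endmember spectra and the exhaustive subset search are assumptions made for the example.

import itertools
import numpy as np
from scipy.optimize import nnls

def unmix_adaptive(pixel, endmembers, max_subset=2):
    # Return (abundances over the full library, chosen subset, residual).
    n = endmembers.shape[1]
    best = (np.inf, None, None)
    for k in range(1, max_subset + 1):
        for subset in itertools.combinations(range(n), k):
            E = endmembers[:, subset]
            coeffs, residual = nnls(E, pixel)      # non-negative abundances
            if residual < best[0]:
                best = (residual, subset, coeffs)
    residual, subset, coeffs = best
    abundances = np.zeros(n)
    abundances[list(subset)] = coeffs
    return abundances, subset, residual

# Toy library: water, vegetation and soil spectra over 6 bands (made up).
endmembers = np.array([
    [0.05, 0.45, 0.30],
    [0.04, 0.50, 0.35],
    [0.03, 0.55, 0.40],
    [0.02, 0.30, 0.45],
    [0.02, 0.60, 0.50],
    [0.01, 0.25, 0.55],
])
pixel = 0.6 * endmembers[:, 0] + 0.4 * endmembers[:, 1]   # partly flooded vegetation
abund, subset, res = unmix_adaptive(pixel, endmembers)
print("abundances:", np.round(abund, 3), "subset:", subset, "residual:", round(res, 6))

Restricting each pixel to its best-fitting subset is what keeps irrelevant endmember spectra from inflating the abundance estimates, which is the effect the adaptive selection described above is meant to avoid.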
Abstract:
In this paper, we propose a Self-Adaptive Migration Model for Genetic Algorithms, in which the population size, the number of crossover points and the mutation rate for each population are fixed adaptively. Further, the migration of individuals between populations is decided dynamically. The paper gives a mathematical schema analysis of the method, showing that the algorithm exploits previously discovered knowledge for a more focused and concentrated search of heuristically high-yielding regions while simultaneously performing a highly explorative search on the other regions of the search space. The effective performance of the algorithm is then shown using standard testbed functions, compared with the Island Model GA (IGA) and the Simple GA (SGA).
Abstract:
A constant switching frequency current-error space-vector-based hysteresis controller for two-level voltage source inverter-fed induction motor (IM) drives is proposed in this study. The proposed controller is capable of driving the IM over the entire speed range, extending to the six-step mode. The controller uses the parabolic boundary, reported earlier, for vector selection within a sector, but uses simple, fast and self-adaptive sector identification logic for sector change detection over the entire modulation range. The new scheme detects a sector change using the change in direction of the current error along the axes jA, jB and jC. Most of the previous schemes use an outer boundary for sector change detection, so the current error goes outside the boundary six times per cycle during sector changes, introducing additional fifth and seventh harmonic components in the phase current. This may cause sixth-harmonic torque pulsations in the motor and a spread in the harmonic spectrum of the phase voltage. The proposed scheme detects sector changes quickly and accurately, eliminating the chance of introducing additional fifth and seventh harmonic components in the phase current, and provides a phase-voltage harmonic spectrum that exactly matches that of constant switching frequency voltage-controlled space vector pulse width modulation (VC-SVPWM) based two-level inverter-fed drives.
Abstract:
1. A simple method has been devised for the estimation of phloroglucinol, based on the formation of an intensely colored compound with a modified Ehrlich reagent in the presence of trichloroacetic acid. Some factors affecting the formation of the color have been studied. 2. Careful regulation of the trichloroacetic acid content in the system permits its estimation over 1–15 μg in the micro range and 10–50 μg in the macro range. Phloroglucinol lends itself to ready separation by paper chromatography and estimation after elution from the paper.