88 results for Fusion approaches


Relevance: 20.00%

Abstract:

The problem of reconstruction of a refractive-index distribution (RID) in optical refraction tomography (ORT) with optical path-length difference (OPD) data is solved using two adaptive-estimation-based extended-Kalman-filter (EKF) approaches. First, a basic single-resolution EKF (SR-EKF) is applied to a state variable model describing the tomographic process, to estimate the RID of an optically transparent refracting object from noisy OPD data. The initialization of the biases and covariances corresponding to the state and measurement noise is discussed. The state and measurement noise biases and covariances are adaptively estimated. An EKF is then applied to the wavelet-transformed state variable model to yield a wavelet-based multiresolution EKF (MR-EKF) solution approach. To numerically validate the adaptive EKF approaches, we evaluate them with benchmark studies of standard stationary cases, where comparative results with commonly used efficient deterministic approaches can be obtained. Detailed reconstruction studies for the SR-EKF and two versions of the MR-EKF (with Haar and Daubechies-4 wavelets) compare well with those obtained from a typically used variant of the (deterministic) algebraic reconstruction technique, the average correction per projection method, thus establishing the capability of the EKF for ORT. To the best of our knowledge, the present work contains unique reconstruction studies encompassing the use of EKF for ORT in single-resolution and multiresolution formulations, and also in the use of adaptive estimation of the EKF's noise covariances. (C) 2010 Optical Society of America
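As a rough illustration of the adaptive-estimation idea described above, the sketch below shows one EKF predict/update step in which the measurement-noise covariance is re-estimated from a sliding window of innovations. This is a generic, assumed scheme in Python/NumPy (the function names, window length and jitter term are illustrative), not the authors' exact SR-EKF or MR-EKF formulation.

```python
import numpy as np

def adaptive_ekf_step(x, P, z, f, F_jac, h, H_jac, Q, innov_hist, window=20):
    """One EKF predict/update step with an innovation-based re-estimate of the
    measurement-noise covariance R (a generic sketch, not the paper's exact scheme)."""
    # --- predict ---
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q

    # --- innovation and its empirical covariance over a sliding window ---
    H = H_jac(x_pred)
    nu = z - h(x_pred)                       # innovation
    innov_hist.append(nu)
    del innov_hist[:-window]                 # keep only the last `window` innovations
    C = np.atleast_2d(np.cov(np.array(innov_hist).T)) if len(innov_hist) > 1 \
        else np.outer(nu, nu)
    # adaptive R estimate: C_nu ~ H P_pred H^T + R  =>  R ~ C_nu - H P_pred H^T
    R = C - H @ P_pred @ H.T
    R = 0.5 * (R + R.T) + 1e-9 * np.eye(len(z))  # symmetrize + jitter (crude PSD safeguard)

    # --- update ---
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ nu
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new, R
```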

Relevance: 20.00%

Abstract:

We investigate the events near the fusion interfaces of dissimilar welds using a phase-field model developed for single-phase solidification of binary alloys. The parameters used here correspond to the dissimilar welding of a Ni/Cu couple. The events at the Ni and the Cu interfaces are very different, which illustrates the importance of the phase diagram through the slope of the liquidus curves. On the Ni side, where the liquidus temperature decreases with increasing alloying, solutal melting of the base metal takes place; the resolidification, with continuously increasing solid composition, is very sluggish until the interface encounters a homogeneous melt composition. Growth of the base metal becomes more difficult with increasing initial melt composition, which is equivalent to a steeper slope of the liquidus curve. On the Cu side, the initial conditions result in a deeply undercooled melt, and contributions from both constrained and unconstrained modes of growth are observed. The simulations bring out the possibility of nucleation of a concentrated solid phase from the melt, and a secondary melting of the substrate due to the associated recalescence event. The results for the Ni and Cu interfaces can be used to understand more complex dissimilar weld interfaces involving multiphase solidification.

Relevance: 20.00%

Abstract:

The strategy of translationally fusing the alpha- and beta-subunits of human chorionic gonadotropin (hCG) into a single-chain molecule has been used to produce novel analogs of hCG. Previously we reported expression of a biologically active single-chain analog, hCG alpha beta, using the Pichia expression system. Using the same expression system, another analog, in which the alpha-subunit was replaced with a second beta-subunit, was expressed (hCG beta beta) and purified. hCG beta beta could bind to the LH receptor with an affinity three times lower than that of hCG but failed to elicit any response. However, it could inhibit response to the hormone in vitro in a dose-dependent manner. Furthermore, it inhibited response to hCG in vivo, indicating the antagonistic nature of the analog. However, it was unable to inhibit human FSH binding or response to human FSH, indicating the specificity of the effect. Characterization of hCG alpha beta and hCG beta beta using immunological tools showed alterations in the conformation of some of the epitopes, whereas others were unaltered. Unlike hCG, hCG beta beta interacts with two LH receptor molecules. These studies demonstrate that the presence of the second beta-subunit in the single-chain molecule generated a structure that can be recognized by the receptor. However, due to the absence of the alpha-subunit, the molecule is unable to elicit a response. The strategy of fusing two beta-subunits of glycoprotein hormones can be used to produce antagonists of these hormones.

Relevance: 20.00%

Abstract:

Image fusion is a formal framework comprising means and tools for combining multisensor, multitemporal, and multiresolution data. Multisource data vary in spectral, spatial, and temporal resolution, necessitating advanced analytical or numerical techniques for enhanced interpretation. This paper reviews seven pixel-based image fusion techniques: intensity-hue-saturation (IHS), Brovey, high-pass filter (HPF), high-pass modulation (HPM), principal component analysis, Fourier transform, and correspondence analysis. Validation of these techniques on IKONOS data (panchromatic band at 1 m spatial resolution and four multispectral bands at 4 m spatial resolution) reveals that the HPF and HPM methods synthesize images closest to those the corresponding multisensors would observe at the higher resolution.
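To make the high-pass-filter (HPF) idea concrete, here is a minimal pansharpening sketch: the high-frequency detail of the panchromatic band is extracted with a simple box filter and injected into each upsampled multispectral band. The kernel size, resampling order and injection gain are illustrative assumptions, not the settings evaluated in the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def hpf_fusion(pan, ms, ratio=4, kernel=5, gain=1.0):
    """High-pass-filter (HPF) pansharpening sketch.
    pan : 2-D panchromatic image at the fine resolution (e.g. 1 m IKONOS band)
    ms  : 3-D multispectral stack (bands, rows, cols) at the coarse resolution (e.g. 4 m)
    """
    # high-frequency detail of the panchromatic band (pan minus its low-pass version)
    detail = pan.astype(float) - uniform_filter(pan.astype(float), size=kernel)
    fused = []
    for band in ms.astype(float):
        up = zoom(band, ratio, order=1)           # upsample the MS band to the pan grid
        up = up[:pan.shape[0], :pan.shape[1]]     # guard against rounding in zoom
        fused.append(up + gain * detail)          # inject the pan detail into the band
    return np.stack(fused)
```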

Relevance: 20.00%

Abstract:

A synthetic approach to 3-alkoxythapsane, which comprises the carbon framework of a small group of sesquiterpenes containing three contiguous quaternary carbon atoms, is described. A combination of alkylation, orthoester Claisen rearrangement, and intramolecular diazoketone cyclopropanation has been employed to create the three requisite contiguous quaternary carbon atoms.

Relevance: 20.00%

Abstract:

The motivation behind the fusion of Intrusion Detection Systems was the realization that, with increasing traffic and increasingly complex attacks, none of the present-day stand-alone Intrusion Detection Systems can meet the demand for a very high detection rate and an extremely low false positive rate. Multi-sensor fusion can be used to meet these requirements by refining the combined response of different Intrusion Detection Systems. In this paper, we show a design technique for sensor fusion that best utilizes the useful responses from multiple sensors by an appropriate adjustment of the fusion threshold. The threshold is generally chosen according to past experience or by an expert system. In this paper, we show that choosing the threshold bounds according to the Chebyshev inequality performs better. This approach also helps to solve the problem of scalability and has the advantage of failsafe capability. The paper theoretically models the fusion of Intrusion Detection Systems to establish the improvement in performance, supplemented with empirical evaluation. The combination of complementary sensors is shown to detect more attacks than the individual components. Since the individual sensors chosen detect sufficiently different attacks, their results can be merged for improved performance. The combination is done in different ways: (i) taking all the alarms from each system and avoiding duplications, (ii) taking alarms from each system by fixing threshold bounds, and (iii) rule-based fusion with a priori knowledge of the individual sensor performance. A number of evaluation metrics are used, and the results indicate an overall enhancement in the performance of the combined detector using sensor fusion incorporating the threshold bounds, and significantly better performance using simple rule-based fusion.
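A minimal sketch of the threshold-bound idea follows: the combined (weighted-sum) sensor score is thresholded at the mean plus k standard deviations of the benign-traffic score, where Chebyshev's inequality bounds the false-positive rate by 1/k² for any score distribution. The weighting scheme, function names and parameters are illustrative assumptions, not the paper's exact fusion rule.

```python
import numpy as np

def chebyshev_threshold(benign_scores, max_false_positive_rate=0.01):
    """Pick a fusion threshold so that, by Chebyshev's inequality,
    P(score >= mu + k*sigma) <= 1/k**2 <= max_false_positive_rate
    holds for *any* distribution of benign combined scores (illustrative sketch)."""
    mu, sigma = np.mean(benign_scores), np.std(benign_scores)
    k = np.sqrt(1.0 / max_false_positive_rate)
    return mu + k * sigma

def fuse_alerts(sensor_scores, threshold, weights=None):
    """Combine per-sensor scores (rows = sensors, cols = events) into one decision."""
    sensor_scores = np.asarray(sensor_scores, dtype=float)
    if weights is None:
        weights = np.ones(sensor_scores.shape[0]) / sensor_scores.shape[0]
    combined = weights @ sensor_scores        # weighted sum of the sensor outputs
    return combined >= threshold              # True where the fused detector raises an alarm

# usage sketch: calibrate the threshold on benign traffic, then fuse live scores
# thr = chebyshev_threshold(benign_combined_scores, max_false_positive_rate=0.02)
# alarms = fuse_alerts(live_scores, thr)
```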

Relevance: 20.00%

Abstract:

Automatic identification of software faults has enormous practical significance. It requires characterizing program execution behavior and applying appropriate data mining techniques to the chosen representation. In this paper, we use the sequence of system calls to characterize program execution. The data mining tasks addressed are learning to map system call streams to fault labels and automatic identification of fault causes. Spectrum kernels and SVMs are used for the former, while latent semantic analysis is used for the latter. The techniques are demonstrated on the intrusion dataset containing system call traces. The results show that the kernel techniques are as accurate as the best available results but are faster by orders of magnitude. We also show that latent semantic indexing is capable of revealing fault-specific features.
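As an illustration of the spectrum-kernel representation, the sketch below counts contiguous k-grams of system calls and takes the inner product of the count vectors; the resulting Gram matrix can then be fed to an SVM with a precomputed kernel. The value of k and the helper names are illustrative.

```python
from collections import Counter

def kmer_counts(calls, k=3):
    """Count the contiguous k-grams of a system-call sequence (list of call names/ids)."""
    return Counter(tuple(calls[i:i + k]) for i in range(len(calls) - k + 1))

def spectrum_kernel(seq_a, seq_b, k=3):
    """k-spectrum kernel: inner product of the two k-gram count vectors."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[g] * cb[g] for g in set(ca) & set(cb))

# usage sketch: build a Gram matrix over traces and pass it to an SVM with a
# precomputed kernel, e.g. sklearn.svm.SVC(kernel="precomputed").
```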

Relevance: 20.00%

Abstract:

We propose a novel second order cone programming formulation for designing robust classifiers which can handle uncertainty in observations. Similar formulations are also derived for designing regression functions which are robust to uncertainties in the regression setting. The proposed formulations are independent of the underlying distribution, requiring only the existence of second order moments. These formulations are then specialized to the case of missing values in observations for both classification and regression problems. Experiments show that the proposed formulations outperform imputation.
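The sketch below shows one common moment-based robust classification constraint of this kind, written as a second order cone program in cvxpy: each class is summarized by its mean and covariance, and the margin is required to hold over the corresponding uncertainty ellipsoid. This is a generic formulation under stated assumptions, not necessarily the exact program proposed in the paper.

```python
import numpy as np
import cvxpy as cp

def robust_socp_classifier(mu_pos, cov_pos, mu_neg, cov_neg, kappa=1.0):
    """Moment-based robust linear classifier (SOCP sketch).
    Each class is summarized by its mean and covariance; via a multivariate
    Chebyshev bound, kappa = sqrt(alpha / (1 - alpha)) enforces correct
    classification with probability at least alpha for any distribution with
    these first two moments. Illustrative formulation only."""
    d = len(mu_pos)
    w, b = cp.Variable(d), cp.Variable()
    S_pos = np.linalg.cholesky(cov_pos + 1e-9 * np.eye(d))  # factors with S S^T = Sigma
    S_neg = np.linalg.cholesky(cov_neg + 1e-9 * np.eye(d))
    constraints = [
        mu_pos @ w + b >= 1 + kappa * cp.norm(S_pos.T @ w),     # robust margin, + class
        -(mu_neg @ w + b) >= 1 + kappa * cp.norm(S_neg.T @ w),  # robust margin, - class
    ]
    prob = cp.Problem(cp.Minimize(cp.norm(w)), constraints)     # max-margin objective
    prob.solve()
    return w.value, b.value
```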

Relevance: 20.00%

Abstract:

The static response of thin, wrinkled membranes is studied using both a tension field approximation based on plane stress conditions and a 3D nonlinear elasticity formulation, discretized through 8-noded Cosserat point elements. While the tension field approach only identifies the wrinkled/slack regions and, at best, a measure of the extent of wrinkliness, the 3D elasticity solution provides, in principle, the deformed shape of a wrinkled/slack membrane. However, since membranes barely resist compression, the discretized and linearized system equations obtained via both approaches are ill-conditioned, and solutions can thus be sensitive to discretization errors as well as other sources of noise and imperfection. We propose a regularized, pseudo-dynamical recursion scheme that provides a sequence of updates which are almost insensitive to the regularizing term as well as to the time step size used for integrating the pseudo-dynamical form. This is borne out through several numerical examples in which the relative performance of the proposed recursion scheme is compared with that of a regularized Newton strategy. The pseudo-time marching strategy, when implemented using 3D Cosserat point elements, also provides a computationally cheaper, numerically accurate and simpler alternative to geometrically exact shell theories for computing large deformations of membranes in the presence of wrinkles. (C) 2010 Elsevier Ltd. All rights reserved.
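For illustration, a generic regularized pseudo-time (pseudo-transient continuation) recursion for an ill-conditioned static problem r(x) = 0 is sketched below; the specific update, step size and regularization parameter are assumptions and do not reproduce the authors' exact scheme.

```python
import numpy as np

def pseudo_transient_solve(residual, jacobian, x0, dtau=1e-2, lam=1e-6,
                           tol=1e-8, max_steps=10_000):
    """Generic pseudo-transient continuation for an ill-conditioned static problem
    r(x) = 0: each pseudo-time step solves (I/dtau + J + lam*I) dx = -r(x) and
    updates x. A standard regularized recursion, not the paper's exact scheme."""
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(max_steps):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        J = jacobian(x)
        A = np.eye(n) / dtau + J + lam * np.eye(n)  # pseudo-time + Tikhonov regularization
        x += np.linalg.solve(A, -r)
    return x
```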

Relevance: 20.00%

Abstract:

Mycobacterial spheroplasts were prepared by treatment of glycine-sensitized cells with a combination of lipase and lysozyme. They were stable for several hours at room temperature but were lysed on treatment with 0.1% sodium dodecyl sulfate. The spheroplasts could be regenerated on a suitable medium. Fusion and regeneration of the spheroplasts were attempted using drug-resistant mutant strains of M. smegmatis. Recombinants were obtained from spheroplast fusion mediated by polyethylene glycol and dimethyl sulfoxide. Simultaneous expression of recombinant properties was observed only after an initial lag in the isolated clones. This has been explained as being due to “chromosome inactivation” in the fused product.

Relevance: 20.00%

Abstract:

The availability of electrophoretically homogeneous rabbit penicillin carrier receptor protein (CRP), obtained by affinity chromatography, afforded an ideal in vitro system for calculating the thermodynamic parameters of the binding of penicillin and its analogues to CRP, as well as for competitive binding of such analogues to CRP in the presence of 14C-penicillin G. The kinetics of association of CRP with 7-deoxy penicillin, which does not bind covalently to CRP, have been studied through equilibrium dialysis with 14C-7-deoxybenzyl penicillin, giving K = 2.79 × 10⁶ M⁻¹ (−ΔG = 8.106 kcal/mol), as well as by fluorescence quenching studies with excitation at λ 280 nm, giving K = 3.573 × 10⁶ M⁻¹ (−ΔG = 8.239 kcal/mol). The fluorescence quenching studies have been extended to the CRP-benzyl penicillin and CRP-6-aminopenicillanic acid (6-APA) systems as well. The fluorescence data with benzyl penicillin indicate two conformational changes in CRP: a fast change corresponding to the non-covalent binding seen with 7-deoxy penicillin, and a slower change due to covalent bond formation. With 6-APA the first change is not observed; only the conformational change corresponding to covalent binding is seen. Competitive binding studies indicate that the order of binding of CRP to the penicillin analogues is as follows: methicillin > 6-APA > carbenicillin > o-nitrobenzyl penicillin > cloxacillin ≈ benzyl penicillin ≈ 6-phenyl acetamido penicillanyl alcohol ≈ 7-phenyl acetamido desacetoxy cephalosporanic acid ≈ p-amino benzyl penicillin ≈ p-nitro benzyl penicillin > ticarcillin > o-amino benzyl penicillin > amoxycillin > 7-deoxy benzyl penicillin > ampicillin. From these data it has been possible to partially delineate the topology of the penicillin-binding cleft of CRP as well as some of the functional groups in the cleft responsible for the binding process.
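For reference, the reported K and ΔG values are linked by the standard relation ΔG° = −RT ln K. Assuming the equilibrium dialysis was run near 2 °C (≈275 K, a temperature not stated in the abstract), the quoted K reproduces roughly the quoted binding free energy:

```latex
\Delta G^{\circ} = -RT\ln K
\approx -\left(1.987\times10^{-3}\,\tfrac{\mathrm{kcal}}{\mathrm{mol\,K}}\right)(275\,\mathrm{K})\,\ln\!\left(2.79\times10^{6}\right)
\approx -8.1\ \mathrm{kcal/mol}.
```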

Relevance: 20.00%

Abstract:

Specific penicillin carrier receptor proteins (CRP) have been isolated from the sera of penicillin-allergic rabbits and human subjects, in the unconjugated native state and in electrophoretically homogeneous form, by employing a synthetic polymeric affinity template containing the 7-deoxy analogue of penicillin G. The synthesis of the 7-deoxy analogue has been described. In this affinity system the anti-penicillin antibody is desorbed by 0.9 M thiourea and the CRP in 8 M urea. The CRP, after incubation with penicillin, is converted into the full-fledged antigen. Studies on the origin of CRP and the nature of the antibody, as well as comparative studies on the properties of the rabbit antibody and of antibodies elicited by a BSA-BPO conjugate, are reported.

Relevance: 20.00%

Abstract:

The availability of an electrophoretically homogeneous rabbit penicillin carrier receptor protein (CRP) and of rabbit anti-penicillin antibody afforded an ideal in vitro system for calculating the thermodynamic parameters of the binding of the 14C-benzyl penicillin-CRP conjugate (antigen) to the purified rabbit anti-penicillin antibody. The thermodynamic parameters of this antigen-antibody reaction have been studied by a radioactive assay method using Millipore filters. The equilibrium constant (K) of the reaction has been found to be 2.853 × 10⁹ M⁻², and the corresponding free energy (ΔG) at 4°C and 37°C has been calculated to be −12.02 and −13.5 kcal/mol, respectively; the enthalpy (ΔH) and entropy (ΔS) have been found to be 361 kcal/mol and +30 eu/mol, respectively. Competitive binding studies of CRP-analogue conjugates with the divalent rabbit antibody have been carried out in the presence of 14C-penicilloyl CRP. It was found that the 7-deoxy penicillin-CRP complex and the 6-aminopenicilloyl-CRP conjugate bind to the antibody more strongly than 14C-penicilloyl CRP. All the other analogue conjugates are much weaker in interfering with the binding of penicilloyl CRP to the antibody. The conjugates of methicillin, o-nitrobenzyl penicillin and ticarcillin with CRP do not materially interfere in the process.

Relevance: 20.00%

Abstract:

Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D-S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change. (C) 2010 Elsevier Ltd. All rights reserved.
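As an illustration of the evidence-combination step, the sketch below implements Dempster's rule for two basic probability assignments over a small frame of discernment; the SSFI-4-style class names and mass values are purely illustrative, not the study's actual bpa's.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.
    Each bpa maps a frozenset of hypotheses (a subset of the frame) to a mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                   # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; Dempster's rule is undefined.")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}  # normalize out conflict

# usage sketch with illustrative hydrologic classes (not the study's actual bpa values)
frame = frozenset({"drought", "normal", "wet"})
m_gcm1 = {frozenset({"drought"}): 0.6, frozenset({"drought", "normal"}): 0.3, frame: 0.1}
m_gcm2 = {frozenset({"drought"}): 0.5, frozenset({"normal"}): 0.2, frame: 0.3}
print(dempster_combine(m_gcm1, m_gcm2))
```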

Relevance: 20.00%

Abstract:

This paper reviews integrated economic and ecological models that address impacts of, and adaptation to, climate change in the forest sector. Early economic model studies considered forests as one of many possible impacts of climate change, while ecological model studies tended to limit the economic impacts to fixed price assumptions. More recent studies include broader representations of both systems, but there are still few studies that can be regarded as fully integrated. Full integration of ecological and economic models is needed to address forest management under climate change appropriately. The conclusion so far is that there are vast uncertainties about how climate change affects forests. This is partly due to limited knowledge about the global implications of social and economic adaptation to the effects of climate change on forests.