69 results for Importance sampling
Abstract:
Phase transformations of Al2O3 and Na2O · 6Al2O3 prepared by the gel route have been investigated for the first time by 27Al MAS NMR spectroscopy in combination with x-ray diffraction. Of particular interest in the study is the kinetics of the γ → α and γ → β transformations, respectively, in these two systems. Analysis of the kinetic data shows the important role of nucleation in both these transformations.
Abstract:
An extension of the supramolecular synthon-based fragment approach (SBFA) method for transferability of multipole charge density parameters to include weak supramolecular synthons is proposed. In particular, the SBFA method is applied to C-H···O, C-H···F, and F···F containing synthons. A high resolution charge density study has been performed on 4-fluorobenzoic acid to build a synthon library for C-H···F infinite chain interactions. Libraries for C-H···O and F···F synthons were taken from earlier work. The SBFA methodology was applied successfully to 2- and 3-fluorobenzoic acids, data sets for which were collected in a routine manner at 100 K, and the modularity of the synthons was demonstrated. Cocrystals of isonicotinamide with all three fluorobenzoic acids were also studied with the SBFA method. The topological analysis of inter- and intramolecular interaction regions was performed using Bader's AIM approach. This study shows that the SBFA method is generally applicable to generate charge density maps using information from multiple intermolecular regions.
Abstract:
Sampling disturbance is unavoidable, and hence laboratory testing is most often carried out on partially disturbed samples. This paper presents a simple method to assess the degree of sample disturbance by predicting the yield stress due to cementation and comparing the yield stress in compression of the partially disturbed sample with a predicted compression path of the clay free of any mechanical disturbance. The method uses simple parameters that are normally determined in routine investigations.
Abstract:
Understanding the dendrimer-drug interaction is of great importance for designing and optimizing dendrimer-based drug delivery systems. Using atomistic molecular dynamics (MD) simulations, we have analyzed the release pattern of four ligands (two soluble drugs, namely, salicylic acid (Sal) and L-alanine (Ala), and two insoluble drugs, namely, phenylbutazone (Pbz) and primidone (Prim)), which were initially encapsulated inside the ethylenediamine (EDA) cored polyamidoamine (PAMAM) dendrimer using the docking method. We have computed the potential of mean force (PMF) variation for the generation 5 (G5) PAMAM dendrimer complexed with drug molecules using umbrella sampling. From our calculated PMF values, we observe that the soluble drugs (Sal and Ala) have lower energy barriers than the insoluble drugs (Pbz and Prim). The order of ease of release for these drugs from the G5 protonated PAMAM dendrimer was found to be Ala > Sal > Prim > Pbz. In the case of the insoluble drugs (Prim and Pbz), because of their larger size, we observe a larger nonpolar contribution, and thus their larger energy barriers can be attributed to the van der Waals contribution. From the hydrogen bonding analysis of the four PAMAM-drug complexes under study, we found that intermolecular hydrogen bonding makes a less significant contribution to the free energy barrier. Another interesting feature appears in the PMF profiles of the G5NP (nonprotonated)-PAMAM-Pbz and G5NP-PAMAM-Sal complexes. The PMF was found to be lower when the drug is bound to the nonprotonated dendrimer than to the protonated dendrimer. Our results suggest that encapsulation of the drug molecule into the host PAMAM dendrimer should be carried out at higher pH values (near pH 10). When such a complex enters the human body, where the pH is around 7.4, the dendrimer holds the drug tightly at that physiological pH. Hence the release of the drug can occur at a controlled rate into the bloodstream. Thus, our findings provide a microscopic picture of the encapsulation and controlled release of drugs in dendrimer-based host-guest systems.
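To make the umbrella-sampling step concrete, the following is a minimal sketch of how a 1-D PMF along a pull coordinate can be assembled from biased windows with a standard WHAM iteration; the window data, spring constant, and temperature are hypothetical placeholders, not the parameters or analysis code used in the study.

```python
import numpy as np

kB_T = 2.479  # thermal energy in kJ/mol at ~298 K (assumed temperature)

def wham_pmf(samples_per_window, centers, k_spring, bins, n_iter=2000, tol=1e-7):
    """samples_per_window: list of 1-D arrays of the reaction coordinate (nm),
    one per umbrella window; centers: bias centres (nm); k_spring: harmonic
    bias constant (kJ/mol/nm^2); bins: bin edges for the PMF grid."""
    mids = 0.5 * (bins[:-1] + bins[1:])
    # per-window histograms and sample counts
    H = np.array([np.histogram(s, bins=bins)[0] for s in samples_per_window], float)
    N = H.sum(axis=1)
    # harmonic bias energy of every window evaluated at every bin centre
    bias = 0.5 * k_spring * (mids[None, :] - np.asarray(centers, float)[:, None]) ** 2
    f = np.zeros(len(samples_per_window))   # per-window free-energy shifts
    for _ in range(n_iter):                 # self-consistent WHAM iteration
        denom = (N[:, None] * np.exp((f[:, None] - bias) / kB_T)).sum(axis=0)
        P = H.sum(axis=0) / np.maximum(denom, 1e-12)           # unbiased probability
        f_new = -kB_T * np.log(np.maximum((P[None, :] * np.exp(-bias / kB_T)).sum(axis=1), 1e-300))
        converged = np.max(np.abs(f_new - f)) < tol
        f = f_new
        if converged:
            break
    pmf = -kB_T * np.log(np.maximum(P, 1e-300))
    return mids, pmf - pmf.min()            # PMF referenced to its minimum
```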
Abstract:
We consider the speech production mechanism and the associated linear source-filter model. For voiced speech sounds in particular, the source/glottal excitation is modeled as a stream of impulses and the filter as a cascade of second-order resonators. We show that the process of sampling speech signals can be modeled as filtering a stream of Dirac impulses (a model for the excitation) with a kernel function (the vocal tract response), and then sampling uniformly. We show that the problem of estimating the excitation is equivalent to the problem of recovering a stream of Dirac impulses from samples of a filtered version. We present associated algorithms based on the annihilating filter and also make a comparison with the classical linear prediction technique, which is well known in speech analysis. Results on synthesized as well as natural speech data are presented.
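As an illustration of the recovery step described above, the sketch below implements the standard annihilating-filter (Prony-type) procedure for estimating the locations and amplitudes of a stream of Dirac impulses from a few of its Fourier-domain samples; the synthetic impulse positions and amplitudes are illustrative, and this is not the paper's speech-analysis code.

```python
import numpy as np

def annihilating_filter_recovery(X, K, tau):
    """Estimate locations t_k and amplitudes a_k of K Diracs on [0, tau) from
    Fourier coefficients X[m] = sum_k a_k exp(-2j*pi*m*t_k/tau), m = 0..M-1."""
    M = len(X)
    # Toeplitz system S h = 0, with h the annihilating filter of length K + 1
    S = np.array([[X[m - l] for l in range(K + 1)] for m in range(K, M)])
    _, _, Vh = np.linalg.svd(S)
    h = Vh[-1].conj()                  # right singular vector of the smallest singular value
    u = np.roots(h)                    # roots u_k = exp(-2j*pi*t_k/tau)
    t = np.mod(-np.angle(u) * tau / (2 * np.pi), tau)
    # amplitudes from a Vandermonde least-squares fit
    V = np.exp(-2j * np.pi * np.outer(np.arange(M), t) / tau)
    a, *_ = np.linalg.lstsq(V, X, rcond=None)
    order = np.argsort(t)
    return t[order], a.real[order]

# quick self-check with two synthetic impulses
tau = 1.0
t_true, a_true = np.array([0.21, 0.64]), np.array([1.0, 0.5])
m = np.arange(8)
X = (a_true[None, :] * np.exp(-2j * np.pi * np.outer(m, t_true) / tau)).sum(axis=1)
print(annihilating_filter_recovery(X, K=2, tau=tau))
```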
Abstract:
Compressive Sampling Matching Pursuit (CoSaMP) is one of the popular greedy methods in the emerging field of Compressed Sensing (CS). In addition to its appealing empirical performance, CoSaMP also has splendid theoretical guarantees for convergence. In this paper, we propose a modification of CoSaMP that adaptively chooses the dimension of the search space in each iteration, using a threshold-based approach. Using Monte Carlo simulations, we show that this modification improves the reconstruction capability of the CoSaMP algorithm in both clean and noisy measurement cases. From empirical observations, we also propose an optimum value for the threshold to use in applications.
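For reference, a compact CoSaMP loop is sketched below, with a comment marking the proxy-selection step that the paper modifies; the threshold rule shown is a hypothetical stand-in for the adaptive, threshold-based choice of search-space dimension, not the authors' exact criterion.

```python
import numpy as np

def cosamp(Phi, y, s, n_iter=50, adaptive_tau=None):
    """Recover an s-sparse x from y = Phi @ x (+ noise). If adaptive_tau is
    given, the proxy support is chosen by a relative threshold instead of the
    classic fixed 2s largest entries (assumed form of the adaptive rule)."""
    m, n = Phi.shape
    x = np.zeros(n)
    r = y.astype(float)
    for _ in range(n_iter):
        proxy = Phi.T @ r                                        # signal proxy
        if adaptive_tau is None:
            omega = np.argsort(np.abs(proxy))[-2 * s:]           # classic CoSaMP: 2s largest
        else:
            omega = np.flatnonzero(np.abs(proxy) >= adaptive_tau * np.abs(proxy).max())
        T = np.union1d(omega, np.flatnonzero(x)).astype(int)     # merge supports
        b = np.zeros(n)
        b[T], *_ = np.linalg.lstsq(Phi[:, T], y, rcond=None)     # least squares on T
        x = np.zeros(n)
        keep = np.argsort(np.abs(b))[-s:]                        # prune to best s terms
        x[keep] = b[keep]
        r = y - Phi @ x                                          # residual update
        if np.linalg.norm(r) < 1e-9:
            break
    return x
```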
Abstract:
We address the problem of sampling and reconstruction of two-dimensional (2-D) finite-rate-of-innovation (FRI) signals. We propose a three-channel sampling method for efficiently solving the problem. We consider the sampling of a stream of 2-D Dirac impulses and a sum of 2-D unit-step functions. We propose a 2-D causal exponential function as the sampling kernel. By causality in 2-D, we mean that the function has its support restricted to the first quadrant. The advantage of using a multichannel sampling method with a causal exponential sampling kernel is that standard annihilating filter or root-finding algorithms are not required. Further, the proposed method has an inexpensive hardware implementation and is numerically stable as the number of Dirac impulses increases.
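The following small sketch illustrates only the acquisition model: 2-D Dirac impulses filtered by a causal exponential kernel whose support is restricted to the first quadrant and then sampled on a uniform grid; the kernel decay rates and impulse positions are assumed for illustration, and the three-channel reconstruction stage of the paper is not shown.

```python
import numpy as np

def causal_exp_kernel(t1, t2, alpha=4.0, beta=4.0):
    # "causal" in 2-D: nonzero only on the first quadrant (t1 >= 0 and t2 >= 0)
    return np.where((t1 >= 0) & (t2 >= 0), np.exp(-alpha * t1 - beta * t2), 0.0)

def sample_dirac_stream(positions, amplitudes, grid, alpha=4.0, beta=4.0):
    """Uniform samples of sum_k a_k * phi(t1 - x_k, t2 - y_k) on grid x grid."""
    T1, T2 = np.meshgrid(grid, grid, indexing="ij")
    samples = np.zeros_like(T1)
    for (xk, yk), ak in zip(positions, amplitudes):
        samples += ak * causal_exp_kernel(T1 - xk, T2 - yk, alpha, beta)
    return samples

grid = np.linspace(0.0, 1.0, 32)                 # uniform sampling instants
z = sample_dirac_stream([(0.2, 0.3), (0.6, 0.7)], [1.0, -0.5], grid)
print(z.shape)                                   # (32, 32) matrix of samples
```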
Abstract:
The redox regulation of protein tyrosine phosphatase 1B (PTP1B) via the unusual transformation of its sulfenic acid (PTP1B-SOH) to a cyclic sulfenyl amide intermediate is studied by using small molecule chemical models. These studies suggest that the sulfenic acids derived from the H2O2-mediated reactions of o-amido thiophenols do not efficiently cyclize to sulfenyl amides and that the sulfenic acids produced in situ can be trapped by using methyl iodide. Theoretical calculations suggest that the most stable conformers of such sulfenic acids are stabilized by n(O) -> sigma* (S-OH) orbital interactions, which force the -OH group to adopt a position trans to the S···O interaction, leading to an almost linear arrangement of the O···S-O moiety; this may be the reason for the slow cyclization of such sulfenic acids to their corresponding sulfenyl amides. On the other hand, additional substituents at the 6-position of o-amido phenylsulfenic acids that introduce a steric environment and alter the electronic properties around the sulfenic acid moiety through S···N or S···O nonbonded interactions destabilize the sulfenic acids by inducing strain in the molecule. This may lead to efficient cyclization of such sulfenic acids. This model study suggests that the amino acid residues in close proximity to the sulfenic acid moiety in PTP1B may play an important role in the cyclization of PTP1B-SOH to produce the corresponding sulfenyl amide.
Abstract:
There is a growing recognition of the need to integrate non-trophic interactions into ecological networks for a better understanding of whole-community organization. To achieve this, the first step is to build networks of individual non-trophic interactions. In this study, we analyzed a network of interdependencies among bird species that participated in heterospecific foraging associations (flocks) in an evergreen forest site in the Western Ghats, India. We found the flock network to contain a small core of highly important species that other species are strongly dependent on, a pattern seen in many other biological networks. Further, we found that structural importance of species in the network was strongly correlated to functional importance of species at the individual flock level. Finally, comparisons with flock networks from other Asian forests showed that the same taxonomic groups were important in general, suggesting that species importance was an intrinsic trait and not dependent on local ecological conditions. Hence, given a list of species in an area, it may be possible to predict which ones are likely to be important. Our study provides a framework for the investigation of other heterospecific foraging associations and associations among species in other non-trophic contexts.
Abstract:
In this paper, we propose low-complexity algorithms based on Monte Carlo sampling for signal detection and channel estimation on the uplink in large-scale multiuser multiple-input-multiple-output (MIMO) systems with tens to hundreds of antennas at the base station (BS) and a similar number of uplink users. A BS receiver that employs a novel mixed sampling technique (which makes a probabilistic choice between Gibbs sampling and random uniform sampling in each coordinate update) for detection and a Gibbs-sampling-based method for channel estimation is proposed. The algorithm proposed for detection alleviates the stalling problem encountered at high signal-to-noise ratios (SNRs) in conventional Gibbs-sampling-based detection and achieves near-optimal performance in large systems with M-ary quadrature amplitude modulation (M-QAM). A novel ingredient in the detection algorithm that is responsible for achieving near-optimal performance at low complexity is the joint use of a mixed Gibbs sampling (MGS) strategy coupled with a multiple restart (MR) strategy with an efficient restart criterion. Near-optimal detection performance is demonstrated for a large number of BS antennas and users (e.g., 64 and 128 BS antennas and users). The proposed Gibbs-sampling-based channel estimation algorithm refines an initial estimate of the channel obtained during the pilot phase through iterations with the proposed MGS-based detection during the data phase. In time-division duplex systems where channel reciprocity holds, these channel estimates can be used for multiuser MIMO precoding on the downlink. The proposed receiver is shown to achieve good performance and scale well for large dimensions.
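The sketch below illustrates one restart of the mixed Gibbs sampling idea for a real-valued system with a finite symbol alphabet: each coordinate is updated from its Gibbs conditional with probability (1 - q) and from a uniform draw with probability q, and the lowest-cost vector visited is retained. The alphabet, mixing ratio, and iteration counts are assumed for illustration, and the multiple-restart wrapper and the channel-estimation part are not shown.

```python
import numpy as np

def mgs_detect(H, y, sigma2, alphabet=(-1.0, 1.0), q=None, n_sweeps=50, rng=None):
    """One restart of mixed Gibbs sampling detection for y = H @ x + n with
    x drawn from a finite real alphabet; returns the lowest-cost vector visited."""
    rng = np.random.default_rng() if rng is None else rng
    m, n = H.shape
    q = 1.0 / n if q is None else q                   # mixing ratio (order 1/n assumed)
    alphabet = np.asarray(alphabet, float)
    x = rng.choice(alphabet, size=n)                  # random initial vector for this restart
    best_x, best_cost = x.copy(), float(np.sum((y - H @ x) ** 2))
    for _ in range(n_sweeps):
        for i in range(n):
            if rng.random() < q:                      # random uniform coordinate update
                x[i] = rng.choice(alphabet)
            else:                                     # conventional Gibbs conditional update
                costs = np.array([np.sum((y - H @ np.concatenate([x[:i], [a], x[i + 1:]])) ** 2)
                                  for a in alphabet])
                p = np.exp(-(costs - costs.min()) / (2.0 * sigma2))
                x[i] = rng.choice(alphabet, p=p / p.sum())
            cost = float(np.sum((y - H @ x) ** 2))
            if cost < best_cost:                      # keep the best ML-cost vector seen
                best_x, best_cost = x.copy(), cost
    return best_x, best_cost
```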
Abstract:
In this paper, we consider the problem of finding a spectrum hole of a specified bandwidth in a given wide band of interest. We propose a new, simple and easily implementable sub-Nyquist sampling scheme for signal acquisition and a spectrum hole search algorithm that exploits sparsity in the primary spectral occupancy in the frequency domain by testing a group of adjacent sub-bands in a single test. The sampling scheme deliberately introduces aliasing during signal acquisition, resulting in a signal that is the sum of signals from adjacent sub-bands. Energy-based hypothesis tests are used to provide an occupancy decision over the group of sub-bands, and this forms the basis of the proposed algorithm to find contiguous spectrum holes. We extend this framework to a multi-stage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including non-contiguous spectrum hole search. Further, we provide the analytical means to optimize the hypothesis tests with respect to the detection thresholds, number of samples and group size to minimize the detection delay under a given error rate constraint. Depending on the sparsity and SNR, the proposed algorithms can lead to significantly lower detection delays compared to a conventional bin-by-bin energy detection scheme; the latter is in fact a special case of the group test when the group size is set to 1. We validate our analytical results via Monte Carlo simulations.
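As an illustration of the group-testing idea, the sketch below runs an energy test on the summed (aliased) signal of each group of adjacent sub-bands and only examines flagged groups bin by bin; the thresholds, group size, and sample counts are placeholders rather than the optimized values derived in the paper.

```python
import numpy as np

def find_spectrum_holes(subband_samples, group_size, thr_group, thr_bin):
    """subband_samples: (n_subbands, n_samples) complex baseband samples per
    sub-band; returns a boolean vacancy mask over the sub-bands."""
    n_sub, _ = subband_samples.shape
    vacant = np.zeros(n_sub, dtype=bool)
    for start in range(0, n_sub, group_size):
        block = subband_samples[start:start + group_size]
        aliased = block.sum(axis=0)                    # deliberate aliasing: sum of the group
        if np.mean(np.abs(aliased) ** 2) < thr_group:  # group-level energy test: all vacant
            vacant[start:start + group_size] = True
        else:                                          # refine: test each sub-band separately
            energies = np.mean(np.abs(block) ** 2, axis=1)
            vacant[start:start + group_size] = energies < thr_bin
    return vacant

# a contiguous hole of the required bandwidth is any sufficiently long run of True entries
```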
Abstract:
Fluorescence microscopy has become an indispensable tool in cell biology research due to its exceptional specificity and ability to visualize subcellular structures with high contrast. It has the highest impact when applied in 4D mode, i.e. when applied to record 3D image information as a function of time, since it allows the study of dynamic cellular processes in their native environment. The main issue in 4D fluorescence microscopy is that the phototoxic effect of fluorescence excitation accumulates during 4D image acquisition to the extent that normal cell functions are altered. Hence, to avoid altering normal cell functioning, the excitation dose used for the individual 2D images constituting a 4D image must be minimized. Consequently, the noise level becomes very high, degrading the resolution. According to the current status of technology, there is a minimum required excitation dose to ensure a resolution that is adequate for biological investigations. This minimum is sufficient to damage light-sensitive cells such as yeast if 4D imaging is performed for an extended period of time, for example, imaging for a complete cell cycle. Nevertheless, our recently developed deconvolution method resolves this conflict, forming an enabling technology for the visualization of dynamical processes of light-sensitive cells for durations longer than ever without perturbing normal cell functioning. The main goal of this article is to emphasize that there are still possibilities for enabling newer kinds of experiments in cell biology research involving even longer 4D imaging, by only improving deconvolution methods without any new optical technologies.