899 results for two-Gaussian mixture model
Abstract:
In this article, we introduce the general statistical analysis approach known as latent class analysis and discuss some of the issues associated with this type of analysis in practice. Two recent examples from the respiratory health literature are used to highlight the types of research questions that have been addressed using this approach.
Abstract:
In this paper, numerical modelling of fracture in concrete using a two-dimensional lattice model is presented, and a few issues related to the lattice modelling technique as applied to concrete fracture are reviewed. A comparison is made between acoustic emission (AE) events and the number of fractured elements. To implement the heterogeneity of plain concrete, two methods are used: generating the grain structure of the concrete from Fuller's distribution, and distributing the concrete material properties randomly following a Gaussian distribution. In the first method, the concrete is modelled at the meso level following existing methods available in the literature. The aggregates present in the concrete are assumed to be perfect spheres, represented as circles in the two-dimensional lattice network. A three-point bend (TPB) specimen is tested experimentally under crack mouth opening displacement (CMOD) control at a rate of 0.0004 mm/sec, and the fracture process in the same TPB specimen is modelled using a regular triangular 2D lattice network. Load versus CMOD plots obtained with both methods are compared with the experimental results. It was observed that the number of fractured elements increases near the peak load and beyond it, that is, once the crack starts to propagate; AE hits also increase rapidly beyond the peak load. It is worth mentioning that although the lattice modelling of concrete fracture used in the present study is very similar to approaches already available in the literature, the present work brings out certain finer details that are not explicit in earlier works.
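The second randomization method in the abstract above — Gaussian-distributed material properties — can be sketched in a few lines. The mean strength, coefficient of variation, and truncation at zero below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def assign_random_strengths(n_elements, mean_ft, cov, seed=0):
    """Assign a tensile strength to each lattice element by sampling a
    Gaussian with mean mean_ft and coefficient of variation cov, then
    truncating at zero so no element gets a non-physical negative strength.
    (Illustrative sketch; the paper's distribution parameters may differ.)"""
    rng = np.random.default_rng(seed)
    ft = rng.normal(mean_ft, cov * mean_ft, size=n_elements)
    return np.clip(ft, 1e-6, None)

# e.g. 10,000 lattice elements, assumed mean strength 3.0 MPa, 20% scatter
strengths = assign_random_strengths(10_000, mean_ft=3.0, cov=0.2)
```

An element then fractures (is removed from the lattice) once the stress it carries exceeds its sampled strength, which is what produces the heterogeneous crack path.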
Abstract:
In this paper, we introduce the three-user cognitive radio channels with asymmetric transmitter cooperation, and derive achievable rate regions under several scenarios depending on the type of cooperation and decoding capability at the receivers. Two of the most natural cooperation mechanisms for the three-user channel are considered here: cumulative message sharing (CMS) and primary-only message sharing (PMS). In addition to the message sharing mechanism, the achievable rate region is critically dependent on the decoding capability at the receivers. Here, we consider two scenarios for the decoding capability, and derive an achievable rate region for each one of them by employing a combination of superposition and Gel'fand-Pinsker coding techniques. Finally, to provide a numerical example, we consider the Gaussian channel model to plot the rate regions. In terms of achievable rates, CMS turns out to be a better scheme than PMS. However, the practical aspects of implementing such message-sharing schemes remain to be investigated.
Abstract:
Distributed space-time coding for wireless relay networks where the source, the destination, and the relays have multiple antennas has been studied by Jing and Hassibi. In that setup, the transmit and receive signals at different antennas of the same relay are processed and designed independently, even though the antennas are co-located. In this paper, a wireless relay network with a single antenna at the source and the destination and two antennas at each of the R relays is considered. In the first phase of the two-phase transmission model, a T-length complex vector is transmitted from the source to all the relays. At each relay, the in-phase and quadrature component vectors of the complex vectors received at the two antennas are interleaved before processing. After processing, in the second phase, a T x 2R matrix codeword is transmitted to the destination. The collection of all such codewords is called a co-ordinate interleaved distributed space-time code (CIDSTC). Compared to the scheme proposed by Jing-Hassibi, for T >= 4R, it is shown that while both schemes give the same asymptotic diversity gain, the CIDSTC scheme also gives an additional asymptotic coding gain, at the cost of a negligible increase in processing complexity at the relays.
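A minimal sketch of the coordinate-interleaving step described above — exchanging the quadrature components of the two antennas' received vectors while keeping the in-phase components in place — under one plausible reading of the interleaving rule (the paper's exact map may differ):

```python
import numpy as np

def coordinate_interleave(r1, r2):
    """Swap the quadrature (imaginary) parts of the complex vectors
    received at a relay's two antennas, keeping the in-phase (real)
    parts fixed. Illustrative interpretation of the CIDSTC
    interleaving step, not the paper's exact mapping."""
    s1 = r1.real + 1j * r2.imag
    s2 = r2.real + 1j * r1.imag
    return s1, s2

# toy T = 2 received vectors at the two antennas of one relay
r1 = np.array([1 + 2j, 3 + 4j])
r2 = np.array([5 + 6j, 7 + 8j])
s1, s2 = coordinate_interleave(r1, r2)
```

The relay then applies its usual linear processing to the interleaved vectors s1, s2 before forming its two columns of the T x 2R codeword.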
Abstract:
In this paper, we consider robust joint designs of relay precoder and destination receive filters in a nonregenerative multiple-input multiple-output (MIMO) relay network. The network consists of multiple source-destination node pairs assisted by a MIMO-relay node. The channel state information (CSI) available at the relay node is assumed to be imperfect. We consider robust designs for two models of CSI error. The first model is a stochastic error (SE) model, where the probability distribution of the CSI error is Gaussian. This model is applicable when the imperfect CSI is mainly due to errors in channel estimation. For this model, we propose robust minimum sum mean square error (SMSE), MSE-balancing, and relay transmit power minimizing precoder designs. The next model for the CSI error is a norm-bounded error (NBE) model, where the CSI error can be specified by an uncertainty set. This model is applicable when the CSI error is dominated by quantization errors. In this case, we adopt a worst-case design approach. For this model, we propose a robust precoder design that minimizes total relay transmit power under constraints on MSEs at the destination nodes. We show that the proposed robust design problems can be reformulated as convex optimization problems that can be solved efficiently using interior-point methods. We demonstrate the robust performance of the proposed design through simulations.
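As background for the MSE-based designs discussed above, the textbook linear MMSE receive filter for y = Hx + n with unit-power symbols and noise variance sigma^2 can be written in a few lines. This is a generic building block, not the paper's joint robust relay/destination design:

```python
import numpy as np

def mmse_receive_filter(H, sigma2):
    """Linear MMSE receive filter for y = H x + n, assuming unit-power,
    uncorrelated symbols and noise variance sigma2:
        W = (H^H H + sigma2 I)^{-1} H^H.
    Textbook form; the paper jointly optimizes relay precoder and
    destination filters under imperfect CSI, which this does not do."""
    n = H.shape[1]
    return np.linalg.solve(H.conj().T @ H + sigma2 * np.eye(n), H.conj().T)

# toy 2x2 channel; with tiny noise the filter approaches the inverse channel
H = np.array([[1.0, 0.2], [0.1, 0.9]])
W = mmse_receive_filter(H, 1e-6)
```

As sigma2 grows, W smoothly trades off channel inversion against noise amplification, which is the same trade-off the robust SMSE designs manage under CSI uncertainty.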
Abstract:
We interpret the recent discovery of a 125 GeV Higgs-like state in the context of a two-Higgs-doublet model with a heavy fourth sequential generation of fermions, in which one Higgs doublet couples only to the fourth-generation fermions, while the second doublet couples to the lighter fermions of the first three families. This model is designed to accommodate the apparent heaviness of the fourth-generation fermions and to effectively address the low-energy phenomenology of a dynamical electroweak-symmetry-breaking scenario. The physical Higgs states of the model are, therefore, viewed as composites primarily of the fourth-generation fermions. We find that the lightest Higgs, h, is a good candidate for the recently discovered 125 GeV spin-zero particle when tan(beta) ~ O(1), for typical fourth-generation fermion masses of M_4G = 400-600 GeV, and with a large t-t' mixing in the right-handed quark sector. This, in turn, leads to BR(t' -> th) ~ O(1), which drastically changes the t' decay pattern. We also find that, based on the current Higgs data, this two-Higgs-doublet model generically predicts an enhanced production rate (compared to the Standard Model) in the pp -> h -> tau tau channel, and reduced rates in the VV -> h -> gamma gamma and ppbar/pp -> V -> hV -> Vbb channels. Finally, the heavier CP-even Higgs is excluded by the current data up to m_H ~ 500 GeV, while the pseudoscalar state, A, can be as light as 130 GeV. These heavier Higgs states, and the expected deviations from the Standard Model in some of the Higgs production channels, can be further excluded or discovered with more data.
Abstract:
A Variable Endmember Constrained Least Squares (VECLS) technique is proposed to account for endmember variability in the linear mixture model by incorporating the variance for each class, whose signals vary from pixel to pixel due to changes in urban land cover (LC) structure. VECLS is first tested on a computer-simulated dataset with three endmember classes and four bands, with small, medium, and large variability, at three different spatial resolutions. The technique is then validated with real datasets from IKONOS, Landsat ETM+ and MODIS. The results show that the correlation between actual and estimated proportions is higher by an average of 0.25 for the artificial datasets compared with a situation where variability is not considered. With IKONOS, Landsat ETM+ and MODIS data, the average correlation increased by 0.15 for 2 and 3 classes and by 0.19 for 4 classes, when compared to a single endmember per class.
Abstract:
An important question in kernel regression is one of estimating the order and bandwidth parameters from available noisy data. We propose to solve the problem within a risk estimation framework. Considering an independent and identically distributed (i.i.d.) Gaussian observations model, we use Stein's unbiased risk estimator (SURE) to estimate a weighted mean-square error (MSE) risk, and optimize it with respect to the order and bandwidth parameters. The two parameters are thus spatially adapted in such a manner that noise smoothing and fine structure preservation are simultaneously achieved. On the application side, we consider the problem of image restoration from uniform/non-uniform data, and show that the SURE approach to spatially adaptive kernel regression results in better quality estimation compared with its spatially non-adaptive counterparts. The denoising results obtained are comparable to those obtained using other state-of-the-art techniques, and in some scenarios, superior.
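The SURE idea above can be illustrated with a global (non-spatially-adaptive) toy version for Nadaraya-Watson smoothing: for a linear smoother y_hat = S y under i.i.d. Gaussian noise of variance sigma^2, SURE(h) = ||y - y_hat||^2 + 2 sigma^2 tr(S) - n sigma^2 is an unbiased estimate of the MSE risk, and the bandwidth minimizing it is selected. The kernel choice, candidate grid, and noise level below are illustrative assumptions:

```python
import numpy as np

def sure_bandwidth(x, y, sigma, bandwidths):
    """Select a kernel-regression bandwidth by minimizing Stein's
    unbiased risk estimate for a Nadaraya-Watson (Gaussian kernel)
    smoother. Global toy version of the spatially adaptive scheme
    described in the abstract."""
    n = len(x)
    best_h, best_risk = None, np.inf
    for h in bandwidths:
        K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        S = K / K.sum(axis=1, keepdims=True)  # smoother matrix (rows sum to 1)
        resid = y - S @ y
        risk = resid @ resid + 2 * sigma**2 * np.trace(S) - n * sigma**2
        if risk < best_risk:
            best_h, best_risk = h, risk
    return best_h

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(100)
h = sure_bandwidth(x, y, sigma=0.1, bandwidths=[0.005, 0.02, 0.08, 0.3])
```

The key point is that SURE needs only the noisy data and sigma, not the clean signal, so the same criterion can be minimized per pixel to adapt order and bandwidth spatially.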
Three-dimensional localization of multiple acoustic sources in shallow ocean with non-Gaussian noise
Abstract:
In this paper, a low-complexity algorithm, SAGE-USL, is presented for 3-dimensional (3-D) localization of multiple acoustic sources in a shallow ocean with non-Gaussian ambient noise, using a vertical and a horizontal linear array of sensors. In the proposed method, noise is modeled as a Gaussian mixture. Initial estimates of the unknown parameters (source coordinates, signal waveforms and noise parameters) are obtained by known/conventional methods, and a generalized expectation maximization algorithm is used to update the initial estimates iteratively. Simulation results indicate that convergence is reached in a small number of (<= 10) iterations. Initialization requires one 2-D search and one 1-D search, and the iterative updates require a sequence of 1-D searches. Therefore, the computational complexity of the SAGE-USL algorithm is lower than that of conventional techniques such as 3-D MUSIC by several orders of magnitude. We also derive the Cramer-Rao bound (CRB) for 3-D localization of multiple sources in a range-independent ocean. Simulation results are presented to show that the root-mean-square localization errors of SAGE-USL are close to the corresponding CRBs and significantly lower than those of 3-D MUSIC.
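The Gaussian-mixture noise model above can be fitted with plain expectation maximization. The sketch below fits a two-component, one-dimensional mixture to synthetic data with a deterministic min/max initialization — an illustration of the modeling idea, not the SAGE-USL algorithm itself:

```python
import numpy as np

def fit_two_gaussian_mixture(x, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture by EM: a background
    component plus a higher-mean (or heavier) component, the kind of
    model used for non-Gaussian ambient noise. Crude but deterministic
    initialization from the data extremes."""
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample
        p = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 700), rng.normal(5.0, 1.0, 300)])
w, mu, var = fit_two_gaussian_mixture(x)
```

SAGE-USL embeds updates of this kind inside a generalized EM loop that also re-estimates source coordinates and waveforms at each iteration.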
Abstract:
Using a refined two-dimensional hybrid model with self-consistent microwave absorption, we have investigated how plasma parameters such as plasma density and ionization rate change with the operating conditions. The dependence of the ion current density and the ion energy and angle distribution function at the substrate surface on radial position, pressure, and microwave power is discussed. Results of our simulation can be compared qualitatively with many experimental measurements.
Abstract:
A time-averaged two-dimensional fluid model including an electromagnetic module with self-consistent power deposition was developed to simulate the transport of a low-pressure radio-frequency inductively coupled plasma source. Comparisons with experiment and previous simulation results show that the fluid model is feasible over a certain range of gas pressure. In addition, the effects of gas pressure and power input are discussed.
Abstract:
Working paper
Abstract:
Part I. Proton Magnetic Resonance of Polynucleotides and Transfer RNA.
Proton magnetic resonance was used to follow the temperature-dependent intramolecular stacking of the bases in the polynucleotides of adenine and cytosine. Analysis of the results on the basis of a two-state stacked-unstacked model yielded values of -4.5 kcal/mole and -9.5 kcal/mole for the enthalpies of stacking in polyadenylic and polycytidylic acid, respectively.
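The two-state analysis above can be made concrete with a van't Hoff sketch: taking the stacking equilibrium constant as K(T) = exp(-(dH/R)(1/T - 1/Tm)), so that K = 1 at the midpoint temperature Tm, the stacked fraction is K/(1 + K). The enthalpy below is the abstract's poly(A) value; the midpoint temperature is an assumed illustrative number, not from the thesis:

```python
import numpy as np

R_GAS = 1.987e-3  # gas constant in kcal/(mol K)

def fraction_stacked(T, dH, Tm):
    """Stacked fraction in a two-state stacked/unstacked model.
    K(T) = exp(-(dH/R)(1/T - 1/Tm)) is the van't Hoff equilibrium
    constant (K = 1 at the midpoint Tm); the stacked fraction is
    K / (1 + K). dH in kcal/mol, temperatures in kelvin."""
    K = np.exp(-(dH / R_GAS) * (1.0 / T - 1.0 / Tm))
    return K / (1.0 + K)

# poly(A): dH = -4.5 kcal/mol (from the abstract); Tm = 300 K is assumed
f_mid = fraction_stacked(300.0, -4.5, 300.0)   # exactly 1/2 at the midpoint
f_cold = fraction_stacked(270.0, -4.5, 300.0)  # stacking favored when cooled
```

Because stacking is exothermic (dH < 0), the stacked fraction rises as the temperature drops, which is the behavior the pmr melting curves track.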
The interaction of purine with these molecules was also studied by pmr. Analysis of these results and comparison of the thermal unstacking of polynucleotides and short-chain nucleotides indicate that the bases contained in stacks within the long-chain polynucleotides are, on the average, closer together than the bases contained in stacks in the short-chain nucleotides.
Temperature and purine studies were also carried out with an aqueous solution of formylmethionine transfer ribonucleic acid. Comparison of these results with the results of similar experiments with the homopolynucleotides of adenine, cytosine and uracil indicates that the purine is probably intercalating into loop regions of the molecule.
The solvent denaturation of phenylalanine transfer ribonucleic acid was followed by pmr. In a solvent mixture containing 83 volume per cent dimethylsulfoxide and 17 per cent deuterium oxide, the tRNA molecule is rendered quite flexible. It is possible to resolve resonances of protons on the common bases and on certain modified bases.
Part II. Electron Spin Relaxation Studies of Manganese (II) Complexes in Acetonitrile.
The electron paramagnetic resonance spectra of three Mn+2 complexes, [Mn(CH3CN)6]+2, [MnCl4]-2, and [MnBr4]-2, in acetonitrile were studied in detail. The objective of this study was to relate changes in the effective spin Hamiltonian parameters and the resonance line widths to the structure of these molecular complexes as well as to dynamical processes in solution.
Of the three systems studied, the results obtained from the [Mn(CH3CN)6]+2 system were the most straightforward to interpret. Resonance broadening attributable to manganese spin-spin dipolar interactions was observed as the manganese concentration was increased.
In the [MnCl4]-2 system, solvent fluctuations and dynamical ion-pairing appear to be significant in determining electron spin relaxation.
In the [MnBr4]-2 system, solvent fluctuations, ion-pairing, and Br- ligand exchange provide the principal means of electron spin relaxation. It was also found that the spin relaxation in this system is dependent upon the field strength and is directly related to the manganese concentration. A relaxation theory based on a two-state collisional model was developed to account for the observed behavior.
Abstract:
Background: Recently, with access to low-toxicity biological and targeted therapies, evidence of the existence of a long-term survival subpopulation of cancer patients is appearing. We have studied an unselected population with advanced lung cancer to look for evidence of multimodality in the survival distribution, and to estimate the proportion of long-term survivors. Methods: We used survival data of 4944 patients with non-small-cell lung cancer (NSCLC), stages IIIb-IV at diagnosis, registered in the National Cancer Registry of Cuba (NCRC) between January 1998 and December 2006. We fitted a one-component survival model and two-component mixture models to identify short- and long-term survivors. The Bayesian information criterion was used for model selection. Results: For all of the selected parametric distributions, the two-component model presented the best fit. The short-term survival subpopulation (median survival of almost 4 months) represented 64% of patients. The long-term survival subpopulation included 35% of patients and showed a median survival around 12 months. None of the short-term survival patients was still alive at month 24, while 10% of the long-term survival patients died afterwards. Conclusions: There is a subgroup showing long-term evolution among patients with advanced lung cancer. As survival rates continue to improve with the new generation of therapies, prognostic models considering short- and long-term survival subpopulations should be considered in clinical research.
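The two-component idea in the Results can be made concrete with an exponential mixture whose weights and medians match the figures quoted above. Exponential components are an illustrative choice here; the abstract compares several parametric families:

```python
import numpy as np

def mixture_survival(t, pi_short, median_short, median_long):
    """Survival function of a two-component exponential mixture:
    S(t) = pi * exp(-lam_s t) + (1 - pi) * exp(-lam_l t),
    with each rate set from its median via lam = ln(2) / median.
    Illustrative stand-in for the parametric mixtures in the abstract;
    times in months."""
    lam_s = np.log(2.0) / median_short
    lam_l = np.log(2.0) / median_long
    return pi_short * np.exp(-lam_s * t) + (1.0 - pi_short) * np.exp(-lam_l * t)

# Abstract's figures: 64% short-term (median ~4 months), long-term ~12 months
S24 = mixture_survival(24.0, 0.64, 4.0, 12.0)
```

Under these numbers the short-term component has essentially vanished by month 24 while roughly a tenth of the cohort survives, which mirrors the multimodality the mixture fit is designed to capture.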
Abstract:
Southern bluefin tuna (SBT) (Thunnus maccoyii) growth rates are estimated from tag-return data associated with two time periods, the 1960s and 1980s. The traditional von Bertalanffy growth (VBG) model and a two-phase VBG model were fitted to the data by maximum likelihood. The traditional VBG model did not provide an adequate representation of growth in SBT, and the two-phase VBG model yielded a significantly better fit. The results indicated that a significant change occurs in the pattern of growth, relative to a VBG curve, during the juvenile stages of the SBT life cycle, which may be related to the transition from a tightly schooling fish that spends substantial time in nearshore and surface waters to one found primarily in deeper, more offshore waters. The results suggest that more complex growth models should be considered for other tunas and for other species that show a marked change in habitat use with age. The likelihood surface for the two-phase VBG model was found to be bimodal, and some implications of this are investigated. Significant and substantial differences were found in the growth of fish spawned in the 1960s and in the 1980s, such that after age four there is a difference of about one year in the expected age of a fish of similar length, which persists over the size range for which meaningful recapture data are available. This difference may be a density-dependent response to the marked reduction in the SBT population. Given the key role that estimates of growth play in most stock assessments, the results indicate a need both for regular monitoring of growth rates and for provisions for changes in growth over time (possibly related to changes in abundance) in the stock assessment models used for SBT and other species.
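A two-phase VBG curve of the kind fitted above can be written by switching the growth coefficient k at a given age while keeping the length continuous at the switch point. The parameter values below are illustrative, not the SBT estimates from the paper:

```python
import numpy as np

def vbg_length(age, L_inf, k, t0):
    """Classic von Bertalanffy growth: L(t) = L_inf (1 - exp(-k (t - t0)))."""
    return L_inf * (1.0 - np.exp(-k * (age - t0)))

def two_phase_vbg(age, L_inf, k1, k2, t0, t_switch):
    """Two-phase VBG: the growth coefficient changes from k1 to k2 at
    age t_switch. An effective t0 for the second phase is solved so the
    curve is continuous at the switch. Parameter values are
    illustrative, not the SBT fits."""
    L_switch = vbg_length(t_switch, L_inf, k1, t0)
    # continuity: pick t0_2 so phase 2 passes through (t_switch, L_switch)
    t0_2 = t_switch + np.log(1.0 - L_switch / L_inf) / k2
    return np.where(age < t_switch,
                    vbg_length(age, L_inf, k1, t0),
                    vbg_length(age, L_inf, k2, t0_2))

ages = np.linspace(1.0, 20.0, 200)
L = two_phase_vbg(ages, L_inf=185.0, k1=0.35, k2=0.12, t0=0.0, t_switch=5.0)
```

A kink in k at the juvenile-to-offshore transition is exactly the kind of departure from a single VBG curve that made the two-phase model fit the tag-return data significantly better.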